# AMD Radeon RX 6000 "Big Navi" RDNA2 Graphics Card Launch Liveblog



## btarunr (Oct 28, 2020)

After thoroughly whetting our appetites with its "Where Gaming Begins: Episode 1" event, which announced the Ryzen 5000 series "Zen 3" processors with up to 19% IPC gains, Episode 2 sees the company announcing its next-generation Radeon RX 6000 "Big Navi" graphics cards. Based on the RDNA2 graphics architecture, these introduce full DirectX 12 Ultimate readiness, including real-time raytracing hardware. In the run-up to the RX 6000, NVIDIA is already reportedly preparing product-stack updates. In this liveblog, we uncover what has NVIDIA riled up, and whether AMD can pull off better pricing and availability than the RTX 30-series.





*Update 15:59 UTC*: It is time! Welcome to the Radeon RX 6000 Series live blog.



 

*Update 16:01 UTC*: AMD CEO Dr. Lisa Su takes center stage, fresh off a good quarterly results announcement and that big Xilinx acquisition announcement.





*Update 16:03 UTC*: Far Cry 6 seems like an AMD-optimized title.

*Update 16:03 UTC*: Possibly the flagship product.



 

 

*Update 16:04 UTC*: AMD is where gaming begins because next-gen consoles trust it. - Dr Su

*Update 16:05 UTC*: 50% generational improvement in perf/W.


 

 

*Update 16:06 UTC*: 26.7 billion transistors, almost as big as NVIDIA's GA102.


 

*Update 16:07 UTC*: RDNA2 has a breakthrough high-speed design.


 

 

*Update 16:08 UTC*: The AMD RDNA2 compute unit is 30% more energy efficient.


 

*Update 16:09 UTC*: Infinity Cache, based on the Zen L3 cache design, significantly improves effective memory bandwidth: more than a 2.17x bandwidth gain despite a 256-bit bus.
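As a quick back-of-the-envelope check of that claim (assuming 16 Gbps GDDR6, a memory speed AMD did not state on stage):

```python
# Rough effective-bandwidth estimate for a 256-bit bus with Infinity Cache.
# The 16 Gbps GDDR6 per-pin speed is an assumption, not an announced spec.
bus_width_bits = 256
gbps_per_pin = 16

raw_gb_s = bus_width_bits * gbps_per_pin / 8   # raw bandwidth: 512 GB/s
effective_gb_s = raw_gb_s * 2.17               # AMD's claimed multiplier

print(f"raw: {raw_gb_s:.0f} GB/s, effective: {effective_gb_s:.0f} GB/s")
```

At that assumed memory speed, the claim works out to roughly 1.1 TB/s of effective bandwidth.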


 

*Update 16:10 UTC*: 30% higher frequencies on the same 7 nm node.


 

*Update 16:10 UTC*: DirectX 12 Ultimate and DirectStorage support.


 

*Update 16:11 UTC*: Over 2X performance gain over RX 5700 XT.


 

*Update 16:11 UTC*: The RX 6800 XT!


 

*Update 16:12 UTC*: 4K Gaming from AMD is here! Matches RTX 3080!!!


 

 

*Update 16:13 UTC*: Hello competition!


 

*Update 16:14 UTC*: Welcome back ATI Rage, as the AMD Rage Mode.


 

*Update 16:15 UTC*: When paired with Ryzen, you get a gaming performance boost of up to 13%.

*Update 16:15 UTC*: AMD introduces its take on NVIDIA Reflex, announcing the latest Radeon Anti-Lag and Radeon Boost. No special API needed.


 

 

 

*Update 16:16 UTC*: Full DX12 Ultimate support, including ray-tracing. Working on a DLSS rival.



 

*Update 16:18 UTC*: AMD is leveraging studios working on Xbox Series X / PS5 titles to integrate its Radeon features on the PC.

*Update 16:18 UTC*: Far Cry 6, DiRT 5, WoW: Shadowlands (which gets raytracing), and The Riftbreaker are AMD-optimized.

*Update 16:21 UTC*: The RX 6800 XT is available November 18, starting at $649.

*Update 16:21 UTC*: Also announcing the RX 6800: faster than the RTX 2080 Ti, 4K + raytracing ready.


 

*Update 16:22 UTC*: $579, November 18 availability for the RX 6800.

*Update 16:22 UTC*: Lisa gets ready for a "one more thing."

*Update 16:23 UTC*: Radeon RX 6900 XT: 80 CUs, 65% better perf/Watt than the 5700 XT.


 

 

*Update 16:25 UTC*: RTX 3090 BEATEN!!!!


 


*Update 16:26 UTC*: December 8 for the RX 6900 XT, priced $999.


 

*Update 16:26 UTC*: OOF, we cannot wait to test these cards out!

*Update 17:08 UTC*: The press release can be found here.

*Update 17:19 UTC*: The complete AMD slide deck follows.


*View at TechPowerUp Main Site*


----------



## kapone32 (Oct 28, 2020)

The Youtube video already has almost 85000 viewers!


----------



## MxPhenom 216 (Oct 28, 2020)

Come on AMD, you already got me believing in you again on CPUs. Now for GPUs


----------



## windwhirl (Oct 28, 2020)

Man, the YT chat is going mad lmao


kapone32 said:


> The Youtube video already has almost 85000 viewers!


130k+ by now!


----------



## kapone32 (Oct 28, 2020)

Are you not impressed!!!!!


----------



## TheDeeGee (Oct 28, 2020)

windwhirl said:


> Man, the YT chat is going mad lmao
> 
> 130k+ by now!



Where does one even find this link, why isn't it posted in this blog?


----------



## kapone32 (Oct 28, 2020)

TheDeeGee said:


> Where does one even find this link, why isn't it posted in this blog?


Youtube
AMD channel


----------



## windwhirl (Oct 28, 2020)

TheDeeGee said:


> Where does one even find this link, why isn't it posted in this blog?


----------



## MxPhenom 216 (Oct 28, 2020)

Holy 300w board power for 6800XT


----------



## QUANTUMPHYSICS (Oct 28, 2020)

All I'm concerned with is my AMD Stock share value.


----------



## kruk (Oct 28, 2020)

It's really nice to see how great leadership can rescue a struggling company. With a fraction of the resources, they managed to catch up with Intel and NVIDIA in the same year, which is kind of crazy. Can't wait to see if their product stack scales down to PCIe-power-only GPUs. 2020/2021 looks really exciting for gamers.


----------



## kapone32 (Oct 28, 2020)

Where is that post that Cache had no influence on GPUs?


----------



## B-Real (Oct 28, 2020)

If I was right in quick maths, it seems it may be faster than the RTX 3080 in the averaged games shown in the graphs. :O Or equal at worst.


----------



## kapone32 (Oct 28, 2020)

kruk said:


> It's really nice to see how a great leadership can rescue a struggling company. With a fraction of resources they managed to catch up with Intel and nVidia in the same year, which is kind of crazy. Can't wait to see if their product stack scales down to PCIE only powered GPUs. 2020/2021 looks really exciting for gamers


Not looks...will


----------



## MxPhenom 216 (Oct 28, 2020)

B-Real said:


> It seems it may be faster than the RTX 3080. :O



Trades blows based on their graphs. A little faster in a good chunk of them.


----------



## windwhirl (Oct 28, 2020)

OK, so 6800 XT equal or better than RTX 3080? SHOTS FIRED


----------



## B-Real (Oct 28, 2020)

MxPhenom 216 said:


> Trades blows based on their graphs. A little faster in a good chunk of them.


Yep!


----------



## MxPhenom 216 (Oct 28, 2020)

Radeon Anti-Lag/Boost = Nvidia Reflex


----------



## Turmania (Oct 28, 2020)

I want that T-Shirt he is wearing.


----------



## kapone32 (Oct 28, 2020)

MxPhenom 216 said:


> Holy 300w board power for 6800XT


That includes memory and everything else


----------



## MxPhenom 216 (Oct 28, 2020)

Show us Ray tracing performance for fuck sakes.



kapone32 said:


> That includes memory and everything else



I'm aware. Wasn't expecting it that high for 6800XT. Thought the 6900XT would be around there.

Also that Godfall game is going to be good.

No one cares about Far Cry 6!


----------



## windwhirl (Oct 28, 2020)

MxPhenom 216 said:


> Show us Ray tracing performance for fuck sakes.


Saving the best for last? lmao


----------



## kapone32 (Oct 28, 2020)

I am peeing my pants right now this is insane!



MxPhenom 216 said:


> Show us Ray tracing performance for fuck sakes.
> 
> 
> 
> ...


I can't wait to run one of these through it's paces. Godfall does look like it will get a run through. I can't wait to oh I said that already.


----------



## Cheeseball (Oct 28, 2020)

MxPhenom 216 said:


> Radeon Anti-Lag/Boost = Nvidia Reflex



The first implementation of Radeon Boost was not so good. It basically lowers the resolution of the entire screen when there is mouse/keyboard movement. It's pretty bad if you set it to max, making everything extremely pixelated. Not too helpful in competitive shooters like PUBG.

Early implementation was causing Anti-Lag to crash when I enabled it on my RX 5700 XT. Hopefully they improved it with the next Adrenalin release.

*EDIT*: Sounds like $649 for the RX 6800 XT. Not bad.


----------



## B-Real (Oct 28, 2020)

$649 with ~same performance, less power consumption, +6GB VRAM. Holy $hit.


----------



## windwhirl (Oct 28, 2020)

B-Real said:


> $649 with ~same performance, less power consumption, +6GB VRAM. Holy $hit.


With the non-XT 6800 also equipped with 16 GB of VRAM... Damn.


----------



## Turmania (Oct 28, 2020)

Last year, I said AMD was a couple of years behind NVIDIA. It seems they not only caught up with them, they even passed them. Well done. I love competition.


----------



## xkm1948 (Oct 28, 2020)

So no DLSS2.0 equals, no mention of RTX performance.

Price is OK though.


----------



## windwhirl (Oct 28, 2020)

RX 6900 XT!? OH YEAH THIS IS HAPPENING LMAO

EDIT: Actually, meh, that should have had a 100 CUs lol


----------



## B-Real (Oct 28, 2020)

xkm1948 said:


> So no DLSS2.0 equals, no mention of RTX performance.
> 
> Price is OK though.


I feel pity for you, really. DLSS 2.0 is supported in maybe 2 games? LOL


----------



## mouacyk (Oct 28, 2020)

6900XT $899? please

Darn, off by $100


----------



## windwhirl (Oct 28, 2020)

mouacyk said:


> 6900XT $899? please


999. That... might be a little high in this case... but if it matches or beats the 3090... that might be worth it.


----------



## B-Real (Oct 28, 2020)

6900XT faster or as fast as 3090, and $500 less? LOOOL


----------



## Dinnercore (Oct 28, 2020)

And just like that, the 3090 is obsolete. Insane!


----------



## xkm1948 (Oct 28, 2020)

Nice 6900. Finally a top of the line since Fury days!


----------



## Cheeseball (Oct 28, 2020)

Ooh $999 for the RX 6900 XT

Really not bad AMD. I like that pricing.


----------



## kapone32 (Oct 28, 2020)

Nov 18 One pay cheque to buy a 5600X and another to get a 6000 series


----------



## Legacy-ZA (Oct 28, 2020)

Well done AMD, well done.


----------



## mouacyk (Oct 28, 2020)

Not to mention AMD has absolutely been killing it in Vulkan performance and Linux compatibility. Provided supply is good, I'm glad to have waited and will be jumping ship.

Rage mode, wtf is that?


----------



## windwhirl (Oct 28, 2020)

Now, the only thing left is the waiting game. AMD really needs to push the best drivers they can for November 18.


----------



## hathoward (Oct 28, 2020)

Awesome! Would've liked more insight into raytracing performance though :/


----------



## owen10578 (Oct 28, 2020)

HYPE for the 6900XT! If it can truly match a 3090 at $999 AND a lower power level then Nvidia is screwed with how much they're charging for it...


----------



## _Flare (Oct 28, 2020)

no RT-Perf shown
RX 6800 at 579 is too expensive


----------



## Vya Domus (Oct 28, 2020)

Thing is, the last chart was with the overclock and that Smart Access Memory thing, so probably about 5% slower without those. Still, for $500 less, it's not even funny. Sadly, all these years of thousand-dollar flagships from NVIDIA have skewed prices into the stratosphere; $999 is still overpriced.


----------



## chandras (Oct 28, 2020)

RIP 3090


----------



## Turmania (Oct 28, 2020)

I expect a price adjustment from NVIDIA pretty soon; this is why I love competition. Though they first have to make products available to purchase before adjusting prices, but you get the idea...


----------



## MxPhenom 216 (Oct 28, 2020)

I want a 6800XT to go with my 5800x build, unless Nvidia undercuts with a price drop.


----------



## mouacyk (Oct 28, 2020)

Vya Domus said:


> Thing is in the last chart that was with the overclock and that smart memory access thing, so about 5% slower probably without those. Still, 500$ less, it's not even funny.


Can't wait to see NVidia response...


----------



## kapone32 (Oct 28, 2020)

Well instead of Nov 18 I get a couple more pays to save for the 6900XT Dec 8 just in time for CP2077.


----------



## milewski1015 (Oct 28, 2020)

Missed opportunity to call it the 6969 XT 

Hopefully availability is good and drivers are stable


----------



## Metroid (Oct 28, 2020)

AMD wants margin, which is why only the 6800/6900 will be launched first; AMD wants to capitalize on the $600 GPUs first, then the 6700 and so on at around $400.


----------



## jabbadap (Oct 28, 2020)

B-Real said:


> 6900XT faster or as fast as 3090, and $500 less? LOOOL



*with rage mode on. Either way full blown competition is finally here again, can't wait to see full line ups from both IHVs.


----------



## neatfeatguy (Oct 28, 2020)

Awesome!

I'm scraping together cash for a new system build - with AMD and Nvidia having good cards with more reasonable prices over the last gen, I shouldn't have an issue finding one that fits my needs when the time comes to build a new system next month. Can't wait for the other cards and official reviews to come out.


----------



## DuxCro (Oct 28, 2020)

Those performance numbers of the RX 6900 XT... it said down next to it: + RAGE + the Infinity setting. Does that mean both of those things were activated for the benchmark?


----------



## Turmania (Oct 28, 2020)

_Flare said:


> no RT-Perf shown
> RX 6800 at 579 is too expensive



Yeah, it seems it's a direct competitor to the 3070, but with a higher price and higher power draw. They'll probably make a price adjustment before availability. I would just spend the extra $75 to get the 6800 XT.


----------



## MxPhenom 216 (Oct 28, 2020)

milewski1015 said:


> Missed opportunity to call it the 6969 XT
> 
> Hopefully availability is good and drivers are stable



Yeah, drivers are my main concern now that I know the performance is there for the most part. I wanna see actual reviewer benchmarks to see where the RTX performance is at.


----------



## repman244 (Oct 28, 2020)

Looks good, but I was hoping for 6800 to be cheaper...I guess the days of top cards going for 400€ are long gone.


----------



## B-Real (Oct 28, 2020)

windwhirl said:


> Now, the only thing left is the waiting game. AMD really needs to push the best drivers they can for November 18.


And pray for better stocks than NV this time.


----------



## HD64G (Oct 28, 2020)

Performance is a bit better than I expected, and that's because they pushed power draw to 300 W. But prices are logical. The feature set is very nice. Synergy with Zen 3 is a great marketing tool as well. Good product series, imho. AMD is turning into a ramming machine for gaming lately. The 6700 (XT) series is probably for next year.


----------



## B-Real (Oct 28, 2020)

_Flare said:


> no RT-Perf shown
> RX 6800 at 579 is too expensive


LOL


----------



## ratirt (Oct 28, 2020)

6900 XT at only 300 watts :O Not sure about you guys, but that's awesome, and at $999? Can't quite say "only" since it's a bit much, but it's beating the 3090 with so much less power.
I only wonder if there's any way these can be overclocked. I mean, 300 watts, so if you add 50 W more, what would the clocks look like?
Must say I'm impressed.


----------



## Vya Domus (Oct 28, 2020)

Turmania said:


> I expect price adjustment from Nvidia pretty soon, this is why I love competition.



You're gonna be left waiting, Nvidia is very much like Apple, no price change for the lifetime of the product. Last time they lowered prices was back in 2016 and even then it was a joke because most cards were being sold way over their MSRP anyway.


----------



## windwhirl (Oct 28, 2020)

ratirt said:


> 6900XT 300 Watts only :O Not sure about you guys but that's awesome and $999? Can't say only since it is a bit but beating 3090 and so much less power.
> I only wonder, If there is any way these can be overclocked. I mean 300 Watts so if you add 50W more what would the clocks look like?
> Must say I'm impressed.


Until the cards are reviewed, we don't know. But I think they already push the clocks quite a bit straight from the factory, so I don't think there will be huge gains from OCing it.


----------



## BoboOOZ (Oct 28, 2020)

_Flare said:


> no RT-Perf shown
> RX 6800 at 579 is too expensive


I think they anticipate good yields and they prefer to sell the 6800XT to you.

Which is probably the one I'll get, looks like awesome value.


----------



## B-Real (Oct 28, 2020)

MxPhenom 216 said:


> Yeah, drivers are my main concern now that I know the performance is there for the most part. I wanna see actual reviewer benchmarks to see where the RTX performance is at.


If you check NV's drivers over the last few years, they are no better than AMD's. AMD had bigger driver issues back in the days of the 7000 series, as far as I remember, and it burned into people's minds. Just like the running-hot and egg-frying things.



repman244 said:


> Looks good, but I was hoping for 6800 to be cheaper...I guess the days of top cards going for 400€ are long gone.


There'll definitely be smaller cards later.


----------



## Space Lynx (Oct 28, 2020)

AMD HAS BEAT NVIDIA!!!! RTX 3090 $1500 KILLER!!!!   RX 6900 XT AT $999!!!!  this is the golden age!!!!!


----------



## windwhirl (Oct 28, 2020)

lynx29 said:


> AMD HAS BEAT NVIDIA!!!! RTX 3090 $1500 KILLER!!!!   RX 6900 XT AT $999!!!!  this is the golden age!!!!!



The day of miracles has come? lol


----------



## Turmania (Oct 28, 2020)

2020 has become a weird year even more. The landscape has changed even in PC industry. Well done AMD. Now, hopefully you can send the samples to review sites so we can make a final decision before launch date.


----------



## MxPhenom 216 (Oct 28, 2020)

B-Real said:


> If you check NV's drivers over the last few years, they are no better than AMD's. AMD had bigger driver issues back in the days of the 7000 series, as far as I remember, and it burned into people's minds. Just like the running-hot and egg-frying things.
> 
> 
> There'll definitely be smaller cards later.



Except if you ask 5700XT owners on this site they still have problems a year later...


----------



## okbuddy (Oct 28, 2020)

6900xt $500 cheaper than no stock 3090


----------



## Space Lynx (Oct 28, 2020)

im buying 6900XT  100%


----------



## kapone32 (Oct 28, 2020)

DuxCro said:


> Those performance nubmers of RX 6900XT...it said down next to it..+ RAGE + the infinity setting. Does that mean both of those things were activated for benchmark?


Does it matter?


----------



## windwhirl (Oct 28, 2020)

lynx29 said:


> im buying 6900XT  100%


Cool down, dammit, you're too hyper for me with that signature LMAO


----------



## MxPhenom 216 (Oct 28, 2020)

lynx29 said:


> im buying 6900XT  100%



Jesus your signature is an eye sore


----------



## Tomgang (Oct 28, 2020)

Hot damn. If AMD's numbers are true, I may, for the first time ever, have to admit AMD nailed it on both the CPU and GPU side.

NVIDIA really has to release an RTX 3080 Ti now. That rumored RTX 3080 20 GB is already DOA.

One thing does concern me about choosing an AMD GPU for the first time ever: they really need to make sure they can release stable and trouble-free drivers. One of the reasons I have, until now, always chosen NVIDIA's cards is that NVIDIA's drivers seem more stable, though not trouble-free either.

So am I, for the first time, ending up with a pure AMD-based PC? I don't know yet. But it sure as hell seems to be a great time to replace X58 with whatever I end up choosing.


----------



## repman244 (Oct 28, 2020)

B-Real said:


> There'll definitely be smaller cards later.



I'm not talking about "smaller" cards, I'm talking about prices going sky high.


----------



## Turmania (Oct 28, 2020)

I would like to see a comparison between the 6900 XT and 6800 XT, since they draw the same power.


----------



## medi01 (Oct 28, 2020)

Curious that there are a bit more transistors in the 3090 than the 6900 XT (about a 1 billion difference).


----------



## dragontamer5788 (Oct 28, 2020)

I'm not buying until I see decent ROCm support (and 3rd party benches). 

But these numbers are impressive. The GPUs are definitely on my radar now.


----------



## B-Real (Oct 28, 2020)

repman244 said:


> Looks good, but I was hoping for 6800 to be cheaper...I guess the days of top cards going for 400€ are long gone.


There'll definitely be smaller cards later.


----------



## Cheeseball (Oct 28, 2020)

B-Real said:


> If you check NV's drivers over the last few years, they are no better than AMD's. AMD had bigger driver issues back in the days of the 7000 series, as far as I remember, and it burned into people's minds. Just like the running-hot and egg-frying things.
> 
> 
> There'll definitely be smaller cards later.



Ah, I can't agree with that. The drivers (at least specific for the RX 5700 XT) were not exactly great, especially with the downclocking in older games (DX 9 and 10 titles) and blackscreens/CTDs. If I recall, they were able to get it stable by the September 2019 releases. After that (and disabling any of the AMD optimization in Adrenalin), the RX 5700 XT was rock stable. Great hardware, bad software.


----------



## windwhirl (Oct 28, 2020)

dragontamer5788 said:


> I'm not buying until I see decent ROCm support (and 3rd party benches).
> 
> But these numbers are impressive. The GPUs are definitely on my radar now.


Wasn't ML, compute and all that moved to CDNA, which is now like its own product?


----------



## Space Lynx (Oct 28, 2020)

MxPhenom 216 said:


> Jesus your signature is an eye sore



you love it, don't lie to yourself.


----------



## Cheeseball (Oct 28, 2020)

kapone32 said:


> Does it matter?



It should, kind of. Most people won't be using Rage Mode on all the time (like I leave Anti-Lag and Chill disabled). Not sure about that Infinity setting though.


----------



## Turmania (Oct 28, 2020)

I never liked seeing Ryzen paired up with NVIDIA. That defeats the whole point of AMD, does it not? You are not truly a red until you get both Ryzen and Radeon.


----------



## Rahnak (Oct 28, 2020)

Kinda bummed neither company has shown any products under the 500€ mark but I'm pleased that AMD seems to have brought competition back to the high end space. Kudos to them.

Still curious about RT perf and DLSS competitor. Shame they didn't show anything about that.


----------



## Chrispy_ (Oct 28, 2020)

So the 6900XT is a 3090 at $999 instead of Nvidia's $1499 which is great.
But the 6800 is a 3070 at $579 instead of Nvidia's $499 which isn't so great.

The leaks were wrong about the CU counts of the 6800 series, too. The 6800 in particular was the card I think more people were hoping would be competitive, and it has already lost to NVIDIA on price, and that's going by AMD's own benchmark numbers rather than independent reviewer benchmarks. Don't get me wrong, I'm sure some people are going to be super hyped for the thousand-dollar 6900 flagship, but the stats show that only a tiny fraction of people actually buy those, compared to the stuff in the $200-500 price point.


----------



## nguyen (Oct 28, 2020)

Rasterization performance is good, but does AMD really expect to sell those GPUs at those prices? Or perhaps they can't price them any lower?


----------



## okbuddy (Oct 28, 2020)

72/80 = 0.9, so the 6800 XT already has 90% of the 6900 XT's CUs. For that last ~10% bump you need $350 more ($999 vs. $649).

Plus, only 3 video ports; the 3080 got five.
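The quick maths above can be sketched like this (using CU count as a rough proxy for performance, which is an assumption):

```python
# Price/perf comparison of the announced RX 6800 XT vs RX 6900 XT,
# treating compute-unit count as a crude performance proxy.
cards = {
    "RX 6800 XT": {"cus": 72, "price": 649},
    "RX 6900 XT": {"cus": 80, "price": 999},
}

xt, top = cards["RX 6800 XT"], cards["RX 6900 XT"]
cu_ratio = xt["cus"] / top["cus"]        # 72/80 = 0.9
extra_cost = top["price"] - xt["price"]  # $350

print(f"6800 XT has {cu_ratio:.0%} of the 6900 XT's CUs")
print(f"the remaining CUs cost ${extra_cost} extra")
```

By that crude proxy, the last ~10% of the silicon carries a 54% price premium.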


----------



## Cheeseball (Oct 28, 2020)

windwhirl said:


> Wasn't ML, compute and all that moved to CDNA, which is now like its own product?



RDNA (and most likely RDNA2) can still do compute, but it's just not as effective as the previous Radeon VII/RX Vega cards. It's also the reason why I'm still going to stick with my RTX 3080.


----------



## mahirzukic2 (Oct 28, 2020)

lynx29 said:


> you love it, don't lie to yourself.


I do, for sure. It's so over the top, gotta love it.


----------



## dragontamer5788 (Oct 28, 2020)

windwhirl said:


> Wasn't ML, compute and all that moved to CDNA, which is now like its own product?



Well, there's no information about CDNA either. Traditionally, AMD's compute line (ie: MI25) shares similarities with  their consumer line (ie: Vega64). MI6 is like Fury Nano. Etc. etc.

If this RDNA2 6900XT architecture is any good, AMD will make a compute-card with similar specs.


----------



## Manoa (Oct 28, 2020)

MxPhenom 216 said:


> Except if you ask 5700XT owners on this site they still have problems a year later...



I know someone personally who bought a 5700 XT Nitro+ new and traded it for a 1080 Ti because of many problems; this was just a couple of months ago :x


----------



## xkm1948 (Oct 28, 2020)

lynx29 said:


> im buying 6900XT  100%




DO IT MAN, JUST BUILD IT ALREADY


----------



## Arkz (Oct 28, 2020)

I wish they would announce a more friendly-priced lower-end card. I wanna upgrade from my RX 580, but really, about £250-300 is my limit. That used to get you a high-end card 10 years ago. If I spend that now, it will probably still be weaker than the Xbox Series X.


----------



## PLSG08 (Oct 28, 2020)

The RX 6800 non-XT could've been a much better card if it were priced at $499. It would've been competitive enough with the 3070.

Mind you, according to the note at the end of the stream, all the scores were obtained with a 5900X and an engineering board, which means the scores are the best-case scenario. If you were using a 3000 series CPU with a 400 series mobo, you'd most likely get less performance.

I kinda think AMD will do a price drop right before they launch the cards on Nov 18. They might drop the price of the 6800 non-XT to $525 or even the $499 price point, which would absolutely be competitive (if the slides are true about the performance).


----------



## Ibotibo01 (Oct 28, 2020)

Will AMD make RX 6800 8GB for $449-499? Please AMD, bring RX 6700 and 6600 series before Nvidia.


----------



## B-Real (Oct 28, 2020)

Cheeseball said:


> Ah, I can't agree with that. The drivers (at least specific for the RX 5700 XT) were not exactly great, especially with the downclocking in older games (DX 9 and 10 titles) and blackscreens/CTDs. If I recall, they were able to get it stable by the September 2019 releases. After that (and disabling any of the AMD optimization in Adrenalin), the RX 5700 XT was rock stable. Great hardware, bad software.


I didn't say they were great. I said the really bad drivers were back in the days of the 7000 series. I bought a 270X back then; I can remember the forums. But there were far fewer problems with the next 200 series. I used a 270X for 4 years and had zero issues. I've also been using an RX 570 for nearly 2 years now, same, zero issues. What I wanted to point out is that today's NV drivers are no better than AMD's. Just see what happened with the RTX 3000, and maybe 2.5-3 years back, with Watch Dogs 2 not starting up or instantly crashing with a WHQL driver, the Chrome video playback problems, etc.


----------



## Durvelle27 (Oct 28, 2020)

Now I have to debate between getting a RX 6800 XT or RX 6900 XT


----------



## mb194dc (Oct 28, 2020)

Presumably AMD's numbers are for the reference card. Wondering how much power and clock headroom the 6800 XT could have with good cooling....

It looks from the presentation that the numbers are for a boost clock of only 2250 MHz; chances are AIB cards with the rumored 2500 MHz boost clock can push even higher with more watts on water or better cooling.


----------



## MDWiley (Oct 28, 2020)

Oooh I’m excited. Third party reviews can’t come soon enough. I just hope the drivers are good. If so I might just switch from nvidia.


----------



## saikamaldoss (Oct 28, 2020)

MxPhenom 216 said:


> No one cares about Far Cry 6!



You may not but some do... lol


----------



## sepheronx (Oct 28, 2020)

Will have to add an additional $300 CAD to my budget to get that 6800XT


----------



## Space Lynx (Oct 28, 2020)

xkm1948 said:


> DO IT MAN, JUST BUILD IT ALREADY



I already have the MSI X570 Tomahawk, and my RAM should OC to the new Infinity Fabric 1:1 max limit of 4000 CL16 since I have B-die.

I'm only waiting on the CPU and GPU to come into stock to finish my build; the rest of the PC is already built.


----------



## Nucleoprotein (Oct 28, 2020)

Chrispy_ said:


> So the 6900XT is a 3090 at $999 instead of Nvidia's $1499 which is great.
> But the 6800 is a 3070 at $579 instead of Nvidia's $499 which isn't so great.
> 
> The leaks were wrong about the CU counts of the 6800 series, too - the 6800 in particular was the card I think more people were hoping would be competitive and they've already lost to Nvidia on price, and that's taking AMD's own benchmark numbers rather than the independent reviewer benchmark numbers realeased yesterday. Don't get me wrong, I'm sure some people are going to be super hyped for the thousand-dollar 6900 flagship but the stats show that only a tiny fraction of people actually buy them, compared to the stuff in the $200-500 price point.



6800 16GB vs 3070 8GB, see the difference in price?


----------



## kapone32 (Oct 28, 2020)

Chrispy_ said:


> So the 6900XT is a 3090 at $999 instead of Nvidia's $1499 which is great.
> But the 6800 is a 3070 at $579 instead of Nvidia's $499 which isn't so great.
> 
> The leaks were wrong about the CU counts of the 6800 series, too - the 6800 in particular was the card I think more people were hoping would be competitive and they've already lost to Nvidia on price, and that's taking AMD's own benchmark numbers rather than the independent reviewer benchmark numbers realeased yesterday. Don't get me wrong, I'm sure some people are going to be super hyped for the thousand-dollar 6900 flagship but the stats show that only a tiny fraction of people actually buy them, compared to the stuff in the $200-500 price point.


Isn't the 6800 competing with the 3080?


----------



## PLSG08 (Oct 28, 2020)

Ibotibo01 said:


> Will AMD make RX 6800 8GB for $449-499? Please AMD, bring RX 6700 and 6600 series before Nvidia.


They would most likely do a 6700 XT and non-XT, with both bringing 8 GB, to combat the 3060 release next year around Q1 and Q2.


----------



## xkm1948 (Oct 28, 2020)

lynx29 said:


> I already have the MSI X570 Tomahawk, and my RAM should OC to the new Infinity Fabric 1:1 max limit of 4000 CL16 since I have B-die.
> 
> I'm only waiting on the CPU and GPU to come into stock to finish my build; the rest of the PC is already built.




You getting the AMD reference or waiting again for AIC cards? Assuming 6900XT?


----------



## B-Real (Oct 28, 2020)

nguyen said:


> Rasterization performance is good, but does AMD really expect to sell those GPUs at those prices? Or perhaps they can't price them any lower?


What did you expect?

6800XT: 3080 performance, $50 less, less power consumption, +6 GB VRAM
6900XT: 3090 performance, $500 less, less power consumption, -8 GB VRAM


----------



## kapone32 (Oct 28, 2020)

sepheronx said:


> Will have to add an additional $300 CAD to my budget to get that 6800XT


Hopefully


----------



## Cheeseball (Oct 28, 2020)

B-Real said:


> I didn't say they were great. I said the really bad drivers were back in the days of the 7000 series. I bought a 270X back then; I can remember the forums. But there were far fewer problems with the next 200 series. I used a 270X for 4 years and had zero issues. I've also been using an RX 570 for nearly 2 years now, same, zero issues. What I wanted to point out is that today's NV drivers are no better than AMD's. Just see what happened with the RTX 3000, and maybe 2.5-3 years back, with Watch Dogs 2 not starting up or instantly crashing with a WHQL driver, the Chrome video playback problems, etc.



You mean the release driver for the RTX 3080? Yeah that was bad since there was CTDs all around, but they have resolved it with the second driver that came out a week later. My RTX 3080 is rock solid with hotfix 456.98 and the previous WHQL 456.71.

I know what you're talking about though, I did have a HD 7870 XT (1,536 cores, but only 2 GB at 256-bit) back in 2013 until 2016.


----------



## Turmania (Oct 28, 2020)

Durvelle27 said:


> Now I have to debate between getting a RX 6800 XT or RX 6900 XT


$350 more for 10% increase. Maths are not in favour of RX 6900 XT.


----------



## sepheronx (Oct 28, 2020)

kapone32 said:


> Hopefully



Yeah, got $650 aside for it.  So yeah, about $300 more.  Cause you and I both know us Canadians will get very screwed in terms of prices.


----------



## Quicks (Oct 28, 2020)

Will be skipping this gen as well. Maybe just skip PC gaming altogether; it's getting very expensive...


----------



## Durvelle27 (Oct 28, 2020)

Turmania said:


> $350 more for 10% increase. Maths are not in favour of RX 6900 XT.


But it's top tier and I have a 4K 120Hz VRR display


----------



## Turmania (Oct 28, 2020)

I expect that they will introduce lower wattage cards like 200w, 150w and 100w range but both companies do that after 6 months from launch.


----------



## dragontamer5788 (Oct 28, 2020)

Turmania said:


> $350 more for 10% increase. Maths are not in favour of RX 6900 XT.



It's called decoy pricing.

The 6900 XT is the decoy; it exists primarily to make the 6800 XT look better.


----------



## sepheronx (Oct 28, 2020)

Quicks said:


> Will be skipping this gen as well. Maybe just skip PC gaming altogether getting very expensive...



Don't fret yet.  Wait till the more mid range cards come out.  They may be very solid for 1440p and reasonable price.


----------



## Turmania (Oct 28, 2020)

Durvelle27 said:


> But its top tier and I have a 4K 120Hz VRR Display


Seems to me you already answered your own question. Go for it


----------



## pavle (Oct 28, 2020)

Cheeseball said:


> ..........I know what you're talking about though, I did have a HD 7870 XT (1,536 cores, but only 2 GB at 256-bit) back in 2013 until 2016.


HD 7870 has 1280 shaders by the way...

And there I thought my HD 6950 would be better than the new Hoseron 6900 XT.


----------



## birdie (Oct 28, 2020)

I'm waiting for reviews, and AMD kinda skimped on RTRT performance figures: "we have it, great, but we won't show the numbers". Perhaps, as predicted, RDNA 2.0 cards have great rasterization performance but not so much RTRT performance. Still, it's really great that we finally have competition in high-end graphics for the first time in many, many years. Also, AMD hasn't mentioned any DLSS alternative, which further casts doubt on their RTRT performance. It surely looks like NVIDIA will remain the king of RTRT-enabled games, whose number is only going to increase now that both consoles support RTRT.

And lastly, let me calm down everyone's excitement here: most people out there won't buy any of the RTX 3070/3080/3090 or RX 6800 (XT)/6900 XT cards: they are all priced very high.

I'm waiting for the midrange products, where we'll see these vendors' true colors: RTX 3050/3060 (Ti), RX 6500/6600. That's what really matters. Cards for rich Europeans and Americans - not so much. Again, the real money and the real market are below $300 for a GPU. We haven't yet seen anything from either AMD or NVIDIA in this regard.


----------



## windwhirl (Oct 28, 2020)

nguyen said:


> Rasterization performance is good, but does AMD really expect to sell those GPUs at those prices? Or perhaps they can't price them any lower?





Arkz said:


> I wish they would announce a more friendly priced lower end card. I wanna upgrade from my RX580 but really about £250-300 is my limit. That used to get you a high end card 10 years ago. If I spend that now it will still be weaker than the Xbox Series X probably.



Performance shown was 4K. Budget gamers (like myself  ) are probably still using 1080p and will continue to do so for a while. For them, there is little need to go overboard with a $600+ card. And besides, we only saw the top three cards (6800, 6800 XT and 6900 XT). We don't know what they're gonna launch below those.


Turmania said:


> $350 more for 10% increase. Maths are not in favour of RX 6900 XT.


At the top of the line, perf per dollar is not really as important as on the lower tiers.


----------



## Houd.ini (Oct 28, 2020)

windwhirl said:


> Now, the only thing left is the waiting game. AMD really needs to push the best drivers they can for November 18.


Their waiting game drivers are said to be up to snuff though.


----------



## dragontamer5788 (Oct 28, 2020)

birdie said:


> I'm waiting for midrange products where we'll see these vendors' true colors: RTX 3050/3060(Ti), RX 6500/6600. It's what really matters. Cards for the rich Europeans and Americans - no so much.



I'm certainly with you on that. All of these cards are out of my personal price range.

But it's still exciting to see the competition. The lower-end cards will almost certainly launch in the coming months, so today we get a preview of the future.


----------



## Space Lynx (Oct 28, 2020)

xkm1948 said:


> You getting the AMD reference or waiting again for AIC cards? Assuming 6900XT?



AMD reference for me (if possible; depends what's in stock on launch day, I imagine it will sell out fast). Sigh.


----------



## bpgt64 (Oct 28, 2020)

Makes me wonder if they'll have inventory...or nah


----------



## Cheeseball (Oct 28, 2020)

pavle said:


> HD 7870 has 1280 shaders by the way...
> 
> And there I thought my HD 6950 would be better than the new Hoseron 6900 XT.



Yeah, this is why I mentioned the HD 7870 XT, which is a cut-down HD 7950 with 1 GB less VRAM. Look up the HD 7870 XT JokerCard by Club3D and you'll know what I mean.

EDIT: Here ya go. Mine specifically was the PowerColor PCS+ HD 7870 MYST Edition.


----------



## DuxCro (Oct 28, 2020)

kapone32 said:


> Does it matter?


Eeeeer..... YES. Ofc it matters. Is the card on par with the RTX 3090 out of the box, or do I need to activate Rage mode (overclock it), pair it with a Zen 3 CPU and enable that mode in the BIOS for the card to compete with the RTX 3090? Yes, it matters.


----------



## Khonjel (Oct 28, 2020)

RX 6800 is 60CU > RTX 3070

RX 6800 XT is 72CU ≈ RTX 3080

RX 6900 XT is 80CU ≈ RTX 3090

Aside from so many Rs and Xs I had to type (geez!) I hope AMD has a backup for the rumored 3070 Ti and 3080 Ti. Though that's just my inner-AMDiot hoping. As a consumer, where my under $500 cards at?


----------



## pavle (Oct 28, 2020)

Oh of course... must have slipped my mind.


----------



## sepheronx (Oct 28, 2020)

Khonjel said:


> RX 6800 is 60CU > RTX 3070
> 
> RX 6800 XT is 72CU ≈ RTX 3080
> 
> ...



Agreed. Prices are getting to be ridiculous. While these prices are pretty decent, they are still very pricey for the average person. So a more midrange GPU is very ideal.


----------



## Chomiq (Oct 28, 2020)

If they actually deliver 6800 XT at the €649 (pricing was in USD, so I'm spitballing a 1:1 conversion which includes VAT) price point that will match the 3080 in most games - I'm all in.

Why?

Because a $700 3080 becomes an €800+ 3080 (and that's if you can buy one) in the EU. Simple as that.


----------



## Zyll Goliat (Oct 28, 2020)

Not bad....not bad at all!!!


----------



## EarthDog (Oct 28, 2020)

My questions...

What are the results when not using the "best API"? What is it like when the cards go H2H in DX11? DX12? Not one using DX11 and the other DX12.

What is the RT performance like? Not a peep...

Also, these results appear to be, as usual, the absolute best case for these cards (to be fair, they both do this). They are overclocking them and using the Infinity Cache against stock NVIDIA cards. ICache, while awesome, is only with AMD-based systems. With that, is it a BIOS update that enables the option?


----------



## phill (Oct 28, 2020)

Outstanding from AMD...  Can't wait to be given the choice!!


----------



## P4-630 (Oct 28, 2020)

Quicks said:


> Will be skipping this gen as well. Maybe just skip PC gaming altogether getting very expensive...



You can always get a console.


----------



## R0H1T (Oct 28, 2020)

lynx29 said:


> I already have the MSI X570 Tomahawk, and my RAM should OC to the new Infinity Fabric max limit 1:1 of 4000 CAS 16 since I have B-die.
> 
> I'm only waiting on the CPU and GPU to come in stock to finish my build. Already have the PC built.


Infinity Fabric, Infinity Cache... admit it, how many of you thought of this!


----------



## Legacy-ZA (Oct 28, 2020)

MxPhenom 216 said:


> Jesus your signature is an eye sore



So is your comment, the Lord is sad.


----------



## R0H1T (Oct 28, 2020)

EarthDog said:


> My questions...
> 
> What are the results not using "best API"? What is it like when the cards go H2H in DX11? DX12? Not one using DX11, the other 12.
> 
> ...


The 6900 XT is at best a 3080 Ti or Super competitor; the pricing kinda reflects that, & for 99% of the market that's about enough.


----------



## BoboOOZ (Oct 28, 2020)

dragontamer5788 said:


> Its called decoy pricing.
> 
> 6900 XT is the decoy, it exists primarily to make the 6800 XT look better.


Well, if you compare it to the 3090 it's dirt cheap, it consumes less and it probably has quite a lot of overclocking headroom.

Also, everybody looks at the MSRP; in my experience MSRP is worthless. Wait to see the actual availability and street prices.


----------



## DuxCro (Oct 28, 2020)

Can't wait to see how these thingies clock under water. Not by me ofc. I'm a poor bastardo. But still...curious.


----------



## nguyen (Oct 28, 2020)

EarthDog said:


> My questions...
> 
> What are the results not using "best API"? What is it like when the cards go H2H in DX11? DX12? Not one using DX11, the other 12.
> 
> ...



Asking all the important questions.

Well, let's just hope AMD can alleviate the situation with NVIDIA's Ampere stock problem. I'm sure as heck AMD won't be taking back any market share with this pricing scheme though; too expensive, even when rasterization might be all people care about.



B-Real said:


> What did you expect?
> 
> 6800XT: 3080 performance, $50 less, less power consumption, +6 GB VRAM
> 6900XT: 3090 performance, $500 less, less power consumption, -8 GB VRAM



I would be more worried about the 128 MB Infinity Cache running out and causing micro-stutter than about the 10 GB of GDDR6X running out  .


----------



## RH92 (Oct 28, 2020)

The negative side of this presentation is that AMD gave no numbers whatsoever about raytracing and "DLSS" (their equivalent) performance. Now, I don't know about you guys, but I buy a next-gen GPU for next-gen features, so considering both AMD and NVIDIA seem to be equal on rasterization, the fact that AMD showed no numbers for next-gen features isn't a good selling argument. More likely than not they are far behind NVIDIA in these fields, so yeah.

Also, the mid-tier 6800 is $80 more expensive than the 3070. Sure, it's more performant as well (not sure if the perf difference is worth the price difference), but at $580 I'm afraid AMD customers who were looking for competitive mid-tier pricing are going to be very disappointed. One thing is for sure: the days AMD was competing on price are gone!


----------



## MxPhenom 216 (Oct 28, 2020)

DuxCro said:


> Can't wait to see how these thingies clock under water. Not by me ofc. I'm a poor bastardo. But still...curious.



I'll let you know


----------



## ahenriquedsj (Oct 28, 2020)

I will buy the 6800 (performance / RAM)


----------



## BoboOOZ (Oct 28, 2020)

RH92 said:


> Also, the mid-tier 6800 is $80 more expensive than the 3070. Sure, it's more performant as well (not sure if the perf difference is worth the price difference), but at $580 I'm afraid AMD customers who were looking for competitive mid-tier pricing are going to be very disappointed. One thing is for sure: the days AMD was competing on price are gone!


Two words: street price


----------



## windwhirl (Oct 28, 2020)

BoboOOZ said:


> Also, everybody looks at the MSRP; in my experience MSRP is worthless. Wait to see the actual availability and street prices.


Goes double for me, I live in a different dimension regarding pricing and availability lol


----------



## Space Lynx (Oct 28, 2020)

RH92 said:


> The negative side of this presentation is that AMD gave no numbers whatsoever about raytracing and "DLSS" (their equivalent) performance. Now, I don't know about you guys, but I buy a next-gen GPU for next-gen features, so considering both AMD and NVIDIA seem to be equal on rasterization, the fact that AMD showed no numbers for next-gen features isn't a good selling argument. More likely than not they are far behind NVIDIA in these fields, so yeah.
> 
> Also, the mid-tier 6800 is $80 more expensive than the 3070. Sure, it's more performant as well (not sure if the perf difference is worth the price difference), but at $580 I'm afraid AMD customers who were looking for competitive mid-tier pricing are going to be very disappointed. One thing is for sure: the days AMD was competing on price are gone!



I couldn't care less about ray tracing; it doesn't impress me at all in the games I have seen it in. I would rather turn it off and gain an extra 30-50 frames for the smoothness of high-refresh gaming.


----------



## RH92 (Oct 28, 2020)

BoboOOZ said:


> Two words: street price



 ??? Could you elaborate ...


----------



## randompeep (Oct 28, 2020)

Khonjel said:


> RX 6800 is 60CU > RTX 3070
> 
> RX 6800 XT is 72CU ≈ RTX 3080
> 
> ...


For real... if the smartphone market has shown how it's done, why would they bother releasing low-end products, heh? 
Yet I'm sitting here waiting for a $60 RX 470/570 or a GTX 970... call it a deal for older games (5-10 years old) at 4K


----------



## Space Lynx (Oct 28, 2020)

The real winner of the day is the VRAM... so much more headroom than NVIDIA's lineup. Very nice.


----------



## AsRock (Oct 28, 2020)

MxPhenom 216 said:


> Show us Ray tracing performance for fuck sakes.
> 
> 
> 
> ...




Fuck Ray Tracing

And as for FC6, you speak for yourself.


----------



## BoboOOZ (Oct 28, 2020)

RH92 said:


> ??? Could you elaborate ...


MSRP is just theoretical; what matters is the actual price at retailers. Like right now, the MSRP for the 3080 is quite decent at $700, but can you actually buy one at MSRP? No.

Example: I bought my 5700 XT for $50 under MSRP with 2 games. The only NVIDIA card that I could buy for that price was a 2060, at $100 over MSRP. This is a permanent situation in France; MSRP is simply a theoretical figure.


----------



## TheLostSwede (Oct 28, 2020)

Turmania said:


> I would like to see a conparison with 6900xt and 6800xt


Is that when you only compare the cons?


----------



## EarthDog (Oct 28, 2020)

For the dupe thread..................

My questions...

What are the results when not using the "best API"? What is it like when the cards go H2H in DX11? DX12? Not one using DX11 and the other DX12.

What is the RT performance like? Not a peep...

Also, these results appear to be, as usual, the absolute best case for these cards (normal, of course). They are overclocking them and using the Infinity Cache, against stock NVIDIA cards. ICache, while awesome, is only with AMD-based systems. With that, is it a BIOS update that enables the option?

What systems can IC work in? Is it CPU-dependent or chipset-dependent? Will it work on B450 with a 1-series? 2-series? Etc.


----------



## Khonjel (Oct 28, 2020)

sepheronx said:


> Agreed. Prices are getting to be ridiculous. While these prices are pretty decent, they are still very pricey for the average person. So a more midrange GPU is very ideal.


Now that you mention it, I think AMD might decrease the price of the RX 6800 after all. Maybe to $520 or near that range, and slot in an RTX 3070 Ti competitor.

But I'm still being optimistic here. What will they name it? My guess is AMD wants to avoid the 6800X name because it might cause confusion with a Zen 5 Ryzen 6800X (AMD's own doing. Who gave them the grand idea to skip from RX 500 to RX 5000?). And how will they even price it?

By my guess, AMD wants the RX 6800 XT and 6900 XT in people's builds (pricing them lower than the competition) for a halo effect, if you will, while not being very interested in the 6800, or wanting more profit from it since it'll inevitably sell more units. Maybe turning off 20 CUs is just not profitable. Or maybe my extra-profit hypothesis is correct. People will pay a bit more for more performance anyway, a la RX 5700 XT vs RTX 2070 and 2070 Super; ofc I'm not considering AMD driver woes or NVIDIA fans who'll always buy NVIDIA.


----------



## kapone32 (Oct 28, 2020)

EarthDog said:


> For the dupe thread..................
> 
> My questions...
> 
> ...


I am going to be selfish and say that none of these things matter. The most important thing is that they are all faster than the 5700 XT


----------



## Chomiq (Oct 28, 2020)

AMD said that Rage mode is supposed to give you a 1-2% boost due to the increased power limit. The rest is due to their SAM.

How much is attributed to "Infinity Cache" in this scenario? Does the base result for the RX 6800 XT include the benefits from Infinity Cache? (I'm assuming this, since it's supposed to be tested on a 5900X.) If so, what's the base result for a non-5000-series CPU?

This is important because suddenly that price difference between an RTX and an RX might be covered by the additional expense of an upgrade to a 5000-series CPU.


----------



## Cheeseball (Oct 28, 2020)

EarthDog said:


> For the dupe thread..................
> 
> My questions...
> 
> ...



I think the Infinity Cache is that built-in 128 MB that they mention on their slides. I hope it's system-agnostic though and doesn't rely on having an AMD motherboard and Zen 3 CPU.


----------



## mahirzukic2 (Oct 28, 2020)

EarthDog said:


> For the dupe thread..................
> 
> My questions...
> 
> ...


Well, in that case, wait for the actual reviews, which should come in about 2-3 weeks from now.


----------



## Khonjel (Oct 28, 2020)

randompeep said:


> For real... if the smartphone market has shown how it's done, why would they bother releasing low-end products, heh?
> Yet I'm sitting here waiting for a $60 RX 470/570 or a GTX 970... call it a deal for older games (5-10 years old) at 4K


It's funny that you mention the smartphone market. Phone makers, even Apple, are making phones at ye olden days' $700 flagship prices again. Ofc they don't call it a flagship, but it's 90% flagship anyway.


----------



## r9 (Oct 28, 2020)

I'm gonna quote Gordon Ramsay: FUCK ME! I've never been so happy being wrong!


----------



## saikamaldoss (Oct 28, 2020)

I can't wait to get the 5950X and 6900 XT


----------



## EarthDog (Oct 28, 2020)

kapone32 said:


> I am going to be selfish and say that none of these things matter. The most important thing is they are all faster than the 5700XT


lol, your bar was low if beating the 5700XT makes you happy.

On a more serious note, those questions are all quite valid.


Chomiq said:


> AMD said that Rage mode is supposed to give you a 1-2% boost due to the increased power limit. The rest is due to their SAM.
> 
> How much is attributed to "Infinity Cache" in this scenario? Does the base result for the RX 6800 XT include the benefits from Infinity Cache? (I'm assuming this, since it's supposed to be tested on a 5900X.) If so, what's the base result for a non-5000-series CPU?


Yep, I saw the same thing. You only get this 'significant' uptick when you have an AMD-based system...

... but what, EXACTLY, is that? Do I have to have a specific generation of CPU? Chipset? What about how these will fare on the MAJORITY of systems out there (Intel-based) that DO NOT get these boosts? Feels a bit misleading... (but par for the course for AMD/NV).



mahirzukic2 said:


> Well in that case, wait for the actual reviews which should come in about 2 - 3 weeks from now.


In other news, water is wet. 



Cheeseball said:


> I think the Infinity Cache is that built-in 128 MB that they mention on their slides. I hope it's system-agnostic though and doesn't rely on having an AMD motherboard and Zen 3 CPU.


uhhh, that is exactly what it is....I'm looking for the specific DETAILS.


----------



## TheLostSwede (Oct 28, 2020)

okbuddy said:


> plus only 3 video ports, 3080 got five


You can only use four simultaneously though.


----------



## BoboOOZ (Oct 28, 2020)

Oh yeah, and since we're all talking about the new stuff: the Infinity Cache will be awesome for APUs. Can't wait to see what RDNA2 APUs are going to look like.


----------



## mahirzukic2 (Oct 28, 2020)

Cheeseball said:


> I think the Infinity Cache is that built-in 128 MB that they mention on their slides. I hope it's system-agnostic though and doesn't rely on having an AMD motherboard and Zen 3 CPU.


I would pretty much think that's the way it is, because it would be horrible value if it had to be paired with a Zen 3 platform and nothing else, which would force them to set a lower MSRP than they currently have.

So I think it's safe to assume it is system-agnostic and it just works like any other HW:

you get HW, install it in the system
you install drivers in your OS
????
profit


----------



## saikamaldoss (Oct 28, 2020)

Khonjel said:


> RX6800 is 60CU > RTX 3070
> 
> RX 6800 XT is 72CU ≈ RTX 3080
> 
> ...



Easy-peasy... an RX 6700 with 54 or 50 CUs; if not, they'll do a price cut on the 6800, lol ha ha ha


----------



## Cheeseball (Oct 28, 2020)

TheLostSwede said:


> You can only use four simultaneously though.



The RTX 3080 only has four (by default). Which vendor had the one with five ports?


----------



## Space Lynx (Oct 28, 2020)

r9 said:


> I'm gonna quote Gordon Ramsay: FUCK ME! I've never been so happy being wrong!



I see the unbelievers are finally arriving.  Useless vermin.


----------



## Turmania (Oct 28, 2020)

AMD pulled a great trick here to try to lure their Ryzen user base to Radeon.


----------



## r9 (Oct 28, 2020)

That 6800 is kind of pricey at $579.


----------



## Space Lynx (Oct 28, 2020)

r9 said:


> That 6800 is kind of pricey at $579.




Double the VRAM for an extra 79 bucks is a fair deal imo.


----------



## r9 (Oct 28, 2020)

lynx29 said:


> I see the unbelievers are finally arriving.  Useless vermin.


I'm too happy at the moment so I'm not gonna even get offended. Lol


----------



## TheLostSwede (Oct 28, 2020)

Durvelle27 said:


> Now I have to debate between getting a RX 6800 XT or RX 6900 XT


Sounds like a nice dilemma to have.


----------



## BoboOOZ (Oct 28, 2020)

mahirzukic2 said:


> I would pretty much think that's the way it is, because it would be horrible value if it would be pair with any other platform other than Zen 3, which would force them to put up lower MSRP than they currently have.
> 
> So I think it's safe to assume it is system agnostic and it just works like any other HW:
> 
> ...


Yep, it's just a GPU cache similar to other caches; they must've done something to improve the size/latency ratio, similar to Zen 3. And just as a CPU cache works irrespective of the rest of the system, so does the GPU cache.


----------



## Khonjel (Oct 28, 2020)

saikamaldoss said:


> _Easy_-_Peasy Chinesy... _RX6700 with 54CU or 50CU if not will do a price cut on 6800 lol ha ha ha


The rumor mill/leaks so far point to the 6700 XT being Navi 22, aka 40 CUs, but with blistering clock speeds. But anyway, who knows.


----------



## windwhirl (Oct 28, 2020)

r9 said:


> That 6800 is kind of pricey at $579.


We're talking 4K performance and 16 GB of VRAM. Not a bad price all things considered.

I'd say RTRT too, but it's not clear how that fares.


----------



## Space Lynx (Oct 28, 2020)

TheLostSwede said:


> Sounds like a nice dilemma to have.



I am leaning towards the 5600X and 6800 XT, but honestly, at the end of the day I'm going to order whatever I can click on fastest that is in stock on launch day for each.


----------



## RedelZaVedno (Oct 28, 2020)

Pricing, with the exception of the 6900 XT, is a complete joke. Much worse than Zen 3's, because Zen 3 dominates Intel and RDNA2 does not.

-I mean, $579 for the RX 6800... no AI upscaling (DLSS), a poorer RT implementation. You basically overpay $79 for 8 gigs of VRAM. Just awful.
-RX 6800 XT at $649: again, no "DLSS", no RT benchmark comparison; the only reasons for buying AMD are 6 gigs more GDDR6 and 50 bucks less. Not enough, by far.
-The only good GPU is the 6900 XT for $999.

AMD just shot itself in the foot again, this time without a reason. The rasterization performance is here; all AMD had to do was substantially undercut Ampere pricing (aka RX 6800 for $449 and RX 6800 XT for $599) and reviewers would have had to say they're the smart-buy option. As things stand today, I expect the NGreedia GPU division to grow its market share even further toward the 90% mark, at which point calling AMD an underdog in the discrete DIY GPU market becomes an understatement. So sad


----------



## r9 (Oct 28, 2020)

lynx29 said:


> double the vram for an extra 79 bucks is a fair deal imo.


Personally, I would take a 6800 with 8 GB for $449. I'll take the 4K chance.


----------



## BoboOOZ (Oct 28, 2020)

Where is @Vayra86 , I think he has to buy a 6900XT now...


----------



## kings (Oct 28, 2020)

I think that AMD missed an opportunity with the 6800. It's true that it has 16 GB of VRAM, but it should lose a lot in RT and it doesn't have any DLSS equivalent, while being $79 more expensive.

In my view, AMD would have made a better move putting this card out with 8 GB, like the RTX 3070, and selling it below $500.


----------



## wheresmycar (Oct 28, 2020)

It's all about 1440p 144 Hz gaming for me, and both the 6800 and 6800 XT look amazing for the task. 



> *Update 16:15 UTC*: When paired with Ryzen, you get a gaming performance boost, 13% perf increase.



This is the bit which has me a little confused. These GPUs work better with Ryzen (assuming Zen 3) CPUs? What am I missing here?


----------



## Dave65 (Oct 28, 2020)

Damn, some nice pricing for sure, and that 16 gig.
Wonder how things are in the NVIDIA crowd today


----------



## TheLostSwede (Oct 28, 2020)

dragontamer5788 said:


> Its called decoy pricing.
> 
> 6900 XT is the decoy, it exists primarily to make the 6800 XT look better.


Does that apply to the RTX 3090 as well then?


----------



## Chomiq (Oct 28, 2020)

EarthDog said:


> Yep, I saw the same thing. You only get this 'significant' uptick when you have an AMD based system...
> 
> ... but what, EXACTLY is that? Do I have to have a specific generation of CPU? Chipset? What about how these will fare on the MAJORITY of systems out there (intel based) that DO NOT get these boosts? Feels a bit misleading...(but par for the course from AMD/NV).


I think TPU's post cleared that up:

AMD Announces the Radeon RX 6000 Series: Performance that Restores Competitiveness - www.techpowerup.com

AMD Infinity Cache - A high-performance, last-level data cache suitable for 4K and 1440p gaming with the highest level of detail enabled. *128 MB of on-die cache dramatically reduces latency and power consumption, delivering higher overall gaming performance than traditional architectural designs.*
AMD Smart Access Memory - *An exclusive feature of systems with AMD Ryzen 5000 Series processors, AMD B550 and X570 motherboards and Radeon RX 6000 Series graphics cards*. It gives AMD Ryzen processors greater access to the high-speed GDDR6 graphics memory, accelerating CPU processing and providing up to a 13-percent performance increase on an AMD Radeon RX 6800 XT graphics card in Forza Horizon 4 at 4K when combined with the new Rage Mode one-click overclocking setting.
So Infinity Cache is on the GPU die, while Smart Access Memory is enabled on a 5000-series CPU, a B550 or X570 board and a 6000-series GPU. Which means you shouldn't see a drastic decrease in the base result on a non-5000-series CPU with similar performance.
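As a toy illustration of how an on-die last-level cache like this can multiply effective bandwidth over a 256-bit bus, here's a simple hit-rate model; every number below is a made-up assumption for illustration, not an AMD figure:

```python
# Toy effective-bandwidth model for a GPU with a large last-level cache.
# Every number here is a made-up assumption for illustration, not an AMD spec.
vram_bw = 512.0     # GB/s: 256-bit GDDR6 at 16 Gbps (assumed)
cache_bw = 2000.0   # GB/s: hypothetical on-die cache bandwidth
hit_rate = 0.40     # hypothetical fraction of accesses served from the cache

# Accesses that hit the cache see cache bandwidth; misses fall through to VRAM.
effective_bw = hit_rate * cache_bw + (1 - hit_rate) * vram_bw
print(f"Effective bandwidth: {effective_bw:.0f} GB/s "
      f"({effective_bw / vram_bw:.2f}x raw VRAM bandwidth)")
```

Even a modest hit rate against a much faster on-die pool pushes the blended figure well past 2x the raw VRAM number, which is the kind of multiplier AMD is advertising. The real behavior obviously depends on workload and resolution.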


----------



## Space Lynx (Oct 28, 2020)

TheLostSwede said:


> Does that apply to the RTX 3090 as well then?



I know you have connections, LostSwede. Get me a reserve spot for a 5600X and 6800 XT, you know you want to!


----------



## EarthDog (Oct 28, 2020)

Dave65 said:


> Damn some nice pricing for sure and that 16 gig.
> Wonder how things are in the nvidia crowd today


No F's given... competition is AWESOME!

That said, I'm surprised they reached 3090 levels... but I'd like to see results without overclocking (Rage) and Infinity Cache (AMD only)...



Chomiq said:


> I think TPU's post cleared that up :
> 
> 
> 
> ...


Ty!

Sorry, there are like 3 threads going on here about this subject... The one birdie posted, the live blog, and then the summary...

So, basically, you need the latest and greatest to get that performance. Cool beans. Awesome for those going balls deep into the ecosystem; bad news for the majority of existing PC owners, who won't get the boost at all because they are running Intel.


----------



## Cheeseball (Oct 28, 2020)

RedelZaVedno said:


> Pricing is a complete joke. Much worse than Zen 3, because Zen 3 dominates and RDNA2 doesn't. I mean, $579 for the RX 6800... no AI upscaling, a poorer RT implementation. You basically overpay $79 for 8 gigs of VRAM. Just awful.
> RX 6800 XT at $649: again, no "DLSS", no RT benchmark comparison; the only reasons for buying AMD are 6 gigs more GDDR6 and 50 bucks less. Not enough, by far. The only good GPU is the 6900 XT for $999. AMD just shot itself in the foot again, this time without a reason. The rasterization performance is here; all AMD had to do was substantially undercut Ampere pricing (aka RX 6800 for $449 and RX 6800 XT for $599) and reviewers would have had to say they're the smart-buy option. As things stand today, I expect the NGreedia GPU division to grow its market share even further toward the 90% mark, at which point calling AMD an underdog in the discrete DIY GPU market becomes an understatement. So sad



Did you expect AMD to have some sort of AI/Tensor-focused implementation that could take on NVIDIA's DLSS? RDNA and RDNA2 are mostly rasterization improvements, which is mostly good for gaming and 3D workstation use. They've been brute-forcing power and it seems to be going well without overdoing the power consumption. This is a win-win for them regardless.


----------



## R0H1T (Oct 28, 2020)

Infinity Cache is on-die L2 or L3 (an L2+L3 cache?); not sure where anyone got the idea that it depends on anything else! As for Smart Access Memory, I assume everyone forgot *Heterogeneous compute* (HSA) or the unified memory pool? Well, this is probably one early implementation of it &, no, you can't just add any (AMD) CPU to make it work.


----------



## dragontamer5788 (Oct 28, 2020)

TheLostSwede said:


> Does that apply to the RTX 3090 as well then?



Yup.

Decoy pricing is a fundamental marketing technique. It applies to both the low-end (why buy "small fries" when "medium fries" are just 10-cents more expensive for double the food?), as well as the high-end ($1500 is clearly not reasonable. So obviously $700 is a reasonable price).

------

Everyone does it. "i3" doesn't have enough features, "i7" is too expensive, clearly "i5" is the perfect CPU. Eventually, when Intel wants to sell more i7s, they invent the i9 so that the i9 can serve as the decoy and "nudge" people towards buying the i7.


----------



## Chrispy_ (Oct 28, 2020)

kapone32 said:


> Isn't the 6800 competing with the 3080?


I think the XT is competing with the 3080. The vanilla 6800 is quite a significant CU and clock-speed chop from the 6800 XT; I guess it may compete with a 3070 Ti, if NVIDIA decides to officially counter it at $579.


----------



## randompeep (Oct 28, 2020)

Khonjel said:


> It's funny now you mention smartphone market. Phone makers even Apple are making phones in ye olden days' $700 flagship prices again. Ofc they don't call it flagship but it's 90% flagship anyway.


In a world where consumers become more and more different from yesterday's, flagships are getting pricier. The cost of production also gets higher, and so does the profit... but I'd say everyone would be satisfied if the GPU market were about seeing the full stack on release day.
Too many poor kids are talking about the next-gen GPUs, yet their budget for a new PC is 3-4 times smaller than the actual flagship product. 
And yeah, as a consumer I expect to see more on the first day, with later add-ons (XTs/Tis) and gimmicks (power-limit unlocks, for example).
BTW, how can you say 90% flagship for Apple products? It's actually a really variable percentage depending on the time of the year... but as for Q4 of every year, I see Apple leading the smartphone/tablet market any hour, in both single-core and multi-core performance... I mean, it's their Christmas beast-mode, $1,500-per-purchase season.
So both NVIDIA and AMD want to join the campus somehow.


----------



## BoboOOZ (Oct 28, 2020)

kings said:


> I think that AMD missed an opportunity with the 6800. It's true that it has 16GB VRAM, but it should lose a lot in RT and it doesn't have DLSS, being $79 more expensive.
> 
> In my view, AMD had made a better move to have put this card with 8GB like the RTX 3070 and sold it below $500.


I think it's not an interesting card for them to sell: it shares the same big die with its bigger brothers, so they lose money by artificially cutting down the die, because very few dies will have enough defective CUs to come out naturally with only 60 CUs.

I'm sure the 6700XT will be more interesting, so either buy a 3070 or a 6800XT, or wait for the 6700XT.


----------



## RedelZaVedno (Oct 28, 2020)

Dave65 said:


> Damn some nice pricing for sure and that 16 gig.
> Wonder how things are in the nvidia crowd today


$80 for +8 GB of GDDR6... Micron 1 GB GDDR6 @ 14 Gbps currently costs $6.86 when ordered in low quantities, and much less when ordered in bulk. You're paying double the price AMD pays for it. How is that a good deal?
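The markup claim above can be sanity-checked with back-of-the-envelope arithmetic. A quick sketch, assuming the quoted $6.86/GB low-volume spot price (AMD's actual contract pricing is unknown):

```python
# Rough check of the VRAM markup claim, using only the figures quoted above.
SPOT_PRICE_PER_GB = 6.86  # USD, Micron 1 GB GDDR6 @ 14 Gbps, low-volume quote
EXTRA_VRAM_GB = 8         # RX 6800 (16 GB) vs RTX 3070 (8 GB)
PRICE_DELTA = 80          # USD gap the post attributes to the extra VRAM

component_cost = SPOT_PRICE_PER_GB * EXTRA_VRAM_GB  # cost of the extra 8 GB at spot
markup = PRICE_DELTA / component_cost               # what the buyer pays per spot dollar

print(f"Extra VRAM at spot price: ${component_cost:.2f}")
print(f"Implied markup at spot: {markup:.2f}x")
```

At spot pricing the markup works out to roughly 1.46x; the "double" figure only holds if AMD's bulk price is well below spot, which is plausible but not verifiable from these numbers alone.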


----------



## R0H1T (Oct 28, 2020)

RedelZaVedno said:


> How is that a good deal?


Really now? So do you buy groceries at the same rate farmers sell them to, say, Amazon or any other retailer?


----------



## Vayra86 (Oct 28, 2020)

windwhirl said:


>



OK, they fucking nailed it this time.

No further comments. Might switch for this.


----------



## r9 (Oct 28, 2020)

AMD has both Intel and Nvidia between a rock and a hard place. Think of Big Navi as just the beginning, the equivalent of the first Ryzen. Once they get some money coming in from it, things will get even better for AMD.
Both Intel and Nvidia have the money to fight back, so this will propel the tech game big time!


----------



## Space Lynx (Oct 28, 2020)

r9 said:


> AMD has both intel and nvidia between a rock and a hard place. Think of Big navi as just the beginning equivalent to the first Ryzen. Once they get some money to come from it will get even better for AMD.
> Both intel and nvidia have the money to fight it, this will propel the tech game big time!




It's a glorious time to be a gamer, that is certain.


----------



## TheLostSwede (Oct 28, 2020)

Cheeseball said:


> The RTX 3080 only has four (by default). Which vendor had the one with five ports?


Some even have six.


----------



## Vayra86 (Oct 28, 2020)

BoboOOZ said:


> Where is @Vayra86 , I think he has to buy a 6900XT now...



6900 even? I will settle for anything in the 3070-3080 performance range; it seems like the 6800 XT is as high as I'll need to go. And less really is more in my case, as I'll max out at 1440p, no problem.

But yeah. Damn, son! I'll concede right here right now I've underestimated Navi/RDNA2. This really is great news, and long overdue.


----------



## Space Lynx (Oct 28, 2020)

the thing that sucks the most is that AMD is still using 7 nm fab capacity to make the Ryzen 3000 series and the Navi RX 5000 series... like seriously, screw it, go all-in on the new stuff so it doesn't sell out on day 1...


----------



## Vayra86 (Oct 28, 2020)

lynx29 said:


> the thing that sucks the most is that AMD is still using 7 nm fab capacity to make the Ryzen 3000 series and the Navi RX 5000 series... like seriously, screw it, go all-in on the new stuff so it doesn't sell out on day 1... that's the kind of shit that pisses me off.



That makes about as much sense as your signature right now.


----------



## BoboOOZ (Oct 28, 2020)

Vayra86 said:


> 6900 even? I will settle for anything in the 3070-3080 performance range; it seems like the 6800 XT is as high as I'll need to go. And less really is more in my case, as I'll max out at 1440p, no problem.
> 
> But yeah. Damn, son! I'll concede right here right now I've underestimated Navi/RDNA2.


With the pricing, they're really pushing you to buy the 6800 XT anyway. I'm curious where the perf/dollar will land compared to the 3070.


----------



## GhostRyder (Oct 28, 2020)

Huh, I was surprised that there even is a 6900 XT; I thought that was just a rumor and that in reality the 6800 XT was the top card, merely competitive with the 3080. Very impressive based on the slides, but I will still be waiting on reviews, like those from @W1zzard.

However, I definitely will be waiting now for the 6900 XT reviews, and at that price, if it's competitive, I will be getting a 6900 XT to replace my Titan X (Pascal), since I was unable to secure a 3080 or a 3090.

I am just happy we finally have top end competition again.


----------



## Deleted member 24505 (Oct 28, 2020)

Looks like AMD has kicked Ngreedia in the nuts


----------



## Vayra86 (Oct 28, 2020)

BoboOOZ said:


> With the pricing, they're really pushing you to buy the 6800 XT anyway. I'm curious where the perf/dollar will land compared to the 3070.



Certainly but even with these optimistic numbers, I'm waiting for reviews, and I'm going to wait out the early adopter woes too. This 1080 is running as it did on day one... no rush. Let it all settle... this perf level isn't going places anytime soon, the jump is substantial and I expect minor updates/refreshes from here on out for the coming 2 years. Much like Turing was to Pascal.


----------



## randompeep (Oct 28, 2020)

Idk man... they've already got their stuff in console hardware, and a 'few' supercomputers built around Ryzen all over the world. And now, eventually, they'll win the consumer PC market too.
Whoever bought some stock in the early 2010s must be rich by now.


----------



## Vayra86 (Oct 28, 2020)

GhostRyder said:


> Huh, I was surprised that there even is a 6900XT as I thought that was just a rumor and the reality was the 6800XT was the top with it just being competitive with the 3080.  Very impressive based on slides, but I will still be waiting on reviews like from @W1zzard .
> 
> However, I definitely will be waiting now for the 6900 XT reviews and for that price if its competitive I will be getting a 6900XT to replace my Titan X (Pascal) since I was unable to secure a 3080 or a 3090.
> 
> I am just happy we finally have top end competition again.



I reckon 6900XT is what the 3090 is to the 3080, maybe give or take a few %.


----------



## RedelZaVedno (Oct 28, 2020)

Cheeseball said:


> Did you expect AMD to have some sort of AI/Tensor-focused implementation that could take on NVIDIA's DLSS? RDNA and RDNA2 are mostly rasterization improvements, which is mostly good for gaming and 3D workstation use. They've been brute-forcing power and it seems to be going well without overdoing the power consumption. This is a win-win for them regardless.


How so? Nvidia has a 3070 Ti up their sleeve, and AMD just enabled $600 pricing for it. AKA the 6800 at $579 is dead. $50 less for a GPU that is trading blows with the 3080 but has no AI upscaling and probably far worse ray tracing performance is a non-seller too. AMD is the company with 20% discrete GPU market share, not Nvidia, yet they're acting like their brand is equal in the GPU world. I expect xx70/xx80 Ampere to outsell RDNA2 10 to 1. That's the harsh reality check AMD will have to face if they don't lower their prices. And that's coming from an ATI 4870/AMD 6870/R9 290X/RX 480/Vega 56 owner, not some Nvidia fanboy


----------



## Chrispy_ (Oct 28, 2020)

saikamaldoss said:


> _Easy_-_Peasy Chinesy... _RX6700 with 54CU or 50CU if not will do a price cut on 6800 lol ha ha ha


I don't think they'll chop it down that much. The most they chopped down the 40CU Navi10 die was to 32CU for the OEM-only RX 5600. Dropping Navi21 from 80CU to 60CU is already a very deep cut with a lot of wasted/dead silicon. We may yet see something with 12GB if AMD decide to harvest chips with defective memory controllers, but silicon costs and yields would need to be atrocious for a 50CU part out of 80CU silicon to make financial sense.

More likely there will be a big hole in AMD's lineup for a while, as Navi22 starts at 40CU and will likely replace the 5700XT for the $250-450 price points - It won't be a worthwhile upgrade for 5700-series owners but it'll add DXR raytracing support and bring a minor performance boost from the architectural upgrades.

AMD talked about huge performance/Watt gains in their presentation but I think a 40CU Navi10 vs a 40CU Navi22 will show only minor architectural, clock-for-clock performance gains. I'm just guessing at this point but I suspect the 6700XT will be 25% faster than the 5700XT, and at least half of that performance will come from higher clocks, not architectural advances.
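The guess above can be made concrete with a quick sketch. All numbers here are this post's speculation, plus an assumed clock split; nothing is measured:

```python
# If the 6700 XT ends up ~25% faster overall and "at least half" of that
# comes from clocks, how much clock-for-clock (architectural) gain is left?
# Both figures are speculative; the 12% clock uplift is an assumed example.
total_gain = 1.25   # guessed 6700 XT vs 5700 XT overall speedup
clock_gain = 1.12   # hypothetical ~12% higher clocks (roughly half the gain)

# Speedups compose multiplicatively, so divide rather than subtract:
arch_gain = total_gain / clock_gain
print(f"Implied clock-for-clock gain: {(arch_gain - 1) * 100:.1f}%")
```

Under these assumptions, the architectural share comes out to roughly 11-12%, which is why "half from clocks" leaves only a modest clock-for-clock uplift.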


----------



## GhostRyder (Oct 28, 2020)

Vayra86 said:


> I reckon 6900XT is what the 3090 is to the 3080, maybe give or take a few %.


Yea, the jump between the top two SKUs on both sides is marginal. I was just shocked it existed (at least this soon). It's good that we'll see some top-end competition again, like in the old days, so we can actually have more choices.


----------



## Vayra86 (Oct 28, 2020)

RedelZaVedno said:


> How so? Nvidia has a 3070 Ti up their sleeve, and AMD just enabled $600 pricing for it. AKA the 6800 at $579 is dead. $50 less for a GPU that is trading blows with the 3080 but has no AI upscaling and probably far worse ray tracing performance is a non-seller too. AMD is the company with 20% discrete GPU market share, not Nvidia, yet they're acting like their brand is equal in the GPU world. I expect xx70/xx80 Ampere to outsell RDNA2 10 to 1. That's the harsh reality check AMD will have to face if they don't lower their prices. And that's coming from an ATI 4870/AMD 6870/R9 290X/RX 480/Vega 56 owner, not some Nvidia fanboy



LOL. You would pay for proprietary DLSS? That's an odd one. That's like paying for PhysX and its per-title implementation. Realistically though... all DLSS is, is a performance tweak with visual impact. If the performance is already there... why would you?

Will Nvidia outsell this time? I'm not so sure, but probably... still doesn't say much about what's the better GPU choice right now. I think the verdict on that is still not entirely in and DLSS certainly isn't the deal maker or breaker IMO.

A few facts are clear right now:
- AMD has an architectural advantage right now with Infinity Cache, as they now have a technology that gives them a higher efficiency VRAM subsystem. To top it off, they can make do with _slower memory_ - and not just a smaller bus. That is huge.
- AMD has a substantial VRAM advantage at the same price point and where it matters for the resolution it wants to serve
- AMD has capacity parity with the new consoles
- AMD clearly has a better node leading to a cost efficient die.

If you ask me, the stars have aligned and favoring DLSS over the above list of clear advantages, both short and long term, seems like a very odd, and perhaps even biased decision. I'd reconsider.

*signed, prior Nvidia advocate. The tables have turned.



tigger said:


> Looks like AMD has kicked Ngreedia in the nuts



More like sledgehammer to the face...


----------



## GhostRyder (Oct 28, 2020)

Vayra86 said:


> Will Nvidia outsell this time? I'm not so sure, but probably... still doesn't say much about what's the better GPU choice right now. I think the verdict on that is still not entirely in and DLSS certainly isn't the deal maker or breaker IMO.


Yea, I agree. I think the only way they'll outsell Nvidia is if they have a high volume of GPUs available while Nvidia can't get enough out, especially before the holidays. AMD still has to rebuild consumer confidence by having these GPUs actually live up to the hype; then we could see them coming back in full.


----------



## Chrispy_ (Oct 28, 2020)

Khonjel said:


> The rumor mill/leaks so far lead to 6700 XT being Navi 22 aka 40 CU but with blistering clock speeds. But anyways who knows.


I hope the rumour mill is wrong in this case, otherwise there's going to be a big old gap in the lineup again, and this time that gap will be right at the upper mainstream sweet spot (which seems to be $300-400 now, apparently).

If Navi22 happens to have 44 or 48CU, that would probably be better for bridging the gap to the 60CU of the vanilla RX 6800. I already replied a few posts up that chopping Navi21 from 80CU down to 60CU is already a huge cut. Maybe I'm wrong, but I can't believe AMD will want to hit a high-volume, popular price point with big silicon trimmed down even further.


----------



## BoboOOZ (Oct 28, 2020)

Chrispy_ said:


> AMD talked about huge performance/Watt gains in their presentation but I think a 40CU Navi10 vs a 40CU Navi22 will show only minor architectural, clock-for-clock performance gains. I'm just guessing at this point but I suspect the 6700XT will be 25% faster than the 5700XT, and at least half of that performance will come from higher clocks, not architectural advances.


I would be a little more optimistic: the 5700 XT had an overclocked core while the VRAM lagged behind (it was borderline memory-starved), so I'm betting more on a 30-35% improvement, but we'll have to see.


----------



## TheLostSwede (Oct 28, 2020)

lynx29 said:


> I know you have connections lostswede, get me a reserve spot for 5600x and 6800xt, you know you want to!


I think everyone I used to know at AMD has left...



dragontamer5788 said:


> Yup.
> 
> Decoy pricing is a fundamental marketing technique. It applies to both the low-end (why buy "small fries" when "medium fries" are just 10-cents more expensive for double the food?), as well as the high-end ($1500 is clearly not reasonable. So obviously $700 is a reasonable price).
> 
> ...


Just wanted to make sure you didn't imply this was an AMD only thing.

I prefer onion rings over fries...


----------



## RedelZaVedno (Oct 28, 2020)

Vayra86 said:


> LOL. You would pay for proprietary DLSS? That's an odd one. That's like paying for PhysX and its per-title implementation. Realistically though... all DLSS is, is a performance tweak with visual impact. If the performance is already there... why would you?
> 
> Will Nvidia outsell this time? I'm not so sure, but probably... still doesn't say much about what's the better GPU choice right now. I think the verdict on that is still not entirely in and DLSS certainly isn't the deal maker or breaker IMO.
> 
> ...


Maybe not in yours, but it is in most consumers' eyes. AMD offered better performance than the 3070 for MORE $$$, giving Nvidia the opportunity to launch a 3070 Ti, and the same performance as the 3080 for 50 bucks less but without the possibility of AI upscaling and with poorer ray tracing capabilities. All that's working for AMD is more VRAM. AMD is acting like it owns 80%, not 20%, of the DIY GPU market. It's just silly if they can't see that. I'm starting to believe they're content with dominating consoles and staying a marginal player in the gaming PC GPU market.


----------



## moproblems99 (Oct 28, 2020)

I just hope one is actually going to be available. This is the prime reason I couldn't care less about the CP2077 launch: I need a damn GPU to run it.


----------



## randompeep (Oct 28, 2020)

RedelZaVedno said:


> I expect xx70/xx80 Ampere to outsell RDNA2  10 to 1.


Keep it 4:1, keep it real. Do you think they care about people complaining about AMD's 'fantastic' prices? If Ryzen 3000 currently sells like pancakes at 10-20% over MSRP, how would the same higher-tier marketing strategy fail for GPUs?
We're talking about next-gen; perf/$ can't be judged just yet. After the AIB cards hit stores, we can revisit it. Till then, both Nvidia and AMD are planning their positions for these cheeky high-end cards and the supposed second-hand-market-killer lineup @ $250-450.


----------



## kruk (Oct 28, 2020)

Khonjel said:


> RX 6800 is 60CU > RTX 3070
> 
> RX 6800 XT is 72CU ≈ RTX 3080
> 
> ...



The clocks on the 6900 XT are suspiciously low, which suggests that we might be able to get a 6950 XT in the future as the production matures. This means that they can sell the new SKU at $999 and reduce the 6900 XT price to match the 3080 Ti ...


----------



## Cheeseball (Oct 28, 2020)

RedelZaVedno said:


> How so? Nvidia has a 3070 Ti up their sleeve, and AMD just enabled $600 pricing for it. AKA the 6800 at $579 is dead. $50 less for a GPU that is trading blows with the 3080 but has no AI upscaling and probably far worse ray tracing performance is a non-seller too. AMD is the company with 20% discrete GPU market share, not Nvidia, yet they're acting like their brand is equal in the GPU world. I expect xx70/xx80 Ampere to outsell RDNA2 10 to 1. That's the harsh reality check AMD will have to face if they don't lower their prices. And that's coming from an ATI 4870/AMD 6870/R9 290X/RX 480/Vega 56 owner, not some Nvidia fanboy



The RX 6800 has 16 GB of GDDR6 on a 256-bit bus, versus just 8 GB at the same speed on the RTX 3070. I would seriously go for the RX 6800 for the extra $79, just for the extra VRAM.

AI upscaling and RT (since it's not mainstream yet) are just extra features at the moment. The fact that the 6800 XT and RTX 3080 nearly match each other is the big picture here, and it makes AMD competitive at the high end once again.


----------



## TheLostSwede (Oct 28, 2020)

lynx29 said:


> the thing that sucks the most is that AMD is still using 7 nm fab capacity to make the Ryzen 3000 series and the Navi RX 5000 series... like seriously, screw it, go all-in on the new stuff so it doesn't sell out on day 1...


AMD's focus today is not on CPU or GPUs, it's on console SoCs. Want to buy a Ryzen 4000 mobile CPU from AMD today? If you're not Lenovo, HP, Dell or Asus, good luck, no stock as their allocation at TSMC is now being used up to make enough chips for Sony and Microsoft.


----------



## Chomiq (Oct 28, 2020)

EarthDog said:


> Ty!
> 
> Sorry, there are like 3 threads going on here about this subject... The one birdie posted, the live blog, and then the summary...
> 
> So, basically, it is the latest and greatest to get that performance. Cool beans. Awesome for those going balls deep into the ecosystem, bad news for the majority of existing PC owners that won't get the boost at all because they are running Intel.


From the description, it looks like if you want a match, you get that as the baseline. If you want some additional advantage, go balls deep on an all-AMD build (in titles that support the feature).


----------



## Vayra86 (Oct 28, 2020)

RedelZaVedno said:


> Maybe not in yours, but it is in most consumers' eyes. AMD offered better performance than the 3070 for more, giving Nvidia the opportunity to launch a 3070 Ti, and the same performance as the 3080 for 50 bucks less but without the possibility of AI upscaling and with poorer ray tracing capabilities. All that's working for AMD is more VRAM. AMD is acting like it owns 80%, not 20%, of the DIY GPU market. It's just silly if they can't see that. I'm starting to believe they're content with dominating consoles and staying a marginal player in the gaming PC GPU market.



Doubtful. We've already seen how tables can turn when it comes to these things. Intel's mindshare is quickly waning for example. It took Ryzen a few generations, you can expect similar with RDNA2. That's why I'm saying, you're probably going to be right, but don't mistake that for a lack of change in perspective. People need to ease into these things, and AMD has a lot of wiggle room here in terms of price that Nvidia really doesn't have. They will definitely be eating further into their margin than AMD this time around.

But the bigger issue is the mid-term to long-term market for Nvidia. Ampere was supposed to be their big overhaul right, I mean, this is FINALLY the Volta we always drooled over, really, all RT capable and everything with top 4K perf. So what's next for NV? How do you scale up a 320W board that is already out of balance in multiple ways? I mean its great they have GDDR6X. But they can't get a wider bus for it really without cutting on core power budget. What's their step up now?

AMD, however, is now in the position Nvidia was in during Pascal. Memory efficiency that is out of reach for its competitor. A boost that leaves a major gap with the competitor. Feature parity with current-day hardware and software. AMD can leverage technology that Nvidia simply doesn't have, nor can access easily. They've got a design win here, and it's ten times more impressive than Nvidia repurposing their cores for RT.

We find ourselves now, immediately after this presentation... considering what magic rabbit Huang has left in the hat. Because he needs one badly.

A hidden motivator, though, might be the realization among customers that AMD has also been in the consoles for a while now. That is why the feature parity bit matters so much. If the GPUs can do the same things and show the same picture, why would you not switch? There is familiarity with both brands, after all.

Fingers crossed now for no driver oopsies.


----------



## wolar (Oct 28, 2020)

It was weird, tbh, that the 6800 launched with 16 GB of VRAM and that high a price. Around $40 less would be more appropriate, and for that performance I would rather have 8 GB and $100 off.
Regardless of the size of the price cut, I think an 8 GB 6800 makes more sense, and I would like to see a 10 GB 6800 XT matching the 3080 as well.


----------



## okbuddy (Oct 28, 2020)

The PS5 (36 CU @ 2200 MHz) is exactly half a 6800 XT.
$399 vs $649


----------



## randompeep (Oct 28, 2020)

TheLostSwede said:


> AMD's focus today is not on CPU or GPUs, it's on console SoCs. Want to buy a Ryzen 4000 mobile CPU from AMD today? If you're not Lenovo, HP, Dell or Asus, good luck, no stock as their allocation at TSMC is now being used up to make enough chips for Sony and Microsoft.


+1
Time will tell if they'll make enough console chips, though! Taking into consideration the post-COVID growth of the gaming market and the continuous rise in computer prices, I expect some numbers on the gamepad side of things.


----------



## matar (Oct 28, 2020)

AMD, hats off...


----------



## RedelZaVedno (Oct 28, 2020)

randompeep said:


> Keep it 4:1, keep it real. Do you think they're caring about people bragging on AMD 'fantastic' prices ? If Ryzen 3000 sells like pancakes currently at +10-20% over MSRP, how the GPU higher-tier marketing strategy would fail ?
> We're talking about the next-gen, perf/$ can't be judged just yet. After AIB's get in stores, we could roll it up and down some more time. Till then both nvidia and AMD plan their position for these cheeky high-end and the supposed second hand market killer lineup @250-450$


Ryzen 3000 is much better value than Intel, especially if you take the need for an expensive Z490 MB and a decent cooler into account. How is paying $579 for an xx70-class GPU OK with people? I bought most of AMD's mid-to-high GPUs simply because they were better value, even when they were room heaters. I can't see the value in a $579 GPU anymore. It saddens me, but I admit that if Nvidia launches a 3070 Ti with 12 or 16 gigs of RAM for $599, I'll go with the green option. Yes, "just" because it has DLSS and better ray tracing; that's worth paying 30 bucks more, imho. I expected much better value from AMD


----------



## TheLostSwede (Oct 28, 2020)

okbuddy said:


> plus only 3 video ports, 3080 got five



Looks like four to me.


----------



## Vayra86 (Oct 28, 2020)

TheLostSwede said:


> Looks like four to me.



wtf they put RGB on the memory now?


----------



## Rahnak (Oct 28, 2020)

RedelZaVedno said:


> Yes, "just" because it has DLSS and better ray tracing, worth paying 30 bucks more imho. I expected much more better value from AMD



AMD has their own competitor to DLSS. It's part of FidelityFX (therefore open) and it's called Super Resolution. It's also available to the consoles so the potential for wide usage is there. The better ray tracing is true, but nvidia is on their 2nd gen now, so I don't think anyone was expecting AMD to match that.


----------



## Cheeseball (Oct 28, 2020)

RedelZaVedno said:


> Ryzen 3000 is much better value than Intel, especially if you take the need for an expensive Z490 MB and a decent cooler into account. How is paying $579 for an xx70-class GPU OK with people? I bought most of AMD's mid-to-high GPUs simply because they were better value, even when they were room heaters. I can't see the value in a $579 GPU anymore. It saddens me, but I admit that if Nvidia launches a 3070 Ti with 12 or 16 gigs of RAM for $599, I'll go with the green option. Yes, "just" because it has DLSS and better ray tracing; that's worth paying 30 bucks more, imho. I expected much better value from AMD



They are not going to release an RTX 3070 Ti at $599. That would be a stupid business decision when the 3070 is at $499 and the 3080 is at $699: that's only a $100 difference between each tier, and it's still the bottom of the high-performance segment, which is not exactly where most of the money comes in.

RTX 2060 Super = $399
RTX 2070 Super = $599

The RTX 2070 non-Super came out the year prior at the same $599 price (someone correct me if I'm wrong about this), so it wouldn't count.


----------



## Chomiq (Oct 28, 2020)

okbuddy said:


> 72/80=0.9
> 
> 999-649 >>>>> 0.9
> 
> ...


Some 30 series cards have 5 outputs, except you can still only connect 4 displays.


----------



## RedelZaVedno (Oct 28, 2020)

Vayra86 said:


> Doubtful. We've already seen how tables can turn when it comes to these things. Intel's mindshare is quickly waning for example. It took Ryzen a few generations, you can expect similar with RDNA2. That's why I'm saying, you're probably going to be right, but don't mistake that for a lack of change in perspective. People need to ease into these things, and AMD has a lot of wiggle room here in terms of price that Nvidia really doesn't have. They will definitely be eating further into their margin than AMD this time around.
> 
> But the bigger issue is the mid-term to long-term market for Nvidia. Ampere was supposed to be their big overhaul right, I mean, this is FINALLY the Volta we always drooled over, really, all RT capable and everything with top 4K perf. So what's next for NV? How do you scale up a 320W board that is already out of balance in multiple ways? I mean its great they have GDDR6X. But they can't get a wider bus for it really without cutting on core power budget. What's their step up now?
> 
> ...


All Nvidia has to do is move Ampere to TSMC's 7 nm EUV process, because going with Samsung 8 nm (aka 10 nm) was a shitshow. They'll probably do that within the next 18 months, imo.


----------



## Vya Domus (Oct 28, 2020)

nguyen said:


> AMD really expect to sell those GPU at those prices ?



Does Nvidia expect to sell GPUs at even higher prices ? Let me guess, yes, because they were first by 2 months or some other made up nonsense. You know, AMD should pay people to buy their products probably, right ?

What a joke of a comment.


----------



## randompeep (Oct 28, 2020)

RedelZaVedno said:


> Ryzen 3000 is much better value than Intel, especially if you take the need for an expensive Z490 MB and a decent cooler into account. How is paying $579 for an xx70-class GPU OK with people? I bought most of AMD's mid-to-high GPUs simply because they were better value, even when they were room heaters. I can't see the value in a $579 GPU anymore. It saddens me, but I admit that if Nvidia launches a 3070 Ti with 12 or 16 gigs of RAM for $599, I'll go with the green option. Yes, "just" because it has DLSS and better ray tracing, worth paying 30 bucks more


Honestly, I don't see the need for DLSS just yet. I wanna game at 1080p: I've got a 75 Hz IPS and the GPU power I need. I wanna play something on the TV @ 4K: I usually keep 9 ft between my eyes and the screen, so 1440p looks crisp as f in most titles. So I recommend and buy whatever gives the best performance-per-watt times performance-per-dollar at the time (leaning one way or the other depending on specific needs).
Trust me, AMD does indeed have better value paired with a budget mobo (90%+ of cases), but only if you're talking MSRPs. I'd say Intel has equal or better value for the 1080p gaming scenario with their iGPU-less CPUs. And I'm pointing one more time at older games and e-sports titles, where I don't see as much of a difference between DDR4 @ 2666 CL15-16 and 3200/3600 CL16-18.


----------



## ratirt (Oct 28, 2020)

RedelZaVedno said:


> All Nvidia has to do is to transfer Ampere to TSMC 7nm euv process because going with Samsung 8nm (aka 10nm) was a shitshow. They'll probably do that in the next  18 months imo.


NV went with Samsung to cut costs, so if you want 7 nm EUV, prepare to pay more for any card NV has presented so far once it's transferred to 7 nm EUV.


----------



## Khonjel (Oct 28, 2020)

kruk said:


> The clocks on the 6900 XT are suspiciously low, which suggests that we might be able to get a 6950 XT in the future as the production matures. This means that they can sell the new SKU at $999 and reduce the 6900 XT price to match the 3080 Ti ...


Production is already mature, though. AMD skipped 5 nm and all the ensuing supply war and stayed with (by now) mature 7 nm. I think the opposite will happen: AMD keeps the best of the best Navi 21 dies for the RX 6900 XT while relegating the slower, not-so-good dies to an RX 6900 to fight off the RTX 3080 Ti.


----------



## RedelZaVedno (Oct 28, 2020)

Cheeseball said:


> They are not going to release a RTX 3070 Ti at $599. That would be a stupid business decision when the 3070 is at $499 and the 3080 is at $699. Thats only a $100 difference between each tier and is still at the bottom high-performance segment which is not exactly where most of the money comes in.
> 
> RTX 2060 Super = $399
> RTX 2070 Super = $599
> ...


My bet is a 3070 Ti with 12 gigs to match the 6800 at $599, and a 3080 Ti with 16 gigs to compete with the 6900 XT at $1099, in Q1 2021.


----------



## ratirt (Oct 28, 2020)

Khonjel said:


> Production is already mature though. AMD skipped 5nm and all the ensuing supply war and stayed with mature (as of now) 7nm. I think opposite will happen. AMD keeps the best of the best Navi 21 dies for RX 6900 XT while relegating slow/not-so-best dies to RX 6900 to fight off RTX 3080 Ti.


I don't think they have skipped 5nm. They just stayed with 7nm. Maybe AMD didn't see a reason to go 5nm. From what we have seen so far, 7nm does the job. Why go 5nm and risk low yields and price increase.


----------



## BoboOOZ (Oct 28, 2020)

TheLostSwede said:


> AMD's focus today is not on CPU or GPUs, it's on console SoCs. Want to buy a Ryzen 4000 mobile CPU from AMD today? If you're not Lenovo, HP, Dell or Asus, good luck, no stock as their allocation at TSMC is now being used up to make enough chips for Sony and Microsoft.


Just to point out the obvious: the number of wafers required for console APUs is huge, more than everything else they make (CPUs and GPUs alike).


----------



## Crustybeaver (Oct 28, 2020)

xkm1948 said:


> Nice 6900. Finally a top of the line since Fury days!


Almost; the 980 Ti was top of the line then, though the Fury was better in AMD-optimised games.


----------



## Super XP (Oct 28, 2020)

I've said this before and I'll say it again: RDNA2 is the real deal. After the Zen 2 and now Zen 3 successes, AMD pulled off RDNA2 and caught Nvidia by surprise.


----------



## Vayra86 (Oct 28, 2020)

RedelZaVedno said:


> All Nvidia has to do is to transfer Ampere to TSMC 7nm euv process because going with Samsung 8nm (aka 10nm) was a shitshow. They'll probably do that in the next  18 months imo.



That still doesn't fix the memory part, only die size and power. So they'll go wider on the bus... and AMD can still get faster memory and a wider bus of its own. The best Nvidia has, as far as we currently know, is a way to buy a generation's worth of time. As we know from Intel's current position, that may very well not be enough.

Thinking of rabbits for Nvidia, I think their attempt to acquire ARM plays a role in this, and if it does, time is not on their side.

One point that did stand out negatively for me: the price of the 6800 at $579. That doesn't look good against the 3070 at $500, even with the small performance advantage it seems to have.


----------



## RedelZaVedno (Oct 28, 2020)

ratirt said:


> NV did it to cut price so if you want 7nm EUV prepare to pay more for any card NV has presented so far transfered to 7nm EUV.


I agree on that; Ngreedia never gives anything away unless it has to. But AMD has become the same greedy, margin-raising corp as Intel and NV. I'm seriously considering boycotting them both by buying GPUs only on the 2nd-hand market.


----------



## randompeep (Oct 28, 2020)

BoboOOZ said:


> Just to point out the obvious, the number of wafers reauired for console APUs is huge, it' more than everything else they do (CPU or GPU alike).


Correct me if I'm wrong, but there's no way AMD makes comparable profit from desktop GPUs as it does from the mobile Ryzens it has released over the past years.
As some people say here and there, yeah, console SoCs are where the fat cash is at.


----------



## Super XP (Oct 28, 2020)

nguyen said:


> Rasterizaton performance are good but AMD really expect to sell those GPU at those prices ? or perhaps they can't price them any lower ?


So it's OK for Nvidia to rip people off with high prices, but when AMD prices its GPUs at a fair value it's too expensive for you?



RedelZaVedno said:


> I agree on that, Ngreedia never gives anything for free unless it has to. But AMD has become the same greedy profit margin rising  corp as Intel and NV. I'm seriously considering boycotting them both by buying GPUs only on 2nd hand market.


I have to respectfully disagree with your comment labeling AMD as equally greedy as Nvidia and Intel. Quite the opposite actually, AMD obviously has to answer to its shareholders and is careful to produce reasonable margins. That is why, out of the 3 companies, AMD is well known to price its products fairly.


----------



## MxPhenom 216 (Oct 28, 2020)

lynx29 said:


> i could care less about ray tracing. doesn't impress me at all in the games I have seen it in. i would rather turn it off and gain an extra 30-50 frames for the smoothness of high refresh gaming.



I wouldn't care about it at all in a competitive multiplayer game, but in a single-player game with a good storyline, I'd like to have ray tracing.


----------



## Super XP (Oct 28, 2020)

RedelZaVedno said:


> All Nvidia has to do is to transfer Ampere to TSMC 7nm euv process because going with Samsung 8nm (aka 10nm) was a shitshow. They'll probably do that in the next  18 months imo.


I've heard RDNA3 is an even bigger jump over RDNA2 and will see performance gains equal to or greater than what we've seen from ZEN2 to ZEN3. AMD is in a great position at the moment; they just need to hang on and keep pushing forward. They are the only company that pushes both CPU and GPU innovation in a competitive way.


----------



## petepete (Oct 28, 2020)

From 1900 xt to 2900 xt and beyond- I really hope AMD doesn't let me down again with the hype


----------



## Khonjel (Oct 28, 2020)

BTW can someone tag the news guys? I only remember the names of @btarunr and @Raevenlord. Boy do I have a tasty news piece idea for you.

Almost all tests were done on a Ryzen 9 5900X system. I say almost because I saw one slide tested on a Ryzen 5 3600X. AMD gave you numbers for both Nvidia and AMD cards, so now you just have to extrapolate the CPU numbers.

I know it's taking a pot-shot, since the numbers are from 1440p and 4K and most likely GPU-bound, but hey, why not check it out.

Steve from GamersNexus also confirmed with AMD that the normal vs SAM+Rage Mode benches were done on the same system, likely the Ryzen 9 5900X one.

The only problem currently, imo, is figuring out which slides are which. Slide names are RX-500 something something in the footnotes.


----------



## Super XP (Oct 28, 2020)

petepete said:


> From 1900 xt to 2900 xt and beyond- I really hope AMD doesn't let me down again with the hype


The stakes are far too high for a letdown. Once I heard members of the ZEN3 team assisted with RDNA2 I was confident it would perform well.


----------



## MxPhenom 216 (Oct 28, 2020)

lynx29 said:


> the thing that sucks the most is that AMD is still using 7nm factories ot make ryzen 3000 series and navi 5 series... like seriously, screw it, go all in on this new stuff so it doesn't sell out on day 1...



LOL, TSMC 7nm is in mass production. Don't worry about it. The volume TSMC is able to pump out at 7nm should be quite adequate. Can't say the same about Samsung's 8nm (10nm+).


----------



## randompeep (Oct 28, 2020)

Super XP said:


> That is why, out of the 3 companies, AMD is well known to price its products fairly.


Fairly, with the knowledge that people label their whole lineup as a better value overall... and people keep that in mind even when the prices are inflated!
Coming to a computer store near you: the guy with an IT degree who recommends an Intel laptop because AMD runs too hot... or those who say a discounted AMD FX system is a better value over a current-gen Pentium at the same price because *AMD*.


----------



## BoboOOZ (Oct 28, 2020)

randompeep said:


> Correct me if I'm wrong, but no way AMD would actually make comparable profit from the desktop GPUs as they do with the mobile Ryzens they've released the past years.
> As some people say here and there, yeah the console SoC's is where the fat cash at


From what I know, some desktop GPUs are pretty profitable; for the higher-tier ones (5700 and up), the margins are higher than with console APUs. Console APUs are a volume business, which they are contractually bound to supply for many years to come.

However, where the most money is for AMD is EPYC chiplets; that is probably why the 5500 XT is so uncompetitively priced. They make way more money selling 7nm wafers as EPYCs than as 5500 or even 5600 GPUs.

The laptop APU shortage is a symptom of AMD's growing pains: their market share is increasing too fast, faster than they can reserve capacity at TSMC, so something's gotta give. They cannot cut the console chips and they don't want to slow the EPYC business, so they make low-end GPUs less attractive and don't fully meet the demand for laptop APUs. The situation will probably be better next year.
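To make the wafer-allocation argument concrete, here's a rough back-of-the-envelope sketch. The die sizes are public ballpark figures (~74 mm² for a Zen 2 CCD, ~251 mm² for Navi 10), but the per-die revenue numbers are pure placeholders of mine, not AMD figures:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic gross-dies-per-wafer approximation (ignores yield and scribe lines)."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Placeholder per-die revenue (made up to illustrate the margin argument).
products = {
    "EPYC CCD (~74 mm^2, $60/die?)":   (74.0, 60.0),
    "Navi 10  (~251 mm^2, $120/die?)": (251.0, 120.0),
}
for name, (area, revenue) in products.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${n * revenue:,.0f}/wafer")
```

Even with a much higher price per GPU die, the small CPU chiplet wins on revenue per wafer in this toy model, which is the point being made about allocation priorities.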


----------



## MxPhenom 216 (Oct 28, 2020)

Super XP said:


> *So its OK for Nvidia to rip people off with high prices but when AMD prices its GPUs at a fair value its too expensive for you? *
> 
> 
> I have to respectfully disagree with your comment labeling AMD as equally greedy as Nvidia and Intel. Quite the opposite actually, AMD obviously has to answer to its shareholders and is careful to produce reasonable margins. That is why, out of the 3 companies, AMD is well known to price its products fairly.



To be fair, we don't really know his position on Nvidia pricing...

..So your comment looks dumb.


----------



## Space Lynx (Oct 28, 2020)

MxPhenom 216 said:


> LOL, TSMC 7nm is being mass produced. Don't worry about it. The volume of 7nm TSMC is able to pump out should be quite adequate. Can't say the same about Samsung's 8nm (10nm+)




if I can't buy 5600x  and 6800xt on launch day im going to PM you and be like told you so bruh...


----------



## windwhirl (Oct 28, 2020)

lynx29 said:


> if I can't buy 5600x  and 6800xt on launch day im going to PM you and be like told you so bruh...


That's a demand problem lol


----------



## Makaveli (Oct 28, 2020)

QUANTUMPHYSICS said:


> All I'm concerned with is my AMD Stock share value.



Why do you buy NV GPUs if you are an AMD shareholder?


----------



## randompeep (Oct 28, 2020)

BoboOOZ said:


> From what I know, some desktop GPU are pretty profitable, the higher tier ones ( 5700 and up), the margins are higher than with console APUs. Console APUs are a volume business, which they are contractually bound to ensure for many years to come.
> 
> However, where the most money is for AMD is EPYC chiplets, that is probably why the 5500XT are so uncompetitively priced, they make way more money selling 7nm wafers as EPYCS then as 5500 or even 5600 GPUs.
> 
> Laptop APU shortage is a symptom of AMD growth pains: their market share is increasing too fast, faster than they can reserve capacity at TSMC, so something's gotta give. They cannot reduce the console chips, they don't want to stop the EPYC business, so they make low-end GPU less attractive and they do not fully meet the demand for laptop APUs. The situation will be better next year, probably.


EPYC needs continuous development and research to keep up with AI machine (and/or supercomputer) needs. A console SoC is sketched, built and tested once, then they sell millions.
I'd say their biggest bet is on consoles, since the first AMD-based gaming console spread widely.


----------



## R0H1T (Oct 28, 2020)

lynx29 said:


> if I can't buy 5600x and 6800xt on launch day im going to PM you and be like told you so bruh


I hope you keep your trigger ready & aim for the headshot, as soon as they release the krakens!

P.S. Don't be late


----------



## Space Lynx (Oct 28, 2020)

MxPhenom 216 said:


> I wouldn't care about it at all in a competitive multiplayer game, but in a game that's single player with a good storyline. Id like to have Ray tracing.



Next-gen consoles are focusing on ray tracing, and Nvidia is actually going to be left in the dust, since they use proprietary ray tracing. Most game developers will focus on AMD's DX12 RT because consoles are the money makers, and we will get those ports. So Nvidia has the RT crown now, but in the future they won't.


----------



## windwhirl (Oct 28, 2020)

lynx29 said:


> next gen consoles are focusing on ray tracing, nvidia is actually going to be left in the dust since they use proprietary ray tracing, most game developers will be focusing on AMD dx12 RT because consoles are the money makers, and we will get those ports. so nvidia has RT crown now, but in future they wont.


As I understand it, DX games implement raytracing through DirectX, not through Nvidia's own API (if there's even such a thing). Vulkan I don't know, though it likely has its own extension for it.


----------



## BoboOOZ (Oct 28, 2020)

randompeep said:


> Epyc needs continous development amd research over the AI machine's (and/or supercomputer) needs. A console SoC is sketched, built and tested. Then they sell millons.


Oh, Epyc uses the same Zen chiplets, it just uses more of them (and potentially higher bins). But the chiplets themselves have the same architecture and come from the same wafers as zen/threadripper chiplets. That's the genius of it.
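A minimal sketch of that reuse idea: the same 8-core CCD is the building block across the whole stack, and only the chiplet count (plus the I/O die) changes per product. The class below is purely illustrative; the CCD counts match the shipping 16-core and 64-core parts:

```python
from dataclasses import dataclass

CORES_PER_CCD = 8  # one Zen 2/Zen 3 CCD carries 8 cores

@dataclass
class Product:
    """Toy model: a product is just N copies of the same CCD."""
    name: str
    ccds: int

    @property
    def cores(self) -> int:
        return self.ccds * CORES_PER_CCD

lineup = [
    Product("Ryzen 9 5950X", ccds=2),       # desktop tops out at 2 CCDs
    Product("Threadripper 3990X", ccds=8),  # HEDT: 8 CCDs, 64 cores
    Product("EPYC 7742", ccds=8),           # server: 8 CCDs, 64 cores
]
for p in lineup:
    print(f"{p.name}: {p.ccds} CCDs -> {p.cores} cores")
```

Same silicon from the same wafers, binned and packaged differently, which is exactly why EPYC demand competes directly with everything else for 7nm capacity.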


----------



## Khonjel (Oct 28, 2020)

lynx29 said:


> the thing that sucks the most is that AMD is still using 7nm factories ot make ryzen 3000 series and navi 5 series... like seriously, screw it, go all in on this new stuff so it doesn't sell out on day 1...


I know it's old, but according to this, since Apple moved to 5nm and Qualcomm is mostly using Samsung 8nm now (QC's current hot-selling mid-range SD720G and upcoming SD875), AMD should be the biggest customer of TSMC 7nm atm.


----------



## TheLostSwede (Oct 28, 2020)

RedelZaVedno said:


> All Nvidia has to do is to transfer Ampere to TSMC 7nm euv process because going with Samsung 8nm (aka 10nm) was a shitshow. They'll probably do that in the next  18 months imo.


That's it? I think you have no idea what you're asking and how difficult it would be.
It would be quicker and easier for Nvidia to start a new design specifically for whatever TSMC node they want to go with, than to try and move an already taped out design from Samsung to TSMC.


----------



## mtcn77 (Oct 28, 2020)

Khonjel said:


> I know it's old but according to this, since Apple moved to 5nm and Qualcomm mostly using Samsung 8nm now (QC's current hot-selling mid-ranger SD720G and upcoming SD875) AMD should be the biggest customer of TSMC 7nm atm.


Also, the entire console business model rests on AMD's shoulders, so the industry will do its best to make it work in any way possible.


----------



## InVasMani (Oct 28, 2020)

About time. I knew AMD was going to get more and more aggressively competitive in the GPU market, but this actually feels a touch sooner than I anticipated; in all honesty, I figured we wouldn't see this kind of rebound until the next GPU generation. Lisa Su for president 2020.



Super XP said:


> I've heard RDNA3 is even more impressive over RDNA2 and will see performance gains equal or more as we've seen with ZEN2 to ZEN3. AMD is in a great position at the moment, they just need to hang on and keep pushing forward. They are the only CPU/GPU company that pushes innovation and tech in a competitive way.


This makes me so excited about the Xilinx acquisition. FPGAs with Infinity Cache and a Ryzen CPU to compress all the data before sending it to the main OS CPU and loading it into system memory? That would be amazing in a PCIe 4.0 (or maybe PCIe 5.0 by that point) x16 slot with quad M.2 devices in RAID 5, or RAID 10 if you wanted to raid 8 together across two cards on something like a Threadripper platform. The future is wild to think about right now. I want to see these RDNA2 cards in CrossFire; I'm very interested to see whether the Infinity Cache along with PCIe 4.0 can significantly minimize micro-stutter, especially when you take AMD's Anti-Lag into account as well. These new RDNA2 cards with a Ryzen 5600X are going to be amazing together. It's hard to fathom what will come next for Threadripper along with RDNA3, and once they transition to 5nm as well. Exciting times for the tech world, after what felt like a decade of drought stuck in Intel/Nvidia limbo. I'm really excited to see the fine-grained clock gating from AMD; that's super important for eking out better efficiency in AMD's designs. Luckily, any progress AMD makes in that area on the CPU or GPU side can in essence be applied to the other, which is a huge perk.

The Infinity Cache is a perfect example of innovation on the CPU side being materialized on the GPU side as well. This is exactly the sort of thing people envisioned and hoped to see with the ATI merger so many years ago: leveraging the IP strengths of each. It's really going to be great to see what happens with the combination of FPGAs and future APUs along with Infinity Cache and the I/O die. I think in the not-so-distant future AMD could do away with the motherboard chipset and just incorporate a dual-socket design in its place, in some kind of big.LITTLE Infinity Cache design, leveraging both for the overall chipset functionality and features.


----------



## mtcn77 (Oct 28, 2020)

TheLostSwede said:


> That's it? I think you have no idea what you're asking and how impossible it would be. You clearly have no clue how chip fabrication works.
> It would be quicker and easier for Nvidia to start a new design specifically for whatever TSMC node they want to go with, than to try and move an already taped out design from Samsung to TSMC.


It is great that we have your fine discretion. I cannot think of a better analogy for Nvidia's situation. I had a tough time explaining it.


----------



## TheLostSwede (Oct 28, 2020)

mtcn77 said:


> It is great that we have your fine discretion. I cannot think of a better analogy for Nvidia's situation. I had a tough time explaining it.


Sorry? What are you on about?


----------



## mtcn77 (Oct 28, 2020)

TheLostSwede said:


> Sorry? What are you on about?


The usual stuff.

"RTX 3080 Crash to Desktop Problems Likely Connected to AIB-Designed Capacitor Choice" (www.techpowerup.com)


----------



## cueman (Oct 28, 2020)

Big and so on... nope.

Same as Nvidia with the RTX 3080; but the RTX 3080 runs ALL games over 60 fps at 4K except FS 2020. Let's see real tests of the RX 6800 and RX 6900.

Also, I'd like to see both tested on the same Ryzen 5950X CPU. Remember, it's 11% faster for gaming, single-core, than any other CPU... 11%.

Summa summarum: on average the RTX 3080 is 10% faster, believe it, and Nvidia's GPUs have real RT support and will easily beat the AMD 6000 series in that department. You see? AMD has shown no info on RT performance.

Also, an RTX 3080 Ti is coming, and an RTX 3070 Ti too.

But I do see that TSMC 7nm is better than Samsung 8nm, and Nvidia's new GPUs are coming on that.

Both top-end GPU lines are made for 4K gaming, and that is the most important battlefield.

These days many GPUs can run 60 fps at WQHD resolution; that's not the problem. It doesn't matter if it's 124 fps or 60 fps, 60 fps is enough; it comes down to price and options.

The 4K battle is between the RX 6800 XT / RX 6900 XT and the RTX 3080/3090.

Who is king? Still Nvidia, it looks like.

We'll see on the 18th of November.


----------



## DemonicRyzen666 (Oct 28, 2020)

I forget what I was watching, but each CU has an RT unit connected to it, meaning the 6900 XT has 80 RT cores while the RTX 3090 has 82 RT cores.


----------



## windwhirl (Oct 28, 2020)

DemonicRyzen666 said:


> I forgot what I was watching but Each CU has a RT unit connected to it mean this the 6900 XT has 80 RT cores and the rtx 3090 has 82 RT cores.


They don't necessarily work the same, though.


----------



## moproblems99 (Oct 28, 2020)

Vayra86 said:


> LOL. You would pay for proprietary DLSS? That's an odd one. That's like paying for PhysX and its per-title implementation. Realistically though... all DLSS is, is a performance tweak with visual impact. If the performance is already there... why would you?



Because features bro!  I need to run CSGO at 300fps on my 144hz (or lower) monitor with all the details at low.



lynx29 said:


> the thing that sucks the most is that AMD is still using 7nm factories ot make ryzen 3000 series and navi 5 series... like seriously, screw it, go all in on this new stuff so it doesn't sell out on day 1...



Rx 5000 are EOL.



RedelZaVedno said:


> How so? Nvidia has 3070TI in their sleeves and AMD just enabled them $600 pricing. AKA 6800 at $579 is dead. $50 less for GPU that is trading blows with 3080 and has no AI upscaling and probably way worse ray tracing performance is a non seller too.



Do you remember last year when they dropped the prices of the RX 5000 series after NV dropped theirs? They are going to do it again. Ray tracing still stinks, and DLSS is only for people that don't want to pony up the money to play nice on the monitor they overbought.



Vayra86 said:


> But yeah. Damn, son! I'll concede right here right now I've underestimated Navi/RDNA2. This really is great news, and long overdue.



Same.  I did not think they would get much more than an RCH over the 2080ti.


----------



## agatong55 (Oct 28, 2020)

cueman said:


> big and so on.,,nope
> 
> same as nvidia it rtx 3080,but, rtx 3080 goes ALL games over 60fps for 4K games, except FS 2020,let see real test for rx 6800 and rx 6900
> 
> ...



I am so confused at what I just read.


----------



## moproblems99 (Oct 28, 2020)

wolar said:


> i would like to see a 10gb 6800xt matching the 3080 as well.



Why would you want that?



RedelZaVedno said:


> They'll probably do that in the next 18 months imo.



That would be a disaster for them if all they did was pump out Ampere on 7nm for the next 18 months.


----------



## Anymal (Oct 28, 2020)

Wtf does "Frames per second (up to)" mean on their slides? Up to what, an average fps over 3 runs?


----------



## mechtech (Oct 28, 2020)

Cut that 6800XT in half and let me know when it is available.


----------



## DemonicRyzen666 (Oct 28, 2020)

windwhirl said:


> They don't necessarily work the same, though.



I know that, but that's also a good thing, since there would be image-quality differences between the two designs.
I remember, a long time ago (I forget who did it), someone took a workstation machine with ECC and no overclock, plus workstation GPUs with ECC, and compared consumer cards and consumer CPUs against it, using the workstation as the error-free baseline image. They had to compare every last frame over the length of the benchmark; I forget what software they used, but it took months. In the end the consumer setup usually hovered around 95% of the same image quality, ranging from 100% down to 92% at worst.


----------



## Lionheart (Oct 28, 2020)

I wonder if this Infinity Cache will be implemented in the 5000 series APUs to remedy system RAM bandwidth.


----------



## Space Lynx (Oct 28, 2020)

moproblems99 said:


> Rx 5000 are EOL.





Vayra86 said:


> That makes about as much sense as your signature right now.




How so? AMD announced a few weeks ago they are still producing the RX 5700 and Zen 2 CPUs ("if there is demand for it, we are still making it"), and Amazon is getting a restock of Zen 2 after Zen 3 comes out... you can even see it on their order page: "in stock in November".


----------



## Turmania (Oct 28, 2020)

I want to say something. Lisa, you talk about scalpers and availability, yet we haven't been able to buy a Ryzen 3 3300X for 3 months! Really? Come on!


----------



## MxPhenom 216 (Oct 28, 2020)

lynx29 said:


> how so?  AMD announced a few weeks ago they are still producing rx 5700 and zen 2 cpu's "if there is demand for it we are still making it"  and Amazon is getting a restock of zen 2 after zen 3 comes out... so....  you can see it on their order page even... "in stock in november"



There is literally an article here confirming from AMD that the 5700 non-XT is EOL. The 5700 XT will still be produced, but the plain 5700 is done:



> *Update, October 7th 2020:* AMD has confirmed it has ceased production for the RX 5700, but that RX 5700 XT manufacturing will be ongoing at least until 1Q2021. It's unclear what this means for the company's RDNA2 launch plans; it could be speculated the company will be releasing halo products first, with lower tiers being launched at a later time, in line with NVIDIA's usual launch cadence. This would justify the RX 5700 being kept in fabrication, since with a substantial price cut, it could become a mainstream AMD product).


----------



## Space Lynx (Oct 28, 2020)

MxPhenom 216 said:


> There is literally an article here declaring from AMD that 5700xt is EOL. Itll be produced, but the 5700 non xt is done



You just proved my point, thank you. They are still wasting 7nm manufacturing capacity on old tech when they know their new stuff will sell out instantly. They could be using those same machines that make the 5700 XT for the next-gen stuff right now, and a month ago, and into 2021.


----------



## Rahnak (Oct 28, 2020)

There's an article on Wccftech that says a slide with RT performance was leaked. If true, it puts the 6000 series somewhere between the RTX 2000 and RTX 3000 series, which is pretty impressive in my opinion. It doesn't mention which 6000 series card was tested, though.









"AMD Radeon RX 6000 'RDNA 2 Big Navi' GPU Ray Tracing Performance Detailed - NVIDIA's RTX 3080 With RT Cores 33% Faster Than AMD's Ray Accelerator Cores" (wccftech.com): AMD has provided the first ray tracing performance numbers of its next-gen RDNA 2 GPU based Radeon RX 6000 series graphics cards.


----------



## windwhirl (Oct 28, 2020)

lynx29 said:


> you just proved my point, thank you.  they are still wasting manufacturing 7nm space for old tech when they know their new stuff will sell out instantly. they could be using those same machines making 5700 xt for the next gen stuff right now and a month ago and into 2021.



About that, I have two considerations:
1. Contracts/agreements between TSMC and AMD: basically, whether RDNA1 production is required to meet the terms of their deals, and, somewhat related, whether AMD can simply say "screw it" and switch RDNA1 production over to RDNA2 overnight.
2. Contracts/agreements between AMD and OEMs/AIBs/etc.: whether AMD has to provide RDNA1 products to others due to legal obligations, regardless of the shiny new RDNA2.

@TheLostSwede, you seem more knowledgeable on the fabs topic, so I defer to you. I won't ask about legal agreements and whatnot, because those papers are probably not public anyway, but is it possible for AMD to switch RDNA1 out for RDNA2 in TSMC's fabs relatively quickly (say, a couple of weeks at most), or is it a long, slow process?


----------



## Deleted member 24505 (Oct 28, 2020)

lynx29 said:


> you just proved my point, thank you.  they are still wasting manufacturing 7nm space for old tech when they know their new stuff will sell out instantly. they could be using those same machines making 5700 xt for the next gen stuff right now and a month ago and into 2021.



So just stop producing the 5700 and force everyone wanting a new GPU to buy the new series? What if someone hasn't got the cash for a 6xxx GPU? The 5700 is still a pretty viable GPU.


----------



## Steevo (Oct 28, 2020)

When do we get reviews?


----------



## Space Lynx (Oct 28, 2020)

Steevo said:


> When do we get reviews?




nov 18th


----------



## dinmaster (Oct 28, 2020)

Steevo said:


> When do we get reviews?



Nov 18 for the 6800 and 6800 XT, then Dec 8 for the 6900 XT I would imagine, as those are the release dates.


----------



## rvalencia (Oct 28, 2020)

DemonicRyzen666 said:


> I forgot what I was watching but Each CU has a RT unit connected to it mean this the 6900 XT has 80 RT cores and the rtx 3090 has 82 RT cores.


The RX 6900 XT runs at a higher clock speed than the RTX 3090.

RT is memory-bandwidth intensive: the geometry search workload typically has a small data footprint but needs very high bandwidth, hence the very high-speed 128 MB Infinity Cache.

The RDNA 2 based game consoles seem to be missing the very high-speed 128 MB Infinity Cache.
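For context, AMD's slide claimed "more than 2.17x" effective bandwidth from the 128 MB Infinity Cache on a 256-bit bus. A quick sanity check of what that implies (assuming 16 Gbps GDDR6 modules, which is my assumption, not a confirmed spec):

```python
# Raw bandwidth of a 256-bit GDDR6 bus, assuming 16 Gbps per pin.
bus_width_bits = 256
gddr6_gbps = 16                                     # assumed data rate
raw_gbs = bus_width_bits / 8 * gddr6_gbps           # -> 512 GB/s

# AMD's slide: Infinity Cache gives >2.17x effective bandwidth vs raw.
amd_multiplier = 2.17
effective_gbs = raw_gbs * amd_multiplier            # -> ~1111 GB/s

print(f"raw: {raw_gbs:.0f} GB/s, effective: ~{effective_gbs:.0f} GB/s")

# For comparison, the RTX 3090's 384-bit 19.5 Gbps GDDR6X is 936 GB/s raw.
raw_3090 = 384 / 8 * 19.5
print(f"RTX 3090 raw: {raw_3090:.0f} GB/s")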


----------



## moproblems99 (Oct 28, 2020)

lynx29 said:


> how so?  AMD announced a few weeks ago they are still producing rx 5700 and zen 2 cpu's "if there is demand for it we are still making it"  and Amazon is getting a restock of zen 2 after zen 3 comes out... so....  you can see it on their order page even... "in stock in november"



Unrelated, but why do you keep flip-flopping about being a broke minimalist hippie and then spouting off a bunch of nonsense about buying a 5900X and BigNavi.  Then you'll trash AMD about waiting and write a dissertation about the 9700K being more cost effective and slobbering over Nvidia stuff and then back and forth again.  Are you having an identity crisis or just confused?  Not trying to be a dick, just wondering.


----------



## Space Lynx (Oct 29, 2020)

moproblems99 said:


> Unrelated, but why do you keep flip-flopping about being a broke minimalist hippie and then spouting off a bunch of nonsense about buying a 5900X and BigNavi.  Then you'll trash AMD about waiting and write a dissertation about the 9700K being more cost effective and slobbering over Nvidia stuff and then back and forth again.  Are you having an identity crisis or just confused?  Not trying to be a dick, just wondering.



lol i won't buy a 5900x, it's just fun trolling how silly the fanboy crap is here. dude, i play hearthstone and magic the gathering for hours daily. i'd be fine with my gtx 1070 for the next 3 years. i most likely won't be able to get a 5600x on launch day before it sells out, in which case i will wait for the non-X 5600 in spring 2021 and OC it to a single-core 4.7 and lulz in all caps. and then you can find me annoying again at how i mock it all

but i will say this, its nice to see AMD doing so well... the competition this is generating on CPU and GPU front is going to change everything.


----------



## InVasMani (Oct 29, 2020)

Let's hope RDNA3/Zen 4 can refine this Infinity Cache further, along with more cores and better frequency from 5nm. So much flex from AMD right now relative to where they were not so long ago; firing on all cylinders, so to speak. Now if Gaben can just remake the original Counter-Strike with RTRT.


----------



## N3M3515 (Oct 29, 2020)

kings said:


> I think that AMD missed an opportunity with the 6800. It's true that it has 16GB VRAM, but it should lose a lot in RT and it doesn't have any DLSS equivalent, being $79 more expensive.
> 
> In my view, AMD had made a better move to put this card with 8GB like the RTX 3070 and sold it below $500.



I think they will release an 8GB 6800 and sell it for $500.


----------



## wolar (Oct 29, 2020)

moproblems99 said:


> Why would you want that?
> 
> 
> 
> That would be a disaster for then if all they did was pump out ampere on 7nm for the next 18 months.


Because if that happens, the price difference will be even bigger, making it a really nice card to own if you are not doing 4K and are instead going for high FPS at lower resolutions, right?


----------



## moproblems99 (Oct 29, 2020)

wolar said:


> Because if that happens the price difference will be even bigger and make it a real nice card to own if you are not doing 4k and instead going for high FPS on lower resolutions, right?



Pretty sure you don't even need one of these for high fps gaming on lower resolutions.


----------



## Totally (Oct 29, 2020)

lynx29 said:


> how so?  AMD announced a few weeks ago they are still producing rx 5700 and zen 2 cpu's "if there is demand for it we are still making it"  and Amazon is getting a restock of zen 2 after zen 3 comes out... so....  you can see it on their order page even... "in stock in november"



The 5700 is EOL; the 5700 XT is still in production, iirc. They probably got to a point where they don't need to bin the chips anymore, and with the new cards coming, why not give the weakest SKU with the smallest margins the axe?


----------



## Space Lynx (Oct 29, 2020)

Totally said:


> 5700 is EOL, the 5700 XT is still in production iirc. They probably got to a point where they don't need to bin the chips anymore. And with the new cards coming why not give the weakest sku with smallar margins the axe?



this makes sense, cheers


----------



## Steevo (Oct 29, 2020)

Totally said:


> 5700 is EOL, the 5700 XT is still in production iirc. They probably got to a point where they don't need to bin the chips anymore. And with the new cards coming why not give the weakest sku with smallar margins the axe?




Die loss is still a thing. Plus, why not reward lower bins with lower clocks and maybe the ability to unlock features (something Nvidia doesn't do, and the reason I found this site... years ago)?

They probably have contracts for production.

They are trying to sell out of what inventory they have.


----------



## kruk (Oct 29, 2020)

Khonjel said:


> Production is already mature though. AMD skipped 5nm and all the ensuing supply war and stayed with mature (as of now) 7nm. I think opposite will happen. AMD keeps the best of the best Navi 21 dies for RX 6900 XT while relegating slow/not-so-best dies to RX 6900 to fight off RTX 3080 Ti.



Maybe. The maturity argument seems reasonable... but the gap between the RX 6900 XT and 6800 XT is already tiny, and unlike Nvidia, which can use more VRAM for an RTX 3080 Ti, AMD can't really do much.
I mean... they released a water-cooled Vega with a 375 W TBP, so why would they hold back now, especially when the competition went crazy with power consumption?

And then there is this:

https://twitter.com/i/web/status/1321522404800299009

Maybe the 30% frequency increase over RDNA was for Navi 22, maybe we misunderstood how they calculated it, or maybe they have another ace up their sleeve (Navi RAGE Edition?)...


----------



## Space Lynx (Oct 29, 2020)

kruk said:


> Maybe. The maturity argument seems reasonable ... but the gap between the RX 6900 XT and 6800 XT is already tiny, and unlike nVidia, which can use more VRAM for an RTX 3080 Ti, AMD can't really do much.
> I mean ... they released a water-cooled Vega with 375 W TBP, why would they hold back now, especially when the competition went crazy with power consumption?
> 
> And then there is this:
> ...




or maybe they are just leaving generous overclocking headroom for board partners too? it's possible.


----------



## okbuddy (Oct 29, 2020)

TheLostSwede said:


> Looks like four to me.



usb for charging  iphone


----------



## Caring1 (Oct 29, 2020)

okbuddy said:


> usb for charging  iphone


Lol, it's for HMD's.


----------



## ratirt (Oct 29, 2020)

kruk said:


> Maybe. The maturity argument seems reasonable ... but the gap between the RX 6900 XT and 6800 XT is already tiny, and unlike nVidia, which can use more VRAM for an RTX 3080 Ti, AMD can't really do much.
> I mean ... they released a water-cooled Vega with 375 W TBP, why would they hold back now, especially when the competition went crazy with power consumption?
> 
> And then there is this:


The 3080 Ti will sit in between the 3080 and 3090, so I'm not sure what you are after with this. Sure, the 6800 XT is not far back, but isn't that exactly where the 3080 sits versus the 3090? If NV releases a 3080 Ti between the 3080 and 3090 with more VRAM than the 3080, the performance difference will be very small. AMD already has a sufficient amount of VRAM, so why bother? A 3080 Ti may have more VRAM, but what would the performance look like? I think that is the main factor here, if NV decides to release a 3080 Ti at all.
The question is, when NV releases it, when can we actually find it on market shelves? The 3080 and 3090, where I live, are scheduled for January btw, and the 3070 for the end of November.
If AMD delivers on November 18th, it is a huge win for AMD.


----------



## Zach_01 (Oct 29, 2020)

Nucleoprotein said:


> 6800 16GB vs 3070 8GB, see the difference in price?


According to the slides and AMD's statements, the 6800 is +18% faster than the 2080 Ti/3070 8GB for +16% in price.
Although those graphs were for the 6800 with Smart Access Memory on. If you turn that off, it may be just 15% faster or thereabouts.


Khonjel said:


> RX 6800 is 60CU > RTX 3070
> 
> RX 6800 XT is 72CU ≈ RTX 3080
> 
> ...


The 6800 is probably the 3070 Ti competitor, as it's above the 2080 Ti/3070.

---------------------------------------------------------------

I can see there is a lot of confusion about the new feature AMD is calling "Smart Access Memory" and how it works. My 0.02 on the subject.
According to the presentation, the SAM feature can be enabled only on 500-series boards with a Zen 3 CPU installed. My assumption is that they use PCI-E 4.0 capabilities for this, but I'll get back to that.
The SAM feature has nothing to do with InfinityCache. IC is there to compensate for the 256-bit bandwidth between the GPU and VRAM. *That's it, end of story*. *And according to AMD it is the equivalent of an 833-bit bus*. Again, this has nothing to do with SAM. IC is in the GPU and works the same way for all systems; they didn't say you have to do anything to "get it" to work. Whether it works with the same effectiveness in all games, we will have to see.

Smart Access Memory
*They use SAM to give the CPU access to VRAM and probably speed things up a little on the CPU side*. That's it. They said it in the presentation, and they showed it too...
And they can probably get this done because of PCI-E 4.0's speed capability. If true, that's why there is no 400-series support.
They also said that this feature may get better in the future than it is today, once game developers optimize their games for it.
I think AMD just made PCI-E 4.0 (on their own platform) more relevant than it ever was until now!

Full CPU access to GPU memory:
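What is described here, the CPU addressing all of VRAM rather than the traditional 256 MB window, matches the PCIe Resizable BAR capability, which is widely assumed to be the mechanism behind SAM. A hypothetical sketch of how you could check what aperture a card currently exposes by parsing `lspci -vv`-style output; the dumps below are made up for illustration:

```python
import re

# Sizes in lspci output look like [size=256M] or [size=16G]
_UNITS = {"K": 1 << 10, "M": 1 << 20, "G": 1 << 30}

def bar_sizes(lspci_vv_output: str) -> list:
    """Return the sizes (in bytes) of the memory BARs found in an
    `lspci -vv`-style dump for a single device."""
    sizes = []
    for m in re.finditer(r"Region \d+: Memory at \S+ .*\[size=(\d+)([KMG])\]",
                         lspci_vv_output):
        sizes.append(int(m.group(1)) * _UNITS[m.group(2)])
    return sizes

def full_vram_visible(lspci_vv_output: str, vram_bytes: int) -> bool:
    """True if some BAR is at least as large as the card's VRAM,
    i.e. the CPU can address the whole framebuffer directly."""
    return any(s >= vram_bytes for s in bar_sizes(lspci_vv_output))

# Made-up dumps: a classic 256 MB aperture vs. a resized 16 GB one
classic = "Region 0: Memory at e0000000 (64-bit, prefetchable) [size=256M]"
resized = "Region 0: Memory at 7c00000000 (64-bit, prefetchable) [size=16G]"

print(full_vram_visible(classic, 16 * (1 << 30)))  # False
print(full_vram_visible(resized, 16 * (1 << 30)))  # True
```

On a classic setup the BAR is only a 256 MB window into VRAM and the check fails; with a resized BAR covering the full 16 GB it passes.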



-------------------------------------------------------------------------------

My opinion on the 6000 series is mixed.
They probably don't have as good an implementation of RTRT and a DLSS equivalent as nVidia has; probably around 2080 Ti performance for RTRT at least, and I can't comment on the Fidelity thing because I don't know.
But that is not what left me with a mixed taste about the 6000 series. RTRT is the future, yes, but today and next year it is still too restricted and not so important to most. For some it is. As for variable render resolution, we will also see how it goes.

The 6800 XT seems to me the only card priced well, at $649, going by rasterization performance against the 3080, with no Rage Mode or SAM. Given that with those on it may beat the 3080, depending on the OC headroom, and given the 3080's inability to clock more than marginally higher, the 6800 XT is all the more appealing.

The 6900 XT is not a 3090 rival the way the 6800 XT is for the 3080. Yes, it showed about the same chart differences as the pair below it (6800 XT vs 3080), but only with Rage Mode and SAM on. Which means that without them, out of the box at 300 W, it will probably sit halfway between the 3080 and 3090. A really tight gap, but still. That makes it about 5~6% better than the 6800 XT, more or less. For +54% the price? Yeah...
AIBs may be able to squeeze more out of the 6900 XT GPU and match the 3090 or even pass it, but I find it hard to believe those cards will cost less than $1100~1200.
The 6900 XT will probably be a 3080 Ti competitor on performance and price(?).

I was expecting slightly more distinguishable products at the top end from AMD, not a repeat of the nVidia stupidity. Why AMD didn't give the "FE" a +10% power-draw headroom to match or even pass the 3090, I don't know. Maybe the cooler design maxes out at 300 W. As I said, AIBs (if any exist for the 6900 XT) will probably do that, but not for $999.

The 6800 at $579... a little steep... It will become clearer when the 3070 Ti is out, but comparing it to the 6800 XT and their performance difference, it should have been $549 max. That would make more sense:
+10% price for +15% more performance than the 3070 non-Ti, and a nice rival for the 3070 Ti, on which I see a $600 price tag.

I guess reviews will clear things up, but I don't think they will significantly change my initial thoughts.
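The pricing argument above reduces to a quick back-of-the-envelope perf-per-dollar calculation. A sketch using the launch MSRPs and the rough relative-performance figures quoted in this thread (these are the posters' estimates, not benchmark results):

```python
# Back-of-the-envelope perf-per-dollar using launch MSRPs and the rough
# performance deltas quoted in this thread (estimates, not benchmarks).
cards = {
    # name: (price_usd, relative performance, RX 6800 XT = 1.00)
    "RX 6800":    (579, 0.85),
    "RX 6800 XT": (649, 1.00),
    "RX 6900 XT": (999, 1.06),   # "about 5~6% better" per the post above
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf points per $1000")
# RX 6800:    1.47 perf points per $1000
# RX 6800 XT: 1.54 perf points per $1000
# RX 6900 XT: 1.06 perf points per $1000
```

By these rough numbers the 6800 XT comes out clearly ahead on perf/$, with the 6900 XT roughly a third worse, which is exactly the gap being complained about.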


Edit(s):
typo(s)


----------



## TheLostSwede (Oct 29, 2020)

windwhirl said:


> About that, I have two considerations:
> 1-Contracts/Agreements between TSMC and AMD: Basically, whether the RDNA1 production is kinda required to meet the terms of their deals or not. Also, and somewhat related to that, whether AMD can simply say "screw it" and switch out RDNA1 production for RDNA2 overnight.
> 2-Contracts/Agreements between AMD and OEMs/AIBs/etc.: Whether AMD has to provide RDNA1 products to others due to legal obligations, regardless of the shiny new RDNA2.
> 
> @TheLostSwede you seem more knowledgeable on the fabs topic, so I refer to you  I won't ask about legal agreements and what not because those papers are probably not public anyway, but is it possible for AMD to switch out RDNA1 for RDNA2 in TSMC fabs relatively quickly (say, a couple weeks at most) or is it a long, slow process?


Right, so AMD already has a great example of how foundry contracts work in their dealings with Global Foundries. They signed a contract with GloFo for X amount of wafers per X amount of time. GloFo obviously stopped developing cutting edge foundry processes, so I'm sure AMD could've gotten out of the contract somehow, but this is why we're getting the X570 chipset and the I/O die made by GloFo on a different node, which most likely also saves AMD some cash.

As for TSMC, their contracts are most likely the same: you place an order for X amount of capacity and you can make whatever products you've taped out with them, using up that allocation.

The key thing here is that you pay a fee to tape out your products, and it's not little money: something small like a USB 3.0 device controller is in the region of US$500,000 on a not-cutting-edge node. This is one of the big costs when it comes to making an IC, so ideally you don't want more than one tape-out, but we know that isn't always the case. Once taped out on the fab's node, you can always go back and make more of the same thing, and as we know, over time the fab tends to improve both yields and the quality of the chips it makes.

So yes, as long as it's on a node that AMD has a contract to manufacture on, they can switch in whatever product they want. It would obviously take a bit of time to do this, a week or two I would say, as this isn't quite like baking cookies. I mean, just remember the issues that happened not too long ago for Toshiba/WD, where a 13-minute power outage wasted 6-9 exabytes of flash, and you realise how sensitive these fabs are. I have been to a few fabs over the years, and although it's not the most exciting thing in the world, they are really quite amazing considering the tolerances and cleanliness required to make chips.

The short answer is, yes, AMD can swap in and manufacture whatever they want, as long as they have a contract to manufacture on said node and have enough capacity left.

As for your second question, there are different types of supply contracts. In most cases there's an EOL notice; you must have seen some of the ones Intel sends out to notify its customers that a certain product won't be produced past a certain date, with final order and shipment dates. The other kind is like when AMD makes the console SoCs, where they have a contract to supply X amount of chips per month for a certain period of time. I very much doubt AMD has such a contract with any of the graphics card makers. Quite often there's an allocation based on what each partner is planning to manufacture for the next six months or so. I doubt many of their partners would still be interested in an older product now, so there aren't going to be any issues for AMD. Normally there's also a certain amount available in rolling stock, something you can quite easily see if you go to someone like NXP or Microchip and try to order some parts. Popular products are available in stock, whereas for more niche products you might have to place an order with a lead time of 12-16 weeks at a minimum, as an example.



Cheeseball said:


> They are not going to release a RTX 3070 Ti at $599. That would be a stupid business decision when the 3070 is at $499 and the 3080 is at $699. That's only a $100 difference between each tier, and it is still at the bottom of the high-performance segment, which is not exactly where most of the money comes in.
> 
> RTX 2060 Super = $399
> RTX 2070 Super = $599
> ...


The 2070 Super was "only" $499 at launch:








GeForce 20 series - Wikipedia (en.m.wikipedia.org)






Lionheart said:


> I wonder if this Infinity cache will be implemented into 5000 series APU's to remedy System RAM bandwidth.


It's apparently based on the Zen L3 cache design, so that might be tricky.


----------



## SN2716057 (Oct 29, 2020)

And now we sit back and wait for the reviews. 

The RX 6800 XT looks like a good competitor for the 3080 (short-listed), but I wonder how the AMD card will perform on my monitor (Acer Predator XB1 XB271HU [G-Sync]), 'cause I don't want to buy a new one.


----------



## TheLostSwede (Oct 29, 2020)

Caring1 said:


> Lol, it's for HMD's.


More and more monitors have DP alt mode over USB-C now.


----------



## Turmania (Oct 29, 2020)

If the 6900 XT has the same power requirement as the 6800 XT, I believe it's specially binned, cherry-on-top chips that go into the 6900 XT.


----------



## Zach_01 (Oct 29, 2020)

Turmania said:


> If the 6900 XT has the same power requirement as the 6800 XT, I believe it's specially binned, cherry-on-top chips that go into the 6900 XT.


And that's why the +20-day release date. I expect the 6900 XT to be less available than the 6800/6800 XT, and it makes sense, since people buying a $999+ GPU are far fewer. Other than that, the 6800 XT has a far better perf/$ ratio (= less demand for the 6900 XT).


----------



## mb194dc (Oct 29, 2020)

Need to wait for the reviews. Then we need to see how high AIBs can push the clocks; rumors are of Asus cards doing 2500 MHz boost, and higher than that might even be possible. The cards' weakness will be RT, though with all the development going on for console games using essentially the same architecture, it could be largely mitigated in new games.

The GPU market is going to be at its most interesting since 2015, and it will be interesting to see if AMD goes for market share with a price war next year, if they can produce mountains of these cards.


----------



## medi01 (Oct 29, 2020)

Zach_01 said:


> The 6800 at $579... a little steep... It will become clearer when the 3070 Ti is out, but comparing it to the 6800 XT and their performance difference, it should have been $549 max.



An 18% faster card with 100% more VRAM could not cost 16% more.
Sense makes it none.

Why would yet another GA102 based NV GPU be less of a clusterf*ck than 3080?



Zach_01 said:


> halfway between the 3080 and 3090.


It helps to remember that that "way" is 10% of perf.


----------



## kapone32 (Oct 29, 2020)

lynx29 said:


> the thing that sucks the most is that AMD is still using 7nm factories to make ryzen 3000 series and navi 5000 series... like seriously, screw it, go all in on this new stuff so it doesn't sell out on day 1...


One of the things we have to keep in mind is how long these chips have been in development. Let's also remember that this architecture was manufactured for the consoles, which says to me there should be no stock issues. As far as I understand it, the 5000 series is EOL anyway. The 3000 series is mature now, so they should have lots of 3100s. For all we know, the 3300X could be failed console CPUs, which could explain its global scarcity. This is also the end of this node; we will probably see AM5 no earlier than mid-2021. They need the 3000 series to fill the stack for now. The pandemic shows no sign of slowing, meaning the explosion of the PC market will continue. I watched a case (Antec 400) go from $74.99 in my cart to $124.99 in less than a month on Amazon.


----------



## Dave65 (Oct 29, 2020)

RedelZaVedno said:


> $80 for +8 gigs of GDDR6... Micron 1GB GDDR6 @ 14Gbps currently costs $6.86 when ordered in low quantities, and much less when ordered in bulk. You're paying double the price AMD pays for it. How is that a good deal?


Just checked with my wallet, it says it's a damn good deal.


----------



## Shatun_Bear (Oct 29, 2020)

The $650 6800 XT looks an even better deal compared to the "$700" 3080, as $700 for that card is basically a unicorn and will remain one; the actual street price is $800, if you're lucky enough to be one of the people who gets their hands on one of the ~200 in stock... so a $150 saving for similar performance, and with non-gimped memory.

And if you go with a Ryzen 5000 (most new buyers will) performance should be faster than the 3080.


----------



## medi01 (Oct 29, 2020)

RedelZaVedno said:


> $80 for +8 gigs of GDDR6...


+16% price for +18% perf and +8GB VRAM as a bonus.



Totally said:


> 5700 is EOL, the 5700 XT is still in production iirc.


AMD denied that any of the 5700 series had stopped being produced, so, uh, where did you get "5700 is EOL" from?
There is a huge gap between the 5700 XT at $399 and the 6800 at $579, and I doubt anything but perhaps a mild price drop will happen here until the lower-end 6000 series arrives.


----------



## moproblems99 (Oct 29, 2020)

Caring1 said:


> Lol, it's for HMD's.



Her Majesty's Drones?


----------



## windwhirl (Oct 29, 2020)

moproblems99 said:


> Her Majesty's Drones?


Hats off to you, that was too good!


----------



## Totally (Oct 29, 2020)

medi01 said:


> +16% price for +18% perf and +8GB VRAM as a bonus.
> 
> 
> AMD denied any of 5700 series stopped being produced, so, uh, where did you get 5700 is EOL from?
> There is a huge gap between 5700XT $399 and 6800 $579 and I doubt anything but perhaps mild price drop will happen here until low end 6000 series arrive.



from the tpu article a couple weeks back


----------



## EarthDog (Oct 29, 2020)

Totally said:


> from the tpu article a couple weeks back


For medi01 and Shatun_Bear...








AMD RX 5700 Series Reportedly Enter EOL - No Longer Manufactured (www.techpowerup.com)






> Update, October 7th 2020: *AMD has confirmed it has ceased production for the RX 5700,* but that RX 5700 XT manufacturing will be ongoing at least until 1Q2021.


----------



## moproblems99 (Oct 29, 2020)

EarthDog said:


> For medi and shiton bear...



Reading, it does a body good.


----------



## FairNando (Oct 29, 2020)




----------



## windwhirl (Oct 29, 2020)

FairNando said:


> View attachment 173776


After 13 pages of debate, hype, etc., I think I had quite the entertainment lol


----------



## Super XP (Oct 30, 2020)

MxPhenom 216 said:


> To be fair, we don't really know his position on Nvidia pricing...
> 
> ..So your comment looks dumb.


Not really, I am speaking about past and present facts. Nvidia is well known for selling its GPUs at a premium despite not deserving such a premium. A great example is the RTX 2000 series, highly overpriced. Look at it objectively and you should agree; even most Nvidia fanboys complained about the high cost for little to no benefit to justify it.


----------



## Vayra86 (Oct 30, 2020)

lynx29 said:


> next gen consoles are focusing on ray tracing, nvidia is actually going to be left in the dust since they use proprietary ray tracing, most game developers will be focusing on AMD dx12 RT because consoles are the money makers, and we will get those ports. so nvidia has RT crown now, but in future they wont.



Same API....

What's proprietary is how Nvidia handles the processing within its own hardware, no different from AMD really.

An API like DX12 is an abstraction layer. It translates. Nothing more, and green and red are both looking at the same translation service.

The custom hardware in consoles is.... custom, so not quite the same as a discrete Navi GPU. Shared memory, fixed storage speeds... On top of that, history has certainly not provided proof of any sort of better optimization for AMD GPUs.


----------



## MxPhenom 216 (Oct 30, 2020)

Super XP said:


> Not really, I am speaking about past and present facts. Nvidia is well known for selling its GPUs at a premium despite not deserving such a premium. A great example is the RTX 2000 series, highly overpriced. Look at it objectively and you should agree; even most Nvidia fanboys complained about the high cost for little to no benefit to justify it.



Well, maybe if AMD had had anything to compete against Nvidia's RTX 2000 series, it might have driven costs down. This time they do.

Nvidia has been in the market by themselves for quite a while.


----------



## Super XP (Oct 30, 2020)

MxPhenom 216 said:


> Well, maybe if AMD had had anything to compete against Nvidia's RTX 2000 series, it might have driven costs down. This time they do.
> 
> Nvidia has been in the market by themselves for quite a while.


The 1080 and 1080 Ti were somewhat fairly priced, as was Nvidia's previous gen. The RTX 2080, 2080 Ti and 2070 were selling at ripoff prices, and even Nvidia fanboys agree.

Anyhow, I agree AMD left the high end open to Nvidia, because their priority was to get Zen out successfully. Now they have RDNA2, which competes well.


----------



## Fluffmeister (Oct 30, 2020)

FairNando said:


> View attachment 173776



Oh for sure, Nvidia cards are rarer than hens' teeth, but AMD's cards still don't hit the market until the 18th of November.... what a joke.


----------



## moproblems99 (Oct 30, 2020)

Fluffmeister said:


> Oh for sure, Nvidia cards are rarer than hens' teeth, but AMD's cards still don't hit the market until the 18th of November.... what a joke.



Well, barely any better than the 12 combined 3090s, 3080s, 3070s that have been released.


----------



## Fluffmeister (Oct 30, 2020)

moproblems99 said:


> Well, barely any better than the 12 combined 3090s, 3080s, 3070s that have been released.



Well if AMD are sitting on thousands of 6000 series cards in a warehouse somewhere, I'd suggest now is a good time to take advantage of Nvidia's supply woes, just a thought.


----------



## Zach_01 (Oct 30, 2020)

Fluffmeister said:


> Oh for sure, Nvidia cards are rarer than hens' teeth, but AMD's cards still don't hit the market until the 18th of November.... what a joke.


Yes, but wasn't it funnier than nVidia's? At least AMD is trying to be honest and release the cards when they can, not do a Jensen "release" that took place only to undercut and underhype RDNA2.

They probably do have more cards than nVidia. Also, nVidia had planned to ship at least a couple hundred thousand cards by November, but rumors say they cut that in half because they need to restructure their lineup and segmentation.


----------



## Fluffmeister (Oct 30, 2020)

I think if they have decent stock they should take advantage, that is all.

Forgive me, I appreciate AMD can do no wrong.


----------



## Zach_01 (Oct 30, 2020)

They do have the advantage even if they release on the 18th of next month. nVidia will supply cards too, but AMD will have more.


----------



## MxPhenom 216 (Oct 30, 2020)

Fluffmeister said:


> Well if AMD are sitting on thousands of 6000 series cards in a warehouse somewhere, I'd suggest now is a good time to take advantage of Nvidia's supply woes, just a thought.



Or Nvidia has purposefully made the supply low for their new cards...

I have a feeling Nvidia cards will suddenly be in abundance when AMD cards hit the market.


----------



## Fluffmeister (Oct 30, 2020)

MxPhenom 216 said:


> Or Nvidia has purposefully made the supply low for their new cards...



Hey it's possible, it must be nice to sell every single card without issue.

People hated these overpriced cards, now even AMD fans are happy to open their wallets.


----------



## MxPhenom 216 (Oct 30, 2020)

Fluffmeister said:


> Hey it's possible, it must be nice to sell every single card without issue.
> 
> People hated these overpriced cards, now even AMD fans are happy to open their wallets.



It's a tactic. There are rumors that Nvidia will have thousands of cards ready to go towards the end of November.

What else comes out towards the end of November???

Hell, the Microcenter here in Denver had around 100 3070 FE cards, over 10 MSI Ventus 3s, and several EVGA cards yesterday. It still had some into the afternoon. If you wanted a 3070, it was pretty easy to get one, from the sounds of it. Trying to get one online wasn't the way.

Yes, Microcenter is also selling FEs now.


----------



## medi01 (Nov 1, 2020)

Totally said:


> from the tpu article a couple weeks back



I wouldn't call rumors articles, more like cheap clickbait.
AMD issued a statement right after, calling it all BS.


----------



## Space Lynx (Nov 1, 2020)

these extra benches on AMD's website are mighty impressive. Does anyone know if Smart Access Memory will be turned on automatically if I have an X570 mobo, Zen 3 CPU and a Big Navi GPU? Or will it have to be turned on in the AMD drivers?


----------



## Super XP (Nov 1, 2020)

lynx29 said:


> these extra benches on AMD's website are mighty impressive. Does anyone know if Smart Access Memory will be turned on automatically if I have an X570 mobo, Zen 3 CPU and a Big Navi GPU? Or will it have to be turned on in the AMD drivers?


I would have to assume everything ends up being enabled automatically, but drivers will probably have to play a role in identifying what combo you are using, like a Radeon GPU with a Ryzen CPU.
But if you are looking for more refined details on how this works, I think AMD is keeping that quiet, perhaps to keep the competition from learning how it works? I remember AMD filed several patents back in 2017 or 2018 about Infinity Cache and some other memory-utilization thingies.
I am also sure both the new PS5 and the Xbox Series X will be utilizing SAM as well.



> *AMD Smart Access Memory*





> *(Ryzen + Radeon Combo)*, the data channel gets expanded to harness the full potential of GPU memory – removing the bottleneck to increase performance. Obviously, in AMD optimized games, you’ll experience this feature to be best implemented, still great bonus for gamers truly.


----------



## EarthDog (Nov 1, 2020)

lynx29 said:


> these extra benches on AMD's website are mighty impressive. Does anyone know if Smart Access Memory will be turned on automatically if I have an X570 mobo, Zen 3 CPU and a Big Navi GPU? Or will it have to be turned on in the AMD drivers?


Last I heard, it needs to be enabled in the bios.


----------



## Space Lynx (Nov 1, 2020)

EarthDog said:


> Last I heard, it needs to be enabled in the bios.



I always browse the BIOS when it gets updated, so I will give this a look, cheers. Hopefully enabling it doesn't mess up RAM overclocking.


----------



## medi01 (Nov 2, 2020)

moproblems99 said:


> Well, barely any better than the 12 combined 3090s, 3080s, *3070s *that have been released.



About 20% faster (that is what 6800 is to 3070) is spelled "barely better" these days.


----------



## gridracedriver (Nov 2, 2020)

I said it would happen...


----------



## MxPhenom 216 (Nov 2, 2020)

moproblems99 said:


> Well, barely any better than the 12 combined 3090s, 3080s, 3070s that have been released.



My local Microcenter alone had around 100 RTX 3070 FE cards at launch, and they went out of stock in the afternoon of launch day. If you wanted a 3070, it was pretty easy to get one, but trying online was way harder, especially with Newegg, since they have zero bot protection.


----------



## moproblems99 (Nov 2, 2020)

MxPhenom 216 said:


> My local Microcenter alone had around 100 RTX3070 FE at launch and they went out of stock in the afternoon of launch day. If you wanted a 3070 it was pretty easy to get one, but trying to online was way harder especially with Newegg since they have zero bot protection.



For those that have a microcenter in their state...I had Best Buy.


----------



## MxPhenom 216 (Nov 2, 2020)

moproblems99 said:


> For those that have a microcenter in their state...I had Best Buy.



Best Buy was a much safer bet to get one than Newegg.

I almost got an FE card on Best Buy, but was too slow to hit checkout, lul. Didn't even notice I got one in the cart until like 5 minutes later, and they were gone. Best Buy was releasing them in waves.


----------



## Super XP (Nov 2, 2020)

MxPhenom 216 said:


> Best Buy was a much safer bet to get one than Newegg.
> 
> I almost got an FE card on Best Buy, but was too slow to hit check out lul. Didnt even notice i got one in the cart until like 5 minutes later and they were gone. Best Buy was releasing them in waves.


Makes sense to control the wolves from buying them all up I suppose.


----------

