# AMD Radeon RX 6000 Series Specs Leak: RX 6900 XT, RX 6800 XT, RX 6700 Series



## btarunr (Oct 21, 2020)

AMD's Radeon RX 6000 series graphics cards, based on the RDNA2 graphics architecture, will introduce the company's first DirectX 12 Ultimate graphics cards (with features such as real-time raytracing). A VideoCardz report sheds light on the specifications. The 7 nm "Navi 21" and "Navi 22" chips will power the top end of the lineup. The flagship part is the Radeon RX 6900 XT, followed by the RX 6800 XT and RX 6800, all based on "Navi 21." These are followed by the RX 6700 XT and RX 6700, which are based on the "Navi 22" silicon. 

The "Navi 21" silicon physically features 80 RDNA2 compute units, working out to 5,120 stream processors. The RX 6900 XT maxes the chip out, enabling all 80 CUs, and is internally referred to as "Navi 21 XTX." Beyond that, the RX 6900 XT features 16 GB of GDDR6 memory across a 256-bit wide memory interface, and engine clocks boosting beyond 2.30 GHz. The next SKU in AMD's product stack is the RX 6800 XT (Navi 21 XT), featuring 72 of the 80 CUs, working out to 4,608 stream processors, and the same 16 GB, 256-bit GDDR6 memory configuration as the flagship, while its engine clocks go up to 2.25 GHz. 






A notch below the RX 6800 XT is the RX 6800 (Navi 21 XL), which cuts the "Navi 21" down further to 64 compute units, or 4,096 stream processors, with the same 16 GB, 256-bit GDDR6 memory configuration and up to 2.15 GHz engine clocks. The RX 6900 XT, along with the RX 6800 series, will be announced in the October 28 presser. 

The other chip, the 7 nm "Navi 22" silicon, features 40 compute units. On paper this count looks similar to that of "Navi 10," and it remains to be seen whether this is a re-badge or new silicon based on RDNA2. The RX 6700 XT maxes this chip out, featuring 40 CUs, or 2,560 stream processors, while the RX 6700 features fewer CUs (possibly 36). The interesting thing about these two is their memory configuration: 12 GB of 192-bit GDDR6.
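As a sanity check on the leaked figures, here is a minimal sketch of how the shader counts follow from the CU counts. It assumes RDNA2 keeps RDNA1's 64 stream processors per compute unit, which the leak itself does not confirm:

```python
# Hypothetical sanity check on the leaked CU counts.
# Assumption: 64 stream processors per compute unit, as on RDNA1;
# not confirmed for RDNA2 at the time of this leak.
SP_PER_CU = 64

leaked_cus = {
    "RX 6900 XT": 80,  # full "Navi 21"
    "RX 6800 XT": 72,
    "RX 6800":    64,
    "RX 6700 XT": 40,  # full "Navi 22"
}

for sku, cus in leaked_cus.items():
    print(f"{sku}: {cus} CUs -> {cus * SP_PER_CU} stream processors")
```

Under that assumption, the math reproduces the 5,120 / 4,608 / 4,096 / 2,560 figures quoted above.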

*View at TechPowerUp Main Site*


----------



## Quicks (Oct 21, 2020)

Looking GOOD, hope the prices look good as well.


----------



## Mussels (Oct 21, 2020)

I need some reliable performance leaks before my 3080 gets shipped, so I know whether I should cancel or not


----------



## ratirt (Oct 21, 2020)

I wonder whether the 6700 XT will perform around the level of the 5700 XT, or a bit faster. Logic dictates it will be faster than a 5700 XT, but who knows by how much, or whether it will actually be faster at all. 


Mussels said:


> I need some reliable performance leaks before my 3080 gets shipped, so I know whether I should cancel or not


You're still waiting for the 3080? Oh boy. The Navi 6000 series will definitely arrive before you get the card, so you'll have your chance to compare. I'm also aiming at roughly 3080-level performance and waiting for the new Navi to see what it offers. Either way, I need a new, faster GPU.


----------



## Solid State Soul ( SSS ) (Oct 21, 2020)

Mussels said:


> I need some reliable performance leaks before my 3080 gets shipped, so I know whether I should cancel or not


Performance isn't always the full picture; stable drivers also matter. Look how the RX 5000 series launched: terrible drivers for months.


----------



## Mussels (Oct 21, 2020)

Solid State Soul ( SSS ) said:


> Performance isn't always the full picture; stable drivers also matter. Look how the RX 5000 series launched: terrible drivers for months.



But they're stable in the long run. I've got an RX 470 and 580 in the house in lesser machines and they run totally smooth; I keep forgetting they're not Nvidia and try to open the overlay with the NV shortcut of Alt-Z to stream things.


----------



## ebivan (Oct 21, 2020)

Well, I really hope the 6800 XT is on par with the 3080 on both price and performance. 

Maybe €700 and 10% slower than the 3080; that would be great. And of course it would be great if the cards were actually available at that price, and not only in minimal quantities for twice the price...


----------



## ratirt (Oct 21, 2020)

Solid State Soul ( SSS ) said:


> Performance isn't always the full picture; stable drivers also matter. Look how the RX 5000 series launched: terrible drivers for months.


Honestly, I've never noticed anything wrong with the drivers, and I've been rocking an AMD 5600 XT for several months.


ebivan said:


> Well, I really hope the 6800 XT is on par with the 3080 on both price and performance.
> 
> Maybe €700 and 10% slower than the 3080; that would be great. And of course it would be great if the cards were actually available at that price, and not only in minimal quantities for twice the price...


I'm starting to think the 6900 XT is the one on par with the 3080, or thereabouts. Even if the 6800 XT is on par, the 6900 XT will not be far ahead of the 3080 and 6800 XT.


----------



## ebivan (Oct 21, 2020)

ratirt said:


> Honestly, I've never noticed anything wrong with the drivers, and I've been rocking an AMD 5600 XT for several months.
> 
> I'm starting to think the 6900 XT is the one on par with the 3080, or thereabouts. Even if the 6800 XT is on par, the 6900 XT will not be far ahead of the 3080 and 6800 XT.


At the end of the day it's the price, not the model number, that's important. Huang promised 4K 60 fps for 700 bucks. That's what I expect, nothing more, nothing less. I really hope Su can give me what Huang didn't!


----------



## ratirt (Oct 21, 2020)

ebivan said:


> At the end of the day it's the price, not the model number, that's important. Huang promised 4K 60 fps for 700 bucks. That's what I expect, nothing more, nothing less. I really hope Su can give me what Huang didn't!


By all means it is the price. I'm waiting impatiently for the new Navi to see the performance and the price. I know it will be good, but how good is yet to be revealed.


----------



## Anymal (Oct 21, 2020)

How did they both decide, several years ago, that TBP would be 320 W? Smaller node? No problem, let's make it more power hungry.


----------



## turbogear (Oct 21, 2020)

Only another week until the official launch. 
Let's see when official reviews come out after the launch date.
I'm interested to see the performance and price of the 6900 XT compared to the 3080, especially RT performance.
The power consumption, according to other leaks, seems to be similar to the 3080. 

Let's see if the launch from AMD will be smoother than Nvidia's, or whether we'll have a similar issue of non-existent stock everywhere.

My water-cooled Radeon VII is still doing well on a 2K monitor, so I'm in no hurry to buy another card. 

If the Radeon 6900 XT is a great performer, it may be time to go to 4K gaming.  
The 6900 XT could become my next water-cooling project. 
It all depends on whether this card impresses me with its performance or turns out to be a disappointment.
Therefore I will wait for the review from TPU.


----------



## Deleted member 190774 (Oct 21, 2020)

Solid State Soul ( SSS ) said:


> Performance isn't always the full picture, stable drivers also matters, look how the RX 5000 series launched, terrible drivers for months.


Well it certainly didn't help that there was an underlying hardware issue (just what I'd heard).


----------



## laszlo (Oct 21, 2020)

Mussels said:


> I need some reliable performance leaks before my 3080 gets shipped, so I know whether I should cancel or not




Unfortunately I can't help you; the only reliable leak for the moment is the one with "take a.."


----------



## ebivan (Oct 21, 2020)

I really think supply will be better. AMD learned from their experiment with HBM that exotic memory is not the answer for mainstream cards; Nvidia is just having the same learning experience. You can't launch a product relying on a single exotic type of memory that is only produced by a single manufacturer...


----------



## nguyen (Oct 21, 2020)

ebivan said:


> I really think supply will be better. AMD learned from their experiment with HBM that exotic memory is not the answer for mainstream cards; Nvidia is just having the same learning experience. You can't launch a product relying on a single exotic type of memory that is only produced by a single manufacturer...



Really? Remember GDDR5X, and how it made Pascal so good it's hard to replace?


----------



## ratirt (Oct 21, 2020)

ebivan said:


> I really think supply will be better. AMD learned from their experiment with HBM that exotic memory is not the answer for mainstream cards; Nvidia is just having the same learning experience. You can't launch a product relying on a single exotic type of memory that is only produced by a single manufacturer...


That is for sure. Any launch AMD performs with cards actually available would be better than Nvidia's. The cards for Norway are scheduled for January; how is that a good launch when you get the cards four months later? With AMD it will definitely be better. How much better? We will see soon.


nguyen said:


> Really? Remember GDDR5X, and how it made Pascal so good it's hard to replace?


AMD is not using GDDR6X like Nvidia does. GDDR6 availability is better. Way better. Is Pascal so hard to replace? I wouldn't say so.


----------



## FinneousPJ (Oct 21, 2020)

Okay, so I'm guessing the 6800 non-XT is the one to get if you want to try teasing out some extra performance with a manual OC. Looks like it'll still feature the full 16 GB of RAM; I just hope the price is right. Sub-$400 should be a reasonable expectation, I think.


----------



## ratirt (Oct 21, 2020)

FinneousPJ said:


> Okay, so I'm guessing the 6800 non-XT is the one to get if you want to try teasing out some extra performance with a manual OC. Looks like it'll still feature the full 16 GB of RAM; I just hope the price is right. Sub-$400 should be a reasonable expectation, I think.


Sub $400? Oh I wish that was true but I wouldn't count on that. I'd go $500 for the 6800.


----------



## FinneousPJ (Oct 21, 2020)

ratirt said:


> Sub $400? Oh I wish that was true but I wouldn't count on that. I'd go $500 for the 6800.


Well, I'm spitballing like so:
6900 XT <> 3080, $700
6800 XT <> 3070, $500
6800 <> 3060, $400

But it's only a guess.


----------



## ebivan (Oct 21, 2020)

FinneousPJ said:


> Well, I'm spitballing like so:
> 6900 XT <> 3080, $700
> 6800 XT <> 3070, $500
> 6800 <> 3060, $400
> ...


That would be great, but I don't think it's realistic. Since there never actually were any 3080s for $700 (or has anyone actually gotten an FE card?), AMD will probably not sell a 3080 equivalent for that price.


----------



## FinneousPJ (Oct 21, 2020)

ebivan said:


> That would be great, but I don't think it's realistic. Since there never actually were any 3080s for $700 (or has anyone actually gotten an FE card?), AMD will probably not sell a 3080 equivalent for that price.


What would be your guess? Shift everything up by $100?

6900 XT <> 3080, $800
6800 XT <> 3070, $600
6800 <> 3060, $500


----------



## ratirt (Oct 21, 2020)

FinneousPJ said:


> What would be your guess? Shift everything up by $100?
> 
> 6900 XT <> 3080, $800
> 6800 XT <> 3070, $600
> 6800 <> 3060, $500


There's the 6700 XT as well, and that one would be $400, I think.


----------



## ebivan (Oct 21, 2020)

FinneousPJ said:


> What would be your guess? Shift everything up by $100?
> 
> 6900 XT <> 3080, $800
> 6800 XT <> 3070, $600
> 6800 <> 3060, $500



I'm not sure, but Ryzen 5000 prices suggest that AMD is much more confident with pricing these days, even if their GPUs are not in the same (leading) position as their Ryzen CPUs (yet). The fact that Nvidia just can't deliver, and that there are gazillions of gamers out there willing to spend $1k on a 3080 if it were just available somewhere, might enable them to adjust prices to Nvidia's level and not be the "a little worse but also a little cheaper" option anymore.


----------



## ratirt (Oct 21, 2020)

ebivan said:


> I'm not sure, but Ryzen 5000 prices suggest that AMD is much more confident with pricing these days, even if their GPUs are not in the same (leading) position as their Ryzen CPUs (yet). The fact that Nvidia just can't deliver, and that there are gazillions of gamers out there willing to spend $1k on a 3080 if it were just available somewhere, might enable them to adjust prices to Nvidia's level and not be the "a little worse but also a little cheaper" option anymore.


Yes, more confident for CPUs, but GPUs are a different story. The 6900 XT for $700 is OK considering NV's failure to deliver; paying $1k for a 3080 is just downright crazy. This supply issue for NV's cards may work pretty well for AMD if their cards are comparable in performance. A lot of people will drop the 3080 and go with AMD, and that would be huge for the company if they sell a lot in the first months. That is, of course, if AMD delivers, which I'm sure they will try to do by making the cards as available as possible.


----------



## AnarchoPrimitiv (Oct 21, 2020)

I'm sure even if performance is there and is equivalent to Nvidia's offerings, some people will feel entitled to AMD pricing their cards $200 cheaper


----------



## Ibotibo01 (Oct 21, 2020)

I think the RX 6800 would be better value than the RTX 3060/3070, especially since Nvidia isn't delivering their GPUs. AMD will take advantage if they deliver before Q1 2021.





The RX 6900 XT's core count is 2x that of the RX 5700 XT, so I'll use GTX 1060 vs. GTX 1080 as an analogy (the core-count gap is the same as between the RX 6900 XT and RX 5700 XT), but RDNA2 will probably give a 1.2x performance boost over RDNA1.

100/57 = 1.754; 1.66 × 1.2 × 0.91 (bandwidth) = 1.81. I think the 6900 XT is faster than the RTX 3080, and its performance lands between the RTX 3090 and RTX 3080.
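The back-of-the-envelope estimate here can be written out explicitly; note that the 1.66, 1.2 and 0.91 factors are the poster's assumptions, not measured data:

```python
# Sketch of the scaling estimate above. All three factors are
# assumptions from the post, not measurements.
core_scaling = 1.66  # assumed real-world gain from doubling the CU count
arch_uplift  = 1.2   # assumed RDNA2-over-RDNA1 per-CU improvement
bandwidth    = 0.91  # assumed penalty for the relatively narrower bus

estimate = core_scaling * arch_uplift * bandwidth
print(f"RX 6900 XT vs. RX 5700 XT: ~{estimate:.2f}x")  # ~1.81x
```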

My predictions for prices and performance:

RX 6900 XT: $799, faster than the RTX 3080 but weaker than the RTX 3090
RX 6800 XT: $629, faster than the RTX 3070 and almost the same as the RTX 3080
RX 6800: $479, faster than the RTX 2080 Ti by 5-10%
RX 6700 XT: $379, same performance as the RTX 3060 Ti, but with 12 GB VRAM
RX 6700: $329, same performance as the RTX 3060/2080
RX 6600 XT: $229, faster than the RTX 2060/S/3050 Ti
RX 6500 XT: $179, same performance as the RTX 3050 Ti/GTX 1660 Ti
RX 6500: $129, same performance as the RTX 3050/GTX 1660

All of these GPUs are capable of ray tracing, and even the RX 6500 and RTX 3050 should give the RTX 2060's ray-tracing performance. *All in all, it is just my prediction.*


----------



## Ja.KooLit (Oct 21, 2020)

Ibotibo01 said:


> I think the RX 6800 would be better value than the RTX 3060/3070, especially since Nvidia isn't delivering their GPUs. AMD will take advantage if they deliver before Q1 2021.
> 
> The RX 6900 XT's core count is 2x that of the RX 5700 XT, so I'll use GTX 1060 vs. GTX 1080 as an analogy (the core-count gap is the same as between the RX 6900 XT and RX 5700 XT), but RDNA2 will probably give a 1.2x performance boost over RDNA1.
> ...



I hope you're right. Then AMD would be my next card.


----------



## bug (Oct 21, 2020)

I see nobody noticed the seemingly large gap between the 6700 XT and the 6800. If nothing else, it means the price difference between the two is most likely >$100.
One more week to go, fingers crossed.


----------



## Turmania (Oct 21, 2020)

I'll probably get a card that has a 150-160 W average gaming usage. That means I have to wait a while longer; not bad, as they'll have fixed more driver/stability issues by then.


----------



## ShurikN (Oct 21, 2020)

Here's hoping that the 6700 is no more than 300€


----------



## bug (Oct 21, 2020)

ShurikN said:


> Here's hoping that the 6700 is no more than 300€


With 192 bit bus, I'd hope the 6700_XT_ is €300 or less.


----------



## WeeRab (Oct 21, 2020)

Solid State Soul ( SSS ) said:


> Performance isn't always the full picture; stable drivers also matter. Look how the RX 5000 series launched: terrible drivers for months.


You said that with a straight face too... even after the manifest and well-documented problems with Nvidia drivers for the 3080/3090.
I'll settle for products you can actually buy, and not some semi-vapourware, crash-to-desktop, bug-ridden Nvidia cards.


----------



## RedelZaVedno (Oct 21, 2020)

RX 6900 XT = 23.55 FP32 TFLOPS (75% more than the 2080 Ti)
RX 6800 XT = 20.74 TFLOPS (54% more than the 2080 Ti)
RX 6800      = 17.61 TFLOPS (31% more than the 2080 Ti)
RX 6700 XT (if clocked at 2.15 GHz) = 11 TFLOPS (18% less than the 2080 Ti / on par with the 2080S)

Not bad at all... RDNA2 should give Ampere a good run for its money IF AMD chooses to be aggressive on pricing.
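These figures follow from the standard FP32 throughput formula: stream processors × 2 ops per clock (FMA) × clock speed. A quick sketch, assuming 64 shaders per CU and the leaked boost clocks (not measured game clocks):

```python
# FP32 throughput: CUs x 64 shaders x 2 FLOPs/clock (FMA) x clock (GHz).
# Clocks here are the leaked boost figures, not measured game clocks.
def fp32_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"RX 6900 XT: {fp32_tflops(80, 2.30):.2f} TFLOPS")  # 23.55
print(f"RX 6800 XT: {fp32_tflops(72, 2.25):.2f} TFLOPS")  # 20.74
print(f"RX 6800:    {fp32_tflops(64, 2.15):.2f} TFLOPS")  # 17.61
print(f"RX 6700 XT: {fp32_tflops(40, 2.15):.2f} TFLOPS")  # 11.01
```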


----------



## bug (Oct 21, 2020)

WeeRab said:


> You said that with a straight face too... even after the manifest and well-documented problems with Nvidia drivers for the 3080/3090.
> I'll settle for products you can actually buy, and not some semi-vapourware, crash-to-desktop, bug-ridden Nvidia cards.


Well, Nvidia's problems were at least confirmed and worked around quickly.
As for availability, I don't care for cards that expensive, so it's all the same to me.


----------



## kapone32 (Oct 21, 2020)

Nothing but roses for AMD. If all of these cards clock past 2 GHz, the 6700 XT should be about 15% faster than the 5700 XT, with more VRAM. That's extrapolating from tradition, though; I want to know more about the rumoured cache on these cards.


----------



## bug (Oct 21, 2020)

kapone32 said:


> Nothing but roses for AMD. If all of these cards clock past 2 GHz, the 6700 XT should be about 15% faster than the 5700 XT, with more VRAM. That's extrapolating from tradition, though; I want to know more about the rumoured cache on these cards.


What do you honestly expect from a cache? It will speed up accesses as long as you stay within its capacity, and it will do nothing when you don't.
NB: if it's a cache for VRAM, having to stay within the cache's size would defeat the purpose of more VRAM. But this may be some other type of cache (if it exists at all; I think AMD never confirmed its existence).


----------



## RH92 (Oct 21, 2020)

Quicks said:


> Looking GOOD hope the prices look good as well.



Depends what your definition of "prices looking good" is. If by looking good you're expecting an AMD SKU to perform similarly to an Nvidia counterpart and cost much less, then with all due respect you are fooling yourself. AMD doesn't work like this anymore; nowadays, when AMD manages to compete with Nvidia on performance, prices are almost the same. So if you were complaining about Nvidia's pricing, I'm afraid it won't look good for you!


----------



## ebivan (Oct 21, 2020)

RedelZaVedno said:


> RX 6900 XT = 23.55 FP32 TFLOPS (75% more than the 2080 Ti)
> RX 6800 XT = 20.74 TFLOPS (54% more than the 2080 Ti)
> RX 6800      = 17.61 TFLOPS (31% more than the 2080 Ti)
> RX 6700 XT (if clocked at 2.15 GHz) = 11 TFLOPS (18% less than the 2080 Ti / on par with the 2080S)
> ...



Well, TFLOPS say something about compute performance, which doesn't really mean gaming performance. 
It's like comparing sequential write speeds with real-world SSD performance.


----------



## RH92 (Oct 21, 2020)

Mussels said:


> I need some reliable performance leaks before my 3080 gets shipped, so i know if i should cancel or not


 
I'm in the same boat here. I'm waiting for Caseking to fulfill my TUF 3080 preorder, but at this point it won't hurt to see what AMD can offer. To be honest, though, most people here judge these GPUs on pure rasterization performance; that's not my case, so I will only go for AMD if they are close on both rasterization and raytracing performance.


----------



## kapone32 (Oct 21, 2020)

bug said:


> What do you honestly expect from a cache? It will speed-up access as long as you don't exceed its cache and it will do nothing when you do.
> NB If it's a cache for VRAM, having to stay within the cache's size would defeat the purpose of more VRAM. But this may be some other type of cache (if it exists at all, I think AMD never confirmed its existence).


It could mimic Infinity Fabric in some way. I have only ever seen rumours, but I thought something like that between the GPU and VRAM would be intriguing. The article stated that it could mitigate the narrow bus versus the 3000 series cards and the speed difference between GDDR6 and 6X.


----------



## Turmania (Oct 21, 2020)

So the 28th will be the announcement, but when will the cards actually launch and go up for reviews?


----------



## kapone32 (Oct 21, 2020)

Turmania said:


> So the 28th will be the announcement, but when will the cards actually launch and go up for reviews?


Hopefully Nov 5 but most likely for CP 2077.


----------



## bug (Oct 21, 2020)

kapone32 said:


> It could mimic Infinity Fabric in some way. I have only ever seen rumours, but I thought something like that between the GPU and VRAM would be intriguing.


Infinity Fabric is an interconnect solution, it's nothing like a cache.



kapone32 said:


> The article stated that it could mitigate the narrow bus versus the 3000 series cards and the speed difference between GDDR6 and 6X.


Again, a cache cannot do this, save for a few limited scenarios.

I think you're waiting for magic, so you'll be disappointed.


----------



## medi01 (Oct 21, 2020)

Ibotibo01 said:


> RX 6800 XT: $629, faster than the RTX 3070 and almost the same as the RTX 3080


There is no reason for AMD to undercut 3080, a missing in action card.


----------



## Jism (Oct 21, 2020)

I'm so excited about the times we live in: both the green and red camps releasing cards that do a sustained 60 FPS at 4K.


----------



## HD64G (Oct 21, 2020)

Logical predictions based on the GPU market today and the leaks:

Full-die 80 CU 6900 XT for $700
Lightly cut-down 72 CU 6800 XT for $600
Heavily cut-down 64 CU 6800 for $500

That would be great if the 6900 XT is above the performance of the RTX 3080 10 GB. If it is on par or slightly behind, deduct $50 from those prices. And remember that the 6800 will almost surely be faster on average than the 3070 8 GB.

As for the RX 6700 series, my guess is they will be at least 20-25% faster than the RX 5700 series and will cost less. So, 2080S-level performance for $250-350 (XT and non-XT within that range) would be great for us consumers, methinks.

My 5c.


----------



## Vya Domus (Oct 21, 2020)

kapone32 said:


> The article stated that it could mitigate the narrow bus versus the 3000 series cards and the speed difference between GDDR6 and 6X.



It's not "could", it's "does". That's why caches exist: to mitigate the limitations of slow main memory. Caches work very well when the access patterns are sequential, and guess what, the sorts of things you need to compute for graphics are always like that.


----------



## bug (Oct 21, 2020)

Vya Domus said:


> It's not "could", it's "does". That's why caches exist: to mitigate the limitations of slow main memory. Caches work very well when the access patterns are sequential, and guess what, the sorts of things you need to compute for graphics are always like that.


Except that it doesn't work like that. A cache is much, much smaller than main memory. If you happen to frequently use only, say, 16 MB of the main memory, those MB will be retained in cache and served from there, resulting in dramatically improved performance. But if the video card wants to access, say, 8 GB, the cache doesn't help much anymore.
Also, cache memory is so fast in part because it's very power hungry.
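The working-set argument can be illustrated with a toy LRU simulation; the sizes here are purely illustrative, not real GPU figures. Hit rates are excellent while the hot data fits in the cache and collapse once it doesn't:

```python
# Toy LRU cache simulation illustrating the working-set point above.
# Cache and working-set sizes are illustrative, not real GPU figures.
from collections import OrderedDict

def hit_rate(cache_lines: int, working_set: int, accesses: int) -> float:
    """Hit rate for a cyclic scan over `working_set` distinct lines."""
    cache: OrderedDict[int, bool] = OrderedDict()
    hits = 0
    for i in range(accesses):
        line = i % working_set
        if line in cache:
            hits += 1
            cache.move_to_end(line)          # mark as most recently used
        else:
            cache[line] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)    # evict least recently used
    return hits / accesses

print(hit_rate(128, 100, 10_000))    # working set fits: 0.99
print(hit_rate(128, 1_000, 10_000))  # working set ~8x too big: 0.0
```

The second case is the classic LRU pathology for cyclic scans: every line is evicted just before it is needed again, so the cache contributes nothing.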


----------



## RedelZaVedno (Oct 21, 2020)

ebivan said:


> Well, TFLOPS say something about compute performance, which doesn't really mean gaming performance.
> It's like comparing sequential write speeds with real-world SSD performance.



It scaled pretty well until the Ampere launch: the 11.15 TFLOPS 2080S was 14% slower than the 13.45 TFLOPS 2080 Ti, and the 9.75 TFLOPS 5700 XT was 15-20% slower than the 2080S... I have no doubt a 23.55 TFLOPS GPU will compete well against the 3080, if not the 3090, in standard rasterization performance, putting AI upscaling and RT aside, and that's all I really care about for now. The cheapest 4K/90 fps-capable and most power-efficient GPU gets my money.


----------



## AsRock (Oct 21, 2020)

Anymal said:


> How did they both decide, several years ago, that TBP would be 320 W? Smaller node? No problem, let's make it more power hungry.



Yeah, you're talking 500 W just to play a game, lmao.


----------



## okbuddy (Oct 21, 2020)

6900xt only 11.111% faster than 6800xt


----------



## bug (Oct 21, 2020)

okbuddy said:


> 6900xt only 11.111% faster than 6800xt


It really doesn't matter. I've always said $1000 cards can be safely disregarded. What piques my interest is the level of performance pushed down into the sub-$300 segment. I expect on an enthusiast forum not everybody feels the same, but the vast majority of buyers do.


----------



## Vya Domus (Oct 21, 2020)

AsRock said:


> Yeah, you're talking 500 W just to play a game, lmao.



500W to play a game with the best possible quality and performance. If you just want to play it you need a lot less than that.


----------



## Nater (Oct 21, 2020)

ebivan said:


> At the end of the day it's the price, not the model number, that's important. Huang promised 4K 60 fps for 700 bucks. That's what I expect, nothing more, nothing less. I really hope Su can give me what Huang didn't!



It's not just price right now. Availability is HUGE at the moment. Who cares how much the 3080 is listed at when you cannot get one at that MSRP/ESRP?

$700/slower/READY TO SHIP > $700/faster/OUT OF STOCK


----------



## Chrispy_ (Oct 21, 2020)

I hope the 64CU model is $400 and sub-250W.

Depending on how well the RDNA2 architecture scales and how much IPC improvement there is, that model should be about twice as fast as a 5700XT which means it is potentially a 3070 competitor, only it'll have a more appropriate amount of GDDR6 than the Nvidia card.

I'm sure the 320W variants will be nice and fast but I'm not interested in trying to deal with that much heat from a card.


----------



## Metroid (Oct 21, 2020)

These are all gaming GPUs, and the 256-bit memory bus is the important factor here. If people want compute, they should go Nvidia; if they want gaming, choose AMD. Interesting how the tables have turned: Nvidia used to be for gaming and AMD for compute, and this time it's the opposite.


----------



## FinneousPJ (Oct 21, 2020)

Metroid said:


> These are all gaming GPUs, and the 256-bit memory bus is the important factor here. If people want compute, they should go Nvidia; if they want gaming, choose AMD. Interesting how the tables have turned: Nvidia used to be for gaming and AMD for compute, and this time it's the opposite.


True, and I imagine that fewer cores at a higher frequency will also work in AMD's advantage for gaming.


----------



## bug (Oct 21, 2020)

Vya Domus said:


> 500W to play a game with the best possible quality and performance. If you just want to play it you need a lot less than that.


When I got into PCs, the choice was between a 250 and a 350W PSU. Granted, this was before _2D_ acceleration was upon us...


----------



## phanbuey (Oct 21, 2020)

bug said:


> When I got into PCs, the choice was between a 250 and a 350W PSU. Granted, this was before _2D_ acceleration was upon us...



Also the PC was a completely closed box with tiny vents somewhere in the front lol.


----------



## cueman (Oct 21, 2020)

Keep your RTX 3080... no worries.

The RX 6900 XT loses by 15-20%, and the TDP is near or over 350 W.


----------



## Punkenjoy (Oct 21, 2020)

A big chunk of the performance improvement came from increased power usage, although the main performance improvement factor still comes from process improvements.


----------



## phanbuey (Oct 21, 2020)

Looks like I'm going to be holding on to the 2080 Ti for this round. I'm getting a 16K GPU score in Time Spy and 7.5K in Time Spy Extreme at 290 W, I've put a silent cooler on it, and more DLSS support will keep it alive with a +30% boost in Cyberpunk.

The 3080 doesn't even seem worth it unless I can pick up a used one at some point with a 5800X/5600X.


----------



## AsRock (Oct 21, 2020)

Vya Domus said:


> 500W to play a game with the best possible quality and performance. If you just want to play it you need a lot less than that.



Not 100% true; I know RDR2 can look better than it does on my 390X, and that's up to 420 W just for 40-60 fps.


----------



## Xex360 (Oct 21, 2020)

Nvidia messed up with Ampere: high prices (an MSRP of $1000 here for no logical reason), only 10 GB of VRAM and, worse, the overpriced cards don't even exist in real life. I really hope AMD will have a real launch with reasonable prices.


----------



## tomc100 (Oct 21, 2020)

I just hope the power draw is really low when I'm not gaming, since I don't game that often and use the computer mostly for surfing the net, YouTube, and work.


----------



## purecain (Oct 21, 2020)

Please AMD, release your cards so that I may cancel my 3090 order.


----------



## Dante Uchiha (Oct 21, 2020)

bug said:


> Except that it doesn't work like that. Cache is much, much smaller than main memory. If you happen to frequently use only, say 16MB of the main memory, those MB will be retained in cache and used from there, resulting in dramatically improved performance. But if the video card wants to access say 8GB, the cache doesn't help much anymore.
> *Also, cache memory is so fast in part because it's very power hungry.*



*Look at the Infinity Cache patent details:*

"We propose shared L1 caches in GPUs. To the best of our knowledge, this is the first paper that performs a thorough characterization of shared L1 caches in GPUs and shows that they can significantly improve the collective L1 hit rates and reduce the bandwidth pressure to the lower levels of the memory hierarchy."

• "We develop GPU-specific optimizations to reduce inter-core communication overheads. These optimizations are vital for maximizing the benefits of the shared L1 cache organization."

• "We develop a GPU-specific lightweight dynamic scheme that classifies application phases and reconfigures the L1 cache organization (shared or private) based on the phase behavior."

• "We extensively evaluate our proposal across 28 GPGPU applications. *Our dynamic scheme boosts performance by 22% (up to 52%) and energy efficiency by 49% for the applications that exhibit high data replication and cache sensitivity without degrading the performance of the other applications*. This is achieved at a modest area overhead of 0.09 mm2 /core."





ADAPTIVE CACHE RECONFIGURATION VIA CLUSTERING - ADVANCED MICRO DEVICES, INC. (www.freepatentsonline.com): "A method of dynamic cache configuration includes determining, for a first clustering configuration, whether a current cache miss rate exceeds a miss rate threshold."

https://adwaitjog.github.io/docs/pdf/sharedl1-pact20.pdf


----------



## MxPhenom 216 (Oct 21, 2020)

No GDDR6X?  



purecain said:


> Please AMD, release your cards so that I may cancel my 3090 order.



I just want them to release cards that are competitive with Nvidia's high end again. That's all I ask. Then the Ryzen 3 build I'm doing next year will be more fun.


----------



## Crustybeaver (Oct 21, 2020)

Meanwhile, I expect Nvidia are sat there with the Ti/Super variants ready for AMD's next move. There's no way they're letting AMD have the fastest cards come Christmas.


----------



## Chrispy_ (Oct 21, 2020)

I hope the 6700-series (Navi 22) cards are still RDNA2 with DXR capabilities. It would be confusing as heck to have a mix of DXR-capable and DXR-incapable cards all under the same naming scheme.

At least Nvidia use RTX and GTX to distinguish their two lineups.


----------



## Icon Charlie (Oct 21, 2020)

With this information, they left out... WATTAGE. Not good. Not good at all.
I'm not going to heat up my damn house with a computer. 

It really looks like I'm going to stick with my current video card and maybe upgrade my CPU later.


----------



## efikkan (Oct 21, 2020)

ebivan said:


> That would be great but I don't think that's realistic. Since there never actually were any 3080s for $700 (or has anyone actually gotten an FE card?), AMD will probably not sell a 3080 equivalent for that price.


That's untrue. All AiBs have models at the same MSRP as the FE card.



Vya Domus said:


> It's not "could", it's "does". That why caches exist, to mitigate the limitations caused by slow main memory. Caches work very well when the access patterns are sequential and guess what, the sort of things you need to compute for graphics, are always like that.


Caches only help for sequential access patterns if the average bandwidth is high enough.


----------



## Chrispy_ (Oct 21, 2020)

efikkan said:


> That's untrue. All AiBs have models at the same MSRP as the FE card.


That's usually true, but unlike the FE, which is a high-quality design with binned silicon and an expensive cooler, the MSRP cards use the standard, cheap reference design and likely the cheapest-to-produce cooler they can get away with. So, compared to the FE, they're terrible value for money.


bug said:


> It really doesn't matter. I've always said $1000 cards can be safely disregarded. What piques my interest is the level of performance pushed down into the sub-$300 segment. I expect on an enthusiast forum not everybody feels the same, but the vast majority of buyers do.


The overwhelmingly popular price point is always the xx60 series, which for about a decade was stuck at $199 because that was such an appealing price point to target. Thanks to inflation, a $200 card is no longer the mid-range sweet spot; it works out to about $275 in today's money.

The 2060 screwed the pooch as a pricing anomaly, and the only other time Nvidia deviated from this pricing was the GTX 260 back in 2008, which was also considered a complete rip-off. I very much doubt Nvidia is going to target $275 for anything decent unless AMD has something that's cleaning house at that price point.


----------



## kapone32 (Oct 21, 2020)

bug said:


> Except that it doesn't work like that. Cache is much, much smaller than main memory. If you happen to frequently use only, say 16MB of the main memory, those MB will be retained in cache and used from there, resulting in dramatically improved performance. But if the video card wants to access say 8GB, the cache doesn't help much anymore.
> Also, cache memory is so fast in part because it's very power hungry.


Next week this time we will all be in the know of what is real and not.


----------



## Vya Domus (Oct 21, 2020)

AsRock said:


> Not 100% true, as i know RDR2 can look better than it does with my 390X and that's up to a 420w just for 40-60 fps.



Sure but that's a really old GPU. Gotta look at things from the current generation.


----------



## ebivan (Oct 21, 2020)

efikkan said:


> That's untrue. All AiBs have models at the same MSRP as the FE card


Sure, and those are just piling up on the shelves where you live? Come on, nobody (or almost nobody) got a 3080 for $700. Even the cheapest Zotac card is at $850 and still not available...


----------



## Vya Domus (Oct 21, 2020)

Crustybeaver said:


> Meanwhile I expect Nvidia are sat there with the Ti / Super variants ready for AMDs next move. No way they're letting AMD have the fastest cards come Christmas.



It will be funny to see a 3080 "ti" that's 5-8% faster.

Reality check: Nvidia probably didn't plan to use GA102 for the 3080, but rather a fully enabled GA104 like they always did, which would have left room for a 3080 Ti. For some _unknown_ reason (let's leave it at that for the time being) they decided GA104 wasn't enough, so the whole stack got pushed up. There is no room for "Ti" variants, or at least not for any that would matter. There is potential room for a 3070 Ti, but that's because they disabled more SMs than usual; the 3070 was meant to be the 3080, and the 3090 the Titan.


----------



## Zach_01 (Oct 21, 2020)

Vya Domus said:


> It will be funny to see a 3080 "ti" that's 5% faster.


Most probably they have nothing except the 20GB variants. When they move to TSMC’s 7nm I guess they will.


----------



## efikkan (Oct 21, 2020)

Chrispy_ said:


> That's usually true but unlike the FE which is a high-quality design with binned silicon and an expensive cooler, the MSRP uses the standard, cheap reference design and likely the cheapest-to-produce cooler they can get away with. So, compared to the FE it's terrible value for money.


Nonsense, there are excellent models like the Asus RTX 3080 TUF, unless you are going to overclock, which most buyers don't do.



ebivan said:


> Sure, and those are just piling up on the shelves where you live? Come on, nobody (or almost nobody) got an 3080 for 700. Even the cheapest Zotac card is at 850 and still not available...


Unprecedented demand doesn't mean almost nobody got/gets their products.


----------



## ebivan (Oct 21, 2020)

Yeah sure, it's not a supply issue, Mr. Huang


----------



## efikkan (Oct 21, 2020)

ebivan said:


> Yeah sure, its not a supply issue Mr Huang


There is no reason to be snarky 
Now you're down to semantics. You can certainly call it a _supply issue_ when demand is significantly higher than the supply, as long as you don't mislead people into thinking there are production or shipment issues (which are different kinds of supply issues), as several earlier posts have alluded to.


----------



## MxPhenom 216 (Oct 21, 2020)

Xex360 said:


> nVidia messed up with Ampere, high prices (MSRP of 1000$ here for no logical reason), only 10gb of vram, and worse the overpriced cards don't even exist in the real life. I really hope AMD will have a real launch with reasonable prices.



I mean US prices were actually pretty surprising to me. Not nearly as much as I thought they would be for the performance.

Where they screwed the pooch, I think, was going with Samsung 8nm. It's just an extension of their 10nm, which wasn't very good anyway.
Samsung yields are shit, and GDDR6X is in short supply too.

With reports that Nvidia will be switching to TSMC 7nm for possible Super variants of the cards I think things will turn around for Ampere.

10 GB of VRAM is fine. GDDR6X too, and it brings a lot more bandwidth.

Having said all that, my new build early next year with a 5800x, will probably also get a 6900XT.


----------



## medi01 (Oct 21, 2020)

efikkan said:


> ...when demand is significantly higher than the supply...


Which can happen for 2 reasons:
1) Demand is too high
2) Supply is terribly bad

One needs a hell of a green reality distortion field not to notice #2 (not even pre-orders are accepted)


----------



## r9 (Oct 21, 2020)

MxPhenom 216 said:


> I mean US prices were actually pretty surprising to me. Not nearly as much as I thought they would be for the performance.
> 
> Where they screwed the pooch I think was going with Samsung 8nm. Its just an extension of their 10nm which wasn't very good anyways.
> Samsung yields are shit and GDDR6X is in short supply too.
> ...



Nvidia released a card 30% faster than a $1200 card for $699 and they expected to sell only 5 cards. Lol


----------



## MxPhenom 216 (Oct 21, 2020)

medi01 said:


> Which can happen for 2 reasons:
> 1) Demand is too high
> 2) Supply is terribly bad
> 
> One needs hell of a green reality distortion field not to notice #2 (not even pre-orders are accepted)





r9 said:


> Nvidia released a card 30% faster than a $1200 card for $699 and they expected to sell only 5 cards. Lol



Mhmm. 

From what I understand, GDDR6X is in very short supply. Only Micron is making it right now. Also Samsung 8nm (10nm+) yields are pretty bad.

Or Nvidia is purposefully reducing the supply. If that's the case, it'll just hurt Nvidia in the end, since most people will just buy these new AMD cards if the performance is there.


----------



## medi01 (Oct 21, 2020)

MxPhenom 216 said:


> From what I understand, GDDR6X is in very short supply.



Ah, sure thing, dude.
It is GDDR6X, and not that oversized GA102 chip, that is the reason.

Surely, $699 for a GPU 20-30% faster than a $1200 card is nothing unusual either, totally expected, and there's no reason to suspect that something doesn't quite add up here.


----------



## MxPhenom 216 (Oct 21, 2020)

medi01 said:


> Ah, sure thing, dude.
> It is GDDR6X, and not that oversized GA102 chip, that is the reason.
> 
> Surely, $699 for a GPU 20-30% faster than a $1200 card is nothing unusual either, totally expected, and there's no reason to suspect that something doesn't quite add up here.



Had Nvidia been able to use TSMC I think the supply issues would be considerably less.

Demand for new cards this year seems way way higher than previously and I'm not entirely sure why.


----------



## ebivan (Oct 21, 2020)

efikkan said:


> There is no reason to be snarky
> Now you're down to semantics. You can certainly call it a _supply issue_ when demand is significantly higher than the supply, as long as you don't mislead people into thinking there are production or shipment issues (which are different kinds supply issues), which several earlier posts have alluded to.


No reason to defend Nvidia either. They screwed up. They made promises they could not deliver on.
They didn't expect people to buy that extremely well-priced, extremely fast card? Who calculates expected demand over there? First-year business school students? No, they made promises they knew they wouldn't be able to keep.
Of course there are production issues; how could you deny that?
It's not important whether it's Samsung's or Micron's fault that there are almost no cards available. Nvidia wasn't ready to launch and they did anyway. I really hope AMD sells a shitload of cards while Nvidia still can't deliver for months!


----------



## Umbral (Oct 21, 2020)

Possible theoretical AMD strategy:

6900 XT = 5% slower than 3090, $650, 16 GB RAM

6800 XT = 5% slower than 3080 (5% slower than 6900 XT), $450, 16 GB RAM
*6800 = 10-15% faster than 3070, 5% slower than 6800 XT, $400, 16 GB RAM*

6700 = 10% faster than 3060 Ti, 10% slower than 3070, $325, 12 GB RAM

AMD wins on price/performance at every tier. Not to mention ~99% of games are optimized for consoles on AMD hardware.

*Best card for price/performance = the vanilla 6800*


----------



## efikkan (Oct 21, 2020)

ebivan said:


> No reason to defend nvidia either. They screwed up. They made promises they could not deliver.


What _precisely_ did they screw up? Which _exact promises_ did they fail to deliver on?



ebivan said:


> They didnt expect people to buy that extremely well priced extremely fast card?


They had tens of thousands available on launch day, comparable to the launches of Pascal and Turing, but that's not enough when demand is probably >10x the supply. It's not much of a "screwup" considering more RTX 3080 cards probably shipped on launch day than AMD's "flagship" Radeon VII did in its entire production run.



ebivan said:


> Who calculates expected demands over there? First year business school students? No, they made promises which they knew they wouldnt be able to keep.


You don't know much about how microchips are made, do you?
It's not a matter of wanting to produce more chips; the limit is wafer capacity at the foundries. You can be sure Nvidia ordered as much as they could. If you want to blame someone, blame Samsung, who didn't make the factory larger when they planned it more than five years ago.



ebivan said:


> It's not important whether it's Samsung's or Micron's fault that there are almost no cards available. Nvidia wasn't ready to launch and they did anyway.


Logic fail.
Waiting wouldn't have increased the total number of cards shipped to date.



ebivan said:


> I really hope AMD sells a shitload of cards while Nvidia still cannot deliver for months!


I certainly do hope AMD makes good cards, and for once make decent volumes. AMD cards have been in much shorter supply than Nvidia cards in the past five years, and very little of that can be blamed on miners.


----------



## AsRock (Oct 21, 2020)

Vya Domus said:


> It will be funny to see a 3080 "ti" that's 5-8% faster.
> 
> Reality check, Nvidia probably didn't plan to use GA102 for the 3080 but rather a fully enabled GA104 like they always did which could have made room for a 3080ti. For some _unknown _reason (let's leave it at that for the time being) they decided GA104 wasn't enough so the whole stack got pushed up. There is no room for "ti" variants, or at least not for any that would matter. There is potential room for a 3070ti but that's because they disabled more SMs than usual, the 3070 was meant to be the 3080 and 3090 the Titan.



LOL, the 3090 is not a Titan. Nvidia points at it like it is one, but it's not.


----------



## Zach_01 (Oct 21, 2020)

medi01 said:


> Ah, sure thing, dude.
> It is GDDR6x and not that oversized GA102 chip that is the reason.


I bet it's neither of those...



medi01 said:


> Surely, $699 for a GPU 20-30% faster than a $1200 card is nothing unusual either, totally expected, and there's no reason to suspect that something doesn't quite add up here.


Now you are getting somewhere...

1. Ampere was not ready for September; Nvidia just wanted to create hype before RDNA2.
2. Nvidia will supply the market when they are ready with the 20 GB models at $1000+, and that was the goal from the start. After the prices of RDNA2 are known, of course.


*Good supply* but great demand...!!! I'm laughing...
Nvidia did not fail...
They just didn't want to sell a $700 MSRP flagship GPU...


----------



## ebivan (Oct 21, 2020)

efikkan said:


> What _precisely_ did they screw up? Which _exact promises_ did they fail to deliver on?
> 
> 
> They had tens of thousands available on launch day, comparable to the launch of Pascal and Turing, but that's not enough when the demand is probably >10x the supply. It's not much of a "screwup" considering there probably shipped more RTX 3080 cards on launch day than AMD's "flagship" Radeon VII did in its entire production run.
> ...



Yeah sure man, you're right and I'm wrong!
I just clicked on your name and read the last couple of posts you wrote. Our little chat ends right here, as I don't really see you as someone who discusses. You are someone who is just always right, and I can't compete with that. And I share none of the opinions you seem to hold.


----------



## R0H1T (Oct 21, 2020)

efikkan said:


> I certainly do hope AMD makes good cards, and for once make decent volumes. AMD cards have been in much shorter supply than Nvidia cards in the past five years, and very little of that can be blamed on miners.


Well, technically speaking, AMD is selling at least 2-5x as many GPUs as Nvidia, and Intel probably outsells both. So it's not an apples-to-apples comparison; even if AMD were superior to Nvidia, they'd have (nearly) double the trouble keeping up with supplies of APUs, CPUs, consoles, embedded chips, and finally dGPUs. Notice the last one probably has the lowest volume.


----------



## Yazzia (Oct 21, 2020)

Umbral said:


> Possible theoretical AMD strategy:
> 
> 6900 XT = 5% slower than 3090, $650 16GB RAM
> 
> ...



Why would AMD sell a card basically the same as the 3090 for less than half the price? The delusion is real


----------



## bug (Oct 21, 2020)

Dante Uchiha said:


> *Look Infinity Cache Patent details:*
> 
> "We propose shared L1 caches in GPUs. To the best of our knowledge, this is the first paper that performs a thorough characterization of shared L1 caches in GPUs and shows that they can significantly improve the collective L1 hit rates and reduce the bandwidth pressure to the lower levels of the memory hierarchy."
> 
> ...


Well, if it's L1 cache, it's going to be tiny. And please note they narrow down the specific scenario where that cache helps.
Don't get me wrong, caching works, it improves performance. But it's different from bandwidth. I mean, look at Intel's CPUs. Those have had pretty smart caches (three levels of them) that do a pretty good job. But they still only hide latency, they're not a substitute for bandwidth.
Of course, GPU usage patterns are "a little" different, but I hope you get the idea.
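One way to make the cache-vs-bandwidth question concrete: only misses consume DRAM bandwidth, so a hit rate h multiplies effective bandwidth by 1/(1 - h). A rough model, ignoring write traffic and latency effects; the numbers are illustrative, not leaked RDNA2 figures:

```python
# Back-of-envelope: a cache with hit rate h filters memory traffic so
# only misses reach DRAM, amplifying effective bandwidth by 1/(1 - h).

def effective_bandwidth(dram_gbps: float, hit_rate: float) -> float:
    """DRAM bandwidth divided by the miss fraction that must use it."""
    return dram_gbps / (1.0 - hit_rate)

dram = 512.0  # GB/s, e.g. a 256-bit bus with 16 Gbps GDDR6
for h in (0.0, 0.5, 0.75):
    print(f"hit rate {h:.0%}: {effective_bandwidth(dram, h):.0f} GB/s effective")
```

Whether a modest on-die cache can sustain the hit rates needed to bridge hundreds of GB/s is exactly the open question in this thread.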


----------



## Sp408 (Oct 21, 2020)

Can someone explain the difference between the 8nm and 7nm? Does that actually make a difference in performance or anything important?


----------



## bug (Oct 21, 2020)

Sp408 said:


> Can someone explain the difference between the 8nm and 7nm? Does that actually make a difference in performance or anything important?


It's an important distinction, but it's too complicated to explain here. Read up on silicon manufacturing nodes if you have a week or so to spare


----------



## R0H1T (Oct 21, 2020)

Sp408 said:


> Can someone explain the difference between the 8nm and 7nm? Does that actually make a difference in performance or anything important?


Hard to say, unless Nvidia releases the exact same cards on 7nm we'll never really know how good TSMC's node is, it *could be* better or worse for Ampere. On most objective parameters TSMC is superior but again we know this isn't how it works in real life. Just look at Intel's 10nm debacle for instance.


----------



## Xmpere (Oct 21, 2020)

If the 6700 is in 250-350 range. Instant cop. Gonna be my first gpu if it is.


----------



## Chrispy_ (Oct 21, 2020)

efikkan said:


> Nonsense, there are excellent models like the Asus RTX 3080 TUF, unless you are going to overclock, which most buyers don't do.


How is that MSRP? It's £100 more than the FE on Nvidia's official 3080 page, and £30 more than other entry-level cards from other AIB vendors.



			https://www.nvidia.com/en-gb/shop/geforce/gpu/?page=1&limit=9&locale=en-gb&category=GPU&gpu=RTX%203080&sorting=lp


----------



## TheoneandonlyMrK (Oct 21, 2020)

bug said:


> Well, if it's L1 cache, it's going to be tiny. And please note they narrow down the specific scenario where that cache helps.
> Don't get me wrong, caching works, it improves performance. But it's different from bandwidth. I mean, look at Intel's CPUs. Those have had pretty smart caches (three levels of them) that do a pretty good job. But they still only hide latency, they're not a substitute for bandwidth.
> Of course, GPU usage patterns are "a little" different, but I hope you get the idea.


GPUs don't have a 3-tier cache; think about what you're saying!
They have one unified tier now, and unifying the cache just helped Zen 3 to a 19% improvement.
I get you, and we'll see; it's not long now, but there's no need for the hard stances either.


----------



## bug (Oct 21, 2020)

theoneandonlymrk said:


> GPU don't have 3 tier cache and think what your now saying!?.
> They have one unified tier now and unifying that just helped zen 3 to a 19% improvement.


Not sure what you're trying to say, but Zen3 retains the 3 levels of cache, only it doesn't segment L3 cache per CCX (because it doesn't have CCXs anymore). But that's not responsible for the whole 19% perf uplift.


theoneandonlymrk said:


> I get you and we'll see, it's not long but no need for the hard stances either.


No hard stance here, just keeping expectations in check.


----------



## Zach_01 (Oct 21, 2020)

theoneandonlymrk said:


> GPU don't have 3 tier cache and think what your now saying!?.
> They have one unified tier now and unifying that just helped zen 3 to a 19% improvement.
> I get you and we'll see, it's not long but no need for the hard stances either.


To be fair, the +19% IPC gain was not only from unifying the cache pool; Zen 3 has a lot more architectural changes/improvements.
Still, we can't compare the two different chips (CPU/GPU). This is a first for a GPU, and we have no baseline for what AMD cooked up with RDNA2.


----------



## Vya Domus (Oct 21, 2020)

AsRock said:


> LOL, the 3090 is not a TITAN,  nVidia point at it like if it was but it's not.



That's the point. Don't you find it strange how awkwardly they said it's "Titan performance"? Yeah, because that was meant to be a Titan.


----------



## MxPhenom 216 (Oct 22, 2020)

Sp408 said:


> Can someone explain the difference between the 8nm and 7nm? Does that actually make a difference in performance or anything important?



Well, TSMC's 7nm is way better than Samsung's 8nm. Samsung's 8nm is quite old; it's just an extension of their 10nm, which isn't very exciting. It's got a bit better density and a higher-drive cell option.


----------



## TheoneandonlyMrK (Oct 22, 2020)

bug said:


> Not sure what you're trying to say, but Zen3 retains the 3 levels of cache, only it doesn't segment L3 cache per CCX (because it doesn't have CCXs anymore). But that's not responsible for the whole 19% perf uplift.
> 
> No hard stance here, just keeping expectations in check.


What I'm saying is: look how you're minimising what they did with Zen 3's cache and how it helped IPC, and consider a GPU cache that's closer to its cores. There's a chance it could be good; they showed as much with the Ryzen architecture, which was always about cores, cache, and interconnect.


----------



## Chrispy_ (Oct 22, 2020)

Vya Domus said:


> That's the point. Do you not find it strange the way they awkwardly said it's "TItan performance". Yeah, because that was meant to be a Titan.


People who bought Titans were buying them for the 24GB. We have a whole bunch of 2080Ti cards for rendering performance, but we had to buy a few Titans to cover those occasional workloads that don't run on 11GB cards. Titans are a ripoff, but they're still half the price of Quadros.

Big Navi is going to be interesting because the vanilla 6800 is a 16GB card for a _fraction_ of the cost of a 2080Ti,  a 3090, a Titan, or a Quadro RTX. 
If you work in a cinematic / simulation industry then the raw performance isn't actually the key metric, it's whether the card in question can actually load the model(s) into VRAM or not. There's a metric I go by for certain types of purchasing that's not commonly useful, and that's $/GB _for a GPU_.
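That $/GB metric is simple to compute. A quick illustration with approximate US launch prices; the RX 6800 figure is a rumor at this point and purely hypothetical:

```python
# Chrispy_'s $/GB metric: for VRAM-bound rendering work, cost per
# gigabyte of VRAM matters more than raw speed. Prices are rough
# launch MSRPs plus one rumored figure, purely illustrative.

cards = {
    "RTX 2080 Ti (11 GB)": (1199, 11),
    "RTX 3090 (24 GB)":    (1499, 24),
    "RX 6800 (16 GB)":     (579, 16),   # rumored price, hypothetical
}

# Sort cheapest-per-GB first and print the metric for each card.
for name, (price, vram_gb) in sorted(cards.items(),
                                     key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${price / vram_gb:.0f}/GB")
```

By this metric a 16 GB card at a mid-range price undercuts everything above it, which is why the vanilla 6800 looks interesting for model-loading workloads.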


----------



## kiddagoat (Oct 22, 2020)

I haven't posted regularly around here in quite some time.   Why does it seem now, much more than before, that any AMD, Nvidia, or Intel article quickly deviates from the topic of the article and dives into back and forth fanboy loyalism??? 

This article is about AMD's upcoming product stack and not even a third of the way through here comes all the 30x0 series crap that's been regurgitated since the launch of Ampere.   Maybe I am just old and out of touch.   Why can't these discussions stay on topic?


----------



## Chrispy_ (Oct 22, 2020)

kiddagoat said:


> Why can't these discussions stay on topic?


Because Astroturf and Fanboy; Welcome to the 21st century.


----------



## AsRock (Oct 22, 2020)

Vya Domus said:


> That's the point. Do you not find it strange the way they awkwardly said it's "TItan performance". Yeah, because that was meant to be a Titan.



I don't, and I really think a version with double the RAM will come when they can do it. For one, the 3090 is too cheap to be a Titan, as Titans aren't really for gamers.


----------



## kostaspyrkas (Oct 22, 2020)

Solid State Soul ( SSS ) said:


> Performance isn't always the full picture, stable drivers also matters, look how the RX 5000 series launched, terrible drivers for months.


I've had a Sapphire Pulse 5700 XT almost from the beginning... no problems at all, and I play a wide range of games.


----------



## Xex360 (Oct 22, 2020)

r9 said:


> Nvidia released a card 30% faster than a $1200 card for $1099 and they expected to sell only 5 cards. Lol


Fixed that for you. How come Turing was priced the same everywhere (except where taxes are the issue), but Ampere costs up to 40% more outside the US?


----------



## Ibotibo01 (Oct 22, 2020)

RDNA2 will age like wine. The GTX 600 series (even the 700 series) now performs worse than the HD 7000 series; the HD 7970 gives the same performance as the GTX 780/Ti in 2019/2020 games.


----------



## bug (Oct 22, 2020)

theoneandonlymrk said:


> What I'm saying is look how your minimising what they did with zen3's cache and how it helped IPC and consider a GPU cache closer to it's cores!?, They're is a chance it could be good they showed as much with the Ryzen architecture, was always about cores,cache and connection.


And what I'm saying is you don't seem to understand what cache does and how it works


----------



## Flanker (Oct 22, 2020)

kiddagoat said:


> I haven't posted regularly around here in quite some time.   Why does it seem now, much more than before, that any AMD, Nvidia, or Intel article quickly deviates from the topic of the article and dives into back and forth fanboy loyalism???
> 
> This article is about AMD's upcoming product stack and not even a third of the way through here comes all the 30x0 series crap that's been regurgitated since the launch of Ampere.   Maybe I am just old and out of touch.   Why can't these discussions stay on topic?


It's been like that for almost every forum I visited in the last 15 years


----------



## puma99dk| (Oct 22, 2020)

With the current shortages of Nvidia's RTX 3080, and especially of cards like the Asus TUF model, I am still thinking about going AMD this time around.

I am only sad to see that the RX 6900 XT (Navi 21 XTX) is an AMD-exclusive part, like Nvidia's Titan cards are. I wish AIB partners would get clearance to make custom versions of the Navi 21 XTX.

But I am still hoping that, even if it is a reference design, I can get it with Sapphire branding, so I can keep using Sapphire's TriXX software for the scaling option I use on my 4K screen from time to time.


----------



## Vya Domus (Oct 22, 2020)

puma99dk| said:


> I am only sad to see that the RX 6900 XT (Navi 21 XTX) is a AMD exclusive



I am not sure what you mean by exclusive, it's said that AMD will only allow reference models for the 6900XT not that they will only be sold through AMD.


----------



## puma99dk| (Oct 22, 2020)

Vya Domus said:


> I am not sure what you mean by exclusive, it's said that AMD will only allow reference models for the 6900XT not that they will only be sold through AMD.



By exclusive I mean no custom PCB or cooling solutions. But what about the BIOS? That will probably show AMD as the vendor, so I don't believe the Sapphire TriXX software will work.


----------



## TheoneandonlyMrK (Oct 22, 2020)

bug said:


> And what I'm saying is you don't seem to understand what cache does and how it works


Yeah yeah, some of us are just less negative.


----------



## ShurikN (Oct 22, 2020)

puma99dk| said:


> no custom PCB


AMD's reference PCB and board design are among the best on the market. Everything is over-engineered, and it rivals much more expensive models, Sapphire's for example.


----------



## renz496 (Oct 22, 2020)

The 28th is the official unveil. When can we expect actual reviews?


----------



## FinneousPJ (Oct 22, 2020)

renz496 said:


> 28th is the official unveil. when can we expect actual review?


Based on Zen 3, on launch day.


----------



## bug (Oct 22, 2020)

renz496 said:


> 28th is the official unveil. when can we expect actual review?











TechPowerUp's "Upcoming Hardware Launches" tracker (www.techpowerup.com) keeps a running summary of announced and rumored launch dates.


----------



## Punkenjoy (Oct 22, 2020)

It's wrong to compare CPU cache to GPU cache. Yes, both are caches, but the workloads are so different that you can't compare their effect on performance.

Also, here it's not an increase or reduction in cache size; it's the ability to look up the L1 caches of other GPU cores to see if the needed data is there.

GPUs do many iterations on an image, on textures, etc., and they work on the same data most of the time. This is why there is real potential for gains and bandwidth savings (yes, caches save bandwidth: on a cache hit, you don't need to read from memory).

CPUs, on the other hand, do all kinds of work and handle many different processes. They process all kinds of data, and getting cache hits is harder because of that.


----------



## bug (Oct 22, 2020)

Punkenjoy said:


> It's wrong to compare CPU cache to GPU cache. Yes both are catch, but the workload is so different that you can't make comparaison on performance.
> 
> Also here, it's not an increase or reduction in cache, it's the ability to lookup others L1 caches of others gpu cores to see if the information they need is there.
> 
> ...


I see. So basically everyone was stupid till now adding bandwidth, because with a little bit of cache added they could have solved the problem. It all makes sense now.

Edit: Do we even know this is a data cache and not an instruction cache?


----------



## Vya Domus (Oct 22, 2020)

Punkenjoy said:


> It's wrong to compare CPU cache to GPU cache. Yes both are catch, but the workload is so different that you can't make comparaison on performance.
> 
> Also here, it's not an increase or reduction in cache, it's the ability to lookup others L1 caches of others gpu cores to see if the information they need is there.
> 
> ...



In summary graphics workloads provide both spatial and temporal locality for data and instructions almost all of the time. Anyway it's not just L1 cache lookup, the chips themselves will probably have a lot of L2 cache as well.


----------



## bug (Oct 22, 2020)

Vya Domus said:


> In summary graphics workloads provide both spatial and temporal locality for data and instructions almost all of the time. Anyway it's not just L1 cache lookup, the chips themselves will probably have a lot of L2 cache as well.


The post quoting the patent application says it's 0.09 mm²/core. I wouldn't bet on "a lot of L2 cache" fitting in there.

I don't doubt AMD has done something smart here, but you guys are simply expecting too much from an L1 cache. The cache may be more effective than it is on a CPU, but bridging hundreds of GB/s of bandwidth difference? I'll have to see it to believe it.


----------



## Punkenjoy (Oct 22, 2020)

bug said:


> The post quoting from the patent applications says it's 0.09 sq mm / core. I wouldn't bet on "a lot of L2 cache" fitting in there.
> 
> I don't doubt AMD has done something smart here, but you guys are simply expecting too much from a L1 cache. The cache my be more effective than it is on a CPU, but bridging hundreds of GB/s of bandwidth difference? I have to see it to believe it.



Indeed, but it's only a portion of the final solution. Both GPU vendors have all kinds of memory tricks, like compression, to reduce memory bandwidth. People focus on the cache because it sounds cool and it's what they know. Also, we don't necessarily know whether Ampere is really memory-bandwidth bound.

Navi 10 currently has 512 KB of L1 (128 KB per 10-CU block). If AMD keeps the same size per 10-CU block, the available L1 cache goes from 512 KB to 1 MB for the full Navi 21, which is quite a big increase. If they also match the L2 size per CU, they could have around 8 MB of L2 cache.

How much that will help, I don't know. No matter what people say, only the benchmarks will reveal the final answer. We may be discussing stuff that has a marginal impact on real performance.
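The scaling arithmetic above, spelled out; the per-block figure is the post's assumption, not a confirmed spec:

```python
# Scaling estimate from the post: Navi 10 carries 128 KB of L1 per
# 10-CU block. Keeping that ratio, a full 80-CU Navi 21 would have
# 8 such blocks. These are the post's assumptions, not specs.

L1_PER_BLOCK_KB = 128
CUS_PER_BLOCK = 10

def total_l1_kb(cu_count: int) -> int:
    """Total L1 in KB for a chip with the given CU count."""
    return (cu_count // CUS_PER_BLOCK) * L1_PER_BLOCK_KB

print(total_l1_kb(40))   # Navi 10 (40 CUs): 512 KB
print(total_l1_kb(80))   # full Navi 21 (80 CUs): 1024 KB = 1 MB
```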


----------



## medi01 (Oct 22, 2020)

MxPhenom 216 said:


> Had Nvidia been able to use TSMC I think the supply issues would be considerably less.


Yeah.
Or had NV priced a card that is 20-30% faster than a $1200 card at around $999.
I wonder why they didn't do it.
Oh wait, this can't be the reason, can it:


		https://www.reddit.com/r/Amd/comments/jg21ai



Sp408 said:


> Can someone explain the difference between the 8nm and 7nm? Does that actually make a difference in performance or anything important?


Neither number honestly describes what the process really is (e.g., TSMC's 7 nm means a 22x22 nm transistor, while Intel's 14 nm means 24x24 nm), but Samsung's 8 nm is even further from the truth, so there is that.


----------



## dalekdukesboy (Oct 22, 2020)

I really like the "feel" of this launch. They didn't do anything too crazy or exotic with the cards, and a 256-bit wide bus is obviously easier to manufacture and get good yields on than the Nvidia approach. The really high clock speeds make me wonder if they had to do it to compete with Nvidia, or if they just have an exceptional product that they squeezed to the max, one that maybe even outperforms some of the equivalent Nvidia stack. However, as someone mentioned, the launch CAN'T be worse than Nvidia's 3080 launch... if they get 10 cards out in week 1, they probably did better.


----------



## yeeeeman (Oct 22, 2020)

Nvidia will have a bit of trouble with these new cards.


----------



## Punkenjoy (Oct 22, 2020)

If you look at OC Navi 10 cards, they already do a 2 GHz game clock. So the "official" numbers for the Navi 21 cards aren't that far away from the current Navi 10 clocks.

We could say AMD wants to push it to compete with Nvidia, but on the other hand, why would they limit the frequency if people can OC these cards with custom BIOSes anyway? They may as well just sell them clocked as fast as they can go.


----------



## Chrispy_ (Oct 22, 2020)

Punkenjoy said:


> If you look at OC Navi 10 cards, they already do a 2 GHz game clock. So the "official" numbers for the Navi 21 cards aren't that far away from the current Navi 10 clocks.



Navi 21 is on N7P, which TSMC says is 7% faster at the same power level.

So Navi 10's 2 GHz game clock becomes roughly 2.14 GHz without even changing the TDP or accounting for the architectural improvements that always get made to raise clock speeds between generations.
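The projection is a single multiplication; as a sketch, taking TSMC's claimed 7% at face value (it is a marketing figure, not a measured result):

```python
# Sketch: projecting a Navi 21 game clock from Navi 10's, assuming only
# TSMC's claimed 7% frequency gain for N7P at the same power, nothing else.
navi10_game_clock_ghz = 2.0   # what OC'd Navi 10 cards already sustain
n7p_speedup = 1.07            # TSMC's claimed iso-power gain for N7P

projected = navi10_game_clock_ghz * n7p_speedup
print(round(projected, 2))  # 2.14 GHz, before any architectural tuning
```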


----------



## MxPhenom 216 (Oct 22, 2020)

medi01 said:


> Yeah.
> Or had NV priced a card that is 20-30% faster than a $1,200 card at around $999.
> I wonder why they didn't do it.
> Oh wait, this can't be the reason, can it:
> ...



So then we are assuming that either the 6800 XT is a tad slower and Nvidia knows this, but it will be a lot cheaper, or what?

If it's faster, best believe AMD will price it accordingly.


----------



## medi01 (Oct 22, 2020)

MxPhenom 216 said:


> So then we are assuming that either the 6800XT is slower by a tad and Nvidia knows this, but will be a lot cheaper or what?



AMD has a very competitive lineup inbound.
Pricing on NV cards is BS; the street price on the 3070 is around 700 Euros, and the 3080 is missing in action on top of that.

NV won't be able to recover with Ampere on Samsung 8 nm; it's an unredeemable clusterf*ck.


----------



## MxPhenom 216 (Oct 22, 2020)

medi01 said:


> AMD has a very competitive lineup inbound.
> Pricing on NV cards is BS; the street price on the 3070 is around 700 Euros, and the 3080 is missing in action on top of that.
> 
> NV won't be able to recover with Ampere on Samsung 8 nm; it's an unredeemable clusterf*ck.



They definitely do. It's been a long time coming, really. AMD hasn't been very competitive in the high-end GPU space for years now, probably since the R9 290X.

How do you know that when the 3070 isn't even out yet?

Nvidia will probably counter next year with Super and Ti variants using TSMC 7 nm.


----------



## N3M3515 (Oct 22, 2020)

MxPhenom 216 said:


> Nvidia will probably counter next year with Super and Ti variants using TSMC 7nm.



I don't know about a 3070 Super, but a 3080 Super? When the 3090 is within spitting distance (12% more or less)...
Maybe a 3080 Super with 20 GB that's 6% better than the regular 3080.


----------



## MxPhenom 216 (Oct 22, 2020)

N3M3515 said:


> I don't know about a 3070 Super, but a 3080 Super? When the 3090 is within spitting distance (12% more or less)...
> Maybe a 3080 Super with 20 GB that's 6% better than the regular 3080.



I'd suspect it'll mostly just be a process change: higher clocks if they can, and more memory. No extra cores unlocked, etc.

That 3090 is a fucking joke for gamers.


----------



## Max(IT) (Oct 22, 2020)

Mussels said:


> But they're stable in the long run. I've got an RX 470 and 580 in the house in lesser machines and they run totally smooth, i keep forgetting they're not Nvidia and try to open the overlay with the NV shortcut of alt-Z to stream things.


Unfortunately, the 5700 XT is still not stable today, unlike the RX 580 series...


----------



## MxPhenom 216 (Oct 22, 2020)

Max(IT) said:


> Unfortunately, the 5700 XT is still not stable today, unlike the RX 580 series...



What do you mean? Drivers still a nightmare?


----------



## Max(IT) (Oct 22, 2020)

medi01 said:


> AMD has a very competitive lineup inbound.
> Pricing on NV cards is BS, street price on 3070 is around 700 Euros, 3080 is missing in action on top of that.
> 
> NV won't be able to recover with Ampere on Samsung 8nm, it an un-redeemable clusterf*ck.


There is no street price for the 3070, since the 3070 isn't yet "on the street".



MxPhenom 216 said:


> What do you mean? Drivers still a nightmare?


Yep.
I wouldn't call it a nightmare (it was until April), but I still have some freezes and black screens from time to time.



yeeeeman said:


> Nvidia will have a bit of trouble with these new cards.


Especially if these new cards are actually available (and I'm not entirely sure about that).


----------



## MxPhenom 216 (Oct 22, 2020)

Max(IT) said:


> There is no street price for the 3070, since the 3070 isn't yet "on the street".
> 
> 
> Yep.
> ...




Yeah, so that's what I've been worried about, even if the 6800 XT is enticing enough. In the past I have had issues with AMD drivers, but the last card I had from them was a 5870.


----------



## Max(IT) (Oct 22, 2020)

MxPhenom 216 said:


> Yeah, so that's what I've been worried about, even if the 6800 XT is enticing enough. In the past I have had issues with AMD drivers, but the last card I had from them was a 5870.


I'm considering a 6800 XT too, but I know I'm taking a risk.
AMD is great at hardware, but not at software development.


----------



## Umbral (Oct 22, 2020)

Yazzia said:


> Why would AMD sell a card basically the same as the 3090 for less than half the price? The delusion is real



Slower than the 3090, 16 GB vs 24 GB, closer to the 3080 than the 3090.

Simple.


----------



## c1979h4life (Oct 22, 2020)

ratirt said:


> I wonder if the 6700 XT will perform around or a bit faster than the 5700 XT? Logic dictates it will be faster than a 5700 XT but who knows how much faster and if it will actually be faster.
> 
> You still are waiting for the 3080? oh boy. The Navi 6000 will definitely come before you get the card so you will have your chance of comparison. I'm also aiming at the more less 3080 type of performance and waiting for new Navi to see what it will offer. Either way I need a new faster GPU.


The RDNA2 GPUs were touted by AMD as being up to 50% more powerful than last gen, so I think it will be at least 30-40% faster than the 5700 XT, which would probably put it in the $500-600 range. They went up on the new 5xxx-series CPUs also. I want to see what the 6800 base model does.


----------



## bug (Oct 22, 2020)

c1979h4life said:


> The RDNA2 GPUs were touted by AMD as being up to 50% more powerful than last gen, so I think it will be at least 30-40% faster than the 5700 XT, which would probably put it in the $500-600 range. They went up on the new 5xxx-series CPUs also. I want to see what the 6800 base model does.


Remember, the 5700 XT features a 256-bit memory bus; the 5600 (XT) supposedly cuts that back to 192-bit. That means a cheaper PCB, and we'll have to wait and see what it means for performance.
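For context on what that bus cut means, here is a minimal sketch of peak GDDR6 bandwidth, assuming 14 Gbps/pin modules (the speed the 5700 XT shipped with; other SKUs may use faster memory):

```python
# Sketch: peak GDDR6 bandwidth for 256-bit vs 192-bit buses, assuming
# 14 Gbps per pin (what the 5700 XT shipped with; actual SKUs may differ).
def gddr6_peak_gbs(bus_width_bits: int, gbps_per_pin: float = 14.0) -> float:
    """Peak bandwidth in GB/s: pins * Gb/s per pin / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

print(gddr6_peak_gbs(256))  # 448.0 GB/s (5700 XT class)
print(gddr6_peak_gbs(192))  # 336.0 GB/s on a 192-bit bus
```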


----------



## Zach_01 (Oct 22, 2020)

Punkenjoy said:


> Indeed, but it's only a portion of the final solution. Both GPUs have all kinds of memory tricks, like compression, to reduce memory bandwidth. People focus on that because it sounds cool and it's what they know. Also, we don't necessarily know if Ampere is really memory-bandwidth bound.
> 
> Navi 10 currently has 512 KB of L1 (128 KB per 10-CU block). If AMD keeps the same size per 10-CU block, the available L1 cache will go from 512 KB to 1 MB for a full Navi 21, which is quite a big increase. If they scale the L2 per CU the same way, they could have around 8 MB of L2 cache.
> 
> How much that will help, I don't know. No matter what people say, only the benchmarks will really reveal the final information. We may be discussing stuff that has a marginal impact on real performance.


8 MB? No... no...
Leaks are "saying" around 128 MB of total cache in the RDNA2 flagship, at least. We can't really confirm it, though.

Rough estimation:
If we accept that the biggest Navi 21 die is 536 mm², then:

5700 XT (40 CUs + SoC) = 251 mm².
From die shots, about 150 mm² is the 40 CUs and the other ~100 mm² is SoC stuff.
80 CUs = 300 mm², but with the enhanced 7 nm node (+10-15% density) this could come down to a 250-270 mm² area for the 80 CUs, plus 100-150 mm² for the SoC, so 80 CUs + SoC = 350-420 mm².
That leaves another 115-185 mm² of the 536 mm² for cache alone.

Remember that Navi 21 has the same 256-bit bus, so it does not need a more complex memory controller (no extra die space over Navi 10).

So 100+ MB of cache could be a thing on the Navi 21 die...
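The back-of-the-envelope budget above can be written out directly; every input is a leaked or estimated figure from this post, not a confirmed spec:

```python
# Rough, speculative area budget for a ~536 mm^2 Navi 21 die, using the
# post's estimates only: 250-270 mm^2 for 80 CUs after a 10-15% density
# gain, and 100-150 mm^2 of SoC/uncore (256-bit controller unchanged).
DIE_MM2 = 536
CU80_MM2 = (250, 270)   # estimated area range for 80 CUs on the enhanced node
SOC_MM2 = (100, 150)    # estimated SoC/uncore area range

cache_budget = (DIE_MM2 - CU80_MM2[1] - SOC_MM2[1],   # worst case
                DIE_MM2 - CU80_MM2[0] - SOC_MM2[0])   # best case
print(cache_budget)  # (116, 186) mm^2 left over, room for a very large cache
```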


----------



## r9 (Oct 23, 2020)

MxPhenom 216 said:


> Mhmm.
> 
> From what I understand, GDDR6X is in very short supply. Only Micron is making it right now. Also Samsung 8nm (10nm+) yields are pretty bad.
> 
> Or Nvidia is purposefully reducing the supply. If thats the case, itll just hurt Nvidia in the end since most people will just buy these new AMD cards if the performance is there.



They just can't produce enough; low GPU and GDDR6X yields. Hopefully they get on top of it, and so does AMD with


Xex360 said:


> Fixed that for you, how come Turing was priced the same everywhere (except where taxes are the issue), but Ampere costs up to 40% outside the US.


If they had released for $1,099, the extra money would have gone into their pocket and not into a bunch of vultures' pockets. Even without AMD, at $1,099 ($100 cheaper) you get a 30% perf boost, so it's not totally outrageous.
I'm not trying to defend Nvidia, but the situation they are in isn't intentional, just the result of some bad calls on manufacturing and GDDR6X. I'm sure if it were up to them, they would like to be able to make and sell more: one, for the obvious reason ($$$), and two, the more they sell now, the less market is left for AMD. But I highly doubt that AMD's availability will be much better. And to remind those who didn't know, AMD and Nvidia have been sued before for making agreements and price fixing. So I don't expect any price undercutting or anything; the most we can hope for is to buy at regular price.


----------



## Super XP (Oct 23, 2020)

The Infinity Cache looks quite interesting. AMD releasing a 256-bit bus shows they are confident in their innovative IF Cache. Looking forward to seeing this thing in action. I wonder if the consoles had anything to do with this type of design, keeping costs down while maintaining performance/watt.


----------



## Punkenjoy (Oct 23, 2020)

Super XP said:


> The Infinity Cache looks quite interesting. AMD releasing a 256-bit bus shows they are confident in their innovative IF Cache. Looking forward to seeing this thing in action. I wonder if the consoles had anything to do with this type of design, keeping costs down while maintaining performance/watt.


Maybe, but I suspect it's related to the Navi 3x rumours about using chiplets like Zen 2/3. They probably want each core complex to be as self-sufficient as possible.

I also don't expect 128 MB of cache. Maybe more than 1 MB of L1 and 8 MB of L2. AMD is adding new features like hardware ray tracing and variable-rate shading, and we don't know yet how much space these new features and the other architectural changes will use.

But in the end it doesn't really matter. It's just fun to speculate. Only benchmarks will matter after launch.


----------



## mrjayviper (Oct 25, 2020)

Is the 6700xt the spiritual successor to the 5700xt?

Thanks


----------



## Max(IT) (Oct 25, 2020)

mrjayviper said:


> Is the 6700xt the spiritual successor to the 5700xt?
> 
> Thanks


I don't think so.
The 6700 will probably be similar to the 5700 XT, but with hardware RT support.

The "spiritual" successor of the 5700 XT will probably be the 6800 XT, with the 6900 XT on a level above.


----------



## Zach_01 (Oct 25, 2020)

Punkenjoy said:


> Maybe, but I suspect it's related to the Navi 3x rumours about using chiplets like Zen 2/3. They probably want each core complex to be as self-sufficient as possible.
> 
> I also don't expect 128 MB of cache. Maybe more than 1 MB of L1 and 8 MB of L2. AMD is adding new features like hardware ray tracing and variable-rate shading, and we don't know yet how much space these new features and the other architectural changes will use.
> 
> But in the end it doesn't really matter. It's just fun to speculate. Only benchmarks will matter after launch.


It is said that AMD's architecture won't have dedicated units for real-time ray tracing or a DLSS equivalent, but will implement those inside the "regular" pipeline. We really don't know if, and by how much, that increases the GPU's transistor count the way Nvidia's architecture does.

Personally, I do believe it will have a large cache, more than 100 MB, to compensate for the narrower bus.


----------



## RoutedScripter (Oct 25, 2020)

Here we go again; I have to hide my eyes while typing this so as not to see all these spoilers!!!

I only learned about this now, but it's pure drama as expected. See you all later on the 29th, after the proper reveal.

I get more happiness and enjoyment out of watching a proper reveal than any of this in the end.


----------



## Zach_01 (Oct 25, 2020)

RoutedScripter said:


> Here we go again, I have to hide my eyes while typing this, to not see all these spoilers !!!
> 
> Only learned about this now, but this is pure drama as expected, see you all later on the 29th after the proper reveal.
> 
> I get more happyness and enjoyment out of watching a proper reveal than any of this in the end.


Good for you!
Enjoy your 3070 and leave us here in our misery if you don't have an actual opinion or anything constructive to say.


----------



## bug (Oct 25, 2020)

Zach_01 said:


> Good for you!
> Enjoy your 3070 and leave us here in our misery if you don't have an actual opinion or anything constructive to say.


What is there to say constructively about leaked specs, really?


----------



## RoutedScripter (Oct 26, 2020)

Zach_01 said:


> Good for you!
> Enjoy your 3070 and leave us here in our misery if you don't have an actual opinion or anything constructive to say.



Gee, thanks. I've never had an Nvidia GPU in my life.

Yes, of course, constructive to say. After being spoiled by all the leaks, I could only go hide my head in the sand, with no motivation to talk about it. When I was young I also wanted leaks and spent a ton of time on forums, scouring the web for leads. It can be interesting, but it takes a lot of time, and it can get crazy if you don't slow the horses down. Now, with everything else considered (life, health, priorities), I find it spoiling. I'm honestly not going to lose that many nerves and that much time over the highly stressful phenomenon that this leak-hunting sport has become. It's taking over so many people's lives: dribbling about GPU leaks all day long, being seriously bothered, people losing it when they can't get a GPU on day 1, or getting super angry and harassing customer support over some very slight grievance. This bitching and crying causes other people negativity, and nobody looking at such negative news is going to pull anything inspiring out of it; that's not enjoyment. Holding companies to high standards and fairness is one thing, but it's all about the proper approach.

I'm commenting in general, not saying it's necessarily like this everywhere, but most of these discussions are more bickering, and they seem heavily integrated into some people's lives, with a lot of precious time invested, when in the end it's only about consumption of entertainment: consumption of force-fed information and ideas someone else developed. Playing games, and doing only that, isn't some high achievement for humanity, and losing so many nerves over it makes it pretty evident that such a person doesn't really understand what's happening here. If it were about work, workstations, actual practical things we need, like a proper prosumer hardware segment or a proper pro-user OS... so many things are so crappy in that space, and not a single one of these gamer dudes is ranting about that. And no, Linux is not a replacement for Windows; Linux is a collection of industry-related interest groups who mostly favor their own goals, not the goals of an ideal prosumer (pro-user). So many things in Linux are either totally missing or convoluted that are just 1-2 clicks on Windows, and it's not about clicks; I'm talking about the easiest, most primitive, standard things that should NOT take more than 1-2 clicks or key presses. Most Linux GUIs are in fact not meant for pro-users either. The most practical things that would seriously de-confuse all of this aren't being done; not much attention, thought, or programming effort is focused on them. I think it's firstly a failure to understand the issues; people aren't consciously aware of them. Yet they all praise themselves for advancing... advancing what? They're not advancing the things that would bring back the golden age of PCs, that would get rid of all this drama so we could enjoy things a lot more overall.
So much of tech news and PC desktop tech is boring in so many ways. Many of the advancements that happen barely matter in practice to an average or pro user. Yes, you have faster speeds, yes, this and that, but are you really enjoying the games THAT much? Or is it more like 20% enjoyment and 80% forum, driver, OS, reboot, reinstall, crap, error, bug, issue, customer support, new CPU, and hardware-failure drama? And then people still find the time to push their already exhausted bodies into bickering about speculation and leaks, which is the last thing one should do for one's sanity.

Do you people understand that with all the effort of all these devs, we could have created a much better experience overall on the PC desktop? There could be courses where beginners are taught properly, not left to churn in this never-ending drama, because so many companies and FOSS programmers just AREN'T WORKING IN YOUR BEST, WISEST, PROPER INTEREST. So many programmers waste time fiddling with worthless FOSS apps just for the sake of programming and the dopamine rush of feeling like they're contributing to a social group; it's a form of sport too. They're not really trying to improve the biggest offender, the biggest elephant in the room, the freaking Windows OS. No, they're busy developing their Python pixie-dust app that does some XYZ thing on your desktop and isn't fundamentally going to change the course at all. So much stuff out there is lovely, nice, but it's NOT ENOUGH; it isn't making a fundamental impact, because the whole culture does not know how to analyze, properly identify, and tackle the issue. We need to fundamentally rethink the whole ecosystem. It unfortunately starts with finances and whoever has the hardware manufacturing capability, but if existing customers band together, they COULD have a chance at forcing the current industry culture to radically change, if they really want to. Several players DO enjoy this set of circumstances: the ones who profit from it. One thing is for sure: websites trading in leak news PROFIT from all this pointless, life-sucking drama.

Many more people than I realized are overtaken and seriously bothered by all of this. If the leak isn't up to their expectations, they practically blow up. What??? It was all speculation and lottery anyway. I don't think they even realize it's taking a toll on their health, a consequence of a stressful experience and lifestyle.
On top of everything else in life (school, parents, the transition to adulthood, adult-life dangers, mafias, drugs, evil corporations, and predatory military-grade advertisements), the average teenager goes through a meat grinder of stress and psychological assaults, and they rarely figure it out in time. This obsession with leaks is one of those traps, and IMO they don't realize it; it's probably a form of sport, and the constant excitement that spills over into stressful arguments is unwise.

I've talked about it before, so this time I'll be shorter. One of the things it largely comes down to is simple beginner inexperience. That's what an average gamer is: a developing human who is relatively inexperienced. If previous generations don't do a good enough job teaching the next, this is what you get. The next generation always faces the problem of having to re-figure and re-discover what older generations already did but never put enough thought into passing on effectively.
Certainly the previous generations can't be blamed for everything; it's a loop. They had the same problem themselves and aren't experienced enough either, otherwise they would know how important it is to pass on knowledge and discoveries.
New times bring new challenges that weren't explored and learned before. These gamer kids have this challenge ahead of them to figure out, and unless someone warns them about it, they may be at it for a long, long time. They should ask themselves how much precious effort, time, and mental capacity goes into bickering about hardware specs all day long: time and effort we all really need for many other things in life. Do they have a sense of how much time they spend speculating and theorizing? It's a lack of knowledge and perspective; with experience, all of this spec speculation and leak-chasing starts to feel insignificant next to the rest of life, which justifiably warrants more time and effort. This whole problem isn't really anything special to this field or topic.

Look, a proper technical discussion with your buddies over a weekend visit or an afternoon sit-down at a bar is one thing, but this craziness running around the web is just ridiculous. It feels like a kindergarten circus, jittery and frantic half the time; it's almost as if everything is about leaks, leaks this, leaks that, and the whole experience relies on leaks and rumors half the time. I would even blame the manufacturers themselves, who in some ways benefit from the generated buzz, and whose official information, at most of today's crony corporations, is very confusing and bent against the customer. This is evident in product labeling and model-numbering systems; in almost every case, the model numbers are deliberately so confusing that it's an extra chore for a customer to figure out exactly what fits his needs.

Then there's the other part: "_who knows what a few hours earlier than someone else_". The funny thing is that it's usually people who have no practical or financial gain from knowing the information earlier than they officially would. It's so silly. The only ones who really benefit from leaks are the few engineers working at the competitor; a second group is probably shady investors and financial speculators, and maybe some employees up to no good. Everyone else in the community is IMO in it just for the sport. It's all about being part of a campaign, a social group, and getting the dopamine rush of feeling like a "worthy" part of it; people get rated on who knows more details about which hardware part, and they brag to prove their worth in the group. All of this drama and buzz directly benefits various tech websites, to varying degrees, which abuse the phenomenon for profit; and profiting from the drama means they have no interest in fixing its problems.

What I'm warning here is that these gamer dudes should know they're actually part of a larger bowl of soup; this whole hype train is a business method. If it were fairer, it would be another thing, but no: these gamer people don't know that, and IMO they get nothing in return except that psychological excitement. The limited dopamine rush they get from the hype is all negated when they get so upset the moment something goes against their expectations, so they're constantly causing themselves harm.
Excitement is good, but only in short bursts; it adds to stress, and chronic stress is one of the unhealthiest things there is. Pushing your body day in and day out to stay hyped and excited, running around the tech web like a crazed chicken with its head cut off...
I do understand the hype around CPUs and GPUs can be fun; I've been there, done that. But I'd rather be hyped mostly on the big day of the announcement, with a healthy, balanced dose of excitement. What some people do is total overkill that morphs into its opposite, and I think it does them more harm than good overall. Some of them probably sit for very long stretches, which is very unhealthy, just navigating the web and pressing F5 frantically; that's actually a sign of some other health issue showing itself through these symptoms, so either way it's not normal IMO. This kind of broken, unhealthy hype should not be taught to the next generations.


----------



## Max(IT) (Oct 26, 2020)

Wow... both of you derailed the thread in a big way.


----------



## mrthanhnguyen (Oct 26, 2020)

6900xt ($350) = 3080 ($700)
That would be awesome.


----------



## Max(IT) (Oct 26, 2020)

mrthanhnguyen said:


> 6900xt ($350) = 3080 ($700)
> That would be awesome.


That would be impossible


----------



## Punkenjoy (Oct 26, 2020)

I think some people think AMD is a non-profit organization...


----------



## Max(IT) (Oct 26, 2020)

Punkenjoy said:


> I think some people think AMD is a non-profit organization...


They just have unrealistic expectations


----------



## Zach_01 (Oct 26, 2020)

Punkenjoy said:


> I think some people think AMD is a non-profit organization...





Max(IT) said:


> They just have unrealistic expectations


No, no... just look at the $1,500-MSRP GPU he got... His comment's meaning is elsewhere.



mrthanhnguyen said:


> 6900xt ($350) = 3080 ($700)
> That would be awesome.


It would be awesome even at $1,000, and it would make your fake Titan of $1,500+ look s... s... stupid, I want to say? I'm not sure... oh well... I'm confused...


----------



## Camm (Oct 26, 2020)

The question, I think, is whether RT+DLSS will act as the next PhysX / Gameworks / Tessellation / G-Sync spoiler of previous generations, where AMD has a faster card in rasterisation (or, in general, a better card), but the market sees some feature or strength from Nvidia as being worth the price premium / worse product.

Against DLSS, I expect AMD to deploy a DirectML solution that will work in any DX12 title, mostly negating DLSS (with the proviso that you will need to run at a higher resolution on AMD than with DLSS for no noticeable quality loss). RT is a bit more of a wildcard (I expect AMD will be slower); games are adopting it more and more, but still at questionable levels of use.


----------



## kapone32 (Oct 26, 2020)

ShurikN said:


> AMD's reference PCB and board designs are among the best on the market. Everything is over-engineered, and it rivals much more expensive Sapphire models, for example.


I hope they don't make it a pain to install a Waterblock.


----------



## bug (Oct 26, 2020)

mrthanhnguyen said:


> 6900xt ($350) = 3080 ($700)
> That would be awesome.


Not if you were a shareholder, it wouldn't


----------



## Camm (Oct 26, 2020)

bug said:


> Not if you were a shareholder, it wouldn't



I do think it would be a mistake for AMD not to price aggressively. AMD has a mindshare problem in the GPU market, and turning the screws on Nvidia while they can't respond effectively would put AMD in the best position going forward.

That being said, too many idiots still think of AMD as the no-frills budget brand, when the reality is they have plenty of world-beating products.


----------



## N3M3515 (Oct 26, 2020)

mrthanhnguyen said:


> 6900xt ($350) = 3080 ($700)
> That would be awesome.



More like $650


----------



## bug (Oct 26, 2020)

Camm said:


> I do think it would be a mistake for AMD not to price aggressively. AMD has a mindshare problem in the GPU market, and turning the screws on Nvidia while they can't respond effectively would put AMD in the best position going forward.
> 
> That being said, too many idiots still think of AMD as the no-frills budget brand, when the reality is they have plenty of world-beating products.


Pricing aggressively is one thing, but the same performance at half the price is just nuts.
Pricing aggressively usually means selling at close to zero profit, or subsidizing from other divisions. Only AMD knows whether Ryzen+Epyc lets them do that already.


----------



## Camm (Oct 26, 2020)

bug said:


> Pricing aggressively is one thing, but the same performance at half the price is just nuts.
> Pricing aggressively usually means selling at close to zero profit, or subsidizing from other divisions. Only AMD knows whether Ryzen+Epyc lets them do that already.



My apologies if it came across as though I was advocating stupid prices (I thought I covered that when I said too many idiots treat AMD as a budget brand, but apparently not).


----------



## renz496 (Oct 27, 2020)

Camm said:


> The question, I think, is whether RT+DLSS will act as the next PhysX / Gameworks / Tessellation / G-Sync spoiler of previous generations, where AMD has a faster card in rasterisation (or, in general, a better card), but the market sees some feature or strength from Nvidia as being worth the price premium / worse product.
> 
> Against DLSS, I expect AMD to deploy a DirectML solution that will work in any DX12 title, mostly negating DLSS (with the proviso that you will need to run at a higher resolution on AMD than with DLSS for no noticeable quality loss). RT is a bit more of a wildcard (I expect AMD will be slower); games are adopting it more and more, but still at questionable levels of use.



Even so, it is not something that will work automatically, just like DX12 multi-GPU. The biggest problem with machine learning is that it needs to be trained first (it's in the name). That takes time and resources to do right. So even if it can work with any DX12 title, each specific title still needs training. With DLSS, the training is mostly financed by Nvidia as part of the sponsorship. Historically, AMD has not been fond of things that require them to spend their own resources; they'd rather someone else cover the expense, be it the game developers themselves or their hardware partners. That's probably why, in the past, AMD pointed at DLSS's weaknesses and tried to counter it with a completely different technique (image sharpening).


----------



## Camm (Oct 27, 2020)

renz496 said:


> it is not something that will work automatically, just like DX12 multi-GPU.



There's a reason I mentioned DirectML: it can be implemented at the pipeline rather than the engine level, negating the need for explicit support. Training I could see coming as part of driver updates, or covered by a generic model, especially in games that support dynamic resolution (although that would have to be exposed specifically for an AMD solution to work). I doubt AMD can match the quality and performance of DLSS, but neither does it need to, if it can get to 90% of the quality at a low enough resolution to make a performance difference.


----------



## goodeedidid (Oct 28, 2020)

No GDDR6X? I guess this isn't really going to be on par with the 3080, as some people were assuming.


----------



## mrthanhnguyen (Oct 28, 2020)

6900 XT ($999) >= 3090 ($1,499)
Who's going nuts now?


----------



## dragontamer5788 (Oct 28, 2020)

goodeedidid said:


> No GDDR6X? I guess this isn't going to really be on par as the 3080 as some people were assuming.



This comment didn't age well...

It happens to all of us, lol. I still don't understand how the 6900 XT can keep up with the 3090 with such a memory disadvantage. I get that the 128MB cache is helping, but... how?


----------



## dir_d (Oct 28, 2020)

dragontamer5788 said:


> This comment didn't age well...
> 
> It happens to all of us, lol. I still don't understand how the 6900 XT can keep up with the 3090 with such a memory disadvantage. I get that the 128MB cache is helping, but... how?


----------



## Crustybeaver (Oct 28, 2020)

Vya Domus said:


> It will be funny to see a 3080 "ti" that's 5-8% faster.
> 
> Reality check, Nvidia probably didn't plan to use GA102 for the 3080 but rather a fully enabled GA104 like they always did which could have made room for a 3080ti. For some _unknown _reason (let's leave it at that for the time being) they decided GA104 wasn't enough so the whole stack got pushed up. There is no room for "ti" variants, or at least not for any that would matter. There is potential room for a 3070ti but that's because they disabled more SMs than usual, the 3070 was meant to be the 3080 and 3090 the Titan.


So if it's 5-8% faster doesn't that make it faster than the 6800XT and competitive with the 6900XT?


----------



## Nater (Oct 28, 2020)

dragontamer5788 said:


> I still don't understand how the 6900 XT can keep up with the 3090 with such a memory disadvantage. I get that the 128MB cache is helping, but... how?



Isn't that what the Infinity Cache is all about?  You do mean memory speed right, not capacity? (16GB vs 24GB)


----------



## dragontamer5788 (Oct 28, 2020)

Nater said:


> Isn't that what the Infinity Cache is all about?  You do mean memory speed right, not capacity? (16GB vs 24GB)



Bandwidth specifically, yeah.

The 128MB cache must be waaaayyyyy faster, to make up for the overall lower bandwidth to main VRAM.
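The arithmetic behind that intuition can be sketched with a hit-rate-weighted bandwidth model. The 512 GB/s figure follows from the leaked 256-bit GDDR6 configuration (at 16 Gbps per pin); the cache bandwidth and hit rates below are purely illustrative assumptions, not confirmed specs:

```python
# Back-of-envelope sketch: a large on-die cache raises usable bandwidth
# because only cache misses have to cross the narrower GDDR6 bus.

def effective_bandwidth(vram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    """Hit-rate-weighted average of cache and VRAM bandwidth, in GB/s."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * vram_gbps

# 256-bit bus * 16 Gbps per pin / 8 bits per byte = 512 GB/s raw VRAM bandwidth.
vram = 256 * 16 / 8            # 512 GB/s
cache = 2000.0                 # hypothetical on-die cache bandwidth, GB/s

for hit in (0.0, 0.5, 0.75):
    print(f"hit rate {hit:.0%}: ~{effective_bandwidth(vram, cache, hit):.0f} GB/s")
```

Under these assumed numbers, even a 50% hit rate already pushes the blended figure past a 384-bit GDDR6X bus; the real question is what hit rate typical gaming workloads achieve at 4K.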


----------



## medi01 (Oct 29, 2020)

Crustybeaver said:


> So if it's 5-8% faster doesn't that make it faster than the 6800XT and competitive with the 6900XT?


It makes it competitive with the 3090, which is about 10% faster than the 3080, lol.
But then it only has 10 GB of VRAM.

And do you want to talk about availability of cards that aren't available EVEN FOR PRE-ORDER (at announced price)?

The 2080 Ti's MSRP was $999, ya know. On paper.



dragontamer5788 said:


> The 128MB cache must be waaaayyyyy faster, to make up for the overall lower bandwidth to main VRAM.


Cool part is that this cache bit is from Zen 3 development.
AMD's GPU-CPU synergy starts to pay off.

They jumped from being behind on using bandwidth effectively to trouncing NV.


----------



## Crustybeaver (Oct 29, 2020)

medi01 said:


> It makes it competitive with 3090 which is about 10% faster 3080, lol.
> But then it only has 10GB VRAM.
> 
> And do you want to talk about availability of cards that aren't available EVEN FOR PRE-ORDER (at announced price)?
> ...


My post was in reference to the proposed 3080 Ti; how can you talk about availability when it's still at the rumour stage? Based on the 6900 XT being $999, it would be safe to assume the 3080 Ti would be a direct competitor to it in both price and performance.


----------



## medi01 (Oct 29, 2020)

Crustybeaver said:


> My post was in reference to the proposed 3080Ti, how can you talk about availability when it's at the rumour stage?


It is GA102-based; remind me about the availability of those chips...
The gap between the 3090 and 3080 is 10%, so how does one fit a glorious 3080 Ti into it?

Ampere is DOA, non-redeemable. Only FUD to save face:


----------



## Camm (Oct 29, 2020)

medi01 said:


> Gap between 3090 and 3080 is 10%, how does one fit glorious 3080Ti into it.



You kinda aren't. You shave off a meaningless number of cores, halve the RAM, and voilà, you have your "3080 Ti".


----------



## Nater (Oct 29, 2020)

I expect the market to adjust. You'll see the 6900 XT (if there are AIB cards released) shoot up to $1,200 or so initially, while the 3090 drops a couple hundred. Otherwise it REALLY is a niche product for the creator/gamer crowd.


----------



## Crustybeaver (Oct 29, 2020)

medi01 said:


> It is GA102 based, remind me about availability of those chips...


My guess is that the 6900 XT will sell out, and since there's no announcement of AIB cards, I wouldn't be surprised to see it hit crazy aftermarket prices. Whether it's a 3080, 3090, 3080 Ti or 6900 XT, or maybe even the 6800 XT, I think availability is going to be scarce for the next six to twelve months.


----------



## EarthDog (Oct 29, 2020)

WeeRab said:


> crash-to-desktop, bug-ridden Nvidia cards.


You're aware this was fixed with a driver BEFORE your post, right? Get a clue, dude. Your posts are full of outdated or just plain wrong information.


----------



## Punkenjoy (Oct 30, 2020)

Zach_01 said:


> It is said that AMD architecture would not have dedicated units for RTRT and DLSS equivalent, but will implement those inside the ”regular” pipeline. We really don’t know if and how that is increasing transistor count for the GPU as the nVidia architecture does.
> 
> Personally I do believe that it will have a large cache, more than 100MB, to compensate the narrow bandwidth.


Seems you were right.


----------



## Super XP (Oct 30, 2020)

mrjayviper said:


> Is the 6700xt the spiritual successor to the 5700xt?
> 
> Thanks


Now that AMD has officially announced the 6800, 6800 XT and 6900 series, yes, the 6700 XT should be the successor to the 5700 XT, because we now know the *6800 XT is the Big Navi* everyone was talking about.


----------



## goodeedidid (Nov 11, 2020)

dragontamer5788 said:


> This comment didn't age well...
> 
> It happens to all of us, lol. I still don't understand how the 6900 XT can keep up with the 3090 with such a memory disadvantage. I get that the 128MB cache is helping, but... how?


Let's first see real-world usage and not base our opinions on in-house spec sheets.


----------



## Gmr_Chick (Nov 20, 2020)

I'm itching to see some news about the 6700s. When they finally drop and the reviews are good, I'm saying bye-bye to my 1660 Super.


----------

