Tuesday, August 31st 2021

AMD Reportedly Readying RX 6900 XTX, Bringing the Battle to NVIDIA RTX 3090
Graphics cards may be on their way to becoming unicorns that you can only pay for after finding the proverbial pot of gold at the end of a rainbow, but that doesn't mean AMD and NVIDIA will slow down their competition any time soon - especially since, in this market, there's a huge profit to be made. And AMD may finally be readying its true halo product: a graphics card that aims to beat NVIDIA's RTX 3090 across the board. Twitter user CyberPunkCat shared an alleged AMD slide showcasing a new, overpowered RX 6900 XTX graphics card. AMD's naming scheme for its RX 6900 series can be slightly confusing nowadays: the original RX 6900 XT carries the Navi 21 XTX die, and AMD recently released a higher-performance version of that Navi 21 chip in the form of the Navi 21 XTXH - which powers the liquid-cooled versions of the RX 6900 XT, with higher overall clocks than the original GPU release. However, there hasn't been a corresponding change in the RX 6900 XT nomenclature - but this new slide suggests that is about to change.
If the leaked slide is real (keep your NaCl ready, as always), it appears that the RX 6900 XTX might pair the higher-performance Navi 21 XTXH chip with higher memory speeds. While both Navi 21 XT and Navi 21 XTXH make use of 16 Gbps GDDR6 memory, the slide indicates that the RX 6900 XTX will feature 18 Gbps memory, exploring another avenue for increased performance. This would raise maximum theoretical memory subsystem bandwidth from the 512 GB/s of the RX 6900 XT up to 576 GB/s - a 12.5% increase, though one that would not translate into a proportional increase in final performance. However, considering that our own reviews show AMD's RX 6900 XT with the Navi 21 XTXH silicon is already between one and three percent faster than NVIDIA's RTX 3090, even a slight 5% performance increase over that card means that AMD might be able to claim the performance crown for the mainstream market. It's been a while since that happened, hasn't it?
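For the curious, the bandwidth figures above are simple arithmetic: peak GDDR6 bandwidth is the bus width times the per-pin data rate. Here is a minimal sketch of that math, assuming the existing RX 6900 XT's 256-bit bus carries over unchanged (the slide does not suggest otherwise):

```python
# Peak memory bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
BUS_WIDTH_BITS = 256  # assumption: the RX 6900 XT's 256-bit memory bus is retained

def peak_bandwidth_gb_s(data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s for a given per-pin data rate."""
    return BUS_WIDTH_BITS * data_rate_gbps / 8

current = peak_bandwidth_gb_s(16)  # RX 6900 XT, 16 Gbps GDDR6
rumored = peak_bandwidth_gb_s(18)  # alleged RX 6900 XTX, 18 Gbps GDDR6
print(f"{current:.0f} GB/s -> {rumored:.0f} GB/s (+{(rumored / current - 1) * 100:.1f}%)")
# prints: 512 GB/s -> 576 GB/s (+12.5%)
```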
Sources:
CyberPunkCat @ Twitter, via Tom's Hardware
107 Comments on AMD Reportedly Readying RX 6900 XTX, Bringing the Battle to NVIDIA RTX 3090
I also didn't say anything about pricing; clearly the market lets them charge whatever they want. But right now AMD doesn't have a product with a $300 MSRP to challenge the 3060. There are many more customers at that end of the spectrum needing to upgrade from a prior generation than there are customers looking for yet another 6900 or 3090 they can't actually buy due to product scarcity.
www.newegg.com/p/pl?d=6900+xt&N=8000%204131
Should AMD just sit back and watch nvidia release 3090 super?
You go from talking about releasing top-tier products to complaining about the lack of 3060 and 6600 XT cards at MSRP. I'm not sure where you are going with this point? Stock isn't going to improve until TSMC catches up to demand. AMD could release 5 more variants of the 6600 and NVIDIA could release 5 more variants of the 3060, but the supply at TSMC would still not change. Your comment really makes no sense.
Just say that you are upset about cards not being available at MSRP, rather than making comments that provide no logic.
cough Minecraft cough
RT is great but the hardware is not there to support it. You've had a glimpse of RT and have already fallen in love with it - just like a teenager falls in love with the first girl who kissed him.
RT is a great move forward, no doubt, but the hardware limitation is still there, and it will take a while before it stops turning games into a slideshow.
The flip side of that coin being: the majority of RT-capable hardware out there (consoles) having relatively weak capability will hopefully be a tailwind for clever innovation and optimisation to eke the most out of what we've got.
You don't see the difference when you use DLSS and that is fine, but you do see a huge difference when you switch RT on and it is worth the FPS hit? That, my friend, is preference, but the fact is, hardware is struggling with RT. Most people don't want to use it because they don't see a lot of difference in visual quality, but they surely do see the FPS impact. Low benefit for switching it on considering the FPS hit.
The games you have specified, Control and Metro: sure, you can play with RT, but as you said, you have to tweak it (sacrifice some details). All maxed out, it will still dip below 60 or even 40 FPS (Control), and that's on a card that costs $1.5k or something like that.
Imagine paying the same money as I did, only I get a better gaming experience 50% of the time and an equal one the rest, and then you just say you don't care about gaming experience anyway.
Let me guess: if you had played CP2077, wouldn't you need to sacrifice some rasterized settings in order to get 4K 60 FPS? By your own definition, then, the rasterized performance of the 6900XT is not even there yet, much less its RT performance.
Matter of preference; some people would not use either because, for them, the difference in visuals is not worth it. Yeah, and some people are just being a douche. People, am I right? You speak of preference. The majority of people clearly say RT is good but the hardware is not there yet.
Back to preference: I don't consider DLSS a compromise because I prefer the output in all games I play with it vs without (antialiasing, reduced/non-existent shimmering, fine detail).
I have never tried justifying my purchase, since it is the most uncompromising GPU currently available anyways (well until the 3090 Super). I have played all the latest AAA games with the best gaming experience allowed by current Rasterization + Ray Tracing hardware.
And now you brag that you played all games with RT + rasterization? Who cares about what you play and what card you are using. Source for what exactly? That the hardware for fully ray traced games we currently have is not enough?
Never said DLSS is a compromise. It is a great feature but that is not the point we are discussing here.
Ray tracing is too much for today's available cards, and games are not fully ray traced due to hardware constraints. The visual gain relative to the FPS drop is unjustified, and relying on DLSS - which is great - proves that the hardware can't keep up, and the games are not even fully ray traced.
You can still use RT, but it is a glimpse of what the RT API can do, and the hardware can't keep up with it. That's my point.
I'm not sure that in today's market it would necessarily be a primary driver of my purchase, getting a card at my desired rast perf level @ or close to MSRP would be higher on the list, and it 100% was when I landed a 3080 at launch, but I've been pleasantly surprised in the RT domain.
I will absolutely not disagree that hardware RT needs to become more capable, by likely an order of magnitude or more in the long run (how long is a piece of string? how long would you like it to be?), and for sure it will. But I think the majority of people don't find the hardware lacking to be the foremost barrier to RT, I think the foremost barrier is adoption in games and effects used. Perhaps we are both right in our own ways, to varying extents.
Ray tracing makes sense in areas that rasterization cannot do well, that is, realistic reflections, global illumination, emissive lighting and shadows. So yeah, DXR is the best of both worlds.
Sure, the RT capability of the RX 6000 series is a joke, but you are making a general assumption that RT is not worth the perf trade-off - certainly not true for all the people who own an RTX 3000 :D.
If I were to make an educated guess, RTX 4000 will offer the same improvement to both rasterization and RT compared to RTX 3000, meaning the perf cost with RT on will remain relatively constant across Turing, Ampere and Ada.
Even Intel acknowledged the importance of RT: they have dedicated RT cores in their upcoming GPUs.
So here is my question for you, since you don't agree with my statement 'the majority of people say RT is good but the hardware lacks': is the hardware we currently have enough for fully ray traced games? Because games will surely get more demanding on both the RT side and the rasterization side. And one more question: how do you expect me to give you a link to 'the majority of people say'? I think what you are after is the conclusion, not a link or a source for 'the majority of people say'.
I see our friend @nguyen is busting a gut laughing while giving meaningless arguments. I sure hope the 3090 will get better at RT in two years' time when new, more demanding games come out. (sarcasm)
I'm pretty sure you will be getting RDNA3, so aren't you being hypocritical? or you are saying your 6900XT is too strong in rasterization that it won't be obsolete in a year :roll: ?
Edit: nice try finding that techtuber, though; he is playing CP2077 with RT+DLSS for the best gaming experience, as opposed to no RT/DLSS. Playing CP2077 at 4K with RT+DLSS is the most next-gen visual experience money can buy atm.
My 6900XT will do fine don't you worry. :)
I think it's much more realistic to look at economic realities and actual content, both of which aren't rosy for RT.