Monday, January 4th 2021

AMD Radeon Navi 21 XTXH Variant Spotted, Another Flagship Graphics Card Incoming?

AMD has recently launched its Radeon "Big Navi" 6000 series of graphics cards, marking its entry into the high-end market and positioning itself well against the competition. The "Big Navi" graphics cards are based on the Navi 21 XL (Radeon RX 6800), Navi 21 XT (Radeon RX 6800 XT), and Navi 21 XTX (Radeon RX 6900 XT) GPU revisions, each of which features a different number of shaders/TMUs/ROPs. Navi 21 XTX is the highest-performance revision, featuring 80 Compute Units with 5120 cores. However, it seems that AMD is preparing another, similar piece of silicon called Navi 21 XTXH. It is currently unknown what the additional "H" stands for. It could indicate an upgraded version with more CUs, or perhaps a slightly cut-down configuration. It is also unclear where such a GPU would fit in the lineup, or whether it is just an engineering sample that will never make it to market. It could represent a response from AMD to NVIDIA's upcoming GeForce RTX 3080 Ti graphics card; however, that is just speculation. Another possibility is that such a GPU would be part of a mainstream notebook lineup, just as Renoir comes in an "H" variant. We will have to wait and see what AMD does to find out more.
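For reference, each RDNA 2 compute unit contains 64 stream processors, so the shader counts of the known Navi 21 revisions follow directly from their CU counts. A minimal sketch of that arithmetic, using the publicly listed CU counts of the three retail cards:

```python
# Rough arithmetic for Navi 21 revisions: shaders = CUs x 64 stream processors per CU (RDNA 2).
STREAM_PROCESSORS_PER_CU = 64

navi21_revisions = {
    "Navi 21 XL (RX 6800)": 60,
    "Navi 21 XT (RX 6800 XT)": 72,
    "Navi 21 XTX (RX 6900 XT)": 80,
}

for name, cus in navi21_revisions.items():
    print(f"{name}: {cus} CUs -> {cus * STREAM_PROCESSORS_PER_CU} stream processors")

# Output:
# Navi 21 XL (RX 6800): 60 CUs -> 3840 stream processors
# Navi 21 XT (RX 6800 XT): 72 CUs -> 4608 stream processors
# Navi 21 XTX (RX 6900 XT): 80 CUs -> 5120 stream processors
```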
Sources: USB, via VideoCardz

40 Comments on AMD Radeon Navi 21 XTXH Variant Spotted, Another Flagship Graphics Card Incoming?

#26
Caring1
XT - more power
XH - extra Hot ;)
#27
FreedomOfSpeech
Three weeks ago I got a 6900 XT. After two weeks I gave up. I bought new cables and used DDU and CRU several times, but it didn't work well. The card did not work with my LG C9: only 4K@60 was working, Freesync didn't, VRR didn't, HDR didn't. 4K@120Hz at 8-bit + dithering worked for one evening, with black screens every 5 seconds and microstuttering at its finest. The next day the TV showed "No Signal" until I dropped down to 60 Hz. When I was playing on the TV, my monitor showed some weird flickering, maybe it's what's called Freesync flickering. It is a Dell S3220DGF with FreeSync 2 HDR. Freesync only worked in windowed mode; in full screen there was so much flickering that it was not usable. I was so hyped about AMD, and so disappointed with Nvidia for their paper launch, customer politics (Hardware Unboxed), and the #CudaCore "lying", but with Nvidia everything works properly right from the start. Perhaps the HDMI 2.1 port on the AMD card is the reason for this mess, because it is only 40 Gbps while the LG C9 has the full 48 Gbps. On the other hand, it shouldn't affect the S3220DGF over a DP 1.4 cable (which I had to buy to get the 165 Hz out of it, because Dell wanted to save 10 cents and ships only a DP 1.2 cable with this monitor).
Now I'm waiting for the 3080 Ti.
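For context on the 40 Gbps vs. 48 Gbps point above, here is a rough back-of-the-envelope bandwidth check. This is a sketch only: the combined blanking and FRL encoding overhead factor is an approximation, not exact HDMI 2.1 timing math.

```python
# Rough uncompressed bandwidth estimate for 4K @ 120 Hz with 10-bit RGB (4:4:4).
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10          # three colour channels at 10 bits each

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Raw pixel data: {raw_gbps:.1f} Gbps")             # ~29.9 Gbps

# Blanking intervals and FRL line coding add overhead; ~30% is used here as a ballpark assumption.
estimated_link_gbps = raw_gbps * 1.3
print(f"With overhead:  {estimated_link_gbps:.1f} Gbps")  # ~38.8 Gbps, close to the limit of a 40 Gbps port
```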
#28
Zyfe
Maybe it could be an RX 6950 XT with more CUs.
#29
TheinsanegamerN
The Navi 21 XTXHash edition, for all the miners out there!
#30
SKD007
Turmania: I have yet to find current-gen AMD offerings in both the CPU and GPU segments available to order at rated prices or close to them. It has just been a paper-launch year, from all companies. Very disappointed.
Yeah, it's so bad. I finally gave in and bought a 5950X for 1099, but I'm hoping I can get a 3080 Ti or Super for the actual price.
#31
Master Tom
saikamaldoss: Either an HBM version, or the full-fat CU version with 7300+ cores? Hope so... good that scalpers emptied all that stock, which stopped me from making the move on a new GPU... Great news
The Radeon™ RX 6900 XT has the full Navi 21 chip.
#33
LemmingOverlord
Aldain: The H does not stand for Mobile when it comes to AMD. Their mobile lineup always has an M prefix when it comes to the GPU, so this could be an HBM 2.0 card.
For some reason a reply I gave has not shown up. Must've been a client-side issue, or a PEBCAC.

As I suggested in my first comment, I'm referring to the H-series CPUs. I'm deducing the XTXH would be an XTX-class GPU coupled with an H-series CPU (i.e. a high-performance mobile part). As mobile parts go, AMD does add the "m" suffix to their retail branding, but XTX is not a retail name (it was once, a long time ago, but not any more); XTX defines the die family it belongs to.

At 27W I can't say it's a 3090-killer, as this looks like the power envelope for a mobile GPU.
#34
Adam Krazispeed
evernessince: Pretty sure Micron and Nvidia teamed up for GDDR6X. Not sure if it's open to others.
Pretty bullshit / anti-competitive if you ask me. Nvidia/Micron should be PENALIZED BY THE FTC for this ANTI-COMPETITIVE behavior. This is BULLSHIT!! THEY CAN'T KEEP THIS FROM OTHER TECH COMPANIES LIKE AMD, THIS IS ANTI-COMPETITIVE BEHAVIOR!!!
LemmingOverlord: For some reason a reply I gave has not shown up. Must've been a client-side issue, or a PEBCAC.

As I suggested in my first comment, I'm referring to the H-series CPUs. I'm deducing the XTXH would be an XTX-class GPU coupled with an H-series CPU (i.e. a high-performance mobile part). As mobile parts go, AMD does add the "m" suffix to their retail branding, but XTX is not a retail name (it was once, a long time ago, but not any more); XTX defines the die family it belongs to.

At 27W I can't say it's a 3090-killer, as this looks like the power envelope for a mobile GPU.
Besides, THE TOP XTX CHIP WOULDN'T BE USED FOR A MOBILE PLATFORM ANYWAY. NO WAY IN HELL.
#35
evernessince
Adam Krazispeed: Pretty bullshit / anti-competitive if you ask me. Nvidia/Micron should be PENALIZED BY THE FTC for this ANTI-COMPETITIVE behavior. This is BULLSHIT!! THEY CAN'T KEEP THIS FROM OTHER TECH COMPANIES LIKE AMD, THIS IS ANTI-COMPETITIVE BEHAVIOR!!!

Besides, THE TOP XTX CHIP WOULDN'T BE USED FOR A MOBILE PLATFORM ANYWAY. NO WAY IN HELL.
It's pretty interesting as Micron has been working on PAM4 signaling (the basis behind GDDR6X) since 2007. Most likely all Nvidia provided was help with the practical implementation and a promise to buy a ton of GDDR6X chips.
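As a toy illustration of why PAM4 matters for GDDR6X: it carries two bits per symbol instead of one, so the same symbol rate moves twice the data. The level mapping below is a simplified, Gray-coded example for illustration, not Micron's actual implementation:

```python
# Toy PAM4 encoder: map bit pairs to one of four signal levels (Gray-coded so adjacent levels differ by one bit).
PAM4_LEVELS = {
    (0, 0): -3,
    (0, 1): -1,
    (1, 1): +1,
    (1, 0): +3,
}

def pam4_encode(bits):
    """Encode an even-length bit sequence into PAM4 symbols (2 bits per symbol)."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    return [PAM4_LEVELS[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(pam4_encode([1, 0, 0, 1, 1, 1, 0, 0]))  # [3, -1, 1, -3] -> 8 bits carried in 4 symbols instead of 8 NRZ symbols
```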

But hey, this is Micron we are talking about here. They've been caught being anti-competitive in the past. I very much doubt they care about screwing AMD over if Nvidia has promised to buy a ton of their chips.
#36
medi01
Mouth of Sauron: Like 70% of games have exactly ZERO need for any RT - f00k, look at all those pixel-art games (the worst possible example) people like to play so much. Strategies. Simulations (except MS FS and such). Let's face it, it's mainly FPS technology. What about cartoonish graphics? 2D? Both may look *somewhat* better with RT lights/shadows, but how much is that, and how much is all of it worth to an average (-budget) player? Rasterized lights/shadows aren't catastrophically bad, and the whole low/mid market perhaps doesn't need it that much...
There are 3 major points missed by people hyped about NV's RT.

First, this is how things look on a 7870-level GPU (PS4, non-Pro):


reflections, shades, light effects, you name it, all in there.


Second, this is how things look in the demo of the latest version of the most popular game engine on the planet. Oh, and it uses none of the "hardware RT", even though it is present on the respective platform and even though hardware RT is supported by an even older version of the same engine:



Last but not least, people do not seem to know what "hardware RT" actually is and what it isn't.
If we trust DF's "deep dive" (in many parts, honestly, pathetic, but there aren't too many reviews of the kind to choose from), there are multiple steps involved in what is regarded as RT:

1) Creating the structure
2) Checking rays for intersections
3) Doing something with that (denoising, some temporal tricks for reflection, etc)

In this list, ONLY STEP 2 is hardware accelerated (a simplified sketch of the three stages follows below). This is why, while AMD likely beats NV in raw RT-ing power (especially with that uber Infinity Cache), most green-sponsored titles run poorly on AMD GPUs, as #1 and #3, which are not hardware-accelerated, are largely optimized for NV GPUs.

This also shows how "AMD is behind on RT" doesn't quite reflect which "RT" it is about (and why AMD's GPUs wipe the floor with green GPUs in, say, Dirt 5 RT).
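A toy CPU-side sketch of those three stages, just to make concrete which part the RT hardware actually accelerates; the functions and the simplified "scene" are hypothetical stand-ins, not a real DXR/Vulkan RT pipeline:

```python
# Toy sketch of the three RT stages described above: only stage 2 corresponds to what dedicated RT hardware accelerates.

def build_acceleration_structure(spheres):
    """Stage 1: build a spatial structure (a real engine builds a BVH; here we just sort by depth)."""
    return sorted(spheres, key=lambda s: s["z"])

def trace_rays(rays, structure):
    """Stage 2: intersection testing - the part hardware RT units accelerate."""
    hits = []
    for ray in rays:
        # take the first sphere whose extent contains the ray origin - a stand-in for ray/triangle tests
        hit = next((s for s in structure if abs(s["x"] - ray["x"]) <= s["r"]), None)
        hits.append(hit)
    return hits

def denoise_and_composite(hits):
    """Stage 3: denoising / temporal reuse / compositing - ordinary shader work, not RT-unit work."""
    return ["hit" if h else "miss" for h in hits]

scene = [{"x": 0.0, "z": 5.0, "r": 1.0}, {"x": 3.0, "z": 2.0, "r": 0.5}]
rays = [{"x": 0.2}, {"x": 2.0}]
print(denoise_and_composite(trace_rays(rays, build_acceleration_structure(scene))))  # ['hit', 'miss']
```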


Ultimately, tech fails to deliver on... pretty much any front:

Promise 1: "yet unseen graphics"
There are NO uber effects that we have not seen already, and to make it more insulting, even GoW on PS4 has 90%+ of all of them, despite using a pathetic GPU.

Promise 2: "mkay, we had those effects, but now it's so much easier to implement"
It is exactly the opposite. There is a lot of tinkering, most of it GPU-manufacturer-specific, AND it still has a major negative impact on performance.

CP2077 is an interesting example of it, with many "RT on" versions looking WORSE than "RT off".

Oh, and then there is "DLSS: The Hype For Brain Dead"... :D
#38
Fluffmeister
saikamaldoss: Either an HBM version, or the full-fat CU version with 7300+ cores? Hope so... good that scalpers emptied all that stock, which stopped me from making the move on a new GPU... Great news
Ahh, HBM. Now we are back to 256-bit GDDR already... hence the Fury!
#40
Chrispy_
medi01: Ultimately, [RT] tech fails to deliver on... pretty much any front:

Promise 1: "yet unseen graphics"
There are NO uber effects that we have not seen already, and to make it more insulting, even GoW on PS4 has 90%+ of all of them, despite using a pathetic GPU.

Promise 2: "mkay, we had those effects, but now it's so much easier to implement"
It is exactly the opposite. There is a lot of tinkering, most of it GPU-manufacturer-specific, AND it still has a major negative impact on performance.

CP2077 is an interesting example of it, with many "RT on" versions looking WORSE than "RT off".

Oh, and then there is "DLSS: The Hype For Brain Dead"... :D
I jumped on Turing at launch (didn't give a damn about RTX, just needed more raw performance), and having seen most, if not all, of the demos, benchmarks, and games with my own eyes, I am 100% with you on RTX being overhyped garbage.

Too many of the RTX on/off comparisons aren't fair comparisons; they're RTX vs. no effort at all, not RTX vs. the alternative shadow/reflection/illumination/transparency methods we've seen since DX11 engines became popular. Gimmicks like Quake II RTX or Minecraft are interesting tech demos, but that's not to say you couldn't also get very close facsimiles of their appearance at far higher framerates if developers put in the effort to import those assets into a modern game engine that supports SSAO, SSR, dynamic shadow maps, etc.

IMO, realtime raytracing may be the gold-standard for accuracy but it's also the least-efficient, dumbest, brute-force method that ends up being the least elegant solution with the worst possible performance of all the potential methods to render a scene's lighting.

DLSS and FidelityFX SR should not be dragged down with the flaws of raytracing, though. Sure, upscaling is a crutch that can go some way towards mitigating the inefficiencies of realtime raytracing, but that is shining praise for the technology: it can single-handedly undo most of the damage caused by the huge performance penalties of RT, and alongside VRS, I believe it is the technology that will let us continue increasing resolution without needing exponentially more GPU power to keep up. I have a 4K120 TV, and AAA games simply aren't going to run at 4K120 without help. 4K displays have been mainstream for years, and mass 8K adoption is looming on the horizon, no matter how pointless it may seem.
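A rough sketch of why upscaling helps here, assuming shading cost scales roughly with the number of rendered pixels (a simplification; fixed per-frame costs are ignored):

```python
# Rough pixel-count comparison between native rendering and upscaled rendering (DLSS / FidelityFX SR style).
def megapixels(width, height):
    return width * height / 1e6

native_4k    = megapixels(3840, 2160)  # ~8.3 MP shaded per frame at native 4K
internal_qhd = megapixels(2560, 1440)  # ~3.7 MP shaded per frame, then upscaled to 4K

print(f"Native 4K: {native_4k:.1f} MP, internal 1440p: {internal_qhd:.1f} MP")
print(f"Shading work ratio: ~{native_4k / internal_qhd:.2f}x")  # ~2.25x fewer pixels to shade per frame
```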