
AMD Radeon Navi 21 XTXH Variant Spotted, Another Flagship Graphics Card Incoming?

My strongest candidate is a laptop version, heavily cut down (NVIDIA is doing the same).

I strongly doubt any memory-related change unless it was engineered in from the start as a ready design. The reason is that they deliberately opted for GDDR6 (not 'X', not HBM of any kind) and added that large on-chip cache to compensate. More memory means more cache or a different bus, so different silicon. A different memory type means higher cost, and the whole cache approach then becomes of questionable value (I guess they tested it extensively already, so it's possible but not probable).

A related guess is that the whole top of the line is mostly finished; there can be better binning or something like that, a price correction (highly needed), and the whole RDNA2 train moves toward the (desperately) needed mid/low range, APUs too. I don't know whether a lithography refresh like Zen to Zen+ is easy to do; if it is, then maybe that - but not NOW, maybe in Q4 2021...

Ampere and RDNA2 are new products with unfinished lineups; both companies will surely look to recoup their expensive R&D investments, so there is no chance anyone goes crazy and launches a new generation this year. AMD got what they wanted, similar to Zen 1 vs. Intel - not besting them in every scenario, but being competitive and a bit cheaper. NVIDIA still has Jensen's stupid 'fastest overall' crown (they invested huge money in the past in similar stupid cards which nobody sane bought), and what I'd dub the 'equally idiotic RT crown' (being a first adopter among buyers means either willing enthusiasm and supporting new tech because someone has to, OR being too rich or too uninformed, OR being desperate to play CP2077 or whatever four other games support it).

[Yeah, I was connected with or worked in that (rendering) field for three decades. I still read the news on professional/dedicated forums. It's a hobby now, but my knowledge is still well above average, if I may say so.]

Something like 70% of games have exactly ZERO need for any RT - f00k, look at all those pixel-art games (the worst possible example) people like to play so much. Strategy games. Simulations (except MS FS and the like). Let's face it, it's mainly FPS technology. What about cartoonish graphics? 2D? Both may look *somewhat* better with RT lights/shadows, but by how much, and how much is all of it worth to an average(-budget) player? Rasterized lights and shadows aren't catastrophically bad, and the whole low/mid market perhaps doesn't need it that much...

My (posted, initial) opinion was that GOOD RT needs at least 5 years. But some guy from the gaming industry said that real-time PHOTOREALISTIC gaming is 10 years away; I guess he knows it much better than I do. Also, RT isn't equal to photorealism at all.

Also, who wants ALL games to be photorealistic? Not all players, for sure.

[I've skipped all tech details. I wanted to write an easy to understand, general article about RT - but the time...]

Back to the topic - NVIDIA has what they want now, and AMD has it, too. My opinion is that high-end improvements will come with better lithography - 6 nm, 5 nm, or smaller; perhaps MCM - for both (and perhaps Intel, hahahaha).

It's not like any GPU producer will invest soooo much THIS year, and probably not next year either.

So, perhaps binning, relatively small improvements - perhaps this year, larger ones next or even later... There. My opinion. For both.

Oh, and NVIDIA knows much more about RT (and photorealism) than they advertise right now. There is a good ebook about it on the NVIDIA site - yup, I've read it, and nothing in it disproves what I said here, or before. F00k, they have top-level 'guys' in the field - it would be weird otherwise; RT has existed for longer than I've been in it... True photorealism likely requires VR, because you need to track eyeball movement to follow the focus, just to name one technical detail...
 
XT - more power
XH - extra Hot ;)
 
Three weeks ago I got a 6900 XT. After two weeks I gave up. I bought new cables and used DDU and CRU several times; it didn't work well. The card did not work properly with my LG C9: only 4K@60 worked, Freesync didn't, VRR didn't, HDR didn't, and 4K@120Hz@8bit+dithering worked for one evening with black screens every 5 seconds and microstuttering at its finest. The next day the TV showed "No Signal" until I dropped down to 60 Hz. While I was playing on the TV, my monitor showed some weird flickering, maybe what's called Freesync flickering. It is a Dell S3220DGF with FreeSync 2 HDR. Freesync only worked in windowed mode; in fullscreen there was so much flickering that it was unusable. I was so hyped about AMD, and so disappointed with Nvidia - the paper launch, the customer politics (Hardware Unboxed), the CUDA core "lying" - but with Nvidia everything works properly right from the start. Perhaps the HDMI 2.1 port on the AMD card is the reason for this mess, because it is only 40 Gbps while the LG C9 has the full 48 Gbps (see the rough bandwidth sketch below). On the other hand, that shouldn't affect the S3220DGF over a DP 1.4 cable (which I had to buy to get 165 Hz out of it, because Dell wanted to save 10 cents and only ships a DP 1.2 cable with this monitor).
Now I'm waiting for the 3080ti.
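
As a rough aside on the 40 Gbps vs. 48 Gbps point above (my own back-of-the-envelope sketch, not something from the post): uncompressed pixel data rate is just width × height × refresh × bits per pixel, ignoring blanking and FRL encoding overhead, which push the real link requirement somewhat higher.

```python
# Back-of-the-envelope sketch (my own numbers, not a measurement): raw pixel
# data rate = width x height x refresh x bits per pixel. Real HDMI 2.1 links
# also carry blanking and FRL encoding overhead, so the actual requirement is
# somewhat higher than these figures.

def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

modes = [
    ("4K 60 Hz, 8-bit RGB",   3840, 2160, 60, 8),
    ("4K 120 Hz, 8-bit RGB",  3840, 2160, 120, 8),
    ("4K 120 Hz, 10-bit RGB", 3840, 2160, 120, 10),
    ("4K 120 Hz, 12-bit RGB", 3840, 2160, 120, 12),
]

for name, w, h, hz, bpc in modes:
    # Compare against the 40 Gbps (RX 6000 series) and 48 Gbps (LG C9) links
    # mentioned above, remembering the extra overhead on a real link.
    print(f"{name}: ~{pixel_data_rate_gbps(w, h, hz, bpc):.1f} Gbps of pixel data")
```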
 
The Navi 21 XTXHash edition, for all the miners out there!
 
I have yet to find current-gen AMD offerings in either the CPU or GPU segment available to order at rated prices or anywhere close to them. It has just been a paper-launch year. From all companies. Very disappointing.
Yeah, it's so bad. I finally gave in and bought a 5950X for 1099, but I'm hoping I can get a 3080 Ti or Super for the actual price.
 
Either an HBM version or the full-fat CU version with 7300+ cores? Hope so... good thing scalpers emptied all that stock and stopped me from making the move on a new GPU... Great news.
The Radeon™ RX 6900 XT has the full Navi 21 chip.
 
I'm expecting a higher binned hydro cooled 6900XT
 
The H does not stand for Mobile when it comes to AMD. Their mobile lineup always has an M prefix when it comes to GPUs, so this could be an HBM 2.0 card.
For some reason a reply I gave has not shown up. Must've been a client-side issue, or a PEBKAC.

As I suggested in my first comment, I'm referring to the H-series CPUs. I'm deducing the XTXH would be an XTX-class GPU coupled with an H-series CPU (i.e. high-performance mobile). As mobile parts go, AMD does add the "m" suffix to their retail branding, but XTX is not a retail name (it was once, a long time ago, but not anymore); XTX identifies the die family it belongs to.

At 27W I can't say it's a 3090-killer, as this looks like the power envelope for a mobile GPU.
 
Pretty sure Micron and Nvidia teamed up for GDDR6X. Not sure if it's open to others.
Pretty bullshit / anti-competitive if you ask me. Nvidia/Micron should be PENALIZED BY THE FTC for this anti-competitive behavior. This is BULLSHIT!! THEY CAN'T KEEP THIS FROM OTHER TECH COMPANIES LIKE AMD, THIS IS ANTI-COMPETITIVE BEHAVIOR!!!

For some reason a reply I gave has not shown up. Must've been a client-side issue, or a PEBKAC.

As I suggested in my first comment, I'm referring to the H-series CPUs. I'm deducing the XTXH would be an XTX-class GPU coupled with an H-series CPU (i.e. high-performance mobile). As mobile parts go, AMD does add the "m" suffix to their retail branding, but XTX is not a retail name (it was once, a long time ago, but not anymore); XTX identifies the die family it belongs to.

At 27W I can't say it's a 3090-killer, as this looks like the power envelope for a mobile GPU.
Besides, THE TOP XTX CHIP WOULDN'T BE USED FOR A MOBILE PLATFORM ANYWAY. NO WAY IN HELL.
 
Pretty bullshit / anti-competitive if you ask me. Nvidia/Micron should be PENALIZED BY THE FTC for this anti-competitive behavior. This is BULLSHIT!! THEY CAN'T KEEP THIS FROM OTHER TECH COMPANIES LIKE AMD, THIS IS ANTI-COMPETITIVE BEHAVIOR!!!


Besides, THE TOP XTX CHIP WOULDN'T BE USED FOR A MOBILE PLATFORM ANYWAY. NO WAY IN HELL.

It's pretty interesting, as Micron has been working on PAM4 signaling (the basis behind GDDR6X, illustrated briefly below) since 2007. Most likely all Nvidia provided was help with the practical implementation and a promise to buy a ton of GDDR6X chips.

But hey, this is Micron we are talking about here. They've been caught being anti-competitive in the past. I very much doubt they care about screwing AMD over if Nvidia has promised to buy a ton of their chips.
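
For anyone curious what PAM4 signaling actually means (a minimal illustration of my own, not GDDR6X's exact line coding): PAM4 uses four amplitude levels to carry two bits per symbol, whereas plain NRZ/PAM2 carries one bit per symbol, so PAM4 doubles throughput at the same symbol rate.

```python
# Minimal illustration (assumption for clarity, not GDDR6X's actual line
# coding): PAM4 maps each pair of bits onto one of four amplitude levels,
# so it carries 2 bits per symbol versus 1 bit per symbol for NRZ/PAM2.

# Gray-coded mapping of 2-bit pairs to the four PAM4 levels.
PAM4_LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_encode(bits):
    """Encode an even-length bit sequence into PAM4 symbols."""
    pairs = zip(bits[0::2], bits[1::2])
    return [PAM4_LEVELS[pair] for pair in pairs]

bits = [1, 0, 1, 1, 0, 0, 0, 1]
symbols = pam4_encode(bits)
print(symbols)  # 4 symbols carry all 8 bits
print(f"{len(bits)} bits in {len(symbols)} symbols; NRZ would need {len(bits)} symbols")
```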
 
Something like 70% of games have exactly ZERO need for any RT - f00k, look at all those pixel-art games (the worst possible example) people like to play so much. Strategy games. Simulations (except MS FS and the like). Let's face it, it's mainly FPS technology. What about cartoonish graphics? 2D? Both may look *somewhat* better with RT lights/shadows, but by how much, and how much is all of it worth to an average(-budget) player? Rasterized lights and shadows aren't catastrophically bad, and the whole low/mid market perhaps doesn't need it that much...

There are 3 major points missed by people hyped about NV's RT.

First, this is how things look on a 7870-level GPU (the base, non-Pro PS4):

[embedded media]

Reflections, shadows, light effects - you name it, all in there.


Second, this is how things look in the demo of the latest version of the most popular game engine on the planet - and it uses none of the "hardware RT", even though it is present on the respective platform and even though hardware RT is supported by an even older version of the same engine:

[embedded media]

Last, but not least, people do not seem to know what "hardware RT" actually is and what it isn't.
If we trust the DF "deep dive" (in many parts, honestly, pathetic, but there aren't many reviews of that kind to choose from), there are multiple steps involved in what is regarded as RT:

1) Building the acceleration structure
2) Checking rays for intersections
3) Doing something with the results (denoising, some temporal tricks for reflections, etc.)

In this list, ONLY STEP 2 is hardware accelerated. This is why, while AMD likely beats NV in raw ray-intersection power (especially with that uber Infinity Cache), most green-sponsored titles run poorly on AMD GPUs: steps 1 and 3, while having little to do with the RT hardware itself, are largely optimized for NV GPUs.
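
To make those three stages concrete, here is a minimal, purely illustrative toy sketch (my own, not taken from DF or any actual driver or engine): stage 1 builds a trivial "acceleration structure", stage 2 does the ray-intersection tests (the only stage dedicated RT hardware accelerates), and stage 3 stands in for the denoising/temporal post-processing.

```python
# Toy illustration of the three stages listed above (my own sketch, not
# production code): 1) build an acceleration structure, 2) test rays for
# intersections (the only stage dedicated RT hardware accelerates),
# 3) post-process the raw result (denoising / temporal tricks in real engines).

import math

def build_structure(spheres):
    """Stage 1: a trivial stand-in for a BVH - spheres sorted by depth so the
    traversal below can stop at the first hit."""
    return sorted(spheres, key=lambda s: s["center"][2])

def trace(ray_origin, ray_dir, structure):
    """Stage 2: ray/sphere intersection tests - the hardware-accelerated part."""
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_dir
    for sphere in structure:
        cx, cy, cz = sphere["center"]
        # Solve |o + t*d - c|^2 = r^2 for t (quadratic in t; a == 1 for unit d).
        lx, ly, lz = ox - cx, oy - cy, oz - cz
        b = 2.0 * (dx * lx + dy * ly + dz * lz)
        c = lx * lx + ly * ly + lz * lz - sphere["radius"] ** 2
        disc = b * b - 4.0 * c
        if disc >= 0.0 and (-b - math.sqrt(disc)) / 2.0 > 0.0:
            return sphere["brightness"]  # nearest hit wins in this toy traversal
    return 0.0  # ray missed everything: background

def post_process(samples):
    """Stage 3: stand-in for denoising - average each sample with its neighbours."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

spheres = [{"center": (0.0, 0.0, 5.0), "radius": 1.0, "brightness": 1.0}]
structure = build_structure(spheres)                        # stage 1
row = [trace((x * 0.3 - 1.5, 0.0, 0.0), (0.0, 0.0, 1.0), structure)
       for x in range(11)]                                  # stage 2
print(post_process(row))                                    # stage 3
```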

This also shows how "AMD is behind on RT" doesn't quite say which part of RT it is about (and why AMD's GPUs wipe the floor with green GPUs in, say, Dirt 5's RT).


Ultimately, the tech fails to deliver on... pretty much any front:

Promise 1: "yet unseen graphics"
There are NO uber effects that we have not already seen, and to make it more insulting, even GoW on the PS4 has 90%+ of all of them, despite running on a pathetic GPU.

Promise 2: "mkay, we had those effects, but now it's so much easier to implement"
It is exactly the opposite: there is a lot of tinkering, most of it GPU-manufacturer specific, AND it still has a major negative impact on performance.

CP2077 is an interesting example of this, with many "RT on" scenes looking WORSE than "RT off".

Oh, and then there is "DLSS: The Hype For Brain Dead"... :D
 
Either an HBM version or the full-fat CU version with 7300+ cores? Hope so... good thing scalpers emptied all that stock and stopped me from making the move on a new GPU... Great news.

Ahh HBM, now we are back to 256bit GDDR already... hence the Fury!
 
Sapphire Nitro RX 6900 XT SE

[attached images: Techpowerup.gif, 2745 clock pic.gif]
 
Ultimately, the [RT] tech fails to deliver on... pretty much any front:

Promise 1: "yet unseen graphics"
There are NO uber effects that we have not already seen, and to make it more insulting, even GoW on the PS4 has 90%+ of all of them, despite running on a pathetic GPU.

Promise 2: "mkay, we had those effects, but now it's so much easier to implement"
It is exactly the opposite: there is a lot of tinkering, most of it GPU-manufacturer specific, AND it still has a major negative impact on performance.

CP2077 is an interesting example of this, with many "RT on" scenes looking WORSE than "RT off".

Oh, and then there is "DLSS: The Hype For Brain Dead"... :D
I jumped on Turing at launch (didn't give a damn about RTX, just needed more raw performance), and having seen most, if not all, of the demos, benchmarks, and games with my own eyes, I am 100% with you on RTX being overhyped garbage.

Too many of the RTX on/off comparisons aren't fair comparisons; they're RTX vs. no effort at all, not RTX vs. the alternative shadow/reflection/illumination/transparency methods we've seen since DX11 engines became popular. Gimmicks like Quake II RTX or Minecraft are interesting tech demos, but that's not to say you couldn't also get very close facsimiles of their appearance at far higher framerates if developers put in the effort to import those assets into a modern game engine that supports SSAO, SSR, dynamic shadow maps, etc.

IMO, realtime raytracing may be the gold standard for accuracy, but it's also the dumbest, least-efficient, brute-force approach - the least elegant solution, with the worst performance of all the potential methods of rendering a scene's lighting.

DLSS and FidelityFX SR should not be dragged down with the flaws of raytracing, though. Sure, upscaling is a crutch that can go some way towards mitigating the inefficiencies of realtime raytracing, but that is shining praise for the technology: it can single-handedly undo most of the damage caused by RT's huge performance penalties, and alongside VRS, I believe it is the technology that will let us keep increasing resolution without needing exponentially more GPU power to keep up. I have a 4K120 TV, and AAA games simply aren't going to run at 4K120 without help. 4K displays have been mainstream for a decade, and mass 8K adoption is looming on the horizon, no matter how pointless it may seem.
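
As a rough illustration of why upscaling matters here (my own arithmetic, not from the post): shading cost scales roughly with pixel count, so rendering internally below native resolution and letting DLSS or FidelityFX SR reconstruct the final image saves a large share of the GPU work.

```python
# Quick pixel-count arithmetic (illustrative only, my own numbers): shading
# work scales roughly with pixel count, so rendering internally at a lower
# resolution and upscaling to the display's native resolution saves a lot
# of GPU work.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

native_4k_pixels = resolutions["4K"][0] * resolutions["4K"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} megapixels "
          f"({pixels / native_4k_pixels:.2f}x the pixels of native 4K)")
```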
 