Sunday, October 18th 2020

AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

AMD's RDNA2-based cards are just around the corner, with the company's full debut of the secrecy-shrouded cards set for October 28th. Rumors of high clocks on AMD's new architecture - nothing more than unsubstantiated claims until now - have seemingly been confirmed, with Patrick Schur posting on Twitter some specifications for the upcoming RDNA2-based Navi 21 XT. Navi 21 XT is built on the big Navi 21 chip, but likely isn't the top performer from AMD - the company is allegedly working on a Navi 21 XTX solution, which ought to be exclusive to their reference designs, with higher clocks and possibly more CUs.

The specs outed by Patrick are promising, to say the least; that AMD's Big Navi can reach clocks in excess of 2.4 GHz with a 250 W+ TGP (quoted at around 255 W) is certainly good news. The 2.4 GHz (game clock) speeds are being associated with AIB cards; AMD's own reference designs should run at a more conservative 2.3 GHz. A memory pool of 16 GB of GDDR6 has also been confirmed. AMD's assault on the NVIDIA 30-series lineup should comprise three models carved from the Navi 21 chip - the higher-performance, AMD-exclusive XTX, the XT, and the lower-performance Navi 21 XL. All of these are expected to ship with the same 256-bit bus and 16 GB of GDDR6 memory, while taking advantage of AMD's (rumored, for now) Infinity Cache to make up for the narrower bus and lower memory speeds. Hold on to your hats; the hype train is going full speed ahead, and it should come to a smooth stop on October 28th.
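For a sense of how narrow that rumored memory subsystem is compared to NVIDIA's, here is a minimal back-of-the-envelope sketch. The 16 Gbps GDDR6 data rate for Navi 21 is an assumption (the leak only mentions the 256-bit bus and 16 GB capacity); the RTX 3080 figures are its published 320-bit, 19 Gbps GDDR6X configuration. This is the gap the rumored Infinity Cache would need to bridge.

```python
# Back-of-the-envelope memory bandwidth comparison (hedged sketch).
# Assumption: 16 Gbps GDDR6 for Navi 21 -- the leak only confirms a
# 256-bit bus and 16 GB of GDDR6, not the per-pin data rate.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

navi21 = bandwidth_gb_s(256, 16.0)   # assumed 16 Gbps GDDR6 -> 512 GB/s
rtx3080 = bandwidth_gb_s(320, 19.0)  # RTX 3080: 320-bit, 19 Gbps GDDR6X -> 760 GB/s

print(f"Navi 21 (rumored): {navi21:.0f} GB/s")
print(f"RTX 3080:          {rtx3080:.0f} GB/s")
```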
Sources: Patrick Schur @ Twitter, via Videocardz

229 Comments on AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

#76
INSTG8R
Vanguard Beta Tester
theoneandonlymrkDefinitely for mining, apparently! Rumoured!
Nah, I mean the terrible first Thicc they released that was basically useless at cooling...
Posted on Reply
#77
TheoneandonlyMrK
INSTG8RNah I mean the terrible first Thicc they released that as basically useless at cooling...
The thing looked like it should have cooled well; half-assed final build, not enough clamp pressure, and more besides.
Posted on Reply
#78
INSTG8R
Vanguard Beta Tester
theoneandonlymrkThe thing looked like it should have cooled well; half-assed final build, not enough clamp pressure, and more besides.
Yeah, but you have to give them credit, the replacement turned out to be an excellent card.
Posted on Reply
#79
renz496
theoneandonlymrkHow can a good card be bad for AMD? As XFX have shown, there's a market for all their old cards; the new ones, we'll see.

Unlike some, I think there's room for more profitable GPU vendors, not fewer.

Back in the dawn of 3D the viable market was small, so many a maker went under or was bought out; now two is far from enough for balance.
The problem is when NVIDIA starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under its wing, and yet it still makes less money than NVIDIA, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade-long price war with NVIDIA will only erode its profit margins further rather than win more of NVIDIA's market share.
Posted on Reply
#80
INSTG8R
Vanguard Beta Tester
renz496The problem is when NVIDIA starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under its wing, and yet it still makes less money than NVIDIA, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade-long price war with NVIDIA will only erode its profit margins further rather than win more of NVIDIA's market share.
Well, this is definitely their chance, with Nvidia's supply issues and inflated prices. If Big Navi is truly competitive and available, they have a chance to really take some market share back.
Posted on Reply
#81
TheoneandonlyMrK
renz496The problem is when NVIDIA starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under its wing, and yet it still makes less money than NVIDIA, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade-long price war with NVIDIA will only erode its profit margins further rather than win more of NVIDIA's market share.
Except with chips as big as Nvidia's on the latest nodes, with the very latest memory, all in tight supply, that's not going to happen; prices are going up, and have already gone up.
Posted on Reply
#82
Mussels
Freshwater Moderator
Hell, if it's really 250 W, it might be worth it over my 3080 pre-order...
Posted on Reply
#83
Cheeseball
Not a Potato
Still going to stick with my guess about the RX 6900 XT/6800 XT (whatever Navi 21 will be):

$599 and within 5% to 15% of the RTX 3080, just like the RX 5700 XT was to the 2070 Super. Enough RAM and optimized memory bandwidth (from the supposed Infinity Cache) to run games at 4K pretty well.

For sure it should best the RTX 2080 Ti and beat the upcoming RTX 3070 in 1080p/1440p performance.
Posted on Reply
#84
Mussels
Freshwater Moderator
fynxerYea, this guy must have been high on GPU dust for sure, and he thinks unicorns exist too.

Why this is BULLSH!T is because he is saying 2.4 MHz in base frequency which would make boost much higher than that. Not a chance in hell this is true.

GHz, not MHz... and he says game frequency, not base clock, which we don't know much about yet.
Posted on Reply
#85
Mysteoa
renz496the problem is when nvidia start retaliating to those price war. at one point AMD will not be able to sustain that price war. AMD sells CPU/APU, have major consoles under their wing and yet they still make less money than nvidia that only sales GPU (and some tegra). but ultimately AMD should understand doing price war for a decade with nvidia will only reduce their profit margin more rather than stealing more nvidia market share.
Nvidia makes more money because they have a bigger market share in the consumer and professional/server GPU markets, whereas AMD still has a small share of the CPU and GPU markets. They do make a decent amount of money from consoles, but they also have a huge debt that they are paying off. Let's not forget that Nvidia still has more employees than AMD. This kind of thing takes time; you can't compare a company that has been doing well all along to one that is recovering and expect the latter to be more profitable.
Posted on Reply
#86
Bruno_O
CheeseballStill going to stick with my guess about the RX 6900 XT/6800 XT (whatever Navi 21 will be):

$599 and within 5% to 15% of the RTX 3080, just like the RX 5700 XT was to the 2070 Super. Enough RAM and optimized memory bandwidth (from the supposed Infinity Cache) to run games at 4K pretty well.

For sure it should best the RTX 2080 Ti and beat the upcoming RTX 3070 in 1080p/1440p performance.
Recent benchmarks from Hardware Unboxed show the 5700 XT tied with the 2070 Super, both first-party cards.
AMD fine wine strikes again (new uArch, so of course big gains over time).
Posted on Reply
#87
rtwjunkie
PC Gaming Enthusiast
EarthDogNice... I'm imagining a 3080 competitor (within a few %, give or take) and cheaper. Sounds like a winner!
Depending on how true the rumor is, it certainly sounds competitive.
fynxerWhy this is BULLSH!T is because he is saying 2.4 MHz in base frequency which would make boost much higher than that.
Read it again. It says “can reach”, not base clocks.
Posted on Reply
#88
Cheeseball
Not a Potato
Bruno_ORecent benchmarks from Hardware Unboxed show the 5700 XT tied with the 2070 Super, both first-party cards.
AMD fine wine strikes again (new uArch, so of course big gains over time).
Still 5% to 15% behind in most games. It only beats it in COD Warzone and Borderlands.

Also I own a 5700 XT and it does match up.
Posted on Reply
#89
INSTG8R
Vanguard Beta Tester
CheeseballStill 5% to 15% behind in most games. It only beats it in COD Warzone and Borderlands.

Also I own a 5700 XT and it does match up.
For me it’s been a decent 1440 card so I’m aiming to get the 6700XT when the market settles and there’s a Nitro version. I’m not leaving 1440 anytime soon
Posted on Reply
#90
Camm
I would not have liked to be an early adopter of a 3080/3090, with what the scuttlebutt on AMD's performance is looking like.

Nvidia can't really make a Super/Ti variant, so their only option will be price cuts (depending on where AMD prices, of course, but I'm expecting AMD to want to recoup market share, and with a product with a likely lower BOM cost, now's the time).

Even then, with how expensive many of those coolers look to make, that'll eat seriously into Nvidia's margins.
Posted on Reply
#91
nguyen
Bruno_ORecent benchmakrs from HW unboxed show the 5700xt tied with the 2070 S, both first parties.
The AMD fine wine attacked again (new uArch so of course big gains over time).
It was benchmarked on a 3950X test rig with PCIe 4.0.
The 5700 XT gains a few % with PCIe 4.0 and the 2070 Super loses a few % on the 3950X vs the 10900K; some games even run very poorly on the 3950X.

Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same for the other (DLSS).

Now that Cyberpunk 2077 is about to release, I wonder what 5700 XT owners are gonna feel, though; imminent upgrade to Big Navi? The CP2077 hype train is going stronger and stronger every day :D
Posted on Reply
#92
Camm
nguyenAlso, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same for the other (DLSS).
PCIe applies regardless of the game.

DLSS 2.0 is still limited to a very small number of games. Until DLSS 2.0 (as, let's be honest, 1.0 is trash) gets good penetration across a variety of games, it's kinda ambiguous to include it in benchmarking.
Posted on Reply
#93
Totally
INSTG8RFor me it’s been a decent 1440 card so I’m aiming to get the 6700XT when the market settles and there’s a Nitro version. I’m not leaving 1440 anytime soon
Same but I'll probably skip over this gen
Posted on Reply
#94
INSTG8R
Vanguard Beta Tester
TotallySame but I'll probably skip over this gen
Well, I'm certainly done with flagships, so it's not so hard on the wallet, and I want to stay current for testing purposes. I want the performance uplift and the new features.
nguyenIt was benchmarked on a 3950X test rig with PCIe 4.0.
The 5700 XT gains a few % with PCIe 4.0 and the 2070 Super loses a few % on the 3950X vs the 10900K; some games even run very poorly on the 3950X.

Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use in any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same for the other (DLSS).

Now that Cyberpunk 2077 is about to release, I wonder what 5700 XT owners are gonna feel, though; imminent upgrade to Big Navi? The CP2077 hype train is going stronger and stronger every day :D
PCIe 4.0 will not make one iota of difference for a 5700 XT or a 3090... DLSS is a performance "trick" and would totally skew any benchmarks; it has to be like-for-like settings-wise to be a fair, equal benchmark.
Posted on Reply
#95
nguyen
CammPCIe applies regardless of the game.

DLSS 2.0 is still limited to a very small number of games. Until DLSS 2.0 (as, let's be honest, 1.0 is trash) gets good penetration across a variety of games, it's kinda ambiguous to include it in benchmarking.
If so, then HUB should have benched the 5700 XT with PCIe 3.0 just to be fair, no? Just go into the BIOS and choose PCIe 3.0.

Just saying, it's not really a fair comparison when you use a feature that is available on one card (PCIe 4.0) but refuse to do the same for the other (DLSS). It's not like RTX users are leaving DLSS 2.0 off in any game; 5700 XT owners, however, might have to use PCIe 3.0 to fix the black screen bug.
INSTG8RWell, I'm certainly done with flagships, so it's not so hard on the wallet, and I want to stay current for testing purposes. I want the performance uplift and the new features.
PCIe 4.0 will not make one iota of difference for a 5700 XT or a 3090... DLSS is a performance "trick" and would totally skew any benchmarks; it has to be like-for-like settings-wise to be a fair, equal benchmark.
Not really; you can check HUB's PCIe 3.0 vs 4.0 benchmarks, where even the 5700 XT can gain 5% with PCIe 4.0.

Every optimization is a performance trick; do you really care when it gives the same image quality?
If no one explained how DLSS works, you would just consider it an optimization like anything else.
Posted on Reply
#96
Camm
nguyenJust saying, it's not really a fair comparison when you use a feature that is available on one card (PCIe 4.0) but refuse to do the same for the other (DLSS)
I got what you were saying; I said that DLSS, unlike PCIe, isn't available across all titles. Considering the choice of games to benchmark is also subjective, should HWU remove DLSS titles from their benchmarking suite?

Besides, at the end of every benchmark I've seen from HWU of late, they have included DLSS numbers separately in the same review, so people can work out for themselves whether they would prefer the performance at an upscaled resolution. DLSS is very good but, by its nature, isn't lossless.
Posted on Reply
#97
INSTG8R
Vanguard Beta Tester
nguyenIf so, then HUB should have benched the 5700 XT with PCIe 3.0 just to be fair, no? Just go into the BIOS and choose PCIe 3.0.

Just saying, it's not really a fair comparison when you use a feature that is available on one card (PCIe 4.0) but refuse to do the same for the other (DLSS). It's not like RTX users are leaving DLSS 2.0 off in any game; 5700 XT owners, however, might have to use PCIe 3.0 to fix the black screen bug.
Again, find me one single benchmark or test (W1zzard has done the tests) where 4.0 makes any difference in performance. You can't; it's a non-issue with GPUs, and 4.0 is only beneficial to NVMe drives. DLSS is a performance trick, and it's technically not actually running at the resolution it's being benched at. You are comparing apples and pumpkins here. One does nothing to affect performance to anyone's advantage; the other is absolutely a method to gain performance. If it can't win without "tricks" on, then it just can't win on outright performance, which is what benchmarks are measuring.

Edit: feel free to check this for any kind of advantage PCIe 4.0 offers...
www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/
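For reference on the PCIe 3.0 vs 4.0 side of this argument, here is a rough sketch of the theoretical x16 link bandwidth per direction (after 128b/130b line coding); real game traffic rarely saturates either link, which is why the measured deltas in reviews like the one linked above tend to be small.

```python
# Theoretical peak PCIe x16 bandwidth per direction, after 128b/130b line coding.
# Rough sketch for context; real game traffic rarely saturates either link.

def pcie_x16_gb_s(transfer_rate_gt_s: float, lanes: int = 16) -> float:
    """GT/s per lane -> GB/s across the whole link (one direction)."""
    return transfer_rate_gt_s * (128 / 130) * lanes / 8

print(f"PCIe 3.0 x16: {pcie_x16_gb_s(8.0):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: {pcie_x16_gb_s(16.0):.1f} GB/s")  # ~31.5 GB/s
```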
Posted on Reply
#98
nguyen
INSTG8RAgain, find me one single benchmark or test (W1zzard has done the tests) where 4.0 makes any difference in performance. You can't; it's a non-issue with GPUs, and 4.0 is only beneficial to NVMe drives. DLSS is a performance trick, and it's technically not actually running at the resolution it's being benched at. You are comparing apples and pumpkins here. One does nothing to affect performance to anyone's advantage; the other is absolutely a method to gain performance. If it can't win without "tricks" on, then it just can't win on outright performance, which is what benchmarks are measuring.
So you go to an F1 race and demand that every team must spend the same time on every optimization? Like tire changes must all take the same amount of time? Everyone must use the same fuel additives?
It's the end result that matters; as long as image quality is equal, people couldn't care less about any optimization.
Work smarter, not harder :D

Posted on Reply
#99
INSTG8R
Vanguard Beta Tester
nguyenSo you go to an F1 race and demand that every team must spend the same time on every optimization? Like tire changes must all take the same amount of time? Everyone must use the same fuel additives?
It's the end result that matters; as long as image quality is equal, people couldn't care less about any optimization.

Uh, again totally off track with F1, yes, pun intended. F1 has a strict set of rules that all cars must abide by, from ride heights to fuel tank size, and yes, they all run the same fuel. You're asking for what is basically a cheat to be included as a valid benchmark. That would be like one driver getting an extra turbo. Benchmarking is about comparing cards under equal settings and equal conditions; DLSS is the furthest thing from equal conditions. By your logic every card's 4K results are accurate, while DLSS is actually 720p upscaled to 4K. How you can even consider that an equal bench result is truly laughable...
Posted on Reply
#100
nguyen
INSTG8RUh, again totally off track with F1, yes, pun intended. F1 has a strict set of rules that all cars must abide by, from ride heights to fuel tank size, and yes, they all run the same fuel. You're asking for what is basically a cheat to be included as a valid benchmark. That would be like one driver getting an extra turbo. Benchmarking is about comparing cards under equal settings and equal conditions; DLSS is the furthest thing from equal conditions. By your logic every card's 4K results are accurate, while DLSS is actually 720p upscaled to 4K. How you can even consider that an equal bench result is truly laughable...
Well, if 720p upscaled to 4K has equal IQ to native 4K, I say go for it.
Every hardware editorial seems to agree that if DLSS is available, use it. They didn't say anything about trading visual quality for performance (at least with DLSS in Quality mode).

The only thing with DLSS is that it's unfair to AMD, who has no answer yet. But it's a competition out there, not some fun race between friends.

Let's think of it this way: while AMD continues to optimize rasterization performance for their GPUs, Nvidia's effort goes into incorporating DLSS into new games. But somehow benching DLSS is too unfair to AMD; it makes no fricking sense.

Also, equal settings and equal conditions were never actually equal; some games just have optimizations for a specific brand, like Far Cry 5 featuring rapid packed math.
Posted on Reply