
AMD Navi 21 XT Seemingly Confirmed to Run at ~2.3, 2.4 GHz Clock, 250 W+

Yeah, talking about all these “Navi issues”, XFX really didn’t help with their release... thankfully they came out with a redesign that is now one of the better regarded cards.
Definitely for mining, apparently! Rumoured!
 
Nah, I mean the terrible first Thicc they released that was basically useless at cooling...
The thing looked like it should have cooled well; half-assed final build, not enough clamp pressure, and more besides.
 
The thing looked like it should have cooled well; half-assed final build, not enough clamp pressure, and more besides.
Yeah, but you have to give them credit: the replacement turned out to be an excellent card.
 
How can a good card be bad for AMD? As XFX have shown, there's a market for all their old cards; the new ones, we'll see.

Unlike some, I think there's room for more profitable GPU vendors, not fewer.

Back in the dawn of 3D the viable market was small, so many a maker went under or was bought out; now two is far from enough for balance.

The problem is when Nvidia starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margin further rather than winning them more of Nvidia's market share.
 
The problem is when Nvidia starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margin further rather than winning them more of Nvidia's market share.
Well, this is definitely their chance, with Nvidia's supply issues and inflated prices. If Big Navi is truly competitive and available, they have a chance to really take some market share back.
 
The problem is when Nvidia starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margin further rather than winning them more of Nvidia's market share.
Except with chips as big as Nvidia's on the latest nodes, with the very latest memory, all in tight supply, that's not going to happen; prices are going up, have gone up.
 
Hell, if it's really 250 W, it might be worth it over my 3080 pre-order...
 
Still going to stick with my guess about the RX 6900 XT/6800 XT (whatever Navi 21 will be):

$599 and within 5% to 15% of the RTX 3080, just like how the RX 5700 XT was to the 2070 Super. Enough RAM and optimized memory bandwidth (from the supposed Infinity Cache) to run games at 4K pretty well.

For sure it should best the RTX 2080 Ti and beat the upcoming RTX 3070 in 1080p/1440p performance.
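A quick back-of-the-envelope sketch (Python, purely illustrative) of what that guess would mean for value, assuming the RTX 3080's $699 MSRP against the speculated $599; the 85-95% figures just mirror the 5-15% gap above, not any real benchmark:

```python
# Rough perf-per-dollar comparison for the rumored Navi 21 card vs the RTX 3080.
# All inputs are assumptions: $699 3080 MSRP, the guessed $599 price,
# and 85-95% relative performance (i.e. 5-15% behind).
rtx_3080_price, rtx_3080_perf = 699, 1.00

for rel_perf in (0.85, 0.90, 0.95):
    navi_price = 599
    value_ratio = (rel_perf / navi_price) / (rtx_3080_perf / rtx_3080_price)
    print(f"At {rel_perf:.0%} of a 3080: {value_ratio:.2f}x the performance per dollar")
```

By that math the $100 saving only becomes a clear value win if the gap lands near the 5% end; at 15% behind it's roughly a wash on perf per dollar.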
 
Yeah, this guy must have been high on GPU dust for sure, and he thinks unicorns exist too.

Why this is BULLSH!T is because he is saying 2.4 MHz base frequency, which would make the boost much higher than that. Not a chance in hell this is true.

[Image: unicorn-puke1.jpg]

GHz, not MHz... and he says game frequency, not base clock, which we don't know much about yet.
 
The problem is when Nvidia starts retaliating in that price war; at some point AMD will not be able to sustain it. AMD sells CPUs/APUs and has the major consoles under their wing, and yet they still make less money than Nvidia, which only sells GPUs (and some Tegra). Ultimately AMD should understand that a decade of price war with Nvidia will only reduce their profit margin further rather than winning them more of Nvidia's market share.

Nvidia makes more money because they have a bigger market share in the consumer and professional/server GPU markets, whereas AMD still has a small share of the CPU and GPU markets. They do make a decent amount of money from consoles, but they also have a huge debt that they are paying off. Let's not forget that Nvidia still has more employees than AMD. This thing takes time; you can't compare a company that has been doing well all along to one that is recovering and expect it to be more profitable than the first one.
 
Still going to stick with my guess about the RX 6900 XT/6800 XT (whatever Navi 21 will be):

$599 and within 5% to 15% of the RTX 3080, just like how the RX 5700 XT was to the 2070 Super. Enough RAM and optimized memory bandwidth (from the supposed Infinity Cache) to run games at 4K pretty well.

For sure it should best the RTX 2080 Ti and beat the upcoming RTX 3070 in 1080p/1440p performance.

Recent benchmarks from Hardware Unboxed show the 5700 XT tied with the 2070 Super, both reference models.
AMD fine wine strikes again (new uArch, so of course big gains over time).
 
Nice... I'm imagining a 3080 competitor (within a few %, give or take) and cheaper. Sounds like a winner!
Depending on how true the rumor is, it certainly sounds competitive.

Why this is BULLSH!T is because he is saying 2.4 MHz base frequency, which would make the boost much higher than that.
Read it again. It says “can reach”, not base clocks.
 
Recent benchmarks from Hardware Unboxed show the 5700 XT tied with the 2070 Super, both reference models.
AMD fine wine strikes again (new uArch, so of course big gains over time).

Still 5% to 15% behind in most games. It only beats it in COD Warzone and Borderlands.

Also I own a 5700 XT and it does match up.
 
Still 5% to 15% behind in most games. It only beats it in COD Warzone and Borderlands.

Also I own a 5700 XT and it does match up.
For me it's been a decent 1440p card, so I'm aiming to get the 6700 XT when the market settles and there's a Nitro version. I'm not leaving 1440p anytime soon.
 
I would not have liked to have been an early adopter of a 3080/3090, with what the scuttlebutt on AMD's performance is looking like.

They can't really make a Super/Ti variant, so Nvidia's only option will be price cuts (depending on where AMD prices it, of course, but I'm expecting AMD to want to recoup market share, and with a product with a likely lower BOM cost, now's the time).

Even then, with how expensive many of those coolers look to make, that'll eat seriously into Nvidia's margins.
 
Recent benchmarks from Hardware Unboxed show the 5700 XT tied with the 2070 Super, both reference models.
AMD fine wine strikes again (new uArch, so of course big gains over time).

It was benchmarked on a 3950X test rig with PCIe 4.0.
The 5700 XT gains a few % with PCIe 4.0, and the 2070 Super loses a few % on the 3950X vs the 10900K; some games even run very poorly on the 3950X.

Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use on any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same with the other (DLSS).

Now that Cyberpunk 2077 is about to release, I wonder how 5700 XT owners are gonna feel though, an imminent upgrade to Big Navi? The CP2077 hype train is going stronger and stronger every day :D
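To put rough numbers on those "few %" swings (a hypothetical sketch, not HUB's actual data), here's how a small gain on one card and a small loss on the other can move the measured gap by their sum:

```python
# Hypothetical illustration of how the test rig shifts a GPU comparison.
# The FPS figures and the 3% swings are made up for the example.
baseline_5700xt = 100.0   # FPS on a PCIe 3.0 / 10900K rig (assumed)
baseline_2070s  = 105.0   # FPS on the same rig (assumed)

fps_5700xt = baseline_5700xt * 1.03   # +3% from PCIe 4.0
fps_2070s  = baseline_2070s  * 0.97   # -3% from running on the 3950X

print(f"Gap on the baseline rig: {baseline_2070s / baseline_5700xt - 1:+.1%}")
print(f"Gap on the review rig:   {fps_2070s / fps_5700xt - 1:+.1%}")
```

With those made-up numbers a 5% lead for the 2070 Super turns into a ~1% deficit, which is how "5-15% behind" and "tied" can both be honest readings of different test setups.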
 
Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use on any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same with the other (DLSS).

PCIe runs regardless of game.

DLSS 2.0 is still currently limited to a very small number of games. If/once DLSS 2.0 (as, let's be honest, 1.0 is trash) does get good penetration across a variety of games, it's still kinda ambiguous whether to include it in benchmarking.
 
For me it's been a decent 1440p card, so I'm aiming to get the 6700 XT when the market settles and there's a Nitro version. I'm not leaving 1440p anytime soon.

Same but I'll probably skip over this gen
 
Same but I'll probably skip over this gen
Well, I'm certainly done with flagships, so it's not so hard on the wallet, and I want to stay current for testing purposes. I want the performance uplift and the new features.

It was benchmarked on a 3950X test rig with PCIe 4.0.
The 5700 XT gains a few % with PCIe 4.0, and the 2070 Super loses a few % on the 3950X vs the 10900K; some games even run very poorly on the 3950X.

Also, HUB refuses to include DLSS 2.0 results, which RTX users will gladly use on any title that supports it.
Kinda unfair to use a feature available on one card (PCIe 4.0) but refuse to do the same with the other (DLSS).

Now that Cyberpunk 2077 is about to release, I wonder how 5700 XT owners are gonna feel though, an imminent upgrade to Big Navi? The CP2077 hype train is going stronger and stronger every day :D
PCIe 4.0 will not make one iota of difference for a 5700 XT or a 3090... DLSS is a performance ”trick” and would totally skew any benchmarks; it has to be like-for-like settings-wise to be a fair, equal benchmark.
 
PCIe runs regardless of game.

DLSS 2.0 is still currently limited to a very small number of games. If/once DLSS 2.0 (as, let's be honest, 1.0 is trash) does get good penetration across a variety of games, it's still kinda ambiguous whether to include it in benchmarking.

If so, then HUB should have benched the 5700 XT with PCIe 3.0 just to be fair, no? Just go into the BIOS and choose PCIe 3.0.

Just saying, it's not really a fair comparison when you use a feature that is available on one (PCIe 4.0) but refuse to do the same for the other (DLSS). It's not like RTX users are leaving DLSS 2.0 OFF in any game; 5700 XT owners, however, might have to use PCIe 3.0 to fix the black screen bug.

Well, I'm certainly done with flagships, so it's not so hard on the wallet, and I want to stay current for testing purposes. I want the performance uplift and the new features.
PCIe 4.0 will not make one iota of difference for a 5700 XT or a 3090... DLSS is a performance ”trick” and would totally skew any benchmarks; it has to be like-for-like settings-wise to be a fair, equal benchmark.

Not really; you can check HUB's PCIe 3.0 vs 4.0 benchmarks, even the 5700 XT can gain 5% with PCIe 4.0.

Every optimization is a performance trick; do you really care when it gives the same image quality?
If no one explained how DLSS works, you would just consider it an optimization like anything else.
 
Just saying, it's not really a fair comparison when you use a feature that is available on one (PCIe 4.0) but refuse to do the same for the other (DLSS)

I got what you were saying; I said that DLSS, unlike PCIe, isn't available across all titles. Considering that the choice of games to benchmark is also subjective, should HWU remove DLSS titles from their benchmarking suite?

Besides, at the end of every benchmark I've seen from them of late, HWU have included DLSS numbers separately in the same review, so people can work out for themselves whether they would prefer the performance at an upscaled resolution. DLSS is very good, but by its nature it isn't lossless.
 
If so, then HUB should have benched the 5700 XT with PCIe 3.0 just to be fair, no? Just go into the BIOS and choose PCIe 3.0.

Just saying, it's not really a fair comparison when you use a feature that is available on one (PCIe 4.0) but refuse to do the same for the other (DLSS). It's not like RTX users are leaving DLSS 2.0 OFF in any game; 5700 XT owners, however, might have to use PCIe 3.0 to fix the black screen bug.
Again, find me one single benchmark or test (W1zzard has done the tests) where 4.0 makes any difference in performance. You can't; it's a non-issue with GPUs, and 4.0 is only beneficial to NVMe drives. DLSS is a performance trick, and it's technically not actually running at the resolution it's being benched at. You are comparing apples and pumpkins here: one does nothing to affect performance to anyone's advantage, while the other is absolutely a method to gain performance. If it can't win without “tricks” on, then it just can't win on outright performance, which is what benchmarks are measuring.

Edit: feel free to check this for any kind of advantage PCIe 4.0 offers...
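For what it's worth, the raw link bandwidth being argued about is simple to compute (a sketch of the theoretical numbers, not a claim about game performance): PCIe 3.0 runs 8 GT/s per lane and PCIe 4.0 runs 16 GT/s, both with 128b/130b encoding.

```python
# Theoretical one-direction bandwidth of an x16 graphics slot.
# 128b/130b encoding means ~98.5% of the raw transfer rate carries data.
def x16_bandwidth_gb_s(gt_per_s: float, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s for a PCIe 3.0/4.0-style x16 link."""
    return gt_per_s * lanes * (128 / 130) / 8  # bits -> bytes

print(f"PCIe 3.0 x16: ~{x16_bandwidth_gb_s(8):.1f} GB/s")
print(f"PCIe 4.0 x16: ~{x16_bandwidth_gb_s(16):.1f} GB/s")
```

The doubling on paper (~15.8 GB/s vs ~31.5 GB/s) only shows up in FPS if a game actually saturates the 3.0 link, which is exactly what scaling tests like W1zzard's measure.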
 
Again, find me one single benchmark or test (W1zzard has done the tests) where 4.0 makes any difference in performance. You can't; it's a non-issue with GPUs, and 4.0 is only beneficial to NVMe drives. DLSS is a performance trick, and it's technically not actually running at the resolution it's being benched at. You are comparing apples and pumpkins here: one does nothing to affect performance to anyone's advantage, while the other is absolutely a method to gain performance. If it can't win without “tricks” on, then it just can't win on outright performance, which is what benchmarks are measuring.

So you go to an F1 race and demand that every team must spend the same time on every optimization? Like changing car tires must take the same amount of time? Everyone must use the same fuel additives?
It's the end result that matters; as long as image quality is equal, people couldn't care less about any optimization.
Work smarter, not harder :D

 
So you go to an F1 race and demand that every team must spend the same time on every optimization? Like changing car tires must take the same amount of time? Everyone must use the same fuel additives?
It's the end result that matters; as long as image quality is equal, people couldn't care less about any optimization.

Uh, again totally off track with F1, yes, pun intended. F1 has a strict set of rules that all cars must abide by, from ride heights to fuel tank size, and yes, they all run the same fuel. You're asking for what is basically a cheat to be included as a valid benchmark. That would be like one driver getting an extra turbo. Benchmarking is about comparing cards under equal settings and equal conditions, and DLSS is the furthest thing from equal conditions. By your logic every card's 4K results are accurate while DLSS is actually 720p upscaled to 4K. How you can even consider that an equal bench result is truly laughable...
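On the "720p upscaled to 4K" point, the internal render resolutions fall out of the commonly cited per-axis DLSS 2.0 scale factors (a sketch for illustration; the factors are the widely reported ones, not taken from NVIDIA documentation):

```python
# Internal render resolution per DLSS 2.0 mode at a 3840x2160 output,
# using the commonly cited per-axis scale factors.
OUTPUT = (3840, 2160)
MODES = {
    "Quality":           2 / 3,   # ~0.667 per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in MODES.items():
    w, h = (round(dim * scale) for dim in OUTPUT)
    print(f"{mode:>17}: renders {w}x{h}, output {OUTPUT[0]}x{OUTPUT[1]}")
```

By that math the 720p case is specifically Ultra Performance mode; Quality mode at a 4K output reconstructs from roughly 1440p.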
 