I don't disagree with any of that, but I still never assume anything above what is promised. AMD under current leadership (ex-Koduri, that is) has been pretty trustworthy in their marketing for the most part. Still, I can't trust that to continue - corporations are opportunistic and almost exclusively focused on short-term profits, and fundamentally do not care about sustained ethics, or even just acting consistently, as long as there is some sort of sales/marketing gain in it. So one can never really trust history to indicate much - the best that's possible is to hope that they're choosing not to be complete exploitative assholes. I'm absolutely hopeful that the previous couple of generations will indeed be a solid indication of how their promised numbers should be interpreted - but hope and trust are not the same thing. Hence, I'm sticking with what has been explicitly promised - but as I said, I'll be happy to be proven wrong. (And, of course, unhappy to be proven wrong if they don't deliver 50% as well.)

For RDNA1 they claimed a 50% perf/watt gain over Vega. This was done by comparing the V64 to the 5700XT, with both parts at stock.
For RDNA2 they claimed a 50% perf/watt gain in early slides, but at the reveal event they claimed 54% and 64%. The 54% was 5700XT vs 6800XT at 4K in a variety of games (listed in the footnotes of their slide); the 64% was 5700XT vs 6900XT at 4K in the same games. This was broadly confirmed in reviews, but the result depended heavily on how perf/watt was tested. Sites that take power and performance data from a single game saw very different results: TPU saw about a 50% gain, whereas Techspot/HUB saw a 70%+ gain, because HUB used Doom Eternal (where the 5700XT underperformed) and TPU used CP2077 (where the 6900XT underperformed). If you look at HUB's average uplift for the 6800XT and 6900XT, it actually matches up really well with AMD's claimed improvements.
So the AMD method seems to be to compare SKU to SKU at stock settings, measure the average frame rate difference across a suite of titles, and then work out the perf/watt delta.
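As a rough illustration of that arithmetic (not AMD's actual tooling, just the formula), the perf/watt delta falls out of the average performance ratio divided by the TBP ratio. The performance figure below is an assumption for illustration only - a 6900XT averaging about 2.19x a 5700XT at 4K at their 300W/225W reference TBPs - chosen because it lands near the claimed 64%:

```python
def perf_per_watt_gain(perf_ratio: float, power_new: float, power_old: float) -> float:
    """Perf/watt multiplier: average performance uplift divided by power uplift."""
    return perf_ratio / (power_new / power_old)

# Illustrative only: assumes a ~2.19x average 4K uplift for a 300W 6900XT
# over a 225W 5700XT, which works out to ~1.64x perf/watt (the claimed 64%).
print(perf_per_watt_gain(2.19, 300, 225))  # ~1.64
```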
With the >50% claim, I do agree with using 50% as a baseline, but I feel confident they are not doing a best-vs-worst comparison, because that is not something AMD has done before under the current leadership.
I'm not familiar with those Enermax numbers you mention, but there's also the variable of resolution scaling that needs consideration here. It looks like you're calculating only at 2160p? That obviously makes sense for a flagship SKU, but it also means that (unless RDNA3 scales much better with resolution than RDNA2) these cards would absolutely trounce the 4090 at 1440p - a 2.1x performance multiplier over the 6900XT at 1440p would take it from 73% of the 4090's performance to 153% - and that just sounds (way) too good to be true. It would definitely be (very!) interesting to see how customers would react to a card like that if it were to happen (and AMD didn't price it stupidly), but I'm too skeptical to believe that to be particularly likely.

What it does do though is give us some numbers to play with. If the Enermax numbers are correct and top N31 is using 420W, then you get the following numbers.
Baseline | TBP | Power Delta | Perf/Watt Multi | Performance Multi | Estimate vs 4090 in Raster
---|---|---|---|---|---
6900XT | 300W | 1.4x | 1.5x | 2.1x | +10%
6900XT | 300W | 1.4x | 1.64x (to match the 6900XT delta - extreme upper bound!) | 2.3x | +23%
Ref 6950XT | 335W | 1.25x | 1.5x | 1.88x | +15%
Ref 6950XT | 335W | 1.25x | 1.64x (again, extreme upper bound!) | 2.05x | +25%
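A minimal sketch of the arithmetic behind that table, assuming a 420W top N31 (the Enermax figure) and a 4090 sitting roughly 1.9x above a 6900XT and roughly 1.63x above a reference 6950XT at 4K raster - those two leads are back-inferred from the table's own +10% and +15% rows, so small rounding differences against the table are expected:

```python
# Sketch of the table above. Assumptions: 420W top N31 TBP (Enermax figure),
# RTX 4090 ~1.90x a 6900XT and ~1.63x a reference 6950XT at 4K raster.
N31_TBP = 420

baselines = {
    # name: (reference TBP in watts, assumed 4090 lead over this card)
    "6900XT": (300, 1.90),
    "Ref 6950XT": (335, 1.63),
}

for name, (tbp, lead_4090) in baselines.items():
    power_delta = N31_TBP / tbp
    for ppw in (1.50, 1.64):  # claimed >50%, and the 64% "extreme upper bound"
        perf_multi = power_delta * ppw        # performance vs the baseline card
        vs_4090 = perf_multi / lead_4090 - 1  # estimate vs 4090 in raster
        print(f"{name}: {power_delta:.2f}x power, {ppw:.2f}x perf/W "
              f"-> {perf_multi:.2f}x perf, {vs_4090:+.0%} vs 4090")
```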
Now, the assumption I am making here is pretty obvious: that the design goal for N31 was 420W to begin with, which would mean the chip is wide enough to use that power in the saner part of the f/V curve. If it was not designed for 420W and has instead been pushed there by raising clocks, then perf/watt will obviously drop off and the numbers above will be wrong.
The other assumption is that the Enermax numbers are correct. It is entirely possible that the reference TBP for N31 will be closer to 375W, which with these numbers would put it about on par with the 4090.
My view is that the TBP will be closer to 375-400W rather than 420W, in which case anywhere from roughly on par to 5% ahead of the 4090 is the ballpark I expect top N31 to land in. There is room for a positive surprise, though, should AMD's >50% claim turn out to be like their >5GHz claim, or the >15% single-thread claim in the Zen 4 teaser slide, and be a rather large underselling of what was actually achieved. Still, I await actual numbers on that front, and until then I am assuming something in the region of +50%.
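Plugging 375-400W into the same arithmetic (again assuming the 4090 sits about 1.9x above a 6900XT at 4K raster) lands in that same ballpark:

```python
# Same formula as the table above, with a 375-400W TBP guess and the claimed 1.5x perf/W.
for tbp in (375, 400):
    perf = (tbp / 300) * 1.5  # vs the 300W reference 6900XT
    print(f"{tbp}W: {perf:.2f}x the 6900XT, {perf / 1.90 - 1:+.0%} vs 4090")
# -> roughly -1% and +5% vs the 4090
```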
AMD always gets the "Will they be able to take them down this time?" underdog hype, which to some extent disadvantages Nvidia - it's much harder for them to garner the type of excitement that comes with a potential upset. On the other hand, Nvidia has massive reach, tons of media contacts, and is covered and mentioned constantly across the internet. Not to mention that the tone of that coverage already expects them to be superior - which isn't as exciting as an underdog story, but it still gets people reading, as "how fast will my next GPU be?" (with the default expectation that this will be an Nvidia GPU) is just as interesting to people as "will AMD be able to match/beat Nvidia this time?"

However, there always seems to be more interest/hype around AMD leaks than Nvidia ones, from what I've seen.
Of course, in terms of leaks there's also the question of sheer scale: Nvidia outsells AMD's GPU division by ~4x, meaning they have 4x the production volume, 4x the shipping volume, and thus far more products passing through far more hands before launch - and controlling that is far more difficult at that scale.