| System Name | Hotbox |
|---|---|
| Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6) |
| Motherboard | ASRock Phantom Gaming B550 ITX/ax |
| Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
| Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
| Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
| Storage | 2TB Adata SX8200 Pro |
| Display(s) | Dell U2711 main, AOC 24P2C secondary |
| Case | SSUPD Meshlicious |
| Audio Device(s) | Optoma Nuforce μDAC 3 |
| Power Supply | Corsair SF750 Platinum |
| Mouse | Logitech G603 |
| Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
| Software | Windows 10 Pro |
I don't disagree with the majority of what you're saying here, though I think you're ignoring the changing realities behind the past situations you're describing vs. AMD in 2020. AMD PR has absolutely done a lot of shady stuff, has overpromised time and time again, and is generally not to be trusted until we have a significant body of proof to build trust on. Their product positioning and naming in China (like the 560D, the various permutations of "580", and so on) is also deeply problematic. But so far I don't think I've seen Dr. Su overpromise or skew results in a significant way - though I might obviously have missed or forgotten something - and the "up to 50% improved perf/W" comes from her. (Sure, you could debate the value of Cinebench as a measure of overall performance - I think it shows AMD in too good a light if taken as broadly representative - but at least it's accurate and representative of some workloads.) And despite the fundamental shadiness of promising maximums ("up to") rather than averages or baselines, there's at least the security that it must in some sense be true for AMD not to be sued by their shareholders. And given how early that was said relative to the launch of the GPUs, I would say any baseline or average promise would be impossible to make.

My pessimism has been on the right track more often than not though, when it comes to these predictions.
So far AMD has not shown us a major perf/W jump on anything GCN-based, ever, but now they call it RDNA# and suddenly they can? Please. Tonga was a failure and that is all they wrote. Then came Polaris - more of the same. Now we have RDNA2, and already they've been clocking the 5700 XT out of its comfort zone to get the needed performance. And to top it off they felt the need to release vague 14Gbps BIOS updates that nobody really understood, during and after launch. You don't do that if you've got a nicely rounded, future-proof product here.
I'm not seeing the upside here, and I don't think we can credit AMD with trustworthy communication around their GPU department. Ninety percent of it is left to speculation by the masses, and the remaining 10% is utterly vague until it hits shelves. 'Up to 50%'... that sounds like Intel's 'up to' Gigahurtz boost, and to me it reads as 'you're full of shit'.
Do you see Nvidia market 'up to'? Nope. Not a single time. They give you a base clock and say a boost is not guaranteed... and then we get a slew of GPUs every gen that ALL hit beyond their rated boost speeds. That instills faith. It's just that simple. So far, AMD has not released a single GPU that was free of trickery - be it timed scarcity (and shitty excuses to cover it up; I didn't forget their Vega marketing for a second, it was straight-up dishonest in an attempt to feed hype), cherry-picked benches (and a horde of fans echoing benchmarks for games nobody plays), supposed OC potential (Fury X) that never materialized, or supposed huge benefits from HBM (Fury X again, which fell off faster than the GDDR5-driven 980 Ti, still relevant with its 6GB) - the list is virtually endless.
Even in the shitrange they managed to make an oopsie with the 560D. 'Oops'. Wasn't that their core target market? Way to treat your customer base. Of course we both know they don't care at all. Their revenue is in the consoles now. We get whatever falls off the dev train going on there.
Nah, sorry. AMD's GPU division lost the last sliver of faith a few generations back, over here. I don't see how or why they would suddenly provide us with a paradigm shift. So far, they're still late with RDNA as they always have been - be it version 1, 2 or 3. They still haven't shown us a speck of RT capability, only tech slides. The GPUs they have out are lacking in features beyond RT, too. Etc etc ad infinitum. They've relegated themselves to followers and not leaders. There is absolutely no reason to expect them to leap ahead. Even DX12 Ultimate apparently caught them by surprise... hello? Weren't you best friends with MS for doing their Xboxes? Dafuq happened?
On top of that, they still haven't managed to create a decent stock cooler to save their lives, and they still haven't got the AIBs in line like they should. What could possibly go wrong, eh?
//end of AMD roast

Sorry for the ninja edits.
Beyond that, most of what you describe happened during Koduri's tenure, and while it is obviously wrong to place the blame for this (solely) at his feet, he famously fought tooth and nail for near-total autonomy for RTG, taking the helm for the products it produced and deciding the direction taken with them. He obviously wasn't at fault for the 64 CU architectural limit of GCN, which crippled AMD's GPU progress from the Fury X onwards, but he was responsible for how the products made both then and since were specified and marketed. And he's no longer around, after all. All signs point towards some serious talking-tos having been handed out around AMD HQ in the past few years.
But beyond the near-constant game of musical chairs that is tech executive positions, the main change is AMD's fortunes. In 2015 they were near bankrupt, and definitely couldn't afford to splurge on GPU R&D. In 2020, they are riding higher than ever, with Ryzen carrying them to record revenues and profits. In the meantime they've shown with RDNA that even on a relatively slim budget (RDNA development must have started around 2016 or so, picking up speed around 2018 at the latest) they could improve things significantly, and now they're suddenly flush with cash, including infusions from both major high-performance console manufacturers. The last time they had that last part was pre-2013, when they were already struggling financially and both console makers went (very) conservative on cost and power draw for their designs. That is by no means the case this time around. They can suddenly afford to build as powerful a GPU as they want to within the constraints of their architecture, node and fab capacity.
And as I mentioned, RDNA has shown that AMD has found a way out of the GCN quagmire - while 7nm has obviously been enormously beneficial in allowing them to get close to (5700 XT), match (5700, 5500 XT) or even beat (5600 XT) Nvidia's perf/W, it is by no means the main reason for this, as is easily seen by comparing the efficiency of the Radeon VII vs. even the 5700 XT. And with RDNA being a new architecture with a lot of new designs, it stands to reason that there are more major improvements to be made to it in its second generation than there were to GCN 1.whatever.
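To put some rough numbers on that perf/W point, here's a quick back-of-the-envelope sketch in Python. The relative performance and board power figures are illustrative ballpark values rather than measured data; plug in review numbers of your choice and the shape of the result stays the same:

```python
# Rough perf/W comparison, normalized to the RX 5700 XT.
# Relative 1440p performance and board power below are illustrative ballpark
# values, not measured data; substitute numbers from reviews as you see fit.
cards = {
    # name:        (relative perf, board power in W)
    "Radeon VII": (1.00, 300),
    "RX 5700 XT": (1.00, 225),
    "RX 5600 XT": (0.85, 150),
}

ref_perf, ref_power = cards["RX 5700 XT"]
ref_eff = ref_perf / ref_power

for name, (perf, power) in cards.items():
    eff = perf / power
    print(f"{name}: {eff / ref_eff:.2f}x the perf/W of the RX 5700 XT")
```

The Radeon VII and the 5700 XT sit on the same 7nm node, so the gap between them is down to architecture rather than process. It also shows what a claim like "up to 50% better perf/W" actually means: in the best case, either ~1.5x the performance at the same power, or the same performance at roughly two thirds of the power.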
As for the Fury X falling behind the 980 Ti: not by much. Sure, the 980 Ti is faster, and by a noticeable percentage (and a higher percentage than at launch), but they're still firmly in the same performance class. The 980 Ti has "aged" better, but by a few percent at best.
So while I'm all for advocating pessimism - you'll find plenty of my posts here doing that, including on this very subject - in this case I think you're being too pessimistic. I'm not saying I think AMD will beat or even necessarily match Ampere, either in absolute performance or perf/W, but there are reasons to believe AMD has something good coming, just based on firm facts:

We know the XSX can deliver a 52 CU, 12 TF GPU and an 8c16t 3.6 GHz Zen 2 CPU in a console package, and while we don't know its power draw, I'll be shocked if that SoC consumes more than 300 W - console power delivery and cooling, even in the nifty form factor of the XSX, won't be up for that. We also know, thanks to the PS5 demonstrating that RDNA 2 can sustain ~2.1 GHz even in a more traditional (if enormous) console form factor, that the XSX running its 52 CUs at a relatively low clock speed is a conservative choice rather than a hard limit. And we know that even RDNA 1 can clearly beat Nvidia in perf/W if clocked reasonably (hello, 5600 XT!).

What can we ascertain from this? That RDNA 2 at ~1.8 GHz is quite efficient; that RDNA 2 is capable of notably higher clock speeds than RDNA 1; and that AMD is entirely capable of building a wider die than the RX 5700 - even for a cost-sensitive console.
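For reference, the 12 TF figure falls straight out of the CU count and clock speed. Here's a minimal sketch of that arithmetic in Python, assuming the standard RDNA layout of 64 stream processors per CU and the publicly stated clocks (XSX at a fixed 1.825 GHz, PS5 at up to ~2.23 GHz, with ~2.1 GHz being the sustained figure I used above):

```python
# Back-of-the-envelope FP32 throughput for the console GPUs, from CU count and clock.
# Assumes the standard RDNA layout: 64 stream processors per CU, 2 FLOPs per clock
# per stream processor (one fused multiply-add).

def fp32_tflops(cus: int, clock_ghz: float) -> float:
    shaders = cus * 64                       # stream processors
    return shaders * 2 * clock_ghz / 1000    # GFLOPS -> TFLOPS

print(f"XSX, 52 CU @ 1.825 GHz: {fp32_tflops(52, 1.825):.1f} TF")  # ~12.1 TF
print(f"PS5, 36 CU @ 2.23 GHz:  {fp32_tflops(36, 2.23):.1f} TF")   # ~10.3 TF
```

Which is also why the XSX clock looks conservative rather than strained: the same 52 CUs at PS5-like clocks would land around 14-15 TF on paper, so there's clearly headroom in RDNA 2 beyond what Microsoft chose to use.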