This exact thought came to my mind, lol. Where's that 4870 team that David Wang and co. led? They were great: no shitty marketing, great prices, great cards, no nonsense. They made Nvidia look like absolute fools that generation. IIRC Nvidia made a massive ~25% price cut a couple of months after launch because of RV770's arrival. That's not happening now, but for a card that's supposed to be a stopgap till UDNA, if it can somehow get close to a 4080S, that's somehow not too far off a 5080, which I would never have expected; but then that 5080 turned out to be a POS on PCIe.
The other thing I noticed is that in the Nvidia 5xxx reviews, TPU had that strange, unexpected, and frankly unnecessary line about not being sure whether AMD will still be in the GPU space in a couple of years. I didn't really expect it from them; it was... idk, something the trashy rumor sites would post, and I'll stop at that. Anyway, what people fail to account for is development cycles and the company's position at the time each architecture was designed:
1) GCN was being developed around 2008-2009, when AMD inherited the arch in its infancy from ATI, who were doing quite well. It was a banger, and even though they ran out of money right after launch, it served them well for a decade.
2) RDNA was developed around 2016-2017, when AMD were deep in debt and putting all their money, hopes, and dreams into Ryzen. It turned out okay, but nothing close to what GCN achieved.
3) UDNA is being developed now, when AMD have money, resources, time, and a bunch of clowns in their marketing department. From speaking to people at AMD, I know they're putting a lot of resources into that thing, and rightly so: their whole AI money pit depends on it. There's every possibility it's going to be another banger, but let's wait and see. I just can't see it being worse than RDNA relative to the competition.
It's supposed to launch around the same time TPU claims AMD's discrete GPU division might not be around, so, erm... let's wait and see, I suppose.
Bonus points for mentioning Baumann's baby. I also mentioned that in a post not too long ago. I think I posted it, maybe I deleted it...I forget. I do that sometimes.
It saddens me to see people buying into Nvidia's savvy marketing crap that doesn't benefit them long-term but that they think does (until they complain about it later), and reviewers aren't helping. I won't get into it.
Some people kinda/sorta already did, but it goes beyond that in ways I don't want to get into. I don't want to start a fight with any YouTube math teachers who want allocation.
I look to AMD for solace, it isn't there, and I get mad. That doesn't mean they don't and/or can't make good-value products, but they used to LEAD in very important aspects, and they used to make their strengths known.
I know it comes out in my posts, and I apologize for that. There's just something about the culture changing from nerds to normies who think they understand but don't; really, it's mostly Nvidia marketing.
...and AMD's marketing is awful to boot, which doesn't help. The whole 9070 series thing is a gigantic clusterfuck the likes of which I have never seen before, and they should be ashamed.
I get that they want time to catch up on features, but how they went about revealing this series and then trying to shove it back in the closet is beyond ridiculous. The price/placement uncertainty... it's bad form.
Whoever they have now in marketing is no Dave Baumann. Hell, whoever they have now is no Scott Herkelman. Things are very obviously in disarray now, perhaps because of layoffs.
It's starting to make sense why AMD has three 8-pin power connectors and giant coolers on some SKUs now. They plan to clock their midrange into 5080 territory, by the looks of what is happening with Nvidia's clown show.
Higher than that, my friend.
Don't forget about clock speeds. The 7800 XT runs at 2.4 GHz; the 9070 XT is at 2.9-3 GHz.
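Rough math on that, taking those stock clocks at face value: 2.9 / 2.4 ≈ 1.21 and 3.0 / 2.4 = 1.25, so that's roughly a 21-25% clock bump on paper, before counting any architectural gains.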
Overclock a 7800 XT. It's clocked in the toilet at stock, for marketing purposes, to make room for this very card.
It gains around 19% performance in many cases. Look at W1zzard's reviews.
The best a 7800 XT can clock is 2936 MHz, but a 7700 XT (oddly similar across multiple cards) does 3133 MHz. That doesn't make any sense, other than that the 7800 XT was obviously going to be clock-limited, but instead got power-limited (PL).
Oddly, the 7900 XTX can also hit around 3165-3200 MHz on the same arch before the power draw goes bananas.
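Quick ratio on that gap, taking the numbers above at face value: 3133 / 2936 ≈ 1.067, so the cut-down 7700 XT tops out roughly 7% higher than the 7800 XT, with the 7900 XTX only another ~1-2% above that. A smaller die out-clocking a bigger one on the same arch is what a deliberate power-limit cap, rather than a silicon limit, would look like.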
It's stinky. It reeks of artificial product segmentation (granted, RDNA4 has 8-bit ops / tensor cores). He's not wrong (for the most part).
The question truly is how high it will clock. If it's only ~3.3-3.4 GHz max, that's bad, considering the die size is ~15-20% larger than it should be.
Not necessarily bad for those products, but bad for the chip overall if it can't be binned higher (at >375 W).
If it's 3.5-3.7 GHz on 3x 8-pin (and they release a >20 Gbps RAM card), then we're talking an actual improvement in chip design and not just marketing tactics.