I don't feel like I got screwed. I feel like money doesn't buy what it used to though.
Right, you're totally not annoyed at having 12GB on what should really have been a 16GB card, courtesy of Nvidia, who then released a 4070 Ti Super that's a lot better at almost the same cost.
You don't feel screwed? And now you're ready to jump on the successor, again too early and again paying far too much, and you'll still be saying 'you don't get what you used to for your money'? If you had waited a few months and gone 4070 Ti Super in the last round, there would have been
ZERO reason to even put this 5070 Ti on your radar to begin with. You'll be gaining what, 10-15% perf and 4GB of VRAM? It's almost a sidegrade.
It's strange how that logic works wrt spending on hardware, honestly. Just be honest: the 4070 Ti was a shit buy. I'm going to be honest too: I regret buying the 7900 XT at the price it had at the time, 250 bucks over what it costs now. Worst timing I've had on a GPU. That difference would've covered a free B580 today.
We should do better
So the TDP is going up a bit compared to 4070ti? That always seems like the wrong direction to me.
Same node, a few more shaders and a higher voltage to extract another 3% you could've had from an OC yourself.
Nvidia is basically going to sell this card to everyone who didn't buy a 4070 Ti Super, stalling at virtually the same level of performance but calling it new.
It's odd pacing, at least for as far back as I've had my head dipped in this stuff.
- Pascal (10) was a breakout hit and was priced insanely well across the board, with 1060s, 1070s, and 1080/1080 Tis all moving in volume. Even the 1050 Ti sold like hotcakes as a slot-powered-only card.
- Turing (20/16) came out sorely overpriced and underperformed, being mostly an experimental platform for 1st-gen RT and matrix math. The 16 series cut the crap and focused on price appeal, to much success with SIs and OEMs. Seen as an afterthought nowadays...
- Ampere (30) was, like the 10 series, so good that literally everything sold. Again. Even the RTX 3050, a laughably bad model in the lineup, was moving units. Everyone wanted a piece of that fresh and functional 2nd gen RT and the true fruit of all those fancy arch additions—DLSS. Good DLSS. It was the crypto boom that ruined prices after launch.
- Ada (40) follows the same leapfrogging pattern that had been forming over the last few generations: an overpriced, underperforming software showcase for early adopters only. DLSS framegen, ray reconstruction, AI toolkit—yadda yadda, no one friggin' cares.
Naturally, we should expect the 50 series to be a return to form for Nvidia, another lineup of home runs that makes everyone happy, but the forecast isn't looking great. Higher power draw, barely any gain in shader count over Ada/Ampere, and skimpy VRAM on the lower end. They're putting less and less into impressing across the board, and more and more into the ultra-mega-super-expensive chips, from the 4090 to the—what, B6000? A6000 Blackwell?
The margins are getting to their head, and the shareholders are only gonna stop huffing nitrous when the stock price eventually, inevitably, crashes to what Nvidia is ACTUALLY worth as an asset.
What? Ampere was good? What are you talking about? It was heavily underspecced and overpriced, and baked on the worst node we've seen on Nvidia cards in decades. The worst perf/W as well, which is a big reason why Ada looks so good now. If you just look at the hardware in isolation (not the pricing; mining did indeed inflate things badly at the time, and Nvidia actually positioned the MSRPs quite alright initially), it's really not a good generation at all. What sold cards at the time was mining, which barely left scraps for gamers. I think you're right about DLSS, but that's not really related to the GPUs, is it? It's artificial segmentation. I think people would have gobbled up the GPUs anyway.
10GB on a 3080 (the most interesting card in the lineup; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts Legacy for example) was a fucking joke, and the lower tiers were no better. The double-VRAM cards came too late and looked strange relative to their core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess, and the failure rate wasn't exactly the lowest either.

I think Ampere may go down in history as the generation with the lowest usable lifetime of Nvidia's GPU stacks, all things considered. What also didn't, and still doesn't, help is the way games have developed, where new engine features simply slaughter these GPUs. Ampere has already been plagued with issues in various games. To illustrate, a 4090 is almost twice the GPU that a 3090 is at this moment, and that's all core/shader performance; neither card is VRAM constrained.
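If you want to sanity-check claims like that yourself, here's a quick back-of-the-envelope sketch. The 350W/450W figures are the reference board powers; the relative performance index is just a placeholder assumption you'd swap for your own benchmark numbers:

```python
# Rough perf and perf/W comparison. Numbers are placeholders, not measurements --
# replace rel_perf with results from whatever benchmark suite you trust.
cards = {
    "RTX 3090": {"rel_perf": 100, "board_power_w": 350},  # baseline (assumed index)
    "RTX 4090": {"rel_perf": 180, "board_power_w": 450},  # assumed ~1.8x the 3090
}

base = cards["RTX 3090"]
for name, c in cards.items():
    perf_vs_base = c["rel_perf"] / base["rel_perf"]      # scaling vs. the 3090
    perf_per_watt = c["rel_perf"] / c["board_power_w"]   # perf/W at board power
    print(f"{name}: {perf_vs_base:.2f}x perf, {perf_per_watt:.3f} perf/W")
```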