The biggest point of interest for me will be the price. If the 5090 is 4090 price and offers a decent jump, that bodes very well for mid range and high end card prices.
It's always safe to assume the customer is getting shafted by Nvidia these days.
I used to be so excited for a new generation of GPUs and CPUs, but I just can't anymore. The pricing leaves such a bad taste in my mouth that I just buy peripherals or add to my water cooling loop instead. Honestly, consoles have never looked better.
Offering better performance at a higher price is so bad you'd rather get railroaded by console makers and their walled gardens?
The grass is supposed to be GREENER on the other side, not covered in barbed wire.
So, if it's an old-fashioned monolithic die, then it's only half of what they've shown for the datacenter, as the datacenter Blackwell is two dies glued together. So does that mean the 5090 will have a GB203 die?
Looks like the consumers are getting the shaft yet again, unless the 5080 gets the GB203 die as well, I suppose.
The performance uplift over Lovelace will be interesting with this series, as it sounds to me like a lot of overclocking is going to be needed to bring big performance gains to these cards. Maybe 600 W+ 5090s will be a thing, and 350 W 5080s, etc.?
I don't understand this: how is the customer getting the shaft? If the 5090 is a step up over the 4090, then that's not getting shafted. Now, pricing may do that, certainly. Besides, if the GB203 pulls 600 W, why complain, since a full-size chip would be flat-out impossible to cool?
Why not a Radeon with more VRAM? RX 7900 XT has 20 GB, RX 7800 XT has 16 GB?
I guess right now is a bad moment for buying, because the new generation is coming soon.
I'm assuming because, if someone wants RT, buying AMD is a terrible idea and gets you, at best, Ampere-tier performance.
Remember how the Radeon Fury X with HBM wasn't a "big deal"? Now look at HBM... it's so valuable it can't be used in consumer cards. The first application of a new technology isn't always guaranteed to make a big splash, but eventually it catches on to the point where we don't understand how we survived without it. Chiplets WILL happen, it's inevitable.
That's a nice cope. In reality, the HBM-equipped Fury X was an embarrassment in the market, and the HBM-equipped Vega 56 and 64 failed to compete with the cheaper GDDR-equipped Pascal cards. HBM still exists, sure, in cards that cost more than my car and sometimes nearly as much as my house. But in consumer products? Nope. Somehow, 10 years later, we still survive in the consumer world without HBM just fine.