Saturday, February 1st 2025

Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value
It's not every day that we witness a famous NSA whistleblower voice their disappointment over modern gaming hardware. Edward Snowden, who likely needs no introduction, did not hold back his disapproval of NVIDIA's recently launched RTX 5090, RTX 5080, and RTX 5070 gaming GPUs. Reviews of the RTX 5090 have been mostly positive, although the same cannot be said for its more affordable sibling, the RTX 5080. Snowden, voicing his thoughts on Twitter, claimed that NVIDIA is selling "F-tier value for S-tier prices".
Needless to say, the RTX 5090's pricing is exorbitant, no matter how anyone puts it. Snowden was particularly displeased with the amount of VRAM on offer, which is also hard to argue against. The RTX 5080 ships with "only" 16 GB of VRAM, whereas Snowden believes it should have shipped with at least 24 GB, or even 32 GB. He adds that the RTX 5090, which ships with a whopping 32 GB of VRAM, should have been available in a 48 GB variant. As for the RTX 5070, the security consultant expressed a desire for at least 16 GB of VRAM (instead of 12 GB).

But that is not all Snowden had to say. He equated selling $1,000+ GPUs with 16 GB of VRAM to a "monopolistic crime against consumers," further accusing NVIDIA of "endless next-quarter" thinking. This is debatable, considering that NVIDIA is a publicly traded company, and whether it stays afloat does boil down to quarterly results, whether we like it or not. There is no denying that NVIDIA desperately needs true competition in the high-end segment, which appears to be the only way to get the Green Camp to price its hardware appropriately. AMD's UDNA GPUs are likely set to do just that in a year or two. The rest, of course, remains to be seen.
Source:
@Snowden
243 Comments on Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value
4080 -> 4080S: $200 cheaper, +1% perf.
4080S -> 5080: +11% perf, but it overclocks like crazy for another ~10%.
And you know, the 7900 is actually twice as good (perf/$), which roughly means NV has been selling garbage for two years now, and isn't going to improve.
In general performance, Nvidia has the better cards, yes, and if you need 4080S+ tier, there just isn't anything AMD has to offer. But below that, Nvidia isn't better, price-performance wise.
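The perf/$ claim above is just arithmetic, and easy to sanity-check. A minimal Python sketch, where every price and relative-performance index is an assumed placeholder rather than a measured benchmark:

```python
# Perf-per-dollar comparison. All numbers below are illustrative
# assumptions (relative perf index, price in USD), not benchmark data.
def perf_per_dollar(relative_perf, price_usd):
    """Higher is better: performance units bought per dollar spent."""
    return relative_perf / price_usd

cards = {
    "RTX 4080 SUPER": (100, 999),  # assumed baseline index and MSRP
    "RTX 5080":       (111, 999),  # +11% over 4080S, as claimed above
    "RX 7900 XTX":    (95, 830),   # assumed street price
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf_per_dollar(perf, price):.3f} perf/$")
```

With placeholder numbers like these, a cheaper card with slightly lower raw performance can still come out well ahead on perf/$, which is the shape of the argument being made.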
I'll see if I can get on the waitlist somehow.
The 5080 with 16GB is a gaming card. With a 32GB variant it could have become a great inference card, and Snowden's gripe is Nvidia's still forcing the upsell to the 5090, which has more processing and bandwidth capability than is needed for inference (the 5080 too; a 5070 with 24GB or 48GB would be the sweet spot I think). Nvidia's announced the $3000 "DIGITS" mini-PC later this year but is conspicuously avoiding the sub-$1000 market.
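The "gaming card vs. inference card" distinction above is mostly about whether model weights fit in VRAM. A rough Python estimator, using the common rule of thumb (2 bytes per parameter at FP16, plus overhead for KV cache and activations); the 20% overhead factor is an assumption, not a published figure:

```python
def est_vram_gb(params_billion, bytes_per_param=2, overhead=1.2):
    """Back-of-envelope VRAM needed for LLM inference:
    weights at the given precision (2 bytes = FP16/BF16)
    plus ~20% assumed headroom for KV cache and activations."""
    return params_billion * bytes_per_param * overhead

# A 13B-parameter model at FP16 lands around 31 GB: out of reach
# for a 16 GB 5080, comfortable on a 32 GB card.
print(round(est_vram_gb(13)), "GB")
```

This is why a mid-tier GPU with a large VRAM pool can be a better inference buy than a faster GPU with less memory: compute rarely becomes the bottleneck before capacity does.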
LLM support for Radeon cards is steadily improving, so maybe AMD is the answer.
So you were complaining about something you never communicated, and expected everyone reading your post to be able to read your mind.
That's a bold strategy, Cotton; let's see if it pays off for him.
High Yield estimates the 5090's GPU and its 32 GB of VRAM at about $350 each. For the 5080 it is half the cost for half the VRAM, but the GPU is actually less than half the cost.
So I understand why Nvidia only used 16 GB of VRAM, since VRAM is actually the highest-cost component on the 5080. 3 GB chips might be even costlier per GB.
That all said, I don't see great performance benefits from GDDR7 on the RTX 5000 series, so Nvidia could have gone with GDDR6(X) and either made the cards cheaper or put more VRAM on them for the same price... As for the shortage, it is simply Nvidia putting datacenters first... From a business standpoint I can understand it; from an RTX-customer standpoint I don't like it.
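The per-GB implication of that estimate is easy to work out. A short sketch using the ~$350 figure from the comment above; the figure itself is a third-party estimate, and the per-chip split is purely illustrative:

```python
# Back-of-envelope VRAM cost per GB, from the estimate that 32 GB of
# GDDR7 on the 5090 costs about $350 (a third-party estimate, not an
# official BOM figure).
vram_32gb_cost_usd = 350.0
cost_per_gb = vram_32gb_cost_usd / 32

print(f"~${cost_per_gb:.2f} per GB")                    # ~$10.94/GB
print(f"16 GB on a 5080: ~${cost_per_gb * 16:.0f}")     # ~$175
print(f"Upgrading 5080 to 32 GB adds ~${cost_per_gb * 16:.0f}")
```

On those assumptions, doubling the 5080's VRAM would add on the order of $175 to the bill of materials, which frames the "upsell to the 5090" complaint in concrete terms.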
GH100/Hopper and GB100/Blackwell-datacenter are made on the same node as desktop Blackwell and Ada... I would say Nvidia is best per GPU unit, and since your typical gaming PC can only use one of them, people tend to buy the most powerful at a higher price. It is a purely business decision to limit the VRAM on everything but the halo product, to prevent people from using those cards for AI and push them into buying the MUCH more expensive workstation cards.
As for the 8 -> 16 GB on the 4060 Ti: Nvidia does know 8 GB is not enough for that amount of performance, but 12 GB is only possible with a 96- or 192-bit bus; the 4060 Ti has a 128-bit bus, and everything below 128-bit crushes performance.
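The bus-width constraint above follows from how GDDR memory attaches to the GPU: each GDDR6/GDDR6X chip has a 32-bit interface, so the chip count is fixed by the bus width, and capacity is chip count times chip density (doubled in clamshell mode, with chips on both sides of the board). A quick sketch assuming the common 2 GB (16 Gbit) chips:

```python
# Why a 128-bit bus can do 8 GB or 16 GB, but never 12 GB:
# each GDDR6/6X chip exposes a 32-bit interface, so
#   chips = bus_width / 32
# and capacity = chips * chip_density (x2 in clamshell mode).
CHIP_GB = 2  # assuming standard 16 Gbit (2 GB) GDDR6 chips

def vram_options_gb(bus_width_bits):
    """Return (normal, clamshell) capacities for a given bus width."""
    chips = bus_width_bits // 32
    return (chips * CHIP_GB, chips * CHIP_GB * 2)

for bus in (96, 128, 192):
    normal, clamshell = vram_options_gb(bus)
    print(f"{bus}-bit bus: {normal} GB or {clamshell} GB (clamshell)")
# 128-bit gives 8 or 16 GB; hitting 12 GB requires 96-bit
# (clamshell) or 192-bit, exactly as the comment says.
```

This is the same constraint behind the 5080 discussion: at a fixed bus width, the only ways to add VRAM are clamshell mounting or higher-density (e.g. 3 GB) chips.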
Also, make up some bullshit feature; it doesn't have to be anything meaningful or useful, but advertise the hell out of it as the next best thing since sliced bread and only make it available on Radeon GPUs. Additionally, bundle your graphics cards with Ryzen CPUs and watch them fly off the shelves.
But let's be honest: if you really want to use RT without a big performance hit, you still need high-end (4080+). AMD has nothing there. But I think the software implementations will get better over the next few years, so that you can enable RT without completely tanking performance.
Ultra realism and photo-accuracy isn't the only way to produce a good-looking image in a game. It's a brush in the painter's hand, just like any other. I know how game rendering works. Read my post above to see what I meant earlier.