Saturday, February 1st 2025

Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

It's not every day that we witness a famous NSA whistleblower voice their disappointment over modern gaming hardware. Edward Snowden, who likely needs no introduction, did not bother to hold back his disapproval of NVIDIA's recently launched RTX 5090, RTX 5080, and RTX 5070 gaming GPUs. The reviews for the RTX 5090 have been mostly positive, although the same cannot be said for its affordable sibling, the RTX 5080. Snowden, voicing his thoughts on Twitter, claimed that NVIDIA is selling "F-tier value for S-tier prices".

Needless to say, there is no doubt that the RTX 5090's pricing is quite exorbitant, regardless of how anyone puts it. Snowden was particularly displeased with the amount of VRAM on offer, which is also hard to argue against. The RTX 5080 ships with "only" 16 GB of VRAM, whereas Snowden believes that it should have shipped with at least 24, or even 32 GB. He further adds that the RTX 5090, which ships with a whopping 32 GB of VRAM, should have been available with a 48 GB variant. As for the RTX 5070, the security consultant expressed desire for at least 16 GB of VRAM (instead of 12 GB).
But that is not all that Snowden had to say. He equated selling $1000+ GPUs with 16 GB VRAM to a "monopolistic crime against consumers," further accusing NVIDIA of "endless next-quarter" thinking. This is debatable, considering that NVIDIA is a publicly traded company, and whether they stay afloat does boil down to their quarterly results, whether we like it or not. There is no denying that NVIDIA is in desperate need of some true competition in the high-end segment, which appears to be the only way to get the Green Camp to price their hardware appropriately. AMD's UDNA GPUs are likely set to do just that in a year or two. The rest, of course, remains to be seen.
Source: @Snowden

133 Comments on Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

#76
JustBenching
In retrospect:
2014: GTX 980, 398 mm², $549, 4 GB
2025: RTX 5080, 378 mm², $999, 16 GB

What are you complaining about? The 5080 is 7x faster and provides 4x more memory for 2x the price. 24 Gbit (3 GB) memory chips can't be released soon enough.

You have to factor in inflation; the cost of living has increased.
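The inflation point above can be sketched with some quick, illustrative arithmetic. The cumulative CPI factor below is an assumption (roughly 34% US inflation between 2014 and 2025), not an official figure; substitute real CPI data for a precise comparison.

```python
# Illustrative only: adjust the GTX 980's 2014 launch price for inflation.
# The cpi_factor of 1.34 is an assumed cumulative US inflation figure
# (2014 -> 2025); swap in official CPI data for an accurate number.
def inflation_adjusted(price_2014: float, cpi_factor: float = 1.34) -> float:
    return price_2014 * cpi_factor

gtx_980_launch = 549.0   # USD, 2014 MSRP
rtx_5080_msrp = 999.0    # USD, 2025 MSRP

adjusted = inflation_adjusted(gtx_980_launch)
print(f"GTX 980 in today's dollars: ~${adjusted:.0f}")
print(f"RTX 5080 premium over that: ~${rtx_5080_msrp - adjusted:.0f}")
```

Even under this rough assumption, the 5080's MSRP sits a few hundred dollars above the inflation-adjusted 980, which is the gap both sides of the thread are arguing over.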

Nvidia provides interesting times to live in and gets all of this whining in return.
We want nvidia to be a company for the people and sell us gpus at a loss out of the kindness of their hearts.
ZoneDymoWell this takes the cake for the most childish response on the thread.
It's also ironic, because I'm pretty sure that when DLSS was first shown, that was exactly how people described the look of it: vaseline on the screen.
Yeap, DLSS 1 was disgusting (worse than a traditional upscaler, lol), but I still don't get why we have to bring AMD into this thread.
Posted on Reply
#77
ZoneDymo
JustBenchingWe want nvidia to be a company for the people and sell us gpus at a loss out of the kindness of their hearts.


Yeap, DLSS 1 was disgusting (worse than a traditional upscaler, lol), but I still don't get why we have to bring AMD into this thread.
We don't; you brought up AMD in response to generalized criticism of upscalers and frame generators.
So you should really ask yourself that question.

Oh, and on that other part... I don't know how old you are, but there is a line between "giving stuff away at a loss" and "ripping people off for ever-growing profits"...
I would LOVE it if Nvidia (and AMD and Intel, for that matter) were forced to disclose how much profit they make on every GPU sold, per model.

It also reminds me of that disgusting corporate "pity us" phrasing of "it's no longer cost-effective". Oh really? It isn't? How so? How about you disclose exactly how much pure profit the game provided? And that's before you fire all your personnel to give the CEO another $50 million bonus...
Posted on Reply
#78
lexluthermiester
ZoneDymoWell this takes the cake for the most childish response on the thread.
Hmm. Imagine that..
Posted on Reply
#79
AusWolf
AssimilatorYes. Because that's what it is.
How is anything you see on your screen not some kind of trick or fakery?
Posted on Reply
#80
FoulOnWhite
And still, TPU will have a 5xxx owners thread and umpteen posts of OCs and benchies.
Posted on Reply
#81
freeagent
FoulOnWhiteAnd still, TPU will have a 5xxx owners thread and umpteen posts of OCs and benchies.
The exact same way 7xxx users bragged about VRAM and mocked everything under it on the green side?

Yeah I can see that.
Posted on Reply
#82
mechtech
Read "lashes out at Nvidia" and my brain went to Linus Torvalds for a sec
Posted on Reply
#83
Hecate91
JustBenchingWe want nvidia to be a company for the people and sell us gpus at a loss out of the kindness of their hearts.
Jensen isn't gonna have to sell his jacket collection if he lowers his 70% profit margins by 10-15%
freeagentThe exact same way 7xxx users bragged about VRAM and mocked everything under it on the green side?

Yeah I can see that.
Or all the times I've seen people with RTX 30 and 40 series cards say VRAM didn't matter.
So much for not judging what other people have in their tower though.
Posted on Reply
#85
freeagent
Hecate91Jensen isn't gonna have to sell his jacket collection if he lowers his 70% profit margins by 10-15%

Or all the times I've seen people with RTX 30 and 40 series users say VRAM didn't matter.
So much for not judging what other people have in their tower though.
VRAM doesn't bother me like it does you.

I do not care what you have in your tower.

If you love it so much go play a game and quit whining about everything in every thread.
Posted on Reply
#86
dyonoctis
ContraYou can't calculate or claim anything. Neural rendering is a SER 2.0 feature.
It's the same SER 1.0 feature implemented in Ada. This feature requires the development of a second code path in every game that uses it, as well as a second engine code path to support it. So, two years after its implementation, it is used in exactly one NV-sponsored demonstrator - CP2077.
There are no other demonstrators known to use it, or even planned ones. At this rate of adoption, this technology will never be implemented.
Thanks
Are you talking about ray reconstruction in Cyberpunk? There are more games using that feature. Other than that, Cyberpunk is really not a "neural rendering demonstrator"; DirectX 12 made the components necessary for neural rendering publicly available just a few weeks ago, and the current generation of consoles doesn't have the hardware to enable it. The RTX Remix version of Half-Life 2 (Nvidia is obviously very involved in the project) is currently the only game that will allow an early preview of the tech.
Microsoft prepares DirectX to support neural rendering for AI-powered graphics — a key feature of the update will be Cooperative Vector support | Tom's Hardware
I played Half-Life 2 RTX with Nvidia neural rendering, and it looks damn fine
Posted on Reply
#87
Hecate91
VRAM doesn't bother me because I have more than enough. ;)
And if people didn't care what was in someone else's system there wouldn't be childish mocking over it while trying to turn this into a brand fight.
Anyway, the refusal to accept general criticism is part of the problem. Until some realize the leather jacket man doesn't care about the gaming market and vote with their wallets, Nvidia will continue to rip off everyone lining up to buy the next card.
edit- thank you for the laugh react, very mature.
Posted on Reply
#88
TumbleGeorge
freeagentIf you love it so much go play a game and quit whining about everything in every thread.
A true gamer doesn't need a digital device to play. He still has a brain, which is a biological computer.
Posted on Reply
#89
dyonoctis
AusWolfRaster graphics has been good to us for the last quarter of a century, and now we have to use words like "tricks and fakery" to describe it? :confused:

Other than that, interesting read. Still, my point stands: the 5080 will be long obsolete before we see anything come out of this research.
That's what raster graphics mainly is. Having played around with both, I can tell you that offline path-traced graphics are easier to work with. Here's a perfect picture to illustrate the hurdle of making a mirror in games. In offline 3D this is super easy to do: make a reflective shader and you are done.

It's not as pejorative as it sounds; even movies sometimes make use of tricks to optimize render time. But raster isn't the top end of CG: it can look very good, but it has drawbacks when it comes to accurately depicting the world. Otherwise, it would have been far more popular for rendering movies. The Witcher 4 trailer in UE5 looked really good, yet UE didn't make offline renderers obsolete.
How Do Mirrors in Video Games Work?
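To make the "mirror trick" concrete: the classic raster approach renders the scene a second time from a camera reflected across the mirror plane, rather than tracing rays. Below is a minimal sketch of just the camera-mirroring math, assuming a plane through the origin (real engines use a full 4x4 matrix to handle offset planes, clipping, and so on); the function name and example values are illustrative.

```python
import numpy as np

# Reflection across a plane through the origin with unit normal n is the
# Householder matrix I - 2*n*n^T. Applying it to the camera position gives
# the "virtual" camera used to render the mirror's view of the scene.
def reflection_matrix(normal: np.ndarray) -> np.ndarray:
    n = normal / np.linalg.norm(normal)
    return np.eye(3) - 2.0 * np.outer(n, n)

# A mirror on a wall facing +x: a camera at (5, 1, 2) is mirrored to (-5, 1, 2).
R = reflection_matrix(np.array([1.0, 0.0, 0.0]))
camera_pos = np.array([5.0, 1.0, 2.0])
print(R @ camera_pos)  # -> [-5.  1.  2.]
```

This second render pass is exactly the kind of per-case trickery the comment describes; a path tracer gets the same result for free from a reflective material.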
Posted on Reply
#90
Assimilator
AusWolfHow is anything you see on your screen not some kind of trick or fakery?
I could turn that around and ask why you have such a problem with upscalers, but I won't because it's not liable to generate a productive discussion. Instead I highly suggest you do some research on how rasterisation fundamentally works, the inherent problems WRT lights and shadows that are caused by how it works, and how and why ray-tracing doesn't (cannot) suffer from the same drawbacks.

If I had to sum it up in the context of this thread, though: rasterisation is to rendering what upscalers are, whereas ray-tracing is rendering at native quality.
Posted on Reply
#91
DemonicRyzen666
dyonoctisThat's what raster graphics mainly is. Having played around with both, I can tell you that offline path-traced graphics are easier to work with. Here's a perfect picture to illustrate the hurdle of making a mirror in games. In offline 3D this is super easy to do: make a reflective shader and you are done.

It's not as pejorative as it sounds; even movies sometimes make use of tricks to optimize render time. But raster isn't the top end of CG: it can look very good, but it has drawbacks when it comes to accurately depicting the world. Otherwise, it would have been far more popular for rendering movies. The Witcher 4 trailer in UE5 looked really good, yet UE didn't make offline renderers obsolete.
How Do Mirrors in Video Games Work?
That chaotic evil just reminds me of Hogwarts with its shiny chalkboards.

Ray-traced reflections on things that don't need them.
Posted on Reply
#92
Assimilator
DemonicRyzen666That chaotic evil just reminds me of Hogwarts with its shiny chalkboards.

Ray-traced reflections on things that don't need them.
High-grade blackboards are made from porcelain-enameled steel, which is a reflective material. So you're basically upset about realism, which is... okay, that's a take of all time.
Posted on Reply
#93
JustBenching
Hecate91Jensen isn't gonna have to sell his jacket collection if he lowers his 70% profit margins by 10-15%
And he is going to do that, why? We should thank god he is still making gaming GPUs, else we'd be stuck with...
Posted on Reply
#94
Assimilator
JustBenchingAnd he is going to do that, why? We should thank god he is still making gaming GPUs, else we'd be stuck with...
I don't thank god for any corporations, thanks.
Posted on Reply
#95
JustBenching
AssimilatorI don't thank god for any corporations, thanks.
I do. If not for Nvidia, I'd be playing Cyberpunk at 10 FPS, because nobody else can get to that level of RT performance.
Posted on Reply
#96
Hecate91
DemonicRyzen666That chaotic evil just reminds or Hogwarts with its shiny chalkboards.

Raytracing reflections on things that don't need it.
It isn't just Hogwarts; IMO the floors don't always have to be shiny either.
Things that don't need to be soaking-wet shiny are one annoyance I have with RT reflections.
JustBenchingAnd he is going to do that, why? We should thank god he is still making gaming gpus else we'd be stuck with...
Oh yes, thank god Nvidia is still giving scraps to gamers while upselling every tier. I'd rather be stuck with some fair competition between AMD and Intel than the monopoly, marketing lies, and anti-consumer tactics from Nvidia.
Posted on Reply
#97
Visible Noise
AusWolfHow is anything you see on your screen not some kind of trick or fakery?
If you actually want to learn how game rendering works, start here:
Posted on Reply
#98
JustBenching
Hecate91Oh yes, thank god Nvidia is still giving scraps to gamers while upselling every tier. I'd rather be stuck with some fair competition between AMD and Intel than the monopoly, marketing lies, and anti-consumer tactics from Nvidia.
The notion that Nvidia gives scraps to gamers can't be true, for the simple reason that if it were, their competitors would be offering twice the performance by now, but they don't. Something doesn't add up here. I'm giving AMD the benefit of the doubt: if their 9070 XT doesn't match the 5080 at half the price, then they are also giving us "the scraps", in which case the phrase itself doesn't mean anything.
Posted on Reply
#99
Visible Noise
Hecate91until some realize leather jacket man doesn't care about the gaming market
Sure, sure, Nvidia doesn’t care about gamers. They engineered all this gaming technology:

images.nvidia.com/aem-dam/Solutions/geforce/blackwell/nvidia-rtx-blackwell-gpu-architecture.pdf

because they don’t care. They did it because they didn’t have anything else to do I’m sure.

Back onto ignore with you. I’ll check again next month to see if a clue has seeped into your brain.
Posted on Reply
#100
Bomby569
Nvidia sells AI cards with a 1000% profit margin and has no competition in the GPU market; they make their own weather, like it or not. This is more a failure of the competition; every monopoly ends like this.
Posted on Reply