Saturday, February 1st 2025

Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

It's not every day that we see a famous NSA whistleblower voice disappointment over modern gaming hardware. Edward Snowden, who likely needs no introduction, did not hold back his disapproval of NVIDIA's recently launched RTX 5090, RTX 5080, and RTX 5070 gaming GPUs. Reviews of the RTX 5090 have been mostly positive, although the same cannot be said for its more affordable sibling, the RTX 5080. Voicing his thoughts on Twitter, Snowden claimed that NVIDIA is selling "F-tier value for S-tier prices".

Needless to say, the RTX 5090's pricing is exorbitant no matter how anyone puts it. Snowden was particularly displeased with the amount of VRAM on offer, which is also hard to argue against. The RTX 5080 ships with "only" 16 GB of VRAM, whereas Snowden believes it should have shipped with at least 24 GB, or even 32 GB. He further added that the RTX 5090, which ships with a whopping 32 GB of VRAM, should have been offered in a 48 GB variant. As for the RTX 5070, the security consultant expressed a desire for at least 16 GB of VRAM (instead of 12 GB).
But that is not all Snowden had to say. He equated selling $1,000+ GPUs with 16 GB of VRAM to a "monopolistic crime against consumers," further accusing NVIDIA of "endless next-quarter" thinking. This is debatable: NVIDIA is a publicly traded company, and whether it stays afloat does boil down to its quarterly results, whether we like it or not. There is no denying that NVIDIA is in desperate need of some true competition in the high-end segment, which appears to be the only way to get the Green Camp to price its hardware appropriately. AMD's UDNA GPUs may well do just that in a year or two. The rest, of course, remains to be seen.
Source: @Snowden

133 Comments on Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

#51
agent_x007
AusWolfActually, there's a myriad of good games that don't need a high-end GPU. You just need to look further than the typical copy-paste AAA crap.
I know, but that's not even close to being enough to break NV monopoly/"lobotomy" :(
Posted on Reply
#52
Ruru
S.T.A.R.S.
24GB isn't possible with a 256-bit bus yet. I doubt that people would still want an overpriced 192-bit card like the 4070 Ti was.
Posted on Reply
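For context on the bus-width point above: in a standard (non-clamshell) card, each GDDR memory module sits on its own 32-bit channel, so total VRAM is simply the channel count times the per-module density. A minimal sketch of that arithmetic, assuming the 2 GB and 3 GB module densities defined for GDDR7:

```python
# VRAM capacity from memory bus width and per-module density.
# Assumes a non-clamshell layout: one GDDR module per 32-bit channel.

def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    """Total VRAM in GB given the bus width and the density of each module."""
    channels = bus_width_bits // 32  # one module per 32-bit slice of the bus
    return channels * module_gb

# RTX 5080: 256-bit bus, 2 GB GDDR7 modules -> 16 GB
# Same 256-bit bus with 3 GB modules      -> 24 GB
# RTX 5090: 512-bit bus, 2 GB modules     -> 32 GB
```

This is why 24 GB on a 256-bit bus depends on 3 GB GDDR7 modules: with the 2 GB modules available at launch, 16 GB is the ceiling without clamshelling modules on both sides of the PCB.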
#53
Dahita
freeagentThe cool part about all of this is that no one has to buy anything. There are other hobbies that are a lot more fun..

On the internet a nice computer gets you bragging rights, irl it means nothing to anyone but you.
Who is talking about bragging?! We're talking about a situation of monopoly that is being used and abused. You're on a technology/computer website, obviously the focus is going to be on these topics.

I don't think I'm in the bragging business when looking to upgrade my 2070 SUPER pal. Wake up.
Posted on Reply
#54
azrael
I'm guessing that he's trying to run Tails completely on the GPU, and apparently the 5090 has neither enough performance nor memory... :p
Posted on Reply
#55
dyonoctis
AusWolfI'm not convinced whether neural rendering is really something to look forward to, or just another Nvidia buzzword.

Anyway, the 5080 is present-day technology for present-day games. When neural rendering really kicks off, it'll probably long be obsolete, just like Turing is for RT.
It's not an Nvidia buzzword; they were the first to market it, but they're not the only ones looking into it. Intel, AMD, game studios, and academics are actively researching it. The goal is both to go beyond what rasterization can do (like good subsurface scattering, or complex shaders that accurately represent certain materials) and to make path tracing more efficient rather than relying on brute force. Raster graphics is all about using tricks and other fakery for more efficient rendering; neural rendering follows the same principle but leverages machine learning to render the visuals.
Intel Is Working on Real-Time Neural Rendering
AMD's 'Neural Supersampling' seeks to close the gap with Nvidia's DLSS - NotebookCheck.net News

There's an interesting read about how the CG industry would rather use additional power to enable more visual effects than do the same thing faster, which is an issue with 4K and high refresh rates becoming a thing. The CEO of Epic says you could give them a GPU 10× faster than what we have now and they would still find a way to bring that hardware to its knees. But hardware isn't the only limitation. We've probably reached a point where CG artists are looking to fix problems that are glaring to their eyes but that most gamers don't see. Hence any talk about why they're looking beyond raster graphics doesn't really land with people. A bit like how a seasoned painter sees areas of improvement in his craft while someone with untrained eyes thinks he has already reached his peak.
When I asked Tim Sweeney, CEO and founder of Epic Games, he was less optimistic and said, “Many aspects of rendering are still very limited by shader performance, such as the ability to achieve a high frame rate at a high resolution, with geometry, lighting, and subsurface scattering detail as precise as your eyes can see. We could easily consume another 10× shader performance without running out of improvements.”

He continued: “There are a few details that are limited not by performance but by our lack of sufficiently powerful algorithms, such as rendering faces, face animation, body animation, character dialogue, and other problems that amount to simulating human intelligence, emotion, and locomotion. If you gave us an infinitely fast GPU today, we still couldn’t do humans that were indistinguishable from reality.

“Compared to past decades when we had no clear idea of how we’d ever solve those problems, nowadays we generally think that AI trained on vast amounts of human interaction data (which Epic will only ever do under proper license!) will be able to bridge the gap to absolute realism, possibly even by the end of the decade. I’m not sure how much GPU performance that will require. Perhaps not more than today’s GPUs.”
Posted on Reply
#56
AusWolf
dyonoctisIt's not an Nvidia buzzword; they were the first to market it, but they're not the only ones looking into it. Intel, AMD, game studios, and academics are actively researching it. The goal is both to go beyond what rasterization can do (like good subsurface scattering, or complex shaders that accurately represent certain materials) and to make path tracing more efficient rather than relying on brute force. Raster graphics is all about using tricks and other fakery for more efficient rendering; neural rendering follows the same principle but leverages machine learning to render the visuals.
Intel Is Working on Real-Time Neural Rendering
AMD's 'Neural Supersampling' seeks to close the gap with Nvidia's DLSS - NotebookCheck.net News

There's an interesting read about how the CG industry would rather use additional power to enable more visual effects than do the same thing faster, which is an issue with 4K and high refresh rates becoming a thing. The CEO of Epic says you could give them a GPU 10× faster than what we have now and they would still find a way to bring that hardware to its knees. But hardware isn't the only limitation. We've probably reached a point where CG artists are looking to fix problems that are glaring to their eyes but that most gamers don't see. Hence any talk about why they're looking beyond raster graphics doesn't really land with people. A bit like how a seasoned painter sees areas of improvement in his craft while someone with untrained eyes thinks he has already reached his peak.
Raster graphics has been good to us for the last quarter of a century, and now we have to use words like "tricks and fakery" to describe it? :confused:

Other than that, interesting read. Still, my point stands - the 5080 will long be obsolete before we see anything come out of this research.
Posted on Reply
#57
Assimilator
Doesn't Snowden have better things to do, like fellate Vladimir Putin?
AusWolfRaster graphics has been good to us for the last quarter of a century, and now we have to use words like "tricks and fakery" to describe it?
Yes. Because that's what it is.
Posted on Reply
#58
Contra
Jtuck9Whether it's a Cambrian explosion or just low-hanging fruit (or confirmation bias), I'm trying not to underestimate AI. If it makes adoption and integration easier, then perhaps we'll see it sooner rather than later.
@dyonoctis : It's not an Nvidia Buzzword, they were the first to market it
You can't calculate or claim anything. Neural rendering is a SER 2.0 feature.
It's the same SER 1.0 feature implemented in Ada. This feature requires the development of a second code path in every game that uses it, as well as a second engine code path to support it. So, two years after its implementation, it is used in exactly one NV-sponsored demonstrator - CP2077.
There are no other demonstrators known to use it, or even planned ones. At this rate of adoption, this technology will never be implemented.
Thanks
Posted on Reply
#59
TumbleGeorge
freeagentOn the internet a nice computer gets you bragging rights, irl it means nothing to anyone but you.
One (or two) more things. In fact, bragging about hardware is very engaging and even life-threatening. There will always be someone with higher-end hardware than yours, causing you at least heartburn and expenses detrimental to your personal and family life; and there will always be someone poor enough that they can't afford it, who envies you and might even rob your home at gunpoint to acquire it.
Posted on Reply
#60
ZoneDymo
ok...and?
are we going to quote everyone who says anything about this release now?

I think it's been objectively shown many times now that this release is a complete joke.
But consumers still line up to buy it.....so again, why even be mad at Nvidia?
Hell, imo they should release their next line with again just 16 GB or even less memory and only actually make the 90 version better, give the rest even more frame gen and exclusive DLSS 5 for all I care.
People will buy it anyway.
Posted on Reply
#61
3valatzy
TumbleGeorgeThere will always be someone with higher-end hardware than yours
Tell me more about it - if one has an RTX 5090 and a Ryzen 9 9950X, who has "higher"?
Posted on Reply
#62
freeagent
TumbleGeorgeOne(two) more thing/s. In fact, bragging about hardware is very engaging and even life-threatening. There will always be someone with higher-end hardware than yours that will cause you at least heartburn and expenses that are detrimental to your personal and family life, and there will always be someone poor enough who can't afford it, who envies you and might even rob your home at gunpoint to acquire it.
I personally do not care about other people's computers; I like to look at them in the case pics thread. But to me it's just hardware, run what you want. I don't judge anyone for what they have in their tower.

Unless they are kind of being a jerk about it..
Posted on Reply
#63
lexluthermiester
N/ANvidia provides interesting times to live in and gets all of this whining in return.
ThomasKSays who? A beloved fanboy, that's who.
Calling someone who has just made a fairly accurate statement a fanboy says a great deal more about you than it does about them. Seriously, improve the maturity a little bit.

NVidia's business practices are not agreeable and border on the predatory, but to imply that they have not brought significant innovations to tech and the world in general is not just silly and totally blind to reality, but utterly daft.

Edward Snowden's statement is clearly not from a techie's point of view. It was a frustrated outburst from someone who expected better from NVidia. Many of us feel the same way, myself included. However, at the end of the day, he's not speaking from the perspective of a tech professional or enthusiast.
freeagentI personally do not care about other people's computers; I like to look at them in the case pics thread. But to me it's just hardware, run what you want. I don't judge anyone for what they have in their tower.
THIS! YES! :rockout: I really wish more people had this perspective instead of being snooty, self-righteous, nose-in-the-air ahole elitists!
Posted on Reply
#64
ZoneDymo
lexluthermiesterCalling someone who has just made a fairly accurate statement a fanboy says a great deal more about you than it does about them. Seriously, improve the maturity a little bit.

NVidia's business practices are not agreeable and border on the predatory, but to imply that they have not brought significant innovations to tech and the world in general is not just silly and totally blind to reality, but utterly daft.

Edward Snowden's statement is clearly not from a techie's point of view. It was a frustrated outburst from someone who expected better from NVidia. Many of us feel the same way, myself included. However, at the end of the day, he's not speaking from the perspective of a tech professional or enthusiast.
throws in a "no u are" and then talks about maturity ;)
Posted on Reply
#65
lexluthermiester
ZoneDymothrows in a "no u are" and then talks about maturity ;)
No, you are..

(hint, that was me being a smart-alec)
Posted on Reply
#66
TumbleGeorge
3valatzyTell me more about it - if one has an RTX 5090 and Ryzen 9 9950X, who has "higher" ?
There are "rumors" that high-end enthusiast components are still available. AMD still has Threadripper, and even Intel has Xeon versions in that class, although there are no longer consumer versions outside the "Core i" naming (those were on LGA 2066). In fact, the last few generations of professional graphics cards are also suitable for gaming, and Nvidia no longer calls them Quadro but RTX XXXX, like the usual consumer models, with some variants having more VRAM. Example: "NVIDIA RTX 6000 Ada Generation Graphics Card".
PS: Just recently I read about the Ryzen Threadripper 9000 series, which leaked along with its codename.
Posted on Reply
#67
MentalAcetylide
heh, "Edward Snowden", I'm sure he gets "snowed in" quite a bit where he lives. A guy complaining about scalping who is himself guilty of scalping the US government under the guise of altruistic "whistleblowing".
Posted on Reply
#68
Hecate91
Well if rasterization is tricks and fakery, then upscaling and AI fake frames are turd polish and vaseline smears on the screen.
Posted on Reply
#69
JustBenching
Hecate91Well if rasterization is tricks and fakery, then upscaling and AI fake frames are turd polish and vaseline smears on the screen.
Yes, AMD's FSR, frame generation, and AFMF are turd polish and vaseline smears, but this is an Nvidia topic, man.
Posted on Reply
#70
Hecate91
JustBenchingYes, AMD's FSR, frame generation, and AFMF are turd polish and vaseline smears, but this is an Nvidia topic, man.
You're the one who always has to bring up AMD, nice try though.
Posted on Reply
#71
TumbleGeorge
freeagentI personally do not care about other people's computers; I like to look at them in the case pics thread. But to me it's just hardware, run what you want. I don't judge anyone for what they have in their tower.

Unless they are kind of being a jerk about it..
Oops, now I see that the translation sounds like it's being personally targeted. Someone needs to fix Google Translate, finally.
Posted on Reply
#72
JustBenching
Hecate91You're the one who always has to bring up AMD, nice try though.
You were talking about AMD's features and I'm the one bringing them up? Come on now...
Posted on Reply
#73
thesmokingman
Hecate91Well if rasterization is tricks and fakery, then upscaling and AI fake frames are turd polish and vaseline smears on the screen.
I prefer the phrase lipstick on a pig!
Posted on Reply
#74
ZoneDymo
JustBenchingYes, AMD's FSR, framegeneration and AFMF are turd polish and vaseline smears but this is an nvidia topic man.
Well, this takes the cake for the most childish response in the thread.
It's also ironic, because I'm pretty sure that when DLSS was first shown, that's how people described the look of it: vaseline on the screen.
Posted on Reply
#75
Hecate91
ZoneDymook...and?
are we going to quote everyone who says anything about this release now?

I think it's been objectively shown many times now that this release is a complete joke.
But consumers still line up to buy it.....so again, why even be mad at Nvidia?
Hell, imo they should release their next line with again just 16 GB or even less memory and only actually make the 90 version better, give the rest even more frame gen and exclusive DLSS 5 for all I care.
People will buy it anyway.
I'm not even mad at Nvidia anymore; it's just sad what a state the GPU market is in, and some people still haven't realized how badly the leather jacket man is ripping them off. Even with a paper launch and no real gains over the 40 series, there is still so much hype over a disappointing generation of cards.
Posted on Reply