Saturday, February 1st 2025

Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

It's not every day that we witness a famous NSA whistleblower voice their disappointment over modern gaming hardware. Edward Snowden, who likely needs no introduction, did not hold back his disapproval of NVIDIA's recently launched RTX 5090, RTX 5080, and RTX 5070 gaming GPUs. Reviews for the RTX 5090 have been mostly positive, although the same cannot be said for its more affordable sibling, the RTX 5080. Voicing his thoughts on Twitter, Snowden claimed that NVIDIA is selling "F-tier value for S-tier prices".

Needless to say, the RTX 5090's pricing is quite exorbitant, regardless of how anyone puts it. Snowden was particularly displeased with the amount of VRAM on offer, which is also hard to argue against. The RTX 5080 ships with "only" 16 GB of VRAM, whereas Snowden believes it should have shipped with at least 24 GB, or even 32 GB. He further added that the RTX 5090, which ships with a whopping 32 GB of VRAM, should also have been offered in a 48 GB variant. As for the RTX 5070, the security consultant expressed a desire for at least 16 GB of VRAM (instead of 12 GB).
But that is not all Snowden had to say. He equated selling $1,000+ GPUs with 16 GB of VRAM to a "monopolistic crime against consumers," further accusing NVIDIA of "endless next-quarter" thinking. This is debatable, considering that NVIDIA is a publicly traded company, and whether it stays afloat does boil down to its quarterly results, whether we like it or not. There is no denying that NVIDIA is in desperate need of some true competition in the high-end segment, which appears to be the only way to get the Green Camp to price their hardware appropriately. AMD's UDNA GPUs are likely set to do just that in a year or two. The rest, of course, remains to be seen.
Source: @Snowden

132 Comments on Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

#26
Legacy-ZA
I am sure the snake leather jacket man is losing no sleep over this whole debacle. Heck, I wouldn't be surprised if it's his AI bots buying and scalping his GPUs. :roll:
Posted on Reply
#27
Hecate91
Jtuck9I've read that people are disappointed that a 5080 doesn't outperform a 4090. I'm not sure how reasonable or feasible an expectation that was? Hence the shift of emphasis toward DLSS4 perhaps?
I think people are right to be disappointed; past generations of xx80 GPUs have brought a significant performance jump. The focus on DLSS4 makes a lot of sense, though, as these cards aren't any different from the RTX 40 series; it might as well have been a refresh of the Super series. Paul's Hardware has a nice chart showing the generational improvements since the GTX 680.
Posted on Reply
#28
AusWolf
PatriotFor the same reason the rdna4 rumors are silly... going to be flagship gpus for $500... its laughable, if it performs like the competitors 1k card its going to be priced like it.
That's probably why they put it on hold until March. Maybe they really wanted to release it for $500, but then they saw Nvidia's offerings and thought "hold on a minute". Thus, the $800 9070 XT was born.
Posted on Reply
#29
Jtuck9
PatriotIt comes down to the keynote, last few gens Leather Jacket gets up and says... because of our new features the new xx70 beats the xx90 of last gen.
I'm keen to see how that plays out!

"There is no denying that NVIDIA is in desperate need of some true competition in the high-end segment, which appears to be the only way to get the Green Camp to price their hardware appropriately. AMD's UDNA GPUs are likely set to do just that in a year or two."
Hecate91I think people are right to be disappointed, past generations of xx80 gpus have been a significant performance jump, the focus on DLSS4 makes a lot of sense though as these cards aren't any different than the rtx 40 series, it might as well have been a refresh of the super series. Paul's hardware has a nice chart showing the generational improvements since the gtx680.
Show me these charts when neural rendering becomes more established (in a year or two perhaps)
Posted on Reply
#30
MxPhenom 216
ASIC Engineer
Honestly, he's kind of right. I'm a lot less excited for this gen of GPUs than I was maybe a month ago. Now I may just hold out and bum it out a bit longer with the 3080 Ti.
Posted on Reply
#31
agent_x007
Jtuck9Show me these charts when neural rendering becomes more established (in a year or two perhaps)
RT took 7 years to actually become relevant* as a requirement to run games (barely making the Turing cards that launched it relevant at all), so why do you think neural rendering will be even remotely useful on Blackwell in a shorter timeframe?
*that's one AAA title that requires it at this point. More are coming, yes - but it's still a long way off.
Posted on Reply
#32
AusWolf
Jtuck9Show me these charts when neural rendering becomes more established (in a year or two perhaps)
I'm not convinced that neural rendering is really something to look forward to rather than just another Nvidia buzzword.

Anyway, the 5080 is present-day technology for present-day games. By the time neural rendering really kicks off, it'll probably be long obsolete, just like Turing is for RT.
Posted on Reply
#33
leonavis
Jtuck9"There is no denying that NVIDIA is in desperate need of some true competition in the high-end segment, which appears to be the only way to get the Green Camp to price their hardware appropriately. AMD's UDNA GPUs are likely set to do just that in a year or two."
Yeaaa.... I'll believe that when I see it xD
Posted on Reply
#34
Hecate91
agent_x007RT took 7 years to actually become relevant* as requirement to run games (making Turing cards that launch it... relevant at most), why you think neural rendering will be even remotely useful on Blackwell in shorter timeframe ?
*that's one AAA that requires it at this point. More are coming, yes - but it's still a long time.
Agreed. Given how long it took RT to become relevant, or usable enough on a mid-range card to turn the feature on and still enjoy a game, I don't expect neural rendering to be a feature worth buying a card for now and waiting a whole year or two.
Posted on Reply
#35
Jtuck9
agent_x007RT took 7 years to actually become relevant* as requirement to run games (making Turing cards that launch it... relevant at most), why you think neural rendering will be even remotely useful on Blackwell in shorter timeframe ?
*that's one AAA that requires it at this point. More are coming, yes - but it's still a long time.
Whether it's a Cambrian explosion or just low-hanging fruit (or confirmation bias), I'm trying not to underestimate AI. If it makes adoption and integration easier, then perhaps we'll see it sooner rather than later.
Hecate91Agreed, with how long it took RT to become relevant, or usable enough on a mid range card to turn on the feature and enjoy a game with it, I don't expect neural rendering to be a feature worth buying a card now to wait a whole year or two for it.
Yea. If I'm buying an RDNA 4 card in March, I'm not thinking about it. If I were to buy a 5080, I would be.
Posted on Reply
#36
AusWolf
Hecate91Agreed, with how long it took RT to become relevant, or usable enough on a mid range card to turn on the feature and enjoy a game with it
Wait... is RT enjoyable on a midrange card now? :oops:
Posted on Reply
#37
Jtuck9
AusWolfWait... is RT enjoyable on a midrange card now? :oops:
I did think that if the latest id Tech engine were licensed out (I don't think it is?!), it might be held in better esteem.
Posted on Reply
#38
agent_x007
AusWolfWait... is RT enjoyable on a midrange card now? :oops:
It doesn't have to be.
A plain "minimum requirement" forces you to buy an RT-capable card; you simply get what you can afford and adjust game settings to make it playable (Low, DLSS, 720p, etc.). I assume, though, that someone wants to play a given XYZ game badly enough (since getting a GPU + game combo is more expensive than just buying the game).
Note: the slowest RT-capable GPUs are the RTX 3050 6 GB and the RX 6400 ;)
Posted on Reply
#39
Timbaloo
It's about time to shut down the internet. It's just used by spoilt brats b****ing and moaning about what they deserve and how evil companies are for not catering to their welfare needs.
Posted on Reply
#40
AusWolf
TimbalooIt's about time to shut down the internet. It's just used by spoilt brats b****ing and moaning about what they deserve and how evil companies are for not catering to their welfare needs.
If you don't like the conversation, don't take part in it. It's that simple. Personally, I'm not gonna sing a hymn about how great the 5080 is when it's not.
Posted on Reply
#41
RyanOCallaghan01
N/AIn retrospect:
2014 GTX 980: 398 mm², 549 USD, 4 GB
2025 RTX 5080: 378 mm², 999 USD, 16 GB

What are you complaining about? The 5080 is 7x faster and provides 4x the memory for 2x the price. 24 Gbit (3 GB) memory chips can't be released soon enough.

You have to factor in inflation; the cost of living has increased.

Nvidia provides interesting times to live in and gets all of this whining in return.
Finally, someone who sees the light without whinging so much.

Yes, people are buying these GPUs, and no, they're not all idiots. All the complaining, as if you believe in getting something for next to nothing, or forgetting how a business works, is much worse.
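For what it's worth, the inflation point above can be sanity-checked with some quick arithmetic. A minimal sketch, assuming roughly 35% cumulative US CPI inflation from 2014 to 2025 (an assumption for illustration, not a figure from the thread), and taking the "7x faster" claim at face value:

```python
# Back-of-the-envelope check of the GTX 980 vs. RTX 5080 value claim.
# Assumption (not from the thread): ~35% cumulative US CPI inflation, 2014 -> 2025.
CPI_FACTOR = 1.35

gtx980_price_2014 = 549    # USD, launch MSRP
rtx5080_price_2025 = 999   # USD, launch MSRP
perf_ratio = 7.0           # "7x faster", as claimed in the comment

# GTX 980 launch price expressed in 2025 dollars
gtx980_adjusted = gtx980_price_2014 * CPI_FACTOR          # ~741 USD

# Real price increase once inflation is factored in
real_price_ratio = rtx5080_price_2025 / gtx980_adjusted   # ~1.35x, not 2x

# Improvement in performance per inflation-adjusted dollar
perf_per_dollar_gain = perf_ratio / real_price_ratio      # ~5.2x

print(f"980 adjusted price: ${gtx980_adjusted:.0f}")
print(f"real price ratio: {real_price_ratio:.2f}x")
print(f"perf/dollar gain: {perf_per_dollar_gain:.1f}x")
```

On those assumptions, the nominal "2x the price" shrinks to roughly 1.35x in real terms, so the performance-per-dollar gain lands around 5x rather than 3.5x.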
Posted on Reply
#42
Hecate91
Jtuck9Yea. If I'm buying an RDNA 4 card in March, I'm not thinking about it. If I were to buy a 5080, I would be.
I just want more GPU power for the money, not software features.
AusWolfWait... is RT enjoyable on a midrange card now? :oops:
It depends on whether you even notice RT enough to trade game settings for shinier details. IMO we're still not there yet at what I consider midrange ($400-500), but everyone considers $700+ midrange now.
It also reminds me of a HWUB video; RT in their testing was a mix of "hardly noticeable unless you look for it," "difficult to tell but different," and some games they thought did look better.
Posted on Reply
#43
eidairaman1
The Exiled Airman
Maybe he should buy an RX 7900 XTX or an RX 9070 XT
Posted on Reply
#44
Sound_Card
leonavisYeaaa.... I believe that when I see it xD
Custom UDNA is already taped out, and Sony is testing it right now in prototype PS6 boxes.

I believe Navi 41 was canceled to put more engineers on UDNA, because all indications seem to point to UDNA being way ahead of schedule. For some reason, I'm getting the feeling UDNA might be a Zen moment. Not to derail ...
Posted on Reply
#45
Denver
freeagentThe cool part about all of this is that no one has to buy anything. There are other hobbies that are a lot more fun..

On the internet a nice computer gets you bragging rights, irl it means nothing to anyone but you.
I support the idea. What's the method of undoing Jensen's lobotomy?
Posted on Reply
#46
eidairaman1
The Exiled Airman
DenverI support the idea. What's the method of undoing Jensen's lobotomy?
Yeah, by voting with your wallet.
Posted on Reply
#47
leonavis
Sound_CardCustom UDNA is already taped out and Sony is testing it right now on prototype PS6 boxes.

I believe Navi 41 was canceled to put more engineers on UDNA because all indication seems to point that UDNA is way ahead of schedule. For some reason, I'm getting the feeling UDNA might be a Zen. Not to derail ...
Well, I sure hope you're right.

Best for my use case would be a badass X3D APU: no graphics card in the PC, just a 420 mm watercooled CPU that's efficient and still powerful enough to run modern games. What a dream. ^^
Posted on Reply
#48
R0H1T
freeagentAt least someone is being vocal about it. Not just crying about prices on forums.
What's he doing besides crying on Twitter X :rolleyes:
Posted on Reply
#49
agent_x007
DenverI support the idea. What's the method of undoing Jensen's lobotomy?
Make all games that require RTX to work too expensive to play (or plain boring/frustrating/"not fun" to play), and make games that are fun to play too easy to run (so that a GPU upgrade is never required again).

This should limit NV's gaming business and eventually force them to move on due to not generating enough profits.

And yes, this kind of thing would kill ALL GPU gaming business after a while (not only NV's).
Good thing it's impossible to do at this stage ;D
Posted on Reply
#50
AusWolf
agent_x007Make all games that require RTX to work too expensive to be played (or plain boring/frustrating/"not fun" to play), and make games that are fun to play too easy to run (so that GPU upgrade is never required again).

This should limit NV gaming business, and eventually force them to move on due to not generating enough profits.

And yes, this kind of thing will kill ALL GPU gaming business after a while (not only NV).
Good thing it's impossible to do at this stage ;D
Actually, there's a myriad of good games that don't need a high-end GPU. You just need to look further than the typical copy-paste AAA crap.
Posted on Reply