
Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

Make all games that require RTX to work too expensive to be played (or plain boring/frustrating/"not fun" to play), and make games that are fun to play too easy to run (so that GPU upgrade is never required again).

This should limit NV gaming business, and eventually force them to move on due to not generating enough profits.

And yes, this kind of thing will kill ALL GPU gaming business after a while (not only NV).
Good thing it's impossible to do at this stage ;D
Actually, there's a myriad of good games that don't need a high-end GPU. You just need to look further than the typical copy-paste AAA crap.
 
Actually, there's a myriad of good games that don't need a high-end GPU. You just need to look further than the typical copy-paste AAA crap.
I know, but that's not even close to being enough to break NV monopoly/"lobotomy" :(
 
24GB isn't possible with a 256-bit bus yet. I doubt that people would still want an overpriced 192-bit card like the 4070 Ti was.
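For what it's worth, a quick back-of-the-envelope sketch of why that is, assuming standard 32-bit GDDR channels and the 2 GB modules that were shipping at the time (clamshell layouts, which double capacity, are left out):

Code:
# Back-of-the-envelope GDDR capacity math for a given memory bus width.
# Assumes one memory chip per 32-bit channel (standard for GDDR6/GDDR7)
# and ignores clamshell layouts, which put two chips per channel.

def vram_capacity_gb(bus_width_bits: int, chip_capacity_gb: int) -> int:
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return chips * chip_capacity_gb

print(vram_capacity_gb(256, 2))   # 16 GB -- 256-bit bus with 2 GB modules
print(vram_capacity_gb(256, 3))   # 24 GB -- would need 3 GB modules
print(vram_capacity_gb(192, 2))   # 12 GB -- the 4070 Ti configuration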
 
The cool part about all of this is that no one has to buy anything. There are other hobbies that are a lot more fun.

On the internet a nice computer gets you bragging rights, irl it means nothing to anyone but you.
Who is talking about bragging?! We're talking about a monopoly that is being used and abused. You're on a technology/computer website; obviously the focus is going to be on these topics.

I don't think I'm in the bragging business when looking to upgrade my 2070 SUPER, pal. Wake up.
 
I'm guessing that he's trying to run Tails completely on the GPU, and apparently the 5090 has neither enough performance nor memory... :p
 
I'm not convinced whether neural rendering is really something to look forward to, or just another Nvidia buzzword.

Anyway, the 5080 is present-day technology for present-day games. By the time neural rendering really kicks off, it'll probably be long obsolete, just like Turing is for RT.
It's not an Nvidia buzzword. They were the first to market it, but they're not the only ones looking into it: Intel, AMD, game studios and academics are actively researching it. The goal is both to go beyond what rasterization can do (like good subsurface scattering, or complex shaders that accurately represent certain materials) and to make path tracing more efficient rather than relying on brute force. Raster graphics is all about using tricks and other fakery for more efficient rendering. Neural rendering follows the same principle, but leverages machine learning to render the visuals.
Intel Is Working on Real-Time Neural Rendering
AMD's 'Neural Supersampling' seeks to close the gap with Nvidia's DLSS - NotebookCheck.net News
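To make the idea concrete, here's a minimal toy sketch of the "neural shading" concept: a small MLP is trained to approximate an expensive material/shading function, and at render time the cheap network is evaluated instead of the full model. Everything here is made up for illustration and is not NVIDIA's, Intel's, or AMD's actual pipeline; the real research runs networks like this inside shaders on the GPU.

Code:
# Toy "neural shading" example: train a tiny MLP to approximate an expensive
# shading function, then evaluate the cheap network instead of the original.
# Purely illustrative -- not any vendor's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def expensive_shading(x):
    # Stand-in for a costly material model (subsurface scattering, layered BRDF, ...)
    return np.sin(3 * x[:, :1]) * np.cos(2 * x[:, 1:2]) + 0.5 * x[:, 2:3] ** 2

# Tiny 2-layer MLP trained with plain gradient descent on mean squared error.
W1 = rng.normal(0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

X = rng.uniform(-1, 1, (4096, 3))        # e.g. view/light/surface parameters
Y = expensive_shading(X)                 # "ground truth" from the slow path

lr = 0.05
for step in range(2000):
    H = np.maximum(0, X @ W1 + b1)       # ReLU hidden layer
    pred = H @ W2 + b2
    err = pred - Y
    # Backpropagation by hand
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (H > 0)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# At "render time" the cheap network replaces the expensive function.
test = rng.uniform(-1, 1, (5, 3))
approx = np.maximum(0, test @ W1 + b1) @ W2 + b2
print(np.c_[expensive_shading(test), approx])   # reference vs. neural approximation

The real thing trades an exact but expensive evaluation for a learned, cheap approximation in exactly this spirit, just per material or per scene and running on the GPU.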

There's an interesting read about how the CG industry would rather use additional power to enable more visual effects than to do the same thing faster, which is an issue now that 4K and high refresh rates are becoming a thing. The CEO of Epic says that you could give them a GPU 10× faster than what we have now and they would still find a way to bring that hardware to its knees. But hardware isn't the only limitation. We've probably reached a point where CG artists are looking to fix problems that are glaring to their eyes, but that most gamers don't see. Hence, any talk about why they are looking beyond raster graphics doesn't really land with people. A bit like how a seasoned painter would see areas of improvement in his craft, while someone with untrained eyes might think he has already reached his peak.
When I asked Tim Sweeney, CEO and founder of Epic Games, he was less optimistic and said, “Many aspects of rendering are still very limited by shader performance, such as the ability to achieve a high frame rate at a high resolution, with geometry, lighting, and subsurface scattering detail as precise as your eyes can see. We could easily consume another 10× shader performance without running out of improvements.”

He continued: “There are a few details that are limited not by performance but by our lack of sufficiently powerful algorithms, such as rendering faces, face animation, body animation, character dialogue, and other problems that amount to simulating human intelligence, emotion, and locomotion. If you gave us an infinitely fast GPU today, we still couldn’t do humans that were indistinguishable from reality.

“Compared to past decades when we had no clear idea of how we’d ever solve those problems, nowadays we generally think that AI trained on vast amounts of human interaction data (which Epic will only ever do under proper license!) will be able to bridge the gap to absolute realism, possibly even by the end of the decade. I’m not sure how much GPU performance that will require. Perhaps not more than today’s GPUs.”
 
It's not an Nvidia buzzword. They were the first to market it, but they're not the only ones looking into it: Intel, AMD, game studios and academics are actively researching it. The goal is both to go beyond what rasterization can do (like good subsurface scattering, or complex shaders that accurately represent certain materials) and to make path tracing more efficient rather than relying on brute force. Raster graphics is all about using tricks and other fakery for more efficient rendering. Neural rendering follows the same principle, but leverages machine learning to render the visuals.
Intel Is Working on Real-Time Neural Rendering
AMD's 'Neural Supersampling' seeks to close the gap with Nvidia's DLSS - NotebookCheck.net News
There's an interesting read about how the CG industry would rather use additional power to enable more visual effects than to do the same thing faster, which is an issue now that 4K and high refresh rates are becoming a thing. The CEO of Epic says that you could give them a GPU 10× faster than what we have now and they would still find a way to bring that hardware to its knees. But hardware isn't the only limitation. We've probably reached a point where CG artists are looking to fix problems that are glaring to their eyes, but that most gamers don't see. Hence, any talk about why they are looking beyond raster graphics doesn't really land with people. A bit like how a seasoned painter would see areas of improvement in his craft, while someone with untrained eyes might think he has already reached his peak.
Raster graphics has been good for us for the last quarter of a century, and now we have to use words like "tricks and fakery" to describe it? :confused:

Other than that, interesting read. Still, my point stands - the 5080 will be long obsolete before we see anything come out of this research.
 
Doesn't Snowden have better things to do, like fellate Vladimir Putin?

Raster graphics has been good for us for the last quarter of a century, and now we have to use words like "tricks and fakery" to describe it?
Yes. Because that's what it is.
 
Whether it's a Cambrian explosion or just low-hanging fruit (or confirmation bias), I'm trying not to underestimate AI. If it makes adoption and integration easier, then perhaps we'll see it sooner rather than later.
@dyonoctis : It's not an Nvidia buzzword. They were the first to market it
You can't calculate or claim anything. Neural rendering is a SER 2.0 feature.
It's the same SER 1.0 feature implemented in Ada. This feature requires the development of a second code path in every game that uses it, as well as a second engine code path to support it. So, two years after its implementation, it is used in exactly one NV-sponsored demonstrator - CP2077.
There are no other demonstrators known to use it, or even planned ones. At this rate of adoption, this technology will never be implemented.
Thanks
 
On the internet a nice computer gets you bragging rights, irl it means nothing to anyone but you.
One (or two) more things. In fact, bragging about hardware is very engaging and even life-threatening. There will always be someone with higher-end hardware than yours who will cause you at least heartburn and expenses that are detrimental to your personal and family life, and there will always be someone poor enough that they can't afford it, who envies you and might even rob your home at gunpoint to acquire it.
 
OK... and?
Are we going to quote everyone who says anything about this release now?

I think it's been objectively shown many times now that this release is a complete joke.
But consumers still line up to buy it... so again, why even be mad at Nvidia?
Hell, IMO they should release their next line with just 16 GB again, or even less memory, and only actually make the 90 version better; give the rest even more frame gen and exclusive DLSS 5 for all I care.
People will buy it anyway.
 
One (or two) more things. In fact, bragging about hardware is very engaging and even life-threatening. There will always be someone with higher-end hardware than yours who will cause you at least heartburn and expenses that are detrimental to your personal and family life, and there will always be someone poor enough that they can't afford it, who envies you and might even rob your home at gunpoint to acquire it.
I personally do not care about other people's computers; I like to look at them in the case pics thread. But to me it's just hardware, run what you want. I don't judge anyone for what they have in their tower.

Unless they are kind of being a jerk about it..
 
Nvidia provides interesting times to live in and gets all of this whining in return.
Says who? A beloved fanboy, that's who.
Calling someone who has just made a fairly accurate statement a fanboy says far more about you than it does about them. Seriously, improve the maturity a little bit.

NVidia's business practices are not agreeable and border on the predatory, but to imply that they have not brought significant innovations to tech and the world in general is not just silly and totally blind to reality, but utterly daft.

Edward Snowden's statement is clearly not from a techie's point of view. It was a frustrated outburst from someone who expected better from NVidia. Many of us feel the same way, myself included. However, at the end of the day, he's not talking from the perspective of a tech professional or enthusiast.

I personally do not care about other people's computers; I like to look at them in the case pics thread. But to me it's just hardware, run what you want. I don't judge anyone for what they have in their tower.
THIS! YES! :rockout: I really wish more people had this perspective instead of being snooty, self-righteous, nose-in-the-air ahole elitists!
 
Calling someone who has just made a fairly accurate statement a fanboy says far more about you than it does about them. Seriously, improve the maturity a little bit.

NVidia's business practices are not agreeable and border on the predatory, but to imply that they have not brought significant innovations to tech and the world in general is not just silly and totally blind to reality, but utterly daft.

Edward Snowden's statement is clearly not from a techie's point of view. It was a frustrated outburst from someone who expected better from NVidia. Many of us feel the same way, myself included. However, at the end of the day, he's not talking from the perspective of a tech professional or enthusiast.

Throws in a "no u are" and then talks about maturity ;)
 
Tell me more about it: if one has an RTX 5090 and a Ryzen 9 9950X, who has the "higher-end" setup?
There are "rumors" that high-end enthusiast components are still available. AMD still has Threadripper, and even Intel has Xeon models in that class, although there are no more consumer HEDT parts under the "Core i" branding (the ones that were on LGA 2066). In fact, the last few generations of professional graphics cards are also suitable for gaming, and Nvidia no longer calls them Quadro; they're named RTX XXXX like the usual consumer models, except some of them have more VRAM. Example: "NVIDIA RTX 6000 Ada Generation Graphics Card".
P.S. Just recently I read that the Ryzen Threadripper 9000 series leaked with its code names.
 
Well if rasterization is tricks and fakery, then upscaling and AI fake frames are turd polish and vaseline smears on the screen.
Yes, AMD's FSR, frame generation and AFMF are turd polish and vaseline smears, but this is an Nvidia topic, man.
 
I personally do not care about other people's computers; I like to look at them in the case pics thread. But to me it's just hardware, run what you want. I don't judge anyone for what they have in their tower.

Unless they are kind of being a jerk about it..
Oops, now I see that the translation sounds like it's being personally targeted. Someone needs to fix Google Translate, finally.
 
Well if rasterization is tricks and fakery, then upscaling and AI fake frames are turd polish and vaseline smears on the screen.
I prefer the phrase lipstick on a pig!
 
Yes, AMD's FSR, frame generation and AFMF are turd polish and vaseline smears, but this is an Nvidia topic, man.

Well this takes the cake for the most childish response on the thread.
It's also ironic because I'm pretty sure that when DLSS was first shown, that was exactly how people described the look of it: vaseline on the screen.
 