
Edward Snowden Lashes Out at NVIDIA Over GeForce RTX 50 Pricing And Value

ok...and?
are we going to quote everyone who says anything about this release now?

I think it's been objectively shown many times now that this release is a complete joke.
But consumers still line up to buy it... so again, why even be mad at Nvidia?
Hell, IMO they should release their next line with just 16 GB of memory or even less, only make the 90 version actually better, and give the rest even more frame gen and exclusive DLSS 5 for all I care.
People will buy it anyway.
I'm not even mad at Nvidia anymore; it's just sad what the state of the GPU market is, and some people still haven't realized how badly the leather jacket man is ripping them off, since even with a paper launch and no real gains over the 40 series there is still so much hype over a disappointing gen of cards.
 
In retrospect:
GTX 980 (2014): 398 mm², 549 USD, 4 GB
RTX 5080: 378 mm², 999 USD, 16 GB

What are you complaining about? The 5080 is 7x faster and provides 4x the memory for 2x the price. 24 Gbit (3 GB) chips can't be released soon enough.

You have to factor in inflation; the cost of living has increased.
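To put a rough number on that (a quick sketch; the ~33% cumulative US CPI figure for 2014-2025 is an approximation and depends on the index used):

```python
# Rough check: the GTX 980's $549 (2014) MSRP in 2025 dollars,
# assuming ~33% cumulative US CPI inflation (approximate).
CPI_FACTOR = 1.33

gtx_980_msrp = 549
rtx_5080_msrp = 999

adjusted = gtx_980_msrp * CPI_FACTOR
print(f"GTX 980 MSRP in 2025 dollars: ~${adjusted:.0f}")                 # ~$730
print(f"RTX 5080 premium over that:   ~{rtx_5080_msrp / adjusted:.2f}x") # ~1.37x
```

So even inflation-adjusted, the 5080 sits roughly 35-40% above the 980's launch price, for what it's worth.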

Nvidia provides interesting times to live in and gets all of this whining in return.
We want Nvidia to be a company for the people and sell us GPUs at a loss out of the kindness of their hearts.

Well, this takes the cake for the most childish response in the thread.
It's also ironic, because I'm pretty sure that the first time DLSS was shown, that was how people described the look of it: Vaseline on the screen.
Yeap, DLSS 1 was disgusting (worse than a traditional upscaler, lol), but I still don't get why we have to bring AMD into this thread.
 
We want Nvidia to be a company for the people and sell us GPUs at a loss out of the kindness of their hearts.


Yeap, DLSS 1 was disgusting (worse than a traditional upscaler, lol), but I still don't get why we have to bring AMD into this thread.

We don't; you brought up AMD in response to generalized criticism of upscalers and frame generators.
So you should really ask yourself that question.

Oh, and on that other part... idk how old you are, but man... there is a line between "giving stuff away at a loss" and "ripping people off for ever-growing profits"...
I would LOVE it if Nvidia (and AMD and Intel, for that matter) were forced to disclose how much profit they make on every GPU sold, per type.

It also reminds me of that disgusting corporate "pity us" phrasing of "it's no longer cost effective". Oh really? It isn't? How so? How about you disclose exactly how much pure profit the game provided? And that's before you fire all your personnel to give the CEO another 50 million dollar bonus...
 
And still, TPU will have a 5xxx owners thread and umpteen posts of OCs and benchies.
 
And still, TPU will have a 5xxx owners thread and umpteen posts of OCs and benchies.
The exact same way 7xxx users bragged about VRAM and mocked everything under it on the green side?

Yeah I can see that.
 
Read "lashes out at nvidia" and brain thought Linus Torvalds for a sec
 
We want Nvidia to be a company for the people and sell us GPUs at a loss out of the kindness of their hearts.
Jensen isn't gonna have to sell his jacket collection if he lowers his 70% profit margins by 10-15%
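Quick back-of-the-envelope on that claim (toy math; assumes "70% margin" means gross margin on the selling price, which is a simplification):

```python
# Toy gross-margin math: how far could the price fall if the margin
# dropped from 70% to 55% (a 15-point cut)? Purely illustrative.
price = 999
cost = price * (1 - 0.70)         # implied cost at 70% margin: ~$300

new_price = cost / (1 - 0.55)     # price that still yields a 55% margin
print(f"Implied cost: ~${cost:.0f}")              # ~$300
print(f"Price at 55% margin: ~${new_price:.0f}")  # ~$666
```

Under those toy assumptions, a 15-point margin cut could take a $999 card to roughly $650-700 while still leaving a healthy margin.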
The exact same way 7xxx users bragged about VRAM and mocked everything under it on the green side?

Yeah I can see that.
Or all the times I've seen people with RTX 30 and 40 series cards say VRAM didn't matter.
So much for not judging what other people have in their tower though.
 
And still, TPU will have a 5xxx owners thread and umpteen posts of OCs and benchies.
Yeah, very likely. Speaking of...
Here we go.
 
Jensen isn't gonna have to sell his jacket collection if he lowers his 70% profit margins by 10-15%

Or all the times I've seen people with RTX 30 and 40 series cards say VRAM didn't matter.
So much for not judging what other people have in their tower though.
VRAM doesn't bother me like it does you.

I do not care what you have in your tower.

If you love it so much go play a game and quit whining about everything in every thread.
 
You can't calculate or claim anything. Neural rendering is a SER 2.0 feature.
It's the same SER 1.0 feature implemented in Ada. This feature requires the development of a second code path in every game that uses it, as well as a second engine code path to support it. So, two years after its implementation, it is used in exactly one NV-sponsored demonstrator: CP2077.
There are no other demonstrators known to use it, nor even planned ones. At this rate of adoption, this technology will never see wide use.
Thanks
Are you talking about ray reconstruction in Cyberpunk? There are more games using that feature. Other than that, Cyberpunk is really not a "neural rendering demonstrator"; DirectX 12 only made the necessary components for neural rendering publicly available a few weeks ago, and the current gen of consoles doesn't have the hardware to enable it. The RTX Remix version of Half-Life 2 (Nvidia is obviously very involved in the project) is currently the only game that will allow an early preview of the tech.
Microsoft prepares DirectX to support neural rendering for AI-powered graphics — a key feature of the update will be Cooperative Vector support | Tom's Hardware
I played Half-Life 2 RTX with Nvidia neural rendering, and it looks damn fine
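For anyone wondering what "neural rendering" actually means mechanically: the core idea is evaluating a tiny neural network inside the shader, e.g. to decompress textures or approximate materials. Here's a minimal CPU-side sketch of the concept; the toy MLP and its random weights are placeholders, and this is not the actual DirectX Cooperative Vector API:

```python
import numpy as np

# Toy "neural texture": a tiny MLP mapping (u, v) coordinates to an RGB color,
# standing in for what a shader would evaluate per pixel with cooperative
# vectors. A real pipeline would train the weights to reproduce a compressed
# texture or material; these are random placeholders.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

def neural_texel(u: float, v: float) -> np.ndarray:
    h = np.maximum(np.array([u, v]) @ W1 + b1, 0.0)  # hidden layer, ReLU
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # RGB in [0, 1]

print(neural_texel(0.25, 0.75))
```

The selling point is that the network can be much smaller than the texture data it replaces, trading memory for a little per-pixel compute.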
 
VRAM doesn't bother me because I have more than enough. ;)
And if people didn't care what was in someone else's system, there wouldn't be childish mocking over it while trying to turn this into a brand fight.
Anyway, the refusal to accept general criticism is part of the problem. Until some people realize the leather jacket man doesn't care about the gaming market and vote with their wallets, Nvidia will continue to rip off everyone lining up to buy the next card.
edit- thank you for the laugh react, very mature.
 
Raster graphics has been good to us for the last quarter of a century, and now we have to use words like "tricks and fakery" to describe it? :confused:

Other than that, interesting read. Still, my point stands: the 5080 will be long obsolete before we see anything come out of this research.
That's what raster graphics mainly is. Having played around with both, I can tell you that offline path-traced graphics are easier to work with. Here's a perfect picture to illustrate the hurdle of making a mirror in games. In offline 3D this is super easy to do: make a reflective shader and you're done.

It's not as pejorative as it sounds; even movies make use of tricks sometimes to optimize render time. But raster isn't the top end of CG. It can look very good, but it has drawbacks when it comes to accurately depicting the world; otherwise, it would have been far more popular for rendering movies. The Witcher 4 trailer in UE5 looked really good, yet UE didn't make offline renderers obsolete.
How Do Mirrors in Video Games Work?
[attachment: alignment chart of mirror rendering techniques in games]
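To make the "make a reflective shader and you're done" point concrete: in a ray/path tracer a mirror is literally one reflected-ray bounce, while raster engines have to fake it (render-to-texture, screen-space reflections, etc.). A toy sketch (hypothetical minimal scene, not any real engine):

```python
import numpy as np

def reflect(d, n):
    """Perfect mirror bounce: the one line that makes mirrors 'free' in ray tracing."""
    return d - 2.0 * np.dot(d, n) * n

def trace(origin, direction, depth=0):
    # Toy scene: a mirror plane at y = 0; everything else is sky.
    if depth > 4:
        return np.zeros(3)
    if direction[1] < 0:                              # ray heading down hits the mirror
        t = -origin[1] / direction[1]
        hit = origin + t * direction
        bounced = reflect(direction, np.array([0.0, 1.0, 0.0]))
        return 0.9 * trace(hit, bounced, depth + 1)   # recurse: reflection with slight tint
    return np.array([0.5, 0.7, 1.0])                  # sky color

print(trace(np.array([0.0, 1.0, 0.0]), np.array([0.3, -0.5, 1.0])))
```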
 
How is anything you see on your screen anything other than some kind of trick or fakery?
I could turn that around and ask why you have such a problem with upscalers, but I won't because it's not liable to generate a productive discussion. Instead I highly suggest you do some research on how rasterisation fundamentally works, the inherent problems WRT lights and shadows that are caused by how it works, and how and why ray-tracing doesn't (cannot) suffer from the same drawbacks.

If I had to sum it up in the context of this thread, though: rasterisation is to rendering what upscaling is to native resolution, whereas ray-tracing is rendering at native quality.
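A concrete illustration of that lights-and-shadows drawback (toy code, not any real renderer): a raster fragment shader only has local data, so it can't answer "is something between me and the light?", while a ray tracer just casts a shadow ray against the whole scene.

```python
import numpy as np

def lambert(p, n, light):
    """Local diffuse term: all a raster fragment shader can compute on its own."""
    l = light - p
    l = l / np.linalg.norm(l)
    return max(float(n @ l), 0.0)

def shadow_blocked(p, light, sphere_c, sphere_r):
    """Ray-traced visibility query: does a sphere block the segment p -> light?"""
    d = light - p
    t = np.clip((sphere_c - p) @ d / (d @ d), 0.0, 1.0)  # closest approach on segment
    return np.linalg.norm(p + t * d - sphere_c) < sphere_r

p, n = np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
light = np.array([0.0, 4.0, 0.0])
sphere_c, sphere_r = np.array([0.0, 2.0, 0.0]), 0.5      # occluder between p and light

print("raster (no scene query):", lambert(p, n, light))  # 1.0 -- wrongly lit
lit = 0.0 if shadow_blocked(p, light, sphere_c, sphere_r) else lambert(p, n, light)
print("ray traced shadow test:", lit)                    # 0.0 -- correctly in shadow
```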
 
That's what raster graphics mainly is. Having played around with both, I can tell you that offline path-traced graphics are easier to work with. Here's a perfect picture to illustrate the hurdle of making a mirror in games. In offline 3D this is super easy to do: make a reflective shader and you're done.

It's not as pejorative as it sounds; even movies make use of tricks sometimes to optimize render time. But raster isn't the top end of CG. It can look very good, but it has drawbacks when it comes to accurately depicting the world; otherwise, it would have been far more popular for rendering movies. The Witcher 4 trailer in UE5 looked really good, yet UE didn't make offline renderers obsolete.
How Do Mirrors in Video Games Work?
[attachment: alignment chart of mirror rendering techniques in games]

That chaotic evil just reminds me of Hogwarts with its shiny chalkboards.

Ray-traced reflections on things that don't need them.
 
That chaotic evil just reminds me of Hogwarts with its shiny chalkboards.

Ray-traced reflections on things that don't need them.
High-grade blackboards are made from porcelain-enameled steel, which is a reflective material. So you're basically upset about realism, which is... okay, that's a take of all time.
 
And he is going to do that, why? We should thank god he is still making gaming GPUs, else we'd be stuck with...
I don't thank god for any corporations, thanks.
 
That chaotic evil just reminds me of Hogwarts with its shiny chalkboards.

Ray-traced reflections on things that don't need them.
It isn't just Hogwarts, but IMO the floors don't always have to be shiny either.
Things that don't need to be soaking-wet shiny are one annoyance I have with RT reflections.
And he is going to do that, why? We should thank god he is still making gaming GPUs, else we'd be stuck with...
Oh yes, thank god Nvidia is still giving the scraps to gamers while upselling every tier. I'd rather be stuck with some fair competition between AMD and Intel than the monopoly, marketing lies, and anti-consumer tactics from Nvidia.
 
How is anything you see on your screen anything other than some kind of trick or fakery?

If you actually want to learn how game rendering works, start here:
 
Oh yes, thank god Nvidia is still giving the scraps to gamers while upselling every tier. I'd rather be stuck with some fair competition between AMD and Intel than the monopoly, marketing lies, and anti-consumer tactics from Nvidia.
The notion that Nvidia gives scraps to gamers cannot be true, for the simple fact that if it were true, their competitors would be offering twice the performance by now, but they don't. Something doesn't add up here. I'm giving AMD the benefit of the doubt: if their 9070 XT doesn't match the 5080 at half the price, then they are also giving us "the scraps", in which case the phrase itself doesn't mean anything.
 
until some people realize the leather jacket man doesn't care about the gaming market
Sure, sure, Nvidia doesn’t care about gamers. They engineered all this gaming technology:


because they don't care. They did it because they didn't have anything else to do, I'm sure.

Back onto ignore with you. I’ll check again next month to see if a clue has seeped into your brain.
 