
NVIDIA GeForce RTX 50 Technical Deep Dive

Why do I get the feeling that the introduction of the Cooperative Vectors API for DirectX and Slang will end up being much more significant than we now realize?

A lot of these features like Mega-Geometry have the potential to be significant but it's always hard to tell what developers will adopt.
 
The RTX 5090 should have been water cooled only. You need to push that heat out of the case without warming up other stuff inside, mainly the CPU. The RTX 5090 literally deserves an AIO mounted on top of the case: 420 mm and at least a 60 mm thick radiator. EDIT: Forgot to mention a second radiator, lol; one radiator is not enough for 575 W.
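For what it's worth, the "one radiator is not enough for 575 W" claim can be sanity-checked with the common enthusiast rule of thumb of roughly 100 W of heat per 120 mm of radiator at quiet fan speeds. A minimal sketch (the rule of thumb is a rough heuristic, not a measured figure):

```python
# Back-of-the-envelope radiator sizing for a 575 W card, using the common
# enthusiast rule of thumb of ~100 W of heat dissipated per 120 mm of
# radiator at quiet fan speeds (a rough heuristic, not a measured figure).
TDP_W = 575                 # RTX 5090 rated board power
W_PER_120MM_QUIET = 100     # assumed dissipation per 120 mm at quiet speeds

radiator_mm_needed = TDP_W / W_PER_120MM_QUIET * 120
print(f"~{radiator_mm_needed:.0f} mm of radiator for quiet operation")
# A single 420 mm radiator falls well short of that at quiet fan speeds,
# which is the arithmetic behind wanting a second radiator.
```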
 
The RTX 5090 should have been water cooled only. You need to push that heat out of the case without warming up other stuff inside. The RTX 5090 literally deserves an AIO mounted on top of the case: 420 mm and at least a 38 mm thick radiator.

The problem with that is a 10-year+ lifespan card, which the 5090 easily is for most people with backlogs of games, combined with future games using frame gen. AIOs simply don't last that long. An innovative blower fan design, working with a company like Noctua to design a new fan for it, would have been the way I went personally. Two fans rotating in opposite directions to reduce vibrations = lower noise levels, combined with Noctua fan technology = a quiet blower fan that can handle a 575 W card. I might be wrong, but I bet those engineers could have figured it out 100%.
 
How do they achieve 35 ms latency with AI stuff in the pipeline when on native it is 70 ms? Please, someone explain.

Simple: it takes much longer to render a frame with DLSS off. The latency reduction from DLSS upscaling more than offsets the latency increase from frame gen. Frame gen only increases latency if you compare it to DLSS upscaling alone, not to native.
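The arithmetic behind that answer can be sketched with hypothetical numbers chosen only to match the 35 ms vs 70 ms figures quoted above (none of these are measured values):

```python
# Illustrative latency arithmetic (all numbers hypothetical, chosen to
# match the figures quoted in the thread, not measured values).
NATIVE_LATENCY_MS = 70.0      # hypothetical end-to-end latency at native res
UPSCALED_LATENCY_MS = 25.0    # hypothetical latency with DLSS upscaling + Reflex
FRAMEGEN_PENALTY_MS = 10.0    # hypothetical delay added by frame generation

with_framegen = UPSCALED_LATENCY_MS + FRAMEGEN_PENALTY_MS
print(f"native: {NATIVE_LATENCY_MS:.0f} ms, DLSS + frame gen: {with_framegen:.0f} ms")

# Frame gen adds latency relative to upscaling alone, but the combined
# pipeline can still undercut native rendering:
assert with_framegen > UPSCALED_LATENCY_MS
assert with_framegen < NATIVE_LATENCY_MS
```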
 
5080 only 15% Faster Than 4080 in pure raster performance???
The more you buy the more you get Fed by The Leather jacket! FU NGreedia:nutkick:

 
5080 only 15% Faster Than 4080 in pure raster performance???
That's probably Nvidia's cherry-picked result, too! :laugh: It actually may be closer to a ~10% gain overall.
 
@Space Lynx
Anyone paying 1K+ for a single part should have the funds to use an AIO for the CPU (at least), reducing overall case temps as well.
And while many AIOs in the lower price range might not last, I doubt you will have problems with anything like an Eisbaer, especially considering you can upgrade/replace everything (G1/4 fittings).
I'm now using mine (converted to include the GPU) for over 4 years now, without any problems.
 
A lot of these features like Mega-Geometry have the potential to be significant but it's always hard to tell what developers will adopt.
Developers are writing games for consoles first, and will only adopt extra features for the PC port if it's easy, or if Nvidia literally invades their studio and offers on-site support and wads of money.
 
A lot of these features like Mega-Geometry have the potential to be significant but it's always hard to tell what developers will adopt.

Yeah, this and neural rendering are the most interesting, and maybe Mega-Geometry at least being able to run on older RTX cards will help its adoption. The problem is the next generation will likely be out before they are widely adopted.

Ada had some specific techniques that were never really used because they only run on Ada...


Still, this is feeling like Turing 2.0, where the benefits are going to take time and we might be moving to the 60 series before they actually matter.
 
Why do I get the feeling that the introduction of the Cooperative Vectors API for DirectX and Slang will end up being much more significant than we now realize?
So was ray tracing years ago, and yet here we are, still trying to figure out ways of patching out issues in image quality and performance. It did not turn out how people expected.

That's probably Nvidia's cherry-picked result, too! :laugh: It actually may be closer to a ~10% gain overall.
I calculated a similar performance uplift from a DF video in a thread a while ago, and many were still coping, believing a 5080 will actually be faster than a 4090.

Nvidia literally invades their studio and offers on-site support and wads of money.
Best they can do is ask for wads of money.
 
That's a lot of marketing guff trying to disguise a poor generational uplift...

What is that phrase... Tell me there isn't a noticeable performance improvement without telling me there isn't a noticeable performance improvement... lol.

Below the 5090 this is looking like a 40 series Super refresh 2.0; they could have called the 5080 the 4080 Super Duper... lol
 
Last edited:
As someone who moved from a 3060 Ti (300€) to a 4070 Ti Super (paid 700€), I have zero upgrade options unless I'm willing to spend 1250€ after taxes to gain 25% performance. Wth, Nvidia? Jensen obviously doesn't want gamers' money anymore :confused:
 
In a way, I kind of like that the raw performance uplift is only 10-15% (excluding the 5090), as it gives the competition a higher chance to catch up. I hope both AMD and Intel will be competitive enough for prices to come down.
 
Will reviewers call out the tiny uplift in hardware specs for what it is, I wonder? In the mid-range 5070, you get a single-digit uplift in shaders (4.35%), and the native performance increase won't be more than 10% in the best-case scenario.

Meanwhile, Nvidia sells you upscaling and increases their margins...

But who'll tell it how it actually is?
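For reference, that 4.35% figure matches the publicly listed CUDA-core counts; a quick check (treat the counts as spec-sheet figures):

```python
# Quick check of the quoted 4.35% shader uplift. Core counts are the
# publicly listed CUDA-core figures for each card.
cores_4070 = 5888   # RTX 4070
cores_5070 = 6144   # RTX 5070

uplift = (cores_5070 / cores_4070 - 1) * 100
print(f"{uplift:.2f}% more shaders")  # 4.35% more shaders
```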
 
In a way, I kind of like that the raw performance uplift is only 10-15% (excluding the 5090), as it gives the competition a higher chance to catch up. I hope both AMD and Intel will be competitive enough for prices to come down.

Stagnation is never a good thing, even if some of the new features look promising.

None of the possible reasons why are good, either:

1. Silicon is hitting a wall.

2. This is the best they can do while keeping cost similar, although spending 50 more on each die would likely lead to decent performance improvements.

3. Greed: they are just trying to maximize margins as much as possible.

The competition fumbling their launches or being behind in performance isn't a good reason to stagnate on generational performance improvements.

The 5090 will still murder everything the competition makes, and the average consumer will look at it and buy an Nvidia card anyway; the halo effect is real with Nvidia's top card each generation.
 
Last edited:
1. Silicon is hitting a wall
For this reason, Unreal Engine 5 should never exist. The only way out is something lightweight with good optimization.
 
All nice blah-blah PR talk; I'll wait for the reviews before buying something.
 
Will reviewers call out the tiny uplift in hardware specs for what it is, I wonder? In the mid-range 5070, you get a single-digit uplift in shaders (4.35%), and the native performance increase won't be more than 10% in the best-case scenario.

Meanwhile, Nvidia sells you upscaling and increases their margins...

But who'll tell it how it actually is?

We're still just speculating. If it's 20% faster while being 8% cheaper than the launch 4070, what should a reviewer say?

Or do they compare it to the 4070 Super, where it'll be barely faster but 8% cheaper...

I lean towards the latter, but the people actually interested in this card are 2070/3070 owners who can't afford the 5070 Ti and won't touch an AMD dGPU with a ten-foot stick... Honestly, I feel bad for them if this is the best Nvidia can do after 4-6 years.

Let's be real, though: the competition isn't even going to beat last-generation cards, with the only real improvement potentially being cost.

We are f#$% regardless....
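The two comparisons above work out quite differently in performance-per-dollar terms; a rough sketch using the speculative figures quoted (the ~5% for "barely faster" than the 4070 Super is an assumption, not a leaked number):

```python
# Rough perf-per-dollar arithmetic using the speculative figures above.
# The ~5% "barely faster than the 4070 Super" number is an assumption.
def perf_per_dollar_gain(perf_gain_pct: float, price_cut_pct: float) -> float:
    """Relative perf/$ improvement from a % perf gain and a % price cut."""
    return ((1 + perf_gain_pct / 100) / (1 - price_cut_pct / 100) - 1) * 100

print(f"vs launch 4070: {perf_per_dollar_gain(20, 8):.0f}% better perf/$")
print(f"vs 4070 Super:  {perf_per_dollar_gain(5, 8):.0f}% better perf/$")
```

Against the launch 4070 the value story looks decent (~30% better perf/$); against the 4070 Super it shrinks to roughly half that, which is why the choice of comparison card matters so much for reviews.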

For this reason, Unreal Engine 5 should never exist. The only way out is something lightweight with good optimization.

While I don't disagree, more games look like they are going UE5, not fewer...

We will have to wait till release, of course, but Black State in the recent DF Direct looked like it performed pretty well while looking good; apparently the developer is using its own techniques instead of relying on Nanite.

all nice blabla pr talk I'll wait for the reviews before buying something.

The best course of action since the dawn of time. Although this launch below the 5090 is looking like a yawn fest, at least at launch, it'll be interesting to see in 2 years how these have aged. I don't expect the 5070 to age well, but would love to be wrong.
 
Developers are writing games for consoles first, and will only adopt extra features for the PC port if it's easy, or if Nvidia literally invades their studio and offers on-site support and wads of money.
Nvidia be like: insert potato graphics here, out come the bear-goggles-hallucinating RTX neural-network AI-generated proprietary graphics.
How else would you mask the throttle monster from lowering performance without the help of AI multi frame gen?
Please, when reviewing the FE edition, check for any variance from potential throttling in closed-case, real-world scenarios. :lovetpu:
 
God damn it.

My dreams of a (edit: second hand) $400 4080/7900XTX to replace my (edit: second hand) $400 6800XT are fading fast.

Why Papa Jen Hsun?


I think the 40 series will see the lowest used-market price drops of any generation outside of a crypto boom, lol...
 
I wonder if SteamOS for desktop will support Nvidia, or if it will still be AMD-locked as it is now designed. I wouldn't mind upgrading to a 5080, but I also want to dual boot Windows 11 and SteamOS in the future. Hmm.

edit: Never mind; if it ends up being true that it's only 10-15% faster in raster than a 4080, I will pass. It would be nice to have DLSS 4 and multi frame gen for future games, but I have so much backlog, I might as well wait for the RTX 6000 series at this point.
 