
NVIDIA GeForce RTX 5090 Founders Edition

I am really impressed by the cooling efficiency. Incredible results. Good job, Nvidia. I wonder how much the liquid metal contributes to this. Hopefully someone will test it eventually.

Now please, other manufacturers, kindly get inspired and never again release GPUs bigger than 3 slots, okay?

As for performance, I used results from this review to calculate an all-resolution average:

[attachment: 1737651373507.png (all-resolution performance averages)]


The RTX 4090 has around 18% fewer transistors and 25% fewer compute units. I used the all-games average fps to calculate efficiency (not just Cyberpunk), and the RTX 5090 is in fact less efficient than the RTX 4090, since it consumes roughly 20% more power per frame. Maybe the price increase (+$500) is, let's say, justified, but performance-wise and efficiency-wise this is far from special. In other words, all the performance increase now comes from scaling: the more compute units, the more performance. The more you buy, the more you have, but definitely not save. Were the RTX 5090 priced at $1,700, then you'd also get more for less. We'll see when they jump to a 3 nm or 2 nm node.
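A quick sketch of that power-per-frame math, for anyone who wants to redo it with the review's numbers. The wattage and fps figures below are placeholder values chosen for illustration, not TPU's measured data; plug in the actual all-games averages:

```python
# Energy per frame in joules: watts divided by frames per second.
# The numbers below are HYPOTHETICAL placeholders, not review data.

def joules_per_frame(power_w: float, avg_fps: float) -> float:
    """Energy consumed per rendered frame (J/frame = W / fps)."""
    return power_w / avg_fps

# placeholder figures, for illustration only
rtx_4090 = joules_per_frame(power_w=400.0, avg_fps=100.0)  # 4.0 J/frame
rtx_5090 = joules_per_frame(power_w=575.0, avg_fps=120.0)  # ~4.79 J/frame

extra = rtx_5090 / rtx_4090 - 1.0
print(f"5090 uses {extra:+.0%} energy per frame vs 4090")  # about +20% with these inputs
```

With these made-up inputs the ratio lands near the +20% figure quoted above, but the conclusion depends entirely on which power and fps averages you feed in.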

What indeed is special, as I already mentioned, is the cooling efficiency of the RTX 5090's cooler. Really damn impressive. My doubts were unjustified, I must say.
 
Hope these are not the hardware units required for DLSS
We hope those leftover screws aren't yours, @W1zzard. We'd rather you not have any screws loose (imagine them missing, even), at least no more than usual :laugh:
 
As expected, it costs a fortune and offers only a small increase over the 4090, even at 4K. I doubt anyone will use it at lower resolutions, but you never know, I guess. Also draws nearly 600 W...

We're hitting the silicon wall, and games are being optimized for consoles, which have way less power. For me, RT is a nothingburger, and don't even get me started on image-quality-destroying upscaling...

Probably doesn't bode well for the lower down cards either given they have much smaller uplifts in hardware by comparison.
 
Has anybody tested yet how the card performs with different CPUs?
Why? If you're getting a 5090 for gaming, there's no reason to get anything other than a 9800X3D.

If you need to do workstation tasks, you can afford a second production PC with a 9950X or whatever else works best for your workload, and testing those CPUs with a 5090 in gaming is dumb.
 
Card has been reassembled, only 3 screws left over. Runs perfectly fine, temps are only 2°C higher than LM. The fixme has been updated.
How are the vapor chamber's 3D "pipes" connected to the fins? Are they soldered, similar to how a CPU air cooler base is joined to its copper heatpipes?
 
Why? If you're getting a 5090 for gaming, there's no reason to get anything other than a 9800X3D.

If you need to do workstation tasks, you can afford a second production PC with a 9950X or whatever else works best for your workload, and testing those CPUs with a 5090 in gaming is dumb.
People have very nice PCs with Intel CPUs; you cannot expect everybody to immediately throw all this very nice and expensive stuff away.
 
Both impressive and unimpressive at the same time.

I'd be more impressed if this had maintained 450 W at the same $1,600 MSRP.

But on the other hand, I expect this to creep up to a 40-50% uplift over time. This does seem to have more CPU overhead than Ada; hopefully they get that sorted, because it's CPU limited a lot of the time.
 
Has anybody tested yet how the card performs with different CPUs?
Guess we'll see whenever the next CPU review comes. I see no reason to keep the 4090 for the test rig, other than having to re-test everything.
 
Both impressive and unimpressive at the same time.

I'd be more impressed if this had maintained 450 W at the same $1,600 MSRP.

But on the other hand, I expect this to creep up to a 40-50% uplift over time. This does seem to have more CPU overhead than Ada; hopefully they get that sorted, because it's CPU limited a lot of the time.
Maybe the Super will do that, and next gen surely will.. maybe.. but AI, right..
 
Both impressive and unimpressive at the same time.

I'd be more impressed if this had maintained 450 W at the same $1,600 MSRP.

But on the other hand, I expect this to creep up to a 40-50% uplift over time. This does seem to have more CPU overhead than Ada; hopefully they get that sorted, because it's CPU limited a lot of the time.

I believe that is the driver feeding the SMs more VLIW-like data sets. I would rather see GPUs be more CPU bound; we have fast CPUs with lots of cores these days, and stupidly fast interconnects and storage.
 
I believe that is the driver feeding the SMs more VLIW-like data sets. I would rather see GPUs be more CPU bound; we have fast CPUs with lots of cores these days, and stupidly fast interconnects and storage.

I'm probably in the minority, but I'd rather hit a locked 300 fps at 1440p than be locked into 4K without upscaling just to avoid being CPU limited lol.
 
This performance is about what was expected. The 5080/5070 performance numbers should be interesting. The new power cable is a reassuring sign. It's still a lame design but at least it's much safer.
 
Looks like 35% more raster and ray tracing performance for 33% more cores. That's pretty good, but that power usage. Wow!

With the linear scaling, that doesn't bode well for the 5080, which has only 5% more cores than the 4080 Super, but we will see in a week.
Bodes really badly for the 5070, which has 6144 cores vs. the 7168 of the 4070 Super, and they cannot really clock it much higher due to the same semiconductor process.
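A back-of-the-envelope version of that linear-scaling argument. This is a naive model (performance proportional to core count times clocks, everything else assumed equal), using the core counts cited in the posts, not a benchmark:

```python
# Naive linear-scaling estimate: if perf scales with core count at the
# same clocks and node, expected uplift is just the core-count ratio.
# This ignores memory bandwidth, cache, and architectural changes.

def expected_uplift(new_cores: int, old_cores: int, clock_ratio: float = 1.0) -> float:
    """Estimated perf change: cores x clocks, everything else equal."""
    return new_cores / old_cores * clock_ratio - 1.0

print(f"5090 vs 4090:       {expected_uplift(21760, 16384):+.0%}")  # ~ +33%
print(f"5080 vs 4080 Super: {expected_uplift(10752, 10240):+.0%}")  # ~ +5%
print(f"5070 vs 4070 Super: {expected_uplift(6144, 7168):+.0%}")    # ~ -14%
```

The 5090 result lines up with the ~33-35% average uplift measured in the review, which is why the flat core counts further down the stack look worrying under this model.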
 
This performance is about what was expected. The 5080/5070 performance numbers should be interesting. The new power cable is a reassuring sign. It's still a lame design but at least it's much safer.

The 5080 not being CPU limited at 1440p will help a lot, but I expect a 40-50% drop at 4K in games that are not CPU limited on the 5090. We've got about a week to see. The 5070 is 60% of a 5080 hardware-wise, or 37% of a 5090, so it's hard to guess how it will perform.
 
Impressive but also not.
 
The 5080 not being CPU limited at 1440p will help a lot, but I expect a 40-50% drop at 4K in games that are not CPU limited on the 5090. We've got about a week to see. The 5070 is 60% of a 5080 hardware-wise, or 37% of a 5090, so it's hard to guess how it will perform.
RTX 5090's claimed boost was 33%, and it actually is! TPU: 35% faster on average; HWU: 27% faster on average.
RTX 5080 claimed boost was 15%
RTX 5070 Ti claimed boost was 20%
RTX 5070 claimed boost was 20%
 
yes.
as expected, the cooler performance is nothing short of breathtaking. i wonder how many millions they've invested in this revolutionary design. probably a lot.
Is it really? The card is completely stuck at the throttle-point ceiling and the driver keeps it in check, while the VRAM is running at 93°C. Not sure this is a great showing for a $2,000 card. I mean, sure, it's cool that they crammed it into a 2-slotter, but that's about all she wrote.

Overall I'm unimpressed given the immense shader count, the power target, and the product performance as a whole. It's a bigger 4090 that isn't bigger. Well, yay. It just underlines that nothing was gained between Ada and Blackwell for gaming. It's just shaders > performance. It is exactly as my pessimist self predicted: there is nothing here but a bigger 4090, and the gap between it and the 5080 is there so they can just spin the 4090 for a longer time. Zero advances in anything worthy of note, really.
 
As expected, it costs a fortune and only a small increase over the 4090 even at 4K, I doubt anyone will use it lower but you never know I guess. Also draws nearly 600w...

We're hitting silicon wall and games are being optimized for consoles which have way less power, for me RT is a nothing burger and don't even start on image quality destroying upscaling...
46% in some titles, 12% in others, points to a CPU bottleneck. Nvidia didn't need to do this when the 5070 and 5080 are simply refreshed 4070, 4070 Ti Super, and 4080 Super cards with only 256-512 more CUDA cores added. The 5090 could have been 16896 CUDA cores on a 384-bit bus.
Yes, the wall is near. Next nodes: N3 ~30%, N2 ~15%, A16 ~7%. We can finally be done with the forever wait for the just-around-the-corner product and enter the era of the forever wait for the never-going-to-get-any-better product.
This makes very little sense.

It's completely reasonable to expect performance and value improvements even when the manufacturing node is the same. Maxwell was manufactured on the same node as Kepler, yet brought meaningful performance and efficiency improvements.

Nvidia has just jumped the shark this time because there is zero competition from AMD and all their profits are tied up in AI, so they deliver a product which is just "more of the same" to gamers.
That was a once-in-a-lifetime event; the tile-based rendering was a clever optimisation, and nothing else can be improved or fixed like that. No more miracles.
 
Drawing 600 W in real-life use??? Hell, I might just swap my 700 W IR panel for a 5090. Finally, a truly winter-friendly GPU. It would come in handy these days, with temps around -5°C.
On the flip side, summer heat is getting more brutal each year. Imagine gaming with this beast when temps hit +40°C o_O
 
RTX 5090's claimed boost was 33%, and it actually is! TPU: 35% faster on average; HWU: 27% faster on average.
RTX 5080 claimed boost was 15%
RTX 5070 Ti claimed boost was 20%
RTX 5070 claimed boost was 20%

I know what was claimed, but in games that are RT heavy and aren't CPU limited, gains are around 40%.

This review still points towards the cards under this one being underwhelming, though; the 5080 has only 65% of the 5090's SM count, after all.
 
On the Overclocking page, it doesn't list the percentage gained? Is this the first card to achieve this omission :p
 
Did any review peek at Far Cry 6 benchmarks?
 