
NVIDIA GeForce RTX 5090 Founders Edition

As rumored, no improvement in efficiency vs the 4090. It performs better, but with a similar increase in power consumption and price.
 
Is it safe to assume that there were no architectural improvements for classical CUDA cores/raster performance? It seems to me that the improvement comes as much from the increased core count as from the improved bandwidth.


So the 5070 should perform worse than the RTX 4070 Super and the 5080 barely better than the RTX 4080 Super.
 
The 5080 and below, for normal people, will hopefully have much lower power draw thanks to their lower RAM capacity and silicon area.
RTX 5080 360 W (RX 7900 XTX 355 W), RTX 5070 Ti 300 W, RTX 5070 250 W.
 
Looking at the teardown of the card... once you take into account the additional costs of connectors, ribbon cables, and the additional PCIe connection PCB, I can't see how the highly modular construction is any cheaper to produce.
The only positive is that it would be easier to fix cards damaged by edge connector issues, etc., and really, how big a problem was that?
Nvidia downplayed cards being damaged, although some YouTube channels made a good bit of content on fixing these cards; maybe it was more of an issue than they let on.

Not gonna lie, seeing the bare board does make me think of an MXM card, and when you factor in the additional connected bits, it's like some of the MXM-to-desktop PCIe adapter cards.
Maybe there'll be a DTR mobile workstation / laptop derivative.
 
So the 5070 should perform worse than the RTX 4070 Super
Probably between the RTX 4070 Super (220 W TDP) and the RTX 4070 Ti 12 GB (285 W TDP), or closer to the latter. After the RTX 5080 review, almost everything will be clear.
 
Is it safe to assume that there were no architectural improvements for classical CUDA cores/raster performance? It seems to me that the improvement comes as much from the increased core count as from the improved bandwidth.
The CUDA cores can now all execute either INT or FP; on Ada, only half had that capability. When I asked NVIDIA for more details on the granularity of that switch, they acted dumb, gave me an answer to a completely different question, and mentioned "that's all that we can share".
 
So the 5070 should perform worse than the RTX 4070 Super
I guess not; the 5070 is less power limited and, as we have all seen, the 5090 has a more inspiring V/F curve than the 4090. 3-ish GHz out of the box won't surprise me. That, along with much faster VRAM, will land the 5070 within 3% of the 4070 Super either way, depending on the game and settings. But it definitely won't beat the 4070 Ti. That's for sure.
 
Do people really pay $2,000 to play crappy 1080p upscaled games?
 
What I'm missing from the review is a screenshot comparison of the actual image quality between the modes.
I've tried it in Cyberpunk: a visible improvement, but it requires more AI processing power, and there's a noticeable drop in performance on my 3070: 60 FPS turns into 40 FPS.
edit: On quality settings*. I've only done a quick test, but other people are saying that Balanced or even Performance can look better than DLSS 2 Quality.
 
Very interesting that the RTX 5090 has exactly the same efficiency as the RTX 4090; looks like AMD has a chance to catch up to NVIDIA in this area.

[Attached chart: watt-per-frame]
 
The CUDA cores can now all execute either INT or FP; on Ada, only half had that capability. When I asked NVIDIA for more details on the granularity of that switch, they acted dumb, gave me an answer to a completely different question, and mentioned "that's all that we can share".
Thank you for your reply. So maybe they are more flexible, but still not perfect.
 
So maybe they are more flexible, but still not perfect.
They're as perfect as ever; the tensor cores are now being accessed directly, and all shaders are integer and floating-point capable. Which doesn't help much unless a game needs more than 50% integer work, and it's usually around 25%.
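Here's a rough back-of-envelope issue-rate model of why that integer share matters. This is only my own simplification, assuming Ada's shader units split into an FP-only half and an FP+INT half while all of Blackwell's units are flexible, with perfect scheduling and no other bottlenecks:

Code:
# Toy issue-rate model (my assumption, not confirmed by NVIDIA):
# Ada: half of the units are FP-only, half are FP+INT capable.
# Blackwell: every unit can issue either op type.

def relative_time_ada(int_fraction, units=1.0):
    # INT work can only run on the flexible half; FP work can run on any unit.
    return max(int_fraction / (units / 2), 1.0 / units)

def relative_time_blackwell(int_fraction, units=1.0):
    # Every unit handles either op type, so time is simply work / units.
    return 1.0 / units

for f in (0.25, 0.40, 0.50, 0.60, 0.75):
    speedup = relative_time_ada(f) / relative_time_blackwell(f)
    print(f"integer fraction {f:.2f}: theoretical speedup {speedup:.2f}x")

# At the ~25% integer mix mentioned above the model predicts 1.00x (no gain);
# gains only show up once the integer share passes 50%.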
 
The power consumption is freaking awful! Hello Fermi! Terrible work from Nvidia... I was hoping for better performance per watt, and sadly no... next-gen laptops will just consume more power to get better performance... RTX 4060 it is... oh well
 
Though the performance is there, I am sure this will not sell well in areas where power is expensive. 600 watts is a serious amount of power to consume for gaming alone.
Anyone that can afford a 5090 probably isn't overly concerned about the cost to run it for gaming.

If you game 4 hours a day, that's 28 hours a week.
If the GPU runs at a continuous 600 W while gaming, you end up with 16.8 kWh a week.
If you pay $0.10 / kWh = $1.68 a week
If you pay $0.20 / kWh = $3.36 a week
If you pay $0.30 / kWh = $5.04 a week
If you pay $0.70 / kWh = $11.76 a week
Remember, this is if the GPU is running a sustained, continuous 600W those 4 straight hours of gaming. It all depends on the game, resolution, settings and so on. Also, remember the V-Sync power chart shows the GPU pulling about 90W. The above numbers would be for top-end power draw scenarios.
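For anyone who wants to redo that math with their own hours, power draw, and electricity rate, here's a quick throwaway Python snippet that just plugs in the numbers above (my own helper, not from the review):

Code:
# Weekly energy cost for a worst-case sustained 600 W gaming load.
hours_per_week = 4 * 7            # 4 hours of gaming a day
draw_kw = 0.600                   # sustained 600 W, worst case

energy_kwh = draw_kw * hours_per_week      # 16.8 kWh per week
for price in (0.10, 0.20, 0.30, 0.70):     # $ per kWh
    print(f"${price:.2f}/kWh -> ${energy_kwh * price:.2f} per week")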

Personally, I wouldn't want a GPU that can suck 600 W for gaming. Not to mention the fact that this GPU is priced nearly 3x over what I'm comfortable spending on a GPU, so I'm not the target for this product. If I had oodles of money and no brains, I'd get one, but I've got a meager amount of money and brains, so I won't be getting one.
 
Of course it isn't. The 5090 will eventually be a similar class of paperweight; it just takes more time. The comparisons are great to make; they provide you with much-needed perspective on how silly it is to spend $2K on a GPU. Or how useful, given your use case. Fact is, the B580 also offers 16 GB, so if it's just VRAM you need...
People who never bought expensive GPUs think the money they invest in a GPU is lost.

A 5090 still holds a lot of value after 2 years.
It doesn't go to zero!

The power consumption is freaking awful! Hello Fermi! Terrible work from Nvidia... I was hoping for better performance per watt, and sadly no... next-gen laptops will just consume more power to get better performance... RTX 4060 it is... oh well
I don't see AMD high-end GPUs anywhere...
Also, better efficiency than AMD GPUs, lol!!

[Attached chart: energy efficiency]
 
I thought the 3090 Ti got released because of the RX 6950 XT
I didn't mean that the reason it was created was binning, but that the Ti moniker on the 3090 was justified only by binning. I'm willing to believe the reason for its creation was crypto, as Bwaze stated.

Then why are we comparing a B580 (MSRP $250) to a 5090 (MSRP $2,000)?

It's the best price-to-performance card (if you can get it for that price) against the strongest card currently on the market.

Who thinks:

Hmmm... I can buy this card from Intel for $250 (with performance between the 4060 Ti 8 GB and 16 GB versions) or the 5090 for $2,000.

The comparison is just nonsensical.
It's OK to disagree. There are plenty of people who will compare, even if just at the very beginning, the full stack of offerings to better understand what's out there. I'm not saying someone is at the last step deciding between such a disparate price range. Your scenario seems more of a strawman than an actual argument.
 
Think I should grab the 4090 Ti 5090 over my 4080, or upgrade my CPU, PSU (850 W), and monitor first? :laugh:

Thanks for the review Wizz!
 
It's OK to disagree. There are plenty of people who will compare, even if just at the very beginning, the full stack of offerings to better understand what's out there. I'm not saying someone is at the last step deciding between such a disparate price range. Your scenario seems more of a strawman than an actual argument.
eh, sure.

Also, if you are interested in the B580, there's a different thread talking about the problems they had with it.
 
If it's 1% better than the most efficient card on earth, then it is, by definition, the new most efficient card on earth.

Pulling 600 W on its own does not make something inefficient.
The RTX 4090 was not the most efficient card, based on W1zzard's reviews:
[Attached chart: energy efficiency from W1zzard's review]

(TPU's RTX 4080 Super review here.)
It's more efficient than the previous top-efficiency card.

What would you call it, less efficient?
It's the same; it's not more efficient than the previous top-efficiency card, aka the RTX 4080 (Super).

Still, remember, guys, those results are based on just one game: Cyberpunk 2077. It varies between games, just so you know.
Unfortunately, the GN video shows an efficiency comparison in only 3 games, which is still more than the single game shown on TPU:

It would be nice to have a bigger statistical sample, at least 10 games, same settings, same rest of the hardware, RTX 4090 vs RTX 5090. The more games, the better the accuracy of the results. What our German colleagues already tested: they limited the power of the RTX 5090 to 450 W (RTX 4090 level) and saw an 11-15% performance improvement. That means the RTX 5090 limited to 450 W is indeed more efficient than the RTX 4090. At the 575 W TGP, I don't think so. I'd say they are pretty much on par, though the RTX 4090 might be very slightly more efficient. Of course, an undervolted RTX 5090 might be a totally different story, similar to the undervolted RTX 4090's story.
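Some quick napkin math behind those claims. The 11-15% figure is the one from the 450 W-limited test mentioned above, while the ~30% stock uplift is only my own placeholder assumption for the 575 W case, not a measured number:

Code:
# Perf-per-watt ratio of the 5090 relative to a 450 W RTX 4090 (4090 perf = 1.0).
def efficiency_ratio(perf_gain, power_5090_w, power_4090_w=450):
    # (fps/W of the 5090) divided by (fps/W of the 4090)
    return (1 + perf_gain) / (power_5090_w / power_4090_w)

print(f"{efficiency_ratio(0.11, 450):.2f}x")  # power-limited 5090, lower bound of the quoted gain
print(f"{efficiency_ratio(0.15, 450):.2f}x")  # power-limited 5090, upper bound
print(f"{efficiency_ratio(0.30, 575):.2f}x")  # stock 575 W with an assumed ~30% uplift -> roughly on par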
 
eh, sure.

Also, if you are interested in the B580, there's a different thread talking about the problems they had with it.
I hadn't seen the one here, but I saw the Hardware Unboxed video. Pretty sad to see. I'm still rooting for Intel to continue in the GPU market. I really hope the interim CEOs don't do away with the division.

What our German colleagues already tested: they limited the power of the RTX 5090 to 450 W (RTX 4090 level) and saw an 11-15% performance improvement. That means the RTX 5090 limited to 450 W is indeed more efficient than the RTX 4090. At the 575 W TGP, I don't think so. I'd say they are pretty much on par, though the RTX 4090 might be very slightly more efficient. Of course, an undervolted RTX 5090 might be a totally different story, similar to the undervolted RTX 4090's story.
I could be misremembering, but I could've sworn that at the 4090 launch, some people power limited them to 300 W and found them to be much more efficient as well.
 
Probably between the RTX 4070 Super (220 W TDP) and the RTX 4070 Ti 12 GB (285 W TDP), or closer to the latter. After the RTX 5080 review, almost everything will be clear.

I guess not; the 5070 is less power limited and, as we have all seen, the 5090 has a more inspiring V/F curve than the 4090. 3-ish GHz out of the box won't surprise me. That, along with much faster VRAM, will land the 5070 within 3% of the 4070 Super either way, depending on the game and settings. But it definitely won't beat the 4070 Ti. That's for sure.
Definitely, but it's shaping up to be a pretty lame scenario, isn't it?
 
Definitely, but it's shaping up to be a pretty lame scenario, isn't it?
The same price as the RTX 4070, but the performance boost will most likely be tiny, ~15-25%.

GTX 770 refresh scenario.....
[Attached chart: relative performance at 1920x1080]
 
I have to say, this is a pretty disappointing card compared to the 4090 launch. Yes, it's the best card on the market, no doubt, but the way it's treated, it's essentially just a 4090 Ti. I mean, the power draw is very high, the performance is on average around 25-35% better than the 4090, and it's priced ridiculously high. The best part of this card, in my book, is the cooler design with the board it's using; I really like how well it dissipates the heat and keeps the card cool.

This really does feel like a stopgap series. It's not really worth buying to replace a 4xxx series card (at least the current top one). I know there is more to it than the basic performance metrics, but for what's being charged, it should perform a lot better, considering the price keeps jumping every generation.
 
Definitely, but it's shaping up to be a pretty lame scenario, isn't it?
What else is there to expect? AMD are clowning to the maximum; NV have zero incentive to provide anything better.
 
I have to say, this is a pretty disappointing card compared to the 4090 launch. Yes, it's the best card on the market, no doubt, but the way it's treated, it's essentially just a 4090 Ti. I mean, the power draw is very high, the performance is on average around 25-35% better than the 4090, and it's priced ridiculously high. The best part of this card, in my book, is the cooler design with the board it's using; I really like how well it dissipates the heat and keeps the card cool.

This really does feel like a stopgap series. It's not really worth buying to replace a 4xxx series card (at least the current top one). I know there is more to it than the basic performance metrics, but for what's being charged, it should perform a lot better, considering the price keeps jumping every generation.
The RTX 5090 is an "extreme refresh", but the lower tiers will probably be just a "refresh".
 