
NVIDIA GeForce RTX 5090 Founders Edition

So 25% more performance for 27% more money 2 years later. How revolutionary!

Hey, AMD! Here's your chance to strike nvidia where it hurts. Blackwell is looking to be a major disappointment, so if rDNA4 really is that good, it's a prime opportunity to smack them across the proverbial room.

4090 launch MSRP was $1600.
Please, let's be factually correct. I think you meant 25% more performance for 27% more money, 27 months later. How revolutionary!
 
So can this card run my Neo G9 (7680x2160 @ 240 Hz) without compression? :D
Lol, my 4K QD-OLED 240 Hz will use all that bandwidth already! You might need to wait for DP 2.2 or HDMI 2.3 :D
 
What kind of sorcery is it that, despite the card being much smaller and more compact than the 4090 FE and other AIB cards, it somehow stays around 77°C even though the 5090 consumes more power than the 4090? The engineering behind that cooler is insanely impressive in that regard. Memory temps, though, look pretty rough, for the FE at least.
 
Anyone that can afford a 5090 probably isn't overly concerned about the cost to run it for gaming.

If you game 4 hours a day, that's 28 hours a week.
If the GPU draws a continuous 600W while gaming, you end up with 16.8 kWh a week.
If you pay $0.10 / kWh = $1.68 a week
If you pay $0.20 / kWh = $3.36 a week
If you pay $0.30 / kWh = $5.04 a week
If you pay $0.70 / kWh = $11.76 a week
Remember, this is if the GPU is running a sustained, continuous 600W those 4 straight hours of gaming. It all depends on the game, resolution, settings and so on. Also, remember the V-Sync power chart shows the GPU pulling about 90W. The above numbers would be for top-end power draw scenarios.
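For anyone who wants to plug in their own electricity rate, here's a minimal sketch of the same arithmetic, assuming the 4 hours/day and sustained 600W figures above:

```python
# Weekly gaming electricity cost, assuming 4 h/day at a sustained 600 W draw
# (the worst-case figures from the post above).
hours_per_week = 4 * 7                  # 28 h of gaming per week
energy_kwh = 0.6 * hours_per_week       # 600 W = 0.6 kW -> 16.8 kWh per week

for rate in (0.10, 0.20, 0.30, 0.70):   # $ per kWh
    print(f"${rate:.2f}/kWh -> ${energy_kwh * rate:.2f} per week")
```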

Personally I wouldn't want a GPU that can suck 600W for gaming. Not to mention the fact that this GPU is priced nearly 3x over what I'm comfortable spending on a GPU, so I'm not the target for this product. If I had oodles of money and no brains, I'd get one, but I've got a meager amount of money and brains, so I won't be getting one.

I would argue that the people paying $2k+ for a video card are the ones with money problems. Generally people who are good with money or are just rich don't get there by purchasing $2000 video cards. It's like the fact that nearly all of the people driving around luxury cars cannot afford them.
 
What kind of sorcery is it that, despite the card being much smaller and more compact than the 4090 FE and other AIB cards, it somehow stays around 77°C even though the 5090 consumes more power than the 4090? The engineering behind that cooler is insanely impressive in that regard. Memory temps, though, look pretty rough, for the FE at least.
4090 FE was 66°C. I'm not impressed with 5090 FE temps.
 
Why is the 5090 getting the same performance as the 4090 in Elden Ring when every other game is fine? Is it a lack of driver support, or is the engine hot garbage?
 
Why is the 5090 getting the same performance as the 4090 in Elden Ring when every other game is fine? Is it a lack of driver support, or is the engine hot garbage?
Several reviewers have had issues with drivers so I wouldn't be surprised.
The RTX 50 series will need several weeks and months of optimization anyway, just like any newly released GPUs. Devs will optimize their games for them over time.
 
Please forgive me for duplicating the post from the other thread, but there are more people here.
I noticed a difference in measurements; both runs are with Path Tracing. Even for different driver versions, the difference seems too big, imho.
Why has the FPS dropped?
 
Nvidia has gotten very good at controlling the perception of its products. They stopped 4000 series production prior to the 5000 series and drained the channel of stock precisely to make the 5000 series appear to be better value.
Does it remind you of the 7800X3D?
 
Please forgive me for duplicating the post from the other thread, but there are more people here.
I noticed a difference in measurements; both runs are with Path Tracing. Even for different driver versions, the difference seems too big, imho.
Why has the FPS dropped?
Different settings/test scene/drivers/test system
 
I have one more question…
Does the RTX 5070 have the same FPS as the RTX 4090, as they announced?
 
Almost as expected. I did expect them to improve efficiency a bit, though; they've pretty much flatlined. It needs more testing at different power levels, I suppose.

So this is basically a 4090 Ti Super of sorts when it comes to performance. ~30% more power for 30% more performance at 25% higher price. Zero efficiency improvements.
Yeah, I also hoped for at least a 15-20% power efficiency improvement (10% from the node + 10% from the architecture, but NV has no secret sauce in the drawer). But given the rumors of basically the same process node, the later-revealed 27.78% TDP increase, and NV's own published non-DLSS graphs, from which some calculated roughly a 28% FPS improvement if I remember correctly, a ~0% power efficiency improvement could have been expected from the moment NV announced their GeForce 5000 series.
Yeah, unfortunately from the 4090 to the 5090 everything has increased linearly: performance, price and power consumption.
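A minimal sketch of that perf-per-watt arithmetic, assuming the 450W to 575W TDP change and the ~28% FPS uplift quoted above:

```python
# Perf-per-watt comparison, assuming 450 W -> 575 W TDP and an ~28% average
# FPS uplift (the figures quoted in the posts above, not official numbers).
tdp_4090, tdp_5090 = 450, 575
perf_ratio = 1.28

power_ratio = tdp_5090 / tdp_4090            # ~1.278, i.e. +27.8% power
efficiency_ratio = perf_ratio / power_ratio  # ~1.002, i.e. essentially flat

print(f"Power increase:    {power_ratio - 1:+.1%}")
print(f"Efficiency change: {efficiency_ratio - 1:+.1%}")
```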

Okay, fastest GPU money can buy, tons of VRAM for workstation tasks, and the 2-slot card is great, but it's insanely expensive and the power consumption is horrendous!

Even idle power consumption is way too high. Wtf has Nvidia done here? This needs to be made on TSMC N2 so badly.
I'd say 30W at idle is OK for the 16 VRAM modules and this big chip, but 40W in multi-monitor and 54W for video playback is kinda high. But indeed, at least the 32GB of VRAM can be seen as an improvement for workstations/AI LLM self-hosting. The 2-slot design isn't really necessary for workstation or even normal users; it makes the GPU hotter and louder (40 dB(A)), and 40 dB(A) is where concentration starts to be affected (a 2.5-slot design might have stayed below 40 dB(A)). I wonder why they didn't go with a TSMC N3 variant (N2 is way too early and still in testing, as far as I've read); Apple has been using second-generation N3 for 8 months or so, but maybe it's still too expensive, too expensive for bigger chips due to lower yields and/or availability. Pure guessing.

Making the transformer model also available on the previous-gen GeForce 4000, 3000 and 2000 series raises the question of whether making DLSS Multi Frame Generation exclusive to the GeForce 5000 series was a pure market-segmentation decision.
 
Different settings/test scene/drivers/test system
Different settings/ - Possible, but not sure, because in theory the same graphics settings should be used. Although you know better, since you are the author of both reviews.
test scene/ - The most logical option, but again, usually a standard performance test or the same scene is used. Although...
drivers/ - Possibly, but not that significant. Although, of course, there have been cases where a more recent driver caused a decrease in performance.
test system/ - About the same; last time the test system had a 13900K, now it's the absolute gaming flagship, the 9800X3D. But don't forget that testing was done at maximum graphics settings in 4K, where the processor doesn't play a big role. (Memory & storage are great too.)

Please forgive me for my stupid tediousness. But this is just my curiosity.
Anyway, great review, as always. A herculean job.
 
I have one more question…
Does the RTX 5070 have the same FPS as the RTX 4090, as they announced?

The answer is a resounding yes, yes it does.

If you limit both cards to 30 FPS in a game where both could deliver more.

This is about as much of an answer as Nvidia's "Yes, if you use interpolation to quadruple the frames of the RTX 5070, and not the frames of the RTX 4090".
 
I bought a 4090 in December of '23 for below MSRP.

View attachment 381337
Anecdote does not negate the norm. Getting a 4090 FE card via BestBuy was so convoluted and next to impossible that it defies any good-faith defense. When the vast majority of people making purchases are forced into paying quite a bit more than MSRP, that's the norm, not various anecdotal exceptions.

Unless history is defied and the 5090 remains readily available at $2000, its price will be more than $2000. Based on how the 4090 was handled and the fact that AMD has completely surrendered that portion of the market to Nvidia (i.e. pure monopolization), why should consumers expect a strong supply of FE 5090s via BestBuy? What people should expect is for a corporation to do everything in its power to increase profitability. If that means pushing people to purchase 3rd-party partner 5090s at higher price points (which is what it meant with the 4090), then that is what should be expected.

"Wait more than a year and it might be better than a paper launch" also isn't much of a rebuttal to the corrosive effects of monopoly power.
 
Anecdote does not negate the norm. Getting a 4090 FE card via BestBuy was so convoluted and next to impossible that it defies any good-faith defense. When the vast majority of people making purchases are forced into paying quite a bit more than MSRP, that's the norm, not various anecdotal exceptions.

Unless history is defied and the 5090 remains readily available at $2000, its price will be more than $2000. Based on how the 4090 was handled and the fact that AMD has completely surrendered that portion of the market to Nvidia (i.e. pure monopolization), why should consumers expect a strong supply of FE 5090s via BestBuy? What people should expect is for a corporation to do everything in its power to increase profitability. If that means pushing people to purchase 3rd-party partner 5090s at higher price points (which is what it meant with the 4090), then that is what should be expected.

"Wait more than a year and it might be better than a paper launch" also isn't much of a rebuttal to the corrosive effects of monopoly power.
I totally agree.
Always over $2k? --> that's just to be expected.

I also think low availability at launch has been strategically driven by NVIDIA, at least in the last 3 series (RTX 3000 to RTX 5000).

Furthermore, waiting/postponing a possible purchase (barring extremely lucky occasions) seems to follow a rule of one year of market stabilization, but the price definitely tends to go back up after this period (and value becomes more and more inconsistent).

Consider also that the "real" value of a 5090 12-18 months after launch could be totally distorted by other factors...
I think it's like trying to predict the market for a luxury good... it just tends not to follow the usual market rules.
 
Different settings/ - Possible, but not sure, because in theory the same graphics settings should be used. Although you know better, since you are the author of both reviews.
test scene/ - The most logical option, but again, usually a standard performance test or the same scene is used. Although...
drivers/ - Possibly, but not that significant. Although, of course, there have been cases where a more recent driver caused a decrease in performance.
test system/ - About the same; last time the test system had a 13900K, now it's the absolute gaming flagship, the 9800X3D. But don't forget that testing was done at maximum graphics settings in 4K, where the processor doesn't play a big role. (Memory & storage are great too.)

Please forgive me for my stupid tediousness. But this is just my curiosity.
Anyway, great review, as always. A herculean job.
Didn't they also add new RT options since the game launched? So max at launch != max now.
 
Different settings/test scene/drivers/test system
It's still crazy that DLSS Performance (1080p internal) is only delivering 2x the performance when it's natively rendering 4x fewer pixels than 4K... we should be getting at least 3x the performance (I don't think the 9800X3D is that huge a bottleneck, but maybe...).
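As a rough sketch of the pixel-count arithmetic behind that expectation (the assumption that FPS scales near-linearly with rendered pixels is the poster's, not the review's; upscaling overhead and work that doesn't scale with resolution eat into it):

```python
# Pixel-count comparison: 4K native vs. DLSS Performance (1080p internal render).
native_4k = 3840 * 2160          # 8,294,400 pixels per frame
internal_1080p = 1920 * 1080     # 2,073,600 pixels per frame

pixel_ratio = native_4k / internal_1080p   # 4.0x fewer pixels rendered
observed_speedup = 2.0                     # roughly what the post above cites

print(f"{pixel_ratio:.0f}x fewer pixels rendered, "
      f"but only ~{observed_speedup:.0f}x higher FPS")
# The gap comes from work that doesn't shrink with resolution:
# the DLSS upscale pass itself, CPU work, and fixed per-frame costs.
```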

The answer is a resounding yes, yes it does.

If you limit both cards to 30 FPS in a game where both could deliver more.

This is about as much of an answer as Nvidia's "Yes, if you use interpolation to quadruple the frames of the RTX 5070, and not the frames of the RTX 4090".
FPS, yeah, but not smoothness nor latency haha

I totally agree.
Always over $2k? --> that's just to be expected.
Yeah $2000 was my bet since Nvidia are very greedy nowadays...

But what is scary is that all the AIBs are also charging a lot more this generation...
For example, the 4090 FE was $1600 but the MSI 4090 SUPRIM X and SUPRIM LIQUID X were both $1750 (+9.4%), whereas the same models on the 5090 are $2400 for the SUPRIM (+20% vs FE) and $2500 for the SUPRIM LIQUID SOC (+25% vs FE)!

ASUS were charging $2000 for their ROG STRIX OC (+25% vs FE), whereas the 5090 ASTRAL OC is $2800 (+40% vs FE)!!!

Those AIB prices are ridiculous!
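For reference, a minimal sketch of how those markups work out against each generation's FE MSRP, using only the prices quoted above:

```python
# AIB price markup vs. the FE MSRP of each generation, using the prices above.
fe_msrp = {"4090": 1600, "5090": 2000}
aib_prices = {
    ("4090", "MSI SUPRIM X / LIQUID X"): 1750,
    ("5090", "MSI SUPRIM"):              2400,
    ("5090", "MSI SUPRIM LIQUID SOC"):   2500,
    ("4090", "ASUS ROG STRIX OC"):       2000,
    ("5090", "ASUS ROG ASTRAL OC"):      2800,
}

for (gen, model), price in aib_prices.items():
    markup = price / fe_msrp[gen] - 1
    print(f"{gen} {model}: ${price} ({markup:+.1%} vs FE)")
```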
 
But what is scary is that all the AIBs are also charging a lot more this generation...
For example, the 4090 FE was $1600 but the MSI 4090 SUPRIM X and SUPRIM LIQUID X were both $1750 (+9.4%), whereas the same models on the 5090 are $2400 for the SUPRIM (+20% vs FE) and $2500 for the SUPRIM LIQUID SOC (+25% vs FE)!

ASUS were charging $2000 for their ROG STRIX OC (+25% vs FE), whereas the 5090 ASTRAL OC is $2800 (+40% vs FE)!!!

Those AIB prices are ridiculous!

I do wonder if there's any credence to the rumors that some AIBs expressed surprise at NVIDIA's final pricing structure revealed during CES.
Makes me think that the initial price was supposed to be 2499 USD instead of 1999 USD. It'd be the only way these insane AIB markups make any sense.
 
I do wonder if there's any credence to the rumors that some AIBs expressed surprise at NVIDIA's final pricing structure revealed during CES.
Makes me think that the initial price was supposed to be 2499 USD instead of 1999 USD. It'd be the only way these insane AIB markups make any sense.
Maybe, but why not drop the prices right before release? They did not give the pricing until the 5090 NDA was lifted, aka 2 days ago, so they could have changed their MSRPs. But +20%, +25% and +40% seem like a lot on top of a GPU already sold for $2000.
 
It's still crazy that DLSS Performance (1080p internal) is only delivering 2x the performance when it's natively rendering 4x fewer pixels than 4K... we should be getting at least 3x the performance (I don't think the 9800X3D is that huge a bottleneck, but maybe...).


FPS, yeah, but not smoothness nor latency haha


Yeah $2000 was my bet since Nvidia are very greedy nowadays...

But what is scary is that all the AIBs are also charging a lot more this generation...
For example, the 4090 FE was $1600 but the MSI 4090 SUPRIM X and SUPRIM LIQUID X were both $1750 (+9.4%), whereas the same models on the 5090 are $2400 for the SUPRIM (+20% vs FE) and $2500 for the SUPRIM LIQUID SOC (+25% vs FE)!

ASUS were charging $2000 for their ROG STRIX OC (+25% vs FE), whereas the 5090 ASTRAL OC is $2800 (+40% vs FE)!!!

Those AIB prices are ridiculous!

And reviewers, although noting the high price, don't bother to expose this "price creep", even though they have much better insight into all this than regular customers.

I wonder if the base models (the ones that should have been at MSRP, but will also be more expensive) will be rarer than hen's teeth, so people will eventually cave in and buy up all these overpriced models?

I do wonder if there's any credence to the rumors that some AIBs expressed surprise at NVIDIA's final pricing structure revealed during CES.
Makes me think that the initial price was supposed to be 2499 USD instead of 1999 USD. It'd be the only way these insane AIB markups make any sense.

I doubt it. AIBs have really detailed price negotiations a year in advance, and even small changes require re-negotiations. Maybe they just know how little volume there is of this "Gaming" line, so instead of generating revenue from a large number of cards sold, they have to generate it from a smaller number - basically pre-scalping the buyers.
 
Anyone have any thoughts on the transistor count compared to Ada? Save for the ROPs, everything increased by 33% across the board, yet the transistor count only went up 20%. I realize it's not necessarily going to be 1:1 scaling, but I also wouldn't expect a difference of 9 billion transistors either.
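A minimal sketch of that gap, assuming the commonly listed transistor counts (AD102 ≈ 76.3 billion, GB202 ≈ 92.2 billion; these figures aren't from the review itself):

```python
# Transistor-count gap vs. naive 33% scaling. The AD102 (76.3B) and
# GB202 (92.2B) counts are the commonly listed spec figures, assumed here.
ad102 = 76.3e9
gb202 = 92.2e9

actual_growth = gb202 / ad102 - 1    # ~+20.8%
expected = ad102 * 1.33              # ~101.5B if every unit count scaled by 33%
shortfall = expected - gb202         # ~9.3B "missing" transistors

print(f"Actual growth: {actual_growth:.1%}")
print(f"Gap vs. 1:1 scaling with unit counts: {shortfall / 1e9:.1f}B transistors")
```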
 
The worst part of this is that the 5090 looks to be the best offering in the 5000 series in terms of gen-to-gen performance.
 