
NVIDIA GeForce RTX 5090 Founders Edition

Just noticed this.
[attached screenshot]

That’s… some interesting behavior from the Swarm Engine. Doesn’t show up on higher resolutions.
 
MOMMIES CC look out, the little ones will be snatching it from your purse to buy this nonsense.
 
@W1zzard

I see the avg FPS and FPS/$ charts; just curious if an FPS vs. transistors or FPS per 1k transistors chart could be considered?
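If it helps, that metric can be derived straight from numbers already in the review. A minimal sketch, assuming the commonly cited transistor counts (roughly 92.2 billion for the 5090's GB202 and 76.3 billion for the 4090's AD102) and using placeholder FPS values rather than the review's data:

```python
# FPS per billion transistors, combining review FPS with published die specs.
# Transistor counts are the commonly cited figures; FPS values are placeholders.
cards = {
    "RTX 5090": {"avg_fps_4k": 150.0, "transistors_b": 92.2},
    "RTX 4090": {"avg_fps_4k": 119.0, "transistors_b": 76.3},
}

for name, c in cards.items():
    fps_per_b = c["avg_fps_4k"] / c["transistors_b"]
    print(f"{name}: {fps_per_b:.2f} FPS per billion transistors")
```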
 
Power consumption during video playback? Multi-monitor? Idle?! IT'S HORRIBLE!
People literally hated AMD and the XTX because of this, but I guess... everyone and their mom will love Nvidia now. For the record, the XTX was high too. I watch a lot of movies; can you imagine draining 50-60 W for no reason at all, while my 10 W laptop CPU can play movies nicely without any issue, and I can even output to my 4K TV? What's going on with Nvidia now? Are they becoming AMD?

Also, multi-monitor power is super high, and I've got 3 monitors + a TV, ouch. This is not a card for me, because I keep my stuff on 24/7 during work and gaming, and I even fall asleep sometimes. Useless power drain and higher energy bills. Yes, someone who spends 2000 bucks like me still cares about that. Trust me, when you've got a ton of lights, heating (water + air), fridges and plenty of other stuff, every bit helps. Not that long ago, people hated 50-80 W bulbs for a reason. It's not peanuts. Anyway, this card will most likely be 3k in Europe, so I probably won't get it. If it were closer to 2k, I might consider it... but probably not. I don't buy defective wastes of energy. I did skip the XTX after all, even though it was better than the RTX 4080 price-wise and even performance-wise (minus RT).
 
Welcome to the days of being CPU limited at 4K.

Still think I’ll get one though. Seems like fun.
 
Anyone who got an MSRP 4090 won the proverbial lottery.

Turns out the 4090 WAS the next 1080 Ti.....
I still gloat about how I got a 4090, 13900KF, 32GB, 1TB SSD, 4TB HDD prebuilt from CLX through Best Buy off of OfferUp, because the guy selling it "got an extra one" when he ordered the same setup, for $2,250 back in April of '23. I've been riding high.

I saw that graph, but I seriously disagree with it.

For gaming power use, the 4090 sits at 411 W and the 5090 at 587 W. That's roughly a 43% increase.

Then on the performance charts, we see a 26% average increase at 4K.

So something doesn't add up.
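For what it's worth, here's a quick back-of-the-envelope check of those two figures, a minimal sketch using only the numbers quoted above:

```python
# Back-of-the-envelope perf-per-watt check from the figures quoted above.
p_4090, p_5090 = 411.0, 587.0   # average gaming power draw in watts (quoted)
perf_ratio = 1.26               # 5090 is ~26% faster on average at 4K (quoted)

power_ratio = p_5090 / p_4090
print(f"Power increase: {power_ratio - 1:.1%}")                                     # ~42.8%
print(f"5090 perf-per-watt vs 4090 at max load: {perf_ratio / power_ratio:.2f}x")   # ~0.88x
```

That ~0.88x only describes the cards running at their full power limits; a review's headline efficiency chart may well be measured at a fixed workload or frame cap rather than at maximum board power, which would explain why the two views disagree.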

Context is a difficult thing to grasp.

For a 2-slot cooler to be managing a card pulling damn close to 600 W is nothing short of a miracle.
Agreed, it's quite the amazing feat. It still feels wrong to have any component of a video card running over 70 °C, but that may just be because of what I've been seeing over the last 6 years.

How is efficiency calculated? Which wattage and FPS data are used?

Edit: I made a mistake in the original post; there are things about the efficiency calculation that are still not clear to me.
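Not speaking for the review's exact methodology, but an efficiency chart like this typically boils down to average FPS divided by average board power in a defined test scene, normalized to a reference card. A minimal sketch with placeholder numbers:

```python
# Hypothetical sketch: efficiency as FPS per watt, normalized to a reference card.
# Both the FPS and power values below are placeholders, not the review's data.
def fps_per_watt(avg_fps: float, avg_power_w: float) -> float:
    return avg_fps / avg_power_w

results = {
    "RTX 5090": fps_per_watt(avg_fps=150.0, avg_power_w=587.0),
    "RTX 4090": fps_per_watt(avg_fps=119.0, avg_power_w=411.0),
}

reference = results["RTX 4090"]
for card, eff in results.items():
    print(f"{card}: {eff:.3f} FPS/W ({eff / reference:.0%} of the 4090)")
```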
It's mighty big of you to admit a mistake. Well done, mate.

Here's some fun with numbers: the 5090 is four times faster than the B580 for eight times the MSRP. I feel like Nvidia customers are paying half of the price for hardware and half of the price for software.
Yes, Nvidia is putting the cost to train the DLSS models into the video cards. Tom's Hardware has an article about Nvidia having what is practically a supercomputer working continuously for the past 6 years to develop and improve DLSS.
 
It's a halo product; you should not compare it (performance per $) to other cards...

But the 5080 and under might be pretty ugly ^^"
I feel this is a fallacy; it should totally be compared to other products, and depending on the usage, a choice must be made. Purely for gaming? You must have money to burn. AI, game development, video editing, modeling and rendering, and gaming? Here's your best option.

Also I wouldn't be me if I hadn't done that.
[two attached images]

One would need to buy approximately 170 thousand RTX 5090s, or invest at least $340 million USD + VAT, in order to get one football field's worth of RTX 5090s.

Now that the RTX 5090 has been measured in football fields, we can go back to actual gaming.
I honestly thought only people from the USA do these kinds of random measurements. Like "it weighs as much as 8 elephants!"

Yes, this is just the 4090 Ti, especially if Nvidia brings DLSS 4 to the 4000 series.
They most likely won't, though. It's a benefit to them to keep this software locked to their newest generation. There hasn't even been a rumor of DLSS 3 coming to 30 series cards. And it's kinda been nagging me that there's a mismatch between the DLSS version and the thousands number in the series, meaning DLSS 4 for the 5000 series and DLSS 3 for the 4000 series.

They should have kept the cooler a 3-slot. Not liking the temps considering it's already using liquid metal. This was never meant for me; the power consumption is insane.
I like that there's an extreme offering towards the smaller range. I figure none of the AIB partners would do it, since I haven't seen any 4090s or 4080s like that.

Ah, missed the latencies, thank you. I still want to see some image comparisons and a short video with MFG enabled.
TPU will definitely have an article evaluating DLSS 4 and all its updated permutations.
 
Damn, AMD really is awful at ML, isn't it?
Well, I have no comment on that, since I've never been able to measure it up.
This is why I asked @W1zzard to use token/s instead of that arbitrary time option.
With that, you can compare the cards they tested against each other, but it's not good if you wish to compare your own rig's performance.
Generally I like what I have, no issues, and the 20GB VRAM is still a lot; only the 4090/7900 XTX, now the 5090, and the industrial cards have more.
But when it comes to complex LLMs, there's not enough :roll:

Here is the new DeepSeek Llama 8B F16: 44.12 token/s is quite good!
[attached screenshot]

And Microsoft's Phi4 14B Q8 with 42.83 token/s
[attached screenshot]

Yeah, it is an equation :slap::D
So it's not bad, but I would LOVE it if TPU had a section for this in the GPU reviews and maybe even in the GPU database.
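For anyone wanting to reproduce a token/s figure on their own rig, the calculation is just generated tokens divided by wall-clock generation time. A minimal sketch; the generate callable and the usage lines are hypothetical stand-ins, not any specific tool's API:

```python
import time

def tokens_per_second(generate, prompt: str) -> float:
    """Time one generation call and report throughput.

    `generate` is any callable returning the list of generated tokens;
    it stands in for whatever backend (llama.cpp, LM Studio, etc.) you use.
    """
    start = time.perf_counter()
    tokens = generate(prompt)
    elapsed = time.perf_counter() - start
    return len(tokens) / elapsed

# Hypothetical usage:
# tps = tokens_per_second(my_model.generate, "Explain GPU memory bandwidth.")
# print(f"{tps:.2f} token/s")
```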
 
I mean, yeah, it's an impressive GPU, but the price is unimpressive.... I will tank it with my 3090 for another 2 years for the 60 series or whatever AMD offering there is then.

But folks on the 10 series should upgrade, and 20 series owners can consider it... 30 series folks, I think, can skip this 50 series.
What's crazy is remembering the 3090 Ti launch in early '22, with the expected release of the 40 series later that year. The 3090 Ti went for 2k USD and it was less than 10% faster than the 3090. I believe the consensus was that it was just binning to get the Ti version. What a terrible deal that was.
 
How can you write in the conclusion that the 5090 is highly energy efficient when its efficiency is only about 1-2% better than the previous 4090?! And lower models of the previous generation are more efficient too! That's no advancement at all!
 
How can you write in the conclusion that the 5090 is highly energy efficient when its efficiency is only about 1-2% better than the previous 4090?! And lower models of the previous generation are more efficient too! That's no advancement at all!

It's more efficient than the previous top-efficiency card.

What would you call it, less efficient?
 
I keep harping on about the wall that silicon lithography has hit and people keep ignoring me, and then do surprised Pikachu faces when there is no efficiency gain generation-on-generation. Because efficiency, by and large, comes from the node size and that isn't getting appreciably smaller. If y'all are crying this hard about lack of generational performance uplift now, you're gonna be drowning in your tears for a long time, because there ain't any good solutions in sight in the next half-decade at best. Physics is a harsh mistress.
Totally agree, and it's why Nvidia has been developing tensor cores, ray tracing, DLSS, and Frame Gen. They knew that pure hardware improvements weren't going to continue at a steady pace and so they went laterally to make improvements
 

Check at the 16:08 mark for a comparison with the FPS locked.
A weird review, but I kept watching, so his presentation works.

So the 4090 is more efficient than the 5090. The dude loves his frame rates; the review indicates he might be a bit of a shooter gamer, as he called 30 fps unplayable and considered visual artefacts worth it for higher frame rates. His capped-rate testing was still at really high frame rates. :) Yes, I consider 144 really high.

I did similar testing when comparing my 4080 to my 3080: the 30 series was more efficient than the 40 series in low-end games, because the cards can run at lower voltage and clocks. I think this case has some similarities, but ultimately the extra cores hinder the 5090, and it can't make up for that by dropping clocks and voltage due to a high floor set in the BIOS/driver.

Something of note in the TPU data as well: look at how high the power draw is for this thing when playing videos.
 
How about if we put the 5090 at higher than 4K resolution? I agree with you re: the wall. 600 W power consumption for a 27% increase at 4K shows it.

My guess is that at 8K the 5090, with its massive memory bandwidth, will be much more efficient, even if unplayable.
On a mass scale, how much 8K adoption do you really think there is? I highly doubt it's even more than 5% of gamers. W1zz is tired and is hitting that 80% of coverage as best he can.
 
"at the SKU's baseline price of USD $1,999"

Ostensible price of $1999.

Anyone who paid attention to how the pricing for the 4090 went knows how this will go. A tiny number of FEs will be available, sporadically, and everyone else will have to buy 3rd-party cards with tiny overclocks for quite a bit higher price. The threads and trackers for "hope" over getting an FE 4090 is something gamers (and home AI enthusiasts) shouldn't forget nor should tolerate.
 
It's more efficient than the previous top-efficiency card.

What would you call it, less efficient?

You'd call it a draw. If the result is within the margin of error, it will change depending on run-to-run variance, test setup, software version, and other test considerations that may vary from review to review depending on methodology. Other large factors, like the test suite used, will heavily influence the results as well.

Totally agree, and it's why Nvidia has been developing tensor cores, ray tracing, DLSS, and Frame Gen. They knew that pure hardware improvements weren't going to continue at a steady pace and so they went laterally to make improvements

Nvidia pushed those technologies to enable AI and real-time ray tracing, not because we are hitting the limits of how much we can shrink chips.

This is a tock generation; performance uplifts were expected to be small. People need to stop running around like the sky is falling every time there's a tock generation, only for performance gains to return to normal as they always do.

Scaling chips down is getting harder, but the amount of investment has been exploding. This equilibrium has been the balancing force in chip manufacturing since its inception.
 
You'd call it a draw. If the result is within the margin of error, it will change depending on run-to-run variance, test setup, software version, and other test considerations that may vary from review to review depending on methodology. Other large factors, like the test suite used, will heavily influence the results as well.



Nvidia pushed those technologies to enable AI and real-time ray tracing, not because we are hitting the limits of how much we can shrink chips.

This is a tock generation; performance uplifts were expected to be small. People need to stop running around like the sky is falling every time there's a tock generation, only for performance gains to return to normal as they always do.

Scaling chips down is getting harder, but the amount of investment has been exploding. This equilibrium has been the balancing force in chip manufacturing since its inception.
I'm not saying we've hit the limit yet, but that it is coming soon. I can't recall where I read that electrons don't physically behave the same and can jump around at the 1-nanometer transistor scale. I'm saying that Nvidia is prepping us with all these software and lateral hardware developments alongside the traditional architecture and node improvements.
 
"at the SKU's baseline price of USD $1,999"

Ostensible price of $1999.

Anyone who paid attention to how the pricing for the 4090 went knows how this will go. A tiny number of FEs will be available, sporadically, and everyone else will have to buy 3rd-party cards with tiny overclocks for quite a bit higher price. The threads and trackers for "hope" over getting an FE 4090 is something gamers (and home AI enthusiasts) shouldn't forget nor should tolerate.

I bought a 4090 in December of '23 for below MSRP.

[attached image]
 
How can you write in the conclusion that the 5090 is highly energy efficient when its efficiency is only about 1-2% better than the previous 4090?! And lower models of the previous generation are more efficient too! That's no advancement at all!
Look through pages 6-9 or so to see this being discussed. It pretty much sums up to: power consumption has increased, but performance has increased enough to keep efficiency in step with it.
 
For CS2, the most commonly used resolution is 960P. Are you willing to add this test specifically for CS2?
 
I bought a 4090 in December of '23 for below MSRP.

[attached image]
Congrats on getting a 4090 at basically MSRP at month 14 of its release... So everyone should just hold their horses for 12+ months and they MAY get it at MSRP, following previous trends, is what you're saying?

Have people missed the fact that this is on the SAME manufacturing node as the 40xx series? Why were people expecting to see MASSIVE efficiency/power-draw gains in this part?

It's like when Intel went from 12th gen to 13th to 14th gen. You can't beat the laws of physics. If anything, the fact that they have increased die size and transistor count with such small drops in clock speed is pretty damn impressive; add to that the fact that they refined the design enough to eke out a few percentage points of efficiency in the heavily loaded areas, which is pretty good. Look at Intel with the 14nm+++++++++++++ era or the 12th/13th/14th gen eras.
 
Congrats on getting a 4090 at basically MSRP at month 14 of its release... So everyone should just hold their horses for 12+ months and they MAY get it at MSRP, following previous trends, is what you're saying?

Have people missed the fact that this is on the SAME manufacturing node as the 40xx series? Why were people expecting to see MASSIVE efficiency/power-draw gains in this part?

It's like when Intel went from 12th gen to 13th to 14th gen. You can't beat the laws of physics. If anything, the fact that they have increased die size and transistor count with such small drops in clock speed is pretty damn impressive; add to that the fact that they refined the design enough to eke out a few percentage points of efficiency in the heavily loaded areas, which is pretty good. Look at Intel with the 14nm+++++++++++++ era or the 12th/13th/14th gen eras.
Mid to end of gen is generally the best time to buy, yeah.
 
If it's 1% better than the most efficient card on earth, then it is, by definition, the new most efficient card on earth.

Pulling 600w on its own does not make something inefficient.
No, it does not: when the power usage itself goes up by 27%, it totally counteracts any improvements made at all.
 
Mid to end of gen is generally the best time to buy, yeah.
Get a 3090 Ti at $1,999 and watch it crash to $999 six months later. What a great idea. Same deal with the 5090 when the 6080 comes out; you never know. Now, with 32 GB, that will be a little harder to beat. But if the 5080 gets a 24 GB Super refresh or even a Ti, that's encroaching on 90% of 4090 territory.
 