
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

Nvidia has other products for non-gaming purposes. They are called Quadro, Tesla and the like.
They are very bad for gaming, while the x090 cards are the highest-performing for gaming. They are gaming cards.
Are you aware that TONS of people buy those for productivity, and not for games? Way better price than their enterprise lineup, gets the job done, and has no restrictions when placed in a DIY setup. I'm one of those users:


I listed several (even legitimate) use cases for the 4090. Did I say which kind of buyer you personally were? ;)
The way you phrase things does sound aggressive and judgy nonetheless, and I believe that's what leads to many rude answers you often get as well.

Me. You know it, but... ?
They often have lower clocks and power limits (for better power efficiency), but will work just fine for games.
Most of that argument you got off Reddit is based on price-to-performance, which is valid, but doesn't mean the product won't be able to run games (and even do it well!)

I still don't think a x90 card is meant for gaming purely because of the insane specs and price.
Well, it's meant to be used however the customer wants to. If they want to use it just as a paperweight they're free to.
 
I think that arguing about what the 5090 is “meant” to be is missing the point a bit. It’s a halo-tier consumer GPU. That’s what it is. Said consumer can use it for gaming, for work (with certain restrictions vis-a-vis the professional line), for mining shitcoins, for OC challenges - whatever. I don’t think that NVidia particularly cares as long as they get theirs. Whether or not said SKU makes sense for each individual user’s particular tasks and monetary sensibilities is rather irrelevant to the product itself. It just is. It’s the “best” card one can buy in the consumer space. It justifies its existence by this fact alone.

Think about it this way - is something like a Porsche 911 GT3 RS a fast car, the “best” 911 on offer? Yup. Is it a full-on race car for professionals though? No, it’s still a street-legal vehicle. Does it make sense to buy it for daily driving? Not particularly. Can an average customer with the money to buy one even extract its full potential? Fuck no. Does Porsche really care what you buy it for or how you justify its existence? Again, no. Because that way lies madness, and one could conclude that the only people who SHOULD buy one are already incredibly good drivers who use it on track days. This logic might even make sense, but it isn’t how the world works.
 
In art and design there are lots of freelancers who prefer the GeForce RTX xx90 over the workstation RTX. CAD/scientific simulation users are pretty much the only target I've really seen have a hard-on for those professional cards; otherwise I've seen professionals say that in their domain, the ROI is just higher with a GeForce. And if a project really needs more than that, they would use a cloud farm anyway (and charge the client accordingly).
For CAD/CAM you need the Quadro drivers. For freelance creators you want the RTX. Hence the creator drivers for Geforce cards that came out a good bit back.

For AI, ML, DL or CUDA Geforce is where it's at. And even at the price the X090 cards are at or the Titans before them slapping four of them in a box and going from there is cheap and economical. Some systems pack six or eight X090 cards. Freelancers, contractors, businesses, universities and more go out and buy gobs of the cards often in premade systems and racks because it's cheap.

What's justifying the existence of these cards with the specs they have isn't gaming performance just as the export bans on them aren't out of fear China will stomp people in CS2 or Overwatch. It's for their actual intended justification as compute cards.

CUDA changed everything. Geforce is not a gaming brand and has not been for a good long time. It's a compute brand. I fully expect the higher end of the range to keep running away in price and performance in the future as there is still tons more performance wanted and tons more money people are willing to spend. If you doubled the price for 25% more perf it would still sell out.
 
Freelancers, contractors, businesses, universities and more go out and buy gobs of the cards often in premade systems and racks because it's cheap.
My university has a cluster with a nice mix of 3090s and A100s (40GB version, not the 80GB one, sadly), pretty nice whenever I need more horsepower.
 
That's exactly what I said
That's what I think, too. If the 5080 isn't enough, I'm not gonna spend double and then limit my 5090 to be only a little bit faster than the 5080. It's a huge waste of money.
But what I'm saying is a power-limited 5090 won't be "a little bit faster". Much like the 4090: even sacrificing 5-10 percent performance, the 4090 is still WAY ahead of the 4080.
Then why do we have massive differences between GPUs such as the 5700 XT vs the Vega 64, the former of which was faster with only 62% of the cores, despite having not much of a clock speed difference?
Hmmm... not much? In what universe? The boost of RDNA 1 hits 1900-ish, compared to just 1500 for the Vega. I can tell you, from owning two, that the Vega did not maintain anywhere close to 1500 under load either; much closer to 1350, MAYBE 1400 with manual tuning and the fan cranked to maximum. So in the real world, you're looking at closer to a 500 MHz difference in clock rate.

Also, shader core count is not everything. Vega 64 and the 5700 XT had the same ROP count. Vega 56, despite being cut down, was within margin of error of the Vega 64 by comparison, suggesting Vega 64 was ROP-limited. Even Vega 56 had issues too; the HBM was too high-latency to allow the GPU to fully stretch its legs.

In short: not an arch issue, but instead multiple design issues that gimped the card's actual potential. Much like the 7900 XTX, Vega 64 should have been a tier higher given its core size and power use, but was held back by design issues.
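To put rough numbers on the cores-versus-clocks point, here's a quick back-of-the-envelope sketch in Python (the sustained clocks are the real-world figures claimed above, not official boost specs):

```python
# Paper FP32 throughput: shaders x 2 ops/clock (FMA) x clock.
# Sustained clocks below are the real-world figures claimed in this thread.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput in TFLOPS (2 ops per shader per clock via FMA)."""
    return shaders * 2 * clock_ghz / 1000

vega_64 = fp32_tflops(4096, 1.40)     # ~1350-1400 MHz sustained per the post above
rx_5700_xt = fp32_tflops(2560, 1.90)  # ~1900 MHz sustained per the post above

print(f"Vega 64: {vega_64:.1f} TFLOPS")     # ~11.5 TFLOPS on paper
print(f"5700 XT: {rx_5700_xt:.1f} TFLOPS")  # ~9.7 TFLOPS on paper
print(f"Shader ratio: {2560 / 4096:.0%}")   # the 62% figure quoted above
```

The 5700 XT wins in games despite the lower paper TFLOPS, which is the whole point: ROPs, memory latency and architecture matter as much as raw shader math.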
 
The way you phrase things does sound aggressive and judgy nonetheless, and I believe that's what leads to many rude answers you often get as well.
The rude answers come from people who can't be bothered to read my posts fully and properly, and are just looking for a trigger instead of a starter for an educated conversation. It's sad, but hey, humans do human things, right?

Well, it's meant to be used however the customer wants to. If they want to use it just as a paperweight they're free to.
Sure, don't let me stop them. :)

I'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
 
I'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
It's a bit too expensive for you and me, but not for the guys who are waiting for this.
 
It's a bit too expensive for you and me, but not for the guys who are waiting for this.
That's fine, but the fact that there are regular gamers waiting for this kind of explains why GPUs are so expensive these days (referring to another thread). It's because people want them anyway, so why shouldn't they be expensive, right? ;)
 
And I thought I went overboard with a 1200 W PSU for my Q6600 build (I intended to go XFire at some point).
Now I’m back down to 850 W.
But the kilowatt-plus PSU is pretty much becoming mandatory…
 
I'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
Phrasing it this way is perfectly nice. Calling it stupid or something akin to that is not.
That's fine, but the fact that there are regular gamers waiting for this kind of explains why GPUs are so expensive these days (referring to another thread). It's because people want them anyway, so why shouldn't they be expensive, right? ;)
They are a minority. Even on this forum, x090 owners are a minority, so that's not the main driving factor behind those prices.
And I thought I went overboard with a 1200 W PSU for my Q6600 build (I intended to go XFire at some point).
Now I’m back down to 850 W.
But the kilowatt-plus PSU is pretty much becoming mandatory…
I will still be running dual GPUs with my 850W PSU until it dies :p
 
I'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
I think a significant percentage of these cards are dual use. Gaming performance is just a welcome bonus (and that's how the card gets included in Steam stats).
 
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti Super atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
Too late for that. I'm putting in a bigger fan. "I've got to have more power, Scotty!" said Capt. Kirk. :lovetpu: y'all got me :laugh: lmao
Next year I'm going for aviator oxygen and a leather jacket.
 
Think about it this way - is something like a Porsche 911 GT3 RS a fast car, the “best” 911 on offer? Yup. Is it a full-on race car for professionals though? No, it’s still a street-legal vehicle. Does it make sense to buy it for daily driving? Not particularly. Can an average customer with the money to buy one even extract its full potential? Fuck no. Does Porsche really care what you buy it for or how you justify its existence? Again, no. Because that way lies madness, and one could conclude that the only people who SHOULD buy one are already incredibly good drivers who use it on track days. This logic might even make sense, but it isn’t how the world works.

Now I want to play Forza :)
 
A higher TDP was definitely predictable knowing that NVIDIA is using a pretty similar TSMC node (4 nm). Now let's hope those TDPs are like on the RTX 40 series: almost never reached unless it's at 4K (or DLSS Quality) @ Ultra settings with Ray Tracing or Path Tracing.
 
600 W, $2000+, 20-30%… hard, hard, hard pass
20-30%?
This thing is much more than 20-30% faster vs the 4090, if that's what u mean?

And im sure u will pass on it anyway and go buy another AMD low/mid-tier GPU

Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti Super atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
U have a very, very poor AC if an extra 575 W overloads it.
We have 4 PCs, all 300 W+, and room temps aren't even rising.

And u know, it's not using 575 W just by being in ur PC; u need to stress it 100%.
 
U have a very, very poor AC if an extra 575 W overloads it.
We have 4 PCs, all 300 W+, and room temps aren't even rising.

And u know, it's not using 575 W just by being in ur PC; u need to stress it 100%.
My 4090 can definitely make my room pretty hot when using the 530W BIOS...
 
Assuming you want 50% usage of the PSU:
9800X3D: 176 W
RTX 5090: 575 W
everything else in the case: 100 W
reserve: 100 W
951 W total power

2x 951 W = 1902 W

Where's that 2 kW PSU? And we're not even talking about SLI/NVLink setups.

Optimised setup with the RTX 5090 at ~430 W (75% power limit) + 176 W CPU + 200 W others = 806 W; x2 = 1612 W, and 1600 W PSUs are in retailers' shops.
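The same arithmetic as a quick sketch, taking these worst-case wattages at face value (the replies below argue they're far too pessimistic for gaming loads):

```python
# PSU sizing with a 50% target load, using the worst-case draws quoted above.
draws_w = {
    "9800X3D": 176,          # CPU at its full power limit
    "RTX 5090": 575,         # full TDP
    "everything else": 100,
    "reserve": 100,
}

total_w = sum(draws_w.values())  # 951 W
print(f"Total draw: {total_w} W -> {total_w * 2} W PSU")  # 951 W -> 1902 W

optimised_w = 430 + 176 + 200    # GPU power-limited to ~75%
print(f"Optimised: {optimised_w} W -> {optimised_w * 2} W PSU")  # 806 W -> 1612 W
```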
 
You are laughably full of crap claiming over 1200 watts would not impact the temperature of a room.
Well, if he dumps 1200 W into his room from his PC, then turns on the 2000 W A/C, then of course he doesn't feel a single °C difference. Only his electricity bill does. :p
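For anyone who wants to sanity-check the room-heating argument: essentially every watt a PC draws ends up as heat in the room, and A/C capacity is usually quoted in BTU/h. A rough sketch below; the 1200 W PC and 2000 W A/C are the figures from this exchange, while the COP of ~3 is a typical assumption, not a measured value.

```python
# Every watt the PC draws ends up as heat in the room (1 W = 3.412 BTU/h).
W_TO_BTU_H = 3.412

pc_heat_w = 1200   # PC draw from this exchange
ac_input_w = 2000  # A/C electrical input from this exchange
ac_cop = 3.0       # assumed coefficient of performance (typical, not measured)

pc_heat_btu_h = pc_heat_w * W_TO_BTU_H               # ~4,094 BTU/h added
ac_cooling_btu_h = ac_input_w * ac_cop * W_TO_BTU_H  # ~20,472 BTU/h removed

print(f"PC adds:   {pc_heat_btu_h:,.0f} BTU/h")
print(f"A/C moves: {ac_cooling_btu_h:,.0f} BTU/h")
# The A/C can absorb the PC's heat, but it has to run longer -> higher bill.
```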

I think a significant percentage of these cards are dual use. Gaming performance is just a welcome bonus (and that's how the card gets included in Steam stats).
I'm sure that's the intended purpose. But you know people... some of them just can't live without the newest, bestest, most expensivest toy (and then come here only to brag about it).
 
My PSU recently died so I "downgraded" to a 1000W. Seeing the TDP I thought my recent purchase would put the 5090 out of reach but according to those online PSU calculators I only need 650W if I slap in a 4090, so 1000W should be enough. I have a hard time believing that though lol

I've not really touched my "luxury spending" account for over a year, so am tempted to get the 5090 if it's less than £2k...
I've been on 1600p for 13 years so am eyeing up those 4K 240 Hz OLEDs. Worried about that sweet burn-in and Nvidia vendor-locking DLSS 5 to the 6* series. Imagine spending £2k for that to happen... Will probably do some research after it comes out, second-guess myself, then not spend anything for another year :laugh:

It doesn't. I answered a question. The hostility... Geez. :confused:

Should everybody who doesn't intend to buy a 5090 just shut the F up and leave? :confused:
Something about pots and kettles. You sort of spent the first page harassing someone for being "a slave to buying things", followed by passive-aggressive remarks about said slavery and America...
 
Assuming you want 50% usage of the PSU:
9800X3D: 176 W
RTX 5090: 575 W
everything else in the case: 100 W
reserve: 100 W
951 W total power

2x 951 W = 1902 W

Where's that 2 kW PSU? And we're not even talking about SLI/NVLink setups.

Optimised setup with the RTX 5090 at ~430 W (75% power limit) + 176 W CPU + 200 W others = 806 W; x2 = 1612 W, and 1600 W PSUs are in retailers' shops.
Wut?? The 9800X3D is closer to 60~70W in gaming loads, max a hundred.
Also, why would you reserve 200 W for the rest of the system? That's a lot; in gaming scenarios we should be under 40~50 W for the whole rest of it.
700W total is a reasonable system power estimate.

If you want maximum efficiency the load range is 40 to 60% and you can still go a bit over without much loss.

With this in mind a 1200W PSU is already more than enough. And even a barely decent 850W would probably do the trick. The power rating of the PSU doesn't need to be divided by 3, there already is a safety margin.
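Here's that corrected estimate as a sketch, using the gaming-load numbers from this post (the 40-60% band is the peak-efficiency range mentioned above):

```python
# PSU load check with realistic gaming draws instead of stacked worst cases.
gaming_draws_w = {
    "9800X3D (gaming)": 70,  # ~60-70 W in games, not its 176 W limit
    "RTX 5090": 575,         # keep full TDP as the pessimistic part
    "rest of system": 50,
}

total_w = sum(gaming_draws_w.values())  # 695 W, close to the 700 W estimate

for psu_w in (850, 1000, 1200):
    load = total_w / psu_w
    sweet_spot = 0.40 <= load <= 0.60   # peak-efficiency band
    print(f"{psu_w} W PSU: {load:.0%} load, in sweet spot: {sweet_spot}")
```

An 850 W unit runs well above the sweet spot but still inside its rating, which is why it would probably do the trick without being the ideal pick.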
 
Something about pots and kettles. You sort of spent the first page harassing someone for being "a slave to buying things", followed by passive-aggressive remarks about said slavery and America...
Taking a stab back when all parties have moved on ages ago is definitely worth it. You must feel very proud for calling out all that social injustice radiating from my horribly evil posts. :rolleyes:

Let's be clear once again: I do not have a problem with anyone who wants a 5090. Just don't shove it under my nose out of a misguided superiority/inferiority complex, will 'ya? :)

The GPU you buy does not make you better or worse. It's just an object. A piece of silicon, metals and plastic. Using it for bragging rights implies a very sad way of existence.
 
A 1200 W PSU will be enough for a 5090, right?
 
I'm sure that's the intended purpose. But you know people... some of them just can't live without the newest, bestest, most expensivest toy (and then come here only to brag about it).
I can't say I know what the intended purpose is... but if you can run DX by night and CUDA by day, be it for work, learning, a hobby, or searching for interplanetary communication, then the GPU sure seems less overpriced.
 