
NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

Desperation?

My dude, their main competitor is supposedly not even trying high end this upcoming gen.

I was obviously (but apparently not obviously enough) not referring to desperation with regard to their non-existent competitor ;) but desperation in finding the much-needed performance gains to motivate people to buy an RTX 5090.
As I said, it's not going to be an easy task. The boost from the optimized 5nm process will be minimal, the Blackwell architecture will only provide so much of a boost, and the 512-bit memory interface (if true) will contribute to a higher power envelope as well as a higher cost.

I would be very positively surprised if nVidia can manage to squeeze more than a +30% gain out of the RTX 5090 over the RTX 4090. Maybe they can in ray tracing by slashing the rasterization performance (even more), but overall I believe they will have difficulty making the RTX 5090 an attractive upgrade for RTX 4090 owners. The 512-bit interface reeks of desperation to be able to advertise at least some innovation (instead of real innovation like a 3nm process).

As I said in my previous post, I'm convinced that both nVidia and AMD will be more or less half-assing their upcoming consumer generations in favor of AI. Can't really blame them either. There are billions to be made from AI/datacenter while consumer stuff is comparatively boring and limited.
They have long since moved their top hardware and software talent to AI. We consumers will have to take a back seat until the AI curve flattens.
 
I would like chip designers to get back to shipping processors with default clocks dialed in at peak efficiency, leaving plenty of meat on the bone for overclockers.

This right here, boost/turbo/dynamic frequencies ruined the best part of PC building.
 
900 W is the SXM datacenter variant of the largest processor. It's been that way for a very long time.

It won't be scary if it has a functional power connector.

Which it does.

450 W is the max I'm willing to accept, be it a 5080 or a 5080 Ti.

600 W, lol. There are PSUs with just 600 W available for the whole damn PC, and now that's just the GPU.

The 5080 will likely have the same 320 W footprint as the 4080, and likewise realistically consume 200-220 W in most workloads.
 
Seems about right
 
Too bad they can't make a 250 W version deliver the same output as the 600 W version; now that would be an actual milestone. The original roadmap they had back in the 1000 series days said they would look into increasing performance while reducing power. Then they threw that in the garbage.
 
There are appliances in your house that draw far more current from a standard outlet.

Toasters, microwave ovens, hair dryers, and more. And you've probably had them for years, maybe even decades.

The device in question needs to be carefully designed and properly used. It's the same whether it's a graphics card or a stand mixer.

That said, 600 W is probably nearing the practical limit for a consumer-grade GPU. Combined with other PC components, display, peripherals, etc., that's using up a lot of what's available in a standard household circuit (15 A @ 120 V here in the USA). And it's not really wise to push the wiring and breaker in a standard household circuit to the limit for long periods of time.
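Rough back-of-the-envelope on that circuit budget; every component figure below is an assumed round number, not a measurement:

```python
# Rough budget for a single 15 A / 120 V household circuit (USA).
# All component figures are assumed round numbers for illustration.
circuit_watts = 15 * 120                 # 1800 W theoretical
continuous_watts = circuit_watts * 0.8   # ~80% rule for continuous loads -> 1440 W

gpu = 600           # hypothetical 600 W card at full board power
cpu = 250           # high-end CPU under load
rest_of_pc = 150    # board, RAM, drives, fans
psu_loss = (gpu + cpu + rest_of_pc) * 0.10   # assuming a ~90% efficient PSU
display_and_peripherals = 100

total = gpu + cpu + rest_of_pc + psu_loss + display_and_peripherals
print(f"Wall draw ~{total:.0f} W of a ~{continuous_watts:.0f} W continuous budget")
# -> roughly 1200 W, i.e. most of the circuit before anything else is plugged in
```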

It depends; a 14900K + 4090 system could exceed 1,000 W.

Mind you, people aren't running their toasters, microwaves, and ovens 6-24 hours a day, whereas a computer with a 4090 might be set to running AI or rendering all day. Go ahead and leave your toaster on continuously for a year and see what happens.
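Putting assumed numbers on that difference in duty cycle:

```python
# Energy comparison with assumed, illustrative numbers
toaster_w, toaster_minutes_per_day = 1200, 10
pc_w, pc_hours_per_day = 1000, 24        # rendering/AI box running around the clock

toaster_kwh_day = toaster_w * (toaster_minutes_per_day / 60) / 1000   # 0.2 kWh/day
pc_kwh_day = pc_w * pc_hours_per_day / 1000                           # 24 kWh/day

print(f"Toaster: {toaster_kwh_day:.1f} kWh/day, PC: {pc_kwh_day:.1f} kWh/day")
print(f"The PC uses ~{pc_kwh_day / toaster_kwh_day:.0f}x the energy per day")
```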

A lot of people's issue isn't per se with the power consumption but with the power connector. They may have revised the connector so it can no longer draw power when not fully seated, but that doesn't address the low safety tolerances, nor the fact that the cable itself still puts pressure on the pins, requires 30 cm of straight cable from the connector, and is not tolerant of horizontal misalignment.
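On the safety-tolerance point, here's the usual comparison with commonly cited pin ratings (treat the exact ratings as assumptions):

```python
# Per-pin load vs. rating, using commonly cited (assumed) connector figures
# 12V-2x6 / 12VHPWR: 6 current-carrying 12 V pins, ~9.5 A rating per pin
hpwr_pins, hpwr_pin_rating_a = 6, 9.5
hpwr_current_a = 600 / 12                                        # 50 A at the full 600 W
hpwr_margin = (hpwr_pins * hpwr_pin_rating_a) / hpwr_current_a   # ~1.14x

# Classic 8-pin PCIe: 3 x 12 V pins, ~8 A per pin, 150 W spec
pcie_pins, pcie_pin_rating_a = 3, 8.0
pcie_current_a = 150 / 12                                        # 12.5 A
pcie_margin = (pcie_pins * pcie_pin_rating_a) / pcie_current_a   # ~1.9x

print(f"12V-2x6 safety factor ~{hpwr_margin:.2f}x, 8-pin PCIe ~{pcie_margin:.2f}x")
```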
 
250W is pretty high for the lowest product on the stack.

They probably leave a bit of headroom; the 4090 FE cooler can technically handle 600 W, but it was put on the 320 W 4080 as well.
 
At this point, I'd be all for a separate PSU (on-card or external, à la multi-chip Voodoo).
-48 VDC in, with 12 V converted on-card?
Folks actually used to mod the Xbox 360 power brick to feed their extra GPUs in the classic 5,1 Mac Pro.
 
better to use dual cards than some overweight chunk of metal
The videogame industry already tried that and it didn't work out, for a variety of reasons. This topic has been beaten to death, so I'll leave it to you to research it and find the autopsies online.

Enjoy!

:):p:D
 
Wow... No GPU focused on running games should consume 600W.
Personally, I always strive to optimize for lower power consumption (whenever possible). Undervolting and limiting/capping the frame rate can make a significant difference, especially on mid and high-end hardware.
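Undervolting itself needs a curve editor, but it's easy to watch what a frame cap or power limit does to board power from a script. A minimal monitoring sketch, assuming an NVIDIA card and the nvidia-ml-py (pynvml) bindings:

```python
# Minimal board-power monitor via NVML (pip install nvidia-ml-py)
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000.0

try:
    for _ in range(10):                          # sample once a second for ~10 s
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # NVML reports milliwatts
        print(f"Board power: {draw_w:6.1f} W (limit {limit_w:.0f} W)")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```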
 
Looking forward to the efficiency gains here. I wonder what the performance level will be around the 300 W mark (that's the most interesting spot for me for a mid/high-end card), and then what's possible below 100 W; the lower-power segment is really... heating up :)

If efficiency (work done per watt) increases, which it has done significantly with RTX 40xx compared to RTX 30xx, the power limits don't bother me.
The 40 series is also the first in multiple generations where most cards don't constantly bump against the board power limit. 450 W for the 4090 sounds scary, but in reality it's rarely on that limiter. I'd be undervolting whatever I get anyway; feels like a nothingburger.
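What that efficiency framing looks like with made-up but plausible numbers (everything below is assumed, indexed to the older card = 100):

```python
# Perf-per-watt comparison; all figures assumed for illustration
old_perf, old_watts = 100, 350      # an older-generation card
new_perf, new_watts = 170, 450      # faster successor with a higher power limit

old_eff = old_perf / old_watts      # ~0.29 perf/W
new_eff = new_perf / new_watts      # ~0.38 perf/W
print(f"Efficiency gain: {new_eff / old_eff:.2f}x even though board power rose")

# Undervolted scenario: trade a little performance for a lot of power
uv_perf, uv_watts = 160, 320
print(f"Undervolted: {uv_perf / uv_watts / old_eff:.2f}x the old card's perf/W")
```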
 
600W on a gaming GPU
Hope it is not going through a single 12v-2x6 connector...


 
250w Blackwell with 2x the performance of my 6750xt would be compelling.
It's just about 20 percent better than Ada. Too meh. 170 W or less as a bare minimum (if we don't see an FPS-per-$ increase, which is a REAL possibility).

I just hope reasonable GPUs at ~450 USD will exist by mid-'25.
 
It's just about 20 percent better than Ada. Too meh. 170 W or less as a bare minimum (if we don't see an FPS-per-$ increase, which is a REAL possibility).

I just hope reasonable GPUs at ~450 USD will exist by mid-'25.

That "20% over Ada" figure has largely been conjecture from rumors, extrapolating as few architectural improvements as possible because "it's still on TSMC N4".

The RTX 4080 is already 2x faster than the 6750 XT; it's just 3 times as pricey. Generational uplift should bring cheaper cards at this performance level at the bare minimum.
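Putting rough numbers on the price/performance point (prices assumed, performance taken from the 2x figure above):

```python
# Relative value check with assumed street prices and the 2x performance figure
perf_6750xt, price_6750xt = 1.0, 330     # assumed ~USD street price
perf_4080,   price_4080   = 2.0, 1000    # "2x faster, ~3x as pricey"

value_6750xt = perf_6750xt / price_6750xt
value_4080   = perf_4080 / price_4080
print(f"FPS per dollar: the 4080 offers {value_4080 / value_6750xt:.2f}x the 6750 XT's")
# -> ~0.66x, i.e. worse value; a real generational uplift should close that gap
```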
 
There are appliances in your house that draw far more current from a standard outlet. Toasters, microwave ovens, hair dryers, and more.

Yeah, but you tend not to run them for hours.
 
I would happily get a 600 W part and step it down to 429 W, matching the one-reticle high-NA size of 429 mm². So perfect. I hope it never gets crazier than this, but it will sooner or later unless the EU steps in.
 
Seeing how crazy efficient the 4000 series can be when undervolted, I have no doubt the 5000 series will be the same.
 
but... 600 W for a GPU is a bit scary... it should have an "Electric Hazard" sticker on it...
No need for that, at such little power.

Your skin, when dry, has a resistance of around 100 kOhm. For a cardiac arrest, at least 50 mA needs to flow straight through the heart. With dry skin, a maximum of about 6 mA flows.
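The Ohm's-law sanity check behind that, using the 12 V rail and the dry-skin figure above (purely illustrative):

```python
# I = V / R, with the ~100 kOhm dry-skin figure from the post
rail_voltage = 12.0            # the card's supply rail
skin_resistance = 100_000.0    # ohms, dry skin
danger_threshold_ma = 50.0     # rough figure for current through the heart

current_ma = rail_voltage / skin_resistance * 1000
print(f"~{current_ma:.2f} mA at 12 V vs. a ~{danger_threshold_ma:.0f} mA danger threshold")
# -> a tiny fraction of a milliamp; the practical hazard with these cards is heat
#    at the connector, not electric shock
```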
 
Generational uplift should bring cheaper cards at this performance level at the bare minimum.
I'm not the one arguing the opposite.

But hey, since AMD doesn't even try to compete and Intel doesn't have it just yet, why should we get any FPS-per-$ improvements? They will be limited, to say the least.
 
There are appliances in your house that draw far more current from a standard outlet. Toasters, microwave ovens, hair dryers, and more.
Sure, and vacuum cleaners... but guess what: none of those appliances are permanently fixed to your desk, now are they? Nor do they run all evening.
 
The power draw of 600 watts is crazy.
 
300W max is reasonable to me. 350W if I close one eye.
 