Thursday, May 9th 2024

NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

According to Benchlife.info insiders, NVIDIA is reportedly testing designs with various Total Graphics Power (TGP) levels, ranging from 250 W to 600 W, for its upcoming GeForce RTX 50 series "Blackwell" graphics cards. The designs span a 250 W configuration aimed at mainstream users up to a more powerful 600 W configuration tailored for enthusiast-level performance. The 250 W cooling system is expected to prioritize compactness and power efficiency, making it an appealing choice for gamers seeking a balance between capability and energy conservation. This design could prove particularly attractive for those building small form-factor rigs, or for AIBs looking to offer smaller coolers. On the other end of the spectrum, the 600 W cooling solution carries the highest TGP of the stack and is possibly intended only for testing purposes. Other SKUs with different power configurations sit in between.

We have previously witnessed NVIDIA testing a 900 W version of the Ada Lovelace AD102 GPU SKU, which never saw the light of day, so we should take this testing phase with a grain of salt. Often, the engineering silicon is the first batch made to enable software and firmware development, while the final silicon is more efficient, optimized to use less power, and aligned with regular TGP structures. The current highest-end SKU, the GeForce RTX 4090, uses a 450 W TGP. So, take this phase with some reservations as we wait for more information to come out.
Source: Benchlife.info

84 Comments on NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

#51
chrcoluk
If they want to try to balance compactness maybe this time xx80 cards will be dual slot, instead of being as big as a tank.
Posted on Reply
#52
TheinsanegamerN
My god, the pearl clutching. "muh power, muh TDP!" Funny, I remember the SAME type of people whining and crying about Fermi's power use. "300w is insane, a GPU shouldn't pull more than 150w". Now it's switched to "Waah 600w is too much, 350w is the limit".

Just....buy a smaller GPU and stop crying? Just go get a 6750xt or 4060ti and revel in the power efficiency!
chrcolukIf they want to try to balance compactness maybe this time xx80 cards will be dual slot, instead of being as big as a tank.
So the high end should be artificially gimped so you can have a dual slot cooler? Because the only way they are doing this is to dramatically under-clock the thing out of the box or cut down the GPU die size, at which point you have a xx7x card with a xx8x name and price, which I seem to remember people didn't like with the 4000 series.
LycanwolfenToo bad they cannot make a 250 Watt version do the same output as the 600 watt version, now that would be an actual milestone. The original road map they had back in the 1000 series days said they would look into increasing performance while reducing power. Then they threw that into the garbage.
The 4000 series is LIGHTYEARS ahead of the 1000 series in perf/W. Just because they didn't artificially limit their GPUs to 1080 Ti performance doesn't mean there's not been major efficiency improvements.
AsRockYeah but you tend to not run them for hours.
So if you don't want the heat, buy a lower TDP card then? The point is, it's not a risk, nor is it going to electrocute your cat or burn down your house.
Posted on Reply
#53
Fleurious
As long as the 600W GPUs can communicate with my furnace to balance heating duties for 8 months of the year I'll be happy :p. I'll just sit the exhaust near one of the air returns as a pre-heater :p.
Posted on Reply
#54
rv8000
Dr. DroThat "20% faster than Ada" figure has largely been conjecture from rumors, extrapolating as few architectural improvements as possible because "it's still on TSMC N4".

The RTX 4080 is already 2x faster than the 6750 XT, it's just 3 times as pricy. Generational uplift should bring cheaper cards at this performance level at the bare minimum.
3000 > 4000 series didn't move performance per dollar at all until the 4070s. Depending on where AMD's 8000 series ends up on performance segments, I expect Nvidia GPUs to be the same or worse value next gen relative to price. We'll likely end up with a 5060 card at 4070 performance for $500-550 or more.
Posted on Reply
#55
DudeBeFishing
cvaldesI would like chip designers to continue putting the OC headroom in the boost clock so I don't have to spend hours diddling with various tools to do what GPU engineers can do better than me. They have advanced degrees in electrical engineering, mathematics, physics, etc., plus years of experience as paid professionals, not dilettantes. I don't. And I already paid them when I bought my card.
Meanwhile, I spent hours troubleshooting stuttering, all due to the boost algorithm thinking it's a good idea to drop the clock mid game.
Posted on Reply
#56
Dr. Dro
rv80003000 > 4000 series didn't move performance per dollar at all until the 4070s. Depending on where AMD's 8000 series ends up on performance segments, I expect Nvidia GPUs to be the same or worse value next gen relative to price. We'll likely end up with a 5060 card at 4070 performance for $500-550 or more.
Yes, Ada has a pricing problem compared to Ampere, at least where official pricing is concerned and on SKUs other than the 4090. But I don't think the 5060 will be $550, no.
Posted on Reply
#57
ghazi
I see the argument that 600W is manageable and sure, it is manageable. I think the concern is more that the competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation brings a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.
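To put rough numbers on that worry, here's a minimal sketch assuming a purely hypothetical starting TGP and per-generation growth rate (made up for illustration, not leaked specs) of how quickly compounding increases would run into an ordinary household circuit:

```python
# Hypothetical illustration of compounding flagship TGP growth.
# Starting TGP and growth rate are assumptions, not rumored or official figures.
tgp_w = 600.0            # assumed flagship TGP today
growth_per_gen = 0.33    # assumed per-generation TGP increase
circuit_limit_w = 1800   # nominal maximum of a 15 A / 120 V household circuit

generation = 0
while tgp_w < circuit_limit_w:
    generation += 1
    tgp_w *= 1 + growth_per_gen
    print(f"Gen +{generation}: ~{tgp_w:.0f} W")

# At this made-up rate, the GPU alone would hit the circuit's nominal
# limit within a handful of generations, which is the point above.
```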
Posted on Reply
#58
freeagent
My PSU came with a 600w plug, I am ok :laugh:
Posted on Reply
#59
AusWolf
"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
Posted on Reply
#60
64K
AusWolf"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
It could mean that we'll need more info from Nvidia about this, or it could mean that Nvidia no longer has any interest in entry-level to lower-midrange GPUs because the profit margins are too low, and that they will surrender that segment of the market to AMD and Intel. We'll have to wait and see.
Posted on Reply
#61
ypsylon
Ignoring games. If the 5090 delivers the same performance jump as the switch from the 1080 Ti to the 3090, then I see no problem. Compute-wise, that was 350 W in one card instead of 4 or 5 x 250 W 1080 Tis to do the same kind of rendering job. Absolute no-brainer. I'm only bloody terrified of those connectors and all the calamities which follow new power plugs. For that very reason I completely skipped the 4000 series and am waiting for the new stuff to come out, together with a new PSU with all the bells and whistles + a 5090. Looking back at the previous switch, a 4-5x uplift for 250 W more is something I can live with.
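For what it's worth, a quick back-of-the-envelope check of that compute comparison, using the rough figures above (4-5 cards at 250 W vs. one at 350 W) rather than measured benchmarks:

```python
# Back-of-the-envelope check of the 1080 Ti -> 3090 rendering comparison above.
# All figures are the poster's approximations, not benchmark results.
old_cards = 5        # 1080 Tis needed for the same rendering job (poster says 4-5)
old_tgp_w = 250      # approximate TGP per 1080 Ti
new_tgp_w = 350      # approximate TGP of a single 3090

old_total_w = old_cards * old_tgp_w           # 1250 W for the multi-card setup
perf_per_watt_gain = old_total_w / new_tgp_w  # ~3.6x better perf/W at equal throughput

print(f"Old setup: {old_total_w} W, new setup: {new_tgp_w} W")
print(f"Implied perf/W improvement: ~{perf_per_watt_gain:.1f}x")
```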
Posted on Reply
#62
Vayra86
AusWolf"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
Why would they be gone? You just get a tinier slice of silicon.
Posted on Reply
#63
AusWolf
Vayra86Why would they be gone? You just get a tinier slice of silicon.
Power requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
Posted on Reply
#64
ARF
AusWolfPower requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
Will they remain on the old TSMC 4N process, or will they prefer to wait for something better - TSMC 2N or 3N?
At least, they still have a quite significant "FPS" lead over the old Radeon RX 7900 XTX, so why even bother to release useless products?
dgianstefaniIf efficiency (work done/power used) increases, which it has significantly with RTX 40xx compared to RTX 30xx, the power limits don't bother me.
Kn0xxPTbut ..600w for a GPU is a bit scary....... it should have an "Electric Hazard" sticker on it....
dgianstefaniIt's not scary at all.
grammar_phreakIt won't be scary if it has a functional power connector.
I see pictures of heavy cards bending; sagging is quite an issue.
Broken, cracked PCBs with molten connectors will become the norm.
I choose to avoid that, though :D

Solution - water cooling.
Posted on Reply
#65
AusWolf
ARFAt least, they still have a quite significant "FPS" lead over the old Radeon RX 7900 XTX, so why even bother to release useless products?
Because few people care about $1000+ graphics cards. What you see here in the forum is not representative of the general public.
Posted on Reply
#66
MentalAcetylide
Kn0xxPTbut ..600w for a GPU is a bit scary....... it should have an "Electric Hazard" sticker on it....
They're not going to release a 600+ watt graphics card for regular consumers like us. I don't even think something like that could be properly cooled without resorting to some kind of large, exotic water cooling solution, a large custom case to go with it, and the expected AC cost of keeping the ambient room temp down to reasonable levels.
Also, something like that for rendering would suck ass because you would only feasibly be able to have one of those things in a case without going all-out on a full custom system, cooling, wall outlets, etc., plus probably a beefy 2000-watt PSU and a decent UPS with surge protection. That's easily looking at around $10k+ right there if you're doing it DIY.
Like I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.
Posted on Reply
#67
ARF
MentalAcetylideLike I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.
Only if they can extract higher transistor density and significantly improve the FPS count over the current offerings.
Otherwise, there will be little to no incentive for anyone to throw another large sum of money at it. For 10-15-20% more FPS it won't be worth it.
Posted on Reply
#68
TheinsanegamerN
AusWolfPower requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
Erm.....the 4060 pulls 110w under load. The sub-150w cards haven't gone anywhere. :slap:
AusWolfBecause few people care about $1000+ graphics cards. What you see here in the forum is not representative of the general public.
Enough people care that, so far as Steam is concerned, more people buy 4090s than the entirety of $1000 GPUs AMD and Nvidia have seen fit to release since the late 2000s.

But you know, not many people buy this type of card. Few bought the 8800 GTX Ultra or the GTX 590 or the 1080 Ti; those are just impossible to find today because nobody bought them, right?
MentalAcetylideThey're not going to release a 600+ watt graphics card for regular consumers like us. I don't even think something like that could be properly cooled without resorting to some kind of large, exotic water cooling solution, a large custom case to go with it, and the expected AC cost of keeping the ambient room temp down to reasonable levels.
Also, something like that for rendering would suck ass because you would only feasibly be able to have one of those things in a case without going all-out on a full custom system, cooling, wall outlets, etc., plus probably a beefy 2000-watt PSU and a decent UPS with surge protection. That's easily looking at around $10k+ right there if you're doing it DIY.
Like I said, they're going to keep the wattage down, and I'm sure if they wanted to, they could make them more efficient, but it's cheaper for them to stick with their current manufacturing processes.
Ahem....the 3090 Ti exists, and peaks at over 600 watts. Its sustained draw at stock is over 500 watts.
Posted on Reply
#70
Lew Zealand
AusWolf"Ranging from 250 W" ... I'm more scared of this part of the statement. Where did the 120-150 W mainstream cards that we all loved (like the 1060) go? :(
The 4060, compared to the 1060 6GB:

uses less power, 110W vs. 120W
delivers more 1080p fps at time of review (96 vs 83) in W1zz's reviews
costs less after inflation, $300 vs $323
delivers 2.4x the fps at 1080p in today's games (a reasonable 3-gen uplift, ~34%/gen; quick check below)

I bought the 1060 6GB as my first PC gaming GPU and the 4060 is a well-matched successor. I'll bet the 5060 will be a little higher power but will still deliver more fps/W.
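Quick sanity check of the ~34%/gen figure above (the 2.4x total uplift is the number from the list, not an independent benchmark):

```python
# Per-generation uplift implied by a 2.4x jump over three generations
# (1060 -> 2060 -> 3060 -> 4060). The 2.4x figure is from the post above.
total_uplift = 2.4
generations = 3

per_gen = total_uplift ** (1 / generations) - 1
print(f"Average uplift per generation: ~{per_gen:.0%}")  # prints ~34%
```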
Posted on Reply
#71
AusWolf
TheinsanegamerNErm.....the 4060 pulls 110w under load. The sub-150w cards haven't gone anywhere. :slap:
Good point.
TheinsanegamerNEnough people care that, so far as Steam is concerned, more people buy 4090s than the entirety of $1000 GPUs AMD and Nvidia have seen fit to release since the late 2000s.

But you know, not many people buy this type of card. Few bought the 8800 GTX Ultra or the GTX 590 or the 1080 Ti; those are just impossible to find today because nobody bought them, right?
Nvidia halo cards are a different matter altogether. They kind of exist within their own market space. People will buy them regardless of their price.
Posted on Reply
#72
harm9963
I went almost 6 years with my 1080 Ti in my main rig, and my 4090 looks to do the same. I went through more CPUs: 1055, 8370, 2700X, and now 5950X.
Posted on Reply
#73
Vayra86
AusWolfPower requirements have been steadily increasing in the last 2-3 generations, and judging by the quoted part of the statement, the trend seems to continue.
Ampere increased them even in the midrange, but Ada didn't, really. The 4060 gets by on 130W. The 1060 did the same.

I think what we are seeing is that the range of performance between the x60 and x80ti/90 has increased quite a bit. The TDP is stretched up mostly on the higher end.

So, you really do get a tinier slice of silicon. The x60 is less card today than it was, relatively, 3-6 generations back.
ghaziI see the argument that 600W is manageable and sure, it is manageable. I think the concern is more that the competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation brings a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.
Power grids have bigger issues, like EVs and solar. It's precisely NOT usage but power input that destroys network stability.
Posted on Reply
#74
Wasteland
Conveniently, new AAA games generally aren't very good. As games, that is--they're fairly impressive in terms of graphical fidelity, but of course the rate of improvement there has slowed to a crawl over the last decade, too.

If we ever reach a point where the newest GPUs threaten to max out the power of the average domestic circuit, it won't be especially difficult to opt out.
Posted on Reply
#75
ARF
ghaziI see the argument that 600W is manageable and sure, it is manageable. I think the concern is more that the competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation brings a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.
I think not, because many houses would then face a real fire hazard. You would need to build new cities and building infrastructure to handle that.
The trend now is actually the opposite. You buy lower-consumption TVs, refrigerators, washing machines... your average light bulb went from 100 watts down to 9 watts or so.
WastelandConveniently, new AAA games generally aren't very good. As games, that is--they're fairly impressive in terms of graphical fidelity, but of course the rate of improvement there has slowed to a crawl over the last decade, too.
If we ever reach a point where the newest GPUs threaten to max out the power of the average domestic circuit, it won't be especially difficult to opt out.
The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there is nothing even remotely appealing in graphics realism and breakthrough revolutionary innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
Posted on Reply