Thursday, May 9th 2024

NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

According to Benchlife.info insiders, NVIDIA is reportedly testing designs with various Total Graphics Power (TGP) targets, ranging from 250 W to 600 W, for its upcoming GeForce RTX 50 series "Blackwell" graphics cards. The designs span a 250 W configuration aimed at mainstream users and a more powerful 600 W configuration tailored for enthusiast-level performance. The 250 W cooling system is expected to prioritize compactness and power efficiency, making it an appealing choice for gamers seeking a balance between capability and energy conservation. This design could prove particularly attractive for those building small form-factor rigs, or for AIBs looking to offer smaller cooler sizes. On the other end of the spectrum, the 600 W cooling solution covers the highest TGP of the stack and is possibly intended only for testing purposes. Other SKUs with different power configurations fall in between.

We previously witnessed NVIDIA testing a 900 W version of the Ada Lovelace AD102 GPU SKU, which never saw the light of day, so we should take this testing phase with a grain of salt. Often, the engineering silicon is the first batch made to enable software and firmware development, while the final silicon is more efficient, optimized to use less power, and aligned with regular TGP structures. The current highest-end SKU, the GeForce RTX 4090, uses a 450 W TGP. So, take this phase with some reservations as we wait for more information to come out.
Source: Benchlife.info

84 Comments on NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

#77
Kn0xxPT
ARF: The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphics realism or breakthrough, revolutionary graphics innovation.
Ray tracing alone is still an embarrassing hoax.
The thing is that games (mostly game engines) nowadays just dump things onto the GPU and let it figure out how to do them. Development is not optimized or tailored to a specific performance threshold... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells enough. NVIDIA makes money off this because it can "tailor" its drivers to specific games to achieve performance.

Does that give NVIDIA the power to cripple a game if it wants to? ...oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there would be no need for that kind of GPU power draw in the first place.
#78
colossusrageblack
The coolers are just oversized; I doubt they'll actually try to push 600 W. Just look at the 4080's cooler: it's probably rated for over 500 W, but in truth the 4080 uses about as much power as a 3070 Ti, which is insane for the amount of horsepower it provides. The cooler on my Gigabyte Eagle 4080 is so oversized that the fans don't even run half the time I'm in a game.
#79
Vayra86
Kn0xxPT: The thing is that games (mostly game engines) nowadays just dump things onto the GPU and let it figure out how to do them. Development is not optimized or tailored to a specific performance threshold... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells enough. NVIDIA makes money off this because it can "tailor" its drivers to specific games to achieve performance.

Does that give NVIDIA the power to cripple a game if it wants to? ...oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there would be no need for that kind of GPU power draw in the first place.
The games don't need that hardware at all to sell; that's the most ironic thing here. The biggest sales cannons are the most immersive worlds. The bigger scope of games is where it's at now: we have fast storage, we have strong CPUs... none of this requires a killer GPU.

And that IS really where games should seek to impress. It is the world you enter that matters; how it looks is secondary at best. RT and more real-time brute forcing are the only way to 'undo' the efficiency gains of raster techniques and keep selling fools the dream that graphics really do keep getting 'better', while the biggest advances are not RT but all sorts of other engine improvements.
#80
AusWolf
Kn0xxPT: The thing is that games (mostly game engines) nowadays just dump things onto the GPU and let it figure out how to do them. Development is not optimized or tailored to a specific performance threshold... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells enough. NVIDIA makes money off this because it can "tailor" its drivers to specific games to achieve performance.

Does that give NVIDIA the power to cripple a game if it wants to? ...oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there would be no need for that kind of GPU power draw in the first place.
Games don't need that amount of performance at all. We all just want shiny, ultra RT graphics at 4K running at 120 FPS minimum. Yet here I am, playing Hogwarts Legacy on a 6500 XT on low graphics and having tons of fun. ;)
#81
ghazi
ARF: I think not, because many houses would then be at real risk of a fire hazard. You would need to build new city and building infrastructure to handle that.
The trend now is actually the opposite. You buy lower-consumption TVs, refrigerators, and washing machines... your average light bulb went from 100 watts down to 9 watts or so.

The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphics realism or breakthrough, revolutionary graphics innovation.
Ray tracing alone is still an embarrassing hoax.
At 2000 W this is true to some extent. I can't speak for other parts of the world, but any wiring here can handle close to 1800 W; I have a 1600 W space heater...
And while I don't quite agree with the last part, I do like your spirit. I sure hope Crysis 4 comes with a worthy campaign and the same graphics ambition, and doesn't water anything down. The idea of TAA ruining the vegetation like in all modern games scares me, though.
Have you played the Crysis 3 remaster with RT, though? It's not a massive difference, but it sure is more visually immersive.
#82
DudeBeFishing
According to the NEC, which is the code most of the USA follows, cord-and-plug-connected equipment is not allowed to exceed 80% of the circuit's rating.

15 A 120 V (1800 W) circuits can't exceed 12 A (1440 W)
20 A 120 V (2400 W) circuits can't exceed 16 A (1920 W)

An 80+ Titanium rated PSU is at worst 90% efficient at 100% load.

A 15 A 120 V circuit is limited to about 1296 W of output on an 80+ Titanium PSU (1440 W * 0.9 = 1296 W)
A 20 A 120 V circuit is limited to about 1728 W of output on an 80+ Titanium PSU (1920 W * 0.9 = 1728 W)

The maximum power a PSU can output drops as its efficiency drops. You also have to account for ignorance or neglect in the average home. Most people don't replace a loose outlet. Even some of my tight-fitting receptacles get uncomfortably hot when drawing 12 A consistently from a space heater.

I guess what I'm trying to say is I don't see top-end GPUs getting much more power hungry. 600 W is probably going to be the max they pull at stock, since they need to account for people using two GPUs. I don't mean 600 W continuously, just 600 W spikes.
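For anyone who wants to sanity-check that arithmetic, here is a minimal sketch in Python, assuming the NEC 80% continuous-load rule and a flat 90% PSU efficiency (illustrative only, not electrical advice):

def max_psu_output(breaker_amps, volts=120.0, psu_efficiency=0.90):
    """Approximate wattage a PSU can deliver on a given branch circuit."""
    circuit_watts = breaker_amps * volts         # e.g. 15 A * 120 V = 1800 W
    continuous_limit = circuit_watts * 0.80      # NEC 80% rule for continuous loads
    return continuous_limit * psu_efficiency     # losses inside the PSU

for amps in (15, 20):
    print(f"{amps} A / 120 V circuit -> ~{max_psu_output(amps):.0f} W from the PSU")
# 15 A / 120 V circuit -> ~1296 W from the PSU
# 20 A / 120 V circuit -> ~1728 W from the PSU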
#83
Assimilator
DudeBeFishing: According to the NEC, which is the code most of the USA follows, cord-and-plug-connected equipment is not allowed to exceed 80% of the circuit's rating.

15 A 120 V (1800 W) circuits can't exceed 12 A (1440 W)
20 A 120 V (2400 W) circuits can't exceed 16 A (1920 W)

An 80+ Titanium rated PSU is at worst 90% efficient at 100% load.

A 15 A 120 V circuit is limited to about 1296 W of output on an 80+ Titanium PSU (1440 W * 0.9 = 1296 W)
A 20 A 120 V circuit is limited to about 1728 W of output on an 80+ Titanium PSU (1920 W * 0.9 = 1728 W)

The maximum power a PSU can output drops as its efficiency drops. You also have to account for ignorance or neglect in the average home. Most people don't replace a loose outlet. Even some of my tight-fitting receptacles get uncomfortably hot when drawing 12 A consistently from a space heater.

I guess what I'm trying to say is I don't see top-end GPUs getting much more power hungry. 600 W is probably going to be the max they pull at stock, since they need to account for people using two GPUs. I don't mean 600 W continuously, just 600 W spikes.
This is a USA-only problem because your country is dumb and uses 120 V instead of 240 V, requiring double the amperage for the same wattage.
#84
DudeBeFishing
Assimilator: This is a USA-only problem because your country is dumb and uses 120 V instead of 240 V, requiring double the amperage for the same wattage.
It's North America that uses 120 V, not just the USA. There are other countries outside North America, such as Brazil and Saudi Arabia, that use 120 V. Then there's Japan at 100 V. Besides, I doubt anyone wants a 2000 W space heater next to their desk in the middle of summer.