
NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
I see the argument that 600W is manageable and sure, it is manageable. I think the concern is more so that the competitive pressures in the market, combined with stagnating returns on process tech development, are putting us in a position where every generation has a massive increase in power draw, which is simply unsustainable in the long term. Will we one day be arguing "2000W is too much, I will only buy a 1250W GPU"? At some point power grids will not be able to keep up.

I think not, because many houses would then be at real risk of a fire hazard. You would need to build new cities and building infrastructure to handle that.
The trend now is actually the opposite. You buy more energy-efficient TVs, refrigerators, washing machines... your average light bulb went from 100 watts down to 9 watts or so.

Conveniently, new AAA games generally aren't very good. As games, that is--they're fairly impressive in terms of graphical fidelity, but of course the rate of improvement there has slowed to a crawl over the last decade, too.
If we ever reach a point where the newest GPUs threaten to max out the power of the average domestic circuit, it won't be especially difficult to opt out.

The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphics realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
 
Joined
Jun 30, 2017
Messages
75 (0.03/day)
The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphics realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
The thing is that games (mostly game engines) nowadays just dump everything on the GPU and let it figure out how to do things. Development is not optimized or tailored to a specific performance target... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells enough... Nvidia makes money off this because they can "tailor" their drivers to specific games to achieve performance.

Does that give Nvidia the power to cripple a game if they want to? ... oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there wouldn't be a need for that kind of GPU power draw in the first place.
 
Joined
May 16, 2023
Messages
91 (0.16/day)
The coolers are just oversized; I doubt they'll actually try to push 600W. Just look at the 4080's cooler: it's probably rated for over 500W, but in truth the 4080 uses about as much power as a 3070 Ti, which is insane for the amount of horsepower it provides. The cooler on my 4080 Gigabyte Eagle is so oversized that the fans don't even run half the time I'm in game.
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The thing is that games (mostly game engines) nowadays just dump everything on the GPU and let it figure out how to do things. Development is not optimized or tailored to a specific performance target... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells enough... Nvidia makes money off this because they can "tailor" their drivers to specific games to achieve performance.

Does that give Nvidia the power to cripple a game if they want to? ... oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there wouldn't be a need for that kind of GPU power draw in the first place.
Games don't need that hardware at all to sell; that's the most ironic thing here. The biggest sales cannons are the most immersive worlds. The bigger scope of games is where it's at now: we have fast storage, we have strong CPUs... none of this requires a killer GPU.

And that IS really where games should seek to impress. It is the world you enter that matters; how it looks is secondary at best. RT and more real-time brute forcing is the only way to 'undo' the efficiency gains of raster technologies and keep selling fools the dream that graphics really do keep getting 'better', while the biggest advances are not RT but all sorts of other engine advancements.
 
Joined
Jan 14, 2019
Messages
12,570 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The thing is that games (mostly game engines) nowadays just dump everything on the GPU and let it figure out how to do things. Development is not optimized or tailored to a specific performance target... if the game runs badly, buy a new GPU, or wait for a performance patch if the game sells enough... Nvidia makes money off this because they can "tailor" their drivers to specific games to achieve performance.

Does that give Nvidia the power to cripple a game if they want to? ... oh yes. But the ones to blame are the game engines and game devs...
If games didn't need a 9090 to run optimally, there wouldn't be a need for that kind of GPU power draw in the first place.
Games don't need that amount of performance at all. We just all want shiny ultra RT graphics at 4K running at 120 FPS minimum. Yet, here I am, playing Hogwarts Legacy on a 6500 XT on low graphics and having tons of fun. ;)
 
Joined
Dec 5, 2017
Messages
157 (0.06/day)
I think not, because many houses would then be at real risk of a fire hazard. You would need to build new cities and building infrastructure to handle that.
The trend now is actually the opposite. You buy more energy-efficient TVs, refrigerators, washing machines... your average light bulb went from 100 watts down to 9 watts or so.



The last good game was Crysis 3 back in 2013. After that and the original Far Cry, there has been nothing even remotely appealing in terms of graphics realism or breakthrough, revolutionary, innovative graphics.
Ray-tracing alone is still an embarrassing hoax.
At 2000W this is true to some extent. I can't speak for other parts of the world, but any wiring here can handle close to 1800W; I have a 1600W space heater...
And while I don't quite agree with the last part, I do like your spirit. I sure hope Crysis 4 comes with a worthy campaign and the same graphical ambition, and doesn't water anything down. The idea of TAA ruining the vegetation like in all modern games scares me, though.
Have you played the Crysis 3 remaster with RT, though? It's not a massive difference, but it sure is more visually immersive.
 
Joined
Apr 21, 2022
Messages
106 (0.11/day)
According to the NEC, which is the code most of the USA follows, cord-and-plug-connected equipment is not allowed to exceed 80% of the circuit's rating.

15A 120V (1800W) circuits can't exceed 12A (1440W)
20A 120V (2400W) circuits can't exceed 16A (1920W)

An 80+ Titanium rated PSU is at worst 90% efficient at 100% load.

15A 120V is limited to 1296W of output on an 80+ Titanium PSU (1440W * 0.9 = 1296W)
20A 120V is limited to 1728W of output on an 80+ Titanium PSU (1920W * 0.9 = 1728W)

The max power a PSU can output goes down as its efficiency goes down. You also have to account for ignorance or neglect in the average home. Most people don't replace a loose outlet. Even some of my tight-fitting receptacles get uncomfortably hot when drawing 12A consistently from a space heater.

I guess what I'm trying to say is I don't see top-end GPUs getting much more power hungry. 600W is probably going to be the max they pull at stock, since they need to account for people using 2 GPUs. I don't mean 600W continuously, just 600W spikes.
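To put numbers on that, here is a quick back-of-the-envelope sketch in Python. The breaker ratings and the 80% continuous-load derate are the figures from this post; the 90% efficiency is just the assumed worst-case Titanium full-load number, so treat the results as rough ceilings rather than hard limits.

def max_psu_output_watts(breaker_amps, volts=120.0, derate=0.80, efficiency=0.90):
    # Maximum continuous draw allowed at the wall (80% of the breaker rating),
    # scaled by PSU efficiency to get the DC wattage the PSU can deliver.
    wall_limit_w = breaker_amps * volts * derate
    return wall_limit_w * efficiency

for amps in (15, 20):
    print(f"{amps}A circuit: {max_psu_output_watts(amps):.0f}W usable PSU output")

# 15A circuit: 1296W usable PSU output
# 20A circuit: 1728W usable PSU output

Even a 600W GPU plus a few hundred watts for the rest of the system stays well under the 15A ceiling, which supports the point that stock power limits don't have much room, or need, to climb further.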
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
According to the NEC, which is the code most of the USA follows, cord-and-plug-connected equipment is not allowed to exceed 80% of the circuit's rating.

15A 120V (1800W) circuits can't exceed 12A (1440W)
20A 120V (2400W) circuits can't exceed 16A (1920W)

An 80+ Titanium rated PSU is at worst 90% efficient at 100% load.

15A 120V is limited to 1296W of output on an 80+ Titanium PSU (1440W * 0.9 = 1296W)
20A 120V is limited to 1728W of output on an 80+ Titanium PSU (1920W * 0.9 = 1728W)

The max power a PSU can output goes down as its efficiency goes down. You also have to account for ignorance or neglect in the average home. Most people don't replace a loose outlet. Even some of my tight-fitting receptacles get uncomfortably hot when drawing 12A consistently from a space heater.

I guess what I'm trying to say is I don't see top-end GPUs getting much more power hungry. 600W is probably going to be the max they pull at stock, since they need to account for people using 2 GPUs. I don't mean 600W continuously, just 600W spikes.
This is a USA-only problem because your country is dumb and uses 120V instead of 240V, requiring double the amperage for the same wattage.
 
Joined
Apr 21, 2022
Messages
106 (0.11/day)
This is a USA-only problem because your country is dumb and uses 120V instead of 240V, requiring double the amperage for the same wattage.
It's North America that uses 120V, not just the USA. There are other countries outside of North America, such as Brazil and Saudi Arabia, that use 120V. Then there's Japan at 100V. Besides, I doubt anyone wants a 2000W space heater next to their desk in the middle of summer.
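The voltage point is just I = P / V, so the same load needs twice the current at 120V as at 240V. A tiny illustrative check, using an arbitrary 1800W load (roughly that space heater):

# Current needed to deliver the same power at different mains voltages (I = P / V).
power_w = 1800  # arbitrary example load
for volts in (100, 120, 240):
    print(f"{power_w}W at {volts}V -> {power_w / volts:.1f}A")

# 1800W at 100V -> 18.0A
# 1800W at 120V -> 15.0A
# 1800W at 240V -> 7.5A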
 