
NVIDIA Testing GeForce RTX 50 Series "Blackwell" GPU Designs Ranging from 250 W to 600 W

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,651 (0.99/day)
According to Benchlife.info insiders, NVIDIA is supposedly in the phase of testing designs with various Total Graphics Power (TGP) levels, running from 250 W to 600 W, for its upcoming GeForce RTX 50 series "Blackwell" graphics cards. The designs range from a 250 W configuration aimed at mainstream users to a more powerful 600 W configuration tailored for enthusiast-level performance. The 250 W cooling system is expected to prioritize compactness and power efficiency, making it an appealing choice for gamers seeking a balance between capability and energy conservation. This design could prove particularly attractive for those building small form-factor rigs, or for AIBs looking to offer smaller coolers. On the other end of the spectrum, the 600 W cooling solution covers the highest TGP of the stack, which is possibly intended for testing purposes only. Other SKUs with different power configurations fall in between.

We witnessed NVIDIA testing a 900 W version of the Ada Lovelace AD102 GPU, which never saw the light of day, so we should take this testing phase with a grain of salt. Often, the engineering silicon is the first batch made to enable software and firmware development, while the final silicon is more efficient, optimized to use less power, and aligned with regular TGP structures. The current highest-end SKU, the GeForce RTX 4090, uses a 450 W TGP. So, take this phase with some reservations as we wait for more information to come out.



View at TechPowerUp Main Site | Source
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,083 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
If efficiency (work done/power used) increases, which it has done significantly from RTX 30xx to RTX 40xx, the power limits don't bother me.
 
Joined
Jun 30, 2017
Messages
75 (0.03/day)
If efficiency (work done/power used) increases, which it has done significantly from RTX 30xx to RTX 40xx, the power limits don't bother me.
But 600 W for a GPU is a bit scary... it should have an "Electric Hazard" sticker on it.
 

Joined
Dec 31, 2020
Messages
1,000 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
Just add a second 2x6 connector to the flagship. All the current designs are reusable; they just keep recycling things, pin-to-pin compatible even. Nothing changes. Same N4/N4P node, same difference.
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
The power-pumping trend continues.

They tested up to 900 W for Ada; that doesn't mean they will actually release a GPU capable of pulling that wattage.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Guys, companies like Apple, Nvidia, AMD, Intel, etc. don't ship everything that passes POST in their labs.

We all have to wait to see what Nvidia decides to productize for the RTX 50 series. But it most certainly won't be every single configuration they have tested.
 
Joined
Jan 29, 2012
Messages
6,881 (1.46/day)
Location
Florida
System Name natr0n-PC
Processor Ryzen 5950x-5600x | 9600k
Motherboard B450 AORUS M | Z390 UD
Cooling EK AIO 360 - 6 fan action | AIO
Memory Patriot - Viper Steel DDR4 (B-Die)(4x8GB) | Samsung DDR4 (4x8GB)
Video Card(s) EVGA 3070ti FTW
Storage Various
Display(s) Pixio PX279 Prime
Case Thermaltake Level 20 VT | Black bench
Audio Device(s) LOXJIE D10 + Kinter Amp + 6 Bookshelf Speakers Sony+JVC+Sony
Power Supply Super Flower Leadex III ARGB 80+ Gold 650W | EVGA 700 Gold
Software XP/7/8.1/10
Benchmark Scores http://valid.x86.fr/79kuh6
MSI AB power limit slider, oh how much I appreciate you.

Nice card render. So they will all have HBM?
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
MSI AB power limit slider, oh how much I appreciate you.

Nice card render. So they will all have HBM?
I read GPU rumor articles pretty frequently, and I've seen nothing suggesting that the next generation of consumer video cards will use HBM.

The benefits of HBM are better harnessed in the datacenter, so I assume almost all of the HBM parts will go into AI accelerators for enterprise customers. Typical consumer workloads like gaming won't exploit the full HBM performance envelope, and I don't see that changing anytime soon. If a video game is going to sell well, it needs to run decently on a wide variety of hardware, including more modestly specced systems like mid-range notebook GPUs and consoles.

From a business standpoint, there is very little logic in putting HBM in some consumer cards and regular GDDR memory in others. The more sensible solution has been used before: put better-specced GDDR in the high-end cards and less expensive memory in the entry-level and mid-range cards.
 
Joined
Jun 27, 2011
Messages
6,772 (1.37/day)
Processor 7800x3d
Motherboard Gigabyte B650 Auros Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
If efficiency (work done/power used) increases, which it has done significantly from RTX 30xx to RTX 40xx, the power limits don't bother me.
While the 4090 is fairly efficient in performance per watt, I personally don't want roughly 600 W of heat in my office.

A 250 W Blackwell card with 2x the performance of my 6750 XT would be compelling, though I doubt it will be anywhere near a reasonable price given how far ahead of AMD they are.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
But 600 W for a GPU is a bit scary... it should have an "Electric Hazard" sticker on it.
There are appliances in your house that draw far more current from a standard outlet.

Toasters, microwave ovens, hair dryers, and more. And you've probably had them for years, maybe even decades.

The device in question needs to be carefully designed and properly used. It's the same whether it's a graphics card or a stand mixer.

That said, 600 W is probably nearing the practical limit for a consumer-grade GPU. Combined with the other PC components, display, peripherals, etc., that uses up a lot of what's available on a standard household circuit (15 A @ 120 V here in the USA). And it's not really wise to push the wiring and breaker of a standard household circuit to the limit for long periods of time.
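As a rough sanity check, that circuit math can be sketched out. The component figures below are hypothetical examples (not measurements of any real build), and the 80% continuous-load factor follows common US electrical guidance:

```python
# Rough power-budget check for a standard US 15 A / 120 V household circuit.
# All component figures are hypothetical examples, not measurements.

CIRCUIT_VOLTS = 120
CIRCUIT_AMPS = 15
CONTINUOUS_FACTOR = 0.8  # common guidance: keep sustained loads under 80%

budget_w = CIRCUIT_VOLTS * CIRCUIT_AMPS * CONTINUOUS_FACTOR  # 1440 W

# Hypothetical build: 600 W GPU + 250 W CPU + 50 W for the rest of the system,
# divided by an assumed ~90% PSU efficiency, plus ~60 W of display/peripherals.
dc_load_w = 600 + 250 + 50
wall_w = dc_load_w / 0.9 + 60

print(f"budget {budget_w:.0f} W, draw {wall_w:.0f} W, "
      f"headroom {budget_w - wall_w:.0f} W")
# -> budget 1440 W, draw 1060 W, headroom 380 W
```

Even this hypothetical 600 W-GPU build fits, but without much margin once anything else shares the circuit.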
 
Joined
Sep 9, 2022
Messages
106 (0.13/day)
The RTX 5000 series needs a performance boost, so they MUST go with the same 600 W envelope as the RTX 4090 on the top SKU. No doubt about it.

Remember: these new cards will once again be built on a 5 nm-class process, just like the RTX 4000 series. It will be a slightly optimized process compared to "4N", but the gains from that optimization will be minimal. Any and all performance gains will have to come from the new Blackwell architecture and, if the rumors are true, from a broader 512-bit memory interface.
In fact, pure desperation to achieve performance gains is probably the sole reason why the RTX 5090 will receive a 512-bit bus. If the RTX 5000 series were built on 3 nm, we would probably get the same 384-bit bus to save costs and power requirements.

We can thank AI for all of that. It will make for quite a disappointment in H2 2024 and early 2025. It looks like both NVIDIA and AMD have dedicated all of their 3 nm capacity solely to AI/datacenter parts. I would even go so far as to bet that they have made a non-public arrangement behind the curtains to leave their consumer stuff on 5 nm. Zen 5 will also be another 5 nm CPU, and RDNA 4 is also bound to be produced on 5 nm.

What we get in the consumer space over the next few months and into next year will most likely be pretty incremental and iterative. In a twist of irony, the real next big thing could actually come from Intel in the form of Panther Lake in mid- or H2 2025 (if Intel 18A is at least half as good as Pat is trying to make us believe).
 

dgianstefani

TPU Proofreader
Staff member
The RTX 5000 series needs a performance boost, so they MUST go with the same 600 W envelope as the RTX 4090 on the top SKU. No doubt about it.

Remember: these new cards will once again be built on a 5 nm-class process, just like the RTX 4000 series. It will be a slightly optimized process compared to "4N", but the gains from that optimization will be minimal. Any and all performance gains will have to come from the new Blackwell architecture and, if the rumors are true, from a broader 512-bit memory interface.
In fact, pure desperation to achieve performance gains is probably the sole reason why the RTX 5090 will receive a 512-bit bus. If the RTX 5000 series were built on 3 nm, we would probably get the same 384-bit bus to save costs and power requirements.

We can thank AI for all of that. It will make for quite a disappointment in H2 2024 and early 2025. It looks like both NVIDIA and AMD have dedicated all of their 3 nm capacity solely to AI/datacenter parts. I would even go so far as to bet that they have made a non-public arrangement behind the curtains to leave their consumer stuff on 5 nm. Zen 5 will also be another 5 nm CPU, and RDNA 4 is also bound to be produced on 5 nm.

What we get in the consumer space over the next few months and into next year will most likely be pretty incremental and iterative. In a twist of irony, the real next big thing could actually come from Intel in the form of Panther Lake in mid- or H2 2025 (if Intel 18A is at least half as good as Pat is trying to make us believe).
Desperation?

My dude, their main competitor supposedly isn't even trying for the high end this upcoming gen.

NVIDIA are competing with themselves, and I don't think they're going to find it difficult.

Reminder that most 4090 cards use around 450 W out of the box. People like to throw the "600 W" figure around, but that's just the maximum power limit, and only for some cards.
[Chart: maximum power draw (power-maximum-3.png)]

[Chart: gaming power draw (power-gaming.png)]
 
Joined
Feb 17, 2010
Messages
1,676 (0.31/day)
Location
Azalea City
System Name Main
Processor Ryzen 5950x
Motherboard B550 PG Velocita
Cooling Water
Memory Ballistix
Video Card(s) RX 6900XT
Storage T-FORCE CARDEA A440 PRO
Display(s) MAG401QR
Case QUBE 500
Audio Device(s) Logitech Z623
Power Supply LEADEX V 1KW
Mouse Cooler Master MM710
Keyboard Huntsman Elite
Software 11 Pro
Benchmark Scores https://hwbot.org/user/damric/
I would like chip designers to get back to shipping processors with default clocks dialed in at peak efficiency, leaving plenty of meat on the bone for overclockers.
 
Joined
Oct 18, 2013
Messages
6,263 (1.53/day)
Location
Over here, right where you least expect me to be !
System Name The Little One
Processor i5-11320H @4.4GHZ
Motherboard AZW SEI
Cooling Fan w/heat pipes + side & rear vents
Memory 64GB Crucial DDR4-3200 (2x 32GB)
Video Card(s) Iris XE
Storage WD Black SN850X 4TB m.2, Seagate 2TB SSD + SN850 4TB x2 in an external enclosure
Display(s) 2x Samsung 43" & 2x 32"
Case Practically identical to a mac mini, just purrtier in slate blue, & with 3x usb ports on the front !
Audio Device(s) Yamaha ATS-1060 Bluetooth Soundbar & Subwoofer
Power Supply 65w brick
Mouse Logitech MX Master 2
Keyboard Logitech G613 mechanical wireless
Software Windows 10 pro 64 bit, with all the unnecessary background shitzu turned OFF !
Benchmark Scores PDQ
it should have an "Electric Hazard" sticker on it
More like a radiation hazard sticker, for the reactor you'll need to provide the juice for it, hehehe :)
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
I would like chip designers to get back to shipping processors with default clocks dialed in at peak efficiency, leaving plenty of meat on the bone for overclockers.
I would like chip designers to keep putting the OC headroom into the boost clock, so I don't have to spend hours fiddling with various tools to do what GPU engineers can do better than me. They have advanced degrees in electrical engineering, mathematics, physics, etc., plus years of experience as paid professionals, not dilettantes. I don't. And I already paid them when I bought my card.
 
Joined
Apr 18, 2019
Messages
2,396 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
600 W design power?
Can we have active phase-change heat pump loops yet (again)?
 
Joined
Jun 27, 2011
Messages
6,772 (1.37/day)
I would like chip designers to get back to shipping processors with default clocks dialed in at peak efficiency, leaving plenty of meat on the bone for overclockers.
It really isn't hard to undervolt down to peak efficiency. You can even use the power slider if you want a quick and dirty way of doing it. My 6750 XT is most efficient at around a 50% power level, or alternatively at 2,000 MHz with a heavy undervolt. It took me less than an hour to find that. The 4090 is incredibly efficient when undervolted to its peak-efficiency point.
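That trial-and-error tuning boils down to comparing frames per watt at a few power-limit steps. A minimal sketch, with made-up sample numbers chosen only to illustrate an efficiency peak near a 50% power level:

```python
# Find the most efficient operating point from (power limit %, watts, fps)
# samples. All numbers are invented for illustration, not real benchmarks.

samples = {
    100: (250, 120),  # stock power limit
    75:  (188, 112),
    50:  (125, 95),   # heavy power cut, modest fps loss
    40:  (100, 70),   # past the knee: fps falls off faster than power
}

def fps_per_watt(watts, fps):
    return fps / watts

best_limit, (w, f) = max(samples.items(), key=lambda kv: fps_per_watt(*kv[1]))
print(f"peak efficiency at {best_limit}% power limit: "
      f"{fps_per_watt(w, f):.2f} fps/W")
# -> peak efficiency at 50% power limit: 0.76 fps/W
```

With real cards the curve shape differs per silicon sample and workload, which is why the slider-and-benchmark loop is still worth an hour of anyone's time.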
 
Joined
Jul 20, 2020
Messages
1,151 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
I would like chip designers to get back to shipping processors with default clocks dialed in at peak efficiency, leaving plenty of meat on the bone for overclockers.

"Peak efficiency" changes with different workloads, and frequently you keep gaining efficiency per watt as you lower clocks further. Are you saying you want to run at 2,000 MHz all the time? I don't, and nobody here does either. And this is coming from someone running their i7-9700F right now at 3.5 GHz (thanks @unclewebb for the fantastic ThrottleStop). I turn it up in games that need it, but most don't.

I choose to run at this efficient speed, but guess what? At 3.0 GHz it's even (a little) more efficient. Maybe you could run your CPU with Turbo off at all times to maximize efficiency, but most people want the best performance out of the box. The rest of us weirdos can tune for efficiency afterwards.

And if Blackwell is more efficient than Ada (it should be), then bring on the 600 W GPUs. You can tune those for efficiency, and if you don't like the power draw, it sounds like NVIDIA will have a 250 W option for you. Which you can tune as well!
 
Joined
Sep 26, 2022
Messages
225 (0.28/day)
450 W is the max I'm willing to accept, be it a 5080 or 5080 Ti.

600 W, lol. There are PSUs with just 600 W for the whole damn PC, and now that's just the GPU.
 
Joined
Apr 18, 2019
Messages
2,396 (1.15/day)
450 W is the max I'm willing to accept, be it a 5080 or 5080 Ti.

600 W, lol. There are PSUs with just 600 W for the whole damn PC, and now that's just the GPU.
At this point, I'd be all for a separate PSU (on-card or external, à la multi-chip Voodoo).
-48 VDC in, converted to 12 V on-card?
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
250W is pretty high for the lowest product on the stack.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
It's only a rumor, not a verified eyewitness account. As we know, NVIDIA has many GPUs with lower power requirements, both for mobile and for use cases that don't require much 3D performance (like signage).

Just because 250 W is what these people purportedly saw today doesn't mean there aren't chip designs NVIDIA hasn't bothered benchmarking yet.

As we know, NVIDIA tends to work from the top of the stack downward, from its largest and most powerful GPUs to smaller ones with fewer transistors.
 