
NVIDIA GeForce RTX 5000 Series “Blackwell” TDPs Leaked, Entire Lineup Unified with 12+4 Pin Power Connector

Joined
Jul 26, 2024
Messages
213 (1.37/day)
My RTX 4070 refuses to budge from 195W under full load. I even tried undervolting it; it doesn't give a damn, it just keeps going up to 195W.
My 4070 Super UV'd doesn't go over 185W no matter what, usually stays below 170, but you're saying a 4070 won't go below 195W?
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,013 (2.49/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Unbelievable! Are people going to pay 4-5k USD, if not more, for this card!? :roll: :kookoo:
LOL, this card is not going to be $4-5k. Its Quadro counterpart will be, but not the GeForce. Don't be ridiculous.
 
Last edited:
Joined
Feb 13, 2018
Messages
163 (0.06/day)
Location
Finland
Processor i7 4770K
Motherboard Asus Z87-Expert
Cooling Noctua NH-U12S, &case fans all controlled by Aquaero 6
Memory 2x8GB TeamGroup Xtreem LV 2133MHz
Video Card(s) Vega 64
Storage Samsung 840 Pro + 2x 5GB WD Red@RAID1
Display(s) Dell U3014
Case Lian Li PC-A71B
Audio Device(s) Sound Blaster ZxR, Objective2 (2x), AKG K702&712, Beyerdynamic DT990
Power Supply Seasonic Prime Titanium 650 (+Eaton 5P 1550 as "backup power")
Mouse Logitech G700
Keyboard Logitech G810
I'm a little out of the loop since I prefer AMD: has the 12V-2×6 connector solved the failure issues of the 12VHPWR? Because a repeat performance of $2000+ GPUs burning would be extremely disappointing.
Really, very little has changed in how it tolerates wear and imperfections.
The connector still pushes very high current through tiny contacts, with high total current/power density from cramming them tightly together, leaving no real safety margin. It's a mess.
Older connectors, by contrast, weren't pushed to their maximums and had looser spacing that let any heat dissipate.

Mr. "Safety is pure waste" Rush Job would be proud...
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,941 (3.90/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I like it, nice and tidy, only 1 cable to run.. pretty sweet.

I have way more than 30 insertion cycles on my cable.

Toight like a tiger, maybe other PSU brands use chintzy plastic?
 
Joined
Aug 29, 2005
Messages
7,308 (1.03/day)
Location
Stuck somewhere in the 80's Jpop era....
System Name Lynni PS \ Lenowo TwinkPad L14 G2
Processor AMD Ryzen 7 7700 Raphael (Waiting on 9800X3D) \ i5-1135G7 Tiger Lake-U
Motherboard ASRock B650M PG Riptide Bios v. 3.10 AMD AGESA 1.2.0.2a \ Lenowo BDPLANAR Bios 1.68
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Asus GeForce RTX™ 4070 Dual OC (Waiting on RX 8800 XT) | Intel® Iris® Xe Graphics
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB
Display(s) KTC M27T20S 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p
Case Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi
Software Win11 IoT Enterprise 24H2 UK | Win11 IoT Enterprise LTSC 24H2 UK / Arch (Fan)
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
My 4070 Super UV'd doesn't go over 185W no matter what, usually stays below 170, but you're saying a 4070 won't go below 195W?

My Asus GeForce RTX 4070 Dual OC won't go any lower at max when I try to UV with MSI Afterburner; I tested it earlier this year when I got the card.

I even tried a locked voltage, which should bring it down, but it doesn't; it still goes up to 195W at max under heavy use.
 
Joined
Jun 14, 2020
Messages
3,550 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I don't understand how we're swinging so far again with the power. I mean, I had a 1200W PSU over a decade ago, as at the time I planned on CrossFiring some 2900 XTs (yeah yeah, fail card). Then I dropped down to 1050W, and now I'm running an 850W Platinum.
Why, when the nodes are shrinking and should hypothetically be more efficient, are the last few gens becoming so power hungry again? I mean, I know my 850W is "enough", but the fact is we are back to 1000, 1200 and even 1600W PSUs, and GPUs like this have apparently already saturated the stupid connector, so they've had to add another...
I think there needs to be a distinction between power hungry and "pulling a lot of power". I bet a paycheck the 5090 will be the most efficient card ever produced. Whoever is primarily concerned with power should just grab a 5090 and limit it to 250-300W or whatever they feel like. It will still be the fastest card around with the best perf/watt. But for those who value performance above power, these cards are capable of handling more power to spit out more fps. Nothing wrong with that either. I'm running my 4090 locked to 320 watts, but I wouldn't mind if it were also capable of pulling 2000 watts for when/if I require the extra performance. I really don't get what the issue is.

My Asus GeForce RTX 4070 Dual OC won't go any lower at max when I try to UV with MSI Afterburner; I tested it earlier this year when I got the card.

I even tried a locked voltage, which should bring it down, but it doesn't; it still goes up to 195W at max under heavy use.
Of course undervolting doesn't lower power draw: you're just moving the frequency curve to the left, meaning your card tries to hit higher clock speeds within the same power budget. So by UVing, you're just gaining performance at the same power. If you want to lower power, power limit it.
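The mechanism described above can be sketched with a toy dynamic-power model (P ≈ C·V²·f). The curve points, the undervolt offset, and the constant below are all made up for illustration; nothing here is measured from a real 4070:

```python
# Toy model: dynamic power scales roughly with C * V^2 * f.
# An undervolt lowers V at each frequency, so under a fixed power
# limit the boost algorithm climbs to a higher clock instead of
# drawing less power. All numbers are illustrative only.

C = 8.0e-8  # arbitrary constant chosen so wattages look plausible

def power(v: float, mhz: float) -> float:
    """Dynamic power (W) at a given core voltage and clock."""
    return C * v**2 * (mhz * 1e6)

def max_clock_at_limit(vf_curve: dict, limit_w: float) -> int:
    """Highest clock (MHz) whose V/F point stays under the power limit."""
    return max(mhz for mhz, v in vf_curve.items() if power(v, mhz) <= limit_w)

stock = {2400: 0.95, 2550: 1.00, 2700: 1.05, 2850: 1.10}
undervolted = {mhz: v - 0.06 for mhz, v in stock.items()}  # shift curve down

limit = 195.0  # watts
print(max_clock_at_limit(stock, limit))        # 2400 - stock tops out lower
print(max_clock_at_limit(undervolted, limit))  # 2550 - UV reaches a higher bin
```

Same 195W cap in both cases; the undervolted curve just buys a higher clock bin inside it, which matches the "won't go below 195W" behaviour reported above.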
 
Joined
Jul 26, 2024
Messages
213 (1.37/day)
My Asus GeForce RTX 4070 Dual OC won't go any lower at max when I try to UV with MSI Afterburner; I tested it earlier this year when I got the card.

I even tried a locked voltage, which should bring it down, but it doesn't; it still goes up to 195W at max under heavy use.
I do that manually in MSI AB. I managed to achieve the out-of-the-box clocks (2730 MHz) at 985 mV and cut power from 205-210W to 170-180W. You need the voltage/frequency curve editor for that (Ctrl+F).
 
Last edited:

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,118 (2.00/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
FTFY

Let's reiterate the fact that the 10-series x80 ran on a 180W TDP.
Ada's 'super efficient' 4080 wants a 320W TDP. On a 'new node'.

The only reason Ada looked good is because Ampere was such a shitshow in efficiency and Turing was just Pascal with shitty RT.
The 4080 is 3.5x faster than the 1080 in raw raster, much faster than that if RT or AI stuff is used, and is currently the most efficient card in existence.

Raw power draw is not a measure of efficiency, despite it being used for shock effect.
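Taking the thread's own rough figures at face value (3.5x raster, 320W vs 180W TDP), the perf-per-watt claim is easy to sanity check. These are the round numbers quoted above, not benchmark data:

```python
# Rough perf/watt comparison using the figures quoted in the thread
# (3.5x raster uplift, 320W vs 180W TDP). Illustrative arithmetic only.
perf_1080, tdp_1080 = 1.0, 180.0
perf_4080, tdp_4080 = 3.5, 320.0

ppw_1080 = perf_1080 / tdp_1080  # performance per watt, 1080 = baseline
ppw_4080 = perf_4080 / tdp_4080

print(round(ppw_4080 / ppw_1080, 2))  # 1.97 - roughly 2x the perf per watt
```

So even with the near-doubled TDP, efficiency roughly doubled on these numbers; the disagreement in this thread is really about absolute draw, not perf/watt.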
 
Joined
Jul 26, 2024
Messages
213 (1.37/day)
FTFY

Let's reiterate the fact that the 10-series x80 ran on a 180W TDP.
Ada's 'super efficient' 4080 wants a 320W TDP. On a 'new node'.

The only reason Ada looked good is because Ampere was such a shitshow in efficiency and Turing was just Pascal with shitty RT.
I've always been a huge fan of buying a mid-sized die for efficiency. I absolutely loved the GTX 1080 and how it easily beat my 1.5G 980 Ti at over 100W less power (hence my sig; I wish things could go back to Pascal days). Blackwell is shaping up to be an Ampere repeat. If the 5070 Ti has a 350W TDP, I'll need to decide if I care about upgrading all that much. The alternatives look kinda grim, with 9070 XT leaks putting it closer to the 4070 Ti than to the 4080 it was rumored to match earlier.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
13,044 (2.97/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / console
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
What happened to the efficiency era where Maxwell and Pascal shined? A 500W TDP for the flagship, really?
 
Joined
Jun 14, 2020
Messages
3,550 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
What happened to the efficiency era where Maxwell and Pascal shined? A 500W TDP for the flagship, really?
Do you think Pascal will be more efficient than the 5090?
 
Joined
Sep 17, 2014
Messages
22,725 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The 4080 is 3.5x faster than the 1080 in raw raster, much faster than that if RT or AI stuff is used, and is currently the most efficient card in existence.

Raw power draw is not a measure of efficiency, despite it being used for shock effect.
Nobody said that; that's just what you want to make of it, to live the illusion that all is well here.

The fact is, TDP has virtually doubled to get that 3.5x faster.
At the same time, your games still run at similar FPS.
One might even debate whether they look better than they used to on the 1080, honestly, given the current TAA junk.

Such progress. Have you saved the world yet with your hyper-efficient GPUs? Or is the exact opposite happening, and is the reality that all efficiency gained is just a free pass to use ever more power? What's the YoY energy footprint of all GPUs on the globe? An interesting thing to place alongside your ooh-aaah about the most efficient GPU in existence. It gets even funnier if we factor in the cost and footprint of the fabs and node progress required to obtain that 3.5x. Put it all together and the net sum might be that we've actually just wasted an immense amount of resources so we can still barely run 4K, much as in 2017.

Stop bullshitting yourself. There should be a shock effect when you consider that the x80 has doubled its TDP over 3-4 generations, when in the past it never did that.

Do you think Pascal will be more efficient than the 5090?
Doesn't matter. What matters is whether that single gaming system with a top-end GPU in it uses 300-350W (2016) or over 600W (2024). And that's without touching the OC button, mind you; that's running both systems 'conservatively'. In the end, physics hasn't changed: you're still getting an ever-increasing energy bill and more heat production. Efficient or not.
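For scale, that running-cost difference is straightforward to estimate. The hours of use and electricity price below are hypothetical placeholders, not figures from the post:

```python
# Yearly energy cost for a gaming system under load.
# HOURS_PER_DAY and PRICE_PER_KWH are hypothetical assumptions.
HOURS_PER_DAY = 3
PRICE_PER_KWH = 0.30  # assumed price in EUR/kWh

def yearly_cost(load_watts: float) -> float:
    """Approximate yearly cost of running at load_watts for HOURS_PER_DAY."""
    kwh_per_year = load_watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

system_2016 = yearly_cost(350)  # ~115 EUR/year
system_2024 = yearly_cost(600)  # ~197 EUR/year
print(round(system_2024 - system_2016))  # ~82 EUR/year extra at these rates
```

Real usage and tariffs vary wildly, of course; the point is only that the wattage gap translates directly into a recurring bill, efficiency gains or not.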
 
Last edited:

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,118 (2.00/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Nobody said that; that's just what you want to make of it, to live the illusion that all is well here.

The fact is, TDP has virtually doubled to get that 3.5x faster.
At the same time, your games still run at similar FPS.
One might even debate whether they look better than they used to on the 1080, honestly, given the current TAA junk.

Such progress. Have you saved the world yet with your hyper-efficient GPUs? Or is the exact opposite happening, and is the reality that all efficiency gained is just a free pass to use ever more power? What's the YoY energy footprint of all GPUs on the globe? An interesting thing to place alongside your ooh-aaah about the most efficient GPU in existence. It gets even funnier if we factor in the cost and footprint of the fabs and node progress required to obtain that 3.5x. Put it all together and the net sum might be that we've actually just wasted an immense amount of resources so we can still barely run 4K, much as in 2017.

Stop bullshitting yourself. There should be a shock effect when you consider that the x80 has doubled its TDP over 3-4 generations, when in the past it never did that.
Remind me of the TDP of the first GeForce card and of the ones five generations later?

TDP goes up; it's normal.

Mind you, your comment is rambling all over the place, so.

Doesn't matter. What matters is whether that singular gaming system with a top end GPU in it, uses 300-350W (2016), or over 600W (2024). And that's not touching the OC button, mind you, that's running both systems 'conservatively'. In the end, physics hasn't changed, you're still getting an ever increasing energy bill and heat production. Efficient or not.
Wrong. Efficiency is work done per energy used, so if we all stuck with your beloved 1080-class GPUs, the world would actually use more power, as it would take longer to do the same computing tasks. Gaming is what, 5% of the market? And that's assuming you don't frame-rate-lock to your monitor's refresh rate, in which case new GPUs use less power regardless of peak power draw potential.

Interesting that you're not commenting on new tech like DLSS, which enables lower-resolution rendering with similar IQ, equating to further efficiency on top of the generational improvements. It's hot for handhelds like the Switch 2.

But old good, new bad, right?

If you're so concerned about power draw and efficiency, why are you using an AMD card?
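The "work done per energy used" argument above boils down to a race-to-finish calculation: for a fixed task, a faster card at higher power can still burn less total energy because it finishes sooner. The speeds and powers below reuse the rough numbers thrown around in this thread, purely for illustration:

```python
# Total energy to complete a FIXED workload: higher peak power can
# still mean lower total energy if the job finishes proportionally
# sooner. Speeds/powers are illustrative, not benchmark results.

def task_energy_wh(power_w: float, speed: float, work_units: float) -> float:
    """Energy in Wh to finish work_units at speed (units per hour)."""
    hours = work_units / speed
    return power_w * hours

old_card = task_energy_wh(power_w=180, speed=1.0, work_units=10)  # 1800.0 Wh
new_card = task_energy_wh(power_w=320, speed=3.5, work_units=10)  # ~914 Wh

print(old_card, round(new_card))  # the faster card uses about half the energy
```

This only holds for fixed workloads (renders, encodes, frame-capped gaming); for uncapped gaming, the faster card simply draws its full budget for the same hours.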
 
Last edited:
Joined
Sep 17, 2014
Messages
22,725 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Wrong. Efficiency is work done per energy used, so if we all stuck with your beloved 1080-class GPUs, the world would actually use more power, as it would take longer to do the same computing tasks.

Interesting that you're not commenting on new tech like DLSS, which enables lower-resolution rendering with similar IQ, equating to further efficiency on top of the generational improvements. It's hot for handhelds like the Switch 2.

But old good, new bad, right?
The reality would rather be that a lot of computing tasks wouldn't happen to begin with, such as the ones we now call AI. It's not old good, new bad; it's about realizing that sometimes enough is enough, and ever more is a certain road to oblivion. That's all this is about. There's a point where stuff starts to bite you in the ass, and we're solidly in that territory right now, globally.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,118 (2.00/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
The reality would rather be that a lot of computing tasks wouldn't happen to begin with, such as the ones we now call AI. It's not old good, new bad; it's about realizing that sometimes enough is enough, and ever more is a certain road to oblivion.
Good luck telling that to hardware/software manufacturers, or the consumers/enterprise desiring more performance. Things get faster and aren't going to stop.
 
Joined
Sep 17, 2014
Messages
22,725 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Good luck telling that to hardware/software manufacturers, or the consumers/enterprise desiring more performance. Things get faster and aren't going to stop.
Correct, and I do not applaud that anymore, nor do I pull the wool over my eyes telling everyone how incredibly efficiently I can do things now compared to yesteryear. All I see is that I'm still doing the very same things I did in 2016. There are a few more pixels. The screen's a bit bigger. My energy bill is a bit higher. Parts are (far) more expensive.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,118 (2.00/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Correct, and I do not applaud that anymore, nor do I pull the wool over my eyes telling everyone how incredibly efficiently I can do things now compared to yesteryear. All I see is that I'm still doing the very same things I did in 2016. There are a few more pixels. The screen's a bit bigger. My energy bill is a bit higher. Parts are (far) more expensive.
Play the same games at the same settings and resolution, frame capped, on your new GPU vs an old one, then try to tell me new GPUs aren't more efficient.

There is absolutely nothing stopping you from doing this, but if you want to play new games at new settings, there is a cost.
 
Joined
Sep 17, 2014
Messages
22,725 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Play the same games at the same settings and resolution, frame capped, on your new GPU vs an old one, then try to tell me new GPUs aren't more efficient.

There is absolutely nothing stopping you from doing this, but if you want to play new games at new settings, there is a cost.
I've said GPUs use more power, not that they haven't gained efficiency. Stop putting words in my mouth ;)

You're not wrong though.
 
Joined
Jul 20, 2020
Messages
1,156 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
I have no problem with increased power use at the top end, even creeping increases in power use at an arbitrary numerical branding label like "80 series". Because that's all

Arbitrary.

If a whale wants higher performance and the cost is higher power, then go ahead and pay. I greatly prefer efficiency, low power use, and low cost per frame. If that has migrated from the "80" class to the "70" or "60" class, then so be it. The 4070/Super and 4060 Ti are good GPUs for this; they have excellent efficiency with very good performance (yes, even the 4060 Ti) and coincidentally hover around the old 1080's power use. They also hover around the 1080's price, FWIW.

Buy the product on its measured merits, not its branding or any artificial class-based argument.

And yes, I put my pesos where my mouth is: amongst other GPUs I have a 4060 Ti, which I bought primarily for its efficiency. It was a toss-up with the 4070 Super, but that was this past summer and I preferred to spend less at the time and wait on the 5070's efficiency, as it'll arrive sooner than the 5060/Ti.
 
Joined
Jan 14, 2019
Messages
12,633 (5.81/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Nobody said that; that's just what you want to make of it, to live the illusion that all is well here.

The fact is, TDP has virtually doubled to get that 3.5x faster.
At the same time, your games still run at similar FPS.
One might even debate whether they look better than they used to on the 1080, honestly, given the current TAA junk.

Such progress. Have you saved the world yet with your hyper-efficient GPUs? Or is the exact opposite happening, and is the reality that all efficiency gained is just a free pass to use ever more power? What's the YoY energy footprint of all GPUs on the globe? An interesting thing to place alongside your ooh-aaah about the most efficient GPU in existence. It gets even funnier if we factor in the cost and footprint of the fabs and node progress required to obtain that 3.5x. Put it all together and the net sum might be that we've actually just wasted an immense amount of resources so we can still barely run 4K, much as in 2017.

Stop bullshitting yourself. There should be a shock effect when you consider that the x80 has doubled its TDP over 3-4 generations, when in the past it never did that.


Doesn't matter. What matters is whether that single gaming system with a top-end GPU in it uses 300-350W (2016) or over 600W (2024). And that's without touching the OC button, mind you; that's running both systems 'conservatively'. In the end, physics hasn't changed: you're still getting an ever-increasing energy bill and more heat production. Efficient or not.
I kind of agree, although I think Nvidia's x90 cards are a replacement for high-end SLi, not a single high-end card like the 1080.
 
Joined
Feb 3, 2017
Messages
3,831 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
I think there needs to be a distinction between power hungry and "pulling a lot of power". I bet a paycheck the 5090 will be the most efficient card ever produced. Whoever is primarily concerned with power should just grab a 5090 and limit it to 250-300W or whatever they feel like. It will still be the fastest card around with the best perf/watt. But for those who value performance above power, these cards are capable of handling more power to spit out more fps. Nothing wrong with that either. I'm running my 4090 locked to 320 watts, but I wouldn't mind if it were also capable of pulling 2000 watts for when/if I require the extra performance. I really don't get what the issue is.
You can pretty easily take a 4090 and run it at 225W. AFAIK changing the power limit still has a minimum of 50%, so not under that. It'll be very, very efficient, and the performance drop will be surprisingly small considering it's literally using half the power. The problem? The manufacturer has no incentive to sell you the same GPU/card for a lower price. So if you are willing to pay more, there are very cool, efficient solutions available :)
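That "surprisingly small" drop can be eyeballed with a common rule-of-thumb scaling model: since dynamic power goes roughly as V²·f and voltage tracks frequency, performance scales roughly with the cube root of the power limit. This is a generic approximation for illustration, not measured 4090 data, and the 450W stock figure is an assumption:

```python
# Rule-of-thumb scaling: perf ~ (power / stock_power) ** (1/3).
# Generic approximation for illustration, not measured 4090 numbers.

STOCK_POWER_W = 450.0  # assumed stock power limit

def relative_perf(limit_w: float) -> float:
    """Estimated fraction of stock performance at a given power limit."""
    return (limit_w / STOCK_POWER_W) ** (1 / 3)

for limit in (450, 320, 225):
    print(f"{limit} W -> {relative_perf(limit) * 100:.1f}% of stock perf")
# Under this model, half the power still keeps roughly four fifths
# of the performance - the last watts buy the fewest frames.
```

Real cards flatten out even harder at the top of the curve, so measured results at 225W tend to look at least this good.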

Of course undervolting doesn't lower power draw: you're just moving the frequency curve to the left, meaning your card tries to hit higher clock speeds within the same power budget. So by UVing, you're just gaining performance at the same power. If you want to lower power, power limit it.
It's a little bit about the viewpoint. If the aim is lowering power consumption, power limiting will do that, but you lose performance. Undervolting lets you reclaim some of that lost performance without increasing the power.
 
Joined
Sep 17, 2014
Messages
22,725 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I kind of agree, although I think Nvidia's x90 cards are a replacement for high-end SLi, not a single high-end card like the 1080.
That's one way to look at it. But this was about the x80; the TDP inflation trickles down. Also, diminishing returns happen, both on the hardware and the software/graphics front. I'm not denying this is progress in its own right, but only if you define progress as "more is better", forgetting the actual cost of all that in the non-digital world.

Right now there's a continuous escalation of TDPs gen over gen, and sure, we've seen this in the past, but consecutively, and with these jumps? Not quite.

The thing is, I don't even blame Nvidia or AMD for it. We know all the stops are pulled to get ever more performance out of a piece of silicon and sell us another product. There's just too much focus on selling products all the time, everywhere.
 
Joined
Jan 14, 2019
Messages
12,633 (5.81/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
That's one way to look at it. But this was about the x80; the TDP inflation trickles down. Also, diminishing returns happen, both on the hardware and the software/graphics front. I'm not denying this is progress in its own right, but only if you define progress as "more is better", forgetting the actual cost of all that in the non-digital world.

Right now there's a continuous escalation of TDPs gen over gen, and sure, we've seen this in the past, but consecutively, and with these jumps? Not quite.

The thing is, I don't even blame Nvidia or AMD for it. We know all the stops are pulled to get ever more performance out of a piece of silicon and sell us another product. There's just too much focus on selling products all the time, everywhere.
Well, there were already lots of kW+ PSUs on the market for SLi/CF systems that had no place anymore after the slow death of multi-GPU. Nvidia only gave them another reason to exist. And the $1000+ price doesn't make me believe that they're "just" high-end GPUs at all. They're much more than that.

The 5090, for example, will have double the shader count of the 5080 and a 200W higher TDP; you could fit an entire product stack in between. That makes it not a gaming card but a "look at my massive e-balls" card in my eyes, which is exactly what SLi/CF were all about.
 