
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

Joined
Sep 5, 2023
Messages
459 (0.94/day)
Location
USA
System Name Dark Palimpsest
Processor Intel i9 13900k with Optimus Foundation Block
Motherboard EVGA z690 Classified
Cooling MO-RA3 420mm Custom Loop
Memory G.Skill 6000CL30, 64GB
Video Card(s) Nvidia 4090 FE with Heatkiller Block
Storage 3 NVMe SSDs, 2TB-each, plus a SATA SSD
Display(s) Gigabyte FO32U2P (32" QD-OLED) , Asus ProArt PA248QV (24")
Case Be quiet! Dark Base Pro 900
Audio Device(s) Logitech G Pro X
Power Supply Be quiet! Straight Power 12 1200W
Mouse Logitech G502 X
Keyboard GMMK Pro + Numpad
This looks like just one more point against the claims of the 5080 being faster than the 4090. It's the same process node, only 65% of the cores, a lower power rating, lower memory bandwidth, less memory... the only way this even competes with the 4090 is if there's a new DLSS tech, or if they made the 5000 series better at frame-gen and that's how they're compared. In raw raster power, no way.
 
Joined
Jun 14, 2020
Messages
3,661 (2.19/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Just buy a 5080 and save $1,000+. The performance of a 5090 at 320 W and a 5080 at 360 W is going to be about the same.
But I'd also run the 5080 at 320 W, so the performance difference will still be whatever it ends up being.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,127 (3.96/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Well, there's over 1 GHz difference in core speed between my 4070 Ti and my 3070 Ti, plus more cache and other tweaks. I'm sure they have good AI to design these things :D
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
But I'd also run the 5080 at 320 W, so the performance difference will still be whatever it ends up being.
Sure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,127 (3.96/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Sure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
Yes, because when running a GPU like this, your only concern is if you have a big enough PSU
 
Joined
Dec 12, 2016
Messages
2,001 (0.68/day)
Sure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
I think the confusion lies in the process nodes. The move from the 2000 series to the 3000 series was horrible because of the 8 nm Samsung node. The situation greatly improved with the 4000 series on the 4 nm TSMC node. That's why the 4090 is such a great performer. But now the 2000 to 3000 series situation is happening again as the 5000 series is on the same 4 nm TSMC node. Efficiency can only go down if you add 30% more transistors to the same process node unless you greatly decrease the clock speed which negates any performance improvements over the previous generation.
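The power side of that argument follows from the standard dynamic-power relation: at a fixed node, power scales roughly with transistor count, voltage squared, and clock frequency. A back-of-the-envelope sketch (the scaling factors are illustrative assumptions, not real Blackwell figures):

```python
# Rough dynamic-power model: P ~ N * C * V^2 * f.
# On the same process node, C per transistor stays about the same,
# so relative power reduces to N_scale * V_scale^2 * f_scale.
def rel_power(n_scale: float, v_scale: float = 1.0, f_scale: float = 1.0) -> float:
    return n_scale * v_scale**2 * f_scale

# 30% more transistors at unchanged voltage/clocks -> ~30% more power.
print(rel_power(1.3))                                        # 1.3
# A mild undervolt/underclock claws most of that back.
print(round(rel_power(1.3, v_scale=0.95, f_scale=0.9), 2))   # ~1.06
```

Which is exactly the trade-off described above: more transistors on the same node means more power, unless clocks and voltage come down.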
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Yes, because when running a GPU like this, your only concern is if you have a big enough PSU
The same way when you're buying a Ferrari, your only concern is whether you have space for it in your garage next to your other Ferraris? Um, maybe.

I'm still thinking that if a 5090 performs at 100%, and a 5080 at 320 W performs at 50%, and you can get 60% by running your 5090 at 320 W, then the other 40% is wasted money.

Edit: Then, you basically paid double price for 20% more performance.
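Plugging those numbers into quick math (the prices and performance percentages are the hypothetical figures from the argument above, not real benchmarks or MSRPs):

```python
# Hypothetical figures: a 5080 at ~$1,000 performing at 50%,
# and a 5090 at ~$2,000 performing at 60%, both capped to 320 W.
price_5080, perf_5080 = 1000.0, 50.0
price_5090, perf_5090 = 2000.0, 60.0

cost_ratio = price_5090 / price_5080   # 2.0x the price
perf_ratio = perf_5090 / perf_5080     # 1.2x the performance
value_ratio = (perf_5090 / price_5090) / (perf_5080 / price_5080)

print(f"{cost_ratio:.1f}x the price for {perf_ratio:.1f}x the performance")
print(f"perf per dollar: {value_ratio:.0%} of the 5080's")  # 60%
```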

I think the confusion lies in the process nodes. The move from the 2000 series to the 3000 series was horrible because of the 8 nm Samsung node. The situation greatly improved with the 4000 series on the 4 nm TSMC node. That's why the 4090 is such a great performer. But now the 2000 to 3000 series situation is happening again as the 5000 series is on the same 4 nm TSMC node. Efficiency can only go down if you add 30% more transistors to the same process node unless you greatly decrease the clock speed which negates any performance improvements over the previous generation.
I completely agree, although this wasn't my question.
 
Joined
Dec 28, 2012
Messages
4,004 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Doom is mostly a shooter on rails, or by sections; they couldn't pull that off in an open world, for example. Not to say they didn't do a great job, but not all games are equal.
*cough cough* RAGE *cough cough*

I agree. I want to see speed increases due to advancements in GPU architecture, like I did in the Pascal years, and not due to cramming more parts into a chip and increasing power (what I call brute forcing).
Pascal's main advancements came from cramming significantly more transistors onto a chip with a higher power limit, especially given that Maxwell was its predecessor. Rose-colored glasses and all that.
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Pascal's main advancements came from cramming significantly more transistors onto a chip with a higher power limit, especially given that Maxwell was its predecessor. Rose-colored glasses and all that.
Well, Maxwell wasn't a bad architecture, either, imo... but I get what you mean.
 
Joined
Dec 28, 2012
Messages
4,004 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
The same way when you're buying a Ferrari, your only concern is whether you have space for it in your garage next to your other Ferraris? Um, maybe.

I'm still thinking that if a 5090 performs at 100%, and a 5080 at 320 W performs at 50%, and you can get 60% by running your 5090 at 320 W, then the other 40% is wasted money.

Edit: Then, you basically paid double price for 20% more performance.
Except it's usually the opposite: the 5090 at 320 W would be putting out 60%, while the 5080 at 320 W would be putting out 50%.

Besides, if you're buying the 5090, it's because the 5080 isn't enough for what you want. For most who want high-end hardware, drawing 525 W isn't a concern. The high end has always had huge power draw (hello, SLI era).
Well, Maxwell wasn't a bad architecture, either, imo... but I get what you mean.
No, it wasn't bad. It was great. My point was that overall, most GPU generations are defined by MOAR COARS and more power, with the power offset by smaller nodes. IPC is far less important to GPUs than it is to CPUs; parallelism and clock speeds make a much larger difference. That's been true for a long time.
 
Joined
May 29, 2024
Messages
33 (0.15/day)
Location
United States of America
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS B650-F ROG STRIX GAMING WIFI ATX
Cooling DeepCool AK500 ZERO DARK
Memory TeamGroup T-Create Expert 32GB Kit (2 x 16GB) DDR5-6000 CL30
Video Card(s) Gigabyte RTX 3050 Gaming OC
Storage WD Black SN850 1TB PCIe 4.0
Display(s) ASUS ROG Swift OLED PG27AQDM
Case Fractal Design North
Audio Device(s) Topping DX3 Pro+ DAC/AMP, Byerdynamic TYGR 300R, HyperX QuadcastS
Power Supply MSI MEG Ai1000P 1000W 80+ Platinum
Mouse LAMZU Maya X
Keyboard DURGOD 65% Gateron Yellow switches
Software Windows 10 Pro
Will I be chill with a 1000W ATX 3.1 PSU for the 5090? (paired with 7800X3D)
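For what it's worth, a rough steady-state budget for that combination (the GPU figure is the TDP from the article; the CPU figure is the 7800X3D's stock package power limit; the 75 W for the rest of the system is a guess):

```python
# Back-of-the-envelope PSU budget for an RTX 5090 + 7800X3D build.
gpu_tdp = 575    # RTX 5090 TDP from the article
cpu_ppt = 162    # Ryzen 7 7800X3D stock package power limit
rest = 75        # motherboard, RAM, storage, fans (rough guess)

steady_state = gpu_tdp + cpu_ppt + rest
print(f"steady-state draw: ~{steady_state} W")            # ~812 W
print(f"headroom on a 1000 W unit: ~{1000 - steady_state} W")
```

That leaves roughly 190 W of headroom, and ATX 3.x units are also specified to ride out short GPU power excursions above the sustained rating, so a quality 1000 W unit should be workable.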
 
Joined
May 10, 2023
Messages
433 (0.71/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Just buy a 5080 and save $1,000+. The performance of a 5090 at 320 W and a 5080 at 360 W is going to be about the same. Maybe, and this is a big maybe, the 5090 will be a little faster, but don't forget that Nvidia is using the same node as the 4000 series. This means the efficiency of the 5000 series will go down as more transistors are added.

This is a buyer-beware situation, and no company logo on the box beats physics.
You seem to be under the assumption that performance scales linearly with power.
A 5090 at a lower power budget than a 5080 is still going to have almost double the memory bandwidth and way more cores, even if those are clocked lower.
The same way when you're buying a Ferrari, your only concern is whether you have space for it in your garage next to your other Ferraris? Um, maybe.

I'm still thinking that if a 5090 performs at 100%, and a 5080 at 320 W performs at 50%, and you can get 60% by running your 5090 at 320 W, then the other 40% is wasted money.

Edit: Then, you basically paid double price for 20% more performance.


I completely agree, although this wasn't my question.
Your assumptions are also wrong. A 5090 at 320 W is likely to be only 10-20% slower than at the stock setting.
The 5080 math is also not that simple, because things (sadly) often don't scale linearly like that.
 
Joined
Jun 30, 2008
Messages
271 (0.04/day)
Location
Sweden
System Name Shadow Warrior
Processor 7800x3d
Motherboard Gigabyte X670 Gaming X AX
Cooling Thermalright Peerless Assassin 120 SE ARGB White
Memory 64GB 6000Mhz cl30
Video Card(s) XFX 7900XT
Storage 8TB NVME + 4TB SSD + 3x12TB 5400rpm
Display(s) HP X34 Ultrawide 165hz
Case Fractal Design Define 7 (modded)
Audio Device(s) SMSL DL200 DAC / AKG 271 Studio / Moondrop Joker..
Power Supply Corsair hx1000i
Mouse Roccat Burst Pro
Keyboard Cherry Stream 3.0 SX-switches
VR HMD Quest 1 (OLED), Pico 4 128GB
Software Win11 x64
Joined
Dec 1, 2022
Messages
296 (0.39/day)
Should I write "in my opinion" in front of every post I make? :confused:

I am an Nvidia user, by the way, just not in my main gaming rig at the moment. I've got two HTPCs that both have Nvidia GPUs in them. Does that make me more qualified to comment here?
It seems to be getting to that point; people take the system specs too seriously, lol.
Let me disagree there. The 5090 has double of everything compared to the 5080 (shaders, VRAM, etc.), which is already going to be a stupidly expensive card. The 5090 is GeForce by name only, to sell it to gamers, but it is not a card that your average gamer needs. Otherwise, there wouldn't be such a gigantic gap between it and the 5080 in specs.
The 5090 is more of an RTX A-series card than a GeForce card. Double the shaders and VRAM likely means double the price as well, and I doubt Jensen is going to be generous since businesses bought up the 4090.
Maybe it's just me missing the pricing structure of Pascal: there was only a $200 difference between the x80 and the x80 Ti, and the Titan XP wasn't something gamers with money to waste were buying.
 
Joined
Jun 14, 2020
Messages
3,661 (2.19/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Sure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
I don't think the performance loss from dropping to 320 W will be even 5%. It's the same with CPUs: you can push 50% extra power for single-digit performance gains.

I'm currently running 320 W with overclocked memory, and it's around 2-3% faster than stock 450 W, so I don't think the 5090 will be any different.
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
The 5090 at 320 W would be putting out 60%, while the 5080 at 320 W would be putting out 50%.
That's exactly what I said.

Besides, if you're buying the 5090, it's because the 5080 isn't enough for what you want. For most who want high-end hardware, drawing 525 W isn't a concern. The high end has always had huge power draw (hello, SLI era).
That's what I think, too. If the 5080 isn't enough, I'm not gonna spend double and then limit my 5090 to be only a little bit faster than the 5080. It's a huge waste of money.

No, it wasn't bad. It was great. My point was that overall, most GPU generations are defined by MOAR COARS and more power, with the power offset by smaller nodes. IPC is far less important to GPUs than it is to CPUs; parallelism and clock speeds make a much larger difference. That's been true for a long time.
Then why do we have massive differences between GPUs such as the 5700 XT vs the Vega 64, where the former was faster with only 62% of the cores, despite not much of a clock speed difference?
 
Joined
Jun 14, 2020
Messages
3,661 (2.19/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Your assumptions are also wrong. A 5090 at 320 W is likely to be only 10-20% slower than at the stock setting.
The 5080 math is also not that simple, because things (sadly) often don't scale linearly like that.
10-20% is still huge. I don't think it will be over 5%, honestly.
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
I don't think the performance loss from dropping to 320 W will be even 5%. It's the same with CPUs: you can push 50% extra power for single-digit performance gains.

I'm currently running 320 W with overclocked memory, and it's around 2-3% faster than stock 450 W, so I don't think the 5090 will be any different.
You may be right, then. ;)

It seems to be getting to that point; people take the system specs too seriously, lol.
Does it matter, though? Can current AMD users not have an opinion on an Nvidia card and vice versa? Do people sign their souls away when they choose Coca-Cola instead of Pepsi one day? I don't think so.

The 5090 is more of an RTX A-series card than a GeForce card. Double the shaders and VRAM likely means double the price as well, and I doubt Jensen is going to be generous since businesses bought up the 4090.
Maybe it's just me missing the pricing structure of Pascal: there was only a $200 difference between the x80 and the x80 Ti, and the Titan XP wasn't something gamers with money to waste were buying.
Exactly. But now, Nvidia wants even gamers to buy the Titan, ehm... the x90 card, despite its price.
 
Joined
May 10, 2023
Messages
433 (0.71/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Joined
Jun 14, 2020
Messages
3,661 (2.19/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Yeah, I'm assuming a worst-case scenario just to be safe.
Just tested it in CP2077: 73 fps @ 440-450 W, 71 fps @ 320 W. That's at 4K native.
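Those numbers work out to a large efficiency win. A quick calculation, taking 445 W as the midpoint of the quoted 440-450 W reading:

```python
# Perf-per-watt from the quoted CP2077 run at 4K native.
fps_stock, watts_stock = 73, 445     # midpoint of the 440-450 W reading
fps_capped, watts_capped = 71, 320

perf_loss = 1 - fps_capped / fps_stock          # ~2.7%
power_saved = 1 - watts_capped / watts_stock    # ~28.1%
eff_gain = (fps_capped / watts_capped) / (fps_stock / watts_stock) - 1

print(f"perf loss: {perf_loss:.1%}")
print(f"power saved: {power_saved:.1%}")
print(f"perf-per-watt gain: {eff_gain:.1%}")    # ~35%
```

So the cap trades under 3% of the frame rate for nearly 30% less power in this one test.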
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Just tested it in CP2077: 73 fps @ 440-450 W, 71 fps @ 320 W. That's at 4K native.
That's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It raises the question, though: why does the 4090 have to be a 450 W card by default if the extra power doesn't bring any extra performance to the table? What is Nvidia aiming at with such high power consumption?
 
Joined
May 26, 2021
Messages
142 (0.11/day)
Haven't the official/public TDP numbers technically been TGPs (as in whole-card consumption) for a while now? For both AMD and Nvidia, the power consumption numbers measured in reviews are within measuring error of the power limit that is set to the TDP. There was a point where GPU manufacturers tried to make things complicated, but that didn't last long.
TGP is only for the GPU chip; TBP is Total Board Power.
 
Joined
Jun 14, 2020
Messages
3,661 (2.19/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
That's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It raises the question, though: why does the 4090 have to be a 450 W card by default if the extra power doesn't bring any extra performance to the table? What is Nvidia aiming at with such high power consumption?
I think the 450 W makes the card faster in other workloads (than when restricted to 320 W), but as far as I've tested, games seem to be limited by memory bandwidth, so they don't scale much with power. If there are non-gaming workloads that don't depend on memory as much, I guess the 450 W will give better performance. Still, I don't expect anything over 10% in either case. What's Nvidia thinking? Probably the same thing Intel is thinking when they decide to ship CPUs at 400 watts :D

OCing the VRAM gives me ~8-9% performance; overclocking the core to 3000 MHz gives me ~2%.
 
Joined
May 10, 2023
Messages
433 (0.71/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
That's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It raises the question, though: why does the 4090 have to be a 450 W card by default if the extra power doesn't bring any extra performance to the table? What is Nvidia aiming at with such high power consumption?
So that in some very specific cases it can stretch its legs all the way, using an extra 100 W for a 100 MHz bump for those synthetic benchmark scores.
Same goes for the 600 W limit some models have: really pushing the power envelope for minor clock gains. Remember that past a certain point, you need disproportionately more power for each extra bit of performance.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% of the performance.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that after a certain point you barely get any extra performance from increasing power:
[Attached chart: RTX 3090 power scaling in AI workloads]

That has been the case since... always. Here's another example with a 2080 Ti:
[Attached chart: RTX 2080 Ti power scaling]

Games often don't really push a GPU that hard, so the consumption while playing is usually lower than the actual limit.
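The shape of those curves can be approximated with a toy cube-root model: near the top of the voltage/frequency curve, frequency rises roughly with voltage and power with V²·f, so performance grows only about as the cube root of power. This is a sketch under that assumption, not a fit to the actual charts:

```python
# Toy diminishing-returns model: perf ~ power^(1/3) near the V/f limit.
def rel_perf(power_w: float, ref_w: float = 370.0) -> float:
    """Performance relative to a 370 W default power limit."""
    return (power_w / ref_w) ** (1 / 3)

for w in (275, 320, 370, 450):
    print(f"{w:>3} W -> {rel_perf(w):.2f}x")
# 275 W comes out around 0.91x, i.e. under a 10% loss,
# in line with the 3090 figures quoted above.
```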
 
Joined
Jan 14, 2019
Messages
13,049 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
So that in some very specific cases it can stretch its legs all the way, using an extra 100 W for a 100 MHz bump for those synthetic benchmark scores.
Same goes for the 600 W limit some models have: really pushing the power envelope for minor clock gains. Remember that past a certain point, you need disproportionately more power for each extra bit of performance.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% of the performance.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that after a certain point you barely get any extra performance from increasing power:
[Attached chart: RTX 3090 power scaling in AI workloads]
What that diagram tells me is that the 3090 should be a 250-260 W card. There is no need for it to eat more than that out of the box. Overclockers would be happy with that, too.
 