Monday, February 3rd 2025

NVIDIA GeForce RTX 5090 Runs on 3x8-Pin PCI Power Adapter, RTX 5080 Not Booting on 2x8-Pin Configuration

NVIDIA's flagship GeForce RTX 5090 has demonstrated flexibility in power compatibility, while its sibling, the RTX 5080, struggles with stricter requirements. Recent tests by German tech outlet ComputerBase reveal that the RTX 5090 can operate with three 8-pin PCI power connectors instead of the recommended four, albeit with a performance trade-off; the RTX 5080, however, fails to boot when using only two 8-pin connectors. The RTX 5090, with a default TDP of 575 W, officially requires a 600 W 12V-2×6 connector or an adapter with four 8-pin PCI cables. Yet tests on the ASUS ROG RTX 5090 Astral and Zotac RTX 5090 Solid show the GPU boots even with three 8-pin cables, capping its TDP at 450 W, matching the three connectors' combined 150 W-per-cable spec. Performance losses are modest: benchmarks indicate a 5% drop in average FPS at 450 W compared to full power.

In contrast, the RTX 5080's 360 W TDP proves less forgiving. Attempts to run the Founders Edition and the Zotac RTX 5080 AMP Extreme Infinity with two 8-pin connectors (300 W total) resulted in failure: the screen remained blank, and the card refused to initialize. NVIDIA's firmware appears to lack a lower power-limit threshold for the RTX 5080, unlike the 5090, which automatically adjusts when it detects insufficient power delivery. Users must therefore stick to the three 8-pin adapter or a native 12V-2×6 connector. While the RTX 5090 offers flexibility for users upgrading from older systems, the RTX 5080's limitation may frustrate owners of less powerful PSUs. For the RTX 5090, the 5% performance penalty at 450 W may be a reasonable trade-off for avoiding a costly PSU upgrade, but RTX 5080 users have no such recourse. Verifying power supply compatibility remains a must, as underpowered setups risk instability or hardware damage; when your $2,000+ GPU runs, you should at least power it properly. Treat this experiment as more of a "for science" type of run.
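As a rough back-of-the-envelope check on the figures above, the connector budgets can be sketched in Python. The 150 W-per-cable and 75 W slot figures are the nominal PCIe spec limits; this is an illustrative sketch, not how the cards actually negotiate power.

```python
# Rough power-budget check for the connector configurations discussed above.
# Assumptions: 150 W per 8-pin PCIe cable (spec limit), up to 75 W from the
# PCIe slot itself; TDP figures are NVIDIA's official values.
PIN8_W = 150   # nominal limit per 8-pin PCIe cable
SLOT_W = 75    # maximum draw from the PCIe slot

def cable_budget(n_cables: int, include_slot: bool = True) -> int:
    """Total board power available from n 8-pin cables (+ optional slot power)."""
    return n_cables * PIN8_W + (SLOT_W if include_slot else 0)

for name, tdp, cables in [("RTX 5090", 575, 3), ("RTX 5080", 360, 2)]:
    budget = cable_budget(cables)
    verdict = "OK" if budget >= tdp else f"short by {tdp - budget} W"
    print(f"{name}: TDP {tdp} W, {cables}x8-pin budget {budget} W, {verdict}")
```

Notably, with slot power counted, the two-cable budget (375 W) nominally covers the RTX 5080's 360 W TDP, consistent with the boot failure being a firmware threshold rather than a raw wattage shortfall.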
Sources: ComputerBase, via VideoCardz

30 Comments on NVIDIA GeForce RTX 5090 Runs on 3x8-Pin PCI Power Adapter, RTX 5080 Not Booting on 2x8-Pin Configuration

#1
AusWolf
So much for all those "but you can undervolt it" arguments.
#2
Wirko
How does the card know what's at the other end of the adapter?
#3
AusWolf
Wirko: How does the card know what's at the other end of the adapter?
Maybe the PSU won't supply the power it needs?
#4
Macro Device
Wirko: How does the card know what's at the other end of the adapter?
Sensor pins.
#5
AusWolf
Macro Device: Sensor pins.
What are the sensor pins connected to when you're using an adapter?
#6
hsew
AusWolf: What are the sensor pins connected to when you're using an adapter?
They could be shorted in such a way as to put the card in an agnostic or fixed power limit mode.
#7
Assimilator
575W @ 100% versus 450W @ 95%, it's crazy how far above their optimal range these chips are clocked.
#9
phanbuey
It will just heat up those 3 cables more. Ask me how I know.

Was running some maps in PoE 2 when the distinct smell of burning plastic filled the air. I was only running 3 cables because I read, when I first bought the card, that it would limit the card in my build to 450 W instead of 450 W+, and was in spec.

I would not recommend it.
#11
AnotherReader
AusWolf: 16% is still not a lot for a 27% reduction in power.
It's a 16% increase with a 28% increase in power, not a 28% reduction in power. 99th percentile frame times also improved by 20% with the higher power limit. I would say that it's highly usage dependent; for games that don't run into the card's power limit, a lower limit doesn't really hurt.
#12
jonup
AnotherReader: It's a 16% increase with a 28% increase in power, not a 28% reduction in power. 99th percentile frame times also improved by 20% with the higher power limit. I would say that it's highly usage dependent; for games that don't run into the card's power limit, a lower limit doesn't really hurt.
It's also a 28% increase in the maximum possible consumption, but the actual average power consumption will track the increase in performance much more linearly.
#13
3valatzy
phanbuey: It will just heat up those 3 cables more. Ask me how I know.

Was running some maps in PoE 2 when the distinct smell of burning plastic filled the air. I was only running 3 cables because I read, when I first bought the card, that it would limit the card in my build to 450 W instead of 450 W+, and was in spec.
450 watts over 3 PCIe 8-pins means 12.5 amps per connector.
150 watts, the theoretical limit of a single 8-pin, also means 12.5 amps.
450 watts over 4 PCIe 8-pins means 9.375 amps per connector.

Something doesn't add up in your experiment. Are you sure something isn't wrong in your case?
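The per-connector figures above follow directly from P = V × I at 12 V; a quick check:

```python
# Current per 8-pin connector at 12 V: I = P / (V * n_connectors).
def amps_per_connector(total_watts: float, n_connectors: int, volts: float = 12.0) -> float:
    return total_watts / (volts * n_connectors)

print(amps_per_connector(450, 3))  # 12.5 A per connector, the 8-pin's nominal ceiling
print(amps_per_connector(150, 1))  # 12.5 A, the same per-cable limit
print(amps_per_connector(450, 4))  # 9.375 A with four cables
```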
AusWolf: So much for all those "but you can undervolt it" arguments.
That's an ugly artificial software limitation, not that the hardware is incapable of supporting it. Blame Nvidia, vote with your wallet and don't buy.
#14
GhostRyder
Clearly they had to push it that far in power to make it worth buying. Otherwise it would have been so close to the 4080 Super that it probably would have hurt sales in the long run (though I guarantee it would have sold out immediately regardless).

We really are not going to see much this generation overall, even if a 5080 Ti comes out. It's really going to be the 6XXX series that makes a much more major difference all around.
#15
AusWolf
3valatzy: That's an ugly artificial software limitation, not that the hardware is incapable of supporting it. Blame Nvidia, vote with your wallet and don't buy.
Well, I was never looking for a $1000+ graphics card in the first place, so I think I'm fine. :)
GhostRyder: It's really going to be the 6XXX series that makes a much more major difference all around.
Don't bet on that, yet.
#16
3valatzy
GhostRyder: Clearly they had to push it that far in power to make it worth buying. Otherwise it would have been so close to the 4080 Super that it probably would have hurt sales in the long run (though I guarantee it would have sold out immediately regardless).

We really are not going to see much this generation overall, even if a 5080 Ti comes out. It's really going to be the 6XXX series that makes a much more major difference all around.
I bet Nvidia will charge first with a super-duper RTX 5000 lineup, and then, if we're lucky, will release a "new architecture" on the same old 4 nm TSMC node, because 2 nm and 3 nm will remain prohibitively expensive, and because Nvidia doesn't want to sell graphics cards any longer.
#17
Wirko
Maybe the adapter isn't just wires but contains some electronics, and sends a "600W available" signal through the sensor pins only if all four 8-pin cables are connected.
#18
GhostRyder
AusWolf: Well, I was never looking for a $1000+ graphics card in the first place, so I think I'm fine. :)

Don't bet on that, yet.
Could be right; they do seem more focused on software, and that could be their future.
3valatzy: I bet Nvidia will charge first with a super-duper RTX 5000 lineup, and then, if we're lucky, will release a "new architecture" on the same old 4 nm TSMC node, because 2 nm and 3 nm will remain prohibitively expensive, and because Nvidia doesn't want to sell graphics cards any longer.
Oh, I don't think they hate selling cards; they just now want to focus on stuff that is more proprietary and make it part of the cards they sell.
#19
petroj
Wait, aren't these cards pulling 75 W from the PCIe slot, as the standard allows them to?
#20
AnotherReader
jonup: It's also a 28% increase in the maximum possible consumption, but the actual average power consumption will track the increase in performance much more linearly.
Well, they didn't measure average power draw under the lower limit, so we can only rely on how silicon typically scales with increased power. Average power consumption will increase by more than the increase in performance. Note that the example I gave was a game running close to the power limit at stock, so limiting it to 450 W resulted in a significant decrease in performance.
#21
_roman_
3valatzy: Something doesn't add up in your experiment. Are you sure something isn't wrong in your case?
You are right, we do not know all the details.

Some power supplies have Y-cables: two graphics card connectors on the same cable, going to a single connector on the power supply unit. It looks like the letter Y.

Regardless, I feel sorry for anyone who gets their expensive hardware damaged by an unfinished product.

I think those sense pins just check whether 12 V DC is present or not on 5 different pins, from what I saw a few weeks ago. I wouldn't call them sensor pins; they are most likely just checking for a high signal.

Gamers Nexus has equipment they put a lot of effort into calibrating; they can measure the PEG slot and the other wires in total.

From what I remember, those 5000-series graphics cards take around 40 W at idle. Just check the video first, please. Thank you. It doesn't matter if it's 32 or 40 or 35 W; it's far off the 6 to 12 W it should be in idle mode.

At the end of the day, only the power consumption at the wall socket counts. That includes the efficiency curve of the power supply unit, which is rarely measured in the 1 to 75 W output range.

--

Assuming that information is correct, you just need to tie two pins to GND for the 600 W mode. That shouldn't be difficult for an adapter cable.
www.smpspowersupply.com/atx3connectors.html
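Going by that page, the sideband signalling can be sketched as a simple lookup. The exact SENSE0/SENSE1 assignments below are assumed from the ATX 3.0 / PCIe CEM 5.0 tables, so treat them as illustrative rather than authoritative:

```python
# Sketch of the 12VHPWR sideband signalling: a grounded sense pin reads as
# logic low, an open pin as high. Mapping assumed from the ATX 3.0 tables.
# (SENSE0 grounded?, SENSE1 grounded?) -> initial power budget in watts
SENSE_TABLE = {
    (True,  True):  600,  # both pins tied to GND: full 600 W advertised
    (True,  False): 450,
    (False, True):  300,
    (False, False): 150,  # both open: minimum budget
}

def advertised_limit(sense0_gnd: bool, sense1_gnd: bool) -> int:
    """Power limit the card should assume from the sideband pin states."""
    return SENSE_TABLE[(sense0_gnd, sense1_gnd)]

print(advertised_limit(True, True))  # an adapter grounding both pins advertises 600 W
```

Under this reading, an adapter only needs to ground both sense pins once all four 8-pin inputs are populated, which would explain how the card "knows" what is on the other end.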
#22
phanbuey
_roman_: You are right, we do not know all the details. [...] At the end of the day, only the power consumption at the wall socket counts.
It didn't damage the card, just melted two of the cables themselves... and melted a bit of the plastic on one of the three connectors on the PSU.

I went ahead and ordered a Corsair 12VHPWR cable, and she's been running great at 450 W.

It was just too much heat for those three 8-pin cables? They melted pretty evenly... they may have been defective, or maybe there were other issues, who knows... but running anything at the absolute theoretical max is never a great idea. Very little room for error: the connector from the NVIDIA dongle to the PSU cable also melted and exuded some sort of clear resin.
#23
Endymio
3valatzy: That's an ugly artificial software limitation, not that the hardware is incapable of supporting it. Blame Nvidia, vote with your wallet and don't buy.
This is the most absurd pseudo-objection I've ever heard. The 2x8 configuration isn't simply a small undervolt; it's an attempt to supply the card only 2/3 of normal power. No one buys a $1000 video card to undervolt it to the point that it runs like a $500 card.
#24
Zach_01
Endymio: This is the most absurd pseudo-objection I've ever heard. The 2x8 configuration isn't simply a small undervolt; it's an attempt to supply the card only 2/3 of normal power. No one buys a $1000 video card to undervolt it to the point that it runs like a $500 card.
What 2/3 are you talking about?
This is a 5080 with 360W power limit and 2x8pin can supply 300W + the 75W from PCI-E. Total of 375W.
Compared to a 5090 that requires 575W and only getting 450W + 75W (525W) the 5080 should be "easier" to run with 2x8pin.

Clearly this is a limitation somewhere that could, or should(?), have been avoided. Maybe it would require a more complex PCB/power-delivery subsystem, which the 5090 most likely has.
In all honesty, when you buy a $1000~1500+ GPU, you don't try to cheap out on power...

The 12VHPWR has a built-in sensing system for the 450 W and 600 W options.
#25
Endymio
Zach_01: Compared to a 5090 that requires 575 W and is only getting 450 W + 75 W (525 W), the 5080 should be "easier" to run with 2x8-pin.
That assumes the voltage/freq curves for both chips are the same (they aren't) and that the 5080 is pulling the maximum both from the bus and each plug (it isn't, or it wouldn't need the third plug).

Claiming this board is a fail because you can't run it drastically out of spec is an outrageously puerile argument. Had NVidia done what you asked, then the first cable or mobo that was even slightly out-of-spec would have caused anything from a melted cable to an outright fire, and you'd have gone off the rails over that instead.