Monday, February 3rd 2025

NVIDIA GeForce RTX 5090 Runs on 3x8-Pin PCI Power Adapter, RTX 5080 Not Booting on 2x8-Pin Configuration

NVIDIA's flagship GeForce RTX 5090 demonstrated flexibility in power compatibility, while its sibling, the RTX 5080, struggled with stricter requirements. Recent tests by a German tech outlet, ComputerBase, reveal that the RTX 5090 can operate with three 8-pin PCI power connectors instead of the recommended four, albeit with a performance trade-off. However, the RTX 5080 fails to boot when using only two 8-pin connectors. The RTX 5090, with a default TDP of 575 W, officially requires a 600 W 12V-2×6 connector or an adapter with four 8-pin PCI cables. However, tests on the ASUS ROG RTX 5090 Astral and Zotac RTX 5090 Solid show the GPU boots even with three 8-pin cables, capping its TDP at 450 W—matching the three connectors' 150 W-per-cable spec. Performance losses are modest: benchmarks indicate a 5% drop in average FPS at 450 W compared to full power.

In contrast, the RTX 5080's 360 W TDP proves less forgiving. Attempts to run the Founders Edition and Zotac RTX 5080 AMP Extreme Infinity with two 8-pin connectors (300 W total) resulted in failure: the screen remained blank, and the card refused to initialize. NVIDIA's firmware appears to lack a lower power-limit threshold for the RTX 5080, unlike the 5090, which automatically adjusts when it detects insufficient power delivery. This forces users to adhere strictly to three 8-pin connectors or the 12V-2×6 connector. While the RTX 5090 offers flexibility for users upgrading from older systems, the RTX 5080's limitations may frustrate owners of less powerful PSUs. For the RTX 5090, the 5% performance penalty at 450 W may be a reasonable trade-off for avoiding a costly PSU upgrade, but RTX 5080 users have no such recourse. Verifying power supply compatibility is a must, as underpowered setups risk instability or hardware damage; when you run a $2,000+ GPU, you should at least power it properly. This experiment is best seen as a "for science" type of run.
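For illustration, the connector arithmetic behind these results can be sketched as follows (a rough model assuming the nominal 150 W per 8-pin cable and the 75 W PCIe slot allowance; actual behavior, as the tests show, is decided by the card's firmware):

```python
# Rough power-budget arithmetic for 8-pin PCIe connector setups.
# Assumes the nominal 150 W-per-cable spec and the 75 W slot allowance;
# real cards may cap differently (or refuse to boot) in firmware.

PIN8_WATTS = 150   # nominal limit per 8-pin PCIe cable
SLOT_WATTS = 75    # PCIe x16 slot allowance

def cable_budget(num_8pin: int) -> int:
    """Combined nominal budget of the 8-pin cables alone."""
    return num_8pin * PIN8_WATTS

def total_budget(num_8pin: int) -> int:
    """Cables plus the PCIe slot."""
    return cable_budget(num_8pin) + SLOT_WATTS

# RTX 5090 (575 W TDP) on three cables: firmware caps it at the
# 3 x 150 W = 450 W cable budget, per the ComputerBase tests.
print(cable_budget(3))   # 450
# RTX 5080 (360 W TDP) on two cables: 300 W from cables, 375 W with
# the slot -- numerically close to 360 W, yet the card refuses to boot.
print(total_budget(2))   # 375
```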
Sources: ComputerBase, via VideoCardz

65 Comments on NVIDIA GeForce RTX 5090 Runs on 3x8-Pin PCI Power Adapter, RTX 5080 Not Booting on 2x8-Pin Configuration

#51
Vayra86
Zach_01: What 2/3 are you talking about?
This is a 5080 with a 360 W power limit, and 2x8-pin can supply 300 W plus the 75 W from the PCI-E slot, a total of 375 W.
Compared to a 5090 that requires 575 W and only gets 450 W + 75 W (525 W), the 5080 should be "easier" to run with 2x8-pin.

Clearly this is a limitation somewhere that could (should?) have been avoided. Maybe avoiding it would require a more complex PCB/power-delivery subsystem, which the 5090 most likely has.
In all honesty, when you buy a $1,000-1,500+ GPU, you don't try to cheap out on power...

The 12VHPWR connector has a built-in sensing system for the 450 W and 600 W options.


Well, maybe the card just doesn't look at slot power and "wants" all of it from the other end? Is that possible? We've seen in the past that it's not just "oh, this gives me juice, I'll just take it", as we've also seen cards pull too much from the slot.
wolf: I can see why; 450 W+ is starting to get crazy. I'm getting comfier with the 300 W range though :twitch:

2 W seems too low, I'd say; 10-40 W seems more reasonable if it really wants all its power through the plug-in cables.
300 W was always my hard limit and it hasn't changed; 250 W is optimal to me in a regular system. If you go bigger, you need to worry a lot more about case cooling and case choice, but also PSU sizing, etc. It's a straight-up price increase for almost the entire system. And those systems also tend to be louder.
Posted on Reply
#52
JustBenching
AusWolf: So can I give my 6750 XT just 150 watts through a single 6-pin coming from a noname 300 W PSU and the PCI-e slot? :p
If you limit it to 150 W, sure, why not? I've used my 4090 on a 650 W PSU.
Posted on Reply
#53
AusWolf
JustBenching: If you limit it to 150 W, sure, why not?
Heh, I'd like to see someone try.
JustBenching: I've used my 4090 on a 650 W PSU
650 W is enough for a 4090 as long as you don't have an unlocked i9 CPU and ten HDDs, pumps, fans and such.
Posted on Reply
#54
JustBenching
AusWolf: Heh, I'd like to see someone try.


650 W is enough for a 4090 as long as you don't have an unlocked i9 CPU and ten HDDs, pumps, fans and such.
If 650 is enough then what are we even talking about?
Posted on Reply
#55
AusWolf
JustBenching: If 650 is enough then what are we even talking about?
This:
JustBenching: There is no such thing as need; hardware doesn't have needs. They take as much power as the user decides to give them.
Obviously hardware has needs. You need a certain voltage to maintain certain clock speeds, which means power is being used. There's also no power-saving limit in effect when you press the power button.

The only hardware that doesn't have needs is the one sitting on your shelf.
Posted on Reply
#56
JustBenching
If you can run the until-recently highest-end GPU, which costs over $2k, on a 650 W PSU, then it's a non-issue.
Posted on Reply
#57
_roman_
wolf: I don't know what you mean by argument? Does undervolting not work when using software?
I think you misunderstood the basics.

In the direct-current world, power equals voltage multiplied by current, or current squared multiplied by resistance, or voltage squared divided by resistance: P = V × I = I² × R = V² / R. Basics.
(Blame this forum for not having proper software to render basic third-year-school mathematical formulas.)
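To illustrate those relations with concrete numbers (a quick sketch, assuming a 12 V rail and the nominal 150 W per 8-pin cable):

```python
# Basic DC power relations: P = V*I = I^2*R = V^2/R.
V = 12.0          # volts on a PCIe power rail
P = 150.0         # nominal watts per 8-pin cable
I = P / V         # current the cable must carry
R = V / I         # equivalent load resistance

# All three forms agree, as they must:
assert abs(V * I - P) < 1e-9       # P = V*I
assert abs(I * I * R - P) < 1e-9   # P = I^2*R
assert abs(V * V / R - P) < 1e-9   # P = V^2/R

print(I)   # 12.5 amps per cable
```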

AusWolf stated that the card will not boot without all the proper connectors attached during boot-up.

You need to boot the hardware first.
Only then can you make software hacks and tweaks.
When the hardware does not boot with fewer connectors, you cannot undervolt.

I hope this is clear now.


Remember:

It's like the OSI layered model (which is very important; you may want to read up on it).

Physical layer = hardware comes first; a missing connector means no boot.
Software comes much later.
The application comes later still.

edit: Don't be angry. These are basics of electronics, physics, and mathematics. The other stuff is basic material from my education; nothing new.

edit: "Get another power supply unit" is not an argument. When the card draws fewer watts, an older power supply could be a better fit. The real reason is those insane power spikes, which previous cards most likely did not generate. See igor'sLAB; sometimes he is right, sometimes there is room for improvement. Why should someone need a new power supply unit when the older card had similar wattage to the newer card? The only reason is those power spikes.
Posted on Reply
#58
3valatzy
_roman_: when the hardware does not boot with fewer connectors, you cannot undervolt
This is nonsense; the cards must boot with close-to-idle power settings, not at maximum peak clocks, etc...
Posted on Reply
#59
_roman_
wolf: I wouldn't expect a card with 2x8-pin to work with only one connected, for example, even if a power target was set needing less than 225 W.
From my point of view it should be possible.

The connector can distinguish between four different wattage modes for the 600 W connector. The card should boot up.

Check the link in post #22: the card should even boot up with 100 watts, the initial permitted power at system power-up, and it should run fine at 150 watts. Any questions?
Assuming that spec in #22 is correct, NVIDIA has a design error on that particular graphics card board, or a software error for it.
Sense0 | Sense1 | Initial Permitted Power at System Power Up | Maximum Sustained Power after Software Configuration
Gnd    | Gnd    | 375 W | 600 W
Open   | Gnd    | 225 W | 450 W
Gnd    | Open   | 150 W | 300 W
Open   | Open   | 100 W | 150 W
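A minimal sketch of that sense-pin lookup (hypothetical code, not NVIDIA's actual firmware; values taken from the table above):

```python
# Hypothetical lookup of the 12VHPWR/12V-2x6 sense-pin table.
# "gnd" = sideband pin tied to ground, "open" = pin left floating.
# Values: (initial permitted power at system power-up,
#          maximum sustained power after software configuration), in watts.
SENSE_TABLE = {
    ("gnd",  "gnd"):  (375, 600),
    ("open", "gnd"):  (225, 450),
    ("gnd",  "open"): (150, 300),
    ("open", "open"): (100, 150),
}

def power_limits(sense0: str, sense1: str) -> tuple[int, int]:
    """Return the (initial, sustained) power limits advertised by the PSU."""
    return SENSE_TABLE[(sense0, sense1)]

# Both sense pins grounded advertises the full 600 W connector:
print(power_limits("gnd", "gnd"))   # (375, 600)
```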
When you reply, please provide datasheets for all components involved, especially the controller ICs, full schematics for a recent power supply unit, the ATX 3.1 spec, and full schematics for those cables and adapters.
AusWolf: Naturally, cards with external power connectors don't use the PCI-e slot to its full 75 W specification, but to say they use 2 W is a bit daft.
I doubt the wattage is limited over the PEG slot. I forgot which card it was, but I think one card even drew over 125 watts over the PEG slot.

Anyway, NVIDIA does not contribute to open-source software; see the recent Gamers Nexus video.

I invite AMD, NVIDIA, Intel, the power supply companies, and the others: please publish the full specifications, schematics, datasheets, and register/programming sheets for all components used on PC parts.

Well, it's easy to sell garbage when you do not publish the full specifications and no one can check whether the firmware, software, and hardware work according to them. Check e.g. the Linux kernel (and that is only the kernel!) for all the workarounds for hardware and firmware bugs. A generic statement.

I really want to be able to find, in under two minutes, the full NVIDIA graphics card connector specs on the NVIDIA homepage in English, including the datasheet, application notes, and the other common documents needed to design a device.
Posted on Reply
#60
wolf
Better Than Native
_roman_: I think you misunderstood the basics.
_roman_: From my point of view it should be possible.
I'm thinking it's you who's misunderstood the basics. I'm not providing you with anything. It's reasonable to expect hardware to be connected as the manufacturer intended. I have zero intention of proving or justifying anything to you, as if you were some gatekeeper whose approval decides whether what I've said is valid. The hubris.
Posted on Reply
#61
_roman_
Visible Noise: No. As a rule, NVIDIA only pulls about 2 W from the PCIe slot.
I have written on igor'sLAB for ages that I want to see proper measurements.

As far as I know, only Gamers Nexus measures the PEG slot and all the cables with an internally verified, externally calibrated measurement device.

I doubt that 12 watts on the PEG slot is really 2 watts, and I doubt the Windows software reads out these values correctly.

Source: NVIDIA GeForce RTX 5090 Founders Edition Review & Benchmarks: Gaming, Thermals, & Power

That measurement is plausible.

I have also commented on those CPU tests in the past. I want to see the wattage for every CPU pin, not the whole-mainboard nonsense, like igor'sLAB published in the past. Igor, for example, cannot differentiate between CPU-only consumption and CPU + mainboard + mainboard peripherals + RAM consumption, plus whatever else may be on the mainboard that I forget right now.
Posted on Reply
#62
Visible Noise
JustBenching: That's completely wrong. The card not starting because a cable is missing has nothing to do with power draw or undervolting. There are 3080 models with 2x8-pin and 3x8-pin. They all draw the same power (locked to 360 W), but the 3x8-pin ones don't start with only two cables. That has nothing to do with them requiring more power; it's just the way they were designed.
We'll see what happens with the 3x8-pin 9070 XT, I guess. By some people's thinking, it won't be able to be undervolted.
_roman_: Nvidia does not contribute to open source software - see recent Gamers Nexus video.
You should get your information from better sources than techtubers that have their own agendas.

developer.nvidia.com/open-source
github.com/NVIDIA
Posted on Reply
#63
Vayra86
3valatzy: This is nonsense, the cards must boot with close to idle power settings, not at maximum peak clocks, etc...
Right, you mean like your PC booting up with an idle CPU? OH wait

Let's reflect a little on the fact that we're talking about how a card should behave power-wise when it's one of the rare cards with odd behaviour and a... oh! 600 W power allowance! Gosh.
Posted on Reply
#64
_roman_
Visible Noise: You should get your information from better sources than techtubers that have their own agendas.
Just - don't

I had an NVIDIA 9800M GTS (ASUS G70SG) with a "BAR" error in the firmware. I patched every kernel source I used with that graphics card by hand, for years, not just weeks or months, because the card had a hardware/firmware error with the binary NVIDIA drivers.
I had an NVIDIA GTX 660M (ASUS G75VW); barely any open-source support.
In 2023 I tested the Windows 10 Pro and GNU/Linux driver state again with a second-hand MSI GTX 960 4 GB card, which I used for several months. That was one of the reasons I had a Radeon 6600 XT before, bought a Radeon 6800 (non-XT) after that, and later a Radeon 7800 XT. I'm one of the few users who did not test Intel but re-evaluated the other GPU manufacturer for driver quality in daily Windows 10 Pro and Linux userspace use. I was also interested in the 4 GB VRAM and how slow that card really is.

Just don't tell stories nobody believes. I used notebooks with NVIDIA graphics on the same GNU/Linux installation for a very long time.

Years ago I checked this page; no progress, and not even the bare minimum works:
nouveau.freedesktop.org/FeatureMatrix.html
Posted on Reply
#65
KiloFeenix
AusWolf: What are the sensor pins connected to when you're using an adapter?
The sense pins are connected to PSU ground, so if one of the adapter's plugs isn't connected, the card would know. Why the 5090 can do it anyway, I don't know.
Posted on Reply