Thursday, March 24th 2022

NVIDIA to Bundle Power Adapter with GeForce RTX 3090 Ti

For those looking to invest in a GeForce RTX 3090 Ti card, but unable to acquire a new power supply with a 12VHPWR connector, there's good news today, as NVIDIA is said to be bundling a power adapter with the cards. The adapter does have its own requirements though, as it's a three-to-one adapter, which means your power supply still needs to have at least three 8-pin PCIe power connectors. The adapter is said to be able to deliver 450 W of power to the upcoming graphics card, which is in line with the various leaks and rumours about the GeForce RTX 3090 Ti.
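As a quick sanity check on those numbers: each 8-pin PCIe connector is rated for 150 W, so three of them exactly cover the card's rumoured 450 W. A minimal Python sketch of that arithmetic (the 150 W figure is the spec rating, the 450 W figure is from the leaks; the names are purely illustrative):

```python
# Back-of-the-envelope check of the three-to-one adapter's power budget.
# 150 W is the spec rating of an 8-pin PCIe power connector; 450 W is the
# leaked RTX 3090 Ti board power mentioned in the article.

PCIE_8PIN_RATING_W = 150
CARD_TARGET_W = 450

def adapter_budget_w(num_8pin_inputs: int) -> int:
    """Total wattage the adapter's 8-pin inputs can supply at spec ratings."""
    return num_8pin_inputs * PCIE_8PIN_RATING_W

budget = adapter_budget_w(3)
print(f"3x 8-pin budget: {budget} W")                          # 450 W
print(f"Covers the card's target: {budget >= CARD_TARGET_W}")  # True
```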

It's worth keeping in mind that these kinds of mechanical adapters won't allow the graphics card to communicate with the PSU, something that appears to be reserved for ATX 3.0 power supplies. It'll also lead to some extra cable tangle near the graphics card, as the design of the pictured adapter doesn't exactly look slick or space saving. Hopefully some power supply manufacturers will come up with a neater solution until the transition to ATX 3.0 PSUs takes place.
Source: VideoCardz

23 Comments on NVIDIA to Bundle Power Adapter with GeForce RTX 3090 Ti

#2
Chomiq
Here we go again.
#3
ZeppMan217
Might as well just pack a separate power brick with it and call it a day.
#4
ThrashZone
Hi,
Pigtail hell
This little piggy went to market
This little piggy stayed home
This little piggy went WHEA_UNCORRECTABLE_ERROR, all the way home :laugh:
#5
wolar
What is this monstrosity
#6
TheLostSwede
News Editor
wolar: What is this monstrosity
It's for powering your George Foreman grill...
#7
ncrs
ZeppMan217: Might as well just pack a separate power brick with it and call it a day.
Well, NVIDIA bought 3dfx, so they have the technology!
#8
thegnome
Good stuff, no new PSU needed.
#9
Unregistered
Given the price they should just bundle it with a PSU.
#10
csgabe
They should include copper shims too.
#11
ThrashZone
thegnome: Good stuff, no new PSU needed.
Hi,
At the price of the GPU, another PSU is chicken feed :laugh:
#12
Dr_b_
What's missing in the box is a $1000 credit for paying the first month's electric utility bill to run the card
#13
ppn
Why always the braided cable? None of mine are.
#14
sanorene
Me sleeving the power cables to my 3090ti
#15
R-T-B
DeathtoGnomes: @R-T-B I spoke too soon about the connector.
Technically I called adapters in that convo before you, but meh...
#16
ThrashZone
Hi,
Nice to see other manufacturers not doing what NVIDIA is doing power-wise.
Most are just using normal 2x8 or 3x8 plugs.
#17
chrcoluk
Not unexpected, it would be odd if they didn't, given that all the other 3000-series FE cards had adapters bundled.
#18
Chrispy_
Nvidia seems obsessed with pushing a new power standard despite the fact that neither the PSU manufacturers nor the PCI-SIG standards body seems interested.
If you could trust that stuff made in China has the AWG that it's rated for, then Nvidia's new standard is great - 14AWG wire can carry up to 90W per cable pair, making for a compact, compliant, space-efficient connector.

The reason the current 8-pin PCIe power connector only supports 150W over three cable pairs is that the PCI-SIG knows that stuff gets made in China, where everything is a con unless triple-checked at every stage. They downgraded a 90W cable pair to 50W max because they know that a 14AWG cable is probably going to be closer to 16AWG in many cases.

Also, USB-A 2.0 refuses to die because it's ubiquitous; PCIe 6+2 power connectors are the same.
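To put numbers on that derating argument, here's a quick Python sketch of the per-pair math (the 90 W and 150 W-over-three-pairs figures are the comment's own; the 12 V rail and three 12V/ground pairs per 8-pin connector are standard):

```python
# Per-pair current at a 12 V rail, using the wattage figures cited above.

RAIL_V = 12.0
PAIRS_PER_8PIN = 3

def pair_current_a(power_per_pair_w: float) -> float:
    """Current a single 12V/ground cable pair carries at a given wattage."""
    return power_per_pair_w / RAIL_V

spec_per_pair_w = 150 / PAIRS_PER_8PIN  # 50 W per pair after the derating
print(f"{spec_per_pair_w:.0f} W/pair -> {pair_current_a(spec_per_pair_w):.2f} A")  # ~4.17 A
print(f"90 W/pair -> {pair_current_a(90):.2f} A")  # 7.50 A for genuine 14AWG wire
```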
#19
MentalAcetylide
ThrashZone: Hi,
Pigtail hell
This little piggy went to market
This little piggy stayed home
This little piggy went WHEA_UNCORRECTABLE_ERROR, all the way home :laugh:
I don't think the piggy will even make it home.

It's just getting ridiculous with the growing power demands of these newer cards. The two that I have are each capable of acting as a space heater by themselves and won't ever be used together until I get a beefy AC unit for that part of the house. Even just gaming on an RTX 3090 without any overclock will heat up a small room quickly. I can only imagine what a 4000-series version of a 3090 will do to the electric bill. If NVIDIA keeps going in this direction, I don't think very many people are going to be buying their upper-tier cards.
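For scale, a rough Python ballpark of what that kind of draw costs to run (every input here is an assumption for illustration, not a measurement):

```python
# Ballpark of the electric-bill worry above. All inputs are assumptions:
# a 450 W card at full tilt, four hours of gaming a day, $0.15/kWh.

CARD_DRAW_W = 450
HOURS_PER_DAY = 4
RATE_USD_PER_KWH = 0.15  # varies widely by region

monthly_kwh = CARD_DRAW_W / 1000 * HOURS_PER_DAY * 30
print(f"~{monthly_kwh:.0f} kWh/month, "
      f"~${monthly_kwh * RATE_USD_PER_KWH:.2f}/month for the GPU alone")
# ~54 kWh/month, ~$8.10/month
```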
#20
Chrispy_
The thing about ridiculous TDPs is that only the most devout 0.1% of fanboys will buy them. You can game just fine on a 3060 Ti, and until games become demanding enough to outstrip the capabilities of 200W cards, there's no pressing need to dive headfirst into a 450W TDP for a graphics card.

Game devs will target the largest demographic first, and that's probably the relatively modest XBSX for now - and that has a ~200W total system draw from the wall socket. The RTX 3050 already has twice as many entries in the Steam hardware survey as the 3090, despite being on the market for five weeks compared to the 3090's 18 months. Of the top 10 gaming GPUs in that hardware survey, the hungriest one uses just 170W.
#21
bonehead123
Chrispy_: there's no pressing need to dive headfirst into a 450W TDP for a graphics card.
epeen man....

luv it, luv it, gotta get sum moar of it, along with a shiny new warp core from Utopia Planitia (or 3) & a buttload of fresh-cut dilithium crystals from the Rura Penthe labor camp to power it all, hehehehe :)
#22
johnspack
Here For Good!
I want one..... wonder how it will do in my dual core pentium laptop.......
#23
InVasMani
Might as well bundle a PSU with it while they're at it, since 450W is more than my Seasonic PSU is rated for.
ncrs: Well, NVIDIA bought 3dfx, so they have the technology!
To be fair to 3dfx, it had an Intel chip on it, so that was probably as much power consumption as the four 3dfx chips combined or more, judging by the P cores vs E cores.