Monday, February 19th 2024

NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

NVIDIA is reportedly looking to change the power connector standard for the fourth successive time in a span of three years with its upcoming GeForce RTX 50-series "Blackwell" GPUs, Moore's Law is Dead reports. NVIDIA began its post 8-pin PCIe journey with the 12-pin Molex MicroFit connector for the GeForce RTX 3080 and RTX 3090 Founders Edition cards. The RTX 3090 Ti would go on to standardize the 12VHPWR connector, which the company would debut across a wider section of its GeForce RTX 40-series "Ada" product stack (all SKUs with a TGP of over 200 W). In the face of rising complaints about the reliability of 12VHPWR, some partner RTX 40-series cards are beginning to implement the pin-compatible but sturdier 12V-2x6. The implementation of the 16-pin PCIe Gen 6 connector would be the fourth power connector change, if the rumors are true. A different source says that rival AMD has no plans to move away from the classic 8-pin PCIe power connectors.

Update 15:48 UTC: Our friends at Hardware Busters have reliable sources in the power supply industry with equal access to the PCIe CEM specification as NVIDIA, and say that the story of NVIDIA adopting a new power connector with "Blackwell" is likely false. NVIDIA is expected to debut the new GPU series toward the end of 2024, and if a new power connector was in the offing, by now the power supply industry would have some clue. It doesn't. Read more about this in the Hardware Busters article in the source link below.

Update Feb 20th: An earlier version of this article incorrectly reported that the "16-pin connector" is fundamentally different from the current 12V-2x6, with 16 pins dedicated to power delivery. We have since been corrected by Moore's Law is Dead: it is in fact the same 12V-2x6, but specified under the updated PCIe 6.0 CEM specification.
Sources: Moore's Law is Dead, Hardware Busters

106 Comments on NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

#51
Knight47
dgianstefaniBesides, as people miss, this isn't NVIDIA making these connectors, it's PCI-SIG or other standardization consortiums.
Yeah, it's Intel and AMD who forced the 12VHPWR into existence and NVIDIA got baited; Intel isn't even using their own connector. It was a Team Blue + Team Red plan to ruin Team Green.
Posted on Reply
#52
dgianstefani
TPU Proofreader
Knight47Yeah, it's Intel and AMD who forced the 12VHPWR into existence and NVIDIA got baited; Intel isn't even using their own connector. It was a Team Blue + Team Red plan to ruin Team Green.
1. Intel doesn't have any consumer GPUs that come close to the power ratings of this connector (yet).
2. AMD has already indicated its interest in using the 16-pin in their future products.
Posted on Reply
#53
gurusmi
DY69SXWot the hell are you talking about!! No one is forcing you to do anything! Yet!! And you are crying like your life depends on it!!
To explain it a bit, especially for you: I own a company and sell products via the internet. The delivery seems to be free of charge, but that is only on the surface. I have the price I want to get in the end, I add the postal fees for worldwide transportation, and that total is what I set as the selling price. So everybody pays for transportation, but since it is announced as free, everybody is happy. With adapters it's the same. The producer adds one and folds the cost of that adapter into the price. All the idiots are happy to get that adapter for "free".
DY69SXAnd to answer why adapters are produced: to give you the possibility of choice!! But anyway you won't buy an RTX 50, so why are you crying?
Nope. If I want to use the nVidia card I will buy the adapter. I don't rely on the producer to provide an adapter. The rig I'm building costs around 7,700€; 20€ more or less is not worth thinking about. But I learned already as a child that I should not use things and throw them in the garbage straight after. That's against my upbringing. And you're right, I won't buy an nVidia card because I don't want to be a beta tester. After that rig is built it has to earn money for the next 3-4 years.
Posted on Reply
#54
ThrashZone
Hi,
Well obviously NV had zero QC over any of this connector crap.
Hell, are they going to color-code this one so users know which one they get? lol
Posted on Reply
#55
Random_User
Panther_SeraphinSo anyone who bought a new PSU with the "new" 12V connector basically got Jebaited
They did it intentionally, just to watch consumers bend over at their own expense. They've been locking people into their own ecosystem, even with a proprietary connector. Nvidia tests the waters to see how far they can go with the most ridiculous ideas, to close the gap with Apple without taking any loss. And people let them. At this point this looks exactly like the fashion industry, and Apple already has one foot in that area.
Hecate91Having a more reliable power connector is one reason I went with an AMD GPU; I got tired of nvidia's greed and planned obsolescence with VRAM.
It seems like Nvidia went with a new connector for aesthetic reasons, because they couldn't fit 3x 8-pin on their already cost-cut GPU design with a triangle cutout on the board.
Aesthetic reasons? There can be no aesthetics in a device that has the size of an actual clay brick and still has no space to accommodate three reliable 8-pin connectors. And the current one is still located in the middle of... the worst possible spot on the card, with no space for the wires to go.
Hecate91The manufacturing tolerances in datacenter hardware aren't the same as in consumer hardware; 12VHPWR isn't suitable for consumer use when there is no room for error, unlike 8-pin & 6+2-pin Molex, where even the cheapest garbage connector will fit and you can tell when it clicks into place.
Having another new connector harms anyone who paid a premium for an ATX 3.0 power supply. I'd also rather have a PCB be another few inches longer with the tradeoff of knowing the connector won't melt.
To be honest, in all these years I haven't come across any reports of melted 6/8-pin connectors on a graphics card, outside of some outrageous amateur overclocking. Maybe that's why Corsair never joined this scam bandwagon.
There definitely might be a place for it in the enterprise area. But knowing how prone to failure this 12VHPWR can be in terms of the attention and rigour it demands, I strongly doubt the enterprise/server market is a great place for it. In an area where there are hundreds if not thousands of cards being connected simultaneously, it's the last place to worry about whether one wire can accidentally pop out of a connector. Not to mention the connection should be just "click and ready", without ever wasting a second on whether it's "really" inserted or not.
TheDeeGeeA change to make it safer... everyone complains... lol

The world is so fucked.
The issue is not the effort to make it safer. The problem is that everyone was saying the 12VHPWR was garbage from the very beginning. But Nvidia was trying to claim the higher ground and reduce this to user error. Even GN got baited into this and went into Nvidia-advocate mode. Yes, they mentioned the issue. But it doesn't help to diminish the responsibility of a trillion-dollar company that pushes an untested connector on every user.
And now they come out with a "possibly" new design, and still might look like the savior.
Vayra86The world is more fucked when it comes to thinking how this ever passed common-sense QA. Also I'm still in wait-and-see mode wrt this happening. The source is MLID.

None of this should have ever happened. The 4090 is the only card that would need an extra 8-pin. And it's a fucking brick-sized GPU. Space enough.

It's always good to keep going back to the why. Who benefits here? The 4090 with 12VHPWR adapter is too wide for most cases. It's fucking useless. It's placed in the wrong location. Need we go on?
Exactly. There's no way such a huge company, which sells millions of cards, can't find the resources to test the effin' plug and socket that go into this connector. Instead they probably put money into getting the "techtuber gang" to patter this issue away.
dgianstefaniMaybe the 16 pin was on the table before NVIDIA knew AMD wasn't making a high end RDNA4, but it seems NVIDIA has dropped their multi-chip flagship card as there will be no competition for them in the high end. A dual die halo SKU would probably have needed more than 600 W.
Looks like a valid point. Since they have no challenger, there's no pressure to even put effort into making a new connector. And even if it is real, it's more likely to appear in AI/enterprise first this time.
btarunrUpdate 15:48 UTC: Our friends at Hardware Busters have reliable sources in the power supply industry with equal access to the PCIe CEM specification as NVIDIA, and say that the story of NVIDIA adopting a new power connector with "Blackwell" is likely false. NVIDIA is expected to debut the new GPU series toward the end of 2024, and if a new power connector was in the offing, by now the power supply industry would have some clue. It doesn't. Read more about this in the Hardware Busters article in the source link below.
Rumour or not, this story might have had some basis, as dgianstefani mentioned. But I guess even if it were real, no PSU manufacturer would admit it, because it would cause an even bigger outrage for bringing yet another connector after selling more expensive PSUs with no real future-proofing. They might not want to step into the same cr*p twice, and this time would rather let the "GPU vendor" get their sh*t sorted out first before pushing it onto others.
Dirt ChipA new day, a new PSU connector standard.
After the sh*tfest the USB "standardising" consortium made the entire world deal with, the PCI-SIG, it seems, is inclined to join the clown show. There's no trust left for these organisations that let corporations push their stuff for high margins and do nothing to prevent these things from happening. Same goes for numerous regulating entities.
Posted on Reply
#56
SOAREVERSOR
gurusmiI will be taking an AMD GPU for my next rig. Just smiling about nVidia. How many tries will they need to get a new solid connector? AMD needs the same power and still uses the old reliable connector.
It's a non-issue as most GPUs sold are not going to be the high-end ones that caused all the problems. The fact that a ton of people who now buy high-end GPUs have no real business building a computer makes for a lot of great hilarity, but ultimately who cares!
Posted on Reply
#57
gurusmi
SOAREVERSORIt's a non-issue as most GPUs sold are not going to be the high-end ones that caused all the problems. The fact that a ton of people who now buy high-end GPUs have no real business building a computer makes for a lot of great hilarity, but ultimately who cares!
I will buy a 7900 XT. That one can drive my two 40" UWQHD monitors at their refresh rate of 155 Hz. The 7900 XT is definitely a high-end card. This card will be water cooled, like my 4 NVMe SSDs and 2 RAM sticks. I don't care what others do or don't do; that's their story, not mine. But I don't want to be kidded by manufacturers, and I don't want to be a test subject.
Posted on Reply
#58
Dawora
gurusmiI will be taking an AMD GPU for my next rig. Just smiling about nVidia. How many tries will they need to get a new solid connector? AMD needs the same power and still uses the old reliable connector.
It's better to take an Nvidia GPU, because AMD won't have high-end GPUs in the future.

And I'm sure you want to take a very expensive GPU, but try not to get caught when you take it.
Posted on Reply
#59
Minus Infinity
Let's see Tom weasel his way out of this one. He'll go on the attack for sure and say he was taken out of context or some such BS.
Posted on Reply
#60
Dr. Dro
GodisanAtheist-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.
Bingo.
gurusmiNope. If I want to use the nVidia card I will buy the adapter. I don't rely on the producer to provide an adapter. The rig I'm building costs around 7,700€; 20€ more or less is not worth thinking about. But I learned already as a child that I should not use things and throw them in the garbage straight after. That's against my upbringing. And you're right, I won't buy an nVidia card because I don't want to be a beta tester. After that rig is built it has to earn money for the next 3-4 years.
Bad idea. The adapter supplied by Nvidia is easily the highest quality adapter available, and note I mentioned supplied by Nvidia: all FE and AIB models come with one.

The only upgrade from that is procuring a native cable that is directly compatible with your power supply. Fortunately, most high-end power supplies have received third-party cables; for example, the EVGA G2/P2/T2 and corresponding 1st generation Super Flower Leadex (from which they are derived) power supplies will work just fine with the CableMod E-series cable, and I'm sure options are available if you don't like that company for whatever reason. Corsair provides first-party cables for most of their high-capacity (750 W+) units. In the absence of compatible cables (for example some CWT/HEC/Andyson low or midrange power supplies by EVGA or Thermaltake), your best bet is to use the supplied 3- or 4-way 8-pin to 12VHPWR adapter cable.
Posted on Reply
#61
Crackong
RahkShaHopefully this means they have built in extra redundancy. 8-pin connectors have 30%-60% additional wattage capacity over their rated 150 W load; the 12VHPWR only has 10% or so.
According to der8auer.
The 8-pin spec is 216-288 W on a 150 W rating = 44%-92% additional wattage capacity,
and 12VHPWR is 660 W on a 600 W rating = 10%.

So the 12vhpwr is trading safety factor for smaller size.
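As a rough sketch in Python of just that arithmetic (the 216-288 W and 660 W ceilings are der8auer's figures quoted above; nothing here is measured):

```python
# Extra capacity above the rated load, using the figures quoted above.
def headroom(max_watts: float, rated_watts: float) -> float:
    """Fractional headroom: how much the connector can carry beyond its rating."""
    return max_watts / rated_watts - 1

print(f"8-pin (low-end terminals):  {headroom(216, 150):.0%}")  # ~44%
print(f"8-pin (quality terminals):  {headroom(288, 150):.0%}")  # ~92%
print(f"12VHPWR:                    {headroom(660, 600):.0%}")  # ~10%
```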

Posted on Reply
#62
Nihillim
dgianstefaniI like the modern tiny PCBs with a single connector, makes waterblocking cards nice and compact.
Great for the folks who fall into that group. The air-cooling crowd is far larger though, and they're getting screwed, because:
Vayra86The 4090 with 12VHPWR adapter is too wide for most cases. It's fucking useless. It's placed in the wrong location. Need we go on?
This even applies to the 4080 in some cases.
Posted on Reply
#63
N/A
CrackongAccording to der8auer.
So the 12vhpwr is trading safety factor for smaller size.
Says der8auer. Says Igor; what does he know about it. Well, both Micro-Fit+ and Mini-Fit are rated up to 13 A, same, no difference.
As a result the overprovision is cut in half when 4x 8-pin are replaced by a single 16-pin.
For a 300-450 W GPU, 150-300 W of safety margin remains,
but if quality terminals are used it is really 13 A × 12 V × 6 = 936 W, woot. 2x-3x safety overprovision.
Posted on Reply
#64
Crackong
N/AFor a 300-450 W GPU, 150-300 W of safety margin remains,
In your calculation:
660 W / (300-450 W) = 2.2-1.47 safety factor

While 300-450 W is 0.5-0.75 of the maximum rated power of the 12VHPWR connector,
the comparable load for a regular 8-pin would be 75-112.5 W:
288 W / (75-112.5 W) = 3.84-2.56 safety factor

Still, the 8-pin is much, much safer than the 12VHPWR.

And for your '13 A × 12 V × 6 = 936 W' calculation:
first, the 12VHPWR connector doesn't use 13 A pins within spec,
so your 13 A calculation is out of spec in the first place.
The safety factor will be:

936 W / 600 W = 1.56 safety factor

In a fair comparison, the 8-pin with 13 A rated pins in it is now boosted to 468 W out of spec (13 A × 12 V × 3 power pins).
And its safety factor will be:

468 W / 150 W = 3.12 safety factor

You just made the 8-pin twice as safe as the 12VHPWR.
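
A minimal Python sketch of the same comparison (assuming the 660 W / 288 W in-spec ceilings and the 13 A terminal figures discussed above; back-of-the-envelope only, nothing measured):

```python
# Safety factors implied by the numbers in this thread:
# safety_factor = connector's maximum deliverable power / load it actually carries

def safety_factor(max_watts: float, load_watts: float) -> float:
    return max_watts / load_watts

# In-spec ceilings: 12VHPWR ~660 W, 8-pin ~288 W (der8auer's figures quoted earlier)
for load in (300, 450):
    share_8pin = load * 150 / 600  # the proportional share one 8-pin would carry
    print(f"{load} W GPU: 12VHPWR {safety_factor(660, load):.2f} "
          f"vs 8-pin {safety_factor(288, share_8pin):.2f}")

# Out-of-spec comparison, giving both connectors 13 A terminals:
print(f"12VHPWR @ 13 A: {safety_factor(13 * 12 * 6, 600):.2f}")  # 936/600 = 1.56
print(f"8-pin   @ 13 A: {safety_factor(13 * 12 * 3, 150):.2f}")  # 468/150 = 3.12
```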
Posted on Reply
#65
SOAREVERSOR
dgianstefaniThe connectors make sense. They're used in industry and datacentres at massive scale without apparent issue.

The fact that connectors evolve harms who? You get a free adapter in the box.

I like the modern tiny PCBs with a single connector, makes waterblocking cards nice and compact.
The issue is that back in the day building a computer took some remote level of skill. It hasn't since the early 2000's and it's been getting worse. So now we have a legion of fucking morons who should never be allowed near safety cutters building computers while screaming about the problems they hit, demanding companies make less money so their g4mz04rzzz PC can be faster, and also demanding companies like Nintendo go bankrupt just to help m4h g4m1inG PC!!!.

That is what you are dealing with.
Posted on Reply
#66
Dr. Dro
SOAREVERSORThe issue is that back in the day building a computer took some remote level of skill. It hasn't since the early 2000's and it's been getting worse. So now we have a legion of fucking morons who should never be allowed near safety cutters building computers while screaming about the problems they hit, demanding companies make less money so their g4mz04rzzz PC can be faster, and also demanding companies like Nintendo go bankrupt just to help m4h g4m1inG PC!!!.

That is what you are dealing with.
No argument there. Building computers has become far less of a "nerd's pastime" than it used to be, but it's not without its positive sides, I suppose.
Posted on Reply
#67
Crackong
SOAREVERSORThe issue is that back in the day building a computer took some remote level of skill. It hasn't since the early 2000's and it's been getting worse. So now we have a legion of fucking morons who should never be allowed near safety cutters building computers while screaming about the problems they hit, demanding companies make less money so their g4mz04rzzz PC can be faster, and also demanding companies like Nintendo go bankrupt just to help m4h g4m1inG PC!!!.
That is what you are dealing with.
Totally.

But since they can't cherry-pick their customers, the only thing they can do is assume everyone is an idiot and make things more idiot-proof.
Instead of making a more idiot-proof connector they made an 'idiot-prone' one, and so the backlash comes...
Posted on Reply
#68
gurusmi
Dr. DroBingo.



Bad idea. The adapter supplied by Nvidia is easily the highest quality adapter available, and note I mentioned supplied by Nvidia: all FE and AIB models come with one.

The only upgrade from that is procuring a native cable that is directly compatible with your power supply. Fortunately, most high-end power supplies have received third-party cables; for example, the EVGA G2/P2/T2 and corresponding 1st generation Super Flower Leadex (from which they are derived) power supplies will work just fine with the CableMod E-series cable, and I'm sure options are available if you don't like that company for whatever reason. Corsair provides first-party cables for most of their high-capacity (750 W+) units. In the absence of compatible cables (for example some CWT/HEC/Andyson low or midrange power supplies by EVGA or Thermaltake), your best bet is to use the supplied 3- or 4-way 8-pin to 12VHPWR adapter cable.
I really don't know why I should use adapters that I don't need. I won't buy an nVidia 4080 Super or above. I don't see any reason to pay at least 1,100€ for an nVidia 4080S when I can get an AMD 7900 XT for 780€. I'm not in the habit of throwing my money out of the window. There is not even one rational benefit I would get in return for the higher price. I'm not so braindead as to buy a GPU only because of its more or less good reputation. Additionally, I don't care about high-end or low-end. As long as the card does what I want it to do, everything is fine for me.

Also, I guess the quality of the cables of my newly bought Corsair HX1500i PSU is not that bad. Because of the size of my case I did need extensions; those are CableMod ones. I have had the same extensions in use inside my current rig for 4 years. If nVidia is not able to respect what a standard is? I'm sorry, then they are not worth my attention. The older PCIe power standards have proven their reliability and trustworthiness for years. Making something new only because it is new is really stupid. A standardization process exists to ensure that the thing described is available for use over a longer time frame. Just imagine a screw producer introducing new screw sizes every year: M2.7, M3.3, ...

I'm not a little boy building a PC, and I'm not someone who needs a "good boy" pat to compensate for low or non-existent self-esteem. Aside from all that, nVidia is more or less trash in the world of Linux.
Posted on Reply
#69
Vayra86
Knight47Yeah, it's Intel and AMD who forced the 12VHPWR into existence and NVIDIA got baited; Intel isn't even using their own connector. It was a Team Blue + Team Red plan to ruin Team Green.
That's... pretty far out there. Source? Or just conjecture?
Posted on Reply
#70
Dr. Dro
gurusmiI really don't know why I should use adapters that I don't need. I won't buy an nVidia 4080 Super or above. I don't see any reason to pay at least 1,100€ for an nVidia 4080S when I can get an AMD 7900 XT for 780€. I'm not in the habit of throwing my money out of the window. There is not even one rational benefit I would get in return for the higher price. I'm not so braindead as to buy a GPU only because of its more or less good reputation. Additionally, I don't care about high-end or low-end. As long as the card does what I want it to do, everything is fine for me.

Also, I guess the quality of the cables of my newly bought Corsair HX1500i PSU is not that bad. Because of the size of my case I did need extensions; those are CableMod ones. I have had the same extensions in use inside my current rig for 4 years. If nVidia is not able to respect what a standard is? I'm sorry, then they are not worth my attention. The older PCIe power standards have proven their reliability and trustworthiness for years. Making something new only because it is new is really stupid. A standardization process exists to ensure that the thing described is available for use over a longer time frame. Just imagine a screw producer introducing new screw sizes every year: M2.7, M3.3, ...

I'm not a little boy building a PC, and I'm not someone who needs a "good boy" pat to compensate for low or non-existent self-esteem. Aside from all that, nVidia is more or less trash in the world of Linux.
Other than the 4080 Super being a better card in every conceivable way (it is on the 7900 XTX's level, after all - not the XT's) and using far less power than the competition to achieve it? I don't get what you're trying to say here. If power efficiency is a concern (because energy prices in Europe are insane) you've purchased the wrong hardware. The oversized 1500 W power supply could be replaced by a more efficient one (such as the Titanium-rated AX1000), and the Nvidia card would win out in power consumption every time and under any circumstances, because the RTX 4080 (and Super) use significantly less power than the AMD counterparts. The only justification I see for AMD, if you can afford a 4080 Super, is if you use Linux as your main OS, which, well... you seem to.

The 12VHPWR connector doesn't automatically translate to "I chug 600 W, hear me roar!". Using DLSS and especially DLSS-G in supported games would help you retain image quality while lowering power consumption even further. Starfield with DLAA-G (native resolution DLSS with frame generation) at ultra high settings at 4K/120 runs sub-200 W on my 4080, and that's despite it being a Strix with a sky-high power limit and without any undervolting or curve optimization involved.
Posted on Reply
#71
gurusmi
I don't know why y'all try to force me into the green camp. It's like I need a truck to deliver things downtown and everybody is talking about using it on the Interstate.

1. It's quite easy: I decide what card I take because I pay for it. Also, the power consumption story is a fairy tale. The 4080 Super needs up to 340 W, the 7900 XT needs 315 W, both with stock OC. The AMD card could use a lot more power before it made up for the higher purchase price of an nVidia 4080/4080S, and with the higher estimated power needs of a 4080S that will never happen. Btw, in my area the 4080 and the 4080S are at the same price level; the 4080 uses up to 320 W, so 20 W less than the 4080S. I definitely don't see (in your words) significantly less power every time and under any circumstances. All the wattage figures are taken from Geizhals, a German price search engine. Btw, I also own a brand new MSI 1,000 W PSU, but that doesn't suit the power needs.

2. Whether the 4080 Super is more powerful doesn't matter to me, as I need the card to drive my monitors. I don't use programs that utilize nVidia's unique techniques, "blablabla". I don't play games on my PC; if I want to play games I invite friends over, sit together with them and play analog games like Monopoly. I'm a member of Gen X: I learned to use my head, while nowadays everybody seems to use AI instead. To take my example from above: I don't need that huge long-haul truck. It is enough that the truck can carry the weight downtown.

Far away from your experience, my build runs mainly on Linux, showing the desktop on two parallel UWQHD (3440x1440 px) monitors at a 155 Hz refresh rate. I don't give a sh*t about how good which card is at games. I work on the rig and it has to make money. I develop software (Free Pascal/Lazarus, Gambas), calculate in spreadsheets (Excel/LibreOffice) with a hell of a lot of self-written macros, scan objects in 3D, prepare those scans for 3D printing, slice them, etc. If I did photogrammetry I could use the power of a 4080S, but I don't. All of my workload is far, far away from anything you are thinking about.
Posted on Reply
#72
GerKNG
Or we just stick to 8-pins as usual and stop limiting GPUs to 150 W per plug.
Two 8-pins are more than enough for a 4090 Strix OC.
Posted on Reply
#73
Keullo-e
S.T.A.R.S.
I truly hope that AMD stays away from these melting MacGyvered connectors.
Posted on Reply
#74
Dr. Dro
gurusmiI don't know why y'all try to force me into the green camp. It's like I need a truck to deliver things downtown and everybody is talking about using it on the Interstate.

1. It's quite easy: I decide what card I take because I pay for it. Also, the power consumption story is a fairy tale. The 4080 Super needs up to 340 W, the 7900 XT needs 315 W, both with stock OC. The AMD card could use a lot more power before it made up for the higher purchase price of an nVidia 4080/4080S, and with the higher estimated power needs of a 4080S that will never happen. Btw, in my area the 4080 and the 4080S are at the same price level; the 4080 uses up to 320 W, so 20 W less than the 4080S. I definitely don't see (in your words) significantly less power every time and under any circumstances. All the wattage figures are taken from Geizhals, a German price search engine. Btw, I also own a brand new MSI 1,000 W PSU, but that doesn't suit the power needs.

2. Whether the 4080 Super is more powerful doesn't matter to me, as I need the card to drive my monitors. I don't use programs that utilize nVidia's unique techniques, "blablabla". I don't play games on my PC; if I want to play games I invite friends over, sit together with them and play analog games like Monopoly. I'm a member of Gen X: I learned to use my head, while nowadays everybody seems to use AI instead. To take my example from above: I don't need that huge long-haul truck. It is enough that the truck can carry the weight downtown.

Far away from your experience, my build runs mainly on Linux, showing the desktop on two parallel UWQHD (3440x1440 px) monitors at a 155 Hz refresh rate. I don't give a sh*t about how good which card is at games. I work on the rig and it has to make money. I develop software (Free Pascal/Lazarus, Gambas), calculate in spreadsheets (Excel/LibreOffice) with a hell of a lot of self-written macros, scan objects in 3D, prepare those scans for 3D printing, slice them, etc. If I did photogrammetry I could use the power of a 4080S, but I don't. All of my workload is far, far away from anything you are thinking about.
Theoretical maximums don't equate to real power consumption figures; it's well known the RX 7900 series can chug about as much power as a 4090. W1zz even tested a graphene pad with the 7900 XTX at 475 W:

www.techpowerup.com/review/thermal-grizzly-kryosheet-amd-gpu/

Anyway, since you're a Linux user, that means you don't really have the option to run Nvidia IMHO. Not only are the things that make having an Nvidia card worth it unavailable to you, maintaining the drivers under Linux sucks as well. They are more power efficient, but they also mean you need to run Windows. Bit of a moot point.
ChloefileI truly hope that AMD stays away from these melting MacGyvered connectors.
Unlikely IMO, and the only reason the 7900 XTX didn't have them is that the hardware design was finalized by the time those began to roll out. Other cards just rode the wave, primarily as a publicity stunt: "hey look, ours don't blow up".
Posted on Reply
#75
theouto
It was disproved by a more reputable source (I hope), but this just means that we can expect some high power consumption figures.

I guess nvidia is scared of Intel's 400 W CPUs, so they need to assert their dominance.
Posted on Reply