Tuesday, December 31st 2024
AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors
AMD will continue using traditional PCI Express power connectors for its upcoming Radeon RX 9000 series RDNA 4 graphics cards, according to recent information shared on the Chiphell forum. There were some expectations that AMD would mimic NVIDIA, which requires the newer 16-pin 12V-2×6 connector for its GeForce RTX 50 series, but the latest information points to a more conventional approach. While AMD plans to release its next generation of graphics cards in the first quarter, most technical details remain unknown. The choice to stick with standard power connectors follows the pattern set by its recent Radeon RX 7900 GRE, which demonstrated that conventional PCI Express connectors can adequately handle power demands up to 375 W. Standard connectors also eliminate the need for adapters, a feature AMD could highlight as an advantage. An earlier leak suggested that the Radeon RX 9070 XT can draw up to 330 W at peak load.
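For a sense of where the 375 W figure comes from, here is a minimal sketch of the power-budget arithmetic, assuming the usual PCIe CEM ratings (75 W from the slot, 150 W per 8-pin, 75 W per 6-pin); the function name and layout are illustrative, not from the leak:

```python
# Back-of-the-envelope power budget for a dual 8-pin card, using the
# usual PCIe CEM ratings (assumed here): 75 W from the slot,
# 150 W per 8-pin connector, 75 W per 6-pin connector.
SLOT_W = 75
EIGHT_PIN_W = 150
SIX_PIN_W = 75

def board_power_limit(eight_pins: int = 0, six_pins: int = 0) -> int:
    """Total rated power available to the card, in watts."""
    return SLOT_W + eight_pins * EIGHT_PIN_W + six_pins * SIX_PIN_W

limit = board_power_limit(eight_pins=2)  # 75 + 2*150 = 375 W
peak = 330                               # leaked RX 9070 XT peak draw
print(f"Rated budget: {limit} W, leaked peak: {peak} W, headroom: {limit - peak} W")
```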
Intel reportedly cited similar reasons for using standard power connectors on its Arc "Battlemage" graphics cards, suggesting broader industry support for maintaining existing connection standards. NVIDIA's approach differs: the company reportedly requires all board partners to use the 12V-2×6 connector for the RTX 50 series, removing the option of traditional PCI Express power connectors. In contrast, AMD's decision gives its manufacturing partners more flexibility in their design choices, and even the MBA (Made by AMD) reference cards don't adopt the new 12V-2×6 standard. Beyond the power connector details and a general release timeframe pointing to CES, AMD has revealed little about the RDNA 4 architecture's capabilities. Only the reference card's physical appearance and naming scheme appear to be finalized, leaving performance questions unanswered; early, underwhelming performance leaks should be treated as unreliable until final drivers and optimizations land.
Sources:
Chiphell, via HardwareLuxx
133 Comments on AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors
1) Why not simply build better safety margins into the new connector (12VHPWR, 12V-2×6)? Surely you aren't implying they didn't have enough space for a slightly more beefed-up connector.
2) What's stopping them from using 2 connectors?
I can't see any engineer not asking these obvious questions, so it's quite apparent that the decisions about this connector's safety margin and quality weren't made by the engineers.
Reading through this thread, a lot of the comments defending the new connector (and ironically VRAM) seem to boil down to "it isn't an issue for me or anyone I know, therefore it isn't an issue". Both sides can be correct at the same time: it not being an issue for you and the people you know doesn't mean it isn't an issue at all. What it amounts to is owners letting entitlement and feelings shout down the opinions of others while ignoring the facts.
Regardless of opinions, we should all be able to agree on two things: 1) Any increase in failure rate, particularly in a power cable, is unacceptable. 2) It would be better for customers if Nvidia included more VRAM.
If you cannot agree on the two above, your feelings have overridden the facts.
Blame the manufacturer for making such a stupid design where that could happen.
Both of mine have regular 2x8-pin, which I run off an 850 W PSU.
But as you've said yourself, just because it's not in the news doesn't mean it ain't happening. Happy reading; these are not 12VHPWR, I assure you, they are the safe, non-fire-hazardous 8-pins :D
www.reddit.com/search/?q=8pin+melted+connector&cId=3fb1a911-885f-44b8-83dc-15ccf54a651e&iId=9f25cd7f-5b64-43e5-b785-31556aea5336
The issue is that the connector is designed with a smaller safety margin and less tolerance for error than the 8-pin connector.
Saying it works fine for you is just confirmation bias, bringing your feelings for the brand you like into the argument. Gamers Nexus was the only outlet to scientifically test the connector, probably more thoroughly than NVIDIA did, because if the engineers had tested it properly they would have found the issues with it. And although scientific testing doesn't always map to real-life usage, the fact remains that there is little safety margin compared to the 8-pin connector.

Neither are you. The thread is already off topic and I'm not going to derail it further with links. But if you consider 3090s getting bricked by a game, and the 3090 Ti overheating because VRAM was placed on the back of the card, minimal issues, then it's clear you're bringing feelings into it. And I would consider saying 8 GB is enough, without proof, an extreme statement, because many tech YouTubers have debunked it. If you have to go looking for a melted 8-pin connector, then you should know what the problem is; like I said, it's confirmation bias from people who want to ignore the facts.
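To put rough numbers on the safety-margin argument, here is a hypothetical back-of-the-envelope comparison; the per-pin current ratings below are assumptions based on commonly cited terminal figures, not official spec values:

```python
# Rough safety-factor comparison between the 8-pin PCIe connector and
# 12VHPWR. Per-pin current ratings are ASSUMPTIONS based on commonly
# cited terminal figures, not official spec values.
V = 12.0

def safety_factor(power_pins: int, amps_per_pin: float, rated_watts: float) -> float:
    """Ratio of theoretical connector capacity to its rated power."""
    capacity = power_pins * amps_per_pin * V
    return capacity / rated_watts

# 8-pin PCIe: 3 live 12 V pins, ~8 A per terminal assumed, rated 150 W.
print(f"8-pin:   {safety_factor(3, 8.0, 150):.2f}x")   # ~1.92x
# 12VHPWR: 6 live 12 V pins, ~9.2 A per terminal assumed, rated 600 W.
print(f"12VHPWR: {safety_factor(6, 9.2, 600):.2f}x")   # ~1.10x
```

Under these assumptions the 8-pin has nearly twice its rated power in reserve, while 12VHPWR has barely 10 percent, which is the "lesser margin of safety" point in plainer terms.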
FYI there are reports of 3090 connectors burning up as well: nvidia/comments/yj4sga
That it wasn't as prevalent was probably due to the defect present in the original 12VHPWR connector that allowed it to run while only partially plugged in, and the fact that the connector was not as widely rolled out. 12V-2×6 claims to fix that particular defect.

There are several forces outside the engineers designing a product that get significant influence over its design. "Me and my friends don't have an issue with the connector" is not mutually exclusive with statements like "this connector has a higher failure rate"; both can be true. One is an opinion, while the other looks at the broader picture using data. You are mistaken: I pointed out that comments defending the new connector are based on opinion because they are, in fact, opinions. The evidence against the connector is data-driven, and it shows the connector has a higher failure rate.
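For context on the partial-insertion fix: the connector's two sideband sense pins advertise the cable's power capability, and 12V-2×6 recesses them so they disconnect first on a loose plug. A sketch of the sense-pin encoding as commonly reported for PCIe CEM 5.x / ATX 3.x follows; treat the exact wattage table as an assumption:

```python
# Sketch of the 12VHPWR/12V-2x6 sideband sense-pin encoding as commonly
# reported for PCIe CEM 5.x / ATX 3.x (wattages assumed, not verified
# against the spec text). The card reads SENSE0/SENSE1 (grounded or
# open) to learn the cable's power limit. In 12V-2x6 the sense pins are
# shortened, so a partially seated connector reads open/open and the
# card falls back to the lowest limit instead of pulling full power
# through poorly mated contacts.
SENSE_TABLE = {
    # (sense0_grounded, sense1_grounded): max sustained watts
    (True,  True):  600,
    (False, True):  450,
    (True,  False): 300,
    (False, False): 150,  # also what a partially inserted 12V-2x6 reports
}

def cable_power_limit(sense0_grounded: bool, sense1_grounded: bool) -> int:
    return SENSE_TABLE[(sense0_grounded, sense1_grounded)]

print(cable_power_limit(True, True))    # fully seated 600 W cable -> 600
print(cable_power_limit(False, False))  # partially seated connector -> 150
```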
You agree later in your comment with my point that any increase in failure rate in a power cable is unacceptable. If you have data that shows otherwise, please share it; otherwise you've presented a contradiction here. Sigh, there are plenty of examples of TPU and HWUB benchmarks showing VRAM usage exceeding 8 GB (used, not allocated), and there are videos showing the ill effects of it.
Regardless of whether a game sees a performance dip or not, it's wild that you cannot agree that consumer cards would be better off with a bit more VRAM.
Anyway, with progress slowing down, this is not a biggie. Compare spending $250 on a GPU that will run out of steam in 3 years (RTX 2080 Ti, roughly $83 a year) with spending $450 on a GPU that'll be fine for 6 years (RTX 3080 Ti, roughly $75 a year).
1. You need to prove it's ridiculous.
2. I don't wanna explain my point in even more basic language than I already did.
But if you insist, then let's say some guy bought a GTX 1080 Ti in 2017 and at some point in 2024 upgraded to a 4080, because he only just now ran out of games that run fine on the 1080 Ti. That roughly translates to a +170% raw performance boost with many more features supported. A serious upgrade; massive, even. The same guy will be fine with his 4080 until about the mid-2030s (when it dies), and he upgrades only because his GPU has given up. Why do I think so? Because in the current state of the market, nothing suggests rapid gen-to-gen improvements. Not limited enough to be a concern, unless we're talking about the 4070 Ti, which is too expensive for the VRAM it sports.
RT performance on AMD GPUs doesn't even exist, so whatevs. I was talking about the discrete desktop GPU market ONLY. That's the market where AMD is happily committing suicide, and I dunno how or why that could be unclear from my statement.
In TPU testing, the Intel Arc B580 does pull ahead of the 4060 Ti 8 GB in Hogwarts Legacy at 1440p. Agreed, I'm done replying to him after he resorted to throwing insults; he keeps asking for proof even though people have posted it, yet he hasn't provided any to support his own claims.
Both sides are butthurt about it. Nvidia made a new connector with the same safety factor but a better-designed interface between the contacts to allow more reliable mating.
TL;DR: it's partly marketing and partly reasonable to call out the issues with the new, unproven connector and its lower-than-usual safety factor.