Tuesday, December 31st 2024
AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors
AMD will continue using traditional PCI Express power connectors for its upcoming Radeon RX 9000 series RDNA 4 graphics cards, according to recent information shared on the Chiphell forum. While some expected AMD to mimic NVIDIA's approach - the GeForce RTX 50 series requires the newer 16-pin 12V-2×6 connector - the latest information points to conventional 6/8-pin connectors instead. AMD plans to release its next generation of graphics cards in the first quarter, but most technical details remain unknown. The company's choice to stick with standard power connectors follows the pattern set by its recent Radeon RX 7900 GRE, which demonstrated that conventional PCI Express connectors can adequately handle power demands up to 375 W. The standard connectors also eliminate the need for adapters, a feature AMD could highlight as an advantage. An earlier leak suggested that the Radeon RX 9070 XT could draw up to 330 W of power at peak load.
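The arithmetic behind that 375 W figure follows directly from the per-connector limits in the PCI Express CEM specification. A minimal sketch (the wattage limits are the published spec values; the card configurations are purely illustrative):

```python
# Nominal power limits from the PCIe CEM specification, in watts.
SLOT_POWER = 75    # a PCIe x16 slot delivers up to 75 W
SIX_PIN    = 75    # 6-pin auxiliary connector
EIGHT_PIN  = 150   # 8-pin auxiliary connector

def board_power_budget(aux_connectors):
    """Total board power available from the slot plus auxiliary connectors."""
    return SLOT_POWER + sum(aux_connectors)

# Two 8-pin connectors match the 375 W ceiling cited for the RX 7900 GRE...
print(board_power_budget([EIGHT_PIN, EIGHT_PIN]))  # -> 375
# ...which also covers the ~330 W peak draw leaked for the RX 9070 XT.
```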
Intel reportedly cited similar reasons for using standard power connectors on its Arc "Battlemage" graphics cards, suggesting broader industry support for maintaining existing connection standards. NVIDIA, by contrast, reportedly requires all board partners to use the 12V-2×6 connector for the RTX 50 series, removing the option of traditional PCI Express power connectors. AMD's decision gives its manufacturing partners more flexibility in their design choices, and the MBA (Made by AMD) reference cards don't enforce the new 12V-2×6 standard either. Beyond the power connector details and a general release timeframe pointing to CES, AMD has revealed little about the RDNA 4 architecture's capabilities. Only the reference card's physical appearance and naming scheme appear to be finalized, leaving questions about performance specifications unanswered; early, underwhelming performance leaks should be treated as unreliable until final drivers and optimizations land.
Sources:
Chiphell, via HardwareLuxx
133 Comments on AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors
If you need 2 or 3 of those, better go with the new standard, IMO.
Anyway, if AIBs can choose which connector to use, I see no problem - all options will be available. No real right/wrong answer here.
www.techspot.com/news/99094-another-16-pin-rtx-4090-power-adapter-has.html
The problem is that the new power connector is an engineering mistake - both mechanically and electrically, it isn't up to the task at hand: carrying high currents safely.
The problem is that it is too small, too weak, too unstable.
There is a reason why AMD's engineers stick to the approved 6/8-pin power connectors - they have been proven safe for decades.
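To put rough numbers behind the safety-margin argument, here's a quick sketch comparing nominal per-pin current. The wattages and 12 V pin counts are the usual spec figures; the per-pin ratings are approximate vendor numbers and depend on the exact terminals used:

```python
# Rough per-pin current: 8-pin PCIe vs. 16-pin 12V-2x6.
def amps_per_pin(watts, volts, power_pins):
    """Current each 12 V pin carries, assuming the load is shared evenly."""
    return watts / volts / power_pins

# 8-pin PCIe: 150 W across 3 x 12 V pins (Mini-Fit Jr, rated roughly 9 A each).
print(f"8-pin:   {amps_per_pin(150, 12, 3):.2f} A/pin")   # ~4.17 A
# 12V-2x6: 600 W across 6 x 12 V pins (smaller pins, rated roughly 9.5 A each).
print(f"12V-2x6: {amps_per_pin(600, 12, 6):.2f} A/pin")   # ~8.33 A
# The 8-pin runs at under half its pin rating; the 16-pin sits much
# closer to its limit, which is exactly the margin complaint above.
```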
My last Radeon GPU was a 4890 that I overclocked to 1 GHz, LOL.
But I doubt it'll be enough for the 9070 XT. This matters in like four games, and in five more if we talk absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway), and the 3080 has enough of an edge to stay solid in every other title - especially the ones where DLSS is the only upscaler that works correctly. I would've agreed if this were a comparison with an 8 GB GPU, but 10 GB is nowhere near obsolete, and the 320-bit bus also helps a lot.
The leaks we got suggest the 9070 XT just barely outperforming the 7900 GRE, which is roughly 3090/3090 Ti territory. That's faster than a 3080, sure, but not by a lot.
Today the bare minimum is 14 GB, but in order to stay future-proof for at least 2-3 years, you need 20 GB.
Take the most recent TPU GPU review: the hardest benchmarking mode possible, 2160p. No DLSS, everything on Ultra (RT off, though), no slacking. And still, the 3080 is only 14% behind the 3090. It beats the 7800 XT despite having less VRAM, and it doesn't trail the 7900 GRE by much - just a tiny 7.5% gap.
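(Those percentages are just ratios of the averaged FPS in the summary chart; a toy example with made-up numbers to show how the gaps are computed:)

```python
# How the gaps above fall out of averaged FPS. These FPS values are
# made up purely for illustration, not taken from the review.
def percent_behind(slower_fps, faster_fps):
    """How far the slower card trails the faster one, in percent."""
    return (1 - slower_fps / faster_fps) * 100

fps_3080, fps_3090, fps_gre = 86.0, 100.0, 93.0   # hypothetical 4K averages
print(f"3080 vs 3090:     {percent_behind(fps_3080, fps_3090):.1f}%")  # 14.0%
print(f"3080 vs 7900 GRE: {percent_behind(fps_3080, fps_gre):.1f}%")   # 7.5%
```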
I don't see how 10 GB is at all problematic at pedestrian resolutions like 1440p. Just no way. Go from Ultra textures to Medium-High and you'll find yourself with half your VRAM doing a whole lot of nothing, waiting for instructions, and the games won't look like garbage, because Ultra textures are overtuned anyway. Yes, having more is great, but you're stretching it.
I also have a 12 GB GPU, and I have never run out of VRAM in any game. Perhaps once, when I enabled settings that "run" at 20 FPS on a 4090... Other than that, "8 GB is obsolete" is only true in the sense that the leather jacket guy is too greedy and provides too little generational uplift.
It's not exactly looking like they're very frequent. What I do see is loser-YouTuber territory: big screamy face, all-caps headline, outrage!
That being said, I do have a few boxes of popcorn waiting for the 5090 release.
Buying a new GPU just to get more VRAM, without getting more performance, is stupid.
Better to go with a 5070 Ti and get a performance boost.
There will always be cases, because there are thousands of cards on the market and any hardware has an error rate. With the 4090, there will always be the odd bad connector, but after a while you can't call it a generalized problem - it affected a small percentage of users, and today it has no more relevance than the five minutes of glory it gave some YouTuber or some random post on Reddit.
Fucking preach. And yet you would still run into people saying that “well, we can do cards pulling 600 W and the cooling works, so why limit ourselves, it’s performance”. I wouldn’t grab anything above 250 W for myself, but hey, if people want space heaters, it’s their choice.
As for the connector, I would trust W1zz over outrage grifters any day of the week - if he says that over dozens of cards and thousands of plug-unplug cycles he didn’t run into any problems, and none of his acquaintances/contacts did either, then the whole thing is overblown and is just cases of user error and/or rare defective cards, which happens.
BTW, all I said is that mine works fine, and you put that idiotic "laugh" reaction on it. Sorry it didn't burn my PC down, which you would probably have liked.