Tuesday, December 31st 2024

AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors

AMD will continue using traditional PCI Express power connectors for its upcoming Radeon RX 9000 series RDNA 4 graphics cards, according to recent information shared on the Chiphell forum. While some expected AMD to mimic NVIDIA, which requires the newer 16-pin 12V-2×6 connector for its GeForce RTX 50 series, the latest information points to a more conventional power approach. AMD plans to release its next generation of graphics cards in the first quarter, but most technical details remain unknown. The company's choice to stick with standard power connectors follows the pattern set by its recent Radeon RX 7900 GRE, which demonstrated that conventional PCI Express connectors can adequately handle power demands of up to 375 W. Standard connectors also eliminate the need for adapters, a feature AMD could highlight as an advantage. An earlier leak suggested that the Radeon RX 9070 XT can draw up to 330 W at peak load.
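For context, the 375 W ceiling follows directly from the PCI Express CEM power budget: 75 W from the x16 slot, 75 W per 6-pin connector, and 150 W per 8-pin connector. Below is a minimal sketch of that arithmetic; the connector ratings are the spec values, while the example board configurations are purely illustrative.

```python
# PCI Express CEM power ratings, in watts.
RATINGS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_limit(*connectors: str) -> int:
    """Rated board power: the x16 slot plus each auxiliary connector."""
    return RATINGS["slot"] + sum(RATINGS[c] for c in connectors)

print(board_power_limit("8-pin", "8-pin"))  # 375 -> the ceiling cited above
print(board_power_limit("8-pin"))           # 225 -> a single 8-pin design
print(board_power_limit("8-pin", "6-pin"))  # 300 -> a common mid-range layout
```

On that budget, the leaked 330 W peak draw of the RX 9070 XT fits comfortably within a dual 8-pin design.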

Intel reportedly cited similar reasons for using standard power connectors on its Arc "Battlemage" graphics cards, suggesting broader industry support for maintaining existing connection standards. NVIDIA, by contrast, reportedly requires all board partners to use the 12V-2×6 connector for the RTX 50 series, removing the option of traditional PCI Express power connectors. AMD's decision gives its manufacturing partners more flexibility in their design choices, and even the MBA (Made by AMD) reference cards won't enforce the 12V-2×6 standard. Beyond the power connector details and a general release timeframe pointing to CES, AMD has revealed little about the RDNA 4 architecture's capabilities. Only the reference card's physical appearance and naming scheme appear to be finalized, leaving performance questions open; early leaks of underwhelming performance should be treated as unreliable until final drivers and optimizations land.
Sources: Chiphell, via HardwareLuxx

133 Comments on AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors

#1
Zazigalka
Good, though I don't really mind the whatever-pin on my 4070S. You plug it in, and it works fine.
#2
Dirt Chip
As long as one 8-pin is enough, all is good.
If you need 2 or 3 of those, better to go with the new standard, IMO.

Anyway, if AIBs can choose which connector to use, I see no problem; all options will be available. No real right/wrong answer here.
#3
3valatzy
Zazigalka: Good, though I don't really mind the whatever-pin on my 4070S. You plug it in, and it works fine.
Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.

www.techspot.com/news/99094-another-16-pin-rtx-4090-power-adapter-has.html

The problem is that the new power connector is an engineering mistake: both mechanically and electrically, it doesn't qualify for the task at hand, which is to carry high currents safely. It is too small, too weak, too unstable.

There is a reason why AMD engineers only use the approved 6/8-pin power connectors: they have been proven safe for decades.
#4
katzi
If the prices are good, I might end up with a Radeon GPU to replace my 3080.

My last Radeon GPU was a 4890 that I overclocked to 1 GHz, LOL.
#5
Macro Device
katzi: to replace my 3080.
Ain't gonna be much faster than that, even by today's ridiculous standards where +5% counts as a whopping upgrade. I'd rather skip this generation. The 9070 XT is unlikely to be significantly faster than a 3090 (3090 Ti if we're feeling really ambitious), and your 3080 isn't far behind. It makes more sense to wait for 4080-series or better GPUs to become affordable.
#6
3valatzy
katzi: If the prices are good, I might end up with a Radeon GPU to replace my 3080.

My last Radeon GPU was a 4890 that I overclocked to 1 GHz, LOL.
Macro Device: Ain't gonna be much faster than that
It will be, because the VRAM bottleneck will be solved. A miserable 10 GB vs. 60% more VRAM.
#7
Macro Device
Dirt Chip: As long as one 8-pin is enough, all is good.
In a perfect world, they would be squeezing ~330 W out of one 8-pin (AWG14); add the ~70 W from the PCIe slot on top of that, and only the hungriest GPUs would need more than one connector.
But I doubt that'll be enough for the 9070 XT.
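For reference, here is a rough sketch of where a ~330 W per-8-pin figure could come from. The ~9 A-per-contact figure is an assumption, typical of high-current Mini-Fit terminals paired with heavier AWG14 wire; it is not the 150 W spec rating.

```python
# Rough electrical headroom of one 8-pin PCIe connector, beyond its 150 W spec rating.
VOLTS = 12.0
AMPS_PER_CONTACT = 9.0  # assumed high-current terminal rating with AWG14 wire
LIVE_PAIRS = 3          # an 8-pin carries three 12 V supply contacts

connector_w = VOLTS * AMPS_PER_CONTACT * LIVE_PAIRS  # ~324 W
slot_w = 70.0                                        # the ~70 W slot figure from the comment
print(f"{connector_w:.0f} W + {slot_w:.0f} W = {connector_w + slot_w:.0f} W")  # ~394 W
```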
3valatzy: A miserable 10 GB vs. 60% more VRAM.
This matters in like four games, and in five more if we're talking absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway); the 3080 has the edge to stay solid in every other title, especially the ones where DLSS is the only upscaler that works correctly. I would've agreed if this were a comparison with an 8 GB GPU, but 10 GB is nowhere near obsolete, and the 320-bit bus really helps a lot.

The leaks we've got suggest the 9070 XT just barely outperforms the 7900 GRE, which is roughly 3090/3090 Ti territory. That's faster than a 3080, sure, but not by a lot.
#8
3valatzy
Macro Device: This matters in like four games, and in five more if we're talking absurd use cases (UHD+ texture packs and/or settings so high it's <15 FPS anyway); the 3080 has the edge to stay solid in every other title.
Wrong. 10 GB is miserable even at 1440p.
Today the bare minimum is 14 GB, but to stay future-proof for at least 2-3 years, you need 20 GB.

Watch: [embedded video]

#9
Knight47
Macro Device: Ain't gonna be much faster than that, even by today's ridiculous standards where +5% counts as a whopping upgrade. I'd rather skip this generation. The 9070 XT is unlikely to be significantly faster than a 3090 (3090 Ti if we're feeling really ambitious), and your 3080 isn't far behind. It makes more sense to wait for 4080-series or better GPUs to become affordable.
This thing will be barely any faster than the four-year-old 6900 XT in raster, let alone the 7900 XTX that it's supposed to beat at half the price.
#10
Macro Device
3valatzy: Wrong. 10 GB is miserable even at 1440p.
Today the bare minimum is 14 GB, but to stay future-proof for at least 2-3 years, you need 20 GB.

Watch: [embedded video]

All games from this video are perfectly playable on an RTX 3070 Ti, an 8 GB GPU, at high settings with some ray tracing going on. VRAM allocation != VRAM usage. The fact that the driver has allocated 14 GB doesn't mean the game will have issues if you have less than 14 GB of VRAM. We'd see horrible benchmark results on the 3080 otherwise, but...
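As an illustration of the allocation-vs-usage gap, here is a minimal sketch using PyTorch's CUDA caching allocator (assuming a CUDA-capable GPU is present; the tensor size is arbitrary). Memory a process has reserved from the driver can far exceed what its live data actually uses, and per-process VRAM monitors report the reserved figure.

```python
import torch

# Allocate, then free, a ~2 GiB tensor. PyTorch's caching allocator
# keeps the freed block reserved from the driver for future reuse.
x = torch.empty(1024, 1024, 512, device="cuda")  # 512 Mi float32 elements
del x

reserved = torch.cuda.memory_reserved()   # held from the driver (what monitoring tools see)
in_use = torch.cuda.memory_allocated()    # actually backing live tensors (~0 here)
print(f"reserved: {reserved / 2**30:.2f} GiB, in use: {in_use / 2**30:.2f} GiB")
```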

This is the most recent TPU GPU review. The hardest benchmarking mode possible: 2160p, no DLSS, everything on Ultra (RT off, though), no slacking. And still, the 3080 is only 14% behind the 3090. It beats the 7800 XT despite having less VRAM, and it doesn't trail the 7900 GRE by much, just a 7.5% gap.

I don't see how 10 GB is at all problematic at pedestrian resolutions like 1440p. Just go from Ultra textures to Medium-High and you'll find yourself with half your VRAM doing a whole lot of nothing, and the games won't look like garbage, because Ultra textures are overtuned anyway. Yes, sure, having more is great, but you're stretching it.
#11
TheDeeGee
3valatzy: Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.

www.techspot.com/news/99094-another-16-pin-rtx-4090-power-adapter-has.html

The problem is that the new power connector is an engineering mistake: both mechanically and electrically, it doesn't qualify for the task at hand, which is to carry high currents safely. It is too small, too weak, too unstable.

There is a reason why AMD engineers only use the approved 6/8-pin power connectors: they have been proven safe for decades.
Old news. You seeing any reports the past few months? No.
#12
3valatzy
Macro Device: All games from this video are perfectly playable on an RTX 3070 Ti, an 8 GB GPU
Again, wrong. 8 GB GPUs are obsolete today.

Watch: [embedded video]

TheDeeGee: Old news. You seeing any reports the past few months? No.
Yes, they can't disappear: [embedded video]

#13
Macro Device
3valatzy: 8 GB GPUs are obsolete today.
How does my colleague play games on one, then? xD

I also have a 12 GB GPU, and I have never run out of VRAM in any game. Perhaps once, when I enabled settings that "run" at 20 FPS on a 4090... Other than that, "8 GB is obsolete" is only true in the sense that the leather jacket guy is too greedy and provides too little generational uplift.
#14
Vayra86
3valatzy: Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.

www.techspot.com/news/99094-another-16-pin-rtx-4090-power-adapter-has.html
The article is from June 2023.
3valatzy: Yes, they can't disappear: [embedded video]

April 2024.

It's not exactly looking like they're very frequent. What I do see is loser-YouTuber territory here: big screamy face, all-caps headline, outrage!

That being said, I do have a few boxes of popcorn waiting for the 5090 release.
#15
Macro Device
Vayra86: a few boxes of popcorn waiting for the 5090 release
Doubt there'll be anything more cheeky than the price and the fact that it can't run %game_name% at some 255-million-pixel resolution at beyond-ultra settings, with all the folks going apeshit over it.
#16
Vayra86
Macro Device: Doubt there'll be anything more cheeky than the price and the fact that it can't run %game_name% at some 255-million-pixel resolution at beyond-ultra settings, with all the folks going apeshit over it.
Stop sounding so rational; this is the internet.
#17
Macro Device
Vayra86: Stop sounding so rational; this is the internet.
I'm on my day off; I can do whatever I want, sir Hydrant!
#18
Dawora
3valatzy: It will be, because the VRAM bottleneck will be solved. A miserable 10 GB vs. 60% more VRAM.
The new AMD GPU is still slow and a bad upgrade.
Buying a new GPU only to get more VRAM is just stupid without also getting more performance.

Better to go with a 5070 Ti to get a performance boost.
#19
AusWolf
"conventional PCI Express connectors can adequately handle power demands up to 375 W" - considering that no consumer card should eat more than 375 W, there should be no need of a 12-pin connector on a consumer card, ever.
#20
Knight47
Vayra86: The article is from June 2023.

April 2024.

It's not exactly looking like they're very frequent. What I do see is loser-YouTuber territory here: big screamy face, all-caps headline, outrage!

That being said, I do have a few boxes of popcorn waiting for the 5090 release.

[embedded videos]
#21
clopezi
Knight47: [embedded videos]
I'm sorry, but those videos are YouTuber BS, and it's the same across all three videos.

There will always be cases, because there are thousands of cards on the market and there is always an error rate in any hardware. In the case of the 4090, there will always be the odd bad connector, but after all this time you can't call it something generalized; it's something that affected a small percentage of users and today has no relevance beyond the 5 minutes of glory of some YouTuber or a random post on Reddit.
#22
Onasi
@AusWolf
Fucking preach. And yet you would still run into people saying "well, we can do cards pulling 600 W and the cooling works, so why limit ourselves, it's performance". I wouldn't grab anything above 250 W for myself, but hey, if people want space heaters, it's their choice.

As for the connector, I would trust W1zz over outrage grifters any day of the week. If he says that, over dozens of cards and thousands of plug-unplug cycles, he didn't run into any problems, and none of his acquaintances/contacts did either, then the whole thing is overblown and comes down to user error and/or rare defective cards, which happen.
#23
Zazigalka
3valatzy: Why it matters: Since October, dozens of RTX 4090 owners have reported melting power adapter cables. Despite investigations from Nvidia and third parties, a definitive cause has yet to be determined. It was thought that pairing the GPU with an ATX 3.0 power supply was a safe solution… until now.

www.techspot.com/news/99094-another-16-pin-rtx-4090-power-adapter-has.html

The problem is that the new power connector is an engineering mistake: both mechanically and electrically, it doesn't qualify for the task at hand, which is to carry high currents safely. It is too small, too weak, too unstable.

There is a reason why AMD engineers only use the approved 6/8-pin power connectors: they have been proven safe for decades.
Since what October? That piece of news is 1.5 years old.
BTW, all I said is that mine works fine, and you put that idiotic "laugh" reaction on it. Sorry it didn't burn my PC down, which you would probably have liked.
#24
Why_Me
Knight47: [embedded videos]
It keeps getting better ^^

[embedded video]
#25
AusWolf
Onasi: @AusWolf
Fucking preach. And yet you would still run into people saying "well, we can do cards pulling 600 W and the cooling works, so why limit ourselves, it's performance". I wouldn't grab anything above 250 W for myself, but hey, if people want space heaters, it's their choice.
I couldn't agree more. With micro-ATX being the biggest size I'm willing to build a PC in for myself, heat is a major consideration (not to mention that I find 4080+ cards a bit excessive even in performance).
Onasi: As for the connector, I would trust W1zz over outrage grifters any day of the week. If he says that, over dozens of cards and thousands of plug-unplug cycles, he didn't run into any problems, and none of his acquaintances/contacts did either, then the whole thing is overblown and comes down to user error and/or rare defective cards, which happen.
I wouldn't mind the connector if my PSU came with one. It's a 750 W Seasonic Prime that I bought just before the 12-pin became mainstream, and I'm not willing to ditch the highest quality and most expensive PSU I've ever had that still has roughly 10 years of warranty left just because Nvidia said so. I'm also not the biggest fan of converters and extensions.