Tuesday, December 31st 2024

AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors

AMD will continue using traditional PCI Express power connectors for its upcoming Radeon RX 9000 series RDNA 4 graphics cards, according to recent information shared on the Chiphell forum. While some expected AMD to mimic NVIDIA's approach, which requires the newer 16-pin 12V-2×6 connector for the GeForce RTX 50 series, the latest information points to a more conventional power-delivery setup. AMD plans to release its next generation of graphics cards in the first quarter, but most technical details remain unknown. The company's choice to stick with standard power connectors follows the pattern set by its recent Radeon RX 7900 GRE, which demonstrated that conventional PCI Express connectors can adequately handle power demands up to 375 W. Standard connectors also eliminate the need for adapters, a feature AMD could highlight as an advantage. An earlier leak suggested that the Radeon RX 9070 XT can draw up to 330 W of power at peak load.

Intel reportedly cited similar reasons for using standard power connectors on its Arc "Battlemage" graphics cards, suggesting broader industry support for maintaining existing connection standards. NVIDIA takes a different approach, reportedly requiring all board partners to use the 12V-2×6 connector for the RTX 50 series and removing the option of traditional PCI Express power connectors. In contrast, AMD's decision gives its manufacturing partners more flexibility in their design choices, and the MBA (Made by AMD) reference cards don't enforce the new 12V-2×6 power connector standard. Beyond the power connector details and a general release timeframe pointing to CES, AMD has revealed little about the RDNA 4 architecture's capabilities. Only the reference card's physical appearance and naming scheme appear to be finalized, leaving questions about performance specifications unanswered; early, underwhelming performance leaks remain unreliable until final drivers and optimizations land.
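As a quick sanity check on those connector figures, here is a back-of-the-envelope sketch using the nominal PCI Express power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); the two 8-pin layout is the configuration the 375 W figure implies, not a confirmed RX 9070 XT specification:

```python
# Nominal power limits from the PCI Express specifications (watts).
PCIE_SLOT_W = 75   # delivered through the x16 slot itself
PIN6_W = 75        # 6-pin auxiliary connector
PIN8_W = 150       # 8-pin auxiliary connector

def board_power_budget(n_6pin: int = 0, n_8pin: int = 0) -> int:
    """Nominal board power budget for a given connector layout."""
    return PCIE_SLOT_W + n_6pin * PIN6_W + n_8pin * PIN8_W

print(board_power_budget(n_8pin=2))         # 375 W: slot + 2x 8-pin, as on the RX 7900 GRE
print(board_power_budget(n_8pin=2) >= 330)  # True: a 330 W peak draw fits with headroom
```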
Sources: Chiphell, via HardwareLuxx

133 Comments on AMD Radeon "RDNA 4" RX 9000 Series Will Feature Regular 6/8-Pin PCI Express Power Connectors

#26
Macro Device
AusWolfI'm not willing to ditch the highest quality and most expensive PSU I've ever had that still has roughly 10 years of warranty left just because Nvidia said so. I'm also not the biggest fan of converters and extensions.
Same reason I'm letting my Thermaltake "Volga" run for another decade.
Posted on Reply
#27
JustBenching
As I've said before, you don't need a converter or an extension. You can get a cable that goes straight from your PSU to your graphics card. It's no different from a normal 8-pin. Actually, I'd argue the 16-pin is safer if you are running 250 W cards, since you are using it at roughly 1/3 capacity.
Posted on Reply
#28
TheDeeGee
3valatzyAgain wrong. The 8GB GPUs are obsolete today.

Watch:

Yes, they can't disappear:

8 fucking months ago.
Posted on Reply
#29
Ruru
S.T.A.R.S.
Props to AMD, why fix something that isn't broken?
Why_MeIt keeps getting better ^^

Wait, wut? How many revisions does this new connector need?
Posted on Reply
#30
Daven
Macro DeviceAin't gonna be much faster than that, even by today's ridiculous standards of +5% being a whopping upgrade. I'd rather skip this generation. 9070 XT is unlikely to be significantly faster than 3090 (3090 Ti if we're feeling really ambitious) and your 3080 isn't really far behind. More sense in waiting for 4080 series or better GPUs to become affordable.
The rumors peg the 9070 XT at between 10 and 40% faster than the 3080.
Posted on Reply
#31
Macro Device
DavenThe rumors peg the 9070 XT at between 10 and 40% faster than the 3080.
Which doesn't contradict what I just said. It's barely noticeable. Significant starts at 100%.
Posted on Reply
#32
chstamos
It seems like AMD's graphics division has managed to disappoint so consistently that nobody expects them to deliver anything worthwhile anymore. Maybe they'll pull an Intel and manage not to completely fuck up just when everybody expects them to. Wouldn't bet the farm on it, but you never know.
Posted on Reply
#33
Daven
DaworaThe new AMD GPU is still slow and a bad upgrade.
Buying a new GPU only to get more VRAM, without getting more performance, is just stupid.

Better to go 5070 Ti to get a performance boost.

It's only AMD fans who don't know what VRAM allocation and VRAM usage mean. Right?

People who never buy Nvidia write this trash and BS about VRAM. That's how it goes at the moment in every tech forum.
Butthurt fans can't take it when Nvidia is top dog here, and topics are full of VRAM/price BS from AMD fans.

Better to sound rational than talking BS about prices and VRAM 24/7 like some butthurt AMD fans.
Everyone buys Nvidia. They won. What more do you want?
Macro DeviceWhich doesn't contradict what I just said. It's barely noticeable. Significant starts at 100%.
You are making personal opinions and decisions for others.
Posted on Reply
#34
AusWolf
JustBenchingAs I've said before, you don't need a converter or an extension. You can get a cable that goes straight from your PSU to your graphics card. It's no different from a normal 8-pin. Actually, I'd argue the 16-pin is safer if you are running 250 W cards, since you are using it at roughly 1/3 capacity.
Yes, if your PSU is compatible. Unfortunately, finding out whether it is or not, and which cable you need (manufacturers like swapping pin layouts for no reason) can be a bit of a nightmare.
Posted on Reply
#35
JustBenching
AusWolfYes, if your PSU is compatible. Unfortunately, finding out whether it is or not, and which cable you need (manufacturers like swapping pin layouts for no reason) can be a bit of a nightmare.
It's really not hard to find. Took me 5 seconds to figure it out for your PSU

seasonic.com/12vhpwr-cable/
Posted on Reply
#36
AusWolf
DavenYou are making personal opinions and decisions for others.
Let's be honest... you would never in your life notice a 10% difference without an FPS counter on screen. Am I right or wrong?
JustBenchingIt's really not hard to find. Took me 5 seconds to figure it out for your PSU

seasonic.com/12vhpwr-cable/
That's why I said "can be". I'm glad the info is available for my PSU, though. :)

Edit: Oh hey, what's the difference between this and this? I'm getting lost among all these standards.
Posted on Reply
#37
Dr. Dro
3valatzyWrong. 10 GB is miserable even at 1440p.
Today the bare minimum is 14 GB, but in order to stay future-proof for at least 2-3 years you need 20 GB

Watch:

That is not how VRAM works. The more you have, the more will be allocated; that's why readouts on 16+ GB cards are almost always higher than on GPUs with 12 GB or less. This is irrespective of vendor. The original 10 GB 3080 has shown itself adequate even in the games that tend to crap themselves on 8 GB, which are relatively rare (and poorly coded anyway). Besides, such super high VRAM capacities also primarily assume people are using adequate system RAM for proper VRAM paging under the WDDM model, yet I have seen people run 4090 GPUs on machines with 16 GB of system RAM... what a waste.

Currently, unless you are pushing the most extreme settings at 4K resolution, 16 GB+ is nothing but comfy, that is all. Realistically, you are looking at very few games that can make real use of more than 16 GB, mostly modded Creation Engine games with high resolution texture mods. Last I was playing Fallout 76, I had some texture mods that constantly ran my 3090 into the 24000 MB range. Not exactly optimized, regardless.
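To illustrate the allocation-versus-usage distinction above, here is a minimal sketch of how monitoring tools read those numbers; it assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package, neither of which is mentioned in the thread:

```python
# Minimal sketch: read the VRAM counters a monitoring overlay would show.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

# NVML's "used" figure is allocated VRAM (including driver/OS reservations),
# not what a game actually touches each frame, so a 24 GB card will routinely
# report a higher number than a 12 GB card running the same game.
print(f"total:            {mem.total / 2**20:.0f} MiB")
print(f"used (allocated): {mem.used / 2**20:.0f} MiB")
print(f"free:             {mem.free / 2**20:.0f} MiB")

pynvml.nvmlShutdown()
```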
Posted on Reply
#38
Macro Device
DavenYou are making personal opinions and decisions for others.
Let's face it. A person works their butt off and spends a big chunk of hard-earned money to get maybe a 30 to 42 FPS boost in the best-case scenario. Terrible performance either way.

This is what I call a waste of effort. I'd get it if GPUs were giveaways, but it's expensive tech. Getting less than double your current performance feels like shooting yourself in the foot. Yeah, you might sell your old GPU for good money to offset it, but that's additional effort in the first place, and it is not a given. It might lie on your shelf forever until someone buys it for the price you're fine with. Who knows.

That's why I recommend that everyone who is not getting paid for their computing power ditch the idea of upgrading every couple of years and instead upgrade to the best GPU they can afford once their current one can't even remotely keep up with 1080p30. That way the upgrades are less frequent, which means less effort, and more significant, which means more joy. It's also cheaper.

This is not your gym progress where every percent matters. One can live with an "obsolete" GPU if they don't use it for actual work.
Posted on Reply
#39
Daven
AusWolfLet's be honest... you would never in your life notice a 10% difference without an FPS counter on screen. Am I right or wrong?
Is it important to you to be right or wrong about this?

The range is rumored to run from just faster than the 7900 GRE to just slower than the 7900 XTX. If it's the latter, that's a good boost over the 3080. It's especially good if power draw is lower and RT performance is up (if you care about that). Pricing well under $500 would just be the cherry on top. And if you are building an all-white rig like I am, more GPUs are coming in white. Finally, the 3080 comes with 10 GB while the 9070 XT is rumored to come with 16 GB.

Building computers can be a hobby with enjoyment just from carrying out the upgrade.
Posted on Reply
#40
FreedomEclipse
~Technological Technocrat~
katziIf the prices are good, I might end up with a Radeon GPU to replace my 3080.


Join the dark side.
Posted on Reply
#41
AusWolf
DavenIs it important to you to be right or wrong about this?
No, it's not, but I do have an opinion - namely, that any hobby gamer who thinks 10% is important lives in a placebo world created and manipulated by clickbait reviews.
DavenBuilding computers can be a hobby with enjoyment just from carrying out the upgrade.
Definitely! I only upgrade for fun myself, not because I need to. It still isn't the generally recommended course of action for the masses who only game and don't care about tech at all.
FreedomEclipseJoin the dark side.
It's not even so dark at all - being on Linux with an all-AMD rig feels like enlightenment after the constant nagging of Windows and the monthly driver updates. :D
Posted on Reply
#42
Daven
Macro DeviceLet's face it. A person works their butt off and spends a big chunk of hard-earned money to get maybe a 30 to 42 FPS boost in the best-case scenario. Terrible performance either way.

This is what I call a waste of effort. I'd get it if GPUs were giveaways, but it's expensive tech. Getting less than double your current performance feels like shooting yourself in the foot. Yeah, you might sell your old GPU for good money to offset it, but that's additional effort in the first place, and it is not a given. It might lie on your shelf forever until someone buys it for the price you're fine with. Who knows.

That's why I recommend that everyone who is not getting paid for their computing power ditch the idea of upgrading every couple of years and instead upgrade to the best GPU they can afford once their current one can't even remotely keep up with 1080p30. That way the upgrades are less frequent, which means less effort, and more significant, which means more joy. It's also cheaper.

This is not your gym progress where every percent matters. One can live with an "obsolete" GPU if they don't use it for actual work.
The 3080 was released almost 4.5 years ago. I think the world will keep rotating if someone upgrades from it in the next few months.
AusWolfIt's not even so dark at all - being on Linux with an all-AMD rig feels like enlightenment after the constant nagging of Windows and the monthly driver updates. :D
I’m also on an all AMD rig but I still use Windows and I hate it. I need to install Linux like you did.
Posted on Reply
#43
Macro Device
DavenThe 3080 was released almost 4.5 years ago.
It is a high-performance device that still stands strong. You only upgrade from it if you want to do more than just play recent games. With progress slowing to almost zero percent per generation and the market becoming utterly monopolised, it can easily serve its owner well for another half dozen years.
DavenI think the world will keep rotating if someone upgrades from it in the next few months.
Of course. But they have a right not to upgrade to something that has very little improvement over what they have now. And who are we to deny this right?
Posted on Reply
#44
AusWolf
DavenI’m also on an all AMD rig but I still use Windows and I hate it. I need to install Linux like you did.
Have you ever used Linux? Come over to the software / Linux forum where there's plenty of resources and help in either case. :)

I used to be in the same situation as you. Finding Copilot randomly installed on my PC one morning was the last straw.
Posted on Reply
#45
AVATARAT
DaworaThe new AMD GPU is still slow and a bad upgrade.
Buying a new GPU only to get more VRAM, without getting more performance, is just stupid.

Better to go 5070 Ti to get a performance boost.

It's only AMD fans who don't know what VRAM allocation and VRAM usage mean. Right?

People who never buy Nvidia write this trash and BS about VRAM. That's how it goes at the moment in every tech forum.
Butthurt fans can't take it when Nvidia is top dog here, and topics are full of VRAM/price BS from AMD fans.

Better to sound rational than talking BS about prices and VRAM 24/7 like some butthurt AMD fans.
Please explain to me why Radeon is slower/worse?

Just spare me the DLSS drama and its blurring, and RT with the low FPS that results from it; what else does Nvidia offer? Ah yes, DLAA.

A bad (joke) 16-pin connector? Ancient software or a buggy, burdensome newer app? Less and slower VRAM? Ancient shader caching that's better turned off in the driver so it doesn't waste storage space/lifespan? Needing a latest-generation card to get the latest-generation DLSS for better blurring, which of course also has to be implemented by the game developer (sorry if the game is old, but hey, you can use the DLSS2FSR mod :laugh:)? Frame Generation locked to DLSS so you can't use them individually? Nvidia Experience :D ? Great new prices with each new generation (the more you buy, the more you save!!!)?


Yeah, I know, the Radeon 7000s are space heaters and FSR's blurring is the worst; anything else? But rumors say the new 9000 cards will draw less power and the new FSR will be DLSS-like with less blur, so it will be good (for those who use it). So you'll need to come up with something new to hold against them.
_______
And before you say something, I have an RTX 4080 Super and an RX 7900 XTX.
Posted on Reply
#46
efikkan
AusWolf"conventional PCI Express connectors can adequately handle power demands up to 375 W" - considering that no consumer card should eat more than 375 W, there should be no need for a 12-pin connector on a consumer card, ever.
Even at 250W things get loud and challenging to cool, so somewhere around 300W would be a hard limit for most.
RuruProps to AMD, why fix something that isn't broken?
Wait, wut? How many revisions does this new connector need?
I think this XKCD strip is highly relevant:

It's yet another standard we didn't need and nobody asked for; that should have been the end of the discussion. Don't forget it's generally more valuable to have a lasting standard than the "optimal" standard.
3valatzyWrong. 10 GB is miserable even at 1440p.
Today the bare minimum is 14 GB, but in order to stay future-proof for at least 2-3 years you need 20 GB
Can we please cut it with this nonsense?
Each architecture utilizes VRAM differently, comparing VRAM size across vendors is a fool's errand. And large VRAM isn't "future proofing", not unless you want to look at pretty slide shows.
DaworaIt's only AMD fans who don't know what VRAM allocation and VRAM usage mean. Right?
<snip>
Better to sound rational than talking BS about prices and VRAM 24/7 like some butthurt AMD fans.
And yet most of them buy Nvidia cards anyway when the dust settles…
Dr. DroCurrently, unless you are pushing the most extreme settings at 4K resolution, 16 GB+ is nothing but comfy, that is all. Realistically, you are looking at very few games that can make real use of more than 16 GB, mostly modded Creation Engine games with high resolution texture mods. Last I was playing Fallout 76, I had some texture mods that constantly ran my 3090 into the 24000 MB range. Not exactly optimized, regardless.
There are many uses for large VRAM outside gaming, but in gaming it's mostly a gimmick. Graphics cards don't have enough bandwidth to utilize it in a single frame anyway (see the rough calculation below).
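As a rough illustration of that bandwidth point (the figures below are illustrative assumptions, not numbers from the post; ~936 GB/s is the commonly quoted RTX 3090 memory bandwidth):

```python
# Back-of-the-envelope: how much data can a GPU even touch in one frame?
bandwidth_gb_s = 936   # approx. RTX 3090 memory bandwidth (illustrative)
fps = 60               # illustrative target frame rate

gb_touchable_per_frame = bandwidth_gb_s / fps
print(f"~{gb_touchable_per_frame:.1f} GB readable per frame at {fps} FPS")  # ~15.6 GB
```

So even a 24 GB card physically cannot stream its whole VRAM pool through the shaders in a single 60 FPS frame; the extra capacity mostly helps with caching and avoiding reloads rather than per-frame detail.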

Regarding (unofficial) texture mods:
1) Results for most game engines will be undesirable, as more advanced engines use calibrated LoD algorithms, textures, shaders, and sometimes dynamic loading of assets. LoD algorithms mix multiple mip levels of a texture, so simply replacing (some) textures with higher-resolution ones without recalibrating everything is generally very wasteful: in the best case you get very high VRAM allocation for very little visual "improvement" (most of it wasted across mip levels), and in many cases you get flickering, glitching, or loading issues/pop-in where assets are loaded dynamically.

2) What is the added benefit?
Unless someone has access to higher-quality raw material or creates new, better assets, we're not really adding truly better textures. In most cases it's just upscaled textures with some noise and filtering added, so there isn't any more information in the texture, just an illusion of higher resolution. It's just as pointless as these "AI" upscaling algorithms: no real information is added. (Most games can, and some already do, achieve the same result with a "detail texture" or noise added in shaders at very little cost.)
Or, as an analogy: it's about as smart as turning up the sharpness on your TV and believing you get a better picture. :rolleyes:
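To put a number on how quickly upscaled texture packs inflate VRAM, here is a small sketch; the texture sizes and uncompressed RGBA8 format are illustrative assumptions, not figures from the post:

```python
# Rough VRAM cost of an uncompressed RGBA8 texture plus its full mip chain.
def texture_mib(side_px: int, bytes_per_texel: int = 4) -> float:
    """Sum the base level and every mip; the chain adds ~1/3 on top of level 0."""
    total_bytes = 0
    while side_px >= 1:
        total_bytes += side_px * side_px * bytes_per_texel
        side_px //= 2
    return total_bytes / 2**20

print(f"4096x4096 texture + mips: ~{texture_mib(4096):.0f} MiB")   # ~85 MiB
print(f"8192x8192 texture + mips: ~{texture_mib(8192):.0f} MiB")   # ~341 MiB
```

Block compression (BCn) cuts these numbers by roughly 4-8x, but the 4x jump between a 4K and an 8K replacement remains, which is why a handful of upscaled textures can eat gigabytes even though the engine rarely samples the top mip level.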
Posted on Reply
#47
Daven
AusWolfHave you ever used Linux? Come over to the software / Linux forum where there's plenty of resources and help in either case. :)

I used to be in the same situation as you. Finding Copilot randomly installed on my PC one morning was the last straw.
I regretfully admit that I have never used Linux. After my last experience trying to install Win 11 24H2 on my laptop, I am so done with Windows. Now I just need to find the time to make the switch.
efikkanIt's yet another standard we didn't need and nobody asked for; that should have been the end of the discussion. Don't forget it's generally more valuable to have a lasting standard than the "optimal" standard.
Unfortunately, Nvidia created this 'standard' because they are planning 600 W GPUs. We didn't really know this at the time and would have laughed at anyone suggesting GPUs would need so much power. You would need four 8-pin power connectors (150 W each) to cover 600 W, and I'm sure that would be hard to fit on the weirdly shaped PCBs Nvidia ended up creating.

As I have stated many times in the past, 600W and $2000+ for a GPU that might increase performance 20-30%...hard, hard, hard pass.
Posted on Reply
#48
AcE
It's fine because, at 330 W max, it can easily get away with 2x 8-pin connectors, but this isn't particularly remarkable; it just means they won't need to include an adapter. That's it. Nvidia, on the other hand, needs the new connector because its high-end GPUs are too power hungry and would need 4x 8-pin connectors, which is too much; they want a small PCB for their reference cards and there's no room for that. That's the whole reason they invented the small connector in the first place, for the compact PCB design of the 3090 back then. That connector was only capable of 300 W; it then evolved into the industry-standard 600 W connector used on PSUs, the 3090 Ti, and the 40 series. And that one evolved into the new revision that will soon be used with newer PSUs and the 50 series.
Posted on Reply
#49
JustBenching
AusWolfLet's be honest... you would never in your life notice a 10% difference without an FPS counter on screen. Am I right or wrong?


That's why I said "can be". I'm glad the info is available for my PSU, though. :)

Edit: Oh hey, what's the difference between this and this? I'm getting lost among all these standards.
I would avoid the 90-degree one. Basically, instead of the plug going straight into your GPU like any normal cable, it goes in angled, hence the name.
Posted on Reply
#50
Crackong
RuruWait, wut? How many revisions does this new connector need?
^ This.

Every revision means money lost on already-made inventory.
If it ain't broke...
The manufacturers won't bother making all these revisions.

Posted on Reply