Saturday, January 11th 2025

XFX Radeon RX 9070 Series Graphics Cards at 2025 International CES

XFX at the 2025 International CES showed off a pair of Radeon RX 9070 series custom-design graphics cards. The company will keep these designs common to both the flagship RX 9070 XT and the RX 9070. Both board designs were shown off at AMD's RX 9070 series booth. The premium custom design is referred to as "Black." There are actually two sub-variants of this card: one, simply called Black, lacks any RGB LED lighting; a second, more premium variant has an RGB LED diffuser spanning the entire top-front edge, including the triangular ends with the XFX and Radeon logos. This second card wasn't shown to us, but it is part of AMD's CES pre-brief.

The premium Black card features a large aluminium fin-stack heatsink, along with a trio of what look like 100 mm and 90 mm axial airflow fans. The metal backplate has a ridged pattern. The PCB underneath appears to be about three quarters the length of the card, with a large cutout in the backplate letting much of the airflow from the third fan pass through the heatsink and out the back. This card draws power from three 8-pin PCIe power connectors, for a total power input configuration of 525 W, which is obviously high for what is expected to be a 300 W-class GPU, but this isn't the only card with an over-the-top power input configuration. The ASUS TUF Gaming has three 8-pin PCIe power connectors, while the ASRock Taichi uses a 16-pin 12V2x6. Most RX 9070 series cards we've seen have just two 8-pin power connectors.
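As a rough sketch of where the 525 W figure comes from (the per-connector ratings are from the PCIe specs, not from XFX), the total input budget is the 75 W the x16 slot provides plus the rating of each auxiliary connector:

```python
# Standard per-source power ratings (PCIe CEM / ATX specs):
#   PCIe x16 slot: 75 W, 8-pin PCIe: 150 W, 16-pin 12V2x6: up to 600 W
SLOT_W = 75
CONNECTOR_W = {"8-pin": 150, "12v2x6": 600}

def board_power_budget(connectors):
    """Total rated input: slot power plus each auxiliary connector."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["8-pin"] * 3))  # XFX Black / ASUS TUF: 525 W
print(board_power_budget(["8-pin"] * 2))  # typical RX 9070 card: 375 W
print(board_power_budget(["12v2x6"]))     # ASRock Taichi: up to 675 W
```

These are rated maximums for the input configuration, not the card's actual power limit, which is why a 525 W budget on a 300 W-class GPU reads as overclocking headroom.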
The presence of these high-power input configurations could hint that the RX 9070 XT likes to overclock, and since it is the top SKU in the RX 9000 series, AMD will allow board partners to go to town overclocking it, even if it takes dialing up the power limits by a fair bit. This way they get to justify pricing these cards north of $600, given that we're hearing that custom-design RX 9070 XT cards typically start at $550, and the baseline price for this SKU could be as little as $480.
Back to XFX, and we spotted their second custom design. This card will be fairly premium, although not as over the top as the Black. It is built around a white color scheme, and features a triple-slot cooling solution, compared to the 4-slot cooler of the Black. The PCB is less than two-thirds the length of the card, so all the airflow from its third fan goes through a large cutout. The card draws power from two 8-pin power connectors.

61 Comments on XFX Radeon RX 9070 Series Graphics Cards at 2025 International CES

#51
AusWolf
Random_UserOn the other hand, many AIBs, like Asus, just seem to be partly reusing their TUF, ProArt, and other coolers for RDNA4.
Personally, I would avoid the big names who do both AMD and Nvidia if you're looking for an AMD card. Like you said, they design their coolers for Nvidia and just reuse them on AMD, resulting in stuff like Asus's mess with the ROG Strix 5700 XT, which was an overheating piece of ... due to wrong coldplate pressure. If you want AMD, look for Sapphire, Powercolor, ASRock, or XFX, and forget about Asus, MSi and Gigabyte.
Posted on Reply
#52
Random_User
AusWolfPersonally, I would avoid the big names who do both AMD and Nvidia, if you're looking for an AMD card. Like you said, they design their coolers for Nvidia, and just reuse them on AMD, resulting in stuff like Asus's mess with the ROG Strix 5700 XT which was an overheating piece of ... due to wrong coldplate pressure. If you want AMD, look for Sapphire, Powercolor, ASRock, or XFX, and forget about Asus, MSi and Gigabyte.
Yeah. Sapphire, TUL and even XFX have been reliable solutions since ATi, and I dare say since the beginning of the GPU industry. Sadly, the last Nitro+ had the GPU bracket pressure issue. Nonetheless, the Nitro+ non-vapor-chamber design, like on the 7000 series, was great.
I would be glad to get the 7900 GRE, if it hadn't been unavailable ever since release day where I live.
But Sapphire seems to change the cooler design every new generation. This is simultaneously both a pro and a con. Some of the coolers were great, though, and it would be nice for them to come back.
Posted on Reply
#53
kapone32
Random_UserYeah. Sapphire, TUL and even XFX have been reliable solutions since ATi, and I dare say since the beginning of the GPU industry. Sadly, the last Nitro+ had the GPU bracket pressure issue. Nonetheless, the Nitro+ non-vapor-chamber design, like on the 7000 series, was great.
I would be glad to get the 7900 GRE, if it hadn't been unavailable ever since release day where I live.
But Sapphire seems to change the cooler design every new generation. This is simultaneously both a pro and a con. Some of the coolers were great, though, and it would be nice for them to come back.
The PCBs on the 6800 XT and 7900 XT are pretty much the same. All of these vendors try to reuse PCBs where they can. These are obviously 7900 XT/XTX PCBs, as they come with two or three 8-pin connections depending on the card. The shroud and GPU positioning is where the difference lies, and that usually doesn't allow you to reuse a waterblock. These large designs, for me, are there to give the illusion of the 5090. That card is huge. I know there are 2-slot variants, but we will see.
Posted on Reply
#54
Zach_01
AusWolfIt's more convenient to say 1, 2, 3 slots. Why complicate things that don't need to be complicated?
Nothing is complicated to me… and to a lot of others, apparently.
If I have a sound card or whatever other card down there, I'd like to know if a GPU is 2.2 or 2.8 slots thick. Can I be OK with 0.8 slots of clearance?
AusWolfAre you intentionally messing with us now? Yes, the fans are where you drew the red ovals. The card is pretty much "upside down" relative to how it would sit in a normal PC case.


Dang, I was looking at it wrong.

Yes, that's a 4-slot card.

Did not realize this is the side of the fan shroud.
Posted on Reply
#55
Random_User
kapone32The PCBs on the 6800 XT and 7900 XT are pretty much the same. All of these vendors try to reuse PCBs where they can. These are obviously 7900 XT/XTX PCBs, as they come with two or three 8-pin connections depending on the card. The shroud and GPU positioning is where the difference lies, and that usually doesn't allow you to reuse a waterblock. These large designs, for me, are there to give the illusion of the 5090. That card is huge. I know there are 2-slot variants, but we will see.
I meant the overall shroud/cooler look. The GPU positioning is not an issue; the shroud and look can still be the same. This might be a result of Sapphire wanting to differentiate their graphics card series/generations. But still.

Particularly the Nitro+ design was comparably consistent from Vega 64 up until the RX 7900 series. The same goes for the XFX MERC/Speedster, which was introduced with the RX 5700 XT/RDNA1.

Vega 64


RX 5700 XT


RX 6800 XT


RX 7900 XT


I personally think their RX 580 cooler design was pretty nice, and it could generally have been easily "re-used" for the later RX 6600/7600 (with a GPU position change), considering the RX 580-590 had a much, much higher TDP/TBP. The same applies to the RX 7000 "premium" metal-shroud Nitro+ design.

As for the 5090... I doubt any of the AIBs would be able to keep their 5090 solutions as "compact" as nVidia's own, simply because the AIBs can't afford this complex and expensive cooler design for the volume of cards they sell. It is just a luxury, limited-number exclusive FE premium design by nVidia that emphasizes their luxury, "Apple"-ish $3.5B status.
Posted on Reply
#56
TPUnique
Man, all this angst about video cards whose designs are completely acceptable... one might call them out for being bland, which they are, but fugly? Not in a million years :kookoo:

You want something ugly? Go look at Yeston's Sakura girly models... the cringe is real with those ones.
Posted on Reply
#57
AusWolf
TPUniqueMan, all this angst about video cards whose designs are completely acceptable... one might call them out for being bland, which they are, but fugly? Not in a million years :kookoo:

You want something ugly? Go look at Yeston's Sakura girly models... the cringe is real with those ones.
The cringe is real with both the Sakura and these here, imo.
Posted on Reply
#58
TPUnique
Zach_01Ok, the PCI-E slot has been stuck at 75 W for ages, but maybe for a reason…

Making the slot 100-200 W capable will most likely require more expensive board standards.
And all boards would have to follow it. Every single one of them.
Do we really want even more expensive boards?
Would that reduce the cost of GPUs because of fewer external connectors? They would still have the same power circuitry, just redesigned to draw more power from the slot.
A hotter slot, mind you…
You bring up good points, but the PCI-SIG could also come up with a new, supplementary standard.
I.e. "PCIe-300" to channel 300 W of power. Manufacturers would be free to implement it or not. Typically, they wouldn't on their most budget cards, so the additional cost wouldn't be passed on to cost-conscious buyers.
Posted on Reply
#59
Zach_01
TPUniqueYou bring up good points, but the PCI-SIG could also come up with a new, supplementary standard.
I.e. "PCIe-300" to channel 300 W of power. Manufacturers would be free to implement it or not. Typically, they wouldn't on their most budget cards, so the additional cost wouldn't be passed on to cost-conscious buyers.
Complex selections from users for the right board, and now you'd have GPUs that can fit both standards? Or two different types of GPUs?
Not seeing it happening soon, IMO.
For what? Fewer connectors and cables?
Eventually, if power keeps growing for AMD past 550 W, it will be a single nvidia-type connector, or a better version of the existing one that could carry tons of current to support 600-800 W.

And by 2035-2040 it will be both what you're saying + external power for 1+ kW GPUs, unless they come up with a different type of home PC... lol... that few will afford.
Posted on Reply
#60
TPUnique
Zach_01Complex selections from users for the right board, and now you'd have GPUs that can fit both standards? Or two different types of GPUs?
Not seeing it happening soon, IMO.
For what? Fewer connectors and cables?
Eventually, if power keeps growing for AMD past 550 W, it will be a single nvidia-type connector, or a better version of the existing one that could carry tons of current to support 600-800 W.

And by 2035-2040 it will be both what you're saying + external power for 1+ kW GPUs, unless they come up with a different type of home PC... lol... that few will afford.
Yes, the very reason why manufacturers keep pushing back-connect mobos and GPUs.



I don't see it happening soon either, but if BTF hardware's minuscule market share continues to grow, the prospect of a beefed-up PCIe spec becomes increasingly plausible.
Posted on Reply
#61
lukart
Massive card; not sure why they changed their designs. The PowerColor Reaper seems to be the best-looking and most reasonable card so far.
Posted on Reply