
XFX Radeon RX 9070 Series Graphics Cards at 2025 International CES

XFX being in the game is actually a good thing.
The shroud and cooler design, though... Like others have mentioned, the RX 7000 series cooler was pretty much amazing: on par with PowerColor's, and it could even outperform Sapphire's Pulse and Nitro+. A sleek, simple, and robust design. I can't find a reason not to carry it over to RDNA4, unless the latter is just a temporary stop-gap. This particular 9070's design, though, looks a bit cheap...

On the other hand, many AIBs, like Asus, seem to be partly reusing their previous TUF, ProArt, and other coolers for RDNA4.

But seeing this many AIBs and partners with products ready gives the impression that the hardware side is ready for mass production, and RDNA4 might not be a scarce product. An AIB/partner wouldn't risk the R&D and production costs for something that is going to be supply-constrained.
This might be wishful thinking, but let's hope it's "just" a driver-readiness issue.
This PNY is a beauty and an interesting design compared to that XFX :kookoo: That XFX is the worst graphics card design ever. I have never seen anything like it before.

Just stay with Sapphire and ASRock.
As was said before, the XFX RX 7000 and even RX 6000 cards were completely fine, in both visual design and thermal performance.

I guess, kinda. Reading up on it, it's still kinda vague to me.

Take AMD: when they released the RX 480, it had a 6-pin connector, but the card would often jump over 150 watts, putting out-of-spec strain on the PCIe slot.
They got flak for that and released an update that fixed it.

With this MSI board claiming to be able to provide 168 watts from the slot alone... would that mean you could install an RX 480 on there and just leave out the 6-pin connector?

It seems to me that GPU boards are designed to expect power from certain sources... Either way, it's nice that MSI is making some noise about this, but again, I just want the official PCIe spec to be updated to deliver more power, so GPUs can be designed with that in mind and ship without extra connectors (or at least so you don't have to hook them up to anything if you have a newer motherboard/PSU, etc.).
I, personally, would keep the PCIe slot for data only. Feeding a power-hungry graphics card through the PCIe slot would just put unnecessary, excessive strain on the motherboard's traces, risking the integrity of the entire PCIe bus. It's much wiser to feed the required power through additional 8-pin cables, which can be replaced along with the PSU itself if something goes wrong. But that's just IMO.
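
For reference, a quick back-of-the-envelope on why the RX 480 situation was a problem, assuming the PCIe CEM limits (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin) and a purely illustrative total draw:

# Rough PCIe power-budget check. CEM spec ceilings:
# x16 slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
SLOT_LIMIT_W = 75
CONNECTOR_LIMIT_W = {"6pin": 75, "8pin": 150}

# Illustrative RX 480-style case: ~160 W total on a single 6-pin card.
total_w = 160.0  # hypothetical total board power, watts
slot_w = total_w - CONNECTOR_LIMIT_W["6pin"]  # what's left for the slot
print(f"slot would carry ~{slot_w:.0f} W (limit {SLOT_LIMIT_W} W) -> "
      f"{'OVER SPEC' if slot_w > SLOT_LIMIT_W else 'ok'}")

Even with the 6-pin maxed out, anything past 150 W total has to come from the slot, which is exactly the out-of-spec strain described above.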

Or even better: let's make GPUs under 200 W a thing again! At least in the midrange.

In my opinion, an entry-level card should have no power connector, a midrange one should have an 8-pin, and a high-end one two 8-pins. We don't need the new 12-pin standard, either.

It's not because electricity is expensive (it is, but that's beside the point), but because of all the heat it dumps into your case. It's unnecessary and excessive.
There's absolutely no excuse for the AIBs not to make 150/200/300 W designs of each GPU SKU.
Like some Pro design: simple and sleek, with a robust VRM, but limited to little or no OC. Just a reliable, energy-efficient, two-slot, plug-and-play card.
Then some SFF ones with an even more compact design. Doesn't matter whether it's OC or not; the cooler must be very efficient and compact. Might be limited to 150-200 W as well.
And of course OC variants, for those who like to burn watts for fun. No limits, no efficiency, models just for "extra" "free" FPS, for those who dare.

This way, the companies would have every consumer type satisfied, instead of pushing the same triple-to-quadruple-slot, triple-fan chonkers under different shrouds, with basically zero differences in design and price.
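
To put numbers on the tiers suggested above, here's a quick sketch of what each connector config caps out at under the same CEM limits (75 W slot, 150 W per 8-pin):

# In-spec power ceilings for the proposed connector tiers.
SLOT_W, PIN8_W = 75, 150

tiers = {
    "entry (no connector)": SLOT_W,               # 75 W
    "midrange (1x 8-pin)":  SLOT_W + PIN8_W,      # 225 W
    "high-end (2x 8-pin)":  SLOT_W + 2 * PIN8_W,  # 375 W
}
for name, watts in tiers.items():
    print(f"{name}: up to {watts} W in spec")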
 
On the other hand, many AIBs, like Asus, seem to be partly reusing their previous TUF, ProArt, and other coolers for RDNA4.
Personally, I would avoid the big names that do both AMD and Nvidia if you're looking for an AMD card. Like you said, they design their coolers for Nvidia and just reuse them on AMD, resulting in stuff like Asus's mess with the ROG Strix 5700 XT, which was an overheating piece of ... due to wrong cold plate pressure. If you want AMD, look at Sapphire, PowerColor, ASRock, or XFX, and forget about Asus, MSI, and Gigabyte.
 
Yeah. Sapphire, TUL, and even XFX have been reliable options since the ATi days, and I dare say since the beginning of the GPU industry. Sadly, the last Nitro+ had the GPU bracket pressure issue. Nonetheless, the non-vapor-chamber Nitro+ design, like on the 7000 series, was great. I would have been glad to get the 7900 GRE, if it hadn't been unavailable ever since release day where I live.
But Sapphire seems to change the cooler design with each new generation. That's simultaneously both a pro and a con. Some of those coolers were great, though, and it would be nice to see them come back.
 
The PCBs on the 6800 XT and 7900 XT are pretty much the same. All of these vendors try to reuse PCBs where they can. These are obviously 7900 XT/XTX PCBs, as they came with 2 or 3 8-pin connectors depending on which generation of card they use. The shroud and GPU positioning are where the differences lie, which is what usually doesn't allow you to reuse a waterblock. These large designs, to me, are there to give the illusion of the 5090. That card is huge. I know there are 2-slot variants, but we will see.
 
It's more convenient to say 1, 2, 3 slots. Why complicate things that don't need to be complicated?
Nothing is complicated to me… and to a lot of others, apparently.
If I have a sound card or whatever other card down there, I'd like to know whether a GPU is 2.2 or 2.8 slots thick. Can I be OK with 0.8 slots of clearance?
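
To put millimetres on it (assuming the standard 20.32 mm expansion-slot pitch):

# Expansion slots sit on a 20.32 mm pitch, so "slot" widths convert directly.
SLOT_PITCH_MM = 20.32

for card_slots in (2.2, 2.8):
    gap_slots = 3 - card_slots  # e.g. the next card sits three slots down
    print(f"{card_slots}-slot card: {card_slots * SLOT_PITCH_MM:.1f} mm thick, "
          f"leaves {gap_slots:.1f} slots = {gap_slots * SLOT_PITCH_MM:.1f} mm of air")

So a 2.2-slot cooler leaves a comfortable ~16 mm before a card three slots down, while a 2.8-slot one leaves barely 4 mm.
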
Are you intentionally messing with us now? Yes, the fans are where you drew the red ovals. The card is pretty much "upside down" relative to how it would sit in a normal PC case.


Dang, I was looking at it wrong.

Yes, that's a 4-slot card.

I did not realize this was the side of the fan shroud.
 
The PCBs on the 6800 XT and 7900 XT are pretty much the same. All of these vendors try to reuse PCBs where they can. These are obviously 7900 XT/XTX PCBs, as they came with 2 or 3 8-pin connectors depending on which generation of card they use. The shroud and GPU positioning are where the differences lie, which is what usually doesn't allow you to reuse a waterblock. These large designs, to me, are there to give the illusion of the 5090. That card is huge. I know there are 2-slot variants, but we will see.
I meant the overall shroud/cooler look. The GPU positioning is not an issue; the shroud and look could still be the same. This might be a result of Sapphire wanting to differentiate their graphics card series/generations. But still.

In particular, the Nitro+ design was fairly consistent from Vega 64 up until the RX 7900 series. The same goes for the XFX MERC/Speedster, which was introduced with the RX 5700 XT/RDNA1.
(Pictured: Vega 64, RX 5700 XT, RX 6800 XT, RX 7900 XT)
I personally think their RX 580 cooler design was pretty nice, and it could generally be easily "re-used" for the later RX 6600/7600 (with a GPU position change), considering the RX 580/590 had a much, much higher TDP/TBP. The same applies to the RX 7000 "premium" metal-shroud Nitro+ design.


As for the 5090... I doubt any of the AIBs will be able to keep their 5090 solutions as "compact" as nVidia's own. Simply because the AIBs can't afford such a complex and expensive cooler design for the number of cards they sell. It's just a luxury, limited-run, exclusive FE premium design by nVidia that emphasizes their luxurious, "Apple"-ish $3.5T status.
 

TPUnique

Man, all this angst about video cards whose designs are completely acceptable... One might call them out for being bland, which they are, but fugly? Not in a million years :kookoo:

You want something ugly? Go look at Yeston's Sakura girly models... the cringe is real with those.
 
The cringe is real with both the Sakura and these here, imo.
 

TPUnique

OK, the PCIe slot has been stuck at 75 W for ages, but maybe for a reason…

Making the slot capable of 100-200 W would most likely require more expensive board standards.
And all boards would have to follow them. Every single one.
Do we really want even more expensive boards?
Would that reduce the cost of GPUs thanks to fewer external connectors? They would still have the same power circuitry, just redesigned to draw more power from the slot.
A hotter slot, mind you…
You bring up good points, but the PCI-SIG could also come up with a new, supplementary standard.
E.g., a "PCIe-300" to channel 300 W of power. Manufacturers would be free to implement it or not. Typically, they wouldn't on their most budget boards, so the additional cost wouldn't be passed on to cost-conscious buyers.
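
Purely hypothetical (no "PCIe-300" tier exists in the spec, and the names below are made up), but the selection logic on the card side wouldn't have to be complicated:

# Hypothetical sketch: a card checks the slot's advertised power tier and
# only asks for external connectors to cover what the slot can't supply.
SLOT_TIERS_W = {"standard": 75, "pcie300": 300}  # "pcie300" is invented here

def external_power_needed(board_power_w: int, slot_tier: str) -> int:
    """Watts the card would still need from external connectors."""
    return max(0, board_power_w - SLOT_TIERS_W[slot_tier])

for tier in SLOT_TIERS_W:
    print(f"{tier} slot: a 250 W card needs "
          f"{external_power_needed(250, tier)} W from external connectors")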
 
Complex board choices for users, and now you'd have GPUs that fit both standards? Or two different types of GPUs?
Not seeing it happening soon, IMO.
And for what? Fewer connectors and cables?
Eventually, if power keeps growing past 550 W for AMD, it will be a single Nvidia-type connector, or a better version of the existing one that can carry tons of current to support 600-800 W.

And by 2035-2040 it will be both what you're saying plus external power for 1+ kW GPUs, unless they come up with a different type of home PC... lol... one that few will afford
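
For a sense of scale on that 600-800 W figure, some quick arithmetic, assuming a 12 V rail and the 12V-2x6 connector's six current-carrying pins rated around 9.5 A each:

# Per-pin current at a given board power over six 12 V pins (12V-2x6 style).
VOLTS, POWER_PINS, PIN_RATING_A = 12.0, 6, 9.5

for watts in (600, 800):
    amps_per_pin = watts / VOLTS / POWER_PINS
    print(f"{watts} W -> {amps_per_pin:.1f} A per pin "
          f"({amps_per_pin - PIN_RATING_A:+.1f} A vs the {PIN_RATING_A} A rating)")

600 W already sits close to the rating; 800 W would blow past it, which is why it would take "a better version of the existing" connector.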
 

TPUnique

Yes, which is the very reason why manufacturers keep pushing back-connect mobos and GPUs.

I don't see it happening soon either, but if BTF hardware's minuscule market share continues to grow, the prospect of a beefed-up PCIe spec becomes increasingly plausible.
 