
XFX Radeon RX 7900 XTX Magnetic Air

Sure, but I can control when to install them, so it aligns with my full retests every few months. Definitely no plans to add any always-online game. My life is complicated enough already.
I appreciate all the work you do. No doubt.
 
I guess XFX understood the game and brought us a maxed-out 7900 XTX for only $980, which is a good deal.

I know the era of $500 high-end cards is over, but still, this card will last you for years to come.
 
The thermal solution on the XFX RX 7900 XTX Magnetic Air has eight heatpipes. The main heatsink also provides cooling for the memory chips and VRM circuitry.

You didn't mention that the baseplate is actually a vapor chamber. :)
Updated
 
Nice, thanks! Now it reflects the real situation.

Also, I want to ask you about one thing: you tested PCI Express scaling before (like 3.0 vs 4.0 and x16 vs x8), but maybe you could test other PCI Express features too? Not a big article or review, just something with data?

I personally compared a few options, using only the 3DMark PCI Express feature test (as something repeatable and designed exactly for this):

1. Auto;
2. Forced GEN4;
3. Preferred I/O (GPU);
4. PCIe Ten Bit Tag Support;
5. Data Link Feature Exchange.

To extract data from each run I had to contact UL (they shared how I can get the minimums and maximums from a run instead of just one average number). That allowed me to get ~18,000 results (from all runs), and I created a graph.

PCI Express BIOS settings.png

So, Data Link Feature Exchange clearly shifted to the right and uses more PCI Express bandwidth. Unfortunately, I can't run tests like you do (with games that respond positively to PCI Express bandwidth). But I think it might be interesting anyway.
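The min/max extraction and binning described above could be sketched like this (assuming a hypothetical export with one bandwidth sample in GB/s per value; the actual format UL provides may differ):

```python
from collections import Counter

def summarize(samples):
    """Return (min, max, mean, histogram) for a list of bandwidth samples.

    The histogram buckets samples into whole-GB/s bins, which is enough
    to see a distribution shift to the right between BIOS settings.
    """
    lo, hi = min(samples), max(samples)
    mean = sum(samples) / len(samples)
    hist = Counter(int(s) for s in samples)  # bucket by whole GB/s
    return lo, hi, mean, hist

# Toy data standing in for the ~18,000 real samples
samples = [24.5, 25.1, 26.7, 26.8, 26.9, 27.0]
lo, hi, mean, hist = summarize(samples)
print(f"min={lo} max={hi} mean={mean:.2f}")
print(sorted(hist.items()))
```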

P.S. Also my 3DMark PCI Express feature test result: https://www.3dmark.com/pcie/509980 - 26.77 GB/s
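For context, here is how that 26.77 GB/s compares to the theoretical PCIe 4.0 x16 limit, using the known spec figures (16 GT/s per lane, 128b/130b line encoding):

```python
# PCIe 4.0 x16 theoretical bandwidth vs. the measured 26.77 GB/s
gt_per_s = 16           # PCIe 4.0 signalling rate per lane, GT/s
lanes = 16
encoding = 128 / 130    # 128b/130b line-encoding efficiency
theoretical = gt_per_s * lanes * encoding / 8  # /8: bits -> bytes

measured = 26.77
print(f"theoretical: {theoretical:.2f} GB/s")        # ~31.51 GB/s
print(f"efficiency:  {measured / theoretical:.0%}")  # ~85%
```

So the feature test reaches roughly 85% of the encoded link maximum, which is in the normal range once protocol overhead (TLP headers, flow control) is accounted for.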
 
Funny that I have bought over 50 GPUs during the past 2 decades and have had a single instance where a fan failed; it was with the XFX R9 290.

XFX should not be charging a premium for this at all.
In my case, I had to replace the whole thing (cooler, fans, etc.) because the one included by the OEM was absolute trash.

It was an Ngreedia GPU, made by MSI, back in 2003 or so.

But yes, none of the GPUs I've had ever needed a replacement fan.
 
Nice review @W1zzard, but wait, does this actually mean that somebody, anybody, is attempting to introduce a feature that is even remotely innovative... oh my, how dare they. :fear: :eek:

(even though they used yet ANUTHA rehash of a rehash of a previously released card)
 
XFX has offered removable and swappable fans before. It's a good idea, but they weren't cheap, and they suffered from poor availability within a year or so of launch, making the whole idea entirely pointless.

Getting replacement fans for non-swappable GPUs is a little more faff, but if you do have to do it, you can usually source the exact original part from AliExpress or eBay for less than the XFX swappable fans. You'll pay with your time and effort, though, since getting the fan shroud off can take 10-15 minutes of GPU disassembly depending on the design of your cooler.
 
Magnetic fans are nothing new, aren't they? If I'm not wrong, the 6700 XT Nitro, for example, had fans like that or similar. My friend had that card, but maybe I understood him wrong.
Sapphire has something quite similar; they have quick-replaceable fans on their Nitro line and even used to sell different fan blades and fan LED colors before RGB and ARGB became the norm. They're not magnetic, but IIRC they lock into place with either one screw or a stiff plastic tab.
 
What they think is user-friendly: proprietary magnetic fans which will take at least a week to arrive. If ever.
What is really user-friendly: standard 80/92/120/140 mm case fan compatibility, so you could buy a set of new fans on the very day something goes wrong.
 

Card looks great and seems easy to maintain when cleaning dust, although those fans are proprietary, so hopefully XFX still has a bunch of them down the line for RMA.

The only other problem is that the card is too tall and long at 14 cm and 35 cm. Great for medium-plus-sized ATX cases, but not so much for microATX and SFF stuff. The more recent XFX cards look really cool but are way too big for the most part.
 
cards look really cool but are way too big
That's not only XFX; the whole industry is trying its hardest to normalise these chonkers. Of course every reasonable user enjoys silent and cool equipment that's far from risky conditions, but c'mon, this became ridiculous 2.5 generations ago.
 
I would really like to see this become more common.
GPU fans break? Now you don't need to redneck one together or buy a whole new heatsink; just take the fans out and slot new ones in.
I'd like to see how it fares over time, though. I'm sure you'd have to replace them more often, wouldn't you?
 
I'd have preferred to see standard-sized fans used; then people could put in whatever they wanted, albeit the cables would be far less aesthetic and the card would undoubtedly be thicker. This is neat, but $70 extra for something that's rarely a problem... I dunno man, tough sell. I suppose, however, that for Radeons with a VRAM pool disproportionately large relative to performance, getting many, many years out of the card is important to some buyers, so it's not entirely uncalled for. Heck, just cleaning the card will be easier, so that's something.
 
Card looks great and seems easy to maintain when cleaning dust, although those fans are proprietary, so hopefully XFX still has a bunch of them down the line for RMA.

The only other problem is that the card is too tall and long at 14 cm and 35 cm. Great for medium-plus-sized ATX cases, but not so much for microATX and SFF stuff. The more recent XFX cards look really cool but are way too big for the most part.

That's not only XFX; the whole industry is trying its hardest to normalise these chonkers. Of course every reasonable user enjoys silent and cool equipment that's far from risky conditions, but c'mon, this became ridiculous 2.5 generations ago.

How do they try? The last real attempt was with the Radeon RX Vega 64, which had special BIOS options for power saving.

1718753986192.png


Today this gives you insane power spikes of 507 W!

1718754042888.png
 
How do they try? The last real attempt was with the Radeon RX Vega 64, which had special BIOS options for power saving.

View attachment 351933

Today this gives you insane power spikes of 507 W!

View attachment 351934

That's not comparable. I can get my 7900 XTX to use only 280-320 W (with spikes still up to 400 W) by lowering the power limit. Those special BIOS options do basically the same thing: limit the TGP.

The spikes are an inherent flaw of the architecture, but have been improved compared to RDNA2's 6900/6950 XT. Also, this XFX model is overclocked.

Unrelated to the topic: I was hoping the 7700 XT or 7800 XT wouldn't spike much above 240 W so I could use one on my eGPU, but unfortunately that doesn't seem to be doable (it makes the AC adapter alarm and trips OCP/OVP at around 310 W). Amazingly, the older RX 6800 can do it (with just the OCP/OVP warning and spikes up to 250 W), but at noticeably lower performance compared to the newer gen.
 
How do they try ? The last real attempt was with Radeon RX Vega 64, which had special BIOS options for power saving.
Geez, we really need a built-in English to "I barely finished kindergarten" English translator here.

They try their hardest to make the biggest, ugliest, most monstrous graphics cards the new norm, ultimately declaring small (and even quote-unquote small) GPUs a mistake, to put it mildly. Not the opposite.
 
A $70 price premium for magnetic fans is not the compelling value-add that XFX seems to believe it is.

Agreed, but XFX wants some spotlight and they're stuck being an AMD AIB. With Radeon being in the poor state it is (I really thought we'd see more than a 1% improvement in the benchmarks since the first review of the 7900 XTX), and no new product launches on the horizon, they have to do *something* to look cool. Unfortunately, this will never be as cool as the ASUS x Noctua RTX 4080 Super, so there's that.
 
That's not only XFX; the whole industry is trying its hardest to normalise these chonkers. Of course every reasonable user enjoys silent and cool equipment that's far from risky conditions, but c'mon, this became ridiculous 2.5 generations ago.
Agreed.

Funny how AMD got so little recognition for the size of their reference cards (7900 xt and xtx).

I wanted the XTX so bad, but Sapphire had the Pulse 7900 XTX for 850 plus 2 games worth 170, so I couldn't resist that offer.
 
Agreed.

Funny how AMD got so little recognition for the size of their reference cards (7900 xt and xtx).

I wanted the XTX so bad, but Sapphire had the Pulse 7900 XTX for 850 plus 2 games worth 170, so I couldn't resist that offer.

It doesn't help that the initial batch had a manufacturing defect in the vapor chamber, either. But you don't want to run the MBA design on a card with such a hideously high TDP anyway. There's no incentive to purchase such a card, because they are consistently the loudest, hottest, and simultaneously the worst performers. Any space-constrained build is far better served by an equivalent Ada Lovelace GPU regardless of targeted volume, since its performance per watt is far higher and its upscaling features (which can be used for power-saving and thermal-management purposes) are still superior to what AMD currently offers.

Hopefully, they will consider releasing a high-quality first-party design, like they did with the RX Vega generation. They will need to pull off a compact and efficient GPU with RDNA 4, since they won't compete with Blackwell on the performance front, so a premium design for this niche makes sense to me.
 
I can get my 7900 XTX to use only 280-320 W (with spikes still up to 400 W) by lowering the power limit. Those special BIOS options do basically the same thing: limit the TGP.

Yes, and they do it, while you can't, because you can't decrease it to 180-210 W as was the case with the Radeon RX Vega 64. So, yeah, your lowering doesn't count.

They try their hardest to make the biggest, ugliest, most monstrous graphics cards the new norm, ultimately declaring small (and even quote-unquote small) GPUs a mistake, to put it mildly. Not the opposite.

Err, good luck then selling that piece of crap. I wouldn't agree to put it in my case, as the risk of explosion / fireworks would rise exponentially. :D
 
Yes, and they do it, while you can't, because you can't decrease it to 180-210 W as was the case with the Radeon RX Vega 64. So, yeah, your lowering doesn't count.



Err, good luck then selling that piece of crap. I wouldn't agree to put it in my case, as the risk of explosion / fireworks would rise exponentially. :D

You should be able to decrease a 7900 XTX to ~210 W, but performance is going to be harshly reduced, especially if ray tracing is involved. It's simply not designed for low-power operation, given the amount of memory and the core type it has.

There is a market and a demand for high-performance, compact GPUs; case in point:


AMD doesn't really have anything to fill this niche right now. The last time they did was with the Radeon R9 Nano, which was more than a few years ahead of its time. The RX 7600 is not powerful enough, and the RX 7800 XT doesn't perform as well as its competition once everything is accounted for.
 
Sapphire has had an easy solution for years now:

A single screw and a pin-to-pad connection.
 
It doesn't help that the initial batch had a manufacturing defect in the vapor chamber either.
Which was corrected. Can we say the same for the fire hazard that is the still-flawed 12VHPWR? That thing is like playing Russian roulette.
There's no incentive to purchase such a card because they are consistently the loudest, hottest and simultaneously the worst performers.
Per the Techpowerup review:

The idle noise chart has been removed starting with this review, because all modern graphics cards support idle-fan-stop.

fannoise_load.png

far better served by an equivalent Ada Lovelace GPU regardless of targeted volume, since its performance per watt is far higher
Funny how performance per watt only matters when it can be used against AMD, but it's OK when it applies to Intel or even Ngreedia (their Blackwell AI chips are rated at 1200 W). From this article.
upscaling features (which can be used for power saving and thermal management purposes) are still superior to what AMD currently offers.
I refuse to be forced into tech that limits my options, so if I have to use upscaling (which I consider a form of cheating, same as fake frames), I will use the option that doesn't keep me locked into one vendor, even if the results are not as pretty as the other's.
They will need to pull off a compact and efficient GPU with RDNA 4
I think they will, since the rumor is that RDNA 4 is a kind of refresh of RDNA 3, until a proper reset/new architecture arrives with RDNA 5.
since they won't compete with Blackwell on the performance front,
I assume that you are part of the group that thinks only the 4090 exists.
Absolute performance is not the only thing that matters, especially when you can't, or won't, pay for something that is overpriced.
 
Funny how performance per watt only matters when it can be used against AMD, but it's OK when it applies to Intel or even Ngreedia (their Blackwell AI chips are rated at 1200 W). From this article.
Here in the wild we see the typical AMD GPU fanboy. The fanboy, when confronted with the truth of how AMD's consumers GPUs are poorer on a performance-per-watt level than their competition, desperately defends the honour of their favourite company by moving the goalposts to talk about the competition's non-consumer GPUs. Simultaneously, the fanboy screams "NGEEEEDIA" while soiling itself, retreating while hurling its feces at the perceived threat. Back in the safety of its mother's basement, the fanboy masturbates furiously to its collection of posters of AMD's long-dead mascot Ruby, an R9 Nano clutched in one hand and its genitals in the other.
 
Which was corrected, can we say the same for the fire hazard that is the still flawed 12VHPWR? That thing is like playing Russian roulette.

Yes - and AMD GPUs will be using this connector in the future. The RX 7000 series doesn't, because the hardware design was finalized before the connector was deployed. ASRock already has a model with the updated power connector available. Time to stop pretending otherwise.


Per the Techpowerup review:

fannoise_load.png

It is worth noting that the decibel scale is logarithmic, not linear. An increase of 10 dBA corresponds to a tenfold increase in sound intensity. ;)
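To put numbers on that: the intensity ratio for a given decibel difference follows the standard relation ratio = 10^(ΔdB/10), so even the few-dBA gaps in that chart are large intensity differences.

```python
def intensity_ratio(delta_db):
    """Sound intensity ratio corresponding to a decibel difference."""
    return 10 ** (delta_db / 10)

print(intensity_ratio(10))           # 10.0 -> ten times the intensity
print(round(intensity_ratio(3), 2))  # ~2.0 -> roughly double
```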

Funny how performance per watt only matters when it can be used against AMD, but its ok when it applies to Intel or even Ngreedia (their Blackwell AI chips are rated at 1200W) From this article.

High wattage does not imply low performance per watt. Besides, SXM modules are not comparable to PCIe add-in cards, even in the enterprise niche you brought up for... I don't know what reason.

I refuse to be forced into tech that limits my options, so if I have to use upscaling (which I consider a form of cheating, same as fake frames), I will use the option that doesn't keep me locked into one vendor, even if the results are not as pretty as the other's.

You're contradicting yourself. If you absolutely refuse to use any of the AMD HYPR-RX features such as Anti-Lag+ or Fluid Motion Frames, then I might throw you a bone here. Otherwise, I'm calling out that these were things considered "lame, phony or whatever" until AMD added them, because NVIDIA had them and AMD's user base demanded feature parity (which isn't achieved, because AMD's technologies are of inferior quality). That they're open-source and platform-agnostic only enabled GeForce owners to make use of them, too... not that they would; I haven't seen a single person pick FSR 2.2 when DLSS is available. And with DirectSR (a single implementation), it's guaranteed that all games supporting super-resolution features will now ship with a combination of FSR 2.2 and other technologies via DirectX metacommand, with minimal development time.

I think they will, since the rumor is that RDNA 4 is a kind of refresh of RDNA 3, until a proper reset/new architecture arrives with RDNA 5.

It would be excellent if they did so. No argument there. In fact, this is what we all want.

I assume that you are part of the group that thinks only the 4090 exists.
Absolute performance is not the only thing that matters, especially when you can't, or won't, pay for something that is overpriced.

No, not really. I do not have an RTX 4090, and I made a conscious decision not to buy one, even though I spent almost as much on the card I wanted. But people are very much willing to pay for a 4090, because its feature set is complete and its performance is in a league of its own. These cards sold, and still sell, incredibly well.
 