# MSI Radeon RX 6900 XT Gaming X Trio



## W1zzard (Feb 18, 2021)

The Gaming X Trio from MSI is the first RX 6900 XT custom design we review. It comes with a large overclock out of the box, and the cooler is much better than what the AMD RX 6900 XT reference design offers. We also saw substantially improved overclocking potential from MSI's new Radeon flagship.



----------



## dicktracy (Feb 18, 2021)

Nvidia is still selling RTX 3090 FE for $1499 and it was available just today at Best Buy... It's not cheaper than a 3090 at all.


----------



## W1zzard (Feb 18, 2021)

dicktracy said:


> Nvidia is still selling RTX 3090 FE for $1499 and it was available just today at Best Buy... It's not cheaper than a 3090 at all.


Once significant volume of 3090 becomes available at $1500, the 6900 XT premium models will drop to $1200 or so


----------



## kapone32 (Feb 18, 2021)

dicktracy said:


> Nvidia is still selling RTX 3090 FE for $1499 and it was available just today at Best Buy... It's not cheaper than a 3090 at all.



The most expensive 6900 XT:
www.canadacomputers.com

The cheapest 3090:
www.canadacomputers.com


----------



## Fluffmeister (Feb 18, 2021)

Anyone able to grab a RTX 3080 early and close to MSRP won the internet.

I miss the good old days when AMD fans were outraged by a $1000 Titan card.


----------



## Nuckles56 (Feb 18, 2021)

What happened to the power usage numbers? It looks like AMD must have stuffed up the drivers sometime after launch day when they had great idle numbers.


Fluffmeister said:


> Anyone able to grab a RTX 3080 early and close to MSRP won the internet.
> 
> I miss the good old days when AMD fans were outraged by a $1000 Titan card.


I miss the days when everyone was like that's too much for a card and laughed even harder at the titan z for 3 grand...


----------



## Fluffmeister (Feb 18, 2021)

Nuckles56 said:


> What happened to the power usage numbers? It looks like AMD must have stuffed up the drivers sometime after launch day when they had great idle numbers.
> 
> I miss the days when everyone was like that's too much for a card and laughed even harder at the titan z for 3 grand...



Yeah, two Titans on one board! I miss the good old days when Crossfire and SLI was a thing too!

Damn I miss the good old days.


----------



## kapone32 (Feb 18, 2021)

Fluffmeister said:


> Yeah, two Titans on one board! I miss the good old days when Crossfire and SLI was a thing too!
> 
> Damn I miss the good old days.


A secret on the box of my 6800XT. Crossfire support Yes.


----------



## Fluffmeister (Feb 18, 2021)

kapone32 said:


> A secret on the box of my 6800XT. Crossfire support Yes.



Good luck with that.


----------



## kapone32 (Feb 18, 2021)

Fluffmeister said:


> Good luck with that.


I wouldn't even think of getting a second card.......right now.


----------



## AnarchoPrimitiv (Feb 18, 2021)

Anyone else agree that the old GPU data should be forgotten in favor of testing new cards with a 5900X/5950X instead of a 9900K, so we can see what these cards are really capable of? I, like the vast majority of users, am not realistically in the market for a $1000+ video card, so I believe people come to these reviews to see what these cards are capable of in the best-case scenario... not the worst.


----------



## Xuper (Feb 18, 2021)

@W1zzard 

Any idea why the power spikes are so high? I thought that was a design flaw of the reference card.


----------



## W1zzard (Feb 18, 2021)

Nuckles56 said:


> What happened to the power usage numbers? It looks like AMD must have stuffed up the drivers sometime after launch day when they had great idle numbers.


Before, we tested 1080p 60 hz, now 1440p 60 hz



AnarchoPrimitiv said:


> Anyone else agree that the old GPU data should be forgotten in favor of testing new cards with a 5900x/5950x instead of a 9900k


Soon, but with a 5800X, which I have to buy; AMD is unable to provide any samples, and the 5900X is way overpriced


----------



## P4-630 (Feb 18, 2021)

My 2070 Super is still ahead in raytracing titles...


----------



## cueman (Feb 18, 2021)

Nah, it still doesn't beat the RTX 3090, or even the RTX 3080, once you put this AIB MSI Trio version up against AIB versions of those.

We must remember that the comparison here is against the RTX 3080 FE model, not an AIB one.

Power draw is very big, more than the RTX 3090, and the peak is higher than any GPU has ever drawn... hmm.


----------



## TheLostSwede (Feb 18, 2021)

Fluffmeister said:


> Anyone able to grab a RTX 3080 early and close to MSRP won the internet.
> 
> I miss the good old days when AMD fans were outraged by a $1000 Titan card.


I would've been able to yesterday, but the card sold out 30 seconds later. It was less than $840.


----------



## Xuper (Feb 18, 2021)

What did you change when you revised your review?


----------



## Lindatje (Feb 18, 2021)

The RX 6900 XT is the best GPU you can "buy" at the moment.

The 9900K is bottlenecking the RX 6900 XT.


----------



## TheinsanegamerN (Feb 18, 2021)

I would love one of these cards, if I could get my hands on one that is. 

I wonder something: Samsung made 18 Gbps GDDR6 memory at one point, so why have none of the card makers attempted a 6900 XT with such RAM? The 6900 XT does benefit from memory overclocking, after all.
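As a rough sketch of what 18 Gbps chips would buy: GDDR6 bandwidth scales linearly with per-pin data rate across the card's bus width. The 256-bit bus and stock 16 Gbps figures are the RX 6900 XT's published specs; the rest is just arithmetic.

```python
# Back-of-envelope: theoretical GDDR6 bandwidth on the RX 6900 XT's
# 256-bit bus at the stock 16 Gbps versus hypothetical 18 Gbps chips.
BUS_WIDTH_BITS = 256  # RX 6900 XT memory bus width

def bandwidth_gbps_to_gbs(pin_speed_gbps: float,
                          bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Total bandwidth in GB/s = per-pin rate (Gb/s) * bus width (bits) / 8."""
    return pin_speed_gbps * bus_width_bits / 8

stock = bandwidth_gbps_to_gbs(16.0)   # stock memory speed
faster = bandwidth_gbps_to_gbs(18.0)  # hypothetical 18 Gbps chips
print(f"16 Gbps: {stock:.0f} GB/s, 18 Gbps: {faster:.0f} GB/s "
      f"(+{(faster / stock - 1) * 100:.1f}%)")
```

That's 512 GB/s versus 576 GB/s, a 12.5% bandwidth uplift, which lines up with why memory overclocking helps this card.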


----------



## ARF (Feb 18, 2021)

@W1zzard Could you please change the test setup to a Ryzen 9 5900X or Ryzen 9 5950X?
Also, what driver version was used for this RX 6900 XT review? That information is missing.


----------



## the54thvoid (Feb 18, 2021)

I'd love to love this unicorn but the power draw is silly and the price/availability makes it, well... a freaking unicorn.

I mean, it's an awesome card, virtually silent with that power draw (a miracle), but 'meh' all over the house. PC enthusiasts are getting humped this year. Again.


----------



## RedelZaVedno (Feb 18, 2021)

_With spikes of up to 619 W_...?  100% over TDP power draw spikes? Ouch, that's some bad engineering from AMD. 3090's +33%  over TDP is borderline acceptable, but +100% is just abysmal.
Get your s*** together AMD, this GPU is a potential PSU killer.


----------



## Lindatje (Feb 18, 2021)

RedelZaVedno said:


> _With spikes of up to 619 W_...?  100% over TDP power draw spikes? Ouch, that's some bad engineering from AMD. 3090's +33%  over TDP is borderline acceptable, but +100% is just abysmal.
> Get your s*** together AMD, this GPU is a potential PSU killer.


It's a software bug; it is not 619 W.


----------



## kapone32 (Feb 18, 2021)

RedelZaVedno said:


> _With spikes of up to 619 W_...?  100% over TDP power draw spikes? Ouch, that's some bad engineering from AMD. 3090's +33%  over TDP is borderline acceptable, but +100% is just abysmal.
> Get your s*** together AMD, this GPU is a potential PSU killer.


I just can't see a single card pulling that much wattage. Even two Vega 64s together were only about that high.


----------



## RedelZaVedno (Feb 18, 2021)

kapone32 said:


> I just can't see a single card pulling that much wattage. Even two Vega 64s together were only about that high.


The card has three 8-pin power inputs. By PCIe spec that's 150 W each, so 450 W from the connectors plus 75 W from the slot puts the rated budget at 525 W, and all cables have some buffer before they melt, so a 619 W spike is quite possible. But then again, it might be software misreporting if the reading is not taken directly from a wall outlet (which it should be).
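The connector math can be sketched like this; the 150 W and 75 W figures are the official PCIe limits, and real connectors tolerate short transients well beyond spec.

```python
# Rough power-budget sketch for a card with three 8-pin PCIe connectors,
# using the official PCIe limits (150 W per 8-pin, 75 W from the slot).
PCIE_8PIN_W = 150   # spec limit per 8-pin connector
PCIE_SLOT_W = 75    # spec limit for the x16 slot

def spec_budget(n_8pin: int) -> int:
    """Total spec-rated power budget in watts."""
    return n_8pin * PCIE_8PIN_W + PCIE_SLOT_W

print(spec_budget(3))  # spec budget in watts; brief spikes above this
                       # are electrically possible because connectors
                       # have physical headroom beyond the spec rating
```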


----------



## kapone32 (Feb 18, 2021)

RedelZaVedno said:


> The card has three 8-pin power inputs. By PCIe spec that's 150 W each, so 450 W from the connectors plus 75 W from the slot puts the rated budget at 525 W, and all cables have some buffer before they melt, so a 619 W spike is quite possible. But then again, it might be software misreporting if the reading is not taken directly from a wall outlet (which it should be).


I know it is possible to draw over 500 watts, but 619 is more than double the spec sheet for this card.


----------



## W1zzard (Feb 18, 2021)

Lindatje said:


> Its a software bug, it is not 619w.


Which software? This is a physical measurement with lab equipment

it’s also card only, not whole system

and yes low quality psus will just turn off with those spikes, happens to me all the time when Testing 6900 xt


----------



## kapone32 (Feb 18, 2021)

W1zzard said:


> Which software? This is a physical measurement with lab equipment
> 
> it’s also card only, not whole system
> 
> and yes low quality psus will just turn off with those spikes, happens to me all the time when Testing 6900 xt


I just read the review of the Gigabyte card and understand now that these are spikes. The argument about lower-quality PSUs does hold true though, as some PSUs can only support their rated power briefly. That would mean you need a good 850 to 1000 W PSU to really have this card sing. I am so glad I bought an HX1200i about two years ago. It is kind of sad, though, that the PC space has become even more the realm of the well-heeled, as a quality high-wattage PSU will cost you real money.
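The 850-1000 W estimate can be sanity-checked with some quick arithmetic. The 620 W figure is the spike from the review; the CPU and rest-of-system numbers, and the 20% headroom factor, are illustrative assumptions, not measurements.

```python
# Rough PSU sizing sketch: size for worst-case transients plus headroom.
GPU_SPIKE_W = 620      # transient GPU spike measured in the review
CPU_W = 150            # high-end CPU under load (assumed)
REST_OF_SYSTEM_W = 75  # board, RAM, drives, fans (assumed)

def recommended_psu(headroom: float = 1.2) -> float:
    """Recommended PSU wattage: worst-case draw times a headroom factor."""
    return (GPU_SPIKE_W + CPU_W + REST_OF_SYSTEM_W) * headroom

print(f"{recommended_psu():.0f} W")
```

With those assumptions the sketch lands just above 1000 W, which is why a quality 850-1000 W unit (one that genuinely tolerates transient overloads) is the sensible floor here.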


----------



## ARF (Feb 18, 2021)

W1zzard said:


> Which software? This is a physical measurement with lab equipment
> 
> it’s also card only, not whole system
> 
> and yes low quality psus will just turn off with those spikes, happens to me all the time when Testing 6900 xt



It's quite strange that the 20.8 driver even supports the Radeon RX 6900 XT.

Maybe the driver on your system is wrong, so should the whole review be redone with 21.2.2?


----------



## Shatun_Bear (Feb 18, 2021)

Impressed how much faster this is than the 3080 in 4K. Doesn't draw too much more power either. Samsung 8nm putting in some (bad) work for Ampere.

I agree with the posters in here RE time to retire the 9900K in the test bench. It's holding back the higher end cards and giving a misleading representation of performance.


----------



## Lindatje (Feb 18, 2021)

W1zzard said:


> Which software? This is a physical measurement with lab equipment
> 
> it’s also card only, not whole system
> 
> and yes low quality psus will just turn off with those spikes, happens to me all the time when Testing 6900 xt


We don't see that with our two (non-CrossFire) RX 6900 XTs... so I think you need to retest it, maybe with another system to be sure?


----------



## W1zzard (Feb 18, 2021)

Lindatje said:


> We don't see that with our two (non-CrossFire) RX 6900 XTs... so I think you need to retest it, maybe with another system to be sure?


How do you measure power?

edit: are you asking about the psu shutting off? Just buy some cheap 650w psu. Obviously the seasonic in my primary test system is running fine without problems


----------



## qubit (Feb 18, 2021)

"Gaming noise levels are simply outstanding, almost whisper-quiet, which is unbelievable for a card in this performance class, and temperatures are impressive, too. Once again, MSI strikes with the perfect balance between temperatures and noise, even without going the dual-BIOS route."

This is why I buy MSI cards nowadays and always the ones with this fantastic cooler. They actually achieve the holy grail of silence with great cooling, often being the quietest of the lot. Oh and they're reliable too, unlike that Zotac GTX 1080 POS that I had a few years ago.


----------



## W1zzard (Feb 19, 2021)

Driver versions have been added to the test setup table.


----------



## Lindatje (Feb 19, 2021)

W1zzard said:


> How do you measure power?
> 
> edit: are you asking about the psu shutting off? Just buy some cheap 650w psu. Obviously the seasonic in my primary test system is running fine without problems


With a professional measuring instrument. I mean the peak wattage of 619 W; we are far from that. We tested with Adrenalin 2020 Edition 21.2.1.
We are under 390 W, and with OC under 450 W at peak, and I don't see it with other reviewers either.

It's strange....


----------



## W1zzard (Feb 19, 2021)

Lindatje said:


> With a professional measuring instrument. I mean the peak wattage of 619 W; we are far from that. We tested with Adrenalin 2020 Edition 21.2.1.
> We are under 390 W, and with OC under 450 W at peak, and I don't see it with other reviewers either.
> 
> It's strange....


What's your sample rate? You need at least 20 per second to catch those short spikes, I have 40. An oscilloscope with current clamp can provide even more detail, but not sure if I want to sink a few more thousand $ into this.

The data is very consistent here. I've tested three 6900 XT cards so far, weeks apart, all above 600 W spikes.



Lindatje said:


> I don't see it with other reviewers either.


I don't think anyone besides Igor does this kind of testing. 
People using NVIDIA PCAT have only 10 samples per second, people using a KillAWatt have 1 or 0.5 samples per second


----------



## mdbrotha03 (Feb 20, 2021)

Fluffmeister said:


> Anyone able to grab a RTX 3080 early and close to MSRP won the internet.
> 
> I miss the good old days when AMD fans were outraged by a $1000 Titan card.


With effort, you can still snag an FE 3000-series card from Best Buy at original MSRP.


----------



## lightning70 (Feb 20, 2021)

Quality card: the cooling is muscular and beautiful, and the OC potential is good, but it's the most expensive RX 6900 XT at $1800. Is it worth the money?
The RX 6800 XT is more affordable and available; I think it could approach the 6900 XT with a good overclock.


----------



## brink (Feb 22, 2021)

W1zzard said:


> Before, we tested 1080p 60 hz, now 1440p 60 hz
> 
> 
> Soon, but 5800x, which i have to buy, amd is unable to provide any samples, and 5900x is way overpriced


Interesting. I have an ASUS TUF Gaming RX 6900 XT, connected via DP to a 4K monitor @ 60 Hz, and the Radeon overlay shows as low as 5 W.
I understand that this number is missing some other elements, but compared to your 28 W it seems quite a difference.
Does that mean the GPU chip itself draws 5-7 W at idle and the rest of the card the remaining 21-23 W?


----------



## W1zzard (Feb 22, 2021)

What memory clock do you see?


----------



## brink (Feb 22, 2021)

W1zzard said:


> What memory clock do you see?


----------



## W1zzard (Feb 22, 2021)

Interesting. GPU-Z shows the same value? This is sitting at the desktop with 4K60 ? Which monitor?


----------



## brink (Feb 22, 2021)

W1zzard said:


> Interesting. GPU-Z shows the same value? This is sitting at the desktop with 4K60 ? Which monitor?


I updated my system specs in my profile.
The monitor is an Eizo FlexScan EV3285, 32".
Just logged with GPU-Z... interesting, it reports toggling between 13 and 5 W, while the Radeon overlay shows 5 W.




Also checked the HWiNFO sensors: when the GPU goes into idle, some entries are greyed out, so there seems to be no sensible reading there.
Still far away from 28 W.
The card also has some LEDs; I don't know if their consumption is visible anywhere. (?)


----------



## firemachine69 (Feb 24, 2021)

I own a Gigabyte 6900xt OC. Thermals are horrible, it's loud, and even with an OC, it doesn't match the Ventus 3080 OC (which I briefly owned, then stupidly sold to a miner to get a trio 6800xt which I still have in my possession.) 

What is it with AMD and half-baked solutions? The 6900 series should have come with GDDR6X memory, without exception, and AMD needs to clamp down on third-party garbage iterations.


----------



## Caring1 (Feb 24, 2021)

firemachine69 said:


> I own a Gigabyte 6900xt OC. Thermals are horrible, it's loud, and even with an OC, it doesn't match the Ventus 3080 OC (which I briefly owned, then stupidly sold to a miner to get a trio 6800xt which I still have in my possession.)
> 
> What is it with AMD and half-baked solutions? The 6900 series should have came with gddr-6x memory, without exception, and AMD needs to clamp down on third-party garbage iterations.


There's a reason Gigabyte is one of the cheapest to purchase.


----------



## firemachine69 (Feb 24, 2021)

lightning70 said:


> Quality card: the cooling is muscular and beautiful, and the OC potential is good, but it's the most expensive RX 6900 XT at $1800. Is it worth the money?
> The RX 6800 XT is more affordable and available; I think it could approach the 6900 XT with a good overclock.



My Trio 6800 XT crashed hard when it hit 65 °C, without fail. But I had it running, speed-wise, well past my stock Gigabyte 6900 XT OC in terms of GPU and memory clocks (2710/2100-ish, fast memory timings, stock volts). And the 6800 XT, even OC'ed hard, unlike the stock 6900 XT, is very tolerable in terms of noise output. Too bad, because my 6900 XT is an overclocking monster with lots of room to go IF the thermals were decent (2650/2100, fast timings, 1050 mV). I've had it up to 2760/2150, sucking up power like a bat out of you-know-where.
It thermal-throttles after a few minutes in my NR400 mATX case, which never happened with the 3080/6800 XT (both MSI), no matter how hard I pushed them.


Fluffmeister said:


> Anyone able to grab a RTX 3080 early and close to MSRP won the internet.
> 
> I miss the good old days when AMD fans were outraged by a $1000 Titan card.



I actually did. That 3080 Ventus oc I picked up on preorder for $1069 cad without sales tax back in January - Yes I had a premium 3080 in my hands for about $800 USD. Trust me, it hurts to know that particular card is gone, because it OC'ed like a champ as well.



Caring1 said:


> There's a reason Gigabyte is one of the cheapest to purchase.



Unfortunately I needed a mobo at a reasonable price with 10 usb ports (and I can't stand the untidy front usb, which I relegate for usb drives only anyways). Gigabyte was the only game in town, then I wanted to match the video card, being nostalgic from the Windows XP era (took a hiatus from PC's for a decade), when gigabyte made quality stuff at a fair price. Oh how times have changed...


----------



## Pumper (Feb 25, 2021)

I just want to say that the Vsync power consumption chart is a really great addition.


----------



## W1zzard (Feb 25, 2021)

Pumper said:


> I just want to say that the Vsync power consumption chart is a really great addition.


Thanks, much appreciated. I wasn't 100% sure when I added it, but it does provide useful additional insights


----------



## brink (Feb 26, 2021)

W1zzard said:


> Thanks, much appreciated. I wasn't 100% sure when I added it, but it does provide useful additional insights


It shows how much more efficient these GPUs can be.
Since I have an "office" monitor, I'm just at 60 Hz with V-sync.
I set up my games so that I achieve 60 FPS, using 3200x1800.
Example: with my old RX 5700, which I undervolted to 0.95 V, the Radeon overlay showed a GPU usage of 90-100% in The Witcher 3, frequencies of 1600-1750 MHz, and power consumption of ~140-160 W. I was happy.
Now I have been given the RX 6900 XT and played some hours without changing a thing. The Radeon overlay shows very fluctuating GPU usage and frequencies (clearly 'underemployed') and a power consumption of 85-95 W. WOW!
But the thing is: whereas the RX 5700 was at the max of its frequency range, and so of its voltage and power usage, the RX 6900 XT sits at 0.8 V and at max power efficiency.

So thanks, W1zzard, for that addition.
That chart shows how the GPU can behave when it frequently reaches max power-efficiency levels.


----------

