
MSI Radeon RX 6900 XT Gaming X Trio

The card has three 8-pin power inputs. Per the PCIe spec, each 8-pin connector is rated for 150 W, so that's 450 W from the connectors plus 75 W from the slot, putting the budget at 525 W. The connectors and cables also carry a safety margin well beyond their rating before anything melts, so a 619 W spike is quite possible. But then again, it might be software misreporting if the reading is not taken directly from a wall outlet (which it should be).
I know it is possible to draw over 500 watts, but 619 W is more than double this card's spec-sheet power figure.
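As a quick sanity check on those numbers, here is a back-of-the-envelope sketch (assuming the PCIe spec ratings of 150 W per 8-pin connector and 75 W for the x16 slot; real hardware tolerates more):

```python
# Back-of-the-envelope check of the spec power budget for a triple 8-pin card.
# 150 W per 8-pin connector and 75 W per PCIe x16 slot are the spec ratings;
# real connectors and cables tolerate considerably more before anything melts.
EIGHT_PIN_W = 150
SLOT_W = 75

budget = 3 * EIGHT_PIN_W + SLOT_W
print(f"Spec budget: {budget} W")                    # 525 W

spike = 619
print(f"619 W spike = {spike / budget:.0%} of the spec budget")   # ~118%
```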
 
It's a software bug; it is not 619 W.
Which software? This is a physical measurement with lab equipment.

It's also card only, not whole system.

And yes, low-quality PSUs will just turn off with those spikes; happens to me all the time when testing the 6900 XT.
 
Which software? This is a physical measurement with lab equipment.

It's also card only, not whole system.

And yes, low-quality PSUs will just turn off with those spikes; happens to me all the time when testing the 6900 XT.
I just read the review of the Gigabyte card and understand now that these are spikes. The point about lower-quality PSUs does hold true, though, as some PSUs can only deliver their rated power in short bursts rather than sustained. That would mean you need a good 850 to 1000 W PSU to really have this card sing. I am so glad that I bought an HX1200i about two years ago. It is kind of sad, though, that the PC space has become even more a realm of the well-heeled, as a quality high-wattage PSU will cost you money.
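A rough illustration of that headroom math (a sketch only; the 250 W figure for the rest of the system is an assumption, and real sizing also depends on the PSU's transient response):

```python
# Illustrative PSU-headroom estimate, not a sizing guide. The 250 W figure for
# the rest of the system is an assumption; the GPU numbers come from the thread.
rest_of_system = 250      # W, CPU + board + drives (assumed)
gpu_typical    = 300      # W, 6900 XT typical board power
gpu_spike      = 619      # W, measured transient from the review

print(f"Sustained load:       {rest_of_system + gpu_typical} W")   # 550 W
print(f"Worst-case transient: {rest_of_system + gpu_spike} W")     # 869 W
# A quality 850-1000 W unit rides out the ~870 W transient; a unit that can
# only hit its label briefly may trip its protection instead.
```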
 
Which software? This is a physical measurement with lab equipment.

It's also card only, not whole system.

And yes, low-quality PSUs will just turn off with those spikes; happens to me all the time when testing the 6900 XT.

It's quite strange that the 20.8 driver would even support the Radeon RX 6900 XT.

Maybe the driver on your system is wrong, so the whole review should be redone with 21.2.2?
 
Impressed by how much faster this is than the 3080 at 4K. It doesn't draw too much more power, either. Samsung 8 nm putting in some (bad) work for Ampere.

I agree with the posters in here re: it being time to retire the 9900K in the test bench. It's holding back the higher-end cards and giving a misleading representation of performance.
 
Which software? This is a physical measurement with lab equipment.

It's also card only, not whole system.

And yes, low-quality PSUs will just turn off with those spikes; happens to me all the time when testing the 6900 XT.
We don't see that with our two (not CrossFire) RX 6900 XTs... so I think you need to retest, maybe with another system to be sure?
 
We don't see that with our two (not CrossFire) RX 6900 XTs... so I think you need to retest, maybe with another system to be sure?
How do you measure power?

Edit: are you asking about the PSU shutting off? Just buy some cheap 650 W PSU. Obviously the Seasonic in my primary test system is running fine without problems.
 
"Gaming noise levels are simply outstanding, almost whisper-quiet, which is unbelievable for a card in this performance class, and temperatures are impressive, too. Once again, MSI strikes with the perfect balance between temperatures and noise, even without going the dual-BIOS route."

This is why I buy MSI cards nowadays and always the ones with this fantastic cooler. They actually achieve the holy grail of silence with great cooling, often being the quietest of the lot. Oh and they're reliable too, unlike that Zotac GTX 1080 POS that I had a few years ago.
 
Driver versions have been added to the test setup table.
 
How do you measure power?

Edit: are you asking about the PSU shutting off? Just buy some cheap 650 W PSU. Obviously the Seasonic in my primary test system is running fine without problems.
With a professional measuring instrument. So I mean the peak wattage of 619 W; we are far from that. We tested with Adrenalin 2020 Edition 21.2.1.
We are under 390 W, and with OC under 450 W at peak, and I don't see it with other reviewers either.

It's strange...
 
With a professional measuring instrument. So I mean the peak wattage of 619 W; we are far from that. We tested with Adrenalin 2020 Edition 21.2.1.
We are under 390 W, and with OC under 450 W at peak, and I don't see it with other reviewers either.

It's strange...
What's your sample rate? You need at least 20 per second to catch those short spikes; I have 40. An oscilloscope with a current clamp could provide even more detail, but I'm not sure I want to sink a few more thousand dollars into this.

The data is very consistent here. I've tested three 6900 XT cards so far, weeks apart, all with spikes above 600 W.

I don't see it with other reviewers either.
I don't think anyone besides Igor does this kind of testing.
People using NVIDIA PCAT get only 10 samples per second, and people using a Kill A Watt get 1 or 0.5 samples per second.
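To make the sample-rate point concrete, here is a toy sketch (illustrative numbers only; it synthesizes a one-second trace with a single 60 ms spike and samples it at the rates mentioned above):

```python
import random

# Toy illustration of why sample rate matters for catching short power spikes.
# One second of per-millisecond "true" draw around 300 W, with a single 60 ms
# excursion to ~620 W. Whether a slow sampler lands inside the spike is partly
# alignment luck; an interval longer than the spike can miss it entirely.
random.seed(0)
trace = [300 + random.uniform(-10, 10) for _ in range(1000)]
for t in range(503, 563):          # 60 ms spike starting at t = 503 ms
    trace[t] = 620

def observed_peak(samples_per_second):
    step = 1000 // samples_per_second
    return max(trace[::step])

for rate in (40, 20, 10, 1):       # reviewer's rig, stated minimum, PCAT, Kill A Watt
    print(f"{rate:>2} samples/s -> observed peak {observed_peak(rate):.0f} W")
```

With this alignment, 40 and 20 samples per second catch the 620 W spike, while 10 and 1 samples per second report only the ~300 W baseline.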
 
Anyone able to grab an RTX 3080 early and close to MSRP won the internet.

I miss the good old days when AMD fans were outraged by a $1000 Titan card.
With some effort, you can still snag an FE 3000-series card from Best Buy at its original MSRP.
 
Quality card: the cooling is muscular and beautiful, and the OC potential is good, but it's the most expensive RX 6900 XT at $1,800. Is it worth the money?
The RX 6800 XT is more affordable and available; I think it could approach the 6900 XT with a good overclock.
 
Before, we tested 1080p 60 Hz; now 1440p 60 Hz.


Soon, but with a 5800X, which I will have to buy; AMD is unable to provide any samples, and the 5900X is way overpriced.
Interesting. I have an ASUS TUF Gaming RX 6900 XT, connected via DP to a 4K monitor @ 60 Hz. The Radeon overlay is showing me as low as 5 W.
I understand that this number is missing some other elements, but compared to your 28 W it seems like quite a difference.
Does that mean the GPU chip itself draws 5-7 W at idle and the rest of the board the remaining 21-23 W?
 
What memory clock do you see?
 

Attachments: PXL_20210222_130058684.jpg
Interesting. GPU-Z shows the same value? This is sitting at the desktop at 4K60? Which monitor?
 
Interesting. GPU-Z shows the same value? This is sitting at the desktop at 4K60? Which monitor?
I updated my system specs in the profile.
The monitor is an EIZO FlexScan EV3285, 32".
Just logged with GPU-Z... interesting, it reports toggling between 13 W and 5 W, while the Radeon overlay shows 5 W.

I updated my system specs in the profile.
The monitor is an EIZO FlexScan EV3285, 32".
Just logged with GPU-Z... interesting, it reports toggling between 13 W and 5 W, while the Radeon overlay shows 5 W.
Also checked the HWiNFO sensors. When the GPU goes into idle, some entries are greyed out; there seems to be no sensible reading.
Still far away from 28 W.
The card also has some LEDs; I don't know if their consumption is visible anywhere. (?)
 

Attachments: PXL_20210222_133219922.jpg, GPU-Z Sensor Log.zip, 1614011707393.png
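For anyone who wants to summarize such a log themselves, a minimal sketch follows. It assumes the exported GPU-Z sensor log is comma-separated with a header row; the filename and the "Board Power Draw [W]" column name are assumptions and vary by card and driver, so match them to your own log:

```python
import csv

# Minimal sketch for summarizing an exported GPU-Z sensor log.
# Assumption: the log is comma-separated text with a header row.
LOG_FILE  = "GPU-Z Sensor Log.txt"        # hypothetical filename
POWER_COL = "Board Power Draw [W]"        # adjust to your log's header

with open(LOG_FILE, newline="") as f:
    reader = csv.DictReader(f, skipinitialspace=True)
    watts = []
    for row in reader:
        # GPU-Z pads fields with spaces; strip keys and values before lookup.
        cleaned = {k.strip(): v.strip() for k, v in row.items()
                   if isinstance(k, str) and isinstance(v, str)}
        try:
            watts.append(float(cleaned[POWER_COL]))
        except (KeyError, ValueError):
            continue   # skip greyed-out / missing readings

if watts:
    print(f"{len(watts)} samples: min {min(watts):.1f} W, "
          f"avg {sum(watts)/len(watts):.1f} W, max {max(watts):.1f} W")
```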
I own a Gigabyte 6900 XT OC. Thermals are horrible, it's loud, and even with an OC it doesn't match the Ventus 3080 OC (which I briefly owned, then stupidly sold to a miner to get a Trio 6800 XT, which I still have in my possession).

What is it with AMD and half-baked solutions? The 6900 series should have come with GDDR6X memory, without exception, and AMD needs to clamp down on third-party garbage iterations.
 
I own a Gigabyte 6900 XT OC. Thermals are horrible, it's loud, and even with an OC it doesn't match the Ventus 3080 OC (which I briefly owned, then stupidly sold to a miner to get a Trio 6800 XT, which I still have in my possession).

What is it with AMD and half-baked solutions? The 6900 series should have come with GDDR6X memory, without exception, and AMD needs to clamp down on third-party garbage iterations.
There's a reason Gigabyte is one of the cheapest to purchase.
 
Quality card: the cooling is muscular and beautiful, and the OC potential is good, but it's the most expensive RX 6900 XT at $1,800. Is it worth the money?
The RX 6800 XT is more affordable and available; I think it could approach the 6900 XT with a good overclock.

My Trio 6800 XT crashed hard whenever it hit 65 °C, without fail. But I had it running well past my stock Gigabyte 6900 XT OC in terms of GPU and memory clocks (2710/2100-ish, fast memory timings, stock volts). And the 6800 XT, even OC'd hard, is very tolerable in terms of noise output, unlike the 6900 XT at stock. Too bad, because my 6900 XT is an overclocking monster with lots of room to go IF the thermals were decent (2650/2100, fast timings, 1050 mV). I've had it up to 2760/2150, sucking up power like a bat out of you-know-where.
It thermal-throttles after a few minutes in my NR400 mATX case, which never happened with the 3080/6800 XT (both MSI), no matter how hard I pushed them.
Anyone able to grab an RTX 3080 early and close to MSRP won the internet.

I miss the good old days when AMD fans were outraged by a $1000 Titan card.

I actually did. That 3080 Ventus OC I picked up on preorder for $1,069 CAD before sales tax back in January - yes, I had a premium 3080 in my hands for about $800 USD. Trust me, it hurts to know that particular card is gone, because it OC'd like a champ as well.

There's a reason Gigabyte is one of the cheapest to purchase.

Unfortunately, I needed a mobo at a reasonable price with 10 USB ports (and I can't stand untidy front USB, which I relegate to USB drives only anyway). Gigabyte was the only game in town, and then I wanted to match the video card, being nostalgic for the Windows XP era (I took a hiatus from PCs for a decade), when Gigabyte made quality stuff at a fair price. Oh, how times have changed...
 
I just want to say that the Vsync power consumption chart is a really great addition.
 
I just want to say that the Vsync power consumption chart is a really great addition.
Thanks, much appreciated. I wasn't 100% sure when I added it, but it does provide useful additional insights.
 
Thanks, much appreciated. I wasn't 100% sure when I added it, but it does provide useful additional insights.
It shows how much more efficient these GPUs can be.
Since I have an "office" monitor, I'm just at 60 Hz with V-sync.
I set up my games so that I achieve 60 FPS, using 3200x1800.
Example: with my old RX 5700, which I undervolted to 0.95 V, the Radeon overlay showed me in The Witcher 3 a GPU usage of 90-100%, frequencies of 1600-1750 MHz, and a power consumption of ~140-160 W. I was happy.
Now I have been given the RX 6900 XT and played for some hours without changing a thing. The Radeon overlay shows very fluctuating GPU usage and frequencies (clearly 'underemployed') and a power consumption of 85-95 W. WOW!
But the thing is: whereas the RX 5700 was at the max of its frequency range, and thus of its voltage and power usage, the RX 6900 XT sits at 0.8 V and at maximum power efficiency.

So thanks, W1zzard, for that addition.
That chart shows how the GPU can behave when it frequently reaches maximum power efficiency levels. :)
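Putting rough midpoints of those overlay readings side by side (approximate numbers from the post above, not measured data):

```python
# Quick FPS-per-watt comparison at a fixed 60 FPS V-sync cap, using rough
# midpoints of the overlay readings quoted above (approximate, not measured).
cards = {
    "RX 5700 (undervolted, 0.95 V)": 150,   # ~140-160 W reported
    "RX 6900 XT (stock, ~0.8 V)":     90,   # ~85-95 W reported
}
FPS = 60
for name, watts in cards.items():
    print(f"{name}: {watts} W -> {FPS / watts:.2f} FPS per watt")
```

By this rough measure, the 6900 XT delivers the same capped 60 FPS at about two-thirds the power, i.e. roughly 1.7x the FPS per watt.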
 