
ASRock Radeon RX 6900 XT OC Formula

W1zzard

The ASRock Radeon RX 6900 XT OC Formula is built using AMD's new Navi 21 XTXH chip, which runs much higher clocks than what the regular RX 6900 XT can achieve. In our testing, this is the first AMD card in a long time to beat NVIDIA's current-generation flagship, the RTX 3090.

 
That's a ridiculous level of performance. But it's not that great, given it's 50% more power-hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.
 
@W1zzard, how can I get a copy of GPU-Z 2.38.3, the build you're using to detect the 6900 XTXH? The current 2.38.0 doesn't detect my Red Devil XTXH, and I wanted to compare. (FYI, I did submit a copy of the 6900 XTXH Red Devil to the DB.)
 
The VRM looks like 14+2 for Vcore and SOC, with 3+2 for Vmem and VDDCI, just judging by the splits in the power planes in the pictures.
 
That's a ridiculous level of performance. But it's not that great, given it's 50% more power-hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.

I don't see the problem, nor is it a step back: it's 50% more power-hungry than a 2080 Ti, but it's also 50% faster, so efficiency remains the same. AMD wasn't there when the 2080 Ti was released; Nvidia was. Now AMD has caught up and Nvidia is still in the same place, so I don't understand how you can reason that this isn't progress on AMD's part. I don't see an argument against it: if someone has a Ti or any Nvidia card and wants a larger performance envelope, it's a side-grade efficiency-wise but a large bump in performance; if that person owns a prior-gen AMD card, it's a bump in both metrics.
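To put the arithmetic in one place: perf/watt only improves if performance grows faster than power. A minimal sketch (the 1.5x figures are just the round numbers being thrown around in this thread, not measured data):

```python
# Relative efficiency = relative performance / relative power.
# Illustrative numbers only: +50% performance at +50% power.
def perf_per_watt(relative_perf: float, relative_power: float) -> float:
    """Efficiency relative to a baseline card (baseline = 1.0)."""
    return relative_perf / relative_power

baseline = perf_per_watt(1.0, 1.0)  # e.g. a 2080 Ti as the reference
newer = perf_per_watt(1.5, 1.5)     # ~50% faster at ~50% more power
print(baseline, newer)              # 1.0 1.0 -> same perf/watt, more absolute perf
```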
 
Progress is meant to bring efficiency
Look one more time at page 35 and find out who's capping consumer GPU perf/watt at the moment.
On a side note, the TUF 1650 Super I bought for 150 bucks in October was a 'hooo lala' for a mid-tier rig with a cheap PSU and electricity concerns.

One good thing you're doing is skipping one heck of an expensive GPU lineup! Any acquisition leads to further price hikes, so... :peace:
 
Look one more time at page 35 and find out who's capping consumer GPU perf/watt at the moment.
On a side note, the TUF 1650 Super I bought for 150 bucks in October was a 'hooo lala' for a mid-tier rig with a cheap PSU and electricity concerns.

One good thing you're doing is skipping one heck of an expensive GPU lineup! Any acquisition leads to further price hikes, so... :peace:

He's clearly saying that's not enough. From what I understand, he wants it to sip a level of power similar to that of a 2080 Ti while offering the current amount of performance, which is unrealistic considering not much has changed technology-wise since the 2080 Ti to warrant such gains.
 
Progress is meant to bring efficiency, not brute force.
"Should be" and "is" are as far apart from each other in GPU market as possible. Just check how high-end GPU coolers looked 15 years ago - small heatsink with 40mm fan installed. Each generation of GPU brings more efficiency, but also raw power increase over the said efficiency gains, that's why we ended up with chungus cards right now.
 
Very good review. Really nice card. I'm really impressed. Expensive as all hell, like everything else. Would like to see this compared to the other XTX models. Hope you get all of them to test.
 
That's a ridiculous level of performance. But it's not that great, given it's 50% more power-hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.
AMD loves to ignore power efficiency and overvolt/overclock everything by default. You certainly won't find efficiency in a factory OC version of a product whose stock version is already so far beyond the efficiency sweet spot on the clock/voltage curve!

Undervolt this and you'll likely see what you're looking for, but this isn't the version you should be undervolting, because it's designed from the ground up to handle 400 W. If you really want the most efficiency AMD has to offer, you need to get the reference 6900 XT and undervolt it to 0.95 V or so, which will net you the performance of a good factory-OC 6800 XT but at around 180 W; those last 200 MHz cost well over 100 W.
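For anyone wondering why those last 200 MHz are so expensive: dynamic power scales roughly with V² × f (the classic CMOS rule of thumb), so the voltage bump needed to hold the extra clocks dominates. A rough sketch with assumed voltages and clocks, not figures measured from this card:

```python
# Scale a baseline board power by the V^2 * f rule of thumb.
# All inputs below are illustrative assumptions, not measurements.
def dynamic_power(base_w, base_v, base_mhz, new_v, new_mhz):
    return base_w * (new_v / base_v) ** 2 * (new_mhz / base_mhz)

# Hypothetical: 250 W at 0.95 V / 2300 MHz, pushed to 1.15 V / 2500 MHz
print(round(dynamic_power(250, 0.95, 2300, 1.15, 2500)))  # ~398 W
```

The ~200 MHz gain costs roughly 150 W in this toy example, which is why undervolting pays off so disproportionately.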
 
Fast card, but wow, the heat, power consumption, and price to get those last few %! 391 W gaming and 600+ W spikes, and a nuclear hole in your wallet. Yikes.

Still, good on you, AMD, for making the darn thing in the first place, because why not, I guess.
 
That's a ridiculous level of performance. But it's not that great, given it's 50% more power-hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.
Ever since ray tracing came along, energy efficiency has taken a step backwards; the reason the 3080 is faster than a 2080 Ti is that it consumes more power. Now 350 W consumer GPUs are becoming mainstream, which further increases cooler size and weight and PSU requirements. It was this gen that we started to see 4-slot coolers, and that tells you something.

"Should be" and "is" are as far apart from each other in GPU market as possible. Just check how high-end GPU coolers looked 15 years ago - small heatsink with 40mm fan installed. Each generation of GPU brings more efficiency, but also raw power increase over the said efficiency gains, that's why we ended up with chungus cards right now.
And that is very bad. When triple-slot coolers started to become common, motherboard vendors made metal-reinforced PCIe slots; now we see GPUs that come with support brackets in the box. There should be a limit, otherwise this ridiculousness won't stop.

AMD loves to ignore power efficiency and overvolt/overclock everything by default. You certainly won't find efficiency in a factory OC version of a product whose stock version is already so far beyond the efficiency sweet spot on the clock/voltage curve!
I believe the only reason AMD managed to catch up to Nvidia in efficiency this gen is that Nvidia hasn't made much improvement in power efficiency since the RTX 2000 series; all their R&D efforts have gone into RT cores and Tensor cores. Gaming performance per watt hasn't improved much; what they did instead was increase power draw from 250 W to 350 W.

Fast card, but wow, the heat, power consumption, and price to get those last few %! 391 W gaming and 600+ W spikes, and a nuclear hole in your wallet. Yikes.

Still, good on you, AMD, for making the darn thing in the first place, because why not, I guess.
Yeah, all of us who said 650 W was good enough for single-GPU high-end systems look like fools right now, but some of the blame goes to AMD and mostly Nvidia, for not making the generational improvements in efficiency that we've come to expect.
 
all of us who said 650 W was good enough for single-GPU high-end systems look like fools right now
I've had a 3080 since launch day, using a Corsair SF600 Gold PSU with no issues whatsoever; the whole system has never pulled more than 500 W from the wall, and as you can see from the charts in this review, the consumption and spikes are much more manageable on a 3080. The 3090, and even more so the 6900 XT, though... dayum.

Of course, with Ampere, and to some extent RDNA2 too, undervolting is always an option to pump up efficiency and lower overall consumption, but that is definitely not the point of the card in this review.
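If you want to sanity-check a PSU against those spikes, the back-of-the-envelope version looks like this. All component numbers below are assumptions based on this thread, not the review's measured data; quality units typically ride out millisecond transients above their label, which is why the SF600 survives:

```python
# Back-of-the-envelope PSU headroom check with assumed numbers.
psu_rating_w = 600        # e.g. a Corsair SF600
rest_of_system_w = 130    # assumed CPU + board + drives under gaming load
gpu_sustained_w = 320     # approximate 3080 gaming draw
gpu_spike_w = 500         # brief transient spike, approximate

sustained = rest_of_system_w + gpu_sustained_w
worst_case = rest_of_system_w + gpu_spike_w
print(f"sustained ~{sustained} W, spikes ~{worst_case} W, PSU rated {psu_rating_w} W")
```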
 
Yeah, all of us who said 650 W was good enough for single-GPU high-end systems look like fools right now, but some of the blame goes to AMD and mostly Nvidia, for not making the generational improvements in efficiency that we've come to expect.

Is it blame if the expectations are unreasonable?
 
That's a ridiculous level of performance. But it's not that great, given it's 50% more power-hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.


If you want energy efficiency, buy an RX 6800 non-XT, or undervolt your $3000 GPU.
Just by checking the TPU chart there, you can clearly spot the efficiency winner.

Top-of-the-line GPUs are always expected to throw efficiency out the window to squeeze out the last bit of performance.
 
I believe the only reason AMD managed to catch up to Nvidia in efficiency this gen is that Nvidia hasn't made much improvement in power efficiency since the RTX 2000 series; all their R&D efforts have gone into RT cores and Tensor cores. Gaming performance per watt hasn't improved much; what they did instead was increase power draw from 250 W to 350 W.
We won't know for sure until AMD and Nvidia use the same foundry again, but it looks like a lot of the blame for Ampere's silly power draw falls on Samsung.

Extrapolating clock speeds, voltages, and transistor count from the RTX 2000 series, if Ampere were made on TSMC 7 nm, it would most likely use at least 20% less power.
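The same V² rule of thumb makes that extrapolation easy to sanity-check: if a TSMC 7 nm port could hold the same clocks at, say, 10% lower voltage, that alone is roughly a 19% dynamic-power saving. The 10% voltage figure is an assumption, not a published number:

```python
# Sanity check of the "at least 20% less power" estimate via V^2 scaling.
voltage_scale = 0.90              # assumed: same clocks at 90% of the voltage
power_scale = voltage_scale ** 2  # dynamic power ~ V^2 at a fixed clock
print(f"~{1 - power_scale:.0%} less power")  # -> ~19% less power
```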
 
Kind of a pity it isn't called Navi XDXD, a laughing face at Nvidia and the current price absurdity.
 
Very happy to see ASRock bringing their OC Formula line back to life; this 6900 XT does not disappoint!
It's a crazy price indeed, but these cards aren't meant for normal people :D
I wonder what the BIOS power limit is for the GPU alone, 320 W?
 
Thanks for another very detailed and well-written review :)

Also great to see how it performs. It would be awesome to see it in a "normal market" where prices are dictated only by MSRP and "what's hot", not the craziness we have right now, but sadly we can't change that.
 
"Overclocking requires power limit increase"

So does every single Nvidia card since 2017, and AMD as well. Please make sure to add this to Nvidia card reviews too.
 
He's clearly saying that's not enough. From what I understand, he wants it to sip a level of power similar to that of a 2080 Ti while offering the current amount of performance, which is unrealistic considering not much has changed technology-wise since the 2080 Ti to warrant such gains.
It seems like efficiency wasn't the purpose of this specific model, hence the 'OC Formula' naming. The chart puts the original 6900 XT ahead of the AIB cards, and that has been the case forever with OC/maxed-out versions, am I right? I'd expect that level of efficiency (2080 Ti power for flagship-class performance) from the 2022-2024 consoles (Pro models for PlayStation, or an actual refresh of this odd generation for both Xbox and PlayStation) or from 2024-2026 GPUs. Till then, don't buy some 800-1200 W PSU for these beta monsters of computing...
It's the same principle you see in tuned cars: you do it to squeeze out the last drop of potential performance while neglecting power consumption and the additional consumables required to make it work. You can't demand a Toyota Prius' fuel efficiency from an SSC Tuatara. This card was clearly made to beat the 3090's rasterization performance at any resolution, and my guess is it's going to do it at 8K too over time, as new drivers squeeze it even tighter.
 