ASRock Radeon RX 6900 XT OC Formula

That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to skip this generation.
OC Formula. Says it right on the box. This is the card DESIGNED to remove as many power limits as possible. This is not what you want.
 
W1z - Is this being tested against a 3090 reference card, or an OEM?
 
Navi21 is a great chip. When tuned for efficiency it is the best; when tuned for performance it is the fastest at 4K, since at lower resolutions the stock reference model already wins.
 
For $2000, no thanks. That RT performance is dog poo poo. If I wanted to play pure rasterization games, I would be fine with a lower-tier card. Flagship GPUs are supposed to be good at everything, feature set included.

@W1zzard Would you be using the Metro Exodus Enhanced Edition going forward?

when tuned for performance it is the fastest at 4K
When tuned for performance it is the fastest at 4K in rasterization only.

Again, I'm not paying $2000 for a pure rasterization GPU in this day and age.
 
Again, I'm not paying $2000 for a pure rasterization GPU in this day and age.
I wouldn't pay over $500 for any GPU, while you would pay four times more to have RTX on for the few games that support it. OK.
 
W1z - Is this being tested against a 3090 reference card, or an OEM?
3090 FE, all my comparison cards are reference design

Would you be using the Metro Exodus Enhanced Edition going forward?
Undecided yet, mostly because there is no way to turn off RT. So I might stick with normal Metro, so I can report on the RT performance penalty, which is important.
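(For reference, that penalty is just the relative FPS drop between RT off and RT on. A minimal Python sketch with made-up numbers; real figures would come from the benchmark runs:)

[CODE=python]
# Hypothetical FPS numbers, only to show the math; real figures come from testing.
fps_raster = 90.0  # average FPS with RT off (made-up value)
fps_rt = 54.0      # average FPS with RT on (made-up value)

# Penalty = relative FPS drop when enabling RT
penalty_pct = (1 - fps_rt / fps_raster) * 100
print(f"RT performance penalty: {penalty_pct:.1f}%")  # -> 40.0%
[/CODE]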

That RT performance is dog poo poo
Is landing between the RTX 3070 and RTX 3080 really so bad?
 
I wouldn't pay over $500 for any GPU, while you would pay four times more to have RTX on for the few games that support it. OK.

They priced it as a flagship without the quality of a premium flagship. Only die-hard fans would get this over a 3090.

It's not just me: the 6900 XT has quite a lot of stock at my local Micro Center. At over $2000 a piece, nobody in their sane mind would buy these. Hence all the available 6900 XT stock; they are bad value.

Oh, and they are also not good at mining, cannot do AI/ML, and cannot be used for scientific computing (ROCm does not support Navi2X).

So basically the world's fastest rasterization GPU and just that.



Is landing between the RTX 3070 and RTX 3080 really so bad?


For a $2000 GPU, yup, that is not good.

Also, PLEASE IMPROVE your RT graphs. They're too busy to get any useful information out of.

Something like this

[attached: stacked bar chart example]
 
So basically the world's fastest rasterization GPU and just that.
Nice! At least AMD GPUs, with their lower mining and greater gaming potential, will drop sooner and closer to their MSRPs.
 
For a $2000 GPU, yup, that is not good.
Fair point

Also, PLEASE IMPROVE your RT graphs. They're too busy to get any useful information out of.

Something like this
My charting engine doesn't allow that. Not sure it would be useful, given the way we have the text over the bar rather than outside of it.
 
Nice! At least AMD GPUs, with their lower mining and greater gaming potential, will drop sooner and closer to their MSRPs.

greater rasterization gaming potential

But sure, if they can go back to MSRP faster, then some folks would be happy. The reality is that AMD's board partners are happily jacking prices up to insane levels as well.

Fair point


My charting engine doesn't allow that. Not sure it would be useful, given the way we have the text over the bar rather than outside of it.

I know it probably will never be fixed. But man, that graph is just so busy it's nearly impossible to understand. Well, unless it is intended to confuse folks.

Can you do contrasting colors at least, between RT and non-RT? That way the results would quickly pop out for people to grasp.

Or group the results by GPU instead of sorting by FPS from small to big. Showing them in groups would be more informative, something like the sketch below.
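A rough matplotlib sketch of what I mean; the GPU names, FPS figures, and groupings are all made up purely for illustration:

[CODE=python]
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical data for illustration only; real values would come from the review.
gpus = ["RX 6900 XT", "RTX 3080", "RTX 3090"]
fps_rt_off = [95, 88, 102]  # made-up 4K averages, RT off
fps_rt_on = [42, 58, 66]    # made-up 4K averages, RT on

x = np.arange(len(gpus))  # one group per GPU
width = 0.35              # width of each bar

fig, ax = plt.subplots()
# Complementary colors (blue vs. orange) so the RT on/off pairs pop out
ax.bar(x - width / 2, fps_rt_off, width, label="RT off", color="tab:blue")
ax.bar(x + width / 2, fps_rt_on, width, label="RT on", color="tab:orange")

ax.set_xticks(x)
ax.set_xticklabels(gpus)
ax.set_ylabel("Average FPS")
ax.set_title("RT on vs. off, grouped by GPU (hypothetical data)")
ax.legend()
plt.show()
[/CODE]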
 
I know it probably will never be fixed
Never say never. I wrote every bit of that charting engine, so at least I don't have to bother with externals :)

Can you do contrasting colors at least, between RT and non-RT?
[attached: screenshot of the current RT chart]

already have contrasting colors?
 
already have contrasting colors?


No. Blue and green are NOT contrasting colors. Contrasting colors sit on opposite ends of the color wheel: the opposite of blue is orange, and the opposite of green is red. You have blue and green, which are not a contrasting pair.

Also, you have grey in there as well, making it a three-color scheme instead of two contrasting colors. The information you want to convey is RT on vs. off, and with your current color scheme that is not coming through.

I highly recommend you give this a read. You have great RT data, but the presentation and execution have a lot of room for improvement.

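If it helps, "opposite end of the color wheel" is literally a 180° hue rotation. A quick Python sketch using only the standard library's colorsys (note the RGB hue wheel puts yellow opposite blue, while the traditional painter's wheel says orange; either way it's the far side of the wheel, not a neighbor like green):

[CODE=python]
import colorsys

def complement(r, g, b):
    """Return the complementary color: the hue rotated 180 degrees.

    Inputs and outputs are RGB floats in [0, 1].
    """
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return colorsys.hls_to_rgb((h + 0.5) % 1.0, l, s)

# Pure blue -> yellow on the RGB hue wheel (the painter's RYB wheel
# would pair blue with orange instead).
print(complement(0.0, 0.0, 1.0))  # -> roughly (1.0, 1.0, 0.0)
[/CODE]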
 
So basically the world's fastest rasterization GPU and just that.
Consumerism is at its peak: selling one product on a single top-tier feature, propped up by so-called PCMR heads with questionable 'transaction' histories, aka boomers/flexers or even rich kids.
But if the price stays stable at the top, the trend should stabilize, hundred by hundred, down to the cheapest (??? $) model available/in production.

Betting that the next time GPUs go cheap, it won't be the same as the last crypto bubble. They'll be stacking more and more...
 
That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to skip this generation.

Why? Nvidia clearly didn't. Although it looks like AMD pushed things a little extra.

However, I would have liked them to not even bother, to see if it kept Nvidia guessing.
 
Brilliant review as always @W1zzard :)
 
3090 FE, all my comparison cards are reference design

Based on the review, for users who own an OEM 3090, i.e., the one listed in my system specs, I don't see a significant advantage to owning the 6900 XTXH over the 3090 OEM aside from slight FPS gains on the 6900 XT. Those gains could diminish or grow as developers adopt AMD technology across console and PC platforms. I may still give the overall advantage to the 3090 OEM due to the 8 GB of extra GDDR6X memory, which could help with future-proofing.

I have also been loyal to AMD graphics technology over the last decade... I wanted to purchase a 6900 XT, but could not find one in stock.

W1zz, don't you think 16 GB versus the 3090's 24 GB should count as a negative?
 
That card is readily available at Germany's Mindfactory.de. 5 pcs left.
Price: 2090 euros, a little cheaper than your average AIB RTX 3090 right now. But the RTX 3090 has the DLSS advantage. If the upcoming FSR performs at all well, then maybe the €2090 price tag will soon be kind of OK.
And I use this "OK" very, very lightly; no gaming card should cost over €1000! But with €2090 spent you get basically the fastest card at all resolutions.
Only 4K is hit and miss, but it pounds the RTX 3090 at 1080p/1440p high-refresh-rate gaming.

Who would have thought in summer 2020, when the RTX 3090 launched, that AMD would be capable of beating Nvidia at the highest level!?
 
Only 4K is hit and miss
4K has been hit or miss since forever.
The GTX 1080 was billed as the first "4K gaming champion" and yet in games of its day, framerates were in the 30s and 40s.
4K30 is only suitable for certain types of games, and most people would choose 1080p60 over 4K30 because it feels better and makes the experience more enjoyable.
 
4K has been hit or miss since forever.

4K isn't hit or miss with either card; you are getting a minimum of 60 FPS in the majority of games tested, with the exception of one or two.

The 6900 XTXH is faster than the 3090 FE in relative performance across 1080p, 1440p, and 4K. The only downfall is that you are left with 8 GB less GDDR on the 6900 XTXH versus the 3090. DLSS is great, but the developer must support it through training, and the list of supporters isn't that large. For example, in Cyberpunk you have to enable DLSS to achieve 60 FPS with all settings maxed. AMD is releasing a more open competitor, possibly without training, as soon as next month.
 
4K isn't hit or miss with either card; you are getting a minimum of 60 FPS in the majority of games tested, with the exception of one or two.
I really hope AMD's DLSS alternative is something that developers can enable easily through the driver, rather than something that, like DLSS, requires dedicated collaboration with Nvidia to get working on a per-title basis.

4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with Nvidia it's only possible to hit 4K60 in certain AAA titles with DLSS enabled, at which point you need to ask yourself if that's really 4K. Sure, most older/lighter games work at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a fantastic job on the latest and greatest games at the best graphical settings. If you're not setting that as your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070 Super will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!
 
That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!

I still have no idea why people like to quote the current price of the 3090 when they were selling for MSRP +10% for months. Most people willing to fork out $3000 for a 3090 are likely using them for mining; heck, a 3090 makes $30/day mining at the moment.

The only bad thing about owning a 3090 is wishing you had bought more of them when they were readily available for $1600-1800 back in 2020 :roll:. I'm regretting that I only bought one myself.
 
I still have no idea why people like to quote the current price of the 3090
We're quoting the current 3090 price because the review card in question was only launched a few days ago.
When the 3090 was 'only' $1800, the XTXH chips didn't even exist.
 
4K makes sense if you can leverage VRS and/or DLSS.

I spent $1700 on my 3090. I bought it at MSRP. They are still available at MSRP, through the Newegg Shuffle only, although I purchased mine prior to the Shuffle on Newegg.

Again, I don't agree that these cards aren't capable of good 4K performance without DLSS or FidelityFX. They are, and I use mine every day on my 4K PC. Further, the proof is in the review: look at the AVERAGE FPS across the suite of tested 4K titles. The exception is one or two games, one of them being Cyberpunk. People are trying to act like Cyberpunk is the bar for AAA 4K; Cyberpunk is an un-optimized, bug-ridden game that hit the market way too early. I have the game.

Anyhow, I decided to sell my PNY RTX 3090 for a PowerColor 6900 XTXH Ultimate. There currently aren't games on the market that take advantage of more than 10-12 GB of VRAM, and I don't see that changing over the next 3 years either. Outside of future memory limits, performance is better on my PowerColor 6900 XTXH Ultimate than on my PNY RTX 3090. My 6900 XTXH auto-overclocks to 2650 MHz.


 
W1z - Could you provide your Wattman settings for the overclock section?
 
W1z - Could you provide your Wattman settings for the overclock section?
Power limit at max, memory at whatever is stable (2160), GPU min at default, GPU max at highest stable (2850).
 