# ASRock Radeon RX 6900 XT OC Formula



## W1zzard (May 6, 2021)

The ASRock Radeon RX 6900 XT OC Formula is built around AMD's new Navi 21 XTXH chip, which runs at much higher clocks than the regular RX 6900 XT can achieve. In our testing, this is the first AMD card in a long time to beat NVIDIA's current-generation flagship, the RTX 3090.

*Show full review*


----------



## the54thvoid (May 6, 2021)

That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.


----------



## Cybrnook2002 (May 6, 2021)

@W1zzard, how can I get a copy of GPU-Z 2.38.3, the build you are using to detect the 6900 XTXH? The current 2.38.0 does not detect my Red Devil XTXH and I wanted to compare. (FYI, I did submit a copy of the 6900 XTXH Red Devil to the DB.)


----------



## buildzoid (May 6, 2021)

The VRM looks like 14+2 for Vcore and SOC, with 3+2 for Vmem and VDDCI, judging by the splits in the power planes in the pictures.


----------



## Totally (May 6, 2021)

the54thvoid said:


> That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.



I don't see the problem, nor is it a step back: it's 50% more power hungry than a 2080 Ti, but it's also 50% faster, so efficiency remains the same. AMD wasn't there when the 2080 Ti was released; NVIDIA was. Now AMD has caught up and NVIDIA is still in the same place, so I don't understand how you can reason that it isn't progress on AMD's part. I don't see an argument against it: if someone has a Ti or any NVIDIA card and wants a larger performance envelope, it's a side-grade efficiency-wise but a large bump in performance; if that person owns a prior-gen AMD card, it's a bump in both metrics.
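The perf-per-watt arithmetic in the post above is easy to sanity-check. A minimal sketch with hypothetical normalized figures (2080 Ti baseline set to 1.0x performance at 1.0x power; these are illustrative round numbers, not measured data from the review):

```python
# Hypothetical normalized figures for illustration only - not measured values.
base_perf, base_power = 1.0, 1.0   # 2080 Ti baseline (normalized)
new_perf, new_power = 1.5, 1.5     # +50% performance at +50% power draw

# Efficiency = performance per unit of power.
base_eff = base_perf / base_power
new_eff = new_perf / new_power

print(base_eff == new_eff)  # True: equal relative gains leave perf/W unchanged
```

Equal percentage gains in performance and power cancel out, which is exactly the "side-grade efficiency-wise" point.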


----------



## randompeep (May 6, 2021)

the54thvoid said:


> Progress is meant to bring efficiency


Look again at page 35 and see who's topping consumer GPU perf/watt at the moment.
On a side note, a TUF 1650 Super I bought for 150 bucks in October was a great pick for a mid-tier rig with a cheap PSU and electricity concerns.

One good thing you're doing is skipping one heck of an expensive GPU lineup! Any purchase leads to further price hikes, so...


----------



## Totally (May 6, 2021)

randompeep said:


> Look again at page 35 and see who's topping consumer GPU perf/watt at the moment.
> On a side note, a TUF 1650 Super I bought for 150 bucks in October was a great pick for a mid-tier rig with a cheap PSU and electricity concerns.
> 
> One good thing you're doing is skipping one heck of an expensive GPU lineup! Any purchase leads to further price hikes, so...



He's clearly saying that's not enough. From what I understand, he wants it to sip a level of power similar to that of a 2080 Ti while offering the current amount of performance, which is unrealistic considering not much has changed technology-wise since the 2080 Ti to warrant such gains.


----------



## Mescalamba (May 6, 2021)

A bit like a V8: while I do like them, my wallet doesn't.


----------



## Dammeron (May 6, 2021)

the54thvoid said:


> Progress is meant to bring efficiency, not brute force.


"Should be" and "is" are about as far apart in the GPU market as possible. Just check how high-end GPU coolers looked 15 years ago: a small heatsink with a 40 mm fan. Each generation of GPUs brings more efficiency, but also a raw power increase on top of those efficiency gains; that's why we ended up with chungus cards right now.


----------



## ODOGG26 (May 7, 2021)

Very good review. Really nice card. I'm really impressed. Expensive as all hell like everything else. Would like to see this compared to the other XTX models. Hope you get all of them to test.


----------



## Chrispy_ (May 7, 2021)

the54thvoid said:


> That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.


AMD loves to ignore power efficiency and overvolt/overclock everything by default. You certainly won't find efficiency in a factory OC version of a product whose stock version is already so far beyond the efficiency sweet spot on the clock/voltage curve!

Undervolt this and you'll likely see what you're looking for, but this isn't the version you should be undervolting, because it's designed from the ground up to handle 400 W. If you really want the most efficiency AMD has to offer, get the reference 6900 XT and undervolt it to 0.95 V or so, which will net you the performance of a good factory-OC 6800 XT at around 180 W; those last 200 MHz cost well over 100 W.


----------



## wolf (May 7, 2021)

Fast card, but wow, the heat, power consumption, and price to get those last few percent! 391 W in gaming and 600+ W spikes, and a nuclear hole in your wallet, yikes.

Still, good on you, AMD, for making the darn thing in the first place, because why not, I guess.


----------



## Solid State Soul ( SSS ) (May 7, 2021)

the54thvoid said:


> That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.


Ever since ray tracing was introduced, energy efficiency has taken a step backwards; the reason the 3080 is faster than a 2080 Ti is that it consumes more power. Now 350 W consumer GPUs are becoming mainstream, which further increases the size and weight of coolers and PSU requirements. It was this generation that we started to see 4-slot coolers, and that tells you something.



Dammeron said:


> "Should be" and "is" are about as far apart in the GPU market as possible. Just check how high-end GPU coolers looked 15 years ago: a small heatsink with a 40 mm fan. Each generation of GPUs brings more efficiency, but also a raw power increase on top of those efficiency gains; *that's why we ended up with chungus cards right now.*


And that is very bad. When triple-slot coolers started to become common, motherboard vendors made metal-reinforced PCIe slots; now we see GPUs that come with support brackets in the box. There should be a limit, otherwise this ridiculousness won't stop.



Chrispy_ said:


> AMD loves to ignore power efficiency and overvolt/overclock everything by default. You certainly won't find efficiency in a factory OC version of a product whose stock version is already so far beyond the efficiency sweet spot on the clock/voltage curve!


I believe the only reason AMD managed to catch up to NVIDIA in efficiency this gen is that NVIDIA hasn't made much improvement in power efficiency since the RTX 2000 series. All their R&D effort has gone to RT cores and Tensor cores, while gaming performance per watt hasn't improved much; what they did instead was increase power draw from 250 W to 350 W.



wolf said:


> Fast card, but wow, the heat, power consumption, and price to get those last few percent! 391 W in gaming and 600+ W spikes, and a nuclear hole in your wallet, yikes.
> 
> Still, good on you, AMD, for making the darn thing in the first place, because why not, I guess.


Yeah, all of us who said 650 W was good enough for single-GPU high-end systems look like fools right now, but some of the blame goes to AMD and mostly NVIDIA for not making the generational improvements in efficiency that we've come to expect.


----------



## wolf (May 7, 2021)

Solid State Soul ( SSS ) said:


> all of us who said 650 W was good enough for single-GPU high-end systems look like fools right now


I've had a 3080 since launch day on a Corsair SF600 Gold PSU with no issues whatsoever; the whole system has never pulled more than 500 W from the wall, and as you can see from the charts in this review, consumption and spikes are much more manageable on a 3080. The 3090, and even more so the 6900 XT, though... dayum.

Of course, with Ampere, and to some extent RDNA2 as well, undervolting is always an option to pump up efficiency and lower overall consumption, but that is definitely not the point of the card in this review.


----------



## Totally (May 7, 2021)

Solid State Soul ( SSS ) said:


> Yeah, all of us who said 650 W was good enough for single-GPU high-end systems look like fools right now, but some of the blame goes to AMD and mostly NVIDIA for not making the generational improvements in efficiency that we've come to expect.



Is it blame if the expectations are unreasonable?


----------



## Crackong (May 7, 2021)

the54thvoid said:


> That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.




If you want energy efficiency, buy an RX 6800 non-XT, or undervolt your $3000 GPU.
Just by checking the TPU efficiency chart you can clearly spot the winner.

Top-of-the-line GPUs are always expected to throw efficiency out of the window to squeeze out the last bit of performance.


----------



## Chrispy_ (May 7, 2021)

Solid State Soul ( SSS ) said:


> I believe the only reason AMD managed to catch up to Nvidia in efficiency this gen, is because Nvidia have not made that much improvements in power efficiency since RTX 2000 series, all there R&D efforts have gone to RT cores and Tensore cores, however gaming performance for watt have been not so much improved, and what they did instead, increased power draw from 250w to 350w.


We won't know for sure until AMD and NVIDIA use the same foundry again, but it looks like a lot of the blame for Ampere's silly power draw falls on Samsung.

Extrapolating clock speeds, voltages, and transistor count from the RTX 2000 series, if Ampere were made on TSMC 7 nm it would most likely use at least 20% less power.


----------



## Solid State Soul ( SSS ) (May 7, 2021)

Crackong said:


> Top of the line GPUs are always expected to throw efficiency out of the window to squeeze out the last bit of performance.


250 W used to be what high-end GPUs drew; now it goes up to 390 W!!!!!

That's madness


----------



## Crackong (May 7, 2021)

Solid State Soul ( SSS ) said:


> 250 W used to be what high-end GPUs drew; now it goes up to 390 W!!!!!
> 
> That's madness



My thoughts are the same.
So I bought an RX 6800 non-XT.


----------



## altermere (May 7, 2021)

Kinda a pity it isn't called Navi XDXD - a laughing face at NVIDIA and the current price absurdity.


----------



## lukart (May 7, 2021)

Very happy to see ASRock bringing its OC Formula line back to life; this 6900 XT does not disappoint!
It's a crazy price indeed, but these cards also aren't meant for normal people.
I wonder what the BIOS power limit is for the GPU alone - 320 W?


----------



## Arcdar (May 7, 2021)

Thanks for another very detailed and well written review 

Also great to see how it performs. It would be awesome to see it in a "normal" market where prices are dictated only by MSRP and what's hot, not the craziness we have right now, but sadly we can't change that.


----------



## 80251 (May 7, 2021)

And as to availability I guess no one has to ask...


----------



## cellar door (May 7, 2021)

"Overclocking requires power limit increase"

So has every single NVIDIA card since 2017, and AMD cards as well. Please make sure to add this note to NVIDIA card reviews too.


----------



## randompeep (May 7, 2021)

Totally said:


> He's clearly saying that's not enough. From what I understand, he wants it to sip a level of power similar to that of a 2080 Ti while offering the current amount of performance, which is unrealistic considering not much has changed technology-wise since the 2080 Ti to warrant such gains.


It seems efficiency wasn't the purpose of this specific model, hence the 'OC Formula' naming. The chart puts the original 6900 XT ahead of AIB cards in efficiency, and that has been the case forever for OC/maxed-out versions, am I right? I'd expect that level of efficiency (2080 Ti power for a flagship's performance) in the 2022-2024 consoles or the 2024-2026 GPUs. Till then, don't buy an 800-1200 W PSU for these beta monsters of computing...
It's the same principle you see in tuned cars: you do it to squeeze out the last drop of potential performance while neglecting fuel consumption and the extra consumables needed to keep it running. You can't demand Toyota Prius fuel efficiency from an SSC Tuatara. This card was clearly made to beat the 3090's rasterization performance at any resolution, and my guess is it will do it at 8K too over time, as new drivers squeeze it even tighter.


----------



## Redwoodz (May 7, 2021)

the54thvoid said:


> That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.


OC Formula - it says it right on the box. This is the card designed to remove as many power limits as possible. This is not the card you want.


----------



## Ravenas (May 7, 2021)

W1z - Is this being tested against 3090 reference card, or an OEM?


----------



## HD64G (May 7, 2021)

Navi 21 is a great chip. When tuned for efficiency it is the best; when tuned for performance it takes the lead at 4K, since at lower resolutions the stock reference model already wins.


----------



## xkm1948 (May 7, 2021)

For $2000, no thanks. That RT performance is dog poo poo. If I wanted to play pure rasterization games, I'd be fine with a lower-tier card. Flagship GPUs are supposed to be good at everything, including the feature set.

@W1zzard Would you be using the Metro Exodus Enhanced Edition going forward?



HD64G said:


> Navi 21 is a great chip. When tuned for efficiency it is the best; when tuned for performance it takes the lead at 4K, since at lower resolutions the stock reference model already wins.


when tuned for performance it gets the best @4K *in rasterization* only

Again, I'm not paying $2000 for a pure rasterization GPU in this day and age.


----------



## HD64G (May 7, 2021)

xkm1948 said:


> For $2000, no thanks. That RT performance is dog poo poo. If I wanted to play pure rasterization games, I'd be fine with a lower-tier card. Flagship GPUs are supposed to be good at everything, including the feature set.
> 
> @W1zzard Would you be using the Metro Exodus Enhanced Edition going forward?
> 
> ...


I wouldn't pay over $500 for any GPU, while you would pay four times more to have RTX on for the few games that support it. OK.


----------



## W1zzard (May 7, 2021)

Ravenas said:


> W1z - Is this being tested against 3090 reference card, or an OEM?


3090 FE, all my comparison cards are reference design



xkm1948 said:


> Would you be using the Metro Exodus Enhanced Edition going forward?


Undecided yet, mostly because there is no way to turn off RT, so I might stick with normal Metro so I can report on the RT performance penalty, which is important.



xkm1948 said:


> That RT performs is dog poo poo


It lands between the RTX 3070 and RTX 3080 - is that so bad?


----------



## xkm1948 (May 7, 2021)

HD64G said:


> I wouldn't pay over $500 for any GPU, while you would pay four times more to have RTX on for the few games that support it. OK.



They priced it as a flagship without the quality of a premium flagship. Only die-hard fans would get this over a 3090.

It's not just me - the 6900 XT has quite a lot of stock at my local Micro Center. At over $2000 apiece, nobody in their sane mind would buy these; hence all the available 6900 XT stock, because they are bad value.

Oh, and they are also not good at mining, cannot do AI/ML, and cannot be used for scientific computing (ROCm does not support Navi 2x).

So, basically, the world's fastest rasterization GPU - and just that.









W1zzard said:


> 3090 FE, all my comparison cards are reference design
> 
> 
> Undecided yet, mostly because there is no way to turn off RT, so I might stick with normal Metro so I can report on the RT performance penalty, which is important.
> ...




For a $2000 GPU, yup, that is not good.

Also, PLEASE IMPROVE your RT graphs. They're too busy to get any useful information out of.

Something like this


----------



## HD64G (May 7, 2021)

xkm1948 said:


> They priced it as a flagship without the quality of a premium flagship. Only die-hard fans would get this over a 3090.
> 
> It's not just me - the 6900 XT has quite a lot of stock at my local Micro Center. At over $2000 apiece, nobody in their sane mind would buy these; hence all the available 6900 XT stock, because they are bad value.
> 
> ...


Nice! At least AMD GPUs, with their lower mining and greater gaming potential, will drop sooner and closer to their MSRPs.


----------



## W1zzard (May 7, 2021)

xkm1948 said:


> For a $2000 GPU yup that is not good.


Fair point



xkm1948 said:


> Also, PLEASE IMPROVE your RT graphs. Too busy to get any useful information out of it.
> 
> Something like this


My charting engine doesn't allow that. I'm not sure it would be useful with the way we have the text over the bar rather than outside of it.


----------



## xkm1948 (May 7, 2021)

HD64G said:


> Nice! At least AMD GPUs, with their lower mining and greater gaming potential, will drop sooner and closer to their MSRPs.



greater *rasterization* gaming potential

But sure, if it can go back to MSRP faster, some folks will be happy. The reality is that AMD's board partners are happily pushing prices to insane levels as well.



W1zzard said:


> Fair point
> 
> 
> My charting engine doesn't allow that. Not sure if useful with the way we have the text over the bar, and not outside of the bar



I know it will probably never be fixed. But man, that graph is so busy it's nearly impossible to understand - well, unless it's intended to confuse folks.

Can you at least use contrasting colors between RT and non-RT? That way the results would quickly pop out for people to grasp.

Or group the results by GPU instead of sorting by FPS from smallest to largest; showing them in groups would be more informative.
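The two suggestions above (a two-color RT on/off split, and grouping bars per GPU) can be sketched quickly. A minimal matplotlib example; the GPU names and FPS numbers are placeholder values for illustration, not figures from the review:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Placeholder data for illustration - not results from the review.
gpus = ["RX 6900 XT", "RTX 3080", "RTX 3090"]
fps_rt_off = [95, 90, 100]
fps_rt_on = [48, 62, 70]

x = range(len(gpus))
width = 0.35

fig, ax = plt.subplots()
# Blue vs. orange sit far apart on the color wheel, so the RT on/off
# split reads at a glance; the two bars for each GPU sit side by side.
ax.bar([i - width / 2 for i in x], fps_rt_off, width, label="RT off", color="tab:blue")
ax.bar([i + width / 2 for i in x], fps_rt_on, width, label="RT on", color="tab:orange")
ax.set_xticks(list(x))
ax.set_xticklabels(gpus)
ax.set_ylabel("FPS")
ax.legend()
fig.savefig("rt_grouped.png")
```

Grouping per GPU makes the RT penalty for each card visible as the gap between its two bars, rather than forcing the reader to hunt through one long FPS-sorted list.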


----------



## W1zzard (May 7, 2021)

xkm1948 said:


> I know it probably would never be fixed


Never say never. I wrote every bit of that charting engine, so at least I don't have to bother with external dependencies.



xkm1948 said:


> Can you do contrasting colors at least between RT and non RT?







We already have contrasting colors?


----------



## xkm1948 (May 7, 2021)

W1zzard said:


> Never say never. I wrote every bit of that charting engine, so at least I don't have to bother with externals
> 
> 
> 
> ...




No. Blue and green are NOT contrasting colors. Contrasting colors sit on opposite ends of the color wheel: the opposite of blue is orange, and the opposite of green is red. Blue and green do not form a contrasting pair.

Also, you have grey in there as well, making it a three-color scheme instead of two contrasting colors. The information you want to convey is RT on vs. off, and with your current color scheme that isn't working.

I highly recommend you give this a read. You have great RT data, but the presentation and execution have a lot more room for improvement.






11 Colour scales and legends | ggplot2 (ggplot2-book.org): "After position, the most commonly used aesthetics are those based on colour, and there are many ways to map values to colours in ggplot2."


----------



## randompeep (May 8, 2021)

xkm1948 said:


> So basically the world's fastest rasterization GPU and just that.


_Consumerism_ is at its peak - selling one product for one top-tier feature, propped up by so-called PCMR heads with non-ethical 'transaction' histories, aka boomers/flexers or even rich kids.
But if the price stays stable at the top, the trend should work its way down, hundred by hundred, to the cheapest model available/in production.

Betting that next time GPUs go cheap, it won't be like the last crypto bubble. They'll be stacking more and more...


----------



## AsRock (May 8, 2021)

the54thvoid said:


> That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to miss this generation.



Why? NVIDIA clearly didn't, although it looks like AMD pushed it a little extra.

However, I would have liked them not to even bother, to see if it kept NVIDIA guessing.


----------



## phill (May 10, 2021)

Brilliant review as always @W1zzard


----------



## Ravenas (May 10, 2021)

W1zzard said:


> 3090 FE, all my comparison cards are reference design



Based on the review, for users who own an OEM 3090 (i.e., the one listed in my system specs), I don't see a significant advantage in owning the 6900 XTXH over the 3090 aside from slight FPS gains. Those gains could diminish or grow as developers adopt AMD technology across consoles and PC platforms. I may still give the overall advantage to the 3090 OEM due to the 8 GB of extra GDDR6X memory, which could help with future-proofing.

I have also been loyal to AMD graphics technology over the last decade... I wanted to purchase a 6900 XT, but could not find one in stock.

W1zz, don't you think 16 GB versus the 3090's 24 GB should be listed as a negative?


----------



## Viilutaja (May 11, 2021)

That card is readily available at Germany's Mindfactory.de - 5 pcs left, priced at 2090 euros, a little cheaper than your average AIB RTX 3090 right now. But the RTX 3090 has the DLSS advantage. If upcoming FSR performance is any good, then maybe the 2090€ price tag is kinda OK.
And I use "OK" very, very lightly; no gaming card should cost over 1000€! But for 2090€ you get basically the fastest card at all resolutions.
Only 4K is hit and miss, but it pounds the RTX 3090 at 1080p/1440p high-refresh-rate gaming.

Who would have thought, after the RTX 3090 launched in summer 2020, that AMD was capable of beating NVIDIA at the highest level!?


----------



## Chrispy_ (May 11, 2021)

Viilutaja said:


> Only 4K is hit and miss


4K has been hit or miss since forever.
The GTX 1080 was billed as the first "4K gaming champion," and yet in the games of its day, framerates were in the 30s and 40s.
4K30 is only suitable for certain types of games, and most people would choose 1080p60 over 4K30 because it feels better and makes the experience more enjoyable.


----------



## Ravenas (May 11, 2021)

Chrispy_ said:


> 4K has been hit or miss since forever.
> The GTX 1080 was billed as the first "4K gaming champion," and yet in the games of its day, framerates were in the 30s and 40s.
> 4K30 is only suitable for certain types of games, and most people would choose 1080p60 over 4K30 because it feels better and makes the experience more enjoyable.



4K isn't hit or miss with either card; you are achieving at least 60 FPS in the vast majority of games tested, with the exception of one or two.

The 6900 XTXH is faster than the 3090 FE in relative performance at 1080p, 1440p, and 4K. The only downside is that you get 8 GB less memory with the 6900 XTXH than with the 3090. DLSS is great, but the developer must support it through training, and the list of supporters isn't that large - for example, in Cyberpunk you have to enable DLSS to reach 60 FPS with all settings maxed. AMD is releasing a more open competitor, possibly without training, possibly next month.


----------



## Chrispy_ (May 12, 2021)

Ravenas said:


> 4K isn't hit or miss with either card; you are achieving at least 60 FPS in the vast majority of games tested, with the exception of one or two.
> 
> The 6900 XTXH is faster than the 3090 FE in relative performance at 1080p, 1440p, and 4K. The only downside is that you get 8 GB less memory with the 6900 XTXH than with the 3090. DLSS is great, but the developer must support it through training, and the list of supporters isn't that large - for example, in Cyberpunk you have to enable DLSS to reach 60 FPS with all settings maxed. AMD is releasing a more open competitor, possibly without training, possibly next month.


I really hope AMD's DLSS alternative is something that can be enabled easily by developers through the driver rather than something that requires dedicated collaboration with Nvidia to get working on a per-title basis.

4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with NVIDIA it's only possible in certain AAA titles with DLSS enabled, at which point you need to ask yourself if that's _really_ 4K. Sure, most older/lighter games run at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a _fantastic_ job on the latest and greatest games at the _best_ graphical settings. If you're not setting that as your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070 Super will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!


----------



## nguyen (May 12, 2021)

Chrispy_ said:


> I really hope AMD's DLSS alternative is something that can be enabled easily by developers through the driver rather than something that requires dedicated collaboration with Nvidia to get working on a per-title basis.
> 
> 4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with NVIDIA it's only possible in certain AAA titles with DLSS enabled, at which point you need to ask yourself if that's _really_ 4K. Sure, most older/lighter games run at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a _fantastic_ job on the latest and greatest games at the _best_ graphical settings. If you're not setting that as your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070 Super will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!



I still have no idea why people like to quote the current price of the 3090 when it was selling for MSRP +10% for months. Most people willing to fork out $3000 for a 3090 are likely using it for mining; heck, a 3090 makes $30/day mining at the moment.

The only bad thing about owning a 3090 is wishing you had bought more of them when they were readily available for $1600-1800 back in 2020. I'm regretting that I only bought one myself.


----------



## Chrispy_ (May 12, 2021)

nguyen said:


> I still have no idea why people like to quote the current price of 3090


We're quoting the *current* 3090 price because the review card in question only launched a few days ago.
When the 3090 was 'only' $1800, the XTXH chips _didn't even exist_.


----------



## Ravenas (May 12, 2021)

Chrispy_ said:


> I really hope AMD's DLSS alternative is something that can be enabled easily by developers through the driver rather than something that requires dedicated collaboration with Nvidia to get working on a per-title basis.
> 
> 4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with NVIDIA it's only possible in certain AAA titles with DLSS enabled, at which point you need to ask yourself if that's _really_ 4K. Sure, most older/lighter games run at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a _fantastic_ job on the latest and greatest games at the _best_ graphical settings. If you're not setting that as your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070 Super will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!



I spent $1700 on my 3090; I bought it at MSRP. They are still available at MSRP, though only through the Newegg Shuffle (I purchased mine before the Shuffle existed).

Again, I don't agree that these cards aren't capable of good 4K performance without DLSS or FidelityFX. They are, and I use mine every day on my 4K PC. Further, the proof is in the review: look at the average FPS across the suite of tested 4K titles. The exceptions are one or two games, one of them Cyberpunk. People are treating Cyberpunk as the bar for AAA 4K, but Cyberpunk is an unoptimized, bug-ridden game that hit the market way too early. I have the game.

Anyhow, I decided to sell my PNY RTX 3090 for a PowerColor 6900 XTXH Ultimate. There currently aren't games on the market that take advantage of more than 10-12 GB of VRAM, and I don't see that changing over the next three years either. Outside of future memory limitations, performance is better on my PowerColor 6900 XTXH Ultimate than it was on my PNY RTX 3090; the 6900 XTXH auto-overclocks to 2650 MHz.


----------



## Ravenas (May 14, 2021)

W1z - Could you provide your Wattman settings for the overclock section?


----------



## W1zzard (May 14, 2021)

Ravenas said:


> W1z - Could you provide your Wattman settings for the overclock section?


Power limit at max, memory at whatever is stable (2160 MHz), GPU minimum clock at default, GPU maximum clock at the highest stable value (2850 MHz)


----------



## Ravenas (May 15, 2021)

W1zzard said:


> Power at max, mem at whatever is stable (2160), gpu min at default, gpu max at highest stable (2850)



Thank you.


----------



## Gpuplusultra (May 15, 2021)

I just bought an ASRock 6900 XT Phantom. I'm confused by what others say online: can XT chips on a 6900 XT reach the same performance with overclocks and tools like the PowerPlay tables Buildzoid showed? Will that improve and close the gap to these XTX chips?


----------



## Ravenas (May 24, 2021)

W1zzard, will you be getting a Powercolor 6900 XTXH Ultimate for review?


----------



## W1zzard (May 25, 2021)

Ravenas said:


> W1zzard, will you be getting a Powercolor 6900 XTXH Ultimate for review?


Doesn't seem like it. Apparently their plan is to focus on influencers who show flashy OCs that make people think they can achieve the same.


----------



## Ravenas (May 26, 2021)

Comparing average 4K FPS to the EVGA 3090 FTW Ultra… I like this better than the FE comparison, though the review scenarios differ slightly. See attached.


----------



## Gpuplusultra (May 27, 2021)

Ravenas and W1zzard, is there a difference in junction temps? For example, do these chips stay cooler, or do they reach their higher clocks with less crashing because of the power limits?

I'm new to overclocking, and I would like to get the most out of my card, as I play at 4K. I haven't messed with PowerPlay tables or MorePowerTool.


----------



## Ravenas (May 27, 2021)

Gpuplusultra said:


> Ravenas and W1zzard is there a difference to junction Temps?  For example do these chips stay cooler or do they reach their higher clocks with less crashing because of power limits?
> 
> I'm new to overclocking and I would like to get the most out of my card as I play in 4k. I haven't messed with Powerplay nor More Power Tool.



The applicable new cards use the XTXH, a binned chip specified for roughly 10% higher clock rates. These are cherry-picked dies that remain stable at higher voltages and therefore sustain higher clocks, and the OEMs design the boards around them to deliver more power.

Use Wattman in the AMD driver suite to adjust settings, or if you're fine settling for an auto overclock, just select AUTO.

--

For an example of binning, visit any CPU overclocking forum: some users get better overclocks than others on the same chip with the same hardware. This is often called the silicon lottery, where a lucky buyer gets a well-binned chip. With the XTXH, the OEM charges you more, but you're guaranteed a lottery win: the highest factory overclock, with headroom to push further under factory warranty.
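To make the "lottery" concrete, here's a toy Python sketch of how binning works in principle. All numbers here are made up for illustration and have nothing to do with AMD's actual process: dies come off the line with a spread of maximum stable clocks, and the binning step simply skims the top fraction off as the premium part.

```python
import random

random.seed(42)

def bin_chips(n, cutoff_mhz=2700):
    """Simulate n dies, each with a different maximum stable clock.
    Dies at or above the cutoff go into the premium bin (the XTXH
    in this analogy); the rest become the standard part."""
    dies = [random.gauss(2500, 120) for _ in range(n)]  # illustrative spread
    premium = [d for d in dies if d >= cutoff_mhz]
    standard = [d for d in dies if d < cutoff_mhz]
    return premium, standard

premium, standard = bin_chips(10_000)
print(f"premium bin:  {len(premium)} dies ({100 * len(premium) / 10_000:.1f}%)")
print(f"standard bin: {len(standard)} dies")
```

Buying a standard part and hoping it clocks like a premium one is playing the lottery; buying the premium bin is paying for a guaranteed winning ticket.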


----------



## danakin (Jun 1, 2021)

I've got my hands on one of those cards.

Sadly, I can only run a 2675 MHz GPU clock / 2090 MHz memory (fast timings) and still pass Time Spy.

@W1zzard

What settings do you use in the Unigine Heaven benchmark? (Resolution?)

In your review you are at 2857 MHz GPU clock / 2160 MHz VRAM.



Does that mean my chip is far inferior?


----------



## W1zzard (Jun 1, 2021)

Did you set the power limit to +15%? Otherwise the OC will be held back.

I'm using a custom scene in Unigine, nothing you can reproduce.


----------



## jeffb0918 (Jun 10, 2021)

Ravenas said:


> I spent $1700 on my 3090. I bought it at MSRP. They are still available at MSRP through the Newegg shuffle only, although I purchased mine prior to the shuffle on Newegg.
> 
> Again, I don't agree that these cards aren't capable of good 4K performance when not using DLSS or Fidelity. They are and I use it every day on my 4K PC. Further, the proof is in the review. Look at the AVERAGE FPS across the suite of tested 4K titles. The exception is 1 or 2 games. One of them is Cyberpunk. People are trying to act like Cyberpunk is the bar for AAA 4K. Cyberpunk is an un-optimized bug-ridden game that hit that market way too early. I have the game.
> 
> Anyhow, I decided to sell my PNY RTX 3090 for a Powercolor 6900 XTXH Ultimate. There currently aren't games on the market which take advantage of more than 10-12 gb of ram, and I don't see them doing so over the next 3 years either. The performance outside of future memory limitations is better on my Powercolor 6900 XTXH Ultimate versus my PNY RTX 3090. My 6900 XTXH is auto overclocked to 2650 MHz.



Hi! Maybe I'm far too late to the thread, but I'm considering doing what you did here. I also spent $1700 on a 3090 TUF OC, so close to MSRP, and I'm now thinking of swapping to the OC Formula XTXH. What is your experience so far? Should I make the switch? Any advice/tips?

I'm not into XOC yet, but I'd like to slowly build up towards it. I enjoy tinkering, benchmarking, and overclocking on weekends, mainly with air/water setups for now, at times with chilled water, chilled air, or winter morning air; I'm not yet at the stage of hardware modding. I also use the GPU for gaming/work aside from the benchmarking/overclocking hobby, of course. Do you think the 3090 is the better keep overall for its performance, DLSS, RTX, VRAM, and overclocking ability? Or is the 6900 XTXH the way to go based on your experience making that switch?

Thanks anyway!


----------



## Ravenas (Jul 20, 2021)

jeffb0918 said:


> Hi! Maybe I'm far too late to the thread, but I'm kinda considering doing what you did here. I also spent 1700 on a 3090 TUF OC, so, close to MSRP. And I am now thinking of swapping to the OC Formula XTXH. What is your experience so far? Should I make the switch? Any advice/tips?
> 
> I'm not (yet) into XOC at the moment, but I'd like to slowly build up towards that. I do enjoy tinkering around, benchmarking, overclocking, etc over the weekend, mainly with air/water setup for the time being, at times with chilled water, chilled air, morning winter air, and am not yet at the stage of hardware modding. I also use the GPU for gaming/work aside from the benchmarking/overclocking hobby of course. Do you think 3090 is an overall better keep for its performance, DLSS, RTX, VRAM, and overclock ability? Or do you think this 6900XTXH is the way to go based on your experience performing that switch?
> 
> Thanks anyway!



My experience has been great. The PowerColor 6900 XTXH Red Devil Ultimate is faster than my PNY RTX 3090 was. (The Sapphire 6900 XTXH Toxic Extreme is really the top XTXH card in my opinion, unless you want to invest in the PowerColor 6900 XTXH Liquid Devil Ultimate with EK's water block.) The extra VRAM is not really an advantage on the 3090, because you run out of shader power before you need it. The 6900 XTXH cards are readily available as well.

To answer your questions:

- DLSS is a nice technology, no doubt, but my prediction is that it fades out, much like G-Sync did against adaptive sync/FreeSync, thanks to AMD's open approach with FSR.
- VRAM: again, 24 GB versus 16 GB is not an argument. You run out of shader power before you run out of VRAM.
- Overclocking: the XTXH is a binned chip; you pay to win the silicon lottery. A high power limit combined with lottery silicon equates to very good overclocking. This card will do 2800-2850 MHz on the GPU with 2100-2150 MHz on the VRAM when overclocked. The only better card, in my opinion, is the Sapphire 6900 XTXH Toxic Extreme Edition.

The XTXH is the best option hands down. It's faster than most OEM or FE 3090s across the test suite. AMD's drivers are better than they've ever been, so don't let people scare you off over that.


----------



## 80251 (Jul 21, 2021)

How do they bin GPUs without soldering them to a PCB? Is there some sort of test ball socket jig they use to temporarily mount them on a test board?


----------



## Ravenas (Jul 21, 2021)

80251 said:


> How do they bin GPU's without soldering them to a PCB? Is there some sort of test ball socket jig they use to temporarily mount them to a test PCB?




		https://www.reddit.com/r/nvidia/comments/9sdtbp


----------



## jpadilla54 (Jan 16, 2022)

What are the sizes of the thermal pads on this ASRock RX 6900 XT OC Formula?


----------



## baltmin (May 10, 2022)

Is anybody using it for photo/video editing? How happy/unhappy are you?


----------



## Chrispy_ (May 11, 2022)

baltmin said:


> Is anybody using it for photo/video editing? How happy/unhappy are you?


We tried a 6800 XT and it was fine, but the lack of CUDA and the faster performance of RTX cards in Premiere are why we didn't buy more.
Availability too: all 16 GB RX 6000 cards were grossly understocked and overpriced here in the UK.


----------



## nexus35 (Jun 4, 2022)

Hello,
The 6900 XT OC Formula is €1,100 today. Interesting or not?


----------

