
AMD Radeon RX 6900 XT

It is barely faster than the RTX 3080 at 4K but costs 400 dollars more

Not worth buying over the RTX 3080, especially when it lacks DLSS and has poor RT performance

Both the RTX 3090 and the RX 6900 are bad value for gamers


Also, Nvidia will get something similar to SAM in the future. It has been confirmed
 
You need to look at those numbers a little closer.

Are you sure?

 
Fixed that for you.

You need to look at those numbers a little closer.


That is an opinion not everyone will agree with.
Indeed, they can't be too bad a value if they're selling out, for starters. That said, quantities are low, but plenty of people are complaining about that at the same time, so the demand is there.

Alright, but does that take into account "Radeon Boost" being actively enabled? SAM works better at reduced resolutions as well, so "Radeon Boost" pairs very well with it. That is a more significant perk than DLSS as well, I'd argue, since it isn't some cherry-picked, developer-enabled feature available in a handful of AAA games. If I'm not mistaken, "Radeon Boost" will work across all games, which is a significant difference between the two. I'd say regardless it's priced fairly appropriately, in line with performance, from the looks of things, given it's got higher average frame rates than the RTX 3080.

Sure, you can argue the RTX 3080 is stronger at RTRT, but average frame rates aren't about RTRT in the first place; RTRT titles are more like the 0.1% case given how few of them exist in game development right now. If you took the entire Steam catalog, the RX 6900 XT should end up ahead, assuming its performance advantage more or less holds true across more game titles. The fact is RTRT won't skew results too much in the big picture right now because there are so few titles with it at this juncture, and it will take years for that to even really begin to change substantially.
 
Thank you for the review, W1zzard. Great graphics card at the price point versus the 3090. Though I still think 4K hardware is not *quite* there.

I don't agree with the recent review downgrades caused by "Not for the average gamer." I would rather you provide a review on the hardware, not a review on what percentage of the market will want the hardware. I wish we could do away with this in future reviews.
 
Something is wrong with the RTX 3090. It's not meant for PC gaming, seeing how badly it performs in PC gaming relative to the price Nvidia is asking. Had it not been released, AMD's RX 6900 XT would have been the fastest GPU on the planet, at least for a short period until Nvidia musters up the RTX 3080 Ti. So now we all know why Nvidia needed to launch an overpriced, power-sucking, server GPU called the RTX 3090: to keep AMD's RDNA2 from claiming the label of fastest GPU, at least for a short while. o_O:eek::cry::peace::clap::kookoo::toast::roll:
 
Are you blind or do you not understand your own example?

Your initial modified quote said it's "barely faster than RTX 3090", instead of the 3080 from the actual quote, which was factually wrong, and my post demonstrated that.

Then apparently you realized your mistake, probably after this reply to me, and edited the quote again in your old post, to say it is "barely slower than RTX 3090". Which is debatable, but OK, at least it's not completely wrong.

The point is the 6900 XT is closer to a 3080 than a 3090 at 4K, so the claim of the original poster of that quote, that it is barely faster than the 3080 at 4K, is still more appropriate than your second edit to it, in my opinion.

Alright, but does that take into account "Radeon Boost" being actively enabled? SAM works better at reduced resolutions as well, so "Radeon Boost" pairs very well with it. That is a more significant perk than DLSS as well, I'd argue, since it isn't some cherry-picked, developer-enabled feature available in a handful of AAA games. If I'm not mistaken, "Radeon Boost" will work across all games, which is a significant difference between the two. I'd say regardless it's priced fairly appropriately, in line with performance, from the looks of things, given it's got higher average frame rates than the RTX 3080.

Honestly, at the moment I don't care about either DLSS or Radeon Boost. I don't intend to use DLSS, but I may change my mind in the future. And Radeon Boost just reduces the actual rendering resolution when you make fast movements, if I understand correctly. It's a nice feature, but in a title like Microsoft Flight Simulator, which is what I played the most recently, you rarely make fast movements. Depending on the game and your play style, it might not help much, if at all, even if in theory it could work for any game.
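To illustrate the mechanism, here's a purely conceptual Python sketch of motion-based dynamic resolution, not AMD's actual implementation; the speed thresholds and scale factors are invented for the example:

```python
# Conceptual sketch of motion-based dynamic resolution, the idea behind
# Radeon Boost. NOT AMD's implementation; thresholds and scales are made up.
NATIVE_W, NATIVE_H = 3840, 2160

def render_scale(camera_speed_px_per_s: float) -> float:
    """Lower the render resolution during fast camera movement, where the
    reduced detail is hard to notice, and return to native when it settles."""
    if camera_speed_px_per_s > 2000:
        return 0.5    # fast flick: render at half resolution per axis
    if camera_speed_px_per_s > 800:
        return 0.75   # moderate motion: modest reduction
    return 1.0        # slow or static view: full native resolution

for speed in (0, 1200, 3000):
    s = render_scale(speed)
    w, h = int(NATIVE_W * s), int(NATIVE_H * s)
    print(f"{speed:>4} px/s -> render {w}x{h}, upscale to {NATIVE_W}x{NATIVE_H}")
```

That's the gist of the "works regardless of the game" argument being made above: the decision is driven by input/camera motion rather than per-title integration.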
 
So now we all know why Nvidia needed to launch an overpriced, power-sucking, server GPU called the RTX 3090: to keep AMD's RDNA2 from claiming the label of fastest GPU, at least for a short while.
Wrong. NVidia launched the 3090 as a premium product and it shines in that capacity. It is currently the only card shown so far to do 8k gaming. The 6900XT is likely to be able to do it as well, but no one has shown those results yet.

Then apparently you realized your mistake, probably after this reply to me, and edited the quote again in your old post, to say it is "barely slower than RTX 3090".
Yup, true. I did see my error and corrected it. Still doesn't matter. Their original statement was blatantly and deliberately incorrect and that is what I was pointing out. Perhaps you failed to understand that context.
Which is debatable, but OK, at least it's not completely wrong.
Not debatable. The data is clearly displayed, and while the average shows the 6900 XT leaning slightly toward the 3080, there are many instances where the performance of the 6900 XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of the other.

Yes, NVidia technically has the performance crown with the 3090, but only by the slimmest of margins and only through collating averages over a limited number of gaming titles. As I said earlier, the 3090's advantages are the RTRT performance and the extra 8GB of VRAM. Otherwise AMD has matched NVidia this round of GPUs. 3090-like performance for 2/3 the price? 6900XT. 3080-like performance for $110 less? 6800XT.

Trying to minimize AMD's progress and offerings in the way many users have been doing in this thread is the same kind of narrow-minded nonsense that people were spewing about RTRT with the release of the RTX 2000 series cards a few years ago. It's pathetic.
 
I think I got to the bottom of it. Using multiple monitors triggers the problem with my Gigabyte RTX 3080 Vision OC. I have 2 or 3 displays connected at all times: a 4K @ 60 Hz monitor over DisplayPort, a 3440x1440 monitor at 100Hz, also over DisplayPort, and a 4K TV at 60Hz HDR, over HDMI, which I usually keep turned off.

After closing all applications, it still refused to reduce the GPU memory speed. But I noticed that when Windows turns off my displays, the GPU memory frequency and power usage finally go down. So I disconnected my 4K monitor. The power usage went down to 7%, and the memory frequency dropped from 1188MHz to 51MHz. I turned on the 4K TV instead; the power usage and memory frequency remained low. I turned off the 4K TV again and reconnected the 4K monitor; the power usage and memory frequency went up again. I disconnected the 3440x1440 display; the frequency and power usage dropped. I turned on the 4K TV; the power usage and memory frequency remained low.

So, in short, if I connect both my monitors, over DisplayPort, the memory frequency never goes down. As a final experiment, I connected the 3440x1440 display over HDMI, at 50Hz. There were some oscillations, depending on which apps were open, but the GPU power usage and memory frequency remained low, for the most part.

So, I'm guessing it really doesn't like having multiple monitors at high refresh rates and resolutions connected, especially over DisplayPort. This is how the power and frequency usage looked while I was disconnecting/connecting various monitors:

[Attachment: graph of GPU power usage and memory clock while connecting/disconnecting the monitors]

The thing is, I looked at all the 3080 TPU reviews, and none of them mentioned the GPU memory frequency being higher when idle and using multiple monitors, unless I missed something.

@W1zzard, have you seen anything like this on any of the 3080s in your tests, the GPU memory frequency never going down while using multiple monitors? You have a table with clock profiles in each GPU review, and for all your 3080 reviews you listed the multi-monitor GPU memory frequency as 51MHz. How exactly did you test that? How many monitors, at which resolutions/refresh rates, and how were they connected? DisplayPort, or HDMI? If there were just a couple of monitors at low resolutions, then that might explain the difference from my experience with the Gigabyte RTX 3080 Vision OC.
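For anyone who wants to log this themselves while plugging and unplugging displays, here's a minimal sketch that polls the memory clock and power draw via nvidia-smi (clocks.mem and power.draw are standard nvidia-smi query fields; the one-second interval is arbitrary):

```python
import subprocess
import time

# Poll the GPU memory clock and power draw once per second so you can watch
# whether the card actually drops to its idle state after a monitor change.
QUERY = "clocks.mem,power.draw"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    mem_clock, power = (v.strip() for v in out.split(","))
    print(f"{time.strftime('%H:%M:%S')}  mem clock: {mem_clock} MHz  power: {power} W")
    time.sleep(1)
```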

It's a bit off topic, but anyway: I had this issue for a few months with my 2070 Super (1440p 144Hz monitor on DP + 1080p 60Hz monitor on HDMI), but it solved itself at some point. It seems to be a fairly common issue with Nvidia cards. If a clean driver reinstall doesn't solve it, you can use the Multi Display Power Saver module from Nvidia Inspector to force the reduced frequency when idling.
 
It's a bit off topic, but anyway: I had this issue for a few months with my 2070 Super (1440p 144Hz monitor on DP + 1080p 60Hz monitor on HDMI), but it solved itself at some point. It seems to be a fairly common issue with Nvidia cards. If a clean driver reinstall doesn't solve it, you can use the Multi Display Power Saver module from Nvidia Inspector to force the reduced frequency when idling.

Thanks for trying to help, but I already found the problem: it was having HDR enabled on one of the 4K displays at 60Hz. If I disable it, the power comes down significantly. My old card didn't support 4K HDR at 60Hz with the HDMI-to-DisplayPort adapter I was using, which is probably why I didn't have this problem with the old card. So, in a way, it's not a bug, it's a feature :)

Still, according to the TPU reviews the new Radeons are much more efficient in multi-monitor setups, so it's something to consider if you don't use your PC just for gaming.

Anyway, as @W1zzard requested, I added more details about my troubleshooting in a dedicated thread:

 
Thanks for trying to help, but I already found the problem: it was having HDR enabled on one of the 4K displays at 60Hz. If I disable it, the power comes down significantly. My old card didn't support 4K HDR at 60Hz with the HDMI-to-DisplayPort adapter I was using, which is probably why I didn't have this problem with the old card. So, in a way, it's not a bug, it's a feature :)

Still, according to the TPU reviews the new Radeons are much more efficient in multi-monitor setups, so it's something to consider if you don't use your PC just for gaming.

Anyway, as @W1zzard requested, I added more details about my troubleshooting in a dedicated thread:



Haha! Fantastic catch, I'd just been fiddling with HDR on my new screen as well and could have made the same mistake (honestly, HDR looks so bad on monitors).
 
Wrong. NVidia launched the 3090 as a premium product and it shines in that capacity. It is currently the only card shown so far to do 8k gaming. The 6900XT is likely to be able to do it as well, but no one has shown those results yet.


Yup, true. I did see my error and corrected it. Still doesn't matter. Their original statement was blatantly and deliberately incorrect and that is what I was pointing out. Perhaps you failed to understand that context.

Not debatable. The data is clearly displayed, and while the average shows the 6900 XT leaning slightly toward the 3080, there are many instances where the performance of the 6900 XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of the other.

Yes, NVidia technically has the performance crown with the 3090, but only by the slimmest of margins and only through collating averages over a limited number of gaming titles. As I said earlier, the 3090's advantages are the RTRT performance and the extra 8GB of VRAM. Otherwise AMD has matched NVidia this round of GPUs. 3090-like performance for 2/3 the price? 6900XT. 3080-like performance for $110 less? 6800XT.

Trying to minimize AMD's progress and offerings in the way many users have been doing in this thread is the same kind of narrow-minded nonsense that people were spewing about RTRT with the release of the RTX 2000 series cards a few years ago. It's pathetic.

A 6800XT is $110 less than a 3080? News to me...
 
A 6800XT is $110 less than a 3080? News to me...
It's as easy as looking up the prices.
For example:
$699
$649
$589
Unless my math is off, that's a $50 difference in favor of the 6900XT. The $110 difference is for the 6800. Seems I looked at a 6800 when I looked up prices earlier. Even still, AMD has the value add.
 
It's as easy as looking up the prices.
For example:
$699
$649
$589
Unless my math is off, that's a $50 difference in favor of the 6900XT. The $110 difference is for the 6800. Seems I looked at a 6800 when I looked up prices earlier.

That is a 6800 XT, not a 6900 XT. The MSRP of the 6900 XT is $999.
 
The price differences may depend on the country, but here (in France) there is almost no price difference between a 6800 XT (from ~770€) and an RTX 3080 (starting at ~800€). The 6900 XT is listed starting at 1250€, which seems quite high compared to the difference in performance with a 3080, especially if you don't have a Ryzen 5000.
 
Not debatable. The data is clearly displayed, and while the average shows the 6900 XT leaning slightly toward the 3080, there are many instances where the performance of the 6900 XT tops the 3090. As @Mussels rightly said earlier, there is no clear winner. Depending on the game title, each card is within 2% or 3% of the other.

OK, it's not debatable. I'm just going to post numbers then, without debating. This is how the 6900XT looks at 4K compared to the 3090, even if you give it every possible advantage, including enabling SAM:

- 17% slower in Jedi: Fallen Order
- 15% slower in Control
- 15% slower in Anno 1800
- 15% slower in The Witcher 3
- 14% slower in Civilization VI
- 14% slower in Metro Exodus
- 12% slower in Devil May Cry 5
- 8% slower in Divinity Original Sin II
- 8% slower in Borderlands 3
- 7% slower in DOOM Eternal
- 6% slower in Red Dead Redemption 2
- 4% slower in F1 2020
- 4% slower in Gears 5
- 3% slower in Assassin's Creed Odyssey
- 3% slower in Death Stranding
- 3% slower in Sekiro: Shadows Die Twice
- 3% slower in Shadow of the Tomb Raider
- 2% slower in Project Cars 3
- 1% slower in Strange Brigade

- 3% faster in Far Cry 5
- 6% faster in Battlefield V
- 6% faster in Detroit Become Human
- 8% faster in Hitman 2
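For what it's worth, you can roll those per-game deltas into a single number with a geometric mean, a common way to summarize relative performance across many titles; the 23 values below are just the ones listed above:

```python
import math

# Per-game performance of the 6900 XT relative to the 3090 at 4K, taken
# from the list above: "-17" means 17% slower, "+8" means 8% faster.
deltas = [-17, -15, -15, -15, -14, -14, -12, -8, -8, -7, -6, -4, -4,
          -3, -3, -3, -3, -2, -1, +3, +6, +6, +8]
ratios = [1 + d / 100 for d in deltas]

# Geometric mean keeps a single big win or loss from dominating the summary.
geo_mean = math.prod(ratios) ** (1 / len(ratios))
print(f"6900 XT vs 3090 overall: {geo_mean:.3f}x "
      f"(~{(1 - geo_mean) * 100:.1f}% behind on this game list)")
```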

Look, don't get me wrong, the 6900XT is a nice card, and I have no problem buying AMD products when they are better than the competing ones and the price makes sense. For example I have a 5800X in a box on my desk right now, and I'm waiting for the motherboard to be delivered.

AMD has to be congratulated for closing the gap to NVidia, and I can't wait to see the next generation of AMD GPUs. All I'm saying is that at this price point the 6900XT is not something that makes me go "wow". Neither is the 3090, considering its huge price. I'm not saying not to buy them. If you need that additional performance and are willing to pay the price, go for it. I don't.
 
Now do the same comparison for 1440p - not everyone is focused on 4k
 
OK, but the question is, do you really need either of the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.
 
Great to see the Zen 3 + SAM combo tested.

Even the RX 6900 XT, with SAM on and hand-picked memory, can't beat Nvidia's RTX 3090 Founders Edition model.

It depends on which games you test.
Pick the newest, hottest titles and, uh oh.

[Attachment: benchmark results chart for the newer titles]


The games in question:

[Attachment: list of the games in question]


OK, but the question is, do you really need either of the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.
There are a number of people with decent 1440p monitors looking for high frame rates.

This is how the 6900XT looks at 4K
Most games (understandably) are old crap.
And this is likely why AMD has better results at 1440p and below: you get into CPU-limited scenarios with that old crap like Civ VI.
 
All I'm saying is that at this price point the 6900XT is not something that makes me go "wow". Neither is the 3090, considering its huge price. I'm not saying not to buy them.
You're forgetting the prices of Vega, the Radeon VII, and the RTX 2000 series cards. The 3090 is effectively the RTX Titan replacement, offering approximately 50% greater performance at $1,000 less. The RX 5000 and RX 6000 series are likewise less expensive than previous-gen GPUs and offer amazing performance jumps. Maybe I find everything exciting and amazing because I don't have a short memory and can keep perspective and context clearly in view. That wasn't a jab at you personally; it just seems a lot of people are forgetting the recent past.

The reality is this: GPU offerings from both companies are exceptional this generation and are a serious value compared to past generations of GPUs. Logic is lost on anyone who does not see and understand the context of that perspective.
 
Now do the same comparison for 1440p - not everyone is focused on 4k
AMD just needs to make a single-card, dual-GPU solution with two RX 6900 XTs; problem solved. $500 more expensive and, best case, 17% more performance than an RTX 3090... no need to worry about TDP, noise, or heat output; those figures aren't considerations for Nvidia RTX 30-series users at that end of the spectrum.

OK, but the question is, do you really need either of the 3090 or the 6900XT at 1440p? I agree that some people do, but they are probably a minority.
Alright, but 4K isn't a minority? DLSS/RTRT games aren't a minority compared to the number of games without those features!? People who can just burn $500 for, best case, 17% more performance and overlook heat, noise, and power usage aren't minorities!? Who are you really, Tom Cruise!?
 