
AMD Radeon RX 6800 XT

Great work W1z

AMD delivered what they promised. The performance compared to the RTX 3080 is impressive despite the noisy cooler. The price is also better in Canadian pesos. It makes my decision a bit harder since I don't need RT, but I do like the idea and what it does for the games that support it.

What I don't get is all the people complaining about RT performance. If I remember correctly, the same people were the first to dismiss RT as a gimmicky feature when it came out. Now those same people are dismissing AMD for not having strong performance with RT enabled. Appreciate the competition and the improvements that have been made, because it's likely that RDNA3 will have better RT performance, and that will help competition in terms of pricing.
 
Why didn't you guys use the world's fastest CPU for gaming?
That would also allow you to test SAM + Rage Mode.
Did you see the SAM review with the 5900X? There's a big link in the conclusion.
Rage Mode has nothing to do with the CPU or platform; it's explained in the review, too.

Seriously
 
Did you see the SAM review with the 5900X? There's a big link in the conclusion.
Rage Mode has nothing to do with the CPU or platform; it's explained in the review, too.

Seriously

Still your fault ... darn link is not big enough ... make it flash.
 
Even though the ray tracing performance seems pretty bad (these cards seem to lose a bit more performance with ray tracing enabled than Nvidia's Turing did), I may still end up getting one early next year. This 1070 is piss slow, and I plan to put my next card under water; I feel the 6800/6800 XT will be a lot more fun to have on water than a 3080, given how much overclocking headroom these things have.

That is, unless Nvidia releases the 3080 Ti on TSMC 7nm, but that remains to be seen, as does whether a different node really helps Nvidia's architecture all that much.

Nah, you misunderstand, Nvidia did it for all that production capacity that Samsung offers! And there is STILL a demand problem, go figure!

But yeah, this is leaps and bounds ahead. Apparently Nvidia's Ampere happened at AMD, or something. It pales in comparison - on a technical level, that is, because the stack is competitive now in both camps. AMD has some pricing to fix for the 6800, if you ask me. The 6800 XT is in an okay place but still brings up the dilemma regarding ray tracing for some. A somewhat lower price point would have eliminated that outright - or at least mostly.

Yeah, my dilemma depends on how things shake out starting early next year and whether Nvidia releases a 3080 Ti that isn't using Samsung anymore (the rumor mill says it could be on TSMC 7nm). I wanted my next card to be fairly good at both ray tracing and rasterization. Ampere is right in that ballpark imo, but then its power consumption and heat are something else. Then we have the 6800 XT, which has the rasterization, efficiency, and overclocking potential to be really fun to have on water cooling, but is pretty booty for ray tracing and has no DLSS equivalent as of yet...

Ampere = adequate ray tracing performance for all the games I'd actually want to run ray tracing on (non-competitive multiplayer titles), but power and heat, very little tweaking headroom, and bad scaling for overclocking.

Big Navi = really good at everything but ray tracing as it stands right now... and no DLSS.

@W1zzard What are your initial thoughts on drivers for these cards? Seem okay? Need work? Notice anything outright?
 
What are your initial thoughts on drivers for these cards? Seem okay? Need work? Notice anything outright?
Some cosmetic issues in the control panel, nothing worth mentioning. No stability issues, no crashes
 
What I would like to see is performance on DXR 1.0 titles vs. performance on DXR 1.1 titles. I wonder if the new style of DXR, which the RX 6800 XT was made for, performs significantly better. Overall, whatever card I can actually get my hands on will replace my 1080 Ti.
 
What I would like to see is performance on DXR 1.0 titles vs. performance on DXR 1.1 titles. I wonder if the new style of DXR, which the RX 6800 XT was made for, performs significantly better. Overall, whatever card I can actually get my hands on will replace my 1080 Ti.
I don't think microbenchmarks are useful to the vast majority of readers, but if you want to code some test and send it to me, I'd be happy to run it.
I need the source for security reasons, but I'm happy to sign an NDA. This offer is open to anyone reading this post.
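For anyone taking W1zzard up on that offer: the hard part is the D3D12/DXR rendering itself, but the reporting side is just frame-time math. A minimal sketch of that half, with purely illustrative names and numbers (not review data):

```python
# Hypothetical sketch of the reporting half of a DXR 1.0 vs 1.1 test;
# the actual workloads would have to come from a D3D12 app.

def avg_fps(frame_times_ms):
    """Average FPS over a capture, from per-frame render times in ms."""
    if not frame_times_ms:
        raise ValueError("no samples")
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def relative_delta(fps_a, fps_b):
    """Percentage advantage of result A over result B."""
    return (fps_a / fps_b - 1.0) * 100.0

# Constant 10 ms frames -> 100 FPS; constant 12.5 ms frames -> 80 FPS.
dxr11_fps = avg_fps([10.0] * 500)
dxr10_fps = avg_fps([12.5] * 500)
print(f"DXR 1.1 vs DXR 1.0: {relative_delta(dxr11_fps, dxr10_fps):+.1f}%")
# → DXR 1.1 vs DXR 1.0: +25.0%
```

Averaging frame times rather than instantaneous FPS avoids the bias you get from averaging rates directly.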
 
And another aspect of Big Navi's value is that the drivers seem to be great. No reviewer mentioned any serious problem. Let's hope it's the start of consistent stability there.
 
This review has been updated with new gaming power consumption numbers for the RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800, and RX 6800 XT. For these cards, I'm now running Metro at 1440p instead of 1080p, to ensure proper loading.

The perf/W charts have been updated too, as have the relevant texts.
 
Great work W1z

AMD delivered what they promised. The performance compared to the RTX 3080 is impressive despite the noisy cooler. The price is also better in Canadian pesos. It makes my decision a bit harder since I don't need RT, but I do like the idea and what it does for the games that support it.

What I don't get is all the people complaining about RT performance. If I remember correctly, the same people were the first to dismiss RT as a gimmicky feature when it came out. Now those same people are dismissing AMD for not having strong performance with RT enabled. Appreciate the competition and the improvements that have been made, because it's likely that RDNA3 will have better RT performance, and that will help competition in terms of pricing.

It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to that because in their minds it brings an advantage - even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

With 8-10GB cards to carry them forward.

:laugh:

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

Knowing the technical details and the actual performance, heat, and TDP now, it's crystal clear AMD has a much stronger offering. Even if they don't top the 3080 in all situations - they have a much better piece of silicon on offer, and more VRAM to keep it going forever. These cards will hold their value, whereas the 3080 is just the 2080 Ti all over again - eclipsed within a single gen.

A few weeks back one could think 'hm, 3080 at 700, that's a great deal!'. Today, not so much. It's a new norm, and the 3080 is really on the worst end of it. Especially if you consider that Nvidia is deploying a largely enabled GA102 for it. It means that without a shrink, Ampere is a dead end, and even on 7nm it's worth no more than a refresh. This was the major redesign? Back to the drawing board, then.

It's going to be interesting to see how important people think RT performance really is, because it truly is the one differentiator Nvidia can hold on to.
 
Still your fault ... darn link is not big enough ... make it flash.

Like the internet of the 90s with giant flashing "CLICK HERE!" links?

It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to that because in their minds it brings an advantage - even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

It'll be interesting to see what happens moving forward; now that AMD has its foot in most gaming companies' doors simply due to optimizing for consoles, it's more than likely that we'll see AMD-optimized features first and Nvidia optimizations following, just due to the nature of multi-platform releases and the similar core hardware between consoles and AMD CPUs/GPUs. It's just a matter of AMD finalizing their version of DLSS and getting studios to utilize it, plus optimizations to RT.
 
It's just the last straw, along with DLSS, that people who can't let Nvidia go really have. They hold on to that because in their minds it brings an advantage - even if not a single person can truly quantify that advantage, because both RT and DLSS support are rare occurrences.

They bank on the fact that going forward, many more games will be getting that support.

With 8-10GB cards to carry them forward.

:laugh:

Sorry, but I can't suppress the irony here. RT performance can go all over the place with the coming console gen; the idea that today's results are any solid measure or comparison of it is ridiculous. We have no real RT metric yet, and Nvidia has a generational advantage they can only use once.

Knowing the technical details and the actual performance, heat, and TDP now, it's crystal clear AMD has a much stronger offering. Even if they don't top the 3080 in all situations - they have a much better piece of silicon on offer, and more VRAM to keep it going forever. These cards will hold their value, whereas the 3080 is just the 2080 Ti all over again - eclipsed within a single gen.

A few weeks back one could think 'hm, 3080 at 700, that's a great deal!'. Today, not so much. It's a new norm, and the 3080 is really on the worst end of it. Especially if you consider that Nvidia is deploying a largely enabled GA102 for it. It means that without a shrink, Ampere is a dead end, and even on 7nm it's worth no more than a refresh. This was the major redesign? Back to the drawing board, then.

It's going to be interesting to see how important people think RT performance really is, because it truly is the one differentiator Nvidia can hold on to.
I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, power consumption, price, or whatever floats their boat when it comes to playing on their computer.
 
Check the discussion in the forum comments of the non-XT review, you'll understand.
So what you are essentially telling me is that a 6800 takes more power than a 6800 XT. That doesn't seem realistic.
 
Well, slower than the 3080 while similarly unobtainable, and with fewer features. I don't care about ray tracing, but I have a 3840x2160 120 Hz screen to feed with frames, and the 6800 XT just doesn't cut it. Also, DLSS will probably future-proof the 3080 much better as far as achieving playable framerates goes. I have to say this launch is heavily overhyped. It's a good GPU, which couldn't be said about AMD products for years, but not better than the competition. So, I won't cancel my 3080 order.
Yeah, no. If you want 4K at ultra, 10 GB won't be enough. DLSS, much like PhysX, will probably only be available in a handful of titles. And it isn't slower; it's about the same while overclocking much more and using much less energy. I also have a Sony X900H 4K@120Hz to feed, and a 16 GB, highly overclockable 6800 XT is a much better deal, now and even more so in the future.

Even though the ray tracing performance seems pretty bad (these cards seem to lose a bit more performance with ray tracing enabled than Nvidia's Turing did), I may still end up getting one early next year. This 1070 is piss slow, and I plan to put my next card under water; I feel the 6800/6800 XT will be a lot more fun to have on water than a 3080, given how much overclocking headroom these things have.

That is, unless Nvidia releases the 3080 Ti on TSMC 7nm, but that remains to be seen, as does whether a different node really helps Nvidia's architecture all that much.

Yeah, my dilemma depends on how things shake out starting early next year and whether Nvidia releases a 3080 Ti that isn't using Samsung anymore (the rumor mill says it could be on TSMC 7nm). I wanted my next card to be fairly good at both ray tracing and rasterization. Ampere is right in that ballpark imo, but then its power consumption and heat are something else. Then we have the 6800 XT, which has the rasterization, efficiency, and overclocking potential to be really fun to have on water cooling, but is pretty booty for ray tracing and has no DLSS equivalent as of yet...

Ampere = adequate ray tracing performance for all the games I'd actually want to run ray tracing on (non-competitive multiplayer titles), but power and heat, very little tweaking headroom, and bad scaling for overclocking.

Big Navi = really good at everything but ray tracing as it stands right now... and no DLSS.

@W1zzard What are your initial thoughts on drivers for these cards? Seem okay? Need work? Notice anything outright?

Most new games, based on consoles, will have very light RT effects, and will all be using the "AMD standard" as used on consoles. I wouldn't be worried about the 6800's RT performance; it's good enough for some effects here and there - same as the consoles. Both Ampere and the new 6000 series are too weak for full-blown RT effects anyway, and that's coming only in the PS6 era imho.

I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, power consumption, price, or whatever floats their boat when it comes to playing on their computer.
8-10 GB is a hard fact / limit people experience today at 1440p / 4K at ultra.
RT / DLSS games are just a handful, with an arguable quality uplift (many people say they can't see a difference). People are more hyped about these features due to Nvidia's marketing / reviewers than due to real usage scenarios (like PhysX was). Now if you tell me you value CUDA or NVENC, then yes, those are valid scenarios where Radeons don't have a say.
 
And another aspect of Big Navi's value is that the drivers seem to be great. No reviewer mentioned any serious problem. Let's hope it's the start of consistent stability there.
I don't remember any reviewer having driver issues at the RX 5000 series release either. But a few weeks later, the general public started posting online about various issues. Ultimately, time will tell.
 
I think the RX 5000 driver was good to begin with, and there is some function in the software stack that just doesn't work.
 
'With SAM enabled, we see the averages change "dramatically" (in the context of competition), with the RX 6800 XT now being 2% faster across all three resolutions. This helps the RX 6800 XT match the RTX 3080 at 1080p, while beating it by 1% at 1440p and being just 4% slower at 4K UHD—imagine these gains without even touching other features, such as Radeon Boost or Rage Mode! '

Now imagine TechPowerUp doing a serious review including Rage Mode + SAM together.
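For anyone puzzling over the quoted percentages: summary figures like "2% faster across all three resolutions" are typically built as a geometric mean of per-title or per-resolution performance ratios, which keeps any single outlier from dominating. A rough sketch with made-up ratios, not actual review data:

```python
from math import prod

def geomean_relative(ratios):
    """Geometric mean of performance ratios (card A / card B), the usual
    way 'average relative performance' summaries are computed."""
    if not ratios:
        raise ValueError("no ratios")
    return prod(ratios) ** (1.0 / len(ratios))

# Illustrative per-resolution ratios (6800 XT / 3080), invented numbers:
ratios = [1.00, 1.01, 0.96]
print(f"average relative performance: {geomean_relative(ratios):.3f}")
```

With the geometric mean, a ratio of 1.02 would correspond to the "2% faster" phrasing in the quote.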
 
I think you are dismissing valid arguments about RT and DLSS just as much as some people dismiss valid arguments in favor of AMD. But in the end, the important thing is that there are now choices for people depending on what they think is important to them, be it RT performance, power consumption, price, or whatever floats their boat when it comes to playing on their computer.

You're absolutely right. I'm not dismissing them entirely, though - it's just that a choice for 'more RT performance and DLSS' is a choice for a variable, completely abstract advantage. You just don't know how it will develop going forward, and the baseline performance on both cards is decent enough for 'playable' - except perhaps at 4K, where Nvidia might be able to keep over 45-50 FPS more readily with RT on. That is the extent of the validity of those arguments, really. FWIW, apparently AMD is also launching a DLSS equivalent. It'll likely not be as useful, but it will be more readily available to a broad range of games. But in much the same way, I wouldn't count on any of it.

In much the same vein, we can't really predict how VRAM requirements will develop going forward, but there is a similar sort of risk of lacking performance there on Nvidia's end, and it kind of extends to the 3070 too, with its rather low 8 GB. And then you're talking not just about RT perf, but about everything that takes a hit. At the same time, there is one guarantee: none of these cards can really do more RT than a few fancy effects here or there, and still lose a lot of frames. And since the GPUs in the consoles won't be getting faster this gen, that's what you've got for the next three to five years going forward.

There is indeed something to choose now, and that's great.
 
So what you are essentially telling me is that a 6800 takes more power than a 6800 XT. That doesn't seem realistic.
Ah, you misunderstood. Look at the charts from the thread and note how the 6800 XT runs closer to 300 W at higher resolutions. That's why I switched all cards 2080 Ti and up to power testing at 1440p instead of 1080p. Both reviews have been updated accordingly; check if the numbers listed in the review make more sense now.
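The reason the test scene matters: efficiency is just average FPS divided by measured board power, so a scene that doesn't fully load the card understates power draw and inflates the perf/W figure. A toy illustration with invented numbers, not measurements:

```python
def perf_per_watt(average_fps, average_power_w):
    """Efficiency metric: frames per second per watt in the test scene."""
    return average_fps / average_power_w

# Invented numbers to show the skew, not review data:
light = perf_per_watt(140.0, 220.0)  # partially loaded run (e.g. 1080p)
heavy = perf_per_watt(100.0, 295.0)  # fully loaded run (e.g. 1440p)
print(f"light load: {light:.3f} FPS/W, full load: {heavy:.3f} FPS/W")
# → light load: 0.636 FPS/W, full load: 0.339 FPS/W
```

The lightly loaded run looks far more "efficient" only because the card never reaches its rated board power, which is exactly why testing moved to 1440p.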
 
8-10 GB is a hard fact / limit people experience today at 1440p / 4K at ultra.
RT / DLSS games are just a handful, with an arguable quality uplift (many people say they can't see a difference). People are more hyped about these features due to Nvidia's marketing / reviewers than due to real usage scenarios (like PhysX was). Now if you tell me you value CUDA or NVENC, then yes, those are valid scenarios where Radeons don't have a say.
I already posted about it, but here it is again.

I personally liked RT in Control and more recently in WD:L, and I did find the difference noticeable when I had to deactivate it for some tests after some hours of playing. I also look forward to playing Minecraft with RT with some friends, as the results are really amazing.

Now, I have a 2070 Super OC; all the games I play usually run above 60 FPS at 1440p with very few exceptions. The most important one is WD:L, which hits the 8 GB VRAM limit when RT is on with the HD texture pack and the way I'm playing. So I'm aware of the low-VRAM issue you mentioned.

The reason I have been following this review is that I was hoping for a card with 16 GB of VRAM that could justify replacing my current GPU. But even with the better efficiency and rasterization performance, why would I buy a 6800 XT to get worse performance in the very games that would make me want to upgrade my current GPU in the first place?

So now you can tell me to drop RT in Legion and enjoy my 85 FPS with my 6800 XT (according to the Guru3D bench). But then I could also just deactivate it on my 2070 Super while keeping DLSS, which both solves the memory issue and gives me 80 FPS with a small drop in image quality, for 0€ and even less power consumption than the 6800 XT.
On the other hand, if I choose to buy the 3080, I will benefit from the FPS boost in every game like the 6800 XT would give, it will solve my issue with Legion, and it will allow me to play Minecraft with RT in comfortable conditions. I may run into issues with the 10 GB of VRAM at some point and it may cost me a bit more in electricity, but it still seems like a better deal for me right now.

@W1zzard: I'm curious about something regarding media playback. You mentioned that it puts the memory frequency at max; does it affect gaming performance in a meaningful way if you have a video playing while running a demanding game? Asking because I usually listen to YouTube videos while playing.
 
I'm curious about something regarding media playback. You mentioned that it puts the memory frequency at max; does it affect gaming performance in a meaningful way if you have a video playing while running a demanding game? Asking because I usually listen to YouTube videos while playing.
I've never tested that. I doubt it; decode happens on separate hardware in the GPU, and the clocks are already at max due to gaming.
 
I think the reason why multi-monitor consumes less power is the Infinity Cache, not the memory.
 
I think the reason why multi-monitor consumes less power is the Infinity Cache, not the memory.
No, the reason is that the memory clock is now running very low compared to previous AMD cards; check my reviews, the data is there.

It is possible that they leverage the L3 cache to reduce the number of memory accesses, so that the low memory clock is OK, but then why is this not a problem for NVIDIA, who don't have an L3 cache?
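If the on-die L3 ("Infinity") cache really does absorb a share of memory traffic, the card's effective bandwidth would be a blend of cache and GDDR6 bandwidth. A back-of-the-envelope sketch; the hit rate and cache bandwidth figures below are assumptions for illustration, not measured values:

```python
def effective_bandwidth(hit_rate, cache_bw_gbs, mem_bw_gbs):
    """Blended bandwidth when a fraction of accesses hit the on-die cache."""
    if not 0.0 <= hit_rate <= 1.0:
        raise ValueError("hit_rate must be between 0 and 1")
    return hit_rate * cache_bw_gbs + (1.0 - hit_rate) * mem_bw_gbs

# 512 GB/s is what a 256-bit GDDR6 bus at 16 Gbps provides; the cache
# bandwidth (1600 GB/s) and 58% hit rate are assumed, illustrative values.
print(f"{effective_bandwidth(0.58, 1600.0, 512.0):.0f} GB/s effective")
# → 1143 GB/s effective
```

Under those assumptions the narrow bus punches well above its raw number, which would also explain why a low idle memory clock is tolerable, whereas a cache-less design has to lean on the external bus alone.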
 
I noticed the card doesn't expose memory temp, right?
 
Is it just me, or does the Infinity Cache seem to help with a more consistent frame rate and a lot fewer spikes compared to the 3080?
 