
AMD Radeon RX 6800

I'm speaking from the overall benchmarks I've seen so far. Let's treat the 3080 and 6800 XT as being in the same performance ballpark, and 1440p as our control environment.
1) In games where the 6800 XT leads the 3080 at 1440p, at 4K the 3080 leads or the difference is minuscule.
2) In games where the 6800 XT is a hair slower than the 3080 at 1440p, at 4K the 3080 leads by more.

The 6800, on the other hand, is faster than the 3070 across the board, but at 4K the gap narrows.

That's not the proper way to figure out whether a GPU runs short of memory bandwidth as resolution increases.

The 6800 and 6800 XT have exactly the same memory bandwidth, yet at 4K the XT is 15% faster while at 1440p it's only 12% faster. The XT scaled better, not worse, which is the opposite of what you'd expect if it were actually lacking memory bandwidth. That's also partly because games are more CPU-bound at 1440p, but it's fairly obvious nothing out of the ordinary is going on that would suggest otherwise.
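To make that concrete, here's a quick back-of-the-envelope sketch of the scaling argument (the FPS values are invented for illustration; only the 12%/15% leads from the reviews are real):

```python
# Hypothetical FPS numbers; only the ratios matter here.
fps = {
    "1440p": {"RX 6800 XT": 112.0, "RX 6800": 100.0},  # XT ~12% ahead
    "4K":    {"RX 6800 XT":  63.3, "RX 6800":  55.0},  # XT ~15% ahead
}
for res, results in fps.items():
    lead = results["RX 6800 XT"] / results["RX 6800"] - 1
    print(f"{res}: XT leads by {lead:.1%}")
# If both cards were pinned against the same bandwidth ceiling, the wider
# XT would hit it first and its lead would shrink at 4K. It grows instead
# (12% -> 15%), which argues against a bandwidth wall relative to the 6800.
```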
 
I'm not saying the RX 6800 is bandwidth starved while the 6800 XT is not. I'm saying they are both bandwidth starved at 4K, compared to the 3000 series, that is.
 
While I have no data to support this claim, have you considered that L3 cache might simply see lower hit rates at 4K?
 
The 6800 is totally capable of 4K.

I think that's highly dependent on expectations.
I was considering getting a 3070 that would have to do some 4K gaming, and I knew I'd have to lower settings here and there. With future titles, you'll probably need to scale back even more. Whether that fits your definition of a 4K card, only you can decide.

Edit: It gets even more complicated when you factor in RTRT.
 
While I have no data to support this claim, have you considered that L3 cache might simply see lower hit rates at 4K?
I don't know for sure whether bandwidth starvation is happening, but looking at multiple reviews, both the RX 6800 and RX 6800 XT shine brightest at 1080p, while the gap to their Nvidia counterparts shrinks as the resolution goes up.
As for L3 cache hit rate, I'll admit I'm not knowledgeable about the subject. But according to AMD, it does indeed see a lower hit rate at 4K.

While I can't find the slide image, here's a post discussing the AMD slide: 54% hit rate at 4K, 67-68% at 1440p, and 75% at 1080p. So are you saying the reduced hit rate means Infinity Cache is ineffective at higher resolutions?
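To put those hit rates in perspective, here's a crude effective-bandwidth sketch. The 512 GB/s figure is the cards' actual 256-bit GDDR6 bandwidth; the 1/(1 - hit rate) amplification model, and the assumption that the cache itself never bottlenecks, are my simplifications, not AMD's published math:

```python
RAW_GDDR6_BW = 512.0  # GB/s: 256-bit bus @ 16 Gbps GDDR6

def effective_bw(hit_rate: float, raw_bw: float = RAW_GDDR6_BW) -> float:
    # If only cache misses consume DRAM bandwidth, the total traffic the
    # GPU can sustain is raw_bw / miss_rate.
    return raw_bw / (1.0 - hit_rate)

for res, hit in (("1080p", 0.75), ("1440p", 0.675), ("4K", 0.54)):
    print(f"{res}: {hit:.1%} hit rate -> ~{effective_bw(hit):,.0f} GB/s effective")
# 1080p: ~2,048 GB/s; 1440p: ~1,575 GB/s; 4K: ~1,113 GB/s
```

Even at a 54% hit rate the cache still roughly doubles effective bandwidth in this model, so "less effective at 4K" seems more accurate than "ineffective".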
 
So are you saying the reduced hit rate means Infinity Cache is ineffective at higher resolutions?
That's exactly the definition of "reduced hit rate" in caching.
 
So isn't that a roundabout way of saying RDNA 2 is bandwidth starved at higher resolutions?
Infinity Cache was supposed to help alleviate the bandwidth limitation of the 256-bit GDDR6 bus, wasn't it?
 
It's complicated. If the GPU is frequently waiting for data to process, then it's starved, but that's not easily measured.
Plus, memory is always slower than the GPU (or the CPU, for that matter), so in that sense you could say all GPUs are bandwidth starved; otherwise we wouldn't need caches in the first place.
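One common way to make "waiting for data" measurable on paper is the roofline model: a workload is bandwidth-bound when its arithmetic intensity falls below the machine's balance point. A minimal sketch, using rough spec-sheet peaks for the RX 6800 and purely hypothetical workload intensities:

```python
# Roofline model: attainable = min(peak_compute, intensity * bandwidth).
PEAK_TFLOPS = 16.2  # rough FP32 peak for the RX 6800
BW_TBPS = 0.512     # 512 GB/s raw GDDR6 bandwidth

def attainable_tflops(intensity: float) -> float:
    # intensity is in FLOP per byte of memory traffic
    return min(PEAK_TFLOPS, intensity * BW_TBPS)

balance = PEAK_TFLOPS / BW_TBPS  # ~31.6 FLOP/byte
print(f"machine balance: {balance:.1f} FLOP/byte")
for intensity in (8, 32, 128):  # hypothetical workloads
    kind = "bandwidth-bound" if intensity < balance else "compute-bound"
    print(f"{intensity:>4} FLOP/byte -> {attainable_tflops(intensity):.1f} TFLOPS ({kind})")
```

In this framing, "starved" just means the bandwidth branch of that min() wins, and a bigger cache raises effective bandwidth so fewer workloads land there.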
 
The benchmarks don't show that. The 6800 seems to be firmly a 3070 competitor, as it's wedged in between the 3070 and 3080 performance-wise. AMD would be wise to adjust their prices quickly. And let's face reality: AMD has to be planning the 6900 and 6900 XT. Those will be the models that compete with (or even beat) the 3080, 3080 Ti and 3090.
The 6800 is faster in every 1440p and 2160p benchmark, which is where it should be given the amount of VRAM it has compared to the 3070, and in most cases it's 10-20 FPS faster at those resolutions. It will only get better with mature drivers. Again, the 6700 will, I'm guessing, be a little slower than a 3070 and around 50-100 dollars cheaper, and it will probably overclock to reach the 3070, or close enough to it.

I really think that if they wanted it to go up against the 3070, we would have gotten a 6850 moniker. I bet they are banking on the 6700 to do well, and maybe even cannibalize the 6800 series, given the number of 6800 dies that didn't make the cut and will just be cut down into 6700s. Just my guess; I haven't put any time into researching the 6700 beyond knowing it's coming, but with these 6800 benchmarks I will be looking out for it for sure.

Why would you hate to say that? It's an excellent point! The 6800 and 6800 XT are going to be a godsend for anyone wanting to do rendering and other tasks that require large amounts of VRAM. In that context, the price is excellent and the 6800 is a very well balanced card for the money.

I said this over in the 6800 XT review, but it bears repeating: welcome back to the GPU performance party, AMD! Very nice, this!
Because I don't want there to be a 580-dollar card... lol. I've bought plenty of high-end cards new for around 400 bucks in the past, after they'd been out for a few months to a year (obviously that has nothing to do with the MSRP, just me being thrifty... haha). So you're right, 580 is great and will be a godsend for anyone wanting to do rendering, and given how few of them we'll actually see, that's probably why they're priced higher than the 3070.
 
Are we looking at the same review? Because the averages clearly show what I was referring to:

Please understand that I'm not in any way dissing or downplaying the new Radeon hotness here. This is one hell of a showing, and these cards put AMD back in the top-tier GPU arena!
 
[Chart: gaming power consumption for all games at all resolutions on the RX 6800 XT]

This is all my games at all resolutions on the 6800 XT. I think for the next round of retesting I'll revise my power testing to be a bit more demanding.
That's more in line with the power consumption levels that have mostly been measured elsewhere.

There's certainly a very valid point in using the same game as a reference for all cards.
But the measured power draw coming in lower than in many games at typical high-end settings isn't exactly helpful for the average user who doesn't understand that.

Though I wonder if there's still a Power Save profile similar to the Vega 64's, which dropped power draw dramatically with only a minor performance hit:
https://www.techpowerup.com/review/amd-radeon-rx-vega-64/29.html
 
Are we looking at the same review? Because the averages clearly show what I was referring to:

Please understand that I'm not in any way dissing or downplaying the new Radeon hotness here. This is one hell of a showing, and these cards put AMD back in the top-tier GPU arena!
What were we talking about? lol. You don't agree that the 6800 is faster than the 3070 in most of the benchmarks? I would agree with you if the 6800 only had 8 GB, but it has 16 GB. We'll have to disagree. I don't even know why we're still talking about this, because I don't plan to buy either card. I don't totally disagree with you, but if they drop the price of the 6800 to the 3070's price, that doesn't leave a lot of room for the 6700. I think the 6700 will be priced lower than the 3070, and the 6700 XT will be priced a little higher than or the same as the 3070, with a VRAM amount closer to the 3070's.
 
No, I'm saying that the 6800 is direct competition for the 3070, not the 3080, because its performance falls closer to the 3070's mark. I drew that conclusion from the numbers shown here by W1zzard and elsewhere. The 6800 XT is competition for the 3080 and a solid alternative to the 3090. AMD is very likely planning a 6900/6900 XT for early next year, which will likely be the direct competition for the 3090.
 
Yeah, it looks good for AMD this round.
 
This review has been updated with new gaming power consumption numbers for the RTX 2080 Ti, RTX 3070, RTX 3080, RTX 3090, RX 6800 and RX 6800 XT. For these cards I'm now running Metro at 1440p instead of 1080p, to ensure proper loading.

The perf/W charts have been updated too, along with the relevant text.
 
Greatly appreciated! :)
 
The 6800 is totally capable of 4K.

Did I say that it's not capable of 4K in general? If one has a 2160p120 or 2160p144 monitor and is aiming at high graphical settings in today's AAA games, I would recommend going with the XT instead.
 
No, I'm saying that the 6800 is direct competition for the 3070, not the 3080, because its performance falls closer to the 3070's mark. I drew that conclusion from the numbers shown here by W1zzard and elsewhere. The 6800 XT is competition for the 3080 and a solid alternative to the 3090. AMD is very likely planning a 6900/6900 XT for early next year, which will likely be the direct competition for the 3090.
I wonder what the odds are that AMD reduces the bus width and utilizes HBM2/HBM2E. Infinity Cache means that with HBM they could use a narrower bus and still have better bandwidth than GDDR6/GDDR6X, even more so than before. Cost is still a consideration, though the power consumption and heat benefits offset it in turn. Here are some hypothetical comparisons factoring in Infinity Cache with different memory types and bus widths. The fact that Infinity Cache combined with 128-bit HBM2E can deliver slightly more total bandwidth than 384-bit GDDR6X, i.e. more than three times the bandwidth per unit of bus width, is a rather staggering difference (the sketch below works through the per-width math).

256-bit GDDR6: 624 GB/s (w/o Infinity Cache)
128-bit HBM2: 832 GB/s (with Infinity Cache), 8 GB capacity
384-bit GDDR6X: 936 GB/s (w/o Infinity Cache)
128-bit HBM2E: 997.75 GB/s (with Infinity Cache), 24 GB capacity

I think one reason AMD could even consider doing so is the mobile segment, where heat output and power consumption matter much more given battery life and the cooling constraints of the form factor. AMD also has an efficiency winner on its hands with RDNA2, and such a chip could dominate the mobile gaming segment in a really big way. Still, I'm wondering how bandwidth dependent AMD's RDNA2 architecture is compared to Nvidia's Ampere; that's obviously another big factor in the bandwidth and bus width options that Infinity Cache puts a neat twist on.
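That "more than three times" works out as bandwidth per unit of bus width rather than total bandwidth; here's a small sketch running the numbers from the list above (the per-bit framing is my reading of the post):

```python
# Hypothetical configs and bandwidth figures as listed above.
configs = [
    ("256-bit GDDR6, no IC",  256, 624.00),
    ("128-bit HBM2 + IC",     128, 832.00),
    ("384-bit GDDR6X, no IC", 384, 936.00),
    ("128-bit HBM2E + IC",    128, 997.75),
]
baseline = 936.00 / 384  # GB/s per bus bit for 384-bit GDDR6X
for name, width, bw in configs:
    per_bit = bw / width
    print(f"{name:>22}: {per_bit:.2f} GB/s per bus bit "
          f"({per_bit / baseline:.1f}x vs 384-bit GDDR6X)")
# 128-bit HBM2E + IC delivers ~7.8 GB/s per bus bit, ~3.2x that of GDDR6X.
```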
 
dude, it beats a 2080 Ti and is $580. It's a great deal IMO. I am very happy AMD is competing again, and so should you be. The future of PC gaming is bright.
I mean, a 3070 also does that at $500, but OK.
 