
AMD Radeon RX 6800 XT

You are amazing!! :D :D

I really wish this scalping weren't happening... those prices :(


It is the new norm.

I remember back in the ATI HD 3870 days, the MSRP only lasted a few days. Then on Newegg every single 3870 went up in price big time.
 
So you want users to invest time and money in all of that just to reach the performance level of what Nvidia can give you for a mere $50 more, while offering much faster raytracing, features like DLSS, better video encoding, and stable drivers?

What do you mean it doesn't beat Nvidia in anything? From the Ars Technica review:
[image: rx-6800-series.010-1440x1080.jpg]
 
Confirming their inability to catch up with nVidia was a huge relief for me.

They clearly demonstrated with their CPU launch that they don't give a crap about the end user in terms of value for money, and all they wanted was a chance to pull ahead so they could overcharge for their products.

Thank you AMD but we didn't need a new "Intel" in place of Intel and we didn't need a new "nVidia" to replace nVidia.

This company has clearly lost its way...

Their inability to catch up? I wish I could see the world through your special eyes. At stock it's about 2-3% slower than a card that costs $50 more, and it uses 100 W less and has ~10% overclocking headroom out of the box compared to ~3% from team green. With other tweaks it ends up as fast or slightly faster, plus much better frame times. But I guess you can't see that; cognitive dissonance is real.

No one is forcing anyone to buy it, and everything is worth what someone is willing to pay. Do you think AMD gets all the money from the sale of these cards? If you do, perhaps try reading up on supply-and-demand economics.
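
A quick back-of-the-envelope check of the overclocking argument above, taking the quoted percentages at face value (actual scaling varies per game and per sample, so treat this as illustrative only):

Code:
# Does ~10% OC headroom on a card starting ~3% behind overcome
# ~3% headroom on the card starting ahead? Illustrative numbers only.
stock_6800xt = 0.97               # ~3% behind the 3080 at stock
stock_3080   = 1.00

oc_6800xt = stock_6800xt * 1.10   # ~10% claimed OC headroom
oc_3080   = stock_3080 * 1.03     # ~3% claimed OC headroom

print(f"6800 XT after OC: {oc_6800xt:.3f}")   # ~1.067
print(f"3080 after OC:    {oc_3080:.3f}")     # ~1.030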
 
Confirming their inability to catch up with nVidia was a huge relief for me.

They clearly demonstrated with their CPU launch that they don't give a crap about the end user in terms of value for money, and all they wanted was a chance to pull ahead so they could overcharge for their products.

Thank you AMD but we didn't need a new "Intel" in place of Intel and we didn't need a new "nVidia" to replace nVidia.

This company has clearly lost its way...


Companies exist to make $$$$$, not to become friends with folks.
 
Yeah, that is a complete lie. The actual data are completely different.

Steam Catalog size 23000+

Xbox Catalog size is 1001

Xbox One Catalog size is 2645

Xbox One X Catalog size is 428

Yeah, most games are actually made for PC. Stop repeating this long-ago debunked AMD fanboy crap already.
He was most likely referring to games most people play. Not indie crap.
What shift? Why without CPU or RAM? My two-year-old 9900K is paired with 64 GB of RAM at the standard speed for the platform. And I still get the same 4K framerate as I would with a brand new top-of-the-line Ryzen with expensive high-speed memory and a lot of time invested in fine-tuning it. That's according to W1zzard's own review here on TPU.

Core i9-9900K 100.7%, Ryzen 9 5950X 100%.

[image: relative-performance-games-38410-2160.png]


So what exactly are you talking about?
What shift? Why without CPU or RAM? My two-year-old 9900K is paired with 64 GB of RAM at the standard speed for the platform. And I still get the same 4K framerate as I would with a brand new top-of-the-line Ryzen with expensive high-speed memory and a lot of time invested in fine-tuning it. That's according to W1zzard's own review here on TPU.

Core i9-9900K 100.7%, Ryzen 9 5950X 100%.

[image: relative-performance-games-38410-2160.png]


So what exactly are you talking about?
Can you imagine a world where there are thousands upon thousands of people with CPUs that came out 2, 3, 4 or 5 years ago? They might want to upgrade. Some might want to optimize to the max. Perhaps even buy perfectly tuned RAM with the new "in thing", a Ryzen 5000 series CPU.

Granted, the difference for you would be minimal, but I'm going to go out on a limb and guess that those that have a 9900K are a minority of the entire PC gaming space.
 
TLDR:
  • Just buy ... NVIDIA. Even 3-5 years from now you'll be able to play the most demanding titles thanks to DLSS just by lowering resolution.
There, fixed that for you.


Moving on from the Infinity Cache trashers who just got schooled by reality.


Great release, I'm calling it a draw @Earthdog. :)
 
As for the performance disparity between the 6800 XT and 3080, this review favors the 3080 far more than the other five reviews I've read. Sure, if you are one of the five people in the world who play Anno 1800, Nvidia is the way to go, no doubt. At least two of the other games in this review I've never heard of. From the looks of it, the games are so obscure that AMD hasn't even bothered to create drivers for them. Yet they were included in this review only to skew results in Nvidia's favor, as it seems the author knows his audience here. This message board seems to have more delusional green Kool-Aid drinkers than most.
Hardware Unboxed and Gamers Nexus came to a similar conclusion as TechPowerUp (1080p being the only real difference):
- RDNA2 has better performance per watt
- Offers more VRAM
- The 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
- Worse in RT (roughly on par with the 2080 Ti)
- Lacks AI supersampling ("DLSS")

It all comes down to what features you want. The price/performance ratio is about the same. The 6800 is faster and has more VRAM than the 3070, but costs 16% more. The 6800 XT is on par with the 3080 and has more VRAM, BUT has worse RT and no AI SS, and costs 7% less. All AMD brings to the table is more options to choose from, but it isn't necessarily better value. Now, a 6800 at $499 and a 6800 XT at $599, those would be true Ampere killers. AMD has clearly chosen profit margins over market share gains. That's their decision to make. I personally still hate Ngreedia because of Turing, but I've fallen out of love with team red too, since they've decided to raise profit margins. I'll just buy what suits my needs best for as cheap as I can get it.
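
For anyone who wants to sanity-check those percentages, here is a quick sketch based on the launch MSRPs (RTX 3070 $499, RX 6800 $579, RX 6800 XT $649, RTX 3080 $699; street prices obviously differ):

Code:
# Back-of-the-envelope check of the price gaps quoted above, using launch MSRPs.
msrp = {"RTX 3070": 499, "RX 6800": 579, "RX 6800 XT": 649, "RTX 3080": 699}

print(f"RX 6800 vs RTX 3070:    {msrp['RX 6800'] / msrp['RTX 3070'] - 1:+.0%}")    # ~+16%
print(f"RX 6800 XT vs RTX 3080: {msrp['RX 6800 XT'] / msrp['RTX 3080'] - 1:+.0%}") # ~-7%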
 
Hardware Unboxed and Gamers Nexus came to a similar conclusion as TechPowerUp (1080p being the only real difference):
- RDNA2 has better performance per watt
- Offers more VRAM
- The 6800 XT is on par with the 3080 in standard rasterization (better at 1080p, roughly on par at 1440p, worse at 4K)
- Worse in RT (roughly on par with the 2080 Ti)
- Lacks AI supersampling ("DLSS")

It all comes down to what features you want. The price/performance ratio is about the same. The 6800 is faster and has more VRAM than the 3070, but costs 16% more. The 6800 XT is on par with the 3080 and has more VRAM, BUT has worse RT and no AI SS, and costs 7% less. All AMD brings to the table is more options to choose from, but it isn't necessarily better value. Now, a 6800 at $499 and a 6800 XT at $599, those would be true Ampere killers. AMD has clearly chosen profit margins over market share gains. That's their decision to make. I personally still hate Ngreedia because of Turing, but I've fallen out of love with team red too, since they've decided to raise profit margins. I'll just buy what suits my needs best for as cheap as I can get it.


It's important to remember the types of people who buy these cards OC. Nvidia's next-gen cards don't really OC at all regardless of what you do to them, and these get close to 10% gains on OC... so really my 6800 is already matching a 3080 when you add that in with my tuned RAM / Rage Mode / Smart Access Memory. Probably surpassing a 3080 at this point, actually, not sure. Don't care. But it's a damn good deal for $579.

So you post an AMD-sponsored game here, not even at 4K, but at 1440p instead (LOL). Why don't you post this instead, from the same source?
[image: rx-6800-series.001-1440x1080.jpeg]


This game was heavily optimized for next-gen consoles as a launch title. It's a sign of future AAA games as well, console comes first. ;) AMD will continue to show improved numbers versus Nvidia in future next-gen titles.
 
TLDR:
  • Truly stellar rasterization performance per watt thanks to the 7nm node.
  • Finally solved multi-monitor idle power consumption!!
  • Quite poor RTRT performance (I expected something close to the RTX 2080 Ti; nope, far from it).
  • No tensor cores (they are not just for DLSS but for various handy AI features like background noise removal, background image removal and others).
  • Horrible H.264 hardware encoder.
  • Almost no attempt at creating decent competition. I remember when AMD used to fiercely compete with NVIDIA and Intel. AMD in 2020: profits and brand value (fueled by Ryzen 5000) first.
Overall: kudos to the RTG marketing machine, which never fails to overhype and underdeliver. In terms of being future-proof, people should probably wait for RDNA 3.0 or buy ... NVIDIA. Even 3-5 years from now you'll be able to play the most demanding titles thanks to DLSS just by lowering resolution.

Please keep this joke of a post here on TPU and don't bring it over to the AnandTech forum.
 
It's important to remember the types of people who buy these cards OC.

Lol, for most of AMD's history since Polaris, Vega and Fury, AMD users were UNDERVOLTING GPUs, not OCing them xD. Because they ran like shit: noisy and hot.
 
Actually, you don't. AMD in fact HAD to go for a feature like Infinity Cache. It was not an option, it was a requirement, because of their raytracing solution. If you look at the RDNA2 presentation slides, there is a clear line that says:
"4 texture OR ray ops/clock per CU"
[image: m7WjdguDI7kJLI8C.jpg]


Now what do you think that means? I'll enlighten you. Remember there is such a thing as the bounding volume hierarchy (BVH). That is a big chunk of data, as it holds the bounding boxes for all objects in the scene to speed up ray intersection tests. Unfortunately for AMD, as you see in the slide, their cores cannot perform raytracing operations at the same time as texturing operations, unlike in Nvidia's design. Even worse, they use the same memory (as AMD repeatedly stated) as they use for texture data. If AMD GPUs did not have the Infinity Cache, they would be in huge trouble, as their per-core cache would keep being invalidated all the time, having to dump the BVH data and replace it with texture data (and vice versa). And you can see Big Navi paying the price for that at 4K and in raytracing.
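
To make the cache-thrashing point concrete, here is a minimal, purely illustrative sketch of BVH traversal (nothing vendor-specific, not AMD's actual hardware path): every step is a data-dependent fetch of the next node, so if texture traffic keeps evicting BVH nodes, each of those fetches has to go back out to memory.

Code:
# Minimal BVH traversal sketch (illustrative only). Each loop iteration
# reads a node whose address depends on the previous read - classic
# pointer chasing - which is why keeping BVH nodes resident in cache matters.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    lo: tuple                      # AABB min corner (x, y, z)
    hi: tuple                      # AABB max corner (x, y, z)
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    tri: object = None             # leaf payload: a triangle, if any

def ray_hits_box(lo, hi, origin, inv_dir):
    """Slab test: does the ray hit this box? (ignores the parallel-ray edge case)"""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        t0 = (lo[a] - origin[a]) * inv_dir[a]
        t1 = (hi[a] - origin[a]) * inv_dir[a]
        tmin = max(tmin, min(t0, t1))
        tmax = min(tmax, max(t0, t1))
    return tmax >= tmin

def traverse(root, origin, inv_dir):
    """Collect leaf triangles whose bounding boxes the ray touches."""
    stack, candidates = [root], []
    while stack:
        node = stack.pop()                    # dependent memory read
        if node is None or not ray_hits_box(node.lo, node.hi, origin, inv_dir):
            continue
        if node.tri is not None:
            candidates.append(node.tri)       # leaf: test the triangle later
        else:
            stack.append(node.left)
            stack.append(node.right)
    return candidates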

A simple way to completely disprove what you claim is to look at the SoCs used in the consoles: the last-level cache that you desperately try to prove AMD added to make RT viable is nowhere to be found on those, which means that clearly wasn't its purpose. Your speculation is plain and simple wrong.

You must be out of your mind to believe for a second that AMD dedicated 1/5 of the die just to improve RT performance, which ended up being inferior to Nvidia's anyway. There are other flaws with their implementation that would need to be addressed long before bandwidth became an issue.

Also, because of the way BVH traversal works, some pointer chasing is required, meaning caches don't help much.

No, that cache is an inevitable evolution of GPUs: the ratio of DRAM bandwidth per thread has plummeted over the years, and it's obvious that there will be a point after which no more performance can be extracted. It's there to aid all-around performance, and I bet Nvidia will be forced to implement something similar at some point. Anyway, the point is that it has nothing to do with RT.
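
As a toy model of that "effective bandwidth" argument (the hit rates and cache bandwidth below are hypothetical, chosen only to illustrate the idea; ~512 GB/s is roughly what a 256-bit GDDR6 interface provides):

Code:
# Toy model: a large last-level cache raises *effective* bandwidth even when
# the DRAM interface stays fixed. Hit rates and cache bandwidth are hypothetical.
def effective_bandwidth(dram_gbps, cache_gbps, hit_rate):
    """Blend cache and DRAM bandwidth by hit rate (very rough model)."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * dram_gbps

for res, hit in [("1080p", 0.75), ("1440p", 0.65), ("4K", 0.55)]:
    print(f"{res}: ~{effective_bandwidth(512, 1600, hit):.0f} GB/s effective")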
 
He was most likely referring to games most people play. Not indie crap.


Can you imagine a world where there are thousands upon thousands of people with CPUs that came out 2, 3, 4 or 5 years ago? They might want to upgrade. Some might want to optimize to the max. Perhaps even buy perfectly tuned RAM with the new "in thing", a Ryzen 5000 series CPU.

Granted, the difference for you would be minimal, but I'm going to go out on a limb and guess that those that have a 9900K are a minority of the entire PC gaming space.
That is an incorrect statement. The correct statement would be: you can get the same gaming performance from buying a standard Intel CPU today, with standard RAM, as you would from the newest top-of-the-line Ryzen CPUs with expensive, highly clocked and tuned RAM. Simply because you are GPU limited. Even if you manage to scrape out a 1% difference somewhere, you won't notice the difference in game. You will, however, notice the difference in the amount of time and money invested.
 
So you post an AMD-sponsored game here, not even at 4K, but at 1440p instead (LOL). Why don't you post this instead, from the same source?
[image: rx-6800-series.001-1440x1080.jpeg]


Some of us play games other than Minecraft.

If your specialized RTRT version of Minecraft brickwork needs a green card to make the bricks ray traced, do it, but why thread-crap about a specific title with a specific add-on in specific scenarios when 90%+ of others don't care? I am not saying don't comment, but why trash talk about such a specific scenario? How fast does your 3080 render this webpage? Fast enough to ignore the other overwhelming evidence that AMD has made cards that put Nvidia to shame for power consumption and frame times? Fast enough to overlook that the competition is only going to help consumers?
 
If this is on average double the performance of the 5700 XT, and the 5700 XT is now last gen, I wonder if 5700 XT prices will drop to US$300 or less?
 
This game was heavily optimized for next-gen consoles as a launch title. It's a sign of future AAA games as well, console comes first. ;) AMD will continue to show improved numbers versus Nvidia in future next-gen titles.
The same argument has been made ever since the first AMD-powered consoles appeared, for what, close to a decade now? And it never happened.
 
Sure, if you pull out some obscure game like that you can make AMD look good. /s


But that game is obscure, and no one wants 60 FPS on ultra settings; they want 20 FPS at 4K with ultra-ish settings!!!
 
Actually, you don't. AMD in fact HAD to go for a feature like Infinity Cache. It was not an option, it was a requirement, because of their raytracing solution. If you look at the RDNA2 presentation slides, there is a clear line that says:
"4 texture OR ray ops/clock per CU"
[image: m7WjdguDI7kJLI8C.jpg]


Now what do you think that means? I'll enlighten you. Remember there is such a thing as the bounding volume hierarchy (BVH). That is a big chunk of data, as it holds the bounding boxes for all objects in the scene to speed up ray intersection tests. Unfortunately for AMD, as you see in the slide, their cores cannot perform raytracing operations at the same time as texturing operations, unlike in Nvidia's design. Even worse, they use the same memory (as AMD repeatedly stated) as they use for texture data. If AMD GPUs did not have the Infinity Cache, they would be in huge trouble, as their per-core cache would keep being invalidated all the time, having to dump the BVH data and replace it with texture data (and vice versa). And you can see Big Navi paying the price for that at 4K and in raytracing.

It's 4 box OR 1 triangle intersection per cycle:

[image: arch5.jpg]
 
Gamers need this GPU; here are the reviews.


Better than the RTX 3080, with lower power consumption, and it's cheaper.
At 1080p it's even better than the RTX 3090:
[attached benchmark charts]

Good job AMD for beating NVIDIA. :clap:

(Some NVIDIA fans are not happy, lol.)
 
Some of us play games other than Minecraft.

If your specialized RTRT version of Minecraft brickwork needs a green card to make the bricks ray traced, do it, but why thread-crap about a specific title with a specific add-on in specific scenarios when 90%+ of others don't care? I am not saying don't comment, but why trash talk about such a specific scenario? How fast does your 3080 render this webpage? Fast enough to ignore the other overwhelming evidence that AMD has made cards that put Nvidia to shame for power consumption and frame times? Fast enough to overlook that the competition is only going to help consumers?
Then why do it for Assassin's Creed Valhalla? Do you think more people are playing that than Minecraft? ;)
 
Lol, for most of AMD's history since Polaris, Vega and Fury, AMD users were UNDERVOLTING GPUs, not OCing them xD. Because they ran like shit: noisy and hot.


Times change, I guess? Neat.

Then why do it for Assassin's Creed Valhalla? Do you think more people are playing that than Minecraft? ;)


Next-gen console optimization is a real thing, and it will continue to favor AMD in AAA ports in the future.
 
I think you should rerun the entire benchmark with a different Windows installation/drivers and maybe different hardware, because something is fishy in your tests.

E.g. in ACO the 6800 is FASTER than the 6800 XT at 1080p, while the FPS are very low in general, whereas in other benchmarks of the same game the 6800 XT was faster even compared to the 3090.

These measurements are very low, they affect the "relative performance" average you give, which many people will refer to in the near future, and I find this misleading.
They are equal. The differences you are seeing are random variance between test runs. AMD's driver is more CPU-limited than the NVIDIA driver. Note how the FPS are identical (within random margin) at 1080p and 1440p.
 