Monday, November 2nd 2020

AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

AMD sent ripples through the industry with its late-October event launching the Radeon RX 6000 series RDNA2 "Big Navi" graphics cards, when it claimed that the top RX 6000 series parts compete with the very fastest GeForce "Ampere" RTX 30-series graphics cards, marking the company's return to the high-end graphics market. In its announcement press deck, AMD showed the $579 RX 6800 beating the RTX 2080 Ti (essentially the RTX 3070), the $649 RX 6800 XT trading blows with the $699 RTX 3080, and the top $999 RX 6900 XT performing in the same league as the $1,499 RTX 3090. Over the weekend, the company released even more benchmarks, with the RX 6000 series GPUs and their competition from NVIDIA tested by AMD on a platform powered by the Ryzen 9 5900X "Zen 3" 12-core processor.

AMD released its benchmark numbers as interactive bar graphs on its website. You can select from ten real-world games, two resolutions (1440p and 4K UHD), game settings presets, and even the 3D API for certain tests. Among the games are Battlefield V, Call of Duty: Modern Warfare (2019), Tom Clancy's The Division 2, Borderlands 3, DOOM Eternal, Forza Horizon 4, Gears 5, Resident Evil 3, Shadow of the Tomb Raider, and Wolfenstein: Youngblood. In several of these tests, the RX 6800 XT and RX 6900 XT are shown taking the fight to NVIDIA's high-end RTX 3080 and RTX 3090, while the RX 6800 is shown to be significantly faster than the RTX 2080 Ti (roughly RTX 3070 scores). The Ryzen 9 5900X itself is claimed to be a faster gaming processor than Intel's Core i9-10900K, and features a PCI-Express 4.0 interface for these next-gen GPUs. Find more results and the interactive graphs at the source link below.
Source: AMD Gaming Benchmarks

147 Comments on AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

#76
mysterfix
Zach_01I'm not going to argue... but
AMD pricing +54% more for a +5% perf GPU has somehow followed the utter stupidity of...
nVidia pricing +114% more for a +10% perf GPU

It is what it is...!

Nonetheless, AMD's cards have more perf/$ value.
The simple fact is both parts are very low-yield cards, and both companies know we have silly people with more money than common sense who will pay to have "The Fastest Bestest graphics card money can buy". If no one bought them, they would lower prices. (Those price/perf percentages check out against launch MSRPs, as the sketch below shows.)
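For what it's worth, here is a quick sketch of those numbers, using launch MSRPs and taking the quoted perf deltas at face value (they are the quoted poster's figures, not measurements):

```python
# Checking the price/perf deltas quoted above (launch MSRPs; the perf
# percentages are the quoted poster's claims, not measured data).
cards = [
    # (cheaper card, price, faster card, price, claimed perf gain)
    ("RX 6800 XT", 649, "RX 6900 XT",  999, 0.05),
    ("RTX 3080",   699, "RTX 3090",  1499, 0.10),
]

for base, p0, top, p1, perf in cards:
    price_up = (p1 - p0) / p0                    # relative price increase
    ratio = (1 + perf) / (p1 / p0)               # perf/$ of top card vs. base
    print(f"{top} vs {base}: +{price_up:.0%} price for +{perf:.0%} perf "
          f"-> {ratio:.2f}x the perf/$ of the {base}")
```

This prints roughly +54% price for +5% perf (AMD) and +114% for +10% (NVIDIA), so on these claims the halo cards keep only 0.68x and 0.51x of the perf/$ of the tier below them, respectively.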
Posted on Reply
#77
OGoc
The average frame rates using the 5900X seem lower than TPU's using a 9900K @ 5.0 GHz.
Posted on Reply
#78
ODOGG26
EarthDogI've heard Rage Mode isn't much, and that it is mostly SAM doing this(?). I recall Linus mentioning that it isn't much more than a power limit increase and a fan speed increase to get more boost.

This is great for the few who are balls deep in their ecosystem... but what about the majority of users? How many, in the current landscape, are using non-B550/X570 systems (an overwhelming majority, surely)? You need to upgrade your CPU and mobo to support this feature. Personally, for the majority, we need to see how these perform without it. From the chart above, it looks like that takes a win back to a loss and a couple of wins back to virtual ties. I'd love to see this compared to overclocked 3080s instead of whatever they have... oh, and on the same API.
I don't see what you are seeing in that chart. I see, from left to right: win, tie, win, win (you can chalk it up to variance, but the bar is higher), win (same as the previous win), loss, loss. That is if you ignore the white added performance. Technically more wins than losses, but all in all we can call them equal. As far as overclocking goes, I don't think the 3080 has a chance OC vs. OC. These new Ampere cards don't overclock well; there's not much left on the table, imo. Not proven yet, but there are already reports of great OC results for RDNA 2.
Posted on Reply
#79
InVasMani
I still want to see 1080p results; the RX 6800 will probably beat the RTX 3080 in quite a few of those cases, which is somewhat hilarious (the e-league Intel crowd will probably love those results). I don't understand Nvidia's thinking with the Ampere design: they are pushing 4K performance while also pushing RTRT, which has no chance in hell of being practical at 4K with good results. There is something very wrong with that picture to me. It'd be really interesting to see how RDNA2 does at RTRT at 1080p compared to RTX, and even at 720p for that matter. The mClassic would be rather interesting too: render RTRT at 720p and upscale it to 1080p 120 Hz.
Posted on Reply
#80
Cheeseball
Not a Potato
Modern Warfare and BFV have always been a bit more Radeon-biased; CODMW is up to 10% better on the RX 5700 XT compared to the 2070 Super with the same graphical settings (RT off, NVIDIA Reflex off), and BFV has always been shown to be faster on AMD cards (the 5700 XT can beat the 2080 Super at 1080p and 1440p).

SOTR and Doom Eternal seem to be the most even benchmarks of them all, with the Foundation engine in SOTR running on DX12 (although it supports DLSS and RTX) and id Tech 7 on Vulkan.

I'd like to see how well these do in the Quantic Dream engine or in the latest revision of Unreal Engine 4. My prediction is that the 6800 XT is match-for-match with the RTX 3080, and the RTX 3090 will be beaten on account of the waaaaay more affordable price.
ODOGG26I don't see what you are seeing in that chart. I see, from left to right: win, tie, win, win (you can chalk it up to variance, but the bar is higher), win (same as the previous win), loss, loss. That is if you ignore the white added performance. Technically more wins than losses, but all in all we can call them equal. As far as overclocking goes, I don't think the 3080 has a chance OC vs. OC. These new Ampere cards don't overclock well; there's not much left on the table, imo. Not proven yet, but there are already reports of great OC results for RDNA 2.
The RX 5700 XT didn't really overclock well either (2,000+ MHz only yielded at most 10 extra FPS with most models), but we'll see how the 6800 XT works out.
Posted on Reply
#81
EarthDog
ODOGG26I don't see what you are seeing in that chart. I see, from left to right: win, tie, win, win (you can chalk it up to variance, but the bar is higher), win (same as the previous win), loss, loss. That is if you ignore the white added performance. Technically more wins than losses, but all in all we can call them equal. As far as overclocking goes, I don't think the 3080 has a chance OC vs. OC. These new Ampere cards don't overclock well; there's not much left on the table, imo. Not proven yet, but there are already reports of great OC results for RDNA 2.
We're looking at the same chart, right? The one AMD put up there, not the DIY bar chart below my post? I stand by what I said 100%. Nothing was wrong in what I said.

I see (without all the boosts) AMD going wins/ties/wins/wins/wins/loses/loses. I'd also like to note the scale: 20% between tiers, so most of these wins (Doom, GOW5, Hitman) are really negligible. I'd call the cards about equal as well (unless you're overclocking with Rage and using an X570/B550/5000-series setup). But yeah, W/T/W/W/W/L/L, with negligible differences between two of those wins.

RE: Overclocking, indeed, nobody has a lot of headroom these days. That said, if you look at TPU reviews, we are seeing a couple-few percent depending on the model. With how close some of these AMD benchmarks are, that makes those titles a tie or flips them the other way negligibly, just like what we see with the red-only results.
Posted on Reply
#82
Cheeseball
Not a Potato
EarthDogRE: Overclocking, indeed, nobody has a lot of headroom these days. That said, if you look at TPU reviews, we are seeing a couple-few percent depending on the model.
Indeed, these new GPUs (speaking from my experience with the RX 5700 XT and RTX 3080) are just like 3rd-gen Ryzen: keep them cool and they will boost higher. There's no need to increase clocks (at least for me, with a 144 Hz monitor); reducing power draw while keeping stock performance (or even bettering it if the card stays cool) is the way to go now.
Posted on Reply
#84
ODOGG26
CheeseballModern Warfare and BFV have always been a bit more Radeon-biased; CODMW is up to 10% better on the RX 5700 XT compared to the 2070 Super with the same graphical settings (RT off, NVIDIA Reflex off), and BFV has always been shown to be faster on AMD cards (the 5700 XT can beat the 2080 Super at 1080p and 1440p).

SOTR and Doom Eternal seem to be the most even benchmarks of them all, with the Foundation engine in SOTR running on DX12 (although it supports DLSS and RTX) and id Tech 7 on Vulkan.

I'd like to see how well these do in the Quantic Dream engine or in the latest revision of Unreal Engine 4. My prediction is that the 6800 XT is match-for-match with the RTX 3080, and the RTX 3090 will be beaten on account of the waaaaay more affordable price.

The RX 5700 XT didn't really overclock well either (2,000+ MHz only yielded at most 10 extra FPS with most models), but we'll see how the 6800 XT works out.
Agreed. One would think it should be better now that it's full RDNA, but we shall see.
Posted on Reply
#85
Makaveli
There are so many clueless posts in this thread it's hilarious. Thanks for the laughs.
Posted on Reply
#86
Batailleuse
RedelZaVedno1 GByte of GDDR6, 3,500 MHz, 15 Gbps (MT61K256M32JE-14:A TR) costs $7.01 at the moment if you order up to a million pieces. You can negotiate much lower prices if you order more. That's 56 bucks for 8 gigs in the worst-case scenario (around 40 is more realistic). AMD is likely making a nice profit selling you additional VRAM for 80 bucks.
You forget a few things in your pricing:

1. You quote the raw chip price; there is also manufacturing cost to add.
2. Those are B2B prices (no VAT); you have to add VAT for consumers.
3. You forget about commercial margins.

Add, say, a commercial margin plus 20% VAT to your $56 and you're at around $84.

Honestly... big fanboyism towards Nvidia right here. They give slightly better perf and double the RAM for a slightly higher price, not much more than what the extra 8 GB costs.
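To make that arithmetic concrete, here is a rough back-of-the-envelope sketch. The $7.01/GB spot price is from the quoted post; the margin and VAT rates are illustrative assumptions, not known BOM figures:

```python
# Back-of-the-envelope sketch of the VRAM cost argument above.
# The $7.01/GB spot price is from the quoted post; the margin and
# VAT rates below are illustrative assumptions, not known figures.
spot_price_per_gb = 7.01   # USD, B2B price for a 1 GB GDDR6 chip
capacity_gb = 8            # the extra VRAM under discussion
commercial_margin = 0.25   # assumed combined manufacturing/retail margin
vat = 0.20                 # assumed consumer VAT rate

raw = spot_price_per_gb * capacity_gb       # ~$56.08 in raw chips
with_margin = raw * (1 + commercial_margin) # ~$70.10 after margin
consumer = with_margin * (1 + vat)          # ~$84.12 at the till

print(f"Raw chips:   ${raw:.2f}")
print(f"With margin: ${with_margin:.2f}")
print(f"With VAT:    ${consumer:.2f}")
```

Under those assumptions, the consumer-facing cost of the extra 8 GB lands right around the $84 figure, which is the point being made.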
Posted on Reply
#87
Unregistered
MakaveliThere are so many clueless posts in this thread it's hilarious. Thanks for the laughs.
yeah man
#88
Batailleuse
TheTechGuy1337I completely agree with this guy. Refresh rate does not matter as much after 60 FPS. I own 60 Hz, 120 Hz, and 144 Hz monitors, one of which is a laptop with a horrid gray-to-gray response time of 45 ms. All three of them perform similarly; they feel the same as the 60 Hz monitor with vsync on. If you were to turn vsync off, then the difference shows, but that is it: the only time these higher refresh rate monitors come into play is when the game is producing more frames than the monitor can handle. However, with implementations like vsync this becomes less of a deal. In my opinion, response time, gray-to-gray performance, brightness, and color accuracy are hands down the most important aspects of a monitor. I want to be lost in a new world and not reminded that I have to work in the morning.
Each person has a different sensitivity to refresh rates.

60 Hz for me is low, and I clearly feel the difference between my 144 Hz monitor and a 60 Hz one.

However, having done a blind test across 100/120/144/240 Hz, past 120 Hz I can't see the difference anymore.

So for now I'm keeping my 144 Hz monitor. I know I could eventually handle a decent 120 Hz 4K smart TV, but I couldn't go back to 60 Hz.
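That lines up with the frame-time math: each step up in refresh rate shaves fewer milliseconds off the time a frame spends on screen. A minimal sketch, using the refresh rates from the blind test above (the perceptual takeaway is interpretation, not measurement):

```python
# Frame-time math for the refresh rates in the blind test above.
# Shows why each step up in refresh rate saves fewer milliseconds.
rates_hz = [60, 100, 120, 144, 240]

prev_ms = None
for hz in rates_hz:
    frame_ms = 1000.0 / hz  # how long each refresh stays on screen
    if prev_ms is None:
        print(f"{hz:3d} Hz -> {frame_ms:5.2f} ms per frame")
    else:
        print(f"{hz:3d} Hz -> {frame_ms:5.2f} ms per frame "
              f"({prev_ms - frame_ms:.2f} ms less than the previous step)")
    prev_ms = frame_ms
```

Going from 60 to 100 Hz saves 6.67 ms per frame, while 120 to 144 Hz saves only 1.39 ms, so it is not surprising the difference gets hard to see up there.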
Posted on Reply
#89
spnidel
nguyenReally, tell me what single-player games you can play at 144 Hz at 4K Ultra settings with your 5700 XT? CSGO?
lmao who said anything about 4k 144hz wtf
Posted on Reply
#90
Ashtr1x
NeuralNexusNO ONE REALLY CARES ABOUT RAY TRACING... All future games will be optimized for AMD's ray tracing solution anyway, given that the games being developed for this generation will be built to purposefully use the Zen 2 and RDNA 2 architectures as the base specs.
Same BS every time. What happened to the AMD-based Jaguar trash and the pathetic GPUs in the 8th-gen conslow boxes? Nothing but corners being cut and trash tradeoffs.
Posted on Reply
#91
TheoneandonlyMrK
Ashtr1xSame BS every time. What happened to the AMD-based Jaguar trash and the pathetic GPUs in the 8th-gen conslow boxes? Nothing but corners being cut and trash tradeoffs.
Well, like Atari and many others, you could crack on and build your own; I would be surprised if you could beat the performance of the Xbox Series X for its price.
I mean, there are a few reasons for Jaguar and its failure, but arguably it's just that the competition was better. And damn, that was a while ago.
Posted on Reply
#92
InVasMani
I really want to see AMD add 1080p results to these benchmarks so I can see how the RX 6800 performs relative to the RTX 3080 at that resolution. The performance edge the RTX 3080 holds on average over the RX 6800 narrows quite a bit at 1440p, so if that extends even further at 1080p, it's very interesting to look at, given that the RX 6800 is really aiming to compete against the RTX 3070 price-wise. The RTRT performance will be interesting as well, especially if it's relatively close and the RDNA2 architecture tends to perform much more competitively against Ampere at 1440p and below. If the Infinity Cache is playing a role in these results, that's also quite fascinating, and could be a huge perk when going with lower resolutions and higher refresh rates rather than the opposite.

www.amd.com/en/gaming/graphics-gaming-benchmarks
Shatun_BearNvidia cheaping out by using Samsung's poor 8 nm node is really biting them in the backside...

I can't believe that at 1440p the middle RDNA2 card (6800 XT) is faster than the 3090!! And this is without Rage Mode enabled:

videocardz.com/newz/amd-discloses-more-radeon-rx-6900xt-rx-6800xt-and-rx-6800-gaming-benchmarks
So the RX 6800 relative to the RTX 3080 is 10.8% slower at 4K, while at 1440p it's 5.2% slower. I suppose that means at 1080p the RX 6800 should be about even with the RTX 3080, at least in these types of benchmarks; in the case of RTRT it could be different, of course. Still impressive for a card that's actually priced to compete with the much cheaper RTX 3070. It looks like that resolution-dependent difference is more pronounced on the RX 6800 than on the RX 6800 XT and RX 6900 XT; the additional performance gained relative to the RTX 3080 is lower for those higher-up models. The fact that the deficit drops from 10.8% to 5.2%, rather than merely halving to 5.4%, is telling as well. From the looks of things, if the results continue to fall in line at 1080p, the RX 6800 will likely compete quite well against the RTX 3080 at that resolution, all while being much cheaper. A minimal sketch of that extrapolation follows.
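As a sanity check on the reasoning: the two deficits below are the ones quoted above, and the 1080p figure is a naive linear extrapolation for illustration, not AMD data:

```python
# The two deficits are the ones quoted above; the 1080p number is a
# naive linear extrapolation for illustration, not a measured result.
deficit_4k = 10.8     # percent slower than the RTX 3080 at 4K
deficit_1440p = 5.2   # percent slower at 1440p

gain_per_step = deficit_4k - deficit_1440p       # 5.6 points per resolution step
projected_1080p = deficit_1440p - gain_per_step  # -0.4 -> roughly even

print(f"4K:    {deficit_4k:.1f}% slower")
print(f"1440p: {deficit_1440p:.1f}% slower")
side = "faster" if projected_1080p < 0 else "slower"
print(f"1080p: projected {abs(projected_1080p):.1f}% {side} (naive extrapolation)")
```

On that crude straight-line reading, the RX 6800 lands at roughly parity with the RTX 3080 at 1080p, which is what the post above is getting at.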

Posted on Reply
#93
B-Real
1d10tIt's funny, because last year people ignored RTRT and referred to it as "unnecessary", but now it's a major selling point along with DLSS.
Absolutely what I wanted to answer him with... :D Great catch.
Posted on Reply
#94
mtcn77
B-RealAbsolutely what I wanted to answer him with... :D Great catch.
No catch. Nvidia hasn't bucked the trend of beating old cards in the same benchmarks yet. This is a reflection of the mobile mindset: you don't have to move the competition forward there.
I wish I could say ray tracing is a success, but let's be real: unless consoles take it mainstream, it won't be.
Exclusivity is good and all, but it doesn't pay the bills. I have anecdotal reference for just how pitiful the situation is right now. I'm sure you have watched the webcasters giving the same market-insider pitch about how little sponsorships are worth, just a drop in the bucket next to the big bucks. Unless sales move forward, ray tracing consumers are just hunting for Nvidia-sponsored titles. Have they caught their big break yet, or are there really that many of them?
Posted on Reply
#95
Initialised
Back on October 9th you released this: www.techpowerup.com/273150/amd-big-navi-performance-claims-compared-to-tpus-own-benchmark-numbers-of-comparable-gpus



1: Can we please have an update of this chart for the three SKUs, since it appears this one is based on the 6800 (little big Navi)?
2: Do you think this indicates that the lower-TGP/CU units in the consoles do indeed compare to the 2080 Ti?
3: Do the console numbers give an indication of the lower SKUs in the RX 6000 product stack?
4: Do the PS and Xbox SoCs pave the way for the 6000G APUs?
5: Do the PS5 and Xbox get a 5 nm refresh in a couple of years?

CUs  Frequency        Platform
20   1.565 GHz        Xbox Series S (6500? equivalent to 5700)
36   2.23 GHz         PlayStation 5 (6600?)
52   1.825 GHz        Xbox Series X (6700?)
60   1.815/2.105 GHz  RX 6800
72   2.015/2.25 GHz   RX 6800 XT
80   2.015/2.25 GHz   RX 6900 XT
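For a rough sense of how those CU counts and clocks stack up, raw shader throughput can be estimated as CUs x 64 shaders x 2 FLOPs per clock x frequency. A quick sketch, using the higher listed clock for the PC cards (an assumption; the console rows land on their published TFLOPS figures):

```python
# Rough compute throughput from the table above, assuming RDNA2's
# 64 shaders per CU and 2 FLOPs per clock (FMA); the boost clock is
# used where two clocks are listed.
SHADERS_PER_CU = 64
FLOPS_PER_CLOCK = 2

parts = [  # (CUs, clock in GHz, platform)
    (20, 1.565, "Xbox Series S"),
    (36, 2.230, "PlayStation 5"),
    (52, 1.825, "Xbox Series X"),
    (60, 2.105, "RX 6800"),
    (72, 2.250, "RX 6800 XT"),
    (80, 2.250, "RX 6900 XT"),
]

for cus, ghz, name in parts:
    tflops = cus * SHADERS_PER_CU * FLOPS_PER_CLOCK * ghz / 1000
    print(f"{name:13s}: {cus:2d} CUs @ {ghz:.3f} GHz -> {tflops:5.2f} TFLOPS")
```

That puts the Series X at about 12.15 TFLOPS and the PS5 at about 10.28 TFLOPS, matching their known specs, with the RX 6800 around 16.2 TFLOPS at boost.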
Posted on Reply
#96
mtcn77
I knew I read this somewhere. Now I know what mesh shaders do, I guess. It is great: each thread doing a coverage test for a pixel had to be tied to a workgroup. Well, not any more... This frees a big chunk of the pipeline from fixed-unit Z testing. Coupled with the benefit of being cacheable and not ROP-bound, the hardware is now free from small-triangle penalties, and the graphics work can be ported over to compute. Pretty neat.
Posted on Reply
#97
moproblems99
Presuming the benchmarks are accurate, what impresses me most is the DX11 performance. Granted, most of the DX11 games were Frostbite, which has always favored AMD, but they haven't been competitive in DX11 for quite a while.
Posted on Reply
#98
Zach_01
InitialisedBack on October 9th you released this: www.techpowerup.com/273150/amd-big-navi-performance-claims-compared-to-tpus-own-benchmark-numbers-of-comparable-gpus

1: Can we please have an update of this chart for the three SKUs, since it appears this one is based on the 6800 (little big Navi)?
2: Do you think this indicates that the lower-TGP/CU units in the consoles do indeed compare to the 2080 Ti?
3: Do the console numbers give an indication of the lower SKUs in the RX 6000 product stack?
4: Do the PS and Xbox SoCs pave the way for the 6000G APUs?
5: Do the PS5 and Xbox get a 5 nm refresh in a couple of years?

CUs  Frequency        Platform
20   1.565 GHz        Xbox Series S (6500? equivalent to 5700)
36   2.23 GHz         PlayStation 5 (6600?)
52   1.825 GHz        Xbox Series X (6700?)
60   1.815/2.105 GHz  RX 6800
72   2.015/2.25 GHz   RX 6800 XT
80   2.015/2.25 GHz   RX 6900 XT
The TPU estimation was based on the AMD FPS numbers shown at the Zen 3 event; Lisa Su said they were from a 6800 XT, possibly with unfinished tuning and clock settings.
And you can't compare console SoCs to PC CPU/GPU components. A console's combined CPU and GPU package is completely custom, designed around the console's constraints on power draw, heat output, and price.
Posted on Reply
#99
Minus Infinity
A 6800 XT + 5900X will be a nice Xmas present, assuming I can get hold of them.
Posted on Reply
#100
InVasMani
I think the Zen 3 5600X is good value, as is the RX 6800. That said, the Zen 3 5800X could be the optimal gaming CPU: a bit higher base/peak clock than the 5600X plus two additional cores, but a tougher pill to swallow on price if you're on a limited budget. I suppose these days I'd probably opt for that rather than stepping up to an RX 6800 XT; to be fair, personally I'm finding CPU core count and base frequency more and more appealing from an overall-system standpoint. I think I'd get more mileage in the long run, plus the RX 6800 is great value for RDNA2, judging from what I've seen thus far.
Posted on Reply