
50 Games Tested: GeForce RTX 3080 vs. Radeon RX 6800 XT

Well, the Nvidia RTX 3080 10 GB still holds a performance lead over the Radeon RX 6800 XT 16 GB at every resolution... but sadly, the 10 GB of VRAM will become a disadvantage in many new PC games at 4K, so the Radeon with its 16 GB of VRAM stands to gain more in new 4K titles...

But what about the RTX 3080 10 GB at 3K or 5K resolutions, on 34-inch or 49-inch ultrawide monitors?
At the same time, RT is being used more and more. It's pretty much in every AAA game nowadays, and I feel like the lack of RT performance on the 6800 XT is going to hurt it more than the VRAM on the 3080. In fact, the lack of RT performance is already an issue atm.
 
50 games... you're a glutton for punishment. :p And that 2 minutes per game doesn't include the setup time for that game, either.

Well, it's great to see AMD being much more competitive against NVIDIA nowadays having essentially the same performance without RT. Now they just need to improve their RT performance and NVIDIA really will have something to worry about. This competition is great for us consumers as I've always said.

And one day price and availability of graphics cards will sort itself out, sigh.
 
I loved how you made those charts; the chart by game release year was a really nice addition. Great job!
 
Great work @W1zzard. The -6% at the end of 2020 was with an Intel CPU/ecosystem, and that plays a role too; as anyone can check from the previous reviews, the Nvidia 3000 series (and the 3080 specifically, since its lead over the 3070 shrank by a further 2%) lost ground when you switched to the 5800X. When AMD launches the new 3D V-Cache CPUs (the Alder Lake platform will probably also be more mature by then), are you planning to rethink which platform you'll be using? Although Intel held the gaming performance crown for the last 15 years and lost it to AMD last year, they have always had very good value CPUs (entry-level i5s, for example), and AMD sometimes had even better offers. Since last year, though, AMD has nothing to compete with the $157 range (11400F/10400F), and the vast majority of gamers don't buy 5800X systems. So if the performance difference between the platforms ends up very close (AMD will still be a little better at gaming imo, but we will see), it would be odd to keep supporting AMD and this business decision. I know they don't have the capacity, etc., but the recent RX 6600 series pricing points to a new, not exactly pro-consumer way of thinking, since MSRPs show the company's intentions and also send the competition a message to raise their own MSRPs...
Many will say that Intel and Nvidia are equally bad or worse, but I didn't see Intel forgetting the other price brackets in their C2D/Sandy Bridge era. And to those who will say that Nvidia doesn't do this anymore because they have built up other markets to make their margins, I'd say let AMD also be creative and build themselves a place in other markets, because I'd certainly prefer some big corporation to pay $50,000 or whatever for an RTX blade rack than for the plain consumer to fund those profits...
 
What's going on with the GTA V benchmark? Those are some interesting swings.
 
I would expect to see this type of article from Hardware Unboxed and.... you did a better job than them. This is very impressive. Thanks for the information
 
6800 XT - Pixel Rate 288.0 GPixel/s, Texture Rate 648.0 GTexel/s
3080 - Pixel Rate 164.2 GPixel/s, Texture Rate 465.1 GTexel/s
6800 XT vs 3080 - Pixel Rate +75%, Texture Rate +39%
I don't understand why the 6800 XT doesn't outperform the 3080 with such a difference.
There is more to performance than just pixel rates and texture rates. So it's a good question, but with a more complicated answer.
 
What about the best game ever??

Terraria?!? :)
 
Glorious article.

And I won't say told you so... but... I did... History simply repeats.

"When the GeForce RTX 3080 released with 10 GB VRAM I was convinced it would be sufficient for the lifetime of the card. Recently, we have for the first time seen titles reach that amount at the highest settings in 4K, but only with ray tracing enabled. Examples of that are DOOM Eternal and Far Cry 6."

The cards have hardly been out for a year in the wild, with core performance to spare, especially with DLSS, which is the only way to run RT properly.

When a gen releases alongside a new console, you can rest assured demands go up fast. Also, the disconnect between the core's relative performance gains and the VRAM's capacity is only going to show itself more strongly in the near future. Over three generations from Pascal to now, cards have LOST nearly half their VRAM relative to core performance. It's a huge gap, and a bigger cache won't bridge it.

Note that Nvidia is releasing larger capacities too at this tier and below. Multiple versions, as far as we know. I'm still convinced the Samsung node is partly to blame for the weird stack Ampere has become, and/or the limited availability of GDDR6X at the time.

6800 XT - Pixel Rate 288.0 GPixel/s, Texture Rate 648.0 GTexel/s
3080 - Pixel Rate 164.2 GPixel/s, Texture Rate 465.1 GTexel/s
6800 XT vs 3080 - Pixel Rate +75%, Texture Rate +39%
I don't understand why the 6800 XT doesn't outperform the 3080 with such a difference.

Theoretical maximum throughput is not practical, real-life performance. Graphics processing is a pipeline, and the weakest link determines the final performance number: BIOS, driver, game and OS on the software side; CPU, RAM and GPU (and in a tiny subset of cases also storage) on the hardware side.
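To put that "weakest link" idea into a toy model (a minimal sketch with made-up stage names and numbers, not actual GPU data): a big advantage in one stage, like the 6800 XT's pixel rate, barely moves the final frame rate if a different stage is the limiter.

```python
# Toy pipeline model: each stage has an FPS ceiling, and the slowest stage,
# not the sum of the fast ones, caps the final frame rate.
# All numbers below are invented purely for illustration.

def bottleneck(stage_limits_fps: dict) -> tuple:
    """Return (stage name, FPS) of the stage that caps the pipeline."""
    stage = min(stage_limits_fps, key=stage_limits_fps.get)
    return stage, stage_limits_fps[stage]

if __name__ == "__main__":
    # Card A has a much higher raster ceiling, but both cards are shader-limited.
    card_a = {"geometry": 400, "raster": 700, "shading": 160, "memory": 220}
    card_b = {"geometry": 350, "raster": 400, "shading": 165, "memory": 260}

    for name, limits in (("card A", card_a), ("card B", card_b)):
        stage, fps = bottleneck(limits)
        print(f"{name}: capped at ~{fps} FPS by the {stage} stage")
```

In this made-up example, card A's +75% raster ceiling is worth almost nothing, because both cards hit their shading limit first - which is roughly why the paper-spec pixel/texture rates above don't translate into a matching lead.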
 
I like the concise format, it's an all-around good article. I just wish that card reviewers would include, alongside new and benchmark-busting titles, the currently popular games that people are actually playing and might care more about. Many of them will run on pretty much anything without a problem and so don't really need to be included, but some, like CS:GO, are competitive shooters where people are constantly chasing maximum framerate regardless of how easy the game is to run, so buying choices could be influenced by which card performs better even in those potato games.
 
Loved the article. Personally I'd love to see a few things added:
  • A way to switch between single page and paginated (like we do for regular reviews)
  • Lower tier cards compared - e.g. the 6600 vs 3060, 6600XT vs 3060Ti and 6700XT vs 3070. I know the performance might not be the same, but prices (at least MSRP) are comparable.
  • How about including the price of each tested game at the time of publishing? That would help gamers of all budgets look up games they're interested in and can fit into their budget.
 
Great review as usual.
What I liked the most is the performance by game release year, which shows a clear path for AMD and the future.

Sharing the same thought about consoles: them using AMD hardware will clearly help future development improve performance on AMD hardware on PC as well.

I'm using a 3600X at 4.3 GHz paired with the 6800 XT, and at 4K I have nothing but joy. Hopefully AMD will maintain the pace with drivers, and the fck-ups seem to be fewer and fewer every time.
 
Great, I still can't buy both.
 
Great effort there
 
AMD is like fine red wine, gets better with age!
Nvidia is like fine beer, better when freshly brewed but starts to go off and stale with age!

If there ever was an example of this, just look at the venerable GCN architecture! The R9 290X to this day performs similarly to an RX 580 and will run every single game tested here at 1080p! Can you say that about the GTX 780, which was its competition when it was released?!? Both cards are nearly a decade old!!!!
 
@W1zzard the discrepancies between certain titles are very interesting, because they show that a reviewer could very easily bias the results heavily in one card's favor if they had a sample size <=10.
 
AMD is like fine red wine, gets better with age!
Facts. I've been using ATI/AMD GPUs for 20+ years now, going back to the very first Radeon 64 DDR; they always improve with time.
 
Thanks for the benchmarks, I don't even want to think about the amount of work that goes into doing something like this.
As someone who owns and plays with a 3080 and a 6800 XT (at 1440p), I can confirm that on average they perform very similarly, but in individual games I've seen up to a 20-25% difference in framerate, so it would be easy to "improve" the result in either GPU's favor just by selecting the right games.

For me the 6800 XT is actually the GPU I prefer. I have had fewer issues with my monitors with the 6800 XT than with the 3080, and the drivers (so far) have been solid in the games I've played. The 3080 actually had issues with flickering on some drivers, and also had problems with textures not loading correctly in some games with a driver a few months ago. Using two monitors also increases VRAM usage a bit (usually by 200-700 MB).

If someone had asked me, before I tried the 6800 XT for myself, which GPU I would rather buy at MSRP, the 6800 XT or the 3080, my answer would've been the 3080. Today it would actually be the 6800 XT. Sure, having DLSS and better-performing RT is nice, but DLSS below 4K is overrated imo (at 1440p I have always preferred native whenever I compared them), and with RT I only notice reflections looking better; other things such as GI, shadows, etc. only look "different" in most cases. But the biggest issue for me with the 3080 is actually how much hotter my room gets while playing games. I don't know if the Nvidia GPU also makes my 5900X run hotter, but the amount of "extra" heat generated feels like more than it should be. I have a reference 6800 XT (300 W) that I OC slightly and a 3080 Aorus Master (375 W factory OC), and the difference feels more like 150 W than 75 W.
 
@W1zzard I am a big fan of the layout and presentation of these split graphs, this is definitely the way to go for directly comparing two products. That is one seriously comprehensive write up, I appreciate it!!
 
Now I'm tempted to buy Borderlands 3 just to see how buggy and bad it really is when running on DX12, and whether it still justifies using DX11. From the reviews I've seen on other sites, the Radeon beats the GeForce in this game when using DX12, sometimes very narrowly, sometimes significantly (probably due to different testing methodologies). So it would of course help the Radeon's average performance if you switched that game to DX12 :p
 
Sounds like someone games at 1440p.

I for one appreciate the 4K data.

Looking at the recent Steam Hardware Survey, 2.39% of gamers play in 4K, 8.71% in WQHD & 66.50% in FHD.
To me it doesn't make much sense to waste so much testing time for such a minority when you can simply do your own upscaling performance math. ;)
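For example, here is a minimal sketch of that back-of-the-envelope math in Python, assuming a purely GPU-bound game where FPS scales inversely with pixel count (real scaling is usually a bit friendlier than this, so treat the result as a rough floor, not a measurement):

```python
# Rough resolution-scaling estimate: assumes the GPU is the only limit and
# frame time grows linearly with pixel count.

PIXELS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

def estimate_fps(measured_fps: float, measured_res: str, target_res: str) -> float:
    """Scale a measured FPS figure to another resolution by pixel count."""
    return measured_fps * PIXELS[measured_res] / PIXELS[target_res]

if __name__ == "__main__":
    # Hypothetical example: a game measured at 144 FPS in 1440p.
    print(f"~{estimate_fps(144, '1440p', '4K'):.0f} FPS at 4K")  # prints ~64 FPS
```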

@OP: Thanks, great review. Appreciate the massive effort you put into it. :rockout:

But, there are some points I would :love: to see next time:

+ more tested games from the Steam Charts (Top Games)
(on your list I see so many irrelevant "showcase games" that aren't even played anymore)

+ I would prefer benching of mid-range cards (like the AMD 6800 & Nvidia 3070)
(looking at the Steam Hardware Survey, only 1.00% are using a 3080, and the 6800 XT isn't even on the list, I guess because of the bad GPU availability.
Benching of top-range cards might be great clickbait, but for the absolute majority it's just flyover wet dreams)
 
I don't run ray tracing -- anyone running 4K at 120-144 Hz needs at least a 6800 or 3080 to do so. Ray tracing isn't worth the hit in FPS for most people. Virtually anyone running an LG CX (a very popular gaming display) or any of the newer 4K displays is running these cards without RT.
Same here but at 1440p. I vastly prefer 140+ FPS to anything below 100. For example, I play Division 2 a lot (buggy, crashy, freezy, repetitive garbage, yes, perhaps I'm a masochist :p) and generally it runs at 120-180 FPS on my system, but there is one specific place on the DC map where the FPS drops to 70-80 and I HATE it. Had the same FPS drops with the RTX 2080 Ti and RTX 3070 (but with proportionally lower FPS of course), so it's not the RX 6800 XT's fault. It's the devs not optimizing that part of the map sufficiently and every time I go around that spot, I can tell the game is running at shitty 80 FPS again without even looking at the FPS counter.
Same thing with raytracing off vs on in games that support it. Just not worth it. And I don't care if the difference is 160 FPS to 80 FPS (like with the Radeon) or 160 to 100 FPS (like it would be on a Geforce), I'm not giving up those FPS to have better reflections in puddles that I don't even notice.
Also: I tried playing Control with ray tracing on, and yes, the FPS was noticeably worse, but the ultimate reason I turned ray tracing off was that the reflections felt so out of place. Too pronounced. A room with a glossy floor was reflected in that floor, and the reflection felt more visually pronounced than the original. Weird. Reflections don't act like that in real life. Nvidia is trying too hard to push this onto developers and players. Everything must be reflective! The reflections must SHINE! THE REFLECTIONS MUST BURN THE PLAYERS' EYES OUT!!! R T X FOREVER!!!

(the last 4 sentences are obviously exaggerated and jokes and sarcasm etc)
 
Yeah RT is something you have to look for to an unreasonable degree.

There was actually a point after playing Cyberpunk and playing around with RT settings where I walked outside, in real life, after we got some rain and thought "Wow these reflections look really great. Awesome graphics!".

^Among the other red flags there - signs I spent way too much time in CP2077 looking at puddles.
 
Thanks for the benchmarks, I don't even want to think about the amount of work that goes into doing something like this.
As someone who owns and plays with a 3080 and a 6800 XT (at 1440p), I can confirm that on average they perform very similarly, but in individual games I've seen up to a 20-25% difference in framerate, so it would be easy to "improve" the result in either GPU's favor just by selecting the right games.

For me the 6800 XT is actually the GPU I prefer. I have had fewer issues with my monitors with the 6800 XT than with the 3080, and the drivers (so far) have been solid in the games I've played. The 3080 actually had issues with flickering on some drivers, and also had problems with textures not loading correctly in some games with a driver a few months ago. Using two monitors also increases VRAM usage a bit (usually by 200-700 MB).

If someone had asked me, before I tried the 6800 XT for myself, which GPU I would rather buy at MSRP, the 6800 XT or the 3080, my answer would've been the 3080. Today it would actually be the 6800 XT. Sure, having DLSS and better-performing RT is nice, but DLSS below 4K is overrated imo (at 1440p I have always preferred native whenever I compared them), and with RT I only notice reflections looking better; other things such as GI, shadows, etc. only look "different" in most cases. But the biggest issue for me with the 3080 is actually how much hotter my room gets while playing games. I don't know if the Nvidia GPU also makes my 5900X run hotter, but the amount of "extra" heat generated feels like more than it should be. I have a reference 6800 XT (300 W) that I OC slightly and a 3080 Aorus Master (375 W factory OC), and the difference feels more like 150 W than 75 W.

Nice to see a real life review from someone that owns both.
 