
50 Games Tested: GeForce RTX 3080 vs. Radeon RX 6800 XT

Why is there no 1080p test at medium settings with several more cards?

Because an HD 7970 from 2012, about €120 today, performs better than a 1650 Ti that currently goes for $400.
 
Why is there no 1080p test at medium settings with several more cards?

Because an HD 7970 from 2012, about €120 today, performs better than a 1650 Ti that currently goes for $400.
Probably because this wasn't a full GPU round-up and review. The title specifically names two GPUs; why would you expect more?
 
Just realized that I kinda contradicted myself in my last post by saying "better reflections in puddles that I don't even notice" and then ranting about Control having too pronounced reflections. But that is basically my experience with RT atm: either I barely notice it (like Doom Eternal - too much action and demon killing to notice reflections) or it just screams "HEY I'M A REFLECTION LOOK AT ME" like in Control.
 
Yeah, for sure. I bought a Sapphire 7970 6GB on eBay for a friend, and it beat everything up to an RX 580 8GB.

In Hitman (2016) at 1080p:
7970 6GB = 84 FPS
RX 580 8GB = 89 FPS
 
Looking at the recent Steam Hardware Survey, 2.39% of gamers play in 4K, 8.71% in WQHD, and 66.50% in FHD.
To me it doesn't make much sense to waste so much testing time on such a minority when you can simply do your own upscaling performance math.
You have it backwards:
You can use high resolutions to find out what the GPU is really capable of, and scale that down to get a rough idea of what it could do at a lower resolution, given enough CPU power. The other way around, the GPU is often no longer the limiting factor, so the results do not scale up to higher resolutions.

And just to troll around a bit, I'll now fire up CP2077 at DLSS UHD with RT, because that's what an RTX 2080 Ti can still do faster than any AMD card.
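The scaling argument above can be put into a toy model (my own assumptions, not anything from the review): while GPU-bound, FPS scales roughly with the inverse of the pixel count, but a CPU limit caps the result. That is why extrapolating down from a 4K measurement works better than extrapolating up from 1080p.

```python
# Toy model (own assumptions): estimate FPS at a target resolution from a
# GPU-bound 4K measurement. FPS scales inversely with pixel count until a
# hypothetical CPU limit caps it.
UHD_PIXELS = 3840 * 2160

def estimate_fps(fps_4k, target_w, target_h, cpu_limit_fps):
    gpu_bound = fps_4k * UHD_PIXELS / (target_w * target_h)
    return min(gpu_bound, cpu_limit_fps)

# A card measured at 60 FPS in 4K, with an assumed 160 FPS CPU limit:
print(estimate_fps(60, 2560, 1440, 160))  # GPU-bound at 1440p: 135.0
print(estimate_fps(60, 1920, 1080, 160))  # CPU-bound at 1080p: 160
```

Going the other way, a 1080p number sitting at the CPU cap (160 here) tells you nothing about how much faster the GPU could have gone, which is the point above.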
 
Man, when I see these articles, I think about what it would take to get one of these cards. It's not just the card; my whole system needs a makeover. I want one, but I'm okay with playing 10-year-old titles until I can afford a full rebuild. I can tell my system's days are numbered, though: the 4790K and GTX 970 are getting pushed to their limits. Regardless, this article is helpful for thinking about a full team-red rework now.
 
Excellent format. I like it.

This sort of thing would probably be a two-hour-long YouTube video, lol. But keeping it a short, graph-heavy text presentation is really the best. The numbers speak for themselves, and you've chosen a great way to collate the data!
 
Looking at the recent Steam Hardware Survey, 2.39% of gamers play in 4K, 8.71% in WQHD, and 66.50% in FHD.
To me it doesn't make much sense to waste so much testing time on such a minority when you can simply do your own upscaling performance math. ;)

@OP: Thanks, great review. Appreciate the massive effort you put into it. :rockout:

But, there are some points I would :love: to see next time:

+ more tested games from the Steam Charts (Top Games)
(on your list I see so many irrelevant "showcase games" that aren't even played anymore)

+ I would prefer benching of mid-range cards (like the AMD 6800 & Nvidia 3070)
(looking at the Steam Hardware Survey, only 1.00% are using a 3080, and the 6800 XT isn't even on the list, I guess because of the bad GPU availability.
Benching top-range cards might be great clickbait, but for the absolute majority it's just flyover wet dreams)
Consoles target 4K. The best displays (OLED HDR TVs) are 4K 120.

It's weird to see PC gamers with high-end hardware opt for lower-than-console resolutions on FAR inferior LCD displays.
 
You sound like those guys back in the 2010s looking at Crysis with tessellation.

We all know how it turns out.

I think you don't remember the shitshow that tessellation was at the time, and how it was abused by NVIDIA to exploit a hardware advantage over ATi: it provided no benefit to the user and was in fact detrimental to performance despite making no visual difference. (Also, tessellation is a lot older than Crysis, and was actually implemented in hardware first by ATi, all the way back in the Radeon 9000 series.)

Tessellation gets used everywhere excessively?

Bad analogy methinks...

Remember the sub-pixel-sized triangles on concrete barriers and tessellated oceans rendered under the map that you couldn't even see?

Yes, that's what's happening. It's used in pretty much every game and is so normal that there is no longer any option to even tune it.

AMD actually enforces a limit on tessellation factor at the driver level as a result of the BS mentioned in my first sentence above. It's still there today (and you can adjust it.)
 
What great work you did there, @W1zzard! Thanks for the effort! My only objection is the inclusion of six games that use UE4. Also, I would omit the previous Far Cry and F1 games. Division 2 is a great, well-optimised game that could be included. But that's just me.
 
You have it backwards:
You can use high resolutions to find out what the GPU is really capable of, and scale that down to get a rough idea of what it could do at a lower resolution, given enough CPU power. The other way around, the GPU is often no longer the limiting factor, so the results do not scale up to higher resolutions.

And just to troll around a bit, I'll now fire up CP2077 at DLSS UHD with RT, because that's what an RTX 2080 Ti can still do faster than any AMD card.

In theory, yes. But looking at GPU round-up benchmark charts at 4K, there is mostly just a few FPS difference between the cards, which makes performance guessing for lower resolutions pretty imprecise. 4K is just too demanding for current GPUs to show any major performance differences (when comparing apples to apples).

Not to mention that a few FPS difference will most likely not swing any buyer decisions. :)

Consoles target 4K. The best displays (OLED HDR TVs) are 4K 120.

It's weird to see PC gamers with high-end hardware opt for lower-than-console resolutions on FAR inferior LCD displays.

Console & PC gamers are a different breed. Console cowboys are fine with sluggish gamepad controls & a non-constant 60 FPS. On PC, gamers who mostly play RTS etc. are also fine with 4K 60 FPS. But gamers who mostly play fast-paced shooters will opt for 120+ FPS at FHD or WQHD over 4K any day.

Also, it's unclear what you mean by "FAR inferior LCD displays". The LCD panels are mostly on the same quality level; a 4K display just has more pixels (sharper image), while an FHD/WQHD gaming display has higher refresh rates (super fluid image). It just comes down to what fits your needs more. ;)
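For reference, the pixel-count arithmetic behind that sharpness-vs-refresh-rate tradeoff (my own numbers, nothing from the survey):

```python
# Pixel counts of the common resolutions, relative to FHD.
FHD, WQHD, UHD = (1920, 1080), (2560, 1440), (3840, 2160)
fhd_pixels = FHD[0] * FHD[1]
for name, (w, h) in [("FHD", FHD), ("WQHD", WQHD), ("UHD", UHD)]:
    print(f"{name}: {w * h:,} pixels ({w * h / fhd_pixels:.2f}x FHD)")
```

So "just some more pixels" is 1.78x for WQHD and a full 4.00x for UHD, which is roughly how much extra shading work the GPU does per frame.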
 
Hmm, maybe this is a biased site, only Nvidia and AMD discrete GPUs tested in 50 games?! :p
 
C'mon man, let the Intels launch first.
 
I would prefer benching of mid-range cards (like the AMD 6800 & Nvidia 3070)
Looking at the Steam Hardware Survey, the real cards that should be benched are the 1060 (7 percent) and the 1050.
I don't think we want more benchmarks of these cards...
 
When the review is more exciting than actually playing the games I think you might have a problem or new found hobby!
 
But what about the Nvidia RTX 3080 10GB at 3K or 5K resolution in PC games, on 34-inch or 49-inch ultrawide LCD screens?
RTX 3080 on a 3440x1440 34" 144 Hz display, and yet to have any VRAM-related concerns whatsoever; plus I'm not opposed to turning the texture setting down if and when required, given the virtually indistinguishable difference it usually makes. I've been tweaking and optimizing games around my tastes and hardware for decades; this is no different.

By the time VRAM is potentially going to be a genuine day-to-day concern, the world will have moved on to newer generations, not to mention things like DirectStorage/RTX IO, so it'll all fade into relative obscurity while people talk about the graphics cards and games du jour, IMO. It's certainly not something that bothers me in the slightest, despite the FUD that gets spread and the purchase justification of alternatives. Nonetheless, I'm sure it'll still get brought up over and over, as if it were some sort of gotcha from some, and an objective warning to a small subset of potential buyers from others.

@W1zzard, I'd like to see the games on the list that feature RT effects (13 of 50 titles?) tested also with RT on, especially if the game puts RT on by default which I know can be the case. Not accusing you of a biased article in the slightest here, but it's a part of the picture many people don't want to ignore. You mention it in your conclusion which is good, but I'd rather see two sets of results, the overall resolution averages with RT off, and with RT on. RT performance is an appreciable part of this equation that merits being tested.

Aside from that, I guess I'm not entirely surprised that when tested with a different base system, different configuration options, and a different list of games, the margins between the cards are different, it almost seems self-evident. I think there is merit in benching as apples to apples as possible with both, same old test system, same games (preferably even the same build, or both original and current) as originally tested, same drivers used in the first suite where they were both included vs today's current drivers, to see where each card has actually improved with drivers. I am certain that would be fascinating to many.

Edited for typos.
 
Console & PC gamers are a different breed. Console cowboys are fine with sluggish gamepad controls & a non-constant 60 FPS. On PC, gamers who mostly play RTS etc. are also fine with 4K 60 FPS. But gamers who mostly play fast-paced shooters will opt for 120+ FPS at FHD or WQHD over 4K any day.

Also, it's unclear what you mean by "FAR inferior LCD displays". The LCD panels are mostly on the same quality level; a 4K display just has more pixels (sharper image), while an FHD/WQHD gaming display has higher refresh rates (super fluid image). It just comes down to what fits your needs more. ;)
Strong disagree. We go for both better graphics AND better controls and FPS, not one or the other. If you have a 6900 XT or 3090, you can do 4K 120 at High settings (if not Ultra) in most modern games. Yes, a few will be the exception, but that is irrelevant; that is ALWAYS the case historically, and some games can't even do 1440/120, so the point STILL stands.

People who play fast-paced games will just accept using higher-than-console settings (High) instead of max. Maybe with Ultra textures, though.

And no, LCD panels, with their inability to display black and their sluggishness, cannot equal OLED displays. The difference in quality is stark, and I for one refuse to pay more than 450 EUR for an LCD panel, no matter its supposed features, since it's by default low-end technology these days.
 
Nice article. I love graphs and data :D Too bad Horizon Zero Dawn is not in the mix. Or Godfall, which is fairly new.
 
Of all the esports games, they've included only DOTA 2. I wonder what the average FPS at 2160p is.

With my PC (i7-10700K + RX 6800 XT) I get about 220 FPS on average at max settings and 2160p; the FPS cap is removed.
 
Consoles target 4K. The best displays (OLED HDR TVs) are 4K 120.

It's weird to see PC gamers with high-end hardware opt for lower-than-console resolutions on FAR inferior LCD displays.

There is more to a panel than the display tech inside it. OLED is superior on that front, but not when it comes to price (it's still triple what you pay for a decent LCD monitor), not on display diagonal, not on the availability of various form factors like UW, and not on continuous display of static content.

Or put differently, there are many unique selling points still attached to LCD monitors.

You could also say that the keyboard is inferior, haven't we got touch now? But we know better. Different use cases for different target markets. It's good that they blend towards each other, as OLED now does... but does it really, when you are sitting in front of a 40-48 inch TV at arm's length at your desk? For productivity, it's shit. Ergonomics suffer. Etc.

The strength of the PC, and of gaming on it, is the immense flexibility. The PC gamer you see on YouTube with RGB spewing out of every hole is just TV for kids. Marketing. In the real world, tons of people play wildly different games on everything from rigs made of cardboard to complete battlestations.
 
but does it really, when you are sitting in front of a 40-48 inch TV at arm's length at your desk?
The angle that the screen subtends at your eye matters. That's why phone screens are popular even though they're barely 5-6".

At arm's length (say, 3 feet), a 40" monitor/TV is going to appear pretty large. I've got a 22" monitor and it covers about 60% of my field of view at that distance.
A 40" monitor will just make me look up and down each time I want to look at something. I've not used one, but that's what I think.
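The subtended-angle point is easy to put numbers on (a quick sketch with my own assumed 16:9 geometry and a 36-inch viewing distance):

```python
import math

# Horizontal visual angle of a 16:9 screen of a given diagonal (inches)
# viewed from a given distance (inches).
def visual_angle_deg(diagonal, distance):
    width = diagonal * 16 / math.hypot(16, 9)  # 16:9 width from the diagonal
    return math.degrees(2 * math.atan(width / (2 * distance)))

print(round(visual_angle_deg(22, 36), 1))  # 22" at ~3 ft: 29.8 degrees
print(round(visual_angle_deg(40, 36), 1))  # 40" at the same distance: 51.7
```

At the same distance the 40" screen subtends roughly 1.7x the horizontal angle of the 22", so eye movement alone stops being enough and you start turning your head.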
 
The angle that the screen puts in your eye matters. That's why phone screens are popular even though they're barely 5-6".

At arm's length(say, 3 feet) a 40" monitor/TV is going to appear pretty large. I've got a 22" monitor and it covers about 60% of my range of sight at that distance.
A 40" monitor will just cause me to look up and down each time I want to look at something. I've not used one, but that's what I think.

I've sat in front of a 32-inch HDTV back when those things did 720p. Pixel density was shit, of course, but even at that diagonal you're already struggling a little bit with 16:9 at a desktop viewing distance. A typical desk is about 80 cm deep, so there are limits.

I'm using a 34-inch ultrawide now, and the situation is a little different. For productivity you essentially get a 2x 1440p experience, since you can put two full-size windows side by side and direct your attention to either half; and for gaming you get accustomed to having more of the picture out of focus, which is really quite immersive and works well with a curve. But a 34-inch 21:9 UW is the absolute max I'd go; any bigger and I'd want to sit further back, and that would make the curve a problem. I've already placed this screen further away than I did a 24-inch 1080p. Even so, I do have to turn my head frequently when checking UI elements in games. For that purpose it's alright, because the focus is still in the center, but it is already slightly more of a strain than having everything 'in focus'.

And yeah... too much height is actually more of a killer. Looking up or down, neh. Not pleasant.
 
And yeah... too much height is actually more of a killer. Looking up or down, neh. Not pleasant.
True. For more than 27" I'd say go ultrawide. But then I've never actually used more than a 27".

When the review is more exciting than actually playing the games I think you might have a problem or new found hobby!
Hobby? For me it's become a vice!
 
There is more to a panel than the display tech inside it. OLED is superior on that front, but not when it comes to price (it's still triple what you pay for a decent LCD monitor), not on display diagonal, not on the availability of various form factors like UW, and not on continuous display of static content.

Or put differently, there are many unique selling points still attached to LCD monitors.

You could also say that the keyboard is inferior, haven't we got touch now? But we know better. Different use cases for different target markets. It's good that they blend towards each other, as OLED now does... but does it really, when you are sitting in front of a 40-48 inch TV at arm's length at your desk? For productivity, it's shit. Ergonomics suffer. Etc.

The strength of the PC, and of gaming on it, is the immense flexibility. The PC gamer you see on YouTube with RGB spewing out of every hole is just TV for kids. Marketing. In the real world, tons of people play wildly different games on everything from rigs made of cardboard to complete battlestations.
Thing is, for high-end users who are looking at 1000+ USD GPUs, I do not consider the price of these TVs and displays to be an issue. If you are ready to spend 1800 EUR on a 6900 XT or 2400 on a 3090, then an OLED TV is chump change for you. I agree LCDs make sense for lower-end users, though.

No idea what display diagonal is supposed to be. Fair on the form factors, and fair on static content (but it's an overrated issue nowadays, and yes, exceptions exist, but we don't consider the exceptions when making general statements); the sheer brutal advantage in gaming is non-negotiable, though.

Going from a high-end 165 Hz Asus (whatever it was) 1440p LCD monitor to a 4K OLED TV was a bigger upgrade for me than going from a 5700 XT to a 6900 XT. The biggest visual upgrade I've done in years. A godlike advantage over LCDs. Hell, it made all the old games I play look amazing (modded DOOM 3 on OLED... awesome). That is why I feel so strongly here; it's like my eyes have been opened.

Keyboards and touch displays are not as comparable. But for the final part, you are correct. PC gaming is great due to its freedom. Its modding, emulation, back compat - freedom.
 
Thing is, for high-end users who are looking at 1000+ USD GPUs, I do not consider the price of these TVs and displays to be an issue. If you are ready to spend 1800 EUR on a 6900 XT or 2400 on a 3090, then an OLED TV is chump change for you. I agree LCDs make sense for lower-end users, though.

No idea what display diagonal is supposed to be. Fair on the form factors, and fair on static content (but it's an overrated issue nowadays, and yes, exceptions exist, but we don't consider the exceptions when making general statements); the sheer brutal advantage in gaming is non-negotiable, though.

Going from a high-end 165 Hz Asus (whatever it was) 1440p LCD monitor to a 4K OLED TV was a bigger upgrade for me than going from a 5700 XT to a 6900 XT. The biggest visual upgrade I've done in years. A godlike advantage over LCDs. Hell, it made all the old games I play look amazing (modded DOOM 3 on OLED... awesome). That is why I feel so strongly here; it's like my eyes have been opened.

Keyboards and touch displays are not as comparable. But for the final part, you are correct. PC gaming is great due to its freedom. Its modding, emulation, back compat - freedom.

I'd be wary of the elitism in PC gaming, because it's mostly an influencer thing. It doesn't exist, except in the minds of the lucky few who manage to get their hands on overpriced hardware. And you could question the point of that entirely, because all the hardware plays games just fine, really. And the games don't always get better for it either. Too many gaming discussions are focused on FPS counters and other stuff that is really only about presentation.

So is it really important to spend over 1K on a screen? I beg to differ. The technology progresses rapidly; spending big now is simply early adopting, and you know you're overpaying. LG OLED has come down in price, yes. But it's still triple the price of an LCD that also offers a very immersive gaming experience. And if you go VA, black isn't an issue you will notice either.

The overall price tag for a gaming rig has soared to great heights lately with all the new things you 'can' do. But it's a mistake to consider them necessary or somehow universally 'better'. There are and were good reasons for display differences between segments like TV and PC, and they aren't gone.

As for not considering exceptions... oh? I do, actually. The exceptions MAKE the rule. It's very common for PCs to display static content, and you only need one exceptional situation to reduce the lifespan of your TV.

Display diagonal = size. 48 inch. 40 inch. Etc.
 