
Metro Exodus Benchmark Performance, RTX & DLSS

TPU, please fix your benchmarks. First Anthem and now this... how is Guru3D getting 20 FPS lower at 4K than you are?
It's getting harder and harder to link people to TPU's benchmarks, as they never seem to line up with the other sites. For example, no other review had the Radeon VII losing to the Nvidia cards the way it does here.
 
Man, DLSS is a letdown. Shit is so blurry; that's what I was afraid of. Running good old native resolution with DLSS off. I think Nvidia should just dedicate all the tensor cores to ray tracing to boost performance and let go of the DLSS stuff.

About the whole "you can't use DLSS with RT" debacle: at least you have a choice between accurate lighting and shadows with horri-bad image quality and low fps, or RTX off with DLSS for a decent balance of OK image quality and a high frame rate. Still, DLSS isn't as ready as RTX; RTX alone does the job fantastically. I bet DLSS will be taken a little more seriously by game devs in time, since both technologies need to "learn". Right now the complexity of deep learning tech isn't taken seriously and it gets treated like some gimmick (which it's not when put to good use).

Are you really blaming the developers here? Nvidia is supposed to do the heavy lifting on DLSS. Let's blame Nvidia for wasting die space on DLSS; I would have happily accepted them using all the tensor cores for RTX only, or adding more CUDA cores for a performance increase, without having to deal with DLSS.
 
Damn, ray traced lighting is such a better use of the tech than the reflections in BFV. The game looks great, only issue being that it should have a couple more bounces for interiors as these look darker than they should now.

DLSS is a joke as expected. No idea why tech sites/youtubers keep bringing it up as a "feature" when doing GPU reviews.
 
VRAM is not in play at 1080p. Look at TPU's test results for the 3 GB versus 6 GB model: the 6 GB card gets about 6% better numbers, but the two cards differ by more than VRAM alone, since the 6 GB model also has 11% more shaders. So how do we know it's not the VRAM? If VRAM were a contributing factor at 1080p, it would necessarily be a significantly larger factor, with a bigger performance gap, at 1440p. It's not. There may come a day when more than 3 GB is needed at 1080p, but given those test results, that day has not yet arrived.
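The scaling argument above can be sketched with quick arithmetic. The ~6% performance lead and 11% shader advantage are the figures from the post; the linear-scaling assumption is just a rough sanity check, not a claim about real GPU behavior:

```python
# If the 6 GB card's ~6% lead can be covered by its 11% extra shaders
# alone, shader count explains the whole gap and VRAM need not be a
# factor at 1080p.
perf_gap = 0.06    # observed performance lead of the 6 GB model (from the post)
shader_gap = 0.11  # the 6 GB model has ~11% more shaders (from the post)

# Fraction of the observed gap that shader count alone could explain,
# assuming performance scales at most linearly with shader count.
explained = min(perf_gap / shader_gap, 1.0)
print(f"Shader count can explain {explained:.0%} of the observed gap")
```

Since an 11% shader advantage more than covers a 6% performance gap under that assumption, the numbers are consistent with VRAM not being the bottleneck.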
I'm sorry, but 3 GB is not sufficient even for 1080p with high textures. Metro games were always frugal with VRAM (and looked great despite it). The 1060 3GB is a disgrace of a card, and so is the upcoming 1660 Ti 3GB. People who buy these cards deserve to get burned in newer games.
 
If @RCoon reviewed it, it would be trustworthy. His reviews are never afraid to show negatives or to call a game not good.
I'm at the mercy of Terminals.io

As you can tell from the barren game reviews lately, they've not been giving me much. Early in the year, all the games come out in the same week, forcing me to choose one out of five.
 
Well, that's the thing, and the reason photo editors used IPS screens. Look at a photo of grandma on your screen after she used her "Glamor Shots" gift card and she looks great on the IPS screen; look at the same pic on a TN panel and she looks like an old hooker with overdone lipstick and rouge :) Much akin to popular digital music with exaggerated bass and treble, mixed to sound better on the $15 sound systems in phones, budget MoBos and boom boxes; played on a high-end audiophile system, people are almost running out of the room holding their ears. It's uncomfortable, to say the least.

So it's not just that devs have to adjust how they render images to use ray tracing as a feature; they have to rethink the artificial adjustments they've been making over the years to make scenes look better. Because a scene was not rendering properly with regard to lighting effects, it would be made overly bright. Now that light effects only come through transparent surfaces, much of the light entering a scene is blocked by opaque surfaces. It really shows further down in that shed scene, but I like that the review showed the pimples along with the clear skin. With accurate lighting effects, it's going to matter where and how big your light sources are, and faking it to get the desired lighting levels won't work anymore.
Adjusting to the new lighting model probably isn't that hard. The problem is that everybody still has to support rasterization, so they have to make both look good at the same time. Not impossible, but not as straightforward as it could be.
It's because DLSS is a complete failure. Every single review I've looked at says the same thing: a little boost in performance at the cost of picture quality. I'll take the picture quality myself; then again, I'll take both, which is why I use a Radeon :D
I'm sure you run nothing but SSAA, because you'll take picture quality :wtf:
 
GameGPU also tested the game:
https://gamegpu.com/action-/-fps-/-tps/metro-exodus-test-gpu-cpu




Man, the Russians know how to test a damn game. I've never seen such a detailed test; impressive.
Thanks for the link.
 
DLSS is a joke as expected. No idea why tech sites/youtubers keep bringing it up as a "feature" when doing GPU reviews.

It can be useful... for people who prefer high FPS over some blurriness. There are these kids who boast about 144 Hz and swear they're never going back to "abysmal" 60 Hz. Well, DLSS is for them. And clearly, the monitor industry is catering to them as well, so I can understand the business perspective behind DLSS.

Let's not forget the people who turn everything down just to get 240 Hz on their silly 24" monitors.
 
Nvidia cards don't care whether you use DX11 or DX12 (except for RTX/DLSS). AMD cards need to use DX12.

It can be useful... for people who prefer high FPS over some blurriness. There are these kids who boast about 144 Hz and swear they're never going back to "abysmal" 60 Hz. Well, DLSS is for them. And clearly, the monitor industry is catering to them as well, so I can understand the business perspective behind DLSS.
No it isn't. It's for people who want a higher resolution than their GPU can run. You can see from the FPS graphs that 144 FPS is out of the question even (or especially) with DLSS.
I think upscaling is cancer, but consoles have been doing it for most of this generation, and DLSS compares well with the other methods. The marketing around DLSS is stupid, but the idea is sound.
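The upscaling idea under discussion (render fewer pixels, then reconstruct the full frame) can be illustrated with the crudest possible version, nearest-neighbour upscaling. DLSS and checkerboarding are far more sophisticated reconstructions, but the pixel-budget math that makes them attractive is the same:

```python
def nearest_upscale(img, factor):
    """Upscale a 2D list of pixel values by repeating each pixel.

    This is the naive baseline that checkerboarding and DLSS try to
    beat: shade fewer pixels, then fill in the missing ones.
    """
    out = []
    for row in img:
        # Repeat each pixel horizontally...
        scaled_row = [p for p in row for _ in range(factor)]
        # ...then repeat the whole row vertically.
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# Render at "half res" (2x2), display at "full res" (4x4):
low_res = [[10, 20],
           [30, 40]]
high_res = nearest_upscale(low_res, 2)
print(high_res)
```

Four times the displayed pixels for one times the shading cost is the whole appeal; the quality argument is only about how much smarter than pixel repetition the reconstruction can get.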
 
I think upscaling is cancer, but consoles have been doing it for most of this generation, and DLSS compares well with the other methods. The marketing around DLSS is stupid, but the idea is sound.

Look at what Ars says about it:
To be more precise: at its best, Metro Exodus' DLSS 4K mode looks sharper and cleaner than those PS4 Pro games, while in motion, it's easy to spot once-a-minute grainy artifacts when Nvidia's tech tries to keep up with fast action and motion-blur effects.
https://arstechnica.com/gaming/2019...ayer-game-to-usher-in-the-pc-ray-tracing-era/

At the end of the day, all of computer graphics is based on simulating the real deal and cheating as much as possible, as long as the eye can't easily tell the difference ;)
 
Spider-Man and God of War are the absolute pinnacle of checkerboarding (and PS4 Pro is claimed to have hardware support for it).
 
Spider-Man and God of War are the absolute pinnacle of checkerboarding (and PS4 Pro is claimed to have hardware support for it).
I wouldn't know about that, I don't have a console.
But it's the first review I've come across that talks about DLSS during actual gameplay. Granted, Metro titles have always had awesome graphics, but it seems DLSS can work rather nicely.
Also, I'm not sure who does the actual DLSS training. If it's Nvidia, we can expect some level of consistency between titles. If it's up to individual developers, then we can expect some devs to try to get away with less optimization by forcing DLSS to reduce more detail.
 
Hm! Very much as I expected: completely lackluster DLSS and RTX features, and a fantastic game. Looks like I'm set with rasterization for the coming few years. I see absolutely no advantage in this RTX implementation. Some scenes are way too dark and others look like they lack ambient occlusion. The outdoor scenes are also not great to look at; it feels like Fallout 4, with too much color saturation and everything tinted blue.

One thing is clear: RTX completely screws up the color balance/realism of the scene compared to the RTX-off comparisons, as we could already gather from the previews of indoor locations. It looks like the fake HDR in ReShade, except now you get four times the performance hit. Well played :D If I want the 'extra immersion' of darkened scenes, I'll drop my gamma curve a bit: same effect, 0% perf cost.

And then DLSS... wow. A nice way to lose all definition so you can live with the illusion of playing at a high res. It's a blurfest; I'll take TAA over this junk any day. That is, if you even have the choice and are in the lucky 'Yes' box, for that *exact* game, at that *exact* res, with that *exact* GPU. It'd be hilarious if it weren't such a sad attempt to add value to a terrible GPU generation.
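The gamma trick mentioned above (darkening a scene at zero rendering cost) is just a per-pixel power curve applied in post. A minimal sketch on normalized pixel values, with an illustrative gamma of 2.2:

```python
def apply_gamma(pixels, gamma):
    """Apply a gamma curve to normalized [0, 1] pixel values.

    gamma > 1 darkens midtones, gamma < 1 brightens them. It's a pure
    per-pixel remap (usually a lookup table in practice), so it costs
    essentially nothing at runtime, unlike ray-traced lighting.
    """
    return [p ** gamma for p in pixels]

midtones = [0.25, 0.5, 0.75]
darker = apply_gamma(midtones, 2.2)   # "drop the gamma curve" to darken
print([round(v, 3) for v in darker])  # every midtone ends up darker
```

Blacks (0.0) and whites (1.0) are unchanged by the curve, which is why this reads as "extra contrast/immersion" rather than a simple brightness drop.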
 
day one patch
PC Specific Updates:

  • Tuned HDR saturation
  • RTX Improvements / Bug Fixing
  • Added DLSS Support
  • Additional HUD removal options (coming soon to console) when playing in Ranger Mode
  • Added Motion Blur options (coming soon to console)

Additional PC Fixes Since Review Code Was Sent:
  • Fixed Locking of player input after scene of rescuing Yermak
  • Removed v-sync option from benchmark launcher
  • Tuning and fixes for Atmos audio system
  • Fixed memory corruption in DX12
  • Fixed crash on launching game on old AMD CPUs
  • Fixed crash after changing resolution and Shading Rate to 4K/4x on video cards with 2 GB or less
  • Fixed blurred UI when DLSS is enabled
  • Fixed visual artifacts for RTX high mode
  • Fixed input lock when patching gas mask during combat
  • Fixed V-Sync being forced on after alt-tabbing while the game runs at the monitor's maximum resolution
  • Fixed crash when pressing ALT + Tab during start videos
  • Fixed forcing of V-SYNC mode if the game resolution is different than the desktop
  • DLSS can be applied in the Benchmark
  • Tuned DLSS sharpness to improve image quality
  • Updated learned data for DLSS to improve image quality with DLSS on
 
Fixed blurred UI when DLSS is enabled
That's an important one for DLSS users

Fixed V-Sync being forced on after alt-tabbing while the game runs at the monitor's maximum resolution
Most important one for me, it was so annoying having to restart the game all the time

Tuned DLSS sharpness to improve image quality
Updated learned data for DLSS to improve image quality with DLSS on
Interesting, we'll see how that turns out
 
HairWorks has a huge performance impact on AMD GPUs; with it disabled, they're on par with comparable Nvidia GPUs (1060 vs RX 580).
 
Nvidia is getting some serious flak, even from so-called Nvidia fanboys, about how terrible the picture quality is with that DLSS feature enabled. They go as far as to claim the RTX series is the worst GPU launch in a long time. A game like Metro Exodus must be played with the utmost picture quality. I'm sure my Radeon RX 580 will do the trick well enough. :D

These were also posted on Reddit and forwarded to Nvidia tech support, I believe. lol

Here is more proof that DLSS makes the image look washed out, despite the picture quality settings being maxed out. There is something seriously wrong with this so-called enhancement. Live on YouTube.
DLSS Problems
 

Attachments: DLSS sucks.jpg, DLSS sucks 2.jpg, DLSS sucks 3.jpg
TPU, please fix your benchmarks. First Anthem and now this... how is Guru3D getting 20 FPS lower at 4K than you are?

I guess different areas of the game tax different parts of the GPU differently. I'm just confused that TPU has the 3GB 1060 beating the 580 when Guru3D has it a hair off the 1070.

Because TPU is testing the actual game, while Guru3D is stuck on the demo benchmark. Gamers Nexus just released a video about Metro Exodus and got similar results to TPU.
 
When the RX 580 8GB loses to the 1060 3GB, you know who really sponsored this game. They nerfed the game so much that the red team got really f*^%&d up :) Nvidia at its best.
 
When the RX 580 8GB loses to the 1060 3GB, you know who really sponsored this game. They nerfed the game so much that the red team got really f*^%&d up :) Nvidia at its best.

Hahaha, nice joke. I didn't see you complaining when the RX 570 4GB beat the GTX 1060 6GB in Resident Evil 2 (an AMD-sponsored game heavily gimped in AMD's favor). They nerfed that game so much that the green team really got f*^%&d up. But I didn't see any Nvidia fans crying back then; whenever Nvidia does this, AMD fanboys start a rebellion with their pitchforks.
 
Hahaha, nice joke. I didn't see you complaining when the RX 570 4GB beat the GTX 1060 6GB in Resident Evil 2 (an AMD-sponsored game heavily gimped in AMD's favor). They nerfed that game so much that the green team really got f*^%&d up. But I didn't see any Nvidia fans crying back then; whenever Nvidia does this, AMD fanboys start a rebellion with their pitchforks.

Fan logic from AMD side, what else do you expect?

Also, I wonder what the fans are gonna say when AMD, and maybe Intel, start implementing similar AI-accelerated AA through DirectML. Will there be a similar level of crying from the fanboys then?

Source:

https://www.guru3d.com/news-story/a...tive-with-radeon-vii-though-directml-api.html
https://wccftech.com/amd-radeon-vii-excellent-result-directml/
 
Critics' reviews are meaningless; only player reviews count, since you usually can't bribe players. Same for Rotten Tomatoes.

So, uh, how valuable are those user scores right now?
 
I'm sorry, but 3 GB is not sufficient even for 1080p with high textures. Metro games were always frugal with VRAM (and looked great despite it). The 1060 3GB is a disgrace of a card, and so is the upcoming 1660 Ti 3GB. People who buy these cards deserve to get burned in newer games.

Huh? From what I've seen, the GTX 1060 3GB is able to run most games at very good settings at 1080p, including Metro Exodus. The upcoming 1660 Ti 3GB should be even better at 1080p. Those cards fit a specific price point and customer. Frankly, I would rather have a faster GPU with less RAM than a slower GPU with more RAM than it can make use of (hello, AMD 570 8GB). Now, I would argue the GTX 1060 3GB is overpriced and needs to sit between the AMD 570 and AMD 580 4GB in price, but I would also say most of Nvidia's cards are overpriced relative to AMD's offerings.

Obviously the GTX 1060 6GB and GTX 1660 Ti 6GB are/will be better cards than their 3GB namesakes, but people won't always want to spend that money if they don't have the need or desire for that hardware.
 
Huh? From what I've seen, the GTX 1060 3GB is able to run most games at very good settings at 1080p, including Metro Exodus. The upcoming 1660 Ti 3GB should be even better at 1080p. Those cards fit a specific price point and customer. Frankly, I would rather have a faster GPU with less RAM than a slower GPU with more RAM than it can make use of (hello, AMD 570 8GB). Now, I would argue the GTX 1060 3GB is overpriced and needs to sit between the AMD 570 and AMD 580 4GB in price, but I would also say most of Nvidia's cards are overpriced relative to AMD's offerings.

Obviously the GTX 1060 6GB and GTX 1660 Ti 6GB are/will be better cards than their 3GB namesakes, but people won't always want to spend that money if they don't have the need or desire for that hardware.
I wonder if anyone remembers the days when less memory meant lower-latency memory and could result in faster cards ;)
Also, when you buy a $150 video card and expect to max out the latest AAA titles, the problem is not the video card.
 