
Metro Exodus Benchmark Performance, RTX & DLSS

Don't trust their performance numbers, they're using the built-in benchmark.
Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.

While that might be true, not using it would make it impossible to compare TPU's benchmarks with another site's. Meanwhile, the sites that use the built-in benchmark can all be compared with each other. Peer review.
 
So a 2070 with raytracing on High is pretty much a 2080 with raytracing on Ultra? $300 doesn't buy what it used to... Happy with my 2070.
 
Sorry, but 60 fps shouldn't be the limiting factor. 60 fps should be the minimum. People actually have 1440p monitors that go up to 160 Hz.
Indeed. But two things...

1. 60 FPS is what the majority deem 'playable'.
2. Maybe it shouldn't... but if you look at the table (maybe it was in another thread here), it looks like NVIDIA attacked first where it NEEDED a performance boost. If you look at the link below, you will see that at 2560x1440 the 2080 Ti with RT on Ultra is at almost 75 FPS. More is always better, but at least to me it seems fairly obvious they put the effort where it was explicitly necessary first. I would expect to see more resolutions added as time allows the AI to 'train'.
https://www.techpowerup.com/forums/...rformance-rtx-dlss.252502/page-3#post-3993996
 
Don't mind the performance differences, they're expected.

I just want to praise how beautiful RTX is and how well TPU put its beauty into words.
 
Did you look at VRAM allocation? It's a bit over 3 GB at 1440p.
It's not just about VRAM, though. The 580 is a faster card than the 3 GB 1060 in most things.

What is the lower limit of your FreeSync monitor? Most FreeSync monitors don't start working until 40 fps or above.
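For reference, here's a rough sketch of what low framerate compensation (LFC) does when you dip below that floor. This is the generic idea, not AMD's actual driver logic, and the 40 fps floor is just the number from above:

```python
# Rough sketch of Low Framerate Compensation (LFC), the trick that keeps VRR
# useful below the monitor's minimum refresh: the driver repeats each frame
# until the effective refresh lands back inside the supported range.
# Generic illustration only -- not any vendor's real implementation.

def effective_refresh(fps, vrr_min=40, vrr_max=144):
    """Return (panel refresh rate, times each frame is shown)."""
    if fps >= vrr_min:
        return fps, 1                    # inside the VRR window: tracks fps 1:1
    if vrr_max < 2 * vrr_min:
        return vrr_min, 1                # range too narrow for LFC: VRR drops out
    multiplier = 2
    while fps * multiplier < vrr_min:    # repeat frames until back in range
        multiplier += 1
    return fps * multiplier, multiplier

for fps in (144, 60, 35, 20):
    hz, times = effective_refresh(fps)
    print(f"{fps:>3} fps -> panel at {hz} Hz (each frame shown x{times})")
```

So a 40-144 Hz panel can still sync a 35 fps dip by running at 70 Hz; monitors with a narrower range genuinely stop syncing below their floor.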

I bought this one.

Oh god, please don't tell me it's bad. But seriously, please don't. It'll trigger my anxiety T_T ~ It's the only 144 Hz monitor in my budget; there was literally nothing else.
 
RTX lighting looks great, not gonna buy a 2080 Ti to have it though.
 
Not sure about using a 4K monitor just to get a blurrier image...
I think you are better off with a 1440p or maybe even a 1080p monitor in that case.
Those comparisons are misleading. Yes, screenshots will look worse under a magnifying glass, but that's not the point. The point is: do you notice those differences during normal gameplay? I have seen no review that talks about that.
 
To be honest, RTX doesn't look THAT impressive. When I look at the second picture with the pipe and see how bright the sky is, I'm having Far Cry 1 HDR flashbacks, and some of the other shots look too dark. I don't know how it looks in motion, but it seems very hard to loot stuff under that lighting.
 
This game is not out yet! Grrrrrrrrr! Mmmmmmm! :D

When will it be installable from Epic?
 
Critics' reviews are meaningless.
If @RCoon reviewed it, it would be trustworthy. His reviews are never afraid to point out negatives or to call a game bad.
 
GameGPU also tested the game:
https://gamegpu.com/action-/-fps-/-tps/metro-exodus-test-gpu-cpu

 
What do you expect from an Nvidia-sponsored title?? Poor performance on Radeons.

I didn't see you or anyone else saying the same thing when AMD-sponsored titles are tested and give poor performance on Nvidia, e.g. the Resident Evil 2 remake. Hypocrisy?

Yeah, it seems fishy here honestly. I suspect 4A may have been drinking some green poison when they implemented RTX...

Yeah, same with the Resident Evil 2 remake. Capcom drank some red poison while making it.
 
Seems to be a nice game, but poorly marketed. Its absence from Steam and the Denuvo DRM make it even harder to say whether the game is worth paying for right now.
RTX is something different for a change; the tech has some potential, especially once devs get some experience with it.
However, the image difference is a far cry from what I'd call worth the premium they ask for these cards - and I'm shocked by the number of idiots blinded by ngreedia's marketing and advertising, swearing they'll grab a new RTX card right NOW, just after seeing the Exodus screenshots.
Especially considering that only the 2080/Ti makes some sense for 1080p/1440p - 2060/2070 RTX performance is not impressive, and RTX at 4K is DOA, that's for sure.

P.S. DLSS is a complete joke! Greedy Huang would have done better to invest those transistors in RT cores, to hit at least an average of 60 fps at 4K - but he chose to be the dog on the bridge that wants both bones...
 
The 3 GB 1060 beating the 8 GB 580 :(

VRAM is not in play at 1080p... look at TPU's test results for the 3 GB versus 6 GB model. The 6 GB card gets about 6% better numbers, but we also see that the VRAM is not the reason: the 6 GB model differs by more than just the VRAM, it also has 11% more shaders. So how do we know it's not the VRAM? If it were a contributing factor at 1080p, then it would necessarily be a significantly larger factor, with a larger performance difference, at 1440p. It's not. There may come a day when more VRAM is needed at 1080p but, given those test results, that day has not yet arrived.
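A quick back-of-the-envelope check of that argument (the core counts are the public GTX 1060 specs; the ~6% gap is the figure quoted above):

```python
# Sanity-checking the shader-count explanation for the 3 GB vs 6 GB 1060 gap.
shaders_3gb = 1152   # CUDA cores, GTX 1060 3 GB
shaders_6gb = 1280   # CUDA cores, GTX 1060 6 GB

shader_advantage = shaders_6gb / shaders_3gb - 1
observed_gap = 0.06  # ~6% better numbers for the 6 GB card in TPU's tests

print(f"Shader advantage of the 6 GB model: {shader_advantage:.1%}")  # ~11.1%
print(f"Observed performance gap:           {observed_gap:.0%}")

# Games rarely scale 1:1 with shader count, so an ~11% shader advantage
# producing a ~6% real-world gap is plausible on its own -- no need to
# invoke the 3 GB frame buffer at 1080p.
print(f"Implied scaling efficiency: {observed_gap / shader_advantage:.0%}")  # ~54%
```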


This is going to spark quite a few discussions. DXR means light is more accurately rendered. Some will say the brighter images look better, some won't. It's just like some prefer Canon's more saturated colors, even if Nikon has more lifelike defaults.

Well, that's the thing, and the reason why photo editors used IPS screens. Look at a photo of grandma on your screen after she used her "Glamor Shots" gift card and she looks great on the IPS screen... look at the same pic on a TN panel and she looks like an old hooker with overdone lipstick and rouge :) Much akin to popular digital music with exaggerated bass and treble to make it sound better on the $15 sound systems in phones, budget MoBos, and boom boxes... played on a high-end audiophile system, people are almost running out of the room holding their ears. It's "uncomfortable", to say the least.

So it's not just that devs have to adjust how they render images to use ray tracing as a feature; they have to rethink the artificial adjustments they have been using over the years to make scenes look better. Because a scene was not rendering properly with regard to lighting effects, it would be made overly bright. Now that light effects only pass through transparent surfaces, much of the light coming into that scene is blocked by opaque surfaces. It really shows further down in the shed scene... but I like what the review did there, showing the pimples along with the clear skin. With accurate lighting effects it's going to matter where and how big your light sources are, and faking it to get the desired lighting levels won't work anymore.
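To make the "light blocked by opaque surfaces" point concrete, here's a toy shadow-ray test -- hypothetical illustration code, not anything from 4A's engine:

```python
import math

# Toy shadow-ray test. The point it illustrates: with ray tracing, a surface
# point is lit only if the straight path to the light is clear of opaque
# geometry -- brightness can no longer just be painted in by the artist.

def blocks_light(origin, direction, center, radius, max_t):
    """True if the sphere (center, radius) blocks the ray before distance max_t."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c               # direction is unit length, so a == 1
    if disc < 0:
        return False                     # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0     # nearest intersection along the ray
    return 1e-4 < t < max_t              # hit must lie between point and light

def is_lit(point, light_pos, opaque_spheres):
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    return not any(blocks_light(point, direction, c, r, dist)
                   for c, r in opaque_spheres)

light = (0.0, 10.0, 0.0)
wall = [((0.0, 5.0, 0.0), 2.0)]               # one opaque sphere in the way
print(is_lit((0.0, 0.0, 0.0), light, wall))   # False -- in shadow behind it
print(is_lit((5.0, 0.0, 0.0), light, wall))   # True  -- clear path to light
```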


Don't trust their performance numbers, they're using the built-in benchmark.
Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.

An in-game benchmark / demo is like giving your students the same test when you know last year's answers are in circulation. Regardless of what hoops it puts the card through, everybody knows what's coming and can optimize performance for it. But of course, those tweaks can blow up in your face if either the driver or the game gets patched or changed.
 
DLSS Image Quality looks TERRIBLE.

To be honest, RTX doesn't look THAT impressive. When I look at the second picture with the pipe and see how bright the sky is, I'm having Far Cry 1 HDR flashbacks, and some of the other shots look too dark. I don't know how it looks in motion, but it seems very hard to loot stuff under that lighting.
It's because DLSS is a complete failure. Every single review I've looked at says the same thing: a little boost in performance at the cost of picture quality. I'll take the picture quality myself; then again, I'll take both, which is why I use a Radeon :D
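The fps boost and the softness come from the same place. A back-of-the-envelope look at the numbers, assuming a 1440p internal render for 4K output (the figure early DLSS titles reportedly used; not a number from this review):

```python
# Why DLSS trades sharpness for fps: the GPU shades far fewer pixels and the
# network fills in the rest. The 1440p internal resolution is an assumption
# about early DLSS titles, not a number from this review.
native = (3840, 2160)     # 4K output
internal = (2560, 1440)   # assumed internal render resolution under DLSS

native_px = native[0] * native[1]
internal_px = internal[0] * internal[1]

print(f"Pixels shaded at native 4K: {native_px:,}")                  # 8,294,400
print(f"Pixels shaded with DLSS:    {internal_px:,}")                # 3,686,400
print(f"Shading work vs native:     {internal_px / native_px:.0%}")  # ~44%
```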

Perhaps you should look at the facts rather than attacking people.
A. Reviewers make build choices that may negatively impact performance; most of this is not purposefully slanted. That said, a good read of the reviewer's guide should show which system setups a vendor prefers to make theirs shine... it might be worth having more than one platform to test on to make sure you are not accidentally biasing things. It's a shitton of work and I wouldn't expect it of day-one reviews. The performance engineer in me always wants to ensure the validity of test cases.

B. RTX is cool but early. DLSS is... fuck, I turn off motion blur, why would I want this shiite?

C. This is an RTX showcase title... They delayed the game to make it a GameWorks title; this game may not have made it to production without Nvidia's money.
So yes, it is definitely going to be AMD-deoptimized... but we probably wouldn't get to play it without that money.

D. There is a reason why you shouldn't rely on a single site, but you should definitely take anomalies with a grain of salt and try to identify why they are different. Never start with the assumption of fudged numbers... reviewing is hard work. If I recall, there was a site giving AMD (CPU) much better numbers because they turned off the high-precision timer or something, which boosted AMD a touch and was very detrimental to Intel... so once again, configs are VERY important.

E. Plenty of reasons to wait a year for this game... /thread.

I am happy to see my 1080 Ti will still give a great experience, though.
And the other rig with a Vega 64 on FreeSync will also be OK-ish.
Umm, my Sapphire Radeon RX 580 and my 2K 144 Hz FreeSync monitor will have absolutely no issues playing Metro Exodus. If my setup can do it, the Vega 64 will breeze through it. This is all based on your "okish" comment.
 
Funny how everyone starts crying when they see AMD failing in an Nvidia-sponsored title; people immediately start claiming it's de-optimized for AMD.

But when Nvidia fails in a heavily AMD-biased and sponsored game, people start celebrating, saying it's AMD's raw GPU power unleashed, FineWine, and other baseless myths. The amount of ignorance and hypocrisy is really worrying.
 
Nothing is de-optimized for either Nvidia or AMD. AMD- and Nvidia-sponsored games are usually optimized for their respective GPUs, though, and that's OK.
 
I wonder if ultrawide resolutions will get DLSS support. I just tried BFV and the DLSS option was grayed out... that means I've got to stick to Medium DXR for now, I guess.
 
About the whole "you can't use DLSS with RT" debacle: at least you have the choice to either have accurate lighting & shadows with horri-bad image quality + low fps, OR RTX off with DLSS for a good balance of OK image quality & a high fps count. Still, DLSS isn't as ready as RTX, so it's clear that RTX alone does the job fantastically. I bet DLSS will be taken a little more seriously by game devs, since both techs need to "learn"; I feel the complexity of deep-learning tech isn't taken seriously and gets treated like some gimmick (which it's not when put to good use).
 