# Metro Exodus Benchmark Performance, RTX & DLSS



## W1zzard (Feb 13, 2019)

Metro Exodus launches this week. We thoroughly tested the game's technical aspects at three resolutions with 16 graphics cards, including the whole RTX lineup and the Radeon VII. We also have a ton of screenshots and side-by-side image comparisons using NVIDIA RTX and DLSS.



----------



## jabbadap (Feb 13, 2019)

Out of nowhere, and bam, Metro goodness... Did I miss something, or where are the DLSS numbers on the performance page?


----------



## ArbitraryAffection (Feb 13, 2019)

Ouch, somewhat poor Radeon performance.

But by the time I buy this, maybe drivers and/or patches will improve things. Either way, I have just bought a FreeSync 1080p monitor, so even if my 570 can only do ~40 fps it should still be playable and smooth.


----------



## raptori (Feb 13, 2019)

Thanks for the performance review. Interesting that some pictures with RTX off look better, and it's nice to see that the developers didn't make RTX off look bad on purpose... In the first picture I'd say RTX off looks better, and it seems that ray tracing doesn't provide enough global illumination.

Does DX11 give better performance?


----------



## dirtyferret (Feb 13, 2019)

awesome, thank you


----------



## Al Chafai (Feb 13, 2019)

The game runs better on Nvidia hardware. How is the 3 GB 1060 coming that close to the RX 590?
Do you think AMD will release a game-ready driver to boost performance?
Also, thanks for the amazing analysis.


----------



## nickbaldwin86 (Feb 13, 2019)

Makes me want an RTX 20-series card before buying the game... oh well, it should play just the same without the never-ending eye candy on my current rig.


----------



## Al Chafai (Feb 13, 2019)

raptori said:


> Thanks for the performance review. Interesting that some pictures with RTX off look better, and it's nice to see that the developers didn't make RTX off look bad on purpose... In the first picture I'd say RTX off looks better, and it seems that ray tracing doesn't provide enough global illumination.


I agree, but I think RTX on looks better in the outdoor screenshots and worse indoors.
At least that's what I noticed.
Also, is it just me, or are the DLSS screenshots blurry compared to the normal ones?


----------



## W1zzard (Feb 13, 2019)

jabbadap said:


> Did I miss something, or where are the DLSS numbers on the performance page?


Didn't have time for full benching of all cards with DLSS; the DLSS screenshot comparison page has performance numbers for the RTX 2080 Ti in the image titles.



Al Chafai said:


> Also, is it just me, or are the DLSS screenshots blurry compared to the normal ones?


Yeah, it seems blurrier than what we've seen in Final Fantasy, I think.



Al Chafai said:


> Do you think AMD will release a game-ready driver to boost performance?


I'm sure they will release a driver, just not sure how much improvement it will bring.


----------



## IceShroom (Feb 13, 2019)

ArbitraryAffection said:


> Ouch, somewhat poor Radeon performance.
> 
> But by the time I buy this, maybe drivers and/or patches will improve things. Either way, I have just bought a FreeSync 1080p monitor, so even if my 570 can only do ~40 fps it should still be playable and smooth.


What do you expect from an Nvidia-sponsored title? Poor performance on Radeons.


----------



## londiste (Feb 13, 2019)

raptori said:


> Interesting that some pictures with RTX off look better, and it's nice to see that the developers didn't make RTX off look bad on purpose... In the first picture I'd say RTX off looks better, and it seems that ray tracing doesn't provide enough global illumination.


This is much more likely a question of how the level and its lighting were built. Assuming RTX still does point lights with rasterization methods, this highlights the differences between the lighting methods. RTX is likely more "correct", which the original level/lighting design does not account for.


----------



## jabbadap (Feb 13, 2019)

W1zzard said:


> Didn't have time for full benching of all cards with DLSS; the DLSS screenshot comparison page has performance numbers for the RTX 2080 Ti in the image titles.
> 
> 
> Yeah, it seems blurrier than what we've seen in Final Fantasy, I think.
> ...



What do you compare DLSS against, no AA or some other AA mode? FF had horrible TAA, which made DLSS look better.


----------



## kastriot (Feb 13, 2019)

Did you try low settings with the 2080 Ti + RTX @ 4K, and does it go over 60 fps?


----------



## champsilva (Feb 13, 2019)

raptori said:


> Thanks for the performance review. Interesting that some pictures with RTX off look better, and it's nice to see that the developers didn't make RTX off look bad on purpose... In the first picture I'd say RTX off looks better, and it seems that ray tracing doesn't provide enough global illumination.
> 
> Does DX11 give better performance?



No, exactly the same (I've checked 2 reviews).


----------



## jabbadap (Feb 13, 2019)

champsilva said:


> No, exactly the same (I've checked 2 reviews).



Which is a rather good thing. Is there CPU scaling anywhere?


----------



## illli (Feb 13, 2019)

Interesting. My thoughts on the RTX/non-RTX images:

Most non-RTX images look better than the RTX ones. While some RTX images might be more 'accurate', some are way too dark to see anything.
Some RTX images look nearly indistinguishable from the non-RTX ones.
A few RTX images look better than the non-RTX ones.

Maybe in a few years it'll look better, once game studios have more experience working with it.


----------



## Markosz (Feb 13, 2019)

Well well well, just before reading this I watched the DLSS Port Royal benchmark video NVIDIA posted a week ago.

The performance gain is really nice and much welcome (though not as great as in the benchmark), but there was also none of the image quality improvement they highlighted in that video.
In fact the whole image was blurrier, which raises an interesting question. Since DLSS is machine-learning based, the more training it does, the better the results get.
They either did tons of machine learning for the benchmark, hence its results, or they purposefully degraded image quality for the 'no DLSS' image.

Either way, how are we supposed to know how long they will support a game with this? The results for each game might be very different, and some might even end up worse quality than DLSS off.


----------



## champsilva (Feb 13, 2019)

jabbadap said:


> Which is a rather good thing. Is there CPU scaling anywhere?



Same.


----------



## ArbitraryAffection (Feb 13, 2019)

IceShroom said:


> What do you expect from an Nvidia-sponsored title? Poor performance on Radeons.


Yeah, it seems fishy here. Honestly, I suspect 4A may have been drinking some green poison when they implemented RTX...


----------



## metalfiber (Feb 13, 2019)

For once, glad to see it's not just a console port.


----------



## nickbaldwin86 (Feb 13, 2019)

ArbitraryAffection said:


> Yeah, it seems fishy here. Honestly, I suspect 4A may have been drinking some green poison when they implemented RTX...



Yeah, how DARE they use the most current technology to make an amazing game with AMAZING graphics.

NV is paving new roads with new graphics technology, while AMD has a new card out that is a refresh of old silicon with NO new tech on board. Hold your breath and turn blue doing so, because AMD will have ray tracing of some kind by 2020.


----------



## ArbitraryAffection (Feb 13, 2019)

nickbaldwin86 said:


> Yeah, how DARE they use the most current technology to make an amazing game with AMAZING graphics.
> 
> NV is paving new roads with new graphics technology, while AMD has a new card out that is a refresh of old silicon with NO new tech on board. Hold your breath and turn blue doing so, because AMD will have ray tracing of some kind by 2020.


They can implement RTX, or they can implement RTX _and_ exclude GCN-specific optimisations, you know, to cripple performance on Radeon cards? I wouldn't put it past NVIDIA to push for something like that.

Also, Guru3D has Radeons doing MUCH better; I'm so confused:


----------



## W1zzard (Feb 13, 2019)

Markosz said:


> They either did tons of machine learning for the benchmark


3DMark runs on rails, so it's easy to optimize for: only a few tens of thousands of images are shown over the full benchmark, and there's no user control.


----------



## SystemMechanic (Feb 13, 2019)

TPU, please fix your benchmarks. First Anthem and now this... how is Guru3D getting 20 fps lower at 4K than you?


----------



## ArbitraryAffection (Feb 13, 2019)

SystemMechanic said:


> TPU, please fix your benchmarks. First Anthem and now this... how is Guru3D getting 20 fps lower at 4K than you?


I guess different areas of the game tax different parts of the GPU differently. I'm just confused that TPU has the 3 GB 1060 beating the 580 when Guru3D has it a hair off the 1070.


----------



## Al Chafai (Feb 13, 2019)

SystemMechanic said:


> TPU, please fix your benchmarks. First Anthem and now this... how is Guru3D getting 20 fps lower at 4K than you?


Fix what?
In Guru3D's results the 1070 gets 40 fps @ 1080p, lol.
I always trust TPU when it comes to their hardware or game analyses.


----------



## neatfeatguy (Feb 13, 2019)

"_What is worth mentioning though is that the game does look more blurry when DLSS is enabled _"

Okay, so it wasn't just my imagination when looking at the DLSS on/off comparison images: DLSS really does look blurry. It feels like they overlay motion blur on objects and backgrounds that aren't directly in your line of sight. It kind of hurts my eyes, so I'm not sure I'd like using DLSS even if I did have an RTX card.

I also don't like that you can't disable motion blur; it's one of the worst graphics options (in my opinion) out there, and you can't even turn it off in the options menu. I'm sure you can find a way to disable it manually, but I find tinkering with settings by hand tiresome (yeah, I'm lazy at times). I'd just like to be able to adjust everything in the settings menu and be done with it. I had to fight with .ini options in Fallout 4 to get it to work correctly, and as I get lazier with age, I don't have the patience I used to for fighting with games to make them function correctly.

I won't be picking this game up from Epic's game store, so I guess my comments are kind of moot...


----------



## M2B (Feb 13, 2019)

SystemMechanic said:


> TPU, please fix your benchmarks. First Anthem and now this... how is Guru3D getting 20 fps lower at 4K than you?



Don't trust their performance numbers; they're using the built-in benchmark.
Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.


----------



## W1zzard (Feb 13, 2019)

SystemMechanic said:


> how is Guru3D getting 20 fps lower at 4K than you?


Maybe they're using the in-game benchmark? I'm using actual gameplay


----------



## jabbadap (Feb 13, 2019)

W1zzard said:


> Maybe they're using the in-game benchmark? I'm using actual gameplay



What settings do you use, the plain Ultra preset or customized? (HairWorks, advanced physics, etc.)


----------



## barku (Feb 13, 2019)

Great review again, W1zzard. I think it's time to upgrade my 1070 to a 2080 at 2K res.


----------



## Al Chafai (Feb 13, 2019)

W1zzard said:


> Maybe they're using the in-game benchmark? I'm using actual gameplay


Pretty sure the method you used is more realistic to what we will actually be experiencing. I don't trust that built-in benchmark; just like in the old Metro games, it's more taxing than the actual game.
Good review, man.


----------



## W1zzard (Feb 13, 2019)

jabbadap said:


> What settings do you use, the plain Ultra preset or customized? (HairWorks, advanced physics, etc.)


https://www.techpowerup.com/reviews/Performance_Analysis/Metro_Exodus/3.html


----------



## jabbadap (Feb 13, 2019)

W1zzard said:


> https://www.techpowerup.com/reviews/Performance_Analysis/Metro_Exodus/3.html



How does HairWorks affect performance? Does it use reasonable tessellation and AA levels, so it doesn't bring graphics cards to their knees as it did in The Witcher 3?


----------



## W1zzard (Feb 13, 2019)

jabbadap said:


> How does HairWorks affect performance? Does it use reasonable tessellation and AA levels, so it doesn't bring graphics cards to their knees as it did in The Witcher 3?


AMD has been protecting against over-tessellation in the driver since The Witcher


----------



## jabbadap (Feb 13, 2019)

W1zzard said:


> AMD has been protecting against over-tessellation in the driver since The Witcher



Well, I think it was after Crysis 2 for tessellation. The Witcher 3 problem was more the high MSAA level on the hair than excessive tessellation. Either way, HairWorks usually tanks performance on both sides.


----------



## SystemMechanic (Feb 13, 2019)

M2B said:


> Don't trust their performance numbers; they're using the built-in benchmark.
> Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.


Ah, right. Because GameGPU and Guru3D both have a ~50 fps average, but TPU and OC3D got a 71 fps average. I was just looking at the graphs assuming the settings were the same.


----------



## Nxodus (Feb 13, 2019)

"..so is the story, which does seem a little bit linear at first"

I love linear games


----------



## John Naylor (Feb 13, 2019)

When I first started looking at the pics, especially the first one, I thought RTX was way too dark... it really started to show its stuff in the pic with the snow. Then I went back and looked at the first pic and it hit me: from a gameplay standpoint, I liked that I could see more (I used to turn up the gamma in Diablo caves to "see more"), but it immediately became apparent that given the observed light conditions, the darkness was "accurate".


----------



## moproblems99 (Feb 13, 2019)

Well, I still don't see any compelling reason to want RTX or DLSS. They seem like a trade-off so far: some scenes look better, some look worse. In any case, the next generation should tell us what we're really in for. I am glad to see that performance is mostly OK with RTX, though. At least it doesn't totally kill performance.



M2B said:


> Don't trust their performance numbers; they're using the built-in benchmark.
> Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.



It doesn't really matter as long as comparisons are consistent. Numbers from the in-game benchmark are all comparable with each other; people just shouldn't compare in-game benchmark numbers with live gameplay numbers.


----------



## Tsukiyomi91 (Feb 13, 2019)

Great review as always, W1zz! All the more reason for me to play this game, knowing that the 2060 performs so well at Ultra with RTX on High.


----------



## B-Real (Feb 13, 2019)

ArbitraryAffection said:


> Ouch, somewhat poor Radeon performance.
> 
> But by the time I buy this, maybe drivers and/or patches will improve things. Either way, I have just bought a FreeSync 1080p monitor, so even if my 570 can only do ~40 fps it should still be playable and smooth.


What? :O The Vega 56 leads the 1070, and the RX 580 is near the 1060.


----------



## ArbitraryAffection (Feb 13, 2019)

B-Real said:


> What? :O The Vega 56 leads the 1070, and the RX 580 is near the 1060.








The 3GB 1060 beating the 8GB 580.


----------



## XiGMAKiD (Feb 13, 2019)

Great review. For now it looks like RTX is more of a gimmick. Also, the pics on the Screenshots page are pretty varied, but too bad it doesn't say which ones are RTX on or off.


----------



## Space Lynx (Feb 13, 2019)

ArbitraryAffection said:


> Ouch, somewhat poor Radeon performance.
> 
> But by the time I buy this, maybe drivers and/or patches will improve things. Either way, I have just bought a FreeSync 1080p monitor, so even if my 570 can only do ~40 fps it should still be playable and smooth.




Oh, I don't know about that; I've seen the Vega 64 on sale for $330 in the last few months, and it beats a GTX 1070 by 5-10 fps according to all the reviews I've seen.

I have a laptop 1070 though, so maybe I'm just jealous. Maybe at High settings, with shadows turned to normal, I can hit 100 fps on my 1070 laptop, whose display is overclocked to 100 Hz ^^



XiGMAKiD said:


> Great review. For now it looks like RTX is more of a gimmick. Also, the pics on the Screenshots page are pretty varied, but too bad it doesn't say which ones are RTX on or off.



I think DLSS is a scam; it looks blurry as crap to me. I prefer it off.


----------



## 64K (Feb 13, 2019)

Game of the year imo


----------



## W1zzard (Feb 13, 2019)

XiGMAKiD said:


> Also, the pics on the Screenshots page are pretty varied, but too bad it doesn't say which ones are RTX on or off


They are all RTX off


----------



## FreedomEclipse (Feb 13, 2019)

metalfiber said:


> For once, glad to see it's not just a console port.



It's hard to test RTX on the current gen of consoles. It will probably be hard on the next gen too, since we're so close to seeing the PS5 drop; the specs are most likely finalised, and since it will be an AMD chip, RTX will either not be supported or, if it is, it will be extremely limited in what it can do.

By the time the PS6 drops, RTX will either be completely dead for being nothing more than a gimmick, or Nvidia will have pushed real hard to get the industry to adopt it so the technology matures.

I've seen Gamers Nexus's video on it, and TBF it's something I could live without and don't really care a whole lot about.


----------



## xkm1948 (Feb 13, 2019)

AMD GPUs do poorly in a new game.

AMD fans:

A. It is the reviewer's fault! He/she is biased towards Nvidia (Nvidia shill theory)
B. RTX and DLSS are gimmicks!
C. The developer took Nvidia money (developer-is-Nvidia-shill theory)
D. But XXX site showed XXX GPU is way better (accusing-TPU theory)
E. I won't play this game (ostrich strategy)

So yeah, whenever AMD does poorly at something, it is ALWAYS someone else's fault. Never the fault of saint, savior and holy underdog AMD.


----------



## 64K (Feb 13, 2019)

xkm1948 said:


> AMD GPUs do poorly in a new game.
> 
> AMD fans:
> 
> ...



Then you miss out on a great game.


----------



## Vya Domus (Feb 13, 2019)

DLSS obliterates texture detail and alpha effects, and people still don't want to accept that this is a terrible upscaling method. Might as well lower your resolution manually; hell, it may even look better.


----------



## Tomgang (Feb 13, 2019)

My GPU is ready for Metro, but is my old CPU up to the task? Gonna try it out when it's released on Friday. Metro will be the first new game I try in 2019, and we'll see whether 2019 is the year X58 is finally outdated for gaming.

All that ray tracing and DLSS, you RTX guys can keep it. The images really did not impress. So I am happy with what I've got now: a GTX 1080 Ti.


----------



## Imsochobo (Feb 13, 2019)

lynx29 said:


> I think DLSS is a scam; it looks blurry as crap to me. I prefer it off.


First game to give something to users with RTX (BF5 is just a joke...).

DLSS doesn't look like the solution at all; I've seen no good implementation so far. The ray tracing here, though, is actually really good in the pictures I see, and the performance hit isn't too bad, so I can't blame the dev.



xkm1948 said:


> AMD GPUs do poorly in a new game.
> AMD fans:
> 
> A. It is the reviewer's fault! He/she is biased towards Nvidia (Nvidia shill theory)
> ...



A, D, E = fanboy shit.
B = partially true. It's not worth buying an RTX 2xxx for its RTX (emphasis on 2xxx). I believe some form of ray tracing or path tracing is the future, but for now let devs play with it for a bit. This game is the first where I see ray tracing used well; BF5 showed off nothing I haven't seen from rasterized game engines (which I can run on any GPU). So far it's been a gimmick, just like GameWorks.
I don't think many will buy an RTX 2080 for this game, so don't buy an RTX card for its RTX; for the products out there right now it definitely is a gimmick. But I am saying buy an RTX card, as AMD can't compete.

DLSS = I've seen nothing positive: a performance gain for reduced quality. Quality settings exist for a reason; we don't need DLSS.

C. Here is my opinion: I hardly ever see an Nvidia card perform terribly in an AMD-titled game; such games are usually just optimized for AMD's hardware, where benchmarks show AMD cards performing closer to their theoretical performance.
Nvidia titles, meanwhile, tank performance on AMD cards like hell, and often the game doesn't even look that good; it still tanks performance with its GameWorks stuff, and most of the time I can only say: it looks different, not better and not worse, just plain pointless.
So when I had my 980 Ti I still disabled GameWorks in every title I played, because I never saw any point in having it enabled.

The only reasons I have a Vega are FreeSync and Linux support, the latter being the only place AMD cards really are just superior.


----------



## Xuper (Feb 13, 2019)

> Surprisingly the game doesn't let you select between full-screen, windowed or maximized window. Rather the game always runs in full screen mode at your current desktop resolution, and the "Resolution" setting here controls the rendering resolution. This causes some issues when your desktop is set to 1080p on a 4K monitor — Metro will only run in 1080p, even though you're selecting 4K in settings.


I'm fine 


> Motion blur is always enabled, the options are "low", "normal", "high" — "off" is not available, which is super annoying.


Motion blur is the reason I disabled it in BFV; super annoying.


> DLSS can be enabled or disabled, separately from ray-tracing. DLSS is available based on GPU, resolution and raytracing setting. To enable DLSS at 4K you need a RTX 2070 or better, raytracing can be on or off. For DLSS at 1440p you must have raytracing enabled and a RTX 2060. At 1080p, DLSS can only be enabled on RTX 2060 & RTX 2070, only when raytracing is enabled. No idea why NVIDIA chose to limit it that way.


What the hell? If you have an RTX 2060, you have to enable RT to get DLSS? I guess it was too buggy, so the developers left it for later so they can fix the important issues/bugs first.
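For what it's worth, the availability rules quoted from the review (plus Gasaraki's later note that the 2080 Ti is excluded at 1440p) can be written out as a small rule table. This is only a sketch of what the thread describes; the resolution labels, GPU list, and function name are invented for illustration, not anything from NVIDIA:

```python
# Sketch of the DLSS availability rules as described in this thread.
# ASSUMPTIONS: resolution labels, the GPU list, and the function name are
# made up; the rules come from the review text quoted above plus the later
# observation that the 2080 Ti cannot use DLSS at 1440p.

def dlss_available(gpu: str, resolution: str, raytracing: bool) -> bool:
    """Return True if DLSS can be enabled for this combination."""
    if resolution == "2160p":
        # 4K: RTX 2070 or better, ray tracing on or off
        return gpu in ("RTX 2070", "RTX 2080", "RTX 2080 Ti")
    if resolution == "1440p":
        # ray tracing must be on; the 2080 Ti is excluded
        return raytracing and gpu in ("RTX 2060", "RTX 2070", "RTX 2080")
    if resolution == "1080p":
        # only the 2060 and 2070, and only with ray tracing on
        return raytracing and gpu in ("RTX 2060", "RTX 2070")
    return False

print(dlss_available("RTX 2080 Ti", "1440p", True))   # False
print(dlss_available("RTX 2070", "2160p", False))     # True
```

Written out like this, the oddity people are complaining about is plain: the fastest card is locked out of DLSS at exactly the resolution where many owners would want it.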


----------



## Vya Domus (Feb 13, 2019)

Xuper said:


> What the hell? If you have an RTX 2060, you have to enable RT to get DLSS? I guess it was too buggy, so the developers left it for later so they can fix the important issues/bugs first.



I have a feeling these are restrictions imposed by Nvidia so they can have "tiers" of some sort. They don't want people to have too horrible an experience from RTX/DLSS destroying their performance or image quality, so they limit your choices.


----------



## Mescalamba (Feb 13, 2019)

WTH was DLSS supposed to do?

RTX clearly works, and nicely. I'm quite curious about the future and nicely, realistically lit games.


----------



## bug (Feb 13, 2019)

John Naylor said:


> When I first started looking at the pics, especially the first one, I thought RTX was way too dark... it really started to show its stuff in the pic with the snow. Then I went back and looked at the first pic and it hit me: from a gameplay standpoint, I liked that I could see more (I used to turn up the gamma in Diablo caves to "see more"), but it immediately became apparent that given the observed light conditions, the darkness was "accurate".


This is going to spark quite a few discussions. DXR means light is rendered more accurately. Some will say the brighter images look better, some won't. It's just like how some prefer Canon's more saturated colors, even though Nikon's defaults are more lifelike.

Also, it's pretty clear the 2060 can't really do DXR.


----------



## Xuper (Feb 13, 2019)

xkm1948 said:


> AMD GPUs do poorly in a new game.
> 
> AMD fans:
> 
> ...



Hey, please *STOP*! Don't fuel the Red/Green war. Even if someone else started it, DON'T JOIN!


----------



## bug (Feb 13, 2019)

Mescalamba said:


> WTH was DLSS supposed to do?



It renders at a lower resolution, then upscales, assisted by per-title presets in the driver.


----------



## moproblems99 (Feb 13, 2019)

Xuper said:


> Hey, please *STOP*! Don't fuel the Red/Green war. Even if someone else started it, DON'T JOIN!



He's just salty he bought into the Fury X hype.  Let him be.  His wounds will heal soon.


----------



## bug (Feb 13, 2019)

ArbitraryAffection said:


> 3GB 1060 beating the 8GB 580


Did you look at VRAM allocation? It's a bit over 3GB at 1440p.


----------



## Vya Domus (Feb 13, 2019)

Xuper said:


> DON'T JOIN!



Join? He is at the helm of it.


----------



## Xuper (Feb 13, 2019)

Vya Domus said:


> Join ? He is at the helm.





bug said:


> Did you look at VRAM allocation? It's a bit over 3GB at 1440p.



But the 1060 3GB is only a 1080p card; not good if someone buys it for 1440p. Btw, this is the Ultra setting, and almost no one plays at Ultra; ~95% of people play at High, and I'm one of them.


----------



## Nima (Feb 13, 2019)

Ray tracing and global illumination look awesome; a night-and-day difference in some scenes. I think global illumination will be the best application of ray tracing in games, but DLSS was really disappointing in this game. Strangely, even the 1440p image looks sharper than 4K with DLSS on.


----------



## cucker tarlson (Feb 13, 2019)

Radeon VII at 2070 performance; not good.
I'm glad that at 1440p with a 1080 Ti I can expect very good framerates at Ultra. I'll wait for the game reviews before buying, though.


----------



## M2B (Feb 13, 2019)

cucker tarlson said:


> Radeon VII at 2070 performance; not good.
> I'm glad that at 1440p with a 1080 Ti I can expect very good framerates at Ultra. I'll wait for the game reviews before buying, though.


----------



## cucker tarlson (Feb 13, 2019)

M2B said:


> View attachment 116380


I need more than that.


----------



## Countryside (Feb 13, 2019)

Looks real nice can't wait to try it


----------



## EarthDog (Feb 13, 2019)

M2B said:


> Don't trust their performance numbers; they're using the built-in benchmark.
> Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.


I think what is important here is the RELATIVE performance of the cards, not the actual in-game FPS. As a consumer, I would be happy to see BETTER performance in the game than in the included benchmark.

The good news is that the integrated benchmark is repeatable and allows for empirical testing. Run-throughs can be inconsistent due to all the variables; the longer the run the better, is what we found for manual runs.
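To illustrate the point about relative performance: two sites testing different scenes can still agree on the card ranking once each site's numbers are normalized to a common baseline card. A minimal sketch (all FPS figures below are invented for illustration, not taken from any review):

```python
# Two sites testing different scenes can disagree wildly on absolute FPS
# yet agree on the card ranking once each site's numbers are normalized
# to a common baseline card. All FPS figures below are INVENTED for
# illustration; they are not taken from TPU, Guru3D, or anyone else.

def relative(results: dict, baseline: str) -> dict:
    """Normalize a {gpu: fps} dict to the baseline card (baseline = 1.0)."""
    base = results[baseline]
    return {gpu: round(fps / base, 2) for gpu, fps in results.items()}

site_a = {"RTX 2080 Ti": 71.0, "RTX 2070": 48.0, "RX Vega 64": 44.0}  # lighter scene
site_b = {"RTX 2080 Ti": 51.0, "RTX 2070": 34.0, "RX Vega 64": 31.0}  # heavier scene

print(relative(site_a, "RTX 2080 Ti"))
print(relative(site_b, "RTX 2080 Ti"))
# The ratios come out nearly identical even though absolute FPS differs.
```

This is why a 20 fps gap between two sites at 4K says little by itself; the relative standings are the comparable part.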


----------



## Patriot (Feb 13, 2019)

xkm1948 said:


> AMD GPUs do poorly in a new game.
> 
> AMD fans:
> 
> ...



Perhaps you should look at the facts rather than attacking people.
A. Reviewers make build choices that may negatively impact performance; most of this is not purposely slanted. That said, a good read of the reviewer's guide should show which system setups a vendor prefers to make theirs shine... it might be worth having more than one test platform to make sure you are not accidentally biasing. It's a shitton of work and I wouldn't expect it of day-one reviews. The performance engineer in me always wants to ensure the validity of test cases.

B. RTX is cool but early. DLSS is... fuck, I turn off motion blur, why would I want this shiite.

C. This is an RTX showcase title. They delayed the game to make it a GameWorks title; this game may not have made it to production without Nvidia money.
So yes, it is definitely going to be AMD-deoptimised... but we probably wouldn't get to play it without that money.

D. There is a reason you shouldn't rely on a single site, but you should definitely take anomalies with a grain of salt and try to identify why they differ. Never start with the assumption of fudged numbers; reviewing is hard work. If I recall, there was a site giving AMD (CPU) much better numbers because they turned off the high-precision timer or something, which boosted AMD a touch and was very detrimental to Intel... so once again, configs are VERY important.

E. Plenty of reasons to wait a year for this game... /thread.

I am happy to see my 1080 Ti will still deliver a great experience, though.
And the other rig, with a Vega 64 on FreeSync, will also be OK-ish.


----------



## moob (Feb 13, 2019)

From the conclusion: _What is worth mentioning though is that the game does look more blurry when DLSS is enabled — 4K DLSS still looks much better than simply running at 1440p._

What? No, it doesn't. In the last image comparing 1440p to 4K+DLSS, the overall 1440p image looks better to my eyes. The DLSS image is a blurry mess; it _literally_ hurts my eyes. Though I'm the same as others here, in that motion blur is one of the first things I turn off in any game.


----------



## Gasaraki (Feb 13, 2019)

What is really baffling is that if you have a 2080 Ti, you can't use DLSS at 2560x1440; only the 2080 and below can.

Huh?


----------



## EarthDog (Feb 13, 2019)

Gasaraki said:


> What is really baffling is that if you have a 2080 Ti, you can't use DLSS at 2560x1440; only the 2080 and below can.
> 
> Huh?


My take on this is that they put DLSS where performance actually needs the help first. The 2080 Ti can already run 1440p with RT enabled at Ultra at 60 FPS.


----------



## Gasaraki (Feb 13, 2019)

ArbitraryAffection said:


> Ouch, somewhat poor Radeon performance.
> 
> But by the time I buy this, maybe drivers and/or patches will improve things. Either way, I have just bought a FreeSync 1080p monitor, so even if my 570 can only do ~40 fps it should still be playable and smooth.



What is the lower limit of your FreeSync monitor? Most FreeSync monitors don't start working until 40 fps or above.



EarthDog said:


> My take on this is that they put DLSS where performance actually needs the help first. The 2080 Ti can already run 1440p with RT enabled at Ultra at 60 FPS.



Sorry, but 60 fps shouldn't be the ceiling; 60 fps should be the minimum. People actually have 1440p monitors that go up to 160 Hz.


----------



## Captain_Tom (Feb 13, 2019)

"NVIDIA DLSS is a new form of anti-aliasing which renders the game at reduced resolution, upscales it, and then *fills in the details using a pre-trained AI network*... We are a bit *puzzled though, why NVIDIA is limiting the use of DLSS in some ways* that make little sense. For example, you can enable DLSS at 1080p..."

You answered your own question. It takes a lot of preemptive work to "pre-train" the network that DLSS runs on the tensor cores. It doesn't "just work."

Nvidia needs to train each DLSS option individually, for each Turing core configuration they plan to support, per game. Hence Nvidia isn't going to waste a ton of time pre-training 1080p for the 2080 Ti: no one would use it, and it takes time and money.

Same reason they force you to use ray tracing with DLSS in BFV: DLSS isn't needed unless you use ray tracing, so they aren't going to waste time training separate paths for useless settings combinations. If you aren't using ray tracing, you don't need to downgrade your graphics with DLSS.


----------



## Gasaraki (Feb 13, 2019)

M2B said:


> Don't trust their performance numbers; they're using the built-in benchmark.
> Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.



While that might be true, it makes it impossible to compare TPU's benchmarks with another site's. Meanwhile, the sites that use the built-in benchmark can all be compared with each other. Peer review.


----------



## Lightofhonor (Feb 13, 2019)

So a 2070 with ray tracing on High is pretty much a 2080 with ray tracing on Ultra? $300 doesn't buy what it used to... Happy with my 2070.


----------



## EarthDog (Feb 13, 2019)

Gasaraki said:


> Sorry, but 60 fps shouldn't be the ceiling; 60 fps should be the minimum. People actually have 1440p monitors that go up to 160 Hz.


Indeed. But two things...

1. 60 FPS is what the majority deem 'playable'.
2. Maybe it shouldn't... but if you look at the table (maybe it was in another thread here), it looks like NVIDIA attacked first where it NEEDED a performance boost. If you look at the link below, you will see that at 2560x1440 the 2080 Ti with RT on Ultra is almost 75 FPS. More is always better, but at least to me it seems fairly obvious they put the effort where it was explicitly necessary first. I would expect to see more resolutions added as time allows for the AI to 'train'.
https://www.techpowerup.com/forums/...rformance-rtx-dlss.252502/page-3#post-3993996


----------



## cucker tarlson (Feb 13, 2019)

Gasaraki said:


> What is really baffling is that if you have a 2080Ti, you can't use DLSS at 2560x1440. Only 2080s or below.
> 
> Huh?


cause dlss is really poor at 1440p

http://www.pcgameshardware.de/commoncfm/comparison/clickSwitch.cfm?id=151751

should be used for 4K monitors exclusively


----------



## satrianiboys (Feb 13, 2019)

Don't mind the performance differences, they're expected. 

I just want to praise how beautiful RTX is and how correctly TPU phrased its beauty.


----------



## ArbitraryAffection (Feb 13, 2019)

bug said:


> Did you look at VRAM allocation? It's a bit over 3GB at 1440p.


It's not just about Vram though. 580 is a faster card than 3GB 1060 in most things.



Gasaraki said:


> What is the lower limit of your freesync monitor? Most freesync monitors don't start working till 40fps or above.



I bought this one.

Oh god, please don't tell me it's bad. But seriously, please don't. It'll trigger my anxiety T_T ~ It's the only 144Hz monitor in my budget; there was literally nothing else.


----------



## bug (Feb 13, 2019)

ArbitraryAffection said:


> It's not just about Vram though. 580 is a faster card than 3GB 1060 in most things.



You said "most"


----------



## cucker tarlson (Feb 13, 2019)

rtx light looks great,not gonna buy a 2080ti to have that tho.


----------



## Zubasa (Feb 13, 2019)

cucker tarlson said:


> cause dlss is really poor at 1440p
> 
> http://www.pcgameshardware.de/commoncfm/comparison/clickSwitch.cfm?id=151751
> 
> should be used for 4K monitors exclusively


Not sure about using a 4k monitor just to get a more blurry image....
I think you are better off with a 1440p maybe even a 1080p monitor in that case.


----------



## Mescalamba (Feb 13, 2019)

M2B said:


> View attachment 116380



Critic reviews are meaningless. Only player reviews count, since you can't usually bribe players. Same for Rotten Tomatoes.


----------



## bug (Feb 13, 2019)

Zubasa said:


> Not sure about using a 4k monitor just to get a more blurry image....
> I think you are better off with a 1440p maybe even a 1080p monitor in that case.


Those comparisons are misleading. Yes, screenshots will look worse under a magnifying glass, but that's not the point. The point is, are those differences noticeable during normal gameplay? I have seen no review that talks about that.


----------



## Mescalamba (Feb 13, 2019)

ArbitraryAffection said:


> It's not just about Vram though. 580 is a faster card than 3GB 1060 in most things.
> 
> 
> 
> ...



Its most likely 48Hz-144Hz, should be fine.


----------



## moproblems99 (Feb 13, 2019)

Mescalamba said:


> Only player reviews count.



Says none of the previous Metro games that got review bombed by children.


----------



## Apocalypsee (Feb 13, 2019)

To be honest RTX doesn't look THAT impressive. When I look at the second picture with the pipe and see how bright the sky is, I'm having Far Cry 1 HDR flashbacks, and some of the other shots look too dark. I don't know how it looks in motion, but it seems very hard to loot stuff under that lighting.


----------



## erixx (Feb 13, 2019)

This game is not out yet! Grrrrrrrrr! Mmmmmmm!  

When will it be installable from Epic?


----------



## rtwjunkie (Feb 14, 2019)

Mescalamba said:


> Critic reviews are meaningless.


If @RCoon would review it, it would be trustworthy. His reviews are never afraid to show negatives or to pronounce a game as not good.


----------



## CAT-THE-FIFTH (Feb 14, 2019)

GameGPU also tested the game:
https://gamegpu.com/action-/-fps-/-tps/metro-exodus-test-gpu-cpu







The test sequence:


----------



## Frutika007 (Feb 14, 2019)

IceShroom said:


> What do you expect from an Nvidia-sponsored title?? Poor performance on Radeons.



I didn't see you or anyone saying the same things when AMD-sponsored titles were tested and gave poor performance on Nvidia, e.g. the Resident Evil 2 remake. Hypocrisy?



ArbitraryAffection said:


> yeah it seems fishy here honestly I suspect 4A may have been drinking some green poison when they implemented RTX....



Yeah, same with the Resident Evil 2 remake. Capcom drank some red poison while making it.


----------



## Robcostyle (Feb 14, 2019)

Seems to be a nice game, but poorly advertised. Its absence from Steam and the Denuvo DRM make it even harder to say whether the game is worth paying for right now.
RTX is something for a change; the tech has some potential, especially once devs get some experience. 
However, the overall image difference is a far cry from what I'd say is worth the premium they ask for these cards - and I'm shocked at the number of those idiots blinded by ngreedia marketing and advertisement, swearing they'll grab a new RTX card right NOW, just after seeing the Exodus screenshots.
Especially considering that only the 2080/Ti makes some sense for 1080p/1440p - 2060/2070 RTX performance is not impressive, and RTX at 4K is DOA, that's for sure.

P.S. DLSS is a complete joke! - greedy Huang should have invested those transistors in RT cores instead, to hit at least an average of 60fps@4K - but he chose to be the dog on the bridge that wants both bones...


----------



## John Naylor (Feb 14, 2019)

ArbitraryAffection said:


> 3GB 1060 beating the 8GB 580



VRAM is not in play at 1080p... look at TPU's test results for the 3 GB versus 6 GB model. The 6 GB gets about 6% better numbers, but we can also see that VRAM is not the reason: the 6 GB model differs by more than the VRAM, it has 11% more shaders. So how do we know it's not the VRAM? If it were a contributing factor at 1080p, then it would necessarily be a significantly larger factor, with a larger performance difference, at 1440p. It's not. There may come a day when more VRAM is needed at 1080p but, given those test results, that day has not yet arrived.
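The back-of-the-envelope version of that argument, using the real core counts (the 6 GB 1060 has 1280 CUDA cores, the 3 GB model 1152) and an assumed ~6% performance gap taken from the cited results:

```python
# Sketch of the reasoning: if the 6GB card's lead is smaller than its
# shader-count advantage and doesn't grow at higher resolution, the
# bottleneck is compute, not VRAM capacity.

CORES_6GB = 1280   # GTX 1060 6GB CUDA cores
CORES_3GB = 1152   # GTX 1060 3GB CUDA cores

shader_advantage = (CORES_6GB - CORES_3GB) / CORES_3GB   # ~11.1%

# Assumed performance gaps from the cited TPU results (illustrative figures)
perf_gap_1080p = 0.06
perf_gap_1440p = 0.06   # the gap does not widen with resolution

# If VRAM capacity were the limit, the 1440p gap would exceed
# what the extra shaders alone can explain. It doesn't.
vram_limited = perf_gap_1440p > shader_advantage
```

With the gap sitting below the 11% shader advantage at both resolutions, `vram_limited` comes out False — the cards are compute-bound, not capacity-bound, at these settings.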




bug said:


> This is going to spark quite a few discussions. DXR means light is more accurately rendered. Some will say the brighter images look better, some won't. It's just like some prefer Canon's more saturated colors, even if Nikon has more lifelike defaults.



Well, that's the thing. And the reason why photo editors used IPS screens. Look at a photo of grandma on your screen after she used her "Glamor Shots" gift card and she looks great on the IPS screen ... look at the same pic on TN and she looks like an old hooker with overdone lipstick and rouge. Much akin to popular digital music with exaggerated bass and treble to make it sound better on the $15 sound systems in phones, budget MoBos, and boom boxes ... played on a high-end audiophile system, people are almost running out of the room holding their ears. It's "uncomfortable", to say the least. So it's not just that devs have to adjust how they render images to use ray tracing as a feature; they have to rethink the artificial adjustments they have been using over the years to make scenes look better. Because a scene was not rendering properly with regard to lighting effects, it would be made overly bright. Now, with light only passing through transparent surfaces, much of the light coming into a scene is blocked by opaque surfaces. It really shows further down in the shed scene ... but I like what the review did there, showing the pimples along with the clear skin. With accurate lighting it's going to matter where and how big your light sources are, and faking it to get desired lighting levels won't work anymore.




M2B said:


> Don't trust their performance numbers, they're using built-in benchmark.
> steve from gamers nexus says in-game benchmark is unrealistically heavy.



An in-game benchmark/demo is like using the same test on your students when you're aware last year's answers are in circulation. Regardless of what hoops it puts the card through, everybody knows what's coming and can optimize performance for it. But of course, those tweaks can blow up in your face if either the driver or the game gets patched or changed.


----------



## Super XP (Feb 14, 2019)

*DLSS Image Quality looks TERRIBLE. *



Apocalypsee said:


> To be honest RTX doesn't look THAT impressive. When I look at the second picture with the pipe and see how bright the sky is, I'm having Far Cry 1 HDR flashbacks, and some of the other shots look too dark. I don't know how it looks in motion, but it seems very hard to loot stuff under that lighting.


It's because DLSS is a complete failure. Every single review I looked at says the same thing: a little boost in performance at the cost of picture quality. I'll take the picture quality myself; then again, I'll take both, which is why I use a Radeon 



Patriot said:


> Perhaps maybe you should look at the facts rather than attacking people.
> A. Reviewers make build choices that may negatively impact performance, most of this is not purposefully slanted.   That said, a good read of the reviewers guide should show which systems setups a vendor prefers to make theirs shine.... might be worth having more than 1 platform to test on to make sure you are not accidentally biasing.   It's a shitton of work and I wouldn't expect it out of day 1 reviews.  The performance engineer in me always wants to ensure validity of test cases.
> 
> B.  RTX is cool but early, DLSS is... fuck I turn off motion blur why would I want this shiite.
> ...


Umm, my Sapphire Radeon RX 580 and my 2K 144Hz FreeSync monitor will have absolutely no issues playing Metro Exodus. If my setup can do it, the Vega 64 will be able to breeze through it. This is all based on your "okish" comment.


----------



## Frutika007 (Feb 14, 2019)

Funny how everyone starts crying when they see AMD failing in an Nvidia-sponsored title; people immediately start claiming that it's de-optimized for AMD.

But when Nvidia fails in a heavily AMD-biased and AMD-sponsored game, people start to celebrate, saying it's AMD's raw GPU power unleashed, FineWine, and other baseless myths. The amount of ignorance and hypocrisy is really worrying.


----------



## Super XP (Feb 14, 2019)

R4WN4K said:


> Funny how everyone starts crying when they see AMD failing in an Nvidia-sponsored title; people immediately start claiming that it's de-optimized for AMD.
> 
> But when Nvidia fails in a heavily AMD-biased and AMD-sponsored game, people start to celebrate, saying it's AMD's raw GPU power unleashed, FineWine, and other baseless myths. The amount of ignorance and hypocrisy is really worrying.


Nothing is de-optimized for either Nvidia or AMD. Though AMD- and Nvidia-sponsored games are usually optimized for their GPUs, and that's OK.


----------



## SpartanM07 (Feb 14, 2019)

I wonder if ultrawide resolutions will be supported by DLSS. I've just tried out BFV and the DLSS option was grayed out... which means I've got to stick to Medium DXR for now, I guess.


----------



## Tsukiyomi91 (Feb 14, 2019)

About the whole "you can't use DLSS with RT" debacle: at least you have a choice to either have accurate lighting & shadows with horri-bad image quality + low fps, OR RTX off with DLSS for a good balance of OK image quality & a high fps count. Still, DLSS isn't as ready as RTX, so it's clear that RTX alone does the job fantastically. I bet DLSS tech will be taken a little more seriously by game devs, coz both of them need to "learn"; I feel the complexity of deep-learning tech isn't taken seriously and gets treated like some gimmick (which it's not when put to good use).


----------



## robb (Feb 14, 2019)

SystemMechanic said:


> TPU. Please fix ur benchmarks. First Anthem and now this...how is guru3d getting 20fps lower at 4k than yours ???


It is getting harder and harder to link people to their benchmarks, as they never seem to line up with all the other sites. For example, no other review of the Radeon VII had it losing to the Nvidia cards like it did on here.


----------



## Nkd (Feb 14, 2019)

man, dlss is a letdown. Shit is so blurry. That is what I was afraid of. Running good old resolution with DLSS off. I think nvidia should just dedicate all the tensor cores to ray tracing to boost performance and let go of the dlss stuff.



Tsukiyomi91 said:


> About the whole "you can't use DLSS with RT" debacle: at least you have a choice to either have accurate lighting & shadows with horri-bad image quality + low fps, OR RTX off with DLSS for a good balance of OK image quality & a high fps count. Still, DLSS isn't as ready as RTX, so it's clear that RTX alone does the job fantastically. I bet DLSS tech will be taken a little more seriously by game devs, coz both of them need to "learn"; I feel the complexity of deep-learning tech isn't taken seriously and gets treated like some gimmick (which it's not when put to good use).



Are you really blaming the developers here? Nvidia is supposed to do the heavy lifting on DLSS. Let's blame Nvidia for wasting stupid die space on DLSS; I would have easily accepted using the tensor cores for RTX only, or adding more CUDA cores for a performance increase, without having to deal with DLSS.


----------



## Pumper (Feb 14, 2019)

Damn, ray-traced lighting is such a better use of the tech than the reflections in BFV. The game looks great, the only issue being that interiors should get a couple more bounces, as they look darker than they should now.

DLSS is a joke as expected. No idea why tech sites/youtubers keep bringing it up as a "feature" when doing GPU reviews.


----------



## ArbitraryAffection (Feb 14, 2019)

John Naylor said:


> VRAM not in play at 1080..... look at TPUs test results for the 3 GB versus 6GB model.  The 6GB gets about 6% better numbers, but we also see that that the VRAM is not the reason.  The 6 GB varies by more than the VRAM, it has 11% more shaders.  So how do we know it's not the VRAM ?   If it was a contributing factor at 1080p, then it would necessarily be a significantly large factor with a larger performance difference at 1440p.  It's not.   There may come a day when more VRAM is needed at 1080p but, given those test results, that day has not yet arrived.


I'm sorry but 3GB is not sufficient even for 1080p with high textures. Metro games were always frugal with VRAM (and looked great despite it). The 1060 3GB is a disgrace of a card, and so is the upcoming 1660 Ti 3GB. People that buy these cards deserve to get burned in newer games. ~


----------



## RCoon (Feb 14, 2019)

rtwjunkie said:


> If @RCoon would review it it would be trustworthy.  His reviews are never afraid to show negatives or to pronounce a game as not good.


I'm at the mercy of Terminals.io

As you can tell by the barren game reviews lately, they've not been giving me much. Early in the year all games come out in the same week, forcing me to choose one out of five.


----------



## bug (Feb 14, 2019)

John Naylor said:


> Well, that's the thing. And the reason why photo editors used IPS screens. Look at a photo of grandma on your screen after she used her "Glamor Shots" gift card and she looks great on the IPS screen ... look at the same pic on TN and she looks like an old hooker with overdone lipstick and rouge. Much akin to popular digital music with exaggerated bass and treble to make it sound better on the $15 sound systems in phones, budget MoBos, and boom boxes ... played on a high-end audiophile system, people are almost running out of the room holding their ears. It's "uncomfortable", to say the least. So it's not just that devs have to adjust how they render images to use ray tracing as a feature; they have to rethink the artificial adjustments they have been using over the years to make scenes look better. Because a scene was not rendering properly with regard to lighting effects, it would be made overly bright. Now, with light only passing through transparent surfaces, much of the light coming into a scene is blocked by opaque surfaces. It really shows further down in the shed scene ... but I like what the review did there, showing the pimples along with the clear skin. With accurate lighting it's going to matter where and how big your light sources are, and faking it to get desired lighting levels won't work anymore.


Adjusting to the new lighting model probably isn't that hard. The problem is everybody still has to support rasterization, so they have to make both look good at the same time. Not impossible, but not as straightforward as it could be.


Super XP said:


> It's because DLSS is a complete failure. Every single review I looked at says the same thing: a little boost in performance at the cost of picture quality. I'll take the picture quality myself; then again, I'll take both, which is why I use a Radeon


I'm sure you run nothing but SSAA, because you'll take picture quality


----------



## Nxodus (Feb 14, 2019)

CAT-THE-FIFTH said:


> GameGPU also tested the game:
> https://gamegpu.com/action-/-fps-/-tps/metro-exodus-test-gpu-cpu
> 
> 
> ...



man, the Russians know how to test a damn game. Never seen such a detailed test, impressive.
thanks for the link


----------



## AmonRaa (Feb 14, 2019)

More performance with DLSS enabled means less light, more black?


----------



## Nxodus (Feb 14, 2019)

Pumper said:


> DLSS is a joke as expected. No idea why tech sites/youtubers keep bringing it up as a "feature" when doing GPU reviews.



it can be useful... for people who prefer high FPS over some blurriness. There are these kids who boast about 144Hz and never going back to "abysmal" 60Hz. Well, DLSS is for them. And clearly, the monitor industry is catering to them as well, so I can understand the business perspective behind DLSS.

Let's not forget about people who turn everything down just to get 240Hz on their silly 24" monitors.


----------



## londiste (Feb 14, 2019)

CAT-THE-FIFTH said:


> GameGPU also tested the game:
> https://gamegpu.com/action-/-fps-/-tps/metro-exodus-test-gpu-cpu


Nvidia cards don't care whether you use DX11 or DX12 (except for RTX/DLSS, which require DX12). AMD cards need to use DX12.



Nxodus said:


> it can be useful... for people who prefer high FPS over some blurriness. There are these kids who boast about 144Hz and never going back to "abysmal" 60Hz. Well, DLSS is for them. And clearly, the monitor industry is catering to them as well, so I can understand the business perspective behind DLSS.


No it isn't. This is for people who prefer higher resolution than their GPU can run. You can see from FPS graphs that 144 FPS is out of the question even (or especially) with DLSS.
I think upscaling is cancer but consoles have been doing it for most of the generation and DLSS compares well with other methods. Marketing around DLSS is stupid but the idea is sound.


----------



## bug (Feb 14, 2019)

londiste said:


> I think upscaling is cancer but consoles have been doing it for most of the generation and DLSS compares well with other methods. Marketing around DLSS is stupid but the idea is sound.



Look what Ars says about it:


> To be more precise: at its best, _Metro Exodus_' DLSS 4K mode looks sharper and cleaner than those PS4 Pro games, while in motion, it's easy to spot once-a-minute grainy artifacts when Nvidia's tech tries to keep up with fast action and motion-blur effects.


https://arstechnica.com/gaming/2019...ayer-game-to-usher-in-the-pc-ray-tracing-era/

At the end of the day, the whole of computer graphics is based on simulating the real thing while cheating as much as possible, as long as the eye can't easily tell the difference


----------



## londiste (Feb 14, 2019)

Spider-Man and God of War are the absolute pinnacle of checkerboarding (and PS4 Pro is claimed to have hardware support for it).


----------



## bug (Feb 14, 2019)

londiste said:


> Spider-Man and God of War are the absolute pinnacle of checkerboarding (and PS4 Pro is claimed to have hardware support for it).


I wouldn't know about that, I don't have a console.
But it's the first review I've come across to talk about DLSS during actual game play. Granted, Metro titles have always had awesome graphics, but it seems DLSS _can_ work rather nicely.
Also, I'm not sure who does the actual DLSS training. If it's Nvidia, we can expect some level of consistency between titles. If it's up to individual developers, then we can expect some devs will try to get away with less optimization by forcing DLSS into reducing more details.


----------



## Vayra86 (Feb 14, 2019)

Hm! Very much as I expected. Completely lackluster DLSS and RTX features, and a fantastic game. Looks like I'm set with rasterization for the coming few years. I see absolutely no advantage in this RTX implementation. Some scenes are way too dark and others look like they lack ambient occlusion. The outdoor scenes are also not great to look at; it feels like Fallout 4, too much color saturation, and everything is blue-tinted.

One thing is clear: RTX completely screws up the color balance/realism of the scene compared to the RTX-off comparisons... as we've already been able to gather from the previews of indoor locations as well. It looks like the fake HDR in ReShade, except now you get four times the performance hit. Well played  If I want the 'extra immersion' from darkened scenes, I'll drop my gamma curve a bit... same effect, 0% perf cost.

And then DLSS... wow. Nice way to lose all definition so you can live with the illusion of playing at a high res. It's a blurfest; I'll take TAA over this junk any day. That is, if you even have the choice and are in the lucky 'Yes' box, for that *exact* game, at that *exact* res, with that *exact* GPU. It'd be hilarious if it weren't such a sad attempt to add value to a terrible GPU gen..


----------



## erixx (Feb 14, 2019)

*day one patch*
*PC Specific Updates:*

Tuned HDR saturation
RTX Improvements / Bug Fixing
Added DLSS Support
Additional HUD removal options (coming soon to console) when playing in Ranger Mode
Added Motion Blur options (coming soon to console)
 
*Additional PC Fixes Since Review Code Was Sent:*

Fixed Locking of player input after scene of rescuing Yermak
Removed v-sync option from benchmark launcher
Tuning and fixes for Atmos audio system
Fixed memory corruption in DX12
Fixed crash on launching game on old AMD CPUs
Fixed crash after changing of resolution and Shading Rate to 4k/4x on Video cards with 2Gb and less
Fixed blurred UI when DLSS is enabled
Fixed visual artifacts for RTX high mode
Fixed input lock in when patching gas mask during combat
Fixed forcing to V-Sync on after alt-tabbing the game running at maximal available resolution for the monitor
Fixed crash when pressing ALT + Tab during start videos
Fixed forcing of V-SYNC mode if the game resolution is different than the desktop
DLSS can be applied in the Benchmark
Tuned DLSS sharpness to improve image quality
Updated learned data for DLSS to improve image quality with DLSS on


----------



## W1zzard (Feb 14, 2019)

erixx said:


> Fixed blurred UI when DLSS is enabled


That's an important one for DLSS users



erixx said:


> Fixed forcing to V-Sync on after alt-tabbing the game running at maximal available resolution for the monitor


Most important one for me, it was so annoying having to restart the game all the time



erixx said:


> Tuned DLSS sharpness to improve image quality
> Updated learned data for DLSS to improve image quality with DLSS on


Interesting, we'll see how that turns out


----------



## 50eurouser (Feb 14, 2019)

Hairworks has a huge performance impact on AMD GPUs; with it disabled, they are on par with similar Nvidia GPUs (1060 vs. RX 580).


----------



## Super XP (Feb 15, 2019)

Nvidia is getting some serious flak, even from so-called Nvidia fanboys, about how terrible the picture quality is when enabling that feature called DLSS. They go as far as to claim the RTX series is the worst GPU series launch in a long time. A game like Metro Exodus must be played with the utmost picture quality. I am sure my Radeon RX 580 will do the trick well enough. 

These were also posted on Reddit and forwarded to Nvidia tech support, I believe. lol

*Here is more proof DLSS makes the image look washed out, despite upping the picture quality settings to maximum. There is something seriously wrong with this so-called enhancement. Live on YouTube.*
DLSS Problems


----------



## Frutika007 (Feb 15, 2019)

SystemMechanic said:


> TPU. Please fix ur benchmarks. First Anthem and now this...how is guru3d getting 20fps lower at 4k than yours ???





ArbitraryAffection said:


> I guess different areas of the game tax different bits of the GPU differently or something. I'm just confused that TPU has the 3GB 1060 beating 580 when Guru3D has it a hair off the 1070.



Because TPU is testing the actual game while Guru3D is stuck on the demo benchmark. Gamers Nexus just released a video about Metro Exodus and got similar results to TPU.


----------



## MaZz7 (Feb 15, 2019)

When the RX 580 8GB loses to the 1060 3GB, you know who really sponsored this game; they nerfed the game so much that the red team really gets f*^%&d up  Nvidia at its best


----------



## Frutika007 (Feb 15, 2019)

MaZz7 said:


> When the RX 580 8GB loses to the 1060 3GB, you know who really sponsored this game; they nerfed the game so much that the red team really gets f*^%&d up  Nvidia at its best



Hahaha, nice joke. I didn't see you complaining when the RX 570 4GB beat the GTX 1060 6GB in Resident Evil 2 (an AMD-sponsored and heavily AMD-gimped game). They nerfed the game so much that the green team really got f*^%&d up. But I didn't see any Nvidia fan crying back then; whenever Nvidia does this, AMD fanboys start a rebellion with their pitchforks.


----------



## xkm1948 (Feb 15, 2019)

R4WN4K said:


> Hahaha, nice joke. I didn't see you complaining when the RX 570 4GB beat the GTX 1060 6GB in Resident Evil 2 (an AMD-sponsored and heavily AMD-gimped game). They nerfed the game so much that the green team really got f*^%&d up. But I didn't see any Nvidia fan crying back then; whenever Nvidia does this, AMD fanboys start a rebellion with their pitchforks.



Fan logic from AMD side, what else do you expect?

Also, I wonder what the fans are gonna say when AMD and maybe Intel start implementing similar AI-accelerated AA through DirectML. Will there be a similar level of crying from the fanboys then?

Source:

https://www.guru3d.com/news-story/a...tive-with-radeon-vii-though-directml-api.html
https://wccftech.com/amd-radeon-vii-excellent-result-directml/


----------



## moproblems99 (Feb 15, 2019)

Mescalamba said:


> Critics review is meaningless. Only players reviews count. Since you cant usually bribe players. Same for Rotten Tomatoes.



So, uh, how valuable are those user scores right now?


----------



## dirtyferret (Feb 15, 2019)

ArbitraryAffection said:


> I'm sorry but 3GB is not sufficient even for 1080p with high textures. Metro games were always frugal with VRAM (and looked great despite it). The 1060 3GB is a disgrace of a card, and so is the upcoming 1660 Ti 3GB. People that buy these cards deserve to get burned in newer games. ~



huh?  From what I've seen the GTX 1060 3GB is able to run most games at very good settings at 1080p, including Metro Exodus. The upcoming 1660 Ti 3GB should be even better at 1080p. Those cards fit a specific price point and customer. Frankly, I would rather have a faster GPU with less RAM than a slower GPU with more RAM than it can handle (hello, AMD 570 8GB card). Now, I would argue the GTX 1060 3GB is overpriced and needs to be between the AMD 570 and AMD 580 4GB in price, but I would also say most of Nvidia's cards are overpriced compared to AMD's offerings.  

Obviously the GTX 1060 6GB and GTX 1660 Ti 6GB are/will be better cards than their 3GB namesakes, but people won't always want to spend that money if they don't have the need or desire for that hardware.


----------



## bug (Feb 15, 2019)

dirtyferret said:


> huh?  From what I've seen the GTX 1060 3GB is able to run most games at very good settings at 1080p, including Metro Exodus. The upcoming 1660 Ti 3GB should be even better at 1080p. Those cards fit a specific price point and customer. Frankly, I would rather have a faster GPU with less RAM than a slower GPU with more RAM than it can handle (hello, AMD 570 8GB card). Now, I would argue the GTX 1060 3GB is overpriced and needs to be between the AMD 570 and AMD 580 4GB in price, but I would also say most of Nvidia's cards are overpriced compared to AMD's offerings.
> 
> Obviously the GTX 1060 6GB and GTX 1660 Ti 6GB are/will be better cards than their 3GB namesakes, but people won't always want to spend that money if they don't have the need or desire for that hardware.


I wonder if anyone remembers the days when less memory meant lower-latency memory and could result in faster cards 
Also, when you buy a $150 video card and expect to max out the latest AAA titles, the problem is not the video card.


----------



## Mescalamba (Feb 15, 2019)

moproblems99 said:


> So, uh, how valuable are those user scores right now?



An exception for once. Review bombing isn't the norm. Nor should it be.

Besides, if someone isn't an idiot, they will figure out where the truth is too... Using the internet, especially today, does require some common sense.


----------



## mobiuus (Feb 15, 2019)

What is the difference between the Ultra and Extreme graphics settings?
Besides the fps drop I can't notice any visible change...
Perhaps higher antialiasing?


----------



## Captain_Tom (Feb 16, 2019)

ArbitraryAffection said:


> It's not just about Vram though. 580 is a faster card than 3GB 1060 in most things



There is no point in trying to reason with that guy lol. He thinks Infinity Fabric is a failure... and in fact he thinks he is so right that he doesn't see the folly of putting people who disagree with him in his signature.


----------



## erixx (Feb 16, 2019)

I hope to see (somewhere) numbers for the 1080 Ti SLI vs the new topend card...

With ultra settings, 4K, and G-Sync, 60Hz, I get mostly 55 FPS.


----------



## Frutika007 (Feb 16, 2019)

Gamers Nexus got 79 fps at 1080p Ultra (RTX High) while TPU got only 59 fps at the same settings with the RTX 2060. And people are saying TPU is showing more fps than actual and calling them Nvidia shills?? LOL. It's the complete opposite: TPU is showing lower fps than what others actually got.


----------



## Super XP (Feb 16, 2019)

R4WN4K said:


> Gamers Nexus got 79 fps at 1080p Ultra (RTX High) while TPU got only 59 fps at the same settings with the RTX 2060. And people are saying TPU is showing more fps than actual and calling them Nvidia shills?? LOL. It's the complete opposite: TPU is showing lower fps than what others actually got. View attachment 116567


Are test systems identical?


----------



## erixx (Feb 16, 2019)

My FPS numbers are just subjective, from watching the value during gameplay. I just upped the settings to Extreme and FPS lowered to 40-50. Now it starts to feel uncomfortable. Ultra was nice (no waves of enemies). 
I hear SLI calling. I found a used normal EVGA SC2 1080 Ti for 570 euros, lol


----------



## bug (Feb 16, 2019)

Super XP said:


> Are test systems identical?


TPU uses the i7-8700k and 16 GB of RAM, GN uses the i7-8086k and 32 GB of RAM.
The system doesn't matter much though, this is GPU bottlenecked scenario. The answer you're looking for is both sites have tested in-game footage which is obviously different. GN usually uses pre-baked benchmarks (and they say they will revert to that), they just made an exception this time because they were actually looking at pre-baked vs real game play differences. They found out the built-in is unrealistically heavy. That's ok in my book, but that wasn't @R4WN4K's point.


----------



## EarthDog (Feb 16, 2019)

Super XP said:


> Are test systems identical?


The test system doesn't make that much of a difference unless one of them was a potato.


----------



## jabbadap (Feb 16, 2019)

Steve does not use HairWorks; W1z does.


----------



## erixx (Feb 17, 2019)

I switched from DX12 to DX11, and the fish in the far-away river look much better from a tower (rephrasing: they looked bad in DX12 and look as they should in DX11).
I got 58 fps.
Then, under DX11, I lowered the shading rate from 1.0 to 0.8 and now I have a stable 60 fps (V-Sync and G-Sync on, 1080 Ti with a light OC).
Then back again to DX12 with shading rate 0.8, and it stays at a fixed 60 FPS and the fish look good... whatever it was (maybe it was the Extreme setting). I'll keep it for a while.


----------



## bug (Feb 17, 2019)

erixx said:


> I switched from DX12 to DX11, and the fish in the far-away river look much better from a tower (rephrasing: they looked bad in DX12 and look as they should in DX11).
> I got 58 fps.
> Then, under DX11, I lowered the shading rate from 1.0 to 0.8 and now I have a stable 60 fps (V-Sync and G-Sync on, 1080 Ti with a light OC).
> Then back again to DX12 with shading rate 0.8, and it stays at a fixed 60 FPS and the fish look good... whatever it was (maybe it was the Extreme setting). I'll keep it for a while.


Ah, the lost art of looking at other settings besides presets. Respect, sir.


----------



## erixx (Feb 17, 2019)

I ended up totally seasick after playing for several hours, so tweaking is -still- a must, sir!


----------



## Nkd (Feb 18, 2019)

Are you using the built-in benchmark to run the test? I believe in-game you get more, but the built-in benchmark gave me 81 fps at 1440p Ultra without any RTX stuff on my 2080 Ti, and a little over 60 fps with RTX High. That is with the built-in benchmark, though; it does seem to be on the heavy side, a worst case.


----------



## W1zzard (Feb 18, 2019)

Nkd said:


> Are you using the built-in benchmark to run the test? I believe in-game you get more, but the built-in benchmark gave me 81 fps at 1440p Ultra without any RTX stuff on my 2080 Ti, and a little over 60 fps with RTX High. That is with the built-in benchmark, though; it does seem to be on the heavy side, a worst case.


As mentioned several times before, I do not use the in-game benchmark; I'm using actual gameplay (in the open world).


----------



## Mescalamba (Feb 18, 2019)

W1zzard said:


> As mentioned several times before, I do not use the in-game benchmark; I'm using actual gameplay (in the open world).



I know you probably don't need confirmation, but what you're doing is best. Benchmarks in games are often not very telling in terms of what actual gameplay looks like.


----------



## bug (Feb 18, 2019)

Mescalamba said:


> I know you probably don't need confirmation, but what you're doing is best. Benchmarks in games are often not very telling in terms of what actual gameplay looks like.


Built-in benchmarks aren't meant to be that telling (one could argue recorded bits aren't either). The point of built-in benchmarks is that they're repeatable.


----------



## EarthDog (Feb 18, 2019)

bug said:


> Built-in benchmarks aren't meant to be that telling (one could argue recorded bits aren't either). The point of built-in benchmarks is that they're repeatable.


People have lost sight of the fact that integrated benchmarks aren't direct analogs of gameplay anyway. What they do provide is an empirical testing method whose results are 100% repeatable and show relative performance across a group of cards. They shouldn't be translated directly to in-game FPS, regardless.


----------



## bug (Feb 18, 2019)

EarthDog said:


> People have lost sight of the fact that integrated benchmarks aren't direct analogs of gameplay anyway. What they do provide is an empirical testing method whose results are 100% repeatable and show relative performance across a group of cards. They shouldn't be translated directly to in-game FPS, regardless.


I'd say built-in benchmarks _are_ representative of gameplay. It's just that they tend to be a worst-case scenario; otherwise, the developer would stand accused of trying to make their game seem to perform better than it actually does. I don't have a problem with that, but people don't keep all that in mind; they tend to just look at the numbers. If they don't like the numbers, it's either the reviewer's fault (somehow) or the other camp has paid to gimp the game on their video cards. It's so predictable it's not even boring anymore.


----------



## EarthDog (Feb 18, 2019)

I think it gives a good idea in general... close enough, horseshoes-and-hand-grenades style, but the real takeaway is the relative distance between cards in each benchmark.



bug said:


> It's so predictable it's not even boring anymore.


It isn't always like that... it depends on where (what forum) you lay your head.


----------



## bug (Feb 19, 2019)

EarthDog said:


> It isn't always like that... it depends on where (what forum) you lay your head.



I admit I don't frequent that many forums. But the ones I do are like that.


----------



## megaclite (Feb 19, 2019)

ArbitraryAffection said:


> Also Guru3D have Radeons doing MUCH better, I'm so confused:


Because TPU is fake


----------



## londiste (Feb 19, 2019)

Interview with 4A guys has some nice details and thoughts:
Tech Interview: Metro Exodus, ray tracing and the 4A Engine's open world upgrades (EuroGamer Digital Foundry)


> Ray tracing has two quality settings: high and ultra. Ultra setting traces up to one ray per pixel, with all the denoising and accumulation running in full. The high setting traces up to 0.5 rays per pixel, essentially in a checkerboard pattern, and one of the denoising passes runs as checkerboard. We recommend high for the best balance between image quality and performance, but please note that we are still experimenting a lot, so this information is valid only at the time of writing.
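
The ray budgets quoted above can be illustrated with a toy sketch (my own illustration, assuming a simple alternating checkerboard; this is not 4A Engine code): "ultra" traces a ray for every pixel each frame, while "high" selects half the pixels per frame and flips the pattern on the next frame, so coverage averages out to 0.5 rays per pixel.

```python
def traces_ray(x: int, y: int, frame: int, quality: str) -> bool:
    """Return True if a primary RT ray is traced for pixel (x, y) this frame."""
    if quality == "ultra":
        return True  # up to one ray per pixel, every frame
    # "high": checkerboard selection -> half the pixels per frame;
    # the phase flips each frame so every pixel gets refreshed over time
    return (x + y + frame) % 2 == 0

def rays_per_pixel(width: int, height: int, frame: int, quality: str) -> float:
    """Average number of rays traced per pixel for one frame."""
    traced = sum(
        traces_ray(x, y, frame, quality)
        for y in range(height)
        for x in range(width)
    )
    return traced / (width * height)
```

For an 8x8 tile, `rays_per_pixel(8, 8, 0, "high")` works out to 0.5 and `rays_per_pixel(8, 8, 0, "ultra")` to 1.0, matching the two quality levels described in the interview; the denoiser then fills in the untraced pixels.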


----------



## bug (Feb 19, 2019)

londiste said:


> Interview with 4A guys has some nice details and thoughts:
> Tech Interview: Metro Exodus, ray tracing and the 4A Engine's open world upgrades (EuroGamer Digital Foundry)


This pretty much sums up the whole discussion about RTX: it's in its infancy. If you want to be an early adopter (and can afford the asking price), go ahead and show your support. If you like things ironed out for you, then wait.
But no, this level of common sense cannot be expected on the Internet. We need to go all SJW and point out how companies are evil and such instead.


----------



## dirtyferret (Feb 19, 2019)

EarthDog said:


> People have lost sight of the fact that integrated benchmarks aren't direct analogs of gameplay anyway. What they do provide is an empirical testing method whose results are 100% repeatable and show relative performance across a group of cards. They shouldn't be translated directly to in-game FPS, regardless.


+1
It's similar to all those "which CPU should I buy" threads.   _Should I get the CPU that's faster in synthetic benchmarks, or the one that's just as fast or faster in virtually every real-world scenario?_   I don't know, are you building a PC that only runs synthetic benchmarks?


----------



## Super XP (Feb 19, 2019)

bug said:


> This pretty much sums up the whole discussion about RTX: it's in its infancy. If you want to be an early adopter (and can afford the asking price), go ahead and show your support. If you like things ironed out for you, then wait.
> But no, this level of common sense cannot be expected on the Internet. We need to go all SJW and point out how companies are evil and such instead.


Lest We Forget.... 

https://www.reddit.com/r/AMD_Stock/comments/8rbdtb

More like self-induced by Nvidia's gimmickry, if anything.


----------



## bug (Feb 19, 2019)

Super XP said:


> Lest We Forget....
> 
> https://www.reddit.com/r/AMD_Stock/comments/8rbdtb
> ...


Man, you've shown me


----------



## londiste (Feb 19, 2019)

Super XP said:


> Lest We Forget....
> 
> https://www.reddit.com/r/AMD_Stock/comments/8rbdtb
> More like self-induced by Nvidia's gimmickry, if anything.


This video is honestly too long to thoroughly analyze and comment on, but remember that it is also a very biased, one-sided video. The facts are not wrong, but other facts are purposefully neglected, and everything is presented from a certain angle. Jim is a pretty good speaker and, as usual with videos, he can simply talk you to death.


----------



## Frutika007 (Feb 19, 2019)

For those who were saying TPU is wrong: Techspot/Hardware Unboxed's benchmark results are up, and they are very close to TPU's. Also keep in mind that TPU tested the day-one version, while Techspot/Hardware Unboxed waited and tested the patched version - https://www.techspot.com/review/1795-metro-exodus-benchmarks/


----------



## Mescalamba (Feb 19, 2019)

I doubt TPU is wrong, since I saw a review that shows pretty much exactly the same thing, and it comes from an ATi-fanboy-run webpage that does "reviews". Even there, the Radeon VII barely touches the RTX 2080.

The only thing I can come up with is that not all Radeon VIIs are the same, and not everyone is using exactly the same hardware around them. And since modern GPUs boost their clocks largely based on how good a sample you get in the silicon lottery, that can mess with results a bit.


----------



## OneMoar (Feb 22, 2019)

The only YouTube reviewer I trust for benchmarks is GN, and his results agree with TPU's.
Y'all can take your nobody YouTubers and stuff it.

Let me remind you of the hardware review pecking order:

review sites are first, always have been and always will be; YouTubers are always second, since they get their info from sites like TechPowerUp, AnandTech, JonnyGURU, Guru3D, etc., etc.

I just finished Metro not 10 minutes ago, and I played the entire thing with RTX off. I found that with a properly calibrated gamma curve, RTX provides very little and tends to make things overly dark, and the extra shadows are not worth the 40% hit in fps. After an hour I forgot it existed and never bothered switching it back on.


----------



## Vlada011 (Mar 5, 2019)

4K resolution at High settings vs. 1080p with RTX: I choose 4K, always.
4K resolution at 60 fps vs. 1440p at 100 fps: I choose 4K, always.


Why do they even show us effects at 1080p when people can't enable them at 3840x2160?
Why, when in previous years they showed us how clear and nice the picture was at such a high resolution compared to 1080p, with no need for filters?
Now they show us effects playable only at 1080p... and who swallows such things? I'm curious to meet such people.

For normal people the reaction would be: Oooh, nice, great... Come on, NVIDIA, hurry up, we want playable fps at 4K with those effects and high settings.
Until then, no money for you. Not until they develop it more, until games adopt it, until we get clear evidence that this won't end up like PhysX, because PhysX was closer to reality than RTX, and without serious improvement in that direction there is no simulation of reality, and again they did nothing with it for years.


----------

