# Deus Ex: Mankind Divided Performance Analysis



## W1zzard (Aug 23, 2016)

Deus Ex: Mankind Divided has just been released today. We bring you a performance analysis using the most popular graphics cards at four resolutions, including 4K, at both Ultra and High settings. We also took a closer look at VRAM usage.



----------



## Ferrum Master (Aug 23, 2016)

Well... one thing is sure... I won't buy the game now, the thing is too demanding for my system.


----------



## qubit (Aug 23, 2016)

Damn, I need to upgrade my 780 Ti SLI just because of the 3GB RAM limitation. I'm even noticing it at 1080p in CoD: Advanced Warfare when I turn the quality settings up, especially anti-aliasing, and that's a 2014 game. Basically, the cards run out of RAM before they run out of performance. Quite sad to have to consider replacement just for this, really.


----------



## Ferrum Master (Aug 23, 2016)

qubit said:


> have to consider replacement just for this, really.



yeah, next year maybe there will be a capable card.


----------



## Dammeron (Aug 23, 2016)

Deus Ex: MD - game about vampires...

Just watch the benchmark video - they don't have shadows!!!


----------



## rtwjunkie (Aug 23, 2016)

@W1zzard  Thanks for the Performance tests! 

I see one error.  You list the 980Ti as 8GB in the charts.

Looks like I will have to stick to "High" and not "Ultra" with my 980Ti so I can play at 60fps.  Should still look awesome though!


----------



## red_stapler (Aug 23, 2016)

Yeah, this probably isn't going to run too well on my 7950.


----------



## W1zzard (Aug 23, 2016)

rtwjunkie said:


> I see one error. You list the 980Ti as 8GB in the charts.


Fixed, thanks!


----------



## ZeppMan217 (Aug 23, 2016)

I love these game reviews!

Also, at High settings, RX480 beats GTX1060 by 10 FPS and 7 at Ultra!? Was DXMD made with some AMD voodoo?


----------



## G33k2Fr34k (Aug 23, 2016)

ZeppMan217 said:


> I love these game reviews!
> 
> Also, at High settings, RX480 beats GTX1060 by 10 FPS and 7 at Ultra!? Was DXMD made with some AMD voodoo?



It's for the same reason that the GTX 1060 beats the RX 480 in Rise of the Tomb Raider or The Witcher 3, IMO. AMD's GCN cards excel at general compute tasks since they have relatively more SIMD FP units than Nvidia's chips. So when a lot of the post-processing and simulation effects are done using custom shaders that utilize these SIMD units, AMD's cards tend to do better.

The GTX 1060 does much worse against the RX 480 in Doom using the Vulkan renderer.


----------



## Joss (Aug 23, 2016)

Thank you so much, wish we had (many) more of these.


----------



## Footman (Aug 23, 2016)

Hmm, I was waiting for this game. Looks like I'll need to sell my watercooled 980Ti and buy a 1080 to get decent frame rates at 1440!!! Argg.....


----------



## Captain_Tom (Aug 23, 2016)

ZeppMan217 said:


> I love these game reviews!
> 
> Also, at High settings, RX480 beats GTX1060 by 10 FPS and 7 at Ultra!? Was DXMD made with some AMD voodoo?



Nvidia cards age horribly, and the 480 was 32% stronger than the 1060 in Vulkan - these are pretty much the exact results I expected.  

What is crazy to me is that the 1070 is only like 20% stronger than the 480 in DX11!!!  In DX12 they may be equal (Hence why AMD doesn't think it needs ultra cards yet).



Footman said:


> Hmm, I was waiting for this game. looks like I'll need to sell my watercooled 980ti and buy a 1080 to get decent frame rates at 1440!!! Argg.....



The 1080 that won't even be stronger than the Fury X when the DX12 patch hits?!   Frankly speaking you are better off waiting for the next Fury or just dealing with lowered settings.


----------



## Footman (Aug 23, 2016)

Captain_Tom said:


> The 1080 that won't even be stronger than the Fury X when the DX12 patch hits?!   Frankly speaking you are better off waiting for the next Fury or just dealing with lowered settings.


Hmm, that's food for thought.


----------



## Captain_Tom (Aug 23, 2016)

Footman said:


> Hmm, that's food for thought.



Yeah no card out right now has caught up with the latest graphical demands :-/


----------



## kaspar737 (Aug 23, 2016)

Captain_Tom said:


> Nvidia cards age horribly, and the 480 was 32% stronger than the 1060 in Vulkan - these are pretty much the exact results I expected.
> 
> What is crazy to me is that the 1070 is only like 20% stronger than the 480 in DX11!!!  In DX12 they may be equal (Hence why AMD doesn't think it needs ultra cards yet).


You do know this game uses the same engine as Hitman right?


----------



## Footman (Aug 23, 2016)

So what comes after Polaris (480) then?


----------



## Lightofhonor (Aug 23, 2016)

Footman said:


> So what comes after Polaris (480) then?


481?

I'd expect a 490 by the end of the year, hopefully sooner.


----------



## Captain_Tom (Aug 23, 2016)

kaspar737 said:


> You do know this game uses the same engine as Hitman right?



Haha, and? It's also the latest game with some of the best graphics, and again, it isn't even using DX12 yet (Hitman only got these results with DX12 in use).

Maybe BF1 will prove that this isn't the new normal, but I highly doubt that.



Lightofhonor said:


> 481?
> 
> I'd expect at 490 by the end of the year, hopefully sooner.



490 will almost definitely be out by the end of the year, but I think it will be another GDDR card; unfortunately I don't want anything but Vega with HBM...


----------



## $ReaPeR$ (Aug 23, 2016)

this is funny.. hilarious to be honest!


----------



## dyonoctis (Aug 23, 2016)

Captain_Tom said:


> 490 will almost definitely be out by the end of the year, but I think it will be another GDDR card; unfortunately I don't want anything but Vega with HBM...


So far it looks like RX 400 will only consist of Polaris (10, 11) and Vega (10, 11 too). Since we know that the RX 490 is a confirmed name, it will likely be Vega (hence with HBM), since AMD didn't talk about a third family of chips.


----------



## 64K (Aug 23, 2016)

It's good to see a game putting a real challenge to current GPUs. Necessity is the mother of invention and we are going to need more powerful GPUs for max at 1440p and certainly at 4K if this continues. Time to bump up the performance a notch or two Nvidia/AMD.


----------



## Footman (Aug 23, 2016)

I have some time to wait and see what Vega brings.


----------



## INSTG8R (Aug 23, 2016)

WOW! I can't even get it to start... I hit "play" and it just crashes....I'm not even gonna look at the performance charts it will probably depress me even more...


----------



## rtwjunkie (Aug 23, 2016)

INSTG8R said:


> WOW! I can't even get it to start... I hit "play" and it just crashes....I'm not even gonna look at the performance charts it will probably depress me even more...



Now I'm curious, with my end goal merely to see if it starts.


----------



## INSTG8R (Aug 23, 2016)

rtwjunkie said:


> Now I'm curious, with my end goal merely to see if it starts.


Yeah, it's an ugly mess right now with no explanations. Nixxes has been on the Hub trying to collect logs etc. The "solutions" have ranged from turning off the DLC to creating a new User Account. I have tried all the "fixes" except the User Account thing; I refuse to do that for one game, and it's just as hit and miss as all the other "fixes". Ironically, there is a FuryX user who was playing just fine, but my Fury goes nowhere....


----------



## rtwjunkie (Aug 23, 2016)

ok.....despite preloading with some 20GB, it is downloading 22.8GB today. WTF???


----------



## Fumero (Aug 23, 2016)

46fps in 1440p ultra settings with a gtx 1080, WTF?


----------



## yogurt_21 (Aug 23, 2016)

So this is mostly DX11 atm, so why is the 480 faster than the 1060?


----------



## Fluffmeister (Aug 23, 2016)

Wow, performance sucks right across the board. Luckily I have 3000+ games to play before I pick up the GOTY edition for 5 bucks.


----------



## INSTG8R (Aug 24, 2016)

rtwjunkie said:


> ok.....despite preloading with some 20GB, it is downloading 22.8GB today. WTF???


Mine was 46GB with all the DLC


----------



## Vayra86 (Aug 24, 2016)

These Fury X numbers... are staggering.

AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?


----------



## INSTG8R (Aug 24, 2016)

Vayra86 said:


> These Fury X numbers... are staggering.
> 
> AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?



Shhh...My Fury won't even launch....


----------



## btarunr (Aug 24, 2016)

We're going to have Doom Vulkan numbers soon. Also, we'll be switching to DX12 on Hitman.


----------



## ShurikN (Aug 24, 2016)

Vayra86 said:


> These Fury X numbers... are staggering.
> 
> AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?


Not a surprise since it uses more or less the same engine as Hitman (2016), both are derivatives of Glacier 2. And since we know for a fact how well AMD performs in Hitman, this one was a given.


----------



## INSTG8R (Aug 24, 2016)

And they haven't even put DX12 in yet so that should be quite interesting. I know Hitman runs great for me in DX12


----------



## Gabkicks (Aug 24, 2016)

wow  D: this game is kicking my gtx 1080's ass at ultra 1080p


----------



## arbiter (Aug 24, 2016)

Captain_Tom said:


> Nvidia cards age horribly, and the 480 was 32% stronger than the 1060 in Vulkan - these are pretty much the exact results I expected.
> 
> What is crazy to me is that the 1070 is only like 20% stronger than the 480 in DX11!!!  In DX12 they may be equal (Hence why AMD doesn't think it needs ultra cards yet).


That is the case 'til you start pairing that 480 with the weaker CPU it's most likely gonna be paired with; then the 1060 takes over in performance.



yogurt_21 said:


> so this is mostly dx 11 atm , so why is the 480 faster than the 1060?



AMD sponsored game like AOTS. Game is optimized for AMD cards.



btarunr said:


> We're going to have Doom Vulkan numbers soon. Also, we'll be switching to DX12 on Hitman.


I think, at least for mid to low range cards like the 480/1060, they need to be tested with the CPUs more likely to be used with them as well, to show how they stack up, instead of a top-end CPU that is not likely to be used.

Reason I ask is another site had a graph showing that a 480/1060 paired with a weaker CPU instead of a top-end Intel seemed to suffer a bit in FPS. It would be valid to show what a person could expect with a less powerful CPU. http://i.imgur.com/JF7ngP5.png


----------



## rtwjunkie (Aug 24, 2016)

Alright, this shit is embarrassing. I just ran the in-game benchmark on the system in my specs with High settings, modified by checking the box for tessellation, unchecking motion blur, and adding 4xMSAA.

27.1fps average. 17.8 minimum, and 30.2 maximum.

EDIT: Reducing shadows overall to medium, while turning on contact shadow hardening, turning off MSAA, and using temporal anti-aliasing produces much better results with no noticeable drop in image quality.

59.7fps average, 50.9 minimum, and 61.3 maximum. All this is at a mere 1080p too.


----------



## Captain_Tom (Aug 24, 2016)

dyonoctis said:


> So far it looks like RX 400 will only consist of Polaris (10, 11) and Vega (10, 11 too). Since we know that the RX 490 is a confirmed name, it will likely be Vega (hence with HBM), since AMD didn't talk about a third family of chips.



There are actually a few inconsistencies that have made me wonder what AMD will do in the coming months.  Here's what we know:

1) AMD's roadmap shows ALL polaris in 2016

2) AMD's roadmap shows ALL vega in 2017

3) Rumors would suggest that AMD is indeed launching a more powerful card before 2017, and common sense would back this up.  After all Vega will AT LEAST be a 4096 SP HBM2 card, and that would be 2x stronger than the 480.  They will need 2-3 cards in-between the 480 and that monster.   

4) 256-bit GDDR5X or 384-bit GDDR5 would provide enough bandwidth for a much more powerful core than the 480.  

Thus I think we have another GDDR card on the way, and it would make a ton of sense.  Who cares about efficiency when DX12 games will make AMD nearly as efficient as Nvidia?  My money is on a 12GB 384-bit GDDR5 card coming out with 3500 SP's.  It would probably match the 1080 in the latest games while having the same amount of memory as the Titan, and after all it would be cheap to make.
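As a rough sanity check of the bandwidth claim above (the data rates below are assumed for illustration, not confirmed specs for any card):

```python
# Back-of-the-envelope GPU memory bandwidth:
# bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
# The data rates used here are illustrative assumptions.
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RX 480 reference: 256-bit GDDR5, assuming 8 Gbps
print(mem_bandwidth_gbs(256, 8.0))   # 256.0 GB/s
# Hypothetical 384-bit GDDR5 at 8 Gbps
print(mem_bandwidth_gbs(384, 8.0))   # 384.0 GB/s
# Hypothetical 256-bit GDDR5X at 10 Gbps
print(mem_bandwidth_gbs(256, 10.0))  # 320.0 GB/s
```

Either hypothetical option lands well above the 480's bandwidth, which is the point being made.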



rtwjunkie said:


> Alright, this shit is embarrassing. I just ran the in-game benchmark using my system specs system with high settings, modified by checking box for tesselation, unchecking motion blur, and adding 4xMSAA.
> 
> 27.1fps average. 17.8 minimum, and 30.2 maximum.
> 
> ...



I can't tell if you are complaining or not.  My good sir you possess a 128-bit card with a paltry 2.4 TF.  That's barely better stats than the current $110 RX 460.  Why would you expect any better?


----------



## rtwjunkie (Aug 24, 2016)

Captain_Tom said:


> I can't tell if you are complaining or not.  My good sir you possess a 128-bit card with a paltry 2.4 TF.  That's barely better stats than the current $110 RX 460.  Why would you expect any better?



um no. System specs is a 980Ti. I got it sorted to something more reflective of W1zzard's results.


----------



## Captain_Tom (Aug 24, 2016)

rtwjunkie said:


> um no. System specs is a 980Ti. I got it sorted to something more reflective of W1zzard's results.



My bad, I read 960 in your signature. Sorry your 980 Ti is performing like my overclocked 470. But again, I make the same point: the 980 Ti has 6.1 TFLOPS. Why would it be much stronger than a stock 480? The specs certainly don't suggest it should be...


----------



## pat-roner (Aug 24, 2016)

Rip my 980 ti.

Luckily it should perform better than the stock 980ti though!


----------



## neliz (Aug 24, 2016)

Captain_Tom said:


> There are actually a few inconsistencies that have made me wonder what AMD will do in the coming months.  Here's what we know:
> 
> 1) AMD's roadmap shows ALL polaris in 2016
> 
> ...




There is no HBM2 available for anyone on the market until Q2/Q3 '17. That's one of the reasons you don't see the fabled "P100 SXM2 card" used anywhere.


----------



## Kanan (Aug 24, 2016)

Great results, but seeing that the Fury X scaling doesn't change much in % compared to the GeForce 1080 when increasing from 1080p to 1440p, I highly doubt DX12 will make any difference, because the GPUs are not bottlenecked by DX11. Either that, or all cards will profit from DX12 the same, so the differences will stay the same too. It's basically a highly demanding graphics game; typically these sorts of games don't need DX12 to max the GPU out, somewhat like Crysis (3).
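As a numeric illustration of that scaling argument (the FPS figures below are invented purely for illustration, not from the review):

```python
# If two cards are purely GPU-bound, the ratio of their frame rates stays
# roughly constant as resolution rises. A shrinking gap at low resolution
# would instead hint at a CPU/API bottleneck. Made-up numbers below.
def relative_perf(fps_a: float, fps_b: float) -> float:
    """Frame-rate ratio of card A to card B."""
    return fps_a / fps_b

r_1080p = relative_perf(60.0, 75.0)  # hypothetical card A vs card B at 1080p
r_1440p = relative_perf(40.0, 50.0)  # hypothetical figures at 1440p
print(round(r_1080p, 2), round(r_1440p, 2))  # 0.8 0.8 -> ratio unchanged
```

An unchanged ratio across resolutions is consistent with both cards being fully GPU-limited already under DX11.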


----------



## AndreiD (Aug 24, 2016)

Can we have a test with all of the 'GCN optimized' effects turned off? It would be an interesting exercise to check out how it runs with AO set only to ON or OFF, Contact Hardening Shadows OFF, and Volumetric Lighting ON or OFF.
DE:MD seems to be using most of AMD's GPUOpen libraries, like TressFX, AOFX and ShadowFX; it would be nice if they at least labeled them properly in the settings menu.
PCGH tried to keep it more agnostic: they have CHS (ShadowFX) turned off and AO toned down, and their results are more in line with what's expected.


----------



## Captain_Tom (Aug 24, 2016)

neliz said:


> there is no HBM2 available for anyone on the market, until Q2/Q3 '17. That's one of the reasons you don't see the fabled "P100 SXM2 card" used anywhere.



Link?  

Everything I have read says December - January for first availability.


----------



## Captain_Tom (Aug 24, 2016)

AndreiD said:


> Can we have a test with all of the 'GCN optimized' effects turned off? Would be an interesting exercise to check out how it runs with AO set to only ON or OFF, Contact Hardening Shadows OFF and Volumetric Lighting ON or OFF.
> DE:MD seems to be using most of AMD's GPUOpen libraries, like TressFX, AOFX and ShadowFX, it would be nice if they at least labeled them properly in the settings menu.
> PCGH tried to keep it more agnostic, they have CHS (ShadowFX) turned off and AO toned down and their results are more in line with what's expected.



Game looks like complete garbage with shadow hardening off, so do what YOU want buddy lol


----------



## AndreiD (Aug 24, 2016)

Captain_Tom said:


> Game looks like complete garbage with shadow hardening off, so do what YOU want buddy lol


Well, Hairworks looks better than the default hair in The Witcher 3, but most reviews turn that effect off for fairness' sake. It wouldn't be that hard to turn off vendor-specific effects in AMD Gaming Evolved games for a fair comparison, or at least have some 'agnostic' settings tests included besides the default Ultra/High ones.


----------



## Captain_Tom (Aug 24, 2016)

AndreiD said:


> Well Hairworks looks better than the default hair in The Witcher 3 but most reviews turn that effect off for fairness sake. It wouldn't be that hard to try to turn off vendor specific effects in AMD Gaming Evolved games for a fair comparison, or at least have some 'agnostic' settings tests included besides the default Ultra/High settings ones.



Hairworks looks worse half the time, and only marginally better sometimes (for a massive performance hit on even Nvidia GPUs). TressFX in Tomb Raider, on the other hand, is still the best hair I have ever seen in a game.

Again though, the point is you are basically asking them to run the game on Low. Some games only offer HBAO (no HDAO), and if that is the only option they should (and usually do) bench it when testing Ultra settings.


----------



## AndreiD (Aug 24, 2016)

Captain_Tom said:


> Again though the point is you are basically asking them to run the game on Low.  Some games only offer HBAO (No HDAO), and if that is the only option they should (And usually do) bench it when testing Ultra settings.


 
I've yet to see a game which offers HBAO as the only AO option (please offer examples); most offer SSAO or something else besides vendor-specific AO, and you have to be far gone off the deep end of fanboyism to think that turning 2-3 vendor-specific effects off is 'running the game on low'.
Like I've said above, it would be nice to include 'agnostic' tests besides the default Ultra/High ones, that's all.


----------



## neliz (Aug 24, 2016)

Captain_Tom said:


> Link?
> 
> Everything I have read says December - January for first availability.



My link is working on the side of business that actually builds those systems and drinks with Joe Macri.


----------



## neliz (Aug 24, 2016)

AndreiD said:


> Well Hairworks looks better than the default hair in The Witcher 3 but most reviews turn that effect off for fairness sake. It wouldn't be that hard to try to turn off vendor specific effects in AMD Gaming Evolved games for a fair comparison, or at least have some 'agnostic' settings tests included besides the default Ultra/High settings ones.


And how about games that are optimized with GameWorks? How do you get those to run "agnostic"?

Vendor optimizations have always been a part of the benchmark industry, with both parties crying foul when there's a performance-hampering feature enabled (Batman: AA springs to mind as one of the most recent ones), but it's nearly impossible to do this anymore. With the way both parties have been sponsoring developers, both visibly and under the table, expecting any AAA game that is used as a benchmark to be "agnostic" these days is like believing in Santa Claus.

Since you're talking about $250,000 - $500,000 spent on a single AAA title, you bet they'll always have some kind of competitive advantage in benchmarks, whether it's visually laid out for you or not.


----------



## Assimilator (Aug 24, 2016)

Seems like the perf issues on NVIDIA cards boil down to this being another bad console port. I'm sure there will be a patch out soon to rectify this issue. Once again I'm glad I skipped the preorder hype train.


----------



## renz496 (Aug 24, 2016)

yogurt_21 said:


> so this is mostly dx 11 atm , so why is the 480 faster than the 1060?



This game is using a modified version of the engine used in Hitman. Just look at Hitman: the game is faster on AMD hardware even on DX11. Simply put, this game engine favors the GCN architecture more (just like UE4 works quite well on Nvidia hardware, including Kepler). I have been expecting this since I heard the Dawn Engine is just a modified Glacier 2 engine used in Hitman.


----------



## renz496 (Aug 24, 2016)

Vayra86 said:


> These Fury X numbers... are staggering.
> 
> AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?



The game engine simply favors the GCN architecture. Look at Hitman 2016, for example.


----------



## renz496 (Aug 24, 2016)

Kanan said:


> Great results, but seeing that the Fury X scaling doesn't change much in % compared to the GeForce 1080 when increasing from 1080p to 1440p, I highly doubt DX12 will make any difference, because the GPUs are not bottlenecked by DX11. Either that, or all cards will profit from DX12 the same, so the differences will stay the same too. It's basically a highly demanding graphics game; typically these sorts of games don't need DX12 to max the GPU out, somewhat like Crysis (3).



Probably we are going to see the same result as Hitman? (Almost the same engine.) In Hitman, DX12 does not always result in better FPS, even on Radeon hardware.


----------



## Nokiron (Aug 24, 2016)

The performance is horrendous and it looks quite bad. God dang it Square, don't ruin Deus Ex (again).


----------



## Shatun_Bear (Aug 24, 2016)

The performance of this game favouring AMD cards (or at least the 480) was the reason I asked W1zzard a few weeks ago to update his benchmarking suite of games. Newer titles seem to perform better on the recent AMD cards he has reviewed. Take a few of the old titles off the list (like BF3 or Crysis 3) and include a few titles from this year (this game and DOOM at least), and the performance summary between the 480 and 1060 is very different.

I would be interested to see a performance summary of the 480 vs. 1060 using only recent games. I would wager it would be very close, especially if you include the respective strongest-performing API mode for DOOM (OpenGL for the 1060 and Vulkan for the 480).


----------



## bug (Aug 24, 2016)

INSTG8R said:


> Shhh...My Fury won't even launch....


It's probably so fast it finishes the entire game while you blink


----------



## Frick (Aug 24, 2016)

I really hope there'll be a demo.


----------



## las (Aug 24, 2016)

64K said:


> It's good to see a game putting a real challenge to current GPUs. Necessity is the mother of invention and we are going to need more powerful GPUs for max at 1440p and certainly at 4K if this continues. Time to bump up the performance a notch or two Nvidia/AMD.



The game does not look good, so it's probably only demanding because of bad optimization. This looks downright terrible:


----------



## rtwjunkie (Aug 24, 2016)

Assimilator said:


> Seems like the perf issues on NVIDIA cards boil down to this being another bad console port. I'm sure there will be a patch out soon to rectify this issue. Once again I'm glad I skipped the preorder hype train.



No, I don't believe so. On High, I'm averaging 58fps, and it looks frickin fantastic (I literally cannot understand those who say it doesn't. Screenshots always fuck up how good a game looks). What we have here is not a bad console port, but a very demanding new Crysis 3.


----------



## Liviu Cojocaru (Aug 24, 2016)

Did anyone with SLI give this a go? If yes, how is it performing?


----------



## Frick (Aug 24, 2016)

las said:


> The game does not look good, so it's probably only demanding because of bad optimization. This looks downright terrible:



That's with lowered settings, I assume.


----------



## Nokiron (Aug 24, 2016)

There is a weird hazy feeling that seems really off. It's there no matter which settings I change.

It looks horrible with motion blur, DOF, CA, temporal AA and sharpen turned on.


----------



## Captain_Tom (Aug 24, 2016)

Assimilator said:


> Seems like the perf issues on NVIDIA cards boil down to this being another bad console port. I'm sure there will be a patch out soon to rectify this issue. Once again I'm glad I skipped the preorder hype train.



That is just flat out not true.  It runs incredibly smoothly and is one of the best looking games out right now.


----------



## Captain_Tom (Aug 24, 2016)

rtwjunkie said:


> No, I don't believe so. On High, I'm averaging 58fps, and it looks frickin fantastic (I literally cannot understand those who say it doesn't. Screenshots always fuck up how good a game looks). What we have here is not a bad console port, but a very demanding new Crysis 3.



I think some people who say these silly things are just looking at the benchmark and title screen. Before I actually started PLAYING the game, I thought it looked un-optimized as well. The thing is, the benchmark is truly just so you can compare the performance of cards in a worst-case scenario. It doesn't look cool like the infamous Metro: Last Light benchmark.


----------



## Nokiron (Aug 24, 2016)

Captain_Tom said:


> That is just flat out not true.  It runs incredibly smoothly and is *one of the best looking games out right now.*


Are we playing the same game?


----------



## Assimilator (Aug 24, 2016)

rtwjunkie said:


> No, I don't believe so. On High, I'm averaging 58fps, and it looks frickin fantastic (I literally cannot understand those who say it doesn't. Screenshots always fuck up how good a game looks). What we have here is not a bad console port, but a very demanding new Crysis 3.



Crysis 3 wasn't particularly demanding nor particularly good-looking. That aside, I don't see anything in this game that should make it any more demanding than anything we've seen to date.



Frick said:


> That's with lowered settings, I assume.



It's from this article, so we'd have to ask @W1zzard.


----------



## rtwjunkie (Aug 24, 2016)

Assimilator said:


> Crysis 3 wasn't particularly demanding nor particularly good-looking. That aside, I don't see anything in this game that should make it any more demanding than anything we've seen to date.


Good Sir, did we play the same game?


----------



## alexsubri (Aug 24, 2016)

I have an FX 8370 @ 4.5GHz and a GTX 1070. I have everything on Ultra with Texture Resolution at 4x, MSAA off (1440p 144Hz G-SYNC monitor), and Exclusive Fullscreen. I got an average of 39 FPS in the benchmark. Come the actual gameplay, I completed the first mission with ease and no lag.

In the plane I averaged around 70 FPS.

In the whole first level mission I got about 40-55 FPS.

This is with the latest Nvidia driver update (I did a clean uninstall with the DDU Removal Tool before updating drivers).

Don't know why a lot of people are being salty. This game looks absolutely stunning. Will post pics later tonight.


----------



## yogurt_21 (Aug 24, 2016)

las said:


> The game does not look good, so it's probably only demanding because of bad optimization. This looks downright terrible:


You do realize that's a JPEG, right? Compressed formats won't do justice to in-game. That being said, most screens put against those of The Witcher 3 show that this game has decent graphics, but not quite up to that level. Deus Ex focused more on the characters and buildings, not so much on the background items that can really take game immersion over the top.


----------



## champsilva (Aug 24, 2016)

I made a graphics comparison using a GTX 1080.


----------



## alexsubri (Aug 24, 2016)

champsilva said:


> I made a graphics comparison using a GTX 1080.



Nice video, is this in 1080p or 1440p?


----------



## Captain_Tom (Aug 24, 2016)

Nokiron said:


> Are we playing the same game?



Yes.  Ultra actually means Ultra in this game.  Run it on High, it should double your framerate and look nearly the same.


----------



## Captain_Tom (Aug 24, 2016)

las said:


> The game does not look good, so it's probably only demanding because of bad optimization. This looks downright terrible:


Hahaha lmao that is not how the game looks.  That has to be on Low.


----------



## INSTG8R (Aug 24, 2016)

Captain_Tom said:


> Hahaha lmao that is not how the game looks.  That has to be on Low.


Agreed. They put out a patch already, so now I can play. High @ 1440, no MSAA; still kinda testing, but it looks great to me, and 60-70fps in the first level. Looks way better than that tho.


----------



## ZeppMan217 (Aug 24, 2016)

AndreiD said:


> Can we have a test with all of the 'GCN optimized' effects turned off? Would be an interesting exercise to check out how it runs with AO set to only ON or OFF, Contact Hardening Shadows OFF and Volumetric Lighting ON or OFF.
> DE:MD seems to be using most of AMD's GPUOpen libraries, like TressFX, AOFX and ShadowFX, it would be nice if they at least labeled them properly in the settings menu.
> PCGH tried to keep it more agnostic, they have CHS (ShadowFX) turned off and AO toned down and their results are more in line with what's expected.


If I'm reading their graph right, it doesn't change anything: RX480 is still 5 FPS faster than a GTX1060; there's a noteworthy thing though - RX480 is 15 FPS faster than the castrated GTX1060 3 GB. Perhaps Nvidia overdid it with the cutting?


----------



## Kanan (Aug 24, 2016)

The cut-down 1060 is to go against the RX 470, so the perf is okay I'd say.


----------



## rtwjunkie (Aug 24, 2016)

Captain_Tom said:


> Yes.  Ultra actually means Ultra in this game.  Run it on High, it should double your framerate and look nearly the same.



Agreed.  High looks almost the same as Ultra and very playable.  None of my screenshots look like the game either. People just have to see it...it's a very good looking game.


----------



## UnversedXI (Aug 24, 2016)

Feel kind of let down by the visuals in this. It has an aesthetic and graphics similar to Killzone: Shadow Fall, yet that game did it nearly 3 years ago and on relatively weak hardware.

Here's my own *heavily compressed* screenshots from when I played KZ:SF; feel free to point out where the Deus Ex game manages to look better (I noticed that the volumetric lights in particular are far higher res): https://www.dropbox.com/sh/g6e4p2sletykx43/AAB_zJd6x1N0UN1ecvrDFpA2a?dl=0


----------



## alexsubri (Aug 25, 2016)

Here's my screenshots. Ultra settings, texture resolution 4x, and MSAA off.


----------



## ViperXTR (Aug 25, 2016)

Hmm, no 1070 tests, though it seems like it will land between the 980Ti and Fury X in performance?


----------



## dwade (Aug 25, 2016)

This just proves that no GPU is futureproof. Newer games will bring even a Titan X to its knees @ 1080p maxed settings. It's better to buy midrange cards, sell, and then buy the next one.


----------



## ZeppMan217 (Aug 25, 2016)

Kanan said:


> Cut down 1060 is to go against RX 470 so the perf is okay I'd say.


Not really - RX470 is the same 5 FPS ahead of GTX1060 3 GB. This marketing naming bullshit is gonna create some awkward situations.


----------



## Kanan (Aug 25, 2016)

ZeppMan217 said:


> Not really - RX470 is the same 5 FPS ahead of GTX1060 3 GB. This marketing naming bullshit is gonna create some awkward situations.


It is because the pricing is the same. Naming is shit though, yep.


----------



## OneMoar (Aug 25, 2016)

I tweaked the settings, turned some of the post effects off, and left most things on High with the texture quality on Very High.
Completed my playthrough without it dipping below 60, most of the time venturing into the 70's and licking the 80's.

The loading times are atrocious tho if it's not installed to an SSD.

The game is short unless you care about every irrelevant side quest;
only about 12h, and that's doing some of the sidequests.


----------



## Assimilator (Aug 25, 2016)

It seems that MSAA is the culprit in poor performance. HardOCP just released their own test results and according to them a minimum of MSAA 4x is required for acceptable quality, at a hit of around 30FPS. They recommend using Temporal AA instead, which apparently looks just as good and only costs ~1FPS. It also looks like the built-in benchmark is very much a worst-case scenario, which is a good thing IMO.

@W1zzard what AA settings (if any) did you use for your benching?


----------



## rtwjunkie (Aug 25, 2016)

Yeah, I realized MSAA was too big of a hit at release. People can go by the settings I listed earlier in this thread for very good playability while still looking great.

Temporal AA on and I don't see the jaggies.....honestly don't know why they even offered MSAA. With Temporal, it's unnecessary.


----------



## alexsubri (Aug 25, 2016)

rtwjunkie said:


> Yeah, I realized MSAA was too big of a hit at release. People can go by my settings i listed earlier here for a very good playability while still looking great setting.
> 
> Temporal AA on and I don't see the jaggies.....honestly don't know why they even offered the MSAA. With Temporal, it's unnecessary.



I have MSAA off too, and when it comes to 2K-4K resolutions you can barely see the jagged edges anymore, so it's pointless to have AA or MSAA on. It's a performance hit, like you said. I know there is another form of AA that mimics MSAA without the hit. It might be FXAA, but I'm not sure.


----------



## SimmonsTheMad33 (Aug 25, 2016)

Has anyone with a 900/10-series Nvidia card forced MFAA via NVCP to mitigate the MSAA performance hit?

I don't actually own the game (yet), but no one has mentioned MFAA, and considering 2xMSAA+MFAA should give you 4xMSAA visuals at only the perf cost of 2xMSAA, it might be useful...


----------



## OneMoar (Aug 25, 2016)

Sampled AA of any type is going to incur a hit, and the game really does not need it. The engine does a good job of managing the aliasing, and you have TXAA to fall back on. I didn't have any noticeable aliasing even at 1080p.


----------



## Mitsurugi (Aug 25, 2016)

Running a 3770k and 980ti SLI at 3440 x 1440 at Very High settings. This game is sooo beautiful and it flies. 

Loving this game.


----------



## xkm1948 (Aug 25, 2016)

980Ti lagging behind FuryX further and further away.


----------



## W1zzard (Aug 25, 2016)

Assimilator said:


> @W1zzard what AA settings (if any) did you use for your benching?


AA off


----------



## Liviu Cojocaru (Aug 26, 2016)

I bought the game yesterday; at 3440x1440 with two GTX 970 OC in SLI I can barely keep the game above 35 FPS at Ultra with MSAA off. I can see the limitations of my cards now - the first time I saw this was in The Division, and now it's all obvious. Come on already with the 1080Ti!!!


----------



## Mitsurugi (Aug 26, 2016)

I think it was unrealistic to expect any more... especially as the game might be limited by the 3.5GB + 0.5GB memory config, and you are asking your cards to push 5 million pixels around in ultra detail.


----------



## RustyKats (Aug 26, 2016)

xkm1948 said:


> 980Ti lagging behind FuryX further and further away.



It's been said for a while: NV GPUs age terribly. After a next-gen arrival, the current stuff gets put on legacy and "Game Ready" optimizations shift to the new GPUs.

Did you guys see the 780Ti?

It's getting totally wrecked by the 290 and 290X, which it was 15% faster than back then. This is actually quite common in most recent AAA games; Kepler is just dead.







At GURU3D as well:

http://www.guru3d.com/articles_page..._graphics_performance_benchmark_review,6.html






Seriously, anyone who bought 290 or 290X from way back then, still enjoying great performance today in modern AAA games. Right up there with 390/X! They were a lot cheaper than 780, Titan or 780Ti too.

RX 480 vs GTX 1060 now, already trading blows, give it another 6 months and the RX 480 will destroy the 1060 as all the newer games come out.


----------



## Mitsurugi (Aug 26, 2016)

I think you are being selective in the benchmarks you are showing. The 1070 and the 980 ti scale much better as the resolution increases. These cards are made for 2560 x 1440 and above. This is taken from the same review you mentioned.

All things said, this is just ONE game. Over the past 2 years I think, team 'green' have made the best cards. Hopefully team 'red' hits back soon.

I think it is clear in 2016, for AAA titles that 4GB graphics memory, is now starting to be a minimum requirement for resolutions above 1080p.


----------



## RustyKats (Aug 26, 2016)

4K? Do you think freaken 15 to 20 FPS is relevant to gamers? Heck, even the GTX 1080 gets 31 FPS on HIGH, not even Ultra, where it tanks to 25 FPS. Do you enjoy playing action games as a slideshow?

At least if you want to get real, use 1440p numbers for top-of-the-line GPUs.

My 1080p comparison was to highlight how bad Kepler is in modern games. The top Kepler card, the 780Ti, is already UNPLAYABLE at 1080p on High, not even Ultra. It's getting smacked by a 380X and 7970GHz, for perspective!


----------



## bug (Aug 26, 2016)

Mitsurugi said:


> I think it was unrealistic to expect any more... Especially as the game might be limited by the 3.5GB + 0.5Gb memory config and you are asking your cards, to push 5 million pixels around in ultra detail.


Actually, if you read the review, it says right there that 1440p at Ultra needs 4.2 GB of VRAM. So even without the 970's limitation, a 4GB card runs into trouble.
Afaict, this game is meant to be run at Very High detail settings. Ultra seems to just kill performance with no noticeable improvement in return.
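For scale, here's a back-of-the-envelope sketch (my own illustrative numbers - the buffer count and bytes per pixel are assumptions, not the game's actual pipeline) showing that the render targets themselves are only a small slice of that 4.2 GB; the bulk is textures and assets:

```python
# Estimate raw render-target memory at common resolutions,
# assuming 4 bytes per pixel and six full-size buffers
# (color, depth, a few G-buffer/post targets) -- illustrative only.
def render_target_mb(width, height, buffers=6, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * buffers / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB")
```

Even at 4K this comes out under 200 MB, so multi-gigabyte VRAM usage is dominated by texture data, which is why the texture-resolution setting matters so much.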


----------



## Liviu Cojocaru (Aug 26, 2016)

Mitsurugi said:


> I think it was unrealistic to expect any more... Especially as the game might be limited by the 3.5GB + 0.5Gb memory config and you are asking your cards, to push 5 million pixels around in ultra detail.


Yes, I kind of knew this, but I was hoping for just a bit more. My PC is crying for a graphics card upgrade, which is why I am considering going for the 1080Ti when it's launched.


----------



## marios15 (Aug 26, 2016)

What's interesting is the rx 460 being as fast as the 7950!!


----------



## Kanan (Aug 26, 2016)

About the 780Ti being bad:
A) The 780Ti isn't bad; AMD has just had years to polish its GCN drivers, which is why the 290X is so fast now compared to Kepler.
B) Most 290Xs back then were reference models with throttling involved; now they are all running above 1000 MHz, compared to 850-900 back then.
C) I'm pretty sure they used low-clocked reference 780Tis for that test; reviews from PCGH with a custom 780Ti (1100MHz) show clearly better results, not only in Deus Ex but in all games.


----------



## bug (Aug 26, 2016)

Kanan said:


> About 780 ti being bad:
> A) 780ti isn't bad, just amd had years of time to polish drivers of GCN now, that's why 290X is so fast now among others compared to Kepler.
> B) most 290X back then were reference models with throttling involved, now they are all running above 1000 mhz compared to the 850-900 back then.
> C) I'm pretty sure they used low clocked ref 780tis for that test, reviews of PCGH with a custom 780ti (1100mhz) show clearly better results not only in deus ex but in all games.



You're ignoring the elephant in the room: DE:MD is clearly an AMD title. It's based on all those technologies AMD opened up recently. It's useless for generic comparisons between brands.


----------



## Kanan (Aug 26, 2016)

bug said:


> You're ignoring the elephant in the room: DE:MD is clearly an AMD title. It's based on all those technologies AMD opened up recently. It's useless for generic comparisons between brands.


I'm not ignoring it; it's just an obvious fact. And yeah, it's pretty useless for that. I was speaking more generally.


----------



## RejZoR (Aug 26, 2016)

This statement confuses me greatly: "GTX 1080 can barely handle Ultra 1080p"

I'm running the game on Ultra 1080p (granted, I use post process AA instead of MSAA) on my GTX 980 and it's very smooth. Granted it's just the first level in Dubai so I can't speak for larger maps later on, but this one is pretty big when you look "out the window". The reason I don't use MSAA is because it's very expensive across the board, not just in this game and today's post process edge filtering has evolved so far it's basically free while delivering basically 16x AA in most cases. It doesn't feel like basic FXAA either, but I can be wrong.


----------



## Ikaruga (Aug 27, 2016)

RejZoR said:


> This statement confuses me greatly: "GTX 1080 can barely handle Ultra 1080p"
> 
> I'm running the game on Ultra 1080p (granted, I use post process AA instead of MSAA) on my GTX 980 and it's very smooth. Granted it's just the first level in Dubai so I can't speak for larger maps later on, but this one is pretty big when you look "out the window". The reason I don't use MSAA is because it's very expensive across the board, not just in this game and today's post process edge filtering has evolved so far it's basically free while delivering basically 16x AA in most cases. It doesn't feel like basic FXAA either, but I can be wrong.


This statement confuses me greatly: *"it's very smooth"*


----------



## champsilva (Aug 27, 2016)

alexsubri said:


> Nice video, is this in 1080p or 1440p?



1080p. 

10chars


----------



## rtwjunkie (Aug 27, 2016)

Ikaruga said:


> This statement confuses me greatly: *"it's very smooth"*



I'm assuming he means just like I am playing it. I have moved up to Very High (with a few exceptions) and am at a constant 58-60fps. That slight difference isn't noticeable, so: smooth.


----------



## Ikaruga (Aug 27, 2016)

rtwjunkie said:


> i'm assuming he means just like I am playing it. I have moved up to Very High (with a few exceptions) and am at a constant 58-60fps. That slight difference isn't noticeable, so: Smooth.


But "Smooth" is a subjective term


----------



## RejZoR (Aug 27, 2016)

Ikaruga said:


> This statement confuses me greatly: *"it's very smooth"*



What's confusing about it? Very smooth means it's playable without any hitching, stuttering or lagging with no perception that it's barely running at 30fps.

For "GTX 1080 can barely handle Ultra 1080p", I can only understand it as "It's hardly achieving 30fps on Ultra at 1080p". Which sounds like nonsense considering the GTX 1080 is almost twice as fast as my GTX 980...


----------



## marios15 (Aug 28, 2016)

Your framerate can have lots of dips to 10-20fps and still average 30+. Removing the dips might improve your average by only 2-3fps, but the actual gameplay will feel a lot smoother.
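To put rough numbers on that (a toy sketch with made-up frame data, not from any real benchmark run):

```python
# Two hypothetical frame-rate traces with nearly the same average:
# one steady, one where 10% of frames dip hard.
steady = [33] * 100
dippy = [36] * 90 + [12] * 10

def average(fps):
    return sum(fps) / len(fps)

def one_percent_low(fps):
    # Mean of the slowest 1% of frames, a common stutter metric.
    n = max(1, len(fps) // 100)
    return sum(sorted(fps)[:n]) / n

print(average(steady), one_percent_low(steady))  # 33.0 33.0
print(average(dippy), one_percent_low(dippy))    # 33.6 12.0
```

Both traces "average 30+", but the 1% low shows which one will actually feel smooth.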


----------



## Ikaruga (Aug 28, 2016)

RejZoR said:


> What's confusing about it? Very smooth means it's playable without any hitching, stuttering or lagging with no perception that it's barely running at 30fps.
> 
> For "GTX 1080 can barely handle Ultra 1080p", I can only understand it as "It's hardly achieving 30fps on Ultra at 1080p". Which sounds like a nonsense considering GTX 1080 is almost twice as fast as my GTX 980...


What I was trying to say is that - while the original sentence could indeed use different wording - your sentence is confusing too. Just to clarify: it looks like your perception of smoothness at 30fps equals my perception at 60fps, so my original post stands, because it seems to me you are doing the same thing you are questioning W1zzard about.



Kanan said:


> 30 fps is only smooth if it is a console-like steady 30 fps without dips even further down and its "another kind" of smoothness compared to 60fps. Or in other words 60 fps is even smoother (and 120 even smoother than 60 what ultimately should be the cap what a human being can recognise).


For me, it's about 85Hz. If I check the UFO frame test, 60Hz looks flat-out "chunky", and I start to see smooth motion above 85-90 (luckily my IPS monitor maxes out at 96Hz, so I tend to play at 90Hz if the game supports it).


----------



## Kanan (Aug 28, 2016)

30fps is only smooth if it is a console-like steady 30fps without dips even further down, and it's "another kind" of smoothness compared to 60fps. Or in other words: 60fps is even smoother (and 120 even smoother than 60, which should ultimately be about the cap of what a human being can recognise).


----------



## RejZoR (Aug 28, 2016)

Do you think that me as an owner of a 144Hz gaming monitor consider 30fps as "smooth" ?


----------



## Kanan (Aug 28, 2016)

RejZoR said:


> Do you think that me as an owner of a 144Hz gaming monitor consider 30fps as "smooth" ?


No, I was speaking generally. Imo 30 is only playable; smooth only starts at 60, and really good is, as already stated, 80 or 90+. 144, 165, 200 and other Hz figures are more or less useless, more for marketing purposes. Back in the day, competitive shooter players had 100Hz CRTs and it was perfect, and nothing has changed - your eyes are still human eyes.


----------



## the54thvoid (Aug 28, 2016)

1440p left and 1080p right on 980ti at 1500Mhz.

I was concerned when the _first_ 'banner' was Gaming Evolved (pride of place, a this-is-Sparta type banner), and we knew there was a good chance this would be a bit hard on Nvidia.  But if I'm matching a stock Fury X and I'm still on W7 (no DX12 for me), I'm happy enough.  I'm going to play on V High at either 1080 or 1440; I'll run some benches to decide.


----------



## rtwjunkie (Aug 28, 2016)

the54thvoid said:


> 1440p left and 1080p right on 980ti at 1500Mhz.
> 
> I was concerned when the _first_ 'banner' was Gaming Evolved (like prime of place - this is Sparta type banner) and we knew it was a good chance this would be a bit hard on Nvidia.  But if I'm matching a stock Fury X and I'm still on W7 (no DX12 for me), I'm happy enough.  I'm going to play on V High at either 1080 or 1440, I'll run some benches to decide.



Just to reassure you, the benchmark is a harder hit than actual game. You should be able to do Very High on 1080p no problem. I'm mostly Very High, no MSAA, and Temporal AA checked.


----------



## the54thvoid (Aug 28, 2016)

rtwjunkie said:


> Just to reassure you, the benchmark is a harder hit than actual game. You should be able to do Very High on 1080p no problem. I'm mostly Very High, no MSAA, and Temporal AA checked.



71.8fps at 1080p on Very High, 50.6 at 1440p on Very High.  Though I think I may have slightly harder settings than Wizz, given the discrepancies between these scores and the previous post.

And as usual in AAA games, very little noticeable graphics difference (if any).


----------



## Prima.Vera (Aug 29, 2016)

Guys is it possible when you do a performance review to do a CPU review also between different generations please? Would be very interesting to see how a 2700k compares with a 6700k in higher resolutions for example.
Thanks!


----------



## john_ (Aug 29, 2016)

> When using High details, which is still challenging for many cards performance-wise, memory usage stays well below 3 GB, which is important for the GTX 1060 3 GB for example, or the RX 470/480 4 GB.



It is important for a *4GB* RX 470/480 that memory stays below 3GB? Oh... this is fun.
Maybe AMD has implemented a 3+1GB design?


----------



## the54thvoid (Aug 29, 2016)

john_ said:


> It is important for a *4GB* RX 470/480 the memory to stay below 3GB? Oh... this is fun.
> Maybe AMD has implemented a 3+1GB design?



Why nitpick?  He's probably throwing that in there for balance.  There are no modern AMD cards with <4GB with comparable power to run the game.


----------



## john_ (Aug 29, 2016)

Balance?






3GB is not equal to 4GB, no matter how much you try to sugarcoat it. You want to call it balance? Call it balance.

And there are three scenarios where memory usage is over 3GB and lower than 4GB. And in two of those three scenarios the game is pretty much playable with an RX480 or a GTX1060 (6GB). But those cases are never mentioned here.






Nothing more


----------



## Shatun_Bear (Aug 30, 2016)

So this is when the narrative starts changing - from "the GTX 1060 is faster than the RX 480" to "the RX 480 has a clear performance advantage in new titles". And it goes without saying the 480 is faster in DX12 and Vulkan as well.

Consider also that the card has 2GB more VRAM than the 1060. Given all these facts, why were the custom 1060s (9.6 to 9.7 scores?!?) so unequivocally recommended over the custom 480s (9.1 to 9.4 scores) by TPU? Don't tell me efficiency.


----------



## Prima.Vera (Aug 30, 2016)

alexsubri said:


> Heres my screenshots. Ultra settings Texture resolution 4x and msaa off.
> 
> View attachment 78248 View attachment 78249 View attachment 78250 View attachment 78251 View attachment 78252


Dat Bump-Mapping looks faker than Pamela Anderson's tits! )) lol. Faces also look artificial.... Other than that all good ))

Anyway, I have a strong feeling that the nVidia drivers are not yet optimal for this game, considering the performance edge the AMD cards have. Most likely nVidia will release some performance drivers for this game sometime later. Just speculation, but looking at the benches, the nVidia drivers seem to be at a disadvantage here, especially for the new generation cards...


----------



## the54thvoid (Aug 30, 2016)

Shatun_Bear said:


> So this is when the narrative starts changing - from GTX 1060 is faster than the RX 480 to the RX 480 has a clear performance advantage in new titles. And it goes without saying the 480 is faster in DX12 and Vulkan as well.
> 
> Consider the card has 2GB extra VRAM over the 1060s.* Given all these facts,* why were the custom 1060s (9.6 to 9.7 scores?!?) so unequivocally recommended over the custom 480s (9.1 to 9.4 scores) by TPU? Don't tell me efficiency.



Because in the test suite the 1060 outperformed the 480.  It's that simple.  As the suite changes and includes this game (hopefully), and maybe the one proper Vulkan game, Doom, AMD will score a few more hits, but be aware that *a test suite of games makes the score relevant, not one or two*.  And it's funny you say "don't say efficiency" when that was one of the main aims and touted benefits of Polaris (2.5x efficiency, since amended by AMD to only 2x).

It is remarkable that people such as you feel so hard done by when the overwhelming majority of the PC gaming library out there is still DX11-based.  Wait till 2017/2018; then we'll have lots of DX12 goodies (and Vega, Zen and Volta).  All will be well.


----------



## Kanan (Aug 30, 2016)

Maybe so, but TPU these days is conservative in its game-testing approach; other sites simply do a better job of testing both DX11 and modern-API games and therefore show a more balanced picture of the facts.

Modern APIs clearly show that the RX480 has more performance under its hood than the 1060, so keep that in mind when comparing efficiency. For me the RX 480 is the clear (future) winner; like the 7970 vs 680 or Hawaii vs GK110, it will probably win out soon.
And as nobody should buy a card for just one year, it's obvious the RX480 is better - this is just my opinion though. BTW, the 680 and 780Ti both come off clearly better in those comparisons than the 1060 does now - it's just a mediocre GPU, or the RX480 is simply too good.


----------



## the54thvoid (Aug 30, 2016)

Kanan said:


> Maybe so but TPU these days is conservative on their game testing approach, other sites do simply a better job in testing both dx11 and modern apis games and therefore show a more balanced picture of the facts.
> 
> Modern apis clearly show that rx480 has more performance under its hood than 1060, so comparing efficiency keep that in mind. For me the rx 480 is the clear (future) winner, like the 7970 vs 680 or Hawaii vs gk110 it will probably win soon.
> And as nobody should buy cards just for one year it's obvious the rx480 is better - this is just my opinion though. BTW 680 and 780ti are both clearly better in that comparison compared to 1060 - it's just a mediocre gpu or the rx480 is simply too good.



No argument from me. If I were in the market for a mid range upgrade, I'd buy a 8GB RX480. My point was regarding the score and test suite. Particularly, you can't give a card points based on how it will perform next year. 
However, if I were to upgrade now from a 980ti, AMD isn't an option. The best card for now, even in DX12 until at least 1st half 2017 is GTX1080 (or Titan X).


----------



## Kanan (Aug 30, 2016)

Your point is valid; there are just too many DX11 games in the TPU bench suite.

I have a similar problem: I want to upgrade, but these 2 GPUs are a sidegrade I don't care about, and the 1070 is simply too expensive. I'll probably never buy a GPU over 350€, as 200-350 has always been the sweet spot for semi-high-end cards. I won't pay a big premium for silicon that didn't make the cut for a GTX 1080. I can wait; prices will go down.


----------



## the54thvoid (Aug 30, 2016)

Kanan said:


> Your point is valid, just too many dx11 games in tpu bench suite.
> 
> I have a similar problem, I want to upgrade but these 2 Gpus are a sidegrade I don't care about and 1070 is simply too expensive, I'll probably never buy a gpu higher than 350€ as 200-350 always was the sweet spot for semi highend cards. I won't pay a big premium for garbage that didn't make it to gtx 1080. I can wait, prices will go down.



I know we're way off topic, but I had thought that given the 980ti price drops (to accommodate the 1070), AMD might have dropped the Fury X price similarly. Unfortunately not. I was considering a cheaper Fury X as a sidegrade in a new PC system to ride the initial wave of DX12.


----------



## Kanan (Aug 31, 2016)

the54thvoid said:


> I know we're way off topic but I had thought given the 980ti price drops (to accommodate) the 1070, AMD might have dropped the Fury X price similarly. Unfortunately not. I was considering a cheaper Fury X as a side grade in a new PC system to ride the initial wave of DX12.


Well it's cheap-ish at mindfactory.de ~399 € (XFX). I'd say that's good value.


----------



## Vayra86 (Aug 31, 2016)

Kanan said:


> Your point is valid, just too many dx11 games in tpu bench suite.
> 
> I have a similar problem, I want to upgrade but these 2 Gpus are a sidegrade I don't care about and 1070 is simply too expensive, I'll probably never buy a gpu higher than 350€ as 200-350 always was the sweet spot for semi highend cards. I won't pay a big premium for garbage that didn't make it to gtx 1080. I can wait, prices will go down.



I'm kind of in the same boat right now. I simply refuse to pay a premium for Pascal when it is nothing more than Maxwell with higher clocks and lower power use. Fuck that. I want actual, tangible performance increases, and the GTX 1070 is going for a similar or even higher price than the still slightly stronger GTX 980ti. Add to that the fact that Pascal AND Maxwell have weak performance on the newer APIs, and my only interest right now is in AMD's upcoming high end. Nvidia is overcharging for 14nm and they know it; anyone contesting that needs a history lesson on price/perf shifts between generations. Meanwhile, the GTX 1060 will once again be a hamstrung card, as we can already see - by VRAM or shader power - and this is quite close to the norm for an Nvidia x60 release (there's always some shit holding it back too much, and you'll feel underpowered within 6 months of release). Where AMD offers a very well-rounded RX480, Nvidia offers a rag-tag 1060 tied together with duct tape and a few memory chips, with Auto-OC pushing it to its limit out of the box. Noty.

Meanwhile, I've got a 780ti still going pretty strong and more than sufficient for 1080p, even at and above 90-100 fps given some tweaks that hardly impact the visual quality of the games I play. Most new titles that are big on graphics are 'meh' in terms of gameplay, so I'm not too worried about waiting a bit or tuning settings down for a while.

The marketplace is notoriously unhealthy for anything above the 250-300 dollar mark, and buying into it now is going to be a huge regret later. Already you can see the 1080, which is supposed to be the top dog (I disregard Titan because, well, it's the worst price/perf you can possibly buy into), get totally SWAMPED in a game like Deus Ex that relies more heavily on new API features and a newer engine. The 4K card... and not even a comfortable 30 fps minimum can be extracted from it. There go 700-800 euros' worth of GPU for you. Let's face it: the 1080 is a money grab and it's a shit card for 4K in the near future. Deus Ex tells us this story, and the vast majority of other new-API-based games support that fact.

It also tells us the Fury X is once again making waves. I might pick up one of those and wait for the real deal sometime next year.

About the bench suite... I don't feel it is hard to draw your own conclusions based on the games/APIs where AMD is stronger and where it is not. It's easy to see and doesn't require a bench suite overhaul. If we let Wizz do all his testing only to have a quick look at some bar charts... I'd hate to see him waste even more effort overhauling the suite. We just need to start reading between the lines, as I do above. Also, the reality is that the vast majority of the gaming market IS STILL DX11 and a bench suite is never meant to be some wishful-thinking projection of our 'future' (we have AMD for that  ), but a reflection of the games we play today.


----------



## W1zzard (Aug 31, 2016)

Vayra86 said:


> I'd hate to see him waste even more effort overhauling the suite.


Too late, I'm rebenching right now with everything that's new


----------



## Kanan (Aug 31, 2016)

> Where AMD offers a very well rounded RX480, Nvidia offers a rag-tag 1060 tied together with duct tape and a few memory chips with Auto-OC pushing it to its limit out of the box. Noty.


hahaha, very good. I concur with that.


> Meanwhile, I've got a 780ti still going pretty strong and more than sufficient for 1080p, even at and above 90-100 fps given some tweaks that hardly impact the visual qualities of the games I play. Most new titles that are big on graphics are 'meh' in terms of gameplay so I'm not too worried waiting abit or tuning settings down for a while.


Welcome to the club!


> The marketplace is notoriously unhealthy for anything above the 250-300 dollar mark, and buying into it now is going to be a huge regret later. Already you can see the 1080 which is supposed to be the top dog (I disregard Titan because, well, it's the worst price/perf you can possibly get into) that gets totally SWAMPED in a game like Deus Ex that relies heavier on new API features and a newer engine. The 4K card... and not even a comfortable 30 fps minimum can be extracted from it. There go 700-800 euro's worth of GPU for you. Let's face it. 1080 is a money grab and its a shit card for 4K in the near future. Deus Ex tells us this story and the vast majority of other new API based games support that fact.


Yep, and the GTX 1080, for me at least, is only an advanced 1440p, or rather ultrawide gaming monitor, card (a tad higher resolution that sits almost perfectly between 1440p and 4K, so it's really good from a performance perspective and also looks very nice), as 4K is an irrelevant resolution for serious gamers (60Hz, anyone?). You can go play consoles if you think 60Hz or FPS is nice - that's basically my opinion, maybe a tad less extreme than that.


> It also tells us the Fury X is once again making waves. I might pick up one of those and wait for the real deal sometime next year.


I was thinking about buying the Sapphire Fury Nitro; it's relatively cheap at only 330€. But then again, I don't want a new (soon bottlenecked) 4GB card. I think the main reason Fiji flopped is the 4GB thing. It was not enough back then, and it's even further from enough now. What a pity, because Fiji is really starting to shine now.


> Too late, I'm rebenching right now with everything that's new


Wise decision, TPU will profit from that.


----------



## W1zzard (Sep 1, 2016)

Kanan said:


> Wise decision, TPU will profit from that.


There was never any other decision; it's just logistics. Rebenching takes a long time, and there were several reasons why I couldn't do it earlier.


----------



## Vayra86 (Sep 1, 2016)

W1zzard said:


> Too late, I'm rebenching right now with everything that's new



Fair enough


----------



## Kanan (Sep 1, 2016)

W1zzard said:


> There was never any other decision, it's just logistics. Rebenching takes a long time and several reasons why I couldn't do it earlier.


Whatever, I'm looking forward to seeing the new bench suite.


----------



## eidairaman1 (Sep 5, 2016)

Runs with no issue here lol, good review. 'Course the Deus Ex games have been well optimized since the original.


----------



## OtherSyde (Sep 14, 2016)

Gabkicks said:


> wow  D: this game is kicking my gtx 1080's ass at ultra 1080p



Weird... I've got everything set to Ultra except MSAA:Off, Shadows on Medium, and Contact Hardening Shadows:Off and I'm generally getting between 50-70fps according to FRAPS and it's very playable (1080p/60Hz). And I've only got a sh!tty EVGA 970 SuperClocked 4GB (ahem, 3.5GB).


----------



## rtwjunkie (Sep 14, 2016)

OtherSyde said:


> Weird... I've got everything set to Ultra except MSAA:Off, Shadows on Medium, and Contact Hardening Shadows:Off and I'm generally getting between 50-70fps according to FRAPS and it's very playable (1080p/60Hz). And I've only got a sh!tty EVGA 970 SuperClocked 4GB (ahem, 3.5GB).



That's not far off from me. It is because Shadows and MSAA are the real frame rate killers.


----------



## OtherSyde (Sep 14, 2016)

rtwjunkie said:


> That's not far off from me. It is because Shadows and MSAA are the real frame rate killers.



God I _know,_ right? And granted MSAA is _good,_ but it's _so_ _not worth_ the absurd performance hit when compared to TXAA or FXAA. I used TXAA 2x and 4x all throughout Crysis 3 and it looked great at 1080p - why in the hell didn't they put _that_ in DX:MD instead of the stupid performance-raping MSAA which cripples even the newest GTX 1080's at higher settings? So dumb. Maybe they'll incorporate more antialiasing settings into the game later, like they did with DX:HR.


----------



## rtwjunkie (Sep 14, 2016)

OtherSyde said:


> God I _know,_ right? And granted MSAA is _good,_ but it's _so_ _not worth_ the absurd performance hit when compared to TXAA or FXAA. I used TXAA 2x and 4x all throughout Crysis 3 and it looked great at 1080p - why in the hell didn't they put _that_ in DX:MD instead of the stupid performance-raping MSAA which cripples even the newest GTX 1080's at higher settings? So dumb. Maybe they'll incorporate more antialiasing settings into the game later, like they did with DX:HR.



Check the box in settings for Temporal AA (no slider, just a checkbox).  Very little impact and seems to perform very well.  And it looks a lot better than FXAA does.


----------

