# The Witcher 3: Performance Analysis



## W1zzard (May 19, 2015)

In this article, we put the GTX Titan X, R9 295X2, GTX 980, R9 290X, GTX 970, R9 290, GTX 960, and R9 285 through The Witcher 3: Wild Hunt. We do so at resolutions of 1600x900, 1920x1080, 2560x1440, and 4K to assess what hardware you need to play this recently released top title.

*Show full review*


----------



## ZoneDymo (May 20, 2015)

Looking good, but man, it does seem to be quite a needy beast at the high end; guess those new GPUs in June have a game to be bought for.

Also, totally with you on hating the (motion) blur effects.
It's just baffling that people (a) want it turned on and (b) that developers even made it in the first place.

We keep reducing pixel lag and response times on monitors to cut blur during motion... and then introduce a motion blur effect...

Just wtf. And apart from that, why would you even want it? Why do you want an image to blur when it's moving? And don't tell me it's realistic, because your eyes can't keep the image sharp during motion in real life either, no added motion blur effect required.


----------



## We'llBangOK? (May 20, 2015)

So, a 7850 2GB has no chance at 900p?


----------



## silapakorn (May 20, 2015)

I want to upgrade to 1440p screen too but can't find any brand that goes bigger than 28".


----------



## EzioAs (May 20, 2015)

We'llBangOK? said:


> So, a 7850 2GB has no chance at 900p?



With those image quality settings? No.

But I'm certain some medium-high blend of settings would work well. If AMD could release a driver update, that would be even better.


----------



## Finners (May 20, 2015)

Not great for AMD. Will you retest when/IF AMD release a driver for this with optimisations? 

Terrible from AMD not to have a driver out in time.


----------



## dj-electric (May 20, 2015)

I'm a GTX 780 Ti user. I used to think my card could blast anything on my 1440p screen.


----------



## MakeDeluxe (May 20, 2015)

For some reason, playing in borderless windowed mode made my frametimes jump through the roof and introduced some insane stuttering, even at over 50 fps.


----------



## jigar2speed (May 20, 2015)

Dj-ElectriC said:


> I'm a GTX 780 Ti user. I used to think my card could blast anything on my 1440p screen.


That's GameWorks for you right there, brother: GTX 7xx and below, plus AMD cards, are not optimized for this game.


----------



## NC37 (May 20, 2015)

Holy crap, under 2GB VRAM. Bravo devs. That there is the benefit of a game having a longer dev time instead of being rushed out. Sadly, EA will learn nothing from this.


----------



## dj-electric (May 20, 2015)

The game is under 2GB VRAM for obvious reasons; have you seen the horrible vegetation?


----------



## Mathragh (May 20, 2015)

Dj-ElectriC said:


> The game is under 2GB VRAM for obvious reasons; have you seen the horrible vegetation?


Yeah, although I can't remember where, I've read that while the devs at some point said the install would be over 50GB, it's now way less.
One can take an educated guess at which parts of the game took a hit from that (and why they did it in the first place, if you fancy a tinfoil hat).


----------



## The Quim Reaper (May 20, 2015)

I'm running the game with two 970s @ 4K, and if you want to turn everything up to max (including HairWorks), then I'd recommend capping the game to 30fps; you certainly won't be able to run it at 60fps.

Even running at a 30fps cap, my GPU usage is right at the limit, and during some cutscenes frame rates can drop to about 25fps (worst case). During general play, though, I've not seen it dip below 30fps (yet).


----------



## rodneyhchef (May 20, 2015)

Should probably get that second 970 now...


----------



## jabbadap (May 20, 2015)

The Quim Reaper said:


> I'm running the game with two 970s @ 4K, and if you want to turn everything up to max (including HairWorks), then I'd recommend capping the game to 30fps; you certainly won't be able to run it at 60fps.
> 
> Even running at a 30fps cap, my GPU usage is right at the limit, and during some cutscenes frame rates can drop to about 25fps (worst case). During general play, though, I've not seen it dip below 30fps (yet).



Try disabling movie ubersampling (should be in the file bin/config/base/visuals.ini):

```
[Visuals]
Gamma=1
MovieFramerate=30.0
MovieUbersampling=true
```
Change it to MovieUbersampling=false.

One tip for gameplay is lowering the HairWorks AA; by default it uses 8x AA, which is very demanding. For better fps, one could try 4 or 2, or even disable it completely (0). So open the file bin/config/base/Rendering.ini, find the line _HairWorksAALevel=8_, and try a lower AA level there.
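If you'd rather script that edit than do it by hand, here's a minimal Python sketch of the Rendering.ini tweak described above (the `set_hairworks_aa` helper name is mine, not anything shipped with the game, and the file path is as given in this post; back up the file before writing any changes):

```python
import re

def set_hairworks_aa(ini_text: str, level: int) -> str:
    """Replace the HairWorksAALevel value in the ini text (0 disables HairWorks AA)."""
    return re.sub(r"HairWorksAALevel=\d+", f"HairWorksAALevel={level}", ini_text)

if __name__ == "__main__":
    # Would normally read/write bin/config/base/Rendering.ini; shown inline here.
    sample = "HairWorksAALevel=8"
    print(set_hairworks_aa(sample, 2))  # HairWorksAALevel=2
```

The same regex approach works for the MovieUbersampling flag in visuals.ini, since both files are plain key=value text.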


----------



## raptori (May 20, 2015)

"You can tweak the settings by editing the HairWorksAALevel in the game's bin\config\base\rendering.ini." A great find, W1zzard.


----------



## v12dock (May 20, 2015)

I was wondering if you were going to do a performance analysis of this game. I'm glad you did; the game looks/plays great.


----------



## cokker (May 20, 2015)

Finners said:


> Not great for AMD. Will you retest when/IF AMD release a driver for this with optimisations?
> 
> Terrible from AMD not to have a driver out in time.



Drivers won't do much, and it's not worth a re-test. I was playing GTA 5 on old drivers and didn't notice a difference with the latest (15.4); meanwhile, my 970 friends are crashing with ERR_GFX_D3D errors.


----------



## Vego (May 20, 2015)

silapakorn said:


> I want to upgrade to 1440p screen too but can't find any brand that goes bigger than 28".


Try Philips; they have some 3440x1440 34" monitors, I think.


----------



## Severus (May 20, 2015)

Hello, 

I am running it at 3440x1440 with a 970, and it's pretty demanding. Running around 40 fps.
Any tips on settings so I can reach 60 fps without losing the wonderful graphics?

Cheers!


----------



## jabbadap (May 20, 2015)

raptori said:


> "You can tweak the settings by editing the HairWorksAALevel in the game's bin\config\base\rendering.ini. " A great find wizard .



Uh oh, silly me didn't read the conclusion page. Yes, good review, W1zzard.



Severus said:


> Hello,
> 
> I am running it at 3440 x 1440p with a 970 and it's pretty demanding. Running around 40 fps.
> Any tips on setting so I can reach 60 fps without losing the wonderful graphics?
> ...



You should check the tweaking guide on geforce.com (especially the config file tweaks section):
http://www.geforce.com/whats-new/gu...-hunt-graphics-performance-and-tweaking-guide


----------



## BorisDG (May 20, 2015)

No 780 Ti in the list? WTF? TPU, are you kidding? Quick test and re-add.


----------



## manofthem (May 20, 2015)

Very good review W1zz, thanks for testing and posting. Can we assume Witcher 3 is going to be added into the test suite from here on out?


----------



## EarthDog (May 20, 2015)

I take it that there isn't a CFx profile for the game as the 295x2 didn't scale at all (negative scaling in fact)...

... and either I missed it (likely) or there was no mention of the lack of scaling on AMD. I see a mention about drivers and performance, but no warnings about its current lack of scaling.

Excellent write up Wizz... keep em coming!


----------



## Ikaruga (May 20, 2015)

I would also be angry if I had a Titan and got those terrible frame rates, but I still find it a little bit funny how the interwebs flame Nvidia because a developer did not optimize their game for "older" Nvidia chips.


----------



## DoomDoomDoom (May 20, 2015)

Nice to see that the older architecture is getting the shaft. /sarcasm


----------



## techx (May 20, 2015)

I'm utterly shocked that nobody noticed the screenshot used in the article is a 2013 capture from before the massive graphical downgrade. The game looks nothing like that today. Shame on the author.


----------



## EarthDog (May 20, 2015)

You may want to look at the game today... it looks pretty damn close to me, comparing screencaps of the game today against that image.


----------



## techx (May 20, 2015)

EarthDog said:


> You may want to look at the game today... it looks pretty damn close to me looking at screencaps of the game today versus that image.


I and many others would disagree.  It's a very misleading screenshot.  Don't tell me this is "damn close".


----------



## swaaye (May 20, 2015)

What does it matter?   I guess maybe just don't buy the game?   It's not the first time an early demo wasn't an entirely accurate portrayal of what the final product would be.


----------



## techx (May 20, 2015)

swaaye said:


> What does it matter?   I guess maybe just don't buy the game?


Just ballsy to use a misrepresentation of what the game really looks like as the feature image of the article.  Tis all.


----------



## swaaye (May 20, 2015)

techx said:


> Just ballsy to use a misrepresentation of what the game really looks like as the feature image of the article.  Tis all.


I just assumed it was concept art or an old bullshot. But yeah, maybe a screenshot would be better.


----------



## EarthDog (May 20, 2015)

techx said:


> Just ballsy to use a misrepresentation of what the game really looks like as the feature image of the article.  Tis all.


Ballsy... LOL. The article is about PERFORMANCE, not how the game looks. Wizz used a stock screenshot for a PERFORMANCE review. If this were a review of IQ, then I would say you have a beef.

Would an actual in-game screenshot be better? I'll give you that. I just think you came into this with a bit more venom than was perhaps needed.

As for the GIFs you posted, the scene completely changed. It looks great to me, comparable in fact; it's just that the same foliage isn't around. But the shadows and textures look pretty damn good to me (for what is a terrible way to compare them, in GIFs).


----------



## Saidrex (May 20, 2015)

Well, dunno what I'm doing wrong; I'm getting 40-50 fps on minimum settings at 1080p. I have a GTX 770, and from what I've heard I'm not the only one getting sh**ty performance on a 700 series card; something to do with NVIDIA's drivers. Many people have already started conspiracy theories that NVIDIA is intentionally trying to sabotage the 700 series to force people to upgrade to the 900 series.


----------



## W1zzard (May 21, 2015)

manofthem said:


> Very good review W1zz, thanks for testing and posting. Can we assume Witcher 3 is going to be added into the test suite from here on out?


definitely, need to rebench all cards first though for Project Cars and Witcher. the rebench that includes GTA V is done already



EarthDog said:


> I take it that there isn't a CFx profile for the game as the 295x2 didn't scale at all (negative scaling in fact)


third para in the conclusion



EarthDog said:


> Would an actual in game screenshot be better?


i was just too lazy to take one and rather added kepler results. googled the first image that had a scene i liked, didn't pay attention to the displayed image quality


----------



## mastrdrver (May 21, 2015)

How to run Hairworks on AMD without crippling performance.


----------



## avatar_raq (May 21, 2015)

mastrdrver said:


> How to run Hairworks on AMD without crippling performance.


This reminds me of AMD's TressFX in Tomb Raider 2013, which ran on my 780ti rather well.


----------



## gasolina (May 21, 2015)

Is HairWorks similar to PhysX? It's on the list here: http://www.geforce.com/hardware/technology/physx/games. I was wondering whether a GTX 980 + a dedicated PhysX 760 would perform better than a single 980 with HairWorks enabled?


----------



## cokker (May 22, 2015)

techx said:


> I and many others would disagree.  It's a very misleading screenshot.  Don't tell me this is "damn close".



Is that horse simulator 2015? What happened to the atmosphere?


----------



## bug (May 22, 2015)

The patch released yesterday is supposed to improve things with HairWorks. Not that I care much about that particular tech (the Witcher series has always been about story and atmosphere), but is there any chance you guys will retest some of the cards to check if there's indeed a performance gain?


----------



## AsRock (May 22, 2015)

Great review, Wiz, but it looks like, at this time, if you want to max the game you need at least a 980, not a 970, as the 970 is only 4 fps faster than the 290 and still under 60 fps except at the first resolution you tested.

I am totally shocked at that RAM usage; that is very well done indeed. Knowing CDPR, they will bring out a massive patch later too.

As NVIDIA already has their updated drivers out, I hope you can find the time to do a retest when AMD does too, although I can't really see them getting much better than they are now.


----------



## rtwjunkie (May 22, 2015)

techx said:


> I'm utterly shocked that nobody noticed that screenshot used in the article is a 2013 capture before the massive graphical downgrade.  The game looks nothing like that today.  Shame on the author.


 
Actually, the game looks pretty damned good today. But, as I said the day before release when everyone was up in arms, the Witcher games have always been about the story, role-playing, and atmosphere, not the graphics. The nice graphics you get are just a bonus.

For anyone who feels Kepler is getting hurt by the new drivers with this game, go back to 347.88, or whatever number the last driver before the GTA V game-ready driver was. I have found my Kepler running beautifully in this game with that driver. That was also the last driver, I hear, that was particularly attuned to Kepler, so that may be the difference.

@W1zzard, great review!  Thanks!!


----------



## VashCZ (May 22, 2015)

Yes yes, I have a GTX 760 and I really must wait for optimization.
It uses 1000-1500 MB VRAM, which is good, but why? I would prefer 1900MB VRAM and 60FPS :-D

"the game definitely feels like a PC-first title" - JOKE?!? Performance-wise, maybe, but overall... no way, mate!


----------



## kloaf11 (May 22, 2015)

techx said:


> I and many others would disagree.  It's a very misleading screenshot.  Don't tell me this is "damn close".



Well, if you listened to the news releases, that was an art-rendered trailer and not an in-game trailer like Watch Dogs had. It wasn't the finished product with all the in-game details in it.

Also, for those hating on AMD: are you really surprised a game NVIDIA helped optimise and develop works so well with NVIDIA cards? Probably the same reason they don't work on the 700 series with the newest drivers. They want people to buy their cards instead of AMD cards, and they want you to buy their newest cards, not their older models. It's their business model. Make money.


----------



## RichF (May 23, 2015)

techx said:


> I and many others would disagree.  It's a very misleading screenshot.  Don't tell me this is "damn close".


_"My recommendation for 4K is a GTX 970 SLI (because of its awesome price-to-performance ratio)"_

_"As you can see, Witcher 3 is very modest in its VRAM requirements, which is quite surprising given how beautiful the graphics are. To me this looks like good coding"_


----------



## haswrong (May 23, 2015)

RichF said:


> _"My recommendation for 4K is a GTX 970 SLI (because of its awesome price-to-performance ratio)"
> 
> "As you can see, Witcher 3 is very modest in its VRAM requirements, which is quite surprising given how beautiful the graphics are. To me this looks like good coding"_



So tell me, is there a qualitative difference between TW and TW2? Because I have a 4GB Kepler graphics card, and I'd prefer swifter performance over reduced texture data. I think I will have to play some older game to get smoother performance.


----------



## Hiryougan (May 25, 2015)

Downgrade? Please. You just need to play a bit with the rendering.ini file, and use SweetFX if you prefer the VGX trailer colors.
Check this: http://imgur.com/a/1wecL


----------



## RichF (May 25, 2015)

Hiryougan said:


> Downgrade? Please. You just need to play a bit with a rendering.ini file and use SweetFX if you prefer VGX trailer colors.
> Check this: http://imgur.com/a/1wecL


Does that bring back the atmospheric detail, smoke, dirt, vegetation detail, and draw distance? The poster's argument is about more than just the "colors". Also, why should people have to rely on a third-party utility in the first place?


----------



## Hiryougan (May 25, 2015)

RichF said:


> Does that bring back the atmospheric detail, smoke, dirt, vegetation detail, and draw distance? The poster's argument is about more than just the "colors". Also, why should people have to rely on a third-party utility in the first place?


Except for SweetFX, there are no 3rd-party utilities. Everything is in the config files, friend.


----------



## bug (May 25, 2015)

RichF said:


> Does that bring back the atmospheric detail, smoke, dirt, vegetation detail, and draw distance? The poster's argument is about more than just the "colors". Also, why should people have to rely on a third-party utility in the first place?



What do you mean "bring back"? They were never there to begin with.
Whatever you saw back in 2013 was obviously a tech demo or pre-rendered. Neither the GTX 980 nor the Radeon 290X existed back then, so nobody could accurately predict what would be possible to ship in 2015. One could only guess.


----------



## commander calamitous (May 26, 2015)

bug said:


> What do you mean "bring back"? They were never there to begin with.
> Whatever you saw back in 2013 was obviously a tech-demo or pre-rendered. Neither the GTX980 nor the Radeon 290X existed back then so nobody could accurately predict what would be possible to ship in 2015. One could only guess.


Of course it was, and anyone with a brain knows it would be, but the trailers themselves do say "in-game footage".


----------



## AsRock (May 26, 2015)

commander calamitous said:


> Of course it was, and anyone with any brain knows it will be, but in the trailers themselves it does say "in game footage"



Yes, but as always, you cannot expect it to be the same or better, as it was not a finished product. Now, if it had been released and then got nerfed, that would be a different matter; however, that's not the case.

Like with most demos, there's a warning that it's not the finished product; you just would not have seen that. If it was at E3 or wherever, playing the demo, it would have said that somewhere.

People need to stop this BS.


----------



## bug (May 26, 2015)

commander calamitous said:


> Of course it was, and anyone with any brain knows it will be, but in the trailers themselves it does say "in game footage"



Even when a trailer says "in-game footage", you know the whole world isn't there yet, the AI isn't finished; there are just too many unknowns at that point. Now, if you have a playable beta, that could be a fair indicator. I always take pre-release material as a guideline. If I want to know what a game looks like, I look at screenshots in day-1 reviews.

Also keep in mind that, because of technical difficulties, video cards are still stuck on 28nm, whereas they were supposed to be on 20 or even 16nm by today. That would probably have allowed CDPR to enable more stuff in their engine. And btw, settings beyond Ultra are available directly in the config files. The game _can_ actually look better than Ultra. Next year, when video cards move to 16nm and stacked memory comes with much greater bandwidth, we may actually be able to go beyond Ultra (not on mid-range cards, but enthusiasts will most likely benefit).

And one last thing: having played the first two installments, I preordered this one without even looking at how it looked. I knew it was going to look great (and bring my video card to its knees) and it does.


----------



## rodneyhchef (May 26, 2015)

If you turn everything on, the VRAM usage goes right up to ~3.5GB at 4K. I think I was hitting the 'slow' RAM on my 970, as there were some slideshow cutscenes and texture pop-in. It just happened in one situation where the memory usage was around 3.4GB (the cutscene in the palace in Vizima, with Geralt having a shave).

It's just barely playable if you have a G-Sync monitor and a single 970, but I am going to be purchasing another, or possibly upgrading to a 980 Ti, depending on how expensive it is.


----------



## ZweiGaming (May 26, 2015)

How does the game run with GTX 970 SLI, does anybody know?


----------



## rodneyhchef (May 26, 2015)

I found a 980 SLI bench suite, and that setup beat a Titan X. Scaled pretty well.

http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark


----------



## DeViLzzz (May 28, 2015)

All I've got to say is I don't like how this game doesn't look like what they showed at E3, in 2013 I believe it was. It's Ubi**** all over again.


----------



## rysyndrome (May 29, 2015)

ZweiGaming said:


> how does the game run with Gtx 970 sli, does anybodys know?



I have all the graphics settings on ultra/max except for HairWorks (off), and it gives me a constant 60fps at 1080p. Running my triple screen setup (5670x1080), the best I can do is 40fps on low settings, no overclock. I hadn't needed to overclock until Witcher 3 and GTA5 were released. Thinking about selling the GTX 970s and getting SLI GTX 980 Tis for the additional memory. GTA5 at 5670x1080 cripples the GTX 970s.


----------



## RichF (May 29, 2015)

bug said:


> What do you mean "bring back"? They were never there to begin with.
> Whatever you saw back in 2013 was obviously a tech-demo or pre-rendered. Neither the GTX980 nor the Radeon 290X existed back then so nobody could accurately predict what would be possible to ship in 2015. One could only guess.


You really didn't get my post?

The argument that the game had to be dropped down to the current quality level strikes me as bogus, but if it makes people happy that's fine. Keeping VRAM usage around 1-2 GB was an optional, not required, decision.


----------



## Bansaku (Jun 6, 2015)

rysyndrome said:


> GTA5 at 5670x1080 cripples the GTX970s.



Not trying to be a team Red fanboy here, but with FXAA and everything maxed at 6088x1080, I can get a steady 60FPS in GTA V with my CFX HD7950s (OC'd). Try turning grass density down 1 notch.


----------



## rysyndrome (Jun 7, 2015)

Bansaku said:


> Not trying to be a team Red fanboy here, but with FXAA and everything maxed 6088x1080 I can get a steady 60FPS in GTA V with my CFX HD7950s (OC'd). Try turning grass density down 1 notch.



I think it's an SLI or VRAM issue with the SLI 970s in GTA5. I am running Med-High settings with FXAA and getting 85fps on average. However, there are some weird graphical anomalies as I get above 3-3.5GB VRAM usage that make the game unplayable. Only by turning the settings, and thus VRAM usage, way down does this not happen.


----------



## Maya2009 (Jun 7, 2015)

rysyndrome said:


> I think it's a SLI or VRAM issue with the SLI 970s on GTA5.  I am running Med-High settings with FXAA and getting 85fps on avg.  However there are some weird graphical anomalies as I get above 3-3.5GB VRAM usage that make the game unplayable.  Only turning the settings way down and thus VRAM usage does this not happen.



It's an architecture fault: it's a 3.5GB VRAM + 0.5GB chunk card, and only GTA 5 and Battlefield 4 are struggling with this.


----------



## Prima.Vera (Jun 8, 2015)

I want back the video cards where you could add the amount of RAM you wanted yourself.
I remember back in the day, when I upgraded my S3 ViRGE GX from 1MB of VRAM to 4MB of VRAM, and all of a sudden I could play Duke Nukem 3D and ROTT from ~30fps at 400x300 VGA resolution up to ~65fps at 800x600 SVGA. Oh, what a jump in quality that was! But those were the good times.


----------



## Footman (Jun 8, 2015)

Interesting that I am not able to get SLI working properly on my GTX 970 SLI rig...
Tried everything, and the second GPU is snoozing at 135MHz during gameplay.

Anyone else seeing SLI issues?


----------



## Doyle66 (Jun 11, 2015)

silapakorn said:


> I want to upgrade to 1440p screen too but can't find any brand that goes bigger than 28".


Have you looked at this? Acer Predator XR341CK


----------



## Dan848 (Jun 12, 2015)

Finners said:


> Not great for AMD. Will you retest when/IF AMD release a driver for this with optimisations?
> 
> Terrible from AMD not to have a driver out in time.



An AMD driver has been out since May 2015:

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## Finners (Jun 13, 2015)

Dan848 said:


> An AMD driver has been out since May 2015:
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx



My post was on the 20th; those drivers were released on the 28th.


----------



## rysyndrome (Jun 18, 2015)

Maya2009 said:


> It's an architecture faults that it's 3.5GB VRAM + 0.5M Chunk card, only GTA 5 and Battlefield 4 are struggling with this.



I removed GeForce Experience, and it seems to have not just helped with the graphical anomalies but also raised my FPS across both games on the SLI 970s. Will keep testing it.


----------



## EarthDog (Jun 25, 2015)

You know what would be awesome in these game performance analyses... putting CPU results in here too. For example, how it runs on Intel dual, dual with HT, quad, quad with HT, hex with HT... and the same for AMD: quad/hex/octo. Then overclock and underclock them...

wow, that is a lot of work, LOL!


----------



## Dan848 (Jun 25, 2015)

As we all know by now, the AMD Radeon R9 Fury X is here... Time for a new review/thread.


----------



## EarthDog (Jun 26, 2015)

Yay... an Ohioan!


----------



## bug (Jun 26, 2015)

Dan848 said:


> As we all know by now, the AMD Radeon R9 Fury X is here... Time for a new review/thread.



It's a little below the 980 Ti. Trading blows in some games, beating it in a select few, but overall slightly below. It drains more power, too.
And it's not unexpected either: while it uses HBM, the much lower memory clock means there's not that much more bandwidth. And the raw power hasn't seen much of a bump either.

Edit: Do not read the above as criticism of AMD. Because of TSMC, they're stuck on 28nm like everybody else, so they can't physically fit more processing power into the current GPUs. Remember, the larger the die, the higher the cost to build one. But next year, when GPU makers will (hopefully) be able to make the jump to 16nm, there will be good things in store for us.


----------



## jonathan1107 (Aug 13, 2015)

Have any of you guys had worse frame rates in The Witcher 3 since patch 1.08?


----------



## No Nrg (Aug 24, 2015)

jonathan1107 said:


> Have any of you guys had worse frame rates in The Witcher 3 since patch 1.08?


I've been beating the benchmarks in this original article with an i5-4690K and GTX 980 at 1440p.

Average 55fps with everything on Ultra, except foliage distance, which is set to High.


----------

