# Rise of the Tomb Raider: Performance Analysis



## W1zzard (Jan 29, 2016)

We benchmark Rise of the Tomb Raider on 12 graphics cards, in four resolutions, including SLI. Also included are measurements of VRAM consumption, which is high, but seems extremely well optimized.



----------



## The Quim Reaper (Jan 29, 2016)

Yay!!!

1440p and my 970 SLI setup is king of the hill, awesome.


----------



## RCoon (Jan 29, 2016)

Memory management in this game is excellent, particularly considering this game looks absolutely beautiful at almost every turn.


----------



## Prima.Vera (Jan 29, 2016)

Why no more tests on older hardware like the 780 Ti cards???


----------



## Ferrum Master (Jan 29, 2016)

It simply chews through everything you throw at it... seems reasonable to me, actually.

It would be fun to see when the game starts to stutter with 4 GB, 6 GB, etc. of system RAM. Is at least 6 GB (usually X58) really still enough?


----------



## NC37 (Jan 29, 2016)

"4GB is all you'll ever need at 1080p and under." - Said everyone who has just been proven an idiot for buying into that crap.

TOLD YOU ALL FREAKEN SO!!! BITE MY SHINY 8GB VRAM!!! That will be utterly obsolete in due time for sure.

Nice that it is optimized and the 970 bug isn't hampering it too badly. But let's wait and see what else is coming. There are still a lot of sloppy port devs to go for future games.

Hopefully AMD updates their drivers for it. I've seen other benchmarks of this game and it really looked like AMD was behind on it again.


----------



## Xzibit (Jan 29, 2016)

A lot of reviewers are reporting stutter fest and most of them reviewed it with Nvidia hardware.



			
Destructoid said:

> Update: Due to playing a pre-release build, I did not have access to Nvidia's Game Ready Drivers while writing this. I can now confirm that installing the Game Ready Drivers that were released today (January 27) _did not_ fix the problems I discuss below.
> 
> In fact, I would say it's made things worse: the stuttering is more frequent and the loading times are now two or three times longer than what I saw pre-drivers.





			
DSOGaming said:

> At this point, we should also note that the game’s Very High textures require GPUs with at least 3GB of VRAM. While enabling them, we noticed minor stutters. By dropping their quality to High, we were able to enjoy the game without any stutters at all.





			
PCWorld said:

> UPDATE, 2:00 PM Pacific: I’ve installed Nvidia’s Game Ready Drivers and it helped a bit, but didn’t completely eliminate the stuttering. The Geothermal Valley continues to display precipitous drops in frame rate, falling from around 100 down to 55. Tweaking the Level of Detail down a notch gained me back five frames (bringing it to a slightly-stuttery 60), but be aware that even a high-powered rig might show quite a bit of slowdown in these larger areas regardless of how the rest of the game runs.





			
PCGamer said:

> While my 970 GTX couldn't keep up with the demands of running every option at maximum—dropping to a stutter during cutscenes and set pieces—a few sensible reductions had it running smoothly and consistently at 60 frames per second.



I'll wait until at least the 3rd patch is released.


----------



## W1zzard (Jan 29, 2016)

Xzibit said:


> A lot of reviewers are reporting stutter fest and most of them reviewed it with Nvidia hardware.


Those used the pre-release version of the game. The game runs perfectly here (rig in my system specs and VGA test rig as listed in the article)


----------



## Jack1n (Jan 29, 2016)

I think most people would be more interested in the 390 non-X rather than the 390X in the benchmark results.


----------



## Xzibit (Jan 29, 2016)

W1zzard said:


> Those used the pre-release version of the game. The game runs perfectly here (rig in my system specs and VGA test rig as listed in the article)



Is there a difference? I'm wondering if it's just the name or whether there was an update.

One would think they would have updated their reviews by now; most of the article updates refer to the drivers, which ranged from little improvement to making things worse.

Judging by the discussion on Steam, it seems to be the same thing, with people reporting stuttering and frame drops.


----------



## Enterprise24 (Jan 29, 2016)

The 780 Ti is around a 970, right?


----------



## msamelis (Jan 29, 2016)

It feels a little awkward getting just 60 fps with a 980 Ti and a ROG Swift. I am not complaining, but you know, I was expecting 50% more fps. In any case the game looks beautiful, but we should have had the PC version months ago. Thanks for the review, W1zzard.


----------



## siluro818 (Jan 29, 2016)

Jack1n said:


> I think most people would be more interested in the 390 non x rather then the 390x in the benchmark results.


At 1440p it should be just under or equal to a GTX 980, which yet again shows what great value these cards provide. On my OC'd Sapphire Nitro the game never drops under 30 FPS, and that's with maxed-out settings, not even Very High (apart from the hair, and of course I use SMAA, not super-sampling AA). Also, this is the first game I've seen use all 8 GB of VRAM. That's why I dropped by here: to see if my Afterburner has gone nuts, or if it's really filling up 6.5-7.5 GB, which is where things are most of the time.


----------



## the54thvoid (Jan 29, 2016)

Xzibit said:


> Is there a difference? I'm wondering if it's just the name or whether there was an update.
> 
> One would think they would have updated their reviews by now; most of the article updates refer to the drivers, which ranged from little improvement to making things worse.
> 
> Judging by the discussion on Steam, it seems to be the same thing, with people reporting stuttering and frame drops.



One such review? (PC World)



> The biggest problems seem to occur in a locale known as the “Geothermal Valley,” a large wooded area with numerous swaying trees. Look at the trees, the frame rate drops. Turn the other direction, it shoots back up again. That’s on a 980 Ti, but a friend of mine is running an R9 280X and tells me he experiences the same issue in the same place.



Also affects AMD.

I've not hit Geothermal Valley yet, but my game is smooth as silk even at <60 fps.


----------



## Ferrum Master (Jan 29, 2016)

the54thvoid said:


> game is smooth as silk even at <60fps.



Getting close to that 30FPS cinematic game experience.


----------



## the54thvoid (Jan 29, 2016)

Ferrum Master said:


> Getting close to that 30FPS cinematic game experience.



Well, 50-60 fps. With SMAA on (not x2, just on) and everything else maxed. 1440p.

Yup. Butter smooth. Even when I had SMAA at x4, it was about 35-40 fps and still silky. Glad I never bought a G-Sync monitor. If devs just tried harder, games could be really nice to play.


----------



## birdie (Jan 29, 2016)

What's great about this game is that the Lowest and Highest quality settings are pretty much indistinguishable aside from tessellation, slightly better textures, and improved lighting, which basically means that you can tune the game for even mid-range cards without compromising visual fidelity.

The geometry is pretty much the same all around.


----------



## Bytales (Jan 29, 2016)

Where is my 32 GB HBM2-per-GPU quad Polaris setup? I've been waiting all my life for it!


----------



## Ferrum Master (Jan 29, 2016)

the54thvoid said:


> Well, 50-60 fps. With SMAA on (not x2, just on) and everything else maxed. 1440p.
> 
> Yup. Butter smooth. Even when I had SMAA at x4, it was about 35-40 fps and still silky. Glad I never bought a G-Sync monitor. If devs just tried harder, games could be really nice to play.



Yeah, we have very similar machines... Did you try seeing how it looks with DSR x4? Just to spoil your mood, see a proper way to get rid of jaggies, and make a hole in your wallet.


----------



## newtekie1 (Jan 29, 2016)

NC37 said:


> "4GB is all you'll ever need at 1080p and under." - Said everyone who has just been proven an idiot for buying into that crap.
> 
> TOLD YOU ALL FREAKEN SO!!! BITE MY SHINY 8GB VRAM!!! That will be utterly obsolete in due time for sure.
> 
> ...



Except if you are intelligent enough to actually read:



			
W1zzard said:

> Using GTX Titan X, which has 12 GB of VRAM, we tested the memory usage of the game. As you can see, it always fills up around 6-7 GB of VRAM. This may sound shocking at first, but *in reality the game runs very well with cards that don't have as much memory* - look at the performance charts. It seems that *Crystal Dynamics' engine will happily use as much VRAM as it can, but is very well optimized to make do with much less, without sacrificing framerate*.



This is one of those sloppy devs, this is as bad as it gets.  Just cram all the textures possible into VRAM, even if they aren't anywhere near being used to render the current scene.  This is what makes it seem like we need more RAM, when we really don't.


----------



## wiak (Jan 29, 2016)

Xzibit said:


> A lot of reviewers are reporting stutter fest and most of them reviewed it with Nvidia hardware.
> 
> 
> 
> ...


Hmm, did they ever try it on AMD hardware? On my 7970 it runs buttery smooth with the default High settings, and a little slower with high AA / PureHair on Very High, but still a smooth 40-60 fps range.

"Think of GameWorks as a set of game visual effects and features that only run on NVIDIA GeForce GPUs."
Not really; they work on AMD, but AMD has to spend at least a week of manpower optimizing them, as they are closed binary blobs, unlike GPUOpen or Bullet physics.


----------



## GhostRyder (Jan 29, 2016)

Going to download and try this game out soon!  Though I may wait for a CFX profile if I want it to run up to 144hz at 1440p.


----------



## trog100 (Jan 29, 2016)

The game can't need that much memory, otherwise the 970 SLI setup would not be doing so well.

General performance looks similar to Witcher 3 to me.

trog


----------



## Casecutter (Jan 29, 2016)

W1zzard said, "NVIDIA has released an optimized driver for the game a couple of days ago, while we haven't heard a *peep out of AMD - like usual*. Even though the game runs great using AMD's latest 16.1 driver from earlier this month, I wish AMD would follow NVIDIA's practice to release an updated driver before a new big game comes out. This would give *users the assurance that their rig will be ready* for the game and also transport the message to customers that AMD *cares about them*."

Really... Nvidia has had this game for months to play with and optimize, and you harp on RTG! I'd bet RTG has had maybe six weeks since they received the final version, and somehow they need to do better? Man, if this was Nvidia's best effort, I'd hate to see where they started. Perhaps RTG looked at it and said: we've got no reason to mess with this, it runs fine. From the statement(s) above it came across like RTG has an issue, when they obviously don't... As you said, "The game feels right, provides immersive gameplay and comes with amazing graphics that run well on both."

Just because Nvidia has insecurities, they have to portray that "they did something" to get press; or probably most of their effort went to cleaning up their own Gamewreck issues, which seems just as likely. Nvidia tags it "Game Ready" and there are oohs and aahs like it's magical...

Is it me, or do all these Nvidia titles now feel overly dark and strangely the same?


----------



## okidna (Jan 29, 2016)

the54thvoid said:


> I've not hit Geothermal Valley yet, but my game is smooth as silk even at <60 fps.



Geothermal Valley (GV) is definitely the heaviest environment in this game, at least for now.
I just spent 3 hours roaming around GV doing tombs, crypts, and collectibles. Performance-wise it's still okay for me; FPS goes down as low as 35 FPS in the jungle (foliage-heavy area) with *this setting*, but no stutter.


----------



## BiggieShady (Jan 29, 2016)

Kudos for using a GIF to show all the options in the settings window... this should be done everywhere.
I don't know why, but I didn't expect W1zzard's hand to be that steady.


----------



## the54thvoid (Jan 29, 2016)

Casecutter said:


> Is it me, or do all these Nvidia titles now feel overly dark and strangely the same?



It's not an Nvidia title. They simply did the old 'buy in' for graphics priority. You know, like AMD did with the first one with TressFX. You know, TressFX, the proprietary software. And anything with bunnies and deer and archaeology can't be dark. Unless you're a goth.

And for some reason WCCFtech say that Pure Hair is AMD? That's not right. Is it?



> The release is fitted with a list of graphical updates that include the high-end ambient occlusion technique from NVIDIA that we know as HBAO+ along with higher Anisotropic filtering of 16x, full hardware based tessellation, increased geometry and textures, dynamic foliage compared to static allowing foliage to react dynamically to the player movement in the environment and last but not least, AMD’s Pure hair technology which uses their TressFX 3.0 engine to render and emulate Lara’s hair realistically.
> 
> Read more: http://wccftech.com/rise-of-the-tomb-raider-pc-performance-analysis/#ixzz3yefDfsc7​


----------



## brutlern (Jan 29, 2016)

the54thvoid said:


> It's not an Nvidia title. They simply did the old 'buy in' for graphics priority. You know, like AMD did with the first one with TressFX. You know, TressFX, the proprietary software. And anything with bunnies and deer and archaeology can't be dark. Unless you're a goth.
> 
> And for some reason WCCFtech say that Pure Hair is AMD? That's not right. Is it?



Given that AMD's TressFX is part of its new open-source initiative GPUOpen, my guess is Nixxes/Crystal Dynamics took TressFX, tweaked it, and made a new one called PureHair. Just speculating here.


----------



## Casecutter (Jan 29, 2016)

Sorry, I can't truly say what it is... but perhaps it's the games that are being released, and that they've had a more brooding, darkish feel even when out in the light of day. Sure, it's like always this epic Goth feel, if you will. I suppose this is what HDR rendering, to increase the quality of color in pixels, is supposed to help.

And correct, I'm not sure who/what PureHair is... Nvidia has HairWorks, RTG has TressFX. Let's all finally join hands and just get on with great hair... and dispose of the triangular matted look of yore. It's like a mullet: just say no.


----------



## altermere (Jan 29, 2016)

brutlern said:


> Given that AMD's TressFX is part of its new open-source initiative GPUOpen, my guess is Nixxes/Crystal Dynamics took TressFX, tweaked it, and made a new one called PureHair. Just speculating here.





> PureHair, a new hair solution made in collaboration between AMD and Eidos Montreal's research and development lab, improves upon TRESSFX.


http://motherboard.vice.com/read/glimpse-of-the-purehair-hair-rendering-engine-at-gdc

Also, what about CPUs? Does a 5820K provide any advantage in smooth frametimes over quad-core i7s?


----------



## GLD (Jan 29, 2016)

This will be playable on my rig, but dang, it may struggle running on higher settings.


----------



## Xuper (Jan 29, 2016)

Impossible. Look at this:

*[benchmark charts]*

Now Fury X is faster than Titan?


----------



## Kissamies (Jan 29, 2016)

So I guess I won't have problems with 290..


----------



## HumanSmoke (Jan 29, 2016)

behrouz said:


> Now Fury X is faster than Titan?


It looks like the hierarchy is sensitive to settings and maps. Depending on what is used, you could make a case either way... and either way, there isn't much in it. ComputerBase's results also suggest that the stuttering is more apparent on HBM-equipped cards - presumably some driver optimization is still required for the tech.


----------



## Fluffmeister (Jan 29, 2016)

I'm only a few hours into the game, but damn... it's pretty.



behrouz said:


> Impossible. Look at this:



What is impossible? There is a difference between VRAM usage and the actual baseline requirement to be playable, although equally that's not to say it doesn't impact performance. But the game brings a world of hurt to GPUs at 4K, and as @HumanSmoke's link suggested, different maps put different loads on the cards.



behrouz said:


> Now Fury X is faster than Titan?



Sure, by 1 FPS, but then Fiji is already under water, and the Titan X is pretty conservatively clocked as far as GM200s go; hell, it's basically got its brakes on.

Regardless, AMD's top dog does just fine.


----------



## RustyKats (Jan 29, 2016)

Why did you add that jibe against AMD?

Performance is outstanding with the Steam release version (there was actually a pre-day-1 patch).

The 390X is roughly equal to or faster than the 980.

Fury X scales better at higher resolutions.

On Twitter, RTG/AMD have said they are releasing a driver in the next day or two to enable CrossFire and further optimize the game.

This is an NV-sponsored title; optimization normally isn't this quick with the closed-source GameWorks features.


----------



## Osjur (Jan 30, 2016)

Everything maxed except SSAA on a Fury X, and it's playable at 1600p (35-60 fps), but it stutters if I use Very High textures. Changing that setting to High, and it doesn't stutter anymore. So my guess is that 4 GB of VRAM is just not enough at this resolution :/

A side note: the Very High preset is not maxed, as there are still shadow and Pure Hair settings which you can increase.

Another note: HBAO+ works with AMD cards; you can even test it in game. The performance hit is around 10-20% on a Fury X. I don't know why TPU says it's an Nvidia-only feature?

Though I will most likely just disable HBAO+, as the performance hit is not worth the IQ increase (and in some cases, decrease).


----------



## buffyvpsfan (Jan 30, 2016)

WTH, no R9 390 in the review? I know it's almost just like a 970, except that Nvidia lied about how much VRAM the 970 had, and the 390 has 4 GB more VRAM. Couldn't that make a difference in Tomb Raider? Seems VERY biased that you didn't include a team-red GPU here to compare against the 970.


----------



## W1zzard (Jan 30, 2016)

Osjur said:


> Another note: HBAO+ works with AMD cards; you can even test it in game. The performance hit is around 10-20% on a Fury X. I don't know why TPU says it's an Nvidia-only feature?
> 
> Though I will most likely just disable HBAO+, as the performance hit is not worth the IQ increase (and in some cases, decrease).


fixed, thanks



buffyvpsfan said:


> WTH, no R9 390 in the review? I know it's almost just like a 970, except that Nvidia lied about how much VRAM the 970 had, and the 390 has 4 GB more VRAM. Couldn't that make a difference in Tomb Raider? Seems VERY biased that you didn't include a team-red GPU here to compare against the 970.


Unfortunately I don't have a 390 non-X, AMD doesn't seem interested in sampling it.


----------



## xfia (Jan 30, 2016)

Allocating the whole VRAM pool is not a bad thing, and it doesn't mean that you need 4 GB for 1080p or that 8 GB is some kind of minimum now or in the foreseeable future.
The game is beautiful, right? Well, that is how they do it well in this case.

*Memory management* is the act of managing computer memory at the system level. The essential requirement of memory management is to provide ways to dynamically allocate portions of memory to programs at their request, and free it for reuse when no longer needed. This is critical to any advanced computer system where more than a single process might be underway at any time.
Several methods have been devised that increase the effectiveness of memory management. Virtual memory systems separate the memory addresses used by a process from actual physical addresses, allowing separation of processes and increasing the effectively available amount of RAM using paging or swapping to secondary storage. The quality of the virtual memory manager can have an extensive effect on overall system performance.
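That distinction between allocated and actually-used memory is easy to demonstrate at the OS level. A minimal Python sketch (sizes arbitrary; lazy page commit, as on Linux, is an assumption here): reserving a large region costs almost nothing until pages are touched, much like a game claiming a big VRAM pool it may never fully need.

```python
import mmap

# Reserve a 1 GiB anonymous mapping. On Linux this mostly reserves
# address space; physical pages are committed lazily, when touched.
region = mmap.mmap(-1, 1 << 30)

# Touch only the first 4 MiB: resident memory stays small even though
# a full GiB is "allocated". Allocation is not the same as need.
region[: 4 << 20] = b"\x00" * (4 << 20)

print(len(region))  # -> 1073741824
region.close()
```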


----------



## arbiter (Jan 31, 2016)

Xzibit said:


> A lot of reviewers are reporting stutter fest and most of them reviewed it with Nvidia hardware.





W1zzard said:


> Those used the pre-release version of the game. The game runs perfectly here (rig in my system specs and VGA test rig as listed in the article)


Yeah, I used Very High settings at 1080p on a GTX 980 and got a pretty much flat 60 for 90+% of the game.


newtekie1 said:


> This is one of those sloppy devs, this is as bad as it gets. Just cram all the textures possible into VRAM, even if they aren't anywhere near being used to render the current scene. This is what makes it seem like we need more RAM, when we really don't


The company that did the port has done quite a few games that turned out well. TotalBiscuit said in his review that they did games like Thief, Deus Ex: Human Revolution, and Hitman: Absolution. The company has a history of doing the job pretty well.


Casecutter said:


> Really... Nvidia has had this game for months to play with and optimize, and you harp on RTG!


Really.... how far in advance did Nvidia have the LAST Tomb Raider game? I bet AMD had access to this game a lot earlier than Nvidia did with the last one.


----------



## Relayer (Jan 31, 2016)

W1zzard said:


> Unfortunately I don't have a 390 non-X, AMD doesn't seem interested in sampling it.



Aren't all 390/Xs aftermarket only? You wouldn't get them from AMD, would you? I assumed they'd come from the AIBs?


----------



## sweet (Jan 31, 2016)

Hope this won't turn out to be a bug fest like most GameWorks titles.


----------



## Fluffmeister (Jan 31, 2016)

Osjur said:


> Everything maxed except SSAA on a Fury X, and it's playable at 1600p (35-60 fps), but it stutters if I use Very High textures. Changing that setting to High, and it doesn't stutter anymore. So my guess is that 4 GB of VRAM is just not enough at this resolution :/
> 
> A side note: the Very High preset is not maxed, as there are still shadow and Pure Hair settings which you can increase.
> 
> ...



Indeed, HBAO+ is the only nvidia feature, and it still works on AMD cards as you say, so no drama there.

PC Perspective noticed spikes/stuttering on Fiji based cards too, probably just needs some driver love to fix it, or maybe a future game patch will address it.

http://www.pcper.com/reviews/Graphi...Performance-Results/Adding-GTX-970-and-R9-390


----------



## xfia (Jan 31, 2016)

behrouz said:


> Impossible. Look at this:
> 
> 
> 
> ...


It's common for AMD GPUs to pull ahead at higher resolutions, and it looks like that's still the case. It's because at every tier they use more powerful VRAM. NV wouldn't have a chance without AMD and JEDEC spoon-feeding them free technology.


----------



## jabbadap (Jan 31, 2016)

Fluffmeister said:


> Indeed, HBAO+ is the only nvidia feature, and it still works on AMD cards as you say, so no drama there.
> 
> PC Perspective noticed spikes/stuttering on Fiji based cards too, probably just needs some driver love to fix it, or maybe a future game patch will address it.
> 
> http://www.pcper.com/reviews/Graphi...Performance-Results/Adding-GTX-970-and-R9-390



Most likely due to the CPU; Guru3D's CPU scaling test shows that Fury works best with one core (AMD and DX11 threading). Needs a driver patch from AMD, though a DX12 rendering path patch from CD should work too.


----------



## newtekie1 (Jan 31, 2016)

arbiter said:


> The company that did the port has done quite a few games that turned out well. TotalBiscuit said in his review that they did games like Thief, Deus Ex: Human Revolution, and Hitman: Absolution. The company has a history of doing the job pretty well.


When I say sloppy, I mean in memory management only. Just slamming in as many textures as possible is sloppy, but that doesn't mean they are bad overall.


----------



## gpu2016 (Jan 31, 2016)

I'm playing DOTA 2 with my 980 Ti.


----------



## Casecutter (Feb 1, 2016)

arbiter said:


> Really.... how far in advance did Nvidia have the LAST Tomb Raider game? I bet AMD had access to this game a lot earlier than Nvidia did with the last one.


While I wholeheartedly see that a sponsor (Nvidia/RTG) will always have the advantage, that's not my gripe. Though Nvidia now promotes "Game Ready" release-day drivers, I can't recall AMD in the past expounding on such an advantage.

It was the idea that W1zzard included a disparaging paragraph about RTG not following suit, when clearly RTG was *not at all* sub-standard in its performance in the game. If RTG came with release-day drivers: _A)_ there would be calls of copying and trying to steal Nvidia's thunder; _B)_ they'd relinquish the position of seeing what Nvidia was delivering, having several days to ascertain real instances of "bugs" and getting such "fixes" into one all-encompassing release.

If RTG had an issue within the game or lagging performance, then yes, there would be reason to call RTG to task, but that was clearly not the case. The non-sponsoring side should be, as RTG has been, "amiable", and use the opportunity of being second fiddle to their advantage. Perhaps RTG is sitting on huge optimizations; knowing what they know, is there any reason to jump in with huge 10% improvements, or should they pocket those until later? Me, I'd wait five days or so, bring any fixes, and raise performance 2-3%, so as not to show my hand regarding what Nvidia might bring down the road.


----------



## No Nrg (Feb 1, 2016)

I was playing this weekend on my rig and had a pretty good experience at 1440p. I'm running FXAA and everything else maxed, but pure hair off (not worth the 5-8fps). Kept a 40-60fps rate with my overclocked GTX 980, which is perfect given my G-sync monitor. 

I do get random stutters mostly when large vistas are coming into view quickly and loading all at once or right before some cut scenes begin. Some intense scenes have me dropping frames or going into single digits. 98% of the time it is consistent and smooth though. Running FRAPS, I notice CGI cut scenes are locked at 30fps and I get a screen tear in the upper portion on all of these. 

Overall though the experience is fine, the game looks amazing. The issues above are nothing a future update couldn't fix; certainly nothing that would have me cry foul and claim it's a bad port.
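A side note for anyone chasing stutters like these: average FPS can hide them, since one long frame barely moves the mean. A toy Python sketch with made-up frame times (not real measurements) shows why frame-time spikes, not the FPS counter, are the thing to watch:

```python
# Hypothetical per-frame render times in milliseconds: a steady ~60 fps
# run with a single 90 ms hitch in the middle.
frametimes = [16.7] * 30 + [90.0] + [16.7] * 29

avg_fps = 1000 * len(frametimes) / sum(frametimes)
worst = max(frametimes)

# Count frames taking over twice the median frame time: perceived hitches.
median = sorted(frametimes)[len(frametimes) // 2]
hitches = sum(1 for t in frametimes if t > 2 * median)

print(f"avg {avg_fps:.0f} fps, worst {worst:.0f} ms, {hitches} hitch(es)")
# -> avg 56 fps, worst 90 ms, 1 hitch(es)
```

The average still reads like a healthy 56 fps even though one frame took 90 ms, which is exactly the kind of hitch a player feels.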


----------



## okidna (Feb 2, 2016)

New Crimson Hotfix (16.1.1) for AMD users :

*Highlights*

- Performance/Quality improvements and an AMD Crossfire™ profile are available for Rise of the Tomb Raider™
- An AMD Crossfire™ profile is available for Fallout 4

Link: http://support.amd.com/en-us/kb-art...mson-Edition-16.1.1-Hotfix-Release-Notes.aspx


----------



## Fluffmeister (Feb 2, 2016)

And here are some updated results using AMD's new hotfix driver:

http://www.computerbase.de/2016-02/...ise-of-the-tomb-raider-1920-1080-crimson-1611

http://www.pcgameshardware.de/Rise-...451/Specials/Grafikkarten-Benchmarks-1184288/


----------



## Kanan (Feb 3, 2016)

newtekie1 said:


> Except if you are intelligent enough to actually read:
> 
> 
> 
> This is one of those sloppy devs, this is as bad as it gets.  Just cram all the textures possible into VRAM, even if they aren't anywhere near being used to render the current scene.  This is what makes it seem like we need more RAM, when we really don't.


It has nothing to do with sloppy. It's the opposite. The game engine works optimally and uses as much VRAM as it can get; that's why it can show high VRAM usage on cards with large amounts of VRAM. The reason it does that is so it can avoid reloading textures. The same goes for Call of Duty MW3 and Black Ops 3, and both games run pretty well on GPUs that have only 2 or 3 GB of VRAM. Sloppy would be if the game always needed that much VRAM, but it doesn't. It just tries to use higher VRAM capacities when they are there, rather than ignoring them like most other game engines do.
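The trade-off described here, spending spare memory to avoid re-streaming textures, can be sketched as an opportunistic LRU cache. This is an illustrative Python toy, not the engine's actual code: a bigger budget means fewer reloads, while a small budget still works, just with more streaming.

```python
from collections import OrderedDict

class TextureCache:
    """Opportunistic LRU cache: fill whatever budget is available,
    evicting least-recently-used entries only under pressure."""

    def __init__(self, budget):
        self.budget = budget          # capacity in "GB"
        self.used = 0
        self.cache = OrderedDict()    # name -> size
        self.reloads = 0              # misses that force a re-stream

    def request(self, name, size):
        if name in self.cache:
            self.cache.move_to_end(name)      # mark recently used
            return
        self.reloads += 1                     # miss: stream it in
        while self.cache and self.used + size > self.budget:
            _, freed = self.cache.popitem(last=False)
            self.used -= freed
        self.cache[name] = size
        self.used += size

# Same access pattern, two budgets: the big pool fills up once and
# then stops reloading; the small pool works too, just streams more.
frames = [f"tex{i % 8}" for i in range(100)]   # 8 textures, 1 GB each
results = {}
for budget in (8, 4):
    cache = TextureCache(budget)
    for tex in frames:
        cache.request(tex, 1)
    results[budget] = cache.reloads
print(results)  # -> {8: 8, 4: 100}
```

The 8 GB budget pays the streaming cost only once per texture; the 4 GB budget keeps playing fine, it just reloads more often, which matches "fills 6-7 GB when available but makes do with less".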


----------



## newtekie1 (Feb 3, 2016)

Kanan said:


> It has nothing to do with sloppy. It's the opposite. The game engine works optimally and uses as much VRAM as it can get; that's why it can show high VRAM usage on cards with large amounts of VRAM. The reason it does that is so it can avoid reloading textures. The same goes for Call of Duty MW3 and Black Ops 3, and both games run pretty well on GPUs that have only 2 or 3 GB of VRAM. Sloppy would be if the game always needed that much VRAM, but it doesn't. It just tries to use higher VRAM capacities when they are there, rather than ignoring them like most other game engines do.



No, most other game devs actually put some effort into only loading what is actually needed into VRAM, instead of just cramming in everything they possibly can.


----------



## Kanan (Feb 3, 2016)

newtekie1 said:


> No, most other game devs actually put some effort into only loading what is actually needed into VRAM, instead of just cramming in everything they possibly can.


You don't get my point. It uses all it can so it doesn't need to reload textures. Just because a different engine has another (in this case, I'd say better) philosophy doesn't mean it's "sloppy". Tomb Raider 2013 had a very good engine with some of the best GPU utilization, even with AMD cards that tend to be underused at 1080p, and with CF + SLI, on almost any configuration and even with weak CPUs. I played it with my HD 5970 on extreme settings and it worked very well; it was one of the few games that worked really well on that setup. And this is its successor with an improved version of the same engine. It just needs some more time to get all the bugs out, but the VRAM usage is working as they intended, I guess. Same with COD. Both engines scale ideally with how much VRAM is available.


----------



## Kissamies (Feb 3, 2016)

gpu2016 said:


> I'm playing DOTA 2 with my 980 Ti.


And I play Broken Sword 1 with my R9 290. So what's the point? Having a powerful GPU doesn't mean that only the newest games should be played.


----------



## AHMAD_ESLIM (Feb 7, 2016)

For all fans who can't use the profile for RISE OF THE TOMB RAIDER on AMD: you have to enable the "exclusive fullscreen" option in the game settings. That makes the second card work in the default CF mode.


----------



## mcraygsx (Feb 12, 2016)

I wish TechPowerUp would include some aged cards like the 780 Ti and 690.


----------



## xfia (Feb 14, 2016)

@mcraygsx likes eBay, PayPal, and blowing shit up in space with lasers


----------



## Bluescreendeath (Feb 19, 2016)

NC37 said:


> "4GB is all you'll ever need at 1080p and under." - Said everyone who has just been proven an idiot for buying into that crap.
> 
> TOLD YOU ALL FREAKEN SO!!! BITE MY SHINY 8GB VRAM!!! That will be utterly obsolete in due time for sure.
> 
> ...



Despite what this article claims, the benchmarks look more like piss-poor optimization. There is a huge performance hit going from 4 GB to 2 GB. The problem is the game does NOT look that good compared to other games out there. There is no reason why it needs so much VRAM at 1080p and under. Crysis 3, the newer Far Crys, etc. all look just as good if not better, and run fine with 2 GB of VRAM at 1080p.

It looks like developers are getting lazier at optimizing their games for lower-spec PCs thanks to GPU hardware improving so much and so quickly.


----------



## valkylin000 (Mar 11, 2016)

*My hardware: Intel 5960X, EVGA 980 Ti SC SLI, Asus PG348Q 3440x1440 75 Hz G-Sync monitor. Driver ver: 364.51*

I got a strange situation with Rise of the Tomb Raider in SLI mode.

This is a screenshot with a single 980 Ti:

*[screenshot]*

Here is 980 Ti SLI mode with the same settings (used an iPad to photograph the SLI visual indicators):

*[photo]*

You can see the SLI performance is much lower in the same scene with the same settings, so what's wrong with it?


----------



## BiggieShady (Mar 11, 2016)

valkylin000 said:


> You can see the SLI performance is much lower in the same scene with the same settings, so what's wrong with it?


You could try a custom SLI profile through NVIDIA Inspector:


> In order to enable this new profile, you will need the NVIDIA Inspector Tool. Search for Rise of the Tomb Raider’s profile and change its SLI bits (DX11) to 0x080002F5. Then, press on the magnifier icon in order to reveal NVIDIA’s Undefined options. Search for 0x00A0694B and set it to 0x00000001.


or simply import a premade custom profile: ROTTR[possible-SLI-Variant].zip


----------



## okidna (Mar 12, 2016)

New patch just released, and guess what? They added DirectX 12 support and VXAO: http://steamcommunity.com/app/391220/discussions/0/405694115202867954/

I guess I'm going to install my Windows 10 copy to see what all the fuss is about.


----------



## bajs11 (Apr 12, 2017)

Is this a joke? GTX 970 SLI gets 8 GB of VRAM and outperforms almost everything?? So by that logic I should have just gotten another GTX 970 instead of upgrading to a GTX 1080.

FYI, VRAM doesn't stack if you run SLI.
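The point is easy to make concrete: with alternate-frame rendering, each GPU renders complete frames on its own and must hold the entire working set, so the usable pool is the smallest card's VRAM, not the sum. A one-line Python sketch (nominal capacities, ignoring the 970's 3.5 GB + 0.5 GB split):

```python
def effective_vram(cards_gb):
    """Usable VRAM under AFR SLI: limited by the smallest card,
    because every GPU keeps its own full copy of the working set."""
    return min(cards_gb)

print(effective_vram([4, 4]))  # two GTX 970s -> 4, not 8
```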


----------

