
Rise of the Tomb Raider: Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,757 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
We benchmark Rise of the Tomb Raider on 12 graphics cards at four resolutions, including SLI. Also included are measurements of VRAM consumption, which is high but seems extremely well optimized.

 
Memory management in this game is excellent, particularly considering this game looks absolutely beautiful at almost every turn.
 
Why no more tests on older hardware, like the 780 Ti cards?
 
It simply chews through everything you throw at it... seems reasonable to me, actually.

It would be fun to see when the game starts to stutter with 4GB, 6GB, etc. of system RAM. Is at least 6GB (typical for X58) still really enough?
 
"4GB is all you'll ever need at 1080p and under." -Said everyone who has just been proven an idiot for buying into that crap

TOLD YOU ALL FREAKEN SO!!! BITE MY SHINY 8GB VRAM!!! That will be utterly obsolete in due time for sure.

Nice that it is optimized and the 970 bug isn't hampering it too badly. But let's wait and see what else is coming. There are still a lot of sloppy port devs to go for future games.

Hopefully AMD updates their drivers for it. I've seen other benches of this game, and it really looked like AMD was behind on it again.
 
A lot of reviewers are reporting a stutter fest, and most of them reviewed it on Nvidia hardware.

Destructoid said:
Update: Due to playing a pre-release build, I did not have access to Nvidia's Game Ready Drivers while writing this. I can now confirm that installing the Game Ready Drivers that were released today (January 27) did not fix the problems I discuss below.

In fact, I would say it's made things worse: the stuttering is more frequent and the loading times are now two or three times longer than what I saw pre-drivers.

DSOGaming said:
At this point, we should also note that the game’s Very High textures require GPUs with at least 3GB of VRAM. While enabling them, we noticed minor stutters. By dropping their quality to High, we were able to enjoy the game without any stutters at all.

PCWorld said:
UPDATE, 2:00 PM Pacific: I’ve installed Nvidia’s Game Ready Drivers and it helped a bit, but didn’t completely eliminate the stuttering. The Geothermal Valley continues to display precipitous drops in frame rate, falling from around 100 down to 55. Tweaking the Level of Detail down a notch gained me back five frames (bringing it to a slightly-stuttery 60), but be aware that even a high-powered rig might show quite a bit of slowdown in these larger areas regardless of how the rest of the game runs.

PCGamer said:
While my 970 GTX couldn't keep up with the demands of running every option at maximum—dropping to a stutter during cutscenes and set pieces—a few sensible reductions had it running smoothly and consistently at 60 frames per second.

I'll wait until at least the 3rd patch is released.
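Reviewer stutter reports like the ones quoted above are usually backed by frame-time data rather than average FPS, since a good average can hide short spikes. A minimal sketch (hypothetical numbers, not measurements from the game) of computing average FPS and a 99th-percentile frame time from per-frame render times:

```python
# Quantifying "stutter": average FPS alone hides frame-time spikes,
# which are what players perceive as stuttering. All numbers here
# are hypothetical, not measurements from Rise of the Tomb Raider.

def fps_stats(frame_times_ms):
    """Return (average FPS, 99th-percentile frame time in ms)."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(0.99 * n))]
    return avg_fps, p99

# A run that averages near 60 FPS but has an occasional 50 ms spike:
times = [16.7] * 99 + [50.0]
avg, p99 = fps_stats(times)  # avg is ~58.7 FPS, yet p99 is 50.0 ms
```

The point of the example: this run still "averages 60 FPS", but the 50 ms outlier is exactly the kind of hitch the quoted reviews describe.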
 
A lot of reviewers are reporting a stutter fest, and most of them reviewed it on Nvidia hardware.
Those used the pre-release version of the game. The game runs perfectly here (rig in my system specs and VGA test rig as listed in the article)
 
I think most people would be more interested in the 390 non-X rather than the 390X in the benchmark results.
 
Those used the pre-release version of the game. The game runs perfectly here (rig in my system specs and VGA test rig as listed in the article)

Is there a difference? I'm wondering if it's just in name, or was there an update.

One would think they would have updated their reviews by now, and most of the article updates refer to the drivers, which ranged from little improvement to making things worse.

Judging by the discussion on Steam, it seems to be the same thing, with people reporting stuttering and frame drops.
 
The 780 Ti is around a 970, right?
 
It feels a little awkward getting just 60 fps with a 980 Ti and a ROG Swift. I am not complaining, but you know, I was expecting 50% more fps. In any case, the game looks beautiful, but we should have had the PC version months ago. Thanks for the review, W1zzard.
 
I think most people would be more interested in the 390 non-X rather than the 390X in the benchmark results.
At 1440p it should be just under or equal to the GTX 980, which yet again shows what great value these cards provide. On my OC'd Sapphire Nitro the game never drops below 30 FPS, and that's with maxed-out settings, not even Very High (apart from the hair, and of course I use SMAA, not super-sampling AA). Also, this is the first game I've seen use all 8GB of VRAM. This is why I dropped by here, to see if my Afterburner has gone nuts or if it's really filling up 6.5 - 7.5 GB, which is where things are most of the time.
 
Is there a difference? I'm wondering if it's just in name, or was there an update.

One would think they would have updated their reviews by now, and most of the article updates refer to the drivers, which ranged from little improvement to making things worse.

Judging by the discussion on Steam, it seems to be the same thing, with people reporting stuttering and frame drops.

One such review? (PC World)

The biggest problems seem to occur in a locale known as the “Geothermal Valley,” a large wooded area with numerous swaying trees. Look at the trees, the frame rate drops. Turn the other direction, it shoots back up again. That’s on a 980 Ti, but a friend of mine is running an R9 280X and tells me he experiences the same issue in the same place.

Also affects AMD.

I've not hit Geothermal Valley yet, but my game is smooth as silk even at <60 fps.
 
Getting close to that 30FPS cinematic game experience. :D

Well, 50-60fps. With SMAA on (not x2, just on) and everything else maxed. 1440p.

Yup. Butter smooth. Even when I had SMAA at x4, it was about 35-40 fps and still silky. Glad I never bought a G-Sync monitor. If devs just tried harder, games could be really nice to play.
 
What's great about this game is that the Lowest and Highest quality settings are pretty much indistinguishable aside from tessellation, slightly better textures, and improved lighting, which basically means you can adjust the game for even mid-range cards without compromising visual fidelity.

The geometry is pretty much the same all around.
 
Where is my 32GB-HBM2-per-GPU quad-Polaris setup? I've been waiting all my life for it!
 
Well, 50-60fps. With SMAA on (not x2, just on) and everything else maxed. 1440p.

Yup. Butter smooth. Even when I had SMAA at x4, it was about 35-40 fps and still silky. Glad I never bought a G-Sync monitor. If devs just tried harder, games could be really nice to play.

Yeah, we both have very similar machines... Did you try to see how it looks with DSR x4 :D, just to spoil your mood, see a proper way to get rid of jaggies, and make a hole in your wallet :D
 
"4GB is all you'll ever need at 1080p and under." -Said everyone who has just been proven an idiot for buying into that crap

TOLD YOU ALL FREAKEN SO!!! BITE MY SHINY 8GB VRAM!!! That will be utterly obsolete in due time for sure.

Nice that it is optimized and the 970 bug isn't hampering it too badly. But let's wait and see what else is coming. There are still a lot of sloppy port devs to go for future games.

Hopefully AMD updates their drivers for it. I've seen other benches of this game, and it really looked like AMD was behind on it again.

Except if you are intelligent enough to actually read:

W1zzard said:
Using a GTX Titan X, which has 12 GB of VRAM, we tested the memory usage of the game. As you can see, it always fills up around 6-7 GB of VRAM. This may sound shocking at first, but in reality the game runs very well on cards that don't have as much memory - look at the performance charts. It seems that Crystal Dynamics' engine will happily use as much VRAM as it can, but is very well optimized to make do with much less, without sacrificing framerate.

This is one of those sloppy devs; this is as bad as it gets. Just cram all the textures possible into VRAM, even if they aren't anywhere near being used to render the current scene. This is what makes it seem like we need more RAM, when we really don't.
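The VRAM logging the quoted Titan X test implies can be approximated with NVIDIA's own nvidia-smi tool. A minimal sketch: the `--query-gpu` and `--format` flags are real nvidia-smi options, but the helper names are mine, and the parser is shown separately so it can run without NVIDIA hardware.

```python
# Sketch of logging per-GPU VRAM usage by polling nvidia-smi,
# roughly the kind of measurement the quoted passage describes.
# The nvidia-smi flags are real; helper names are hypothetical.
import subprocess

def parse_mib(line):
    """Parse an nvidia-smi memory field such as '6712 MiB' into an int."""
    value, unit = line.split()
    assert unit == "MiB", f"unexpected unit: {unit}"
    return int(value)

def used_vram_mib():
    """Return a list with the used VRAM (in MiB) of each installed GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        text=True,
    )
    return [parse_mib(line) for line in out.splitlines() if line.strip()]
```

Polling `used_vram_mib()` once a second during play would show the fill-to-capacity behavior the article describes, regardless of whether those textures are actually needed for the current scene.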
 
A lot of reviewers are reporting a stutter fest, and most of them reviewed it on Nvidia hardware.

I'll wait until at least the 3rd patch is released.
Hmm, did they ever try it on AMD hardware? On my 7970 it runs buttery smooth with default High settings, and a little slower with high AA / PureHair on Very High, but still smooth in the 40-60 fps range.

"Think of GameWorks as a set of game visual effects and features that only run on NVIDIA GeForce GPUs."
Not really; they work on AMD, but AMD has to spend at least a week of manpower to optimize them, as they are binary blobs that are not open, unlike GPUOpen or Bullet physics.
 
Going to download and try this game out soon! Though I may wait for a CFX profile if I want it to run at up to 144 Hz at 1440p.
 
The game can't need that much memory, otherwise the 970 SLI setup would not be doing so well.

General performance looks similar to The Witcher 3 to me.

trog
 
W1zzard said, "NVIDIA released an optimized driver for the game a couple of days ago, while we haven't heard a peep out of AMD - as usual. Even though the game runs great using AMD's latest 16.1 driver from earlier this month, I wish AMD would follow NVIDIA's practice of releasing an updated driver before a big new game comes out. This would give users the assurance that their rig will be ready for the game and also send the message to customers that AMD cares about them."

Really... Nvidia has had this game for months to play with and optimize, and you harp on RTG! I'd bet RTG has had maybe 6 weeks since they received the final version, and somehow they need to do better? Man, if this was Nvidia's best effort, I'd hate to see where they started. Perhaps RTG looked at it and said: we've got no reason to mess with this, it runs fine. From the statement(s) above it came across like RTG has an issue; they obviously don't... As you said, "The game feels right, provides immersive gameplay and comes with amazing graphics that run well on both".

Just because Nvidia has insecurities, they have to portray that "they did something" to get press; or probably most of their effort went into cleaning up their own Gamewreck issues. Nvidia tags it "Game Ready" and there are oohs and ahhs like it's magical...

Is it just me, or do all these Nvidia titles now always feel overly dark and strangely the same?
 
I've not hit Geothermal Valley yet, but my game is smooth as silk even at <60 fps.

Geothermal Valley (GV) is definitely the heaviest environment in this game, at least for now.
I just spent 3 hours roaming around in GV doing tombs, crypts, and collectibles. Performance-wise it's still okay for me; FPS goes down as low as 35 in the jungle (a foliage-heavy area) with these settings, but no stutter.
 