
Shadow of the Tomb Raider benchmark

Yes, my build is the current full game version from Steam, but here's the thing: it's a clone of the setup on my NUC, where the missing NPCs show up properly.

I just DDU'd both GPU drivers and uninstalled the other ones, then installed the proper drivers and the current version of Radeon Adrenalin for this setup. Yeah, you're not supposed to do that kind of crap, but the trick of starting the Steam download and then swapping in the game folder doesn't work with Rocket League (it always re-downloads, dammit) and the occasional other game, and I didn't want to get stuck killing my bandwidth. No other issues I've seen, so the clone seems OK. Just this missing-NPC weirdness in SotTR.

I'll see what I can find on the kid's machine but I'm gonna have to throw a stick or iPhone in the backyard to distract him. Wish me luck!
 
Why try to solve a quandary when instead you can scrape the bottom of the barrel by sticking a slot-power-only GPU into an office PC?

The latest 7nm Shortage Desperation Build™:

[Screenshot: 20210419170757_1.jpg]


Dell Optiplex 9020, 280W stock PSU
Core i7-4790 (not K), tops out at about 64W at 100% CPU use
Asus Phoenix 1050 Ti (UV to 0.9V ~1775 MHz Cores OC, 7350 MHz Memory OC, uses ~61W)
Stock 8GB RAM 1600 MHz CL11

Does well enough, I guess; better than a stick to the eye. Another game: I wrangled with Horizon Zero Dawn for quite a while and got OK performance and visuals at 1080p 80% render scale, mostly Medium settings, Shadows High, AO off, camera-based AA. It still rummages in the 40s in busier scenes, high 50s most other times. Weirdly, AO doesn't seem to make a big visual difference in this game; it may just be where I was on the map.
 
Oh, so that 1050 Ti just needed more breathing room! Now it's on a par with my OC'd HD 7970. I'd still call that a win for the AMD card, though, it being nearly five years older. Wish I had been able to buy a better GPU before the shit hit the fan, but now it looks like this old salt is here to stay :)
 
All the GPU power I could buy in 2021.
[Screenshot: 20210421014953_1.jpg]
 
Two runs with an M1 MacBook Air. It seems to mostly stay above 30 FPS with the second group of settings, but it actually didn't do that badly on what's close to 1080p highest. Not sure I'd try to play this game on the MBA, especially without any active cooling. I'm assuming this runs under Rosetta 2 as well.
[Screenshots: IMG_1774.jpeg, IMG_1775.jpeg]
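If anyone wants to verify that Rosetta assumption, macOS exposes a sysctl for it. A minimal sketch (the `sysctl.proc_translated` key only exists on Apple Silicon macOS, and it reports on the process that queries it, so run it from a terminal in the same context):

```shell
# Prints 1 if the calling process is running under Rosetta 2 translation,
# 0 if it is native arm64; the key is absent on Intel Macs and other OSes,
# so fall back to a message instead of failing.
sysctl -n sysctl.proc_translated 2>/dev/null || echo "key absent (not Apple Silicon macOS)"
```

For a GUI app like the game itself, Activity Monitor's Kind column ("Apple" vs "Intel") shows the same information.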
 
Apple, taking gaming forward to 20FPS :D That raw power :roll:
 
What did you expect? I know it's not earth-shattering, but let me know of another passively cooled chip running on integrated graphics that could do any better.

EDIT: As a frame of reference, a few pages back there's a system with an i5-8259U and a 1050 Ti that scores close to the same.
 

On top of that, it's translating x86 to ARM at the same time (as you mentioned).
 
I never expected much, but the hype that came with the M1 was sickening, so when I see these results I am very amused :laugh:
 
Again, I don't know what you expected. It has no active cooling and is running on 8 GB of shared system RAM, emulating a DX12-ported game! It's also the 7-GPU-core unit vs. the 8-core one. What do we think this chip's thermal headroom is, 10 W?
 
Again, I do not debate that for what it is, it's a good chip, but going from that to all the "it's the best ever" and "it's going to make x86 obsolete" and all that jazz, seeing such a puny result is funny to me. Might not be for you, but we are not all amused by the same things :laugh: And the 1050 Ti is an entry-level card from 2016 built on 14 nm, so the comparison is not really something to be proud of for the M1. But for a chip with on-die graphics, by itself it's not that bad.

Putting aside all the comments on the topic of the M1, though, it's a really interesting addition to this pool of results, the first non-x86 results; that in itself is kind of cool :)
 
Ya gotta stop taking marketing hyperbole personally lol.
 
I normally would; it's just that I have a couple of Apple fanatics in my circle of friends, they drive me crazy at every opportunity, and I refuse to see the light :laugh:
That does not mean I can't appreciate what the M1, as a start, could mean for the future. Right now it's an excellent chip for a low-powered device, and that's it.
 
I get that not everyone likes Apple (I don't exactly love them either), but the M1 was a very strong showing, IMO. Intel helped Apple look better, with all the execution woes and Kaby Lake refreshes. Had Intel been executing all these years, the M1 would still be just as workable, but maybe not as competitive. I have no complaints about general performance: it renders web pages as fast as anything (Firefox), and my biggest demand is photo editing (20MP RAW) and occasional video editing. The point really was just to run the benchmark on this chip and see if it was even possible/playable. I really don't play video games any more, so I couldn't care less how well it actually did, but it reminded me of playing on the last-gen consoles (PS4/Xbox One). Despite the FPS dips, it actually seemed fairly consistent and didn't look too bad. Again, I wouldn't want to play it on a passively cooled setup anyway. For all I know, it started throttling during the bench.

Ya gotta stop taking marketing hyperbole personally lol.
Yes, all the companies hype up their stuff. Intel has been hard at it lately: some of the first words their new CEO said were about how they can't be getting beat by a lifestyle company, and then a month or two later they start bashing Apple and getting creative with their Blue Slides. All these companies know the excitement that builds around their launches, so they will use that to their advantage. There are certainly some biased "review" sites out there for each camp, so you definitely want to go to the right sources for honest assessments. Anandtech isn't what it used to be, but they still keep it pretty scientific, and I found their assessment of the M1 to be spot on.
 
I don't necessarily have an issue with Apple, more with the people who are fanboys of Apple, and fanboys in general :laugh:
Yes, Intel "helped" them, Intel is 14++++++ for life :D, but that does not mean Apple only made a good chip because Intel is stuck in time. Apple made a good chip regardless of Intel, and I am partially impressed by it. I am more curious, though, what, let's say, AMD can do with 5 nm compared to the M1, and what Intel can do with their 7 nm, even if that might take two more years or more.
Anyway, comparing different architectures is extremely tricky when it comes to performance, and adding a different software stack on top makes it even more complicated.

In general CPU tasks the M1 is quite strong; with the GPU involved, not so much, it seems, and a "gaming" workload seems to bring the chip to its knees. And I don't mean only the GPU: looking at that screenshot, the CPU scores are quite low too. I wonder if that is mainly because of the "we use so little power we don't need a cooler" mantra, leaving the chip energy-starved and temperature-limited, not a great combination in electronics.

If that was native code and that is the power of the chip, it does not bode well for the future of the M1 as a chip for the masses; if it's Rosetta-emulated, then the emulation is really bad. And I say this knowing all the benchmarks of the M1; it should not be anywhere near that weak in the CPU section... but again, different architecture and god knows what software running it.

Anyway, you said you don't plan to game on it, so this is a rhetorical discussion. Beyond that, if you are happy with your purchase, it's generally a good purchase :)
And regarding hype, I am too old to be swayed by fancy words, cute pictures, or paid promotional videos; if you don't want to be suckered into spending money on useless stuff, that's a trait one must acquire :laugh: And Intel's "lifestyle company" reference was just lame; you fight with your product, not by throwing a tantrum because the "lifestyle company" showed you how to better design a chip :kookoo:

And the thing I should have said from the start: thank you for sharing the result, it's interesting to see and think about.
 
What a creepy benchmark...

Anyway, the average FPS was 101 at 1920x1200 and 'GPU Bound' was 54%. I don't get that...

[Screenshot: tombbench.png]
 
54% GPU bound isn't that big of a bottleneck... seems to be pretty well balanced at 1080P.
But if you look at the screenshot, it's only 39% at 1080; it's 54% at 1200. Seems like a huge difference, given only a 3 FPS increase in the average when dropping to 1080.
 
The 1080 and 7700K seem to be pretty well balanced for this game, then, if so few FPS can swing the dependency that far.

However, you didn't test at the Highest setting, which is the standard for this SotTR thread. Go ahead and try that and you'll find yourself much more GPU-bound at 1080p.
__________

And because I can't stop spamming this thread, I upgraded the internals in my Shoddy Bomb Casing to contain slightly fewer, better Used Pinball Machine Parts:

[Screenshot: 20210520122237_1.jpg]


Ryzen 5 2600 @stock
Radeon 5600XT
4x4GB 2800 MHz 16-17-17-39 barely-stable crap DDR4
B450 ASRock ultra-crap MoBo, which can barely keep the above RAM stable
some other junk that keeps the above powered up.
 
However you didn't test at the Highest setting, which is the standard for this SotTR thread.
ahhhh crud... I didn't even notice that. Oops...
 
By going to an i9-9900, how many FPS could I gain, or would it not help me at all? It's a Clevo laptop with a GTX 1060 6GB mobile. Thanks in advance.
[Screenshot: 1624452434556.png]
 

Can you swap out the CPU in your laptop? If you can, then you might be able to hit 60 FPS, but I'd be concerned about the thermals.
Maybe your laptop cooler is only set up for cooling a dual-core or quad-core... I don't think it's meant for an 8-core.
Maybe you could grab an i5-9600; that should give you a nice boost too.
 
You may get a few more FPS, but I think you'll just end up more GPU-bound, or overheat your CPU because its HSF isn't capable of cooling a 9900. Just learn to drop some settings down to High rather than Ultra; you'll get more FPS, and I doubt you'll notice the difference in picture quality.
 
Yes, Clevo laptops with desktop sockets are upgradeable within their generation. With a G4600 or i3-9100 and a good thermal repaste, CPU temps won't go over 65-66°C with a 120 mV undervolt. From what I have seen online, others with an i9-9900 or i7-9700K get temps around 90°C, which is still lower than the 100°C throttle temp. I don't want to use a 9900K (it will overheat for sure, and only delidding + repasting + UV would let you get away with it), or the 35 W i9-9900T, which performs around an i7-9700/9700K because its base and boost clocks are not impressive for gaming, although having 8c/16t may benefit some workloads.
I am also counting on the FPS boost that AMD's "FSR" will bring to games, and I wonder how much improvement any change in the CPU department will add on top of the roughly 20-30% from each quality step the user chooses (Ultra Quality -> Quality -> Balanced -> Performance).
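For anyone curious what those quality steps mean in practice, here's a minimal sketch of the internal render resolutions each FSR 1.0 mode implies, using the per-axis scale factors AMD published (actual per-game behavior may differ, and the function name here is just illustrative):

```python
# Per-axis upscale factors for FSR 1.0 quality modes, as published by AMD.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w, out_h, mode):
    """Internal resolution the game renders at before FSR upscales to out_w x out_h."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_SCALE:
    w, h = fsr_render_resolution(1920, 1080, mode)
    print(f"{mode}: {w}x{h}")
```

Performance mode at 1080p output, for example, has the GPU rendering only 960x540 internally, which is where the big FPS gains come from.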
 