# Shadow of the Tomb Raider benchmark



## StefanM (Dec 11, 2018)

Download here: https://store.steampowered.com/app/750920/Shadow_of_the_Tomb_Raider/

Settings: DX12, 1080p, ultra high


----------



## Solaris17 (Dec 11, 2018)

All clocks are vanilla, I don't OC, and I have a few programs open, though it probably doesn't matter much.


----------



## Hockster (Dec 11, 2018)

I didn't know the game even had a benchmark.


----------



## HTC (Dec 11, 2018)

Different settings and resolutions: can't compare this way.

OP: figure out a standard that all participants can use, and separate the results by resolution under that standard.


----------



## MrGenius (Dec 11, 2018)

1080p highest settings possible w/ i7-3770K @ 5.2GHz + RX Vega 64 @ 1675/1185




1440p highest settings possible w/ i7-3770K @ 5.2GHz + RX Vega 64 @ 1675/1190


----------



## rtwjunkie (Dec 11, 2018)

Hockster said:


> I didn't know the game even had a benchmark.


Well it was in-game. I guess it got a separate release now.


----------



## EarthDog (Dec 11, 2018)

Please organize this thread with specific settings to follow so people can compare against the same settings. Respectfully, threads like this are pretty useless when it's just a pool of results...


----------



## droopyRO (Dec 11, 2018)

I can't install the bloody thing. Tried updating Steam and restarting the PC; doesn't work.


----------



## MrGenius (Dec 11, 2018)

droopyRO said:


> I can't install the bloody thing. Tried updating Steam and restarting the PC; doesn't work.
> View attachment 112339


I was having the same problem. But then I closed the main Steam window while leaving the smaller download window open, and suddenly the download window displayed correctly, allowing me to download it.


----------



## droopyRO (Dec 11, 2018)

It works, installing it now, thanks.


----------



## biffzinker (Dec 11, 2018)




----------



## droopyRO (Dec 12, 2018)

Here is mine: the 8600K is at 4.7 GHz, RAM at 3000 MHz, stock MSI 1070 Ti Gaming:


----------



## Vayra86 (Dec 12, 2018)

I would suggest a *one-size-fits-all* approach for settings: *Graphics on Highest preset, TAA, 1080p, Fullscreen* (non-exclusive)

Like this.
- All settings can be verified from the screenshot (adjusting graphics detail settings will result in it showing 'Custom' where you see Highest atm)
- No issues with multi screen setups / refresh rates (exclusive fullscreen can create problems for some setups)
- Decent load on the GPU both for high end and midrange.


----------



## biffzinker (Dec 12, 2018)

Vayra86 said:


> I would suggest a *one-size-fits-all* approach for settings: *Graphics on Highest preset, TAA, 1080p, Fullscreen* (non-exclusive)


OP already had: 





StefanM said:


> Settings: DX12, 1080p, ultra high


It's not that hard to match the same settings.


----------



## EarthDog (Dec 12, 2018)

The problem is nobody followed it. At least the first person didn't. 

We are not sure if this is simply a pool of results (bleh and pretty useless as a thread) or if the OP is going to keep track of these... he's been here but hasn't responded to anything.


----------



## Vayra86 (Dec 12, 2018)

biffzinker said:


> OP already had:  it's not that hard to match the same settings.



What is 'ultra high' then? And the OP also doesn't specify the AA setting, nor exclusive fullscreen. So it IS hard to match, as can be seen from every result that followed.


----------



## EarthDog (Dec 12, 2018)

Typically, if one suggests "Ultra High", it's a preset in the game. If a preset is suggested without additional information, then it stands to reason to just use the defaults of that preset.

Why would exclusive fullscreen vs. fullscreen need to be listed? Performance won't be any different between them, right? If users have trouble running one or the other, switching won't make a difference.


----------



## Vayra86 (Dec 12, 2018)

EarthDog said:


> Typically, if one suggests "Ultra High", it's a preset in the game. If a preset is suggested without additional information, then it stands to reason to just use the defaults of that preset.
> 
> Why would exclusive fullscreen vs. fullscreen need to be listed? Performance won't be any different between them, right? If users have trouble running one or the other, switching won't make a difference.



Performance can vary between fullscreen and exclusive fullscreen modes; not sure if it does for this bench. The problem with exclusive fullscreen is that it also sets a refresh rate in this bench, which can have performance or frame-capping effects. It certainly doesn't help with everyone doing the same run.

Ultra high is not a preset you can choose. There are High and Highest. Did you even look at this benchmark or?


----------



## EarthDog (Dec 12, 2018)

Vayra86 said:


> Did you even look at this benchmark or?


I did, and I run it for both my GPU reviews and motherboard reviews, actually. Sorry, I don't recall its exact verbiage offhand. It's set-it-and-forget-it for me, bub.

That said, you can see from the screenshot in the first post that it is set to HIGHEST (did you even look at his screenshot or...?)... but clarity would be helpful nonetheless.

EDIT: Exclusive FS is OFF by default, for the record. 

EDIT2: Thread is a mess... would love for the staff to come in and clean this up a bit or encourage the OP to do so...


----------



## Hardi (Dec 12, 2018)




----------



## tvamos (Dec 12, 2018)

Did anyone experience a 60 FPS lock? The first run was fine; I tried an OC, but the third scene was locked to 60 when it was 80+ in the first run.


----------



## MrGenius (Dec 12, 2018)

No FPS lock here.


Vayra86 said:


> I would suggest a *one-size-fits all* approach for settings: *Graphics on Highest preset, TAA, 1080p, Fullscreen* (non-exclusive)


To that I would add: *Graphics driver settings at default, List your CPU and GPU models and their clock speeds*

i7-3770K @ 5.2GHz + RX Vega 64 @ 1680/1190


----------



## EarthDog (Dec 12, 2018)

MrGenius said:


> To that I would add: *Graphics driver settings at default, List your CPU and GPU models and their clock speeds*


This would be up to the discretion of the OP maintaining it... at least, that is what you told me when I brought it up in your thread. 

Really though it depends on the goal of the thread. If the goal is to see who gets the most FPS, then tweaking of the driver should be allowed. If the point is to get like for like comparisons, then of course something like that should be implemented. 

I always thought threads like this are for 'racing' not for comparing like for like.


----------



## lsevald (Dec 12, 2018)

i7-4770K @ 4.4GHz + Palit 1080 Ti @ 2012/1447 (watercooled, 140% power limit, 1.093 V GPU)


----------



## MrGenius (Dec 12, 2018)

EarthDog said:


> This would be up to the discretion of the OP maintaining it... at least, that is what you told me when I brought it up in your thread.
> 
> Really though it depends on the goal of the thread. If the goal is to see who gets the most FPS, then tweaking of the driver should be allowed. If the point is to get like for like comparisons, then of course something like that should be implemented.
> 
> I always thought threads like this are for 'racing' not for comparing like for like.


I'm not really in favor of doing it one way or the other in this case. I can see the value in both methods. I was just suggesting the way I might do it. Well...that kind of implies that's the way I did it too. Which is true. I suppose I should try it the other way and see what happens.

Anywho...OP maintenance. Probably not going to be a thing with this one.


----------



## EarthDog (Dec 12, 2018)

Nope. just going to be a pool of results... lulz.


----------



## CrAsHnBuRnXp (Dec 12, 2018)

Here is mine


----------



## Tomgang (Dec 12, 2018)

It would be much better if everyone ran at the same settings.

Old fart incoming. That will be an i7-980X @ 4.43 GHz and a GTX 1080 Ti at GPU/MEM 2000/1514.

Highest preset









Manually set settings, highest possible.


----------



## biffzinker (Dec 12, 2018)

MrGenius said:


> Anywho...OP maintenance. Probably not going to be a thing with this one.


We need @T4C Fantasy.


----------



## CrAsHnBuRnXp (Dec 12, 2018)

I'd recreate the thread if I knew I could be bothered with updating a list.


----------



## Solaris17 (Dec 13, 2018)

A lot of mad people here.


----------



## rtwjunkie (Dec 13, 2018)

Solaris17 said:


> A lot of mad people here.


Because it’s a mess. Threads like this with benchmarks are useless unless everybody does exactly the same thing. It also helps if someone is compiling the results.


----------



## Solaris17 (Dec 13, 2018)

rtwjunkie said:


> Because it’s a mess. Threads like this with benchmarks are useless unless everybody does exactly the same thing. It also helps if someone is compiling the results.



What if the OP wanted you to just post screen shots like he asked? Maybe he is doing some internal comparison.

Maybe he never intended to make a leader board.


----------



## rtwjunkie (Dec 13, 2018)

Solaris17 said:


> What if the OP wanted you to just post screen shots like he asked? Maybe he is doing some internal comparison.
> 
> Maybe he never intended to make a leader board.


That’s certainly a possibility. Initially I thought the OP was just informing us of a stand-alone benchmark.

It’s still a mess, caused mostly by everyone piling on without consistency.


----------



## MrGenius (Dec 13, 2018)

I ain't mad. I couldn't care less honestly. I don't even consider game-based benchmarks to be real benchmarks. Do you see any on HWBOT? Apparently someone agrees with me. 

EDIT: Instant refresh on what my foot tastes like. I forgot about Aquamark.


----------



## jboydgolfer (Dec 13, 2018)

Am I missing something, maybe a different download option, or is this a 15 GB download to run a benchmark?


----------



## CrAsHnBuRnXp (Dec 13, 2018)

jboydgolfer said:


> Am I missing something, maybe a different download option, or is this a 15 GB download to run a benchmark?


It is because you're installing a trial version of the game.


----------



## delshay (Dec 18, 2018)

I installed the trial version & ran the benchmark, but the icon to run the benchmark a second time seems to have disappeared. Is anyone else having this problem after shutting down & restarting the trial version?


----------



## n1mmel (Dec 18, 2018)




----------



## EarthDog (Dec 18, 2018)

Solaris17 said:


> What if the OP wanted you to just post screen shots like he asked? Maybe he is doing some internal comparison.
> 
> Maybe he never intended to make a leader board.


Who knows. Perhaps, being staff and wanting clean forums and threads that make sense, reach out and ask the guy? Or maybe suggest it? What kind of internal comparison can be done with no mention of specific settings to follow? The first post is just a vomit of a screenshot and his settings. A garbage thread is a garbage thread... it would be wonderful if staff did something to figure it out and make it worthwhile.


----------



## Solaris17 (Dec 18, 2018)

EarthDog said:


> Who knows. Perhaps, being staff and wanting clean forums and threads that make sense, reach out and ask the guy? Or maybe suggest it? What kind of internal comparison can be done with no mention of specific settings to follow? The first post is just a vomit of a screenshot and his settings. A garbage thread is a garbage thread... it would be wonderful if staff did something to figure it out and make it worthwhile.



I don't have a problem with it, though. Report it or ask the OP.


----------



## EarthDog (Dec 18, 2018)

Some like a messy home, I suppose. Oh well.

Unsub'd.


----------



## Brunsas (Apr 30, 2019)

This is my 8086K @ 5.2ghz with an RTX 2080TI


----------



## Lindatje (Apr 30, 2019)

AMD Sapphire RX Vega 56 Nitro+ OC
AMD Ryzen 1600X at stock.
3333 MHz Mem 14-14-14-26


----------



## championdude1 (May 22, 2019)




----------



## rtwjunkie (May 22, 2019)

championdude1 said:


> what do these stats mean???


What do you mean? I don’t see anything marked.


----------



## TewChainsaw (Jun 23, 2019)

GTX 970 - DX12, MAX SETTINGS with TAA > 57 AVG FPS. The CPU is the limiting factor.


----------



## blued (Jun 29, 2019)

Why are people turning down anisotropic filtering? It has a minimal performance impact! Do not confuse it with AA (big perf impact).
Anyway, my 4K results. Note the settings (not fully maxed out, but the visuals are still breathtaking).


----------



## Edwired (Jun 29, 2019)

Can anyone explain 'GPU bound' to me? I just downloaded this game, left everything on default, and ran the benchmark. Once it was done I got 58% or 59% GPU bound. Does that mean the CPU or the GPU is falling behind on rendering?


----------



## biffzinker (Jun 29, 2019)

Edwired said:


> Can anyone explain 'GPU bound' to me? I just downloaded this game, left everything on default, and ran the benchmark. Once it was done I got 58% or 59% GPU bound. Does that mean the CPU or the GPU is falling behind on rendering?


Your CPU is holding up the GPU is my understanding.
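For what it's worth, my understanding of the benchmark's "X% GPU bound" figure is that it's roughly the share of frames where the GPU, not the CPU, was the slower of the two. A sketch of that idea (illustrative only, not the game's actual code):

```python
# Rough sketch of how a "percent GPU bound" metric can be derived
# from per-frame CPU and GPU render times (milliseconds).
# NOTE: this illustrates the concept; the benchmark's real formula may differ.

def percent_gpu_bound(cpu_ms, gpu_ms):
    """Percentage of frames where the GPU took longer than the CPU,
    i.e. the GPU was the bottleneck for that frame."""
    frames = list(zip(cpu_ms, gpu_ms))
    gpu_limited = sum(1 for c, g in frames if g > c)
    return 100.0 * gpu_limited / len(frames)

# Example: a middling run like the ~58% figure discussed above
cpu = [20, 22, 18, 25, 21, 19]
gpu = [24, 20, 19, 22, 26, 17]
print(percent_gpu_bound(cpu, gpu))  # GPU slower on 3 of 6 frames -> 50.0
```

So a 58-59% reading would mean the GPU was the limiter on a bit over half the frames, with the CPU holding it up for the rest.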


----------



## Vya Domus (Jun 29, 2019)

Edwired said:


> Can anyone explain 'GPU bound' to me? I just downloaded this game, left everything on default, and ran the benchmark. Once it was done I got 58% or 59% GPU bound. Does that mean the CPU or the GPU is falling behind on rendering?



V-sync could be on.


----------



## Edwired (Jun 29, 2019)

biffzinker said:


> Your CPU is holding up the GPU is my understanding.


Ah, thank you. It's just a bit confusing with the benchmark graph, as it's all over the place: at one stage the CPU is on top of the GPU at the start, then at the end the GPU was on top of the CPU. I left everything on default and got about 31 FPS average on a Xeon E5450 and an ASUS GTX 1050 Ti OC Expedition. I know the CPU is old and holding things back, but it's decent enough to run this game; I'll change a lot of the settings to see if it gets better or worse than the first benchmark. The CPU is overclocked to 4 GHz at the moment and the GPU clocks are at default.

I'm beginning to understand it more now; I can get the frame graphs close to each other. It's just that the milliseconds graph shows two massive spikes from the CPU render while the GPU stays relaxed across the graph. Something is off in my overclock. I wish more games and benchmarks had graphs like this game has, which give a clear view of the performance.


----------



## EarthDog (Jun 30, 2019)

biffzinker said:


> Your CPU is holding up the GPU is my understanding.


This is correct. Needs moar IPC and clocks.


----------



## Edwired (Jun 30, 2019)

Well, I did bump the Xeon E5450 to 4.2 GHz with the RAM frequency at 1129 MHz; the FPS went from 38 to 46 with everything low/off at 1920x1080, while watching the CPU and GPU hit 100%. Seems the game is more CPU bound, is my guess.

This is the best I can squeeze out of the Xeon E5450: overclocked to 4.2 GHz, memory at 1129 MHz, GPU core and memory overclocked by +100 via MSI Afterburner, at 1600x900 with everything low and off.


----------



## Hotobu (Jul 2, 2019)

Readouts like these make me want to buy this game just for the benchmark.


----------



## Kissamies (Jul 2, 2019)

I need to rerun this, before and after Crossfire.


----------



## Kissamies (Jul 5, 2019)

Everything maxed out with 290 Crossfire:


----------



## EarthDog (Jul 5, 2019)

79 FPS and 500W...


----------



## Kissamies (Jul 5, 2019)

EarthDog said:


> 79 FPS and 500W...


More than 500W


----------



## Melvis (Jul 5, 2019)

Does this game or the benchmark support SLi?


----------



## biffzinker (Jul 5, 2019)

Melvis said:


> Does this game or the benchmark support SLi?


Yes, more at this link:

https://www.reddit.com/r/nvidia/comments/9fakk4


----------



## Melvis (Jul 5, 2019)

biffzinker said:


> Yes, more at this link:
> 
> https://www.reddit.com/r/nvidia/comments/9fakk4



Doesn't seem to work out of the box for me, sadly. I changed it to Force Alternate Frame Rendering 1 and it works, but not great at all. Bit surprised how bad it is in this game, honestly; the other Tomb Raider games perform great!


----------



## Edwired (Jul 5, 2019)

If you are talking about the "device hung" pop-up, that's linked to Event 4101, "NVIDIA driver crashed and recovered". I had the same problem on a single GPU in Resident Evil 2 Remake. You have to add a registry tweak to fix it: the three values TdrDelay, TdrDdiDelay, and TdrLevel need to be added. Have a look on the internet for how to set it up on your computer, as those three values are not there on a fresh install of Windows. Every time you use DDU to clean up the graphics driver, you have to add the three registry values again.
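For reference, those three TDR (Timeout Detection and Recovery) values live under the GraphicsDrivers key; a tweak along these lines, saved as a .reg file and merged, is the usual approach (the timeout numbers here are illustrative examples, not official recommendations):

```
Windows Registry Editor Version 5.00

; TDR tweak discussed above. TdrDelay / TdrDdiDelay are timeouts in seconds;
; TdrLevel 3 is the default "recover on timeout" behaviour.
; Example values only - pick timeouts that suit your system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000000a
"TdrDdiDelay"=dword:0000000a
"TdrLevel"=dword:00000003
```

As noted above, these values don't exist on a fresh Windows install, so merging the file creates them; a reboot is needed for them to take effect.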


----------



## Melvis (Jul 6, 2019)

Force Alternate Frame Rendering 1, exclusive fullscreen




Force Alternate Frame Rendering 2, exclusive fullscreen




Single GPU, exclusive fullscreen




Force Alternate Frame Rendering 2, same as OP settings exactly




So after all those runs, SLI does make a difference, but not by a lot. Going into the NVIDIA Control Panel, forcing Alternate Frame Rendering 2, and having the game set to NON-fullscreen seems to give the best results for me anyway. Not as optimised as I thought this game was, to my surprise.

Oh, it's also reporting the wrong driver, as I'm running the latest.


----------



## advanced3 (Jul 10, 2019)




----------



## Edwired (Jul 10, 2019)

NVIDIA's latest graphics driver has a few fixes for ray tracing in this game for RTX graphics card users.


----------



## masterdeejay (Jul 15, 2019)

1080p Highest preset Dx12
R9 380 4Gb Crossfire, Dual X5675 stock clocks


----------



## EarthDog (Jul 15, 2019)

I wonder what those gpus would do when not hamstrung by those xeons...


----------



## masterdeejay (Jul 15, 2019)

EarthDog said:


> I wonder what those gpus would do when not hamstrung by those xeons...


I will try my friend's 1070 Ti SLI with those Xeons. I used a GTX 970 with an older dual Xeon E5450 before, and that was fine for Battlefield 4. I hope there will be more games that support more cores; DX12 helps a little.


----------



## EarthDog (Jul 15, 2019)

We know it's 'fine', but you are leaving performance on the table with those CPUs at stock, as old as they are. Crossfire and SLI at 1080p, even with the older cards, need all the CPU horsepower they can get.

Even if games supported XX amount of cores and threads, the slow speed and low IPC will still place a glass ceiling on things.

You are 7% GPU bound... that clearly tells us you are really, really CPU limited here.


----------



## Edwired (Jul 15, 2019)

That's interesting to know.


----------



## Mac2580 (Jul 15, 2019)

CPU: 4.2Ghz locked, no boost, no EIST, C-States off.
GFX: Stock besides prefer maximum performance in Nvidia Control Panel.


----------



## Tomgang (Jul 15, 2019)

masterdeejay said:


> I will try my friend's 1070 Ti SLI with those Xeons. I used a GTX 970 with an older dual Xeon E5450 before, and that was fine for Battlefield 4. I hope there will be more games that support more cores; DX12 helps a little.



He is right about those old Xeons. Stock, they are weak. A shame that your motherboard doesn't support overclocking, as the old X58 i7s and Xeons can reach some pretty nice overclocks when cooled properly: Xeons can reach 4.2 and up to 4.6 GHz, depending on silicon and cooling.

If you want to see what X58 can really do when overclocked, look at the link below to my system. It has an i7-980X CPU, but that doesn't matter, as Xeon and i7 CPUs for X58 are the same; Xeons just have support for a few more features, like ECC memory.









(link: "Shadow of the Tomb Raider benchmark" thread on www.techpowerup.com)


----------



## EarthDog (Jul 15, 2019)

You're held back a ton too... 10% GPU bound.


----------



## Tomgang (Jul 15, 2019)

EarthDog said:


> You're held back a ton too... 10% GPU bound.



Yeah, I know, and as I have told you before, that's why I am planning to get a Ryzen 3000 setup sometime after the Ryzen 9 3950X comes out.


----------



## Mac2580 (Jul 15, 2019)

Either way


Tomgang said:


> Yeah, I know, and as I have told you before, that's why I am planning to get a Ryzen 3000 setup sometime after the Ryzen 9 3950X comes out.


Your benchmark impressed me the most, actually. I know it was an "enthusiast" CPU, but I didn't expect it to still hold 60+ FPS 10 years later.


----------



## Tomgang (Jul 15, 2019)

Mac2580 said:


> Either way
> 
> Your benchmark impressed me the most, actually. I know it was an "enthusiast" CPU, but I didn't expect it to still hold 60+ FPS 10 years later.



You'd be surprised how far X58 can go in games when it's overclocked and optimized well. But 2019 has taken its toll on X58 and games: Metro Exodus and Far Cry New Dawn are too much for X58 at higher settings; otherwise it lag-spikes too much. But with games from 2018 and older, X58 has handled maxed out every time and still holds those 60+ FPS. Honestly, I am also surprised at how long X58 has stayed relevant.

If you want, I have a few benchmarks of my system here, and also one from Far Cry 5 at ultra settings in different resolutions.



http://imgur.com/a/uHjbbMg


----------



## phanbuey (Jul 16, 2019)

7820X w/ 2080 Ti: a bit CPU bound at 1080p. All my apps are open, so it's not a clean run.


----------



## Edwired (Jul 16, 2019)

Funny thing about the GPU bound figure: even with my Xeon E5450 overclocked to 4.275 GHz, at 1920x1080 with everything low/off it sits around 39%, and as soon as I drop the resolution to 1600x900 it falls to 8-10%, depending on the run. The way I see it, I'm held back by the CPU, RAM, and motherboard.

With V-sync at half refresh it still wants to run 60 FPS, as I just tested; the graph is all over the place with wild fluctuations. Using MSI Afterburner with RTSS to limit the frame rate to 60 is a little better, with minor stuttering and drop-outs. In my case I limited it to 30 FPS to help reduce the bottleneck on the CPU; the CPU graph is then nearly a flat line, and at 1600x900 it just told me 0% GPU bound. Which tells me the game is not really optimized well.

Try limiting the frames to 60 on newer computers to see if it reduces the GPU bound figure, and report back; I'd be interested in how it affects a newer computer.


----------



## EarthDog (Jul 16, 2019)

The motherboard doesn't have much to do with anything; it's the system itself. The CPU and the memory speed (because that is what the CPU supports) are the issue, yes. We've been saying that all along.


----------



## Edwired (Jul 16, 2019)

Look at above message @EarthDog


----------



## EarthDog (Jul 16, 2019)

What is the endgame?


----------



## Edwired (Jul 16, 2019)

How do you mean?


----------



## EarthDog (Jul 16, 2019)

Sorry... I don't even have the game installed. Your curiosity will have to be satisfied by others. Sorry.


----------



## phanbuey (Jul 16, 2019)

Your memory subsystem is what is holding the system back... whatever the CPU minimum is, that is what the game will flatline at with a framerate limit. In my case 107-109 FPS; in yours, ~40 FPS.
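Put another way (toy model, made-up numbers): with a frame cap in place, delivered FPS just flatlines at whichever is lowest of the cap, what the CPU can feed, and what the GPU can draw:

```python
# Toy model of the flatline effect described above.
# delivered FPS = min(frame cap, CPU-limited FPS, GPU-limited FPS)
# Numbers are illustrative, not measurements.

def delivered_fps(cap, cpu_fps, gpu_fps):
    return min(cap, cpu_fps, gpu_fps)

print(delivered_fps(cap=144, cpu_fps=109, gpu_fps=200))  # CPU-limited: flatlines at 109
print(delivered_fps(cap=60, cpu_fps=40, gpu_fps=200))    # weak CPU: cap never reached, 40
```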


----------



## Edwired (Jul 16, 2019)

EarthDog said:


> Sorry... I don't even have the game installed. Your curiosity will have to be satisfied by others. Sorry.


Cool no problem



phanbuey said:


> Your memory subsystem is what is holding the system back... whatever the CPU minimum is, that is what the game will flatline at with a framerate limit. In my case 107-109 FPS; in yours, ~40 FPS.
> 
> View attachment 126982


As for the memory subsystem, do you mean speed or timings?


----------



## Vayra86 (Jul 16, 2019)

Chloe Price said:


> Everything maxed out with 290 Crossfire:



You should try one card. You might see better numbers. 1% GPU bound... ouch



phanbuey said:


> It's usually latency first and bandwidth second, but it depends on the game/setup. Generally with all of the newer "open world" games that stream textures/objects live, if the stream isn't fast enough the whole game slows down and will start stuttering even if you have CPU power to spare.



He kinda needs DDR4 at this point, and higher clocks on the CPU. That RAM is already pretty neat if it's the one in his specs, even though it's only 8GB.

But still, 32% GPU bound means there is something to gain, though it can be pretty limited after all. You only need to lack a bit of CPU grunt to avoid being GPU bound; add that little bit and you may already hit your card's limits. It is just a lowly 1050 Ti, after all.


----------



## phanbuey (Jul 16, 2019)

Edwired said:


> Cool no problem
> 
> 
> As for memory subsystem you mean speed or timings?



It's usually latency first and bandwidth second, but it depends on the game/setup. Generally with all of the newer "open world" games that stream textures/objects live, if the stream isn't fast enough the whole game slows down and will start stuttering even if you have CPU power to spare.



Vayra86 said:


> He kinda needs DDR4 at this point, and higher clocks on the CPU. That RAM is already pretty neat if its the one in specs - even though its 8GB only.



Yeah, 100% agree. A move to a new budget platform could possibly double his performance: a 3600 with a 16 GB DDR4 kit and a $60 B350 will double his CPU FPS in most new games.


----------



## Edwired (Jul 16, 2019)

phanbuey said:


> It's usually latency first and bandwidth second. Generally with all of the newer "open world" games that stream textures/objects live, if the stream isn't fast enough the whole game slows down and will start stuttering even if you have CPU power to spare.


Ah, I see. It's 5-5-4-15-5-85, the rest on auto, at 1140 MHz; it's DDR2 on my board. I still have to tweak it more, as I'm still learning the black voodoo magic of timings.

With the FPS locked to 30, barely 2 or 3 stutters happened, but if I let it go bananas the game gets heavy stuttering all the way through the bench test; even with V-sync on, or at half refresh rate, it stutters non-stop.


----------



## Vayra86 (Jul 16, 2019)

Edwired said:


> Ah, I see. It's 5-5-4-15-5-85, the rest on auto, at 1140 MHz; it's DDR2 on my board. I still have to tweak it more, as I'm still learning the black voodoo magic of timings.



Oh shit, it's DDR2. Never mind, I never cared to look further than the latency.

Looks like you need a platform upgrade soon!

Yep, stutter is a clear sign of RAM problems. (Oh, we both love to ninja edit, don't we.)

Check this out, for giggles. TW Warhammer 2: 3570K @ 4.2 GHz & DDR3-1600 CL9, versus 8700K @ 4.7 GHz & DDR4-3200 CL16. Same GPU.

Note the major drops and minimums...


----------



## Edwired (Jul 16, 2019)

Vayra86 said:


> Oh shit, it's DDR2. Never mind, I never cared to look further than the latency.
> 
> Looks like you need a platform upgrade soon!


I know it's old; it deserves a beating, to be honest, but it's what I have at the moment (I edited the last post as well). I know I need an upgrade badly, but I've been in and out of hospital for tests in the last few months and money isn't at the top of my list.


----------



## Vayra86 (Jul 16, 2019)

Edwired said:


> I know it's old; it deserves a beating, to be honest, but it's what I have at the moment (I edited the last post as well). I know I need an upgrade badly, but I've been in and out of hospital for tests in the last few months and money isn't at the top of my list.



Well, it's nice to have a major upgrade to look forward to, isn't it? Take your time; now's not the optimal buying time (just yet) anyway. Well, you could grab a DDR4 RAM kit now, those are cheap.


----------



## Splinterdog (Jul 16, 2019)

First one with FreeSync off




And FreeSync on - 2 fps gained


----------



## Mac2580 (Jul 17, 2019)

Tomgang said:


> You'd be surprised how far X58 can go in games when it's overclocked and optimized well. But 2019 has taken its toll on X58 and games: Metro Exodus and Far Cry New Dawn are too much for X58 at higher settings; otherwise it lag-spikes too much. But with games from 2018 and older, X58 has handled maxed out every time and still holds those 60+ FPS. Honestly, I am also surprised at how long X58 has stayed relevant.
> 
> If you want, I have a few benchmarks of my system here, and also one from Far Cry 5 at ultra settings in different resolutions.
> 
> ...


Yeah, thanks, the benchmarks are very interesting. At the time I purchased my 7700K I didn't even consider a 6800K, due to the cost. In your opinion, did the extra 2 cores/4 threads increase FPS in modern multithreaded games, or would an overclocked i7 950 match yours for strictly gaming?


----------



## Tomgang (Jul 17, 2019)

Mac2580 said:


> Yeah, thanks, the benchmarks are very interesting. At the time I purchased my 7700K I didn't even consider a 6800K, due to the cost. In your opinion, did the extra 2 cores/4 threads increase FPS in modern multithreaded games, or would an overclocked i7 950 match yours for strictly gaming?



Yes, I could feel the difference. I had an i7 920 OC'd to 4 GHz before and replaced it back in January 2017 with my current i7-980X. There was a difference that could be felt, but it is not only the extra cores that help; it's a combination of more cores/threads, more L3 cache, and higher core clocks. As some games now take advantage of more than 4 cores/8 threads, more cores are definitely a good thing. If you are getting a new CPU today, get at least one with 6 cores and 12 threads, or you are going to regret it in the coming years of gaming. Some games these days also take advantage of Intel's Hyper-Threading and AMD's SMT.

Here are some benchmarks of my system with an i7 920 OC'd to between 4 GHz and 4.4 GHz. Note: OCs over 4 GHz were benchmark-only, as the CPU became too hot for everyday use.



http://imgur.com/a/WqD1iHK


See this video, where the i7-7700K and i7-8700K are tested with and without Hyper-Threading. Some games see a performance drop even though there are still 6 cores active, just no Hyper-Threading. For modern games it is no longer advisable to get a quad-core CPU, and definitely not one without Hyper-Threading/SMT.


----------



## Mac2580 (Jul 18, 2019)

Thanks. Yeah, I was going to get a 2080 Ti, but after checking out a lot of reviews it can't max all games at 4K 60 FPS. Thinking about getting a 2070 Super instead and replacing my 7700K while it's still worth a lot on the used market. I'm wondering about core/thread count, as I'm seriously considering getting the i9-9900K over the 3900X. The PC is only used for gaming and YouTube, so the value advantage of the Zen chip doesn't do anything in my use case. I want to conclude my purchases sooner rather than later, as I'm planning to buy a property soon, which will reduce my expendable monthly income to nil.


----------



## masterdeejay (Jul 18, 2019)

OK, so I now have two 1070 cards for the dual retro Xeon. I don't have an SLI cable yet, so I can't run it in SLI mode. But I think this Tomb Raider game is not optimized on the CPU side. I tried a much more CPU-optimized game for the dual cards first. I don't know if these results are good or bad. I use the PC for working, not for gaming, and I only play older games (Skyrim, 7DTD, MC, Vintage Story, SimCity 4).

Ashes of the Singularity: Escalation 
1080P Crazy preset :

== Hardware Configuration ================================================
GPU 0:        NVIDIA GeForce GTX 1070
GPU 1:        NVIDIA GeForce GTX 1070
CPU:        GenuineIntel
        Intel(R) Xeon(R) CPU           X5675  @ 3.07GHz
Physical Cores:            12
Logical Cores:            24
Physical Memory:         98295 MB
Allocatable Memory:        134217727 MB
==========================================================================


== Configuration =========================================================
API:                        DirectX 12
script:                        BenchFinal
==========================================================================
Quality Preset:                    Crazy
==========================================================================

Resolution:                1920x1080
Fullscreen:                True
Bloom Quality:                High
PointLight Quality:            High
Glare Quality:                High
Shading Samples:             16 million
Terrain Shading Samples:         12 million
Shadow Quality:                High
Temporal AA Duration:            0
Temporal AA Time Slice:            0
Multisample Anti-Aliasing:        4x
Texture Rank :                0


== Total Avg Results ================================================= 
Total Time:                     60.003120 ms per frame
Avg Framerate:                     71.511780 FPS (13.983710 ms)
Weighted Framerate:                 70.766586 FPS (14.130963 ms)
CPU frame rate (estimated if not GPU bound):     72.343674 FPS (13.822908 ms)
Percent GPU Bound:                 80.063126 %
Driver throughput (Batches per ms):         4392.802734 Batches
Average Batches per frame:             13197.397461 Batches
Average Particles simulated per frame          269667 
==========================================================


----------



## dOBER (Jul 20, 2019)




----------



## blobster21 (Jul 20, 2019)




----------



## Kissamies (Jul 21, 2019)

Melvis said:


> Does this game or the benchmark support SLi?


SLI? I have AMD cards 



Vayra86 said:


> You should try one card. You might see better numbers. 1% GPU bound... ouch


Oh hell no, max fps was like 50... Crossfire scales fine here.


I had a 980 @ 1500/2000 about a year ago and that couldn't get 60 fps.


----------



## Melvis (Jul 21, 2019)

Chloe Price said:


> SLI? I have AMD cards



Haha, indeed you do! My last Crossfire setup was 280Xs; for the most part it worked great, but a string of 11 updates broke it in Trine 2 (and maybe some others) and it never got fixed, so I gave up and moved back to SLI. How are you finding Crossfire in today's games? I have had a lot of Crossfire cards over the years, starting back with 2x 4870X2s.


----------



## Kissamies (Jul 21, 2019)

Melvis said:


> Haha, indeed you do! My last Crossfire setup was 280Xs; for the most part it worked great, but a string of 11 updates broke it in Trine 2 (and maybe some others) and it never got fixed, so I gave up and moved back to SLI. How are you finding Crossfire in today's games? I have had a lot of Crossfire cards over the years, starting back with 2x 4870X2s.


At least BF1 got the benefit of two cards and IT KICKS ASS!

I have a perverted idea of getting a 295X2 with these. 

Fucking Quadfire.


----------



## Melvis (Jul 21, 2019)

Chloe Price said:


> At least BF1 got the benefit of two cards and IT KICKS ASS!
> 
> I have a perverted idea of getting a 295X2 with these.
> 
> Fucking Quadfire.



Nice! I can only imagine, as I don't have the game haha, but I know from the past that when it works, it works very well.

O M G that is perverted but I like it! lol Do tell if you end up doing it, I want pics!

My Crossfire 4870X2 days, omg such power, such performance, such NOISE! lol Still one of my all-time favorite setups.


----------



## Athlonite (Jul 21, 2019)

Well, here we go; it actually ran better than I thought it would







and with a couple of things changed


----------



## Kissamies (Jul 21, 2019)

Melvis said:


> Nice! I can only imagine! as I dont have the game haha, but I know from the past when it works it works very well.
> 
> O M G that is perverted but I like it! lol Do tell if you end up doing it, i want pics!


There will definitely be pics if I end up with that, Tri-Fire at least 



> My Crossfire 4870X2 days, omg such power, such performance, such NOISE! lol still one of my all time favorite set ups.


Heh, exactly the same when I had an HD 3870 X2 10 years ago. I'd have had that in Tri-Fire with my old HD 3850, but I had an nForce motherboard, so no Crossfire


----------



## Melvis (Jul 23, 2019)

Chloe Price said:


> There will definitely be pics if I end up with that, Tri-Fire at least
> 
> 
> Heh, exactly the same when I had a HD 3870 X2 10 years ago.. I'd had that in Tri-Fire with my old HD 3850 but I had a NForce motherboard so no Crossfire



Nice! I will be looking forward to it if you do 

Oh wow, OK, even older than my setup, nice! And yeah that's a bummer, the good old days haha


----------



## augustben (Aug 4, 2019)

Finally saved up enough money to get myself a new PC. The game runs smoothly and amazingly! I had to disable DirectX 12 though since the game won’t launch with it enabled.

Edit: I managed to get SOTR to work with DirectX 12 again, and I got better frame rates than before!


----------



## Pumper (Aug 9, 2019)

So what's up with that GPU Bound stat? I'm running a 4770K at 4.2 GHz with 32 GB of 1600 DDR3 OC'd to 1866. Upgraded my 970 to a 1070 Ti today, but the GPU Bound stat makes no sense: it was 5% with the 970 (46 avg. FPS on Ultra 1200p) and 23% with the 1070 Ti (67 avg. FPS).

So if a lower % means a bigger CPU bottleneck, shouldn't it be the reverse?


----------



## lsevald (Aug 9, 2019)

Pumper said:


> So what's up with that GPU Bound stat? I'm running a 4770K at 4.2 GHz with 32 GB of 1600 DDR3 OC'd to 1866. Upgraded my 970 to a 1070 Ti today, but the GPU Bound stat makes no sense: it was 5% with the 970 (46 avg. FPS on Ultra 1200p) and 23% with the 1070 Ti (67 avg. FPS).
> 
> So if a lower % means a bigger CPU bottleneck, shouldn't it be the reverse?



I went from a 4770k@4.4GHz + 1080ti=116fps (43% gpu bound) to a 9900k@5.1GHz + 1080ti =143fps (87% gpu bound):







So I guess it means that, with a faster CPU, my GPU is more of a limiting factor than before. Maybe think of it as a scale: 50% would mean the game engine waits equally long for the CPU and the GPU to finish work. So in your case, yes, a faster GPU should mean less GPU bound. Are you sure you ran exactly the same settings between the two? That number changes with quality settings and resolution; if I run it at 1280x720 and low settings I get like 1-2% GPU bound. Confusing stuff, don't listen to me
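
One way to build intuition for that scale (a minimal sketch, and purely a guess at the idea behind the metric; the benchmark's actual formula isn't shown in this thread, and `gpu_bound_percent` plus the sample frame times are made up):

```python
def gpu_bound_percent(cpu_ms, gpu_ms):
    """Share of frames where the GPU took longer than the CPU.

    cpu_ms / gpu_ms: per-frame render times in milliseconds.
    A guess at how a 'GPU bound' % could be derived, for intuition only.
    """
    frames = list(zip(cpu_ms, gpu_ms))
    gpu_limited = sum(1 for c, g in frames if g > c)
    return 100.0 * gpu_limited / len(frames)

# Same GPU with a faster CPU: the GPU becomes the limiter on more frames,
# so the 'GPU bound' number goes up, matching the 4770K -> 9900K jump above.
print(gpu_bound_percent([5, 6, 5, 7], [9, 8, 10, 6]))  # 75.0
```

On this reading, a very low percentage at 720p/low simply means the CPU finishes later than the GPU on almost every frame.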


----------



## Pumper (Aug 12, 2019)

Looks like DX11 is the issue here (I'm still on Win7), so the CPU usage gets murdered in populated areas (drops to 40-50% usage) and takes the GPU along for the ride.


----------



## EarthDog (Aug 12, 2019)

Pumper said:


> Looks like DX11 is the issue here (I'm still on Win7), so the CPU usage gets murdered in populated areas (drops to 40-50% usage) and takes the GPU along for the ride.


Doesn't DX12 use more CPU? Also, look above you.. DX12 shows the same drop in the same area, no?


----------



## Pumper (Aug 12, 2019)

EarthDog said:


> Doesn't DX12 use more CPU? Also, look above you.. DX12 shows the same drop in the same area, no?



It does, but in a good way as it does not bottleneck the GPU, unlike in DX11.

Here's a great DX11vsDX12 video with CPU/GPU usage shown: https://youtu.be/G9YNqbmIYYI


----------



## EarthDog (Aug 12, 2019)

Pumper said:


> It does, but in a good way as it does not bottleneck the GPU, unlike in DX11.


I don't understand why it is any different in bottlenecking the GPU...when it uses more of the CPU in the first place. What speed is your processor at? Stock? Maybe create a system spec list so we can see what hardware you have going....


----------



## anachron (Aug 12, 2019)

Here are my results with and without DX12 at 1080p highest settings. For reference, I have G-Sync and V-Sync on in the Nvidia control panel, as I forgot to turn them off. I did hit 144 fps, but only for a very short time.
1080p dx12 :


1080p dx11 :


It seems that dx12 does indeed use a lot more cpu (43% cpu bound in dx12 vs 27% in dx11), but performance is still a lot better than in dx11. Whoops, bad reading skills.

My results at 1440p / Rtx medium / no DLSS :


----------



## EarthDog (Aug 12, 2019)

anachron said:


> It seems that dx12 does indeed use a lot more cpu (43% cpu bound in dx12 vs 27% in dx11), but the performances are still a lot better than in dx11.


That is *G*PU bound value, not CPU.


----------



## anachron (Aug 12, 2019)

EarthDog said:


> That is *G*PU bound value, not CPU.


Yep sorry, was editing while you replied. Guess i didn't get enough sleep last night


----------



## Pumper (Aug 12, 2019)

EarthDog said:


> I don't understand why it is any different in bottlenecking the GPU...when it uses more of the CPU in the first place. What speed is your processor at? Stock? Maybe create a system spec list so we can see what hardware you have going....



It's a software issue, not hardware (as posted in my first reply, I'm running a 4770K at 4.2 with RAM at 1866), just like anachron showed in his DX11 vs DX12 tests. DX11 kills CPU render performance for some reason and drags the GPU down along with it, resulting in lower overall CPU/GPU usage % and thus lower FPS.


----------



## EarthDog (Aug 12, 2019)

It doesn't do that with my system... that doesn't make sense with how it's supposed to work, AFAIK.


----------



## mtrai (Aug 12, 2019)

Here are mine so far on my reference PowerColor 5700 XT, posting since I'm not seeing any. 1920x1080, highest settings. Redoing my 2560x1440 run, as I ran it at High, not Highest.


----------



## Taraquin (Aug 12, 2019)

Hmm, I'm getting really low CPU-score with my i5 8400. Weird.


----------



## anachron (Aug 12, 2019)

It would be interesting to know what frequency your 8400 runs at during the bench and what memory you have. I got twice the performance with my i5 8600K @ 4.7 GHz.


----------



## phanbuey (Aug 12, 2019)

Taraquin said:


> Hmm, I'm getting really low CPU-score with my i5 8400. Weird. View attachment 129105



Put your power settings to HIGH PERFORMANCE and try again.  You're probably downclocking.


----------



## Taraquin (Aug 12, 2019)

Hmm, it was my RAM timings. I've been running 2666 @ 13-16-16-32 / CR1 / tRFC 350 for a while now, but I seemed to remember that I used to get higher scores when I ran 14-16-16-32. Changed it and things made more sense 

I changed the power plan after that and got 20% better on top of that, thanks for the advice.

Actually, it turns out it was not the RAM, but a ThrottleStop setting called SpeedShift; I checked my BIOS and SpeedStep is enabled by default there. Disabled SpeedShift in ThrottleStop, selected the high performance power plan again with the odd RAM timings (13-16-16-32), and CPU render average went up to 176, so a slight improvement over CL14  The funny thing is that this setting has not interfered with any of my other games or benchmarks (Cinebench, Unigine Heaven, For Honor etc.), only SOTTR, which seems to drop the CPU speed quite significantly.


----------



## Taraquin (Aug 14, 2019)

An i7 6700HQ (4c/8t, 3.1 GHz all-core turbo) is awfully slow compared to my i5 8400: 87 vs 176 on CPU render avg.


----------



## EarthDog (Aug 14, 2019)

My dude..... print screen that... screen caps are sooooooo 1990s..... 

I'm also pretty sure that CPU render is influenced some by the GPU.


----------



## Taraquin (Aug 15, 2019)

EarthDog said:


> My dude..... print screen that... screen caps are sooooooo 1990s.....
> 
> I'm also pretty sure that CPU render is influenced some by the GPU.


Sorry, you are right


----------



## PooPipeBoy (Aug 15, 2019)

Tested my 4670K/GTX1060 on 1080p highest preset.



Spoiler: Results


----------



## lsevald (Aug 15, 2019)

I had some time to retest my old 1080 Ti mining rescue, running the XOC BIOS with no power limit at a max of 1.200 V. I stopped at 2152 MHz on the GPU; it might do a couple more notches, as I didn't spot any graphical glitches, but I figured 150 fps was a nice round number for 1920x1080 and the High preset

Edit: this is on Server 2016 (v1607)


----------



## Taraquin (Aug 15, 2019)

Going from my 580 to the 5700 XT gave me almost exactly twice the performance; impressed that my i5 8400 keeps pace with the 5700 XT without problems


----------



## phanbuey (Aug 18, 2019)

Taraquin said:


> View attachment 129311
> 
> Going from my 580 to 5700XT gave me about exact twice the performance, impressef that my i5 8400 keeps pace with the 5700XT without problems



That 8400 is a beast no matter how many reviewers knock that thing.


Tweaked the OC a bit:


----------



## Duvar (Sep 4, 2019)

Ryzen 3600 Result 1440p highest:


----------



## phanbuey (Sep 14, 2019)

1440P results
8700K @ 5.1ish w/2080TI


----------



## Lindatje (Jan 11, 2020)




----------



## Felix123BU (Jan 26, 2020)

Vega 64 OC (1647 core, 1150 mem, +50 Power Limit), Ryzen 3800X stock (second attempt). Love how Vega now regularly beats the 1080 and comes close to or even surpasses a stock 1080 Ti in some games.
With the above settings I'm sitting at an average of 260 W for the GPU and around 390 W total system power.


----------



## potato580+ (Jan 26, 2020)

My old bench with the MSI Twin Frozr, Ryzen 2200G / 8 GB RAM dual channel, all stock clocks,
in Full HD of course.



This one is with the GTX 980:



Pretty much a messy combo on the MSI, yes.
I think the 980 is the better pairing.


----------



## mozo (Jan 26, 2020)

Arch Linux:


----------



## Kissamies (Jan 26, 2020)

Chloe Price said:


> Everything maxed out with 290 Crossfire:


Same settings, same CPU & RAM, 980 Ti


----------



## basco (Jan 26, 2020)

Mr.Chloe can i ask ya - did ya try the 290cfx in Dx11 mode?


----------



## Kissamies (Jan 26, 2020)

basco said:


> Mr.Chloe can i ask ya - did ya try the 290cfx in Dx11 mode?


No, not if I remember correctly; I remember the game running better with DX12


----------



## Felix123BU (Jan 26, 2020)

Chloe Price said:


> At least BF1 got the benefit of two cards and IT KICKS ASS!
> 
> I have a perverted idea of getting a 295X2 with these.
> 
> Fucking Quadfire.



 Quadfire


----------



## Kissamies (Jan 27, 2020)

Felix123BU said:


> Quadfire


Well, I ditched that idea since the other R9 290 broke and so did that X470 mainboard, sold the survivor 290 and moved once again to Nvidia.


----------



## harm9963 (Jan 27, 2020)

Chloe Price said:


> Well, I ditched that idea since the other R9 290 broke and so did that X470 mainboard, sold the survivor 290 and moved once again to Nvidia.


Still have my 290X CFX; I was going to do a 295X2 as well, but got a new 1080 Ti on September 21, 2018 (got the last one at Fry's). Now I have an itch for one more 290X: a *Crossfire x3* project.


----------



## Felix123BU (Jan 27, 2020)

Chloe Price said:


> Well, I ditched that idea since the other R9 290 broke and so did that X470 mainboard, sold the survivor 290 and moved once again to Nvidia.


Pity, would have been fun for the sake of QUAD POWAH only 
I actually do have 4 RX 480s but no board to put them in together, and multi-card support seems to be on life support anyway.


----------



## Kissamies (Jan 27, 2020)

Felix123BU said:


> Pity, would have been fun for the sake of QUAD POWAH only
> I actually do have 4 RX 480's but no board to put them in together, and multi card support seems to be on life support anyway.


Goofin' around with these is for fun, it doesn't need to make any sense


----------



## scope54 (Feb 8, 2020)




----------



## Taraquin (Mar 16, 2020)

Ryzen 3600 stock, 2x8 3266@cl14, 5700XT 1600@850mv (silent setup).

And same setting running 5700XT stock


----------



## Athlonite (Mar 17, 2020)

A new one from me with the R7 3700X 





Yeah I know my GPU is shit but getting a newer and better one here costs a damn small fortune


----------



## Hugis (Apr 2, 2020)

Never did this bench before; seems my 780 Ti is getting battered (but still puts in a fairly decent showing). Time to save my pennies up for something new, I guess


----------



## UncleFuzz (Apr 7, 2020)

First Bench with 2080ti


----------



## blued (Apr 10, 2020)

Not sure why people turn down anisotropic filtering. It has minimal impact on performance. I've left it at AF 16x in the driver's global settings for the last 15 years.


----------



## mrthanhnguyen (Apr 10, 2020)

1080p Highest settings, SMAA4x, RX 580 1400/2150, 9900KS 5.4 GHz HT off


----------



## NoJuan999 (Apr 10, 2020)

3700x (PBO) / RTX2060 Super (+130 GPU / +500 MEM)


----------



## CGi-Quality (Apr 29, 2020)




----------



## Taraquin (Apr 29, 2020)

NoJuan999 said:


> 3700x (PBO) / RTX2060 Super (+130 GPU / +500 MEM)
> View attachment 151043


What RAM do you have? You should tweak it a bit with DRAM Calc, because your CPU score is a bit low


----------



## NoJuan999 (Apr 29, 2020)

Taraquin said:


> What ram do you have? You should tweak some with dram calc because your CPU-score is a bit low


My RAM is as tweaked as I can get it.
G.Skill Ripjaws V 16GB (2 x 8GB) (F4-3600C16D-16GVKC) @ 3733 MHz 16-19-19-19-36-56
(My latency results are in the "How Low Can You Go? Memory Latency Competition - AIDA64" thread on www.techpowerup.com.)

And I believe the CPU scores are a tad low because my GPU is doing most of the work.
It is GPU bound, but only at 74%, compared to 99% GPU bound with an RX 580 (see my earlier post in this "Shadow of the Tomb Raider benchmark" thread on www.techpowerup.com).
				





PS
My CPU scores pretty well on CB R20 and CPUz Benchmarks compared to other 3700x rigs.
And I am really satisfied with my FPS with this setup.








(My CB scores are in the "Post your Cinebench R23 Score" thread on www.techpowerup.com, and my CPU-Z validation is on valid.x86.fr: AMD Ryzen 7 3700X @ 4174.03 MHz, Asus ROG STRIX B450-F GAMING, 16384 MB RAM.)


----------



## Taraquin (Apr 30, 2020)

Hmm, that is strange. I don't mean to criticize you, I just think you have unrealized potential  I get 157 avg / 115 min on CPU game after some further RAM tweaking, and those scores are independent of the GPU since they only show what the CPU contributes. I only have an R5 3600, so yours should be higher. Lowering tRRDS to 4 and tFAW to 16 gave me 3 fps alone. Lowering tRFC from 640 to 550 gave me another 5. Do you have Hynix CJR dies? Try getting tRFC down to 500 or maybe lower if it's 600-ish now; that will make a big difference. RAM tweaking barely impacts Cinebench (I get about the same running 3000cl16-17-17-38 as 3733cl16-19-15-30 like I do now), but in games it makes a big difference. CPU render went from 133 avg / 91 min to 157/115 with RAM tweaking alone.


----------



## Warsaw (Apr 30, 2020)

On my bench my FPS boosted from about 66 fps to 88 fps with some OC on my video card. However, looking at my CPU numbers and my render performance, I get this weird jump midway through the bench. Anyone know what that could be caused by? When I've looked over others that have the 2700X, mine seems out of line with that CPU render jump.

Additionally, I get spikes in games from time to time, where my framerates will be fine but I'll get a small micro stutter (can be every 30 seconds to a minute or two). I've looked at everything I've thought of but haven't come to a firm conclusion.

Update:
Fixed it by taking off the FPS limiter in RivaTuner; now it's a smooth graph. Not sure if this will have a bearing on my sudden stutters, as those almost seem attributable to something loading from RAM or my storage. Also tried the Ultimate Performance power profile to see if that has an impact.


----------



## Hardi (Apr 30, 2020)

Hardi said:


> View attachment 112400



A little upgrade







----------



## NoJuan999 (Apr 30, 2020)

Taraquin said:


> Hmm, that is strange. I don't mean to critizise you, I just think you have urealized potential  I get 157avg/115min on CPU game after some furthe ram tweaking and those scores are independent of GPU since they only show what CPU contributes with. I only have R5b3600 so yours should be higher. Lowering trrds to 4 and tfaw to 16 gave me 3fps alone. Lowering tRFC from 640 to 550 gave me another 5. Do you have Hynix CJR-die? Try getting tRFC down to 500 or maybe lower if it's 600ish now, that will make a big difference. Ramtweaking barely impacts Cinebench, I get about the same when running 3000cl16-17-17-38 as 3733cl16-19-15-30 like I do now. Games are a big difference. CPU-render went from 133avg/91min to 157/115 with ram tweaking alone.


No worries.
I did run the Benchmark from within the game and not the standalone Benchmark.
I don't think that should matter but it might.
My RAM has Hynix DJR ICs.
I did get tRFC down to 489.


----------



## Taraquin (May 1, 2020)

I think I see what is limiting your performance: tRRDS, tRRDL and tFAW. tRRDS is in a 1:4 relation to tFAW. Try lowering tFAW to 28 and see what happens; if that works, lower tFAW to 24 and tRRDS to 6, the next step is 20 and 5, and the last step is 16 and 4, though that might not be stable. It works for me. Then try lowering tRRDL to 6 or 7. I bet you will see a big improvement in the CPU score if you get those a bit lower.
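
The step-downs above follow the 1:4 rule (tFAW = 4 x tRRDS). A throwaway sketch of the trial ladder; `trial_ladder` is just an illustrative helper name, not a real tuning tool, and nothing here guarantees a given RAM kit will be stable at the tighter values:

```python
def trial_ladder(start_trrds=7, stop_trrds=4):
    """Trial steps for tightening tRRDS/tFAW together, keeping tFAW = 4 * tRRDS.

    Each pair is a candidate to stability-test, starting loose and
    going one notch tighter at a time.
    """
    return [(t, 4 * t) for t in range(start_trrds, stop_trrds - 1, -1)]

for trrds, tfaw in trial_ladder():
    print(f"try tRRDS={trrds}, tFAW={tfaw}")
# try tRRDS=7, tFAW=28
# try tRRDS=6, tFAW=24
# try tRRDS=5, tFAW=20
# try tRRDS=4, tFAW=16
```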


----------



## Uggis007 (May 13, 2020)

How can I get SLI to work again for DX12?
When I enable DX12 the game won't start.
I only get a black screen with my mouse cursor on it for a couple of seconds before it goes back to the desktop.
When I choose DX11 the game starts with no problem.


----------



## EarthDog (May 14, 2020)

Uggis007 said:


> How can I get the sli to work again for dx12?
> when I enable dx12 the game won’t start.
> I only get a black screen with my mouse courser on for a couple of seconds, before it goes back to the desktop.
> When I choose dx11 the game starts with no problem.


Don't know... but I'd sell any SLI/CFX system and go single card....


----------



## Uggis007 (May 14, 2020)

augustben said:


> View attachment 128416
> 
> Finally saved up enough money to get myself a new PC. The game runs smoothly and amazingly! I had to disable DirectX 12 though since the game won’t launch with it enabled.
> 
> ...



how did you get dx12 to work again?


----------



## Deleted member 197986 (Jun 7, 2020)




----------



## professor dumb dumb (Jun 10, 2020)

10980xe at 4.85ghz, Bclk 167, Mesh 3.15ghz
64gb ddr4 at 16-16-16-32 1T 288
2080 ti Hybrid at 2025/1925
Hyperthreading off.
Game Settings: 1080p Highest TAA


----------



## blued (Jun 11, 2020)

StarExplorer said:


> View attachment 158199
> 
> View attachment 158201


Wow.. you are severely bottlenecked by that 9 year old CPU.


----------



## Fizban (Jun 19, 2020)

blued said:


> Wow.. you are severely bottlenecked by that 9 year old CPU.



Lol, my laptop's 1660 Ti is matching his 2080 Ti due to my CPU being massively superior.


----------



## Kissamies (Jun 26, 2020)

The old Phenom still handles it.. X2 555 BE @ X4 3.8GHz


----------



## Athlonite (Jul 2, 2020)

well splurged and bought a new GPU so here it is folks


----------



## Fizban (Jul 7, 2020)

New laptop:





CPU scores don't seem to fare too poorly vs the 3700X above, and the 2070 Super looks to outperform the desktop 5700 by around 10% as well, so not too bad.

EDIT: Wait, 16x aniso vs 8x, Ultra vs High shadows, and he has High Screen Space Contact Shadows on. I just picked the highest preset; it seems he tweaked it to make it more demanding. I'll re-run with those settings just to see if I still outmuscle the 5700.

I am unsure if he also picked MSAA 4x; if so then I get smoked. I ran it that way by accident on my first run and only noticed when I tried to see why my fps was so bad.



Turns out those settings make no tangible difference: 2 FPS. The only real question is whether that was TAA or SMAA4x. On SMAA4x my fps tanks, hard; the result is then in the 80s.

Overclocked the GPU by 160 MHz and got 117 fps. It had been fine with 175 MHz in other games, but not this one: 175, 170, and 165 all made it crash. Will try raising the VRAM clock now to see if that gives any appreciable gains.

A 1375 MHz OC on the VRAM only nets 1 extra fps, pushing it to 118. Are there any games where VRAM OCs actually make any real difference?


----------



## Athlonite (Jul 8, 2020)

Fizban said:


> Turns out those settings make no tangible difference. 2 FPS. The only real question is if that was TAA or SMAA4X. On SMAA4X my fps tanks, hard. Like, result is then in the 80's.



SMAA 4x on my score, mainly because I can't do the RTX stuff, so I wanted to push the GPU in ways it could handle.

Not that I'd play a game with those sorts of settings; some of them are just way too much of a drag on performance without much visual benefit. Like motion blur: I've driven at over 200 MPH and done a HALO jump from a plane, and I never once suffered from motion blur, so I turn that shit off. Pure Hair is another performance drain I don't need, as I more often than not play first person, not third, so I'm not spending all day looking at my marvelous hair (pfft).


----------



## Fizban (Jul 8, 2020)

That is with SMAA 4x. No longer a competition at that point; your 5700 smokes me. I'm actually surprised by that, as I'd expect to outperform a 5700. TechPowerUp's GPU rankings say a 5700 is 4% faster than a 2070 Super Mobile, and my 2070 Super Mobile is THE highest-scoring one on the 3DMark Fire Strike leaderboard. I had actually guessed you were using TAA, but wasn't sure.

If that was SMAA 4x, then this game seems to oddly favor AMD cards despite being made with NVIDIA-specific features.


----------



## mouacyk (Jul 8, 2020)




----------



## ThrashZone (Jul 8, 2020)

Hi ^^^^
1080ti did quite well


----------



## mouacyk (Jul 9, 2020)




----------



## mouacyk (Jul 10, 2020)




----------



## mrthanhnguyen (Jul 10, 2020)




----------



## Hyderz (Jul 13, 2020)

My benchmark, good ol' GTX 1050


----------



## uco73 (Jul 23, 2020)

1080p ultra settings, no overclocking


----------



## KIKOFREDY (Aug 24, 2020)

Here's mine.... Ryzen 5 3600 - RTX 2060 KO - 32 GB RAM


----------



## harm9963 (Aug 24, 2020)

Seems to like 4 sticks of ram.


----------



## Taraquin (Aug 25, 2020)

KIKOFREDY said:


> Here,s mine.... Ryzen 5 3600 - RTX 2060 KO - RAM 32Gb
> 
> View attachment 166564


Have you used Ryzen DRAM Calc? Have you set the power plan to high? Your CPU score is very low. I have a 3600 myself and get 158 avg / 113 min; you get 89/54 with the same CPU as me.


----------



## Athlonite (Aug 25, 2020)

Taraquin said:


> Have you used ryzen dram calc? Have you set powerplan to high? Your CPU-score is very low. I have a 3600 myself and get 158 avg\113 min, you get 89\54 with the same CPU as me.



I think it has more to do with the fact that their CPU temps are at 95°C, so either it's really hot where they live or the stock cooler isn't cutting it


----------



## Taraquin (Aug 25, 2020)

Athlonite said:


> I think it has more to do with the fact their CPU temps are at 95c so either it's really hot where they live or using a stock cooler isn't cutting it


His score is 60% of mine; high ambient temps and a stock cooler do not explain that. Running the balanced power plan or not tuning RAM could explain it. My CPU fps rose from 132 to 158 by RAM OC with DRAM Calc, from 3000cl15 to 3733cl16. The high performance power plan also helped.


----------



## Athlonite (Aug 26, 2020)

Taraquin said:


> His score is 60% percent of mine, high ambient temps and stock cooler does not explain that. Running balanced powerplan og not tuning ram could explain that. My CPU-fps rose from 132 to 158 by ram-OC with Dram calc from 3000cl15 til 3733cl16. High performance powerplan also helped.



Until they can better control those temps, overclocking is the last thing they need to do, so stop pushing them to cook their system prematurely. You do whatever you want with your system, but their CPU temps are sitting at 95°C; that's really high, and pushing things like the IMC will only worsen those temps.


----------



## Taraquin (Aug 26, 2020)

Athlonite said:


> Until they can better control those temps Overclocking is the last thing they need to do so stop pushing them to cook their system prematurely.  You do whatever you want with your system but their CPU temps are sitting at 95c that's really high and pushing things like the IMC will only worsen those temps


Of course I recommend a repaste or a change of cooler, but honestly, RAM OC did not affect CPU temps at all on my setup. It depends on several things; RAM voltage is one of them. If I up the RAM voltage, CPU temp also increases slightly, but there was no difference from 3000cl15 to 3733cl16 at the stock 1.35 V. An underclock with fixed voltage on the CPU would probably also be a good idea if he does not change the cooler or if ambient temp is a problem over time: running, say, 3.7 GHz barely affects gaming performance, but the difference in consumption and temps when it only needs ~1.0 V vs the stock 1.3-1.4 V is huge.


----------



## Hardcore Games (Aug 26, 2020)

I use 4K (3840x2160), not some namby-pamby dinosaur TN panel


----------



## JrRacinFan (Aug 31, 2020)

@Taraquin Probably display resolution. I run 75 Hz v-sync with DLSS, the rest is High (ray tracing disabled), and get between 65-75 fps. 1920 MHz boosted core, 7 GHz memory.









As you can see, I am now 50% GPU bound at 1080p with no v-sync (because my panel doesn't support 75 Hz @ 1080p)


----------



## Taraquin (Sep 1, 2020)

JrRacinFan said:


> @Taraquin  probably display resolution. I run 75hz vsync with dlss rest is high (raytracing disabled) and get between 65-75fps. 1920Mhz boosted core 7Ghz memory.
> 
> 
> 
> ...


Okay. What I look at is the CPU game average. This changes very little with resolution and quality settings, and v-sync does not affect it. It says something about the potential of the CPU: it is how many fps your CPU would put out without a GPU bottleneck. For some reason SOTTR often enters a low performance state unless you set the power profile to high; Ryzen Balanced sometimes works. RAM speed is also crucial. With my CPU running stock at 3000cl16 XMP I get 110 avg using the balanced power plan, and 133 fps CPU game at 720p or 1080p highest with the high performance power plan: a 20% increase due to the power plan alone. With my 3733cl16 DRAM preset I get 163 fps, which is almost 25% better performance due to RAM tuning. If I OC my CPU to 4.1 I get 165.

If you want the best performance you must use a power plan that makes SOTTR run at max CPU speed, and use DRAM Calc. Combine these and you get about 50% higher CPU performance. It's free and easy to do.

In most games the power plan does very little for performance, but in SOTTR and a few others it has a huge impact. If you use v-sync on a low-Hz monitor at high res it doesn't matter much. At higher refresh rates it matters a lot.
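
The percentages quoted above are plain relative gains; a quick check of the arithmetic (the `pct_gain` helper is made up, the fps figures are the ones from this post):

```python
def pct_gain(old_fps, new_fps):
    """Relative improvement from old_fps to new_fps, in percent."""
    return 100.0 * (new_fps - old_fps) / old_fps

# Balanced -> high performance power plan at stock 3000cl16 XMP
print(round(pct_gain(110, 133), 1))  # 20.9, the "20% increase"
# High performance plan, then the 3733cl16 RAM preset on top
print(round(pct_gain(133, 163), 1))  # 22.6, the "almost 25%"
# Both together, relative to the balanced-plan baseline
print(round(pct_gain(110, 163), 1))  # 48.2, the "about 50%"
```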


----------



## reavl (Oct 16, 2020)

And me, being 0% GPU bound... so I'm 100% CPU bound? Time to get a new CPU. Looks like my 5820K will bottleneck anything above a 1080 Ti at 1080p. My GTX 1080 was a nice pairing with this CPU. This is at a 4.1 GHz OC, although I can OC it to 4.4 GHz, and I see very tangible fps gains for every 0.1 GHz I add. A very nice chip to OC considering it only boosts to 3.8 GHz by default. I was going to go 3600X, but I see now that many people here have the same CPU game avg with it, so maybe I will wait for the Ryzen 5000s. The RTX 3080 is undervolted to 1740 MHz here, >100 W less power draw.

Here is with default voltage and boost for the RTX 3080 TUF OC: mine boosts to 1965 MHz (1980 MHz when it's not heat saturated). So 5-10 more frames at 1080p. The end result is the same of course, because it's CPU bound.


----------



## dOBER (Oct 16, 2020)

x





3080 highest settings 1080p

Something is broken, because I got a higher score with my 2080 Ti, and it shows 1% CPU bound with the same CPU and overclock.

View attachment 127264


----------



## Taraquin (Oct 16, 2020)

dOBER said:


> x
> 
> View attachment 171978
> 
> ...


Look at CPU Game and GPU. Your GPU is pushing out many more frames than the CPU. Have you overclocked the 9900K? Running RAM at stock? You should be able to get over 200 with a tweaked 9900K, 250 with a really well-tweaked one. For comparison, my Ryzen 3600 gets around 160 fps on CPU render and 115 fps min, but I have my RAM tweaked to 3733CL16 with tight subtimings.


----------



## dOBER (Oct 17, 2020)

Taraquin said:


> Look at CPU game and GPU. Your GPU is pushing out much more frames than the CPU. Have you overclocked the 9900K? Running ram stock? You should be able to get over 200 with a tweaked 9900K, 250 with a really good tweaked one. For comparison my Ryzen 3600 gets around 160fps on CPU render and 115fps min, but I have my ram tweaked to 3733cl16 with tight subs.


my i9-9900k runs at 5.2ghz allcore





GPU draws zero power and I can't get any good score like with my 2080 Ti. In 3DMark my GPU uses about 440 W and CPU scores are normal there too.


----------



## Taraquin (Oct 17, 2020)

dOBER said:


> my i9-9900k runs at 5.2ghz allcore
> 
> View attachment 172083
> 
> gpu draws zero power and i cant get any good score like with with my 2080ti. in 3D mark my gpu uses about 440w and cpu scores are normal there too


Are you using the high performance profile in the power settings? On my i5 8400, performance in SOTTR is crap when using balanced, but good with high perf. Balanced seems to lower performance quite a lot in some games: Dishonored 2 and SOTTR run badly on balanced but fine on high perf.


----------



## bottleofrum (Oct 25, 2020)

Ryzen 5 3600, overclocked 4200MHz, undervolt 1250mV
Nvidia 3080 FE, undervolt 887mV at 1,890MHz


----------



## jormungand (Oct 25, 2020)

Here are mine


----------



## mrthanhnguyen (Oct 25, 2020)

Everyone should use the same setting. Highest and 1080p or 1440p. Do not change anything else.


----------



## EarthDog (Oct 25, 2020)

mrthanhnguyen said:


> Everyone should use the same setting. Highest and 1080p or 1440p. Do not change anything else.


We've been asking for that since the thread started two years ago, lol. The OP didn't maintain it and staff OK'd it, so... it's just a pile of results... even worse, random ones.

It is what it is.


----------



## jlewis02 (Oct 25, 2020)

My system 
I need a 30?? card already


----------



## Athlonite (Oct 26, 2020)

new mobo and Adata SX8200 Pro 1TB NVMe same CPU/Ram/GPU


----------



## Deleted member 202104 (Oct 26, 2020)

'Highest' preset chosen, no other changes.

1080p:







1440p:


----------



## Taraquin (Nov 13, 2020)

Anyone with a tweaked Ryzen 5000 with results to share?


----------



## Felix123BU (Nov 21, 2020)

With my new 6800 XT. It's a beast, but at 1920x1080 it's heavily held back by my CPU (3800X); the score could be quite a bit higher. But since I play at 3440x1440, that's fine







And the 6800 XT at 4K. Not bad at all.


----------



## blued (Nov 22, 2020)




----------



## mb194dc (Nov 23, 2020)

5600xt @ 2125 Ryzen 5 1600 @ 4ghz 1080p highest settings:


----------



## jonRock3 (Nov 28, 2020)

Ryzen 7 5800X with PBO and curve optimizer enabled, and FCLK is at 2067MHz with 4133 MHz CL16 RAM. 1080 Ti at 2126MHz on the core and 12GHz effective on the memory. Purely GPU bound and getting a little better than stock RTX 2080 Super performance (according to this: NVIDIA GeForce RTX 2080 SUPER review - DX12: Shadow Of The Tomb Raider (guru3d.com) )





Here's 1080p:


----------



## jlewis02 (Nov 29, 2020)

Testing a 2080s until my new psu gets here for my 3080


----------



## RandallFlagg (Nov 30, 2020)

10400 power unlocked w/102.5 bclk

EVGA 2060 KO Ultra with +150 GPU/ +150 Mem

Running 1080p, High preset, no changes.


----------



## jlewis02 (Nov 30, 2020)

Run it at highest please



RandallFlagg said:


> 10400 power unlocked w/102.5 bclk
> 
> EVGA 2060 KO Ultra with +150 GPU/ +150 Mem
> 
> Running 1080 preset High no changes.


----------



## harm9963 (Nov 30, 2020)

jonRock3 said:


> Ryzen 7 5800X with PBO and curve optimizer enabled, and FCLK is at 2067MHz with 4133 MHz CL16 RAM. 1080 Ti at 2126MHz on the core and 12GHz effective on the memory. Purely GPU bound and getting a little better than stock RTX 2080 Super performance (according to this: NVIDIA GeForce RTX 2080 SUPER review - DX12: Shadow Of The Tomb Raider (guru3d.com) )
> 
> View attachment 177348
> 
> ...


What are your curve optimizer settings? Are you using auto or manual? We need more detail, plus system specs.


----------



## RandallFlagg (Nov 30, 2020)

jlewis02 said:


> Run it at highest please



At highest :


----------



## harm9963 (Nov 30, 2020)

jlewis02 said:


> Testing a 2080s until my new psu gets here for my 3080
> View attachment 177464View attachment 177480View attachment 177481


Exclusive Fullscreen please


----------



## Taraquin (Nov 30, 2020)

jonRock3 said:


> Ryzen 7 5800X with PBO and curve optimizer enabled, and FCLK is at 2067MHz with 4133 MHz CL16 RAM. 1080 Ti at 2126MHz on the core and 12GHz effective on the memory. Purely GPU bound and getting a little better than stock RTX 2080 Super performance (according to this: NVIDIA GeForce RTX 2080 SUPER review - DX12: Shadow Of The Tomb Raider (guru3d.com) )
> 
> View attachment 177348
> 
> ...


Have you tuned the RAM at all? Could you try a rerun at 1920 medium? Look at your CPU Game score; that is how many fps the CPU puts out.


----------



## jlewis02 (Nov 30, 2020)

harm9963 said:


> Exclusive Fullscreen please


----------



## mrthanhnguyen (Dec 1, 2020)




----------



## Sovsefanden (Dec 1, 2020)

I love games with a built-in benchmark! I highly prefer these over synthetic benchmarking. Wish all AAA titles had one.


----------



## jonRock3 (Dec 1, 2020)

harm9963 said:


> What are your curved  settings at ? ,   are you using auto or manual !  need to be more detail , plus need to add system specs





Taraquin said:


> Have you tuned the ram any?   Could you try to rerun at 1920 medium? Look at your CPU game score, that is how many fps CPU puts out


My RAM is at 4133MHz with 16-16-16-16-36 timings in my ASUS X570-E motherboard. My 5800X has -10 on all cores for the curve optimizer and +200MHz for PBO. Latency is around 55ns in AIDA64. That's with 2067MHz FCLK.


----------



## jonRock3 (Dec 2, 2020)

mrthanhnguyen said:


> View attachment 177701


Holy hell. That's nice. What's your cpu and ram clocked at?


----------



## mrthanhnguyen (Dec 2, 2020)

jonRock3 said:


> Holy hell. That's nice. What's your cpu and ram clocked at?


10900k @5.5ghz, 2x16gb 4600c16, rtx 3090 2160mhz-2175mhz.


----------



## Taraquin (Dec 3, 2020)

jonRock3 said:


> My RAM is at 4133MHz and 16-16-16-16-36 timings in my ASUS X570-E motherboard. My 5800X had -10 on all cores for the curve optimizer and +200MHz for PBO. Latency is around 55ns in Aida64. That's with 2067MHz FCLK.


Have you used DRAM Calculator? If you want to improve even further there are some subtimings you should tweak. You should be able to get latency down to 50-52 ns with a few RAM tweaks; tRC, tRFC, tRRDS, tRRDL, tFAW and tWR, for instance, have a big impact.


----------



## jonRock3 (Dec 3, 2020)

Taraquin said:


> Used dram calc? If you want to improve even further there are some subtimings which you should tweak. You should be able to get latency down to 50-52 with a few ram tweaks  tRC, tRFC, tRRDS, tRRDL, tFAW, tWR for instance has much impact.


I haven't used it yet, but I'll try it out. I'll have to look up a tutorial on how to use it first.


----------



## RandallFlagg (Dec 4, 2020)

mrthanhnguyen said:


> 10900k @5.5ghz, 2x16gb 4600c16, rtx 3090 2160mhz-2175mhz.



What's your northbridge (cache) at?  That's one of the fastest 10900K's I've seen.


----------



## mrthanhnguyen (Dec 4, 2020)

RandallFlagg said:


> What's your northbridge (cache) at?  That's one of the fastest 10900K's I've seen.


5.2 GHz cache. SP104.


----------



## RandallFlagg (Dec 4, 2020)

mrthanhnguyen said:


> 5.2 cache. Sp104.



Wow, you won the CPU and Chipset lottos dude!


----------



## mouacyk (Dec 4, 2020)

RandallFlagg said:


> Wow, you won the CPU and Chipset lottos dude!


Or the dude likes to pump voltage.


----------



## mrthanhnguyen (Dec 4, 2020)

mouacyk said:


> Or the dude likes to pump voltage.


1.35v llc8 or 1.55v llc6.


----------



## jlewis02 (Dec 5, 2020)

100/1250 on gpu 5ghz on cpu


----------



## jlewis02 (Dec 7, 2020)




----------



## Taraquin (Dec 7, 2020)

What


jlewis02 said:


> 100/1250 on gpu 5ghz on cpu
> View attachment 178328


What is your RAM running at? My tweaked Ryzen 3600 with 3733 RAM gets the same CPU score as you. You have a lot of unrealised potential.


----------



## jlewis02 (Dec 7, 2020)

Taraquin said:


> What
> 
> What is your ram running at? My twraked Ryzen 3600 with 3733 ram get the same CPU score as you. You have a lot on unrealised potential



Memory is at 2666 have some 4000 ordered


----------



## mrthanhnguyen (Dec 7, 2020)

jlewis02 said:


> View attachment 178484View attachment 178485View attachment 178486


Did not expect an almost 40 fps difference from 3080 to 3090 and 5 GHz to 5.5 GHz at 1440p.


----------



## Taraquin (Dec 7, 2020)

jlewis02 said:


> Memory is at 2666 have some 4000 ordered


You don't need that. Even most 2666 RAM is capable of running much faster. You can use Thaiphoon Burner to find out which dies they are. I bet you can do 3200CL16 without issues, and if you've got Hynix C/D or Micron E/B you can do 4000+ without problems.


----------



## harm9963 (Dec 11, 2020)

New build 21277.1000 rs





Taraquin said:


> You don't need that. Even most 2666 ram are capable of running much faster. You can use thaiphoon burner and find out which dies they are. I bet you can do 3200cl16 without issues, if you got Hynix C/D or Micron E/B you can do 4000+ without problems


My 3600CL15 is doing 3800CL14, and my 5800X unlocked that door.


----------



## Taraquin (Dec 14, 2020)

harm9963 said:


> New build 21277.1000 rsView attachment 179189
> 
> 
> My 3600CL15 is doing 3800 CL14 , and my 5800X unlock that door.View attachment 179209


Good score. A few more tips: set tRRDS to 4 and tFAW to 16, set tWR to 12 or 10, tRRDL to 6 or 4; tRC might do 42-46. Try setting tRFC a bit lower, 240-270 should be possible. If you are able to set all those, I bet you can get up to 5% extra performance. Look at the CPU Game avg and compare if you tweak further.


----------



## phanbuey (Dec 15, 2020)

4k highest, new 10850K in rig:


----------



## jlewis02 (Dec 15, 2020)

Memory speed helps out.
Can't do 4k anymore gave the monitor back.


----------



## cRs (Dec 30, 2020)

8700k 5Ghz oc 
2x8 16Gb 3000Mhz
It's time to change my CPU


----------



## Felix123BU (Jan 9, 2021)

Aaand my 6800 XT with my new baby, the Ryzen 5600X.
Vs the runs with the Ryzen 3800X: +27 FPS with the 5600X at 1080p.
Not bad at all






phanbuey said:


> 4k highest, new 10850K in rig:
> View attachment 179619



Is this 3840x2160? A 2080 Ti should not be even close to 92 FPS at 4K, not even shunt modded.


----------



## phanbuey (Jan 9, 2021)

Felix123BU said:


> aaand my 6800XT with my new baby, the Ryzen 5600X
> vs the runs with the Ryzen 3800X, +27 FPS with the 5600X at 1080
> Not bad at all
> View attachment 183199
> ...



Yeah, I think something was really odd with that run... my 3080 scores like 83 FPS at 4K at the same settings.

Probably some weird glitch where DLSS didn't turn off or something.


----------



## Deleted member 202104 (Jan 9, 2021)

All stock, Highest settings


----------



## Felix123BU (Jan 10, 2021)

phanbuey said:


> Yah I think something was really odd with that run... my 3080 scores like 83 FPS at 4k at the same settings
> 
> Probably some weird glitch where DLSS didn't turn off or soemthing.


Thought you had Huang's secret one-in-a-million 2080 Ti


----------



## phanbuey (Jan 10, 2021)

4k Highest w/ 3080

Workstation settings: 5.0 GHz velocity boost all-core until 75 °C, then 4.9 GHz.
Undervolted MSI Ventus 3080.


1440P settings:





5.1 Ghz/ 5.0 'Chernobyl' settings, 1440P:


----------



## Felix123BU (Jan 10, 2021)

phanbuey, what would your 4K score be with the 3080 maxed out? Curious, as I get a constant 85 FPS with my 6800 XT, though it's overclocked to 2.6 GHz.


----------



## phanbuey (Jan 10, 2021)

Felix123BU said:


> phanbuey, what would your 4K score be with the 3080 maxxed out? Curious, I get 85FPS constant with my 6800XT, though its overclocked to 2.6ghz



So fully maxed out doesn't exist for my card, because I have the lowest-end MSI Ventus 3X. The only reason I even got it is that my friend couldn't get it stable *at stock* in his mini ITX case. The power limit only goes to 100% (stock) and there's no way to raise it, even with a BIOS flash, so he ended up selling it to me at a discount. That 83 FPS is about as high as she will go with full stability.

2.6 GHz on a 6800 XT will spank it all day long.

I'm pretty happy with it though: it has HDMI 2.1, so I can run 120 Hz G-Sync HDR 4K (the 2080 Ti had to run heavy 4:2:0 compression and no HDR), and it's about 15% faster than my heavily OC'd 2080 Ti at 4K. And it's silent, you know, as a positive side effect of being a turd that can only maintain stability with an undervolt. Still puts out the same scores as a stock 3080 FE, so no complaints.


----------



## Felix123BU (Jan 10, 2021)

phanbuey said:


> So fully maxxed out doesnt exist for my card because i have the lowest-end MSI Ventus 3x - the only reason I even got it is because my friend couldn't get it stable *at stock* in his mini ITX case.  The power limit increases to 100% (stock) there is no way, even with a bios flash to increase it higher, so he ended up selling it to me at a discount.  That 83FPS is about as high as she will go with full stability.
> 
> 2.6 Ghz on an 6800xt will spank it all day long.
> 
> I'm pretty happy with it though, it has HDMI 2.1, so I can run 120hz Gsync HDR 4K (the 2080ti had to run heavy 4:2:0 compression and no HDR) and is about 15% faster than my heavily OC'd 2080ti at 4k.   And it's silent, you know, as a positive side effect of being a turd that can only maintain stability with an undervolt -- still puts out the same scores as a stock 3080 FE so no complaints.


I was asking because the 3080 should be faster at 4K in theory, so since my 6800 XT is heavily OC'd, it would have been interesting to see an OC vs OC comparison at 4K in this game. Yours is, as you say, "lowest-end", and mine is the reference 6800 XT, so it would have been sort of "fair".


----------



## Felix123BU (Jan 15, 2021)

Yay, broke 190 FPS at 1080p Highest, all AMD powaaaah. I was initially suckered into "the 5600X is just as good"; it ain't. Got the 5800X instead, and omg it's a beast.


----------



## phanbuey (Jan 15, 2021)

Felix123BU said:


> yay, broke 190 FPS in 1080 Highest, all AMD powaaaah      was initially suckered into "the 5600X is just as good", it aint, got the 5800X instead, and omg its a beast
> 
> View attachment 184163




Sweet man!

I'm waiting for Nvidia to enable that resizable BAR support to see if it makes any difference...


----------



## Felix123BU (Jan 15, 2021)

phanbuey said:


> Sweet man!
> 
> I'm waiting for Nvidia* to enable that resizable BAR support to see if it makes any difference....


I'm curious too whether it has a different impact vs an AMD GPU, and which one is the better implementation, or which one can better take advantage of it.
On mine it's theoretically enabled, though I get basically the exact same frames in Tomb Raider with it on or off.


----------



## Deleted member 202104 (Jan 15, 2021)

Felix123BU said:


> On mine its theoretically enabled, though I get basically the exact same frames in Tomb Raider with it on or off.



I see the same behavior with Intel.  1-3 fps, so well within the margin of error.


----------



## Felix123BU (Jan 15, 2021)

weekendgeek said:


> I see the same behavior with Intel.  1-3 fps, so well within the margin of error.


Yup, exact same thing; I can't really quantify it in this game.


----------



## phanbuey (Jan 20, 2021)

Welp, it's no 5800X, but not bad for a $390 budget option.  For whatever reason I'm 43% GPU bound while @Felix123BU is 0%.


----------



## Felix123BU (Jan 20, 2021)

phanbuey said:


> View attachment 184754
> 
> 
> Welp it's no 5800x but not bad for a $390 budget option  For whatever reason im 43% GPU bound while @Felix123BU is 0% .



I am confused by what the benchmark in this game is reporting. Sometimes I'm 0% GPU bound, other times 27%.
The CPU Render section always shows better on Intel; the CPU Game section is better on mine, but the CPU Render is quite a lot lower.
I have no clue how the average is calculated; it's not the average displayed in the sub-section, nor a mathematical division of any of the values...

And btw, the 10850K is a very good processor. I would have considered it had I not already been on AM4 with a good motherboard. I bet if you overclock the crap out of it, it will match my 5800X rather easily, though the 3080 is a smidge weaker at 1080p.


----------



## RandallFlagg (Jan 20, 2021)

Kinda surprised I have not seen a 10600K in this thread.


----------



## mouacyk (Jan 21, 2021)

phanbuey said:


> Welp it's no 5800x but not bad for a $390 budget option  For whatever reason im 43% GPU bound while @Felix123BU is 0% .


3080 is dozing off at 1080p, no fault of CPU.


----------



## nguyen (Jan 21, 2021)

some 3090 loving


----------



## mrthanhnguyen (Jan 21, 2021)

nguyen said:


> some 3090 loving
> 
> View attachment 184920


U run stock? Coz my system has like 70-80 more fps.


----------



## nguyen (Jan 21, 2021)

mrthanhnguyen said:


> U run stock? Coz my system has like 70-80 more fps.



I'm pretty sure not at 4K


----------



## mrthanhnguyen (Jan 21, 2021)

nguyen said:


> I'm pretty sure not at 4K


Oh sorry thought it 1440p.


----------



## mouacyk (Jan 22, 2021)

4k, dsr on 2560x1440 monitor


----------



## Felix123BU (Jan 22, 2021)

mouacyk said:


> 4k, dsr on 2560x1440 monitor
> View attachment 185023


what's the difference in quality vs native 2k?


----------



## mouacyk (Jan 22, 2021)

Felix123BU said:


> what's the difference in quality vs native 2k?


Much sharper image due to the downsampling, but obviously not as sharp as native 4K monitor.


----------



## Lew Zealand (Jan 23, 2021)

Intel NUC8i5 4c8t @3.6GHz ACT, 16GB 2400 MHz, GTX 1060-6GB in Akitio Node eGPU.

FPS seems OK compared to other 1060s in 4c8t machines though the min and 95% are a bit low as expected from TB3 latency.










Core i5-8400, B360, 16GB 2666MHz CL13-13-13-35, PNY GTX 1080 w/crap cooler, so UV to .95 and OC to roughly 2012-2025MHz, temp dependent.

If I get real frisky, I'll swap the GPUs and test vice versa to see how the eGPU affects them differently.  Yeah.  We'll see if I do that.


----------



## Lew Zealand (Feb 3, 2021)

Got around to swapping the GPUs.  Only the NUC w/1080 for now:






Intel NUC8i5 4c8t @3.6GHz ACT, 16GB 2400 MHz, PNY GTX 1080 w/crap cooler (UV to .95 and OC to roughly 2012-2025MHz, temp dependent) in Akitio Node eGPU.

Clear CPU limitation.  Ran it at 1440p for fun and got 62fps, 92% GPU-bound, with the CPU bound spots being NPC-heavy.  The 1080 is a better 1440p card than the 1060 is a 1080p card, not surprising with twice the CUDA cores.

1060 in 8400 to come later.


----------



## mrthanhnguyen (Feb 4, 2021)

One more MO-RA, so one more update
1440p



4k


----------



## Lew Zealand (Feb 5, 2021)

Finally, the 1060 in the i5-8400 box:






Core i5-8400, B360, 16GB 2666MHz CL13-13-13-35, MSI Gaming X GTX 1060-6GB

15-16% hit in overall FPS at 1080p when swapping the 1060 from a gaming desktop to an eGPU.  Not too bad.  But a 20% hit to 1% lows as you might expect with the reduced throughput and added latency of PCIe-TB3-PCIe translation.

Compare to 23-24% hit in overall FPS at 1080p when swapping the 1080 from a gaming desktop to an eGPU.  However the difference here is that the 4c8t CPU in the NUC now becomes the limitation in NPC-heavy areas with a 1080, which leads to about 37% reduction in 1% lows.  If I had the 6c12t NUC10i7, that would give a more apples-to-apples comparison but yeah, that one's pretty expensive...

OK and I've played a few games on the 1080 in the NUC+eGPU compared to the gaming PC i5-8400 and here's what I see:

Ark:SE - model pop-in is bad in this game in general and is noticeably worse in the eGPU with obvious frame drops.  However most of the time, it plays pretty similarly to the gaming desktop with only a slightly lower FPS at 1440p with my medium-high custom settings.

Control - Notably worse FPS but no pop-in problems.  This game is pretty GPU-heavy yet easy on the CPU, but they must "talk" a lot as the average FPS is notably lower at 1440p Medium, like ~53fps in the gaming PC to ~35 fps in the NUC+eGPU.  I did not expect this.

DiRT Rally - FPS a little lower at 1440p Ultra yet still high enough not to be noticeable, feels the same. No pop-in problems, though the FPS in the first 10 seconds of a race _are_ notably lower (texture load-ins?), but no dropped frames, just noticeably lower overall FPS at that time, which is alleviated in about 10 sec.

Of course I haven't actually *played* SotTR yet as I'm still working on RotTR but I haven't played that on the NUC+eGPU yet.  Maybe tonight.


----------



## Kissamies (Feb 13, 2021)

Highest but RT shadows on @ medium


----------



## Fizban (Feb 14, 2021)

11% GPU bound, nice cpu benchmark, lul


----------



## harm9963 (Feb 14, 2021)

1080Ti still has life 
My 5950X elevated my 1080Ti for sure !


----------



## Kissamies (Feb 14, 2021)

harm9963 said:


> *1080Ti *still has life


Exactly.


----------



## Fizban (Feb 14, 2021)

The 1080 Ti's still a solid GPU, but the CPU is doing a lot of work there; the game is usually very CPU-bound.


----------



## Kissamies (Feb 14, 2021)

Fizban said:


> 1080 TI's a solid gpu still, but the cpu is doing a lot of work there, the game is very cpu-bound usually.


I feel pretty balanced with an AMD 3600


----------



## Fizban (Feb 14, 2021)

Chloe Price said:


> I feel pretty balanced with an AMD 3600


My i7 holds back my laptop's GPU hard. I'd have to compare at 1440p or 4K to beat his 1080 Ti as a result.


----------



## Kissamies (Feb 14, 2021)

Fizban said:


> My i7 holds back my laptops GPU hard. I'd have to compare at 1440P or 4K to beat his 1080 TI as a result.


Anyway, best purchase ever, as I paid 300 EUR for this last week


----------



## thesmokingman (Feb 14, 2021)

Just ran this on a G9, stock with maxed power.

1080



5120


----------



## QuietBob (Feb 17, 2021)

Haven't seen a 3300X here, so here ya go. DX 12 @ 1080p, highest preset:



Yes, I'm aware of the GPU bottleneck  This is a 9-year-old GPU. I'd still say the game is playable at around 30 fps.
But check out that little beast of a CPU with 4c/8t. Even an RTX 2080 wouldn't be bottlenecked at these settings.

EDIT: Looking at the scores above, it could also easily drive an RTX3090 at 4K


----------



## Det0x (Feb 17, 2021)

5950x vs "CPU Game" 





I used this benchmark as a pure CPU+memory test, hence the "medium" settings, but it doesn't matter for the CPU Game score, which is what I was after.
The 3090 is running stock BIOS and everyday clocks.


----------



## mouacyk (Feb 17, 2021)

Det0x said:


> Think this is the fastest AMD "CPU Game" numbers in this thread (?)
> 
> Used this benchmark as a purely cpu+memory test, hence the "medium" settings, but it dont matter for CPU game score which i was after
> 3090 is running stock bios and everyday clocks.


Would be more meaningful comparing apples to apples.


----------



## Lew Zealand (Feb 19, 2021)

Awright, in honor of Nvidia re-releasing the GTX 1050 Ti so you only need to pay $150 over market value for a scalped GPU, I give you this barn burner:





Intel NUC8i5 Core i5-8259U @3.6GHz ACT 4c8t, 2x8GB 2400 RAM, eGPU w/ASUS Phoenix 1050 Ti (75W slot power only) OC +165cores, +200memory

Obviously useless at these settings, so I tried at Low (not Lowest- no shadows there and looks awful) settings, which look... OK:





That's probably playable...ish though 40fps lows are a little rough.

I'll pop it in the 8400 to see if full PCI bandwidth and latency will help for more normal folks out there.


----------



## QuietBob (Feb 19, 2021)

Just a little update. I've noticed that AA wasn't enabled for my previous test, so here's the same config with TAA:



The HD7970 is soundly beating the 1050Ti, with the latter starting at EUR 180 / USD 220 new here


----------



## Det0x (Feb 20, 2021)

New CPU+memory highscore 
261 fps average "CPU game"


----------



## mrthanhnguyen (Feb 25, 2021)

So this thread has come back to everyone using custom settings again.


----------



## Det0x (Feb 25, 2021)

LOL, I was benching the wrong version of the game... The free version on Steam was v505 (patch 15), which was missing 4 performance updates.
Version 298 = patch 19

This is more like it :innocent:
CPU Game = average 282 fps


----------



## thesmokingman (Feb 25, 2021)

All these custom settings are pointless.


----------



## Taraquin (Feb 26, 2021)

Det0x said:


> LOL i was benching the wrong version of the game.. Free version on steam was v505 (patch 15) which was missing 4 performance updates
> Version 298 = patch 19
> 
> This is more like it :innocent:
> ...


Could you please rerun at 1080p @ highest to get an apples-to-apples comparison with the others?


----------



## Det0x (Feb 26, 2021)

Taraquin said:


> Could you please rerun at 1080p@highest to get an apple to apple with the others?


I use this as a memory+CPU benchmark.
It doesn't matter if you run 1080p @ medium or 4K @ highest; you get the same "CPU Game" score either way (within run-to-run variance).

Reference for a highly optimized Intel system, 1440p vs 2160p: https://www.techpowerup.com/forums/...-raider-benchmark.250511/page-11#post-4450321
1080p vs 5K: https://www.techpowerup.com/forums/...-raider-benchmark.250511/page-11#post-4457340
1080p vs 1440p vs 2160p: https://www.techpowerup.com/forums/threads/shadow-of-the-tomb-raider-benchmark.250511/page-10

The reason I use exactly 1080p medium is that those were the settings for this thread: https://www.diskusjon.no/topic/1888561-shadow-of-the-tomb-raider-benchmark-tråden/

But like I said above, it makes no difference for the "CPU Game" score.

The only interesting part of this benchmark is the memory (subsystem) scaling, as few other games/benchmarks show it as clearly as this one (and it's a fair comparison for both AMD and Intel CPUs).
If I wanted to bench something 90%+ GPU bound, I would benchmark my 3090 in 3DMark, not Shadow of the Tomb Raider, lol.
But it seems like some people didn't like the new CPU Game highscores. (?) Tough luck, but it's you guys that don't understand how to use this benchmark and are doing it wrong.


----------



## Felix123BU (Feb 26, 2021)

Det0x said:


> LOL i was benching the wrong version of the game.. Free version on steam was v505 (patch 15) which was missing 4 performance updates
> Version 298 = patch 19
> 
> This is more like it :innocent:
> ...


I was like goood damn, 282, but then I saw Medium settings


----------



## Taraquin (Feb 26, 2021)

Det0x said:


> I use this as a memory+cpu benchmark..
> Dont matter if you run 1080p@medium or 4k@highest, you get the same "CPU Game" score either way (within run to run variance)
> 
> Reference for a optimized intel system: https://www.techpowerup.com/forums/...-raider-benchmark.250511/page-11#post-4450321
> ...


Yes, but this thread is not about the CPU only, it is about the GPU as well. Many here want to see GPU scores. Could you rerun so we can see what actual fps you get at the highest settings?


----------



## Det0x (Feb 26, 2021)

Taraquin said:


> Yes, but this tread is not about CPU only, it is about GPU aswell  Many here want to see GPU-scores. Could you rerun so we can see what actual fps you get with highest setting?


Judging from previous comments in this thread, you seem to be one of the few who actually know what they are talking about, and since you asked fairly, I will do a run with graphics on highest later today or tomorrow.


----------



## Vendor (Feb 26, 2021)

Not bad for my 750 Ti. Remember, it's 1080p low, but I still think my GPU did fairly well.


----------



## Kissamies (Feb 26, 2021)

Vendor said:


> not bad for my 750 ti but remember it's 1080p low but i still think my gpu did fairly well
> View attachment 190078


750 Ti is still surprisingly fine.


----------



## Vendor (Feb 26, 2021)

Chloe Price said:


> 750 Ti is still surprisingly fine.


Yeah, true, but it was actually 1080p lowest*, not low; I made a slight mistake there.


----------



## Kissamies (Feb 26, 2021)

Vendor said:


> yeah true but it was actually 1080p lowest* and now low, i made a slight mistake there


I suppose it's playable at lowest at 720p, so it still manages.


----------



## Vendor (Feb 26, 2021)

Chloe Price said:


> I suppose that it's playable at lowest with 720p so it still manages.


40 fps is more than enough for me, and I even fancy a stable 30 fps because of my GPU.


----------



## Kissamies (Feb 26, 2021)

Vendor said:


> 40fps is more than enough for me and i even fancy stable 30fps because of my gpu


I understand; it's not such a fast-paced game, so it's playable.


----------



## Det0x (Feb 26, 2021)

Scrubs



*edit*
Forgot to include "framecounter", so other run


----------



## Taraquin (Feb 27, 2021)

Det0x said:


> Scrubs
> View attachment 190106
> *edit*
> Forgot to include "framecounter", so other run
> View attachment 190107


Impressive both CPU and GPU-wise. Any OC on GPU? Interesting to see that you are almost 50% GPU-bound with the fastest gaming CPU available!


----------



## Det0x (Feb 27, 2021)

Taraquin said:


> Impressive both CPU and GPU-wise. Any OC on GPU? Interesting to see that you are almost 50% GPU-bound with the fastest gaming CPU available!


The GPU is an MSI Suprim X on stock cooling and stock BIOS (max 450 W), running an OC @ +135 core and +1250 mem.

But like I said earlier, the interesting thing about this bench is the CPU + memory scaling, not GPU performance.

Oh, and where have all these naysayers gone? They seemed very interested in fruit, especially apples. Did they learn anything in regard to "CPU Game"?


> Dont matter if you run 1080p@medium or 4k@highest, you get the same "CPU Game" score either way (within run to run variance)


----------



## Lindatje (Feb 27, 2021)

With a small adjustment to the memory, the CPU Game score goes up. tFAW from 60 to 24, tRC from 87 to 50, tRFC from 350 to 304, tWR from 26 to 14.

And the GPU-Bound figure is strange: from 67% to 98%...


----------



## Det0x (Feb 27, 2021)

Lindatje said:


> With a small adjustment to the memory, the CPU game goes up. tFAW from 60 to 24. tRC from 87 to 50. tRFC from 350 to 304. tWR from 26 to 14.
> 
> *And the GPU-Bound is strange, from 67% to 98%....*


Not really; a faster CPU Game score means you are more GPU bound.
The CPU has to wait for the GPU before it can draw the next frame, so the GPU becomes more and more the bottleneck with a faster CPU and/or higher resolution.
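A minimal sketch of the idea, with hypothetical fps numbers (this is a simplified model, not how the benchmark literally computes its GPU-bound figure): each frame needs both CPU and GPU work, so the delivered rate is capped by the slower side, and raising the CPU's potential fps pushes the limit onto the GPU.

```python
def delivered_fps(cpu_fps, gpu_fps):
    """Toy bottleneck model: the delivered frame rate is capped
    by the slower of the CPU and GPU pipelines."""
    return min(cpu_fps, gpu_fps)

gpu = 180                      # what the GPU could render on its own
slow_cpu, fast_cpu = 150, 260  # CPU Game potential before/after RAM tuning

print(delivered_fps(slow_cpu, gpu))  # 150: the CPU is the limit (low GPU-bound %)
print(delivered_fps(fast_cpu, gpu))  # 180: now the GPU is the limit (high GPU-bound %)
```

With the slow CPU the GPU finishes early and waits, so the run reports low GPU-bound; speed up the CPU past the GPU's rate and the roles flip, which is exactly the 67% to 98% jump seen after the memory tweaks.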


----------



## Taraquin (Feb 27, 2021)

Det0x said:


> GPU is MSI Suprim X on stock cooling and stock bios (max450w) running a OC @ +135 core and +1250 mem.
> 
> But like i said earlier, the interesting thing about this bench is the CPU+ memory scaling, not GPU performance..
> 
> Oh and where have all these naysayers gone ? They seemed very interested in fruit, especially apples ? Did they learn anything in regards "CPU Game" ?


I totally agree that CPU+mem scaling is interesting. However, many here also want to compare GPU performance and want input on whether the CPU is limiting perf or not. That is why someone said we should test at 1080p highest to compare perf.


----------



## Det0x (Mar 6, 2021)

Any 11700k's here?


----------



## Felix123BU (Mar 6, 2021)

Det0x said:


> Any 11700k's here?


Don't think so, all the 11700k's went into hiding out of shame for the leaked samples


----------



## Det0x (Mar 6, 2021)

Felix123BU said:


> Don't think so, all the 11700k's went into hiding out of shame for the leaked samples


Nizzen from overclockers.net is sharing some numbers on this Norwegian forum (where I'm also from):





Shadow of the tomb raider benchmark-tråden (www.diskusjon.no; English version via translate.google.com)


----------



## Felix123BU (Mar 6, 2021)

Det0x said:


> Nizzen from overclockers.net is sharing some numbers on this norwegian forum (where iam also from)
> 
> 
> 
> ...


 poor Intel, they really can't catch a break. Would have been nice to see a bench at Highest with TAA from that 11700K, not at Medium with no AA.


----------



## IchigoQc (Mar 6, 2021)

Intel 10700K at stock on a B460 chipset, so memory is locked @ 2933MHz (I know I can't overclock with B460, will upgrade later... it's been more than 8 years since I built a PC, OC was a standard feature back then ). MSI RTX 2060 Ventus XS OC with an AIO... the stock cooler was horrible, I couldn't bear the sound of it.


----------



## jboydgolfer (Mar 7, 2021)

I need to finish upgrading my PC. It's bad when your GPU has nearly twice as much memory as your entire PC.


----------



## Taraquin (Mar 7, 2021)

jboydgolfer said:


> i need to finish upgrading my PC. its bad when your GPU has nearly twice as much memory as your entire PC.
> 
> View attachment 191295


Your 8600K seems to be underperforming. Have you tweaked ram/oc cpu?


----------



## Felix123BU (Mar 7, 2021)

Taraquin said:


> Your 8600K seems to be underperforming. Have you tweaked ram/oc cpu?


That, and this game reaaaally liking moar cores. I had a 20 FPS bump going from a 5600X to a 5800X, and a friend with a 5900X, the same GPU, and equivalent RAM settings gets 10 FPS more than my 8-core.


----------



## harm9963 (Mar 7, 2021)

Felix123BU said:


> That and this game reaaaally liking moar cores, I had a 20 FPS bum from going from a 5600X to a 5800X, and a friend with a 5900X and the same GPU and equivalent ram settings gets 10 FPS more than my 8 core


I had a 2700X at 122, then a 5800X at 132, and now a 5950X at 140.


----------



## Taraquin (Mar 8, 2021)

harm9963 said:


> I had 2700X _ 122  then a 5800X _132 and now 5950X _ 140


Look at the avg on CPU-game. That is the important one that shows how many fps your CPU is producing. Going from the stock 3000cl16-ram xmp on my Ryzen 3600 to 3733cl15 with dram calc I got 163fps. That is a 25% better CPU-score from ram-tweaking alone.
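As a sanity check, the uplift arithmetic works out (note the ~130 fps baseline is only implied by the 25% figure, not stated outright in the post):

```python
# Quick check of the claimed ~25% CPU Game uplift from RAM tuning.
# baseline_fps is an assumption inferred from the post, not a measured value.
baseline_fps = 130   # approximate score with stock 3000CL16 XMP
tweaked_fps = 163    # reported score with 3733CL15 from DRAM Calculator

uplift = (tweaked_fps - baseline_fps) / baseline_fps
print(f"{uplift:.0%}")  # -> 25%
```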


----------



## Splinterdog (Mar 8, 2021)

Splinterdog said:


> First one with FreeSync off
> View attachment 126987
> 
> And FreeSync on - 2 fps gained
> View attachment 126988


These are a big improvement over the 2600X/RX580 with both being at 1080p/Ultra/DX12.


----------



## Taraquin (Mar 9, 2021)

Splinterdog said:


> These are a big improvement over the 2600X/RX580 with both being at 1080p/Ultra/DX12.
> View attachment 191585
> 
> View attachment 191586


Done anything with your ram? Your cpu score is lower than what I get with my Ryzen 3600. With modest ram tweaking you should get 200+ fps on cpu-game  Tried dram calc?


----------



## Splinterdog (Mar 9, 2021)

Taraquin said:


> Done anything with your ram? Your cpu score is lower than what I get with my Ryzen 3600. With modest ram tweaking you should get 200+ fps on cpu-game  Tried dram calc?


With respect, you're comparing apples to oranges because your 3060Ti  GPU is much more powerful than mine.
And no, I haven't overclocked the RAM from its default 2400MHz just yet.


----------



## Taraquin (Mar 9, 2021)

Splinterdog said:


> With respect, you're comparing apples to oranges because your 3060Ti  GPU is much more powerful than mine.
> And no, I haven't overclocked the RAM from its default 2400Mhz just yet.


No, I had the 5700XT before. Cpu game avg score is the same with 5700XT and 3060ti. It shows how many fps the cpu generates, gpu game shows how many fps the gpu produces. Just a tip if you want to improve cpu performance  Dram calc 3733 fast preset raised my score from 130 to over 160 fps.


----------



## Splinterdog (Mar 9, 2021)

Taraquin said:


> No, I had the 5700XT before. Cpu game avg score is the same with 5700XT and 3060ti. It shows how many fps the cpu generates, gpu game shows how many fps the gpu produces. Just a tip if you want to improve cpu performance  Dram calc 3733 fast preset raised my score from 130 to over 160 fps.


Thanks for your advice, but for the moment I'm more than happy with the performance


----------



## INSTG8R (Mar 9, 2021)

Splinterdog said:


> With respect, you're comparing apples to oranges because your 3060Ti  GPU is much more powerful than mine.
> And no, I haven't overclocked the RAM from its default 2400Mhz just yet.


I have a fairly identical setup but not the game; it's coming to Game Pass and I've wanted to run the bench, so I'll definitely post results for comparison when it's available. But you should definitely give your RAM a kick regardless. It may not show better FPS, but you're holding your CPU back at such low speeds. I hope your Time Spy score is old, because mine is double that.


----------



## Det0x (Apr 3, 2021)

Det0x said:


> Scrubs
> View attachment 190106
> *edit*
> Forgot to include "framecounter", so other run
> View attachment 190107


Testrun of my new everyday 24/7 settings: 
273 fps average CPU Game




PS: Nvidia ReBAR lowers the CPU Game numbers by ~10-15 fps; I recommend turning it off.


----------



## mrthanhnguyen (Apr 3, 2021)

Det0x said:


> Testrun of my new everyday 24/7 settings:
> 273 fps average CPU Game
> View attachment 195061
> View attachment 195060
> PS: Nvidia Rebar lowers the CPU game numbers by ~10-15 fps, recommend to turn it off


highest setting.


----------



## Det0x (Apr 3, 2021)

mrthanhnguyen said:


> highest setting.


This again? Why?
If you want highest setting you can look at this screenshot instead:





> Doesn't matter whether you run 1080p@medium or 4k@highest, you get the same "CPU Game" score either way (within run-to-run variance)
> 
> Reference for a highly optimized intel system 1440p vs 2160: https://www.techpowerup.com/forums/...-raider-benchmark.250511/page-11#post-4450321
> 1080p vs 5k: https://www.techpowerup.com/forums/...-raider-benchmark.250511/page-11#post-4457340
> ...


----------



## Taraquin (Apr 3, 2021)

Det0x said:


> This again ? Why ?
> If you want highest setting you can look at this screenshot instead:View attachment 195079


It was decided by the starter of the thread, StefanM, for unknown reasons. I think many here are more interested in the real avg fps than CPU Game, which they don't understand. You, me and a few others prefer the CPU Game stats to see how the CPU performs, but I guess we are a minority.


----------



## Felix123BU (Apr 3, 2021)

Det0x said:


> This again ? Why ?
> If you want highest setting you can look at this screenshot instead:View attachment 195079


Why? Because this thread is about 1080 Highest, because everyone followed this from the beginning, and by throwing in random settings you dilute the results.

Now, I can very much agree that this game is an oddity, the moar cores you throw at it, the better your FPS will be, each extra 2 cores will give extra frames, going up to 16/32, and in that regard, the CPU game is sort of more interesting. But that should be the subject of another thread.


----------



## Det0x (Apr 3, 2021)

If you want to bench the GPU, use 3DMark.
The rest of the world uses this benchmark as a CPU/memory bench, not a 99% GPU-limited bench.

But by all means, have at it




If threadstarter StefanM wanted to keep a leaderboard on page 1, I could see the need to conform to a single setting, but as it stands now, 90% of the results here are from unoptimized systems, from people who don't know how to tweak a system.



> "Now, I can very much agree that this game is an oddity, the moar cores you throw at it, the better your FPS will be, each extra 2 cores will give extra frames, going up to 16/32, and in that regard, the CPU game is sort of more interesting. But that should be the subject of another thread.


No, this game doesn't scale with cores/threads past a certain point. 250+ fps CPU Game is not hard with an 8-core Zen 3 CPU (5800X).

The Russian SotTR bench thread uses 720p lowest, the Scandinavian one uses 1080p medium, and overclockers.net uses whatever.


----------



## Felix123BU (Apr 3, 2021)

Det0x said:


> If you want to bench the GPU, use 3dmark.
> The rest of the world use this benchmark as a cpu/memeory bench, not a 99% limited GPU bench.
> 
> But by all means, have at it
> ...


Dude, you are being an ass for no good reason. We all have a gazillion opinions on a variety of topics. This thread was meant for 1080 Highest. If you don't like it, or don't agree with it, start another thread where you can show the whole world that you are right and everybody else is wrong. Or post whatever you like without being disrespectful to the other members of this thread.


----------



## Det0x (Apr 3, 2021)

Felix123BU said:


> Dude, you are being an ass for no good reason. We all have a gazillion opinions on a variety of topics. This thread was meant for 1080 Highest. If you don't like it, or don't agree with it, start another thread where you can show the whole world that you are right and everybody else is wrong. Or post whatever you like without being disrespectful to the other members of this thread.


Maybe you think I'm being disrespectful, but could that have something to do with people commenting on things they don't understand and don't know what they're talking about?


mouacyk said:


> Would be more meaningful comparing apples to apples.





thesmokingman said:


> All these custom settings are pointless.





Felix123BU said:


> I was like goood damn, 282, but then I saw Medium settings


Seems to me some of you guys didn't know that "CPU Game" is not affected by quality, resolution, or render scale?

And that's why this game is a popular benchmark for CPU/memory: it scales very well with tweaking of memory timings and latency. Graphics settings don't matter here. But of course, if you want to brag about your e-peen 3090, which requires no knowledge to get a high GPU score with, and maybe keep your 1440p highscore, then the graphics score is the most important and only highest settings should be used, I guess. But tough luck, it didn't make it. I simply don't find it interesting to benchmark graphics cards in SotTR.

And I think I have been sharing useful information about how the different game versions affect scores (free version vs paid version) and how ReBAR (resizable BAR) affects scoring, but if you think I'm an ass, then so be it. I mostly think that has something to do with how you comment on others first; look at Taraquin as an example if you want a better tone.


----------



## Felix123BU (Apr 3, 2021)

Det0x said:


> Maybe you think I'm being disrespectful, but could that have something to do with people commenting on things they dont understand/know what they are talking about ?
> 
> 
> 
> ...


Again, "Maybe you think I'm being disrespectful, but could that have something to do with people commenting on things they don't understand/know what they are talking about ?" is disrespectful.
This thread is and was from its beginning comparing results of 1080 Highest in SOTR, not debating the influences of the human mind on the universe, or alternatively CPU scores in SOTR. Do you understand?

And yeah, CPU score and subsequently CPU performance are hugely important for higher FPS in this game, nobody I think is debating that, and it might help some, but that does not change the fact that this thread was created for comparing performance at a certain setting, namely once again, 1080 Highest. There have been enough posts with 1080 Highest and a combo of 2K, 4K, and people also discussing the influence of CPU ingame, of Rebar or Ram tuning.

Coming in and shitting on everybody because they don't see your wisdom, even though being politely reminded on several occasions that here people compare 1080 Highest, is highly disrespectful and rude.

Nobody would have had anything against you posting helpful stuff, me included, but when your messages read like "listen dumb posters here, I have the ultimate knowledge, you are all dumb sheep that don't get what shit is about, I am the best thing that happened to your lives" you sound disrespectful towards all who posted in this tread and somehow managed to select settings for 1080 Highest, post the results, and have conversations about it. Again, nobody said you can not chat about anything game related, but a minimum show of respect would have been starting your rant with a post of 1080 Highest, then crapping on the world for not thinking like you and trying to hijack the thread.


----------



## Det0x (Apr 3, 2021)

No point in discussing this further; like with many millennials, feelings are more important than facts.
And I won't stoop to that level with name-calling and/or telling others what they should or shouldn't do.

In my view we are not sheep, and as such we should not follow just because it feels comforting when faced with a better option.
Anyway, this is the fastest Rocket Lake I've seen to date:
250 fps average CPU Game







> 5250mhz core
> 5000mhz singlerank memory


*edit typo*


----------



## Felix123BU (Apr 3, 2021)

Det0x said:


> No point to discuss this further, like with many millenniums, feelings are more important than facts.
> And i won't stoop to that level with namecalling and/or telling others what they should do.
> 
> In my view we are not sheep and such should not follow just because it feels comforting, when faced with a better option..
> ...


This is not about feelings but respect towards the posters in this thread. Using the "millenniums" argument, which I suppose is meant to be millennials, is just pure lame.
But hey, feel free to continue trolling with "facts" while disregarding the fact that this is a 1080 Highest thread, you non-millennial you


----------



## DemonicRyzen666 (Apr 3, 2021)

Det0x said:


> In my view we are not sheep and such should not follow just because it feels comforting, when faced with a better option..


This quote just makes me think about how everyone reads reviews.
I'm wondering why no one has tested mGPU on this benchmark?
It does support it, does it not? ¯\_(ツ)_/¯


----------



## Felix123BU (Apr 3, 2021)

DemonicRyzen666 said:


> This quote just makes me think about how everyone reads reviews.
> I'm wondering no one has test mGPU on this benchmark ?
> It does support it does it not ? ¯\_(ツ)_/¯


In a way it supports it, in another it does not. It supports it in the sense that you can always use "better" settings to get "better" results. Where it does not is that the point of a meaningful review is to test everything at a set-in-stone standard, i.e. the same settings across the board, else the results have no meaning.
That's the point some don't get or ignore: this thread was meant for people comparing FPS at a very specific setting. The moment you change settings for whatever reason, all the comparisons done here since last year lose their meaning and validity.
Not that having a wide variety of hardware configurations here gives a super accurate picture, but it's still interesting to see how stuff stacks up in the grand scheme of things. 


This is the church of "1080 Highest", all who transgress and wish to receive absolution must post a "1080 Highest" confession and all their sins shall be forgiven


----------



## Felix123BU (Apr 3, 2021)

Det0x said:


> @ Felix123BU
> 
> How about this then, you maintain the GPU highest standing, and i maintain the CPU standing.
> Or maybe its much more fun to argue in the thread instead of actually contributing ?
> ...


I see you cannot let go of the CPU obsession   
But damn, it must have taken you some time to extract those results. See, arguing sometimes gets interesting results 
How about you open a thread for using this game as a CPU bench, and I can chip in: we settle on a fixed custom resolution (the lowest possible, to show off the CPU), I follow that, you track the CPU scores. This way we leave this thread to what it was meant for from the start, and you are happy, I am happy, the universe is a bit happier. I mean it, I am up for it as much as time allows me.
There could be some interesting discussion going on there.


----------



## Taraquin (Apr 3, 2021)

I can contribute a higher 5600X score, Medium though, and with my 3060 Ti at mining UV/UC. 4.8GHz, 4000CL16 B-die tuned. 



Det0x: Good job compiling! Think I recognize you from diskusjon.no 

Somewhat interesting that a 5600X comes close to matching a 10900K @ 5.5GHz, with 4 extra cores and ~4800 RAM. AMD has come a long way since Ryzen gen 1.


----------



## Felix123BU (Apr 3, 2021)

Look @Det0x , and all who might be interested in this topic, CPU powah in SOTR benchmarks!

Shadow Of The Tomb Raider - CPU Performance and general game benchmark discussions | TechPowerUp Forums

Lets not spam this thread anymore with off-topics, shall we?


----------



## nguyen (Apr 4, 2021)

Now this is getting old @Det0x. If the CPU results don't change between Medium and Highest, you should have posted Highest in the first place. It seems your ego got hurt when others just kindly reminded you.
Now you are just spamming the thread with results nobody but you cares about?
What else? Post some synthetic CPU benchmark to show off your 5950X?


----------



## Felix123BU (Apr 4, 2021)

nguyen said:


> Now this is getting old @Det0x , if the CPU results don't change between Medium and Highest, you should have post Highest in the first place. It seems your ego got hurt when others just kindly reminded you.
> Now you are just spamming the thread with results nobody cares but you?
> What else? post some synthetic CPU benchmark to show off your 5950X?


"What else? post some synthetic CPU benchmark to show off your 5950X? "  ..... don't goad him, he just might  because as he said "The rest of the world use this benchmark as a cpu/memeory bench" 
Staying on topic and respecting others is for conformists and millennials (not that I see what millennials have to do with staying on topic, but hey, if @Det0x says it, must be true)


----------



## the54thvoid (Apr 4, 2021)

Okay folks, no more name-calling, distractions or thread derailing.

Another thread has been created by @Felix123BU for CPU comparisons.









Shadow Of The Tomb Raider - CPU Performance and general game benchmark discussions (www.techpowerup.com)




As far as this thread is concerned, it's a mess. So many variables it doesn't seem to have much of a point. I'll keep it open for now as it's quite active but it'd be nice to see some structured guidelines and folk following them.

Also - bench scores taken from other forums don't count. TPU member scores only.


----------



## nguyen (Apr 4, 2021)

10875H + 2070 Super MaxQ (90W)




Seems like my laptop performs really close to newer high-end laptops with a 5900HX + 3080 (115-130W).


----------



## Space Lynx (Apr 4, 2021)

This is stock GPU/stock CPU and SAM off in BIOS. Probably could hit 200 fps avg if I did mild OCs and turned on SAM. Not sure.


----------



## Taraquin (Apr 4, 2021)

BTW: CPU game score is a bit lower running highest vs lowest. I get 20 fps more running lowest consistently 



Det0x said:


> In my testing with Zen3 + Ampere SAM lowers the cpu game performance by ~ 10-15 fps.
> Would be interesting if you could check if the same was true for Zen3 + Navi21


Since SAM loads CPU more it's not a surprise. I think the results on both Ampere and RDNA2 will be similar.


----------



## Det0x (Apr 4, 2021)

Taraquin said:


> BTW: CPU game score is a bit lower running highest vs lowest. I get 20 fps more running lowest consistently
> 
> 
> Since SAM loads CPU more it's not a surprise. I think the results on both Ampere and RDNA2 will be similar.


*edit*
You are correct


----------



## Taraquin (Apr 4, 2021)

Maybe due to the 5950X having more processing capacity? My 3600 experienced the same. I was able to get 160 CPU Game at medium 1080p, but at 720p lowest I could get 180 fps.


----------



## jboydgolfer (Apr 4, 2021)

Taraquin said:


> Your 8600K seems to be underperforming. Have you tweaked ram/oc cpu?


No, I wasn't running any OC on that CPU. I had RAM and CPU/GPU at stock speeds.


----------



## Taraquin (Apr 4, 2021)

Det0x said:


> You are correct and i am mistaken.
> Lowest quality do affects CPU Game numbers
> 
> Used the following settings to test this:
> ...


Interesting! Highest would be more of a GPU test, while lowest should be used for a CPU test. Medium is kinda pointless, as it pushes neither very hard.


----------



## jboydgolfer (Apr 4, 2021)

Det0x said:


> No point to discuss this further, like with many millennials, feelings are more important than facts.
> And i won't stoop to that level with namecalling and/or telling others what they should or shouldn't do.
> 
> In my view we are not sheep and such should not follow just because it feels comforting, when faced with a better option..
> ...


I was trying to figure out why this dude's score was so high. I didn't realize he was using whatever settings got the highest results. 

I was asking myself: how, with that hardware, is his score so high?
There's no point in just using whatever settings give a favorable result, as it renders the standardized benchmark moot.

Edit:
Many of these results are using non-standard settings. Pointless.



> Settings: DX12, 1080p, ultra high


----------



## thesmokingman (Apr 4, 2021)

jboydgolfer said:


> I was trying to figure out why this dude score was so high I didn’t realize he was  using whatever settings got the highest results.
> 
> i was asking myself, how with that hardware, is his score so high?
> theres no point in just using whatever settings give favorable result, as it renders the standardized benchmark moot.
> ...


And lol he apparently didn't even know the difference and back to spamming again.


----------



## Felix123BU (Apr 4, 2021)

@thesmokingman
The rules of the internet say that at least one special person must pop up from time to time. They go by many names: trolls, shills, fools, etc. But this guy was right on one point: it's not really mine or another user's right to moderate this, that is the role of a moderator. If this crap continues, this thread will die. Classic thread hijacking if I've ever seen one


----------



## Lew Zealand (Apr 12, 2021)

Hey, let's SotTR bench something stupid because it's what I got.  Otherwise known as...

A shoddy bomb casing full of used pinball machine parts:





R3 1200 OC to 3.7GHz (@1.3v), 5600XT, 8GB 2800 MHz CL16 DDR4

I had most of the parts laying around so bought an ultra-cheap case, decent PS and B450 and Bob's Your Uncle, you have a CPU-bound gaming machine waiting for AMD CPUs to become affordable in 2023.  Frequent frame drops to the mid 20s w/CPU pegged @100% when playing Horizon ZD @1440p Prefer Quality (High equiv.).

Living the dream.


----------



## Felix123BU (Apr 12, 2021)

Lew Zealand said:


> Hey, let's SotTR bench something stupid because it's what I got.  Otherwise known as...
> 
> A shoddy bomb casing full of used pinball machine parts:
> 
> ...


Not bad, at least in regards to SOTTR. Hey, at least you have a decent GPU and a B450, so lots of options for a CPU upgrade on the table in the near future that would drastically change your performance.


----------



## QuietBob (Apr 13, 2021)

Lew Zealand said:


> R3 1200 OC to 3.7GHz (@1.3v), 5600XT, 8GB 2800 MHz CL16 DDR4


Compared with my result of a 3300X @ 4.5 all core there's barely any difference in the "CPU game" metric. Seems odd.


----------



## Lew Zealand (Apr 13, 2021)

Duh, I didn't look at your numbers before typing below, but my suspicion of funny business in my benchmark run stands.  Your CPU minimums are way higher than mine at 84 FPS where mine is 42.  However I'm not sure I'm comparing the correct thing, is *CPU Game* the only thing that matters?  And if so, what's the point of *CPU Render*? I really dunno what these 2 mean.

Honestly, while watching the benchmark, which I've done on 6 or 7 different setups, it looked like it was doing something wrong early in the third stage, before the camera descends into the village and the NPC count tanks the FPS thanks to 4c4t. But that would only affect the top-end FPS, not the minimum FPS or averages much, which are visible as high frametimes early and late in the overall benchmark. I'll run it again a bunch of times and see if I get something drastically different.

BTW I just came out of playing about an hour of RiseotTR and this setup was actually doing rather better than I'd expected at 1440p Very High (like Highest in SotTR) but Textures one step down to High (65fps with dips into the 40s).  It's very clear that there's a performance bottleneck with this setup loading or streaming the max textures as FPS is _notably _smoother with those down 1 notch.  Lots of dips to the 20s with Very High textures.  VRAM usage hovers around 5800MB with Very High textures and 3100MB with High textures.
_______________

Update:

@QuietBob gets the Brass Ring.

There are a number of missing NPCs in the Market section right at the end of the SotTR benchmark when I test it with the R3 1200 and 5600 XT. I've run this benchmark on a number of other systems and they are always there, even on a 4c8t NUC with the same GPU in a TB3 eGPU case, but here they are missing. Here's who's apparently in the loo during the benchmark:

* Lady on her knees in the central stall who usually sits up on her knees to greet a passerby.  The passerby is there doing something else.
* All 3-4 people usually at the end stall near Lara: the kid and his dad, the butcher, and sometimes there's someone else next to him?  I forget if there's a 4th person there.

Other notes: at least 2 other NPCs are doing something else, walking around the street and getting right in front of the camera, which you usually don't see in the benchmark.  Lara's creeping on looking the opposite way as there's nobody at the butcher's.

This is weird and could be an enforced thing (??) on very low end CPUs as it's repeatable on this one.  Ugh, I have a 2c4t NUC that I can eGPU with this to go even lower end but maybe I'll just kick one of the kids off their R5 2600s and knock their setup down to 2c4t in Ryzen Master and try it there.  It sucks being curious but too lazy to satisfy it. 

Maybe kills this as a good benchmark, I wonder if people with 10900Ks or 59x0s actually get _more _NPCs?


----------



## QuietBob (Apr 13, 2021)

Lew Zealand said:


> Duh, I didn't look at your numbers before typing below, but my suspicion of funny business in my benchmark run stands.  Your CPU minimums are way higher than mine at 84 FPS where mine is 42.  However I'm not sure I'm comparing the correct thing, is *CPU Game* the only thing that matters?  And if so, what's the point of *CPU Render*? I really dunno what these 2 mean.
> 
> Honestly, while watching the benchmark which I've done on 6 or 7 different setups, it looked like it was doing something wrong early in the third stage before the camera descends into the village and the NPC count tanks the FPS thanks to 4c4t.  But that would only affect the top end FPS, but not the minimum FPS or averages much, which are visible as high frametimes early and late in the overall benchmark.  I'll do it again or a bunch of times and see if I get something drastically different.
> 
> ...


Those are some interesting findings! I just ran the benchmark on a 4C/4T Phenom II. After multiple runs I can say for sure that the "missing" NPCs are there: the kneeling vendor, the butcher and the cook, and their two customers. I have noticed, though, that some of the walking customers will appear in a different place in subsequent runs.

Now I'm confused. Could it be that I'm running the demo (build 505) and you're benching with the full version (build 298)? 

The way I understand this benchmark, "CPU Game" measures how effectively the CPU executes general game logic (physics simulation, NPC AI, collision detection, sound propagation, and all the underlying game mechanics). "CPU Render" shows how fast the processor can feed the necessary data to the graphics card, and "GPU" indicates how fast the card can present the final image on the screen.

"GPU Bound" at 100% would mean that the GPU cannot process all the information it receives from the processor in real time, resulting in a "GPU bottleneck". Conversely, at 0% the GPU would have to wait to be fed by the processor most of the time, aka "CPU bottleneck". In both cases we would see lower than optimal frame rates. The bigger the discrepancy in processing power between the two components, the lower the ultimate FPS.


----------



## Lew Zealand (Apr 13, 2021)

Yes, my build is the current full game version from Steam, but here's the thing: it's a clone of the setup on my NUC, where the missing NPCs show up properly.

I just DDU'd both GPU drivers and uninstalled the other ones, and then installed the proper drivers and current version of Radeon Adrenaline for this setup.  Yeah, you're not supposed to do that kind of crap but the Steam d/l start and then replacement with the game folder doesn't work with Rocket League (always re-downloads dammit) and an occasional other game and I didn't want to get stuck killing my bandwidth.  No other issues I've seen so the clone seems OK.  Just this missing NPC weirdness in SotTR.

I'll see what I can find on the kid's machine but I'm gonna have to throw a stick or iPhone in the backyard to distract him.  Wish me luck!


----------



## Lew Zealand (Apr 20, 2021)

Why try to solve a quandary when instead you can scrape the bottom of the barrel by sticking in a slot-power only GPU into an office PC?

The latest 7nm Shortage Desperation Build™:






Dell Optiplex 9020, 280W stock PSU
Core i7-4790 (not K), tops out at about 64W at 100% CPU use
Asus Phoenix 1050 Ti (UV to 0.9V ~1775 MHz Cores OC, 7350 MHz Memory OC, uses ~61W)
Stock 8GB RAM 1600 MHz CL11

Does well enough I guess, better than a stick to the eye.  Another game: wrangled with it for quite a while in Horizon Zero Dawn and got OK performance and visuals with 1080p 80%, mostly Medium settings, Shadows Hi, AO off, Camera-based AA.  Still rummages in the 40s in busier scenes, hi 50s most other times. Weirdly AO doesn't seem to make a big visual difference in this game, may just be where I was in the map.


----------



## QuietBob (Apr 20, 2021)

Lew Zealand said:


> Why try to solve a quandary when instead you can scrape the bottom of the barrel by sticking in a slot-power only GPU into an office PC?


Oh, so that 1050Ti just needed more breathing room! Now it's on a par with my oc'd HD7970. I'd still call that a win for the AMD card, though, being nearly five years older. Wish I had been able to buy a better GPU before shit hit the fan, but now it looks like this old salt is here to stay


----------



## walker15130 (Apr 21, 2021)

All the gpu power I could buy in 2021.


----------



## Darmok N Jalad (May 18, 2021)

2 runs with an M1 Macbook Air. Seems to mostly stay above 30FPS with the second group of settings, but it actually didn't do that bad on what's close to 1080P highest. Not sure I'd try to play this game on the MBA, especially without any active cooling. I'm assuming this runs under Rosetta 2 as well.


----------



## Felix123BU (May 18, 2021)

Darmok N Jalad said:


> 2 runs with an M1 Macbook Air. Seems to mostly stay above 30FPS with the second group of settings, but it actually didn't do that bad on what's close to 1080P highest. Not sure I'd try to play this game on the MBA, especially without any active cooling. I'm assuming this runs under Rosetta 2 as well.
> View attachment 200727View attachment 200728


Apple, taking gaming forward to 20FPS    That raw power


----------



## Darmok N Jalad (May 18, 2021)

Felix123BU said:


> Apple, taking gaming forward to 20FPS   That raw power


What did you expect? I know it's not earth shattering, but let me know of another passively cooled chip powered by an IGP that could do any better.

EDIT: For a frame of reference, a few pages back, there's a system with an i5-8259U with 1050ti that scores close to the same.


----------



## Deleted member 202104 (May 18, 2021)

Darmok N Jalad said:


> What did you expect? I know it's not earth shattering, but let me know of another passively cooled chip powered by an IGP that could do any better.
> 
> EDIT: For a frame of reference, a few pages back, there's a system with an i5-8259U with 1050ti that scores similarly.



In addition, it's translating x86 to ARM at the same time (as you mentioned).


----------



## Felix123BU (May 18, 2021)

Darmok N Jalad said:


> What did you expect? I know it's not earth shattering, but let me know of another passively cooled chip powered by an IGP that could do any better.
> 
> EDIT: For a frame of reference, a few pages back, there's a system with an i5-8259U with 1050ti that scores close to the same.


I never expected much, but the hype that came with the M1 was sickening, so when I see these results I am very amused


----------



## Darmok N Jalad (May 18, 2021)

Felix123BU said:


> I never expected much, but the hype that came with the M1 was sickening, so when I see these results I am very amused


Again, I don’t know what you expected. It has no active cooling, and is running a DX12 ported game under emulation on 8GB of shared system RAM! It’s also the 7-core GPU variant, vs the 8-core one. What do we think this chip's thermal headroom is, 10W?


----------



## Felix123BU (May 18, 2021)

Darmok N Jalad said:


> Again, I don’t know what you expected. It has no active cooling, and is running on 8GB of shared system RAM, emulating a DX12 ported game! It’s also the 7GPU unit, vs 8. What do we think this chips thermal headroom is, 10W?


Again, I do not dispute that for what it is, it's a good chip, but going from that to all the "it's the best ever", "it's going to make x86 obsolete" and all that jazz, seeing such a puny result is funny to me. Might not be for you, but we are not all amused by the same things    And the 1050 Ti is an entry-level card from 2016 built on 14nm, so the comparison is not really something to be proud of for the M1. But for a chip with on-die graphics, by itself it's not that bad.

But putting aside all the comments on the topic of the M1, it's a really interesting addition to this pool of results: the first non-x86 result, and that in itself is kind of cool


----------



## thesmokingman (May 18, 2021)

Ya gotta stop taking marketing hyperbole personally lol.


----------



## Felix123BU (May 18, 2021)

thesmokingman said:


> Ya gotta stop taking marketing hyperbole personally lol.


I normally would, it's just that I have a couple of Apple fanatics in my circle of friends; they drive me crazy at every opportunity, and I refuse to see the light   
That does not mean I can't appreciate what the M1, as a start, could mean for the future. Right now it's an excellent chip for a low-powered device, and that's it.


----------



## Darmok N Jalad (May 18, 2021)

I get that not everyone likes Apple (I don’t exactly love them either), but M1 was a very strong showing, IMO. Intel helped Apple look better, with all the execution woes and Kaby Lake refreshes. Had Intel been executing all these years, M1 would still be just as workable, but maybe not as competitive. I have no complaints about general performance--it renders web pages as fast as anything (Firefox), and my biggest demands are photo editing (20MP RAW) and occasional video editing. The point really was just to run the benchmark on this chip, and to see if it was even possible/playable. I really don't play video games any more, so I couldn't care less how well it actually did, but it reminded me of playing on the last-gen consoles (PS4/Xbox One). Despite the FPS dips, it actually seemed fairly consistent and didn't look too bad. Again, I wouldn't want to play it on a passively cooled setup anyway. For all I know, it started throttling during the bench.



thesmokingman said:


> Ya gotta stop taking marketing hyperbole personally lol.


Yes, all the companies can hype up their stuff. Intel has been hard at it lately, where some of the first words their new CEO said were how they can't be getting beat by a lifestyle company, and then a month or two later, they started bashing Apple and getting creative with their Blue Slides. All these companies know the excitement that builds around their launches, so they will use that to their advantage. There are certainly some biased "review" sites out there for each camp, so you definitely want to go to the right sources for honest assessments. Anandtech isn't what it used to be, but they still keep it pretty scientific, and I found their assessment of the M1 to be spot on.


----------



## Felix123BU (May 18, 2021)

Darmok N Jalad said:


> I get that not everyone likes Apple (I don’t exactly love them either), but M1 was a very strong showing, IMO. Intel helped Apple look better, with all the execution woes and Kaby Lake refreshes. Had Intel been executing all these years, M1 would still be just as workable, but maybe not as competitive. I have no complaints about general performance--it renders web pages as fast as anything (Firefox), and my biggest demand is photo editing (20MP RAW) and occasional video editing. The point really was just to run the benchmark on this chip, and to see if it was even possible/playable. I really don't play video games any more, so I could care less how well it actually did, but it reminded me of playing on the last gen consoles (PS4/XboxOne). Despite the FPS dips, it actually seemed fairly consistent and didn't look too bad. Again, I wouldn't want to play it on a passively cooled setup anyway. For all I know, it started throttling during the bench.
> 
> 
> Yes, all the companies can hype up their stuff. Intel has been hard at it lately, where some of the first words their new CEO said was how they can't be getting beat by a lifestyle company, and then a month or two later, they start bashing Apple and getting creative with their Blue Slides. All these companies know the excitement that builds around their launches, so they will use that to their advantage. Their are certainly some biased "review" sites out there for each camp, so you definitely want to go to the right sources for honest assessments. Anandtech isn't what it used to be, but they still keep it pretty scientific, and I found their assessment of the M1 to be spot on.


I don't necessarily have an issue with Apple, more with the people who are fanboys of Apple, and generally any fanboys  
Yes, Intel "helped" them, Intel is 14++++++ for life, but that does not mean Apple only made a good chip because Intel is stuck in time. Apple made a good chip regardless of Intel, and I am partially impressed by it. I am more curious, though, what, let's say, AMD can do with 5nm compared to the M1, and what Intel can do with their 7nm, even if it might take two more years or more for them to get there.
Anyway, comparing different architectures is extremely tricky when it comes to performance, and adding a different software stack makes it even more complicated.

In general CPU tasks the M1 is quite strong; with the GPU involved, not so much it seems, and a "gaming" workload seems to bring the chip to its knees. And I don't mean only the GPU: looking at the CPU scores in that screenshot, they are quite low too. I wonder if that is mainly because of the "we use so little power we don't need a cooler" mantra, and the chip is energy-starved and temperature-limited, not a great combination in electronics.

If what you got is native code and that is the power of the chip, it does not bode well for the future of the M1 as a chip for the masses; if it's Rosetta-emulated, then the emulation is really bad. And I say this knowing all the benchmarks of the M1, it should not be anywhere near as weak in the CPU section... but again, different architecture and god knows what software running it.

Anyway, you said you don't plan to game on it, so this is a rhetorical discussion. Beyond that, if you are happy with your purchase, generally that is a good purchase 
And regarding hype, I am too old to be swayed by some fancy words, cute pictures, or paid promotional videos; if you don't want to be suckered into spending money on useless stuff, that's a trait one must acquire  And Intel's "lifestyle company" reference was just lame; you fight with your product, not by throwing a tantrum because the "lifestyle company" showed you how to better design a chip 

And the thing I should have said from the start: thank you for sharing the result, it's interesting to see and think about.


----------



## Prime2515102 (May 18, 2021)

What a creepy benchmark...

Anyway, the average FPS was 101 at 1920x1200 and 'GPU Bound' was 54%. I don't get that...


----------



## Lindatje (May 18, 2021)

Prime2515102 said:


> What a creepy benchmark...
> 
> Anyway, the average FPS was 101 at 1920x1200 and 'GPU Bound' was 54%. I don't get that...
> 
> View attachment 200814


Your CPU is a big bottleneck.


----------



## phanbuey (May 18, 2021)

Lindatje said:


> Your CPU is a big bottleneck.



54% GPU bound isn't that big of a bottleneck... seems to be pretty well balanced at 1080P.


----------



## Prime2515102 (May 18, 2021)

phanbuey said:


> 54% GPU bound isn't that big of a bottleneck... seems to be pretty well balanced at 1080P.


But if you look at the screenshot, it's only 39% at 1080p; it's 54% at 1920x1200. That seems like a huge difference given only a 3 FPS average increase from dropping to 1080p.


----------



## Lew Zealand (May 21, 2021)

Prime2515102 said:


> But if you look at the screenshot it's only 39% at 1080, it's 54% at 1200. Seems like a huge difference given only a 3fps increase average dropping to 1080.


The 1080 and 7700K seem to be pretty well-balanced for this game then if so few FPS can swing the dependency that far.

However you didn't test at the Highest setting, which is the standard for this SotTR thread.  Go ahead and try that and you'll see yourself more highly GPU-bound at 1080p.
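
For anyone puzzling over what the benchmark's "GPU bound" percentage actually means, here's a minimal sketch of one plausible interpretation: the fraction of frames where the GPU, not the CPU, was the slower component. This is an assumption, the benchmark's internal accounting isn't documented in this thread, and the frame times below are made up:

```python
def gpu_bound_percent(cpu_ms, gpu_ms):
    """Fraction of frames (as a percentage) where the GPU took longer than the CPU."""
    frames = list(zip(cpu_ms, gpu_ms))
    bound = sum(1 for c, g in frames if g > c)
    return 100.0 * bound / len(frames)

# Made-up per-frame render times (milliseconds) for a closely matched system
cpu = [9.5, 9.8, 10.1, 9.9, 9.7, 10.0, 9.6, 9.9, 10.2, 9.8]
gpu = [9.0, 10.0, 9.5, 10.3, 9.2, 9.4, 9.1, 10.1, 9.3, 10.4]
print(f"{gpu_bound_percent(cpu, gpu):.0f}% GPU bound")
```

Under that reading, a swing from 39% to 54% just means the limiter flips between CPU and GPU frame by frame, so on a well-balanced system a few FPS either way can move the number a lot.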
__________

And because I can't stop spamming this thread, I upgraded the internals in my Shoddy Bomb Casing to contain slightly fewer better Used Pinball Machine Parts:





Ryzen 5 2600 @stock
Radeon 5600XT
4x4GB 2800MHz 16-17-17-39 barely stable crap DDR4
B450 ASRock ultra-crap mobo which can barely keep the above RAM stable
some other junk that keeps the above powered up.


----------



## Prime2515102 (May 24, 2021)

Lew Zealand said:


> However you didn't test at the Highest setting, which is the standard for this SotTR thread.


ahhhh crud... I didn't even notice that. Oops...


----------



## amirbahalegharn (Jun 23, 2021)

By going to an i9-9900, how many FPS can I gain, or would it not help me at all? It's a Clevo laptop with a GTX 1060 6GB mobile. Thanks in advance.


----------



## Hyderz (Jun 23, 2021)

amirbahalegharn said:


> By going to i9 9900, how many FPS can I gain or it wouldn't help me at all? it's a clevo laptop with gtx1060 6gb mobile. thanks in advance.
> View attachment 205109



can you swap out the cpu in your laptop? if you can then you might be able to hit 60fps but i'd be concerned with the thermals 
maybe your laptop cooler is only setup for cooling a dual core or quad core... dont think its for an 8 core...
maybe you could grab an i5 9600 should give you a nice boost too..


----------



## Athlonite (Jun 23, 2021)

amirbahalegharn said:


> By going to i9 9900, how many FPS can I gain or it wouldn't help me at all? it's a clevo laptop with gtx1060 6gb mobile. thanks in advance.
> View attachment 205109



You may get a few more FPS, but I think you'll find you'll just end up more GPU bound, or overheating your CPU because its HSF isn't capable of cooling a 9900. Just learn to drop some settings down to High rather than Ultra and you'll get more FPS, and I doubt you'll notice the difference in picture quality


----------



## amirbahalegharn (Jun 24, 2021)

Hyderz said:


> can you swap out the cpu in your laptop? if you can then you might be able to hit 60fps but i'd be concerned with the thermals
> maybe your laptop cooler is only setup for cooling a dual core or quad core... dont think its for an 8 core...
> maybe you could grab an i5 9600 should give you a nice boost too..


Yes, Clevo laptops with desktop sockets are upgradeable within their generation. With a G4600 or i3-9100 and a good thermal repaste, CPU temps won't go over 65-66°C with a 120mV undervolt. From what I have seen online, others with an i9-9900 or i7-9700K see temps around 90°C, which is still lower than the 100°C throttle temp. I don't want to use a 9900K (it will definitely overheat, and only with delidding + repasting + undervolting can a user get away with it), or an i9-9900T 35W, which performs around an i7-9700/9700K, as its base and boost clocks are not impressive for gaming compared to the 65W/95W parts, although having 8c/16t may help in some workloads.
I am also counting on the FPS boost that AMD's "FSR" will bring to games, and on how much improvement any change in the CPU department will add on top of the 20-30% gained per quality step the user chooses (Ultra Quality -> Quality -> Balanced -> Performance).
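
For reference, the FSR quality steps mentioned above map to fixed per-axis scale factors in FSR 1.0. A small sketch estimating the internal render resolution at 1080p (factors are from AMD's published FSR 1.0 materials; exact rounding in any given game may differ):

```python
# FSR 1.0 per-axis upscale factors, per quality mode
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_res(out_w, out_h, mode):
    """Estimated internal render resolution for a given output resolution and mode."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_SCALE:
    w, h = render_res(1920, 1080, mode)
    print(f"{mode}: {w}x{h}")
```

At 1080p output, Quality mode renders internally at 1280x720 and Performance at 960x540, which is where the large per-step FPS gains come from.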


----------



## jaybepr (Jun 30, 2021)




----------



## skilzorz (Jul 7, 2021)




----------



## Zyll Goliat (Aug 12, 2021)

Here is my result. What I found interesting is that this old R9 Fury gets a good result at 1080p, but 1440p is even better considering the 4GB limitation and the card's age: at 1440p I get 41 FPS, which is not that far off the 1080p result.


----------



## bissag (Aug 14, 2021)

1080Ti @ Stock 1440p @ Ultra


----------



## Lew Zealand (Aug 19, 2021)

Almost dead thread needs moar love.  Just like 92% GPU bound needs Moar Cores!

I present to you a modest upgrade from i5-8400 6c6t to i7-9700F (not K) 8c8t in an effort to get that pesky GPU-bound number to an overkill position.  At 1080p.  When I play at 1440p.  I swear this makes sense *somewhere*...





Above upgrade gets from 92 to 99% GPU bound and fps from 102 to 109.

Woo.

Still way cheaper than a new GPU so I win.


----------



## QuietBob (Sep 3, 2021)

3300X @ 4.5 all core, 6600XT @ stock + SAM:





Who said quad core gaming is dead?


----------



## harm9963 (Sep 3, 2021)

All CPU , 1080Ti vs 3080Ti.


----------



## pyrotenax (Sep 25, 2021)

1920x1080 | 5900X | 3080


----------



## AVATARAT (Oct 3, 2021)

1920x1080 | 5600x@PBO | 6700 XT


----------



## zebra_hun (Oct 4, 2021)

Hi, original settings, I don't know SotTR, just benched it for a friend. 10 cores at 5400 MHz all-core, 5100 MHz uncore, 4266 MHz RAM OC.


----------



## Felix123BU (Oct 13, 2021)

and they said Windows 11 is worse for gaming   





+11 FPS with the same hardware and settings vs Windows 10


----------



## masterdeejay (Oct 28, 2021)

DX12 single-GPU mode, don't know how to enable multi-GPU mode.
If I enable CrossFire and exclusive fullscreen the PC reboots, but other games work fine in CrossFire.
The CPU is not for gaming (a low-frequency 20-core Xeon), and the video card is not a gaming card either, but it's playable.


----------



## Lew Zealand (Oct 28, 2021)

masterdeejay said:


> Dx12 single gpu mode, dont know how to enable multi gpu mode.
> If i enable Crossfire and fullscreen exclusive the pc reboots, but other games working fine on crossfire.
> CPU is not for gaming (low freq 20 core xeon), and also the video card is not a gaming card but its playable.



Set it on High, or Med with some judicious High changes (or HUB's SotTR recommended settings) and you'll get much closer to 60fps minimum and a great gaming experience.  On a GTX 1060, my Mins went up about 18% to close to 60FPS @1080p with the HUB settings.


----------



## Kissamies (Oct 30, 2021)

Everything on max and RT at medium


----------



## phanbuey (Oct 31, 2021)

PSA: if you're comparing your scores, make sure to pay attention to the GAME BUILD -- the DEMO does not score as high as the real game....  There are large variations across game versions as well.


----------



## mrthanhnguyen (Nov 3, 2021)

Last run for AMD system. Im waiting for Alder Lake.


----------



## professor dumb dumb (Nov 19, 2021)

Might want to hold onto that 5950x.
Tough to beat that 314.  I've been working on it a bit - and still have a bit left i'd say but here's my current best:
Alder lake all p-core 5.1 e-cores disabled
cache / ring 49
DDR4-DR-4x8gb-3800mt/14,14,14,32,280-tight subs
Alltime is lowest settings, highest is as you'd expect.

I'll give it a go with the 6900 XT to see if there is a difference in overhead (as CPU render is significantly faster with Alder Lake, and there are copious optimizations in SotTR for Radeon) - but this is a reasonable result with GeForce.  As soon as I can test with DDR5 I'll update.

Edit:
Best average cpu render: 642
Best average cpu game: 312 (radeon)


----------



## mrthanhnguyen (Nov 19, 2021)

professor dumb dumb said:


> Might want to hold onto that 5950x.
> Tough to beat that 314.  I've been working on it a bit - and still have a bit left i'd say but here's my current best:
> Alder lake all p-core 5.1 e-cores disabled
> cache / ring 49
> ...


I saw from murmick with 4000c15 1t hit 354fps and carillo with 4300c14 1t hit 366. You should tweak more.


----------



## professor dumb dumb (Nov 19, 2021)

mrthanhnguyen said:


> I saw from murmick with 4000c15 1t hit 354fps and carillo with 4300c14 1t hit 366. You should tweak more.


The IMC on this 12900K sample is a bit weak - SR does a little bit better - but it won't finish a run-through at 4k.  Have some 4000c14 coming which is a better bin than the 4400c16 I've been using, so we'll see if there is anything left.  Some people have had success with new BIOS revs on the Strix (not me, unfortunately) - but I won't have access to another 12900K until January to see if this is the board (dirty DIMM socket) or the CPU (weak IMC) - leaning more towards a weak IMC, as I can run Gear 2 at 5000+.

Edit: also I think win11 is slowing things down a bit - just tried the 6900xt and it didn't go as well as I'd hoped.
Avg Cpu render: 583
Avg Cpu game: 312


----------



## Taraquin (Nov 19, 2021)

professor dumb dumb said:


> Imc on this 12900k sample is a bit weak - SR does a little bit better - but it won't finish a run through at 4k.  Have some 4000c14 coming which is a bin better than the 4400c16 I've been using so we'll see if there is anything left.  Some people have had success with new bios rev's on the strix (not me unfortunately) - but I won't have access to another 12900k until january to see if this is board (dirty dimm socket) or cpu (weak imc) - leaning more towards a weak imc as I can run gear2 at 5000+.
> 
> Edit: also I think win11 is slowing things down a bit - just tried the 6900xt and it didn't go as well as I'd hoped.


Are you on the 449 beta of SOTTR? Performance seems best on that one. Mumriken used it for his 350+ run, as I advised him to


----------



## Teex (Dec 11, 2021)

CPU: Ryzen 7 5800x -> PBO + CO optimized per core
RAM: G.SKILL 32 GB KIT DDR4 3200 MHz CL16 Ripjaws V CL16-18-18-38 1,35 V OCed to -> 3733 MHz - 16-19-14-32-48 1,45 V
GPU: Sapphire Pulse 5700 XT OC + UV - > Core Clock - 2032 MHz Memory Clock - 1864 MHz CV 1111 mV


----------



## Teex (Dec 22, 2021)

My Old PC:
CPU: Intel i7 - 8700 
RAM: Kingston FURY Beast 2 x 8 GB KIT DDR4 2666 MHz CL16 + Corsair Vengeance LPX 2 x 8 GB KIT DDR4 2666 MHz CL16  CL16-18-18-38 1,20 V OCed to -> 2666 MHz - 13-16-16-35 1,35 V
GPU: NVIDIA GeForce GTX 1070 G1 Gaming 8G  - OC  - > Core Clock + 135 MHz Memory Clock - 335 MHz


----------



## Lew Zealand (Jan 8, 2022)

A new GPU snuck into my rig last night - 6600 XT.  Turns out it plays well with others:





1080p, Highest TAA.

Sapphire Pulse 6600 XT, 2750 Cores, 2250 Memory, +20% Power

Only 23% faster than my GTX 1080 at 1080p here, but 37% faster at 1440p.  The 8-core 9700F hits 80-90% CPU use in some areas, so it's a good match; that CPU is set to 130W in ThrottleStop instead of its 65W default, so no CPU speed throttling.

The GPU may not be paying attention to undervolting as reported in GPU-Z (still at 1.15V when I have it set to 1.10V), but Superposition scores have inched up each time I make a change there, so maybe it's just bad reporting?  You need to do this UVing for max OC on these GPUs, as it bangs against the power limit even at its default 2664 MHz core clock.

It's fun to have a new toy, ehm, serious computation device to optimize. Some things I've noticed so far compared to my Pulse 5600 XT are that (1) it plays much better with OC tools and (2) the trash Radeon OpenGL implementation, most notable to me in Minecraft, seems to work much better on the 6600 XT than on the 5600 XT.


----------



## QuietBob (Jan 8, 2022)

Since I grabbed the freebie full version from EGS I re-tested to see how it compares with the demo benchmark:





As some people here have noted, the full version seems to use a newer, more optimized engine and yields higher results. The score is a full 11% better in my case, with the exact same setup as in my demo run (stock 6600XT and oc'd 3300X).

What's more interesting, the higher score is entirely down to higher CPU fps: 20% higher avg render and 27% higher avg game  It appears that the revised engine has better CPU threading/utilization.
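
As a quick arithmetic sketch of how those deltas fall out (the absolute numbers below are placeholders chosen only to reproduce the stated percentages, not the actual run figures):

```python
def pct_gain(new, old):
    """Percentage improvement of `new` over `old`."""
    return 100.0 * (new - old) / old

# Placeholder demo-vs-full-version numbers, for illustration only
demo = {"score": 100, "cpu_render_fps": 150, "cpu_game_fps": 130}
full = {"score": 111, "cpu_render_fps": 180, "cpu_game_fps": 165}

for key in demo:
    print(f"{key}: {pct_gain(full[key], demo[key]):+.0f}%")
```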


----------



## walker15130 (Jan 8, 2022)

walker15130 said:


> All the gpu power I could buy in 2021.


That's where you were wrong, buddy, here are some fresh new tests with my two-month-old 3060 Ti.
Stock CPU:




CPU at 4400MHz didn't help much:




Even though I play at 1440p, this good ol' Ryzen 3600 tends to be the bottleneck more often than I'd like to admit. Too bad every upgrade path is steeply priced. Maybe the 5800X3D will make people ditch their 5600X/5800X


----------



## Petar666 (Jan 9, 2022)

ЕVGA FTW3 ULTRA 3080TI/2100/21000/10900KF@5.30Ghz/2x8GB@4266mhz


----------



## Duvar (Jan 16, 2022)

1080p lowest Ryzen 3600 4.6GHz 3800CL14 tuned subs






Update:


----------



## mrthanhnguyen (Feb 21, 2022)

12900k full p core and e core and ht enable.
1080p





1440p


----------



## AVATARAT (Feb 27, 2022)

1920x1080 | 5600x@PBO | 6800 XT

Score: 36134
FPS: 231




2560x1440 | 5600x@PBO | 6800 XT

Score: 27499
FPS: 176


----------



## masterdeejay (Mar 5, 2022)

Tesla M40 undervoltaged, no boost clock


----------



## looniam (Mar 5, 2022)

I'll just leave this here. 



A little DDR4 primary-timing tuning and increased power/fan limits in AB, but pretty much out of the box otherwise.


----------



## dmgr13 (Mar 21, 2022)

Any idea why my setup is performing below average? There are literally users here with less powerful hardware but higher frames... I don't get it


----------



## Lew Zealand (Mar 22, 2022)

0% GPU bound means 100% CPU bound, which suggests something else is running in the background eating up your CPU cycles.  But strangely, Steam is showing 234 FPS on that screen, which is rendering a full 3D scene in the background, and that should be roughly on par with your benchmark scores.

But it isn't; you have a max of 218 FPS on your GPU and 145 on your CPU.  What gives?  Restart your machine and try it again, and use the Highest preset this time to match the standard (which, yes, not everyone has done).  Maybe use the Afterburner/RivaTuner overlay and watch the CPU use, power, and frequency numbers to see what's up.


----------



## Taraquin (Mar 22, 2022)

dmgr13 said:


> any idea why my setup is performing below the average ? There's literally users here with less powerfull hardware but with higher frames... I don't get it


Have you tuned your RAM? Enabled XMP? Try running it at Highest at 1080p. My 3600 with XMP only got 132fps there; you get 115fps. I guess you don't have XMP enabled (RAM running around 2400) and have a lot of bloatware.


----------



## dmgr13 (Mar 22, 2022)

Taraquin said:


> Have you tuned ram? Enabled xmp? Try running it at highest at 1080p. My 3600 with xmp only got 132fps then, you get 115fps. I guess you don't have xmp (ram running at 2400+/-) and a lot if bloatware.


I just did a clean fresh Windows install, still performing poorly... Yes, my RAM is running at 3200MHz


----------



## Taraquin (Mar 22, 2022)

dmgr13 said:


> i just did a clean fresh windows install, still performing poorly... Yes my rams are running at 3200mhz View attachment 240844


Okay, I would suggest tuning ram. My 3000cl15 xmp ram did 132fps, after tuning I got 156fps, I can help you if you want?


----------



## dmgr13 (Mar 22, 2022)

Taraquin said:


> Okay, I would suggest tuning ram. My 3000cl15 xmp ram did 132fps, after tuning I got 156fps, I can help you if you want?


Hi. Quick update: I was using the Tomb Raider trial version from Steam. I uninstalled it and installed the full version (not original though, cracked) and this is the result I got!! What changed??


----------



## Taraquin (Mar 22, 2022)

dmgr13 said:


> Hi. Quick update, I was using tomb raider trial version from steam. I unistalled the game and installed the full version (not original tho, cracked) and this is the results I got !! What changed ??View attachment 240856


Demo is a different game version with lower performance. You can still get 30fps more with ram tuning, want help with that?


----------



## dmgr13 (Mar 22, 2022)

Taraquin said:


> Demo is a different game version with lower performance. You can still get 30fps more with ram tuning, want help with that?


that would be great, im working now but I sent you a PM so you can help me later. Thank you so much


----------



## gtfomyporch (Mar 29, 2022)

CPU - 5600x, PBO2 +150, CO disabled
Mobo - Gigabyte X570 Aorus Elite
RAM - 2x16 F4-3600C14D-32GTZNA at 3600 with tightened timings
GPU - XFX 6900XT Merc

This seems to be about the max I can hit at the moment without sacrificing stability.


----------



## Othnark (Apr 16, 2022)

Overclocked 5800x3d doing a bit of work, punishing my 2080ti. Demo not full game, 3080ti testing for high/highest in the near future.
720 Lowest

1080Lowest


----------



## mouacyk (Apr 22, 2022)

phanbuey said:


> PSA: if you're comparing your scores, make sure to pay attention to GAME BUILD -- the DEMO does not get as high as the real game....  There are large variations across game versions as well.
> 
> View attachment 223077


3800-14-14-14-28-2T Tightened Timings:


@Othnark the 3D cache on your 5800X is amazing isn't it?


----------



## zebra_hun (Apr 22, 2022)

New record:







DLSS on: (Quality)




1080 Lowest for cpu/ram bench:


----------



## DooM3 (Apr 22, 2022)

ryzen 3600 4,3 ghz  3800 mhz 1900 max







ryzen 3600 4,3 ghz  3800 mhz 1900 min




Xeon W3680 4,4 ghz  2400 mhz 3600 max


----------



## zx128k (Apr 24, 2022)

There is something funny about the FPS in this game.  I should be getting a lot higher FPS; performance is at least 20fps lower than it should be.


----------



## QuietBob (May 12, 2022)

5800X3D with IF @ 1900, RAM @ CL14, 6600XT @ stock + SAM:


----------



## damric (May 23, 2022)

Ryzen 5 5500 @4.825GHz
4x8GB Ballistix DDR-3600CL16 @4400CL20
Vega64 @1750/1100


----------



## Lew Zealand (Jun 29, 2022)

I know, nobody asked for this, which is why I'm posting.  Don't worry it will all make sense soooooon.....

Dell Optiplex 9020 MT
16GB Dell DDR3-1600 CL11 (4x4GB)
Core i5-4590
Radeon RX 6400, @PCIe 3.0 of course


General notes on the 6400:  I played a little more through *SotTR* where I'm at and it plays fine at a decent mix of Low through High settings for visual quality @1080p, averaging 45-50fps.  Many games play very well on this system, but the *CPU* is a bottleneck in *AC: Odyssey*, pegged at 90% or higher all the time at around 45fps.  The game plays the same at 1080p as 1440p, as the GPU is twiddling its thumbs at 1080p.

*CP2077* plays OK with a mix of low and medium at 48FPS w/FSR 2.0 @1080p, but the new FSR 2.0 hack makes specular highlights indoors look *horrible*, so even though I get a 20% increase in FPS, I prefer the 40fps native.  Other things look quite good with FSR 2.0, including my annoyance: the jittery palm fronds.

*HZD* is very playable at 900p, 45-55 FPS, and I covered a lot of the game this way; however, this card suffers from a Radeon driver error I see on occasion: flickering in-game assets, like trees and rocks.  They'll flicker out and back in within a split second; it seems to happen at night and only at some times.  Immersion-breaking though.

*Forza Horizon 4* plays at around 120FPS @1440p with mostly High settings, some Ultra (car detail) and 4xMSAA (IMO an absolute requirement, I hate jaggies!).  Speaking of jaggies, *Rocket League* plays at 120-144 FPS @1440p but gives the impression that it's CPU-limited, as there are infrequent but annoying frame drops correlated with high CPU use.  I'm gonna swap in another CPU if I can, or swap the 6400 out to another chassis to confirm.

*Doom (2016)* plays 1440p @~95FPS at High settings and is smooth as silk.  *Tomb Raider (2013)* crashes out as soon as the rendered menu screen loads; I've never seen this behavior, might have to reinstall, as I even play this game on Intel iGPUs and an old 1GB Radeon 8570 DDR2.  Indie games like *Slime Rancher* and *Raft* play great at 60-90FPS, 1440p, High settings, as you'd expect. 
Low-poly games like these look and play exactly the same to me at 60, 90, and 120FPS; I can't detect the difference. But Rocket League at 60, 90, 144 FPS, those differences are obvious.


----------



## Taraquin (Jun 30, 2022)

Lew Zealand said:


> I know, nobody asked for this, which is why I'm posting.  Don't worry it will all make sense soooooon.....
> 
> Dell Optiplex 9020 MT
> 16GB Dell DDR3-1600 CL11 (4x4GB)
> ...


It's not a contest, interesting to see unusual setups  If you are CPU bound then ram tuning, if possible in Dell bios, can give you a boost


----------



## Lew Zealand (Jun 30, 2022)

Taraquin said:


> It's not a contest, interesting to see unusual setups  If you are CPU bound then ram tuning, if possible in Dell bios, can give you a boost



Nah, Dell BIOS don't give you nothin'!  Which is OK, as the point of this exercise is to see what you can expect by dropping today's most widely drop-innable card into an older but decent office PC.  Mostly quite good, but I really didn't expect to get a CPU-bound game on a Haswell i5 with this card, yet AC: Odyssey is just that!  For most other games the CPU is OK, including AC: Unity (and SotTR), though I haven't tried AC: Origins yet and I don't have Watch Dogs or Hitman 3.


----------



## Ja.KooLit (Oct 9, 2022)




----------



## OkieDan (Oct 18, 2022)

7950x all core PBO Curve Optimizer -15, DDR5 6000 30-40-40-96 (stock XMP profile).


----------



## Tomgang (Oct 23, 2022)

5600X with PBO on and RTX A2000 with overclock. I have an RTX 4090 as well, but the 5950X is holding that GPU back too much, so the score is no higher than with my RTX 3080.


----------



## Kissamies (Dec 10, 2022)

4K, high, medium RT


----------



## scope54 (Dec 14, 2022)

1440p


----------



## harm9963 (Jan 3, 2023)

New 6800 Red Dragon, plus new TV TCL 55 646.


----------



## Athlonite (Jan 3, 2023)

harm9963 said:


> New 6800 Red Dragon, plus new TV TCL 55 646.


You know, it's much easier to press the PrtSc key on the keyboard, then open MS Paint and paste as a new image. Not only is it quicker, the image will also look better


----------



## harm9963 (Jan 3, 2023)

Athlonite said:


> You know it's much easier to press the PRTSC key on the keyboard then open MS Paint and paste as new image not only is it quicker the image will look better


Did by phone


----------



## harm9963 (Jan 5, 2023)

harm9963 said:


> View attachment 277366
> New 6800 Red Dragon, plus new TV TCL 55 646.


----------



## OkieDan (Jan 5, 2023)

harm9963 said:


> Did by phone


We know lol


----------

