
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Everyone is going to focus on 4K, because consoles run it (even if poorly) and PC players have it as their current gold standard.
1440p was pretty popular, but it's dropping fast.

The only reason consoles do 4K is because it's the TV standard, which jumped from 1080p straight to 4K (a waste of pixels below 77"). But both consoles support 1440p output now, which is actually the best option for most games if that's your screen's native resolution.
That's also the only reason I use 4K, since I have an OLED TV. But sitting 1.8 meters/6 feet away from it, 1440p actually looks very good (even the UI and Windows). 1080p is blurry, except when you turn on integer scaling, where it becomes ridiculously sharp (text does look pixelated, but images don't).

I see no reason why anyone would choose a 4K monitor for gaming, unless they really want a huge screen on their desk. In my view 1440p is actually the perfect resolution for gaming, especially 3440x1440.

When the time comes that I have to replace my OLED TV, I will actually consider an OLED monitor. Hopefully we'll have a bigger selection by that time, and some bigger ultrawide models. I'd mount it to my TV stand using a desk arm and use it just like a TV, only closer.
 

I can PM you a recommendation for a good optician...
 
I know a doctor who can prescribe you some Obecalp. ;)

"I can't see sh!t, so everyone else is just making up that they can!" Seems to be your reasoning.

When I moved from a 27" 4K to a 32" 4K monitor it was an obvious downgrade in image clarity... and doing 4K on a 65" TV... ugh... 8K, tyvm!
 
I also had the reference design, and the max graphics score I got in Time Spy Extreme was 15,100; with the Nitro I'm getting 17,000. Close to a 13% performance gain.
Yes, the AIB card will be faster: it's pushing much higher clocks and way more wattage. The Nitro model probably does 460 W TBP, while the reference model with +15% power does something like 390-400 W.

However, my original point was that you don't need an AIB model to hit 3 GHz clocks, the reference model will do that, not that AIB models won't be faster. They should be, as they are all using 3x 8-pin connectors.
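For what it's worth, the raw math on those two scores: (17,000 - 15,100) / 15,100 ≈ 12.6%, so call it a 12-13% uplift.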
 
The DF review is a lot more positive about RT.


"Stunning Visuals"!?!

The title alone gives me the impression that this guy is biased.
 
I would like to see that Steam survey verified, because it shows a fairly large increase in very low resolutions... which tells me that the survey was primarily sent to less developed markets.
You can run Steam on PCs that are game servers and laptops, y'know.
I only used it to show that 1440p was common, extremely so - but now it's beginning to die off, and 4K is taking over thanks to cheaper displays (Samsung in particular has released a heap of 4K high-refresh displays in recent years).
 
Wow, what's this post saying?

Can anyone here confirm this? This guy is saying the game enforces downward resolution scaling even without FSR.

TheWordOfTyler [+1] 33 points 16 hours ago*

The PC version runs at a constant 50% resolution even if you disable FSR.
I noticed it after trying to run the game without FSR and seeing that it was really blurry.
If you try and change it in the config file it will just change back when the game launches.
You can use Nvidia DSR to create a "doubled" resolution which allows the game to run at native resolution and look correct.
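To put numbers on that DSR trick (assuming the forced 50% is applied per axis, the way Unreal's screen percentage usually works; the display resolution here is just an example): on a 2560x1440 monitor, DSR 4.00x hands the game a 5120x2880 target, and 50% of that per axis is 2560x1440 again, i.e. actual native.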
 
You can run Steam on PCs that are game servers and laptops, y'know.
I only used it to show that 1440p was common, extremely so - but now it's beginning to die off, and 4K is taking over thanks to cheaper displays (Samsung in particular has released a heap of 4K high-refresh displays in recent years).

I am aware of that, but the fairly large increase in all the very low resolutions in that survey makes me think that the numbers aren't super reliable.
 
Apparently yes:

"Changing the graphics quality, changes the render resolution as well on PC.
Preset, render resolution
Epic = 100%
High = 87%
Medium = 71%
Low = 50%"

Post in thread 'Star Wars: Jedi Survivor [PS5, XBSX|S, PC]' https://forum.beyond3d.com/threads/star-wars-jedi-survivor-ps5-xbsx-s-pc.63093/post-2298432
Graphics quality seems like such a weird name for a setting.
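Assuming those values are right and the percentage is an Unreal-style per-axis screen percentage (my assumption, the post doesn't say), here's a quick sketch of what the presets mean in pixels at a 4K output:

```python
# Rough sketch: preset percentages from the quoted post, applied per axis
# (Unreal-style screen percentage -- an assumption, not confirmed by the game).
PRESET_SCALE = {"Epic": 1.00, "High": 0.87, "Medium": 0.71, "Low": 0.50}

def internal_resolution(width, height, preset):
    s = PRESET_SCALE[preset]
    return round(width * s), round(height * s)

for preset in PRESET_SCALE:
    print(preset, internal_resolution(3840, 2160, preset))
# Epic (3840, 2160) / High (3341, 1879) / Medium (2726, 1534) / Low (1920, 1080)
```

So if that holds, "Low" at 4K is effectively rendering at 1080p and upscaling, which would explain the blurriness people are reporting.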

--

Looking at the PS5, performance mode seems to hover in the 40s. Are devs just assuming everyone has VRR now? Because with V-Sync that would be a stutter fest.
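Without VRR the frame pacing math is brutal: on a 60 Hz panel every frame has to land on a 16.7 ms refresh boundary, so a ~45 FPS output (about 22 ms per frame) ends up alternating between 16.7 ms and 33.3 ms presentation times instead of a steady cadence, which is exactly the judder VRR is there to absorb.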
 
Except most don't. Even if the average FPS increases, most games will see a sharp decline in frametime consistency with them enabled.


Exactly, very true.

On Red Dead Redemption 2, the minimum FPS dropped by 13% with E-cores on. With them off, minimum FPS was much higher. The average and max did not change much at all; the max was a couple of FPS higher.

E-cores just do not do well for games. Just like dual-CCD AMD CPUs when threads hop between CCDs, the 1% lows tank.
 

Indeed.

A friend of mine had really bad stuttering with his 13700F in Star Citizen... until we disabled the e-waste cores, then it played fine.
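For anyone who'd rather not flip E-cores off in the BIOS, a softer per-game workaround is to pin just the game to the P-core threads. A minimal sketch with psutil, assuming the 8 P-cores (16 threads with HT) enumerate as logical CPUs 0-15 and the E-cores as 16-23, which is typical for a 13700F but worth verifying on your own system; the process name is just an example:

```python
# Hedged sketch, not a drop-in fix: pin one game to the P-core threads instead of
# disabling E-cores in the BIOS. Assumes P-core threads are logical CPUs 0-15 and
# E-cores are 16-23 -- check your own topology (e.g. in Task Manager) first.
import psutil

GAME_EXE = "StarCitizen.exe"   # hypothetical process name, substitute your game
P_CORE_CPUS = list(range(16))  # logical CPUs assumed to be the P-core threads

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(P_CORE_CPUS)  # keep the game's threads off the E-cores
        print(f"Pinned PID {proc.pid} to P-core threads only")
```

It only lasts for that session (and may need to be run elevated depending on how the game launches), but it's a quick way to test whether the E-cores are really the culprit.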
 
E-cores just do not do well for games. Just like dual-CCD AMD CPUs when threads hop between CCDs, the 1% lows tank.
Except that's not at all how it works. Intel is using a hardware scheduler: E-cores are the last resort when the CPU is out of resources, before jumping to hyperthreading. It's completely different to AMD's CCDs. Unless the game is specifically problematic with E-cores (like the Star Citizen example above), E-cores do not harm performance.

I haven't tried RDR2, I will in a bit, but in most games I've actually tested, E-cores boost both the averages and the lows. Warzone 2, TLOU, Cyberpunk, Spiderman and some other games which slip my mind right now have way higher 1% and 0.1% lows with the E-cores on rather than off. On Spiderman the difference is negligible honestly, but in the other three it's night and day, especially in specific heavy scenes. On Warzone 2 I went from 130 to 170+ 0.1% lows by enabling the E-cores.
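Since 1% and 0.1% lows are doing a lot of work in this argument, here's roughly how those figures fall out of a frametime capture. This is the common "average of the slowest X% of frames" definition; tools like CapFrameX differ slightly in the exact method:

```python
# Rough sketch of how 1% / 0.1% "low" FPS figures are commonly computed from a
# per-frame time log (milliseconds), using the "average of the slowest X% of
# frames" flavour of the metric.
def percent_low_fps(frametimes_ms, percent=1.0):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))   # how many frames the worst X% covers
    avg_worst_ms = sum(worst[:n]) / n             # average frametime of that slice
    return 1000.0 / avg_worst_ms                  # convert back to FPS

# Example: 990 smooth 6 ms frames plus 10 hitches at 12 ms
sample = [6.0] * 990 + [12.0] * 10
print(round(percent_low_fps(sample, 1.0)))         # 83  -> the 1% low
print(round(1000 / (sum(sample) / len(sample))))   # 165 -> the plain average FPS
```

That's why a handful of hitches can tank the lows while barely moving the average.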
 
It is weird that the number of overly long frames is apparently more problematic with E-cores than with hyperthreading. A "hyperthreaded core" (aka using both threads on a core) gives a smaller compute increase than using (one thread of) a P-core plus an E-core.

This might point to scheduler issues.
 
It is not; E-cores are better than HT. If you want to benchmark scores (you know, for leaderboard wins), the usual suggestion is to disable HT, clock the P-cores, and leave the E-cores on. It was different on Alder Lake, since E-cores capped your cache to 4.2 GHz, but even on ADL only a handful of games benefited from turning off E-cores and clocking the cache. SOTR was one of them, but I think that's about it :roll:
 

I'm more thinking of comparing to AMD. You never see major problems with long frames when using hyperthreading on AMD.
 
Have you actually tested it?

No. I am going on reports of excessively slow frames. I'm not really interested in owning either Intel's P/E or AMD's high-cache/high-frequency split chips. It all smells a little too much like crude hacks. It is telling that the professional markets (Xeons and EPYCs) don't do any of this jazz.
 
The game is a dumpster fire on PC. What they should have done was hire the developers from the SW BF series: beautiful rendering and smooth-as-silk gameplay.

No. I am going on reports of excessively slow frames. I'm not really interested in owning either Intel's P/E or AMD's high-cache/high-frequency split chips. It all smells a little too much like crude hacks. It is telling that the professional markets (Xeons and EPYCs) don't do any of this jazz.
You will pay a buttload for the new Xeons too.

Intel Xeon W-3400 Series (Sapphire Rapids-112L)

SKU       Cores/Threads  Base (GHz)  Turbo (TB 2.0)  Turbo (TBM 3.0)  PCIe 5.0 lanes  L3 (MB)  Unlocked  TDP (W)  Price (1KU)
w9-3495X  56/112         1.9         4.6             4.8              112             105      Y         350      $5889
w9-3475X  36/72          2.2         4.6             4.8              112             82.5     Y         300      $3739
w7-3465X  28/56          2.5         4.6             4.8              112             75       Y         300      $2889
w7-3455   24/48          2.5         4.6             4.8              112             67.5     N         270      $2489
w7-3445   20/40          2.6         4.6             4.8              112             52.5     N         270      $1989
w5-3435X  16/32          3.1         4.5             4.7              112             45       Y         270      $1589
w5-3425   12/24          3.2         4.4             4.6              112             30       N         270      $1189
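Doing the per-core division on those 1KU prices: the w9-3495X is $5889 / 56 ≈ $105 per core, and even the 12-core w5-3425 is $1189 / 12 ≈ $99 per core, so the whole stack sits at roughly $100 per core before you've even bought a board.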
 

Have you seen DF's analysis of Redfall on PC?

The game seems to use 8 P-cores and 3 E-cores on the 12900K, but the utilization on all but two cores is extremely low. And the framerates with all cores enabled are actually lower than just 6 or 8 P-cores without HT.

On a Ryzen 3600 it's a stuttery mess even though the usage is quite low as well. It can't even maintain locked 30 FPS.

None of it makes any sense. Most Unreal Engine 4 games exhibit a similar behavior.
 
No. I am going on reports of excessively slow frames. I'm not really interested in owning either Intel's P/E or AMD's high-cache/high-frequency split chips. It all smells a little too much like crude hacks. It is telling that the professional markets (Xeons and EPYCs) don't do any of this jazz.
Well, check this video; it's a 12900K. Look at the CPU usage and tell me what you think would happen if I turned off those E-cores.


Have you seen DF's analysis of Redfall on PC?

The game seems to use 8 P-cores and 3 E-cores on the 12900K, but the utilization on all but two cores is extremely low. And the framerates with all cores enabled are actually lower than just 6 or 8 P-cores without HT.

On a Ryzen 3600 it's a stuttery mess even though the usage is quite low as well. It can't even maintain locked 30 FPS.

None of it makes any sense. Most Unreal Engine 4 games exhibit a similar behavior.
Buh, there are a lot of games that are broken beyond fixing. Spiderman works better (and I mean 20-25% better) with HT off. None of that makes sense: on a hardware level P-cores take priority, E-cores are used when the P-cores are swamped, and HT is used when the E-cores are also swamped. Any game that deviates from that rule has something fundamentally f***ed up in its code.

And I've seen a lot of games that completely bypass this: they skip the P-cores, they skip the E-cores, and they go straight for the HT threads. That's just asinine. That's why, if you don't mind losing some MT performance, it's usually better to turn HT off and be done with this nonsense.

I just watched the video you mentioned; it's not a good indicator, since we don't know if the issue is caused by E-cores or by HT. He should have tested 8+8 with HT off as well.
 