
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Joined
Sep 8, 2009
Messages
1,077 (0.19/day)
Location
Porto
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Pro
Cooling AiO 240mm
Memory 2x 32GB Kingston Fury Beast 3600MHz CL18
Video Card(s) Radeon RX 6900XT Reference (amd.com)
Storage O.S.: 256GB SATA | 2x 1TB SanDisk SSD SATA Data | Games: 1TB Samsung 970 Evo
Display(s) LG 34" UWQHD
Audio Device(s) X-Fi XtremeMusic + Gigaworks SB750 7.1 THX
Power Supply XFX 850W
Mouse Logitech G502 Wireless
VR HMD Lenovo Explorer
Software Windows 10 64bit
Watch at 15:00, where Alex asks about Nvidia's Streamline API. Apparently the AMD engineer is arrogant enough to think their solution is the best for everyone, even though FSR 2.x is quite useless at 1440p and below.

LOL why the hell would an engineer for any company work towards a product that basically consists of weaseling their competitor's proprietary crap everywhere?
Alex's question is delusional, naive at best.

The only reason Streamline even exists is because Nvidia saw a decline in developer adoption for DLSS. They just want to say they're doing something open source (an integration tool lol yeah) while keeping the tech that matters closed source and exclusive to their hardware.
 
Joined
Jun 14, 2020
Messages
3,546 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I've just read through the review again, and there's no complaint about the game's looks, or any mention that it needs 20+ GB VRAM. And yeah, it stutters because it's UE4. I don't see AMD's hand in any of this.
Yeah, Jedi actually looks decent, I have to admit; it's the exception rather than the rule. With that said, the performance is horrible, so there is that :D
 
Joined
Nov 11, 2016
Messages
3,465 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
LOL why the hell would an engineer for any company work towards a product that basically consists of weaseling their competitor's proprietary crap everywhere?
Alex's question is delusional, naive at best.

The only reason Streamline even exists is because Nvidia saw a decline in developer adoption for DLSS. They just want to say they're doing something open source (an integration tool lol yeah) while keeping the tech that matters closed source and exclusive to their hardware.

Funny thing is, Streamline is open source and is being used by modders to freely mod FSR into DLSS-only games :rolleyes:

Game devs are free to integrate Streamline and include DLSS/XeSS/FSR all at once, but that's actively blocked by AMD.
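For context on what an integration layer like Streamline actually does: it's essentially a dispatch shim that exposes one upscaling entry point and routes it to whichever backend (DLSS, FSR 2, XeSS) the hardware supports. A minimal Python sketch of the idea; all names here are invented for illustration (the real SDK is a C++ library):

```python
# Hypothetical sketch of an upscaler-agnostic integration layer, in the
# spirit of what Streamline does. These classes and names are invented
# for illustration; the real SDK is C++ with a plugin architecture.

class Upscaler:
    name = "none"
    def evaluate(self, frame):
        return frame  # native passthrough, no upscaling

class DLSS(Upscaler):
    name = "dlss"
    def evaluate(self, frame):
        return f"dlss({frame})"

class FSR2(Upscaler):
    name = "fsr2"
    def evaluate(self, frame):
        return f"fsr2({frame})"

class XeSS(Upscaler):
    name = "xess"
    def evaluate(self, frame):
        return f"xess({frame})"

REGISTRY = {u.name: u for u in (DLSS(), FSR2(), XeSS())}

def upscale(frame, preferred, supported):
    """Pick the first preferred upscaler the current hardware supports."""
    for name in preferred:
        if name in supported and name in REGISTRY:
            return REGISTRY[name].evaluate(frame)
    return Upscaler().evaluate(frame)  # fall back to native rendering
```

The game integrates one `upscale()` call; which vendor backend runs becomes a runtime decision rather than three separate engine integrations, which is why modders can swap one technique for another behind the same interface.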
 
Joined
Jan 8, 2017
Messages
9,517 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Joined
Oct 28, 2012
Messages
1,195 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Yes, that's exactly the point. I've just been told Nvidia-sponsored titles run amazingly on anything, and then you told me that I was just being a contrarian.


No, it didn't. There are games prior to it, like Control, which had roughly the same set of RT features and ran better (well, if you can call that better; at least with a 4090 you could get above 60 fps natively).

View attachment 294039
Control's world might not be on the same scale; even without RT, the fps is higher. The impact of RT is -41% in both cases. And I remember Cyberpunk being openly mocked for getting pulled from the PSN store, for the T-poses, and for its wide array of bugs. The artistic direction and story were praised, but that got lost in the sea of memes about the bugs; anyone finding any kind of enjoyment in the game was called an idiot. The right thing to do was to slander it. They made a dedicated subreddit so that the people having fun could talk with each other instead of being flamed for finding any positive point amid the crisis.
GameWorks was also slandered before it became irrelevant.
The Witcher 3 (GameWorks Inside™) was accused of being noticeably slower on Kepler and of heavily favoring Maxwell on purpose.
Assassin's Creed Unity (GameWorks) was also a real dumpster fire.
And since Jensen's statement that GPU prices will only go up from now on, Nvidia is certainly not the most liked company among the trio.

I'm using Nvidia because I like to use Maxon Redshift, and even Cycles is much faster with OptiX, but I'm fully aware that Jensen's interests don't really align with mine. :D

Even the argument that Nvidia is "avant-garde" for jumping on ray tracing, path tracing, upscaling, and frame generation is vehemently disputed by the people who believe that full rasterization and native rendering are bound to make a comeback, and that RT is nothing more than a cash grab, unnecessary work for developers, and actually stifles GPU innovation by focusing on tech bound to become abandonware.

There's slander on both sides. It's always been the case.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Cyberpunk runs great.

It didn't run great on release though. It was a mess too, especially on consoles. It was so bad on PlayStation that Sony removed it from their store for a while. It was better on PC, but there were still plenty of bugs. As time went by it got patched and polished, and now it's fine.

UE4 is a different animal altogether. It is well known for traversal stutters and shader-compilation stutters. I don't know if that can even be fixed, but the VRAM issue must be addressed. It's EA we're talking about, so there's some doubt in my mind that it will be properly patched, but we'll see.
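On the shader-compilation stutter mentioned above: it happens when a pipeline state object is compiled by the driver the first time it's needed, mid-frame; the standard mitigation is to precompile and cache everything during a loading screen. A toy Python model of the difference (invented names, not the UE4 API):

```python
import time

class ShaderCache:
    """Toy model of compile-on-first-use vs. precompiled shaders."""
    def __init__(self, compile_cost_s=0.0):
        self.cache = {}
        self.compile_cost_s = compile_cost_s  # simulated driver compile time

    def compile(self, key):
        time.sleep(self.compile_cost_s)  # done mid-frame, this is the hitch
        self.cache[key] = f"pso:{key}"
        return self.cache[key]

    def get(self, key):
        # Cache miss during gameplay = a stutter; cache hit = effectively free.
        return self.cache.get(key) or self.compile(key)

    def precompile(self, keys):
        # Pay all the compile cost up front, on the loading screen.
        for key in keys:
            self.compile(key)

cache = ShaderCache()
cache.precompile(["lit_opaque", "lit_masked", "particle_add"])
# During gameplay every lookup is now a cache hit: no mid-frame compile.
pso = cache.get("lit_opaque")
```

Traversal stutter is a different beast (streaming assets in as the player crosses level boundaries), which is why precompilation alone doesn't make UE4 games smooth.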

Have a look at this article. EA is trying to shift the blame onto high-end hardware gamers:

 
Joined
Oct 28, 2012
Messages
1,195 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
It didn't run great on release though. It was a mess too, especially on consoles. It was so bad on PlayStation that Sony removed it from their store for a while. It was better on PC, but there were still plenty of bugs. As time went by it got patched and polished, and now it's fine.

UE4 is a different animal altogether. It is well known for traversal stutters and shader-compilation stutters. I don't know if that can even be fixed, but the VRAM issue must be addressed. It's EA we're talking about, so there's some doubt in my mind that it will be properly patched, but we'll see.

Have a look at this article. EA is trying to shift the blame onto high-end hardware gamers:

Might be time for UE4 to retire. HUB already debunked the whole Win10 vs. Win11 thing (yes, they don't use a CPU with E-cores, but EA didn't say that was the issue either :D):
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Might be time for UE4 to retire. HUB already debunked the whole Win10 vs. Win11 thing (yes, they don't use a CPU with E-cores, but EA didn't say that was the issue either :D):

UE4 should be avoided by developers. It's still popular among developers, though; Unreal Engine has been the most popular engine for a long, long time. My understanding is that it's easier for developers to learn and use.
 
Joined
Oct 28, 2012
Messages
1,195 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
UE4 should be avoided by developers. It's still popular among developers, though; Unreal Engine has been the most popular engine for a long, long time. My understanding is that it's easier for developers to learn and use.
And UE5 is apparently bound to become even more popular, with all the news about studios ditching their in-house engines for it. We can only hope that it doesn't come with its own quirks as well.
 
Joined
Nov 18, 2010
Messages
7,602 (1.48/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
UE4 should be avoided by developers. It's still popular among developers, though; Unreal Engine has been the most popular engine for a long, long time. My understanding is that it's easier for developers to learn and use.

In this case it's all about assets: reusing them and thus making a greater margin. We are talking about EA here. I recall they even bragged that this was one of their fastest releases, which means fewer salaries paid. It's all about money and profit, not about which engine is harder or should be avoided. If it had been earlier, they would have used Frostbite; they had UE4, so they used that.
 
Joined
Jun 14, 2020
Messages
3,546 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It didn't run great on release though. It was a mess too, especially on consoles. It was so bad on PlayStation that Sony removed it from their store for a while. It was better on PC, but there were still plenty of bugs. As time went by it got patched and polished, and now it's fine.

UE4 is a different animal altogether. It is well known for traversal stutters and shader-compilation stutters. I don't know if that can even be fixed, but the VRAM issue must be addressed. It's EA we're talking about, so there's some doubt in my mind that it will be properly patched, but we'll see.

Have a look at this article. EA is trying to shift the blame onto high-end hardware gamers:

It was buggy, yes, but performance was fine on PCs (and consoles, excluding the old gen). With RT disabled, even 3-4 year old CPUs were getting 100+ fps in this game. It was RT specifically that made it heavy, on both the CPU and the GPU.
 
Joined
Mar 18, 2023
Messages
935 (1.44/day)
System Name Never trust a socket with less than 2000 pins
And UE5 is apparently bound to become even more popular with all the news about studios ditching their in-house engine for it. We can only hope that it doesn't come with its own quirks as well.

Well, Kunos Simulazioni went to UE for Assetto Corsa Competizione and has now announced that they're going back to an in-house engine for AC2.

The user base blames the Unreal Engine for many things, and the game didn't look enough better to justify the extra resources it consumed.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,972 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Update May 1st: Now that Denuvo's limit of five GPU changes per day has expired, I was able to add more graphics cards: RX 5700 XT, RTX 2080 Ti and Arc A770. No big surprises for any of those cards in terms of performance, but the Intel Arc card displays serious visual corruption during gameplay, even with the 101.4335 beta drivers installed. It's surprising that Intel decided to declare that beta "Game Ready for Jedi Survivor" when there are clearly visible rendering errors. While they're not game-breaking, they're still extremely annoying.

UE 4 should be avoided by Developers.
For game studios, UE is the best engine by far: working with it is extremely productive, it's very intuitive to use, and you can access a huge talent pool across the world. There's an insane amount of pre-made assets that you can buy. UE's full source is on GitHub, with very high code quality; you can change every little bit inside it if you want to. The licensing terms are very reasonable. There's support for DX11, DX12, RT, FSR, DLSS: every interesting tech you can think of.

I tried building my GPU thermal load test in various engines, including Unity, CryEngine and UE4, and settled on UE4 after a day. SUPER happy with my choice.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Update May 1st: Now that Denuvo's limit of five GPU changes per day has expired, I was able to add more graphics cards: RX 5700 XT, RTX 2080 Ti and Arc A770.
There might be a bug with the charts. The A770 and 5700 XT don't show up in the average or minimum FPS charts. The A770 and the 5700 XT are listed in the 25 game relative performance chart, but there is no graphic to correspond with the legend.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,972 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
There might be a bug with the charts. The A770 and 5700 XT don't show up in the average or minimum FPS charts. The A770 and the 5700 XT are listed in the 25 game relative performance chart, but there is no graphic to correspond with the legend.
Works for me. Press F5; your browser cache (or some intermediary cache) might be serving the older charts.
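The stale-chart symptom is classic HTTP caching: a normal reload can still be served the old image by the browser or an intermediate cache. Besides a hard refresh on the client, a common server-side fix is to version the asset URL so the cache key changes. A small sketch (the URL is a made-up example):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def bust_cache(url, version):
    """Append a version query parameter so caches treat the asset as new."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["v"] = str(version)          # new cache key on every chart update
    return urlunparse(parts._replace(query=urlencode(query)))

fresh = bust_cache("https://example.com/charts/average-fps.png", 2)
# -> https://example.com/charts/average-fps.png?v=2
```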
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
For game studios, UE is the best engine by far: working with it is extremely productive, it's very intuitive to use, and you can access a huge talent pool across the world. There's an insane amount of pre-made assets that you can buy. UE's full source is on GitHub, with very high code quality; you can change every little bit inside it if you want to. The licensing terms are very reasonable. There's support for DX11, DX12, RT, FSR, DLSS: every interesting tech you can think of.

All that just makes those big developers look even worse. And I know it's the publisher who decides when a game gets released. But none of this changes the fact that most Unreal Engine 4 games run like crap, even on consoles. We definitely can't see any of these benefits in the finished products.

A few Sony first-party games used this engine, and they released in a good state as far as I can remember (the same can't exactly be said of their PC ports). So it can be done if the publisher cares enough.
 
Joined
Feb 21, 2006
Messages
2,240 (0.33/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.12.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Do you think so?
My Nitro plays the game at 3000 MHz; that's an increase of 20% compared to the reference design clock speed of 2400 to 2500 MHz.
My reference model will hit 2.8-3 GHz in certain games; you don't need an AIB model for that.

Just +15% on the power limit.

(screenshot: 1682971334391.png)
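The percentages being argued are easy to check: 3000 MHz against the quoted 2400-2500 MHz reference clocks is a 20-25% clock uplift, which is a separate number from the +15% power-limit slider:

```python
def pct_increase(new, base):
    """Percentage increase of `new` over `base`."""
    return (new - base) / base * 100

# Claimed Nitro boost vs. the reference-design clocks quoted in the post:
low  = pct_increase(3000, 2500)  # vs. the high end of the reference range
high = pct_increase(3000, 2400)  # vs. the low end of the reference range
print(low, high)  # 20.0 25.0
```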
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Works for me. Press F5; your browser cache (or some intermediary cache) might be serving the older charts.
It works now, but pressing F5 earlier didn't work.
 
Joined
Dec 28, 2012
Messages
3,962 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Fair enough on the error I made about the optical drive, but I was comparing the Series S to low-end GPUs, which most definitely wouldn't be playing it at 4K either.

I would probably rather play a game at 720p/1080p for £250 than play at the same resolution for circa £500, or maybe at 1440p/4K for a couple of grand.
It's a fair point.
Also, on the latest AAA titles 12 GB will starve you at middle-of-the-range settings. You also have to remember people don't just buy a GPU for today; they expect it to last at least a few years. It's really just enthusiasts who are prepared to replace their hardware every year.
I disagree with this. TechSpot found little evidence that the 10 GB 3080 was being held back at ultra settings at 4K. I believe they found two games where it happened. Others, despite reporting well north of 10 GB of VRAM in use, still performed fine.

12 GB, on a smaller GPU that won't be rocking ultra maxed-out 4K, isn't going to be a real issue this generation. Consoles have 16 GB, but 10-12 GB at most is used for graphics. Mid-range 12 GB GPUs will be able to handle that until the PS6 comes out in five more years.
I could make the same argument back to you: why does a mid-range or low-end GPU need dedicated RT shaders?
Simple: they don't. However, developing two architectures, one with RT cores and one without, would be prohibitively expensive. RT cores are part of the shader design, not a separate bolted-on piece. For light RT effects, which have already started becoming popular, even low-end cards work decently, and that trend won't change, since light RT hardware is found in both mainstream consoles.

Truth be told, Unreal Engine 5 sucks so far...
Survivor is built on Unreal Engine 4, not 5. Arguably, that is the biggest issue: UE4 is being pushed past its comfort zone, much like the games that tried to push 120+ FPS on Unreal Engine 3.

UE5 may very well fix these issues.
 
Joined
Mar 29, 2023
Messages
1,045 (1.64/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
Hate that they gave in to all the crying about performance, and just nerfed LOD distance across all settings with the patch today...

It shouldn't freaking have such a short LOD distance on Epic settings...

The issue is (as always) that people whose hardware ain't up to the task just use fully maxed-out settings... make it so that the game defaults to medium settings, and then you have to manually choose higher settings...
 

Attachments

  • JediSurvivor_2023_05_01_23_30_48_593.jpg (6 MB)
  • JediSurvivor_2023_05_01_23_30_43_248.jpg (6 MB)
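For anyone wondering what an "LOD distance nerf" means mechanically: a mesh's LOD level is typically chosen by comparing camera distance (or screen-space size) against per-level thresholds, so shortening the thresholds makes lower-detail models swap in closer to the camera. A minimal sketch with invented threshold values (not the game's actual numbers):

```python
import bisect

def select_lod(distance, thresholds):
    """Return the LOD index for a camera distance.
    thresholds[i] is the distance at which LOD i swaps to LOD i+1."""
    return bisect.bisect_right(thresholds, distance)

epic_before = [50.0, 120.0, 300.0]  # invented pre-patch distances (metres)
epic_after  = [30.0,  80.0, 200.0]  # invented post-patch "nerfed" distances

# At 100 m the pre-patch thresholds still draw LOD1; post-patch drops to LOD2.
before = select_lod(100.0, epic_before)  # 1
after  = select_lod(100.0, epic_after)   # 2
```

Pulling every threshold in cuts draw and shading cost at any given view distance, which is why it reads as a performance "fix" while visibly degrading the image at unchanged settings.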
Joined
Dec 28, 2012
Messages
3,962 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Update May 1st: Now that Denuvo's limit of five GPU changes per day has expired, I was able to add more graphics cards: RX 5700 XT, RTX 2080 Ti and Arc A770. No big surprises for any of those cards in terms of performance, but the Intel Arc card displays serious visual corruption during gameplay, even with the 101.4335 beta drivers installed. It's surprising that Intel decided to declare that beta "Game Ready for Jedi Survivor" when there are clearly visible rendering errors. While they're not game-breaking, they're still extremely annoying.
Sounds like an Intel product to me.
For game studios, UE is the best engine by far: working with it is extremely productive, it's very intuitive to use, and you can access a huge talent pool across the world. There's an insane amount of pre-made assets that you can buy. UE's full source is on GitHub, with very high code quality; you can change every little bit inside it if you want to. The licensing terms are very reasonable. There's support for DX11, DX12, RT, FSR, DLSS: every interesting tech you can think of.

I tried building my GPU thermal load test in various engines, including Unity, CryEngine and UE4, and settled on UE4 after a day. SUPER happy with my choice.
The real issue here isn't UE4 itself; it's that devs clearly need to put some time into optimizing their code in UE4, and they are not doing that, because for the last 10 years the answer has been to "throw hardware at the problem". Towards the end of the PS3/360 era that began to change, as the console hardware was not getting better and devs were forced to optimize to compete. If the PS5 era is as long, with a major recession looming, we may see the same thing this generation.

Hate that they gave in to all the crying about performance, and just nerfed LOD distance across all settings...

It shouldn't freaking have such a short LOD distance on Epic settings...




The issue is (as always) that people whose hardware ain't up to the task just use fully maxed-out settings... make it so that the game defaults to medium settings, and then you have to manually choose higher settings...
The issue is that the 4090 should not be required to render a game that runs on an engine that functions on the PS4. The performance on consoles was also shite, with constant drops into the 20 FPS range. Even then the 4090 barely copes at times.

That's utterly unacceptable.
 
Joined
Mar 29, 2023
Messages
1,045 (1.64/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
Sounds like an intel product to me.

The real issue here isn't UE4 itself; it's that devs clearly need to put some time into optimizing their code in UE4, and they are not doing that, because for the last 10 years the answer has been to "throw hardware at the problem". Towards the end of the PS3/360 era that began to change, as the console hardware was not getting better and devs were forced to optimize to compete. If the PS5 era is as long, with a major recession looming, we may see the same thing this generation.


The issue is that the 4090 should not be required to render a game that runs on an engine that functions on the PS4. The performance on consoles was also shite, with constant drops into the 20 FPS range. Even then the 4090 barely copes at times.

That's utterly unacceptable.

And in what world is a 4090 required to run this game...



Answer = IT ISN'T.....

And if you have weaker hardware, don't use max settings... and no, consoles do not use max settings either...
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Hate that they gave in to all the crying about performance, and just nerfed LOD distance across all settings with the patch today...

So it looks worse, but the framerate on both screenshots is the same?
 