
Lords of the Fallen Performance Benchmark

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,538 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Lords of the Fallen has been released, showcasing the incredible power of Unreal Engine 5. This souls-like game offers breathtaking visuals that rank among the most impressive we've seen, but it demands powerful hardware, too. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a wide selection of modern graphics cards.

 
Joined
Jan 5, 2006
Messages
18,585 (2.72/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
So my GPU would do about 19 fps @ 1440p with RT...

Welp, not buying anyway. :D
 

janitorsup

New Member
Joined
Oct 16, 2023
Messages
1 (0.00/day)
Getting these stutters that you can see in Daniel Owen's video at 14:13 a lot... sometimes several in a row, and it can happen even during fights because enemies chase you into transition zones. Also, I'm very CPU limited in the hub area for some reason and get bad frame rates, but only there.

[link removed]

I'm on a 5600X and 7900 XT, and I feel like the 5600X is pretty outdated now, although in Daniel Owen's video the stutter happens even with a 7800X3D.

That said, the game does look amazing and most of the time when I'm within a single area it's very smooth.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,538 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
that you can see in Daniel Owen's video at 14:13 a lot
I ran through this area many times .. kept failing the first boss quite often t.t .. no such stutter here .. just retried .. nothing
 
Joined
Jun 29, 2023
Messages
537 (1.20/day)
Location
Spain
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
ICYMI, the graphs have "Ultraw" and "lowh" written on them, which I doubt was intentional.

As for the performance: jesus fucking christ. The game looks quite average, so why the fuck is performance this abysmal? Have developers suddenly forgotten how to optimize their games? Or is it Unreal Engine? Or do the ultra settings without RT turn on something that tanks performance without any visual return? Because the scalability on display between low and high graphics is impressive (though I'd argue that low should be closer to the game's actual performance, judging by how it looks). Nanite, perhaps?

Another UE5 game, another performance disasterclass. I am shocked that it's this bad on relatively high-end hardware at 1080p.
 

bug

Joined
May 22, 2015
Messages
13,584 (3.99/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
What, another AAA release that does just fine with 8GB VRAM? But, but, but...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,538 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
ICYMI, the graphs have "Ultraw" and "lowh" written on them, which I doubt was intentional.
I did, but was too lazy to fix it because I didn't save the chart inputs and thought "nobody will ever notice" .. fixed
 
Joined
Nov 18, 2010
Messages
7,424 (1.47/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
What, another AAA release that does just fine with 8GB VRAM? But, but, but...

Yeah, but to be honest, some of the textures, like the wood beside the barrels, look like they're from Morrowind.
 
Joined
Feb 11, 2009
Messages
5,507 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Honestly, the future is looking grim... this level of performance for a game that looks virtually no better than Dark Souls 2... Christ, is this engine just super inefficient or what?

Also, isn't it a bad idea to run frame generation in a game like Dark Souls? You wouldn't want the added input latency in competitive games, and I'd think this isn't suited to it for the same reason.
Maybe they're just hoping to dupe people with it to make up for the generally poor performance, I don't know.

EDIT: though I do want to say it does seem to scale decently with settings, so that's fine.
 
Last edited:

bug

Joined
May 22, 2015
Messages
13,584 (3.99/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Since it’s generally accepted on the internet that Radeon drivers don’t work, @Wizzard, how many times did the game crash and display graphical corruption when testing Radeon cards?
Ah, the classic "make up an accusation so you can easily refute it immediately."

And to explain that a bit: if it were known/accepted that the "drivers don't work", nobody would be buying those cards. Really simple.
What is known/accepted is that ATI/AMD drivers were historically more faulty and incurred a higher overhead. What is also known/accepted is that over time they caught up with Nvidia and are now trading blows. The only significant thing I can name about AMD drivers, on the other hand, is that they seem to regress idle power draw every now and then.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,538 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Since it’s generally accepted on the internet that Radeon drivers don’t work, @Wizzard, how many times did the game crash and display graphical corruption when testing Radeon cards?
Didn't encounter any crashes or similar issues on any cards. Check the Steam forums, maybe it's just me
 
Joined
Dec 30, 2021
Messages
381 (0.38/day)
The "more accurate ray tracing methods" descriptions are in reference to Lumen lighting and reflections. Specifically, this game uses software lumen only. Lumen is still considered a form of ray tracing, but it is not hardware accelerated in this game (it doesn't need to be due to the simplified nature of Lumen's ray tracing). Technically, you can even use those settings with GTX graphics cards without too severe of a hit to your frame rate. (edit: i mean, the frame rate will probably be poor in general, but the hit from turning on high GI or reflections won't be heavier than other cards)

If you include this game in the GPU benchmarking suite, I would recommend the ultra preset for GI and reflections for the basic test suite.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,538 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Specifically, this game uses software Lumen only
Do you have a source for that? But I agree with you; also the RX 5700 XT being able to run it, even though UE has various fallbacks.
 
Joined
May 16, 2023
Messages
75 (0.15/day)
The beauty of Unreal Engine 5 is that you don't need to run it at native resolution for it to look good; it's designed this way. The built-in resolution scaler (not DLSS or FSR) works great and looks as good as native down to 60% resolution at 4K. I haven't tried DLSS on my 3080 12GB, but FSR has issues with particles, so I decided to use the built-in scaler. I'm playing on a 6950 XT at 4K with everything except global illumination on ultra, at 65% resolution, and I stay above 60 fps, with the only drops being in the home base area while in Umbral. I think UE5 needs to market its resolution scaling far more; it's quite good.
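For reference, 65% at 4K works out to this internal resolution (simple per-axis math; the scaler's exact rounding may differ slightly):

```python
# 65% screen percentage on a 4K output, computed per axis
w, h, scale = 3840, 2160, 0.65
print(round(w * scale), round(h * scale))  # 2496 1404
```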
 
Joined
Dec 30, 2021
Messages
381 (0.38/day)
Do you have a source for that? But I agree with you; also the RX 5700 XT being able to run it, even though UE has various fallbacks.
I guess I don't; I'm just basing this on my knowledge of how UE5 works from screwing around with it a little, but I'm not an expert. As far as I'm aware, the stock "Global Illumination" setting in UE 5.2 just controls Lumen in its software mode. Specifically, high, ultra, and epic are software Lumen, while low and medium are a very basic fallback. Hardware Lumen is a separate toggle you must enable in the settings, and this game doesn't have that setting anywhere, nor is it in the ini. The game also doesn't have the epic quality option available, though you can do an ini edit to enable it (don't; it performs very badly and barely improves anything).

Software Lumen traces against a signed distance field, while hardware Lumen replaces that with a BVH. Cards like the 5700 XT should choke and fail miserably when trying to run HW Lumen, so if the 5700 XT can run high and ultra quality at the same level of playability as the 6600 XT, it should be SW Lumen.
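If anyone wants to poke at this themselves, these are the relevant console variables in stock UE 5.2; whether this game actually reads them from Engine.ini is an assumption on my part (untested):

```ini
; Hypothetical Engine.ini additions -- stock UE 5.2 cvar names, untested in this game
[SystemSettings]
r.DynamicGlobalIlluminationMethod=1  ; 1 = Lumen GI
r.ReflectionMethod=1                 ; 1 = Lumen reflections
r.Lumen.HardwareRayTracing=1         ; 1 = BVH-based HW Lumen (if supported), 0 = SW distance-field tracing
```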
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,538 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Hardware Lumen is a separate toggle you must enable in the settings, and this game doesn't have that setting anywhere, nor is it in the ini
Could it be that the setting is just a front for multiple settings that get adjusted internally depending on its value?
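Stock UE works like that for the scalability groups at least; one sg.* value fans out into a bunch of cvars defined in the Scalability ini. Roughly like this (illustrative values from stock UE, not this game's shipped config):

```ini
; DefaultScalability.ini-style mapping (illustrative, not LotF's actual settings)
[GlobalIlluminationQuality@2]                 ; what the in-game "High" might map to
r.Lumen.ScreenProbeGather.DownsampleFactor=32
r.Lumen.TraceMeshSDFs=0

[GlobalIlluminationQuality@3]                 ; "Ultra"
r.Lumen.ScreenProbeGather.DownsampleFactor=16
r.Lumen.TraceMeshSDFs=1
```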
 
Joined
Oct 31, 2020
Messages
81 (0.06/day)
Processor 5800X3D
Motherboard ROG Strix X570-F Gaming
Cooling Arctic Liquid Freezer II 280
Memory G Skill F4-3800C14-8GTZN
Video Card(s) PowerColor RX 6900xt Red Devil
Storage Samsung SSD 970 EVO Plus 250GB [232 GB], Samsung SSD 970 EVO Plus 500GB
Display(s) Samsung C32HG7xQQ (DisplayPort)
Case Graphite Series™ 730T Full-Tower Case
Power Supply Corsair RM1000x
Mouse Basillisk X Hyperspeed
Keyboard Blackwidow Ultimate
Software Win 10 Home
Since it’s generally accepted on the internet that Radeon drivers don’t work, @Wizzard, how many times did the game crash and display graphical corruption when testing Radeon cards?
"generally accepted on the internet on the internet that Radeon drivers don’t work"
They write a lot on the internet,the issue is that it is difficult to understand people if they are telling the truth or not,because generally there is a lot of toxicity even in hardware matters,so after much thought I decided that only if I try the product myself will I have a real picture and my experience is as follows,I have 5 PCs with 1080gtx, rtx 2070, 5700xt, 6800 and 6900xt apart from the usual problems that are solved through the drivers,I have not faced anything that would make me definitively choose one company over the other specifically say that I run the 6900xt together with MPT(MorePowerTool) and the card is the same fast in raster as a 3090ti and similar performance in rtx with a 3080 without problems with 24/7 use,that's how stable the AMD drivers are .
 
Joined
Aug 4, 2020
Messages
1,604 (1.06/day)
Location
::1

we need more than 8 GiB of VRAM, they said
 
Joined
Jun 29, 2023
Messages
537 (1.20/day)
Location
Spain
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
The beauty of Unreal Engine 5 is that you don't need to run it at native resolution for it to look good; it's designed this way. The built-in resolution scaler (not DLSS or FSR) works great and looks as good as native down to 60% resolution at 4K. I haven't tried DLSS on my 3080 12GB, but FSR has issues with particles, so I decided to use the built-in scaler. I'm playing on a 6950 XT at 4K with everything except global illumination on ultra, at 65% resolution, and I stay above 60 fps, with the only drops being in the home base area while in Umbral. I think UE5 needs to market its resolution scaling far more; it's quite good.
Now go to lower resolutions, or realize that many people (such as myself) don't like running upscalers, because they can't really offer the sharpness of a native image, which also comes with far less ghosting (if the AA method is not garbage). 60% of 4K upscaled might look nice, but try doing that at 1080p and you will be greeted with horrors unseen; literally, it looks like smeared vaseline. And even on beefy GPUs you can't run at even native 1440p at acceptable frame rates, while contemporary games on proprietary engines can at times look better than most UE5 games and run light years better too. See for example MWIII (a beta!), CS2, Half-Life: Alyx, F1 2023, Control, and RDR2. I'll concede that looks are mostly subjective, but all the games listed run light years faster than most UE5 games while arguably looking better in many cases.
 
Joined
Jun 16, 2023
Messages
114 (0.25/day)
I cannot believe I would barely get 100 fps with a 4090 at 1440p, when I get over 100 fps with a 2070 SUPER at ultra settings + ray tracing in games like Dying Light 2.
So I'm skipping this gen of graphics cards AND this specific game until they learn how to use game engines properly.
 
Joined
May 13, 2008
Messages
762 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
The beauty of Unreal Engine 5 is that you don't need to run it at native resolution for it to look good; it's designed this way. The built-in resolution scaler (not DLSS or FSR) works great and looks as good as native down to 60% resolution at 4K. I haven't tried DLSS on my 3080 12GB, but FSR has issues with particles, so I decided to use the built-in scaler. I'm playing on a 6950 XT at 4K with everything except global illumination on ultra, at 65% resolution, and I stay above 60 fps, with the only drops being in the home base area while in Umbral. I think UE5 needs to market its resolution scaling far more; it's quite good.

It's basically the point I've been trying to make for a good long while: we're headed into the era of FSR/DLSS 'Balanced' (~59% scale) for a lot of the more intensive/UE5 games. I haven't played this title personally, but I agree that the setting generally does not look bad at all IMHO, regardless of upscaling tech, and given how Starfield/UE5 Fortnite were received (I haven't seen anyone complaining about the resolution on XSX), I don't think most people have a problem with it either. It makes sense when you think about it, as the resolutions are 2227x1253 (DLSS) and 2259x1270 (FSR), which are essentially the max of the THX spec regarding viewing distance per size/visual acuity (a 40-degree arc). Or, as I like to put it: when you sit close enough to a TV that it fills your whole vision without needing to move your head, that resolution should still be fine for the vast majority of people.

I also think it's weird that reviewers, if they test FSR/DLSS at all, generally go straight from 'Quality' (66%, which is still often overkill IMHO) to 'Performance' (50%), which can start to look a little questionable. It's almost like the modes match what they say on the tin, but people don't appear to care. Thanks for being a voice of reason.
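If anyone wants to sanity-check those internal resolutions, here's a quick sketch (my own math, using the publicly documented per-axis factors: DLSS Balanced renders at 0.58x, FSR 2 Balanced divides each axis by 1.7; the runtime's exact rounding may differ by a pixel):

```python
# Internal render resolution per upscaler preset (illustrative sketch)
def dlss_balanced(w, h, scale=0.58):   # DLSS Balanced: 0.58x per axis
    return round(w * scale), round(h * scale)

def fsr2_balanced(w, h, ratio=1.7):    # FSR 2 Balanced: 1.7x divisor per axis
    return round(w / ratio), round(h / ratio)

print(dlss_balanced(3840, 2160))  # (2227, 1253)
print(fsr2_balanced(3840, 2160))  # (2259, 1271) -- within a pixel of the 2259x1270 above
```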

I find it amusing that people are using this game as an example of why people don't need more than 8GB of VRAM. First, the texture quality in this title isn't particularly good. Second, let's not act like every game successfully pulls apart the system/VRAM allocation from the shared RAM pool of a console. Ideally, yes, the ~14GB available to a console should generally equal an ~8/6GB split, but that just doesn't always happen; some PC ports currently require 10-12GB, rarely 12+GB, of one or the other if not both. If you want to criticize all of those developers for their PC ports, that's fine, but it doesn't change the reality that it has happened and will keep happening, especially as we move to lower native resolutions on current-gen consoles (not only because UE5 is heavy and/or to preserve the perception of games doing more and looking better on the same hardware, but to capitalize on the potential of a PS5 Pro) or more intensive settings on PC.

The one thing that truly might save you is the fact that the XSS has 10GB of RAM (really 8GB; 2GB is 32-bit) and developers are forced to release a port for it if they join the Xbox ecosystem. That doesn't, however, mean that that console won't eventually fall into 720p territory; it's already sub-900p in a lot of instances, even if you argue 1080p is the general aim. You can fight it all you want, but the reality is that not only is 8GB questionable for 1080p, but 12GB is questionable for 1440p when matching GPU performance to its buffer. There already are, and will be, more instances where a 4070 Ti falls below 1440p/60 simply because of its buffer; that's a card that could seriously use 13-14GB but doesn't have 16GB, because of product segmentation and the art of the upsell to the 4080.

We can agree to disagree, but let's revisit this topic after the release of the PS5 Pro, Blackwell, and 3nm Navi. Heck, maybe after the release of Battlemage/Navi 4x (which will likely match their performance to 16GB exactly and aim for 1440p, eventually relegating the 4080 to the same market). Those cards will almost certainly be 16GB, and the next gen likely 16-32GB. When you stop and realize the next consoles themselves will probably be 32GB, with the older consoles then likely relegated to (1080p-)1440p maximum, it all starts to make sense. I'm not saying you're completely wrong at the moment, but with regard to the fairly immediate and especially the longer-term future (especially when spending $400+ on a video card), I would certainly opt for the larger buffer.

People can choose to believe the 4070 Ti is not planned for obsolescence by having 12GB, or the 4070 by having the performance of a card that only needs ~10-11GB, but those will be the same people Nvidia targets with FOMO, using settings unplayable on those cards, once Blackwell releases. That's just how Nvidia runs the business of selling new video cards, which it's important to understand is incredibly savvy but also incredibly shitty for customers. They know people will forget about this conversation when the next generation releases and 10-12GB cards are old news and/or relegated to 1080p, just as they have done several times before with 3/6GB cards. To bring it full circle: I, and likely other people, will then recommend running those cards at 4K DLSS Balanced, roughly 59% scale (as Colossus recommends), if not 50%/'Performance'/1080p.
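Back-of-the-envelope version of that split, for anyone following along (my own illustration; ~14GB is the commonly cited developer-available pool on current-gen consoles):

```python
# Rough console memory split (illustrative; real titles divide the pool however they like)
usable_pool_gb = 14.0                  # developer-available RAM out of 16GB total
system_share_gb = 6.0                  # game logic, streaming buffers, audio, etc.
vram_like_share_gb = usable_pool_gb - system_share_gb
print(vram_like_share_gb)              # 8.0 -> the "ideal" 8/6 split; ports often miss it
```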
 
Last edited:
Joined
Dec 12, 2016
Messages
1,622 (0.57/day)
Ah, the classic "make up an accusation so you can easily refute it immediately."

And to explain that a bit: if it were known/accepted that the "drivers don't work", nobody would be buying those cards. Really simple.
What is known/accepted is that ATI/AMD drivers were historically more faulty and incurred a higher overhead. What is also known/accepted is that over time they caught up with Nvidia and are now trading blows. The only significant thing I can name about AMD drivers, on the other hand, is that they seem to regress idle power draw every now and then.
That is not the current 'conventional wisdom' at all. The TPU comments on Radeon content are filled with 'I just wish AMD drivers didn't suck' or something similar. I appreciate W1zzard and others reporting in this review that they found no significant problems using Radeon cards.
 