
Fallout 4, 9700K @ 5.1 GHz, 32 GiB DDR4-4000: only a slight FPS difference between a 1080 Ti and a 4090!

Joined
Mar 14, 2008
Messages
511 (0.08/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus TUF RTX 4090
Storage Adata SX8200 PRO 1 and 2 TB, Samsung 960 EVO, Crucial MX300 750GB Limited edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
It would be interesting to see if someone with a DDR5 system could try underclocking their RAM and see what kind of performance decline they experience in Fallout 4 in the downtown Boston area.
My system is DDR5, but only 5200 running at 5400, soooo maybe I should just buy a 7200 kit and see... I've been looking at one for a while... now I have a good reason :D

Remove and install fresh mobo and gpu drivers
Maybe I should try the same; I just swapped my 1080 Ti for the 4090 and installed the newest driver...
 
Joined
Jul 20, 2018
Messages
127 (0.06/day)
System Name Multiple desktop/server builds
Processor Desktops: 13900K, 5800X3D, 12900K | Servers: 2 x 3900X, 2 x 5950X, 3950X, 2950X, 8700K
Motherboard Z690 Apex, X570 Aorus Xtreme, Z690-I Strix
Cooling All watercooled
Memory DDR5-6400C32, DDR4-3600C14, DDR5-6000C36
Video Card(s) 4090 Gaming OC, 4090 TUF OC, 2 x 3090, 2 x 2080Ti, 1080Ti Gaming X EK, 2 x 1070, 2 x 1060
Storage dozens of TBs of SSDs, 112TB NAS, 140TB NAS
Display(s) Odyssey Neo G9, PG35VQ, P75QX-H1
Case Caselabs S8, Enthoo Elite, Meshlicious, Cerberus X, Cerberus, 2 x Velka 7, MM U2-UFO, Define C
Audio Device(s) Schiit Modius + SMSL SP200, Grace DAC + Drop THX AAA, Sony HT-A9, Nakamichi 9.2.4
Power Supply AX1200, Dark Power Pro 12 1500W
Mouse G Pro X Superlight Black + White
Keyboard Wooting 60HE, Moonlander
VR HMD Index, Oculus CV1
The 9700K shouldn't have issues. Both the games the OP mentioned (GTA V and Fallout 4) were built to run on Jaguar cores. The 9700K is more than capable of running both at 100+ FPS.
100 FPS where? Fallout 4 becomes insanely CPU/memory bandwidth limited as you have more objects and terrain on screen. Just because it's built to run on jaguar cores doesn't mean it scales well. Even 100 FPS at 4K isn't anywhere near taking full advantage of a 4090.

My 13900K boosting to 5.7GHz can only get 83 FPS here on top of the Corvega plant.
Fallout 4 Screenshot 2023.04.19 - 16.43.00.92.png

If I turn 90 degrees to the right I hit my vsync cap at 240 FPS
Fallout 4 Screenshot 2023.04.19 - 16.43.20.18.png
 
Joined
Sep 10, 2018
Messages
6,786 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
100 FPS where? Fallout 4 becomes insanely CPU/memory bandwidth limited as you have more objects and terrain on screen. Just because it's built to run on jaguar cores doesn't mean it scales well. Even 100 FPS at 4K isn't anywhere near taking full advantage of a 4090.

My 13900K boosting to 5.7GHz can only get 83 FPS here on top of the Corvega plant.
View attachment 292257
If I turn 90 degrees to the right I hit my vsync cap at 240 FPS
View attachment 292258

Yeah, I see mid-to-low 70s in spots with a 5950X.
 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
The Corvega plant is on the outskirts of downtown Boston though; the place where I was measuring FPS is on top of a skyscraper in the middle of downtown Boston, near the river (the Potomac?). The skyscraper is the one where you save a guy from super mutants (I think it has a radio on top as well, which is how he was sending out his distress call).
 
Joined
Dec 15, 2022
Messages
26 (0.04/day)
FO4 is heavily limited by a single CPU thread, and the Windows thread scheduler just bounces that thread around the CPU like mad. 100% usage on a single core shows up as 12.5% load across all cores if you have 8 logical cores in total, or 6.25% with 16.
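If you want to see that for yourself while the game is running, here's a minimal Python sketch; it assumes the third-party psutil package is installed and nothing else heavy is running in the background:

```python
# Minimal sketch: a single saturated core averages out to ~12.5% (8 logical cores)
# or ~6.25% (16) overall, so look at per-core load instead of the total.
import psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one 1-second sample per logical core
overall = sum(per_core) / len(per_core)

print(f"overall CPU load: {overall:.1f}%")
for idx, load in enumerate(per_core):
    if load > 90:
        print(f"core {idx} is pegged at {load:.0f}% -- likely the game's main thread")
```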

Looking at an area with lots of buildings results in tons of draw calls, which craters performance. This is why downtown Boston has always run pretty badly. You would be better off setting an FPS cap at 60-90 and leaving it alone, partly because, last I checked, physics and other things are tied to the game's FPS.

DXVK helps because Vulkan is generally better at handling draw calls than DX11.
 
Joined
Mar 14, 2008
Messages
511 (0.08/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus TUF RTX 4090
Storage Adata SX8200 PRO 1 and 2 TB, Samsung 960 EVO, Crucial MX300 750GB Limited edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
82-93 FPS here, but it drops to 75 looking down at the houses, and nothing above 93.

1682008220541.png
 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
@JalleR
In downtown Boston proper the FPS plummets even more. Is the railgun available in vanilla FO4 or is that something you get in the DLC content?
 
Joined
Sep 17, 2014
Messages
22,253 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Doubt it; my 1080 Ti and 6700K back in the day pushed much higher FPS in GTA V. For now it's unknown what settings are enabled in GTA V; the OP didn't provide that information.
Perhaps the difference between older cards and the 4090 is that Nvidia's drivers are no longer as efficient as they used to be: more CPU overhead.

If the game is already CPU limited, that will make itself known.

But yeah, I remember the same from FO4. It's just a POS engine; anything in and around the city, or with lots of assets in view, is going to take a massive FPS hit.
 
Joined
Mar 14, 2008
Messages
511 (0.08/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus TUF RTX 4090
Storage Adata SX8200 PRO 1 and 2 TB, Samsung 960 EVO, Crucial MX300 750GB Limited edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
@JalleR
In downtown Boston proper the FPS plummets even more. Is the railgun available in vanilla FO4 or is that something you get in the DLC content?
I have all the DLC soooooooooo that is a good question :)
 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
I'm thinking that Fallout 4 might be less CPU limited and more memory bandwidth limited -- it was designed to run on consoles running GDDR5 and a 256-bit memory data bus.
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,808 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
I'm thinking that Fallout 4 might be less CPU limited and more memory bandwidth limited -- it was designed to run on consoles running GDDR5 and a 256-bit memory data bus.
It's heavily thread limited and the engine is old; about 20% of the code base is still from the NetImmerse engine that was used for Morrowind back in 2002. It's not well optimized compared to modern engines. It's also not meant to be run above 60 FPS due to physics: higher frame rates fuck with the game's physics to a high degree. You can run it at a higher frame rate, but it can cause seriously wonky stuff to happen. That said, the game will scale with clock speed, not thread count or bandwidth. The only way to improve FPS in certain situations is pure brute force, since one thread causes a bottleneck for everything else.
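If you want a quick, read-only way to check that your own install still has the engine-safe frame cap in place, here's a rough Python sketch. The iPresentInterval key and the Fallout4Prefs.ini location are the commonly cited ones rather than something re-verified here, so treat both as assumptions and adjust for your setup:

```python
# Rough sketch: report whether the vsync/frame-cap setting is still present,
# since uncapped frame rates can break this engine's physics and scripting.
# The key name and path are assumptions; adjust if your Documents folder is
# redirected (e.g. OneDrive).
from pathlib import Path

prefs = Path.home() / "Documents" / "My Games" / "Fallout4" / "Fallout4Prefs.ini"

if not prefs.exists():
    print(f"prefs file not found at {prefs} -- adjust the path for your install")
else:
    found = False
    for line in prefs.read_text(encoding="utf-8", errors="ignore").splitlines():
        if line.strip().lower().startswith("ipresentinterval"):
            print(line.strip())  # iPresentInterval=1 keeps the engine-safe cap; 0 uncaps it
            found = True
            break
    if not found:
        print("iPresentInterval not found -- leave the default (capped) rather than forcing it off")
```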
 
Joined
Sep 17, 2014
Messages
22,253 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm thinking that Fallout 4 might be less CPU limited and more memory bandwidth limited -- it was designed to run on consoles running GDDR5 and a 256-bit memory data bus.
Nah, it's just a shit engine; the consoles target 30 FPS, and on launch they couldn't even keep that up.
 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
It's heavily thread limited and the engine is old; about 20% of the code base is still from the NetImmerse engine that was used for Morrowind back in 2002. It's not well optimized compared to modern engines. It's also not meant to be run above 60 FPS due to physics: higher frame rates fuck with the game's physics to a high degree. You can run it at a higher frame rate, but it can cause seriously wonky stuff to happen. That said, the game will scale with clock speed, not thread count or bandwidth. The only way to improve FPS in certain situations is pure brute force, since one thread causes a bottleneck for everything else.
It's already been proven Fallout 4 can benefit from higher frequency RAM.
 
Joined
Sep 10, 2018
Messages
6,786 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
It's already been proven Fallout 4 can benefit from higher frequency RAM.

It still requires both; I'm pretty sure that if you had 8700K-like single-threaded performance but 8000 MT/s low-latency DDR5, performance would still be crap. Better, but still.
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,808 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
It's already been proven Fallout 4 can benefit from higher frequency RAM.
That was back in the DDR3 era. With DDR4 it stopped scaling around DDR4-3000 on my 8700K: between the 3000 MHz I run now and 3666 MHz there is zero FPS difference, and it still dips to the same lows. It did scale quite well with memory back in the day, but memory bandwidth on modern systems is high enough that it's back to being thread limited. Going from 5 GHz on my 8700K down to 4.6 GHz had a bigger impact than the RAM. So yes, it scales with memory to a certain point, but not so much now.

Latency can have an impact, as can L3 cache; in fact L3 cache is likely the biggest factor, since anything that doesn't fit into cache gets pushed out to system memory, so Ryzen X3D chips will likely see bigger gains. But the biggest benefit comes from using a core-affinity app and setting the game to run on a fixed set of cores with no HT; it tends to smooth things out and stops the game from swapping between so many cores when it's single-thread bound anyway. So while it will distribute load, the one important thread is the main culprit. If you can't scale clock speed then scaling bandwidth helps, but suffice it to say the game is still a buggy mess on ancient code.
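For anyone who wants to try the core-affinity trick without a dedicated app, here's a rough Python sketch of the idea. It assumes Windows with HT enabled (so even-numbered logical cores map to separate physical cores), the psutil package installed, and that the process is called Fallout4.exe; adjust any of those for your setup:

```python
# Rough sketch of the core-affinity idea: pin the game to one logical core per
# physical core so the main thread stops hopping between HT siblings.
# Assumptions: Windows with HT/SMT on, psutil installed, process name as below.
import psutil

TARGET = "Fallout4.exe"  # assumed process name; adjust for your install
physical_only = list(range(0, psutil.cpu_count(logical=True), 2))

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(physical_only)  # restrict scheduling to these cores only
        print(f"pinned {TARGET} (pid {proc.pid}) to cores {physical_only}")
        break
else:
    print(f"{TARGET} is not running")
```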

As it is, with the 7950X3D at the same settings etc. vs a 7950X with higher clocks but the same memory, the extra cache shows a 20-30% improvement in performance. The game wants to nom nom on more L3 cache to keep the CPU thread properly fed with data. Faster memory helps but will never compensate for that.
 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
That was back in the DDR3 era. With DDR4 it stopped scaling around DDR4-3000 on my 8700K: between the 3000 MHz I run now and 3666 MHz there is zero FPS difference, and it still dips to the same lows. It did scale quite well with memory back in the day, but memory bandwidth on modern systems is high enough that it's back to being thread limited. Going from 5 GHz on my 8700K down to 4.6 GHz had a bigger impact than the RAM. So yes, it scales with memory to a certain point, but not so much now.

Latency can have an impact, as can L3 cache; in fact L3 cache is likely the biggest factor, since anything that doesn't fit into cache gets pushed out to system memory, so Ryzen X3D chips will likely see bigger gains. But the biggest benefit comes from using a core-affinity app and setting the game to run on a fixed set of cores with no HT; it tends to smooth things out and stops the game from swapping between so many cores when it's single-thread bound anyway. So while it will distribute load, the one important thread is the main culprit. If you can't scale clock speed then scaling bandwidth helps, but suffice it to say the game is still a buggy mess on ancient code.

As it is, with the 7950X3D at the same settings etc. vs a 7950X with higher clocks but the same memory, the extra cache shows a 20-30% improvement in performance. The game wants to nom nom on more L3 cache to keep the CPU thread properly fed with data. Faster memory helps but will never compensate for that.
You're wrong again:
https://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page3.html

Furthermore, this was on a Skylake 6700 at 4.5 GHz.
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,808 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
Well, you have all the answers then. I suggest buying the fastest memory kit you can possibly get to work with your system, spending way more money than necessary, and then dealing with the headache of tweaking all the sub-timings till it's absolutely perfect. Oh, but sadly any slightly more modern system with more L3 will outperform it. You have a massively expensive GPU bottlenecked by your CPU's lack of L3 cache in one game (in the other 99% of games it's likely perfectly fine).

Do whatever makes you happy :toast:
 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
Well, you have all the answers then. I suggest buying the fastest memory kit you can possibly get to work with your system, spending way more money than necessary, and then dealing with the headache of tweaking all the sub-timings till it's absolutely perfect. Oh, but sadly any slightly more modern system with more L3 will outperform it. You have a massively expensive GPU bottlenecked by your CPU's lack of L3 cache in one game (in the other 99% of games it's likely perfectly fine).

Do whatever makes you happy :toast:
Let's see some proof.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,830 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) Odyssey OLED G9 (G95SC)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Modi+ & Valhalla 2
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
Well, you have all the answers then. I suggest buying the fastest memory kit you can possibly get to work with your system, spending way more money than necessary, and then dealing with the headache of tweaking all the sub-timings till it's absolutely perfect. Oh, but sadly any slightly more modern system with more L3 will outperform it. You have a massively expensive GPU bottlenecked by your CPU's lack of L3 cache in one game (in the other 99% of games it's likely perfectly fine).

Do whatever makes you happy :toast:

Between this thread.

This one: https://www.techpowerup.com/forums/...detect-anything-specific.307618/#post-4998832

and

This one: https://www.techpowerup.com/forums/threads/bad-perf-on-4070ti-in-mw2.307319/#post-4994231

I am starting to wonder if the 3xxx series was just the performance limit of what CPUs in the overall league (comparatively) could handle. It looks like jumping to the 4xxx series is really starting to show these systems' weak points. I just wonder if no one has caught on because of the slow uptake of high-end GPUs from Nvidia and AMD. Of course that's just conjecture, since I need to look at graphs again, but I would not be surprised (and am starting to see a trend) to find CPU bottlenecks from CPUs of similar performance to those shown (since at the time there was not much generational change).

I think people will bitch about drivers a little longer, and the slow uptake in sales due to pricing will keep it from showing its face clearly for a little longer, but I am starting to think there is a shift in the sands where CPUs that "were fine" all of a sudden aren't now.
 
Joined
May 30, 2018
Messages
1,890 (0.81/day)
Location
Cusp Of Mania, FL
Processor Ryzen 9 3900X
Motherboard Asus ROG Strix X370-F
Cooling Dark Rock 4, 3x Corsair ML140 front intake, 1x rear exhaust
Memory 2x8GB TridentZ RGB [3600Mhz CL16]
Video Card(s) EVGA 3060ti FTW3 Ultra Gaming
Storage 970 EVO 500GB nvme, 860 EVO 250GB SATA, Seagate Barracuda 1TB + 4TB HDDs
Display(s) 27" MSI G27C4 FHD 165hz
Case NZXT H710
Audio Device(s) Modi Multibit, Vali 2, Shortest Way 51+ - LSR 305's, Focal Clear, HD6xx, HE5xx, LCD-2 Classic
Power Supply Corsair RM650x v2
Mouse iunno whatever cheap crap logitech *clutches Xbox 360 controller security blanket*
Keyboard HyperX Alloy Pro
Software Windows 10 Pro
Benchmark Scores ask your mother
CPU/memory performance aside, I saw somebody bring up the Boston areas....

All of the city areas in FO4 are woefully unoptimized. Well... that's not fair. They are just poorly optimized. Basically, all of the static objects in a given cell are combined into one big monster mesh. If it's not going to move or react to collision other than blocking actors/projectiles/dynamic objects, then it's more efficient to combine the meshes. Every individual mesh constitutes at least one extra drawcall... and realistically, several more, because each shader/material property adds another pass that your CPU must specify - and even FO4 uses things like shadow maps, normal bumpmaps, specular maps, etc. So each mesh you combine into one chunk, saves several drawcalls. When you have hundreds or thousands of meshes, you can cut down 4-5 digit amounts of drawcalls to ultimately produce the same thing by combining as many as you can into one. Thing is, they didn't do a great job dividing up the actual cells in the region for that, so it's always dealing with too many vertices. It's tied to how the game decides what to render, and what doesn't need rendering. The creation engine uses occlusion culling to work this out. Essentially, occlusion data from those big monster meshes, factored against player orientation, determine what parts of the mesh actually get the textures drawn on, have shaders applied, etc.

The issue is, a lot of the cells with these precombined 'master' meshes are simply too big, or shaped in a way where you will just be triggering double, triple, even quadruple the drawcalls of any other area. A big part of the problem is how the game decides when to apply occlusion culling - it wastes a lot of resources drawing things you can't actually see because it doesn't start culling until you are fairly close to an occluding object. Becomes a huge problem when you're in a city PACKED full of polygons and high-altitude points, especially when those cells are chonkin. Say you're standing on a tall building, looking on at rows of tall buildings out on the ground below. From your position, a lot of what's on the ground behind those buildings is blocked by them... but most of it is still being rendered. And a big part of why it does that, is because they're in the same cell, and it's thusly still tracking the occlusion data for all of that stuff over there same as anything in your immediate vicinity. On the ground, it's marginally better because there are enough objects in front of you to stop objects from rendering as far back in the cell. But you STILL almost have to be staring straight at a very close wall for the engine to stop caring what's behind it in some cases. Till then, it may still be fumbling around with geometry and shaders back behind it. To compound it further, there are still points where too many big cells converge, creating these death zones in the city where drawcalls spike even more as it begins to map textures to the polygons of multiple huge cells concurrently. You also run into points where you are rapidly entering and leaving multiple cells, forcing a lot of stuff to load and unload, bringing storage bottlenecks into the equation. We are talkin exponential increases in resource demands on multiple fronts. Even if your GPU could easily handle the massive amounts of drawcalls (which I think the best ones easily could,) I doubt any CPU out there can give that many quickly enough to keep everything feeding smoothly.

And understand ONE thing about drawcalls. They're CPU intensive, because a drawcall is essentially the CPU telling the GPU what to draw. Modern games, with all of their complexity, environmental destruction, dynamic meshes... very rarely surpass 5000 drawcalls. Hell, some of them don't even pass 1000! Maybe when they were still working on the game, pre-optimization it did 5-6000. Meanwhile, FO4's Boston hits 5 digits in some places. 5-digit drawcalls are a horrendous thing, and FO4 has no excuse for it. There are better looking, more intricate games sitting right next to it that don't use half the drawcalls FO4 does in its lightest areas. I just really want to emphasize how cavalier the game truly is when it comes to what it expects from a CPU - it's often more than double what any decently-optimized game could ever require. Actual orders of magnitude more than just about any open world game you could think to name. It's utter insanity. No matter what hardware you have, it will always underperform to some degree, because what it wants CPUs to chew through in dense areas is idiotically high, like Bethesda was when they put it out in this state.
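To put rough numbers on the precombine argument above, here's a back-of-the-envelope Python sketch; the mesh, pass, and chunk counts are made up for illustration, not measured from the game:

```python
# Back-of-the-envelope sketch: draw calls scale roughly with
# (visible meshes x render passes), so merging static meshes into big
# precombined chunks collapses the count. All numbers are illustrative.
meshes_in_view = 3000     # hypothetical dense downtown cell, loose meshes
passes_per_mesh = 4       # e.g. base colour, shadow map, normal/specular passes
precombined_chunks = 40   # hypothetical: the same geometry merged into big chunks

loose = meshes_in_view * passes_per_mesh
combined = precombined_chunks * passes_per_mesh

print(f"loose meshes: ~{loose:,} draw calls per frame")
print(f"precombined:  ~{combined:,} draw calls per frame")
print(f"that's roughly {loose // combined}x fewer CPU-side submissions for the same scene")
```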

We could talk more about the cell system too. The entire map is split into cells that basically determine how/when to actively stream in what objects/textures, versus what will just be 2d (or false-3D) LODs from pregenerated billboard textures. It also determines when scripted things actually run. FO4 is a bit odd in that it keeps track of MANY more scripted variables than most other games. These variables pertain to things like where each NPC is in the game at such and such time of whichever particular day or point in the game, what they're doing, quest states, radiant/emergent events scattered everywhere across the map. And as you play, it must track more. When you are far enough away from any object bound by the states of these variables, they stop being 'active' in the sense that the game will not calculate what they are doing in each frame - instead, it will run a 'fast-track' script simulating all of the things that would've happened in the time before you got close enough to make them active again. A lot of them get resolved in the loading screens. But for things closer, the game is calculating every step every enemy and NPC makes, one frame at a time. This alone can actually lag your game - too many scripted mods cause the script system to queue, which forces the render pipeline to slow to however many frames it can process per second. No amount of CPU power fixes that; the scripting system has a finite amount of things it can handle in one frame before it has to start dragging them out and racking up tasks. The speed it goes at is frametime-limited, never requiring much actual power to work - but no way around the time it takes to complete sequences of tasks. The city is very 'busy' and shows the weaknesses in Bethesda's methods. It goes by cells - the game will keep all of the dynamic script-related things within a circle of proximal cells active. And there's more of that in downtown than anywhere else in the game, by a whooolllleeee lot. You can actually change how many cells it extends out to, but increasing or decreasing it will desync scripts and depending on where/how you travel, can break your game about as badly as it can be broken.

Boston is notoriously bad. It's been everyone's problem for years, with a lot said about how/why, but not many great solutions. There was a time when the BEST possible configs still couldn't take you much over 30FPS in parts of it. And yeah, CPU does have a lot to do with it. The lag spike comes from dealing with say... 5000 drawcalls max for a given frame to say... 20 or 30 thousand. And then you factor in the amount of script actors/flags present in the area. Yeah, it's a CPU killer. It's also worth noting, FO4 is very poorly optimized for MT performance, so utilization is bad on those. The memory optimization is equally poor. You may actually look into downloading ENB. Even if you're not going to use the effects, it has a memory fix that you can set up, which helps significantly with overall stream-loading performance. Faster, higher bandwidth memory probably helps for the same reason. ENB just changes the allocations around, lets you better match how much RAM gets used for each MB of VRAM. It tries to make the game use your available VRAM more effectively. Similarly, any mods that reduce memory load, such as optimized textures, help a lot. Another big performance killer on the same front as the things I'm talking about... is grass. Reduce grass density/distance, and you can easily gain dozens of frames in really heavy areas. Grass eats drawcalls and memory for breakfast in FO4. Maybe that's the grass they were smoking when they put together this busted optimization.

As far as I know, there's no concrete fix for that kind of lag, though. The engine really is just that terribly inefficient, in its bones. It has a blood disorder from bad bone marrow. Pretty much every system has SOME hiccup with it. The engine really wasn't built out to handle the upped polygon/script counts of FO4. They majorly overexerted it, and then took the fastest possible route to optimizing it just well enough. The whole thing needs a major overhaul before it can ever hope to run well imo. Getting high FPS absolutely everywhere is gonna take some modding. And the thing is... if you change any static objects in a cell, you break that precombined optimization and draw calls can easily get over 40k, at which point you might just get medusa'd. It has to be reconstructed manually after changes are made. So it would be a very tedious job for a modder to fully fix. Especially since any other mods changing the area break it again, and whole new precombined meshes need to be made for every possible combo of worldspace-altering mods. Over the years, modders have focused heavily on learning to change things without messing with them in the first place. You can break your whole save just by going to the in-game console and deleting the wrong tree... disabling that optimization for that whole area and causing unstable behavior that corrupts savedata. I'm tellin you man! Shit's mad busted. Icarus'd the crap outta that thing. She's a lil scorched up from that trip to the sun they took when they designed downtown Boston.



EDIT: Something very important I forgot to mention: try installing F4SE. It's harmless and completely reversible. What it's really meant to do, is add functions to the engine's scripting system, but it also beefs the whole thing up so that it can handle more scripts, as you would need for modding. IIRC, vanilla FO4 has scripting issues, and there are cases where scripts don't resolve correctly, causing save bloat. The unresolved data hogs headroom, which can become an issue in places like boston. You'll notice increasing lag over time, and load times go wayy up, to the point where it will start hanging on load. Maybe you do some quest, or fast-travel between a certain sequence of locations... and from then on your game runs like garbage. F4SE makes it a lot harder for vanilla to get overburdened with script junk. There are also plugins for F4SE that streamline it and make it even more powerful in various ways, too. It's not only for installing advanced mods. You can use it to make the game run better, too. Though the unofficial patch mod corrects a lot of scripting errors that cause slowdown in the vanilla game to begin with. @evelynmarie made some great suggestions to raise the performance floor so that even the worst-performing areas at least perform at their best. Like, I personally can at least get over 60FPS in Boston with a 3900x and a 3060ti. And that's with visual mods. But I also have all of the optimization mods running, tweaked memory performance, changed how draw distance works in the ini (there are texture/mip related settings that aren't covered by 'draw distance' settings,) all sorts of balancing of different render relating settings, F4SE performance plugins, so on. But I DID have to pick at it from different angles over a very long time, and I still don't know what exactly makes it run decently on my setup.
 
Joined
Nov 11, 2013
Messages
43 (0.01/day)
Location
Canada
System Name Lovelace
Processor 13th Gen Intel Core i7-13700K
Motherboard ASUS ROG STRIX Z790-E GAMING WiFi @ BIOS 2503
Cooling EK Nucleus 360
Memory 32GB G.Skill Trident Z5 RGB Series RAM @ 7200 MHz
Video Card(s) ASUS TUF Gaming Radeon™ RX 7900 XTX OC Edition
Storage WD_BLACK SN850 1 TB, SN850X 2 TB, SN850X 4 TB
Display(s) TCL 55R617 (2018)
Case Fractal Design Torrent (White)
Audio Device(s) Schiit Magni Heretic & Modi+ / Philips Fidelio X2HR + Sennheiser HD 600 & HD 650
Power Supply Corsair RM850x Power Supply (2021)
Mouse Razer Quartz Viper Ultimate
Keyboard Razer Quartz Blackwidow V3
Software Windows 11 Professional 64-bit
I strongly recommend modding Fallout 4 with optimized textures (Workbase, Targeted Textures, and Previsibines Repair Pack version 69.3) and the Unofficial Fallout 4 Patch at least. Playing FO4 without those mods will severely hold the game back in terms of potential performance due to the absolutely abysmal optimization Bethesda did when they released the game back in 2015.

There's also a dedicated modding guide specifically for improving the overall gameplay and performance of Fallout 4 called The Midnight Ride. It's a great starter guide for modding Fallout 4 to improve its performance to the point where the game will run at 60 FPS (at 1080p) no sweat, even on a Ryzen 5 1600 / GTX 970 rig, and it will probably make the game scale better on better hardware as well. DXVK does exist, but it isn't the be-all and end-all in terms of improving performance, and in some games it can actually make performance worse.

Without anything, yes, Fallout 4 does run like garbage, but that's honestly just because of how badly Bethesda optimized the vanilla game.

The Midnight Ride can be found by going here. If you're wary of clicking shortened links, just hover over the link and your browser will show you the hyperlink, of course.
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,808 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
As can be seen here, while memory can have an impact, the one thing all of these CPUs have is more L3 cache. You will notice that regardless of CPU, the Ryzen X3D chips easily take the top spots, and even the older 5800X3D does quite well; and while some of those CPUs have memory pushed to the max, even the chips running more mundane memory still perform more than 2x what your CPU is capable of.

In Fallout 4, L3 cache is the biggest limit to performance. It's fairly evident when the only non-X3D chip to break the top 20 is a 7950X at 6 GHz on a single CCD. The L3 cache is where the draw calls / instructions are held, and once you run out of space in the L3 it falls back to system RAM. So while faster RAM can boost performance, the sad fact is that an older CPU with less L3, regardless of your memory, is still going to get absolutely slaughtered by a high-density-L3-cache-equipped CPU. The best 9900K system pushed to its limit is still 30% slower than a 12th-gen CPU with the same GPU. A stock 5800X3D can push nearly 50% more frames, yet in terms of single-core performance the 5800X3D is only about 14% better in a single-core benchmark, meaning the remaining ~36% of the performance gain is mostly from the L3 cache.
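Just to spell that arithmetic out, here's a tiny Python sketch using the post's own rough numbers; the multiplicative split is only an alternative way to slice the same figures:

```python
# Rough arithmetic only -- the percentages are the estimates quoted above,
# not re-measured results.
total_gain = 1.50        # ~50% more frames for a stock 5800X3D vs the best 9900K run
single_core_gain = 1.14  # ~14% better single-core benchmark result

additive_share = (total_gain - 1) - (single_core_gain - 1)  # the "~36%" framing above
multiplicative_share = total_gain / single_core_gain - 1    # same idea expressed as a ratio

print(f"uplift not explained by raw core speed: ~{additive_share:.0%} (additive) "
      f"or ~{multiplicative_share:.0%} (as a ratio) -- attributed above mostly to L3 cache")
```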

7950X vs 7950X3D, there is a 50% performance delta because of the L3 cache. None of these are apples-to-apples comparisons, but the data is there: if you want Fallout 4 to run well, you need at least 20-25 MB of L3 cache on Intel CPUs to avoid the worst of the bottlenecks, and even then the bottleneck still exists. It doesn't disappear even at extremes; if it did, the 7950X wouldn't get stomped by a 7950X3D (64 MB vs 128 MB). Also, the 7000 series is faster than the 5800X3D, yet the 5800X3D manages to beat the 7950X as well.

Also worth noting is that AMD GPUs will typically perform worse than they should due to the AMD driver having more overhead in DX11. While AMD has improved this, it doesn't change the fact that once you overcome the L3 cache bottleneck, AMD's driver can still hamstring performance here, as seen by the fact that a 7800X3D with a 7900 XTX will get demolished by even the ancient 980 Ti. The game is no longer GPU limited with any semi-current hardware; it is 100% L3 cache limited. Again, higher-speed memory only helps because the chips are running out of L3 for the instructions. More L3 limits the gains from high-speed memory.

ACCXwFb.png
 
Joined
May 30, 2018
Messages
1,890 (0.81/day)
Location
Cusp Of Mania, FL
Processor Ryzen 9 3900X
Motherboard Asus ROG Strix X370-F
Cooling Dark Rock 4, 3x Corsair ML140 front intake, 1x rear exhaust
Memory 2x8GB TridentZ RGB [3600Mhz CL16]
Video Card(s) EVGA 3060ti FTW3 Ultra Gaming
Storage 970 EVO 500GB nvme, 860 EVO 250GB SATA, Seagate Barracuda 1TB + 4TB HDDs
Display(s) 27" MSI G27C4 FHD 165hz
Case NZXT H710
Audio Device(s) Modi Multibit, Vali 2, Shortest Way 51+ - LSR 305's, Focal Clear, HD6xx, HE5xx, LCD-2 Classic
Power Supply Corsair RM650x v2
Mouse iunno whatever cheap crap logitech *clutches Xbox 360 controller security blanket*
Keyboard HyperX Alloy Pro
Software Windows 10 Pro
Benchmark Scores ask your mother
As can be seen here, while memory can have an impact, the one thing all of these CPUs have is more L3 cache. You will notice that regardless of CPU, the Ryzen X3D chips easily take the top spots, and even the older 5800X3D does quite well; and while some of those CPUs have memory pushed to the max, even the chips running more mundane memory still perform more than 2x what your CPU is capable of.

In Fallout 4, L3 cache is the biggest limit to performance. It's fairly evident when the only non-X3D chip to break the top 20 is a 7950X at 6 GHz on a single CCD. The L3 cache is where the draw calls / instructions are held, and once you run out of space in the L3 it falls back to system RAM. So while faster RAM can boost performance, the sad fact is that an older CPU with less L3, regardless of your memory, is still going to get absolutely slaughtered by a high-density-L3-cache-equipped CPU. The best 9900K system pushed to its limit is still 30% slower than a 12th-gen CPU with the same GPU. A stock 5800X3D can push nearly 50% more frames, yet in terms of single-core performance the 5800X3D is only about 14% better in a single-core benchmark, meaning the remaining ~36% of the performance gain is mostly from the L3 cache.

7950X vs 7950X3D, there is a 50% performance delta because of the L3 cache. None of these are apples-to-apples comparisons, but the data is there: if you want Fallout 4 to run well, you need at least 20-25 MB of L3 cache on Intel CPUs to avoid the worst of the bottlenecks, and even then the bottleneck still exists. It doesn't disappear even at extremes; if it did, the 7950X wouldn't get stomped by a 7950X3D (64 MB vs 128 MB). Also, the 7000 series is faster than the 5800X3D, yet the 5800X3D manages to beat the 7950X as well.

Also worth noting is that AMD GPUs will typically perform worse than they should due to the AMD driver having more overhead in DX11. While AMD has improved this, it doesn't change the fact that once you overcome the L3 cache bottleneck, AMD's driver can still hamstring performance here, as seen by the fact that a 7800X3D with a 7900 XTX will get demolished by even the ancient 980 Ti. The game is no longer GPU limited with any semi-current hardware; it is 100% L3 cache limited. Again, higher-speed memory only helps because the chips are running out of L3 for the instructions. More L3 limits the gains from high-speed memory.

View attachment 292710
Makes perfect sense to me. FO4 is a drawcall freak. Thanks for the tasty data. Very insightful.

Also just wanna say, in spite of the L3 differences, none of these CPU's should be bottlenecking so badly there. The whole reason it's a problem in the first place is because FO4 is obscenely wasteful when it comes to drawcall distribution. To this day, I struggle to understand how they thought that was okay. I mean, you'd think modern games would be beating it out, as modern hardware generally has more headroom for drawcalls. And then you have FO4, asking for 4x more drawcalls for 50% less to actually happen on the friggin screen. I don't think even Cyberpunk at its worst was asking for as many as downtown Boston.

They really pushed that engine to its absolute limits. I love pointing this out far too much, but ever notice how almost no trees in the open world have leaves on them outside of that small, single-cell pre-war area where the prologue takes place? And yet, what do you see everywhere in the post-war worldspace but leaf piles? Sure, there's fallout abound, but it's over two centuries old fallout, and chernobyl will tell you that extreme radiation does not prevent craaazzzzyyy overgrowth. And the trees DO produce leaves for you to see... on the ground.

Personally, I believe that they would've wanted to have leaves on the trees if they could have. The game starts your postwar travels in the same season as when the bombs dropped, Fall. In that region, the leaves of many trees turn in the fall. I think this is just a convenience, because it's the only reasonable explanation for there being no leaves on the tree branches. And that's not great, as it's not like all trees just shed their leaves at that time. We can't say they're just dead because there are leaf piles, grasses, shrubs, all sorts of other smaller, simpler foliage. There's that, and then there's the fact that it's basically inconceivable that any system available at the time could've come close to handling the leaves and wind animations with how gluttonous the engine already is. Hell, it took modders a good 6-7 years how to figure out solid ways to do it. Everyone wanted the trees to have leaves - it's so bare and stale without them. But for the longest time, nobody could do it without DESTROYING performance. The standard advice was to disable previs/precombine optimizations when running them - I think parts of the mods wouldn't even appear if you didn't. They were breaking that on cells all over the map, knowingly and intentionally undoing the one thing that makes the game run playably. And nobody fully grasped the side-effects that would have. People didn't get that they were straight up breaking their games with trees, zapping frame rates in catastrophic, virtually unmitigatable ways, and even compromising their saves. The first couple waves of the most popular, most listed by game journalists, supposedly safe tree mods are pretty much all defunct now AFAIK, because they're basically unfixable and the methods used are fundamentally reviled by the engine itself. Mind you, droves of people still put up with it and *tried* to make it work, that was how important they were. I encountered people so far in denial about it they'd basically attack you for suggesting their favorite tree mod was the reason for all of the huge problems they were having. Just gotta have *something* there in all of that negative space.

There are now several known ways to add leaves/wind without breaking any of the optimization... but they're pretty damned heavy, and really only possible with present-day GPUs. And it was years of trial and error with nobody in multiple huge, interconnected communities coming close to having fully viable and stable live trees in FO4. And the ways it's actually done now aren't straightforward, it's all tricks and workarounds.

It just seems to me that they must've had a really hard time trying to figure that out. Or maybe they already knew from FO3 that it was too much, and decided "Eh, radioactive wasteland, who cares about the trees?" FO3 actually DID have leafy trees though. It's kind of a staple in big open-world environments. You need ways to break-up the horizon, obscure paths and points of interest, create more tension and dynamism in combat, lead the eye compositionally... I can't think of many reasons NOT to include them other than that they just couldn't remotely make it happen lol. If you want it to look 'authentic' you can make them unhealthy and/or mutated. But to have all of them standing there with no leaves, making everything bland/samey and killing visual mystique, is kinda terrible ngl. Skyrim had tons of trees. And you know what that did more than anything? It made the terrible LODs and skyboxes a much smaller part of what you were forced to see on the screen far more often :laugh: If what I'm saying is true, it's quite unbalanced. So much so they couldn't even squeeze in a way to reasonably fake the generally beneficial trees. To be a fly on the ceiling of the braincases of the people who planned this. The grasp of asset and feature scaling is nonsensical to me. And that goes far beyond the trees or the forest.

I think they got really overambitious with the aesthetic. When you look at the concept art for the city proper in particular, it is packed to the brim with stuff. It all looks awesome, but its very dense and busy. I know it's concept art, but it's especially impractical concept art for what they had to work with. They threw out so much just to accommodate for that, made performance incorrigibly bad, and made it so that a lot of basic things almost every 3D game has going on visually, are cut down or absent... just to put more clutter and objects on the screen in the actual locations. If you ask me, the big urban section could've been significantly smaller and simpler. It's a bear to run, and for what? More empty, destroyed buildings with nothing of interest inside of them and a bunch of small filler dungeons and bandit camps? I just don't understand some of these choices. It's crazy that optimization and really, overall design/scale decisions they made almost a decade ago continue to even be a factor for otherwise good hardware to this day. It's not a game that has aged well. And I say that as someone with like 2500 hours clocked in steam.
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,808 (1.72/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
Makes perfect sense to me. FO4 is a drawcall freak. Thanks for the tasty data. Very insightful.

Also just wanna say, in spite of the L3 differences, none of these CPU's should be bottlenecking so badly there. The whole reason it's a problem in the first place is because FO4 is obscenely wasteful when it comes to drawcall distribution. To this day, I struggle to understand how they thought that was okay. I mean, you'd think modern games would be beating it out, as modern hardware generally has more headroom for drawcalls. And then you have FO4, asking for 4x more drawcalls for 50% less to actually happen on the friggin screen. I don't think even Cyberpunk at its worst was asking for as many as downtown Boston.

They really pushed that engine to its absolute limits. I love pointing this out far too much, but ever notice how almost no trees in the open world have leaves on them outside of that small, single-cell pre-war area where the prologue takes place? And yet, what do you see everywhere in the post-war worldspace but leaf piles? Sure, there's fallout abound, but it's over two centuries old fallout, and chernobyl will tell you that extreme radiation does not prevent craaazzzzyyy overgrowth. And the trees DO produce leaves for you to see... on the ground.

Personally, I believe that they would've wanted to have leaves on the trees if they could have. The game starts your postwar travels in the same season as when the bombs dropped, Fall. In that region, the leaves of many trees turn in the fall. I think this is just a convenience, because it's the only reasonable explanation for there being no leaves on the tree branches. And that's not great, as it's not like all trees just shed their leaves at that time. We can't say they're just dead because there are leaf piles, grasses, shrubs, all sorts of other smaller, simpler foliage. There's that, and then there's the fact that it's basically inconceivable that any system available at the time could've come close to handling the leaves and wind animations with how gluttonous the engine already is. Hell, it took modders a good 6-7 years how to figure out solid ways to do it. Everyone wanted the trees to have leaves - it's so bare and stale without them. But for the longest time, nobody could do it without DESTROYING performance. The standard advice was to disable previs/precombine optimizations when running them - I think parts of the mods wouldn't even appear if you didn't. They were breaking that on cells all over the map, knowingly and intentionally undoing the one thing that makes the game run playably. And nobody fully grasped the side-effects that would have. People didn't get that they were straight up breaking their games with trees, zapping frame rates in catastrophic, virtually unmitigatable ways, and even compromising their saves. The first couple waves of the most popular, most listed by game journalists, supposedly safe tree mods are pretty much all defunct now AFAIK, because they're basically unfixable and the methods used are fundamentally reviled by the engine itself. Mind you, droves of people still put up with it and *tried* to make it work, that was how important they were. I encountered people so far in denial about it they'd basically attack you for suggesting their favorite tree mod was the reason for all of the huge problems they were having. Just gotta have *something* there in all of that negative space.

There are now several known ways to add leaves/wind without breaking any of the optimization... but they're pretty damned heavy, and really only possible with present-day GPUs. And it was years of trial and error with nobody in multiple huge, interconnected communities coming close to having fully viable and stable live trees in FO4. And the ways it's actually done now aren't straightforward, it's all tricks and workarounds.

It just seems to me that they must've had a really hard time trying to figure that out. Or maybe they already knew from FO3 that it was too much, and decided "Eh, radioactive wasteland, who cares about the trees?" FO3 actually DID have leafy trees though. It's kind of a staple in big open-world environments. You need ways to break-up the horizon, obscure paths and points of interest, create more tension and dynamism in combat, lead the eye compositionally... I can't think of many reasons NOT to include them other than that they just couldn't remotely make it happen lol. If you want it to look 'authentic' you can make them unhealthy and/or mutated. But to have all of them standing there with no leaves, making everything bland/samey and killing visual mystique, is kinda terrible ngl. Skyrim had tons of trees. And you know what that did more than anything? It made the terrible LODs and skyboxes a much smaller part of what you were forced to see on the screen far more often :laugh: If what I'm saying is true, it's quite unbalanced. So much so they couldn't even squeeze in a way to reasonably fake the generally beneficial trees. To be a fly on the ceiling of the braincases of the people who planned this. The grasp of asset and feature scaling is nonsensical to me. And that goes far beyond the trees or the forest.

I think they got really overambitious with the aesthetic. When you look at the concept art for the city proper in particular, it is packed to the brim with stuff. It all looks awesome, but its very dense and busy. I know it's concept art, but it's especially impractical concept art for what they had to work with. They threw out so much just to accommodate for that, made performance incorrigibly bad, and made it so that a lot of basic things almost every 3D game has going on visually, are cut down or absent... just to put more clutter and objects on the screen in the actual locations. If you ask me, the big urban section could've been significantly smaller and simpler. It's a bear to run, and for what? More empty, destroyed buildings with nothing of interest inside of them and a bunch of small filler dungeons and bandit camps? I just don't understand some of these choices. It's crazy that optimization and really, overall design/scale decisions they made almost a decade ago continue to even be a factor for otherwise good hardware to this day. It's not a game that has aged well. And I say that as someone with like 2500 hours clocked in steam.
It's simple: look at the FPS, then remember that physics is tied to frame rate, and high frame rates can actually break the game / cause instability / crashes / etc., especially with scripting. As such, if the game can handle 30 FPS, a.k.a. the console target, then for them it doesn't matter. PC, even with hiccups, still runs better than console, and as long as the game runs stably on console they don't care. Bethesda has also proven that since modders can access pretty much everything (literally making full games based on the assets / base code / animations), they can also fix issues, which is why the unofficial patches exist and fix more stuff than Bethesda will ever bother to fix.

As for Skyrim, the trees look like dog shit at distance; they all appear the same. However, this is due to how the game loads "cells," which is why uGridsToLoad modding is so popular, since it greatly improves distance rendering / LOD quality. That was the quick, easy fix for graphics improvement from game to game. For example, Oblivion used uGridsToLoad 3 as the base; for Fallout 3 / NV / 4 and Skyrim it's uGridsToLoad 5. Increasing uGrids is the biggest and easiest way to improve overall visual quality; however, it also means more scripts load / more NPCs are active / more draw calls, etc. Yet it's probably one of the most common ini mods people make. You can actually improve overall FPS by dropping uGrids to 3, if memory serves me right, but it can have some weird consequences. Same with going too high. uGrids 7 is typically stable, and I imagine it's where Bethesda will go next in order to boost visual fidelity.

But to put it in simpler terms: uGrids 3 = a 3x3 grid of interactable cells, so 9 cells total where the game / NPCs / scripts / physics / everything is active. uGrids 5 is 5x5, so 25 cells, nearly 3x the data being processed. Going to uGrids 7 boosts that to 49 cells, uGrids 9 = 81, and uGrids 11 = 121 cells. Anything above 7 is pretty much for shits and giggles and photo ops, as uGrids 9 will eventually have issues / crashes / instability. With uGrids 7 I have 230 hours with no stability issues; with uGrids 9, forget it, after about 10 hours it would begin to CTD. Although with more memory / VRAM / L3 cache it may eventually become stable.
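As a quick sanity check on those numbers, here's a tiny Python sketch of the n x n scaling (the values simply mirror the post above, with 5 as the usual vanilla default):

```python
# The active area is an n x n grid of cells centred on the player, so the
# simulated/drawn load grows roughly with n squared. Values mirror the post.
DEFAULT = 5  # vanilla uGridsToLoad for Fallout 3/NV/4 and Skyrim, per the post

for ugrids in (3, 5, 7, 9, 11):
    cells = ugrids * ugrids
    print(f"uGridsToLoad={ugrids:2d}: {cells:3d} active cells "
          f"(~{cells / DEFAULT**2:.1f}x the default load)")
```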

Regardless, a higher uGrids setting should in theory increase draw calls as well, causing further bottlenecks.

But I digress; it's an issue / theme that has existed since Morrowind. None of this is new; the same problem has plagued Bethesda titles on this engine for nearly two decades.

As for Bethesda's decision, it's pretty simple really: almost any NIF files etc. from back in 2003 can be carried forward even now. It's why stuff like Morroblivion / Skywind / etc. exist. It's why Bethesda was able to fairly easily port Skyrim to Fallout 4's engine with the SE version. Think about it: you have a wealth of assets created over 20+ years that you can adapt and reuse, be it animations that can be tweaked, scripts that can be repurposed, etc. It makes game development easier; however, as is usually the case, the easier path is usually far less optimal. Go figure.

 
Joined
Dec 12, 2020
Messages
1,755 (1.24/day)
@crazyeyesreaper and robotzombie
Maybe Bethesda should hire you guys to develop their next engine. Thanks for your excellent analyses, too.

The 9700K's lack of L3 cache might explain why my 4090's utilization goes down along with frame rates when I look out over the ruined city of Vladivostok in the Metro Exodus Sam's Story DLC, or in a certain section of Dying Light. A 9900K would only net me 33% more L3 cache, though, so I'd imagine that would be a pointless upgrade.

How were the GTA V developers able to get around having excessive draw calls in their cityscapes (Los Santos in particular)?

GTA IV has a certain road on the far eastern island that exhibits the exact same frame drop my 1080 Ti experienced, except now the GPU utilization is a lot lower.
 