This is a mindset I've found myself gravitating towards over the years. No, but here's my general philosophy when troubleshooting, or just when tweaking and dinking around to see what happens: if I change a default setting and I don't see any improvement, I restore the default. I don't leave the change. Why?
Contrary to what some want everyone else to believe, the teams of 100s (1000s?) of PhDs, computer scientists, and professional programmers at Microsoft, with their combined centuries of experience and their 10s of exabytes of empirical data to draw upon, really do know what they are doing. It's just that even with all that brainpower, experience, data, and supercomputers to crunch it with, they still cannot account for each and every one of the 1.6 billion "unique" scenarios and Windows computers out there.
So if changing the default does improve things, fine: leave it. But if no improvement is seen, change it back to the factory default setting.
Back when I was still learning and wanting to fiddle with things, I was more apt to want to "find things to fix": disabling Windows services to lower memory and resource consumption, hunting for some secret "optimization setting" to change, that sort of thing. Maybe it's a result of our brains thinking "I changed a thing, ergo it's an improvement," but I notice some people can't help but believe a change is an improvement and will refuse to accept that maybe it's not, even if they can't substantiate why it's the improvement they think it is. It seems to boil down to "nothing bad happened in my experience after the change," so they refuse to budge from their position.
I hope I don't get into a bit of a rant here, but it's somewhat on topic of "running out of memory". One thing in particular people still seem especially adamant about is changing page file settings. It seems better than it used to be; people seem to be learning that disabling it is bad, but they're still stuck on setting a fixed, tiny size as though it's an optimization. I used to be one of those people myself, until relatively recently even... until I did learn otherwise by having something bad happen. I replaced some parts in 2011, and one of them was an upgrade to 16 GB of RAM, so hey, I don't need a page file, right? So I disabled it. It "worked" for a whole six years... and then it didn't.
"What's going on?" I asked.
I posted that in 2017 thinking it was a Minecraft thing, and I was confused about why I was "out of RAM" with almost half of my RAM still unused. In hindsight it's easy to see what happened: I had reached my commit limit, and it had no ability to grow because I had no page file. Setting my page file like that effectively said "please prevent me from using all my memory if my commit charge ever outpaces my in-use memory," and the commit charge almost always does outpace in-use memory. Your commit limit is your RAM capacity plus the current page file size. Minecraft is also something that can allocate so much more memory than it actually uses under certain circumstances, so I hit my commit limit of 16 GB well before using that much RAM. There are plenty of other realistic, practical workflows with much higher commit-to-in-use ratios too.
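In case a concrete illustration helps, here's a rough little C sketch (my own, not from the thread, and it assumes a 64-bit Windows build with a C compiler handy) that commits a few GB without touching most of it. VirtualAlloc with MEM_COMMIT charges against the commit limit up front, while "in use" memory barely moves, and GetPerformanceInfo lets you watch the numbers change:

```c
/* Rough sketch of commit charge vs. in-use memory on Windows.
 * Assumes a 64-bit build; link with psapi, e.g.:  gcc commit_demo.c -lpsapi
 */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

static void print_commit(const char *label)
{
    PERFORMANCE_INFORMATION pi = { sizeof(pi) };
    if (GetPerformanceInfo(&pi, sizeof(pi))) {
        /* Values are reported in pages; convert to GB for readability. */
        double page_gb = (double)pi.PageSize / (1024.0 * 1024.0 * 1024.0);
        printf("%s: commit charge %.1f GB / commit limit %.1f GB (RAM %.1f GB)\n",
               label,
               pi.CommitTotal * page_gb,
               pi.CommitLimit * page_gb,
               pi.PhysicalTotal * page_gb);
    }
}

int main(void)
{
    print_commit("Before");

    /* Commit 4 GB: this raises the commit charge immediately, even though
     * no physical RAM has actually been used for it yet. */
    SIZE_T size = (SIZE_T)4 * 1024 * 1024 * 1024;
    char *p = VirtualAlloc(NULL, size, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
    if (p == NULL) {
        /* With no page file (or a tiny fixed one), this is where you hit the
         * commit limit and get "out of memory" despite plenty of free RAM. */
        printf("Commit failed: error %lu\n", GetLastError());
        return 1;
    }

    /* Touch only the first 64 MB: in-use memory stays small, but the whole
     * 4 GB stays charged against the commit limit until it's freed. */
    for (SIZE_T i = 0; i < (SIZE_T)64 * 1024 * 1024; i += 4096)
        p[i] = 1;

    print_commit("After ");

    VirtualFree(p, 0, MEM_RELEASE);
    return 0;
}
```

Run it with and without a page file (or with a tiny fixed one) and the VirtualAlloc call is where you'd see the failure, long before physical RAM runs out; that's essentially what happened to me, just with Minecraft doing the allocating.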
I had already been playing with PCs for a decade and a half at that point, but I don't think it's ever too late to admit you didn't know better about something and to reverse course. To the contrary, I'm always trying to keep myself humble about how much I still have to learn. I've set my page file back to system managed and started trying to learn more about memory management (relatively speaking I still know next to nothing here, but I'm at least one step closer than I was before). I wish more people would be open to considering that maybe they don't know enough to have a firm opinion on something, but that seems to be getting rarer by the day. So seeing you say this was a bit refreshing, haha.
I guess it's because many enthusiasts and gamers tend to have far more RAM than they probably need or use, and most games alone probably don't allocate much more than they use, so these people disable the page file or set a very small fixed one and skirt the issues. It's like buying an RTX 4090, putting it on a borderline PSU, never pushing said RTX 4090, and then concluding "this is fine". Yes, it's fine... when you don't even push it. It also means you're not in a position to speak about what's enough to begin with. But these same people then recommend to others that they "should" change their page file settings, without having a clue what those users' workflows and memory needs typically are! This isn't some one-size-fits-all value relative to RAM capacity; it's relative to the workload itself. People with a lot of RAM relative to their needs get away with limiting their commit limit, but that certainly doesn't make it the smarter thing to do for others.
Sorry for the mini rant.