
Running out of RAM

No. But here's my general philosophy when troubleshooting or just when tweaking and dinking to see what happens: If I change a default setting and I don't see any improvement, I restore the default. I don't leave the change. Why?

Contrary to what some want everyone else to believe, the teams of 100s (1000s?) of PhDs, computer scientists and professional programmers at Microsoft, and their combined centuries of experience and their 10s of exabytes of empirical data to draw upon, really do know what they are doing. It is just that, with all that brainpower, experience, data and supercomputers to crunch with, they still cannot account for each and every one of the 1.6 billion "unique" scenarios and Windows computers out there.

So if changing the default does improve things, fine. Leave it. But if no improvement is seen, change it back to the factory default setting.
This is a mindset I've found myself gravitating towards over the years.

Back when I was still learning and wanting to fiddle with things, I was more apt to want to "find things to fix". Disabling Windows services to lower memory and resource consumption, or finding some secret "optimization setting" to change, and that sort of thing. Maybe it's a result of our brains thinking "I changed a thing, ergo it's an improvement", but I notice some people can't help but think a change is an improvement and will refuse to believe that maybe it's not, even if they can't substantiate why it's the improvement they think it is. It seems to boil down to "nothing bad happened in my experience after the change" so they refuse to budge from their position.

I hope I don't get into a bit of a rant here but it's somewhat on topic of "running out of memory". One thing in particular people still seem especially adamant about is changing page file settings. It seems better than it used to be; people seem to be learning that disabling it is bad but they're still stuck on setting a fixed, tiny size as though it's an optimization thing. I used to be one of those people myself, and until relatively recently even... until I did learn otherwise by having something bad happen. I replaced some parts in 2011 and one of them was an upgrade to 16 GB, so hey, I don't need a page file, right? So I disabled it. It "worked" for a whole six years... and then it didn't.

"What's going on?" I asked.

[Screenshot from 2017: Windows out-of-memory error with almost half of the 16 GB still showing as unused]


I posted that in 2017 thinking it was a Minecraft thing, and I was confused about why I was out of RAM with almost half of my RAM still unused. It's actually easy to see what happened here in hindsight; I had reached my commit limit, and it had no ability to grow because I had no page file. By setting my page file like that, it was effectively acting as a "please prevent me from being able to use all my memory if my commit charge ever outpaces my in use memory" setting, and the commit charge almost always does outpace in use memory. Your commit limit is your RAM capacity plus current page file size. Minecraft is also something that can allocate so much more memory than it uses under certain circumstances, so I hit my commit limit of 16 GB well before using that much. There are a lot of other realistic and practical workflows with much higher commit-to-in-use ratios too.
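(If you want to see the commit charge vs. in use distinction for yourself, here's a minimal sketch, assuming 64-bit Windows: VirtualAlloc with MEM_COMMIT raises the commit charge without touching a single page of physical RAM, which is exactly the gap that bit me:)

C#:
using System;
using System.Runtime.InteropServices;

class CommitDemo
{
    // VirtualAlloc with MEM_COMMIT raises the system commit charge,
    // but no physical RAM is consumed until the pages are actually touched.
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize,
        uint flAllocationType, uint flProtect);

    const uint MEM_COMMIT     = 0x1000;
    const uint MEM_RESERVE    = 0x2000;
    const uint PAGE_READWRITE = 0x04;

    static void Main()
    {
        // Commit 4 GB without using it: Task Manager's "Committed" figure
        // jumps by 4 GB while "In use" barely moves. With no page file,
        // this alone eats a quarter of a 16 GB commit limit.
        IntPtr p = VirtualAlloc(IntPtr.Zero, (UIntPtr)(4UL << 30),
            MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
        Console.WriteLine(p == IntPtr.Zero
            ? "Allocation failed - commit limit reached?"
            : "4 GB committed; check Task Manager, then press Enter.");
        Console.ReadLine(); // keep the commit alive while you look
    }
}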

I had already been playing with PCs for a decade and a half at that point, but I never think it's too late to admit you didn't know better on something, and to reverse course. To the contrary, I'm always trying to humble myself that I have much to learn. I've set my page file back to system managed and started trying to learn more about memory management (even though I, relatively speaking, still know absolutely nothing here, but I'm at least one step closer than I was before). I wish other people would be more open to considering that maybe they don't know enough to have a formal opinion on something, but that seems to be getting rarer by the day. So seeing you say this was a bit refreshing, haha.

I guess it's because many enthusiasts and gamers tend to have far more RAM than they probably need or use, and most games alone probably don't allocate much more than they use, so these people disable or set a very small fixed page file and skirt issues. It's like buying an RTX 4090, putting it on a borderline PSU, not pushing said RTX 4090, and then concluding "this is fine". Yes, it's fine... when you don't even push it. It also means you're not in a position to be able to speak of what's enough to begin with. But these same people then recommend to others that they "should" change their page file settings, and without having a clue as to what the users' workflow and memory needs typically are! This isn't some one-size-fits-all setting relative to RAM capacity; it's relative to the workload itself. People with a lot of RAM relative to their needs get away with limiting their commit limit, but that certainly doesn't make it the smarter thing to do for others.

Sorry for the mini rant.
 
I've installed Windows more than a hundred times since W2000, and many times I've done it directly without trying to figure out what's going on, but in the end it always depends on what's wrong.

This time it was different. It pretty much always takes three days before it goes wrong, and I just got too curious to let it go.

Can anyone guess why this code would cause memory leaks in Windows, but only after three days? I'm no expert.. :D The program itself never uses more than 4 MB.

View attachment 323638
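(The attachment itself doesn't reproduce here. Going by the discussion that follows - the usings posted later, a single SoundPlayer created outside a while(true) loop, and a Task-based delay as the last line - the program was presumably something along these lines. The file path, the class layout, and the exact delay call are my guesses:)

C#:
using System;
using System.Media;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // One SoundPlayer instance, created once, outside the loop.
        SoundPlayer sp = new SoundPlayer(@"C:\Silence.wav"); // path is a guess

        while (true)
        {
            sp.PlaySync(); // play the silent file to keep the BT link awake
            // The last line: some Task-based 10 s delay, per the replies below.
            Task.Run(async () => await Task.Delay(TimeSpan.FromSeconds(10))).Wait();
        }
    }
}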
What’s the purpose of this code? What is it supposed to do?

I’m wondering if you’re leaking the media player object by passing it to that async code thus creating new objects every time the code is called.

I’m not at my computer right now so I can’t really take a look at the code.
 
What’s the purpose of this code? What is it supposed to do?

I’m wondering if you’re leaking the media player object by passing it to that async code thus creating new objects every time the code is called.

I’m not at my computer right now so I can’t really take a look at the code.
Play a sound file (silence) every ten seconds.

If that were the case, I think I should see more RAM usage before day three. The paged pool gets about 6 GB larger per hour, so it's hard to miss.
poolmon points to EtwR, so I disabled all I could in Startup Event Trace Sessions, although that didn't help.

The last line creates the 10 s delay before the loop continues; it has nothing to do with SoundPlayer, AFAIK. There's only one instance of SoundPlayer running.

I'm going to set longer intervals, and if that affects how long it takes until the problem comes back then I'll start focusing on my program.
I will also replace SoundPlayer with MediaPlayer and see if that changes anything. Or set the task scheduler to start it every x seconds instead of looping it.

Thread.Sleep() seems a bit dumb.. :)

Again, it's not the program itself that uses up all the RAM; it causes Windows to do that.

I'm not either. I'm an electronics technician, not a programmer (and that's by choice). But I did find this: SoundPlayer causing Memory Leaks? which looks pretty similar to your situation, even though that post is from 13 years ago. It might give you, or someone who is a real programmer, a clue.
I will have a look at it, and I was going to change it to MediaPlayer anyway, as I want volume control.
 
It seems to boil down to "nothing bad happened in my experience after the change" so they refuse to budge from their position.
That is exactly what seems to happen way too often. And that logic is so twisted too. Instead of "nothing bad happened", they should be basing their decision on "did the change improve anything?" And if no improvement was seen, change it back. We see it all the time when folks (who typically have no formal training whatsoever) believe in their hearts they are smarter than the true memory/resource management experts at Microsoft - yet they have no clue what commit rates are, how to calculate them, or even wonder why Microsoft made the setting dynamic. It is not a "set and forget" setting. And yet, we see folks even boast that changing it is one of the first things they always do when installing Windows - because they always have done it that way. :(

Since Microsoft has demonstrated they can easily program Windows to change the PF setting as needed (dynamically), they could have easily coded it to disable (or set to 0) the PF if that was the optimal setting for that machine at that time. But they didn't. That should be the clue to just leave the setting alone.

Oh well. That's for another discussion.
 
teams of 100s (1000s?) of PhDs, computer scientists and professional programmers at Microsoft, and their combined centuries of experience and their 10s of exabytes of empirical data to draw upon, really do know what they are doing.

No no... microsoft bad, random internet people good.
 
If that were the case, I think I should see more RAM usage before day three.
I'm certainly the farthest thing from a coder, but I'm interested in seeing the answer to this all the same (even if there's a good chance I won't understand the reasoning).

I would think a memory leak from something running continuously from the start would tend to grow linearly from the get-go, so one that only starts after three days is interesting.

And out of curiosity, what is the purpose for wanting to play a silent sound every ten seconds?
Since Microsoft has demonstrated they can easily program Windows to change the PF setting as needed (dynamically), they could have easily coded it to disable (or set to 0) the PF if that was the optimal setting for that machine at that time. But they didn't. That should be the clue to just leave the setting alone.
I have to wonder if this has anything to do with it (if that's even still relevant; it says it applies to Windows 10 but not Windows 11). The reasoning I get told for why some people support changing the page file settings is supposedly that some games have issues on system managed that these manual page file changes supposedly fix, but I can't verify as I don't typically play said games. If my mere theory here is correct (and that's giving credit to the problem even being there with these games), then it is the raising of the initial size that is fixing it, and I mention that the cutting back of the maximum size is unnecessary (and this is the real risk with it anyway; the raising of the initial size is whatever), but well... that falls on deaf ears.

But I think I'm giving too much credit because these people say the real reason for needing to change the settings is "we always had to do this". As if things never change, and as if certain things weren't maybe placebo all along.

But in my experience, the page file does behave fine for me on system managed. Modern Windows seems to set an initial size around 1/8th the installed RAM capacity and raise the commit limit if the commit charge ever gets close, and it can do this up to 3 times RAM capacity. Seems pretty adaptable when it works, at least (and it does for me).
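(For anyone curious to watch this in action, here's a minimal sketch that polls the built-in "Memory" performance counters; the counter names are the standard ones, and the 5-second interval is an arbitrary choice:)

C#:
using System;
using System.Diagnostics;
using System.Threading;

class CommitWatch
{
    static void Main()
    {
        // "Commit Limit" = RAM + current page file size.
        // "Committed Bytes" = the commit charge.
        // On a system-managed page file, the limit grows as the charge nears it.
        using (var limit  = new PerformanceCounter("Memory", "Commit Limit"))
        using (var charge = new PerformanceCounter("Memory", "Committed Bytes"))
        {
            while (true)
            {
                Console.WriteLine($"charge {charge.NextValue() / (1 << 30):F1} GB / limit {limit.NextValue() / (1 << 30):F1} GB");
                Thread.Sleep(5000);
            }
        }
    }
}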
 
The reasoning I get told for why some people support changing the page file settings is supposedly that some games have issues on system managed that these manual page file changes supposedly fix
If that happens with certain games, then the game is fundamentally broken and needs to be fixed.
 
The reasoning I get told for why some people support changing the page file settings is supposedly that some games have issues on system managed that these manual page file changes supposedly fix

"Game requirements: Please adjust your pagefile size before starting the game"....
 
No no... microsoft bad, random internet people good.
LOL

Yeah, shame on me for thinking 4 years of college, or 4 years of college plus 2 years of grad school, or worse, plus 4 years of grad school, and a bunch of certs would mean those folks might have a clue as to what they are doing. :kookoo:
 
"Game requirements: Please adjust your pagefile size before starting the game"....
And if any game company told me to do that, that's one game I'd tell them to stuff it. Where? Use your imagination.
 
I've installed Windows more than a hundred times since W2000, and many times I've done it directly without trying to figure out what's going on, but in the end it always depends on what's wrong.

This time it was different. It pretty much always takes three days before it goes wrong, and I just got too curious to let it go.

Can anyone guess why this code would cause memory leaks in Windows, but only after three days? I'm no expert.. :D The program itself never uses more than 4 MB.

View attachment 323638
That while loop, while(true), is an infinite loop with no exit condition. That means while the program is running, it will keep spawning new Tasks continuously. Even if the program only uses 4 MB of memory, it will still run out of memory given enough time.
 
And out of curiosity, what is the purpose for wanting to play a silent sound every ten seconds?
My BT connection causes crackling noises when no audio is played.
That while loop, while(true), is an infinite loop with no exit condition. That means while the program is running, it will keep spawning new Tasks continuously. Even if the program only uses 4 MB of memory, it will still run out of memory given enough time.
Poolmon points to Windows memory leaks, not the program.
I thought only the lines inside the while loop were actually looped. new SoundPlayer() is outside the loop.

I saw this list about reasons for leaks, and it does make sense to me, as the problem doesn't show up until days have passed:

[Attached screenshot: a list of common reasons for memory leaks in long-running programs]


It seems like it's not the loop that causes it, but how long the program runs. GC just gives up after a few days. I'd guess that having the program restart itself once a day makes more sense than using(), as the latter makes sense if the program causes memory leaks from the beginning.
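(A rough sketch of the restart-once-a-day idea, in case it's useful. The time check and path lookup are my additions; everything else is just the loop from before. Process.GetCurrentProcess().MainModule.FileName works on both .NET Framework and modern .NET:)

C#:
using System;
using System.Diagnostics;
using System.Media;
using System.Threading;

class Program
{
    static void Main()
    {
        DateTime started = DateTime.UtcNow;
        SoundPlayer sp = new SoundPlayer(@"C:\Silence.wav"); // path is a guess

        // Loop for 24 hours, then hand over to a fresh process, so anything
        // the old process dragged along (directly or via Windows) is released.
        while (DateTime.UtcNow - started < TimeSpan.FromHours(24))
        {
            sp.PlaySync();
            Thread.Sleep(TimeSpan.FromSeconds(10));
        }

        Process.Start(Process.GetCurrentProcess().MainModule.FileName);
    }
}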
 
My BT connection causes crackling noises when no audio is played.

Poolmon points to Windows memory leaks, not the program.
I thought only the lines inside the while loop were actually looped. new SoundPlayer() is outside the loop.

I saw this list about reasons for leaks, and it does make sense to me, as the problem doesn't show up until days have passed:

View attachment 323730

It seems like it's not the loop that causes it, but how long the program runs. GC just gives up after a few days. I'd guess that having the program restart itself once a day makes more sense than using(), as the latter makes sense if the program causes memory leaks from the beginning.
You misunderstood - as it is an infinite loop without an exit condition, as long as the program is running, it is constantly firing up a Task. The longer the program runs, the more Tasks it generates. Each Task will take up memory, and eventually, you will run out of memory.
 
"Game requirements: Please adjust your pagefile size before starting the game"....
I did address this in a further post.

"...it is the raising of the initial size that is fixing it, and I mention that the cutting back of the maximum size is unnecessary (and this is the real risk with it anyway, the raising of the initial size is whatever), but well... that falls on deaf ears."

I'm not saying adjusting the page file is wrong.

I'm not saying there's never a need to do so (to the contrary; it does exist as a setting, and I personally think the low initial size, combined with the aforementioned linked issue, may be creating these necessary situations, and that Microsoft should lean back towards its old way of using a larger initial size).

Those are both absolutes, and I won't make such statements. I'm merely saying the other absolute is bad as well. When people give suggestions that it should be changed, even when it's not relevant or causing issues, and especially when they suggest relatively low maximum values, it becomes concerning. Why? Because too often I have seen others in the same situation I was in above. And how do they get in that position? When asked, it's often something along the lines of "my tech friend or some online suggestion told me to change it that way" or worse, "I took it in for service and they changed it" (shockingly, yes, I was told that once). When you have to clean up the mess this bad suggestion causes one too many times, it starts to wear on you.
 
You misunderstood - as it is an infinite loop without an exit condition, as long as the program is running, it is constantly firing up a Task. The longer the program runs, the more Tasks it generates. Each Task will take up memory, and eventually, you will run out of memory.
Why doesn't this show in RAM usage for the first < 3 days? Also, wouldn't this be visible in Task Manager? Sorry, I'm out of my element here lol..
 
With only Firefox running, on 23H2:
[Two Task Manager screenshots showing memory usage]
 
I'm not sure what you're trying to say?
Just showing what RAM usage can be on Win 11 with tweaking and no startup tasks - down to 28 processes in use. Other than Chrome, that is; it eats memory.
 
Just showing what RAM usage can be on Win 11 with tweaking and no startup tasks - down to 28 processes in use. Other than Chrome, that is; it eats memory.
Yeah.. that's not what we're discussing, sorry. I had a specific problem, see the OP. Now I know the cause of it, and it's just a matter of getting the code right, and hopefully understanding it.

I gave up hunting RAM usage, processes, and checking BlackViper a decade ago.

Before that I spent way too much time trying to change things; my W7 record was a 1.3 GB install size and a 500 MB ISO.
 
Why doesn't this show in RAM usage for the first < 3 days? Also, wouldn't this be visible in Task Manager? Sorry, I'm out of my element here lol..
If this was occurring right from the start, yes, I would think it would be reflected in your paged pool (and by extension, your in use memory) amounts.

If it's not, and why it's not, isn't something I could answer though. But regardless, it certainly seems it is leaking that memory, even if it has that weird delayed impact instead of a linear impact.
 
Why doesn't this show in RAM usage for the first < 3 days? Also, wouldn't this be visible in Task Manager? Sorry, I'm out of my element here lol..
Probably because each Task run only takes up a few bytes, so it does take a while for it to become problematic. I will take a look at it this evening and play with it in Visual Studio. Do you need to install any NuGet packages for your code?
 
You misunderstood - as it is an infinite loop without exit condition, as long as the program is running, it is constantly firing up a Task. The longer the program runs, the more tasks it is generating. Each Task will take up memory, and eventually, you will run out of memory.
Then instead of using that call to an async task, you can do it like this...
C#:
using System;           // TimeSpan
using System.Media;     // SoundPlayer
using System.Threading; // Thread.Sleep

int d = 10; // delay between plays, in seconds

SoundPlayer sp = new SoundPlayer(@"C:\Windows\Silence.wav");

while (true)
{
    sp.PlaySync(); // blocks until the wav has finished playing
    Thread.Sleep(TimeSpan.FromSeconds(d)); // This makes the thread sleep instead of using a call to Task.Run()
}
 
Probably because each Task run only takes up a few bytes, so it does take a while for it to become problematic.
That's not how it works. Like I said, RAM usage won't budge for three days, and after that it increases by about 6 GB per hour. I have had Task Manager open on a separate display, keeping a close eye on it.
The leak kicked in yesterday as predicted. When I left home the paged pool was about 500 MB, and when I came home an hour later it was 6.2 GB, and I could see it go up by about 100 MB at a time. Non-paged had gone up to about 1 GB too. This is totally expected behavior based on what I've seen before.
Do you need to install any NuGet packages for your code?
No, only using this:
Code:
using System;
using System.Media;
using System.Threading.Tasks;
Then instead of using that call to an async task, you can do it like this...
I dunno, like I said before, I'm sceptical about Thread.Sleep().

___________________________________________________________

Currently I'm running with a 1 second delay (down from 10), just to see if the problem kicks in earlier. My bet is no, but I'm only 9 hours in so it's too early to tell.
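(Since Task Manager is already on a separate display, a throwaway logger like this might make it easier to pin the ramp to an exact time. A minimal sketch assuming the built-in "Memory" performance counter names; the one-minute interval and the console output are arbitrary choices:)

C#:
using System;
using System.Diagnostics;
using System.Threading;

class PoolWatch
{
    static void Main()
    {
        // "Pool Paged Bytes" / "Pool Nonpaged Bytes" are the same kernel
        // pools that poolmon and Task Manager report on.
        using (var paged    = new PerformanceCounter("Memory", "Pool Paged Bytes"))
        using (var nonpaged = new PerformanceCounter("Memory", "Pool Nonpaged Bytes"))
        {
            while (true)
            {
                Console.WriteLine($"{DateTime.Now:u}  paged {paged.NextValue() / (1 << 20):F0} MB  nonpaged {nonpaged.NextValue() / (1 << 20):F0} MB");
                Thread.Sleep(60000); // one sample per minute
            }
        }
    }
}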
 
This is a mindset I've found myself gravitating towards over the years.

Back when I was still learning and wanting to fiddle with things, I was more apt to want to "find things to fix". Disabling Windows services to lower memory and resource consumption, or finding some secret "optimization setting" to change, and that sort of thing. Maybe it's a result of our brains thinking "I changed a thing, ergo it's an improvement", but I notice some people can't help but think a change is an improvement and will refuse to believe that maybe it's not, even if they can't substantiate why it's the improvement they think it is. It seems to boil down to "nothing bad happened in my experience after the change" so they refuse to budge from their position.

I hope I don't get into a bit of a rant here but it's somewhat on topic of "running out of memory". One thing in particular people still seem especially adamant about is changing page file settings. It seems better than it used to be; people seem to be learning that disabling it is bad but they're still stuck on setting a fixed, tiny size as though it's an optimization thing. I used to be one of those people myself, and until relatively recently even... until I did learn otherwise by having something bad happen. I replaced some parts in 2011 and one of them was an upgrade to 16 GB, so hey, I don't need a page file, right? So I disabled it. It "worked" for a whole six years... and then it didn't.

"What's going on?" I asked.

[Screenshot from 2017: Windows out-of-memory error with almost half of the 16 GB still showing as unused]


I posted that in 2017 thinking it was a Minecraft thing, and I was confused about why I was out of RAM with almost half of my RAM still unused. It's actually easy to see what happened here in hindsight; I had reached my commit limit, and it had no ability to grow because I had no page file. By setting my page file like that, it was effectively acting as a "please prevent me from being able to use all my memory if my commit charge ever outpaces my in use memory" setting, and the commit charge almost always does outpace in use memory. Your commit limit is your RAM capacity plus current page file size. Minecraft is also something that can allocate so much more memory than it uses under certain circumstances, so I hit my commit limit of 16 GB well before using that much. There are a lot of other realistic and practical workflows with much higher commit-to-in-use ratios too.

I had already been playing with PCs for a decade and a half at that point, but I never think it's too late to admit you didn't know better on something, and to reverse course. To the contrary, I'm always trying to humble myself that I have much to learn. I've set my page file back to system managed and started trying to learn more about memory management (even though I, relatively speaking, still know absolutely nothing here, but I'm at least one step closer than I was before). I wish other people would be more open to considering that maybe they don't know enough to have a formal opinion on something, but that seems to be getting rarer by the day. So seeing you say this was a bit refreshing, haha.

I guess it's because many enthusiasts and gamers tend to have far more RAM than they probably need or use, and most games alone probably don't allocate much more than they use, so these people disable or set a very small fixed page file and skirt issues. It's like buying an RTX 4090, putting it on a borderline PSU, not pushing said RTX 4090, and then concluding "this is fine". Yes, it's fine... when you don't even push it. It also means you're not in a position to be able to speak of what's enough to begin with. But these same people then recommend to others that they "should" change their page file settings, and without having a clue as to what the users' workflow and memory needs typically are! This isn't some one-size-fits-all setting relative to RAM capacity; it's relative to the workload itself. People with a lot of RAM relative to their needs get away with limiting their commit limit, but that certainly doesn't make it the smarter thing to do for others.

Sorry for the mini rant.
Your rant is appreciated, but it also underlines why so many think they can outdo MS. Why is such a commit limit in place? MS requires you to either sacrifice notable storage space just to use your RAM, or face annoying errors like this. Linux doesn't have this issue, and neither did macOS back when you could upgrade. I can use 99% of my system RAM with no swap and no errors. Of course, that should be the default for most users, but why is there no option to get around this?

It's also bringing back bad memories of XP and its LOVE of swapping as much as possible to the pagefile. I do remember making RAM disks with excess system RAM and forcing the pagefile onto them to get around this, but it was just something dumb....
 