
Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
IMO, ideally, you'd always want the GPU to be the bottleneck in your system, and it typically is the most common one. People also forget about the monitor, especially on other gaming sites I post on ("I want to upgrade to the RTX 4070 Ti!" ...while gaming on a 1080p 60 Hz monitor).
 
And what does that exactly achieve? Unless there is some issue with the game engine, there is no reason to artificially limit performance: you're not getting any less stutter, and you're certainly getting more input lag if you do that.

It achieves substantially lower latency when the GPU isn't maxed out, and substantially more even frametimes when the CPU isn't maxed out.
 
And what does that exactly achieve? Unless there is some issue with the game engine, there is no reason to artificially limit performance: you're not getting any less stutter, and you're certainly getting more input lag if you do that.
In this, as in many other things, you're so confidently wrong.

 
In this, as in many other things, you're so confidently wrong.

Variable Refresh Rate technology - has entered the chat
:D

Also related:
Prior to me getting 'set on FreeSync', I did try G-SYNC on an Acer 27" 1440p and a 780 Ti. At least back then, I felt both VRR techs were equal (and equally fantastic! so smooth, very wow).
Before VRR, I'd gotten myself a VG248QE*. On that 144 Hz screen, tearing was a bitch if I didn't cap (read: limit performance) the framerate to 143-145 or ~72 FPS (depending on the game). I never did any testing on input latency, but screen tearing was definitely disruptive to gaming if I didn't purposefully limit performance.
(*I never did get the G-SYNC module, but I still have the LCD)
 
And what does that exactly achieve? Unless there is some issue with the game engine, there is no reason to artificially limit performance: you're not getting any less stutter, and you're certainly getting more input lag if you do that.
But you get rid of unnecessarily spent power and heat in your system.
 
But you get rid of unnecessarily spent power and heat in your system.
Yes, plus lower input lag and less stutter on top of that, not in spite of it.
 
I don't get the point of these theoretical debates when there are users among us who own the hardware in question and can test any game, and see in practice whether the most demanding titles of 2023 actually need that much RAM on the highest settings. Among gamers at large, the number of people who own GPUs with 24 GB of VRAM is way too small to be targeted by the gaming industry. And that's just it... the gaming industry does take into account the most common and least common hardware used for playing its games. That's what studios use to test a game, but also to impose limitations (what gets deemed the Highest/Ultra settings), or else the games would crash. And if even a 4090 Ti with 24 GB of VRAM constantly crashed on the highest settings, that would be bad for business (the company would get accused of releasing a heavily bugged or poorly optimized game).

The thing is, current hardware is still limiting for current software's potential, as demonstrated by CGI developers in the movie industry, where it's supposedly quite common for 24 GB of VRAM to get filled up. Thus, for heavy workstation loads, AMD actually recommends the Radeon PRO W6800 (which comes with 32 GB of VRAM), while NVIDIA takes things a bit further with the RTX A6000 (which comes with 48 GB of VRAM). That being said, 24 GB of VRAM is still kind of overkill for gaming, where even on the highest settings, depending on the title, usage tops out around 20 GB of VRAM, if not less or far less (again, I tested it out, since it's pointless to guess when there's room for practice), while the main focus is still the raw power of the most and least commonly used GPUs. I mean, even the most poorly optimized game of 2023 needs less than 20 GB of VRAM on the highest settings. Maybe not in online games (since some anti-cheat systems might get you banned), but you can use mods in single-player games to further increase their visuals, enough to fill (or go over) 24 GB of VRAM. But these mods are an exception to, even a taboo for, gaming industry standards.
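Since the whole point is "test it, don't guess", here's a quick sketch for watching real VRAM usage while a game runs. It assumes an NVIDIA card with nvidia-smi on the PATH, and bear in mind that allocated VRAM is not the same as required VRAM (many engines opportunistically cache textures up to whatever is available):

```python
# Poll actual VRAM usage once a second via nvidia-smi (single GPU assumed).
import subprocess
import time

def vram_usage_mib():
    """Return (used, total) VRAM in MiB as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
    return used, total

if __name__ == "__main__":
    peak = 0
    while True:  # Ctrl+C to stop; alt-tab into the game and watch the peak
        used, total = vram_usage_mib()
        peak = max(peak, used)
        print(f"VRAM: {used}/{total} MiB (session peak: {peak} MiB)")
        time.sleep(1)
```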
 
It achieves substantially lower latency when the GPU isn't maxed out
That's categorically not true: the higher the framerate, the lower the average latency, so by capping your framerate you are most certainly not lowering your latency. That makes no sense.

it achieves substantially more even frametimes when the CPU isn't maxed out.
And you are also increasing the average frametimes as a result. I don't know what use this could have to anyone. Also, I don't know where you guys got this idea that a maxed-out CPU/GPU means more even frametimes or less latency; none of that is true.

In this, as in many other things, you're so confidently wrong.


If you can't be bothered to actually explain yourself and just post a link, I am going to assume that, as in many other things, you didn't have anything intelligent to say.
 
And you are also increasing the average frametimes as a result
No, you introduce even frametimes, which is a much more pleasant experience than lower but uneven ones.

Especially if you combine it with technologies like variable refresh rates and AMD "Enhanced Sync" or Nvidia "Free Vsync" or whatever they call it (it's under the Vsync option in the driver menu).
 
No, you introduce even frametimes, which is a much more pleasant experience than lower but uneven ones.

Especially if you combine it with technologies like variable refresh rates and AMD "Enhanced Sync" or Nvidia "Free Vsync" or whatever they call it (it's under the Vsync option in the driver menu).

100 frames in a second = an average frametime of 10 ms

120 frames in a second = an average frametime of ~8.3 ms

By capping the framerate there is no guarantee that you'll get more even frametimes, but you will certainly get longer frametimes. Also, Enhanced Sync and Fast Sync do not cap the framerate; the game still runs uncapped internally, you just don't see the tearing. And VRR obviously needs a capped framerate for it to work, nothing interesting there.
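For reference, the conversion both sets of numbers come from is just the reciprocal (note that 120 FPS works out to ~8.3 ms, not a flat 8). A few lines of Python reproduce it:

```python
# FPS to average frametime is just a reciprocal.
def avg_frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 100, 120, 144, 240):
    print(f"{fps:>3} FPS -> {avg_frametime_ms(fps):.2f} ms average frametime")
# 100 FPS -> 10.00 ms, 120 FPS -> 8.33 ms, and so on.
```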
 
100 frames in a second = an average frametime of 10 ms

120 frames in a second = an average frametime of ~8.3 ms

By capping the framerate there is no guarantee that you'll get more even frametimes, but you will certainly get longer frametimes. Also, Enhanced Sync and Fast Sync do not cap the framerate; the game still runs uncapped internally, you just don't see the tearing.
Hilarious how you don't even know the meaning of the terms you throw about.

Fast sync increases input lag because it holds frames until they sync with the monitor's refresh rate; when FPS is above the monitor's refresh rate, those extra frames are literally not shown. To achieve lower input lag than with fast sync off, you would need FPS at twice or more what your monitor can display, i.e. 240+ on a 120 Hz display, and it would need to stay above that mark: every time your FPS dipped below twice your monitor's refresh rate, your input lag would increase. This is to reduce screen tearing.

FPS caps cause lower input delay, regardless of whatever theoretical 1-2 ms FRAMETIME advantage you can get by going uncapped. For example, 237 FPS, which is what you should cap a 240 Hz system to for minimal input delay and consistent frametimes, gives 4.2 ms frametimes. Moving to 300 FPS would reduce that by less than one ms, to 3.3 ms frametimes, but that number would fluctuate much more than a locked FPS. The sub-millisecond gain in frametime would be more than offset by the increased input delay, which is something you seem to be struggling with.
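To make those numbers easy to check, here's a small sketch; the "refresh minus 3" cap is this post's convention (237 on 240 Hz), not a universal rule:

```python
# Frametime at a "refresh - 3" FPS cap vs. running uncapped at 300 FPS.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

refresh_hz = 240
cap = refresh_hz - 3  # 237 FPS, the cap suggested above
capped, uncapped = frametime_ms(cap), frametime_ms(300)
print(f"{cap} FPS cap -> {capped:.1f} ms per frame")
print(f"300 FPS uncapped -> {uncapped:.1f} ms per frame "
      f"(only {capped - uncapped:.2f} ms shorter)")
```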

When your FPS is above the VRR range of your monitor, VRR is not engaged, increasing input delay.

Muscle memory in games is tied to stable frame rates; consistency is much more important than higher numbers, but I understand many people don't quite get that.

The intricacies of esports tuning for minimal input lag and clarity are something I've made myself familiar with. Before my medical degree got too intense, I played for Swansea Storm for two seasons, one as a player on the first team and the second as captain of the second team. We beat over 150 universities in the UK to win the NUEL and NSE tournaments in Rainbow Six Siege. I've probably got a few clips floating around somewhere on Discord if you don't believe me.

[screenshot attachment]
 
Hilarious how you don't even know the meaning of the terms you throw about.

Fast sync increases input lag because it holds frames until they sync with the monitor's refresh rate; when FPS is above the monitor's refresh rate, those extra frames are literally not shown. To achieve lower input lag than with fast sync off, you would need FPS at twice or more what your monitor can display, i.e. 240+ on a 120 Hz display. This is to reduce screen tearing.

FPS caps cause lower input delay, regardless of whatever theoretical 1-2 ms advantage you can get by going uncapped. For example, 237 FPS, which is what you should cap a 240 Hz system to for minimal input delay and consistent frametimes, gives 4.2 ms frametimes. Moving to 300 FPS would reduce that by less than one ms, to 3.3 ms frametimes, but that number would fluctuate much more than a locked FPS.

When your FPS is above the VRR range of your monitor, VRR is not engaged, increasing input delay.

Muscle memory in games is tied to stable frame rates; consistency is much more important than higher numbers, but I understand many people don't quite get that.

To add to that: when the render pipeline is full on the GPU, it increases latency, hence why the only thing NVIDIA Reflex really does is prevent games from hitting 100% GPU load, thus drastically reducing input latency.
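As a rough illustration of the queue effect, here's a toy model; it assumes input is sampled when the CPU submits a frame and seen when the GPU finishes it, and the numbers are illustrative, not measurements:

```python
# Toy model: with the GPU maxed out, a render queue of N frames sits
# between input sampling (CPU submit) and display (GPU finish), so each
# input also waits behind every frame already queued.
def pipeline_latency_ms(frametime_ms: float, queued_frames: int) -> float:
    return frametime_ms * (1 + queued_frames)  # own frame + queue ahead of it

frametime = 8.3  # ~120 FPS
for depth in (0, 1, 2, 3):
    print(f"queue depth {depth}: ~{pipeline_latency_ms(frametime, depth):.1f} ms")
# A frame cap (or Reflex) keeps the queue near zero, which is the point above.
```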
 
Fast sync
That's the word I was looking for, thanks!

100 frames in a second = an average frametime of 10 ms

120 frames in a second = an average frametime of ~8.3 ms

By capping the framerate there is no guarantee that you'll get more even frametimes, but you will certainly get longer frametimes. Also, Enhanced Sync and Fast Sync do not cap the framerate; the game still runs uncapped internally, you just don't see the tearing. And VRR obviously needs a capped framerate for it to work, nothing interesting there.
I know what FPS and ms mean, thanks. ;)

You're right about Enhanced Sync and Fast Sync; that's why I said combined with an FPS limit! When you have Fast Sync enabled, your GPU calculates all the frames and discards the ones above your monitor's refresh rate to eliminate tearing. An FPS limit will force the GPU not even to calculate those frames. So when you set your FPS limit to the monitor's refresh rate, and the game can actually achieve that, you get butter-smooth, even frame times and no tearing, with no unnecessary heat introduced to your PC. I don't know what more you could want.
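For anyone curious what a frame limiter actually does under the hood, here's a minimal sketch; render_frame() is a hypothetical stand-in for the game's work, and real limiters (RTSS, driver-level caps) use more sophisticated waits:

```python
# Minimal frame limiter: after rendering, wait until the next frame's
# deadline before starting the next one.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    pass  # placeholder for the game's simulation + draw work

start = time.perf_counter()
next_deadline = start
frames = 120  # two seconds' worth at the target, for the demo
for _ in range(frames):
    render_frame()
    next_deadline += FRAME_BUDGET
    # Sleep off most of the remaining budget, then spin for precision.
    while (remaining := next_deadline - time.perf_counter()) > 0:
        time.sleep(remaining * 0.8 if remaining > 0.002 else 0)

elapsed = time.perf_counter() - start
print(f"{frames} frames in {elapsed:.2f} s -> {frames / elapsed:.1f} FPS")
```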

Apologies for the quality of my drawing, but I made a diagram to illustrate why your FPS and ms equivalencies don't mean anything to the human eye:
[attachment: frametime diagram]


Needless to say, the first image shows an uncapped frame rate, the second a capped one at 60 FPS. While the average frametime is lower in the first case, the second one is a smoother experience.
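To put numbers on the drawing, here's a toy simulation of the two cases; the jitter distribution is invented purely for illustration:

```python
# Uncapped: lower average frametime, but jittery. Capped: slightly higher
# average, near-zero variance. The jitter numbers are made up.
import random
import statistics

random.seed(1)

uncapped = [random.uniform(8.0, 16.0) for _ in range(1000)]  # ~83 FPS avg
capped = [1000.0 / 60.0] * 1000                              # locked 60 FPS

for name, fts in (("uncapped", uncapped), ("capped@60", capped)):
    print(f"{name:>9}: mean {statistics.mean(fts):5.2f} ms, "
          f"stdev {statistics.pstdev(fts):5.2f} ms")
```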
 
That's the word I was looking for, thanks!


I know what FPS and ms mean, thanks. ;)

You're right about Enhanced Sync and Fast Sync; that's why I said combined with an FPS limit! When you have Fast Sync enabled, your GPU calculates all the frames and discards the ones above your monitor's refresh rate to eliminate tearing. An FPS limit will force the GPU not even to calculate those frames. So when you set your FPS limit to the monitor's refresh rate, and the game can actually achieve that, you get butter-smooth, even frame times and no tearing, with no unnecessary heat introduced to your PC. I don't know what more you could want.

Apologies for the quality of my drawing, but I made a diagram to illustrate why your FPS and ms equivalencies don't mean anything to the human eye:
[attachment: frametime diagram]

Needless to say, the first image shows an uncapped frame rate, the second a capped one. While the average frametime is lower in the first case, the second one is a smoother experience.
Considering I'm finding both you and those you're debating with 'correct', I think there's a perspective conflict.

- An esports/twitch gamer needs frametimes and input latency as low as possible.

- Many gamers just want a pleasant experience that doesn't break immersion with technical issues.
(Ex.: VSYNC breaks my immersion with perceptible input latency, but for others it guarantees a smooth experience.)

As is being gone over, there are many 'compromises' between the two.

10 years ago I'd be 100% for the lowest latencies, no matter what.
Now, I'm good with a compromise, as long as I can't feel input latency and it's a smooth experience.
 
Hilarious how you don't even know the meaning of the terms you throw about.

What's hilarious is that you don't even know what's being discussed.

This whole thing was about capping the framerate, which neither Fast Sync nor Enhanced Sync does; the games run uncapped, and the frames are stored in a buffer that then shows them in sync with the monitor's refresh rate.

FPS caps cause lower input delay, regardless of whatever theoretical 1-2 ms advantage you can get by going uncapped.

Dude, you're literally contradicting yourself; you admit going uncapped does in fact lower your input lag. There is no point in discussing this anymore.

but I made a diagram to illustrate why your FPS and ms equivalencies don't mean anything to the human eye:
The quality of the drawing is fine; it's just not representative of reality. This is what a typical frametime graph looks like. As you can see, the variance is very low and few frames go outside the expected value. You can cap your framerate if you want, but it doesn't really get you anything.

[attachment: frametime graph]


You're right about Enhanced Sync and Fast sync, that's why I said combined with an FPS limit
But there is no point in doing that; the whole reason those settings were invented in the first place is to not limit the FPS. That's when they work best.
 
Dude, you're literally contradicting yourself; you admit going uncapped does in fact lower your input lag. There is no point in discussing this anymore.
Man doesn't even understand the difference between input lag and frametime.

/end of discussion.
 
What's hilarious is that you don't even know what's being discussed.

This whole thing was about capping the framerate, which neither Fast Sync nor Enhanced Sync does; the games run uncapped, and the frames are stored in a buffer that then shows them in sync with the monitor's refresh rate.


Dude, you're literally contradicting yourself; you admit going uncapped does in fact lower your input lag. There is no point in discussing this anymore.


The quality of the drawing is fine; it's just not representative of reality. This is what a typical frametime graph looks like. As you can see, the variance is very low and few frames go outside the expected value. You can cap your framerate if you want, but it doesn't really get you anything.

[attachment: frametime graph]


But there is no point in doing that; the whole reason those settings were invented in the first place is to not limit the FPS. That's when they work best.

Do yourself a favor and read the link @dgianstefani posted before; I reckon it will blow your mind.

Alternatively, I would recommend watching one of battlenonsense's many videos on the topic; this one, for example.

 
Also, we should get back on topic, or the mods will get upset.
 
Man doesn't even understand the difference between input lag and frametime.

You don't understand anything at all. I don't know what you stand to gain by pretending to be so dense.

Lower frametimes are equivalent to having higher frames per second, which means lower input lag. Is that hard to comprehend?
 
You do increase input lag by having a GPU bottleneck. That's like... Computing 101. That's why Reflex exists: it stops the CPU from rendering ahead and makes it wait for the GPU, so that each frame on your screen is the most recent one.

Lower frametimes are equivalent to higher frames per second, which means lower input lag. Is that hard to comprehend?
Not true
 
Considering I'm finding both you and those you're debating with 'correct', I think there's a perspective conflict.

- An esports/twitch gamer needs frametimes and input latency as low as possible.

- Many gamers just want a pleasant experience that doesn't break immersion with technical issues.
(Ex.: VSYNC breaks my immersion with perceptible input latency, but for others it guarantees a smooth experience.)

As is being gone over, there are many 'compromises' between the two.

10 years ago I'd be 100% for the lowest latencies, no matter what.
Now, I'm good with a compromise, as long as I can't feel input latency and it's a smooth experience.
I completely agree. Traditional Vsync breaks my immersion too. My compromise is Enhanced/Fast Sync and a 60 FPS cap. :)

The quality of the drawing is fine; it's just not representative of reality. This is what a typical frametime graph looks like. As you can see, the variance is very low and few frames go outside the expected value. You can cap your framerate if you want, but it doesn't really get you anything.

[attachment: frametime graph]
It does, because if you zoom in on this graph enough, you'll see that the 15-ish ms range on the 3090 Ti and the 11-ish ms range on the 4080 are full of small stutters, which some people can pick up on (I sort of can and can't, depending on the game). Even if you can't, there is no reason the game should run faster than your monitor's refresh rate: you don't even see those frames, but you do add heat to your PC.

But there is no point in doing that; the whole reason those settings were invented in the first place is to not limit the FPS. That's when they work best.
The only point is that when your frame rate cap = your monitor's refresh rate = the frame rate the game actually achieves, you may still see occasional small tears, which you eliminate this way. I know it's overkill, but why not. :)

Lower frametimes are equivalent to having higher frames per second ...
Correct.

... higher frames per second which means lower input lag.
No. Input lag and frame rate/time are somewhat related, but essentially completely different things.
 
It does, because if you zoom in on this graph enough, you'll see that the 15-ish ms range on the 3090 Ti, and the 11-ish ms range on the 4080 are full of small stutters which some people can pick up on
They are not really 15 or 11 ms; they're more like <5 ms differences in frametimes. That's 1/200th of a second of variation going from frame to frame. No one can pick up on that, be real.

Even if you can't, there is no reason the game should run faster than your monitor's refresh rate.
There is if you play competitively; no one caps their game in those situations. And I'm not even saying that actually matters; I never run games uncapped, I couldn't care less, but that's objectively the best-case scenario for lowering your input lag.
 
Lower frametimes are equivalent to having higher frames per second which means lower input lag.
Interrelation is not the same as equivalence.

There is a close association between higher (apparent/average) FPS, lower frametimes, and input latency. However, they're not the same.
Factors beyond the game and your GPU can easily come up and throw one or two of those variables askew while leaving a third seemingly unaffected.

Example (that's actually happened to me): a bad SATA cable causing periodic microfreezing or system latency spikes.
 
No. Input lag and frame rate/time are somewhat related, but essentially completely different things.
Uhm, yes it does.

The higher the frame rate, the higher the rate at which the game logic can poll inputs from the user, and naturally, the higher the frame rate, the faster those frames reach the user. This should be obvious.
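For what it's worth, the chain this argument rests on can be written down as a toy model; it ignores display/OS latency and assumes an empty render queue, which is exactly the part being contested upthread:

```python
# Simplified input-to-photon chain: an input lands at a random moment,
# is picked up at the next frame boundary (average wait: half a
# frametime), then takes one frametime to render. Illustrative only.
def avg_input_to_photon_ms(fps: float) -> float:
    frametime = 1000.0 / fps
    return 0.5 * frametime + frametime  # sampling wait + render time

for fps in (60, 120, 240):
    print(f"{fps:>3} FPS -> ~{avg_input_to_photon_ms(fps):.1f} ms")
# Higher FPS does shorten this chain; the counterpoint is that a full
# GPU queue adds far more delay than these gains.
```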

There is close-association between higher (apparent/average) FPS, lower frametimes, and Input Latency.
They are not just closely related; one leads to the other.

Can you match the input lag of a game where the lowest frame time is 11 ms with a game where the lowest frame time is 10 ms? Obviously not; it's physically impossible. These things are intrinsically linked.
 
They are not really 15 or 11 ms; they're more like <5 ms differences in frametimes. That's 1/200th of a second of variation going from frame to frame. No one can pick up on that, be real.
I am real. I have seen 100+ FPS that looked like a stuttery mess... on a 60 Hz screen! When your frame time changes between every single frame, it can be an awful experience if you're sensitive to such stuff. I am.

Question: Do you watch films? Do you see their 24 frames per second being smooth?

There is if you play competitively; no one caps their game in those situations. And I'm not even saying that actually matters; I never run games uncapped, I couldn't care less, but that's objectively the best-case scenario for lowering your input lag.
I give up.
 