
Is 24 GB of video RAM worth it for gaming nowadays?

Poll: Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes: 66 votes (41.5%)
  • No: 93 votes (58.5%)
  • Total voters: 159
Uhm, yes it does.

The higher the frame rate, the higher the rate at which the game logic can poll inputs from the user, and naturally a higher frame rate also means those frames reach the user faster. This should be obvious.


They are not just closely related, one leads to the other.

Can you match the input lag of a game where the lowest frame time is 11 ms vs a game where the lowest frame time is 10 ms? Obviously not, it's physically impossible, these things are intrinsically linked.
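A rough back-of-the-envelope sketch of that point, with purely illustrative numbers (the function name and figures are mine, not from any benchmark):

```python
# Toy numbers only: frame rate and frame time are the same quantity viewed
# two ways, and the fastest possible input-to-finished-frame delay cannot
# be shorter than one frame time.

def frame_time_ms_to_fps(frame_time_ms: float) -> float:
    """Milliseconds per frame -> frames per second."""
    return 1000.0 / frame_time_ms

for ft in (10.0, 11.0):
    print(f"{ft:.0f} ms/frame -> {frame_time_ms_to_fps(ft):.1f} fps, "
          f"best-case input-to-finished-frame delay ~{ft:.0f} ms")
```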

Dude, read the link or watch the video...
 
Uhm, yes it does.

The higher the frame rate, the higher the rate at which the game logic can poll inputs from the user, and naturally a higher frame rate also means those frames reach the user faster. This should be obvious.
Except that those frames never reach the user because the monitor can't display them.
 
Question: Do you watch films? Do you see their 24 frames per second being smooth?

Here's one that'll throw everyone for a loop!
I genuinely prefer media 'filmed' at above 'cinematic framerate'.
The 'soap opera effect' for me, has me feeling 'relief' not 'uncanniness'.
 
Except that those frames never reach the user because the monitor can't display them.

Input lag by any sensible definition means the time it takes from user input until the frame has finished rendering. You don't have to see every single frame for the input lag to be lower, that doesn't make any sense.
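For what it's worth, here is a toy latency budget (every number made up) showing that the two definitions being argued over differ only in where you stop counting:

```python
# Invented latency budget, just to show where each definition stops counting.
# None of these numbers are measurements.

stages_ms = {
    "input sampling / OS": 2.0,
    "game simulation (CPU)": 4.0,
    "GPU render": 8.0,
    "wait for next display scanout": 6.0,  # only counted if "on screen" is the cutoff
}

to_render_complete = sum(v for k, v in stages_ms.items()
                         if k != "wait for next display scanout")
to_on_screen = sum(stages_ms.values())

print(f"input -> frame finished rendering: ~{to_render_complete:.0f} ms")
print(f"input -> frame visible on screen:  ~{to_on_screen:.0f} ms")
```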
 
Here's one that'll throw everyone for a loop!
I genuinely prefer media 'filmed' at above 'cinematic framerate'.
The 'soap opera effect' for me, gets the reaction of 'relief' not 'uncanniness'.
I know what you mean. I wish every movie was filmed at high frame rates. But that doesn't change my point that we usually watch 24 FPS movies and enjoy them. ;)
 
Input lag by any sensible definition means the time it takes from user input until the frame has finished rendering. You don't have to see every single frame for the input lag to be lower, that doesn't make any sense.
Not meaning to antagonize, but...
You just gave evidence that frame time, frame rate, and input latency are not implicitly equivalent
Which, is kinda funny, because I *do* agree with what I quoted.
 
I give up.
"Never argue with stupid people, they will drag you down to their level and then beat you with experience." - Mark Twain
LabRat 891 said:
Not meaning to antagonize, but...
You just gave evidence that frame time, frame rate, and input latency are not implicitly equivalent
Which, is kinda funny, because I *do* agree with what I quoted.
dgianstefani said:
Man doesn't even understand the difference between input lag and frametime.

/end of discussion.
^
 
Input lag by any sensible definition means the time it takes from user input until the frame has finished rendering. You don't have to see every single frame for the input lag to be lower, that doesn't make any sense.
No. Input lag means the time it takes from user input until the frame has been displayed on screen. Whether you render a frame or not is irrelevant if you never even see it.
 
"Never argue with stupid people, they will drag you down to their level and then beat you with experience." - Mark Twain
Modernization/addendum:

Never underestimate smart-stupid people. They will beat you with pro-tier rhetoric, and what 'they feel is confirming' evidence.
 
You just gave evidence that frame time, frame rate, and input latency are not implicitly equivalent
I never said they are supposed to be implicitly equivalent. I said average frame rates and frametimes are equivalent; they are measures of the same thing, and having a higher frame rate leads to lower input lag. And it's also completely unrelated to what I said about being able to see every frame or not, or whatever.

Whether you render a frame or not is irrelevant if you never even see it.
What ? :kookoo:

Of course it's relevant, if you render it you might see it, if you don't you never see it.

By the way since you insist, how exactly do you measure how many frames you "see" ?

"Never argue with stupid people, they will drag you down to their level and then beat you with experience." - Mark Twain
Cringe.
 
What ? :kookoo:

Of course it's relevant, if you render it you might see it, if you don't you never see it.
"Might"? What? :kookoo:

By the way since you insist, how exactly do you measure how many frames you "see" ?
You see as many frames per second as your monitor's refresh rate allows, as it is physically unable to display more.
 
"Might"? What? :kookoo:
I think they're referring to the 'diceroll' of what frame is displayed when you're rendering over your refresh rate.
You see as many frames per second as your monitor's refresh rate allows, as it is physically unable to display more.
Yes. But you do have a chance of visually receiving 'more up to date' data. That 'diceroll' chance of getting a 'newer frame' is more or less what causes screen tearing.
To properly and fully quantify this, you'd need an entire study, done across a myriad of LCD panel types, display-controllers, GPUs, cables, and system configurations.
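You probably don't need a full study for the first-order effect, though; here's a toy model of that 'diceroll' (all assumptions and numbers are mine):

```python
# Not a study, just a toy model: the panel latches whatever frame most
# recently finished when each refresh starts, so the faster the render
# rate, the "fresher" that frame tends to be on average.
import random

REFRESH_HZ = 60  # assumed panel refresh rate

def average_frame_age_ms(render_fps: float, trials: int = 100_000) -> float:
    """Average age of the newest finished frame at the moment a refresh starts."""
    frame_interval = 1.0 / render_fps
    # A refresh lands at a random point inside the current frame interval,
    # so the newest finished frame is between 0 and one frame time old.
    total_age = sum(random.uniform(0.0, frame_interval) for _ in range(trials))
    return 1000.0 * total_age / trials

for fps in (60, 144, 300):
    print(f"rendering at {fps:3d} fps on a {REFRESH_HZ} Hz panel: displayed frame "
          f"is on average ~{average_frame_age_ms(fps):.1f} ms old")
```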
 
I am just as dumbfounded by what you said, don't blame me if it doesn't make any sense.

You see as many frames per second as your monitor's refresh rate allows.
By this astoundingly mind-numbing definition, that means the input lag can never be lowered if you match the monitor refresh rate, which is obviously not true. You will almost certainly experience less input lag with a game running uncapped at 300 fps vs one running at 30 fps, even if your refresh rate is not 300 Hz.
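Toy arithmetic for that comparison (the overhead figure is an assumption, not a measurement):

```python
# Invented numbers, just to illustrate the claim: holding everything else
# fixed, the frame time term alone makes uncapped 300 fps respond sooner
# than 30 fps, regardless of the panel's refresh rate.

OTHER_OVERHEAD_MS = 5.0  # assumed fixed cost: input sampling, simulation, scanout, ...

def rough_input_lag_ms(fps: float) -> float:
    frame_time_ms = 1000.0 / fps
    return OTHER_OVERHEAD_MS + frame_time_ms

for fps in (30, 300):
    print(f"{fps:3d} fps uncapped: roughly {rough_input_lag_ms(fps):.1f} ms "
          f"from input to the newest finished frame")
```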

The GPU is sending as many frames as it can render to the monitor, which, if not synchronized, will actually try to display all of them, if only partially; if a new frame is received while the current one is being displayed, you get tearing.
 
Utter nonsense, not that I expect anything less from you.
Well what you expect doesn't matter.


CPU renders frames ahead - said frames are waiting for the GPU (since we are talking about a GPU-bottlenecked scenario) to finish rendering the current frame - which is basically what causes input latency when you are GPU bottlenecked. Your GPU will currently be rendering frames that happened 4 frames in the past.
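A minimal sketch of the queue effect being described, with numbers I made up:

```python
# Sketch of the render-ahead idea: in a GPU-bound game, frames the CPU has
# already prepared wait in a queue for the GPU, so the frame finishing "now"
# was built from input several GPU frame times ago.

GPU_FRAME_MS = 20.0  # assumed GPU time per frame (the bottleneck)

for queued_frames in (1, 2, 3, 4):
    added_lag_ms = queued_frames * GPU_FRAME_MS
    print(f"{queued_frames} frame(s) queued ahead of the GPU -> "
          f"~{added_lag_ms:.0f} ms between preparing a frame and it finishing")
```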
 
CPU renders frames ahead - said frames are waiting for the GPU (since we are talking about a GPU-bottlenecked scenario) to finish rendering the current frame - which is basically what causes input latency when you are GPU bottlenecked.

If the lowest frame time is X ms you cannot get below that, no matter what you do. Input lag has a lower bound imposed by how fast a frame can be finished.

And the CPU doesn't "render" anything ahead, like I said, utter nonsense.
 
Well what you expect doesn't matter.


CPU renders frames ahead - said frames are waiting for the GPU (since we are talking about a GPU-bottlenecked scenario) to finish rendering the current frame - which is basically what causes input latency when you are GPU bottlenecked. Your GPU will currently be rendering frames that happened 4 frames in the past.
It's pointless, he can't differentiate between the fact that frame time and input lag are associated and the fact that input lag is influenced by many other metrics too.

We're arguing logic with someone who doesn't understand definitions, concepts or systems with more than two variables.

Also, we should get back on topic, or mods will get upset.
We need to stay on topic - VRAM in gaming. Don't feed the troll.
 
If the lowest frame time is X ms you cannot get below that, no matter what you do. Input lag has a lower bound imposed by how fast a frame can be finished.

And the CPU doesn't "render" anything ahead, like I said, utter nonsense.
Right, right. Jensen called, he requires your expertise

[Attachment: jensen.JPG]
 
It's pointless, he can't differentiate between the fact that frame time and input lag are associated and the fact that input lag is influenced by many other metrics too.

I mean the dude thinks the CPU is "rendering" frames and you agree with him. :roll:

Clearly I am amongst the brightest minds this forum has to offer.
 
I think they're referring to the 'diceroll' of what frame is displayed when you're rendering over your refresh rate.
Ah! Except that there's no dice roll. Only screen tearing. :D

By this astoundingly mind-numbing definition, that means the input lag can never be lowered if you match the monitor refresh rate, which is obviously not true. You will almost certainly experience less input lag with a game running uncapped at 300 fps vs one running at 30 fps, even if your refresh rate is not 300 Hz.

The GPU is sending as many frames as it can render to the monitor, which, if not synchronized, will actually try to display all of them, if only partially; if a new frame is received while the current one is being displayed, you get tearing.
So you're perceiving partially displayed frames as lower input lag because half of a frame that your monitor can't cope with is already displayed, while it wouldn't be if you synchronised it. I get it now.

At first, I thought you were confusing input lag and frame rates. Now, I just think you're under the belief that a frame rate cap works the same way as traditional Vsync, while it absolutely does not.
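Here's roughly how I read that distinction, as a sketch (no real swapchain API, just pseudologic in Python with assumed timings):

```python
# Pseudologic only, not any real API: a frame rate cap merely delays the
# start of the next frame, while traditional Vsync holds a finished frame
# until the next vertical blank (and lets buffered frames pile up behind it).
import time

TARGET_FRAME_S = 1.0 / 144  # assumed frame-limiter target
REFRESH_S = 1.0 / 60        # assumed panel refresh interval

def frame_cap_wait(frame_start_s: float) -> None:
    """Frame limiter: sleep only if the frame finished early; never holds it back."""
    remaining = TARGET_FRAME_S - (time.perf_counter() - frame_start_s)
    if remaining > 0:
        time.sleep(remaining)

def vsync_hold_s(frame_done_s: float) -> float:
    """Traditional Vsync: extra time a finished frame waits for the next vblank."""
    next_vblank = (frame_done_s // REFRESH_S + 1) * REFRESH_S
    return next_vblank - frame_done_s

print(f"frame finished 3 ms after a vblank -> Vsync holds it another "
      f"~{1000 * vsync_hold_s(0.003):.1f} ms; a plain frame cap would not")
```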

Feel free to prove me wrong.
 
It's pointless, he can't differentiate between the fact that frame time and input lag are associated and the fact that input lag is influenced by many other metrics too.

We're arguing logic with someone who doesn't understand definitions, concepts or systems with more than two variables.


We need to stay on topic - VRAM in gaming. Don't feed the troll.
He goes around every topic, being wrong in all of them. I don't know wtf is going on. I don't believe he is real for a second, he can't be wrong on absolutely everything... He is just baiting us.
 
And the CPU doesn't "render" anything ahead, like I said, utter nonsense.
No, it prepares the frame that you currently instructed it to work on while the GPU still renders the one from 4 frames before. That's technically output lag, but let's call it input lag now for simplicity's sake.
 