
Since when did 60 fps gaming become a must?

What matters is VSync now
 
[attached chart: perceivable FPS by player category]


Actually, any normal person can perceive up to 100 FPS (black). A casual gamer (gray) could achieve better results with a 144 Hz monitor - of course, only if their system can handle 144 FPS. Pro gamers (green) are at a level where a 240 Hz monitor yields better results. Of course, the jump from 60 to 144 Hz/FPS is much more noticeable than the jump from 144 to 240 FPS.
Then I'm not a normal person as I can't tell the difference between 40 and 60 fps. :roll:

How can I find and train my own FPS perception limit?
Why would I want to do that?
 
I went through a phase where I ran FRAPS and checked it regularly. Now I just run the game, and if I don't experience any issues playing it, I don't concern myself with the FPS.
 
Then I'm not a normal person as I can't tell the difference between 40 and 60 fps. :roll:


Why would I want to do that?
In case you are a teen playing on a tournament team.
I had several customers like that; they came with their parents or trainers to our shop and bought 240 Hz monitors or graphics cards.

Talk To Me Hello GIF by Stephen Maurice Graham
 
In case you are a teen playing on a tournament team.
I had several customers like that; they came with their parents or trainers and bought 240 Hz monitors or graphics cards.

Talk To Me Hello GIF by Stephen Maurice Graham
I played with shitty fps when I was a teen... now I need that 60+ fps as an adult. :laugh:
 
The flickering was the worst thing about CRTs, at least for my eyes. I had an HP 21" Trinitron running 1600x1200 @ 85 Hz :)

Funny thing about that, when I was younger I always preferred the look and feel of CRT. When I moved to LCD it really magnified the abysmal performance of the hardware I had :D I swear that CRT flicker can make anything look smooth.

But nowadays a PWM screen can easily make my head pound, so I wouldn't be surprised if a CRT gave me a similar splitting headache.
 
Funny thing about that, when I was younger I always preferred the look and feel of CRT. When I moved to LCD it really magnified the abysmal performance of the hardware I had :D I swear that CRT flicker can make anything look smooth.

But nowadays a PWM screen can easily make my head pound, so I wouldn't be surprised if a CRT gave me a similar splitting headache.
Ah, what I meant is that CRTs @ 60 Hz hurt my eyes; at 75 Hz or higher the headache was nonexistent :)
 
In case you are a teen playing on a tournament team.
I had several customers like that; they came with their parents or trainers to our shop and bought 240 Hz monitors or graphics cards.

Talk To Me Hello GIF by Stephen Maurice Graham
them kids really believing a $1k monitor and RGB keyboards will give them skillz :roll:

I'm not very fond of multiplayer games, but whenever I played UT99 I would be called a cheater and even get kicked out of servers when the admins couldn't stop crying after getting rekt that hard. My PC was crap and I played with a Microsoft BALL MOUSE, duh
the ball was pretty clean though
 
them kids really believing a $1k monitor and RGB keyboards will give them skillz :roll:

I'm not very fond of multiplayer games, but whenever I played UT99 I would be called a cheater and even get kicked out of servers when the admins couldn't stop crying after getting rekt that hard. My PC was crap and I played with a Microsoft BALL MOUSE, duh
the ball was pretty clean though
Agree. I'll get a 4K60 monitor next myself.
 
Then I'm not a normal person as I can't tell the difference between 40 and 60 fps. :roll:


Why would I want to do that?
How can I find and train my own FPS perception limit?
By the way, it is also beneficial for us "old hands" to train our brain, reactions, muscle memory, etc.

Funny Face Reaction GIF by TikTok


them kids really believing a $1k monitor and RGB keyboards will give them skillz :roll:
Nah - the monitor is €250, because tournament resolution is 1080p.
The graphics card was the worst part.




I am one of those stupids... :D :roll:

serve music video GIF by Polyvinyl Records
 
When we moved away from interlaced displays that doubled the effective framerate

/endthread
 
When we moved away from interlaced displays that doubled the effective framerate

/endthread
Did you ever try those old-school 3D glasses?
 
I blame the tick-tock effect. (I would now like to point out that it is different from the TikTok effect.*) It's the eternal cycle of releasing serious eye-candy games that even the halo products struggle to run at 60+ FPS - the tick. Then hardware comes out that makes them playable at the low end and buttery at the high end - the tock. Then rinse and repeat.

Screens that used to be just for shooters became more mainstream too, imho.

Younger players joining the game market will also drive this, as they are not used to waiting for their computing devices. If they feel any sort of lag or delay, there is a problem - and they probably grew up playing older games on newer graphics cards. I ought to know, as I am one of them...

Edit: the TikTok effect, in case you live under a rock, is applied science of natural selection in modern society.
 
So far, only US Air Force pilots who have perceived 220 FPS in tests (orange) have been shown to exceed this.
Sorry to be all pedantic, but those were NAVY pilots. :D

And it wasn't measured in FPS but in seconds - the image was flashed for 1/220 of a second, which is where the "220 FPS" figure comes from (human eyes don't "see" in fps :cool:)

I attached what I believe is the whole study.

And sorry - this has come up over the years, and I just couldn't help making the corrections.
Andy Richter Shrug GIF by Team Coco
 


Younger players joining the game market will also drive this, as they are not used to waiting for their computing devices. If they feel any sort of lag or delay, there is a problem
Well, this is more of a behavioural issue caused by terrible parenting, but hey, I'm not their mom.

Ah, what I meant is that CRTs @ 60 Hz hurt my eyes; at 75 Hz or higher the headache was nonexistent :)
At least for me it depends on the CRT. My first one was amber monochrome, and staring at it for more than 30 minutes was painful; I had the text and Win 3.1 menus imprinted in my eyes for a while, heh. The one I'm using now runs at 85-95 Hz and I don't have any issues. I even have the "eye-care blue light filter", which is simply a slightly smoked glass on top of the screen.
 
Well, this is more of a behavioural issue caused by terrible parenting, but hey, I'm not their mom.
Nah, what I was describing was more of a mindset. If you had used a smartphone your whole life as your main tech item, even the fastest computer would feel slow. I can also say that I grew up playing games that were by no means hardware-limited and felt smooth on any system. Now any new game needs the same sort of smoothness and clarity while being much, much more graphically intensive. I am still training myself to get over lag and stuttering, as I am far too broke to upgrade and my main computer is a work in progress right now.
 
At least for me it depends on the CRT. My first one was amber monochrome, and staring at it for more than 30 minutes was painful; I had the text and Win 3.1 menus imprinted in my eyes for a while, heh. The one I'm using now runs at 85-95 Hz and I don't have any issues. I even have the "eye-care blue light filter", which is simply a slightly smoked glass on top of the screen.
Man, my eyes melt even when I watch a video of an old computer with an amber monitor...
 
I'd say when 1990s twitch shooters became a thing :D

I've always preferred to play games at 60 fps or above. 120 will spoil you something awful, too! Very occasionally some games are passable at 30, but I would not call any of them preferable, to be honest.
 
It came up ca. 2013/2014.

I found something on this website and translated it:

https://wiki.delphigl.com/index.php/Framerate

No screen can display an arbitrary number of frames per second. A common refresh rate for flat-panel displays is 60 Hz (meaning the image is refreshed 60 times per second), but there are also some with 120 Hz (for stereoscopy) or 200 Hz. Since 60 Hz flickers badly on CRT screens, those old boxes can often also do 75, 85, or 100 Hz. No matter how often a monitor can refresh its image per second, there is no reason (apart from exceptional cases like benchmarking) to let the graphics card render more frames than that, especially since doing so leads to unwanted artifacts (tearing).

Therefore, there is a technique called V-Sync: the graphics card waits until the new image has been transferred to the screen before starting the next one. It is highly recommended to turn on V-Sync if the computer can render more frames per second than the screen can display anyway. But what happens when V-Sync is on and the computer can't match the screen's native frame rate?
Let's say the screen refreshes 60 times per second, but the GPU can only manage 50 frames per second. As soon as the graphics card receives the synchronization signal, it copies the backbuffer contents into the front buffer and renders the next frame into the backbuffer. 16.67 ms later the next synchronization signal arrives, but the graphics card is not yet ready, so nothing is copied into the front buffer and the screen displays the same image twice in succession. Only 3.33 ms after that does the graphics chip finish the picture, and it must now wait 13.34 ms (!) for the next sync signal. By the time that arrives, a total of 33.33 ms has passed - in other words, we get a frame rate of only 30 fps, although the computer could manage 50 fps!
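To make that arithmetic concrete, here's a minimal Python sketch (my own illustration, not from the wiki): double-buffered V-Sync effectively rounds the render time up to a whole number of refresh periods.

```python
import math

def effective_fps(render_ms: float, refresh_hz: float) -> float:
    # Double-buffered V-Sync: a finished frame must wait for the next
    # refresh tick, so the frame interval gets rounded UP to a whole
    # number of refresh periods.
    period_ms = 1000.0 / refresh_hz
    frame_ms = math.ceil(render_ms / period_ms) * period_ms
    return 1000.0 / frame_ms

print(round(effective_fps(20.0, 60), 1))  # 20 ms render (50 fps) -> 30.0
print(round(effective_fps(16.0, 60), 1))  # faster than the screen -> capped at 60.0
```
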
Triple buffering
The problem here is that the graphics card does nothing while waiting for the sync signal. But there is a solution: since we are not allowed to write into the backbuffer yet, we just write into another backbuffer! While we write into the second backbuffer, the first one is copied into the front buffer as soon as the sync signal arrives. In the next frame we write into the first backbuffer again, while the second one waits to be copied. Thus we achieve the same performance as without V-Sync, but at the same time we get rid of the tearing.
A disadvantage of this method is that we need a bit more video memory. However, the additional consumption is kept within limits, because only one additional color buffer is needed - we don't need the Z and stencil buffers again. Of course, you also get more micro-stutters - but let's be honest: 50 fps with micro-stutters is still better than 30 fps without.

OpenGL itself does not offer the possibility to switch triple buffering on or off. This is up to the driver. You can emulate triple buffering using an FBO, but then you actually have quadruple buffering if the driver already takes care of it. (source: [1])
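A companion sketch for the triple-buffering case (again just my illustration): the GPU renders continuously, and each refresh tick scans out the newest completed frame. The full 50 fps is preserved; the occasionally repeated frame is the micro-stutter mentioned above.

```python
def frames_shown(render_ms: float, refresh_hz: float, ticks: int) -> list:
    # Triple buffering: the GPU never stalls, so frame k completes at
    # (k + 1) * render_ms. Each refresh tick displays the newest
    # completed frame; -1 means nothing has finished yet.
    period_ms = 1000.0 / refresh_hz
    return [int(t * period_ms // render_ms) - 1 for t in range(1, ticks + 1)]

# 20 ms per frame (50 fps) on a 60 Hz screen:
print(frames_shown(20.0, 60, 12))
# [-1, 0, 1, 2, 3, 4, 4, 5, 6, 7, 8, 9] -- frame 4 is shown twice
# (a micro-stutter), but all rendered frames still reach the screen: 50 fps.
```
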

Variable refresh rate
Currently (2014) there are efforts in the industry to develop displays with variable refresh rates. Names for this are Adaptive Sync (DisplayPort standard), G-Sync (Nvidia) and FreeSync (AMD). The synchronization problem between screen and graphics card is approached from the opposite side: it is no longer the screen that dictates the clock at which the graphics card should deliver image data; instead, the graphics card signals the screen when it is done with its calculations. It then transmits the new image to the screen, which displays it immediately.
The advantage: latency (the time between user input and image output) is reduced, triple buffering becomes unnecessary, and the micro-stutters it causes disappear. Tearing does not appear either, since synchronization still takes place.
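With variable refresh the relationship becomes almost trivial, since the panel simply waits for the GPU. A sketch, assuming an example 48-144 Hz VRR window (the range is my assumption; real panels differ, and frame repetition below the window's floor is ignored here):

```python
def vrr_frame_ms(render_ms: float, min_hz: float = 48.0, max_hz: float = 144.0) -> float:
    # Adaptive Sync / G-Sync / FreeSync: the display refreshes whenever
    # the GPU finishes, so the displayed interval simply equals the
    # render time, clamped to the panel's supported refresh window.
    # The 48-144 Hz window is an assumed example, not a standard value.
    return min(max(render_ms, 1000.0 / max_hz), 1000.0 / min_hz)

print(round(1000.0 / vrr_frame_ms(20.0), 1))  # 50.0 fps -- no drop to 30 as with fixed V-Sync
```
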
The 2013 era sounds about right. Nvidia had just released G-Sync and AMD was developing FreeSync.

The PS4 and Xbox One came out in late 2013, and I remember that by 2014 both were getting flak for having most games run at only 30 FPS.
 
I remember finishing NFSUG1 on a not-so-old but awful PC (P4-era Celeron 2.4 GHz, nVidia FX5600) at 640x480 and the lowest detail possible. At most 20 FPS (~12 FPS if I turned on car reflections). Since then I consider everything >30 FPS good enough (which most consoles can do).
Oh, and I can still enjoy racing games at 30 FPS, whether PS1-era or current-gen. Maybe not at 24 FPS, though.
 
them kids really believing a $1k monitor and RGB keyboards will give them skillz :roll:

I'm not very fond of multiplayer games, but whenever I played UT99 I would be called a cheater and even get kicked out of servers when the admins couldn't stop crying after getting rekt that hard. My PC was crap and I played with a Microsoft BALL MOUSE, duh
the ball was pretty clean though
Gotta keep those balls clean. :roll:

On a serious note, I've always been crap at reflex-based online shooters. Never liked them, anyway. Maybe that's why I don't need 60 fps in my games up to this day.

By the way, it is also beneficial for us "old hands" to train our brain, reactions, muscle memory, etc.
In what way?
 
Gotta keep those balls clean. :roll:

On a serious note, I've always been crap at reflex-based online shooters. Never liked them, anyway. Maybe that's why I don't need 60 fps in my games up to this day.


In what way?
Oh shit, keeping the balls (other than mine) clean... I don't miss ball mice. :3

The person who invented optical mice is a hero.
 
Gotta keep those balls clean. :roll:

On a serious note, I've always been crap at reflex-based online shooters. Never liked them, anyway. Maybe that's why I don't need 60 fps in my games up to this day.


In what way?
PC games have a positive effect on brain structure

Hippocampus: Switching point in the brain
The hippocampus is an area in the brain where information from different senses converges, is processed and transmitted to the cortex. Thus, it is crucial for the formation of new memories and the transfer of memory content from short-term to long-term memory. Each hemisphere of the brain has a hippocampus.

The hippocampus adapts to current challenges. This is because the brain is not a rigid structure, but changes continuously with our personal experiences, for example through the formation and networking of new nerve cells.

Cognitive tests and magnetic resonance imaging studies of the brain show that computer games are well suited as a challenge for seniors and can compensate for the consequences of a lack of exercise, at least in the brain.

Physical activity has been shown to help prevent and treat dementia. Computer games have similar effects on brain structure, even though the players hardly move at all. They also lead to an increase in the size of the hippocampus. After a certain training phase, the challenges contained in the game are mapped in the relevant areas of the brain.

It is crucial that the players develop a three-dimensional imagination, i.e. that they move through virtual space. Then it doesn't matter to their brain whether they train with real or virtual movement - the hippocampus grows, and with it the performance of memory.

And this applies to everyone, not just the elderly.
 