Some humans can see faster refresh rates than others, and I am one of those humans.

Did you read my post? What did I write? About TV systems? It has been true ever since PAL, SECAM and NTSC came along. I can explain why it is that way and how the underlying technology (cathode ray tube = CRT) works. At least in German.
The discussion here is mostly centered around modern display tech, not outdated TV standards. I am perfectly aware of the topic you mentioned, but your previous post was talking about something that has no bearing on the discussion. The user you replied to was clearly talking about TV as a display device, not in the context of broadcast standards, which are pretty much irrelevant in modern times.

For example, with IQ testing (which is relative, not absolute, BTW; it's based on the average, which is set at 100 and shifts depending on region and year), current research suggests that most people with IQs below 90 cannot follow a hypothetical conditional: "how would you feel if you didn't eat lunch?" "what do you mean, I did eat lunch?" etc. It's also tied in with recursion, or mapping.
Hot take - IQ tests are mostly bullshit.
 
I remember going from a 60 Hz to a 144 Hz monitor felt like a bigger upgrade than going from an i5-2300/RX 580 to a Ryzen 2600/GTX 1070 in the end. My K/D ratio doubled after each upgrade, I swear. lolz
 
That journal article is based on LED flicker; this discussion is more about the point at which people stop gaining any fluidity or perceived framerate improvement.
I think? The OP linked the research paper and the TOMS article that uses the paper on LED flicker to apparently extrapolate fluidity? So that TOMS article and this thread are just bordering on speculation?
 
Hot take - IQ tests are mostly bullshit.
Many online ones you do yourself are typically inaccurate in one way or another, so it depends on the quality of the test, but IQ is pretty well studied and utilized in modern psychiatry/psychology. What's important to note is the high consistency and reliability of research done using accurate IQ testing. Results are repeatable and fit into the developing scientific understanding of how our bodies work, physically and mentally.

Anyway, it's a bit off topic, but things like spatial reasoning etc. all have close ties with IQ in general. Cognitive testing is one of the most accurate predictors of real-world performance, and we've known this for many years.
https://doi.org/10.1016/j.trip.2023.100783

Something that's interesting to me is the link between physical and/or mental health, and cognitive/performance testing. For example, people with good mental health also tend to score better on these tests.

There's also evidence that physical health contributes to better brain function (duh) and therefore to how well you score on a cognition/IQ test, although many of the studies are conducted on older people, as funding is tied to research on Alzheimer's and other degenerative diseases.

So some good news for us gamers: the reasoning, sensory and cognitive tasks of gaming help keep the brain fit!
 
The "can you rotate an apple in your head" test: mental rotation.

Some people don't have an inner monologue either which is incredibly fascinating to me.

Cognitive testing is quite interesting; the basic tests are cool, but so are the more specific ones that most people haven't heard of.

For example, with IQ testing (which is relative, not absolute, BTW; it's based on the average, which is set at 100 and shifts depending on region and year), current research suggests that most people with IQs below 90 cannot follow a hypothetical conditional: "how would you feel if you didn't eat lunch?" "what do you mean, I did eat lunch?" etc. It's also tied in with recursion, or mapping.
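Since "relative, not absolute" trips people up, here is a minimal sketch of how deviation IQ scoring works in principle: a raw score is standardized against the norming sample, then rescaled so the mean maps to 100 and one standard deviation to 15 points. The function name and the sample numbers below are made up purely for illustration.

```python
# Minimal sketch of deviation IQ scoring (illustrative numbers only).
from statistics import mean, stdev

def deviation_iq(raw_score: float, norming_sample: list[float]) -> float:
    mu = mean(norming_sample)
    sigma = stdev(norming_sample)
    z = (raw_score - mu) / sigma   # distance from the norm-group average, in SDs
    return 100 + 15 * z            # rescale: mean -> 100, one SD -> 15 points

# Hypothetical norming sample of raw test scores (not real data):
sample = [38, 41, 45, 47, 50, 52, 55, 58, 61, 63]
print(round(deviation_iq(51, sample)))   # around 100: an average raw score
print(round(deviation_iq(63, sample)))   # above 100: better than the norm group
```

Because the scale is re-normed periodically against a fresh sample, the same raw performance can land on a different IQ number in a different year or region, which is all "relative, not absolute" means here.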

I'm not suggesting that how well people can differentiate between different FPS is related to IQ, in fact I highly doubt it, but there are very interesting and measurable differences between different people when it comes to how we perceive or understand things.

I suspect that, like many things related to the brain, FPS sensitivity is partially determined by how well you "train" your brain, i.e. people who play a lot of fast-paced competitive shooters at a younger age and continue to do so likely have more acute sensitivity.
Yeah, people are wired very differently. Brain function is still very poorly understood; we're getting there, but we're still a long way off truly understanding it.

"Smart" is a very hard-to-nail-down metric, too. I sail through IQ tests at 130-145 depending on the test, but I consider myself slow. As in slow to process stuff: I'll be the last person in the room to get a joke, the last person to arrive at an answer, and I always need the full time in any exam that involves calculations. I'll get there in the end, with consistently right answers, but I'm in awe of people who process stuff at something like 2-3x my speed when it comes to learning by reading.

IQ is largely a measure of spatial/pattern awareness with a good bit of reasoning and a splash of knowledge testing thrown in. I'd guess that there's little or only weak correlation between IQ and the framerate of a person's vision. If you read around the subject of vision speed, top athletes and race drivers are often cited as having the fastest vision, and their average IQ was (in the one study I saw) around 100, which makes sense, since physical fitness isn't a measure of mental fitness, and if you take enough IQ samples from the population, you should tend toward the average IQ near 100.
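That last bit is just the law of large numbers at work. A minimal sketch, assuming IQ is roughly normal with mean 100 and SD 15 (which is how the scale is defined), showing that bigger groups of people average out closer to 100:

```python
# Minimal sketch: sample means drift toward the population mean as samples grow.
# Assumes IQ ~ normal(100, 15), purely for illustration.
import random

def sample_mean_iq(n: int) -> float:
    return sum(random.gauss(100, 15) for _ in range(n)) / n

for n in (5, 50, 5000):
    print(n, round(sample_mean_iq(n), 1))   # larger samples land closer to 100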
 
Yeah, people are wired very differently. Brain function is still very poorly understood; we're getting there, but we're still a long way off truly understanding it.

"Smart" is a very hard-to-nail-down metric, too. I sail through IQ tests at 130-145 depending on the test, but I consider myself slow. As in slow to process stuff: I'll be the last person in the room to get a joke, the last person to arrive at an answer, and I always need the full time in any exam that involves calculations. I'll get there in the end, with consistently right answers, but I'm in awe of people who process stuff at something like 2-3x my speed when it comes to learning by reading.

IQ is largely a measure of spatial/pattern awareness with a good bit of reasoning and a splash of knowledge testing thrown in. I'd guess that there's little or only weak correlation between IQ and the framerate of a person's vision. If you read around the subject of vision speed, top athletes and race drivers are often cited as having the fastest vision, and their average IQ was (in the one study I saw) around 100, which makes sense, since physical fitness isn't a measure of mental fitness, and if you take enough IQ samples from the population, you should tend toward the average IQ near 100.
Yeah there are different types of "smarts", for sure.

Just like people who are physically inept with certain tasks may excel in others.

I don't think that high physical acuity necessarily always means high mental acuity, but it does seem like there's an association between the two for people who push themselves a little.

What's nice is that the tasks that improve one, often improve the other.

For example, social exercise as you age is incredibly important to maintaining brain and body health; loneliness can lead to premature death, etc. So many non-obvious associations.

We then provided an overview of the exercise physiology and further showed that the body’s adaptations to enhance exercise performance also benefit the brain and contribute to improve cognitive performance.

I do find it somewhat funny that people have different "stats" like in video games. And some/many of these you can actually improve with lifestyle and training.

I wonder if this is also true for FPS sensitivity. I'd be inclined to think it is, at least to a point.
 
I think? The OP linked the research paper and the TOMS article that uses the paper on LED flicker to apparently extrapolate fluidity? So that TOMS article and this thread are just bordering on speculation?
The TOMS (you mean IGN?) article is written as speculation, based on extrapolation of that paper.

But other documentaries I've seen, as well as plenty of empirical, repeatable data prove there is no need to speculate. (eg vision speed testing/temporal resolution testing of USAF top gun trainee selection process, eye-tracking of a rally driver to see how fast he spotted new objects coming into view and how quickly he adjusted steering/throttle to reroute, and Microsoft's research blogs on framerate smoothness and input lag for displays and pen-tablets up to 1000Hz)

Some people can see faster than me, and I can definitely see faster than other people. It's measurable, it's easy to test, and some YouTubers have even done it (the Shroud collab with LTT, I believe, was the video where they blind-tested a couple dozen staff to see who could guess the framerate).
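If anyone wants to try something similar at home, here is a minimal sketch of that kind of blind test; the set_refresh_rate() helper below is hypothetical (you would swap in whatever your OS, monitor OSD or in-game limiter exposes), and the point is just the randomized, guess-before-reveal structure.

```python
# Minimal sketch of a blind refresh-rate guessing test.
# Assumption: set_refresh_rate() is a hypothetical placeholder; in practice a
# second person would change the display mode so the subject stays blind.
import random

RATES = [60, 120, 144, 240]   # modes available on the test display
TRIALS = 10

def set_refresh_rate(hz: int) -> None:
    pass  # placeholder: hook up to your display settings or frame limiter

def run_blind_test() -> None:
    correct = 0
    for _ in range(TRIALS):
        actual = random.choice(RATES)
        set_refresh_rate(actual)
        guess = int(input(f"Guess the refresh rate {RATES}: "))
        correct += (guess == actual)
    print(f"{correct}/{TRIALS} correct; pure guessing averages about {TRIALS / len(RATES):.1f}")

if __name__ == "__main__":
    run_blind_test()
```

Anything consistently above chance across enough trials is the "I can see it" result; anything hovering around chance settles the argument the other way.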
 
Some humans can't experience this.
please, I can tell the difference between 143hz and 144hz...I just look at the FPS display on the screen
By the time we are 70 we won't have these kinds of computers anymore, is my prediction. I am 46 now :)
and I, for one, welcome our new cyborg overlords
 
please, I can tell the difference between 143hz and 144hz...I just look at the FPS display on the screen

and I, for one, welcome our new cyborg overlords

Maybe it's like Superman, we just need to expose ourselves to specific spectrums of light, and we gain improved abilities :D


I do find it a little frustrating that there are so few studies on enhancing these functions in otherwise healthy people, and on whether there's still an effect there or whether it's only suited for therapy in those with brain damage/neurodegenerative disorders.
 

Maybe it's like Superman, we just need to expose ourselves to specific spectrums of light, and we gain improved abilities :D

if you ask my wife I've been suffering from selective dementia for years
 
The discussion here is mostly centered around modern display tech, not outdated TV standards. I am perfectly aware of the topic you mentioned, but your previous post was talking about something that
Really? I suggest you read the whole topic, not only what you like to read. I'm just citing some earlier posts

... i used 60hz monitors and tvs ...

You know that you are talking about 50-60 half-picture changes per second when talking about TVs? That means one half of the full picture (all the even lines) is changed while all the odd lines stay. On the next pass, all the odd lines change and all the even ones stay. So you are talking about 25-30 FPS for video. So already those lines you wrote are nonsense.
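For anyone lost in the back-and-forth, the arithmetic in question is just field rate versus full-frame rate for interlaced video; a minimal sketch for illustration:

```python
# Interlaced vs. progressive: fields per second vs. complete frames per second.
def full_frame_rate(refresh_hz: float, interlaced: bool) -> float:
    # Interlaced video draws only half the lines (one "field") per refresh,
    # so two fields are needed for one complete picture.
    return refresh_hz / 2 if interlaced else refresh_hz

print(full_frame_rate(50, interlaced=True))    # PAL/SECAM: 50 fields/s -> 25 full frames/s
print(full_frame_rate(60, interlaced=True))    # NTSC: ~60 fields/s -> ~30 full frames/s
print(full_frame_rate(60, interlaced=False))   # progressive 60 Hz: 60 full frames/s
```

Whether that field/frame distinction matters at all for a modern progressive-scan display is exactly what the rest of this exchange is arguing about.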

…that's just straight up not true. What you're talking about is interlacing, and it is far from common nowadays. It's also not necessarily a function of the display, but of the content. The native output of any reasonably modern display (yes, TVs included) is fully progressive.

Did you read my post? What did I write? About TV systems? It has been true ever since PAL, SECAM and NTSC came along. I can explain why it is that way and how the underlying technology (cathode ray tube = CRT) works. At least in German.

This is evidence that you either don't read the posts fully or you don't know what you are talking about
 
The TOMS (you mean IGN?) article is written as speculation, based on extrapolation of that paper.
I did mean IGN, whoops

But other documentaries I've seen, as well as plenty of empirical, repeatable data prove there is no need to speculate. (eg vision speed testing/temporal resolution testing of USAF top gun trainee selection process, eye-tracking of a rally driver to see how fast he spotted new objects coming into view and how quickly he adjusted steering/throttle to reroute, and Microsoft's research blogs on framerate smoothness and input lag for displays and pen-tablets up to 1000Hz)
For sure, I was just confused because it was stated that the article being referenced was based on dwell and not the actual fluidity that was being discussed, so it caused confusion. Thanks for clearing it up. Though I must admit, starting a thread about one thing and discussing an adjacent topic is very confusing.

I wish OP would have just cited the resources available for actual fluidity.

Thanks for clearing it up!
 
I still do not understand how this can be. They must be NPCs.
I have heard about this too. It must be so nice not to have ideas, imagination or other mostly bad stuff running around your mind all your waking hours and, worse, keeping you awake.

On topic, I can see the difference from 30 to 60 or 60 to 90, but above that it gets less noticeable. I do frame-cap most games to 30 on my laptop so that I get lower temps and noise, 60 for most RTS and RPGs on my PC, and 90 for FPS or flight sims. I rarely play multiplayer, and when I do it's Vermintide 2-type co-op; only there do I go to a 120-140 frame cap.
But to me the most important tech is G-Sync (or other adaptive sync), since tearing is way more noticeable than a lower frame rate.

I did get a CRT a few months ago; it runs at 100 Hz at 1024x768, but I do not see a massive improvement in smoothness compared to my 144 Hz VA G-Sync monitor.
 
CRT monitors always felt smoother to me at 75 Hz, by a lot, compared to modern LCDs at 60 Hz. Then in 2012/2013 I imported my first ever Korean monitor that overclocked to 100 Hz; that is the day gaming changed forever for me. Fast forward to now and I regularly game at 165 Hz at 1440p or 1080p, and I very much prefer even my card games like Magic: The Gathering Arena to run at 165 fps / 165 Hz; single-player games like The Witcher 3 just feel more immersive, etc.

Some humans can't experience this. We didn't really know this until recently, but a study was done. So, I guess this is why some people just don't understand the immersion factor. I can distinguish between 75 Hz and 90 Hz, I can tell between 144 Hz and 180 Hz, and 240 Hz almost looks too "soap opera"-like for me, so I actually prefer around the 165 Hz to 180 Hz range. I wonder if this also differs among those who can experience high refresh rates? Very interesting to think about. I don't think I'd want to own a 540 Hz monitor, for example, but some competitive gamers swear by it. Immersion and competition are different though, so it's something we would need to look into, I suppose.

I cap out at around 180 Hz to 200 Hz fidelity; some professional gamers can do 540 Hz, though it looks bad to me... so there's a lot more for us to learn over time.


Good try, but it's been proven that the human eye can only see up to 29.9 fps, close to 30 fps.
I work as a PC computer seller at Walmart; I think I know 1 or 2 things more than you about this.
 
This is evidence that you either don't read the posts fully or you don't know what you are talking about
Oh, I did read the posts and I most assuredly know what I am talking about. I feel like the problem is on your end with inferring the actual meaning of the discussion, especially considering you admitted that you don’t have a great grasp of English. Bringing up interlacing in a discussion about perceived motion clarity kinda speaks for itself here.
But I have no desire to bicker with you over misunderstandings, so you can rest easy and take a W or whatever you wish.
 
So some good news for us gamers: the reasoning, sensory and cognitive tasks of gaming help keep the brain fit!
Yeah, a good few years ago when I was at my local doctor for a generic checkup, she accidentally pushed a pencil off her table while I was sitting next to it and I grabbed it mid-air, and she's like, well, at least your reflexes are good. 'I was quite out of shape/unhealthy at the time tho :oops:'

What name did you use in UT2K4? :)
I was running a variation of Don/Mr-Gatto for most of my school-years online gaming, though I mainly played on servers from my country with a friend or two.
What I meant by semi-competitive is that I was actively trying to get better at the game and practiced it for hours every day after school, and that was the last time I cared about being good at an FPS/online shooter. I don't even play such games anymore, or if I do it's only vs bots. :laugh: 'yep, I'm a filthy casual nowadays, mainly playing single-player games'

Wow, that's slow!
Count yourself lucky - you can afford to turn up the eye candy and resolution without having to spend silly money on high-refresh displays and overpriced GPUs to feed them :)

Yep, that's probably the best part of it. I've put around 80 hours into Cyberpunk with the settings cranked up even though I averaged around 50+ fps at most, and it wasn't bothering me at all. :)
I also put 800+ hours into Borderlands 3 with the fps being all over the place, because my RX 570 wasn't exactly up to the task of dealing with the end-game madness/builds, since that's a lot heavier to run. 'wasn't really bothered, still had a blast'
I also had to enable the original 30 fps locked mode in Mafia 3 when I was playing that game because it was very unstable on my system at the time if I disabled it. That took me a few hours to get used to, and I finished the game like that.

Btw, when I made the switch from my CRT to my 1680x1050 TN panel 60 Hz monitor in 2008, it was a very smooth transition for me; I wasn't really bothered by the change and actually loved the extra display space.

Oh yeah, my father bought a new phone not long ago 'I picked it for him' and it has 60 and 90 Hz display modes, and when I showed him he was like, what am I supposed to see now?
I asked him if he wanted to keep the 90 Hz mode, while also telling him that it does drain the battery more, and he told me to disable it right away because he can't see any difference and doesn't care. 'tbh I would do the same :oops:'
 
Yeah it's a brain thing not an eye thing.
Exactly, that's why reading the usual mantra "the human eye can't perceive more than..." is kinda tiresome to begin with. Eyes can't see sh*t on their own; they're just beautifully packed clusters of state-of-the-art optical sensors.
Some people don't have an inner monologue either which is incredibly fascinating to me.
Same, but that could explain why some of us just can't survive alone, without being surrounded by a bunch of people at all times, in contrast to introverts like myself.
I have a theory that there are different mind presets we're born equipped with that are immune to education and other external influences like tradition, with master and slave being the major groups, with a lot of varieties within each of those. Some might be offended by my words, but mind you, I'm not a native English speaker and I'm doing my best to explain this. It's an interesting topic nonetheless.
 
Oh, I did read the posts and I most assuredly know what I am talking about. I feel like the problem is on your end with inferring the actual meaning of the discussion, especially considering you admitted that you don’t have a great grasp of English. Bringing up interlacing in a discussion about perceived motion clarity kinda speaks for itself here.
But I have no desire to bicker with you over misunderstandings, so you can rest easy and take a W or whatever you wish.
If you read them, it is only selectively. I'm not willing to waste more of my time pointing it out to you over and over. Enjoy your stay on my ignore list.
 

Maybe it's like Superman, we just need to expose ourselves to specific spectrums of light, and we gain improved abilities :D


I do find it a little frustrating that there are so few studies on enhancing these functions in otherwise healthy people, and on whether there's still an effect there or whether it's only suited for therapy in those with brain damage/neurodegenerative disorders.

This is absolutely fascinating to me, thanks for posting this, and yes, this is what I was hoping would come out of this thread. Creative thinking!

Exactly, that's why reading usual mantra "human eye can't perceive more than..." is kinda tiresome to begin with. Eyes can't see sh*t, they're just beautifully packed clusters of state-of-the-art optical sensors.

No, you are wrong on this one. See the very rare biological females who have tetrachromatic vision; it has to do with their cone cells being much, much different, and the world is so beautiful to them, they can see like 100 million more shades of color than us.

it's about the comprehensive whole though, if we can agree on that at least.

please, I can tell the difference between 143hz and 144hz...I just look at the FPS display on the screen

and I, for one, welcome our new cyborg overlords

username checks out :roll::lovetpu:

Also, I'd be willing to bet younger people can detect more fps than older people, say a teenager vs someone 60+, but the same can be said for hearing, etc. Just the nature of things.

I also agree with this assumption; people like W1zz and some other members here who I "think" are older all seem to be leaning toward the "can't tell a difference above 60 Hz" crowd.

We have so much to learn because it's so varied. I find it fascinating.
 
I think most people can see or feel it if they play on PC with a mouse.
Many start out with a 60 Hz monitor, then they try higher and higher Hz and they notice how much better it is.
Many cannot imagine anything better than what they've got, blissfully ignorant.
I went to 60, then 144 and now 240 Hz, each much better than the last.
And research tells us that only at 1000 Hz do we reach the limit.
 
For CRT, above 75 Hz I can't tell. For OLED (or whatever supports 500 Hz), it is somewhere between 500 and 380 that I lose it. Watching the Alien screen test, 500 and 380 look the same, but 380 is ever so slightly clearer than 240 Hz.

These new "fake" high frame-rate monitors that do blank frame inserts give me a massive headache.
 
No, you are wrong on this one. See the very rare biological females who have tetrachromatic vision; it has to do with their cone cells being much, much different, and the world is so beautiful to them, they can see like 100 million more shades of color than us.

it's about the comprehensive whole though, if we can agree on that at least.
I can partially agree with this. Having more input information provides the possibility of creating a better picture, but it's still on the brain to make something out of it. This could also explain different reactions (or the absence of any) to art, nature and all that's around us (or that we perceive is around us, but let's not go there haha).

Also, that transcranial near-infrared (tNIR) light study just shows how little we know about light radiation, and how vulnerable we are, exposed to all sorts of rays from the environment, equipment, etc.
 
For me, the 60 vs 144/165-OC (1440p) comparison is a night-and-day difference. I'm a fast-paced competitive (not pro) shooter player, and the increased refresh rate definitely makes a difference in visual fluidity/awareness, especially in fast-paced action. Even absent fast-paced action, I've got ants in my pants in those large MP maps... running around, twisting and turning, zig-zag run-ups, essentially moving around non-stop to avoid getting hit. This is where the image fluidity makes a big difference in spotting enemies, or at the least spotting them fast enough to make them a target. I'm also pretty convinced it does gain me some K/D advantage, but I have to admit I haven't really tested these refresh rates with a side-by-side comparison at any given time. I went from 1080p 60, then 75, then 144, and then jumped ship to 1440p 144 (165-OC) and never looked back.

In some titles I can't tell the difference as much between 120/165. At the moment anything above 120 gives me all the butter-smooth experience I'm looking for.

I have played on a 240 Hz display several times but don't own one. Not sure if it really made a difference. I guess this is something where a side-by-side comparison will help determine whether it's a useful investment (for me). For the time being I'm playing some of the more demanding titles with preferences locking in the eye candy at higher quality settings, hence a 240 Hz screen @ 1440p is a little out of whack for my RTX 3080 (presumably 40-series too).
 
I do agree with everything being said here, and I also agree with Turmeric that most people probably can experience high refresh rates, seeing/feeling it to varying degrees. Another thing I thought of while contemplating this: some people read books and can't form images in their mind, while others can create entire worlds when they read a book, get lost in it, and really be transported to that world through vivid imagination in the mind's eye. Those are all varying degrees too, with the far end of the spectrum being people who can't form images in their head at all when they read.

I wonder if this has something to do with it; it would be interesting to try and see if there is any relation between this and high refresh rates. I imagine it's a similar area of the brain that deals with this stuff.
 
Btw, when I made the switch from my CRT to my 1680x1050 TN panel 60 Hz monitor in 2008, it was a very smooth transition for me; I wasn't really bothered by the change and actually loved the extra display space.
In 2008, I was so limited in what LCD monitors I could get. And it was so easy to find used CRT monitors! That made me hesitant right there! So CRTs ruled my room, and I was still using two CRTs in 2015 and 2016.
 