
Some humans can perceive higher refresh rates than others, and I am one of those humans.

180 Hz seems to be about my cap too, but other people, like the guy from the Optimum Tech YouTube channel, talk about how 480 Hz OLED is mind-blowingly smooth. I hope to experience that and see whether 180 Hz really is my physical cap or not. We will see soon enough, because I am not waiting much longer for my OLED upgrade :D
It's primarily smooth because it's OLED. Just stubbornly increasing the refresh rate is a dumb approach to motion clarity anyway; it's a sledgehammer solution. What really needs to happen is for screens to fundamentally leave the sample-and-hold method behind again, not just via crutches like BFI. That was why CRTs were so smooth - both near-instant response times and no persistence blur. OLED still has only one part of that equation.
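To put rough numbers on the sample-and-hold point (the speed and persistence values below are just illustrative assumptions, not measurements): on a full-persistence panel, the blur you see while your eye tracks a moving object is roughly scroll speed multiplied by how long each frame stays lit, which is why short persistence beats brute-force refresh rate increases.

```python
# Back-of-the-envelope sketch (illustrative numbers only): eye-tracking blur on a
# sample-and-hold display is roughly scroll speed * frame persistence time.

def blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate smear width in pixels while the eye tracks a moving object."""
    return speed_px_per_s * persistence_s

speed = 960.0  # px/s, e.g. panning across a 1920 px wide scene in about two seconds

for label, persistence in [
    ("60 Hz, full persistence",   1 / 60),
    ("240 Hz, full persistence",  1 / 240),
    ("480 Hz, full persistence",  1 / 480),
    ("CRT / strobed, ~1.5 ms",    0.0015),
]:
    print(f"{label:26s} -> ~{blur_px(speed, persistence):5.1f} px of smear")
```

Going from 60 Hz to 480 Hz only divides the smear by eight, while a short-persistence (strobed or CRT-like) image gets below that without touching the refresh rate at all - which is the sledgehammer point above.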
 

I would not complain if new CRT-type monitors made a comeback, though I don't think that will happen. I do wish my parents had kept my old CRT monitor; it would be interesting to go back for nostalgia's sake.
 
I can barely see the difference between 30Hz and 60Hz :D
 
It still proves the point, imo. Some people can see some difference among different high refresh rates, but most people can't.
You might think it does, but it kind of doesn't. Let me explain.

What it does prove is that for 90% of people the sensitivity deviation is pretty small - from about 56-60Hz for that sample of people in the trial, and I have little doubt that it's representative of the wider population.

The thing is, that trial wasn't measuring the speed of the entire human visual system, it was simply a strobing light that measured the speed of the opsin cycle - essentially measuring the rate at which enzymes in your eye's photoreceptors rebuild the protein that was broken down when a photon hit it. Since enzyme reactions are pure chemistry that runs at a fixed rate based on molecular physics rather than biological differences between individuals, the relatively tight grouping from 56-60Hz is pretty much expected.
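As a rough sanity check on that (simple arithmetic, using the ~17-21 ms opsin-cycle duration mentioned further down this post), inverting the recovery time lands in the same ballpark as those thresholds:

```python
# If a photoreceptor needs roughly 17-21 ms to recover after a flash, the fastest
# strobe it can resolve as separate flashes is about the reciprocal of that time.
for recovery_ms in (17, 18, 19, 20, 21):
    print(f"{recovery_ms} ms recovery -> ~{1000 / recovery_ms:.0f} Hz flicker limit")
# Prints roughly 48-59 Hz, overlapping the 56-60 Hz grouping reported in the trial.
```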

[Attached chart from the trial]

So a strobing test captures the "refresh rate" of a chemical reaction that is near-identical for everyone - minor differences might exist based on variances in cell composition person-to-person, but what's being tested is the recovery time from flashes.

In a non-flashing image, such as a gaming monitor, the way the retina works means the receptors aren't synchronously 'blinded' by a flash for the fixed 17-21 ms duration of the opsin cycle. The discs containing the rhodopsin hold huge numbers (thousands? millions?) of proteins that get hit by photons at a roughly constant rate determined by the brightness of whatever is focused on that part of the retina. More importantly, that constant stream of photons, rather than a synchronised flash, lets all of the opsin cycles drift out of sync, so at no point are all the photoreceptors in your retina blinded at once for ~20 ms. When your retina is processing a non-flashing image and the opsin cycles are allowed to get out of sync, the response time of your eye at a chemical level is effectively zero again. Then the real part of human vision kicks in: the neural processing done in the visual cortex. This is where differences in refresh rate sensitivity lie, and unfortunately the visual cortex is poorly understood by current medicine and science.
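If it helps, here's a toy simulation of that idea (not a physiological model - the receptor count, photon rate and 20 ms recovery time are assumptions picked purely for illustration) showing why a synchronized flash blinds the whole population at once while steady light only ever catches a fraction of receptors mid-recovery:

```python
# Toy model only: compare a synchronized strobe (every receptor starts its ~20 ms
# recovery at the same instant) with steady light (each receptor's most recent
# photon hit happened at a random, independent moment).
import random

random.seed(1)
N = 100_000          # assumed receptor count, for illustration
RECOVERY_S = 0.020   # assumed ~20 ms opsin-cycle recovery time
PHOTON_RATE = 20.0   # assumed photon hits per receptor per second (arbitrary)

# Steady light: with Poisson-like photon arrivals, the time since each receptor's
# last hit is roughly exponentially distributed and independent across receptors.
time_since_last_hit = [random.expovariate(PHOTON_RATE) for _ in range(N)]
blind_now = sum(t < RECOVERY_S for t in time_since_last_hit) / N
print(f"Steady light: ~{blind_now:.0%} of receptors are mid-recovery at any instant.")

# Synchronized flash: every receptor is hit together, so the whole population is
# mid-recovery for the next ~20 ms, then sensitive again, in lockstep.
print("Synchronized flash: 100% of receptors are mid-recovery for ~20 ms after it.")
```

The exact fraction depends entirely on the assumed photon rate; the only point is that under steady light the cycles spread out instead of gating the whole retina at once.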

I don't know how much scientific knowledge you have of the human visual system, but honestly Wikipedia is really solid on this subject, and well worth a deeper dive if you're interested:

 
I might have a look at the deeper literature. Thanks for the links!
 
About that journal... it actually proves the opposite of what you said in post #264. According to this diagram, the perceived proportion of flashes drops significantly among the test subjects above 57-ish Hz:
View attachment 367407
Yup, I remember TV cameras would capture the refresh flicker on the meteorological screens on news networks.

Weren't plasma displays a solution to that back then?
 
Yes, you don't have a single protein.

Maybe a single protein can do ~60 Hz, but your brain interprets the results of parallel information, so yes, the human eye can see higher than 60 FPS.

Wikipedia is not a trustworthy source, as any university lecturer will tell you.

There's a certain insane pride from people who state "I can't tell the difference between 60 and 120 Hz"... good for you I guess? You have a slow visual cortex?

It's almost like they want to convince themselves they are factually, scientifically correct, so they can poke fun at people who don't want to limit themselves to 60 FPS.
 
The human eye continually processes a stream of visual info which is transmitted by nerve conduction via the optic system. The brain then decodes it.

Notably, much of this thread is a pile of silly guff, because a lot of folk are fighting over something that isn't relevant.

We can't see frames per second because our eyes don't work that way. What we notice is judder or flicker when the stream is inconsistently relayed (lag), or when the monitor's refresh rate doesn't match the GPU output (tearing). If you pan across a low-Hz display, you'll notice smearing; it becomes less pronounced at higher Hz.
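A quick sketch of the mismatch part (just the cadence arithmetic, with vsync assumed and no VRR): when the frame rate doesn't divide the refresh rate evenly, frames are held for an uneven number of refreshes, and that unevenness is the judder you notice.

```python
# Sketch of frame cadence on a fixed-refresh display (vsync on, no VRR): each
# source frame is shown for however many refreshes elapse before the next frame
# is due, so a mismatched rate produces an uneven, judder-causing cadence.
import math

def refreshes_per_frame(fps: float, hz: float, frames: int = 12):
    """How many display refreshes each of the first `frames` source frames occupies."""
    starts = [math.floor(i * hz / fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(starts, starts[1:])]

print("24 fps on 60 Hz :", refreshes_per_frame(24, 60))   # alternates 2 and 3 -> judder
print("30 fps on 60 Hz :", refreshes_per_frame(30, 60))   # steady 2s -> even pacing
print("24 fps on 120 Hz:", refreshes_per_frame(24, 120))  # steady 5s -> even pacing
```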

Point being, we all detect differing variance, but nobody is seeing the actual frames per second. They're seeing the flicker produced by suboptimal rendering (which happens at lower refresh rates).

We all see it differently.
 

Yep, and case in point: if you watch gameplay in a 60 Hz YouTube video, it looks smooth as butter, like 120 Hz gameplay in real time. I think what you just described also explains that phenomenon.
 
I've seen people say that they experience motion sickness at 30 Hz. Thankfully I haven't had the displeasure, even when struggling to play games at sub-30 FPS on a machine that didn't take well to gaming.

I've spent a good chunk of years gaming on 60 Hz displays at 30-60 fps. While the graphics weren't exactly smooth or fluid, I never really had issues with motion sickness or anything like that, except during those wild, long gaming sessions that bring eye strain and headaches at any resolution or refresh rate. After switching to a higher refresh rate panel, I've noticed that when going back to lower refresh rate screens (60 Hz), the headaches and eye strain hit much harder than before. Seems like once you go high refresh, you can't go back without paying the price.

I don't believe anyone is suggesting they can see the discrete frames on higher refresh rate displays. The general perception amongst refresh-rate-sensitive users is that some can see the difference in smoothness or fluidity, particularly in fast-paced content.
 
You failed to comprehend my answer.

You have billions of proteins, but bright strobing at the right frequency ranges will cause all those billions of individual opsin cycles to sync up, effectively blinding the part of your retina that dealt with the flash as if it were a single protein.

As for Wikipedia, it certainly can't be trusted for topics where minutiae are critical, where editing parties have an agenda, or where the content of a page is a matter of opinion, but in this case the two pages I linked mostly present well-regarded scientific facts, citing almost a hundred scientific journals and publications, including those from several world-renowned national laboratories and institutes. Dismissing those sources and the people who referenced them is madness. You might as well go full tin-foil-hat conspiracy nutjob if you can't trust those sources...

Sweeping statements like "Wikipedia is not a trustworthy source" are as wrong as most sweeping statements tend to be, and if you are going to dismiss Wikipedia, you might as well dismiss every encyclopedia ever written, too - since Wikipedia cites all of them, and more. A braindead comment like "Wikipedia is not a trustworthy source" is about as useful as "some sites on the internet are dangerous". That doesn't make the whole internet dangerous - it's only dangerous if you're ignorant of the dangers.
 