
Only some humans can see refresh rates faster than others; I am one of those humans.

Joined
Jan 14, 2019
Messages
12,690 (5.83/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
This is only relevant to strobing lights, something that disappeared with CRTs and reappeared only at 120 Hz+, when strobing was reintroduced to combat sample-and-hold blur on LCDs a good 10+ years after LCD panels replaced CRTs for the masses.
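For anyone curious how big sample-and-hold blur actually is, it's easy to put rough numbers on: blur width is just pixel persistence times on-screen motion speed. A quick back-of-envelope sketch in Python (the pan speed and duty cycle are illustrative, not measurements of any real panel):

```python
# Back-of-envelope: motion blur on a sample-and-hold display.
# Blur width (px) ~= pixel persistence (s) x on-screen motion speed (px/s).

def blur_px(refresh_hz, speed_px_per_s, duty=1.0):
    """duty < 1.0 models strobing/BFI, where the pixel is lit for only
    a fraction of each refresh interval."""
    persistence_s = duty / refresh_hz
    return speed_px_per_s * persistence_s

speed = 1920  # panning one screen-width per second
for hz in (60, 120, 240, 480):
    print(f"{hz:>3} Hz sample-and-hold: {blur_px(hz, speed):5.1f} px of blur")

# strobing: 60 Hz at a 10% duty cycle rivals very high refresh sample-and-hold
print(f" 60 Hz @ 10% duty:      {blur_px(60, speed, duty=0.1):5.1f} px of blur")
```

Halving persistence halves the blur, which is why a low-duty strobe can beat even very high refresh rates at motion clarity.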
It still proves the point, imo. Some people can see some difference among different high refresh rates, but most people can't.
 
Joined
Nov 27, 2023
Messages
2,549 (6.39/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
180 Hz seems to be about my cap too, but other people, like the Optimum Tech YouTube guy, talk about how mind-blowingly smooth 480 Hz OLED is. I hope to experience that and see whether 180 Hz really is my physical cap or not. We'll see soon enough, because I'm not waiting much longer for my OLED upgrade :D
It’s primarily smooth because it’s OLED. Just stubbornly increasing the refresh rate is a dumb approach to motion clarity anyway - a sledgehammer solution. What really needs to happen is for screens to once again leave the sample-and-hold method behind fundamentally, not just via crutches like BFI. That’s why CRTs were so smooth - both near-instant response times and no persistence blur. OLED still has only one part of that equation.
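To put numbers on the "sledgehammer" point: on a sample-and-hold panel, persistence equals the full refresh interval, so brute-forcing your way down to CRT-like (~1 ms) persistence takes something on the order of a 1000 Hz panel. A quick sketch (my own illustrative arithmetic, not from the post):

```python
# On a sample-and-hold panel, persistence = one full refresh interval,
# so the refresh rate needed to brute-force a persistence target is its inverse.

def required_hz(target_persistence_ms):
    return 1000.0 / target_persistence_ms

# ~1 ms is in the rough ballpark of CRT phosphor decay
for p_ms in (16.7, 4.0, 2.0, 1.0):
    print(f"{p_ms:4.1f} ms persistence -> ~{required_hz(p_ms):6.0f} Hz sample-and-hold")
```

Strobing/BFI reaches the same low persistence without the extreme refresh rate, at the cost of brightness and potential flicker, which is why it gets called a crutch here.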
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,444 (4.68/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
It’s primarily smooth because it’s OLED. Just stubbornly increasing the refresh rate is a dumb approach to motion clarity anyway - a sledgehammer solution. What really needs to happen is for screens to once again leave the sample-and-hold method behind fundamentally, not just via crutches like BFI. That’s why CRTs were so smooth - both near-instant response times and no persistence blur. OLED still has only one part of that equation.

I wouldn't complain if new CRT-type monitors made a comeback, though I don't think that will happen. I wish my parents had kept my old CRT monitor; it would be interesting to go back for nostalgia's sake.
 
Joined
Apr 8, 2010
Messages
1,012 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
I can barely see the difference between 30Hz and 60Hz :D
 
Joined
Feb 20, 2019
Messages
8,370 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
It still proves the point, imo. Some people can see some difference among different high refresh rates, but most people can't.
You might think it does, but it kind of doesn't. Let me explain.

What it does prove is that for 90% of people the sensitivity deviation is pretty small - from about 56-60Hz for that sample of people in the trial, and I have little doubt that it's representative of the wider population.

The thing is, that trial wasn't measuring the speed of the entire human visual system, it was simply a strobing light that measured the speed of the opsin cycle - essentially measuring the rate at which enzymes in your eye's photoreceptors rebuild the protein that was broken down when a photon hit it. Since enzyme reactions are pure chemistry that runs at a fixed rate based on molecular physics rather than biological differences between individuals, the relatively tight grouping from 56-60Hz is pretty much expected.


So a strobing test captures the "refresh rate" of a chemical reaction that is near-identical for everyone - minor differences might exist based on variances in cell composition person-to-person, but what's being tested is the recovery time from flashes.

In a non-flashing image, such as a gaming monitor, the way the retina works means the receptors aren't all synchronously 'blinded' by a flash for the fixed 17-21ms duration of the opsin cycle. The discs containing the rhodopsin hold huge numbers (thousands? millions?) of proteins, each hit by photons at a roughly constant rate determined by the brightness of whatever is focused on that part of the retina. More importantly, that constant stream of photons, rather than a synchronised flash, lets the individual opsin cycles drift out of sync, so at no point is every photoreceptor in your retina blinded for that ~20ms window. Once the cycles are out of sync, the effective response time of your eye at a chemical level drops to near zero. Then the real part of human vision kicks in: the neural processing done in the visual cortex. That's where the differences in refresh rate sensitivity lie, and unfortunately the visual cortex is poorly understood by current medicine and science.

I don't know how much scientific knowledge you have of the human visual system, but honestly Wikipedia is really solid on this subject, and well worth a deeper dive if you're interested:

and
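The desynchronisation argument above can be sketched as a toy simulation: a strobe puts every receptor's recovery window in phase, while steady light spreads the phases out so the retina is never fully 'blind' at once. Entirely illustrative numbers here - the 20 ms window and receptor count are placeholders, not physiology:

```python
import random

# Toy model: a bright strobe synchronises every photoreceptor's recovery
# window; steady light lets the individual cycles drift apart.
RECOVERY_MS = 20.0   # placeholder for the ~17-21 ms opsin cycle
N = 100_000          # placeholder receptor count

def fraction_recovering(t_ms, phases):
    """Fraction of receptors still inside their recovery window at time t."""
    return sum(1 for p in phases if p <= t_ms < p + RECOVERY_MS) / len(phases)

random.seed(42)
synced = [0.0] * N                                            # one flash at t=0
drifted = [random.uniform(0, RECOVERY_MS) for _ in range(N)]  # steady light

t = 10.0  # sample the retina mid-recovery
print(f"strobe: {fraction_recovering(t, synced):.2f} of receptors mid-cycle")
print(f"steady: {fraction_recovering(t, drifted):.2f} of receptors mid-cycle")
```

After the flash, 100% of the modelled receptors are mid-cycle at once; under steady light only about half are at any instant, and they're staggered, so there's never a moment when the whole retina is blind.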
 
Last edited:

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,444 (4.68/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
I might have a look at the deeper literature, thanks for the links!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,781 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
About that journal... it actually proves the opposite of what you said in post #264. According to this diagram, the perceived proportion of flashes drops significantly among the test subjects above 57-ish Hz:
View attachment 367407
Yup, I remember on TV the cameras would capture the refresh flicker on meteorological screens on news networks.

It still proves the point, imo. Some people can see some difference among different high refresh rates, but most people can't.

It’s primarily smooth because it’s OLED. Just stubbornly increasing the refresh rate is a dumb approach to motion clarity anyway - a sledgehammer solution. What really needs to happen is for screens to once again leave the sample-and-hold method behind fundamentally, not just via crutches like BFI. That’s why CRTs were so smooth - both near-instant response times and no persistence blur. OLED still has only one part of that equation.
Weren't plasma displays a solution back then?
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,125 (2.00/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape, Razer Atlas
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
You might think it does, but it kind of doesn't. Let me explain.

What it does prove is that for 90% of people the sensitivity deviation is pretty small - from about 56-60Hz for that sample of people in the trial, and I have little doubt that it's representative of the wider population.

The thing is, that trial wasn't measuring the speed of the entire human visual system, it was simply a strobing light that measured the speed of the opsin cycle - essentially measuring the rate at which enzymes in your eye's photoreceptors rebuild the protein that was broken down when a photon hit it. Since enzyme reactions are pure chemistry that runs at a fixed rate based on molecular physics rather than biological differences between individuals, the relatively tight grouping from 56-60Hz is pretty much expected.

View attachment 367420

So a strobing test captures the "refresh rate" of a chemical reaction that is near-identical for everyone - minor differences might exist based on variances in cell composition person-to-person, but what's being tested is the recovery time from flashes.

In a non-flashing image, such as a gaming monitor, the way the retina works means the receptors aren't all synchronously 'blinded' by a flash for the fixed 17-21ms duration of the opsin cycle. The discs containing the rhodopsin hold huge numbers (thousands? millions?) of proteins, each hit by photons at a roughly constant rate determined by the brightness of whatever is focused on that part of the retina. More importantly, that constant stream of photons, rather than a synchronised flash, lets the individual opsin cycles drift out of sync, so at no point is every photoreceptor in your retina blinded for that ~20ms window. Once the cycles are out of sync, the effective response time of your eye at a chemical level drops to near zero. Then the real part of human vision kicks in: the neural processing done in the visual cortex. That's where the differences in refresh rate sensitivity lie, and unfortunately the visual cortex is poorly understood by current medicine and science.

I don't know how much scientific knowledge you have of the human visual system, but honestly Wikipedia is really solid on this subject, and well worth a deeper dive if you're interested:

and
Yes, you don't have a single protein.

Maybe a single protein can do ~60 Hz, but your brain interprets the results of parallel information, so no, the "human eye can't see higher than 60 FPS" claim doesn't hold.

Wikipedia is not a trustworthy source, as any university lecturer will tell you.

There's a certain insane pride from people who state "I can't tell the difference between 60 and 120 Hz"... good for you I guess? You have a slow visual cortex?

It's almost like they want to convince themselves they are factually, scientifically correct, so they can poke fun at people who don't want to limit themselves to 60 FPS.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,131 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
The human eye continually processes a stream of visual info which is transmitted by nerve conduction via the optic system. The brain then decodes it.

Notably, much of this thread is a pile of silly guff, because a lot of folk are fighting over something that isn't relevant.

We can't see frames per second because our eyes don't work that way. What we notice is judder or flicker when the stream is inconsistently relayed (lag) or the monitor refresh rate isn't matching the GPU output (tearing). If you pan across a low-Hz display, you'll notice smearing; it lessens at higher Hz.

Point being, we all detect differing variance, but nobody is seeing the actual frames per second. They're seeing the flicker produced by sub-optimal rendering (which happens at lower refresh rates).

We all see it differently.
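The "refresh rate isn't matching the GPU output" point can be made concrete with a toy vsync model: when the frame rate doesn't divide the refresh rate evenly, frames get held on screen for an uneven number of refreshes, which is the judder people notice. A small sketch (my own illustration, assuming simple vsync with no dropped frames):

```python
from collections import Counter

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ  # one refresh every ~16.7 ms

def hold_counts(fps, duration_ms=1000.0):
    """With vsync, each refresh shows the newest fully rendered frame.
    Returns a map of frame number -> refreshes it stayed on screen."""
    frame_interval = 1000 / fps
    shown = []
    frame = 0
    for r in range(round(duration_ms / SCANOUT_MS)):
        t = (r + 1) * SCANOUT_MS
        # advance to the newest frame finished rendering by this refresh
        while (frame + 1) * frame_interval <= t:
            frame += 1
        shown.append(frame)
    return Counter(shown)

# 30 fps into 60 Hz: frames held a uniform 2 refreshes -> even cadence (edges aside)
# 24 fps into 60 Hz: holds alternate 2 and 3 refreshes -> uneven 3:2 judder
print(sorted(set(hold_counts(30).values())))
print(sorted(set(hold_counts(24).values())))
```

30 fps into 60 Hz gives a uniform two-refresh hold per frame, while 24 fps alternates two- and three-refresh holds - the classic 3:2 cadence judder.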
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,444 (4.68/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
The human eye continually processes a stream of visual info which is transmitted by nerve conduction via the optic system. The brain then decodes it.

Notably, much of this thread is a pile of silly guff, because a lot of folk are fighting over something that isn't relevant.

We can't see frames per second because our eyes don't work that way. What we notice is judder or flicker when the stream is inconsistently relayed (lag) or the monitor refresh rate isn't matching the GPU output (tearing). If you pan across a low-Hz display, you'll notice smearing; it lessens at higher Hz.

Point being, we all detect differing variance, but nobody is seeing the actual frames per second. They're seeing the flicker produced by sub-optimal rendering (which happens at lower refresh rates).

We all see it differently.

Yep, and case in point: if you watch gameplay in a 60 FPS YouTube video, it looks smooth as butter, like 120 Hz gameplay in real time. I think what you just described also explains that phenomenon.
 
Joined
Jan 20, 2019
Messages
1,596 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I've seen people say that they experience motion sickness at 30 Hz. Thankfully I haven't had that displeasure, even when struggling to play games at sub-30 FPS on a machine that didn't take well to gaming.

I've spent a good chunk of years gaming on 60 Hz displays at 30-60 FPS. While the graphics weren't exactly smooth or fluid, I never really had issues with motion sickness or anything like that, except during those wild, looong gaming sessions that are no stranger to eye strain and headaches at any resolution/refresh rate. After switching to a higher refresh rate panel, I've noticed that when going back to lower refresh rate screens (60 Hz), the headaches and eye strain hit much harder than before. Seems like once you go high-refresh, you can't go back without paying the price.

The human eye continually processes a stream of visual info which is transmitted by nerve conduction via the optic system. The brain then decodes it.

Notably, much of this thread is a pile of silly guff, because a lot of folk are fighting over something that isn't relevant.

We can't see frames per second because our eyes don't work that way. What we notice is judder or flicker when the stream is inconsistently relayed (lag) or the monitor refresh rate isn't matching the GPU output (tearing). If you pan across a low-Hz display, you'll notice smearing; it lessens at higher Hz.

Point being, we all detect differing variance, but nobody is seeing the actual frames per second. They're seeing the flicker produced by sub-optimal rendering (which happens at lower refresh rates).

We all see it differently.

I don't believe anyone is suggesting they can see discrete frames on higher refresh rate displays. The general perception amongst refresh-rate-sensitive users is that some can see the difference in smoothness or fluidity, particularly in fast-paced content.
 
Joined
Feb 20, 2019
Messages
8,370 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Yes, you don't have a single protein.

Maybe a single protein can do ~60 Hz, but your brain interprets the results of parallel information, so no, the "human eye can't see higher than 60 FPS" claim doesn't hold.

Wikipedia is not a trustworthy source, as any university lecturer will tell you.

There's a certain insane pride from people who state "I can't tell the difference between 60 and 120 Hz"... good for you I guess? You have a slow visual cortex?

It's almost like they want to convince themselves they are factually, scientifically correct, so they can poke fun at people who don't want to limit themselves to 60 FPS.
You failed to comprehend my answer.

You have billions of proteins, but bright strobing at the right frequency ranges will cause all those billions of individual opsin cycles to sync up, effectively blinding the part of your retina that dealt with the flash as if it were a single protein.

As for Wikipedia, it certainly can't be trusted for topics where minutiae are critical, where editing parties have an agenda, or where a page's content is a matter of opinion, but in this case the two pages I link are mostly well-regarded scientific fact, citing almost a hundred journals and publications, including several from world-renowned national laboratories and institutes. Dismissing those sources and the people who referenced them is madness. You might as well go full tin-foil-hat conspiracy nutjob if you can't trust those sources...

Sweeping statements like "Wikipedia is not a trustworthy source" are as wrong as most sweeping statements tend to be. If you're going to dismiss Wikipedia, you might as well dismiss every encyclopedia ever written too, since Wikipedia cites all of them and more. A braindead comment like "Wikipedia is not a trustworthy source" is about as useful as "some sites on the internet are dangerous". That doesn't make the whole internet dangerous - it's only dangerous if you're ignorant of the dangers.
 