
Only some humans can see refresh rates faster than others, and I am one of those humans.

Joined
Jun 22, 2019
Messages
155 (0.09/day)
Processor Ryzen 7 5800x @ stock
Motherboard B550M Mortar WiFi
Cooling Thermalright assassin 120 se
Memory DDR4 G.skill 32gb @ 3600mhz
Video Card(s) RTX 2070
Storage 3x Crucial MX500 1tb SSDs
Display(s) Acer nitro XV272U 1440p 170hz
Case Deepcool M370
Power Supply Thermaltake GX2 600w
I can't see any difference past 120Hz personally. I never saw the appeal of 240Hz, 360Hz, etc. It's all marketing BS to me, like those Alienware 360Hz panels I see on display. Who would ever need all that anyway?
 
Low quality post by gurusmi
Joined
May 24, 2023
Messages
655 (1.81/day)
Location
127.0.0.1, ::1
System Name Naboo (2019)
Processor AMD 3800x
Motherboard Gigabyte Aorus Master V1 (X470)
Cooling individual EKWB/Heatkiller loop
Memory 4*8 GB 3600 Corsair Vengeance
Video Card(s) Sapphire Pulse 5700XT
Storage SSD 1TB PCIe 4.0x4, 2 TB PCIe 3.0
Display(s) 2*WQHD
Case Lian Li O11 Rog
Audio Device(s) Hifiman, Topping DAC/KHV
Power Supply Seasonic 850W Gold
Mouse Logitech MX2, Logitech MX Ergo Trackball
Keyboard Cherry Stream Wireless, Logitech MX Keys
Software Linux Mint "Vera" Cinnamon
... I can clearly see the difference between 60hz and ~144hz...

I got you. You can also see in the darkest dark and clearly hear above 100 kHz. Placebo effect? Influenced by marketing?
 
Joined
Apr 24, 2022
Messages
168 (0.22/day)
System Name lawooder
Processor 5800x3D
Motherboard B550M Aorus Elite
Cooling Arctic Freezer 34 DUO
Memory 32GB 4000Mhz CL16 (2x16GB)
Video Card(s) RTX 3070 Ti
Storage SN770 250GB/SN750 500GB/SA510 1TB
Display(s) VG279QM 1080P 280hz + VG27AQ 1440P 165HZ
Case MSI MAG Something
Audio Device(s) fiiO EK10 + HD 599
Power Supply Focus GX 750 Watts
Mouse XM1r
Keyboard Xtrfy K4
I own a 1440p 144Hz and a 1080p 280Hz display.

In normal AAA single-player games it's really hard to justify using the 1080p display, even for stuff like MMOs. I always use the 1440p one because the motion clarity is good enough, and I don't run these types of games at over 144fps; usually I'm pretty happy with 120fps. I'm just chilling, don't even care.

However, for first-person shooters, whether single-player and especially multiplayer, I need to use the 1080p 280Hz display because I know it's there. The difference is huge: if I swap to the 1440p 144Hz monitor, everything moves with motion blur (for lack of a better description) and some added input lag.

I think I notice this because I have the monitors side by side, and in fast twitch shooters the difference is just too massive to ignore. It's like playing with blur on vs. off. Honestly, I really tried to make the 1440p 144Hz my main for everything, but I just can't: I get motion sickness after a few hours and everything blurs together. It's just a nightmare for me personally; on the 280Hz display it's like playing with weights off.
 
Joined
Feb 20, 2019
Messages
7,447 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
But they can't. Science has proven this.
Honestly, that's not science, that's BS pseudoscience that fails to hold up to evidence.

Not only can I see the difference between 120 and 240Hz, I would give myself a 9/10 success rate of accurately distinguishing between 75Hz and 85Hz, two common CRT refresh rate values that are much closer to each other than the blindingly-obvious contrast of 120 and 240Hz, yet also comfortably beyond what the BS pseudoscience claims is possible.

Seeing >60Hz is not special. Based on the gamers I know, I'd guess that 30-50% of people can appreciate higher framerates.

60FPS human cap crew strikes back. I guess it was inevitable. :laugh:
I'd love to watch the 60FPS human cap crew fight off the arguments from the 24FPS human cap crew. "If I can't see it, then it can't possibly be true!"

When the first 360Hz display came out, I called BS. But then I recalled watching The Hobbit at 48fps in the cinema, where a couple of people in the group said they didn't get all the fuss about 48fps because it looked the same as normal movies to them. That was the point at which I realised there is clearly an order-of-magnitude range in human temporal resolution.

Just because I can't take advantage of a 360Hz display doesn't mean that it's pointless. My eyes/brain are unable to resolve anything shorter than ~1/200th of a second, but clearly there are some people who can. I believe their claims simply because I've made the same argument to others myself. I would expect that if I met them and was allowed to set a 360Hz display to either 360Hz or something lower like 240Hz, they would identify it correctly every time. I can't see it, but that doesn't mean it's not there - I just know my own vision has its limits.
 
Last edited:
Joined
Sep 1, 2022
Messages
472 (0.75/day)
System Name Firestarter
Processor 7950X
Motherboard X670E Steel Legend
Cooling LF 2 420
Memory 4x16 G.Skill X5 6000@CL36
Video Card(s) RTX Gigabutt 4090 Gaming OC
Storage OS: 2TB P41 Plat, 2TB SN770, 1TB SN770
Display(s) FO48U, some dinky TN 10.1 inch display.
Case Fractal Torrent
Audio Device(s) PC38X
Power Supply GF3 TT Premium 850W
Mouse Razer Basilisk V3 Pro
Keyboard OG Razer Black Widow
A lot of people are very sensitive to quick motion, especially within their peripheral vision
That's a yes for me. I haven't gone above 144fps yet, but I can tell 90 from 120 easily. Although my eyes always feel bad for some reason. Not sure.
 
Joined
Apr 24, 2022
Messages
168 (0.22/day)
System Name lawooder
Processor 5800x3D
Motherboard B550M Aorus Elite
Cooling Arctic Freezer 34 DUO
Memory 32GB 4000Mhz CL16 (2x16GB)
Video Card(s) RTX 3070 Ti
Storage SN770 250GB/SN750 500GB/SA510 1TB
Display(s) VG279QM 1080P 280hz + VG27AQ 1440P 165HZ
Case MSI MAG Something
Audio Device(s) fiiO EK10 + HD 599
Power Supply Focus GX 750 Watts
Mouse XM1r
Keyboard Xtrfy K4
People who perpetuate the notion that you can't perceive above 60Hz, 144Hz, etc. are most certainly trolls. Or people with medical issues.
There is no in-between.

This is not really a discussion that warrants 6 pages. And I'm using the term "discussion" very lightly.

C'mon, man, just put a 144Hz and a 240Hz display side by side and play an FPS: you can't see the difference? It's a pointless discussion because it's a you problem. 99% of healthy humans will perceive the difference because the difference in motion clarity is freaking huge...

People who parrot this nonsense I label as trolls.

But if you actually have medical problems, that's another discussion, and you are better off visiting a doctor rather than parroting nonsense on an online forum.
 
Joined
Oct 1, 2014
Messages
1,870 (0.53/day)
Location
Calabash, NC
System Name The Captain (2.0)
Processor Ryzen 7 7700X
Motherboard Gigabyte X670E AORUS Master
Cooling 280mm Arctic Liquid Freezer II, 4x Be Quiet! 140mm Silent Wings 4 (1x exhaust 3x intake)
Memory 32GB (2x16) G.Skill Trident Z5 Neo (6000Mhz)
Video Card(s) MSI GeForce RTX 3070 SUPRIM X
Storage 1x Crucial MX500 500GB SSD; 1x Crucial MX500 500GB M.2 SSD; 1x WD Blue HDD, 1x Crucial P5 Plus
Display(s) Aorus CV27F 27" 1080p 165Hz
Case Phanteks Evolv X (Anthracite Gray)
Power Supply Corsair RMx (2021) 1000W 80-Plus Gold
Mouse Varies based on mood/task; is currently Razer Basilisk V3 Pro or Razer Cobra Pro
Keyboard Varies based on mood; currently Razer Blackwidow V4 75% and Hyper X Alloy 65
Honestly, I couldn't care less about being able to "see" the difference between 70Hz and 100Hz. All I care about is being able to stay at or slightly above 60FPS, depending on the game.
 
Joined
Feb 20, 2019
Messages
7,447 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Honestly, I couldn't care less about being able to "see" the difference between 70Hz and 100Hz. All I care about is being able to stay at or slightly above 60FPS, depending on the game.
I think 60fps is used as a threshold for gaming smoothness by reviewers because it's the point at which an unscientific majority of people perceive fluid motion instead of a stuttery sequence of static images. The framerate needed for that is lower for film because camera exposures are much longer than that and induce motion blur. Try gaming at 30fps and it's horrible. Try gaming at 30fps with motion blur, and it's significantly better (but still horrible!)

That 60fps point - the distinction between individual frames and fluidity - is NOT the limit of what people can perceive. Based on the only experience I 100% trust (my own) I would extrapolate that the FPS threshold you can identify is very roughly double the threshold you perceive as "fluid".

I always like this test on a high-refresh display. Set it to 6 UFOs, and because each row runs twice as fast as the one below it, there's always a clear, unambiguous point where the framerate is fluid, yet there are clearly improvements in motion quality beyond that point. Sadly, the only two relevant samples for me are 90Hz vs 180Hz and 120Hz vs 240Hz, since I have never seen a display faster than 240Hz and the next fastest thing is my 180Hz laptop.
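For the curious, a minimal sketch of how those row rates divide down (assuming, purely for illustration, a hypothetical 240Hz panel and that each of the 6 rows runs at half the rate of the one above it, as described above):

```python
# Illustrative only: per-row frame rates and frame times on a UFO-style test,
# assuming a hypothetical 240 Hz panel and each of 6 rows running at half the
# rate of the one above it.
MAX_HZ = 240

for row in range(6):
    rate_fps = MAX_HZ / (2 ** row)       # 240, 120, 60, 30, 15, 7.5
    frame_time_ms = 1000.0 / rate_fps    # how long each frame stays on screen
    print(f"Row {row + 1}: {rate_fps:g} fps ({frame_time_ms:.1f} ms per frame)")
```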
 
Joined
Dec 25, 2020
Messages
4,832 (3.89/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Razer DeathAdder Essential Mercury White
Keyboard Redragon Shiva Lunar White
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
One more medical wonder trying to tell everybody he can see 100, 200 or 500fps. It is like a thread where someone tries to claim that their newly bought, unmodified Malibu can do 500 mph at Road Atlanta...

It's not a matter of seeing or not seeing. You need adequate hardware to even *begin* considering the benefits of high refresh rates. The fact that even low-cost phones run 90 Hz displays now and most high-end handsets (as well as most televisions) use 120 Hz tells you that there are tangible, real benefits to it. At 60 fps, a single frame takes 16.7 milliseconds to be drawn and displayed. At 120, this is reduced to 8.3 ms; at 240 we're already talking about roughly 4.2 ms.
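The frame-time figures above are just the reciprocal of the refresh rate; a quick sketch for a few common rates:

```python
# Frame time in milliseconds is simply 1000 / refresh rate.
for hz in (60, 90, 120, 144, 240, 360):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms, 360 Hz -> 2.78 ms
```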

16 ms is long enough a time allotment that you can clearly see it, if not feel it through your reflexes, if you had one black background in the midst of 59 other white backgrounds. You'd at least have a fuzzy feeling about it. At 8.3 ms, however, the story can change for most people - but that's where the other benefits of high refresh rates kick in. Massively reduced render and presentation latencies improve response times, making everything feel snappier, almost instant, and it's most visible in competitive gaming scenarios. You'd be surprised how large an allotment of time 500 ms can be under certain circumstances. Not only that, but the perceived motion rate will be substantially improved. Try this website and see it for yourself. Even on a dreadfully slow, blurry and problematic 60 Hz display you can tell:

 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,795 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I've always been sensitive to refresh rate too; I think up to around 200Hz I could pick refresh rates in 10-20Hz increments.

It's fascinating too that different monitor technologies handle them better or worse, and to some degree provide a 'true XXhz experience' or not.

For example, the OLED at 120Hz I use now feels so smooth and natural, easily as good as the VAs I've owned with meh response times at 144Hz. And using a CRT for the retro rig, 1024x768 @ 85Hz feels incredibly fluid, and even 1280x960 @ 70Hz is a marked step up over 60Hz, but 60Hz on a CRT can feel a little flickery.

To the people who can't tell, or are perhaps not gamers/are indifferent, I suggest trying 120+ Hz for an extended period, then going back to 60Hz. Coming down from high refresh is IMO where people click to the difference, more so than going up.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
7,668 (3.70/day)
Location
Winnipeg, Canada
Processor AMD R9 5900X
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Aqua Elite 360 V3 1x TL-B12, 2x TL-C12 Pro, 2x TL K12
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, Asus Hyper M.2, 2x SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact RGB
Audio Device(s) JBL 2.1 Deep Bass
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I am only playing at 60fps because my screen is only 60Hz. It is fluid af though. I dunno... looks awesome to me.

I can't argue because this is all I know lol. I could see the diff between 75 and 85Hz on CRT; it was quite obvious.
 
Joined
Feb 20, 2019
Messages
7,447 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
16 ms is long enough a time allotment that you can clearly see it, if not feel it through your reflexes,
A great example of this is lip-sync.
Just about anyone can tell the movie audio is out of sync with the video if it's out by even 25ms. That's not even a single full frame at the standard 24 cinematic fps.
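Putting rough numbers on that (a sketch, using the 25 ms figure above and the standard 24fps frame duration):

```python
# A 25 ms lip-sync offset expressed as a fraction of one film frame at 24 fps.
frame_ms = 1000 / 24    # ~41.7 ms per frame
offset_ms = 25          # offset most viewers can already notice
print(f"One 24fps frame lasts {frame_ms:.1f} ms; "
      f"a {offset_ms} ms offset is {offset_ms / frame_ms:.0%} of a frame")
```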
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,453 (4.70/day)
Location
Kepler-186f
Processor Ryzen 7800X3D -30 uv
Motherboard AsRock Steel Legend B650
Cooling MSI C360 AIO
Memory 32gb 6000 CL 30-36-36-76
Video Card(s) MERC310 7900 XT -60 uv +150 core
Display(s) NZXT Canvas IPS 1440p 165hz 27"
Case NZXT H710 (Red/Black)
Audio Device(s) HD58X, Asgard 2, Modi 3
Power Supply Corsair RM850W
Honestly, I couldn't care less about being able to "see" the difference between 70Hz and 100Hz. All I care about is being able to stay at or slightly above 60FPS, depending on the game.

While I agree with you, the point of this thread was never meant to deviate into this debate; it was more about the scientific study of the eye and what is going on. The study I linked in the first post makes for some interesting thought experiments. I think this thread did a good job exploring those ideas early on, but it has since deviated from its original purpose. This is OK with me, but I am not really following this thread anymore, as it has already made me think in different ways, so I got out of it what I intended.
 
Joined
Feb 20, 2019
Messages
7,447 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I can't argue because this is all I know lol. I could see the diff between 75 and 85Hz on CRT; it was quite obvious.
Flicker or fluidity? I think a lot of people saw flicker at 75Hz on CRT but flicker threshold and fluidity thresholds are different.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
7,668 (3.70/day)
Location
Winnipeg, Canada
Processor AMD R9 5900X
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Aqua Elite 360 V3 1x TL-B12, 2x TL-C12 Pro, 2x TL K12
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, Asus Hyper M.2, 2x SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact RGB
Audio Device(s) JBL 2.1 Deep Bass
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Flicker or fluidity? I think a lot of people saw flicker at 75Hz on CRT but flicker threshold and fluidity thresholds are different.
I meant my current setup seems pretty fluid :D

But 75Hz on a CRT was a flicker fest for me, for sure.
 
Joined
Dec 25, 2020
Messages
4,832 (3.89/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Razer DeathAdder Essential Mercury White
Keyboard Redragon Shiva Lunar White
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Flicker or fluidity? I think a lot of people saw flicker at 75Hz on CRT but flicker threshold and fluidity thresholds are different.

CRTs inherently flicker because of the electron gun and the way it works. The monitor is constantly performing a rolling scanout (also known as a raster scan), sweeping an electron beam across the phosphor layer on the screen from left to right, top to bottom, and then restarting from the first line - the frequency at which it does this is called the vertical scan rate. The faster the monitor completes the scanout, the less perceptible the flicker is likely to be, which is why it's generally advisable to use CRTs at high refresh rates (i.e. high vertical frequencies) whenever possible. Burn-in on CRTs is caused by wear on the phosphor layer, as the electron gun continually beams at the same physical coordinates on the phosphor layer, degrading it over time.

LCD monitors work differently: the thin-film transistor layer itself doesn't flicker, but it also generates no light. Having no emissive properties, the various types of LCD are immune to burn-in, but they are prone to the backlight's flicker, as most LED or cold-cathode backlights use pulse-width modulation to control the intensity of the white light that forms the brightness component. Lower brightness means a longer dark portion of each PWM cycle (a lower duty cycle) and more perceived flicker (this is what causes discomfort). Flicker-free backlighting has been developed (usually marketed as "eye care" displays), and many higher-end TVs also use PWM frequencies fast enough to be much harder for the human eye to discern (for example, the 2018 Sony X900F used a 720 Hz PWM), resulting in a more comfortable image. This is generally the explanation for the anecdote that some displays "feel better" when the brightness and contrast settings are very high or maxed out.

OLEDs are self-emissive, so the only delays involved are the time the input signal processing takes to complete or, if that is removed from the equation, the time it takes for each diode to change its electrical characteristics to output the color requested by the controller (sample-and-hold time) - this is usually very fast and gets faster with each generation of panel. Since each pixel is self-emissive, they're prone to burn-in, as the changes in electrical state, as well as the maximum brightness achieved by each subpixel (R, G and B, sometimes W), will diminish over time. The gamble I took when I bought the LG G3 OLED was specifically this: to keep the burn-in risk as low as possible, I bought a set designed to operate at very high brightness in HDR and run it at the minimum brightness level, since I use it as a monitor. It currently has around 1,600 hours of power-on time, at least 80% of which was spent playing Genshin Impact with the UID fixed on-screen, and there have been no discernible signs of burn-in yet.
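As a rough illustration of the PWM point above (numbers are purely illustrative, not measurements of any particular panel): at a fixed PWM frequency, lowering the brightness lengthens the dark portion of each cycle, which is what makes the flicker easier to perceive, while a faster PWM shortens every cycle regardless of brightness:

```python
# Illustrative sketch: dark time per PWM cycle for a given PWM frequency and
# brightness (duty cycle). Figures are assumptions, not measured panel data.
def dark_time_ms(pwm_hz: float, duty: float) -> float:
    period_ms = 1000.0 / pwm_hz        # length of one on/off cycle
    return period_ms * (1.0 - duty)    # portion of each cycle spent dark

for duty in (1.0, 0.5, 0.2):           # 100%, 50%, 20% brightness
    print(f"240 Hz PWM @ {duty:.0%} brightness: {dark_time_ms(240, duty):.2f} ms dark per cycle")
print(f"720 Hz PWM @ 20% brightness: {dark_time_ms(720, 0.2):.2f} ms dark per cycle")
```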
 
Joined
Jan 14, 2019
Messages
10,050 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
60FPS human cap crew strikes back. I guess it was inevitable. :laugh:
I can't say that the "science" is correct and factual. All I can say is, it is true in my case. :ohwell:
 
Joined
Jun 25, 2020
Messages
100 (0.07/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 3x SilentWings3 120mm @Front, Noctua S12A @Back
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200 1GB, Crucial P3+ 4TB (w/adaptor, @Gen2x1 ), Seagate 3TB+1TB HDD, Kingston SA400 512GB
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Non-branded wired full custom mechanical keyboard from my brother's leftover
I think noticing flicker and noticing the smoothness of a higher refresh rate are different things.
I don't think I can spot 60Hz flicker like in the studies, nor could I spot VRR flicker in the early VRR days or on recent OLEDs, but I'm damn sure I can notice the difference between 60Hz, 100Hz, 120Hz, 150Hz and 360Hz.
The studies quoted are therefore, IMO, while proving some people have faster vision, not necessarily (dis)proving that they can see >60Hz as smooth motion.

Uh, I'm late on the point.

Being a "enthusiast" but far from competition level arcade IIDX (well, good old fashioned hit-the-notes-that-are dropping style rhythm game) player, the difference between old 60Hz cabinets on various older style LCD and newer 120Hz cabinets on more modern IPS VA are painfully obvious. 60Hz is smeary AF, and 120Hz is less smeary and reasonably smooth, yet I can kinda tell the notes are dropping in individual frames. (...Well, screen technology / quality also matters a lot. I really thought that was IPS though, but anyway.)

For those who don't play that competitively, the old cabinets work well enough, and they are something you can get used to even after playing on new cabinets. It was raining notes like hell for ~20 years before the new cabinets anyway. Real competitive players, professional or not, play on new cabinets almost exclusively though.

Weirdly, since the new cabinets came out (and that was after I was already on the 144Hz IPS), for years I was the only player who preferred the old cabinets, because for some unknown reason I could hit slightly higher scores / miss fewer notes on top-level charts, even in situations where I couldn't read all the raining notes / hit all the notes on time. That changed some time last month, and I'm on the new cabinets exclusively for now.

I would love to try the even older cabinets with CRTs for once, but CRTs degrade, and cabinets with working CRTs are a bit hard to find.

At home, playing the PC version of IIDX or BMS (kind of an IIDX simulator for PC) on my 144/150Hz IPS monitor is much clearer, but the comparison isn't fair given the differences in application / screen size / quality.
 
Low quality post by gurusmi
Joined
May 24, 2023
Messages
655 (1.81/day)
Location
127.0.0.1, ::1
System Name Naboo (2019)
Processor AMD 3800x
Motherboard Gigabyte Aorus Master V1 (X470)
Cooling individual EKWB/Heatkiller loop
Memory 4*8 GB 3600 Corsair Vengeance
Video Card(s) Sapphire Pulse 5700XT
Storage SSD 1TB PCIe 4.0x4, 2 TB PCIe 3.0
Display(s) 2*WQHD
Case Lian Li O11 Rog
Audio Device(s) Hifiman, Topping DAC/KHV
Power Supply Seasonic 850W Gold
Mouse Logitech MX2, Logitech MX Ergo Trackball
Keyboard Cherry Stream Wireless, Logitech MX Keys
Software Linux Mint "Vera" Cinnamon
It's not a matter of seeing or not seeing.

Oh. Some don't need eyes to see. They watch with their ears or nose or whatever. Yeah. OK. And in the evening I take some of those pills...
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
7,755 (2.38/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
I am only playing at 60fps because my screen is only 60Hz. It is fluid af though. I dunno... looks awesome to me.

I can't argue because this is all I know lol. I could see the diff between 75 and 85Hz on CRT; it was quite obvious.

It is one of those "if you know, you know" type situations. Most people will be happy for a lifetime playing at 60Hz, but once they dip into 120Hz on a daily basis, going back to 60Hz is immediately pure torture. I had the Pixel 7a for some time and even the 120Hz to 90Hz downgrade was a difficult pill to swallow, even though 60Hz to 90Hz would have felt great.

With the advent of VRR, I can *kinda* understand people saying they can't distinguish 60Hz from 120Hz, based on some mix of personal characteristics/individual game design/frame pacing/input lag etc. With a good panel, some well optimized games can certainly play better at 60Hz than others do at 120Hz.

What seals the deal for me is 120Hz on phones. There's no debate to be had - the difference between 60Hz and 120Hz on a phone is so starkly obvious, that an inability to tell can only be down to personal characteristics.
 
Low quality post by gurusmi
Joined
May 24, 2023
Messages
655 (1.81/day)
Location
127.0.0.1, ::1
System Name Naboo (2019)
Processor AMD 3800x
Motherboard Gigabyte Aorus Master V1 (X470)
Cooling individual EKWB/Heatkiller loop
Memory 4*8 GB 3600 Corsair Vengeance
Video Card(s) Sapphire Pulse 5700XT
Storage SSD 1TB PCIe 4.0x4, 2 TB PCIe 3.0
Display(s) 2*WQHD
Case Lian Li O11 Rog
Audio Device(s) Hifiman, Topping DAC/KHV
Power Supply Seasonic 850W Gold
Mouse Logitech MX2, Logitech MX Ergo Trackball
Keyboard Cherry Stream Wireless, Logitech MX Keys
Software Linux Mint "Vera" Cinnamon
It is one of those "if you know, you know" type situations. Most people will be happy for a lifetime playing at 60Hz, but once they dip into 120Hz on a daily basis, going back to 60Hz is immediately pure torture.

Welcome to the placebo effect.
 
Joined
Nov 11, 2016
Messages
3,133 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
I need a 240Hz OLED panel to tell how many FPS I can distinguish :D.

If I were to focus on playing a competitive game, I bet that figure would be >200FPS.
 
Joined
Feb 20, 2019
Messages
7,447 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
CRTs inherently flicker because of the electron gun and the way it works. The monitor is constantly performing a rolling scanout (also known as a raster scan), sweeping an electron beam across the phosphor layer on the screen from left to right, top to bottom, and then restarting from the first line - the frequency at which it does this is called the vertical scan rate. The faster the monitor completes the scanout, the less perceptible the flicker is likely to be, which is why it's generally advisable to use CRTs at high refresh rates (i.e. high vertical frequencies) whenever possible. Burn-in on CRTs is caused by wear on the phosphor layer, as the electron gun continually beams at the same physical coordinates on the phosphor layer, degrading it over time.

LCD monitors work differently: the thin-film transistor layer itself doesn't flicker, but it also generates no light. Having no emissive properties, the various types of LCD are immune to burn-in, but they are prone to the backlight's flicker, as most LED or cold-cathode backlights use pulse-width modulation to control the intensity of the white light that forms the brightness component. Lower brightness means a longer dark portion of each PWM cycle (a lower duty cycle) and more perceived flicker (this is what causes discomfort). Flicker-free backlighting has been developed (usually marketed as "eye care" displays), and many higher-end TVs also use PWM frequencies fast enough to be much harder for the human eye to discern (for example, the 2018 Sony X900F used a 720 Hz PWM), resulting in a more comfortable image. This is generally the explanation for the anecdote that some displays "feel better" when the brightness and contrast settings are very high or maxed out.

OLEDs are self-emissive, so the only delays involved are the time the input signal processing takes to complete or, if that is removed from the equation, the time it takes for each diode to change its electrical characteristics to output the color requested by the controller (sample-and-hold time) - this is usually very fast and gets faster with each generation of panel. Since each pixel is self-emissive, they're prone to burn-in, as the changes in electrical state, as well as the maximum brightness achieved by each subpixel (R, G and B, sometimes W), will diminish over time. The gamble I took when I bought the LG G3 OLED was specifically this: to keep the burn-in risk as low as possible, I bought a set designed to operate at very high brightness in HDR and run it at the minimum brightness level, since I use it as a monitor. It currently has around 1,600 hours of power-on time, at least 80% of which was spent playing Genshin Impact with the UID fixed on-screen, and there have been no discernible signs of burn-in yet.
Yeah, that's what I was saying back in post #68 (not that I blame you for missing one post in a multi-page thread).

Strobing displays work through persistence of vision, which is well-understood science and a function of the fact that each rod/cone in our retinas has a recharge time after being struck by a photon of light. The TL;DR is that a photon hits the retina and, in the oversimplified version, protein chains get broken to generate the signal to your brain, then rapidly reassembled by enzyme molecules before they're ready for the next photon:

[Attached image: rhodopsin cycle diagram]

This cycle is your eyeball's framerate, but it's not quite that simple; there are thousands of these molecules performing this cycle in each rod/cone of your eye, and you have a hundred million rods and cones in each eye. THEY ARE NOT SYNCED, so your eyes have a near-infinite framerate, as billions of these individual proteins all run through this cycle out of phase, and at any instant in time some of them will be ready to accept incoming photons.

The eyeball "framerate" is actually better described as the eyeball blind-time. After a photon comes in, that particular protein is out of action for a while, and the ion-charge signal it's sending to your brain is "on" for most of that time. Strobing lights like CRTs, PWM backlight dimming, and old reel-to-reel cinema projectors all had dark gaps between frames that our eyes can't see, because they're blind after seeing a flash. Their temporal resolution is the time it takes for 4 of the 5 stages in that rhodopsin cycle diagram above, and for most of that cycle they are transmitting an "on" signal even if no light is hitting the retina during that point. If you see a flash of light that's only 100µs long, your retina will detect that flash but it can't tell you that it's only 100µs long, it will transmit the "hey, it's light!" signal to your brain for the entire duration of the rhodopsin cycle which is far longer, measured in multiple ms rather than µs.

Clearly, the duration of the rhodopsin cycle is different for different people. It might be that different people have different physical lengths of rhodopsin chains in their cells; it might be something else like the rhodopsin/enzyme ratio. I don't know - at this point I'm guessing because I've not read any studies of this; if one such study exists, I haven't stumbled upon it yet. All we know from empirical data is that the cycle time is 13-17ms for most people. It'll be one of those standard-deviation bell curves where ~70% of the population are in that 13-17ms region, with a few outliers.

So this 13-17ms cycle time is the maximum amount of time a light source can go dark before our eyes will be able to sense that. If a light is cycling faster than every 13-17ms, your eye is transmitting signals to your brain indicating that the light is permanently on, with no gaps.
  • CRTs are almost entirely dark. The half-life of a lit phosphor dot is about 200µs. After 0.5ms it's already dim enough not to be considered 'lit', so at 75Hz, a CRT is dark for 12ms+ between frames.

  • Older cinema projectors such as 35mm film had a mechanical shutter that was closed, blocking light, for about one-third of the cycle, during which the film would be moved on to the next frame. Yes, the framerate of the reel was 24fps (a new image every 42ms), but the dark period between each frame was only ~14ms, a similar duration to the perceived flicker of a 60-70Hz CRT, which feels about right.

  • PWM backlight flicker from LCDs is an interesting case, and this is where I'm starting to guess/hypothesise again: when I said the individual rhodopsin reactions in your eye are not synced, I meant that your eye has no biological mechanism to keep them in sync, but that doesn't mean they can't be synced by external stimuli. A light source cycling on and off will 'blind' all billion+ cells in your eye at once, and they'll all complete their rhodopsin cycle at about the same time some 13-17ms later. If the period where all those proteins come back online happens to fall at the start of one of the PWM-pulsed "off" cycles and it's a slower 200-1000Hz PWM, you might notice it. It also explains why people who perceive PWM backlight flicker report that adjusting the brightness up or down slightly eliminates the flicker. It's still flickering, but my guess is that changing the duration of the PWM's pulse brings a lit part of the PWM cycle into overlap with the eye's rhodopsin cycle.
So all of the above is about cycling light sources like CRTs, cinema reel projectors and PWM flicker. For constant light sources such as LCD/OLED displays, you don't get this external frequency putting all of your retina into a synced rhodopsin cycle, and your eyeball's framerate is basically infinite again. That's where the brain comes in, and perceived framerate is a function of how fast your visual cortex can resolve changes before it all becomes a blur. The mechanics of the eye are different from the mechanics of the brain. Unlike your brain, your eye cannot handle a sequence of flashes if there's less than 13-17ms between them, yet your brain can handle visual changes faster. It can interpret complex sound waves at frequencies exceeding 10 kHz, for example. The processing power is fast enough; we just don't fully understand it.
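To put rough numbers on the CRT bullet above, here's a back-of-the-envelope sketch, assuming the ~200µs half-life figure from the post and treating phosphor decay as a simple exponential (a simplification; real decay curves are messier):

```python
# Sketch only: exponential phosphor decay with an assumed ~200 µs half-life,
# and the resulting dark gap per frame on a 75 Hz CRT.
HALF_LIFE_US = 200.0

def relative_brightness(t_us: float) -> float:
    return 0.5 ** (t_us / HALF_LIFE_US)

print(f"Brightness 0.5 ms after the beam passes: {relative_brightness(500):.1%}")  # ~17.7%
frame_ms = 1000 / 75                 # ~13.3 ms between refreshes at 75 Hz
dark_ms = frame_ms - 0.5             # treating anything past 0.5 ms as 'not lit'
print(f"At 75 Hz: roughly {dark_ms:.1f} ms of each {frame_ms:.1f} ms frame is dark")
```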
 
Last edited:
Joined
Mar 29, 2023
Messages
1,043 (2.50/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
I'm in the category of people who notice tiny differences in image quality (thus I play at 8K 60), but aren't sensitive to framerate or latency at all - as long as the frametimes are consistent.
 
Joined
Apr 24, 2022
Messages
168 (0.22/day)
System Name lawooder
Processor 5800x3D
Motherboard B550M Aorus Elite
Cooling Arctic Freezer 34 DUO
Memory 32GB 4000Mhz CL16 (2x16GB)
Video Card(s) RTX 3070 Ti
Storage SN770 250GB/SN750 500GB/SA510 1TB
Display(s) VG279QM 1080P 280hz + VG27AQ 1440P 165HZ
Case MSI MAG Something
Audio Device(s) fiiO EK10 + HD 599
Power Supply Focus GX 750 Watts
Mouse XM1r
Keyboard Xtrfy K4
I'm really curious to know whether the people who claim they can't see the difference have ever had a 144Hz and a 240Hz display side by side. I really wonder... Or even better, two of the same display, one at 144Hz and the other at 240Hz. Don't even get me started on 60 vs 144...

Because they certainly never have; otherwise topics like this wouldn't exist at all.

It's just ignorance and a lack of will to spend the money/evolve; it's kinda sad actually.
 