CRTs inherently flicker because of the way the electron gun works. The monitor is constantly performing a rolling scanout (also known as a raster scan), sweeping the beam across the phosphor layer on the screen from left to right, top to bottom, and then restarting from the first line - the frequency at which it completes this is called the vertical scan rate. The faster the monitor completes the scanout, the less likely it is for the flicker to be perceptible, which is why it's generally advisable to run CRTs at high refresh rates (i.e. high vertical frequencies) whenever possible. Burn-in on CRTs is caused by wear on the phosphor layer: the electron gun keeps hitting the same physical coordinates on the phosphor layer, degrading those spots over time.
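To put some rough numbers on the scanout (the 1024x768 @ 85Hz mode below is just an example, and blanking intervals are ignored for simplicity):

```python
# Rough scanout timing for an example CRT mode (1024x768 @ 85 Hz). These are
# illustrative figures only; blanking intervals are ignored to keep it simple.
refresh_hz = 85          # vertical scan rate / refresh rate
visible_lines = 768      # lines drawn per frame

frame_time_ms = 1000 / refresh_hz                      # one full scanout
line_time_us = frame_time_ms * 1000 / visible_lines    # time spent per line

print(f"Frame time: {frame_time_ms:.2f} ms")   # ~11.76 ms at 85 Hz
print(f"Line time:  {line_time_us:.2f} µs")    # ~15.32 µs per line
```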
LCD monitors work differently: the thin-film-transistor layer itself doesn't flicker, but it also generates no light. Having no emissive properties, the various types of LCD are immune to burn-in, but they inherit the backlight's flicker, as most LED and cold-cathode backlights use pulse-width modulation (PWM) to control the intensity of the white light that forms the brightness component. Lower brightness means a longer off period in each PWM window (a lower duty cycle), and therefore more perceptible flicker - this is what causes discomfort. Flicker-free backlighting has been developed (usually marketed as "eye care" devices), and many higher-end TVs also use fast PWM that is less easily discernible by the human eye (for example, the 2018 Sony X900F used 720 Hz PWM), resulting in a more comfortable image. Generally, this is the explanation for the anecdote that some displays "feel better" when the brightness and contrast settings are very high or maxed out.
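A quick sketch of why lower brightness makes PWM flicker worse - the 240Hz/720Hz frequencies are just example values, and the assumption that duty cycle maps linearly to brightness is a simplification of what real backlights do:

```python
# Why lower brightness makes PWM flicker worse: the PWM period stays the same,
# but the "off" portion of each period grows. The 240 Hz and 720 Hz frequencies
# are example values, and real backlights don't map brightness to duty cycle
# perfectly linearly - this is just the idea.
def pwm_dark_window_ms(pwm_hz: float, brightness_pct: float) -> float:
    """Length of the dark part of each PWM period, assuming duty cycle == brightness."""
    period_ms = 1000 / pwm_hz
    return period_ms * (1 - brightness_pct / 100)

for hz in (240, 720):
    for brightness in (100, 50, 20):
        print(f"{hz} Hz PWM @ {brightness:>3}% brightness -> "
              f"dark for {pwm_dark_window_ms(hz, brightness):.2f} ms per cycle")
```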
OLEDs are self-emissive, so the only delays involved are the time the input signal processing takes to complete or, if that is removed from the equation, the time it takes for each diode to change its electrical state to output the color requested by the controller (the sample-and-hold time) - this is usually very fast and gets faster with each generation of panel. Since each pixel is self-emissive, they're prone to burn-in: with repeated changes in electrical state, the maximum brightness achievable by each subpixel (R, G and B, sometimes W) diminishes over time. The gamble I took when I bought the LG G3 OLED was specifically this: to keep the burn-in risk as low as possible, I bought a set designed to operate at very high brightness in HDR and run it at the minimum brightness level, since I use it as a monitor. It currently has around 1600 hours of power-on time, at least 80% of which was spent playing Genshin Impact with the UID fixed on-screen, and there have been no discernible signs of burn-in yet.
Yeah, that's what I was saying back in post #68 (not that I blame you for missing one post in a multi-page thread).
Strobing displays work through persistence of vision, which is well-understood science and a function of the fact that each rod/cone in our retinas has a recharge time after being struck by a photon of light. The TL;DR, oversimplified version is that a photon hits the retina, protein chains get broken apart to generate the signal to your brain, and then they're rapidly reassembled by enzyme molecules before they're ready for the next photon:
This cycle is your eyeball's framerate, but it's not quite that simple; there are thousands of these molecules performing this cycle in each rod/cone of your eye, and you have a hundred million rods+cones in each eye.
THEY ARE NOT SYNCED, so your eyes have a near-infinite framerate as billions of these individual proteins all run through this cycle out of phase, and at any instant in time some of them will be ready to accept incoming photons.
The eyeball "framerate" is actually better described as the eyeball's blind time. After a photon comes in, that particular protein is out of action for a while, and the ion-charge signal it's sending to your brain is "on" for most of that time.
Strobing lights like CRTs, PWM backlight dimming, and old reel-to-reel cinema projectors all had dark gaps between frames that our eyes can't see, because they're blind after seeing a flash. Their temporal resolution is the time it takes for 4 of the 5 stages in that rhodopsin cycle diagram above, and for most of that cycle they are transmitting an "on" signal even if no light is hitting the retina at that point. If you see a flash of light that's only 100µs long, your retina will detect that flash, but it can't tell you that it was only 100µs long; it will transmit the "hey, it's light!" signal to your brain for the entire duration of the rhodopsin cycle, which is far longer, measured in multiple ms rather than µs.
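A toy illustration of that point, assuming a 15ms cycle time as a mid-range figure (this is deliberately not a physiological model, just the arithmetic of the argument):

```python
# Toy illustration: once a receptor fires, its "it's light!" signal lasts for
# the full rhodopsin cycle, so a 100 µs flash and a 5 ms flash look the same
# length to the brain. The 15 ms cycle time is an assumed mid-range figure,
# not a measured one.
CYCLE_MS = 15.0

def reported_duration_ms(flash_ms: float) -> float:
    # The retina reports light for at least one full cycle after the flash starts.
    return max(flash_ms, CYCLE_MS)

for flash_ms in (0.1, 1.0, 5.0, 20.0):
    print(f"{flash_ms:>5.1f} ms flash -> brain is told about "
          f"~{reported_duration_ms(flash_ms):.1f} ms of light")
```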
Clearly, the duration of the rhodopsin cycle is different for different people. It might be that different people have different physical lengths of rhodopsin chains in their cells, or it might be something else like the rhodopsin/enzyme ratio.
I don't know - at this point I'm guessing, because I've not read any studies on this; if one exists, I haven't stumbled upon it yet. All we know from empirical data is that the cycle time is 13-17ms for most people. It'll be one of those standard-deviation bell curves where ~70% of the population sits in that 13-17ms region, with a few outliers.
So this 13-17ms cycle time is the maximum amount of time a light source can go dark before our eyes are able to sense the gap. If a light is cycling faster than every 13-17ms, your eye is transmitting signals to your brain indicating that the light is permanently on, with no gaps.
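You can turn that straight into a "minimum flicker frequency" figure. This treats the whole period as dark, which is the worst case (roughly what a CRT does, see the list below):

```python
# Turning the 13-17 ms blind time into a "minimum flicker frequency": if the
# dark gap between flashes is shorter than the blind time, the eye never
# reports it. Worst case assumed here: the source is dark for almost the
# entire period.
for blind_time_ms in (13, 15, 17):
    print(f"{blind_time_ms} ms blind time -> gaps become invisible "
          f"above ~{1000 / blind_time_ms:.0f} Hz")
```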
- CRTs are almost entirely dark. The half-life of a lit phosphor dot is about 200µs, so after 0.5ms it's already dim enough to not be considered 'lit', meaning that at 75Hz a CRT is dark for 12ms+ between frames (see the quick arithmetic after this list).
- Older cinema projectors, such as 35mm film, had a mechanical shutter that was closed for about one-third of the cycle, blocking light while the film was moved onto the next frame. Yes, the framerate of the reel was 24fps (a new image every ~42ms), but the dark period between each frame was only ~14ms, a similar duration to the dark period of a 60-70Hz CRT, which feels about right.
- PWM backlight flicker from LCDs is an interesting case, and this is where I'm starting to guess/hypothesise again: when I said the individual rhodopsin reactions in your eye are not synced, I meant that your eye has no biological mechanism to keep them in sync, but that doesn't mean they can't be synced by external stimuli. A light source cycling on and off will 'blind' virtually all of the cells in your eye at once, and they'll all complete their rhodopsin cycle at about the same time, some 13-17ms later. If the moment where all those proteins come back online happens to fall at the start of one of the PWM's "off" periods, and it's a slower 200-1000Hz PWM, you might notice it. It would also explain why people who perceive PWM backlight flicker report that adjusting the brightness up or down slightly eliminates the flicker. It's still flickering, but my guess is that changing the duration of the PWM's pulse brings a lit part of the PWM cycle into overlap with the point where the eye's rhodopsin cycle completes.
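Quick arithmetic behind the numbers above (the 200µs half-life, the 0.5ms "dim enough" cut-off and the one-third shutter fraction are taken straight from the bullets; the 500Hz/30% PWM case at the end is just an illustrative value I picked):

```python
# Sanity-checking the numbers in the bullets above.

# CRT at 75 Hz: exponential phosphor decay with a ~200 µs half-life.
half_life_us = 200
lit_ms = 0.5                                   # the "dim enough" cut-off
remaining = 0.5 ** (lit_ms * 1000 / half_life_us)
frame_ms = 1000 / 75
print(f"Phosphor after {lit_ms} ms: {remaining:.0%} of peak brightness")   # ~18%
print(f"75 Hz CRT: dark for ~{frame_ms - lit_ms:.1f} ms of every {frame_ms:.1f} ms frame")

# 35 mm projector: 24 fps with the shutter closed for ~1/3 of each cycle.
proj_frame_ms = 1000 / 24
print(f"24 fps projector: dark for ~{proj_frame_ms / 3:.1f} ms "
      f"of every {proj_frame_ms:.1f} ms frame")

# PWM backlight (example values): even a slow-ish 500 Hz PWM at 30% brightness
# is only dark for ~1.4 ms per cycle, well under the 13-17 ms blind time,
# which is why perceiving it seems to need the sync effect described above.
pwm_hz, brightness = 500, 0.30
print(f"{pwm_hz} Hz PWM @ {brightness:.0%}: dark for "
      f"~{(1000 / pwm_hz) * (1 - brightness):.2f} ms per cycle")
```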
So all of the above is about cycling light sources like CRTs, cinema reel projectors and PWM flicker. For constant light sources such as flicker-free LCDs and OLED displays, you don't get this external frequency putting your whole retina into a synced rhodopsin cycle, and your eyeball's framerate is basically infinite again.
That's where the brain comes in: perceived framerate is a function of how fast your visual cortex can resolve changes before it all becomes a blur. The mechanics of the eye are different to the mechanics of the brain. Unlike your brain, your eye cannot handle a sequence of flashes if there's less than 13-17ms between them, yet your brain can handle changes faster. It can interpret complex sound waves at frequencies exceeding 10kHz, for example. The processing power is fast enough; we just don't fully understand it.