
Only some humans can see refresh rates faster than others, I am one of those humans.

I'm sorry, but your graph over time is wrong. A CRT doesn't work the way you've tried to depict it. You would need to know how a CRT is built to understand how it works and why some people got headaches.


That's how a CRT works, measured at a specific point on the screen.

For the screen as a whole, the output from the electron gun is constant, but your retina doesn't focus on the whole screen at once: your macula's focal region is only about a 5° cone of vision, and your monochrome peripheral vision is even more sensitive to rapid contrast and brightness variation. That's why you can sometimes notice flicker in your peripheral vision but not when you stare directly at the source.
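To put the "measured at a specific point" part in concrete terms, here's a minimal sketch (purely illustrative, with made-up refresh rate and persistence values) that models brightness at one spot as a brief beam hit each refresh followed by exponential phosphor decay:

```python
# Minimal sketch: brightness at one point on a CRT over time.
# The beam excites the phosphor once per refresh; it then decays roughly
# exponentially until the next pass. All numbers are illustrative.
import math

refresh_hz = 85            # assumed refresh rate
decay_ms = 2.0             # assumed phosphor persistence time constant (ms)
period_ms = 1000.0 / refresh_hz

def brightness(t_ms: float) -> float:
    """Relative brightness at time t (ms), measured at one screen point."""
    since_hit = t_ms % period_ms      # time since the beam last hit this spot
    return math.exp(-since_hit / decay_ms)

for t in (0.0, 1.0, 3.0, 6.0, 11.0):  # a few samples across one refresh period
    print(f"{t:5.1f} ms: relative brightness {brightness(t):.3f}")
```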

You're sorry? No, you're just plain wrong - I spent five seconds to double-check I was right, and this was the very first Google result. If you're going to attempt to patronise someone, please make sure you have a clue what you're actually talking about; at least take the five seconds to check yourself before wasting 20 seconds typing something wrong in a condescending tone.

[attached images showing the search result]
(source link)
 
Last edited:
The only way to know for sure is to do a blind test where you are not allowed to move a mouse, as you are talking about *seeing* FPS (trying to feel it would be cheating). A variable refresh rate test comprising many refresh rates - perhaps 30 different refresh rates running for a minute each, where you would enter the refresh rate number after each test.
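Something like the randomised bookkeeping below would cover it - a rough, purely illustrative sketch (the set of rates and the scoring are arbitrary), assuming there's a separate way of actually switching the display's refresh rate for each one-minute trial:

```python
# Rough sketch of the blind-test bookkeeping: present refresh rates in a
# shuffled order, log the participant's guess after each trial, then score.
# Actually switching the monitor between trials is out of scope here.
import random

rates_hz = [30, 48, 60, 75, 90, 120, 144, 165, 240]   # example set of rates
order = random.sample(rates_hz, k=len(rates_hz))      # shuffled presentation

results = []
for actual in order:
    # ...switch the display to `actual` Hz and run the one-minute test here,
    # without telling the participant what the value is...
    guess = int(input("Your guess for this trial's refresh rate (Hz): "))
    results.append((actual, guess))

exact = sum(1 for actual, guess in results if actual == guess)
print(f"Exact matches: {exact}/{len(results)}")
```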
 
The only way to know for sure is to do a blind test where you are not allowed to move a mouse, as you are talking about *seeing* FPS (trying to feel it would be cheating). A variable refresh rate test comprising many refresh rates - perhaps 30 different refresh rates running for a minute each, where you would enter the refresh rate number after each test.
A few of the videos I've seen are blind testing of prerecorded gameplay footage, to ensure all participants get the exact same content.

You're talking about an even more scientific test, which is valid, but I don't think anyone actually cares, since based on the poll that's gone up, about 90% of respondents can see and appreciate higher refresh rates.

That's a much higher percentage than I predicted it would be, but I'm not actually surprised; I erred on the cautious side, and it's not a true random sample of people - it's 90% of a sample of people interested in a discussion about refresh rates. Even so, if 90% of the people interested in a more scientific test like you're suggesting can already see high refresh rates, why bother doing such a large-scale test to tell them what they already know? They can see it with their own eyes!
 
A few of the videos I've seen are blind testing of prerecorded gameplay footage, to ensure all participants get the exact same content.

You're talking about an even more scientific test, which is valid, but I don't think anyone actually cares, since based on the poll that's gone up, about 90% of respondents can see and appreciate higher refresh rates.

That's a much higher percentage than I predicted it would be, but I'm not actually surprised; I erred on the cautious side, and it's not a true random sample of people - it's 90% of a sample of people interested in a discussion about refresh rates. Even so, if 90% of the people interested in a more scientific test like you're suggesting can already see high refresh rates, why bother doing such a large-scale test to tell them what they already know? They can see it with their own eyes!
I'm fine with the results of the poll here, but I do wonder how many more might have voted that they saw no difference if the vote had been anonymous. Maybe not many, but who knows?
 
I wonder... I hope people don't mix up high FPS and high Hz refresh rates...
 
You're sorry? No, you're just plain wrong - I spent five seconds to double-check I was right, and this was the very first Google result. If you're going to attempt to patronise someone, please make sure you have a clue what you're actually talking about; at least take the five seconds to check yourself before wasting 20 seconds typing something wrong in a condescending tone.

You can use Google search. Great. I learned how to repair a CRT professionally. I learned how a CRT works.

The problem for all of you is not the FPS of a device but the sh*t of uneducated programmers not knowing how to code clean, who aren't able to calculate their code's runtime, etc. The same goes for the CPU: the speed just hides the problems in their code. The best counterexample to your FPS fairytale is the demos on the Commodore Amiga. Programming is an art, but most can only paint the wall of a house.

Not even once would I want to be stupid enough to use a compiler only to have the resulting code interpreted, like with M$ .NET. Not even once.
 
I'm fine with the results of the poll here, but I do wonder how many more might have voted that they saw no difference if the vote had been anonymous. Maybe not many, but who knows?

If I saw zero difference, I'd have no issue voting no. Either way, there isn't anything wrong with being able, or not being able, to discern different refresh rates.

I would have a much harder time telling them apart above 60 FPS if it was a recorded video vs. having hands-on time with the different framerates. I've had a high refresh monitor now for almost a decade. Even at the start I had a 4K 60 Hz Dell UltraSharp and an Asus ROG 165 Hz display, and I much preferred the higher refresh monitor. To those who prefer uber-high resolution, more power to them; it's always a tradeoff, and that's the beauty of the PC platform - everyone can game how they want, assuming they have the disposable income, of course.
 
It doesn't matter who's moving the mouse; the cursor is going to have wider gaps between each update as the FPS decreases.
Not when you can control the number of trails.
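For what it's worth, the "wider gaps" point is simple arithmetic; a quick sketch with an assumed (purely illustrative) cursor speed:

```python
# Quick arithmetic behind "wider gaps between each update": at a fixed cursor
# speed, lower FPS means the cursor jumps further between drawn positions.
cursor_speed_px_per_s = 1000      # assumed cursor speed, purely illustrative

for fps in (30, 60, 120, 144, 240):
    gap_px = cursor_speed_px_per_s / fps
    print(f"{fps:3d} FPS -> ~{gap_px:5.1f} px between cursor positions")
```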
 
You can use Google search. Great. I learned how to repair a CRT professionally. I learned how a CRT works.

The problem for all of you is not the FPS of a device but the sh*t of uneducated programmers not knowing how to code clean, who aren't able to calculate their code's runtime, etc. The same goes for the CPU: the speed just hides the problems in their code. The best counterexample to your FPS fairytale is the demos on the Commodore Amiga. Programming is an art, but most can only paint the wall of a house.

Not even once would I want to be stupid enough to use a compiler only to have the resulting code interpreted, like with M$ .NET. Not even once.
It would appear you don't even have the weakest grasp of the relationship between CRT beam scan and human vision if you're now blaming "the sh*t of uneducated programmers not knowing how to code clean".

First of all, you can't patronisingly tell someone they're wrong in the face of strong supporting, seemingly-relevant evidence and then provide not only no evidence, but no reasoning for either why they're wrong, or why their evidence is wrong.

Second, have you been living under a rock for a decade? Competitive games have been running at hundreds of FPS for a long time, with a lot of work put into reducing latency and delivering accurately-timed, evenly-paced frames. I think Scott Wasson's Inside the Second article on the now-defunct TechReport was a key turning point in getting the gaming and GPU industry to focus on lower latency and more evenly paced frames. His article (2011, by the way) was the point at which everyone started concerning themselves with minimum FPS, 1% lows, frame pacing, and input lag. There are ecosystems that have gone through multiple generations of hardware and software since then, both on the consumer side (Reflex, Anti-Lag, etc.) and for journalists/researchers to measure (FRAPS > FCAT > LDAT).
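(For anyone unfamiliar with those terms, here's a rough, purely illustrative sketch of how average FPS and "1% low" FPS are typically derived from a frame-time log; the frame times below are invented, and real tools work over thousands of frames:)

```python
# Rough sketch: derive average FPS and "1% low" FPS from frame times (ms).
# The data is invented; real captures have thousands of frames.
frame_times_ms = [6.9, 7.1, 7.0, 6.8, 7.2, 16.5, 7.0, 6.9, 7.1, 25.0, 7.0, 6.8]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# Take the slowest 1% of frames (at least one frame for a tiny sample)
# and express their average frame time as an FPS figure.
slowest = sorted(frame_times_ms, reverse=True)
slice_1pct = slowest[: max(1, len(slowest) // 100)]
low_1pct_fps = 1000.0 / (sum(slice_1pct) / len(slice_1pct))

print(f"Average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {low_1pct_fps:.1f}")
```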

You haven't explained why I'm wrong and what little you have said makes it sound like you're angrily living 15 years in the past.
 
Last edited by a moderator:
Low quality post by gurusmi
Not acceptable behaviour
It would appear you don't even have the weakest grasp of the relationship between CRT beam scan and human vision if you're now blaming "the sh*t of uneducated programmers not knowing how to code clean".

It's ok. You're right and I'll keep my peace. You're wasting my time with uneducated stupidity. Enjoy your stay on my ignore list.
 
I still remember playing The Witcher 3 locked at 144 Hz / 144 FPS and just being in awe as I turned around. It's a game changer.
 
@chrcoluk
I never cared about anything above 50/60 Hz until I started playing online in a team (more often).
When you pull the trigger and still get killed by an enemy who shot after you did, you start noticing...

But people also ignore synced vs. unsynced.
I'd rather have a game VRR-synced and capped at 58 than unsynced at 120,
while still getting a massive drop in render time (CPU/GPU) vs. vsync, similar to using a higher FPS.
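For anyone wondering what capping below the refresh ceiling buys you: the game paces itself a little under the panel's maximum so every frame lands inside the VRR window instead of stalling on a vsync interval. A minimal, purely illustrative frame-limiter sketch (the cap value and loop are example choices, not any particular game's implementation):

```python
# Illustrative frame limiter: pace the render loop slightly below the panel's
# maximum refresh so presented frames stay inside the VRR window.
import time

cap_fps = 58                     # cap a bit under a 60 Hz panel's maximum
frame_budget = 1.0 / cap_fps     # seconds allotted to each frame

def render_frame():
    pass                         # stand-in for the game's actual render work

deadline = time.perf_counter()
for _ in range(10):              # a handful of frames, just to show the pacing
    render_frame()
    deadline += frame_budget
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)    # idle until this frame's slot ends
print("done")
```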
 
yep.

I had a 120 Hz monitor in school and a roommate had a 144 Hz one.

I could not tell the difference in a game at that.

Having said that, being at 4k and not really playing fps anymore, I'll keep my money and use 60hz. When/if 120hz becomes about a $50 adder to 4k I may drop that coin for the option.

Also, I'd be willing to bet younger people can detect more FPS than older people, say a teenager vs. 60+, but the same can be said for hearing, etc. Just the nature of things.
My 4K Samsung supposedly does 120 Hz (I think it's interpolated frame generation), but I still limit it to 60 FPS or lower as I see no difference. 55 inches of HDR 4K goodness at 60 FPS is awesome; the pixel density and contrast are amazing.
 
I can tell the difference if I look at one then look at the other. But I am perfectly comfortable using 60hz monitors, so no need to buy high end stuff for me lol
 
Regarding my earlier posts in this thread about the science part of what is going on, I wonder if researching this for gaming-related stuff could help studies like this one? In short: patients who perceive motion more slowly in older age can have Alzheimer's predicted, on average, 12 years before diagnosis. I wonder if there is any crossover here (probably not), but it might still lead to some deeper understanding of the part of the brain that handles visual stimulation, and what role that plays in all of this, both medically and for immersion.

It seems to me that brains that get stimulated more frequently can ward off dementia to some degree; it's sort of like the old saying, "if you don't use it, you lose it".

Link to Medical Study

actual study

From the article:
“We found that a low score on this test missing many targets can indicate future dementia risk, on average 12 years before the diagnosis, especially when using this test with other specific memory tests and some tests of global cognitive functioning worked well to predict this risk.”

The targets he is referring to are the visual tests.

And don't our fluids flush our brain out during deep sleep cycles? The older you get, the harder it is to get into deep sleep cycles; older people usually wake up every few hours. I wonder if that has a role in this too. "Use it or lose it" combined with the need for deep sleep cycles would be interesting to study in relation to all of this.
 
Last edited:
Might be worth a try, but more and more research into things like ALS or Alzheimer's
shows it's caused either by genes or bacteria - stuff not affected by "training" the brain.

The funny part is, any brain gymnastics like doing calculations don't even have to produce correct results:
the numbers showed less than a 5% difference in "gains" vs. those that got the "summary" right.
 
I've locked my displays to 60 Hz. When I cycled through the different refresh rates, I noticed and felt a clear difference between 30 Hz and 60 Hz, but beyond that I couldn't see or feel any difference when cycling up to 144 Hz.

Maybe it would be more noticeable while playing online jump and shoot games, but I have zero interest in those.
 
60 Hz causes me headaches, so it's gotta be higher
 
60 Hz causes me headaches, so it's gotta be higher

I don't know if you meant that literally. I occasionally play competitive games at work on a 60 Hz panel... and it does give me a headache or eye strain after 20-30 minutes of play. Perhaps I've got used to the improved motion fluidity of my 165 Hz gaming panel at home, and the occasional transition back to 60 Hz does my head in. I don't recall having the same issue with 60 Hz back in the day, before moving up the ladder.
 
I do notice some differences: 30 FPS on PS5 / 60 Hz is unusable for me, and about 100 to 144 Hz is more in the comfort zone.
 
[attached screenshot from The Simpsons, "Brother's Little Helper"]


"Joke, if you will, but did you know most people use ten percent of their brains. I am now one of them."
 
Oh it's this thread again, back on the front page.

I'll just get back to doing human things with my human appendages.... (Thread title is entertainingly special)
 
I don't know if you meant that literally. I occasionally play competitive games at work on a 60 Hz panel... and it does give me a headache or eye strain after 20-30 minutes of play. Perhaps I've got used to the improved motion fluidity of my 165 Hz gaming panel at home, and the occasional transition back to 60 Hz does my head in. I don't recall having the same issue with 60 Hz back in the day, before moving up the ladder.
I meant it literally; that was on CRT and LCD. It has to be 75+.
 
I meant it literally; that was on CRT and LCD. It has to be 75+.

lol, glad to hear I'm not the only one. Thought I was a factory defect up for a recall.

I'll just get back to doing human things with my human appendages

You're lucky. Once my backside hits the chair, mine are all physically challenged except the fingers and the occasional right arm swinging over to grab the mouse lol
 