
ASUS Reveals 500 Hz ROG Swift Esports Monitor With E-TN Panel and G-Sync

How do higher refresh rates affect pixel longevity?
 
Time to go back to CRT

I wish, cuz LCDs still suffer from the native only resolution or it looks blurry as shit issue.

Would love a Sony Trinitron FW900, but no way that fits on my 80cm deep desk. I mean, it fits, but then there's no room to sit at a reasonable distance.
 
Completely sidestepping that argument as well, is 500 Hz even supported by Windows and GPU drivers?
Absolutely, yes. Alienware already make 300Hz and 360Hz displays, I've hooked up a weird laboratory-grade digital oscilloscope before that was technically a 320x320 monochrome display in W7 with 1200Hz as an option IIRC (Might have been 12KHz, LOL).

As for the "can people see the difference beyond x Hz?" discussion, my opinion is that the higher the refresh rate gets, the less relevant the answer becomes because it's subject to diminishing returns. For me, visual fluidity occurs at about 85Hz and I can feel/notice the difference between 85Hz and 120Hz in side-by-side testing but that's about it. I've not used a 360Hz display but I'd be lying if I said I could tell the difference between 240Hz and 120Hz on the same 240Hz monitor. I feel like once you are getting triple-digit framerates it's time to worry about other things.

I have a 165Hz display but choose to run it at 120Hz because to me that's plenty fast enough and it means my GPU can run cooler/quieter and I don't have to faff with Factorio that runs at a locked 60Hz and stutters a bit with G-Sync at 165Hz.
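To put rough numbers on the diminishing-returns point, here is a quick sketch (Python, purely illustrative values) of how much frame time each step up in refresh rate actually buys you:

```python
# Each jump in refresh rate shaves less and less off the per-frame time,
# which is why the perceived improvement shrinks as the numbers climb.
rates_hz = [60, 85, 120, 165, 240, 360, 500]

for lower, higher in zip(rates_hz, rates_hz[1:]):
    saved_ms = 1000 / lower - 1000 / higher
    print(f"{lower:>3} Hz -> {higher:>3} Hz saves {saved_ms:.2f} ms per frame")

# 60 -> 85 Hz saves ~4.9 ms per frame, while 360 -> 500 Hz saves only ~0.78 ms.
```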

How do higher refresh rates affect pixel longevity?
Pixel longevity isn't a thing for LCD panels, only OLEDs.
An LCD (so TN, IPS, VA) pixel will be as good and as fast as the day it left the factory when the backlight dies 15 years later.
 
Never looked at Hz anyway. I am after real estate (32:9 / 32:10) :D
 
I wish, cuz LCDs still suffer from the native only resolution or it looks blurry as shit issue.
Honestly, with FSR or DLSS this is a non-issue these days. You don't have to run at a non-native resolution any more to increase performance: all the benefits (and more) of reducing the render resolution, but you still get the pin-sharp native-resolution text, menus, OSD, HUD, whatever...
 
I still think TN panels are bad, though.
Let's be fair, TN has come a long way and is much better now than it once was. It's not IPS or LED kind of nice, but it has become something respectable.

I wish, cuz LCDs still suffer from the native only resolution or it looks blurry as shit issue.
While very true, I would never go back to CRTs. I only offered the example earlier as an illustration of a point.
 
This is a good way to sell poor/low-cost panels at a high price. Just slap on some outrageous refresh rate and a big price tag.
I still think TN panels are bad, though.
TN still has its flaws but I doubt these are poor or low-cost panels.
I'd never volunteer to use one, but some of the best TN gaming panels actually had reasonable viewing angles and colour accuracy. The gamut was a little low, but that's also true of many half-decent gaming IPS and VA screens these days.
 
You.... you didn't read any of those, did you? Truly?

Well, let's go through them then and see if there is any stock in any of them (TL;DR: there is not).

First link: an irrelevant discussion about persistence of vision for video, and it talks about the lower bound of required frame rate. According to it, with exactly zero sources, it's not until you get lower than 12 frames per second that our brain finds it a bit unbelievable. Yeah... no, my eyes cry far before that. It's also unrelated, since the lowest acceptable framerate is not the same as the benefit you can draw from a higher framerate. Two different things.

Second link is a random toy, no sources, discusses nothing of relevance.

Third one is Wikipedia, which is in fact not a source. Again, that's related to persistence of vision, which is about fooling our brain into believing a flicker of something is actually solid. Which is still unrelated to this: solid or not is not what we are after with higher refresh rates. It's whether we can detect a difference, and that's not the same as persistence of vision.

Fourth one is 100 years old, and it in fact contradicts some of the previous links, so I have no idea why you are linking to it. Or well, I do: it's because you did in fact not read it, you just googled and picked some links that sounded relevant when in fact they're not. Though I do love the old-schooly language used.

And the fifth and last one is a funny one. It's actually about creating something low-powered, which means the fewest possible updates, staying on as little as possible to fool our head into thinking something is there, and they push out this funny line:
"Due to the fact that human eyes can only render about 10 images per second, the fast spinning LEDS seem like a solid display."
Page 11312.


Yeah, it's easy to say this. If that's what you take as reasonable arguments to support your case, ignorance is definitely the correct term.
Persistence of vision is not the same thing as the ability to detect changes. They are, in fact, opposite things. What you should be looking at is the shortest amount of time something needs to be displayed for our brain to register a change, which is the exact opposite of persistence of vision. What we are after is not to fool our heads into believing it's a solid image or a fluid movie, but edge-case change detection. And the fact that you throw a bunch of nigh-useless links about persistence of vision at me to support your argument says a lot about how you've missed the point by about the broad side of a barn.


Not sure I see a useful reason for a 500Hz display, unless maybe it's used for 3D or something, so it can flicker between two fields with refresh rate to spare.
 
Absolutely, yes. Alienware already make 300Hz and 360Hz displays, I've hooked up a weird laboratory-grade digital oscilloscope before that was technically a 320x320 monochrome display in W7 with 1200Hz as an option IIRC (Might have been 12KHz, LOL).

As for the "can people see the difference beyond x Hz?" discussion, my opinion is that the higher the refresh rate gets, the less relevant the answer becomes because it's subject to diminishing returns. For me, visual fluidity occurs at about 85Hz and I can feel/notice the difference between 85Hz and 120Hz in side-by-side testing but that's about it. I've not used a 360Hz display but I'd be lying if I said I could tell the difference between 240Hz and 120Hz on the same 240Hz monitor. I feel like once you are getting triple-digit framerates it's time to worry about other things.

I have a 165Hz display but choose to run it at 120Hz because to me that's plenty fast enough and it means my GPU can run cooler/quieter and I don't have to faff with Factorio that runs at a locked 60Hz and stutters a bit with G-Sync at 165Hz.
I think it's not only a matter of "can you see the difference", but also one of "do you even care". I'm actually doing some testing regarding these questions, so I temporarily swapped the 2070 in my rig to my spare passive 1050 Ti. My framerate in Mass Effect: Andromeda dropped from a fixed 120 to between 30 and 50. Yet, my overall experience improved, because my PC is so much quieter now! :D

Let's be fair, TN has come a long way and is much better now than it once was. It's not IPS or LED kind of nice, but it has become something respectable.
TN still has its flaws but I doubt these are poor or low-cost panels.
I'd never volunteer to use one, but some of the best TN gaming panels actually had reasonable viewing angles and colour accuracy. The gamut was a little low, but that's also true of many half-decent gaming IPS and VA screens these days.
I haven't had a TN monitor for a while, so I'll take your word for it. :) I still wouldn't want one, though.
 
Is it a 6-bit or 8-bit TN panel, though? Very limited colour space.
 
Ok ok, going past 360Hz is completely pointless for 99.9% of people
Going past 100Hz is pointless for 99% of people too ;)
 
Good stuff. But it doesn't explain what happens after extended exposure to higher refresh rates (from what I read). I think once you see 500Hz, you can become accustomed to it and eventually tell the difference from lower-Hz monitors in a side-by-side comparison. It's a theory, since something like this has not been tested.
 
Imo 24 is a bit small and 32 is a bit too big, but 27 is the best in between size.

24 is large, 32 is extremely large.

Do you remember the CRT times when we used 13-inch, 15-inch, 17-inch and very rarely 19-inch?

Edit: The human eye can see movement in the surrounding nature with unlimited detail and an unlimited refresh rate.

If you launch a 10,000 Hz display and put it next to a 360 Hz one and a 500 Hz one.
 
You.... you didn't read any of those, did you? Truly?

Well, let's go through them then and see if there is any stock in any of them (TL;DR: there is not).

First link: an irrelevant discussion about persistence of vision for video, and it talks about the lower bound of required frame rate. According to it, with exactly zero sources, it's not until you get lower than 12 frames per second that our brain finds it a bit unbelievable. Yeah... no, my eyes cry far before that. It's also unrelated, since the lowest acceptable framerate is not the same as the benefit you can draw from a higher framerate. Two different things.

Second link is a random toy, no sources, discusses nothing of relevance.

Third one is Wikipedia, which is in fact not a source. Again, that's related to persistence of vision, which is about fooling our brain into believing a flicker of something is actually solid. Which is still unrelated to this: solid or not is not what we are after with higher refresh rates. It's whether we can detect a difference, and that's not the same as persistence of vision.

Fourth one is 100 years old, and it in fact contradicts some of the previous links, so I have no idea why you are linking to it. Or well, I do: it's because you did in fact not read it, you just googled and picked some links that sounded relevant when in fact they're not. Though I do love the old-schooly language used.

And the fifth and last one is a funny one. It's actually about creating something low-powered, which means the fewest possible updates, staying on as little as possible to fool our head into thinking something is there, and they push out this funny line:
"Due to the fact that human eyes can only render about 10 images per second, the fast spinning LEDS seem like a solid display."
Page 11312.


Yeah, it's easy to say this. If that's what you take as reasonable arguments to support your case, ignorance is definitely the correct term.
Persistence of vision is not the same thing as the ability to detect changes. They are, in fact, opposite things. What you should be looking at is the shortest amount of time something needs to be displayed for our brain to register a change, which is the exact opposite of persistence of vision. What we are after is not to fool our heads into believing it's a solid image or a fluid movie, but edge-case change detection. And the fact that you throw a bunch of nigh-useless links about persistence of vision at me to support your argument says a lot about how you've missed the point by about the broad side of a barn.


Not sure I see a useful reason for a 500Hz display, unless maybe it's used for 3D or something, so it can flicker between two fields with refresh rate to spare.
Try actually READING the citations, including the reference material. Your failure to understand the context of the reference material is not a failure of the citations. Either provide citations that support your argument and contradict what I have provided, or put a cork in your cake hole.
(For the record, folks, THIS is one of the reasons I don't bother with citations most of the time. All people do is nit-pick and marginalize with BS and ramblings to fit their narrow agenda.)

Good stuff. But it doesn't explain what happens after extended exposure to higher refresh rates (from what I read). I think once you see 500Hz, you can become accustomed to it and eventually tell the difference from lower-Hz monitors in a side-by-side comparison. It's a theory, since something like this has not been tested.
A lot of testing has been done. The problem is that the results are as varied as the participants. One person's visual acuity is very much not the same as another person's, and it depends on many factors. Some people can see individual frames up to 100Hz and some can only tell the difference in framerates. There is something to be said for becoming accustomed to certain stimuli; however, that does not mean the human eye can actively differentiate between one rate that is beyond its physical limits and another.

However, the medical science of how fast the cones and rods in the eye can respond to photons is known, and the limit of how fast the optic nerve can respond is also known. The science of human perception of motion and framerates is known. Pushing screen frame times beyond the limit of human perception is a waste of time, money and effort.

This 500hz nonsense is just that, nonsensical.

Imo 24 is a bit small and 32 is a bit too big, but 27 is the best in between size.
I'll agree with this.
 
JFC on a bike, that's just outright lies right there:

[attached screenshot]

Yes, the average latency of, say, two frames of screen buffer plus input lag at 500Hz will be lower than at 144Hz, something like an average of 3ms at 500Hz compared to 10ms at 144Hz, but you can eyeball the milliseconds of delay in that video from the number of ~7ms frames on the 144Hz panel at the bottom. It's clearly 7-8 of the 144Hz frames behind the 500Hz panel, which is 50ms+ of input lag.

I'm sorry, what 144Hz gaming monitor on the market has 50ms of input lag!? Asus and Nvidia are making easily-disproven marketing lies to try and sell something that nobody needs. It's even worse than that: Nvidia literally created the most popular tools to measure display lag (LDAT hardware, which was eventually released to consumers as Nvidia REFLEX), and now they are making a video that completely undermines those efforts by faking results and failing to use the specific tool designed and sold for this one purpose!
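For anyone who wants to sanity-check that eyeballed figure, here's the arithmetic as a small Python sketch (the 7-8 frame count comes from the post above, not from any measurement of mine):

```python
# Back-of-the-envelope check of the latency claim in the promo video.
def frame_time_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

t_144 = frame_time_ms(144)  # ~6.94 ms per refresh
t_500 = frame_time_ms(500)  # 2.00 ms per refresh

# If the 144Hz panel trails the 500Hz panel by 7-8 of its own frames,
# that implies roughly this much extra delay:
for frames_behind in (7, 8):
    print(f"{frames_behind} frames at 144Hz = {frames_behind * t_144:.1f} ms")

# 7 frames = ~48.6 ms, 8 frames = ~55.6 ms -- far more lag than any
# half-decent 144Hz gaming monitor actually adds.
```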
 
Completely sidestepping the argument of whether human vision can perceive the difference between 500 and 300 Hz, how many games are capable of running at those framerates?

Depends on the resolution. The panel above is 1080p. A 3080/RX6800 and above can easily push most games above 300fps, especially if some of the settings are turned down/off. So whether or not a PC can output up to 500fps is academic.

Maybe it's not so much about whether a particular CPU and GPU combo can push FPS to certain heights... but did the developers of said games even design the game engine to produce frame rates that high?

For example, Doom 2016 is known to run well on hardware that isn't bleeding edge, but the game is capped at 200 FPS.

So my point is that it's only going to benefit games designed with high frame rates in mind, i.e. uncapped engines. Even games that benefit from high FPS, like a fast first-person shooter such as Doom, may have engine limits in place. Though I would call 200 FPS a high frame rate, I suppose some "e-sports gamer" (christ, that term makes me cringe just typing it :fear:) would think it is sub-par.
 
Depends on the resolution. The panel above is 1080p. A 3080/RX6800 and above can easily push most games above 300fps, especially if some of the settings are turned down/off. So whether or not a PC can output up to 500fps is academic.
Games can peak at 500 FPS, but actually holding a stable 2ms frametime is an entirely different beast.

Even the fastest CPU and RAM config run at bench-stable settings (read: probably going to crash intermittently) can't do that today, unless you go very far back in time where the entire game's memory map fits into L3 cache. :- )
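As a rough illustration of why "average FPS" and "a stable 2ms frametime" are different things, here is a toy example (Python, made-up frame times purely for demonstration):

```python
# Average FPS can look great while individual frames still blow the 2 ms
# budget that a 500 Hz panel needs for every single refresh.
frame_times_ms = [1.8, 1.9, 2.0, 1.7, 5.5, 1.8, 1.9, 6.2, 1.8, 1.9]  # hypothetical capture

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
over_budget = sum(t > 2.0 for t in frame_times_ms)

print(f"average: {avg_fps:.0f} fps")                   # still well over 350 fps
print(f"frames over the 2 ms budget: {over_budget}")   # but several refreshes get missed
```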
 
Try actually READING the citations, including the reference material. Your failure to understand the context of the reference material is not a failure of the citations. Either provide citations that support your argument and contradict what I have provided, or put a cork in your cake hole.

Because you are arguing the wrong thing. What you are on about is how little is required to simulate smooth motion from static images. It says nothing about the benefits of higher rates, the upper limits of them, or anything of the like. And the fact that you do not notice this does not make much of an argument, because you're talking about the complete opposite.
How many frames per second you need for motion to look fluid is not the same as being able to notice a change or improvement, and it is also not the same as the shortest frame a human eye would notice. And that is what matters for something like this.

Of course there are indeed studies showing that we can perceive rates far higher than one would think, with some viewers going above 500Hz and still noticing flickering. Granted, edge cases. Point is, it's possible.

Or, for those too lazy to read:
"all viewers saw flicker artifacts over 200 Hz and several viewers reported visibility of flicker artifacts at over 800 Hz. For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate."


There is a difference between the lowest framerate required to fool us into believing something is fluid and the shortest amount of time something can be displayed for us to take notice. Then there is also the problem of defining at what rates we can practically identify what it is we see. But the whole argument based around persistence of vision for video is not relevant to this. We know the human eye can detect changes faster than what is required to make a video look fluid. The question would be at what point the increased rate stops being useful, but your whole argument is based on the wrong point, which was my point; not that this 500Hz monitor specifically would be a super useful thing to have. You said that at ~300-320Hz it becomes irrelevant, and that this is a clearly defined limitation and a fleshed-out field of study. Fine then, where are the sources for that? That's what I want to read. Because the sources I find show that there are indeed improvements with higher refresh rates, as they help to imitate something from reality rather than a pancake video. This is why the goal is to aim for monitors in the kilohertz range: blur free, flicker free, strobe free, where you would not be able to tell the difference between a monitor and reality. For that you need thousands of updates per second. Yet you provide sources about... 24fps video and ones that state our eyes render at 10 frames per second. I mean, really?
Or, you know, take a read of the many reasons why: https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/

There is more to this than just individual "frames".
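For what it's worth, the sample-and-hold argument from that Blur Busters piece is easy to put numbers on. A minimal sketch (Python, with a hypothetical panning speed) of how much an object smears while a single frame is held on screen:

```python
# On a sample-and-hold display, perceived motion blur is roughly the distance
# your eye tracks an object during the time one frame stays on screen.
scroll_speed_px_per_s = 2000  # hypothetical fast pan, e.g. a quick flick in an FPS

for refresh_hz in (60, 144, 240, 500, 1000):
    hold_time_s = 1 / refresh_hz
    blur_px = scroll_speed_px_per_s * hold_time_s
    print(f"{refresh_hz:>4} Hz: ~{blur_px:.1f} px of smear")

# ~33 px of smear at 60 Hz shrinks to ~2 px at 1000 Hz, which is the reasoning
# behind the kilohertz-range goal mentioned above.
```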
 
Of course there are indeed studies showing that we can perceive rates far higher than one would think, with some viewers going above 500Hz and still noticing flickering. Granted, edge cases. Point is, it's possible.
Or, for those too lazy to read:
"all viewers saw flicker artifacts over 200 Hz and several viewers reported visibility of flicker artifacts at over 800 Hz. For the median viewer, flicker artifacts disappear only over 500 Hz, many times the commonly reported flicker fusion rate."
Did you actually READ the article you cited? (Yeah, I can do that too!)
If so, did you contextually understand it? There is a difference between flicker and movement perception. The article clearly states that fact. What you fail to realize is that you just offered more evidence to support the factual information I provided earlier.

Thank You.
 
Almost correct. The human eye cannot perceive individual frames of animation past 80Hz, but it also cannot perceive a difference in framerate beyond about 300Hz to 320Hz.

This is because of a WELL documented and proven physical limitation known as "persistence of vision". This condition is why we all easily enjoyed CRT screens BITD without actually seeing the individual scan lines. Even at the 50Hz PAL standard, the human eye cannot perceive the scan lines of the electron gun. And before anyone says "It's because of the phosphor glow effect", no it isn't. High-speed cameras have already debunked that nonsense.

This is why 300Hz, 480Hz and 500Hz panels are a waste. They cannot help the human eye see faster, even if the screen can display a higher framerate.

People, if you need a fast refresh rate, get yourselves a high quality screen that can do between 180Hz and 240Hz and call it a day. Anything faster is snake oil and a waste on your eyes.


Take your own advice. He wasn't far off the mark.


Above a certain framerate, that is a myth.
Olá Lex.
I see all the explanations you gave as true, but I want to add a simple factor, which is perception. A few days ago I spoke to one of the NVIDIA representatives here in my country (Brazil) about whether there is already any advantage to 240Hz even if the game only delivers 100 fps, or whether it would be an unnecessary expense and I'd be better off just upgrading my panel to 120Hz. He replied that at 240Hz I would indeed have an advantage, because the pixels update faster: as soon as the card delivers the next frame, the 240Hz monitor is ready to show it milliseconds sooner than a 120Hz one would be.

So from what I learned, this is the only advantage of the 500Hz monitor: even if our eyes cannot fully perceive anything beyond 300Hz, the image being ready and delivered as soon as possible still helps us perceive the action in the game as soon as possible.
 
Olá Lex.
I see all the explanations you gave as true, but I want to add a simple factor, which is perception. A few days ago I spoke to one of the NVIDIA representatives here in my country (Brazil) about whether there is already any advantage to 240Hz even if the game only delivers 100 fps, or whether it would be an unnecessary expense and I'd be better off just upgrading my panel to 120Hz. He replied that at 240Hz I would indeed have an advantage, because the pixels update faster: as soon as the card delivers the next frame, the 240Hz monitor is ready to show it milliseconds sooner than a 120Hz one would be.

So from what I learned, this is the only advantage of the 500Hz monitor: even if our eyes cannot fully perceive anything beyond 300Hz, the image being ready and delivered as soon as possible still helps us perceive the action in the game as soon as possible.
Hi to you!

While this is a good point, higher refresh rates above human perception will not be useful to anyone.
 
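For reference, the "frame gets picked up sooner" point from the quoted post is just scan-out arithmetic. A simplified sketch (Python, fixed refresh rate, ignoring VRR and scan-out time):

```python
# With a fixed refresh rate and no VRR, a frame that finishes rendering at a
# random moment waits, on average, about half a refresh interval before the
# panel can start displaying it.
def avg_wait_ms(refresh_hz: float) -> float:
    refresh_interval_ms = 1000 / refresh_hz
    return refresh_interval_ms / 2

for hz in (120, 240, 500):
    print(f"{hz} Hz: average wait ~{avg_wait_ms(hz):.2f} ms, worst case ~{1000 / hz:.2f} ms")

# Even if the game only renders 100 fps, a 240 Hz or 500 Hz panel picks each
# finished frame up sooner than a 120 Hz panel would.
```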
Let's be fair, TN has come a long way and is much better now than it once was. It's not IPS or LED kind of nice, but it has become something respectable.


While very true, I would never go back to CRTs. I only offered the example earlier as an illustration of a point.
There is no "LED" kinda type of panel. It's the backlight (replaced the CCFL) which have nothing to do with LCD tech - TN/VA/IPS.
Is it a 6-bit or 8-bit TN panel, though? Very limited colour space.
I bet it will be native 8-bit (no FRC) like most high-end TN monitors nowadays.
Yeah, remember when they were the first to introduce 18-bit performance panels? Is this one dropped down to 12-bit now?
I'm talking about QC issues.
 