
BOE Creates 27-Inch Full HD Display With 500 Hz Refresh Rate

Considering that the human eye cannot determine single frames above 80 Hz and cannot perceive a framerate above 220 Hz, a 500 Hz refresh rate is a waste. Give us a quality 240 Hz display and call it a day.
Source please?

The 80 Hz one makes sense to me anecdotally though. I have experienced that when increasing fps up to and beyond 70-80 fps on my 144 Hz screen there is a noticeable increase in the level of smoothness/butteriness.
 
Sure, whatever you say.

Nobody's being dismissive. But this is early adopter stuff, not many will care about it at this point.
And then there's diminishing returns.
Also, it's Chinese. Until someone reviews this properly, we can't tell how many corners were cut.

BOE supplies a lot of displays to major brands.
 
Considering that the human eye cannot determine single frames above 80 Hz and cannot perceive a framerate above 220 Hz, a 500 Hz refresh rate is a waste. Give us a quality 240 Hz display and call it a day.
It's been around a decade since I looked into the research on this topic, and I believe the limit is in the ~200 Hz range as you said. But it is worth mentioning that it's situation dependent and even to some extent varying between individuals.

The most important takeaway though, is that human vision is much more sensitive to smoothness of motion than to detecting individual frames. So while having >60 Hz is certainly useful, frame rate consistency is even more useful. Years ago, I conducted an experiment rendering at ~60 FPS (on a 60 Hz panel) with stutter in the ~1-2 ms range vs. <0.1 ms, and the difference was easily noticeable. So in order for higher frame rates to be useful, the computer needs to be able to produce the new frames with higher precision.

The reason why high frame rates are advantageous is not that details may appear earlier on the screen, it's mostly that it's easier for the brain to filter out what is actually moving. And stutter is the worst enemy of this, as it distracts the brain when processing the image. I know I'm fairly sensitive to stutter, and find it quite straining.

So 500 Hz is not just wasteful because people can't see the difference, it's also a bad idea because it cuts the tolerances for frame rate consistency in half, to the point where the picture can become noticeably worse. At 500 Hz there is only 2 ms between frames, and with the precision of the Windows scheduler you will struggle to keep good consistency at these rates.
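To put rough numbers on that tolerance argument, here's a small sketch (my illustration, not from the post above; the 1 ms jitter figure is an arbitrary assumption):

```python
# Per-frame time budget vs. an assumed 1 ms of frame-pacing jitter.
# The jitter value is a made-up example, not a measurement.
for hz in (60, 144, 240, 360, 500):
    budget_ms = 1000 / hz          # time available per frame
    jitter_ms = 1.0                # assumed scheduling/pacing jitter
    share = 100 * jitter_ms / budget_ms
    print(f"{hz:3d} Hz: {budget_ms:5.2f} ms per frame, "
          f"{jitter_ms:.1f} ms of jitter is {share:4.1f}% of the budget")
```

The same millisecond of mistiming that is barely 6% of the budget at 60 Hz eats half of it at 500 Hz.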

But I believe no one has addressed the biggest elephant in the room; can games even produce unique frames at this rate?
Modern game engines work at a fixed tick rate, and if you render frames at a higher rate than this, the GPU will just render multiple identical frames, rendering the 500 Hz screen utterly pointless (pun intended).
A few years ago, I remember CS:GO had 120 Hz tick rate (30 Hz server), and 60-100 Hz was fairly typical. I haven't checked the most recent games, but I doubt there are many running at >120 Hz.
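For anyone unfamiliar with the fixed-tick pattern being described, here is a bare-bones sketch (a simplification with made-up numbers; real engines are far more involved, and many interpolate between ticks so the extra frames aren't strictly identical):

```python
import time

TICK_RATE = 120                 # simulation ticks per second (example value)
TICK_DT = 1.0 / TICK_RATE

def render(state):
    pass                        # draw whatever is in `state`

def run(render_hz=500, frames=2000):
    state = {"x": 0.0}          # toy game state
    accumulator = 0.0
    last = time.perf_counter()

    for _ in range(frames):
        now = time.perf_counter()
        accumulator += now - last
        last = now

        # The simulation only advances in fixed TICK_DT steps...
        while accumulator >= TICK_DT:
            state["x"] += 5.0 * TICK_DT    # toy physics update
            accumulator -= TICK_DT

        # ...so rendering faster than TICK_RATE, without interpolating
        # between ticks, just redraws the same state several times.
        render(state)
        time.sleep(1.0 / render_hz)
```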
 
This has to be the dumbest thing the tech industry made in quite some time.
It's "future tech" - commercial protoypes

a 500 Hz monitor is of no use for gamers, but in certain industries it'd be magical... imagine if you were testing high frame rate, slow motion videography?


the 8K 120 Hz is definitely made for commercial purposes and not home users; they could literally use that for a small cinema display, or slap it outside buildings like they do in NYC
 
Chinese does not automatically mean it's not true. They make a lot of stuff that checks out. Besides, a 480 Hz prototype panel was demoed nearly 5 years ago:
I didn't say it was not true. But their standards tend to be different from ours. Quite often.
Besides, this is just a prototype, we don't know if it's meant to hit retail this decade.
 
Having switched from a rubbish Samsung Odyssey G7 240 Hz to an LG C1 120 Hz, the C1 is LEAGUES better than the G7. LCD is just a blurry mess and can barely display 60 Hz without blur (or some hacky backlight flickering that doesn't work with G-Sync/FreeSync). LCD either needs to die or substantially improve, otherwise 500 Hz on this display is going to be entirely pointless. The proof will be in the pudding though.
 
One word: why?

Because I like to feel like a champion in Counter-Strike



:)
 
Modern game engines work at a fixed tick rate, and if you render frames at a higher rate than this, the GPU will just render multiple identical frames
Uh what? I haven't looked much into this but I know for sure that Doom Eternal has its tick rate tied to fps. I have seen the console commands relevant to this and speedrunners have to deal with the implications of this when doing their runs.

Having switched from a rubbish Samsung Odyssey G7 240 Hz to an LG C1 120 Hz, the C1 is LEAGUES better than the G7. LCD is just a blurry mess and can barely display 60 Hz without blur (or some hacky backlight flickering that doesn't work with G-Sync/FreeSync). LCD either needs to die or substantially improve, otherwise 500 Hz on this display is going to be entirely pointless. The proof will be in the pudding though.
So you're saying that something like the 175 Hz Alienware OLED is gon be gud?
 
Uh what? I haven't looked much into this but I know for sure that Doom Eternal has its tick rate tied to fps. I have seen the console commands relevant to this and speedrunners have to deal with the implications of this when doing their runs.


So you're saying that something like the 175 Hz Alienware OLED is gon be gud?
Some shitty games use low server tick rates.

As an example, PUBG used this and it varied per region - after some big fancy upgrades Americans got a whopping 60 Hz tick rate, while us Aussies got 20 Hz.
Led to a lot of "what shot me, I was behind cover" moments and so on.
Fortnite was 30 Hz, and so on.

It not only varies between games but varies within the match itself... so it'll speed up at the end of the game as fewer players are alive, but run like a dog's ass early on with all 100 players.
The info below is screencaps from the highlights of this video:
PUBG 60Hz Tickrate Update 14 Netcode Analysis - YouTube
Early PUBG (screenshot)

Updated PUBG (screenshot)

(Aussie PUBG is the red bar at the bottom)
(It's almost like they want to save money on the servers)

Now is that relevant here? Not really, because not every game does things this way, and since you can't match your PC's refresh rate to the server's tick rate due to distance, a faster monitor refresh rate does give you a higher chance of receiving the visual update before your opponent, if they don't have a ping advantage.

Examples with math here:
One critical example is how Fortnite keeps the network latency much lower despite only running at 30 Hz. Anti-cheat, server location, server power: all sorts of things add up far beyond just tick rate.
(screenshot: latency comparison)
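As a rough back-of-the-envelope (my own illustrative numbers, not figures from the video): the tick interval only sets a floor on how stale the server's view can be, and ping usually matters at least as much.

```python
# Worst-case delay contributed by the server tick alone, plus a round trip.
# Tick rates are from the post above; the ping values are made-up examples.
for name, tick_hz, ping_ms in (("Fortnite", 30, 25),
                               ("PUBG updated", 60, 25),
                               ("PUBG AU old", 20, 25)):
    tick_ms = 1000 / tick_hz
    # An action can arrive just after a tick, so it can wait up to one full
    # interval before the server processes it, then still needs the round trip.
    worst_case_ms = tick_ms + ping_ms
    print(f"{name:13s}: up to {tick_ms:4.1f} ms tick wait + {ping_ms} ms ping "
          f"~ {worst_case_ms:.0f} ms before you see the result")
```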
 

The only good thing that refresh rates higher than 240 Hz give is less eye strain because of reduced blur, but besides that we still need proof of any benefit.

Good topic about eye strain https://forums.blurbusters.com/viewtopic.php?t=8446

What helped me contain that eye strain is blocking the constant blue light. I had to buy special glasses that block almost all blue light, and so far they have been my saver. They really work great.

Another article about glasses that block blue light.

Not everyone suffers from eye strain. For example, it doesn't affect me at all.

Source please?
I would if I could remember where I read it, and I can't find it. University study done 15-some-odd years ago. Sorry. This is known science though; if you go looking you'll find it. I know there's been more research done since then.

But it is worth mentioning that it's situation dependent and even to some extent varying between individuals.
The most important takeaway though, is that human vision is much more sensitive to smoothness of motion than to detecting individual frames.
This is true. However, where flat panel displays are concerned, we can't perceive framerate changes above 200-220 Hz. About 8 years ago a few companies were testing out 480 Hz displays with 480 Hz content and comparing them to 60 Hz, 120 Hz and 240 Hz displays. They did this at my local Best Buy. Everyone could see the difference between 60->120->240 Hz. But when they switched to 480 Hz, no one could tell much of a difference, if at all. This is why it never took off. 480 Hz displays have been made, but there's no point as the human eye just can't see it.
 
480 Hz displays have been made, but there's no point as the human eye just can't see it.
The fps needed to take advantage of such a high refresh rate would only be feasible in competitive multiplayer games (and even there it will be a limited improvement over existing 360 Hz). But since those have at most a 128 Hz tick rate, having fps be 4 times the tick rate doesn't really make a lot of sense.

For leisure games like Metro Exodus, God of War, Sekiro etc... even 120 Hz is more than plenty fast, not to mention the hardware needed to reach 100+ Hz in these titles. VRR support in monitors is far more useful than 300+ Hz monitors for most of us.

I also have to agree with Lay-kun that OLED is the tech with the ultra fast response time, not LCDs. It would be interesting to see how 200+ Hz OLEDs perform.
 
I've got the same "fluidity" with G-Sync/FreeSync enabled at ~40-50 fps/Hz, compared to 200 Hz monitors. I tested it right on the spot ;)
 
360 Hz means 1 frame every 2.8 ms. 500 Hz means 1 frame every 2 ms. The difference is 0.8 milliseconds. There is practically no latency advantage to going 500 Hz, in case anyone is wondering. The only usable advantage may be smoothness, if anyone's eyes can see the difference between 360 Hz and 500 Hz.

As additional data, 240 Hz means 1 frame every 4.2 ms, so the latency difference is 2.2 ms. Very difficult to notice.
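The arithmetic above, spelled out (the refresh rates are from the post; nothing else is assumed):

```python
# Frame interval at each refresh rate and the best-case latency difference.
def frame_ms(hz):
    return 1000 / hz

for low, high in ((360, 500), (240, 500)):
    diff = frame_ms(low) - frame_ms(high)
    print(f"{low} Hz = {frame_ms(low):.1f} ms/frame, "
          f"{high} Hz = {frame_ms(high):.1f} ms/frame, "
          f"difference = {diff:.1f} ms")
```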
 
In my CRT days, I couldn't tell the difference between gaming at 85 Hz and 120 Hz.

I also couldn't tell the difference at 75 Hz (except for noticeable flicker) - 60 Hz was noticeable versus the 75 Hz option though.

After transitioning to both an OLED TV (B7, 120 Hz at 1080p) and a TN panel (1 ms, 1080p, running at an overclocked 75 Hz), I still can't tell the difference between the two!
 
Uh what? I haven't looked much into this but I know for sure that Doom Eternal has its tick rate tied to fps. I have seen the console commands relevant to this and speedrunners have to deal with the implications of this when doing their runs.
Adding to what others have said in the meantime;
It's very uncommon to have tick rate tied to frame rate today, at least in real-time "precision" games and especially multiplayer games.
You can have a local and a server tick rate (e.g. 120 Hz local and 30 Hz server, which used to be the defaults for CS:GO ~5 years ago). The way this works is that the client simulates the game locally while waiting for the next server tick, and then corrects any difference when it finally arrives. So in theory, this means that you can see yourself kill an opponent on your screen, only to be immediately "corrected" and killed yourself. Usually this kind of glitching is minimal, but it can certainly be noticeable, especially when watching other players move rapidly.

Also keep in mind that even if the server tick rate is fairly high, you still have to live with the latency difference, so there will be edge cases where "strange things" happen.

I assume engines like Unreal, id Tech, etc. have similar mechanisms for "latency compensation".
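A very simplified sketch of the predict-then-correct idea described above (a toy model, not how CS:GO or any particular engine actually implements it):

```python
# Toy client-side prediction with server reconciliation.
class Client:
    def __init__(self):
        self.position = 0.0
        self.pending = []        # inputs sent to the server but not yet acknowledged

    def local_tick(self, move, seq):
        # Apply the input immediately so the local game feels responsive.
        self.position += move
        self.pending.append((seq, move))

    def on_server_state(self, ack_seq, server_position):
        # The server is authoritative: snap to its state, then re-apply
        # any inputs it hasn't processed yet. If it disagreed, this is
        # the visible "correction".
        self.position = server_position
        self.pending = [(s, m) for s, m in self.pending if s > ack_seq]
        for _, move in self.pending:
            self.position += move

c = Client()
c.local_tick(move=1.0, seq=1)    # shown on your screen right away
c.local_tick(move=1.0, seq=2)
c.on_server_state(ack_seq=1, server_position=0.5)   # server saw it differently
print(c.position)                # 1.5, not the 2.0 the client predicted
```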

The fps needed to take advantage of such a high refresh rate would only be feasible in competitive multiplayer games (and even there it will be a limited improvement over existing 360 Hz). But since those have at most a 128 Hz tick rate, having fps be 4 times the tick rate doesn't really make a lot of sense.
Yes, you're starting to get it. Technically there is a very minor latency gain though, at least for the first of those four frames. So you gain a tiny bit in best-case input lag, but nothing in smoothness.

After transitioning to both an OLED TV (B7, 120 Hz at 1080p) and a TN panel (1 ms, 1080p, running at an overclocked 75 Hz), I still can't tell the difference between the two!
Have you tried just dragging a window quickly around on your screens?
At least I can easily see that on 60 vs. 120/144 Hz.
 
Have you tried just dragging a window quickly around on your screens?
At least I can easily see that on 60 vs. 120/144 Hz.

Haven't noticed that in like at least a decade - most modern notebooks are plenty fast enough for "interactiveness" at 60 Hz (so why a TN screen running without the notebook power consumption limits would be so much worse, I can't imagine).


I do agree that 120 Hz brings other video playback benefits, but it's more than enough for gaming (and overkill for basic desktop use)
 
Have you tried just dragging a window quickly around on your screens? At least I can easily see that on 60 vs. 120/144 Hz.
I recently switched from 120 Hz to 144 Hz and I can tell there is a difference in smoothness between 120 and 144 while scrolling text (not in games though). It really is noticeable.
 
I recently switched from 120 Hz to 144 Hz and I can tell there is a difference in smoothness between 120 and 144 while scrolling text (not in games though). It really is noticeable.
And do you actually ever scroll text in your daily work? Or is it only encountered when you're testing your display at one of these "Motion Clarity Overkill Review+++" sites?

You do realize that these sites were dreamed up before monitor makers added Overdrive, right (and that makes even dog-slow VA acceptable for most!)?

A worst-case test failure doesn't mean that you're ever going to notice the difference in the real world.
 
And do you actually ever scroll text in your daily work? Or is it only encountered when you're testing your display at one of these "Motion Clarity Overkill Review+++" sites?
So, you don't ever scroll on a web page, document, etc.? :eek:

There are probably hundreds of thousands of developers who spend all day looking at text, not to mention all the people working with documents.
I was actually surprised when I noticed that coding on a high refresh monitor was actually more comfortable (I noticed when switching back). It's certainly noticeable and comfortable, but not anywhere close to a necessity. But like many other factors, like general responsiveness and using a tactile mechanical keyboard, it does help productivity a tiny bit.
 
So, you don't ever scroll on a web page, document, etc.? :eek:

There are probably hundreds of thousands of developers who spend all day looking at text, not to mention all the people working with documents.
I was actually surprised when I noticed that coding on a high refresh monitor was actually more comfortable (I noticed when switching back). It's certainly noticeable and comfortable, but not anywhere close to a necessity. But like many other factors, like general responsiveness and using a tactile mechanical keyboard, it does help productivity a tiny bit.
Sorry man, I thought you were talking about some overkill motion testing site (think Blur Busters)

I haven't noticed any smearing while scrolling vertically in my last ten years of desktop LCD display here at work.
 
360 Hz means 1 frame every 2.8 ms. 500 Hz means 1 frame every 2 ms. The difference is 0.8 milliseconds. There is practically no latency advantage to going 500 Hz, in case anyone is wondering. The only usable advantage may be smoothness, if anyone's eyes can see the difference between 360 Hz and 500 Hz.

As additional data, 240 Hz means 1 frame every 4.2 ms, so the latency difference is 2.2 ms. Very difficult to notice.
Reminds me of the time Sony announced their first 4K phone. Journalists were marveling at how sharp the image was when they got to play with the actual unit. And then the Sony representative showed up: "umm, that's the FHD unit, the 4K unit is over here".

People just have a way of seeing what they want to see...
 