Monday, January 31st 2022
BOE Creates 27-Inch Full HD Display With 500 Hz Refresh Rate
If you thought that your 144 Hz monitor sounded rather fancy, you will have to think again after seeing this. Beijing Oriental Electronics Group Co., Ltd., or BOE for short, has announced that it has managed to design and manufacture a 27-inch Full HD display with a refresh rate of 500 Hz. No, this is not a typo; the company really has made a display with such a high refresh rate. This technological marvel features an 8-lane eDP connection paired with a 1 ms response time and true 8-bit color depth. While such a refresh rate may not suit every AAA game title, players of CS:GO and Dota 2 are the likely targets. Along with this 500 Hz monitor, BOE also announced a 110-inch 8K 120 Hz panel. You can read more about it in the company statement below.
Source:
via VideoCardz
BOE (Machine Translation from Chinese): With years of accumulated technology, BOE has made important breakthroughs in the field of oxide semiconductor display technology, overcoming industry problems such as copper (Cu) being prone to diffusion, oxidation, and undercutting during etching, and is the first in the industry to achieve mass production of a copper-interconnect stack structure. It has integrated high refresh rate, high resolution, and low power consumption into its oxide display technology, breaking the foreign monopoly, and continues to launch a series of high-end technologies and products such as low-power, ultra-narrow-bezel 500 Hz+ gaming displays, a super-sized 8K oxide 120 Hz panel, and variable-refresh-rate displays. At the same time, great breakthroughs have been made in the research and development of high-mobility (30+ cm²/Vs) oxide technology, laying a technical foundation for subsequent performance improvements in high-end products.
65 Comments on BOE Creates 27-Inch Full HD Display With 500 Hz Refresh Rate
The most important takeaway, though, is that human vision is much more sensitive to the smoothness of motion than to detecting individual frames. So while having >60 Hz is certainly useful, frame rate consistency is even more useful. Years ago, I conducted an experiment rendering at ~60 FPS (on a 60 Hz panel) with stutter in the ~1-2 ms range vs. <0.1 ms, and the difference was easily noticeable. So in order for higher frame rates to be useful, the computer needs to be able to produce the new frames with higher precision. The reason why high frame rates are advantageous is not that details may appear earlier on the screen; it's mostly that it's easier for the brain to filter out what is actually moving. And stutter is the worst enemy of this, as it distracts the brain when processing the image. I know I'm fairly sensitive to stutter, and find it quite straining.
So 500 Hz is not just wasteful because people can't see the difference; it's also a bad idea because it cuts the tolerances for frame rate consistency in half, so you can get to a point where the picture becomes noticeably worse. At 500 Hz there is only 2 ms between frames, and with the timing precision of the Windows scheduler you will struggle to keep good consistency at these rates.
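To put rough numbers on that budget, here is a minimal sketch (the jitter values are assumptions picked for illustration, not measurements) showing how much of each frame interval a given amount of frame-time jitter consumes:

```python
# Frame-time budgets at various refresh rates, and how much of each
# budget a given amount of frame-time jitter eats up.
# The jitter values are illustrative assumptions, not measurements.

REFRESH_RATES_HZ = [60, 144, 240, 500]
JITTER_MS = [0.1, 0.5, 1.0, 2.0]  # hypothetical scheduler/engine jitter

for hz in REFRESH_RATES_HZ:
    frame_time_ms = 1000.0 / hz
    print(f"{hz:>4} Hz -> {frame_time_ms:6.2f} ms per frame")
    for jitter in JITTER_MS:
        share = jitter / frame_time_ms * 100
        note = "  <- jitter alone exceeds the frame budget" if jitter >= frame_time_ms else ""
        print(f"      {jitter:4.1f} ms jitter = {share:5.1f}% of the frame budget{note}")
```

At 60 Hz, 1 ms of jitter is only about 6% of the frame interval; at 500 Hz the same jitter is half of it, which is the tolerance argument above in numbers.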
But I believe no one has addressed the biggest elephant in the room: can games even produce unique frames at this rate?
Modern game engines work at a fixed tick rate, and if you render frames at a higher rate than this, the GPU will just render multiple identical frames, rendering the 500 Hz screen utterly pointless (pun intended).
A few years ago, I remember CS:GO had 120 Hz tick rate (30 Hz server), and 60-100 Hz was fairly typical. I haven't checked the most recent games, but I doubt there are many running at >120 Hz.
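As a toy illustration of that point (a simplified model under assumed rates, not how any specific engine schedules its work): if the simulation only advances at a fixed tick rate and the renderer does not interpolate between ticks, frames presented faster than the tick rate simply repeat the previous state:

```python
# Toy model: a fixed simulation tick rate vs. a faster render rate.
# If the renderer does not interpolate between ticks, every frame drawn
# between two ticks shows the same game state. Rates are illustrative.

TICK_RATE_HZ = 120    # simulation updates per second (assumption)
RENDER_RATE_HZ = 500  # frames presented per second (assumption)

frames = RENDER_RATE_HZ          # one second of frames
unique_states = 0
last_tick = -1

for frame in range(frames):
    t = frame / RENDER_RATE_HZ           # presentation time of this frame
    tick = int(t * TICK_RATE_HZ)         # latest completed simulation tick
    if tick != last_tick:
        unique_states += 1               # this frame shows a new game state
        last_tick = tick

print(f"{frames} frames rendered, {unique_states} unique game states shown")
print(f"{frames - unique_states} frames were duplicates of the previous state")
```

With these assumed numbers, 500 presented frames contain only 120 unique game states; the remaining 380 are repeats.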
A 500 Hz monitor is no use for gamers, but in certain industries it'd be magical... imagine if you were testing high-frame-rate, slow-motion videography?
The 8K 120 Hz panel is definitely made for commercial purposes and not home users; they could literally use it for a small cinema display, or slap it outside buildings like they do in NYC.
Besides, this is just a prototype; we don't know if it's meant to hit retail this decade.
:)
As an example, PUBG used this and it varied per region - after some big fancy upgrades, Americans got a whopping 60 Hz tick rate, while us Aussies got 20 Hz.
Led to a lot of "what shot me, I was behind cover" moments and so on.
Fortnite was 30 Hz, and so on.
It not only varies between games but varies within the match itself... so it'll speed up at the end of the game as fewer players are alive, but run like a dog's ass early on with all 100 players.
The info below is from screencaps of the highlights of this video: PUBG 60Hz Tickrate Update 14 Netcode Analysis - YouTube (charts compare early PUBG vs. updated PUBG; Aussie PUBG is the red bar at the bottom).
(It's almost like they want to save money on the servers)
Now is that relevant here? Not really, because not every game does things this way, and since you can't match your PC's refresh rate to the server's due to distance, a faster monitor refresh rate does give you a higher chance of receiving the visual update before your opponent, provided they don't have a ping advantage.
Examples with math here:
One critical example is how Fortnite keeps the network latency much, much lower despite only running at 30 Hz. Anti-cheat, server location, server power - all sorts of things add up far beyond just tick rate.
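As a rough sketch of the kind of math involved (the tick and refresh rates below are illustrative assumptions, not figures from the video above): a server update that lands at a random point within the panel's refresh interval waits, on average, half that interval before it can be displayed, so a faster panel shaves a small, bounded amount off that wait:

```python
# Average extra wait before a server update can appear on screen,
# assuming the update arrives at a uniformly random point within the
# panel's refresh interval. Rates are illustrative assumptions.

SERVER_TICK_HZ = 30          # e.g. a Fortnite-like server tick (assumption)
REFRESH_RATES_HZ = [60, 144, 240, 500]

server_interval_ms = 1000.0 / SERVER_TICK_HZ

for hz in REFRESH_RATES_HZ:
    refresh_ms = 1000.0 / hz
    avg_wait_ms = refresh_ms / 2         # mean wait for the next refresh
    print(f"{hz:>4} Hz panel: avg display wait {avg_wait_ms:5.2f} ms "
          f"vs. {server_interval_ms:5.2f} ms between server updates")
```

The display-side gain from 240 Hz to 500 Hz is about a millisecond on average, which is small compared to the ~33 ms between server updates in this assumed setup.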
For leisure games like Metro Exodus, God of War, Sekiro, etc., even 120 Hz is more than fast enough, not to mention the hardware needed to reach 100+ Hz in these titles. VRR support in monitors is far more useful than 300+ Hz monitors for most of us.
I also have to agree with Lay-kun that OLED is the tech with the ultra-fast response times, not LCD. It would be interesting to see how 200+ Hz OLEDs perform.
As additional data, 240 Hz means 1 frame every ~4.2 ms, so the latency difference versus 500 Hz (2 ms per frame) is about 2.2 ms. Very difficult to notice.
I also couldn't tell the difference between 75 Hz (except for noticeable flicker) - 60 Hz was noticeable versus the 75 Hz option, though.
After transitioning to both an OLED TV (an LG B7, 120 Hz at 1080p) and a TN panel (1 ms, 1080p, running at an overclocked 75 Hz), I still can't tell the difference between the two!
It's very uncommon to have tick rate tied to frame rate today, at least in real time "precision" games and especially multiplayer games.
You can have a local and a server tick rate (e.g. 120 Hz local and 30 Hz server which used to be the defaults for CS:GO ~5 years ago). And the way this works is the local game simulates the game while waiting for the next server tick, and then corrects any difference when it finally arrives. So in theory, this means that you can see yourself kill an opponent on your screen, only to be immediately "corrected" and killed yourself. Usually this kind of glitching is minimal, but it can certainly be noticeable, especially when watching other players move rapidly.
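A minimal sketch of that idea (client-side prediction with server reconciliation; the tick rates, movement values, and snap-back correction are assumptions for illustration, not any particular engine's code):

```python
# Toy client-side prediction with server reconciliation.
# The client simulates locally every local tick and snaps back to the
# authoritative state whenever a (less frequent) server tick arrives.
# Tick rates and movement values are illustrative assumptions.

LOCAL_TICK_HZ = 120
SERVER_TICK_HZ = 30
TICKS_PER_SERVER_UPDATE = LOCAL_TICK_HZ // SERVER_TICK_HZ

client_x = 0.0   # predicted position shown on screen
server_x = 0.0   # authoritative position on the server
speed = 1.0      # units moved per local tick (assumption)

for tick in range(1, LOCAL_TICK_HZ + 1):          # one second of local ticks
    client_x += speed                             # predict locally every tick
    if tick % TICKS_PER_SERVER_UPDATE == 0:
        # The server disagrees slightly with the prediction (assumed 5% off).
        server_x += speed * TICKS_PER_SERVER_UPDATE * 0.95
        error = client_x - server_x
        client_x = server_x                       # reconcile: snap to server state
        print(f"tick {tick:3}: corrected by {error:+.2f} units")
```

In a real engine the correction would typically be smoothed or replayed from buffered inputs rather than snapped, which is why the glitching is usually minimal.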
Also keep in mind that even if the server tick rate is fairly high, you still have to live with the latency difference, so there will be edge cases where "strange things" happen.
I assume engines like Unreal, id Tech, etc. have similar mechanisms for "latency compensation". Yes, you're starting to get it. Technically there is a very minor latency gain though, or at least for the first of those four frames. So you gain a tiny bit in best-case input lag, but nothing in smoothness. Have you tried just dragging a window quickly around on your screens?
At least I can easily see that on 60 vs. 120/144 Hz.
I do agree that 120 Hz brings other video playback benefits, but it's more than enough for gaming (and overkill for basic desktop use).
You do realize that these sites were dreamed up before monitor makers added Overdrive, right (and that makes even dog-slow VA acceptable for most!)?
A worst-case test failure doesn't mean that you're ever going to notice the difference in the real world.
There are probably hundreds of thousands of developers who spend all day looking at text, not to mention all the people working with documents.
I was actually surprised when I noticed that coding on a high refresh monitor was actually more comfortable (I noticed when switching back). It's certainly noticeable and comfortable, but not anywhere close to a necessity. But like many other factors, like general responsiveness and using a tactile mechanical keyboard, it does help productivity a tiny bit.
I haven't noticed any smearing while scrolling vertically in my last ten years of using desktop LCD displays here at work.
People just have a way of seeing what they want to see...