Monday, January 31st 2022
BOE Creates 27-Inch Full HD Display With 500 Hz Refresh Rate
If you thought your 144 Hz monitor sounded fancy, think again. Beijing Oriental Electronics Group Co., Ltd., or BOE for short, has announced that it has designed and manufactured a 27-inch Full HD display with a refresh rate of 500 Hz. No, that is not a typo; the company really has built a display with that refresh rate. This technological marvel features an 8-lane eDP connection paired with a 1 ms response time and true 8-bit color. While such a refresh rate may not benefit every AAA title, players of CS:GO and DOTA 2 are the likely targets. Alongside the 500 Hz monitor, BOE also announced a 110-inch 8K 120 Hz panel. You can read more in the company statement below.
Source:
via VideoCardz
BOE (Machine Translation from Chinese): With years of technology accumulation, BOE has made important breakthroughs in oxide semiconductor display technology, overcoming industry problems such as copper (Cu) being prone to diffusion, oxidation, and over-etching, and is the first in the industry to achieve mass production of copper interconnect stack structures. By integrating high refresh rate, high resolution, low power consumption oxide display technology and breaking the foreign monopoly, BOE continues to launch a series of high-end technologies and products such as low power consumption, ultra-narrow bezel, 500 Hz+ gaming displays, super-sized 8K oxide 120 Hz panels, and variable refresh rate displays. At the same time, great breakthroughs have been made in the research and development of high-mobility (30+ cm²/V·s) oxide technology, laying a technical foundation for subsequent performance improvements in high-end products.
65 Comments on BOE Creates 27-Inch Full HD Display With 500 Hz Refresh Rate
If it doesn't bleed, people will simply perceive trails like text in motion. Truth be told, that's not really annoying. It's not tiring on the eye and you're not reading moving text either.
amo.net/NT/02-21-01FPS.html
amo.net/nt/05-24-01FPS.html
No citations were given, and I cannot find any USAF documentation to verify it.
While exceptions to the rule exist, the fact remains that for most of the human race, the eye cannot perceive individual frames of animation above 80 Hz and generally cannot perceive a difference in frame rate above 220 Hz.
While I was unable to find the university study I read years ago, I did find an article in a science publication with data on the subject that supports my statement, while also showing that exceptions to the general rule do occur.
www.nature.com/articles/srep07861
Also found this, which details 13-millisecond perception, equivalent to a little over 75 Hz:
news.mit.edu/2014/in-the-blink-of-an-eye-0116
This paper also details human visual perception:
opg.optica.org/jdt/viewmedia.cfm?uri=jdt-12-11-1372&seq=0
The way we see things is very complicated but can be understood once the principles of vision are clear (pun intended).
www.healthline.com/health/human-eye-fps#how-many-fps-do-people-see

500 Hz is nearly 7 times faster than 75 Hz; I haven't seen any evidence that it would be physically possible for a human to process visual information at such a high rate.

If you're going to post a quote, have the decency to at least link to its source, mkay? amo.net/NT/02-21-01FPS.html, followed by amo.net/nt/05-24-01FPS.html - although reading that load of horseshit, I kinda understand why you didn't.
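For context, here's a minimal sketch of the frame-time arithmetic behind the 75 Hz and 500 Hz figures quoted above; it's just unit conversion, not new data:

```python
# Convert refresh rates to frame times and compare against the ~13 ms
# per-image perception figure cited from the MIT article above.
for hz in (60, 75, 144, 240, 500):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")

# 13 ms per image corresponds to roughly 1000 / 13, i.e. about 77 Hz,
# while 500 Hz leaves only 2 ms per frame.
print(f"13 ms per image ~= {1000 / 13:.0f} Hz")
```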
But I'm not advocating running screens at 500+ Hz; I think that would be a waste of resources. It's much more important to fine-tune software (the OS kernel, drivers, and the games themselves) to provide better frame time consistency. And even then, I'm not convinced running >200 Hz would be noticeable.
Again, with mismatched timings, such as a game server's tick landing right after your screen refreshes, you miss it, but a faster refresh rate will display it.
Let's say we have 20 people on the server with a magical fixed ns ping (it's a LAN party on fiber optic).
They all have 16.6 ms, 60 Hz monitors (the 1 ms marketing etc. is another kettle of fish I'll skip here).
One guy has a 500 Hz, 2 ms monitor.
The server has a 15 ms tick rate, because it's Aussie PUBG. (There's a quick sketch reproducing these numbers after the tick breakdown below.)
Tick 1: 15 ms
The 60 Hz users display this at 16.6 ms, 1.6 ms after it happens
The 500 Hz user gets it at 16 ms, 1 ms after it happens
Slim difference, no one would notice
Tick 2: 30 ms
60 Hz users display it at 33.2 ms
The 500 Hz user gets it at 30 ms
Tick 3: 45 ms
60 Hz users display it at 49.8 ms
The 500 Hz user gets it at 46 ms
Tick 4: 60 ms
60 Hz users are at 66.4 ms
The 500 Hz user is bang on 60 ms
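Here's a minimal sketch of that tick-versus-refresh arithmetic, assuming the idealized numbers above (15 ms server ticks, perfectly regular refreshes starting at t = 0, and no other sources of delay):

```python
import math

def display_time(event_ms: float, refresh_interval_ms: float) -> float:
    """Return the time of the first refresh at or after the event."""
    return math.ceil(event_ms / refresh_interval_ms) * refresh_interval_ms

TICK_MS = 15.0  # server tick interval from the example above

for tick in range(1, 5):
    event = tick * TICK_MS
    shown_60 = display_time(event, 1000 / 60)    # ~16.6 ms refresh interval
    shown_500 = display_time(event, 1000 / 500)  # 2 ms refresh interval
    print(f"Tick {tick} at {event:.0f} ms: "
          f"60 Hz shows it at {shown_60:.1f} ms, "
          f"500 Hz shows it at {shown_500:.1f} ms")
```

The 60 Hz times come out as 16.7/33.3/50.0/66.7 ms here rather than 16.6/33.2/49.8/66.4 ms only because the sketch uses the exact 1000/60 ms interval instead of rounding to 16.6 ms; the point is the same.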
Now, yes, this is napkin math, but the point is that with external sources of delay preventing the monitor timing from being perfect, you can end up with very large delays before the image even gets sent to the monitor to display.
In reality these numbers would be all over the place, with ping varying every second, as well as delays from all the other players in the game and THEIR pings to the server - so information arrives chaotically at erratic times.
A 60 Hz user, vs. a user with 1000 FPS and V-Sync off, could have a ~15 ms delay before things even display.
A 60 Hz, 60 FPS user who happens to get 1 ms of network latency at the wrong time can suddenly be 15 ms behind everyone else.
This is napkin math that ignores network latency, input latency, and a dozen other variables, but not accounting for at least this much is just ignorance.
If the human eye can see a single frame's difference in the hundreds, and monitor technology can quite easily produce a 15 ms difference... how many frames fit in that 15 ms? How many chances to get updated visual information vs. the competition?
A network game's end-to-end latency will be something like this:
1) Device input lag (with OS overhead). ~15 ms (if we assume USB HID)
2) Client side tick (before sending to server). If we assume 120 Hz: 1000/120 + 50%* = 12.5 ms
3) Network transfer: Variable based on distance and condition. Let's assume 10 ms (average) for this example.
4) Server tick: 60 Hz seems fairly common, so I assume that. 1000/60 + 50%* = 25 ms
5) Network transfer - same as 3): 10 ms
6) Client side tick (yes, again): 12.5 ms
7a) Rendering at 60 FPS: 1000/60 + 50%* = 25 ms
7b) Rendering at 120 FPS: 1000/120 + 50%* = 12.5 ms
7c) Rendering at 500 FPS: 1000/500 + 50%* = 3 ms
8) Frame pacing: unpredictable
9) Monitor input lag: I assume 5 ms
Total:
Scenario a 60 FPS: 115 ms
Scenario b 120 FPS: 102.5 ms
Scenario c 500 FPS: 93 ms
You decide whether this difference is significant or not. And keep in mind I assumed a low network latency here. I'm also ignoring pixel response time of the monitor, only looking at the monitor's input lag. I'm also assuming V-Sync is not used.
*) 50% is added for average variation whenever two things are not in sync. If a packet/event comes in 0.0002 ms too late, it needs to wait for the next tick.
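Here's a minimal sketch that just adds up those assumed numbers (the stage values and the +50% sync penalty are the assumptions from the list above, not measurements):

```python
# End-to-end latency model from the breakdown above. Each unsynchronized
# stage waits, on average, an extra 50% of its interval, so its cost is
# modeled as 1.5 * (1000 / rate_hz).
def stage_ms(rate_hz: float) -> float:
    return 1.5 * 1000 / rate_hz

def end_to_end_ms(render_fps: float,
                  client_tick_hz: float = 120,
                  server_tick_hz: float = 60,
                  network_one_way_ms: float = 10,
                  input_lag_ms: float = 15,
                  monitor_lag_ms: float = 5) -> float:
    return (input_lag_ms                 # 1) device input lag
            + stage_ms(client_tick_hz)   # 2) client tick before sending
            + network_one_way_ms         # 3) client -> server
            + stage_ms(server_tick_hz)   # 4) server tick
            + network_one_way_ms         # 5) server -> client
            + stage_ms(client_tick_hz)   # 6) client tick again
            + stage_ms(render_fps)       # 7) rendering
            + monitor_lag_ms)            # 9) monitor input lag
            # 8) frame pacing is unpredictable and left out, as above

for fps in (60, 120, 500):
    print(f"{fps:>3} FPS: {end_to_end_ms(fps):.1f} ms")
```

With the same assumptions it reproduces the 115/102.5/93 ms totals above, and changing client_tick_hz, server_tick_hz, or network_one_way_ms shows the kind of impact the list below is getting at.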
So look at the impact if some of the other factors changed, like:
* Increasing the client and/or server side tick rate, e.g. to 200 Hz
* Choosing a game server that's closer
* Using a faster input device, either a device with a special protocol or PS/2.
However, I can also definitely say that I could see absolutely no difference in smoothness on a 240 Hz panel running CS at ~240 FPS compared to 100 Hz/100 FPS.
To reach 1000 Hz, every part of the screen needs to respond within 1 ms (and for 500 Hz, within 2 ms, and so on).
Except, uhh... they measure grey-to-grey.
Some monitors simply can't keep up. That's why VA panels get that black crush/smearing, for example.
I wonder if future OLED variants are the answer, but for now not all high refresh rate monitors can truly display at those rates, so of course you don't always see it.