Monday, December 11th 2017

VESA Announces the DisplayHDR v1.0 Specification

The Video Electronics Standards Association (VESA) today announced it has defined the display industry's first fully open standard specifying high dynamic range (HDR) quality, including luminance, color gamut, bit depth and rise time, through the release of a test specification. The new VESA High-Performance Monitor and Display Compliance Test Specification (DisplayHDR) initially addresses the needs of laptop displays and PC desktop monitors that use liquid crystal display (LCD) panels.

The first release of the specification, DisplayHDR version 1.0, establishes three distinct levels of HDR system performance to facilitate adoption of HDR throughout the PC market. HDR provides better contrast and color accuracy as well as more vibrant colors compared to Standard Dynamic Range (SDR) displays, and is gaining interest for a wide range of applications, including movie viewing, gaming, and creation of photo and video content.
VESA developed the DisplayHDR specification with the input of more than two dozen active member companies. These members include major OEMs that make displays, graphics cards, CPUs, panels, display drivers and other components, as well as color calibration providers. A list of participating companies is available here.

DisplayHDR v1.0 focuses on LCDs, which represent more than 99 percent of displays in the PC market. VESA anticipates future releases to address organic light emitting diode (OLED) and other display technologies as they become more common, as well as the addition of higher levels of HDR performance. While development of DisplayHDR was driven by the needs of the PC market, it can serve to drive new levels of HDR performance in other markets as well.

Brand Confusion Necessitates Clearly Defined HDR Standard
HDR logos and brands abound, but until now, there has been no open standard with a fully transparent testing methodology. Since HDR performance details are typically not provided, consumers are unable to obtain meaningful performance information. With DisplayHDR, VESA aims to alleviate this problem by:
  • Creating a specification, initially for the PC industry, that will be shared publicly and transparently;
  • Developing an automated testing tool that end users can download to perform their own testing if desired; and
  • Delivering a robust set of test metrics for HDR that clearly articulate the performance level of the device being purchased.
What DisplayHDR Includes
The specification establishes three HDR performance levels for PC displays: baseline (DisplayHDR 400), mid-range (DisplayHDR 600) and high-end (DisplayHDR 1000). These levels are established and certified using eight specific parameter requirements and associated tests, which include:
  • Three peak luminance tests involving different scenarios - small spot/high luminance, brief full-screen flash luminance, and optimized use in bright environments (e.g., outdoor daylight or bright office lighting);
  • Two contrast measurement tests - one for native panel contrast and one for local dimming;
  • Color testing of both the BT.709 and DCI-P3 color gamuts;
  • Bit-depth requirement tests - these stipulate a minimum bit depth and include a simple visual test for end users to confirm results;
  • HDR response performance test - sets performance criteria for backlight responsiveness ideal for gaming and rapid action in movies by analyzing the speed at which the backlight can respond to changes in luminance levels.
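As a rough illustration, the three tiers can be read as a lookup table. The numeric floors below are assumptions paraphrased from VESA's public summary of the spec, not an authoritative copy, and a real certification checks all eight parameters, not just luminance:

```python
# Illustrative sketch of the three DisplayHDR 1.0 tiers. The threshold
# values are assumptions drawn from VESA's public materials.
TIERS = {
    "DisplayHDR 400":  {"peak_nits": 400,  "gamut": "95% BT.709", "bit_depth": 8,  "dimming": "global"},
    "DisplayHDR 600":  {"peak_nits": 600,  "gamut": "90% DCI-P3", "bit_depth": 10, "dimming": "local"},
    "DisplayHDR 1000": {"peak_nits": 1000, "gamut": "90% DCI-P3", "bit_depth": 10, "dimming": "local"},
}

def highest_tier(measured_peak_nits: float):
    """Return the highest tier whose peak-luminance floor the panel meets
    (luminance only -- real certification tests all eight parameters)."""
    best = None
    for name, req in TIERS.items():  # dicts preserve insertion order
        if measured_peak_nits >= req["peak_nits"]:
            best = name
    return best

print(highest_tier(650))  # -> DisplayHDR 600
```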
"We selected 400 nits as the DisplayHDR specification's entry point for three key reasons," said Roland Wooster, chairman of the VESA task group responsible for DisplayHDR, and the association's representative from Intel Corp. for HDR display technology. "First, 400 nits is 50 percent brighter than typical SDR laptop displays. Second, the bit depth requirement is true 8-bit, whereas the vast majority of SDR panels are only 6-bit with dithering to simulate 8-bit video. Finally, the DisplayHDR 400 spec requires HDR-10 support and global dimming at a minimum. With this tiered specification, ranging from baseline to high-end HDR performance levels, PC makers will finally have consistent, measurable HDR performance parameters. Also, when buying a new PC, consumers will be able to view an HDR rating number that is meaningful and will reflect actual performance."

"Developing this specification is a natural expansion of our range of video standards," said Bill Lempesis, VESA executive director. "Moreover, we are the first standards body to develop a publicly available test tool for HDR qualification, utilizing a methodology for the above-mentioned tests that end users can apply without having to invest in costly lab hardware. Most of the tests require only a colorimeter, which many users already own. Ease of testing was a must-have requirement in order to make DisplayHDR a truly viable, consumer-friendly spec."

New products complying with the DisplayHDR specification will be demonstrated at the Consumer Electronics Show (CES), January 9-12, 2018 at the Las Vegas Convention Center South Hall, DisplayPort booth #21066.

25 Comments on VESA Announces the DisplayHDR v1.0 Specification

#1
evernessince
Awesome. We should see many more monitors with HDR now that industry standards have been established. Much better than the Wild West show that the TV market is for HDR.
#2
FordGT90Concept
"I go fast!1!11!1!"
I hope it catches on, I really do.
#3
bubbly1724
"Second, the bit depth requirement is true 8-bit"
Is this a joke? I wonder who influenced VESA to set such low standards.
#4
Zubasa
bubbly1724: "Is this a joke? I wondered who influenced VESA for these low standards."
You'd be surprised by how many 6-bit + FRC monitors are out there. :banghead:
#5
olymind1
Zubasa: "You'd be surprised by how many 6-bit + FRC monitors are out there."
Like DELL's HDRs.

BTW I saw no mention of HDR10(+) or Dolby Vision; is there any correlation between them and DisplayHDR 400-1000?
#6
GenericAMDFan
bubbly1724: "Is this a joke? I wondered who influenced VESA for these low standards."
It's not that bad; for DisplayHDR 600+ they require 10-bit (8-bit native + 2-bit dithering). 10-bit native would have been better, obviously, but IMO it's good enough. You have to consider that this is just version 1.0.
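For readers unfamiliar with FRC: a 6-bit panel has 64 native levels per channel, and frame rate control approximates the 256 levels of true 8-bit by alternating adjacent native levels over successive frames so the eye averages them. A toy sketch (the function below is illustrative, not any vendor's algorithm):

```python
# A 6-bit panel has 2**6 = 64 native levels; true 8-bit has 2**8 = 256.
# FRC fakes the intermediate levels temporally: it alternates the two
# nearest 6-bit levels so the time-averaged output lands in between.
def frc_frames(target_8bit: int, frames: int = 4) -> list[int]:
    """Return `frames` 6-bit levels whose average approximates target_8bit / 4."""
    base, frac = divmod(target_8bit, 4)  # 4 = 2**(8 - 6) native steps per 8-bit unit
    # show `frac` frames one level up, the rest at the base level
    return [min(base + 1, 63)] * frac + [base] * (frames - frac)

seq = frc_frames(130)          # 8-bit level 130 sits between 6-bit levels 32 and 33
avg = sum(seq) / len(seq) * 4  # perceived 8-bit-equivalent level
print(seq, avg)
```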
#7
FordGT90Concept
"I go fast!1!11!1!"
displayhdr.org/performance-criteria/

99% Rec. 709 (same palette as HDTV)
90% DCI-P3 (25% more colors than sRGB)
8-bit DAC
10-bit image processing

So... this standard's purpose is clearly to push for higher brightness (peak brightness levels, black levels, the ability to do local dimming, etc.). Right now, there is a systemic problem: HDR panels need more brightness, but there's inadequate information to know when more brightness is a good thing and when it is a bad thing. DisplayHDR appears to try to get the entire industry on the same page in that regard. It's not aiming for the best colors or the most color data; it's aiming to standardize brightness, which is arguably more important to HDR than color depth or the size of the palette.


I suspect in five years, they'll update the spec to include Rec. 2020 and 10-bit DAC. Not entirely sure why they didn't just add that now as Ultra HDR or something.
#8
bug
bubbly1724: "Is this a joke? I wondered who influenced VESA for these low standards."
I don't see the problem. You have 3 quality levels: lowest is pretty much what's available today, highest is something only OLED displays can do and one more between those two.
Unless you somehow think that defining HDR1000 only would magically spur everyone into producing only high quality, cheap OLEDs.
#9
Deeveo
olymind1: "Like DELL's HDRs. BTW I saw no mention of HDR10(+) or Dolby Vision; is there any correlation between them and DisplayHDR 400-1000?"
It says that HDR10 support is required, no mention of other formats though.
#10
dicktracy
Most PC monitors are still using cheap edge-lit LCD configurations, so they aren't going to reproduce HDR accurately even if you give them 100,000 nits. TVs will still have way better picture quality in either SDR or HDR.
#11
bubbly1724
bug: "highest is something only OLED displays can do"
Completely false. Plenty of 4K TVs starting at $1,000 (50"+) have actual 10-bit panels, and some also have FALD. HDR1000 should have native 10-bit as its lowest requirement. Everyone has been so brainwashed by low-quality IPS computer monitors that they're forgetting VA + proper local dimming is a thing in TVs.

One thing I've always wondered is why HDR displays didn't have a contrast-ratio standard, but then I remembered LG still uses their terrible IPS displays on TVs. Go into any store and compare them to a proper VA panel on a Sony/Samsung set and you can see how washed out everything is. UHD Phase A defined HDR as 13 stops of DR back in 2016, which means well over 3000:1 static contrast ratio, which IPS cannot even hope to achieve. There's no reason computer monitors should lag so far behind TVs at this point. It's not just about cheap LCD configs like TN; IPS inferiority is also a huge issue.
#12
mtcn77
bug: "I don't see the problem. You have 3 quality levels: lowest is pretty much what's available today, highest is something only OLED displays can do and one more between those two. Unless you somehow think that defining HDR1000 only would magically spur everyone into producing only high quality, cheap OLEDs."
Agreed;
- '400' = TN-IPS,
- '600' = VA,
- '1000' = OLED.
#13
bug
bubbly1724: "Completely false. Plenty of 4K TVs starting from (50"+) $1000 have actual 10-bit panels and some also have FALD. [...] It's not just about cheap LCD configs like TN, but IPS inferiority is also a huge issue."
When I said only OLED can do it, I was thinking of max brightness, not the number of colors. LCD doesn't do the required 1,000 cd/m², AFAIK.
#14
bubbly1724
bug: "When I said only OLED can do it, I was thinking of max brightness, not the number of colors. LCD doesn't do the required 1,000 cd/m², AFAIK."
Once again, completely wrong. LCD is far brighter than OLED in large, TV-sized displays. AMOLED is brighter than OLED on phone-sized displays only because Samsung makes good panels. HDR peak brightness is the spec to look at. OLED will also be terrible for computer monitors due to image retention. Rtings only reviewed the E7 as the highest-end OLED model, but LG's current OLED lineup up to the W7 all uses the same panel. Once you add FALD to a VA panel, brightness goes through the roof for LCD-class displays.
www.rtings.com/tv/reviews/lg/e7-oled
www.rtings.com/tv/reviews/sony/z9d
www.rtings.com/tv/reviews/sony/x900e
#15
FordGT90Concept
"I go fast!1!11!1!"
Deeveo: "It says that HDR10 support is required, no mention of other formats though."
HDR10 is Rec. 2020 with 10-bit color depth. DisplayHDR 1000 is way below that (HDTV + 8-bit). Like I said, this is mostly about compliance with existing standards and increasing the brightness.
#16
bug
bubbly1724: "Once again, completely wrong. LCD is far brighter than OLED in large TV sized displays. [...] Once you add in FALD to a VA panel brightness goes through the roof for LCD class displays."
Ok, newer panels got better. But typical LCD panels don't do that. (Fwiw, that Sony LCD costs the same as LG's OLED, but we weren't talking price here.)
#17
bubbly1724
bug: "Ok, newer panels got better. But typical LCD panels don't do that. (Fwiw, that Sony LCD costs the same as LG's OLED, but we weren't talking price here.)"
Again, wrong. Just give it up mate. Even last year's OLEDs are far worse in terms of brightness.
www.rtings.com/tv/reviews/samsung/ks8000
www.rtings.com/tv/reviews/sony/x930d
www.rtings.com/tv/reviews/lg/e6-oled

EDIT: And in case you're still saying LCD only recently got better, here's your OLED from two generations ago.
www.rtings.com/tv/reviews/lg/ef9500
Terrible brightness. OLED gets massive brightness increases every generation, but for TVs they're still far off from LCD with a dedicated backlight system.
And since you did say the Z9D was expensive, here's a far cheaper Sony set from this year that's actually brighter.
www.rtings.com/tv/reviews/sony/x930e
#18
bug
bubbly1724: "Again, wrong. Just give it up mate. Even last year's OLEDs are far worse in terms of brightness. [...]"
Ok, I went back and re-read some stuff, you're right - my memory was failing me. LCD is generally brighter, but OLED has an easier time doing HDR because its deeper blacks don't actually need that much brightness to achieve a wide gamut. The trouble is, to cover everything the human eye can see, we need to achieve both max and min brightness simultaneously.
So yes, my first post wasn't exactly spot-on. But I still don't have a problem with defining three levels of HDR compliance.

Edit: Also, using LG OLEDs as your example is a bit unfortunate, as LG uses some sort of "fake" OLED that further impacts brightness. But competing implementations using "true" OLED aren't that much better.
#19
medi01
OLED is not as bright as LCD, but all that tells you is "it's not about brightness, stupid".
Ultra HD Premium certification accounts for it and sets brightness targets for OLEDs at lower level (their superior blacks let them achieve better contrast).

People buy Samsung's QLEDs over OLED TVs out of sheer ignorance.
#20
bug
medi01: "OLED is not as bright as LCD, but all that it tells you is 'it's not about brightness, stupid'. [...] People buy Samsung's QLEDs over OLED TVs out of sheer ignorance."
Most TVs are sold because of how they're set up in a showroom; this isn't anything particular to QLED.
Crank up the saturation and brightness of the model you want to sell, place it among models set to lower brightness and saturation - problem solved.

But the fault lies in the panels as well. No panel is better than the rest in all aspects, so no matter how informed you are, you still have to compromise when buying. I may be wrong here (because I haven't followed the market too closely), but I think Sony's Triluminos display holds the best promise now and in the near future.
#21
medi01
bug: "But the fault lies in the panels as well. No panel is better than the rest in all aspects"
Not in, perhaps, MEASURABLE aspects (as LCDs dish out a stupid amount of light), but that doesn't mean OLED is not superior to Samsung's faux-QLED.
#22
bubbly1724
bug: "OLED has an easier time doing HDR because its deeper blacks don't actually need that much brightness to achieve a wide gamut. The trouble is, to cover everything the human eye can see, we need to achieve both max and min brightness simultaneously."
That was my whole point with contrast ratios a few posts back. 13 stops of DR (how UHD Phase A defines HDR) is well over a 3000:1 contrast ratio, which means IPS will never display actual HDR.
bug: "LG uses some sort of 'fake' OLED"
Christ, just stop with the constant fake news. There's no such thing as "fake" OLED. There's active matrix (AMOLED) and passive matrix (PMOLED), but they end up being mostly the same. The only difference is which OEM makes the OLED, since there are manufacturing-process differences.

The only thing "fake" in TVs are fake 4K and fake HDR. Fake 4K occurs when an RGBW stripe is implemented improperly (LG does this for LCD but not for OLED). The extra white subpixel in the OLED displays are mainly for brightness. LG's LCD TVs don't give you the extra white subpixel but instead replaces another subpixel. You can read up on that here.
www.techhive.com/article/3104880/smart-tv/how-lg-uses-fuzzy-math-to-label-some-of-its-lcd-tvs-as-4k.html

Fake HDR, on the other hand, is accepting a higher-bit-depth HDR signal but only displaying it on an 8-bit display, as well as other things like not enough contrast ratio. Even with FRC you can't get the full color depth of a true 10-bit panel. For contrast, if the TV only has a 1500:1 static-contrast display, then there's no way to display the 13+ stops of DR that make it HDR. OLED has it easy here since it has infinite contrast; VA is getting there with the aid of local-dimming backlight systems, but IPS will never reach the contrast ratios needed by HDR. The Wikipedia table gives you a rough estimate of how much contrast you need for the stops of DR you actually get. DR is logarithmic, which is why there are such massive jumps in contrast ratio between 10 and 13 stops.
en.wikipedia.org/wiki/High-dynamic-range_imaging
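The stops-to-contrast arithmetic behind these numbers is just powers of two (each stop doubles the light level); a quick sketch:

```python
import math

def stops_to_contrast(stops: float) -> float:
    """Each stop of dynamic range doubles the light level, so the
    required static contrast ratio is 2**stops : 1."""
    return 2.0 ** stops

def contrast_to_stops(ratio: float) -> float:
    """Inverse: how many stops a given static contrast ratio can represent."""
    return math.log2(ratio)

print(stops_to_contrast(13))                 # 8192.0 -> "well over 3000:1"
print(round(contrast_to_stops(1500), 2))     # ~10.55 stops for a 1500:1 panel
```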
#23
mtcn77
FordGT90Concept: "HDR10 is Rec. 2020 with 10-bit color depth. DisplayHDR 1000 is way below that (HDTV + 8-bit). Like I said, this is mostly about compliance with existing standards and increasing the brightness."
That is the key insight through which VESA is pushing the market forward; more brightness improves all aspects of the panel (contrast, saturation, etc.), and it is very easily implemented without any further deliberation on their part. Quite genius, IMO.
#24
bug
@bubbly1724 Actually, it seems LG doesn't use one OLED emitter per color channel, but instead uses white OLEDs with a color mask on top, similar to how LCDs work. For the sake of our discussion, that setup is bound to have lower max brightness, since it puts all the light through one more filter.
#25
bubbly1724
bug: "@bubbly1724 Actually, it seems LG doesn't use one LED per color channel, but instead uses white LEDs with a color mask on top"
Mate, that's the reason why they have an extra white subpixel.