
VESA Announces the DisplayHDR v1.0 Specification

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,233 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
The Video Electronics Standards Association (VESA) today announced it has defined the display industry's first fully open standard specifying high dynamic range (HDR) quality, including luminance, color gamut, bit depth and rise time, through the release of a test specification. The new VESA High-Performance Monitor and Display Compliance Test Specification (DisplayHDR) initially addresses the needs of laptop displays and PC desktop monitors that use liquid crystal display (LCD) panels.

The first release of the specification, DisplayHDR version 1.0, establishes three distinct levels of HDR system performance to facilitate adoption of HDR throughout the PC market. HDR provides better contrast and color accuracy as well as more vibrant colors compared to Standard Dynamic Range (SDR) displays, and is gaining interest for a wide range of applications, including movie viewing, gaming, and creation of photo and video content.



VESA developed the DisplayHDR specification with the input of more than two dozen active member companies. These members include major OEMs that make displays, graphics cards, CPUs, panels, display drivers and other components, as well as color calibration providers. A list of participating companies is available here.

DisplayHDR v1.0 focuses on LCDs, which represent more than 99 percent of displays in the PC market. VESA anticipates future releases to address organic light emitting diode (OLED) and other display technologies as they become more common, as well as the addition of higher levels of HDR performance. While development of DisplayHDR was driven by the needs of the PC market, it can serve to drive new levels of HDR performance in other markets as well.

Brand Confusion Necessitates Clearly Defined HDR Standard
HDR logos and brands abound, but until now, there has been no open standard with a fully transparent testing methodology. Since HDR performance details are typically not provided, consumers are unable to obtain meaningful performance information. With DisplayHDR, VESA aims to alleviate this problem by:
  • Creating a specification, initially for the PC industry, that will be shared publicly and transparently;
  • Developing an automated testing tool that end users can download to perform their own testing if desired; and
  • Delivering a robust set of test metrics for HDR that clearly articulate the performance level of the device being purchased.
What DisplayHDR Includes
The specification establishes three HDR performance levels for PC displays: baseline (DisplayHDR 400), mid-range (DisplayHDR 600) and high-end (DisplayHDR 1000). These levels are established and certified using eight specific parameter requirements and associated tests, which include:
  • Three peak luminance tests involving different scenarios - small spot/high luminance, brief period full-screen flash luminance, and optimized use in bright environments (e.g., outdoor daylight or bright office lighting);
  • Two contrast measurement tests - one for native panel contrast and one for local dimming;
  • Color testing of both the BT.709 and DCI-P3 color gamuts;
  • Bit-depth requirement tests - these stipulate a minimum bit depth and include a simple visual test for end users to confirm results;
  • HDR response performance test - sets performance criteria for backlight responsiveness ideal for gaming and rapid action in movies by analyzing the speed at which the backlight can respond to changes in luminance levels.
"We selected 400 nits as the DisplayHDR specification's entry point for three key reasons," said Roland Wooster, chairman of the VESA task group responsible for DisplayHDR, and the association's representative from Intel Corp. for HDR display technology. "First, 400 nits is 50 percent brighter than typical SDR laptop displays. Second, the bit depth requirement is true 8-bit, whereas the vast majority of SDR panels are only 6-bit with dithering to simulate 8-bit video. Finally, the DisplayHDR 400 spec requires HDR-10 support and global dimming at a minimum. With this tiered specification, ranging from baseline to high-end HDR performance levels, PC makers will finally have consistent, measurable HDR performance parameters. Also, when buying a new PC, consumers will be able to view an HDR rating number that is meaningful and will reflect actual performance."

"Developing this specification is a natural expansion of our range of video standards," said Bill Lempesis, VESA executive director. "Moreover, we are the first standards body to develop a publicly available test tool for HDR qualification, utilizing a methodology for the above-mentioned tests that end users can apply without having to invest in costly lab hardware. Most of the tests require only a colorimeter, which many users already own. Ease of testing was a must-have requirement in order to make DisplayHDR a truly viable, consumer-friendly spec."

New products complying with the DisplayHDR specification will be demonstrated at the Consumer Electronics Show (CES), January 9-12, 2018 at the Las Vegas Convention Center South Hall, DisplayPort booth #21066.
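As a rough, illustrative sketch of the tiering described above (only the minimum peak-luminance targets implied by the tier names are modeled here; the other certified parameters such as contrast, gamut, bit depth and rise time are not), in Python:

```python
# Illustrative sketch only: DisplayHDR tier names double as the minimum peak-luminance
# target in cd/m^2 (nits). Real certification involves eight parameter tests
# (luminance, contrast, color gamut, bit depth, rise time), not just this one number.

DISPLAYHDR_TIERS = {
    "DisplayHDR 400": 400,    # baseline: true 8-bit, HDR10 support, global dimming
    "DisplayHDR 600": 600,    # mid-range
    "DisplayHDR 1000": 1000,  # high-end
}

def highest_tier(measured_peak_nits: float):
    """Return the highest tier whose peak-luminance target the panel meets, if any."""
    qualifying = [name for name, nits in DISPLAYHDR_TIERS.items()
                  if measured_peak_nits >= nits]
    return max(qualifying, key=lambda name: DISPLAYHDR_TIERS[name], default=None)

print(highest_tier(650))   # -> DisplayHDR 600
print(highest_tier(350))   # -> None (below the 400-nit entry point)
```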

View at TechPowerUp Main Site
 
Joined
Jul 13, 2016
Messages
3,279 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Awesome. We should see many more monitors with HDR now that industry standards have been established. Much better than the wild-west show that the TV market is for HDR.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.46/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I hope it catches on, I really do.
 
Joined
Oct 1, 2006
Messages
4,931 (0.74/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
Is this a joke? I wonder who influenced VESA to set these low standards.
You'd be surprised by how many 6-bit + FRC monitors are out there. :banghead:
 
Joined
Jul 21, 2016
Messages
144 (0.05/day)
Processor AMD Ryzen 5 5600
Motherboard MSI B450 Tomahawk
Cooling Alpenföhn Brocken 3 140mm
Memory Patriot Viper 4 - DDR4 3400 MHz 2x8 GB
Video Card(s) Radeon RX460 2 GB
Storage Samsung 970 EVO PLUS 500, Samsung 860 500 GB, 2x Western Digital RED 4 TB
Display(s) Dell UltraSharp U2312HM
Case be quiet! Pure Base 500 + Noiseblocker NB-eLoop B12 + 2x ARCTIC P14
Audio Device(s) Creative Sound Blaster ZxR,
Power Supply Seasonic Focus GX-650
Mouse Logitech G305
Keyboard Lenovo USB
You'd be surprised by how many 6-bit + FRC monitors are out there. :banghead:

Like DELL's HDRs.

BTW, I saw no mention of HDR10(+) or Dolby Vision; is there any correlation between them and HDR 400-1000?
 

GenericAMDFan

New Member
Joined
Oct 17, 2017
Messages
23 (0.01/day)
Is this a joke? I wonder who influenced VESA to set these low standards.

It's not that bad; for DisplayHDR 600 and up, they require 10-bit (8-bit native + 2-bit dithering). Native 10-bit would obviously have been better, but IMO it's good enough. You have to consider that this is just version 1.0.
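To make the dithering point concrete, here is a minimal sketch of how 2-bit FRC lets an 8-bit panel approximate a 10-bit level by averaging over frames (hypothetical values; real panels use more elaborate spatio-temporal patterns):

```python
# Minimal FRC (frame rate control) sketch: an 8-bit panel approximates a 10-bit
# target by alternating adjacent 8-bit levels so the temporal average over a
# 4-frame cycle lands on the 10-bit value. Real panels use fancier
# spatio-temporal dither patterns; this only shows the principle.

def frc_frames(target_10bit: int, cycle: int = 4) -> list:
    base, remainder = divmod(target_10bit, cycle)   # 10-bit value = 4*base + remainder
    return [base + 1] * remainder + [base] * (cycle - remainder)

frames = frc_frames(613)                 # hypothetical 10-bit level 613
print(frames)                            # [154, 153, 153, 153]
print(sum(frames) / len(frames) * 4)     # 613.0 -> the 10-bit level, on average
```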
 

FordGT90Concept

"I go fast!1!11!1!"
https://displayhdr.org/performance-criteria/

99% Rec. 709 (same palette as HDTV)
90% DCI-P3 (25% more colors than sRGB)
8-bit DAC
10-bit image processing

So... this standard's purpose is clearly to push for higher brightness (brightness levels, black levels, the ability to localize dimming, etc.). Right now there is a systemic problem: HDR panels need more brightness, but there is inadequate information to know when more brightness is a good thing and when it is a bad thing. DisplayHDR appears to be trying to get the entire industry on the same page in that regard. It's not aiming for the best colors or the most color data; it's aiming to standardize brightness, which is arguably more important to HDR than color depth or the size of the palette.


I suspect in five years, they'll update the spec to include Rec. 2020 and 10-bit DAC. Not entirely sure why they didn't just add that now as Ultra HDR or something.
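As a quick sanity check on the "25% more colors than sRGB" figure for DCI-P3, here is a rough sketch comparing gamut-triangle areas in the CIE 1976 u'v' chromaticity plane, using the standard published primaries; triangle area is only a crude proxy for "number of colors":

```python
# Rough gamut-size comparison: area of the RGB primary triangle in CIE 1976 u'v'
# chromaticity space. DCI-P3 comes out roughly 25% larger than Rec. 709/sRGB,
# matching the figure above. Area ratio is only a crude proxy for "more colors".

def xy_to_uv(x, y):
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

def gamut_area(primaries_xy):
    (u1, v1), (u2, v2), (u3, v3) = (xy_to_uv(x, y) for x, y in primaries_xy)
    return 0.5 * abs(u1 * (v2 - v3) + u2 * (v3 - v1) + u3 * (v1 - v2))  # shoelace formula

REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B chromaticities
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

print(round(gamut_area(DCI_P3) / gamut_area(REC709), 2))    # ~1.26
```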
 
Last edited:

bug

Joined
May 22, 2015
Messages
13,763 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Is this a joke? I wonder who influenced VESA to set these low standards.

I don't see the problem. You have three quality levels: the lowest is pretty much what's available today, the highest is something only OLED displays can do, and one more sits between those two.
Unless you somehow think that defining only HDR1000 would magically spur everyone into producing nothing but high-quality, cheap OLEDs.
 
Joined
Aug 8, 2015
Messages
114 (0.03/day)
Location
Finland
System Name Gaming rig
Processor AMD Ryzen 7 5900X
Motherboard Asus X570-Plus TUF /w "passive" chipset mod
Cooling Noctua NH-D15S
Memory Crucial Ballistix Sport LT 2x16GB 3200C16 @3600C16
Video Card(s) MSI RTX 3060 TI Gaming X Trio
Storage Samsung 970 Pro 1TB, Crucial MX500 2TB, Samsung 860 QVO 4TB
Display(s) Samsung C32HG7x
Case Fractal Design Define R5
Audio Device(s) Asus Xonar Essence STX
Power Supply Corsair RM850i 850W
Mouse Logitech G502 Hero
Keyboard Logitech G710+
Software Windows 10 Pro
Like DELL's HDRs.

BTW, I saw no mention of HDR10(+) or Dolby Vision; is there any correlation between them and HDR 400-1000?

It says that HDR10 support is required; no mention of other formats, though.
 
Joined
Jul 18, 2017
Messages
575 (0.21/day)
Most PC monitors still use cheap edge-lit LCD configurations, so they aren't going to reproduce HDR accurately even if you give them 100,000 nits. TVs will still have way better picture quality in either SDR or HDR.
 
Joined
Aug 3, 2011
Messages
113 (0.02/day)
the highest is something only OLED displays can do
Completely false. Plenty of 4K TVs (50"+) starting from $1,000 have actual 10-bit panels, and some also have FALD. HDR1000 should have native 10-bit as the lowest standard. Everyone has been so brainwashed by low-quality IPS computer monitors that they're forgetting that VA + proper local dimming is a thing in TVs.

One thing I've always wondered is why HDR displays don't have a contrast-ratio standard, but then I remembered that LG still uses its terrible IPS panels on TVs. Go into any store and compare them to a proper VA panel on a Sony/Samsung set and you can see how washed out everything is. UHD Phase A defined HDR as 13 stops of DR back in 2016, which means well over 3000:1 static contrast ratio, which IPS cannot even hope to achieve. There's no reason computer monitors should be lacking so much compared to TVs at this point. It's not just about cheap LCD configs like TN; IPS inferiority is also a huge issue.
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
I don't see the problem. You have three quality levels: the lowest is pretty much what's available today, the highest is something only OLED displays can do, and one more sits between those two.
Unless you somehow think that defining only HDR1000 would magically spur everyone into producing nothing but high-quality, cheap OLEDs.
Agreed;
- '400' = TN-IPS,
- '600' = VA,
- '1000' = OLED.
 

bug

Completely false. Plenty of 4K TVs (50"+) starting from $1,000 have actual 10-bit panels, and some also have FALD. HDR1000 should have native 10-bit as the lowest standard. Everyone has been so brainwashed by low-quality IPS computer monitors that they're forgetting that VA + proper local dimming is a thing in TVs.

One thing I've always wondered is why HDR displays don't have a contrast-ratio standard, but then I remembered that LG still uses its terrible IPS panels on TVs. Go into any store and compare them to a proper VA panel on a Sony/Samsung set and you can see how washed out everything is. UHD Phase A defined HDR as 13 stops of DR back in 2016, which means well over 3000:1 static contrast ratio, which IPS cannot even hope to achieve. There's no reason computer monitors should be lacking so much compared to TVs at this point. It's not just about cheap LCD configs like TN; IPS inferiority is also a huge issue.
When I said only OLED can do it, I was thinking of max brightness, not number of colors. LCD doesn't do the required 1000 cd/m², AFAIK.
 
Joined
Aug 3, 2011
Messages
113 (0.02/day)
When I said only OLED can do it, I was thinking of max brightness, not number of colors. LCD doesn't do the required 1000 cd/m², AFAIK.
Once again, completely wrong. LCD is far brighter than OLED in large TV-sized displays. AMOLED is brighter than OLED on phone-sized displays only because Samsung makes good panels. HDR peak brightness is the spec to look at. OLED will also be terrible for computer monitors due to image retention. Rtings only reviewed the E7 as the highest OLED model, but everything in LG's current OLED lineup up to the W7 uses the same panel. Once you add FALD to a VA panel, brightness goes through the roof for LCD-class displays.
https://www.rtings.com/tv/reviews/lg/e7-oled
https://www.rtings.com/tv/reviews/sony/z9d
https://www.rtings.com/tv/reviews/sony/x900e
 

FordGT90Concept

"I go fast!1!11!1!"
It says that HDR10 support is required; no mention of other formats, though.
HDR10 is Rec. 2020 with 10-bit color depth. DisplayHDR 1000 is way below that (HDTV + 8-bit). Like I said, this is mostly about compliance with existing standards and increasing the brightness.
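For context on what an HDR10 signal actually carries, here is a minimal sketch of the SMPTE ST 2084 "PQ" EOTF that HDR10 uses, mapping a 10-bit code value to absolute luminance in nits (constants as published in ST 2084; full-range coding assumed for simplicity):

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: maps a normalized
# 10-bit code value to absolute luminance in cd/m^2 (nits), topping out at 10,000.
# Full-range (0-1023) coding is assumed here for simplicity.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf_nits(code_10bit: int) -> float:
    e = (code_10bit / 1023) ** (1 / M2)                 # decode the exponent part
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)  # normalized luminance (0..1)
    return 10000.0 * y                                  # absolute luminance in nits

print(pq_eotf_nits(1023))         # 10000.0 nits at the top code value
print(round(pq_eotf_nits(512)))   # ~93 nits: PQ spends most codes on darker tones
```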
 

bug

Once again, completely wrong. LCD is far brighter than OLED in large TV-sized displays. AMOLED is brighter than OLED on phone-sized displays only because Samsung makes good panels. HDR peak brightness is the spec to look at. OLED will also be terrible for computer monitors due to image retention. Rtings only reviewed the E7 as the highest OLED model, but everything in LG's current OLED lineup up to the W7 uses the same panel. Once you add FALD to a VA panel, brightness goes through the roof for LCD-class displays.
https://www.rtings.com/tv/reviews/lg/e7-oled
https://www.rtings.com/tv/reviews/sony/z9d
https://www.rtings.com/tv/reviews/sony/x900e
Ok, newer panels got better. But typical LCD panels don't do that. (Fwiw, that Sony LCD costs the same as LG's OLED, but we weren't talking price here.)
 
Joined
Aug 3, 2011
Messages
113 (0.02/day)
Ok, newer panels got better. But typical LCD panels don't do that. (Fwiw, that Sony LCD costs the same as LG's OLED, but we weren't talking price here.)
Again, wrong. Just give it up, mate. Even last year's OLEDs are far worse in terms of brightness.
https://www.rtings.com/tv/reviews/samsung/ks8000
https://www.rtings.com/tv/reviews/sony/x930d
https://www.rtings.com/tv/reviews/lg/e6-oled

EDIT: And in case you're still saying LCD only recently got better, here's an OLED from two generations ago.
https://www.rtings.com/tv/reviews/lg/ef9500
Terrible brightness. OLED gets massive brightness increases every generation, but for TVs they're still far off from LCD plus a dedicated backlight system.
And since you did say the Z9D was expensive, here's a far cheaper Sony set from this year that's actually brighter.
https://www.rtings.com/tv/reviews/sony/x930e
 
Last edited:

bug

Again, wrong. Just give it up, mate. Even last year's OLEDs are far worse in terms of brightness.
https://www.rtings.com/tv/reviews/samsung/ks8000
https://www.rtings.com/tv/reviews/sony/x930d
https://www.rtings.com/tv/reviews/lg/e6-oled

EDIT: And in case you're still saying LCD only recently got better, here's an OLED from two generations ago.
https://www.rtings.com/tv/reviews/lg/ef9500
Terrible brightness. OLED gets massive brightness increases every generation, but for TVs they're still far off from LCD plus a dedicated backlight system.
And since you did say the Z9D was expensive, here's a far cheaper Sony set from this year that's actually brighter.
https://www.rtings.com/tv/reviews/sony/x930e
Ok, I went back and re-read some stuff, you're right - my memory was failing me. LCD is generally brighter, but OLED has an easier time doing HDR because its deeper blacks don't actually need that much brightness to achieve a wide gamut. The trouble is, to cover everything the human eye can see, we need to achieve both max and min brightness simultaneously.
So yes, my first post wasn't exactly spot-on. But I still don't have a problem with defining three levels of HDR compliance.

Edit: Also, your use of LG OLEDs as an example is not very fortunate, as LG uses some sort of "fake" OLED that further impacts brightness. But competing implementations using "true" OLED aren't that much better.
 
Last edited:
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
OLED is not as bright as LCD, but all that tells you is "it's not about brightness, stupid".
The Ultra HD Premium certification accounts for this and sets brightness targets for OLEDs at a lower level (their superior blacks let them achieve better contrast).

People buy Samsung's QLEDs over OLED TVs out of sheer ignorance.
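For what it's worth, the commonly cited Ultra HD Premium thresholds (assumed here purely for illustration: at least 1000 nits peak with at most 0.05 nits black for LCD, or at least 540 nits peak with at most 0.0005 nits black for OLED-class displays) show why the dimmer target is still acceptable once black level is factored in:

```python
# Contrast implied by the two commonly cited Ultra HD Premium paths (figures
# assumed for illustration): a brighter peak with mediocre blacks vs. a dimmer
# peak with near-perfect blacks. The ratio, not the peak alone, is what matters.

lcd_peak, lcd_black = 1000, 0.05        # nits
oled_peak, oled_black = 540, 0.0005     # nits

print(f"LCD path:  {lcd_peak / lcd_black:>12,.0f}:1")    #       20,000:1
print(f"OLED path: {oled_peak / oled_black:>12,.0f}:1")  #    1,080,000:1
```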
 

bug

OLED is not as bright as LCD, but all that tells you is "it's not about brightness, stupid".
The Ultra HD Premium certification accounts for this and sets brightness targets for OLEDs at a lower level (their superior blacks let them achieve better contrast).

People buy Samsung's QLEDs over OLED TVs out of sheer ignorance.
Most TVs are sold because of how they're set up in a showroom; this isn't anything particular to QLED.
Crank up the saturation and brightness of the model you want sold, place it among models set to lower brightness and saturation - problem solved.

But the fault lies in the panels as well. No panel is better than the rest in all aspects, so no matter how informed you are, you still have to compromise when buying. I may be wrong here (because I haven't followed the market too closely), but I think Sony's Triluminos display holds the best promise now and in the near future.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
But the fault lies in the panels as well. No panel is better than the rest in all aspects
Not in MEASURABLE aspects, perhaps (as LCDs dish out a stupid amount of light), but that doesn't mean OLED is not superior to Samsung's faux-QLED.
 
Joined
Aug 3, 2011
Messages
113 (0.02/day)
OLED has an easier time doing HDR because its deeper blacks don't actually need that much brightness to achieve a wide gamut. The trouble is, to cover everything the human eye can see, we need to achieve both max and min brightness simultaneously.
That was my whole point with contrast ratios a few posts back. 13 stops of DR (what HDR is defined as by UHD Phase A) is well over 3000:1 contrast ratio, which means IPS will never display actual HDR.
LG uses some sort of "fake" OLED
Christ, just stop with the constant fake news. There's no such thing as "fake" OLED. There's active matrix (AMOLED) and passive matrix (POLED), but they end up being mostly the same. The only difference is which OEM makes the OLED, since there are manufacturing-process differences.

The only things "fake" in TVs are fake 4K and fake HDR. Fake 4K occurs when an RGBW stripe is implemented improperly (LG does this for LCD but not for OLED). The extra white subpixel in the OLED displays is mainly for brightness. LG's LCD TVs don't give you an extra white subpixel but instead replace another subpixel. You can read up on that here.
https://www.techhive.com/article/31...-math-to-label-some-of-its-lcd-tvs-as-4k.html

Fake HDR, on the other hand, is accepting a higher-bit-depth HDR signal but only displaying it on an 8-bit display, as well as other things like not having enough contrast ratio. Even with FRC you can't get the true color depth of a native 10-bit panel. For contrast, if the TV only has a 1500:1 static contrast display, then there's no way to show the 13+ stops of DR that make it HDR. OLED has it easy here since it has infinite contrast; VA is getting there with the aid of local-dimming backlight systems, but IPS will never reach the contrast ratios needed by HDR. The Wikipedia table gives you a rough estimate of how much contrast you need for the stops of DR you actually get. DR is logarithmic, which is why there are such massive jumps in required contrast ratio between 10 and 13 stops.
https://en.wikipedia.org/wiki/High-dynamic-range_imaging
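To put numbers on the "massive jumps" mentioned above, a quick sketch of how the required static contrast scales with stops of dynamic range (contrast = 2^stops):

```python
# Stops of dynamic range are log2(white/black), so the required static contrast
# doubles with every extra stop -- hence the big jump between 10 and 13 stops.

for stops in (10, 11, 12, 13):
    print(f"{stops} stops -> {2 ** stops:>5,}:1")
# 10 stops -> 1,024:1
# 11 stops -> 2,048:1
# 12 stops -> 4,096:1
# 13 stops -> 8,192:1
```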
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
HDR10 is Rec. 2020 with 10-bit color depth. DisplayHDR 1000 is way below that (HDTV + 8-bit). Like I said, this is mostly about compliance with existing standards and increasing the brightness.
That is the key insight through which VESA is pushing the market forward: more brightness improves all aspects of the panel (contrast, saturation, etc.), and it is very easy to implement without any further deliberation on their part. Quite genius, IMO.
 

bug

@bubbly1724 Actually, it seems LG doesn't use one LED per color channel, but instead uses white LEDs with a color mask on top, similar to how LCDs work. For the sake of our discussion, that setup is bound to have lower max brightness since it puts all the light through one more filter.
 