Wednesday, June 7th 2017
NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR
In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it demonstrated the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a bit too eager to prove just how incredible HDR image quality is, considering it fiddled with the SDR screen's settings to widen the divide.
The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the factory default values for brightness, contrast, and even gamma on the SDR monitor, compromising the image quality it could actually convey. Resetting the monitor to its factory values produced a far less muted image on the SDR screen than before, which points to a deliberate attempt to degrade the SDR side of the presentation. Granted, image quality comparisons between SDR and HDR are partly subjective and will vary from viewer to viewer; however, brightness, contrast, and gamma being pushed outside even their factory levels (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
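To give a rough sense of how much a few hidden slider tweaks can mute an SDR picture, here is a minimal Python/NumPy sketch; the brightness, contrast, and gamma values are hypothetical illustrations, not the settings NVIDIA actually used:

```python
# Minimal sketch (hypothetical values): how lowering brightness/contrast and
# raising gamma mutes an 8-bit SDR image. Illustration only, not NVIDIA's settings.
import numpy as np

def adjust(image_8bit, brightness=-20, contrast=0.85, gamma=1.4):
    x = image_8bit.astype(np.float32) / 255.0
    x = np.clip(x * contrast + brightness / 255.0, 0.0, 1.0)  # contrast scale, then brightness offset
    x = np.power(x, gamma)                                    # gamma > 1 darkens midtones
    return (x * 255.0 + 0.5).astype(np.uint8)

# Example: a midtone gray (128) drops to roughly 58 after the adjustment.
print(adjust(np.array([[128]], dtype=np.uint8)))
```

Even these modest-looking offsets flatten midtones considerably, which is why a reset to factory defaults made the SDR panel look so different.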
Source:
Hardware Canucks' YouTube Channel
78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR
In the famous words of Led Zeppelin: "Lying... Cheatin'... Hurtin'... that's all you seem to do"
Already we have several HDR versions, already it's being marketed on panels that have no capability to do anything useful with HDR, and already it's a huge, inflated marketing mess - and it has barely even landed yet.
Display tech > any bullshit that comes after that.
And the standard display tech is still inferior or way too costly (TN/IPS versus OLED).
I'll come back to HDR when OLED is mainstream. Until then, unless you have a full-array local dimming panel, this is a total waste of time.
Well played, Nvidia, well played - you really are rats. Thanks for confirming.
And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.
The only way to show the capabilities of HDR is to use the same monitor and content, preserving every setting except the one being tested, which is toggled on and off between runs. Record the same sequence with the same camera settings, in the same ambient (lighting) conditions, then show a split screen using post-processing magic.
The test was doomed from the start.
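For what it's worth, the split-screen step of that approach is straightforward to put together. Here is a rough sketch using OpenCV; the file names, codec, and matching resolution/frame rate are assumptions for the example, not part of any actual comparison:

```python
# Illustrative sketch: stitch the left half of one capture to the right half
# of another to produce the split-screen comparison described above.
import cv2

cap_a = cv2.VideoCapture("sdr_capture.mp4")  # hypothetical recording, feature off
cap_b = cv2.VideoCapture("hdr_capture.mp4")  # hypothetical recording, feature on

fps = cap_a.get(cv2.CAP_PROP_FPS)
width = int(cap_a.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap_a.get(cv2.CAP_PROP_FRAME_HEIGHT))

out = cv2.VideoWriter("split_comparison.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"),
                      fps, (width, height))

while True:
    ok_a, frame_a = cap_a.read()
    ok_b, frame_b = cap_b.read()
    if not (ok_a and ok_b):
        break
    frame_b = cv2.resize(frame_b, (width, height))   # guard against size mismatch
    split = frame_a.copy()
    split[:, width // 2:] = frame_b[:, width // 2:]  # right half comes from clip B
    out.write(split)

cap_a.release()
cap_b.release()
out.release()
```

Of course, the recording itself still has to be done with identical camera settings and lighting, which is the hard part of any honest demo.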
Haters will quickly jump to the conclusion they did it because they probably suck at HDR*. But everybody with an interest knows HDR content is subtle; it's not as simple as blowing out all the color channels to make your colors look more vivid. For all we know, the content displayed might have looked pretty similar under show floor lighting conditions. I've actually read about someone who was used to oversaturating color channels; when they tried to play HDR content using the same settings, they came to the conclusion that HDR is the same as SDR.
*Nvidia actually introduced HDR (albeit in a different form) in their 7000 series (iirc) over a decade ago.
Edit: Just watched a South Park episode. Something about first world problems.
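As an aside on the "blowing out color channels" point above, a tiny sketch (illustrative values only, not tied to any vendor's demo) shows why cranking saturation on 8-bit SDR content isn't HDR: the channels simply clip, losing detail rather than gaining range:

```python
# Sketch: boosting saturation on an 8-bit SDR pixel clips the dominant channel;
# it does not add the extra luminance/color range that real HDR encoding carries.
import numpy as np

pixel = np.array([200, 120, 60], dtype=np.float32)  # a warm SDR pixel (R, G, B)

def boost_saturation(rgb, factor=1.8):
    gray = rgb.mean()
    return np.clip(gray + (rgb - gray) * factor, 0, 255).astype(np.uint8)

print(boost_saturation(pixel))  # -> [255 114   6]: red clips at 255, detail is lost
```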
To deliberately make the old look worse to sell the new...is just a con.
Kinda like making older cards worse through drivers to promote newer cards.
You think that expensive TV at BestBuy in the middle of the showroom floor is running on the same exact settings as all the other TVs? No, they tweak the settings and create custom images to showcase their flagship product for more revenue. Unfortunately, it's up to the consumer to research and be informed.
Besides, I remember reading an article (Ars Technica?) arguing that true HDR isn't possible on these displays. We'll have to wait for OLED to become mainstream. I'll come back if I find the link.