Wednesday, June 7th 2017
NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR
In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to widen the divide.
The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the factory default values for brightness, contrast, and even gamma on the SDR monitor, which compromised the image quality it was actually able to convey. Resetting the monitor to its factory values resulted in a far less muted image on the SDR monitor than before, which points to a deliberate attempt to reduce image quality on the SDR side of the presentation. Granted, perceptions of image quality when comparing SDR to HDR fall on each viewer's personal, subjective spectrum; however, brightness, contrast, and gamma being pushed away from their factory levels (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
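For a rough sense of why those three controls matter, here is a minimal sketch (Python with NumPy; the function name and the specific setting values are hypothetical illustrations, not NVIDIA's actual configuration) of how monitor-style brightness, contrast, and gamma adjustments act on an 8-bit SDR frame, and how pushing them away from their defaults can wash out or crush the picture.

```python
import numpy as np

def apply_monitor_settings(image, brightness=0.0, contrast=1.0, gamma=2.2):
    """Approximate the effect of monitor brightness/contrast/gamma controls
    on an 8-bit SDR frame. All values here are illustrative only.

    image      -- uint8 array of shape (H, W, 3)
    brightness -- additive offset in normalized [0, 1] units
    contrast   -- multiplicative gain around mid-gray (0.5)
    gamma      -- display gamma; > 2.2 darkens midtones, < 2.2 lifts them
    """
    x = image.astype(np.float32) / 255.0          # normalize to [0, 1]
    x = (x - 0.5) * contrast + 0.5 + brightness   # contrast about mid-gray, then brightness offset
    x = np.clip(x, 0.0, 1.0) ** (gamma / 2.2)     # re-encode relative to a 2.2 baseline
    return (np.clip(x, 0.0, 1.0) * 255.0 + 0.5).astype(np.uint8)

# Example: settings pushed away from defaults mute the SDR frame.
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
degraded = apply_monitor_settings(frame, brightness=-0.1, contrast=0.7, gamma=2.8)
```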
Source:
Hardware Canucks' YouTube Channel
78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR
Doing this at a technical expo.... I would think this kind of stuff just makes them look bad, but their marketers surely know best what the outcomes of deception are.
Typical nvidia shenanigans and another vote for no money their way.
I have heard from console gamers that HDR is amazing, and I was hoping my next monitor would be an HDR one. This story tells me there is no need to rush out and buy one just yet lol.
Basically, they are trying so hard to utilize the new color gamut and brightness standards offered by HDR that it contradicts the original vision of the work.
Also, HDR is of course more uncomfortable for the eye than SDR: more contrast, less comfort. It is largely a marketing scheme, and it doesn't take rocket science to compress an image with a wider dynamic range into a picture with less dynamic range; all it takes is floating-point precision as opposed to the flat integer precision of RGB.
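As a rough sketch of that idea (Python with NumPy; the choice of operator and the parameter values are assumptions for illustration, not anything from the demo), here is how a floating-point HDR frame can be compressed into an 8-bit SDR one with a simple Reinhard-style tone-mapping curve.

```python
import numpy as np

def reinhard_tonemap(hdr, exposure=1.0, gamma=2.2):
    """Compress floating-point HDR radiance into an 8-bit SDR frame using the
    classic Reinhard curve L / (1 + L). 'hdr' is a float32 array of linear
    radiance values with shape (H, W, 3); exposure and gamma are illustrative.
    """
    scaled = hdr * exposure
    compressed = scaled / (1.0 + scaled)                       # map [0, inf) into [0, 1)
    encoded = np.clip(compressed, 0.0, 1.0) ** (1.0 / gamma)   # gamma-encode for display
    return (encoded * 255.0 + 0.5).astype(np.uint8)

# Example: a synthetic HDR frame with highlights far above SDR's 1.0 white point.
hdr_frame = np.random.rand(1080, 1920, 3).astype(np.float32) * 16.0
sdr_frame = reinhard_tonemap(hdr_frame)
```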
Unless you have so much stock in one of these companies that your livelihood depends upon it, take a breath, count to five, and decide if the hateful or super-negative thing is going to make one bit of difference.
And then rejoice because you have saved an hour of your life and reduced your risk of a stroke. :)