Wednesday, June 7th 2017
NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR
In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to increase the divide.
The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the factory default values for brightness, contrast, and even gamma on the SDR monitor, which compromised the image quality it was actually able to convey. Resetting the monitor to its factory values produced a far less muted image on the SDR monitor than before, which points to a deliberate attempt to reduce the quality of the SDR presentation. Granted, perceptions of image quality when comparing SDR to HDR are partly personal and subjective; however, brightness, contrast, and gamma being pushed away from even their factory levels (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
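For illustration only, here is a minimal sketch of how nudging brightness, contrast, and gamma away from factory defaults can flatten an SDR image. The values below are made up for demonstration and are not the settings NVIDIA actually used.

# Hypothetical example: raising brightness and gamma while lowering contrast
# lifts the black level and compresses the usable range, i.e. the image
# ends up looking muted / washed out.
import numpy as np

def adjust(image, brightness=0.0, contrast=1.0, gamma=1.0):
    """Apply a simple brightness/contrast/gamma chain to a [0, 1] image."""
    out = (image - 0.5) * contrast + 0.5 + brightness   # contrast about mid-grey, then offset
    out = np.clip(out, 0.0, 1.0)
    return out ** (1.0 / gamma)                          # gamma > 1 lifts the darker values here

ramp = np.linspace(0.0, 1.0, 256)                        # a grey ramp standing in for SDR content

factory  = adjust(ramp)                                  # factory defaults: unchanged
tampered = adjust(ramp, brightness=0.10, contrast=0.7, gamma=1.6)  # made-up "demo" settings

print(f"factory  black..white: {factory.min():.2f} .. {factory.max():.2f}")   # 0.00 .. 1.00
print(f"tampered black..white: {tampered.min():.2f} .. {tampered.max():.2f}") # ~0.42 .. ~0.97

With the made-up settings, black is lifted to roughly 42% grey while white barely moves, which is exactly the kind of washed-out SDR picture that makes the HDR panel next to it look dramatically better.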
Source:
Hardware Canucks' YouTube Channel
78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR
The dude said the contract lock-in period was "1 month", claiming the law had changed recently.
A few days later, I went to that company's store on an unrelated matter and, while I was there, I asked about the lock-in period, to which they said "it's 2 years"...
I asked: hasn't the law changed recently? One of your salespeople came to my house and said it had changed and that the lock-in period was now 1 month, to which they promptly replied "the law on the lock-in period hasn't changed".
As for the topic at hand, I don't care who it is: if they lie to sell products and get caught, they should be exposed for the lie(s) in question in the most public fashion possible, so that they (and others) think twice (or more times) before considering doing this sort of crap again.
And of course most monitor, TV, phone...car commercials do not represent the product in a realistic fashion.
:eek:Free the Beasts :D
And you wanted to sign up, get the free wireless router or whatever the ISP was offering, and then bail on your contract without penalty? And you believed the guy when he said you could? Really? Ever look at the photos on the menu at McDonald's and then look at your meal? This is why we have reviewers in the first place. You can't trust the vendor to tell you straight.
The test tool in NVIDIA's HDR SDK was as close as I could find, and the differences are rather minimal: bright lights are somewhat brighter, and there is definitely more detail in both bright and dark areas thanks to 10-bit color, but that's it. It really is not that noticeable unless you know what to look for, or unless the content is fudged.
Practically all of HDR marketing is pure bullshit.
It all starts and ends with content, just like 4K, and whether or not the regular Joe can see an advantage in buying into it. So in a way, his opinion is fact.
The current way HDR is marketed for one, does not help the adoption of HDR at all. The way Nvidia did it right here, is in fact counterproductive. Putting HDR capability on TN and IPS panels, or even VA panels, that do not have full array local dimming, is also counterproductive. It will make people think HDR is pointless, or produces unnatural images, because the panel tech simply can't display the gamut.
The whole situation is rather similar to the way High Definition was handled: everything was HD... but we had HD Ready too, and Full HD. People were misled and bought 720p screens, the end result being that we still don't have 1080p broadcasts today (of course this is only part of the reason, but it definitely contributed): a large majority of people got stuck on 720p, so why scale up content further?
On HDR: I'd rather see new panels with a wider gamut, better blacks, better response times, etc., and a good, calibrated sRGB mode, but here we are.
Let's go full throttle defense mode.
The issue is, "HDR support" means jack shit. It merely means the device accepts HDR as an input signal, not that it is actually capable of displaying it.
So instead you need to look for the "Ultra HD Premium" certification to get a proper HDR screen.
And here is the thing with every single example of 10/12-bit vs 8-bit I've ever seen: they are all 8-bit pictures, because they are designed to be viewed on 8-bit panels. Obviously, this is to try to sell people currently on 8-bit panels on the miracles of 10/12-bit.
I'll use BenQ's product page as a perfect example: www.benq.us/product/monitor/PG2401PT/features/
On that page they have this picture to show the difference between 8-bit and 10-bit:
Look how smooth the 10-bit side looks! Obviously 10-bit is so much better, how did we ever live with 8-bit? But wait, people will most likely be viewing that on an 8-bit monitor, so the "10-bit" image on the left will be viewed as 8-bit. Dig a little deeper, download the picture, and open it in an image editor. OMG, it's only an 8-bit image! So what you are seeing on the left is really 8-bit; so what the hell is the right?
You mean the monitor manufacturers, and pretty much everyone trying to say 10/12-bit is so much better, are really just purposely lowering the quality of images to make 8-bit look worse than it really is? OMG, get the pitchforks!!!
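For the curious, here is a minimal sketch of how such a comparison picture can be manufactured (the level counts are made up for illustration): crush one half of a gradient to far fewer levels, store both halves as plain 8-bit, and label them "8-bit" and "10-bit".

# Both halves end up as ordinary 8-bit data; the "8-bit" half has simply been
# re-quantized to ~5 bits so banding looks far worse than real 8-bit.
import numpy as np

width = 1024
gradient = np.linspace(0.0, 1.0, width)

# The "10-bit" half: actually just a normal 8-bit encoding of the gradient.
fake_10bit = np.round(gradient * 255).astype(np.uint8)

# The "8-bit" half: crushed down to 32 levels, then stored as 8-bit anyway.
levels = 32
crushed = np.round(gradient * (levels - 1)) / (levels - 1)
fake_8bit = np.round(crushed * 255).astype(np.uint8)

print("distinct values on the '10-bit' side:", len(np.unique(fake_10bit)))  # 256
print("distinct values on the '8-bit' side: ", len(np.unique(fake_8bit)))   # 32

The side marketed as "10-bit" is just untouched 8-bit content, while the side marketed as "8-bit" has been artificially reduced to 32 shades, which is exactly the kind of exaggeration the post above is calling out.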
The same goes here: most FX today can be done in DX 8.1 to a very large extent.
That does not mean it happens.
And what's all the new fuss about HDR anyway?
Do you not remember the good old HL2 demo of that?
In the grand scheme of things, HDR (more specifically HDR10) is about being able to cover more colour space than traditional displays. It uses a wider colour space (which we could already do), but at the same time is able to use more colours from that colour space. Up until now, even if you used Rec. 2020, you were still limited to 256 shades per R/G/B channel, thus limiting the maximum luminance delta. Of course, much like other things before it, having HDR support in place is one thing; wielding HDR properly is another, and that will come a little later.
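To put rough numbers on "more colours from that colour space", here is a simplified back-of-the-envelope sketch. Real HDR10 encodes luminance with the SMPTE ST 2084 "PQ" curve rather than a linear ramp, and the 1000-nit peak is an assumption, so treat the step sizes as illustrative only.

# Shades per channel and the smallest luminance step on a (simplified) linear ramp.
peak_nits = 1000.0  # hypothetical display peak luminance

for bits in (8, 10, 12):
    codes = 2 ** bits                  # shades per channel: 256, 1024, 4096
    step = peak_nits / (codes - 1)     # smallest luminance step on a linear ramp
    print(f"{bits:>2}-bit: {codes:>4} codes per channel, ~{step:.2f} nits per step (linear)")

Going from 8-bit to 10-bit quadruples the number of code values per channel, which is what lets a display stretch over a much larger brightness range without visible banding.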
My worry is that it's going to be dragged out for years, much like OLED, before we start seeing it in the mainstream and without gouging prices.
P.S: I have not seen an HDR monitor in person, but based on the way the picture looks, you can easily achieve a similar look.
Hell no!!!
HDR covers a wider color gamut: you are able to display colors that a normal monitor isn't capable of.
Think of it like TN vs IPS screens.
A comparison with different content is apples to oranges.
With similar/same content, the difference is quite difficult to discern.