Wednesday, June 7th 2017

NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it felt the need to fiddle with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the factory default values for brightness, contrast, and even gamma on the SDR monitor, compromising the image quality it was actually able to convey. Resetting the monitor to its factory values produced a far less muted image on the SDR screen than before, which points to a deliberate attempt to degrade the SDR side of the presentation. Granted, perceptions of image quality when comparing SDR to HDR fall on each viewer's personal, subjective spectrum; however, brightness, contrast, and gamma being set well away from their factory levels (which can usually be improved upon through calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
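To make concrete why off-default values mute an SDR picture, the following is a minimal Python sketch (our own illustration, not Hardware Canucks' methodology or NVIDIA's actual booth settings) of how brightness, contrast, and gamma controls remap an 8-bit tone curve; the function and the example parameter values are assumptions for demonstration only.

import numpy as np

def apply_monitor_settings(srgb_u8, brightness=0.0, contrast=1.0, gamma=2.2):
    # Crude model of an OSD's brightness/contrast/gamma controls acting on an
    # 8-bit SDR image: contrast pivots around mid-grey, brightness adds an
    # offset, and a gamma other than 2.2 lifts or crushes the midtones.
    x = srgb_u8.astype(np.float64) / 255.0
    x = (x - 0.5) * contrast + 0.5 + brightness
    x = np.clip(x, 0.0, 1.0) ** (gamma / 2.2)
    return (x * 255.0 + 0.5).astype(np.uint8)

grey_ramp = np.arange(0, 256, 32, dtype=np.uint8)
print(apply_monitor_settings(grey_ramp))  # identity at the defaults
print(apply_monitor_settings(grey_ramp, brightness=-0.1, contrast=0.8, gamma=1.8))  # highlights pulled down, shadows lifted: a flatter, muted ramp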
Source: Hardware Canucks' YouTube Channel

78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

#26
semantics
TheDeeGee: I like SDR better.
Living the life on the web where everything is sRGB if you're lucky.
Posted on Reply
#27
Air
Not shady at all eh? I could ignore it as normal if they had just chosen a terrible monitor, but going out of their way to also mess with the default settings? Boy...

Doing this at a technical expo... I would think this kind of stuff just makes them look bad, but their marketers surely know best what the outcomes of deception are.
Posted on Reply
#28
TheoneandonlyMrK
birdie: Just give us affordable true 10-bit matrices. HDR is more about capturing the image than displaying it.
Go AMD, it's been that way for years. I can even run 12-bit with this setup; it looks nice, and better than the LG HDR TV I've got, though to be fair the telly is far lower tier than the monitor.

Typical nvidia shenanigans and another vote for no money their way.
Posted on Reply
#29
semantics
Air: Not shady at all eh? I could ignore it as normal if they had just chosen a terrible monitor, but going out of their way to also mess with the default settings? Boy...

Doing this at a technical expo... I would think this kind of stuff just makes them look bad, but their marketers surely know best what the outcomes of deception are.
Marketing sells better by overpromising through the roof; just look at any Indiegogo/Kickstarter project. All of the "successful" ones at fundraising are all design and marketing; the product itself meant next to nothing in bringing people in.
Posted on Reply
#30
bug
theoneandonlymrk: Go AMD, it's been that way for years. I can even run 12-bit with this setup; it looks nice, and better than the LG HDR TV I've got, though to be fair the telly is far lower tier than the monitor.

Typical nvidia shenanigans and another vote for no money their way.
12 bit? You don't say.
Posted on Reply
#31
RejZoR
Aaaaaah, just NVIDIA being NVIDIA. Remember how they purposely gimped normal PhysX effects to make the hardware-accelerated ones look better, even though games 10 years older, using pure CPU physics (Havok), rendered BETTER effects?
Posted on Reply
#32
Captain_Tom
If Nvidia had to nerf a display just so people would notice the difference... that is quite disappointing.


I have heard from console gamers that HDR is amazing, and I was hoping my next monitor would be an HDR one. This story tells me there is no need to rush out and buy one just yet lol.
Posted on Reply
#33
natr0n
Just nvidia demonstrating the way it's meant to be played.
Posted on Reply
#34
Captain_Tom
RejZoR: Aaaaaah, just NVIDIA being NVIDIA. Remember how they purposely gimped normal PhysX effects to make the hardware-accelerated ones look better, even though games 10 years older, using pure CPU physics (Havok), rendered BETTER effects?
PhysX is one of my favorite things to make fun of. Nvidia's Popcorn Simulator is a complete joke.
Posted on Reply
#35
TheGuruStud
natr0n: Just nvidia demonstrating the way it's meant to be played.
You mean, "The Way You're Meant To Be Deceived." ™ © ®
Posted on Reply
#36
mcraygsx
No surprise there from the Green Team; they nerf their older GPUs via drivers to make the new ones look more attractive.
Posted on Reply
#37
qubit
Overclocked quantum bit
tsk @Raevenlord I see you're once again reporting lies about my dear NVIDIA! Please wait while I report you to NVIDIA Spin Central who will deal with you appropriately. ;)
Posted on Reply
#38
Relayer
I am far, far, far from an nVidia apologist. I wouldn't walk across the street to piss on Jen Hsun if he was on fire, nor spend a cent on an nVidia product. With that said, I've been in sales my whole life, and I'd be willing to bet it was someone at the show who, on their own, adjusted the settings to make the difference more dramatic. Salespeople are known for often stretching the truth and outright lying to make a sale. That's a lot more likely than some upper-level marketing decision.
Posted on Reply
#39
dwade
HDR's biggest problem is not the displays themselves but the content. Content makers violate their original work by making the image oversaturated to emphasize the wide color gamut, then darken the image in order to exaggerate the brightness of the highlights.

Basically, they are trying so hard to utilize the new color gamut and brightness standards offered by HDR that they contradict the original vision of the work.
Posted on Reply
#40
WhateverAnotherFreakingID
As of today, nVidia's GeForce drivers still can't manage the wide gamut on my SDR display, an old Dell U2413, so I don't really get why they want to show off HDR.

Also, HDR is of course more uncomfortable for the eye than SDR (more contrast, less comfort); it is just a massive marketing scheme. And it doesn't take rocket science to compress an image with a wider dynamic range into a picture with less dynamic range; all it takes is floating-point precision instead of the flat integer precision of RGB.
Posted on Reply
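As an editorial aside, here is a rough sketch of the kind of HDR-to-SDR compression the post above alludes to, using a simple global Reinhard-style operator on floating-point pixel values; the operator choice and the numbers are illustrative assumptions, not anything NVIDIA or the commenter specified.

import numpy as np

def tonemap_to_sdr(hdr_rgb, exposure=1.0):
    # Squeeze linear floating-point HDR values (anything >= 0) into 8-bit SDR:
    # the Reinhard curve x/(1+x) maps [0, inf) into [0, 1), then a rough 2.2
    # gamma encode prepares the result for a standard SDR display.
    scaled = hdr_rgb * exposure
    mapped = scaled / (1.0 + scaled)
    encoded = np.clip(mapped, 0.0, 1.0) ** (1.0 / 2.2)
    return (encoded * 255.0 + 0.5).astype(np.uint8)

# A pixel four times brighter than SDR reference white still lands inside 0..255.
print(tonemap_to_sdr(np.array([4.0, 1.0, 0.25])))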
#41
TheoneandonlyMrK
bug: 12 bit? You don't say.
Look it up; I've used it, not imagined it. My monitor is a pro model.
Posted on Reply
#42
rtwjunkie
PC Gaming Enthusiast
Hood: nVidia sucks, AMD sucks, Intel sucks - a visitor to this site might get the idea that this place is full of geniuses, since everyone here obviously knows much more than the 3 biggest PC chip makers in the world. Never mind that all 3 corporations have accomplished things that none of us could do if we had a trillion dollars and 20 years. Being critical is normal, but it's turned into a shouting match and degenerated into personal attacks, hatred, and bigotry, ultimately against everything and everyone, at one time or another.

Does anyone here actually like any of their PC hardware brands, without needing to talk crap about other brands, and the people who buy them? These 3 companies have risen to the top against many other makers of CPUs and GPUs; shouldn't we give them their due respect for the great things they've done? We all have bad days, and feel negativity, myself included, but does it really make us feel better when we put others down?

I'm going to try to post only positive things from now on, and leave the keyboard alone when I'm feeling scrappy. In the 5 years I've been a member, the quality and tone of this site have gone downhill rapidly, just as the political and social climate has become toxic in most of the world. I hope we can be better than that, and focus on the real mission of this site (well, besides making money) - informing the uninformed about how to build and repair the best PC they can afford. Hate will never accomplish that - patience, understanding, and tolerance are what's needed.
Thank you. This +1. The disrespect for others' points of view on TPU has gotten pathetic and saddening. And the levels, not only of fanboyism but of HATRED of whatever other team applies at the moment, have become like an out-of-control melanoma.

Unless you have so much stock in one of these companies that your livelihood depends upon it, take a breath, count to five, and decide if the hateful or super-negative thing is going to make one bit of difference.

And then rejoice because you have saved an hour of your life and reduced your risk of a stroke. :)
Posted on Reply
#43
Fluffmeister
Crappy panels and a lack of decent content, they basically had no choice.
Posted on Reply
#44
Mistral
Well, this is awkward... so, business as usual?
Posted on Reply
#45
Th3pwn3r
Lol, I'm willing to bet the people here knocking HDR don't even have a monitor or TV that can display it. I can see a clear difference on my Samsung 55'' KS8000 when HDR is off. Not only does it look far better with HDR enabled, but I also like it (for the person saying it's uncomfortable to the eye: wtf??)...
Posted on Reply
#47
oxidized
Not much different from when AMD deliberately chose specific settings in OBS in order to show how a Ryzen would stream better than an i7. Different methods, both disgusting.
Posted on Reply
#48
evernessince
Sihastru: Or maybe the monitors weren't properly reset after the guys that came before Dmitry and Eber had their way with them.

The only way to show the capabilities of HDR is to use the same monitor and content, preserving all settings, except the one being tested, enabled and disabled between runs. Record the same sequence with the same camera settings, in the same ambient (lighting) conditions, then show a split screen using post-processing magic.

The test was doomed from the start.
Nvidia only let those guys tweak the settings...
Posted on Reply
#49
creativeusername
What a surprise, like the sky being blue and dirt being brown.
Posted on Reply
#50
Prima.Vera
HDR can be easily emulated via software; 10- or 12-bit, however, cannot, even if some panels are doing fake 10/12-bit by using dithering. Still good, since it completely removes banding.
Posted on Reply
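A quick sketch of the dithering trick mentioned above (an editorial illustration, not something from the comment): a 10-bit value that falls between two 8-bit codes snaps to a single code under plain rounding, producing banding steps, while adding about half a code's worth of noise before rounding lets neighbouring pixels average back to the original value.

import numpy as np

rng = np.random.default_rng(0)

# A flat patch of 10-bit signal (value 513) whose ideal 8-bit value, 128.25,
# cannot be represented exactly.
target_8bit = np.full(10_000, 513.0) / 4.0

# Plain rounding: every pixel snaps to 128, so adjacent patches form hard steps.
rounded = np.round(target_8bit)

# Dithered quantization: +/- 0.5 LSB of noise before rounding makes pixels
# flicker between 128 and 129, and the spatial average recovers 128.25.
dithered = np.round(target_8bit + rng.uniform(-0.5, 0.5, target_8bit.shape))

print("plain rounding mean:", rounded.mean())   # 128.0
print("dithered mean:", dithered.mean())        # approximately 128.25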