
Samsung, Amazon Unveil Yet Another HDR Standard

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.23/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
And here I was thinking the whole point of having standards was to homogenize offerings for a given feature, ensuring the same minimum requirements were met by anyone (or any product) looking to carry a sticker emblazoned with its capabilities. Yet here it is, another HDR standard, which Samsung and Amazon are calling HDR10+.

The HDR10+ standard looks to narrow the gap between the HDR10 standard as certified by the UHD Alliance and Dolby Vision, which boasts better HDR reproduction while demanding adherence to higher specifications. The greatest change in HDR10+ is the adoption of Dynamic Tone Mapping, which relies on variable, dynamic metadata to adjust brightness and contrast in real time, optimized on a frame-by-frame basis. The feature is present in Dolby Vision but absent from the UHD Alliance's HDR10, whose static metadata resulted in some bright scenes being overly darkened.
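The difference can be sketched in a few lines. This is a toy Python illustration, not the actual HDR10+/SMPTE ST 2094 math: the Reinhard-style curve and the 500-nit display peak are arbitrary assumptions, chosen only to show why a per-frame peak beats a per-title one.

```python
# Toy contrast between static (per-title) and dynamic (per-frame) tone
# mapping. All values are luminances in nits.

def tone_map(nits, scene_peak, display_peak=500.0):
    """Reinhard-style compression of scene luminance into a display's range."""
    x = nits / scene_peak
    return (x / (1.0 + x)) * display_peak * 2.0  # x = scene_peak maps to display_peak

frames = [
    [80.0, 120.0, 4000.0],  # bright scene with a 4,000-nit highlight
    [5.0, 20.0, 90.0],      # dark scene, nothing above 90 nits
]

# Static metadata (HDR10): one peak for the whole title, so the dark scene
# is compressed as if it also contained 4,000-nit highlights.
static_peak = max(max(f) for f in frames)
static = [[tone_map(n, static_peak) for n in f] for f in frames]

# Dynamic metadata (HDR10+): each frame's own peak, so the dark scene
# still uses the display's available range.
dynamic = [[tone_map(n, max(f)) for n in f] for f in frames]

print(static[1][2])   # ~22 nits: dark scene crushed by the title-wide peak
print(dynamic[1][2])  # 500.0 nits: dark scene fills the display's range
```

The same input frame lands an order of magnitude brighter once the metadata describes the scene it is actually in; that is the "overly darkened bright scenes" problem in miniature.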





The presence of both Samsung and Amazon in this announcement is no accident: a hardware manufacturer and a content provider are, in this case, joining hands to deliver an improved experience and an ecosystem adapted to this new standard. For one, Samsung's 2016 UHD TVs will get HDR10+ support through a firmware update during the second half of 2017 (all of Samsung's 2017 UHD TVs, including its premium QLED TV lineup, support HDR10+). This ensures hardware support for new HDR10+ content, where dynamic metadata needs to be included within a video file before it can be decoded, hence relying on content creators adopting it - and Amazon is one of the prime content providers for the latest technologies. What a happy coincidence.

View at TechPowerUp Main Site
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Dolby Vision isn't standard... it's proprietary shit like G-Sync and the like, that needs to die!
 
Joined
Aug 20, 2007
Messages
21,535 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Dolby Vision isn't standard... it's proprietary shit like G-Sync and the like, that needs to die!

It is a standard, just a proprietary one. Standards don't have to be open. Heck, Betamax was a standard.
 
Joined
May 29, 2012
Messages
537 (0.12/day)
System Name CUBE_NXT
Processor i9 12900K @ 5.0 GHz all P-cores with E-cores enabled
Motherboard Gigabyte Z690 Aorus Master
Cooling EK AIO Elite Cooler w/ 3 Phanteks T30 fans
Memory 64GB DDR5 @ 5600 MHz
Video Card(s) EVGA 3090Ti Ultra Hybrid Gaming w/ 3 Phanteks T30 fans
Storage 1 x SK Hynix P41 Platinum 1TB, 1 x 2TB, 1 x WD_BLACK SN850 2TB, 1 x WD_RED SN700 4TB
Display(s) Alienware AW3418DW
Case Lian-Li O11 Dynamic Evo w/ 3 Phanteks T30 fans
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 11 Pro 64-bit
Dolby Vision isn't standard... it's proprietary shit like G-Sync and the like, that needs to die!
Being proprietary has no effect on whether it is still a standard or not. Especially when it's as influential as Dolby.

I don't know why you would even think otherwise.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
The UHD Alliance Premium cert is an entry-level sticker. It only requires covering 90% of the DCI-P3 color gamut at 1,000 nits peak brightness (540 nits on OLEDs). A good-enough stamp of approval. They still have issues with certifying local dimming tech, and of course it doesn't cover the audio side of things.
 
Joined
Feb 11, 2009
Messages
5,569 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom watercooling loop
Memory Corsair (4x2) 8GB 1600MHz -> Crucial (8x2) 16GB 3600MHz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case Antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Dolby Vision isn't standard... it's proprietary shit like G-Sync and the like, that needs to die!

ermm which standard is not proprietary? pretty sure HDMI etc is also owned.
 
Joined
Jun 28, 2016
Messages
3,595 (1.16/day)
ermm which standard is not proprietary? pretty sure HDMI etc is also owned.

Of course. And it's not free, obviously:
"HDMI manufacturers pay an annual fee of US$10,000 plus a royalty rate of $0.15 per unit, reduced to $0.05 if the HDMI logo is used, and further reduced to $0.04 if HDCP is also implemented."
en.wikipedia.org/wiki/HDMI
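Plugging the quoted figures into a quick sketch (the function and the shipment volume are mine, and I'm assuming the HDCP discount stacks on top of the logo discount, as the quote implies):

```python
# Per the quoted adopter terms: $10k/year flat, plus a per-unit royalty
# that drops with the HDMI logo and again with HDCP.
ANNUAL_FEE = 10_000.0

def hdmi_cost(units, logo=False, hdcp=False):
    rate = 0.15             # base royalty per unit
    if logo:
        rate = 0.05         # reduced if the HDMI logo is used
        if hdcp:
            rate = 0.04     # further reduced if HDCP is also implemented
    return ANNUAL_FEE + units * rate

# Shipping a million units: $160k/year at the base rate vs. $50k with
# logo + HDCP, so branding actually saves money here.
print(hdmi_cost(1_000_000))                        # 160000.0
print(hdmi_cost(1_000_000, logo=True, hdcp=True))  # 50000.0
```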

It's very unlikely for a community-originated solution to become a large industry standard in electronics. Almost all of these "standards" were started by large manufacturers (often as joint ventures).

Even the ubiquitous USB is protected by a (non-free) license.

And here's an interesting difference: if you're making an HDMI cable, you get a discount for adding the HDMI logo. It's exactly the opposite with USB, where you have to pay extra to put a USB logo on your product. :D
This is a beautiful example of the difference between a dominant, well-known product and one that still fights for market share and recognizability. :D
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
So, buying HDR-capable devices is pointless at this point, with so many standards and no one knowing which one will be used...
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Pretty sure MPEG-H Part 14 (HDR10) is the de facto standard for HDR. It is what ATSC 3.0 will use and no doubt PAL as well.
 
Last edited:
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
I don't see any gap between (free) HDR10+ and Dolby Vision.
The main difference was that metadata in HDR10 was static; now it isn't anymore.

Oh, and did I mention you don't need to pay Dolby (or anyone) a fee for every device or piece of software that supports HDR10+?

To my knowledge, LG is the only TV manufacturer that bothered supporting Dolby Vision (on top of HDR10, of course). But given the price of its OLED TVs, the fee is peanuts for them anyhow.


It is a standard, just a proprietary standard. Standards don't have to be open. Heck, betamax was a standard.

They don't have to be open, but given that we have HDR10, which is free, no thanks (and f*ck you, Dolby).

The "we invest a couple of million into a standard to shave off billions for nothing later on" crap needs to stop (and I'm glad Samsung and Sony bothered).

Pretty sure MPEG-H Part 14 (HDR10) is the de facto standard for HDR.
No, that's a different thing (encoding-related).
HDR10 vs Dolby Vision is about how you communicate that data to the device.

Since device maximum brightness (and other capabilities) is expected to vary a lot, additional work is needed to map from the source to the device in an optimal way.
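On the encoding side, what the HDR10 family shares is the SMPTE ST 2084 (PQ) transfer function; the mapping argument above is about what a display does after decoding it. A minimal Python rendering of PQ's published constants (the function name is mine) shows the absolute 0-10,000-nit range that metadata then has to squeeze onto a real panel:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = signal ** (1 / m2)
    return 10_000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))  # 0.0 nits
print(pq_eotf(0.5))  # ~92 nits: half the code range is still quite dark
print(pq_eotf(1.0))  # 10000.0 nits: far beyond what current panels can show
```

Since the signal can describe up to 10,000 nits but a given TV might top out at 500, somebody has to decide how to compress the rest; static vs. dynamic metadata is the argument over how well-informed that decision is.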


So, buying HDR-capable devices is pointless at this point, with so many standards and no one knowing which one will be used...
There are only two standards (OK, this one makes it a third), next to nobody supports Dolby Vision, and good luck taking on a standard established by the actual TV manufacturers (and available royalty-free).

You should care more about the actual capabilities of the thing behind the "HDR" label, as they can vary wildly.
 
Last edited:
Joined
Mar 23, 2012
Messages
777 (0.17/day)
Location
Norway
System Name Games/internet/usage
Processor I7 5820k 4.2 Ghz
Motherboard ASUS X99-A2
Cooling custom water loop for cpu and gpu
Memory 16GiB Crucial Ballistix Sport 2666 MHz
Video Card(s) Radeon Rx 6800 XT
Storage Samsung XP941 500 GB + 1 TB SSD
Display(s) Dell 3008WFP
Case Caselabs Magnum M8
Audio Device(s) Shiit Modi 2 Uber -> Matrix m-stage -> HD650
Power Supply beQuiet dark power pro 1200W
Mouse Logitech MX518
Keyboard Corsair K95 RGB
Software Win 10 Pro
Not a big fan of these dynamic backlighting solutions; all the TVs I have seen with local dimming look bad.

As for the thing with standards:

(from https://xkcd.com/927/)
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
Not a big fan of these dynamic backlighting solutions; all the TVs I have seen with local dimming look bad.

As for the thing with standards:

(from https://xkcd.com/927/)

That strip is kind of right.

Dolby Vision already had these capabilities, but HDR10 didn't cover them. The UHD Alliance seems not to take them into consideration until the majority of its body agrees on them. There is a gap between innovating new ways to meet a higher standard and those innovations being accepted.

If you recall, Samsung had been pushing HDR 1000 (for nit capability), and with the QLEDs they are pushing HDR 1500 & HDR 2000, which puts them closer to DV territory.



LEDs were already capable of 1,000+ nits, but they looked horrid without local dimming grids. OLED just can't get bright enough to reach 1k, let alone DV at 4k, and who knows what comes next: HDR12 at ? nits & DV at 10k nits? And that's just one aspect of the standards.
 
Last edited:
Joined
Oct 2, 2011
Messages
98 (0.02/day)
System Name Custom build
Processor AMD Ryzen 7 5800X
Motherboard ASrock B550M
Cooling Wraith Max
Memory DDR4 4000 16x2GB
Video Card(s) RX 6900 XT
Storage Sabrent Rocket 4.0 500GB
Display(s) ASUS VG236H 120Hz
Case N/A
Audio Device(s) Onboard
Power Supply Thermaltake 850W
Mouse Generic
Keyboard Generic
Software Win 10 x64
Do we really need more than one standard to display 1.07 billion colors?
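That figure is just 10-bit arithmetic: 1,024 code values per channel, cubed.

```python
# 10 bits per channel gives 2^10 = 1024 levels; three channels multiply.
per_channel = 2 ** 10
total = per_channel ** 3
print(total)  # 1073741824, i.e. the "1.07 billion colors" on the box
```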
 