
AMD to Rename "FreeSync 2" To "FreeSync 2 HDR", Increase Minimum HDR Requirement

Raevenlord

News Editor
The guys over at PC Perspective conducted an interesting interview with AMD, during which a company representative talked about impending changes to AMD's FreeSync program. Essentially, the company found that there is some consumer confusion regarding exactly what features FreeSync 2 delivers over its first-gen counterpart. As such, they feel renaming the technology to FreeSync 2 HDR conveys the focus of the new feature set: LFC (Low Framerate Compensation) and the FreeSync 2 HDR fast lane for tone-mapping improvements.

The AMD representative further clarified the specs required for a monitor to receive FreeSync 2 HDR certification: support for at least HDR600, plus coverage of 99 percent of the BT.709 and 90 percent of the DCI-P3 color spaces. A minimum response time was also mentioned, though the exact value remains unknown. An interesting point that can be gleaned from AMD's change, though, is that it is more than cosmetic: AMD's first FreeSync 2 certification program only required displays to adhere to HDR400. There are announced FreeSync 2 monitors that support only that standard (and others that don't support even that but were certified all the same), instead of the aforementioned HDR600 the company will apparently start enforcing alongside the renewed "FreeSync 2 HDR" program. Here's hoping for a stricter certification program from AMD in this regard, since HDR400 was already a stretch to call true HDR (it isn't...) - and FreeSync 2 already has all the market support and recognition it needs to start raising its requirements, prioritizing quality of support over quantity.
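As an aside, the criteria listed above boil down to a handful of minimum thresholds a display must clear. Here is a minimal sketch of what such a check could look like; the field names, data layout, and function are purely illustrative assumptions for this article, not AMD's actual certification tooling:

```python
# Hypothetical minimums per the reported FreeSync 2 HDR criteria:
# at least the DisplayHDR 600 tier, 99% BT.709 and 90% DCI-P3 coverage.
FREESYNC2_HDR_REQUIREMENTS = {
    "peak_nits": 600,
    "bt709_coverage": 0.99,
    "dci_p3_coverage": 0.90,
}

def meets_freesync2_hdr(monitor: dict) -> bool:
    """Return True only if every measured value meets or exceeds its threshold."""
    return all(monitor.get(key, 0) >= threshold
               for key, threshold in FREESYNC2_HDR_REQUIREMENTS.items())

# Example: a display that clears every threshold passes...
sample = {"peak_nits": 600, "bt709_coverage": 0.99, "dci_p3_coverage": 0.92}
print(meets_freesync2_hdr(sample))  # True

# ...while an HDR400-class panel would not.
hdr400_panel = {"peak_nits": 400, "bt709_coverage": 0.99, "dci_p3_coverage": 0.92}
print(meets_freesync2_hdr(hdr400_panel))  # False
```

The point of the sketch is simply that, under the new program, an HDR400-class panel fails the very first check, which is exactly the tightening the article describes.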



 
So AMD lied. Let's see if AdorkedTV will add this to the "AMD - Anti-Competitive, Anti-Consumer, Anti-Technology" list.
 
So AMD lied. Let's see if AdorkedTV will add this to the "AMD - Anti-Competitive, Anti-Consumer, Anti-Technology" list.
lol
 
You need 1000 nits for real HDR.



AMD didn't lie; HDR400 and HDR600 are both VESA standards. Yes, 1000 nits is the ideal sweet spot for HDR performance, but VESA themselves established these HDR-compliant tiers (much like the HD Ready televisions of old, if you ask me, which accepted inputs but weren't able to display them). AMD simply seems to be raising the bar for their FreeSync 2 HDR badge.
 
You need 1000 nits for real HDR.

Says who? You, subjectively? VESA has 3 distinct HDR certifications, all equally valid, for 400, 600 and 1000 nits. And you still didn't explain how exactly AMD lied.
 
Well, I have to post this again. HDR400 is for OLEDs, which can't shine as bright as LCDs but can have much deeper blacks (thus offering a very high dynamic range). Anyone sticking an HDR400 label on an LCD monitor should be taken behind the barn and shot.
 
AMD didn't lie; HDR400 and HDR600 are both VESA standards. Yes, 1000 nits is the ideal sweet spot for HDR performance, but VESA themselves established these HDR-compliant tiers (much like the HD Ready televisions of old, if you ask me, which accepted inputs but weren't able to display them). AMD simply seems to be raising the bar for their FreeSync 2 HDR badge.

I'm not saying FreeSync 2 HDR monitors are fake HDR. I'm talking about, for example, the Samsung C32HG70 at 350 nits?

Says who? You, subjectively? VESA has 3 distinct HDR certifications, all equally valid, for 400, 600 and 1000 nits. And you still didn't explain how exactly AMD lied.

Watch the video (0:21), but I doubt it will help, since you take AMD's word above anything else.

"AMD's first FreeSync 2 certification program required displays to only be able to adhere to HDR400. There are some examples of announced, FreeSync 2 monitors that only support that standard (and others that don't support even that but were certified all the same)"

AMD promised that FreeSync 2 comes with HDR by default. But of all the FreeSync 2 monitors, only a few were certified HDR400 or HDR600 (maybe more, I didn't check). Now they call it FreeSync 2 HDR because only now will those monitors come with DisplayHDR 400, 600, or 1000 certification.
 
But of all the FreeSync 2 monitors, only a few were certified HDR400 or HDR600 (maybe more, I didn't check). Now they call it FreeSync 2 HDR because only now will those monitors come with DisplayHDR 400, 600, or 1000 certification.

Dude, where are those FreeSync 2 monitors that are not HDR400 certified? If Samsung rates the C32HG70 at 600 nits peak and VESA seems to be OK with it, how is it managing sub-400 nits in some situations AMD's fault?
 
You need 1000 nits for real HDR.
False. Samsung talked about this with their FreeSync 2 certified monitors. They said that HDR 1000 is too bright for monitor use. It is intended for living room TVs, where the viewer is some distance from the TV, so all that brightness is drowned out by ambient light before it reaches the viewer's eyes. HDR 600 is more sensible (less blinding) for monitor use.

The FreeSync 2 spec was created before VESA put out the DisplayHDR standard. AMD updated the FreeSync 2 spec to use the definitions VESA established. There are only three FreeSync 2 certified monitors right now, and all three exceed DisplayHDR 600.

There will likely be DisplayHDR 1000 FreeSync 2 certified TVs.

I welcome this change. Less confusing for everyone.


FYI, on DisplayHDR 400... most monitors are 300-350 nits and lack local dimming.
 
False. Samsung talked about this with their FreeSync 2 certified monitors. They said that HDR 1000 is too bright for monitor use. It is intended for living room TVs, where the viewer is some distance from the TV, so all that brightness is drowned out by ambient light before it reaches the viewer's eyes. HDR 600 is more sensible (less blinding) for monitor use.

The FreeSync 2 spec was created before VESA put out the DisplayHDR standard. AMD updated the FreeSync 2 spec to use the definitions VESA established. There are only three FreeSync 2 certified monitors right now, and all three exceed DisplayHDR 600.

There will likely be DisplayHDR 1000 FreeSync 2 certified TVs.

I welcome this change. Less confusing for everyone.


FYI, on DisplayHDR 400... most monitors are 300-350 nits and lack local dimming.

Yeah, ambient lighting plays a huge role in how much brightness you need. Most people don't know that and just leave it at the default settings, but you can significantly increase image quality simply by tuning brightness. Decreasing the brightness increases the contrast. Brightness is merely a measure of the minimum level of light allowed, so naturally, increasing it means darker colors will no longer be dark. It's like when you turn off all the lights and notice the TV is still on because you can see the light even though the screen is "black", or in this case, as black as the monitor can emulate. The flipside is that if there is too much ambient light and you have your TV set to a low brightness, it will be hard to see much of anything.
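The brightness/contrast relationship described above can be shown with a quick back-of-the-envelope calculation. Static contrast is just peak luminance divided by black level, so raising the black level (turning brightness up) cuts the ratio. The numbers below are made-up illustrative values, not measurements of any real panel:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio: peak luminance divided by black level."""
    return peak_nits / black_nits

# Hypothetical 600-nit panel: the same peak, two different black levels.
print(contrast_ratio(600, 0.1))  # brightness kept low  -> 6000.0 (6000:1)
print(contrast_ratio(600, 0.3))  # brightness turned up -> 2000.0 (2000:1)
```

Tripling the black floor cuts the static contrast to a third, which is why a dark room plus a low brightness setting looks so much punchier than the factory defaults.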
 