Friday, November 18th 2016

AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

High dynamic range, or HDR, is all the rage these days as the next big thing in display output, now that hardware has had time to catch up with ever-increasing display resolutions such as 4K Ultra HD, 5K, and the various ultra-wide formats. Hardware-accelerated HDR is getting a push from both AMD and NVIDIA in this round of GPUs. While games with HDR rendering date back to Half-Life 2, hardware-accelerated formats that minimize work for game developers, in which the hardware makes sense of an image and adjusts its output range, are new and require substantial compute power. HDR also requires additional interface bandwidth between the GPU and the display, since GPUs rely on wider color depths such as 10 bits per channel (1.07 billion colors) to output HDR images. AMD Radeon GPUs are facing difficulties in this area.

German tech publication Heise.de discovered that AMD Radeon GPUs output HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per channel (16.7 million colors) if your display (e.g. a 4K HDR-ready TV) is connected over HDMI 2.0 rather than DisplayPort 1.2 (or above). The desired 10 bits per channel (1.07 billion colors) is available only when the HDR display is connected over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de also observes that AMD GPUs reduce the output from full YCbCr 4:4:4 sampling to 4:2:2 or 4:2:0 (chroma subsampling) when the display is connected over HDMI 2.0. The publication suspects that the limitation affects all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.
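To put rough numbers on that bandwidth requirement, here is a minimal back-of-the-envelope sketch (not from the source article): it ignores blanking intervals, audio, and protocol overhead, and assumes HDMI 2.0's usable payload is roughly 14.4 Gb/s (18 Gb/s raw minus 8b/10b line coding). The helper function name is purely illustrative.

```python
# Approximate uncompressed video data rate at 4K60 with full 4:4:4 sampling,
# for 8 vs. 10 bits per channel (blanking, audio, and protocol overhead ignored).
def data_rate_gbps(width, height, fps, bits_per_channel, samples_per_pixel=3):
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

HDMI_20_PAYLOAD_GBPS = 14.4  # assumed usable payload after 8b/10b coding

for bpc in (8, 10):
    rate = data_rate_gbps(3840, 2160, 60, bpc)
    verdict = "fits" if rate <= HDMI_20_PAYLOAD_GBPS else "does not fit"
    print(f"2160p60 4:4:4 at {bpc} bpc: ~{rate:.1f} Gb/s ({verdict})")
```

On those rough figures, 8 bpc 4:4:4 (~11.9 Gb/s) squeezes into HDMI 2.0 while 10 bpc 4:4:4 (~14.9 Gb/s) does not, which is why the discussion below keeps coming back to chroma subsampling.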
Source: Heise.de

126 Comments on AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

#26
bug
ShurikNI believe there are some, but they are not that great in number. Now I understand a 1080p screen not having DP, but for a 4K one, it's ridiculous.
Just be thankful 4K doesn't require MST, like it did in the beginning :D
TVs aren't really meant for 60fps, because they're not primarily aimed at gamers. At 24 or 30 they can get away with older interfaces.
Posted on Reply
#27
qubit
Overclocked quantum bit
bugTVs aren't really meant for 60fps, because they're not primarily aimed at gamers. At 24 or 30 they can get away with older interfaces.
I've never heard of a TV that can't work at 60Hz. It doesn't make sense, because then it wouldn't be fully compatible with the NTSC spec and interlaced scanning, which it has to be.

In the UK it's the PAL spec and 50Hz interlaced.
Posted on Reply
#28
Solidstate89
qubitI've never heard of a TV that can't work at 60Hz. It doesn't make sense, because then it wouldn't be fully compatible with the NTSC spec and interlaced scanning, which it has to be.

In the UK it's the PAL spec and 50Hz interlaced.
He didn't say they can't work at 60 Hz, just that they don't need to unless you have an HTPC hooked up, as no movie, and almost all console games, will run at 30 FPS or less. So they could get away with running something like HDMI 1.3 or 1.4 at 4K at 30 Hz.

Most TVs these days run at full 60Hz, at full 4K at full gamut as well. This was only an issue for early adopters.
Posted on Reply
#29
m1dg3t
zlatanJesus TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: doing 8-bit with 4:4:4 chroma subsampling, or doing 10-bit with 4:2:2. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals. The industry doesn't have enough experience to do this, so the tonemapping pipeline in a game is designed for 8 bits per channel. Even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support that option until the game's tonemapping pipeline is mature enough.
ShurikNSomething you forgot to mention
Of course, everyone loves to shit on AMD. Titling it this way grabs more clicks too, smooth move bta :)
Posted on Reply
#30
eidairaman1
The Exiled Airman
GhostRyderHence why I prefer Display Port...
It works for VGA
Posted on Reply
#31
Xzibit
Well this is interesting.
Heise.dewe found that enabling HDR on Nvidia graphics cards cost some performance, while the frame rate on the Radeon RX 480 remained constant.
Posted on Reply
#32
Steevo
XzibitWell this is interesting.
Nvidia cuts corners on quality for performance; they always have.

Looking at TSAA vs. other AA effects, TSAA looks like it removes some lighting passes and some detail while performing AA.
Posted on Reply
#33
FordGT90Concept
"I go fast!1!11!1!"
bugThe assertion is that it only reduces the colour depth on HDMI 2.0 while all is peachy on DP 1.2. Which, as I have said, is weird, because both HDMI 2.0 and DP 1.2 have the same bandwidth. Still, it could be HDMI's overhead that makes the difference once again.
DisplayPort 1.2 is 17.28 Gb/s, which is enough for 2160p60 @ 10-bpc.
Prima.VeraThis time it is not AMD's fault, but the HDMI standard's. nVidia has THE SAME issue with HDMI 2.0, btw.
Also not sure why those 4K HDR TVs are not provided with at least DP 1.2 interfaces too...
Because Hollywood. The same people that produce content for TVs are invested in HDMI, which they make royalties off of. DisplayPort is almost royalty-free, so they'd be cutting into their own pocketbook. HDMI isn't going anywhere because of that, which is why I'm moving to IPTV boxes and monitors, which takes me away, entirely, from the HDMI ecosystem.
qubitIn the UK it's the PAL spec and 50Hz interlaced.
I think 2160p50 @ 10-bpc does work on HDMI 2.0. Edit: Yup, 12.4 Gb/s which is under the 14.4 Gb/s HDMI 2.0 can handle.
Posted on Reply
#34
danbert2000
FordGT90ConceptI think 2160p50 @ 10-bpc does work on HDMI 2.0. Edit: Yup, 12.4 Gb/s, which is under the 14.4 Gb/s HDMI 2.0 can handle.
Please stop repeating that 14.4 Gb/s number, you are wrong. Full bandwidth of HDMI 2.0b is 18 Gbps:

en.wikipedia.org/wiki/HDMI#Version_2.0

What I gleaned from this article is that AMD is unable to push HDR10 at any chroma resolution over HDMI, even 4:2:0. This is bad news for anyone who wants an AMD card in an HTPC, as it won't be able to output HDR movies or games to your fancy 4K TV. Let's hope they can somehow fix this in the drivers. Nvidia has driver support for HDR, but sadly there are almost no games that support it now, and Hollywood is restricting access to 4K and HDR video content on PC. Nvidia apparently is working with Netflix and devs, but no news for a while. Also, Windows 10 does not support HDR in shared mode with the desktop, so until they fix that, HDR will only work in exclusive fullscreen. Lots of work to be done on the PC side; luckily at least Nvidia cards will be ready when the content is.

wccftech.com/nvidia-pascal-goes-full-in-with-hdr-support-for-games-and-4k-streaming-for-movies/

What I find curious is that the PS4 Pro and Xbox One S have no issue with HDR at 4:2:0 over HDMI 2.0. Sony made a big deal of having a more up-to-date architecture than the Polaris used in the RX 480, and Microsoft used a newer version of the architecture for their die shrink (as evidenced by the 16 nm process and H.265 support). I really hope for AMD owners' sake that the Polaris in the 480 isn't missing this feature, as HDR is awesome and makes a bigger difference than 4K for gaming, in my humble opinion.
Posted on Reply
#36
danbert2000
FordGT90Concept14.4 Gb/s is payload, 18 Gb/s includes overhead.
Yeah, overhead for 10 bit. So you're doing your math wrong.
Posted on Reply
#37
danbert2000
I mean, come on. There are many devices outputting HDR over HDMI 2.0, it's not like these rules apply differently for AMD. They screwed up, it's not the spec's fault that the RX 480 can't output HDR10 over HDMI.
Posted on Reply
#38
Xzibit
danbert2000I mean, come on. There are many devices outputting HDR over HDMI 2.0, it's not like these rules apply differently for AMD. They screwed up, it's not the spec's fault that the RX 480 can't output HDR10 over HDMI.
Did you also miss the part of the source article that says:
Heise.deIn the test with the HDR-enabled game Shadow Warrior 2, AMD's Radeon RX 480 (GPU: Polaris) showed mostly a similar HDR image to Nvidia's GeForce GTX 1080.
You do realize HDR is end to end.
Posted on Reply
#40
Steevo
danbert2000I mean, come on. There are many devices outputting HDR over HDMI 2.0, it's not like these rules apply differently for AMD. They screwed up, it's not the spec's fault that the RX 480 can't output HDR10 over HDMI.
Do you gag on Nvidia cause they pay you or for the lulz? I only ask as Nvidia isn't doing deep color (HDR) over HDMI either. So essentially everything you have said is wrong.


Also, don't double post.
Posted on Reply
#41
danbert2000
SteevoDo you gag on Nvidia cause they pay you or for the lulz? I only ask as Nvidia isn't doing deep color (HDR) over HDMI either. So essentially everything you have said is wrong.


Also, don't double post.
RTFA. Nvidia supports HDR over HDMI. They also support 4K at 4:4:4 and RGB, something else that AMD's Polaris doesn't support.

Also, I may have gotten the 8-bit/10-bit encoding stuff wrong, but FordGT90Concept, you haven't really answered why you think that all these other 4K devices, like UHD Blu-ray players, Xbox One S, PS4 Pro, Nvidia Shield, and Chromecast Ultra, can send HDR at 4K over HDMI 2.0 but AMD can't because of the spec. That's just stupid.

Edit:
XzibitYou do realize HDR is end to end.
I have a 4K HDR TV; I realize that very well, thanks. Probably better than you, in fact. Some random journalist is comparing the image quality subjectively and that's somehow proof that Nvidia isn't sending HDR metadata? Please. The big issue is that AMD is not supporting full chroma resolution at 4K even at 8-bit, which means they may not have the bandwidth to run HDR, as 4:2:2 HDR uses similar bandwidth to 4:4:4 8-bit.
Posted on Reply
#42
bug
danbert2000Please stop repeating that 14.4 Gb/s number, you are wrong. Full bandwidth of HDMI 2.0b is 18 Gbps:

en.wikipedia.org/wiki/HDMI#Version_2.0
Actually the guy is right. Physically HDMI 2.0 and DP 1.2 will carry the same amount of data. But HDMI adds more protocol overhead, thus it carries less actual data.
This was a problem before, when it was discovered HDMI 2.0 couldn't do 4K at a certain fps (I don't recall the exact number) together with HDCP 2.2, while DP 1.2 was more than happy to oblige.
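For reference, the competing 14.4 Gb/s and 18 Gb/s figures being argued over differ only by the 8b/10b line coding both links use. A minimal sketch of that arithmetic, using the published HDMI 2.0 and DisplayPort 1.2 (HBR2) per-lane rates and lane counts rather than anything from the source article (the helper name is illustrative):

```python
# Raw link rate vs. usable payload after 8b/10b line coding, for the two
# interfaces debated in this thread (published per-lane rates and lane counts).
def payload_gbps(lanes, gbps_per_lane, coding_efficiency=8 / 10):
    raw = lanes * gbps_per_lane
    return raw, raw * coding_efficiency

for name, lanes, per_lane in [("HDMI 2.0", 3, 6.0), ("DisplayPort 1.2 (HBR2)", 4, 5.4)]:
    raw, payload = payload_gbps(lanes, per_lane)
    print(f"{name}: {raw:.1f} Gb/s raw -> ~{payload:.2f} Gb/s payload")
```

On those figures the two interfaces are not quite equal even before any further protocol overhead: DP 1.2 ends up with roughly 17.28 Gb/s of payload against HDMI 2.0's 14.4 Gb/s, matching the numbers quoted earlier in the thread.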
Posted on Reply
#43
danbert2000
bugActually the guy is right. Physically HDMI 2.0 and DP 1.2 will carry the same amount of data. But HDMI adds more protocol overhead, thus it carries less actual data.
I'll concede that HDMI does not have 18 Gbps bandwidth for just video and audio, but at no point does this prevent HDR from going over HDMI 2.0, like he was suggesting. It is AMD's fault if this isn't working, not HDMI 2.0. I have sent HDR over HDMI 2.0. There is no bandwidth issue, no matter what the number is.

EDIT:

This is what Vizio says their HDMI 2.0 ports support. Parens are my analysis of the article.
600MHz pixel clock rate:
2160p@60fps, 4:4:4, 8-bit (AMD doesn't support)
2160p@60fps, 4:2:2, 12-bit (PS4 Pro w/ AMD GPU supports, Polaris may not support)
2160p@60fps, 4:2:0, 12-bit (PS4 Pro w/ AMD GPU supports, Polaris may not support)
Posted on Reply
#44
FordGT90Concept
"I go fast!1!11!1!"
danbert2000Also, I may have gotten the 8-bit/10-bit encoding stuff wrong, but FordGT90Concept, you haven't really answered why you think that all these other 4K devices, like UHD Blu-ray players, Xbox One S, PS4 Pro, Nvidia Shield, and Chromecast Ultra, can send HDR at 4K over HDMI 2.0 but AMD can't because of the spec. That's just stupid.
They have to be doing 4:2:2 chroma subsampling, which Polaris can/does do. If it is at 4:4:4, then it has to be at a lower framerate (29.97 for ATSC or 50 for PAL).
danbert20002160p@60fps, 4:4:4, 8-bit (AMD doesn't support)
AMD does support that. 8-bpc is not HDR. HDR starts at 10-bpc.

I'm not quite sure how chroma subsampling translates to the number of bits in the stream, so I can't do the math on 4:2:2 or 4:2:0.

Edit: It's complicated... en.wikipedia.org/wiki/Chroma_subsampling ...not going to invest my time in figuring that out.
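For anyone who does want to do that math: in the J:a:b notation, 4:4:4 carries three samples per pixel on average, 4:2:2 two, and 4:2:0 one and a half. A minimal sketch under the same simplifying assumptions used earlier in the thread (no blanking or audio, HDMI 2.0 payload taken as roughly 14.4 Gb/s; the dictionary and helper names are illustrative):

```python
# Effective samples per pixel for common chroma subsampling schemes:
# 4:4:4 keeps full chroma (3 samples/pixel), 4:2:2 halves horizontal chroma
# (2 samples/pixel), 4:2:0 halves it both ways (1.5 samples/pixel).
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
HDMI_20_PAYLOAD_GBPS = 14.4  # assumed payload after 8b/10b coding

def data_rate_gbps(width, height, fps, bpc, subsampling):
    return width * height * fps * bpc * SAMPLES_PER_PIXEL[subsampling] / 1e9

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    rate = data_rate_gbps(3840, 2160, 60, 10, scheme)
    verdict = "fits" if rate <= HDMI_20_PAYLOAD_GBPS else "does not fit"
    print(f"2160p60 10 bpc {scheme}: ~{rate:.1f} Gb/s ({verdict})")
```

On those rough numbers, 10-bit 4:4:4 at 4K60 overshoots the assumed HDMI 2.0 payload, while 10-bit 4:2:2 (~10 Gb/s) and 4:2:0 (~7.5 Gb/s) fit, which lines up with the thread's point that HDR at 4K60 over HDMI 2.0 implies chroma subsampling.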
Posted on Reply
#45
danbert2000
The more I read the source article, the more I realize why btarunr didn't link to it. This article has no proof for its claims and no analysis of what they mean. This is a clickbait article posted as some sort of revelation and I was bamboozled. This is their "news":

"According to c't demand, a manager of the GPU manufacturer AMD explained that the current Radeon graphics cards RX 400 (Polaris GPUs) in PC games produce HDR images only with a color depth of 8 bits instead of 10 bits per color channel , When output via HDMI 2.0 to a HDR-compatible 4K TV. AMD uses a special dithering method with respect to the gamma curve (Perceptual Curve), in order to display the gradients as smooth as possible. HDR TVs still recognize an HDR signal and switch to the appropriate mode."

What is missing in there is an acknowledgement that Windows 10 doesn't support HDR at the moment, and it's questionable whether Shadow Warrior is actually outputting HDR metadata. They don't give any proof that the TVs are actually entering HDR mode. I apologize to others for trying to take this article at face value. At present, there is no confirmed support for HDR content of any sort on PC, irrespective of the GPU you have.

I expected better from Techpowerup.
Posted on Reply
#46
FordGT90Concept
"I go fast!1!11!1!"
Windows has long supported HDR because of professional software like Adobe Photoshop. 10-bpc used to be exclusive to workstation graphics cards (Quadro and FirePro), but that is no longer the case.

It's widely reported that Shadow Warrior 2 is the first PC title to support HDR (because of backing from NVIDIA).

To make HDR work, you need a GPU that supports 10 or more bits per color (I believe GTX 2## or newer and the HD 7### series or newer qualify), an operating system that supports it (Windows XP and newer should), a monitor that supports it (there are some... often spendy), and software that uses it (Shadow Warrior 2 is apparently the only game that meets those criteria now).
Posted on Reply
#47
Xzibit
danbert2000I have a 4K HDR TV, I realize that very well thanks. Probably better than you in fact. Some random journalist is comparing the image quality subjectively and that's somehow proof that Nvidia isn't sending HDR metadata? Please. The big issue is AMD is not supporting full chroma resolution over 4K even at 8 bit, which means they may not have the bandwidth to run HDR as 4:2:2 HDR uses similar bandwidth to 4:4:4 8-bit.
That's the point: you're speculating on a speculative article.
danbert2000The more I read the source article, the more I realize why btarunr didn't link to it. This article has no proof for its claims and no analysis of what they mean. This is a clickbait article posted as some sort of revelation and I was bamboozled. This is their "news":
He always links to the source article. People reading them is another thing.

The article doesn't take into consideration that HDR is end-to-end.

Like your M-series TV: it didn't have HDR until a firmware update in August, and even then HDR is still limited to the original standard TV spec of around 1k nits, which is basically SDR. It's no different than any other monitor/TV, but with a 20% dynamic backlight (80%-100%), which would be neat if it went past standard nits. The TV could have HDMI 2.0b+, but it would be held back by what the actual "image/color processor" (T-Com) inside the TV can handle, not by what the GPU, cable, or input can transmit.
Posted on Reply
#48
Steevo
danbert2000The more I read the source article, the more I realize why btarunr didn't link to it. This article has no proof for its claims and no analysis of what they mean. This is a clickbait article posted as some sort of revelation and I was bamboozled. This is their "news":

"According to c't demand, a manager of the GPU manufacturer AMD explained that the current Radeon graphics cards RX 400 (Polaris GPUs) in PC games produce HDR images only with a color depth of 8 bits instead of 10 bits per color channel , When output via HDMI 2.0 to a HDR-compatible 4K TV. AMD uses a special dithering method with respect to the gamma curve (Perceptual Curve), in order to display the gradients as smooth as possible. HDR TVs still recognize an HDR signal and switch to the appropriate mode."

What is missing in there is an acknowledgement that Windows 10 doesn't support HDR at the moment, and it's questionable whether Shadow Warrior is actually outputting HDR metadata. They don't give any proof that the TVs are actually entering HDR mode. I apologize to others for trying to take this article at face value. At present, there is no confirmed support for HDR content of any sort on PC, irrespective of the GPU you have.

I expected better from Techpowerup.
I run my TV with Deep Color, as it supports it at 12 bpc, and it is supported by my 7970; using fullscreen color gradient tests, it works. My 5870 supported 10 bpc and I used that too for many years, with Vista and 7.
www.techpowerup.com/forums/threads/do-10-bit-monitors-require-a-quadro-firepro-card.198031/#post-3068138

Since you obviously have the 1080 card, a 4K HDR monitor, and the time to spare, why not test it yourself and make a relevant post about something instead of thread-crapping?
Posted on Reply
#49
danbert2000
HDR is a superset of 10-bit support. Shared graphics mode on Windows 10 does not support HDR10 yet, nor do any desktop apps. HDR is 10-bit; 10-bit is not HDR.
mspoweruser.com/microsoft-bringing-native-hdr-display-support-windows-year/

Shadow Warrior will work with HDR on PC, but only in exclusive mode. Again, there's no proof from the original article that AMD is not pushing 10 bits in HDR mode. I think what is so frustrating is that the headline makes it sound like a fact, but there's no actual proof of whether it's a deficiency on AMD's part or whether the game is "faking" HDR in some way (not using 10-bit pixels in the entire rendering pipeline).

To Steevo, thanks for confirming you have no experience with HDR 10 or any 4K HDR format. Not that your shitbox could even push 4K anything.

To Xzibit, I completely missed the tiny source link, whoops. It looks like the source article is quoting yet another article, which makes this third-hand news. Still a whole bunch of non-news. As for my TV, I have a P-series now, and though the 1000-nit mastering is a big part of HDR, it's only part of it. The big deal is 10-bit color for better reds and greens and little to no dithering. I play UHD Blu-rays and Forza Horizon 3 on my TV and the difference is shocking. I never realized how much tone mapping they did in 8-bit content until I saw dark shadow detail and the sun in the same shot.
Posted on Reply