# AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0



## btarunr (Nov 18, 2016)

High-dynamic range, or HDR, is all the rage these days as the next big thing in display output, now that hardware has had time to catch up with ever-increasing display resolutions such as 4K Ultra HD, 5K, and the various ultra-wide formats. Hardware-accelerated HDR is getting a push from both AMD and NVIDIA in this round of GPUs. While games with HDR date back to Half-Life 2, hardware-accelerated formats that minimize work for game developers, in which the hardware makes sense of an image and adjusts its output range, are new and require substantial compute power. They also require additional interface bandwidth between the GPU and the display, since GPUs sometimes rely on wider color palettes such as 10 bpc (1.07 billion colors) to generate HDR images. AMD Radeon GPUs are facing difficulties in this area.

German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per color channel (16.7 million colors), or 32-bit, if your display (e.g., a 4K HDR-ready TV) is connected over HDMI 2.0 rather than DisplayPort 1.2 (and above). The desired 10-bits-per-channel (1.07 billion colors) palette is available only when your HDR display runs over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de observes that AMD GPUs reduce output sampling from the desired full YCbCr 4:4:4 color encoding to 4:2:2 or 4:2:0 (chroma sub-sampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.








----------



## RejZoR (Nov 18, 2016)

There must be a technical reason behind this, considering Polaris is a new GPU that now has HDMI 2.0 support (unlike the R9 Fury, which was still HDMI 1.4). The question is, why. It can't be cheating or GPU processing savings, since DisplayPort does work with HDR in full-fat mode. So, what is it? Unless the HDMI 2.0 support isn't actually HDMI 2.0. Somehow. That would kinda suck.


----------



## bug (Nov 18, 2016)

This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?


----------



## MyTechAddiction (Nov 18, 2016)

I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?


----------



## TheLostSwede (Nov 18, 2016)

MyTechAddiction said:


> I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?



The point is HDR, which the human eye can see.


----------



## zlatan (Nov 18, 2016)

Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: 8-bit with full 4:4:4 chroma, or 10-bit with 4:2:2 sub-sampling. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals; the industry doesn't have enough experience yet, so the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until games' tonemapping pipelines mature.


----------



## ShurikN (Nov 18, 2016)

> German tech publication Heise.de..........
> .........The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.



Something you forgot to mention



> In a test at heise they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed visual results similar to a GeForce GTX 1080. So it seems this is the case for Nvidia as well, and likely Nvidia is using a similar trick at 8-bit. Nvidia has not yet shared info on this, though. According to heise, they did see a decrease in performance with Nvidia, whereas the RX 480's performance remained the same.


----------



## RejZoR (Nov 18, 2016)

MyTechAddiction said:


> I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?



Just like the human eye can't see more than 24 fps, right? Things are not so simple for computers, especially before the image actually reaches the output display device. Those 24 bits are just the number of solid colors; you need an additional 8 bits to represent transparency, which is why we address it as 32-bit while the color space is actually 24-bit (+8-bit alpha).

I've been working with 3D raytracing, and the tool had an option for 40-bit image rendering, which significantly increased rendering time but reduced or even eliminated color banding in the output image (especially when it got scaled down later for storage as an image file, since mainstream formats like JPG, PNG or BMP don't support 40 bits).


----------



## snakefist (Nov 18, 2016)

[irony on]

Of course the human eye can see a 10-bit palette accurately - just as the human ear is capable of hearing cat-and-dog frequencies and clearly distinguishing the debilitating difference between 22.1 kHz and 24 kHz (let alone 48 kHz!). Also, all displays in use accurately show 100% of the visible spectrum, having no technological limitations whatsoever, so this is of utmost importance! All human vision and hearing is flawless, too... And we ALL need 40 MP cameras for pictures later reproduced on FHD or, occasionally, 4K displays... especially on mobile phone displays...

[irony off]

I've tried to fight this kind of blind belief in ridiculous claims for many years, but basically it's not worth it...


----------



## bug (Nov 18, 2016)

MyTechAddiction said:


> I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?


The human eye can actually distinguish fewer than a million colours, but the point is: which colours?
On one hand we have dynamic range, the difference between the darkest and brightest colour. The human eye has a certain dynamic range, but by adjusting the pupil it can shift this range up and down.
On the other hand, we have the computer, which handles colour in a discrete world. Discrete computing inevitably alters the data, thus we need the computer to work at a level of detail the human doesn't see; otherwise we end up with wrong colours and/or banding.

In short, for various reasons, computers need more info to work with than the naked eye can see.
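A toy sketch of that quantization point (my own illustration, not from the article): a subtle gradient quantized to 8 bits collapses into far fewer distinct steps than the same gradient at 10 bits, and each lost step widens a visible band.

```python
# Quantize a smooth luminance ramp at different bit depths and count
# how many distinct output levels survive. Fewer levels across the
# same range means wider runs of identical pixels, i.e. banding.

def quantize(value, bits):
    """Snap a float in [0, 1] to the nearest of 2**bits - 1 steps."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A subtle gradient covering only 5% of the brightness range,
# sampled at 1000 points (think of a dark sky in a game scene).
samples = [0.50 + 0.05 * i / 999 for i in range(1000)]

distinct_8bit = len({quantize(s, 8) for s in samples})
distinct_10bit = len({quantize(s, 10) for s in samples})

# 10-bit preserves roughly 4x as many steps (1023 vs. 255 levels).
print(distinct_8bit, distinct_10bit)
```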


----------



## heky (Nov 18, 2016)

Like zlatan has mentioned already, this is an HDMI 2.0 limitation, not an AMD/Nvidia one.


----------



## Prima.Vera (Nov 18, 2016)

MyTechAddiction said:


> I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?


Actually, the human eye can distinguish between 2 and 10 million colors, depending on age, eyesight quality, etc. However, HDR is supposedly good at removing color banding in games/movies...


----------



## natr0n (Nov 18, 2016)

HDR reminds me of NFS: MW 2005; it had an HDR setting way back when.


----------



## nemesis.ie (Nov 18, 2016)

The key here is that we are now getting displays that can output HDR images, which means better contrast, more visible detail in dark areas, and small, very bright points at the same time as dark areas, etc. I.e., the entire pipeline needs to support HDR to get the "full effect".


----------



## bug (Nov 18, 2016)

natr0n said:


> HDR reminds me of NFS: MW 2005; it had an HDR setting way back when.


Eh, it's not the same thing. That was tone mapping (see: https://en.wikipedia.org/wiki/Tone_mapping ). This time we're talking real HDR. Which is nice and has been around since forever, but it needs to shift huge amounts of data, which video cards are only now starting to be able to handle. Even so, be prepared for the same content-availability issues that have plagued 3D and 4K.


----------



## FordGT90Concept (Nov 18, 2016)

3840 x 2160 x 60 Hz x 30 bit = 14,929,920,000 bits, or *14.9 Gb/s*
HDMI 2.0 maximum data bandwidth: *14.4 Gb/s*

I said it before and I'll say it again: HDMI sucks. They're working on the HDMI 2.1 spec, likely to increase the bandwidth. Since HDMI runs off the DVI backbone that was created over a decade ago, the only way to achieve more bandwidth is shorter cables. HDMI is digging its own grave and has been for a long time.
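The arithmetic above generalizes to a one-liner; here's a quick sketch (my own, reproducing the figures quoted in this thread) that computes raw pixel payload only, ignoring the blanking intervals and 8b/10b-style line coding that real links add on top.

```python
# Raw video payload in Gb/s: width x height x refresh x bits per pixel.
# Back-of-the-envelope only; actual link-rate requirements are higher
# once blanking and line coding are included.

def payload_gbps(width, height, hz, bpc, channels=3):
    bits_per_pixel = bpc * channels          # e.g. 10 bpc x RGB = 30 bpp
    return width * height * hz * bits_per_pixel / 1e9

print(payload_gbps(3840, 2160, 60, 10))  # 14.92992 -> over 14.4 Gb/s
print(payload_gbps(3840, 2160, 60, 8))   # 11.943936 -> fits
print(payload_gbps(3840, 2160, 50, 10))  # 12.4416 -> fits (50 Hz, PAL-style)
```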


----------



## Xuper (Nov 18, 2016)

bug said:


> This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?



Nope. For 10-bit, 4K@60 Hz and 4:4:4 you need at least DP 1.3, far above HDMI 2.0. It supports 30/36/48-bit RGB colour (10/12/16 bits per channel) at 4K.

Edit: Oh, I found this:

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx

Look at the chart:
4K@60, 10-bit, 4:2:2: Pass
4K@60, 10-bit, 4:4:4: Fail


----------



## Solidstate89 (Nov 18, 2016)

MyTechAddiction said:


> I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?


You don't "remember" anything, because that is bullshit what you just said. It's like saying the human eye can't see more than 30FPS.

Also, this is a limitation of the HDMI spec. That's the entire purpose for HDMI 2.0a's existence is to add the necessary metadata stream for UHD-HDR to work.


----------



## bug (Nov 18, 2016)

Xuper said:


> Nope. For 10-bit, 4K@60 Hz and 4:4:4 you need at least DP 1.3, far above HDMI 2.0. It supports 30/36/48-bit RGB colour (10/12/16 bits per channel) at 4K.
> 
> Edit: Oh, I found this:
> 
> ...


I was commenting on this:


> German tech publication Heise.de discovered that *AMD Radeon GPUs render HDR games* ... *at a reduced color depth* of 8 bits per cell (16.7 million colors), or 32-bit; *if your display* (eg: 4K HDR-ready TV) *is connected over HDMI 2.0 and not DisplayPort 1.2 (and above)*.



The assertion is that it only reduces the colour depth over HDMI 2.0, while all is peachy on DP 1.2. Which, as I have said, is weird, because both HDMI 2.0 and DP 1.2 have the same bandwidth. Still, it could be HDMI's overhead that makes the difference once again.


----------



## eidairaman1 (Nov 18, 2016)

Not a fan of HDMI anyway, such a weaksauce connector standard.


----------



## prtskg (Nov 18, 2016)

bug said:


> This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?


So you are in the driver. Sorry couldn't resist it.



ShurikN said:


> Something you forgot to mention
> 
> 
> > In a test at heise they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed visual results similar to a GeForce GTX 1080. So it seems this is the case for Nvidia as well, and likely Nvidia is using a similar trick at 8-bit. Nvidia has not yet shared info on this, though. According to heise, they did see a decrease in performance with Nvidia, whereas the RX 480's performance remained the same.


That's quite the thing he missed.


----------



## bug (Nov 18, 2016)

prtskg said:


> So you are in the driver. Sorry couldn't resist it.



I meant another bug :blushing:


----------



## qubit (Nov 18, 2016)

zlatan said:


> Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: 8-bit with full 4:4:4 chroma, or 10-bit with 4:2:2 sub-sampling. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals; the industry doesn't have enough experience yet, so the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until games' tonemapping pipelines mature.


This would mean that NVIDIA has the same limitation, then. I personally have no idea, as I'm not familiar with the intimate details of the HDMI spec and am taking your word for it.

@btarunr Do you want to check zlatan's point and update the article if he's right?


----------



## Prima.Vera (Nov 18, 2016)

This time it's not AMD's fault, but the HDMI standard's. nVidia has THE SAME issue with HDMI 2.0, btw.
Also, not sure why those 4K HDR TVs aren't provided with at least DP 1.2 interfaces as well...


----------



## GhostRyder (Nov 18, 2016)

Hence why I prefer Display Port...


----------



## ShurikN (Nov 18, 2016)

Prima.Vera said:


> Also, not sure why those 4K HDR TVs aren't provided with at least DP 1.2 interfaces as well...


I believe there are some, but they are not that great in number. Now, I understand a 1080p screen not having DP, but for a 4K one, it's ridiculous.


----------



## bug (Nov 18, 2016)

ShurikN said:


> I believe there are some, but they are not that great in number. Now, I understand a 1080p screen not having DP, but for a 4K one, it's ridiculous.


Just be thankful 4k doesn't require MST, like it did in the beginning.
TVs aren't really meant for 60fps, because they're not primarily aimed at gamers. At 24 or 30 they can get away with older interfaces.


----------



## qubit (Nov 18, 2016)

bug said:


> TVs aren't really meant for 60fps, because they're not primarily aimed at gamers. At 24 or 30 they can get away with older interfaces.


I've never heard of a TV that can't work at 60Hz. It doesn't make sense, because then it wouldn't be fully compatible with the NTSC spec and interlaced scanning, which it has to be.

In the UK it's the PAL spec and 50Hz interlaced.


----------



## Solidstate89 (Nov 18, 2016)

qubit said:


> I've never heard of a TV that can't work at 60Hz. It doesn't make sense, because then it wouldn't be fully compatible with the NTSC spec and interlaced scanning, which it has to be.
> 
> In the UK it's the PAL spec and 50Hz interlaced.


He didn't say they can't work at 60Hz, just that they don't need to unless you have an HTPC hooked up to it, as no movie and almost all console games will run at 30 FPS or less. So they could get away with running HDMI 1.3 or 1.4 at 4K at 30Hz.

Most TVs these days run at a full 60Hz, at full 4K, at full gamut as well. This was only an issue for early adopters.


----------



## m1dg3t (Nov 18, 2016)

zlatan said:


> Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: 8-bit with full 4:4:4 chroma, or 10-bit with 4:2:2 sub-sampling. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals; the industry doesn't have enough experience yet, so the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until games' tonemapping pipelines mature.





ShurikN said:


> Something you forgot to mention



Of course, everyone loves to shit on AMD. Titling it this way grabs more clicks too. Smooth move, bta


----------



## eidairaman1 (Nov 18, 2016)

GhostRyder said:


> Hence why I prefer Display Port...



It works for VGA


----------



## Xzibit (Nov 18, 2016)

Well this is interesting.



Heise.de said:

> we found that enabling HDR on Nvidia graphics cards cost some performance, while the frame rate on the Radeon RX 480 remained constant.


----------



## Steevo (Nov 18, 2016)

Xzibit said:


> Well this is interesting.


Nvidia cuts corners in quality for performance; they always have.

Looking at TXAA vs other AA effects, TXAA looks like it removes some lighting passes and some detail while performing AA.


----------



## FordGT90Concept (Nov 18, 2016)

bug said:


> The assertion is that it only reduces the colour depth over HDMI 2.0, while all is peachy on DP 1.2. Which, as I have said, is weird, because both HDMI 2.0 and DP 1.2 have the same bandwidth. Still, it could be HDMI's overhead that makes the difference once again.


DisplayPort 1.2 is 17.28 Gb/s, which is enough for 2160p60 @ 10-bpc.



Prima.Vera said:


> This time is not AMD fault, but HDMI standard one. nVidia has THE SAME issue with HDMI 2.0 btw.
> Also not sure why those 4K HDR TVs are not provided with at least DP1.2 interfaces also...


Because Hollywood. The same people that produce content for TVs are invested in HDMI, which they make royalties off of. DisplayPort is almost royalty-free, so they'd be cutting into their own pocketbook. HDMI isn't going anywhere because of that, which is why I'm moving to IPTV boxes and monitors, taking me away entirely from the HDMI ecosystem.



qubit said:


> In the UK it's the PAL spec and 50Hz interlaced.


I think 2160p50 @ 10-bpc does work on HDMI 2.0.  Edit: Yup, 12.4 Gb/s which is under the 14.4 Gb/s HDMI 2.0 can handle.


----------



## danbert2000 (Nov 18, 2016)

FordGT90Concept said:


> I think 2160p50 @ 10-bpc does work on HDMI 2.0.  Edit: Yup, 12.4 Gb/s which is under the 14.4 Gb/s HDMI 2.0 can handle.



Please stop repeating that 14.4 Gb/s number; you are wrong. The full bandwidth of HDMI 2.0b is 18 Gbps:

https://en.wikipedia.org/wiki/HDMI#Version_2.0

What I gleaned from this article is that AMD is unable to push HDR10 at any chroma resolution over HDMI, even 4:2:0. This is bad news for anyone who wants an AMD card in an HTPC, as it won't be able to output HDR movies or games to your fancy 4K TV. Let's hope they can somehow fix this in the drivers. Nvidia has driver support for HDR, but sadly there are almost no games that support it now, and Hollywood is restricting access to 4K and HDR video content on PC. Nvidia apparently is working with Netflix and devs, but there's been no news for a while. Also, Windows 10 does not support HDR in shared mode with the desktop, so until they fix that, HDR will only work in exclusive fullscreen. Lots of work to be done on the PC side; luckily, at least Nvidia cards will be ready when the content is.

http://wccftech.com/nvidia-pascal-g...pport-for-games-and-4k-streaming-for-movies/

What I find curious is that the PS4 Pro and Xbox One S have no issue with HDR at 4:2:0 over HDMI 2.0. Sony made a big deal of having a more up-to-date architecture than the Polaris used in the RX 480, and Microsoft used a newer version of the architecture for their die shrink (as evidenced by the 16 nm process and H.265 support). I really hope, for AMD owners' sake, that the Polaris in the 480 isn't missing this feature, as HDR is awesome and makes a bigger difference than 4K for gaming, in my humble opinion.


----------



## FordGT90Concept (Nov 18, 2016)

14.4 Gb/s is payload; 18 Gb/s includes overhead. See the table under "version comparison."

http://www.vesa.org/wp-content/uplo...evCon-Presentation-DP-1.2-Dec-2010-rev-2b.pdf
"video data bandwidth to 2160Mbytes/sec", which is 17.28 Gb/s as stated above.

HDMI 2.0 can only do 2160p60 10-bit using 4:2:2 chroma sampling, as AMD's slide says.


----------



## danbert2000 (Nov 18, 2016)

FordGT90Concept said:


> 14.4 Gb/s is payload, 18 Gb/s includes overhead.


Yeah, overhead for 10 bit. So you're doing your math wrong.


----------



## danbert2000 (Nov 18, 2016)

I mean, come on. There are many devices outputting HDR over HDMI 2.0, it's not like these rules apply differently for AMD. They screwed up, it's not the spec's fault that the RX 480 can't output HDR10 over HDMI.


----------



## Xzibit (Nov 18, 2016)

danbert2000 said:


> I mean, come on. There are many devices outputting HDR over HDMI 2.0, it's not like these rules apply differently for AMD. They screwed up, it's not the spec's fault that the RX 480 can't output HDR10 over HDMI.



Did you also miss the part of the source article that says:



Heise.de said:

> In the test with the HDR-enabled game Shadow Warrior 2, AMD's Radeon RX 480 (GPU: Polaris) showed mostly a similar HDR image to Nvidia's GeForce GTX 1080.



You do realize HDR is end to end.


----------



## FordGT90Concept (Nov 18, 2016)

danbert2000 said:


> Yeah, overhead for 10 bit. So you're doing your math wrong.


https://en.wikipedia.org/wiki/8b/10b_encoding


----------



## Steevo (Nov 18, 2016)

danbert2000 said:


> I mean, come on. There are many devices outputting HDR over HDMI 2.0, it's not like these rules apply differently for AMD. They screwed up, it's not the spec's fault that the RX 480 can't output HDR10 over HDMI.




Do you gag on Nvidia cause they pay you or for the lulz? I only ask as Nvidia isn't doing deep color (HDR) over HDMI either. So essentially everything you have said is wrong.


Also, don't double post.


----------



## danbert2000 (Nov 18, 2016)

Steevo said:


> Do you gag on Nvidia cause they pay you or for the lulz? I only ask as Nvidia isn't doing deep color (HDR) over HDMI either. So essentially everything you have said is wrong.
> 
> 
> Also, don't double post.



RTFA. Nvidia supports HDR over HDMI. They also support 4K at 4:4:4 and RGB, something else that AMD's Polaris doesn't support.

Also, I may have gotten the 8b/10b encoding stuff wrong, but FordGT90Concept, you haven't really answered why you think all these other 4K devices (UHD Blu-ray players, Xbox One S, PS4 Pro, Nvidia Shield, Chromecast Ultra) can send HDR at 4K over HDMI 2.0 but AMD can't because of the spec. That's just stupid.

Edit:



Xzibit said:


> You do realize HDR is end to end.



I have a 4K HDR TV; I realize that very well, thanks. Probably better than you, in fact. Some random journalist comparing the image quality subjectively is somehow proof that Nvidia isn't sending HDR metadata? Please. The big issue is that AMD is not supporting full chroma resolution at 4K even at 8-bit, which means they may not have the bandwidth to run HDR, as 4:2:2 HDR uses similar bandwidth to 4:4:4 8-bit.


----------



## bug (Nov 18, 2016)

danbert2000 said:


> Please stop repeating that 14.4 Gb/s number, you are wrong. Full bandwidth of HDMI 2.0b is 18 Gbps:
> 
> https://en.wikipedia.org/wiki/HDMI#Version_2.0



Actually, the guy is right. Physically, HDMI 2.0 and DP 1.2 carry the same amount of data, but HDMI adds more protocol overhead, so it carries less actual payload.
This was a problem before, when it was discovered that HDMI 2.0 couldn't do 4K at some certain fps (I don't recall the exact number) _together with HDCP 2.2_, while DP 1.2 was more than happy to oblige.


----------



## danbert2000 (Nov 18, 2016)

bug said:


> Actually, the guy is right. Physically, HDMI 2.0 and DP 1.2 carry the same amount of data, but HDMI adds more protocol overhead, so it carries less actual payload.



I'll concede that HDMI does not have 18 Gbps of bandwidth for just video and audio, but at no point does this prevent HDR from going over HDMI 2.0, as he was suggesting. It is AMD's fault if this isn't working, not HDMI 2.0's. I have sent HDR over HDMI 2.0. There is no bandwidth issue, no matter what the number is.

EDIT:

This is what Vizio says their HDMI 2.0 ports support, at a 600 MHz pixel clock rate. Parentheses are my analysis of the article.
2160p@60fps, 4:4:4, 8-bit (AMD doesn't support)
2160p@60fps, 4:2:2, 12-bit (PS4 Pro w/ AMD GPU supports, Polaris may not support)
2160p@60fps, 4:2:0, 12-bit (PS4 Pro w/ AMD GPU supports, Polaris may not support)


----------



## FordGT90Concept (Nov 18, 2016)

danbert2000 said:


> Also, I may have gotten the 8b/10b encoding stuff wrong, but FordGT90Concept, you haven't really answered why you think all these other 4K devices (UHD Blu-ray players, Xbox One S, PS4 Pro, Nvidia Shield, Chromecast Ultra) can send HDR at 4K over HDMI 2.0 but AMD can't because of the spec. That's just stupid.


They have to be doing 4:2:2 chroma sampling, which Polaris can and does do. If it is 4:4:4, then it has to be at a lower framerate (29.97 for ATSC or 50 for PAL).



danbert2000 said:


> 2160p@60fps, 4:4:4, 8-bit (AMD doesn't support)


AMD does support that. 8 bpc is not HDR; HDR starts at 10 bpc.

I'm not quite sure how chroma subsampling translates to the number of bits in the stream, so I can't do the math for 4:2:2 or 4:2:0.

Edit: It's complicated... https://en.wikipedia.org/wiki/Chroma_subsampling ...not going to invest my time in figuring that out.
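For what it's worth, the chroma math is fairly mechanical: in J:a:b notation, a J x 2 block of pixels carries 2J luma samples plus (a + b) samples each of Cb and Cr, so the average bits per pixel scale accordingly. A rough sketch (my own illustration, raw payload only):

```python
# Average bits per pixel for Y'CbCr under J:a:b chroma subsampling.
# A J x 2 pixel block carries 2*J luma samples and (a + b) samples
# each of Cb and Cr. So 4:4:4 -> 3 x bpc, 4:2:2 -> 2 x bpc, and
# 4:2:0 -> 1.5 x bpc on average.

def bits_per_pixel(bpc, j, a, b):
    luma = 2 * j                  # luma samples in the J x 2 block
    chroma = 2 * (a + b)          # Cb + Cr samples in the block
    return bpc * (luma + chroma) / (2 * j)

def payload_gbps(width, height, hz, bpp):
    """Raw payload only; blanking and line coding not included."""
    return width * height * hz * bpp / 1e9

for scheme in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    bpp = bits_per_pixel(10, *scheme)  # 10 bpc, HDR10-style depth
    print(scheme, bpp, round(payload_gbps(3840, 2160, 60, bpp), 2))
```

On this arithmetic, 2160p60 at 10-bit 4:2:2 needs only two-thirds the payload of 4:4:4, which is why it squeezes through where full-chroma 10-bit does not.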


----------



## danbert2000 (Nov 18, 2016)

The more I read the source article, the more I realize why btarunr didn't link to it. This article has no proof for its claims and no analysis of what they mean. This is a clickbait article posted as some sort of revelation and I was bamboozled. This is their "news":

"According to an inquiry by c't, a manager at GPU maker AMD explained that the current Radeon RX 400 graphics cards (Polaris GPUs) produce HDR images in PC games only with a color depth of 8 bits instead of 10 bits per color channel when outputting via HDMI 2.0 to an HDR-compatible 4K TV. AMD uses a special dithering method with respect to the gamma curve (perceptual curve) in order to display gradients as smoothly as possible. HDR TVs still recognize an HDR signal and switch to the appropriate mode."

What is missing in there is an acknowledgement that Windows 10 doesn't support HDR at the moment, and it's questionable whether Shadow Warrior is actually outputting HDR metadata. They don't give any proof that the TVs are actually entering HDR mode. I apologize to others for trying to take this article at face value. At present, there is no confirmed support for HDR content of any sort on PC, irrespective of the GPU you have.

I expected better from Techpowerup.


----------



## FordGT90Concept (Nov 18, 2016)

Windows has long supported HDR because of professional software like Adobe Photoshop. 10 bpc used to be exclusive to workstation graphics cards (Quadro and FirePro), but that is no longer the case.

It's widely reported that Shadow Warrior 2 is the first PC title to support HDR (because of backing from NVIDIA).

To make HDR work, you need a GPU that supports 10 or more bits per color (I believe GTX 2## or newer and HD 7### series or newer qualify), an operating system that supports it (Windows XP and newer should), a monitor that supports it (there are some... often spendy), and software that uses it (Shadow Warrior 2 is apparently the only game that meets that criteria right now).


----------



## Xzibit (Nov 18, 2016)

danbert2000 said:


> I have a 4K HDR TV; I realize that very well, thanks. Probably better than you, in fact. Some random journalist comparing the image quality subjectively is somehow proof that Nvidia isn't sending HDR metadata? Please. The big issue is that AMD is not supporting full chroma resolution at 4K even at 8-bit, which means they may not have the bandwidth to run HDR, as 4:2:2 HDR uses similar bandwidth to 4:4:4 8-bit.



That's the point: you're speculating on a speculative article.



danbert2000 said:


> The more I read the source article, *the more I realize why btarunr didn't link to it*. This article has no proof for its claims and no analysis of what they mean. This is a clickbait article posted as some sort of revelation and I was bamboozled. This is their "news":



He always links to the source article. People reading them is another thing.

The article doesn't take into consideration that HDR is end to end.

Like your M-series TV: it didn't have HDR until a firmware update in August, and even then HDR is still limited to the original standard TV spec of around 1k nits, which is basically SDR. No different than any other monitor/TV, but with a 20% dynamic backlight (80%-100%), which would be neat if it went past standard nits. The TV could have HDMI 2.0b+, but it would be held back by what the actual "image/color processor" inside the TV (T-Con) can handle, not by what the GPU, cable, or input can transmit.


----------



## Steevo (Nov 18, 2016)

danbert2000 said:


> The more I read the source article, the more I realize why btarunr didn't link to it. This article has no proof for its claims and no analysis of what they mean. This is a clickbait article posted as some sort of revelation and I was bamboozled. This is their "news":
> 
> "According to an inquiry by c't, a manager at GPU maker AMD explained that the current Radeon RX 400 graphics cards (Polaris GPUs) produce HDR images in PC games only with a color depth of 8 bits instead of 10 bits per color channel when outputting via HDMI 2.0 to an HDR-compatible 4K TV. AMD uses a special dithering method with respect to the gamma curve (perceptual curve) in order to display gradients as smoothly as possible. HDR TVs still recognize an HDR signal and switch to the appropriate mode."
> 
> ...




I run my TV with Deep Color, as it supports 12 bpc, and it is supported by my 7970; using fullscreen color-gradient tests, it works. My 5870 supported 10 bpc and I used that too for many years, with Vista and 7.
https://www.techpowerup.com/forums/...re-a-quadro-firepro-card.198031/#post-3068138

Since you obviously have the 1080 card, a 4K HDR monitor, and the time to spare, why not test it yourself and make a relevant post about something instead of thread-crapping.


----------



## danbert2000 (Nov 19, 2016)

HDR is a superset of 10-bit support. Shared graphics mode on Windows 10 does not support HDR10 yet, nor do any desktop apps. HDR is 10-bit, but 10-bit is not HDR.
https://mspoweruser.com/microsoft-bringing-native-hdr-display-support-windows-year/

Shadow Warrior will work with HDR on PC, but only in exclusive mode. Again, there's no proof from the original article that AMD is not pushing 10 bits in HDR mode. What's so frustrating is that the headline makes it sound like a fact, but there's no actual proof of whether it's a deficiency on AMD's part or the game is "faking" HDR in some way (not using 10-bit pixels in the entire rendering pipeline).

To Steevo: thanks for confirming you have no experience with HDR10 or any 4K HDR format. Not that your shitbox could even push 4K anything.

To Xzibit: I completely missed the tiny source link, whoops. It looks like the source article is quoting yet another article, which makes this third-hand news. Still a whole bunch of non-news. As for my TV, I have a P-series now, and though the 1000-nit mastering is a big part of HDR, it's only part. The big deal is 10-bit color for better reds and greens and little to no dithering. I play UHD Blu-rays and Forza Horizon 3 on my TV, and the difference is shocking. I never realized how much tone mapping they did in 8-bit content until I saw dark shadow detail and the sun in the same shot.


----------



## FordGT90Concept (Nov 19, 2016)

danbert2000 said:


> HDR is a superset of 10-bit support. Shared graphics mode on Windows 10 does not support HDR10 yet, nor do any desktop apps. HDR is 10-bit, but 10-bit is not HDR.
> https://mspoweruser.com/microsoft-bringing-native-hdr-display-support-windows-year/


Pretty sure they're talking about Microsoft's generic drivers (GPU and display), not manufacturer drivers.


----------



## danbert2000 (Nov 19, 2016)

FordGT90Concept said:


> Pretty sure they're talking about Microsoft's generic drivers (GPU and display), not manufacturer drivers.



From Nvidia:

"If your application does not run in full-screen exclusive mode, the desktop compositor will strip the extra range and precision necessary for HDR. It is important to understand that this is a temporary restriction as Microsoft announced plans for OS support for HDR."

https://developer.nvidia.com/displaying-hdr-nuts-and-bolts


----------



## Steevo (Nov 19, 2016)

danbert2000 said:


> To Steevo: thanks for confirming you have no experience with HDR10 or any 4K HDR format. Not that your shitbox could even push 4K anything. Asshole.




Thanks sweetie, I love you too.


----------



## FordGT90Concept (Nov 19, 2016)

Ah, yeah, I read that Shadow Warrior 2 had to run full screen to work with HDR.  That makes sense: running windowed, it has to match the desktop, which right now is 8 bpc.  Microsoft aims to make the whole Explorer shell HDR-compatible so that's no longer a limitation (it would improve the look of the OS too).


----------



## danbert2000 (Nov 19, 2016)

I don't know what's taking Microsoft so long. This is holding up all sorts of goodies for HTPC users. It's... awkward for me to have to switch to my Xbox One S instead of my PC for Forza to enjoy the HDR goodness.


----------



## Xuper (Nov 19, 2016)

danbert2000 said:


> HDR is a superset of 10 bit support. Shared graphics mode on Windows 10 does not support HDR 10 yet, nor do any desktop apps. HDR is 10 bit, 10 bit is not HDR.
> https://mspoweruser.com/microsoft-bringing-native-hdr-display-support-windows-year/
> 
> Shadow Warrior will work with HDR on PC, but only in exclusive mode. Again, there's no proof from the original article that AMD is not pushing 10 bits in HDR mode. I think that's what is so frustrating is that the headline makes it sound like a fact but there's no actual proof that it's a deficiency on AMD's part or if the game is "faking" HDR in some way (not using 10 bit pixels in the entire rendering pipeline).
> ...



What the hell? Is that allowed on this forum? No need to be angry. Also reported.



danbert2000 said:


> I'll concede that HDMI does not have 18 Gbps bandwidth for just video and audio, but at no point does this prevent HDR from going over HDMI 2.0, like he was suggesting. It is AMD's fault if this isn't working, not HDMI 2.0. I have sent HDR over HDMI 2.0. There is no bandwidth issue, no matter what the number is.
> 
> EDIT:
> 
> ...



When the HDMI 2.0 FAQ says it's not possible to carry 4K@60, 10-bit, 4:4:4 over HDMI 2.0, then you can't say it's AMD's fault! Also, where did you get that AMD can't do that? Proof?
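For what it's worth, the limit the FAQ describes can be sanity-checked with a little TMDS clock arithmetic. This is only a rough sketch: the 600 MHz TMDS ceiling and the 4400x2250 total timing for 4K60 come from the published HDMI 2.0 / CTA-861 figures, and the 4:2:2 packing rule reflects how HDMI carries deep color in that mode.

```python
# Why 4K60 10-bit 4:4:4 exceeds HDMI 2.0 (sketch; figures from the specs).
PIXEL_CLOCK_HZ = 4400 * 2250 * 60          # CTA-861 total timing for 4K60: 594 MHz

def tmds_clock_hz(bpc, subsampling="4:4:4"):
    """TMDS clock for a given bit depth and subsampling.
    In 4:2:2 mode HDMI packs up to 12-bit into the same container,
    so the clock does not scale with depth; in 4:4:4 it scales by bpc/8."""
    if subsampling == "4:2:2":
        return PIXEL_CLOCK_HZ
    return PIXEL_CLOCK_HZ * bpc / 8

HDMI20_MAX_TMDS_HZ = 600e6  # HDMI 2.0 tops out at a 600 MHz TMDS clock

assert tmds_clock_hz(8) <= HDMI20_MAX_TMDS_HZ           # 594 MHz: fits
assert tmds_clock_hz(10) > HDMI20_MAX_TMDS_HZ           # 742.5 MHz: does not
assert tmds_clock_hz(12, "4:2:2") <= HDMI20_MAX_TMDS_HZ # fits via subsampling
```

So 8-bit 4:4:4 and 12-bit 4:2:2 both fit at 4K60, while 10-bit 4:4:4 does not, regardless of vendor.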


----------



## Steevo (Nov 19, 2016)

behrouz said:


> What the hell ? Is it allowed in Forum? No need to be angry.Also reported.
> 
> 
> 
> When HDMI 2.0 FAQ says It's not possible to carry 4K@60 , 10 bit , 4:4:4 over HDMI 2.0 then You can't say it's AMD's fault! also , Where did you get that AMD can't do that ? proof?


Whoa whoa whoa there, we were merely having a spirited discussion.


----------



## Prima.Vera (Nov 19, 2016)

Steevo said:


> Nvidia cuts corners in quality for performance, they always have.
> 
> Looking at TSAA vs other AA effects, TSAA looks like it removes some lighting passes, and some detail while performing AA.


I never understood why game developers are still pushing those shitty, crappy technologies like FXAA or TSAA when there are way better solutions out there. COD:IW and DOOM both have beautiful AA settings with very low performance impact.


----------



## Mistral (Nov 19, 2016)

I guess the "nVidia GF GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0" article will be saved for another slow news day, eh?..


----------



## AlienIsGOD (Nov 19, 2016)

Getting called out by TweakTown for clickbait... really should have fact-checked this one @btarunr

http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html


----------



## nemesis.ie (Nov 19, 2016)

I think it would be good if Bta strikes through the article and issues an apology at this point.

Otherwise it will be hard to take any future articles seriously. (IMO, of course)


----------



## efikkan (Nov 19, 2016)

MyTechAddiction said:


> I remember that the human eye can`t distinguish more that 16 milion colors anyway.Whats the point for more except for marketing purposes?


16.8 million colors gives a maximum of 256 gradations between any two colors, which is clearly not enough. Human vision can distinguish more than 16.8 million colors across the field of vision.
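For reference, those figures fall straight out of the per-channel bit depth; quick arithmetic:

```python
# The color counts behind "16.8 million" vs "1.07 billion" (pure arithmetic).
colors_8bit = (2 ** 8) ** 3     # 256 levels per channel, 3 channels
colors_10bit = (2 ** 10) ** 3   # 1024 levels per channel

assert colors_8bit == 16_777_216          # ~16.8 million
assert colors_10bit == 1_073_741_824      # ~1.07 billion
assert 2 ** 10 // 2 ** 8 == 4             # 10-bit gives 4x the gradations per channel
```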



natr0n said:


> HDR reminds me of nfs:mw 2005 it had an hdr setting way back when.


Games have supported HDR internally for more than a decade, what's been missing is a way to display it. For this reason games have applied a tone mapping technique called *bloom*, which looks awful.



Steevo said:


> Looking at TSAA vs other AA effects, TSAA looks like it removes some lighting passes, and some detail while performing AA.


Any expert in visualization knows that AA-techniques utilizing post processing effects will degrade the overall visual quality, that includes versions of temporal antialiasing. Such techniques will effectively *blur* the picture, which defeats the purpose of higher resolutions in the first place. Stick with proper AA techniques, like MSAA, CSAA or the best: SSAA.


----------



## bug (Nov 19, 2016)

efikkan said:


> Any expert in visualization knows that AA-techniques utilizing post processing effects will degrade the overall visual quality, that includes versions of temporal antialiasing. Such techniques will effectively *blur* the picture, which defeats the purpose of higher resolutions in the first place. Stick with proper AA techniques, like MSAA, CSAA or the best: SSAA.



And the explanation for that is really simple: in post processing, the processor doesn't actually know what constitutes an edge to be anti-aliased, it guesses.


----------



## Ungari (Nov 19, 2016)

ShurikN said:


> Something you forgot to mention



A glaring omission?


----------



## doudou (Nov 19, 2016)

i found this about this consept http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html 
as a poster may need to do more research before posting


----------



## AlienIsGOD (Nov 19, 2016)

doudou said:


> i found this about this consept http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html
> as a poster may need to do more research before posting


do you even bother to read? i posted that already 5 posts ago.  how about you do some research lol


----------



## doudou (Nov 19, 2016)

AlienIsGOD said:


> do you even bother to read? i posted that already 5 posts ago.  how about you do some research lol


i happen to correct my information today so i post it for it maybe a good for someone else and it way yesterday that i read it from techpowerup main page


----------



## efikkan (Nov 19, 2016)

bug said:


> And the explanation for that is really simple: in post processing, the processor doesn't actually know what constitutes an edge to be anti-aliased, it guesses.


Yes, all the geometry information is lost during rasterization, which is the process of transforming the scene from a 3D world into a sampled 2D picture. After the fragment/pixel shaders you have even less information, basically only the "finished" picture. The realm of possibilities at that point is similar to what you can do in a photo editor, so not much really. You can never regenerate lost information.

MSAA, CSAA and SSAA all work by increasing the sampling during rasterization, which increases the data available for each pixel. That is simply the only way to create a quality rendering.
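The supersampling resolve efikkan describes can be sketched in a few lines: render more samples per pixel, then average them down. A toy example, assuming a grayscale framebuffer as nested lists rather than any real graphics API:

```python
def ssaa_downsample(hi_res, factor=2):
    """Toy SSAA resolve: average each factor x factor block of the
    high-resolution render into one output pixel."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))  # average the sub-samples
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution...
edge = [[0, 0, 1, 1],
        [0, 0, 1, 1],
        [0, 1, 1, 1],
        [1, 1, 1, 1]]
# ...resolves to fractional coverage values along the edge.
resolved = ssaa_downsample(edge)  # -> [[0.0, 1.0], [0.75, 1.0]]
```

The fractional 0.75 pixel is exactly the extra per-pixel information post-process AA never has, because it only sees the final 1-sample-per-pixel image.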


----------



## Monsuta (Nov 19, 2016)

This article has become a joke, and others are laughing at TPU for making a misleading article based on a clickbait article which TPU didn't bother confirming before posting.
Is TPU still holding a grudge over the "reviews need to be fair" thing and all the other AMD-related whatever?


----------



## m1dg3t (Nov 19, 2016)

Monsuta said:


> This article became a joke, and others are laughing about it for making misleading article based on a click bait article which TPU didn't bother confirming before posting.
> Does TPU still holding a grudge over the "reviews need to be fair." thing & all the other AMD related whatever ?



Ever since nVidia invited W1zzard to that 'CONference' back at the GTX 680 release, there has been a large shift in TPU's 'attitude'.

Gotta keep those favours coming in I guess? 

There is clear bias, IMHO. w1z prolly gonna ip ban me now hahaha


----------



## AlienIsGOD (Nov 19, 2016)

doudou said:


> i happen to correct my information today so i post it for it maybe a good for someone else and it way yesterday that i read it from techpowerup main page


im sorry if english isnt your 1st language, but this sentence makes no sense lol


----------



## wiak (Nov 19, 2016)

Another reason to prefer modern DisplayPort above all else; want the best-quality pixels? Use DisplayPort.
There were also some issues with NVIDIA doing weird HDMI tricks in the early days of HDMI 2.0, when they transmitted 4:2:0 instead of 4:4:4 to get 4K60.


----------



## doudou (Nov 19, 2016)

AlienIsGOD said:


> im sorry if english isnt your 1st language, but this sentence makes no sense lol


haha english is not my first language 
i just basically said that: i did read tweakdown post today and it did basically correct the misleading information about this post that i did ready yesterday. (i basically write it just like how i would spell it in my main language and forget that in my main language we use a lot of consciences connected in words to shrink the size of phrases because i was kinda rush xD  )
and sorry if i did repeat the link of tweakdown since after that i did read it today i did post over here and i didn't read the older post


----------



## AlienIsGOD (Nov 19, 2016)

doudou said:


> haha english is not my first language
> i just basically said that: i did read tweakdown post today and it did basically correct the misleading information about this post that i did ready yesterday. (i basically write it just like how i would spell it in my main language and forget that in my main language we use a lot of consciences connected in words to shrink the size of phrases because i was kinda rush xD  )
> and sorry if i did repeat the link of tweakdown since after that i did read it today i did post over here and i didn't read the older post


all good my friend, was having an annoying morning was all.  thanks for clearing up


----------



## Xuper (Nov 19, 2016)

> German tech publication Heise.de *discovered* that AMD Radeon GPUs render HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per cell (16.7 million colors), or 32-bit; if your display (eg: 4K HDR-ready TV) is connected over HDMI 2.0 and not DisplayPort 1.2 (and above). The desired 10 bits per cell (1.07 billion colors) palette is available only when your HDR display runs over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de observes that AMD GPUs reduce output sampling from the desired Full YCrBr 4: 4: 4 color scanning to 4: 2: 2 or 4: 2: 0 (color-sub-sampling / chroma sub-sampling), when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.



I really don't know why this is news (AMD told them before launching the RX 480). Here's a direct link from this TPU article:







Look at 3840x2160 @ 60 Hz (4:2:2).

You *discovered* this?! Here's the footnote:


> *Under embargo until June 29, 2016 at 9 am EST.*


Holy Mother of GOD! They found it out on Nov 17, 2016 at 11:07 pm.


----------



## Monsuta (Nov 20, 2016)

behrouz said:


> I really don't know Why it's News! ( when AMD told them before Launching RX480) , Here direct Link from this TPU's Article
> 
> Look at 3840x2160 @ 60Hz ( 4:2:2 )
> 
> ...



Using 4:2:2 in 4K@60 doesn't mean it's 8-bit only, it can be 12-bit, not sure why no 10-bit:






http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx


----------



## Xuper (Nov 20, 2016)

Monsuta said:


> Using 4:2:2 in 4K@60 doesn't mean it's 8-bit only, it can be 12-bit, not sure why no 10-bit:
> 
> 
> 
> ...



Zlatan answered that!



> Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is what is better ... doing 8-bit with 4:4:4, or doing 10-bit with 4:2:2 chroma sub-sampling. In theory the latter is better, but this hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals. The industry doesn't have enough experience to do this, so the tonemapping pipeline in a game is designed for 8 bits per cell. So even if hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until games' tonemapping pipelines mature enough.
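Zlatan's tradeoff is easy to put in numbers. A rough sketch of average bits per pixel per format (`avg_bpp` is just an illustrative helper, assuming the usual chroma-plane scaling):

```python
# Average bits per pixel for the formats being compared.
# 4:2:2 halves the two chroma channels horizontally; 4:2:0 also halves
# them vertically, so each chroma plane carries a fraction of the pixels.
def avg_bpp(bpc, subsampling):
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[subsampling]
    return bpc * (1 + 2 * chroma_fraction)  # 1 luma + 2 chroma channels

assert avg_bpp(8, "4:4:4") == 24.0    # full-resolution color at 8-bit
assert avg_bpp(12, "4:2:2") == 24.0   # same raw payload as 8-bit 4:4:4
assert avg_bpp(10, "4:2:0") == 15.0   # what UHD Blu-ray uses
```

So 12-bit 4:2:2 costs no more raw bandwidth than 8-bit 4:4:4; the choice is purely about trading chroma resolution for depth.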


----------



## nem.. (Nov 20, 2016)

What kind of noobs write articles here on TPU? What an epic-failure article you wrote, *btarunr*... just ridiculous. ROFLMAO ¬¬

*AMD reaffirms HDR abilities, HDMI 2.0 is the limitation*

Read more: http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html

Just as I was laying down to hopefully fall asleep after a massive 18-hour work day, I read a story over at TechPowerUp sourced from German tech site Heise.de, that AMD Radeon graphics cards were limited in their HDR abilities... well, click bait can be bad sometimes, and we now know the truth.

The original story can be read here, which claimed that Radeon graphics cards were reducing the color depth to 8 bits per cell (16.7 million colors), or 32-bit, if the display was connected over HDMI 2.0 and not DisplayPort 1.2 - something that piqued my interest.

10 bits per cell (1.07 billion colors) is a much more desired height to reach for HDR TVs, but the original article made it out to seem like this was a limitation of AMD, and not that of HDMI 2.0 and its inherent limitations. Heise.de said that AMD GPUs reduce output sampling from the "desired Full YCrBr 4: 4: 4 color scanning to 4: 2: 2 or 4: 2: 0 (color-sub-sampling / chroma sub-sampling), when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD 'Polaris' GPUs, including the ones that drive game consoles such as the PS4 Pro," reports TPU.

I reached out to AMD for clarification with Antal Tungler, the Senior Manager of Global Technology Marketing, who said that this was a limitation of HDMI bandwidth. Tungler said "we have no issues whatsoever doing 4:4:4 10b/c HDR 4K60Hz over DisplayPort, as it's offering more bandwidth".

Tungler added: "4:4:4 10b/c @4K60Hz is not possible through HDMI, but possible and we do support over Displayport. Now, over HDMI we do 4:2:2 12b 4K60Hz which Dolby Vision TVs accept, and we do 4:4:4 8b 4K60Hz, which TVs also can accept as an input. So we support all modes TVs accept. In fact you can switch between these modes in our Settings".

Now, that's settled. This isn't a limitation of AMD Radeon graphics cards or the APU inside of the PS4 Pro; instead, it's a limitation of bandwidth from the HDMI 2.0 standard. DP 1.2 has no issues throwing up HDR at the right 4:4:4 10b/c at 4K60, with AMD supporting it all. So be careful when buying your TV or display if you want the best experience from it - HDMI 2.0 is a limitation right now.

Clickbait articles aren't good, and they tarnish the reputation of people in their path. TPU ran the story without fact-checking it either, and while I'm not calling Heise.de or TPU out personally, it would be nice to not have sensationalist headlines for something that has been explained in detail (the limitations of HDMI 2.0 and the superiority of DisplayPort).

DisplayPort offers more bandwidth, and will be driving 4K120 in 2017, as well as 1080p and 1440p at 240Hz. AMD is on the bleeding edge of that, but don't fall for the non-hype of this story. See our original story on the Radeon Technologies Group event in Sonoma, CA last year; the same article discusses DP 1.3 supporting 5K60, 4K120, and 1080p/1440p at 240Hz.

Read more: http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html


----------



## AlienIsGOD (Nov 20, 2016)

What kind of noob posts links that have been repeatedly posted throughout this thread?


----------



## Monsuta (Nov 20, 2016)

AlienIsGOD said:


> What kind of noob post links which have been repeatedly posted throughout this thread?



Because it's a noob thread?


----------



## AlienIsGOD (Nov 20, 2016)

Monsuta said:


> Because it's a noob thread?


yay trolls that join just to start crap.  you will be gone soon enough


----------



## Prima.Vera (Nov 20, 2016)

What if you connect your GPU to the 4K HDR TV using dual HDMI cables?? Wouldn't this double the bandwidth in theory making possible ...everything ?


----------



## qubit (Nov 20, 2016)

Prima.Vera said:


> What if you connect your GPU to the 4K HDR TV using dual HDMI cables?? Wouldn't this double the bandwidth in theory making possible ...everything ?


Yes it would, it's like dual channel DVI. Of course the standard would have to be ratified and implemented for it to work and I can't see that happening since a single plug is cheaper and more marketable. Instead, HDMI Licensing will just up the HDMI standard itself instead like they've been doing since it came out in 2002.


----------



## Xuper (Nov 20, 2016)

Prima.Vera said:


> What if you connect your GPU to the 4K HDR TV using dual HDMI cables?? Wouldn't this double the bandwidth in theory making possible ...everything ?



Do we have TVs that accept dual HDMI cables? I was thinking: with SFR (why SFR? read this link), it might be possible to carry 50% of the 10-bit, full RGB 4:4:4 signal through one HDMI 2.0 port, with the other 50% of the data going over either DP 1.2 or another HDMI 2.0 port. But there are problems:

1) The game engine would have to support SFR, and an SFR implementation requires a considerable amount of skill and experience; not every developer can do it.
2) The TV's firmware would have to support combining dual inputs (I don't know what they call it), for instance playing 50% of the data from HDMI plus the other 50% from DP.

Anything else?


----------



## Captain_Tom (Nov 20, 2016)

I mean PS4K has HDR and people seem to think it looks good.  Furthermore they have it on PS3 as well which has HDMI 1.3 I think.


So either this is easily fixable with a driver update, or 8-bit HDR is decent.


----------



## D007 (Nov 20, 2016)

Same with nvidia... I can't run above 8bpc with my 1080.. It's bullshit if you ask me.. I bought this thing for top end.. Not to find out it can't compete with DP.. 
I'm so sick of false and misleading advertising by nvidia and AMD..


----------



## prtskg (Nov 20, 2016)

behrouz said:


>





D007 said:


> Same with nvidia... I can't run above 8bpc with my 1080.. It's bullshit if you ask me.. I bought this thing for top end.. Not to find out it can't compete with DP..
> I'm so sick of false and misleading advertising by nvidia and AMD..


Behrouz's post shows that AMD provided data on HDMI's limitation. I'm hopeful NVIDIA has similar slides too.


----------



## simlariver (Nov 20, 2016)

Gaming on a TV screen is _awful_; input lag is rarely under 30 ms for 4K HDR TVs.

The current state of computer monitors is pretty bad; the market is littered with awful 6-bit panels.


----------



## Prima.Vera (Nov 21, 2016)

behrouz said:


> Do we have TV with dual HDMI cables?


Well, typically 4K TVs have 4x HDMI 2.0 ports...


----------



## Captain_Tom (Nov 21, 2016)

D007 said:


> Same with nvidia... I can't run above 8bpc with my 1080.. It's bullshit if you ask me.. I bought this thing for top end.. Not to find out it can't compete with DP..
> I'm so sick of false and misleading advertising by nvidia and AMD..



The fact is HDR is f***ing pathetic on PC.  We still don't have monitors that support it, apparently only DP works with it, and few games actually support it.


Meanwhile a simple update adds it to most new TV's for consoles.  WTF is going on?!?!


----------



## ShurikN (Nov 21, 2016)

Captain_Tom said:


> The fact is HDR is f***ing pathetic on PC.  We still don't have monitors that support it, apparently only DP works with it, and few games actually support it.
> 
> 
> Meanwhile a simple update adds it to most new TV's for consoles.  WTF is going on?!?!


Idk... more money in consoles/tv market, I guess.


----------



## Captain_Tom (Nov 21, 2016)

ShurikN said:


> Idk... more money in consoles/tv market, I guess.



Well, yes and no. Rich old men are often the ones to buy new display tech first, so I would say the TV thing is mostly true.

However, the PC gaming market is pretty comparable to the console market in many respects at this point, and companies usually use PC footage as their flagship advertisement, so having HDR for the "sizzle reels" would be a good idea.


Though the dumbest aspect, IMO, is how easy it would be to add HDR to monitors. It would take so little effort to make 4K monitors HDR-capable, and god knows elite PC gamers are happy to pay $500+ for a display.


----------



## eidairaman1 (Nov 21, 2016)

Next monitor I get will be DP, same with TV, hdmi is a poor connection in my book


----------



## heky (Nov 21, 2016)

eidairaman1 said:


> Next monitor I get will be DP, same with TV, hdmi is a poor connection in my book



Good luck finding TVs with DP... I actually wanted to change my TV this holiday season, but changed my mind when I couldn't find any with DP, except for some really expensive Panasonic models.


----------



## eidairaman1 (Nov 21, 2016)

heky said:


> Good luck finding TVs with DP...i actually wanted to change my TV this holiday season, but changed my mind when i couldnt find any with DP, except for some really expensive Panasonic models.



I dont use a tv on pc


----------



## Captain_Tom (Nov 21, 2016)

eidairaman1 said:


> Next monitor I get will be DP, same with TV, hdmi is a poor connection in my book



Yeah I'm pretty done with HDMI.  I wish everything would use Displayport.

In fact I wish the next gen consoles would just come with DP 1.4.  Though I know that's a longshot, I could actually see it happen due to 8K support.


----------



## cdawall (Nov 22, 2016)

AlienIsGOD said:


> yay trolls that join just to start crap.  you will be gone soon enough



To be fair they are correct. This thread shouldn't exist because it makes TPU look either nvidia biased or just flat out wrong. Neither of those options is good in my book.


----------



## jigar2speed (Nov 22, 2016)

Tarun also looks bad; it shows that time after time he has pulled the trigger sooner than he should have.


----------



## AlienIsGOD (Nov 22, 2016)

cdawall said:


> This thread shouldn't exist because it makes TPU look either nvidia biased or just flat out wrong.


and yet there's no correction/retraction on the front page


----------



## bug (Nov 22, 2016)

AlienIsGOD said:


> and yet there's no correction/retraction on the front page


Or closing of this thread...


----------



## Aquinus (Nov 22, 2016)

cdawall said:


> To be fair they are correct. This thread shouldn't exist because it makes TPU look either nvidia biased or just flat out wrong. Neither of those options is good in my book.


Exactly. The title shouldn't be: AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0
It should be: HDMI 2.0 has insufficient bandwidth to drive HDR with more than 8bpc

It's a misleading title that makes it sound like AMD gimped their GPUs when HDMI 2.0 just sucks compared to DP.

This feels a lot like 8GB of VRAM being a con in reviews until nVidia did it with an equivalent GPU. I love TPU, but this kind of bullshit is increasingly concerning.



bug said:


> Or closing of this thread...


Or maybe @btarunr should be scolded for writing an article that is so damn misleading, pointing a finger at AMD when it should be pointed at HDMI and the HDMI Forum. I expect better out of TPU, and crap like this is WCCFTech-worthy.


----------



## bug (Nov 22, 2016)

Speaking of (pardon the expression) wccftech, does anyone know how to keep it out of Google Now newsfeed? I keep "clicking" "Not interested in stories from wccftech", but they keep coming still.


----------



## cdawall (Nov 22, 2016)

AlienIsGOD said:


> and yet there's no correction/retraction on the front page



You are correct here let's tag someone who can do something about it. @W1zzard


----------



## Aquinus (Nov 24, 2016)

cdawall said:


> You are correct here let's tag someone who can do something about it. @W1zzard


I think they would just prefer that everyone be quiet about it so they don't have to say anything. Maybe @W1zzard needs another bump?


----------



## eidairaman1 (Nov 24, 2016)

yeah this is too biased when it affects both companies


----------



## Aquinus (Nov 24, 2016)

eidairaman1 said:


> yeah this is too biased when it affects both companies


Isn't posting FUD against the forum guidelines? Maybe everyone should be reporting the news post and calling it out for what it is...


----------



## prtskg (Nov 24, 2016)

Aquinus said:


> Isn't posting FUD against the forum guidelines? Maybe everyone should be reporting the news post and calling it out for what it is...


Just did it.


----------



## nemesis.ie (Nov 24, 2016)

Yes, we can all report it via the link in the first post. I just did too.


----------



## heky (Nov 24, 2016)

So did i...


----------



## David Fallaha (Nov 26, 2016)

bug said:


> This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?



er no, DP1.2 doesn't do HDR and 1.3/1.4 have far more bandwidth



btarunr said:


> German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per cell (16.7 million colors), or 32-bit; if your display (eg: 4K HDR-ready TV) is connected over HDMI 2.0 and not DisplayPort 1.2 (and above). The desired 10 bits per cell (1.07 billion colors) palette is available only when your HDR display runs over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de observes that AMD GPUs reduce output sampling from the desired Full YCrBr 4: 4: 4 color scanning to 4: 2: 2 or 4: 2: 0 (color-sub-sampling / chroma sub-sampling), when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.
> 
> Source: Heise.de



LOL, your post is completely wrong/nonsensical - you do realise that 4:2:0 can be 10-bit or 8-bit, right?

Also, AMD is doing 8-bit 4:4:4 on PC and 10-bit 4:2:0 on console - these are the limits of HDMI 2.0... plus no one wants to look at a desktop with 4:2:0 subsampling...


----------



## Tatty_One (Nov 26, 2016)

David Fallaha said:


> er no, DP1.2 doesn't do HDR and 1.3/1.4 have far more bandwidth
> 
> 
> 
> ...



It is not his post, it is a news piece sourced from another site with link, he is sharing with the community what other sites are saying.... right or wrong, good or bad, FUD or not, whether anyone thinks it's right or wrong to post  such news sources is another story of course.


----------



## Aquinus (Nov 26, 2016)

Tatty_One said:


> It is not his post, it is a news piece sourced from another site with link, he is sharing with the community what other sites are saying.... right or wrong, good or bad, FUD or not, whether anyone thinks it's right or wrong to post  such news sources is another story of course.


This isn't freaking WCCFTech. Like any good journalist, check your freaking sources and *accurately summarize them*. Check the quality of the news before posting it, or at least make sure that the claims are legitimate and that they're being summarized properly. Even the article itself says that this is an HDMI 2.0 limitation and that nVidia refused to comment on whether their GPUs do the same thing... but when you read the "summary" here, it sounds like AMD is at fault. No mention that this isn't an AMD-specific issue and that it's all HDMI 2.0.

When your summary says stuff like:


btarunr said:


> The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.


You're taking the article out of context.

When the article says (after translation,):


> In contrast to DisplayPort 1.4, HDMI 2.0a provides insufficient bandwidth for 4K HDR images with 10-bit color, 60 Hz, and full YCrCb 4:4:4 color sampling. AMD could set the HDR color depth to 10 bits, but would then have to reduce the sampling to 4:2:2 or 4:2:0 (color subsampling / chroma subsampling). This can blur fine details or fonts, which could, for example, interfere with the health display in a head-up display. Just a few months ago, AMD told journalists that Polaris graphics cards would drive 10-bit HDR displays via HDMI 2.0 with 4:2:2 chroma subsampling. Possibly that at least still applies when playing UHD HDR movies.


That's not a problem with AMD cards; that's an issue with anything using HDMI 2.0, and even the article acknowledges that, but not the summary here. In fact, nothing in the article says that they suspect this is an AMD problem, so I'm baffled that the summary here would jump to that kind of conclusion.

Either way, the way the article was represented was distasteful and misleading. Your own guidelines say not to post FUD, but then you say:


Tatty_One said:


> It is not his post, it is a news piece sourced from another site with link, he is sharing with the community what other sites are saying.... right or wrong, good or bad, *FUD or not*, whether anyone thinks it's right or wrong to post  such news sources is another story of course.



If members are held to different standards than people posting "news," then you have a problem.


----------



## FordGT90Concept (Nov 26, 2016)

The original post needs an editorial update stating it is factually incorrect and an explanation of why; this is required for journalistic integrity.  This news post is generating negative publicity for TechPowerUp.


----------



## Aquinus (Nov 26, 2016)

FordGT90Concept said:


> The original post needs an editorial update stating the source is factually incorrect and an explanation of why.  This is required for journalistic integrity.  TechPowerUp doesn't look good when this article is getting flagged by other publications for spreading falsehoods.


If you read the article after translating it, it states very clearly that this is an HDMI 2.0 limitation, that AMD said they were going to do this, and that nVidia hasn't commented on whether their GPUs do the same thing. However, the summarized version here leaves a lot of that out and actively makes it sound like AMD gimped their GPUs. The summary is more inaccurate than the article, which pisses me off more. It's like grabbing FUD, sharing it, then supercharging it with your own rhetoric.


----------



## prtskg (Nov 26, 2016)

Tatty_One said:


> It is not his post, it is a news piece sourced from another site with link, he is sharing with the community what other sites are saying.... _*right or wrong*_, good or bad,* FUD or not*, whether anyone thinks it's right or wrong to post  such news sources is another story of course.


I realise my mistake now. I was thinking highlighted parts are important.


----------



## Tatty_One (Nov 26, 2016)

My point is, arguably he is reporting another source's inaccuracies. I find it amusing that some think it's bias against AMD, especially when it is sourced from another site AND when the piece was released onto the front page, it sat next to an article about how great Zen 8-core performance was. I would agree wholeheartedly if it were an editorial, but it's not.


----------



## Aquinus (Nov 26, 2016)

Tatty_One said:


> All my point is, arguably he is reporting other sources inaccuracies, I find it amusing that some think it's Bias against AMD, especially when it is sourced from another site AND when the piece was released onto the front page it sat next to an article about how great Zen 8 core performance was, I would agree wholeheartedly if it was an editorial but its not.


Intentions don't really affect whether it's FUD or not. FUD is FUD. How the article was cherry-picked and summarized, along with a nicely deceptive title, doesn't help.

The truth of the matter is that this problem has nothing to do with AMD and everything to do with HDMI.
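The HDMI 2.0 constraint can be checked with simple link-bandwidth arithmetic. The sketch below is a back-of-the-envelope calculation, not from the article: it assumes the standard CTA-861 4K60 timing (4400×2250 total pixels including blanking, a 594 MHz pixel clock) and HDMI 2.0's maximum TMDS character rate of 600 MHz, and shows why 10 bpc 4:4:4 at 4K60 doesn't fit while the subsampled modes do.

```python
# Rough HDMI 2.0 bandwidth check for 4K60 HDR output.
# Assumptions (not from the article): CTA-861 4K60 timing with
# 4400x2250 total pixels incl. blanking, and HDMI 2.0's maximum
# TMDS character rate of 600 MHz.

TOTAL_H, TOTAL_V, REFRESH = 4400, 2250, 60
PIXEL_CLOCK_HZ = TOTAL_H * TOTAL_V * REFRESH  # 594 MHz for 4K60
HDMI20_MAX_HZ = 600e6                         # HDMI 2.0 TMDS clock ceiling

def tmds_clock_hz(bpc, subsampling="4:4:4"):
    """TMDS character rate needed for a given depth and subsampling.

    For 4:4:4 the TMDS clock scales with bpc/8. HDMI packs 4:2:2
    (up to 12 bpc) into the same rate as 8 bpc 4:4:4, and 4:2:0
    halves the pixel rate before the depth scaling applies.
    """
    if subsampling == "4:2:2":
        return PIXEL_CLOCK_HZ              # fits in the 8 bpc slot
    if subsampling == "4:2:0":
        return PIXEL_CLOCK_HZ / 2 * (bpc / 8)
    return PIXEL_CLOCK_HZ * (bpc / 8)      # 4:4:4

for bpc, sub in [(8, "4:4:4"), (10, "4:4:4"), (12, "4:2:2"), (10, "4:2:0")]:
    clk = tmds_clock_hz(bpc, sub)
    verdict = "fits" if clk <= HDMI20_MAX_HZ else "exceeds HDMI 2.0"
    print(f"{bpc} bpc {sub}: {clk / 1e6:.1f} MHz ({verdict})")
```

Running this, 10 bpc 4:4:4 needs a 742.5 MHz TMDS clock, over the 600 MHz ceiling, while 8 bpc 4:4:4, 12 bpc 4:2:2, and 10 bpc 4:2:0 all fit, which is exactly the fallback behaviour the Heise article describes.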


----------



## FordGT90Concept (Nov 26, 2016)

...and corrections have not been made.


----------



## Tatty_One (Nov 26, 2016)

Well, I cannot answer for my news colleagues, of course; I am sure they will be along shortly to read the feedback and reports.


----------



## chr0nos (Dec 9, 2016)

And still no changes since 26/Nov.

Seems like they just don't care.


----------



## Tatty_One (Dec 9, 2016)

chr0nos said:


> And still no changes since 26/Nov.
> 
> Seems like they just dont care.


Well, to be fair, they don't have to; it's their choice. I didn't think they would change the wording of the article in any case. The article is not theirs, and if they quote a source then they cannot change that. However, if we had, for example, written our own headline for the news piece here, then that would be different.


----------



## nemesis.ie (Dec 9, 2016)

Tatty_One said:


> Well to be fair they don't have to, it's their choice, I didn't think they would change the wording of the article in any case, the article is not theirs and if they quote a source then they cannot change that, however if we for example wrote our own headline for the news piece here then that would be different.



To be fair, they should at least add that the original article says nVidia may also be (and, as we know, actually is) affected by this. It's blatantly cherry-picked information from the original, and it's poor journalism (IMO anyway) to just lift/paste things directly without thinking about it, fact-checking, and presenting an unbiased view.


----------



## EarthDog (Dec 9, 2016)

Tatty_One said:


> Well I cannot answer for my news colleagues of course, I am sure they will be along shortly to read the feedback and reports.


And if they did, it would be nice to hear where they were coming from, particularly if it isn't going to be corrected. It's been over 2 weeks since your post, and even longer since it was corrected in the thread. At the very least, I still believe a follow-up article correcting the previous report is in order.

A little communication goes a long way in situations like this.


----------



## cdawall (Dec 9, 2016)

Simple feedback: this post is wrong. It's a simple mistake based on a biased source. TPU should either pull the post or correct it.


----------



## Tatty_One (Dec 9, 2016)

I appreciate all of your points. There were plenty of reports, so I can only assume that our news people took a look and made a decision. If you feel strongly enough, maybe drop them a PM; we have a policy that only News Editors (or possibly W1z too) change, amend, delete, or add to news articles.


----------



## bug (Dec 9, 2016)

What's funny is that the users complaining about the inaccuracy of the initial report seem oblivious to the Streisand effect.


----------



## eidairaman1 (Dec 12, 2016)

I'd say this article should be deleted, then, for its inaccuracies.


----------

