Thursday, June 18th 2015

AMD "Fiji" Silicon Lacks HDMI 2.0 Support
It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on the OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard, implying that it's limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does every 4K Ultra HD monitor launched to date - the lack of HDMI 2.0 support hurts the chip's living room ambitions, particularly with products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or R9 270X ITX card will do just fine), and if it's being designed for 4K UHD TVs, then its HDMI interface will cap visuals at a console-rivaling 30 Hz.
Source:
OCUK Forums
139 Comments on AMD "Fiji" Silicon Lacks HDMI 2.0 Support
HDMI 2.0 = 18 Gbps
DisplayPort 1.3 = 32.4 Gbps
The best HDMI cables can do is 25 Gbps, and those are the best of the best cables over very short distances.
There's a chart here on HDMI 2.0: www.dpllabs.com/page/dpl-full-4k-cable-certification
DisplayPort 1.3 should be able to handle 4:4:4 4K @ 16 bits per color, where HDMI 2.0 can only handle 8 bits per color. Not to mention DisplayPort 1.3 can carry an HDMI signal. As if that weren't enough, DisplayPort 1.3 is capable of VESA Display Stream Compression, which can further increase the effective payload.
If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.
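To put those link rates in context, here is a rough back-of-the-envelope sketch (my own plain-Python illustration, not from the thread) comparing the uncompressed payload of 4K @ 60 Hz at different bit depths and chroma subsampling against approximate effective link rates. The "effective" figures assume 8b/10b coding overhead, and blanking intervals, audio, and DSC are ignored, so treat the numbers as illustrative rather than exact.

```python
# Back-of-the-envelope 4K @ 60 Hz bandwidth check (illustrative only: ignores
# blanking intervals, audio, and DSC; "effective" rates assume 8b/10b coding).

LINKS_GBPS = {
    "HDMI 1.4 (~8.2 Gbps effective)":    10.2 * 0.8,
    "HDMI 2.0 (~14.4 Gbps effective)":   18.0 * 0.8,
    "DP 1.2 HBR2 (~17.3 Gbps effective)": 21.6 * 0.8,
    "DP 1.3 HBR3 (~25.9 Gbps effective)": 32.4 * 0.8,
}

def video_gbps(width, height, hz, bits_per_channel, chroma="4:4:4"):
    """Approximate uncompressed video payload in Gbit/s."""
    # 4:4:4 keeps full chroma (3 samples/pixel); 4:2:2 averages 2; 4:2:0 averages 1.5.
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bits_per_channel * samples_per_pixel / 1e9

for bpc in (8, 10, 16):
    for chroma in ("4:4:4", "4:2:0"):
        need = video_gbps(3840, 2160, 60, bpc, chroma)
        fits = [name for name, cap in LINKS_GBPS.items() if cap >= need]
        print(f"4K60 {chroma} {bpc}-bit: ~{need:.1f} Gbps -> {fits if fits else 'nothing fits'}")
```

By that rough math, 8-bit 4:4:4 at 4K60 squeezes into HDMI 2.0, 10-bit and 16-bit 4:4:4 do not, and 8-bit 4:2:0 even fits into HDMI 1.4 - which is the trick mentioned a few posts down.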
Nvidia used 4:2:0 @ 8-bit to get 4K 60 Hz working over HDMI 1.4.
Great game looks: MSAA with blocky color gradients. But Nvidia users are used to that, since their cards' colors are always fuxxed up: www.reddit.com/r/pcgaming/comments/2p3xs7/nvidia_users_using_hdmi_output_youre_most_likely
forums.geforce.com/default/topic/770359/geforce-drivers/fix-for-the-limited-color-range-issue-in-hdmi-dp/
www.neogaf.com/forum/showthread.php?t=953806
And that 4:2:0 mode is only for Kepler (600/700 series).
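For context on what those linked threads are complaining about: Nvidia drivers defaulting HDMI outputs to limited-range RGB (16-235) instead of full-range (0-255). A tiny hypothetical sketch of that remapping (my own illustration, not actual driver code) shows why it crushes gradients:

```python
# Hypothetical illustration of full-range -> limited-range RGB remapping
# (not actual driver code). Full range uses 0-255; limited (video) range
# uses 16-235, so 256 input shades collapse into 220 output codes.

def full_to_limited(v):
    """Map an 8-bit full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

unique_levels = {full_to_limited(v) for v in range(256)}
print(len(unique_levels))  # 220 -> some adjacent shades merge, causing banding
```

And if the display then expects full range anyway, blacks and whites get clipped or washed out, which is the "washed-out colors over HDMI" complaint in those threads.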
Welcome to TPU, by the way. It's probably more than obvious how much misleading information is in this thread after reading it, so just ask, haha.
I will tell you that @HumanSmoke, @Steevo, and @FordGT90Concept are probably going to give you the best straight-up information.
I can give the mods shit, but they are cool for the most part; some of them just seem very limited in knowledge for the years they have been around reading articles.
I would like to be wrong about what I said Mussels was doing, but it does not seem that way.
HDMI, does it support G-Sync? Nope. So if you want to bash the lack of HDMI 2.0 support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI... you fail.
Then let's look at the whole need for G-Sync: it's the assumption that Nvidia can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at the frame rate we can drive instead. So then we have two choices here: either 60 FPS isn't really that important, or it is - so which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.
AMD has released a card without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box; what's the chance they will do the same here?
Lastly, let's place the blame for this at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HD TVs that had strangely mixed input options which may or may not have worked as expected. Samsung should know better; Panasonic, Toshiba, and others should know better. They are paying for a dying standard, but they are doing it for planned obsolescence, IMO.
I really find this a non-issue, and I want to thank you guys for trying to be objective and civil.
AMD Demonstrates FreeSync-over-HDMI Concept Hardware at Computex 2015
Based on that alone, it seems more like a mistake than some message. Is it a big mistake? Not IMO, but it still looks like a mistake.
I also expect hardware H.265 encode and decode. If this HDMI 2.0 thing is true, I wouldn't be surprised if that was a bust too.
How many UHD TVs can run at 60 Hz, how many of them have HDMI 2.0 (or DisplayPort), and how many do 4:4:4?
And most importantly, how much of the market share do they take?
I guess that is a question of budget and standards. 30 FPS is playable, and people do it every day. I like 50 because the difference between 50 and 60 is not really noticeable to me. Won't being locked into a refresh rate eventually cause input lag, if not drive someone crazy for days trying to fix it?
G-Sync vs. FreeSync:
They are both good, above my standards on refresh rates, and totally specced out in my opinion.
I do like how FreeSync works and doesn't need extra parts in the display that you get charged for, the way OEMs get charged the cost of the extra hardware and a license with G-Sync.
Yet another open standard AMD helped put on paper way before G-Sync was a thought.
DisplayPort on UHD TVs has already been addressed in this thread, though. Very few UHD TVs have DP, and it doesn't look like many will.
However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter if they are as rare as hen's teeth now, because they do exist and it seems like the direction the UHD TV industry is going in. These UHD TVs are getting cheaper too,...
If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense. But it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out - especially when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.
www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx
So a little color schooling.
What do you see?
i4.minus.com/ibyJcwdIniHUEs.png
en.wikipedia.org/wiki/Color_depth
Even if you could see the highest end, it may only be processed at 8 bits per color instead of 10, and thus will still show blocking and banded gradients. HDMI 2.0 is still shit compared to DisplayPort.
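As a rough illustration of why 8-bit gradients band on a 4K panel (my own sketch, not from the post): a full-width ramp has far more pixels than an 8-bit channel has levels, so each shade occupies a visible block.

```python
# Why smooth gradients show "blocking" at low bit depths on a 3840-pixel-wide
# display: each channel level has to cover many pixels. (Illustrative only.)

WIDTH = 3840
for bits in (8, 10, 16):
    levels = 2 ** bits
    px_per_step = WIDTH / levels  # width of each flat band in a full-screen ramp
    print(f"{bits}-bit: {levels:>6} levels/channel, ~{px_per_step:.2f} px per step")
```

At 8 bits that works out to roughly 15-pixel-wide bands, which the eye picks up easily; at 10 bits the steps shrink to under 4 pixels, and at 16 bits they are sub-pixel.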
DisplayPort 1.3 can do twice the color depth, and therefore better accuracy, for truly high-quality UHD 4K @ 60 Hz.
Tell it to the industry making UHD TVs.
My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.
Either you understand that 8-bit color looks like shit, or you don't.
We have two simple scenarios in which you reply to a thread about a new graphics card where HDMI 2.0 is NOT supported:
1) You care, as you have something relevant to add, and understand what it means and why it's important or not.
2) You are an Nvidiot and need to thread-crap elsewhere.
@Octavean is putting forward that HDMI 2.0 has favour with TV vendors and, even if it lacks bandwidth compared with DP, will still be utilized.
@Steevo ...well, you're basically arguing that DP is better than HDMI and graphics vendors should concentrate on it, even though TV manufacturers aren't using it to any great extent.
One is an argument about tech implementation (and a few insults); the other is about practical implementation in a real market.
That still isn't the overall issue, because even then you're upscaling or downscaling through the chain.
Titan X has HDMI 2.0 but DisplayPort 1.2 (not the "a" revision that adds Adaptive-Sync support). So right now we either have to go with HDMI 1.4a and DisplayPort 1.2a, or HDMI 2.0 and DisplayPort 1.2. I think I'd have to go with the former, because I loathe proprietary standards like G-Sync, and all HDMI ever has to power for me is a 1920x1200 display via a DVI adapter. The argument Steevo makes, and one I agree with, is that HDMI 2.0 should be terminated and DisplayPort should replace it in full. DisplayPort supports HDMI packets, so DisplayPort has backwards compatibility ingrained. There's no reason HDMI 2.0 exists other than, as Steevo said, "Samsung should know better; Panasonic, Toshiba, and others should know better. They are paying for a dying standard, but they are doing it for planned obsolescence, IMO." It's the TV industry trying to dictate what standard people use because they refuse to provide an affordable alternative.