Thursday, June 18th 2015
AMD "Fiji" Silicon Lacks HDMI 2.0 Support
It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on the OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard, implying that it's limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does every 4K Ultra HD monitor launched to date - the lack of HDMI 2.0 support hurts the chip's living room ambitions, particularly with products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or R9 270X ITX card will do just fine), and if it's being designed for 4K UHD TVs, its HDMI interface will cap visuals at a console-rivaling 30 Hz.
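For context, a back-of-the-envelope bandwidth check (a rough sketch, assuming standard CEA-861 timing and 8-bit RGB; exact blanking intervals vary by mode) shows why HDMI 1.4 tops out at 4K 30 Hz while HDMI 2.0 and DisplayPort 1.2 can carry 4K 60 Hz:

```python
# Rough link-bandwidth check for 3840x2160 output.
# CEA-861 timing assumed: the full raster including blanking at 4K60
# is 4400 x 2250 pixels, giving a 594 MHz pixel clock.
pixel_clock_4k60 = 4400 * 2250 * 60              # 594,000,000 Hz
bits_per_pixel = 24                              # 8-bit RGB
needed_4k60 = pixel_clock_4k60 * bits_per_pixel / 1e9   # ~14.26 Gbit/s

# Effective pixel-data capacity of each link:
hdmi_14 = 340e6 * 24 / 1e9        # HDMI 1.4: 340 MHz max pixel clock -> ~8.16 Gbit/s
hdmi_20 = 18e9 * 8 / 10 / 1e9     # HDMI 2.0: 18 Gbit/s TMDS, 8b/10b -> 14.4 Gbit/s
dp_12 = 21.6e9 * 8 / 10 / 1e9     # DP 1.2 HBR2 x4 lanes: 21.6 Gbit/s, 8b/10b -> 17.28 Gbit/s

print(f"4K60 needs ~{needed_4k60:.2f} Gbit/s of pixel data")
print(f"HDMI 1.4 carries ~{hdmi_14:.2f}, HDMI 2.0 ~{hdmi_20:.2f}, DP 1.2 ~{dp_12:.2f}")
```

HDMI 1.4's ~8.16 Gbit/s is comfortably enough for 4K at 30 Hz (half the pixel clock, ~7.13 Gbit/s) but nowhere near the ~14.26 Gbit/s that 4K60 demands, which is where the "console-rivaling 30 Hz" cap comes from.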
Source:
OCUK Forums
139 Comments on AMD "Fiji" Silicon Lacks HDMI 2.0 Support
Odd for them to not include that; I had thought it was confirmed to be part of the Fiji card... Well, it's definitely not smart of them to leave it out, especially with the focus on 4K. However, I find DP and monitors to be the better combination as it is, and that's what I use for my 4K experience, so it wouldn't bother me one bit.
HDMI 2.0 support is near non-existent. Why would Fiji be an exception to the rule? I'm disappointed it doesn't have it, but at the same time, I don't care. DisplayPort is the future, not HDMI. Look at the reviews. Advertised as 4K; only does 30 Hz; horrible reviews. This is the problem with HDMI 2.0: they keep trying to ram more bits through the hose without changing the hose. The connectors may be able to handle the advertised 60 Hz, but the cables cannot. When genuinely 2.0-compliant cables debut, they'll probably cost $20+ per foot because of the massive insulation required to prevent crosstalk and interference. HDMI has always been the laggard of the display standards: it took the DVI spec, tacked audio onto it, and disregarded all the controls VESA put in place to guarantee the cable would work. This was inevitable with HDMI, because the people behind it haven't a clue what they're doing. This is why VESA didn't get behind HDMI, and why they went off and designed their own standard that's actually prepared to handle the task. HDMI is not suitable for 4K and likely never will be.
The Fury X manual repeatedly mentions only DisplayPort 1.2; I don't know if it supports 1.3:
support.amd.com/Documents/amd-radeon-r9-fury-x.pdf
I agree that DisplayPort is a better solution, but at the time this card is being released, HDMI 2.0 is the connection required by the HTPC market. If DisplayPort ever takes off as a TV connection, it will be too late for this card, since there will be something better by then. To me it simply doesn't make sense for AMD not to adopt the standard that is available and in use today.
Also, HDMI 2.0 works fine for 4K 60 Hz; yes, it's very close to the threshold of HDMI's bandwidth limits, but it still works, so why not use it?
Not trying to be rude to you specifically, just pointing out another side of the argument.
Edit: Some cable certification programs have started: www.dpllabs.com/page/dpl-full-4k-cable-certification
So yes, it's still a better deal (if the leaks are to be believed), depending on how you factor everything in. Of course this is all still based on rumor, but given that most gaming monitors these days have DP, or you can use a DP-to-whatever adapter, it's pretty irrelevant. The only downside is that it makes things difficult for TVs that run 4K 60 Hz over HDMI 2.0...
From the moment I saw that NVIDIA cut down the 980 Ti, I've been praying for AMD to win for exactly that reason.
Now what are all of us who couldn't afford €1200-1250 for a TITAN X supposed to do? What is NVIDIA suggesting to us now...? :)
But as I said a few days ago, now we're expected to be satisfied with a cheaper, weaker TITAN X.
They were confident enough to cut CUDA cores instead of pushing the full GM200 as far as possible - with an increased base clock and an AIO cooler if needed - and praying to keep the crown; they didn't care.
Now AMD may take the crown precisely because of that little piece, and Maxwell will be NVIDIA's first 3DMark loser since Fermi and Kepler.
What now? Feeding customers ten months of Pascal rumors while AMD sells its cards? There's no way they can finish it before spring.
You'll see how much NVIDIA will ask for HBM2: a fortune. For the last 3-4 years they've quietly raised prices in every devious way possible, $100-150 more for the high-end chip every year. Where does it end? I think now.
And the whole time people thought the TITAN X was some god-given miracle; no, it was one strong card overpriced to the max.
Nothing special, though. NVIDIA had GK110 in 2013, and now they have 50% more performance.
Why NVIDIA decided to dole it out little by little is a different story, but they've made 50% improvements since 2013.
And over that period they asked $1000 three times: TITAN, TITAN Black, TITAN X.
You have to be very tricky to get people to pay that.
The real value is $450-550 max, same as the GTX 580.
Their fans justify it with "no competition"... then what is Fury?
Take Intel as an example: Intel has no competition and holds the CPU market far more firmly than NVIDIA holds GPUs...
Intel doesn't ask $1500 or $2000 for its Extreme processors. Every Intel processor series is within about 10% of the price of the one from five years ago; they haven't doubled prices. But what does NVIDIA do?
I'm glad I was right that cutting CUDA cores from the GTX 980 Ti will cost NVIDIA the single-chip crown and hundreds of thousands of dollars. We literally begged, for months before launch, for the full GM200 chip with increased clocks, only for them to cut it down, give it 6 GB of video memory, and price it like a normal GeForce - and they showed no mercy. No €1200, no full chip. That was their motto.
It's telling how NVIDIA has trained its fans: the day before the presentation they expected AMD to ask $1000, almost certain the price would be similar to the TITAN X's.
Just because HBM is new technology, they supposedly have the right to ask $1000.
I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've a lovely TV which I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2.0 (and yes, 4K @ 60 Hz).
Regardless of what the future will bring, we still have to live in the here and now. Omitting HDMI 2.0 doesn't lend itself to a harmonious coexistence with UHD TVs in the here and now.
It's better to have it and not need it than to need it and not have it.
I'm not going to say that the omission of HDMI 2.0 means I would never buy one of these cards, but it would jumpstart my urge to look elsewhere for a product that does support HDMI 2.0.
Hell yeah AMD! Toss that waste of cash HDMI to the curb!
@OneMoar was trying to tell me months ago, I think, why it was crap, but I didn't fully read the HDMI wiki and the links on it.
blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html
I think you're confusing bandwidth with subsampling. DP 1.2 supports 17.28 Gbps; HDMI 2.0 supports 18 Gbps. I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".
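To make the bandwidth-versus-subsampling distinction concrete, here's an illustrative sketch of how chroma subsampling trades color resolution for link bandwidth (active-pixel data only, blanking excluded; average bits-per-pixel figures for 8-bit video are the standard ones, but this is a simplification, not a full link-budget calculation):

```python
# Average bits per pixel for 8-bit video under common chroma subsampling schemes.
# 4:4:4 carries full color; 4:2:2 and 4:2:0 carry chroma at reduced resolution.
bpp = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}

active_pixels_per_s = 3840 * 2160 * 60   # 4K UHD at 60 Hz

for scheme, bits in bpp.items():
    gbps = active_pixels_per_s * bits / 1e9
    print(f"{scheme}: ~{gbps:.2f} Gbit/s of active pixel data")
```

The point is that 4:2:0 halves the data rate relative to 4:4:4, which is how some 4K TVs accept 60 Hz over older HDMI links - by dropping chroma resolution, not because the link suddenly got faster. A true HDMI 2.0 link has the headroom for 4K60 at full 4:4:4.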