Thursday, June 18th 2015

AMD "Fiji" Silicon Lacks HDMI 2.0 Support

It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on the OCUK Forums, an AMD representative confirmed that the chip does not support the newer connector standard, implying that it is limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does virtually every 4K Ultra HD monitor launched to date - the lack of HDMI 2.0 support hurts the chip's living-room ambitions, particularly for products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or an R9 270X ITX card will do just fine), and if it is being designed for 4K UHD TVs, its HDMI interface will cap visuals at a console-rivaling 30 Hz.
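The arithmetic behind that 30 Hz cap is straightforward. The sketch below is a rough back-of-the-envelope check, assuming the standard CEA-861 pixel clocks for 4K (297 MHz at 30 Hz, 594 MHz at 60 Hz) and 8-bit RGB output; HDMI 1.4 caps the TMDS clock at 340 MHz:

```python
# Why HDMI 1.4 tops out at 4K/30 Hz: a rough link-rate check.
# Assumed figures: CEA-861 pixel clocks (297 MHz for 4K30, 594 MHz
# for 4K60) and 8-bit RGB; HDMI 1.4's TMDS clock ceiling is 340 MHz.
TMDS_OVERHEAD = 10 / 8      # 8b/10b encoding on the wire
BITS_PER_PIXEL = 24         # 8-bit RGB / 4:4:4

def link_rate_gbps(pixel_clock_mhz):
    """Total wire bandwidth needed for a given pixel clock, in Gbit/s."""
    return pixel_clock_mhz * 1e6 * BITS_PER_PIXEL * TMDS_OVERHEAD / 1e9

hdmi_14_max = link_rate_gbps(340)   # ~10.2 Gbps
for mode, clock in [("4K @ 30 Hz", 297), ("4K @ 60 Hz", 594)]:
    needed = link_rate_gbps(clock)
    verdict = "fits" if needed <= hdmi_14_max else "exceeds"
    print(f"{mode}: {needed:.1f} Gbps needed, {verdict} "
          f"HDMI 1.4's {hdmi_14_max:.1f} Gbps")
```

4K60 works out to roughly 17.8 Gbps on the wire, which is why it needs HDMI 2.0's 18 Gbps link but fits comfortably within DisplayPort 1.2's 21.6 Gbps.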
Source: OCUK Forums

139 Comments on AMD "Fiji" Silicon Lacks HDMI 2.0 Support

#26
ValenOne
Panasonic's Smart VIERA WT600 has both DP1.2a and HDMI 2.0.
#27
Lou007
john_How about this one?

Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics
$11.56
So to support those features, would the HDMI cable have to be certified to carry those signals? You probably couldn't just use a cheap, uncertified HDMI cable, is what I'm saying.
#28
xfia
john_How about this one?

Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics
$11.56
that will do it, but the bottleneck is still HDMI by most PC gamers' standards.. like, why buy the best GPUs around and a 4K display when you can't run full color? fine for some people's standards, but that cuts out most production work and any enthusiast-type artist rendering right off the top.
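"Full color" here means 4:4:4 chroma. Some 4K-over-HDMI setups fall back to 4:2:0 subsampling, which keeps full-resolution luma but only quarter-resolution chroma, halving the bandwidth. A quick sketch of the average bits per pixel, assuming 8-bit depth:

```python
# Average bits per pixel for common chroma subsampling modes (8-bit).
# 4:4:4 carries 3 full samples per pixel; 4:2:0 averages only 1.5
# (full-res luma, quarter-res chroma).
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def bits_per_pixel(subsampling, bit_depth=8):
    return bit_depth * SAMPLES_PER_PIXEL[subsampling]

full = bits_per_pixel("4:4:4")   # 24 bpp -> "full color"
sub = bits_per_pixel("4:2:0")    # 12 bpp -> half the bandwidth
print(f"4:2:0 needs {sub / full:.0%} of the 4:4:4 bandwidth")
```

That halving is what lets some devices squeeze 4K/60 Hz through an HDMI 1.4-class link, at the cost of color detail that matters most for text and production work.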
#29
praze
RejZoRIt also says this for the adapter cable:
Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)

Meaning DisplayPort supports 60 Hz anyway (I have it at 144 Hz), and the output is HDMI 2.0, which means it should work at 60 Hz on an LCD TV that has an HDMI 2.0 input. It's just that at the time of writing, no devices supported this. The question here is, when did they write the cable description...
jigar2speedFound this- Display port to HDMI 2 - www.amazon.com/dp/B00E964YGC/?tag=tec06d-20

But in the review section people are still complaining - is it related to drivers?
Stop. Neither of these cables support 4K at 60Hz, you're gonna make someone waste time and money. Just look at the reviews.

Claiming "compatibility" with a backwards compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV.

A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.
#30
xfia
prazeStop. Neither of these cables support 4K at 60Hz, you're gonna make someone waste time and money. Just look at the reviews.

Claiming "compatibility" with a backwards compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV.

A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.
that's interesting.. i just saw similar reviews on 3 websites haha, i doubt every damn person is doing something wrong, so yeah, that is more than misleading.
from what you're saying it would cost you 100 bucks to do it, but there won't be lag, just not full color.
#31
Uplink10
RejZoRSure I'm on DisplayPort, but a flagship without HDMI 2.0 support while bragging about 4K... Kinda weird, considering the availability of 4K TV's is far greater than it is of computer monitors...
AssimilatorYeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.
Use DP, and if your monitor doesn't support DP then it is your fault for buying a monitor which uses a soon-to-be-legacy connector (HDMI) which will get phased out in the future. Everyone knows DP is the future and that HDMI doesn't stand a chance.
I am interested in why it doesn't have HDMI 2.0, but they probably tried to extort some additional money for HDMI 2.0 and AMD stayed with HDMI 1.4.
All this fuss about some stupid thing over "it doesn't support a connector which is 4 years too late and for which you have to pay an annual fee plus a royalty rate per unit."
Die HDMI!!!
#32
RejZoR
Saying DP is the future when not a single TV supports it is a bit of a blunt statement. I don't think DP will ever be supported in LCD TVs. It hasn't been so far, so why would it be in the future? No device for the living room even has DP...
#33
praze
xfiathat's interesting.. i just saw similar reviews on 3 websites haha, i doubt every damn person is doing something wrong, so yeah, that is more than misleading.
from what you're saying it would cost you 100 bucks to do it, but there won't be lag, just not full color.
Yeah, it's a crappy position to be in, not having the right ports on both ends. The industry is full of shady companies making wild claims because cables are almost all profit.

The $100 adapter scenario would likely pull it off full-featured, but would be laggy due to the interruption in signal. This sort of thing has been happening with Dual-Link DVI adapters for years.
#34
chinmi
ha ha ha ha ha ha....
amd has failed once again....
#35
xfia
prazeYeah, it's a crappy position to be in, not having the right ports on both ends. The industry is full of shady companies making wild claims because cables are almost all profit.

The $100 adapter scenario would likely pull it off full-featured, but would be laggy due to the interruption in signal. This sort of thing has been happening with Dual-Link DVI adapters for years.
borders crazy if not there
#36
john_
chinmiha ha ha ha ha ha....
amd has failed once again....
Nvidia fanboys were in a coma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance are secondary anyway, to a connector.
#37
Luka KLLP
This isn't really a problem for the big bulky Fury (X), but for the Nano I can see this hurting sales...
#38
RejZoR
chinmiha ha ha ha ha ha....
amd has failed once again....
Despite me criticizing them a lot lately, I wouldn't say that. The majority of graphics cards still land in PCs that are connected to monitors.

The only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...
#39
xfia
RejZoRDespite me criticizing them a lot lately, I wouldn't say that. The majority of graphics cards still land in PCs that are connected to monitors.

The only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...
i really don't see it as an htpc type of card.. it should be around the performance of a 290 while being better at some things due to hbm. is it even worth an hdmi upgrade for tv's when they are only 1080p or 4k? i honestly don't know a single person that uses a 4k tv. i have 3 friends with 4k monitors and i use one at work for charting.
#40
praze
RejZoRThe only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...
Wow, that does amplify this problem a bit now that you mention it. And that "console-sized" Project Quantum rig they built with the dual-Fiji card and a power brick the size of a small dog is very confusing now.

I hope they get it together, compromise is the last thing they need in the current market. Lack of competition hurts the consumers in the long run.
#41
ZoneDymo
AssimilatorMaxwell 2, which is almost a year old, shipped with HDMI 2.0 support from day 1, yet AMD can't get it into their own "high-end" product. That's inexcusable.
Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.
Because after spending 550 dollars on a video card you are going to cry about 12 dollars' worth of adapter cable.... I'm guessing you run your GPU with a Pentium 4 and 1GB of RAM to save cost, right? Because after spending 550 dollars on a GPU, who would want to have to spend more money on other parts....

Can't even believe I wasted time typing out the obvious, as everyone here knows that's a non-argument troll comment.
#42
ZoneDymo
john_Nvidia fanboys were in a coma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance are secondary anyway, to a connector.
He is just mad because he was lied to by Nvidia and did not get the full 4GB he paid for.
The wound is still too fresh and it hurts every time he uses the PC.
#43
praze
xfiai really don't see it as an htpc type of card.. it should be around the performance of a 290 while being better at some things due to hbm.
I think its size is more of a happy side-effect of HBM than a push into the living room, for sure.
xfiais it even worth an hdmi upgrade for tv's when they are only 1080p or 4k? i honestly don't know a single person that uses a 4k tv. i have 3 friends with 4k monitors and i use one at work for charting.
While that's absolutely the case (small 4K TV market), I think it's a pivotal time where the price and content delivery options mean the market is about to grow rapidly. Especially this holiday season where the entire GTX 900 series already gives people that option. Hopefully they fix it by Holiday 2016 with some silicon tweaks.
#44
buggalugs
lol at people complaining about this. 95% of users will buy this card to use on a monitor with DisplayPort... and for the few percent that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't even support HDMI 2.0 yet.

So maybe there is 0.0000000001% of the market that will feel let down.

Sure, if you're planning to hold onto the card for 5 years and want to use a TV it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive anyway, and most people will have to buy a new 4K TV to support HDMI 2.0 anyway

If you're buying 4K TVs and $500-$600 graphics cards I'm sure you can afford an upgrade next year.
#45
Ferrum Master
Who the Sock cares?? And even if someone does care -

Buy a TV with DisplayPort then, like this Panasonic TC-L65WT600 has...
#47
xfia
prazeI think its size is more of a happy side-effect of HBM than a push into the living room, for sure.
While that's absolutely the case (small 4K TV market), I think it's a pivotal time where the price and content delivery options mean the market is about to grow rapidly. Especially this holiday season where the entire GTX 900 series already gives people that option. Hopefully they fix it by Holiday 2016 with some silicon tweaks.
heck yeah, a push for higher quality and resolutions. you know rendering efficiency is increased by like 1000 percent in some cases with dx12 and gddr5. it will be practically off the chart with hbm, especially with dx12.1. that could mean the furyx 4gb is roughly comparable to the titanx 12gb, if not even more in favor of the furyx.
@buggalugs it's not the cores being 28nm that could make them obsolete but dx12.1.. games and apps will certainly still be able to use 12.0 tho, if not 11.0 or 11.1 by then.
#48
Uplink10
Facts:
-you cannot play a decent video game (not those crappy indie games) at 4K resolution and 60 FPS
-if you need 4K resolution for viewing and editing text or images because 1080p does not look smooth enough, then 30 Hz is enough

Edit: You will not get a decent minimum of 30 FPS when playing at 4K
#49
r.h.p
I personally don't use a TV for gaming; it's for watching the news or DVDs to relax lol. I have a 28" 4K monitor running DP from an R9 290X for gaming, the perfect combo in my opinion. This new AMD Fury product looks awesome, reminds me of around the year 2000 when the AMD Athlon was kicking the Intel P4, and now it is back to kick Nvidia..... I am so proud lol :D
#50
mroofie
FluffmeisterThese comment sections always deliver.

Hehe.
Especially the adapter part lel :roll: