# AMD "Fiji" Silicon Lacks HDMI 2.0 Support



## btarunr (Jun 18, 2015)

It turns out that AMD's new "Fiji" silicon lacks HDMI 2.0 support after all. Commenting on the OCUK Forums, an AMD representative confirmed that the chip lacks support for the connector standard, implying that it's limited to HDMI 1.4a. HDMI 2.0 offers sufficient bandwidth for 4K Ultra HD resolution at 60 Hz. While the chip's other connectivity option, DisplayPort 1.2a, supports 4K at 60 Hz - as does every 4K Ultra HD monitor ever launched - the lack of HDMI 2.0 support hurts the chip's living room ambitions, particularly with products such as the Radeon R9 Nano, which AMD CEO Lisa Su stated is being designed for the living room. You wouldn't need a GPU this powerful for 1080p TVs (a GTX 960 or R9 270X ITX card will do just fine), and if it's being designed for 4K UHD TVs, then its HDMI interface will cap visuals at a console-rivaling 30 Hz.
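For a sense of the numbers involved, here is a rough link-budget sketch. The link rates and 8b/10b coding overhead are from the published HDMI/DisplayPort specs, the timing is the standard CTA-861 4K mode, and the helper names are mine:

```python
# Rough link budget for 4K UHD using CTA-861 timing:
# 3840x2160 active pixels inside a 4400x2250 total (blanked) frame.

BLANKED_PIXELS = 4400 * 2250   # pixels per frame, including blanking
BPP = 24                       # 8-bit RGB / YCbCr 4:4:4

def payload_gbps(refresh_hz: int, bpp: int = BPP) -> float:
    """Video payload rate in Gbit/s for the blanked 4K timing."""
    return BLANKED_PIXELS * refresh_hz * bpp / 1e9

# Usable payload = raw link rate x 0.8 (8b/10b line coding overhead)
LINKS = {
    "HDMI 1.4a":     10.2 * 0.8,   # ~8.16 Gbit/s
    "HDMI 2.0":      18.0 * 0.8,   # ~14.4 Gbit/s
    "DP 1.2 (HBR2)": 21.6 * 0.8,   # ~17.28 Gbit/s
}

for name, capacity in LINKS.items():
    for hz in (30, 60):
        verdict = "OK" if payload_gbps(hz) <= capacity else "too slow"
        print(f"{name}: 4K @ {hz} Hz -> {verdict}")
```

4K60 needs about 14.26 Gbit/s of payload, which squeaks under HDMI 2.0's ~14.4 Gbit/s and DP 1.2's ~17.28 Gbit/s but is nearly double what HDMI 1.4a can carry - hence the 30 Hz cap.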





*View at TechPowerUp Main Site*


----------



## dwade (Jun 18, 2015)

So much for focusing on 4k when most 4k gamers own a tv instead of smallish monitors. Times have changed, AMD. Get with it.


----------



## mroofie (Jun 18, 2015)

lel fail 



RejZoR said:


> Sure I'm on DisplayPort, but a flagship without HDMI 2.0 support *while bragging about 4K... Kinda weird*, considering the availability of 4K TV's is far greater than it is of computer monitors...


Finally someone who gets it !! 








the54thvoid said:


> No. Measured responses so far but your own comment is quite blatantly troll bait. *Congratulations on starting the self fulfilling prophecy*.



lel that line 

Here take a cookie


----------



## the54thvoid (Jun 18, 2015)

Issue or non-issue? Let the brand loyalists decide. I use a monitor, so it has no impact on me.


----------



## SimpleTECH (Jun 18, 2015)




----------



## Champ (Jun 18, 2015)

I can see how this would be an issue. I had an HTPC gaming machine and want to build another - a 4K version. When you're gaming on the living room telly, you need the most of everything you can get.


----------



## RejZoR (Jun 18, 2015)

Sure I'm on DisplayPort, but a flagship without HDMI 2.0 support while bragging about 4K... Kinda weird, considering the availability of 4K TV's is far greater than it is of computer monitors...


----------



## Xaled (Jun 18, 2015)

Did you figure that out while testing the card? :> Review date please, I couldn't get any information about that or about the NDA anywhere


----------



## HumanSmoke (Jun 18, 2015)

Xaled said:


> Did you figure that out while testing the card? :> Review date please, I couldn't get any information about that or about the NDA anywhere


Reading is key.
From the second sentence of the article:


> *Commenting on OCUK Forums, an AMD representative* confirmed that the chip lacks support for the connector standard..


If you click on the source hyperlink, you'll get it from the proverbial horse's mouth.


----------



## jigar2speed (Jun 18, 2015)

And the Nvidia fanboys go wild.


----------



## the54thvoid (Jun 18, 2015)

jigar2speed said:


> And the Nvidia fanboys go wild.



No. Measured responses so far but your own comment is quite blatantly troll bait. Congratulations on starting the self fulfilling prophecy.


----------



## Xzibit (Jun 18, 2015)

*This might help*


----------



## the54thvoid (Jun 18, 2015)

Xzibit said:


> *This might help
> 
> 
> 
> ...



What is this technical wizardry? You magician, begone with.... Oh yeah, a practical workaround. Funnily enough.


----------



## ZoneDymo (Jun 18, 2015)

Xzibit said:


> *This might help
> 
> 
> 
> ...



I know right? lol wtf are we even talking about here? just get a damn adapter, honestly.


----------



## ZoneDymo (Jun 18, 2015)

dwade said:


> So much for focusing on 4k when most 4k gamers own a tv instead of smallish monitors. Times have changed, AMD. Get with it.



I would like to know where you got that information


----------



## nekrik (Jun 18, 2015)

AMD: "We have a true 4K GPU. It gives 55 FPS average in Crysis, yay!!! BUT!! only with reduced 4:2:2 color subsampling (HDMI 2.0 bandwidth can do 4:4:4)" - the latter written in small letters.
So it is not a real 4K GPU, and with 4 GB of HBM VRAM it could not be. Benchmarks are one story; real games at 4K will eat that 4 GB of VRAM for breakfast - see GTA V, Shadow of Mordor and others.
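The subsampling point can be made concrete with a small sketch. The bits-per-pixel values are the usual averages for 8-bit YCbCr, the link figures and helper names are mine, and note that at HDMI 1.4-class bandwidth even 4:2:2 doesn't actually fit 4K60 - the spec's escape hatch is 4:2:0:

```python
# Average bits per pixel for 8-bit YCbCr at common chroma subsampling
# ratios, checked against link payload capacities for blanked 4K60.

PIXEL_CLOCK_4K60 = 594e6       # Hz, CTA-861 4K60 timing incl. blanking

SUBSAMPLING_BPP = {
    "4:4:4": 24,   # full chroma resolution
    "4:2:2": 16,   # chroma halved horizontally
    "4:2:0": 12,   # chroma halved in both directions
}

HDMI14_PAYLOAD = 10.2 * 0.8    # ~8.16 Gbit/s after 8b/10b coding
HDMI20_PAYLOAD = 18.0 * 0.8    # ~14.4 Gbit/s

def fits(link_payload_gbps: float, mode: str) -> bool:
    """Does blanked 4K60 in this subsampling mode fit the link?"""
    needed = PIXEL_CLOCK_4K60 * SUBSAMPLING_BPP[mode] / 1e9
    return needed <= link_payload_gbps

for mode in SUBSAMPLING_BPP:
    print(f"HDMI 1.4 @ 4K60, {mode}: {fits(HDMI14_PAYLOAD, mode)}")
print(f"HDMI 2.0 @ 4K60, 4:4:4: {fits(HDMI20_PAYLOAD, '4:4:4')}")
```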


----------



## Lionheart (Jun 18, 2015)

AMD wtf!! You're bragging about 4K & 5K yet you can't support the new HDMI 2.0 standard?? Guess I'm going to need one of those converting cables that Xzibit posted...


----------



## Assimilator (Jun 18, 2015)

Maxwell 2, which is almost a year old, shipped with HDMI 2.0 support from day one, yet AMD can't get it into their own "high-end" product. That's inexcusable.



ZoneDymo said:


> I know right? lol wtf are we even talking about here? just get a damn adapter, honestly.



Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.


----------



## praze (Jun 18, 2015)

Xzibit said:


> *This might help
> 
> 
> 
> ...



It won't; that cable is capped at 30 Hz, just like the HDMI 1.4a port on the card. All this will do is cost you $45 and introduce lag.

Edit: source


----------



## RejZoR (Jun 18, 2015)

*It also says this for the adapter cable:*
Maximum Resolution: 4K @ 30 Hz (note: at time of writing no chipset supports 60 Hz)

Meaning DisplayPort supports 60 Hz anyway (I have it at 144 Hz), and the output is HDMI 2.0, which means it should work at 60 Hz on an LCD TV that has an HDMI 2.0 input. It's just that at the time of writing no devices supported this. The question here is, when did they write the cable description...


----------



## xfia (Jun 18, 2015)

Well, looks like the rest of the industry needs a kick in the right direction. DisplayPort, which has been in production for like 7 years, is much more advanced than HDMI 2.0, which has been around for like 3. It's what a 5K display will need to work, and it goes all the way up to 8K at 60 Hz with subsampling.. which is what HDMI needs to do 4K at 60 Hz, as mentioned.
Non-issue for almost anyone that will buy them.


----------



## jigar2speed (Jun 18, 2015)

praze said:


> It won't, that cable is capped at 30 Hz just like the HDMI 1.4a port on the card. All this will do is cost you $45 and introduce lag.
> 
> Edit: source


Found this - DisplayPort to HDMI 2.0 - http://www.amazon.com/dp/B00E964YGC/?tag=tec06d-20

But in the review section people are still complaining - is it related to drivers?


----------



## Xaled (Jun 18, 2015)

HumanSmoke said:


> Reading is key.
> From the second sentence of the article:
> 
> If you click on the source hyperlink. you'll get it from the proverbial horses mouth.








And the review date? Do you have any info about it?


----------



## Lou007 (Jun 18, 2015)

Found these interesting facts:

The HDMI (High Definition Multimedia Interface) specification was conceived more than ten years ago by six consumer electronics giants: Hitachi, Panasonic, Philips, Silicon Image, Sony, and Toshiba. Today, HDMI Licensing, LLC, a wholly owned subsidiary of Silicon Image, controls the spec. Manufacturers must pay a royalty for including HDMI into their products.

The DisplayPort specification was developed by, and remains under the control of, the Video Electronics Standards Association (VESA), a large consortium of manufacturers ranging from AMD to ZIPS Corporation. DisplayPort debuted in 2006 as part of an effort to supplant the much older VGA (Video Graphics Array, an analog interface first introduced in 1987) and DVI (Digital Visual Interface, introduced in 1999) standards used primarily for computer displays. DisplayPort is a royalty-free product.

Fun fact: Of the six companies responsible for the creation of HDMI, only Hitachi and Philips are not also member companies of VESA.

http://www.pcworld.com/article/2030...t-which-display-interface-reigns-supreme.html

When comparing DisplayPort 1.3 with HDMI 2.0, DisplayPort 1.3 has several key advantages. First, the video bandwidth of DisplayPort 1.3 is much higher than HDMI 2.0.  This means that DisplayPort 1.3 can support higher resolution timing such as 8K at 60Hz, where as HDMI 2.0 can support 4K at 60Hz max. Second, DisplayPort 1.3 has the ability to transmit multiple video streams on one cable through the MST feature allowing multiple monitors to be daisy-chained together (although there are limitations as to the number of displays and resolution supported, which makes it more appropriate for desktop uses than video walls).  Finally DisplayPort includes installer-friendly locking connectors. HDMI doesn’t natively support locking connectors though many Planar products do provide support for threaded hex nuts for special-locking HDMI connector - See more at: http://www.planar.com/blog/2014/12/15/displayport-13-vs-hdmi-20/#sthash.jrl6QSDr.dpuf

I can now understand why AMD is pushing DisplayPort over HDMI 2.0.
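The bandwidth gap the quote describes can be sanity-checked in a few lines. This is only active-pixel math (blanking ignored for simplicity); the link figures follow the published DP 1.3 / HDMI 2.0 specs, and the function name is mine:

```python
# Compare usable link payload (raw rate x 0.8 for 8b/10b coding) against
# the video rate for the active pixels alone (blanking ignored).

def video_gbps(w: int, h: int, hz: int, bpp: int) -> float:
    """Raw video payload in Gbit/s for active pixels only."""
    return w * h * hz * bpp / 1e9

HDMI20 = 18.0 * 0.8   # ~14.4 Gbit/s usable
DP13   = 32.4 * 0.8   # ~25.92 Gbit/s usable (HBR3 x 4 lanes)

assert video_gbps(3840, 2160, 60, 24) <= HDMI20  # 4K60 4:4:4 fits HDMI 2.0
assert video_gbps(7680, 4320, 60, 24) > DP13     # 8K60 4:4:4 exceeds even DP 1.3
assert video_gbps(7680, 4320, 60, 12) <= DP13    # 8K60 needs 4:2:0 subsampling
print("all checks pass")
```

Which lines up with the quote: DP 1.3 reaches 8K at 60 Hz only with subsampling, while HDMI 2.0 tops out around 4K at 60 Hz.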


----------



## xfia (Jun 18, 2015)

Lou007 said:


> When comparing DisplayPort 1.3 with HDMI 2.0, DisplayPort 1.3 has several key advantages. First, the video bandwidth of DisplayPort 1.3 is much higher than HDMI 2.0.  This means that DisplayPort 1.3 can support higher resolution timing such as 8K at 60Hz, where as HDMI 2.0 can support 4K at 60Hz max. Second, DisplayPort 1.3 has the ability to transmit multiple video streams on one cable through the MST feature allowing multiple monitors to be daisy-chained together (although there are limitations as to the number of displays and resolution supported, which makes it more appropriate for desktop uses than video walls).  Finally DisplayPort includes installer-friendly locking connectors. HDMI doesn’t natively support locking connectors though many Planar products do provide support for threaded hex nuts for special-locking HDMI connector - See more at: http://www.planar.com/blog/2014/12/15/displayport-13-vs-hdmi-20/#sthash.jrl6QSDr.dpuf


The big picture they've advertised is running 1440p Eyefinity at 60 Hz at full color.


----------



## john_ (Jun 18, 2015)

How about this one?

Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics






Supports* HDMI 2.0 Technology*, which increases bandwidth from 10.2 Gbps to 18 Gbps and is 4k and Ultra HD compatible. Increases from 8 Audio Channels to 32 Audio Channels for expanded audio. *60 fps video playback at 4k resolution*. Dynamic synchronization of video and audio streams.

$11.56


----------



## rvalencia (Jun 18, 2015)

Panasonic's Smart VIERA WT600 has both DP1.2a and HDMI 2.0.


----------



## Lou007 (Jun 18, 2015)

john_ said:


> How about this one?
> 
> Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics
> 
> ...



So to support those features would the HDMI cable have to be certified to carry those signals? You probably couldn't connect a cheap branded HDMI cable, is what I'm saying.


----------



## xfia (Jun 18, 2015)

john_ said:


> How about this one?
> 
> Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics
> 
> ...


That will do it, but the bottleneck is still HDMI by most PC gamers' standards.. like, why buy the best GPUs around and a 4K display when you can't run full color? Fine by some people's standards, but that cuts out most production work and any enthusiast-type artist rendering right off the top.


----------



## praze (Jun 18, 2015)

RejZoR said:


> *It also says this for the adapter cable:*
> Maximum Resolution: 4k @ 30 Hz (note: at time of writing no chipset supports 60Hz)
> 
> Meaning DisplayPort supports 60Hz anyway (I have it at 144Hz), output is HDMI 2.0 which means it should work at 60Hz on a LCD TV that has HDMI 2.0 input. Just at the time of writing no devices supported this. Question here is, when did they write the cable description...





jigar2speed said:


> Found this-  Display port to HDMI 2 -  http://www.amazon.com/dp/B00E964YGC/?tag=tec06d-20
> 
> But in the review section people are still complaining - is it related to driver ?



Stop. Neither of these cables supports 4K at 60 Hz; you're gonna make someone waste time and money. Just look at the reviews.

Claiming "compatibility" with a backwards-compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV. 

A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.


----------



## xfia (Jun 18, 2015)

praze said:


> Stop. Neither of these cables supports 4K at 60 Hz; you're gonna make someone waste time and money. Just look at the reviews.
> 
> Claiming "compatibility" with a backwards-compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV.
> 
> A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.


That's interesting.. I just saw similar reviews on 3 websites haha, and I doubt every damn person is doing something wrong, so yeah, that is more than misleading.
From what you're saying it would cost you 100 bucks to do it, but there won't be lag, just not full color.


----------



## Uplink10 (Jun 18, 2015)

RejZoR said:


> Sure I'm on DisplayPort, but a flagship without HDMI 2.0 support while bragging about 4K... Kinda weird, considering the availability of 4K TV's is far greater than it is of computer monitors...





Assimilator said:


> Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.


Use DP, and if your monitor doesn't support DP then it is your fault for buying a monitor which uses a soon-to-be-legacy connector (HDMI) which will get phased out in the future. Everyone knows DP is the future and that HDMI doesn't stand a chance.
I am interested in why it doesn't have HDMI 2.0, but they probably tried to extort some additional money over HDMI 2.0 and they stayed with HDMI 1.4.
All this fuss about some stupid thing over "it doesn't support a connector which is 4 years too late and for which you have to pay an annual fee plus a royalty rate per unit."
Die HDMI!!!


----------



## RejZoR (Jun 18, 2015)

Saying DP is the future when not a single TV supports it is a bit of a blunt statement. I don't think DP will ever be supported in LCD TVs. It hasn't been so far, so why would it be in the future? No device for the living room even has DP...


----------



## praze (Jun 18, 2015)

xfia said:


> thats interesting.. i just seen similar reviews on 3 websites haha i doubt every damn person is doing something wrong so yeah that is more than misleading.
> from what your saying it would cost you 100 bucks to do it but there wont be lag just not full color.



Yeah, it's a crappy position to be in, not having the right ports on both ends. The industry is full of shady companies making wild claims because cables are almost all profit.

The $100 adapter scenario would likely pull it off full-featured, but would be laggy due to the interruption in signal. This sort of thing has been happening with Dual-Link DVI adapters for years.


----------



## chinmi (Jun 18, 2015)

ha ha ha ha ha ha.... 
amd has failed once again....


----------



## xfia (Jun 18, 2015)

praze said:


> Yeah, it's a crappy position to be in, not having the right ports on both ends. The industry is full of shady companies making wild claims because cables are almost all profit.
> 
> The $100 adapter scenario would likely pull it off full-featured, but would be laggy due to the interruption in signal. This sort of thing has been happening with Dual-Link DVI adapters for years.


borders crazy if not there


----------



## john_ (Jun 18, 2015)

chinmi said:


> ha ha ha ha ha ha....
> amd has failed once again....


Nvidia fanboys where in a comma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance is secondary anyway, to a connector.


----------



## Luka KLLP (Jun 18, 2015)

This isn't really a problem for the big bulky Fury (X), but for the Nano I can see this hurting sales...


----------



## RejZoR (Jun 18, 2015)

chinmi said:


> ha ha ha ha ha ha....
> amd has failed once again....



Despite me criticizing them a lot lately, I wouldn't say that. The majority of graphics cards still land in PCs that are connected to monitors.

The only card that is really gimped because of it is the R9 Nano. It's an HTPC type of card, and without HDMI 2.0 support it kinda loses its huge selling point...


----------



## xfia (Jun 18, 2015)

RejZoR said:


> Despite me criticizing them a lot lately, I wouldn't say that. Majority of graphic cards still land in the PC's that are connected to monitors.
> 
> Only card that is really gimped because of it is R9 Nano really. It's a HTPC type of card and without HDMI 2.0 support, it kinda loses it's huge selling point...


I really don't see it as an HTPC type of card.. it should be around the performance of a 290 while being better at some things due to HBM. Is it even worth an HDMI upgrade for TVs when they are only 1080p or 4K? I honestly don't know a single person that uses a 4K TV. I have 3 friends with 4K monitors, and I use one at work for charting.


----------



## praze (Jun 18, 2015)

RejZoR said:


> Only card that is really gimped because of it is R9 Nano really. It's a HTPC type of card and without HDMI 2.0 support, it kinda loses it's huge selling point...



Wow, that does amplify this problem a bit now that you mention it. And that "console-sized" Project Quantum rig they built with the dual-Fiji card and a power brick the size of a small dog is very confusing now. 

I hope they get it together, compromise is the last thing they need in the current market. Lack of competition hurts the consumers in the long run.


----------



## ZoneDymo (Jun 18, 2015)

Assimilator said:


> Maxwell 2, which is almost a year old shipped with HDMI 2.0 support from day 1, yet AMD can't get it into their own "high-end" product. That's inexcusable.
> 
> 
> 
> Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.



Because after spending 550 dollars on a video card you are going to cry about 12 dollars' worth of adapter cable.... I'm guessing you run your GPU with a Pentium 4 and 1 GB of RAM to save cost, right? Because after spending 550 dollars on a GPU, who would want to spend more money on other parts....

Can't even believe I wasted time typing out the obvious, as everyone here knows that's a non-argument troll comment.


----------



## ZoneDymo (Jun 18, 2015)

john_ said:


> Nvidia fanboys where in a comma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance is secondary anyway, to a connector.



He is just mad because he was lied to by Nvidia and did not get the full 4 GB he paid for.
The wound is still too fresh, and it hurts every time he uses the PC.


----------



## praze (Jun 18, 2015)

xfia said:


> i really dont see it as a htpc type of card.. it should around the performance of a 290 while being better at some things due to hbm.



I think its size is more of a happy side-effect of HBM than a push into the living room, for sure. 



xfia said:


> was it is even worth a hdmi upgrade for tv's when they are are only 1080p or 4k? i honestly dont know a single person that uses a 4k tv. i have 3 friends with 4k monitors and i use one at work for charting.



While that's absolutely the case (small 4K TV market), I think it's a pivotal time where the price and content delivery options mean the market is about to grow rapidly. Especially this holiday season where the entire GTX 900 series already gives people that option. Hopefully they fix it by Holiday 2016 with some silicon tweaks.


----------



## buggalugs (Jun 18, 2015)

lol at people complaining about this. 95% of users will buy this card to use on a monitor with DisplayPort......and for the few percent that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't even support HDMI 2.0 yet.

So maybe there is 0.0000000001% of the market that will feel let down.

Sure, if you're planning to hold onto the card for 5 years and want to use a TV it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive anyway, and most people will have to buy a new 4K TV to support HDMI 2.0 anyway.

If you're buying 4K TVs and $500-$600 graphics cards, I'm sure you can afford an upgrade next year.


----------



## Ferrum Master (Jun 18, 2015)

Who the sock cares?? And even if someone does care -

buy a TV with DisplayPort then, like this Panasonic TC-L65WT600 has...


----------



## Fluffmeister (Jun 18, 2015)

These comment sections always deliver.

Hehe.


----------



## xfia (Jun 18, 2015)

praze said:


> I think its size is more of a happy side-effect of HBM than a push into the living room, for sure.
> 
> 
> 
> While that's absolutely the case (small 4K TV market), I think it's a pivotal time where the price and content delivery options mean the market is about to grow rapidly. Especially this holiday season where the entire GTX 900 series already gives people that option. Hopefully they fix it by Holiday 2016 with some silicon tweaks.



Heck yeah, a push for higher quality and resolutions. You know rendering efficiency is increased by like 1000 percent in some cases with DX12 and GDDR5. It will be practically off the chart with HBM, especially with DX12.1. That could mean the Fury X 4 GB is roughly comparable to the Titan X 12 GB, if not even more in favor of the Fury X.
@buggalugs it's not the cores being 28nm that could make them obsolete but DX12.1.. games and apps will certainly still be able to use 12.0 tho, if not 11.0 or 11.1, by then.


----------



## Uplink10 (Jun 18, 2015)

Facts:
-you cannot play a decent video game (not those crappy indie games) at 4K resolution and at 60 FPS
-if you need 4K resolution for viewing and editing text or images because 1080p does not look sharp enough, 30 Hz is enough

Edit: You will not get a decent *minimum* of 30 FPS when playing at 4K


----------



## r.h.p (Jun 18, 2015)

I personally don't use a TV for gaming; it's for watching the news or DVDs to relax lol. I have a 28" 4K monitor running DP from an R9 290X for gaming - a perfect combo in my opinion. This new AMD Fury product looks awesome, reminds me of around the year 2000 when* AMD Athlon* _was kicking_ Intel P4; now it is back to kick Nvidia ..... I am so proud lol


----------



## mroofie (Jun 18, 2015)

Fluffmeister said:


> These comment sections always deliver.
> 
> Hehe.


Especially the adapter part lel


----------



## Uplink10 (Jun 18, 2015)

r.h.p said:


> I personally don't use a TV for gaming


That is the way to go. Who the hell buys a TV for gaming? Because:
-you do not even know the max frequency of the input signal, because they are always advertising falsely and saying it has 400 Hz, 200 Hz, 600 Hz - and I do not care how many times their LEDs blink - and other false advertising methods

Who even needs a TV tuner nowadays? Put a mini PC behind it (a monitor) and enjoy the best content you can get from a NAS in the house, the Internet, Netflix...


----------



## xfia (Jun 18, 2015)

Fluffmeister said:


> These comment sections always deliver.
> 
> Hehe.


Well, we all gotta learn, and engineers and customer support get stuff wrong too sometimes. Does not really help with so much misleading information and advertising flying around. 

Really does not help when a mod.. namely @Mussels, wants a human experiment instead of making a list of questions and diagnostic tools for people's issues.  

I wonder if it's that he didn't see the recent Through the Wormhole that explained the non-critical thinking humans use, especially with no incentive, or that he did see it and wanted to screw with people.


----------



## mroofie (Jun 18, 2015)

Well, guess what, one day I'm going to get DB Xenoverse, then connect my new PC to the TV (HDMI 2.0) and then this happens

(ps just trolling / joking)



RejZoR said:


> Saying DP is a future when not a single TV supports it is a bit blunt statement. I don't think DP will ever be supported in LCD TV's. It hasn't been so far, why would it be in the future? No device for the living room even has DP...



Well, it's 30 Hz for most then :/
Still can't understand AMD's decision.



Assimilator said:


> I can't speak for anyone else, but I've never been inside a , before. Mind you, I've also never misspelled words like a retard, so there's that.
> 
> 
> 
> Apparently reading isn't your strong point, allow me to assist:



Wait, you are in South Africa???


----------



## semantics (Jun 18, 2015)

HDCP support is really the concern here.


----------



## Bytales (Jun 18, 2015)

Although it's a bad thing it doesn't have HDMI 2.0, since the newest 4K TVs have HDMI, in the end it's pointless because DisplayPort is the future.


----------



## HumanSmoke (Jun 18, 2015)

r.h.p said:


> I have a 28" 4k Monitor running DP from a R9 290x for Gaming, perfect combo in my opinion.


4K gaming with a 290X...a perfect combo?


----------



## Assimilator (Jun 18, 2015)

john_ said:


> Nvidia fanboys *where in a comma* yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance is secondary anyway, to a connector.



I can't speak for anyone else, but I've never been inside a , before. Mind you, I've also never misspelled words like a retard, so there's that.



ZoneDymo said:


> Because after spending 550 dollars on a video card you are going to cry about 12 dollars worth of adapter cable.... Im guessing you run your gpu with a Pentium 4 and 1gb of ram to save cost right? because after spending 550 dollars on a gpu who would want to have to spend more money on other parts....



Apparently reading isn't your strong point, allow me to assist:



praze said:


> Stop. Neither of these cables supports 4K at 60 Hz; you're gonna make someone waste time and money. Just look at the reviews.
> 
> Claiming "compatibility" with a backwards-compatible port like HDMI 2.0 is misleading in both of these products. It's like saying a PS3 is "HDMI 2.0 compatible" just because it will display a 1080p image on your new 4K TV.
> 
> A cable actually capable of this feat would likely cost $100+ and require external power, all while introducing lag.


----------



## xfia (Jun 18, 2015)

Uplink10 said:


> Facts:
> -you cannot play a decent video game (not those crappy indie games) at 4K resolution and at 60 FPS
> -if you need 4K resolution for viewing and editing text or images because 1080p does not look smooth enough, 30 Hz is enough
> 
> Edit: You will not get a decent *minimal* 30 FPS when playing at 4K


What, maxed out with shoddy GameWorks on? You're just going by benchmarks that use unrealistic settings, like 4x MSAA at 4K.. it's laughable in some regard, but not really your fault.


----------



## Tatty_One (Jun 18, 2015)

r.h.p said:


> I personally don't use a TV for gaming , its for watching the News or DVDS to relax lol. I have a 28" 4k Monitor running DP from a R9 290x for Gaming, perfect combo in my opinion. This new AMD Fury product looks awesome , reminds me of around year 2000 when* AMD Athlon* _WAS kicking_ Intel P4 , now it is back to kick Nvidia ..... I am so proud lol


Can I ask..... proud of what?  Proud of the fact that it does not come with a level of HDMI technology that its competitors do have, or proud because you think it will just give better performance than its main competitors?


----------



## Rowsol (Jun 18, 2015)

jigar2speed said:


> And the Nvidia fanboys go wild.



What a useless post.  Please, go away.

I voted "yes" in the poll. I don't personally have a 4K anything, but I don't see why they would omit that.


----------



## Basard (Jun 18, 2015)

So, do modern TVs not have DisplayPort on them?  I dunno, haven't owned a TV in almost a decade now... just asking....


----------



## xfia (Jun 18, 2015)

Tatty_One said:


> Can I ask..... proud of what?  Proud of the fact that it does not come with a level of HDMI technology that it's competitors do have or proud because you think it will just give better performance than it's main competitors?


@r.h.p don't do it.. it might just be more of the human experiment I already posted about on technet. Maybe I will get to see what the news thinks if I shoot them an email.


----------



## mroofie (Jun 18, 2015)

xfia said:


> @r.h.p dont do it.. it might just be more of the human experiment i already posted on technet. maybe i will get to see what the news thinks if i shoot them a email.



what human experiment  ?? 
WTF are you talking about


----------



## xfia (Jun 18, 2015)

mroofie said:


> what human experiment  ??
> WTF are you talking about


http://www.techpowerup.com/forums/t...her-clocked-i7-with-280x.213559/#post-3299766
For your viewing pleasure, so please share. I think I figured it out, but I wouldn't know without the right diagnostic tools.


----------



## Ferrum Master (Jun 18, 2015)

Basard said:


> So, do modern TVs not have display ports on them?  I dunno, haven't owned a TV in almost a decade now... just asking....



There are, but not many yet... Philips and Panasonic have some... haven't seen more... I don't have any interest in TVs as such, I treat them as crap... I have a projector on a 135-inch screen... size does matter.


----------



## Lou007 (Jun 18, 2015)

But didn't AMD, as part of VESA, adopt DP to phase out VGA? It was never intended to be used on televisions, but more as a new connector for LCD monitors, slowly replacing the analogue input while also able to carry the audio signal. The intent was to make it widely available and free for monitor makers to install on all new hardware, unlike HDMI, which has a charge attached for any company intending to adopt it.


----------



## john_ (Jun 18, 2015)

Assimilator said:


> I can't speak for anyone else, but I've never been inside a , before. Mind you, I've also never misspelled words like a retard, so there's that.


I can write it in Greek if you prefer. No misspelling there. By the way, who is the retard here? The person who makes a mistake writing in another language, or the person who comments about it, like you did? Anyway, I can understand you being upset. Try to relax.


----------



## ZoneDymo (Jun 18, 2015)

Assimilator said:


> I can't speak for anyone else, but I've never been inside a , before. Mind you, I've also never misspelled words like a retard, so there's that.
> 
> 
> 
> Apparently reading isn't your strong point, allow me to assist:



So reading is not my strong point... so you copy-paste someone's text again....
And you think that would assist a person who you claim is not too great at reading....


----------



## the54thvoid (Jun 18, 2015)

Such an ill tempered thread. How about people stop being dicks and stick to the topic.


----------



## r.h.p (Jun 18, 2015)

HumanSmoke said:


> 4K gaming with a 290X...a perfect combo?



Um... yeah, I get what you mean HumanSmoke, my wording was a bit fuzzy  ..... "perfect combo" was supposed to mean my setup with the DP cable
runs BFH at 2560x1440 @ 60 Hz on ultra at 50 FPS. It's not true 4K, but I am presuming that my new *AMD* Radeon R9 Fury X will change that. So who needs the HDMI port??


----------



## mister2 (Jun 18, 2015)

This news saddens me.  I was hoping I could finally offload the 980 in my HTPC (RVZ-01) for something faster that didn't have driver issues.  I game on my couch with full surround sound and a 65" 4K TV.  

It's a big oversight by AMD.  The cost of adding the support is ridiculously small for the benefit of expanding your audience.  Sigh...


----------



## r.h.p (Jun 18, 2015)

Tatty_One said:


> Can I ask..... proud of what?  Proud of the fact that it does not come with a level of HDMI technology that it's competitors do have or proud because you think it will just give better performance than it's main competitors?



I am proud that AMD have come back with a product that can kick Nvidia performance-wise ...... anyway, ova it now lol


----------



## xfia (Jun 18, 2015)

r.h.p said:


> I am proud that AMD have come back with a product that can Kick Nvidia performance wise ...... anyway ova it now lol


amd sets industry open standards and nvidia tries to own them


----------



## Whilhelm (Jun 18, 2015)

Well, so much for my interest in Fiji. I was looking forward to ditching my GTX 980 SLI setup in favor of a single 4K-capable card from team red.

This is such a small thing to overlook, and it completely prevents me from buying this card. I have a 4K 40" TV as my main monitor that uses HDMI 2.0, and having to buy a different TV because AMD decided to ignore HDMI 2.0 is crazy. 

As much as DisplayPort is a better solution, it's going to be years before HDMI is displaced as a TV connection standard. So for AMD to decide not to include it as a 4K connectivity option is a mistake.


----------



## Tatty_One (Jun 18, 2015)

Whilhelm said:


> Well so much for my interest in Fiji, I was looking forward to ditching my GTX 980 SLI setup in favor of a single 4k capable card from team red.
> 
> This is such a small thing to overlook that completely prevents me from being able to buy this card. I have a 4k 40" TV as my main monitor that uses HDMI 2.0 and having to buy a different TV because AMD decided to ignore HDMI 2.0 is crazy.
> 
> As much as Display port is a better solution its going to be years before HDMI is displaced as a TV connection standard. So, for AMD to decide to not include it as a 4k connectivity option is a mistake.


Thank you, that answers the question at the end of R.H.P's post #71. All my point is/was: just because this limitation does not hinder some users, AMD is still, in my opinion, missing a trick and an opportunity with all those 4K TV owners who don't want to spend another bunch of cash on a monitor. In my case it's more about commiserating with 4K TV owners than criticising AMD, but both go hand in hand to a degree.


----------



## GhostRyder (Jun 18, 2015)

Assimilator said:


> Maxwell 2, which is almost a year old shipped with HDMI 2.0 support from day 1, yet AMD can't get it into their own "high-end" product. That's inexcusable.
> Yeah because after I've spent $550 on a "high-end" video card I want to have to spend more money on a cable so that card can work properly.


Yea, guess what, it's still cheaper than the Titan X...

Odd for them to not include that; I had thought it was confirmed to be part of the Fiji card... Well, it's definitely not smart of them to leave it out, especially with the focus on 4K. However, I find DP and monitors to be better as it is, and that is what I use for my 4K experience, so it would not bother me one bit.


----------



## 64K (Jun 18, 2015)

I think AMD should have included HDMI 2.0 support on their high end cards. They will lose some sales of Fiji over this but how many? The last Steam Hardware Survey I looked at said that about 1 out of 1,650 are gaming at 4K.


----------



## FordGT90Concept (Jun 18, 2015)

Xzibit said:


> *This might help
> 
> 
> 
> ...


NOooooooope! That's HDMI 1.4 (30Hz max at 4K).



jigar2speed said:


> Found this-  Display port to HDMI 2 -  http://www.amazon.com/dp/B00E964YGC/?tag=tec06d-20
> 
> But in the review section people are still complaining - is it related to driver ?


Noooooope, HDMI 1.4.  Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.

HDMI 2 support is near non-existent.  Why would Fiji be an exception to the rule?  I'm disappointed it doesn't have it, but at the same time, I don't care.  DisplayPort is the future, not HDMI.




john_ said:


> How about this one?
> 
> Amazon.com: Belkin Displayport to HDMI Adapter (Supports HDMI 2.0): Electronics
> 
> ...


Look at the reviews.  Advertised as 4K: only does 30 Hz; horrible reviews.  This is the problem with HDMI 2.  They keep trying to ram more bits through that hose without changing the hose.  The connectors may be able to handle the advertised 60 Hz, but the cables cannot.  When genuine 2.0-compliant cables debut, they'll probably be like $20+ per foot because of the massive insulation required to prevent crosstalk and interference.  HDMI has always been the retard of the display standards, taking the DVI spec, tacking audio on to it, and disregarding all the controls VESA put on it to guarantee the cable will work.  This was an inevitability with HDMI because the people behind HDMI haven't a clue what they're doing.  This is why VESA didn't get behind HDMI and why they went off and designed their own standard that's actually prepared to handle the task.  HDMI is not suitable for 4K and likely never will be.


The Fury X manual only mentions DisplayPort 1.2 repeatedly.  I don't know if it supports 1.3:
http://support.amd.com/Documents/amd-radeon-r9-fury-x.pdf


----------



## mroofie (Jun 18, 2015)

GhostRyder said:


> Yea guess what, its still cheaper than the Titan X...
> 
> Odd for them to not include that, I had thought it was confirmed to be part of the Fiji card...Well its definitely not smart of them to not include it especially with the focus at 4k.  However, I find DP and monitors to be better as it is and that is what I use for my 4k experience so it would not bother me one bit.


and the gtx 980 ti ?


----------



## [XC] Oj101 (Jun 18, 2015)

john_ said:


> Nvidia fanboys where in a comma yesterday. They show signs of life again today, thanks to the lack of a connector, because efficiency and performance is secondary anyway, to a connector.



I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature


----------



## Whilhelm (Jun 18, 2015)

FordGT90Concept said:


> HDMI 2 support is near non-existant.  Why would Fiji be an exception to the rule?  I'm disappointed it doesn't but at the same time, I don't care.  DisplayPort is the future, not HDMI.
> 
> HDMI is not suitable for 4K and likely never will be.



All new 4k TVs are HDMI 2.0 compliant and this standard isn't going anywhere anytime soon. UHD BluRay is imminent and all those players will be HDMI 2.0. Aside from 2.0 pretty much every other device uses some variant of HDMI so it is the standard for connectivity in the home theater world.

I agree that Displayport is a better solution but at the time this card is being released HDMI 2.0 is the connection that is required by the HTPC market. If Display port ever takes off as a TV connection it will be too late for this card since there will be something better by that time. To me it simply does not make sense for AMD to not adopt the standard that is available and being used today. 

Also, HDMI 2.0 works fine for 4K 60 Hz. Yes, it is very close to the threshold of HDMI's bandwidth limits, but it still works, so why not use it. 

Not trying to be rude to you specifically, just pointing out another side of the argument.


----------



## FordGT90Concept (Jun 18, 2015)

Sure, the TVs have HDMI2 inputs but find a cable that can actually handle 4K @ 60 Hz _reliably_.  Until there are cables that can handle it, the rest is moot.


----------



## Whilhelm (Jun 18, 2015)

FordGT90Concept said:


> Sure, the TVs have HDMI2 inputs but find a cable that can actually handle 4K @ 60 Hz.



The one that I have plugged in works fine.


----------



## FordGT90Concept (Jun 18, 2015)

I bet it is 10 feet or less in length and likely not very old.  There are lots of reported issues with cable degradation when trying to run 4K.


Edit: Some cable certification programs have started: http://www.dpllabs.com/page/dpl-full-4k-cable-certification


----------



## GhostRyder (Jun 18, 2015)

mroofie said:


> and the gtx 980 ti ?


Its performance (if we believe the leaks) is better, so it's the same cost for more performance.  To top it off, it comes with a better cooler, which when you factor that onto the GTX 980 Ti will turn out to be roughly the same (better cooler for the GTX 980 Ti versus the R9 Fury X with the cable).

So yes, still a better deal (if the leaks are believed), depending on how you factor everything in.  Of course this is all still based on rumor, but factoring in everything and the fact most gaming monitors in this age have DP, or you can use a DP-to-(whatever plug you have) adaptor, it's pretty irrelevant.  The only downside is that it makes things difficult for TVs that have HDMI 2.0 running at 4K 60 Hz...


----------



## Vlada011 (Jun 18, 2015)

I think my curse helped AMD a little.
From the moment I saw that NVIDIA cut the 980 Ti, I prayed to God for AMD to win because of that part.
Now, all of us who couldn't afford 1200-1250€ for a TITAN X should turn back to them, or what??? What does NVIDIA suggest to us now...? 
But now we will be good and welcomed for a *cheaper and weaker TITAN X*, as I said a few days ago.
They had big confidence cutting CUDA cores instead of going as far forward as possible with the full GM200, with an increased base clock and an AIO if needed, praying to keep the crown; they didn't care.
Now AMD will maybe take the crown exactly because of that little piece, and Maxwell will be the first NVIDIA 3DMark loser after Fermi and Kepler.
What now? Feeding customers 10 months of rumors about Pascal while AMD sells their cards? They couldn't finish it before spring, no way.
You will see how much NVIDIA will ask for HBM2: a fortune. For the last 3-4 years they have tried in every perfidious way to silently increase prices, every year $100-150 more for the high-end chip. Where is the end? I think now.
And people thought the whole time that the TITAN X was a miracle from God; no, it was one strong card overpriced to the max.
But nothing special. NVIDIA had GK110 in 2013, and now they have 50% more performance.
Why NVIDIA decided to give it little by little is a different story, but they made 50% improvements since 2013.
For that period they asked $1000 three times: TITAN, TITAN Black, TITAN X.
You need to be very tricky to force people to pay that.
The real value is $450-500-550 max. Same as the GTX 580.
Their fans justify that as "no competition"... What is Fury then?
Let's take Intel as an example... Intel has no competition and holds the CPU market much more strongly than NVIDIA holds GPUs...
Intel didn't ask $1500 or $2000 for extreme processors. Every series of Intel processors is within 10% of the price from five years before. They didn't double the price. But what does NVIDIA do?
I'm glad I was right, and glad that cutting CUDA cores from the GTX 980 Ti will cost NVIDIA the single-chip crown and hundreds of thousands of dollars. We literally prayed for months before launch for a full GM200 chip with an increased clock, even cut to 6GB of video memory and priced like a normal GeForce card, and they had no mercy. No 1200€, no full chip. That was their motto.
It was weird how NVIDIA has taught its fans: the day before the presentation, they expected AMD to ask $1000, almost sure of a price similar to the TITAN X's. 
As if HBM being new technology gives them the right to ask $1000.


----------



## mister2 (Jun 18, 2015)

[XC] Oj101 said:


> I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature



Because my GTX 980 didn't need a registry hack to support 4:4:4 over HDMI 2.0 and the driver doesn't reset desktop scaling 4 times out of 10 when resuming from sleep?


----------



## mroofie (Jun 18, 2015)

Vlada011 said:


> I think that my curse helped little to AMD.
> I prey to god, from moment when I saw that NVIDIA cut 980Ti I prey god to AMD win for that part.
> Now all of us who couldn't afford 1200-1250e for TITAN X should turn back them or what??? What NVIDIA suggest to us now...?
> But now and we will be good and welcomed for *cheaper and weaker TITAN X *as I told before few days.
> ...



Fanboy detected


----------



## swirl09 (Jun 18, 2015)

I don't think they want me getting this card :/

I use DVI for my monitor (fair enough, it's getting on and things need to move forward), but I've a lovely TV which I currently use as my 4K gaming display when I feel like being on the couch, and that's over HDMI 2 (and yes, 4K@60).


----------



## Octavean (Jun 18, 2015)

Whilhelm said:


> All new 4k TVs are HDMI 2.0 compliant and this standard isn't going anywhere anytime soon. UHD BluRay is imminent and all those players will be HDMI 2.0. Aside from 2.0 pretty much every other device uses some variant of HDMI so it is the standard for connectivity in the home theater world.
> 
> I agree that Displayport is a better solution but at the time this card is being released HDMI 2.0 is the connection that is required by the HTPC market. If Display port ever takes off as a TV connection it will be too late for this card since there will be something better by that time. To me it simply does not make sense for AMD to not adopt the standard that is available and being used today.
> 
> ...



I agree 100%

Regardless of what the future will bring, we still have to live in the here and now. Omitting HDMI 2.0 doesn't lend itself to a harmonious coexistence with UHD TVs in the here and now. 

It's better to have it and not need it than to need it and not have it. 

I'm not going to say that the omission of HDMI 2.0 means I would never buy one of these cards, but it would jumpstart my urge to look elsewhere for a product that does support HDMI 2.0.


----------



## TheGuruStud (Jun 18, 2015)

Armchair gamers can keep their HDMI 2.0


----------



## xfia (Jun 18, 2015)

TheGuruStud said:


> Armchair gamers can keep their HDMI 2.0


They can certainly keep not getting full color on an uber-expensive 4K TV.. if you actually think it through full circle, it's a bunch of garbage, especially when you realize DisplayPort was ahead of HDMI well before 2.0 came out. 

Hell yeah AMD! Toss the waste-of-cash HDMI to the curb!


----------



## mister2 (Jun 18, 2015)

4:4:4 isn't full color?  Interesting...


----------



## Whilhelm (Jun 18, 2015)

xfia said:


> Hell yeah amd! toss the waist of cash hdmi to the curb!



And with that they toss a bunch of potential buyers to the curb as well.


----------



## xfia (Jun 18, 2015)

mister2 said:


> 4:4:4 isn't full color?  Interesting...


4K@60 Hz with HDMI 2.0 is limited color because of insufficient bandwidth. So you pay a shitload and your TV is not looking as good as it could, not living up to its full potential.


----------



## mister2 (Jun 18, 2015)

xfia said:


> 4k@60hz with hdmi 2.0 is limited color because of insufficient bandwidth. so you pay a shit load and your tv is not looking as good as it could or not to its full potential.


I'm playing @ 4k60 over HDMI with 4:4:4....


----------



## xfia (Jun 18, 2015)

mister2 said:


> I'm playing @ 4k60 over HDMI with 4:4:4....


Not even close to the deep color range you're able to get on your 4K with DisplayPort. It's called subsampling: https://en.wikipedia.org/wiki/HDMI
@OneMoar, I think, was trying to tell me months ago why it was crap, but I didn't fully read the HDMI wiki and the links on it.


----------



## mister2 (Jun 18, 2015)

xfia said:


> not even close to the deep color range your able to get on your 4k with displayport. its called subsampling https://en.wikipedia.org/wiki/HDMI
> @OneMoar i think was trying to tell me months ago why it was crap but i didnt fully read the hdmi wiki and the links on it.



"4:4:4 color is a platinum standard for color, and it’s extremely rare to see a recording device or camera that outputs 4:4:4 color. Since the human eye doesn’t really notice when color is removed, most of the higher-end devices output something called 4:2:2."

..... http://blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html

I think you're confusing bandwidth with subsampling.  DP 1.2 supports 17.28 Gbps, HDMI 2.0 supports 18 Gbps.  I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".
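For what it's worth, here's a rough back-of-the-envelope on what subsampling actually saves (active pixels only; I'm ignoring blanking intervals and link encoding, so these are floors on the picture data, not link rates):

```python
# Raw active-pixel data rate at 3840x2160 @ 60 Hz for each chroma
# subsampling mode, 8 bits per component. 4:4:4 carries 3 samples per
# pixel, 4:2:2 carries 2 on average, 4:2:0 carries 1.5 on average.

def raw_gbps(samples_per_pixel, bits_per_sample=8,
             width=3840, height=2160, hz=60):
    """Picture payload in Gbps, blanking and encoding overhead excluded."""
    return width * height * hz * samples_per_pixel * bits_per_sample / 1e9

for mode, spp in [("4:4:4", 3), ("4:2:2", 2), ("4:2:0", 1.5)]:
    print(f"{mode}: {raw_gbps(spp):.2f} Gbps")
# 4:4:4: 11.94 Gbps, 4:2:2: 7.96 Gbps, 4:2:0: 5.97 Gbps
```

So 4:2:0 carries half the data of 4:4:4 for the same frame, which is the whole appeal when a link is bandwidth-starved.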


----------



## xfia (Jun 18, 2015)

mister2 said:


> "4:4:4 color is a platinum standard for color, and it’s extremely rare to see a recording device or camera that outputs 4:4:4 color. Since the human eye doesn’t really notice when color is removed, most of the higher-end devices output something called 4:2:2."
> 
> ..... http://blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html
> 
> I think you're confusing bandwidth with sub sampling.  DP 1.2 supports 17.28Gbps, HDMI 2.0 supports 18Gbps.  I do agree that DP is a better platform (although MST can be fussy with cables), but don't kid yourself into thinking that HDMI 2.0 "doesn't give full color reproduction".


Well, I will see what a few other people have to say about it, because you just copied what that page said, and idk if that is crap or not, unless I misread something in the more technical information.


----------



## mister2 (Jun 18, 2015)

xfia said:


> well i will see what a few other people have to say about because you copied what that said and idk if that is crap or what unless i translated something wrong on some more technical information.


I copied it from Adobe.  I would hope Adobe knows about color, lol.  All joking aside, we agree on the core topic (DP > HDMI), but HDMI 2.0 isn't limited in color reproduction compared to DP 1.2.   I really wanted to get a Fury X, but since the home theater world relies on HDMI 2.0, it's a must-have for me.


----------



## FordGT90Concept (Jun 18, 2015)

DisplayPort 1.2 = 17.28 Gbps
HDMI 2.0 = 18 Gbps
DisplayPort 1.3 = 32.4 Gbps
The best HDMI cables can do is 25 Gbps, and those are the best of the best over very short distances. 

There's a chart here on HDMI2: http://www.dpllabs.com/page/dpl-full-4k-cable-certification

DisplayPort 1.3 should be able to handle 4:4:4 4K @ 16 bits per color, where HDMI 2.0 can only handle 8 bits per color.  Not to mention DisplayPort 1.3 can carry an HDMI signal.  As if that weren't enough, DisplayPort 1.3 is capable of VESA Display Stream Compression, which can further increase the effective payload.
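A quick sketch of the arithmetic, if I have the encoding right (assumptions: both links use 8b/10b encoding, so the thread's DP figure of 17.28 Gbps is post-encoding payload while HDMI's 18 Gbps is the raw line rate; 4K60 timing with the standard 594 MHz pixel clock):

```python
# Compare link payload capacity against what 4K60 4:4:4 needs at
# different bit depths. Raw line rates: HDMI 2.0 = 18, DP 1.2 = 21.6,
# DP 1.3 = 32.4 Gbps; 8b/10b encoding leaves 8/10 of that as payload.

def payload_gbps(raw_gbps):
    """8b/10b encoding carries 8 payload bits per 10 line bits."""
    return raw_gbps * 8 / 10

PIXEL_CLOCK = 594e6  # 4K60 with standard blanking (assumed timing)

def required_gbps(bits_per_component, components=3):
    """Payload needed for 4:4:4 video at the given bit depth."""
    return PIXEL_CLOCK * bits_per_component * components / 1e9

links = {
    "HDMI 2.0": payload_gbps(18.0),   # 14.40 Gbps payload
    "DP 1.2":   payload_gbps(21.6),   # 17.28 Gbps payload
    "DP 1.3":   payload_gbps(32.4),   # 25.92 Gbps payload
}
for depth in (8, 10, 12):
    need = required_gbps(depth)
    fits = [name for name, have in links.items() if have >= need]
    print(f"{depth}-bit 4:4:4 needs {need:.2f} Gbps; fits on: {fits}")
```

On these assumptions, 8-bit 4K60 4:4:4 (14.26 Gbps) just squeaks under HDMI 2.0's 14.4 Gbps payload, while 10-bit and up only fit on DP 1.3.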


If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.


----------



## mister2 (Jun 18, 2015)

FordGT90Concept said:


> DisplayPort 1.3 is 32.4 Gbps



Yup, and supports 8k, though display adoption is still very slim right now.


----------



## Tatty_One (Jun 18, 2015)

swirl09 said:


> I dont think they want me getting this card :/
> 
> I use DVI for my monitor (fair enough, its getting on and things need to move forward), but Ive a lovely TV which I use currently as my 4k gaming display when I feel like being on the couch and thats over HDMI2 (And yes 4k@60)


So would you mind sharing what cable you have to allow you that @60?


----------



## Steevo (Jun 18, 2015)

http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of


Nvidia used a 4:2:0 @ 8bit to get 4K 60Hz working.


Great game looks: MSAA with blocky color gradients. But Nvidia users are used to that, since their cards' colors are always fuxxed up. http://www.reddit.com/r/pcgaming/comments/2p3xs7/nvidia_users_using_hdmi_output_youre_most_likely

https://forums.geforce.com/default/...for-the-limited-color-range-issue-in-hdmi-dp/

http://www.neogaf.com/forum/showthread.php?t=953806
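The 4:2:0 trick is pure bandwidth arithmetic. A sketch (assuming HDMI 1.4's 340 MHz TMDS ceiling and the standard 594 MHz pixel clock for 4K60, which is how the AnandTech piece explains it):

```python
# Why 4:2:0 lets a 4K60 signal fit through HDMI 1.4: 4:2:0 carries
# 1.5 samples per pixel instead of 3, so pixels can be packed two per
# TMDS character, halving the effective character rate.

HDMI14_MAX_TMDS = 340e6   # HDMI 1.4 TMDS character rate ceiling (assumed)
PIXEL_CLOCK_4K60 = 594e6  # 4K60 with standard blanking (assumed timing)

effective_tmds_420 = PIXEL_CLOCK_4K60 / 2  # 297 MHz with 4:2:0 packing

print(effective_tmds_420 <= HDMI14_MAX_TMDS)  # 297 MHz fits under 340 MHz
assert PIXEL_CLOCK_4K60 > HDMI14_MAX_TMDS    # full 4:4:4 does not fit
```

Which is exactly the trade: the picture fits, but chroma resolution is quartered, hence the blocky gradients.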


----------



## xfia (Jun 18, 2015)

The AMD middle finger to people wasting money? 4K TVs have got that price tag.


----------



## mister2 (Jun 18, 2015)

Steevo said:


> http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
> 
> 
> Nvidia used a 4:2:0 @ 8bit to get 4K 60Hz working.
> ...


Yup, I had to do the reg hack to get full range.  

And that 4:2:0 is only for Kepler (600/700 series).


----------



## xfia (Jun 18, 2015)

It seems your 4K may not even use that much color space.. I only look at the awesome ones for gaming lol


----------



## mister2 (Jun 18, 2015)

xfia said:


> its seems your 4k may not even use that much color space.. because i only look at the awesome ones for gaming lol



Maybe.  I play on a 65" Samsung.


----------



## xfia (Jun 18, 2015)

mister2 said:


> Maybe.  I play on a 65" Samsung.


Well, check it out.. if you've got time to read a forum, you can read and learn while you do it, if that's what interests you.
Welcome to TPU, by the way.. it's probably more than obvious how much misleading information there is after reading this thread, so just ask haha.
I will tell you @HumanSmoke @Steevo @FordGT90Concept are probably going to give you the best straight-up information.
I can give the mods shit, but they are cool for the most part; just some of them seem very limited in knowledge for the years they have been around reading articles.
I would like to be wrong about what I said about what Mussels was doing, but it does not seem that way.


----------



## Steevo (Jun 18, 2015)

Let's give this a good logical look.


HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology (G-Sync or FreeSync) that isn't compatible with HDMI.... you fail.

Then let's look at the whole need for G-Sync: it's the assumption that Nvidia can't make a graphics card that will push a full 60 FPS at 4K resolution (OK, that was a jab), so let's make the monitor refresh at the frame rate we can drive instead. So then we have two choices here: either 60 FPS isn't really that important, or it is, so which one is it, guys? Either way it's still a fail, as it only works on DisplayPort.

AMD has released a card without support for a display standard before, and yet shipped plug-and-play active DP adapters in the box; what's the chance they will do the same here?

Lastly, let's place the blame for this at the feet of those who deserve it: the TV makers, who could provide us with DisplayPort-capable televisions and yet haven't. It's like the first generation of HD TVs that had strangely mixed options for input, which sometimes may or may not have worked as expected. Samsung should know better; Panasonic, Toshiba, and others should know better. They are paying for a dying standard, but they are doing it for planned obsolescence IMO.


----------



## john_ (Jun 18, 2015)

[XC] Oj101 said:


> I'll bite. I wasn't in a coma, you guys have two years to wait for your drivers to mature


You will be installing newer *hotfix* "stable" drivers every week until then


----------



## newtekie1 (Jun 18, 2015)

Xzibit said:


> *This might help
> 
> 
> 
> ...





			
				TFA said:
			
		

> Maximum Resolution: *4k @ 30 Hz* (note: at time of writing no chipset supports 60Hz)



Nope, won't help one bit.


----------



## $ReaPeR$ (Jun 18, 2015)

buggalugs said:


> lol at people complaining about this. 95% of users will buy this card to use on a monitor with displayport......and for the few % that want to use a 4K TV, most of the 4K TVs and even monitors on the market don't even support HDMI 2.0 yet.
> SO maybe there is 0.0000000001 % of the market that will feel let down.
> Sure, if you're planning to hold onto the card for 5 years and want to use a TV it could be a problem, but these cards will be obsolete in 12 months when 14/16 nm cards arrive anyway, and most people will have to buy a new 4K TV to support HDMI 2.0 anyway
> If you're buying 4K TVs and $500-$600 graphics cards I'm sure you can afford an upgrade next year.



+1 to that!!! I really don't get why so many people in here are so worked up.. I mean, if you can afford to pay those amounts of $, why the hell wouldn't you buy a panel that supports DP.. I really don't get it.





Ferrum Master said:


> Who the Sock cares?? And if even if someone do care -
> 
> Buy a TV with display port then like this Panasonic TC-L65WT600 has...




Exactly!!! As if the 4K displays are cheap in the first place..




john_ said:


> I can write it in Greek if you prefer. No misspelling there. By the way. Who is the retard here? The person who makes a mistake writing in another language, or the person that comments about that, like you did? Anyway, I can understand you being upset. Try to relax.



ela mori Elladara!!! (roughly: "come on, mighty Greece!")  





the54thvoid said:


> Such an ill tempered thread. How about people stop being dicks and stick to the topic.




thank you!




Tatty_One said:


> Thank you, that answers the question at the end of R.H.P's post # 71, all my point is/was is just because this limitation does not hinder some users AMD is still in my opinion missing a trick and an opportunity with all those 4K TV owners who don't want to spend another bunch of cash on a monitor, in my case it's more about commiserating with 4K TV owners than criticising AMD but both go hand in hand to a degree.




I really think this is a non-issue.. if you have the $$ to buy a 4K TV and such expensive GPUs, I am certain you can afford a new TV with all the bells and whistles..





FordGT90Concept said:


> NOooooooope! That's HDMI 1.4 (30Hz max at 4K).
> Noooooope, HDMI 1.4.  Two comments say 30 Hz is the best it can do at 4K, if it even does 4K.
> HDMI 2 support is near non-existant.  Why would Fiji be an exception to the rule?  I'm disappointed it doesn't but at the same time, I don't care.  DisplayPort is the future, not HDMI.
> Look at the reviews.  Advertized as 4K: only does 30Hz; horrible reviews.  This is the problem with HDMI2.  They keep trying to ram more bits through that hose without changing the hose.  The connectors may be able to handle the advertised 60 Hz but cables cannot.  When genuine 2.0 compliant cables debut, they'll probably be like $20+ per foot because of the massive insulation required to prevent crosstalk and interference.  HDMI has always been the retard of the display standards taking the DVI spec, tacking audio on to it, and disregarding all the controls VESA put on it to guarantee the cable will work.  This was in inevitability with HDMI because the people behind HDMI haven't a clue what they're doing.  This is why VESA didn't get behind HDMI and why they went off and designed their own standard that's actually prepared to handle the task.  HDMI is not suitable for 4K and likely never will be.
> ...




thank you!





Steevo said:


> Lets give this a good logical look.
> HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology(G-Sync or Free-Sync) that isn't compatible with HDMI.... you fail.
> Then lets look at the whole need for G-Sync, its the assumption that Nvidia can't make a graphics card that will push a full 60FPS at 4K resolution (OK, that was a jab), so lets make the monitor refresh at the frame rate we can drive instead. So then we have two choices here, either 60FPS isn't really that important, or it is, so which one is it guys? Either way its still a fail, as it only works on Display Port.
> AMD releases a card without support for a display standard, like they did before, and yet shipped plug and play active DP adapters in the box, whats the chance they will do the same here then?
> Lastly lets place the blame for this at the feet of those who deserve it, the TV makers, who could provide us with DisplayPort capable televisions, and yet haven't yet, its like the first generation of HD TV's that had strangely mixed options for input, and sometimes they may or may not work as expected. Samsung should know better, Panasonic, Toshiba, and others should know better, they are paying for a dying standard, but they are doing it for planned obsolescence IMO.



the truth has been spoken!!!!

I really find this a non-issue, and I want to thank you guys for trying to be objective and civil.


----------



## john_ (Jun 18, 2015)

FordGT90Concept said:


> If Fiji has DisplayPort 1.3 instead of HDMI 2.0, I'll be happy.


I wouldn't expect that. They only need 1.2a for Freesync, so they will be fine with that. I believe they decided to spend the last dollars they had on the LEDs instead of implementing DP1.3.



Steevo said:


> HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology(G-Sync or Free-Sync) that isn't compatible with HDMI.... you fail.


AMD was working with Freesync over HDMI at Computex and the same was *rumored* for Nvidia.

AMD Demonstrates FreeSync-over-HDMI Concept Hardware at Computex 2015


----------



## Octavean (Jun 18, 2015)

People have every right to expect that new video cards will support newer standards like HDMI 2.0.  If AMD were trying to make some stand against HDMI (which I doubt), then it would be more appropriate for them to omit support for all versions of HDMI rather than stagnating on an older HDMI standard.

Based on that alone, it seems more like a mistake than some message. Is it a big mistake? Not IMO, but it still looks like a mistake.

I also expect hardware H.265 encode and decode.  If this HDMI 2.0 thing is true, I wouldn't be surprised if that was a bust too.


----------



## ShurikN (Jun 18, 2015)

Ok, so HDMI 2.0 is needed for 4K/60.
How many UHD TVs can run at 60 Hz, how many of them have 2.0 (or DisplayPort), and how many do 4:4:4?
And most importantly, how much market share do they take?


----------



## xfia (Jun 19, 2015)

Steevo said:


> Lets give this a good logical look.
> 
> 
> HDMI, does it support G-Sync.....nope. So if you want to bash the lack of HDMI support on this front when Nvidia and AMD are both actively pushing a technology(G-Sync or Free-Sync) that isn't compatible with HDMI.... you fail.
> ...


30 vs 60 fps:
I guess that is a question of budget and standards. 30 fps is playable and people do it every day.. I like 50 because it hovers around 50-60, not really noticeable to me. Won't being locked into a refresh rate eventually cause input lag, if not drive someone crazy for days trying to fix it?
G-Sync vs FreeSync:
They are both good and above my standard on refresh rates, and totally spec'd out in my opinion. 
I do like how FreeSync works and doesn't need extra parts in the display that you get charged for, where OEMs get charged the cost of the extra hardware plus a license fee. 
Yet another open standard AMD helped put on paper way before G-Sync was a thought.


----------



## Octavean (Jun 19, 2015)

ShurikN said:


> Ok, so HDMI 2.0 is needed for 4K/60.
> How many UHD tvs car run at 60Hz, and how many of them have 2.0 (or display port), and how many 4:4:4?
> And most importantly, how much of the market share do they take?


That's a good question,....

DisplayPort on UHD TVs has already been addressed in this thread, though.  Very few UHD TVs have DP, and it doesn't look like many will.

However, I have a UHD TV that does 4K/60 Hz via HDMI 2.0 and supports 4:4:4. IMO it doesn't necessarily matter if they are like hen's teeth now, because they do exist, and that seems to be the direction the UHD TV industry is going in.  These UHD TVs are getting cheaper too,...

If the spec weren't ratified, then omitting HDMI 2.0 on a new video card would make perfect sense. But it is ratified, HDMI 2.0 is in the wild, and it is a checkmark feature that makes little sense to leave out, especially when you are competing with products that do support HDMI 2.0, have supported it for a while, and support it at a range of price points starting as low as ~$200.


----------



## Steevo (Jun 19, 2015)

Octavean said:


> That's a good question,....
> 
> DisplayPort on UHD TV's has already been addressed in this thread though.  Very few UHD TVs have DP and it doesn't look like many will.
> 
> ...




With 8 bit color.

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx


So a little color schooling. 


What do you see?

http://i4.minus.com/ibyJcwdIniHUEs.png


https://en.wikipedia.org/wiki/Color_depth

Even if you could see the highest-end option, it may only be processed at 8 bits per color instead of 10, and will thus still show blocking and banded gradients. HDMI 2.0 is still shit compared to DisplayPort.
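To make the banding point concrete, here's a tiny back-of-the-envelope sketch (illustrative numbers only, not tied to any particular panel) of how wide one quantization "band" gets on a full-screen gradient at 8 vs 10 bits per channel:

```python
# Width of one quantization "band" on a full-width black-to-white ramp.
# Fewer levels per channel -> wider bands -> more visible stepping.
WIDTH_PX = 3840  # one UHD scanline

def band_width_px(bits_per_channel):
    levels = 2 ** bits_per_channel  # 256 levels at 8-bit, 1024 at 10-bit
    return WIDTH_PX / levels

print(band_width_px(8))   # 15.0 px per band: visible banding
print(band_width_px(10))  # 3.75 px per band: far harder to see
```

A 15-pixel-wide step is easy to spot on a smooth sky or test gradient; a sub-4-pixel step mostly isn't, which is the practical difference 10-bit output makes.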


----------



## xfia (Jun 19, 2015)

Octavean said:


> That's a good question,....
> 
> DisplayPort on UHD TV's has already been addressed in this thread though.  Very few UHD TVs have DP and it doesn't look like many will.
> 
> ...


you're missing that at 4:4:4 via HDMI 2.0 at 4K@60Hz you're limited to 8 bits per channel, so it's not the true color of the content. if your display can do better than 8 bits, it's not going to get what it should.
DisplayPort 1.3 can do twice the color depth, and the accuracy to go with it, for true high-quality UHD 4K@60Hz


----------



## Octavean (Jun 19, 2015)

xfia said:


> you're missing that at 4:4:4 via HDMI 2.0 at 4K@60Hz you're limited to 8 bits per channel, so it's not the true color of the content. if your display can do better than 8 bits, it's not going to get what it should.
> DisplayPort 1.3 can do twice the color depth, and the accuracy to go with it, for true high-quality UHD 4K@60Hz


So what is your point...?

Tell it to the industry making UHD TVs.

My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.


----------



## Steevo (Jun 19, 2015)

Octavean said:


> So what is your point,......?
> 
> Tell it to the industry making UHD TV's.
> 
> My point is simple: support the standards that are available in a new card. If Fiji didn't support the latest DisplayPort standard, my issue would be the same. It's not about the merits of the standards and never was.




 


Either you understand that 8-bit color looks like shit, or you don't.

We have two simple scenarios in which you reply to a thread about a new graphics card where HDMI 2.0 is NOT supported:


1) You care because you have something relevant to add, and you understand what it means and why it's important or not.

2) You are an Nvidiot and need to threadcrap elsewhere.


----------



## HumanSmoke (Jun 19, 2015)

You guys are arguing two different standpoints that are mutually exclusive.
@Octavean is putting forward that HDMI 2.0 finds favour with TV vendors, and even if it lacks bandwidth compared with DP, it will still be utilized.
@Steevo ...well, you're basically arguing that DP is better than HDMI and that graphics vendors should concentrate on it, even though TV manufacturers aren't using it to any great extent.

One is an argument about tech implementation (and a few insults); the other is about practical implementation in a real market.


----------



## Xzibit (Jun 19, 2015)

The 4K standard is defined as 4K resolution with 10-bit+ Rec./BT.2020 color. HDMI 2.0 can only do that at 4K/30Hz.

That still isn't the whole issue, because even then you're upscaling or downscaling through the chain.
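The arithmetic behind that claim is easy to check. Here's a rough link-budget sketch comparing uncompressed 4K video against each link's effective payload rate (raw rate minus 8b/10b line-coding overhead). It deliberately ignores audio, ancillary data, and reduced-blanking timings, so treat the borderline cases as approximate:

```python
# Rough link-budget check: uncompressed 3840x2160 video against the effective
# payload rate of each link (raw bit rate minus 8b/10b coding overhead).
# Assumed payload rates: HDMI 2.0 = 14.4, DP 1.2 = 17.28, DP 1.3 = 25.92 Gbps.

def required_gbps(pixel_clock_mhz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s (ignores audio/aux data)."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# CTA-861 timing for 2160p60 uses a 594 MHz pixel clock (blanking included);
# 2160p30 halves that to 297 MHz.
PIXCLK_4K60 = 594.0

links = {"HDMI 2.0": 14.40, "DP 1.2": 17.28, "DP 1.3": 25.92}
modes = {  # bits per pixel for each chroma/bit-depth combination
    "8-bit 4:2:0": 12,
    "8-bit 4:4:4": 24,
    "10-bit 4:4:4": 30,
}

for mode, bpp in modes.items():
    need = required_gbps(PIXCLK_4K60, bpp)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"{mode:12s} needs {need:5.2f} Gbps -> {', '.join(fits) or 'none'}")
```

With these assumptions, 8-bit 4:4:4 at 4K/60 squeaks under HDMI 2.0's ~14.4 Gbps payload (14.26 Gbps needed), while 10-bit 4:4:4 needs ~17.8 Gbps and doesn't fit; halve the pixel clock for 4K/30 and 10-bit fits easily, which is exactly the 30Hz limitation described above.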


----------



## FordGT90Concept (Jun 19, 2015)

Octavean said:


> If the spec weren't ratified then HDMI 2.0 omission on a new video card would make perfect sense but it is ratified, HDMI 2.0  is in the wild and it is a checkmark feature that makes little sense to leave out. Especially so  when you are competing with products that do support HDMI 2.0, have supported it for a while and support it on a range of price-points starting as low as ~$200.


DisplayPort 1.3 was released almost exactly a year after HDMI 2.0. If the latter wasn't adopted, the former most certainly isn't. These are probably things they're putting off for 16/14nm.

Titan X has HDMI 2.0 but DisplayPort 1.2 (not "a," for Adaptive-Sync support). So right now we either have to go with HDMI 1.4a and DisplayPort 1.2a, or HDMI 2.0 and DisplayPort 1.2. I think I'd have to go with the former, because I loathe proprietary standards like G-Sync, and all HDMI ever has to power for me is a 1920x1200 display via a DVI adapter.




HumanSmoke said:


> You guys are arguing two different standpoints that are mutually exclusive.
> @Octavean is putting forward that HDMI 2.0 has favour with TV vendors and even if it lacks bandwidth compared with DP, will still be utilized.
> @Steevo ...well you're basically arguing that DP is better than HDMI and graphics vendors should concentrate on it even though TV manufacturers aren't using it to any great extent.
> 
> One is argument about tech implementation (and a few insults), one is about practical implementation in a real market.


The argument Steevo makes, and one I agree with, is that HDMI 2.0 should be terminated and DisplayPort should be replacing it in full.  DisplayPort supports HDMI packets so DisplayPort has backwards compatibility ingrained.  There's no reason HDMI 2.0 exists other than, as Steevo said, "Samsung should know better, Panasonic, Toshiba, and others should know better, they are paying for a dying standard, but they are doing it for planned obsolescence IMO. "  It's the TV industry trying to dictate what standard people use because they refuse to provide an affordable alternative.


----------



## wiak (Jun 19, 2015)

hmm, why did they remove *both* DVI ports? i believe it was because they needed more TMDS signals to support HDMI 2.0

well, the reason i love DisplayPort is that it's possible to convert it to HDMI 2.0, just as it was possible to do DP>VGA, DP>DVI, and DP>HDMI


----------



## HumanSmoke (Jun 19, 2015)

FordGT90Concept said:


> The argument Steevo makes, and one I agree with, is that HDMI 2.0 should be terminated and DisplayPort should be replacing it in full.  DisplayPort supports HDMI packets so DisplayPort has backwards compatibility ingrained.


Uncontested.
The point I was making is that one person is vehemently arguing what *should* be happening, while the other is arguing what *is *happening. Both viewpoints are valid - they just don't constitute sides of the *same* argument.


----------



## xfia (Jun 19, 2015)

so honestly it's like a $5-20 adapter, which they'll stick in the box if people bug them about it, that makes it backwards compatible haha
*"the lack of HDMI 2.0 support hurts the chip's living room ambitions"*
so it's the best damn living room GPU around
the Nano is the fastest small card around, at 2x (200%) the performance per watt and faster than the 290X. two will deliver very well at 4K, or at 1080p Eyefinity, in small form factors in any game, and that's the baby Fury; figure 5K or 1440p Eyefinity at reasonable settings. optimal support for VR headsets with split-frame rendering. the three DisplayPorts are excellent, giving anyone the bandwidth to run a lot of smaller-resolution monitors with any of the latest and greatest gaming experiences, and they can also carry HDMI 2.0 via an adapter.


----------



## ShurikN (Jun 19, 2015)

xfia said:


> *"the lack of HDMI 2.0 support hurts the chip's living room ambitions"*


Only the Nano; the Fury is not a concern, IMO.


----------



## Octavean (Jun 19, 2015)

HumanSmoke said:


> You guys are arguing two different standpoints that are mutually exclusive.
> @Octavean is putting forward that HDMI 2.0 has favour with TV vendors and even if it lacks bandwidth compared with DP, will still be utilized.
> @Steevo ...well you're basically arguing that DP is better than HDMI and graphics vendors should concentrate on it even though TV manufacturers aren't using it to any great extent.
> 
> One is argument about tech implementation (and a few insults), one is about practical implementation in a real market.


Exactly, thank you for seeing it for what it is.


----------



## xfia (Jun 19, 2015)

ShurikN said:


> Only the Nano, Fury is not a concern imo.


i feel like you didn't read my post or watch the video. the Nano is a Fury with HBM, and the connectors on them are better than any others, for anyone.


----------



## john_ (Jun 20, 2015)

How about this?
Google Translate

Someone posted on Reddit that there were some DP 1.2-to-HDMI 2.0 adapters at Computex capable of 60Hz, like the one in the link. I guess if Fury is a success those adapters will multiply and probably come down in price, a price that is still unknown.



Off topic:
At KitGuru they are banning accounts because they got way too much criticism for crying that AMD wasn't giving them a free sample of the Fury X. They just banned me, and I was one of the top 5 posters there. Nice. More free time.


----------



## praze (Jun 21, 2015)




----------



## R-T-B (Jun 21, 2015)

RejZoR said:


> Saying DP is a future when not a single TV supports it is a bit blunt statement. I don't think DP will ever be supported in LCD TV's. It hasn't been so far, why would it be in the future? No device for the living room even has DP...



A select few TVs do, as posted above. Mostly just high-end Panasonic ones right now.


----------



## RejZoR (Jun 21, 2015)

Who cares about Panasonic? They sell nearly no TVs. Philips, LG and Samsung have the largest market share, and until they support it, it's basically the same as "not existing".


----------



## Octavean (Jun 21, 2015)

R-T-B said:


> A select few TVs do, as posted above.  Mostly just Panasonic high end ones right now.




I believe the models are the Panasonic TC-58AX800U and TC-65AX800U (~$1600 and ~$2600 respectively)

Nice to have as an option, I'm sure, but I don't expect it to catch on as a typical feature of new UHD TVs any time soon.


----------



## R-T-B (Jun 22, 2015)

RejZoR said:


> Who cares about Panasonic, they sell nearly no TV's. Philips, LG and Samsung have the largest market share and until they support it, it's basically the same as "not existing".



Hey!  I own a Panasonic!  Do I not exist as well? lol

Yeah, I know, it's not really their panel or anything, but they DO exist.  That was my point. 

EDIT:

How did Philips/LG make the list? They are behind Panasonic, who are admittedly pretty far behind:






Google may be lying to me though...


----------



## john_ (Jun 23, 2015)




----------



## swirl09 (Jun 25, 2015)

Tatty_One said:


> So would you mind sharing what cable you have to allow you that @60?


Sorry, just spotted this now.

Nothing special about the cables; they're just ones I got from random purchases.


----------

