# AMD Radeon R9 Nano CrossFire



## W1zzard (Sep 17, 2015)

In this review, we test two Radeon R9 Nanos in a CrossFire setup, and we also include data for an R9 Nano paired with a Fury X, a combination that's possible on AMD because both cards use the same GPU.



----------



## xkche (Sep 18, 2015)

Thanks!
Thanks!
How about VRAM usage, with "just" 4 GB?


----------



## NC37 (Sep 18, 2015)

So after the fuss of not getting a review sample, TPU strikes back with a CrossFire review. Nice one

Performance scaling is nice even at the lower resolutions. Glad TPU tested those too so one could see.


----------



## HammerON (Sep 18, 2015)

Nice review


----------



## Finners (Sep 18, 2015)

Am I missing it or is the review missing noise and temperatures?


----------



## GhostRyder (Sep 18, 2015)

Glad you took the time to do this @W1zzard.

Interesting concept at the end there with the mATX cube build idea. I hadn't considered that CFX might be a real option for the Nano (more in terms of why you wouldn't just pick up something else), but that argument makes good sense. I think CFX results for Fiji show the potential of HBM better than a single-GPU config does, especially at 4K.

Meh, it's not too unreasonable to think about purchasing two Nanos for CFX, I suppose.


----------



## the54thvoid (Sep 18, 2015)

Finners said:


> Am I missing it or is the review missing noise and temperatures?



<sarcasm>No, it's now an AMD-sponsored site, so the chorus of crickets and fan noise gets hidden under the bed<\sarcasm>

I was just checking - ironically - I can't find a 980 Ti SLI review... Can't compare the scaling - c'mon @W1zzard, for fairness we need a 980 Ti SLI review. If there was one and I missed it, please accept my apologies. I won't be surprised if Fiji comes out better!

EDIT: glad I checked my post - my 'sarcasm' tag had the wrong forward slash and wasn't visible - sorted now....


----------



## Darksword (Sep 18, 2015)

Wizz, can you do a R9 Fury CrossFire test next?


----------



## Luka KLLP (Sep 18, 2015)

GhostRyder said:


> Glad you took the time to do this @W1zzard.
> 
> Interesting concept at the end there with the mATX cube build idea. I hadn't considered that CFX might be a real option for the Nano (more in terms of why you wouldn't just pick up something else), but that argument makes good sense. I think CFX results for Fiji show the potential of HBM better than a single-GPU config does, especially at 4K.
> 
> Meh, it's not too unreasonable to think about purchasing two Nanos for CFX, I suppose.


Yeah, that build would be absolute 4K gaming heaven 

Then again, the Fury X2 will probably fit in an even smaller mITX case... It will be interesting to see how that card will compare to CrossFired Nanos!


----------



## john_ (Sep 18, 2015)

Great. Thanks for the review.


----------



## Effting (Sep 19, 2015)

TechPowerUp, you guys seriously have to take _Project CARS, Wolfenstein: The New Order_ and _World of Warcraft_ out of your test suite. These three games really damage the real performance index of the AMD cards. Just take a look at all the other games and notice that the performance the Nano, Fury, and Fury X cards present does not agree with the final performance summary, and that is just because of those three games.


----------



## Maban (Sep 19, 2015)

Effting said:


> TechPowerUp, you guys seriously have to take _Project CARS, Wolfenstein: The New Order_ and _World of Warcraft_ out of your test suite. These three games really damage the real performance index of the AMD cards. Just take a look at all the other games and see for yourself that the performance the Nano, Fury, and Fury X cards present does not agree with the final performance summary, and that is just because of those three games.


To take out a game for that would be its own type of bias. World of Warcraft still has something like 7 million subscriptions. It's a popular contemporary game that people want to know the performance of.


----------



## Effting (Sep 19, 2015)

Maban said:


> To take out a game for that would be its own type of bias. World of Warcraft still has something like 7 million subscriptions. It's a popular contemporary game that people want to know the performance of.


Sure, but that game already runs at 120+FPS on those cards. So there is no point other than to mess up the performance summary.


----------



## Maban (Sep 19, 2015)

@W1zzard Why is the Perf/$ page normalized to 200%?


----------



## Dieinafire (Sep 19, 2015)

Luka KLLP said:


> Yeah, that build would be absolute 4K gaming heaven
> 
> Then again, the Fury X2 will probably fit in an even smaller mITX case... It will be interesting to see how that card will compare to CrossFired Nanos!



With no HDMI 2.0, it is 4K hell


----------



## thebluebumblebee (Sep 19, 2015)

Finners said:


> Am I missing it or is the review missing noise and temperatures?


Those are done with the individual card reviews.


----------



## Zubasa (Sep 19, 2015)

Effting said:


> Sure, but that game already runs at 120+FPS on those cards. So there is no point other than to mess up the performance summary.


There is certainly a point; CrossFire having negative scaling in WoW is a problem.
You are talking about a game that has more players than all the rest of the games in the review combined.
Consider that a pair of GTX 960s in SLI is almost twice as fast as a pair of Nanos @4K in WoW.


Dieinafire said:


> With no HDMI 2.0 it is 4k hell


What kind of crap 4K monitor has no DisplayPort?


----------



## Darksword (Sep 19, 2015)

Effting said:


> Sure, but that game already runs at 120+FPS on those cards. So there is no point other than to mess up the performance summary.



You can't just remove all the games that AMD does poorly in and call it an objective review.   People want to know how well the card performs on specific games.  If AMD cards underperform due to driver issues then that's something people want to know before spending $650.00.

If and when AMD fixes those issues the scores will reflect that.


----------



## Steevo (Sep 19, 2015)

Excellent review, and almost a kick in the ass to see two tiny cards beating the crap out of everything else currently on offer, save one or two titles where CrossFire support is lacking. 


Kind of an eye-opener looking at the performance of the pair compared to other cards: build a Micro ATX system with these in CrossFire and get the ability to game at 4K at good frame rates.......


----------



## RealNeil (Sep 19, 2015)

Thanks for the review. It's a good read.
I think I'll hold off on these for a while and see what else comes down the pike.


----------



## 15th Warlock (Sep 19, 2015)

Zubasa said:


> There is certainly a point; CrossFire having negative scaling in WoW is a problem.
> You are talking about a game that has more players than all the rest of the games in the review combined.
> Consider that a pair of GTX 960s in SLI is almost twice as fast as a pair of Nanos @4K in WoW.
> 
> What kind of crap 4K monitor has no DisplayPort?



I think he might be referring to the HDMI output being limited to 30Hz at 4K when using this card in an HTPC build connected to a 4K TV; very few TVs have DP, so that limits the potential to build a killer mini-ITX rig powered by this card to place in a living room and play games at a decent frame rate.

As always, great review W1zzard. I actually considered running this card in CrossFire at one point before the price was revealed; two of these cards deliver awesome performance!


----------



## Kanan (Sep 19, 2015)

15th Warlock said:


> I think he might be referring to the HDMI output being limited to 30Hz at 4K when using this card in an HTPC build connected to a 4K TV; very few TVs have DP, so that limits the potential to build a killer mini-ITX rig powered by this card to place in a living room and play games at a decent frame rate.
> 
> As always, great review W1zzard. I actually considered running this card in CrossFire at one point before the price was revealed; two of these cards deliver awesome performance!



As said probably a zillion times by now: get a DP to HDMI 2.0 adapter. If you've got $650 or $1,300 for graphics cards, you can afford that adapter, I think. Or get one of the "very few TVs that have DP 1.2" for 4K 60Hz. Last time I checked, "very few" meant more than "none", so that's not exactly a problem either.

Edit: thanks for the Nano reviews, I really liked the good pictures.


----------



## 15th Warlock (Sep 19, 2015)

Kanan said:


> As said probably a zillion times by now: get a DP to HDMI 2.0 adapter. If you've got $650 or $1,300 for graphics cards, you can afford that adapter, I think. Or get one of the "very few TVs that have DP 1.2" for 4K 60Hz. Last time I checked, "very few" meant more than "none", so that's not exactly a problem either.
> 
> Edit: thanks for the Nano reviews, I really liked the good pictures.



Yeah, that's always an option, but that doesn't save the card from being criticized for not including this feature out of the box. It would've cost AMD literally a few cents to add this feature to this "premium card", considering it costs exactly the same as a full AIO water-cooled Fury X.

And yes, this is coming from a guy who has spent literally over $2.5K on video cards and a custom cooling loop for my main rig. But before you think I'm a hypocrite, let me say that I said the same thing about Nvidia when they tried saving a few cents on a $1K card by not adding a backplate to the Titan X, so both companies are equally guilty of the same sin.


----------



## Effting (Sep 19, 2015)

Darksword said:


> You can't just remove all the games that AMD does poorly in and call it an objective review.   People want to know how well the card performs on specific games.  If AMD cards underperform due to driver issues then that's something people want to know before spending $650.00.
> 
> If and when AMD fixes those issues the scores will reflect that.


That's another thing we should stop complaining about: it's not the GPU maker's fault if a game does not run well on its GPU; it's the game itself that was poorly built. The GPU and its driver (or its GCN architecture) were already there for the game developers to make good use of. Hardware companies should stop trying to make shitty games run well if they weren't built properly in the first place.

So if you can't just remove games that don't run well on AMD hardware, then to balance things out it would be fair to include games that run well on AMD hardware but not on Nvidia hardware.


----------



## Kanan (Sep 19, 2015)

> it would've cost AMD literally a few cents to add this feature to this "premium card"


A few cents? Architectural changes are more expensive than a few cents; get your information right, please. Is it not obvious to you that they would have done it if it had been a few cents? 


> So if you can't just remove games that don't run well on AMD hardware, to balance things out, it would be fair to include games that run well on AMD hardware but not on Nvidia hardware.


Afaik there are no games that run as badly on NV hardware as Project CARS does on AMD. But there are games that run worse on NV hardware, for example Tomb Raider, and that game is included. Also, all games that are used in the article should stay. There's no cherry-picking in something like that; it wouldn't be "fair" - not to NV and not to the people interested in the card and the games.


----------



## the54thvoid (Sep 19, 2015)

Effting said:


> TechPowerUp, you guys seriously have to take _Project CARS, Wolfenstein: The New Order_ and _World of Warcraft_ out of your test suite. These three games really damage the real performance index of the AMD cards. Just take a look at all the other games and notice that the performance the Nano, Fury, and Fury X cards present does not agree with the final performance summary, and that is just because of those three games.



No, AMD let their cards' performance be damaged by not having the best DX11 hardware and/or driver solutions. 
But rejoice and flap thy red cape: future games that choose to go down certain feature paths in DX12 will seriously level the playing field. The new Deus Ex title is AMD-sponsored and comes out in Feb. If it uses heavy asynchronous shading, it'll be very good for Tonga, Hawaii, and Fiji based cards.
But then, should I ask for that to be taken out of future benchmarks for being unfair to Nvidia?


----------



## john_ (Sep 19, 2015)

I think those games should stay there. 

Looking at those 2-3 games and comparing the results with the other 20+ games reveals the developers who will happily take someone's money and, in my opinion, screw their own customers by giving them an inferior product while asking full price. We should NOT protect those developers by hiding the benchmark results for their games. No. Those games should be there for everyone to see. And then put those developers on a black list and NEVER pay full price for their games. Wait until those games come down to 1/4 of the original price and only then consider buying them. 

Just my opinion of course; leave those games there. Let everyone know whose games NOT to buy. Developers who happily screw AMD owners today will, in a monopoly tomorrow, happily screw every owner of "last gen" hardware in favor of the "next gen" hardware, to force them to upgrade.


----------



## W1zzard (Sep 19, 2015)

Maban said:


> @W1zzard Why is the Perf/$ page normalized to 200%?


Because I fail at typing. New graphs are up


----------



## EarthDog (Sep 19, 2015)

Thanks wiz, another great job. Keep up the good work!



Effting said:


> That's another thing we should stop complaining about: it's not the GPU maker's fault if a game does not run well on its GPU; it's the game itself that was poorly built. The GPU and its driver (or its GCN architecture) were already there for the game developers to make good use of. Hardware companies should stop trying to make shitty games run well if they weren't built properly in the first place.
> 
> So if you can't just remove games that don't run well on AMD hardware, then to balance things out it would be fair to include games that run well on AMD hardware but not on Nvidia hardware.


Oy...

Just quoting this in hopes he reads his own post to see how asinine this suggestion actually is on so many levels.

@john, DiRT Rally runs terribly on AMD hardware for some reason... it's an AMD game too, IIRC.


----------



## Aquinus (Sep 19, 2015)

Great review, but I think I'll go with a second 390 considering two 390s cost about the same as a Nano. I'll hold off on first-generation technology and let all the early adopters enjoy the high price tag, so people like me can live vicariously through the review but still go with the cheaper option. Good job @W1zzard . You make me wish I had more money for computer components.


----------



## john_ (Sep 19, 2015)

EarthDog said:


> @john, dirt rally runs terrier on amd hardware for some reason... it's an amd game too iirc.


Dirt Rally Performance Review - GeForce GTX 970 Versus Radeon R9 390 - Page 4 of 4 - Legit Reviews
Codemasters: Dirt Rally Performance Benchmark
It doesn't look bad.





Even if we go back to before the introduction of the 300 series
Dirt Rally: Ersteindruck des geistigen "Colin McRae 2015" mit Benchmarks von 20 Grafikkarten
the results are what someone would expect: not every card from one manufacturer being faster than every card from the other.


----------



## Cryio (Sep 19, 2015)

"Far Cry 4 is an interesting test. The R9 Nano CrossFire not only doesn't scale at 4K, but also sees a performance drop. In the same test, the dual-GPU R9 295X2 scales just fine. It goes to show that AMD still needs to refine drivers for the "Fiji" GPU"

AMD doesn't need to do anything. They have fixed poor performance in drivers 15.7.1 and 15.8. You see poor performance here because you guys used the 15.7 drivers, for whatever reason.


----------



## W1zzard (Sep 19, 2015)

Cryio said:


> AMD doesn't need to do anything. They have fixed poor performance in drivers 15.7.1 and 15.8. You see poor performance here because you guys used the 15.7 drivers, for whatever reason.


Help me please, what driver have I used for Nano?


----------



## Effting (Sep 19, 2015)

Effting said:


> That's another thing we should stop complaining about: it's not the GPU maker's fault if a game does not run well on its GPU; it's the game itself that was poorly built. The GPU and its driver were already there for the game developers to make good use of. Hardware companies should stop trying to make shitty games run well if they weren't built properly in the first place.
> 
> So if you can't just remove games that don't run well on AMD hardware, then to balance things out it would be fair to include games that run well on AMD hardware but not on Nvidia hardware.





the54thvoid said:


> No, AMD let their cards' performance be damaged by not having the best DX11 hardware and/or driver solutions.
> But rejoice and flap thy red cape: future games that choose to go down certain feature paths in DX12 will seriously level the playing field. The new Deus Ex title is AMD-sponsored and comes out in Feb. If it uses heavy asynchronous shading, it'll be very good for Tonga, Hawaii, and Fiji based cards.
> But then, should I ask for that to be taken out of future benchmarks for being unfair to Nvidia?



No, you don't, because it will be more balanced.
You see, what I'm trying to say is that Nvidia has a history of making games run badly on AMD cards, and probably that is what's going on with those three games.
Also, Gameworks is a terrible thing for the gaming community...


----------



## Cryio (Sep 19, 2015)

W1zzard said:


> Help me please, what driver have I used for Nano?


From the review, "AMD R9 Nano: 15.201.1102"

Latest 15.8 driver is 15.201.1151. Far Cry 4 performance has been fixed.


----------



## W1zzard (Sep 19, 2015)

Cryio said:


> From the review, "AMD R9 Nano: 15.201.1102"
> 
> Latest 15.8 driver is 15.201.1151. Far Cry 4 performance has been fixed.


Hmm .. I used the latest driver from AMD's FTP, recommended for R9 Nano reviews, marked as "September 1" build. As far as I know there is no newer driver for R9 Nano.

Your .1151 driver is from August 23, so older... amd-catalyst-15.8beta-64bit-win10-win8.1-win7-aug23.exe


----------



## oZ65 (Sep 19, 2015)

Cryio said:


> From the review, "AMD R9 Nano: 15.201.1102"
> 
> Latest 15.8 driver is 15.201.1151. Far Cry 4 performance has been fixed.



It's a small change, and it has already been shipped in an earlier driver.

Up to 7% in Far Cry® 4 on AMD Radeon™ R7 and AMD Radeon™ R9 200 series and up 
http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst15-7WINReleaseNotes.aspx


----------



## the54thvoid (Sep 19, 2015)

Cryio said:


> From the review, "AMD R9 Nano: 15.201.1102"
> 
> Latest 15.8 driver is 15.201.1151. Far Cry 4 performance has been fixed.



Evidently not fixed. This is always going to be a problem with dual GPU cards.


----------



## soulsore (Sep 20, 2015)

I think only one thing: in 2015 it is no longer acceptable to publish reviews with just a single number and that's it; it's absurd. A chart showing the FPS trend over the whole duration of the test should be published.


----------



## Relayer (Sep 20, 2015)

Maybe I missed it, but in your conclusion you neglected to mention power usage as a benefit of the Nanos in CrossFire vs. Fury/Fury X. That level of performance at ~350 W seems a definite plus.


----------



## 15th Warlock (Sep 20, 2015)

Kanan said:


> A few cents? Architectural changes are more expensive than a few cents; get your information right, please. Is it not obvious to you that they would have done it if it had been a few cents?



Architectural changes? You do know how active DisplayPort to HDMI converters work, right? Those little dongles contain a tiny chip that processes the DP video signal and converts it to an HDMI video signal, a very small piece of silicon no bigger than a few square millimeters. And how much do those cost companies like AMD? Yeah, you guessed it: cents. There is no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board to correctly support HDMI 2.0, but they decided to transfer that burden to the people who want to use this card in a home theater, plugged into one of the thousands of existing 4K TVs. So it seems like you should get your information right.

Also, I've been looking for this fabled DisplayPort to HDMI 2.0 converter you say people have mentioned a "zillion times". It turns out such converters are not even available as of Sept 2015, as seen in numerous forum threads filled with people looking for this sought-after piece of hardware:

http://www.avsforum.com/forum/35-ca...-1-2-hdmi-2-0-adapter-there-manufacturer.html

http://www.tomshardware.com/forum/id-2258794/powering-hdmi-devices-mini-gpu.html

http://hardforum.com/showthread.php?t=1853226

There are some cheap adapters you can find online claiming to convert a DP signal to HDMI 2.0, but the reviews for these adapters are filled with angry customers warning other people to stay away, as they are HDMI 1.4 at most and thus capable of only 30Hz at 4K.

One such adapter claims to be able to drive 4K at 60Hz, but it doesn't even list official HDMI 2.0 support, and some reviewers report it uses a trick similar to what Nvidia did to enable 60Hz on early Maxwell cards, by downsampling the video signal to 8-bit:

http://www.amazon.com/dp/B00ZA067MA/?tag=tec06d-20

So, if you know where to find this adapter you speak of, would you kindly share a link to the product page so the hundreds of people who are looking for it in hardware forums can purchase it?


----------



## john_ (Sep 20, 2015)

15th Warlock said:


> no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board to correctly support HDMI 2.0


They probably didn't want to add complexity to their board, or a third-party chip, or even increase the size of the board by a "few square millimeters", considering we're not talking about a big graphics card. In fact, in the case of the Nano, a "few square millimeters" is a really big deal considering how they market the card.
So their only option would probably have been to upgrade the chip itself to support HDMI 2.0. They should have done that; they didn't. I don't know how much more it would have cost them in time and money to implement HDMI 2.0 support. I mean, OK, they just took two Tonga chips and glued them together, changed the memory subsystem to support HBM, and that's it? Not enough money/engineers/time to add HDMI 2.0 support? Did they have to change a significant part of the chip's architecture to support HDMI 2.0?


----------



## FordGT90Concept (Sep 20, 2015)

They need to make an XL-ITX board for crossfire Nanos with matching power supply. XD


If you're going to blow $650-1300 on graphics cards, there's a good chance you can afford a DisplayPort TV/monitor too.  HDMI 2.0 can't do 4K without cutting corners; DisplayPort can.  I think that's the message AMD is trying to send by excluding HDMI 2.0.

There has to be a technical reason why DisplayPort to HDMI 2.0 converters don't exist.  I wish I knew it.

Edit: It sounds like DP->HDMI 2.0 converters should be coming by the end of the year.


----------



## AsRock (Sep 20, 2015)

Dieinafire said:


> With no HDMI 2.0 it is 4k hell



AMD are going to release an adapter that solves the HDMI 2.0 issue; I would have thought it would connect through the DP connectors.


----------



## Dieinafire (Sep 20, 2015)

AsRock said:


> AMD are going to release an adapter that solves the HDMI 2.0 issue; I would have thought it would connect through the DP connectors.



Using an adapter on a $600-plus GPU? No thank you. If I'm spending good money, I want my tech to be high tech.


----------



## FordGT90Concept (Sep 20, 2015)

I doubt that.  Active DisplayPort converters are not cheap.  AMD can't afford to be handing them out, and they really have no reason to start selling them either (other companies like StarTech, Monoprice, and Belkin will get all over that).


----------



## AsRock (Sep 20, 2015)

Dieinafire said:


> Using an adapter on a $600-plus GPU? No thank you. If I'm spending good money, I want my tech to be high tech.



I don't disagree with you; I was just saying that there will be.


----------



## Xzibit (Sep 20, 2015)

Dieinafire said:


> Using an adapter on a $600-plus GPU? No thank you. If I'm spending good money, I want my tech to be high tech.



If it was "high tech", would you not prefer DisplayPort?

*HDMI FAQ*
4K@60Hz 10-bit 4:2:0

*DisplayPort FAQ*
4K@60Hz 10-bit 4:4:4


DisplayPort FAQ said:

> DisplayPort 1.2a systems today can support 4K displays at 60Hz refresh and full 30-bit 4:4:4 color (non-chroma subsampled).



You can always channel your diatribe towards both GPU vendors' non-inclusion of DisplayPort 1.3, which was ratified last year. At least you'd be advocating for a superior standard, not going backward for convenience.
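The arithmetic behind those two FAQ entries is easy to sanity-check. Here's a rough sketch counting raw pixel payload only (real links also carry blanking, so actual requirements are somewhat higher); the link rates and 8b/10b coding factor are the published HDMI 2.0 and DP 1.2 HBR2 figures:

```python
# Back-of-the-envelope link-bandwidth check for 4K60.

def payload_gbps(width, height, fps, bits_per_channel, chroma):
    # Average samples per pixel: 4:4:4 sends three full samples per pixel;
    # 4:2:0 sends one luma sample plus half a chroma sample on average.
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

# Effective data rates after 8b/10b line coding
hdmi20 = 18.0 * 0.8   # HDMI 2.0: 18 Gbps aggregate -> 14.4 Gbps effective
dp12   = 21.6 * 0.8   # DP 1.2 HBR2 x4 lanes: 21.6 Gbps -> 17.28 Gbps effective

full   = payload_gbps(3840, 2160, 60, 10, "4:4:4")   # ~14.9 Gbps
subsam = payload_gbps(3840, 2160, 60, 10, "4:2:0")   # ~7.5 Gbps

print(f"4K60 10-bit 4:4:4 needs ~{full:.1f} Gbps (HDMI 2.0 carries {hdmi20:.1f}, DP 1.2 {dp12:.1f})")
print(f"4K60 10-bit 4:2:0 needs ~{subsam:.1f} Gbps")
```

Even before blanking overhead, 10-bit 4:4:4 at 4K60 (~14.9 Gbps) exceeds HDMI 2.0's ~14.4 Gbps of effective bandwidth but fits within DP 1.2's ~17.3 Gbps, which is why the HDMI FAQ lists 4:2:0 for that mode.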


----------



## Effting (Sep 20, 2015)

Xzibit said:


> If it was "high tech". Would you not prefer DisplayPort ?
> 
> *HDMI FAQ*
> 4k@60hz 10-bit 4:2:0
> ...


Absolutely! I don't know why people keep complaining about it. DisplayPort is a must for 4K monitors...


----------



## Aquinus (Sep 20, 2015)

Xzibit said:


> If it was "high tech". Would you not prefer DisplayPort ?
> 
> *HDMI FAQ*
> 4k@60hz 10-bit 4:2:0
> ...


Forget the fact that the actual HDMI connector is crap with respect to build quality and longevity; DP is a much more rigid design. A clip to hold the connector in and an L-shaped (keyed) internal connector? You would think that after using screws with DVI and VGA, they would have realized there is a need to hold cables in place and for the connector to be rigid, and HDMI fails in that respect. The only benefit of HDMI is convenience, because it has weaseled its way onto just about every device you can find.

As I understand it though, 4:4:4 can look a lot nicer than 4:2:0, although I've only heard and read about it once.


----------



## cadaveca (Sep 20, 2015)

Xzibit said:


> If it was "high tech". Would you not prefer DisplayPort ?
> 
> *HDMI FAQ*
> 4k@60hz 10-bit 4:2:0
> ...


It's not about the color options. It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issue. HTPC use kind of indicates living-room use, which means HDTV products, not monitors.

That said, I've been using DisplayPort exclusively since 2007. It's old tech too, just like HDMI. I prefer DisplayPort, but I cannot use it to connect a video card to my home theatre. And yes, I would likely have bought a Nano if it had a connector that would give me 60 FPS @ 4K. Now I'll go with a GTX 980 Ti and a larger case.

Since AMD and Nvidia launch new GPUs on a yearly basis, any idea of being "forward-looking" by using DP only is asinine. HDMI chips cost money and board real estate, and that's why they were not used.


----------



## Aquinus (Sep 20, 2015)

cadaveca said:


> It's not about the color options. It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option. If it were, there'd be no issue. HTPC use kind of indicates living-room use, which means HDTV products, not monitors.
> 
> That said, I've been using DisplayPort exclusively since 2007. It's old tech too, just like HDMI. I prefer DisplayPort, but I cannot use it to connect a video card to my home theatre. And yes, I would likely have bought a Nano if it had a connector that would give me 60 FPS @ 4K. Now I'll go with a GTX 980 Ti and a larger case.


As I understand it, the difference between 4:2:0 and 4:4:4 has more to do with color, though it impacts sharpness as well. I'm not the authoritative source on this, though; @Xzibit seems to know a lot more about it than I do.


----------



## RealNeil (Sep 20, 2015)

My Acer B286HK has four connections (DP, HDMI, Mini-DP, and DVI) and came with a DP Cable too,.....it has Ultra 4K/2K support (60-Hz refresh rate, 2ms response time) and it looks great. Not a bad deal for $400.00.


----------



## Xzibit (Sep 20, 2015)

cadaveca said:


> It's not about the color options. It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an optoin. If it was, then there'd be no issues. HTPC use kind of indicated livingroom use, which equals HDTV products, not monitors.



It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough for some situations, and tolerance will vary from person to person depending on eyesight and equipment.

Buy a 4K TV with proper connections and capability:
*Panasonic TC-65AX900U*
*Panasonic TC-85AX850U*

Saying DP is not an option is being lazy or out of one's budget. It certainly is an option, just not one many are willing to afford.
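For anyone wondering where the 75% figure comes from: 4:2:0 stores one Cb/Cr chroma pair per 2x2 block of pixels instead of one per pixel, so only a quarter of the chroma samples survive. A quick sketch:

```python
# Count chroma (Cb/Cr) sample pairs for a W x H frame under each subsampling scheme.

def chroma_pairs(width, height, chroma):
    if chroma == "4:4:4":
        # one chroma pair per pixel
        return width * height
    if chroma == "4:2:0":
        # one chroma pair per 2x2 pixel block
        return (width // 2) * (height // 2)
    raise ValueError(f"unsupported scheme: {chroma}")

full = chroma_pairs(3840, 2160, "4:4:4")
sub = chroma_pairs(3840, 2160, "4:2:0")
print(f"4:2:0 keeps {sub / full:.0%} of the chroma samples, i.e. {1 - sub / full:.0%} discarded")
# -> 4:2:0 keeps 25% of the chroma samples, i.e. 75% discarded
```

Luma (brightness) is untouched, which is why 4:2:0 looks fine for film but smears fine colored detail such as small text rendered by a PC.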


----------



## cadaveca (Sep 20, 2015)

Xzibit said:


> It is about color options. Going from 4:4:4 to 4:2:0, you lose 75% of the color information. 4:2:0 is good enough for some situations, and tolerance will vary from person to person depending on eyesight and equipment.
> 
> Buy a proper TV.
> 
> ...


Out of budget, for sure. But you missed my point. Nobody complaining about the lack of HDMI is saying those things because of the difference in colour space offered by DP or HDMI. This is a product available now that doesn't actually connect to the devices it is really intended to be used with (Nano = HTPC card, which in most instances connects to HDTVs) because it lacks HDMI 2.0. That's it. Sure, you CAN find HDTV panels with DP, but they are by far a minority.

This is a design oversight that has limited the product's reach. But maybe this is intentional, since it seems that AMD is not truly capable of releasing any of the Fury-based designs in decent numbers.


At the same time, you can argue that people interested in 4K HTPC gaming are just a few, so whatever.


----------



## Kanan (Sep 20, 2015)

> Architectural changes? You do know how active DisplayPort to HDMI converters work, right? Those little dongles contain a tiny chip that processes the DP video signal and converts it to an HDMI video signal, a very small piece of silicon no bigger than a few square millimeters. And how much do those cost companies like AMD? Yeah, you guessed it: cents. There is no need to change anything in the actual architecture of the GPU to add a tiny video converter to the board to correctly support HDMI 2.0, but they decided to transfer that burden to the people who want to use this card in a home theater, plugged into one of the thousands of existing 4K TVs. So it seems like you should get your information right.


I'm still waiting for the factual information that it would cost "a few cents"; that's just blabla from you. You simply want to criticize AMD for not implementing it, and you think yourself superior to a whole company - if someone like you can come up with such an idea, don't you think a company like AMD can? Your arrogance is just... wow. On topic: other people have already explained why they haven't done it, and my own opinion is still that an architectural change would have cost a lot more than a few cents. You're just wrong.

As for the Nano being just an "SFF card": you can see the Nano as an air-cooled Fury X too, with much better energy efficiency and only 5-15% less performance - or with almost the same performance at the same power level (just raise the power cap). It's more than just an SFF card, and the pricing is right. Also, you can take a Nano, mod it with a stronger cooler, and just use it as a Fury X with the same specs, or at -50 MHz. That all being said, you might not want that radiator/pump thing; not everyone likes that stuff or has the space for it.


----------



## 15th Warlock (Sep 20, 2015)

Xzibit said:


> It is about color options.  Going from 4:4:4 to 4:2:0 you lose 75% of the color information. 4:2:0 is good enough for some situations, and tolerance will vary from person to person depending on eyesight and equipment.
> 
> Buy 4K TV with proper connections and capability.
> *Panasonic TC-65AX900U*
> ...



It's not laziness or not being able to afford it. A lot of people like me bought 4K TV sets this past couple of years when prices became more palatable (in my case a 2015 Sony Bravia 4K model). To think that people would go out of their way and buy a new TV just because this card cannot output 60Hz to their TV's HDMI 2.0 connectors, when there simply aren't DP converters currently able to support this card, does not make much sense; it's not a very sound reason to invest almost $2K in a new TV. Mind you, this is a top-of-the-line 2015 set from a well-known manufacturer, and it doesn't have DP connectors on it, like the vast majority of existing 4K TVs out there.

I like gaming in my living room, but I have full ATX systems in both my home theaters, so I'm not the target market for this card in particular. People looking to build a killer mini ATX system, though, will have to look elsewhere if they want to game at 60FPS at 4K.



Kanan said:


> I still wait for the factual information that it would cost "a few cents" thats just blabla from you. You simply want to critizize AMD for not implementing it and you think yourself superior to a whole company - if someone like you can come to such an idea you dont think a company like AMD can? Your arrogance is just... wow. As on topic, other people already explained why they haven't done it, and my self opinion still is, a architectural change would have cost a lot more than a few cents. You're just wrong with your opinion.
> 
> As with Nano just being a "SFF card": you can see Nano as a air cooled Fury X too, with much more energy efficiency and only 5-15% less performance. Or with almost same power on same energy level (just raise the energy cap). It's more than just a SFF card, and the prizing is right. Also, you can take that Nano and mod it to have a stronger cooler, and just use it as an Fury X, with same specs, or with -50 MHz. That all being said, when you dont want that radiator/pump-thing, not everyone likes that stuff or has the space for it.



Am I arrogant because I'm just expressing my opinion on this card? I expressed it respectfully, and even used multiple links to validate it when presenting a counter-argument to yours. It is common knowledge that tiny chips like the ones found inside those dongles cost only cents to manufacture; how else do you think you can find active converters for less than $10 out there, or any other electronic device powered by small processors that sells for a few dollars?

Take this IC commonly found inside a DP to HDMI converter:

http://datasheet.octopart.com/STDP2650-AC-STMicroelectronics-datasheet-16348534.pdf

It sells for $0.10 when you order 100 or more from China:

http://www.alibaba.com/product-detail/-new-for-ic-in-integrated_60284045204.html

So there, see? A few cents, satisfied? What's more, what do you think the actual manufacturing cost of a similar IC would be to AMD? Even taking into account the extra PCB traces needed to place this 8x8mm IC between the GPU and the HDMI connector, do you think that adds a big chunk of money to the BOM for this card? Just think about it for a minute.

It's all about maximizing profits by reducing costs. I'm pretty sure the BOM for the Fury X is much higher than for the Nano, and yet both sell for the exact same price; does pointing that out make me arrogant? I think not. Nvidia did the same with the Titan X by saving a few cents and not adding a backplate, something the less expensive 980 featured out of the box from day one, and I called them out back then as well. Both companies have a board of directors to answer to, and a few cents here and there add up when talking about your bottom line.

I don't even know you and never resorted to insults like you did from your first post; in my view, you're the one who comes across as arrogant. Funny how you just sidestepped the whole DP adapter topic you brought up in the first place, and yet you accuse me of not backing my argument. Double standards much?

Oh, and btw, it's not me presenting this card as the king of SFF cards, it's AMD in pretty much all of their marketing presentations for the Nano so far.

I'm done with you. Seldom have I had to deal with people who resort to belittling and insulting others just for the sake of coming out on top of an argument when I'm expressing a valid point of view, and seldom have I used the ignore feature in the long time I've been a member of this forum, as most people here are mature enough to discuss any given topic without insulting others. So I'm gonna take the high road and hope you learn to appreciate, or at least respect, the opinions of others.

Have a good day sir.


----------



## Xzibit (Sep 21, 2015)

15th Warlock said:


> It's not laziness or not being able to afford it, a lot of people like me bought top of the line 4K TV sets this past couple of years when prices became more palatable (in my case a top of the line 2015 Sony Bravia 4K model) and to think that people would go out of their way and buy a new TV just because this card cannot output at 60Hz to their TV's HDMI 2.0 connectors and there simply aren't DP converters currently able to support this card does not make much sense and is not a very sound reason to invest almost $2K on a new TV. Mind you, this is a top of the line 2015 set from a well known manufacturer, and it doesn't have DP connectors on it, like the vast majority of existing 4K TVs out there.
> 
> I like gaming in my living room, but have full ATX systems on both my home theaters, so I'm not the target market for this card in particular, but people looking to build a killer mini ATX system will have to look elsewhere if they want to game at 60FPS at 4K



Just looked at some of the Sony 4K TV manuals, and unless you have a different one:

Video (2D): 4096 × 2160p (60 Hz)*1, 4096 × 2160p (24 Hz), 3840 × 2160p (60 Hz)*1,
3840 × 2160p (24, 25, 30 Hz), 1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz),
720p (30, 60 Hz), 720/24p, 480p, 480i, PC formats
*1 YCbCr 4:2:0 / 8 bit
*2 3840 × 2160p is displayed when 4096 × 2160p is input

You'd be down-sampled to 8-bit 4:2:0, which is worse.  Better to invest that $2K in a 4K 10-bit monitor with a DP connection and go as big as you can, regardless of your GPU choice.

Personally I'd wait for DP 1.3 but if I had to buy now I'd look at the pros and cons.


----------



## 15th Warlock (Sep 21, 2015)

Xzibit said:


> Just looked at some of the Sony 4k TV manuals and unless you have a different one.
> 
> Video (2D): 4096 × 2160p (60 Hz)*, 4096 × 2160p (24 Hz), 3840 × 2160p (60 Hz)*,
> 3840 × 2160p (24, 25, 30 Hz),1080p (30, 60 Hz), 1080/24p, 1080i (60 Hz),
> ...



Thanks for mentioning that, I had to double check, as honestly I wasn't aware of that limitation on my TV.  I have a Sony XBR55X850C, and it apparently does support 4:4:4 mode at 4K 60Hz after the latest firmware update, according to this site:

http://www.rtings.com/tv/reviews/by-brand/sony/x850c?uxtv=b58b6b8ba3c3



> PC Monitor:
> 
> 1080p @ 60Hz @ 4:4:4: Yes
> 1080p @ 120Hz: Yes
> ...



I checked and my TV has the latest firmware, so it seems it supports 10-bit after all, as it shows the enhanced signal format in the settings menu. Thanks for the heads up


----------



## FordGT90Concept (Sep 21, 2015)

cadaveca said:


> It's all about what other products carry, and for like 98% of panels in the HDTV space, DisplayPort is NOT an option.


Which is nonsensical.  If I walk around my house and look at all my TVs that have HDMI, every single one of them has at least VGA + 3.5mm too.  Some even have DVI.  I think it's pretty clear what's going on here: the TV industry is deliberately trying to sabotage DisplayPort in the name of keeping HDMI around.


----------



## Xzibit (Sep 21, 2015)

15th Warlock said:


> Thanks for mentioning that, I had to double check, as honestly I wasn't aware of that limitation on my TV  I have a Sony XBR55X850C, it does, apparently support 444 mode at 4K 60Hz after the latest firmware update according to this site:
> 
> http://www.rtings.com/tv/reviews/by-brand/sony/x850c?uxtv=b58b6b8ba3c3
> 
> ...



No problem.

Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do.  HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60Hz.

The person who posted that didn't read the HDMI FAQ, as some of the following posts point out.  One even posted the firmware changelog, and there is no mention of it.


----------



## 15th Warlock (Sep 21, 2015)

Xzibit said:


> No problem.
> 
> Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do.  HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60Hz.
> 
> The person who posted that didn't read the HDMI FAQ, as the posts following his imply.  One even posted the firmware changelog, and there is no mention of it.



You're right, I stand corrected: in order to display 4:4:4 chroma, the bit depth is downsampled to 8-bit.

And I agree it sucks that most TVs don't support DisplayPort out of the box; it's clearly the best alternative among video interfaces.


----------



## Sasqui (Sep 21, 2015)

Awesome review.  I particularly like the editorial comments in the conclusion about the irrationality of it all.  Puts enthusiasm in perspective.


----------



## the54thvoid (Sep 21, 2015)

Xzibit said:


> No problem.
> 
> Probably knocking you down to 8-bit 4:4:4, which HDMI 2.0 can do.  HDMI 2.0 can't do 10-bit 4:4:4 at 4K 60Hz.
> 
> The person who posted that didn't read the HDMI FAQ, as some of the following posts point out.  One even posted the firmware changelog, and there is no mention of it.



I didn't know what 4:4:4 was.  So I read up on it.

http://hdguru.com/hdmi-2-0-what-you-need-to-know/



> *Color crunching*
> Here’s how chroma subsampling works. The human eye is more sensitive to black and white detail than color detail. Chroma subsampling compression takes advantage of this fact by sending a full-resolution black and white (luma) information and only partial-resolution color (chroma) information. *The result is a reduction of image data with no accompanying visual degradation.*
> 
> There are three main types of chroma subsampling for video content: 4:4:4; 4:2:2; and 4:2:0. With 4:4:4 there is no subsampling. With 4:2:2, half of the color detail is thrown away. And with 4:2:0, 75% of color information is discarded. *Blu-ray, HDTV, and DVD all use 4:2:0 subsampling*. We don’t notice the loss of color detail right now with those formats, and we aren’t likely to notice it after the move to UHD.



Given that 4:4:4 is ideal but not what the industry works to, it is still remiss of AMD to omit the HDMI 2.0 standard from their 'marketed' dedicated home theater PC graphics card.  All other arguments aside, and the 4:4:4 drum-banging put away: the industry has dictated the format, and graphics vendors need to deal with that.  The simple question to ask is, would my CrossFired Nanos in my (now slightly larger mini ATX) case be better off with an HDMI 2.0 connection for 60 fps gaming?  The answer is absolutely yes.

Would it be great if the whole industry adopted an unsubsampled chroma format? Yes.  BUT they haven't yet.  So my hypothetical Nano CrossFire setup is limited to 30fps in most circumstances on my living room TV because AMD didn't use HDMI 2.0.
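For anyone curious, the subsampling ratios quoted above work out numerically like this (a quick sketch; the helper function and names are mine, not from any spec). In J:a:b notation, a 4-pixel-wide, 2-row block carries 8 luma samples plus (a + b) chroma sample pairs:

```python
# Average bits per pixel for 8-bit YCbCr under common subsampling schemes.
# A 4x2 pixel block always carries 8 luma samples; the a and b terms in
# J:a:b give the chroma sample positions, each holding a Cb and a Cr value.

def avg_bits_per_pixel(a, b, bit_depth=8):
    """Average storage per pixel for a 4:a:b subsampled 4x2 block."""
    luma_samples = 8             # one per pixel in the 4x2 block
    chroma_samples = 2 * (a + b) # Cb + Cr at each chroma position
    return (luma_samples + chroma_samples) * bit_depth / 8  # 8 pixels per block

for name, (a, b) in {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:2:0": (2, 0)}.items():
    print(name, avg_bits_per_pixel(a, b), "bits/pixel")
# 4:4:4 -> 24.0, 4:2:2 -> 16.0, 4:2:0 -> 12.0
```

For 4:2:0 that's 4 chroma samples per block instead of 16, i.e. 75% of the chroma information discarded, which matches the figure in the quoted article.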


----------



## john_ (Sep 21, 2015)

One more thought.

HDMI is a competitor to DisplayPort; it's not a VESA standard. VESA did AMD a favor by supporting AdaptiveSync/FreeSync with DisplayPort; maybe what we see here is just AMD doing VESA a favor by not supporting HDMI 2.0. Maybe it's just politics.


----------



## Xzibit (Sep 21, 2015)

the54thvoid said:


> I didn't know what 4:4:4 was.  So I read up on it.
> 
> http://hdguru.com/hdmi-2-0-what-you-need-to-know/
> 
> ...



You've convinced me.

If only I can force my system to work at 4:2:0 I'd be set.  The bigger the screen the better.

/s

Here is something some of you will be able to duplicate if your system gives you the 4:2:2 option.


----------



## the54thvoid (Sep 21, 2015)

Xzibit said:


> You've convinced me.
> 
> If only I can force my system to work at 4:2:0 I'd be set.  The bigger the screen the better.
> 
> ...



Hmm, I wasn't trying to convince you of anything, just pointing out the market conditions set by the TV manufacturers.  No amount of techno-posturing will change that.  You still won't address whether AMD should have included HDMI 2.0 as an output.  What may affect people's choices, though, is the fps options for gaming: fast first-person shooters need higher fps for better gameplay, and 30fps isn't ideal.

All that being said, I personally wouldn't game on a TV, I'd always prefer a monitor, so the argument isn't for me. And perhaps the folks that game on TVs use consoles.  For AMD, though, the fact is most reviewers have criticised AMD for not adopting HDMI 2.0 on Fiji for its 'living room TV use'.  Whether it has any real-world impact or not, the negative impression is there from the start.  I know 4:4:4 is preferred; it's the way the image should be, but it's generally not delivered to us that way. Hell, Blu-ray (which looks great to 99% of folks) doesn't use 4:4:4 (4:2:0, I think, per that link?).

Anyway, this thread is about CrossFire Nanos, so the colour discussion is for another thread.  No point talking about it if AMD can't give it to you via a TV that won't support it.  It's 4:2:2 or 4:2:0; for now, we need to accept that.

EDIT: I did watch that wonderful sales presentation and I've now bought a Roland V-800HD.  It added nothing to the argument though, in fact it was irrelevant to the discussion on Nano being used in a living room environment on a 4K TV with no HDMI 2.0.... go figure.  Slow clap.


----------



## Xzibit (Sep 22, 2015)

the54thvoid said:


> Hmm, wasn't trying to convince you of anything, just pointing out market conditions from the TV manufacturers.  No amount of techno posturing will change that.  You still wont address whether AMD should have included HDMI 2.0 as an output.  What may affect peoples choices though is the fps options for gaming.  Fast first person shooters require higher fps for better gameplay - 30fps isn't ideal.
> 
> All being said, I personally wouldn't game on a TV, I'd always prefer a monitor, so the argument isn't for me. And perhaps the folks that game on TV's use consoles.  For AMD though the fact is, most reviewers have criticised AMD for not adopting HDMI 2.0 for Fiji for it's 'living room TV use'.  Whether it's got any real world impact or not, the negative impact is there from the start.  I know 4:4:4 is preferred - it's the way the image should be but it's generally not delivered to us that way, hell - Blu Ray (which looks great to 99% of folks) doesn't use 4:4:4 (4:2:0 I think for that link?).
> 
> ...



Should they have included it? Sure, and I think I said as much in another thread, but not all 4K TVs even fully support HDMI 2.0 functionality, whereas monitors, and those TVs that have DisplayPort, are likely to support everything. 15th Warlock just provided an example...

If your only option is HDMI 2.0, *you'd better make sure you know what your 4K TV supports*.  I would say use 4K 8-bit 4:4:4; at the same time, that defeats the purpose of 4K 10-bit content, so you'll have to switch back and forth between settings.  That's if your 4K TV supports the 10-bit 4K signal and doesn't down-sample you; some 4K TVs down-sample as soon as you pop up the menu or use PiP.  Until HDMI 2.0 functionality is ironed out in 4K TVs, it's just a check-box.

You could just take one of the TVs I linked (there are a few others), or preferably a 4K 60Hz 10-bit monitor with DP, plug it in, and forget about having to switch between settings.

*Seems you'll buy anything that doesn't have AMD's name attached to it...*  You set that one up.

Like I said *HERE*.


----------



## cadaveca (Sep 22, 2015)

Xzibit said:


> You could just take one of the TVs I linked (they are a few others) or preferably a 4k 60hz 10-bit monitor w/DP and plug it in and forget about having to switch between settings.




Bleh. I already got my TV, and paid nearly 10 times what a Nano would cost locally. Spending that money again just to get a DP port... psh... it's cheaper to skip the AMD card and go NVIDIA.

All I hope is that AMD corrects this with the next generation of GPUs.

Anyway, I kind of realized that AMD touts this card as an SFF VGA, not an HTPC VGA. *It's the HTPC designation that makes things look bad when it comes to HDMI/DP*. A product should meet the needs of the market NOW, not the future, and also not force consumers into limited purchasing choices for supporting hardware. SFF PC = monitor; HTPC = HDTV. It's quite different.

Nobody in their right mind will say HDMI is better than DP; nobody has. Yet you keep harping on this point like it matters when it doesn't. It's the fact that DP-only connectivity prevents many users interested in this card from actually putting it to use, due to their pre-existing hardware.


----------



## Kanan (Sep 22, 2015)

> Nvidia did the same with Titan X by saving a few cents and not adding a backplate to it, something the least expensive 980 featured out of the box from day one, and I called them out back then as well, both companies have a board of directors to respond to and a few cents here and there add up in the end when talking about your bottom line.


Nvidia said they scrapped the backplate on some (or all) of their cards because of heat issues with SLI, since room between cards is scarce. I don't know if it's true, but it could be. Or it's a smart way to earn some more money. Either way, Nvidia's reasoning is probably valid.

On HDMI 2.0 vs DP topic in TVs:
It saves money not to include DP, so they try not to add it to their TVs, or they charge extra for it in another model of the same type.  That would be another point of view on this subject.



> HDMI is a competitor to DisplayPort. It's not VESA's. VESA did a favor to AMD by supporting AdaptiveSync/FreeSync with DisplayPort, maybe what we see here is just AMD doing a favor to VESA by not supporting HDMI 2.0. Maybe it's just politics.


Interesting thought. Maybe you're right, but I still think it's just AMD being short on money, trying to get away with HDMI 1.4 and adding 2.0 to their graphics card line later, in 2016.


----------



## FordGT90Concept (Sep 22, 2015)

john_ said:


> One more thought.
> 
> HDMI is a competitor to DisplayPort. It's not VESA's. VESA did a favor to AMD by supporting AdaptiveSync/FreeSync with DisplayPort, maybe what we see here is just AMD doing a favor to VESA by not supporting HDMI 2.0. Maybe it's just politics.


HDMI has no support for adaptive sync and likely never will.  It would cost TV manufacturers too much to implement.

HDMI 2.0 has enough bandwidth for 4K @ 60 Hz 24-bit color but it does not have enough bandwidth for 4K @ 60 Hz 30-bit color.  DisplayPort can handle 4K @ 60 Hz 48-bit color.  All figures are for 4:4:4.
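Those figures can be sanity-checked with some napkin math. A rough sketch, under assumptions of my own: CEA-861 4K60 timing at a 594 MHz pixel clock for HDMI, CVT-R2 reduced blanking at 533.25 MHz for DP 1.2 (HBR2, 4 lanes), and 8b/10b line coding on both links, so 80% of the raw bit rate carries video:

```python
# Which 4K @ 60 Hz 4:4:4 formats fit in HDMI 2.0 vs DisplayPort 1.2?
# The pixel clock includes blanking, so the whole stream must fit in
# the link's effective (post-8b/10b) bandwidth.

def video_gbps(pixel_clock_mhz, bits_per_pixel):
    """Uncompressed 4:4:4 video data rate in Gbit/s at a given pixel clock."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

HDMI20_EFFECTIVE = 18.0 * 0.8   # 14.4 Gbit/s after 8b/10b coding
DP12_EFFECTIVE   = 21.6 * 0.8   # 17.28 Gbit/s (HBR2 x 4 lanes)

for bpp in (24, 30):
    hdmi_need = video_gbps(594.0, bpp)     # full CEA-861 4K60 timing
    dp_need   = video_gbps(533.25, bpp)    # CVT-R2 reduced blanking
    print(f"{bpp}-bit 4:4:4: HDMI 2.0 needs {hdmi_need:.2f} Gbit/s "
          f"({'fits' if hdmi_need <= HDMI20_EFFECTIVE else 'does not fit'}), "
          f"DP 1.2 needs {dp_need:.2f} Gbit/s "
          f"({'fits' if dp_need <= DP12_EFFECTIVE else 'does not fit'})")
```

24-bit squeaks under HDMI 2.0's ~14.4 Gbit/s of effective bandwidth while 30-bit doesn't, matching the post above; the 48-bit DP figure depends on the DP revision and timing used, so I've left it out of the sketch.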


----------



## Xzibit (Sep 22, 2015)

Kanan said:


> Interesting thought. Maybe you're right, but I still think it's just AMD without money, trying to get away with HDMI 1.4 and adding it later in their graphics card line in 2016.



They are still paying royalties on HDMI either way since 1.4 was included.

What they said at the Fiji event is probably true (there is a video somewhere): they didn't see HDMI 2.0 as necessary because their focus was on DisplayPort, which could do it better and has FreeSync.

Next gen you might see a similar outcome: HDMI 2.0 might be there, but it's not going to be emphasized, especially if DisplayPort 1.3 is introduced.


----------



## FordGT90Concept (Sep 22, 2015)

I expect all 14/16nm cards to have one HDMI 2.0 and one or more DisplayPorts.  I don't know about DVI-I, DVI-D, and VGA--low end cards may still have them but I suspect NVIDIA will take after AMD and exclude them on top of the line cards.


----------



## cadaveca (Sep 22, 2015)

FordGT90Concept said:


> I expect all 14/16nm cards to have one HDMI 2.0 and one or more DisplayPorts.  I don't know about DVI-I, DVI-D, and VGA--low end cards may still have them but I suspect NVIDIA will take after AMD and exclude them on top of the line cards.


All current HDMI 2.0 implementations that I have seen use a MegaChips support IC. This IC requires additional PCB real estate (and is probably why Fury cards don't have such support). So I expect such connectivity will eventually be put inside the GPU silicon, but I'd simply be happy with all-DP if a suitable HDMI adapter came in the box rather than other things like DVI-to-VGA. It's just weird how little HDMI 2.0 support there really is in hardware (many Z170 motherboards support it) considering it is a relatively old spec.


----------



## FordGT90Concept (Sep 22, 2015)

They're never going to put DisplayPort-to-anything adapters in the box (except miniDisplayPort to DisplayPort, in the case of Eyefinity cards) because they're too expensive.  DVI-I to VGA is as simple as changing the pin-out (a <$1 adapter), which is why those are all over the place.  There is no native backwards compatibility in DisplayPort.

Z170 only supports HDMI 2.0 through Thunderbolt via the Alpine Ridge chip (which achieves it by converting a DisplayPort signal).

Article about MegaChips (it is 7 x 7 mm):
http://www.reuters.com/article/2015/06/15/megachips-hdmi-chip-idUSnPn5PfYPz+96+PRN20150615

DisplayPort to HDMI 2.0 requires a level-shifter and active-protocol converter (LSPCON).


----------



## Kanan (Sep 22, 2015)

> What they said at the Fiji event is probably true (There is a video somewhere). They didn't see HDMI 2.0 necessary because there focus was on DisplayPort which could do it better and it has FreeSync.


Well, that's exactly what I'd say too, if I couldn't or didn't want to add HDMI 2.0 support to my cards. I still think they didn't do it because they lack money.

Edit: or space in the die/on PCB (as someone else has written).


----------



## cadaveca (Sep 22, 2015)

FordGT90Concept said:


> They're never going to put DisplayPort-to-anything (except miniDisplayPort to DisplayPort in the case of Eyefinity cards) adapters in the box because they're too expensive.  DVI-I to VGA  is as simple as changing the pin out (<$1 adapter) which is why they're all over the place.  There is no native backwards compatibility in DisplayPort.
> 
> Z170 only supports HDMI 2.0 through Thunderbolt via Alpine Ridge chip (achieves it by converting a DisplayPort signal).
> 
> ...


I ran Eyefinity before it was officially supported, and I have more than one active DP-DVI adapter. I do understand the cost, but I'm also willing to pay more for that convenience. To me, it's like the fabled MST hubs for Eyefinity that never did sell in the NA market, which might have fixed the "broken cursor" issue.

You do need to keep in mind that even a GTX 980 is $900 locally. A Nano or Fury costs about $20 less here; a Fury X is about $50 more, just over $1K. What's 10% of the cost to ensure I can use things properly? Minor, in my books.

It's that experience dealing with active adapters and AMD cards that has me a bit pissed that the Nano isn't going to provide HDMI 2.0, since I think the cards are great, even at $1,000.


----------



## Xzibit (Sep 22, 2015)

FordGT90Concept said:


> They're never going to put DisplayPort-to-anything (except miniDisplayPort to DisplayPort in the case of Eyefinity cards) adapters in the box because they're too expensive.  DVI-I to VGA  is as simple as changing the pin out (<$1 adapter) which is why they're all over the place.  There is no native backwards compatibility in DisplayPort.
> 
> Z170 only supports HDMI 2.0 through Thunderbolt via Alpine Ridge chip (achieves it by converting a DisplayPort signal).
> 
> ...



The dongle is part #
*MCDP2850-BB*
MCDP2850 (64 LFBGA, 7x7mm) – USB Type-C / DisplayPort to HDMI2.0 accessory application solution

Maybe it's just too costly to make; no one wants to fork over the HDMI fees for a very limited pool of potential buyers.


----------



## FordGT90Concept (Sep 22, 2015)

I think they're coming; it's just too soon.  That announcement was 3 months ago, and I think it takes longer than that to take a chip, design a PCB, manufacture it, and test it.  I wouldn't be surprised if there are some prototypes behind closed doors.  We should see converters available for purchase by the end of the year.


----------



## r.h.p (Sep 22, 2015)

Excellent review, good one.  In my opinion the Nano x2 is expensive, but if you're cashed up and love gaming it's a good option.


----------



## Cataclysm_ZA (Sep 22, 2015)

15th Warlock said:


> Yeah, that's always an option, but that doesn't save the card from being criticized for not including this feature out of the box; it would've cost AMD literally a few cents to add it to this "premium card", considering it costs exactly the same as the full AIO water-cooled Fury X.



Adding HDMI 2.0 into GCN for the Nano would have required a time machine, plus several dollars on the BOM. AMD was far too deep into designing Fiji by the time the HDMI 2.0 specification was close to finished. Current implementations on Maxwell, to my knowledge, don't fully support the latest versions of HDCP either, so neither company is completely on board with HDMI 2.0 support.


----------



## DaedalusHelios (Sep 22, 2015)

980 Ti SLI needs to be added to the comparison. It's more common than Nano CrossFire, and it's the most relevant comparison for this card since the price point is the same.


----------



## 15th Warlock (Sep 23, 2015)

Cataclysm_ZA said:


> Adding HDMI 2.0 into GCN for the Nano would have required a time machine and adding several dollars to the BOM. AMD was far too deep in designing Fiji by the time the HDMI 2.0 specification was close to finished. Current implementations on Maxwell, to my knowledge, don't support the latest versions of HDCP fully, either. So neither company is completely on board with HDMI 2.0 support.



Adding HDMI 2.0 to GCN? I think you might be confused. GCN is the 3D geometry and rasterizer architecture used in the actual rendering engine of the GPU; as such, it doesn't handle the display interface the video card uses to connect to a particular monitor or TV. That is a separate part of the overall architecture of a video card, and as explained before (or even suggested by certain people), it's a limitation that could be "easily" overcome by using a DP-to-HDMI adapter, or, to have this feature enabled out of the box, by adding a converter chip like the ones found in said adapters between the actual GPU and the display connector. There's no need to mess with the architecture inside the GPU itself.

What you're saying is akin to proposing that GCN is incapable of supporting HDMI 2.0 because its rendering engine cannot output a game at 3840x2160 and 60 frames per second, something we know this card is very capable of. One thing is not related to the other.

Also, I want to bring to your attention the fact that the HDMI 2.0 protocol was finalized in September of 2013, a full two years ahead of the release of the Nano.

As for HDCP 2.2, once again I believe you might be confused about the real purpose of this specification, which concerns the ability of your video card to display protected content using the HDCP 2.2 protocol. This has absolutely nothing to do with your video card's ability to drive games on any given 4K display at 60Hz. In fact, as of September 2015, there's no content available that uses this protection protocol; it's expected that the first UHD Blu-ray movies or HDR-enabled content requiring it will be released this holiday season. As mentioned before, it's only used for protected media content, not for gaming.

Also, you are not correct in saying that neither Nvidia nor AMD is completely on board with HDMI 2.0: the Maxwell-based 960 and 950 fully support HDCP 2.2, as does the IGP in Intel's Skylake processors. Nvidia has said they'll tentatively enable this feature on more of their Maxwell-based GPUs as more protected media content is released next year; whether that's true or not isn't even pertinent to the topic we're discussing.

Hope this helps clear up some of your doubts. In reality, I think this topic should be dropped already, as it seems we're going in circles, and I'm sure the majority of forum members are tired of the ongoing debate and of the constant derailing of the original purpose of this review and its thread.

Truth is, the Nano is a very nice card; in fact, there's nothing out there quite like it. I don't believe the green team expected this card to even exist, and I can't think of how their engineering team will respond to it, as I think they got caught with their pants down.

It's such a shame that a few features were not included by AMD that would have truly made this card shine in certain situations. Also, in my personal opinion, the decision to price the card as they did only opened them up to further criticism when you consider that, for the same $650, the Fury X is a much better card.


----------



## john_ (Sep 23, 2015)

Prices will probably go down as more Fiji GPUs come out of the factory. AMD had two options:
- Set high prices and slowly sell whatever they make.
- Set lower prices, sell everything in a couple of days/weeks, and risk Nvidia responding with lower prices on the 980/980 Ti.

I think in the end everyone in their place would go with the first option.


----------



## r.h.p (Sep 23, 2015)

15th Warlock said:


> Adding HDMI 2.0 to GCN? I think you might be confused, GCN is the 3D geometry and rasterizer architecture used in the actual rendering engine of the GPU, and as such, it doesn't handle the display interface the video card will utilize to connect to a particular monitor or TV, this is a separate part in the overall architecture of a video card, and as explained before (or even suggested by certain people) is a limitation that could be "easily" overcome by utilizing a DP to HDMI adapter or, in the case of having this feature enabled out of the box for this card in particular, by adding a converter chip such as the ones utilized in said converters between the actual GPU and the display interface connector, no need to mess with the architecture inside the GPU itself.
> 
> What you're saying is akin to proposing GCN is incapable of rendering HDMI 2.0 because it's rendering engine cannot render a game at 3840x2160 and 60 frames per second, something we know this card is very capable of, one thing is not related to the other.
> 
> ...




Dude, I respect you, but do you work for AMD or a component development company? I mean, what you're bringing to the table is serious; it reads like programming and hardware genius to me. So is it just trolling info or the real deal? I don't mind, just thought I'd ask.


----------



## Fx (Sep 24, 2015)

W1zzard, I enjoyed this information. Thank you.


----------



## r.h.p (Sep 25, 2015)

Um, yes W1zzard, I enjoyed your review too. I forgot.


----------



## HyLite (Oct 10, 2015)

Anybody gotten R9 Nanos running successfully in CrossFire on Win10 64-bit?


----------



## i7Baby (Jun 17, 2016)

See https://community.amd.com/thread/189771 - BF4, W10, R9 Xfire - Oct 2015

R9 Nano prices have dropped, and the GTX 1080 is out. I already have one R9 Nano (for video editing), so cheap 4K for me is adding another R9 Nano.


----------



## Xavier Gonzalez (Jun 25, 2016)

Is it safe to say this is how the Pro Duo would benchmark?


----------



## i7Baby (Jun 25, 2016)

Xavier Gonzalez said:


> Is it safe to say this is what the Pro Duo would benchmark?



See http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-Pro-Duo-Review/Grand-Theft-Auto-V

GTA V: the Pro Duo is the same as CrossFire R9 Nanos at 1440p, and 3% slower at 4K.


----------

