
AMD R9 Nano Review

It's definitely better than I expected, but the price is just a deal-breaker. Basically this is a card for people who want to game at 4K or maybe 3440x1440 in a super tiny case where even a Fury X won't fit... In any other situation it's better, IMO, to get a GTX 970 mITX (if you really want the tiniest of cases) or a Fury X / Fury / GTX 980 / GTX 980 Ti, since, as other people have already said, there are plenty of small cases where those cards will fit; especially the Fury X. I mean, for the same price you get something faster, quieter, cooler and a liiiittle bit bigger :P (and of course it consumes more power, but if you want efficiency, just go Maxwell)
 
A custom liquid-cooled 980 Ti in an SFF case would be a more worthy investment and runs a little quieter than most non-reference vendors' offerings.
 
Really showing an edge over the GM204 at 4K. Not too shabby TBH, quite an impressive piece of silicon. Funny how it's cooler than the 290X too.
 
The Fury X is only 1.5" longer than the Nano. The Nano's supposedly niche market can be filled by the Fury X. I know plenty of ITX cases that can fit not only the 120 mm radiator the Fury X comes with, but full-size cards anyway.

The Nano has worse cooling, costs the same, performs worse, and is louder, and neither card has HDMI 2.0, which makes them useless as a 4K@60FPS TV HTPC. It has lower power limits, hindering its overclocking ability, and no AIB is currently allowed to slap on a sensible cooler with better VRMs and inductors. Add to that that it looks totally out of place in a full-sized tower case, and I honestly don't understand what market this is for.

The Nano makes zero sense to me, other than as a fashion accessory. I think there was a reason many sites didn't get a Nano, and that reason is their honesty. Some of the reviews I've seen have hinted at negativity, but haven't written anything truly frank about the severity of the Nano's downsides. To me it feels like AMD sent samples on the agreement that nothing severely negative would be written. I look at every review's performance figures (the factual bits), and what I glean from them is a world apart from what I'm seeing in these conclusions.

Why anyone would buy this over the Fury X is beyond me.

EDIT: Oh, and rating this at "up to 1000 MHz" is absolutely hilarious when you see the actual frequency figures.

People still don't understand HDMI 2.0, it seems. HDMI 2.0 can't do 4:4:4 / Rec. 2020 at 4K 60 Hz with 10-bit+ color. Anyone who is serious about an HTPC isn't using HDMI 2.0 for it; they're connecting it with DisplayPort. Not to mention the TV has to support processing the signal, not just have the input.
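As a back-of-the-envelope check of the bandwidth claims being traded in this thread, here's a sketch (my numbers, not from any post): it assumes the standard 594 MHz pixel clock for HDMI 4K60 timing, ~533 MHz with CVT-R2 reduced blanking as commonly used over DisplayPort, and post-8b/10b payload rates of 14.4 Gbps for HDMI 2.0 and 17.28 Gbps for DisplayPort 1.2 (4x HBR2 lanes).

Code:
# Uncompressed 4K60 video data rates vs. link payload capacity.
SUBSAMPLING = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # samples per pixel

def data_rate_gbps(pixel_clock_mhz, bpc, mode):
    """Data rate in Gbps for a pixel clock (MHz), bits per component,
    and chroma subsampling mode."""
    return pixel_clock_mhz * 1e6 * bpc * SUBSAMPLING[mode] / 1e9

LINKS = {"HDMI 2.0": 14.4, "DP 1.2": 17.28}  # usable Gbps after 8b/10b

for timing, clock in (("HDMI 4K60 timing", 594.0),
                      ("reduced blanking", 533.25)):
    for bpc in (8, 10):
        for mode in ("4:4:4", "4:2:0"):
            rate = data_rate_gbps(clock, bpc, mode)
            verdict = "  ".join(f"{link}: {'fits' if rate <= cap else 'no'}"
                                for link, cap in LINKS.items())
            print(f"{timing}, {bpc}-bit {mode}: {rate:5.2f} Gbps  {verdict}")

The telling rows: 4K60 8-bit 4:4:4 (~14.26 Gbps) just squeezes into HDMI 2.0, 10-bit 4:4:4 (~17.8 Gbps) doesn't, and 10-bit 4:2:0 (~8.9 Gbps) fits easily, which is exactly the limitation being argued here; DP 1.2 carries 10-bit 4:4:4 with reduced blanking (~16.0 Gbps).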
 

But it's needed for 4K @ 60 FPS at a minimum, because not even high-end TVs have DP, while the top-end ones do have HDMI 2.0. Way to miss the point.
 

Don't think I missed the point when you said HTPC.

The whole point of it is viewing 4K content that's 4K Rec. 2020, which HDMI 2.0 can't do at 60 Hz, as you're pointing out for the second time.
 

I don't understand, but Anandtech does (Fiji discussion):

For Display I/O this means 6 display controllers capable of driving DVI, HDMI 1.4a, and DisplayPort 1.2a. Unfortunately because Tonga lacked support for HDMI 2.0, the same is true for Fiji, and as a result you can only drive 4k@60Hz displays either via DisplayPort, or via tandem HDMI connections. The good news here is that it will be possible to do active conversion from DisplayPort to HDMI 2.0 later this year, so Fiji is not permanently cut-off from HDMI 2.0, however those adapters aren’t here quite yet and there are still some unresolved questions to be addressed (e.g. HDCP 2.2).

On the multimedia front, Fiji brings with it an enhanced set of features from Tonga. While the video encode side (VCE) has not changed – AMD still supports a wide range of H.264 encode settings – the video decode side has seen a significant upgrade. Fiji is the first AMD discrete GPU to support full hardware HEVC decoding, coinciding with the launch of that feature on the GCN 1.2-based Carrizo APU as well.



A look at DXVA Checker confirms the presence of Main Profile (HEVC_VLD_Main) support, the official designation for 8-bit color support. Main profile is expected to be the most common profile level for HEVC content, so Fiji’s support of just Main profile should cover many use cases.

Unfortunately what you won’t find here is Main10 profile support, which is the profile for 10-bit color, and AMD has confirmed that 10-bit color support is not available on Fiji. As our in-house video guru Ganesh T S pointed out when looking at these results, Main10 is already being used in places you wouldn’t normally expect to see it, such as Netflix streaming. So there is some question over how useful Fiji’s HEVC decoder will be with commercial content, ignoring for now the fact that lack of Main10 support essentially rules out good support for some advanced color space features such as Rec. 2020, which needs higher bit depths to support the larger color space without extensive banding.
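A crude worked example of the banding argument in that last paragraph (my sketch, not Anandtech's): spread the usable video-range code values across the gamut and compare step sizes. The gamut coverage figures (Rec. 709 ~35.9% and Rec. 2020 ~75.8% of CIE 1931 xy) are commonly cited values, and treating area-per-code-value as a one-dimensional proxy for banding is a deliberate simplification.

Code:
# Relative quantization step: gamut area divided by usable code values.
# Bigger steps => coarser gradients => more visible banding.
VIDEO_LEVELS = {8: 235 - 16, 10: 940 - 64}            # video-range code values
GAMUT_AREA = {"Rec. 709": 0.359, "Rec. 2020": 0.758}  # share of CIE 1931 xy

baseline = GAMUT_AREA["Rec. 709"] / VIDEO_LEVELS[8]   # 8-bit Rec. 709 step
for space, area in GAMUT_AREA.items():
    for bits, levels in VIDEO_LEVELS.items():
        print(f"{space} at {bits}-bit: step is "
              f"{area / levels / baseline:.2f}x the 8-bit Rec. 709 step")

By this measure 8-bit Rec. 2020 steps come out roughly 2.1x coarser than 8-bit Rec. 709, while 10-bit Rec. 2020 lands around 0.53x, which is the gist of why Main10 (10-bit) decode support matters for the wider color space.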
 
Yea, but what is nice (I really thought this was going to be locked down tight) is that you can increase the power limit, which lets it run full-time at the 1000 MHz ceiling while still staying in the designated temp range. It's definitely not nearly as bad as I was fearing it might be, and it actually has more potential than I initially thought.
I agree 100%.
 
Don't think I missed the point when you said HTPC.

The whole point of it is viewing 4K content that's 4K Rec. 2020, which HDMI 2.0 can't do at 60 Hz, as you're pointing out for the second time.

HTPCs though, I think he's trying to say, are primarily for televisions, not monitors. Most TVs are equipped with HDMI, not DP. I recall seeing somewhere that it's almost an unspoken agreement among manufacturers to keep the two apart, so the standards don't compete. Thus TVs need dual HDMI for 60 fps.
 

HTPC isn't limited to a TV. If you have that mindset you've already screwed yourself due to the HDMI 2.0 limitations, let alone calling it useless.

HDMI FAQ said:
HDMI 2.0: 4K 10-bit 4:2:0 Rec. 2020 @ 60 Hz

Best case, you're limited to 4K streaming content, which can be downgraded to 8-bit to save bandwidth, while true 4K content will be chroma-subsampled. You also have to have the proper software to play it.
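To put rough numbers on that downgrade (again my sketch; raw pixel data only, ignoring blanking and audio):

Code:
# Per-frame size of a 3840x2160 frame at 10 bits per component. 4:2:0
# stores each chroma plane (Cb, Cr) at half resolution in both dimensions,
# so the two chroma planes together cost half of what the luma plane does.
CHROMA_COST = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}  # Cb+Cr vs. luma

def frame_mbits(width, height, bpc, mode):
    luma = width * height * bpc            # full-resolution Y plane
    chroma = luma * CHROMA_COST[mode]      # Cb + Cr after subsampling
    return (luma + chroma) / 1e6

for mode in ("4:4:4", "4:2:2", "4:2:0"):
    mbits = frame_mbits(3840, 2160, 10, mode)
    print(f"{mode}: {mbits:6.1f} Mbit/frame, {mbits * 60 / 1000:5.2f} Gbps at 60 Hz")

4:2:0 halves the raw data versus 4:4:4 (~7.5 vs. ~14.9 Gbps at 60 Hz before blanking), which is how HDMI 2.0 squeezes 10-bit 4K60 through its link, at the cost of quarter-resolution color information.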

[Image: chroma subsampling color comparison (Colorcomp.jpg)]


My main issue was with calling something useless when DisplayPort can do it properly and HDMI can't. But because HDMI gives the convenience of a TV, that makes it better? That doesn't make sense to me, nor does it qualify the card as useless.

Panasonic TC-65AX900U

Panasonic said:
DisplayPort (4K 60 Input) 1 (bottom)

4K Pure Direct

Enables support of a 4K 4:4:4 digital video format signal. Use only with a 4K 4:4:4 capable source.
 
HTPC isn't limited to a TV. If you have that mindset you've already screwed yourself due to the HDMI 2.0 limitations, let alone calling it useless.
What we think doesn't matter. It's what the general populace thinks, and living-room PC = HDTV-compatible, just like a console, DVD player, HT receiver, etc. The Nano should be easily drop-in compatible. In that market, HDMI is the standard. Personally, I prefer DisplayPort, but it is what it is.
 
But but....our opinions! I'm still holding onto mine until reviews from "unfair" people like W1zzard and HardOCP come out. (If you can't tell, I couldn't be more sarcastic.)
 
I wonder what a binned chip would do under water with better voltage delivery... but probably not much better than the Fury X, so while it's an interesting idea, it has already failed on multiple points:

Price
Performance
Power

The market is small.


This is the equivalent of buying an overpriced Camaro with a small engine and no mod options, and taking it to the track.
 
I think instead of demanding HDMI 2.0 on graphics cards, we should be demanding DisplayPort receivers and TVs. HDMI was a horrible standard, is a horrible standard, and always will be a horrible standard. In a world where DisplayPort exists, HDMI no longer needs to exist. The sooner it gets kicked to the curb, the better.
 
The game scores seem right in the reviews but on the first link, the Firestrike scores do not seem correct at all.
 
This is the equivalent of buying an overpriced Camaro with a small engine and no mod options, and taking it to the track.

No it isn't.
 
What we think doesn't matter. It's what the general populace thinks, and living-room PC = HDTV-compatible, just like a console, DVD player, HT receiver, etc. The Nano should be easily drop-in compatible. In that market, HDMI is the standard. Personally, I prefer DisplayPort, but it is what it is.


SCART was the standard before that, and RF before that; should we mark things down today for letting those die too?

It's 2015, not 2005...

If the people who should know better keep banging the drum for HDMI, what chance has Joe Public got?
 

The industry dictates. It always has. Joe Public doesn't influence that. Gfx vendors need to either force the change with intent or supply the compatible interface for the here and now.
It is that black and white.
 
Yes, and AMD should be applauded for trying to make that so, yet people who should know better say the lack of HDMI 2.0 is a flaw...
 
That might be exactly what AMD is doing. They kicked VGA to the curb by switching from DVI-I to DVI-D. Fiji is kicking DVI-D to the curb by removing it entirely. Now AMD doesn't bother to put HDMI 2.0 on their latest flagship card, but they do put on three DisplayPorts, all of them capable of MST.

Maxwell didn't have HDMI 2.0 either until GM2xx. We'll have to see if AMD does the same with Fiji or not.
 

That may have been the plan, but not furnishing your current-gen card with the current industry standard when your discrete market share is considerably diminished is like trying to move a boulder by blowing on it.
Necessity is the mother of invention, and AMD's current position doesn't necessitate any industry shift.
If Intel clapped their hands, then I could see the change. What do consoles use? AMD machines, all of them. HDMI 1.4. (Lol, I just checked; I genuinely didn't know.)
 
Most consoles can barely play games at 1080p, so what does it matter how well they can do 4K? It's not something consoles are going to do very well anyway. I think Ford is on to something here though.
 

Being an idealist is great, but change costs money. If the consoles are seen by the TV industry as the standard gaming product, it is not in their interest to develop connections suited to future 4K gaming. That's why I said what I said. I'm a pragmatist, not an idealist. I know a little about economics and enough about greed to understand that AMD's biggest influence with the TV industry is its involvement in consoles, and that is stuck at HDMI 1.4.

They should have stuck with HDMI 2.0 for now and perhaps ditched it in the Arctic Islands and Pascal time frame, when maybe the TV market will want to know what's next. Fiji might be kicking DVI-D to the kerb, but Maxwell isn't. Which sells more; which puts more pressure on the industry? We can applaud AMD for trying, but FFS, most reviews are saying it's a mistake, so why do people here think it's a great idea?
Seriously, change is transition, not cessation and creation. They needed to go with HDMI 2.0 for 4K for now, and they didn't. Luckily, I think most gamers who spend that kind of money on gfx cards use monitors, not TVs. So ironically, the Nano's 4K HDTV PR spin is lost on its actual market of 4K SFF gaming cases for monitors. They haven't taken an arrow in the knee yet.

You all know I'm right. Despite the rare exception being posted (which almost proves my point) with the relevant Fiji-friendly connection, HDTVs are NOT favouring Fiji in 2015.
 

If they have an HDMI connector, they should update it to 2.0.

As far as it being the better option, and the reviewers: I guess they point to the convenience of wide TV connectivity rather than functionality.

HDMI FAQ said:
HDMI 2.0: 4K 10-bit 4:2:0 @ 60 Hz

DisplayPort FAQ said:
DisplayPort 1.2a systems today can support 4K displays at 60Hz refresh and full 30-bit 4:4:4 color (non-chroma subsampled).
 
That may have been the plan, but not furnishing your current-gen card with the current industry standard when your discrete market share is considerably diminished is like trying to move a boulder by blowing on it.
Necessity is the mother of invention, and AMD's current position doesn't necessitate any industry shift.
If Intel clapped their hands, then I could see the change. What do consoles use? AMD machines, all of them. HDMI 1.4. (Lol, I just checked; I genuinely didn't know.)
Realize you're talking about two industries. The computer industry has rapidly adopted DisplayPort. The TV/film industry previously adopted HDMI and is pushing to switch to the updated standard. Going back more than a decade, graphics cards have traditionally supported two or more monitors (VGA, DVI, DisplayPort) and one TV (RCA, S-Video, component video, HDMI). Fiji is staying true to that, just like virtually every card before it.


As pointed out by Xzibit, HDMI 2.0 can't even handle all of the bandwidth 4K requires at 60 Hz, again because it is a bad standard. The problem is not AMD refusing to put HDMI 2.0 on there; the problem is the TV/film industry refusing to switch to DisplayPort. I just looked for a home theatre receiver that implements DisplayPort switching instead of HDMI and couldn't find any. Therein lies the problem. It has nothing to do with AMD and everything to do with the TV/film industry clinging to HDMI.
 