# ASUS Radeon R9 Fury STRIX 4 GB



## W1zzard (Jul 10, 2015)

Today, AMD released their Radeon R9 Fury. Only two vendors are allowed to build designs for the Fury non-X: ASUS and Sapphire. We are reviewing the ASUS R9 Fury STRIX, which comes with a new, large DirectCU thermal solution that keeps noise levels in check.



----------



## RejZoR (Jul 10, 2015)

I'm pretty confident you can bump up the operational temperature to 75°C and slightly lower the fans. If you throw in framerate limiting, temperatures should drop even further, meaning you can make it a lot quieter without really sacrificing any performance.


----------



## Joss (Jul 10, 2015)

What's that horrible thing on page 5, photo #2? Could that be thermal paste? It looks like scratched metal.


----------



## HisDivineOrder (Jul 10, 2015)

Price is too high.  If I were that close to a reference 980 Ti ($70ish), I'd go on and finish my way there and get something far more capable for 1440p or greater.  Otherwise, I'd drop down to $500 or less (980, R9 390X) for 1080p.

AMD has a real problem with pricing.


----------



## pat-roner (Jul 10, 2015)

@W1zzard, what other video card reviews are in the pipeline? Mainly wondering about the Palit Super JetStream 980 Ti.
Thanks!


----------



## RejZoR (Jul 10, 2015)

Joss said:


> What's that horrible thing in page 5 photo #2, could that be thermal paste? it looks like scratched metal.
> 
> View attachment 66393



There are soft metal thermal pads and I think this is one of them. Coollaboratory makes them...

Liquid MetalPad
http://www.coollaboratory.com/en/products/liquid-metalpad/


----------



## Joss (Jul 10, 2015)

RejZoR said:


> Liquid MetalPad
> http://www.coollaboratory.com/en/products/liquid-metalpad/


Thanks, didn't know about that.


----------



## HumanSmoke (Jul 10, 2015)

Nice review!

Hey, @Casecutter, that DC3 cooler looks pretty damn similar to the one found on the 980 Ti, doesn't it?


----------



## Lionheart (Jul 10, 2015)

Great review & seems to be a great card too. I might just pick one up.



HisDivineOrder said:


> Price is too high.  If I were that close to a reference 980 Ti ($70ish), I'd go on and finish my way there and get something far more capable for 1440p or greater.  Otherwise, I'd drop down to $500 or less (980, R9 390X) for 1080p.
> 
> AMD has a real problem with pricing.



Price is too high my ass! Keep dreaming.


----------



## haswrong (Jul 10, 2015)

HisDivineOrder said:


> Price is too high.  If I were that close to a reference 980 Ti ($70ish), I'd go on and finish my way there and get something far more capable for 1440p or greater.  Otherwise, I'd drop down to $500 or less (980, R9 390X) for 1080p.
> 
> AMD has a real problem with pricing.


Well, AMD is in deep trouble, so it has to surface somewhere, like a chaotic pricing model in their last spasmodic attempt to grab market share and generate revenue. I don't see it as likely that anyone will buy a graphics card with a heatsink larger than the card itself and zero OC potential. Now I have to wonder even more what exactly AMD expects from the Nano model...

edit: oops, I thought the PCB was smaller; I skipped the first few pages of the review. OK, the heatsink ISN'T larger in length and width than the card/PCB... ok, ok... meh


----------



## newtekie1 (Jul 10, 2015)

Lionheart said:


> Price is too high my ass! Keep dreaming.



20% higher price for 0-13% higher performance.  Yeah, the price is a little high.

I think $50 more than a 980 would be reasonable, but not $100.


----------



## Anymal (Jul 10, 2015)

Just a moment: the Strix is not a stock Fury card, so why do you compare it to a stock GeForce 980, Mr. W1zzard?


----------



## Deleted member 138597 (Jul 10, 2015)

LOL that Far Cry 4 anomaly


----------



## Deleted member 138597 (Jul 10, 2015)

Anymal said:


> just a moment, Strix is not stock Fury card, why do you compare it to stock geforce 980, Mr. Wizzard


What do you want to compare it with, then? AIB partner cards? That would introduce a huge variable into the test. I hope you know what I mean.

Also, if you mean to compare the 980 Strix and the Fury Strix, that would be mostly a cooler review, with a GPU review as the icing.


----------



## Ferrum Master (Jul 10, 2015)

Anymal said:


> just a moment, Strix is not stock Fury card, why do you compare it to stock geforce 980, Mr. Wizzard



Troll alert...

The card is almost nice... but the OC potential spoils it... completely...


----------



## Xzibit (Jul 10, 2015)

Anymal said:


> just a moment, Strix is not stock Fury card, why do you compare it to stock geforce 980, Mr. Wizzard



I'm going to take the obvious guess and say because it uses stock clocks.


----------



## Anymal (Jul 10, 2015)

Xzibit said:


> I'm going to take the obvious guess and say because it uses stock clocks.


OK, the OC on the Sapphire Tri-X is 40 MHz, which makes it almost stock, but if they also compare temperatures and noise, the Fury Strix is not stock at all.

A Strix product is far from stock, believe me, so it is faster than a stock 980, but what about a Strix 980?


----------



## newtekie1 (Jul 10, 2015)

The disappointing thing with this card is the overclock potential. It is strange for ASUS to release a Strix card, or any card with an aftermarket cooler, that is not pre-overclocked. But I think this is just a sign of how poor the overclock potential is with these cards.



Anymal said:


> ok, OC on Sapphire Tri-X is 40mhz which makes it almost stock but if they compare also temperatures and noise, Fury Strix is no stock at all
> 
> Strix product if far from stock, believe me, so it is faster from stock 980, ehat about strix 980?



I don't believe there is a "stock" Fury card.  AMD didn't make a reference Fury card; they just left it up to the board partners to use the designs they wanted.

The closest we might see is one of them using the reference Fury X board with a Fury GPU stuck on it, but since the Fury X boards/cards are all manufactured by AMD and handed out to the partners for distribution, I don't think even that is very likely.

And since this card uses reference clocks, the performance will be the same as any "stock" Fury.  Power consumption and temps will be different, but performance is the important part.


----------



## Deleted member 138597 (Jul 10, 2015)

Anymal said:


> ok, OC on Sapphire Tri-X is 40mhz which makes it almost stock but if they compare also temperatures and noise, Fury Strix is no stock at all
> 
> Strix product if far from stock, believe me, so it is faster from stock 980, ehat about strix 980?



You mean you want to know the stock Fury cooler's noise and temperatures?

Well, that ain't happening, because, like the GTX 970, there will be no stock cooler from AMD.


----------



## Anymal (Jul 10, 2015)

I am new to this site, and now I see that he always compares one card to all the others in stock/reference form. But when you write that it is faster than the GeForce 980, I think you should add "reference" or "stock".


----------



## Casecutter (Jul 10, 2015)

Right about what I figured Fury would be. Though this ASUS isn't great in lots of ways: the price is high, but they went with a super-nice cooler and a custom PCB, even with solder pads to monitor voltage! But if it can't offer overclocking (that's not ASUS's fault; AMD is going to have to reveal truthfully what's happening on that front), these ASUS cards aren't going to fly off the shelf.

Honestly, if AMD has some reason these can't OC, they should spill it, and then AIBs wouldn't have to invest in this level of cooling and could move down on price. Personally, I would think AIBs could offer an extra 25-35 MHz from a reference "petite" PCB and pair it with a much less extravagant cooler that can still sustain clocks under 1100 MHz (with some noise and heat, sure). There are buyers out there who'll "get it", not for OC but for its $530 price, and they'd still move whatever they could deliver.

And there's the problem: AMD isn't "going all in" or "overboard" this round. They intend to keep the channel stocked with product, but not be caught burdened with any number of SKUs at any level. They can't and don't want to signal a price war, and they know they aren't about to win back market share working it that way. Just hold course, stay lean, and be at the ready.


----------



## Anymal (Jul 10, 2015)

It would be great if W1zzard added at least one of the AIB 980 cards to the comparison.


----------



## ZoneDymo (Jul 10, 2015)

Great card, IMO; I love the improved power consumption. At that level, I honestly find it way too close to call it out in the negative portion of the conclusion.


----------



## ZoneDymo (Jul 10, 2015)

newtekie1 said:


> 20% higher price for 0-13% higher performance.  Yeah, the price is a little high.
> 
> I think $50 more than a 980 would be reasonable, but not $100.



It's $70....


----------



## GhostRyder (Jul 10, 2015)

Supposedly the overclocking software will soon give us voltage control, which will hopefully let us overclock it a bit more at least.

It's odd this has no factory overclock, but it's a nice card with a nice cooler, which seems to put its price at the right spot since it's above the GTX 980, priced 79 bucks lower. ASUS tends to charge a bit more, so hopefully the other vendors will put some out with overclocks or at a lower price point!


----------



## Fluffmeister (Jul 10, 2015)

Anymal said:


> it would be great if Wizzard add at least one of AIB 980 cards in comparison



I wouldn't get too hung up on that; we all know custom 980s fly like the wind, so the cards ultimately trade blows game to game. Besides, GM204 must be a cash cow compared to that humongous Fiji chip.

Still, this card seems a far more compelling option compared to the Fury X.


----------



## RejZoR (Jul 10, 2015)

Anymal said:


> just a moment, Strix is not stock Fury card, why do you compare it to stock geforce 980, Mr. Wizzard



This. Strix is an aftermarket enthusiast series. Comparing it to stock reference models is silly. The Strix R9 Fury should only be compared to the Strix GTX 980 (despite the fact that the R9 Fury is using the cooler found on the GTX 980 Ti). Only this way is it apples vs. apples.


----------



## Casecutter (Jul 10, 2015)

HumanSmoke said:


> Nice review!
> 
> Hey, @Casecutter , that DC3 cooler looks pretty damn similar to the that found on the 980 Ti doesn't it?


Figured we'd get back to this...
Well, they made some changes to the side rail(s); it looks like they were able to revise it to back-fit the 980 Ti. Although they have a custom board and lengthened the span from the chip to the I/O bracket, that spacing versus the Fury X PCB was my point. ASUS almost appears to have brought the spacing of their four mounting holes in toward the bracket/frame ever so slightly versus the Fury X... but maybe not. And therein lies their folly: to use that cooler they made a new PCB, and now they have an expensive product that doesn't make substantial use of any of it. It appears a bad concession.


----------



## Basard (Jul 10, 2015)

What is up with that thermal crud?  Looks like some crap you'd find in an e-machine!

Can you please explain, w1zzard?


----------



## Joss (Jul 10, 2015)

I'd still buy the 980 simply because... I stopped trusting AMD. Too many mistakes, flaws, misses, pure stupidity.
Trust is easy to lose and difficult to regain, and I shouldn't be the only one in this boat.


----------



## majevica (Jul 10, 2015)

Joss said:


> I'd still buy the 980 simply because... I stopped trusting AMD. Too many mistakes, flaws, misses, pure stupidity.
> Trust is easy to loose and difficult to regain, and I shouldn't be the only one in this boat.



You mean like Nvidia, with their ultimate "fuck you" to customers over the faulty laptop chips that rendered my €1500 laptop useless after a year? And in my country, they didn't offer an extended warranty or anything.


----------



## nem (Jul 10, 2015)

The NV fangirls look jealous.

Meanwhile, AMD fans be like...


----------



## HumanSmoke (Jul 10, 2015)

Casecutter said:


> Figured we'd get back to this...
> Well they made some changes to the side rail(s), looks like they were able to revise it to back-fit the 980 Ti.


GPU packages (die + substrate + support bracing) and mounting points differ, thus the "side rails" (hold-down/retention bracket) differ for different GPUs using the same cooler. Even small differences in die size necessitate either a change in mounting hardware or a multi-fit approach. Good luck trying to fit an AMD mount (53.2mm x 53.2mm) to an Nvidia GF100/GF110/GK104/GK110/GM204/GM200 (58.4mm x 58.4mm) without a multi-option mounting plate.


----------



## happita (Jul 11, 2015)

Really? If people are hung up on the fact that W1zz doesn't include another aftermarket card to compare with the one being reviewed, then you're just being lazy. It isn't that hard to look up the different ASUS STRIX cards (be it a 970, 980, etc.) and compare the temps, OCability, and power consumption to another STRIX card you're interested in. No need to put more work on the table where it isn't needed.


----------



## RejZoR (Jul 11, 2015)

Then again, why not do that in the review itself and be done with it? Why should user/consumer chase data around?


----------



## happita (Jul 11, 2015)

RejZoR said:


> Then again, why not do that in the review itself and be done with it? Why should user/consumer chase data around?



Reviewing a card has always been like this on TPU. The card being reviewed is the star of the show, whatever manufacturer/brand/aftermarket card that might be, against ALL other STOCK REFERENCE cards out there. Now, whether that dynamic needs to be challenged, is up to the people of TPU and it is obviously W1zz's decision.


----------



## Aquinus (Jul 11, 2015)

If only it cost less. 

Side note: @W1zzard , can we stop calling no VGA signal on DVI a con? Do you really think anyone is buying a Fury to use a VGA display on it? Honestly, I thought I was late to the game switching from VGA to DVI in *2007*.


----------



## rubenclavs (Jul 11, 2015)

Strix 980 vs Strix Fury

I gathered this from the reviews here on TPU. The 980 review used 1600p and the Fury review used 1440p, so I skipped those and used the common 1080p benchmarks instead.

All the common games on both cards below.

| Game | Radeon Fury | GeForce 980 |
| --- | --- | --- |
| Batman: Arkham | 255.8 | 176.9 |
| Battlefield 3 | 141.6 | 137.0 |
| Battlefield 4 | 87.3 | 90.8 |
| BioShock Infinite | 181.3 | 156.7 |
| Crysis 3 | 54.3 | 49.6 |
| Metro: Last Light | 97.7 | 92.9 |
| Tomb Raider | 78.3 | 70.5 |
| Watch Dogs | 85.4 | 94.6 |
| Wolfenstein | 84.2 | 104.6 |

Total average:

Fury - 118.43
980 - 108.18

Cheers!
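For anyone who wants to double-check the math, here is a minimal Python sketch that recomputes the two averages from the numbers above (a plain arithmetic mean over the nine common games; game names abbreviated as in the post):

```python
# FPS results taken from the list above: game -> (Fury Strix, 980 Strix)
results = {
    "Batman: Arkham":     (255.8, 176.9),
    "Battlefield 3":      (141.6, 137.0),
    "Battlefield 4":      (87.3,  90.8),
    "BioShock Infinite":  (181.3, 156.7),
    "Crysis 3":           (54.3,  49.6),
    "Metro: Last Light":  (97.7,  92.9),
    "Tomb Raider":        (78.3,  70.5),
    "Watch Dogs":         (85.4,  94.6),
    "Wolfenstein":        (84.2, 104.6),
}

# Arithmetic mean across all common games for each card
fury_avg = sum(fury for fury, _ in results.values()) / len(results)
g980_avg = sum(g980 for _, g980 in results.values()) / len(results)

print(f"Fury average: {fury_avg:.2f} FPS")  # ≈ 118.43
print(f"980 average:  {g980_avg:.2f} FPS")  # ≈ 108.18
```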


----------



## Steevo (Jul 11, 2015)

Great review, and perhaps with voltage control software we will get a higher overclock, but this is pretty nice performance and the power consumption numbers are in line with what they should be.


----------



## Fluffmeister (Jul 11, 2015)

rubenclavs said:


> Strix 980 vs Strix Fury
> 
> I gathered this from the reviews here in TPU. The 980 used 1600p and Fury used 1440p so I skipped it and used the common benchmarks for 1080p instead.
> 
> ...



Good work, damn that tiny GM204 is a monster, gotta love those margins.


----------



## Initialised (Jul 11, 2015)

OK so where's the softmod to Fury-X for those who want it watercooled but don't want the noisy CM pump?


----------



## HumanSmoke (Jul 11, 2015)

rubenclavs said:


> Strix 980 vs Strix Fury
> I gathered this from the reviews here in TPU. The 980 used 1600p and Fury used 1440p so I skipped it and used the common benchmarks for 1080p instead.


Unfortunately you aren't taking into account game patches and driver revisions. Comparing one review from October 2014 to one from June 2015 is a flawed exercise at best.
Case in point: The Strix 980 review had the stock 980 at 169.3 f.p.s. in Batman:AO at 1080p. The current review has the same card producing 231.8 f.p.s.


----------



## xvi (Jul 11, 2015)

Initialised said:


> OK so where's the softmod to Fury-X for those who want it watercooled but don't want the noisy CM pump?


Patience, young grasshopper.


----------



## fullinfusion (Jul 11, 2015)

Joss said:


> What's that horrible thing in page 5 photo #2, could that be thermal paste? it looks like scratched metal.
> 
> View attachment 66393


@W1zzard, yeah, what's that, a pad?

And I'm sure you benched the card before pulling it apart, correct?


----------



## btarunr (Jul 11, 2015)

fullinfusion said:


> And I'm sure you bench the card before pulling it apart correct?



That is correct.


----------



## dwade (Jul 11, 2015)

Overclock vs overclock, my money's on the 980.


----------



## newtekie1 (Jul 11, 2015)

Casecutter said:


> Figured we'd get back to this...
> Well they made some changes to the side rail(s), looks like they were able to revise it to back-fit the 980 Ti.  Although, they have a custom board and they lengthen from the chip to the I/O bracket, that spacing from the FuryX PCB was my point.  While Asus almost appears to have brought in the spacing for their 4 mount holes to the bracket/frame in ever so slightly, verses FuryX ... but maybe not.  But in that lies their folly, to use that cooler they made a new PCB, and now they have an expensive product that doesn't make substantial use of any of it. Appears a bad concession.



All Fury cards will be custom PCBs; there is no reference PCB for the Fury. And the Fury X PCBs are all manufactured by AMD (Sapphire, actually) and given to the other companies for distribution, so they can't even use that PCB for their Fury cards. Well, Sapphire can.


----------



## buggalugs (Jul 11, 2015)

happita said:


> Reviewing a card has always been like this on TPU. The card being reviewed is the star of the show, whatever manufacturer/brand/aftermarket card that might be, against ALL other STOCK REFERENCE cards out there. Now, whether that dynamic needs to be challenged, is up to the people of TPU and it is obviously W1zz's decision.



Exactly. It's funny how guys are complaining about this now; it's always been the case, and it's usually AMD that suffered worse from it. Now that Nvidia looks worse, some people are upset about it. A perfect example is the 290X, which was criticized for heat and noise based on the early reference blower cooler. Those opinions and early benchmarks blighted the card, but when the non-reference versions came out, it was a different card. An Asus DirectCU II 290X is silent, not audible during gaming, and gaming temps are very low, in the 60s-70s. Most of the non-reference cards also had a decent overclock, so all around it was a much better performer in benchmarks than the early reference designs indicated.

The same can be said for most of the 290X non-reference versions from Sapphire, MSI, Gigabyte, etc., but there are still guys (and benchmarks) that judge it based on the reference design.

It's not the reviewer's fault; there are so many options a reviewer can't test them all, but people should be aware of it.

The other funny thing is how people judge cards based on performance differences of 3%, 5%, or 7%. I guarantee most people would not notice a performance difference that small in the real world.


----------



## NC37 (Jul 11, 2015)

HisDivineOrder said:


> Price is too high.  If I were that close to a reference 980 Ti ($70ish), I'd go on and finish my way there and get something far more capable for 1440p or greater.  Otherwise, I'd drop down to $500 or less (980, R9 390X) for 1080p.
> 
> AMD has a real problem with pricing.



You said it. Probably one of the biggest disappointments with Fury for me has been the price. AMD really could have had a good price/performance leader, but they decided to pull an Nvidia. Heck, even $50 less would have done AMD some good.


----------



## Caring1 (Jul 11, 2015)

RejZoR said:


> This. Strix is an aftermarket enthusiast series. Comparing it to stock reference models is silly. Strix R9 Fury should only be compared to Strix GTX 980 (despite the fact that R9 Fury is using cooler found on GTX 980Ti). Only this way it's apples vs apples.


The only difference from reference is the cooler, the clocks are the same.


----------



## fullinfusion (Jul 11, 2015)

btarunr said:


> That is correct.


Yeah, cool. The reason I asked is that the thermal pad, or whatever they used, looked, well, like a pad; I didn't think the temperatures would be as good otherwise.

And do you know what pad that is?
I really think that's the reason for such cool temps.


----------



## newtekie1 (Jul 11, 2015)

fullinfusion said:


> And for your info do you know what pad that is?
> I really think that's the reason for such cooler temp's



It looks to me like a Liquid MetalPad; it's better than most thermal pastes.


----------



## RejZoR (Jul 11, 2015)

Caring1 said:


> The only difference from reference is the cooler, the clocks are the same.



And why do you think people buy the Strix series!? The ONLY reason we choose these is because the cooler is so amazing. Don't you think that makes it a very important factor? If it were anything else, there are cards that are more overclocked than the Strix... but they have crappier coolers and don't run as cool or quiet as Strix cards.


----------



## ZoneDymo (Jul 11, 2015)

RejZoR said:


> And why do you think people buy Strix series!? The ONLY reason we decide for these is because the cooler is so amazing. Don't you think that would kinda make it a very important factor? If it was anything else, there are cards that are more overclocked than Strix... but have crappier coolers and don't really perform as cool or quiet as Strix cards.



I think his point is that the performance we see is equal to stock, the only stuff that is different is temperature, noise and perhaps overclockability, but for where it counts most (fps performance), this is apples vs apples.


----------



## RejZoR (Jul 11, 2015)

Which is pretty much useless info if you're deciding between a Strix GTX 980 and a Strix R9 Fury... Then you have to read both reviews, cross-reference the data, and hope you get it right... Ideal would be a comparison with both the reference card and the competitor's Strix model. But you know, I'm bitching again about making reviews more useful to readers/customers...


----------



## Frick (Jul 11, 2015)

They would have a monster on their hands if they had undercut the 980, or at least have it in the same ballpark.


----------



## idx (Jul 11, 2015)

Anymal said:


> just a moment, Strix is not stock Fury card, why do you compare it to stock geforce 980, Mr. Wizzard


As far as I know, there is no AMD Fury reference design out there. Also, I really wonder if anyone ever said the same about the reviews of all the Nvidia cards; some of them were like 200 MHz higher than stock and were still compared to a list of stock AMD cards.

Edit: Thanks for the review, W1zzard! I was really waiting for it.


----------



## Anymal (Jul 11, 2015)

OK, I will rephrase it: would you be so kind as to add refreshed GeForce 980 Strix results to the comparison?


----------



## uuuaaaaaa (Jul 11, 2015)

I wish there were custom air-cooled Fury X cards!


----------



## W1zzard (Jul 11, 2015)

Joss said:


> What's that horrible thing in page 5 photo #2, could that be thermal paste? it looks like scratched metal.



Seems to be some kind of thermal pad, I haven't encountered it before. Removing it now and replacing with thermal paste when re-assembling the card.



Shamonto Hasan Easha said:


> LOL that Far Cry 4 anomaly


Looking into that right after I've reassembled the card. 



Anymal said:


> I am new on this site and now I see that he always compare 1 card to all other in stock/reference form, but when you write it is faster than Geforce 980 I think you should add "reference" or "stock"


I've added the word "reference" near the first mention of GTX 980.



happita said:


> Reviewing a card has always been like this on TPU. The card being reviewed is the star of the show, whatever manufacturer/brand/aftermarket card that might be, against ALL other STOCK REFERENCE cards out there.


That is correct. I'm not sure how I should handle non-reference cards, even if I wanted to. Nearly every manufacturer sends me their cards, so do I include them all? Or just certain manufacturers? Which ones?



Aquinus said:


> can we stop calling no VGA signal on DVI a con? Do you really think anyone is buying a Fury to use a VGA display on it?


After reading my review they won't. It's just mentioned there to make people aware; it doesn't affect the score or conclusion in any way.


----------



## idx (Jul 11, 2015)

Anymal said:


> ok, i will rephrase it: would you be so kind to add refreshed geforce 980 strix results in comparison?


The STRIX GTX 980 is an OC version; the STRIX Fury is running at stock clocks. Still, a +50 MHz OC on the 980 won't change much, tbh.


----------



## W1zzard (Jul 11, 2015)

W1zzard said:


> Removing it now and replacing with thermal paste when re-assembling the card.


Finished. Temps are the same as with the pad, maybe 1°C higher.


----------



## Lionheart (Jul 11, 2015)

newtekie1 said:


> 20% higher price for 0-13% higher performance.  Yeah, the price is a little high.
> 
> I think $50 more than a 980 would be reasonable, but not $100.



Okay, I think I need to do my own research on pricing next time instead of relying on several YouTube reviewers, who stated that the GTX 980 was going for around $500 USD. I didn't know you could get one for around $450; thanks for the input.


----------



## RejZoR (Jul 11, 2015)

W1zzard, there are metallic thermal pads made by Coollaboratory (the same guys who make LiquidPro). Was the thing on this GPU similar, or did it just look like metal but was the usual gummy pad with metal particles or something?


----------



## W1zzard (Jul 11, 2015)

RejZoR said:


> W1zzard, there are metallic thermal pads made by Collaboratory (same guys who made LiquidPro). Was this on GPU similar thing or it just looked like metal but was the usual gummy pad with metal particles or something?


See the pic I posted like 5 posts above. I didn't notice any metal; it's just a gummy pad with some carbon particles in it, I think. It leaves difficult-to-remove black stains.


----------



## birdie (Jul 11, 2015)

*Drivers:    AMD: Catalyst 15.5 Beta*

This kinda invalidates the whole review.

Please, retest all AMD GPUs using Catalyst 15.7 (which contains a whole lot of optimizations and fixes).


----------



## newtekie1 (Jul 11, 2015)

birdie said:


> *Drivers:    AMD: Catalyst 15.5 Beta*
> 
> This kinda invalidates the whole review.
> 
> Please, retest all AMD GPUs using Catalyst 15.7 (which contains a whole lot of optimizations and fixes).




Can you not be bothered to read two lines down?



The Review said:


> AMD Fury X, Fury, R9 390X, R9 290X, R9 290, R9 285: *15.7 WHQL*



All the important cards were tested with 15.7.


----------



## birdie (Jul 11, 2015)

One other thing I disliked:



> a card that is nearly as fast without eating up the room a radiator would take or suffering from the watercooling pump's noise.



AMD claims that the pump noise has been totally eliminated in a new batch of these cards.


----------



## W1zzard (Jul 11, 2015)

newtekie1 said:


> Can you not be bothered to read two lines down?


And when I wrote it, I thought, "Maybe I should put 15.7 first? Someone might stop reading after 15.5."


----------



## idx (Jul 11, 2015)

W1zzard said:


> finished, temps are the same, as with the pad, maybe 1°C higher


W1zzard, could you try lowering the fan speed under load, or keeping it fixed at a low speed (something that keeps the noise around 30-35 dBA)? I am kinda curious how quiet this card can be if I wanted to keep the temperature around 75°C. Please?

Thanks in advance!


----------



## RejZoR (Jul 11, 2015)

W1zzard said:


> See the pic I posted like 5 posts above. I didn't notice any metal, it's just a gummy pad with some carbon particles in it I think, it leaves difficult-to-remove black stains.



I know, but I've seen a metallic looking one somewhere around here on TPU...


----------



## newtekie1 (Jul 11, 2015)

birdie said:


> AMD claims that the pump noise has been totally eliminated in a new batch of these cards.



AMD has made a lot of claims lately, and most of them have turned out to be false, so really until we actually see this new batch hit the reviewers I'm not going to take their word for it.


----------



## Ikaruga (Jul 11, 2015)

This isn't a bad card at all. Somebody who prefers AMD and wants an air-cooled card can finally buy something "new". It's not a game changer, but I'm sure AMD will drop the price later, and it runs games quite well.


----------



## pat-roner (Jul 11, 2015)

W1zzard said:


> That is correct. I'm not sure how I should handle non-ref cards, even if I wanted to. Nearly everyone sends me his cards, so do I include them all? Or just certain manufacturers? Which ones?



Maybe include relevant cards?
I.e., when testing an aftermarket 980 Ti, include aftermarket Tis.

On another note:

Have you considered benchmarking 3440x1440? Instead of 1600x900, maybe?


----------



## W1zzard (Jul 11, 2015)

pat-roner said:


> Have you considered to benchmark 3440x1440? Istead of 1600x900 maybe


We need 16:9 for comparison with lower-end cards. 3440x1440 doesn't seem to be that popular yet; I'll look at it again later this year. Adding a separate monitor just for one new resolution that almost nobody uses... I'm not sure that's worth it.


----------



## pat-roner (Jul 11, 2015)

W1zzard said:


> We need 16x9 for comparison with lower end cards. 3440x1440 doesn't seem to be that popular yet, I'll look at it again later this year. Adding a separate monitor just for one new resolution that almost nobody uses, I'm not sure if that's worth it.



Yeah, that makes sense!

Thanks for reply


----------



## the54thvoid (Jul 11, 2015)

birdie said:


> *Drivers:    AMD: Catalyst 15.5 Beta*
> 
> This kinda invalidates the whole review.
> 
> Please, retest all AMD GPUs using Catalyst 15.7 (which contains a whole lot of optimizations and fixes).



You could always post a welcome and honest apology to @W1zzard for being hideously quick to judge and criticising him. He's probably one of the most flexible and thorough reviewers out there.


----------



## Anymal (Jul 11, 2015)

idx said:


> the STRIX GTX 980 is OC version, the STRIX Fury is running at stock clocks. Still +50Mhz OC 980 wont change much tbh.


The Strix 980 is a Strix version, and so is the Strix Fury; whether they could or couldn't factory-overclock it is not our problem. It is just a shame that W1zzard won't add at least the GeForce 980 Strix to the comparison, as I believe it would be very helpful for all of us buying a card in the $550-600 range.

I am grateful for the review, but anyway.


----------



## W1zzard (Jul 11, 2015)

Anymal said:


> wont add at least geforce 980 strix in comparison


There is also the issue that I don't have the 980 Strix anymore, so I wouldn't be able to test it on new drivers.


----------



## HumanSmoke (Jul 11, 2015)

birdie said:


> One other thing I disliked:
> *AMD claims* that the pump noise has been *totally eliminated* in a new batch of these cards.


I don't think AMD claim any such thing. From their statement on the matter:


> adjustments in the sound baffling adhesive compound were applied in the assembly of the high speed cooling pump to address the specific sound a few end users experienced as problematic. This improved the acoustic profile of the pump, and repeat testing shows the specific pitch/sound in question was *largely reduced* through adjustments to the sound-baffling adhesive compound in the pump.


How much probably depends on the user and the specific card in question, but it falls short of the claim you are making on AMD's behalf.


----------



## W1zzard (Jul 11, 2015)

Shamonto Hasan Easha said:


> LOL that Far Cry 4 anomaly


Far Cry 4 seems to be CPU limited on AMD cards to around 78 FPS, which causes some large random performance swings at the "wall".
I've rerun FC4 on Fury and Fury X and updated the graphs accordingly.
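
To illustrate why a hard cap produces such swings, here is a minimal sketch; the ~78 FPS figure is the one above, while the +/-1 ms per-frame jitter values are assumed purely for illustration:

```python
# Near a CPU-imposed frame-rate "wall", small per-frame time differences
# translate into large FPS differences. The 78 FPS cap is from the post;
# the +/-1 ms jitter values are assumed for illustration only.

def fps(frame_time_ms: float) -> float:
    """Frames per second for a given per-frame time in milliseconds."""
    return 1000.0 / frame_time_ms

cap_ms = 1000.0 / 78.0  # ~12.8 ms per frame at the 78 FPS wall
for jitter_ms in (-1.0, 0.0, 1.0):
    t = cap_ms + jitter_ms
    print(f"{t:5.2f} ms/frame -> {fps(t):5.1f} FPS")
```

Near the wall, a mere millisecond of per-frame variation shifts the reported rate by roughly 6 FPS in either direction, which is why run-to-run numbers bounce around at the cap.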


----------



## SASBehrooz (Jul 11, 2015)

Well, it's hard for me to say it: good card, worth it.


----------



## Anymal (Jul 11, 2015)

W1zzard said:


> There is also the issue that I don't have the 980 STRIX anymore, so I wouldn't be able to test it on new drivers.


Thank you for the quick response; I really appreciate it and your tests, especially the relative-performance charts, which were among the first on the scene, if not the first.
More questions: do you plan to also do FCAT tests in the near future, and why or why not? Do you think they are important, since they show the actual user experience and not just average FPS?


----------



## majevica (Jul 11, 2015)

W1zzard said:


> Far Cry 4 seems to be CPU limited on AMD cards to around 78 FPS, which causes some large random performance swings at the "wall".
> I've rerun FC4 on Fury and Fury X and updated the graphs accordingly.



Hey, can you maybe add Assetto Corsa and Dirt Rally to the list of games tested, or at least one of them? They are probably the two best games when it comes to physics in the racing genre, so it would be really useful for us sim racers.
Thank you.


----------



## Assimilator (Jul 11, 2015)

"I wish more AMD partners built R9 Fury cards because this SKU is definitely a better rounded package than the R9 Fury X."

Hit the nail on the head. I'd go further and say that Fury is a better card than Fury X. It may not perform as well, but it's positioned much better so that it actually offers a compelling alternative to GTX 980. I don't know what, if anything, AMD can do about the price, because Fiji is already hella expensive.

It feels to me that AMD should've skipped Fury X altogether (at least for now, while they're still working out the overclocking) and released plain Fury as their first Fiji SKU. A few months down the line, they could've released a better Fury X (similar to how nVIDIA released 980 Ti) and a lot of people would've been a lot happier.


----------



## W1zzard (Jul 11, 2015)

Anymal said:


> FCAT tests in near future


I have no plans for FCAT testing; I'd rather provide a large selection of games and resolutions, and other reviews have FCAT data. You should always consider multiple reviews anyway.



majevica said:


> Hey can you maybe add Assetto Corsa and Dirt Rally


I'm thinking about F1 2015, which uses the new version of the EGO engine.


----------



## jabbadap (Jul 11, 2015)

Great review and great card. If NVIDIA wants to release yet another cut-down GM200 card to better compete with this, they have no names left (well, maybe a "GTX 980 2560", like the old Fermi GTX 560 Ti 448). Most likely they'll just lower the price of the GTX 980, if that's even needed.



Spoiler






newtekie1 said:


> All Fury cards will be custom PCBs, there is no reference PCB for Fury.  And the Fury X PCBs are all manufactured by AMD(Sapphire actually) and given to the other companies for distribution, so they can't even use that PCB for their Fury cards.  Well, Sapphire can.



AFAIK the reference PCB is the same as the one seen in the Fury X; see AnandTech's Tri-X review:
http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/2


----------



## Zakin (Jul 11, 2015)

Funny how much the review style comes into question after so many years, just to try and make AMD look better. What even? The card isn't even bad, definitely better all around than the Fury X, or at least more interesting. Really glad to see multi-monitor power draw is down so much, almost seems like it took a third party vendor to fix that..


----------



## W1zzard (Jul 11, 2015)

Zakin said:


> almost seems like it took a third party vendor to fix that..


it was fixed by using HBM instead of GDDR5


----------



## Zakin (Jul 11, 2015)

Possibly dumb question, then: why is the Fury X's multi-monitor draw still double this card's? I've also seen in other reviews that the Fury X was still quite a bit higher in multi-monitor. It just didn't seem like AMD themselves fixed the core issue.


----------



## mroofie (Jul 11, 2015)

HumanSmoke said:


> I don't think AMD claim any such thing. From their statement on the matter:
> 
> How much probably depends on the user and the specific card in question, but it falls short of the claim you are making on AMD's behalf.


Forget about birdie, he's an AMD fanboy.


----------



## Aquinus (Jul 11, 2015)

Zakin said:


> Possibly dumb question, then: why is the Fury X's multi-monitor draw still double this card's? I've also seen in other reviews that the Fury X was still quite a bit higher in multi-monitor. It just didn't seem like AMD themselves fixed the core issue.


Most reviews I've seen put multi-monitor draw at around 20-25 watts on both Fury and Fury X. Do you have a particular review in mind that says otherwise? The reduced multi-monitor idle usage is definitely a perk.


----------



## W1zzard (Jul 11, 2015)

Zakin said:


> Possibly dumb question, then: why is the Fury X's multi-monitor draw still double this card's? I've also seen in other reviews that the Fury X was still quite a bit higher in multi-monitor. It just didn't seem like AMD themselves fixed the core issue.


The pump


----------



## newtekie1 (Jul 11, 2015)

jabbadap said:


> AFAIK the reference PCB is the same as the one seen in the Fury X; see AnandTech's Tri-X review:
> http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/2



Yes, and only Sapphire is using it, because they were the only ones to produce it in the first place. I addressed this in the original post.


----------



## birdie (Jul 11, 2015)

mroofie said:


> Forget about birdie, he's an AMD fanboy.



Actually, I've never owned a single AMD GPU, but I'd like to see credit given where it's due.



Spoiler: My GPU history



Riva TNT2 -> MX 440 8x -> GeForce FX 5600 (owned for just a month - I hated it) -> GeForce 6600 -> GeForce 7600 GT -> GeForce 8800 GT -> (currently) *Gigabyte GeForce GTX 660* -> intend to buy a GeForce 960 Ti if it gets released; if it's not, I will wait for Pascal



The reason I've always avoided ATI/AMD is their awful drivers. They still are awful.


----------



## jabbadap (Jul 11, 2015)

newtekie1 said:


> Yes, and only Sapphire is using it, because they were the only ones to produce it in the first place. I addressed this in the original post.



Yeah, I did that spoiler thing to hide my own mistake 

Is there any news from the other AIBs? I have a hard time believing that XFX, PowerColor, HIS, MSI, Gigabyte, etc. will skip this card.


----------



## n-ster (Jul 11, 2015)

W1zzard said:


> I'm not sure how I should handle non-ref cards, even if I wanted to. Nearly everyone sends me his cards, so do I include them all? Or just certain manufacturers? Which ones?



I personally agree with the suggestion to somehow incorporate them. You could add the best 1-2 cards in the noise section, or just the one closest to that model (STRIX vs STRIX). Perhaps they don't even need to be added to the graph, just mentioned in the text.

One thing I'd love to see for the best coolers (STRIX, MSI Gaming, etc.) is ways to quiet down the cooler without affecting performance. Perhaps have a 75-80 °C target temperature.

GPUs are so loud that even people who don't care much about noise would rather pay $30 extra to upgrade to a STRIX.


----------



## mirakul (Jul 11, 2015)

Despite its amazing 10+2-phase VRM, ASUS set the power limit at a mere 216 W for the Fury STRIX. A custom BIOS is needed to unleash this beast.


----------



## the54thvoid (Jul 11, 2015)

mirakul said:


> Despite its amazing 10+2-phase VRM, ASUS set the power limit at a mere 216 W for the Fury STRIX. A custom BIOS is needed to unleash this beast.



More power phases don't always allow far higher performance. It can be quite a sales gimmick - I know, having owned some cards with silly phase counts. If the chip already runs at the top end for the architecture, the extra phases really just mean the card operates within a better power envelope. In other words, if the Fiji chip is near 100% of its 'theoretical' performance, the added power circuitry helps it use power more effectively, thus drawing less power.

This shows quite a good comparison of what benefit the power phases give:







The Strix draws 29 watts less (7% lower draw) than the Sapphire Ref speed card.
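
As a quick sanity check on those two figures (taking the post's 29 W and 7% at face value, not as independent measurements), the implied absolute draws follow directly:

```python
# Back-of-the-envelope check of the quoted comparison: if a 29 W saving
# corresponds to a 7% reduction, the implied baseline (reference-clocked
# Sapphire) and Strix draws follow. These are the post's numbers only.

saving_w = 29.0
saving_frac = 0.07

baseline_w = saving_w / saving_frac   # implied Sapphire draw
strix_w = baseline_w - saving_w       # implied Strix draw

print(f"baseline ~{baseline_w:.0f} W, Strix ~{strix_w:.0f} W")
```

So the quoted numbers imply roughly 414 W for the reference-clocked Sapphire card versus roughly 385 W for the Strix in that particular comparison.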


----------



## mirakul (Jul 11, 2015)

the54thvoid said:


> The Strix draws 29 watts less (7% lower draw) than the Sapphire Ref speed card.



It's clear that you don't know about Sapphire's 300 W limit. The higher power limit allows the card to stay at boost clock in more circumstances, hence the higher power consumption. Not to mention the difference in stock voltage, by the way.


----------



## ZoneDymo (Jul 11, 2015)

birdie said:


> Actually I've never owned a single AMD GPU but I'd like to see the credit where it's due.
> 
> 
> 
> ...



Is it not a tad ignorant to claim "awful drivers" if you've never used the product?
What makes you even think they are awful?
I've been using my HD 6950 for years now and I have had no issues of any kind with the drivers
(as opposed to my previous NVIDIA 8800 GTS (G92), and the 7900 GTO before it, where settings reset, the control panel crashed when trying to start it, and I had to download extra software to be able to tweak things; personally, I always found NVIDIA's software to feel a lot more crude and unsophisticated compared to CCC, an opinion born of experience).


----------



## the54thvoid (Jul 11, 2015)

mirakul said:


> It's clear that you don't know about Sapphire's 300 W limit. The higher power limit allows the card to stay at boost clock in more circumstances, hence the higher power consumption. Not to mention the difference in stock voltage, by the way.



I intentionally mentioned the *reference* Sapphire card. It's a better comparison for the power-phase efficiency of the Strix. At 1000 MHz, the Sapphire reference draws more power than the Strix at 1020(?) MHz. That's what the power phases are good for. It's clear you don't understand the point of a good power circuit.


----------



## mirakul (Jul 11, 2015)

the54thvoid said:


> I intentionally mentioned the *reference* Sapphire card. It's a better comparison for the power-phase efficiency of the Strix. At 1000 MHz, the Sapphire reference draws more power than the Strix at 1020(?) MHz. That's what the power phases are good for. It's clear you don't understand the point of a good power circuit.


A good power circuit = more consistent voltage. Yes, it could lead to better power consumption, but not by much.
But you didn't read my post, right?
"The higher power limit allows the card to stay at boost clock in more circumstances, hence the higher power consumption."
Sapphire has a higher stock voltage and power limit. That's why it draws more power even at a lower clock.


----------



## birdie (Jul 11, 2015)

ZoneDymo said:


> Is it not a tad ignorant to claim "awful drivers" if you've never used the product?
> What makes you even think they are awful?
> I've been using my HD 6950 for years now and I have had no issues of any kind with the drivers
> (as opposed to my previous NVIDIA 8800 GTS (G92), and the 7900 GTO before it, where settings reset, the control panel crashed when trying to start it, and I had to download extra software to be able to tweak things; personally, I always found NVIDIA's software to feel a lot more crude and unsophisticated compared to CCC, an opinion born of experience).



It's strange that people who don't know you speak as if they've known you your entire life. It's even stranger that they try to humiliate you in the process.

Over the past 12 months my friends (who in fact own AMD GPUs) have had the following problems (some of them are still not resolved):

1) No fan speed management; the fan runs at full speed all the time.
2) Unable to uninstall AMD drivers (also see below).
3) Windows BSOD'ing after ostensibly uninstalling the drivers (in fact they didn't uninstall cleanly, hence the error; I could only uninstall them in safe mode).
4) Firefox working insanely slowly because of the driver's broken Direct2D acceleration.
5) CCC requiring the .NET Framework, which taxes your CPU/HDD.
6) CCC weighing God knows how much.

I don't even want to touch AMD's brain-damaged website. At the same time, practically anyone can use NVIDIA's.

I've had problems only with leaked NVIDIA drivers over ten years ago (NVIDIA has since fixed the leak).


----------



## nem (Jul 11, 2015)

the54thvoid said:


> You could always post a welcome and honest apology to @W1zzard for being hideously quick to judge and criticise him. He's probably one of the most flexible and thorough reviewers out there.


yeah i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , its seem like this some kind of favor to the GeForce... ¬¬


----------



## the54thvoid (Jul 11, 2015)

mirakul said:


> Good power circuit = more consistent voltage. Yes, it could lead to better power consumption, but not much.
> But, you didn't read my post, right?
> "The higher power limit allows the card stay in boost clock in more circumstances, hence higher power consumption"
> Sapphire have higher stock voltage and power limit. That's why it draws more power even at lower clock.



I do see what you are saying, but my point was that better phase-power circuitry doesn't mean the card is built for higher clocks and better performance. The fact that the Sapphire card has the reference PCB and is clocked higher kind of proves my point. Yes, if you want to overclock you will want good power circuitry - I entirely understand that point - but in this situation the extra phases deliver more stable current, which yields better power efficiency. Then again, I once owned a card with very 'basic' power circuitry, but under water and volt-soft-modded it gave me around 1300 MHz, and that was 2 years ago. Moar phases doesn't create the beast - the chip does.



nem said:


> yeah , i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , in seem like this some kind of favor to side of nVIDIA really! ¬¬



Umm, what?  Your post is utter "unsense".


----------



## alwayssts (Jul 11, 2015)

So, end of the day, what did we learn from the Fiji launches?

IMHO, it justified some of the viewpoints I've long held about AMD's architecture...do you guys agree or do you see things differently?

1. AMD really needs clockspeeds in the ~1400-1500 MHz (capable) range for an extremely compelling part, given the properties of today's games.

2.  AMD needs to get their CU:ROP ratio under control. 1 CU : 1 ROP is not optimal; it's closer to something like *16 ROPs : 14-15 CUs, or 24 ROPs : 22 CUs*. While compute is great, there comes a point where it's a liability. This is something NVIDIA learned going from Kepler to Maxwell.

3.  AMD does not benefit from HDL (high-density libraries), or whatever other jazz (outsourcing? lack of key engineers?) has gotten into their designs since Hawaii. While whatever process (HPM?) likely saves them space and/or may in theory run higher clocks at lower voltage - which obviously for Fiji's design may be crucial in one way (it's the largest it can be at 28 nm) or another (Nano might be compelling at something like 850-900 MHz core / 400 MHz HBM at 0.9 V core / 1 V memory vs. the 970/980) - *the underlying voltage required for decent clockspeed/performance isn't great, nor is the overall scaling*. While we see the newer 390(X) parts doing better than the initial 290-series run, these parts clock worse per volt than any of the original 7000 series by a decent margin (10%?). Also, even figuring Maxwell has a 20-25% deeper pipeline (or whatever changed so that their clockspeed scaling is now more similar to the ARM A57; I always assumed a presumptive design toward 20 nm), NVIDIA's scaling has stayed the same from Kepler to Maxwell. *What is going on with AMD's clockspeed problems?*

End of the day, I think AMD's arch could be a good one, give or take a few tweaks and changes in philosophy; *GCN needs to be rebalanced*. While I certainly have no idea what design rules apply - i.e., does 16 CUs take up just as much space as 15 in a setup engine within the confines of the overall chip parameters, and/or can AMD reconfigure such an engine to be 12-24 ROPs, etc. - something needs to be done in that regard for efficiency. On the same token, changes in process/design need to be applied to allow the chips' clockspeeds to scale, even at the cost of chip size (remember the decap ring in RV790?), or they are flat-out doomed. While there is always an argument for adding more units and lowering clockspeed/voltage, probably especially as we move forward to more mobile-oriented (low-voltage) processes, ATI has always been at the top of their game when using fewer units and having greater clockspeed potential than their competition; it saves space/cost (and, on former processes, stock power consumption) while also making it the 'overclocker's choice'. This philosophy has also surely helped NVIDIA succeed. Look at how many references you see in this review alone noting that NVIDIA's arch, even if at a stock 1000-1300 MHz and using fewer units (say 2080-2560 SP+SFU in GM204, or 3360-3840 in GM200), can overclock (even if sometimes while drawing lots of power) super high consistently: ~1500 MHz.


This, in part, is what scares me about 14LPP/16nmFF+ wrt AMD. Samsung 14 nm is seemingly smaller and cheaper (~10%), but likely offset by clock potential vs. TSMC and 16nmFF+. While I could very much see AMD dropping a Fiji shrink that is under 225 W (typical first parts from AMD on a process are around 188 W; half of 375 W) and capitalizes completely on die savings and extra perf/V, perhaps even dropping two such small chips (and 2x HBM2) on a single interposer for a crazy-awesome part, the compelling nature of such a chip - let's say (for argument's sake) Fiji at 1400/625 MHz with 8 GB - only goes so far. I seriously fear (for competition's sake) that NVIDIA will use 14/16 nm to both increase floating point (and/or decrease unique special-function hardware) to create a part similar to AMD's while maximizing clock speed.

Say, for instance, 224-240 SPs (+/- 32 SFUs) in an SMP (shader module, Pascal), up from 128 SPs (+32 SFUs) in Maxwell or 192 (+32 SFUs) in Kepler. In a 16-module design (e.g. GP104, something replacing GM200 for the slightly lower-end performance market), the result is something similar to either 3584 (4096) or 3840 SPs - similar to or more efficient than Fiji. While NVIDIA may not capitalize completely on die savings (as AMD surely will) nor on absolute power consumption (I could see them doing another '980', which is made to draw >225 W), we could conceivably see something that continues to follow the clock-scaling path of ARM processors (which on 14/16 nm are planned for 2 GHz+). While certainly rumors, this theory is backed by the voices in the wind mentioning that NVIDIA ran back to TSMC for their next designs after Samsung's (/GF's) yields were terrible. For AMD, that has to be an incredibly scary thought... I very much doubt they want a 970-vs-290X rematch (but imagine the 970 wasn't gimped and the 290X drew less power).

TLDR:  I've always appreciated AMD's strengths in engineering, design choices, and pushing technology forward....but something needs to change.  I surely hope it does by next generation.


----------



## nem (Jul 11, 2015)

meanwhile the review of GeForces are maked with the last drives ... ¬¬

#W1zzard pls !


----------



## Kissamies (Jul 12, 2015)

No VGA signal should be a thumbs-up; it's 2015 already, who the hell would connect a VGA device to these?

I was already stunned when the custom Palit/Gainward HD 4870 X2 had a physical VGA connector, and that was like 7 years ago.


----------



## HumanSmoke (Jul 12, 2015)

nem said:


> yeah i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , its seem like this some kind of favor to the GeForce... ¬¬





nem said:


> meanwhile the review of GeForces are maked with the last drives ... ¬¬



If your location truly is Cyberdyne, the Human race is under no threat whatsoever...unless you're planning on subjugating us using broken syntax.


----------



## newtekie1 (Jul 12, 2015)

9700 Pro said:


> No VGA signal should be a thumbs-up; it's 2015 already, who the hell would connect a VGA device to these?



I do. No VGA is a downside to me. My main rig, with SLI 970s, still has a VGA monitor connected to it.


----------



## nem (Jul 12, 2015)

HumanSmoke said:


> If your location truly is Cyberdyne, the Human race is under no threat whatsoever...unless you're planning on subjugating us using broken syntax.


you did forgot I'm a robot.. :B


----------



## Tetsudo77 (Jul 12, 2015)

How noisy and hot is the card when doing 1440P ?


----------



## HumanSmoke (Jul 12, 2015)

Tetsudo77 said:


> How noisy and hot is the card when doing 1440P ?


If the card maxes out at 69°C overclocked under load, I wouldn't think it qualifies as hot or noisy given W1zzard's comments. Amazing how much you get out of a review if you read it.


----------



## xenocide (Jul 12, 2015)

I see no reason to get this rather than a 980 or 980 Ti to be honest.  The price/performance doesn't match it well against either card...


----------



## W1zzard (Jul 12, 2015)

nem said:


> meanwhile the review of GeForces are maked with the last drives ... ¬¬


Are you asking why a review posted on July 7 was not using a driver posted on July 9 ? I don't have a time machine yet


----------



## Caring1 (Jul 12, 2015)

W1zzard said:


> Are you asking why a review posted on July 7 was not using a driver posted on July 9 ? I don't have a time machine yet


I'd love to read the review when you get one


----------



## Tetsudo77 (Jul 12, 2015)

HumanSmoke said:


> If the card maxes out at 69°C overclocked under load, I wouldn't think it qualifies as hot or noisy given W1zzard's comments. Amazing how much you get out of a review if you read it.



I assume it was under a 40% fan profile, correct?


----------



## nem (Jul 12, 2015)

Caring1 said:


> I'd love to read the review when you get one


Yeah! me too


----------



## nem (Jul 13, 2015)

well dont surpized since W


W1zzard said:


> Are you asking why a review posted on July 7 was not using a driver posted on July 9 ? I don't have a time machine yet



?


----------



## anubis44 (Jul 13, 2015)

majevica said:


> You mean like Nvidia with their ultimate fuck you to customers with their faulty laptop chips that rendered my 1500e laptop useless after a year and in my country they didn't offered extended warranty or anything.



This happened to me, too. I bought a Toshiba Tecra M3 and sold it to my mother-in-law about a year later. The damn nVidia graphics chip died less than a month afterwards, and I found out later that it was one of the 'Bumpgate' scandal chips that nVidia washed their hands of, never reimbursing many, many customers. Then I had a GTX 670 card that wouldn't power up all three monitors in a three-monitor setup without going through about 15 steps each time there was a driver update. Then there's PhysX, and now HairWorks' dirty tricks to deliberately make games run like crap on AMD graphics cards by cranking the tessellation levels up to 11, even though it looks identical at half that level. Finally, nVidia sells tons of GTX 970 cards, advertising them as '4GB' cards, only to have it revealed that really they've got 3.5GB of 'fast' memory and 0.5GB of 'slow' memory, because it allows them to use a cheaper memory subsystem design and squeeze a few extra bucks out of their customers.

After all this, why would people continue to trust them? AMD may have made mistakes in the past, but at least they weren't straight up, naked attempts to rip off their own customers.


----------



## nem (Jul 13, 2015)

anubis44 said:


> This happened to me, too. I bought a Toshiba Tecra M3 and sold it to my mother-in-law about a year later. The damn nVidia graphics chip died less than a month afterwards, and I found out later that it was one of the 'Bumpgate' scandal chips that nVidia washed their hands of, never reimbursing many, many customers. Then I had a GTX 670 card that wouldn't power up all three monitors in a three-monitor setup without going through about 15 steps each time there was a driver update. Then there's PhysX, and now HairWorks' dirty tricks to deliberately make games run like crap on AMD graphics cards by cranking the tessellation levels up to 11, even though it looks identical at half that level. Finally, nVidia sells tons of GTX 970 cards, advertising them as '4GB' cards, only to have it revealed that really they've got 3.5GB of 'fast' memory and 0.5GB of 'slow' memory, because it allows them to use a cheaper memory subsystem design and squeeze a few extra bucks out of their customers.
> 
> After all this, why would people continue to trust them? AMD may have made mistakes in the past, but at least they weren't straight up, naked attempts to rip off their own customers.


so what about the premium support and high quality of nVIDIA then ? :B


----------



## HumanSmoke (Jul 13, 2015)

nem said:


> well dont surpized since W  ?









Spoiler









  Cyberdyne: Inspired by Hollywood....built in North Korea


----------



## W1zzard (Jul 13, 2015)

nem said:


>


That's the image you posted; its date is July 7, and the Zotac 980 Ti review uses the 15.15 beta.

_This_ article, the Fury review, posted on July 10, uses 15.7.


----------



## anubis44 (Jul 13, 2015)

haswrong said:


> well, amd is in deep trouble, so it has to surface somewhere. like chaotic pricing model in their last spasmatic attmept to grab a market share and generate revenue.. i dont see as likely for anyone to buy a g-card with a heatsink larger than the g-card itself and zero oc potential.. now i have to wonder even more what exactly amd expect from the nano model..........
> 
> edit: oops, i thought the pcb was smaller, i skipped first few pages of the review  ok.. the heatsink ISNT larger in length and width than the card/pcb.. ok, ok............ meh



Knee-jerk anti-AMD response perhaps? I don't recall people accusing nVidia of being in deep trouble when they asked $3000 for the first Titan. Greedy, yes.


----------



## 1d10t (Jul 13, 2015)

Price is just a little too steep 
Oh well... I think I'm gonna wait a little while, and for the time being I might go triple CrossFire.


----------



## anubis44 (Jul 13, 2015)

birdie said:


> Actually I've never owned a single AMD GPU but I'd like to see the credit where it's due.
> 
> 
> 
> ...



Strange that you are an authority on AMD drivers when, by your own admission, you've "never owned a single AMD GPU". I, on the other hand, have had mostly AMD Radeon cards over the last 7 years. Since a Radeon 4850, I've had a Radeon 6950 (briefly), a GTX670 (for about a month, which I sold because the nVidia drivers kept dropping my 3rd monitor after every driver update, and it required a 15 step process to get it back), a Radeon 7950 (bios flashed using a 7970 bios to 1000MHz - thanks TechPowerUp!), and now I've got a Gigabyte Radeon R9 290 (non-OC, but bios flashed with the OC bios to 1040MHz - again, thanks TechPowerUp for the bios!). 

Frankly, the 'bad AMD drivers' argument is obsolete. AMD drivers have been very good for several years now. It's true that AMD sometimes lags behind nVidia in optimizing for a specific game, but that's also true of nVidia: their driver that worked properly with Tomb Raider came out nearly a month after AMD's. The point is, AMD drivers are now excellent for the most part.


----------



## anubis44 (Jul 13, 2015)

1d10t said:


> Price is just a little too steep
> Oh well... I think I'm gonna wait a little while, and for the time being I might go triple CrossFire.



I would really urge you to reconsider that idea. Two Radeons in CrossFire will give you the best possible bang for the buck. That third card will almost certainly be wasted, at least until we know how DX12 handles three cards.


----------



## nem (Jul 13, 2015)

W1zzard said:


> That's the image you posted, date is july 7, zotac 980 ti review uses 15.15 beta.
> 
> _this_ article, the fury review, posted on july 10, uses 15.7


but the point is this review of Fury (non x) before say 15.5 and after some kind of "MAGIC" misteriusly change to 15.7 , then whats go on there ?


----------



## the54thvoid (Jul 13, 2015)

nem said:


> but the point is this review of Fury (non x) before say 15.5 and after some kind of "MAGIC" misteriusly change to 15.7 , then whats go on there ?



I think that you should return to this adult forum when you turn maybe 16 years old.  Maybe then you'll have the common sense and intelligence to read things before posting.







1 - You use the Zotac 980ti review to gain the drivers for the AMD cards 
2 - The date is 7th July
3 - The up to date (as of 7th July) 15.5 beta for AMD.  Note, below that is the 15.15 beta for Fury.






4 - You now use a different review to post drivers used.
5 - The date is now 10th July
6 - W1zzard has updated the drivers (having been released on the 9th July).


Why do you waste peoples time with your idiocy?


----------



## W1zzard (Jul 13, 2015)

nem said:


> but the point is this review of Fury (non x) before say 15.5 and after some kind of "MAGIC" misteriusly change to 15.7 , then whats go on there ?


It has always said 15.7; I've just reworded the section so that people who read only the first line will understand it more easily.

When I saw the new drivers released I immediately started rebenching as many cards as possible, then I got the fury strix and benched that on new drivers too. Since the review had to go up at that time, there was no time to rebench the remaining older cards.


----------



## ZoneDymo (Jul 14, 2015)

birdie said:


> It's strange that people who don't know you speak as if they've known you your entire life. It's even stranger that they try to humiliate you in the process.
> 
> Over the past 12 months my friends (who in fact own AMD GPUs) have had the following problems (some of them are still not resolved):
> 
> ...



Ah right, the illusive friends...
Well, I don't know how technologically adept your friends are, but let's gloss over that, as it shouldn't be that important.

1. Erm, CCC has had fan control for a long time now... under the Performance tab....
http://i43.tinypic.com/2d77i1l.gif (that is from 2010)
2. Riiiiight.
3. Riiiiiiiiiiiight. Also, who (of those who bother uninstalling drivers) doesn't do that in Safe Mode using software like Driver Cleaner or, more recently, Display Driver Uninstaller anyway?
Honestly, from when I started getting into PC gaming and hardware, everyone and their mother always said to use that software and Safe Mode for the cleanest possible uninstall, regardless of using AMD (ATI at the time) or Nvidia.
4. That is true, that shit sucks.
5. As if you need that only for AMD drivers.... Also, "taxes your CPU/HDD" - are you serious? Maybe with a 1 GHz Pentium III you would feel that.
6. Well, God would know, and so should your friends, so instead of assuming it's a lot, why not ask them? It's odd to whip up a complaint you are not even sure is one. (Also, your answer: it's a feather, weighs nothing (unless, again, you have that Pentium III rig).)

I'm not sure when the last time was that you visited the AMD website (judging from the intensity of your post, I'd wager never), but it's exactly the same as Nvidia's...

Let me show you how many clicks it takes to get to the drivers:
Nvidia.com > USA > Drivers/Geforce Drivers > Download ADU or Manually > Download driver
  AMD.com > Drivers + Support/Drivers + Download Center > Download ADU or Manually > Download driver

Conclusion: Exactly the same


----------



## tiptop (Jul 14, 2015)

ZoneDymo said:


> Ah right, the elusive friends...
> 
> Let me show you how many clicks it takes to get to the drivers:
> Nvidia.com > USA > Drivers/Geforce Drivers > Download ADU or Manually > Download driver
> ...



Yep, for me it's like this: http://www.amd.com/en-us (or just click a bookmark) > mouse over Drivers + Support (upper right corner) > click Latest GPU Drivers for (choose your OS, e.g. Windows 8.1 (64-bit)) > click Download. Let's sum it up: 1 URL, 1 mouse move, 2 mouse clicks. Unbelievably hard for an NVIDIA user, isn't it? Just LOL.

People with elusive friends with problems always amuse me. He's never had an AMD product, but AMD drivers, well... are just shit because they're not NVIDIA's, right? Also, it's crystal clear he's never read the NVIDIA support forum, any driver release notes directly from NVIDIA, or visited any AAA-title Steam forum just after its release. Let's tell him a secret: godlike NVIDIA drivers have lots of problems, too.


----------



## W1zzard (Jul 14, 2015)

Pro tip: go to TPU, right sidebar, scroll down near the end, section "Latest Drivers"


----------



## Ferrum Master (Jul 14, 2015)

Dafuq is going on here. Ban hammer and clean out; make your own driver-complaint thread that I won't visit. 

I want to know if anyone has succeeded in putting some more millivolts into the Fury series... Any luck? No LN2 action either...


----------



## haswrong (Jul 15, 2015)

anubis44 said:


> Knee-jerk anti-AMD response perhaps? I don't recall people accusing nVidia of being in deep trouble when they asked $3000 for the first Titan. Greedy, yes.


anyway, nothing matters, because Intel and Nvidia are tolerating and actually helping AMD to not perish from the market because of a looming monopoly... so in fact, AMD can ask atrocious prices, but it's only up to you


anubis44 said:


> Strange that you are an authority on AMD drivers when, by your own admission, you've "never owned a single AMD GPU". I, on the other hand, have had mostly AMD Radeon cards over the last 7 years. Since a Radeon 4850, I've had a Radeon 6950 (briefly), a GTX670 (for about a month, which I sold because the nVidia drivers kept dropping my 3rd monitor after every driver update, and it required a 15 step process to get it back), a Radeon 7950 (bios flashed using a 7970 bios to 1000MHz - thanks TechPowerUp!), and now I've got a Gigabyte Radeon R9 290 (non-OC, but bios flashed with the OC bios to 1040MHz - again, thanks TechPowerUp for the bios!).
> 
> Frankly, the 'bad AMD drivers' argument is obsolete. AMD drivers have been very good for several years now. It's true that AMD sometimes lags behind nVidia in optimizing for a specific game, but that's also true of nVidia: their driver for Tomb Raider that worked properly came out nearly a month after AMD's. Point is, AMD drivers are now excellent for the most part.


http://www.techpowerup.com/forums/threads/amd-catalyst-15-7-error.214199/


----------



## jigar2speed (Jul 15, 2015)

haswrong said:


> anyway, nothing matters, because Intel and Nvidia are tolerating and actually helping AMD to not perish from the market because of a looming monopoly... so in fact, AMD can ask atrocious prices, but it's only up to you
> 
> http://www.techpowerup.com/forums/threads/amd-catalyst-15-7-error.214199/



I can give you two URLs that say Nvidia drivers actually killed graphics cards/laptops... I request you to please stop, because you seriously don't know what you are talking about.


----------



## Kissamies (Jul 16, 2015)

Having no VGA should be on the pros list instead of cons. It's 2015, time to move on from those ancient connectors.


----------



## MxPhenom 216 (Jul 16, 2015)

Anymal said:


> it would be great if W1zzard added at least one AIB 980 card to the comparison



Not like it's going to make that much of a difference; get over it.


----------



## n-ster (Jul 17, 2015)

MxPhenom 216 said:


> Not like it's going to make that much of a difference; get over it.



People in a public forum get to make suggestions. Note that it is a suggestion, not a demand. "Get over it."


----------



## hero1 (Jul 23, 2015)

ASUS should have made this card single-slot like the Fury X. Why go through the trouble of making a custom PCB, then turn around and limit the power of the GPU, and then make it double-slot wide? I would have rather seen 2 DP, 1 HDMI, and 1 DVI; heck, even one of each would have been nice just to make it a single-slot card.


----------



## By-tor (Aug 5, 2015)

For the price of a single 980 Ti or Fury X, I would have to go with a pair of 390s (not 390Xs)... They would rock.

My pair of PowerColor 290X LCS cards still hold their own...


----------



## Dieinafire (Aug 10, 2015)

What a waste. I was hoping this would be a good card.......... I should have known better


----------



## Dany (Aug 12, 2015)

*ASUS R9 Fury overclocked to 1.0 GHz HBM and a 1400 MHz GPU clock, fully unlocked to Fury X with 1 TB/s bandwidth on LN2. Check it out: http://forum.hwbot.org/showthread.php?t=142320 Enjoy, cheers!!*


----------

