# AMD Radeon R9 Fury X 4 GB



## W1zzard (Jun 23, 2015)

AMD's Radeon R9 Fury X is released today, introducing HBM memory for graphics cards. The new card is built around a watercooled Fiji GPU, which enabled AMD to design a very compact card that will fit into many small-form-factor cases. Gaming performance at 4K is good and roughly matches the GTX 980 Ti.

*Show full review*


----------



## Basard (Jun 24, 2015)

OMFG I CANT WAIT TO READ THIS!!!!!

(edit) Wow.... I wish I didn't just read that....

(another edit)  Nano is supposed to be "significantly" faster than the 290... Seeing how the Fury compares to the 290, I have my doubts.

Well... I guess I'll just wait to upgrade... Some smaller nm chips need to come along....


----------



## Roph (Jun 24, 2015)

Seriously, Samsung should just buy AMD. Fab their stuff at 22/16/14nm and dump cash into CPU+GPU R&D.

Samsung wants its own graphics IP for mobile too.

This card is great and all, but just imagine where we could be now if we weren't still fabbing 28nm chips like it's 2011.


----------



## birdie (Jun 24, 2015)

Hail, competition!


----------



## Fluffmeister (Jun 24, 2015)

So yeah, GTX 980 Ti FTW.

Solid card regardless, I'm sure the peeps here that only go red will lap it up.


----------



## W1zzard (Jun 24, 2015)

birdie said:


> What's the GTX 980 on this page?


NVIDIA reference design


----------



## RCoon (Jun 24, 2015)

Same price as the 980 Ti, performs worse in more than half of the benchmarks, and comes with an extra kg of hosing + water + radiator.

Meh.


----------



## Chaitanya (Jun 24, 2015)

Coolermaster has done an excellent job with cooling this beast.


----------



## KarymidoN (Jun 24, 2015)

Disappointing ... just that.
AMD is officially dead to me. The card made some improvements in temperature and power consumption at the cost of a liquid-cooling solution... yet it still lost almost every test to the 980 Ti (excluding 4K), and even the ones it won were very close. If AMD thinks a 4K card is what the market needs, then it would be better if Samsung just bought AMD.
I've been a loyal AMD user for a long time, but today I'm using a 970 (MSI Gaming, 3.5 GB). I have no plans to use 4K; I'm very happy with full HD.


----------



## Over_Lord (Jun 24, 2015)

Meh


----------



## Sihastru (Jun 24, 2015)

Wait... so it's not actually faster than the 980 Ti? Hmm... And what's the point of using a really nice, high-quality quiet fan if the pump and coil whine buzz it all away? The good news is that AMD is finally competitive; the bad news is that they needed next-gen tech just to be in the same fight as "old"-gen tech. Early next year it all evaporates, and we get back to the underwhelming AMD we all know and "love".

Not the "halo" product I was expecting.


----------



## ChristTheGreat (Jun 24, 2015)

Disappointed, totally. Will keep my R9 290, or maybe sell it and buy a GTX 980... or 2 GTX 780 at low price


----------



## okidna (Jun 24, 2015)

I love it when TPU falls into (almost) silence when new card reviews are published; everybody is busy reading 

On topic: good card with good power consumption, low temperature, and of course good performance. Disappointing overclocking headroom, though, and it still can't totally beat the 980 Ti, which is not good news for AMD.

All that bandwidth coming from HBM also needs some extraordinary GPU computing power, which, as we can all see, is still hard to achieve on the current 28nm technology.

So it's settled then, after seeing Fury's performance and the rebranding strategy with the R9 3xx series: I'm gonna buy a GTX 970 next month


----------



## darkangel0504 (Jun 24, 2015)

How about this?
*http://www.techpowerup.com/forums/threads/error-with-tpus-wolfenstein-benchmarks.213767/*
http://tpucdn.com/reviews/AMD/R9_Fury_X/images/wolfenstein_1920_1080.gif


----------



## LAN_deRf_HA (Jun 24, 2015)

So pretty much what any reasonable person expected, alright but not exactly great, pretty much just a HBM product demo - as opposed to the greatest thing since sliced bread that the fanatics were expecting.


----------



## buildzoid (Jun 24, 2015)

It's only 23% faster than the 390X at best. Like, WTF, are the drivers broken? The only thing it doesn't have more of than the 390X is the ROP count, so is that holding it back? Because the SPs and TMUs point to 45% more power than Hawaii, and this doesn't even come close to delivering that.

Also is the card voltage locked?


----------



## Fleurious (Jun 24, 2015)

Do you know if the high pitched whine from the pump is specific to your review card, or are others getting the same result with their cards?  That would be a deal breaker for me.

<edit> it seems like HWC encountered the same high pitched whine.


----------



## Outback Bronze (Jun 24, 2015)

It's great to *finally* see a review. Puts all those conspiracy theories to rest. Thanks, Wiz.


----------



## W1zzard (Jun 24, 2015)

Fleurious said:


> Do you know if the high pitched whine from the pump is specific to your review card, or are others getting the same result with their cards?  That would be a deal breaker for me.


every reviewer i talked to mentioned the pump noise. the coil noise is a separate noise depending on gpu load.


----------



## natr0n (Jun 24, 2015)

They need a well tuned driver to exploit this card better.


----------



## Crap Daddy (Jun 24, 2015)

The hype is over. The only new card apart from rebrandeon to come out from AMD since October 2013 is, surprise, not the fastest single card on the planet. Otherwise it looks good but it doesn't have the right price. 

If someone from NV would've come up to JHH with this expensive cooling solution for an average overclocking chip while affecting the almighty profit margins he would have fired him on the spot.


----------



## Sihastru (Jun 24, 2015)

Fleurious said:


> Do you know if the high pitched whine from the pump is specific to your review card, or are others getting the same result with their cards?  That would be a deal breaker for me.
> 
> <edit> it seems like HWC encountered the same high pitched whine.



Almost all the reviews mention the pump having a high pitched whine. Some said AMD is working with CoolerMaster to reduce or eliminate it for the retail cards, but I wouldn't hold my breath.


----------



## Vayra86 (Jun 24, 2015)

As many others, extremely disappointed by this release.

Also having a lot of trouble justifying the relatively high score in the review. 9.2 for this card? It is underwhelming across the board:

- price/performance ratio is shit; the only card doing worse is the Titan X
- extremely high power draw
- performance/watt is last-gen at best
- bad, virtually non-existent overclocking potential

Price point is OK-ish, but really only makes sense if you love AMD because the card performs sub-par across many different resolutions and games. For 4K it may be a sensible choice, but that is about it, and then it still is a coin toss between Nvidia/AMD offerings.

So much for the big marketing push and incredible performance/watt jump of HBM. To be honest, I had expected a slight jump in performance, even if only 5% above Titan X, but even that is nowhere to be seen. Basically AMD put a lot of effort in new tech that does not perform better than GDDR5, and the fundamental GCN issues such as high power draw while watching Blu-Ray and at maximum output have not been tackled, if anything they have gotten slightly more pronounced.

The most surprising thing however, is the whopping 4096 shaders used and the virtually unlimited bandwidth and how little performance it all adds. It's almost like the card doesn't get to stretch its legs. Weird.


----------



## zsolt_93 (Jun 24, 2015)

Too bad it's competing with an already-released product that was brought to market to steal sales from Fury (the 980 Ti), and it is not winning that competition. This is going the way AMD and Intel have gone in the CPU department: they just cannot bring out a product that keeps them in contention. It would have been nice to see what the green team would have done if it had beaten the NVIDIA offering. A Titan X Black would have been nice, with water cooling (maybe) and all. But now there seems to be no need for that, and the prices won't go down, which is a pity since I am in the market for a new card and this will not change anything about midrange pricing.

I see HBM as a disappointment, but the same was true when DDR3 and GDDR5 were competing at the high end and GDDR5 couldn't really bring it on. Maybe the next generation of memory will change my mind. Honestly, I see this working better in an APU, with a small amount of HBM (512MB-1GB) dedicated to the graphics cores instead of DDR3/DDR4, which is clearly not at the level of VRAM no matter what frequency it runs at. Thinking ahead from these results, the R9 Nano does not sound good at all anymore: maybe 980-level performance and probably higher consumption, as this still consumes in the range of the 290X and was said to be 50-100% more efficient than that. The big card might achieve that, but the small one would have to be on 16nm to avoid giving up too much processing power, and I don't see them pulling that out of nowhere with their financial troubles.


----------



## Enterprise24 (Jun 24, 2015)

Overclocking sucks so much.


----------



## HTC (Jun 24, 2015)

I was hoping Fury would be this close to the 980 Ti, but ahead of it, not behind.

In any case, it's us the consumers that win with performance this close: let the price wars begin!!!


----------



## btarunr (Jun 24, 2015)

"AMD fans don't trust their own brand. Oh the Hue-manatee!"


just kidding / payback for trolling newsposts


----------



## FrustratedGarrett (Jun 24, 2015)

The massive variation in the Fury's relative performance, especially how badly it performs in GameWorks titles, is a real problem.
The card seems to perform as well as the Titan X in neutral titles, but in titles like The Witcher 3 and Project CARS it falls behind badly. In GTA V NVIDIA does better, but methinks it's due to their crappy HBAO+ implementation.

http://www.hardocp.com/article/2015...mage_quality_comparison_part_5/7#.VYqmJLYpnIU


----------



## RCoon (Jun 24, 2015)

Frame times are still an issue for AMD cards too:


----------



## Recus (Jun 24, 2015)




----------



## $ReaPeR$ (Jun 24, 2015)

damn... :/ so close... @W1zzard  could new drivers and dx12 change this situation?


----------



## jabbadap (Jun 24, 2015)

Well, not that bad; HBM helps it at UHD. But at lower resolutions it gets quite stomped by the GeForces. Power consumption is actually quite good; the 20W idle is the only annoying thing (pump and rad?).

You said it's GCN 1.2; HardOCP says it's GCN 1.3. Are they pulling that out of their backsides, or do you have some concrete information about the version? (Not that I would trust HardOCP over you, just curious where they got the idea of GCN 1.3.)


----------



## xkm1948 (Jun 24, 2015)

This is underwhelming. Waited so long for quite some disappointment. 980Ti it is then.


----------



## natr0n (Jun 24, 2015)

wow almost 600 lurkers lurkin


----------



## Absolution (Jun 24, 2015)

Meh, just on par with the 980Ti at the same price, if not worse.

Just gotta see how the Nano performs against the green team's ITX options, since my next build is gonna be ITX. Not gonna be optimistic.


----------



## Hugis (Jun 24, 2015)

Great review as always. Shame on AMD for the price point and lacklustre results.
On another note, over 600 people reading Wizz's review. TPU rocks!


----------



## LightningJR (Jun 24, 2015)

It's too bad. At 4K it's a decent card, but still ONLY decent; at any lower res it's a bad card. C'mon AMD, competition pls.. 

There doesn't seem to be any reason to get a Fury X over the 980 Ti; if the Fury X had launched at $549 it would be pretty great.

Any word on the fan version of the Fury X? And its price?


----------



## W1zzard (Jun 24, 2015)

Hugis said:


> On another note over 600 people reading Wizz's review TPU rocks!


that's only for the comments ^^


----------



## Sihastru (Jun 24, 2015)

And just to add insult to injury, LTT received a Fury X that artifacts! Artifacting is a direct result of faulty memory! I'm beginning to have doubts as to how much this technology is ready for prime time.


----------



## Caring1 (Jun 24, 2015)

I'll wait to see results using Win 10, then decide which is better: the 980 Ti or the Fury X.


----------



## newtekie1 (Jun 24, 2015)

Well after all the hype about how it was going to be faster than the 980Ti in 4K, this was a big letdown...


----------



## Frick (Jun 24, 2015)

Other sites place it closer to the 980 Ti at 1440p... but W1z has more games.

All in all a good card. It should be a bit cheaper though, and they should be available en masse now, so... dunno. It'll be a divider for sure and provide flaming adult discourse for months to come.

BTW, this card got a 9.2, which should put it in the excellent bracket, yet many, many people will be disappointed. I find that funny.


----------



## luches (Jun 24, 2015)

Well, I had already decided on the 980 Ti and was just waiting to see how Fury does to be 100% sure. Stuck between the G1 and the AMP Extreme. The extra $100 for the AMP Extreme is a bummer, but the thought of more OC headroom on top of that already massive factory OC is quite tempting!


----------



## Absolution (Jun 24, 2015)

Maybe AMD just doesn't want to blow away the competition. They just wanna match it and see their response, so they can release an immediate answer with less cut-down features.

One can dream.


----------



## FrustratedGarrett (Jun 24, 2015)

RCoon said:


> Frame times are still an issue for AMD cards too:



AMD cards have performed worse in Crysis 3 since it was released.
Fury X does better in Civilization: Beyond Earth:


----------



## btarunr (Jun 24, 2015)

Caring1 said:


> I'll wait to see results using Win 10, then decide which is better, the 980Ti or the FuryX



What will Windows 10 change for, say, Battlefield 4?

It's not enough that you have Windows 10 / DX12, you need every test in our review replaced with ones that actually use DX12.

Happy waiting.


----------



## blibba (Jun 24, 2015)

RCoon said:


> Frame times are still an issue for AMD cards too:


Yup


----------



## birdie (Jun 24, 2015)

The truth is, AMD is fucked. Fury-based products are its only competitive products on the 28nm node, and considering that top-tier GPUs are used by only a few (as the Steam HW Survey indicates), there's no way AMD can restore or even improve its market share.

Darn. I thought only the competition in the CPU market had stalled; now it seems the GPU market has come to the same bleak state.


----------



## MxPhenom 216 (Jun 24, 2015)

64 ROPs on this card makes no sense.


----------



## techy1 (Jun 24, 2015)

AMD, what the f happened? :O ... You let me down, building up expectations with false slides (again). I was hoping for a GTX 980 Ti price drop after this release... but now I'm screwed. And next year, when you're under Chapter 7 or a different brand, NVIDIA can and will slap on all kinds of ridiculously high price tags :'(


----------



## newbsandwich (Jun 24, 2015)

Nice write-up, W1zzard, thank you for the candid review. I think if this card was priced at $500 or $550 it would do well, but it costs too much for the performance, given the competition.
Which of the 980 Ti variants would you recommend over the stock design? I read both your Gigabyte and EVGA reviews and they both seem similar.


----------



## SonicZap (Jun 24, 2015)

This is disappointing. They learned from Hawaii's mistakes and solved the noise and throttling, but otherwise this is a much worse showing than Hawaii made at the time. I guess it's going to receive a price drop to $599 soon.

They better improve a lot next year with Arctic Islands or it'll be like Intel vs AMD in the GPU space.


----------



## haswrong (Jun 24, 2015)

Hugis said:


> Great review as allways, shame on AMD for the price point and lacklustre results.
> On another note over 600 people reading Wizz's review TPU rocks!


i started with http://www.hardwareluxx.de, because they have compute measurements and minimum framerates in their game benchmarks, but they didn't have the Witcher 3 test, so that's why i jumped here.. anyway.. i'm quite disappointed with the results.. the 20W idle and noisy pump could have been offset by better-than-NVIDIA performance. apparently, AMD decided to go the R9 290(X) route, with underwhelming performance. so now we have to wait one or two years until AMD decides to update the firmware or drivers and turn Fury into a better Fury, like they turned the 290(X) into the 390(X). what can i say? i won't upgrade my graphics for longer than i expected.. *sad exhausted face*


----------



## W1zzard (Jun 24, 2015)

newbsandwich said:


> Which of the 980Ti variants would you recommend over the stock design? I read both your Gigabyte and EVGA reviews and they both seem similar.


I'm waiting for ASUS, which should arrive in the next weeks. MSI arrives Monday, was stuck in customs.


----------



## Caring1 (Jun 24, 2015)

btarunr said:


> What will Windows 10 change for, say, Battlefield 4?
> 
> It's not enough that you have Windows 10 / DX12, you need every test in our review replaced with ones that actually use DX12.
> 
> Happy waiting.


I'm a patient man, and in no rush to buy the latest and greatest, but I do like reading great reviews


----------



## redko (Jun 24, 2015)

Still using an R9 270; it brings me enough for the money. So, Fury of what? This card has been overestimated from the cradle. Yes, it's small; yes, it's cool; yes, it performs not badly at all, but it's not extraordinary. Unfortunately, the 980 Ti wins.


----------



## RCoon (Jun 24, 2015)

FrustratedGarrett said:


> AMD cards have performed worse in Crysis 3 since it was released.
> FuryX does better in Civilization beyond earth:


Civ BE is a CPU-bound title. Other games in that entire review show AMD has worse frame times overall:


----------



## haswrong (Jun 24, 2015)

luches said:


> well had already decided on 980ti and was just waiting to see how fury does to get 100% sure.  stuck between G1 and amp extreme . the extra 100$ for amp extreme is a bummer but the thought of more OC headroom over that already massive Factory OC is quite tempting !


dunno man.. i'd save the $100 for an extra heatspreader or watercooling block and take the G1. i think Zotac's BIOS is more limited than Gigabyte's.. you should check that before a purchase. or if you intend to upload a custom BIOS, then check the robustness of the PCB. isn't MSI going to make a Lightning version this time around? that could be a nice piece of hardware. unless the EVGA Classified beats it, of course..


----------



## GhostRyder (Jun 24, 2015)

Well, we finally have the review we wanted of the Fury X. Honestly I am not disappointed in the card, as it's around what was expected (matching GTX 980 Ti performance). Personally I see this card as a pretty good product: low noise (pump noise in a case will be inaudible, though coil whine does worry me), no throttling (unless I missed that part), and good performance in the reviews. I won't be buying one, but it's still a great card considering it comes with a water pump as the reference cooler and matches the price/performance (mostly at 1440p+) of the GTX 980 Ti. Aside from the disappointing overclocking, which probably puts the GTX 980 Ti a bit ahead (only time will tell on that), it's a great card.

Bear in mind 2 things of course:
1: NVIDIA cards with Boost 2.0 all differ, and some boost well beyond stock clocks (not saying this one did, just pointing it out).
2: AMD cards have always (at least in the last 2 generations) gotten much better with time while NVIDIA's get worse (just look at reviews from the past till now; they eventually start beating the GTX 780 Ti).

I am not defending this as the greatest thing since sliced bread; however, I think we all blew our expectations up to astronomical levels, which is causing everyone to look at this only with disappointment instead of as a great stock-card alternative to the GTX 980 Ti. Each has its advantages, so we have to accept that; the only major disappointment is that this was supposed to completely dethrone the GTX Titan X, which it did not.


----------



## Nihilus (Jun 24, 2015)

Crap Daddy said:


> The hype is over. The only new card apart from rebrandeon to come out from AMD since October 2013 is, surprise, not the fastest single card on the planet. Otherwise it looks good but it doesn't have the right price.
> 
> If someone from NV would've come up to JHH with this expensive cooling solution for an average overclocking chip while affecting the almighty profit margins he would have fired him on the spot.



YES! Judging by the reviews from the other sites, this thing could easily have been cooled by a normal air cooler; it is NOT the 295X2. They would have saved $50 or so on the card.


----------



## GreiverBlade (Jun 24, 2015)

well, my 290 is still worth it... ahah, will wait a bit more before upgrading (as it's still a valid counterpart to a 970)


----------



## Rowsol (Jun 24, 2015)

So disappointed.


----------



## Luka KLLP (Jun 24, 2015)

Sure, it's a fine card, but I don't see any reason to buy this over a nice custom design GTX 980 Ti


----------



## TheGuruStud (Jun 24, 2015)

GhostRyder said:


> Well we finally have the review we wanted of the Fury X.  Honestly I am not disappointed in the card as it seems to be around what was expected (Matching GTX 980ti performance).  Personally I see this card as a pretty good product, low noise (Pump noise in a case will be inaudible though coil whine does worry me), no throttling (Unless I missed that part), and good performance in the reviews phase.  I won't be buying one, but its still a great card considering it comes with a water pump reference and matches the price/performance (Mostly 1440p+) of the GTX 980ti.  Besides the disappointing overclocking which probably put the GTX 980ti a bit ahead (time will only tell on that) its a great card.
> 
> Bear in mind 2 things of course:
> 1: NVidia cards with boost 2.0 all differ and some boost at stock speeds well beyond the clocks (not saying this one was just pointing it out).
> ...




It's a disaster. There's no sugar-coating it. These will be sitting on shelves as paperweights when volume ramps. The driver must be written by a 5th grader, and GameWorks' cheat... optimization finishes it off.

This isn't even an upgrade over my 290X. It's dog shit.


----------



## bpgt64 (Jun 24, 2015)

Considering the immaturity of the drivers, I would like to see what other OEMs will do with custom cooling. The cooling solution looks a little rushed.


----------



## Vulpesveritas (Jun 24, 2015)

I... can't believe just how much of a fail this is. By the specs and everything else, this thing should be so much better. My inner AMD fanboy is screaming right now.

Well, I guess I'll just wait a few months, see if the Nano is really better than a 390X, and get whichever of them as planned anyhow. I'm only gaming at 1080p, so as much of a stab from the hype as this is from AMD, yeah..


----------



## mirakul (Jun 24, 2015)

Luka KLLP said:


> Sure, it's a fine card, but I don't see any reason to buy this over a nice custom design GTX 980 Ti


A big reason is 'I hate Mr. Huang's face'.
NVIDIA has more money > buys devs > more GameWorks titles > more benchmark wins > more customers > more money; the vicious cycle continues. I doubt the 980 Ti still has that lead when GameWorks titles are excluded from the test bench: http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-5.html


----------



## mroofie (Jun 24, 2015)

Fluffmeister said:


> *So yeah, GTX 980 Ti FTW*.
> 
> Solid card regardless, I'm sure the peeps here that only go red will lap it up.


----------



## haswrong (Jun 24, 2015)

Caring1 said:


> I'll wait to see results using Win 10, then decide which is better, the 980Ti or the FuryX


is there any DirectX 12 game out there?


----------



## GreiverBlade (Jun 24, 2015)

TheGuruStud said:


> It's a disaster. There's no sugar coating it. These will be sitting on shelfs as paper weights when volume ramps. The driver must be written by a 5th grader and gameworks' cheat....optimization finishes it off.


a disaster ... funny 


Rowsol said:


> AMD done fucked it up.  So disappointed.


not really...


Luka KLLP said:


> Sure, it's a fine card, but I don't see any reason to buy this over a nice custom design GTX 980 Ti


well i can see some, altho i'd be torn between the 2, or ... just go for a 980 or a 390X if i need to replace my 290 ... who knows 

as for me, i am not really disappointed; remember all the rumors and funny posts ... true or not  
the card is good, no doubt, and the Titan X is untouched (who cares ... it costs $400 MSRP more than a Fury X or a 980 Ti) 
nvidia's brand new arch is still kept in check by a set of 2-year-old cards and 1 new SKU; for me that's enough.

if it lines up with the 980 Ti in price and in performance (2% under??? a big gap indeed) then all is technically right, no? if it were priced like a Titan X it would be different 



Vulpesveritas said:


> I.. Can't believe just how much of a fail this is.  By specs and everything else, this thing should be so much better.   My inner AMD Fanboy is screaming right now.
> 
> Well, I guess I'll just wait a few months and see if the Nano is really better than a 390X and get whichever of them as planned anyhow.  I'm only gaming at 1080p so as much of a stab from the hype this is from AMD, yeah..


a fail ... oh ... yes it's a fail ... and a big one indeed


----------



## Vayra86 (Jun 24, 2015)

GCN is actually showing more and more to be a jack of all trades and master of none.

I think that is the real culprit here. It is not efficient, and that limits the whole potential of AMD's GPU offerings. They are still just scaling up the die further and further like they have since the 7970, and Tonga was not enough to limit the inefficiency of this arch. With Tonga XL (Fiji) they are hitting limits: shader count no longer brings a 1:1 performance increase, which is also why overclocking yields virtually nothing.

This spells doom for any future AMD gpu unless they start fresh or totally revamp GCN. I even think a node shrink won't help them enough.

The difference in overclockability of Maxwell and Tonga/GCN is staggering. Maxwell is much more versatile, I think the comparison between a motorcycle and a fast car is the best one: Maxwell being the motorcycle that can accelerate faster than any car possibly could, even if they put a 4096 HP engine in that car.


----------



## mab1376 (Jun 24, 2015)

BF4 performance with MANTLE?


----------



## Joss (Jun 24, 2015)

TheGuruStud said:


> It's a disaster. There's no sugar coating it.



This.


----------



## TRWOV (Jun 24, 2015)

I think the increased shader count per CU isn't doing them any favors... I'm having VLIW5 vs. VLIW4 déjà vu.

AMD should have priced this about $50 lower, but I guess the cooling solution didn't leave them much wiggle room. Hopefully air-cooled versions will be cheaper.


----------



## Luka KLLP (Jun 24, 2015)

GreiverBlade said:


> or ... just go for a 980 or a 390X


That's what I'm gonna do


----------



## haswrong (Jun 24, 2015)

GreiverBlade said:


> ...
> if it line up with the 980Ti in price and in performance (2% under??? a big gap indeed) then all is technically right no? if it was priced like a Titan X it would be different
> ...


Titan X has 3x the VRAM and is faster while consuming less energy.


----------



## mroofie (Jun 24, 2015)

W1zzard said:


> every reviewer i talked to mentioned the pump noise. the coil noise is a separate noise depending on gpu load.



Slower than the 980 Ti.

AMD needs to step up, fast.


----------



## the54thvoid (Jun 24, 2015)

GhostRyder said:


> Bear in mind 2 things of course:
> 1: NVidia cards with boost 2.0 all differ and some boost at stock speeds well beyond the clocks (not saying this one was just pointing it out).



Yeah, but some boost stupidly high. Maxwell's perf lies in its higher clocks. The 1400+ MHz boosts really push 980 Tis to 15% improvements.



GhostRyder said:


> 2: AMD cards always (At least in the last 2 generations) have gotten much better with time while NVidia get worse (Just look at reviews from the past till now, they eventually start beating out GTX 780ti)



The 780 Ti was (tin foil hat required) hobbled to push the perceived perf of the 980 Ti. The 780 Ti was 'allowed' to get worse because NV had a faster card to sell. In other words, NVIDIA had a lot of room to work with to make their products look even better (against their own brand).

@W1zzard - On the topic of the scores: why does it falter so much at sub-4K res? I've never understood that. Obviously the 980 Ti must take a hit going 1080p > 1440p > 4K, but how does the Fury X do so badly at lower res?


----------



## Vulpesveritas (Jun 24, 2015)

GreiverBlade said:


> a disaster ... funny
> 
> 
> a fail ... oh ... yes it's a fail ... and a big one indeed



Compared to the hype I felt? Yes, yes it is.

I wanted to wake up this morning and squee about how amazing the Fury X was after I read the reviews.  I wanted it to be fast enough to justify my paying more for it despite my 1080p-ness.  

I wanted a Titan X Killer, and instead I saw a 980ti match.  

So yes, it was a fail from the hype-stab side of things.


----------



## FrustratedGarrett (Jun 24, 2015)

Vayra86 said:


> As many others, extremely disappointed by this release.
> 
> Also having a lot of trouble justifying the relatively high score in the review. 9.2 for this card? It is underwhelming across the board:
> 
> ...




It performs quite well in most games, and it's a lot quieter and cooler than the 980 Ti. Your post screams fanboy.


----------



## mroofie (Jun 24, 2015)

FrustratedGarrett said:


> The massive variation in the Fury's relative performance, especially how badly it performs in GameWorks titles is a real problem.
> The card seems to perform as well the TitanX in the neutral titles, but in titles like the Witcher 3 and Project Cars, it falls behind badly. In GTAV Nvidia does better but me thinks its due to their crappy HBAO+ implementation.
> 
> http://www.hardocp.com/article/2015...mage_quality_comparison_part_5/7#.VYqmJLYpnIU


Alien: Isolation and Battlefield 4 are all AMD titles; come on now.


----------



## mirakul (Jun 24, 2015)

the54thvoid said:


> Yeah but some boost stupidly high.  Maxwell's perf lies in it's higher clocks.  The 1400+Mhz boosts really push 980ti's to 15% improvements.
> 
> 
> 
> ...


It's the notorious draw call problem AMD has with DX11. That issue is bigger at lower res.


----------



## buggalugs (Jun 24, 2015)

I'm happy with the card's performance, but if the only option is the closed-loop water cooler, I'm not going to be buying it. I was hoping ASUS and MSI would make their own versions. Single 120mm radiators perform worse than a decent air cooler, and then you have to put up with pump noise, and more noise than a decent high-end air cooler. I don't know why AMD would limit their sales by releasing a water-cooling-only option.

I don't understand why people are so impressed with 60-degree gaming. I get that on my ASUS 290X DCUII.

It looks like HBM is the future, though, for high-res 4K displays and multi-monitor. With a few multi-monitor resolutions in the review, the Fury X would appear much better; it looks like the Fury X stretches its legs at high resolutions. There's enough performance and new tech to keep AMD relevant, especially for 4K users, which will be a lot of people now that 4K monitors are cheap, but I'm not getting one with a closed-loop cooler.

Most people willing to spend $600 on a graphics card will probably go 4K in the near future, making this card a serious option.


----------



## DarkOCean (Jun 24, 2015)

So we waited so long just for this? ... It's slower, more power hungry, has way less memory, and sucks at overclocking (on water!) while costing the same as a 980 Ti... so why would anyone buy this?


----------



## GhostRyder (Jun 24, 2015)

TheGuruStud said:


> It's a disaster. There's no sugar coating it. These will be sitting on shelves as paperweights when volume ramps. The driver must be written by a 5th grader, and GameWorks' cheat....optimization finishes it off.


How? Enlighten us how this card is a disaster.

1: It works
2: Runs cool
3: Priced in line with what it matches
4: Smaller than the competition (if that is your thing)



the54thvoid said:


> Yeah, but some boost stupidly high. Maxwell's perf lies in its higher clocks. The 1400+ MHz boosts really push 980 Tis to 15% improvements.
> 
> 
> 
> ...


 
Indeed, and that is what I was referencing: some boost without people noticing, which leads to variations in reviews across the web, but not everyone's does that.

Well, allowing it to get worse is still getting worse in my book, but that's up for a whole different discussion. Either way it's still worth noting, especially depending on how often you upgrade.


----------



## hero1 (Jun 24, 2015)

Thanks for another great review. I am simply shocked by the performance levels for anything less than 4K. This card has the potential to beat the 980Ti cleanly if they unlock the voltage and memory overclock. It's crazy that it can keep up with the Titan X in 4K. Looks like I'm sticking with my 290Xs in CF under water for the foreseeable future.


----------



## Vayra86 (Jun 24, 2015)

FrustratedGarrett said:


> It performs quite well in most games and it's a lot quieter and cooler than the 980 Ti. Your post screams fanboy.



Three examples in a benchmark suite of over 20 games is nothing to write home about. And this is only at 4K, the ONLY res at which it even marginally excels, while performing notably worse at anything lower than that.

Relative performance is where it's at. I don't care about individual titles and neither should you. You don't play three games on an enthusiast card, and you don't pick games based on their best performance on AMD or NVIDIA hardware either.


----------



## TheGuruStud (Jun 24, 2015)

I'm seriously not even arguing this BS. Look at the gd numbers!


----------



## mirakul (Jun 24, 2015)

@W1zzard Given that Gameworks has become a well known issue, could you please make another chart without Gameworks titles? (Project cars, Watch dogs, Witcher, etc..)


----------



## the54thvoid (Jun 24, 2015)

At least I know what my next purchase is now.  It's a shame, I was looking forward to going Red again. 



TheGuruStud said:


> I'm seriously not even arguing this BS. Look at the gd numbers!



Irony is - I've seen you defend AMD a lot.  Your words are open and honest.


----------



## BiggieShady (Jun 24, 2015)

TheGuruStud said:


> These will be sitting on shelves as paperweights when volume ramps


I sure hope that AMD is not ramping any volumes, they should tread carefully and make a single batch for now.


----------



## FrustratedGarrett (Jun 24, 2015)

Vayra86 said:


> Three examples in a benchmark suite of over 20 games is nothing to write home about. And this is only at 4K, the ONLY res at which it even marginally excels, while performing notably worse at anything lower than that.
> 
> Relative performance is where it's at. I don't care about individual titles and neither should you.



That's true, but what you forget is that we have both The Witcher 3 and Project CARS in this test suite, and AMD performs way worse in these games. GTA V is another title where AMD is doing worse, mostly due to NVIDIA's cheaper HBAO+ implementation in this game. 

http://www.hardocp.com/article/2015...mage_quality_comparison_part_5/7#.VYq3hbYpmUk


----------



## Assimilator (Jun 24, 2015)

I'm laughing at all the people who believed AMD's marketing hype.



mirakul said:


> @W1zzard Given that Gameworks has become a well known issue, could you please make another chart without Gameworks titles? (Project cars, Watch dogs, Witcher, etc..)



Oh get over your butthurt, the only "well known issue" is that AMD can't write drivers for s**t. W1zz uses a decent sample of popular game titles that isn't biased for or against any company. I guess you'd prefer it if he only tested AMD cards with Mantle games?


----------



## HD64G (Jun 24, 2015)

My humble opinion on the "Fury" subject is below

At 4K resolution (all other resolutions are too low for these monsters to have any problem playing at ultra settings, after all):

Taking ALL games of W1z's review into consideration, we have 10-10 for Fury X vs. 980 Ti (2 are draws)

By throwing the 3 totally imbalanced games (PC, WoW and Wolfenstein are too green) out of the equation, we have 10 wins for the red team and 7 for the green team. 

If we also drop the DR3 game (red seems to have balanced this one), we still have 9-7 for Fury X.

So, having a same-priced, lower-temperature, quieter GPU which is somewhat better at 4K, with drivers not yet matured, is not good enough to buy? Since when is logic craziness, and vice versa?

And the Nano will be a GREAT product if the price is right...


----------



## GreiverBlade (Jun 24, 2015)

Vulpesveritas said:


> Compared to the hype I felt?  Yes, yest it is.
> 
> I wanted to wake up this morning and squee about how amazing the Fury X was after I read the reviews.  I wanted it to be fast enough to justify my paying more for it despite my 1080p-ness.
> 
> ...


you can't have a Titan X killer... since it costs the same price as a 980 Ti...


----------



## mroofie (Jun 24, 2015)

darkangel0504 said:


> how about this ?
> *http://www.techpowerup.com/forums/threads/error-with-tpus-wolfenstein-benchmarks.213767/*
> http://tpucdn.com/reviews/AMD/R9_Fury_X/images/wolfenstein_1920_1080.gif


It's just one game, the rest are legit, and go look at other sites: same conclusion!! (GTX 980 Ti > Fury X)


----------



## Vayra86 (Jun 24, 2015)

FrustratedGarrett said:


> That's true, but what you forget is that we have both The Witcher 3 and Project CARS in this test suite, and AMD performs way worse in these games. GTA V is another title where AMD is doing worse, mostly due to NVIDIA's cheaper HBAO+ implementation in this game.
> 
> http://www.hardocp.com/article/2015...mage_quality_comparison_part_5/7#.VYq3hbYpmUk



The Witcher 3 is a non-issue because TPU tests without HairWorks and the performance falls right in line with expectations. The fact that AMD has trouble with their drivers is not CD Projekt RED's fault. Without HairWorks and HBAO+ the AMD cards still underperform. Project CARS, yes, true. GTA V: another non-issue; if NVIDIA makes do with lighter HBAO+, what is stopping AMD from bringing a similar solution?

Drivers are an important part of the purchase and AMD has a history of being late to the party or unable to tackle issues. This is not NVIDIA's fault. Frame pacing is a notorious example: it was only after the shitstorm was unleashed that AMD moved to driver optimizations. NVIDIA was ahead of the game, and generally still is. This is what customers recognize.

(By the way, recent Nvidia driver issues are also on my radar, don't get me wrong)

I simply call it like I see it. If AMD brings a killer product, I'll be the first to applaud them for it. They just don't, and they keep saying they do.


----------



## haswrong (Jun 24, 2015)

FrustratedGarrett said:


> It performs quite well in most games and it's a lot quieter and cooler than the 980 Ti. Your post screams fanboy.
> ...


ehm... who exactly plays games at 30fps? only consolists do.. we are not them though.. 4k perf is totally irrelevant these days.. (expensive, unplayable, unenjoyable = a horrible combo)


----------



## Joss (Jun 24, 2015)

See (hear) here for noise, minute 8 onwards.


----------



## mroofie (Jun 24, 2015)

FrustratedGarrett said:


> It performs quite well in most games and it's a lot quieter and cooler than the 980 Ti. Your post screams fanboy.


1.Only at 4K (It performs quite well in most games)
2.Water vs Air cooling

Only fanboy here is you


----------



## lilhasselhoffer (Jun 24, 2015)

On a brave new world, we're still treading water in the 28 nm node.


Seriously, is anyone greatly surprised?  The card competes with the highest-end offerings of Nvidia, at around the same price point.  As per usual, they tackled one issue (heat), but couldn't tackle anything else because the underlying chips are built on the same process we used 4 years ago.  


Objectively, HBM is interesting.  I'm not paying $650 for an interesting experiment.  
Objectively, the Fury was designed for 4K.  As I don't have a 4K monitor its value can't be realized by me.
Objectively, neither AMD nor Nvidia are winners here.  We've got another battle that is a toss-up.  Worst of all, it's a toss-up the consumer loses every time.



Thanks AMD, for proving what Nvidia already has.  Without huge investments, 28 nm is done.  We've moved beyond the point of gains equaling investment, and by the time investment could actually see gains we'll have a die shrink to shake things up.  Consider me waiting for the inevitable Arctic Islands vs. Pascal flame war.  This little show has burned itself out.


----------



## mroofie (Jun 24, 2015)

lilhasselhoffer said:


> On a brave new world, we're still treading water in the 28 nm node.
> 
> 
> Seriously, is anyone greatly surprised?  The card competes with the highest-end offerings of Nvidia, at around the same price point.  As per usual, they tackled one issue (heat), but couldn't tackle anything else because the underlying chips are built on the same process we used 4 years ago.
> ...


Wow ............


----------



## mirakul (Jun 24, 2015)

WAIT WAIT WAIT!!!!
My bad, really sorry.


----------



## W1zzard (Jun 24, 2015)

mirakul said:


> WAIT WAIT WAIT!!!!
> Why did you test the card with 15.5 beta??????
> The driver for FuryX IS 15.15 for godsake. This whole review means NOTHING at all.


----------



## GreiverBlade (Jun 24, 2015)

haswrong said:


> tit x has 3x more vram and is faster while consuming less energy.


yes... but I fail to see your point... as for power consumption, "ooohhh it will cost me more on my yearly electricity bill" (how much?). Also, the 980 Ti has 2 GB more (but GDDR5, not HBM, so the cost could well be the same).
So... what are you trying to say? That the card is a failure because it doesn't beat or scratch a Titan X while costing 400 less? Then the same argument should apply to the 980 Ti versus the Titan X.

OK, I am finished with this thread, enough is enough, and time will tell us more funny stories


----------



## Ferrum Master (Jun 24, 2015)

Could it be that at lower res the card actually starves for CPU performance? Whether it's crappy drivers or an architecture issue doesn't matter... it needs more CPU horsepower to drive it?


----------



## GhostRyder (Jun 24, 2015)

mirakul said:


> WAIT WAIT WAIT!!!!
> Why did you test the card with 15.5 beta??????
> The driver for FuryX IS 15.15 for godsake. This whole review means NOTHING at all.


The R9 300 series was on 15.15 (look again).



lilhasselhoffer said:


> On a brave new world, we're still treading water in the 28 nm node.
> 
> 
> Seriously, is anyone greatly surprised?  The card competes with the highest-end offerings of Nvidia, at around the same price point.  As per usual, they tackled one issue (heat), but couldn't tackle anything else because the underlying chips are built on the same process we used 4 years ago.
> ...


True, sad but true, as this round is not really much to speak of overall.

Either way, the card does exactly what it's supposed to. It's got great performance, lower power consumption (it's basically very close to the Titan X and 980 Ti) and comes with a great stock cooler. What do they need to do, wrap the card in solid gold to make it appealing at this point?


----------



## Ebo (Jun 24, 2015)

Thx for a great review w1z .

I'm not exactly jumping up and down in my chair right now.

Now all we can do is wait for the custom air-cooled R9 Fury and see how it ends up.

Some of it might have something to do with immature drivers, but that's NOT all. The noise from the pump and the performance at resolutions lower than 4K? That's the real deal breaker for me. NO thank you, not this time AMD.


----------



## FrustratedGarrett (Jun 24, 2015)

mroofie said:


> 1.Only at 4K (It performs quite well in most games)
> 2.Water vs Air cooling
> 
> Only fanboy here is you



Actually, I linked to the 4K charts by mistake. Here's Far Cry 4 at 2K: 





I don't see how you can call this card slow when it's on par with the Titan X in arguably the most visually appealing game of the year.


----------



## crazyeyesreaper (Jun 24, 2015)

the54thvoid said:


> Yeah, but some boost stupidly high. Maxwell's perf lies in its higher clocks. The 1400+ MHz boosts really push 980 Tis to 15% improvements.
> 
> 
> 
> ...



Most likely a ROP issue, @the54thvoid. They have 4096 shaders and lots of TMUs, but still 64 ROPs; at higher resolutions the shaders flex their muscle, but at low res the ROP count holds it back. That would be my guess. HBM also helps at higher res to an extent. NVIDIA meanwhile has a lot more ROPs and does well from low res up to high res, where the lesser memory bandwidth seems to make its performance fall back and AMD catches up across the board. Still only speculation, but AMD has had this issue before.

The 7770 scaled up to the 7870 was an exact doubling, but the 7870 scaled up to the 7970 was not: they kept the ROP count the same and scaled up everything else. So it seems to be a similar issue here as well.

They kept the same ROP count as the R9 290X / 390X but increased the shader and TMU count.
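That scaling claim can be sanity-checked with the commonly cited public unit counts for Hawaii (290X) and Fiji (Fury X); a rough sketch, with numbers taken from spec sheets rather than from this thread:

```python
# Back-of-the-envelope scaling check: Fiji (Fury X) vs. Hawaii (290X).
# Unit counts are the commonly cited public specs, not thread data.
hawaii = {"shaders": 2816, "tmus": 176, "rops": 64}
fiji = {"shaders": 4096, "tmus": 256, "rops": 64}

for unit in ("shaders", "tmus", "rops"):
    ratio = fiji[unit] / hawaii[unit]
    print(f"{unit}: {hawaii[unit]} -> {fiji[unit]} ({ratio:.2f}x)")
# Shaders and TMUs grew ~1.45x while ROPs stayed flat, which is
# consistent with the card only pulling ahead at shader-bound high
# resolutions.
```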


----------



## the54thvoid (Jun 24, 2015)

HD64G said:


> My humble opinion on the "Fury" subject is below
> 
> In 4K resolution (all other resolutions are too low for these monsters after all to have any problem playing in ultra settings):
> 
> ...



So, if we take out all the games that Nvidia does well in - AMD wins.  Your logic is... astounding.  As @W1zzard clearly stated - it all comes down to what you play but the spread of games is large and that gives a far better representation of real life performance.  What is more disturbing is the frame times of the Fury X.  The only other reviews so far I have read (Hexus and Tech Report) both make reference to the stutter it has.  It's delivering a poorer gaming experience, even in titles where it has a higher fps.

I'm sorry to those this may offend but AMD clearly said it was the fastest GPU in the world.  It's simply not.  And then OC versus OC, it's cleanly beaten. 

Is it a great card? Yes.  The best - No.

Argue away but you're scientifically proven wrong (when science for our purposes is a meta analysis of reviews).



FrustratedGarrett said:


> Actually, I linked to the 4K charts by mistake. Here's FarCry4 at 2K:
> 
> 
> 
> ...



About that:


----------



## newtekie1 (Jun 24, 2015)

FrustratedGarrett said:


> It performs quite well in most games and it's a lot quieter and cooler than the 980 Ti.




Quieter and cooler? Have you looked at the EVGA 980 Ti review?

980 Ti: Idle = 0 dB, Load = 35 dB
Fury X: Idle = 31 dB, Load = 32 dB

At idle the Fury X is one of the loudest cards ever tested!  And at load it is basically equal to the 980 Ti, close enough that the difference wouldn't be noticeable to the human ear.

As for cooler, it damn well better be, the thing is liquid cooled!  But the Fury X is still putting out more heat than a 980Ti.


----------



## Nihilus (Jun 24, 2015)

Does anyone know if this card is 1/4 DP?  It would be the small saving grace for this card.  AFAIK, Tahiti is still the king of double precision, and that's a card released 3.5 years ago.
Also, it seems that 4 GB of RAM did not hurt it at 4K in any of the reviews.


----------



## lilhasselhoffer (Jun 24, 2015)

mroofie said:


> Wow ............



You know, one-word responses are useless.  The more I read from you, the more often I see something like that.  Care to explain what your reaction actually means?


----------



## FrustratedGarrett (Jun 24, 2015)

the54thvoid said:


> So, if we take out all the games that Nvidia does well in - AMD wins.  Your logic is... astounding.  As @W1zzard clearly stated - it all comes down to what you play but the spread of games is large and that gives a far better representation of real life performance.  What is more disturbing is the frame times of the Fury X.  The only other reviews so far I have read (Hexus and Tech Report) both make reference to the stutter it has.  It's delivering a poorer gaming experience, even in titles where it has a higher fps.
> 
> I'm sorry to those this may offend but AMD clearly said it was the fastest GPU in the world.  It's simply not.  And then OC versus OC, it's cleanly beaten.
> 
> ...



No, you're acting like a shill and now you're giving me an irrelevant chart that doesn't even include the product in question.


----------



## RCoon (Jun 24, 2015)

FrustratedGarrett said:


> you're giving me an irrelevant chart that doesn't even include the product in question.



He's showing that a custom 980 Ti (the air-cooled Gigabyte card) is faster than the (water-cooled) Fury X at this game you're pushing as graphically wonderful


----------



## FrustratedGarrett (Jun 24, 2015)

newtekie1 said:


> Quieter and cooler? Have you looked at the eVGA 980Ti review?
> 
> 980Ti Idle=0db Load=35db
> Fury X Idle=31db Load=32db
> ...



This could be a fan control issue: the fan is not slowing down when the card is idle, so it's a matter of a driver fix. This doesn't change the fact that the Fury X is competitive with the Titan X/980 Ti while being at least better cooled and quieter at load.


----------



## Frick (Jun 24, 2015)

Frick said:


> It'll be a divider for sure and provide flaming adult discourse for months to come.



See, my prophecy from mere hours ago has proven true!



the54thvoid said:


> I'm sorry to those this may offend but AMD clearly said it was the fastest GPU in the world.  It's simply not.  And then OC versus OC, it's cleanly beaten.



If you close one eye and only look at certain graphs it's the fastest.


----------



## btarunr (Jun 24, 2015)

lilhasselhoffer said:


> You know, one-word responses are useless.  The more I read from you, the more often I see something like that.  Care to explain what your reaction actually means?



Leo Tolstoy's _War and Peace_ in Doge.


----------



## the54thvoid (Jun 24, 2015)

Frick said:


> If you close one eye and only look at certain graphs it's the fastest.



I literally LOL'ed.  



FrustratedGarrett said:


> No, you're acting like a shill and now you're giving me an irrelevant chart that doesn't even include the product in question.



And to you I say begone.  Calling me a shill is fine, I don't care.  I'm not one but I don't need to prove it.  I'll enjoy playing with my 980 Ti Classy when I can get one and a block.  You can enjoy that toy where you try to push shapes through holes.  Remember to match them up properly now.


----------



## nem (Jun 24, 2015)

Clearly AMD reduced the power consumption... o__________o

http://tpucdn.com/reviews/AMD/R9_Fury_X/images/power_average.gif

http://tpucdn.com/reviews/AMD/R9_Fury_X/images/power_peak.gif


----------



## FrustratedGarrett (Jun 24, 2015)

the54thvoid said:


> I literally LOL'ed.
> 
> 
> 
> And to you I say begone.  Calling me a shill is fine. I don't care.  I'm not one but i don't need to prove it.  I'll enjoy playing with my 980ti Classy when i can get one and a block.  You can enjoy that toy where you try to push shapes through holes.  Remember to match them up properly now.



I don't care about what you enjoy doing with your 980 Ti... why are you sharing all this information with me? The subject of discussion here is not your 980 Ti... Facepalm


----------



## mroofie (Jun 24, 2015)

lilhasselhoffer said:


> You know, one-word responses are useless.  The more I read from you, the more often I see something like that.  Care to explain what your reaction actually means?


Look at the line that's bold, it's not rocket science


----------



## the54thvoid (Jun 24, 2015)

FrustratedGarrett said:


> I don't care about what you enjoy doing with your 980TI... why are you sharing all this information with me? The subject of discussion here is not your 980TI... Facepalm



I like to share.  Sorry 'Shill'.  I'm telling you because yours is the first post in this thread (I think) to name call someone - me, a 'shill'.  So I throw a comment back at you.  My tongue in cheek response being a remark that the 980ti Classy (being dual BIOS and therefore fun to flash) will fly like a rocket under water.  It will pump green mist all over Fury X.  I'm throwing a tech stone because you're being a tool by saying I sound like a 'shill' for speaking the truth.  My irrelevant graph is directly under your quoted graph so both graphs can be compared.  If you want more fire, the G1 980ti OC's a further 13% in W1zz's review (so about 90 fps versus [on OC level of 5%] 73 fps).

Your crazy defensive stance is not required.  Only the idiots are laughing at AMD.  I'm saddened it's not thumping Maxwell.  But fact is, it seems that when pushed on OC (and this is most def a tech forum of nerds) the 980ti rips Fury X apart.

Sorry.


----------



## Jborg (Jun 24, 2015)




----------



## Absolution (Jun 24, 2015)

Vayra86 said:


> GCN is actually showing more and more to be a jack of all trades and master of none.
> 
> I think that is the real culprit here. It is not efficient, and that limits the whole potential of GPU offerings with AMD. They are still just scaling up the die further and further like they did since the 7970, and Tonga was not enough to limit the inefficiency of this arch. With Tonga XL (Fiji) they are hitting limits, shader count brings no 1:1 performance increase anymore, which is also why overclocking yields virtually nothing.
> 
> ...



Well it's good for LuxMark at least lol, but then again 4GB is a problem for 3d rendering


----------



## _BARON_ (Jun 24, 2015)

This card should've been at an MSRP of $500 or $550; then it would've made sense. It costs as much as a GTX 980 Ti and offers less performance than a stock GTX 980 Ti. Just terrible.

From what I've seen overclocks are shit.

The best card on the market now seems to be the EVGA GTX 980 Ti SC+ 6 GB @ $650


----------



## INSTG8R (Jun 24, 2015)

I guess what I take away from this is that AMD has now at least cracked the HBM implementation, so it should put them one up on NV when it comes to Arctic Islands vs. Pascal. For AMD it will be 2nd gen and for NV 1st gen (okay, they will both most likely be using HBM2, but you get my point)


----------



## Kyuuba (Jun 24, 2015)

For the first time I am feeling worried about AMD...


----------



## Ikaruga (Jun 24, 2015)

They also "forgot" to cool the VRMs, and since it's a closed enclosure, it heats up quite nicely:





(source)

I wonder how fast these "top of the line" cards will die in the hands of enthusiasts


----------



## Sihastru (Jun 24, 2015)

They didn't, there's a copper pipe dedicated to that very task.


----------



## Mistral (Jun 24, 2015)

Not quite what we hoped for, but not too shabby either. Will wait and see where the Nano ends up.


----------



## nickbaldwin86 (Jun 24, 2015)

LOL, AMD fanbois crying at AMD's fail right now... sorry, had to.

I really REALLY wanted this to be a WIN for AMD... because I really wanted the prices to drop on the 980 Tis... 

NVIDIA could raise the prices and still get sales now... thanks AMD, you have had epic amounts of fails in the past week/month/year

I guess I will just sit back and be grateful with what I got.


----------



## SirEpicWin (Jun 24, 2015)

I'm utterly disappointed, I shouldn't have waited this long ........sigh 
But can someone explain these numbers? Just how in the world is it slower than the 980 Ti, even by a small percentage?
Fury X
*Shading Units:* 4096
*TMUs:* 256
*ROPs:* 64
*Compute Units:* 64
*Pixel Rate:* 67.2 GPixel/s
*Texture Rate:* 269 GTexel/s
*Floating-point performance:* 8,602 GFLOPS 

980Ti
*Shading Units:* 2816
*TMUs:* 176
*ROPs:* 96
*SMM Count:* 22
*Pixel Rate:* 96.0 GPixel/s
*Texture Rate:* 176 GTexel/s
*Floating-point performance:* 5,632 GFLOPS
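Those spec-sheet rates follow directly from unit counts and clocks; a quick sketch of the arithmetic, assuming the 1050 MHz Fury X clock and the 980 Ti's 1000 MHz base clock (no boost), and counting an FMA as 2 FLOPs:

```python
# Derive the spec-sheet throughput figures from unit counts and clocks.
def rates(shaders, tmus, rops, clock_mhz):
    return {
        "gpixel_s": rops * clock_mhz / 1000,       # pixel fillrate
        "gtexel_s": tmus * clock_mhz / 1000,       # texture fillrate
        "gflops": 2 * shaders * clock_mhz / 1000,  # FP32, FMA = 2 FLOPs
    }

fury_x = rates(shaders=4096, tmus=256, rops=64, clock_mhz=1050)
gtx_980_ti = rates(shaders=2816, tmus=176, rops=96, clock_mhz=1000)
print(fury_x)      # {'gpixel_s': 67.2, 'gtexel_s': 268.8, 'gflops': 8601.6}
print(gtx_980_ti)  # {'gpixel_s': 96.0, 'gtexel_s': 176.0, 'gflops': 5632.0}
```

The Fury X wins big on raw shader and texture throughput but gives up nearly a third of the pixel fillrate, which matches the pattern posters here are describing at lower resolutions.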


----------



## TheGuruStud (Jun 24, 2015)

Better be a secret 96 rop version...  What a bunch of losers.


----------



## Ikaruga (Jun 24, 2015)

Sihastru said:


> They didn't, there's a copper pipe dedicated to that very task.


I used quote marks with the "forgot" word, but seriously: it's 104C under heavy load with the dedicated copper pipe?


----------



## buildzoid (Jun 24, 2015)

Ikaruga said:


> They also "forgot" to cool the VRMs, and since it's a closed enclosure, it heats up quite nicely:
> 
> 
> 
> ...



104C is fine. The MOSFETs are rated at 75A continuous drain current at 125C and there's 6 of them. The caps are all tantalums so those should be OK with it too.

@W1zzard where are your thermal camera images? I remember you had them for some GPU reviews, right?


----------



## L'Eliminateur (Jun 24, 2015)

What a monumental failure.
I too wanted this to be the next "Core 2 Duo", but nope, AMD has been doing nothing but failures for the past YEARS.

This is the Faildozer of GPU releases for AMD: a ton of hype and shit for a result.

Why in God's name did they release this on 28 nm? This should be 16 nm!

And if these are the results for the X, then the Fury and the Nano will be complete disappointments and won't be competitive with the 970 at all.

NVIDIA's HQ must be dancing with joy right now. I guess I'll get a 970, no point in waiting. AMD won't have anything for more than a year (maybe when they rebrandeon the Fury and do a die shrink of it), and then we will have Pascal, which -so far- looks like an interesting GPU with a paradigm shift PLUS HBM, and you know it will be crazy fast


----------



## ensabrenoir (Jun 24, 2015)

Outback Bronze said:


> Its great to *finally *see a review. Puts all those conspiracies to rest. Thanks wiz.



......no.......I'm sure Nvidia is responsible for this.........we go now to start the underground resistance........
Seriously though, if AMD had been able to get this out a year or two earlier it would have really mattered.   Now, when NVIDIA goes HBM2 next year......


----------



## slick530 (Jun 24, 2015)

The 980 Ti pretty much destroys the Fury X in every single benchmark. It's misleading when Richard Huddy and Lisa Su claim that the Fury X is the fastest card in the world. The Titan X still holds the crown. The only advantage I see with this card is the small form factor and its ability to fit in an HTPC. But then again, without HDMI 2.0 it just doesn't make sense and isn't an option. AMD still hasn't nailed this, as much as the hype surrounding it makes people believe so. It's a decent card but honestly isn't worth $650 given how it performs. $599 would make it a more appealing card to new buyers.


----------



## lilhasselhoffer (Jun 24, 2015)

GhostRyder said:


> ...
> True, sad but true, as this round is not really much to speak of overall.
> 
> Either way, the card does exactly what it's supposed to. It's got great performance, lower power consumption (it's basically very close to the Titan X and 980 Ti) and comes with a great stock cooler. What do they need to do, wrap the card in solid gold to make it appealing at this point?



Yes and no.

I'm neither an AMD lover nor an Nvidia lover.  What I need both of these companies to do is actively smack their fabs around and get a new node up and running.  I'm not buying twice-reheated leftovers, and especially not at the price of a fresh gourmet meal.  It doesn't honestly matter what minor incremental improvements Fury and the 980 Ti brought to the table when the 7970 I already own only gives up a minor amount of performance.


I, like most people, am a cheap ba####d.  If it costs 200% more than I've already invested for a 40% increase in performance, my happy backside is punching out.  If AMD wants to sell the Fury to me they'll have to give it a good air cooler, allow me to trade my 7970 in, and only charge $200.  Granted, that proposition is insane; likewise, the proposition of spending enough to build a new system on a single card of dubious improvement is equally insane.


----------



## haswrong (Jun 24, 2015)

So Tom's Hardware says the average idle power consumption is 4.8 W. Why does TPU say it's 20 W? *confused look*


----------



## KainXS (Jun 24, 2015)

It's looking pretty bad for AMD. On top of pretty much rebranding nearly every card in the 300 series from previous series, the Fury X is not performing as well as they thought, it seems, and that's kind of to be expected since this is the first release of HBM. When Lisa said this was the fastest card in the world, boy did she get that wrong, and there's almost no overclocking potential vs. what they said at the event. Very, very disappointing, and they should drop the price as they're asking too much right now. I still want a Nano though, to be honest.


----------



## LAN_deRf_HA (Jun 24, 2015)

You know, it's really similar to politics. No matter how many times a side gets burned, they're always ready to get their hopes up again next election cycle.


----------



## Ikaruga (Jun 24, 2015)

SirEpicWin said:


> I'm utterly disappointed, I shouldn't have waited this long ........sigh
> But can someone explain these numbers? Just how in the world is it slower than the 980 Ti, even by a small percentage?


If you want to compare numbers, compare the pixel rate; Maxwell is simply better.  (Note: not many talk about it here, but AMD's GCN will perform better under DX12; the higher number of units will help a lot there.)



buildzoid said:


> 104C is fine. The MOSFETs are rated at 75A continuous drain current at 125C and there's 6 of them. The caps are all tantalums so those should be OK with it too.


Yeah, they are fine alone, but look how large an area of the PCB is dark reddish; add the extra heat when the backplate is back in place and tell me that's healthy in the long run if you push the card a lot (let alone if you want to OC it heavily).


----------



## HD64G (Jun 24, 2015)

SirEpicWin said:


> I'm utterly disappointed, I shouldn't have waited this long ........sigh
> But can someone explain these numbers? Just how in the world is it slower than the 980 Ti, even by a small percentage?
> Fury x
> *Shading Units:* 4096
> ...



The most likely factors are the ROPs and, even if only for a few %, the immature drivers.


----------



## the54thvoid (Jun 24, 2015)

Will the trolls please stop calling it a fail.

It's far superior to the 290X in every way.  Yes, AMD hyped it too much at their PR call, but it's still a damn good card.

Just not what we had all hoped for.


----------



## steen (Jun 24, 2015)

Nihilus said:


> Does anyone know if this card is 1/4 DP?


DP is 1/16.
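For context on that 1/16 figure, a rough comparison of what the ratios imply; the FP32 numbers and the 1/4 and 1/8 ratios for the consumer Tahiti and Hawaii cards are the commonly cited public specs, not figures from this thread:

```python
# FP64 throughput implied by each chip's FP64:FP32 ratio (consumer cards).
# FP32 figures and ratios are public spec numbers; rough comparison only.
cards = {
    # name: (fp32 GFLOPS, fp64 ratio)
    "HD 7970 (Tahiti)": (3789, 1 / 4),
    "R9 290X (Hawaii)": (5632, 1 / 8),
    "Fury X (Fiji)": (8602, 1 / 16),
}
for name, (fp32, ratio) in cards.items():
    print(f"{name}: {fp32 * ratio:.0f} GFLOPS FP64")
# Despite the much lower FP32 rate, Tahiti's 1/4 ratio still leaves it
# with the highest FP64 throughput of the three, which is the point
# Nihilus raised above.
```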


----------



## revin (Jun 24, 2015)

These pre-release cards just could be a sandbag..............just saying.
It's very possible AMD could release a very fine-tuned unit to market................
Or pull off a price drop, since the reviewers have all found similar faults.

Please don't bother flaming my post


----------



## TheGuruStud (Jun 24, 2015)

revin said:


> These pre-release cards just could be a sandbag..............just saying.
> It's very possible AMD could release a very fine-tuned unit to market................
> Or pull off a price drop, since the reviewers have all found similar faults.
> 
> Please don't bother flaming my post



Too late. Your flame suit can't handle what's coming.


----------



## revin (Jun 24, 2015)

Thank you @W1zzard


----------



## ...PACMAN... (Jun 24, 2015)

In no way is this a damn good card; it loses consistently on price, performance, looks, noise, etc. A waste of time... even at a cheaper price point I still wouldn't consider it. It looks horrible. 980 Ti it is, then, for my new build.


----------



## mroofie (Jun 24, 2015)

SirEpicWin said:


> I'm utterly disappointed, I shouldn't have waited this long... sigh.
> But can someone explain these numbers? Just how in the world is it slower than the 980 Ti, even by a small percentage?
> Fury x
> *Shading Units:* 4096
> ...


The GTX 980 Ti has a higher pixel rate and more ROPs.



Ikaruga said:


> They also "forgot" to cool the VRMs, and since it's a closed enclosure, it heats up quite nicely:
> 
> 
> 
> ...


The pcb is going to MEELLLTTTT NOOOOOOOO 



dwade said:


> Price drop or go home. Better yet, sell ATI to Intel or Samsung.


Neither of them, unless you want to see your house on fire, Samsung especially; when they fail, they fail badly.

Intel would probably screw us even more than Nvidia.



TheGuruStud said:


> Samsung didn't become #1 by failing.


Yes, but their products do fail; some forum searching and the occasional news story will confirm it.



...PACMAN... said:


> In no way is this a damn good card, it loses consistently in price, performance, looks, noise etc. A waste of time......at a cheaper price point, I still wouldn't consider it. It looks horrible. 980Ti it is then in my new build.


I was hoping that Fury would destroy nvidia offerings 



ensabrenoir said:


> ...no problem bruh.......we got your back
> 
> View attachment 66013 View attachment 66014



Fanboys jo 



newtekie1 said:


> The idle sound is largely due to the annoying pump whine. Adjusting the fan is not going to get rid of pump whine.



Don't feed the fanboys  





horik said:


> Was expecting more, i hope Nano will do better for it`s price range.


I doubt it, since the Nano might cost $450, especially if it's battling the GTX 970.


----------



## SonicZap (Jun 24, 2015)

What I'm wondering is what this means for upcoming AMD GPUs. To me it seems most of the efficiency improvement Fury has over Hawaii comes from two things: 1) HBM, which Nvidia will have next year as well, and 2) the lower operating temperature thanks to the liquid cooler. In practice, that means GCN's efficiency hasn't improved much at all outside the memory controller.

If Pascal further improves the efficiency from Maxwell (other than what the change to 14/16nm will bring), I can see AMD falling behind by a lot, losing more market share, and eventually dying as their revenue keeps shrinking.

For the card itself, it's mediocre. It's not a Bulldozer-class flop, but it'll need a price drop to compete with the 980 Ti. AMD earlier mentioned that Fiji is within a few square mm of as large as it can be, because the interposer that connects the GPU and HBM has a maximum size that cannot be exceeded. Taking that into account, AMD likely just wasn't able to fit more ROPs, so they designed the card strictly for 4K. I think AMD simply adopted HBM too early: more transistors would have pushed the die past the interposer's limit.

All in all, I'm worried what this means for the future of PC gaming. Nvidia has already demonstrated that they will price their products very high if they're capable (see GTX Titan and 780 and its sudden price drop from $650 to $500 when R9 290X was released), and AMD isn't going to stop bleeding money with products like these. Maybe they'll pull off another HD 4xxx series miracle in 2016 or 2017, but right now, I'm expecting them to become as irrelevant in graphics as they've become in CPUs.


----------



## dwade (Jun 24, 2015)

Price drop or go home. Better yet, sell ATI to Intel or Samsung.


----------



## TheGuruStud (Jun 24, 2015)

mroofie said:


> neither of them unless you want to see you're house on fire especially samsung
> When they fail they fail badly
> 
> Intel will probably screw us more than nvidia



Samsung didn't become #1 by failing.


----------



## ensabrenoir (Jun 24, 2015)

revin said:


> These Pre-Release card's just could be a sandbag..............just saying.
> It's very possable AMD could release a very fine tuned unit to market................
> Or pull off a price drop since the reviewer's all have found similar faults
> 
> *Please Don't bother just flaming my post*



...no problem bruh.......we got your back


----------



## newtekie1 (Jun 24, 2015)

FrustratedGarrett said:


> This could be a fan control issue; the fan is not slowing down when the card is idle, so it's a matter of a driver fix. This doesn't change the fact that the Fury X is competitive with the Titan X/980 Ti while being at least better cooled and quieter under load.



The idle sound is largely due to the annoying pump whine. Adjusting the fan is not going to get rid of pump whine.


----------



## horik (Jun 24, 2015)

Was expecting more; I hope the Nano will do better for its price range.


----------



## Aquinus (Jun 24, 2015)

And as expected: HBM was an interesting experiment, but at the same cost as a 980 Ti, with the requirement of finding a place to mount an AIO water cooler and sub-par overclocking compared to the 980 Ti, I think it's safe to say that GPU prices aren't going anywhere and that my suspicion that HBM wouldn't make a night-and-day difference seems to be true. I suspect it's more bandwidth than the GPU can utilize, and we're probably not seeing huge differences because memory wasn't the bottleneck yet.

Either way, unless I see a really good reason, I'm being pushed toward the green camp, and I'm a little bummed that the hype was (once again) overstated. Not that this isn't an interesting experiment; it's just a costly one, and I would rather spend that money on something a little more proven while the technology evolves. Clearly AMD has some tweaking to do.


----------



## FordGT90Concept (Jun 24, 2015)

So how much are the numbers going to change with future driver updates?


----------



## ensabrenoir (Jun 24, 2015)

horik said:


> Was expecting more, i hope Nano will do better for it`s price range.




^^.....this. The Nano will be the real jewel, if they price it right.


----------



## TRWOV (Jun 24, 2015)

SirEpicWin said:


> I'm utterly disappointed, I shouldn't have waited this long... sigh.
> But can someone explain these numbers? Just how in the world is it slower than the 980 Ti, even by a small percentage?
> Fury x
> *Shading Units:* 4096
> ...




I guess we're seeing a repeat of VLIW5 vs VLIW4. For Fiji, AMD increased the number of shaders per CU. My best guess is that at 4K those extra shaders are optimally fed, and once the resolution goes down they're just idling... otherwise the sub-4K numbers don't make sense.

Also, NV shaders aren't built the same way as AMD shaders, so you can't make a straight comparison.


----------



## the54thvoid (Jun 24, 2015)

FordGT90Concept said:


> So how much are the numbers going to change with future driver updates?



It's irrelevant; only DX12 matters now. Nvidia has shown it can make driver improvements too - in fact, some believe Nvidia intentionally hobbles drivers to hold back performance. The launch is the bona fide revelation that the Fury X failed to do what it was meant to do: destroy Maxwell. This is bad. If it gets better over time - and it will - so too will Maxwell. By the time DX12 games are commonplace, Arctic Islands and Pascal will also be out - round infinity - bing, bing - the fight goes on.


----------



## qubit (Jun 24, 2015)

Yet another disappointing top model from AMD. Guess I was expecting too much from them. 

So we have a card that costs the same as a 980 Ti yet generally performs worse, and it comes watercooled by default with no other option, while NVIDIA beats them on air and with slower memory technology.

On top of that, all that quietness is ruined by coil noise and a pump that emits a seriously annoying high-pitched whine. The noise alone is a dealbreaker for me and many others.

WTF?!

For the watercooling and noises in particular, doesn't anyone over there see these things and say "Hey wait a minute, we can't release a product in this state"? Guess not. Fail.


----------



## Joss (Jun 24, 2015)

SonicZap said:


> Earlier AMD mentioned that Fiji is almost as large (within a few square mm) as it can be, because the interposer has a specific maximum size that cannot be exceeded. Taking that into account, AMD likely just wasn't able to put more ROPs on the card, so they decided to design it strictly for 4K.* I think AMD just adopted HBM too early*, and so they weren't able to put more transistors on the chip because they would've exceeded the maximum size of the interposer that connects the GPU and HBM.



You may have hit the nail on the head.
We've been presented with a deluge of strategic errors from AMD these last few years, from the Piledriver architecture with its cores sharing FPU and cache to Fury's HBM and its probable limitations, with more in between.
Add to that tactical errors like the 290/290X release weeks before custom coolers were available, and you have a company directed by idiots.


----------



## esrever (Jun 24, 2015)

Something is seriously messed up with this card. It has no OC scaling, and its performance is downright pathetic given all the hype. I guess the watercooler adds some value, but I hope the Fury will be better when it's $100 cheaper than this.


----------



## WaroDaBeast (Jun 24, 2015)

I don't know what all of the people calling it a failure expected. Don't you guys remember Bulldozer? It was overhyped and ended up being a mixed bag. The scenario for the Fury X is about the same. Did any of you expect that much from a 28 nm chip? I mean, seriously now...

Anyhow, we'll see how third-party models fare. Plus, there might be driver updates that make things better. I'm not calling Fury a superb card by any means -- just saying that's the sort of thing that usually happens. History has a nasty habit of repeating itself.





btarunr said:


> Leo Tolstoy's _War and Peace_ in Doge.



Dude... You. Just. Made my day. xD


----------



## qubit (Jun 24, 2015)

Joss said:


> You may have hit the nail on the head.
> We've been presented with a deluge of strategic errors from AMD this last few years. From Piledriver architecture with cores sharing FPU and cache to Fury's HBM and its probable limitations, with more in between.
> Add to that tactical errors like the 290/290x release weeks before the custom coolers where available  *and you have a company directed by idiots.*


Couldn't have said it better myself.


----------



## semantics (Jun 24, 2015)

newtekie1 said:


> The idle sound is largely due to the annoying pump whine. Adjusting the fan is not going to get rid of pump whine.


Yeah, it will; turn it up enough and you'll no longer hear the pump.


----------



## Frick (Jun 24, 2015)

qubit said:


> For the watercooling and noises in particular, doesn't anyone over there see these things and say "Hey wait a minute, we can't release a product in this state"? Guess not. Fail.



Or they realized it was a problem too late and had to release it anyway (well, close to it), and boy did they need to release it.


----------



## Octopuss (Jun 24, 2015)

Wasn't this supposed to be THE card? Looks like fail to me. Worse than 980Ti in every possible aspect.


----------



## soulsore (Jun 24, 2015)

I'm (was?) a fan of AMD; my first video card was an ATI Rage, then an ATI 9600 Pro, HD 4870, and HD 7870. I was really, really, really hoping for Fury to be better than the 980 Ti, but if this is true... I'm sad to say it, but I think I will switch to Nvidia now...


----------



## FordGT90Concept (Jun 24, 2015)

the54thvoid said:


> It's irrelevant.  Only DX12 is relevant now.  Nvidia has shown it can do driver improvements too - in fact it's considered that Nvidia intentionally hobbles drivers to hold back performance.  The launch is the bona fide revelation that Fury X failed to do what it was meant to do - Destroy Maxwell.  This is bad.  If it gets better over time - as it will, so too will Maxwell.  By the time DX12 games are common place, Arctic Islands and Pascal will also be out - Round infinity - bing, bing - the fight goes on.


Sure, NVIDIA could make improvements too, but Fiji is new silicon. Are the first drivers (ones that aren't even publicly downloadable) really going to be the best? As others have said, DX12 is irrelevant "now" because there are no games that use it.


----------



## SIGSEGV (Jun 24, 2015)

The ROP count on this card is unbelievable; it has a similar ROP count to the 290X/390X (wtf!!). Doubling the 290X's shader count doesn't improve this card significantly, and I suspect an insufficient number of ROPs is the reason (outside of 4K benchmarks).
This card is hard to recommend for people with monitor resolutions under 4K (~14% gap to the 980 Ti at 1080p). OC capability on this card is also horrible for now, at only ~10%; still, I'd like to know the result if AMD unlocked the OC restrictions. On the one hand it's a sad story, but on the other it makes me happy that I can save a pile of money while waiting for their 14nm GPUs.

Thanks for the review.


----------



## Vayra86 (Jun 24, 2015)

Aquinus said:


> ...and as expected, while HBM was an interesting experiment, but at the same cost as a 980 Ti, the requirement of having to find a place to put a AIO water cooler, and sub-par overclocking performance in comparison to the 980 Ti, I think that it's safe to say that GPU prices aren't going anywhere and that my suspicious of HBM not making night and day difference seems to be true. I suspect it's more bandwidth than the GPU can utilize and we're probably not seeing huge differences because memory probably wasn't a bottleneck yet.
> 
> Either way, unless I see a really good reason, I'm getting pushed towards the green camp and I'm a little bummed that the hype was (once again,) overstated. Not to say that this isn't an interesting experiment, it's just a costly experiment that I would rather spend on something a little more proven while the technology evolves. Clearly AMD has some tweaking to do.





Joss said:


> You may have hit the nail on the head.
> We've been presented with a deluge of strategic errors from AMD this last few years. From Piledriver architecture with cores sharing FPU and cache to Fury's HBM and its probable limitations, with more in between.
> Add to that tactical errors like the 290/290x release weeks before the custom coolers where available  and you have a company directed by idiots.



Precisely what I've been saying these past years about every AMD slide that came past screaming awesomeness, and about every forum user here who said HBM was going to change the world. It's nothing new: AMD has new tech, AMD markets it as the future, and the future seems very far away on release of said tech. They have fundamental timing and time-to-market issues, fundamental marketing shortcomings, and they develop products ahead of their prime.

My most optimistic expectation was a 10% gain over a stock 980 Ti, and even with a max overclock the Fury won't get near that.


----------



## BiggieShady (Jun 24, 2015)

I wonder how much power consumption reduction came from water cooling. I'd love to see if it leaks current like crazy at 80 C and to what extent power gating is helping.


----------



## Folderu (Jun 24, 2015)

Wow, just wow... all these comments filled with hate. Now I wish AMD didn't exist; it would be nice to see where these guys would throw their hate then.
If you just read the comments before the review, you'd think this card gets 15 fps while the 980 Ti gets 999999; that's how much hate this card gets, and I don't understand why, since it runs all the games smoothly.
Ok, there are some numbers at lower resolutions, but who the hell notices 125 fps vs 119 at 1080p? And at 4K the differences are sometimes fractions of a frame.
The only downside I see to this card is the price; it should be cheaper, but not by much. This is a solid card compared to Nvidia's top offerings.


----------



## Fluffmeister (Jun 24, 2015)

W1zzard said:


> I'm waiting for ASUS, which should arrive in the next weeks. MSI arrives Monday, was stuck in customs.



Cool, looking forward to the MSI GTX 980 Ti Gaming review, that's the badboy I've got my eye on.

I'm sure it's gonna be much of a muchness with the Gigabyte G1, but still... oh it's pretty.



Folderu said:


> Wow, just wow...  All those comments filled with hate, now i wish amd did not exist, it would be nice to see the guys here where will they throw their hate at.
> If you just read the comments before the review you will think that this card is like 980ti - 999999 fps vs Fury x 15 fps, thats how much hate this card gets and i dont understand why, it runs all the games smooth.
> Ok there are some numbers at lower resolutions but who the hell notices 125 fps vs 119 at 1080p and at 4k the reviewer needs to add commas to fps numbers to show direferences sometimes.
> The only downside i see to this card is the price, it should be cheaper but not by much, this is a solid card when compaired to nvidia top offerings.



Doesn't help with all the ridiculous hype, either from AMD directly or from their more fanatical followers, of which there are many.

Anyway, it swings both ways; the reviews of cards like the GTX 480 and the endless bile they generated are still there for all to read. Hell, AMD even spent money mocking that card with videos.

So yeah, what goes around comes around.


----------



## Vayra86 (Jun 24, 2015)

Folderu said:


> Wow, just wow...  All those comments filled with hate, now i wish amd did not exist, it would be nice to see the guys here where will they throw their hate at.
> If you just read the comments before the review you will think that this card is like 980ti - 999999 fps vs Fury x 15 fps, thats how much hate this card gets and i dont understand why, it runs all the games smooth.
> Ok there are some numbers at lower resolutions but who the hell notices 125 fps vs 119 at 1080p and at 4k the reviewer needs to add commas to fps numbers to show direferences sometimes.
> The only downside i see to this card is the price, it should be cheaper but not by much, this is a solid card when compaired to nvidia top offerings.



You would be right in a utopian world where the enthusiast segment cannot be overclocked at all and enthusiasts shell out money for subpar offerings (oh wait, they do... Titan?). With all cards at stock, the Fury X doesn't come out *too* badly. But we live in a world where overclocking has become part of the business model, especially for GPUs, and where the direct competitor offers overclocking headroom of AT LEAST 10% above stock performance, and much more in most cases. And we live in a world where performance is not JUST about FPS, but maybe even more about _frame pacing._ Take a good long look at the TechReport review and you will see the Fury X reintroduce pacing issues we last saw in 2013. Sure, drivers, but this is the way the cards lie now, and AMD has already released a Fury X driver.

All things considered, this card simply has no compelling USPs (unique selling points), and the ones it does have are cosmetic at best (water cooling, small form factor). In a market where you are the underdog, that is NOT enough.


----------



## Lionheart (Jun 24, 2015)

What a huge disappointment! The framerates between resolutions aren't consistent, which has me guessing... it needs driver updates. God damn it AMD, I want to like and support you, but you make it hard sometimes. Not that this is a bad product at all; it's good, yet your own hype killed it. Then again, I hyped myself up for it too, so in the end I'm just as guilty. The 4K results are good, and 1440p is pretty good too, but at anything lower it feels like the GPU isn't being utilized properly. Sigh... I might just wait for the Fury, the Nano, or better driver support, but I have to say a GTX 980 Ti looks more enticing to me now. I seriously don't want to go back to Nvidia, though; I'm sick of their GameWorks BS in games.


----------



## Steevo (Jun 24, 2015)

Immature drivers and biting off a huge project seem to have bitten them in the ass. I hope they bring the performance up with driver optimizations, a maturing BIOS, and perhaps some further tuning of the clock speeds.


While it is cool to see them bring new tech to the table, my fear here is the long-standing failure in delivery. They need to get on with polishing this quickly or they will lose another generation of profit, whether through Mantle, DX12, or even going back to the original GCN hardware (Tonga is slower than Tahiti per clock/shader); this chip is only ~40% faster than the 7970 at the same clock speed despite being twice the card!

So I guess if they can squeeze performance to something more level with drivers, and then get the clocks higher, they can still win this.


----------



## Casecutter (Jun 24, 2015)

It was a valiant effort... but it's not enough; hardly more than a few fingers' grip on the ledge.

I suppose it puts pressure on pricing... but even then, not really. I could imagine AIB customs of the 980 Ti bumping up in price given this. After seeing this, I think AMD might've re-juggled pricing on this launch to $630; they probably know they can sell everything they have planned at this price. If there's return on investment and suitable profit in this, isn't that all that matters right now?

It just feels as though there have been compromises, shoestring budgets, and bad decision-making that leave AMD seemingly shooting their knees off at every turn. AMD is just mismanaged, to the point that it needs to be unburdened from the board of directors, the debt, and the quarterly reports. They need the deep pockets of a daddy Warbucks; that's perchance the only way to get back in the game without clawing back inch by inch with no room for a slip-up... a tough haul on limited R&D. Either way, they need to work unencumbered, not trying to manage the bludgeon of the stock price, just nose to the R&D grindstone.

If unfettered, hold to an austere tack of maybe 3-4 hardware projects and keep to a strict but achievable course with little or no fanfare: the product launches, promote its merits, and move on. Time to market is all-important; quash the rumor mill. Offer hard timelines, with zilch as to "promises". For example, say you have Zen / Arctic Islands and give the market the quarter each will commence. Don't provide any information you can't conclusively achieve; no "in theory", just facts; don't oversell. Nothing at all from rogue executives (get rid of most of them), and present hardly anything years out. Get better at controlling the message, and hopefully with that, fewer overblown rumors. If something comes out, true or not, defuse it ASAP: call it unsubstantiated, say you don't speak to unreleased products, and note that you continue to develop multiple R&D proposals.

To me, that's how Intel's playbook reads, albeit at a bigger scale and longer term. That's who/what they are; AMD should stay true to who they are now.


----------



## Zakin (Jun 24, 2015)

Man, this almost reminds me of a repeat of the Bulldozer hype, into disappointment and depression. At least it's not that disappointing in actual performance, but seriously, AMD's marketing (hype) team needs to talk to the engineers about what is actually coming out in the end. Looks like my brother is without a doubt going to a 980; he's been sick of problems with his AMD card anyway, but I still wanted to give them another shot. I always like an underdog.


----------



## birdie (Jun 24, 2015)

*Folderu,*

I don't know about other people, but I hoped the Fury X would usher in a new round of healthy competition in the GPU industry, and AMD somewhat failed in that regard (considering its price/performance/power draw/etc. at the moment). The Fury X is a *wonderful* GPU/architecture, I'd even say an engineering marvel considering its power efficiency (compared to previous AMD products), but it still failed to live up to expectations. That's the real issue here. Had it been priced correctly (let's say $550 or less), there would be less hatred.

And those who trash AMD are simply complete idiots. We've already reaped the fruits of missing competition in the CPU market, and now many feel it's becoming an issue in the GPU market as well.


----------



## SonicZap (Jun 24, 2015)

birdie said:


> The Fury X is a *wonderful* GPU/architecture, I'd even say a engineering marvel, considering its power efficiency (as compared to previous AMD products)


90% of that improved power efficiency likely comes from HBM (which Nvidia is going to get with Pascal) and the much lower operating temperature compared to their earlier high-end GPUs. The Fury X runs at 50 °C; Hawaii runs at 95 °C. That makes a huge difference, since power efficiency always goes down as temperature goes up. I don't see improvements in GCN itself, and that worries me. The architecture was very competitive with Kepler, but it seems AMD is either focusing everything on Arctic Islands or... sitting on Tonga's GCN and hoping it'll compete for the next decade, while Maxwell already kills it brutally.
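The temperature effect is real: a common rule of thumb is that silicon leakage current roughly doubles for every ~10 °C rise. Under that (deliberately simplified, illustrative) assumption, the gap between a 50 °C Fiji and a 95 °C Hawaii looks like this:

```python
# Toy model: relative leakage scales as 2^(dT / 10 C).
# Ignores voltage and process differences -- illustration only, not a measurement.
def relative_leakage(t_hot_c, t_cold_c, doubling_step_c=10):
    return 2 ** ((t_hot_c - t_cold_c) / doubling_step_c)

ratio = relative_leakage(95, 50)  # Hawaii at 95 C vs Fiji at 50 C
print(f"~{ratio:.1f}x the leakage at Hawaii's temperature")
```

Even if leakage is only a modest fraction of total board power, a ~20x swing in that component explains why the cold-running liquid-cooled card looks so much more efficient without any architectural change.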


----------



## Vayra86 (Jun 24, 2015)

You know that if AMD priced this piece of hardware at $550 they would instantly turn profit into loss. The amount of hardware and tech in this product is way out of line compared to the price points and hardware of every other card on the market: 4096 shaders (8.9 billion transistors!), expensive HBM, the R&D that went into bringing it all into this form factor, and a liquid-cooled package. Any price drop will hurt them, no doubt. Compare that to Nvidia's top cards and the difference in 'metal used for performance' is staggering. The logical question is how they ever expected to turn a solid margin on this card, even if it WAS a halo product. Now, with it not living up to that standard, they have essentially shot themselves in the foot.

Basically, AMD is already forced into serious optimization of the card if they ever want it to be profitable. Once again I fail to see a solid business perspective on this release; where the fuck is it?


----------



## Batou1986 (Jun 24, 2015)

Rest in pieces AMD


----------



## Joss (Jun 24, 2015)

They should just have produced the chips and left the PCBs and coolers to the partners.


----------



## Absolution (Jun 24, 2015)

What if we look at it this way: for the hardcore 4K gamer, the Fury X is a good alternative to the Titan X. It does manage to undercut Nvidia's crème de la crème, in the sense that the Titan X is only about 6% faster for roughly twice the price.












i.e., at 4K it *does* manage to beat the Titan X as the better-value alternative.
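In rough perf-per-dollar terms (launch MSRPs of $649 for the Fury X and $999 for the Titan X, with the ~6% 4K lead cited above), the comparison can be sketched like this:

```python
# Normalize Fury X 4K performance to 1.0 and compare value per dollar.
fury_price, titan_price = 649.0, 999.0
fury_perf, titan_perf = 1.00, 1.06  # Titan X ~6% faster at 4K

fury_value = fury_perf / fury_price
titan_value = titan_perf / titan_price
advantage = fury_value / titan_value  # ~1.45x
print(f"Fury X delivers ~{advantage:.2f}x the 4K performance per dollar")
```

Of course, the same arithmetic against the $649 GTX 980 Ti comes out roughly even, which is the comparison that actually matters.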


----------



## the54thvoid (Jun 24, 2015)

Not to rub salt in any wounds, but if Nvidia deigns to release a full-core GM200 (Titan spec) with 6 GB of RAM, an unlimited TDP (within reason), adjustable voltage, and lets partners do the shroud and cooling work, the granddaddy of all gaming GPUs would take some beating at 28nm. Given what we've seen the Fury X do now, surely we all understand that for this generation the chips are down, and everyone can make up their minds. AMD needs to take this architecture forward and find it the room to go faster.
Kepler was realistically limited to 1100-1200 MHz, but Maxwell is up to what, 1500 MHz? Hawaii was 1100 (pushing it) and Fiji is the same. If AMD can tame the variables and get clocks like Maxwell's, it'd piss all over it.
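Using the rough ceilings quoted above (illustrative figures from the post, not measured data), the clock-headroom gap is easy to put a number on:

```python
# Rough clock-headroom comparison from the figures cited in the post.
maxwell_max_mhz = 1500  # typical sustained OC ceiling for Maxwell
fiji_max_mhz = 1100     # Hawaii/Fiji practical ceiling

headroom_pct = (maxwell_max_mhz / fiji_max_mhz - 1) * 100  # ~36%
print(f"Maxwell's clock ceiling sits ~{headroom_pct:.0f}% above Fiji's")
```

A ~36% frequency deficit is enormous for chips on the same 28nm node, which is why taming Fiji's clock behavior matters more than adding units.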


----------



## hero1 (Jun 24, 2015)

Ikaruga said:


> I used quote marks with the "forgot" word, but seriously: it's 104C under heavy load with the dedicated copper pipe?



They could have made things easier with a custom air cooler that took heat away from every part of that PCB equally. They seriously screwed that up. Like the rest of the people here, I love competition, performance, and better pricing. I'll probably go 4K in a few months, but even then I will consider going the Nvidia route once again (I have bought from both sides equally).


----------



## NC37 (Jun 24, 2015)

Why all the disappointment? It runs pretty decently for what I was expecting, though it's too pricey for what it does, IMO. Either way, the Fury Nano is where things will likely really shine. If not, AMD has just sunk themselves with one of the worst rebrands ever.


----------



## the54thvoid (Jun 24, 2015)

Absolution said:


> What if we look at it this way. For the hardcore 4K gamer, the Fury X is a good alternative to the Titan X. It does manage to kill nvidia top of the creme, in the sense the Titan is about 6% faster for for roughly twice the price.
> 
> 
> 
> ...



Just no.






By your own argument, the similarly priced 980ti beats the Fury X.


----------



## LogitechFan (Jun 24, 2015)

Ok, so basically it's a huge meh and is DOA at this price. RIP AMD, it was nice having you around.


----------



## SASBehrooz (Jun 24, 2015)

Like I thought and said... I knew it would not beat the 980 Ti. A big fail for AMD. Does this card really deserve a 9.2 rating? Then why isn't the 980 Ti a 10?
Just compare the specifications. Kidding me? 384-bit vs 4096-bit, lol.

Just love my 3.5 GB 970. WP Nvidia.


----------



## semitope (Jun 24, 2015)

If this were an Nvidia card, the sentiment would be the other way around. It's sad but true that AMD has to do a lot more to be seen positively. Nvidia can get away with so much; even if they lie, nobody cares.

The Fury X is a solid card. It's a new card with a new architecture and a new VRAM technology, yet it's frequently on par with, sometimes under, sometimes above the more mature Maxwell 2 cards.

The biggest issue is simply the $649 price, but most of those bashing it would be fine paying more for Nvidia cards that perform worse than cheaper cards (which they did).

The performance will only get better over time, and it could shine with DX12, so I'd call it anything but a failure.


----------



## Petey Plane (Jun 24, 2015)

Folderu said:


> Wow, just wow...  All those comments filled with hate, now i wish amd did not exist, it would be nice to see the guys here where will they throw their hate at.
> If you just read the comments before the review you will think that this card is like 980ti - 999999 fps vs Fury x 15 fps, thats how much hate this card gets and i dont understand why, it runs all the games smooth.
> Ok there are some numbers at lower resolutions but who the hell notices 125 fps vs 119 at 1080p and at 4k the reviewer needs to add commas to fps numbers to show direferences sometimes.
> The only downside i see to this card is the price, it should be cheaper but not by much, this is a solid card when compaired to nvidia top offerings.



Check out the frame-time benchmarks on a couple of other sites. They are considerably worse for the Fury (compared to the 980 Ti). The cards may have the same average FPS in a lot of games, but Nvidia's delivers much smoother gameplay. This is partially a driver issue, but AMD has always struggled with frame times. Unfortunately, AMD appears to have dropped the ball with the Fury X. Here's hoping that the follow-up, when HBM is more mature, can actually compete. As it is now, there is no reason to buy the Fury over the 980 Ti.

All that being said, i'm really looking forward to the Nano.  If it can put down 980 numbers in a form factor similar to the ITX 970s, then AMD will have a legitimate (although niche market) winner.

Also, lol at anyone who buys the Fury X in the first month.  It will inevitably be $550 to $600 a few weeks from now.  They'll have to, otherwise only AMD Fanbois and system integrators will buy them.


----------



## 64K (Jun 24, 2015)

AMD just doesn't have the resources to top Nvidia. There's something going on with AMD but they're denying it at this point. It may just be shit rumors.

http://arstechnica.com/business/2015/06/amd-weighing-a-business-break-up-or-spin-off-reuters-says/


----------



## the54thvoid (Jun 24, 2015)

semitope said:


> If this were an nvidia card the sentiment would be the other way. It's sad but true that AMD has to do a lot more to be seen in the positive. Nvidia can get away with so much. Even if they lie nobody cares
> 
> The Fury X is a solid card. It's a new card with a new architecture and a new VRAM technology, yet it's frequently on par with, sometimes under, and sometimes above the more mature Maxwell 2 cards.
> 
> ...



But it's not an Nvidia card.  It's AMD.  They said it was the best card in the world.  Their pre-release benchmarks were evidently lies.  This is like a cult whose leader says the coming comet will bring death, and when nobody dies the cult leader says - "your faith has saved you". 

There are assholes here saying it's fail.  It's obviously not.  It's a damn fine card - make no mistake.  The main issue is it was touted and hyped, and hyped and hyped as a Titan X slayer (more by fans).  The problem for the Fury X is......

...

...

the GTX 980 Ti is better.  Nvidia is allowing custom variants with silly clocks and better coolers.  Now, if AMD can (for technical reasons) allow AIBs to play with the Fury X PCB, then maybe this card will grow wings and kick ass properly.  But we don't know for now.  So yeah, the card has failed to meet the fan-based perceived objective.  Deal with it.

Let's hope they allow the board partners to work their magic on it.


----------



## yeeeeman (Jun 24, 2015)

So, everyone is taking a guess at what the causes of this situation might be. Really, things are simple if you look carefully at the specs. Quoting Hexus on this:
1. Titan X has *6,611* GFLOPS of SP compared to *8,602* for the Fury. So here is the first offset.
2. Titan X has, ready for it, *207* GFLOPS of DP compared to *537* for the Fury. But both of them are under what the 290X is capable of (739 GFLOPS DP). So here is the second offset.
3. The ROPs are for sure the key element, and here I will quote the folks at techreport:
4. HBM memory is wider but clocked lower

"In other respects, including peak triangle throughput for rasterization and pixel fill rates, Fiji is simply no more capable in theory than Hawaii. As a result, Fiji offers a very different mix of resources than its predecessor. There's tons more shader and computing power on tap, and the Fury X can access memory via its texturing units and HBM interfaces at much higher rates than the R9 290X.

In situations where a game's performance is limited primarily by shader effects processing, texturing, or memory bandwidth, the Fury X should easily outpace the 290X. On the other hand, if gaming performance is gated by any sort of ROP throughput—including raw pixel-pushing power, blending rates for multisampled anti-aliasing, or effects based on depth and stencil like shadowing—the Fury X has little to offer beyond the R9 290X. The same is true for geometry throughput"

So, they increased efficiency by removing some of the DP hardware compared to Hawaii. Still, they didn't do as much of a cut-down as NVIDIA did with Maxwell. Maxwell is so efficient because it's relieved of the DP hardware; basically, that is how NVIDIA got its efficiency jump from Kepler to Maxwell, so it's not magic. The second part is indeed related to the ROPs, and I guess they could've taken a bit from the shaders (3584 instead of 4096) and raised the number of ROPs to 96. Well, they could've, but they didn't.
Now, the HBM part is tricky. This must be handled at the driver level, and we know that AMD is a master guru when it comes to optimizing drivers. Only time will tell.

In conclusion, NVIDIA got rid of most of the unimportant bits and used those savings (which are huge, btw) to increase graphics horsepower. So, kudos to NVIDIA for doing this; they always pursue what's best even if that means losing something else. Still, considering all these things, I think AMD has done a very good job. 
Also, stop looking at the maximum power consumption of this card. If you read the description of that test, you'll see it's just a Furmark run, which doesn't mean anything. The power consumption you should follow is the gaming peak or the average.
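
For what it's worth, the SP/DP numbers quoted above fall out of simple arithmetic: shaders × clock × 2 FLOPs per clock (FMA), scaled by the DP:SP ratio. A quick sketch, with clocks and ratios assumed from public spec sheets (1050 MHz and 1/16 DP for Fiji, ~1075 MHz and 1/32 for GM200, 1050 MHz and 1/8 for Hawaii):

```python
def peak_gflops(shaders, clock_mhz, dp_ratio):
    """Peak throughput: each shader retires one FMA (2 FLOPs) per clock."""
    sp = shaders * clock_mhz * 2 / 1000.0  # GFLOPS, single precision
    return sp, sp * dp_ratio               # double precision is a fixed fraction

# (shader count, assumed boost clock in MHz, assumed DP:SP ratio)
specs = {
    "Fury X (Fiji)":    (4096, 1050, 1 / 16),
    "Titan X (GM200)":  (3072, 1075, 1 / 32),
    "R9 290X (Hawaii)": (2816, 1050, 1 / 8),
}
for name, (shaders, clock, ratio) in specs.items():
    sp, dp = peak_gflops(shaders, clock, ratio)
    print(f"{name}: {sp:,.0f} SP / {dp:.0f} DP GFLOPS")
```

These land on or very close to the Hexus figures (8,602 SP / ~537 DP for Fiji, ~6,611 / ~207 for GM200 at its exact boost clock, 739 DP for Hawaii), which is why the gap is pure spec-sheet arithmetic rather than mystery.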

Cheers!


----------



## Petey Plane (Jun 24, 2015)

the54thvoid said:


> But it's not an Nvidia card.  Its AMD.  They said it was the best card in the world.  Their pre-release benchmarks were evidently lies.  This is like a cult whose leader says the coming comet will bring death and when nobody dies the cult leader says - "your faith has saved you".
> 
> There are assholes here saying it's fail.  It's obviously not.  It's a damn fine card - make no mistake.  The main issue is it was touted and hyped, and hyped and hyped as a Titan X slayer (more by fans).  The problem for the Fury X is......
> 
> ...



I'm not sure they will be able to help.  I haven't seen a review that was able to get more than a 100 MHz overclock out of the Fury X; most seem to be in the 50 MHz range, and that's with a 500-watt-capable AIO cooler.  I can't imagine a WindForce or Strix cooler will do better.  Also, memory overclocking is locked down at the hardware level. 

I'm guessing that the standard, air cooled Fury will be the one that the AIBs get to put their spin on, and not the Fury X.  Which shouldn't be an issue, because my understanding is the standard Fury has all the same silicon, it will just thermally throttle much sooner than the X.  We'll see. 

Also, you're right.  All of AMD's benchmarks, and those "leaked" ones from Chinese forums, were completely bogus.

I'm really looking to the Nano to restore some faith in AMD.  I'd love to get ≈ 980 performance in a SFF case like the PC-Q30 (the little curved upright ITX case with the window)


----------



## Nihilus (Jun 24, 2015)

Tahiti is 1024 GFLOPS in DP while Tonga is 207 GFLOPS in DP.  We all saw a huge efficiency jump when they relieved Tahiti of this hardware (sarcasm).
This was not the time for cool experiments by AMD; it was a time to restore consumer confidence.  They failed, IMHO.


----------



## mouacyk (Jun 24, 2015)

yeeeeman said:


> The second part is indeed related to the ROPS and I guess they could've taked a bit from the shaders (3584 instead of 4096) and raise the number of ROPS to 96. Well, they could've but they didn't.



According to AnandTech, AMD went with a 65nm interposer.  Could they have chosen a 32nm interposer process that  allowed more ROPS to be added?  Would a simple (but more costly) upgrade in the process like that deliver the hype of this GPU?  65nm is what my Q6600 processor was 8 years ago!


----------



## Petey Plane (Jun 24, 2015)

mouacyk said:


> According to AnandTech, AMD went with a 65nm interposer.  Could they have chosen a 32nm interposer process that  allowed more ROPS to be added?  Would a simple (but more costly) upgrade in the process like that deliver the hype of this GPU?  65nm is what my Q6600 processor was 8 years ago!



The interposer is not the same as the GPU die.  The ROPs are on the GPU die, which is 28nm.  The interposer is more like a substrate that the GPU die and memory sit on (roughly speaking).


----------



## yeeeeman (Jun 24, 2015)

mouacyk said:


> According to AnandTech, AMD went with a 65nm interposer.  Could they have chosen a 32nm interposer process that  allowed more ROPS to be added?  Would a simple (but more costly) upgrade in the process like that deliver the hype of this GPU?  65nm is what my Q6600 processor was 8 years ago!


As Petey before me said, the interposer is not an issue. This was solely a design choice and I guess it has its reasons, but we can only guess what those reasons were.


----------



## DarkOCean (Jun 24, 2015)

Has anyone found a review that shows VRAM usage on this card?


----------



## Nihilus (Jun 24, 2015)

horik said:


> Was expecting more, i hope Nano will do better for it`s price range.



Just where in the hell will the Nano and Fury Pro fit in?  The Fury X costs $650 while the 390X 8 GB goes for around $400.  Here is the problem: the Fury X is only 10-20% faster than a mildly clocked 390X (and the Fury X doesn't overclock much at all).  So then what?  The Nano will be 5% better than the 390X while the Fury Pro is 15% better.  Price-wise, I am certain they will be closer to the Fury X due to HBM costs, so price/performance will suck with them as well.

Then there are those who plan on setting up CrossFire, trifire, or even quadfire.  HAHA, can you imagine four of these in a case - what a nightmare!
Not to mention it will only be 4 GB of VRAM.  On the other hand, three or four 8 GB 390Xs or even a pair of 295X2s would be a lot more manageable (and cheaper)
for those who want to game beyond 4K.
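
Using the figures in the post above (the prices and performance deltas are this post's rough estimates, and the Pro/Nano prices are pure guesses, not measured or announced data), the perf-per-dollar gap is easy to put in numbers:

```python
# Relative performance normalized to a 390X = 1.00; prices in USD.
# All figures are this post's estimates; Pro/Nano entries are hypothetical.
cards = {
    "R9 390X 8 GB":     (1.00, 400),
    "Fury X":           (1.15, 650),  # "10-20% faster", taking the midpoint
    "Fury Pro (guess)": (1.15, 550),  # hypothetical price
    "Nano (guess)":     (1.05, 500),  # hypothetical price
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf units per $1000")
```

Even with generous guesses, every HBM-based card lands well below the 390X on performance per dollar, which is the point above.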


----------



## Hood (Jun 24, 2015)

No point in being disappointed about Fury X, the real performance of any flagship video card has NEVER lived up to the pre-release hyperbole.  If you are surprised you may be guilty of wishful thinking...what, you thought maybe they'd suddenly decide to underplay the hype just this once?  Not likely!


----------



## Yolokila (Jun 24, 2015)

I have been waiting a while to see the Fury X do well, but this is a disappointment. I have had the 290X, 780 Ti and 980 Ti. I am no fanboy, but AMD, you guys are terrible. False claims, poor performance, and various issues such as pump noise.

I cannot even justify a FURY X over a 980 ti. There is no comparison lol.

Cons of the Fury X:
- 4 GB limit. Yeah, it's HBM, but it is still a limit; anyone who does not believe that is stupid.
- Pump noise / coil whine (yeah, NVIDIA has it too, but on the 980 Ti and Titan X they addressed the coil whine issues - it is much less than on the 970s and 980s) / audible fan at idle. Overall these are completely ridiculous issues.
- POOR drivers, as shown by the TPU review; if you don't believe it you are stupid. AMD, I am sorry, but you live in a beta-driver world. Why can you not release real drivers? WTF are AMD's programmers doing?
- A 100 MHz overclock is an "overclocker's dream"? SERIOUSLY, AMD, WTF. The fact that my reference 980 Ti currently runs 1450 MHz core and 8.1 GHz memory at 110% TDP leaves me speechless at AMD's claim. And if I watercool the 980 Ti with an EK block for 89 quid more, the Fury X has no chance - 1500 MHz easy.
- Watercooling - still hitting 105°C on the VRM is terrible. I think AMD actually overclocked the Fury X to compete with the 980 Ti, since the 980 Ti release was unexpected. As some have said, I think AMD planned this release expecting only a GTX 980 and Titan X. AMD is too slow, and that in itself has killed off this product launch.

Sorry for gunning for you, AMD, but these are FACTS. Too bad I'm not a reviewer, so you cannot discipline me like you did KitGuru, lol muhahaha


 LMAO OMG


----------



## SonicZap (Jun 24, 2015)

Nihilus said:


> Then there are those that plan on setting up crossfire, trifire, or even quad fire.  HAHA can you imagine 4 of these in a case - what  nightmare!


That's not really an issue because trifire and quadfire don't make sense to begin with.

Fury Pro and Nano might still prove interesting, but after this launch I'm not expecting much from them. The Pro version will likely have somewhat better performance for the price (Pro versions of AMD GPUs have almost always been price/perf winners), while Nano will be used to milk money from the gamers who want a powerful GPU in a small form-factor case (ITX).


----------



## Yolokila (Jun 24, 2015)

Considering the shambles with AMD's Fury X claims, I doubt their Fury Nano claims will be any less false. You will probably be better off with those smaller NVIDIA 960s.



m1dg3t said:


> Is 4k @ 60Hz not possible via DP? HDMI sucks anyways. Goddamn HDCP.
> 
> How is the compute capability of this chip? Is it still capable or have they crippled/limited functionality, a la nVidia, in order to gain 'efficiency'?
> 
> If these were queried previously, disregard as I didn't read all 9/10 pages...



Yeah, but HDMI is usually what you use if you are connecting to a 4K TV. The Fury X does not support HDMI 2.0, so it cannot do 4K at 60 Hz over HDMI.


----------



## btarunr (Jun 24, 2015)

mouacyk said:


> According to AnandTech, AMD went with a 65nm interposer.  Could they have chosen a 32nm interposer process that  allowed more ROPS to be added?



What!? Interposer lithography has nothing to do with ROP count. Just reading that analysis gave me nausea.

What AMD did wrong, in my opinion, is lazily reuse 1x the front-end of Tonga and 2x its back-end, while merely doubling the CU count. There's a bottleneck hidden somewhere in there.

They thought memory would bail them out in the end. It did, to an extent. Just not enough.

So they're selling a "new" engine with intercooled variable turbine turbocharger, but with an oldschool 2VperC/SOHC cast-iron block underneath it.


----------



## BiggieShady (Jun 24, 2015)

btarunr said:


> So they're selling a "new" engine with intercooled variable turbine turbocharger, but with an oldschool 2VperC/SOHC cast-iron block underneath it.


It had to come to a car engine analogy eventually, didn't it?  It's like Godwin's law for tech forums.


----------



## TRWOV (Jun 24, 2015)

The price seems to imply that Fury X availability will be somewhat low. If AMD had priced it $50 or so less, these would be flying off shelves, so I guess they're trying to recoup as much cash as possible from sales.

Nano seems to be the GPU to watch.


----------



## RadFX (Jun 24, 2015)

btarunr said:


> What!? Interposer lithography has nothing to do with rop count. Just reading that analysis gave me nausea.
> 
> What AMD did wrong, in my opinion, is lazily reused 1x the front-end of Tonga and 2x its back end, while merely doubling CU count. There's a bottleneck hidden somewhere there.
> 
> ...



Bad analogy. My old-school non-variable-turbo, SOHC, cast-iron-block Daytona regularly put the boots to higher HP/torque cars with DOHC, VVT and other new tech.

I don't see any reason to purchase AMD's new GPUs. The prices are outrageous (Nvidia's too), and all I'd have to do is pick up another R9 290 to get R9 295X2-like performance... which is cream of the crop right now.


----------



## hero1 (Jun 24, 2015)

TRWOV said:


> The price seems to imply that Fury X availability will be somewhat low. If AMD had priced it $50 less or so these would be flying off shelves so I guess they're trying to recoup as much cash as posible from sales.
> 
> Nano seems to be the GPU to watch,



With the way things are, I wouldn't hold my breath for the Fury Nano. I'd more likely look to the non-X version with custom PCB designs, and even then you're better off with whatever you've got, unless they somehow, magically, improve the card before launch, i.e., tweak the ROPs to 96 or more. Dreamland.


----------



## BiggieShady (Jun 24, 2015)

Speaking of PCBs


----------



## hero1 (Jun 24, 2015)

BiggieShady said:


> Speaking of pcb-s
> View attachment 66021



ROPs, ROPs

That's one sexy dual-GPU PCB for sure. Let's hope they go back to the drawing board, make some changes to the ROP count, lower the shader units, and let the Fury out of the dungeon lol


----------



## Casecutter (Jun 24, 2015)

the54thvoid said:


> They said it was the best card in the world. Their pre-release benchmarks were evidently lies.


Agreed, they need to learn to muzzle the executives who go rogue and say that crap.  I think Lisa Su said something to that effect; PR should watch such talking points from her. 

As to their 4K benchmarks, I believe those were more part of the launch kit for "accepted typical" performance; they were not meant to circulate in the rumor mill as an AMD consumer proclamation, which is how most presented them.  That said, I would absolutely hope someone independently verifies and re-creates their results.  AMD provided all the details and settings, so I think it would be fair for a reviewer to reveal them as either truth or lies.

It may be that so many reviewers just run max/ultra to provide reusability of the data down the road and call it a day.  It looks as though AMD is showing what is presentable today in terms of real or usable FPS with various adjustments.  Honestly, if those charts are considerably different from other identically configured setups, I'd really like to see the deviation.  Sure, you might never perfectly match their charts, but the deviation should not be that great, while the consistency of the trend should hold. 

Honestly, I think AMD might get right in there with the 980 Ti at 4K, especially if more playable settings are presented and the results aren't aided by proprietary algorithms (or, yes, ones more copasetic to AMD).  Some of the reviews are showing only a couple of FPS difference and seem to spar back and forth.


----------



## Nihilus (Jun 24, 2015)

So, time to speculate on the Fury Pro now?  We can make some educated guesses.  I say slightly better than GTX 980 performance at $550, and most likely worse price/performance than the GTX 980.

If you must go Team Red, the 295x2 is now $660 - a card that beats the Titan X in every benchmark.

Hell, one of the board partners could cook up an updated 7990 based on the 380x with more vram and AIO cooling that would match this for less money.

It feels like we are going back in time with these guys.


----------



## btarunr (Jun 24, 2015)

Nihilus said:


> So time to speculate on the Fury pro now?  We can make some educated guesses.  I say slightly better than
> GTX 980 performance at $550.  Most likely worse $/performance than the GTX 980



You'll hear more about Fiji Pro only in July.


----------



## N3M3515 (Jun 24, 2015)

the54thvoid said:


> Just no.
> 
> 
> 
> ...



*BEATS???*
2% is a beating?? 50 fps vs 51 fps, for example, LOL; it is the same performance.

I think this card should be priced at $600 just because of the poor oc potential and medium 1440p perf.


----------



## semantics (Jun 24, 2015)

FordGT90Concept said:


> Sure NVIDIA could do improvements too but Fiji is new silicon.  Are the first drivers out (ones that aren't even publically downloadable) really going to be the best?  As others have said, DX12 is irrelevant "now" because there are no games that use it.


You're talking about a few percent improvement for most end users. The only large driver improvements I've ever seen are on dual-GPU single cards. Otherwise you're hoping it will finally be on par with the 980 Ti, which still makes it underwhelming.


----------



## Fluffmeister (Jun 24, 2015)

N3M3515 said:


> *BEATS???*
> 2% is a beating?? 50fps vs 51fps for example LOL, it is the same performance.
> 
> I think this card should be priced at $600 just because of the poor oc potential and medium 1440p perf.



Hey, it turns out stock reference 980 Ti's are deadly, especially at the same price point.

But you're right, Fury X is overpriced.


----------



## ShurikN (Jun 24, 2015)

Yolokila said:


> Cons of FURY X;
> -4gb Limit. Yeah its HBM but it is still a limit anyone who does not believe that is stupid.


Looking at the 4K results, it's clear that 4 GB of HBM is not limiting this card, because it's at that res that this card actually shines.


----------



## mroofie (Jun 24, 2015)

64K said:


> AMD just doesn't have the resources to top Nvidia. There's something going on with AMD but they're denying it at this point. It may just be shit rumors.
> 
> http://arstechnica.com/business/2015/06/amd-weighing-a-business-break-up-or-spin-off-reuters-says/



"It may just be shit rumors" lol


----------



## jormungand (Jun 24, 2015)

Not trolling, but does AMD have a death wish or something??? Reading the review, my face went like this....


----------



## Nihilus (Jun 24, 2015)

ShurikN said:


> Looking at the 4k results it's clear that 4GB HBM is not limiting this card. Because it's at that res where this card actually shines.



I feel we have been misled on VRAM consumption over the years.  Some of these sites claim that GTA V and Middle Earth: SoM use 6 GB or more at 4K with high settings, yet looking at these reviews, there were no signs that the Fury X hit the VRAM wall.

Most were certain that the GTX 690 would suffer badly with 2 GB of usable VRAM, but it was consistently stronger than the 7990.


----------



## terroralpha (Jun 24, 2015)

After carefully examining this review and others (earlier today), I placed an order for a Gigabyte GTX 980 Ti G1 and an MSI GTX 980 Ti Gaming. Due to limited supply I couldn't get two identical cards, but I'm OK with that. I bought them via Newegg's eBay store this morning, and eBay was having the 4x eBay Bucks promo, so I'm going to get about $55 back.

All that promising, "fastest GPU in the world" and other BS. This is almost like Bulldozer all over again!


----------



## HumanSmoke (Jun 25, 2015)

Fluffmeister said:


> So yeah, GTX 980 Ti FTW.
> Solid card regardless, I'm sure the peeps here that only go red will lap it up.


A customer of mine ordered a Gigabyte GTX 980 G1 Gaming, got cold feet after seeing AMD's PR benchmarks, and booked a Fury X instead. I picked up the G1 order with a 10% discount (restock fee), so I'm stoked, as the card was already discounted to 10% below the EVGA SC. Still might trade up if the Classified turns up the heat. (So much for sitting this generation out!)


natr0n said:


> They need a well tuned driver to exploit this card better.





Steevo said:


> I hope they bring the performance up with driver optimizations


Ah, the miracle-driver Hail Mary! By the time AMD gets the driver out, the gloss will certainly be off this SKU.


Enterprise24 said:


> Overclocking suck so much.


Got to save something for Christmas sales - AMD Radeon Fury X PRO XT


the54thvoid said:


> Not to rub salt on any wounds but if Nvidia deign to release a full core GM200 (Titan spec) with 6Gb ram, unlimited TDP (within reason), adjustable voltage and let partners do the shroud and cooling work - the granddaddy of all gaming GPU's would take some beating at 28nm.


They may not need to - not until they NEED a new SKU (holiday refresh for Q4). If the current models are anything to go by, the custom GTX 980 Tis should hold the line. Even if AIBs are allowed to clock the Fury X to ~1200-1250 MHz, I doubt that will tip the balance.






TheGuruStud said:


> Samsung didn't become #1 by failing.


Nope, you're right. Samsung became #1 through institutionalized IP theft, wholesale bribery, and protection and sweetheart deals with government.


----------



## TheGuruStud (Jun 25, 2015)

HumanSmoke said:


> Nope, you're right. Samsung became #1 though institutionalized IP theft, wholesale bribery, and protection and sweetheart deals with government.



So.... Pretty much like all of the giants that are US based (regardless of sector)?


----------



## CounterSpell (Jun 25, 2015)

It should cost $500.

It's overpriced.

At least we have competition.


----------



## xenocide (Jun 25, 2015)

Oh.  Oh my.  This is beautiful.  It's exactly what I expected: a card that underperformed, didn't OC for crap, and was overpriced because of the CLC.  I can't believe people really expected a card that _shipped stock_ with an AIO water-cooling solution would OC well.  Of course it gets good temps; the cooler is like $100 tacked onto the price tag.  The power consumption is still garbage, and the performance at sub-4K resolutions is pretty bad.  If this were priced around $500 or even $549.99, it would be a decent product, but at $650 I see literally no reason to buy this thing.



Nihilus said:


> I feel we have been misled on VRAM consumption over the years.  Some of these sites claim that GTA V and Middle Earth: SoM use 6 GB or more at 4k with high settings, yet looking at theses reviews, there were no signs that the Fury X hit the Vram wall.


 
TPU already kind of debunked the idea of games using 4 GB+ at higher resolutions.  For some reason people refused to believe that games wouldn't quickly climb to using 6 GB or 8 GB.  Then, almost immediately after that article was published here, they reviewed The Witcher 3, which used ~2 GB of VRAM at 4K, which kind of solidified the idea that VRAM saturation wasn't a real problem; lazy optimization was.  A lot of games _can_ use 6-8 GB of VRAM, but most of them don't _need_ to.  There are plenty of games that adapt to the amount of VRAM available, too.  I remember seeing reviews of BF3 and BF4 where cards with more than 4 GB of VRAM would use 3.5 GB give or take, while 3 GB and 2 GB cards would use nearly all of it, with no real performance loss.


----------



## ShurikN (Jun 25, 2015)

CounterSpell said:


> it should cost $ 500
> 
> its overpriced
> 
> at least we are having competition


Nah man, it should be free, or better yet AMD should pay us to use it. That'll keep 'em in business...


----------



## LogitechFan (Jun 25, 2015)

ShurikN said:


> Nah man it should be for free, or better yet AMD should pay us to use it. That'll keep em in the buisness...


If they can't compete performance-wise -- they need to discount. Or die.


----------



## mouacyk (Jun 25, 2015)

btarunr said:


> What!? Interposer lithography has nothing to do with rop count. Just reading that analysis gave me nausea.
> 
> What AMD did wrong, in my opinion, is lazily reused 1x the front-end of Tonga and 2x its back end, while merely doubling CU count. There's a bottleneck hidden somewhere there.
> 
> ...



There's a lot of reading out there suggesting that the die is near the maximum size allowed by the interposer, and thus the ROP count could not be increased to make use of the additional available bandwidth; the bandwidth is ROP-starved and therefore useless.  Wouldn't going with a smaller lithography allow more traces on the interposer, and in turn allow more stuff on the die?  Or is the die near its maximum size, where going beyond would hurt yields drastically?  Just trying to understand the useless new bandwidth...


----------



## Ikaruga (Jun 25, 2015)

@logitechfan Had to change my avatar because of you, welcome to TPU


----------



## Nihilus (Jun 25, 2015)

xenocide said:


> A lot of games _can_ use 6-8GB of VRAM, but most of them don't _need_ to.  There are plenty of games that adapt to the amount of VRAM available too.  I remember seeing reviewed of BF3 and BF4 where on cards with more than 4GB of VRAM it would be using 3.5GB give or take, and with 3GB and 2GB cards it would be using near all of it, with no real performance loss.



The only evidence I saw that 4 GB may be a problem was at (edit) Overclock3D, where the 8 GB 390X performed better than the Fury X. They were running Crysis 3 at 4K with AA. This was the only case I saw after looking at several reviews, so it's not absolute.

http://www.overclock3d.net/reviews/gpu_displays/amd_r9_fury_x_review/21


----------



## HumanSmoke (Jun 25, 2015)

Nihilus said:


> The only evidence that 4 GB may be a problem was at bittech when the 8 GB 390x performed better than the Fury.  They were running
> Crysis 3 at 4k with AA.  This was the only one I saw after looking at several reviews so not an absolute.


Dying Light is also an (albeit extreme) example, as HardOCP found


----------



## btarunr (Jun 25, 2015)

You won't need AA at 4K as long as your display is 32 inches or smaller.


----------



## Nihilus (Jun 25, 2015)

Why would you get a 32" or smaller 4K TV?  I thought the PC Master Race was trying to get PCs to replace consoles.  There will be a ton of people with 4K displays in the next 6 months.  Black Friday will be interesting as always.

After a closer review of my link above, there are SEVERAL games where the 390X with 8 GB gives the Fury X a run for its money.  You'd better not even THINK about CrossFire with the Fury X - there are so many better routes to go from both AMD and Nvidia.


----------



## btarunr (Jun 25, 2015)

Nihilus said:


> Why would you get a 32" or smaller 4k tv?  I thought the Master PC Race was trying to get PCs to replace consoles.  There will be a ton
> of people with 4k displays in the next 6 months.  Black Friday will be interesting as always.



Uhm...monitors?

I personally use a 28-inch 4K. I've never ever needed AA, in even the oldest games I've played on this display. Even if a game or GfE auto-sets an AA mode, I disable it to give myself a softer framerate cushion.


----------



## nem (Jun 25, 2015)

Leaving fanboyism aside: everyone knows DX11 is dying, and what Microsoft has said is that they want to make the transition to DX12 as quickly as possible, so that all DX11 work can be ported to the new API as easily as possible. Also remember that a DX12 benchmark was leaked in which a 290X can beat the 980; that gives a lot to think about. In a month there will be new benchmarks, and we'll see that this review is not a good reference for the long life of a card, nor for how each architecture handles the API. NVIDIA has added three irrelevant effects but lacks support for asynchronous shaders, which AMD does support, and as far as anyone knows that is a vital feature for the performance DX12 could provide... ¬___¬

https://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels







Here's the hard reality... ^^


----------



## Frick (Jun 25, 2015)

Petey Plane said:


> I'm really looking to the Nano to restore some faith in AMD.  I'd love to get ≈ 980 performance in a SFF case like the PC-Q30 (the little curved upright ITX case with the window)



It's supposed to compete with the 970 I think.


----------



## nem (Jun 25, 2015)

Frick said:


> It's supposed to compete with the 970 I think.


All these benchmarks are on very green drivers; there's still a lot of room to make better drivers. Meanwhile, the Gefail cards have highly optimized drivers...


----------



## rubenclavs (Jun 25, 2015)

@All

IMO this is a great card for the money. Performance is not that much worse compared to the 980 Ti.

Nvidia has mature drivers compared to AMD.

Also, AMD is building for the future API, which is DX12. We never really know what will happen when the time comes, but if I buy a 980 Ti over the Fury X now, and a year later DX12 games come out and the Fury X leaves the 980 Ti or Titan X in the dust, then my upgrade won't have been worth it (I'd have to buy another card).

This product is being released with potential for Windows 10. AMD is not saying we need to buy it the day it is released. This is new technology with HBM, and Windows 10 is coming in 1-2 months.

So relax, guys. The best is yet to come. We just have to wait and see.

Cheers!


----------



## fullinfusion (Jun 25, 2015)

A sad day for me seeing this review, but to all the people bitching I say this: it's new tech, so deal with it. Give AMD a chance to get their drivers right and some time to tune this tech. This is new technology, so I give AMD a thumbs up for being the first to change the way these cards are built.

I'll be waiting for the X2 version to come out before buying, unless the drivers get tuned accordingly, but whatever. 

@W1zzard 
Why no 3dMark runs?


----------



## some_big_freek12 (Jun 25, 2015)

So the things that actually matter most for a card's performance are ROPs and pixel fillrate? 
And what is DP?


----------



## W1zzard (Jun 25, 2015)

fullinfusion said:


> Why no 3dMark runs?


We're focusing on real game tests; haven't used 3DMark for years.


----------



## mroofie (Jun 25, 2015)

HumanSmoke said:


> Customer of mine ordered a Gigabyte GTX 980 G1 Gaming. Got cold feet after seeing AMD's PR benchmarks and booked a Fury X. I picked up the G1 order with a 10% discount (restock fee), so I'm stoked as the card was already discounted down to 10% below the EVGA SC. Still might trade up if the Classified turns up the heat. (So much for sitting this generation out!)
> 
> 
> Ah, the miracle driver Hail Mary! By the time AMD get the driver out, the gloss would certainly be off this SKU.
> ...



*"Samsung became #1 through institutionalized IP theft, wholesale bribery, and protection and sweetheart deals with government"*

Thank you 



fullinfusion said:


> A sad day for me while seeing this review but to all the ppl bitching I say this... Its new tech so deal with it. Give AMD a chance to get there drivers right and some time to tune this tech. This is new technology so I give AMD a thumbs up for being the first to change the way the cards are built.
> 
> I'll be waiting for the x2 version to come out before buying or unless the drivers get tuned accordingly but whatever.
> 
> ...


3DMark runs? Really?
Because real-world performance is what counts.
Wacky synthetic numbers don't mean sh**



nem said:


> Leaving fanboyism aside, everyone knows DX11 is dying, and Microsoft has said they want to make the transition to DX12 as quickly as possible, so that existing DX11 work can be ported to the new API as easily as possible. Also remember that a leaked DX12 benchmark showed a 290X beating a 980, which gives a lot to think about; within a month there will be new benchmarks and we'll see. This launch alone is not a good reference for the long life of a card. As for the two architectures: Nvidia has added three irrelevant effects but lacks support for asynchronous shaders, which AMD does support, and as far as anyone knows this is a vital feature for the performance DX12 could provide ... ¬___¬
> 
> https://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels
> 
> ...


Uhm ..... no 

Lets wait for official news and data


----------



## Aceman.au (Jun 25, 2015)

Wow.... I think I have to go to the green team. This is absolutely disgusting. I will wait until after my tax return to buy, but I expect very little improvement from drivers.


----------



## laszlo (Jun 25, 2015)

Lots of comments... bad card... good card...

This card is released for 4K gaming, not lower resolutions, so take it as is.

Those who want to play at 4K may buy it or not; for lower resolutions we have plenty of products available.

I'm glad they released it so I can upgrade cheaper this year; I'm sure $200 will get me a goodie for lower res.


----------



## xenocide (Jun 25, 2015)

Even for 4K, I think a third-party 980 Ti would be a better solution for barely more money ($20, give or take).


----------



## Parn (Jun 25, 2015)

Performance is not as good as I hoped.

But considering its compact size, low operating temperature, and vastly improved efficiency over the last generation, the Fury X is still a decent card, though the price could stand to drop a further $50. Let's see what the Nano can do next, as that's the card I'm more interested in.

The 390X is probably the most disappointing one in terms of price. At the mainstream resolution (1080p), it's only 3% faster than a 970 but costs way more. Its 8 GB of VRAM sounds nice on paper but is useless most of the time, as the GPU itself can't handle 4K, where large VRAM is actually needed.


----------



## buggalugs (Jun 25, 2015)

A few brands of Fury X (MSI, XFX, Gigabyte) hit stores here in Australia today and sold out in hours... at $999 each. That's much higher than it would normally be ($650 US is about $760-ish here), but there are plenty of takers.

There are a lot of doomsayers here, but it's a very good card. At 4K and multi-monitor it's very competitive; at any res, really. Most users would not be able to tell the difference between the Fury X, 980 Ti, and Titan X in the real world. With Windows 10 and DX12, the Fury X will only get better.

I heard the Bugatti was 5% faster than the Ferrari, so does that mean Ferraris are shit?? A lot of people want this card and are buying it.


----------



## raptori (Jun 25, 2015)

A disappointing card, and considering all the hype surrounding it, a 9.2 is too much.


----------



## HumanSmoke (Jun 25, 2015)

buggalugs said:


> A few brands of FuryX (MSI, XFX, Gigabyte) hit stores here in Australia today and they were sold out in hours......for the price of $999 each. Which is much higher than it would normally be, ($650 US is about $760ish here) but there are plenty of takers.


There always is for new hardware that has a full court press with marketing. OcUK sold 100 cards (their entire stock) at almost a 20% mark-up over MSRP inside half an hour. A client of mine pre-paid for a Fury X before the card was launched - not only paying an exorbitant price thanks to the country's GST, but also having to foot a $NZ100 restocking fee for the 980 Ti he had originally ordered - so how's that for keen?
Price isn't the great barrier to sales that you seem to think. Sales of Titan cards over the last couple of years should be a sobering example.


raptori said:


> Disappointing card , and considering all hype was surrounding this card a 9.2 is too much for this card .


It is still a great card. You won't find an AIO cooled 980 Ti for anywhere near the same price. The only disappointment (for some) is the hype surrounding it and the expectation of some all-conquering silicon.
Expectations and AMD's feeding of them might lead to some deflation, but it doesn't alter the fact that the card will find a home with many - although not me. I enjoy tinkering with my hardware, and the voltage and memory locks don't give much in the way of extended tweaking for benchmarking fun.


----------



## mirakul (Jun 25, 2015)

HumanSmoke said:


> There always is for new hardware that has a full court press with marketing. OcUK sold 100 cards (their entire stock) at almost a 20% mark-up over MSRP inside half an hour. A client of mine pre-paid for a Fury X before the card was launched - not only paying an exorbitant price thanks to the country's GST, but also having to foot a $NZ100 restocking fee for the 980 Ti he had originally ordered - so how's that for keen?
> Price isn't the great barrier to sales that you seem to think. Sales of Titan cards over the last couple of years should be a sobering example.
> 
> It is still a great card. You won't find an AIO cooled 980 Ti for anywhere near the same price. The only disappointment (for some) is the hype surrounding it and the expectation of some all-conquering silicon.
> Expectations and AMD's feeding of them might lead to some deflation, but it doesn't alter the fact that the card will find a home with many - although not me. I enjoy tinkering with my hardware, and the voltage and memory locks don't give much in the way of extended tweaking for benchmarking fun.


I didn't expect much given that asking price. If it did beat the 980 Ti, its price would be on the moon. Still, the unimpressive GPU doesn't change the fact that this is a great card with a high-quality build and shiny new HBM tech.


----------



## MagnuTron (Jun 25, 2015)

A small video of my "Day 2" thoughts on this release..


----------



## springs113 (Jun 25, 2015)

After reading about 6-7 reviews and multiple discussion threads about this card, I've come to the following conclusions.

1.  The disparity between reviewers' results is crazy. Some reviewers' results for the same game or synthetic benchmark are far off from others'.

2.  The most annoying thing is ppl calling this a flop (no matter what AMD touted it as). It is definitely not that. Ppl fail to realize that NVIDIA probably got wind of the performance and knew they had to counter it (hence the drop-in 980 Ti).

3.  Basically a follow-up to (2). If this card had come out first, no one here would be saying anything negative other than "the OC sucks". It truly is disappointing that this card can only muster a meager 100 MHz OC. WTF was AMD thinking, knowing that the target audience for this card relies heavily on being able to push their cards, some to the extreme. The performance it offers for the price (especially compared to the Titan X) is of the same relevancy as the 780/780 Ti to the Titan.

4.  My main concern if I were to upgrade (pretty sure a lot of others would be in the same boat) is that the Fury X apparently will not come in vanilla (air-cooled) form. I have a custom loop, and it would suck because I would either have to bypass my loop or not use this card until custom water blocks are out, which means AMD loses money here once again. I don't care about power draw and efficiency all that much; I didn't go all out on my system to worry about how much my light bill is going to increase. Moot point here for ppl still complaining about power.

5.  I am a little disappointed by the lower-resolution performance, but I do believe driver updates will help. I currently game on a 1440p monitor, but for my next GPU upgrade I know I'm going 4K.

6.  As far as drivers are concerned, ppl need to relax, because there are ppl out there experiencing bad drivers from both camps. *YOUR CASE IS NOT THE END OF THE WORLD; IT HAPPENS TO EVERYONE OWNING A CARD FROM EITHER CAMP.* I have several computers in my home running cards from both camps (3870, 880gt, 7730m, 780, 980, 980m, and 2x 290X). Of all the driver issues I have had, most have been on the green team's side. In fact, I am experiencing two right now and just had one resolved. Arkham Knight does not work on my 980/980m; I have yet to try it on the 780 system. My 290Xs in CrossFire rip through the game with no issues, and before anyone mentions updating the drivers: I have, and still no luck. *AMD's beta drivers work better than NVIDIA's WHQL drivers.* This is my case, but that won't deter me from NVIDIA, and I certainly won't bash them for it. I do like what the red team is doing given their financial situation, and I would continue to support them and their products even if I lose a bit of performance here and there.

7.  I don't know if their profit margins can allow for something like this, but at $600 this card is a no-brainer for sure. It is doing quite nicely at its current pricing, though; despite all the negatives I have seen/read surrounding this card, it is practically sold out everywhere in the U.S.

8.  You're either going to buy it or not. The difference in FPS is not all that concerning given that the tech is new, drivers aren't mature, and to my knowledge it's going to require these lazy developers to actually start working for their money. This card is here to stay, and I believe it is an experiment for AMD's next iteration of GPU; the next implementation of HBM will prove that, I have no doubt in my mind, and secondly NVIDIA wouldn't be going that route if it weren't so. Let's not forget AMD gets paid for every use of HBM, along with Hynix, so the green team will be shelling out to the red team for Pascal, ppl. All you guys thinking AMD will fail are delusional and quite frankly fools; I certainly want them around to keep prices in check. All y'all out there saying this is Bulldozer all over again are wrong: this card loses/ties/beats the competition, and mostly ties. Given the budgets and resource availability of the two companies, I would give AMD the nod; they have fewer resources and are heavily restricted financially. All in all, I would say good job, AMD. No Titan X/980 Ti killer, but job well done.


Quite frankly, if I hadn't already gone over budget on my PC, laptop(s), PS4, Xbox One, game purchases, and pre-orders, I would get both a 980 Ti and a Fury X, but I don't really need either one, or the upgrades. My 4770K/780 setup has only been turned on once, after installing the operating system; same goes for my 4790/980 system (HTPC). My 290X and 980m systems get all the attention. My XPS 13 is for my DJing, which is on a back burner. Sorry for the little rant; I think a lot of ppl are in the same boat as me (don't need this generation). I guess it's time to teach my 1-year-old how to operate these things. I need help with utilization.


----------



## Frick (Jun 25, 2015)

springs113 said:


> I guess its time to teach my 1yr old how to operate these things.  I need help in utilization.



I suggest you hire someone, that'll teach the little bugger good values as well.


----------



## springs113 (Jun 25, 2015)

Frick said:


> I suggest you hire someone, that'll teach the little bugger good values as well.


He doesn't need it; he already knows a lot more than most ppl in this thread.


----------



## Enterprise24 (Jun 25, 2015)

9.2 is too much; I'd give it 7.5.
Pros are as Wizz said, but I'll add cons:
- Lack of VRM cooling
- Overclocking potential is really slim; I don't know if non-reference cards with stronger VRMs will fix this
- Price is insane; should be $499-549
- Too much propaganda (like AMD's official benchmarks)
- 1080p/1440p performance is really a joke; the 980 Ti is faster, and way, way faster as a non-reference card with a manual OC


----------



## the54thvoid (Jun 25, 2015)

Technical question based on statement:

If AMD designed it to work with a water AIO from the start (and not air cooling), this must mean the chip produces a lot of heat? On such a small PCB, would this create heat-dissipation problems? I genuinely think the Fury (non-X) is air-cooled at slower clocks because it will struggle thermally. I figure AIBs will do third-party coolers from the start?

I really don't think there is any headroom for air cooling, and even then, the water cooling isn't allowing higher clocks. It could be an HBM integration issue, with heat affecting perf? Who knows.


----------



## thebluebumblebee (Jun 25, 2015)

IMHO, AMD's biggest mistake with this is not bringing it out in AIO and waterblock versions from the get-go. Given the expense of the AIO, a waterblock version should not cost that much more.

CoolerMaster _makes_ AIO's?



the54thvoid said:


> Technical question based on statement:
> 
> If AMD designed it to work with water AIO from the start (and not air cooled) this must mean the chip produces a lot of heat?  On such a small PCB would this create heat dissipation problems?  I genuinely think Fury (non X) is air cooled on slower clocks because it thermally will struggle.  I figure AIB's will do 3rd party coolers from the start?
> 
> I really don't think there is any headroom for air cooling and even then, the water cooling isn't allowing higher clocks.  It could be a HBM integration issue with heat affecting perf?  Who knows.


It's my understanding that they went AIO because the GPU and the VRAM are all located on the GPU "package".  The "non X" will have regular GDDR5 in a conventional format.


----------



## EarthDog (Jun 25, 2015)

Excellent review, Wizz!!! Looks like they fell just a bit short again... but hey, the price is at least in the ballpark for the performance (lack of overclocking headroom notwithstanding). Hopefully the pump-noise issue isn't prevalent.


LOL, Xfia got a vacation? Christ, that was looooooooooooooong overdue...


----------



## btarunr (Jun 25, 2015)

Enterprise24 said:


> 9.2 is too much , I'll give it 7.5
> Pros is like wizz said but I'll add cons.
> - Lack of VRM cooling
> - Overclocking potential is really slim , I don't know if non-ref with strong VRM will fix this
> ...



I'm not trying to justify the score. However,

1. It does have VRM cooling in the form of a copper tube passing over the area, which is the main coolant channel.
2. To be honest, we're all cavemen figuring out an iPhone when it comes to HBM. Maybe there will be more documentation, or a larger community poking-with-a-stick, before we figure out how to tweak it.
3. AMD did not see the GTX 980 Ti coming. Heck, they didn't expect GM200 to come out this fast. NVIDIA put GM200 to work within a span of 4 months (faster compared to how long it took to monetize GK110 post-GK104).
4. Every GPU launch has such pre-launch activity.
5. Nobody buys these cards for 1080p. In the case of 1080p @ 120 Hz, I agree. 1440p isn't so bad.

Regardless of how Fury X turned out to be, whatever little inventory is with the retailers, is flying off the shelves. There's still a section of people who are convinced by the product, who like quirky-yet-potentially-powerful toys, and love the _deus ex machina _it gives them.


----------



## Casecutter (Jun 25, 2015)

HumanSmoke said:


> Dying Light is also an (albeit extreme) example, as HardOCP found.


 
But isn't that more a built-in of Nvidia GameWorks (ShadowWorks)? Honestly, I personally find Very High looks very... unnatural. Medium is about the top limit, where you get "defined but softened shadows" that are more true to life; even the differences between Low and Medium are hardly noticeable. Very High shadows are more a setting Nvidia gave themselves to "sugar-coat" their hardware. Folks can see for themselves in their interactive comparison here:
http://www.geforce.com/whats-new/gu...performance-guide#dying-light-shadow-map-size

But yeah, shadowing technology has come a long way... I thought it amazing to see shadows following rolling beach balls, and how shadows even climbed and elongated up a surface (a wall) as the ball got close to it, or rolled down into a trough. Those were some of the technical advancements that won awards at the Odyssey Visual Design Computer Animation Festival, circa 1986 & 1987.

There have been comparisons to the Gigabyte GeForce GTX 980 Ti G1 review W1zzard did prior to this. Looking at that on the subject of power under gaming, it's revealing.
The G1 was 23% higher in its average/peak than the reference 980 Ti. Avg: 211 vs. 259 W; Peak: 238 vs. 293 W.
The Fury X numbers: Avg 246 W; Peak 280 W.

Interestingly, Tom's also has a fairly sophisticated power test, and they report the Fury X under gaming averages 220.7 W. While I can't find a reference 980 Ti review of theirs (?), their Titan X number was 233 W. So I'm not all "up in arms" over Fury X power consumption; for an enthusiast it's not "out of bounds". Heck, even [H] isn't showing it as that "out of bounds", perhaps 3-5% at most, and they average all the games they run. But you need to read how Brent appears to sabotage it by showing the "spike peak" in _FC4_ @ 4K; odd, you mean there isn't a number for/like that in a game with the 980 Ti?

I do note that reviewers universally see minimal OC'ing, although if you read what was provided as "what AMD said about OC'ing" (scroll down) around June 18th: _"AMD tells us that a 1150MHz clock speed, a 100MHz overclock over the 1050MHz stock, is quite easily achievable with AMD's OverDrive... does not support over-volting, so... attainable with stock voltage. AMD tells us Fury X graphics cards should be able to run at this clock speed without any issue."_ But then the author goes on to say, "_If what AMD's telling us is true, then Fury X will be AMD's best overclocking graphics card yet, as a 1150MHz clock speed at the stock voltage was more of a rare occasion on AMD's previous 200 series and HD 7000 series reference designed cards._" It appears AMD was reining in or tempering OC'ing expectations. Yet I find that earlier (June 16th), _AMD CTO Joe Macri_ started the firestorm: "You'll be able to overclock this thing like no tomorrow." "This is an overclocker's dream." There's an executive who should be taken to task for not knowing the talking points, and it's again a technical guy! *This is perhaps the biggest problem AMD has!*



Enterprise24 said:


> Price is insane , should be 499$-549$


The thing is, it's priced as offering an "as good as any other" enthusiast card for 4K, which is what it is intended for. Sure, it's not always level with or above a 980 Ti, but folks can weigh the merits of that and at least have a choice.

AMD knows its production schedule is set and achievable, and seeing no lull in sales at the moment, they foresee selling every one they produce, while recouping some R&D and production costs and still pulling in a reasonable profit. That's a company doing it right!

As others have said, they aren't a non-profit charity...? If a product is competitive within the intended market, pricing it appropriately is how that works. You believe they're only competitive if they reduce the price by, say, 18%, even when W1zzard indicates only perhaps 2% separates them at 4K? I hope you aren't running marketing at my business, but then again you might...


----------



## Ikaruga (Jun 25, 2015)

nem said:


> Leaving fanboyism aside, everyone knows DX11 is dying, and Microsoft has said they want to make the transition to DX12 as quickly as possible, so that existing DX11 work can be ported to the new API as easily as possible. Also remember that a leaked DX12 benchmark showed a 290X beating a 980, which gives a lot to think about; within a month there will be new benchmarks and we'll see. This launch alone is not a good reference for the long life of a card. As for the two architectures: Nvidia has added three irrelevant effects but lacks support for asynchronous shaders, which AMD does support, and as far as anyone knows this is a vital feature for the performance DX12 could provide ... ¬___¬ ..... here is the hard reality... ^^


Isn't it possible that you are the one suffering from fanboyism and spreading FUD? That table can't be right... Maxwell 2 supports it for sure; more to that, IIRC it's mandatory for every DX12 GPU to support it.



nem said:


>


Yes, AMD cards will gain more of a speed increase from DX12, but we don't know how that will actually impact real-life gaming performance, and I'm so tired of this benchmark being linked on every tech site. Do you even realize it's an API *OVERHEAD*(!) test, or do you just see the bigger number and the longer bar?


----------



## 64K (Jun 25, 2015)

BiggieShady said:


> Speaking of pcb-s
> View attachment 66021



If that's to be the dual-GPU Fury X, then it may be seen as a fail. It's got two 8-pins, so that's good for 375 watts. A single Fury X has two 8-pins and already draws 280 watts at peak. The clocks will have to be backed down. It will probably cost more than two Fury Xs and not perform as well as two Fury Xs.
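For what it's worth, the 375 W figure follows straight from the PCI-E power-delivery limits (75 W from the slot, 150 W per 8-pin connector). A quick back-of-the-envelope sketch, using the 280 W single-card peak from this review:

```python
# PCI-E power budget for a card with two 8-pin connectors.
SLOT_W = 75        # the x16 slot itself is rated for up to 75 W
EIGHT_PIN_W = 150  # each 8-pin PEG connector is rated for 150 W

budget = SLOT_W + 2 * EIGHT_PIN_W  # in-spec ceiling for the whole card
single_fury_x_peak = 280           # measured peak in this review, watts

print(budget)                  # 375
print(2 * single_fury_x_peak)  # 560
```

Two full-speed Fiji GPUs would want roughly 560 W, well past the 375 W in-spec ceiling, hence the expectation that clocks come down (or the spec gets exceeded).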


----------



## mirakul (Jun 25, 2015)

Plot twist is real


> Just putting this out there. Here's a review that seems quite different from the other ones: https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http://www.ixbt.com/video3/fiji-part3.shtml This one actually shows that Fury beats 980 ti in a lot of tests and even titan x in some tests. Of course in lower resolutions it seems that NVidia still wins a lot of times, but it's not as bad as it is with some other reviewers. Now here's a possible reason why. Driver version used in this review is 15.15-180612a-18565BE which the reviewer was sent by AMD on June 18th. The press driver on AMD's FTP server is 15.15-150611a-185358E. I think this is probably the reason of this inconsistency.


Your thought, @W1zzard


----------



## BiggieShady (Jun 25, 2015)

64K said:


> If that's to be the dual GPU Fury X then it may be seen as a fail. It's got two 8pins so that's good for 375 watts. A single Fury X has two 8pins and already draws 280 watts peak. The clocks will have to be backed down. Probably cost more than two Fury X and not perform as well as two Fury X.


They'll probably exceed the PCI-E spec for power draw, just as they did with their dual-GPU 295X2.


----------



## yogurt_21 (Jun 25, 2015)

mirakul said:


> Plot twist is real
> 
> Your thought, @W1zzard


Umm, that's the same story as this review. In some tests the Fury came out on top of the 980 Ti; trouble is, when it did, it was by 1-2%, while in the tests where the 980 Ti came out on top, it was by a higher percentage, making the overall average favor the 980 Ti. Even in the games where the Fury X is faster, the 980 Ti delivers almost exactly the same gaming experience. The reverse isn't true, however: the Fury X is significantly slower in several tests, so much so that it might affect detail level or even playability in some cases.


----------



## EarthDog (Jun 25, 2015)

BiggieShady said:


> They'll probably exceed PCI-E spec for power draw just as with their dual GPU 295X


golf clap!


----------



## mirakul (Jun 25, 2015)

yogurt_21 said:


> umm that's the same story of this review. In some tests the Fury came out on top of the 980 ti, trouble is when it did it was by 1-2%, in the tests the 980 ti came out on top it was a higher percentage making the overall average in favor of the 980 ti as even the games which the fury x is faster, the 980 ti has almost the exact gaming experience. The reverse isn't true however where the Fury X is significantly slower in several tests, so much so it might affect detail level or even playability in some cases.


If you count those GameWorks titles, I don't think we would have a nice discussion.
On another note, here is the latest driver for the Fury X:
http://www2.ati.com/drivers/amd-catalyst-15.15.1004-software-suite-win8.1-win7-64bit-june20.exe
Notice the *june20* part.


----------



## N3M3515 (Jun 25, 2015)

xenocide said:


> Even for 4K I think a third part 980 Ti would be a better solution for barely more ($20 give or take).



This one is at $640 and comes with a free copy of Batman: Arkham Knight.
http://www.newegg.com/Product/Produ...500376&cm_re=gtx_980ti-_-14-500-376-_-Product


----------



## Steevo (Jun 25, 2015)

N3M3515 said:


> This one is at $640 and comes with a free batman arkham knight
> http://www.newegg.com/Product/Produ...500376&cm_re=gtx_980ti-_-14-500-376-_-Product




A game that's no longer available?


----------



## the54thvoid (Jun 25, 2015)

mirakul said:


> If you count those Gameworks titles, I don't think we would have a nice discussion.
> On other note, here is the latest driver for FuryX
> http://www2.ati.com/drivers/amd-catalyst-15.15.1004-software-suite-win8.1-win7-64bit-june20.exe
> Notice the *june20* part.



No doubt the drivers will help performance but you're really quite 'graspy' when it comes to propping up AMD.  They don't need your help - the card's good, just not what (people like you) hyped it up to be.

EDIT:

Your link doesn't work.


----------



## Sihastru (Jun 25, 2015)

Someone at AMD doesn't know the difference between G*B* (gigabyte) and G*b* (gigabit)...


----------



## 64K (Jun 25, 2015)

N3M3515 said:


> This one is at $640 and comes with a free batman arkham knight
> http://www.newegg.com/Product/Produ...500376&cm_re=gtx_980ti-_-14-500-376-_-Product



Not sure what the situation is there. Warner Brothers has pulled the PC version of the game for the time being. I don't think at this point that they're just having a hissy fit because they got called out for releasing a buggy game, but it's not like they didn't know from beta testing what the issues were to begin with. Hopefully they will do their job and fix the game.

http://arstechnica.com/gaming/2015/...-pulled-from-steam-and-retailers-due-to-bugs/


----------



## HumanSmoke (Jun 25, 2015)

the54thvoid said:


> Technical question based on statement:
> If AMD designed it to work with water AIO from the start (and not air cooled) this must mean the chip produces a lot of heat?  On such a small PCB would this create heat dissipation problems?


I think that about sums it up. The big problem, it seems, is that the HBM stacks sit in close proximity to the GPU (and under the same heatsink), while the PCB acts as a heatsink itself for the VRMs. As someone else noted, the high localised temps are the same scenario, albeit not as drastic, as found on the 295X2: heat from all sides, minimal internal airflow, and reliance on a cold plate on one side to heatsink the entire card. The heat buildup is, by my reckoning, largely behind the decision to voltage-lock the GPU and clock-lock the HBM at Hynix's default settings. We may never know unless PowerColor or some other vendor gets a full-cover waterblock version of the card out, with sanction from AMD to relax the voltage lock (BTW: wasn't there a huge furore when Nvidia voltage-locked their cards?).


the54thvoid said:


> I genuinely think Fury (non X) is air cooled on slower clocks because it thermally will struggle.


I concur. AMD's GPUs have about the same power requirement as Nvidia's, but have over the last few generations had greater issues with thermal dissipation (maybe the higher transistor density?). The ideal situation would be (aside from TEC) a heatsink fed directly to vapour chamber, with the HBM stacks cooled by fan air and ramsinks, but I think the HBM stacks proximity to the GPU make that a tricky assembly job as well as adding some unwelcome (for the vendors) added expense working in machining tolerances.


the54thvoid said:


> I really don't think there is any headroom for air cooling and even then, the water cooling isn't allowing higher clocks.  It could be a HBM integration issue with heat affecting perf?  Who knows.


A bad IC on a GDDR5 card means the RMA involves removal and replacement. An RMA of a Fury X means the whole package probably gets binned and the GPU salvaged for assembly onto another interposer package by Amkor. Warranty returns mean bad PR in general, but the physical cost on a small-volume halo product like the Fury X could also be prohibitive.


Casecutter said:


> But isn't that more as a built-in of Nvidia Gameworks; ShadowWorks? Honestly, I personally find Very High looks very... un-natural.


Doesn't matter in the greater scheme of things. How many AMD users vehemently deride, ignore, and refuse to buy any Nvidia sponsored title? Yet the titles still arrive, and if they are AAA titles, sell. If they sell, tech sites are bound by common sense and page views  if nothing else to do performance reviews of the titles (and if popular enough include them in their benchmark suites). Benchmarking involves highest image quality at playable settings for the most part, and highest game i.q. in general.
Now, bearing that in mind, and seeing the Dying Light numbers, what do you suppose Nvidia are likely to shoehorn into any of their upcoming sponsored games for maximum quality settings?


Casecutter said:


> Medium is about the top limit where you have "defined but soften shadows" that are more real to life. Even the differences from low/medium hardly noticeable.


In practical terms, no it doesn't. In marketing terms?...well, that's an entirely different matter. DiRT Showdown's advanced lighting enabled lens flares to warm the heart of JJ Abrams. Minimal advancement in gaming enjoyment (less if overdone lens flare is a distraction), but the default testing scenario.


Casecutter said:


> There's been comparisons to the Gigabyte GeForce GTX 980 Ti G1 review W1zzard did prior to this.  Looking at that as to the subject of "power" under gaming it's reveling.
> The G1 saw 23% higher in it's Average/Peak from the reference 980Ti. Avg: 211 vs. 259W;  Peak 238 vs. 293W
> The Fury numbers... Avg 246W;  Peak: 280W


One is a non-reference card sporting a very high 20% overclock out of the box; one is a reference card. A 20% overclock equates to 23% more power usage (and 27% higher performance than the Fury X at 4K). What kind of wattage do you suppose a Fury X would consume at 1260 MHz? Now, this observation aside, what the hell does your long-winded power-usage diatribe have to do with the quote of mine that you used to launch into it?
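Those percentages check out against the wattages posted earlier in the thread (reference 980 Ti: 211 W average / 238 W peak; G1: 259 W / 293 W). A quick sanity check, nothing more:

```python
# Percentage increase of the overclocked G1 over the reference 980 Ti,
# using the gaming-power figures quoted earlier in this thread.
ref_avg, ref_peak = 211, 238  # reference GTX 980 Ti, watts
g1_avg, g1_peak = 259, 293    # Gigabyte GTX 980 Ti G1 Gaming, watts

avg_delta = (g1_avg - ref_avg) / ref_avg * 100
peak_delta = (g1_peak - ref_peak) / ref_peak * 100

print(round(avg_delta, 1), round(peak_delta, 1))  # 22.7 23.1
```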


----------



## Casecutter (Jun 25, 2015)

HumanSmoke said:


> what do you suppose Nvidia are likely to shoehorn into any of their upcoming sponsored games for maximum quality settings?


IDK, I suppose we need to see what Warner Bros. can do to fix the PC version Batman Arkham Knight?



HumanSmoke said:


> In practical terms, no it doesn't.


Like belly buttons, we all get an opinion. Mine is that running the Very High shadow-map setting adds no visual realism for me.



HumanSmoke said:


> and 27% higher performance than the Fury X at 4K


Odd, if a 980 Ti and Fury X are within ~2% at 4K... W1zzard's review of that G1 appears to indicate something like 15% between it and a reference 980 Ti.



HumanSmoke said:


> Now, this observation aside, what the hell does your long winded power usage diatribe have to do with my quote that you used to launch into it?


Nothing; that was a new paragraph and a new topic not directed at you, just comparing published data. And that's the part you "decry" as long-winded?


----------



## haswrong (Jun 26, 2015)

xenocide said:


> Oh.  Oh my.  This is beautiful.  It's exactly what I expected.  A card that underperformed, didn't OC for crap, and was overpriced because of the CLC.  I can't believe people really expected a card that _shipped stock_ with an AIO water cooling solution would OC well.  Of course it gets good temps, the cooler is like $100 tacked onto the price tag.  The power consumption is still garbage, and the performance at sub-4K resolutions is pretty bad.  If this were priced around $500 or even $549.99, it would be a decent product, but at $650 I see literally no reason to buy this thing.
> 
> 
> 
> TPU already kind of debunked the idea of games using 4GB+ at higher resolutions.  For some reason people refused to believe that games wouldn't quickly climb to using 6GB or 8GB.  Then, almost immediately after that article was published here, they reviewed The Witcher 3, which used ~2GB of VRAM at 4K, which kind of solidified the idea that VRAM saturation wasn't a real problem; lazy optimization was.  A lot of games _can_ use 6-8GB of VRAM, but most of them don't _need_ to.  There are plenty of games that adapt to the amount of VRAM available too.  I remember seeing reviews of BF3 and BF4 where, on cards with more than 4GB of VRAM, they would use 3.5GB give or take, and on 3GB and 2GB cards they would use nearly all of it, with no real performance loss.



but you are considering only stupid old polygonal approximative graphics, without even raytracing.. next-gen graphics with a lot of recursion is where you can start needing quite an unpretty amount of memory, and it makes hardware with a large amount of it more future-proof. you may have noticed that nvidia kinda made the first step towards monetizing voxelized graphics with their vxgi 2-bounce illumination model (with a $hitload of pep-talk trying to suggest that nvidia actually invented everything that spells graphics).

what i wanted to say: waiting for hbm v2 is just another wait and more lost time, and fury is a product one year late. it's been at least 2 years of prolonging, rebranding old stuff and losing time from both camps (and intel comfortably waits in hiding with their shitty igps integrated into cpus, mostly increasing their prices with little use).

what strikes me though is that many people jumped on the train that competition is a needed thing. in fact it only benefits a minority of creatures (or groups thereof) of predatory nature, who use its effect to eliminate or control a subject that poses a threat, because it's basically a war: whoever runs out of resources first loses, or at least isn't allowed the originally intended share of the outcome by the will of the stronger entity. on the other hand there's a much greater power called synergy, which is usually beneficial for all partakers thanks to unity and coherency. i wonder when this world finally realizes it and starts spelling that word. probably not before there's less than half a billion peeps left from the mutual wars we're heading towards, as members of society keep on predating on each other because of competition, and because we do what politicians backed by enforcers say through media outlets, not what we would stand for ourselves, representing our free but scattered, disunited, incoherent minds. whatever.. 
the price is unbearable for me anyway, regardless of the hardware's performance.. see ya in the next discussion. 



the54thvoid said:


> *Compassion and Science are the only things that can save mankind.*


*true*


----------



## HumanSmoke (Jun 26, 2015)

Casecutter said:


> Odd if a 980Ti and Fury X are close at 4K ~2%... W1zzard review of that G1 appears to indicate it like 15% between a 980Ti.


My bad on using a wrong baseline figure, but your 15% is wrong.








100 / 85 = 1.1765, or 17.65% higher. To compare the G1's percentage in the Fury X chart, you have to normalize it against the 102% shown there for the reference card: 120% for the G1 / 102% for the reference gives the same 17.65%. And 120% against the Fury X's 100% makes the G1 20% faster than the Fury X.
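The normalization above can be sketched in a few lines (the percentages are the chart values quoted from the two reviews; the helper name is mine):

```python
# Sketch of the chart normalization argued above.

def pct_faster(a, b):
    """How much faster a is than b, in percent."""
    return (a / b - 1) * 100

# G1 vs. reference 980 Ti, from the G1 review chart (reference = 85%):
g1_vs_ref = pct_faster(100, 85)     # ≈ 17.65

# Same comparison read off the Fury X review chart (reference = 102%, G1 = 120%):
g1_check = pct_faster(120, 102)     # ≈ 17.65 (same ratio, different baseline)

# G1 vs. Fury X (Fury X = 100% in its own chart):
g1_vs_fury = pct_faster(120, 100)   # ≈ 20
```

The point being that a percentage read off a chart is always relative to that chart's baseline, so figures from two different reviews can only be compared after rebasing them to a common card.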


Casecutter said:


> Nothing that was a new paragraph, and new topic not directed at you, just comparing published data. While, that’s the part you "decry" as long winded?


Just seemed a bit defensive, and it hinged largely on the one review that had the Fury X pulling less power than the 980 Ti/Titan X (although compute is a totally different story).
While the differences are marginal in a lot of cases, the consensus is that the card (and system - which would also bring AMD's driver overhead / CPU load into the reckoning) isn't more frugal than the Titan X/980 Ti overall - although with certain game titles that may be the case. Hardware France, true to Damien's exacting test protocol, sourced a second Fury X in addition to the AMD-provided review sample. The differences are quite marked.





I might also add the sites that found the Fury X's power demand to be lower than the 980 Ti/Titan X:
*Tom's Hardware, Hardware Heaven (355W 980Ti, 350W Fury X)*
And the sites that found the Fury X's power demand to be greater than the 980 Ti/Titan X:
*TechPowerUp, Hardware France, Guru3D, bit-tech, Hot Hardware, ComputerBase, PC Perspective, Tech Report, HardOCP, Digital Storm, Tweaktown, Hexus, HardwareBG, SweClockers, Legit Reviews, Hardware.info, Eurogamer, Overclock3D, Forbes, Hardware Canucks, PC World, Hardwareluxx, and PCGH* (who also test power consumption with two games)


----------



## Casecutter (Jun 26, 2015)

HumanSmoke said:


> but your 15% is wrong.


When you started with 27%.... I'm less wrong!



HumanSmoke said:


> Just seemed a bit defensive


Nope, not me... I'm not someone saying the Fury X will ever be seen as "pulling less power", but it is an improvement (yes, some or a good portion of that is HBM) over Hawaii.  Not sure why you seem to put me in that group?

*Hardwareluxx* has in the past done some sophisticated power testing, and I would concur with their finding of 13% above a 980 Ti.  That's a lot better than W1zzard's data of 18%, though if I had to put a round number to it I'd say 15%.  Looking at Hardwareluxx and their OC, they got a really good 1185 MHz (13%) and saw a 9-10% bump in FPS in several titles, though power went up 17%.  So yes, Fiji should not have had an executive shooting his mouth off about OC'ing.  

Like I said earlier, at Tom's I can't find that they tested a reference 980 Ti, so not sure how they arrived at that. 
Day off tomorrow (so my Friday night), and a pleasant Southern California evening... done.


----------



## mirakul (Jun 26, 2015)

Given that Furmark was still used in this bench, what would be expected here? Do you guys even realize that the nVidia card consumes less power in Furmark than in games? Does that really suit being the maximum consumption test?

And even in the game test, what is the point of comparing peak power of different architectures? It should be the average consumption, as at Tom's. It's similar to max fps vs. avg fps; the latter is what matters.

Then people proceed to hail nVidia and boo AMD based on those flawed numbers, what a joke.
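For what it's worth, the avg-vs-peak distinction is easy to illustrate with toy numbers (entirely hypothetical samples, not measurements from either card):

```python
# Toy illustration: two cards can rank differently depending on whether you
# judge them by peak or by average power draw, which is why average is the
# more meaningful number for efficiency comparisons.

card_a = [210, 215, 220, 218, 212, 290]  # steady draw with one brief spike
card_b = [235, 240, 238, 242, 236, 244]  # consistently higher draw

avg_a, peak_a = sum(card_a) / len(card_a), max(card_a)  # 227.5 W avg, 290 W peak
avg_b, peak_b = sum(card_b) / len(card_b), max(card_b)  # ~239.2 W avg, 244 W peak

# Judged by peak alone, card A looks worse; judged by average, card B draws more.
```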


----------



## Arjai (Jun 26, 2015)

1st World issues, why do they amuse me?


----------



## rvalencia (Jun 26, 2015)

Ikaruga said:


> Isn't it possible that you are the one who suffers from fanboyism and is spreading FUD? That table can't be right... Maxwell 2 supports it for sure; more to that, iirc it's mandatory to support for every DX12 GPU.
> 
> 
> Yes, AMD cards will gain more of a speed increase from DX12, but we don't know how that will actually impact real-life gaming performance, and I'm so tired of this benchmark being linked on every tech site. Do you even realize it's an API *OVERHEAD*(!) test, or do you just see the bigger number and the longer bar?


3DMark's API overhead benchmark also tests the GPU's command I/O ports and pathways.


----------



## Vlada011 (Jun 26, 2015)

AMD loses to NVIDIA again. Again it's weaker than the two high-end GeForces.
I'm an NVIDIA fan, but I'm a little sad for AMD. Somehow, no matter what they try, they can't offer customers the same performance as NVIDIA.
Even several months after NVIDIA, they are unable to bring the same performance.
I would really like AMD to at least sell enough graphics cards to cover production and earn something to keep going.
NVIDIA's dominance is huge... and they know everything. The whole time they knew that AMD couldn't match even the slower GM200.
One more thing is very bad: customers love to OC, and cards are this popular only because of overclocking. The Fury X doesn't need 8+8 pin; it could work with 6+6 pin.
Anyway, customers can't OC more than 5-6%, maybe 10%, probably less in games. Meanwhile the GTX 980 Ti goes up 200-250 MHz, and the difference between them grows and grows...
Don't even talk about an overclocked TITAN X vs. an overclocked Fury X... I'm afraid that when everything settles down, the Fury X will sit exactly in the middle between the GTX 980 and GTX 980 Ti,
30-40% stronger than the R9 290X. People expect drivers to bring them a 10% improvement in every situation... That's impossible even in NVIDIA's best moments, let alone for AMD.
Customers expected one clear win, at least 2-3% more powerful than the TITAN X, or maybe even the same, just so nobody could say "yes, the TITAN X is stronger"... I mean, NVIDIA is better, and because I love EVGA I will continue with GeForce; I said that even in moments when I thought AMD would win. But there's no reason for anybody to wish bad things on AMD; I remember they served me well for 10 years, until Cayman. 
I still have good memories of Radeon from before they started losing to NVIDIA and GeForce became the better option for gaming, no doubt about that.


----------



## Vulpesveritas (Jun 26, 2015)

Vlada011 said:


> AMD loses to NVIDIA again. Again it's weaker than the two high-end GeForces.
> I'm an NVIDIA fan, but I'm a little sad for AMD. Somehow, no matter what they try, they can't offer customers the same performance as NVIDIA.
> Even several months after NVIDIA, they are unable to bring the same performance.
> I would really like AMD to at least sell enough graphics cards to cover production and earn something to keep going.
> ...




Nvidia has what, 20+ times the budget AMD has?  And AMD has to divide their budget across more things than Nvidia, with fewer engineers? 

I'd say AMD is doing well to offer what they do, all things considered.


----------



## rvalencia (Jun 26, 2015)

Vlada011 said:


> AMD loses to NVIDIA again. Again it's weaker than the two high-end GeForces.
> I'm an NVIDIA fan, but I'm a little sad for AMD. Somehow, no matter what they try, they can't offer customers the same performance as NVIDIA.
> Even several months after NVIDIA, they are unable to bring the same performance.
> I would really like AMD to at least sell enough graphics cards to cover production and earn something to keep going.
> ...




The Fury X is competitive against the 980 Ti at very high resolutions. Remember, AMD hasn't enabled DX11 MT drivers on Windows 8.1.

I'd rather see benchmarks done on Windows 10; e.g. Project Cars gets a frame rate uplift on the R9 280 under Windows 10. Windows 10 forces DX11 MT.


----------



## HumanSmoke (Jun 26, 2015)

Vulpesveritas said:


> Nvidia has what, 20+ times the budget AMD has? And AMD has to divide their budget for more things than Nvidia


No and Yes.
No, Nvidia doesn't have 20+ times the budget. Nvidia's R&D for the past year amounted to $1.36bn; AMD's R&D came to $1.04bn.
Yes, the lion's share of AMD's R&D should be going into K12 and Zen (especially the latter) and the platform as a whole, but the split wouldn't be anywhere close to 10:1 in favour of processors.


Vulpesveritas said:


> I'd say AMD is doing well to offer what they do, all things considered.


Too early to say. Fiji isn't going to make or break the company. Sales volumes for $650 cards aren't particularly high, and I'm estimating that it costs AMD a hell of a lot more to build a Fury X than it does for Nvidia and its partners to churn out GTX 980 Tis. With the Fury X missing the halo of "world's fastest GPU", they will need to get a dual-GPU card up and running very quickly - but that again will be a double-edged sword. Two expensive high-performance GPUs will need at least a 240 mm radiator - and it all adds to the bill of materials. Somehow I don't see the company turning their market share around much unless they sacrifice the average selling prices of the rest of the Fury line - and the product stack under it.
If the company gets Zen out the door in a timely manner, and the platform lives up to AMD's PR (not a given, based on recent history), they should/could be OK. If Zen flops and/or is late out of the gate, the AMD Financial Analyst Day in 2017 might look like this.




----------



## rvalencia (Jun 26, 2015)

HumanSmoke said:


> No and Yes.
> No, Nvidia doesn't have 20+ times the budget. Nvidia's R&D for the past year amounted to $1.36bn. AMD's R&D came to $1.04bn
> Yes, the lions share of AMD's R&D should be going into K12 and Zen (especially the latter) and the platform as a whole, but the split wouldn't be anywhere close to 10:1 in favour of processors.
> 
> ...


According to http://www.streetwisereport.com/adv...t-intel-nvidia-corporation-nasdaqnvda/120113/

Qualcomm expects to make an M&A (merger or acquisition) offer for AMD. Qualcomm has already kicked NVIDIA out of mobile phones.


----------



## HumanSmoke (Jun 26, 2015)

rvalencia said:


> According to http://www.streetwisereport.com/adv...t-intel-nvidia-corporation-nasdaqnvda/120113/
> 
> Qualcomm expects to make an M&A (merger or acquisition) offer for AMD. Qualcomm has already kicked NVIDIA out of mobile phones.


Qualcomm could fund the buyout from pocket change, but I'll believe it when I see it. Seems like every bad financial quarter brings a rumour of someone buying AMD (Samsung, BLX, Xilinx, and the list goes on). AMD's stock price then magically firms up just before the earnings call.


----------



## Bytales (Jun 26, 2015)

Lots of negative comments here, i might add.
What i haven't seen are any DX12 benchmarks, because i have a feeling this little pinky is going to blow everything out of the water. Because in the end dx12 is the future, not dx11 (which is what these benchmarks are).
Although, to be sincere, i might add that we must wait at least a generation before dx12 becomes mainstream.


----------



## mroofie (Jun 26, 2015)

mirakul said:


> With the fact that Furmark was still used in this bench, what would be expected here? Do you guys even realize that nVidia card consumes less power in Furmark than in game? Does that really suit to be the maximum consumption test?
> 
> And even in game test, what is the point of comparing peak power of different architectures? It should be the average consumption as in Tom's. It's similar to max fps and avg fps, the later is what mater.
> 
> Then people process to hail nVidia and boo AMD based on that fail numbers, what a joke.


whaaat ??


----------



## W1zzard (Jun 26, 2015)

mirakul said:


> And even in game test, what is the point of comparing peak power of different architectures? It should be the average consumption as in Tom's. It's similar to max fps and avg fps, the later is what mater.


you do realize that we have "average" gaming power draw numbers in our reviews? they're clearly marked as "average", and we provide additional data points for interested readers


----------



## haswrong (Jun 26, 2015)

Bytales said:


> Lots of negative comments here, i might add.
> What i haven't seen are any DX12 benchmarks, because i have a feeling this little pinky is going to blow everything out of the water. Because in the end dx12 is the future, not dx11 (which is what these benchmarks are).
> Although, to be sincere, i might add that we must wait at least a generation before dx12 becomes mainstream.


it's probably the lousy 64 compute units (when as many as twice that were expected by the crowd). *shrug*
amd ends up creating average console hardware in a place where pc enthusiast hardware is craved. 
so for the true technology-progress enthusiasts, it's a bit of a letdown after what nvidia pulled off with their flexible overclocking, and after so much waiting for amd to deliver on their heap of promises.


----------



## Frick (Jun 26, 2015)

haswrong said:


> amd ends up creating average console hardware in a place where pc enthusiast hardware is craved.



The Fury X is not average console hardware, if that's what you mean.


----------



## Lord Nemesis (Jun 26, 2015)

Personally, the main problem I see with this card is the use of HBM memory. HBM, with its current technical limitation (4 GB), is not ready for the big time on a flagship card. AMD pitched this as the pinnacle of cards for UHD, but did nothing to support that. The lack of an HDMI 2.0 port is one issue, but the more important thing is the 4 GB of HBM they put on this card.

What is the actual driving force for desiring UHD gaming? It is not just the screen resolution. With four times the pixels of FHD, we would like to use those extra pixels to present more detail, and that comes from higher-resolution, more detailed textures. What we see as UHD gaming today is just games being run at UHD resolution with the same textures that were designed for FHD: in place of one pixel in FHD, we have four pixels with the same shade in UHD, which is quite pointless. The goal of UHD will be realized when we start getting more detailed textures for UHD, and such textures are going to occupy a lot of VRAM.
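That texture math can be roughed out quickly (illustrative sizes and formats only; the 1 byte/texel figure stands in for BC7-class block compression, and the texture count is a made-up example, not any specific game's asset budget):

```python
# Back-of-envelope VRAM footprint of high-resolution textures.

def texture_mib(width, height, bytes_per_texel, mipmaps=True):
    """Approximate footprint of one texture; a full mip chain adds ~1/3 on top."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / 2**20

rgba = texture_mib(4096, 4096, 4)  # uncompressed RGBA8: ~85.3 MiB with mips
bc = texture_mib(4096, 4096, 1)    # block-compressed:   ~21.3 MiB with mips

# A scene streaming ~150 such compressed textures already tops 3 GiB,
# before geometry, render targets and the frame buffer are counted:
scene_gib = 150 * bc / 1024        # ~3.1 GiB
```

Even with aggressive compression, a library of genuinely UHD-grade textures closes in on a 4 GB budget fast, which is the crux of the argument above.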

No amount of PR about driver optimizations, texture compression and the high bandwidth offered by HBM can sidetrack the fact that 4 GB is not going to be enough in the long run once higher-resolution textures come into the picture. It may be true that VRAM is not utilized efficiently today and that there might be ways to optimize drivers to make allocations better, but that applies only to the present situation, while there is headroom for such optimizations. Once true UHD-optimized games with higher-resolution textures start coming out, they simply will not fit in 4 GB of VRAM, and once swapping from main memory starts, you all know the bottleneck is going to kill performance. 

Either AMD is banking on the fact that most multi-platform games may not yet offer super-high-resolution textures for UHD in their PC versions, or that for the games that do, they can compromise texture quality in the driver or ask the developer to fall back to lower-resolution textures for these cards. 

Even if the card has the raw compute power for UHD, they have crippled it by pairing it with 4 GB of HBM. I would have much preferred they pair it with 8 GB of GDDR instead; they should have waited on HBM until they could ship 8 GB configurations. Pair it with 8 GB of GDDR, replace the water cooling with a regular air-cooled design, and market it at $500 or even $550, and this would have killed the 980 and 980 Ti on value for money. This was never a UHD-ready card to begin with. I consider the 390X to be more of a UHD-ready card than the Fury X. 

There never was any need for them to be king of the hill in terms of performance; they could have claimed it in terms of value for money, as they have done in the past. My last 3 GPU purchases, and most of my GPU purchases overall, were AMD for this reason, but this time I went for a GTX 980 after 10 years of not using an nVidia card, because it offered more value for my money when I bought it. Personally I never cared about PhysX or any of the nVidia-specific stuff, but the overall value justified the purchase. 

Can the Fury X compete with the 980 Ti at the same price point? Not unless it beats the 980 Ti in 95% of games with at least a 7.5-10% performance margin, which currently is not the case. 74% of the gaming GPU market is currently owned by nVidia, a vast majority of AAA titles use GameWorks features that add value for nVidia GPU users, and it doesn't help that AMD fails to optimize their drivers and the games for their own GPUs, resulting in inferior performance. Personally I think all this talk about GameWorks being some sort of cheating or blocking out AMD is nothing more than nonsense, or a case of sour grapes. AMD should be proactive and responsible for working with game developers to tweak their games for their GPUs. Further, with 6 GB of VRAM and an HDMI 2.0 port, the 980 Ti is somewhat more UHD-ready than the Fury X. 

AMD should drop the price by $100 to compete, and maybe even make a Fury X version without the water cooling (even more preferable would be to drop the 4 GB of HBM in favour of 8 GB of GDDR).


----------



## Aquinus (Jun 26, 2015)

Lord Nemesis said:


> Personally, the main problem I see with this card is the use of HBM memory. HBM, with its current technical limitation (4 GB), is not ready for the big time on a flagship card. AMD pitched this as the pinnacle of cards for UHD, but did nothing to support that. The lack of an HDMI 2.0 port is one issue, but the more important thing is the 4 GB of HBM they put on this card.
> 
> What is the actual driving force for desiring UHD gaming? It is not just the screen resolution. With four times the pixels of FHD, we would like to use those extra pixels to present more detail, and that comes from higher-resolution, more detailed textures. What we see as UHD gaming today is just games being run at UHD resolution with the same textures that were designed for FHD: in place of one pixel in FHD, we have four pixels with the same shade in UHD, which is quite pointless. The goal of UHD will be realized when we start getting more detailed textures for UHD, and such textures are going to occupy a lot of VRAM.
> 
> ...


I suspect memory bandwidth has very little to do with why it turned out the way it did. I think because of the performance numbers, it's not unlikely to say that the R9 Fury X is starved for ROPs.


----------



## Lord Nemesis (Jun 26, 2015)

^^ Yes, the raw performance would definitely benefit from more ROPs. But my argument was specifically about the quantity of memory they put in because they forced themselves to use HBM. They have essentially crippled the card by choosing HBM, with its current technical limitation of 4 GB, instead of a higher amount of GDDR.

HBM would actually make sense when games have vast amounts of textures and other data to deal with, where higher bandwidth helps move or manipulate it faster. Having to reduce the quantity by opting for HBM kind of defeats the purpose of having all that bandwidth in the first place. They should have reserved its use for the next refresh, when they would have larger HBM modules allowing 8 GB or 12 GB.

I highly doubt the current performance numbers at sub-FHD or QHD would have been any different if GDDR had been used in place of HBM.


----------



## HumanSmoke (Jun 26, 2015)

Lord Nemesis said:


> ^^ Yes, the raw performance would definitely benefit from more ROPs. But my argument was specifically about the quantity of memory they put in because they forced themselves to use HBM. They have essentially crippled the card by choosing HBM, with its current technical limitation of 4 GB, instead of a higher amount of GDDR.


The only problem with that scenario is that Fiji would be a totally different GPU. The only way AMD could get 4096 ALUs into the chip was that a large amount of the die space usually reserved for GDDR5 memory controllers and I/O could be devoted to the core(s). If Fiji used GDDR5, it would have just made sacrifices elsewhere. The increased power demand of GDDR5 would have meant the GPU would be clocked lower to compensate. A single Tonga chip has a 384-bit bus feeding 2048 cores/128 TAU/32 ROPs. Doubling the core components but only increasing the bus width by a third (to 512-bit) starves the GPU of bandwidth, while going any larger puts the die well outside manufacturability due to the size limits of the lithography. 


Lord Nemesis said:


> HBM would actually make sense when games have vast amount of textures and other data to deal with where having higher bandwidths will help move or manipulate it faster.


Well, that's just AMD being ahead of the curve. There's always a price to pay when a new technology arrives and its first iteration is not appreciably better than the incumbent. Put it down to the price of progress. AMD needed high bandwidth and low latency for HSA and to compete with Intel's own eDRAM and HMC roadmap. There is way more at stake here than just a consumer graphics card and AMD had to commit to HBM to accelerate its development.


----------



## EarthDog (Jun 26, 2015)

Lord Nemesis said:


> They have essentially crippled the card by choosing to go with HBM with its current technical limitation of 4GB instead of higher amount of GDDR.


No, they did not. On the surface it looks like it but, like NVIDIA with their "ZOMG 256-bit bus, wth are you doing", they have better compression algos. Not to mention they looked very deeply at vRAM allocation and found, according to AMD, that 70% of vRAM is not used efficiently... so it's fine. You can see that it tends to pull away from the 980 Ti at higher resolutions compared to low res too. If vRAM were a limit, that wouldn't be happening. Look at the 780 Ti vs. 290X for example.


----------



## Lord Nemesis (Jun 26, 2015)

^^ That is because none of these games have properly hit the VRAM limit so far. We currently don't have many games that max out 4 GB of VRAM, but 6 months down the line there will be new games that do just that. 

Further, this issue of inefficient VRAM usage applies only to an extent. Are they seriously trying to tell us that each and every game wastes 70% of the memory it allocates? That's more than likely the worst-case scenario. Also, is AMD really going to hand-tweak VRAM allocations for each individual game in their drivers? We all know how well AMD is handling their game-specific driver tweaking currently. If AMD is so confident that their optimizations will allow the Fury X to run with just 4 GB where another card would require 5 or 6 GB, why didn't they release the 390X with 4 GB and use those optimizations, instead of putting 8 GB of VRAM on those cards? Nothing in their claims of inefficiency, or their strategy for optimization, suggests it has to be memory-technology-specific. If inefficient memory usage is an issue and they have a solution to optimize it at the driver level, they should be able to do it for GDDR-based cards as well.

Lastly, what about the case where a game's compressed textures and other data go above the 4 GB limit, beyond the reach of any optimization? That should not be hard to hit for a game with very high resolution textures used at UHD resolutions.

One reviewer tested GTA V @ UHD after tweaking the settings so VRAM usage went over 4 GB, and the Fury X dropped performance severely compared to the 980 Ti at the same settings. Once he tweaked the settings to push VRAM above 6 GB, the 980 Ti also dropped severely.


----------



## Vlada011 (Jun 26, 2015)

rvalencia said:


> Fury X is competitive against 980 Ti at very high resolution. Remember, AMD hasn't enabled DX11 MT drivers in Windows 8.1.
> 
> I rather see benchmarks done on Windows 10 i.e. Project Cars has frame rate uplift on R9-280 on Windows 10. Windows 10 forces DX11 MT.



I can't agree with that, and most customers will not believe it. They simply want more video memory on paper... I'm afraid they won't believe AMD that 4 GB is enough for the future.
Imagine someone who plans to buy a 1440p monitor, or for example the ASUS 3800R... 4 GB is suicide. Then it's better to wait one more year until AMD offers a Fury X with 8 GB. They always make rebrands.
AMD's chance is the R9 390X 8 GB. Not as a single card, but in multi-GPU that configuration becomes better than GTX 980 SLI. Maybe the GTX 980 is better if someone has a 600 W PSU, a 1080p display and no plans for more.
But for anything more, Hawaii CF is better; the only problem is price. If 4 GB is not enough for the Fury X, then the situation is even worse with the GTX 980. And that segment holds the biggest part of the gaming community: people who pay under $500, or rather $300-400, for a graphics card, because they will want more than 4 GB. That's AMD's chance; NVIDIA can only offer them $600-650 cards with over 4 GB. But to believe the Fury X will hold up at higher resolutions without fps drops for the next 2 years is a stretch, especially since AMD owners usually keep their graphics cards longer than NVIDIA customers. The situation is even worse because NVIDIA dictates to developers what to do: the 980 Ti has 6 GB, they plan 8 GB for Pascal, and they will push games with higher video memory requirements to drive people to upgrade and towards the TITAN X. NVIDIA surely plans an efficient Pascal with HBM and 8 GB of video memory, only a little stronger than the TITAN X but very expensive, and 8 GB will be the reason to upgrade from the GTX 980 Ti. What are people to do with 4 GB? New cards from AMD won't come for 18 months.


----------



## btarunr (Jun 26, 2015)

This GPU with 8 GB 512-bit GDDR5 would've been 375-400W typical board power (educated guess). GCN 1.1 to GCN 1.2 wasn't as big a perf/Watt leap as Kepler to Maxwell. That's probably why this whole HBM adventure was unavoidable.

Edit: Now I'm really curious what this GPU would have been like with 8 GB of 512-bit GDDR5.


----------



## EarthDog (Jun 26, 2015)

Lord Nemesis said:


> ^^ That is because none of these games have properly hit the VRAM limit so far. We currently don't have many games that max out 4 GB of VRAM, but 6 months down the line there will be new games that do just that.
> 
> Further, this issue of inefficient VRAM usage applies only to an extent. Are they seriously trying to tell us that each and every game wastes 70% of the memory it allocates? That's more than likely the worst-case scenario. Also, is AMD really going to hand-tweak VRAM allocations for each individual game in their drivers? We all know how well AMD is handling their game-specific driver tweaking currently. If AMD is so confident that their optimizations will allow the Fury X to run with just 4 GB where another card would require 5 or 6 GB, why didn't they release the 390X with 4 GB and use those optimizations, instead of putting 8 GB of VRAM on those cards? Nothing in their claims of inefficiency, or their strategy for optimization, suggests it has to be memory-technology-specific. If inefficient memory usage is an issue and they have a solution to optimize it at the driver level, they should be able to do it for GDDR-based cards as well.
> 
> ...


There are plenty of games that will smash 4 GB at 4K resolutions. 

The issue isn't at the game level, it is at the API level, so it is game-agnostic from what they said. As for the second part of that paragraph, the 390X is a 290X with higher clocks and 8 GB; it doesn't have the new compression algos AFAIK. I was at the release in L.A. and talked with them about this. 

As for compressed textures... it would do the same thing any other card does, I would imagine, in that it 'pages out' to system RAM. And yes, that performance drop after running out of vRAM is normal... I believe I am missing your point there.



btarunr said:


> Edit. Now I'm really curious to know what this GPU would have been like with 8 GB 512-bit GDDR5.


Remarkably similar. Memory bandwidth isn't really a limiting factor until 4K+. Power draw would have been a bit higher, though not as much as the estimates above guessed.


----------



## CrAsHnBuRnXp (Jun 26, 2015)

I'm not about to click through 13 pages, but has @xfia tried to come in and defend AMD yet? 

In all seriousness though, as much as I like nVIDIA for my video cards (not a fanboy, I just like their features like ShadowPlay, and their drivers are just all-around better and more plentiful, though I will go to either side of the fence if the performance is there), when will people realize that AMD just isn't what they were 10-12 years ago? They just cannot spank Intel/nVIDIA like they used to.


----------



## Aquinus (Jun 26, 2015)

CrAsHnBuRnXp said:


> I'm not about to click through 13 pages, but has @xfia tried to come in and defend AMD yet?
> 
> In all seriousness though, for as much as I like nVIDIA for my video cards (I'm not a fanboy; I just like their features like Shadowplay, their drivers are just all-around better and more plentiful, and I will go to either side of the fence if the performance is there), when will people realize that AMD just isn't what they were 10-12 years ago? They just cannot spank Intel/nVIDIA like they used to.


Stop trying to start an argument, you're baiting him and you know it.


----------



## Lord Nemesis (Jun 26, 2015)

My point is that paging from system RAM is more likely to happen sooner with a card equipped with 4 GB of VRAM than one with 6 GB, 8 GB or 12 GB, regardless of compression techniques and optimizations. Optimizing resource utilization is definitely important, and I don't think nVidia ignores it either, but there would be a limit to how much can be achieved through sheer optimization. Optimizations are not always a replacement for having a sufficient amount of resources.  

Console games are often heavily optimized for the hardware they run on. But the lower quantity of resources like VRAM means that after a point, they also end up having to make compromises that impact the final quality.


----------



## CrAsHnBuRnXp (Jun 26, 2015)

Aquinus said:


> Stop trying to start an argument, you're baiting him and you know it.


I'm actually not trying to bait anyone. It is my legitimate viewpoint.


----------



## Ikaruga (Jun 26, 2015)

rvalencia said:


> 3DMark's API overhead benchmark also tests the GPU's command I/O ports and pathways


Yes, but how is that relevant, and how does it argue with what I said? Command ports and pathways are needed to test the overhead of the API. 

As I said, AMD cards will see a bigger speed increase from DX12 (obviously, since Microsoft makes it, and there are not one but two consoles on the market with AMD GPUs), *but we just don't know yet* (or those who do are still under NDA) how much that speed increase will transfer to real-life performance increases in games (so I'm not talking about benchmarks here but actual game performance)... and, I believe, those who think that DX12 will magically make their GPU twice as fast are gonna get a rough wake-up call.


----------



## Caring1 (Jun 26, 2015)

CrAsHnBuRnXp said:


> I'm not about to click through 13 pages, but has @xfia tried to come in and defend AMD yet?


I believe he or she was given a little holiday recently.


----------



## EarthDog (Jun 26, 2015)

Lord Nemesis said:


> My point is that paging from system RAM is more likely to happen sooner with a card equipped with 4 GB of VRAM than one with 6 GB, 8 GB or 12 GB, regardless of compression techniques and optimizations. Optimizing resource utilization is definitely important, and I don't think nVidia ignores it either, but there would be a limit to how much can be achieved through sheer optimization. Optimizations are not always a replacement for having a sufficient amount of resources.
> 
> Console games are often heavily optimized for the hardware they run on. But the lower quantity of resources like VRAM means that after a point, they also end up having to make compromises that impact the final quality.


And my point is that with their optimizations, it doesn't arrive as fast as you seem to expect. After a point, you are correct, but because of their optimizations, that point is further down the line than their previous generations.
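That argument can be sketched with back-of-the-envelope numbers (the 0.8 efficiency figure below is an invented illustration, not an AMD claim): optimizations move the paging threshold out, but a larger framebuffer still hits it later.

```python
# Back-of-envelope: the raw allocation (GB) at which a card starts
# paging, given its VRAM size and an assumed "efficiency" factor,
# i.e. the fraction of the raw allocation that must actually be
# resident. The 0.8 figure is a made-up illustration, not a real number.

def paging_threshold_gb(vram_gb: float, resident_fraction: float) -> float:
    """Raw allocation at which resident data no longer fits in VRAM."""
    return vram_gb / resident_fraction

naive_4gb = paging_threshold_gb(4.0, 1.0)   # no optimization
tuned_4gb = paging_threshold_gb(4.0, 0.8)   # assumed 20% savings
naive_6gb = paging_threshold_gb(6.0, 1.0)

print(f"4 GB, no optimization: pages beyond {naive_4gb:.1f} GB allocated")
print(f"4 GB, 20% savings    : pages beyond {tuned_4gb:.1f} GB allocated")
print(f"6 GB, no optimization: pages beyond {naive_6gb:.1f} GB allocated")
```

Which is both posters' point in one picture: the optimized 4 GB card behaves like a bigger one, but a genuinely bigger framebuffer with the same tricks still pushes the cliff further out.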


----------



## Aquinus (Jun 26, 2015)

Lord Nemesis said:


> My point is that paging from system RAM is more likely to happen sooner with a card equipped with 4 GB of VRAM than one with 6 GB, 8 GB or 12 GB, regardless of compression techniques and optimizations. Optimizing resource utilization is definitely important, and I don't think nVidia ignores it either, but there would be a limit to how much can be achieved through sheer optimization. Optimizations are not always a replacement for having a sufficient amount of resources.
> 
> Console games are often heavily optimized for the hardware they run on. But the lower quantity of resources like VRAM means that after a point, they also end up having to make compromises that impact the final quality.


That's why I only recently started having issues with having only 1GB on my 6870s at 1080p, right? While I think you're right, I also think you're wrong. More VRAM is most definitely getting used, and going into system memory for textures is a little different: if the textures in memory aren't accessed often, you may be running at 60FPS (like I am in Elite Dangerous) but occasionally get a blip because something was either used or got paged in and something else paged out. It's still playable, but it's sometimes annoying if it happens at just the wrong time. It also depends on the game as well.

Either way, when push comes to shove, 4GB is plenty now and probably will stay that way for at least a couple of years, unless you're one of those people who pushes AA as high as it can go on a shiny new 4K display, in which case I can't say you're the typical gamer.


----------



## Initialised (Jun 26, 2015)

I wanted to want this card, but I think AMD should have had two SKUs, one with the CLLC and one with a full cover, single slot waterblock.

How well the card competes with the 12GB Titan X at 4K shows that (for now at least) the extra 8GB are just for show.


----------



## Xzibit (Jun 26, 2015)




----------



## Bansaku (Jun 26, 2015)

And still my HD7950 in CFX performs better than the Fury X......


----------



## Xzibit (Jun 27, 2015)




----------



## tacosRcool (Jun 27, 2015)

I know the Fury X is slower than the GTX 980 Ti, but for the price you do get a water cooler. I say that isn't that bad, taking that into account.


----------



## ensabrenoir (Jun 27, 2015)

Xzibit said:


>







Not talking about any electric bill either..... that thing is borderline evil..... gotta know what kinda numbers it puts out........


----------



## Xzibit (Jun 27, 2015)

*@TheMattB81*


----------



## Sihastru (Jun 27, 2015)

These days, 3DMark doesn't say anything anymore. In actual games, I think the fourth card usually means worse performance than three, and sometimes even than two cards!


----------



## Deleted member 138597 (Jun 27, 2015)

Okay, this review by ixbit with a new driver shows a very different story. The Fury X is neck and neck with the TITAN X and 980 Ti, even at sub-4K. Is this actually correct, or biased in some way?
https://translate.googleusercontent....shtml&usg=ALkJrhhnmEOoMRKsDO6nalghr9D1qAWVCA


----------



## Aquinus (Jun 27, 2015)

Sihastru said:


> These days, 3DMark doesn't say anything anymore. In actual games, I think the fourth card usually means worse performance than three, and sometimes even than two cards!


I think that's pushing it a bit. CFX has either not worked or resulted in better performance; I've never encountered a situation where I got less performance from CFX. Not to say those situations don't exist, I just don't think they're as common as you think, except for day-0 games, in which case drivers naturally need to catch up. People demonize CFX and SLI more than they deserve. The fact that it's only the 1GB framebuffer holding my 6870s back says (in my opinion) good things about scaling.

Mind you, this doesn't mean CFX is perfect. I just think it was a worthwhile investment for a cheaper upgrade at the time, and I would do it again.


Shamonto Hasan Easha said:


> Okay, this review by ixbit with a new driver shows a very different story. The Fury X is neck and neck with the TITAN X and 980 Ti, even at sub-4K. Is this actually correct, or biased in some way?
> https://translate.googleusercontent....shtml&usg=ALkJrhhnmEOoMRKsDO6nalghr9D1qAWVCA


I wouldn't be the slightest bit surprised if AMD had some tweaking they could do with Fiji. I'm sure after a few driver tweaks, if there are substantial gains or requests for it, W1zz might consider another review. As it stands right now, though, it seems to keep up with the best of them just fine. Plus, you're probably not getting a Fury X if your intention is to play at 1080p60, and even if you were, I don't think you would be disappointed with the performance.


----------



## Sihastru (Jun 27, 2015)

I was referring to 4x CFX as opposed to 3x or 2x CFX. 2x and 3x CFX have good scaling; 4x CFX has crappy scaling (sometimes negative compared to 3x CFX). Same with SLI.


----------



## dwade (Jun 28, 2015)

So how will the R9 Nano be "significantly faster" than a 290X when an overclocked 290X (390X) is not much slower than the Fury X?


----------



## the54thvoid (Jun 28, 2015)

Xzibit said:


> *@TheMattB81*



Just for comparison and nothing more.

Graphics score of 4x Fury X = 15307      CPU score 23810
Graphics score of 4x 980 Ti = 17748      CPU score 21859

Overall Fury X = 13605
Overall 980 Ti = 16122

The 980 Tis looked to be mildly overclocked, but nothing extreme. This was the second-lowest four-card setup. The lowest 980 Ti setup only managed 13xxx on graphics score, but the GPUs were marked at 1000 MHz?

Clock for clock, the Fury X is a winner, but Maxwell's boost makes clock-for-clock unlikely in the real world.

EDIT: FTR, the lowest single-card score in the Ultra top 100 is 5102; dual card is 8805.  

Any chance AMDMatt could OC those cards and rebench? Far better comparison that way.
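For reference, the gaps between the scores quoted above work out as follows (simple arithmetic on this post's numbers):

```python
# Percentage gap between the quad-card 3DMark scores quoted above.
fury_gfx, ti_gfx = 15307, 17748   # graphics scores
fury_all, ti_all = 13605, 16122   # overall scores

gfx_gap = (ti_gfx / fury_gfx - 1) * 100
all_gap = (ti_all / fury_all - 1) * 100

print(f"4x 980 Ti graphics lead: {gfx_gap:.1f}%")   # ~15.9%
print(f"4x 980 Ti overall lead : {all_gap:.1f}%")   # ~18.5%
```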


----------



## rvalencia (Jun 29, 2015)

Ikaruga said:


> Yes, but how is that relevant, and how does it argue with what I said? Command ports and pathways are needed to test the overhead of the API.
> 
> As I said, AMD cards will see a bigger speed increase from DX12 (obviously, since Microsoft makes it, and there are not one but two consoles on the market with AMD GPUs), *but we just don't know yet* (or those who do are still under NDA) how much that speed increase will transfer to real-life performance increases in games (so I'm not talking about benchmarks here but actual game performance)... and, I believe, those who think that DX12 will magically make their GPU twice as fast are gonna get a rough wake-up call.


Well, AMD gains multi-threading (MT) improvements with Windows 10; Nvidia has already used up their DX11 MT gains.


----------



## mroofie (Jun 29, 2015)

Aquinus said:


> I think that's pushing it a bit. CFX has either not worked or resulted in better performance; I've never encountered a situation where I got less performance from CFX. Not to say those situations don't exist, I just don't think they're as common as you think, except for day-0 games, in which case drivers naturally need to catch up. People demonize CFX and SLI more than they deserve. The fact that it's only the 1GB framebuffer holding my 6870s back says (in my opinion) good things about scaling.
> 
> Mind you, this doesn't mean CFX is perfect. I just think it was a worthwhile investment for a cheaper upgrade at the time, and I would do it again.
> 
> I wouldn't be the slightest bit surprised if AMD had some tweaking they could do with Fiji. I'm sure after a few driver tweaks, if there are substantial gains or requests for it, W1zz might consider another review. As it stands right now, though, it seems to keep up with the best of them just fine. Plus, you're probably not getting a Fury X if your intention is to play at 1080p60, and even if you were, I don't think you would be disappointed with the performance.



"_*if your intention is to play at 1080p60 and even if you were, I don't think you would be disappointed with the performance*_"

Did you even read the review ? 





dwade said:


> So how will the R9 Nano be "significantly faster" than a 290X when an overclocked 290X (390X) is not much slower than the Fury X?



Finally, someone with brains.  

If the Nano is exactly like the Fury X, then prepare to weep, because it will be slower than a GTX 970 and will have a possible $450 price tag (not really value for money, in my opinion).  



Aquinus said:


> Yes I did. Don't be a smart ass. For a GPU that falls between a 980 and 980 Ti in most cases, I would say it would run a 1080p game just fine. Better than the other options? Sure, but that doesn't mean that it's not usable for 1080p or that it isn't any good.



I agree with the last line


----------



## Aquinus (Jun 29, 2015)

mroofie said:


> "_*if your intention is to play at 1080p60 and even if you were, I don't think you would be disappointed with the performance*_"
> 
> Did you even read the review ?


Yes I did. Don't be a smart ass. For a GPU that falls between a 980 and 980 Ti in most cases, I would say it would run a 1080p game just fine. Better than the other options? Nah, but that doesn't mean that it's not usable for 1080p or that it isn't any good.


----------



## EarthDog (Jun 29, 2015)

> So how will the R9 Nano be "significantly faster" than a 290X when an overclocked 290X (390X) is not much slower than the Fury X?


Looks to me like, overall, the Fury X is anywhere from 27% to 21% faster than a 290X (4K to 1080p), looking at the TPU reviews. I would assume the Fury and Nano will fall right in between those values, perhaps 10% slower.

You can also overclock the core on the Fury X, and I would assume on the Fury. Now, you can't touch the memory (so what, it's 4096-bit!) and the core headroom is not as much, but it's clear they have their place. The 390X will be several percent behind the Fury and Nano (as I understand it, the Nano will have Fury-like performance). It's a tight fit, no doubt, but there is room there.

Someone was also able to overclock the RAM on the Fury X, so there is that as well (assuming it trickles down to the Fury, of course).
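Chaining those guesses together (both figures are this post's assumptions, not measurements), the Nano would still land comfortably ahead of a stock 290X:

```python
# Where the Nano would land vs. a 290X if the guesses above hold:
# Fury X roughly 21-27% ahead of the 290X (midpoint used here),
# Nano "perhaps 10% slower" than the Fury X.
fury_x_vs_290x = 1.24   # assumed midpoint of the 21-27% range
nano_vs_fury_x = 0.90   # assumed 10% deficit

nano_vs_290x = fury_x_vs_290x * nano_vs_fury_x
print(f"Nano vs 290X: {(nano_vs_290x - 1) * 100:+.1f}%")
```

So even with the pessimistic factor applied, "significantly faster" than a stock 290X remains plausible, which is the tight-but-workable fit described above.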


----------



## xenocide (Jun 30, 2015)

I think the Nano will be the biggest disappointment in this new series of cards from AMD. They made some bold claims with the Nano, and even the believable claims about the Fury X seem to have all been misleading or wrong.


----------



## Mobius Pizza (Jun 30, 2015)

Vayra86 said:


> Like many others, I am extremely disappointed by this release.
> 
> I am also having a lot of trouble justifying the relatively high score in the review. 9.2 for this card? It is underwhelming across the board:
> 
> ...



I can explain your surprise at why 4096 shaders seem to result in poor performance. It is because GCN's architecture is very different from Nvidia's Maxwell. In the former, all shader tasks are converted to compute tasks, whereas Maxwell has dedicated, specialized units for each task. There are pros and cons to each approach; for example, Maxwell doesn't need to translate commands into compute tasks and hence has lower latency (and higher fps) per transistor. GCN, however, is more flexible and can potentially benefit a lot more when transitioning to asynchronous shaders and DirectX 12.


----------



## AsRock (Jun 30, 2015)

GreiverBlade said:


> Yes... but I fail to see your point... As for power consumption, "ooohhh, it will cost me more on my yearly electricity bill" (how much?). Also, the 980 Ti has 2GB more (but GDDR5, not HBM, so the cost could still be the same).
> So what are you trying to say? That the card is a failure because it doesn't beat or scratch a Titan X, while costing 400 less? Then the argument should also hold for the 980 Ti vs. the Titan X?
> 
> OK, I am finished with this thread  enough is enough  and time will tell us more funny stories



I know ya said ya done with this thread, but AMD did manage to also put a water cooler on the card for about the same price as the 980 Ti, which only has an air cooler on it.

And tbh the Fury X doesn't do too badly; it's just a shame it's too early for me to need the upgrade. If anything bites about this card, it's the HDMI limitation; however, if you turn off vsync you'll get more than 30fps anyway. I would have to downclock the thing; in fact, I have to downclock my 290X, as it easily does 1080p gaming at 30-60 even at 700MHz on the core.


----------



## zoomer-fodder (Jul 2, 2015)

Hello, I tested Crysis 3 at 4K resolution with FXAA, fully maxed out [GTX 780 SLI], and this is what I got:


----------



## EarthDog (Jul 2, 2015)

What is your point? Anything in English?


----------



## EarthDog (Jul 2, 2015)

xenocide said:


> I think the Nano will be the biggest disappointment in this new series of cards from AMD.  They made some bold claims with the Nano, and even the believable claims about the Fury X seems to have all been misleading or wrong.


I actually believe the exact opposite...

We will see soon enough.


----------



## dinmaster (Jul 4, 2015)

Really want to see an Eyefinity review; haven't seen one yet. If anyone has, please let me know. I want to see 5760x1080: not quite 4K, but a lot of pixels nonetheless. I've seen 4K and Crossfire reviews, which are cool and kind of what I'm looking for. I want to run two cards, and apparently at 4K, 8GB of HBM (4 x 2) helps a lot, and Crossfire apparently scales at almost 100% in areas at 4K. So I want to know if it still scales well at resolutions just under 4K as well.


----------



## DaedalusHelios (Jul 4, 2015)

I was itching for an upgrade, and this release hasn't given me a reason: more power consumption and less performance than the SLI 980 Ti setup I am using. I guess I will just grab a new ASUS G-Sync monitor instead.


----------



## rvalencia (Jul 10, 2015)

tacosRcool said:


> I know Fury X is slower than the GTX 980 Ti but for the price you do get a water cooler. I say that isn't that bad taken that into account


*Nvidia Titan X image-quality cheating...*

The root of the problem is the image quality setting in the Nvidia control panel. Check the difference (and proof) here:

http://hardforum.com/showpost.php?p=1041709168&postcount=84

There seems to be around a 10% performance drop after setting the quality to highest.


----------



## Frag_Maniac (Jul 12, 2015)

Any chance you'll bench the Fury X again using the 15.7 driver W1zzard, or do you know any reviewer that has?


----------



## okidna (Jul 12, 2015)

Frag Maniac said:


> Any chance you'll bench the Fury X again using the 15.7 driver W1zzard, or do you know any reviewer that has?



HWC did that in their Fury review, 12 games in 1440p and 4K : http://www.hardwarecanucks.com/foru.../69792-amd-r9-fury-performance-review-20.html


----------



## Frick (Jul 12, 2015)

Frag Maniac said:


> Any chance you'll bench the Fury X again using the 15.7 driver W1zzard, or do you know any reviewer that has?


 
He used them in the Fury review; no difference to speak of.


----------



## W1zzard (Jul 12, 2015)

Frag Maniac said:


> Any chance you'll bench the Fury X again using the 15.7 driver W1zzard, or do you know any reviewer that has?


Just look at the Fury review and you'll see Fury X numbers included?


----------



## Frag_Maniac (Jul 13, 2015)

W1zzard said:


> Just look at the Fury review and you'll see Fury X numbers included?




I looked at your Fury X review, but it uses Cat 15.15 beta. https://www.techpowerup.com/reviews/AMD/R9_Fury_X/6.html

Are you saying you've done another since then using 15.7 on the Fury X?



okidna said:


> HWC did that in their Fury review, 12 games in 1440p and 4K : http://www.hardwarecanucks.com/foru.../69792-amd-r9-fury-performance-review-20.html


That's just the Fury, and I'm looking for reviews that compare the Fury X on 15.7 to the 980 Ti.


----------



## newtekie1 (Jul 13, 2015)

Frag Maniac said:


> Are you saying you've done another since then using 15.7 on the Fury X?



http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/


----------

