
AMD Radeon R9 Fury X 4 GB

A customer of mine ordered a Gigabyte GTX 980 G1 Gaming, got cold feet after seeing AMD's PR benchmarks, and booked a Fury X instead. I picked up the G1 order with a 10% discount (the restocking fee), so I'm stoked, as the card was already discounted to 10% below the EVGA SC. I still might trade up if the Classified turns up the heat. (So much for sitting this generation out!)


Ah, the miracle driver Hail Mary! By the time AMD gets the driver out, the gloss will certainly be off this SKU.

Got to save something for Christmas sales - AMD Radeon Fury X PRO XT

They may not need to - until they NEED a new SKU (holiday refresh for Q4). If the current models are anything to go by, the custom GTX 980 Tis should hold the line. Even if AIBs are allowed to clock the Fury X to ~1200-1250 MHz, I doubt that will tip the balance.


Nope, you're right. Samsung became #1 through institutionalized IP theft, wholesale bribery, and protection and sweetheart deals with government.

"Samsung became #1 though institutionalized IP theft, wholesale bribery, and protection and sweetheart deals with government"

Thank you :respect:

A sad day for me seeing this review, but to all the people bitching I say this... It's new tech, so deal with it. Give AMD a chance to get their drivers right and some time to tune this tech. This is new technology, so I give AMD a thumbs up for being the first to change the way these cards are built.

I'll be waiting for the x2 version to come out before buying, unless the drivers get tuned accordingly first, but whatever.

@W1zzard
Why no 3DMark runs?
3DMark runs? Really? :rolleyes:
Because Real World Performance ;)
Wacky numbers don't mean sh**

Leaving fanboyism aside, everyone knows DX11 is dying, and Microsoft has said it wants to make the transition to DX12 as quickly as possible so that everything built for DX11 can be ported to the new API as easily as possible. Also remember that a DX12 benchmark was leaked in which a 290X can beat a 980, which gives plenty to think about; in a month there will be new benchmarks and we'll see. This review is not a good reference for how long a card will stay relevant, or for how the two architectures handle the new feature set: NVIDIA has added three fairly irrelevant effects but lacks support for asynchronous shaders, which AMD does support, and as far as anyone knows that is a vital feature for the performance DX12 could provide... ¬___¬

https://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels

[image: table comparing DX12 feature support across GPU architectures]



Here's the hard reality... ^^

Uhm ..... no :slap:
:rolleyes:
Let's wait for official news and data.
 
Wow.... I think I have to go to the green team. This is absolutely disgusting. I will wait until after my tax return to buy, but I expect very little improvement from drivers.
 
Lots of comments... bad card... good card...

This card is released for 4K gaming, not lower resolutions, so take it as is.

Whoever wants to play at 4K may buy it or not; for lower resolutions we have a lot of products available.

I'm glad they released it, so I can upgrade more cheaply this year; I'm sure for $200 I'll get a goodie for lower resolutions.
 
Even for 4K, I think a third-party 980 Ti would be a better solution for barely more ($20 give or take).
 
Performance is not as good as I hoped.

But considering its compact size, low operating temperature, and vastly improved efficiency over the last generation, the Fury X is still a decent card, though the price could stand to drop a further $50. Let's see what the Nano can do next, as that's the card I'm more interested in.

The 390X is probably the most disappointing one in terms of price. At the mainstream resolution (1080p), it's only 3% faster than a 970 but costs way more. Its 8 GB of VRAM sounds nice on paper, but it's useless most of the time, as the GPU itself can't handle 4K, which is when large VRAM is actually needed.
 
A few brands of Fury X (MSI, XFX, Gigabyte) hit stores here in Australia today and sold out within hours... for $999 each. That's much higher than it should normally be (US $650 is about $760-ish here), but there are plenty of takers.

There are a lot of doomsayers here, but it's a very good card. At 4K and multi-monitor it's very competitive, at any res really. Most users would not be able to tell the difference between the Fury X, 980 Ti and Titan X in the real world. With Windows 10 and DX12, the Fury X will only get better.

I heard the Bugatti was 5% faster than the Ferrari, so does that mean Ferraris are shit? A lot of people want this card and are buying it.
 
Disappointing card, and considering all the hype surrounding it, a 9.2 is too much.
 
A few brands of Fury X (MSI, XFX, Gigabyte) hit stores here in Australia today and sold out within hours... for $999 each. That's much higher than it should normally be (US $650 is about $760-ish here), but there are plenty of takers.
There always is for new hardware that has a full court press with marketing. OcUK sold 100 cards (their entire stock) at almost a 20% mark-up over MSRP inside half an hour. A client of mine pre-paid for a Fury X before the card was launched - not only paying an exorbitant price thanks to the country's GST, but also having to foot a $NZ100 restocking fee for the 980 Ti he had originally ordered - so how's that for keen?
Price isn't the great barrier to sales that you seem to think. Sales of Titan cards over the last couple of years should be a sobering example.
Disappointing card, and considering all the hype surrounding it, a 9.2 is too much.
It is still a great card. You won't find an AIO cooled 980 Ti for anywhere near the same price. The only disappointment (for some) is the hype surrounding it and the expectation of some all-conquering silicon.
Expectations and AMD's feeding of them might lead to some deflation, but it doesn't alter the fact that the card will find a home with many - although not me. I enjoy tinkering with my hardware, and the voltage and memory locks don't give much in the way of extended tweaking for benchmarking fun.
 
There always is for new hardware that has a full court press with marketing. OcUK sold 100 cards (their entire stock) at almost a 20% mark-up over MSRP inside half an hour. A client of mine pre-paid for a Fury X before the card was launched - not only paying an exorbitant price thanks to the country's GST, but also having to foot a $NZ100 restocking fee for the 980 Ti he had originally ordered - so how's that for keen?
Price isn't the great barrier to sales that you seem to think. Sales of Titan cards over the last couple of years should be a sobering example.

It is still a great card. You won't find an AIO cooled 980 Ti for anywhere near the same price. The only disappointment (for some) is the hype surrounding it and the expectation of some all-conquering silicon.
Expectations and AMD's feeding of them might lead to some deflation, but it doesn't alter the fact that the card will find a home with many - although not me. I enjoy tinkering with my hardware, and the voltage and memory locks don't give much in the way of extended tweaking for benchmarking fun.
I didn't expect much at that asking price. If it did beat the 980 Ti, its price would be on the moon. However, the unimpressive GPU doesn't change the fact that this is a great card with a high-quality build and shiny new HBM tech.
 
A small video of my "Day 2" thoughts on this release..

 
After reading about 6-7 reviews and multiple discussion threads about this card, I've come to the following conclusions.

1. The disparity between reviewers' results is crazy. Some reviewers' results for the same game or synthetic benchmark are wildly off from others'.

2. The most annoying thing is people calling this a flop (no matter what AMD touted it as). It is definitely not that. People fail to realize that NVIDIA probably got wind of its performance and knew they had to counter it (hence the 980 Ti drop).

3. Basically a follow-up to (2): if this card had come out first, no one here would be saying anything negative other than "the OC sucks". It truly is disappointing that this card can only muster a meager 100 MHz overclock. What was AMD thinking, knowing that the target audience for this card relies heavily on being able to push their cards, some to the extreme? The performance it offers for the price (especially compared to the Titan X) has the same relevance as the 780/780 Ti did to the Titan.

4. My main concern if I were to upgrade (pretty sure a lot of others are in the same boat) is that the Fury X apparently will not come in a vanilla (air-cooled) form. I have a custom loop, and it would suck because I would either have to bypass my loop or not use this card until custom water blocks are out, which means AMD loses money here once again. I don't care much about power draw and efficiency; I didn't go all out on my system only to worry about how much my electric bill is going to increase. Moot point for people still complaining about power.

5. I am a little disappointed by the lower-resolution performance, but I do believe driver updates will help. I currently game on a 1440p monitor, but I know my next GPU upgrade will be for 4K.

6. As far as drivers are concerned, people need to relax, because there are people out there experiencing bad drivers from both camps. Your case is not the end of the world; it happens to everyone owning a card from either camp. I have several computers at home running cards from both camps (3870, 8800 GT, 7730M, 780, 980, 980M and 2x 290X). Of all the driver issues I have had, most have been on the green team's side. In fact, I am experiencing two right now and just had one resolved. Arkham Knight does not work on my 980/980M; I have yet to try it on the 780 system. My 290Xs in CrossFire rip through the game with no issues, and before anyone suggests updating the drivers: I have, and still no luck. AMD's beta drivers work better than NVIDIA's WHQL drivers. This is just my case, but it won't deter me from NVIDIA and I certainly won't bash them for it. I do like what the red team is doing given their financial situation and will continue to support them and their products, even if I lose a bit of performance here and there.

7. I don't know if their profit margins can allow for something like this, but at $600 this card would be a no-brainer for sure. It is doing quite nicely at its current price, though; despite all the negatives I have seen/read surrounding this card, it is practically sold out everywhere in the U.S.

8. You're either going to buy it or not. The difference in FPS is not all that concerning given that the tech is new, the drivers aren't mature, and, to my knowledge, it is going to require these lazy developers to actually start working for their money. This card is here to stay, and I believe it is an experiment for AMD's next iteration of GPUs; the next implementation of HBM will prove that, I have no doubt, and NVIDIA wouldn't be going that route if it weren't so. Let's not forget AMD gets paid for every use of HBM, along with Hynix, so the green team will be shelling out to the red team for Pascal, people. All you guys thinking AMD will fail are delusional and, quite frankly, fools; I certainly want them around to keep prices in check. Everyone out there saying this is Bulldozer all over again is wrong: this card loses/ties/beats the competition (mostly ties), and given the budgets and resources of the two companies, I would give AMD the nod. They have fewer resources and are heavily constrained financially. All in all, I'd say good job, AMD. Not a Titan X/980 Ti killer, but a job well done.


Quite frankly, if I hadn't already gone over budget on my PC, laptop(s), PS4, Xbox One, and game purchases and pre-orders, I would get both a 980 Ti and a Fury X, but I don't really need either one, or the upgrades. My 4770K/780 setup has only been turned on once since installing the operating system, and the same goes for my 4790/980 system (HTPC). My 290X and 980M systems get all the attention. My XPS 13 is for my DJing, which is on a back burner. Sorry for the little rant; I think a lot of people are in the same boat as me (don't need this generation). I guess it's time to teach my 1-year-old how to operate these things. I need help with utilization.
 
I guess it's time to teach my 1-year-old how to operate these things. I need help with utilization.

I suggest you hire someone, that'll teach the little bugger good values as well.
 
I suggest you hire someone, that'll teach the little bugger good values as well.
He doesn't need it; he already knows a lot more than most people in this thread.
 
9.2 is too much; I'll give it 7.5.
The pros are as Wizz said, but I'll add cons:
- Lack of VRM cooling
- Overclocking potential is really slim; I don't know if non-reference cards with strong VRMs will fix this
- Price is insane; should be $499-$549
- Too much propaganda (like AMD's official benchmarks)
- 1080p/1440p performance is really a joke; the 980 Ti is faster, and way, way faster as a non-reference card with a manual OC
 
Technical question based on statement:

If AMD designed it to work with a water AIO from the start (and not air cooling), does that mean the chip produces a lot of heat? On such a small PCB, would this create heat dissipation problems? I genuinely think the Fury (non-X) is air-cooled at slower clocks because it will struggle thermally. I figure AIBs will offer third-party coolers from the start?

I really don't think there is any headroom for air cooling, and even then, the water cooling isn't allowing higher clocks. It could be an HBM integration issue with heat affecting performance? Who knows.
 
IMHO, AMD's biggest mistake with this is not bringing it out in AIO and waterblock versions, from the get go. Because of the expense of the AIO, a waterblock version should not be that much more.

Cooler Master makes AIOs? :eek:

Technical question based on statement:

If AMD designed it to work with a water AIO from the start (and not air cooling), does that mean the chip produces a lot of heat? On such a small PCB, would this create heat dissipation problems? I genuinely think the Fury (non-X) is air-cooled at slower clocks because it will struggle thermally. I figure AIBs will offer third-party coolers from the start?

I really don't think there is any headroom for air cooling, and even then, the water cooling isn't allowing higher clocks. It could be an HBM integration issue with heat affecting performance? Who knows.
It's my understanding that they went AIO because the GPU and the VRAM are all located on the GPU "package". The "non X" will have regular GDDR5 in a conventional format.
 
Excellent review, Wizz!!! Looks like they fell just a bit short again... but hey, the price is at least in the ballpark for the performance (lack of overclocking headroom notwithstanding). Hopefully the pump noise issue isn't prevalent.


LOL, Xfia got a vacation? Christ, that was looooooooooooooong overdue...
 
9.2 is too much; I'll give it 7.5.
The pros are as Wizz said, but I'll add cons:
- Lack of VRM cooling
- Overclocking potential is really slim; I don't know if non-reference cards with strong VRMs will fix this
- Price is insane; should be $499-$549
- Too much propaganda (like AMD's official benchmarks)
- 1080p/1440p performance is really a joke; the 980 Ti is faster, and way, way faster as a non-reference card with a manual OC

I'm not trying to justify the score. However,

1. It does have VRM cooling in the form of a copper tube passing over the area, which is the main coolant channel.
2. To be honest, when it comes to HBM we are all cavemen figuring out an iPhone. Maybe there will be more documentation, or a larger community poking it with a stick, before we figure out how to tweak it.
3. AMD did not see the GTX 980 Ti coming. Heck, they didn't expect GM200 to come out this fast. NVIDIA put GM200 to work within a span of 4 months (faster than it took to monetize GK110 after GK104).
4. Every GPU launch has such pre-launch activity.
5. Nobody buys these cards for 1080p. In the case of 1080p @ 120 Hz, I agree. 1440p isn't so bad.

Regardless of how the Fury X turned out, whatever little inventory the retailers have is flying off the shelves. There's still a section of people who are convinced by the product, who like quirky-yet-potentially-powerful toys and love the deus ex machina it gives them.
 
Dying Light is also an (albeit extreme) example, as HardOCP found.

But isn't that more a built-in of NVIDIA GameWorks (ShadowWorks)? Honestly, I personally find Very High looks very... unnatural. Medium is about the upper limit where you get "defined but softened shadows" that are truer to life; even the differences between Low and Medium are hardly noticeable. Very High shadows are more of a setting NVIDIA gave themselves to "sugar-coat" their hardware. Folks can see for themselves in their interactive comparison here:
http://www.geforce.com/whats-new/gu...performance-guide#dying-light-shadow-map-size

But yeah, shadowing technology has come a long way... I thought it amazing to see shadows following rolling beach balls, and how shadows even climbed and elongated up a surface (a wall, say) as the ball got close to it, or rolled down into a trough. Those were some of the technical advancements that won awards at the Odyssey Visual Design Computer Animation Festival, circa 1986 and 1987.

There have been comparisons to the Gigabyte GeForce GTX 980 Ti G1 review W1zzard did prior to this. Looking at the subject of "power" under gaming, it's revealing.
The G1 came in about 23% higher than the reference 980 Ti in both average and peak: Avg 211 vs. 259 W; Peak 238 vs. 293 W.
The Fury X numbers: Avg 246 W; Peak 280 W.
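For anyone checking those figures, here's a quick sketch of the arithmetic (Python, purely illustrative, using only the wattages quoted in this post):

```python
# Rough check of the power deltas quoted above (all figures in watts).
ref_980ti = {"avg": 211, "peak": 238}   # reference GTX 980 Ti
g1_980ti  = {"avg": 259, "peak": 293}   # Gigabyte 980 Ti G1 Gaming
fury_x    = {"avg": 246, "peak": 280}   # Fury X

for key in ("avg", "peak"):
    g1_delta   = (g1_980ti[key] / ref_980ti[key] - 1) * 100
    fury_delta = (fury_x[key]   / ref_980ti[key] - 1) * 100
    print(f"{key}: G1 +{g1_delta:.0f}%, Fury X +{fury_delta:.0f}% vs. reference 980 Ti")
# avg: G1 +23%, Fury X +17% vs. reference 980 Ti
# peak: G1 +23%, Fury X +18% vs. reference 980 Ti
```

In other words, the Fury X's gaming draw lands between the reference 980 Ti and the factory-overclocked G1.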

Interestingly, Tom's also has a fairly sophisticated power test, and they report the Fury X averaging 220.7 W under gaming; while I can't find a reference 980 Ti review (?), their Titan X number was 233 W. So I'm not all "up in arms" over Fury X power consumption; for an enthusiast it's not "out of bounds". Heck, even [H] isn't showing it as that "out of bounds", perhaps 3-5% at best, and they average all the games they run. But you need to read how Brent appears to sabotage it by showing the "spike peak" in FC4 @ 4K; odd that there isn't a number like that for the 980 Ti in any game?

I will say reviewers universally see minimal overclocking, although if you read what was provided as "what AMD said about OC'ing" (scroll down) around June 18th: "AMD tells us that a 1150Mhz clock speed, a 100Mhz overclock over the 1050Mhz stock, is quite easily achievable with AMD's OverDrive... does not support over-volting, so... attainable with stock voltage. AMD tells us Fury X graphics cards should be able to run at this clock speed without any issue." But then the author goes on to say, "If what AMD's telling us is true, then Fury X will be AMD's best overclocking graphics card yet. As a 1150Mhz clock speed at the stock voltage was more of a rare occasion on AMD's previous 200 series and HD 7000 series reference designed cards." It appears AMD was reining in, or tempering, overclocking expectations, whereas I find it was AMD CTO Joe Macri who started the firestorm on June 16th: "You'll be able to overclock this thing like no tomorrow." "This is an overclocker's dream." There's an executive that should be taken out and shot for not knowing the talking points, and it's again a technical guy! :shadedshu: This is perhaps the biggest problem AMD has!

Price is insane; should be $499-$549
The thing is, it's priced as offering "as good as any other" enthusiast card at 4K, which is what it is intended for. Sure, it's not always level with or above a 980 Ti, but folks can weigh the merits of that and at least have a choice.

AMD knows its production schedule is set and achievable, and seeing no lull in sales at the moment, they foresee selling every one they produce, while recouping some R&D and production costs and still pulling in a reasonable profit. That's a company doing it right!

As others have said, they aren't a non-profit charity... If a product is competitive within its intended market, pricing it appropriately is how that works. You believe they're only competitive if they cut the price by, say, 18%, even when W1zzard indicates perhaps only 2% separates them at 4K? I hope you aren't running marketing at my business, but then again you might be...
 
Leaving fanboyism aside, everyone knows DX11 is dying, and Microsoft has said it wants to make the transition to DX12 as quickly as possible so that everything built for DX11 can be ported to the new API as easily as possible. Also remember that a DX12 benchmark was leaked in which a 290X can beat a 980, which gives plenty to think about; in a month there will be new benchmarks and we'll see. This review is not a good reference for how long a card will stay relevant, or for how the two architectures handle the new feature set: NVIDIA has added three fairly irrelevant effects but lacks support for asynchronous shaders, which AMD does support, and as far as anyone knows that is a vital feature for the performance DX12 could provide... ¬___¬ ..... Here's the hard reality... ^^
Isn't it possible that you are the one suffering from fanboyism and spreading FUD? That table can't be right... Maxwell 2 supports it for sure; more than that, IIRC it's mandatory for every DX12 GPU to support it.

Yes, AMD cards will see a bigger speed increase from DX12, but we don't know how that will actually impact real-life gaming performance, and I'm so tired of this benchmark being linked on every tech site. Do you even realize it's an API OVERHEAD(!) test, or do you just see the bigger number and the longer bar?
 

If that's to be the dual-GPU Fury X, then it may be seen as a fail. It's got two 8-pins, so that's good for 375 watts. A single Fury X has two 8-pins and already draws 280 watts at peak. The clocks will have to be backed down. It will probably cost more than two Fury Xs and not perform as well as two Fury Xs.
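A rough sketch of the power-budget arithmetic behind that point (Python, illustrative only; it assumes the usual PCI-E limits of 75 W from the slot and 150 W per 8-pin connector):

```python
# Power budget for a hypothetical dual-Fiji card with two 8-pin connectors,
# assuming the standard PCI-E limits (75 W from the slot, 150 W per 8-pin).
slot_w      = 75
eight_pin_w = 150
budget_w    = slot_w + 2 * eight_pin_w           # 375 W in-spec ceiling

single_fury_x_peak_w = 280                       # peak gaming draw of one Fury X
two_gpus_at_stock_w  = 2 * single_fury_x_peak_w  # 560 W if nothing is scaled back

print(budget_w, two_gpus_at_stock_w, two_gpus_at_stock_w - budget_w)  # 375 560 185
# ~185 W over the in-spec budget at stock clocks, hence the expectation of
# lower clocks/voltage (or exceeding the spec, as the 295X2 did).
```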
 
Plot twist is real
Just putting this out there. Here's a review that seems quite different from the other ones: https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=http://www.ixbt.com/video3/fiji-part3.shtml This one actually shows the Fury X beating the 980 Ti in a lot of tests, and even the Titan X in some. Of course, at lower resolutions NVIDIA still seems to win a lot of the time, but it's not as bad as with some other reviewers. Now here's a possible reason why: the driver version used in this review is 15.15-180612a-18565BE, which the reviewer was sent by AMD on June 18th. The press driver on AMD's FTP server is 15.15-150611a-185358E. I think this is probably the reason for the inconsistency.
Your thoughts, @W1zzard?
 
If that's to be the dual-GPU Fury X, then it may be seen as a fail. It's got two 8-pins, so that's good for 375 watts. A single Fury X has two 8-pins and already draws 280 watts at peak. The clocks will have to be backed down. It will probably cost more than two Fury Xs and not perform as well as two Fury Xs.
They'll probably exceed the PCI-E spec for power draw, just as they did with their dual-GPU 295X2.
 
Plot twist is real

Your thoughts, @W1zzard?
Umm, that's the same story as this review. In some tests the Fury X came out on top of the 980 Ti; trouble is, when it did, it was by 1-2%. In the tests where the 980 Ti came out on top, the margin was higher, making the overall average favor the 980 Ti. Even in the games where the Fury X is faster, the 980 Ti delivers almost exactly the same gaming experience; the reverse isn't true, however, as the Fury X is significantly slower in several tests, so much so that it might affect detail levels or even playability in some cases.
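To illustrate how that averaging works (Python, purely hypothetical numbers, not data from either review):

```python
# Hypothetical per-game deltas for card A relative to card B, in percent.
# A few small wins and a few larger losses still drag the average below zero,
# which is the pattern described above.
deltas = [+1, +2, +1, -6, -8, -5]   # made-up values for illustration only

average = sum(deltas) / len(deltas)
print(f"average delta: {average:+.1f}%")   # average delta: -2.5%
```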
 