Friday, June 26th 2015

AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

This has been a roller-coaster month for high-end PC graphics. The timing of NVIDIA's GeForce GTX 980 Ti launch had us putting the finishing touches on its review before we had even packed our bags for Taipei. When it launched, the GTX 980 Ti set AMD both a performance target and a price target. Then began a three-week wait for AMD to launch its Radeon R9 Fury X graphics card. The dance is done, the dust has settled, and we know who has won - nobody. AMD didn't get the R9 Fury X wrong, but NVIDIA got its GTX 980 Ti right. At best, this stalemate yielded a 4K-capable single-GPU graphics option from each brand at $650. You already had those in the form of the $650-ish Radeon R9 295X2, or a pair of GTX 970 cards. Those with no plans for a 4K display already had great options in the form of the GTX 970 and the price-cut R9 290X.

The Radeon R9 290 series launch in Fall 2013 stirred up the high-end graphics market in a big way. The $399 R9 290 made NVIDIA look comically evil for asking $999 for the card it beat, the GTX TITAN, while the R9 290X remained the fastest single-GPU option, at $550, until NVIDIA launched the $699 GTX 780 Ti to get people back to paying through the nose for the extra performance. Then there were two UFO sightings in the form of the GTX TITAN Black and the GTX TITAN-Z, which made no tangible contributions to consumer choice. Sure, they gave you full double-precision floating point (DPFP) performance, but DPFP is of no use to gamers. So what could have been the calculation at AMD and NVIDIA as June 2015 approached? Here's a theory.
Image credit: Mahspoonis2big, Reddit

AMD's HBM Gamble
The "Fiji" silicon is formidable. It made performance/Watt gains over "Hawaii," despite a lack of significant shader architecture performance improvements between GCN 1.1 and GCN 1.2 (at least nowhere of the kind between NVIDIA's "Kepler" and "Maxwell.") AMD could do a 45% increase in stream processors for the Radeon R9 Fury X, at the same typical board power as its predecessor, the R9 290X. The company had to find other ways to bring down power consumption, and one way to do that, while not sacrificing performance, was implementing a more efficient memory standard, High Bandwidth Memory (HBM).

Implementing HBM, right now, is not as easy as GDDR5 was when it was new. HBM is more efficient than GDDR5, but it trades clock speed for bus width, and a wider bus entails more pins (connections), which would have meant an insane amount of PCB wiring around the GPU in AMD's case. The company had to co-develop the industry's first mass-producible interposer (a silicon die that acts as a substrate for other dies), relocate the memory onto the GPU package, and still make do with the design limitation of first-generation HBM capping out at 8 Gb per stack, or 4 GB for AMD's silicon, after having laid out a 4096-bit wide memory bus. This was a bold move.
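For perspective, here's a minimal sketch of what that clock-for-width trade buys, assuming the publicly listed launch specs (four 1024-bit HBM1 stacks at 1 Gbps per pin on "Fiji," versus "Hawaii's" 512-bit GDDR5 at an effective 5 Gbps):

```python
# Back-of-the-envelope memory comparison, assuming published launch specs:
# peak bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

# R9 Fury X: four HBM1 stacks, each 1024-bit wide at 1 Gbps per pin.
hbm1 = peak_bandwidth_gbs(4 * 1024, 1.0)  # 512.0 GB/s
# R9 290X: 512-bit GDDR5 at an effective 5 Gbps per pin.
gddr5 = peak_bandwidth_gbs(512, 5.0)      # 320.0 GB/s

print(f"HBM1  (4096-bit @ 1 Gbps): {hbm1:.0f} GB/s")
print(f"GDDR5 ( 512-bit @ 5 Gbps): {gddr5:.0f} GB/s")

# The flip side is capacity: first-gen HBM tops out at 8 Gb (1 GB) per
# stack, and four stacks give the 4 GB ceiling mentioned above.
stacks, gigabits_per_stack = 4, 8
print(f"HBM1 capacity: {stacks * gigabits_per_stack // 8} GB")
```

Wider-but-slower wins on both bandwidth and power here; the 4 GB cap is the price paid.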

Reviews show that 4 GB of HBM isn't "Fiji's" Achilles' heel. The card still competes in the same league as the GTX 980 Ti, with its 6 GB of memory, at 4K Ultra HD (the resolution most taxing on video memory), coming in just 2% slower than the GTX 980 Ti there. Its performance/Watt is significantly higher than the R9 290X's. We reckon this outcome would have been impossible had AMD not gambled on HBM and instead stuck to the 512-bit wide GDDR5 interface of "Hawaii," just as it stuck to a similar front-end and render back-end configuration (the front-end is similar to that of "Tonga," while the ROP count is the same as "Hawaii's").

NVIDIA Accelerated GM200
NVIDIA's big "Maxwell" silicon, the GM200, wasn't expected to come out as soon as it did. The GTX 980 and the 5 billion-transistor GM204 silicon are just 9 months old in the market, NVIDIA has sold a lot of these; and given how the company milked its predecessor, the GK104, for a year in the high-end segment before bringing out the GK110 with the TITAN; something similar was expected of the GM200. Its March 2015 introduction - just six months following the GTX 980 - was unexpected. What was also unexpected, was NVIDIA launching the GTX 980 Ti, as early as it did. This card has effectively cannibalized the TITAN X, just 3 months post its launch. The GTX TITAN X is a halo product, overpriced at $999, and hence not a lot of GM200 chips were expected to be in production. We heard reports throughout Spring, that launch of a high-volume, money-making SKU based on the GM200 could be expected only after Summer. As it turns out, NVIDIA was preparing a welcoming party for the R9 Fury X, with the GTX 980 Ti.

The GTX 980 Ti was more likely designed with R9 Fury X performance, rather than a target price, as the pivot. The $650 price tag is likely something NVIDIA came up with later, after having achieved a performance lead over the R9 Fury X by stripping down the GM200 only as much as it could while still getting there. How NVIDIA figured out R9 Fury X performance in advance is anybody's guess. It's more likely that the R9 Fury X's price would have been different had the GTX 980 Ti not been around, than the other way around.

Who Won?
Short answer - nobody. The high-end graphics card market isn't as shaken up as it was right after the R9 290 series launch. The "Hawaii" twins held their own, and continued to offer great bang for the buck, until NVIDIA stepped in with the GTX 970 and GTX 980 last September. $300 doesn't get you much more than it did a month ago. At least now you have a choice between the GTX 970 and the R9 390 (which appears to have caught up); at $430, the R9 390X offers competition to the $499 GTX 980; and then there are leftovers from the previous generation, such as the R9 290 series and the GTX 780 Ti, but these aren't really the high-end we were looking for. It was a joy to watch the $399 R9 290 dethrone the $999 GTX TITAN in Fall 2013, as people upgraded their rigs for Holiday 2013. We didn't see that kind of spectacle this month. There is a silver lining, though: there is a rather big gap between the GTX 980 and GTX 980 Ti just waiting to be filled.

Hopefully, July will churn out something exciting (and bona fide high-end) around the $500 mark.

223 Comments on AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

#1
btarunr
Editor & Senior Moderator
Editorial / Opinion. Keep it civil.
#2
Patriot
btarunr said: Editorial / Opinion. Keep it civil.
I think the "who won" is consumers. Competitive gfx is good for everyone.
#3
erocker
*
Patriot said: I think the "who won" is consumers. Competitive gfx is good for everyone.
With $650 price tags, I don't think so.
#4
Fluffmeister
I wonder how much AMD would have charged for Fury X if Nvidia hadn't launched the 980 Ti yet.
#5
EarthDog
Interesting take...

If I may pick your brain... since the 390x is a 290x with 8GB (rebrand), why isn't the 290x mentioned at $335? That 8GB really doesn't matter to most anyone, unless they are 4K and enjoy dumping AA on top...

290x ($335) and 980 ($500) and a scant 10% difference between the two still makes the 290x a hell of a card.
erocker said: With $650 price tags, I don't think so.
So, I take it you have been on that horse since the 8800GTX days several years ago? That is when prices started to skyrocket... Truthfully, to me, it is what it is... It's been this way for so long, it's the new status quo for high-end GPUs. As much of a negative nancy complainer as I am, I even stopped bitching about the high price of enthusiast-level cards a couple of years ago. ;)

But a $500 high end would be nice again (a la the 580, as was mentioned below - I was thinking it was $600 out of the gate).
#6
radrok
erocker said: With $650 price tags, I don't think so.
Kinda agree with you here man.

These GPUs should have been at the usual $499 price point, like the GTX 580 was.

This situation is just a sad duopoly.
#7
btarunr
Editor & Senior Moderator
EarthDog said: Interesting take...

If I may pick your brain... since the 390x is a 290x with 8GB (rebrand), why isn't the 290x mentioned at $335? That 8GB really doesn't matter to most anyone, unless they are 4K and enjoy dumping AA on top...

290x ($335) and 980 ($500) and a scant 10% difference between the two still makes the 290x a hell of a card.

So, I take it you have been on that horse since the 8800GTX days several years ago? That is when prices started to skyrocket... Truthfully, to me, it is what it is... It's been this way for so long, it's the new status quo for high-end GPUs. As much of a negative nancy complainer as I am, I even stopped bitching about the high price of enthusiast-level cards a couple of years ago. ;)
Good point, added that.
#8
Rahmat Sofyan
I wonder, does nvidia have a spy at AMD who knew early on about Fury X performance?

AFAIK, from the 780 to the latest Titan X, they've released these GPUs in a very short time - or was it purely marketing strategy?

Overall, this is a really good opinion. Another thing I wonder about is what kind of GPU nvidia will release next: a GTX 990, a 970 Ti, or a whole new GPU - a GTX 1000 series or a new naming scheme.

I guess the Fury X is just a first step for AMD's jump to its next GPU architecture, and the good point is that AMD was first with HBM, just like the HD 5870 was first for DX11...

wait and see..
#9
erocker
*
EarthDog said: Interesting take...

If I may pick your brain... since the 390x is a 290x with 8GB (rebrand), why isn't the 290x mentioned at $335? That 8GB really doesn't matter to most anyone, unless they are 4K and enjoy dumping AA on top...

290x ($335) and 980 ($500) and a scant 10% difference between the two still makes the 290x a hell of a card.

So, I take it you have been on that horse since the 8800GTX days several years ago? That is when prices started to skyrocket... Truthfully, to me, it is what it is... It's been this way for so long, it's the new status quo for high-end GPUs. As much of a negative nancy complainer as I am, I even stopped bitching about the high price of enthusiast-level cards a couple of years ago. ;)

But a $500 high end would be nice again (a la the 580, as was mentioned below - I was thinking it was $600 out of the gate).
Ah, I failed to mention the Titan, which really should be the high-end card, priced around where the 980 Ti is. But yeah, $550+ prices on high-end GPUs have been around since at least that ATi 9800XT I bought years ago. But they added another tier, and essentially we're paying 400-500 bucks more for the highest-end product than we did 10 years ago or so. My opinion, yes, but probably more likely wishful thinking.
#10
rooivalk
Fluffmeister said: I wonder how much AMD would have charged for Fury X if Nvidia hadn't launched the 980 Ti yet.
I heard it was rumored to be an $800 card. It could be true, but nVidia undercut them and set the bar for performance and price, like the editorial said. It could be a very costly situation for AMD since they're using new tech (HBM).
EarthDog said: So, I take it you have been on that horse since the 8800GTX days several years ago? That is when prices started to skyrocket... Truthfully, to me, it is what it is... It's been this way for so long, it's the new status quo for high-end GPUs. As much of a negative nancy complainer as I am, I even stopped bitching about the high price of enthusiast-level cards a couple of years ago. ;)

But a $500 high end would be nice again (a la the 580, as was mentioned below - I was thinking it was $600 out of the gate).
I don't really understand economics, but I remember paying $500+ for a Radeon X800XT Platinum Edition 10 years ago. Isn't the world almost always heading towards inflation? 30% for a flagship GPU in 10 years doesn't seem 'skyrocketed' in my opinion.
#11
kiddagoat
Just picked up my Fury X yesterday. Had the 980 Ti for about two weeks before that. Not much difference between the two. I just liked the looks and am really liking the audible noise of the Fury X. I could hear the 980 Ti's fan spin up here and there under certain situations, but it wasn't anything totally off-putting.

Both cards are good. They both play all my games just fine. I need to upgrade the monitor soon to justify the new GPU. I have had the itch for a long eight months. Baby steps.... baby steps.

I have a mid-tower, an Arc Midi, with a high-end Zalman flower-style cooler. To say installation was a breeze would be a bit far-fetched, but once I took my time to get everything needed into the confined space, it looks pretty sharp if you ask me.
#12
lilhasselhoffer
I fully agree with this editorial.


There's nothing great out there for consumers; it's just the same minor performance increases at anything near a reasonable pricing bracket. Thank you for making this an editorial piece. I'm looking forward to the fanboy arguments shortly.
#13
xkche
Waiting CFX reviews :D..
#14
Frick
Fishfaced Nincompoop
xkche said: Waiting CFX reviews :D..
Dual Fury X crossfire!
#15
john_
Fluffmeister said: I wonder how much AMD would have charged for Fury X if Nvidia hadn't launched the 980 Ti yet.
The rumored $850, probably, and the 980 Ti would have been a September launch. That way, both companies would be selling high-end cards with a pretty nice margin. If things were different, that would have been the case.

But Nvidia is trying to throw AMD out of the mid-to-high-end market before AMD comes out with Zen. They did it with the 970, and they did it again with the 980 Ti.

Intel iGPUs keep getting more EUs and more cache (Kaby Lake offers 256MB of cache in one model); if AMD produces a good Zen CPU core, its APUs will get a nice boost just from that. Add HBM, and things look bad for every discrete card under $150-$200.
#16
BiggieShady
kiddagoat said: I just liked the looks and am really liking the audible noise of the Fury X.
How about that high-pitched pump noise we read about in reviews ... did they fix it? Is it noticeable on your card?
#18
kiddagoat
No noise at all.... I have a Sapphire... The card is silent. I had to look inside the case to make sure the fan was running when I turned my machine on.
#19
EarthDog
rooivalk said: I don't really understand economics, but I remember paying $500+ for a Radeon X800XT Platinum Edition 10 years ago. Isn't the world almost always heading towards inflation? 30% for a flagship GPU in 10 years doesn't seem 'skyrocketed' in my opinion.
Exactly my point... ;)

8800 Ultra was $830
8800GTX was $600-$650...

... that was 9 years ago.
#20
GAR
I've had pretty much all the high-end cards that have come out in the past 19 years, plenty of them from ATI and some from NVIDIA. I have used the Riva TNT 16MB, the TNT2, and 3dfx Voodoo 1 and Voodoo2 12MB cards in SLI; I've used the 5870 and the 7970, which I believe are two of ATI's best cards; I've had the legendary 8800GTX and the not-so-great 9800GTX; I've had 4850's in CrossFire, and even the 4870X2. I love both companies, and competition is only good for the consumer - remember that, people.
#21
ZoneDymo
700 euro for cards that can barely run today's games at full blast at 1080p/1440p, let alone the damn 4K they damn well should be ready for by now.

I think neither got it right; this is a weak step purely there to milk consumers for more money before we get to the actual thing we want.

Honestly though, it would have been awesome if AMD had gone all out, said f it to the competition, and come out with something that just decimated it. Something actually 4K capable, not something that again kinda hovers in between all the current cards.
#22
bentan77
EarthDog said: But a $500 high end would be nice again (a la the 580, as was mentioned below - I was thinking it was $600 out of the gate).
There's still the Fury card to be released, and it could offer pretty similar performance to the Fury X at a cheaper price.
#23
kiddagoat
ZoneDymo said: 700 euro for cards that can barely run today's games at full blast at 1080p/1440p, let alone the damn 4K they damn well should be ready for by now.

I think neither got it right; this is a weak step purely there to milk consumers for more money before we get to the actual thing we want.

Honestly though, it would have been awesome if AMD had gone all out, said f it to the competition, and come out with something that just decimated it. Something actually 4K capable, not something that again kinda hovers in between all the current cards.
Again... people say this.... then they'd complain about the price, the power consumption, noise, thermals, or something they pulled from their ass, despite the raw performance of it. Doesn't matter who made the card; someone somewhere would take issue with it...

Few people are running 4K monitors at this time, though they are starting to drop in price..... Those who have 4K probably don't use them for gaming, but more for CAD, media creation, or other "professional" purposes.......
#24
LightningJR
Nice editorial. When AMD brought us the 200 series, I was extremely impressed with its price/performance; even now, it's still on top. With the aftermarket coolers, the cards are also quieter, cooler, and faster; they're the best mid-range cards today if you don't care about power consumption.

I would have to disagree with you mildly about who "won". The Fury X's direct competitor, the 980 Ti, is faster when you start dropping the resolution down from 4K, and not by a small amount. It also edges out the Fury X in power consumption no matter the work being done. The last thing is overclocking; no need to explain that. The only advantages the Fury X has over the 980 Ti are heat and noise.

The Fury X is not a 1080p card; its performance doesn't scale well as you decrease resolution, unlike the 980 Ti's. I know the argument can be "why would you buy a 980 Ti or Fury X for 1080p", but honestly, I would much rather have a minimum of 60 fps @ 1080p in all my games than 35 fps at 4K.
#25
RejZoR
The framerate target on AMD cards should bring down power consumption. Has this even been used so far? The tech makes sense, though: no need to draw stuff at 250 fps if 144 fps is enough for your screen. Plus, it can eliminate tearing without using V-Sync...
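The concept is simple enough to sketch in a few lines. This is just a toy illustration of a frame-rate cap, not AMD's actual driver implementation; the render loop sleeps away whatever is left of each frame's time budget instead of racing ahead:

```python
# Toy sketch of a frame-rate target (not AMD's driver code): don't render
# frames the display can never show; idle for the remaining time instead.
import time

TARGET_FPS = 144
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~6.94 ms per frame

def render_frame() -> None:
    """Stand-in for the real rendering work."""
    time.sleep(0.002)  # pretend a frame takes 2 ms, i.e. the GPU could do 500 fps

for _ in range(1000):  # main loop, bounded for the example
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # Sleeping here, rather than rendering unseen frames,
        # is where the power saving comes from.
        time.sleep(FRAME_BUDGET - elapsed)
```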