Friday, June 26th 2015

AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

This has been a roller-coaster month for high-end PC graphics. The timing of NVIDIA's GeForce GTX 980 Ti launch had us putting finishing touches on its review before we had even packed our bags for Taipei. When it launched, the GTX 980 Ti set AMD both a performance target and a price target. Then began a three-week wait for AMD to launch its Radeon R9 Fury X graphics card. The dance is done, the dust has settled, and we know who has won - nobody. AMD didn't get the R9 Fury X wrong, but NVIDIA got its GTX 980 Ti right. At best, this stalemate yielded a 4K-capable single-GPU graphics option from each brand at $650. You already had those in the form of the $650-ish Radeon R9 295X2, or a pair of GTX 970 cards. Those with no plans for a 4K display already had great options in the form of the GTX 970 and the price-cut R9 290X.

The Radeon R9 290 series launch from Fall 2013 stirred up the high-end graphics market in a big way. The $399 R9 290 made NVIDIA look comically evil for asking $999 for the card it beat, the GTX TITAN; and the R9 290X remained the fastest single-GPU option, at $550, until NVIDIA launched the $699 GTX 780 Ti to get people back to paying through their noses for the extra performance. Then there were two UFO sightings in the form of the GTX TITAN Black and the GTX TITAN-Z, which made no tangible contributions to consumer choice. Sure, they gave you full double-precision floating point (DPFP) performance, but DPFP is of no use to gamers. So what could the calculation at AMD and NVIDIA have been as June 2015 approached? Here's a theory.
Image credit: Mahspoonis2big, Reddit

AMD's HBM Gamble
The "Fiji" silicon is formidable. It made performance/Watt gains over "Hawaii," despite the lack of any significant shader-architecture performance improvement between GCN 1.1 and GCN 1.2 (at least nothing of the kind seen between NVIDIA's "Kepler" and "Maxwell"). AMD managed a 45% increase in stream processors for the Radeon R9 Fury X at the same typical board power as its predecessor, the R9 290X. To get there, the company had to find other ways to bring down power consumption, and one way to do that without sacrificing performance was implementing a more efficient memory standard, High Bandwidth Memory (HBM).

Implementing HBM, right now, is not as easy as implementing GDDR5 was when it was new. HBM is more efficient than GDDR5, but it trades clock speed for bus width, and a wider bus entails more pins (connections), which would have meant an insane amount of PCB wiring around the GPU in AMD's case. The company had to co-develop the industry's first mass-producible interposer (a silicon die that acts as a substrate for other dies), relocate the memory to the GPU package, and still make do with the design limitation of first-generation HBM capping out at 8 Gb (1 GB) per stack, or 4 GB across the four stacks on AMD's silicon - after having laid out a 4096-bit wide memory bus. This was a bold move.
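To see what the clock-speed-for-bus-width trade buys, a back-of-envelope comparison helps. The figures below are the published specs of the R9 290X (512-bit GDDR5 at 5 Gbps per pin) and the Fury X (4096-bit HBM at 1 Gbps per pin); the helper function is just illustrative arithmetic:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# "Hawaii" (R9 290X): narrow-ish bus, fast GDDR5 clocks
hawaii = bandwidth_gb_s(512, 5.0)    # -> 320.0 GB/s
# "Fiji" (R9 Fury X): very wide HBM bus at a much lower clock
fiji = bandwidth_gb_s(4096, 1.0)     # -> 512.0 GB/s

print(f"Hawaii: {hawaii} GB/s, Fiji: {fiji} GB/s")
```

Despite running its memory at a fifth of GDDR5's data rate (which is where the power savings come from), the eight-times-wider bus nets Fiji 60% more peak bandwidth.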

Reviews show that 4 GB of HBM isn't Fiji's Achilles' heel. The card competes in the same league as the GTX 980 Ti and its 6 GB of memory at 4K Ultra HD (the resolution most taxing on video memory), coming in just 2% slower than the GTX 980 Ti at that resolution, while its performance/Watt is significantly higher than the R9 290X's. We reckon this outcome would have been impossible had AMD never gambled on HBM and instead stuck to the 512-bit wide GDDR5 interface of "Hawaii" - just as it stuck to a similar front-end and render back-end configuration (the front-end is similar to that of "Tonga," while the ROP count is the same as "Hawaii's").

NVIDIA Accelerated GM200
NVIDIA's big "Maxwell" silicon, the GM200, wasn't expected to come out as soon as it did. The GTX 980 and its 5 billion-transistor GM204 silicon are just nine months old in the market, and NVIDIA has sold a lot of them; given how the company milked the GM204's predecessor, the GK104, in the high-end segment for a year before bringing out the GK110 with the TITAN, something similar was expected of the GM200. Its March 2015 introduction - just six months after the GTX 980 - was unexpected. What was also unexpected was NVIDIA launching the GTX 980 Ti as early as it did. This card has effectively cannibalized the TITAN X just three months after its launch. The GTX TITAN X is a halo product, overpriced at $999, and hence not a lot of GM200 chips were expected to be in production. We heard reports throughout Spring that the launch of a high-volume, money-making SKU based on the GM200 could be expected only after Summer. As it turns out, NVIDIA was preparing a welcoming party for the R9 Fury X, with the GTX 980 Ti.

The GTX 980 Ti was more likely designed with R9 Fury X performance, rather than a target price, as the pivot. The $650 price tag is likely something NVIDIA came up with later, after achieving a performance lead over the R9 Fury X by stripping down the GM200 as far as it could while still getting there. How NVIDIA figured out R9 Fury X performance is anybody's guess. It's more likely that the price of the R9 Fury X would have been different if the GTX 980 Ti weren't around than the other way around.

Who Won?
Short answer - nobody. The high-end graphics card market isn't as shaken up as it was right after the R9 290 series launch. The "Hawaii" twins held their own and continued to offer great bang for the buck, until NVIDIA stepped in with the GTX 970 and GTX 980 last September. $300 gets you not much more than it did a month ago. At least now you have a choice between the GTX 970 and the R9 390 (which appears to have caught up); at $430, the R9 390X offers competition to the $499 GTX 980; and then there are leftovers from the previous generation, such as the R9 290 series and the GTX 780 Ti, but these aren't really the high end we were looking for. It was a joy to watch the $399 R9 290 dethrone the $999 GTX TITAN in Fall 2013, as people upgraded their rigs for the holidays. We didn't see that kind of spectacle this month. There is a silver lining, though: a rather big gap between the GTX 980 and GTX 980 Ti is just waiting to be filled.

Hopefully, July will churn out something exciting (and bona fide high-end) around the $500 mark.

223 Comments on AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

#76
AsRock
TPU addict
newtekie1: AMD can't go back and change the way they hyped up the card, they can't undo the straight up lies they told about the card leading up to its launch. Yes, companies hype up products, they even cherry pick test results to make the product look good before launch. However, AMD made bold claims, with no cherry picked sources to back them up. They just flat out made claims that were simply not true. There is a difference between what AMD did before the launch of Fury X and what other companies do to promote their upcoming products.

The card is good, I never even hinted it wasn't. AMD just marketed it wrong, their PR department made the situation have one outcome, disappointing. Like I said, if they had been straight up and just said "we're releasing a card that can compete with the 980Ti at 4k" and then released it at $50 less because it is slightly weaker than the 980Ti, they would have had a winner launch. They didn't have to lie and say it beats the 980Ti at 4k, that it was the fastest GPU in the world. They just had to establish that Fury X would be a reasonable alternative to the 980Ti and a slightly lower price. That is all they had to do, but their ego got the best of them. That is my constructive criticism, that was in my first post. Promote the card, hype the card, but don't over hype the card to the point that it can't live up to what you are saying about it.

These aren't fanboy statements, they are my opinion on the situation and why Fury X ended up a disappointment. I mean, clearly I'm an nVidia/Intel fanboy. I've got 2 Intel/nVidia systems and 6 AMD/AMD systems in my house right now...but I just love me some AMD bashing...:rollseyes: Sorry, but the fanboy is the one that can't take someone even suggesting that their beloved company of choice did something wrong, and things any negative opinion on the companies actions is "flaming" them.
Who takes notice of PR anyway? OMG, have you learned anything over the years? It's not like they can say "our card is meh" and expect people to keep looking.
Posted on Reply
#77
john_
1c3d0g: That last part of the sentence is not so easy to say. GPU makers have an uncanny ability to double, even triple low-end GPU performance if/when integrated GPUs start performing too close for comfort. Focusing purely on Intel here, every time they figure out a way to boost iGPU performance, the established GPU makers just push the performance envelope even further. Intel is dedicating tons of die area to their GPU efforts (usually half of the die is reserved just for the GPU alone!), and in most generations they manage to DOUBLE performance, but the GPU makers always come back and smash them head on, leaving them dizzy so they have to go back to the drawing board to come up with another plan. So Intel will always play catch up, a constant game of cat & mouse, unless it buys/merges with a well-respected GPU maker.
GPU makers can always come out and improve the performance of their low-end cards, so that their low-end products keep a safe distance from the iGPUs. But there are NO new low-end products, are there? Look at the 900 series: the whole line is five cards, three of them cost $500 or more, and the series starts at $200. So in a way it's already happening. Top AMD APUs and top iGPUs in Intel CPUs can challenge a GT 740, and later they will be able to challenge a GT 750 or even a 750 Ti, because what iGPUs lack today is bandwidth, and that's what they will mostly get in the next 1-2 years.

What we see in the low-end market today would have been unheard of in the past. Back then you might have had an iGPU challenging the lowest-performing graphics cards on the market; today there are plenty of (old) discrete graphics cards that are in fact a downgrade compared to some iGPUs. An old G210 or HD 5450, both still on the market, is probably a downgrade compared to any iGPU in a last-gen CPU. In the past a new line meant new models for the low-end market. Today a new line means rebrands or NO new low-end models, because financially it doesn't make sense.

So while GPU makers could always come back and smash iGPUs easily with their low-end offerings, they are not going to do it, because financially it makes no sense. 6-7 years ago a GTX 960 would have been called a GT 940 and sold for $80. Today that is suicide from a financial point of view.
Posted on Reply
#78
the54thvoid
Super Intoxicated Moderator
kiddagoat: I dunno..... seems like @newtekie1 is the sort of passive aggressive fanboy..... I mean I don't hear anything constructive, just constant bashing without all the flare and flame usually associated with the die hard fanboy......

Yeah AMD hyped up the card, they most certainly did.... I mean that's the PR and marketing team's job... they generate buzz... Nvidia does it, Intel does it, MSI, ASUS, on and on... all vendors do it.... Looking at the raw numbers yeah the 980Ti is ahead... but you are talking 5-10FPS at their targeted resolutions which are 1440p and above.... which in most cases turns out to be a wash because they trade back and forth.... yeah lower resolution the gap is larger but these cards aren't made for 1080p or lower.... they aren't marketed to that...

Again this is a case of people just looking for any little thing to bash a company on.... I mean does it really make you feel that much better to come here and flame?? I sometimes am amused by these comments but some of these are just disheartening both as an enthusiast and an engineer.... wtf...... It is as if people think coming up with these technologies is easy and that anyone can do it...
The irony (or hypocrisy) of your post is that you come across as a fanboy. Nobody is suggesting the tech is easy, far from it. You say what AMD did is simple PR. It's not, really. It's disingenuous marketing to make some of the statements they made.
Had I gone on their release material alone, I would have ordered the Fury X as soon as I could. However, the reviews are clear: there is certainly no title of fastest GPU to be claimed, especially dropping back to 1440p and lower.
You seem annoyed that AMD is being 'picked on' for the Fury X. What you are actually seeing is post-release frustration that it's not what at least six months of hype and leaks said it was going to be.
Only a tool would not want it to be worse. Faster means competition on prices to sway buyers, parity makes it more stable instead. Who will break first?
Also, as you say you're an engineer, you will know that when you design and build something, you know what your product is and you know its performance. You would be foolish to have stats in a press deck that don't hold up to scrutiny. Hell, Nvidia get dogs abuse (rightly so, as market leader) for miscommunication, and they got off relatively scot-free with the 970 memory issue (God knows how).
AMD's press deck - the games comparison one - was evidently bogus: they had it beating the 980 Ti across the board.
They said it would overclock well (I think) but it doesn't.
We can all defend AMD for bringing out a great card and making the sacrifices, some say necessary, to bring in HBM, but the OP title is very accurate.
And nothing Newtekie1 says is wrong.
Posted on Reply
#79
newtekie1
Semi-Retired Folder
AsRock: Who takes notice of PR anyway? OMG, have you learned anything over the years? It's not like they can say "our card is meh" and expect people to keep looking.
Obviously, everyone who was expecting the Fury X to beat the 980Ti and put AMD in the lead again. All the people who had already decided to buy the Fury X before the reviews were even out.

They don't have to say it is a mediocre card, as I have said twice now. If they had just said the card is competitive with the 980Ti at 4k, which it definitely is, and released it at a slightly cheaper price point, the card would not have been as disappointing.
Posted on Reply
#80
RejZoR
Actually, the biggest winner here is still AMD, really... They are successfully selling the "ancient" R9 290X as a brand-new card that still competes with the top cards from NVIDIA. For a vendor, that is an almost ideal scenario. It's just a question of how costly the development of Fiji XT was for them, and how far that cuts into the savings made on the R9 390X...
Posted on Reply
#81
HumanSmoke
qubit: Wow, that's a blast from the past! I can see that the price is eye watering, too.
That looks like a daughterboard at the back there. Do you have a clearer picture of it?
Get ready for some hi res 3Dfx porn...
the54thvoid: They said it would overclock well (I think) but it doesn't.
I think the exact phrase was "overclocker's dream":
“You’ll be able to overclock this thing like no tomorrow,” AMD CTO Joe Macri says. “This is an overclocker’s dream.”
As an overclocker, I'd say an overclock of 75-100MHz (max) under water constitutes less a dream than a stupor.
the54thvoid: AMD's press deck, the games comparison one, was evidently bogus - they had it beating the 980 Ti across the board.
That is usually the way of things when a company is playing catch-up, although AMD in recent years seems to have developed a marketing insecurity. I think they attempt to portray themselves as the little engine that could, but their marketing tends to come across as hesitancy wrapped up in bluster (The Roy Taylor Syndrome). The company almost always uses the competition's products for reference. Nvidia tend to do the same thing with Intel, less so with AMD, where they are more confident... while Intel? Well, when was the last time any of their PR material/PPS/review kit mentioned AMD or Nvidia at all? I can't think of a single instance in at least the last half-dozen years.
When a company has confidence it does not need to reference the competition, but I don't think AMD as a company has ever embraced the "less is more" angle of the brand, which is a shame, because ATI, before AMD's swallowing of them, tended to carry the aura of "walk softly and carry a big stick" - the kind of projected attitude that builds consumer confidence in a brand.
Posted on Reply
#82
mirakul
Development of Fury is costly: a whole new PCB, interposer, HBM, Fiji, a new cooler. I doubt that AMD can make much profit from Fury X sales.
Posted on Reply
#83
HumanSmoke
mirakul: Development of Fury is costly: whole new PCB, interposer, HBM, Fiji, new cooler. I doubt that AMD could have much profit from FuryX selling.
AMD won't make any profit. In fact, they will likely be very, very far from recouping their investment. Large GPUs usually earn their keep as professional cards, where margins are high. It is very doubtful that Fiji will arrive as a FirePro workstation board (power envelope, 4GB memory), so that leaves all revenue to accrue from Radeon sales. Large-GPU R&D, even if it borrows heavily from other designs, would cost hundreds of millions of dollars. AMD offsets some of the loss by using Fiji as a pipecleaner/proof of concept for HBM, which it needs for future projects... but you're right, AMD's bill of materials and GPU fabbing cost probably nullify any real profit, even taking R&D out of the equation.
Fiji's worth (like GM200's) should have been just as much as a halo product. Having the title of "World's Fastest GPU/Single-GPU Card" sells a lot of entry-level and mainstream cards (just as stock car and drag racing sell production models by association). The GTX 980 Ti effectively rained on that parade.
Posted on Reply
#84
qubit
Overclocked quantum bit
HumanSmoke: Get ready for some hi res 3Dfx porn...
I've just nerdgasmed. End.
Posted on Reply
#85
RejZoR
Now that's high-resolution porn :D
Posted on Reply
#86
Prima.Vera
HumanSmoke: It has always been this way. The 7800 GTX 512 and its ATI counterpart, the X1900 XTX, caught flak at the time for pushing the $600-650 mark from people relatively new to tech.
For some people like myself, who got their start some time earlier, there has always been a high end. Exhibit #1 from 1998: a Quantum3D Obsidian² X-24

Oh man, check out those games. Real gold out there.
Posted on Reply
#87
64K
Steam Hardware Surveys aren't scientific, but I look at them to get a general idea of what people are gaming on. Very, very few are gaming on high-end cards; the vast majority surveyed are gaming on entry-level and mid-range cards, or integrated graphics. Neither the Fury X nor the 980 Ti will constitute much of AMD's/Nvidia's sales. As for monitors, resolutions between 1366x768 and 1920x1080 combined total 60% of monitors. At my resolution, 1440p, it's 1.1%, and despite all the talk of 4K becoming mainstream, it's at 0.06% (about 1 out of 1,600). The vast majority don't need or want a high-end card.

store.steampowered.com/hwsurvey
Posted on Reply
#88
AsRock
TPU addict
newtekie1: Obviously, everyone who was expecting the Fury X to beat the 980Ti and put AMD in the lead again. All the people who had already decided to buy the Fury X before the reviews were even out.

They don't have to say it is a mediocre card, as I have said twice now. If they had just said the card is competitive with the 980Ti at 4k, which it definitely is, and released it at a slightly cheaper price point, the card would not have been as disappointing.
Not saying you're wrong, and in a perfect world it would be sweet, but this is like someone running for governor or even president: people still take notice of them and their promises.

If they turned around and said "our card is merely competitive," there would be far fewer people looking.

Gotta wonder, though: the R9 Fury might offer not much less performance than the Fury X at a much lower price, which might shake things up. I'll be checking it out, and I bet you will too.
Posted on Reply
#89
Frick
Fishfaced Nincompoop
newtekie1: AMD Claims Fury X is faster than 980Ti at 4k. = False
In 12 of the 22 games in w1z's charts, it is actually faster than the 980 Ti. In AMD's chart I count 12 games; I haven't bothered to check whether they are the same games.
AMD Claims Fury X runs at 50°C under typical load. = False
www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,11.html
AMD Claims, in the first 5 minutes, Fury X is the worlds fastest graphics processor, world most power efficient graphics processor, and allows revolutionary form factors. = Again all False
Haven't watched the video, but it is the fastest in cherry-picked tests. Which is the point of PR, and as far as I'm concerned they haven't told lies. Picking on a company for PR speak is like hating the sky because it's blue, or the sun because it shines.
Posted on Reply
#90
Dave65
It's really a damn shame that AMD always seems to shoot themselves in the foot. Yeah, new tech is great, but when you are trying to survive as a company you need to make damn sure you can keep up with #1, and they seem to be chasing their tail all the time.
Posted on Reply
#91
TRWOV
For me, the thing that really takes the wind out of Fury X's sails is how performance below 4K doesn't scale as it should. Really hoping the Nano doesn't suffer the same issue.
Posted on Reply
#92
RejZoR
What I've heard somewhere about the R9 Nano is that it won't even be as powerful as the R9 290X - probably something along the lines of R9 280X-ish performance in that tiny package. I hope those were just empty rumors and that it'll be a lot more powerful...
Posted on Reply
#93
m6tzg6r
The picture should be a green fist up a red butt.
Posted on Reply
#94
bubbleawsome
Great article! I don't want to get into the discussion around it, but I wanted to say I enjoyed it.
Seems like a pretty fair balance between green and red.
Posted on Reply
#95
kiddagoat
@the54thvoid Yes, I am very annoyed people are giving AMD so much crap over the Fury X. Because, as you stated, Nvidia blatantly lied on a spec sheet and then pulled some voodoo magic that got them off free and clear. Again, as you stated, I have no idea how they got away with it either. It just seems like a double standard in the tech industry as a whole: certain companies get away with murder, while others make some PR slides and people lose their minds when they find out it isn't the entire picture.

They did blow it up like they did with Bulldozer, but as the old saying goes: fool me once, shame on you; fool me twice, shame on me... and I'll be damned if it happens a third time. After Bulldozer I hung up the Red banner and just started making better-informed decisions. I don't buy into the PR hype; I wait for products to come out to make better decisions about them.

In this case, I fully agree the "fans" hyped this to the moon, and while those few cherry-picked titles in the testing slides weren't lying, they didn't paint the entire picture either. That's all I am saying: giving a company grief over PR and marketing that their rabid fans blow up isn't the company's fault. Now, I fully agree that saying it is the fastest GPU in the world was a bit more than necessary. As an engineer, I don't want to misrepresent anything. In this case, engineering probably told them it was parity, and when the suits/management saw a handful of tests it was faster in at 4K, they got the hype train rolling.

As I noted in another post, in general this radical fanaticism with tech companies and their fans is petty and just complete rubbish. People just need to agree to disagree and move on.
Posted on Reply
#96
RejZoR
They didn't get away with anything in my book. The GTX 970 doesn't exist for me; I'd only take it if it were €200 new in the store. Otherwise I'd go with the GTX 980, which isn't gimped - and that is what I'm planning to do.
Posted on Reply
#97
BiggieShady
If anyone wants to recreate conditions in which Fury X beats 980 Ti in every benchmark, do what AMD did in their benches before release: force 0x anisotropic filtering in drivers (basically they used trilinear or bilinear filtering)

IMO the bigger problem is the 99th-percentile metric.
In this plot, things are looking good:

In this "observed fps" plot, not so much:

I still think Fury X is a great card and I'm glad AMD has a competing GPU, but they need to do a frame-pacing fix in drivers... again. The frame-pacing fix they did for Hawaii simply isn't working properly for Fiji (or isn't used at all).
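For anyone unfamiliar with the metric: the 99th-percentile frame time is the frame time that 99% of frames come in under, so it exposes hitches that a plain average hides. A minimal nearest-rank sketch with made-up frame times (not real Fury X data):

```python
def percentile(values, pct):
    """Nearest-rank percentile: the sample below which roughly pct% of values fall."""
    ordered = sorted(values)
    k = max(1, round(pct / 100 * len(ordered)))  # 1-based nearest-rank index
    return ordered[k - 1]

# A mostly smooth 100-frame capture (16.7 ms = 60 fps) with a few spikes
frame_times_ms = [16.7] * 95 + [20.0, 25.0, 33.3, 40.0, 50.0]

avg = sum(frame_times_ms) / len(frame_times_ms)
p99 = percentile(frame_times_ms, 99)  # -> 40.0

# The average still looks close to 60 fps, but the 99th percentile
# reveals the stutter that shows up on an "observed fps" plot.
print(f"average: {avg:.2f} ms, 99th percentile: {p99} ms")
```

This is why the two plots above can disagree: average-fps charts smooth the spikes away, while percentile and observed-fps views keep them visible.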
Posted on Reply
#98
Darller
BiggieShady: If anyone wants to recreate conditions in which Fury X beats 980 Ti in every benchmark, do what AMD did in their benches before release: force 0x anisotropic filtering in drivers (basically they used trilinear or bilinear filtering)
Wow, those Physics and Combined scores really highlight the Fury X driver's high CPU usage and relative inefficiency vs 980ti. That's shocking.
Posted on Reply
#99
Slizzo
Folterknecht: You know what the most retarded thing is about AMD's frame limiter?

It only works up to 95 Hz ... (brain fart)²
RejZoR: Really? What the hell is the point of it then!? 144Hz monitors aren't exactly exotics anymore (not that I could afford one)...
That's not an AMD issue, it's a monitor manufacturer scaler issue. As the VESA standard matures you'll see manufacturers using better scalers.
Posted on Reply
#100
RejZoR
Erm, how is that a monitor problem?
Posted on Reply