Friday, June 26th 2015

AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

This has been a roller-coaster month for high-end PC graphics. The timing of NVIDIA's GeForce GTX 980 Ti launch had us putting the finishing touches on its review before our bags for Taipei were even packed. When it launched, the GTX 980 Ti set AMD both a performance target and a price target. Then began a three-week wait for AMD to launch its Radeon R9 Fury X graphics card. The dance is done, the dust has settled, and we know who has won - nobody. AMD didn't get the R9 Fury X wrong, but NVIDIA got its GTX 980 Ti right. At best, this stalemate yielded a 4K-capable single-GPU graphics option from each brand at $650. You already had those in the form of the $650-ish Radeon R9 295X2, or a pair of GTX 970 cards. Those with no plans for a 4K display already had great options in the GTX 970 and the price-cut R9 290X.

The Radeon R9 290 series launch from fall 2013 stirred up the high-end graphics market in a big way. The $399 R9 290 made NVIDIA look comically evil for asking $999 for the card it beat, the GTX TITAN; and the R9 290X remained the fastest single-GPU option, at $550, until NVIDIA launched the $699 GTX 780 Ti to get people back to paying through their noses for the extra performance. Then there were two UFO sightings in the form of the GTX TITAN Black and the GTX TITAN-Z, which made no tangible contributions to consumer choice. Sure, they gave you full double-precision floating point (DPFP) performance, but DPFP is of no use to gamers. So what could the calculation at AMD and NVIDIA have been as June 2015 approached? Here's a theory.
Image credit: Mahspoonis2big, Reddit

AMD's HBM Gamble
The "Fiji" silicon is formidable. It made performance/Watt gains over "Hawaii" despite the lack of significant shader-architecture improvements between GCN 1.1 and GCN 1.2 (nothing of the kind seen between NVIDIA's "Kepler" and "Maxwell"). AMD managed a 45% increase in stream processors for the Radeon R9 Fury X at the same typical board power as its predecessor, the R9 290X. The company had to find other ways to bring down power consumption, and one way to do that without sacrificing performance was to implement a more efficient memory standard, High Bandwidth Memory (HBM).
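As a rough check of that claim (a back-of-the-envelope sketch; the stream-processor counts are the two cards' publicly listed specs, assumed here rather than quoted from AMD):

```python
# Back-of-the-envelope check of the shader-count claim. The stream-processor
# counts are the cards' publicly listed specs (assumed here, not from AMD).
fiji_sps, hawaii_sps = 4096, 2816   # R9 Fury X vs. R9 290X

increase = fiji_sps / hawaii_sps - 1
print(f"Stream-processor increase: {increase:.1%}")   # ~45.5%

# At roughly unchanged typical board power (the premise above), performance
# that scales with shader count becomes a perf/Watt gain of up to ~1.45x.
print(f"Upper-bound perf/Watt factor: {fiji_sps / hawaii_sps:.2f}x")
```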

Implementing HBM right now is not as easy as GDDR5 was when it was new. HBM is more efficient than GDDR5, but it trades clock speed for bus width, and a wider bus entails more pins (connections), which would have meant an insane amount of PCB wiring around the GPU in AMD's case. The company had to co-develop the industry's first mass-producible interposer (a silicon die that acts as a substrate for other dies), relocate the memory to the GPU package, and still make do with the design limitation of first-generation HBM capping out at 1 GB (8 Gb) per stack, or 4 GB across the four stacks on AMD's silicon - after having laid out a 4096-bit wide memory bus. This was a bold move.
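To put numbers on that trade-off, here is a minimal sketch; the per-pin data rates (5 Gbps GDDR5 on "Hawaii," 1 Gbps for first-generation HBM) are the commonly cited figures, assumed here rather than taken from the article:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
# The data rates below are the commonly cited specs for these cards,
# assumed here rather than taken from the article.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

hawaii_gddr5 = bandwidth_gb_s(512, 5.0)   # R9 290X: 512-bit GDDR5 @ 5 Gbps/pin
fiji_hbm     = bandwidth_gb_s(4096, 1.0)  # R9 Fury X: 4 x 1024-bit HBM @ 1 Gbps/pin

print(f"Hawaii GDDR5: {hawaii_gddr5:.0f} GB/s")  # 320 GB/s
print(f"Fiji HBM:     {fiji_hbm:.0f} GB/s")      # 512 GB/s
# HBM delivers 60% more bandwidth at a fifth of the per-pin speed,
# purely by going eight times wider - hence the interposer.
```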

Reviews show that 4 GB of HBM isn't "Fiji's" Achilles' heel. The card competes in the same league as the 6 GB GTX 980 Ti at 4K Ultra HD (the resolution most taxing on video memory), and is just 2% slower than the GTX 980 Ti there. Its performance/Watt is significantly higher than the R9 290X's. We reckon this outcome would have been impossible had AMD not gambled on HBM and instead stuck to the 512-bit wide GDDR5 interface of "Hawaii," just as it stuck to a similar front-end and render back-end configuration (the front-end is similar to that of "Tonga," while the ROP count is the same as "Hawaii's").

NVIDIA Accelerated GM200
NVIDIA's big "Maxwell" silicon, the GM200, wasn't expected to come out as soon as it did. The GTX 980 and its 5 billion-transistor GM204 silicon are just nine months old in the market, and NVIDIA has sold a lot of them; given how the company milked the GM204's predecessor, the GK104, for a year in the high-end segment before bringing out the GK110 with the TITAN, something similar was expected of the GM200. Its March 2015 introduction - just six months following the GTX 980 - was unexpected. What was also unexpected was NVIDIA launching the GTX 980 Ti as early as it did. This card has effectively cannibalized the TITAN X just three months after its launch. The GTX TITAN X is a halo product, overpriced at $999, and hence not many GM200 chips were expected to be in production. We heard reports throughout spring that a high-volume, money-making SKU based on the GM200 could be expected only after summer. As it turns out, NVIDIA was preparing a welcoming party for the R9 Fury X, with the GTX 980 Ti.

The GTX 980 Ti was more likely designed with R9 Fury X performance, rather than a target price, as the pivot. The $650 price tag is likely something NVIDIA came up with later, after achieving a performance lead over the R9 Fury X by stripping down the GM200 just as much as it needed to get there. How NVIDIA figured out R9 Fury X performance in advance is anybody's guess. It's more likely that the price of the R9 Fury X would have been different if the GTX 980 Ti weren't around, than the other way around.

Who Won?
Short answer - nobody. The high-end graphics card market isn't as shaken up as it was right after the R9 290 series launch. The "Hawaii" twins held their own, and continued to offer great bang for the buck, until NVIDIA stepped in with the GTX 970 and GTX 980 last September. $300 gets you not much more than it did a month ago. At least now you have a choice between the GTX 970 and the R9 390 (which appears to have caught up); at $430, the R9 390X offers competition to the $499 GTX 980; and then there are leftovers from the previous generation, such as the R9 290 series and the GTX 780 Ti, but these aren't really the high end we were looking for. It was a joy to watch the $399 R9 290 dethrone the $999 GTX TITAN in the fall of 2013, as people upgraded their rigs for Holiday 2013. We didn't see that kind of spectacle this month. There is a silver lining, though: there is a rather big gap between the GTX 980 and GTX 980 Ti just waiting to be filled.

Hopefully, July will churn out something exciting (and bona fide high-end) around the $500 mark.

223 Comments on AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

#51
MrGenius
Stole the graphic from a cheap Chinese firework box. :laugh:
#52
Folterknecht
lilhasselhoffer: Let's do some math together, and start seeing if our BS detectors go off...
Using consumer prices (inflation of ~3%) to tackle that "problem" sure raises my bull-shito-meter, especially for companies operating mostly outside of the US or Europe (production, suppliers ...).

I don't claim to know the numbers, but I doubt it's a calculation you can do on a napkin or in a short forum post.
#53
mastrdrver
btarunr: Editorial / Opinion. Keep it civil.
I think this is not necessarily an AMD or NVIDIA thing, but a symptom of how hard it is getting to shrink the process at this point. Because node shrinks are stalling as they get harder and harder, these kinds of things are going to get worse and worse. I'd be surprised if anything came after TSMC's 16nm (or whatever it is) process until at least 5 years later.
#54
lilhasselhoffer
Folterknecht: Using consumer prices (inflation of ~3%) to tackle that "problem" sure raises my bull-shito-meter, especially for companies operating mostly outside of the US or Europe (production, suppliers ...).

I don't claim to know the numbers, but I doubt it's a calculation you can do on a napkin or in a short forum post.
You'll note that I stated some of these things as given assumptions.


The first big ask is that inflation is a constant. Inflation varies year to year and across sections of the market, and even then we aren't accounting for the market crash of 2008. If you somehow believed my numbers were 100% accurate, you failed to grasp that I was setting up a rough estimation. If you wanted 100% accurate numbers, you'd have to review governmental reporting for each year, and do the math for each individual year. That 100% accurate answer takes more than four times the effort of my 90% accurate answer. If you'd like to do that extended math, help yourself.
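(To illustrate the rough-versus-exact point, here's a minimal sketch of the two approaches; the year-by-year rates below are made-up placeholders, not actual CPI data.)

```python
# Compare a flat ~3% inflation assumption against year-by-year compounding.
# The yearly_rates list is a made-up placeholder, NOT real CPI data.
start_price = 550.0   # e.g., an R9 290X-class launch price, in dollars
years = 5

flat_estimate = start_price * 1.03 ** years

yearly_rates = [0.016, 0.032, 0.021, 0.015, 0.030]   # hypothetical figures
exact_estimate = start_price
for rate in yearly_rates:
    exact_estimate *= 1 + rate

print(f"Flat 3% estimate:      ${flat_estimate:.2f}")   # ~$637
print(f"Year-by-year estimate: ${exact_estimate:.2f}")  # ~$616
# The two land within a few percent of each other - which is the point of a
# back-of-the-envelope estimate.
```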

Company headquarters, and even manufacturing facilities, don't matter. To the consumer, the manufacturing plant could be down the street, halfway around the world, or in a parallel dimension. The manufacturing facility influences only the associated materials cost of the product, as the resources used to get the goods to the point of sale are lumped into the gross price. Adding in currency conversion rates is a needless complication, because they don't matter: the consumer only sees the shelf price. We are unconcerned with the price the manufacturer actually pays (and their subsequent profit margin).

Currency doesn't matter. Having to state this is silly, but relative currency value fluctuates daily. Despite this, the cost of a card doesn't fluctuate; fluctuating currency values are built into the selling price of cards. Even then, value fluctuation is generally insignificant. Assuming a 10% relative fluctuation in the value of currencies, the parent company eats a loss somewhere and a gain in another location. 2+3 = 7-2 = 900-901+6.

At this point I'm supposing that you want to factor in something else needlessly complex. How about AUD =/= USD =/= EURO =/= German Mark? The reason I chose one currency is that it makes things easy to relate. If you want to be pedantic, I suggest you start calculating the relative value of Francs, Pounds, AUD, USD, Marks, and a dozen other old Western European currencies. Kinda seems like you're looking for a justification for why I might be wrong, without regard for the contents of the argument.



At this point, I've justified my reasoning. It's time for you to do the same. If you have no argument, I'll assume you've acquiesced to the point. If you can come back and prove that pricing is actually in line with inflation, I'll gladly admit that I am wrong. It's your move @Folterknecht .
#56
mirakul
Given that Win10 and DX12 are so close, I hope there will be a re-bench on Win 10, and later a bench of the DX12 patches for Witcher 3, Batman AK, and Project CARS.

It's kind of surprising that TPU's reviews have been done on Win 7 for such a long time.
#57
HumanSmoke
AsRock: you can go a little further than that lol.
www.cnet.com/products/ati-radeon-9800-xt/#!
It has always been this way. The 7800 GTX 512 and its ATI counterpart, the X1900 XTX, caught flak at the time for pushing the $600-650 mark from people relatively new to tech.
For some people like myself, who got their start some time earlier, there has always been a high end. Exhibit #1 from 1998: a Quantum3D Obsidian² X-24.

According to this U.S. inflation calculator, that $600 now equates to almost $900. And that was considered a bargain compared with the Obsidian Pro 100DB-4440, which was four times the price.
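(For anyone wanting to reproduce that conversion: such calculators apply a simple CPI ratio. A minimal sketch, with approximate CPI values assumed for illustration:)

```python
# Inflation adjustment the way such calculators do it:
# value_now = value_then * (CPI_now / CPI_then).
# CPI values below are approximate U.S. annual averages, assumed for
# illustration rather than taken from the post.
cpi_1998, cpi_2015 = 163.0, 237.0

price_1998 = 600.0
price_2015 = price_1998 * cpi_2015 / cpi_1998
print(f"${price_1998:.0f} in 1998 is roughly ${price_2015:.0f} today")
# ~$873, i.e. "almost $900", matching the post's figure
```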
#58
AsRock
TPU addict
Yup, my NVIDIA 7800/7900 were around the $400 mark - two cards that I wish I never owned lol.


However, the cost of living is a lot higher now.
#59
arbiter
newtekie1: I think there were two big problems that led to the disappointing Fury X results: AMD's hype and AMD's ego.
Most of the hype around the card came from AMD fans, not AMD themselves. Most of that hype was based on rumored specs, which shows a lot of them haven't learned a thing over the last 4-5 years: the specs of something don't mean it's going to stomp everything else in its class. On paper it looked like a monster to some, but it turned out not to be so much in practice.
#60
johnspack
Here For Good!
I just want to know when NVIDIA will release the big-daddy chip at the consumer level. This TITAN stuff is crap. I need 970 prices to go down, argh.
#61
oldskooler
From the greater collective community POV, most would agree AMD didn't get it right. They rushed the product to market (their own words) and didn't have time for some features. I think the best review out there right now is HardOCP's review of the card. No DVI and no HDMI 2.0 is a deal-killer for many. I cannot use my 40" 4K Samsung, which is cheaper and has better PQ than nearly any monitor out there, nor can I use my Catleaps, which were dirt cheap and perform great. I think this card would have been great back in 2013/14.
#62
1c3d0g
john_: The rumored $850, probably, and the 980 Ti would have been a September launch. That way both companies would be selling high-end cards with a pretty nice margin. If things were different, that would have been the case.
Yeah.
john_: But NVIDIA is trying to throw AMD out of the mid-to-high-end market before AMD comes out with Zen. They did it with the 970, and they did it again with the 980 Ti.
Yep again.
john_: Intel iGPUs get more EUs and more cache (Kaby Lake offers 256MB cache in one model); if AMD produces a good Zen CPU core, its APUs will get a nice boost just from that. Add HBM, and things look bad for every discrete card under $150-$200.
That last part is not so easy to say. GPU makers have an uncanny ability to double, even triple, low-end GPU performance if/when integrated GPUs start performing too close for comfort. Focusing purely on Intel here: every time they figure out a way to boost iGPU performance, the established GPU makers just push the performance envelope further. Intel is dedicating tons of die area to their GPU efforts (usually half of the die is reserved for the GPU alone!), and in most generations they manage to DOUBLE performance, but the GPU makers always come back and smash them head-on, leaving them dizzy, so they have to go back to the drawing board to come up with another plan. So Intel will always play catch-up, in a constant game of cat & mouse, unless it buys or merges with a well-respected GPU maker.
#63
newtekie1
Semi-Retired Folder
arbiter: Most of the hype around the card came from AMD fans, not AMD themselves. Most of that hype was based on rumored specs, which shows a lot of them haven't learned a thing over the last 4-5 years: the specs of something don't mean it's going to stomp everything else in its class. On paper it looked like a monster to some, but it turned out not to be so much in practice.
No, there was a crap ton of AMD hype.

Examples:

AMD claimed Fury X is faster than the 980 Ti at 4K. = False

AMD claimed Fury X runs at 50°C under typical load. = False

AMD claimed Fury X under load is less than 32 dB. = False (but close, at least)

AMD claimed, in the first 5 minutes, that Fury X is the world's fastest graphics processor, the world's most power-efficient graphics processor, and allows revolutionary form factors. = Again, all False
#64
RejZoR
kn00tcn: Not true at all, you don't remove tearing if you cap, you only remove it if you sync.

People should have been capping/syncing for years - I know I am. Why should any hardware uselessly render more fps (competitive play excluded)? Or worse, why should anything at any time go to 3,000 fps in a menu!?

Anyway, you don't 'bring down power consumption' if many sites are testing at 4K with fps in the 40s; you only bring it down if you were originally going past 60 for sustained periods (or whatever refresh rate your monitor has).


proper enthusiast right here :respect:
True, you don't eliminate it entirely, but there's a difference between tearing when the framerate is below the screen's refresh rate and when it's above it. And that does make a difference.
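(To make the cap-versus-sync distinction concrete, here's a minimal frame-limiter sketch - a hypothetical game loop, not any real engine's API. A cap only throttles how often frames are produced, which saves power at high framerates; tear-free output additionally needs the swap synchronized to the display's refresh.)

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    """Placeholder for a game's actual rendering work."""
    time.sleep(0.005)   # pretend a frame takes 5 ms to render

# A frame-rate cap: sleep off the rest of each ~16.7 ms budget so no more
# than ~60 frames are produced per second. This trims wasted work (and power)
# whenever frames render fast, e.g. 3,000 fps menus. But capped frames are
# still presented whenever they finish, so tearing can still occur; only a
# sync mechanism (vsync / G-Sync / FreeSync) aligns swaps with the refresh.
for _ in range(120):   # simulate ~two seconds of capped frames
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)
```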
#65
arbiter
newtekie1: No, there was a crap ton of AMD hype.
Examples:
AMD claimed Fury X is faster than the 980 Ti at 4K. = False
AMD claimed Fury X runs at 50°C under typical load. = False
AMD claimed Fury X under load is less than 32 dB. = False (but close, at least)
AMD claimed, in the first 5 minutes, that Fury X is the world's fastest graphics processor, the world's most power-efficient graphics processor, and allows revolutionary form factors. = Again, all False
Well, those were said during the official announcement. There was a TON of hype from AMD fans, and then the 4K numbers didn't help, but AMD probably felt they had to make the numbers look good. AMD did put their foot in their mouth with those, but most of the hype and the letdown was on the fans' end.
#66
newtekie1
Semi-Retired Folder
arbiter: Well, those were said during the official announcement. There was a TON of hype from AMD fans, and then the 4K numbers didn't help, but AMD probably felt they had to make the numbers look good. AMD did put their foot in their mouth with those, but most of the hype and the letdown was on the fans' end.
Yes, but they were all said before the official launch, about a week before actually. The fan hype was really just people taking what AMD said and running with it.
#67
HTC
newtekie1: No, there was a crap ton of AMD hype.

Examples:

AMD claimed Fury X is faster than the 980 Ti at 4K. = False

AMD claimed Fury X runs at 50°C under typical load. = False

AMD claimed Fury X under load is less than 32 dB. = False (but close, at least)

AMD claimed, in the first 5 minutes, that Fury X is the world's fastest graphics processor, the world's most power-efficient graphics processor, and allows revolutionary form factors. = Again, all False
Unfortunately, I have to agree.

All the talk of performance-per-watt without giving any actual performance figures (and comparing against their biggest single-GPU power guzzler, too) made me suspect something like this, but I really hoped they had learned from the Bulldozer hype fiasco: it seems they didn't :(

The card is good, and it is a definite boost over their previous high-end single-GPU card, both performance- and power-usage-wise, but it fails quite hard because AMD hyped it WAY too much, making it seem much better than it actually is, and it's biting them in the ass with great-white-shark teeth ...

EDIT

@OP: since this is an editorial / opinion, it shouldn't be in the news section, IMO.
#68
Ebo
For me, the situation is a bit "difficult".

I see it as: AMD, with the Fury X, has shown that the HBM tech works; that's the way to go in the future, and that's where the money is.

As far as performance goes, it's a bit disappointing, which is mostly down to the hype around the Fury X. Had NVIDIA launched the GTX 980 Ti in late summer (as planned), we would all just sit and say... WOW, what a card. But NVIDIA pulled a rabbit out of the hat.

I know most gamers play at lower resolutions; to them it's an enthusiast card. It's NOT made for the masses, it's made for the top of the pops.
You might argue that the price is too high; I can only say get over it and find the card at the price point that suits you, I couldn't care less.
The price is right and people are buying it so far; now we can only wait and see what the R9 Fury on air brings to the table. That card might end up being the real bang for the buck for AMD, just like the R9 290 is: great performance for a reasonable price.

As for 4K gaming, it's just too soon; the screens just aren't good enough yet, but they will be eventually. If you are in the market for a card like the Fury X or GTX 980 Ti, it would be a real shame to run it on a 1080p screen; that's simply not good enough to use all that power. The sweet spot, in my opinion, is 2560x1440 as of now.
#69
kiddagoat
I dunno..... seems like @newtekie1 is the sort of passive-aggressive fanboy..... I mean, I don't hear anything constructive, just constant bashing without all the flair and flame usually associated with the die-hard fanboy......

Yeah, AMD hyped up the card, they most certainly did.... I mean, that's the PR and marketing team's job... they generate buzz... NVIDIA does it, Intel does it, MSI, ASUS, on and on... all vendors do it.... Looking at the raw numbers, yeah, the 980 Ti is ahead... but you are talking 5-10 FPS at their targeted resolutions, which are 1440p and above.... which in most cases turns out to be a wash because they trade back and forth.... yeah, at lower resolutions the gap is larger, but these cards aren't made for 1080p or lower.... they aren't marketed at that...

Again, this is a case of people just looking for any little thing to bash a company on.... I mean, does it really make you feel that much better to come here and flame?? I am sometimes amused by these comments, but some of these are just disheartening, both as an enthusiast and an engineer.... wtf...... It is as if people think coming up with these technologies is easy and that anyone can do it...
#70
qubit
Overclocked quantum bit
BiggieShady: How about that high-pitched pump noise we read about in reviews... did they fix it? Is it noticeable with your card?
kiddagoat: No noise at all.... I have a Sapphire... The card is inaudible... silent. I had to look inside the case to make sure the fan was running when I turned my machine on.
I'm very glad the noise isn't a problem like in the review.

In that case, it looks like AMD shot themselves in the foot by delivering a faulty unit to TPU for review. I think they would do well to send another one so the review can be updated, since a problem like that is a dealbreaker.

Enjoy your card. :toast:
#71
Frick
Fishfaced Nincompoop
qubit: In that case, it looks like AMD shot themselves in the foot by delivering a faulty unit to TPU for review. I think they would do well to send another one so the review can be updated, since a problem like that is a dealbreaker.

Enjoy your card. :toast:
I think other reviewers had the same problems; it could be a bad batch.
#72
techy1
Nice title picture..., but it needs an update... sadly, next time (after a year or two) there will be no red or green fists in this picture - there will be only the green hand, full of our green money (cuz the green team will ask any price without competition). RIP AMD :'( you were great and we all needed you (though you sucked these last few years).
#73
Ebo
#74

What rock do you live under?

AMD will live no matter what; even NVIDIA needs them, otherwise there will be no development in graphics, and then it's all over. NVIDIA hasn't come up with a single breakthrough in years; they have only been riding the wave AMD/ATI started. I'm actually not a fanboy, but I've had AMD/ATI ever since the battle between the ATI 8500 and the GeForce 2.
I've stuck with AMD since before Catalyst drivers were invented, but I'm still not a fanboy saying AMD is best.
I've only gone that way because I know how to tweak the drivers for AMD, and I have no clue how to do that on an NVIDIA card, and I really don't care. I'm happy with the choices I've made, and I will stick with them no matter what, as long as my card delivers and performs the way I expect it to.
#74
qubit
Overclocked quantum bit
HumanSmoke: It has always been this way. The 7800 GTX 512 and its ATI counterpart, the X1900 XTX, caught flak at the time for pushing the $600-650 mark from people relatively new to tech.
For some people like myself, who got their start some time earlier, there has always been a high end. Exhibit #1 from 1998: a Quantum3D Obsidian² X-24.

According to this U.S. inflation calculator, that $600 now equates to almost $900. And that was considered a bargain compared with the Obsidian Pro 100DB-4440, which was four times the price.
Wow, that's a blast from the past! And I can see the price is eye-watering, too.

That looks like a daughterboard at the back there. Do you have a clearer picture of it?
#75
newtekie1
Semi-Retired Folder
kiddagoat: I dunno..... seems like @newtekie1 is the sort of passive-aggressive fanboy..... I mean, I don't hear anything constructive, just constant bashing without all the flair and flame usually associated with the die-hard fanboy......

Yeah, AMD hyped up the card, they most certainly did.... I mean, that's the PR and marketing team's job... they generate buzz... NVIDIA does it, Intel does it, MSI, ASUS, on and on... all vendors do it.... Looking at the raw numbers, yeah, the 980 Ti is ahead... but you are talking 5-10 FPS at their targeted resolutions, which are 1440p and above.... which in most cases turns out to be a wash because they trade back and forth.... yeah, at lower resolutions the gap is larger, but these cards aren't made for 1080p or lower.... they aren't marketed at that...

Again, this is a case of people just looking for any little thing to bash a company on.... I mean, does it really make you feel that much better to come here and flame?? I am sometimes amused by these comments, but some of these are just disheartening, both as an enthusiast and an engineer.... wtf...... It is as if people think coming up with these technologies is easy and that anyone can do it...
AMD can't go back and change the way they hyped up the card; they can't undo the straight-up lies they told about the card leading up to its launch. Yes, companies hype up products, and they even cherry-pick test results to make a product look good before launch. However, AMD made bold claims with no cherry-picked sources to back them up. They just flat out made claims that were simply not true. There is a difference between what AMD did before the launch of the Fury X and what other companies do to promote their upcoming products.

The card is good, I never even hinted it wasn't. AMD just marketed it wrong; their PR department left the situation with only one possible outcome: disappointment. Like I said, if they had been straight up and just said "we're releasing a card that can compete with the 980 Ti at 4K" and then released it at $50 less because it is slightly weaker than the 980 Ti, they would have had a winning launch. They didn't have to lie and say it beats the 980 Ti at 4K, or that it is the fastest GPU in the world. They just had to establish that the Fury X would be a reasonable alternative to the 980 Ti at a slightly lower price. That is all they had to do, but their ego got the best of them. That is my constructive criticism, and it was in my first post. Promote the card, hype the card, but don't over-hype the card to the point that it can't live up to what you are saying about it.

These aren't fanboy statements, they are my opinion on the situation and why the Fury X ended up a disappointment. I mean, clearly I'm an nVidia/Intel fanboy. I've got 2 Intel/nVidia systems and 6 AMD/AMD systems in my house right now... but I just love me some AMD bashing... :rolleyes: Sorry, but the fanboy is the one that can't take someone even suggesting that their beloved company of choice did something wrong, and thinks any negative opinion of the company's actions is "flaming" them.