# AMD Revises Pump-block Design for Radeon R9 Fury X



## btarunr (Jul 2, 2015)

AMD seems to have reacted swiftly to feedback from reviewers and owners of initial batches of its Radeon R9 Fury X over a noisy pump-block, and revised its design. The revised pump-block lacks the "high pitched whine" that users were reporting, according to owners. At this point there are no solid visual cues for identifying a card with the new block; however, a user with the revised card (or at least one that lacks the whine) pointed out a 2-color chrome Cooler Master (OEM) badge on the pump-block, compared to the multi-color sticker on pump-blocks from the initial batches. You can open up the front-plate covering the card without voiding your warranty.



 



*View at TechPowerUp Main Site*


----------



## TheGuruStud (Jul 2, 2015)

More ROPs or throw it away.


----------



## R-T-B (Jul 2, 2015)

TheGuruStud said:


> More ROPs or throw it away.



It's not THAT bad.  It's not good compared to a TI or something, but you can't seriously expect AMD to go back to the drawing board at this point.

Their most competitive move would be to put it in the 980s price bracket, IMO.  It would absolutely be competitive then.


----------



## TheGuruStud (Jul 2, 2015)

R-T-B said:


> It's not THAT bad.  It's not good compared to a TI or something, but you can't seriously expect AMD to go back to the drawing board at this point.
> 
> Their most competitive move would be to put it in the 980s price bracket, IMO.  It would absolutely be competitive then.



It is bad. It's terrible.

They had it. They got the power consumption down to a decent level, crammed a shitload of shaders in there, crazy fast RAM... and then bottlenecked the whole goddamn card. Screw them. They're running out of things to screw up.

It is a monumental failure b/c their profitability relied on it and now they're going to lose their ass even more.

Fire every goddamn exec and lead engineer that allowed this to happen.


----------



## ZoneDymo (Jul 2, 2015)

TheGuruStud said:


> More ROPs or throw it away.



I could say "more performance or throw it away" about every video card out today, tbh, because honestly they are not where they should be.


----------



## ZoneDymo (Jul 2, 2015)

Fantastic, but what I don't understand, yet again, is how they did not catch this during development.
Is everyone on the team deaf or something? Why not address it from the start instead of letting it get some bad press on release?


----------



## Xzibit (Jul 2, 2015)

TheGuruStud said:


> It is bad. It's terrible.
> 
> They had it. They got the power consumption down to a decent level, crammed a shit load of shaders in there, crazy fast ram....and then bottlenecked the whole goddamn card. Screw them. They're running out of things screw up.
> 
> ...



Scott explains it well without the tinfoil hats from either side.

*The TR Podcast 178: Going deep with the Radeon Fury X*

*1:09:00+*


----------



## chinmi (Jul 2, 2015)

useless change... no one in their sane mind wants to buy a weak failed card when for the same money they can have a much stronger, cooler, more overclockable, more silent, more power efficient card with great driver support and an awesome green color, the 980 Ti !!


----------



## SetsunaFZero (Jul 2, 2015)

Fury isn't even launched and ppl are already whining. Give it some months after launch day (16.6) and the price will drop. What bugs me more are the DX12 benches.


----------



## ZoneDymo (Jul 2, 2015)

chinmi said:


> useless change... no one in their sane mind wanna buy a weak failed card when with the same money they can have a much stronger, cooler, overclockable, more silence, great driver suppport, more power efficient and awesome green color, the 980ti !!



the fanboy is strong with this one


----------



## techy1 (Jul 2, 2015)

I just hope (for everybody's sake) that AMD can make this crappy Fury dirt cheap - and there is a lot of room for price cuts, so AMD can still make a profit... at this price, Fury's price/performance sux so bad that I can not even believe it is 2015 out there... but we all need AMD alive, or else Nvidia can and will go crazy with prices... so let us all pray that AMD gets some cash from red fanatics now, with this stupid price tag, and in a few weeks has plenty of room for price cuts; then that GPU will not look so bad after all (it will still sound bad and heat your room - but it must be dirt cheap)


----------



## Ferrum Master (Jul 2, 2015)

Very pleased to see such a fast-paced reaction to the issue... I bet they knew about it already before the launch.


----------



## FordGT90Concept (Jul 2, 2015)

Xzibit said:


> Scott explains it well without the tinfoil hats from either side.
> 
> *The TR Podcast 178: Going deep with the Radeon Fury X*
> 
> *1:09:00+*


Very interesting.  AMD was limited by the size of the interposer which gave a GPU handicap against NVIDIA but a huge memory boon.  And while I watched that, I was thinking "what is really stopping AMD from making these Fiji chips swappable?"  Since virtually all of the magic is happening at the interposer level and above and not on the card anymore, are we quickly reaching the point where we can swap GPUs like we swap CPUs?  Also, what does this mean for CPUs (especially the embedded variety)?  If GPU memory can be set into an interposer to vastly increase bandwidth and response time, why can't CPUs do the same?  I think we know what next generation console processors (perhaps even the next Nintendo console) will look like: imagine a Jaguar CPU, a smaller version of Fiji (maybe 1024-2048 stream processors), and 8-16 GiB of HBM all on one interposer...
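Just to put numbers on that memory boon, here's a quick back-of-the-envelope sketch (the HBM1 and GDDR5 figures are the commonly cited ones, so treat them as illustrative rather than official):

```python
def peak_bandwidth_gb_s(bus_width_bits, rate_gbps_per_pin):
    """Peak theoretical bandwidth in GB/s: bus width x per-pin data rate / 8 bits per byte."""
    return bus_width_bits * rate_gbps_per_pin / 8

# Fiji's HBM1: four stacks, each with a 1024-bit interface at 1 Gbps per pin
hbm1 = peak_bandwidth_gb_s(4 * 1024, 1.0)   # 512.0 GB/s

# A 980 Ti-style GDDR5 setup: 384-bit bus at 7 Gbps per pin
gddr5 = peak_bandwidth_gb_s(384, 7.0)       # 336.0 GB/s
```

The wide-but-slow HBM bus only works because the interposer can route thousands of lines that a PCB never could.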

Fury X performs similarly to the 980 Ti and they are priced similarly as well.  Sure, Fury X might be a few percentage points slower, but the trade-off at the same purchase price is water cooling versus air cooling.  In my opinion, that trade-off more than offsets the price/performance difference.

The next node of GPUs, whatever they are so long as they aren't 28nm, will be very, very interesting.


On topic: I'm glad they got it fixed.  The advantage of being water cooled (quiet) being swept away by a noisy pump is a deal breaker on the aforementioned advantage of Fury X over 980 Ti.


----------



## RejZoR (Jul 2, 2015)

A smaller version of Fiji with 2048 shaders would basically make an R9 290X with GCN 1.2. Not sure if you could stuff that into an APU just yet...


----------



## Ferrum Master (Jul 2, 2015)

FordGT90Concept said:


> what is really stopping AMD from making these Fiji chips swappable?"



A 300W thing in a socket? Remember early Socket 1155 boards burning out due to bad pins? And that was only a 130W thermal package.

It would need to be bigger, then designed for the added capacity and resistance - more added cost, testing and the like... higher RMA rates, etc...


----------



## Easo (Jul 2, 2015)

chinmi said:


> useless change... no one in their sane mind wanna buy a weak failed card when with the same money they can have a much stronger, cooler, overclockable, more silence, great driver suppport, more power efficient and awesome green color, the 980ti !!



The Fury X is a weak card to you?
Really?


----------



## mroofie (Jul 2, 2015)

ZoneDymo said:


> the fanboy is strong with this one



Not really, it's the truth, and the truth obviously hurts.

But yes, he is a fanboy, judging by his previous posts.


----------



## buggalugs (Jul 2, 2015)

TheGuruStud said:


> It is bad. It's terrible.
> 
> They had it. They got the power consumption down to a decent level, crammed a shit load of shaders in there, crazy fast ram....and then bottlenecked the whole goddamn card. Screw them. They're running out of things screw up.
> 
> ...



AMD slotted the card in exactly where they wanted it.

The Fury X is selling out, they can't make enough of them, and they are selling at 30% above the recommended price in some places.

I wasn't going to buy a Fury X just because of the closed-loop cooler, but what the heck, I'm going to buy one (when I can; they are sold out). I might get 2.  Thanks for the advice.


----------



## FordGT90Concept (Jul 2, 2015)

Ferrum Master said:


> 300W thing in a socket? Remember early socket 1155 burning out due to bad pins? An that was only a 130W thermal package.
> 
> It needs to be bigger, then designed for the added capacity and resistance, more added cost, testing etc stuff... bigger RMA rates etc...


But most of those pins on a CPU connect to very high frequency DIMMs.  With Fiji, those connections are in the interposer--not carried out to the socket.  Yes, because of the 250+ watt requirement, it would still have to be on a daughterboard or accept PCIe power directly on the motherboard close to the socket.  I still think it may be feasible.  Basically all this chip needs is power, PCI Express lanes, and pins to wire up the DisplayPort connectors.

No one can deny HBM and the interposer open up possibilities that did not exist with GDDR5.  They could theoretically even move some logic to the interposer freeing up even more die space for the GPU.


----------



## Deleted member 138597 (Jul 2, 2015)

What I think, after a few days of digging, is that 28nm is obviously not the right point in time for HBM. You see, for GCN, if you cram in a few more ROPs, engines, etc., it _can_ beat the TITAN X. But they can't cram in any more. Why? Because of the limitations of HBM and of course the interposer. I guess there are some shortcomings you can't figure out until they're implemented.

Whatever the reason, I think the Fury X can outperform a 980 Ti with a small clock boost after new driver releases and the voltage unlocks.

Anyway, glad AMD is being swift on the user reactions; that's quite rare. You don't see big companies, usually separated from end users & enthusiasts by thick walls, reaching out and hearing them so quickly every day. I guess AMD is trying to get on top of every little detail possible, and regain their position again.


----------



## Aquinus (Jul 2, 2015)

FordGT90Concept said:


> No one can deny HBM and the interposer open up possibilities that did not exist with GDDR5. They could theoretically even move some logic to the interposer freeing up even more die space for the GPU.


Like what? Latency is still a thing and for it to exist outside the GPU, it would need to be connected to the memory controller or the PCI-E controller... both of which are relatively slow latency wise versus having something on the same GPU die like cache. I don't think there would be much benefit by doing this as the point of HBM was to move components *closer* to the core, not further away from it. I honestly think that last sentence makes very little sense from a technical perspective.


----------



## Bytales (Jul 2, 2015)

I'm not going to deny myself the pleasure of owning 2 Fury Xs just because it was supposed to be the fastest GPU and ended up 3-4% beneath the 980 Ti/Titan X. Do remember that we are yet to see DX12 benchmarks and games, where I have a feeling the Fury X might come out on top of Nvidia.

I'm getting the Fury X because I'm supporting FreeSync (clearly Nvidia could have made G-Sync without a G-Sync module, as there are laptops now on the market with G-Sync screens without the expensive G-Sync module, yet they chose to sell us an expensive extra that's not really needed, and continue not to support a free standard which would cost them nothing - the DisplayPort 1.2a spec). After I sold my Asus ROG Swift, I now own the 32-inch 4K IPS from Samsung.

And the Fury X is the best card AMD has for 4K; 4 GB is clearly enough for 4K, as I'm not going to use 32x AA at 4K anyway.
Apart from that, I'm mostly choosing the Fury X because of its small PCB, which can be made single-slot with a waterblock; it's not extra wide and it's not extra long.
And I could fit 4 of them with custom waterblocks and still have room for an Areca RAID card and a PCI Express USB 3.0 adapter. The second USB 3.0 adapter I can take out (thus freeing the slot for the 4th card) - I can do this because, thanks to the short length of the card, I can use the built-in USB 3.0 19-pin connectors. That would be impossible with the long Titans or 980 Ti, or the 390X for that matter.

I cannot take wider cards, like the 980 Ti Kingpin would be, which would be the only other competitive card that can be made single-slot through water cooling, as I have 40mm fans installed on the side.

So these are the reasons I am choosing the Fury X: the 1 or 2 frames are not going to make or break a game compared to the 980 Ti or Titan X, but its size and build characteristics, as well as the support for FreeSync, are what compel me to choose the Fury X over its direct competitors.

Not to mention I want to help AMD get back on their feet a bit. Competition is good for us all.


----------



## RejZoR (Jul 2, 2015)

buggalugs said:


> AMD slotted the card in exactly where they wanted it.
> 
> The FuryX is selling out, they cant make enough of them, and they are selling at 30% higher than recommended price in some places.
> 
> I wasn't going to buy FuryX just because of the closed loop cooler but what the heck, I'm going to buy one, (when I can buy one, they are sold out) I might get 2.  Thanks for the advice.



If I had plenty of money (which I don't), I'd get the R9 Fury X just because it's such a novelty. It's revolutionary on several levels, and some people want exotic stuff even if it's not the best at raw framerate. It's like choosing between the fastest possible car, on roads where you might not be able to use that speed anyway, and another in which you look real good and the drive itself is sublime. That's how I see Fury. It's a beautiful, exotic card and some people just want that over raw performance.


----------



## FordGT90Concept (Jul 2, 2015)

Aquinus said:


> Like what? Latency is still a thing and for it to exist outside the GPU, it would need to be connected to the memory controller or the PCI-E controller... both of which are relatively slow latency wise versus having something on the same GPU die like cache. I don't think there would be much benefit by doing this as the point of HBM was to move components *closer* to the core, not further away from it. I honestly think that last sentence makes very little sense from a technical perspective.


e.g. the embedded DisplayPort logic and the PCI Express controller.  It could be designed in a way that the interposer is actually closer (think 3D) than the component in the GPU.  There are only two problems with moving components to the interposer:
1) it is 65nm, so they'll require more power
2) heat dissipation is a problem, so it can't be anything too intensive

Let me put it this way: most GPUs are designed with the PCI Express controller off to the side and logic branches out from there.  The controller could instead be in the center of the interposer and connect directly up to the GPU.  The distance from the PCI Express controller would be equally short to all compute units.


The purpose of HBM was to stack memory.  Because that resulted in many, many pins for each stack, the interposer was the only reasonable solution to connect everything.  Think embedding logic in the interposer as stacking the GPU.  The gains wouldn't be as massive as HBM but there would still be gains.


----------



## SonicZap (Jul 2, 2015)

I like how AMD lately seems to react quicker to issues than before. While Fury X was released already, their response time (a bit over a week) for this issue wasn't that bad and they also released drivers for the latest Batman game before its flopped release. I hope they keep it up. It's hard to avoid issues with hardware as complex as graphics cards (as evidenced by the Chrome TDRs, Nvidia isn't perfect either), but how fast you solve those issues is critical.


----------



## Fluffmeister (Jul 2, 2015)

This is an issue they need to sort out:

*Retail Fury X coolers still whine, don't include fix*

http://techreport.com/news/28566/retail-fury-x-coolers-still-whine-dont-include-fix


----------



## jboydgolfer (Jul 2, 2015)

Before I ask/pose this question, please excuse my ignorance when it comes to this particular subject.

After watching the video posted by someone above, I took it that the AMD engineers were bound by the nature of HBM and its positioning on the interposer, which ties their hands as far as making a bigger/beefier chip to "blow away" Nvidia's higher-end Tis, etc... meaning they are stuck, since there is only SO much room...

My question/point is this. Given that the HBM MUST be on an interposer, could it be an option for the engineers to come out with a "dual chip" card, e.g. (7990) (6990), etc., but instead of it being 2 graphics processors, one chip could be one large "Kill 'Em All" GPU, and the other chip's interposer dedicated SOLELY to housing the HBM?

Or, another way: one large GPU-only chip, then a smaller GPU with, say, 8 GB of HBM?

I understand that there are inherent limitations when Crossfiring on one PCB (or two), ESPECIALLY (I would imagine) with such high-bandwidth DRAM, but I can't help but wonder if that would be a viable direction to explore...


----------



## ShurikN (Jul 2, 2015)

TheGuruStud said:


> It is bad. It's terrible.



Last truly terrible card from AMD was HD 2900XT. Every card after was more than competitive with NV.


----------



## jigar2speed (Jul 2, 2015)

Fluffmeister said:


> This is an issue they need to sort out:
> 
> *Retail Fury X coolers still whine, don't include fix*
> 
> http://techreport.com/news/28566/retail-fury-x-coolers-still-whine-dont-include-fix


This is exactly what is being discussed here, AMD has a fix...


----------



## Initialised (Jul 2, 2015)

R-T-B said:


> It's not THAT bad.  It's not good compared to a TI or something, but you can't seriously expect AMD to go back to the drawing board at this point.
> 
> Their most competitive move would be to put it in the 980s price bracket, IMO.  It would absolutely be competitive then.


I disagree, to make it competitive they need to lose the pump and ship a single slot version of the card with a full cover block.


----------



## APEKS (Jul 2, 2015)

If that's the case, Guru3D had one from the second batch but still said there was coil whine, so it obviously hasn't done much.
http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,5.html


----------



## the54thvoid (Jul 2, 2015)

jboydgolfer said:


> before i ask/pose this question, Please excuse My ignorance when it comes to this particular subject.
> 
> After watching the video posted by someone above, I took that the AMD engineers were bound by rules of the nature of HBM ,and its positioning on the interposer, which binds they're hands as far as making a bigger/beefier chip to "blow away" Nvidia's higher end Ti's, etc... meaning they are stuck since there is only SO much room...
> 
> ...



Forgive me if I'm wrong, but is the interposer not required for the HBM to talk to the chip, to remove the need for extremely complex PCB traces?
A separate HBM area would require PCB traces that would add latency? I think the interposer is required, and the only current issue is that it's manufactured on a far larger process than the chip it's linked to.


----------



## the54thvoid (Jul 2, 2015)

jigar2speed said:


> This is exactly what is being discussed here, AMD has a fix...



Sorry for the double post; on a phone it's harder to copy-paste quotes into an edit.

The fix was meant to appear in retail (as reviewers were told by AMD). I'm sure those review sites with retail versions could easily remove the cover to see which version they have.
AMD will have some whiny versions in the channel, but I imagine they'll sort it out.


----------



## AsRock (Jul 2, 2015)

RejZoR said:


> If I had plenty of money (which I don't), I'd get R9 Fury X just because it's such a novelty. It's revolutionary on several levels and some want exotic stuff even if it's not the best at the raw framerate. That's like picking the fastest possible car for road that might not be able to utilize it anyway. But you look real good in the other one and the drive itself in it is sublime. That's how I see Fury. It's beautiful, exotic card and some people just want that over raw performance.



Wait for the Fury; that should be even better, as custom designs will be allowed. Whether it will be nerfed, we will have to see.


----------



## RejZoR (Jul 2, 2015)

I've had enough of waiting. At first it was March/April, then it was June, and now it's July. No. Enough is enough. Why the fuck they couldn't release the Fury X and the normal Fury together is just out-of-this-world idiotic.


----------



## MxPhenom 216 (Jul 2, 2015)

ZoneDymo said:


> the fanboy is strong with this one


I could say the same thing to you in reply to 90% of your posts. Oh the irony.

I pointed it out the day reviews flooded the internet: the ROP count does not make any sense for a card with ~4000 SPs. It has a "bottlenecking" issue similar to the one the 7970/280X has.


----------



## Aquinus (Jul 2, 2015)

jboydgolfer said:


> Or, another way as One Large GPU Only chip, then a Smaller GPU with say 8Gb's of HMB?


Memory latency. If you do something like that, you're still stuck with the same limitations as GDDR5. HBM can be really wide because there are no traces going through the PCB (the graphics card itself) to connect memory to the GPU; you only have the interposer. If you have two different ICs on the same PCB, you need traces going through the PCB to make the connections between the two, and the wider you make the bus, the more wires you need. Also, because of the distance problem, latency goes up.

The point of HBM was to put memory closer to the compute cores, because that results in lower latency and (relatively) easier implementation of wider data buses. Moving HBM to a different IC goes against the reasoning for making HBM in the first place, IMHO: to get memory closer to the compute cores and to take up less space. Adding dedicated ICs accomplishes neither of those goals.


----------



## Fluffmeister (Jul 2, 2015)

jigar2speed said:


> This is exactly what is being discussed here, AMD has a fix...





			
AMD said:

> The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump. This problem has been resolved and a fix added to production parts and is not an issue.



They said retail wouldn't be affected, but they clearly are.


----------



## ironwolf (Jul 2, 2015)

So if I get this correctly, there is no indicator on the box itself that the card is the revised version with the updated pump-block?  Does this mean people might play video card roulette, and might be returning perfectly working cards for another spin?


----------



## the54thvoid (Jul 2, 2015)

ironwolf said:


> So if I get this correctly, there is no indicator on the box itself that the card is the revised version w/updated pump-block?  Does this mean people might play video card roulette and people might be returning their working cards to play?



Yes, but when you buy a gfx card from either vendor you're also rolling the dice on coil whine these days.


----------



## R-T-B (Jul 2, 2015)

TheGuruStud said:


> It is bad. It's terrible.
> 
> They had it. They got the power consumption down to a decent level, crammed a shit load of shaders in there, crazy fast ram....and then bottlenecked the whole goddamn card. Screw them. They're running out of things screw up.
> 
> ...



If you're approaching it from a design and profitability perspective it's certainly a failure.

I'm just pointing out from a consumer perspective the card can still be made competitive.



ShurikN said:


> Last truly terrible card from AMD was HD 2900XT. Every card after was more than competitive with NV.



Speaking as a sad owner of said card, I'll admit that card was about as bad as they came...


----------



## ZoneDymo (Jul 2, 2015)

MxPhenom 216 said:


> I could say the same thing to you in reply to 90% of your posts. Oh the irony.
> 
> I pointed it out the day reviews flooded the internet, the ROP count does not make any sense for a card with ~4000 SPs. It has a similar "bottlenecking" issue that the 7970/280x has.



I dare you to find me 3 posts of mine that show pure fanboyism.
I'll also have you know that 3 of my previous cards were Nvidia.

Apart from that, his post was not just borderline trolling fanboyism, it also made no sense as a point.
For example, calling Fury weak and the 980 Ti much stronger: apart from that being just wrong, how is being much stronger than "weak" anything special? That would barely make it "good".


----------



## v12dock (Jul 2, 2015)

I love that we're all engineers/scientists.


----------



## TheGuruStud (Jul 2, 2015)

ShurikN said:


> Last truly terrible card from AMD was HD 2900XT. Every card after was more than competitive with NV.



I've been buying Radeons since the 4890 b/c they have been a better buy for me, all the way up to the 290X.

I bought a 980 Ti this week. Everyone knows I hate Nvidia. FURY X Fing SUCKS!

If they can get their shit together, then I will GLADLY buy their new card on launch day and disown the GTX.


----------



## INSTG8R (Jul 2, 2015)

R-T-B said:


> If you're approaching it from a design and profitability perspective it's certainly a failure.
> 
> I'm just pointing out from a consumer perspective the card can still be made competitive.
> 
> ...



It was pretty though, wasn't it? I had one, and had insane dreams of Crossfire; thankfully that moment passed, though I did briefly own a 2900 Pro just to give Crossfire a shot. Total failure; I sold the card to a buddy the next day.
Let's not forget that the 3xxx wasn't any better, being just a refresh of an already horrid chip. The 4xxx was when they came back in force.

As for Fury, I am very anxiously awaiting Sapphire's take on the Fury (non-X). That is the card I am more interested in.


----------



## haswrong (Jul 2, 2015)

TheGuruStud said:


> It is bad. It's terrible.
> 
> They had it. They got the power consumption down to a decent level, crammed a shit load of shaders in there, crazy fast ram....and then bottlenecked the whole goddamn card. Screw them. They're running out of things screw up.
> 
> ...


I think they decided to work around it with a dual-chip version once again. For DX12 titles there should be 4+4 GB of VRAM; only the power draw and pricing will be funny again... I think the $650 price tag is more suitable for the dual-chip than a single-chip card. Whatever... as someone noted earlier, all video cards are kinda meh performance- and price-wise.


----------



## Ikaruga (Jul 2, 2015)

Hey, I'm a top technology decision-maker at AMD, and now that we have also tested our top-of-the-line, world's-fastest-and-coolest card ever, we actually found out that the pump is indeed whining like hell, just as our customers said, so we fixed it for you. Thanks.

/sarcasm


----------



## haswrong (Jul 2, 2015)

Bytales said:


> ...
> So these are the reasons i am choosing the fury x, the 1 or 2 frames are not going to make or break a game compared to the titanium 980 or titan x, but its size and build characteristic, as well as the support for FREE SYNC; is what compells me to choose the fury x instead of its direct competitors.
> 
> Not to mention i want to help a bit AMD get on their feet. Competition is good for us all.


You mean never-ending competition without a clear winner... right?


----------



## Casecutter (Jul 2, 2015)

It's sad how vehemently people are deploring the Fury X, almost as if they have to spew condemnation and have never really studied more than one review, instead running straight to the conclusion page.

The Fury X is intended for 4K, and at that it *matches* the 980 Ti, reference vs. reference (sure, there are OC'd 980 Tis). They are as close to identical as possible while being totally different, as two companies' products should be. It's good for us that AMD isn't trying to clone the competition; it's nice to have some choice... And all things considered, it's more than a valiant effort. AMD has delivered a competitive card, and while both have pluses and minuses, why does AMD have to tender a lower price and succumb to less profit? As the mantra went back when some set an inordinate price... "they're not a charity".

Sure, there were missteps by executives who don't know about "loose lips...". Though AMD negated the whole OC'ing issue in like a day or two, no one seems to have heard about that part. Pump whine... is that any different from coil whine? Some surmise they could/should have thrown more ROPs/SPs at it, though they have no idea whether technical challenges or developments had a bearing on such considerations; if you do, you might send in a resume for such a job.

Was it the "win" the forums and rumor mill were drumming up months ahead? _No_, though it hardly ever is. Were there executives prior to launch that weren't using/given properly vetted talking points... Honestly, executives and engineers should never speak on behalf of the company unless truly tasked to do so... STFU.  AMD needs credible young spokespersons... I envision a team of, say, 3: a fun/hip gamer (speaker), an easy-talking but strong (technical) type, and a "Luther" (aka anger translator) that hangs back, adds fun, debunks sh#t, but keeps it real. That would be the all-encompassing AMD PR front, the media face that provides both social campaigns and announcements - either tapped collectively for media/stage events, or more often individuals working a "measured cadence" for releasing information.
*AND lastly*... just dump Richard Huddy, he's just not cutting it!


----------



## GorbazTheDragon (Jul 2, 2015)

Casecutter said:


> It's sad how vehemently people are deploring FuryX, almost as if they have to spew condemnation and have really never studied more than one review and then ran to the conclusion page.
> ...


TL;DR: People believed the marketing hype.

That never happened before, did it??? Oh, wait, Devil's Canyon...
(Was I the only one screaming at all the pre-release presentations that they should just solder the damn chip to the IHS, or just drop the IHS altogether on unlocked chips...)


----------



## arbiter (Jul 2, 2015)

Bytales said:


> Im getting the Fury x because im supporting Free Sync (Clearly Nvidia could have made G-Sync without a G-Sync Module as there are laptopts now on the market with g-sync screens withouth the expensive g-sync module, yet they choose to sell us an expensive extra thats not really needed, and continue not to support a free standard which would cost them nothing - the display port 1.2a specs) after i sold my asus rog Swift, im owning now the 32inch 4k ips from samsung.



G-sync module does a ton of work and IS needed. There is a difference between a desktop monitor and a laptop monitor. Laptop the gpu is connected DIRECTLY to the panel and they can use gpu itself to control VRR on the panel. In a desktop monitor its not, its connected by a small little cable and the monitor has its own hardware inside. G-sync module handles all VRR of the panel and does the overdriving itself on the panel. The faster you drive the panel the most likely hood you will get Ghosting which is an issue freesync has which is non-existent on g-sync panels. (Image: http://www.pcper.com/files/review/2015-06-26/ghost1.jpg ) You can see the ghosting issues on 2 freesync monitors in middle and right where as virtually nothing wrong with g-sync on left. Freesync has problems that need to be fixed. As for expensive module, freesync monitors needed new hardware to do same job that was claimed wouldn't be needed.

Why freesync isn't a free standard, its proprietary software AMD wrote up that uses the adaptive sync standard. Its far from perfect which AMD left on the monitor makers to fix themselves were as nvidia with g-sync did all the leg work to make it right since the start with no ghosting and list of panels that THEY tested to work well doing VRR.

The best way monitor makers found to fix ghosting, or at least minimize it, is to lock VRR down: you buy a 144 Hz panel, but enable FreeSync and you are limited to 90 Hz. Really? That kinda makes it a bad buy, not getting to use what you paid for.
(cue all the flame posts now)



GorbazTheDragon said:


> TL;DR People believed the marketing hype.
> That never happened before, did it??? Oh, wait, Devil's Canyon...


Intel was up front about it being a refresh. AMD said the problem was fixed in retail cards, but it looks like some retail cards have the issue. Look at the 390/X cards: AMD has said they are not re-brands even though there is enough proof that they are. AMD isn't doing themselves many favors by lying to everyone and then, when caught, keeping quiet like a three-year-old. If AMD wants to get on more stable financial ground, they can't keep saying one thing while doing another.

The reasons they gave that the 390/X isn't a re-brand: they increased the memory, they increased the memory clock speed, and they rewrote the power-management microcode. So they claim it's not a re-brand even though it's the same chip......


----------



## haswrong (Jul 2, 2015)

GorbazTheDragon said:


> TL;DR People believed the marketing hype.
> 
> That never happened before, did it??? Oh, wait, Devil's Canyon...
> (Was I the only one screaming at all the pre-release presentations that they should just solder the damn chip to the IHS, or just drop the IHS altogether on unlocked chips...)


You want to overclock a chip with integrated voltage regulators? Not even solder can move that much heat away, unless you intend to freeze it a bit.


----------



## GorbazTheDragon (Jul 2, 2015)

IVB was rubbish temp-wise even without an IVR...


----------



## profoundWHALE (Jul 2, 2015)

arbiter said:


> G-sync module does a ton of work and IS needed.



AFAIK it works both as a double frame buffer (with some DDR memory or whatever handling that) and does a few other things, but it clearly isn't needed, because AMD made the same thing possible without the module. The major difference is that the manufacturers get to have their say as to what happens in the monitor, while with G-Sync it's one way and that's it.

You were also comparing a 144 Hz (TN?) monitor with a 75 Hz IPS panel.



arbiter said:


> There is a difference between a desktop monitor and a laptop monitor.


Both are display panels in different form factors with different goals. Laptops are portable and low power, with the display built right in. It needs to be turned off when closed, and that laptop will have only one interface for the screen. It doesn't have any settings to program, no "is a source plugged in?" sort of check.

Desktops have power to spare and aren't really going anywhere. They could be connected to computers via VGA, DisplayPort, HDMI, or DVI, with checks to see if something is plugged into one of those. You can adjust colours, brightness, and contrast.

Those are the major differences, and I think I only covered the basics, so yes, they are different. What doesn't make sense is bringing this up right after saying that the G-Sync module does a ton of work. A laptop is as direct from GPU to display as you can get, with an integrated framebuffer.



arbiter said:


> Why freesync isn't a free standard, its proprietary software AMD wrote up that uses the adaptive sync standard. Its far from perfect which AMD left on the monitor makers to fix themselves were as nvidia with g-sync did all the leg work to make it right since the start with no ghosting and list of panels that THEY tested to work well doing VRR.



It is a free standard: no licensing fees, and it only requires an AMD GPU (right now) with FreeSync support and a DisplayPort connection with Adaptive-Sync. That doesn't give you any of this frame buffering, but it also doesn't add any extra cost. Adding a framebuffer costs them what, like $50? Maybe $100?



arbiter said:


> Reasons they gave 390/x isn't re-brand is, they increased memory, they increased memory clock speed, and they rewrote the power management microcode. So they claim its not a re-brand even though its the same chip......



This is away from the FreeSync/G-Sync argument, but if you wish to learn a bit more (and/or correct me), then take a look at this: http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

So, back on topic: they doubled the memory, bumped the clock a bit, and did some sort of magic to increase performance while keeping temps lower, like 78 °C rather than 92 °C. I seriously doubt that a little bit of microcode did that, and it certainly wasn't the RAM increase, although that didn't hurt the 4K performance.


----------



## turbogear (Jul 2, 2015)

I bought a Sapphire Fury X one day after they were released, and mine has the chrome Cooler Master badge.
I do not hear any high-pitched whine from the pump.
Actually, I hear some sound from the other pump in my system, an EK-DDC 3.2 PWM, but not from the Fury X pump.

I have an existing water-cooling loop that held a Radeon 290X until now, which has been replaced with the Fury X. Most probably I will buy the EK full-coverage single-slot block for my Fury X and integrate it into my water-cooling loop.
http://www.techpowerup.com/213800/e...9-fury-x-water-block-single-slot-capable.html


----------



## Batou1986 (Jul 2, 2015)

I don't see anything that changed besides the logo.
Also, people need to stay on topic, or mods need to start giving warnings; there are plenty of threads for debating the Fury X's performance, and more than half the comments here have nothing to do with the cooler.


----------



## AsRock (Jul 3, 2015)

Fluffmeister said:


> This is an issue they need to sort out:
> 
> *Retail Fury X coolers still whine, don't include fix*
> 
> http://techreport.com/news/28566/retail-fury-x-coolers-still-whine-dont-include-fix



LOL, they need a good ol'...


----------



## GorbazTheDragon (Jul 3, 2015)

Maybe if the card doesn't sell, they'll get a nice wake-up call.

Too bad we have this great thing called fanboyism...


----------



## arbiter (Jul 3, 2015)

profoundWHALE said:


> You were also comparing a 144Hz (TN?) monitor with an 75Hz IPS panel.


Wow, how far behind in tech are you while reading a tech site? IPS 144 Hz monitors have been a thing for the last 3-4+ months. Try catching up to the present with the rest of us, instead of being at least six months in the past.
http://www.pcper.com/reviews/Displays/ASUS-MG279Q-27-1440P-144Hz-IPS-35-90Hz-FreeSync-Monitor-Review


profoundWHALE said:


> AFAIK it works both as a double frame buffer (with some DDR or whatever handling that) and a few other things, but it clearly isn't needed because AMD made the same thing possible without needing this module. The major difference is that the manufacturers get to have their say as to what happens in the monitor, while with GSYNC, it's one way and that's it.


That frame doubler means the monitor can go below 40 Hz without the image tearing up, whereas when FreeSync gets below that, it tears to heck. The other function the module handles is overdrive of the monitor; if you look through that link, you can see that when overdrive is done wrong, it creates ghosting issues.

The problem is that the voltage needed to change a pixel on a panel at 60 Hz is not the same as what's needed at 70 Hz, 80 Hz, 90 Hz, etc. The hardware in the monitor has to be able to figure that out, which the G-Sync module does almost perfectly, whereas the scalers used for FreeSync are a bit behind.
Some people have tried to blame the LCD panels as being at fault, but G-Sync uses the same panels without a problem. Nvidia does, however, limit the panels you can use to a list of ones they tested G-Sync on and that work best with it. So they can't just use any off-the-shelf cheapest panel they can get.
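The sub-40 Hz frame-doubling behaviour described above (often called low-framerate compensation) can be sketched roughly like this. This is a minimal illustration with made-up panel limits and function names, not any vendor's actual module or scaler logic:

```python
def lfc_refresh(frame_time_ms, vrr_min_hz=40.0, vrr_max_hz=144.0):
    """Low-framerate compensation sketch: when the source frame rate falls
    below the panel's minimum VRR rate, repeat each frame enough times that
    the actual panel refresh rate stays inside the supported range."""
    source_fps = 1000.0 / frame_time_ms
    repeats = 1
    while source_fps * repeats < vrr_min_hz:
        repeats += 1  # show the same frame again instead of refreshing slower
    # never ask the panel for more than its maximum refresh rate
    refresh_hz = min(source_fps * repeats, vrr_max_hz)
    return repeats, refresh_hz

# A 25 FPS source (40 ms frames) is below the 40 Hz floor, so each frame
# is shown twice and the panel refreshes at 50 Hz: no tearing, no slideshow.
print(lfc_refresh(40.0))  # -> (2, 50.0)
```

Without this multiplication, a panel with a 40 Hz floor would have to fall back to fixed-rate behaviour (tearing or stutter) whenever the game dips below 40 FPS.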


profoundWHALE said:


> This is away from the Freesync/GSYNC argument, but if you wish to learn a bit more (and or correct me) then take a look at this
> So, back on topic, they doubled the memory, bumped the clock a bit, and did some sort of magic to increase performance while keeping temps lower, like 78 C rather than 92 C. I seriously doubt that a little bit of microcode did that, and it certainly wasn't the RAM increase, although, that didn't hurt the 4K performance


If you could read, you'd see I was quoting and responding to someone else there, not you. Is that temp drop really because they improved it, OR because of a third-party cooler on the card? That is a rhetorical question, if you missed it, because we already know the answer. AMD didn't do anything else to the card; it's been proven with the card info strings, and in fact some of the GPUs on them have dates from 2014.


----------



## GorbazTheDragon (Jul 3, 2015)

arbiter said:


> Wow, how far behind in tech are you while reading a tech site? IPS 144 Hz monitors have been a thing for the last 3-4+ months. Try catching up to the present with the rest of us, instead of being at least six months in the past.
> http://www.pcper.com/reviews/Displays/ASUS-MG279Q-27-1440P-144Hz-IPS-35-90Hz-FreeSync-Monitor-Review
> .....


Not convinced by either Freesync or Gsync, both are still very immature technologies. It's really pointless to argue about it at this point.

Sure, the 390X is a rebadge in the sense that it is essentially the same chip. However, internally it has probably gone through various revisions since the 290X, now enough to have to differentiate between the cards from a sales perspective. Remember what happened when Gigabyte started shaving stuff like VRM phases, heatsinks, capacitors, and other things that don't make it to the spec sheet off some of their boards, the only change being from revision 1.0 to revision 2.0, with no way to distinguish between them when you buy the board... The same really goes for GPUs. From what I've seen the PCB is the same on the 390X, but if they have made major revisions to microcode, and possibly some other small physical alterations that require different driver/BIOS support, it is easier for them to keep the cards completely separate.

The same could be said for the 680 vs the 770. The 770 has some major differences, firstly in reference PCB design, and the practical performance of the card is also somewhat better than the 680's. Yes, both use a fully fledged GK104, but the 770 is far more mature in every way. (It performs slightly better while using less power, what a surprise...)


----------



## Basard (Jul 3, 2015)

GorbazTheDragon said:


> Not convinced by either Freesync or Gsync, both are still very immature technologies. It's really pointless to argue about it at this point.



I'm happy with just plain old V-sync. But then I'm an old prick, and I'm still somewhat happy with my 1680x1050...


----------



## jigar2speed (Jul 3, 2015)

Fluffmeister said:


> They said retail wouldn't be affected, but they clearly are.


That means the issue was identified but Cooler Master was late to resolve it; still, is an 8-day reaction time bad?? How are those GTX 970s with coil whine treating their customers; has that been rectified yet??? Stop whining like a kid; when multiple partners are creating a product, things like this can happen, and what's good is that it has been rectified.


----------



## Fluffmeister (Jul 3, 2015)

jigar2speed said:


> That means the issue was identified but Cooler Master was late to resolve it; still, is an 8-day reaction time bad?? How are those GTX 970s with coil whine treating their customers; has that been rectified yet??? Stop whining like a kid; when multiple partners are creating a product, things like this can happen, and what's good is that it has been rectified.



Sorry, I'd hate for you to think I'm beating up on AMD; it's bad enough it didn't live up to the hype in the first place. I'm sure they will sort it; they bloody ought to.

Don't know about other 970 owners, but then the 970 is a lot cheaper card; mine works great, you'll be pleased to know.


----------



## jigar2speed (Jul 3, 2015)

Fluffmeister said:


> Sorry, I'd hate for you to think I'm beating up on AMD; it's bad enough it didn't live up to the hype in the first place. I'm sure they will sort it; they bloody ought to.
> 
> Don't know about other 970 owners, but then the 970 is a lot cheaper card; mine works great, you'll be pleased to know.



You are happy with your GTX 970 even though the reference card did have a coil-whine issue, plus the 3.5 GB RAM bonus. Anyway, the point I am trying to make here is that any company can have issues related to their product, but you should give them a thumbs up if they are able to address the issue promptly.


----------



## Fluffmeister (Jul 3, 2015)

jigar2speed said:


> You are happy with your GTX 970 even though the reference card did have a coil-whine issue, plus the 3.5 GB RAM bonus. Anyway, the point I am trying to make here is that any company can have issues related to their product, but you should give them a thumbs up if they are able to address the issue promptly.



And the point of my link is that it still hasn't been addressed, even though they said it would be a non-issue on retail cards; that's all.

Banging on about the 970 won't change that.


----------



## Sir Alex Ice (Jul 3, 2015)

I expect the fan version of the R9 Fury will be well under the $600 mark, VAT excluded where applicable. That would make it interesting again.


----------



## husseinHr (Jul 3, 2015)

TheGuruStud said:


> It is bad. It's terrible.
> 
> They had it. They got the power consumption down to a decent level, crammed a shit load of shaders in there, crazy fast ram....and then bottlenecked the whole goddamn card. Screw them. They're running out of things screw up.
> 
> ...


Agreed. Kill them with fire.


----------



## jigar2speed (Jul 3, 2015)

Fluffmeister said:


> And the point of my link is that it still hasn't been addressed, even though they said it would be a non issue on retail cards, that's all.



Why don't you understand? It's clear that the vendor (Cooler Master) was informed by AMD about the issue but couldn't rectify it in time, and hence the initial batch had the issue.

Now we have news that the problem has been taken care of - so what's the fuss about???

Also, existing Fury X owners who have the whining issue can RMA the card, according to Guru3D - http://www.guru3d.com/news-story/amd-fixes-r9-fury-x-whining-noises.html



Fluffmeister said:


> Banging on about the 970 won't change that.



It was a simple comparison showing how these two companies reacted differently to the same kind of problem that you are crying foul about.


----------



## Fluffmeister (Jul 3, 2015)

jigar2speed said:


> Is there a problem with understanding the simple news here? It's clear that the vendor (Cooler Master) was informed by AMD about the issue but couldn't rectify it in time, and the initial batch did have the issue.
> 
> Now we have news that the problem has been taken care of - so what's the fuss about???
> 
> Also, existing Fury X owners who have the whining issue can RMA the card, according to Guru3D - http://www.guru3d.com/news-story/amd-fixes-r9-fury-x-whining-noises.html



It's very simple indeed: they clearly said retail wouldn't be affected, and it is.

Good to see they are getting prompt RMA's, as they should.


----------



## Ikaruga (Jul 3, 2015)

jigar2speed said:


> It was simple comparison showing how this 2 companies acted differently to same problem that you are crying foul about.


*Offtopic*, but coil whine is not really something you can avoid unless you assemble the whole system from parts tested with each other.
Modern VRMs will easily provide 30-50 amps or more, and that current creates a "significant" magnetic field, which will "move/nudge" things around no matter what.
The juice from the PSU will also carry some noise, and the VRM on the graphics card won't ask for the same amount of juice all the time either (in fact, it changes thousands of times every second)... so these pulsations and noises will give you some interference even with the best parts available, and sometimes that interference falls into the roughly 20 Hz-20 kHz range that you might hear.
Modern digital VRMs with many phases also switch off most of those phases when jumping into and back out of power-saving states, which makes things worse... etc.

The bottom line is: there may be no way to eliminate every coil-whine possibility if the graphics-card maker only produces the card and not the other parts of the PC, like the PSU or the motherboard... so it's really not like a pump whine, which can be tested out; it's as simple as that.
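The point above can be illustrated with a back-of-the-envelope sketch (all numbers below are made up for illustration, not measurements): the VRM's switching frequency itself is usually ultrasonic, but the load-change and phase-shedding rates riding on top of it can land squarely in the audible band.

```python
AUDIBLE_LO_HZ = 20
AUDIBLE_HI_HZ = 20_000

def audible(frequencies_hz):
    """Keep only the spectral components a human could plausibly hear."""
    return [f for f in frequencies_hz if AUDIBLE_LO_HZ <= f <= AUDIBLE_HI_HZ]

# Illustrative numbers only: a VRM might switch at ~300 kHz (inaudible),
# but the load it feeds changes at frame-rate and power-state rates.
components_hz = [300_000, 144, 8_000, 60, 45_000]
print(audible(components_hz))  # -> [144, 8000, 60]
```

So even a "quiet" 300 kHz design can whine audibly once the load modulates it at tens of Hz to a few kHz, which is why pairing with a specific PSU and motherboard changes what you hear.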


----------



## GorbazTheDragon (Jul 3, 2015)

Ikaruga said:


> *Offtopic*, but coil whine is not really something you can avoid unless you assemble the whole system from parts tested with each other.
> Modern VRMs will easily provide 30-50 amps or more, and that current creates a "significant" magnetic field, which will "move/nudge" things around no matter what.....


That is true. It explains why some samples of the same design suffer from coil whine while others don't.

I would expect that this is due to normal variations in the strength/rigidity of physical connections on the device. There are plenty of things you can't control when soldering parts to the board; solder may creep up leads, but by variable amounts, making some joints more rigid than others. Variations in the parts themselves can contribute similarly.

I mentioned this in another thread: I have a 760 ACX that exhibits coil whine under very specific circumstances; when I tried with a friend's card, I could not replicate it. I only get coil whine when running at very specific clock-voltage combinations within a certain temperature range, and only when I run FurMark. Bad VRM design? I don't think so.


----------



## WaroDaBeast (Jul 3, 2015)

Ikaruga said:


> *Offtopic*, but coil whine is not really something you can avoid unless you assemble the whole system from parts tested with each other.
> Modern VRMs will easily provide 30-50 amps or more, and that current creates a "significant" magnetic field, which will "move/nudge" things around no matter what.
> The juice from the PSU will also carry some noise, and the VRM on the graphics card won't ask for the same amount of juice all the time either (in fact, it changes thousands of times every second)... so these pulsations and noises will give you some interference even with the best parts available, and sometimes that interference falls into the roughly 20 Hz-20 kHz range that you might hear.
> Modern digital VRMs with many phases also switch off most of those phases when jumping into and back out of power-saving states, which makes things worse... etc.
> ...



Not off-topic at all to me, and a very interesting read to boot.


----------



## Ikaruga (Jul 3, 2015)

GorbazTheDragon said:


> That is true. It explains why some samples of the same design suffer from coil whine while others don't.
> 
> I would expect that this is due to normal variations in the strength/rigidity of physical connections on the device. There are plenty of things you can't control when soldering parts to the board; solder may creep up leads, but by variable amounts, making some joints more rigid than others. Variations in the parts themselves can contribute similarly.
> 
> I mentioned this in another thread: I have a 760 ACX that exhibits coil whine under very specific circumstances; when I tried with a friend's card, I could not replicate it. I only get coil whine when running at very specific clock-voltage combinations within a certain temperature range, and only when I run FurMark. Bad VRM design? I don't think so.


Yes, that's why they use super-alloy chokes, for example, because those are solid and can't vibrate. But there are lots of small individual parts on a board/card, and temperature, quality of assembly, materials used, etc. are all factors here too.


----------



## SetsunaFZero (Jul 3, 2015)

In the meantime the Fury non-X was listed; the price is kinda disappointing at €624.


----------



## Sir Alex Ice (Jul 3, 2015)

Judging by the old CM logo on the initial pump, I would guess that AMD bought some old stock from CM and repurposed it.
The air-cooled version will be at least $100 less expensive than the current asking price of the GTX 980 Ti, maybe even $150.


----------



## arbiter (Jul 3, 2015)

Sir Alex Ice said:


> Judging by the old CM logo on the initial pump, I would guess that AMD bought some old stock from CM and repurposed it.
> The air-cooled version will be at least $100 less expensive than the current asking price of the GTX 980 Ti, maybe even $150.



It was reported it would be $550. But some listings I have seen say it will be a slightly cut-down version. I wonder what kind of heat it will put out, and what that might do to the HBM chips if they're subjected to temperatures in the 90 °C range. Nothing is set in stone until the card comes out and gets some months of heat cycles on the chips.


----------



## midnightoil (Jul 4, 2015)

MxPhenom 216 said:


> I could say the same thing to you in reply to 90% of your posts. Oh the irony.
> 
> I pointed it out the day reviews flooded the internet: the ROP count does not make any sense for a card with ~4000 SPs. It has a similar "bottlenecking" issue to the one the 7970/280X has.



#1 - Can a mod please purge all the troll posts? Particularly the half dozen or so that start the thread.

#2 - If ROPs are such a huge bottleneck as you claim, then why is scaling so ridiculously good in CF? It's better than on any previous AMD card, and miles ahead of any SLI setup. Even in games it loses badly in (even at 4K) in single-card config, it shits on the TX in 2x FX vs 2x TX. This is with very early drivers. 2x FX @ 4K Ultra in BF4 gets ~120 FPS, and this is a game where the drivers seem to be really bad for the FX. I expect new drivers to extend these gains (in single & CF) immensely... and DX12 will, in my opinion, be a complete whitewash for GCN, and Fiji particularly, vs Maxwell or Kepler.

If you're going for more than one high-end card, it looks like Fiji is the only thing you should consider right now. Also, if you intend to buy a VR headset, GCN (& more particularly Fiji / 380 / 285) is the only game in town... the hardware is far more suitable than Maxwell (v1 or v2), and LiquidVR is miles ahead of GWVR.


----------



## MxPhenom 216 (Jul 4, 2015)

midnightoil said:


> #1 - Can a mod please purge all the troll posts? Particularly the half dozen or so that start the thread.
> 
> #2 - If ROPs are such a huge bottleneck as you claim, then why is scaling so ridiculously good in CF? It's better than on any previous AMD card, and miles ahead of any SLI setup. Even in games it loses badly in (even at 4K) in single-card config, it shits on the TX in 2x FX vs 2x TX. This is with very early drivers. 2x FX @ 4K Ultra in BF4 gets ~120 FPS, and this is a game where the drivers seem to be really bad for the FX. I expect new drivers to extend these gains (in single & CF) immensely... and DX12 will, in my opinion, be a complete whitewash for GCN, and Fiji particularly, vs Maxwell or Kepler.
> 
> If you're going for more than one high-end card, it looks like Fiji is the only thing you should consider right now. Also, if you intend to buy a VR headset, GCN (& more particularly Fiji / 380 / 285) is the only game in town... the hardware is far more suitable than Maxwell (v1 or v2), and LiquidVR is miles ahead of GWVR.



Thanks, but I'd still get a 980 Ti. I'm not the biggest fan of multi-GPU, and even when it scales well, the frame times are still lousy.

Scaling is good on the 7970/280X, but those cards are a bit bottlenecked by their 32 ROPs.


----------



## midnightoil (Jul 4, 2015)

MxPhenom 216 said:


> Thanks, but I'd still get a 980 Ti. I'm not the biggest fan of multi-GPU, and even when it scales well, the frame times are still lousy.
> 
> Scaling is good on the 7970/280X, but those cards are a bit bottlenecked by their 32 ROPs.


Frame times in CF are often half those of SLI, or even less. That's why NVIDIA withdrew permission for sites to do comparative FCAT tests of SLI vs CF (for the sites they sent the FCAT equipment/software to). The last major test was early this year by SweClockers; since then, nothing - and said test showed a gigantic lead in frame times for CF over SLI.

Forgive me if I'm skeptical about that ROP-bottlenecking claim... because you're claiming the same thing about Fiji, and it's total bollocks.


----------



## MxPhenom 216 (Jul 4, 2015)

midnightoil said:


> Frame times in CF are often half those of SLI, or even less. That's why NVIDIA withdrew permission for sites to do comparative FCAT tests of SLI vs CF (for the sites they sent the FCAT equipment/software to). The last major test was early this year by SweClockers; since then, nothing - and said test showed a gigantic lead in frame times for CF over SLI.
> 
> Forgive me if I'm skeptical about that ROP-bottlenecking claim... because you're claiming the same thing about Fiji, and it's total bollocks.



Dude, single-GPU frame-time issues with AMD cards are there, and that's single GPU. Someone posted the graph in the Fury X review thread. Directly compared to the 980 Ti / Titan X, the Nvidia cards look to have much better single-GPU frame times. And just let me remind you: until now, CrossFire frame times have been total rubbish. I still hear about cases where stuttering is pretty bad with CF.

Even the review at PCPer brings up the low ROP count that could be hindering the performance of the card. It's the same ROP count as the 290/290X from about two years ago.


----------



## Haytch (Jul 5, 2015)

I think the Fury X is a fantastic card. I love HBM.

I have decided to give it a miss this time 'round and see what AMD have in store for the next series.
I am looking forward to 8+ GB of HBM with further refined technology. I love the concept of the GPU and memory being closer together, and obviously this is something Nvidia will surely need to adopt to stay competitive in the future.

Sure, the Fury X might not beat the current Nvidia range, or maybe it does. The point is, if Nvidia don't start improving their architecture, they will soon fall behind with no real hope of ever catching up. The Fury X might as well be considered the turn-around point for AMD.
As for the coil whining... I seem to hear more whining coming from people than from the actual GPU package. You can always improve GPU whine, but you can't improve people whine!

For those who have an issue with CrossFire performance, let me assure you, CrossFire has no issues except with the end user. Sure, my 3x 290Xs don't scale 100%, but it's close enough for me not to complain. In fact, 2x 290Xs scale better than my 2x Titans, from 1080p all the way to 4K. Only past 4K are my Titans better, and I don't even run past 4K anymore; I'm sure 99% of you don't either.


----------



## arbiter (Jul 5, 2015)

Haytch said:


> Sure, the Fury X might not beat the current Nvidia range, or maybe it does. The point is, if Nvidia don't start improving their architecture, they will soon fall behind with no real hope of ever catching up. The Fury X might as well be considered the turn-around point for AMD.


Fury is only two cards overall. You say it's a turn-around, but it's 3-4 generations away from that being possible, with how AMD loves to re-brand everything lately. Pascal, which will be Nvidia's next GPU, will have HBM2. Maxwell is plenty good, as it was able to keep up with, and in most cases beat, AMD's super-hyped card. So I wouldn't say Nvidia needs to improve; it's AMD that needs to improve a lot more yet.


----------



## Haytch (Jul 6, 2015)

arbiter said:


> Fury is only two cards overall. You say it's a turn-around, but it's 3-4 generations away from that being possible, with how AMD loves to re-brand everything lately. Pascal, which will be Nvidia's next GPU, will have HBM2. Maxwell is plenty good, as it was able to keep up with, and in most cases beat, AMD's super-hyped card. So I wouldn't say Nvidia needs to improve; it's AMD that needs to improve a lot more yet.


Thank you, I was not aware that Nvidia were going to use HBM2. I thought that AMD had some licensing rights to it, something like that. That's why I assumed that Nvidia needed to get their act together.
HBM1 is all good, but just a minor stepping stone in my books; the real fun won't start until HBM2/3+.


----------



## arbiter (Jul 6, 2015)

Haytch said:


> Thank you, I was not aware that Nvidia were going to use HBM2. I thought that AMD had some licensing rights to it, something like that. That's why I assumed that Nvidia needed to get their act together.
> HBM1 is all good, but just a minor stepping stone in my books; the real fun won't start until HBM2/3+.


http://www.techpowerup.com/213254/nvidia-tapes-out-pascal-based-gp100-silicon.html
It's only a prototype, but you can see the HBM lead AMD has won't last long. I expect HBM will be for the highest-end GPUs only, since mid-range really doesn't need it yet.


----------



## Freebird (Jul 6, 2015)

TheGuruStud said:


> I've been buying Radeons since the 4890 b/c they have been a better buy for me all the way up to the 290x.
> 
> I bought a 980 Ti this week. Everyone knows I hate Nvidia. FURY X Fing SUCKS!
> 
> If they can get their shit together, then I will GLADLY buy their new card on launch day and disown the GTX.



As with everything AMD, a little waiting is required...

I believe we will find that the Fury X is a fine card once its OCing abilities are "found", i.e. over-volting the GPU & HBM. It should have some decent headroom once this is accomplished... In addition, I speculate that there may be more performance to squeeze out of the Fury/X when DX12 arrives in about a month... and then another boost when "true" DX12 games appear that leverage asynchronous shaders...

You stated in another post that AMD needs more ROPs to take advantage of those 4096 shaders... maybe, just maybe, AMD built the Fury/X this way because DX12 can use it to its full potential... MAYBE AMD built the Fury/X for the FUTURE (DX12, which is only a month away). So would you rather have a card that excels in the near future, or one that only performs well in the past, pre-DX12?

Which leads me to another reason AMD PUSHED out Mantle when they did... if they hadn't pushed it out with the R9 290X, DX12 might still be a year or two away... and still hampering AMD's GPU design decisions. In my OWN opinion, DX9-11 has been hamstringing GPU performance for WAY TOO LONG, due to not fully utilizing the CPU cores available.

Now, if we could just get someone to develop a game or two that uses more than 4 GB of SYSTEM memory, I would be ECSTATIC... I'm tired of Fallout NV crapping out to the desktop after running out of memory with 20+ GB free, and the 4GB patch isn't much help. (Yeah, I know it's about 7 years old... I'm looking forward to Fallout 4 and the graphics detail in Star Wars Battlefront; YEA!!)


----------



## arbiter (Jul 6, 2015)

Freebird said:


> Which leads me to another reason AMD PUSHED out Mantle when they did... if they hadn't pushed it out with the R9 290X, DX12 might still be a year or two away... and still hampering AMD's GPU design decisions. In my OWN opinion, DX9-11 has been hamstringing GPU performance for WAY TOO LONG, due to not fully utilizing the CPU cores available.


Only thing tyhat was hampering AMD's design decisions was AMD.


----------



## Freebird (Jul 6, 2015)

Apparently spell-checker doesn't "hamper" your misspellings... but thanks for your thoughts.


----------



## AsRock (Jul 6, 2015)

arbiter said:


> Only thing tyhat was hampering AMD's design decisions was AMD.



Only thing that was hampering AMD's design decisions was money / shrink.


----------

