Thursday, July 2nd 2015

AMD Revises Pump-block Design for Radeon R9 Fury X

AMD appears to have reacted swiftly to feedback from reviewers and owners of initial batches of its Radeon R9 Fury X over a noisy pump-block, and revised its design. According to owners, the revised pump-block lacks the "high-pitched whine" that users were reporting. At this point there are no definitive visual cues for identifying a card with the new block; however, a user with a revised card (or at least one that lacks the whine) pointed out a two-color chrome Cooler Master (OEM) badge on the pump-block, compared to the multi-color sticker on pump-blocks from the initial batches. You can remove the front-plate covering the card without voiding the warranty.
Source: AnandTech Forums

87 Comments on AMD Revises Pump-block Design for Radeon R9 Fury X

#51
haswrong
GorbazTheDragonTL;DR People believed the marketing hype.

That never happened before, did it??? Oh, wait, Devil's Canyon...
(Was I the only one screaming at all the pre-release presentations that they should just solder the damn chip to the IHS, or drop the IHS altogether on unlocked chips...)
You want to overclock a chip with integrated voltage regulators? Not even solder can move that much heat away, unless you intend to freeze it a bit.
Posted on Reply
#53
profoundWHALE
arbiterG-sync module does a ton of work and IS needed.
AFAIK it works both as a double frame buffer (with some DDR or whatever handling that) and a few other things, but it clearly isn't needed, because AMD made the same thing possible without it. The major difference is that the manufacturers get to have their say as to what happens in the monitor, while with GSYNC, it's one way and that's it.

You were also comparing a 144 Hz (TN?) monitor with a 75 Hz IPS panel.
arbiterThere is a difference between a desktop monitor and a laptop monitor.
Both are display panels in different form factors with different goals. Laptops are portable and low power, with the display built right in. It needs to be turned off when closed, and that laptop will have only one interface for the screen. It doesn't have any settings to program, no "is a source plugged in?" sort of check.

Desktops have power to spare and aren't going anywhere, really. They can be connected to computers over VGA, DisplayPort, HDMI, or DVI, and have to check whether something is plugged into one of those. You can adjust colours, brightness, and contrast.

Those are the major differences and I think I only covered the basics, so yes, those are different. What doesn't make sense is bringing this up just after saying that the Gsync module does a ton of work. It's as direct from GPU to display as you can get with an integrated framebuffer.
arbiterWhy freesync isn't a free standard: it's proprietary software AMD wrote that uses the adaptive sync standard. It's far from perfect, which AMD left for the monitor makers to fix themselves, whereas nvidia with g-sync did all the legwork to make it right from the start, with no ghosting and a list of panels that THEY tested to work well doing VRR.
It is a free standard: no licensing fees, and it only requires an AMD GPU (right now) with FreeSync support and DisplayPort with Adaptive-Sync. That doesn't give you any of this frame-buffering, but it also doesn't add any extra cost. Adding a framebuffer costs them what, like $50? Maybe $100?
arbiterReasons they gave 390/x isn't re-brand is, they increased memory, they increased memory clock speed, and they rewrote the power management microcode. So they claim its not a re-brand even though its the same chip......
This is aside from the FreeSync/G-Sync argument, but if you wish to learn a bit more (and/or correct me), take a look at this: www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

So, back on topic: they doubled the memory, bumped the clock a bit, and did some sort of magic to increase performance while keeping temps lower, like 78 °C rather than 92 °C. I seriously doubt that a little bit of microcode did that, and it certainly wasn't the RAM increase, although that didn't hurt the 4K performance ;)
Posted on Reply
#54
turbogear
I bought a Sapphire Fury X one day after they were released, and mine has the chrome Cooler Master badge.
I do not hear any high-pitched whine from the pump.
I do actually hear some noise from the other pump in my system, an EK-DDC 3.2 PWM, but not from the Fury X pump.

I have had an existing water-cooling loop with a Radeon 290X until now, which the Fury X has replaced. Most probably I will buy the EK full-coverage single-slot block for my Fury X and integrate it into my loop.
www.techpowerup.com/213800/ek-water-blocks-ready-with-its-radeon-r9-fury-x-water-block-single-slot-capable.html
Posted on Reply
#55
Batou1986
I don't see anything that changed besides the logo.
Also, people need to stay on topic or the mods need to start giving warnings. There are plenty of threads for debating the Fury X's performance; more than half the comments here have nothing to do with the cooler.
Posted on Reply
#57
GorbazTheDragon
Maybe if the card doesn't sell they'd get a nice wake-up call.

Too bad we have this great thing called fanboyism...
Posted on Reply
#58
arbiter
profoundWHALEYou were also comparing a 144 Hz (TN?) monitor with a 75 Hz IPS panel.
Wow, how far behind in tech are you for someone reading a tech site? IPS 144 Hz monitors have been a thing for the last 3-4+ months. Try catching up to the present with the rest of us instead of sitting at least six months in the past.
www.pcper.com/reviews/Displays/ASUS-MG279Q-27-1440P-144Hz-IPS-35-90Hz-FreeSync-Monitor-Review
profoundWHALEAFAIK it works both as a double frame buffer (with some DDR or whatever handling that) and a few other things, but it clearly isn't needed, because AMD made the same thing possible without it. The major difference is that the manufacturers get to have their say as to what happens in the monitor, while with GSYNC, it's one way and that's it.
That frame doubler means the monitor can go below 40 Hz without the image tearing up, whereas FreeSync below that point tears to heck. The other function the module handles is the monitor's overdrive; if you look through that link you can see that when overdrive is done wrong it creates ghosting issues.
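A minimal sketch of that frame-repeat idea, assuming invented names and panel limits; this is illustrative, not NVIDIA's actual algorithm:

```python
# Hypothetical sketch of the "frame doubler" (low-framerate compensation)
# described above. Panel limits are assumptions for illustration.

PANEL_MIN_HZ = 40    # assumed minimum refresh the panel can hold
PANEL_MAX_HZ = 144   # assumed maximum refresh of the panel

def refresh_plan(fps: float) -> tuple[int, float]:
    """Return (repeats per source frame, effective panel refresh in Hz)."""
    if fps >= PANEL_MIN_HZ:
        return 1, fps                    # in range: scan once per frame
    # Below the panel minimum: repeat each frame enough times to lift
    # the scan rate back into the panel's supported window.
    repeats = int(PANEL_MIN_HZ // fps) + 1
    if fps * repeats > PANEL_MAX_HZ:     # clamp if multiplication overshoots
        repeats -= 1
    return repeats, fps * repeats

for fps in (144.0, 60.0, 35.0, 20.0):
    n, hz = refresh_plan(fps)
    print(f"{fps:>5.1f} fps -> scan each frame {n}x, panel at {hz:.0f} Hz")
```

A monitor without such logic simply falls back to fixed behaviour below its minimum refresh, which is the tearing arbiter is pointing at.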

The problem is that the voltage needed to change a pixel on a panel at 60 Hz is not the same as the voltage needed at 70 Hz, 80 Hz, 90 Hz, etc. The hardware in the monitor has to figure that out, which the G-Sync module does almost perfectly, whereas the scalers used for FreeSync are a bit behind.
Some people have tried to blame the LCD panels themselves, but G-Sync uses the same panels without a problem. Nvidia does, however, limit the panels you can use to a list of ones they tested G-Sync on and found to work best, so makers can't just use the cheapest off-the-shelf panel they can get.
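A toy model of that refresh-dependent overdrive; the gain values and calibration points are invented for illustration, not real panel data:

```python
# Toy model of refresh-rate-dependent overdrive. All numbers are
# placeholders; real panels are calibrated per-model.

OVERDRIVE_LUT = {   # refresh rate (Hz) -> overdrive gain
    40: 0.40,
    60: 0.55,
    90: 0.75,
    120: 0.90,
    144: 1.00,
}

def overdrive_gain(refresh_hz: float) -> float:
    """Interpolate the drive gain between calibrated refresh points."""
    points = sorted(OVERDRIVE_LUT.items())
    if refresh_hz <= points[0][0]:
        return points[0][1]
    if refresh_hz >= points[-1][0]:
        return points[-1][1]
    for (lo_hz, lo_g), (hi_hz, hi_g) in zip(points, points[1:]):
        if lo_hz <= refresh_hz <= hi_hz:
            t = (refresh_hz - lo_hz) / (hi_hz - lo_hz)
            return lo_g + t * (hi_g - lo_g)

# With VRR the refresh rate changes every frame, so the gain must too;
# a scaler applying one fixed gain will over- or under-drive pixels
# (inverse ghosting or smearing) whenever the rate moves off that point.
print(overdrive_gain(75.0))  # ~0.65, between the 60 Hz and 90 Hz entries
```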
profoundWHALEThis is aside from the FreeSync/G-Sync argument, but if you wish to learn a bit more (and/or correct me), take a look at this
So, back on topic: they doubled the memory, bumped the clock a bit, and did some sort of magic to increase performance while keeping temps lower, like 78 °C rather than 92 °C. I seriously doubt that a little bit of microcode did that, and it certainly wasn't the RAM increase, although that didn't hurt the 4K performance
If you could read, you'd see I was quoting and responding to someone else there, not you. Is that temp drop really because they improved the chip, or because of the third-party cooler on the card? That's a rhetorical question, in case you missed it, because we already know the answer. AMD didn't do anything else to the card; that's been proven with card info strings, and in fact some of the GPUs on them carry 2014 date codes.
Posted on Reply
#59
GorbazTheDragon
arbiterWow, how far behind in tech are you for someone reading a tech site? IPS 144 Hz monitors have been a thing for the last 3-4+ months. Try catching up to the present with the rest of us instead of sitting at least six months in the past.
www.pcper.com/reviews/Displays/ASUS-MG279Q-27-1440P-144Hz-IPS-35-90Hz-FreeSync-Monitor-Review
.....
Not convinced by either FreeSync or G-Sync; both are still very immature technologies. It's really pointless to argue about it at this point.

Sure, the 390X is a rebadge in the sense that it is essentially the same chip. However, internally it has probably gone through various revisions since the 290X, now enough that the cards have to be differentiated from a sales perspective. Remember what happened when Gigabyte started shaving things like VRM phases, heatsinks, capacitors, and other stuff that doesn't make it to the spec sheet off some of their boards, the only change being from revision 1.0 to revision 2.0, with no way to distinguish between them when you buy the board... The same really applies to the GPUs. From what I've seen the PCB is the same on the 390X, but if they have made major revisions to the microcode, and possibly some other small physical alterations that require different driver/BIOS support, it is easier for them to keep the cards completely separate.

The same could be said for the 680 vs. the 770. The 770 has some major differences, firstly in the reference PCB design, and the practical performance of the card is also somewhat better than the 680's. Yes, both use a fully fledged GK104, but the 770 is far more mature in every way. (It performs slightly better while using less power, what a surprise...)
Posted on Reply
#60
Basard
GorbazTheDragonNot convinced by either FreeSync or G-Sync; both are still very immature technologies. It's really pointless to argue about it at this point.
I'm happy with just plain old v-sync. :) But then I'm an old prick and I'm still somewhat happy with my 1680x1050...
Posted on Reply
#61
jigar2speed
FluffmeisterThey said retail wouldn't be affected, but they clearly are.
That means the issue was identified but Cooler Master was late to resolve it; still, is an eight-day reaction bad?? How are those GTX 970s with coil whine treating their customers, has that been rectified yet??? Stop whining like a kid; when multiple partners are creating a product, things like this can happen, and what's good is that it has been rectified.
Posted on Reply
#62
Fluffmeister
jigar2speedThat means the issue was identified but Cooler Master was late to resolve it; still, is an eight-day reaction bad?? How are those GTX 970s with coil whine treating their customers, has that been rectified yet??? Stop whining like a kid; when multiple partners are creating a product, things like this can happen, and what's good is that it has been rectified.
Sorry, I'd hate for you to think I'm beating up on AMD; it's bad enough it didn't live up to the hype in the first place. I'm sure they will sort it; they bloody ought to.

Don't know about other 970 owners, but then the 970 is a lot cheaper card; mine works great, you'll be pleased to know.
Posted on Reply
#63
jigar2speed
FluffmeisterSorry, I'd hate for you to think I'm beating up on AMD; it's bad enough it didn't live up to the hype in the first place. I'm sure they will sort it; they bloody ought to.

Don't know about other 970 owners, but then the 970 is a lot cheaper card; mine works great, you'll be pleased to know.
You are happy with your GTX 970 even though the reference card did have a coil-whine issue, plus the 3.5 GB RAM bonus. Anyway, the point I am trying to make here is that any company can have issues with its product, but you should give them a thumbs-up if they address the issue promptly.
Posted on Reply
#64
Fluffmeister
jigar2speedYou are happy with your GTX 970 even though the reference card did have a coil-whine issue, plus the 3.5 GB RAM bonus. Anyway, the point I am trying to make here is that any company can have issues with its product, but you should give them a thumbs-up if they address the issue promptly.
And the point of my link is that it still hasn't been addressed, even though they said it would be a non-issue on retail cards. That's all.

Banging on about the 970 won't change that.
Posted on Reply
#65
Sir Alex Ice
I expect the fan version of the R9 Fury will be well under the $600 mark, VAT excluded where applicable. That would make it interesting again.
Posted on Reply
#66
husseinHr
TheGuruStudIt is bad. It's terrible.

They had it. They got the power consumption down to a decent level, crammed a shitload of shaders in there, crazy-fast RAM... and then bottlenecked the whole goddamn card. Screw them. They're running out of things to screw up.

It is a monumental failure because their profitability relied on it, and now they're going to lose their ass even more.

Fire every goddamn exec and lead engineer that allowed this to happen.
Agreed. Kill them with fire.
Posted on Reply
#67
jigar2speed
FluffmeisterAnd the point of my link is that it still hasn't been addressed, even though they said it would be a non issue on retail cards, that's all.
Why don't you understand? It's clear that the vendor (Cooler Master) was informed by AMD about the issue but couldn't rectify it in time, hence the initial batch had the problem.

Now we have news that the problem has been taken care of, so what's the fuss about???

Also, if existing Fury X owners have a whining issue, Guru3D says they can RMA the card - www.guru3d.com/news-story/amd-fixes-r9-fury-x-whining-noises.html
FluffmeisterBanging on about the 970 won't change that.
It was a simple comparison showing how these two companies acted differently toward the same problem that you are crying foul about.
Posted on Reply
#68
Fluffmeister
jigar2speedIs there a problem why you don't understand the simple news here? It's clear that the vendor (Cooler Master) was informed by AMD about the issue but couldn't rectify it in time, and the initial batch did have the problem.

Now we have news that the problem has been taken care of, so what's the fuss about???

Also, if existing Fury X owners have a whining issue, Guru3D says they can RMA the card - www.guru3d.com/news-story/amd-fixes-r9-fury-x-whining-noises.html
It's very simple indeed: they clearly said retail wouldn't be affected, yet it is.

Good to see they are getting prompt RMAs, as they should.
Posted on Reply
#69
Ikaruga
jigar2speedIt was a simple comparison showing how these two companies acted differently toward the same problem that you are crying foul about.
Off-topic, but coil whine is not really something you can avoid unless you assemble the whole system with all the parts tested together.
Modern VRMs will easily provide 30-50 amps and more, and that creates a "significant" magnetic field, which will "move/nudge" things around no matter what.
The juice from the PSU will also carry some noise, and the VRM on the graphics card won't ask for the same amount of juice all the time either (in fact, it changes thousands of times every second)... so these pulsations and noises will cause some interference even with the best possible parts available, and sometimes some frequencies of that interference fall in the audible 20 Hz-20 kHz range.
Modern digital VRMs with a lot of phases also switch off most of those phases when jumping into and back out of power-saving states, which makes things even worse... etc.

The bottom line is: there may be no way to eliminate every source of coil whine if the graphics-card maker only produces the card and not the other parts in the PC, like the PSU or the motherboard... so it's really not like a pump whine that can be "tested out"; it simply is what it is.
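To put rough numbers on that point, a quick back-of-the-envelope sketch; every figure below is an assumption for illustration, not a measurement from any card discussed here:

```python
# Rough check of which electrical noise sources can fall in the audible
# band. All frequencies below are assumed typical values, not data.

AUDIBLE_LOW_HZ = 20
AUDIBLE_HIGH_HZ = 20_000

sources = {
    "VRM switching frequency": 300_000,     # assumed: typical buck converter
    "per-frame load pulsing": 144,          # assumed: load swings each frame
    "game-load current transients": 5_000,  # assumed: mid-kHz demand changes
}

for name, hz in sources.items():
    audible = AUDIBLE_LOW_HZ <= hz <= AUDIBLE_HIGH_HZ
    print(f"{name}: {hz:,} Hz -> {'audible' if audible else 'ultrasonic'}")
```

The switching frequency itself sits far above hearing, but the load pulsations riding on it land squarely in the audible band, which is why the whine tracks what the card is rendering.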
Posted on Reply
#70
GorbazTheDragon
IkarugaOff-topic, but coil whine is not really something you can avoid unless you assemble the whole system with all the parts tested together.
Modern VRMs will easily provide 30-50 amps and more, and that creates a "significant" magnetic field, which will "move/nudge" things around no matter what.....
That is true, which is why we get samples of the same design where some suffer from coil whine and others don't.

I would expect that this is due to normal variations in the strength/rigidity of physical connections on the device. There are plenty of things you can't control when soldering parts to a board; solder may creep up leads by variable amounts, making some joints more rigid than others. Variations in the parts themselves can contribute similarly.

I mentioned this in another thread: I have a 760 ACX that exhibits coil whine under very specific circumstances, and when I tried with a friend's card I could not replicate it. I only get coil whine when running at very specific clock-voltage combinations within a certain temperature range, and only when I run FurMark. Bad VRM design? I don't think so.
Posted on Reply
#71
WaroDaBeast
IkarugaOff-topic, but coil whine is not really something you can avoid unless you assemble the whole system with all the parts tested together.
Modern VRMs will easily provide 30-50 amps and more, and that creates a "significant" magnetic field, which will "move/nudge" things around no matter what.
The juice from the PSU will also carry some noise, and the VRM on the graphics card won't ask for the same amount of juice all the time either (in fact, it changes thousands of times every second)... so these pulsations and noises will cause some interference even with the best possible parts available, and sometimes some frequencies of that interference fall in the audible 20 Hz-20 kHz range.
Modern digital VRMs with a lot of phases also switch off most of those phases when jumping into and back out of power-saving states, which makes things even worse... etc.

The bottom line is: there may be no way to eliminate every source of coil whine if the graphics-card maker only produces the card and not the other parts in the PC, like the PSU or the motherboard... so it's really not like a pump whine that can be "tested out"; it simply is what it is.
Not off-topic at all to me, and a very interesting read to boot.
Posted on Reply
#72
Ikaruga
GorbazTheDragonThat is true, which is why we get samples of the same design where some suffer from coil whine and others don't.

I would expect that this is due to normal variations in the strength/rigidity of physical connections on the device. There are plenty of things you can't control when soldering parts to a board; solder may creep up leads by variable amounts, making some joints more rigid than others. Variations in the parts themselves can contribute similarly.

I mentioned this in another thread: I have a 760 ACX that exhibits coil whine under very specific circumstances, and when I tried with a friend's card I could not replicate it. I only get coil whine when running at very specific clock-voltage combinations within a certain temperature range, and only when I run FurMark. Bad VRM design? I don't think so.
Yes, that's why they use "super alloy" chokes, for example, because those are solid and can't vibrate; but there are lots of small individual parts on a board/card, and temperature, assembly quality, the materials used, etc. are all factors here too.
Posted on Reply
#73
SetsunaFZero
In the meantime, the Fury non-X has been listed; the price is kind of disappointing at €624.
Posted on Reply
#74
Sir Alex Ice
Judging by the old CM logo on the initial pump, I would guess that AMD bought some old stock from CM and repurposed it.
The air-cooled version will be at least $100 less expensive than the current asking price of the GTX 980 Ti, maybe even $150.
Posted on Reply
#75
arbiter
Sir Alex IceJudging by the old CM logo on the initial pump, I would guess that AMD bought some old stock from CM and repurposed it.
The air-cooled version will be at least $100 less expensive than the current asking price of the GTX 980 Ti, maybe even $150.
It was reported it would be $550. But some listings I have seen say it will be a slightly cut-down version. I wonder what kind of heat it will put out, and what that might do to the HBM chips if they're subjected to temperatures in the 90 °C range. Nothing is set in stone till the card comes out and gets some months of heat cycles on the chips.
Posted on Reply