Thursday, July 2nd 2015
AMD Revises Pump-block Design for Radeon R9 Fury X
AMD seems to have reacted swiftly to feedback from reviewers and owners of initial batches of its Radeon R9 Fury X over a noisy pump-block, and has revised its design. According to owners, the revised pump-block lacks the "high pitched whine" that users were reporting. At this point there are no solid visual cues for identifying a card with the new block; however, a user with a revised card (or at least one that lacks the whine) pointed out a 2-color chrome Cooler Master (OEM) badge on the pump-block, compared to the multi-color sticker on pump-blocks from the initial batches. You can open up the front-plate covering the card to check without voiding any warranties.
Source:
AnandTech Forums
87 Comments on AMD Revises Pump-block Design for Radeon R9 Fury X
You were also comparing a 144 Hz (TN?) monitor with a 75 Hz IPS panel. Both are display panels in different form factors with different goals. Laptops are portable and low-power, with the display built right in. It needs to be turned off when the lid is closed, and that laptop will have only one interface for the screen. It doesn't have any settings to program, no "is a source plugged in?" sort of check.
Desktop monitors have power to spare and aren't going anywhere, really. They could be connected to computers over VGA, DisplayPort, HDMI, or DVI, and have to check whether something is plugged into each of those. You can adjust colours, brightness, and contrast.
Those are the major differences, and I think I only covered the basics, so yes, they are different. What doesn't make sense is bringing this up right after saying that the G-Sync module does a ton of work. It's as direct from GPU to display as you can get with an integrated framebuffer. FreeSync is a free standard with no licensing fees; it only requires an AMD GPU (right now) with FreeSync support and a DisplayPort connection with Adaptive-Sync. It doesn't have any of this framebuffering, but it also doesn't add any extra cost. Adding a framebuffer costs them what, $50? Maybe $100? This is away from the FreeSync/G-Sync argument, but if you wish to learn a bit more (and/or correct me) then take a look at this: www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ
So, back on topic: they doubled the memory, bumped the clock a bit, and did some sort of magic to increase performance while keeping temps lower, like 78 C rather than 92 C. I seriously doubt that a little bit of microcode did that, and it certainly wasn't the RAM increase, although that didn't hurt the 4K performance ;)
I do not hear any high-pitched whine from the pump.
Actually, I hear some sound from the other pump in my system, an EK-DDC 3.2 PWM, but not from the Fury X pump.
I have an existing water cooling system that housed a Radeon 290X until now, which has been replaced with the Fury X. Most probably I will buy the EK full-coverage single-slot block for my Fury X and integrate it into my water cooling loop.
www.techpowerup.com/213800/ek-water-blocks-ready-with-its-radeon-r9-fury-x-water-block-single-slot-capable.html
Also, people need to stay on topic, or the mods need to start giving warnings. There are plenty of threads in which to debate the Fury X's performance; more than half the comments on this one have nothing to do with the cooler.
Too bad we have this great thing called fanboyism...
www.pcper.com/reviews/Displays/ASUS-MG279Q-27-1440P-144Hz-IPS-35-90Hz-FreeSync-Monitor-Review That frame doubler means the monitor can go below 40 Hz without the image tearing up, whereas FreeSync tears to heck once it drops below that. The other function the module handles is the monitor's overdrive; if you look through that link you can see that when overdrive is done wrong it creates ghosting issues.
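Roughly, the idea behind the frame doubler looks like this (just a toy sketch, not the module's actual firmware; the 35-90 Hz window is the MG279Q's from that review):

```python
def scan_rate(fps, panel_min=35.0, panel_max=90.0):
    """Pick a scan-out rate inside the panel's variable-refresh window.

    Below the panel minimum, repeat each frame an integer number of
    times so the scan rate lands back in range -- roughly what the
    G-Sync module's frame doubling does. Assumes panel_max is at
    least twice panel_min, as with the MG279Q's 35-90 Hz window.
    """
    if fps >= panel_min:
        return min(fps, panel_max)   # in range: one scan per frame
    repeats = 2
    while fps * repeats < panel_min:
        repeats += 1
    return fps * repeats             # e.g. 25 fps -> scanned twice at 50 Hz

print(scan_rate(25))  # 50.0 -- no tearing, even well below the 35 Hz floor
print(scan_rate(60))  # 60.0 -- passthrough, panel scans once per frame
```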
The problem is that the voltage needed to change a pixel on a panel at 60 Hz is not the same as what's needed at 70 Hz, 80 Hz, 90 Hz, etc. The hardware in the monitor has to figure that out on the fly, which the G-Sync module does almost perfectly, whereas the scalers used for FreeSync are a bit behind.
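In rough terms that's a calibration table plus interpolation, something like the sketch below (the gain numbers are invented for illustration; real scalers tune these per panel):

```python
# Hypothetical overdrive-strength table, keyed by refresh rate.
# Real firmware calibrates these per panel; the values here are made up.
OVERDRIVE_TABLE = [(60.0, 1.00), (90.0, 1.15), (120.0, 1.30), (144.0, 1.40)]

def overdrive_gain(refresh_hz):
    """Linearly interpolate an overdrive gain for the current scan rate.

    Push pixel transitions just hard enough: too little gives smearing,
    too much gives the inverse-ghosting seen in that review.
    """
    pts = OVERDRIVE_TABLE
    if refresh_hz <= pts[0][0]:
        return pts[0][1]
    for (lo_hz, lo_g), (hi_hz, hi_g) in zip(pts, pts[1:]):
        if refresh_hz <= hi_hz:
            t = (refresh_hz - lo_hz) / (hi_hz - lo_hz)
            return lo_g + t * (hi_g - lo_g)
    return pts[-1][1]

print(overdrive_gain(75.0))  # 1.075 -- halfway between the 60 and 90 Hz entries
```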
Some people have tried to blame the LCD panels as being at fault, but G-Sync uses the same panels without a problem. NVIDIA does limit the panels you can use to a list of ones it has tested G-Sync on and that work best with it, so monitor makers can't just use the cheapest off-the-shelf panel they can get. If you had actually read it, I was quoting and responding to someone else there, not you. Is that temp drop really because they improved the chip, OR because of the third-party cooler on the card? That is a rhetorical question, in case you missed it, because we already know the answer. AMD didn't do anything else to the card; it's been proven with the card info strings, and in fact some of the GPUs on these cards have dates from 2014 on them.
Sure, the 390X is a rebadge in the sense that it is essentially the same chip. However, internally it has probably gone through various revisions since the 290X, now enough that the cards have to be differentiated from a sales perspective. Remember what happened when GB started shaving stuff like VRM phases, heatsinks, capacitors, and other things that don't make it to the spec sheet off some of their boards, with the only change being from revision 1.0 to revision 2.0 and no way to distinguish between them when you buy the board... The same really applies to GPUs. From what I've seen the PCB is the same on the 390X; however, if they have made major revisions to the microcode, and possibly some other small physical alterations that require different driver/BIOS support, it is easier for them to keep the cards completely separate.
The same could be said for the 680 vs the 770. The 770 has some major differences, firstly in its reference PCB design, and the practical performance of the card is also somewhat better than the 680's. Yes, both use a fully fledged GK104, but the 770 is far more mature in every way (it performs slightly better while using less power, what a surprise...).
Don't know about other 970 owners, but then the 970 is a lot cheaper card; mine works great, you'll be pleased to know.
Banging on about the 970 won't change that.
Now we have news that the problem has been taken care of, so what's the fuss about???
Also, if existing Fury X owners have a whining issue, Guru3d says they can get an RMA: www.guru3d.com/news-story/amd-fixes-r9-fury-x-whining-noises.html It was a simple comparison showing how these two companies reacted differently to the same problem that you are crying foul about.
Good to see they are getting prompt RMAs, as they should.
Modern VRMs will easily provide 30-50 amps and more, and that will have a "significant" magnetic field, which will "move/nudge" things around no matter what.
The juice from the PSU will also carry some noise, and the VRM on the graphics card won't ask for the same amount of juice all the time either (in fact, it changes thousands of times every second)... so these pulsations and noises will produce some interference even with the best possible parts available, and sometimes that interference falls in the 20 Hz-20 kHz range that you might hear.
Modern digi-VRMs with a lot of phases also switch off most of those phases when jumping into and out of power-saving states, which makes things worse... etc.
The bottom line: there may be no way to eliminate every possible source of coil whine when the graphics card maker only produces the card and not the other parts in the PC, like the PSU or the motherboard... so it's really not like a pump whine that can be "tested out", it's as simple as that.
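A quick back-of-the-envelope check shows why some of it lands in the audible band (just illustrative arithmetic; the 2,000 fps figure is a made-up example of an uncapped menu):

```python
AUDIBLE_LOW_HZ, AUDIBLE_HIGH_HZ = 20.0, 20_000.0

def audible_harmonics(pulse_hz, n_harmonics=10):
    """Which harmonics of a repeating load pulse fall in the audible band.

    The VRM's own switching frequency (often 300 kHz+) is ultrasonic,
    but if the load it feeds pulses a few thousand times per second
    (e.g. an uncapped menu rendering ~2,000 fps), the coils get excited
    at frequencies our ears can pick up.
    """
    return [n * pulse_hz for n in range(1, n_harmonics + 1)
            if AUDIBLE_LOW_HZ <= n * pulse_hz <= AUDIBLE_HIGH_HZ]

print(audible_harmonics(2_000.0))  # [2000.0, 4000.0, ..., 20000.0] -- all audible
```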
I would expect that this is due to normal changes or variations in the strength/rigidity of physical connections on the device. There are plenty of things you can't control when soldering parts to the board; solder may creep up the leads in variable amounts, making some joints more rigid than others. Also, variations in the parts themselves can contribute similarly.
I mentioned this in another thread: I have a 760 ACX that exhibits coil whine under very specific circumstances, and when I tried with a friend's card I could not replicate it. I only get coil whine when running at very specific clock-voltage combinations within a certain temperature range, and it only happens when I run FurMark. Bad VRM design? I don't think so.
The air-cooled version will be at least $100 less expensive than the current asking price of the GTX 980 Ti, maybe even $150.