# AMD Releases PCI-Express Power Draw Fix, We Tested, Confirmed, Works



## W1zzard (Jul 8, 2016)

Earlier today, AMD posted a new Radeon Crimson Edition Beta driver, 16.7.1, which includes two fixes for the reported PCI-Express overcurrent issue that has kept the Internet busy over the last few days.

The driver changelog mentions the following: "Radeon RX 480's power distribution has been improved for AMD reference boards, lowering the current drawn from the PCIe bus", and there's also a second item "A new "compatibility mode" UI toggle has been made available in the Global Settings menu of Radeon Settings. This option is designed to reduce total power with minimal performance impact if end users experience any further issues."







In order to adjust the distribution between power drawn from the PCI-Express slot and power drawn from the PCI-Express 6-pin power connector, AMD uses a feature of the IR3567 voltage controller used on all reference-design cards.



 

This feature lets you adjust the power phase balance by changing the controller's configuration via I2C (a way to talk to the voltage controller directly; GPU-Z uses the same method to monitor VRM temperature, for example). By default, power draw is split 50/50 between the slot and the 6-pin; this can be adjusted per phase, with a value between 0 and 15. AMD has chosen a setting of 13 for phases 1, 2 and 3, which effectively shifts some power draw away from the slot onto the 6-pin connector. I'm unsure why they did not pick a setting of 15, which I've tested to shift even more power.
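For illustration, here's a minimal sketch of what such a per-phase adjustment could look like in code. Only the 0-15 value range and AMD's choice of 13 on phases 1-3 come from the article; the register layout is an invention for this example, and a real tool would issue these writes to the controller over the GPU's I2C bus rather than just build them.

```python
# Hypothetical sketch -- the register base below is made up for illustration.
PHASE_BALANCE_BASE = 0x30  # assumed: one balance register per phase

def phase_balance_writes(settings):
    """Turn a {phase: balance} mapping into (register, value) I2C write pairs."""
    for phase, value in settings.items():
        if not 0 <= value <= 15:
            raise ValueError(f"balance for phase {phase} must be 0-15")
    return [(PHASE_BALANCE_BASE + phase - 1, value)
            for phase, value in sorted(settings.items())]

# AMD's 16.7.1 choice: setting 13 on phases 1, 2 and 3
writes = phase_balance_writes({1: 13, 2: 13, 3: 13})
print(writes)  # [(48, 13), (49, 13), (50, 13)]
```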

The second adjustment is an option inside Radeon Settings called "Compatibility Mode". The name is kinda vague, and the tooltip doesn't reveal much either. Out of the box, the setting defaults to off and should only be employed by people who still run into trouble even with the adjusted power distribution from the first change, which is always on and has no performance impact. When Compatibility Mode is enabled, it slightly limits the performance of the GPU, which results in reduced overall power draw.

We tested these options; below you will find our results using Metro: Last Light (with the card warmed up before the test run). First we measured power consumption using the previous 16.6.2 driver, then we installed 16.7.1 (while keeping Compatibility Mode off), then we turned Compatibility Mode on.





As you can see, the power shift alone, while working, is not completely sufficient to reduce slot power below 75 W; we measured 76 W. As the name suggests, the changed power distribution results in increased power draw from the 6-pin, which can easily handle slightly higher loads, though.
With the Compatibility Mode option enabled, power from the slot goes down to only 71 W, which is perfectly safe, but costs performance.

AMD has also promised improved overall performance with 16.7.1, so we took a look at performance, using Metro again.



 

Here you can see that the new driver adds about 2.3% performance, which is a pretty decent improvement. Once you enable Compatibility Mode, though, performance drops slightly below the original result (0.8% lower), which means Compatibility Mode costs you around 3%, in case you really want to use it. I do not recommend using Compatibility Mode; personally, I don't think anyone with a somewhat modern computer would have run into any issues due to the increased power draw in the first place, and neither does AMD. It is good to see that AMD still chose to address the problem, and solved it fully, properly, and quickly.
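The percentages above work out as follows (a quick sanity check, using the normalized results from this paragraph):

```python
# Normalized Metro: Last Light results
baseline   = 100.0                   # 16.6.2 driver
new_driver = baseline * 1.023        # 16.7.1: ~2.3% faster
compat     = baseline * (1 - 0.008)  # Compatibility Mode: ~0.8% below baseline

# Cost of Compatibility Mode relative to the new driver's full performance
cost_pct = (new_driver - compat) / new_driver * 100
print(f"{cost_pct:.1f}%")  # -> 3.0%
```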

*View at TechPowerUp Main Site*


----------



## Basard (Jul 8, 2016)

So, if your mobo fries... simply reboot and turn compatibility mode on.  Seems like it should be on by default.


----------



## Chaitanya (Jul 8, 2016)

Nice of AMD to fix the power draw management.


----------



## Countryside (Jul 8, 2016)

Basard said:


> So, if your mobo fries... simply reboot and turn compatibility mode on.  Seems like it should be on by default.



It fries when you use the 480 with an ancient motherboard which has an ancient power delivery system.


----------



## cryohellinc (Jul 8, 2016)

Countryside said:


> It fries when you use the 480 with an ancient motherboard which has an ancient power delivery system.


Better safe than sorry


----------



## raptori (Jul 8, 2016)

Basard said:


> So, if your mobo fries... simply reboot and turn compatibility mode on.  Seems like it should be on by default.



"the adjusted power distribution from the first change, which is always on" by default simply by installing the new driver .

The card will not draw full current while being idle on desktop , so obviously you can turn the second adjustment option (Compatibility Mode) on before running any heavy 3D application .


----------



## Nobody99 (Jul 8, 2016)

Shouldn't motherboard deliver a max of 75W per lane, even if PCIe card requests more?


----------



## ZoneDymo (Jul 8, 2016)

well thats very neat.
So it actually works and the performance does not suffer 

Nice job AMD.

@tPU and chance of running a few more games to see how they react?


----------



## RejZoR (Jul 8, 2016)

Told ya all that modern GPUs have clever power delivery systems and it was an easy fix by just redirecting power from PCIe to the 6-pin. This means the fix doesn't actually gimp the performance AT ALL, because it redirects power to the 6-pin. The Compatibility Mode switch, however, restricts it to the initially advertised 150W.

So, AMD actually delivered best of both worlds. Not only they fixed power delivery of PCIe without performance penalty, they actually delivered factory over-boosted card at expense of slightly higher power consumption. Because when you flip Compatibility mode ON, you actually restrict it to official 150W.

So, for consumers, it's actually a double win. Good job AMD and good job reviewers spotting this anomaly. In the end, we, the consumers benefit the most from all this.

EDIT:
They should rename *Compatibility Mode* to *Power Saving+*, *Enhanced Power Saving* or something more descriptive by itself than Compatibility Mode which can mean anything from software features to hardware features...


----------



## laszlo (Jul 8, 2016)

Nobody99 said:


> Shouldn't motherboard deliver a max of 75W per lane, even if PCIe card requests more?



mobos don't have a built-in current limiter, and any consumer attached to them can draw more until something goes kaboom


----------



## Norton (Jul 8, 2016)

My card is in a UPS truck atm so it will be waiting for me when I get home 

Very nice of AMD to get the bugs worked out while I was waiting for the card to arrive


----------



## mcraygsx (Jul 8, 2016)

Basard said:


> So, if your mobo fries... simply reboot and turn compatibility mode on.  Seems like it should be on by default.



You simply cannot fry a mainboard with minor spikes like that. Each PCI-E slot is able to handle that kind of load. If you fry your mainboard, most likely it is not your 480 that caused it.

Original Post: Glad they were able to tweak and address the problem with a software update.


----------



## RejZoR (Jul 8, 2016)

It just hit me. If they could shift the power delivery, so can users. And expand it beyond total 150W/166W. The card has a very beefy VRM segment. And a 6pin which can deliver up to 150W on its own. Meaning this card has an overclocking potential if users can shift the power delivery and stick better cooling on them. Hidden massive potential!


----------



## TheoneandonlyMrK (Jul 8, 2016)

Sweet now if next payday would hurry up and get here I can get me some waterblocks and crank these to 11.


----------



## wurschti (Jul 8, 2016)

Really nice to see good news from AMD. We need good participants in the "GPU competition" and AMD is doing fine rn. I am waiting for a custom RX 480 for my PC. I wanted to see what the 1060 fuss was all about, but it seems too limited for future upgrades. Maybe the DirectX 12 mixed multi-GPU thing will be glorious, but it is still far from ready, and I'm sure even if it becomes a thing, NVIDIA will just disable it like it has done in the past.


----------



## TRWOV (Jul 8, 2016)

Basard said:


> So, if your mobo fries... simply reboot and turn compatibility mode on.  Seems like it should be on by default.



A single 480 won't fry your motherboard, even with the old driver. The only documented case of a damaged motherboard had multiple cards installed. If, for example,  a board has 3 PCIe slots it has to have provisions for delivering 75w to each one, so 225w in total. A single 480 won't draw that much current from the motherboard.

The new driver sets the 12V bias to the 6pin connector, the compatibility mode just puts the power limit to 150w. It's a different fix for folks that still have problems with the 6pin bias fix; likely those with very old motherboards with power delivery systems that can't deal with the load spikes.




RejZoR said:


> It just hit me. If they could shift the power delivery, so can users. And expand it beyond total 150W/166W. The card has a very beefy VRM segment. And a 6pin which can deliver up to 150W on its own. Meaning this card has an overclocking potential if users can shift the power delivery and stick better cooling on them. Hidden massive potential!



The bias has a maximum of 15 according to W1zzard. You can't set all the power to come from the 6-pin connector, and even if you could, it's not advisable anyway, since each phase can deliver 40w. If you switched everything to the 6-pin, it could only draw 120w at most.


----------



## RejZoR (Jul 8, 2016)

So, all phases can churn out 240W theoretical load and realistic max being 120W on 6pin phases and 75W on PCIe phases. That's still 195W which ain't that bad, giving some headroom. If you can cool it well enough.


----------



## buildzoid (Jul 8, 2016)

TRWOV said:


> A single 480 won't fry your motherboard, even with the old driver.
> 
> The new driver sets the 12V bias to the 6pin connector, the compatibility mode just puts the power limit to 150w. It's a different fix for folks that still have problems (likely those with very old motherboards... we're talking about LGA775, AM2 boards).
> 
> ...



Each phase can do 40A not W and that's with the VRM at 125C. At normal operating temps (80C) you can shove much more current through them.


----------



## zAAm (Jul 8, 2016)

Nice, exactly the response we needed. Glad to see the VRM could handle per-phase load configuration.


----------



## TRWOV (Jul 8, 2016)

buildzoid said:


> Each phase can do 40A not W and that's with the VRM at 125C. At normal operating temps (80C) you can shove much more current through them.



Oh, sorry, you're right. I recalled reading about them supplying 40-something... so watts seemed like a safe bet.


----------



## RejZoR (Jul 8, 2016)

buildzoid said:


> Each phase can do 40A not W and that's with the VRM at 125C. At normal operating temps (80C) you can shove much more current through them.



40A per phase, that's a... a lot more than 40W. At 12V, that's 480W per phase at 125°C. That can't be right. Tell me I've cocked up my math somewhere...


----------



## zAAm (Jul 8, 2016)

RejZoR said:


> 40A per phase, that's a... a lot more than 40W. At 12V, that's 480W per phase at 125°C. That can't be right. Tell me I've cocked up my math somewhere...



I think that would be 40A per phase on the secondary side, meaning 40A at the core/memory voltage, i.e. <1.5V.


----------



## Shamalamadingdong (Jul 8, 2016)

RejZoR said:


> It just hit me. If they could shift the power delivery, so can users. And expand it beyond total 150W/166W. The card has a very beefy VRM segment. And a 6pin which can deliver up to 150W on its own. Meaning this card has an overclocking potential if users can shift the power delivery and stick better cooling on them. Hidden massive potential!



Here you go:
http://www.overclock.net/t/1604979/...-the-reference-rx-480-cards/110#post_25324219

It increases the TDP to ~250W by shifting the power targets, so -50% actually becomes the current/stock (0%) default of 110W and +50% is now 250W. I believe these values are for the GPU/vCore itself and not the entire card, so you can imagine the added headroom.

An added bonus is that it also increases the limit on the memory so the maximum is 2500 MHz instead of 2250.

As always with mods like these: use at your own risk. Things can go kaboom if you don't know what you're doing.


----------



## RejZoR (Jul 8, 2016)

zAAm said:


> I think that would be 40A per phase on the secondary side, meaning 40A at the core/memory voltage, i.e. <1.5V.



That would make more sense, 40A*1.5V is 60W. If that's a 125°C rating, at lower temperatures we can still be talking around 80-100W. That's still huge.

@Shamalamadingdong
Yeah, I'm aware of the risks, but that applies to every overclock. Still, the headroom can be a lot higher than around 150W as initially thought. If user has PSU powerful enough to feed 6pin, VRM is there to deliver it if needed. Far beyond the original specs. Meaning AIB's only have to slam better cooling on reference design. They don't even have to redesign PCB and components on it to gain headroom.


----------



## GhostRyder (Jul 8, 2016)

So its resolved then?  Ok, guess we can knock that problem off the list...

Now then, we just need the aftermarket variants to allow better overclocking or better voltage control on the reference (I assume we will still get that at some point).


----------



## xkm1948 (Jul 8, 2016)

I am having stability issues with this new driver on my FuryX. An 1100 OC is no longer stable no matter what amount of vcore offset is applied now. Played CSGO for 30 mins and it crashed on me, which has not happened since the launch driver. Lowered it to a 1075 OC and it still crashed. It can't even pass the 3DMark FireStrike stress test now. I use DDU between all of my driver installations, so it is definitely not a problem of driver file conflicts. Reverted back to 16.6.1 and everything is back to normal; passed 100 loops of the FSE stress test at 99.5% with the 1100 OC.

Very likely RTG is focusing on 480 driver now and somehow messed up driver support for previous gen cards. Guess I will just wait until a newer driver that does not solely focus on RX480 then.

And yes I have already submitted the problem to RTG.


----------



## xkm1948 (Jul 8, 2016)

RejZoR said:


> That would make more sense, 40A*1.5V is 60W. If that's a 125°C rating, at lower temperatures we can still be talking around 80-100W. That's still huge.
> 
> @Shamalamadingdong
> Yeah, I'm aware of the risks, but that applies to every overclock. Still, the headroom can be a lot higher than around 150W as initially thought. If user has PSU powerful enough to feed 6pin, VRM is there to deliver it if needed. Far beyond the original specs. Meaning AIB's only have to slam better cooling on reference design. They don't even have to redesign PCB and components on it to gain headroom.




Gusteberg over OCN is working on editing RX480 BIOS for increased power limit. With all power limit removed and heat factor dealt with the card should run 1.4GHz to 1.5GHz easily.


----------



## $ReaPeR$ (Jul 8, 2016)

xkm1948 said:


> I am having stability issue with this new driver on FuryX. 1100 OC is no longer stable no matter what amount of vcore offset is applied now. Played CSGO for 30mins and it crashed on me which have not happened since launch driver. Lower down to 1075OC and it still would crash. It can't even pass the 3DMark FireStrike Stress test now. I use DDU between all of my driver installations so it is definitely not the problem of driver files conflict. Reverted back 16.6.1 and everything is back to normal. Passed 100 loops of FSE Stress test at 99.5% for 1100 OC.
> 
> Very likely RTG is focusing on 480 driver now and somehow messed up driver support for previous gen cards. Guess I will just wait until a newer driver that does not solely focus on RX480 then.
> 
> And yes I have already submitted the problem to RTG.


damn! with their budget though i dont think they can "multitask" that much..


----------



## xkm1948 (Jul 8, 2016)

$ReaPeR$ said:


> damn! with their budget though i dont think they can "multitask" that much..




Good thing is there is nothing new I need from 16.7.1 at this moment. I thought the FuryX would get WattMan, but nope. Oh well, I can work with the traditional way of OC.


----------



## GhostRyder (Jul 8, 2016)

xkm1948 said:


> I am having stability issue with this new driver on FuryX. 1100 OC is no longer stable no matter what amount of vcore offset is applied now. Played CSGO for 30mins and it crashed on me which have not happened since launch driver. Lower down to 1075OC and it still would crash. It can't even pass the 3DMark FireStrike Stress test now. I use DDU between all of my driver installations so it is definitely not the problem of driver files conflict. Reverted back 16.6.1 and everything is back to normal. Passed 100 loops of FSE Stress test at 99.5% for 1100 OC.
> 
> Very likely RTG is focusing on 480 driver now and somehow messed up driver support for previous gen cards. Guess I will just wait until a newer driver that does not solely focus on RX480 then.
> 
> And yes I have already submitted the problem to RTG.


That's unfortunate; it didn't affect my 290X's and their overclocking. I tested it with some BF4 last night and had no problems with the same settings, though I don't see anything that I gained either, so I probably could have skipped downloading it.


----------



## $ReaPeR$ (Jul 8, 2016)

xkm1948 said:


> Good thing is there is nothing new I need from 16.7.1 at this moment. I thought the FuryX would get WattMan, but nope. Oh well, I can work with the traditional way of OC.


cool, you look like you know your stuff anyway..


----------



## cdawall (Jul 8, 2016)

laszlo said:


> mobos don't have a built-in current limiter, and any consumer attached to them can draw more until something goes kaboom



Depends on board I have more than one high end board that allowed per slot wattage limits. Most of which I could go as high as 150w...


----------



## HD64G (Jul 8, 2016)

ZoneDymo said:


> well thats very neat.
> So it actually works and the performance does not suffer
> 
> Nice job AMD.
> ...



Almost sure that for the next GPU review (custom RX480 probably) W1z will test them with Crimson 16.7.1, as he always does


----------



## laszlo (Jul 8, 2016)

RejZoR said:


> So, all phases can churn out 240W theoretical load and realistic max being 120W on 6pin phases and 75W on PCIe phases. That's still 195W which ain't that bad, giving some headroom. If you can cool it well enough.



you understood wrong.... the 6-pin isn't maxed out at 120w... it can deliver min. 150w... if the psu rail has the juice...


----------



## RejZoR (Jul 8, 2016)

I know that, I was talking based on what phases can actually deliver. If 3 phases are dedicated to 6pin and each is 40W, then that's 120W. But someone later clarified it's 40A and not 40W...


----------



## Agiels (Jul 8, 2016)

so is it a driver fix or what?? i was having big peaks of current on my r9 280, maybe it got fixed for me too?


----------



## RejZoR (Jul 8, 2016)

Agiels said:


> so is a driver or what ?? i was having big peaks of current in my r9 280, maybe it got fixed for me too ?



Very unlikely, since this is RX480-specific control of its VRM segment. I very much doubt they'd somehow change the old R9 280 as well with it.


----------



## Agiels (Jul 8, 2016)

damn it, i think my PSU is failing to hold my system, i will get an EVGA SuperNOVA 750W or a CX750M ...


----------



## RejZoR (Jul 8, 2016)

$ReaPeR$ said:


> damn! with their budget though i dont think they can "multitask" that much..



Still, the Fury X is their current flagship. It deserves the same attention as the new RX generation. @xkm1948 send a bug report to AMD.


----------



## xkm1948 (Jul 8, 2016)

RejZoR said:


> Still, the Fury X is their current flagship. It deserves the same attention as the new RX generation. @xkm1948 send a bug report to AMD.



Read my thread dude, already sent. Thanks for the tag though.  

I don't buy cards for showing epen. I use them for work and entertainment. So far my FuryX has been more than awesome for VR as well as OpenCL computing.

Going back to Nekopalive now.


----------



## Agiels (Jul 8, 2016)

Couldn't agree more with you. For me, no company is better than the other; I focus on cost and performance. None of those companies will give you a damn product for free, they'd prefer to have your blood first!

From what I see, NVIDIA is always more expensive than AMD at the same level of performance, at least here in Cuba. For example, a GTX 960 here costs around $250/270; at that price tag I will never buy it over an R9 280X ($220) or an R9 380X ($270), which offer more performance.


----------



## buildzoid (Jul 8, 2016)

Turns out I messed up the current rating at 125C. The RX 480 Vcore VRM is 6 phases composed of an MDU1514 high-side FET and an MDU1511 low-side FET. The MDU1514 has a continuous drain current of 66A @ 25C case temperature and just under 30A at 125C. It has a pulsed current rating of 100A, but the pulse duration or frequency isn't specified, so it's not all that useful. The low-side MDU1511 has a continuous drain current of 100A at 25C case temperature and goes down to about 66A at 125C. So the low side of the VRM can handle 396A of current at 125C, and the high side should be able to keep up with that depending on the duty cycle. A 100% safe rating for the high side is 180A @ 125C at any core voltage setting. Realistically, the VRM shouldn't get anywhere near that hot, and you will have a lot more current to play with on both the high and low sides.
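Putting those datasheet figures together (per-phase ratings exactly as quoted in the post above; the totals just multiply by the six phases):

```python
phases = 6

# Continuous drain current ratings from the post, in amps, by case temperature
mdu1514_high_side = {"25C": 66, "125C": 30}   # "just under 30 A" at 125 °C
mdu1511_low_side  = {"25C": 100, "125C": 66}

low_side_total_125c = phases * mdu1511_low_side["125C"]  # 6 * 66 = 396 A
high_side_safe_125c = 180  # A, the conservative 125 °C figure quoted above

print(low_side_total_125c, high_side_safe_125c)  # 396 180
```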


----------



## $ReaPeR$ (Jul 8, 2016)

entropic said:


> they really should focus on the boot issue with high refresh rate using dvi, multi high refresh monitors increased power draw and displayport incompatibility with htc vive, but i guess any amd fault to bash is better than admitting nvidia has any


i didnt know about these problems, could you care to post some links please?


----------



## cdawall (Jul 8, 2016)

I don't understand why people care so much about brand if it works it works if it doesn't it doesn't. Both companies have released junk for whatever reason. Move on.



$ReaPeR$ said:


> i didnt know about these problems, could you care to post some links please?



Can you use google?

https://www.techpowerup.com/223669/geforce-gtx-pascal-faces-high-dvi-pixel-clock-booting-problems

http://www.tomshardware.com/news/nvidia-vive-displayport-incompatible,32204.html


----------



## Frick (Jul 8, 2016)

There's a word for people stamping degrading titles on others without knowing the person in question: people stamping degrading titles on others sans knowing the persons they apply the stamp upon. And it's always ugly, on so many levels.


----------



## cadaveca (Jul 8, 2016)

Since this option was a function of the VRM controller, I was really surprised AMD didn't do this from the start, especially considering that it barely affects performance.

Nice to see Compatibility Mode, so that users with older (within limits) PSU/mobo combos can safely add in multiple 480s, too.

However, If I was AMD, I would have had these things ready at the launch and advertised them as features to boost up the bullet points and bring in more consumer confidence. Hopefully future releases will take this situation as a consideration and they'll put it to good use. Sad to see hardware released that doesn't take advantage of all it has to offer; excellent to see it used properly so quickly.


----------



## $ReaPeR$ (Jul 8, 2016)

cadaveca said:


> Since this option was a function of the VRM controller, I was really surprised AMD didn't do this from the start, especially considering that it barely affects performance.
> 
> Nice to see Compatibility Mode, so that users with older (within limits) PSU/mobo combos can safely add in multiple 480s, too.
> 
> However, If I was AMD, I would have had these things ready at the launch and advertised them as features to boost up the bullet points and bring in more consumer confidence. Hopefully future releases will take this situation as a consideration and they'll put it to good use. Sad to see hardware released that doesn't take advantage of all it has to offer; excellent to see it used properly so quickly.


that was their stupidity.. the whole problem could be avoided so easily.. i really cant get amd sometimes..


----------



## W1zzard (Jul 8, 2016)

cadaveca said:


> Since this option was a function of the VRM controller, I was really surprised AMD didn't do this from the start, especially considering that it barely affects performance.
> 
> Nice to see Compatibility Mode, so that users with older (within limits) PSU/mobo combos can safely add in multiple 480s, too.
> 
> However, If I was AMD, I would have had these things ready at the launch and advertised them as features to boost up the bullet points and bring in more consumer confidence. Hopefully future releases will take this situation as a consideration and they'll put it to good use. Sad to see hardware released that doesn't take advantage of all it has to offer; excellent to see it used properly so quickly.



They could have just connected one more phase to the 6-pin, instead of 50/50, very easy to do during board design stage.


----------



## cadaveca (Jul 8, 2016)

W1zzard said:


> They could have just connected one more phase to the 6-pin, instead of 50/50, very easy to do during board design stage.


But that might have given the potential for higher clocking and cannibalized a future SKU.


----------



## AsRock (Jul 8, 2016)

Thanks W1zzard for putting the time and effort in to this.


----------



## the54thvoid (Jul 8, 2016)

cadaveca said:


> But that might have given the potential for higher clocking and cannibalized a future SKU.



Is the rumour not that the higher SKU, RX490 is a dual card?


----------



## cadaveca (Jul 8, 2016)

the54thvoid said:


> Is the rumour not that the higher SKU, RX490 is a dual card?


Sarcasm. RX485, not RX490.

I don't think it actually had anything to do with that. I originally thought it made their binning process easier to do it the way they did.


----------



## the54thvoid (Jul 8, 2016)

$ReaPeR$ said:


> isnt the 490 supposed to have a vega gpu?
> edit: damn! well this sounds stupid!! if its a dual gpu solution i dont think it will be much of an oponent to the 1080.. crossfire never scales 100%, and sometimes it doesnt work altogether.



I got it through a google now feed...

http://wccftech.com/amd-rx-490-dual-gpu/

Good old WCCFTech.....

Perhaps Vega will be called "Faster and Fury-ous"?


----------



## RejZoR (Jul 8, 2016)

Well, if it is a dual-GPU card and they are committed to delivering drivers 1 day before game releases, then I'd actually be fine with it. About GTX 1080 performance for roughly 200 € less? Hell yeah.


----------



## Tatty_One (Jul 8, 2016)

I see the children are misbehaving in the nursery yet again, reply bans for this thread will be issued if it continues, followed by free holiday passes, it's been a quiet day, feel free to liven it up for me...... thank you as always.


----------



## PP Mguire (Jul 8, 2016)

$ReaPeR$ said:


> isnt the 490 supposed to have a vega gpu?
> edit: damn! well this sounds stupid!! if its a dual gpu solution i dont think it will be much of an oponent to the 1080.. crossfire never scales 100%, and sometimes it doesnt work altogether.
> 
> well, im sorry that im not as godlike as you and perfect in every way. your problem was with my stupid post and not with the fanboys that blew this way out of proportion. and, in order not to be misunderstood, im not talking about the people saw this as it is, a stupid mistake that could easily be avoided, im talking about the people that were posting in the "this card is shit, amd is worthless, i told you so" kind of mentality. and you judge me to be hypocritical.. yeah sure mate.


Edited due to above post not being seen.


----------



## xvi (Jul 8, 2016)

I think it's a bit funny that at first everyone complains about the PCIe bus power issue and now that it's fixed, they're talking about how beefy the VRMs are and how much power they think we can push through it.


W1zzard said:


> They could have just connected one more phase to the 6-pin, instead of 50/50, very easy to do during board design stage.


Fair point. Think there's any chance that non-reference cards will make this modification?


----------



## $ReaPeR$ (Jul 8, 2016)

the54thvoid said:


> I got it through a google now feed...
> 
> http://wccftech.com/amd-rx-490-dual-gpu/
> 
> ...


LOL that would be funny!! i dont mind the 2 tier approach, though it may be confusing for the average user..



RejZoR said:


> Well, if it is a dual GPU card and they are commited to deliver drivers 1 day before game release, then I'd be actually fine with it. About GTX 1080 performance for roughly 200 € less, hell yeah.


i think you are a bit optimistic mate. (its not a bad thing, i also hope thats the case) i dont know if amd can pull that off. if it worked that would be great though. doesnt dx12 play a role in this situation?


----------



## GhostRyder (Jul 8, 2016)

cadaveca said:


> Sarcasm. RX485, not RX490.
> 
> I don't think it actually had anything to do with that. I originally thought it made their binning process easier to do it the way they did.


I am going to slap my head if they do that with the naming scheme alone.  Though I would hope if they do a RX 485/RX 480X then its just a full version of this chip (If there is such a thing, unless this is the full chip already in the RX 480).



the54thvoid said:


> I got it through a google now feed...
> 
> http://wccftech.com/amd-rx-490-dual-gpu/
> 
> ...





the54thvoid said:


> Is the rumour not that the higher SKU, RX490 is a dual card?


If they do that, then might as well write off the high end department for them and call it a day.  But then again it is WCCFTech...


----------



## $ReaPeR$ (Jul 8, 2016)

GhostRyder said:


> I am going to slap my head if they do that with the naming scheme alone.  Though I would hope if they do a RX 485/RX 480X then its just a full version of this chip (If there is such a thing, unless this is the full chip already in the RX 480).
> 
> 
> 
> If they do that, then might as well write off the high end department for them and call it a day.  But then again it is WCCFTech...



i think the 480 has the full chip, havent seen anything that it would suggest otherwise. 
as for the 490's "duality"  i dont think they have any other choice until vega is ready. which is sad because it means that the prices will remain high in the nvidia camp due to lack of competition. neither sli nor cf have been better till now than a single card solution.


----------



## PP Mguire (Jul 8, 2016)

xvi said:


> I think it's a bit funny that at first everyone complains about the PCIe bus power issue and now that it's fixed, they're talking about how beefy the VRMs are and how much power they think we can push through it.
> 
> Fair point. Think there's any chance that non-reference cards will make this modification?


I found this a bit hilarious as well.


----------



## Jism (Jul 8, 2016)

PP Mguire said:


> I found this a bit hilarious as well.



I think half of the members in this thread should apply at AMD for a VGA engineering job.

Geezus. You know that some motherboards offer CrossFire or SLI with up to 4 slots? And those slots DON'T even have an extra Molex connector? How do you think those boards will hold up the moment you put 4 cards in without any PCI-Express booster? That's 4x 75W pulled from the PCI-E bus alone, over 2 wires on that poor 24-pin ATX connector.

Geezus, people, even many ancient motherboards offer an onboard Molex connector; it probably sits near the PCI-Express slot and feeds in additional current.

And if it's not there, you'd buy a pci-express booster:







This way you are able to pull more current than the standard 75 W.
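The arithmetic behind the quad-card worry above can be sanity-checked with a back-of-the-envelope sketch. The rail assignment (everything on 12 V), the two-wire count, and the board power are assumptions for illustration, not figures from the thread:

```python
# Rough check of the quad-CrossFire scenario: four cards drawing full
# PCIe slot power, fed through two 12 V wires on the 24-pin ATX plug.
# Assumed: all slot power comes off the 12 V rail (in reality a slice
# is on 3.3 V), and the board routes it through just two ATX wires.

SLOT_POWER_W = 75   # PCIe spec limit per x16 slot
CARDS = 4           # quad CrossFire/SLI, no auxiliary board connector
RAIL_V = 12.0       # assumed supply rail
WIRES = 2           # assumed 12 V wires on the 24-pin ATX connector

total_w = SLOT_POWER_W * CARDS
total_a = total_w / RAIL_V
per_wire_a = total_a / WIRES

print(f"total slot draw: {total_w} W -> {total_a:.1f} A on 12 V")
print(f"per ATX wire:    {per_wire_a:.1f} A")  # -> 12.5 A
```

At 12.5 A per wire, the load sits well above the 6-8 A commonly cited for 18 AWG ATX wiring, which is exactly why such builds want a booster or onboard Molex feed.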


----------



## RejZoR (Jul 8, 2016)

PCIe is specified to deliver 75 W per slot. If a board can't deliver that power, why the F does it even exist?


----------



## Jism (Jul 8, 2016)

RejZoR said:


> PCIe is specified to deliver 75W per slot. If board can't deliver that power, why the F does it even exist then?



It's not about delivery, it's about the traces through the motherboard; ancient & cheap ones are designed to handle a current of up to 75 W. If a card exceeds that 75 W, the problem is that the traces might eventually burn, or simply 'degrade' from running too hot for too long. That, in theory, is what is going on.

The 12 V power comes straight from the 24-pin ATX plug. Some motherboards even share the 8/4 pins from the CPU with the PCI-Express slots.


----------



## ensabrenoir (Jul 8, 2016)

Jism said:


> *I think half of the members in this thread should apply at AMD for an VGA engineering job*.
> 
> Geezus. You know that some motherboards offer Crossfire or SLI up to 4 slots? And those slots DONT even have an extra Molex-connector? How do you think these boards will hold up the moment you put 4 cards in without any PCI-express booster? That's 4x 75W's pulling from PCI-E bus alone, over 2 wires on that poor 24 ATX connector.
> 
> ...



Cool!!!!!!  It would be like Twitch Plays Pokemon!!!!!!  Thousands of people designing a GPU over the internet.


----------



## xvi (Jul 9, 2016)

ensabrenoir said:


> Cool!!!!!!  It would be like twitch plays Pokemon!!!!!!  Thousands of people design a gpu over the internet.


Praise Helix.


----------



## GhostRyder (Jul 9, 2016)

$ReaPeR$ said:


> i think the 480 has the full chip, havent seen anything that it would suggest otherwise.
> as for the 490's "duality"  i dont think they have any other choice until vega is ready. which is sad because it means that the prices will remain high in the nvidia camp due to lack of competition. neither sli nor cf have been better till now than a single card solution.


Yeah, I'll grant that it may be, unfortunately.  I was actually hoping that it was like the R9 285/380 and they were just stocking up/waiting to release the full chip.

As for the dual card, it would be very foolish if they do that.  Unless it's priced super aggressively (like 1.5-1.75x the price of the RX 480) it's not going to be worth it.  Plus it's still a dual card, which carries lots of problems relating to scaling in games.  Personally I don't think they'll do it, because dual GPUs have been on the downslide a lot lately in sales.


----------



## nem.. (Jul 9, 2016)

*ASUS GTX 950 2 GB (no power connector)*, *79 W TDP*

https://www.techpowerup.com/reviews/ASUS/GTX_950/21.HTML


----------



## Fluffmeister (Jul 9, 2016)

Oh nem, I love your attempts at damage control.


----------



## CounterZeus (Jul 9, 2016)

nem.. said:


> *ASUS GTX 950 2 GB ( no power connector )* *79w tdp*
> 
> https://www.techpowerup.com/reviews/ASUS/GTX_950/21.HTML



79 W was the peak during gaming, the single highest value (might well be a little power spike), while Furmark only hit 76 W. Pretty well within the limits, I would say.

Btw, TDP means thermal design power; it means the cooler is designed to handle up to 75 W.


----------



## AsRock (Jul 9, 2016)

I was thinking the same, that it was odd gaming was higher, so it was just a spike.  Not that any of it really matters anyway; AMD fixed the shit and people should be happy and leave it at that.



Fluffmeister said:


> Oh nem, I love your attempts at damage control.



Thing is, there is no damage to control, and tbh I have no idea why he would post that anyway, as it all looks good.


----------



## GhostRyder (Jul 9, 2016)

nem.. said:


> *ASUS GTX 950 2 GB ( no power connector )* *79w tdp*
> 
> https://www.techpowerup.com/reviews/ASUS/GTX_950/21.HTML
> 
> ...


First off TDP is not actual power usage.  Second so what?  Not on the subject much and we already deduced how little this is a problem anyways.


AsRock said:


> I was thinking the same that it was odd that gaming was higher so it was just a spike.  Not as any of it really matters anyways, AMD fixed the shit and people should be happy and leave it at that.
> 
> 
> 
> Thing is there is not damage to control and tbh i have no idea why he would post that anyways as it looks all good anyways.


Bingo


----------



## nem.. (Jul 9, 2016)

GhostRyder said:


> First off TDP is not actual power usage.  Second so what?  Not on the subject much and we already deduced how little this is a problem anyways.
> 
> Bingo



Even without OC, the power consumption has been more than 75 W from the PCIe slot.

TDP, thermal design power, means average power dissipated, as I understand it, but in the case we are talking about there is no doubt the real figure is 75 W or even more, as in the OC case where consumption rises without touching the voltage.







CounterZeus said:


> 79W was peak during gaming, the highest value (might as well be a little power spike), while furmark only hit 76W. Pretty good within the limits I would say.
> 
> Btw, tdp means thermal design power, it's designed to still be able to cool up until 75W.



The point is that the 75 W limit has been exceeded many times; these are not just peaks in power consumption, like the readings of up to 250 W on the Asus Strix.


----------



## Fluffmeister (Jul 9, 2016)

AsRock said:


> I was thinking the same that it was odd that gaming was higher so it was just a spike.  Not as any of it really matters anyways, AMD fixed the shit and people should be happy and leave it at that.
> 
> Thing is there is not damage to control and tbh i have no idea why he would post that anyways as it looks all good anyways.



Thing is, nem is still posting random nonsense, he stinks of damage control.


----------



## Fluffmeister (Jul 9, 2016)

nem.. said:


> the only one here posting nonsense, as far as I can see, is you.



Honestly, I hope you aren't banned again, you're my fave.

Glad AMD addressed the juicy-ness of the 480.


----------



## xvi (Jul 9, 2016)

*hovers over unsub button*

It's almost always a mistake subbing to any GPU-related thread.


----------



## Frick (Jul 9, 2016)

xvi said:


> *hovers over unsub button*
> 
> It's almost always a mistake subbing to any GPU-related thread.



You drown in notifications anyway, so there's little point to it.


----------



## qubit (Jul 9, 2016)

Nice to see this fixed properly. I was quite pessimistic that they'd botch it and am really happy it's now working properly. Bring on competition!


----------



## the54thvoid (Jul 9, 2016)

qubit said:


> Nice to see this fixed properly. I was quite pessimistic that they'd botch it and am really happy it's now working properly. Bring on competition!



It was never botched, more of an oversight from probably a specific section of the design team. My brother works on compute GPUs for industry and he has little to do with power delivery or cooling. I imagine the people concerned got a bit of a dressing down.


----------



## qubit (Jul 9, 2016)

the54thvoid said:


> It was never botched, more of an oversight from probably a specific section of the design team. My brother works on compute GPUS for industry and he has little to do with power delivery or cooling. *I imagine the people concerned got a bit of a dressing down.*


Yeah, I'll bet!

No, I said that the _fix_ might be botched, but I'm glad it wasn't. The fault was certainly an oversight all right.


----------



## $ReaPeR$ (Jul 9, 2016)

GhostRyder said:


> Yea, I am giving it that it may unfortunately.  I was actually hoping that it was like the R9 285/380 and they were just stocking up/waiting to release the full chip.
> 
> As for the dual card, it would be very foolish if they do that.  Unless its priced super aggressively (like 1.5-1.75x the price of the RX 480) its not going to be worth it.  Plus its still a dual card which carries lots of problems relating to scaling in proper games.  I don't think they would do that personally because dual GPU's have been on the down slide alot lately in sales.


For this gen AMD chose to compete only in the perf/$ segment, and from the results I don't think that was a bad choice; I find this segment to be the closest to "reality", since the majority of people fall into it. Let's hope Vega will compete with the x80/Ti and Titan variants, because the lack of competition is not good for anyone.



xvi said:


> *hovers over unsub button*
> 
> It's almost always a mistake subbing to any GPU-related thread.


Sometimes it's hard not to take the bait though; these threads are the liveliest, after all..


----------



## RejZoR (Jul 9, 2016)

And they sell the RX 480 for a rather realistic price. I just found an RX 480 for 269 € at some German store.

$229 = 207 € + 20% VAT = 248 €

The price is a bit inflated since stocks are small, but a 20 € premium is not that huge if you want it now.
And they do have them in stock.
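That conversion can be reproduced with a small helper. The 1.107 USD/EUR rate below is an assumed mid-2016 figure, and the flat 20% VAT with zero retailer margin is a simplification, not an exact model of any store's pricing:

```python
# Toy version of the street-price math above: convert a USD MSRP to
# EUR at an assumed exchange rate, then add VAT on top. Exchange rate
# and VAT percentage are illustrative assumptions.

def eur_street_price(usd_msrp, usd_per_eur=1.107, vat=0.20):
    """Return (pre-VAT EUR price, EUR price with VAT), no margin."""
    pre_vat = usd_msrp / usd_per_eur
    return pre_vat, pre_vat * (1 + vat)

pre, post = eur_street_price(229)
print(f"{pre:.0f} EUR + 20% VAT = {post:.0f} EUR")  # 207 EUR + 20% VAT = 248 EUR
```

Swapping in a 27% VAT reproduces the ~275 EUR Hungarian figure quoted later in the thread.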


----------



## EdInk (Jul 9, 2016)

The 6-pin used on the RX 480 already has the power throughput of an 8-pin connector.


----------



## $ReaPeR$ (Jul 9, 2016)

RejZoR said:


> And they sell RX480 for rather realistic price. I just found RX480 for 269 € on some German store.
> 
> $229 = 207€ + 20% VAT = 248€
> 
> ...


Wow!!! That's a good deal! 
The euro/dollar pricing is usually so horrible; I understand the tax thing, but why the hell do they go with $199 = €199?! That irritates me so much.


----------



## ensabrenoir (Jul 9, 2016)

.............Problem solved......Amd did good.....how much longer are we gonna let this thing cook ?.


----------



## sutyi (Jul 9, 2016)

RejZoR said:


> And they sell RX480 for rather realistic price. I just found RX480 for 269 € on some German store.
> 
> $229 = 207€ + 20% VAT = 248€
> 
> ...



Germany:
RX 480 8GB - 239USD = ~217EUR + 20% VAT = ~260EUR

So I would say pretty decent sales margin.


Meanwhile in Hungary:
RX 480 8GB - 239USD = ~217EUR + 27% VAT = ~275EUR *~319EUR*

It is kinda funny and sad at the same time that I could order a Sapphire RX 480 NITRO+ with shipping from OCUK for about the same amount at ~324EUR.

_Ahhh...God bless Hungaristan..._


----------



## RejZoR (Jul 9, 2016)

Also, meanwhile in Slovenia (Slovenistan if you will  ), 329€ for reference ASUS RX480...


----------



## TRWOV (Jul 9, 2016)

Amazon has free shipping on orders over $150 to Mexico... hello RX 480 at 2/3 of the local price... heck, it's actually cheaper than buying it on amazon.com.mx :/


----------



## Nobody99 (Jul 9, 2016)

RejZoR said:


> And they sell RX480 for rather realistic price. I just found RX480 for 269 € on some German store.
> 
> $229 = 207€ + 20% VAT = 248€
> 
> ...


Better to wait for the RX 470, which will have even better performance per €/$ and will be without this behaviour that doesn't adhere to specifications.

In Europe it is better to order from Germany because shipping fees aren't that high (https://www.notebooksbilliger.de/infocenter/section/shipping). One important thing they offer is that the sites are actually optimized for buying and have good filtering, and if that isn't enough you can use sites like Geizhals, which have even better filtering.


----------



## sutyi (Jul 9, 2016)

RejZoR said:


> Also, meanwhile in Slovenia (Slovenistan if you will  ), 329€ for reference ASUS RX480...





Order from Mindfactory, they are shipping to Slovenia iirc.


----------



## TheLaughingMan (Jul 9, 2016)

Does this improve power consumption at any other time, such as idle, Blu-ray playback, etc.?


----------



## Agiels (Jul 9, 2016)

Well, it seems like 2x RX 480 can't surpass a single 1080. It's near, but not what AMD stated. Hope it will in future games. Just hoping, since I'm on the red side of the Force.


----------



## TRWOV (Jul 9, 2016)

TheLaughingMan said:


> Does this improve power consumption any other time such as idle, Blu-ray playback, etc.



No



Agiels said:


> well seems like 2x Rx480 cant surpass a sigle 1080, its near, but not what AMD stated, hope in future games it will do, just hoping, since im in the red side of the force



That claim applied to AoS, and only in DX12 mode... so yeah. RX 480 CrossFire isn't really worth it at the moment.

I plan to buy 2 480s and 1 470 to upgrade all my rigs... maybe a 460 as a backup card too. I haven't upgraded since the HD 7000 series... well, there weren't a lot of options for a real upgrade anyway (gaming on a 7970 and 7950).


----------



## Agiels (Jul 9, 2016)

TRWOV said:


> No
> 
> 
> 
> ...


I'm still on the R9s... I have a Sapphire R9 280 3GB Dual-X. Just a quick question: I'm having zero instability problems in my rig with this card, but games still crash. What could it be?


----------



## TRWOV (Jul 9, 2016)

Dunno, drivers? I haven't experienced a crash lately... I think my last crash was on Crysis 2 in an overclocking session... Is your card overclocked?


----------



## RejZoR (Jul 9, 2016)

sutyi said:


> Order from Mindfactory, they are shipping to Slovenia iirc.



I don't need one, I'm good with my GTX 980. I used to order from Hardwareversand, but they are no more. Now it's usually ComputerUniverse, mainly because they support wire transfer. Geizhals is usually the place where I check prices and then pick out the shops that have a wire transfer/prepayment option. Bought basically my entire current system this way; no need to deal with credit card limits and stuff.


----------



## Agiels (Jul 9, 2016)

TRWOV said:


> dunno, drivers? Haven't experienced a crash lately... I think my last crash was on Crysis 2 in an overclocking session....- Is your card overclocked?


It's the factory OC, 940/1250; haven't touched it since I've had it, not worth it. It doesn't crash on any stress test: Firestrike, Unigine (all of them), Asus RealBench, 3DMark 11, Cinebench R15, FurMark. With Win7 and driver 14.12 it crashes less, but of course runs badly since it's not optimized for new games. Maybe my PSU is getting tired? A CX600M; I heard they're good, but not good for stressing.


----------



## INSTG8R (Jul 9, 2016)

Agiels said:


> It's the factory OC, 940/1250; haven't touched it since I've had it, not worth it. It doesn't crash on any stress test: Firestrike, Unigine (all of them), Asus RealBench, 3DMark 11, Cinebench R15, FurMark. With Win7 and driver 14.12 it crashes less, but of course runs badly since it's not optimized for new games. Maybe my PSU is getting tired? A CX600M; I heard they're good, but not good for stressing.



Maybe you should make a thread for help and not use the news thread.


----------



## Agiels (Jul 10, 2016)

INSTG8R said:


> Maybe you should make a thread for help and not use the news.


that is why i said "quick question"


----------



## Tsukiyomi91 (Jul 10, 2016)

good guy AMD.


----------



## TRWOV (Jul 10, 2016)

Agiels said:


> It's the factory OC, 940/1250; haven't touched it since I've had it, not worth it. It doesn't crash on any stress test: Firestrike, Unigine (all of them), Asus RealBench, 3DMark 11, Cinebench R15, FurMark. With Win7 and driver 14.12 it crashes less, but of course runs badly since it's not optimized for new games. Maybe my PSU is getting tired? A CX600M; I heard they're good, but not good for stressing.



940/1250 is very tame. I have my 7950 (basically the same GPU) running at 1150/1500 (Sapphire Vapor-X), so I don't think it's that. The PSU is fine too; not top grade, but adequate.

I'd say it's a software issue. I would try a full Windows re-install (Refresh if you have W8/W10) to start clean and see if that solves it.


----------



## xkm1948 (Jul 10, 2016)

And now NV is sharing some of the PCI-E-gate love as well, brought to you by none other than Tom's PCI-E Guide!

https://www.techpowerup.com/forums/threads/gtx1070-fe-pcie-gate.224024/


----------



## Covert_Death (Jul 10, 2016)

Chaitanya said:


> Nice of AMD to fix the power draw management.


Haha, yes, how nice of them to ship a bad device and then fix it, praise be to AMD!


----------



## xkm1948 (Jul 10, 2016)

Covert_Death said:


> hha, yes, how nice of them to ship a bad device and then fix it, praise be to AMD!




Shh, now you need to defend the 1070's PCI-E gate as well. Hurry!

Oh and did I forget the dual link DVI defect of 1070/1080? Or the VR defect of 1070/1080?


----------



## TheLaughingMan (Jul 10, 2016)

Covert_Death said:


> hha, yes, how nice of them to ship a bad device and then fix it, praise be to AMD!



Exactly. How dare AMD not anticipate that review sites, which typically use the best of the best when testing, would use $30 motherboards and/or boards from like 4 years ago. Then they implement a fix for their customers who may be using that same stuff. It's almost like they care about their customers or something. Next time they should be like Nvidia with their missing 512 MB of VRAM and just make excuses.


----------



## Tatty_One (Jul 10, 2016)

xkm1948 said:


> Shh, now you need to defend the 1070's PCI-E gate as well. Hurry!
> 
> Oh and did I forget the dual link DVI defect of 1070/1080? Or the VR defect of 1070/1080?


Trolling much, you on a retainer?  I see some more reply bans coming if it continues.


----------



## RejZoR (Jul 10, 2016)

Covert_Death said:


> hha, yes, how nice of them to ship a bad device and then fix it, praise be to AMD!



The GTX 970 is a bad device with its 3.5 GB crap. The RX 480 was just configured badly. The GTX 970 can't ever be fixed. The RX 480 has already been fixed. Anything else?


----------



## HD64G (Jul 10, 2016)

RejZoR said:


> GTX 970 is a bad device with it's 3.5GB crap. RX480 was just configured badly. GTX 970 can't ever be fixed. RX480 was already fixed. Anything else?


No need to feed the trolls, buddy. They're in pain now, with AMD easily fixing a problem they believed could totally ruin the RX 480's sales. Every customer is happy with the fix, except for the green fanboys who want AMD dead, only to end up paying twice as much as now for their beloved GPUs...


----------



## Fluffmeister (Jul 10, 2016)

The superb sales of the GTX 970 have already answered all the haters.

Anyway, I think we can all agree it's good to see AMD getting close to GM204's performance per watt, and fixing these problems which pop up with new card launches.


----------



## D007 (Jul 10, 2016)

Kind of loling a bit here.. People keep screaming "AMD fixed it!".. But they really didn't....
They gave you a driver update.. Which is totally normal, and can improve performance in "some games" while hurting it in others..
Truth is, if you enable compatibility mode, you lose 3%..
The driver update is just a driver update.. All cards get them..
You lost 3% with the fix..
Ok.. Scream about the driver update now again, as if it is a valid response..lol..

That being said.. Did it ever really require a fix? I mean, whose hardware could this possibly harm?
Just leave compatibility mode off, I guess..



HD64G said:


> No need to feed the trolls buddy . They have their pain now with AMD easily fixing a problem they believed it could totally ruin RX480's sales. Every customer is happy with the fix except for green fanboys who want AMD dead to be able to pay twice then now for their beloved GPUs...



In my experience around here.. people who call others fanboys are in fact, themselves, fanboys..
If that's what you consider an intelligent response.
/yawn..


----------



## xkm1948 (Jul 10, 2016)

RejZoR said:


> GTX 970 is a bad device with it's 3.5GB crap. RX480 was just configured badly. GTX 970 can't ever be fixed. RX480 was already fixed. Anything else?




It's called selective blindness. I haven't seen Nvidia fix its incompatibility problem with the HTC Vive yet, and I see zero complaints. AMD fixed a problem and people are still complaining. Double standard much?


----------



## the54thvoid (Jul 10, 2016)

xkm1948 said:


> Selective blindness it is called. I haven't seen nvidia fixing its incompatiability problem with HTC Vive yet and i see 0 complains. AMD fixed a problem and people are still complaining. Double standard much?



I'm awesomely immune to a retort, as I'm the one that linked the initial HTC VR issue in a post last week.
But seriously dude, your posts are incredibly immature and stink of bitterness. Why try so hard to focus on fanboys? Don't you see the absolute irony in what you do?
Seriously, please try to post about the issue in the OP without trying to make it a gutter fight with the 'other' side.
Neither team makes perfect products. Neither team makes perfect drivers. People aren't perfect either, and we'll get people here that always poke fun (talking to you @Fluffmeister ). But fluff isn't hostile. Some people downright are, and you're well on that path. It's unpleasant.
You don't like Nvidia, or you don't like their fanboys; neither do I. But as soon as you pick up the sword to fight them, you're in the other camp by default. Rise above it and help keep TPU more mature than other sites. Please?


----------



## TheoneandonlyMrK (Jul 10, 2016)

D007 said:


> Kind of loling a bit here.. People keep screaming AMD fixed it!.. But they really didn't....
> They gave you a driver update.. Which is totally normal and they can improve performance in "some games" while hurting it in others..
> Truth is if you enable compatibility mode, you lose 3%..
> The driver update is just a driver update.. All cards get them..
> ...


I've got 2. AMD did fix it with a driver, and I'm getting the same scores in 3DMark that I got before. I'm folding with them, so they have been soak-tested adequately, and they haven't killed my motherboard even without its additional PCIe power Molex in use. Also, W1zzard and many other sites agree it's fixed. Why are you such a refusenik?


----------



## Norton (Jul 10, 2016)

theoneandonlymrk said:


> I've got 2, Amd did fix it with a driver ,and I'm getting the same scores in 3mark I got before and I'm folding with them so they Have been soak tested adequately and haven't killed my mother board even without its pciex additional power Molex in use, also wizard and many other sites agree its fixed ,why you such a refusenik



Did you try folding with them yet? Really hope they do well in ppd since I want to give mine some folding work to do after I get it installed


----------



## TheoneandonlyMrK (Jul 10, 2016)

Norton said:


> Did you try folding with them yet? Really hope they do well in ppd since I want to give mine some folding work to do after I get it installed


Certainly, mine are at it now. On stock clocks I was getting 550,000 PPD (the clients have been off and on all weekend, so I'd actually expect 620,000 with stable folding), but I've downclocked them to 1140 for 528,000 PPD until I get waterblocks, because they do run very hot; I run 4 folding CPU cores too. EK need to get their stuff out.


----------



## the54thvoid (Jul 10, 2016)

theoneandonlymrk said:


> Certainly,mine are at it now, on stock clocks I was getting 550 000(clients been off and on all weekend so I actually expect 620000 given stable folding)ppd but I've downclocked them to 1140 for 528000 ppd until I get waterblocks because they do run very hot I run 4 folding CPU cores too, Ek need to get there stuff out .



Really surprised RX480 blocks aren't out yet. The Radeon Pro Duo blocks were out almost the same time.


----------



## TheoneandonlyMrK (Jul 10, 2016)

the54thvoid said:


> Really surprised RX480 blocks aren't out yet. The Radeon Pro Duo blocks were out almost the same time.


I've seen some plate-type ones from XSPC at Watercooling UK, but I'd prefer an EK full-cover block long term.


----------



## the54thvoid (Jul 10, 2016)

Yeah, I think the XSPC is a 'budget' cooler. Not sure if it was TPU, but I saw a PR post on a tech site for them. If you can hack that driver and get them under water you could push them further, though for folding I guess stability is king.


----------



## HD64G (Jul 10, 2016)

D007 said:


> In my experience around here.. people who call others fanboys are in fact themselves, fan boys..
> If that's what you consider and intelligent response.
> /yawn..


I won't feed trolls, so have a good time doing what you like, even while ignoring the facts and not being productive in the topic we're discussing...


----------



## GhostRyder (Jul 10, 2016)

D007 said:


> Kind of loling a bit here.. People keep screaming AMD fixed it!.. But they really didn't....
> They gave you a driver update.. Which is totally normal and they can improve performance in "some games" while hurting it in others..
> Truth is if you enable compatibility mode, you lose 3%..
> The driver update is just a driver update.. All cards get them..
> ...


That is not quite correct; I think you missed the part about how it changes the power distribution.  Not only does it shift the power distribution by default toward the 6-pin side, it also includes an option for keeping the power lower if you want.  By default you will now draw much less on the PCIe slot side, but if you want to lower the power further you just enable Compatibility Mode (which in all honesty will probably see very minuscule use).
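As a rough illustration of the phase-balance change the article describes: the sketch below is a toy model only. The weighting formula, the neutral value of 8, the fixed 150 W board power, and the mapping of phases 1-3 to the slot are all assumptions for illustration, not the IR3567's actual register semantics:

```python
# Toy model of the per-phase balance idea: assume each phase's share of
# total current is proportional to (16 - setting), with 8 as a neutral
# default, and that phases 1-3 are fed from the PCIe slot while phases
# 4-6 are fed from the 6-pin connector. All of this is illustrative.

TOTAL_W = 150.0  # assumed board power for the example

def rail_split(settings):
    """Return (slot_watts, six_pin_watts) for six per-phase settings."""
    weights = [16 - s for s in settings]
    total = sum(weights)
    shares = [TOTAL_W * w / total for w in weights]
    return sum(shares[:3]), sum(shares[3:])

print(rail_split([8] * 6))                 # neutral: even 75 W / 75 W split
print(rail_split([13, 13, 13, 8, 8, 8]))   # "13" on slot phases: slot draw drops
```

In this toy model, raising the setting on the slot-fed phases to 13 pushes most of the draw onto the 6-pin, which is the qualitative effect W1zzard measured; a setting of 15 would shift even more.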



the54thvoid said:


> Really surprised RX480 blocks aren't out yet. The Radeon Pro Duo blocks were out almost the same time.


Well you can order them so maybe soon?


----------



## TheoneandonlyMrK (Jul 10, 2016)

GhostRyder said:


> That is not quite correct, I think you missed the part about how it changes the power distribution.  It not only changes the power distribution by default to more on the 6 pin side side, but it also includes a fix for keeping the power lower if you want.  By default, you will now be much lower on the PCIE side now but if you want to lower hte power further you just enable compatibility mode (Which in all honesty will probably see a very minuscule use).
> 
> 
> Well you can order them so maybe soon?


Nice looking block but it would be upside down and out of sight in mine so a bit spangly for me. And expensive.


----------



## TRWOV (Jul 11, 2016)

the54thvoid said:


> Really surprised RX480 blocks aren't out yet. The Radeon Pro Duo blocks were out almost the same time.



This was announced the same day the 480 was released:

https://www.techpowerup.com/forums/...ith-radeon-rx-480-full-coverage-block.223784/


----------



## Assimilator (Jul 11, 2016)

TheLaughingMan said:


> Next time they should be like Nvidia with their missing 512 MB of VRAM and just make excuses.



"Missing 512MB of VRAM"? How do you AMD fanboys dream up this s**t that has zero basis in reality?



Tatty_One said:


> Trolling much, you on a retainer?  I see some more reply bans coming if it continues.



Can't you just ban it outright and save everyone's time?



Fluffmeister said:


> Anyway, I think we all agree it's good to see AMD getting close to GM204's performance per watt, and fixing these problems which pop up with new card launches.



Yep, it's awesome that GloFo's 14nm process is barely capable of matching TSMC's 28nm, while TSMC's 16nm is leaps ahead.


----------



## R-T-B (Jul 11, 2016)

Assimilator said:


> Yep, it's awesome that GloFo's 14nm process is barely capable of matching TSMC's 28nm, while TSMC's 16nm is leaps ahead.



You have to factor in AMD insists on making their cards much more potent at FP32, which hurts their energy efficiency something fierce.  Because of this, we don't really know how far behind GloFo's efficiency on their 14nm node is...  but it still seems to be worse than TSMC's 16nm for certain.


----------



## cadaveca (Jul 11, 2016)

R-T-B said:


> You have to factor in AMD insists on making their cards much more potent at FP32, which hurts their energy efficiency something fierce.  Because of this, we don't really know how far behind GloFo's efficiency on their 14nm node is...  but it still seems to be worse than TSMC's 16nm for certain.


There's at least the clock-speed difference...


But here's the thing: it's obviously cheaper per mm², or AMD wouldn't be selling these cards so cheap. That's part of the nice thing about what AMD offers here; it shows something really interesting about what GloFo offers. But that's something that belongs in an entirely different thread. I'm kind of happy with both AMD and GloFo at this point. What that means in the long run is where the money is.


----------






## D007 (Jul 11, 2016)

HD64G said:


> No need to feed the trolls buddy . They have their pain now with AMD easily fixing a problem they believed it could totally ruin RX480's sales. Every customer is happy with the fix except for green fanboys who want AMD dead to be able to pay twice then now for their beloved GPUs...





theoneandonlymrk said:


> I've got 2, Amd did fix it with a driver ,and I'm getting the same scores in 3mark I got before and I'm folding with them so they Have been soak tested adequately and haven't killed my mother board even without its pciex additional power Molex in use, also wizard and many other sites agree its fixed ,why you such a refusenik



Whatever you say... They can "fix it" all they want... It's still an underperforming card for today's requirements.
I bought ATI before and was never happy with it... Even their rendering methods look worse.

Again I say: you got a "driver" update that got you those FPS. The software tweak they did to fix the power draw helped nothing.. 
It costs 3% if you use compatibility mode.
People can't see the fire through the smoke..


----------



## TRWOV (Jul 11, 2016)

D007 said:


> Whatever you say... They can "fix it all they want.. It's still an underperforming card for todays requirements.
> Bought ATI before and was never happy with it.. Even their rendering methods look worse.
> 
> Again I say.. You got a "driver" update that got you those FPS. The software tweak they did to fix the power draw, helped nothing..
> ...



The fix was for the overdraw on the PCIe slot. The compatibility mode is for those with old motherboard/PSU combos that still have problems even with the PCIe fix.


----------



## cdawall (Jul 11, 2016)

D007 said:


> Whatever you say... They can "fix it all they want.. It's still an underperforming card for todays requirements.
> Bought ATI before and was never happy with it.. Even their rendering methods look worse.
> 
> Again I say.. You got a "driver" update that got you those FPS. The software tweak they did to fix the power draw, helped nothing..
> ...



Underperforming by today's standards? How so? What midrange $200 card is currently beating the 480? Did I miss something about the GTX 960?


----------



## jigar2speed (Jul 11, 2016)

cdawall said:


> Underperforming in today's standards? How so? What midrange $200 card is currently beating the 480? Did I miss something on the gtx960?



You didn't miss anything; he just wanted to whine about something. He doesn't care if people laugh at him for feeling disappointed by a $200 card that matches or beats the GTX 970 in DirectX 11 games and matches the GTX 980 in DirectX 12.


----------



## cdawall (Jul 11, 2016)

jigar2speed said:


> You didn't miss anything; he just wanted to whine about something. He doesn't care if people laugh at him for feeling disappointed by a $200 card that matches or beats the GTX 970 in DirectX 11 games and matches the GTX 980 in DirectX 12.



It is quite a strong card at its price point; I honestly don't understand the complaints from people. The basic gist I get is: how dare AMD offer a good midrange product.


----------



## Steevo (Jul 11, 2016)

D007 said:


> Whatever you say... They can "fix" it all they want.. It's still an underperforming card for today's requirements.
> Bought ATI before and was never happy with it.. Even their rendering methods look worse.
> 
> Again I say.. You got a "driver" update that got you those FPS. The software tweak they did to fix the power draw helped nothing..
> ...




Apparently you didn't read the article written by W1zz, but maybe, since the words were about a brand you only care to hate on, you can't read them?


But I'm glad you overpaid for an FTW 1080. I hope it works well after the fan issues have been sorted, and I hope you never have any display issues, or any black screens when watching media on multiple screens, or when hardware acceleration is in use, or just randomly.


----------



## cdawall (Jul 11, 2016)

Steevo said:


> Apparently you didn't read the article written by W1zz, but maybe, since the words were about a brand you only care to hate on, you can't read them?
> 
> 
> But I'm glad you overpaid for an FTW 1080. I hope it works well after the fan issues have been sorted, and I hope you never have any display issues, or any black screens when watching media on multiple screens, or when hardware acceleration is in use, or just randomly.



I don't hope any of that, as it is just mean. All I hope is that NVIDIA releases a 1080 Ti in two months that makes his card an overpriced paperweight.


----------



## Steevo (Jul 11, 2016)

cdawall said:


> I don't hope any of that, as it is just mean. All I hope is that NVIDIA releases a 1080 Ti in two months that makes his card an overpriced paperweight.




It already is, to a small degree: tested head to head against a 980 Ti, with overclocks included, it's only a few percent faster in almost every game, until NVIDIA starts their planned obsolescence program again.








http://www.newegg.com/Product/Produ...gclid=CNv5wNTs680CFdKIfgodufIIzQ&gclsrc=aw.ds  $389.99 after MIR

VS






Anywhere from a hypothetical $629 to $719, depending on which version of "no stock/out of stock" you would like to pre-order with no delivery date, or $886 for a Fanboi Edition that is available.


All for 2 FPS or so more performance when overclocked......


----------



## TheoneandonlyMrK (Jul 11, 2016)

D007 said:


> Whatever you say... They can "fix" it all they want.. It's still an underperforming card for today's requirements.
> Bought ATI before and was never happy with it.. Even their rendering methods look worse.
> 
> Again I say.. You got a "driver" update that got you those FPS. The software tweak they did to fix the power draw helped nothing..
> ...


You know what you read and refute; I've actually got them. There's no smoke in my PC, and if I'd lost 3% I'd have noticed, but I didn't, because compatibility mode is off by default, not on.
You be trollin', and I'm out.


----------



## Agiels (Jul 11, 2016)

TRWOV said:


> 940/1250 is very tame. I have my 7950 (basically the same GPU) running at 1150/1500 (Sapphire Vapor-X), so I don't think it's that. The PSU is fine too; not top grade, but adequate.
> 
> I'd say it's a software issue. I would try a full Windows re-install (refresh if you have W8/W10) to start clean and see if that solves it.


I did, man. I reinstalled Windows like 5 times, fresh; I never upgrade over an existing OS. Then I decided to clean and replace the thermal paste on my GPU, and now it's overheating. Do I have to wait some amount of time after doing that, like 2 hours? It's a lot hotter now, and I reapplied the thermal paste and have the same problem.


----------



## GhostRyder (Jul 11, 2016)

This thread has become quite funny, as the hypocrisy levels are off the charts... Some trolls are OK but others aren't... Give me a break. Can we just have one thread that doesn't devolve into a my-company-has-a-bigger...



D007 said:


> Whatever you say... They can "fix it all they want.. It's still an underperforming card for todays requirements.
> Bought ATI before and was never happy with it.. Even their rendering methods look worse.
> 
> Again I say.. You got a "driver" update that got you those FPS. The software tweak they did to fix the power draw, helped nothing..
> ...


I understand what you're saying, but we're just saying that compatibility mode is a very specific thing that means little in practice. Most people will just gain the 3%, and the power draw will transfer to the 6-pin.
BTW, when did you get the FTW 1080? How is it?



cdawall said:


> I don't hope any of that, as it is just mean. All I hope is that NVIDIA releases a 1080 Ti in two months that makes his card an overpriced paperweight.


If they come that fast I think I will be shocked.  I think we may see the Titan soon though lol...



theoneandonlymrk said:


> Nice looking block but it would be upside down and out of sight in mine so a bit spangly for me. And expensive.


Oh really, what case? Btw, have you tried playing with them, clocking-wise?


----------



## xkm1948 (Jul 11, 2016)

I just find it funny that I was pointing out that the 1080 has its defects and got personally threatened with a ban by a moderator, while at the same time waves of trolling fanboys went on unpunished. Ain't it lovely when the moderating team is biased? You can't even use the "report" button on them, since more than likely they are going to ban me without saying anything.


----------



## TheoneandonlyMrK (Jul 12, 2016)

Agiels said:


> I did, man. I reinstalled Windows like 5 times, fresh; I never upgrade over an existing OS. Then I decided to clean and replace the thermal paste on my GPU, and now it's overheating. Do I have to wait some amount of time after doing that, like 2 hours? It's a lot hotter now, and I reapplied the thermal paste and have the same problem.





GhostRyder said:


> This thread has become quite funny, as the hypocrisy levels are off the charts... Some trolls are OK but others aren't... Give me a break. Can we just have one thread that doesn't devolve into a my-company-has-a-bigger...
> 
> 
> I understand what you're saying, but we're just saying that compatibility mode is a very specific thing that means little in practice. Most people will just gain the 3%, and the power draw will transfer to the 6-pin.
> ...


I've got a quite heavily modded Thermaltake Kandalf with extra fans and too much rad for my blocks at the moment. It's old, but usefully a bit quiet, because it runs while I sleep. I've only managed a mild 1330 core / 2100 mem so far, but I've not had the time for a proper go yet; hopefully soon. I've a question, actually:
Does anyone know what the hell the setting in the new WattMan does below the fan controls? It says "acoustic" something and is in MHz. Is a higher setting meant to allow more noise, or less? It has me stumped.


----------



## D007 (Jul 14, 2016)

GhostRyder said:


> This thread has become quite funny, as the hypocrisy levels are off the charts... Some trolls are OK but others aren't... Give me a break. Can we just have one thread that doesn't devolve into a my-company-has-a-bigger...
> 
> 
> I understand what you're saying, but we're just saying that compatibility mode is a very specific thing that means little in practice. Most people will just gain the 3%, and the power draw will transfer to the 6-pin.
> ...



Sorry, been away for a few days.. Maybe I am misunderstanding this fix a bit.. idk..
Am I wrong? A driver update gave that 3%.. That's very normal for driver updates, and I don't see how it applies to the current issue. As it is, they really didn't fix the power draw at all, right? You can now choose to leave it as is, or go compatibility mode and lose 3%..

But the 1080 has been an absolute beast. I am sincerely impressed.. I am playing games like Witcher 3 in 3840x2160, ultra settings, with NVIDIA hair effects on...
Just finished Shadow of Mordor in 4K.
I'm kind of in shock at how well it performs, tbh. The fan is even very silent, and the temps are wow.. They never even hit 70°C, much less 80.



Steevo said:


> Apparently you didn't read the article written by W1zz, but maybe, since the words were about a brand you only care to hate on, you can't read them?
> 
> 
> But I'm glad you overpaid for an FTW 1080. I hope it works well after the fan issues have been sorted, and I hope you never have any display issues, or any black screens when watching media on multiple screens, or when hardware acceleration is in use, or just randomly.



So tired of people jumping on the fanboy wagon every time things don't go how they want. I have used ATI before, dude.. That fanboy BS you are spewing is just that: BS.. ATI screwed up, and the cards took a performance hit.. Bottom line..

Whether or not you can deal with that is your thing, and I really don't care.... Overpaid? Hey buddy, I part-own two companies. I'm allowed to pay whatever I want for whatever I want.. The only person hating here and fanboy flaming is you..

Fan issues were virtually immediately fixed, btw, and I have no idea wtf you are talking about with the display problems.. Likely people need to update their firmware, or it's a browser issue. Mine runs like a champ.. But hey, if you can't say true things, I guess you could always make stuff up... Apparently..

Regardless, for both NVIDIA and ATI, I expect they will fix it eventually.


----------



## AsRock (Jul 14, 2016)

D007 said:


> Sorry, been away for a few days.. Maybe I am misunderstanding this fix a bit.. idk..
> Am I wrong? A driver update gave that 3%.. That's very normal for driver updates, and I don't see how it applies to the current issue. As it is, they really didn't fix the power draw at all, right? You can now choose to leave it as is, or go compatibility mode and lose 3%..
> 
> But the 1080 has been an absolute beast. I am sincerely impressed.. I am playing games like Witcher 3 in 3840x2160, ultra settings, with NVIDIA hair effects on...
> ...



The 1080 should be a beast. How much was it? Ooh, never mind; yeah, over three times the cost of an RX 480, which does damn well for its price range. And what's your 1080 got to do with anything, anyway?

They fixed the power issue; yes, it gained a little to lose a little. Thinking about it, they got a boost from drivers that fixed an issue on a card that had only been out about a week, and upped the performance at the same time.


----------



## D007 (Jul 14, 2016)

AsRock said:


> The 1080 should be a beast. How much was it? Ooh, never mind; yeah, over three times the cost of an RX 480, which does damn well for its price range. And what's your 1080 got to do with anything, anyway?
> 
> They fixed the power issue; yes, it gained a little to lose a little. Thinking about it, they got a boost from drivers that fixed an issue on a card that had only been out about a week, and upped the performance at the same time.



Maybe because HE ASKED ME HOW MY 1080 WAS DOING.. So childish..
You don't read much, do you? But you loooove seeing yourself speak..

And you can keep repeating that a "driver update gave you 3%" as if it applies to the current and still-existing issue of power draw, all you want..
It's the same as it was, with an option to lose 3% via a toggle.. Am I wrong? Oh, you don't care.. You just want to rant and hop on the fanboy bandwagon.. Enjoy.. To the list you go..
Here are the fixes you were so worried about with Nvidia, btw.. VR stuff and something about DisplayPort, neither of which I experienced.
New driver.
https://www.techpowerup.com/forums/...ql-virtual-reality-game-ready-drivers.224129/


----------



## GhostRyder (Jul 14, 2016)

D007 said:


> Sorry, been away for a few days.. Maybe I am misunderstanding this fix a bit.. idk..
> Am I wrong? A driver update gave that 3%.. That's very normal for driver updates, and I don't see how it applies to the current issue. As it is, they really didn't fix the power draw at all, right? You can now choose to leave it as is, or go compatibility mode and lose 3%..
> 
> But the 1080 has been an absolute beast. I am sincerely impressed.. I am playing games like Witcher 3 in 3840x2160, ultra settings, with NVIDIA hair effects on...
> ...


Well, the thing is, the driver does three different things for the RX 480. The first is a fix for the overdraw: just by having the driver installed (without changing any settings), the card shifts some of its draw from the PCIe slot to the 6-pin connector, resulting in roughly 76 W from the PCIe slot with the rest now on the 6-pin. Overall power consumption stays the same; it is just routed differently. The second is a power limiter, which is what you're referring to: it's off by default, and when enabled it lowers the card's power consumption, for older machines or if you just want lower power draw. The third is a ~3% performance increase, which is just what it is. If you simply installed the driver, a 3% gain in games is all most users would notice.
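That rerouting can be sanity-checked with a toy calculation. The numbers below are illustrative assumptions (a flat 150 W total board power, 82 W slot draw before the fix), not W1zzard's measurements; only the ~75 W slot budget and the ~76 W post-fix slot figure echo this discussion:

```python
# Toy model of the RX 480 power rerouting: the fix moves slot draw onto
# the 6-pin connector without changing the card's total consumption.

PCIE_SLOT_BUDGET = 75.0  # watts the PCIe spec allots to slot power delivery

def route_power(total_w, slot_w):
    """Split total board power into (slot, six_pin) given the slot draw."""
    return slot_w, total_w - slot_w

# Before the fix: assumed slot draw above the 75 W budget.
slot_before, pin_before = route_power(150.0, 82.0)

# After the fix: phase rebalancing caps slot draw near the budget,
# pushing the remainder onto the 6-pin connector.
slot_after, pin_after = route_power(150.0, 76.0)

# Total draw is unchanged; only the routing differs.
assert slot_before + pin_before == slot_after + pin_after == 150.0
print(slot_after, pin_after)  # 76.0 74.0
```

The point of the sketch is just that the fix is a redistribution, not a power reduction; only the optional compatibility-mode limiter actually lowers the total.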

Have you tried some overclocking on the FTW edition? Curious what max you're able to hit on it.


----------



## D007 (Jul 14, 2016)

GhostRyder said:


> Well, the thing is, the driver does three different things for the RX 480. The first is a fix for the overdraw: just by having the driver installed (without changing any settings), the card shifts some of its draw from the PCIe slot to the 6-pin connector, resulting in roughly 76 W from the PCIe slot with the rest now on the 6-pin. Overall power consumption stays the same; it is just routed differently. The second is a power limiter, which is what you're referring to: it's off by default, and when enabled it lowers the card's power consumption, for older machines or if you just want lower power draw. The third is a ~3% performance increase, which is just what it is. If you simply installed the driver, a 3% gain in games is all most users would notice.
> 
> Have you tried some overclocking on the FTW edition? Curious what max you're able to hit on it.



Ty for giving me an informed response to an honest question.. I asked like three times, but it seemed like people just wanted to attack me..
My 1080 OCs like crap.. lol.. I can get like +50 on the core in Afterburner, but the memory is something else..
Varies by game for stability, but I have run it at up to +600 on the memory.. Idk what that equals.

So they did kind of fix it then, it would seem, since the power draw is no longer "dangerous", it sounds like.
That's what I was having a hard time understanding.. ty.
Cool, glad they worked it out then.
Still think they are misrepresenting the gain, though.. That 3% is from the driver, standard..


----------



## GhostRyder (Jul 14, 2016)

D007 said:


> Ty for giving me an informed response to an honest question.. I asked like three times, but it seemed like people just wanted to attack me..
> My 1080 OCs like crap.. lol.. I can get like +50 on the core in Afterburner, but the memory is something else..
> Varies by game for stability, but I have run it at up to +600 on the memory.. Idk what that equals.
> 
> ...


It's a normal gain, as we see from all cards once a driver matures; I agree it's a little blown up.

Really? So what does that mean it tops out at, boost-wise, 2050? I was hoping the FTW edition, which I was looking at, would hit 2150-2200.


----------



## D007 (Jul 14, 2016)

GhostRyder said:


> It's a normal gain, as we see from all cards once a driver matures; I agree it's a little blown up.
> 
> Really? So what does that mean it tops out at, boost-wise, 2050? I was hoping the FTW edition, which I was looking at, would hit 2150-2200.



Yeah, seems blown up.. As much as you can blow up 3%, which is in itself nothing.. lol
Just not a fan of being misled. It's the principle of the matter..
Like, if you rip me off for ten bucks or a hundred, I'm going to be annoyed.
Not saying they ripped people off.. lol.. Just trying to make a decent comparison.

As for the 1080 overclocking...
Just mine sucks.. lol.. I hear a lot of people hitting 2100 easy.. Mine doesn't, not fully stable anyway.. The crappy thing is they saved the good dies for the Kingpin editions..
I assumed the FTW would have them as well. The FTW is no more capable than the FE.. No idea why they think they should call it FTW if it's no better..
I honestly feel ripped off by that one thing. I paid like 75 bucks more for no reason at all. Could have gotten an ACX 3.0 with the same performance.


----------



## AsRock (Jul 14, 2016)

D007 said:


> Maybe because HE ASKED ME HOW MY 1080 WAS DOING.. So childish..
> You don't read much, do you? But you loooove seeing yourself speak..
> 
> And you can keep repeating that a "driver update gave you 3%" as if it applies to the current and still-existing issue of power draw, all you want..
> It's the same as it was, with an option to lose 3% via a toggle.. Am I wrong? Oh, you don't care.. You just want to rant and hop on the fanboy bandwagon.. Enjoy.. To the list you go..



Well, my bad for not noticing. Sure, I like replying to what people think, as much as you do by the looks of it.

I am sure we will all move on. :)



D007 said:


> Yeah, seems blown up.. *As much as you can blow up 3%, which is in itself nothing.. lol*
> Just not a fan of being misled. It's the principle of the matter..
> Like, if you rip me off for ten bucks or a hundred, I'm going to be annoyed.
> Not saying they ripped people off.. lol.. Just trying to make a decent comparison.
> ...



Gain 3% with the new drivers; use the toggle and the power usage is fixed, but you lose the 3%. Pretty much puts it all back as it should be.

Yeah, I got screwed by EVGA with something like that too.


----------



## GhostRyder (Jul 15, 2016)

D007 said:


> Yeah, seems blown up.. As much as you can blow up 3%, which is in itself nothing.. lol
> Just not a fan of being misled. It's the principle of the matter..
> Like, if you rip me off for ten bucks or a hundred, I'm going to be annoyed.
> Not saying they ripped people off.. lol.. Just trying to make a decent comparison.
> ...


Good to know, thanks for the information. I might skip it and wait to see what else becomes available (I think EVGA is at the point of releasing way too many variants of the same GPU). I was still hoping for a while that more voltage was gonna help those chips, but it looks like it's just luck of the draw within ~50 MHz for all samples. I'll wait and probably buy some plastic editions for cheap and just waterblock them, or buy another brand that bins a bit better. Or heck, I may wait for the 1080 Ti; still up in the air on that.

Yeah, I am not a big fan of being misled either; it's why I have gotten so skeptical of any purchase and no longer jump on them day one like I used to.


----------

