# Sapphire Radeon RX 580 Nitro+ Limited Edition 8 GB



## W1zzard (Apr 18, 2017)

The Sapphire RX 580 Nitro+ Limited Edition is a highly overclocked custom variant of the just-launched AMD Radeon RX 580. Performance now beats the NVIDIA GTX 1060, and in the box, Sapphire has bundled two user-replaceable semi-transparent fans with blue LEDs, if you want a little bit more bling.

*Show full review*


----------



## Nokiron (Apr 18, 2017)

That's quite the increase in power consumption...


----------



## ShurikN (Apr 18, 2017)

Consumes 100W more to achieve 5% more perf than a 1060. Guess which chip you won't see in a laptop... again.
Not to mention that it's trading blows with an equally priced custom 1060. And those cards have been out for a while.

Wanted to say I'm disappointed, but my expectations were extremely low to begin with. And boy were those expectations met.


----------



## RejZoR (Apr 18, 2017)

Would be nice if reviews also made it into the Forum headlines section. I miss a bunch of reviews because I forget to look down here, or I miss the tweet that was sent out for the review...


----------



## newtekie1 (Apr 18, 2017)

ShurikN said:


> Consumes 100W more to achieve 5% more perf than a 1060. Guess which chip you won't see in a laptop... again.



Not to mention the worse price to performance and worse overclocking.


----------



## KainXS (Apr 18, 2017)

The Max Power Delivery Limit and TDP increased over the RX 480 Nitro in the BIOS (the TDP jumped drastically, 110W --> 180W); that's probably why it consumes so much power.

If the Gaming X version has the same PCB as the 480's, then I'd probably try flashing my card to a 580 to get the reduced idle clocks and lower the idle power consumption.


----------



## darklm (Apr 18, 2017)

I didn't expect anything and I'm still disappointed

This rebranding is worse than the 300 series


----------



## sith'ari (Apr 18, 2017)

At first glance it seems like a good mid-range GPU (*and it is a good mid-range GPU).
BUT....
1) Extremely high power consumption compared to the competition (*GTX 1060) or even its predecessor (RX 480)
2) It's 5% ahead of the reference GTX 1060 on average, which means it will only be equal to the aftermarket GTX 1060s.
3) In addition to point 2), if we add the fact that NVIDIA will be launching a refreshed line of GTX 1060s with faster memory, then......
the conclusions that follow from points 1), 2), and 3) aren't very encouraging for AMD.


----------



## horsemama1956 (Apr 18, 2017)

I don't believe the power consumption is an issue with this class of card unless you're running a tiny ITX build or something. It's not 290/390 territory.

It's a nice selling point to say it's more efficient, but in reality I don't think people really care what the card draws under gaming load. People will whine about it on the internet, but it shouldn't be a deal breaker for someone who just wants to game. The price is my main issue in terms of making it competitive with the 1060; NVIDIA can easily afford to drop the price, knowing they have the 1070 and beyond basically untouchable until Vega.


----------



## Solid State Brain (Apr 18, 2017)

Where did the multi-monitor idle consumption improvements go? That was the thing I was looking forward to seeing.


----------



## ShurikN (Apr 18, 2017)

horsemama1956 said:


> I don't believe the power consumption is an issue with this class of card unless you're running a tiny ITX build or something. It's not 290/390 territory.


The power consumption is not an issue if the difference is up to 40W or so. But when you consume almost twice the power to achieve the same performance as your competitor ... you gotta ask yourself where you went wrong.


----------



## Dimi (Apr 18, 2017)

I'm just wondering how you got such low numbers in the 1070 Doom 1080p benchmark. I'm constantly hitting the frame limit with mine on Nightmare settings throughout the entire game. (Yes, I finished it, lol.)


----------



## Frick (Apr 18, 2017)

horsemama1956 said:


> I don't believe the power consumption is an issue with this class of card unless you're running a tiny ITX build or something. It's not 290/390 territory.
> 
> It's a nice selling point to say it's more efficient, but in reality I don't think people really care what the card draws under gaming load. People will whine about it on the internet, but it shouldn't be a deal breaker for someone who just wants to game. The price is my main issue in terms of making it competitive with the 1060; NVIDIA can easily afford to drop the price, knowing they have the 1070 and beyond basically untouchable until Vega.



Thing is if cards A and B perform about the same, cost about the same and make about the same kind of noise, shaving off 80W in gaming is not a small thing.
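That 80 W figure is easy to put in money terms. A rough sketch, where the 3 hours of gaming per day and the 0.25 EUR/kWh rate are illustrative assumptions of mine, not numbers from the thread:

```python
# Rough yearly cost of an extra 80 W under gaming load.
# Assumed (not from the thread): 3 h of gaming/day, 0.25 EUR per kWh.
EXTRA_WATTS = 80
HOURS_PER_DAY = 3
EUR_PER_KWH = 0.25

kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 365  # ~87.6 kWh
cost_per_year = kwh_per_year * EUR_PER_KWH               # ~21.9 EUR

print(f"{kwh_per_year:.1f} kWh/year, ~{cost_per_year:.2f} EUR/year")
```

Not a fortune on its own, but over a card's lifetime, and at higher usage or rates, it stops being a rounding error.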

So, the 4GB RX580 is the bang/buck card to get, or the 4GB RX480 if priced lower.


----------



## W1zzard (Apr 18, 2017)

Solid State Brain said:


> Where did the multi-monitor idle consumption improvements go? That was the thing I was looking forward to seeing.


second chart on the power page?



Dimi said:


> I'm just wondering how you got such low numbers on the 1070 Doom 1080p benchmark. I am constantly hitting the frame limit with mine on nightmare settings throughout the entire game. (yes i finished it lol).


play it again to find a better test scene


----------



## RejZoR (Apr 18, 2017)

It's funny how people obsess over power consumption all of a sudden. I only see it as a problem if it affects actual heat and overall noise (which is usually tied to heat). Considering this is one of the quietest cards, who really cares? And it overclocks decently; 1480MHz is quite a nice clock for Polaris. And now someone will throw a mini-ITX build in my face. How many people seriously use those compared to normal mid or full towers? They're a tiny minority no one cares about.


----------



## 80-watt Hamster (Apr 18, 2017)

ShurikN said:


> The power consumption is not an issue if the difference is up to 40W or so. But when you consume almost twice the power to achieve the same performance as your competitor ... you gotta ask yourself where you went wrong.



Design choices.  AMD has kept the compute resources intact over GCN revisions, while Nvidia was able to dial some of it back, along with the memory subsystem, when they introduced the new compression algorithms with Maxwell.  If you go back and look at Kepler and earlier, power vs. performance was darn close from both companies.  AMD hasn't found an equivalent "magic bullet" to Nvidia's delta compression, so they're stuck with primarily process gains to this point.


----------



## Basard (Apr 18, 2017)

It's an overclocked card so the power usage looks higher than it should.  
Compare it to a higher than reference clocked 480 and the consumption will look more in line.


----------



## Nokiron (Apr 18, 2017)

80-watt Hamster said:


> Design choices.  AMD has kept the compute resources intact over GCN revisions, while Nvidia was able to dial some of it back, along with the memory subsystem, when they introduced the new compression algorithms with Maxwell.  If you go back and look at Kepler and earlier, power vs. performance was darn close from both companies.  AMD hasn't found an equivalent "magic bullet" to Nvidia's delta compression, so they're stuck with primarily process gains to this point.


The magic bullet is tile-based rasterization; the compression doesn't matter that much in comparison, especially not with regard to efficiency.



RejZoR said:


> It's funny how people obsess over power consumption all of a sudden. I only see it as a problem if it affects actual heat and overall noise (which is usually tied to heat). Considering this is one of the quietest cards, who really cares? And it overclocks decently; 1480MHz is quite a nice clock for Polaris. And now someone will throw a mini-ITX build in my face. How many people seriously use those compared to normal mid or full towers? They're a tiny minority no one cares about.


I think the point really is that these results are worse than when Hawaii got a refresh.

That increased power consumption by quite a bit as well, but the price dropped by like 20 percent.


----------



## Solid State Brain (Apr 18, 2017)

W1zzard said:


> second chart on the power page?



I did check out the graph; unfortunately, I saw no improvement compared to the RX 480, although apparently that was supposed to be one of the points of the improved process.


----------



## OneCool (Apr 18, 2017)

Wow. It's even worse than I expected.


----------



## NdMk2o1o (Apr 18, 2017)

Guess I might grab a 480 when they drop in price. I was looking forward to the 580, but not so much now....


----------



## W1zzard (Apr 18, 2017)

Solid State Brain said:


> I did check out the graph; unfortunately I saw no improvements compared to the RX480 although apparently it was supposed to be one of the points of the improved process.


Yeah, AMD promised but didn't deliver. It's not exactly an effect of the new process; they supposedly added intermediary power states that should be active in multi-monitor, for example.

If you look at the table at the end of the temps page, you'll see that the card still runs full memory clocks in multi-monitor. Compare that to Blu-ray playback, where the clocks are lower and look like they reflect what AMD intended to do.


----------



## kruk (Apr 18, 2017)

The power draw of this card is high, but it should be no surprise, as it's clocked much higher than Polaris 10's optimum:

Reference had 0.980-1.070 V; this one has 1.125 V. It surprises me, however, that they only increased the GPU clocks but kept the RAM speed at the same level. I thought RAM speed was the bottleneck for Polaris 10...


----------



## 80-watt Hamster (Apr 18, 2017)

Nokiron said:


> The magic bullet is the tile-based rasterization, the compression does not matter that much in comparsion. Especially not in regards to efficiency.



Thanks for the correction.  I had a feeling I'd gotten that wrong.


----------



## etayorius (Apr 18, 2017)

What a useless rebrand; just grab an aftermarket RX 480 and OC the hell out of it. What was AMD thinking? Power consumption actually went up, blegh!

If they had dropped the price to $220 max, it would have been something; they could also have pushed the naming scheme one notch down and sold it as the RX 570. It just makes absolutely no sense.


----------



## madness777 (Apr 18, 2017)

Ok guys, I see a lot of power consumption comments here. Well, it seems you forgot that the RX 4xx series had a serious TDP problem with the power balance between the single 6-pin and the PCI-E slot.
I can still hear people losing their minds when they saw the PCI-E slot power go over the 75W limit. Then we saw some new BIOS releases for the RX 4xx.
Now you're looking at the ACTUAL power consumption the chip should have had when the RX 4xx series was released. They fixed the mistakes they made previously.
I am disappointed to see the price increase, however; I was really hoping they would keep the price and stay relevant.


----------



## tvamos (Apr 18, 2017)

Well, not much more power hungry compared to the Nitro+ 480.


----------



## rtwjunkie (Apr 18, 2017)

Slightly useless rebrand, BUT... for someone looking to upgrade a secondary rig, like me, who didn't get a 480, it might be worth getting the 580, since it consistently gets about 6 FPS more on average than the 480.


----------



## notb (Apr 18, 2017)

RejZoR said:


> It's funny how people obsess over power consumption all of a sudden. I only see that as a problem if that affects actual heat and overall noise (which is usually connected with heat). Considering this card is one of the quietest cards, who really cares? And it overclocks kinda decent. I mean, 1480MHz is quite nice clock for Polaris. And now someone will throw an miniATX or ITX build in my face. How many people seriously use those compared to normal id or high towers? They are tiny minority no one cares about.



Sure, let's forget about ITX builds (and AIO). But if I open some Ryzen-related threads, I think I should be able to find your posts praising Ryzen's 95W TDP compared to Intel HEDT 140W. Wanna bet?

And how many people use something as silly and outdated as a desktop PC, when you can get a notebook with mobile 1060 that's almost as fast as the desktop one? And keep in mind AMD makes some very efficient Radeon Pro GPUs for Apple MacBooks. Why do they even bother with this RX rebrand? Or is Radeon Pro designed by Apple and exclusive?


----------



## Manu_PT (Apr 18, 2017)

RejZoR said:


> It's funny how people obsess over power consumption all of a sudden. I only see that as a problem if that affects actual heat and overall noise (which is usually connected with heat). Considering this card is one of the quietest cards, who really cares? And it overclocks kinda decent. I mean, 1480MHz is quite nice clock for Polaris. And now someone will throw an miniATX or ITX build in my face. How many people seriously use those compared to normal id or high towers? They are tiny minority no one cares about.



I do use mATX, and mATX users are not a minority at all. Still, temps in mATX are not much worse than in ATX, depending on the case. Power consumption is always an important factor. Maybe not where you live, but electricity is expensive in some EU countries. I always want the best power consumption for the same price/performance. 180W in 2017 for a 1080p card is too much.


----------



## Manu_PT (Apr 18, 2017)

notb said:


> Sure, let's forget about ITX builds (and AIO). But if I open some Ryzen-related threads, I think I should be able to find your posts praising Ryzen's 95W TDP compared to Intel HEDT 140W. Wanna bet?
> 
> And how many people use something as silly and outdated as a desktop PC, when you can get a notebook with mobile 1060 that's almost as fast as the desktop one? And keep in mind AMD makes some very efficient Radeon Pro GPUs for Apple MacBooks. Why do they even bother with this RX rebrand? Or is Radeon Pro designed by Apple and exclusive?


 
He will always defend everything AMD-related.


----------



## Nihilus (Apr 18, 2017)

Manu_PT said:


> I do use mATX, and mATX users are not a minority at all. Still, temps in mATX are not much worse than in ATX, depending on the case. Power consumption is always an important factor. Maybe not where you live, but electricity is expensive in some EU countries. I always want the best power consumption for the same price/performance. 180W in 2017 for a 1080p card is too much.



Yep, mATX and ITX builder as well. I wouldn't consider non-CrossFire users the minority, so what's the point of getting an ATX board?? Electricity is plenty expensive here in Japan as well. Power consumption doesn't matter in RejZoR's world, since his parents pay the utilities.


----------



## Nima (Apr 18, 2017)

What a disappointing GPU: worse performance/watt than 2+ year old Maxwell cards, worse performance/dollar compared to the 1060, the competition's year-old GPU, and almost no overclocking headroom. I think this card is only aimed at die-hard AMD fanboys.


----------



## rtwjunkie (Apr 18, 2017)

notb said:


> And how many people use something as silly and outdated as a desktop PC


You're in the wrong forum for that statement, bud. Anyone who wants a powerful, cool system with expandability and the ability to play games as they were designed will build and use "outdated" desktops.

Additionally there are professional environments that need only what a desktop can provide.


----------



## efikkan (Apr 18, 2017)

It's a refresh, that's fine. But the efficiency is still not improving.


----------



## Dimi (Apr 18, 2017)

The problem is that in my case, power consumption is beginning to weigh heavily on GPU purchases. Two years ago, my monthly electricity bill was 152 euro. It jumped to 187 last year and is already at 202 per month now, WITH a usage decrease of around 5% over the past 2 years: a 40.6% increase in electricity cost. They're already talking about raising prices even more next year.
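A quick sketch of the arithmetic behind those figures, rounding the usage drop to exactly 5% (my simplification of the numbers above):

```python
# Bill went from 152 to 202 EUR/month while usage fell ~5%.
old_bill, new_bill = 152.0, 202.0
usage_factor = 0.95  # ~5% less consumption

bill_increase = new_bill / old_bill - 1                      # raw bill change
per_kwh_increase = (new_bill / old_bill) / usage_factor - 1  # implied price change

print(f"bill: +{bill_increase:.1%}, price per kWh: +{per_kwh_increase:.1%}")
```

With these rounded inputs the implied per-kWh price increase comes out near 40%, in line with the ~40.6% quoted above.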


----------



## TheGuruStud (Apr 18, 2017)

ShurikN said:


> Consumes 100W more to achieve 5% more perf than a 1060. Guess which chip you won't see in a laptop... again.
> Not to mention that it's trading blows with an equally priced custom 1060. And those cards have been out for a while.
> 
> Wanted to say I'm disappointed, but my expectations were extremely low to begin with. And boy were those expectations met.



Nope, it's being overvolted. Undervolt and run at laptop freqs and you're gonna shave off more than 50% (probably 75%).
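As a sanity check on that claim, the usual first-order model for dynamic power is P ∝ f·V². The stock point below uses the ~1440 MHz and 1.125 V figures mentioned elsewhere in the thread; the 900 MHz / 0.85 V undervolt target is a hypothetical laptop-class operating point of my choosing, not a measured value:

```python
# First-order dynamic power model: P ~ f * V^2 (ignores static/leakage power,
# so real-world savings would be somewhat smaller than this estimate).
def relative_power(f_mhz: float, v: float,
                   f0_mhz: float = 1440.0, v0: float = 1.125) -> float:
    """Dynamic power relative to the stock operating point (f0, v0)."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

# Hypothetical laptop-style point: 900 MHz at 0.85 V.
p = relative_power(900.0, 0.85)
print(f"relative dynamic power: {p:.2f} (~{1 - p:.0%} saved)")
```

With these assumed numbers the model lands in the "more than 50%" range suggested above, though leakage and board overhead mean the real saving would be smaller.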


----------



## Yukikaze (Apr 18, 2017)

Performance wise, it is exactly what was promised. The power consumption, however, is nuts. I actually think my R9 Fury Nitro has a better power/performance ratio at this point, and that is saying something.


----------



## GhostRyder (Apr 18, 2017)

I don't think it's as bad as some people are saying, but it's definitely far from spectacular. Honestly, I had hoped for more of an improvement in clocking potential, but I guess that's a pipe dream for now.

Well, either way, I guess it's up to Vega to do something...


----------



## ShurikN (Apr 18, 2017)

TheGuruStud said:


> Nope, it's being overvolted. Undervolt and run at laptop freqs and you're gonna shave off more than 50% (probably 75%).


Mhm, the market is currently brimming with those undervolted 480s and 470s.


----------



## efikkan (Apr 18, 2017)

The TDP of 185W is really bad news.

But at least it should put to rest the myth of 80W RX 480s…


----------



## Countryside (Apr 18, 2017)

Driver 16.3.3? A little bit outdated, wouldn't you say?


----------



## Captain_Tom (Apr 18, 2017)

For the love of God, when will you guys stop using BF3 for overclocking?!

At least upgrade to BF1 or DE:MD; hell, even Metro: Last Light is newer.



Otherwise great review.  I will be interested to see if someone attempts to make an efficient AIB card, but most likely everyone will just chase higher clocks with tons of power connectors....


----------



## intelzen (Apr 18, 2017)

Why is there no CON like "it is an RX 480" or "we tested this card a year ago"?


----------



## HD64G (Apr 18, 2017)

newtekie1 said:


> Not to mention the worse price to performance and worse overclocking.


Worse overclocking just because @W1zzard didn't use TriXX or Afterburner, you mean...



sith'ari said:


> From a 1st glimpse it seems like a good mid-range GPU (*and it is a good mid-range GPU).
> BUT....
> 1) Extremely high power consumption compared to the competition (*GTX1060) or even its predecessor (RX480)
> 2) It's 5% ahead of the reference GTX 1060 on average, which means it will only be equal to the aftermarket GTX 1060s.
> ...


1) This specific one is a custom OCed and overvolted product. Even a 1080, when OCed and overvolted, consumes 80-100W more...
2) For the same money as before, stock vs. stock, the 580 is now equal to the 1060 6GB, and usually cheaper...
3) It remains to be seen how much the 1060 with faster memory will gain...



etayorius said:


> What a useless rebrand; just grab an aftermarket RX 480 and OC the hell out of it. What was AMD thinking? Power consumption actually went up, blegh!
> 
> If they had dropped the price to $220 max, it would have been something; they could also have pushed the naming scheme one notch down and sold it as the RX 570. It just makes absolutely no sense.

The stock 580 isn't consuming more than the stock 480; you're looking at a custom OCed model here....


madness777 said:


> Ok guys, I see a lot of Power Consumption comments here. Well it seems you forgot that the RX4xx series had a serious TDP problem with the single 6-pin and PCI-E slot power balance.
> I can still hear the people losing their mind when they saw the PCI-E power go over the limit of 75W. Then we saw some new BIOS releases for the RX4xx.
> Now you see, you're looking at the ACTUAL power consumption that the chip should have had when the RX4xx series were released. They fixed the mistakes they made previously.
> I am disappointed to see the price increase however, I was really hoping they would keep the price and stay relevant.



$230 for the stock 580 is the official price, with 4% more FPS on average. A refresh with benefits for customers and no real cons, imho.



Yukikaze said:


> Performance wise, it is exactly what was promised. The power consumption, however, is nuts. I actually think my R9 Fury Nitro has a better power/performance ratio at this point, and that is saying something.



Overvolt your Fury Nitro and report back with your power consumption numbers. I can't get why most people overlook that this model is the highest-clocked one. Wait for the stock model, or for ones with a moderate OC, to see normal power consumption numbers.


----------



## xkm1948 (Apr 18, 2017)

Wow, one negative thing! Let us ALL ATTACK that single point over and over and over and over. Internet mob mentality.


Meanwhile, the author of the review gave it a 9.2/10 along with an Editor's Choice award.

At the same time, this card delivers some good gains in DX12/Vulkan.

@W1zzard, is it possible for you to separate DX11/OpenGL gaming performance summary from DX12/Vulkan gaming performance summary?


----------



## yotano211 (Apr 18, 2017)

notb said:


> Sure, let's forget about ITX builds (and AIO). But if I open some Ryzen-related threads, I think I should be able to find your posts praising Ryzen's 95W TDP compared to Intel HEDT 140W. Wanna bet?
> 
> And how many people use something as silly and outdated as a desktop PC, when you can get a notebook with mobile 1060 that's almost as fast as the desktop one? And keep in mind AMD makes some very efficient Radeon Pro GPUs for Apple MacBooks. Why do they even bother with this RX rebrand? Or is Radeon Pro designed by Apple and exclusive?


I would use a silly/outdated machine like those desktop PCs, but I've run laptop-only for the past 9 years. My lifestyle means I can only use a laptop, since I'm away from home a lot of the time. If I were home more often, I would build myself a nice fancy "outdated desktop PC".

These comments don't belong here; if you want desktop vs. laptop battles, take them somewhere else. There will always be a need for both laptops and desktops; it's up to the buyer and their own lifestyle.


----------



## xkm1948 (Apr 18, 2017)

I am seriously disgusted by some of the posts here. Rebrands are boring, we don't like rebrands; yet when Intel/Nvidia does rebrands, I don't see such an outcry. Such double standards and selective blindness.

Read the review with your own goddamn judgment. Read the DX12/Vulkan portion separately from the DX11 portion and you will see some good gains.


----------



## yotano211 (Apr 18, 2017)

I was expecting the power draw to come down somewhat, and maybe an underclocked version that could fit in a laptop. Nvidia is the only player at the mid-to-top level in laptop GPUs.


----------



## Nokiron (Apr 18, 2017)

xkm1948 said:


> I am seriously disgusted by some of the posts here.  Rebrand is boring, We don't like rebrands. Yet when Intel/Nvidia does rebrands I don't see such outcry. Such double standards and selective blindness.


Really now? Show me one Nvidia rebrand post where the outcry is missing.

And AFAIK, Intel has no rebrands across their models. A die shrink is a heck of a lot more than a simple rebrand.


----------



## rtwjunkie (Apr 18, 2017)

xkm1948 said:


> I am seriously disgusted by some of the posts here.  Rebrand is boring, We don't like rebrands. Yet when Intel/Nvidia does rebrands I don't see such outcry. Such double standards and selective blindness.
> 
> Read the review with your own god damn judgements. Read the DX12/Vulkan portion separately from DX11 portion and it will see some good gains.


Getting angry over people's opinions? Fanboy much? There was plenty of that anti-rebrand hate last time NVIDIA did it. I don't really care who does it. Your reaction, though, is over the top and way too personally butt-hurt.

That being said, people are forgetting that if you want to compare the REAL power consumption increase, you should compare this Nitro+ to the top-model Nitro+ RX 480. That is the real comparison.


----------



## birdie (Apr 18, 2017)

Score 9.2? Editor's Choice?

Really? For almost zero improvement? For the same power inefficiency? For the same bugs, like the extreme dual-monitor idle power consumption? The most upsetting rebrand in the history of GPUs.

I'm shaking my head in disbelief.


----------



## bug (Apr 18, 2017)

sith'ari said:


> From a 1st glimpse it seems like a good mid-range GPU (*and it is a good mid-range GPU).
> BUT....
> 1) *Extremely high power consumption compared to the competition (*GTX1060) or even its predecessor (RX480)*
> 2) It's 5% ahead of the reference GTX 1060 on average, which means it will only be equal to the aftermarket GTX 1060s.
> ...


It burns through the same power as a 1080, but without outputting as many FPS. If that's not disappointing, nothing is.


----------



## Yukikaze (Apr 18, 2017)

HD64G said:


> Overvolt your Fury Nitro and report back with your power consumption numbers. I can't get why most people overlook that this model is the highest-clocked one. Wait for the stock model, or for ones with a moderate OC, to see normal power consumption numbers.



I was commenting on the specific model, not every RX 580 in the world. And why would I overvolt my Fury? It beats this Nitro 580 at stock volts, which is the point I was making. This specific card has horrible power consumption, and it is very hard to argue otherwise.


----------



## ShurikN (Apr 18, 2017)

AMD should have released this card as a 570 (move the entire Polaris lineup one step down), and give the consumers a 580 with 40CU like the one in Xbox Scorpio.


----------



## birdie (Apr 18, 2017)

ShurikN said:


> AMD should have released this card as a 570 (move the entire Polaris lineup one step down), and give the consumers a 580 with 40CU like the one in Xbox Scorpio.



475 and 485 would have filled the bill better. At least when NVIDIA does rebrands like 6xx to 7xx we get some very tangible benefits:


> The new card delivers an impressive 20% performance improvement over the GTX 660 but is also priced $50 higher



480 to 580? The difference is not really visible to the naked eye.


----------



## DarkOCean (Apr 18, 2017)

OMG, these are so bad. Just look at that power consumption and perf/watt: ~GTX 1060 performance with ~GTX 1080 Ti power consumption, wtf?! People have been undervolting Polaris to save some power, but no, AMD went ahead and overvolted and overclocked these even further, calling it a new product at the same prices as a year ago. What could go wrong? LOL, that perf/watt would be bad even for 28nm.


----------



## HD64G (Apr 18, 2017)

Yukikaze said:


> I was commenting on the specific model, not every RX580 in the world. And why would I overvolt my Fury? It beats this Nitro 580 at stock volts, which is the point I was making there. This specific card has a horrible power consumption, and it is very hard to argue otherwise.


But it still consumes less than yours, methinks. And loses by less than 10%, eh? And it's much cheaper too?


----------



## medi01 (Apr 18, 2017)

Regarding multi-monitor power consumption: as seen on AnandTech, with two monitors running at the same resolution it consumes way less power than the 480.
There is no difference when the monitors have different resolutions.

(TPU tests with different resolutions.)


----------



## HD64G (Apr 18, 2017)

yotano211 said:


> I was expecting the power draw to come down somewhat, and maybe an underclocked version that could fit in a laptop. Nvidia is the only player at the mid-to-top level in laptop GPUs.


This same chip powers the PS4 Pro; it just runs at 800MHz, where it excels at efficiency. So this could be done for mobile too, with 900-1000MHz clocks and very low power consumption.


----------



## bug (Apr 18, 2017)

HD64G said:


> This same chip powers the PS4 pro. It just runs at 800MHz where it excels at efficiency. So, this could be done for mobiles too with 900-1000 clocks and very low power comsumption.


Yes. And under the right circumstances, pigs could fly.


----------



## medi01 (Apr 18, 2017)

Solid State Brain said:


> I did check out the graph; unfortunately I saw no improvements compared to the RX480 although apparently it was supposed to be one of the points of the improved process.



AFAIR an undervolted 580 consumes less than an undervolted 480, but I can't find the source any more... =/
(They significantly reduced Blu-ray/video power consumption, as well as multi-monitor when both monitors are at the same resolution.)




DarkOCean said:


> OMG these are so bad at power consumption


FTFY
18% perf difference between these and 980Ti (beating it in Doom), that's hardly bad.



ShurikN said:


> AMD should have released this card as a 570 (move the entire Polaris lineup one step down), and give the consumers a 580 with 40CU like the one in Xbox Scorpio.


Puzzling why they didn't.
The 570 looks better, although +50W isn't that great either, but it does beat the 3GB 1060.


----------



## Yukikaze (Apr 18, 2017)

HD64G said:


> But it still consumes less than yours me thinks. And loses by less than 10% me thinks eh? And it's much cheaper also?



The R9 Fury consumes about the same and is indeed a bit more powerful. As for the price, the Fury is no longer on sale, as stocks finally ran out. It sold for $220-240 after rebate for the past few months, and I doubt the RX 580 Nitro will be that cheap for a long time.

But that isn't really relevant to the point I was making, either. The point is that a 14nm card that manages to consume about as much power as a last-gen 28nm card based on an absolutely massive chip is just... weird. The same point can be made in relation to a GTX 980 Ti, not just the R9 Fury; I just own a Fury, so I referenced that.

Don't get me wrong, the RX 580 isn't bad, just as the RX 480 isn't bad. It has its niche and price range where it fits pretty well, and DX12 performance is good. I just find the power consumption ridiculous, especially for the performance. Whether that matters to people is up to them.


----------



## 80-watt Hamster (Apr 18, 2017)

birdie said:


> 475 and 485 would have filled the bill better. At least when NVIDIA does rebrands like 6xx to 7xx we get some very tangible benefits:



Agreed re: 475 and 485. But Nvidia is far from blameless in the rebrand arena: the 8800 GT was transformed into five different products, eventually becoming the GTS 250 (albeit with a die shrink along the way).


----------



## HD64G (Apr 18, 2017)

bug said:


> Yes. And under the right circumstances, pigs could fly.


Nice joke there


----------



## Meraj (Apr 18, 2017)

AC Syndicate at 86 frames? Everything maxed?
Thanks for the review!


----------



## dat_boi (Apr 18, 2017)

AMD is making it harder and harder for their shill army to sell their products. Fanaticizing consumers is probably more profitable than making good products, and they don't even have to pay off the reviewers, 'cause they're in the same boat.


----------



## newtekie1 (Apr 18, 2017)

medi01 said:


> FTFY
> 18% perf difference between these and 980Ti (beating it in Doom), that's hardly bad.



When did it beat the 980 Ti in Doom? The tests here don't show the RX 580 beating it in Doom, or even coming close.


----------



## Solid State Brain (Apr 18, 2017)

medi01 said:


> Regarding multi monitor power consumption, as seen on anand, with two monitors running at the same resolution it consumes way less power, than 480.
> There is no difference when monitors have different resolutions.
> 
> (TPU tests with different)


I think a lot of people use a secondary monitor with a lower resolution than the primary one, perhaps the model they were previously using as their primary.


----------



## illli (Apr 18, 2017)

I'm always rooting for the underdog, but this rebrand AMD card is completely underwhelming


----------



## Kissamies (Apr 18, 2017)

Better than I expected; what AMD said about improvements over the RX 400 series wasn't the typical PR BS. Some people care about power consumption, but for anyone like me who doesn't give the littlest crap about it, this looks good.

Also, packing replacement fan(s) is something other AIBs should do too.


----------



## Shatun_Bear (Apr 19, 2017)

The usual suspects going OTT with their criticism of this AMD product.

It doesn't take a genius to work out that the Polaris arch was designed as a mobile part first and foremost, where it excels at lower frequencies of 800-1000 MHz. As soon as you push the frequencies way above that, efficiency nose-dives. Operating at 1440 MHz is way past the optimum for Polaris, hence the comparatively large power consumption.

Still, that doesn't change the fact that these are the fastest cards in their price bracket if you predominantly play new games. They are also decisively faster than anything in the same price bracket in DX12/Vulkan, which is a more important point than is appreciated by many.

Also, Vega is coming in a couple of months, let's not act like these are all AMD have got this year. Vega is going to encompass more than just the top-end Fury/Fury X replacements with Vega 11.


----------



## EarthDog (Apr 19, 2017)

Shatun_Bear said:


> The usual suspects going OTT with their criticism of this AMD product.
> 
> It doesn't take a genius to work out that the Polaris arch was designed as a mobile part first and foremost, where it excels at lower frequencies from 800-1000 Mhz. As soon as you push the frequencies way above that, efficiency nose-dives. Operating at 1440Mhz is way past the optimum for Polaris, hence the comparatively large power consumption.
> 
> ...


With you all the way outside of the BOLD part... only time will tell there, and it's bad news looking at history (Mantle).



rtwjunkie said:


> Getting angry over people's opinions? Fanboy much?  There was plenty of that anti-rebrand hate last time NVIDIA did it.  I don't really care who does it.  Your reaction though, is over the top and way too personally reflective of being butt-hurt.
> 
> That being said, people are forgettting that if you want to compare the REAL power consumption increase, compare this Nitro+ to the top model Nitro+ model RX 480.  That is the real comparison.


+1  @rtwjunkie 

The key here... 'the last time'. Which was what, the 770 or something, three generations ago? AMD does this religiously; that's the problem. It's no different from an Intel tick-tock, except without the IPC increase, nothing but a clock-speed increase.


----------



## oxidized (Apr 19, 2017)

I'm not so sure anymore; at these prices I could get a 1060 6GB, overclock the sh** out of it (since it can), and eat both the 480 and the 580 alive.


----------



## lilunxm12 (Apr 19, 2017)

Dimi said:


> The problem is that in my case, power consumption is beginning to weigh in heavily on the purchase of gpu's. 2 years ago, monthly electricity bill was 152 euro. Jumped up to 187 last year and at the moment it is already at 202 per month WITH a usage decrease of around 5% over the past 2 years. 40.6% increase in electricity cost. They are already talking about rising prices even more next year.


Unless you're using your card 24x7 at relatively high load, I doubt it matters. There're probably other factors increasing your power bill more significantly.


----------



## TheMailMan78 (Apr 19, 2017)

Well, I may get a 580 to replace my 780 Ti. Honestly, people crying about power consumption are the same people who cry about MPG in muscle cars. Dumb asses.

Looking at raw performance/price, these 580s ain't too bad. Looking forward to seeing Vega.


----------



## EarthDog (Apr 19, 2017)

lilunxm12 said:


> Unless you're using your card 24x7 at relatively high load, I doubt it matters. There're probably other factors increasing your power bill more significantly.


Spot on. I'd do some math on that, as a GPU or two would need to be working 24/7 to increase your bill that much. It's a couple bucks a month if you game a few hours a day.


----------



## TheMailMan78 (Apr 19, 2017)

EarthDog said:


> Spot on. I'd do some math on that as a gpu or two wouldnt need to be working 24/7 to increase you bill that much. A couple bucks a month if you game a few hours a day.


You spend more making toast than a 580 will cost you in gaming. Unless of course you are crunching or something.


----------



## newtekie1 (Apr 19, 2017)

lilunxm12 said:


> Unless you're using your card 24x7 at relatively high load, I doubt it matters. There're probably other factors increasing your power bill more significantly.



Let's say you average 2.5 hours of gaming a day.  At ~100W higher power consumption, that amounts to about $16.50 extra per year (at $0.15 per kWh).  Not an amount over the course of a year that you should be worried about.

On the flip side, remember how the GTX 480 was pretty universally bashed for its high power consumption compared to the competition?  Well, it also consumed about 100W more than its competition at the time, and it actually consumed less power than this RX 580.  Something to think about.
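For anyone who wants to check that arithmetic with their own usage and electricity rate, here is a minimal sketch. The function name and the exact figure printed are mine; note that at exactly $0.15/kWh the result comes out a bit under the $16.50 quoted above, so the quoted number likely assumes a slightly higher rate or more hours.

```python
def annual_gpu_cost(extra_watts, hours_per_day, usd_per_kwh):
    """Extra yearly electricity cost of a card drawing `extra_watts` more."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# 100 W extra, 2.5 h/day of gaming, $0.15/kWh
print(round(annual_gpu_cost(100, 2.5, 0.15), 2))  # → 13.69
```

Plug in your local rate; at European prices of ~$0.25/kWh the gap roughly doubles but is still a rounding error next to a monthly bill.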


----------



## Assimilator (Apr 19, 2017)

9.2? Seriously? For a card that, in all honesty, blows goats? This isn't a refresh, it's AMD taking the piss.

Vega had better be the holy lord and saviour that AMD and its fanboys are hoping for, because if the rumours are true and Volta is going to arrive in Q3, all the Pascal cards are going to drop in price, which means we could see GTX 1060 6GB for lower than $200. Which would, justifiably, s**t all over the RX "500" series.



Shatun_Bear said:


> It doesn't take a genius to work out that the Polaris arch was designed as a mobile part first and foremost, where it excels at lower frequencies from 800-1000 Mhz. As soon as you push the frequencies way above that, efficiency nose-dives. Operating at 1440Mhz is way past the optimum for Polaris, hence the comparatively large power consumption.



Your "argument" is nonsensical. If Polaris was designed for mobile parts, then why is AMD putting it in desktop graphics cards? And where are the Polaris-powered laptops? The truth is that the Samsung/GloFo 14nm process is a POS that doesn't scale to decent clocks, and no number of claims about "improvements" can change that. Lipstick on a pig.


----------



## ShurikN (Apr 19, 2017)

Assimilator said:


> And where are the Polaris-powered laptops? The truth is that the Samsung/GloFo 14nm process is a POS that doesn't scale to decent clocks, and no number of claims about "improvements" can change that. Lipstick on a pig.


There's only one Polaris laptop I've seen, and that's the Alienware 17 with an RX 470, which costs as much as the 1060 6GB variant. Not taking the 460 into consideration since it's a low-power, low-performance part.


----------



## RejZoR (Apr 19, 2017)

notb said:


> Sure, let's forget about ITX builds (and AIO). But if I open some Ryzen-related threads, I think I should be able to find your posts praising Ryzen's 95W TDP compared to Intel HEDT 140W. Wanna bet?
> 
> And how many people use something as silly and outdated as a desktop PC, when you can get a notebook with mobile 1060 that's almost as fast as the desktop one? And keep in mind AMD makes some very efficient Radeon Pro GPUs for Apple MacBooks. Why do they even bother with this RX rebrand? Or is Radeon Pro designed by Apple and exclusive?



Yeah, I praised AMD's Ryzen's TDP because their last gen was utter crap compared to this one. It was entirely incomparable. They went from 300W monstrosities that couldn't beat even Core i5's to Ryzen that puts even Intel to shame most of the time and considering the R&D differences between both, you'd expect Intel never to get into that position. Ever.



Manu_PT said:


> I do use matx and matx users are not a minority at all. Still the temps on matx are not much worse than atx depending on the case. Power consumprion is always an important factor. Maybe not where you live, but electricity is expensive in some eu countries. I always want the best power consumption product for the same price/perfomance. 180w in 2017 for a 1080p card is too much



Sorry, but tiny-case users are a total minority. I've had a micro-ATX build, and in all the years I had it, all I saw was tons of big cases and maybe five people who had ITX builds as HTPCs.



Manu_PT said:


> He will always defend everything amd related



I guess that's the reason why I have, wait for it... an Intel CPU and an NVIDIA GPU... right? Because I'm an AMD fanboy. Dude, your logic just fell flat on its face from a 300m tower... The reason I'm waiting for RX Vega is that the GTX 1080 is just meh. Sure, it's fast and efficient, but it's just so damn boring now. It doesn't offer any new software features (Ansel is a totally useless thing imo), it doesn't offer some radical new rendering tech; it's just more of the same, but faster. Believe it or not, I also buy hardware for the novelty thrill. It's also the reason I kinda regret not going with the R9 Fury instead of the GTX 980. Not that there is anything wrong with the GeForce, but it has the same problem as the GTX 1080: it's the same old same old, just faster. The R9 Fury had HBM, which was something new and exciting, as it had never been used before. MFAA was a nice addition back then on the GTX, granted, but you can't use normal FSAA in any game anyway, so it's kinda irrelevant. If they could replace the FXAA in the NV CP with SMAA, that would get my excitement going again. Or other stuff like that.


----------



## Assimilator (Apr 19, 2017)

ShurikN said:


> There's only one Polaris laptop i saw and that's the Alienware17 with an RX470 which costs as much as a 1060 6GB variant. Not taking 460 into consideration since it's a low power, low performance part.



Exactly. Polaris at low clocks isn't performance-competitive with Pascal, Polaris at high clocks isn't power-competitive with Pascal. So in terms of mobile there are exactly zero reasons to use a Polaris chip when Pascal is superior in every facet. If Polaris was really designed as a mobile chip, then AMD f**king failed miserably, because their competitor's desktop chip is better than their mobile one.


----------



## jchambers2586 (Apr 19, 2017)

Still no competition for the high-end market. I can get a used 980 Ti for an extra $100:

http://www.ebay.com/sch/i.html?_from=R40&_sacat=0&LH_BIN=1&_nkw=GTX+980ti&_sop=15


----------



## 0x4452 (Apr 19, 2017)

It is more powerful than NVIDIA's best! 





----------



## lilunxm12 (Apr 19, 2017)

newtekie1 said:


> Lets say you average 2.5 hours of gaming a day.  At ~100w higher power consumption, that amounts to about $16.50 extra per year(at $0.15 per KWH).  Not an amount over the course of a year that you should be worried about.
> 
> On the flip side, remember how the GTX480 was pretty universally bashed for its high power consumption compared to the competition?  Well, it also consumed about 100w more than its competition at the time, and actually consumed less power than this RX 580.  Something to think about.


I think with the GTX 480 it was the heat that concerned people more, same as with all the GF110, Tahiti, and Hawaii chips. If heat and noise were fine, I wouldn't care about a 200W power difference.


----------



## Komshija (Apr 19, 2017)

The problem is that AMD currently cannot compete with Nvidia's high-end GPUs. The RX 480 is comparable to the GTX 1060, while the RX 580 turns out to be just slightly faster than the GTX 1060. That's a marginal increase in performance, and I hope that these cards will not be more expensive than the RX 480 was at the time of release. 
The GTX 1070 still has no competition from AMD, the GTX 1080 is blown to pieces by the more expensive (relevant factor) and much more power hungry (irrelevant factor) Radeon Pro Duo, while the GTX 1080 Ti will hold the crown throughout 2017.

I would still buy an AMD card, not because of some fanboyism, but because I can always get more from AMD for the given price.


----------



## notb (Apr 19, 2017)

RejZoR said:


> Yeah, I praised AMD's Ryzen's TDP because their last gen was utter crap compared to this one. It was entirely incomparable. They went from 300W monstrosities that couldn't beat even Core i5's to Ryzen that puts even Intel to shame most of the time and considering the R&D differences between both, you'd expect Intel never to get into that position. Ever.



So in your opinion it's fine to praise Ryzen for better power efficiency than its predecessor (even though not many people will choose between them), but it's wrong to criticize Polaris for drawing 2x what the similarly performing 1060 needs, even though these cards are direct competitors at the moment. This is... interesting. 

By sheer coincidence, the power difference is similar in both cases. Zen shaved off 100W compared to Bulldozer, but the RX 580 needs 100W more than the GTX 1060.
And did I mention that many people will go for a dual (or more) GPU setup? I'd rather have a 200W CPU and a 100W GPU than the opposite.

Also... 300W? The FX-9590 has a TDP of 220W.
The R&D argument is just weird. If AMD's R&D is too small, they should simply hire more people. What's the problem? It's not like they're a tiny, unknown company.
And if they can't afford bigger R&D, they should try to sell more mainstream products.
But instead they're concentrating on specialized, niche parts: first marketing Ryzen towards gamers (and not including an IGP), and now giving us a GPU that's hardly usable in notebooks, AIOs, and small systems.
Looking at the current strategy, it seems AMD is fine with the role of an "underdog": smaller, targeted at a particular group of clients. But if that's the case, we shouldn't really expect less because of the size difference.
Intel is big, but Intel makes many things. Money from CPU sales funds research in other areas - it's always been like that. I doubt their budgets for actual CPU R&D differ greatly.



RejZoR said:


> Sorry, but tiny case users are in total minority. I've had a miniATX build and in all the years I had it, all I've seen was tons of big cases and like 5 people who had ITX builds as HTPC.


Of course they are. But so are large gaming desktops in the whole personal computing business.
Also, people on this forum keep saying that Intel - by not going over 4 cores in consumer CPU - blocks the evolution of software towards using more cores.
So here's my answer: why is AMD forcing us to get large cases with large PSUs and cooling solutions? Why are they blocking the evolution towards smaller gaming rigs? 

BTW: I assume RX550 is designed to be paired with Ryzen in PCs that don't need powerful GPU (like office desktops and non-GPU-dependent productivity machines). I wonder how this card will perform. Given the specs compared to RX460, it'll have to be the slowest card available right now, but maybe it's suitable for passive cooling?
That could easily be the most interesting card in the RX5xx series.



RejZoR said:


> Sure, it's fast and efficient, but it's just so damn boring now.
> R9 Fury had HBM which was something new and exciting as it was never used before.


OK. So here I'm out, really... You prefer a PC part because it is "exciting", performance being less important. This is something I can't argue with. No one can.
At this point you could simply say that you prefer AMD, because you like red over blue and green...


----------



## notb (Apr 19, 2017)

Komshija said:


> I would still buy an AMD card, not because of some fanboyism, but because I can always get more from AMD for the given price.


MSRP of RX580 and GTX1060 is exactly the same: $230. However, since 1060 has been around for a while, prices have already dropped.

Just looking at the ASUS ROG series:
GTX 1060 OC: $310, but e.g. Amazon sells it for $295
RX580 OC: $320
Same cooler, possibly almost identical performance.

Keep in mind RX580 TPU review compares it to a generic GTX1060. The Asus mentioned above was 5% faster than generic on average, which means it's just as fast as this Sapphire RX580 (also with large factory OC).


----------



## Assimilator (Apr 19, 2017)

Komshija said:


> GTX 1080 is blown to pieces by more expensive (relevant factor) and much more power hungry (irrelevant factor) Radeon Pro Duo



Radeon Pro Duo may as well not exist since it's almost impossible to get hold of. Plus it's a dual-GPU card which means game performance is up to the whim of the developers and AMD's driver team.


----------



## mroofie (Apr 19, 2017)

Assimilator said:


> 9.2? Seriously? For a card that, in all honesty, blows goats? This isn't a refresh, it's AMD taking the piss.
> 
> Vega had better be the holy lord and saviour that AMD and its fanboys are hoping for, because if the rumours are true and Volta is going to arrive in Q3, all the Pascal cards are going to drop in price, which means we could see GTX 1060 6GB for lower than $200. Which would, justifiably, s**t all over the RX "500" series.
> 
> ...


Volta arrives next year...


----------



## uuuaaaaaa (Apr 19, 2017)

As always, great review Wiz! Sweet mid range card from Sapphire


----------



## medi01 (Apr 19, 2017)

notb said:


> MSRP of RX580 and GTX1060 is exactly the same: *$230*





notb said:


> However, *since 1060 has been around for a while, prices have already dropped.*





notb said:


> GTX 1060 OC: *$310*, but e.g. Amazon sells it for* $295*



Lol.



Assimilator said:


> This isn't a refresh, it's AMD taking the piss.


They "said" they'd up the MHz a bit; they upped it quite a bit.
It sure made the cards more power hungry.
I don't get what all the fuss is about, frankly.
Performance is decent, and 200W power consumption (on cards with a record OC) is hardly outlandish.
They also come with amazing coolers; the ASUS Strix 580 is quieter than the 1060 (and Sapphire manages to beat both).




Assimilator said:


> The truth is that the Samsung/GloFo 14nm process is a POS that doesn't scale to decent clocks, and no number of claims about "improvements" can change that. Lipstick on a pig.



Well, is Samsung's process inferiority a fact?
I've only seen Tom's review on this (where they compared Apple's chip built on both processes), and Samsung was on par or better.
Ryzen is also on 14nm, as far as I know.

I wonder why they couldn't use whatever they created for Microsoft's Scorpio, with 40 CUs (up from 36), and where that thing is supposed to be manufactured, GloFo or TSMC.
It has the same TFLOPS rating as the 580, but surely consumes much less.




mroofie said:


> reason to buy the 580


FreeSync, extra 8GB, superior Vulkan/DX12 performance, general longevity of AMD cards.

Oh, and I want to see the prices you have mentioned.


----------



## mroofie (Apr 19, 2017)

oxidized said:


> I'm not so sure anymore, at these prices i could get a 1060 6GB overclock the sh** out of it (since it can) and eat both the 480 and the 580 alive.


This^.
I checked the price of the GTX 1060 6GB, and there's no reason to buy the 580.


----------



## Tatty_One (Apr 19, 2017)

TheMailMan78 said:


> *Well I may get a 580 to replace my 780ti*. Honestly people crying about power consumption are the same people who cry about MPG in muscle cars. Dumb asses.
> 
> Looking at raw performance/price these 580s ain't to bad. Looking forward to seeing Vega.


Really?  For what, around a 10% performance gain?


----------



## Komshija (Apr 19, 2017)

notb said:


> MSRP of RX580 and GTX1060 is exactly the same: $230. However, since 1060 has been around for a while, prices have already dropped.
> 
> Just looking at the ASUS ROG series:
> GTX 1060 OC: $310, but e.g. Amazon sells it for $295
> ...



The RX 580 was just released, so prices will drop after a while. At the time of release the RX 480 was cheaper than the GTX 1060, and I assume the RX 580 will follow the RX 480's philosophy. Plus, the RX 580 has 8GB of VRAM, which is nicer than 6GB, even if you won't use all of it.
AMD has yet to release new drivers to further boost the RX 580's performance, like they did with the RX 480.

Today we have the same hype with the GTX 1060 against the RX 480 and RX 580 as we did two years ago with the GTX 970 and R9 390. Well, the R9 390 was cheaper and faster, except in some Nvidia-sponsored games where AMD cards are artificially bogged down. When games start using DX12, and that will happen, AMD cards will get an additional boost, unlike Nvidia's.


----------



## medi01 (Apr 19, 2017)




----------



## Komshija (Apr 19, 2017)

mroofie said:


> this^
> I checked the price of the gtx 1060 6GB and
> 
> no reason to buy the 580


It depends whether you look at it from a realistic or a fanboy's perspective. As I said, AMD always offers more bang for the buck, and their situation with drivers has vastly improved in the last two years.


----------



## notb (Apr 19, 2017)

Komshija said:


> RX 580 was just released, so prices will drop after a while.


With this kind of performance and efficiency? I'm sure they'll drop very quickly, indeed.



Komshija said:


> Today we have the same hype with GTX 1060 against RX 480 and RX 580 as we did two years ago with GTX 970 and R9 390. Well, R9 390 was cheaper and faster, except in some Nvidia-sponsored games where AMD cards are artificially bogged down. When games start using DX12, and that will happen, AMD cards will get additional boost unlike the Nvidia's.


"Hype" is something happening before the launch. Today we have numbers. GTX1060 and RX580 have similar price and performance.

I won't comment on the "NVIDIA-sponsored games", but I suggest you don't expect many games to support DX12. It's an awful library. At this point we should worry more about a future DX13 (or Vulkan).


----------



## Komshija (Apr 19, 2017)

notb said:


> "Hype" is something happening before the launch. Today we have numbers. GTX1060 and RX580 have similar price and performance.


Hype can last as long as the life cycle of the product.



notb said:


> I won't comment on the "NVIDIA-sponsored games"


Of course you won't, because this is a well-known Nvidia dirty trick. Just to make it clear, AMD has also sponsored a few games, but Nvidia-sponsored games vastly outnumber AMD-sponsored ones.



notb said:


> but I suggest you don't expect many games to support DX12. It's an awful library. At this point we should worry more about a future DX13 (or Vulkan).


DX13 isn't released yet, while Vulkan is a much more advanced platform than OpenGL. I see absolutely no problem with games supporting DX12 and implementing Vulkan, as this would vastly boost AMD GPUs' performance. In other words, I like to pay less and get as much as possible. Paying a lot to get a lot isn't the way to go.


----------



## notb (Apr 19, 2017)

medi01 said:


> Performance is decent and 200W power consumption (on cards with record OC) is hardly outlandish.


We're not saying that 200W is something unacceptable by definition. We're simply pointing out that the competitor is way better.


medi01 said:


> They also come with amazing coolers, ASUS Strix 580 is quieter than 1060. (Sapphire manages to beat both).


Quieter than which 1060? This is the same (or an almost identical) cooler ASUS used on their GTX 1060. It'll have similar noise characteristics.
According to TPU, this Sapphire in its "quiet mode" is just as loud as the ASUS ROG 1060 under load (32 dBA). Keep in mind the ASUS has 3 fans.
A 2-fan MSI 1060 Gaming was rated at 28 dBA. That's a huge difference.
We'll see how MSI RX580 Gaming performs. Hopefully TPU will do a test.
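Worth remembering when comparing those noise figures: the dBA scale is logarithmic, so a 4 dB gap is bigger than it looks. A quick sketch using the standard decibel relation (this is textbook acoustics math, not TPU's measurement methodology):

```python
def power_ratio(db_difference):
    """Sound power ratio corresponding to a dB difference (10*log10 scale)."""
    return 10 ** (db_difference / 10)

# 32 dBA vs 28 dBA: the louder cooler emits ~2.5x the sound power
print(round(power_ratio(32 - 28), 2))  # → 2.51
```

Perceived loudness doesn't scale linearly with sound power either, but the point stands: "just 4 dBA" is not a small difference.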


medi01 said:


> Well, is Samsung's process inferiority a fact?
> I've only seen toms review on this (where he compared apple's chip to apple's chip) and Samsung was on par or better.
> Ryzen is also on 14nm, as far as I know.


It is said that Samsung's FinFET process, being designed primarily for mobile devices, is not that great for PC parts.
This might be one of the reasons why Ryzen is not great for OC. It's clocked at the optimal point, and past that it's very difficult to push further (a very steep temp/clock curve compared to Intel or older AMD designs).


medi01 said:


> I wonder why they couldn't use whatever they have created for Microsoft Scropio, with 40CUs (up from 36) and where that thing was supposed to be manufactured, GloFo or TSMC.
> It has the same TFLOPs rating as 580, but surely consumes much less.


Guaranteed exclusivity?
AMD also makes a very efficient Radeon PRO GPU for MacBooks. It's not available anywhere else.


medi01 said:


> FreeSync, extra 8GB, superior Vulkan/DX12 performance, general longevity of AMD cards.


Don't take this as an insult, but the "general longevity of AMD whatever" is usually an effect of a very slow replacement cycle. E.g., many people still use 5-year-old FX CPUs because they (for whatever reasons; it's not always fanboyism) didn't want to jump to Intel. Now they're suddenly moving to Ryzen. 



Komshija said:


> Of course you won't, because this is well known Nvidia's dirty trick. Just to make it clear, AMD also sponsored a few games, but sheer number of Nvidia-sponsored games vastly outnumber AMD-sponsored ones.


I see the same argument in Ryzen discussions. Software is "sponsored by Intel".
Yes, it's everyone else's fault... 



Komshija said:


> DX 13 isn't released yet


DX13 is a distant future. But DX12 has been around for a while, and new games are still being released on DX11.

Recently AMD announced a partnership with Bethesda. Prey is used in the Ryzen 5 marketing campaign, but it is a DX11 game, so RX cards won't get any boost.



Komshija said:


> I see absolutely no problem with games supporting DX12 and implementing Vulkan as this would vastly boost AMD GPU's performance.


You might not, but gaming studios do. DX12 is widely criticized by programmers. Maybe we'll see a new, fixed revision (12.1, etc.), or maybe they'll jump straight to DX13.
Until that happens, most games will use the older libraries.
Another thing is that it's not AMD that's gaining in DX12; it's actually NVIDIA that's losing, as their drivers don't support DX12 very well.


----------



## Athlonite (Apr 19, 2017)

This is the reason I commonly skip whole generations of GPUs, like going from HD 7850 > R9 285 > RX 480 > whatever Vega card is around.


----------



## medi01 (Apr 19, 2017)

notb said:


> Quieter than which 1060?


Quieter than the Strix from ASUS.



notb said:


> "general longevity of AMD whatever" is usually an effect of very slow replacement cycle.


Uh, no, not really.
290(x) vs 780(Ti)



notb said:


> Guaranteed exclusivity?


Well, meh. Exclusivity in this context would only apply to a competitor, Sony in this case.
Microsoft couldn't care less about competing with... AMD... 



notb said:


> for whatever reasons - it's not always fanboyism


Voting with your wallet is not fanboyism.


----------



## oxidized (Apr 19, 2017)

medi01 said:


> Lol.
> 
> Performance is decent and 200W power consumption (on cards with record OC) is hardly outlandish.
> They also come with amazing coolers, ASUS Strix 580 is quieter than 1060. (Sapphire manages to beat both).
> ...



Performance per watt shows the 580 in one of the last spots; go look:

https://www.techpowerup.com/reviews/Sapphire/RX_580_Nitro_Plus/31.html

And this is for the Nitro+ LE, probably the fastest 580, which came out yesterday; it's almost 220W actually. And what record are you talking about, exactly? Any 1060 can overclock up to 2050 MHz, and I've seen people going over 2100 MHz from 1700-1800, so which one is the record OC? Memory also overclocks far more on 1060s, +500 MHz pretty easily, all this with at least 50W less and better temperatures, oh, and at the same price for now. So what are we really talking about? FreeSync? I have yet to find someone saying FreeSync is a very good feature. An extra 8 GB? I guess you mean an extra 2 GB, and 2 GB on cards like these won't make much of a difference except in the very long run, when they'll both be obsolete; we're not talking about 4 vs 8 GB, and 6 vs 8 is a completely different thing. Superior Vulkan performance? Maybe, but not by that much once you take a 1060 to the frequencies I mentioned; both DX11 and DX12 will be an all-out win for the 1060 at those clocks. Longevity is probably the only one on AMD's side, and even then only because of the extra 2 GB.


----------



## medi01 (Apr 19, 2017)

oxidized said:


> Performance/Watts show the 580 at one of the last spots go look


It isn't last, but remind me: when did perf/watt become a major metric?



oxidized said:


> and what record are you talking about exactly


Nitro+ card.



oxidized said:


> over 2100MHz


Oh, not this shit again. Are we back in P4 times?
At 2000 MHz a 580 would have wiped the floor with a 1070.


----------



## EarthDog (Apr 19, 2017)

Lol medi...

A 290X would beat it at 2000 MHz, too...

...point is, it can't come close.

So many red herring arguments... poor TPU.


----------



## Assimilator (Apr 19, 2017)

oxidized said:


> And this is for the nitro+ LE probably the fastest 580 came out yesterday, it's almost 220W actually



Now AMD fanboys have a 220W GPU to match their 220W FX-9590 CPUs 



medi01 said:


> It isn't last, but remind me, when did perf/watt become major metric.



You should probably ask laptop manufacturers, who for some reason that I cannot begin to fathom, prefer NVIDIA GPUs.



medi01 said:


> At 2000Mhz 580 would have wiped the floor with 1070.



And a P4 at 10GHz would beat any CPU today. Except P4 could never get to 10GHz just like Polaris can't get to 2GHz. So did you have a point, or are you just regurgitating irrelevant hypotheses to make yourself feel clever?


----------



## notb (Apr 19, 2017)

medi01 said:


> Uh, no, not really.
> 290(x) vs 780(Ti)


So "slow replacement cycle" + "rebranding". But I know what you're talking about.
http://www.babeltechreviews.com/nvidia-forgotten-kepler-gtx-780-ti-vs-290x-revisited/view-all/
(the difference is even bigger in games from 2016)
The 290 has aged well because it was so similar to the succeeding 390. And it's not that far from the RX either.

NVIDIA releases cards more often and optimizes drivers for the latest architecture. And boy, have NVIDIA cards changed...
The 1080 is more than twice as fast as the 780, but uses around 10% less power.
The RX 580 uses 10-15% less power than the R9 290, but the performance increase is fairly small: 20-25% according to TPU reviews.

Generally speaking, at a random moment in time it's more likely that an NVIDIA card will be a better choice at any price point - simply because they're updating a lot more often.
And if you're after high-end solutions, it might just be that AMD doesn't offer anything serious (like at the moment).
But sure, if you can cherry pick the moment when you upgrade your PC, going AMD can be very efficient. R9 290 -> Vega (if any good), why not? 

That said, it's not the way I'd do my shopping.
A good R9 290 was a little over $400, and keeping it for 4 years would mean playing at lower resolutions near the end anyway, while still living with all the drawbacks of a high-end GPU.
I'd still prefer to buy mid-range cards for $200-250 and replace them every 2 years (e.g. 760 -> 1060), because:
1) in the long run it costs the same (including some money from selling a 2-year-old card),
2) you're under warranty more of the time and you get all the latest features,
3) mid-range cards eat less power and should be less noisy.

I guess (3) is the most important argument. Here's a review of a good quality 290:
https://www.techpowerup.com/reviews/Sapphire/R9_290_Vapor-X/24.html
30 dBA idle, 37 dBA under load. Sorry, but that's not acceptable. I could live with that 4 years ago, when 290 was a performance monster. But now?
Here is the duo I'd choose:
https://www.techpowerup.com/reviews/MSI/GeForce_GTX_760_TF_Gaming/26.html
https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X/23.html
Both cards are quieter under load than the Sapphire 290 is in idle. MSI 1060's fan stops in idle and light use.


----------



## medi01 (Apr 19, 2017)

notb said:


> 290 has aged well because


That's a theory.
We know it has aged well.
We also know that 580/570 are rather similar to what is in Xbone/PS4, so this tendency is likely to stay.



Assimilator said:


> You should probably ask laptop manufacturers, who for some reason that I cannot begin to fathom, prefer NVIDIA GPUs.


I thought we were discussing desktops.
It sure matters in notebooks; I guess Polaris can appear there only if undervolted/underclocked.
(that demo from a year ago vs the 950, where AdoredTV bothered lowering clocks far enough to repeat that "half the power consumption" thing)



Assimilator said:


> And a P4 at 10GHz would beat any CPU today.





Assimilator said:


> Except P4 could never get to 10GHz


Indeed. 
Except there was no 10 GHz CPU to compare it to, and the right analogy would be an Athlon running at P4 clocks.

Anyhow, are you sure what you are arguing with me about?


----------



## notb (Apr 19, 2017)

medi01 said:


> It isn't last, but remind me, when did perf/watt become major metric.


Basically when notebooks started outselling desktops. 

If you're not aware of the situation in notebook gaming, this might hurt - sorry.
Mobile GTX1070 beats a desktop R9 Fury (and mobile GTX1080 is another 25% faster).
And today it's not just about huge gaming notebooks. You can buy a slim 14" notebook with 960M and that's RX460 territory in 1080p gaming...



medi01 said:


> Oh, not this shit again, are we back in P4 times?
> At 2000Mhz 580 would have wiped the floor with 1070.


Oh, man. And you call me an Intel/NVIDIA fanboy...


----------



## notb (Apr 19, 2017)

medi01 said:


> That's a theory.
> We know it has aged well.
> We also know that 580/570 are rather similar to what is in Xbone/PS4, so this tendency is likely to stay.



Not if Vega is a totally different architecture compared to Polaris (and it must be to meet AMD claims).
Even if Vega is reserved only for the high-end segment in this generation (keeping RXxxx in low/mid-range), it'll now be the architecture that AMD cares about.


----------



## 64K (Apr 19, 2017)

Well, here's my 2 cents. I will compare value based on the reviews that are available on TPU and I will compare the Sapphire Nitro+ RX 580 to a MSI Gaming X 1060 6GB. The reason I would do that is that the 1060 6GB in the Performance Summary Chart is the reference non-overclocked 1060 6GB.

Current prices on Newegg are $250 for the Sapphire Nitro+ RX 580 8GB and $255 for the MSI Gaming X 1060 6GB although there is a $20 rebate and a free game for the MSI Gaming X 1060 6GB which makes it a much better deal than the 580 Nitro+ if you bother with rebates and actually want the free game offered.

The reference non-OC 1060 6GB in the 1080p Performance Summary is a few percent slower than the 580 Nitro+ overall, but the Gaming X 1060 6GB is OC right out of the box just like the Nitro+ is, so I would bet that would narrow down the performance lead that the Nitro+ has. Additionally, for anyone considering OCing even further: according to the tests done here at 1440p in Battlefield 3, an overclocked MSI Gaming X 1060 6 GB gives you an actual performance gain of 15.1% and the overclocked Nitro+ gives you an actual performance gain of only 4.4%. So with both cards overclocked they would probably be pretty equal in performance to each other.

What about the cost to buy the card and use it? The Nitro+ 580, according to the tests done here, uses 234 watts average gaming and the Gaming X 1060 uses 121 watts average gaming. For me, I game an average of 15 hours a week and I only pay 10 cents per kWh, so it would cost me only $9 more on my power bill per year to use the Nitro+. I usually keep a card for around 2 years, so that would make the 1060 6 GB about $13 cheaper over its lifespan to buy and use (taking into consideration that it costs $5 more for the 1060), and it wouldn't dump as much unnecessary heat into my room as the 580. If you game more than that, pay more for electricity, or keep a card for longer than 2 years, then factor that in and make your own judgement, but really I think it's mostly a draw between the 2 cards, and if DX12 or Vulkan takes off then the lead will probably go to the 580.
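
That math can be checked with a quick sketch (power figures are from the TPU tests; the hours, electricity rate, and ownership period are the assumptions stated above):

```python
# Rough buy-and-use cost comparison between the two cards, using the
# review's average-gaming power draw and the usage assumptions above.
HOURS_PER_WEEK = 15    # assumed gaming time
PRICE_PER_KWH = 0.10   # USD, assumed electricity rate
YEARS = 2              # assumed ownership period

def energy_cost(watts):
    """Electricity cost in USD over the whole ownership period."""
    kwh = watts / 1000 * HOURS_PER_WEEK * 52 * YEARS
    return kwh * PRICE_PER_KWH

rx580_w, gtx1060_w = 234, 121        # avg gaming draw (W), per TPU tests
rx580_usd, gtx1060_usd = 250, 255    # Newegg prices at time of writing

rx580_total = rx580_usd + energy_cost(rx580_w)
gtx1060_total = gtx1060_usd + energy_cost(gtx1060_w)
print(f"RX 580 Nitro+ total:     ${rx580_total:.2f}")
print(f"GTX 1060 Gaming X total: ${gtx1060_total:.2f}")
# Difference works out to roughly $13 in the 1060's favour over 2 years.
```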

I think Vega will arrive in a couple of months so if possible I would hold off on buying this card or any card until then anyway.


----------



## rtwjunkie (Apr 19, 2017)

notb said:


> Basically when notebooks started outselling desktops.
> 
> If you're not aware of the situation in notebook gaming, this might hurt - sorry.
> Mobile GTX1070 beats a desktop R9 Fury (and mobile GTX1080 is another 25% faster).
> ...


Apparently YOU  are unaware of the situation in notebook gaming and how pathetic they are for longevity, heat etc.  If you are interested, Maximum PC regularly tests insanely expensive gaming notebooks that barely compare in performance to an average built gaming desktop. So, yeah.


----------



## TheMailMan78 (Apr 19, 2017)

Tatty_One said:


> Really?  For what, around a 10% performance gain.


1. Free upgrade. Sell the 780ti and 580 is basically free.
2. Almost 15 to 20 fps in BF1. Game which I play mostly.
3. I have a freesync monitor I wanna try out.
4. I haven't run ATI in a while. Kinda bored of NVIDIA.


----------



## 80-watt Hamster (Apr 19, 2017)

Assimilator said:


> And a P4 at 10GHz would beat any CPU today.



You know, it probably couldn't.  Its one lonely core is a pretty significant handicap, plus the increases in IPC since then mean that a current i3 has more horsepower at just shy of 4 GHz than the P4 would have at 10.

This whole argument has taken a turn for the bizarre.  I mean, what are we arguing about here, aside from who is or isn't a fanboy (always a productive discussion, that)?  Yes, the 570/580 are largely rebrands.  No, it's not a particularly great advancement.  Yes, the 1060 6GB is overall a better chip.  No, it's not by a significant margin. (Perf/$, perf/W is another discussion, and AMD still can't compete here.)  Yes, the basic 570/580 are a decent value, particularly if you throw Freesync into the mix.  No, the high-end OC versions are not.

AMD has definitely leaned more heavily on refreshes and rebrands than Nvidia has over the last few generations.  They're operating from a position of weakness, making less per unit while at the same time selling far fewer units.  That's a rough place to be.  They bet heavily on GCN, which has paid off in scalability, but hurt in power consumption.  I don't know what people were expecting from the 500 series; OC attempts on the 480 showed pretty clearly that there was a performance limit that AMD was pretty close to right out of the gate; given the maturity of GCN, this isn't surprising, as Polaris was more die shrink than redesign.

Will Vega change things?  Maybe.  Ryzen came out much stronger than I expected, though the frequency limits make me skeptical of how well it will scale.  Lightning could strike twice, but I'm not expecting them to make up as much ground in GPU as they made in CPU.


----------



## mroofie (Apr 19, 2017)

TheMailMan78 said:


> Well l may get a 580 to replace my 780ti. Honestly people crying about power consumption are the same people who cry about MPG in muscle cars. Dumb asses.
> 
> Looking at raw performance/price these 580s ain't to bad. Looking forward to seeing Vega.


So People who care about heat output are dumb asses? 
k..


----------



## oxidized (Apr 19, 2017)

medi01 said:


> It isn't last, but remind me, when did perf/watt become major metric.
> 
> 
> Nitro+ card.
> ...



You can't be serious. It's always been important - what's the point if I can match X card's performance while consuming twice as much, having higher temps, or using a die twice as big? (I'm not saying the 580 is all this)

Also, Nitro, yeah, but the others won't be that far behind - 10 to 20 W less, we're still around 200 W.

Too bad the 580 can't reach 2000 MHz, can it? And it's still ~500 MHz behind. So what's your point? The 480 has a bigger die than the 1060 - so what if the 1060 had the 480's die size?


----------



## Vayra86 (Apr 19, 2017)

ShurikN said:


> Consumes 100W more to achieve 5% more perf than a 1060. Guess which chip you wont see in a laptop... again.
> Not to mention that it's trading blows with an equally priced custom 1060. And those cards were out for a while.
> 
> Wanted to say I'm disappointed, but my expectations were extremely low to begin with. And boy were those expectations met.



This, and I'm gonna leave it at that


----------



## CounterSpell (Apr 19, 2017)




----------



## Assimilator (Apr 19, 2017)

80-watt Hamster said:


> You know, it probably couldn't.  It's one lonely core is a pretty significant handicap, plus the increases in IPC since then mean that a current i3 has more horsepower at just shy of 4 GHz than the P4 would have at 10.



Please don't make me explain how hyperbole works. Especially when I employ it in response to fallacious arguments.



80-watt Hamster said:


> Will Vega change things?  Maybe.  Ryzen came out much stronger than I expected, though the frequency limits make me skeptical of how well it will scale.  Lightning could strike twice, but I'm not expecting them to make up as much ground in GPU as they made in CPU.



Ryzen's clocks are limited by the Samsung/GloFo 14nm process just as much as Polaris's clocks are. Unless they have something completely new in the works for Vega, AMD will have to pack so many CUs onto that GPU to compensate for its low clocks that those chips will be massive and have a correspondingly high defect rate. Which means they'll be expensive to produce, which means the cards will be expensive, which means that if NVIDIA decides to introduce Volta at a lower price point, AMD is pooch screwed.



mroofie said:


> So People who care about heat output are dumb asses?
> k..



In exactly the same way that ordinary people who care about their car's fuel consumption are dumbasses. 
I guess the guys who want power with great fuel consumption must be the ultimate dumbasses then according to @TheMailMan78.


----------



## oxidized (Apr 19, 2017)

GG AMD

https://www.techpowerup.com/232498/radeon-rx-480-cards-can-successfully-be-flashed-to-rx-580


----------



## TheMailMan78 (Apr 19, 2017)

Assimilator said:


> In exactly the same way that ordinary people who care about their car's fuel consumption are dumbasses.
> I guess the guys who want power with great fuel consumption must be the ultimate dumbasses then according to @TheMailMan78.


 Ya don't buy a Corvette and bitch about MPG. If you do.....yes you are in fact a dumbass. I don't buy a gaming GPU and care about power draw.


----------



## 64K (Apr 19, 2017)

TheMailMan78 said:


> Ya don't buy a Corvette and bitch about MPG. If you do.....yes you are in fact a dumbass. I don't buy a gaming GPU and care about power draw.



That doesn't make sense though. A RX 580 is comparable in performance to a 1060 6 GB which is a low end of the mid-range Pascals. So if a RX 580 is comparable to a Corvette then what would a 1080 or 1080 Ti be?

A better comparison would be that a RX 580 is a 4 door Camry that gets worse gas mileage than a GTX 1080 Corvette.


----------



## TheMailMan78 (Apr 19, 2017)

64K said:


> That doesn't make sense though. A RX 580 is comparable in performance to a 1060 6 GB which is a low end of the mid-range Pascals. So if a RX 580 is comparable to a Corvette then what would a 1080 or 1080 Ti be?
> 
> A better comparison would be that a RX 580 is a 4 door Camry that gets worse gas mileage than a GTX 1080 Corvette.


1080ti would be a Bugatti. IGP would be a Camry.


----------



## CounterSpell (Apr 19, 2017)

oxidized said:


> GG AMD
> 
> https://www.techpowerup.com/232498/radeon-rx-480-cards-can-successfully-be-flashed-to-rx-580



hhahah

we need benches!


----------



## notb (Apr 19, 2017)

rtwjunkie said:


> Apparently YOU  are unaware of the situation in notebook gaming and how pathetic they are for longevity, heat etc.  If you are interested, Maximum PC regularly tests insanely expensive gaming notebooks that barely compare in performance to an average built gaming desktop. So, yeah.



Source?


----------



## EarthDog (Apr 20, 2017)

notb said:


> Source?


You are just as welcome to support your assertion as he is with his... 

This isn't a discussion where a blatantly obvious point is being questioned.


----------



## hat (Apr 20, 2017)

Seems like AMD is really falling behind in graphics cards. The GTX1060 looks like the better buy all around. You get the same performance for less money, both at the time of purchase and when it's time to pay the power bill. And if you want more performance, you can go higher with GTX1070 or better from the green team. Hopefully the next release is a game changer.


----------



## Vlada011 (Apr 20, 2017)

Power consumption is almost as high as a TITAN Xp's. More than a GTX 1080, but far slower.
If the GTX 1080 price drops a little more, it will be the best investment for gaming - at $499, for example.
I seriously think about buying a GTX 1080 Founders Edition.


----------



## medi01 (Apr 20, 2017)

Interesting (I was surprised) results with undervolting + chill (AIBs seem to be very conservative about voltage, undervolting alone reduces power consumption by 30W, without any effect on performance).
Skip to 10:00 mark:

oxidized said:


> consuming twice as much


Consuming X watts more to achieve the same performance is important.
The frequency at which it is done is not.
That's what you have failed to grasp.


----------



## Prima.Vera (Apr 20, 2017)

Re-brands should be illegal, since it's just a very elaborate SCAM. That's all.


----------



## medi01 (Apr 20, 2017)

oxidized said:


> GG AMD
> 
> https://www.techpowerup.com/232498/radeon-rx-480-cards-can-successfully-be-flashed-to-rx-580



The real "new" cards are 560 (more CUs than 460) and 550 (didn't even exist as 4xx).

The other things are slightly better chip revisions that allow achieving higher frequencies at lower voltage.




Prima.Vera said:


> Re-brands should be illegal, since it's just a very elaborate SCAM. That's all.


There is nothing that qualifies as "scam" about RX 5xx series.



notb said:


> Basically when notebooks started outselling desktops.


When notebooks started outselling desktops, Intel started to dominate the market.


----------



## notb (Apr 20, 2017)

EarthDog said:


> you are just as welcome to support your assertion as he is with his...
> 
> This isnt a discussion where a blatently obvious point is being questioned.



Hmm... I kind of thought that this was a well-known fact and proof wasn't needed. 
I did ask for a source because rtwjunkie mentioned a particular reviewer: Maximum PC. First of all, I didn't know that magazine (and I have no access to a physical copy). Second, internet-wise they were swallowed by PC Gamer a year ago. So maybe rtwjunkie was talking about fairly old reviews, while the big jump in laptop GPU performance happened thanks to Maxwell and Pascal?

But no problem. Here's a fairly comprehensive list (section Overview).
https://www.notebookcheck.net/The-Witcher-3-Notebook-Benchmarks.143187.0.html
I'm linking The Witcher because it seems to favour AMD architectures.


----------



## TheMailMan78 (Apr 20, 2017)

Prima.Vera said:


> Re-brands should be illegal, since it's just a very elaborate SCAM. That's all.


Does that apply to cars also? I mean, they use the same powertrain with minor tweaks for 10+ years in a row sometimes. Would it make you feel any better if they offered different coolers and colors........oh wait, that's exactly what they do.


----------



## oxidized (Apr 20, 2017)

medi01 said:


> Consuming X watts more to achieve the same performance is important.
> At which frequency is done is not.
> That's what you have failed to grasp/



What does this even mean?



medi01 said:


> The real "new" cards are 560 (more CUs than 460) and 550 (didn't even exist as 4xx).
> The other things are slightly better chip revisions, that allow to achieve higher frequency at lower voltage.



So???


----------



## Prima.Vera (Apr 20, 2017)

TheMailMan78 said:


> Does that apply to cars also? I mean they use the same powertrain with minor tweeks for 10+ years in a row sometimes. Would it make you feel any better if they offered different coolers and colors........oh wait that's exactly what they do.


You're comparing oranges with machine guns.



medi01 said:


> There is nothing that qualifies as "scam" about RX 5xx series.


I was speaking in general.


----------



## TheMailMan78 (Apr 20, 2017)

Prima.Vera said:


> You're comparing oranges with machine guns
> .


Um no. Same thing.


----------



## EarthDog (Apr 20, 2017)

TheMailMan78 said:


> Does that apply to cars also? I mean they use the same powertrain with minor tweeks for 10+ years in a row sometimes. Would it make you feel any better if they offered different coolers and colors........oh wait that's exactly what they do.


lol.. but they still call it a [insert name of car here]. With a different motor/powertrain, there are designations - instead of a Ford Taurus, it's a Ford Taurus SHO. To relate this to GPUs, it should be an RX 480+ or something, not a new 'series'. So, I'm with the other dude on the analogy fail. 



medi01 said:


> The real "new" cards are 560 (more CUs than 460) and 550 (didn't even exist as 4xx).
> 
> The other things are slightly better chip revisions, that allow to achieve higher frequency at lower voltage


two things.

1. We don't know yet if the low 5-series cards are cut down or not... can't make that assumption yet, can we? Regardless, I wouldn't consider those the same either.
2. The 5xx series cards use MORE voltage and MORE power to reach those clocks. Every single review I've seen has said that.


----------



## notb (Apr 21, 2017)

EarthDog said:


> lol.. but they still call it a [insert name of car here]. With a different motor/powertrain, there are designations, like instead of a ford taurus, its a ford taurus sho. To relate this to gpus, it should be an r9 480+ or something, not a new 'series'. So, im with the other dude on the analogy fail.


But on the other hand, we have this situation with better GDDR5 used in new revisions of Pascal-based cards. Manufacturers didn't change the names, so now we'll have a bunch of new reviews for "GeForce GTX 1060 6GB with 9 Gbps GDDR5" and so on. I think I'd prefer rebranding, because this is just sad...



EarthDog said:


> 2. The 5xx series cards use MORE voltage and MORE power to reach those clocks. Every single review ive seen has said that.


But this is even sadder. It's really difficult to understand what's happening at AMD because, honestly, they could have just changed the bios in the older cards...
IMO this could be a result of switching the manufacturing process. Maybe these new chips don't like high clocks - much like Ryzen.


----------



## Freelancer (Apr 21, 2017)

I wonder how MSI Gaming X compare with Sapphire ... will there be a review of MSI Gaming X? Thanks!


----------



## W1zzard (Apr 21, 2017)

Freelancer said:


> will there be a review of MSI Gaming X? Thanks!


Not sure yet


----------



## medi01 (Apr 21, 2017)

EarthDog said:


> 2. The 5xx series cards use MORE voltage and MORE power to reach those clocks. Every single review ive seen has said that.


How does that contradict what I said, though?
Slightly improved silicon might allow manufacturers to roll them out.
Note that they had targets (beating the 1060) which dictated the OC levels.

The "but it's just a 480" argument is... not really clear. Sure, you can find a 480 that would OC even better than a 580, but it doesn't mean manufacturers could go for it.

For power consumption stuff, I found this rather surprising (rewind to 10:00 mark):


----------



## EarthDog (Apr 21, 2017)

medi01 said:


> How does that contradict what I said though?
> Slightly improved silicon might allow manufacturers to roll them out.
> Note that they had targets (beating 1060) which dictated the OC levels.
> 
> ...


I guess we are still waiting for this better chip revision you speak of? It's not in the 5 series so far...


----------



## medi01 (Apr 21, 2017)

EarthDog said:


> I guess we are still waiting for this better chip revision you speak of? Its not in the 5 series so far..


But based on what? (note, slightly better)


----------



## uuuaaaaaa (Apr 21, 2017)

EarthDog said:


> I guess we are still waiting for this better chip revision you speak of? Its not in the 5 series so far..



In the video above, the RX 580 at RX 480 clocks requires 890 mV, resulting in a max power consumption of 75 W and below-60°C temps, which means the fans don't need to turn on (after the 17-min mark). It sounds like a better revision to me.


----------



## EarthDog (Apr 21, 2017)

But it's an RX 580... and to reach RX 580 clocks, it needs more voltage... the same as the RX 480s which are flashing successfully to 580s...

Underclocking... lol.


----------



## Fluffmeister (Apr 21, 2017)

EarthDog said:


> But its an rx 580.. and to reach rx580 clocks, it needs more voltage to reach those clockspeeds...the same as the rx480 which are flashing successfully to 580s...
> 
> Underclocking... lol.



Come on Earth, these things are an Underclockers dream!!!!11


----------



## uuuaaaaaa (Apr 21, 2017)

EarthDog said:


> But its an rx 580.. and to reach rx580 clocks, it needs more voltage to reach those clockspeeds...the same as the rx480 which are flashing successfully to 580s...
> 
> Underclocking... lol.


Find me a Polaris 10 chip that does 1266 MHz @ 890 mV and takes less than 75 W to do so. This means that the Polaris 20 XTX (or w/e it is called) is an improved Polaris 10 chip, i.e. achieving the same at much lower power and heat.
It is also shown in the video that this RX 580 could undervolt to 1040 mV from the stock 1150 mV, reducing power consumption down to 120 W at its RX 580 clocks.


----------



## FordGT90Concept (Apr 21, 2017)

Prima.Vera said:


> Re-brands should be illegal, since it's just a very elaborate SCAM. That's all.


No one is going to upgrade an RX 480 to an RX 580 unless it's through an RMA program.



notb said:


> I'm linking The Witcher because it seems to favour AMD architectures.


It does not.  Witcher 3 is full of NVIDIA tech.


----------



## rruff (Apr 21, 2017)

ShurikN said:


> Consumes 100W more to achieve 5% more perf than a 1060. Guess which chip you wont see in a laptop... again.
> Not to mention that it's trading blows with an equally priced custom 1060. And those cards were out for a while.
> 
> Wanted to say I'm disappointed, but my expectations were extremely low to begin with. And boy were those expectations met.



Extremely depressing news. What happened to all those reports last fall of new RX 480s maxing out at <100W? I thought AMD had refined their process and were going to match Nvidia for efficiency. Instead it looks like we are back to the same old story.

AMD out of the game in laptops, and only making financial sense in desktops if you have free electricity and good cooling.

EDIT: 


uuuaaaaaa said:


> Find me a polaris 10 chip that does 1266 @890mV and takes less than 75w to do so. This mean that the polaris 20xtx (or w/e it is called) is an improved polaris 10 chip, i.e. achieving the same at much lower power and heat.
> Well it is also shown in the video that this RX580 could undervolt to 1040mV from the stock 1150mV, reducing the power consumption down to 120w at its RX580 clocks.



Or maybe not so bad! Huge improvement with undervolting.


----------



## jabbadap (Apr 21, 2017)

According to Sampsa at io-tech.fi (in Finnish, so use some translation service), there are two flavors of RX 580: one with Polaris 20 XTX and one with Polaris 20 XTR. The latter is a binned chip which clocks higher and is used on top-end AIB RX 580s like this reviewed Nitro+ Limited Edition. Does W1zzard have the same kind of information?


----------



## notb (Apr 22, 2017)

rruff said:


> Or maybe not so bad! Huge improvement with undervolting.


Please don't undervolt your PC parts. This is such a bad idea...
RX580 needs a lot of power, but doesn't emit that much heat. It shows that the electricity is actually put into use.

AMD had really no point in releasing a GPU with too-high stock voltage, if it could have been stable using much less power.

Why would anyone sensible make such decisions about a GPU (bought for a few years) based on a YouTube video posted a few days after the launch...?


----------



## uuuaaaaaa (Apr 22, 2017)

notb said:


> Please don't undervolt your PC parts. This is such a bad idea...
> RX580 needs a lot of power, but doesn't emit that much heat. It shows that the electricity is actually put into use.
> 
> AMD had really no point in releasing a GPU with too-high stock voltage, if it could have been stable using much less power.
> ...



Because the stock VID is usually an overestimation (based on chip quality) to make sure every card works and does not crash. It does not mean that every card would need such voltage. You can actually get improved performance in power-limit situations by lowering the stock VID. You get lower temperatures and power consumption (less fan noise) + higher performance. I have undervolted my R9 Fury X Strix and it works flawlessly. Nowadays it is possible to change the VID per DPM stage in Radeon WattMan, which makes it even less risky and easier than flashing an edited vbios. Undervolting a card will potentially increase its lifespan too.


----------



## ShurikN (Apr 22, 2017)

AdoredTV showed that the MSI 580 gaming is overvolted for no apparent reason. He lowered the vcore by ~100mV and still managed same clocks and stability, with less heat and significantly less power draw.


----------



## Solid State Brain (Apr 22, 2017)

ShurikN said:


> AdoredTV showed that the MSI 580 gaming is overvolted for no apparent reason. He lowered the vcore by ~100mV and still managed same clocks and stability, with less heat and significantly less power draw.


Default clocks at default voltages are also stable near or at the thermal limit, whereas a lower voltage doesn't necessarily guarantee that unless dedicated testing is performed.


----------



## X800 (Apr 22, 2017)

Would this be a good card to replace my MSI 390 8GB?


----------



## jabbadap (Apr 22, 2017)

X800 said:


> Would this be a good card to replace my msi 390 8gb. ?



In all honesty: if it must be AMD, wait for Vega.


----------



## X800 (Apr 22, 2017)

Yes, because I have a FreeSync monitor.


----------



## jabbadap (Apr 22, 2017)

X800 said:


> Yes because I have an freesync monitor.



Well then wait for Vega; R9 390 -> RX 580 is more of a sidegrade than anything (just look at this test game by game, and keep in mind the R9 390 is tested with an older driver too).


----------



## FordGT90Concept (Apr 22, 2017)

X800 said:


> Would this be a good card to replace my msi 390 8gb. ?


If it died, yes.  If it didn't die, keep using your 390.


----------



## X800 (Apr 22, 2017)

Hmm, but it's too slow, I need more FPS. If I can't play the games I own on max, it's time to update.


----------



## FordGT90Concept (Apr 22, 2017)

Then wait for Vega.


----------



## 64K (Apr 22, 2017)

X800 said:


> Hmm but its too slow , I need more FPS  .If I cant play the games I own on max its time update .



Save up some more money and have a look at Vega when it drops soon. The RX 580 Nitro+ is a bit of an upgrade for you but Vega holds better promise. You can always grab a 580 then if Vega doesn't deliver or if it's priced out of your budget.


----------



## awatz (Apr 22, 2017)

Noob question

580 cards from other manufacturers only have one 8-pin connector. Is the power consumption the same as the Sapphire's?


----------



## EarthDog (Apr 23, 2017)

medi01 said:


> But based on what? (note, slightly better)


I don't know... you are the one who said it's slightly better. What do you base that on? I haven't seen anything about these being better silicon. I see raised voltages to reach higher clocks using more power.


uuuaaaaaa said:


> Find me a polaris 10 chip that does 1266 @890mV and takes less than 75w to do so. This mean that the polaris 20xtx (or w/e it is called) is an improved polaris 10 chip, i.e. achieving the same at much lower power and heat.
> Well it is also shown in the video that this RX580 could undervolt to 1040mV from the stock 1150mV, reducing the power consumption down to 120w at its RX580 clocks.


RX 480s can underclock too. Each piece of silicon is different in how much it can underclock...

...again... underclocking, which few care about, and it's really not proving anything about the silicon. 
It may be 'slightly better' but I haven't seen evidence of this...


----------



## uuuaaaaaa (Apr 23, 2017)

EarthDog said:


> I dont know... you are the who said its slightly better. What do yoy base that on? I havent seen anything about these being better silicon. I see raised voltages to reach higher clocks using more power.
> rx480s can underclock too. Each silicon is different in how much it can underclock...
> 
> ...again.. underclocking, so few care and its really not proving anything about the silicon.
> It may be 'slightly better' but i havent seen evidence of this...



1266 MHz @ 890 mV? Back when the 480 was released, some guys couldn't even do 1300 MHz at 1200 mV, and those who could get into 1400 MHz territory would usually need 1250+ mV. Of course there were some "alien" cards among the best AIB models that could do 1400 MHz @ 1150 mV, like some "special" XFX GTRs or some "special" Sapphire Nitro+ cards. With this revision of Polaris it is much more common to get such clocks at much lower volts; the MSI RX 580 in the video did 1400 MHz at 1050 mV. These are not straight rebrands: the architecture is the same, but the manufacturing process is better, more refined.

I see this like the 3.2GHz Phenom II x4 955 BE vs the 3.7GHz Phenom II x4 980 BE: essentially the same architecture, but the latter had much higher clocks than the former at the same TDP. In HWBOT the average OC for the 955 on water is 4138 MHz, while the average OC on air for the 980 is 4270 MHz. Despite being the same architecture, would you say that the 980 is a rebranded 955? Sure, you can get a 955 working at stock 980 clocks, but when you push the clocks on both, the 980 will easily beat the 955 on average.

I'm not saying that what AMD did is 100% correct - this should have been called RX 485 / RX 480 XTX / RX 480 XT PE or something similar, not RX 580 - but these are not straight rebrands; that's how I see it.


----------



## EarthDog (Apr 23, 2017)

Clock speeds don't make it a different card. If anything, perhaps they are better binned (like your CPU example), but it is, seemingly, the exact same silicon.


----------



## uuuaaaaaa (Apr 23, 2017)

EarthDog said:


> Clockspeeds dont make it a different card. If anything, perhaps they are better binned (like your cpu example), but it is, seemingly, the exact same silicon.



Exactly, that's the point, the problem is all in the naming of the card, they should have called it something different, not rx580.


----------



## notb (Apr 23, 2017)

uuuaaaaaa said:


> Exactly, that's the point, the problem is all in the naming of the card, they should have called it something different, not rx580.


Oh come on... it's just a string of characters. They can call it whatever they want - as long as they don't advertise the card as a new product. And they don't - it's openly a refresh with slight improvements in manufacturing (which seems to be true).
Honestly, this GPU has bigger issues than the name.


----------



## Shatun_Bear (Apr 24, 2017)

uuuaaaaaa said:


> 1266 MHz @ 890 mV? *Back when the 480 was released, some cards couldn't even do 1300 MHz at 1200 mV, and those that could get into 1400 MHz territory usually needed 1250+ mV*. Of course, there were some "alien" cards among the best AIB models that could do 1400 MHz @ 1150 mV, like some "special" XFX GTRs or some "special" Sapphire Nitro+ cards. With this revision of Polaris it is much more common to get such clocks at much lower voltages; the MSI RX 580 in the video did 1400 MHz at 1050 mV. These are not straight rebrands: the architecture is the same, but the manufacturing process is better, more refined.
> 
> I see this like the 3.2 GHz Phenom II X4 955 BE vs. the 3.7 GHz Phenom II X4 980 BE: essentially the same architecture, but the latter had much higher clocks than the former at the same TDP. On HWBOT, the average OC for the 955 on water is 4138 MHz, while the average OC on air for the 980 is 4270 MHz. Despite being the same architecture, would you say the 980 is a rebranded 955? Sure, you can get a 955 working at stock 980 clocks, but when you push the clocks on both, the 980 will easily beat the 955 on average.
> 
> I'm not saying that what AMD did is 100% correct. This should have been called RX 485, RX 480 XTX, RX 480 XT PE, or something similar, not RX 580. But these are not straight rebrands; that's how I see it.



Spot on. I had two 480s, and it's obvious these chips, or at least the manufacturing process used to create them, have improved a lot.

My first Nitro 480 hit a wall at 1340 MHz. The second would only get to 1380 MHz stable and without throttling. The RX 500 series clocks around +100 MHz over the older 480s.


----------



## EarthDog (Apr 25, 2017)

Shatun_Bear said:


> Spot on. I had two 480s, and it's obvious these chips, or at least the manufacturing process used to create them, have improved a lot.
> 
> My first Nitro 480 hit a wall at 1340 MHz. The second would only get to 1380 MHz stable and without throttling. The RX 500 series clocks around +100 MHz over the older 480s.


The PCB for the Nitro is different... not the silicon... that can certainly help.

These are really no different... just more voltage and more clocks... perhaps better binning. But it's still an overclocked 480.


----------



## Bruno_O (Apr 30, 2017)

Guys, mine is using ~36 W at idle (single monitor; 1080p or 4K makes no difference). Any clues?
I've tried down-clocking, but it has no effect, as the card is already going into its lower states (both GPU and memory at 300 MHz).
Used both the latest WHQL and Beta drivers on a fresh Creators Update installation.

On the flip side, undervolted to 1025 mV @ 1340 MHz, it's using 150 W peak =)


----------



## gasolin (May 6, 2017)

Why isn't there a GTX 980 to compare to the RX 580, only the GTX 980 Ti and 1060?


----------



## Fluffmeister (May 6, 2017)

gasolin said:


> Why isn't there a GTX 980 to compare to the RX 580, only the GTX 980 Ti and 1060?



It's a good question; the full GM204 chip is a performance-per-watt monster.


----------



## gasolin (May 6, 2017)

Is the GTX 980 too close to the RX 580? I mean, an almost 3-year-old GPU as fast as the newest and fastest AMD card, the RX 580?


----------



## uuuaaaaaa (May 6, 2017)

gasolin said:


> Why isn't there a GTX 980 to compare to the RX 580, only the GTX 980 Ti and 1060?





Fluffmeister said:


> It's a good question; the full GM204 chip is a performance-per-watt monster.





gasolin said:


> Is the GTX 980 too close to the RX 580? I mean, an almost 3-year-old GPU as fast as the newest and fastest AMD card, the RX 580?



What about the R9 Nano? Epic card imho.


----------



## gasolin (May 6, 2017)

That card is also missing. What is the point of reviewing a GPU if you can't compare it to the cards that might be alternatives to it?

The reviews of the Nano and GTX 980 are so old that most, if not all, of the games used are different from those in the RX 580 review, so it's hard to compare an old and a new GPU.


----------



## gamerman (Nov 10, 2018)

Still, I can't believe what I've seen. TechPowerUp gave an "Editor's Choice" award to that junk.

I mean, it loses to the GTX 1060 6GB in many games, yet it needs almost 100% more power!

It's interesting that when AMD releases a NEW GPU, NVIDIA only needs a simple memory update to that stone-age GTX 1060 6GB to beat the RX 590. And again, we must remember how much more juice the RX 590 needs to compete: almost 100% more power!

In fact, even the Jurassic-era GTX 980 Ti beats the RX 590! Think about it.

How can that kind of lousy GPU be rewarded?!

No way, come on TPU!

It doesn't earn any award. That's it.

I post this because the RX 590 is coming soon, which is exactly like the card in this review but built on 7 nm tech and, because of that, overclocked sky-high, so everyone knows its power draw will be even higher. I guess near 300 W peak and 240-260 W in gaming.

It's terrible, lousy efficiency!

The game speed is just marginally different, I guess 2-5 FPS at 1080p, and 1-3 at higher resolutions. So, nothing.

Keep the standards high, TPU. Efficiency must be one of the most important things these days!


----------



## Kissamies (Nov 10, 2018)

Nice necrobump there. Can't you see that the review is over 1½ years old?

And stop the AMD hate trolling you've always done on io-tech and Muropaketti. Your messages in Finnish are hard to read, and this one in English is almost unreadable.


----------



## Captain_Tom (Nov 12, 2018)

gamerman said:


> Still, I can't believe what I've seen. TechPowerUp gave an "Editor's Choice" award to that junk.
> 
> I mean, it loses to the GTX 1060 6GB in many games, yet it needs almost 100% more power!
> 
> ...



Almost nothing you said is correct. Nothing, lol.


----------

