# MSI RX 480 Gaming X 8 GB



## W1zzard (Aug 1, 2016)

MSI's Radeon RX 480 Gaming X is the first custom-design RX 480 with a good cooler that manages low noise levels and temperatures, which is the only way to compete with the GTX 1060. The card is also overclocked out of the box and includes user-adjustable RGB lighting.

*Show full review*


----------



## Chaitanya (Aug 1, 2016)

How long before reviews of Sapphire cards?


----------



## W1zzard (Aug 1, 2016)

Chaitanya said:


> How long before reviews of Sapphire cards?


Haven't heard anything from Sapphire yet regarding reviews of their cards, but reviews on other sites seem to indicate they aren't doing so well.


----------



## darkangel0504 (Aug 1, 2016)

Okay . . .


> The game uses an in-house game engine by IO Interactive that is one of the first to leverage DirectX 12. The game's DirectX 12 implementation, however, is too riddled with bugs to be integrated into our test bench for now. In this test, we're testing the game in DirectX 11 mode, which is still reasonably taxing on high-end GPUs.


----------



## W1zzard (Aug 1, 2016)

darkangel0504 said:


> Okay . . .


DX12 in Hitman has been fixed since, on next rebench it will use DX12


----------



## natr0n (Aug 1, 2016)

One thing that bothers me about MSI's AMD cards: they always leave 1-2 RAM chips uncovered. It's been like this for years.

MSI's Nvidia cards are almost always completely covered by the heatspreader.


----------



## sutyi (Aug 1, 2016)

Power consumption seems a tad bit high on these overclocked AIB RX 480 boards. 

That +20% power consumption is not really worth the +4-5% extra performance in my eyes.


----------



## matteo (Aug 1, 2016)

Nice review. It would have been great if the driver version were 16.7.3, since that improves performance by a lot in Tomb Raider. I'd also love to see Hitman in DX12 rather than old DX11, and Doom on OpenGL/Vulkan too.
Great work on the review. I think I'll lean toward one of those RX 480s for my next home factotum build in a mATX case. I feel Nvidia is a bit anchored on the old DX11 side for my taste, and with the rumor of Volta's arch around the corner, I might turn to AMD this time for my mainstream rig. It's impressive how well this card performs in certain scenarios, catching up to the 980 Ti.
In some games the card seems to lose too much, like the Battlefield series and AC too. I hope it's a driver-related problem.


----------



## buggalugs (Aug 1, 2016)

sutyi said:


> Power consumption seems a tad bit high on these overclocked AIB RX 480 boards.
> 
> That +20% power consumption is not really worth the +4-5% extra performance in my eyes.



It is when one card is silent and the other sounds like a vacuum cleaner.

Hey W1zzard, you said in the review:

"Nearly all board partners have thus far either struggled to come up with a cooling solution that successfully manages the heat output of AMD's RX 480 GPU, or they have failed to come up with a proper configuration for their fan settings. MSI's RX 480 Gaming X, however, acts like most other MSI Gaming cards,"

 When you say "nearly all board partners" who are you referring to? In the reviews I can only see this MSI 480 and the Asus Strix 480. How are you coming to this conclusion.?? Have you tested more but reviews arent finished or something?

I find it a little confusing how a 150 W card (even if it's a little more than 150 W) is having problems with heat. Pretty much all the AIB partner coolers have run virtually silent for a few generations now, even on high-end 250 W cards. Even in the current generation, other, higher-end cards draw more power than the 480. Most of these coolers are designed to handle up to 250-275 W. I don't understand it.


----------



## KainXS (Aug 1, 2016)

natr0n said:


> One thing that bothers me about MSI's AMD cards: they always leave 1-2 RAM chips uncovered. It's been like this for years.
> 
> MSI's Nvidia cards are almost always completely covered by the heatspreader.



It is a little annoying really. It's probably because AMD designed the 480 with the RAM too close to the heatsink screw holes on the reference design. The only card I've seen that doesn't have that so far is the Nitro, since it doesn't use a metal plate but the heatsink base itself (sadly, the heatsink looks narrow, though).

The power consumption is also a little sad, but some of the partners are saying it's to stabilize the boost clock when OC'd, so that's OK for me. Everyone should know by now that Polaris is power hungry when overclocking.



W1zzard said:


> Haven't heard anything from Sapphire yet regarding reviews of their cards, but reviews on other sites seem to indicate they aren't doing so well.



Hmm, is something wrong with the Sapphire cards?


----------



## Ferrum Master (Aug 1, 2016)

Looking at Tomb Raider and Fury X performance... looks really odd.


----------



## bug (Aug 1, 2016)

sutyi said:


> Power consumption seems a tad bit high on these overclocked AIB RX 480 boards.
> 
> That +20% power consumption is not really worth the +4-5% extra performance in my eyes.


What about the +$35 over MSRP?
Nvidia was supposed to be evil one this round because of FE, but it seems the RX480 is selling above MSRP as well.


----------



## rtwjunkie (Aug 1, 2016)

KainXS said:


> Hmm, is something wrong with the Sapphire cards?



I wish it weren't so, since I have been counting on Sapphire to deliver, but some of the other reviews paint it as not as good as the MSI version. In truth, it's still a great performer on its own merits though.


----------



## sutyi (Aug 1, 2016)

buggalugs said:


> It is when one card is silent and the other sounds like a vacuum cleaner.



That might be true for some, but I would prefer that 150-165 W power budget with the same cooling solution. That would make it even more silent and probably cooler by a couple of degrees. It would probably hold the reference's 1266 MHz boost clock relatively stable too.


----------



## Chaitanya (Aug 1, 2016)

W1zzard said:


> Haven't heard anything from Sapphire yet regarding reviews of their cards, but reviews on other sites seem to indicate they aren't doing so well.


I thought if anyone would have had good AMD cards it would have been Sapphire, but it's very unusual that they have managed to miss the opportunity on these cards.


----------



## raghu78 (Aug 1, 2016)

W1zzard,
Why weren't the 16.7.3 WHQL drivers used for this review? That driver brought significant gains for the RX 480 in RoTR, especially in DX12. Hexus used it for the Sapphire Nitro RX 480 OC review, which came out on July 29th (3 days before this review).

http://hexus.net/tech/reviews/graphics/94969-sapphire-radeon-rx-480-nitro-4gb-8gb-oc/?page=10

Anyway, the gap between the RX 480 and GTX 1060 should shrink with Hitman DX12 and RoTR DX12. BTW, do you have any plans for adding Doom Vulkan? Right now your test suite is dated, with very few 2016 games.


----------



## Caring1 (Aug 1, 2016)

1060 wins again, and it's cheaper where I live.


----------



## Fluffmeister (Aug 1, 2016)

Wow, its performance per watt has taken a serious nosedive, but in the grand scheme of things it's a decent card, although it's clear the 1060 is an excellent alternative in this price range. Competition FTW!


----------



## Captain_Tom (Aug 1, 2016)

bug said:


> What about the +$35 over MSRP?
> Nvidia was supposed to be evil one this round because of FE, but it seems the RX480 is selling above MSRP as well.



It's because demand is so high - a very big difference.  Nvidia used the FE pricing in order to make it look like they weren't jacking up prices.


----------



## bug (Aug 1, 2016)

Captain_Tom said:


> It's because demand is so high - a very big difference.  Nvidia used the FE pricing in order to make it look like they weren't jacking up prices.


I see. So the 480 sells above MSRP because of high demand and the 1060 because of different reasons. I don't need further arguments, I'm sure both Nvidia and AMD report their sales figures directly to you.


----------



## Roph (Aug 1, 2016)

Why do you need sales figures? Nvidia creates a price range - $250-300. So you could argue that a $299 1060 isn't "overpriced" because the founders edition is even more overpriced.

AMD's got a single MSRP - $200 for 4GB and $230 for 8GB.

I'm waiting to buy an AIB RX480 here in the UK, there are tons of GTX 1060s to choose from in stock, it's impossible to buy an AIB 480 right now. Constantly out of stock.


----------



## Enterprise24 (Aug 1, 2016)

So 480 non ref max OC = 1060 ref stock.


----------



## Caring1 (Aug 1, 2016)

Roph said:


> Why do you need sales figures? Nvidia creates a price range - $250-300. So you could argue that a $299 1060 isn't "overpriced" because the founders edition is even more overpriced.
> 
> AMD's got a single MSRP - $200 for 4GB and $230 for 8GB.
> 
> I'm waiting to buy an AIB RX480 here in the UK, there are tons of GTX 1060s to choose from in stock, it's impossible to buy an AIB 480 right now. Constantly out of stock.


But those prices are in U.S. dollars, and don't always equate to local currency.
It seems prices fluctuate according to demand in different countries.
Nvidia is slightly lower priced here (Australia) than in the U.S., in comparison to AMD cards.


----------



## bug (Aug 1, 2016)

Roph said:


> Why do you need sales figures? Nvidia creates a price range - $250-300. So you could argue that a $299 1060 isn't "overpriced" because the founders edition is even more overpriced.
> 
> AMD's got a single MSRP - $200 for 4GB and $230 for 8GB.
> 
> I'm waiting to buy an AIB RX480 here in the UK, there are tons of GTX 1060s to choose from in stock, it's impossible to buy an AIB 480 right now. Constantly out of stock.


Funny, I can see plenty of 480s on overclockers.co.uk...


----------



## Figus (Aug 1, 2016)

bug said:


> Funny, I can see plenty of 480s on overclockers.co.uk...


They are ALL reference cards.


----------



## Recus (Aug 1, 2016)

bug said:


> Funny, I can see plenty of 480s on overclockers.co.uk...



No AIB models. Also people still waiting for Sapphire Toxic Unicorn edition which will oc till 1.6 GHz and will be faster than GTX 1070.


----------



## GhostRyder (Aug 1, 2016)

Well, it's the best overclocker of the lot so far... but still completely disappointing overall. The only thing this card really has going for it is the cheap price of the reference models, as the other models' prices seem to eclipse the 1060's. Unfortunate... Hopefully some cheap aftermarket 4 GB cards will hit the shelves, or the price of the 8 GB variant will come down a tad.


----------



## okidna (Aug 1, 2016)

Roph said:


> I'm waiting to buy an AIB RX480 here in the UK, there are tons of GTX 1060s to choose from in stock, it's impossible to buy an AIB 480 right now. *Constantly out of stock.*



It's not out of stock, it's not on sale yet, most of the AIB cards (Sapphire, XFX, ASUS, MSI, Gigabyte, PowerColor) will be on sale in mid-August.


----------



## daimonass (Aug 1, 2016)

Where is the test with DX12?


----------



## rtwjunkie (Aug 1, 2016)

daimonass said:


> Where is the test with DX12?



He stated in post #5 that DX12 just got fixed in Hitman, and that he will rebench later in DX12.
https://www.techpowerup.com/forums/threads/msi-rx-480-gaming-x-8-gb.224494/#post-3498559

"DX12 in Hitman has been fixed since, on next rebench it will use DX12"


----------



## efikkan (Aug 1, 2016)

GTX 1060 remains the superior choice.


----------



## bug (Aug 1, 2016)

efikkan said:


> GTX 1060 remains the superior choice.


It's slightly more power for slightly more $$$ (i.e., not superior if you don't have the extra $20-30). I'll probably go for the 1060 because of the Linux drivers (yet again). It seems AMD hasn't fixed any of their OpenGL implementation issues in the new driver (which seems to actually be the old driver, but with some parts open sourced).


----------



## LooKMaYnE (Aug 1, 2016)

Custom GTX1060s are actually cheaper in EU than custom RX480s.


----------



## wiak (Aug 1, 2016)

Hey @W1zzard, can you change the 900 px full-size images to 1920 or 2560 px full-size images in the reviews? The current ones look pretty bad on a 1200p monitor and horrible on a 1440p/4K one.
Great review as always.


----------



## Air (Aug 1, 2016)

buggalugs said:


> It is when one card is silent and the other sounds like a vacuum cleaner.
> I find it a little confusing how a 150 W card (even if it's a little more than 150 W) is having problems with heat. Pretty much all the AIB partner coolers have run virtually silent for a few generations now, even on high-end 250 W cards. Even in the current generation, other, higher-end cards draw more power than the 480. Most of these coolers are designed to handle up to 250-275 W. I don't understand it.



1. Because it's not really 150 W. In "typical gaming", the reference card uses 163 W; this one from MSI uses 196 W, close to the reference Fury (200 W) and the reference GTX 980 Ti (211 W).
2. The die area is much smaller: 232 mm², while the GTX 980 Ti and Fury are both around 600 mm². It's like having to remove almost the same amount of heat through a much smaller "pipe".

That said, the results for this card are pretty good.
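The "smaller pipe" point can be made concrete with a quick back-of-the-envelope heat-density calculation, using the wattage and die-area figures from the post above (board power is not all dissipated in the die, so treat this as a rough upper bound, not a measurement):

```python
def heat_density(power_w: float, die_area_mm2: float) -> float:
    """Approximate heat flux through the die, in W/mm^2."""
    return power_w / die_area_mm2

# Figures quoted in the post: MSI RX 480 (196 W, 232 mm^2)
# vs. reference GTX 980 Ti (211 W, ~600 mm^2).
polaris = heat_density(196, 232)
gm200 = heat_density(211, 600)

print(f"RX 480:     {polaris:.3f} W/mm^2")
print(f"GTX 980 Ti: {gm200:.3f} W/mm^2")
print(f"ratio:      {polaris / gm200:.1f}x")
```

At similar board power, the heat flux per unit of die area comes out roughly 2.4× higher on Polaris 10, which is one reason coolers sized for 250 W big-die cards don't automatically translate to a small die.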


----------



## Fluffmeister (Aug 1, 2016)

Recus said:


> No AIB models. Also people still waiting for Sapphire Toxic Unicorn edition which will oc till 1.6 GHz and will be faster than GTX 1070.



Man, that thing would suck power like it's going out of fashion!


----------



## Captain_Tom (Aug 1, 2016)

bug said:


> I see. So the 480 sells above MSRP because of high demand and the 1060 because of different reasons.



Yes, so you can read.


----------



## Captain_Tom (Aug 1, 2016)

Air said:


> 1. Because it's not really 150 W. In "typical gaming", the reference card uses 163 W; this one from MSI uses 196 W, close to the reference Fury (200 W) and the reference GTX 980 Ti (211 W).
> 2. The die area is much smaller: 232 mm², while the GTX 980 Ti and Fury are both around 600 mm². It's like having to remove almost the same amount of heat through a much smaller "pipe".
> 
> That said, the results for this card are pretty good.



AMD chose the 14 nm process for 2 big reasons:

1) It is MUCH cheaper than TSMC's 16 nm, and that allows them to be ultra competitive in the low- to high-end brackets. People just never seemed to notice when AMD took the performance crown. As such, AMD finally understands that their best days were the 4000-6000 series, where they dominated price/perf and rarely worried about halo products.

2) GloFo's 14 nm process is 9% denser than TSMC's 16 nm. As it matures, its clock speeds should mostly catch up to Nvidia's, and when they do, AMD WILL be able to scale higher than Nvidia can at the top end. Expect Nvidia to hit a wall at about 16 billion transistors while AMD manages to scale to 18-20 billion. Sure, it will probably use 320 W, but who cares if it manages to win by 10% performance and costs less to make?


----------



## efikkan (Aug 1, 2016)

Captain_Tom said:


> GloFo's 14 nm process is 9% denser than TSMC's 16 nm. As it matures, its clock speeds should mostly catch up to Nvidia's, and when they do, AMD WILL be able to scale higher than Nvidia can at the top end. Expect Nvidia to hit a wall at about 16 billion transistors while AMD manages to scale to 18-20 billion. Sure, it will probably use 320 W, but who cares if it manages to win by 10% performance and costs less to make?


What Samsung/GloFo calls "14 nm FinFET" is roughly equivalent to what TSMC calls "16 nm FinFET", and would be called "20 nm FinFET" if it were made by Intel. Both processes are basically a 20 nm node with FinFET, they have chosen a different measure just for marketing.


----------



## xkm1948 (Aug 1, 2016)

AMD cards won't get better in old DX11/OpenGL games; the design is not geared toward DX11 at all. However, DX11 still represents the majority of games on the market right now, so W1zzard's review is spot-on in reflecting how the RX 480 does in current and older games. With newer games utilizing the new APIs, we may see the RX 480 improve over time. DOOM is a glimpse into how the RX 480 will do in the future.


----------



## dj-electric (Aug 1, 2016)

I read this review and said loudly to myself:

"This RX 480 card consumes 207 watts in gaming."

I was not happy after that. This is a ridiculously high amount of power compared to cards like third-party GTX 1060s.
It is alarmingly, irresponsibly high for a mid-range 14 nm card.


----------



## the54thvoid (Aug 1, 2016)

Dj-ElectriC said:


> I read this review and said loudly to myself:
> 
> "This RX 480 card consumes 207 watts in gaming."
> 
> ...



Power consumption only mattered in the Fermi days. Then, when AMD started to use more power and Nvidia started to use less (relatively speaking), power efficiency suddenly no longer mattered. Funny that.

For perf/watt you're better off with a Fury or a Fury X (on the AMD side). In the UK you can buy a Fury (non-X) for £300, which is a damn good deal and has that fangly HBM that everyone bitches about.


----------



## dj-electric (Aug 1, 2016)

Power consumption ALWAYS matters.
It's what drives the mobile computer market, and it gives GPUs headroom to squeeze more performance out of.

If you asked me whether I would like a card that performs like X and consumes 120 W in gaming, compared to one that performs like X+5% and consumes 200 W, I would take the first one in a heartbeat.
Here, we have a dystopian situation where the X+5% card is at the 120 W mark and the X card is at 200 W. This is "Biff's alternate timeline in Back to the Future" bad.

It raises the question of what an RX 480 +50% performance card will look like. Red hardware will cross the 300 W mark at 14 nm at this rate (we know how it looks on the Nvidia side). That means more heat, more noise, and gigantic chunks of metal to cool it all.

You need efficiency to move forward.


----------



## geon2k2 (Aug 1, 2016)

efikkan said:


> What Samsung/GloFo calls "14 nm FinFET" is roughly equivalent to what TSMC calls "16 nm FinFET", and would be called "20 nm FinFET" if it were made by Intel. Both processes are basically a 20 nm node with FinFET, they have chosen a different measure just for marketing.



How did you reach such a conclusion? There must be some difference between them, not just marketing.
Maybe their process does not allow them to pack the transistors as tightly as Intel, and by leaving more space between transistors they are wasting wafer space. It would be nice if you could provide more details about this.


----------



## geon2k2 (Aug 1, 2016)

Dj-ElectriC said:


> Power consumption ALWAYS matters.
> It's what drives the mobile computer market, and it gives GPUs headroom to squeeze more performance out of.
> 
> If you asked me whether I would like a card that performs like X and consumes 120 W in gaming, compared to one that performs like X+5% and consumes 200 W, I would take the first one in a heartbeat.
> ...



Efficiency has many variables.
See this link: http://www.legitreviews.com/amd-radeon-rx-480-undervolting-performance_183699

With a 0.05 V decrease, they reduced power consumption by 10-30 W, making this a ~130 W card instead of 150 W, while also getting more performance due to less throttling. From what I see, MSI has done the opposite just to squeeze out a little more performance. You need to strike the right balance between clocks and voltage and you will get great efficiency. Maybe with 0.1 V less and 100 MHz less they could get fantastic, off-the-chart efficiency, but then they would not meet the desired performance target, which was "VR Ready".

I used to have a Phenom II with a stock voltage of 1.35 V running at 3.0 GHz. I ran it for most of its life in my PC at 3.6 GHz and 1.25 V. Considering that power consumption is roughly proportional to frequency times the square of the voltage, I would say I improved the efficiency of that Phenom II by 30-40% with very little tweaking (I didn't do the exact calculation; there is a formula somewhere on the net if you want to see it, but it is more than the 20% that comes directly from the overclock). I think AMD should do a bit more binning at the factory and be a bit more aggressive, but then there might be a risk that in some scenario (maybe Furmark with a renamed exe) it will hang, so they don't want to take this risk.
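The dynamic-power rule mentioned above (P ∝ f·V²) can be sketched in a few lines. The 1.15 V stock voltage below is an assumed figure for illustration only, not a value from the review or the linked article:

```python
def dynamic_power_ratio(f_new: float, v_new: float,
                        f_old: float, v_old: float) -> float:
    """Approximate CMOS dynamic power scaling: P ~ f * V^2.
    Returns new power as a fraction of old power."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Hypothetical RX 480 undervolt: same 1266 MHz clock,
# 1.15 V -> 1.10 V (a 0.05 V drop, as in the linked article).
ratio = dynamic_power_ratio(1266, 1.10, 1266, 1.15)
print(f"power after -0.05 V: {ratio:.1%} of stock")  # ~91.5% of stock
```

Under these assumed numbers, a 0.05 V undervolt alone trims dynamic power by roughly 8-9%, in the same ballpark as the lower end of the 10-30 W board-level savings quoted above; leakage and reduced throttling account for the rest.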


----------



## dj-electric (Aug 1, 2016)

What I wanted to see is companies making their own adjustments.

I am also looking forward to the Samsung-made GPUs; it is clear as day to me that they will be of much higher quality than GloFo's.
I'm still hoping for a 30%+ increase in the perf/watt ratio. In its current Maxwell-like state, even stronger GPUs than Polaris 10 just won't appeal to me.

My plan is to swap my 290Xs out for 180 W-ish GPUs.


----------



## dyonoctis (Aug 1, 2016)

Dj-ElectriC said:


> Power consumption ALWAYS matters.
> It's what drives the mobile computer market, and it gives GPUs headroom to squeeze more performance out of.
> 
> If you asked me whether I would like a card that performs like X and consumes 120 W in gaming, compared to one that performs like X+5% and consumes 200 W, I would take the first one in a heartbeat.
> ...



AMD has definitely lost the mobile market. Unless they do a custom chip with HBM, there is no point in making an RX 480M when the 1060M will be faster and way more power efficient. Assuming that Vega will be the RX 490, I don't think the red team will hit the 300 W mark. I rather think it will be like Fury vs. 390: way moar cores for equal/lower power. Unless AMD chooses to go nuts and releases, out of the blue, a 3072-core GCN part with GDDR5.
Then there is Navi, which we don't know **** about but which is planned to be more power efficient.

(However, if Nvidia is planning to go full HBM for its next gen, then it's gonna be trouble again. The huge lead they have in power efficiency will get bigger, and they will be able to cram a shitload of shader units onto their GPUs.)


----------



## Jism (Aug 2, 2016)

AMD did not leave much more headroom for OCs, so every card (reference or AIB) is looking at an already maxed-out chip where OCs are really marginal. Yes, it could reach 1500 MHz, but probably not for 24/7 usage; just good enough for one run and nothing more.

This is the process limitation. It suits 1200 up to 1400 MHz best, in very rare cases more, but anything above is literally a no-go on air.

It's not the VRM, which is overdesigned up to 250 W. It's not the 6-pin vs. 8-pin header either; a 6-pin can carry roughly 300 up to 400 W, far above spec. It's not the cooling as well; a 130 W TDP chip (there is some TDP loss at the VRM & memory chips as well) is very easy to cool.

Nvidia's approach is different: they render frames based on tiles rather than one frame at a time. This puts all the 'cores' to work and, with the right programming, utilizes all available cores.

The moment AMD applies the very same rendering technique, Nvidia can pack up their cards and brainstorm for something new, because AMD cards have much better raw performance than Nvidia's in general; they just need to utilize it in a more efficient manner.

The RX 480 is not a bad card or chip. When was the last time we had a chip for $200 up to $270 with this type of performance? It's been a while, and we should all praise AMD for giving us a chip that does very, very well up to WQHD resolutions. Combine that with the Vulkan API and you have a very decent and beefy card. Wait 1 or 2 years, buy yourself a second RX 480, and double the performance in many games. What more do you want?


----------



## deu (Aug 2, 2016)

Enterprise24 said:


> So 480 non ref max OC = 1060 ref stock.



In that game and at that resolution, yes; I'm not sure if you're aiming for a specific point. You could also say that the 480 = 980 Ti if you take Hitman or DOOM with Vulkan (almost), but I'd say that's a pretty narrow comparison. The reason Battlefield 3 is used in the OC test (as far as I understand) is to best show the percentage gain in performance from overclocking. You could argue that BF3 may be a little outdated and that newer games will benefit differently from the average OC, but I assume W1zzard knows when it's time to change this.


----------



## deu (Aug 2, 2016)

dyonoctis said:


> AMD has definitely lost the mobile market. Unless they do a custom chip with HBM, there is no point in making an RX 480M when the 1060M will be faster and way more power efficient. Assuming that Vega will be the RX 490, I don't think the red team will hit the 300 W mark. I rather think it will be like Fury vs. 390: way moar cores for equal/lower power. Unless AMD chooses to go nuts and releases, out of the blue, a 3072-core GCN part with GDDR5.
> Then there is Navi, which we don't know **** about but which is planned to be more power efficient.
> 
> (However, if Nvidia is planning to go full HBM for its next gen, then it's gonna be trouble again. The huge lead they have in power efficiency will get bigger, and they will be able to cram a shitload of shader units onto their GPUs.)


AMD actually has quite a lot of the mobile market with APUs, but yes, they have NO gaming solutions whatsoever.


----------



## CounterSpell (Aug 2, 2016)

So... in what particular aspect is the RX 480 better than the GTX 1060?

I just want to understand why people claim the RX 480 is better than the 1060; all the benchmarks say otherwise...

I'm not a fan of Nvidia or AMD. I just want to make a good choice, and lately, everything points to the GTX 1060 instead of the RX 480.


----------



## HumanSmoke (Aug 2, 2016)

[OT]


Captain_Tom said:


> AMD chose the 14nm process for 2 big reasons:
> 
> 1) It is MUCH cheaper than TSMC's 16nm


Show proof or quit with the guerrilla marketing. Neither TSMC nor Samsung (nor any other foundry) releases wafer costs - especially costs weighted for yield. What is known is that Samsung's 14nm LPC (the successor to LPP) and TSMC's 16nm FFC both offer similar production cost reductions because fewer mask steps (8-10) are required for complex ICs.


Captain_Tom said:


> 2) GloFo's 14 nm process is 9% denser than TSMC's 16 nm. As it matures, its clock speeds should mostly catch up to Nvidia's, and when they do, AMD WILL be able to scale higher than Nvidia can at the top end. Expect Nvidia to hit a wall at about 16 billion transistors while AMD manages to scale to 18-20 billion. Sure, it will probably use 320 W, but who cares if it manages to win by 10% performance and costs less to make?


Again, show proof* or quit with the guerrilla marketing.
1. Samsung hasn't proven to be able to scale their process to anywhere close to that of TSMC.
2. Even if Samsung could produce a monolithic GPU in the 550-600mm^2 range, the increased transistor density will require a lower transistor switching voltage and lower frequencies as is the case currently.
3. You seem to think that Samsung/GloFo's process advancements occur in a vacuum. TSMC's process also improves as the node matures. Answer me this: if Samsung is poised to leave TSMC in its wake, why has its (and TSMC's) largest customer, Apple, ditched Samsung for the A10, with insider talk pointing to Samsung's ramp, yield, and heat/power issues? Apple has access to both Samsung's 14LPC and TSMC's 16FFC projections and seems to have reached a definitive conclusion... are you going to tell me Apple makes foundry-partner manufacturing decisions based on frivolous and arbitrary reasons?

* Actual proof not rumour/claims from some random on WTFtech

[/OT]
On topic: Nice looking card. Wouldn't buy. The Sapphire Nitro card looks to be almost 20% (!) cheaper locally.


----------



## Jism (Aug 2, 2016)

CounterSpell said:


> So... in what particular aspect is the RX 480 better than the GTX 1060?
> 
> I just want to understand why people claim the RX 480 is better than the 1060; all the benchmarks say otherwise...
> 
> I'm not a fan of Nvidia or AMD. I just want to make a good choice, and lately, everything points to the GTX 1060 instead of the RX 480.



- asynchronous compute (see Vulkan / Doom, for example)
- fully DX12 capable
- best bang for the buck currently (more available than Nvidia)
- new card, so driver updates will follow as well
- and because you can; AMD is just more fun

- and you get to be cool with the other AMD fanboys as well. 

Remember that asynchronous compute and Vulkan will play a very important role in future games, and since AMD is in both major console brands (MS / Sony), this will become more and more important for devs to work on. You'll get to see more and more from AMD very soon, especially when the HBM2 party kicks off.

I'd put my money on an RX 480 for WQHD resolutions, and within a year buy a second one for less than $230 and increase performance in games by up to 50%.


----------



## Malabooga (Aug 2, 2016)

CounterSpell said:


> so... in what particular aspect does rx480 is better than gtx1060?
> 
> I just want to understand why people claim RX480 is better than 1060.. all the benchs say otherwise...
> 
> Im not a fan of nvidia or amd...i just want to make a good choice, and lately, everything points to gtx1060, instead of rx480



Because it wins all DX12 games except the Nvidia-sponsored ROTTR (in which they somehow managed to make AMD cards run slower in DX12 than in DX11, yet somehow that game got chosen as the only DX12 representative) and Vulkan (Doom).

Unfortunately, this benchmark suite is pretty irrelevant when it comes to choosing a card for the next 2-3 years. I have an Nvidia Maxwell card, and DX12/Vulkan has been nothing but a huge disappointment: the drivers are awful and performance is really bad. That's something you don't see in this review (well, if you knew Hitman's history: Nvidia had huge problems with that game, and that's the reason it's not tested in DX12. I wonder why the author doesn't mention that, also).

Some sites have put up analyses of DX11 and DX12/Vulkan separately, and while the GTX 1060 wins DX11 by 5-10%, the RX 480 wins DX12/Vulkan by 15-20%.


----------



## CounterSpell (Aug 2, 2016)

Malabooga said:


> Some sites have put up analyses of DX11 and DX12/Vulkan separately, and while the GTX 1060 wins DX11 by 5-10%, the RX 480 wins DX12/Vulkan by 15-20%.


can you link them?


----------



## Malabooga (Aug 2, 2016)

And that's without Doom (Vulkan) included, which the GTX 1060 loses by 25-30%.

So this last development with Doom Vulkan pretty much sealed the deal for me: I'm dumping my GTX 970 and picking up an RX 480 4GB for a small markup (maybe I'll even pick up the 8GB one if the price is right).

For me there's no doubt: the RX 470 4GB and RX 480 4GB are pretty much unchallenged for the mainstream this gen so far. I'm not sure why the GTX 1060 is even a thing, as it's quite overpriced compared to these and not really hardware-equipped for the new APIs (async compute is never coming for my card; another lie from Nvidia...).


----------



## Fluffmeister (Aug 2, 2016)

Hmm, the 1060 does just dandy in those tests (whilst using less power), and an i7-5960X CPU?

Again, the realistic picture for the average Joe isn't being shown.


----------



## evernessince (Aug 2, 2016)

Is the RX 480 a good card? Yes, but it isn't the card AMD needed. Instead, what we have is a good card for the price, with decent power consumption, that can't overclock. The only two things it has over the 1060 are DX12 support and price. Once again, neither card is a clear-cut winner, and Nvidia will win out simply because of its brand.


----------



## Captain_Tom (Aug 2, 2016)

efikkan said:


> What Samsung/GloFo calls "14 nm FinFET" is roughly equivalent to what TSMC calls "16 nm FinFET", and would be called "20 nm FinFET" if it were made by Intel. Both processes are basically a 20 nm node with FinFET, they have chosen a different measure just for marketing.



They are different processes, bud. Their 14 nm node is literally more dense.


----------



## Captain_Tom (Aug 2, 2016)

evernessince said:


> Is the RX 480 a good card? Yes, but it isn't the card AMD needed. Instead, what we have is a good card for the price, with decent power consumption, that can't overclock. The only two things it has over the 1060 are DX12 support and price. Once again, neither card is a clear-cut winner, and Nvidia will win out simply because of its brand.



It is selling out like crazy while having flooded the market.  It is capturing marketshare - which is exactly what AMD needs.


----------



## sweet (Aug 2, 2016)

Captain_Tom said:


> It is selling out like crazy while having flooded the market.  It is capturing marketshare - which is exactly what AMD needs.


It has been selling extremely well, but I'm not sure about "capturing marketshare" as my fellow miners bought a lot of them.


----------



## shk021051 (Aug 2, 2016)

Doom (vulkan)?


----------



## Shatun_Bear (Aug 2, 2016)

I've been browsing TPU for years, but am I right in thinking that the review's Performance Summary, which so many put so much stock in, is based on the cards and games benched in the test? Sorry, I know it sounds obvious, but I just want to be sure; otherwise I'll have egg on my face with my next comment.

So if it is, and you throw in more recent games like:

*DOOM* running under Vulkan (which any AMD user would set the game to)
*Total War: Warhammer* DX12
*Hitman* DX12

The performance summary would be very different, as the performance advantage the 480 has in DOOM and Hitman is huge, and it's noteworthy in Warhammer too. For me personally, I would take out at least two old games from the performance summary and replace them with DOOM and WH at least; I would probably replace Crysis 3 and BF3, as they are very old. I did ask W1zzard when he would include DOOM, and apparently that is coming. Shame it wasn't ready for this review.


----------



## Shatun_Bear (Aug 2, 2016)

Some more observations for people who just see the numbers and don't investigate a little. Those making posts like 'the 1060 wins again' are funny. It's all about perception and the factors in each review.

- The power consumption figures are massively off from KitGuru's review of custom cards. KG have some of the best PSU reviews around, so I would be inclined to take their power figures seriously. KG has the power consumption much, much more competitive with the 1060. Look at this chart for the apparently more power-hungry 480 Strix:

http://www.kitguru.net/components/g...us-rx-480-strix-gaming-oc-aura-rgb-8192mb/29/

- Next, were the new Tomb Raider AMD drivers used for this review? Hexus benched the game with them, and the Nitro 480 they tested jumped a huge amount in performance (you may have used them; I just want to be sure).


----------



## ZeroFM (Aug 2, 2016)

Shatun_Bear said:


> I've been browsing TPU for years but am I right in thinking that the review Performance Summary, that so many put so much stock in, is based on the cards benched in the test? Sorry, I know it sounds obvious but just want to be sure otherwise I'll have egg on my face with the next comment?
> 
> So if it is, and you throw in more recent games like:
> 
> ...


All three of these games have heavy support from AMD. But I agree, 'new' games need to be included in the test.


----------



## Shatun_Bear (Aug 2, 2016)

ZeroFM said:


> All three of these games have heavy support from AMD. But I agree, 'new' games need to be included in the test.



Indeed they do. But I would argue that the reason for their inclusion (at least WH:TW and DOOM) is not merely that they favour AMD cards, but that they are far more relevant, I think, to most buyers right now than the titles I propose they should replace: *BF3*, as BF4 is already being benched, and *Crysis 3*. Two titles which are very old. WH:TW and DOOM are two of the biggest PC titles released this year. Hitman's DX12 bench inclusion is perhaps less essential.

Regardless, that they should be benched in the mode that gives the most performance for each card is also a no-brainer, as no-one should play these games without the mode that is superior for their hardware.

My point was, swap out two of these older (in my opinion less relevant) games for the two bigger PC titles of the year and the *Performance Summary* is going to look very different. So people declaring one card superior to the other based on the performance summary alone are being slightly ignorant of the context here. For example, we could have a review from some large site in a few months where DOOM Vulkan, WH:TW DX12, No Man's Sky and the new Battlefield 1 DX12 make up half of the benching suite.  All super-relevant, newer PC games. Nothing out of the ordinary (on the contrary, a broad and fair selection of relevant games for PC gamers). A performance summary is knocked up from it, and suddenly it shows the reference 480 is 7% faster than the 1060 at 1440p. Cue lots of people going around claiming the 480 is faster than the 1060. This then has a knock-on effect on perf per dollar etc etc.

TL;DR - Context is key.
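For what it's worth, a review's overall summary is essentially an averaged ratio of per-game FPS, so swapping the suite really does move the headline number. A toy sketch with made-up ratios (all numbers hypothetical, not taken from any review):

```python
from statistics import geometric_mean

# Hypothetical per-game FPS ratios (RX 480 / GTX 1060) -- illustrative only.
old_suite = {"BF3": 0.92, "Crysis 3": 0.90, "GTA V": 0.95,
             "Witcher 3": 0.93, "Hitman DX11": 0.97}
new_suite = {"DOOM (Vulkan)": 1.25, "TW: Warhammer DX12": 1.08, "GTA V": 0.95,
             "Witcher 3": 0.93, "Hitman DX12": 1.10}

def summary(suite):
    # Overall relative performance as the geometric mean of per-game ratios
    return geometric_mean(suite.values())

print(f"old suite: RX 480 at {summary(old_suite):.1%} of the GTX 1060")
print(f"new suite: RX 480 at {summary(new_suite):.1%} of the GTX 1060")
```

Same two cards, two different "winners" - which is exactly the context point above.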


----------



## Captain_Tom (Aug 2, 2016)

sweet said:


> It has been selling extremely well, but I'm not sure about "capturing marketshare" as my fellow miners bought a lot of them.



I know a lot of people buying this card (For gaming), and I am seeing it pop up in the forum signatures.


----------



## CounterSpell (Aug 2, 2016)

Malabooga said:


> And thats without Doom (Vulkan) included which GTX1060 loses by 25-30%
> 
> So, this last develpment with Doom Vulkan pretty much sealed the deal for me for dumping my GTX970 and picking up RX480 4GB for small markup (maybe i even pick up 8GB one if the price is right).
> 
> But for me theres no doubt, RX470 4GB and RX480 4GB are pretty much unchallenged for mainstream in this gen so far, im not sure why is GTX1060 even a thing as its quite overpriced compared to these and not really hardware equipped for new APIs (Async Compute which is never coming for my card, another lie from NVidia.....)



Cool, but what games nowadays have DX12 and async? It's just a few games. And that's why I think, for the majority of current games, the 1060 is a better deal. IMHO.


----------



## Recus (Aug 2, 2016)

Captain_Tom said:


> AMD chose the 14nm process for 2 big reasons:
> 
> 1) It is MUCH cheaper than TSMC's 16nm, and that allows them to be ultra competitive in the low - high end brackets.  People just never seemed to notice when AMD took the performance crown.  As such, AMD finally understands that their best days were the 4000 - 6000 series where they dominated price/perf and rarely worried about Halo products.
> 
> 2) GloFlo's 14nm process is 9% denser than TSMC's 16nm.  As it matures its clock-speeds should mostly catch up to Nvidia's, and when it does they WILL be able to scale higher than Nvidia can in the top end.  Expect nvidia to hit a wall at about 16 billion transistors while AMD manages to scale to 18-20 billion.  Sure it will probably use 320w, but who cares if it manages to win by 10% performance and costs less to make?



Well AMD used TSMC 16nm for XBox One S.

http://www.eurogamer.net/articles/d...as-a-gpu-overclock-and-we-have-benchmarked-it


----------



## Figus (Aug 2, 2016)

CounterSpell said:


> Cool, but what games nowadays have DX12 and async? It's just a few games. And that's why I think, for the majority of current games, the 1060 is a better deal. IMHO.



When I buy a GPU I hope to use it for at least 18 months... so it's better to look to the future.


----------



## Captain_Tom (Aug 2, 2016)

Recus said:


> Well AMD used TSMC 16nm for XBox One S.
> 
> http://www.eurogamer.net/articles/d...as-a-gpu-overclock-and-we-have-benchmarked-it



Very interesting, I didn't see that.  Also interesting that games easily take advantage of it without an update.


----------



## dalekdukesboy (Aug 2, 2016)


bug said:


> It's slightly more power for slightly more $$$ (i.e. not superior if you don't have the extra $20-30). I'll probably go for the 1060 because of the Linux drivers (yet again). It seems AMD hasn't fixed any of their OpenGL implementation issues in the new driver (which seems to be actually the old driver, but with some parts open sourced).



If you don't have 20 bucks "extra" for a couple-hundred-dollar video card, you shouldn't be upgrading in the first place.  That is a poor argument. At $50+, OK, at least it's worth considering, but except at high resolutions (which mid-range cards mostly aren't used at), the 1060 is better than this card in pretty much every way... INCLUDING power consumption. So yeah, those "20 bucks" or even "30 bucks" you will make back and then some simply through the power-efficiency difference between the cards.  Oh, and you'll be getting more FPS, and the quality of your experience will be better.


----------



## dalekdukesboy (Aug 2, 2016)

Jism said:


> AMD did not give much more headroom for OCs, so every card (reference or AIB) is looking at an already maxed-out chip where OCs are really marginal. Yes, it could reach 1500 MHz, but probably not for 24/7 usage - just good enough for one run and nothing more.
> 
> This is a process limitation. It does best at 1200 up to 1400 MHz in very rare cases, but anything above is literally a no-go on air.
> 
> ...



Not to be political, but to make a point: this sounded like James Comey laying out the case for indicting Hillary for three-quarters of the press conference, only to say at the end, "nothing to see here." I say that because you lay out the case that Nvidia does a better job with potentially "less" than what AMD has, list every metric that can't be improved on AMD's cards, and describe how efficient Nvidia's approach is... only to conclude that AMD merely needs to adopt the same graphical strategy as Nvidia with its intrinsically "beefier" cards and Nvidia is essentially toast.  Yet that is not what IS; it is what it is, so to speak, and that change isn't happening, so however valid the point may be, it's fairly moot as long as the cards perform as they do at present.  AMD's card is slower overall, period, and it doesn't overclock as well, etc., so when you throw in everything that was listed you have to conclude AMD has a great card, but for now you have to be objective about what IS and say Nvidia's card is faster and more power-efficient, with more OC headroom.  The "what if" scenario you bring up may be true, but it never seems to come to fruition, so what difference does it make at this point?


----------



## Shatun_Bear (Aug 2, 2016)

dalekdukesboy said:


> If you don't have 20 bucks "extra" to use for a couple hundred dollar video card you shouldn't be upgrading in the first place.  That is a poor argument, 50$ + ok at least it's worth considering but minus at high resolutions (ones not used mostly by mid-range cards) *the 1060 is better than this card in pretty much every way*...INCLUDING power consumption, so yeah..."20 bucks" or even "30 bucks" you will make back and then some simply by the power efficiency difference in the cards.  Oh, and you'll be getting more FPS and the quality of your experience will be better.



No, it is not. You are failing to consider the advantage the 480 has in DX12 and Vulkan titles. This is a huge factor for me between the cards. From the trend we are seeing, with pretty much every DX12 title and Vulkan DOOM favouring the 480 in benches, we can expect this advantage to continue in the upcoming big PC titles. Deus Ex and Battlefield 1 DX12 will be very interesting. So unless you buy these cards intending to play old games first, the 480's advantage in newer titles is a big plus over the 1060.

The second thing you need to consider is the trend for AMD cards to mature better over time. This is not fanboy nonsense; it is real. Look at the last genuinely new cards AMD released (not the 300-series refresh), the Fury series: in our very own TPU's review of the Sapphire Fury from last year, the Performance Summary had the reference Fury only *6% faster* than a GTX 980 at 1440p:

https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/30.html

Fast forward to the present, look at the Perf Summary in yesterday's MSI 480 review, and notice that the Fury is now placed as *11% faster* at the same resolution:

https://www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/23.html

That is huge to me.

These two factors mean my next card (because I am actually in the market for a new card) is going to be either a 480 or 1070. I think the 480 is a better buy than the 1060 for sure.


----------



## geon2k2 (Aug 2, 2016)

HumanSmoke said:


> [OT]
> 
> Apple has access to both Samsung's 14LPC and TSMC 16FFC's projections and seem to have reached a definitive conclusion...are you going to tell me Apple make foundry partner manufacturing decisions based on frivolous and arbitrary reasons?



Actually there are very good reasons for them to choose something other than Samsung:
1. Samsung is a competitor, so why give them money and finance their R&D? Not to mention that depending on a competitor to produce your product is not very good business.
2. By manufacturing in a Samsung fab you give them access to your blueprints, so they can copy them, take inspiration from them, or be notified in time to respond to any significant architectural change you are making.

For the above reasons, if I were Apple I'd choose anything other than Samsung, even if it cost a bit more, even if the quality weren't as good.
The fact that they went to Samsung in the first place proves they found something there they couldn't find anywhere else.


----------



## HumanSmoke (Aug 2, 2016)

geon2k2 said:


> Actually there are very good reasons for them to choose something different than Samsung:
> 1. Samsung is a competitor, so why to give them money and finance their R&D not to mention that to depend on a competitor to produce your product is not very good business.
> 2. By manufacturing in Samsung fab you give them access to your blueprints so they can either copy, inspire or be notified in time to respond to any significant architectural change you are doing.
> 
> ...


You clearly don't understand how the pure-play foundry model works. Analyzing and copying the IP logic blocks of a customers IC's would destroy Samsung's foundry business - they wouldn't retain or gain a single customer, and by the time the court system finished pounding them into dust for breaches of confidentiality, contract, and IP theft, Samsung wouldn't have a pure-play foundry business.


----------



## sweet (Aug 3, 2016)

geon2k2 said:


> Actually there are very good reasons for them to choose something different than Samsung:
> 1. Samsung is a competitor, so why to give them money and finance their R&D not to mention that to depend on a competitor to produce your product is not very good business.
> 2. By manufacturing in Samsung fab you give them access to your blueprints so they can either copy, inspire or be notified in time to respond to any significant architectural change you are doing.
> 
> ...


Samsung is a big-ass corporation. Their fab and their mobile unit are completely different entities.


----------



## Malabooga (Aug 3, 2016)

CounterSpell said:


> cool. but what games nowadays have dx 12 and async? its just a few games. And thats why i think for the majority  of actual games, the 1060 is a better deal. IMHO



The list of DX12 games is already quite long and will only get longer.

So it's mostly: if you're buying a GPU for yesterday, choose the 1060; if you're buying a GPU for tomorrow, choose AMD.

The only valid case for the "1060 being better" is if you have no interest in new games.



Recus said:


> Well AMD used TSMC 16nm for XBox One S.
> 
> http://www.eurogamer.net/articles/d...as-a-gpu-overclock-and-we-have-benchmarked-it



The new SoC is pretty much a direct shrink of the old SoC, which was done on TSMC 28nm. They even talk about "2 disabled CUs".

http://www.anandtech.com/show/7546/chipworks-confirms-xbox-one-soc-has-14-cus

There was even talk of 20nm, so this design might have been in existence for a long time (for 20nm, then just shifted to 16nm, as those are not all that different).


----------



## rtwjunkie (Aug 3, 2016)

Malabooga said:


> The list of DX12 games is already quite long and will only get longer.


----------



## bug (Aug 3, 2016)

Malabooga said:


> So its mostly: if youre buying GPU for yesterday choose 1060, if you buy GPU for tomorrow choose AMD.



That usually works out well. For Nostradamus, that is.


----------



## Shatun_Bear (Aug 3, 2016)

rtwjunkie said:


>



He's right though. Pretty much every major PC release going forward is going to have a DX12/Vulkan mode from the get go or have it patched in afterwards.


----------



## rtwjunkie (Aug 3, 2016)

Shatun_Bear said:


> He's right though. Pretty much every major PC release going forward is going to have a DX12/Vulkan mode from the get go or have it patched in afterwards.



Mostly I'm laughing at the "long list". Also, prepare to be surprised: DX12 adoption is not going quickly, nor will it be wholesale adopted by game studios.

As for DX12 being shoehorned backwards onto existing games, well, let's just say those results leave a lot to be desired.


----------



## Shatun_Bear (Aug 3, 2016)

rtwjunkie said:


> Mostly I'm laughing at the "long list". Also, prepare to be surprised, *DX12 adoption is not going quickly*, nor will it be wholesale adopted by game studios.
> 
> As for games having DX12 backwards shoehorned onto existing games, well let's say those results leave alot to be desired.



DX12 or Vulkan adoption by the big studios is going to be universal, as it gives performance advantages across the board. Since we're (or at least I was) talking about benchmark suites of big PC titles developed by those AAA developers (used for benchmarks on nearly every major site), their adopting low-level APIs quickly is more relevant than whether every small or indie studio adopts them wholesale too. In any case, games made by smaller and indie studios by and large have graphics engines that run at 100+ fps in DX11, so they are mostly irrelevant when judging performance anyway. It doesn't change the value proposition between an AMD and an Nvidia card if Truck Simulator 4000 runs at 180 fps in DX11 on a 1060 versus 168 fps in DX11 on a 480.

What is going to change perceptions going forward is DX12 performance of the likes of Battlefield 1, Deus Ex or Civilization VI. These are the big games that are going to be benched and what people are going to draw conclusions from.


----------



## rtwjunkie (Aug 3, 2016)

Shatun_Bear said:


> DX12 or Vulkan adoption by the big studios is/is going to be universal as it gives performance advantages across the board. As we're (or at least I was) talking about the benchmarks of a suite of big PC titles developed by said AAA developers (used for all benchmarks of nearly every major site), them adopting low-level apis quickly is more relevant than whether every other small or indie game studios adopts them wholesale too. In any case by and large the games made by smaller and more indie studios tend to have graphics engines that run in 100+fps in DX11 so they are mostly irrelevant when judging performance anyway. It doesn't change the value proposition between an AMD and Nvidia card if Truck Simulator 4000 runs at 180fps in DX11 on a 1060 versus 168fps in DX11 on a 480.
> 
> What is going to change perceptions going forward is DX12 performance of the likes of Battlefield 1, Deus Ex or Civilization VI. These are the big games that are going to be benched and what people are going to draw conclusions from.


DX12 works better since it is low level, but as far as implementation goes, it is harder to program, from everything I have read. Your statement is thus only hope at this point.

Independent games en masse are a majority of the market, mostly because of actual content, so their use of DX11 and its influence should not be discounted too swiftly.

BTW, some of those indie games you discounted as being irrelevant performance-wise are well optimized and still push PCs as hard as something like Crysis 3 or TW3.

It's always wise to refrain from blanket predictions such as "DX12 will be everywhere" when dealing in the tech world, which is the only reason I even commented.  Oh, and the "quite long" list of DX12 titles already published - that really got me laughing.


----------



## Shatun_Bear (Aug 3, 2016)

rtwjunkie said:


> DX 12 works better since it is low level, but as far as Implementation, it is harder to program from everything I have read. Your statement is thus only hope at this point.
> 
> Independent games en masse are a majority of the market, mostly because of actual content, so their use of DX11 and it's influence should not be discounted too swiftly.
> 
> ...



Laugh all you want (I don't know what is funny).

You have a curious position on something that will advance PC gaming for everyone, trying to write off DX12 adoption as merely 'hope' on my part. This hints that perhaps you have an agenda - could it be that you are willing DX12 to be slowly adopted, and thus the whole of game development kept shackled, so that Nvidia maintains its performance advantage? Foolish and futile hope on your part.

Bye.


----------



## rtwjunkie (Aug 3, 2016)

Shatun_Bear said:


> Laugh all you want (don't know what is funny)
> 
> You have a curious position on something that will advance PC gaming for everyone, trying to write DX12 adoption as merely 'hope' on my part. This hints that perhaps you have an agenda - could it be that you are willing DX12 to be slowly adopted, and thus the whole of gaming development is kept shackled, so that Nvidia maintains its performance advantage? Foolish and futile hope on your part.
> 
> Bye.



Where the F*&K would you get the idea that I have an NVIDIA agenda?  Do you not check signatures as well as specs (just to make sure you aren't talking out your rear)?  I own green and red.  I am far too old to play the fanboi games that you think I might subscribe to.

I have a PC gaming agenda, and I am a realist.  And yeah, there is no "long list" of DX12 titles; that is what is funny.  I am a realist about the breakdown of AAA to indie, and I have observed the extremely slow adoption of DX12.


----------



## geon2k2 (Aug 3, 2016)

HumanSmoke said:


> You clearly don't understand how the pure-play foundry model works. Analyzing and copying the IP logic blocks of a customers IC's would destroy Samsung's foundry business - they wouldn't retain or gain a single customer, and by the time the court system finished pounding them into dust for breaches of confidentiality, contract, and IP theft, Samsung wouldn't have a pure-play foundry business.



You might be right... but still, having a look never hurts; you don't need to blatantly copy, and imagine how hard it would be to prove that such "inspiration" exists.
Industrial espionage will always exist, and S is definitely trying hard to find out what A is working on; I'm sure A is doing the same. You just don't go unprepared in current times.
And I'm just saying that having the schematics in their own courtyard kind of eases the job.

BTW, I don't have any sort of inside information - this is pure speculation - but I just can't believe someone is such a gentleman that he just prints with his back turned and doesn't even glance at what is being printed.


----------



## eidairaman1 (Aug 4, 2016)

Good card, I'd say. Reference boards are easier to watercool, though.


----------



## HumanSmoke (Aug 4, 2016)

geon2k2 said:


> you might be right ... but still having a look never hurts, you don't need to blatantly copy, and imagine how hard it would be to prove that such an inspiration exists.


No, it is actually quite easy to prove IP theft on an IC logic-block basis. It was easy 35 years ago, when NEC was found to have copied Intel's 8086/8088 and marketed them as the µPD 8086/8088, and it is even easier now, when private companies routinely shave heatspreaders off chips and produce highly detailed X-ray metrology images for anyone with $5000 to spend. Semicon giants of course have far more resources at their disposal, including (obviously) the original logic-block floorplans, transistor placement, and metal-layer interconnects.


----------



## Malabooga (Aug 6, 2016)

rtwjunkie said:


> Where the F*&K would you get an idea that I have an NVIDIA agenda?  Do you not check signatures as well as specs (just to make sure you aren't talking out your rear)?  I own green and red.  I am far too old to play fanboi games that you think I might ascribe to.
> 
> I have a PC gaming agenda, and I am a realist.  And yeah, there is no "long list" of DX12 titles, that is what is funny.  I am a realist in the breakdown of AAA to Indie, and have observed the extremely slow adoption of DX12.



Sure, dude. You don't even have a clue about DX12 games, DX12 adoption, or anything else. To be a "realist" you first have to step into reality. Indie? You mean 8-bit games that you can run on a current smartphone? That's your "leverage".

Yeah, now I'm laughing my socks off.

Fact is, this benchmark suite is good for measuring the performance of 2-3 years ago; it has nothing to do with today's or tomorrow's performance and is only useful if you're buying a GPU for the games of 2-3 years ago.



Shatun_Bear said:


> He's right though. Pretty much every major PC release going forward is going to have a DX12/Vulkan mode from the get go or have it patched in afterwards.



Don't bother; the loudest people are those who know the least. Almost all major 2016 releases had or will have DX12/Vulkan, and that's only 2016. I guess he's one of those people who a few months ago yelled that DX12 was 3-4 years away, lol.


----------



## rtwjunkie (Aug 6, 2016)

Malabooga said:


> Sure, dude. You don't even have a clue about DX12 games, DX12 adoption, or anything else. To be a "realist" you first have to step into reality. Indie? You mean 8-bit games that you can run on a current smartphone? That's your "leverage".
> 
> Yeah, now I'm laughing my socks off



Obviously you're fuckin' clueless. You're talking idiocy. You're a waste of all our air. Go find yourself while I put you on ignore.

What's simply amazing is how little you know of the gaming industry if you think indie = 8-bit.


----------



## eidairaman1 (Aug 6, 2016)

Malabooga said:


> sure dude, you dont even have a clue about DX12 games, DX12 adoption or anything else. The be a "realist" you first have to step into reality. Indie? You mean 8bit games that you can run on current smartphone? Thas your "leverage".
> 
> 
> Yeah, now im laughing my socks out
> ...



CoughCough900u110r7coughcough.
Go elsewhere; you're not welcome here if you're trying to start fights.


----------



## Tatty_One (Aug 6, 2016)

Might I suggest some of you calm down to a frenzy, take a breath, and possibly even re-read the guidelines, assuming you read them in the first place. Any further insults/attacks and free holiday passes will be issued...... thank you.


----------



## Malabooga (Aug 8, 2016)

Ah, I see - some people have a "special pass to troll" here.

ok


----------



## eidairaman1 (Aug 9, 2016)

Malabooga said:


> ah, i see, some people have "special pass to troll" here
> 
> ok



Actually, no one does, but you started it.


----------



## Caring1 (Aug 10, 2016)

As I have mentioned previously regarding the RX 480, power consumption is a big negative tick against its name for me.
It would be a no-brainer for most people to put this card, or another brand's, in their cart if it used half the power it currently does.
Hopefully AMD pulls their finger out and actually does something about it.
I'm really hoping changes like those in this article are an incentive for them.
http://www.theregister.co.uk/2016/08/04/california_to_put_powerhungry_pcs_on_diet/


----------



## Ungari (Sep 9, 2016)

Caring1 said:


> As I have mentioned previously regarding the RX480, power consumption is a big negative tick against it's name for me.
> It would be a no brainer to most people to put this card, or another brand in their cart, if it used half the power it currently does.
> Hopefully AMD pulls their finger out and actually does something about it.
> I'm really hoping changes like those in this article are an incentive for them.
> http://www.theregister.co.uk/2016/08/04/california_to_put_powerhungry_pcs_on_diet/



This whole power consumption issue is such a non-issue; it has been pushed as a marketing point by Nvidia even though the difference is mere pennies per month.
Even the Ethereum mining community, who live and die by their profit margins, have chosen the RX 480 as the card of choice.
Every time I see the point brought up, I think of a parrot: "_Power Consumption, the Power Consumption...caw caw rawwwk!_"
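The "pennies per month" claim is easy to sanity-check. A quick sketch; the wattage gap, hours per day, and electricity rate here are all assumptions, not figures from the review:

```python
# Cost of a GPU's extra power draw; every input here is an assumption.
def monthly_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.13):
    # kWh per month = kW * hours/day * 30 days
    kwh_per_month = extra_watts / 1000 * hours_per_day * 30
    return kwh_per_month * usd_per_kwh

# e.g. a card drawing ~40 W more than its rival, gamed 2 hours a day
print(f"${monthly_cost_usd(40, 2):.2f}/month")  # roughly $0.31/month
```

Even at several hours a day and a higher rate, the gap stays in single-digit dollars per year.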


----------



## dalekdukesboy (Sep 9, 2016)

OK, how many other performance or consumption figures are you going to selectively ignore so you can profess your deep love for the RX 480 while mocking everyone who brings up facts about it that you don't like? You can have your opinions, but you're not entitled to your own facts.  How about, rather than being a bird-brain mocking people by (ironically) calling them parrots, you actually give us your factual, positive reasons for the 480 (of course there are some), bring up facts about the 1060, and tell us why you prefer the 480 in comparison, rather than belittling and marginalizing whoever happens to care about power consumption.  Ironically, I rarely care about power consumption myself; I care about how cool the card runs, how it performs, and whether it's a hair dryer. That said, stick to the facts and give your opinions on those rather than insulting and flaming others' legitimate opinions.


----------



## Ungari (Sep 9, 2016)

The fact is that power consumption on all these new cards is much lower than in past generations, and the talking point itself was contrived as a sales feature/benefit by Nvidia, which is fine.
However, what I object to is when so-called tech-savvy enthusiasts embrace this nonsense as if it were some sort of dealbreaker and keep spouting it incessantly as if it were really going to impact them.
Let's be honest: is someone who can afford $750 for a video card really pinching pennies each month to pay the electric company?

BTW Tom Baker is the one and only Doctor!


----------



## dalekdukesboy (Sep 9, 2016)

Yep, long live Tom Baker!  That tip of the hat aside, I will say you destroy your own argument by talking about a "$750" card and power-bill penny-pinching when we are discussing a card that is barely a third as expensive as the unnamed three-quarters-of-a-grand card you refer to. You are undercutting your own argument by talking about a totally different group of consumers (entry-level enthusiasts like RX 480 users versus high-end enthusiasts like custom 1080 users), and you are, intentionally or not, misrepresenting the customer group we are talking about, thereby invalidating your own point.


----------



## Captain_Tom (Sep 9, 2016)

Ungari said:


> The fact is that power consumption on all these new cards are much lower than the last generations, and that the talking point itself is contrived as a sales feature/benefit by Nvidia, which is fine.
> However, what I object to is when so called tech-savvy enthusiasts embrace this nonsense as if was some sort of dealbreaker, and keep spouting it incessantly as if it was really going to impact them.
> Let's be honest; is someone who can afford $750 for a video card really pinching pennies each month to pay the electric company?
> 
> BTW Tom Baker is the one and only Doctor!



Bingo.  I just bought a Fury for $310.  Sure, it typically uses 200-250 W while gaming instead of the 120-170 W a 1070 would; BUT WHO CARES?!   In BF1, Metro LL, Bioshock, and Deus Ex my card performs 5-20% better.  It is laughable, IMO, that I would choose anything else right now.

Power consumption should be a tie-breaker and nothing else.


----------



## Malabooga (Sep 9, 2016)

Ungari said:


> This whole power-consumption issue is such a non-issue, pushed as a marketing point by Nvidia even though the difference is mere pennies per month.
> Even the Ethereum mining community, who live and die by their profit margins, have chosen the RX 480 as the card of choice.
> Every time I see the point brought up, I think of a parrot; "_Power Consumption, the Power Consumption...caw caw rawwwk!_"



As always, it turns out the people loudest about "power consumption" are the ones who have no clue what they're talking about and throw around numbers like "$100/month!"

Oh yes, the 1060 uses a few fewer watts; you'll save a few cents a month, maybe a $5 coffee in a year.

Whether a card runs cool and quiet is a matter of the cooler. If you claim the "480 runs hot and loud," then everything above 150 W runs hot and loud too, including the 1070/1080/980 Ti/980/970...

My MSI 480 Gaming X runs mostly passively cooled in some older games and turns its fans on now and then; under load it typically sits at 65 °C with the fans at 1100-1200 RPM, which is inaudible and beats pretty much every 1060 in that department except the Gaming X, which has the same cooler.



Captain_Tom said:


> Bingo.  I just bought a Fury for $310.  Sure, it typically uses 200 - 250 W while gaming instead of the 1070's 120 - 170 W; BUT WHO CARES?!   In BF1, Metro LL, Bioshock, and Deus Ex my card performs 5-20% better.  It is laughable, imo, that I would choose anything else right now.
> 
> Power consumption should be a tie breaker and nothing else.



Great deal; there are also Fury Nitros in the EU for 315/319€, pretty much the best-buy card ATM.


----------



## dalekdukesboy (Sep 9, 2016)

...


----------



## dalekdukesboy (Sep 9, 2016)

Sorry, but that last post needed to stand by itself as a reaction to the two before it.  I get loving what you've got and having a preference; we all have bias (as well as BIOS). But when two products perform similarly and one of them draws literally DOUBLE the power, saying "meh, who cares" just blows me away.  That's like having one car that makes 300 horsepower on 5 gallons of gas per 100 miles and another that makes 300 horsepower on 10 gallons per 100 miles, and saying you'd rather have the 10-gallon car. Oh, and the thirstier car also runs significantly hotter and has little "overclocking" room as built, whereas the more efficient one tolerates far more aftermarket cams and such, and so can reach decently higher horsepower.  Seriously?  Also, for well over 300 bucks and/or euros I can find many cards that certainly match or beat the Fury. Then again, with how you guys are looking at this, you may next say a mud puddle is a better place to fish than the ocean.


----------



## Ungari (Sep 9, 2016)

dalekdukesboy said:


> Yep, long live Tom Baker!  That tip of the hat aside I will say you destroy your own argument by saying "$750" card and power bill penny pinching when we are discussing a card that is barely a 1/3 as expensive as unnamed card you refer to by a 3/4 of a grand cost; So therefore you are   undercutting your own argument by talking about a totally different group of consumers ( as in entry level enthusiasts versus high end enthusiasts like RX 480 users vs. highest end custom 1080 users)...And you are either intentionally or unintentionally misrepresenting the customer group we are talking about and therefore invalidating your own point.



No, this is brought up at every price point comparing Paxwell cards to previous gen Fury and 390X.
It will go on as long as Nvidia holds a lead in efficiency against Vega, just you wait.

Now, if you'll permit me once again:

"_Power Consumption, the Power Consumption...caw caw rawwwk!_".


----------



## Ungari (Sep 9, 2016)

dalekdukesboy said:


> But when two products perform similarly and one of them draws literally DOUBLE the power, saying "meh, who cares" just blows me away. That's like having one car that makes 300 horsepower on 5 gallons of gas per 100 miles and another that makes 300 horsepower on 10 gallons per 100 miles, and saying you'd rather have the 10-gallon car.



This is an invidious comparison as the cost of gasoline is not to be compared with electric. Or, shall I say it's an _Nvidious_ comparison?


----------



## Ungari (Sep 9, 2016)

Malabooga said:


> Whether a card runs cool and quiet is a matter of the cooler. If you claim the "480 runs hot and loud," then everything above 150 W runs hot and loud too, including the 1070/1080/980 Ti/980/970...



But see, if the electric bill was really a problem then you couldn't afford to run the air conditioner and your ambient temps would cause the fans to ramp up to full speed causing more noise and----
"_Power Consumption, the Power Consumption...caw caw rawwwk!_".


----------



## dalekdukesboy (Sep 9, 2016)

Ungari said:


> No, this is brought up at every price point comparing Paxwell cards to previous gen Fury and 390X.
> It will go on as long as Nvidia holds a lead in efficiency against Vega, just you wait.
> 
> Now, if you'll permit me once again:
> ...



Again, you ironically compare Nvidia users to parrots while literally repeating your own parrot call over and over... hypocrisy much?  And while we're at it, YOU brought up the crazy-high price as if it were relevant to this sub-$300-card discussion, which it is not; the price you threw out only broke your own argument. Also, "just you wait" is hardly an argument, considering what you're talking about hasn't even happened yet.



Ungari said:


> This is an invidious comparison as the cost of gasoline is not to be compared with electric. Or, shall I say it's an _Nvidious_ comparison?



Now, if you'll permit me to not act like a parrot parroting a parrot, but to actually point out where your arguments are weak and vapid: it's called an analogy.  Obviously it isn't perfect, but it effectively made a point on multiple issues, and all you can say is that gas does not equal electricity, so you dodged every point I actually made.  I'll take it that even an imperfect analogy, tossed off the top of my head just to argue the efficiency and performance of machines (be it a CPU or a V8), drew no useful rebuttal from you whatsoever, only a continuing, strange obsession with parrots.


----------



## Ungari (Sep 9, 2016)

dalekdukesboy said:


> Again, you ironically compare Nvidia users to parrots while literally repeating your own parrot call over and over... hypocrisy much?



I use both Nvidia and AMD cards in my two builds.

No matter the price points, the issue is brought up as if it were an important factor in selecting a card for purchase. Even reviewers are focusing on this metric now.

Your petrol example was bad because the difference in electrical usage is irrelevant for anyone who is not pedaling a generator with their feet.


----------



## dalekdukesboy (Sep 9, 2016)

Sigh, so the reviewers are in on the conspiracy too? No, my "petrol" (AKA gasoline) example has nothing to do with gas or electricity; it's about efficiency and how a product performs, and you know it.  So do you use the most inefficient light bulbs you can find for everything in your house? Do you seek out the least efficient appliances as well? I assume you also pick the products in a given class that run the hottest and throw off the most heat?  I suppose that might make a scant bit of sense if you lived in Antarctica or Alaska; maybe you could run 50 computers with fanless RX 480s overclocked to the max and never need to run your heat again. Who cares if they draw 2,000 watts?  After all, efficiency isn't an issue for you as long as you get some sort of result.
Regardless, you still won't admit the obvious mistake you made in referencing a price point that isn't remotely relevant to this thread, or your fixation on gas versus electric, as if you were truly oblivious to the actual point, which is efficiency and ultimate performance.  The type of fuel was irrelevant, and it's irrelevant because the measurable metrics I drew from it still map perfectly onto this card.  You can't even address them, so please do us all a favor and don't insult everyone's intelligence on this thread with your stupid girl-on-a-bike picture.
Also, spare me the "I use both camps' cards" bullshit; at least admit you're an honorary CEO of AMD and get on with it, sheesh lol.  After you basically attack someone for their opinion that the 480's efficiency is an issue for them, and call them out as if they'd committed treason, do you really think you look unbiased?  Again, your "objectivity" is a joke and we all see it.  I like Nvidia; this round they win. No one can debate that they have everything from the midrange to the top monopolized.  I actually don't like that. I had a 7970 and a few older AMD/ATI cards as well; I like competition, and I wanted AMD to do more this round, but they're only half in the game, letting Nvidia overprice every SKU above what AMD offers due to zero competition.  I find no happiness in the 480 being a bit underwhelming (though still a decent GPU), plus AMD having zero products to even compete with the GTX 980 Ti, 1070, 1080, and now the Titan.  Yes, the 980 Ti is essentially EOL and last generation; I list it to make the point that, being a good overclocker, it competes with new Nvidia products up to the 1070, and the comparisons I've seen of a max-clocked 1080 and a max-clocked 980 Ti were far too close, which is why OC'd models weren't in a lot of early reviews.  So Nvidia narrowly beat its own last generation in performance until you get to the Titan... what AMD product even touches a 980 Ti in performance?  Yes, since you aren't that good with analogies and meanings and the like, that is a rhetorical question.


----------



## Ungari (Sep 9, 2016)

I'm glad you brought up AdoredTV's test of the 1080 vs. the 980 Ti. Pascal was not on the original architecture roadmap, and without Asynchronous Compute it was obsolete the day it was released.

Yes, I do use the inefficient light bulbs, since they can't be used as microphones the way the bulbs at Newark International Airport currently are.
The only mistake I made in my referencing that price point was in my overestimation of your ability to understand.

As for having both cards: at the time, Maxwell was truly a great architecture, so I bought it, but Paxwell is not future-proof and is much too expensive. I may go back to Nvidia after I see Volta vs. Vega, but I don't like Team Green's deceptive business practices with VRAMGate and the 1050 3GB, and the prices are outrageous.


----------



## dalekdukesboy (Sep 10, 2016)

Do you even know how to read? Seriously, you're reduced to agreeing with me about Nvidia being overpriced and not liking the market; you even said "deceptive business practices," which is kinda what I was saying, dumbass.  You're also wrong to cite one specific 1080 vs. 980 Ti test, since I looked up several, and did my homework sifting through newer reviews (and older ones) that listed the 980 Ti overclocked and benched in certain games so I could compare it to other GPUs.  I guess you only knew of one test and, parrot-brain that you are, assumed I'd do what you do: look at one review and make a judgment. Totally false, and I never said anything about what, who, or where I got the info from. No, the mistake you made is that you said something that made no sense as stated and, ironically, was far worse than my gas analogy, whose efficiency point you still won't address.

Also, the light bulb bit? Yeah, not workin' for ya; try again.  Maybe actually address the very simple points I make and respond, rather than saying something flippant that shows you've got nothing useful to say. I know someday you may manage it; I'll be here as long as one of us doesn't die before that happens...

So... you're telling me the RX 480 is more future-proof than, say, a GTX 1080 or 1070? Obviously you think so; you defend it vehemently, and if you're skipping *"Paxwell"* over its lack of future-proofing, you must think AMD is somehow better on that front.  I also like how I lay out my arguments and back them up, and the best you can do is a few crap sentences that address nothing I said, miss the point, or ironically even agree with me on some things.  Yeah, I can almost hear the Vulkan and DX12 argument coming, where suddenly and magically the RX 480 won't just beat a 1080; give the implementation enough time and it'll beat a Titan!

You, sir, have truly earned the idiot-of-the-day award, from your first message to your last.  I reiterate: when you can take what I very simply and carefully laid out, and intelligently address the points rather than squawking like a parrot, maybe I and others will take what you say seriously.  You start by squawking and then obfuscate repeatedly, so I find it rather humorous that you question my comprehension, or anyone else's for that matter, like that poor bloke who simply dared to state his opinion on why the RX 480's lack of efficiency was an issue for him.  Also, I assume you mean the 1070 3GB RAM issue?  I know you're so much smarter than me (it's so obvious) that I just can't imagine you'd fuck up like a moron and type 1050, right?  You type so little that I'd assume you'd use that obviously huge intellectual genius and almost Christ-like faultlessness of yours to pontificate with Einsteinian genius and give us philosophically timeless, greatest-hits gems like...

*"But see, if the electric bill was really a problem then you couldn't afford to run the air conditioner and your ambient temps would cause the fans to ramp up to full speed causing more noise and---- 'Power Consumption, the Power Consumption...caw caw rawwwk!'"*

*"Now, if you'll permit me once again: 'Power Consumption, the Power Consumption...caw caw rawwwk!'"*

*"Every time I see the point brought up, I think of a parrot; 'Power Consumption, the Power Consumption...caw caw rawwwk!'"*

*Quotes by Einstein... can't you tell?*


----------



## Ungari (Sep 10, 2016)

I thought maybe you were referring to Adored's video on the 980 Ti vs. Maxwell 3.0, since Adored came to this forum to discuss his findings with the community, particularly with those who were unhappy with his conclusions.
The 1050 3GB I'm referring to is the cut-down-CUDA version of the 1060 3GB.
You don't need to defend or feel sorry for that _poor bloke_; that guy is on record for continuously bringing up power efficiency for any and all AMD cards, yet fails hard when pointing his finger at Polaris.
I would like to respond to any carefully and simply laid out arguments you would like to make, I've been waiting very patiently for you to post some.


----------



## dalekdukesboy (Sep 10, 2016)

Hilarious. Do you have anything useful to say, or just bad one-liners? Oh, and all those parrot quotes!  I think I'll do you a favor: since you haven't made ANY points of any value, forget "well laid out," maybe you should listen to yourself when you say...

*"I would like to respond to any carefully and simply laid out arguments you would like to make, I've been waiting very patiently for you to post some."
*
Because dude, if you think my paragraphs' worth of points, which you simply don't address, aren't carefully laid out, I shudder to think what you'd objectively say about someone who squawks like a parrot, makes bad light-bulb jokes, and posts pointless pictures of a chick on a bike.  Just sayin'.

The 1070 also had a similar issue; the 1050 I haven't even looked at, since I'm uninterested in it. So I'm sorry I mistook that one thing, but I still understand the 1070 had a 3 GB vs. 3.5 GB issue for some people, or so I heard.  The rest stands. Maybe you're right that the guy is a schmuck who goes off on power efficiency against AMD only; it doesn't change how you reacted to it.


----------



## Ungari (Sep 10, 2016)

Let me say that those who make such a big deal about the power consumption of the RX 480 and its Polaris architecture should be more concerned about the _energy footprint_ of the TARDIS:


----------



## dalekdukesboy (Sep 10, 2016)

It's either technically 0 because it doesn't really exist, or it's exactly the amount of power used from the first episode till now on every set of every Doctor Who episode or movie.


----------



## Fluffmeister (Sep 10, 2016)

AMD once mocked Nvidia's power consumption; now the tables have turned, and it is delicious.

It's a shame AMD and their fans have no class.


----------



## dalekdukesboy (Sep 10, 2016)

LOL, thanks Fluffmeister. I wondered when or if anyone would jump in and take a side, whatever it might be; my guess is you and many others were just lying in the weeds, enjoying the show till we stopped.


----------



## Freelancer (Sep 28, 2016)

Personally, I lean towards this card simply because I also need to get a new monitor, and I can get a cheap LG monitor with AMD FreeSync instead of one with G-Sync.


----------



## Ungari (Sep 29, 2016)

Freelancer said:


> Personally, I lean towards this card simply because I also need to get a new monitor, and I can get a cheap LG monitor with AMD FreeSync instead of one with G-Sync.


----------



## fullinfusion (Dec 11, 2016)

There's no dual BIOS switch?

WTH!


----------

