# MSI GTX 1060 Gaming X 3 GB



## W1zzard (Sep 6, 2016)

MSI's GTX 1060 Gaming X 3 GB might come with only half the memory, but it still brings the big guns in the form of the large dual-fan TwinFrozr cooler. Our review tests whether 3 GB is a viable alternative to 6 GB if you are trying to save some money.

*Show full review*


----------



## PainfulByte (Sep 6, 2016)

Are the charts correct? It scores worse FPS across the board than the reference card?


----------



## Chaitanya (Sep 6, 2016)

Going forward, it seems like the GTX 1060 is going to lag behind the RX 480 in DX12/Vulkan games.


----------



## Nokiron (Sep 6, 2016)

PainfulByte said:


> Are the charts correct? It scores worse FPS across the board than the reference card?


It's the 3GB model.

Like it says on the first page...



> Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA has decided to reduce the shader count of the GTX 1060 3 GB to 1152 from 1280 on the 6 GB version. This roughly 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB versions, which will probably lure potential customers closer to the 6 GB version.


----------



## W1zzard (Sep 6, 2016)

PainfulByte said:


> Are the charts correct? It scores worse FPS across the board than the reference card?


It also has 1152 shaders vs. 1280 on the 6 GB version.


----------



## hojnikb (Sep 6, 2016)

Any chance of testing Gainward/Palit 1060s with 3 and 6 GB of VRAM? They seem to be quite cheap here in the EU.


----------



## qubit (Sep 6, 2016)

@W1zzard

Typo in the conclusion: 

In the bullet points you've stated:  Fans don't stop in idle

In the main text you've stated: *MSI has also included the idle-fan-off feature* we love so much since it provides a perfect noise-free experience during desktop work, Internet browsing, and even light gaming.


----------



## Adam Freeman (Sep 6, 2016)

Great review with some newer games. The card performs very well at 1080p despite having 3GB of VRAM, but I wouldn't even consider buying it for $240; it is simply too expensive and there are many better choices at this price point. For example, you can pay $10 more for the 1060 6GB version from another brand like EVGA or Gigabyte, which would be a little noisier but with better performance and more future-proofing. Other choices from AMD include an overclocked RX 480 4GB for $230 that has the same performance in DX11 titles but better in DX12/Vulkan titles, with 1GB more VRAM. Also, for $240 you can buy the Sapphire RX 470 8GB, which has way better performance than the reference RX 470 tested in this article and is the only RX 470 on the market with the same memory speed as the RX 480 8GB, which is 8 Gb/s.


----------



## W1zzard (Sep 6, 2016)

qubit said:


> @W1zzard
> 
> Typo in the conclusion:
> 
> ...


Lol, I forgot to update the +/- bullet list. Fixed now. Spent too much time thinking about the conclusion ^^


----------



## qubit (Sep 6, 2016)

W1zzard said:


> Lol I forgot to update the +- bullet list  Fixed now


That was quick!

So you _are_ actually human then. I suspect a conspiracy.


----------



## refillable (Sep 6, 2016)

You should stop testing BF4; the game is older than my 2-year-old cousin.

Fallout 4 seems broken at 1080p. No Man's Sky (the super stupidly hyped game) is broken at all resolutions. Deus Ex MD has stupidly low FPS; it's broken as well. I hope DX12 is going to fix it.


----------



## Shatun_Bear (Sep 6, 2016)

Fantastic job from W1zzard on updating the benchmark suite of games. He's included all the games I would put in a current review, and Hitman being benched in DX12 instead of DX11 is also a noteworthy improvement over before. There are some older titles that could still be removed, but I can't complain too much.

One thing though:



			
W1zzard said:

> Compared to *Radeon RX 480, the GTX 1060 3 GB is right on the same level in our performance summaries* - individual game performance varies wildly though, with more recent titles having slightly the upper hand on the RX 480.



This is confusing, as you're wording it like you're comparing reference vs. reference. But this is comparing a highly overclocked non-reference 1060 versus a reference 480. If you compared an MSI Gaming 480 versus this card, the 480 would likely be notably faster.

I'm interested in seeing how the 4GB 480 stacks up against a 3GB 1060 going forward.


----------



## Shatun_Bear (Sep 6, 2016)

Adam Freeman said:


> Great review with some newer games. The card performs very good at 1080p despite having 3GB of vram, but I wouldn't even
> consider buying for 240$, it is simply too expensive and there are many better choices at this price point. For example you can pay
> 10$ more for the 1060 6GB version from other brand like evga or gigabyte which would be a little noisier but with better
> performance and more future proofing. Other choices from AMD like overclocked RX 480 4gb for 230$ that has same performance
> ...



Yes, I agree with you completely. Personally I would not recommend that ANYONE buy this card when there are better cards around the same price with at least 4GB of VRAM. You can get ITX 6GB 1060s for around the same price. Surely that would be a better way to spend $240. 4GB 480s are also a far more sensible purchase imo.


----------



## bug (Sep 6, 2016)

> On the back is a high-quality metal backplate





> No backplate included



I'm pretty sure you need to pick just one. Unless the backplate really wasn't included and you installed one yourself


----------



## W1zzard (Sep 6, 2016)

bug said:


> I'm pretty sure you need to pick just one. Unless the backplate really wasn't included and you installed one yourself


Fixed. Proofreader uploaded an old version that was lacking my recent edits


----------



## jabbadap (Sep 6, 2016)

Great review and nice to see new games 

There's an error in the No Man's Sky entry: it is using OpenGL 4.5, not DirectX 11.


----------



## Enterprise24 (Sep 6, 2016)

Glad to see Total War: Warhammer integrated into the TPU benchmark suite. Thanks


----------



## jabbadap (Sep 6, 2016)

W1zzard said:


> Lol I forgot to update the +- bullet list  Fixed now. Spent too much time thinking about conclusion ^^



Speaking of it...


> The MSI GTX 1060 6GT OC retails at $240.



I think that should be 3GB; I'm not absolutely certain though. 6GT means 6GB in Finnish, but is there some extraordinary half-byte unit called T in the English language?


----------



## Frick (Sep 6, 2016)

Oohh a GTX1060 for €50 less, essentially. The high price on Polaris really is killing them.


----------



## chaosmassive (Sep 6, 2016)

So 900p is gone for good.
Is there any way to roughly estimate the FPS increase over 1080p?


----------



## ZeppMan217 (Sep 6, 2016)

TPU's results are better than other reviewers'.


----------



## Enterprise24 (Sep 6, 2016)

chaosmassive said:


> so 900p is gone for good
> is there any way to roughly estimate fps increase over 1080p?



Actually, I like the 900p results. They can show GPU driver efficiency in CPU-intensive situations.


----------



## W1zzard (Sep 6, 2016)

jabbadap said:


> There's an error in the No Man's Sky entry: it is using OpenGL 4.5, not DirectX 11.


Fixed



jabbadap said:


> I think that should be 3GB, I'm not absolutely certain though: 6GT means 6GB in Finnish


And fixed. Copy-and-paste from the previous MSI review.


----------



## bug (Sep 6, 2016)

chaosmassive said:


> so 900p is gone for good
> is there any way to roughly estimate fps increase over 1080p?


1,920x1,080 = 2,073,600 pixels
1,600x900 = 1,440,000 pixels

1,440,000/2,073,600 = 0.69

That's a theoretical maximum of 30% more performance from pushing fewer pixels. Depending on how much a given title is CPU-bottlenecked, you'll see 10-25% more performance in practice. Less than 10% in non-demanding games, but those would already be running at hundreds of FPS.


----------



## HD64G (Sep 6, 2016)

Great review, and the first one in months to include the latest tech and games as well. With Mankind Divided patched up to DX12, this game list will safely give the correct result for any comparison from now on. This 1060 3GB is a marketing mistake from NVIDIA. It should be a 1060 Ti for the 6GB, or a 1050 Ti for this one. And with the 480 4GB being cheaper even in custom models, it will have a hard time imho. Bravo to MSI for this cooler though. Excellent performance & noise control at once.

@W1zzard : How come many games in this review gave much different results than in previous reviews? I'm talking about 10-20% differences for the same game and GPUs.


----------



## efikkan (Sep 6, 2016)

Chaitanya said:


> Going forward it seems like GTX 1060 is going to lag behind RX480 in DX12/Vulkan games.


No, there is no evidence of that. The GTX 1060 beats the RX 480 in unbiased games.
Even with more AMD-favoring titles than ever (console ports and more), Nvidia still manages better performance overall. For PC gamers, the GTX 1060 remains the superior choice in all the metrics, including raw performance, performance per dollar, and performance per watt.


----------



## Basard (Sep 6, 2016)

Fury's PPD is getting better, I see......


----------



## Joss (Sep 6, 2016)

You can't visually distinguish this from a GTX 1080; the cooler seems to be basically the same. Other manufacturers go the same route, and it's wrong. If you pay a premium you should get premium in every detail, including a specific cooler.


----------



## Dimi (Sep 6, 2016)

I don't understand some of these results; some are completely off from previous reviews.

For example, RotTR went down 16.2 FPS in the latest review.

Can you please check this out?


----------



## Jeffredo (Sep 6, 2016)

I would have no issue buying this for 1080p use. It's faster across the board than a GTX 970 for $199.99 (if you shop wisely). I don't see the point in getting one of these more expensive models like the MSI Twin Frozr. The whole point (to me anyway) of a cut-down card is to save money, and they're just too close to a 6GB model to make sense. As far as the longevity of 3GB goes, I personally wouldn't worry. I haven't kept a GPU more than two years since my 8800 GTX (which lasted 3 1/2). Mid-range cards? I don't expect more than that out of them.


----------



## Nordic (Sep 6, 2016)

As far as I can tell, the 1060 3GB gives about 6% less performance than the 1060 6GB at 1080p. The price difference between the cheapest options of both was $60 in the USA when I looked 5 minutes ago. $60 cheaper for a 6% performance hit seems worth it to me.
The 480 4GB does not seem as good of a deal to me. It is the same price of $199.99 but uses about 30% more power. The 470 just doesn't match the performance.

I would rather have 4GB to 8GB of VRAM for Star Citizen, but by the time that game comes out I will probably have upgraded 2 more times.
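The value math above can be sanity-checked with a quick perf-per-dollar sketch. The prices and the ~6% performance gap are figures quoted in this thread, not measurements, and the $259.99 6GB price is just the stated $60 gap applied to $199.99:

```python
# Rough perf-per-dollar comparison using figures quoted in this thread.
# rel_perf: performance relative to the 1060 6GB (3GB is ~6% behind).
cards = {
    "GTX 1060 3GB": {"price": 199.99, "rel_perf": 0.94},
    "GTX 1060 6GB": {"price": 259.99, "rel_perf": 1.00},  # assumed +$60
}

for name, c in cards.items():
    # Performance units per dollar, scaled x100 for readability
    ppd = 100 * c["rel_perf"] / c["price"]
    print(f"{name}: {ppd:.2f} perf per $100")
# GTX 1060 3GB: 0.47 perf per $100
# GTX 1060 6GB: 0.38 perf per $100
```

By this (admittedly crude) metric the 3GB card comes out ahead; the argument in the rest of the thread is whether the VRAM shortfall erodes that advantage over time.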


----------



## G33k2Fr34k (Sep 6, 2016)

It's an alright card, but price-wise it's nowhere near as competitive as the RX 480 4GB version. I think the 470 is bandwidth-starved in most of these games.
AMD's DCC is not quite as effective as Nvidia's when it comes to bandwidth savings. I don't think Nvidia's DCC is achieving these bandwidth savings through compression alone. They admitted to employing some "tricks" where per-pixel deltas are "very small". Essentially, where pixel delta values are within a certain range of the base pixel value, Nvidia's Pascal chips compress such pixels using the same delta value, which results in massive bandwidth savings and certainly performance benefits, since there's less variance in pixel values when tiles are rendered.


----------



## W1zzard (Sep 6, 2016)

Dimi said:


> I don't understand some of these results, some are completely off from previous reviews.



New test system: New drivers, game updated, Windows 10 Anniversary, benchmark scene changed slightly, maybe different settings (don't remember).


----------



## CAT-THE-FIFTH (Sep 6, 2016)

I don't see the point of this card - VRAM usage is only going to go up as time progresses. In the UK the GTX 1060 3GB starts from £190 to £200, and the GTX 1060 6GB starts from £230 to £240. Over two to three years that is not much more to spend, and I would rather not have another scenario like with the 8800GT 256MB, 8800GTS 320MB and the various other VRAM-limited ATI, AMD and Nvidia cards which fell apart over time, usually in games which pushed graphics somewhat. They all looked pretty decent at release IIRC.

The 8800GT 256MB was a prime example of how VRAM was a big limiting factor:

http://www.anandtech.com/show/2453

Even at a time when 256MB was a common amount of VRAM in older cards, you can see the 9600GT 512MB with only 64 shaders destroying the 112-shader 8800GT 256MB, and they were both of a similar generation too (the 9600GT launched somewhat later, and the 8800GT 256MB was a stop-gap card to counter the HD3850 and HD3870 IIRC). At least the 8800GT 512MB was 40% more expensive than the 8800GT 256MB, but the GTX 1060 6GB is barely 20% more expensive and also has more shaders.

Personally, I think AMD and Nvidia have gimped their lower-end "mid-range" cards just enough that they will hit issues quicker, i.e. the GTX 1060 3GB and RX 470 4GB.

Edit!!

Plus DF encountered some problems with the 3GB version too (like Guru3D did), and one of those games was Rise of the Tomb Raider.

Tomb Raider is one of the most taxing modern games currently, and there are no doubt going to be more like it in the next two years.

Hitman was more of the same too. The GTX 970 is ahead in minimums.

The latest AC has stuttering too.

Guru3D also showed that the frametimes for the GTX 1060 3GB in Tomb Raider and Hitman had more spikes.

The thing is, the GTX 1060 3GB is also going to limit your upgrade options if you want to get a higher-resolution monitor, i.e. an ultrawide or QHD one which pushes more pixels, or if you want to use DSR/VSR or push AA.

Plus, what about the resale value of the card - I expect the 6GB version will hold its value much better than the 3GB version, as I expect that if you are that budget-limited you would sell your card on when upgrading too.

As much as I would rather spend less myself on a card, the 6GB version just seems a more consistent option to have over the next two to three years.

In fact, I would rather have had Nvidia disable more shaders and had the card as a 6GB one TBH.


----------



## Steevo (Sep 6, 2016)

I agree that 3GB is on the edge for any new card that I would buy; looking forward to 4K, 4GB seems to be the minimum to have, and all else being equal, it overclocks as poorly as the 480 does in most cases. On a side note, have you tried or seen any issues with video latency and dropouts as reported on the GeForce forums and on Reddit?


----------



## Rowsol (Sep 7, 2016)

@ 1080p, 3GB is fine. You don't need AA and ultra settings to play. I like whatever is the best value, and that's almost always budget cards like this.


----------



## okidna (Sep 7, 2016)

CAT-THE-FIFTH said:


> Edit!!
> 
> Plus DF encountered some problems with the 3GB version too(like Guru3D did),and one of those games was Rise of the Tombraider.
> 
> ...



That's because they're using Very High textures. If you ever play the game, you will know that if you set the texture quality to Very High, the game warns you that Very High textures need more than 4 GB of VRAM.

I don't know about Hitman, but AC Unity also needs more than 3 GB of VRAM with all settings maxed out; I remember using 3.8 GB of VRAM on my GTX 970 at the highest settings.


----------



## ppn (Sep 7, 2016)

3GB should be fine, just like the GTX 780 Ti 3GB; otherwise they would have put 6GB on both. I mean, it's a 3-year-old card now. It can't go from absolute high end to garbage in under 2.5 years.


----------



## Darksword (Sep 7, 2016)

Wait... it performs worse than the 6GB cards, but uses MORE power?  Fantastic.


----------



## Nordic (Sep 7, 2016)

Darksword said:


> Wait... it performs worse than the 6GB cards, but uses MORE power?  Fantastic.


Read the review fully man. That specific card used more power because it was overclocked out of the box.


----------



## Monsuta (Sep 7, 2016)

Joss said:


> You can't visually distinguish this from a GTX 1080; the cooler seems to be basically the same. Other manufacturers go the same route, and it's wrong. If you pay a premium you should get premium in every detail, including a specific cooler.



Indeed. You pay a $1,200 premium to buy the Pascal-based "NVIDIA TITAN X" without the "GEFORCE GTX" naming, unlike the previous Maxwell-based "NVIDIA GEFORCE GTX TITAN X", thinking you can finally get rid of that silly "GEFORCE GTX" - but you still get it on the cooler and the backplate.


----------



## mac007 (Sep 7, 2016)

Other than Hitman, the 3GB and 6GB are not far apart in FPS.
The GTX 980 is still slightly better than the 3GB in all cases, and in some cases above the 6GB too.


----------



## idx (Sep 7, 2016)

I wonder why MSI gave this low-power chip really good cooling and a backplate, while in the case of the RX 470 they removed one of the heat pipes as well as both the front plate (memory cooling) and the backplate.

RX 470:

GTX 1060 3GB:

It seems that they really gave the RX 470 a less effective cooler, even though it needs good cooling more than this card does.


----------



## Dethroy (Sep 7, 2016)

bug said:


> 1,920x1,080=2,073,600 pixels
> 1,600x900=1,440,000 pixels
> 
> 1,440,000/2,073,600=0.69
> ...


It's actually 44% more performance


----------



## Adam Freeman (Sep 7, 2016)

idx said:


> I wonder why MSI gave this low power chip a really really good cooling and a back plate, while in case of the RX470 they removed one of the heat pipes and both of the front plate (memory  cooling) and back plate.
> 
> RX 470:
> 
> ...



Regarding AMD cards with the Polaris architecture, I believe MSI designed a specific cooler for them thinking they required less power than NVIDIA cards with the Pascal architecture, so they removed one of the heatpipes. But in the case of the GTX 1060, which was released later, they didn't design a specific cooler for it; they simply used the existing one they mount on the GTX 1070 and 1080 cards.


----------



## bug (Sep 7, 2016)

Dethroy said:


> It's actually 44% more performance


No, it isn't. The baseline here is 1920x1080. When going to 1600x900, there are 30% fewer pixels to push.
But that's just me being pedantic.


----------



## Dethroy (Sep 7, 2016)

bug said:


> No, it isn't. The baseline here is 1920x1080. When going to 1600x900, there are 30% less pixels to push.


Let's pretend we would score 100 fps with 1920x1080. If we were to apply your logic we would gain 30 fps for a total of 130 fps when rendering in 1600x900. Your math showed that (1600*900)/(1920*1080)=0.69.
130 fps * 0.69 = 90 fps → Oops!


bug said:


> But that's just me being pedantic.


It's rather you being stubborn.


----------



## rtwjunkie (Sep 7, 2016)

refillable said:


> You should stop testing BF4, the game is older than my 2-year-old cousin.
> 
> Fallout 4 seems broken in 1080p. No Man Sky (the super stupid hyped game) is broken in all resolution. Deus Ex MD has a stupidly low FPS, it's broken as well. I hope DX12 is going to fix it.



Troll much?

What I really see here is someone who either has read these things somewhere, and thus they are "facts", or who has a system that can't play any of the above games without problems.

FO4 works fine at 1080p for me, BF4 is still a taxing test for many systems, and I just finished playing at a rock-solid 59 FPS in DEMD... in DX11.


----------



## efikkan (Sep 7, 2016)

ppn said:


> 3GB should be fine, just like GTX 780T 3GB, otherwise they would have put 6GB on both. I mean its 3 year old card now. Can't go from absolute high end to garbage in under 2.5 years.


Correct. 3 GB is fine for this card. It's not like it's fast enough to push demanding games above 1080p anyway.


----------



## Shatun_Bear (Sep 7, 2016)

efikkan said:


> Correct. 3 GB is fine for this card. It's not like it's fast enough to push demanding games above 1080p anyway.



Yeah, I'm sure it's perfectly 'fine', but if you're spending £230, you'd be a numpty to go for a card that has 3GB over one with 4GB or 6GB that costs the same amount of money or less.


----------



## etayorius (Sep 7, 2016)

Thanks for the review, Wizz. Finally low-level API benchmarks - great news.

But I think the RX 480 should be faster in Doom under Vulkan; most other benchmarks I've seen at 1080p Ultra settings show the RX 480 averaging 110-115 FPS, so a 90 FPS average seems somewhat low.


----------



## jabbadap (Sep 7, 2016)

A bit off topic, but: heh, I had to chuckle a bit. Confess, W1zzard, you could not resist buying that monitor of yours with that kind of name on it. I kind of dislike Acer as a brand, but is it any good?


----------



## Artas1984 (Sep 7, 2016)

refillable said:


> You should stop testing BF4, the game is older than my 2-year-old cousin.
> 
> Fallout 4 seems broken in 1080p. No Man Sky (the super stupid hyped game) is broken in all resolution. Deus Ex MD has a stupidly low FPS, it's broken as well. I hope DX12 is going to fix it.



You should stop talking garbage. TechPowerUp has been testing some of the older titles since the beginning of time. Quake 4, Crysis, and Call of Duty 4 were tested for 5 or 6 years in a row and there was no problem with that; as long as newer games like Crysis 2 or Call of Duty: Advanced Warfare were included together with them later on, it was interesting to compare the same older vs. newer games. So expect Battlefield 4 to stay for another 2 years, as long as some new Battlefield title is included in the comparison. It's a trend that is common on TPU.

Perhaps you have forgotten that out of all the Battlefield games, Battlefield 4 is still the most popular multiplayer one? It will stay as long as it has a validation point in the community.


----------



## W1zzard (Sep 7, 2016)

jabbadap said:


> A bit of off topic, but: Heh had to chuckle a bit, confess W1zzard you could not resist a buy that monitor of yours with that kind of name on it. I kind of dislike acer as a brand, but is it any good?


Back then it was the cheapest monitor with multiple inputs and proper support for 1600x900 and 2560x1440, besides 4K. It is somewhat prone to thinking BIOS boot screen = monitor off, but other than that, no issues.

That's not the monitor I use for everyday work; it's only for the VGA test system (my work-system monitors are a Dell U3011 2560x1600 and a 1280x1024 Eizo to the right of it).


----------



## AC BEE (Sep 8, 2016)

Nice review.
If the AMD and NVIDIA cards are both non-reference:
The RX 480 beats the GTX 1060 6G a little bit.
The RX 470 beats the GTX 1060 3G a little bit and is about equal to the GTX 1060 6G.

The RX 480 reference is a total disaster. It is even worse than the RX 470 Nitro and Red Devil.


----------



## xorbe (Sep 8, 2016)

AC BEE said:


> _... first post, bashing AMD in NV review thread ..._



Welcome, you'll fit right in here!!!


----------



## Ungari (Sep 8, 2016)

Reviews are not consistent across the board so far. For some reason the benchmarks for the 3GB vary wildly between various review sites, and way outside the margin of error.

If 3.5 GB of VRAM on the GTX 970 was a limitation in several games 2 years ago, why isn't 3GB of VRAM a bad idea in 2016?
As to the value conclusion, for the price of this MSI custom card one could get the full 8GB RX 480, never mind that many reference 4GB 480s actually have 8GB on the board that can be unlocked, a value that cannot be beat.
I cannot understand how anyone could recommend purchasing this card.

I am dismayed that many reviewers are not chastising Nvidia for what is a deceptive practice: not renaming the cut-down version of this card to GTX 1050 or 1050 Ti, something to let the consumer know it is NOT a true 1060.


----------



## rtwjunkie (Sep 8, 2016)

Ungari said:


> I am dismayed that many reviewers are not chastising Nvidia on what is a deceptive practice by not renaming the cutdown version of this card to GTX 1050 or 1050Ti, something to let the consumer know it is NOT a true 1060.



It's actually normal for them.  All the way back to the 9000 series they have had variations of the same model, distinguished only by the different VRAM amount in the name.

If customers can't read, then maybe they have no business being customers?


----------



## sutyi (Sep 8, 2016)

Pretty useless having 3GB of VRAM in this performance class of GPU, especially in this price range. New games pretty much eat up to 3.5GB at 1080p.
Unless you fancy constant video-memory-related micro-stuttering in new AAA titles, go 4GB+ if you get my drift...


----------



## bug (Sep 8, 2016)

Dethroy said:


> Let's pretend we would score 100 fps with 1920x1080. If we were to apply your logic we would gain 30 fps for a total of 130 fps when rendering in 1600x900. Your math showed that (1600*900)/(1920*1080)=0.69.
> 130 fps * 0.69 = 90 fps → Oops!
> It's rather you being stubborn.



Classic mistake. That 0.7 factor is determined when going from 1920x1080 to 1600x900. You can't apply the same factor when going from 1600x900 back to 1920x1080 and expect to get back where you started.
It's like you earning $1,000 a month and your boss comes and tells you: due to financial difficulties I'm going to cut your paycheck by 50% (multiply by .5), but I'm going to give you 50% more (multiply by .5 again) after 6 months.



Ungari said:


> If 3.5 VRAM  on the GTX 970 was a limitation in several games 2 years ago, why isn't 3GB VRAM a bad idea in 2016?



3.5 GB VRAM was never an actual limitation. Just something people liked to argue about.
The only way 3.5GB could be a limitation is if a game spent a significant amount of time precisely between 3.5 and 4 GB. And even if such a game existed, you'd lower texture quality a bit and be in the clear. Most games offer only minimal differences beyond Medium texture quality and once you're past High quality, you can usually only spot differences in screen shots.


----------



## troelses (Sep 8, 2016)

bug said:


> Classic mistake. That .7 factor is determined when going from 1920x1080 to 1600x900. You can't apply the same factor when going from 1600x900 back to 1920x1080 and expect to get back where you started.
> It's like you earning $1,000 a months and your boss comes and tells you: due to financial difficulties I'm going to cut your paycheck by 50% (multiply by .5), but I'm going to give you 50% more  (multiply by .5 again) after 6 months.



Actually, you're the one who made a classic mistake in the original post, and Dethroy was just trying to point this out to you by using your own logic. You said that 900p gained 30% more performance due to 30% fewer pixels; however, with that statement you are mixing terms like "more" and "fewer" without correcting for the denominator. The correct statement would be that 900p gains up to 44% more performance than 1080p, since 1080p has 44% more pixels, and vice versa that 1080p gets 30% lower performance since 900p has 30% fewer pixels.

In other words, you basically said a reduction of 30% (in pixels) would be equalled by an increase of 30% (in performance), which is exactly the same as saying that cutting your paycheck by 50% is equalled by an increase of 50%.
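The asymmetry described above is easy to verify numerically; a minimal sketch (the resolutions are the ones from this thread, the percentages are pure arithmetic):

```python
# Percentage changes are not symmetric: dropping ~30% of the pixels
# corresponds to a ~44% increase when measured in the other direction.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_900p = 1600 * 900    # 1,440,000

fewer = 1 - pixels_900p / pixels_1080p  # 900p has ~30.6% fewer pixels
more = pixels_1080p / pixels_900p - 1   # 1080p has exactly 44% more pixels

print(f"900p has {fewer:.1%} fewer pixels")   # 30.6% fewer
print(f"1080p has {more:.1%} more pixels")    # 44.0% more
```

So the theoretical ceiling on the FPS gain when dropping to 900p is 44%, not 30%; applying +30% and then multiplying by 0.69 is what produced the 90 FPS contradiction earlier in the thread.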


----------



## RaviSS (Sep 8, 2016)

So the new performance hierarchy taking GTX 1070 = 100 becomes (all reference cards):

GTX 1070 8GB - 100
GTX 1060 6GB - 71
RX 480 8GB - 67

GTX 1060 3GB - 65 (a reference card should be ~3% slower than the card tested, as the cores are overclocked 4% and the memory 0%, so overall 8-9% slower than the 6GB version).
RX 480 4GB - 63 (on avg. 6% slower than 8GB version)
RX 470 4GB - 54

GTX 1060 3GB easily has the best price-performance ratio at reference price.


----------



## Ungari (Sep 8, 2016)

rtwjunkie said:


> It's actually normal for them.  All the way back to the 9000 series they have had variations of the same model, distinguished only by the different VRAM amount in the name.
> 
> If customers can't read, then maybe they have no business being customers?



There is nothing stated on the box that tells the consumer it has fewer CUDA cores than the 6GB version, so many will buy this thinking the only difference is VRAM.
It is deceptive to call this card a 1060, and Nvidia should be censured for it.


----------



## Ungari (Sep 8, 2016)

bug said:


> 3.5 GB VRAM was never an actual limitation. Just something people liked to argue about.
> The only way 3.5GB could be a limitation is if a game spent a significant amount of time precisely between 3.5 and 4 GB. And even if such a game existed, you'd lower texture quality a bit and be in the clear. Most games offer only minimal differences beyond Medium texture quality and once you're past High quality, you can usually only spot differences in screen shots.



GTA V, Shadow of Mordor, R6 Siege, and Skyrim with HD texture packs exceed 3GB of VRAM - the limitations are real.


----------



## rtwjunkie (Sep 8, 2016)

Ungari said:


> There is nothing stated on the box that tells the consumer it has less CUDA cores from the 6GB version, and so many will buy this thinking the only difference is VRAM.
> It is deceptive to call this card a 1060, and Nvidia should be censured for it.



Again, they've done this now for at least ten years, and many times the difference is not just with the VRAM amount, and no one has called for their censure.  It seems like you are just noticing this? 

I'm not saying it's not a little bit crappy, but it's never actually been something I have seen people getting upset over.


----------



## Nordic (Sep 8, 2016)

For all those who can't make sense of why someone would buy this card when new AAA games will use more than 3GB of memory: I don't play new AAA games. The most intensive game I play is Star Citizen, and that gets bad performance even on the best of hardware.

As I said before:


james888 said:


> As far as I can tell the 1060 3gb gives about 6% less performance than the 1060 6gb at 1080p. The price difference between the cheapest option of both is $60 in the USA when I just looked 5 minutes ago. $60 cheaper for a 6% performance hit seems worth it to me.
> The 480 4gb does not seem as good of a deal to me. It is the same price of $199.99 but uses a 30% more power. The 470 just doesn't match the performance.
> 
> I would rather have 4gb to 8gb vram for star citizen, but by the time that game comes out I will probably have upgraded 2 more times.



It is the best value card right now by a lot. I bought an EVGA mini-ITX one to go in my ITX build. It should be here today.


----------



## bug (Sep 8, 2016)

Ungari said:


> GTA V, and Shadow Of Mordor, R6 Siege, and Skyrim with HD Texture Packs exceed 3GB VRAM, ---the limitations are real.


That's not even coherent.
The card has 3.5GB of VRAM that runs at full speed and an additional 0.5GB that runs at reduced speed. The only scenario where it would suffer compared to a card with 4GB of VRAM running at full speed is when a game spends a significant amount of time exactly between 3.5 and 4 GB of VRAM. If a game stays over 3GB but under 3.5GB of VRAM, the 970 is fine. If the game goes over 4GB, the 970 would have been in trouble even with 4GB of full-speed VRAM.

*And I'm not even sure what HD texture packs are. Do they come standard with those titles?


----------



## Ungari (Sep 8, 2016)

HD textures are available to be enabled in the last three games I listed.
R6 Siege has a 6GB VRAM texture pack.


----------



## rtwjunkie (Sep 8, 2016)

Ungari said:


> GTA V, and Shadow Of Mordor, R6 Siege, and Skyrim with HD Texture Packs exceed 3GB VRAM, ---the limitations are real.



Skyrim is completely unable to exceed 3GB of VRAM. The reason has to do with textures being replicated in system RAM. Since it is a 32-bit game, you can see that this presents a problem for game stability. Even the LAA patch only allows for 4GB of RAM use.

Since game files and any mods that aren't texture mods also have to be in RAM, this makes it impossible, without setting up ENB, to have an extraordinary number of texture mods. ENB skirts this by keeping the textures only in VRAM.


----------



## Ungari (Sep 8, 2016)

rtwjunkie said:


> Again, they've done this now for at least ten years, and many times the difference is not just with the VRAM amount, and no one has called for their censure.  It seems like you are just noticing this?
> 
> I'm not saying it's not a little bit crappy, but it's never actually been something I have seen people getting upset over.



Can you cite a previous product where they cut CUDA cores but sold the card under the same name as the full-CUDA model?


----------



## Ungari (Sep 8, 2016)

james888 said:


> It is the best value card right now by a lot. I bought an EVGA mini-ITX one to go in my ITX build. It should be here today.



If you are shopping for value and wish to ignore the RX 480, just wait a few months until Nvidia finally releases its new Volta architecture.
There will be a fire sale on Paxwell cards, especially used ones that people will be dumping on eBay for over 50% off what they paid for them.


----------



## kaspar737 (Sep 11, 2016)

@W1zzard Why didn't you downclock the card to reference clocks like you have done previously when you don't have a reference card? And any chance to get a Palit/Gainward/PNY 1060 for a review?


----------



## xorbe (Sep 12, 2016)

kaspar737 said:


> @W1zzard Why didn't you downclock the card to reference clocks like you have done previously when you don't have a reference card? And any chance to get a Palit/Gainward/PNY 1060 for a review?



With the new boost stuff, it's most likely better to flash the card to stock speeds.


----------



## W1zzard (Sep 12, 2016)

kaspar737 said:


> @W1zzard Why didn't you downclock the card to reference clocks like you have done previously when you don't have a reference card?


Not reliable due to boost and other parameters in BIOS. I'm probably just gonna buy a card that's very close to stock


----------



## kaspar737 (Sep 12, 2016)

W1zzard said:


> Not reliable due to boost and other parameters in BIOS. I'm probably just gonna buy a card that's very close to stock


https://www.computerbase.de/2016-08/geforce-gtx-1060-279-euro-test/

The Gainward/Palit/PNY (all the same card) should be a close match to reference, and it would be a good chance to kill two birds with one stone (I've seen others here asking for a review of this card as well).


----------



## MicheleM79 (Sep 19, 2016)

Hi guys,
new member here and really newbie when talking about graphics cards.
This is a great review, and I need one simple suggestion from you, regarding this card.
The main thing here is:  I don't game, absolutely no games on my machine. I'm a programmer/system administrator
and I'm about to purchase 3 wide monitors (29" 2560x1080). I really enjoy a multi-monitor rig because
it increases my productivity a lot. Sometimes I also enjoy watching some Blu-ray/MKV. The PC is really silent and
I'd like to keep it that way!  The current card is old and has only one digital out, so it's time to change.
So the question: considering budget's not a problem, is this card (MSI GTX 1060 X 3G) the right one for me?
Should I buy the 6GB version for a better multi-monitor setup? Or is it better to wait for the upcoming 1050 to save money
(perhaps the MSI version, which seems silent since the fans stop in idle)? Or should I look at an AMD RX 460/470?

Many thanks in advance,
Regards
Michele


----------



## bug (Sep 19, 2016)

MicheleM79 said:


> Hi guys,
> new member here and really newbie when talking about graphics cards.
> This is a great review, and I need one simple suggestion from you, regarding this card.
> The main thing here is:  I don't game, absolutely no games on my machine. I'm a programmer/system administrator
> ...


A 1050 should be plenty for your needs (even integrated GPUs from Intel are enough for the job), but wait for reviews on the 1050 and then compare with the 460/470.
I think what you need is something with passive or semi-passive cooling and the necessary number of outputs. Outputs shouldn't be a problem these days, but cooling solutions vary quite a lot.
The amount of VRAM is also irrelevant; you're not going to have to store huge textures.

If you watch movies, you may want to look at video output quality. I know AMD used to have an advantage (i.e. look better), but since I'm not a movies aficionado, I don't know whether that still holds true (or where to go to get an update).

Also, if you're going to use Linux, you're looking at another decision. Nvidia had the best Linux support for a while, but AMD has recently switched to an open-source driver strategy. That driver seems to be still up and coming, but it seems to do well enough on newer hardware.


----------



## MicheleM79 (Sep 19, 2016)

bug said:


> A 1050 should be plenty for your needs (even integrated GPUs from Intel are enough for the job), but wait for reviews on the 1050 and then compare with the 460/470.
> I think what you need is something with passive or semi-passive cooling and the necessary number of outputs. Outputs shouldn't be a problem these days, but cooling solutions vary quite a lot.
> The amount of VRAM is also irrelevant; you're not going to have to store huge textures.
> 
> ...



@bug  Yeah, thanks bug for your suggestions.
My main needs/concerns are multiple digital outputs, silence, and decent video quality (also with 4K/HEVC, for a future monitor upgrade/addition).
Regarding silence, yes, I've always had passive Asus cards, but this MSI GTX seems to stay at 0 dB in idle (fans stop).
Another interesting point is VRAM: do I really not need 6GB for a multi-monitor rig (considering gaming is not a need)?

Many Thanks
Michele


----------



## bug (Sep 19, 2016)

MicheleM79 said:


> @bug  Yeah, thanks bug for your suggestions.
> My main needs/concerns are multiple digital outputs, silence, and decent video quality (also with 4K/HEVC, for a future monitor upgrade/addition).
> Regarding silence, yes, I've always had passive Asus cards, but this MSI GTX seems to stay at 0 dB in idle (fans stop).
> Another interesting point is VRAM: do I really not need 6GB for a multi-monitor rig (considering gaming is not a need)?
> ...


Nope, you absolutely don't need 6GB for multiple monitors. If you look here: https://www.techpowerup.com/reviews/Performance_Analysis/Deus_Ex_Mankind_Divided/5.html you'll see that not even games need that much (unless the devs don't care and just load up tons of textures).
To give you an idea, 4K is ~8MP. At 32 bits per pixel, you're looking at ~32MB per frame. Three monitors -> ~96MB. VRAM is definitely not an issue if you're not doing 3D work (e.g. gaming, CAD).
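
The arithmetic above is easy to verify with a back-of-envelope script (a minimal sketch; it assumes a standard 32-bit, i.e. 4-bytes-per-pixel, framebuffer, and the function name is mine):

```python
# Back-of-envelope framebuffer math: memory needed to hold one on-screen
# frame at 32 bits (4 bytes) per pixel.

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one frame in MiB."""
    return width * height * bytes_per_pixel / 2**20

print(f"one 4K frame:      {framebuffer_mib(3840, 2160):.1f} MiB")
print(f"three 2560x1080:   {3 * framebuffer_mib(2560, 1080):.1f} MiB")
print(f"three 4K frames:   {3 * framebuffer_mib(3840, 2160):.1f} MiB")
```

One 4K frame comes out to about 32 MiB, and three of them to about 95 MiB, matching the rough figures quoted above.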


----------



## Ungari (Sep 19, 2016)

Gaming or not, I would say the 3GB 1060 is not the card for anyone, just on principle. This card is deceptively named, as it is not a full 1060 chip but one with cut-down shaders, and it should be litigated as consumer fraud.

Since you say budget is not an issue but you would still like to save money, I would give consideration to the used card market, particularly used EVGA cards, as the 3-year warranty is transferable, even without a receipt, since they will honor the warranty from the date of manufacture. To give an example, I just saw a used EVGA 980 Ti 6GB going for $350; this card was selling new at over $700 a year ago.

If you just want to run three 1080p monitors and play movies, you don't really need a high-end card like the example I gave you, but if you go for a used card then why the hell not?
Particularly if you would like to go to higher resolutions, which I think you should if you haven't yet purchased the monitors. Why? Because soon 1080p will be like today's 720p, and the prices of 1440p FreeSync monitors are coming way down.
Besides future relevance, it just plain looks awesome, and why not watch movies with the best visual quality?

Sounds like you are about to have a fun time shopping!


----------



## MicheleM79 (Sep 19, 2016)

Perfect,
many thanks to bug and Ungari, really some interesting things to think about!
Yeah, no problem with budget but if I could save some money and put it in other
hardware, it's only good!
Can this card (GTX 1060) drive 2560x1080 (or higher) on each of the 3 monitors at the same time? I hope it can! I think in the future I will also add a 4K TV to the HDMI out.
Regarding movie video quality, I also read about AMD being superior, but that was years ago;
I don't know if it's still the case today.

Thanks guys!


----------



## Steevo (Sep 19, 2016)

MicheleM79 said:


> Perfect,
> many thanks to bug and Ungari, really some interesting things to think about!
> Yeah, no problem with budget but if I could save some money and put it in other
> hardware, it's only good!
> ...




AMD was better with video, was. The new Crimson drivers hide the video adjustment settings and the only way to access them is through registry keys/values. They have done better with driver performance but haven't bothered to fix the new interface with the updates they promised long ago. So essentially they are still doing the same crap but with a different focus.


----------



## MicheleM79 (Sep 20, 2016)

Steevo said:


> AMD was better with video, was. The new Crimson drivers hide the video adjustment settings and the only way to access them is through registry keys/values. They have done better with driver performance but haven't bothered to fix the new interface with the updates they promised long ago. So essentially they are still doing the same crap but with a different focus.



Yeah, I don't know which one is better for raw video quality; maybe, as you said, it's only a matter of drivers?
Or maybe today there's no difference. OK guys, many thanks. I think I'll wait a bit for the GTX 1050, in the meantime looking for something
on the used market (not so easy where I live, though, and I always prefer new if possible).
One more thing: if I want to use the 3 DisplayPort outputs on the card for 3 HDMI/DVI monitors, could I run into problems?
Would a simple DP <=> HDMI cable be enough?

Many thanks,
Regards


----------



## Steevo (Sep 21, 2016)

MicheleM79 said:


> Yeah, I don't know which one is better for raw video quality; maybe, as you said, it's only a matter of drivers?
> Or maybe today there's no difference. OK guys, many thanks. I think I'll wait a bit for the GTX 1050, in the meantime looking for something
> on the used market (not so easy where I live, though, and I always prefer new if possible).
> One more thing: if I want to use the 3 DisplayPort outputs on the card for 3 HDMI/DVI monitors, could I run into problems?
> ...


AMD is still marginally better if you spend the time to tweak the settings in the registry. With MHC and a few other programs able to use shaders/CUDA for hardware acceleration and upscaling, and now that Nvidia has corrected/added the ability to tweak output format and color depth, it's tit for tat if you spend the time to tweak the settings for either brand's hardware.

The only remaining issues are the many reports on the GeForce forums of occasional black screens and video latency/popping with Nvidia cards, while AMD has higher multi-monitor power consumption since the clock/power domains are still tied together for video output.

You may need active DP to HDMI cable adapters for either brand.


----------



## W1zzard (Sep 21, 2016)

Oh for media playback, always use madVR. "Makes SD look like HD"


----------



## Steevo (Sep 21, 2016)

What he said ^^^^


----------



## Nordic (Sep 21, 2016)

W1zzard said:


> Oh for media playback, always use madVR. "Makes SD look like HD"


It does more than that. It makes a low-quality video download look like HD.


----------



## Marstg (Oct 29, 2016)

Hello W1zzard
Would it be possible to do some Folding@home testing after you finish the main 3D part and update us with the results a few days later? After all, you have all the latest and greatest hardware on hand!


----------

