
MSI GTX 1060 Gaming X 3 GB

Going forward it seems like GTX 1060 is going to lag behind RX480 in DX12/Vulkan games.
No, there is no evidence of that. GTX 1060 beats RX 480 in unbiased games.
Even with more AMD-favoring titles than ever (console ports and more), Nvidia still manages better performance overall. For PC gamers, the GTX 1060 remains the superior choice in every metric: raw performance, performance per dollar, and performance per watt.
 
Fury's PPD is getting better, I see......
 
You can't visually distinguish this from a GTX 1080; the cooler seems to be basically the same. Other manufacturers go the same route, and it's wrong. If you pay a premium, you should get premium in every detail, including a specific cooler.
 
I don't understand some of these results, some are completely off from previous reviews.

For example RotTR went down 16.2 fps in the latest review.

rottr_1920_1080.png


Can you please check this out?
 
I would have no issue buying this for 1080p use. It's faster across the board than a GTX 970 for $199.99 (if you shop wisely). I don't see the point in getting one of these more expensive models like the MSI Twin Frozr. The whole point (to me anyway) of a cut-down card is to save money, and they're just too close to a 6GB model to make sense. As far as the longevity of 3GB goes, I personally wouldn't worry. I haven't kept a GPU more than two years since my 8800 GTX (which lasted 3 1/2). Mid-range cards? I don't expect more than that out of them.
 
As far as I can tell, the 1060 3GB gives about 6% less performance than the 1060 6GB at 1080p. The price difference between the cheapest option of each was $60 in the USA when I looked 5 minutes ago. $60 cheaper for a 6% performance hit seems worth it to me.
The 480 4GB does not seem as good a deal to me. It is the same price of $199.99 but uses about 30% more power. The 470 just doesn't match the performance.

I would rather have 4gb to 8gb vram for star citizen, but by the time that game comes out I will probably have upgraded 2 more times.
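Putting rough numbers on that trade-off (a quick sketch using the post's own figures: $199.99, the $60 gap implying roughly $259.99 for the 6GB, and the ~6% deficit; illustrative, not current market data):

```python
# Back-of-the-envelope value comparison using the prices quoted above.
# Performance is normalized to the 1060 6GB = 100; the 3GB is ~6% slower.
cards = {
    "GTX 1060 6GB": {"price": 259.99, "perf": 100.0},
    "GTX 1060 3GB": {"price": 199.99, "perf": 94.0},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.3f} perf per dollar")
    # GTX 1060 6GB: 0.385 perf per dollar
    # GTX 1060 3GB: 0.470 perf per dollar
```

By this crude metric the 3GB card comes out ahead, which is the point being made; it ignores VRAM headroom and resale value, which other posts in the thread raise.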
 
Last edited:
It's an alright card, but price-wise it's nowhere near as competitive as the RX 480 4GB version. I think the 470 is bandwidth-starved in most of these games.
AMD's DCC is not quite as effective as Nvidia's when it comes to bandwidth savings. I don't think Nvidia's DCC is achieving these bandwidth savings through compression alone. They admitted to employing some "tricks" where per-pixel deltas are "very small". Essentially, where pixel delta values are within a certain range of the base pixel value, Nvidia's Pascal chips compress such pixels using the same delta value, which results in massive bandwidth savings and certainly performance benefits since there's less variance in pixel values when tiles are rendered.
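A toy sketch of the delta idea described above (purely illustrative: the tile size, delta width, and packing are made up for the example and are not Nvidia's actual scheme):

```python
# Toy sketch of delta color compression (DCC). If every pixel in a tile
# stays within a small signed delta of a base pixel, the tile can be
# stored as one full base value plus narrow per-pixel deltas, which is
# where the bandwidth saving comes from.

def compress_tile(tile, delta_bits=4):
    """Return (base, deltas) if all deltas fit in delta_bits, else None."""
    base = tile[0]
    limit = 1 << (delta_bits - 1)          # signed range, e.g. -8..7 for 4 bits
    deltas = [p - base for p in tile]
    if all(-limit <= d < limit for d in deltas):
        return base, deltas                # compressible: 8 + 4*n bits vs 8*n
    return None                            # too much variance: store uncompressed

# A flat-ish tile of 8-bit values compresses; a noisy one does not.
print(compress_tile([200, 201, 199, 202]))   # (200, [0, 1, -1, 2])
print(compress_tile([200, 40, 255, 10]))     # None
```

The "tricks" mentioned above would sit on top of something like this: when the deltas cluster tightly, the hardware can represent them even more cheaply.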
 
I don't understand some of these results, some are completely off from previous reviews.

New test system: New drivers, game updated, Windows 10 Anniversary, benchmark scene changed slightly, maybe different settings (don't remember).
 
I don't see the point of this card - VRAM usage is only going to go up as time progresses. In the UK the GTX 1060 3GB starts from £190 to £200, and the GTX 1060 6GB starts from £230 to £240. Over two to three years that is not much more to spend, and I would rather not have another scenario like the 8800GT 256MB, 8800GTS 320MB and the various other VRAM-limited ATI, AMD and Nvidia cards, which fell apart over time, usually in games which pushed graphics somewhat. They all looked pretty decent at release IIRC.

The 8800GT 256MB was a prime example of how VRAM was a big limiting factor:

http://www.anandtech.com/show/2453

Even at a time when 256MB was a common amount of VRAM in older cards, you can see the 9600GT 512MB with only 64 shaders destroying the 112-shader 8800GT 256MB, and they were both of a similar generation too (the 9600GT launched somewhat later, and the 8800GT 256MB was a stop-gap card to counter the HD3850 and HD3870 IIRC). At least the 8800GT 512MB was 40% more expensive than the 8800GT 256MB, but the GTX 1060 6GB is barely 20% more expensive and also has more shaders too.

Personally I think AMD and Nvidia have gimped their lower-end "mid-range" cards just enough that they will hit issues quicker, i.e. the GTX 1060 3GB and RX 470 4GB.

Edit!!

Plus DF encountered some problems with the 3GB version too (like Guru3D did), and one of those games was Rise of the Tomb Raider.



Tomb Raider is one of the most taxing modern games currently, and there are no doubt going to be more like it in the next two years.

Hitman was more of the same too.

I7T9Ivb.png

The GTX970 is ahead in minimums.

Latest AC has stuttering too.

csOpBYn.png


Guru3D also showed that the frametimes for the GTX 1060 3GB in Tomb Raider and Hitman had more spikes.

The thing is, the GTX 1060 3GB is also going to limit your upgrade options if you want to get a higher-resolution monitor, i.e. an ultrawide or QHD one which pushes more pixels, or if you want to use DSR/VSR or push AA.

Plus, what about the resale value of the card - I expect the 6GB version will hold its value much better than the 3GB version, as I expect that if you are that budget-limited you would sell your card on when upgrading too.

As much as I would rather spend less myself on a card, the 6GB version just seems a more consistent option to have over the next two to three years.

In fact I would rather Nvidia had disabled more shaders and made the card a 6GB one, TBH.
 
Last edited:
I agree that 3GB is on the edge for any new card that I would buy, but looking forward to 4K, 4GB seems to be the minimum to have, and all else being equal, it overclocks as poorly as the 480 does in most cases. On a side note, have you tried or seen any of the issues with video latency and dropouts reported in the GeForce forums and on Reddit?
 
@ 1080p, 3GB is fine. You don't need AA and ultra settings to play. I like whatever is the best value, and that's almost always the budget cards like this one.
 
Edit!!

Plus DF encountered some problems with the 3GB version too (like Guru3D did), and one of those games was Rise of the Tomb Raider.



Tomb Raider is one of the most taxing modern games currently, and there are no doubt going to be more like it in the next two years.

That's because they're using Very High textures. If you ever play the game, you will know that if you set texture quality to Very High, the game warns you that Very High textures need more than 4 GB of VRAM.

I don't know about Hitman, but AC Unity also needs more than 3 GB of VRAM with all settings maxed out; I remember using 3.8 GB of VRAM on my GTX 970 with the highest settings.
 
3GB should be fine, just like the GTX 780 Ti 3GB, otherwise they would have put 6GB on both. I mean, that's a 3-year-old card now. It can't go from absolute high end to garbage in under 2.5 years.
 
Wait... it performs worse than the 6GB cards, but uses MORE power? Fantastic.
 
Wait... it performs worse than the 6GB cards, but uses MORE power? Fantastic.
Read the review fully, man. That specific card used more power because it was overclocked out of the box.
 
You can't visually distinguish this from a GTX 1080; the cooler seems to be basically the same. Other manufacturers go the same route, and it's wrong. If you pay a premium, you should get premium in every detail, including a specific cooler.

Indeed, you pay a $1,200 premium for the Pascal-based "NVIDIA TITAN X", which drops the "GEFORCE GTX" naming the Maxwell-based "NVIDIA GEFORCE GTX TITAN X" had, thinking you can finally be rid of that silly "GEFORCE GTX" branding, yet it's still on the cooler and the backplate.
 
Other than Hitman, the 3GB & 6GB are not far apart in fps.
The GTX 980 is still slightly better than the 3GB in all cases, and in some cases above the 6GB too.
 
I wonder why MSI gave this low-power chip really good cooling and a back plate, while in the case of the RX 470 they removed one of the heat pipes and both the front plate (memory cooling) and the back plate.

RX 470:
L1000723.jpg

RX 470:
L1000728.jpg



GTX1060 3GB:
cooler3.jpg

GTX1060 3GB:
cooler2.jpg



It seems that they really gave the RX 470 a less effective cooler, even though it needs good cooling more than this card does.
 
1,920x1,080=2,073,600 pixels
1,600x900=1,440,000 pixels

1,440,000/2,073,600=0.69

That's a theoretical maximum of 30% more performance when pushing fewer pixels. Depending on how CPU-bottlenecked a given title is, you'll see 10-25% more performance in practice. Less than 10% in non-demanding games, but those would already be running at hundreds of FPS.
It's actually 44% more performance ;)
 
I wonder why MSI gave this low-power chip really good cooling and a back plate, while in the case of the RX 470 they removed one of the heat pipes and both the front plate (memory cooling) and the back plate.

RX 470:
L1000723.jpg

RX 470:
L1000728.jpg



GTX1060 3GB:
cooler3.jpg

GTX1060 3GB:
cooler2.jpg



It seems that they really gave the RX 470 a less effective cooler, even though it needs good cooling more than this card does.

Regarding AMD cards with the Polaris architecture, I believe MSI designed a specific cooler for them thinking they required less power than Nvidia cards with the Pascal architecture, so they removed one of the heatpipes. But in the case of the GTX 1060, which was released later, they didn't design a specific cooler for it; they simply used the existing one they mount on the GTX 1070 and 1080 cards.
 
It's actually 44% more performance ;)
No, it isn't. The baseline here is 1920x1080. When going to 1600x900, there are 30% less pixels to push.
But that's just me being pedantic.
 
No, it isn't. The baseline here is 1920x1080. When going to 1600x900, there are 30% less pixels to push.
Let's pretend we would score 100 fps with 1920x1080. If we were to apply your logic we would gain 30 fps for a total of 130 fps when rendering in 1600x900. Your math showed that (1600*900)/(1920*1080)=0.69.
130 fps * 0.69 = 90 fps → Oops!
But that's just me being pedantic.
It's rather you being stubborn.
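For anyone following the back-and-forth: both figures describe the same ratio, just in opposite directions. A quick sketch:

```python
# 1600x900 has ~31% fewer pixels than 1920x1080, but a purely
# pixel-bound GPU gains ~44% fps, because the speedup is the
# reciprocal of the pixel ratio, not its complement.
full = 1920 * 1080      # 2,073,600 pixels
reduced = 1600 * 900    # 1,440,000 pixels

pixel_ratio = reduced / full    # ~0.694 -> ~31% fewer pixels to push
speedup = full / reduced        # 1.44   -> 44% more fps at best

print(f"{(1 - pixel_ratio) * 100:.1f}% fewer pixels")    # 30.6% fewer pixels
print(f"{(speedup - 1) * 100:.1f}% more fps (ceiling)")  # 44.0% more fps (ceiling)
```

Rendering 0.69x the pixels does not cap the gain at 30%; it means each frame costs 0.69x as much, so the frame rate ceiling is 1/0.69 ≈ 1.44x.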
 
You should stop testing BF4, the game is older than my 2-year-old cousin.

Fallout 4 seems broken at 1080p. No Man's Sky (the stupidly over-hyped game) is broken at all resolutions. Deus Ex MD has stupidly low FPS; it's broken as well. I hope DX12 is going to fix it.

Troll much?:rolleyes:

What I really see here is someone who either has read these things somewhere, and thus they are "facts", or who has a system that can't play any of the above games without problems.

FO4 works fine at 1080p for me, BF4 is still a taxing test for many systems, and I just finished playing at a rock-solid 59 fps in DEMD... in DX11.
 
Last edited:
3GB should be fine, just like the GTX 780 Ti 3GB, otherwise they would have put 6GB on both. I mean, that's a 3-year-old card now. It can't go from absolute high end to garbage in under 2.5 years.
Correct. 3 GB is fine for this card. It's not like it's fast enough to push demanding games above 1080p anyway.
 
Correct. 3 GB is fine for this card. It's not like it's fast enough to push demanding games above 1080p anyway.

Yeah, I'm sure it's perfectly 'fine', but if you're spending £230, you'd be a numpty to go for a card that has 3GB over one with 4GB or 6GB that costs the same amount of money or less.
 