# how much VRAM do you need? (1080p)



## NorthboundOcclusive (Nov 8, 2017)

i have a 660 TI (specifically this flavor) that i bought the day they were released (impulse buy, no regrets) to replace a refurbished HD 5750
given the age of the card, that only 1 fan runs (below 86c anyway), and increased workload demanded by my usual games, i figure im due to upgrade. (i might hold out until the next cycle of rx600/gtx11 series drop)
i have a firm (but not solid) budget of $300 earmarked for a GPU, which puts me right in gtx 1060/rx580 territory (1070ti are so tempting, but thanks to miners, well outside of my price bracket)
from reviews, i know the rx580 might be a few percent faster than the 1060, but 1060s get there with much less power draw and are the better value for it.

but one thing ive noticed goes unmentioned in reviews of the usual suite of benchmark games: their vram usage. i vaguely recall a year or so back it was more of an issue, with (then-)newer titles being released

1060s come in 3 and 6gb vram flavors (and i know the 3gb is slightly less powerful), rx570s have 4gb and 580s have 8gb
presently im only gaming at 1080p, single monitor (1440 might be nice, but would take a new monitor), and the vram use i typically see hovers around 1gb at the low end and 1.6/1.7 on the high end, but i do have a few games (doom 2016/quake champions, deus ex:MD, ashes of the singularity for example) that'll gladly eat up every bit of the 2gb vram i have.

so, as for the title, what's the typical vram load these days? is 8gb basically future-proofing if you're not running at 4k resolutions or VR titles (and im not)

thanks for advice!


----------



## Toothless (Nov 8, 2017)

I ran 2gb on my 660 for a few years and was fine cept for a few games. Never really went close to 3gb on my 780s until I ran 5760x1080.

I'd say at least 4gb should be enough. Even a 970 would be an okay upgrade for you.


----------



## P4-630 (Nov 8, 2017)

Depends on the game and the settings.
The best choice would be a GTX1060 6GB or an RX580 8GB IMO.


----------



## Liviu Cojocaru (Nov 8, 2017)

Best choice would be the 1060 6GB imo, 4gb of VRAM is more than enough for 1080p gaming


----------



## Divide Overflow (Nov 8, 2017)

P4-630 said:


> Depends on the game and the settings.
> The best choice would be a GTX1060 6GB or an RX580 8GB IMO.


What game takes 6GB or greater at 1080 resolution?


----------



## EarthDog (Nov 8, 2017)

When buying a card today for 1080p, there is no way I would go less than 4GB and plan on playing with Ultra settings and AA. This also assumes I will be keeping the card for a couple years. VRAM use is only going up.


----------



## hat (Nov 8, 2017)

Apparently the new cod hits near 8gb maxed out at 1080...


----------



## Athlonite (Nov 8, 2017)

Divide Overflow said:


> What game takes 6GB or greater at 1080 resolution?



Skyrim SE can with a butt load of mods. I regularly see 6.5~7GB of VRAM usage on my RX580 8GB @ 1080p.


----------



## HD64G (Nov 8, 2017)

For now, apart from heavily modded games (Skyrim for instance), 3-4GB of VRAM is enough, if not for ultra settings then at least for very high. In the next 1-2 years I guess 6-8 might become the new standard. And as for the games using 7-8 gigs @1080p, that's just pre-caching that doesn't affect any GPU having a lower VRAM amount.


----------



## FreedomEclipse (Nov 8, 2017)

hat said:


> Apparently the new cod hits near 8gb maxed out at 1080...



This is one thing that confuses me... All it is, is just brown and grey textures with the odd sand yellow thrown in somewhere.


----------



## natr0n (Nov 8, 2017)

hat said:


> Apparently the new cod hits near 8gb maxed out at 1080...



COD engine now is designed to fill up all available vram with textures. It gives a false sense of oh man I need more vram on my card.


----------



## newtekie1 (Nov 8, 2017)

natr0n said:


> COD engine now is designed to fill up all available vram with textures. It gives a false sense of oh man I need more vram on my card.



Agreed, the actual performance numbers show it doesn't need nearly that much RAM.  The 6GB GTX1060 outperforms the 8GB RX580 in COD:WWII.  If 8GB of RAM was actually being used, and not just loaded with unused textures, then the RX580 would easily be winning.

And I find that to be the case with most games that use more than 4GB of RAM at 1080p (and even the higher resolutions, really).  Most of them are just filling the RAM with unused textures due to lazy programming.


----------



## FireFox (Nov 8, 2017)

Choices are choices but i wouldn't go less than 8GB if you intend to keep the Card for a few years.



natr0n said:


> COD engine now is designed to fill up all available vram with textures. It gives a false sense of oh man I need more vram on my card.



Talking about that, Titanfall 2 uses 8GB+ VRAM; 12GB VRAM is what's recommended.


----------



## jallenlabs (Nov 8, 2017)

I run an RX580 with 4GB of vram.  Only paid $199 for it when it first came out.  Plays all my games (BF1, COD WWII, Doom, Pubg and Titanfall 2) at ultra at 1080p no problem.  No real need for 8GB for this gpu.


----------



## FireFox (Nov 8, 2017)

jallenlabs said:


> Titanfall 2



Resolution?


----------



## jallenlabs (Nov 8, 2017)

1080p.  Haven't watched ram usage in anything, but I haven't had any performance issues either.  That has me curious though.


----------



## FireFox (Nov 8, 2017)

jallenlabs said:


> 1080p.  Haven't watched ram usage in anything, but I haven't had any performance issues either.  That has me curious though.



Never said i had issues, but believe me it uses more than 8GB VRAM, plus i play at 1440.

Maybe @EarthDog remembers the thread i created about Titanfall 2 and the 8GB+ VRAM usage?


----------



## Gmr_Chick (Nov 8, 2017)

Knoxx29 said:


> Choices are choices but i wouldn't go less than 8GB if you intend to keep the Card for a few years.
> 
> Talking about that, Titanfall 2 uses 8GB+ VRAM; 12GB VRAM is what's recommended.





Knoxx29 said:


> Never said i had issues, but believe me it uses more than 8GB VRAM, plus i play at 1440.
> 
> Maybe @EarthDog remembers the thread i created about Titanfall 2 and the 8GB+ VRAM usage?



OK, now I'm curious myself because I really want to play Titanfall 2 when I get my new rig up and running, and I'm torn between a GTX 1060, RX 570 (@ OP, RX 570's come in 8GB versions as well) and RX 580. I use a 1080p 60hz BenQ monitor, so I'm wondering if I actually need 8GB of VRAM or if it's "safe" to just get the 4 gig version of either RX card.


----------



## eidairaman1 (Nov 8, 2017)

4GB; anything less and you're wasting your time.


----------



## EarthDog (Nov 8, 2017)

Gmr_Chick said:


> OK, now I'm curious myself because I really want to play Titanfall 2 when I get my new rig up and running and I'm torn between a GTX 1060, RX 570 (@ OP, RX 570's also come in 8GB versions as well  ) and RX 580. I use a 1080p 60hz BenQ monitor so I'm wondering if I actually need 8GB of VRAM or if it's "safe" to just get the 4 gig version of either RX cards.


6-8gb won't hurt anything. At the end of those cards' lives in a couple years, it could be helpful. I'd get 6-8gb now, hands down.


----------



## hapkiman (Nov 9, 2017)

Liviu Cojocaru said:


> Best choice would be the 1060 6GB imo, 4gb of VRAM is more than enough for 1080p gaming



He's right.

GTX 1060 6GB.  Shop around eBay, etc, and you can pick up a used one on the cheap.


----------



## Gmr_Chick (Nov 9, 2017)

EarthDog said:


> 6-8gb wont hurt anything. At the end of those cards lives in a couple years, it could be helpful. Id get 6-8gb now, hands down.



Much appreciated EarthDog


----------



## lZKoce (Nov 9, 2017)

I bought my RX560 2GB a few months ago. But I play StarCraft mainly. To be honest, RAM is not what's bothering me with this card. I still say it depends on what you are looking for in a card. I say get a balanced card: noise, temps, looks.


----------



## FireFox (Nov 9, 2017)

lZKoce said:


> I say get a balanced card : noise, temps, looks.



And the most important thing: Ram


----------



## NorthboundOcclusive (Nov 9, 2017)

thanks for the replies, all


----------



## Vario (Nov 9, 2017)

1070 or Vega 56 at a minimum, which are a bit out of your budget unfortunately, or wait until the next upgrade cycle.  The 1060 6GB and RX 580 8GB are just overpriced for what you get.  You want something that won't go obsolete in 3-4 years, and there's no guarantee on those two I think.  If the 1060 6GB / RX 580 8GB were $180-220 it would be more in the ballpark.  After all, the 1060 is a 192-bit low-mid range card; crazy to pay $300+ for it.  8GB VRAM is really the minimum so you don't have to upgrade again for 3-4 years.  With the current inflated prices on cards, I would just hold out until the next cycle.  The current generation has been out for well over a year; it's old tech.


----------



## EarthDog (Nov 9, 2017)

+1

Yeah horsepower is a different story... but if that is the budget, that is the budget.


----------



## Vario (Nov 9, 2017)

EarthDog said:


> +1
> 
> Yeah horsepower is a different story... but if that is the budget, that is the budget.


I think between the 1060 6GB and the 580 8GB, both being around $300 and some change, I'd get the 580.

First, because of the 2GB more VRAM and the 256-bit bus; that extra 2GB of VRAM might make all the difference in the next few years on whether you can "run it" or not.

Second, Radeon has been supporting older cards in its current releases, while Nvidia has not, so I think you get a longer-term solution with the Radeon.
If it were me, and it sort of is right now since I am in a similar boat, I am going to wait until the next generation instead of either of these cards.  But this reminds me of the 7970 vs 680: the 7970 is still the card to have had in retrospect, with more VRAM, larger bandwidth (384-bit vs 256-bit), and better future driver support.


----------



## EarthDog (Nov 9, 2017)

I agree with the vast majority (even mentioned your first bullet point much earlier), but you lost me here....


Vario said:


> and better future driver support.


----------



## Vario (Nov 9, 2017)

EarthDog said:


> I agree with the vast majority (even mentioned your first bullet point much earlier), but you lost me here....


I believe Nvidia dropped the ball on Kepler support after Maxwell came out.


----------



## EarthDog (Nov 9, 2017)

Sorry.. did they stop working or something? I recall seeing rumors about performance slowing down, which was debunked, but I don't see the issue here, really. For the vast majority, the latest and greatest work a generation or so back. Surely there are exceptions, but... I don't know, I just prefer not to hang my hat on speculation. 

I do like the wider bus though!


----------



## Vario (Nov 9, 2017)

EarthDog said:


> Sorry.. did they stop working or something? I recall seeing rumors about performance slowing down, which was bunked, but, I don't see the issue here, really. For the vast majority, the latest and greatest work a generation or so back. Surely there are exceptions, but... I don't know, I just prefer not to hang my hat on speculation.
> 
> I do like the wider bus though!


The wider bus is something people overlook a lot.


----------



## dirtyferret (Nov 9, 2017)

I have a 1060 3GB and have never had a single issue achieving excellent performance in any game.  I also don't sit there playing games with Fraps on, so I'm not tossing a hissy fit if I get 50FPS in a game rather than 60FPS.  There is also the difference between using GPU RAM and getting performance from GPU RAM.  Just because a game will use 5GB of RAM doesn't mean you will see a performance increase; it could just be a RAM hog (oftentimes it is).

There is also another side to this story.  Nvidia and AMD push developers to use more RAM since it can help sell video cards, yet the performance difference with a game on Ultra settings vs the same game at Very High is often negligible at best.  Logical Increments does an excellent job of showing what you gain or lose with each drop in graphic settings vs the frame rate benefits for certain games.

http://www.logicalincrements.com/games/witcher3


----------



## jallenlabs (Nov 9, 2017)

Just played some Titanfall 2 and recorded the VRAM usage.  Never went above 3900MB and stayed around 3700MB most of the time.  No stutters or performance issues.  As long as you stay at 1080p you'll be fine.


----------



## EarthDog (Nov 9, 2017)

What were your settings? Important to know.


----------



## Vario (Nov 9, 2017)

jallenlabs said:


> Just played some Titan Fall 2 and recorded the vram usage.  Never went about 3900mb and stayed around 3700mb most of the time.  No stutters or performance issues.  As long as you stay at 1080p you'll be fine.


But in 5 years, will he be fine? He is using a 660ti from 2012 in 2017.  The card he used before that, a 5750, was 3 years old when he upgraded. If we expect the same duration until the next upgrade, 8GB is needed.


----------



## EarthDog (Nov 9, 2017)

At least 6... agreed.


----------



## dirtyferret (Nov 9, 2017)

jallenlabs said:


> Just played some Titan Fall 2 and recorded the vram usage.  Never went about 3900mb and stayed around 3700mb most of the time.  No stutters or performance issues.  As long as you stay at 1080p you'll be fine.



Titanfall 2 is one of those games where more RAM does not always translate into a performance increase @ 1080p.


----------



## FireFox (Nov 10, 2017)

EarthDog said:


> What were your settings? Important to know.



For sure not the same settings i use.



jallenlabs said:


> Just played some Titan Fall 2 and recorded the vram usage.  Never went about 3900mb and stayed around 3700mb most of the time.  No stutters or performance issues.  As long as you stay at 1080p you'll be fine.


 
So, i found the Thread i created in 2016 about *Titanfall 2.*

Here:

https://www.techpowerup.com/forums/threads/8gb-gpu-ram-not-enough-to-handle-titanfall-2.227631/


Here:

*Titanfall 2 Ram usage*

jallenlabs said:


> No stutters or performance issues. As long as you stay at 1080p you'll be fine.



As said before, i never had any performance issue. Yeah, maybe at 1080p the RAM usage is less than when you play at 1440 and insane settings.

However, i took this screenshot a few minutes ago while playing *Titanfall 2*


----------



## NorthboundOcclusive (Nov 10, 2017)

Vario said:


> wait until the next upgrade cycle.  1060 6GB and RX 580 8GB is just overpriced for what you get.  You want something that won't go obsolete in 3-4 years, no guarantee on those two I think.  If the 1060 6GB / RX 580 8GB was $180-220 it would be more in the ballpark.  After-all, the 1060 is 192bit low-mid range card, crazy to pay $300+ for it.  8GB VRAM is really the minimum so you don't have to upgrade again for 3-4 years.  With the current inflated prices on cards, I would just hold out until next cycle.  The current generation has been out for well over a year, its old tech.


have to admit im leaning in the 'wait' direction myself. my 660TI was about $300 on launch, so getting this much use out of it for this long, id say it's money well spent and good value at that.
if the current gen cards, both red and green, were anywhere near their msrp, it would be a different story im sure (recall seeing something like the 1070 had a $375 msrp or something ridiculous like that. bloody miners. )



Vario said:


> But in 5 years, will he be fine? He is using a 660ti from 2012 in 2017.  The card he used before that, a 5750 was 3 years old when he upgraded. If we expect same time duration until next upgrade, 8GB is needed.


aye, glorious pc gaming master race right here. buy mid-range, milk it until it burns out or you run into a game that you just have to upgrade for.


----------



## Vario (Nov 10, 2017)

NorthboundOcclusive said:


> aye, glorious pc gaming master race right here. buy mid-range, milk it until it burns out or you run into a game that you just have to upgrade for.



I think in some ways it is more fun to wait, then the upgrade feels so much more potent.


----------



## Frag_Maniac (Nov 11, 2017)

You need a bare minimum of 4GB VRAM on an increasing number of games now, and some will require 6GB or more for max textures. Keep in mind that regarding 4GB, the 970 doesn't really count because it really only has 3.5.


----------



## EarthDog (Nov 12, 2017)

Frag Maniac said:


> Keep in mind that regarding 4GB, the 970 doesn't really count because it really only has 3.5.


No... it has 4GB... just 512MB of it is slower.


----------



## ASOT (Nov 12, 2017)

For low settings 2GB, medium to high settings 3GB, high to very high or ultra 4GB.


----------



## EarthDog (Nov 12, 2017)

Who plays PC on anything less than high unless they are forced to????


----------



## Frag_Maniac (Nov 12, 2017)

EarthDog said:


> No... it has 4GB... just 512MB of it is slower.


I stand by what I said. *All* of the VRAM capacity should buffer as quickly as any other VRAM. The fact that that 512MB doesn't results in undesirable performance in some games. They got sued because they *should* have been, for the false advertising they did.


----------



## EarthDog (Nov 13, 2017)

There are people that stand by the Flat Earth theory... that doesn't make them correct. 

Being serious, I agree that is what *all* of it *should* do... but, its still 4GB total, not 3.5 GB. They were sued for false advertising of the memory subsystem, not because it was missing 512MB.


----------



## Frag_Maniac (Nov 13, 2017)

EarthDog said:


> There are people that stand by the Flat Earth theory... that doesn't make them correct.
> 
> Being serious, I agree that is what *all* of it *should* do... but, its still 4GB total, not 3.5 GB. They were sued for false advertising of the memory subsystem, not because it was missing 512MB.


So this off-the-wall banter was started by your not being serious? Then what point do you have?

There was no reason for them to cripple the cards with over 12% of the memory performing way too slow to function as normal VRAM. And I never said they were sued over the amount of VRAM, just over not being up front about how slow that 512MB of it was. Even though I do feel, as many do, that the way that 512MB functions is effectively not healthy VRAM.

There were a lot of people who bought them who at first thought it was no issue. Funny how many of them ended up opting to get that $30 when it was offered though, and IMO it wasn't enough.


----------



## EarthDog (Nov 13, 2017)

No... this started by you saying there was only 3.5gb on the 970 without qualifying your statement. It has 4gb... just some is notably slower.

Lol, just because they opted in doesn't mean they had a problem. It just means they bought the card and can. It's a class action lawsuit; only a purchase was required.


----------



## P4-630 (Nov 13, 2017)

I'm glad I_ didn't_ buy a 960 or a 970....


----------



## EarthDog (Nov 13, 2017)

Why not a 960 outside of being slow?


----------



## P4-630 (Nov 13, 2017)

EarthDog said:


> Why not a 960 outside of being slow?



Yeah that was the card I was looking at first but I decided to wait a little longer(while was using intel graphics at that time).....
Also I was able to save some more cash till the 1070 came out and so I went for it!


----------



## EarthDog (Nov 13, 2017)

Gotcha.. good info in a vram thread.


----------



## P4-630 (Nov 13, 2017)

EarthDog said:


> Gotcha.. good info in a vram thread.



Hehe


----------



## Frag_Maniac (Nov 13, 2017)

EarthDog said:


> No... this started by you saying there was only 3.5gb on the 970 without qualifying your statement. It has 4gb... just some is notably slower.
> 
> Lol, just because they opted in doesnt mean they had a problem. It just means they bought the card and can. Its a class action lawsuit, only a purchase was required.


I think you're splitting hairs, on the one hand admitting that 512MB is quite a bit slower, yet acting like it's normal VRAM. Whether they can legally even CALL it VRAM is debatable, since they DID lose that lawsuit. One thing is certain though: at least they know now they're not legally allowed to not clarify that a portion of the VRAM is significantly slower when they indulge in such idiotic designs.

And no, not everyone noticed problems with it right away, but there have been many games made since where 4GB VRAM is the minimum requirement, and I've seen bench tests on them showing how the 970 can start stuttering when the frame buffer starts exceeding 3.5GB, where 4GB VRAM cards don't. So for all intents and purposes, effectively it's not the same as having an actual 4GB of full speed VRAM. That's all that matters really.
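
That 3.5GB stutter threshold can be sketched with a toy bandwidth model. This is a rough sketch only: the ~196 GB/s and ~28 GB/s partition figures are the commonly reported ones for the 970 (assumed here, not from this thread), and real-world hitching also involves driver paging, not just raw bandwidth.

```python
# Toy model of the GTX 970's segmented memory. A working set that spills past
# the fast 3.5 GB partition is partly serviced by the much slower 512 MB segment.
# Bandwidth figures (~196 and ~28 GB/s) are the commonly reported ones, assumed here.

FAST_GB, FAST_BW = 3.5, 196.0   # fast partition: size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0    # slow partition

def effective_bandwidth(working_set_gb: float) -> float:
    """Average bandwidth if every byte of the working set is touched once."""
    fast = min(working_set_gb, FAST_GB)
    slow = min(max(working_set_gb - FAST_GB, 0.0), SLOW_GB)
    time_s = fast / FAST_BW + slow / SLOW_BW   # time to stream the whole set
    return working_set_gb / time_s

print(round(effective_bandwidth(3.0), 1))   # 196.0 GB/s: entirely in the fast segment
print(round(effective_bandwidth(4.0), 1))   # 112.0 GB/s: the last 0.5 GB drags it down
```

Even in this simplified model, a 4GB working set runs at barely half the fast partition's speed, which is consistent with frame-time spikes showing up right as the buffer crosses 3.5GB.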

I also think it's rather disingenuous, and even contradictory, of those who sought the $30 if they claimed they had no issues with the product. It's called being a hypocrite.


----------



## EarthDog (Nov 14, 2017)

You consider it not even there and I'm splitting hairs? Nice. 

It's VRAM. That was never in question. Just the speed was not disclosed, and if you believe Nvidia (...) by mistake. As time goes on, games use more and more VRAM. If there is frequent swapping over the 3.5GB threshold, hitching and slowdowns can be noticed in some titles. It's not a card that matures well, that is for sure.

It's curious to me how you believe the RAM isn't really there, and that people should/should not participate in the class action lawsuit based on performance and whether they are seeing the problem, instead of the merits of the case... the fact that it did not disclose the significant speed difference of the last 512MB. Misleading specs affect everyone, regardless of who it actually affects in-game. 

Anyway... good talking.


----------



## Frag_Maniac (Nov 14, 2017)

It's "there" per se, but really not in performance, when you consider that by the time it does its job it's too late to avoid problems in games that require 4GB VRAM. What part of that is so friggin hard to get? Would you settle for an 8-core CPU that has one notoriously slow core? I don't think so. I get the feeling you don't even know what splitting hairs means. If you were a car salesman and you had an 8-cylinder car on your lot that you knew had one cylinder that was always misfiring, would you claim it ran as well as any other 8 cyl rig? If so you'd be a con artist.

The only reason Nvidia got away with this for a short time was the games that released when the 970 came out didn't require nearly as much VRAM as they do now. Now that people know the design of this card, no one wants it.

I would think you'd get that by now, but all you do is drone on like an Nvidia sycophant claiming this thing has a legit 4GB VRAM, when it's obvious over 12% of  it functions nowhere near as well as normal VRAM.

The way it should have been advertised is 3.5GB VRAM + 512MB cache RAM, and we all know by now cache RAM does not perform anywhere NEAR the speed of VRAM. When it comes to video memory, speed is of the utmost importance. You're acting like it's only about capacity.

So again, for the slow to conceptualize, I'm not saying it's not there physically, I'm saying it's not there in performance, which is more important, and that's what I mean by your splitting hairs. Who gives a shit if it's there if it's slow as molasses?

It's a crippled card plain and simple Dog. Don't see how you don't get that because you're usually up on things. Only shit lawyers that enjoy making money off crooks would defend Nvidia on this, which is why they got sued. The suit wasn't about lying about capacity, no, but it WAS about not divulging how slow that part of the memory was, ergo speed is just as important as capacity.

At any rate, I'm glad they blundered this way, because they deservedly wound up with a lot of egg on their face, and probably know now they won't be able to get away with it again, not that they really got away with anything to begin with. More like it damaged their reputation.


----------



## EarthDog (Nov 14, 2017)

For the most part, I'm with you. I suppose I'm too literal a person. After all, this is the post I responded to...


Frag Maniac said:


> Keep in mind that regarding 4GB, the 970 doesn't really count because it really only has 3.5.



To my literal mind, well, that's what got us here. It's also clear we differ on the severity of the issue, but I'm already dizzy from our circles. 

Again, cheers.


----------



## Frag_Maniac (Nov 14, 2017)

EarthDog said:


> For the most part, im with you. I suppose im too literal of a person. After all, this is the post i responded to...
> 
> 
> Too my literal mind, well, thats what got us here. Its also clear we differ on the severity of the issue, but, im already dizzy from our circles.
> ...


I think the problem is you're misinterpreting that post.

I could have said ..."because it only has 3.5GB VRAM". I instead said "...because it *really* only has 3.5GB VRAM".

I think you know by now what I meant by that after explaining it several times, and for the record, I could just as easily misinterpret what you mean by agreeing with me "for the most part". Your clarification on that is no more clear than the part I said that you misinterpreted.

Anyways, I think we've both had our say on the matter. I see no sense in derailing an entire thread over it.


----------



## Vario (Nov 14, 2017)

I think right now, with the gaming consoles having a lot of VRAM and many $300 mainstream graphics cards having 6GB and up, it is reasonable to expect that going forward 4GB is the absolute minimum one should have, and you should aim for 8GB+ if you want to keep your system relevant for high-end gaming (AAA games, 1080p+ resolution, high settings) beyond a year.  Also, the current crop of cards are all at least 1-year-old technology, including the 2016-released GeForce 1060 and its competitor the Radeon RX 580, which is essentially a binned and slightly improved RX 480.  Next update cycle we may find all budget cards having at least 4GB, mainstream cards having at least 8GB, and extreme cards going beyond 8GB.  Seems like it increases 2GB per generation.


----------



## Frag_Maniac (Nov 14, 2017)

Vario said:


> I think right now, with the gaming consoles having a lot of VRAM and many $300 mainstream graphics cards having 6GB and up, it is reasonable to expect that going forward 4GB is the absolute minimum one should have, and you should aim for 8GB+ if you want to keep your system relevant for high-end gaming (AAA games, 1080p+ resolution, high settings) beyond a year.  Also, the current crop of cards are all at least 1-year-old technology, including the 2016-released GeForce 1060 and its competitor the Radeon RX 580, which is essentially a binned and slightly improved RX 480.  Next update cycle we may find all budget cards having at least 4GB, mainstream cards having at least 8GB, and extreme cards going beyond 8GB.  Seems like it increases 2GB per generation.


I'm kinda surprised they're selling as many 1060 3GB cards as they are. People seem fooled by the fact that they perform well on most games at 1080p, but going forward there are going to be a lot more games that you can't use max textures on with only 3GB VRAM. Hell, even 4GB is a risky choice going forward.

My 3GB 7970 is quickly becoming more and more inadequate. I have the funds to get the TV and 1080 Ti I want, but I'm waiting for a drop in price on the TV, and hoping to get a good price on the 1080 Ti in the same time frame.

I just saw the triple fan MSI 1080 Ti Duke for $690 though, so things are heading that direction finally.


----------



## StefanM (Nov 24, 2017)

Just a tidbit: i tried to run the new Wolfenstein demo on a 2 GB GPU (because i didn't RTFM).
The game tries to allocate 4 GB even at 1280x720 



> NVIDIA device detected, defaulting GPU triangle culling to off
> enabling image dropmip as device has less than 4000 MiB vramInitializing Vulkan subsystem
> ShowGameWindow: (0, 0) 1280 x 720, full screen


----------



## jboydgolfer (Nov 24, 2017)

my GTX 970 has 3.5Gb's of Vram, and .5Gb's of Maple Syrup.


----------



## FreedomEclipse (Nov 24, 2017)

jboydgolfer said:


> my GTX 970 has 3.5Gb's of Vram, and .5Gb's of Maple Syrup.



The best maple syrup that you have ever licked in your entire life (coming from a former 970SLi user/owner)


----------



## Splinterdog (Dec 16, 2017)

This is Forza Horizon 3 on my GTX 970, with an 8320 and 16GB RAM and game settings at Ultra. Nearly 4GB VRAM used. Most of the time I play at high settings because the FPS dip to around 20 on Ultra.
I'm pretty sure the Asus RX580 8GB that I'm getting from Santa will do better than this.


----------



## John Naylor (Dec 16, 2017)

At 1080p, 3 GB is enough ... while you oft will see claims that this or that game needs more, it's almost always based upon a false assumption.  You remember all the fake hoopla about the 970's 3.5 GB ... despite all the ranting, in any game that gave the 970 problems (and this only happened at 4k), the 4 GB 980 provided no improvement.  The reason for this is simply that there is no utility which actually measures VRAM usage.

As an analogy I'll use a credit card account.  You have a Visa card w/ a $5,000 limit, and you spent $500 on it, meaning you owe $500 and have $4,500 in credit remaining.  Yet when you apply for a car loan and the bank asks for a credit report, that report contains a credit liability for Visa of $5,000.  When a game installs, it looks at the amount of VRAM available and based on that "allocates" a certain % of it to be available.  So if you have 8 GB, it might allocate 3.5 GB... if you have 4 GB, it might allocate 2.5 GB.  And it's highly unlikely that actual usage gets anywhere near that level.  There is one way you can see whether VRAM has any impact, and that is to run the game with different amounts and look for changes in quality, user experience, or fps.
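
That allocation step can be sketched as a toy model. The 70% fraction, the function name, and the working-set number below are all invented purely for illustration; real engines use their own heuristics:

```python
# Toy model of "allocation vs usage": the tool-visible number scales with the
# card, while the game's true working set does not. The 0.7 fraction is a
# made-up illustration, not any real engine's heuristic.

def requested_vram_mb(card_vram_mb: int, fraction: float = 0.7) -> int:
    """What a monitoring tool would report: a share of whatever is available."""
    return int(card_vram_mb * fraction)

TRUE_WORKING_SET_MB = 2200  # hypothetical amount the game actually touches

for card_mb in (4096, 8192):
    print(f"{card_mb} MB card: tool shows ~{requested_vram_mb(card_mb)} MB "
          f"'used', true working set still {TRUE_WORKING_SET_MB} MB")
```

The bigger card "uses" more only in the sense of the credit report: the limit went up, not the spending.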

Alienbabeltech did this with some 40+ games with twin 770s (2GB + 4GB).  They observed no significant difference in performance in any game at 1080p.  They then did it at 5760 x 1080 and they did find differences... but the thing is, for those games with differences, the games were simply unplayable... If having 4 GB gets you to 19 fps where the 2GB gets 16 fps, the game is still unplayable.  The kicker was, Max Payne wouldn't even install w/ the 2 GB card installed at 5760 x 1080.  After installing and testing it with the 4GB card, they swapped cards... and since the game was already installed, it didn't go thru the VRAM allocation step... it ran at the same fps, with no change in graphical quality.

Here are some links with other test data:
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

The most recent one is this, where they had to go to 4k and the highest settings to observe any issues with 4 GB:
https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

GPU-Z is a great tool but, as indicated there, measuring VRAM usage is not really what it does.  Here's how it ties into the credit card analogy:

"GPU-Z claims to report how much VRAM the GPU actually uses, but there’s a significant caveat to this metric. GPU-Z doesn’t actually report how much VRAM the GPU is actually using — _instead, it reports the amount of VRAM that a game has requested_. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”

So there is simply no way to **see** VRAM usage when playing any game because there's simply no tool capable of measuring actual usage.
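
For what it's worth, the `nvidia-smi` CLI is subject to the same caveat: its `memory.used` field reports allocated memory, not true usage. A minimal sketch of querying and parsing it (the query flags are real `nvidia-smi` options; the sample values fed to the parser are made up so it can run without a GPU):

```python
import subprocess

def parse_memory_used(csv_text: str) -> list[int]:
    """Parse the output of
    `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`:
    one integer per GPU, in MiB of *allocated* (not truly used) memory."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def query_memory_used() -> list[int]:
    """Run nvidia-smi; requires an NVIDIA GPU and driver to be present."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return parse_memory_used(out)

# Canned sample (two hypothetical GPUs) so the parser runs without hardware:
print(parse_memory_used("3521\n812\n"))   # [3521, 812]
```

Whatever number comes back is the "credit limit" side of the analogy, not the spending.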

One great way to see the impact is to use TechPowerUp's test results.  Compare the 3 GB and 6 GB 1060s, which TPU reviewed.  Ya can't make a direct comparison because the 6 GB model's GPU has about 10% more shaders, so even w/ the same amount of VRAM it would be about 7% faster.  If VRAM were an issue at 1080p, we would expect it to be an even bigger issue at 1440p.  But the fps advantage (due to the shaders) does not vary in any significant way between the 6 GB and 3 GB card, which indicates that the extra 3 GB isn't doing anything for you.  You need to get above 1440p for it to matter.
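That normalization can be sketched in a few lines (the fps numbers below are hypothetical placeholders, not TPU's actual results; only the ~7% shader-derived uplift comes from the paragraph above):

```python
# Divide out the 6GB model's shader-derived advantage (~10% more shaders,
# roughly a 7% fps uplift) to isolate what the extra VRAM contributes.
SHADER_UPLIFT = 1.07  # expected 6GB-over-3GB fps ratio from shaders alone

def vram_effect(fps_6gb, fps_3gb):
    """Performance ratio left over after removing the shader advantage.

    ~1.0 means the gap is fully explained by shaders (VRAM not a factor);
    noticeably above 1.0 means the 3GB card is running out of memory.
    """
    return (fps_6gb / fps_3gb) / SHADER_UPLIFT

# Hypothetical averages at two resolutions:
print(f"1080p: {vram_effect(85.6, 80.0):.2f}")  # ~1.00 -> shader-bound
print(f"1440p: {vram_effect(57.8, 54.0):.2f}")  # ~1.00 -> still shader-bound
```

If a game pushed the 3GB card past its VRAM, this ratio would climb well above 1.0 at the higher resolution.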

Now with 76.4% of those hitting Steam servers @ 1080p, it's a lil hard to say you must plan for more.  But I think any purchase today should keep where ya might be in the next couple years in mind, and my recommendation would be to plan for at least 1440p.  Treat the numbers below **as a minimum** and get as close to them as ya can; more will provide a cushion.

1080p => 3 GB
1440p => 6 GB (1440p has about 1.8 times as many pixels as 1080p)
2160p => 12 GB (2160p has 4.0 times as many pixels as 1080p)
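The pixel-count ratios behind those tiers are straightforward arithmetic:

```python
# Pixel counts per resolution, and how they scale relative to 1080p.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h:,} pixels, {w * h / base:.2f}x 1080p")
# 1440p works out to ~1.78x 1080p, and 2160p to exactly 4.00x.
```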


----------



## Splinterdog (Dec 16, 2017)

The so-called missing 512MB on my 970 has never really bothered me, especially since I upgraded from a Sapphire Radeon 7950 and was blown away by the improvement.
Anyway, Forza Horizon 3 is a very graphically demanding game at full settings, and although the GeForce driver updates have improved performance noticeably, the 970 still struggles at Ultra in real-world terms.


----------



## EarthDog (Dec 16, 2017)

John Naylor said:


> At 1080p, 3 GB is enough ...


Many would disagree... I would call this a MINIMUM these days, with 4GB being strongly preferred.

FPS doesn't tell you much with regards to VRAM. Hitching can be observed when it's swapping out data because VRAM is full.

See post 6.
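That hitching shows up in frame times rather than average FPS; a toy trace makes it obvious (the numbers are made up for illustration):

```python
# Two hypothetical frame-time traces (in ms) with identical averages:
# a steady ~60 fps card vs. one that stalls while swapping VRAM.
steady = [16.7] * 100                 # every frame ~16.7 ms
hitchy = [14.0] * 95 + [68.0] * 5     # mostly fast, with 5 long stalls

def avg_fps(frame_times_ms):
    """Average fps over the whole trace."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def p99_ms(frame_times_ms):
    """Simple 99th-percentile frame time - where hitching shows up."""
    s = sorted(frame_times_ms)
    return s[int(0.99 * len(s)) - 1]

print(f"steady: {avg_fps(steady):.1f} fps avg, p99 {p99_ms(steady):.1f} ms")
print(f"hitchy: {avg_fps(hitchy):.1f} fps avg, p99 {p99_ms(hitchy):.1f} ms")
# Same average fps, but the hitchy trace's worst frames are ~4x longer.
```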


----------



## Batou1986 (Dec 16, 2017)

4gb of vram is not enough for 1080p, whoever came up with the idea that resolution is absolutely tied to how much vram a game can use should be shot.

I don't even have to make any arguments here just fire up total warhammer and try to play on ultra @1080p with 4gb of vram or less.
The game will actually tell you on the loading screen that it does not have enough vram and is reducing settings.


----------



## Vayra86 (Dec 17, 2017)

Batou1986 said:


> 4gb of vram is not enough for 1080p, whoever came up with the idea that resolution is absolutely tied to how much vram a game can use should be shot.
> 
> I don't even have to make any arguments here just fire up total warhammer and try to play on ultra @1080p with 4gb of vram or less.
> The game will actually tell you on the loading screen that it does not have enough vram and is reducing settings.



This.

4 GB bare minimum these days, and the 1060 3GB is a pointless card to begin with. Yes, in the bench hierarchy it looks to be great price/perf, and theoretically the 3GB 'should be enough'... and then there is actual practice, once you have used a 3GB GPU on recent games, which paints the real picture of hitchy, stuttery gameplay.

Another issue with 3GB cards is resale value. It is going to be as nonexistent as that of any 2GB GTX 680 or 770 today. Nobody buys those cards anymore, because 2GB simply isn't enough. For anything, anymore.



John Naylor said:


> At 1080p, 3 GB is enough ... while you oft will see claims that this game needs more, it's almost always based upon a false assumption.  You remember all the fake hoopla about the 970's 3.5 GB ... despite all the ranting, in any game that gave the 970 problems (and this only happened at 4k), the 4 GB 980 provided no improvement.  The reason for this is simply that there is no utility which actually measures VRAM usage.
> 
> As an analogy I'll use a credit card account.  You have a Visa card w/ a $5,000 limit, and you spent $500 on it, meaning you owe $500 and have $4,500 in credit remaining.  Yet when you apply for a car loan and the bank asks for a credit report, that report contains a credit liability for Visa of $5,000.  When a game installs, it looks at the amount of VRAM available and, based on that, "allocates" a certain % of it to be available.  So if you have 8 GB, it might allocate 3.5 GB... if you have 4 GB, it might allocate 2.5 GB.  And it's highly unlikely that actual usage gets anywhere near that level.  There is one way you can see whether VRAM has any impact, and that is to run the game with different amounts and look for changes in quality, user experience, or fps.
> 
> ...



The problem with your sources indicates exactly why you say what you say: they date back to 2012 and 2015.

It's 2017 now; we have a PS4 Pro and Xbox Scorpio, and mainstream cards now push 4 GB as a standard VRAM amount. 4GB is the new '2GB' that was all hot and true in 2012. If you buy a 3 GB card for 1080p as a sensible VRAM amount, it will lose its value fast. It's a different story from, for example, the positioning of the 780 Ti 3GB. That card ALSO suffers from VRAM shortage, but it just doesn't have the core to push more. Today's GPUs do, and 3GB means it'll hamstring that core performance. There is a good reason AMD doesn't even touch it in its midrange offering.

As for the 970 and its 3.5 GB - like the 780(ti) which is quite similar in core capabilities, a single card 970 setup never needed 4 GB and the driver handles that well. In SLI, you do run the risk of touching on those limitations.


----------



## ViperXTR (Dec 17, 2017)

i remember Resident Evil 7 consuming a lot of vram if you enable that one setting (i forgot what it is) and it causes minor stutter on some cards when not enabled


----------



## Vayra86 (Dec 17, 2017)

ViperXTR said:


> i remember Resident Evil 7 consuming a lot of vram if you enable that one setting (i forgot what it is) and it causes minor stutter on some cards when not enabled



Something with dynamic resolution?


----------



## ViperXTR (Dec 17, 2017)

Vayra86 said:


> Something with dynamic resolution?


ah found it, shadow caching. When enabled on low vram cards, it will stutter but runs fine on 6-8gb ones


----------



## Toothless (Dec 17, 2017)

Starting to think the words "need" and "want" are getting blurred in this thread. "Need" would be more "I need x amount minimum to get by" in which IMHO 2-3GB is minimum. "Want" is more on the side of 4GB+ because people are all about pretty textures.


----------



## EarthDog (Dec 17, 2017)

Well, who buys a computer and WANTS to run with lowered textures? If that's OK, then yeah, 3GB or less is OK... otherwise, 4GB.


----------



## cucker tarlson (Dec 17, 2017)

3GB is borderline 
3.5GB is refundable
4GB is safe
6GB is future-proof
8GB/12GB is for those running max details with a lot of anti-aliasing: guys who cheap out on a monitor to buy a Titan X.


----------



## P4-630 (Dec 17, 2017)

If you have the cash, just buy an 8GB VRAM card; otherwise save up a little longer till you can afford one.
(GTX 1070(Ti) , GTX 1080 or Vega)


----------



## cucker tarlson (Dec 17, 2017)

Definitely wait for the 1070 Ti to drop in price, or for GTX Volta/Ampere if you want future-proofing.


----------



## Toothless (Dec 17, 2017)

EarthDog said:


> Well, who buys a computer and WANTS to run with lowered textures? If thats ok, then yeah, 3gb or less is ok.. otherwise, 4gb.


Easier to buy a $100 780 than a $200+ 1060 6gb. Money rules the world and tighter budgets won't allow much.


----------



## EarthDog (Dec 17, 2017)

If that is your budget..not much choice. You can make any shoe fit...im sure you get my point. 

...nobody wants to have to run with lowered textures if it can be helped.


----------



## NdMk2o1o (Dec 17, 2017)

EarthDog said:


> If that is your budget..not much choice. You can make any shoe fit...im sure you get my point.
> 
> ...nobody wants to have to run with lowered textures if it can be helped.


3GB is fine for 1080p IMO, and I am one of those who has a 3GB 780 Ti, so I can speak from experience. It would be OK if the 1060 6GB was available for $200, but it's more like $260-$300, so yeah, I opted for a $140 780 Ti 3GB, and I am not missing a 30% performance increase for a 115% cost increase. I don't run with low textures either, are you crazy? I'll turn off AA, extreme shadows, blur etc. before I reduce texture settings, and I run most games on high settings at 60fps just fine on my measly old 780 Ti with 3GB VRAM.


----------



## EarthDog (Dec 18, 2017)

Yep.. ok with reduced settings and AA. Not how I or most people want to play. You supported the overall point...IQ sacrifices need to be made.


----------



## Gmr_Chick (Dec 18, 2017)

Vayra86 said:


> This.
> 
> 4 GB bare minimum these days and* the 1060 3GB is a pointless card to begin with*. Yes, in the bench hierarchy they look to be great price/perf, theoretically the 3GB 'should be enough'... and then there is actual practice when you have used a 3GB GPU on recent games, which paints the real picture of hitchy, stuttery gameplay.
> 
> Another issue with 3GB cards is resale value. It is going to be as nonexistant as any 2GB GTX 680 or 770 today. Nobody buys those cards anymore, because 2GB simply isn't enough. For anything anymore.



Seeing as the 6GB version is definitely the more popular of the two, why did Nvidia even bother with the 3GB version? Probably to maximize profits, I'm guessing, but even so, it's a gimped 1060, and certainly not suitable for anything over 1080p gaming with reduced settings. Go beyond that, and it gets brutal pretty quickly.

If Nvidia were going to do this from the start, I think they should have *at least* made the naming scheme a bit clearer: the 3GB version would be the 1060, and the 6GB version would be a 1060 Ti... you know, just make it a bit more foolproof, because not everybody knows the difference between the two, and they should, because it's a pretty BIG difference.


----------



## BarbaricSoul (Dec 18, 2017)

Divide Overflow said:


> What game takes 6GB or greater at 1080 resolution?



The new Wolfenstein was pegging my 780 Ti card's 3 GB of VRAM to the point that it was completely unplayable, even at the lowest resolution the game supports and the lowest video settings.


----------



## Frag_Maniac (Dec 18, 2017)

Gmr_Chick said:


> ...it's a gimped 1060, and certainly not suitable for anything over 1080p gaming with reduced settings. Go beyond that, and it gets brutal pretty quickly.


Actually I wouldn't say the 1060 6GB is practical for more than 1080p gaming in general either, especially if we're talking the next few years as a GPU cycle. Think of them more in terms of 1080p cards, with one being able to handle max texture settings at that res, and the other not being able to use max textures in a growing number of games at 1080p.

It's a matter of what settings and what games are played we're talking about here in comparing the two, not what res. Also, it's been common practice for some time to offer different VRAM capacity variants of the same GPU. It's not the same as Ti versions, which have more core power. As long as one understands what they're buying, these different variants do make sense.


----------



## dirtyferret (Dec 18, 2017)

lol future proof.  The 1060 3GB and 6GB will be outdated at the same time.  In some demanding games you may get an increase of 10% from the 1060 6GB compared to the 3GB.  Three years from now no one is going to be playing Call of Duty seven saying wow, the 1060 3GB sucks at 20FPS but the 6GB is smooth as butter at 22FPS.  Video RAM is nice, but video chip horsepower is what drives the game.  Just look at the 8800 GTS 320MB vs 640MB, GTX 460 768MB vs 1GB, GTX 960 2GB vs 4GB; all cards outdated at the same time due to their chip.


----------



## EarthDog (Dec 18, 2017)

Those cards were in the same boat though. The former could run out of vram at their intended res in several titles upon launch... of course, performance differences notwithstanding.


----------



## BadFrog (Dec 18, 2017)

I love when people talk about "future proofing" their purchases for tech related stuff. Makes no sense to do so. Especially where we are with PC tech. Things will get crazy starting this year 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4k      ¯\_(ツ)_/¯


----------



## Frag_Maniac (Dec 18, 2017)

dirtyferret said:


> Video ram is nice but video chip horsepower is what drives the game.


Again, it depends on the games played and the settings you're OK with. A 3GB 1060 will not allow max texture settings on some games, while the 6GB version will. This is not just about frames per second, it's about visual quality.


----------



## EarthDog (Dec 18, 2017)

BadFrog said:


> I love when people talk about "future proofing" their purchases for tech related stuff. Makes no sense to do so. Especially where we are with PC tech. Things will get crazy starting this year 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4k      ¯\_(ツ)_/¯


Maybe? But when will it be prevalent, really? 2560x1440 has been out for years and, according to Steam, sits at 3%... 4K UHD is less than 0.5%. You also have to consider that many don't have GPUs good enough to drive 2560x1440 (1070+/Vega 56+)... and that isn't even talking 4K, where multiple GPUs are needed, or a 1080 Ti/Titan... AMD can't even play in that playground.

1080p will still be the majority for several years to come.


----------



## dirtyferret (Dec 18, 2017)

Frag Maniac said:


> Again, it depends on the games played and the settings you're OK with. A 3GB 1060 will not allow max texture settings on some games, while the 6GB version will. This is not just about frames per second, it's about visual quality.



what games? please link a review to a professional site as I would like to know.



BadFrog said:


> I love when people talk about "future proofing" their purchases for tech related stuff. Makes no sense to do so. Especially where we are with PC tech. Things will get crazy starting this year 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4k      ¯\_(ツ)_/¯


I still like my IPS 24" 1080p monitor, and according to Steam's most recent survey, 77% of gamers do as well.  Obviously monitor resolution is always moving towards a higher spec, but today most people seem to like 1080p just fine. 

The thing with future proofing or being a consumer for anything at all is matching your desired performance to your purchase.  Buying a mini-van because you and your wife have a child and plan to have another one or two is a wise investment.  Buying a mini-van because you are 21 and plan to start a family in 10+ years and want to future proof yourself is probably not a wise investment.


----------



## Frag_Maniac (Dec 18, 2017)

dirtyferret said:


> what games? please link a review to a professional site as I would like to know.


As I already said, it's entirely dependent on what games you're asking about, and what settings you're OK with. There are many games since Shadow of Mordor that have had restrictions and/or warnings in the graphics settings that say you cannot use the higher texture quality options if you have less than 4GB VRAM.

It's very hard to find a list of games that require such things though. You're better off just doing a Google search based on the title of the games in question, along with hardware requirements. For instance "Shadow of Mordor hardware requirements". Pretty much all game requirements now state the amount of VRAM needed.

And BTW, I agree with you on both IPS and 1080p preference. I've tried 4k gaming and TV watching on a 4K TV that supposedly has great scaling and up-converting, and I came away feeling  TVs, GPUs, W10, and programs in general are not ready for 4K yet. When you watch 1080i or 720p broadcasts on pretty much any 4K TV, the image quality is terrible. They up-convert some things in the image fine, but a lot of other things look very blurry. For instance in an NFL game, close-ups of players look fine, everything else, not so good.

4K gaming is hit and miss. Some games play fine at 4K, while others will have performance, HUD, or HDR problems. Worse yet, some programs, even ones that aren't old, do not look right with a 4K desktop res. This is because Windows 10 scales the fonts larger to be readable, and in the process some program's GUIs get messed up. It gets better if you choose a lesser percentage of font scaling, but then you get to the point where it can get a bit too small to read comfortably.

It will take at least 5 years before there's enough UHD content to make 4K displays viable, but the good thing is the UHD broadcast standards (ATSC 3.0), are nearly finalized, and will probably be done early next year. By early 2019 there will likely be TVs with ATSC 3.0 tuners in them, and addon ATSC 3.0 tuners for existing TVs. Once ATSC 3.0 TV broadcasts get mainstream, UHD content and hardware supporting it will be as well. That's what it's going to take for 4K to really be practical.

It's hard to tell exactly how long it will take. The UHD rollover will not be mandatory with a deadline like analog to digital was, it's going to be voluntary. This means UHD broadcasts for the first 2-3 years will likely be only available in "select markets", such as the TV streaming services that offer local live broadcasts only in certain cities.

There are many reasons to be optimistic though. For one, they can use existing transmitters with only slight modifications, because ATSC 3.0 uses RF, just like ATSC 1.0. The new equipment required in the broadcasting stations themselves is actually cheaper than what they're using now. The FCC is going to fund 80 to 90% of that equipment. ATSC 3.0 is more suited for advertising, uses half the bandwidth, has much better reception, has multi lingual capability, and can transmit one fixed (TV) and 2 mobile device transmissions simultaneously, and offer far better reception while doing so. This means the changeover won't be overly expensive, and broadcasters will reach a much larger audience, so expect to see many broadcast stations and cities adopting it early.

I've read stats that say currently only 17% of US consumers own 4K TVs, with roughly the same percentage of people getting their TV broadcasts over the air (antenna). The percentage of US houses that have 4K TVs is projected to be 48% by 2020. I'm willing to bet with ATSC 3.0 being so much better than ATSC 1.0, and cord cutting growing in popularity, the percentage of people getting TV over the air will also rise significantly. However that's also because ATSC 3.0 will be a hybrid antenna/internet system, which will likely have both free local antenna content, as well as streaming options available at a certain cost.


----------



## BadFrog (Dec 18, 2017)

dirtyferret said:


> what games? please link a review to a professional site as I would like to know.
> 
> 
> I still like my IPS 24" 1080p monitor and according to steam most recent survey 77% of gamers do as well.  Obviously monitor resolution is always moving towards a higher spec but today most people seem to like 1080p just fine.
> ...



Me too. I LIKED my 24 inch 3 years ago. But I noticed, when the owner of my company wanted 27 inch monitors, that 1080p on them seemed "pixelated" compared to the 24 inch. When I upgraded to a 27 inch with my Asus ROG Swift, 1440p seemed a lot sharper. I might be spoiled, as the lowest res I work/game on is 1440p. When I look at 1080p screens, they feel blurry to me. It might be that I'm also getting old too lol


----------



## jboydgolfer (Dec 19, 2017)

How cool would it be if you could purchase a card that was only VRAM installed on a pcb that plugged into your pcie slot ,to be accessed by a GPU installed in your system


----------



## BadFrog (Dec 19, 2017)

jboydgolfer said:


> How cool would it be if you could purchase a card that was only VRAM installed on a pcb that plugged into your pcie slot ,to be accessed by a GPU installed in your system



Cool like a yeti eating frozen spaghetti


----------



## EarthDog (Dec 19, 2017)

jboydgolfer said:


> How cool would it be if you could purchase a card that was only VRAM installed on a pcb that plugged into your pcie slot ,to be accessed by a GPU installed in your system


Interesting thought.. but I'd have to imagine horsepower would run out before a VRAM need in a lot of cases. 

I don't know... I'd put that $50 away for an entirely better GPU...

...never really thought that out. It could work on some cards, and if you upgrade monitors/res...


----------



## GoldenX (Dec 19, 2017)

IGP and stolen dynamic RAM is enough, bite me.


----------



## EarthDog (Dec 19, 2017)

I bet. But some prefer to play more than solitaire and Minecraft with settings turned way down to accommodate.


----------



## Vayra86 (Dec 19, 2017)

dirtyferret said:


> lol future proof.  The 1060 3GB and 6Gb will be outdated at the same time.  In some demanding games you may get an increase of 10% from the 1060 6B compared to the 3GB.  Three years from now no one is going to playing Call of Duty seven saying wow the 1060 3GB sucks at 20FPS but the 6GB is smooth as butter at 22FPS.  Video ram is nice but video chip horsepower is what drives the game.  Just look at the 8800 GTS 320mb vs 640, GTX 460 768mb vs 1GB, GTX 960 2GB vs 4GB;  All cards outdated at the same time due to their chip.





Toothless said:


> Starting to think the words "need" and "want" are getting blurred in this thread. "Need" would be more "I need x amount minimum to get by" in which IMHO 2-3GB is minimum. "Want" is more on the side of 4GB+ because people are all about pretty textures.



No. The 1060 is a perfect example: the 3GB card was already obsolete on the day it was released, while the 6GB variant is a pretty well-balanced product in terms of core and VRAM capacity. It's not just the additional cores that make it shine; it's the VRAM that makes the 6GB pleasant to game on and the 3GB unpleasant. TODAY there are already examples of the 3GB version of this card providing much spikier frame times than the 6GB variant while both are pegged at similar average FPS. Note: this is precisely the same as trying your hand at a 780(Ti), which has 3GB and somewhat similar core performance, today - it will tank in the exact same games in the exact same way. For the same reasons the 780(Ti), along with the 7970 with its 3GB, remained very reasonable cards even into 2016; but they are now definitely at the end of their optimal life. They can still push 60 FPS in many games with very decent settings, but will struggle in area transitions and in games with large texture footprints and streamed game worlds: GTA V, The Witcher 3, TW:WH, The Division, and so forth.

The people who buy and/or defend the 1060 3GB today are the bang-for-buck buyers who have no real grasp of real-world performance or of what to look for in a GPU. They Google userbenchmark or some other stupid comparison site, scroll over a couple of bench comparisons, and conclude that the card with higher FPS/dollar is the better card.

Future proofing exists as long as you don't shift your own expectations upward. If you stay at 1080p, then yes, a 6GB card is future-proofing compared to a 3GB card, and an 8GB card has little merit over the 6GB equivalent unless it gets additional core power. It comes down to building a rig for a purpose and buying a card for a purpose, and for the purpose of 1080p/60fps gaming, 3GB is not sufficient any longer.


----------



## cucker tarlson (Dec 19, 2017)

BadFrog said:


> I love when people talk about "future proofing" their purchases for tech related stuff. Makes no sense to do so. Especially where we are with PC tech. Things will get crazy starting this year 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4k      ¯\_(ツ)_/¯


It only leads me to believe that I was wrong in thinking 1080p is going away anytime soon. It won't. Look at the pace at which 1080p high-refresh-rate variable-refresh monitors pop up. There are more of them than 1440p and 4K ones.


----------



## dirtyferret (Dec 19, 2017)

So I was able to dig up a chart with Shadow of War on ultra 1080p settings, and the GTX 1060 3GB can clearly play it (it was mentioned prior that the card would not run on ultra settings).  I have not played this game, so I have no personal experience with it.  

Based off the chart, the GTX 1060 3GB offers a playable experience (46fps avg, 31 min) but clearly not as ideal as the 6GB version.  Surprisingly, it also outperforms several cards with more VRAM, including the Nvidia GTX 970 "quasi" 4GB and the R9 390 w/ 8GB.    

_The ultra tier foes are another level of difficulty entirely. You'll want 4GB VRAM to stand a chance, and anything less than a 1060 3GB will result in occasional trips and missteps. Those can be fatal, and only epic level weaponry like the GTX 1070 and RX Vega 56 will push you past the 60 fps mark. And if you're looking at a 144Hz Palantír (aka display), you might need to drop down to high quality to maximize the accuracy of the visions it sends._







One other thing of note about this game: it takes up 100GB, with 30GB going to the high-definition pack!


_We have also found that, when played on a traditional mechanical hard drive, the game takes ten minutes or so to ‘warm up’, with average frame rates being close to 20fps to begin with. This problem did not occur with SSD drives. This is likely down to the drive struggling with texture streaming - those HD files are massive, after all... The game is huge, though: you will need 100GB of free space to install it, thanks to a 70GB base game install and a 30GB high-definition pack. If you are not intending to play on ultra, uncheck the HD packs from your install as they are simply a waste of space. If you do intend to play on ultra, be sure to play on an SSD - the stuttering and slowdown from traditional drives when streaming in those massive textures ruin the game’s flow._
https://www.pcgamesn.com/middle-ear...arth-shadow-of-war-pc-graphics-performance-4k


----------



## EarthDog (Dec 19, 2017)

100GB w/HD textures... holy shyte!


----------



## GoldenX (Dec 19, 2017)

The 3GB 1060 has a reason for existing: the price difference between the 1050 Ti and the 1060 6GB. Some people can't afford the 6GB version, especially in non-first-world countries where the gap is bigger.
I prefer a 3GB 1060 over a 1050 Ti any day. This is where AMD should have a solid product, yet it doesn't.


----------

