
how much VRAM do you need? (1080p)

3GB is borderline
3.5GB is refundable
4GB is safe
6GB is future-proof
8GB/12GB is for those running max details with a lot of anti-aliasing, or guys who cheap out on a monitor to buy a Titan X.
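
If you're not sure where your own card lands, here's a quick way to check total VRAM and current usage on an NVIDIA card. A minimal sketch, assuming the pynvml bindings are installed (pip install pynvml); AMD users would need a different tool:

```python
# Minimal VRAM check via NVIDIA's NVML bindings (pip install pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values are in bytes

print(f"Total VRAM : {mem.total / 1024**3:.1f} GB")
print(f"In use     : {mem.used / 1024**3:.1f} GB")

pynvml.nvmlShutdown()
```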
 
If you have the cash, just buy an 8GB VRAM card; otherwise save up a little longer till you can afford one.
(GTX 1070 (Ti), GTX 1080, or Vega)
 
Definitely wait for the 1070 Ti to drop in price, or for GTX Volta/Ampere, if you want future-proofing.
 
Well, who buys a computer and WANTS to run with lowered textures? If that's OK, then yeah, 3GB or less is fine... otherwise, 4GB.
Easier to buy a $100 780 than a $200+ 1060 6GB. Money rules the world and tighter budgets won't allow much.
 
If that is your budget... not much choice. You can make any shoe fit... I'm sure you get my point. ;)

...nobody wants to have to run with lowered textures if it can be helped. ;)
 
3GB is fine for 1080p IMO, and I'm one of those who has a 3GB 780 Ti, so I can speak from experience. It would be OK if the 1060 6GB were available for $200, but it's more like $260-$300, so yeah, I opted for a $140 780 Ti 3GB and am not missing a 30% performance increase for a 115% cost increase. And I don't run with low textures either, are you crazy? I'll turn off AA, extreme shadows, blur, etc. before I reduce texture settings, and I run most games on high settings at 60 FPS just fine on my measly old 780 Ti with its 3GB of VRAM.
 
Yep... OK with reduced settings and no AA. Not how I or most people want to play. You supported the overall point... image quality sacrifices need to be made. :)
 
This.

4GB is the bare minimum these days, and the 1060 3GB is a pointless card to begin with. Yes, in the bench hierarchy it looks to be great price/perf, and theoretically 3GB 'should be enough'... and then there is actual practice, once you have used a 3GB GPU on recent games, which paints the real picture: hitchy, stuttery gameplay.

Another issue with 3GB cards is resale value. It is going to be as nonexistent as that of any 2GB GTX 680 or 770 today. Nobody buys those cards anymore, because 2GB simply isn't enough. For anything anymore.

Seeing as the 6GB version is definitely the more popular of the two, why did Nvidia even bother with the 3GB version? Probably to maximize profits, I'm guessing, but even so, it's a gimped 1060, and certainly not suitable for anything over 1080p gaming with reduced settings. Go beyond that, and it gets brutal pretty quickly.

If Nvidia were going to do this from the start, I think they should have *at least* made the naming scheme a bit clearer, meaning the 3GB version would be the 1060 and the 6GB version would be a 1060 Ti... you know, just make it a bit more foolproof, because not everybody knows the difference between the two, and they should, because it's a pretty BIG difference.
 
What game takes 6GB or greater at 1080p resolution?

The new Wolfenstein was pegging my 780 Ti card's 3GB of VRAM to the point that it was completely unplayable, even at the lowest resolution the game supports and the lowest video settings.
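
For what it's worth, it's easy to watch that happen live. A rough sketch that polls VRAM usage once a second while the game runs, again assuming NVIDIA's pynvml bindings (plain nvidia-smi in a loop works just as well):

```python
# Poll VRAM usage once a second while a game is running (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{mem.used / 1024**3:.2f} GB / {mem.total / 1024**3:.2f} GB used")
        time.sleep(1)                 # sample once per second; Ctrl+C to stop
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```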
 
...it's a gimped 1060, and certainly not suitable for anything over 1080p gaming with reduced settings. Go beyond that, and it gets brutal pretty quickly.
Actually, I wouldn't say the 1060 6GB is practical for more than 1080p gaming in general either, especially if we're talking the next few years as a GPU cycle. Think of them more as 1080p cards, with one able to handle max texture settings at that res, and the other unable to use max textures in a growing number of games at 1080p.

It's a matter of what settings and what games are played we're talking about here in comparing the two, not what res. Also, it's been common practice for some time to offer different VRAM capacity variants of the same GPU. It's not the same as Ti versions, which have more core power. As long as one understands what they're buying, these different variants do make sense.
 
lol, future-proof. The 1060 3GB and 6GB will be outdated at the same time. In some demanding games you may get an increase of 10% from the 1060 6GB compared to the 3GB. Three years from now, no one is going to be playing Call of Duty seven saying wow, the 1060 3GB sucks at 20 FPS but the 6GB is smooth as butter at 22 FPS. Video ram is nice but video chip horsepower is what drives the game. Just look at the 8800 GTS 320MB vs 640MB, GTX 460 768MB vs 1GB, GTX 960 2GB vs 4GB; all cards outdated at the same time due to their chip.
 
Those cards were in the same boat though. The former could run out of VRAM at their intended res in several titles upon launch... performance differences notwithstanding, of course.
 
I love when people talk about "future proofing" their purchases for tech-related stuff. It makes no sense to do so, especially where we are with PC tech. Things will get crazy starting this year, 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4K? ¯\_(ツ)_/¯
 
Video ram is nice but video chip horsepower is what drives the game.
Again, it depends on the games played and the settings you're OK with. A 3GB 1060 will not allow max texture settings on some games, while the 6GB version will. This is not just about frames per second, it's about visual quality.
 
I love when people talk about "future proofing" their purchases for tech-related stuff. It makes no sense to do so, especially where we are with PC tech. Things will get crazy starting this year, 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4K? ¯\_(ツ)_/¯
Maybe? But when will it be prevalent, really? 2560x1440 has been out for years and, according to Steam, sits at 3%... 4K UHD is less than 0.5%. You also have to consider that many don't have GPUs good enough to drive 2560x1440 (1070+/Vega 56+)... and that isn't even talking 4K, where multiple GPUs or a 1080 Ti/Titan are needed... AMD can't even play in that playground.

1080p will still be the majority for several years to come.
 
Again, it depends on the games played and the settings you're OK with. A 3GB 1060 will not allow max texture settings on some games, while the 6GB version will. This is not just about frames per second, it's about visual quality.

What games? Please link a review from a professional site, as I would like to know.

I love when people talk about "future proofing" their purchases for tech related stuff. Makes no sense to do so. Especially where we are with PC tech. Things will get crazy starting this year 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4k ¯\_(ツ)_/¯
I still like my IPS 24" 1080p monitor, and according to Steam's most recent survey, 77% of gamers do as well. Obviously monitor resolution is always moving towards a higher spec, but today most people seem to like 1080p just fine.

The thing with future-proofing, or being a consumer of anything at all, is matching your desired performance to your purchase. Buying a minivan because you and your wife have a child and plan to have another one or two is a wise investment. Buying a minivan because you are 21 and plan to start a family in 10+ years and want to future-proof yourself is probably not a wise investment.
 
What games? Please link a review from a professional site, as I would like to know.
As I already said, it's entirely dependent on what games you're asking about, and what settings you're OK with. There have been many games since Shadow of Mordor with restrictions and/or warnings in the graphics settings saying you cannot use the higher texture quality options if you have less than 4GB of VRAM.

It's very hard to find a list of games with such requirements, though. You're better off just doing a Google search based on the title of the game in question, along with "hardware requirements". For instance, "Shadow of Mordor hardware requirements". Pretty much all game requirements now state the amount of VRAM needed.

And BTW, I agree with you on both the IPS and 1080p preference. I've tried 4K gaming and TV watching on a 4K TV that supposedly has great scaling and up-converting, and I came away feeling TVs, GPUs, W10, and programs in general are not ready for 4K yet. When you watch 1080i or 720p broadcasts on pretty much any 4K TV, the image quality is terrible. They up-convert some things in the image fine, but a lot of other things look very blurry. For instance, in an NFL game, close-ups of players look fine; everything else, not so good.

4K gaming is hit and miss. Some games play fine at 4K, while others have performance, HUD, or HDR problems. Worse yet, some programs, even ones that aren't old, do not look right at a 4K desktop res. This is because Windows 10 scales the fonts larger to be readable, and in the process some programs' GUIs get messed up. It gets better if you choose a lower scaling percentage, but then you get to the point where text can be a bit too small to read comfortably.

It will take at least 5 years before there's enough UHD content to make 4K displays viable, but the good thing is the UHD broadcast standard (ATSC 3.0) is nearly finalized, and will probably be done early next year. By early 2019 there will likely be TVs with ATSC 3.0 tuners in them, and add-on ATSC 3.0 tuners for existing TVs. Once ATSC 3.0 TV broadcasts go mainstream, UHD content and the hardware supporting it will as well. That's what it's going to take for 4K to really be practical.

It's hard to tell exactly how long it will take. The UHD rollover will not be mandatory with a deadline like the analog-to-digital switch was; it's going to be voluntary. This means UHD broadcasts for the first 2-3 years will likely be available only in "select markets", much like the TV streaming services that offer local live broadcasts only in certain cities.

There are many reasons to be optimistic though. For one, broadcasters can use existing transmitters with only slight modifications, because ATSC 3.0 uses RF, just like ATSC 1.0. The new equipment required in the broadcasting stations themselves is actually cheaper than what they're using now, and the FCC is going to fund 80 to 90% of it. ATSC 3.0 is better suited for advertising, uses half the bandwidth, has multilingual capability, and can transmit one fixed (TV) and two mobile-device streams simultaneously, with far better reception while doing so. This means the changeover won't be overly expensive, and broadcasters will reach a much larger audience, so expect to see many broadcast stations and cities adopting it early.

I've read stats saying only 17% of US consumers currently own 4K TVs, with roughly the same percentage of people getting their TV broadcasts over the air (antenna). The percentage of US households with 4K TVs is projected to be 48% by 2020. I'm willing to bet that with ATSC 3.0 being so much better than ATSC 1.0, and cord cutting growing in popularity, the percentage of people getting TV over the air will also rise significantly. That's partly because ATSC 3.0 will be a hybrid antenna/internet system, which will likely offer both free local antenna content and streaming options at a certain cost.
 
What games? Please link a review from a professional site, as I would like to know.


I still like my IPS 24" 1080p monitor, and according to Steam's most recent survey, 77% of gamers do as well. Obviously monitor resolution is always moving towards a higher spec, but today most people seem to like 1080p just fine.

The thing with future-proofing, or being a consumer of anything at all, is matching your desired performance to your purchase. Buying a minivan because you and your wife have a child and plan to have another one or two is a wise investment. Buying a minivan because you are 21 and plan to start a family in 10+ years and want to future-proof yourself is probably not a wise investment.

Me too. I LIKED my 24-inch three years ago. But when the owner of my company wanted 27-inch monitors, I noticed that when you put 1080p on one, it seemed "pixelated" compared to the 24-inch. When I upgraded to a 27-inch with my Asus ROG Swift, the 1440p seemed a lot sharper. I might be spoiled, as the lowest res I work/game on is 1440p. When I look at 1080p screens they feel blurry to me. It might be that I'm also getting old too lol
 
How cool would it be if you could purchase a card that was only VRAM on a PCB that plugged into your PCIe slot, to be accessed by a GPU installed in your system :)
 
How cool would it be if you could purchase a card that was only VRAM on a PCB that plugged into your PCIe slot, to be accessed by a GPU installed in your system :)

Cool like a yeti eating frozen spaghetti :)
 
How cool would it be if you could purchase a card that was only VRAM on a PCB that plugged into your PCIe slot, to be accessed by a GPU installed in your system :)
Interesting thought... but I'd have to imagine horsepower would run out before a VRAM need in a lot of cases.

I don't know... I'd put that $50 away for an entirely better GPU...

...never really thought that out. It could work on some cards, and if you upgrade monitors/res...
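
Rough numbers on why it probably wouldn't fly: the PCIe link itself would be the bottleneck. A back-of-envelope sketch (the PCIe 3.0 x16 and GTX 1060 GDDR5 figures are ballpark, for illustration only):

```python
# Why VRAM on a separate PCIe card would struggle: a bandwidth comparison.
# Figures are approximate, for illustration only.

pcie3_x16_gbs = 16 * 0.985        # PCIe 3.0: ~985 MB/s per lane, x16 ≈ 15.8 GB/s
gtx1060_vram_gbs = 192 * 8 / 8    # 192-bit bus at 8 Gbps GDDR5 ≈ 192 GB/s

print(f"PCIe 3.0 x16 : ~{pcie3_x16_gbs:.0f} GB/s")
print(f"1060 GDDR5   : ~{gtx1060_vram_gbs:.0f} GB/s")
print(f"The slot is ~{gtx1060_vram_gbs / pcie3_x16_gbs:.0f}x slower than on-card VRAM")
```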
 
IGP and stolen dynamic RAM is enough, bite me.
 
I bet. But some prefer to play more than Solitaire and Minecraft, with settings turned way down to accommodate. ;)
 
lol, future-proof. The 1060 3GB and 6GB will be outdated at the same time. In some demanding games you may get an increase of 10% from the 1060 6GB compared to the 3GB. Three years from now, no one is going to be playing Call of Duty seven saying wow, the 1060 3GB sucks at 20 FPS but the 6GB is smooth as butter at 22 FPS. Video ram is nice but video chip horsepower is what drives the game. Just look at the 8800 GTS 320MB vs 640MB, GTX 460 768MB vs 1GB, GTX 960 2GB vs 4GB; all cards outdated at the same time due to their chip.

Starting to think the words "need" and "want" are getting blurred in this thread. "Need" would be more "I need x amount minimum to get by", in which case IMHO 2-3GB is the minimum. "Want" is more on the side of 4GB+, because people are all about pretty textures.

No. The 1060 is a perfect example of a 3GB card that was already obsolete on the day it was released, with the 6GB variant being a pretty well balanced product in terms of core and VRAM capacity. It's not just the additional cores that make it shine; it's the VRAM that makes the 6GB pleasant to game on and the 3GB unpleasant. TODAY there are already examples of the 3GB version of this card producing much spikier frame times than the 6GB variant while both are pegged at similar average FPS. Note: this is precisely the same as trying your hand at a 780 (Ti), which has 3GB and somewhat similar core performance, today; it will tank in the exact same games in the exact same way. For the same reasons the 780 (Ti), along with the 7970 with its 3GB, remained very reasonable cards even into 2016, but they are now definitely at the end of their optimal life. They can still push 60 FPS in many games at very decent settings, but will struggle in area transitions and in games with large texture footprints and streamed game worlds: GTA V, The Witcher 3, TW:WH, The Division, and so forth.
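
Anyone who wants to check the spiky-frame-times claim themselves can capture frame times with PresentMon or OCAT and summarize them. A minimal sketch; the file name is hypothetical, and the "MsBetweenPresents" column is an assumption based on PresentMon's CSV output:

```python
# Summarize average FPS vs 1% lows from a frame-time capture CSV.
# Assumes a PresentMon-style "MsBetweenPresents" column; adjust for other tools.
import csv
import statistics

frametimes = []
with open("capture.csv", newline="") as f:   # hypothetical capture file
    for row in csv.DictReader(f):
        frametimes.append(float(row["MsBetweenPresents"]))

frametimes.sort()
avg_fps = 1000 / statistics.mean(frametimes)
one_pct_low = 1000 / frametimes[int(len(frametimes) * 0.99)]  # 99th-pct frame time

print(f"Average FPS : {avg_fps:.1f}")
print(f"1% low FPS  : {one_pct_low:.1f}")   # a big gap here = the hitching above
```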

The people who buy and/or defend the 1060 3GB today are the bang-for-buck buyers who have no grasp of real-world performance or of what to look for in a GPU. They Google UserBenchmark or some other stupid comparison site, scroll through a couple of bench comparisons, and conclude that the card with higher FPS per dollar is the better card.

Future-proofing exists as long as you don't shift your own expectations upward. If you stay at 1080p, then yes, a 6GB card is future-proofing compared to a 3GB card, and an 8GB card has little merit over the 6GB equivalent unless it gets additional core power. It comes down to building a rig for a purpose and buying a card for a purpose, and for the purpose of 1080p/60 FPS gaming, 3GB is no longer sufficient.
 
I love when people talk about "future proofing" their purchases for tech-related stuff. It makes no sense to do so, especially where we are with PC tech. Things will get crazy starting this year, 2018. Wouldn't you say that 1080p is already dated and we are all moving towards 1440p and 4K? ¯\_(ツ)_/¯
It only leads me to believe that I was wrong in thinking 1080p is going away anytime soon. It won't. Look at the pace at which 1080p high-refresh, variable-refresh-rate monitors pop up. There are more of them than 1440p and 4K ones.
 