So now we've got:
12 GB for the 3060
8 GB for the 3060 Ti
8 GB for the 3070
8 GB AND 16 GB for the 3070 Ti
10 GB for the 3080
Jesus Christ, this just keeps getting more and more ridiculous.
As opposed to system RAM in dual, triple, and quad channel, in 2 GB, 4 GB, 8 GB, 16 GB, 32 GB .....
Just pick what works for the screen. If ya pay attention to what is actually "used" versus how much it allocates on installation, you won't have any issues. When a utility says your 8 GB card is using 5.8 GB, and then you insert the 4 GB version and get the same fps, same user experience, same screen quality as the 8 GB variant ... no, it is not ***using*** 5.8 GB. (See the sketch after the list below for how easy it is to reserve VRAM without touching a byte of it.)
1080p ==> 3-4 GB
1440p ==> 6-8 GB
2160p ==> 12-16 GB
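To make the allocated-versus-used point concrete, here's a minimal sketch of my own (not from any of the linked articles) using the CUDA runtime API. It reserves 2 GiB with cudaMalloc and never reads or writes a single byte of it, yet any monitoring utility will immediately count that 2 GiB against the card:

```c
// Minimal sketch: reserve VRAM without ever touching it.
// Build (filename is just an example): nvcc vram_demo.cu -o vram_demo
#include <cuda_runtime.h>
#include <stdio.h>

int main(void) {
    size_t free_b, total_b;

    cudaMemGetInfo(&free_b, &total_b);
    printf("before: %zu MiB free / %zu MiB total\n", free_b >> 20, total_b >> 20);

    // Ask the driver for 2 GiB. This is pure allocation -- no kernel ever
    // reads or writes the buffer.
    void *buf = NULL;
    cudaMalloc(&buf, 2ULL << 30);

    cudaMemGetInfo(&free_b, &total_b);
    // Free VRAM drops by ~2 GiB anyway; nvidia-smi and the usual overlays
    // will report it as consumed even though nothing is "using" it.
    printf("after:  %zu MiB free / %zu MiB total\n", free_b >> 20, total_b >> 20);

    cudaFree(buf);
    return 0;
}
```

Games do the same thing at a bigger scale: engines grab whatever pool is available and cache textures into it, which is why the reported number scales with the card rather than with what the game actually needs.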
If Nvidia does release both an 8 GB and a 16 GB card, then we will have some proof of whether the extra VRAM is needed in today's games. But people don't buy a card just for today's games; from what I've seen, people generally buy for a 4-year lifespan.
That's not proof, it's taking advantage of uneducated consumers. It's been the same claim since the nVidia 500 series ... test the 2 variants of a card at the resolution it's intended for, and at the one above it, and there's no difference in user experience.
This was one of the first "exposés" on the myth ... but it's been done for the 600, 900, and 1000 series also:
Do you need 4GB of ram? We tested EVGA's GTX 770 4GB versus Nvidia's GTX 770 2GB version, at 1920x1080, 2560x1600 and 5760x1080.
alienbabeltech.com
This is Nvidia's plan now: release a card, then release an upgraded refresh of that card a year later.
Geez, imagine if car companies, women's fashion, textbooks, software, etc. started pulling that stuff.
Personally, I've seen my 3070 have all its VRAM allocated and even go 2 GB+ into shared GPU memory during Warzone, and I noticed no performance difference, stutters, or texture issues at all.
Allocated and used are two very, very different things. Think of your MasterCard ... let's say you have a $5,000 limit and $500 charged. How much of your credit line are you using? ... $500.
Now when you apply for a car loan ... what will the credit report show is allocated against your potential line of credit? MasterCard reports $5,000. You are not ***using*** it at the time ... but $5,000 has been ***allocated*** to be used whenever you want. Same principle here, and Nvidia has explained this on several occasions.
See above link: they tried to install Max Payne on the 2 GB card and it would not allow the install because of insufficient VRAM at that resolution. So they installed the 4 GB card, tested it ... then put the 2 GB card back in ... same fps, same graphics quality, same user experience.
"We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: GFX Card utilities
"all report the amount of memory requested by the GPU, not the actual memory usage. Cards will larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available."
Look at the TPU reviews. The 3 GB and 6 GB 1060 reviews here on TPU are a great example. The 6 GB card has 11% more shaders than the 3 GB card, so it has an inherent speed advantage aside from VRAM. And when we look at the relative speed of the cards at 1080p ... the 6 GB card has a 6% advantage. So if this VRAM thing were true and 3 GB were wholly inadequate for 1080p, then at 1440p we should see a critical drop in performance with the 3 GB card ... but there isn't one; it's the same 6%.
Now that's not to say that no game will ever have issues at some point. Poor console ports, for example, have been one of the most common exceptions. But for the most part ... by the time you run out of VRAM, you will have run out of playable fps. It's not really relevant that you get 33% more fps when the result is 20 vs 15 fps; it's unplayable either way.
One thing I like about this abundance of options is seeing W1zzard take this issue on and get a data set for the 3000 series, to see if it comes out the same way as the 500, 600, 700, 900, and 1000 series.