Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU w/ different RAM sizes, in almost every game there was no observable impact on performance.
Do you need 4GB of ram? We tested EVGA's GTX 770 4GB versus Nvidia's GTX 770 2GB version, at 1920x1080, 2560x1600 and 5760x1080.
alienbabeltech.com
2GB vs 4GB 770 ... when everyone was saying 2GB was not enough, this test showed otherwise.
"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with
Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.
Same here ....
https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here ....
http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here ....
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
Yes, you can find some games that will show a difference, mostly sims w/ bad console ports.
And let's remember ... the 3GB version of the 1060 did just fine. They were not the same GPU; the 3 GB version had 10% fewer shaders, which gave the 6 GB a VRAM-independent speed advantage. The extra shaders gave the 6 GB version a 6% speed advantage over the 3 GB ... So when going to 1440p, if there was even a hint of impact due to VRAM, that 6% gap should have grown much bigger ... it didn't ... we only saw a difference at 4k.
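For anyone who wants to sanity-check that scaling argument, here's the back-of-the-envelope math as a quick Python sketch. The shader counts are Nvidia's published specs; the ~6% gap is the figure from the reviews above, not something this script measures:

```python
# GTX 1060 shader counts (Nvidia published specs)
SHADERS_6GB = 1280
SHADERS_3GB = 1152

# Raw shader advantage of the 6 GB card: ~11%
shader_advantage = SHADERS_6GB / SHADERS_3GB - 1
print(f"Extra shaders on the 6 GB card: {shader_advantage:.1%}")

# Observed performance gap at 1080p/1440p per the reviews: ~6%
observed_gap = 0.06

# If VRAM were a bottleneck at 1440p, the observed gap would have to
# grow well past the shader-driven baseline. It stays at ~6% until 4k,
# so the extra 3 GB isn't doing any work at the lower resolutions.
print(f"Observed gap: {observed_gap:.1%} -> explained by shaders alone")
```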
Based upon actual testing at lower res's and scaling up accordingly, my expectation for the 2080 Ti was 12 GB, so I was surprised at the odd 11 GB number ... for the 3080, I thought they'd do 12 ... so 10 tells me that Nvidia must know more than we know. No sense putting it in if it's not used ... no different than having an 8+6 power connector on a 225 watt card. Just because the connectors and cables can pull 225 watts (+ 75 from the slot) doesn't mean it will ever happen.
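To spell out that connector math (the per-connector figures below are the PCIe spec limits, not numbers from any review):

```python
# PCIe power-delivery limits per the spec, in watts
PIN_8 = 150   # 8-pin PEG connector
PIN_6 = 75    # 6-pin PEG connector
SLOT  = 75    # PCIe x16 slot

capacity = PIN_8 + PIN_6 + SLOT   # 300 W available in total
board_power = 225                 # card's rated draw

# 75 W of headroom the card will never touch -- capacity != usage,
# same as VRAM capacity != VRAM actually used.
print(f"Available: {capacity} W, rated: {board_power} W, "
      f"headroom: {capacity - board_power} W")
```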
Nvidia’s Brandon Bell has addressed this topic more than once, saying that the utilities that are available "all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available." The card manufacturers gave us more RAM because customers would buy it. But for a 1060 ... the test results proved we don't need more than 3 GB at 1080p; the 6 GB version didn't add anything to the mix other than more shaders.
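The allocated-vs-used distinction Bell describes is easy to see first-hand. Here's a minimal sketch using the pynvml bindings to NVML, the same interface most monitoring utilities sit on top of (assumes an Nvidia card and the nvidia-ml-py package; "used" here means memory *allocated* on the device, exactly as Bell says, not memory actively being touched):

```python
# pip install nvidia-ml-py  (provides the pynvml module)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports memory that has been ALLOCATED on the device -- what
# applications requested -- not how much of it is actively in use.
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total:              {info.total / 2**30:.2f} GiB")
print(f"allocated ('used'): {info.used / 2**30:.2f} GiB")
print(f"free:               {info.free / 2**30:.2f} GiB")

pynvml.nvmlShutdown()
```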
So now for the "why 10 GB?" question.
When they did the 3 GB 1060, why did they disable 10% of the shaders ... it didn't save any money? Let's look at W1zzard's conclusion:
"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "
In other words, they needed to kill 10% of the shaders because otherwise ... the performance would be the same and folks would have no reason to spring for the extra $$ for the 6 GB card. Same with the 970's 3.5 GB ... it was clearly done to gimp the 970 and provide a performance gap between it and the 980. When I heard there was a 3080 and a 3090 coming, I expected 12 and 16 GB. Now I can't help but wonder ... is 12 GB the sweet spot for 4k, and is the 10 GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?