# 3060ti 8gb vs 3060 12gb



## Vario (Dec 1, 2021)

I've been in the EVGA queue for both of these products for almost a year, and the 3060 12GB is likely to come up first. I have a question about predicting which is more likely to have staying power ~4 years from now, as I tend to run hardware for a really long time. The 3060 Ti's GPU is much faster, but the 12GB of VRAM could serve better in this respect if the lower VRAM amount ends up being a limiting factor. What do you guys think?

The main reason I ask: in 2013 I purchased a 770 2GB instead of the 4GB version because I was told the card wasn't really fast enough for it to matter. After a few years, the limited VRAM became an issue that prevented me from running some games. Similarly, the 780 Ti's 3GB severely limited that card just a couple of years after launch, despite it being close to a 980 in performance otherwise.


----------



## droopyRO (Dec 1, 2021)

3060 Ti. GPU power is GPU power, while VRAM usage can be managed by lowering texture details.


----------



## Batou1986 (Dec 1, 2021)

VRAM only matters if you have the GPU horsepower to use it. The only way you're going to use 12GB of VRAM is at 4K or higher, and the 3060 isn't fast enough at 4K for that to even be a consideration.
The only reason the 3060 has 12GB is its 192-bit bus design, which basically limits it to either 6GB or 12GB of VRAM, and 6GB is likely too little for some modern games.
If you plan on keeping the card for a long time, the faster GPU is going to matter much more than the extra VRAM.
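That bus-width constraint is easy to sanity-check with quick arithmetic. A minimal sketch, assuming one 32-bit channel per GDDR6 chip, which matches how these cards are laid out (`capacity_options` is just an illustrative helper):

```python
# Each GDDR6 chip sits on its own 32-bit slice of the memory bus,
# so bus width fixes the chip count and chip density fixes capacity.
def capacity_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return {d: chips * d for d in densities_gb}

print(capacity_options(192))  # 3060's bus: 6 chips -> 6GB or 12GB
print(capacity_options(256))  # 3060 Ti's bus: 8 chips -> 8GB or 16GB
```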


----------



## Vario (Dec 1, 2021)

I agree with you guys; the Ti was the way I was leaning, I just didn't want to make the same mistake twice.
I should add that I use 1440p at 165Hz; graphics aren't important, a smooth frame rate is.


----------



## ixi (Dec 1, 2021)

Well, I always go for the stronger GPU chip over VRAM amount, as I have never had a problem with VRAM capacity in the games I play.


----------



## ppn (Dec 1, 2021)

The database here shows the 3060 to be:
3.0x faster than the 770 2GB, with 6.0x the VRAM,
2.5x faster than the 780 3GB, with 4.0x the VRAM,
2.0x faster than the 1060 6GB, with 2.0x the VRAM.

Of course I find the 3060 to be much faster than the 780, because the latter struggles even with 720p60.

The 3060 Ti, on the other hand, is:
3.0x faster than the 780 3GB, but with only 2.67x the VRAM.

So the ratio is not looking as good for the 3060 Ti. But what other choice do we have?


----------



## oxrufiioxo (Dec 1, 2021)

At this point, unless you come up in the queue in the next month or so, I'd probably skip both and wait for the 4000 series, which will likely have adequate VRAM. I personally wouldn't be shocked if none of the Nvidia cards from the 3080 down age well, due to Nvidia cheaping out on VRAM.

If I had to choose one of these two it would be the Ti, though; it's quite a bit faster. But having to lower settings in a year or so on a $500+ GPU feels bad.


----------



## neatfeatguy (Dec 1, 2021)

The 3060 Ti will give you better performance at 1440p, hands down. The 3060 is a solid card for 1080p and can play 1440p with settings adjustments, but it still lags behind the Ti by roughly 30%.

The one thing that really amazes me about the difference between the two cards is that the power draw of the 3060 Ti is barely higher than the 3060's. The 3060 draws a lot of power for the performance it provides; the Ti draws around 20W more and provides upwards of 30% more performance.
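That efficiency gap is easy to put a rough number on. A sketch using nominal board-power figures rather than measured draw (170 W and 200 W are the reference TDPs, an assumption here, since the measured gaming difference above was only ~20 W):

```python
# Perf-per-watt comparison from relative performance and board power.
# TDP figures are nominal; measured gaming draw will differ.
def perf_per_watt(relative_perf, watts):
    return relative_perf / watts

r_3060 = perf_per_watt(1.00, 170)      # baseline
r_3060_ti = perf_per_watt(1.30, 200)   # ~30% faster per the post

# Even on nominal TDPs, the Ti comes out roughly 10% more efficient.
print(f"3060 Ti perf/W vs 3060: {r_3060_ti / r_3060 - 1:+.1%}")
```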

If you feel the 3060 Ti will come up soon, I'd just wait for it. If you absolutely need something sooner rather than later and just can't wait, the 3060 isn't a bad card, just not an ideal one for your resolution.

On the other hand, I ran a 980 Ti (6GB VRAM) for nearly 6 years. It ran 5760x1080 for 5+ years and finally ran on the 1440p monitor for about 4-5 months until I got a new GPU. The card ran things well over that time frame and I was okay with dropping settings; 6GB did just fine. The 8GB on the 3060 Ti or the 12GB (which, honestly, is wasted) on the 3060 will be fine for the next 3-5 years. It all just depends on whether you're okay with lowering settings more with the 3060 than with the Ti.

With the 4xxx series coming out sometime in the next year - according to rumors - waiting might mean you can't find a card from that series either and you're stuck with no upgrade at all. Weigh your pros and cons. Personally, I wouldn't pass up the chance for something now in the hope something better comes along. If you get something now and something better does come along, you can always upgrade and sell what you have to recoup some of the cost.


----------



## Vario (Dec 1, 2021)

neatfeatguy said:


> The 3060 Ti will give you better performance at 1440p, hands down. The 3060 is a solid card for 1080p and can play 1440p with settings adjustments, but it still lags behind the Ti by roughly 30%.
> 
> The one thing that really amazes me about the difference between the two cards is that the power draw of the 3060 Ti is barely higher than the 3060's. The 3060 draws a lot of power for the performance it provides; the Ti draws around 20W more and provides upwards of 30% more performance.
> 
> ...


I was inclined to buy the 3060 when it comes up and then buy a 3060ti when it comes up and sell the 3060.


----------



## oxrufiioxo (Dec 1, 2021)

Vario said:


> I was inclined to buy the 3060 when it comes up and then buy a 3060ti when it comes up and sell the 3060.



That's a great idea actually.


----------



## pregep (Dec 1, 2021)

My vote goes to the 3060 Ti, for the reasons @droopyRO just gave.


----------



## Deleted member 202104 (Dec 1, 2021)

Vario said:


> I was inclined to buy the 3060 when it comes up and then buy a 3060ti when it comes up and sell the 3060.



This is exactly what I'd recommend.


----------



## Metroid (Dec 1, 2021)

12GB; it's more future-proof. 8GB was the standard in 2018; now I'd want 10GB minimum. 8GB is still not bad, but 6GB is bad.


----------



## neatfeatguy (Dec 1, 2021)

Vario said:


> I was inclined to buy the 3060 when it comes up and then buy a 3060ti when it comes up and sell the 3060.



That's a route I'd look at if I were in your shoes. If GPU prices stay where they are for some time, you should easily be able to come out ahead after selling the 3060. I look every now and then, and "new" 3060 cards on eBay are selling for upwards of $900 (I know, gag me with a spoon). Hopefully you'd end up paying next to nothing (or actually making money) for the 3060 Ti once you get it and sell the 3060.


----------



## Kissamies (Dec 1, 2021)

I'd go for the 3060 Ti as well. As others have said, the 3060's raw power may not be enough to use all that VRAM.


----------



## looniam (Dec 1, 2021)

Vario said:


> I was inclined to buy the 3060 when it comes up and then buy a 3060ti when it comes up and sell the 3060.


That was my plan too, but after getting a 3060, the drops for the Ti are so few and far between that I left the queue. The 3060 isn't as bad as some think; I haven't had a problem running any game at 1440p max settings, plus a few "slow paced" games (~35fps) at 4K.

I expect it to be about the same as the 980 Ti: a great 1440p card that after a few years (4?) will be a 1080p card. But maybe I should have stuck it out for the 3060 Ti and worried about VRAM later.


----------



## Frick (Dec 1, 2021)

Metroid said:


> 12GB; it's more future-proof. 8GB was the standard in 2018; now I'd want 10GB minimum. 8GB is still not bad, but 6GB is bad.



This isn't a hard rule, though. I have the 3060 Ti and tried Horizon Zero Dawn @ 4K Ultra (maybe some tweaking of settings was involved, I don't remember) and it worked quite well.


----------



## Kissamies (Dec 1, 2021)

Frick said:


> This isn't a hard rule, though. I have the 3060 Ti and tried Horizon Zero Dawn @ 4K Ultra (maybe some tweaking of settings was involved, I don't remember) and it worked quite well.


Also, the amount shown in Afterburner's OSD (for example) isn't the true VRAM usage, but rather how much the game reserves. It's kind of the same with RAM: I've had only one situation where a game (FF XV) pushed RAM usage over 16GB. Anyway, with 16GB there were zero problems.


----------



## cvaldes (Dec 1, 2021)

Nvidia is marketing the 3060 and 3060 Ti as 1080p gaming cards, the 3070 and 3070 Ti as 1440p cards, and the 3080 and 3080 Ti as 4K cards.

The sweet spot for current 1080p gaming is 8GB. As mentioned above, Nvidia's choice was 6GB or 12GB VRAM for a 192-bit memory bus.

There is no videogame running at 1080p that will use up 12GB of VRAM. It is unlikely that there would be videogames like this in the near future because the two 4K videogame consoles (Xbox Series X and PS5) have 16GB of RAM (which is shared between the CPU and GPU).

A 3060 with 12GB VRAM will be bandwidth starved. Basically 4GB of VRAM will be wasted in 1080p gaming.

OP's use case is 1440p gaming. Because of this, OP will benefit more in all games from a more powerful GPU than from scads of VRAM. The 3060 Ti is a no-brainer here.


----------



## Kissamies (Dec 1, 2021)

cvaldes said:


> As mentioned above, Nvidia's choice was 6GB or 12GB VRAM for a 192-bit memory bus.


Could mixed density still be used? The GTX 550 Ti, for example, had 1GB on a 192-bit bus, and the 660/660 Ti had 2GB on 192-bit.


----------



## cvaldes (Dec 1, 2021)

Maenad said:


> Could mixed density still be used? The GTX 550 Ti, for example, had 1GB on a 192-bit bus, and the 660/660 Ti had 2GB on 192-bit.


I don't have an electrical engineering degree (nor do I design GPUs for a living) so I'm not qualified to provide an authoritative answer.

That said my assumption is that mixed densities cause significant performance issues otherwise Nvidia and AMD would already be doing this to reduce cost.


----------



## Kissamies (Dec 1, 2021)

cvaldes said:


> I don't have an electrical engineering degree (nor do I design GPUs for a living) so I'm not qualified to provide an authoritative answer.
> 
> That said my assumption is that mixed densities cause significant performance issues otherwise Nvidia and AMD would already be doing this to reduce cost.


Yeah, I know what you mean. I'm thinking the same: the 550 Ti, for example, had 4x128MB + 2x256MB chips. I wonder whether it would have been different with 768MB or 1.5GB instead of 1GB, though with a card like that the memory wasn't the bottleneck anyway.


----------



## Batou1986 (Dec 1, 2021)

Maenad said:


> Could mixed density still be used? The GTX 550 Ti, for example, had 1GB on a 192-bit bus, and the 660/660 Ti had 2GB on 192-bit.


Bus width determines how many 32-bit memory channels are available, which in turn limits how many memory chips can be used on the card.
I'm assuming the size of available GDDR6 chips is the factor at play here: the 3060 uses six 2GB chips, while the 3060 Ti uses eight 1GB chips.
To get 8GB out of six memory channels you would need some weird 1.33GB GDDR chips.
Also, both your examples use the same bus width, and so the same number of chips; the only difference is the density of the chips.


----------



## Kissamies (Dec 1, 2021)

Batou1986 said:


> Bus width determines how many 32-bit memory channels are available, which in turn limits how many memory chips can be used on the card.
> I'm assuming the size of available GDDR6 chips is the factor at play here: the 3060 uses six 2GB chips, while the 3060 Ti uses eight 1GB chips.
> To get 8GB out of six memory channels you would need some weird 1.33GB GDDR chips.
> Also, both your examples use the same bus width, and so the same number of chips; the only difference is the density of the chips.


To get 8GB on a 192-bit bus you could use 4x1GB + 2x2GB chips, similar to what they did with those cards I mentioned.
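On paper, a mixed-density layout like that pencils out as follows (`mixed_capacity` is a hypothetical helper for illustration; the usual objection is that the extra capacity on the 2GB chips would sit behind a narrower effective bus, as with the 550 Ti):

```python
# Capacity of a mixed-density layout on a 192-bit bus (six 32-bit
# channels), in the spirit of the GTX 550 Ti's 4x128MB + 2x256MB mix.
def mixed_capacity(chip_sizes_gb):
    assert len(chip_sizes_gb) == 192 // 32, "a 192-bit bus takes 6 chips"
    return sum(chip_sizes_gb)

print(mixed_capacity([1, 1, 1, 1, 2, 2]))        # hypothetical 8GB config
print(mixed_capacity([0.125] * 4 + [0.25] * 2))  # 550 Ti's actual 1GB
```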


----------



## Batou1986 (Dec 1, 2021)

Maenad said:


> To get 8GB on a 192-bit bus you could use 4x1GB + 2x2GB chips, similar to what they did with those cards I mentioned.


This is true, but as you speculated, the speed/latency of the chips might vary, which is probably why they avoided it.


----------



## Kissamies (Dec 1, 2021)

Batou1986 said:


> This is true, but as you speculated, the speed/latency of the chips might vary, which is probably why they avoided it.


Yea, I'm not too sure how that would work, just pure speculation so your guess is as good as mine.


----------



## cvaldes (Dec 1, 2021)

Maenad said:


> Yea, I'm not too sure how that would work, just pure speculation so your guess is as good as mine.


My guess is that GPU designers (Nvidia, AMD, Intel, Imagination, etc.) continuously test various approaches in their labs. Every final choice is going to be some sort of compromise based on power, heat, cost, speed, capacity, supply, and other factors. 

Why does one card ship with cheaper/slower/cooler GDDR6 memory and another card ships with pricier/faster/hotter GDDR6X? Someone looked at all of the numbers of the prototypes and chose one.

It's not like GPUs are designed by 22-year-old interns or tech forum dilettantes. The people making these final decisions have been doing this for 30+ years. And publicly traded corporations also need to fulfill their primary responsibility: increasing shareholder value.


----------



## Kissamies (Dec 1, 2021)

cvaldes said:


> My guess is that GPU designers (Nvidia, AMD, Intel, Imagination, etc.) continuously test various approaches in their labs. Every final choice is going to be some sort of compromise based on power, heat, cost, speed, capacity, supply, and other factors.
> 
> Why does one card ship with cheaper/slower/cooler GDDR6 memory and another card ships with pricier/faster/hotter GDDR6X? Someone looked at all of the numbers of the prototypes and chose one.
> 
> It's not like GPUs are designed by 22-year old interns or tech forum dilettantes. The people making these final decisions have been doing this for 30+ years. And publicly traded corporations also need to accomplish their primary responsibility: increase shareholder value.


Just realized that they need to leave headroom for OEM designs as well, at least I'd guess.


----------



## Vario (Dec 1, 2021)

Thanks for the information, interesting stuff. The 3070 is out of budget: a nearly $200 premium over the 3060 Ti for ~10% more performance.


----------



## seth1911 (Dec 1, 2021)

Batou1986 said:


> VRAM only matters if you have the GPU horsepower to use it. The only way you're going to use 12GB of VRAM is at 4K or higher, and the 3060 isn't fast enough at 4K for that to even be a consideration.


Nope, textures 4 win.

If I can choose between medium details + maxed-out textures and high details + medium textures,
I take the first one.


----------



## chrcoluk (Dec 2, 2021)

Batou1986 said:


> VRAM only matters if you have the GPU horsepower to use it. The only way you're going to use 12GB of VRAM is at 4K or higher, and the 3060 isn't fast enough at 4K for that to even be a consideration.
> The only reason the 3060 has 12GB is its 192-bit bus design, which basically limits it to either 6GB or 12GB of VRAM, and 6GB is likely too little for some modern games.
> If you plan on keeping the card for a long time, the faster GPU is going to matter much more than the extra VRAM.


FF15 can use 12 gigs at 1080p due to massive textures. It even has nasty leaks with the Nvidia grass feature, which will consume tens of gigs of VRAM if it's available.

It depends on the size of the textures, and textures need VRAM more than horsepower.

Remember, the market is bigger than AAA shooters.


----------



## Taraquin (Dec 2, 2021)

I have had both, and the performance uplift from the 3060 Ti is worth it. In SOTTR I got 118 fps at 1080p highest with the 3060 and 153 fps with the 3060 Ti. That's about 30%.


----------



## Valantar (Dec 2, 2021)

Outside of a few edge cases with massive VRAM needs, the Ti is likely to last longer simply due to being faster. As has been mentioned above, it's crucial to remember that system reported VRAM "usage" is wildly inflated in most games through opportunistic pre-caching of assets, most of which are never used before being ejected in favor of pre-caching of other assets again. VRAM usage numbers are thus not really an indicator of anything other than how aggressively the game streams in assets that might be useful. Real VRAM-induced performance limitations are found in framerate/frametime measurements. Typically seen as especially bad 1%/.1% lows, but also as unexpectedly bad averages if the bottleneck is sufficiently bad.

As for predicting future developments, actual VRAM needs in games have grown relatively slowly over the past decade, and while they are indeed higher across the board, history has shown that most GPUs are held back by compute long before they are held back by VRAM capacity. Of course history doesn't predict the future, but change is also typically slow and gradual. Plus, technologies like DirectStorage have the potential to lower actual VRAM capacity needs quite noticeably through enabling vastly faster on-the-fly streaming of assets.


----------



## neatfeatguy (Dec 2, 2021)

chrcoluk said:


> FF15 can use 12 gigs at 1080p due to massive textures. It even has nasty leaks with the Nvidia grass feature, which will consume tens of gigs of VRAM if it's available.
> 
> It depends on the size of the textures, and textures need VRAM more than horsepower.
> 
> Remember, the market is bigger than AAA shooters.



I ran GTX 570s in SLI; they had 1.25GB of VRAM.
I was able to play many games with decent to great performance at 5760x1080.
A few examples: Sniper Elite 3, Borderlands 2, Batman: Arkham Origins, and even Far Cry 3. I posted my results of the 570s in SLI vs the 980 Ti here; you can see the settings used and the performance they gave at the surround resolution I was using: https://www.techpowerup.com/forums/threads/gtx-570-sli-vs-gtx-980ti.214683/






In my comparison of the 570s and the 980 Ti, I won't lie, there were some issues with the low amount of VRAM (only 1.25GB on the 570s): it caused some minor stutters as you progressed through an open-world game (such as Far Cry 3) when things had to be offloaded from VRAM and other textures loaded in. But that comparison spans a big leap in both power and VRAM; you won't notice anything like that going from 8GB on the 3060 Ti to 12GB on the 3060.



Just because you have less VRAM doesn't mean the game will run worse. Sure, having more VRAM can help at times (in hindsight I wish I had opted for the 2.5GB model of the 570, but it didn't come out until 6 months after I already had mine), but if the card itself doesn't have the power to fully utilize what it has available, then it's just kind of wasted. A good example: you don't get better gaming results from the 4GB GT 730 than from the 2GB one. Personally, I'd much rather run a 3060 Ti with 8GB of VRAM than my 3060 with 12GB. That 30% more performance the 3060 Ti gives over the 3060 is, to me, much more beneficial at 1440p.


----------



## Solid State Soul ( SSS ) (Dec 3, 2021)

Why not an RX 6800?

A 16GB VRAM buffer will set you up well for this whole console generation.


----------



## freeagent (Dec 3, 2021)

You guys are making me want to buy a new tv lol. I have an 8GB card and play at 1080p and this thing is a savage beast lol..

A new display would show me what you guys are talking about


----------



## Vario (Dec 6, 2021)

Solid State Soul ( SSS ) said:


> Why not an RX 6800?
> 
> A 16GB VRAM buffer will set you up well for this whole console generation.


Hard to find these things; the only reason the 3000 series is a possibility is that I entered the EVGA queue a year ago.


----------



## Frick (Dec 6, 2021)

freeagent said:


> You guys are making me want to buy a new tv lol. I have an 8GB card and play at 1080p and this thing is a savage beast lol..
> 
> A new display would show me what you guys are talking about



I went from 24" 1080p to 32" 4K and honestly... it wasn't really worth it (even though the monitor was on sale). Not for games, anyway. Freelancer is really nice on a big screen, but on the whole, not worth it. Add to that some games (looking at you, Paradox Interactive) not doing UI scaling well, and the scaling issues in Windows and other programs.

I mean, it's nice in a way, but it's not like games get magically better.


----------



## Vayra86 (Dec 6, 2021)

For 1440p up to 3440x1440, you can do _just fine_ with 8GB, and it is also OK relative to core power. Not ideal, but not starved either. But I will have you know, at that UW res I see upwards of 7GB usage more and more lately, even at sub-max settings (high instead of max). I always max out textures, but I don't always max out post-processing or shadow maps. So could I hit 8GB? Probably. And I'd still get over 50 FPS in-game.

Dialing down texture res is a significant IQ hit, definitely not a place you'd prefer going. I disagree with that being a 'fix'. The fact is, you've really not got the right GPU for the long term then.

If you must upgrade now, sure, 8GB and the 3060 Ti. Not ideal for 4 years going forward - again, I'm looking at a 1080 with substantially less core power, but I do get playable frames at that res, so dialing down textures would be meh. The balance has shifted, and not in a good way. Add RT on top and it gets worse. 8GB won't age nicely; that is a certainty.

But then the alternatives... those are likely worse, so the 8GB 3060 Ti it probably is. The Ampere stack is quite simply a total mess.



Frick said:


> I went from 24" 1080p to 32" 4K and honestly... it wasn't really worth it (even though the monitor was on sale). Not for games, anyway. Freelancer is really nice on a big screen, but on the whole, not worth it. Add to that some games (looking at you, Paradox Interactive) not doing UI scaling well, and the scaling issues in Windows and other programs.
> 
> I mean, it's nice in a way, but it's not like games get magically better.



Yeah, the novelty wears off fast. Resolution is heavily overrated above 1080p at normal desktop viewing distance. Form factor is possibly a better upgrade; I'm more impressed by going wider than by gaining pixels with my recent upgrade. But the extra screen real estate from 1080p > 1440p is worth it in non-gaming scenarios, particularly when you have lots of stuff on screen. The biggest upgrade for me is really the fact that UW effectively offers two 50% windows at full height. Now _that_ is a big plus; Win Key+Arrows is in frequent use here.



chrcoluk said:


> FF15 can use 12 gigs at 1080p due to massive textures. It even has nasty leaks with the Nvidia grass feature, which will consume tens of gigs of VRAM if it's available.
> 
> It depends on the size of the textures, and textures need VRAM more than horsepower.
> 
> Remember, the market is bigger than AAA shooters.



This. Balanced GPUs always win... VRAM always needs to be 'sufficient'. Insufficient is painful; a bit more is never noticeable but always nice to have going forward. And the facts don't lie: going from Pascal, we lost 50% (give or take) in VRAM relative to additional core power. That is a huge, huge gap, and it is already noticeable not even a year post-release. There are multiple examples, and the list is growing.


----------



## Valantar (Dec 6, 2021)

Vayra86 said:


> For 1440p up to 3440x1440, you can do _just fine_ with 8GB, and it is also OK relative to core power. Not ideal, but not starved either. But I will have you know, at that UW res I see upwards of 7GB usage more and more lately, even at sub-max settings (high instead of max). I always max out textures, but I don't always max out post-processing or shadow maps. So could I hit 8GB? Probably. And I'd still get over 50 FPS in-game.
> 
> Dialing down texture res is a significant IQ hit, definitely not a place you'd prefer going. I disagree with that being a 'fix'. The fact is, you've really not got the right GPU for the long term then.
> 
> If you must upgrade now, sure, 8GB and the 3060 Ti. Not ideal for 4 years going forward - again, I'm looking at a 1080 with substantially less core power, but I do get playable frames at that res, so dialing down textures would be meh. The balance has shifted, and not in a good way. Add RT on top and it gets worse. 8GB won't age nicely; that is a certainty.


You're assuming that reported data allocated to VRAM is actually in active use (or inevitably will be), which isn't actually the case for any game that streams data in any way. Those asset loading techniques always pre-cache aggressively and thus end up using a lot more VRAM than is actually made use of as gameplay progresses - especially as they for the most part still assume HDD loading speeds (i.e. a maximum of ~200MB/s, likely much less, of compressed data). Of course, cutting this allocation means you either need better predictions (nearly impossible) or faster ways of streaming in data. The latter is what DirectStorage will do, but also what developers themselves can do if they start actually designing for SSD loading speeds. Of course they shouldn't be riding the line on necessary textures, as that will always lead to judder as something is mispredicted, but there are a lot of improvements that can be done.

Even without changes in game code and engines, this still means that with your current <1GB of "free" VRAM, you could likely increase the _actual_ VRAM usage of any game quite significantly without seeing any effect on performance. This is easily illustrated in how many games will show astronomical VRAM "use" figures on GPUs with tons of VRAM, yet show no dramatic performance deficiencies on GPUs with much less VRAM (even from the same vendor).

One sample is of course not generally applicable, but the recent Far Cry 6 performance benchmark is a decent example of this in reality:





>9GB of VRAM usage on cards from both vendors at 2160p, yet when we look at 2160p performance?




No visible correlation between VRAM amount and performance for the vast majority of GPUs. The 8GB 6600 XT performs the same as the 12GB 3060. The 12GB 6700 XT is soundly beaten by the 8GB 3070. There are three GPUs that show uncharacteristic performance regressions compared to previously tested games, and all at 2160p: the 4GB 5500 XT, the 6GB 1660 Ti, and the 6GB 5600 XT. The 6GB cards show much smaller drops than the 4GB card, but still clearly noticeable. So, for Far Cry 6, while reported VRAM usage at 2160p is in the 9-10GB range, _actual_ VRAM usage is in the >6GB <8GB range.





This could of course be interpreted as 8GB of VRAM becoming too little in the near future, but, a) we don't know where in the 6-8GB range that usage sits; b) there are technologies incoming that will alleviate this; c) this is only at 2160p, which these GPUs can barely handle at these settings levels even today. Thus, it's far more likely for compute to be a bottleneck in the future than VRAM, outside of a handful of poorly balanced SKUs - and there is no indication of the 3060 Ti being one of those.
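One rough way to read charts like that is to compare how each card scales from 1440p to 2160p against a card known not to be memory-bound; a card that falls off much harder than normal scaling predicts is probably hitting a VRAM wall. A sketch of the idea with invented fps numbers (not data from the review):

```python
# Flag cards whose 2160p result drops well below what their own 1440p
# result would predict under normal (compute-bound) resolution scaling.
def vram_limited(cards, typical_scaling, tolerance=0.85):
    return [name for name, (fps_qhd, fps_uhd, _vram) in cards.items()
            if fps_uhd < fps_qhd * typical_scaling * tolerance]

# Made-up numbers purely for illustration -- not benchmark data.
cards = {
    "8GB card A":  (60, 34, 8),
    "12GB card B": (58, 33, 12),
    "4GB card C":  (55, 18, 4),   # falls off a cliff at 2160p
}

# Calibrate "normal" scaling on a card known not to be VRAM-bound.
print(vram_limited(cards, typical_scaling=34 / 60))  # ['4GB card C']
```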


----------



## Vayra86 (Dec 6, 2021)

Valantar said:


> You're assuming that reported data allocated to VRAM is actually in active use (or inevitably will be), which isn't actually the case for any game that streams data in any way. Those asset loading techniques always pre-cache aggressively and thus end up using a lot more VRAM than is actually made use of as gameplay progresses - especially as they for the most part still assume HDD loading speeds (i.e. a maximum of ~200MB/s, likely much less, of compressed data). Of course, cutting this allocation means you either need better predictions (nearly impossible) or faster ways of streaming in data. The latter is what DirectStorage will do, but also what developers themselves can do if they start actually designing for SSD loading speeds. Of course they shouldn't be riding the line on necessary textures, as that will always lead to judder as something is mispredicted, but there are a lot of improvements that can be done.
> 
> Even without changes in game code and engines, this still means that with your current <1GB of "free" VRAM, you could likely increase the _actual_ VRAM usage of any game quite significantly without seeing any effect on performance. This is easily illustrated in how many games will show astronomical VRAM "use" figures on GPUs with tons of VRAM, yet show no dramatic performance deficiencies on GPUs with much less VRAM (even from the same vendor).
> 
> ...



Right, good story, and then you get to the situations reviewers cannot cover in full length, and you still notice the occasional stutter, inconsistency, a hang here or there, and you're just not quite as smooth on frametimes as you'd love to be.

Or you start modding and DO require that VRAM allocated, because many more assets are pushed through than the developers intended. I've seen it too often. Allocation is relevant to performance and frame times, even if it's not in active use. You are pushing harder on your VRAM bandwidth with more swaps required, and this will cause hiccups.

Reading charts != gaming 

But... I'll leave everyone to their own illusion. It's very hard to get the full insight on this without long-term experience. However, if you intend to use your card for longer than 2-3 years, better have 'too much' VRAM or you'll find yourself upgrading soon. One thing, though... putting your eggs in the basket of 'future technologies' is the worst possible outlook IMHO. Remember DX12 and its mGPU? Hmhm, developers definitely jumped on that. I can name you another few handfuls of such 'developments' that fell off the dev budget train.

BTW... 6GB cards are definitely VRAM-limited in those Far Cry charts. So there you have it: 1060 and 1660 Ti equal perf? Ouch. That's 26% performance lost... almost a perfect relative loss compared to having 8GB vs 6GB (25% less). That's your window into the future for cards 4-6 years of age. The rebuttal 'but 20 FPS' does not matter: they would have had a playable near-30 with more VRAM. In a relative sense, with higher-perf cards, that's 40 being an actual 60 if you had sufficient memory. And now note the correlation with Far Cry's VRAM allocations all being way over 6GB.


----------



## DuxCro (Dec 6, 2021)

droopyRO said:


> 3060 Ti. GPU power is GPU power, while VRAM usage can be managed by lowering texture details.


3060. VRAM is VRAM. No amount of horsepower or overclocking will fix a lack of VRAM. If you are trying to build something more future-proof, I would go with the RTX 3060. There's not much difference in horsepower, especially if you OC your 3060, but once you get to a game that uses more than 8GB of VRAM for max textures, you will be happy with the 3060. On the other side, you will have a 3060 Ti that gives you more fps, but you'll have to lower texture quality so much that the game looks like crap.


----------



## puma99dk| (Dec 6, 2021)

Metroid said:


> 12GB; it's more future-proof. 8GB was the standard in 2018; now I'd want 10GB minimum. 8GB is still not bad, but 6GB is bad.



Not totally true. Nvidia had a choice with the RTX 3060: it was either going to be 6GB or 12GB, and 12GB sounds better. But you will run out of GPU performance on the RTX 3060 before getting that high. The same goes for the 12GB RTX 2060: it looks awesome on paper, but in reality the GPU is too weak most of the time to fully use it.


----------



## Selaya (Dec 6, 2021)

Vayra86 said:


> [ ... ]
> 
> BTW... 6GB cards are definitely VRAM-limited in those Far Cry charts. So there you have it: 1060 and 1660 Ti equal perf? Ouch. That's 26% performance lost... almost a perfect relative loss compared to having 8GB vs 6GB (25% less). That's your window into the future for cards 4-6 years of age. The rebuttal 'but 20 FPS' does not matter: they would have had a playable near-30 with more VRAM. In a relative sense, with higher-perf cards, that's 40 being an actual 60 if you had sufficient memory. And now note the correlation with Far Cry's VRAM allocations all being way over 6GB.


6GB isn't the problem _per se_ w/ those, rather _6GB GDDR5_ is. As you can see, the 2060 w/ 6GB of _GDDR6_ is doing _just fine_. Bandwidth matters. And the 3060's starved on that, compared to the 3060Ti.


----------



## Tetras (Dec 6, 2021)

Selaya said:


> 6GB isn't the problem _per se_ w/ those, rather _6GB GDDR5_ is. As you can see, the 2060 w/ 6GB of _GDDR6_ is doing _just fine_. Bandwidth matters. And the 3060's starved on that, compared to the 3060Ti.



Because the Ti has a wider bus?


----------



## Selaya (Dec 6, 2021)

yes


----------



## Valantar (Dec 6, 2021)

Vayra86 said:


> Right, good story, and then you get to the situations reviewers cannot cover in full length, and you still notice the occasional stutter, inconsistency, a hang here or there, and you're just not quite as smooth on frametimes as you'd love to be.
> 
> Or you start modding and DO require that VRAM allocated because many more assets are pushed through than developers intended. I've seen it too often. Allocation is relevant to performance and frame times, even if its not in active usage. You are pushing harder on your VRAM bandwidth with more swaps required, and this will cause hiccups.
> 
> Reading charts != gaming


That would be a good point - if I had been arguing the opposite. Yes, the lack of frametime data and/or .1%/1% lows in TPU's charts is a weakness, and it is entirely possible that some of those average FPS numbers are misleading. But in general, they won't be. The problem with only looking at averages is that you're unable to spot the outliers, not that the overall image is wrong.


Vayra86 said:


> But... I'll leave everyone to their own illusion. Its very hard to get the full insight on this apart from long term experience. However if you intend to use your card for longer than 2-3 years, better have 'too much' VRAM or you'll find yourself upgrading soon. One thing though... putting your eggs in the basket of 'future technologies' is the worst possible outlook IMHO. Remember DX12 and its mGPU? Hmhm developers definitely jumped on that. I can name you another few hands full of such 'developments' that fell off the dev budget train.


Except that mGPU has been a shitshow since the first implementation of SLI, and DX12 putting the onus for making it work entirely on developers was _exactly_ what made it problematic previously (the few games that had official profiles worked okay-ish, everything else was crap, and developers are always pressed on time). DS is supposedly easily implemented, is standard on the Xbox consoles (which is a _huge_ push for adoption by itself), and ultimately does the same that already happens, just faster and more efficiently. So while I agree that betting on future tech to save the day is generally a bad idea, DS seems like one of those (relatively few) cases where it's likely to work out decently. As advertised? Unlikely. But as an improvement over the current "okay, on the current trajectory in 10 seconds the player might enter areas A, B or C, each of which need 500MB of new textures loaded, and we can't expect more than 200MB/s, so let's start caching!"? That's a given. And, as I said above, that doesn't even need DirectStorage, it just requires games to be developed with the expectation of SSD storage.


Vayra86 said:


> BTW... 6GB cards are definitely VRAM limited in those Far Cry charts. So there you have it. 1060 and 1660ti equal perf? Ouch. That's 26% performance lost... almost a perfect relative perf loss compared to having 8GB vs 6GB (25% less). That's your window looking at the future of cards 4-6 years of age. The rebuttal 'but 20 FPS' does not matter. They would have had a playable near 30 with more VRAM. In a relative sense, with higher perf cards that's 40 being an actual 60 if you had sufficient memory. - And now note the correlation with Far Cry's VRAM allocations being all way over 6GB.


It seems like you're trying to make some "gotcha" point here, but ... *ahem*


Valantar said:


> There are three GPUs that show uncharacteristic performance regressions compared to previously tested games, and all at 2160p: the 4GB 5500 XT, the 6GB 1660 Ti, and the 6GB 5600 XT. The 6GB cards show much smaller drops than the 4GB card, but still clearly noticeable. So, for Far Cry 6, while reported VRAM usage at 2160p is in the 9-10GB range, _actual_ VRAM usage is in the >6GB <8GB range.


So ... yes?

My whole point was: "you can't trust VRAM readouts from drivers or software, as they are not representative of actual VRAM usage" - as a counter to your "I've seen games come close to 7GB, so 8GB is going to be too little soon" argument. The point of my argument isn't specifically whether or not 8GB is sufficient or not, but more broadly that you need to look at actual performance data and not VRAM usage. Your initial statement was made on a deeply flawed basis - much more flawed than the absence of .1% data in TPU's reviews.

To make this extra clear: Your argument that I responded to was "I'm seeing >7GB, so we might soon be hitting 8GB and be bottlenecked." My response was "here's an example of a game that shows _9GB_ of VRAM usage, yet is only clearly bottlenecked on _6GB or lower_."

As for whether the lower amount of VRAM will bring you from "a playable near 30" to something lower: at that point you need to _lower your damn settings_. Seriously. This is at Ultra. Playing at Ultra is always dumb and wasteful, even on a flagship GPU. And yes, lowering texture quality is often a lot more noticeable than other settings with a similar performance gain. But when you're at the "can I hit 30 or not" point in performance, well, either you're playing a game where smoothness doesn't matter, or you'll have a better play experience lowering your settings.

As I apparently have to repeat myself:


Valantar said:


> This could of course be interpreted as 8GB of VRAM becoming too little in the near future, but, a) we don't know where in the 6-8GB range that usage sits; b) there are technologies incoming that will alleviate this; c) this is only at 2160p, which these GPUs can barely handle at these settings levels even today. Thus, it's far more likely for compute to be a bottleneck in the future than VRAM, outside of a handful of poorly balanced SKUs - and there is no indication of the 3060 Ti being one of those.


I mean ... this should be pretty clear. The important thing is a GPU with a good balance of compute and VRAM. In current games, and as VRAM usage has developed in the past years, 8GB is unlikely to be a significant bottleneck for anything but the most powerful GPUs at resolutions they are actually capable of rendering at half-decent framerates. If you're buying a 3060 Ti to play at 2160p Ultra, then either you are making some particularly poor choices or you are well aware that this will not result in a smooth experience (which, depending on the game, can be perfectly fine). If you are buying a 3060 Ti, have a 2160p monitor, and refuse to lower your resolution or settings? Then you are letting stubbornness get in the way of enjoying your games, and the bottleneck is your attitude, not the GPU. Either way, even the 3070 Ti (with its 26% additional compute resources) will most likely do just fine with its 8GB for the vast majority of titles at the settings it can otherwise handle. That card has a higher chance of being bottlenecked by VRAM in some titles, and no doubt will, but enough for it to _really_ matter? Not likely. And certainly not to a degree that can't be overcome by adjusting a few settings.


----------



## DuxCro (Dec 6, 2021)

Selaya said:


> 6GB isn't the problem _per se_ w/ those, rather _6GB GDDR5_ is. As you can see, the 2060 w/ 6GB of _GDDR6_ is doing _just fine_. Bandwidth matters. And the 3060's starved on that, compared to the 3060Ti.


No amount of bandwidth or overclocking will solve a lack of quantity. 6GB is 6GB. Higher-bandwidth memory will give you better performance at higher resolutions, but it will at the same time require more VRAM, and 6GB is still 6GB. I'd take a card with 12GB of GDDR5 over a card with 8GB of GDDR6 any day of the week, especially since I game at 1080p.


----------



## RandallFlagg (Dec 6, 2021)

Vario said:


> ...I have a question regarding predicting which is more likely to have staying power ~4 years out from now.  I tend to run stuff a really long time....



3060 Ti, without question. TPU shows it being 23% faster than the 3060 at 1080p and 27% faster at 1440p.

That is like an entire generational jump in GPU performance, so in theory it would give you two more years of useful life vs a 3060 (new GPU generations tend to come out every 2 years).


----------



## DuxCro (Dec 6, 2021)

Vario said:


> I've been in EVGA queue for both of these products for almost a year. The 3060 12GB is likely to come up first.  I have a question regarding predicting which is more likely to have staying power ~4 years out from now.  I tend to run stuff a really long time. The 3060ti processor is much faster but the 12GB ram is possibly better in this aspect if lower vram amounts end up being a limiting factor. What do you guys think?
> 
> The main reason I ask is in 2013, I once purchased a 770 2GB instead of the 4GB because I was told the card wasn't really fast enough to matter.  After a few years, the limited VRAM became an issue that prevented me from running some games.  Similarly, the 780ti's 3GB severely limited that card just a couple years after its launch, despite being close to a 980 in performance otherwise.


An identical situation awaits you if you buy a card today that has 8GB of video memory. I think everyone who recommends the 3060 Ti with 8GB is really short-sighted and completely missed the point that you are asking for a card that will have enough VRAM for new games 4 years from now. You will find yourself in a situation where developers recommend 10-12GB of VRAM, with RDNA 3 and RTX 4000 series cards on the market; you will have an RTX 3060 Ti that gives slightly better fps, but you must lower textures to medium and the game just looks bad. On the RTX 3060 you can simply lower the quality of shadows, ambient occlusion and other barely noticeable things, get the same fps with high-quality textures, and end up with a better-looking game. Lower-quality textures are quite noticeable and have a big impact on how the game looks overall.


----------



## RandallFlagg (Dec 6, 2021)

DuxCro said:


> Identical situation awaits you if you buy a card today that has 8GB of video memory.  I think everyone who reccomend 3060ti with 8GB is really short sighted and completely missed the point that you are asking for a card that will have enough VRAM for new games 4 years from now. You will find yourself in a situation where developers reccomend 10-12GB of  VRAM with RDNA 3 and RTX 4000 series cards on the market, you will have RTX 3060 Ti that gives slightly better fps and then you must lower textures to medium, and the game just looks bad. While on RTX 3060 you can simply lower the quality of shadows, ambient occlusion and some other barely noticable things for example and get the same fps but with high quality textures. And in the end better looking game. While lower quality textures are quite noticable and have big impact on how the game looks overall.



That might have some validity if OP were buying a 3090 or 6900XT.  8GB is more than enough for what a 3060 or 3060 Ti can do.  12GB is just there on the 3060 for marketing purposes.  No sane developer is going to target 12GB GPUs for multiple obvious reasons, and there's no reason to think that they will for the next 4-5 years.


----------



## dirtyferret (Dec 6, 2021)

droopyRO said:


> 3060 Ti. GPU power is power. While vRAM can be managed by lowering texture details.


----------



## DuxCro (Dec 6, 2021)

RandallFlagg said:


> That might have some validity if OP were buying a 3090 or 6900XT.  8GB is more than enough for what a 3060 or 3060 Ti can do.  12GB is just there on the 3060 for marketing purposes.  No sane developer is going to target 12GB GPUs for multiple obvious reasons, and there's no reason to think that they will for the next 4-5 years.


No reason to think that they won't, especially for the latest games, and especially since AMD is more generous with VRAM. I also have a system with an RX 570 4GB: a surprisingly capable 1080p card held back in some games by its 4GB of VRAM.

As for the bandwidth difference between the RTX 3060 and 3060 Ti: the 3060 has 12GB of GDDR6 running at 1875MHz over a 192-bit bus, while the 3060 Ti has 8GB of GDDR6 running at 1750MHz over a 256-bit bus.

So 3060 bandwidth = 360GB/s
3060 Ti bandwidth = 448GB/s

And you can narrow that difference by OC-ing the VRAM. According to TPU's review you can OC it to 2200MHz, which comes to about 422GB/s on the RTX 3060.
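For anyone wanting to check these figures: peak GDDR6 bandwidth is just the effective per-pin data rate (memory clock x 8 for GDDR6) times the bus width in bytes. A quick sketch reproducing the numbers above (the clocks and bus widths are the ones quoted in this post):

```python
def gddr6_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s for a GDDR6 memory bus.

    GDDR6 moves 8 bits per pin per memory-clock cycle, so the
    effective data rate per pin is mem_clock * 8.
    """
    gbps_per_pin = mem_clock_mhz * 8 / 1000   # effective Gbps per pin
    return gbps_per_pin * bus_width_bits / 8  # bits -> bytes across the bus

print(gddr6_bandwidth_gbs(1875, 192))  # RTX 3060           -> 360.0
print(gddr6_bandwidth_gbs(1750, 256))  # RTX 3060 Ti        -> 448.0
print(gddr6_bandwidth_gbs(2200, 192))  # 3060 with VRAM OC  -> ~422.4
```

So the overclocked 3060 closes most of the bandwidth gap on paper, though as noted it does nothing for capacity.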
You can OC the VRAM on an RTX 3060 Ti with liquid nitrogen if you want. It still won't go over 8GB of total memory if the game requires more for max textures.

Or you can always download more VRAM or RAM from the internet. 

The best solution would be to go for an RX 6700 XT, with 12GB of VRAM and performance roughly matching or beating the 3060 Ti.


----------



## dirtyferret (Dec 6, 2021)

Vario said:


> The main reason I ask is in 2013, I once purchased a 770 2GB instead of the 4GB because I was told the card wasn't really fast enough to matter.



You were told right. What did the 4GB version do better? Offer a 10-20% performance boost in a few cherry-picked games three to four years down the road? So instead of a stuttering mess at 25 FPS you got a smooth experience at 28 FPS? Both versions were due for an upgrade at the same time.


----------



## Vayra86 (Dec 6, 2021)

Selaya said:


> 6GB isn't the problem _per se_ w/ those, rather _6GB GDDR5_ is. As you can see, the 2060 w/ 6GB of _GDDR6_ is doing _just fine_. Bandwidth matters. And the 3060's starved on that, compared to the 3060Ti.



Yes, I'm not contesting that. Rather, you could say it's the combination of bandwidth and capacity. But for its bandwidth and its capacity, the 3060 Ti has 'too much' core.


----------



## droopyRO (Dec 6, 2021)

DuxCro said:


> 3060. VRAM is VRAM. No ammount of horsepower or overcloking will fix lack of VRAM.  If you are trying to build something more futureproof, I would go with RTX 3060. Not much difference in horsepower, especially if you OC your 3060,  but once you get to a game that uses more than 8GB of VRAM for max textures, you will be happy with 3060. On the other side, you will have 3060Ti that gives you more fps, but you have to lower texture quality so much that the game looks like crap.


Yup, all my games that fit their textures in to 8 GB of vRAM look like crap. /s


----------



## DuxCro (Dec 6, 2021)

droopyRO said:


> Yup, all my games that fit their textures in to 8 GB of vRAM look like crap. /s


How about you read my post again, this time with brain activated.


----------



## droopyRO (Dec 6, 2021)

To respond in the same tone you used:


DuxCro said:


> 3060. VRAM is VRAM. No ammount of horsepower or overcloking will fix lack of VRAM.  If you are trying to build something more futureproof, I would go with RTX 3060. Not much difference in horsepower, especially if you OC your 3060,  but once you get to a game that uses more than 8GB of VRAM for max textures, you will be happy with 3060. On the other side*, you will have 3060Ti that gives you more fps, but you have to lower texture quality so much that the game looks like crap.*


Do you remember writing this? Please show us a game that fits its textures into 8GB of vRAM and "looks like crap".


----------



## neatfeatguy (Dec 6, 2021)

DuxCro said:


> How about you read my post again. This time with brain activated



Games looked good when I ran them on two GTX 570s. Then when I replaced them with a 980 Ti, at the same settings, the games looked exactly the same, even though there was more VRAM to load textures into on the 980 Ti than on the 570s.

If you're trying to allude that a 3060, which is simply slower than the 3060 Ti, will be a better choice because it has 4GB more VRAM... huh, I wonder why the 980 Ti didn't make the games I was playing look nicer.

My 980 Ti must have been defective. Damn it... I used it for 6 years... 6 years I used a defective card! Curse you, 980 Ti, and your defectiveness! /s


----------



## Vayra86 (Dec 6, 2021)

Valantar said:


> That would be a good point - if I had been arguing the opposite. Yes, the lack of frametime data and/or .1%/1% lows in TPU's charts is a weakness, and it is entirely possible that some of those average FPS numbers are misleading. But in general, they won't be. The problem with only looking at averages is that you're unable to spot the outliers, not that the overall image is wrong.
> 
> Except that mGPU has been a shitshow since the first implementation of SLI, and DX12 putting the onus for making it work entirely on developers was _exactly_ what made it problematic previously (the few games that had official profiles worked okay-ish, everything else was crap, and developers are always pressed on time). DS is supposedly easily implemented, is standard on the Xbox consoles (which is a _huge_ push for adoption by itself), and ultimately does the same that already happens, just faster and more efficiently. So while I agree that betting on future tech to save the day is generally a bad idea, DS seems like one of those (relatively few) cases where it's likely to work out decently. As advertised? Unlikely. But as an improvement over the current "okay, on the current trajectory in 10 seconds the player might enter areas A, B or C, each of which need 500MB of new textures loaded, and we can't expect more than 200MB/s, so let's start caching!"? That's a given. And, as I said above, that doesn't even need DirectStorage, it just requires games to be developed with the expectation of SSD storage.
> 
> ...


We're using, again, a very flawed instrument to have this discussion.

Far Cry 6 is a very _recent_ example.
What it shows is that on cards 4-5 years of age, the combination of lower VRAM and a tighter bus is killing the otherwise still 'decent enough' core performance. Far Cry 6 is a mainstream title. What's more interesting is what type of games you'll be playing when your GPU is 4-5 years old.

I've lived the practice of that, as I'm doing right now. And it's very easy to distill what happens. The fact that I'm still fine with 8GB is what's saving my 1080's performance and keeps it from falling off sharply like the 1660 Ti does today (and note, that card is a mere *2.5 years* old!). I can play FC6 just fine, and Cyberpunk happily takes 8GB and produces 50 FPS nonetheless, without meaningful IQ sacrifice. But in a relative sense, the 1660 Ti at 6GB and the 1080 at 8GB are quite well balanced in terms of core power. Now, fast forward to today, because in the absence of data from 4-5 years in the future, we need to extrapolate from what we have. We're looking at much faster GPUs endowed with the same 6GB (2060, on par with 1080 core perf) and with the same 8GB on a 3060 Ti that is _way faster_ on the core.

I'm not sure what's in the way of logical thought processes here, but we're specifically talking about what GPU X or Y is going to be worth in terms of future-proofing 4-5 years down the line. It's the exact question the OP is asking. And in _that_ situation, 8GB on a card with 3060 Ti performance is just bad balance. No matter the cache it has and how things changed in its architecture, no matter what arcane technology gets stacked on top in certain titles, the bottom line stands: we had an 8GB card in 2016, and now we have an 8GB card in 2021-2022 with 40-50% higher core perf. It won't last, and it will be capped at the exact same IQ level my 2016 1080 is going to be, while it has lots of core oomph to spare, in any similar use case. At the same time, I'm seeing a very nice, rather well-balanced dropoff on the older 8GB card, where, as you say, most of the time most games are forcing you to dial back regardless and stay within your VRAM cap. But I can still max textures at nearly no perf hit; all I need to do is kill some overexpensive post-processing (Ultra > High) to keep my perf. A 3060 Ti, 4-5 years down the line, will be making far greater sacrifices on IQ, and likely not the ones you'd prefer.

This mismatch between VRAM cap and much higher core perf serves an obvious purpose, one Nvidia is well known for. It hard-caps VRAM so that the urge to upgrade will arise even for people who are just fine with somewhat lower FPS and good IQ alongside it. It's a form of planned obsolescence, and while Turing pushed that button slightly, Ampere is ramming it like no other. The market is clearly asking for higher-VRAM GPUs, consoles are clearly going over 8GB, AMD has more across the board per tier, and games are already looking for higher allocation than 8GB as it is. How many writings on the wall do you really need? Don't even take my word for it. Believe Nvidia itself, when it produces double-VRAM, same-SKU products in the same Ampere stack - and note, even in the absence of working SLI... the last GPUs they doubled VRAM on all had SLI fingers, where the double VRAM serves an obvious purpose.

Bandwidth can certainly take part of capacity out of the equation 'before you start to notice'. But that has its limits too, and it is highly dependent on careful driver tweaks and per-game optimizations - yet another example of what developers are not going to spend a lot of budget on, except if the game has great reception. It works well for AAA, but not so well for less popular software.


However... in this current comparison, the 3060 Ti is still the better option, and I think I agreed on that earlier in this topic. I also added that it is _far from ideal_. The choice between a 3060 12GB and a 3060 Ti 8GB is a choice between poorly balanced products for long-term usage.


----------



## DuxCro (Dec 6, 2021)

neatfeatguy said:


> Games looked good when I ran them on two GTX 570s.  Then when I replaced them with a 980Ti, at the same settings, the game looked exactly the same, even though there was more VRAM to load textures into on the 980Ti vs the 570s.
> 
> If you're trying to elude that a 3060, that's just underwhelming slower than the 3060Ti will be a better choice because it has 4GB more of VRAM.....huh, I wonder why the 980Ti didn't make the games I was playing look nicer.
> 
> My 980Ti must have been defective. Damn it....I used it for 6 years....6 years I used a defective card! Curse you 980Ti and your defectiveness! /s


OK, last post. Done with this. If you think 8GB of VRAM will be enough for at least the next 4 years to play all games with *texture quality on max*, go for the 3060 Ti. Otherwise I would go with the 3060 with 12GB of VRAM and just OC the GPU and VRAM to tighten the performance gap. Killing two birds with one stone would be buying an RX 6700 XT - slightly more expensive than the 3060 Ti, depending on the model.


----------



## lexluthermiester (Dec 6, 2021)

Vario said:


> I've been in EVGA queue for both of these products for almost a year. The 3060 12GB is likely to come up first.  I have a question regarding predicting which is more likely to have staying power ~4 years out from now.  I tend to run stuff a really long time. The 3060ti processor is much faster but the 12GB ram is possibly better in this aspect if lower vram amounts end up being a limiting factor. What do you guys think?
> 
> The main reason I ask is in 2013, I once purchased a 770 2GB instead of the 4GB because I was told the card wasn't really fast enough to matter.  After a few years, the limited VRAM became an issue that prevented me from running some games.  Similarly, the 780ti's 3GB severely limited that card just a couple years after its launch, despite being close to a 980 in performance otherwise.


Go with the 3060 12GB. You'll get more longevity out of it.


----------



## Valantar (Dec 6, 2021)

Vayra86 said:


> We're taking, again, a very flawed instrument to have this discussion.


Well, yes. Predicting the future will always be flawed. That's why discussions are productive, though! We can at least try to get down to _why_ predicting this is difficult, if nothing else.


Vayra86 said:


> Far Cry 6 is a very _recent_ example.


That was the point - about as up to date as you're going to get in terms of game development and thus VRAM usage in AAA titles.


Vayra86 said:


> What it shows, is that on cards at 4-5 years of age, the combination of lower VRAM and tighter bus is killing the otherwise still 'decent enough' core performance. Far Cry 6 is a mainstream title. What's more interesting is what type of games you'll be playing at a 4-5 year age on your GPU.
> 
> I've lived the practice of that, as I'm doing right now. And its very easy to distill what happens. The fact that I'm still fine with 8GB, is what's saving my 1080's performance and doesn't cause it to fall off sharply like the 1660ti does today (and note, has a mere *2,5 years* in age!). I can play FC6 just fine and Cyberpunk happily takes 8GB and produces 50 FPS nonetheless, without meaningful IQ sacrifice. But in a relative sense - the 1660ti at 6GB and the 1080 at 8GB are quite well balanced in terms of core power. Now, fast forward to today, because in the absence of data of the future 4-5 years from now, we need to extrapolate what we have. We're looking at much faster GPUs endowed with the same 6GB (2060, on par with 1080 core perf) and with the same 8GB on a 3060 that is _way faster_ on the core.


IMO you're still mixing a few factors here that make this comparison problematic though: That a 2.5-year-old midrange card (1660Ti) struggles with a brand new AAA title at 2160p? That's expected. Heck, that was expected when it was new. The 1660Ti has never been able to handle 2160p Ultra in any reasonable way. The 1080 supposedly was at the time it was new, but that's due to games then being less demanding and no faster cards existing - and most reviews back then still pointed out that even the 1080 Ti wasn't _really_ fast enough for 2160p60 Ultra.

So, while you're right that the 1660Ti underperforms, it underperforms in a scenario it has never been suited for, and where it would perform poorly no matter what. Sure, it would likely be around 30fps instead of 20. But at that point, the _only_ sensible approach (unless you're playing a game where framerate _really_ doesn't matter) is to turn down your settings or resolution. Thus, the point is moot. It was unsuited to the task before the VRAM bottleneck ever came into question.


Vayra86 said:


> I'm not sure what's in the way of logical thought processes here with people, but we're specifically talking about what GPU X or Y is going to be worth in terms of future proofing 4-5 years down the line. Its the exact question the OP is asking. And in thát situation, 8GB on a card with 3060ti performance is just bad balance. No matter the cache it has and how things changed in its architecture. No matter what arcane technology gets stacked on top in certain titles. The bottom line, stands. We had an 8GB card in 2016 and now we have an 8GB card in 2021-2022 with 40-50% higher core perf. It won't last, and it will be capped at the exact same IQ level my 2016 1080 is going to be, while it has lots of core oomph to spare, in any similar use case. At the same time, I'm seeing a very nice, rather well balanced dropoff on the older 8GB card, where as you say, most of the time, most games are forcing you to dial back regardless and stay within your VRAM cap. But I can still max textures at nearly no perf hit. All I need to do is kill some overexpensive post processing (Ultra > High) to keep my perf. A 3060ti, 4-5 years down the line, will be making far greater sacrifices on IQ, and likely not the ones you'd prefer.


There's a flaw in your logic here: just because the older 8GB cards aren't bottlenecked doesn't mean they're actively using 8GB - the next step down is 6GB, so all we know is >6GB <8GB. And VRAM usage across games on average creeps up relatively slowly. Over the past decade we've gone from 2GB at the high end to 8GB in the mid-range and 16GB at the high end (and from 1080p at the high end to 2160p, though lower resolutions were still _very_ common in 2011-ish), with visible bottlenecks at 2160p showing up first at 4GB a couple of years back, then 6GB more recently. Expecting 8GB to be performance-breakingly insufficient within 4-5 years, at resolutions that these GPUs are reasonably equipped to handle? Yeah, no, I don't believe that. At 2160p Ultra, sure, but these GPUs can't handle that _now_, and _definitely_ won't handle that in 4-5 years, however much VRAM they might have. At 1440p or 1080p? Even with high-quality textures, 8GB is likely to be plenty for many years to come. In the FC6 test, the 4GB 5500 XT _might_ seem to be toeing the line at 1440p, as there's a noticeable relative performance loss compared to 1080p - but it's overall still small, likely indicative of occasional stuttering. And if 4GB is borderline enough for 1440p Ultra today, there's no way whatsoever 8GB will be insufficient in 4-5 years. VRAM needs have _never_ increased that rapidly, and there's no reason why they would start doing so now.
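The growth rate described here can be put into rough numbers. Taking the post's own figures (roughly 2GB a decade ago to 8GB now), that's two doublings in ten years, or about five years per doubling; at that pace, 8GB wouldn't fall behind a hypothetical 12GB requirement for roughly three more years. A trend sketch under those assumptions, not a prediction:

```python
import math

def doubling_time_years(start_gb: float, end_gb: float, years: float) -> float:
    """Years per doubling, assuming smooth exponential growth."""
    return years / math.log2(end_gb / start_gb)

def years_until(current_gb: float, target_gb: float, doubling_years: float) -> float:
    """Years until requirements grow from current_gb to target_gb."""
    return math.log2(target_gb / current_gb) * doubling_years

dt = doubling_time_years(2, 8, 10)
print(dt)                      # 5.0 years per doubling
print(years_until(8, 12, dt))  # ~2.9 years until a 12GB requirement
```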


Vayra86 said:


> This mismatch between VRAM cap and much higher core perf serves an obvious purpose, one Nvidia is well known for. It hard caps VRAM so that the urge to upgrade will arise even for people who are just fine with somewhat lower FPS and good IQ alongside it. Its a form of planned obscolesence and while Turing pushed that button slightly, Ampere is ramming it like no other. The market is clearly asking for higher VRAM cap GPUs, consoles are clearly going for over 8GB, AMD has a stack with more across the board per tier and games are already looking for higher allocation than 8GB as it is. How many writings on the wall do you really need? Don't even believe my word for it. Believe Nvidia itself, when it produces double VRAM, same SKU products in the same Ampere stack - and note, even in the absence of working SLI... the last GPUs they doubled VRAM on were all with SLI fingers, where the double VRAM serves an obvious purpose.


Double-VRAM SKUs are a marketing tactic, nothing more, nothing less. The 3060 has 12GB because 6GB would make it look bad compared to AMD's 8GB cards for spec-illiterate buyers. The 2060 12GB is a cash grab, period - if they cared about supply at all, they would just stuff the market with standard 2060s; the 6GB isn't holding it back at resolutions it's suited for (1440p and 1080p). But precisely because there's this prevalent idea that you _need_ 8GB or even more, because a lot of games opportunistically allocate tons of VRAM if it exists, people get roped into these misconceptions about what makes a SKU worth buying. "The market" is not rational. And it is certainly not well informed. Thus, what the "market wants" is often a poor choice of guideline for making a good product.

I mean, look at the FC6 numbers for the 2060. The VRAM "use" reading is >7GB at every single resolution on the 3090, yet the 2060 shows no sign of underperforming whatsoever - it's exactly where you'd expect it. Heck, except at 2160p, the 8GB 2070 has _worse_ performance regressions compared to the TPU test suite mean, even if those differences are marginal. It is of course possible that these average numbers are hiding stuttering and poor 1%/.1% performance (and I'd kind of expect that at 2160p, given results with other 6GB cards in that test), but at 1440p and 1080p? Not a chance.

And, again, even if you have a 2160p monitor and "only" a 2060: insisting on playing on ultra is _dumb_. Heck, most 2160p panels have sufficient pixel density to look decently sharp even at non-native resolutions, so 1440p (or 4:1 scaled 1080p) would be a much better choice on that monitor for most games if you insist on Ultra, though 2160p mid-high would likely work decently too. And you can of course again complain that this is a new and expensive card and so on, but again: it isn't a 2160p Ultra card. It never has been. Not when new, not now, not 4-5 years after launch. And that's fine.


IMO, a _huge_ part of what has changed here is not the imbalance between compute and VRAM, even if that has been on a gradual increase since the birth of GPUs more or less, but the fact that reviewers have started focusing on 2160p and that games all stream their assets aggressively and opportunistically. When we're talking about GPUs barely capable of 2160p Ultra today, the question of longevity at that setting level is already answered. That isn't what these GPUs are suited for, so the answer to the question of which would handle that best in the future would be like asking which human is best at breathing underwater - there might be a difference, but it definitely doesn't matter. Thus, the question _must_ be moved to lower settings levels and/or resolutions - that's a necessity when thinking of GPU longevity. And VRAM needs drop off dramatically then, with even 6GB clearly being sufficient today. Thus, it's reasonably safe to assume that the 8GB 3060 Ti will perform the best at reasonable settings and resolutions in 4-5 years, just as it does today. There will no doubt be outliers then too, of games with astronomical VRAM needs and dumb bugs, but ultimately, none of these GPUs are likely to be meaningfully VRAM bottlenecked before they are bottlenecked by their compute capabilities. Which was exactly what I said in my first post here, just without the 2000 words explaining the reasoning behind it.


----------



## droopyRO (Dec 6, 2021)

I don't know if the day will ever come that a 3060 will be faster than a 3060 Ti. 8GB of vRAM can fit today's textures at ultra settings in 1440p.
Even if in 5 years we have games that require 8GB for low settings, do you really think those textures would look muddy or "like crap"? 8GB worth? And that having 4GB more and 30% less GPU power would be better?

I think not. I am buying hardware for current requirements, not for some distant future. Imagine this mining madness never stopping and prices being this inflated 5 years from now. How would any AAA game sell well on PC if people needed a 12GB card to run it at 1440p and low details? It would mean the death of PC gaming.
So either 8GB will be sufficient for low to medium details at 1080p/1440p, or miners will have overwhelmed the AAA PC gaming market. The latter situation would be very bad for our hobby.

PS: speaking of a distant future: imagine someone in 2016 telling you that GPU prices would double or triple in 2021 at the same performance level. How hard would you have laughed at that person?


----------



## Bomby569 (Dec 6, 2021)

I have a 3060 Ti; I would not trade it for a 3060, but I already had to take settings down because it passed 8GB in Resident Evil Village. It could certainly use more VRAM today, not to mention in a year or two.


----------



## Valantar (Dec 7, 2021)

Bomby569 said:


> I have a 3060 Ti; I would not trade it for a 3060, but I already had to take settings down because it passed 8GB in Resident Evil Village. It could certainly use more VRAM today, not to mention in a year or two.


Did you actually see judder or performance drops? At what resolution?


----------



## wolf (Dec 7, 2021)

Valantar said:


> Outside of a few edge cases with massive VRAM needs, the Ti is likely to last longer simply due to being faster. As has been mentioned above, it's crucial to remember that system reported VRAM "usage" is wildly inflated in most games through opportunistic pre-caching of assets, most of which are never used before being ejected in favor of pre-caching of other assets again. VRAM usage numbers are thus not really an indicator of anything other than how aggressively the game streams in assets that might be useful. Real VRAM-induced performance limitations are found in framerate/frametime measurements. Typically seen as especially bad 1%/.1% lows, but also as unexpectedly bad averages if the bottleneck is sufficiently bad.
> 
> As for predicting future developments, actual VRAM needs in games have grown relatively slowly over the past decade, and while they are indeed higher across the board, history has shown that most GPUs are held back by compute long before they are held back by VRAM capacity. Of course history doesn't predict the future, but change is also typically slow and gradual. Plus, technologies like DirectStorage have the potential to lower actual VRAM capacity needs quite noticeably through enabling vastly faster on-the-fly streaming of assets.


A well thought out and well written take which is absolutely spot on.


Vayra86 said:


> Dialing down texture res is a significant IQ hit


Going from Ultra to low is, but if we're talking Ultra down to very high or high (equivalents) I disagree. Sure if you have the VRAM, fill it, but I don't find 1-2 notches down on textures to significantly affect the games I play, in fact most times I'm hard pressed to notice any difference whatsoever.

I'd argue losing ~30% GPU performance is the more significant and consistent 'hit' they'd be taking to make the choice of a 3060 over the Ti, and they'd be taking it from the get go.


Valantar said:


> If you are buying a 3060 Ti, have a 2160p monitor, and refuse to lower your resolution or settings? Then you are letting stubbornness get in the way of enjoying your games, and the bottleneck is your attitude, not the GPU


So much this. One way or another the vast majority of us are tweaking all manner of settings to get the balance of visuals to FPS right, and that's a highly personal set of choices.


Valantar said:


> it's reasonably safe to assume that the 8GB 3060 Ti will perform the best at reasonable settings and resolutions in 4-5 years, just as it does today.


Exactly this yet again, it will essentially always be an appropriate texture setting relative to the other settings the GPU can manage anyway. Why take a ~30% hit on FPS today and for the next 4-5 years? The overall experience will always be better on the 3060Ti.


droopyRO said:


> Even if in 5 years we have games that require 8GB for low settings, do you really think those textures would look muddy or "like crap"? 8GB worth? And that having 4GB more and 30% less GPU power would be better?


Yet another great point that I hadn't even considered.


----------



## Selaya (Dec 7, 2021)

Vayra86 said:


> Yes. Im not contesting that. Rather you could say its the combination of bandwidth and capacity. But for its bandwidth and its capacity, the 3060ti has 'too much' core.


I wouldn't say that. The 3060Ti has performance around a 2080S, or something? And that was fine w/ 8GB really, and still is.


----------



## nguyen (Dec 7, 2021)

I would rather have 100FPS with High Texture Quality than 70FPS with Ultra Texture Quality


----------



## Frick (Dec 7, 2021)

DuxCro said:


> An identical situation awaits you if you buy a card today that has 8GB of video memory. I think everyone who recommends the 3060 Ti with 8GB is really short-sighted and completely missed the point that you are asking for a card that will have enough VRAM for new games 4 years from now. You will find yourself in a situation where developers recommend 10-12GB of VRAM with RDNA 3 and RTX 4000 series cards on the market; you will have an RTX 3060 Ti that gives slightly better fps, and then you must lower textures to medium, and the game just looks bad. While on an RTX 3060 you can simply lower the quality of shadows, ambient occlusion, and some other barely noticeable things, for example, and get the same fps but with high-quality textures, and in the end a better-looking game. Lower-quality textures are quite noticeable and have a big impact on how the game looks overall.



This argument comes back, and sometimes it has merit, but generally it has turned out that it's better to buy more raw power than more memory. The 128MB Radeon 9000 vs the 64MB Radeon 9000 Pro argument takes me back...


----------



## Outback Bronze (Dec 7, 2021)

Frick said:


> This argument comes back, and sometimes it has merit, but generally it has turned out that it's better to buy more raw power than more memory. The 128MB Radeon 9000 vs the 64MB Radeon 9000 Pro argument takes me back...


 
Yep, my vote is 3060Ti


----------



## Vayra86 (Dec 7, 2021)

Selaya said:


> I wouldn't say that. The 3060Ti has performance around a 2080S, or something? And that was fine w/ 8GB really, and still is.



2080S is a 2019 GPU, it better be. And yet, it will also find a small selection of games already where it falls short on 8GB with that amount of core perf. Turing already was a step back in VRAM/core.

Is that horrible? No. Is it sub optimal? Definitely. And that's what I've been saying. 3060 / 3060ti is a choice of suboptimal product. 12GB on the 3060 makes as little sense as 8GB on the Ti.


----------



## Outback Bronze (Dec 7, 2021)

Vayra86 said:


> 8GB on the Ti.



 Can I stir you up about the 3080 lols..


----------



## Vayra86 (Dec 7, 2021)

Valantar said:


> Double VRAM SKUs is a marketing tactic, nothing more, nothing less.


Nope, they never were. Double VRAM catered to a good 10~15% of market demand that was aimed at SLI/Crossfire buyers that would drain the late-in-gen-stock for Nvidia and AMD. It was always a great way to move units, and for those reasons dual GPU was always priced about 10% more favorable in FPS/dollar, even if you took some support issues in your stride. Spec illiterate was never expected beyond the midrange, and never catered to. And it isn't today. There is only market demand and how do we sell units. What part of that is marketing and what's real demand? Good luck drawing that line. A salesman knows: business is about 'creating demand'. Demand is demand. Today, Nvidia releases those units because there is demand, because the market spoke out against low VRAM amounts, and Nvidia wasn't capable of sourcing enough chips for the double amount, or any unholy combination of these factors.

If you apply the marketing tactic about VRAM to the segment below midrange, ie OEM-midrange prebuilts, laptops, and casual gaming gpus then yes, you would be right - that is the segment of illiterate buyers that say 'moar better' without looking at what's behind the numbers.

The 3090 also got a double VRAM version, which was specifically aimed at 'creators'. Similarly, Titans were marketed specifically at some gray area of enthusiasts (the last group you'd expect to be tech illiterate) that would be semi pro as well.


----------



## Bomby569 (Dec 7, 2021)

Valantar said:


> Did you actually see judder or performance drops? At what resolution?



No, because I didn't let it past the limit. But I'm sure that's what would happen, at 1440p.
But don't panic, this is just one example; RE games are notorious for eating VRAM, just like the RE2 remake did back in the day.



Selaya said:


> I wouldn't say that. The 3060Ti has performance around a 2080S, or something? And that was fine w/ 8GB really, and still is.



2 years have passed, like already mentioned, and now DLSS or even FSR lets you use more of your card; you can do resolutions you couldn't with a 2080S back in 2019.


----------



## Frick (Dec 7, 2021)

Vayra86 said:


> 2080S is a 2019 GPU, it better be. And yet, it will also find a small selection of games already where it falls short on 8GB with that amount of core perf. Turing already was a step back in VRAM/core.
> 
> Is that horrible? No. Is it sub optimal? Definitely. And that's what I've been saying. 3060 / 3060ti is a choice of suboptimal product. 12GB on the 3060 makes as little sense as 8GB on the Ti.



Sometimes I wish VRAM slots became a thing. My brother had such a card (because the seller was convinced it was the next big thing), don't remember what card it was. Riva? Matrox?


----------



## Bomby569 (Dec 7, 2021)

Frick said:


> Sometimes I wish VRAM slots became a thing. My brother had such a card (because the seller was convinced it was the next big thing), don't remember what card it was. Riva? Matrox?



The way GPUs work, it's hard to see how that could be a thing: the memory chips have to be close to the core, but there's also a gigantic heatsink over them with no room to spare, and they need lots of cooling themselves.


----------



## Outback Bronze (Dec 7, 2021)

Bomby569 said:


> The way GPUs work, it's hard to see how that could be a thing: the memory chips have to be close to the core, but there's also a gigantic heatsink over them with no room to spare, and they need lots of cooling themselves.



Back of the GPU?


----------



## wolf (Dec 7, 2021)

Outback Bronze said:


> Back of the GPU?


That was my thought; I'm envisioning laptop-style DIMM slots, or say an M.2 slot where they lay flat against the board on the back. Alas, it seems like a pipe dream, at least for anything more powerful than entry-level stuff.


----------



## Selaya (Dec 7, 2021)

Frick said:


> Sometimes I wish VRAM slots became a thing. My brother had such a card (because the seller was convinced it was the next big thing), don't remember what card it was. Riva? Matrox?


I actually had a dream the other night about socketable GPU daughterboards, just for that lmfao


----------



## Bomby569 (Dec 7, 2021)

I'm no expert and I could be saying something wrong, but when you introduce a socket you have to consider what speeds that socket can support, and there's also the question of having one or two slots versus the normal configuration with several VRAM chips, so you end up with a bottleneck.


----------



## Frick (Dec 7, 2021)

Bomby569 said:


> I'm no expert and I could be saying something wrong, but when you introduce a socket you have to consider what speeds that socket can support, and there's also the question of having one or two slots versus the normal configuration with several VRAM chips, so you end up with a bottleneck.



Sure, it would introduce all sorts of complications and there's a reason for the idea never taking off, but still.


----------



## Valantar (Dec 7, 2021)

Vayra86 said:


> Nope, they never were. Double VRAM catered to a good 10~15% of market demand that was aimed at SLI/Crossfire buyers that would drain the late-in-gen-stock for Nvidia and AMD.


Okay, I should have added the qualifier "for the past 3-4 generations (i.e. since SLI/CF died)". You're right that they served a purpose back when that was still a feasible approach to improving performance. Me cheaping out and getting the 512MB HD 4850s instead of the 1024MB ones back in 2008 or so was exactly the reason why I had to get a 6950 in 2011 - there was no way those 512MB cards could perform passably at 1440p when I got my U2711. But then, 1440p gaming in 2011 was ... pushing it.


Vayra86 said:


> It was always a great way to move units, and for those reasons dual GPU was always priced about 10% more favorable in FPS/dollar, even if you took some support issues in your stride.


Sure. But then SLI/CF has been entirely irrelevant since at least 2015. Yes, I know even 10 series supported SLI, but by that point game/driver support was so poor as to be essentially nonexistent.


Vayra86 said:


> Spec illiterate was never expected beyond the midrange, and never catered to. And it isn't today.


Yes it absolutely is. Gaming has _exploded_ in recent years, and the sheer amount of sales of GPUs today despite their ridiculous pricing is by itself proof that people are willing to pay _way_ unreasonable prices compared to what products are worth. A significant part of that can likely be attributed to people either new to the hobby or uninterested in learning, who just want to buy something good. There will _always_ be more uninformed customers than well-informed ones. Period. Thus, catering to the spec-illiterate is a _massive_ part of marketing, and always has been. The main change in PC gaming is that a decade ago it was still a relatively niche hobby mostly (but not entirely) limited to enthusiasts, while today it's ubiquitous.


Vayra86 said:


> There is only market demand and how do we sell units. What part of that is marketing and what's real demand? Good luck drawing that line. A salesman knows: business is about 'creating demand'. Demand is demand. Today, Nvidia releases those units because there is demand, because the market spoke out against low VRAM amounts, and Nvidia wasn't capable of sourcing enough chips for the double amount, or any unholy combination of these factors.


You're contradicting yourself several times here. You say demand can be created, yet you claim "the market spoke out against low VRAM amounts", as if that occurred spontaneously and out of nothing. Or, maybe, VRAM amounts have been heavily marketed since at least Vega? One-upmanship in VRAM amounts has been a popular GPU marketing sport for quite a few years. You're entirely right that the relationship between marketing and demand is incredibly complex and anything but linear - I've never claimed otherwise - but ideas and beliefs do not appear spontaneously, and there is a _staggering_ amount of superstition and misunderstanding among even enthusiasts about how PCs work. Nobody is immune to this - this debate is proof of that.

Still, your assertion that "today, Nvidia releases those because there is demand" is a cop-out. They have a responsibility to educate and not mislead their customers. And they certainly aren't forced by "the market" to produce dumb SKUs. They _can_, because doing so is an easy and cynical way of playing off of customer superstitions, earning them more money as they can charge more of a premium for """better""" products. But, especially taking Nvidia's market position and _massive_ mindshare into account, they could just as well not do so, and would likely sell just as many GPUs. The amount of customers lost to 8GB AMD competitors would be _microscopic_.


Vayra86 said:


> If you apply the marketing tactic about VRAM to the segment below midrange, ie OEM-midrange prebuilts, laptops, and casual gaming gpus then yes, you would be right - that is the segment of illiterate buyers that say 'moar better' without looking at what's behind the numbers.


You seem to be under the impression that only well-informed enthusiasts are willing to spend a couple of thousand dollars on a PC. This was true even half a decade ago, but today? Not even close. There are _tons_ of reasonably wealthy idiots in the world, and an ever-increasing number of them are gamers.


Vayra86 said:


> The 3090 also got a double VRAM version, which was specifically aimed at 'creators'. Similarly, Titans were marketed specifically at some gray area of enthusiasts (the last group you'd expect to be tech illiterate) that would be semi pro as well.


Don't all 3090s have 24GB? The 3090 is a fundamentally weird GPU, as it doesn't get creator-focused drivers (it's not a Titan), but costs as much, while having 2x the RAM such a card would need. There's a reason they've launched a 12GB 3080 Ti - it performs the same at a lower BOM cost. The 3090 is IMO mainly a flex - it's Nvidia making a _true_ flagship SKU, one that is "extra everything" without it really making sense beyond demonstrating that it can be done. Which is precisely where ultra-luxury products tend to live.


Bomby569 said:


> No, because I didn't let it past the limit. But I'm sure that's what would happen, at 1440p.
> But don't panic, this is just one example; RE games are notorious for eating VRAM, just like the RE2 remake did back in the day.


Not panicking at all, I was just curious - "letting it past the limit" is exactly what would have made this interesting, as I sincerely doubt you would have seen performance issues before you exceeded 8GB by a relatively significant amount of allocated data - as illustrated by the FC6 examples in this thread. Reported VRAM usage is not equal to actual VRAM usage.
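As an aside on how those frametime measurements are read: here's a rough Python sketch of how average FPS and 1%/.1% lows are typically derived from a frametime capture. The numbers below are made up for illustration; real data would come from a capture tool like PresentMon.

```python
# Sketch: derive average FPS and percentile lows from a frametime capture.
# A VRAM bottleneck typically shows up as a handful of very long frames,
# which tank the percentile lows long before the average FPS moves.

def percentile_low_fps(frametimes_ms, fraction):
    """FPS of the slowest `fraction` of frames (e.g. 0.01 for the 1% low)."""
    worst = sorted(frametimes_ms, reverse=True)  # longest frames first
    n = max(1, int(len(worst) * fraction))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 97 smooth frames at 10 ms plus three 80 ms stutters (simulated VRAM thrash)
frametimes = [10.0] * 97 + [80.0] * 3

avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(round(avg_fps, 1))                               # 82.6 - average barely dips
print(round(percentile_low_fps(frametimes, 0.01), 1))  # 12.5 - 1% low collapses
```

The point being: the averages can look fine while the stutter is severe, which is why 1%/.1% lows (or a frametime plot) are where a real VRAM limitation shows up.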



Frick said:


> Sometimes I wish VRAM slots became a thing. My brother had such a card (because the seller was convinced it was the next big thing), don't remember what card it was. Riva? Matrox?


I was about to say, didn't some GPUs have that back in the _really_ old days, when GPUs were new? I sincerely doubt it would be doable today though - SODIMMs have a 64-bit interface after all, so you'd need 4 SODIMMs spaced at equal distance from the package for it to work. Even on the back of the card that would be _really_ difficult. You could of course make a denser connector for GPU memory modules, perhaps some sort of mezzanine connector with a ton of pins, but that would get expensive fast, and signal integrity and power would still be an issue. One possible stopgap solution would be to stick a single SODIMM socket on the GPU for a second layer of RAM, not for direct GPU access but for pre-pre-caching. A 16GB DDR4-3200 SODIMM is pretty cheap, needs little power, and would be able to feed the VRAM much faster than any storage medium, even using DirectStorage. That would likely be sufficient to alleviate nearly any VRAM bottleneck as long as the game is even remotely well written in how it streams in assets.
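To put rough numbers on that interface-width point, here's a back-of-envelope sketch; the 14 Gbps GDDR6 per-pin rate is just a typical figure for a 256-bit card of this class, not a quote of any particular spec.

```python
# Back-of-envelope memory bandwidth: bus width (bits) / 8 * per-pin rate (GT/s).
def bandwidth_gbs(bus_width_bits, transfer_rate_gts):
    return bus_width_bits / 8 * transfer_rate_gts

gddr6_256bit = bandwidth_gbs(256, 14.0)  # typical 256-bit GDDR6 config
ddr4_sodimm = bandwidth_gbs(64, 3.2)     # one DDR4-3200 SODIMM

print(gddr6_256bit)  # 448.0 GB/s
print(ddr4_sodimm)   # 25.6 GB/s
```

Even four such SODIMMs would only total ~102 GB/s, which is why a single SODIMM only makes sense as a pre-caching tier rather than as directly addressable VRAM.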


----------



## Bomby569 (Dec 7, 2021)

Valantar said:


> Not panicking at all, I was just curious - "letting it past the limit" is exactly what would have made this interesting, as I sincerely doubt you would have seen performance issues before you exceeded 8GB by a relatively significant amount of allocated data - as illustrated by the FC6 examples in this thread. Reported VRAM usage is not equal to actual VRAM usage.



I actually just tried it now for a quick play (game is really good btw). It doesn't stop at 8GB, and there are no noticeable problems at first, but after a bit, whether it's play time or a certain scene, the fps started to drop like crazy, and I mean into the low 10s. Exiting to the menu solves it, but after a while it returns.
Again, this is all to be expected, I think.


----------



## Valantar (Dec 8, 2021)

Bomby569 said:


> I actually just tried it now for a quick play (game is really good btw). It doesn't stop at 8GB, and there are no noticeable problems at first, but after a bit, whether it's play time or a certain scene, the fps started to drop like crazy, and I mean into the low 10s. Exiting to the menu solves it, but after a while it returns.
> Again, this is all to be expected, I think.


Huh, that's some odd behaviour. Sounds like it's constantly loading assets without ejecting older assets from VRAM, which I suppose would explain why it has a reputation for using tons of it! There's probably a reason for it, but it sure sounds unnecessary.


----------



## chrcoluk (Dec 8, 2021)

For me, these settings typically offer the biggest IQ gain.

Textures - yes, this is top of the list, and its main requirement is VRAM, not rendering performance. I have played games where there is, e.g., effort to make the playable character nice to look at, but when you look at field objects, doors, etc., they look horrible due to low-res textures. Remember the door in the FF7 remake? This is an almost consistent problem with games from before the current gen of consoles, usually due to VRAM being under-spec'd on hardware.
Shadow quality/resolution - often set too low on default settings, and many games need mods to get them acceptable.
MSAA - not very common in modern games; post-processing AA is preferred, which is usually subpar. But in older games where this is offered, it is usually noticeable.
Draw distance - not much is worse than objects appearing right in front of you.

Outside of game tuning

SGSSAA - An absolute killer feature but like MSAA mostly only useful for older games.

In many games, doing things like dropping fog from ultra to high has no immediate visual impact, but do the same to textures and it's very noticeable. So I do think there is a place for large-VRAM midrange cards. Although I own a 3080, I think the VRAM is under-spec'd on it; however, I prefer paying £650 for a 10GB 3080 vs, say, £1500 for a 24GB 3090.


----------



## wolf (Dec 8, 2021)

chrcoluk said:


> For me, these settings typically offer the biggest IQ gain.


My list looks a bit different..

Framerate
Resolution
Antialiasing
Lighting
Geometric detail

In no _particular_ order, but they're my more meaningful ones. Lately I am finding less and less of a stark, obvious 'in my face' difference between Max/HD textures and a ~middle setting; only low ever seems to be mud-spec. Again, and of course, if you have the VRAM, fill it, but from my gaming chair I'm hard pressed to notice the difference if and when I'm required to drop this setting (or am just playing about, which I do a lot).

The 3080 having only 10GB is commonly thrown in my face in forums and on reddit as a ha-ha-gotcha, but it's effectively a non-issue, and fine by me to have more GPU than VRAM in general. If it was an issue to me, I would have bought the 3090, but the 3080 will be long replaced as my main gaming card, and long obsolete by a gen or 2 before the 10GB is in any way a day-to-day issue.

The kicker being that it mines ETH when it's not obliterating any game I throw at it; the 3080 has paid for itself twice over now, affording me the opportunity to effectively buy whatever I want to replace it when that time comes.



Valantar said:


> Huh, that's some odd behaviour. Sounds like it's constantly loading assets without ejecting older assets from VRAM, which I suppose would explain why it has a reputation for using tons of it! There's probably a reason for it, but it sure sounds unnecessary.


At least part of AMD's play this RDNA2 generation will be to offer bloated textures and VRAM requirements on all sponsored games, because most of their stack has more VRAM than comparable Ampere products, they actively want this to be their talking/success point, because outside of solid raster performance, lots of VRAM is all they really have in their hand to play. 

Even in W1z's head-to-heads now he's going all out on max textures and even the HD pack (FC6), but won't enable RT. Whether deliberate or not, it's actively giving RDNA2 every chance to succeed but not affording Ampere the same benefit, like picking texture settings per game that fit on both cards tested, and/or testing with RT on, especially when the game puts it on by default.


----------



## Vayra86 (Dec 8, 2021)

Valantar said:


> Huh, that's some odd behaviour. Sounds like it's constantly loading assets without ejecting older assets from VRAM, which I suppose would explain why it has a reputation for using tons of it! There's probably a reason for it, but it sure sounds unnecessary.



Or... maybe games push harder on VRAM to keep gameplay stutter free


----------



## Mussels (Dec 8, 2021)

Frick said:


> Sometimes I wish VRAM slots became a thing. My brother had such a card (because the seller was convinced it was the next big thing), don't remember what card it was. Riva? Matrox?


Good news: they aim to do something quite like this using NVMe slots
(DirectStorage - using NVMe as a VRAM supplement, really)

I agree with the others: get the faster GPU before the extra VRAM. Lowering VRAM requirements is easy to do and often visually hard to spot, but if you've got a weak GPU you won't be able to use that VRAM anyway


----------



## Solid State Soul ( SSS ) (Dec 8, 2021)

Regardless of what everyone is saying, I believe paying above MSRP for an 8GB card now, with 2022 soon to come, is a bad idea. Lots of new games can consume up to 8GB at 1080p, and today's games are cross-gen. What do you think will happen once developers start to optimize games from the ground up for consoles with 16GB of RAM?

I strongly believe that if you are buying a card today it should have 12GB minimum, and please don't give me the "sufficient VRAM for sufficient performance" line; that is utter BS. The 780 Ti came with 3GB of VRAM when the PS4 launched with 8GB of RAM, and two years later every game under the sun ate that 3GB for breakfast. The 1060 launched 3 years later with 780 Ti performance and double the VRAM, and that card still plays games fine today. So would the 780 Ti, if only it came with sufficient VRAM. And let's not forget the 1060 3GB...

People need to stop falling for Nvidia's BS. Nvidia intentionally limits their cards' VRAM so that consumers keep upgrading every couple of years once games start to demand more. This is not 2016 anymore; wake up, people.

If I were you I would wait for Nvidia's 4000 series or buy an AMD RX 6800 with its generous 16GB VRAM buffer. Jensen can take his above-$700 3070 Ti with its measly 8GB buffer and F himself with it. Nvidia are not failing to read the room; they are more aware of it than all of you, and they're grinning from ear to ear because y'all fall for it.


----------



## oxrufiioxo (Dec 8, 2021)

To be fair, it seems the 3080 will soon have 12GB of VRAM and the 3070 Ti 16GB... not that the majority of people will be able to buy them.

Maybe the 4000 series is getting pushed to 2023 due to shortages (or Nvidia selling every Ampere GPU it can make), because it seems like too many new Ampere cards are coming out soon. Only time will tell.


----------



## Solid State Soul ( SSS ) (Dec 8, 2021)

oxrufiioxo said:


> To be fair, it seems the 3080 will soon have 12GB of VRAM and the 3070 Ti 16GB... not that the majority of people will be able to buy them.
> 
> Maybe the 4000 series is getting pushed to 2023 due to shortages (or Nvidia selling every Ampere GPU it can make), because it seems like too many new Ampere cards are coming out soon. Only time will tell.


Yeah, that's Nvidia giving y'all the middle finger for buying the low-VRAM cards because people knew no better, and now they are double dipping their consumers with high-VRAM models that should have been there from the start.

This is nothing short of "F you, give me money, dumb consumers" double dipping BS.


----------



## oxrufiioxo (Dec 8, 2021)

Solid State Soul ( SSS ) said:


> Yeah, that's Nvidia giving y'all the middle finger for buying the low-VRAM cards because people knew no better, and now they are double dipping their consumers with high-VRAM models that should have been there from the start.
> 
> This is nothing short of "F you, give me money, dumb consumers" double dipping BS.



I agree, $200-ish cards from 2016 had 8GB of VRAM... $400+ GPUs have no business only having that amount. Regardless of budget, I would be shooting for at least 3060 Ti performance but with at least 12GB of VRAM if I was keeping the card longer than 2 years.

Nvidia decided to give us two bad options in the midrange: a card not powerful enough for 12GB, and a 2080 Super-level card with the same amount of VRAM as budget cards from 2016...


----------



## Ibizadr (Dec 8, 2021)

When you guys talk about games at 1440p and 4K, is that with upscaled resolutions set in the NVCP?


----------



## neatfeatguy (Dec 8, 2021)

oxrufiioxo said:


> I agree, $200-ish cards from 2016 had 8GB of VRAM... $400+ GPUs have no business only having that amount. Regardless of budget, I would be shooting for at least 3060 Ti performance but with at least 12GB of VRAM if I was keeping the card longer than 2 years.
> 
> Nvidia decided to give us two bad options in the midrange: a card not powerful enough for 12GB, and a 2080 Super-level card with the same amount of VRAM as budget cards from 2016...



2016 GPUs, see if I can list them here from Nvidia:
GTX 1050 - 2GB - $109
GTX 1050 Ti - 4GB - $139
GTX 1060 - 3GB - $199
GTX 1060 - 6GB - $250
GTX 1070 - 8GB - $379
GTX 1080 - 8GB - $599
Titan X - 12GB - $1200

Looks like the midrange cards, 1060 and under, were only coming with 6GB or less, and the 6GB model was already priced at $250. Yeah, I'm just not seeing 2016 GPUs having 8GB of RAM and being $200-ish.

It's also hard to compare prices from 2016 when the cards used different memory hardware. These 2016 cards came with GDDR5 or GDDR5X; the RTX 3xxx cards come with GDDR6 or GDDR6X memory.

Even in the RTX 2xxx series, the 2060 launched with 6GB at a $299 MSRP.
2070/80 and the Super versions came with 8GB.

I don't understand why people are so upset that a mid range card this generation has 8GB.


----------



## Lei (Dec 8, 2021)

What about non-gaming uses?
I need to run several 3D applications at the same time: Unreal, Maya, Substance Painter, SpeedTree, Photoshop....
My monitor maxes out at 1080p.

Is the 3060 Ti still better for my use case?


----------



## Selaya (Dec 8, 2021)

Valantar said:


> [ ... ]
> 
> Don't all 3090s have 24GB? The 3090 is a fundamentally weird GPU, as it doesn't get creator-focused drivers (it's not a Titan), but costs as much, while having 2x the RAM such a card would need. There's a reason they've launched a 12GB 3080 Ti - it performs the same at a lower BOM cost. The 3090 is IMO mainly a flex - it's Nvidia making a _true_ flagship SKU, one that is "extra everything" without it really making sense beyond demonstrating that it can be done. Which is precisely where ultra-luxury products tend to live.
> 
> [ ... ]


I don't recall Titans ever receiving special drivers. They just got the usual GeForce ones, the Quadro drivers are reserved for ... Quadros. (and the new RTX AX000 series which are Quadros w/o the Quadro brand, but I digress.)


----------



## Vario (Dec 8, 2021)

I appreciate the discussion with a lot of good points made all around.  My game plan is likely the 3060 due to the queue, then the 3060 Ti from the queue (selling the 3060), then entering the queue for the next-gen card (4060?) and selling the 3060 Ti, because I am sure we will be stuck in this mining mess for another couple of years and I am sure the price on the used market won't come down to sane levels either.

Edit: Also, it seems like a bunch of cards in the product stack are getting VRAM increases, a trend that is likely to continue.








New Filing Hints at a Memory Upgrade for RTX 3080 and RTX 3070 Ti: "More memory, more money" (www.tomshardware.com)


----------



## oxrufiioxo (Dec 8, 2021)

neatfeatguy said:


> 2016 GPUs, see if I can list them here from Nvidia:
> GTX 1050 - 2GB - $109
> GTX 1050 Ti - 4GB - $139
> GTX 1060 - 3GB - $199
> ...



The RX 480 released in 2016 with 8GB of VRAM for $239.

Nvidia has always cheaped out on VRAM below the Ti cards. It was semi-OK on the 1080 from 2016, but the best Pascal card, period, was the 1080 Ti from 2017 with 11GB, which comically has more VRAM than the 3080..... Turing was terrible below the 2080 Ti at launch; the Super cards are OK but not great.

I owned two 1080s, a 1080 Ti, and a Titan Xp.... 8GB was OK in 2016; it isn't in 2021.

I don't personally care what people do with their own money, and if you think an 8GB $400+ GPU is OK in 2021, good for you.



Vario said:


> I appreciate the discussion with a lot of good points made all around.  My game plan is likely 3060 due to queue, then 3060ti from queue and sell the 3060, then enter queue for the next gen card (4060?), and sell the 3060ti because I am sure we will be stuck in this mining mess for another couple years and I am sure the price on the used market won't come down to sane levels either.
> 
> Edit: Also it seems like a bunch of cards in the marketing stack are getting VRAM increases, a trend that is likely to continue.
> 
> ...



Definitely a solid game plan. Nvidia is saying better supply in the 2nd half of 2022, but I've heard that one before.....


----------



## neatfeatguy (Dec 8, 2021)

oxrufiioxo said:


> The RX 480 released in 2016 with 8GB of VRAM for $239.
> 
> Nvidia has always cheaped out on VRAM below the Ti cards. It was semi-OK on the 1080 from 2016, but the best Pascal card, period, was the 1080 Ti from 2017 with 11GB, which comically has more VRAM than the 3080..... Turing was terrible below the 2080 Ti at launch; the Super cards are OK but not great.
> 
> ...



So, you're still upset that a midrange card isn't good enough for you? I don't follow your logic.

You owned top-end cards (1080/1080 Ti/Titan Xp) and they were okay for you in 2016 with the 8+GB they came with. Are the top-end cards of the current gen not good enough for your needs? They come with 10/12/24GB, depending on the model. I can understand why a midrange card like the 3060 Ti, and possibly the 3070, wouldn't be good enough for you since they're limited to 8GB, but they are just midrange gaming cards in the lineup that's available from Nvidia.

Also, you need to stop comparing prices from 5-6 years ago to prices now. Things have changed; prices are up.

Even the RX 480, which had an MSRP of $239, would see almost a 12% increase from inflation alone, putting it around $270 in today's dollars.
But it was also using different tech (GDDR5 vs GDDR6), so you can't just outright compare an older generation and boldly claim that today's prices are way out of line relative to what was provided 5-7 years ago. Prices have gone up from other factors too, such as tariffs, wage increases, and raw material costs.
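As a quick sanity check on that arithmetic (the ~12% cumulative 2016-2021 inflation figure is the estimate from this post, not an official statistic):

```python
# Rough inflation adjustment for the RX 480's launch MSRP.
# The ~12% cumulative inflation figure is an assumption, not official data.
msrp_2016 = 239.0
cumulative_inflation = 0.12

adjusted = msrp_2016 * (1 + cumulative_inflation)
print(f"${adjusted:.2f}")  # $267.68, i.e. roughly $270 in today's dollars
```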

I don't disagree that prices seem high for what is offered, but time has moved on and prices are going up. On the bright side, midrange cards from Nvidia have been seeing an uptick in available RAM with each new generation:
GTX 760 was 2GB
GTX 960 was 4GB
GTX 1060 was 3GB and 6GB
RTX 2060 was 6GB
RTX 3060 is 12GB (let's face it, 12GB on a 3060 is wasted; 8GB, had it been possible, would have been just fine, but I don't think 6GB would have been enough)
RTX 3060Ti is 8GB


----------



## looniam (Dec 8, 2021)

Vario said:


> My game plan is likely 3060 due to queue, then 3060ti from queue and sell the 3060,


yeah that was my plan BUT you know evga are still waaaay behind?





EVGA Queue Tracker (www.element35gaming.com): a web site to track the EVGA Product Queues



3060 Ti XC Gaming is still in january and the FTW is still processing launch day orders. if you didn't mash the queue button within minutes (if not seconds) of launch... forget it. 

there had been a crowd-sourced spreadsheet on google, and when i saw i was gonna wait (estimated) until next april, i left the queue with ~1500+ people ahead of me.

sorry, i really am not trying to rain on your parade.


----------



## oxrufiioxo (Dec 8, 2021)

neatfeatguy said:


> So, you're still upset that a midrange card isn't good enough for you? I don't follow your logic.


Not really, I still buy their high-end products.... I wouldn't personally touch the 3080 on down; I'm just pointing out that Nvidia likes to cut every VRAM corner it can for margins, all the way up to the 80-tier cards these days. They used to only do it up to the 60-tier cards; how times have changed...... I feel bad for people in this general price range, it sucks... I didn't even like the 3080 at MSRP; it only looked good because Turing was such a bad value.



looniam said:


> yeah that was my plan BUT you know evga are still waaaay behind?
> 
> 
> 
> ...



Yeah, it's depressing. I was averaging about 12 gaming builds per year from 2013-2019; in 2020-2021 I've done 2 due to the GPU shortages, not including upgrading my own hardware, which I do frequently. It's a sad day when I feel relieved I got a 3080 Ti FTW3 for its $1,400 MSRP. I know people who have been waiting over a year for a vanilla 3080 in the EVGA queue.


----------



## neatfeatguy (Dec 8, 2021)

oxrufiioxo said:


> Not really, I still buy their high-end products.... I wouldn't personally touch the 3080 on down; I'm just pointing out that Nvidia likes to cut every VRAM corner it can for margins, all the way up to the 80-tier cards these days. They used to only do it up to the 60-tier cards; how times have changed...... I feel bad for people in this general price range, it sucks... I didn't even like the 3080 at MSRP; it only looked good because Turing was such a bad value.
> 
> 
> 
> Yeah, it's depressing. I was averaging about 12 gaming builds per year from 2013-2019; in 2020-2021 I've done 2 due to the GPU shortages, not including upgrading my own hardware, which I do frequently. It's a sad day when I feel relieved I got a 3080 Ti FTW3 for its $1,400 MSRP. I know people who have been waiting over a year for a vanilla 3080 in the EVGA queue.



I do agree that the 3080 felt a little on the lower end of things with VRAM as to how the previous high-end cards were progressing, I thought it would show up with at least 12GB.

I was thrilled when I saw the performance the cards offered and the MSRPs they were listed at when they were supposed to launch.
An RX 6800 or an RTX 3070 would have been the best option for me; they landed in the best price range and would have been a helluva upgrade from the 980 Ti I was using. I didn't feel hindered by the 6GB of VRAM on the 980 Ti, and 8GB on the 3070 should handle things just fine for my needs. I was pumped to get one! Sadly, many of us know how things went with inventory, followed by price hikes and scalping.

Then the 3060 Ti was announced: a solid performer just behind the 3070 and priced lower.... I could live with that. Then, of course, everyone and their cousins were snatching them up for scalping or mining.

Eventually, when all was said and done, I was lucky enough to find a 3080 for around $950 after taxes and shipping costs. If things go like they did with my 980Ti, I'll probably make good use of this card for a good 4-5 years.

I still know a few people waiting for 3070 or 3080 cards from the EVGA queue, too. They've been in it since the start of November 2020 for the models they opted in for. 

Hopefully folks are hitting up the Newegg Shuffle (newegg.com/shuffle) - you can generally find a 3060 Ti listed there without it being in a bundle, in the $550-620 price range.


----------



## looniam (Dec 8, 2021)

well i guess jan 11 (9am pst?) will be another episode of queue button mash




does this put a new spin on things?


----------



## RandallFlagg (Dec 8, 2021)

oxrufiioxo said:


> Yeah, it's depressing I was averaging about 12 gaming builds per year from 2013-2019...* 2020-2021 I've done 2 due to the gpu shortages*..... Not including upgrading my own hardware which I do frequently. It's a sad day when I feel relieved I got a 3080ti FTW3 for its $1400 MSRP. I know people who have been waiting  over a year for a vanilla 3080 in the evga queue.



That's interesting.  I've been thinking that this GPU situation may kill PC gaming, and desktops more generally, if it goes on for too long.  For most of the past year, if someone asked me about a brand-new PC gaming build, I would point them towards a laptop.

However, even laptops with current dGPUs are beginning to get silly high in cost.  

My last laptop was a 7700HQ with a 1070, bought near the end of the 10X0 series' life for $1,100.  Today, looking at Newegg and eliminating the sub-3050 cards, it looks like the lowest in stock is $1,850.

I've been doing the Newegg Shuffle on the off chance I can get a 3060 Ti... for $600.  Back in early 2020 I didn't even want to spend $300 on a GPU; that was pretty much my self-imposed limit, since the only time I have ever bought a top-end GPU it was $300 (I think it was the GeForce 256).

For a lot of people who want to game and aren't loaded with disposable income,  pointing them to an AppleTV or Nintendo Switch or some similar device is making a lot of sense.  Think maybe the days of PC gaming are just going to stagnate here.


----------



## Bomby569 (Dec 8, 2021)

looniam said:


> well i guess jan 11 (9am pst?) will be another episode of queue button mash
> 
> does this put a new spin on things?



those vram numbers make a lot more sense, but as no one will be able to get one at decent prices i guess it doesn't help us at all


----------



## Valantar (Dec 8, 2021)

looniam said:


> well i guess jan 11 (9am pst?) will be another episode of queue button mash
> 
> does this put a new spin on things?


Meh - to me that just looks like Nvidia going with the flow of people shouting for more VRAM with little backing up that claim besides inaccurate driver readouts and "AMD has 16GB!" Guess we'll see in a few years whether or not I'm wrong. I doubt it. The 3080 Ti might be the exception there, as it's powerful enough to do 2160p RT in most games, and thus can stand to have some RAM added. 8GB is complete overkill for the 3050, but 4GB would be on the low side so I guess that's where the bus leaves them. That 3070 Ti is ... well, silly. But again, this is where the bus stops, so to speak.

Doubling RAM for a higher SKU was less of a shock back when that meant another GB or two. Now that we're tacking another 8GB onto a card, if anything, we're starting to see a need for more diversity in VRAM die capacities. If they made 12Gbit chips too instead of just 8 and 16 things would be much more reasonable across the board. IMO, given how huge VRAM capacities are these days, this is becoming a necessity in order to configure SKUs in non-stupid ways. 192-bit bus you say, and 6GB seems low? Have 9, instead of a useless 12. 256-bit bus and 8GB on the stingy side? Here's 12. All the while the limited wafer output would be made better use of across more products.
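The bus-width arithmetic above can be sketched quickly. This assumes the usual GDDR6 layout of one memory chip per 32-bit channel with a uniform chip density across the card (a simplification; clamshell configurations, as on the 3090, double these numbers):

```python
# Which VRAM capacities a given bus width allows, assuming one GDDR6 chip
# per 32-bit channel and the same density for every chip on the card.
def possible_capacities_gb(bus_width_bits, chip_densities_gbit=(8, 16)):
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return [chips * d // 8 for d in chip_densities_gbit]  # Gbit -> GB

print(possible_capacities_gb(192))               # [6, 12] - the 3060's dilemma
print(possible_capacities_gb(256))               # [8, 16] - 3060 Ti / 3070
print(possible_capacities_gb(192, (8, 12, 16)))  # [6, 9, 12] if 12Gbit dies existed
```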


----------



## Bomby569 (Dec 8, 2021)

Valantar said:


> Meh - to me that just looks like Nvidia going with the flow of people shouting for more VRAM with little backing up that claim besides inaccurate driver readouts and "AMD has 16GB!" Guess we'll see in a few years whether or not I'm wrong. I doubt it. The 3080 Ti might be the exception there, as it's powerful enough to do 2160p RT in most games, and thus can stand to have some RAM added. 8GB is complete overkill for the 3050, but 4GB would be on the low side so I guess that's where the bus leaves them. That 3070 Ti is ... well, silly. But again, this is where the bus stops, so to speak. Doubling RAM for a higher SKU was less of a shock back when that meant another GB or two. Now that we're tacking another 8GB onto a card, if anything, we're starting to see a need for more diversity in VRAM die capacities. If they made 12Gbit chips too instead of just 8 and 16 things would be much more reasonable across the board. IMO, given how huge VRAM capacities are these days, this is becoming a necessity in order to configure SKUs in non-stupid ways. 192-bit bus you say, and 6GB seems low? Have 9, instead of a useless 12. 256-bit bus and 8GB on the stingy side? Here's 12. All the while the limited wafer output would be made better use of across more products.



Coincidence, I was just watching Linus and he disagrees with you. Don't forget DLSS and FSR, which let you use lower-end cards with higher resolutions and settings.


----------



## RandallFlagg (Dec 8, 2021)

It used to be when you selected textures, they would say '1080p' '1440p' and '4K' or similar.  

Now they just say 'Medium' 'Max' and 'Ultra'.   Good move.  Helps sell 12GB GPUs to people with 1080P monitors.


----------



## lexluthermiester (Dec 8, 2021)

RandallFlagg said:


> It used to be when you selected textures, they would say '1080p' '1440p' and '4K' or similar.


That was never a thing.


----------



## RandallFlagg (Dec 8, 2021)

lexluthermiester said:


> That was never a thing.



Skyrim.


----------



## Vario (Dec 8, 2021)

looniam said:


> yeah that was my plan BUT you know evga are still waaaay behind?
> 
> 
> 
> ...


3060 XC Black, SKU 3655, last timestamp 02/27/2021.  I have an early March 2021 timestamp for the 3655 SKU, so I am due for a 3060 queue draw.








EVGA "In Stock" and "Queue" Summary (docs.google.com)

EVGA has started sending out emails that read, in part: "Now that EVGA has sufficient stock on 30 series at EVGA.com and at ETAIL/RETAIL partners, starting June 23th 2022, all pending queue notifies on [Fill in the blank EVGA SKU number] will be removed and the product will be made available for ..."


----------



## Valantar (Dec 8, 2021)

Bomby569 said:


> coincidence was just watching Linus and he disagrees with you. Don't forget DLSS and FSR that lets you use lower end cards with higher resolutions and settings.


Well, we all have our opinions. Interesting that the screenshot you chose (which is a marketing slide from AMD, so hardly an unbiased source of information) has a subtitle noting the exact thing I've been saying all along here: that measuring actual VRAM usage is extremely difficult (and driver readouts of allocated VRAM are misleading and untrustworthy).

Also, DLSS and FSR do the opposite of what you say - they allow you to render at _lower_ resolution than native, not higher than previously. They _output_ at a higher resolution, but that's through upscaling. It can lead to minor increases in VRAM over non-upscaled rendering at the same resolution due to the buffering of data for the upscaler, but that amount is negligible. But remember, DLSS 2160p ranges from 720p rendering at Ultra Performance to 1440p rendering at Quality. VRAM usage will be more in line with the render resolution than the output resolution.


----------



## Outback Bronze (Dec 8, 2021)

neatfeatguy said:


> I was lucky enough to find a 3080



I wouldn't be disappointed with a 3080, matey. It's a very good card. It was a toss-up between that and a 6800 XT for me, and at the time 3080 prices were out of control, so 6800 XT it was.


----------



## lexluthermiester (Dec 8, 2021)

RandallFlagg said:


> Skyrim.


What was that?


----------



## Frick (Dec 8, 2021)

lexluthermiester said:


> What was that?
> View attachment 228165



but modz


Solid State Soul ( SSS ) said:


> Regardless of what everyone is saying, I believe paying above MSRP for an 8GB card now, in soon-to-come 2022, is a bad idea. Lots of new games can consume up to 8GB at 1080p, and today's games are cross-gen; what do you think will happen once developers start to optimize games from the ground up for consoles with 16GB of RAM?
> 
> I strongly believe if you are buying a card today it should have 12GB minimum, and please don't sell me the "sufficient VRAM for sufficient performance" line; that is utter BS. The 780 Ti came with 3GB of VRAM when the PS4 launched with 8GB of RAM, then two years later every game under the sun ate that 3GB for breakfast. The 1060 launched 3 years later with 780 Ti performance and double the VRAM; that card still plays games fine today, and so would the 780 Ti, if only it had come with sufficient VRAM. And let's not forget the 1060 3GB ...
> 
> ...



All of this would make sense if there existed such a thing as a GPU sold at MSRP. But that is not the case, and there are no signs whatsoever that the situation will change for the foreseeable future. So it sucks, but it's the way it is. If you sit on a failing GTX 980 or an RX 480 and want more performance, what are the realistic options?


----------



## looniam (Dec 8, 2021)

Vario said:


> 3060 XC Black SKU 3655 Last Timestamp 02/27/2021.  I am early March 2021 time stamp for 3655 SKU.  So I am due for 3060 queue draw.
> 
> 
> 
> ...


your link has my link.  (e: oops it was another)
enewt is a good guy.


----------



## AvrageGamr (Dec 9, 2021)

Go for the 12 gig if you are gaming at or above 1440p. VRAM usage is increasing. Far Cry 6, for example, has issues with the high-res texture pack on 8 gig cards at high resolutions, and FH5 is the same. There are a number of games that use 11-plus gigs of VRAM at 4K. Don't be surprised when 16 gigs of VRAM becomes the standard. Edit: Even the consoles have significantly more RAM in them now.


----------



## Vayra86 (Dec 9, 2021)

oxrufiioxo said:


> I agree, $200 ish cards from 2016 had 8GB of vram.. 400+ gpus have no business only having that amount. Regardless of budget I would be shooting for at least 3060 ti performance but with at least 12GB of vram if I was keeping the card longer than 2 years.
> 
> Nvidia decided to give us two bad options in the midrange a card not powerful enough for 12GB and a 2080 super level card with the same amount of vram as budget cards from 2016....



Exactly.

Everything else is massive cognitive dissonance. But the topic question is the choice between two Nvidia cards... and in that limited scenario I do think the 3060 Ti with its 8GB would be the 'least shitty option'... The real question is whether that is worth spending big money on at all...



neatfeatguy said:


> So, you're still upset that a midrange card isn't good enough for you? I don't follow your logic.
> 
> You owned top-end cards (1080/1080Ti/Titan XP) and they were okay for you in 2016 with 8+GB they came with. Are the top-end cards of this current gen not good enough for your needs? They come with 10/12/24GB, depending on the model you get. I can understand why a mid-ranged card like the 3060Ti and possibly the 3070 wouldn't be good enough for you since they're limited to 8GB, but they are just mid-ranged gaming cards in the lineup that's available from Nvidia.
> 
> ...



The issue isn't price in isolation, it's that unholy combo with planned obsolescence. Paying a lot for tech that is hardly used to advantage (Tensor/RT) in exchange for a low VRAM buffer is exactly that.


----------



## oxrufiioxo (Dec 9, 2021)

Vayra86 said:


> Exactly.
> 
> Everything else is massive cognitive dissonance. But the topic question is the choice between two Nvidia cards... and in that limited scenario I do think the 3060ti with its 8GB would be the 'least shitty option'... The real question is whether that is worth spending big money on at all...



I think it's fine as long as he gets it in the EVGA queue soon.  The 3060 might cover itself and half the Ti on the used market. If it takes more than 3-4 months, probably not worth it.


----------



## Bomby569 (Dec 9, 2021)

Valantar said:


> Well, we all have our opinions. Interesting that the screenshot you chose (which is a marketing slide from AMD, so hardly an unbiased source of information) has a subtitle noting the exact thing I've been saying all along here: that measuring actual VRAM usage is extremely difficult (and driver readouts of allocated VRAM are misleading and untrustworthy).
> 
> Also, DLSS and FSR do the opposite of what you say - they allow you to render at _lower _resolution than native, not higher than previously. They _output _at a higher resolution, but that's through upscaling. It can lead to minor increases in VRAM over non-upscaled rendering at the same resolution due to the buffering of data for the upscaler, but that amount is negligible. But remember, DLSS 2160p ranges from 720p rendering at Ultra Performance to 1440p rendering at Quality. VRAM usage will be more in line with the render resolution than the output resolution.



If I use DLSS I can max all the settings because I'm rendering at a lower resolution; if I didn't and stayed at my native resolution, I'd have to go for lower settings. So doesn't that make VRAM usage go up? Because that's exactly what happens in RE, for example, and what is shown on the AMD chart.

Either I'm very confused or you've got things backwards.


----------



## Valantar (Dec 9, 2021)

Bomby569 said:


> If I use DLSS I can max all the settings because I'm rendering at a lower resolution; if I didn't and stayed at my native resolution, I'd have to go for lower settings. So doesn't that make VRAM usage go up? Because that's exactly what happens in RE, for example, and what is shown on the AMD chart.


Sure, that would increase it, but it goes up _relative to the baseline of a lower resolution_. And it would take a pretty significant increase in detail levels to make up for the difference between rendering at, say, 1440p and 2160p, and even more if you're not running DLSS Quality but something with a lower baseline resolution like Performance. You won't be able to increase detail levels past the game's Ultra settings, so whatever those are + however much DLSS adds is the ceiling there. Thus, 2160p DLSS Quality (1440p render resolution) at Ultra settings will use marginally more VRAM than 1440p Ultra, but most likely nowhere near as much as native 2160p Ultra. DLSS is still a fancy upscaler, and upscaling does not bring your VRAM usage up to the level of native rendering at the output resolution unless it's already _very_ close.

An example (and yes, these numbers are entirely pulled out of my rear end and are only for the sake of exemplifying the dynamics here, not actual VRAM usage):
In a given game, at ultra settings, playing at native 1440p needs ~6GB of VRAM and playing at native 2160p needs ~8GB of VRAM. DLSS consumes a few hundred MB for its temporal buffering and other data. How much VRAM will 2160p DLSS Quality at Ultra settings then require? 6GB + a few hundred MB. There is no increase in the internal rendering resolution, thus none of the factors that increase VRAM usage in native rendering between 1440p and 2160p are in play, and thus, you don't see the same increase in VRAM usage.
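The toy numbers above can be put into an equally toy model (all figures are the made-up examples from this post, not measurements):

```python
# Toy model of the dynamic described above: under DLSS, VRAM usage tracks
# the *render* resolution, plus a small fixed overhead for the upscaler's
# buffers. All numbers are illustrative, not measurements.
NATIVE_VRAM_GB = {"1440p": 6.0, "2160p": 8.0}  # hypothetical Ultra-settings usage
DLSS_OVERHEAD_GB = 0.3                          # "a few hundred MB"

def dlss_vram_gb(render_res):
    """Estimated VRAM when DLSS renders internally at render_res."""
    return NATIVE_VRAM_GB[render_res] + DLSS_OVERHEAD_GB

# 2160p output via DLSS Quality renders internally at 1440p:
print(dlss_vram_gb("1440p"))  # 6.3 - well under the 8.0 of native 2160p
```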


----------



## Bomby569 (Dec 9, 2021)

I think you got my original point.
This is confusing, but I can do 1440p with max settings (even if I'm not really doing 1440p), and I would never be doing 1080p high or ultra (going for a lower res than what your monitor supports is not an option for most of us); I would be doing 1440p low settings, so it uses a lot more VRAM. That was just my point, even if I admit the way I initially put it was confusing.


----------



## Valantar (Dec 9, 2021)

Bomby569 said:


> I think you got my original point.
> This is confusing, but I can do 1440p with max settings (even if I'm not really doing 1440p), and I would never be doing 1080p high or ultra (going for a lower res than what your monitor supports is not an option for most of us); I would be doing 1440p low settings, so it uses a lot more VRAM. That was just my point, even if I admit the way I initially put it was confusing.


I see what you're saying, but how does that relate to DLSS? DLSS always lowers the actual rendering resolution - that's the entire point of it. You have DLAA, which uses the temporal scaler from DLSS as an anti-aliasing pass at native resolution, but that's not DLSS. DLSS is an upscaler, after all, and a huge part of its point is to allow for play at upscaled-to-native resolution at higher quality levels than you would be able to use if you were rendering at native resolution. I.e. in a game where you can play at 1080p High or 1440p Low, DLSS allows you to kinda-sorta get the experience of 1440p High through rendering at 1080p High and then upscaling it.

Also, lowering detail levels at any given resolution will drastically lower VRAM usage in most games, so at that point this whole discussion is pretty much moot. If we have games in 4-5 years that need more than 8GB of VRAM at 1440p medium (or, god forbid, low), PC gaming would be utterly dead, as that's an entirely unrealistic baseline for performance.


----------



## Mussels (Dec 9, 2021)

Bomby569 said:


> I think you got my original point.
> This is confusing, but I can do 1440p with max settings (even if I'm not really doing 1440p), and I would never be doing 1080p high or ultra (going for a lower res than what your monitor supports is not an option for most of us); I would be doing 1440p low settings, so it uses a lot more VRAM. That was just my point, even if I admit the way I initially put it was confusing.


Resolution + Textures + AA

Those three are the VRAM eaters, the rest of the settings are more or less GPU usage
So quite often all you need to do is lower from ultra to high or medium, and halve the VRAM usage with the eye candy still the same

DLSS, Image scaling, and FSR all let us render lower than native res while outputting native res - so we are finally getting the methods consoles have always had to cheat performance on PC as well.

Render at 1080p, output at 4K with sharpening, and away you go. Suddenly you don't need a monster GPU or 24GB of VRAM


----------



## Valantar (Dec 9, 2021)

Mussels said:


> Resolution + Textures + AA
> 
> Those three are the VRAM eaters, the rest of the settings are more or less GPU usage
> So quite often all you need to do is lower from ultra to high or medium, and halve the VRAM usage with the eye candy still the same
> ...


Very true. It's frankly about time we start being smarter about performance instead of just brute-forcing it. Computers are supposed to be kind of smart, no?

There have been quite a few games with built-in scalers before DLSS though (or after it launched but still lacking it) - they just use kind of shitty bilinear scaling, which loses a lot of quality. AFAIK that's what most console games have used too, though some use more clever solutions like checkerboard rendering.


----------



## Mussels (Dec 10, 2021)

Valantar said:


> Very true. It's frankly about time we start being smarter about performance instead of just brute-forcing it. Computers are supposed to be kind of smart, no?
> 
> There have been quite a few games with built-in scalers before DLSS though (or after it launched but still lacking it) - they just use kind of shitty bilinear scaling, which loses a lot of quality. AFAIK that's what most console games have used too, though some use more clever solutions like checkerboard rendering.


The games could cheat a little, rendering the 2D elements and HUD at 4K, with the game at lower res. That allowed text to be crystal clear at all times, with the 3D elements dynamically changing.
I've seen that in quite a few modern games already (D2 Resurrected has it, working well too) so i guess that's how things will progress now


To make games sell, they have to make it prettier than the last game. But now, with the parts shortage, they can't - so to avoid losing sales they're working on performance, for a change


----------



## Valantar (Dec 10, 2021)

Mussels said:


> The games could cheat a little, rendering the 2D elements and HUD at 4K, with the game at lower res. That allowed text to be crystal clear at all times, with the 3D elements dynamically changing.
> I've seen that in quite a few modern games already (D2 resurrected has it, working well too) so i guess thats how things will progress now
> 
> 
> To make games sell, they have to make it prettier than the last game. But now with the parts shortage, they cant - so to avoid losing sales they're working on performance, for a change


True. But developers are also always pushed to finish the games ASAP, which leads to quick fixes and poorly optimized code, not to mention insufficient time to learn how to make the most of the hardware. The quality changes from early to late games across any console generation demonstrates this beautifully - the difference between early and late PS3 or X360 games is downright staggering. Just goes to show what you can do with the same hardware if you have enough time to learn how to really squeeze it. In some way, DLSS/FSR allow for a kind of third path here - no hardcore optimization; no brute force quality improvements, but a brand-new route that can incorporate both but is its own thing still. Given that lithographic density and transistor count increases are bound to taper off sooner rather than later, getting smarter about this is really about time.


----------



## eidairaman1 (Dec 11, 2021)

Ti is a cherry chip and i thought that components between the 2 were cut on the non-Ti card


----------



## Vario (Dec 13, 2021)

Well, my queue notify occurred and I ordered an EVGA RTX 3060 XC Black 12GB: 12G-P5-3655-KR.  For those wondering, I entered the wait list March 3, 2021.


eidairaman1 said:


> Ti is a cherry chip and i thought that components between the 2 were cut on the non-Ti card


Don't know if this is what you are referring to, but the 3060 is GA106, while the 3060 Ti is a cut-down version of the 3070's GA104.


----------



## Deleted member 202104 (Dec 13, 2021)

Vario said:


> Well, my queue notify occurred and I ordered an EVGA RTX 3060 XC Black 12GB: 12G-P5-3655-KR.  For those wondering, I entered the wait list March 3, 2021.
> 
> Don't know if this is what you are referring to but the 3060 is GA106, while the 3060TI is cut down from 3070's GA104.



Glad to hear you were able to order!

I've got notifies pending for two 3080's, one from October 2020 and the other from November 2020.  I don't really want them anymore, but my sister and her husband may need a card when/if they're ever in stock.

Congrats again!


----------



## Vario (Dec 13, 2021)

weekendgeek said:


> Glad to hear you were able to order!
> 
> I've got notifies pending for two 3080's, one from October 2020 and the other from November 2020.  I don't really want them anymore, but my sister and her husband may need a card when/if they're ever in stock.
> 
> Congrats again!


Yeah the 3080 would be nice, but so far beyond my budget.  I usually buy in the $300-400 card segment.


----------



## neatfeatguy (Dec 14, 2021)

Vario said:


> Well, my queue notify occurred and I ordered an EVGA RTX 3060 XC Black 12GB: 12G-P5-3655-KR.  For those wondering, I entered the wait list March 3, 2021.
> 
> Don't know if this is what you are referring to but the 3060 is GA106, while the 3060TI is cut down from 3070's GA104.



Congrats. Hopefully it gives you the improved performance you're looking for.


----------



## Mussels (Dec 14, 2021)

It's a whole 2000 bigger than your 1060, so it's gunna be quite the upgrade


----------



## chrcoluk (Dec 21, 2021)

Turns out FF7 Remake PC needs 11-12 gig minimum to not lose performance from excessive texture eviction from VRAM.  Seems a poor port in general though.

Maybe I will upgrade my 3080 to the 3060 12 gig?


----------



## oxrufiioxo (Dec 21, 2021)

chrcoluk said:


> Turns out FF7 Remake PC needs 11-12 gig minimum to not lose performance from excessive texture eviction from VRAM.  Seems a poor port in general though.
> 
> Maybe I will upgrade my 3080 to the 3060 12 gig?



Even with 12GB it feels like shit compared to the PS5 version; don't waste your time. There are stutters even on 24GB cards.


----------



## chrcoluk (Dec 21, 2021)

oxrufiioxo said:


> Even with 12GB it feels like shit compared to the PS5 version; don't waste your time. There are stutters even on 24GB cards.



I forgot to reply. Check this link:

FF7R texture swapping

It seems the same as older SE FF ports: stutters due to evicting textures to make room for new ones in VRAM. The guy found a way to disable it so they all stay cached, which fixes it until, of course, the VRAM fills up.

Lightning Returns was very extreme with this issue. It was coded to run on machines with just 300MB of RAM, and would stutter every time you moved through certain field areas as it unloaded and loaded new textures.

I will speculate that with the PS5 version the devs had assurances everyone has 16 gigs of GDDR6, so they could do less aggressive texture swapping, while the PC version has probably been coded for low-end hardware, since some video cards on the market have pitiful amounts of video RAM. Even the original PS4 had more memory than low-end PC cards.

Next time you play it, check if it actually utilises all your VRAM and try his trick, I am curious if it helps you.
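
The eviction-and-reload cycle described above is easy to see in miniature. Below is a toy LRU cache sketch (my own illustration in Python, not FF7R's actual streaming code; all names and sizes are made up): with a small VRAM budget, every revisit re-uploads an evicted texture, while a larger budget caches everything after the first pass:

```python
from collections import OrderedDict

# Toy model of texture streaming: a fixed VRAM budget forces the least
# recently used texture out when a new one is loaded, and re-uploading an
# evicted texture is the expensive step that shows up as a stutter.
class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.cache = OrderedDict()  # name -> size_mb, in LRU order
        self.reloads = 0            # count of costly uploads to VRAM

    def request(self, name, size_mb):
        if name in self.cache:              # hit: mark as recently used
            self.cache.move_to_end(name)
            return
        self.reloads += 1                   # miss: upload (stutter risk)
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted = self.cache.popitem(last=False)  # evict LRU texture
            self.used_mb -= evicted
        self.cache[name] = size_mb
        self.used_mb += size_mb

# Walk back and forth between three areas with 4 MB textures each.
small = TextureCache(budget_mb=8)
big = TextureCache(budget_mb=24)
for tex in ["a", "b", "c", "a", "b", "c"]:
    small.request(tex, 4)
    big.request(tex, 4)
print(small.reloads, big.reloads)
```

With the 8 MB budget the walk pattern re-uploads every texture on every pass (cache thrash); the 24 MB budget uploads each texture exactly once.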


----------



## lexluthermiester (Dec 21, 2021)

oxrufiioxo said:


> Even with 12GB it feels like shit compared to the PS5 version; don't waste your time. There are stutters even on 24GB cards.


I'm not seeing that. 8GB card. You're having another problem. Perhaps your settings are not optimal.


----------



## Solid State Soul ( SSS ) (Dec 21, 2021)

chrcoluk said:


> Even the original PS4 had more memory than low end PC cards.


The original PS4 had more VRAM than a freaking 780 Ti, you know, the flagship $700 Nvidia card for gamers in 2013.



lexluthermiester said:


> I'm not seeing that. 8GB card. You're having another problem. Perhaps your settings are not optimal.


Are you playing FF7 Remake right now?
I've seen plenty of people, including YouTubers, complaining about major stuttering issues even on 3090s


----------



## lexluthermiester (Dec 21, 2021)

Solid State Soul ( SSS ) said:


> Are you playing FF7 Remake right now?
> I've seen plenty of people, including YouTubers, complaining about major stuttering issues even on 3090s


My son has it. He had framerate issues to begin with but when we turned on vsync and adjusted a few other settings it was solid. Granted, he's on 1080p. Maybe the game engine is choking on 4k?


----------



## Vario (Dec 21, 2021)

Solid State Soul ( SSS ) said:


> The original PS4 had more VRAM than a freaking 780 Ti, you know, the flagship $700 Nvidia card for gamers in 2013.
> 
> 
> Are you playing FF7 Remake right now?
> I've seen plenty of people, including YouTubers, complaining about major stuttering issues even on 3090s


The 780ti also aged terribly due to that 3GB.


----------



## looniam (Dec 21, 2021)

Vario said:


> The 780ti also aged terribly due to that 3GB.


OG titans with 6GB haven't fared any better; kepler's complete lack of compute didn't age well at all. the 7970 with 3GB started slapping both of them around years ago.
that's how kepler, maxwell and, with anemic hardware support, pascal had such good power consumption, since compute is a power hog.

mantle/DX12/vulkan were the nails in those coffins.


----------



## Solid State Soul ( SSS ) (Dec 21, 2021)

looniam said:


> that's how both kepler, *maxwell and with anemic hardware support, pascal had such good power consumption since compute is a power hog.*


are you saying Maxwell and Pascal are power efficient because they sucked at performance ?



Vario said:


> The 780ti also aged terribly due to that 3GB.


It surely did age terribly: not even a year later, Shadow of Mordor recommended 6GB of VRAM for its HD texture pack. Imagine paying $700 and then being told you can't use high-resolution texture packs a year later, lol
Nvidia wants to deliver just the bare minimum amount of VRAM as a cost-cutting measure, and it's sad to see people not see through this. Pascal was an anomaly that I don't think Nvidia will repeat: amazing performance-to-wattage ratio and plenty of VRAM. Why are people still keeping their 1060s? Because it's a decent-performing card and the 6GB frame buffer allowed it to stand the test of time. The 1080ti was so good you can even chug along with it till the middle of this new console generation


----------



## looniam (Dec 21, 2021)

Solid State Soul ( SSS ) said:


> are you saying Maxwell and Pascal are power efficient because they sucked at performance ?


not what i stated at all.


----------



## Solid State Soul ( SSS ) (Dec 21, 2021)

looniam said:


> not what i stated at all.


my bad, i got confused since you mentioned Kepler too, those were not efficient, maxwell almost halved the power consumption over it


----------



## looniam (Dec 21, 2021)

Solid State Soul ( SSS ) said:


> my bad, i got confused since you mentioned Kepler too, those were not efficient, maxwell almost halved the power consumption over it


yeah, sorry i'm starting to celebrate my BD (12/22) a few hours early and my grammar/syntax will be more american than usual.


----------



## oxrufiioxo (Dec 22, 2021)

lexluthermiester said:


> I'm not seeing that. 8GB card. You're having another problem. Perhaps your settings are not optimal.



On both my 3080 Ti and 2080 Ti it's bad at all 3 resolutions. My buddies with 3080s and 3090s are also mentioning stutters, but I haven't been able to swing by their places yet to see in person if it's the same issue I'm seeing.

Don't get me wrong, it's playable, but frame times are still worse than on the PS5 version, which I also own. It's just a bad port in general... They've also ruined the HDR compared to the PlayStation versions. Apparently running the game in DX11 helps, but I haven't had a chance to try it.

This is pretty much what I'm seeing if anyone else is wondering how bad the port is.


__ https://twitter.com/i/web/status/1471805451495002119


----------



## Rorre (Dec 22, 2021)

For gaming the 3060ti for sure; for AI and ML, the more vram the better. So a 3060 12GB card for those who can't afford an RTX card for scientific computing.


----------



## lexluthermiester (Dec 22, 2021)

Rorre said:


> So a 3060 12GB card for those who cant afford an RTX card for scientific computing.


Not true at all.



Vario said:


> The 780ti also aged terribly due to that 3GB.


There was a 6GB version that was fairly popular. And it aged a lot better.



oxrufiioxo said:


> This is pretty much what I'm seeing if anyone else is wondering how bad the port is.


I think it's a beautiful port. Not everyone is having that glitch and in every other way it is an amazing game. Let's not blow things out of proportion folks.

That said, we're off-topic. Glitches with FF7Remake should be taken to a new thread f the discussion needs to progress. I'd be happy to open such a thread if everyone would like.


----------



## AleXXX666 (Dec 22, 2021)

oxrufiioxo said:


> That's a great idea actually.


better to get a 3070 in that case, but, if money isn't the issue, then you could buy a pentium, sell it and buy an i7 



oxrufiioxo said:


> That's a great idea actually.


great, if you sell 3060 for full price you got it... 



Vario said:


> I've been in EVGA queue for both of these products for almost a year. The 3060 12GB is likely to come up first.  I have a question regarding predicting which is more likely to have staying power ~4 years out from now.  I tend to run stuff a really long time. The 3060ti processor is much faster but the 12GB ram is possibly better in this aspect if lower vram amounts end up being a limiting factor. What do you guys think?
> 
> The main reason I ask is in 2013, I once purchased a 770 2GB instead of the 4GB because I was told the card wasn't really fast enough to matter.  After a few years, the limited VRAM became an issue that prevented me from running some games.  Similarly, the 780ti's 3GB severely limited that card just a couple years after its launch, despite being close to a 980 in performance otherwise.


2GB vs 4GB isn't the same as 8GB vs 12GB. Get the Ti; I have it and it handles 1440p Ultra and 4K medium (Hitman 3 and Days Gone are what I play). I've compared gameplay of Hitman 3 with a friend's 3060 and the 3060 is trash vs my Ti lol 



Rorre said:


> For gaming the 3060ti for sure, for AI and ML the more vram the better. So a 3060 12GB card for those who cant afford an RTX card for scientific computing.


for scientific you need Tesla or whatever they offer lol....


----------



## oxrufiioxo (Dec 22, 2021)

AleXXX666 said:


> great, if you sell 3060 for full price you got it...



The 3060, even used, is selling for around 300 USD more than what he's paying for it currently.


----------



## Rorre (Dec 22, 2021)

lexluthermiester said:


> Not true at all.
> 
> 
> There was a 6GB version that was fairly popular. And it aged a lot better.
> ...



Video RAM size: It measures how much data your system can store and handle at one time. If you’ll be working with categorical data and Natural Language Processing (NLP), the amount of VRAM is not so important. However, higher VRAM is crucial for Computer Vision models.
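
To put some back-of-envelope arithmetic behind this (my own hypothetical numbers, not from any real model): in computer vision, the activation maps held for backpropagation dwarf the weights themselves, which is why VRAM capacity matters so much there:

```python
# Rough sketch of VRAM demand for one hypothetical conv layer, purely
# illustrative: batch of 32 images, 64 channels at 512x512, fp32.
BYTES = 4  # bytes per fp32 value

def activations_mb(batch, channels, h, w):
    # Activation maps must be kept around for the backward pass.
    return batch * channels * h * w * BYTES / (1024 ** 2)

act = activations_mb(32, 64, 512, 512)
# Weights of a 3x3 conv going from 64 to 64 channels are comparatively tiny.
weights_mb = 64 * 64 * 3 * 3 * BYTES / (1024 ** 2)
print(f"activations: {act:.0f} MB, weights: {weights_mb:.2f} MB")
```

A single layer's activations can eat gigabytes at high resolution while its weights fit in a fraction of a megabyte, and a real network stacks dozens of such layers.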


----------



## lexluthermiester (Dec 22, 2021)

Rorre said:


> Video RAM size: It measures how much data your system can store and handle at one time. If you’ll be working with categorical data and Natural Language Processing (NLP), the amount of VRAM is not so important. However, higher VRAM is crucial for Computer Vision models.


I'm not disputing that. You said...


Rorre said:


> For gaming the 3060ti for sure, for AI and ML the more vram the better. So a 3060 12GB card for those who cant afford an RTX card for scientific computing.


...which is not true. There are many gamers who will prefer a 12GB because it will provide better longevity and thus the card will last them longer. It's not just attractive for scientific use as you originally implied.


----------



## AusWolf (Dec 22, 2021)

I'm not sure if the question is still valid - I didn't want to read 7 pages of comments to influence my opinion. 

Personally, in an ideal situation, I would consider 3 things: 1. How long do you want to keep the card for? 2. What resolution do you play at? 3. Are you willing to sacrifice visuals for performance?

1. If you want to keep it for 4-5 years, the more VRAM might be a better option (3060 12 GB).
2. If you play above 1440p, you need GPU power and VRAM as well (either will do).
3. If you're OK with taking graphical settings down a notch, a more powerful GPU is more useful (3060 Ti 8 GB).

Although, the GPU market is far from ideal, so I'd rather stay in the queue for both and get whichever comes first. They're both more than capable gaming GPUs anyway.


----------



## Frick (Dec 22, 2021)

About VRAM use: I play Cyberpunk 2077 fine on 4K on my 3060ti. Not on max settings, but that is not due to lack of VRAM. The current generation of consoles are very new. I am assuming the 8GB on the 3060ti will be plenty for as long as it's viable as a 4K card. If I have to dial down settings it will be because of lack of power and not lack of VRAM.


----------



## Rorre (Dec 22, 2021)

lexluthermiester said:


> I'm not disputing that. You said...
> 
> ...which is not true. There are many gamers who will prefer a 12GB because it will provide better longevity and thus the card will last them longer. It's not just attractive for scientific use as you originally implied.


I reread my quote. I meant that a 12GB card would be more useful in scientific vision modeling (where all the VRAM can be utilized) if you can't afford a *Quadro* RTX card. I'm not sure that any card can use more than 6-8GB gaming at 1080p. And if I'm wrong, forgive me, but a 3060 12GB card playing most modern games at 1440p may not be more enjoyable than playing at 1080p maxed out. IMHO.


----------



## Bomby569 (Dec 22, 2021)

Frick said:


> About VRAM use: I play Cyberpunk 2077 fine on 4K on my 3060ti. Not on max settings, but that is not due to lack of VRAM. The current generation of consoles are very new. I am assuming the 8GB on the 3060ti will be plenty for as long as it's viable as a 4K card. If I have to dial down settings it will be because of lack of power and not lack of VRAM.



It's exactly because you don't max the settings that you don't see the problem. If you played at 1080p or 1440p and could max the settings, you would (not talking about CP77, but in general)


----------



## Selaya (Dec 22, 2021)

even if you gave the 3060ti 16GiB of vram on a proper bus you wouldn't be able to max CP77 settings @4k. the vram's not your bottleneck here.


----------



## Solid State Soul ( SSS ) (Dec 22, 2021)

AusWolf said:


> 1. If you want to keep it for 4-5 years, the more VRAM might be a better option (3060 12 GB).
> 2. If you play above 1440p, you need GPU power and VRAM as well (either will do).


3. If ray tracing is meaningless to you, then get the best of both worlds with an RX 6800


----------



## Vario (Dec 22, 2021)

lexluthermiester said:


> There was a 6GB version that was fairly popular. And it aged a lot better.


That model was the 780 6GB not a 780ti, and it was a rarer card.


Solid State Soul ( SSS ) said:


> 3. If ray tracing is meaningless to you, then get the best of both worlds with an RX 6800


Without a queue to purchase at MSRP, I am not interested.  Not buying scalped cards above MSRP.


----------



## neatfeatguy (Dec 22, 2021)

Bomby569 said:


> It's exactly because you don't max the settings that you don't see the problem. If you played at 1080p or 1440p and could max the settings, you would (not talking about CP77, but in general)



But I thought having 12GB of VRAM on a weaker card was better over having 8GB on a faster card?

I'm so confused. CP2077 can utilize up to 10GB of VRAM when RT is enabled (upwards of 7GB without RT), according to TPU and their benchmarking. Why is a 3060Ti with only 8GB outpacing a 3060 with 12GB? Last I checked, 12 > 8.

Guru3D shows the 3060Ti, even with its limited VRAM, is 30-35% faster than the 3060 that has more VRAM.

Maybe it's just that game.....what other game can use lots of VRAM....?

Oh, maybe FarCry 6 will show how much better the 12GB is over 8GB.
At 1440p almost 8GB is used.
At 4k over 9GB can be used.
No...still seeing a 30-35% faster performance from the lesser VRAM 3060Ti over the 3060.

I just don't understand. What am I not getting? Maybe folks are saying that 4 years down the road the weaker 3060 with 12GB will give better performance than a 3060Ti with only a measly 8GB. Man, I'm just going to have to wait a while to see if this is true. Shucks, by then no one will care because we'll all be talking/complaining about whatever current gen AMD/Nvidia/Intel are providing.


----------



## chrcoluk (Dec 22, 2021)

oxrufiioxo said:


> On both my 3080 ti and 2080 ti it's bad at all 3 resolutions. My buddies with 3080s and 3090s are also mentioning stutters but I haven't been able to swing by their places to see if it's the same issues I'm seeing in person yet.
> 
> Don't get me wrong it's playable but frame times are still worse than the PS5 version which I also own. It's just a bad port in general.... They've also ruined the hdr when compared to the Playstation versions. Apparently running the game in DX 11 helps but I haven't had a chance to try it.
> 
> ...


DF reported the stutters stop if you stand still, which seems like further evidence that it's shader/texture eviction and reloading causing the problem.


----------



## AusWolf (Dec 22, 2021)

neatfeatguy said:


> But I thought having 12GB of VRAM on a weaker card was better over having 8GB on a faster card?
> 
> I'm so confused. CP2077 can utilize up to 10GB of VRAM when RT is enabled (upwards of 7GB without RT), according to TPU and their benchmarking. Why is a 3060Ti with only 8GB outpacing a 3060 with 12GB? Last I checked, 12 > 8.
> 
> ...


Just because the game _allocates_ that much VRAM when it's available doesn't mean it actually _needs_ it. I also played CP77 with a 4 GB GTX 1650 at 1080p High and VRAM usage hovered between 3-3.5 GB.


----------



## Bomby569 (Dec 22, 2021)

neatfeatguy said:


> Oh, maybe FarCry 6 will show how much better the 12GB is over 8GB.
> At 1440p almost 8GB is used.
> At 4k over 9GB can be used.
> No...still seeing a 30-35% faster performance from the lesser VRAM 3060Ti over the 3060.



VRAM use depends on settings; there are games that tell you how much VRAM is used based on how much you crank the settings, like RE Village for example. It's not as simple as what resolution you are playing at.


----------



## lexluthermiester (Dec 23, 2021)

Rorre said:


> I reread my quote, I meant that a 12GB card would be more useful in scientific Vision modelings (where all vram can be utilized) if you can’t afford a *Quadro *RTX card.


Ah, I see what you mean now. Fair enough.


Rorre said:


> I’m not sure that any card can use more than 6-8GB gaming in 1080p.


I've seen it. It used to be rare but it's now getting more common. 8GB for 1080p gaming is good, but for max settings and a few AAA titles, it's just not enough.



Vario said:


> That model was the 780 6GB not a 780ti, and it was a rarer card.


Actually, both had a 6GB variant. I sold them both, side by side. The following for reference:

NVIDIA GeForce GTX 780 6 GB Specs (www.techpowerup.com): NVIDIA GK110B, 902 MHz, 2304 Cores, 192 TMUs, 48 ROPs, 6144 MB GDDR5, 1502 MHz, 384 bit

NVIDIA GeForce GTX 780 Ti 6 GB Specs (www.techpowerup.com): NVIDIA GK110B, 928 MHz, 2880 Cores, 240 TMUs, 48 ROPs, 6144 MB GDDR5, 1753 MHz, 384 bit

The 780ti 6GB, even though it's just had its last driver set, is still a reasonable card to have, especially in this current state of affairs...


----------



## AusWolf (Dec 23, 2021)

Bomby569 said:


> VRAM use depends on settings; there are games that tell you how much VRAM is used based on how much you crank the settings, like RE Village for example. It's not as simple as what resolution you are playing at.


It also depends on how much you've got available. If you have less, the game will allocate less, which will not necessarily result in a difference in performance.



lexluthermiester said:


> I've seen it. It used to be rare but it's now getting more common. 8GB for 1080p gaming is good, but for max settings and a few AAA titles, it's just not enough.


Which titles? I happen to have an 8 GB card and I'd be happy to test it.


----------



## Kissamies (Dec 23, 2021)

chrcoluk said:


> FF15 can use 12 gigs at 1080p due to massive textures   It even has nasty leaks with nvidia grass feature that it will consume 10s of gigs of VRAM if its available.
> 
> Depends on size of textures and textures need VRAM more than horsepower.
> 
> Remember the market is bigger than AAA shooters.


Tho still ran fine on 980 Ti, allocating isn't the same as usage.


----------



## Bomby569 (Dec 23, 2021)

AusWolf said:


> It also depends on how much you've got available. If you have less, the game will allocate less, which will not necessarily result in a difference in performance.



You can literally try it like I did a couple of days ago, because I was playing it and had it installed. It is accurate, and RE Village turns into a slideshow at 10 fps if you go over the limits.


----------



## AusWolf (Dec 23, 2021)

Bomby569 said:


> You can literally try it like I did a couple of days ago, because I was playing it and had it installed. It is accurate, and RE Village turns into a slideshow at 10 fps if you go over the limits.


I haven't tried RE:Village, but it's on my wishlist, so I eventually will.  I just know that Cyberpunk works quite cleverly with VRAM. It will not try to allocate more than what you have, so looking at VRAM usage on a 16-24 GB card can be misleading. Though of course, there is a lower limit for every game, but I didn't reach it in Cyberpunk with a 4 GB 1650. Most modern games work on similar principles, as far as I know.


----------



## Kissamies (Dec 23, 2021)

At least 11GB is fine for Village at 1080p with everything maxed out except RT


----------



## Bomby569 (Dec 23, 2021)

AusWolf said:


> I haven't tried RE:Village, but it's on my wishlist, so I eventually will.  I just know that Cyberpunk works quite cleverly with VRAM. It will not try to allocate more than what you have, so looking at VRAM usage on a 16-24 GB card can be misleading. Though of course, there is a lower limit for every game, but I didn't reach it in Cyberpunk with a 4 GB 1650. Most modern games work on similar principles, as far as I know.



not the best example, as you can get asset pop-in in CP77 (NPCs, cars, etc...), especially on low-end hardware. I wouldn't call that exactly clever.


----------



## AusWolf (Dec 23, 2021)

Bomby569 said:


> not the best example, as you can get asset pop-in in CP77 (NPCs, cars, etc...), especially on low-end hardware. I wouldn't call that exactly clever.


Even if that's true, it wasn't noticeable. Or maybe my tolerance level is lower than most.


----------



## efikkan (Dec 23, 2021)

AusWolf said:


> Just because the game _allocates_ that much VRAM when it's available doesn't mean it actually _needs_ it. I also played CP77 with a 4 GB GTX 1650 at 1080p High and VRAM usage hovered between 3-3.5 GB.


If only more people used some common sense and understood this.^

Games may allocate a lot of VRAM if it is available;
- Buffers like Z-buffers, stencil buffers, etc., which are heavily compressed because they are mostly empty at any given time. On top of that, many texture buffers like normal maps and opacity maps (and with RT, metallic, emissive and roughness textures too) have very low data density, which means they can be compressed heavily (>>50%). Additionally, some temporary buffers may only be used for a specific render pass and be compressed down to nearly nothing for the rest of the frame. Still, these buffers will show up as "allocated" space, because to the application the GPU's compression is transparent, so an allocated "256MB" buffer may in reality occupy only a few kB of physical VRAM. Those who don't understand the terms I'm using here are *not* qualified for such a technical discussion.
- Some games use free VRAM for additional temporary buffers and caching.

Additionally, most people fail to understand the fundamental truth that if you have space for extra textures in VRAM, then actually using those textures will require both more bandwidth and more computational performance. This balance is not going to change until the hardware gains new capabilities that reduce the computational load, so it is not going to change during the lifetime of an existing product, and even then, using the VRAM data will still require more bandwidth. For these reasons, thinking extra VRAM is "future proofing" is nothing but foolishness.

If there are still people claiming RTX 3060 12 GB is a better card than RTX 3060 Ti 8GB due to the extra VRAM after reading this, then they don't know what the heck they are talking about.
Buy the one that makes sense in terms of performance, price and availability. Just ignore the VRAM difference, it doesn't matter for gaming. RTX 3060 Ti 8GB is the faster card, and it will remain so 2 years and even 5 years from now.
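
As a rough illustration of the allocation point above (my own arithmetic with typical render-target formats, not taken from any particular engine): even the nominal, uncompressed sizes of a few 4K render targets add up fast, and tools report these as "allocated" even when compression means far less physical VRAM is actually touched:

```python
# Nominal (uncompressed) sizes of common 4K render targets. The formats and
# byte counts are typical examples, not from any specific game; D32S8 is
# counted at its 5 logical bytes, though hardware often pads it further.
WIDTH, HEIGHT = 3840, 2160

def buffer_mb(bytes_per_pixel):
    return WIDTH * HEIGHT * bytes_per_pixel / (1024 ** 2)

targets = {
    "color (RGBA16F)": buffer_mb(8),
    "depth/stencil (D32S8)": buffer_mb(5),
    "normals (RGBA8)": buffer_mb(4),
    "motion vectors (RG16F)": buffer_mb(4),
}
for name, mb in targets.items():
    print(f"{name}: {mb:.1f} MB")
print(f"total: {sum(targets.values()):.1f} MB")
```

These are the numbers a monitoring tool would count toward "VRAM usage", regardless of how much of each buffer is physically resident after compression.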


----------



## lexluthermiester (Dec 23, 2021)

AusWolf said:


> I just know that Cyberpunk works quite cleverly with VRAM.


While true, you take a big quality and performance hit. Max everything out and even at 1080p, 8GB is just not enough.



efikkan said:


> If there are still people claiming RTX 3060 12 GB is a better card than RTX 3060 Ti 8GB due to the extra VRAM after reading this, then they don't know what the heck they are talking about.


That's a bold claim and just as equally incorrect.

This is the same argument that happens with every generation of GPUs that ends up offering more VRAM, with games pushing and exceeding the limits. People argued the same way when the 2GB cards came out, when everyone thought 1GB was enough. It happened again with the 2GB vs 4GB jump, and again with the 4GB vs 8GB jump. And the same result happens EVERY time: the cards with the greater amount of VRAM end up being useful longer.

Progress ALWAYS marches forward and as it does the resources needed to accommodate that progress need to expand.


----------



## efikkan (Dec 23, 2021)

lexluthermiester said:


> This is the same argument that happens with every generation of GPU's that end up offering more VRAM for games pushing and exceeding the limits. People argued that same way when the 2GB cards came out when everyone thought 1GB was enough. It happened again with the 2GB VS 4GB jump and again with the 4GB vs 8GB jump. And the same result happens EVERY time. The cards with the greater amount of VRAM end up being more useful longer.
> 
> Progress ALWAYS marches forward and as it does the resources needed to accommodate that progress need to expand.


I don't know if you're attempting a straw man argument here, or if you are just completely missing the point.

No one is claiming that x GB will be enough for future hardware forever, what I am pointing out is the fact that there are limits to how much a particular GPU can effectively utilize in games, and the fact that this is related to memory bandwidth and computational performance. 8 GB is more than enough for 3060 Ti, and therefore it's nonsensical that 3060 would benefit from more than that. Even you can't deny that.


----------



## Bomby569 (Dec 23, 2021)

This is not as simple as an allocation problem, because that will vary wildly based on the game engine and how the game uses VRAM. It's true that what you see is allocation, not actual use, but that point always comes up when talking about RAM or VRAM; it's not always the answer to the question people are asking. It depends on how close you are to the limit.

I also don't think it's a question of which is better (of course the faster one is better), but rather whether 8GB is really enough for normal gamers who don't change cards every year, especially in this market. There is a real question about the long-term use of these cards.
I bought and own the 3060ti, but I really wish they had put in a little extra VRAM, especially considering the prices. Just talking about MSRP, not even getting into the shortage problem.



efikkan said:


> 8 GB is more than enough for 3060 Ti, and therefore it's nonsensical that 3060 would benefit from more than that. Even you can't deny that.


Just play RE Village and try to max the settings on a 3060ti to see what happens. And don't give me a theoretical answer: download the demo and do it. See for yourself what I saw. If you don't own the card or don't want to try the game, too bad, because I have and I did


----------



## AusWolf (Dec 23, 2021)

lexluthermiester said:


> While true, you take a big quality and performance hit. Max everything out and even at 1080p, 8GB is just not enough.


That's not true. I finished the game with my 8 GB 2070 at 1080p, all settings maxed out, RT Psycho, DLSS Quality, and suffered no technical issues (no texture/asset popping or slowdowns) whatsoever.

Additionally, VRAM usage barely hit 7 GB. Like I said, the game uses as much VRAM as it can get. As of 2021, VRAM usage is not a constant value across different hardware.



lexluthermiester said:


> That's a bold claim and just as equally incorrect.
> 
> This is the same argument that happens with every generation of GPU's that end up offering more VRAM for games pushing and exceeding the limits. People argued that same way when the 2GB cards came out when everyone thought 1GB was enough. It happened again with the 2GB VS 4GB jump and again with the 4GB vs 8GB jump. And the same result happens EVERY time. The cards with the greater amount of VRAM end up being more useful longer.
> 
> Progress ALWAYS marches forward and as it does the resources needed to accommodate that progress need to expand.


I agree with that when you're comparing two cards using an identical GPU with different amounts of VRAM. When you're comparing a slower card with more VRAM to a faster one with less, I'm not entirely sure.


----------



## Selaya (Dec 23, 2021)

also, the 3060ti has a wider bus than the 3060, and bandwidth matters too, so yeah
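
For reference, the bandwidth gap follows directly from the published specs (the 3060 runs 15 Gbps GDDR6 on a 192-bit bus, the 3060 Ti 14 Gbps on a 256-bit bus):

```python
# Peak memory bandwidth = (bus width in bytes) x (data rate per pin).
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(192, 15))  # RTX 3060:    360.0 GB/s
print(bandwidth_gbs(256, 14))  # RTX 3060 Ti: 448.0 GB/s
```

So despite the slightly slower memory chips, the Ti has roughly 24% more bandwidth to feed its larger GPU.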


----------



## AusWolf (Dec 23, 2021)

Bomby569 said:


> Just play RE Village and try to max the settings on a 3060ti to see what happens. And don't give me a theoretical answer: download the demo and do it. See for yourself what I saw. If you don't own the card or don't want to try the game, too bad, because I have and I did


It has a demo? I didn't know!  Downloading it now...


----------



## efikkan (Dec 23, 2021)

Bomby569 said:


> This is not as simple as an allocation problem because that will vary wildly based on the game engine and how the game uses vram.


While the game is in control of what it allocates and how much, it is not in control of how much bandwidth and computational performance is required to utilize those textures. We developers can't speed up how the TMUs work; that's hardware, it's fixed. We can only change how those resources are utilized.



Bomby569 said:


> … but ratter if the 8GB is really enough for a normal gamers that doesn't change cards every year especially in this market, there is a real question of long term use of this cards.


Well, if you understood my earlier post, you wouldn't be wondering about this.
If you want to utilize more VRAM in future games on the same hardware, you have to sacrifice frame rate, since you are limited by the bottlenecks of bandwidth and computational performance.



Bomby569 said:


> Just play RE Village and try to max the settings on a 3060ti to see what happens. And don't give me a theoretical answer: download the demo and do it. See for yourself what I saw. If you don't own the card or don't want to try the game, too bad, because I have and I did


Really? Because there are people who are able to run a 3060 Ti at 4K at max settings.
But trying to do this with a 3060 12 GB would give you much lower FPS, well below 60 FPS, so what's the point? Running a ~40 FPS slideshow? The RTX 3060 Ti still remains the better card, regardless of your feelings.


----------



## Bomby569 (Dec 23, 2021)

efikkan said:


> Really? Because there are people who are able to run 3060 Ti in 4K at max settings.



No, they don't; that's impossible. It's not even a question of maxing it: you won't get anywhere near close to being able to do it. Maybe for a couple of minutes, then it's a slideshow. As soon as those VRAM numbers go into the red, you're done.
I'm not talking about "people"; I did it myself.


----------



## AusWolf (Dec 23, 2021)

efikkan said:


> If you want to utilize more VRAM in future games on the same hardware, you have to sacrifice frame rate, since you are limited by the bottlenecks of bandwidth and computational performance.


This!

Let's not forget that GPU performance improves through the years just as much as VRAM capacity does (if not more). Your "superior" 12 GB 3060 might run out of computational power long before more powerful cards with less VRAM run out of VRAM. Futureproofing is a myth.

Edit: https://www.techpowerup.com/review/halo-infinite-benchmark-test-performance/4.html

Guys, let's look at two graphs. VRAM usage is 6,600 MB at 1080p and 7 GB at 1440p. Yet the 6 GB 1660 Ti runs it better than the 8 GB 1080, even at 1440p, and only shows its weakness at 4K. Why?


----------



## Frick (Dec 23, 2021)

lexluthermiester said:


> While true, you take a big quality and performance hit. Max everything out and even 1080p, 8GB is just not enough.



The problem is that it's really hard to test this statement, but it's clear the 12GB 3060 doesn't handle Cyberpunk 2077 better than the 3060ti. I don't get a smooth 60fps with everything maxed out @ 1080p, and that is not because of a lack of memory. But what about next-generation games? Well, the lack of power would require dropping settings anyway.


lexluthermiester said:


> This is the same argument that happens with every generation of GPU's that end up offering more VRAM for games pushing and exceeding the limits. People argued that same way when the 2GB cards came out when everyone thought 1GB was enough. It happened again with the 2GB VS 4GB jump and again with the 4GB vs 8GB jump. And the same result happens EVERY time. The cards with the greater amount of VRAM end up being more useful longer.



The only proper test on this I know of is this, and while it's a bit old now, it's still pretty telling. The graphs below are the worst-case scenario (where the amount of RAM actually did make a difference). You'd probably want to tweak the graphics anyway to get a smoother framerate, meaning dropping settings. The big problem for the cards below isn't the amount of memory, it's the performance of the GPU and the memory.

But you do have a point, sure. More memory is good, most of the time (remember the GeForce4 MX440 variants with stupid amounts of memory?). But when a game comes along that brings the GPU to its knees, the reason for that is always, to my recollection, GPU performance, not the amount of VRAM. If you have a specific example saying otherwise (even just from memory), tell us. I mean, I've seen the same things as you, but my conclusion is basically the opposite: GPU performance is more important than the amount of VRAM, up to a point.

A 6GB version of the RTX 3090 would be terrible, but that doesn't exist. The 8GB RTX 3070 does exist, however, and it is faster than the RTX 3060ti. I guess only time will tell if the 12GB 3060 will at any point be better at games than both of them, but I seriously doubt it (or rather, I know in my bones it won't).

A 12GB 3060ti would of course have been lovely, but that is not a thing that exists. And in any case the market is bonkers now, so all this is theoretical if you want to buy a GPU for "sane" prices.

EDIT: Hah, here's a fun quote:



> Standout products included the GeForce GTX 980 Ti and Radeon R9 390X, though at $400+ we realize these stellar graphics cards aren’t for everyone.



Yeah.

EDIT: Thinking about it, maybe the R9 Fury was held back by the 4GB of HBM?


----------



## AusWolf (Dec 23, 2021)

Frick said:


> The 8GB RTX 3070 exists however, and that is faster than the RTX 3060ti, and I guess only time will tell if the 12GB 3060 with at any point be a better at games than both of them, but I seriously doubt it (or rather, I know in my bones it won't).


My gut feeling is that by the time we can say for certain that 8 GB VRAM isn't enough anymore, the pure computational power of the 3060 won't be enough, either, regardless of its 12 GB VRAM.

Alright, guys. I've been playing RE: Village demo with 1080p maxed out settings for nearly half an hour now. While my VRAM usage is constantly pegged at 8 GB, the game runs at a rock solid 40 FPS. As far as I've seen, this is normal behaviour from new RE games. Unless you want to be looking at your VRAM usage instead of playing the game, I'd say everything's fine.


----------



## lexluthermiester (Dec 23, 2021)

efikkan said:


> I don't know if you're attempting a straw man argument here, or if you are just completely missing the point.
> 
> No one is claiming that x GB will be enough for future hardware forever, what I am pointing out is the fact that there are limits to how much a particular GPU can effectively utilize in games, and the fact that this is related to memory bandwidth and computational performance. 8 GB is more than enough for 3060 Ti, and therefore it's nonsensical that 3060 would benefit from more than that.


No, I'm very certain it is you missing some context. Your opinion about what a 3060 can do is rather off kilter. Please review:








EVGA GeForce RTX 3060 XC Review (www.techpowerup.com): "EVGA engineered a compact dual-slot design with the RTX 3060 XC that will fit all cases. Unlike all other RTX 3060 cards we've tested today, a metal backplate is included. EVGA's card ticks at a rated boost of 1852 MHz, and the cooler features the idle-fan-stop capability."
				



Those results are nearly universal, so arguments over which brand of card performs better can be ignored.

And as you can see from those results, the 3060 performs on par with or beats cards that were top tier just a few years ago and are still considered powerful enough for current high-end gaming. If those older cards were good enough for gaming at 4K back then, they are still good now. The question of whether or not it can perform well is academic. That leaves us with the question of the topic: will a player using a 3060 benefit more from 12GB, or from the extra performance of the Ti model? The simple answer is clear: long term, the 12GB will age better. History has ALWAYS shown this. Cards with more VRAM stay relevant longer. It is fact, not opinion. Full stop, end of discussion.

Now, if a prospective buyer knows they will upgrade again in a year or two, the problem is once again academic and the VRAM is not going to be a factor. But if that prospective user needs the card to last them more than 2 years, then the math flips and the VRAM becomes much more important, and needs to be a compelling factor in the buying choice. Anyone who fails to grasp that simple logic and understand proven history is a fool unto themselves.



Frick said:


> The problem is that it's really hard to test this statement


I will concede that it can be objectively difficult to make an exact determination. However, the math consistently shows games pushing the limits of VRAM more than the limits of raw performance. And history teaches us that lesson further. It is a pattern that has been repeating itself since the dawn of the computer age.


Frick said:


> but it's clear the 12GB 3060 doesn't handle Cyberpunk 2077 better than the 3060ti.


That's just one game, one example. As stated above, short term, the 3060ti would be the better choice. But if a buyer needs their GPU purchase to last them 2 or more years, the 3060 12GB will be the objectively better choice by far. Games will always be optimized to meet the needs of mid-tier GPUs, but few devs use heavy texture compression anymore, and thus ample VRAM is very important, again in the long term.
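To put rough numbers on the texture point (illustrative arithmetic only, not figures from any particular game): a full mip chain adds about a third on top of the base level, and a block-compressed format such as BC7 stores 16 bytes per 4x4 texel block, i.e. 1 byte per texel versus 4 for uncompressed RGBA8.

```python
# Rough texture memory math. Assumptions: RGBA8 is 4 bytes/texel,
# BC7 is 1 byte/texel, and a full mip chain costs ~4/3 of the base level.
def texture_mib(width: int, height: int, bytes_per_texel: float,
                mip_chain: bool = True) -> float:
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

print(f"4096x4096 RGBA8 uncompressed: {texture_mib(4096, 4096, 4):.1f} MiB")
print(f"4096x4096 BC7 compressed:     {texture_mib(4096, 4096, 1):.1f} MiB")
```

So a single uncompressed 4K texture with mips is roughly 85 MiB versus roughly 21 MiB compressed, which is why skipping compression eats into an 8 GB buffer so quickly.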


----------



## Tetras (Dec 23, 2021)

lexluthermiester said:


> No, I'm very certain it is you missing some context. Your opinion about what a 3060 can do is rather off kilter. Please review;
> 
> 
> 
> ...



It's not academic how the card performs (or its memory bandwidth). As a card ages, it gets less effective at pushing pixels in newer games, and higher resolutions have more pixels, so to preserve playability, the resolution gets dropped. VRAM is a factor, but it's not the only factor.

History does care about performance. Take the RX 570/580 for example: 8GB of memory doesn't put the 570/580 in another performance category. It could never compete with a 3060 Ti at 1080p or 1440p; it's simply too slow. If you'd got the 8GB model anticipating it would last into the 8GB generation (as 8GB is on mainstream cards now), it hasn't helped: the RX 6600 8GB outperforms the 580 8GB by a wide margin. If you'd gone for an RX 570 8GB instead of an RX 580 4GB, it wouldn't have helped either; the RX 6600 simply monsters it at any resolution. The 3GB model of the 1060 was criticised even at the time as being too marginal, so that's maybe a different story.


----------



## lexluthermiester (Dec 23, 2021)

Tetras said:


> 8GB of memory doesn't put the 570/580 in another performance category





Tetras said:


> If you'd got the 8GB model anticipating it would last into the 8GB generation (as 8GB is on mainstream cards now), it hasn't helped.


Total rubbish. The differences in gaming capability between the RX570/580 4GB and 8GB are very distinct. A user can, in fact, still game on an 8GB RX570 or RX580 with a good level of quality and experience, whereas the 4GB models struggle more to keep up due to the lack of VRAM. Those cards are perfect examples of the differences in gaming experience a user can expect. The 8GB versions are still usable, the 4GB versions less so.

There are no hard lines to be drawn in this debate, but to say a 12GB card is currently less useful than an 8GB card and will remain so going forward is simply short-sighted to say the least.


----------



## Tetras (Dec 23, 2021)

lexluthermiester said:


> Total rubbish. The differences in gaming capability between the RX570/580 4GB and 8GB are very distinct. A user can, in fact, still game on an 8GB RX570 or RX580 with a good level of quality and experience, whereas the 4GB models struggle more to keep up due to the lack of VRAM. Those cards are perfect examples of the differences in gaming experience a user can expect. The 8GB versions are still usable, the 4GB versions less so.
> 
> There are no hard lines to be drawn in this debate, but to say a 12GB card is currently less useful than an 8GB card and will remain so going forward is simply short-sighted to say the least.



It's funny you say there are no hard lines when you 'rubbish' everybody else's opinion.

The RX 570/580, 4GB and 8GB alike, are largely unplayable in the same scenarios nowadays. The RX 6600 offers GTX 1080-class performance and is capable of things the RX 570/580 are not. I expect there are some edge cases where the 8GB can just about manage higher detail settings, but modern triple-A 1440p/4K gaming is just not possible on these cards anymore, and having double the VRAM doesn't change that.


----------



## lexluthermiester (Dec 23, 2021)

Tetras said:


> It's funny you say there are no hard lines when you 'rubbish' everybody else's opinion.


Think through that one a moment.


Tetras said:


> but modern triple a 1440p/4K gaming are just not possible on these cards anymore


I never mentioned a specific resolution for those cards. You seem to be missing the point or deliberately glossing it over. Either way, it's on you.


----------



## oxrufiioxo (Dec 23, 2021)

AusWolf said:


> I agree with that when you're comparing two cards using an identical GPU with different amounts of VRAM. When you're comparing a slower card with more VRAM to a faster one with less, I'm not entirely sure.



That's the whole point, though: Nvidia cheaped out on the 3060 Ti, giving it less VRAM than the much weaker 3060, and making the 3070 and 3080 look even worse, since below the 3080 Ti the weakest 30-series desktop card has the most VRAM. They partly didn't have a choice, because 6GB is no longer enough and really shouldn't be on any $300+ card anyway, but that still doesn't excuse Nvidia cheaping out on the higher-tier variants.

Looking at MSRP, there has been no progress in VRAM amounts since Pascal, other than the 3060, which only has 12GB because 6GB would have looked terrible. Half a decade with zero progress from the green team is just sad.

Everyone has got to decide with their own wallet though and at least for me I would expect more vram at a given price bracket generation to generation.....


----------



## lexluthermiester (Dec 23, 2021)

oxrufiioxo said:


> That's the whole point though, Nvidia cheaped out on the 3060 Ti giving it less Vram than their much weaker 3060


No one can argue this. Nvidia has misread the market's needs. If the 3060ti had the same 12GB, the choice would need no debate. But Nvidia is also constrained by the pandemic's effect on the market.


----------



## AusWolf (Dec 23, 2021)

oxrufiioxo said:


> That's the whole point though, Nvidia cheaped out on the 3060 Ti giving it less Vram than their much weaker 3060 while making the 3070 and 3080 look even worse as under the 3080 ti the weakest 30 series card has the most vram on Desktop. They partly didn't have a choice because 6GB is no longer enough and really shouldn't be on any 300+ usd card anyways but that still doesn't excuse Nvidia from cheaping out on higher tiered variants.
> 
> When looking at msrp there has been no progress in Vram amounts since Pascal other than the 3060 which only has 12GB because 6GB would have looked terrible. Half Decade with zero progress from the green team is just sad.
> 
> Everyone has got to decide with their own wallet though and at least for me I would expect more vram at a given price bracket generation to generation.....


I don't think they "cheaped out".

To put 12 GB of VRAM on the 3060 Ti, they would have had to cut the memory controller down to 192 bits, resulting in 336 GB/s of bandwidth, which is lower than what the non-Ti 3060 has. They could have used higher-speed memory, but it would have increased costs without helping performance much. According to the TPU database, there's roughly a 25% performance difference between the 3060 and the 3060 Ti, which wouldn't be the case if the Ti were bottlenecked by VRAM bandwidth. Then nothing would have justified its higher price, even though it uses the larger GA104 chip, which doesn't matter much to the end user.
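As a back-of-envelope check (a sketch using publicly listed memory specs; the 192-bit "Ti" is hypothetical), bandwidth is just bus width divided by 8, times the per-pin data rate:

```python
# GDDR6 bandwidth sketch: (bus width in bits / 8) * effective data
# rate per pin in Gbps = GB/s. The 192-bit Ti config is hypothetical.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 14))  # RTX 3060 Ti as shipped: 448.0 GB/s
print(bandwidth_gbs(192, 14))  # hypothetical 192-bit Ti: 336.0 GB/s
print(bandwidth_gbs(192, 15))  # RTX 3060 12 GB: 360.0 GB/s
```

Which is exactly the 336 GB/s figure above: a 192-bit Ti at the same 14 Gbps memory would sit below the regular 3060's 360 GB/s.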

Another route could have been putting 16 GB VRAM on it which would not only have been unnecessary, but probably uneconomical as well.

The only reason the 3060 has 12 GB VRAM in my opinion, is that 6 GB on a mid-range card doesn't look too appealing anno 2021, and its memory controller doesn't make 8 GB possible.


----------



## Vario (Dec 23, 2021)

The 3060 Super would probably be the card to watch for, if it's 12GB with Ti performance. Regardless, if the pandemic/crypto VGA market holds, my present plan is to buy a 3060 Ti when available, sell the 3060, enter the queue for a 4060, then sell the 3060ti. Presently planning on selling my 1060 6GB, as they are worth about $300 right now.









NVIDIA GeForce RTX 3080 SUPER, 3070 SUPER and 3060 SUPER get rumored specifications (videocardz.com): "The full list of all four RTX 30 SUPER SKUs has been shared by kopite7kimi, a leaker who revealed the specs of NVIDIA Ampere GPUs more than a year before release."


----------



## oxrufiioxo (Dec 23, 2021)

lexluthermiester said:


> No one can argue this. NVidia has misread the market needs. If the 3060ti had the same 12GB, the choice would need no debate. But NVidia is also constrained by the pandemic-effect on the market.



Yeah, I'm guessing that with the 4000 series, Nvidia will be more generous with VRAM amounts, but only because AMD is competitive in rasterization performance and seems to have the edge in newer games, minus 4K.







AusWolf said:


> I don't think they "cheaped out".
> 
> To put 12 GB VRAM on the 3060 Ti, they would have had to cut down the memory controller to 192 bits, resulting in a 336 GB/s transfer speed which is lower than what the 3060 non-Ti has. They could have used higher speed memory, but it wouldn't have helped much. According to the TPU database, there's roughly 25% difference in performance between the 3060 and the 3060 Ti, which wouldn't be the case if the Ti was bottlenecked by VRAM bandwidth. Then nothing would have justified its higher price, even though it uses the larger GA104 chip, which doesn't matter much to the end user.
> 
> Another route could have been putting 16 GB VRAM on it which wouldn't only have been unnecessary, but probably uneconomical as well.



We should never feel that giving us more VRAM is uneconomical for a billion-dollar company. Yes, they have proved they don't give a shit about gamers, but even if the MSRP were $100 higher, the extra VRAM would be welcome, even if just as an option.

There are already examples of games with shit performance due to 8GB of VRAM: look at the 3070 vs the 2080 Ti at 4K, and look at that 3060 go. Embarrassing.


----------



## AusWolf (Dec 23, 2021)

Vario said:


> 3060 Super would probably be the card to watch for, if 12GB with Ti performance.  Regardless, if the pandemic/crypto VGA market holds, my present plan is buy a 3060 Ti when available, sell the 3060, enter queue for 4060, sell the 3060ti.  Presently planning on selling my 1060 6GB as they are worth about $300 right now.
> 
> 
> 
> ...


Why plan on selling a card that you don't even have yet? I'd say, just buy it, and sell it when it doesn't make you happy anymore.


----------



## chrcoluk (Dec 23, 2021)

Maenad said:


> Tho still ran fine on 980 Ti, allocating isn't the same as usage.



Just to be clear, it 'used' the memory.

I also keep reading that allocated memory is not used memory; it is used.

If memory is allocated to an app, another app cannot use the same memory; Windows doesn't support overcommit.

However, in the case of FF15, the memory isn't just allocated, it is actually used by the game. In fact, if your VRAM runs out, it falls back to normal system RAM, causing stutters, and eventually, if you don't stop the game, gives you an out-of-memory error, crashing Windows.

How do I know? It happened to me many times until I disabled Nvidia grass. It also happened to many others; someone on Reddit got an OOM playing FF15 with 48 gigs of RAM. The dev of the SpecialK mod tried to fix it as well and was partially successful. I don't know if SE ever managed to fix it themselves.
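The allocated-versus-used distinction being argued here can be sketched with ordinary system RAM standing in for VRAM (a simplified analogy, not a claim about any game engine): on Linux, mapping anonymous memory merely reserves address space, and physical pages are committed only when each page is first written; Windows, by contrast, charges committed memory up front, which is the "no overcommit" behaviour mentioned above.

```python
import mmap

# Sketch: "allocated" (address space mapped) vs "used" (pages written).
PAGE = 4096
SIZE = 64 * 1024 * 1024            # 64 MiB anonymous mapping

buf = mmap.mmap(-1, SIZE)          # allocated: address space reserved
touched = 0
for off in range(0, SIZE, PAGE):   # used: write a byte into every page
    buf[off] = 1
    touched += 1

print(f"mapped {SIZE // 2**20} MiB; wrote to {touched} pages")
buf.close()
```

On a GPU the same gap exists between what a game asks the driver to reserve and what it actually reads and writes per frame, which is why reported VRAM numbers are so easy to argue about.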


----------



## AusWolf (Dec 23, 2021)

oxrufiioxo said:


> We should never feel them giving us too much vram is uneconomical from a billion dollar company yes they have proved they do not give a shit about gamers but even if MSRP were 100 higher the extra vram would be welcomed even if as just an option.


They didn't become a billion dollar company by giving VRAM away for free.



oxrufiioxo said:


> There are already examples of games with shit performance due to the 8GB vram look at the 3070 vs the 2080 ti at 4k, look at that 3060 go embarrassing.
> 
> View attachment 229950


Of course you need more VRAM for 4K! Who would have thought?


----------



## oxrufiioxo (Dec 23, 2021)

AusWolf said:


> They didn't become a billion dollar company by giving VRAM away for free.
> 
> 
> Of course you need more VRAM for 4K! Who would have thought?



This game would be playable at 4K on the 3060 Ti and 3070 if they had more VRAM; they are plenty fast enough, but instead they lose to a weaker GPU with more VRAM.

These are all $400+ cards even at MSRP; you shouldn't need to worry about running into VRAM limitations with them in a couple of years. Again, if people want to spend that much on 8GB cards, more power to them, and if they can get them at or near MSRP, I would say go for it, but that doesn't make it any less shitty.


----------



## lexluthermiester (Dec 23, 2021)

oxrufiioxo said:


> Yeah, I'm guessing with the 4000 series Nvidia will be more generous with Vram amounts


Agreed.


Vario said:


> 3060 Super would probably be the card to watch for, if 12GB with Ti performance.


This would be great bang for buck.


----------



## AusWolf (Dec 23, 2021)

oxrufiioxo said:


> This game would be playable on the 3060 ti and 3070 if they had more vram at 4k they are plenty fast enough but instead they lose to a weaker gpu with more Vram.


I haven't seen anybody talk about 4K before you mentioned it. It is still a niche market segment that doesn't concern most gamers. Not to mention, OP has a 1440p screen according to their profile.

It is clearly not the topic here, I believe.


----------



## oxrufiioxo (Dec 23, 2021)

AusWolf said:


> I haven't seen anybody talk about 4K before you mentioned it. It is still a niche market segment that doesn't concern most gamers. Not to mention, OP has a 1440p screen according to their profile.



It still doesn't make buying a $500+ GPU that will lose to a lower-tiered SKU any better, regardless of how niche that scenario may be.


----------



## AusWolf (Dec 23, 2021)

oxrufiioxo said:


> Still doesn't make buying a 500+ gpu that will lose to a lower tiered sku any better regardless of how niche that may be.


So I shouldn't buy X graphics card for playing at 1080p/1440p because it loses to Y card in 4K? That doesn't make any sense.


----------



## oxrufiioxo (Dec 23, 2021)

AusWolf said:


> So I shouldn't buy X graphics card for playing at 1080p because it loses to Y card in 4K? That doesn't make any sense.



A higher-tier SKU shouldn't lose to a lower-tier one in any circumstance; you don't pay more for less. A 1080p GPU shouldn't be $500+ either, but what you do with your own wallet is your choice; everyone has to do their own research and make a call. The 20 series was garbage as far as progress from Pascal goes, but a lot of people own them, even me.


----------



## Frick (Dec 23, 2021)

oxrufiioxo said:


> Still doesn't make buying a 500+ gpu that will lose to a lower tiered sku any better regardless of how niche that may be.



It probably won't though.



oxrufiioxo said:


> That's the whole point though, Nvidia cheaped out on the 3060 Ti giving it less Vram than their much weaker 3060 while making the 3070 and 3080 look even worse as under the 3080 ti the weakest 30 series card has the most vram on Desktop. They partly didn't have a choice because 6GB is no longer enough and really shouldn't be on any 300+ usd card anyways but that still doesn't excuse Nvidia from cheaping out on higher tiered variants.
> 
> When looking at msrp there has been no progress in Vram amounts since Pascal other than the 3060 which only has 12GB because 6GB would have looked terrible. Half Decade with zero progress from the green team is just sad.
> 
> Everyone has got to decide with their own wallet though and at least for me I would expect more vram at a given price bracket generation to generation.....



+1 to all of this. The RTX 3060ti would have made a good <€400 card (with the downside of too little VRAM), but that isn't what reality looks like right now.


----------



## AusWolf (Dec 23, 2021)

oxrufiioxo said:


> A higher tier sku shouldn't be losing to a lower tiered one in any circumstance you don't pay more for less....


It doesn't matter. What matters is what _you_ use it for. I've got a 1080p monitor, so if it's better at 1080p, I'll buy it. I couldn't care less about 4K performance.



oxrufiioxo said:


> 1080p gpu should not be 500+ either


That I agree with, but it is what it is, unfortunately. You can take it or leave it.



oxrufiioxo said:


> the 20 series was garbage as far as progress from pascal but a lot of people own them even me.


The 20 series was about introducing raytracing and the tensor cores, nothing more. If you think about it, the 30 series is garbage too, as it basically has the same performance-per-watt as the 20 series.


----------



## dirtyferret (Dec 23, 2021)

AusWolf said:


> My gut feeling is that by the time we can say for certain that 8 GB VRAM isn't enough anymore, the pure computational power of the 3060 won't be enough, either, regardless of its 12 GB VRAM.


This, end of debate.


----------



## Bomby569 (Dec 23, 2021)

AusWolf said:


> It has a demo? I didn't know!  Downloading it now...



It does, but honestly, it doesn't represent the game that impressively. It's good for checking the gameplay, not the game itself. The game is very, very good. Sorry for the off-topic.


----------



## AusWolf (Dec 23, 2021)

Bomby569 said:


> It does, but honestly it doesn't represent the game that impressively. It's good to check the gameplay not the game itself. Game is very very good. sorry for the offtopic


It's not off.  I downloaded it mainly for testing purposes.

Like I mentioned, it runs fine with a stable 40 fps at 1080p max settings on my 2070. VRAM usage pegged at 8 GB isn't an issue.


----------



## oxrufiioxo (Dec 23, 2021)

AusWolf said:


> It doesn't matter. What matters is what _you_ use it for. I've got a 1080p monitor, so if it's better at 1080p, I'll buy it. I couldn't care less about 4K performance.
> 
> 
> That I agree with, but it is what it is, unfortunately. You can take it or leave it.
> ...



And people should always buy what feels best to them in their given price range, as it is what it is... That doesn't mean I'm going to agree that Nvidia didn't cheap out. I mean, they already saved money going with Samsung 8nm; the whole Ampere lineup likely would have been much better on TSMC 7nm.

I would've liked to see how much better a 3090-class product on it would have been.

The biggest problem with the 20 series was that RT was a bust for almost a year, and it took them almost two years to get DLSS into a usable state... I think I can count on one hand how many RT games I played on my 2080 Ti before it was replaced, and that was the only card that was better than Pascal, but Nvidia made sure we paid for it... I semi give them a pass, because you have to start somewhere; developers aren't going to make games RT-compatible until we have cards capable of it... Although I feel the same about VRAM: if developers have more of it and can target it, we will get games with much higher resolution textures. A lot of games still compress the shit out of them.


----------



## AusWolf (Dec 23, 2021)

oxrufiioxo said:


> Although I feel the same about Vram if developers have more of it and can target it we will get games with much higher resolution textures a lot of games still compress the shit out of them.


If it means I won't have to replace my 2070 for a couple more years, devs can compress their textures all they want.


----------



## Vario (Dec 23, 2021)

AusWolf said:


> Why plan on selling a card that you don't even have yet? I'd say, just buy it, and sell it when it doesn't make you happy anymore.


Because I can essentially go from a 1060 6GB to a 4060(?) with this plan without a net cash outlay by being patient and having other people pay me to be patient for them.


----------



## AusWolf (Dec 23, 2021)

Vario said:


> Because I can essentially go from a 1060 6GB to a 4060(?) with this plan without a net cash outlay by being patient and having other people pay me to be patient for them.


I don't quite get what you mean, but if a 4060 is your goal anyway, then you might as well just wait it out and save for it. Mid-term solutions aren't the most economical ones.


----------



## Bomby569 (Dec 23, 2021)

I know it's still in early stages (even if it's coming real soon, in early 2022), but the UE5 demo needs 8GB of VRAM minimum to run.








Unreal Engine 5 (www.unrealengine.com): "Unreal Engine 5 empowers all creators across all industries to deliver stunning real-time content and experiences."


----------



## Vario (Dec 24, 2021)

AusWolf said:


> I don't quite get what you mean, but if a 4060 is your goal anyway, then you might as well just wait it out and save for it. Mid-term solutions aren't the most economical ones.


It is. Presently, people will pay me $300 for a 5-year-old 1060 6GB and pay me over $800 for a 3060 that costs me $400. It is the most economical solution. Have you looked at prices on the used market right now?


----------



## AusWolf (Dec 24, 2021)

Vario said:


> It is, presently people will pay me $300 for a 5 year old 1060 6GB and pay over me $800 for a 3060 that costs me $400.  It is the most economical solution.  Have you looked at the cost right now on the used market?


I have. I just didn't know that your 3060 will only cost you $400 through the EVGA store. Unfortunately, that is something we don't get here in the UK.


----------



## Bomby569 (Dec 24, 2021)

AusWolf said:


> I have. I just didn't know that your 3060 will only cost you $400 through the EVGA store. Unfortunately, that is something we don't get here in the UK.


They aren't selling those cards directly; it's a waitlist system, and it's been closed for some time now. And it's that price plus taxes.


----------



## efikkan (Dec 24, 2021)

lexluthermiester said:


> And as you can see from those results, the 3060 performs on par with or beats out cards that were top tier just a few years ago and are still considered powerful enough for current highend gaming. If those older cards were good enough for gaming at 4k back then, they are still good now.


Having current mid-range cards compete with high-end cards from a couple of generations ago has been the norm for ages. What you fail to realize is that newer architectures have features which improve performance beyond what you would expect from specs like VRAM size alone.



lexluthermiester said:


> The question of whether or not it can perform well is academic. That leaves us the question of the topic, will a player using a 3060 benefit more from 12GB or the extra performance of the ti model. The simple answer is clear: Long term, the 12GB will age better. History has ALWAYS shown this. Cards with more VRAM stay relevant longer. It is fact, not opinion. Full stop, end of discussion.


You are *100% wrong*. The evidence shows the opposite. :facepalm:
Take a look at the evidence you yourself provided: (image cut)





A few facts are easily observable:
- As resolution grows, lower-VRAM cards of the same architecture, like the RTX 3060 Ti, 3070 and 3080, increase their advantage over the RTX 3060 12GB. This clearly shows that as workloads increase, other factors become more important than extra VRAM, and it is a very good indicator of further scaling (look at the massive advantage the 3080 has at 4K!).
- Additionally, looking at the previous-gen RTX 2080, 2080 Super etc., you see the same pattern: at higher resolutions and detail levels, they pull further ahead of the RTX 3060 12 GB.
So the evidence is clear: VRAM is not a larger scaling factor than overall performance.

You are also ignoring the fact that using more VRAM will always require more bandwidth and computational power. The amounts of raw computational power, texture mapping and memory bandwidth are all fixed throughout the lifetime of a product. So anyone who has a rudimentary understanding of rendering should understand the implications: if you want larger textures in the future, you have to sacrifice a lot of frame rate. For this reason, we can conclude that the RTX 3060 12 GB will be obsolete long before it can outperform the RTX 3060 Ti 8 GB due to VRAM limitations. The RTX 3060 12 GB will *never* objectively be the better card.
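The resolution-scaling observation is easy to quantify (a rough sketch assuming per-frame shading and bandwidth cost grows at least linearly with pixels rendered, which ignores caches and CPU limits):

```python
# Pixel counts per frame at common resolutions, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} MPix, {w * h / base:.2f}x the pixels of 1080p")
```

4K pushes exactly four times the pixels of 1080p, so any per-pixel bottleneck (shading, bandwidth) grows at least fourfold while the VRAM pool stays the same size.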

I'm trying to tell you politely: you clearly have no idea what you are talking about, you don't even grasp the basics of how GPUs work, so please get the knowledge before you make such incorrect claims. I just want to stop this spread of misinformation. I know you are smarter than this, so listen and learn. Have a merry Christmas.


----------



## lexluthermiester (Dec 24, 2021)

efikkan said:


> Having current mid-range cards compete with high-end cards from a couple of generations ago has been the norm for ages. What you fail to realize is that newer architectures have features which improve performance beyond what you would expect from specs like VRAM size alone.
> 
> 
> You are *100% wrong*. The evidence shows the opposite. :facepalm:
> ...


Seriously with that nonsense? Yeah, you're done. Bye bye.


----------



## Bomby569 (Dec 24, 2021)

This got completely out of control, but I must agree with Lex here. The RX 480 and the RX 580 are great cards that can still do a lot of gaming, but the 4GB versions are completely bottlenecked by the low VRAM. Whatever you saved by buying the low-VRAM version really isn't worth it unless you planned on upgrading in a short time, and a lot of people keep their cards for a long time, with plenty of use left for someone else on the used market after that.

Sure, you can still game, but you are really not using its full power and have to scale back because of the VRAM. This seems obvious to me, at least.


----------



## AusWolf (Dec 24, 2021)

Bomby569 said:


> This got completely out of control, but I must agree with Lex here. The RX 480 and the RX 580 are great cards that can still do a lot of gaming, but the 4GB versions are completely bottlenecked by the low VRAM.


There is some disagreement here:











lexluthermiester said:


> Seriously with that nonsense? Yeah, you're done. Bye bye.


Did you look at the graphs in the review you linked earlier?


----------



## lexluthermiester (Dec 24, 2021)

AusWolf said:


> Did you look at the graphs in the review you linked earlier?


I did, if the context of my point is not being understood, that's not my problem.


----------



## freeagent (Dec 24, 2021)

Just curious, because I can only game at 1080p/60 right now..

In a game like RE2, in the graphics settings you can choose to exceed your video memory, I select ok and bam looks fantastic.. same with FC5 I can run it with maxed out settings and it looks and runs great.. but that's at 1080p, I would assume the lack of memory would show up more in higher resolutions?


----------



## oxrufiioxo (Dec 24, 2021)

freeagent said:


> Just curious, because I can only game at 1080p/60 right now..
> 
> In a game like RE2, in the graphics settings you can choose to exceed your video memory, I select ok and bam looks fantastic.. same with FC5 I can run it with maxed out settings and it looks and runs great.. but that's at 1080p, I would assume the lack of memory would show up more in higher resolutions?




Resident Evil Village is the only game where, at 4K with max settings and ray tracing, the 8GB cards choke hard and lose to weaker GPUs that come with more VRAM, afaik.

1080p and 1440p are fine.


----------



## Bomby569 (Dec 24, 2021)

freeagent said:


> Just curious, because I can only game at 1080p/60 right now..
> 
> In a game like RE2, in the graphics settings you can choose to exceed your video memory, I select ok and bam looks fantastic.. same with FC5 I can run it with maxed out settings and it looks and runs great.. but that's at 1080p, I would assume the lack of memory would show up more in higher resolutions?



I tried to max it and it became a slideshow, but I'm at 144 Hz; I think that would happen even at 60, though.



oxrufiioxo said:


> Resident Evil Village is the only game where, at 4K with max settings and ray tracing, the 8GB cards choke hard and lose to weaker GPUs that come with more VRAM, afaik.
> 
> 1080p and 1440p are fine.



That's absolutely not true; it struggles at 1440p if you go past the VRAM limit. Ask me how I know.


----------



## chrcoluk (Dec 25, 2021)

I think one factor when considering how much VRAM is needed is how much of a game you play in a single session. This is perhaps an area where one could argue benchmarking won't properly represent the playing experience. E.g. in Tales of Berseria, if I just stay in one area, the VRAM usage may only be a couple of gigs, but if I start exploring the world, fighting battles, letting FMVs play and so forth, it will consume triple that amount.

Now, I did do an experiment, as I am aware of the allocated-vs-used argument. I let a browser leak VRAM by selecting D3D9 as the ANGLE API and watched some YouTube videos until the VRAM allocation was high enough that only 2 gigs were left, then launched Tales of Berseria. The game doesn't crash, but there are a lot of stutters and mini freezes; e.g. the game kept momentarily freezing whenever I used a mystic arte. An interesting experiment. When I tried the same thing with a newer and theoretically more demanding game, Final Fantasy 15 (a game which uses lots of VRAM), there were no stutters, but there were lots of texture pop-in issues which are not normally there, I assume due to eviction of textures to free up memory. So VRAM is kind of a quality-of-life thing and a potential stutter mitigation, depending on how well the game handles streaming textures in. There were other games I tested which had no issues at all.
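For anyone wanting to reproduce this kind of pressure test, a rough way to watch real VRAM headroom from outside the game is to poll `nvidia-smi` (shipped with the NVIDIA driver) while playing. A minimal sketch; the 2 GiB threshold mirrors the experiment above and is otherwise an arbitrary choice:

```python
import subprocess

def parse_memory(csv_line: str) -> tuple[int, int]:
    """Parse 'used, total' MiB values from one nvidia-smi CSV line."""
    used, total = (int(field.strip()) for field in csv_line.split(","))
    return used, total

def query_vram(gpu_id: int = 0) -> tuple[int, int]:
    """Ask nvidia-smi for current VRAM usage (used MiB, total MiB)."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_id}",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_memory(out.strip().splitlines()[0])

if __name__ == "__main__":
    try:
        used, total = query_vram()
    except (OSError, subprocess.CalledProcessError):
        # No NVIDIA driver on this machine; fall back to a sample reading
        used, total = parse_memory("6144, 8192")
    free = total - used
    print(f"VRAM: {used}/{total} MiB used, {free} MiB free")
    if free < 2048:  # roughly the headroom left in the experiment above
        print("Low headroom: expect stutter or texture pop-in in VRAM-hungry games")
```

Run it in a loop while exploring, fighting, and letting FMVs play, and you can see the working set grow across a session rather than trusting a single benchmark pass.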


----------



## Vayra86 (Dec 25, 2021)

neatfeatguy said:


> But I thought having 12GB of VRAM on a weaker card was better over having 8GB on a faster card?
> 
> I'm so confused. CP2077 can utilize up to 10GB of VRAM when RT is enabled (upwards of 7GB without RT), according to TPU and their benchmarking. Why is a 3060Ti with only 8GB out pacing a 3060 with 12GB? Last I checked 12 > 8.
> 
> ...



Maybe the comparison now should focus on 6 vs 8 GB to see the gap. There, FC6 shows its struggle. Older Ubisoft titles had similar issues; it's an engine quirk, but since the engine is used a lot, it's there anyway. Pop-in has been a thing in Far Cry since part 3 as well.

That's why I'm saying: look back, because history repeats, and higher-VRAM cards just tend to last longer and resell for more on second-hand markets.

The cards that stood out, from midrange to high end, over the last ten years have all been higher-VRAM cards. Most notably the 7970, which would eclipse everything even years down the line thanks to the combo of 3GB and a wide bus, and the 980 Ti, which is still relevant with its 6GB while any Fury X with 4GB (despite superb bandwidth!) is utterly pointless right now. The 1080 Ti is in a similar place; it will be relevant for quite a while still, even at 5-6 years of age. With its 11GB it will eat any older, texture-modded game at any res even ten years from now.

These differences don't show up in the first couple of years post launch. But in years 3-5, they do. It's surprising even to me that we are already hearing about issues with 10GB in certain titles. At the same time, the consoles with 16GB are pushing that mainstream bottom end up rapidly. It's a perfect storm for low-VRAM cards to fall off faster than usual, or at least to require post-launch dev/Nvidia TLC through patching or drivers. It's for that reason Nvidia has been shitting game-ready drivers out of every hole for so long now.

And we can then complain that 'AMD is pushing useless stuff to fill 16 gigs', but what really happens is that devs use the resources because it saves them effort, as they always have: once hardware goes mainstream, it gets used.


----------



## Bomby569 (Dec 25, 2021)

chrcoluk said:


> I think one factor when considering how much VRAM is needed is how much of a game you play in a single session. This is perhaps an area where one could argue benchmarking won't properly represent the playing experience. E.g. in Tales of Berseria, if I just stay in one area, the VRAM usage may only be a couple of gigs, but if I start exploring the world, fighting battles, letting FMVs play and so forth, it will consume triple that amount.



For my part on this, I did test my 3060 Ti with RE Village and went past the VRAM limit. I was playing fine until it hit either some time limit or some zone (the game makes you go back and forth a lot) and it simply couldn't handle it anymore; the fps started to drop off a cliff. It got so bad I was hearing the voices while the image didn't move anymore. I had to sit in the menu for a couple of minutes to get the image moving again.
It never crashed or got unresponsive to the point where I couldn't reach the menu, and it never happened again in my 9 hours or so, if anyone is wondering.


----------



## AusWolf (Dec 25, 2021)

Bomby569 said:


> I tried to max it and it became a slideshow, but I'm at 144 Hz; I think that would happen even at 60, though.
> 
> That's absolutely not true; it struggles at 1440p if you go past the VRAM limit. Ask me how I know.


That's strange. I have absolutely no issues running it maxed out at 1080p with a 2070. I played the demo for nearly an hour two days ago at a rock solid 40 fps.


----------



## Vayra86 (Dec 25, 2021)

AusWolf said:


> This!
> 
> Let's not forget the fact that GPU performance improves through the years just as much as VRAM capacity does (if not more). Your "superior" 12 GB 3060 might run out of computational power long before more powerful, but lesser VRAM-equipped cards run out of VRAM. Futureproofing is a myth.
> 
> ...



Not sure why, but it's clearly not VRAM related. Looks like a Halo problem. Look at the 1080 Ti and several other older cards. The 1660 Ti got driver TLC that Pascal didn't.


----------



## AusWolf (Dec 25, 2021)

Vayra86 said:


> Not sure why, but *it's clearly not VRAM related*. Looks like a Halo problem. Look at the 1080 Ti and several other older cards. The 1660 Ti got driver TLC that Pascal didn't.


That's exactly what I mean.


----------



## Vayra86 (Dec 25, 2021)

AusWolf said:


> That's exactly what I mean.



But it doesn't prove or disprove anything. It just shows us Halo might play better on a newer architecture, as the 1660 Ti is Turing.

Far Cry 6 runs on the ancient Dunia engine... so there we have something of a conclusion, perhaps: newer cards will use newer engines more effectively. But old engines are also used and improved over (several) decades, so this doesn't work out well as an example for your point.


----------



## AusWolf (Dec 25, 2021)

Vayra86 said:


> But it doesn't prove or disprove anything. It just shows us Halo might play better on a newer architecture, as the 1660 Ti is Turing.
> 
> Far Cry 6 runs on the ancient Dunia engine... so there we have something of a conclusion, perhaps: *newer cards will use newer engines more effectively.* But old engines are also used and improved over (several) decades, so this doesn't work out well as an example for your point.


^This is exactly what it proves. VRAM isn't everything; you also need more computational power and more efficient architectures for newer games.


----------



## Vayra86 (Dec 25, 2021)

AusWolf said:


> ^This is exactly what it proves. VRAM is not everything, you also need more computational power and more efficient architectures for newer games.



And then Far Cry 6 sidesteps that statement entirely. The majority of games get built on old or refined engines. In practice you will always be using a wild mix of old and new stuff. And really, a lot of new core power gets spent on silly stuff you really won't miss, like any of a dozen blur filters and overexpensive settings. Cut that away and the balance quickly tips toward needing VRAM as time passes.


----------



## AusWolf (Dec 25, 2021)

Vayra86 said:


> And then Far Cry 6 sidesteps that statement entirely. The majority of games get built on old or refined engines. In practice you will always be using a wild mix of old and new stuff. And really, a lot of new core power gets spent on silly stuff you really won't miss, like any of a dozen blur filters and overexpensive settings. Cut that away and the balance quickly tips toward needing VRAM as time passes.


But then, Far Cry 6 is a game that runs at over 30 FPS on a 6GB 2060 at 4K.

When you're playing old games, you don't need to care about either VRAM or compute power. New stuff is a mixture of both, but a lot depends on your settings, too. If you don't game at 4K, or you're willing to put some settings down a notch, you're fine with less VRAM, but compute power is always needed, regardless of settings used.


----------



## Vayra86 (Dec 25, 2021)

AusWolf said:


> But then, Far Cry 6 is a game that runs over 30 FPS on a 6 GB 2060 at 4K.
> 
> When you're playing old games, you don't need to care about either VRAM or compute power. New stuff is a mixture of both, but a lot depends on your settings, too. If you don't game at 4K, or you're willing to put some settings down a notch, you're fine with less VRAM, but compute power is always needed, regardless of settings used.



You keep talking about 4K, but I never did. The impact of resolution on VRAM is not that great; the issues occur in certain games irrespective of res. It's about the required bottom line, and this is also why at some point you are forced to dial down graphics settings that really impact IQ.


----------



## AusWolf (Dec 25, 2021)

Vayra86 said:


> You keep talking about 4K but I never ever did


The reason I mentioned 4K is that 1. other people have mentioned it, and 2. as far as I can see, it's the only setting where VRAM really matters.



Vayra86 said:


> but the impact of resolution is not that great on VRAM.


To be fair, I see higher-VRAM cards pull ahead only at 4K in some games. At lower resolutions, pure compute power wins across the board. Though I agree that even this "pulling ahead" is not that great.

To bring my point back to the OP: if we disregard 4K data (and the myth of futureproofing), the 3060 Ti beats the 3060 12GB every single time. The extra 4GB of VRAM doesn't give the 3060 enough advantage to beat the Ti in any situation.


----------



## lexluthermiester (Dec 25, 2021)

AusWolf said:


> 2. as far as I see, it's the only setting where VRAM really matters.


This is not correct. Games will dynamically extend draw distance if more VRAM is detected, and some games will even default to higher-quality textures. There are a lot of factors that go into how games run, and this applies to 1440p as well as 1080p. That said, 4K is the larger concern; the 3060 12GB is a capable 4K card. For perspective, the GTX 1080 was a capable 4K card and the 3060 smokes it on all levels.


----------



## efikkan (Dec 25, 2021)

AusWolf said:


> To be fair, I see higher-VRAM cards pull ahead only at 4K in some games. At lower resolutions, pure compute power wins across the board. Though I agree that even this "pulling ahead" is not that great.
> 
> To bring my point back to the OP: if we disregard 4K data (and the myth of futureproofing), the 3060 Ti beats the 3060 12GB every single time. The extra 4GB of VRAM doesn't give the 3060 enough advantage to beat the Ti in any situation.


As you can see in #217, the RTX 3060 Ti and RTX 3070 (both 8GB) increase their lead over the RTX 3060 12GB at higher resolutions and detail levels (on average). Any case where this doesn't happen is probably due to inferior engine design rather than VRAM.

So let's talk some math and technical details again.
While this should be obvious: if your target frame rate is 60 FPS, then 12GB of VRAM on the RTX 3060 is not going to help much, as its bandwidth of 360 GB/s only permits reading a maximum of 6 GB per frame (and that only if reads are 100% optimal, the same data is never read twice, the game never reuses any resources, and there is a single render pass), while the RTX 3060 Ti, at 448 GB/s, could manage to read ~7.5 GB of its 8GB of VRAM per frame. The RTX 3060 Ti also has a 26-45% higher texture fill rate, so there should be little doubt which GPU is more capable of handling higher texture loads. In reality, the case for the RTX 3060 12GB is even worse: most games will reuse the same textures multiple times, most have several render passes (e.g. for reflections), and all definitely reuse temporary buffers multiple times, leaving too little bandwidth to actually utilize that 12GB of VRAM and sustain a high frame rate.

Now to the less obvious part. You can probably see that most games get bottlenecked by computational performance as details increase. The underlying reason is not obvious to non-programmers. As resolution increases, the computational workload for pixel/fragment shaders is fairly linear in the number of pixels on screen, but the texture data the TMUs sample comes from a larger block of memory, which means running multiple samples simultaneously gets a "streaming bonus" from VRAM. So even as the game increases the detail level, the computational workload nearly always grows quicker.
Secondly, there is the issue of mipmaps. Even in a game with vast landscapes and incredible texture detail, the majority of those textures will, at any given time, be resident in VRAM at fairly low detail levels. (Here is a random video illustrating texture detail levels; now imagine this on a larger scene with more distant objects.)
Both of these elements combined contribute to a game being increasingly bottlenecked by computational resources as details increase, which seems contrary to what many of you would expect, but it is exactly what you see in the graph in #217. To those of us who have programmed with graphics APIs for many years, this stuff is obvious, and it is why others and I have been pointing out this VRAM misconception for years.
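On the mipmap point, it is worth noting that a full mip chain only adds about one third on top of the base level, since each level holds a quarter of the texels of the one above. A quick illustration, assuming uncompressed RGBA8 (4 bytes per texel); block-compressed formats scale everything down proportionally:

```python
def mip_chain_bytes(width: int, height: int, bytes_per_texel: int = 4) -> int:
    """Total bytes for a texture plus its full mip chain down to 1x1."""
    total = 0
    while True:
        total += width * height * bytes_per_texel
        if width == 1 and height == 1:
            return total
        # Each mip level halves both dimensions, clamped at 1 texel
        width, height = max(1, width // 2), max(1, height // 2)

base = 4096 * 4096 * 4               # 64 MiB base level, RGBA8
full = mip_chain_bytes(4096, 4096)   # base level plus every mip down to 1x1
print(f"base: {base / 2**20:.1f} MiB, with mips: {full / 2**20:.1f} MiB "
      f"(+{100 * (full - base) / base:.1f}% overhead)")
```

The geometric series (1 + 1/4 + 1/16 + ...) converges to 4/3, which is why distant objects can be textured almost for free from the low mip levels already resident in VRAM.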


----------



## Ibotibo01 (Dec 25, 2021)

If you use the RTX 3060 for 4 years at 1440p ultra settings (texture quality on ultra included), it will be more futureproof than the 3060 Ti over those 4 years; but if you use the RTX 3060 Ti for 2 years at 1440p ultra, it will be more futureproof than the 3060 over those 2 years.

The concept of futureproofing changes with people's choices, like playing on low settings or high settings, and furthermore with how many years people will use these GPUs.

I personally choose the RTX 3060 Ti over the 3060 because it has 4864 cores with 80 ROPs. The 3060 has 48 ROPs with 3584 cores; even the RTX 3050s have 40 ROPs with 2048-2560 cores. Of course VRAM is important, but almost half of the GPUs released after 2016 have 8GB of VRAM, so game developers will probably focus on 8GB usage. I think 8GB of VRAM will be enough for 2 years at ultra; after that, ultra textures will probably use 10GB (at 1080p).


----------



## Vario (Dec 26, 2021)

Messing around with undervolting and gaming on this 3060. Seems to run well at 0.9 V. Nice card, but I do wish for a bit more performance, so I'll probably go for the 3060 Ti when available.

edit:


----------



## eidairaman1 (Dec 28, 2021)

Vario said:


> The 780ti also aged terribly due to that 3GB.


970 did as well


----------



## Bomby569 (Dec 29, 2021)

someone pointed this out

the 1080 Ti and the 2080 Ti are around the same performance as the 3060 Ti, and they both have 11GB of VRAM. Does this make sense?


----------



## neatfeatguy (Dec 29, 2021)

Bomby569 said:


> someone pointed this out
> 
> the 1080ti and the 2080 ti are around the same performance as the 3060ti and they both have 11gb vram. Does this make sense?



In terms of relative performance, 1080Ti is about on par with the 3060. 2080Ti appears to be more on par with the 3060Ti.


----------



## Selaya (Dec 29, 2021)

No, the 2080 Ti performs like the 3070; the 1080 Ti is between the 3060 and the 3060 Ti.


----------



## Bomby569 (Dec 29, 2021)

neatfeatguy said:


> In terms of relative performance, 1080Ti is about on par with the 3060. 2080Ti appears to be more on par with the 3060Ti.



what does "about" mean in English?

but that's really missing the point here.


----------



## neatfeatguy (Dec 29, 2021)

Bomby569 said:


> what does "about" mean in English?
> 
> but that's really missing the point here.



About: used in this context, I guess you could say it means "close to" or something along the lines of "similar".




Let's put to rest the idea that more VRAM on a slower card is better for future gaming.

TechSpot got their hands on an RTX 2060 12GB and benched it. The card, even with more CUDA cores, is only about 4% faster than the 6GB model. The 12GB model has all the specs of a 2060 Super but keeps the original 192-bit bus of the 2060, and that is where its downfall is: choked by the lack of memory bandwidth.

TechSpot didn't bench 4K, but let's be honest, the 2060 and even the 3060 weren't designed for 4K. And for those who say "but DLSS!" about the 3060: not all games support DLSS. You need to go in expecting a game not to support it and simply go by the relative performance of the 3060.









More VRAM, But for Who? Nvidia RTX 2060 12GB Review (www.techspot.com): "Today we're taking a look at the new GeForce RTX 2060, you know, the 12GB model that gamers were hoping would help solve the GPU shortage, but..."


----------



## lexluthermiester (Dec 29, 2021)

neatfeatguy said:


> The card, even with more CUDA cores, is only about 4% faster than the 6GB model.


You're missing the point. More VRAM has NEVER been about going faster. More VRAM means more room for a game to play in, requiring fewer trips to system memory and ultimately to storage. More VRAM means a game can display more textures at higher quality.


neatfeatguy said:


> but let's be honest, 2060 and even the 3060 weren't designed for 4k resolution.


Rubbish. A 2060 Super is a capable 4K card; not with high or ultra settings, but still good quality. The 3060 is a large step above that. Any of the 3060s can do 4K well; only the 2060 6GB will struggle at 4K in some games.


neatfeatguy said:


> The 12GB model has all the specs of a 2060 Super but keeps the original 192-bit bus of the 2060, and that is where its downfall is: choked by the lack of memory bandwidth.


Your understanding of VRAM bandwidth needs adjustment.








NVIDIA GeForce RTX 2060 Specs (www.techpowerup.com): NVIDIA TU106, 1680 MHz, 1920 Cores, 120 TMUs, 48 ROPs, 6144 MB GDDR6, 1750 MHz, 192 bit



336 GB/s is more than enough bandwidth for most games.

Folks, how is the math difficult to understand? More VRAM equals good. Less VRAM equals not as good.


----------



## ixi (Dec 29, 2021)

lexluthermiester said:


> Folks, how is the math difficult to understand? More VRAM equals good. Less VRAM equals not as good.



Time to jump on the troll wagon.

Show us any current title where the 3060 beats the 3060 Ti at 4K, or even after 3, 4, 5 years.


----------



## AusWolf (Dec 29, 2021)

lexluthermiester said:


> You're missing the point. More VRAM has NEVER been about going faster. More VRAM means more room for a game to play in, requiring fewer trips to system memory and ultimately to storage. More VRAM means a game can display more textures at higher quality.


I think you're missing the point. OK, the game can load more assets from VRAM and fewer from system memory. Great! But what's the point if it barely gives you any detectable performance increase?



lexluthermiester said:


> Rubbish. A 2060Super is a capable 4k card, not with high or ultra settings, but still good quality. The 3060 is a large step above that. Any of the 3060's can do 4k well. Only the 2060 6GB will struggle at 4k with some games.


I respectfully disagree. Only an idiot (or a non-gamer) spends more money on a monitor than a graphics card, imo.



lexluthermiester said:


> Your understanding of VRAM bandwidth need adjustment.
> 
> 
> 
> ...


VRAM is not for games; it is for game assets and textures. The higher the detail and resolution you go, the more of it you need, just like bandwidth, computing power, and basically everything else. All I'm saying is that at higher resolutions, your requirement for GPU power and VRAM bandwidth increases more than your need for more VRAM does.



lexluthermiester said:


> Folks, how is the math difficult to understand? More VRAM equals good. Less VRAM equals not as good.


Yes, when you're comparing two otherwise identical graphics cards. Other than that, more compute power is better than more VRAM. Or are you saying that the very rare GT 730 4 GB is better than my GT 1030 2 GB?

The RX 480 8 GB is better than the RX 480 4 GB.
The GTX 960 4 GB is better than the GTX 960 2 GB.
Is the 3060 12 GB better than the 3060 *Ti* 8 GB or the 3070 (Ti) 8 GB? No. Just simply no.


----------



## lexluthermiester (Dec 29, 2021)

AusWolf said:


> I think you're missing the point.


No, I'm not.


AusWolf said:


> But what's the point, if it barely gives you any detectable performance increase?


More VRAM isn't supposed to increase performance from the GPU; it's meant to grant more operating space for the GPU to do its work. And you say *I'M* missing the point?


AusWolf said:


> I respectfully disagree.


Ok, disagree. Buy what you want.


AusWolf said:


> Only an idiot (or a non-gamer) spends more money on a monitor than a graphics card, imo.


Then you're calling a lot of very smart people idiots. Think that over for a moment..

This is not that complicated people. And I'm done debating this with brick walls. Buy whatever you want, be happy with it.


----------



## oxrufiioxo (Dec 30, 2021)

ixi said:


> Time to jump on the troll wagon.
> 
> Show us any current title where the 3060 beats the 3060 Ti at 4K, or even after 3, 4, 5 years.



I already did: Resident Evil Village at 4K with RT enabled, where it also beats the 3070... like halfway through this pointless argument. The OP has already decided what he wants to do, so any bickering over how much VRAM $400+ GPUs should come with is pointless at this point.


----------



## lexluthermiester (Dec 30, 2021)

oxrufiioxo said:


> The OP has already decided what he wants to do, so any bickering over how much VRAM $400+ GPUs should come with is pointless at this point.


Wait, they did? I thought they were still watching.. Bugger me, I'm out.

EDIT;
Yup, back on Page 6, Dec 13th. He got an EVGA 3060 XC 12GB


----------



## oxrufiioxo (Dec 30, 2021)

lexluthermiester said:


> Wait, they did? I thought they were still watching.. Bugger me, I'm out.



Maybe it got lost in all the random arguing over whether Nvidia giving people only the minimum amount of VRAM they'll need for the next two years is OK on $400+ GPUs...

He already has the 3060. He plans on grabbing the 3060 Ti when he gets notified in the EVGA queue, selling the 3060 for a large profit, then getting back in the queue for the 4060/4060 Ti when the 4000 series is announced. Seems to be his plan.


----------



## lexluthermiester (Dec 30, 2021)

oxrufiioxo said:


> Maybe it got lost in all the random arguing over whether Nvidia giving people only the minimum amount of VRAM they'll need for the next two years is OK on $400+ GPUs...
> 
> He already has the 3060. He plans on grabbing the 3060 Ti when he gets notified in the EVGA queue, selling the 3060 for a large profit, then getting back in the queue for the 4060/4060 Ti when the 4000 series is announced. Seems to be his plan.


See edit. Yeah, saw that. He might not though. Vario might realize it is a heck of a card and decide to keep it long term.


----------



## oxrufiioxo (Dec 30, 2021)

lexluthermiester said:


> See edit. Yeah, saw that. He might not though. Vario might realize it is a heck of a card and decide to keep it long term.



Definitely, nothing wrong with that if it suits his needs. With the current market being so terrible, he might be able to pick up a 40-series card for little out of pocket, so regardless, he'll have options, with none of them being bad.


----------



## AusWolf (Dec 31, 2021)

lexluthermiester said:


> More VRAM isn't supposed to increase performance from the GPU; it's meant to grant more operating space for the GPU to do its work. And you say *I'M* missing the point?


If it's not meant to increase performance, then what is it meant to do? "More operating space for the GPU" is useless if your GPU isn't fast enough to operate in that space at resolutions and settings that choke the GPU itself. The whole point of upgrading a gaming PC is more performance. If the only situation where a 3060 12GB delivers more performance than the 3060 Ti 8GB is Resident Evil Village at 4K, then it is the weaker card of the two. Simple as.



lexluthermiester said:


> Then you're calling a lot of very smart people idiots. Think that over for a moment..


If you like a slideshow, fair play to you. If I had a spare 5-600 quid, I'd much rather spend it on faster PC parts than on a bigger monitor that only necessitates the faster parts I didn't buy in the first place.

In the rest of your post, you're merely stating the fact that you disagree without any argument on your side, so please excuse me if I disregard it.



oxrufiioxo said:


> I already did: Resident Evil Village at 4K with RT enabled, where it also beats the 3070... like halfway through this pointless argument. The OP has already decided what he wants to do, so any bickering over how much VRAM $400+ GPUs should come with is pointless at this point.


That's one game at one resolution. Anything else?



oxrufiioxo said:


> Maybe it got lost in all the random arguing over whether Nvidia giving people only the minimum amount of VRAM they'll need for the next two years is OK on $400+ GPUs...
> 
> He already has the 3060. He plans on grabbing the 3060 Ti when he gets notified in the EVGA queue, selling the 3060 for a large profit, then getting back in the queue for the 4060/4060 Ti when the 4000 series is announced. Seems to be his plan.


Fair enough. No point in me (or anyone else) arguing any further, then. I'm still holding my opinion, though.


----------



## ixi (Dec 31, 2021)

oxrufiioxo said:


> I already did: Resident Evil Village at 4K with RT enabled, where it also beats the 3070... like halfway through this pointless argument. The OP has already decided what he wants to do, so any bickering over how much VRAM $400+ GPUs should come with is pointless at this point.
> 
> 
> View attachment 230584











Image battle, let's go; the TPU review shows differently!

Gratz on the GPU, OP! Game on.


----------



## oxrufiioxo (Dec 31, 2021)

ixi said:


> View attachment 230761View attachment 230762
> 
> 
> Image battle, let's go; the TPU review shows differently!
> ...



Unfortunately, neither Guru3D nor TPU benchmarks the whole game, likely only a sub-3-minute section, each in different parts of the game; that doesn't make either wrong, though.


----------



## MentalAcetylide (Dec 31, 2021)

More VRAM = more headroom = better performance at higher resolutions in games. To someone like me who does rendering, CUDA/RT cores and VRAM are equally important, since the cores improve rendering speed, but that extra speed won't mean anything if there isn't sufficient VRAM. In games, the extra VRAM will allow for higher-resolution gaming, but if the other specs of the card don't get bumped up along with it, the FPS is probably going to "septic-tank" when you go to a higher resolution, especially with RT enabled. Anyway, I would never expect a 3060 to perform like a 3070 or 3080 no matter how much VRAM it is given.


----------



## wolf (Dec 31, 2021)

oxrufiioxo said:


> Resident Evil Village at 4K with RT enabled, where it also beats the 3070


Despite the subsequent reply showing a different story, let's go with it... and despite it being a borderline situation for either card, who would buy either expecting 4K gaming?

If you had the 3060 Ti/3070 and encountered this "issue", would you not lower the texture setting literally one notch, likely not notice the IQ difference (and even if you did, are the textures now mud?), and instead notice the ~30% uptick in performance from the faster card?

Come on... the faster card is the better bet here for 99% of buyers. You want the 3060 12GB for futureproof max textures despite a substantial GPU performance deficit? You're the outlier.


----------



## oxrufiioxo (Dec 31, 2021)

wolf said:


> Despite the subsequent reply showing a different story, lets go with this.
> 
> If you had the 3060ti/70, and encountered this "issue", would one not lower texture setting literally one notch, likely not notice the IQ difference (lets even say one did, are the textures now mud?), but notice the ~30% uptick in performance from the faster card?
> 
> Come on... Faster card here is the better bet, for 99% of buyers. You want the 3060 12Gb for future proof max texture / substantial GPU performance deficit reasons? you're the outlier.



I'm more of the mindset that the 3060 Ti and 3070 should come with more VRAM, not debating how much is the bare minimum for the next couple of years. Everyone just has to vote with their wallet, and let's be real: anyone who can get a GPU near MSRP right now should, especially those in need of an upgrade. I even stated in my original post that although I don't personally care for either GPU, for different reasons, the 3060 Ti is the better of the two.


----------



## eidairaman1 (Dec 31, 2021)

The 3060 and 3060 Ti are different GPU dies; getting more from the 3060 requires a die and VRAM OC. VRAM just offloads work from the CPU, RAM, and HDD/SSD.

A 3060 is not going to magically beat a 3060 Ti; this ain't like GeForce 2, 3, 4...


----------



## lexluthermiester (Dec 31, 2021)

AusWolf said:


> If you like a slideshow, fair play to you.


Slideshow? Have you even used a 3060? Do you have a 4K display? I've actually put that to the test. Turn a few settings down and a 3060 does 4K very well.



eidairaman1 said:


> A 3060 is not going to magically beat a 3060ti


No one said that. There are a few here who think I implied it, but context seems lost. However, the OC models most vendors are selling have a very respectable boost over stock clocks.



ixi said:


> Image battle lets go, TPU review show differently!
> ...


That is ONE example of a game. And you'll notice the legend at the top that gives context: Highest, RT on, TPU custom bench scene. That is W1zzard's way of testing everything to the max with that game engine. And wouldn't you know it, the 3060 12GB does fine at MAX settings. Yes, I consider anything above 30 fps playable. Not optimal, but easily playable. However, when settings are config'd to more reasonable adjustments, 60 fps is easily achieved.

Also take note: with this game example at 4K, the game is brushing up close to the 8GB line. Some games exceed it already. This makes the 12GB model desirable, as the future of gaming will push past 8GB.


----------



## MentalAcetylide (Dec 31, 2021)

lexluthermiester said:


> Yes, I consider anything above 30fps playable.


Ugh, 30 FPS is really bad for me in World of Tanks. While it is playable, it can be a real ball ache. When it starts dipping below 50 or so, it becomes more of a chore to control your tank and aim, especially when there's a lot of foliage, trees, and structures, and/or several or more tanks on your screen firing around you. When it comes to 30-vs-30 Frontline matches, I don't think even 60 FPS would be enough: you can have 20 or more tanks on the screen, and if you're playing artillery, you can easily have 30+ tanks on screen in overhead view.


----------



## lexluthermiester (Dec 31, 2021)

MentalAcetylide said:


> Ugh, 30 FPS is really bad for me in World of Tanks. While it is playable, it can be a real ball ache.


It can depend on the game. In some games, 30 fps is perfectly enjoyable; in others it's borderline unacceptable. As a rule I agree, and as I said above, it's not optimal.


----------



## AusWolf (Dec 31, 2021)

lexluthermiester said:


> Slideshow? Have you even used a 3060? Do have a 4k display? I've actually put that to the test. Turn a few settings down and a 3060 does 4k very well.


I have a 2070, which is essentially a 3060 with 4 GB less VRAM and a wider memory bus. I don't game at 4K because 1. it's too expensive for me, and 2. I'm really not that bothered by the resolution hype. I'm happy to test some stuff, though (using driver-level upscaling).



lexluthermiester said:


> Yes, *I consider anything above 30fps playable*. Not optimal, but easily playable. However, when settings are config'd to more reasonable adjustments, 60fps is easily achieved.


Me too. That's why I found my experience testing Resident Evil Village with maxed-out settings totally fine. Of course, it's only 1080p, but I still don't think one should expect a miracle from an xx60 (or last-gen xx70) level card in the latest games at 4K... especially not in any of the latest RE games, which have been notoriously and unreasonably harsh on VRAM usage.


----------



## seth1911 (Jan 4, 2022)

I'll prefer the 12GB.

I play mostly open-world games like No Man's Sky, the Hitman series, and MMORPGs on PC.


----------



## Mussels (Jan 5, 2022)

AusWolf said:


> If it's not meant to increase performance, then what is it meant to do? "More operating space for the GPU" is useless if your GPU isn't fast enough to operate in that space at resolutions and settings that choke the GPU itself. The whole point of upgrading a gaming PC is more performance. If the only situation a 3060 12 GB delivers more performance than the 3060 Ti 8 GB is in Resident Evil Village at 4k, then it is the weaker card of the two. Simple as.


More RAM and more VRAM do not increase performance.
They remove a cause of performance loss, when you run out of available RAM/VRAM.


Adding more tires to a bigass truck does not change engine performance one little bit, but it does smooth out the ride
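
That point can be sketched as a toy model (illustrative numbers only, not measurements): frame time stays flat until the working set spills out of VRAM, then every spilled gigabyte adds a penalty for the traffic over PCIe.

```python
def frame_time_ms(working_set_gb, vram_gb, base_ms=10.0, spill_ms_per_gb=15.0):
    """Toy model: performance is unchanged until VRAM runs out,
    then each spilled GB adds a fixed penalty (PCIe traffic)."""
    spill = max(0.0, working_set_gb - vram_gb)
    return base_ms + spill * spill_ms_per_gb

# With a 7 GB working set, 8 GB and 12 GB cards behave identically...
print(frame_time_ms(7, 8) == frame_time_ms(7, 12))   # prints True
# ...but a 10 GB working set only hurts the 8 GB card.
print(frame_time_ms(10, 8), frame_time_ms(10, 12))   # prints 40.0 10.0
```

Below the spill point, extra VRAM buys you exactly nothing; past it, it's the difference between smooth and stuttering.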


----------



## lexluthermiester (Jan 5, 2022)

Mussels said:


> More RAM and more VRAM do not increase performance.
> They remove a cause of performance loss, when you run out of available RAM/VRAM.
> 
> 
> Adding more tires to a bigass truck does not change engine performance one little bit, but it does smooth out the ride


Exactly. More RAM only expands capabilities and prevents unnecessary slow-downs.


----------



## chrcoluk (Jan 5, 2022)

neatfeatguy said:


> About - used in this context I guess you could say it would mean "close to" or something along the lines of "similar"
> 
> 
> 
> ...



They did, but the problem with reviews is that they don't really do long gaming runs that allow VRAM to get saturated, and they tend to bench very well-optimised games which don't seem to have texture asset-swapping issues. I don't see Lightning Returns benched, for example, which can still only do 37 FPS in one of the towns on my 3080.

Finally, they don't test mods, which can send VRAM usage through the roof; neither have I ever seen optional 4K texture mod packs benched. So we're not seeing a clear picture, but if your choice of games is the ones that keep being picked for reviews and you don't explore much outside benchmark areas, then fair enough.

With that said, I do agree that raw rendering performance is usually going to be the primary factor. VRAM should be considered when you know the GPU will bottleneck in your use scenario, or when the performance differential is only single digits. I would, for example, take a 5% slower card that had 30% more VRAM, especially when you consider that turning down textures in a game is usually extremely noticeable versus, say, toning down fog density.


----------



## eidairaman1 (Jan 9, 2022)

Mussels said:


> More RAM and more VRAM do not increase performance.
> They remove a cause of performance loss, when you run out of available RAM/VRAM.
> 
> 
> Adding more tires to a bigass truck does not change engine performance one little bit, but it does smooth out the ride



Yeah tell that to rubber band tire runners


----------



## AusWolf (Jan 9, 2022)

Mussels said:


> More RAM and more VRAM do not increase performance.
> They remove a cause of performance loss, when you run out of available RAM/VRAM.











NVIDIA GeForce RTX 2060 12 GB Review (www.techpowerup.com):

> NVIDIA stealth-launched the GeForce RTX 2060 12 GB last month. We bought a card in retail, so we can find out how much of a difference doubling the VRAM from 6 GB to 12 GB makes, and how much of the performance gains can be attributed to the increased GPU core counts.

... if there is a performance loss in the first place. Notice that the 12 GB 2060 can't beat the 8 GB 2060 Super in basically any instance. Those two cards have very similar GPUs, not like the 3060 Ti which is significantly faster than the 3060. This is why I'm saying that going for more VRAM is only a no-brainer when you're comparing two cards with identical GPUs.


----------



## lexluthermiester (Jan 9, 2022)

AusWolf said:


> Notice that the 12 GB 2060 can't beat the 8 GB 2060 Super in basically any instance.


You're missing context. That card is a hybrid between the 192-bit bus of the 2060 and the greater number of compute units of the 2060S. It performs right where it's supposed to. What makes this card a winner is the fact that it exists and is not garbage. Then account for the fact that most AIBs will make OC models, and the picture gets rosier. Then account for the reality that miners are going to love this card for memory-intensive cryptocoins and will prefer it over the 30x0 cards, as the ROI will be much better.

Your argument really isn't one. Please don't take that personally; I'm just trying to give you a bigger-picture view.


----------



## Vicious (Jan 9, 2022)

Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?


----------



## AusWolf (Jan 9, 2022)

lexluthermiester said:


> That card is a hybrid between the 192bit bus of the 2060 and the greater number of compute units of the 2060S.


Exactly.
The point is: bus width matters more than an extra 4 GB of VRAM on a 2060 (Super) class card. If there is a difference in GPU config as well, like with the 3060 and 3060 Ti pair, the difference is much more pronounced.
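
For reference, the bus-width point in raw numbers: peak bandwidth is the bus width in bytes times the per-pin data rate. The data rates below are the commonly listed GDDR6 speeds for these cards and should be treated as assumptions.

```python
def peak_bw_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width / 8) bytes transferred
    per cycle times the effective per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed data rates: 3060 at 15 Gbps on a 192-bit bus,
# 3060 Ti at 14 Gbps on a 256-bit bus.
print(peak_bw_gbs(192, 15))   # prints 360.0 (GB/s)
print(peak_bw_gbs(256, 14))   # prints 448.0 (GB/s)
```

So despite the Ti's lower per-pin rate, the wider bus gives it roughly 25% more raw bandwidth.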



lexluthermiester said:


> What makes this card a winner is the fact that it exists and is not garbage.


I totally agree with that, and I was not disputing its relevance. I was only stating that factors other than VRAM capacity matter more in the 2060-3060 range.


----------



## lexluthermiester (Jan 9, 2022)

AusWolf said:


> The point is: bus width matters more than an extra 4 GB VRAM





AusWolf said:


> I was only stating that factors other than VRAM capacity matter more at the 2060-3060 range.


Again, that depends greatly on the usage scenario. Some programs/games will use the extra space to cache more data. While this will not speed up the GPU, it will prevent slowdowns due to system RAM/storage access. That is often the most important benefit.


----------



## Mussels (Jan 10, 2022)

Vicious said:


> Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?


If they run RT off with DLSS? A few people.

I game at 4K with a GTX 1080, but not at ultra settings.


----------



## VeqIR (Jan 26, 2022)

Vicious said:


> Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?


Not everyone buys a 4K monitor purely for gaming; some might be doing photo processing too, for example. And not everyone can afford the current high-end cards. A good 4K monitor is quite accessible in price nowadays (I bought an open-box 28" 4K IPS monitor at Microcenter for $220 plus tax for a family member, for example), whereas getting a GPU is highway robbery. Making do with midrange cards is the reality for a lot of people who don't have an extra $1,000+ to spend on just a GPU.


----------



## systemBuilder (Jan 31, 2022)

The 3060 Ti is basically a 3070, gimped.
It was my first choice, too.
Realize that if everybody is recommending a 3060 Ti, it's so popular you will have ZERO chance of getting one!
I went for a 3070 Ti (Newegg lottery) and won on the first day. It's a widely hated card; that's why I won on the first day. Didn't pay $600 - paid $980 - but MSRPs are a fiction anyway, especially with inflation (7.5% since MSRPs were announced), 25% made-in-China tariffs, and TSMC raising prices 10-20% this year (2022).


----------



## AusWolf (Jan 31, 2022)

systemBuilder said:


> The 3060Ti is basically a 3070, gimped.
> It was my first choice, too.
> Realize that if everybody is recommending a 3060Ti, that means it's so popular you will have ZERO chance of getting one!
> I went for a 3070Ti (newegg lottery), and won on the first day.  It's a widely hated card therefore i won it on the first day.  Didn't pay $600 - paid $980 - but MSRPs are a fiction anyway, especially with inflation (7.5% inflation since MSRPs were announced), 25% made-in-China tariffs, and TSMC raising prices 10-20% this year (2022).


Honestly, I would love to have a 3070 Ti. Considering how the development of game technologies has slowed, I'm sure it would be enough for a good couple of years. My only issue is its price. Not really its price alone, but paying £1,000-1,100 for an 8 GB graphics card when I've already got one (2070) is a bit steep. I'd gladly buy a 16 GB model and call it a day until 2025, if Nvidia ever bothered releasing it.


----------



## Bomby569 (Jan 31, 2022)

Going back to this: Far Cry 6 has an HD texture pack that requires 11 GB and doesn't work on a 3060 Ti. Ask me how I know.

A card like the 3060 Ti or the 3070 with 8 GB was a mistake, especially at the prices these cards sell for, and I'm only considering MSRP, not the real prices.


----------



## neatfeatguy (Jan 31, 2022)

Bomby569 said:


> going back to this, Far cry 6 has a HD pack that requires 11GB, and doesn't work on a 3060ti, ask me how i know?
> 
> a card like the 3060ti or the 3070 with 8GB was a mistake, especially with the prices they sell this cards, and i'm only considering MRSP, not the real prices.



So, is it a problem with the GPUs, or with Ubisoft and their inability to properly build, design, and implement their HD textures? Seriously, one crappy game (in my opinion; not a fan of the Far Cry series, it's like Madden, CoD, or BF games... the same rehashed crap with every release) limits HD textures, and everyone constantly brings up that new GPUs can't run it because of the VRAM amount.

I'm not supporting Nvidia for having 8-10 GB of VRAM on their mid-high to high-end GPUs. I just hate that people are so hung up on FC6 and use it as the only game that cannot run its HD textures without more than 8-10 GB of VRAM.

So, again, is it a problem with the GPUs, or with Ubisoft and their inability to properly build, design, and implement their HD textures?


----------



## Bomby569 (Jan 31, 2022)

neatfeatguy said:


> So, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures? Seriously, one crappy game (in my opinion, not a fan of the Far Cry series, it's like Madden football games, or CoD games or BF games....same rehashed crap upon every release) that limits HD textures and everyone constantly brings it up that new GPUs can't run it because of the VRAM amount.
> 
> I'm not supporting Nvidia for having 8-10GB of VRAM on their mid-high to high end GPUs. I just hate that people are so hung up on FC6 and use it as the only game that cannot run the HD textures due to not having more than 8-10GB of VRAM.
> 
> So, again, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures?



I just gave other examples, like RE Village. That's two examples, and I don't even play that many games.


----------



## neatfeatguy (Jan 31, 2022)

Bomby569 said:


> i just gave other examples like RE village, there's 2 examples, and i don't even play that many games



I find it interesting that at 4K (RT off) Far Cry 6 only uses around 9.5 GB of VRAM, and I don't know what to say about RE Village. According to TPU, at 4K it only used just shy of 8 GB of VRAM.

The artificial limitation on VRAM amounts sounds like a ploy to make 8-10 GB GPUs look bad when it shouldn't really be a problem. Games and your system will allocate VRAM and swap textures very well without an artificial limit in place making it sound like you lack enough VRAM to run the game at the best settings.

I see no reason why any GPU in the mid-high to high range from AMD or Nvidia would have any issues running games with HD textures if the artificial limits weren't put in place. If the games had a suggested-VRAM notification, that would be fine: if you wanted to max out all settings at 4K in FC6, the game would show the suggested VRAM amount, showing how much you have and how much it suggests is needed, but you could run the game anyway. Your system will handle loading and offloading textures as needed. Let the system do what it was designed to do. If performance really does become an issue, then at that point the user can adjust settings.


----------



## Tetras (Jan 31, 2022)

neatfeatguy said:


> I find it interesting that 4k (RT off) FarCry 6 only uses around 9.5GB of VRAM and I don't know what to say about RE Village. According to TPU at 4k it only used just shy of 8GB of VRAM.
> 
> The artificial limitation of VRAM amounts sounds like a ploy to make 8-10GB GPUs bad when it shouldn't really be a problem. Games and your system will allocate VRAM and swap textures very well without having an artificial limit in place to make it sound like you lack enough VRAM to run the game at the best settings.
> 
> I see no reason why any GPU in the mid-high to high range from AMD or Nvidia would have any issues running games with HD textures if the artificial limits weren't put in place. If the games had a suggested VRAM notification, that would be fine. If you wanted to max out all settings at 4k for FC6 and the game showed you a suggested VRAM amount needed - showing how much you have an how much they suggest is needed, but you can run the game anyway....your system will handle loading and offloading textures as needed. Let the system do what it was designed to do. If the performance really does become an issue then at that point the user can work on adjusting settings.



You can see this in some of the 6500 XT testing: older cards, also with 4 GB but with proper PCI-E connectors, don't suffer anywhere near as badly as the 6500 (limited to only 4 lanes) does when available VRAM is exceeded. The loss of performance or visual quality in otherwise the same situation is always game-dependent to some degree.


----------



## xorstl (Jan 31, 2022)

neatfeatguy said:


> So, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures? Seriously, one crappy game (in my opinion, not a fan of the Far Cry series, it's like Madden football games, or CoD games or BF games....same rehashed crap upon every release) that limits HD textures and everyone constantly brings it up that new GPUs can't run it because of the VRAM amount.
> 
> I'm not supporting Nvidia for having 8-10GB of VRAM on their mid-high to high end GPUs. I just hate that people are so hung up on FC6 and use it as the only game that cannot run the HD textures due to not having more than 8-10GB of VRAM.
> 
> So, again, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures?


Not only that, but the xx60 cards are entry tier, not mid or high tier. The 70s are mid, the 70 Ti and up are mid-high or high. The 50s are always the "barely usable in gaming" ones and therefore can't even be considered entry tier for a gaming GPU.

We're given a choice of 12 GB, or 8 GB with much better bandwidth, for an entry-tier card; that's more than bloody enough. It's equivalent to the previous gen's mid-high tier, which is typically repeated between generations that had a decent leap, so it makes sense.

What's all the fuss about the 3060 Ti only having 8 GB? If your needs really require solid use of all 12 GB of a card, you won't buy a 3060 anyway, because you'll be bottlenecked by a shittier GPU and less memory bandwidth. The card is exactly in the range it should be; the problem is market prices prohibiting regular middle-class people from getting a mid-high-end GPU, not the characteristics of a shitty entry-level card being shitty. I have a 3060 Ti and would only be happier about it if I had paid MSRP; it behaves like a solid mid-tier card for everything except the 1% of cases where you need more than 8 GB of memory.


----------



## Bomby569 (Jan 31, 2022)

neatfeatguy said:


> I find it interesting that 4k (RT off) FarCry 6 only uses around 9.5GB of VRAM and I don't know what to say about RE Village. According to TPU at 4k it only used just shy of 8GB of VRAM.
> 
> The artificial limitation of VRAM amounts sounds like a ploy to make 8-10GB GPUs bad when it shouldn't really be a problem. Games and your system will allocate VRAM and swap textures very well without having an artificial limit in place to make it sound like you lack enough VRAM to run the game at the best settings.
> 
> I see no reason why any GPU in the mid-high to high range from AMD or Nvidia would have any issues running games with HD textures if the artificial limits weren't put in place. If the games had a suggested VRAM notification, that would be fine. If you wanted to max out all settings at 4k for FC6 and the game showed you a suggested VRAM amount needed - showing how much you have an how much they suggest is needed, but you can run the game anyway....your system will handle loading and offloading textures as needed. Let the system do what it was designed to do. If the performance really does become an issue then at that point the user can work on adjusting settings.



I literally own the games and the card. Both games get to a point where the frame rates become unplayable, both at 1440p. It doesn't happen immediately; it's after some time of play in both games. The rest depends on the settings; if the card can do more, why would I disable RT or lower the settings?

I even tried 1080p in FC6 and it's just as bad, and there you can literally max everything: all ultra, RT on.


----------



## Frick (Jan 31, 2022)

Bomby569 said:


> i literally own the games and the card. Both games get to a point they get unplayable frame rates, both on 1440p, it doesn't happen imediately, it's after some time of play that triggers it in both games. The rest depends on the settings, if the card can do more why would i disable RT or lower the settings?
> 
> I even tried 1080p on FC6 as is just as bad, as you can literally max everything, all ultra, RT on.



That just sounds like a poorly made game tbh.


----------



## xorstl (Jan 31, 2022)

Frick said:


> That just sounds like a poorly made game tbh.


Or temps



> it doesn't happen imediately


----------



## R-T-B (Jan 31, 2022)

oxrufiioxo said:


> That's a great idea actually.


Is it just me, or is that like scalping with extra steps?

Kidding mostly.  One does what one has to these days...


----------



## oxrufiioxo (Jan 31, 2022)

R-T-B said:


> Is it just me, or is that like scalping with extra steps?
> 
> Kidding mostly.  One does what one has to these days...



Honestly, with how shit the market is, people who can should take advantage to get the hardware they want. I sold a Titan Xp for $900 and put that toward a 3080 Ti, making it a much more sensible purchase. My buddy sold a 5700 XT for nearly the same and picked up a 6800 XT.


----------



## Mussels (Jan 31, 2022)

Bomby569 said:


> going back to this, Far cry 6 has a HD pack that requires 11GB, and doesn't work on a 3060ti, ask me how i know?
> 
> a card like the 3060ti or the 3070 with 8GB was a mistake, especially with the prices they sell this cards, and i'm only considering MRSP, not the real prices.


I have a 3090, and that HD pack just made everything look fuzzy


----------



## Kissamies (Jan 31, 2022)

Vicious said:


> Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?


My next monitor will be a 4K one, and I'm not going to upgrade from the 1080 Ti in the near future. It isn't THAT much faster than a 3060.

Nobody is holding a gun to my head and forcing me to play the newest AAA titles at max details, if that's the concern when wondering why 4K.


----------



## Mister300 (Jan 31, 2022)

oxrufiioxo said:


> At this point unless you come up in the queue in the next month or so I'd probably skip both and wait for the 4000 series that will likely have adequate  vram. I personally wouldn't be shocked if none of the nvidia cards from the 3080 down age well due to nvidia cheaping out on vram.
> 
> If I had to choose one of these two it would be the ti though it's quite a bit faster but having to lower settings in a year or so for a 500+ gpu feels bad.


I would not wait on the 4000 series: at the 5 nm scale, pricing will be high due to poor yields and other factors; the days of inexpensive GPUs are over. I run a Zotac 3080 at 2K and it has no problem with 144 Hz.


----------



## ShiBDiB (Jan 31, 2022)

Vario said:


> I was inclined to buy the 3060 when it comes up and then buy a 3060ti when it comes up and sell the 3060.



This is basically what I did, except with a few more steps... (Got a 3060ti at early cheapish scalp prices, then got a 3060ti from EVGA which I sold BNIB, then got a 3070ti at MSRP and sold the OG 3060ti)

In the end I have a 3070ti and I'm only down ~$300.


----------



## VeqIR (Jan 31, 2022)

ShiBDiB said:


> This is basically what I did, except with a few more steps... (Got a 3060ti at early cheapish scalp prices, then got a 3060ti from EVGA which I sold BNIB, then got a 3070ti at MSRP and sold the OG 3060ti)
> 
> In the end I have a 3070ti and I'm only down ~$300.


That's really nice; hindsight is 20/20 for me. I wish I had signed up for a 3070 or 3070 Ti early on. Instead I signed up only for a 3060, for some stupid reason. Well, I know the reason: I expected prices to drop sooner, and paying over $400-500 seemed too much when that tier would normally go for significantly less in the open-box / gently-used marketplace. For example, I bought an excellent version of a 1080 for my family member's computer for $300 while it was still the current GPU series.


----------



## Bomby569 (Jan 31, 2022)

xorstl said:


> Or temps



I have no temp problems with this card in this case; I've never seen it go past 72°C.



Frick said:


> That just sounds like a poorly made game tbh.



In what world do you live where games come optimized? Certainly not mine. Most games are a mess these days; reality is just what it is.


----------



## VeqIR (Feb 1, 2022)

Bomby569 said:


> in what world do you live that games came optimized? certainly not mine, most games are a mess this days, reality is just what it is.



Sadly, that's the truth: dive deeply enough into a game, and there's usually regular talk about how it's not properly optimized for the hardware (not properly threaded on the CPU, an outdated engine, etc.).


----------



## AusWolf (Feb 1, 2022)

xorstl said:


> Not only that but the xx60 cards are entry tier, not mid or high tier. 70s are mid, 70Ti and up are mid-high or high. 50s are always the "barely usable in gaming" ones and therefore can't even be considered entry tier for a gaming GPU.


The x60 has never been entry level; x30 is where entry level really is. The fact that Nvidia has recently forgotten about it (like AMD did long ago) doesn't change that. The 3060 gives you solid 1440p (or maxed-out 1080p) gaming. Hell, even the 3050 does, with some compromise and/or DLSS. I'd say that perfectly describes what mid-tier means.



xorstl said:


> We're given a choice of 12GB or 8GB with much better bandwidth for an entry tier card, that's more than bloody enough. Equivalent to previous gen's mid-high tier which is typically repeated between generations that had a decent leap, so it makes sense.
> 
> What's all the fuss about the 3060Ti only having 8GB? If your needs really require good solid use of all 12GB of a card, you won't buy a 3060 anyway because you'll be bottlenecked by a shittier GPU and less memory bandwidth anyway. The card is exactly in the range it should, the problem is the market prices prohibiting regular middle-class people getting a mid-high end gpu, not the characteristics of a shitty entry level card being shitty. I have a 3060Ti and would only be happier about it if I had paid MSRP, it behaves like a solid mid tier card for everything except the 1% cases where you need more than 8GB memory.


This I agree with. I'm still more than happy with my 2070, and cannot grasp what the fuss around 12 GB VRAM is about on similar level of cards.


----------



## VeqIR (Feb 1, 2022)

AusWolf said:


> x60 has never been entry level. x30 is where the entry level really is. The fact that nvidia has recently forgotten about it (like AMD did long ago) doesn't change the fact. The 3060 gives you solid 1440p (or 1080p maxed out) gaming. Hell, even the 3050 does with some compromise and/or DLSS. I'd say this perfectly describes what mid-tier means.


Yeah, I don't know what that person is talking about... Entry level is anything above integrated-graphics-level cards, and x30-x50 falls into that range. x60-x70 are the dedicated gamer cards, historically. Anything above is various degrees of high end, to super enthusiast, to "I have a lot of money, look at my shiny build log with lots of brand names on display." Don't forget people used to SLI/CrossFire mid-range cards for more performance on a budget, too. I know technology moves fast, but we're talking about recent generations still. Whatever has happened since 2019 is completely out of the norm in prices, and in what the makers can get away with as a result of the shortages; it's like some nightmarish alternate reality.


----------



## lexluthermiester (Feb 1, 2022)

Why is this thread still going? It's not even on topic anymore.


----------



## oxrufiioxo (Feb 1, 2022)

lexluthermiester said:


> Why is this thread still going? It's not even on topic anymore..



It's the never-ending debate, kind of like quad cores, 8 GB vs 16 GB of system RAM, AMD vs Nvidia.

I was kinda surprised it got revived as well.


----------



## xorstl (Feb 1, 2022)

AusWolf said:


> x60 has never been entry level. x30 is where the entry level really is. The fact that nvidia has recently forgotten about it (like AMD did long ago) doesn't change the fact. The 3060 gives you solid 1440p (or 1080p maxed out) gaming. Hell, even the 3050 does with some compromise and/or DLSS. I'd say this perfectly describes what mid-tier means.
> 
> 
> This I agree with. I'm still more than happy with my 2070, and cannot grasp what the fuss around 12 GB VRAM is about on similar level of cards.


For the longest time, a 60 card has been an entry gaming card. Don't confuse an entry gaming card with an entry card... my whole comment is about gaming GPUs. Anything below the 60s can typically be matched by a cheap mid-tier card from the previous gen (not cheap anymore these days, I know). The 30 cards are literally trash for office PCs that want to claim they have a dedicated GPU; never have I seen a 30-50 series card game acceptably. The 60 series games acceptably at high res in the latest titles of its era, thus it should be considered the entry level for gaming setups. Let's call it enthusiast entry level.



VeqIR said:


> Entry level is anything above integrated graphics level of cards, and X30-x50 falls into that range. X60-X70 are the dedicated gamer cards, historically.


Exactly my point, though. Entry level for a gaming setup is not the same as entry level for a regular office PC or whatever. The 60s were always the bare minimum any proper gamer is willing to go for. Below that you're buying trash and you know it. Thus I (and many others) call this tier "gaming entry level".



Bomby569 said:


> i have no temps problem on this card on this case, i never seen it go past 72c
> 
> 
> 
> in what world do you live that games came optimized? certainly not mine, most games are a mess this days, reality is just what it is.


If you got an LHR v1 card and you see core temps (hotspot, I hope?) of 72 °C, you're most likely running the memory at 90+. Sadly Nvidia cheaps out on sensors too, so we can't know without a custom sensor. LHR v1 Hynix memory is cheap af and can't handle those temps steadily long term, nor can it handle an overclock. LHR v2 Hynix is OK; it handles as much OC as Samsung's, but if I'm not mistaken it still can't cope with temps as high as Samsung and Micron chips can.
It was just a suggestion anyway; problems that only appear after running something for a while tend to be related to long-term unstable temps.


----------



## Bomby569 (Feb 1, 2022)

xorstl said:


> If you got an LHR v1 card and you see core temps (hotspot, I hope?) of 72 °C, you're most likely running the memory at 90+. Sadly Nvidia cheaps out on sensors too, so we can't know without a custom sensor. LHR v1 Hynix memory is cheap af and can't handle those temps steadily long term, nor can it handle an overclock. LHR v2 Hynix is OK; it handles as much OC as Samsung's, but if I'm not mistaken it still can't cope with temps as high as Samsung and Micron chips can.
> It was just a suggestion anyway; problems that only appear after running something for a while tend to be related to long-term unstable temps.



You're just making crazy claims; the card ran CP77 maxed out for hours. This is a VRAM issue; I can see it went over the limit.


----------



## AusWolf (Feb 1, 2022)

xorstl said:


> For the longest time, a 60 card has been an entry gaming card. Don't confuse an entry gaming card with an entry card... my whole comment is about gaming GPUs. Anything below the 60s can typically be matched by a cheap mid-tier card from the previous gen (not cheap anymore these days, I know). 30 cards are literally trash for office PCs that want to claim they have a dedicated GPU; never have I seen a 30-50 series card game acceptably. 60 series cards game acceptably in high-res latest-gen games of their era, so they should be considered the entry level for gaming setups. Let's call it enthusiast entry level.


My GT 1030 got offended. I'm not saying that it plays the latest games. I'm saying that it produces acceptable framerates in age-appropriate titles with reduced graphical settings. This is what entry-level means.

I'm always puzzled when someone thinks "Ultra" is the only graphics option and anything below 60 FPS is unacceptable / office PC category. With this logic, consoles fit the office PC category too.

Edit: Also, why would an office PC need a GT 1030 (or even a GT 730)?


----------



## Taraquin (Feb 1, 2022)

oxrufiioxo said:


> It's the never-ending debate, kinda like quad cores, 8 GB vs 16 GB of system RAM, AMD vs Nvidia.
> 
> I was kinda surprised it got revived as well.


It only applies to 8 vs 16 GB if you run the 8 GB at 4000 MHz and the 16 GB at 2133 MHz.  I have had both cards; the 3060 Ti is close to 30% faster, but the 3060 12 GB is nice for dual mining.


----------



## Frick (Feb 1, 2022)

Bomby569 said:


> You're just making crazy claims; the card ran CP77 maxed out for hours. This is a VRAM issue; I can see it went over the limit.



If the game slows down over a period of time in a way that other - also very demanding - games don't, I'd say it's the game.


----------



## Bomby569 (Feb 1, 2022)

Frick said:


> If the game slows down over a period of time in a way that other - also very demanding - games don't, I'd say it's the game.



Am I talking but no one listens? It's the V R A M limit.


----------



## Frick (Feb 1, 2022)

Bomby569 said:


> Am I talking but no one listens? It's the V R A M limit.



Maybe, but honestly it sounds like it's just one of those unnecessary texture packs for Skyrim that doesn't actually matter and is more an e-peen thing. You can make texture packs that "require" 20GB of VRAM, but ... why would you?


----------



## neatfeatguy (Feb 1, 2022)

Bomby569 said:


> I literally own the games and the card. Both games get to a point where they hit unplayable frame rates, both at 1440p. It doesn't happen immediately; it's after some time of play that it triggers in both games. The rest depends on the settings; if the card can do more, why would I disable RT or lower the settings?
> 
> I even tried 1080p on FC6 and it's just as bad, as you can literally max everything, all ultra, RT on.





Bomby569 said:


> Am I talking but no one listens? It's the V R A M limit.



You literally said "_it doesn't happen immediately; it's after some time of play that it triggers in both games_".

I don't understand how @Frick 's comment could come off as not listening. 

I think Frick's comment is spot on since you also said that you've played CP2077 hours on end without issues.

When I play games for extended periods, even back when I ran my 980 Ti (looking at you, Shadow of Mordor: you maxed out the 6 GB VRAM on that card), they ran great and never descended into unplayable frame rates.

To me, it sounds like the games are not well optimized if they run fine for a while and then start producing unplayable framerates as time goes on. The game should be dropping and picking up textures in VRAM as needed. It sounds like it's not doing this very well and starts storing large textures in system RAM, which can cause noticeable slowdowns if the game keeps dumping more and more into system RAM instead of swapping textures out of VRAM. I could be wrong, but that's what it sounds like is happening... that, or there's a memory leak.
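The spill-to-system-RAM behaviour described above can be sketched as a toy model (a hypothetical `ToyTextureCache` with made-up sizes; real engines stream individual mip levels and use far smarter residency heuristics than plain LRU):

```python
from collections import OrderedDict

class ToyTextureCache:
    """Toy LRU model of a VRAM pool that spills evicted textures to system RAM.

    Purely illustrative -- not how any real driver or engine works.
    """

    def __init__(self, vram_capacity_mb):
        self.capacity = vram_capacity_mb
        self.used = 0
        self.vram = OrderedDict()   # name -> size_mb, ordered by recency
        self.system_ram = {}        # spill pool (slow to sample from)

    def touch(self, name, size_mb):
        """Request a texture; returns 'vram' (fast hit) or 'spill' (slow path)."""
        if name in self.vram:
            self.vram.move_to_end(name)  # mark as most recently used
            return "vram"
        # Not resident: evict least-recently-used textures until it fits.
        while self.used + size_mb > self.capacity and self.vram:
            lru_name, lru_size = self.vram.popitem(last=False)
            self.system_ram[lru_name] = lru_size
            self.used -= lru_size
        if size_mb > self.capacity:
            self.system_ram[name] = size_mb  # can never be resident
            return "spill"
        self.vram[name] = size_mb
        self.used += size_mb
        return "spill"  # had to come from disk or system RAM either way

cache = ToyTextureCache(vram_capacity_mb=8192)
cache.touch("albedo_big", 6000)   # fills most of the pool
cache.touch("normal_big", 3000)   # forces eviction of albedo_big
print(cache.used, list(cache.vram), list(cache.system_ram))
```

If the working set keeps growing past VRAM capacity, every new request lands on the slow path, which is one plausible mechanism for a game that runs fine at first and degrades the longer you play.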


----------



## xorstl (Feb 1, 2022)

AusWolf said:


> My GT 1030 got offended. I'm not saying that it plays the latest games. I'm saying that it produces acceptable framerates in age-appropriate titles with reduced graphical settings. This is what entry-level means.
> 
> I'm always puzzled when someone thinks "Ultra" is the only graphics option and anything below 60 FPS is unacceptable / office PC category. With this logic, consoles fit the office PC category too.
> 
> Edit: Also, why would an office PC need a GT 1030 (or even a GT 730)?


I wouldn't consider anything below 60 fps on minimum settings acceptable, and for many gens now the 50s and below can't do that in latest-gen games of their era. I'm not saying it should be that way, just how it is.

As for your question, there's plenty of use in offices for a low-end GPU... the most common being multi-display setups at a proper refresh rate!


----------



## VeqIR (Feb 1, 2022)

You guys are acting like the definition of a “gamer” is to not just play games in a serious, dedicated manner, but to also do it at 144Hz and 4K with 10-bit HDR or something.  Most people can’t afford or don’t think they need anything beyond decent 1080p, and don’t necessarily need all settings maxed.  I’ve known some people turn down a lot of graphics on their older systems so that they could play at high frame rate and stay competitive, and it was fine by them.  144Hz is also just for certain first person shooter type games, for example: you don’t need that for MMOs or strategy games.  There are lots of game categories and games that one can play competitively that don’t need the latest and greatest hardware, like LoL, Magic Arena, etc, etc.

Many gamers have no idea how to build and upgrade their computer, or know what parts are good, and lots play on laptops that overheat, and they still don’t do anything about it.


----------



## AusWolf (Feb 2, 2022)

xorstl said:


> I wouldn't consider anything below 60 fps on minimum settings acceptable, and for many gens now the 50s and below can't do that in latest-gen games of their era. I'm not saying it should be that way, just how it is.


Then you wouldn't consider console gaming acceptable either, yet many people enjoy it around the world. Nothing personal here, but your opinion seems to be coming from a quite snobby point of view. I, for one, can't really tell the difference between 40 and 60 fps, and used to have tons of fun in The Witcher 3 with a 750 Ti at 30 fps, 1080p, medium-high settings.



xorstl said:


> As for your question, there's plenty of use in offices for a low-end GPU... the most common being multi-display setups at a proper refresh rate!


Most offices don't need more than 2 monitors that can be driven with an iGPU. If they did, we would see a lot more low-end cards on the market. Refresh rate isn't a concern either. Offices that actually need many monitors, high refresh rate, colour accuracy, etc. won't make do with a lowly GeForce or Radeon. That's what Quadros are for.


----------



## VeqIR (Feb 2, 2022)

xorstl said:


> As for your question, there's plenty of use in offices for a low-end GPU... the most common being multi-display setups at a proper refresh rate!


That hasn't been the case in _years_.  Most office and workstation computers (I don't mean those with Xeons--in that case very basic GPUs can be used) use integrated graphics, and the monitors connect to the motherboard.  If a bundled motherboard has a single DP output, a splitter can be used.  For example, an i3-7100 supports up to a 4K display at 60 Hz and up to 3 displays in total.


----------



## xorstl (Feb 4, 2022)

VeqIR said:


> That hasn't been the case in _years_.  Most office and workstation computers (I don't mean those with Xeons--in that case very basic GPUs can be used) use integrated graphics, and the monitors connect to the motherboard.  If a bundled motherboard has a single DP output, a splitter can be used.  For example, an i3-7100 supports up to a 4K display at 60 Hz and up to 3 displays in total.


Can you do 3x 1080p at 144 Hz? That's the use case I see every day in my field, especially with laptops. It might be outdated knowledge on the companies' side, or they might be too lazy to work with splitters, but the fact is that I still see people prioritizing either an unnecessarily expensive CPU or just a decent one with a dedicated (even if shitty) card.

On the other side of my point, who the actual f enjoys gaming at 30 fps? And what 50 card performed better than that in games launched around the same year as the card? I'm yet to see such a benchmark, and I've never met a gamer who happily uses a 50 series without constant nagging and complaining. I can't fit a well-below-standard performance product into a "gaming" category; it just doesn't add up. The fact that they are marketed as gaming GPUs doesn't make them gaming GPUs. A GPU that can run a 5-year-old game maxed out is nothing; non-gaming GPUs can do that too with older games, and they are clearly marketed as not for gaming. I have a 3060 Ti; it's clearly a low-ish mid-tier card, making the non-Ti an entry-tier card and the 3050 just a pile of shit. None of this is at ultra settings or the biggest resolution of the era; not even a mid-tier card can do that with an AAA game launched the same year. Also, PC gamers tend to stay years behind the latest resolution to prioritize performance with high-end cards, which tells you how much performance they are looking for. I'm not including the casual FIFA or F1 player; that's a typical console player who just decided to go for a better refresh rate.



AusWolf said:


> I'm always puzzled when someone thinks "Ultra" is the only graphics option and anything below 60 FPS is unacceptable / office PC category. With this logic, consoles fit the office PC category too.


I guess you are that person. I'd throw my PC out the window rather than game under 60 fps. That's completely unacceptable and well below average. PC gamers don't see consoles as office PCs, at least those you can usually upgrade. Console makers historically never aimed for more than a stable 60 fps, since TVs run at 50-60 Hz anyway. Of course this is changing, but we're talking about the past, not the future. It's not because they thought 60 fps was a great goal to aim for. Why do you think the PC vs console war exists? You're clearly OK with console standards; I'm not. Let's all be happy about our diverse opinions and shut up about it.


----------



## neatfeatguy (Feb 4, 2022)

xorstl said:


> Can you do 3x 1080p at 144 Hz? That's the use case I see every day in my field, especially with laptops. It might be outdated knowledge on the companies' side, or they might be too lazy to work with splitters, but the fact is that I still see people prioritizing either an unnecessarily expensive CPU or just a decent one with a dedicated (even if shitty) card.
> 
> On the other side of my point, who the actual f enjoys gaming at 30 fps? And what 50 card performed better than that in games launched around the same year as the card? I'm yet to see such a benchmark, and I've never met a gamer who happily uses a 50 series without constant nagging and complaining. I can't fit a well-below-standard performance product into a "gaming" category; it just doesn't add up. The fact that they are marketed as gaming GPUs doesn't make them gaming GPUs. A GPU that can run a 5-year-old game maxed out is nothing; non-gaming GPUs can do that too with older games, and they are clearly marketed as not for gaming. I have a 3060 Ti; it's clearly a low-ish mid-tier card, making the non-Ti an entry-tier card and the 3050 just a pile of shit. None of this is at ultra settings or the biggest resolution of the era; not even a mid-tier card can do that with an AAA game launched the same year. Also, PC gamers tend to stay years behind the latest resolution to prioritize performance with high-end cards, which tells you how much performance they are looking for. I'm not including the casual FIFA or F1 player; that's a typical console player who just decided to go for a better refresh rate.
> 
> ...



You do what you want with your PC and your gaming needs, but don't judge others for what they do with theirs and how they enjoy gaming. Not everyone is a graphics whore or needs the best of the best, nor can everyone afford it. You come off as very cynical about what others do, criticizing them, only to say in the end that we should all just be quiet and be happy with our own opinions.

I don't care that you like to have 60 fps. I don't care if you game at 720p or 4K resolution. Just like you shouldn't care about my experiences below, or my brother's....

I ran 5760x1080 with GTX 570s in SLI and enjoyed my gaming experiences. I adjusted settings down to a mix of low to high (depending on the game) and enjoyed the games I played. When Far Cry 3 came out I had fun with it; I liked the antagonist, he made the story worthwhile for me. With the mix of settings I used, I was pulling around 40 fps with my 570s across 5760x1080. I ran the cards for a bit over 4 years and they worked great for my needs.

My brother went on to use one of those GTX 570 cards for about 2.5 years after I stopped using them. He used it for his gaming needs and absolutely loved the experience it gave him. He wasn't maxing anything out, but at least we could now play Dying Light, because the GTX 280 he was using until I gave him the 570 wouldn't run it: he actually got a message on the screen saying his GPU wasn't supported, and the game wouldn't launch.

If someone wants to run a 3050 or 3060 or 3090 for their gaming needs, that's fine by me. Enjoy your experience.


----------



## AusWolf (Feb 5, 2022)

xorstl said:


> Can you do 3x 1080p at 144 Hz? That's the use case I see every day in my field, especially with laptops. It might be outdated knowledge on the companies' side, or they might be too lazy to work with splitters, but the fact is that I still see people prioritizing either an unnecessarily expensive CPU or just a decent one with a dedicated (even if shitty) card.


That must be a very special use case. Most offices need some kind of office suite (usually Microsoft's) and a web browser running, maybe with some basic virtualisation for added security. Literally any GPU can do that.



xorstl said:


> On the other side of my point, who the actual f enjoys gaming at 30 fps?


I do. So what?



xorstl said:


> I guess you are that person. I'd throw my PC out the window rather than game under 60 fps. That's completely unacceptable and well below average. PC gamers don't see consoles as office PCs, at least those you can usually upgrade. Console makers historically never aimed for more than a stable 60 fps, since TVs run at 50-60 Hz anyway. Of course this is changing, but we're talking about the past, not the future. It's not because they thought 60 fps was a great goal to aim for. Why do you think the PC vs console war exists? You're clearly OK with console standards; I'm not. Let's all be happy about our diverse opinions and shut up about it.


Your opinion is fine, as long as you don't try to present it as universal truth that applies to everybody - which is exactly what you did.

Just because you feel like a x50 series card would be inadequate for your needs (which is OK), it doesn't mean it's below entry-grade stuff.


----------



## eidairaman1 (Feb 5, 2022)

The bigger buffer is a placebo and won't up the fps.


----------



## Kanan (Feb 5, 2022)

The 3060 Ti is easily better; it's the only Nvidia GPU I would buy new atm, all the others are even worse. The 3060 has 12 GB, too bad only 4K needs that much and it's not a 4K card.


----------



## Vario (Mar 31, 2022)

The 3060 Ti has come up for purchase now.  I am on the fence about buying it because the price including shipping, tax, and the 3% discount is actually around $500.

Due to the increased availability of graphics cards, the pricing on the secondary market has come down quite a bit.

The 3060 cost me $425 including shipping, tax, and discount.  Ideally I'd be able to sell the 3060 for $450 on Craigslist and recoup the cost.  I think it's likely I can get at least $400, but I'm not sure how much more than that. I don't want to sell on eBay.  I'd be content to break even.

One factor to consider is whether availability will dry up yet again.  Another is that the next-gen series will come out soon.  Asus has also announced that it will drop prices on the 30 series by 25%.  

Any thoughts?


----------



## droopyRO (Mar 31, 2022)

25%? I heard about 10% being slashed off. But even if it's 25%, will they get close to MSRP where you live? I went for a 3060 Ti too; it was an open-box (resealed) unit with the full 3-year warranty, and the card was in great shape, brand-new looking. I have about 60 days in which to return it, so I will wait it out until May to see if prices drop more. If not, it will have to last me a long time, at least 3 years, as my GTX 1070 Ti did.


----------



## chrcoluk (Mar 31, 2022)

If you value visual quality and better texture-streaming performance, and you play modern games at a capped frame rate: the 3060 12 GB.

At the other extreme, if you're OK with lower-quality textures, prefer the maximum frame rate possible, and can live with occasional low-VRAM texture-streaming stutters, then get the 3060 Ti.


----------



## droopyRO (Mar 31, 2022)

8 GB worth of vRAM textures is low quality?


----------



## freeagent (Mar 31, 2022)

That's crazy talk. I game at 3840x2160 with an 8 GB card and it does pretty well. Some games don't like it, but most of my games run just fine. Sliders are usually maxed, but not always. RDR2 is a bit brutal.


----------



## ThrashZone (Mar 31, 2022)

freeagent said:


> That's crazy talk. I game at 3840x2160 with an 8 GB card and it does pretty well. Some games don't like it, but most of my games run just fine. Sliders are usually maxed, but not always. RDR2 is a bit brutal.


Hi,
Wild thread here 325 posts now

Funny I saw R2D2 on that last bit


----------






## neatfeatguy (Apr 1, 2022)

Love all the folks who came through here to bash 8 GB of VRAM on a mid-range GPU.

Where was the hate for the RTX 2060 6GB? I don't recall any massive threads hating on last gen's mid-range GPU for its 6 GB of VRAM. And after the slightly beefier RTX 2060 12GB came out recently, it was shown that the added RAM made almost no difference. Do you guys honestly think 12 GB would make any real difference on the 3060 Ti?

The 3060 Ti is a mid-range card. It has 8 GB of VRAM. It handles anything at 1080p very well and handles 1440p fairly well. You'd think people would get the F over the mid-range GPU with 8 GB of VRAM that was released 2 years ago and move on.

As for the OP: the 3060 Ti can give you upwards of 25% faster performance in most games at any resolution compared to the 3060. If you feel spending about $500 and then recouping about 2/3 of that by selling your 3060 is a good choice for you, I'd say go for it.


----------



## Vario (Apr 1, 2022)

neatfeatguy said:


> Love all the folks who came through here to bash 8 GB of VRAM on a mid-range GPU.
> 
> Where was the hate for the RTX 2060 6GB? I don't recall any massive threads hating on last gen's mid-range GPU for its 6 GB of VRAM. And after the slightly beefier RTX 2060 12GB came out recently, it was shown that the added RAM made almost no difference. Do you guys honestly think 12 GB would make any real difference on the 3060 Ti?
> 
> ...


Yeah, I'll probably hold off, I think.  I don't want to get stuck with two cards; it took longer than expected to sell my 1060.  $100 out of pocket for that additional bit of performance is steep. If the market were still slammed it would be an easy decision.  Also, nice monitor.


----------



## chrcoluk (Apr 1, 2022)

droopyRO said:


> 8 GB worth of vRAM textures is low quality?



Well, if you want to load 8 gigs of textures you will need a bigger capacity; there are other things that go in VRAM as well. 

Bear in mind there is a big difference between loading 8 gigs of VRAM in, say, a small room vs an open-world area.  If you're doing streaming, you also need spare capacity to load in new textures for upcoming areas, whereas with loading screens you can flush it all and load in just the new area.  Because consoles use video memory for normal data as well (some call it unified memory), we will see this practice more and more in PC ports, as was the case in Far Cry and is the case in the FF7 remake.  This is even before RTX IO gets rolled out.

It is funny that whenever someone mentions scenarios that may need more VRAM, some seem almost offended by it.
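To put rough numbers on how quickly textures eat VRAM (a back-of-the-envelope sketch; the material counts below are made up, not figures from any real game): an uncompressed 4096x4096 RGBA8 texture is 64 MiB before mipmaps, a full mip chain adds about a third on top, and block compression such as BC7 (1 byte per texel) cuts it by 4x.

```python
def texture_mib(width, height, bytes_per_texel, mip_chain=True):
    """Approximate size of one texture in MiB.

    A full mip chain adds a geometric series 1 + 1/4 + 1/16 + ... ~= 4/3
    on top of the base level.
    """
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 * 1024)

# One 4K texture, uncompressed RGBA8 (4 bytes/texel) vs BC7 (1 byte/texel):
print(round(texture_mib(4096, 4096, 4), 1))  # -> 85.3 (MiB, uncompressed)
print(round(texture_mib(4096, 4096, 1), 1))  # -> 21.3 (MiB, BC7-compressed)

# A hypothetical open-world scene with 120 materials, each with
# albedo + normal + roughness maps, all BC7-compressed at 4K:
scene_gib = 120 * 3 * texture_mib(4096, 4096, 1) / 1024
print(round(scene_gib, 1))  # -> 7.5 (GiB for textures alone)
```

And that's textures alone, before render targets, geometry, and driver overhead, so a fixed 8 GB pool can plausibly fill up well before the GPU runs out of shading power.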


----------



## Lei (Apr 1, 2022)

@chrcoluk @droopyRO @freeagent 

Here Horizon Zero Dawn uses more than 8 GB at 1080p, and COD Warzone uses more than 10 GB, almost entirely filling the 3060's VRAM at 1440p:


----------



## Mussels (Apr 1, 2022)

Honestly, as much as I'm in the "8GB is the minimum now" crowd, it's super easy for 99% of people to just run medium textures and not come close to it.


----------



## freeagent (Apr 1, 2022)

OP has made his decision; as requested, the thread is closed.


----------

