
3060ti 8gb vs 3060 12gb

Status
Not open for further replies.
Great, if you sell the 3060 at full price, you've got it... :D

Even used, the 3060 is selling for around 300 USD more than what he's paying for it currently.
 
Not true at all.


There was a 6GB version that was fairly popular. And it aged a lot better.


I think it's a beautiful port. Not everyone is having that glitch, and in every other way it is an amazing game. Let's not blow things out of proportion, folks.

That said, we're off-topic. Glitches with FF7 Remake should be taken to a new thread if the discussion needs to progress. I'd be happy to open such a thread if everyone would like.
  • Video RAM size: It measures how much data your system can store and handle at one time. If you’ll be working with categorical data and Natural Language Processing (NLP), the amount of VRAM is not so important. However, higher VRAM is crucial for Computer Vision models.
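The bullet above can be put into rough numbers. As a back-of-the-envelope sketch (the formula and the 100M-parameter figure are my own illustrative assumptions, not from this thread): fp32 training with Adam needs memory for weights, gradients and two optimizer states, i.e. roughly 4x the raw weight memory, before counting activations — and it's the activations that dominate for vision models with large inputs, which is why VRAM matters so much more there.

```python
def training_vram_estimate_gb(n_params, bytes_per_param=4):
    """Rough lower bound on training VRAM, ignoring activations.

    Assumes fp32 weights plus gradients plus two Adam optimizer
    states (~4x the weight memory). Activation memory, which
    dominates for vision models with large inputs, comes on top.
    """
    weights = n_params * bytes_per_param
    total = 4 * weights  # weights + grads + 2 Adam moment buffers
    return total / 1024**3

# A 100M-parameter model needs roughly 1.5 GiB before activations:
print(round(training_vram_estimate_gb(100_000_000), 2))  # → 1.49
```

For an NLP classifier over small batches of token IDs the activation term stays tiny, while a CNN over high-resolution images can multiply this estimate several times over — which matches the bullet's point.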
 
  • Video RAM size: It measures how much data your system can store and handle at one time. If you’ll be working with categorical data and Natural Language Processing (NLP), the amount of VRAM is not so important. However, higher VRAM is crucial for Computer Vision models.
I'm not disputing that. You said...
For gaming, the 3060 Ti for sure; for AI and ML, the more VRAM the better. So a 3060 12GB card for those who can't afford an RTX card for scientific computing.
...which is not true. There are many gamers who will prefer 12GB because the extra VRAM gives the card better longevity, so it will last them longer. It's not just attractive for scientific use, as you originally implied.
 
I'm not sure if the question is still valid - I didn't want to read 7 pages of comments to influence my opinion. :ohwell:

Personally, in an ideal situation, I would consider 3 things: 1. How long do you want to keep the card for? 2. What resolution do you play at? 3. Are you willing to sacrifice visuals for performance?

1. If you want to keep it for 4-5 years, the more VRAM might be a better option (3060 12 GB).
2. If you play above 1440p, you need GPU power and VRAM as well (either will do).
3. If you're OK with taking graphical settings down a notch, a more powerful GPU is more useful (3060 Ti 8 GB).

That said, the GPU market is far from ideal, so I'd rather stay in the queue for both and get whichever comes first. They're both more than capable gaming GPUs anyway.
 
About VRAM use: I play Cyberpunk 2077 fine at 4K on my 3060 Ti. Not on max settings, but that is not due to lack of VRAM. The current generation of consoles is very new. I am assuming the 8GB on the 3060 Ti will be plenty for as long as it's viable as a 4K card. If I have to dial down settings, it will be because of lack of power, not lack of VRAM.
 
I'm not disputing that. You said...

...which is not true. There are many gamers who will prefer 12GB because the extra VRAM gives the card better longevity, so it will last them longer. It's not just attractive for scientific use, as you originally implied.
I reread my quote; I meant that a 12GB card would be more useful in scientific vision modeling (where all the VRAM can be utilized) if you can't afford a Quadro RTX card. I'm not sure that any card can use more than 6-8GB gaming at 1080p. And if I'm wrong, forgive me, but for most modern games, a 3060 12GB card playing at 1440p may not be more enjoyable than playing at 1080p maxed out. IMHO.
 
About VRAM use: I play Cyberpunk 2077 fine at 4K on my 3060 Ti. Not on max settings, but that is not due to lack of VRAM. The current generation of consoles is very new. I am assuming the 8GB on the 3060 Ti will be plenty for as long as it's viable as a 4K card. If I have to dial down settings, it will be because of lack of power, not lack of VRAM.

It's exactly because you don't max the settings that you don't see the problem. If you played at 1080p or 1440p and could max the settings, you would (not talking about CP77 specifically, but in general).
 
It's exactly because you don't max the settings that you don't see the problem. If you played at 1080p or 1440p and could max the settings, you would (not talking about CP77 specifically, but in general).

But I thought having 12GB of VRAM on a weaker card was better than having 8GB on a faster card?

I'm so confused. CP2077 can utilize up to 10GB of VRAM when RT is enabled (upwards of 7GB without RT), according to TPU and their benchmarking. Why is a 3060 Ti with only 8GB outpacing a 3060 with 12GB? Last I checked, 12 > 8.

Guru3D shows the 3060 Ti, even with its limited VRAM, is 30-35% faster than the 3060 that has more VRAM.

Maybe it's just that game... what other game can use lots of VRAM?

Oh, maybe Far Cry 6 will show how much better the 12GB is over 8GB.
At 1440p, almost 8GB is used.
At 4K, over 9GB can be used.
No... still seeing 30-35% faster performance from the lesser-VRAM 3060 Ti over the 3060.

I just don't understand. What am I not getting? Maybe folks are saying that 4 years down the road, the weaker 3060 with 12GB will give better performance than a 3060 Ti with only a measly 8GB. Man. I'm just going to have to wait a while to see if this is true. Shucks, by then no one will care because we'll all be talking/complaining about the current gen that AMD/Nvidia/Intel will be providing.
 
On both my 3080 ti and 2080 ti it's bad at all 3 resolutions. My buddies with 3080s and 3090s are also mentioning stutters but I haven't been able to swing by their places to see if it's the same issues I'm seeing in person yet.

Don't get me wrong, it's playable, but frame times are still worse than the PS5 version, which I also own. It's just a bad port in general... They've also ruined the HDR compared to the PlayStation versions. Apparently running the game in DX11 helps, but I haven't had a chance to try it.

This is pretty much what I'm seeing if anyone else is wondering how bad the port is.

DF reported the stutters stop if you stand still, which seems like further evidence that it's shader/texture eviction and reloading causing the problem.
 
But I thought having 12GB of VRAM on a weaker card was better than having 8GB on a faster card?

I'm so confused. CP2077 can utilize up to 10GB of VRAM when RT is enabled (upwards of 7GB without RT), according to TPU and their benchmarking. Why is a 3060 Ti with only 8GB outpacing a 3060 with 12GB? Last I checked, 12 > 8.

Guru3D shows the 3060 Ti, even with its limited VRAM, is 30-35% faster than the 3060 that has more VRAM.

Maybe it's just that game... what other game can use lots of VRAM?
Just because the game allocates that much VRAM when it's available doesn't mean it actually needs it. I also played CP77 with a 4 GB GTX 1650 at 1080p High and VRAM usage hovered between 3-3.5 GB.
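To make the allocation-vs-need distinction concrete, here's a toy sketch (the numbers, class name, and LRU policy are purely illustrative assumptions, not how any particular engine works): a streaming cache simply fills whatever budget it's given, so a card with more VRAM reports higher "usage" for the exact same workload.

```python
class TextureCache:
    """Toy streaming cache: keeps recently used textures up to a
    VRAM budget, evicting the least recently used when full.
    Illustrates why a game 'uses' more VRAM on a bigger card
    without actually needing it."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = {}  # texture id -> size in MB (insertion-ordered)

    def touch(self, tex_id, size_mb):
        self.resident.pop(tex_id, None)  # refresh LRU position
        self.resident[tex_id] = size_mb
        while sum(self.resident.values()) > self.budget:
            # evict the least recently used texture (oldest entry)
            self.resident.pop(next(iter(self.resident)))

    def allocated_mb(self):
        return sum(self.resident.values())

# Same workload (60 textures of 100 MB), two budgets: the bigger
# "card" caches all 6000 MB, the smaller one caps out at 4000 MB.
for budget in (4000, 8000):
    cache = TextureCache(budget)
    for tex in range(60):
        cache.touch(tex, 100)
    print(budget, cache.allocated_mb())
```

An overlay tool reads the second number and calls it "VRAM usage", even though both runs touched the same textures — which is exactly the trap being described above.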
 
Oh, maybe Far Cry 6 will show how much better the 12GB is over 8GB.
At 1440p, almost 8GB is used.
At 4K, over 9GB can be used.
No... still seeing 30-35% faster performance from the lesser-VRAM 3060 Ti over the 3060.

VRAM depends on settings; there are games that tell you how much VRAM is used based on how much you crank the settings, like RE Village for example. It's not as simple as what resolution you are playing at.
 
I reread my quote; I meant that a 12GB card would be more useful in scientific vision modeling (where all the VRAM can be utilized) if you can't afford a Quadro RTX card.
Ah, I see what you mean now. Fair enough.
I'm not sure that any card can use more than 6-8GB gaming at 1080p.
I've seen it. It used to be rare but it's now getting more common. 8GB for 1080p gaming is good, but for max settings and a few AAA titles, it's just not enough.

That model was the 780 6GB, not a 780 Ti, and it was a rarer card.
Actually, both had a 6GB variant. I sold them both, side by side. The following is for reference:

The 780 Ti 6GB, even though it's just had its last driver set, is still a reasonable card to have, especially in this current state of affairs...
 
VRAM depends on settings; there are games that tell you how much VRAM is used based on how much you crank the settings, like RE Village for example. It's not as simple as what resolution you are playing at.
It also depends on how much you've got available. If you have less, the game will allocate less, which will not necessarily result in a difference in performance.

I've seen it. It used to be rare but it's now getting more common. 8GB for 1080p gaming is good, but for max settings and a few AAA titles, it's just not enough.
Which titles? I happen to have an 8 GB card and I'd be happy to test it. :)
 
FF15 can use 12 gigs at 1080p due to massive textures ;) It even has nasty leaks with the Nvidia grass feature, where it will consume tens of gigs of VRAM if it's available.

Depends on size of textures and textures need VRAM more than horsepower.

Remember the market is bigger than AAA shooters.
Though it still ran fine on a 980 Ti; allocating isn't the same as usage.
 
It also depends on how much you've got available. If you have less, the game will allocate less, which will not necessarily result in a difference in performance.

You can literally try it like I did a couple of days ago, because I was playing it and had it installed. It is accurate, and RE Village turns into a slideshow at 10 fps if you go over the limits.
 
You can literally try it like I did a couple of days ago, because I was playing it and had it installed. It is accurate, and RE Village turns into a slideshow at 10 fps if you go over the limits.
I haven't tried RE:Village, but it's on my wishlist, so I eventually will. :) I just know that Cyberpunk works quite cleverly with VRAM. It will not try to allocate more than what you have, so looking at VRAM usage on a 16-24 GB card can be misleading. Though of course, there is a lower limit for every game, but I didn't reach it in Cyberpunk with a 4 GB 1650. Most modern games work on similar principles, as far as I know.
 
At least 11GB is fine for Village at 1080p with everything maxed out except RT.
 
I haven't tried RE:Village, but it's on my wishlist, so I eventually will. :) I just know that Cyberpunk works quite cleverly with VRAM. It will not try to allocate more than what you have, so looking at VRAM usage on a 16-24 GB card can be misleading. Though of course, there is a lower limit for every game, but I didn't reach it in Cyberpunk with a 4 GB 1650. Most modern games work on similar principles, as far as I know.

Not the best example, as you can get asset pop-in in CP77 (NPCs, cars, etc.), especially on low-end hardware. I wouldn't call that exactly clever.
 
Not the best example, as you can get asset pop-in in CP77 (NPCs, cars, etc.), especially on low-end hardware. I wouldn't call that exactly clever.
Even if that's true, it wasn't noticeable. Or maybe my tolerance level is lower than most.
 
Just because the game allocates that much VRAM when it's available doesn't mean it actually needs it. I also played CP77 with a 4 GB GTX 1650 at 1080p High and VRAM usage hovered between 3-3.5 GB.
If only more people used some common sense and understood this.^

Games may allocate a lot of VRAM if it is available:
- Buffers like Z-buffers, stencil buffers, etc., which are heavily compressed because they are mostly empty at any given time. On top of that, there are many texture buffers like normal maps and opacity maps, and with RT there are metallic, emissive and roughness textures too, most of which have very low data density, which means they can be compressed heavily (>>50%). Additionally, some temporary buffers may only be used for a specific render pass and be compressed down to nearly nothing for the rest of the frame. Still, these buffers will end up as "allocated" space, because compression is transparent to the GPU, so an allocated "256MB" buffer may in reality occupy only a few kB of physical VRAM. Those who don't understand the terms I'm using here are not qualified for such a technical discussion.
- Some games use free VRAM for additional temporary buffers and caching.
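For a sense of scale on the texture side of this (the sizes and formats below are my own illustrative assumptions, not from the post above): an uncompressed RGBA8 texture takes width x height x 4 bytes, a full mip chain adds roughly a third on top, and block compression such as BC7 (1 byte per texel) cuts that by 4x.

```python
def texture_mb(width, height, bytes_per_texel, mipmaps=True):
    """Approximate VRAM footprint of one 2D texture in MiB.

    A full mip chain adds ~1/3 on top of the base level
    (the geometric series 1 + 1/4 + 1/16 + ... -> 4/3).
    """
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 // 3
    return size / 2**20

# Uncompressed RGBA8 (4 bytes/texel) vs BC7-compressed (1 byte/texel)
print(round(texture_mb(4096, 4096, 4)))  # 4K texture: ~85 MiB
print(round(texture_mb(4096, 4096, 1)))  # same, BC7:   ~21 MiB
```

A few dozen uncompressed 4K textures would blow past any consumer card's VRAM, which is why block compression and streaming exist, and why "allocated" numbers say little about what a scene actually needs resident.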

Additionally, most people fail to understand a fundamental point: if you have space for extra textures in VRAM, then actually using those textures requires more bandwidth and more computational performance as well. This balance is not going to change until the hardware gains new capabilities that reduce the computational load, so it will not change during the lifetime of an existing product, and even then, using the extra VRAM data will still require more bandwidth. For these reasons, thinking that adding extra VRAM is "future proofing" is nothing but foolishness.
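The bandwidth half of that argument is easy to put numbers on. Peak memory bandwidth is just bus width (in bytes) times the per-pin data rate; plugging in Nvidia's published specs for these two cards shows the Ti's advantage:

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * gbps_per_pin

# Published specs: RTX 3060 Ti = 256-bit @ 14 Gbps GDDR6,
#                  RTX 3060    = 192-bit @ 15 Gbps GDDR6
print(bandwidth_gbs(256, 14))  # → 448.0 GB/s (3060 Ti)
print(bandwidth_gbs(192, 15))  # → 360.0 GB/s (3060)
```

So the 3060 Ti has about 24% more bandwidth to feed its VRAM with, even though it has 4GB less of it — which lines up with the argument that capacity alone doesn't make textures faster to use.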

If there are still people claiming RTX 3060 12 GB is a better card than RTX 3060 Ti 8GB due to the extra VRAM after reading this, then they don't know what the heck they are talking about.
Buy the one that makes sense in terms of performance, price and availability. Just ignore the VRAM difference; it doesn't matter for gaming. The RTX 3060 Ti 8GB is the faster card, and it will remain so 2 years and even 5 years from now.
 
I just know that Cyberpunk works quite cleverly with VRAM.
While true, you take a big quality and performance hit. Max everything out and even at 1080p, 8GB is just not enough.

If there are still people claiming RTX 3060 12 GB is a better card than RTX 3060 Ti 8GB due to the extra VRAM after reading this, then they don't know what the heck they are talking about.
That's a bold claim and just as equally incorrect.

This is the same argument that happens with every generation of GPUs that offers more VRAM while games push and exceed the limits. People argued the same way when the 2GB cards came out and everyone thought 1GB was enough. It happened again with the 2GB vs 4GB jump, and again with the 4GB vs 8GB jump. And the same result happens EVERY time: the cards with the greater amount of VRAM end up being useful for longer.

Progress ALWAYS marches forward and as it does the resources needed to accommodate that progress need to expand.
 
This is the same argument that happens with every generation of GPUs that offers more VRAM while games push and exceed the limits. People argued the same way when the 2GB cards came out and everyone thought 1GB was enough. It happened again with the 2GB vs 4GB jump, and again with the 4GB vs 8GB jump. And the same result happens EVERY time: the cards with the greater amount of VRAM end up being useful for longer.

Progress ALWAYS marches forward and as it does the resources needed to accommodate that progress need to expand.
I don't know if you're attempting a straw man argument here, or if you are just completely missing the point.

No one is claiming that x GB will be enough for future hardware forever. What I am pointing out is that there are limits to how much VRAM a particular GPU can effectively utilize in games, and that those limits are tied to memory bandwidth and computational performance. 8 GB is more than enough for the 3060 Ti, and therefore it's nonsensical to claim the 3060 would benefit from more than that. Even you can't deny that.
 