
3060 Ti 8GB vs 3060 12GB

The point is: bus width matters more than an extra 4 GB VRAM
I was only stating that factors other than VRAM capacity matter more at the 2060-3060 range.
Again, that depends greatly on the usage scenario. Some programs/games will use the extra space to cache more data. While this will not speed up the GPU, it will prevent slowdowns caused by falling back to system RAM/storage access - and in those cases, that is the most important benefit.
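To put rough numbers on the bus-width side of that trade-off, here's a back-of-the-envelope sketch (the bus widths are the cards' actual specs; the 15/14 Gbps effective GDDR6 rates are the commonly quoted figures, treated as approximations):

```python
# Back-of-the-envelope memory bandwidth comparison.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

cards = {
    "RTX 3060 (12 GB, 192-bit @ ~15 Gbps)":   (192, 15.0),
    "RTX 3060 Ti (8 GB, 256-bit @ ~14 Gbps)": (256, 14.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# RTX 3060 (12 GB, 192-bit @ ~15 Gbps):   360 GB/s
# RTX 3060 Ti (8 GB, 256-bit @ ~14 Gbps): 448 GB/s
```

By that estimate the 3060 Ti has roughly 24% more raw memory bandwidth despite the smaller 8 GB pool.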
 
Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?
With ray tracing off and DLSS on? A few people.

I game at 4K with a GTX1080, but it's not at ultra settings.
 
Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?
Not everyone buys a 4K monitor purely for gaming reasons—some might be doing photo processing too, for example. And with that, not everyone can afford the current high end cards. A good 4K monitor is quite accessible in price nowadays (I bought an open box 28” 4K IPS monitor at Microcenter for $220 plus tax for a family member, for example), whereas getting a GPU is highway robbery. Making do with midrange cards is the reality for a lot of people who don’t have an extra $1000+ to spend on just a GPU.
 
The 3060Ti is basically a 3070, gimped.
It was my first choice, too.
Realize that if everybody is recommending a 3060Ti, that means it's so popular you will have ZERO chance of getting one!
I went for a 3070 Ti (Newegg lottery) and won on the first day - it's a widely disliked card, which is probably why I got it so quickly. Didn't pay $600; paid $980. But MSRPs are a fiction anyway, especially with inflation (7.5% since the MSRPs were announced), 25% made-in-China tariffs, and TSMC raising prices 10-20% this year (2022).
 
The 3060Ti is basically a 3070, gimped.
It was my first choice, too.
Realize that if everybody is recommending a 3060Ti, that means it's so popular you will have ZERO chance of getting one!
I went for a 3070 Ti (Newegg lottery) and won on the first day - it's a widely disliked card, which is probably why I got it so quickly. Didn't pay $600; paid $980. But MSRPs are a fiction anyway, especially with inflation (7.5% since the MSRPs were announced), 25% made-in-China tariffs, and TSMC raising prices 10-20% this year (2022).
Honestly, I would love to have a 3070 Ti - considering the slowing down of development in game technologies, I'm sure it would be enough for a good couple of years. My only issue is its price. Not really its price alone, but paying £1,000-1,100 for an 8 GB graphics card when I've already got one (2070) is a bit steep. I'd gladly buy a 16 GB model if it existed, and call it a day until 2025 - if Nvidia ever bothered releasing one.
 
Going back to this: Far Cry 6 has an HD texture pack that requires 11 GB, and it doesn't work on a 3060 Ti. Ask me how I know.

A card like the 3060 Ti or the 3070 with 8 GB was a mistake, especially at the prices these cards sell for - and I'm only considering MSRP, not the real prices.
 
Going back to this: Far Cry 6 has an HD texture pack that requires 11 GB, and it doesn't work on a 3060 Ti. Ask me how I know.

A card like the 3060 Ti or the 3070 with 8 GB was a mistake, especially at the prices these cards sell for - and I'm only considering MSRP, not the real prices.

So, is it a problem with the GPUs, or with Ubisoft and their inability to properly build, design and implement their HD textures? Seriously, it's one crappy game (in my opinion - I'm not a fan of the Far Cry series; it's like Madden, CoD or BF games, the same rehashed crap with every release) that gates its HD textures, and everyone constantly brings it up as proof that new GPUs can't run it because of the VRAM amount.

I'm not supporting Nvidia for having 8-10GB of VRAM on their mid-high to high end GPUs. I just hate that people are so hung up on FC6 and use it as the only game that cannot run the HD textures due to not having more than 8-10GB of VRAM.

So, again, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures?
 
So, is it a problem with the GPUs, or with Ubisoft and their inability to properly build, design and implement their HD textures? Seriously, it's one crappy game (in my opinion - I'm not a fan of the Far Cry series; it's like Madden, CoD or BF games, the same rehashed crap with every release) that gates its HD textures, and everyone constantly brings it up as proof that new GPUs can't run it because of the VRAM amount.

I'm not supporting Nvidia for having 8-10GB of VRAM on their mid-high to high end GPUs. I just hate that people are so hung up on FC6 and use it as the only game that cannot run the HD textures due to not having more than 8-10GB of VRAM.

So, again, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures?

I just gave other examples like RE Village - that's two examples, and I don't even play that many games.
 
I just gave other examples like RE Village - that's two examples, and I don't even play that many games.

I find it interesting that at 4K (RT off) Far Cry 6 only uses around 9.5 GB of VRAM, and I don't know what to say about RE Village - according to TPU, at 4K it only used just shy of 8 GB of VRAM.

The artificial limitation of VRAM amounts sounds like a ploy to make 8-10GB GPUs bad when it shouldn't really be a problem. Games and your system will allocate VRAM and swap textures very well without having an artificial limit in place to make it sound like you lack enough VRAM to run the game at the best settings.

I see no reason why any GPU in the mid-high to high range from AMD or Nvidia would have any issues running games with HD textures if the artificial limits weren't put in place. If the games had a suggested VRAM notification, that would be fine. If you wanted to max out all settings at 4K in FC6, the game could show you the suggested VRAM amount - how much you have and how much it suggests is needed - but still let you run the game anyway; your system will handle loading and offloading textures as needed. Let the system do what it was designed to do. If performance really does become an issue, then at that point the user can work on adjusting settings.
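A minimal sketch of what such a soft warning could look like, assuming the check reads total VRAM via NVIDIA's NVML Python bindings (the `pynvml` module from the `nvidia-ml-py` package); the 11 GB "suggested" figure is just an illustrative stand-in for whatever an HD texture pack recommends:

```python
# Hypothetical "suggested VRAM" check a game/launcher could run:
# warn if the card has less memory than the chosen settings suggest,
# but still let the player run the game.
import pynvml

SUGGESTED_VRAM_GB = 11.0  # illustrative value, not a real requirement

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)       # total/free/used in bytes
total_gb = mem.total / 1024**3

if total_gb < SUGGESTED_VRAM_GB:
    print(f"Warning: these settings suggest {SUGGESTED_VRAM_GB:.0f} GB of VRAM; "
          f"your GPU has {total_gb:.1f} GB. Expect extra texture streaming - "
          f"you can still play.")
else:
    print(f"{total_gb:.1f} GB of VRAM available - suggested amount met.")

pynvml.nvmlShutdown()
```

The threshold and wording are made up; the point is only that a check like this can warn the user rather than hard-block the setting.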
 
I find it interesting that at 4K (RT off) Far Cry 6 only uses around 9.5 GB of VRAM, and I don't know what to say about RE Village - according to TPU, at 4K it only used just shy of 8 GB of VRAM.

The artificial limitation of VRAM amounts sounds like a ploy to make 8-10GB GPUs bad when it shouldn't really be a problem. Games and your system will allocate VRAM and swap textures very well without having an artificial limit in place to make it sound like you lack enough VRAM to run the game at the best settings.

I see no reason why any GPU in the mid-high to high range from AMD or Nvidia would have any issues running games with HD textures if the artificial limits weren't put in place. If the games had a suggested VRAM notification, that would be fine. If you wanted to max out all settings at 4K in FC6, the game could show you the suggested VRAM amount - how much you have and how much it suggests is needed - but still let you run the game anyway; your system will handle loading and offloading textures as needed. Let the system do what it was designed to do. If performance really does become an issue, then at that point the user can work on adjusting settings.

You can see this in some of the 6500 XT testing: older cards that also have 4 GB but a full PCIe link don't suffer anywhere near as badly as the 6500 XT, limited to only 4 lanes, does when available VRAM is exceeded. The loss of performance or visual quality (in otherwise the same situation) is always game dependent to some degree.
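The lane count matters because once data spills out of VRAM, every miss crosses the PCIe link. A rough comparison using the standard per-lane rates (approximate one-way figures after encoding overhead):

```python
# Approximate one-way PCIe bandwidth per lane after encoding overhead
# (~0.985 GB/s for PCIe 3.0, ~1.97 GB/s for PCIe 4.0).
PER_LANE_GB_S = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

configs = [
    ("6500 XT, PCIe 4.0 x4",                   "PCIe 4.0", 4),
    ("6500 XT on an older board, PCIe 3.0 x4", "PCIe 3.0", 4),
    ("typical GPU, PCIe 3.0 x16",              "PCIe 3.0", 16),
    ("typical GPU, PCIe 4.0 x16",              "PCIe 4.0", 16),
]

for name, gen, lanes in configs:
    print(f"{name}: ~{PER_LANE_GB_S[gen] * lanes:.1f} GB/s")
```

An x4 link has only a quarter of the headroom of x16 at the same generation, which is why exceeding its 4 GB hurts the 6500 XT so much more than older 4 GB cards with full-width slots.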
 
So, is it a problem with the GPUs, or with Ubisoft and their inability to properly build, design and implement their HD textures? Seriously, it's one crappy game (in my opinion - I'm not a fan of the Far Cry series; it's like Madden, CoD or BF games, the same rehashed crap with every release) that gates its HD textures, and everyone constantly brings it up as proof that new GPUs can't run it because of the VRAM amount.

I'm not supporting Nvidia for having 8-10GB of VRAM on their mid-high to high end GPUs. I just hate that people are so hung up on FC6 and use it as the only game that cannot run the HD textures due to not having more than 8-10GB of VRAM.

So, again, is it a problem with the GPUs or with Ubisoft and their inability to properly build, design and implement their HD textures?
Not only that but the xx60 cards are entry tier, not mid or high tier. 70s are mid, 70Ti and up are mid-high or high. 50s are always the "barely usable in gaming" ones and therefore can't even be considered entry tier for a gaming GPU.

We're given a choice of 12 GB, or 8 GB with much better bandwidth, for an entry tier card - that's more than bloody enough. It's equivalent to the previous gen's mid-high tier, which is typically how it goes between generations that had a decent leap, so it makes sense.

What's all the fuss about the 3060 Ti only having 8 GB? If your needs really require solid use of all 12 GB of a card, you won't buy a 3060 anyway, because you'll be bottlenecked by a weaker GPU and less memory bandwidth. The card is exactly in the range it should be; the problem is market prices prohibiting regular middle-class people from getting a mid-high end GPU, not the characteristics of a shitty entry level card being shitty. I have a 3060 Ti and would only be happier about it if I had paid MSRP - it behaves like a solid mid tier card for everything except the 1% of cases where you need more than 8 GB of memory.
 
I find it interesting that at 4K (RT off) Far Cry 6 only uses around 9.5 GB of VRAM, and I don't know what to say about RE Village - according to TPU, at 4K it only used just shy of 8 GB of VRAM.

The artificial limitation of VRAM amounts sounds like a ploy to make 8-10GB GPUs bad when it shouldn't really be a problem. Games and your system will allocate VRAM and swap textures very well without having an artificial limit in place to make it sound like you lack enough VRAM to run the game at the best settings.

I see no reason why any GPU in the mid-high to high range from AMD or Nvidia would have any issues running games with HD textures if the artificial limits weren't put in place. If the games had a suggested VRAM notification, that would be fine. If you wanted to max out all settings at 4K in FC6, the game could show you the suggested VRAM amount - how much you have and how much it suggests is needed - but still let you run the game anyway; your system will handle loading and offloading textures as needed. Let the system do what it was designed to do. If performance really does become an issue, then at that point the user can work on adjusting settings.

I literally own the games and the card. Both games get to a point where frame rates become unplayable, both at 1440p. It doesn't happen immediately; it's triggered after some time of play in both games. The rest depends on the settings - if the card can do more, why would I disable RT or lower the settings?

I even tried 1080p in FC6 and it's just as bad, since you can literally max everything: all ultra, RT on.
 
I literally own the games and the card. Both games get to a point where frame rates become unplayable, both at 1440p. It doesn't happen immediately; it's triggered after some time of play in both games. The rest depends on the settings - if the card can do more, why would I disable RT or lower the settings?

I even tried 1080p in FC6 and it's just as bad, since you can literally max everything: all ultra, RT on.

That just sounds like a poorly made game tbh.
 
That's a great idea actually.
Is it just me, or is that like scalping with extra steps?

Kidding mostly. One does what one has to these days...
 
Is it just me, or is that like scalping with extra steps?

Kidding mostly. One does what one has to these days...

Honestly, with how shit the market is, if people can take advantage of it to get the hardware they want, they should... I sold a Titan Xp for $900 and put that toward a 3080 Ti, making it a much more sensible purchase... My buddy sold a 5700 XT for nearly the same and picked up a 6800 XT.
 
Going back to this: Far Cry 6 has an HD texture pack that requires 11 GB, and it doesn't work on a 3060 Ti. Ask me how I know.

A card like the 3060 Ti or the 3070 with 8 GB was a mistake, especially at the prices these cards sell for - and I'm only considering MSRP, not the real prices.
I have a 3090, and that HD pack just made everything look fuzzy
 
Some of you guys are pretty funny on here. Who in their right mind is going to get a 3060 to game at 4K?
My next monitor will be a 4K one, and I'm not going to upgrade from my 1080 Ti in the near future. That card isn't THAT much faster than a 3060.

Nobody is pointing a gun at my head and forcing me to play the newest AAA titles at max details, if that's the concern when people wonder why 4K.
 
At this point, unless you come up in the queue in the next month or so, I'd probably skip both and wait for the 4000 series, which will likely have adequate VRAM. I personally wouldn't be shocked if none of the Nvidia cards from the 3080 down age well, due to Nvidia cheaping out on VRAM.

If I had to choose one of these two, though, it would be the Ti - it's quite a bit faster - but having to lower settings in a year or so on a $500+ GPU feels bad.
I would not wait on the 4000 series; at 5 nm, pricing will be high due to poor yields and other factors - the days of inexpensive GPUs are over. I run a Zotac 3080 at 2K and it has no problem with 144 Hz.
 
I was inclined to buy the 3060 when it came up, then buy a 3060 Ti when that came up and sell the 3060.

This is basically what I did, except with a few more steps... (Got a 3060ti at early cheapish scalp prices, then got a 3060ti from EVGA which I sold BNIB, then got a 3070ti at MSRP and sold the OG 3060ti)

In the end I have a 3070ti and I'm only down ~$300.
 
This is basically what I did, except with a few more steps... (Got a 3060ti at early cheapish scalp prices, then got a 3060ti from EVGA which I sold BNIB, then got a 3070ti at MSRP and sold the OG 3060ti)

In the end I have a 3070ti and I'm only down ~$300.
That's really nice - hindsight is 20/20 for me. I wish I had signed up for a 3070 or 3070 Ti early on. Instead I signed up only for a 3060, for some stupid reason. Well, I know the reason: I expected prices to drop sooner, and paying over $400-500 seemed too much when that tier would normally have gone for significantly less on the open box / gently used market. For example, I bought an excellent version of a 1080 for a family member's computer for $300 when it was still the current GPU series.
 

I have no temperature problems with this card in this case; I've never seen it go past 72°C.

That just sounds like a poorly made game tbh.

In what world do you live where games come optimized? Certainly not mine - most games are a mess these days. Reality is just what it is.
 
In what world do you live where games come optimized? Certainly not mine - most games are a mess these days. Reality is just what it is.

Sadly, that's the truth: dig deep enough into a game, and there's usually regular talk about how it's not properly optimized for the hardware (not properly threaded on the CPU, outdated engine, etc.).
 
Not only that but the xx60 cards are entry tier, not mid or high tier. 70s are mid, 70Ti and up are mid-high or high. 50s are always the "barely usable in gaming" ones and therefore can't even be considered entry tier for a gaming GPU.
x60 has never been entry level. x30 is where the entry level really is. The fact that Nvidia has recently forgotten about it (like AMD did long ago) doesn't change that. The 3060 gives you solid 1440p (or 1080p maxed out) gaming. Hell, even the 3050 does with some compromise and/or DLSS. I'd say this perfectly describes what mid-tier means.

We're given a choice of 12 GB, or 8 GB with much better bandwidth, for an entry tier card - that's more than bloody enough. It's equivalent to the previous gen's mid-high tier, which is typically how it goes between generations that had a decent leap, so it makes sense.

What's all the fuss about the 3060 Ti only having 8 GB? If your needs really require solid use of all 12 GB of a card, you won't buy a 3060 anyway, because you'll be bottlenecked by a weaker GPU and less memory bandwidth. The card is exactly in the range it should be; the problem is market prices prohibiting regular middle-class people from getting a mid-high end GPU, not the characteristics of a shitty entry level card being shitty. I have a 3060 Ti and would only be happier about it if I had paid MSRP - it behaves like a solid mid tier card for everything except the 1% of cases where you need more than 8 GB of memory.
This I agree with. I'm still more than happy with my 2070, and I cannot grasp what the fuss about 12 GB of VRAM is on cards of a similar level.
 
x60 has never been entry level. x30 is where the entry level really is. The fact that Nvidia has recently forgotten about it (like AMD did long ago) doesn't change that. The 3060 gives you solid 1440p (or 1080p maxed out) gaming. Hell, even the 3050 does with some compromise and/or DLSS. I'd say this perfectly describes what mid-tier means.
Yeah, I don't know what that person is talking about... Entry level is anything above the integrated graphics level of cards, and x30-x50 falls into that range. x60-x70 are the dedicated gamer cards, historically. Anything above is various degrees of high-end to super enthusiast to "I have a lot of money, look at my shiny build log with lots of brand names on display". Don't forget people used to SLI/Crossfire mid-range cards for more performance on a budget too. I know technology moves fast, but we're still talking about recent generations. Whatever has happened since 2019 is completely out of the norm, in prices and in what the makers can get away with as a result of the shortages, and is like some nightmarish alternate reality.
 