
3060 Ti 8 GB vs 3060 12 GB

Why is this thread still going? It's not even on topic anymore...

It's the never-ending debate, kind of like quad cores, 8 GB vs 16 GB of system RAM, AMD vs Nvidia. :slap:

I was kinda surprised it got revived as well.
 
x60 has never been entry level; x30 is where entry level really is. The fact that Nvidia has recently forgotten about it (like AMD did long ago) doesn't change that. The 3060 gives you solid 1440p (or 1080p maxed out) gaming. Hell, even the 3050 does, with some compromise and/or DLSS. I'd say this perfectly describes what mid-tier means.


This I agree with. I'm still more than happy with my 2070, and can't grasp what the fuss around 12 GB of VRAM is about on cards of a similar level.
For the longest time, a 60-class card has been an entry gaming card. Don't confuse an entry gaming card with an entry card... my whole comment is about gaming GPUs. Anything below the 60s can typically be matched by a cheap mid-tier card from the previous gen (not cheap anymore these days, I know). 30-class cards are basically trash for office PCs that want to claim they have a dedicated GPU; I've never seen a 30 or 50 series card game acceptably. 60-series cards game acceptably in high-res, latest-gen games of their era, so they should be considered the entry level for gaming setups. Let's call it enthusiast entry level.

Entry level is anything above integrated graphics, and x30-x50 falls into that range. x60-x70 are the dedicated gamer cards, historically.
Exactly my point, though. Entry level for a gaming setup is not the same as entry level for a regular office PC or whatever. The 60s have always been the bare minimum any proper gamer is willing to go for. Below that, you're buying trash and you know it. Thus I (and many others) call this tier "gaming entry level".

I have no temperature problems on this card in this case; I've never seen it go past 72°C.



In what world do you live where games come optimized? Certainly not mine. Most games are a mess these days; reality is just what it is.
If you've got an LHR v1 card and you're seeing core temps (hotspot, I hope?) of 72°C, you're most likely running the memory at 90°C+. Sadly, Nvidia cheaps out on sensors too, so we can't know without a custom sensor. LHR v1 Hynix memory is cheap af and can't handle those temps steadily long term, nor can it handle an overclock. LHR v2 Hynix is OK; it handles as much OC as Samsung, but if I'm not mistaken it still can't cope with temps as high as Samsung and Micron can.
It was just a suggestion anyway; problems that only happen after running something for a while tend to be related to long-term unstable temps.
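If you want to sanity-check the long-session temperature theory, here's a rough sketch (just an illustration using the pynvml Python bindings, not anything from the setups in this thread) of logging the core temperature trend over a couple of hours:

```python
# Rough illustration only (assumes the pynvml Python bindings, `pip install pynvml`):
# poll the core temperature during a long session to see whether it creeps up over
# time. As far as I know NVML does not expose the memory-junction sensor on GeForce
# cards, so memory temps still need a tool like HWiNFO or GPU-Z.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; change the index if needed

try:
    for _ in range(240):  # roughly two hours at one sample every 30 seconds
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{time.strftime('%H:%M:%S')}  core temp: {temp} C")
        time.sleep(30)
finally:
    pynvml.nvmlShutdown()
```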
 
If you've got an LHR v1 card and you're seeing core temps (hotspot, I hope?) of 72°C, you're most likely running the memory at 90°C+. Sadly, Nvidia cheaps out on sensors too, so we can't know without a custom sensor. LHR v1 Hynix memory is cheap af and can't handle those temps steadily long term, nor can it handle an overclock. LHR v2 Hynix is OK; it handles as much OC as Samsung, but if I'm not mistaken it still can't cope with temps as high as Samsung and Micron can.
It was just a suggestion anyway; problems that only happen after running something for a while tend to be related to long-term unstable temps.

You're just making crazy claims; the card ran CP77 for hours maxed out. This is a VRAM issue, I can see it went over the limit.
 
For the longest time, a 60-class card has been an entry gaming card. Don't confuse an entry gaming card with an entry card... my whole comment is about gaming GPUs. Anything below the 60s can typically be matched by a cheap mid-tier card from the previous gen (not cheap anymore these days, I know). 30-class cards are basically trash for office PCs that want to claim they have a dedicated GPU; I've never seen a 30 or 50 series card game acceptably. 60-series cards game acceptably in high-res, latest-gen games of their era, so they should be considered the entry level for gaming setups. Let's call it enthusiast entry level.
My GT 1030 got offended. I'm not saying that it plays the latest games. I'm saying that it produces acceptable framerates in age-appropriate titles with reduced graphical settings. This is what entry-level means.

I'm always puzzled when someone thinks "Ultra" is the only graphics option and anything below 60 FPS is unacceptable / office PC category. With this logic, consoles fit the office PC category too.

Edit: Also, why would an office PC need a GT 1030 (or even a GT 730)?
 
It's the never-ending debate, kind of like quad cores, 8 GB vs 16 GB of system RAM, AMD vs Nvidia. :slap:

I was kinda surprised it got revived as well.
It only applies to 8 vs 16 GB if you run the 8 GB at 4000 MHz and the 16 GB at 2133 MHz. :p I've had both cards, and the 3060 Ti is close to 30% faster, but the 3060 12 GB is nice for dual mining. ;)
 
You're just making crazy claims; the card ran CP77 for hours maxed out. This is a VRAM issue, I can see it went over the limit.

If the game slows down over time in a way that other, also very demanding, games don't, I'd say it's the game.
 
If the game slows down over time in a way that other, also very demanding, games don't, I'd say it's the game.

Am I talking but no one listens? It's the V R A M limit.
 
Am I talking but no one listens? It's the V R A M limit.

Maybe, but honestly it sounds like it's just one of those unnecessary texture packs for Skyrim that doesn't actually matter and is more an e-peen thing. You can make texture packs that "require" 20GB of VRAM, but ... why would you?
 
I literally own the games and the card. Both games get to a point where the frame rates become unplayable, both at 1440p. It doesn't happen immediately; it's after some time of play that it triggers in both games. The rest depends on the settings; if the card can do more, why would I disable RT or lower the settings?

I even tried 1080p on FC6 and it's just as bad, even though you can literally max everything: all ultra, RT on.

Am I talking but no one listens? It's the V R A M limit.

You literally said "it doesn't happen immediately; it's after some time of play that it triggers in both games".

I don't understand how @Frick's comment could come off as not listening.

I think Frick's comment is spot on, since you also said that you've played CP2077 for hours on end without issues.

When I play games for extended periods of time, even back when I ran my 980 Ti (looking at you, Shadow of Mordor; you maxed out the 6 GB VRAM limit on that card), they ran great and never started hitting unplayable frame rates.

To me, it sounds like the games are not well optimized if they run fine for a while and then produce unplayable frame rates as time goes on. The game should be dropping and loading textures into VRAM as needed. It sounds like it's not doing that very well and starts storing large textures in system RAM, which can cause noticeable slowdowns if the game keeps dumping more and more into system RAM without swapping textures out of VRAM. I could be wrong, but that's kind of what it sounds like is happening... that, or they have a memory leak.
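One rough way to tell those apart (sketch only, using the pynvml Python bindings, not anything the game itself provides): log VRAM usage while you play. A steady climb that never comes back down points at a leak or poor texture eviction, while sitting pinned at the 8 GB limit right before the slowdown points at spill-over into system RAM.

```python
# Sketch only (assumes the pynvml Python bindings, `pip install pynvml`): log
# dedicated VRAM usage over a play session. A steady climb that never comes back
# down suggests a leak or poor texture eviction; sitting pinned at the card's
# limit right before the slowdown suggests textures spilling into system RAM.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; change the index if needed

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"{time.strftime('%H:%M:%S')}  VRAM used: "
              f"{mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GiB")
        time.sleep(30)  # one sample every 30 seconds is plenty to see the trend
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```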
 
My GT 1030 got offended. I'm not saying that it plays the latest games. I'm saying that it produces acceptable framerates in age-appropriate titles with reduced graphical settings. This is what entry-level means.

I'm always puzzled when someone thinks "Ultra" is the only graphics option and anything below 60 FPS is unacceptable / office PC category. With this logic, consoles fit the office PC category too.

Edit: Also, why would an office PC need a GT 1030 (or even a GT 730)?
I wouldn't consider anything below 60 fps on minimum settings to be acceptable, and for many generations now the 50s and below can't do that in latest-gen games of their era. I'm not saying it should be that way, that's just how it is.

As for your question, there's plenty of use in offices for a low-end GPU, the most common being multi-display setups at a proper refresh rate!
 
You guys are acting like the definition of a "gamer" is to not just play games in a serious, dedicated manner, but to also do it at 144 Hz and 4K with 10-bit HDR or something. Most people can't afford, or don't think they need, anything beyond decent 1080p, and they don't necessarily need all settings maxed. I've known people who turn down a lot of graphics settings on their older systems so they could play at a high frame rate and stay competitive, and it was fine by them. 144 Hz is also mostly for certain first-person shooter type games; you don't need it for MMOs or strategy games. There are lots of game categories and games that one can play competitively that don't need the latest and greatest hardware, like LoL, Magic Arena, etc.

Many gamers have no idea how to build or upgrade their computer, or which parts are good, and lots play on laptops that overheat and still don't do anything about it.
 
I wouldn't consider anything below 60 fps on minimum settings to be acceptable, and for many generations now the 50s and below can't do that in latest-gen games of their era. I'm not saying it should be that way, that's just how it is.
Then you wouldn't consider console gaming acceptable either, yet many people around the world enjoy it. Nothing personal here, but your opinion seems to be coming from quite a snobby point of view. I, for one, can't really tell the difference between 40 and 60 fps, and used to have tons of fun in The Witcher 3 with a 750 Ti at 30 fps, 1080p, medium-high settings.

As for your question, there's plenty of use in offices for a low-end GPU, the most common being multi-display setups at a proper refresh rate!
Most offices don't need more than 2 monitors, which can be driven by an iGPU. If they did, we would see a lot more low-end cards on the market. Refresh rate isn't a concern either. Offices that actually need many monitors, high refresh rates, colour accuracy, etc. won't make do with a lowly GeForce or Radeon. That's what Quadros are for.
 
As for your question, there's plenty of use in offices for a low-end GPU, the most common being multi-display setups at a proper refresh rate!
That hasn't been the case in years. Most office and workstation computers (I don't mean those with Xeons; in that case very basic GPUs can be used) use integrated graphics, and the monitors connect to the motherboard. If a bundled motherboard has a single DP output, a splitter can be used. For example, an i3-7100 supports up to a 4K display at 60 Hz and up to 3 displays in total.
 
That hasn't been the case in years. Most office and workstation computers (I don't mean those with Xeons; in that case very basic GPUs can be used) use integrated graphics, and the monitors connect to the motherboard. If a bundled motherboard has a single DP output, a splitter can be used. For example, an i3-7100 supports up to a 4K display at 60 Hz and up to 3 displays in total.
Can you do 3x 1080p at 144 Hz? That's the use case I see every day in my field, especially for laptops. It might be outdated knowledge on the companies' side, or they might be too lazy to work with splitters, but the fact is that I still see people prioritizing either an unnecessarily expensive CPU or just a decent one with a dedicated (even if shitty) card.

On the other side of my point, who the actual f enjoys gaming at 30 fps? And what 50-series card performed better than that in games launched around the same year as the card? I have yet to see such a benchmark, and I've never met a gamer who happily uses a 50-series card without constant nagging and complaining. I can't fit a well-below-standard performance product into a "gaming" category; it just doesn't add up. The fact that they are marketed as gaming GPUs doesn't make them gaming GPUs. A GPU that can run a five-year-old game maxed out is nothing; non-gaming GPUs can do that too with older games, and they're clearly marketed as not for gaming. I have a 3060 Ti, and it's clearly a lowish mid-tier card, making the non-Ti an entry-tier card and the 3050 just a pile of shit. None of this is at ultra settings or the biggest resolution of the era; not even a mid-tier card can do that with an AAA game launched the same year. Also, PC gamers tend to stay years behind the latest resolution to prioritize performance with high-end cards, which tells you how much performance they're looking for. I'm not including the casual FIFA or F1 player; that's a typical console player who just decided to go for a better refresh rate.

I'm always puzzled when someone thinks "Ultra" is the only graphics option and anything below 60 FPS is unacceptable / office PC category. With this logic, consoles fit the office PC category too.
I guess you are that person. I'd throw my PC out the window rather than game under 60 fps. That's completely unacceptable and well below average. PC gamers don't see consoles as office PCs, at least not those you can usually upgrade. Console makers historically never aimed for more than a stable 60 fps, since TVs run at 50-60 Hz anyway. Of course this is changing, but we're talking about the past, not the future. It's not because they thought 60 fps was a great goal to aim for. Why do you think the PC vs console war exists? You're clearly OK with console standards; I'm not. Let's all be happy about our diverse opinions and shut up about it ;)
 
Can you do 3x 1080p at 144 Hz? That's the use case I see every day in my field, especially for laptops. It might be outdated knowledge on the companies' side, or they might be too lazy to work with splitters, but the fact is that I still see people prioritizing either an unnecessarily expensive CPU or just a decent one with a dedicated (even if shitty) card.

On the other side of my point, who the actual f enjoys gaming at 30 fps? And what 50-series card performed better than that in games launched around the same year as the card? I have yet to see such a benchmark, and I've never met a gamer who happily uses a 50-series card without constant nagging and complaining. I can't fit a well-below-standard performance product into a "gaming" category; it just doesn't add up. The fact that they are marketed as gaming GPUs doesn't make them gaming GPUs. A GPU that can run a five-year-old game maxed out is nothing; non-gaming GPUs can do that too with older games, and they're clearly marketed as not for gaming. I have a 3060 Ti, and it's clearly a lowish mid-tier card, making the non-Ti an entry-tier card and the 3050 just a pile of shit. None of this is at ultra settings or the biggest resolution of the era; not even a mid-tier card can do that with an AAA game launched the same year. Also, PC gamers tend to stay years behind the latest resolution to prioritize performance with high-end cards, which tells you how much performance they're looking for. I'm not including the casual FIFA or F1 player; that's a typical console player who just decided to go for a better refresh rate.


I guess you are that person. I'd throw my PC out the window rather than game under 60 fps. That's completely unacceptable and well below average. PC gamers don't see consoles as office PCs, at least not those you can usually upgrade. Console makers historically never aimed for more than a stable 60 fps, since TVs run at 50-60 Hz anyway. Of course this is changing, but we're talking about the past, not the future. It's not because they thought 60 fps was a great goal to aim for. Why do you think the PC vs console war exists? You're clearly OK with console standards; I'm not. Let's all be happy about our diverse opinions and shut up about it ;)

You do what you want with your PC and your gaming needs, but don't judge others for what they do with theirs and how they enjoy gaming. Not everyone is a graphics whore or needs the best of the best, nor can everyone afford it. You come off as very cynical about what others do and criticize them, only to say in the end that we should all just be quiet and be happy with our own opinions.

I don't care that you like to have 60 fps. I don't care if you game at 720p or 4K. Just like you shouldn't care about my experiences, which you can see below, or my brother's experiences...

I ran 5760x1080 with GTX 570s in SLI and I enjoyed my gaming experiences. I adjusted settings down to a mix of low to high (depending on the game) and enjoyed the games I played. When Far Cry 3 came out I had fun in it; I liked the antagonist in the game, he made the story worthwhile for me. With the mix of settings I used, I was pulling around 40 fps with my 570s across 5760x1080. I ran the cards for a bit over 4 years and they worked great for my needs.

My brother went on to use one of those GTX 570 cards for about 2.5 years after I stopped using them. He used it for his gaming needs and absolutely loved the experience it gave him. He wasn't maxing anything out, but at least we could now play Dying Light, because the GTX 280 he was using until I gave him the 570 wouldn't work with it; he actually got an on-screen message saying his GPU wasn't supported, and the game wouldn't run.

If someone wants to run a 3050 or 3060 or 3090 for their gaming needs, that's fine by me. Enjoy your experience.
 
Can you do 3x 1080p at 144 Hz? That's the use case I see every day in my field, especially for laptops. It might be outdated knowledge on the companies' side, or they might be too lazy to work with splitters, but the fact is that I still see people prioritizing either an unnecessarily expensive CPU or just a decent one with a dedicated (even if shitty) card.
That must be a very special use case. Most offices need some kind of office suite (usually Microsoft) and a web browser running, maybe with some basic virtualisation for added security. Literally any GPU can do that.

On the other side of my point, who the actual f enjoys gaming at 30 fps?
I do. So what?

I guess you are that person. I'd throw my PC out the window rather than game under 60 fps. That's completely unacceptable and well below average. PC gamers don't see consoles as office PCs, at least not those you can usually upgrade. Console makers historically never aimed for more than a stable 60 fps, since TVs run at 50-60 Hz anyway. Of course this is changing, but we're talking about the past, not the future. It's not because they thought 60 fps was a great goal to aim for. Why do you think the PC vs console war exists? You're clearly OK with console standards; I'm not. Let's all be happy about our diverse opinions and shut up about it ;)
Your opinion is fine, as long as you don't try to present it as universal truth that applies to everybody - which is exactly what you did.

Just because you feel like an x50-series card would be inadequate for your needs (which is OK), it doesn't mean it's below entry-grade stuff.
 
The buffer is a placebo and won't up the FPS.
 
The 3060 Ti is easily better; it's the only Nvidia GPU I would buy new atm, all the others are even worse. The 3060 has 12 GB; too bad only 4K needs it, and it's not a 4K card.
 
The 3060 Ti has come up for purchase now. I'm on the fence about buying it because the price, including shipping, tax, and a 3% discount, is actually around $500.

Due to the increased availability of graphics cards, the pricing on the secondary market has come down quite a bit.

The 3060 cost me $425 including shipping, tax, and discount. Ideally I'd be able to sell the 3060 for $450 on Craigslist and recoup the cost. I think it's likely I can get at least $400, but I'm not sure how much more than that. I don't want to sell on eBay. I'd be content with breaking even.

One factor to consider is whether availability will dry up yet again. Another is that the next-gen series will come out soon. Asus has also put out a press release saying they will cut 30-series prices by 25%.

Any thoughts?
 
25%? I heard about slashing 10% off. But even if it's 25%, will they get close to MSRP where you live? I went for a 3060 Ti too; it was an open box (resealed) with a full 3-year warranty, and the card was in great shape, brand-new looking. I have about 60 days in which to return it, so I will wait it out until May to see if prices drop more. If not, it will have to last me a long time, at least 3 years, as my GTX 1070 Ti did.
 
If you value visual quality and better texture streaming performance, and you play modern games at a capped frame rate, get the 3060 12 gig.

At the other extreme, if you're OK with lower-quality textures, prefer the maximum frame rate possible, and can live with occasional low-VRAM texture-streaming stutters, then get the 3060 Ti.
 
That's crazy talk. I game @ 3840x2160 with an 8 GB card and it does pretty well. Some games don't like it, but most of my games run just fine. Sliders are usually maxed, but not always. RDR2 is a bit brutal.
 
That's crazy talk. I game @ 3840x2160 with an 8 GB card and it does pretty well. Some games don't like it, but most of my games run just fine. Sliders are usually maxed, but not always. RDR2 is a bit brutal.
Hi,
Wild thread here, 325 posts now.

Funny, I saw R2D2 in that last bit :laugh:
 