Friday, December 18th 2020

NVIDIA GeForce RTX 3060 to Come in 12GB and 6GB Variants

NVIDIA could take a similar approach to sub-segmenting the upcoming GeForce RTX 3060 as it did for the "Pascal" based GTX 1060, according to a report by Igor's Lab. Mr. Wallossek predicts a mid-January launch for the RTX 3060 series, possibly on the sidelines of the virtual CES. NVIDIA could develop two variants of the RTX 3060, one with 6 GB of memory and the other with 12 GB. Both the RTX 3060 6 GB and RTX 3060 12 GB probably feature a 192-bit wide memory interface. This would make the RTX 3060 series the spiritual successor to the GTX 1060 3 GB and GTX 1060 6 GB, although it remains to be seen whether the segmentation is limited to memory size or also extends to the chip's core configuration. The RTX 3060 series will likely go up against AMD's Radeon RX 6700 series, with the RX 6700 XT rumored to feature 12 GB of memory across a 192-bit wide memory interface.
Source: Igor's Lab

126 Comments on NVIDIA GeForce RTX 3060 to Come in 12GB and 6GB Variants

#51
lexluthermiester
bugIt's been proven time and again. Many game performance reviews right here in TPU show a level of allocated VRAM and no penalty for cards having less VRAM than that.
While that's true, the extra VRAM does enable extra functionality. Always has.
#52
evernessince
bugIt's been proven time and again. Many game performance reviews right here in TPU show a level of allocated VRAM and no penalty for cards having less VRAM than that.
To a certain extent. The drivers and game engine can manage to be frugal about VRAM usage when somewhat limited. Typically you start seeing issues when you are about 30% short of the optimal VRAM allocation. When that does happen, though, the game becomes unplayable pretty quickly, typically taking a large hit to 1% lows and frame consistency.

A 6GB card is in the same position as the 3GB 1060 was years back. Just barely enough VRAM to last until Nvidia's next products come out. 6GB is already around that 30% threshold on Cyberpunk 2077 at 1080p.

Then again, the 1060 3GB launched at $200, while this is a $300-350 USD card. For 1080p gaming in this day and age, 6GB is not a compromise you should have to make at this price point.
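
As a rough sketch of what that 30% figure means in practice; the 8.5 GB "optimal" allocation below is an assumed example for illustration, not a measured number:

```python
# Hypothetical: if a game's comfortable 1080p allocation were ~8.5 GB,
# a 6 GB card would be short by roughly the 30% threshold mentioned above.
optimal_gb = 8.5   # assumed optimal allocation (illustrative only)
card_gb = 6.0

shortfall = (optimal_gb - card_gb) / optimal_gb
print(f"Shortfall: {shortfall:.0%}")  # ~29%
```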
#53
Nihilus
Armature hour here at TPU.
"6 GB is a sad joke, look at this chart!!"
Very few games actually need more than 6 GB, especially at this performance level. That's a FACT. God forbid you tweak textures just a hair.

"Why 6 GB and 12 GB? Why not 8 GB??" Christ people, it is a 192 bit card. What is so confusing about it. The 12 GB version will be for those that really want to crank up RTX and other details. They are giving you options.
evernessinceTo a certain extent. The drivers and game engine can manage to be frugal about VRAM usage when somewhat limited. Typically you start seeing issues when you are about 30% short of the optimal VRAM allocation. When that does happen, though, the game becomes unplayable pretty quickly, typically taking a large hit to 1% lows and frame consistency.

A 6GB card is in the same position as the 3GB 1060 was years back. Just barely enough VRAM to last until Nvidia's next products come out. 6GB is already around that 30% threshold on Cyberpunk 2077 at 1080p.

Then again, the 1060 3GB launched at $200, while this is a $300-350 USD card. For 1080p gaming in this day and age, 6GB is not a compromise you should have to make at this price point.
We don't even know the prices yet and you have no idea what you are talking about. Find me a game where the 4 GB 1050ti performed better than the 3 GB 1060, and I will find 10 where the opposite is the case.
evernessinceSure and the same could be said for the 7970 Toxic I had years back at 1080p as well. If only I played old games and those light on resources I would have no reason to upgrade still. If that's the stance you are taking then there's no point for you to even partake in the discussion as this product clearly isn't targeted at people of the same mindset. We know for a fact that AAA titles on the 1060 3GB suffered performance wise a year after the card's launch due to lack of VRAM.
Like which game exactly? You do realize that the 3 GB version had less cores?
#54
lexluthermiester
NihilusAmateur hour here at TPU.
Let's review....
(BTW, you got that wording wrong)
NihilusVery few games actually need more than 6 GB, especially at this performance level. That's a FACT. God forbid you tweak textures just a hair.
Hmm. Interesting opinion. Opinion, not fact, but let's move on...
Nihilus"Why 6 GB and 12 GB? Why not 8 GB??" Christ people, it is a 192 bit card. What is so confusing about it. The 12 GB version will be for those that really want to crank up RTX and other details. They are giving you options.
This statement shows several things. First, it shows that you have not been paying attention to game advances for the last 3 years. Second, it shows that you have not been paying attention to the reviews done ALL OVER the internet. Third, assuming TPU staff were getting things wrong somehow, most of the rest of the reviewers on the internet are ALSO getting the very same things wrong.

Now either EVERYONE ELSE on the planet is getting something wrong and you & a few other people are right (possible), or you and the people who think like you are missing something very important.

Which do you think is more likely? Hmm?
NihilusAmateur hour here at TPU.
You were saying?
#55
dirtyferret
NihilusArmature hour here at TPU
We are going to work on sculptures?
#56
Caring1
dirtyferretWe are going to work on sculptures?
Maybe he's building an electric motor?
#57
Minus Infinity
Vya DomusTo be fair, I don't think even Nvidia in their infinite greed could sell a mid-range GPU with more VRAM than an $800 card that's twice as fast. But then again, they do push the boundaries of what's acceptable and what isn't with every release, so who knows.
Well, they don't care about the 3060 Ti cannibalising 3070 sales.
#58
Vader
In fairness to the 1060 3GB, most reviewers used either high or ultra graphics settings, while users would've opted for medium/low textures instead to avoid tanking their FPS.

What bothers me is that these cards feel like a trap in the long term, because even if you have to run a game at its lowest settings, with ultra textures it still looks good. Extra memory helps to increase the longevity of the card IMO.
#59
Nihilus
lexluthermiesterLet's review....
(BTW, you got that wording wrong)

Hmm. Interesting opinion. Opinion, not fact, but let's move on...

This statement shows several things. First, it shows that you have not been paying attention to game advances for the last 3 years. Second, it shows that you have not been paying attention to the reviews done ALL OVER the internet. Third, assuming TPU staff were getting things wrong somehow, most of the rest of the reviewers on the internet are ALSO getting the very same things wrong.

Now either EVERYONE ELSE on the planet is getting something wrong and you & a few other people are right (possible), or you and the people who think like you are missing something very important.

Which do you think is more likely? Hmm?


You were saying?
Ok glad you got that out, feel better? Show me in TPU game reviews other than id software where 6 GB is a problem.

If you are still having anxiety after seeing the 8 GB+ vram ALLOCATION graphs, just buy the 12 GB version. Others will be better happy to save a few bucks and get the 6 GB version. Wow, choices! Everyone is happy!

Ps International tech forums are probably not the best place to be spelling/grammar nazis...
#60
Pumper
Bubster12 GB for 3060 and not for 3080 or 3070 !!!! ??? wtf
NVIDIA is supposed to release 16GB versions of the 3060 Ti and 3070, and a 20GB 3080, by that time.
#61
Chrispy_
Pricing theory:

4.5 years ago, the 1060 6GB variant launched at $249. Increase by 8.4% for inflation and 25% for current US-China trade tariffs that didn't exist in 2016.
That means the 3060 12GB should be priced somewhere close to $337 in order to approximately match the target price of the 1060 6GB.

4.5 years ago, the 1060 3GB variant launched at $199. The same inflation and tariff increases mean that the 3060 6GB should be priced somewhere close to $270.

Given Nvidia have also increased their greed in the last 4.5 years, I would expect $349 and $299 prices, respectively.
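
For reference, the adjustment arithmetic works out like this as a quick sketch; the 8.4% inflation and 25% tariff figures are the assumptions stated above:

```python
# Adjust the Pascal-era launch prices by the assumed inflation and tariff rates.
def adjusted_price(launch_usd, inflation=0.084, tariff=0.25):
    return launch_usd * (1 + inflation) * (1 + tariff)

print(f"1060 6GB ($249) -> 3060 12GB target: ${adjusted_price(249):.0f}")  # ~$337
print(f"1060 3GB ($199) -> 3060 6GB target:  ${adjusted_price(199):.0f}")  # ~$270
```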
#62
N3M3515
NihilusThe 12 GB version will be for those that really want to crank up RTX and other details.
No they won't. Not enough horsepower. Of this 12GB at least 4GB are worthless.
#63
Vya Domus
Chrispy_4.5 years ago, the 1060 6GB variant launched at $249.
Isn't that insane? The same SKU now is supposedly going to have the same amount of VRAM.
#64
Chrispy_
Vya DomusIsn't that insane? The same SKU now is supposedly going to have the same amount of VRAM.
It's happened before and it'll happen again.

The problem is that GDDR BGA packages only come in densities that double with each step. Whilst a sensible amount of RAM for this card might be around 8GB, it's difficult to achieve that without crippling the already-limiting 192-bit bus, or increasing the PCB complexity and manufacturing cost (and therefore MSRP) by using a 256-bit bus instead.

The x60 SKU has always been about getting the best price/performance ratio in the fiercely-competitive mainstream price range, and a fully-utilised 192-bit bus is the way Nvidia (and AMD) have historically met that price point. I'm sure if they could buy GDDR6 packages at an in-between density that lands on 8GB across six chips, they would, but unfortunately all that is on the market is 8Gbit and 16Gbit packages, leading to 6GB and 12GB card configs.
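
A quick sketch of why the bus width pins the capacity to exactly those two figures, assuming one GDDR6 package per 32-bit channel and the standard 8 Gbit / 16 Gbit densities:

```python
# A 192-bit bus is six 32-bit channels, one GDDR6 package per channel.
BUS_WIDTH_BITS = 192
CHANNEL_BITS = 32
chips = BUS_WIDTH_BITS // CHANNEL_BITS            # 6 packages

for density_gbit in (8, 16):                      # GDDR6 densities in production
    capacity_gb = chips * density_gbit / 8        # Gbit -> GB per chip, times 6
    print(f"{density_gbit} Gbit packages -> {capacity_gb:.0f} GB")   # 6 GB / 12 GB

# 8 GB on this bus would need an in-between density that isn't made.
print(f"Density needed for 8 GB: {8 * 8 / chips:.1f} Gbit per package")  # ~10.7
```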
#65
RandallFlagg
evernessinceModern games utilize more than that at 1080p. This is for CP 2077:
[Cyberpunk 2077 VRAM usage chart]
Sure, that's an option. Then again, you are dropping $300-350 USD on a GPU and only playing at 1080p to begin with. What other compromises will you have to make with your new GPU as games take more and more VRAM? Who spends that kind of money to play at such a low resolution, knowing their card doesn't have enough VRAM for modern games, let alone down the line? The only scenario where it makes sense to get the 6GB card is if you are only keeping the card for a single gen.


2GB more alone would have been a massive boost for this card.
If we assume the 3060 will perform about like a 2070 or 2070 Super, then there is no configuration of this game requiring more than 6GB of VRAM at which the 3060 could deliver more than about 40 FPS average.

1440p Ultra with RT off is right at that 6GB VRAM line, and a 2070 can't even average 40 FPS at those settings.

It looks to me like 6GB will be fine up to 1440p Ultra settings with RT disabled on Cyberpunk 2077, and that is one of, if not the, biggest VRAM hogs in existence. Any higher than that and you'll have a slideshow, not so much because of VRAM but because the GPU isn't powerful enough. The 2070 is already really borderline at 39 avg FPS.

The only place I can think of the extra VRAM helping is custom texture mods; however, one of the most heavily modded games, Skyrim, only uses about 4GB with 2K textures.

Now 3060 Ti, 3070 and so on - sure, they have the GPU chops to actually use settings that need 8 GB of VRAM.


#66
lexluthermiester
NihilusOk glad you got that out, feel better? Show me in TPU game reviews other than id software where 6 GB is a problem.
No, you're the one having issues, you go do your own research. I'm not your errand boy.
NihilusPs International tech forums are probably not the best place to be spelling/grammar nazis...
Then correct yourself. It's called proof-reading. Try it sometime. Otherwise you're just embarrassing yourself..
#67
Legacy-ZA
NihilusOk glad you got that out, feel better? Show me in TPU game reviews other than id software where 6 GB is a problem.

If you are still having anxiety after seeing the 8 GB+ vram ALLOCATION graphs, just buy the 12 GB version. Others will be better happy to save a few bucks and get the 6 GB version. Wow, choices! Everyone is happy!

Ps International tech forums are probably not the best place to be spelling/grammar nazis...
You have never modded games before, have you?
#68
Nihilus
lexluthermiesterNo, you're the one have issues, you go do your own research. I'm not your errand boy.

Then correct yourself. It's called proof-reading. Try it sometime. Otherwise you're just embarrassing yourself..
"No, you're the one have issues..." pffft yeah, great proof reading yourself.

I am not asking you to be my errand boy or to give me a ride home, I am just asking that you back up some of the FUD you are spreading.
Legacy-ZAYou have never modded games before, have you?
Obviously I wasn't talking about your 8K or whatever texture packs. If that is your thing, then all the better to have a 6 GB and 12 GB version.
#69
lexluthermiester
Nihilus"No, you're the one have issues..." pffft yeah, great proof reading yourself.
Oops, but you'll notice, I just corrected myself. Hmmm...
NihilusI am just asking that you back up some of the FUD you are spreading.
I don't need to. History and the whole world disagree with you. You're the one telling everyone else we're wrong; YOU prove up.
#70
efikkan
windwhirlWhy 6 GB? I don't get it.
Because it's a midrange card which doesn't need more.
evernessinceModern games utilize more than that at 1080p.
Allocated and utilized memory are not the same thing. GPUs are very good at compressing memory that is not currently in use, or only partially used. During the lifecycle of a single frame, multiple buffers are compressed and expanded significantly as they are used and then blanked during different render passes. This is why games can allocate 8-9 GB on a 6 GB card without swapping.

But when you actually do run out of VRAM, it's very noticeable. Proper reviews will reveal if this is a problem for a card or not.
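
As an aside, the number most monitoring tools and overlays report is exactly this driver-side allocation, not the per-frame working set. A minimal sketch, assuming the nvidia-ml-py (pynvml) bindings are installed:

```python
# Reads how much VRAM the driver currently has allocated/resident on GPU 0.
# This is the "allocated" figure overlays show, not what a frame actually touches.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"{mem.used / 2**30:.2f} GiB used of {mem.total / 2**30:.2f} GiB")
pynvml.nvmlShutdown()
```
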
goodeedididSo 3080 is 10GB and the lower tier card 12GB?? What is the reasoning here?
Because buyers are clueless and think they need more VRAM than they actually do.
#71
Nihilus
lexluthermiesterHistory and the whole world disagree with you.
You couldn't possibly be any more pretentious. We are talking about vram usage in video games for Christ sake.
#72
lexluthermiester
NihilusYou couldn't possibly be any more pretentious.
You see how you just embarrassed yourself? Your above comment is evidence that you know you're incorrect about your flawed postulation. How you ask? Simple, instead of responding with an example that has some degree of merit, you offer a personal jab and insult. Well done.
NihilusWe are talking about vram usage in video games for Christ sake.
Incorrect again. The current discussion taking place is the old, tired debate about whether or not more RAM is better or indeed needed. It is an argument of futility, as functionality and some degree of future planning always prove to be the winning perspective, and the detractors (like yourself) always make up excuses for why they were right in the moment of the argument but are proved wrong as time moves forward.

History and practical technological application have already proven that more RAM is ALWAYS BETTER in both short-term & long-term computing usage considerations.
efikkanBecause it's a midrange card which doesn't need more.
That's an opinion. Not everyone shares it.
efikkanBecause buyers are clueless and think they need more VRAM than they actually do.
Another opinion, one that does not take into consideration all use-case scenarios or future gaming developments.
#73
Nihilus
lexluthermiesterYou see how you just embarrassed yourself? Your above comment is evidence that you know you're incorrect about your flawed postulation. How you ask? Simple, instead of responding with an example that has some degree of merit, you offer a personal jab and insult. Well done.

Incorrect again. The current discussion taking place is the old, tired debate about whether or not more RAM is better or indeed needed. It is an argument of futility, as functionality and some degree of future planning always prove to be the winning perspective, and the detractors (like yourself) always make up excuses for why they were right in the moment of the argument but are proved wrong as time moves forward.

History and practical technological application have already proven that more RAM is ALWAYS BETTER in both short-term & long-term computing usage considerations.
Stop trying to take the intellectual high ground by claiming you don't name call. Your a passive aggressive pretentious that writes a paragraph nitpicking every last sentence and grammatical error. Its an underhanded way of name calling and in some ways worse.

I was only pointing out that this whole thread is basically people complaining that the new 3060 has a 6 GB version and a 12 GB version, with some saying that is too much and others too little, when we don't even know the prices yet.
#74
efikkan
lexluthermiesterHistory and practical technological application have already proven that more RAM is ALWAYS BETTER in both short-term & long-term computing usage considerations.
If anything, history has proven that cards usually become obsolete in other ways before VRAM capacity becomes the limit.
lexluthermiester
efikkanBecause it's a midrange card which doesn't need more.
That's an opinion. Not everyone shares it.
Actually not.
For the game resolutions and performance intended for this card, 6 GB should be plenty.
Exceptions would be people who have other use cases that need more VRAM, like content creation or development, but those are edge cases.
lexluthermiester
efikkanBecause buyers are clueless and think they need more VRAM than they actually do.
Another opinion, one that does not take into consideration all use-case scenarios or future gaming developments.
Nope. This has been the case since at least the Radeon 200 series, where people keep arguing that certain cards are more "future-proof" due to having more VRAM.
The best prediction for future gaming would be current games. Using more VRAM per frame would require more bandwidth and more computational performance, and it's very unlikely that future games would somehow manage to use significantly more VRAM in a single frame and maintain the frame rate without also requiring more computational performance and bandwidth. The only way to do this would be to utilize some new rendering algorithm which somehow consumes VRAM capacity more than anything else (and doesn't require new hardware, etc.). This high unlikelihood is why such predictions about VRAM have failed over and over again for the last 10 years. Games will continue to get more demanding, but there is a proportional relation between computational workload, memory bandwidth and memory capacity (especially the last two), which seems to remain fairly consistent over time, even as games increase detail levels.
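
One way to see that bandwidth/capacity coupling is to work out how much memory the card can even touch once per frame; a back-of-the-envelope sketch below, where 14 Gbps GDDR6 on a 192-bit bus is an assumption rather than a confirmed spec:

```python
# Upper bound on bytes read/written per frame, given bus width and data rate.
bus_bits = 192           # assumed memory interface width
gbps_per_pin = 14        # assumed GDDR6 data rate
bandwidth_gb_s = bus_bits * gbps_per_pin / 8     # 336 GB/s

for fps in (60, 30):
    print(f"{fps} fps: at most {bandwidth_gb_s / fps:.1f} GB touched per frame")
# ~5.6 GB at 60 fps, ~11.2 GB at 30 fps: filling far more VRAM per frame
# would require proportionally more bandwidth, as argued above.
```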
#75
lexluthermiester
NihilusStop trying to take the intellectual high ground by claiming you don't name call.
I don't need to try...
efikkanIf anything, history has proven that cards usually become obsolete in other ways before VRAM capacity becomes the limit.
Not true. The 2GB version of the GTX 460 was still useful well beyond the point where its 1GB little brother hit a VRAM limit wall. Another example would be the GTX 560 Ti 2.5GB getting more life than its 1.25GB little brother. On AMD's side there was the Radeon HD 7850 1GB vs the 2GB version. There are a ton of examples like that.

History always proves people with opinions like yours wrong. So keep thinking what you want, and the rest of us will buy the cards with the higher amount of RAM and get more use out of them. Yes, yes.