Friday, December 18th 2020

NVIDIA GeForce RTX 3060 to Come in 12GB and 6GB Variants

NVIDIA could take a similar approach to sub-segmenting the upcoming GeForce RTX 3060 as it did for the "Pascal"-based GTX 1060, according to a report by Igor's Lab. Mr. Wallossek predicts a mid-January launch for the RTX 3060 series, possibly on the sidelines of the virtual CES. NVIDIA could develop two variants of the RTX 3060: one with 6 GB of memory, and the other with 12 GB. Both the RTX 3060 6 GB and RTX 3060 12 GB probably feature a 192-bit wide memory interface. This would make the RTX 3060 series the spiritual successor to the GTX 1060 3 GB and GTX 1060 6 GB, although it remains to be seen whether the segmentation is limited to memory size or also extends to the chip's core configuration. The RTX 3060 series will likely go up against AMD's Radeon RX 6700 series, with the RX 6700 XT rumored to feature 12 GB of memory across a 192-bit wide memory interface.
Source: Igor's Lab
Add your own comment

126 Comments on NVIDIA GeForce RTX 3060 to Come in 12GB and 6GB Variants

#26
Unregistered
The 3070 should've gotten 12 GB of VRAM on a 384-bit bus (correct me if I'm wrong, but I think that works out). The 3060 Ti just made the 3070 obsolete.
Posted on Reply
#27
Vayra86
dyonoctisNvidia is also marketing their gaming gpu really hard for content creation, maybe that's why they went with "double the vram on everything" :confused:
Ah yes, everyone is a video content creator these days, I totally forgot :D Totally worth it to have a card with roughly the same VRAM as two generations ago :) After all, a 12 GB 3060 is about on par with an 11 GB 1080 Ti, with respect to core power and VRAM. Strangely enough there were no creators back when Pascal launched.

Yes, this is how they sell it, rather than shit all over their half-VRAM version. PR and its truth, and VRAM for everyone...
Posted on Reply
#28
ymbaja
windwhirlWhy 6 GB? I don't get it.
Because this is their budget line (think of it as the 3 GB 1060).
Posted on Reply
#29
Bubster
12 GB for the 3060 and not for the 3080 or 3070?!?! WTF
Posted on Reply
#30
vctr
droopyROOr you turn the textures from ultra to high :)
The only questions i have are price and availability at launch.
$249 and $299 respectively; at $349 it makes no sense to sell it, as for (in theory) 50 bucks more you have the 3060 Ti.
Posted on Reply
#31
r9
Hopefully everything else is the same between the 6gb and 12gb version so we can see how much of an impact VRAM will actually have.
Posted on Reply
#32
siki
ChomiqMe neither, seeing how we already saw 6 GB RTX cards choke with RT enabled.
And you think more RAM will help you with that?
Nvidia should just sell a 100 GB graphics card; people would think they're set for life.
Posted on Reply
#33
Sithaer
Can't wait for the ~500$ starting price in my country. :laugh:
Posted on Reply
#34
dirtyferret
windwhirlMy thoughts were that 12 were too much and 6 is cutting it too close. 8 GB would have been better, I think.
I agree
Posted on Reply
#35
Nordic
ExcuseMeWtfIsn't 6GB enough for 1080p? Many still play on 1060 3GB LMAO, though they have to live with reducing texture levels by now.
I have a 1060 3gb and have not had to reduce texture levels yet. It depends on the games you play and I am not playing the latest and greatest.
Posted on Reply
#36
evernessince
fancuckerAllocation /= utilization. The GPU itself will become an obstacle before VRAM capacity. This has all been spurred by AMD attempting to produce more value with GDDR6 instead of GDDR6X in premium options
Modern games utilize more than that at 1080p. This is for CP 2077:

droopyROOr you turn the textures from ultra to high :)
The only questions i have are price and availability at launch.
Sure, that's an option. Then again, you are dropping 300-350 USD on a GPU only to play at 1080p to begin with. What other compromises will you have to make with your new GPU as games take more and more VRAM? Who spends that kind of money to play at such a low resolution, knowing their card doesn't have enough VRAM for modern games, let alone down the line? The only scenario where the 6 GB card makes sense is if you are keeping it for a single generation.


2GB more alone would have been a massive boost for this card.
Posted on Reply
#37
MxPhenom 216
ASIC Engineer
Vayra86Correct so if you arent storing you are fetching. Which adds latency to frames and can produce stutter. But this is not something history has proven at all. Totally new. :)
You will only have fetching happening if the data the GPU needs is not in cache. Typically, chips' cache designs and memory management have look-ahead functionality that makes sure what is needed next is available, fetching or swapping accordingly so it can be served from cache. There is also logic in the cache that only triggers a fetch from GPU RAM or system RAM when the cache control logic registers a miss.

The number of clock cycles needed for a fetch increases at each layer of memory in the system: cache requires fewer cycles than GPU RAM, which requires fewer than system RAM. And pulling data in real time from whatever storage medium is available is worse still, by a lot.
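As a rough illustration of that fall-through behavior (the cycle counts below are made-up placeholders for illustration, not real GPU figures):

```python
# Hypothetical per-tier latencies, in clock cycles. Each miss falls
# through to the next, slower tier, and the cost accumulates.
LATENCY_CYCLES = {"cache": 20, "vram": 300, "sysram": 1500}

def fetch(address, tiers):
    """Walk the memory hierarchy; return (data, total cycles spent)."""
    cycles = 0
    for name, store in tiers:
        cycles += LATENCY_CYCLES[name]
        if address in store:
            return store[address], cycles
    raise KeyError(address)

tiers = [
    ("cache",  {0x10: "texture_a"}),
    ("vram",   {0x10: "texture_a", 0x20: "texture_b"}),
    ("sysram", {0x30: "texture_c"}),
]

_, hit_cost  = fetch(0x10, tiers)  # served straight from cache
_, miss_cost = fetch(0x30, tiers)  # misses cache and VRAM, hits system RAM
print(hit_cost, miss_cost)  # 20 1820
```

A cache hit costs only the cache lookup, while a request that falls all the way through pays for every tier it touched, which is where the frame-time stutter in the quoted comment comes from.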

Posted on Reply
#38
xorbe
The memory sizes on this gen are just whack, lol.
Posted on Reply
#39
evernessince
NordicI have a 1060 3gb and have not had to reduce texture levels yet. It depends on the games you play and I am not playing the latest and greatest.
Sure, and the same could be said for the 7970 Toxic I had years back at 1080p as well. If I only played old games and titles light on resources, I would still have no reason to upgrade. If that's the stance you are taking, then there's no point in even partaking in the discussion, as this product clearly isn't targeted at people of that mindset. We know for a fact that AAA titles on the 1060 3GB suffered performance-wise a year after the card's launch due to lack of VRAM.
Posted on Reply
#40
lexluthermiester
windwhirlWhy 6 GB? I don't get it.
Instead of a 256-bit VRAM bus with 8 GB, it's going to be a 192-bit bus, thus 6 GB or 12 GB.
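That arithmetic can be sketched out: GDDR6 chips each expose a 32-bit interface, so the bus width fixes the chip count, and the per-chip density (1 GB vs 2 GB) fixes the total capacity. A minimal illustration (the function name is my own):

```python
# Sketch: VRAM capacity implied by bus width and GDDR6 chip density.
# Each GDDR6 chip occupies a 32-bit channel on the memory bus.
def vram_capacity_gb(bus_width_bits, chip_density_gb):
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_capacity_gb(192, 1))  # 6  -> RTX 3060 6 GB
print(vram_capacity_gb(192, 2))  # 12 -> RTX 3060 12 GB
print(vram_capacity_gb(256, 1))  # 8  -> e.g. a 256-bit card with 1 GB chips
```

This is why a 192-bit card lands on 6 GB or 12 GB rather than 8 GB: 8 GB would need either a 256-bit bus or mixed chip densities.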
dirtyferretWhy 12GB? The MSRP will be too close to the 8GB ti version and the chip will most likely be too slow to take advantage of that amount of video ram.
More VRAM at the cost of somewhat lower performance.
xorbeThe memory sizes on this gen are just whack, lol.
No arguments there. It has been a bit wonky.
Posted on Reply
#41
thesmokingman
xorbeThe memory sizes on this gen are just whack, lol.
It's just Nvidia that is monetizing vram.
Posted on Reply
#42
MxPhenom 216
ASIC Engineer
thesmokingmanIt's just Nvidia that is monetizing vram.
I mean both AMD and Nvidia do that, no?
Posted on Reply
#43
thesmokingman
MxPhenom 216I mean both AMD and Nvidia do that, no?
AMD not so much. Nvidia is taking this approach to an extreme.
Posted on Reply
#44
lexluthermiester
MxPhenom 216I mean both AMD and Nvidia do that, no?
Yes they do and always have.
thesmokingmanAMD not so much. Nvidia is taking this approach to an extreme.
They are not, let's not exaggerate.
Posted on Reply
#45
goodeedidid
So 3080 is 10GB and the lower tier card 12GB?? What is the reasoning here?
Posted on Reply
#46
Fluffmeister
goodeedididSo 3080 is 10GB and the lower tier card 12GB?? What is the reasoning here?
Because nothing is confirmed yet and the 10GB card still beats cards with 16GB.
Posted on Reply
#47
Icon Charlie
You are missing the main reason why both AMD and NVIDIA are doing this: to control the price point.

And that is that.

The fact is that last year I bought a brand-new AMD 5700 for $279.99 and a 5700 XT for $299.99. BRAND NEW. This was one week before the castrated 5600 XT came out at $279.99, which a day later bumped the 5700 and 5700 XT back up to $329.99 and $349.99-379.99, close to launch price. And those prices have stayed basically stable for the entire year.

That is all they are doing. Slotting castrated versions of cards to keep the premier cards at their current levels of pricing as long as possible.

I'm not buying anything this year. There is no value out there.
Posted on Reply
#48
defaultluser
thesmokingmanIt's just Nvidia that is monetizing vram.
No, it's not.

Ampere launched just as 2GB GDDR6 chips started to become affordable, but the top cards needed the added bandwidth bump from GDDR6X (which was only available in 1GB chips back in September).

Nvidia has been riding the same 1GB chip density for all cards until they have the top-end 3080 20GB (2GB x 10, probably a Ti model) available. After that, I would expect a refresh on the rest of the cards that need it (3070S 16GB at 16Gbps speed, to match that 6800), and later 3060 12GB.

AMD managed to get by with the same bus width as the RX 5700 XT, but had no added complexity from mixing memory types, so they just launched with all 2GB chips.
Posted on Reply
#49
lexluthermiester
Icon CharlieYou are missing the main reason why Both AMD and Nvidia is doing this. To control the price point.

And that is that.
Nope. Please see the comment right above this one by @defaultluser. The 2GB chip availability was a factor.
Posted on Reply
#50
bug
Vayra86Keep believing. Its almost like religion at this point.
It's been proven time and again. Many game performance reviews right here on TPU show a level of allocated VRAM and no penalty for cards having less VRAM than that.
Posted on Reply