Thursday, December 2nd 2021

NVIDIA GeForce RTX 2060 12GB Has CUDA Core Count Rivaling RTX 2060 SUPER

NVIDIA's surprise launch of the GeForce RTX 2060 12 GB graphics card could stir things up in the 1080p mainstream graphics segment. Apparently, there's more to this card than just a doubling of the memory amount. Specifications put out by NVIDIA point to the card featuring 2,176 CUDA cores, compared to 1,920 on the original RTX 2060 (6 GB). That is the same number of CUDA cores the RTX 2060 SUPER was endowed with; what sets the two cards apart is the memory configuration.

While the original RTX 2060 is based on a cut-down "TU106" silicon, the RTX 2060 12 GB is likely based on the larger "TU104," which would comfortably provide its higher CUDA core count. The RTX 2060 SUPER features 8 GB of memory across a 256-bit wide memory bus; the RTX 2060 12 GB, however, uses a narrower 192-bit wide bus, disabling a quarter of the "TU104" bus width. The memory data-rate on both SKUs is the same 14 Gbps. Segmentation between the two in terms of GPU clock speeds appears negligible: the original RTX 2060 boosts to 1,680 MHz, while the new RTX 2060 12 GB boosts to 1,650 MHz. Typical board power increases to 185 W, compared to 160 W for the original RTX 2060 and 175 W for the RTX 2060 SUPER.
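
For a sense of what the narrower bus costs, the peak memory bandwidth works out directly from these figures; a quick back-of-the-envelope sketch in Python (bus widths and the 14 Gbps data-rate are from the specs quoted above):

# Peak bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

for name, bus in [("RTX 2060 6 GB", 192), ("RTX 2060 12 GB", 192), ("RTX 2060 SUPER", 256)]:
    print(f"{name}: {peak_bandwidth_gbs(bus, 14):.0f} GB/s")
# RTX 2060 6 GB: 336 GB/s
# RTX 2060 12 GB: 336 GB/s
# RTX 2060 SUPER: 448 GB/s

In other words, the new card keeps the original RTX 2060's 336 GB/s rather than stepping up to the SUPER's 448 GB/s.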

Update 15:32 UTC: NVIDIA has updated their website to remove the "Founders Edition" part from their specs page (3rd screenshot below). We confirmed with NVIDIA that there will be no RTX 2060 12 GB Founders Edition, only custom designs by their various board partners.
NVIDIA is getting its add-in card partners to come up with several custom-design products based on the new SKU, which should occupy price-points below those of the RTX 3060 "Ampere." This could be an answer to AMD's Radeon RX 6600 (non-XT), which beats the RTX 2060 SUPER by 3% and the original RTX 2060 by 13%, at 1080p, in our testing. Technologically, the older "Turing" architecture won't find itself obsolete in the current market, as it maintains full DirectX 12 Ultimate compatibility.
Source: VideoCardz

99 Comments on NVIDIA GeForce RTX 2060 12GB Has CUDA Core Count Rivaling RTX 2060 SUPER

#51
Vya Domus
RandallFlaggSo lets assume you're right, and see how dumb this theoretical miner is.
Your definition of which of those configurations suck and which don't is arbitrary. If they can wait 400 days to get a profit out of a 3060 rig, they'll wait 500 days for a 580 one as well.

But you're ignoring what I said. Look it up: whether or not they're dumb, people are still building 580 rigs.
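
The payback arithmetic behind that point, as a minimal sketch (every price and profit figure here is a made-up placeholder for illustration, not market data):

def payback_days(card_cost_usd, daily_profit_usd):
    # Days until mining profit covers the card's cost; ignores power-cost
    # and difficulty changes, which dominate in practice.
    return card_cost_usd / daily_profit_usd

# Hypothetical figures purely for illustration:
print(payback_days(800, 2.0))   # a pricier, faster card: 400.0 days
print(payback_days(250, 0.5))   # a cheap RX 580-class card: 500.0 days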
Posted on Reply
#52
Totally
diopterSo they increased the CUDA core count and the VRAM size in order to justify increasing the price for gamers while giving some gaming performance increase. Then they left the memory bandwidth the same as the original 2060 in order to minimise the Ethereum mining performance and keep it equivalent to the older, lower-spec card. It will appeal to gamers and not really to miners. This was the intention.
They didn't increase the core count; it's a 2060 SUPER with more memory. It just would have been silly to call it the 2060 SUPER 12GB.
Posted on Reply
#53
AusWolf
TotallyThey didn't increase the core count; it's a 2060 SUPER with more memory. It just would have been silly to call it the 2060 SUPER 12GB.
Not exactly. 12 GB of VRAM necessitates six VRAM chips, which probably brings only 48 ROPs instead of 64. I'm curious how that'll translate into real-world performance.
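
Assuming Turing ties 8 ROPs to each 32-bit memory controller, one controller per VRAM chip (an assumption based on how the other Turing SKUs are configured, not a published figure for this card), the arithmetic looks like this:

# Assumed Turing layout: one 32-bit memory controller per VRAM chip,
# with 8 ROPs tied to each controller.
def rop_count(bus_width_bits, rops_per_controller=8, controller_bits=32):
    return bus_width_bits // controller_bits * rops_per_controller

print(rop_count(256))  # 64 ROPs on the 256-bit RTX 2060 SUPER
print(rop_count(192))  # 48 ROPs on the 192-bit RTX 2060 12 GB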
Posted on Reply
#54
Totally
deureBinning of failed dies from higher SKUs?
You haven't been paying attention at all, have you? These are 12 nm, so that can't possibly be the case.
Posted on Reply
#55
SSGBryan
SelayaWorse than the 2060S.
Sadge man. Why couldn't they just have revived the 2060S instead of making up this abomination w/ VRAM no one will ever make use of.

Oof.
This may come as a great shock to you, but people can actually use video cards for more than just mining or playing video games. For those of us who do things like 3D rendering, it is exactly what the doctor ordered in these trying times.
Posted on Reply
#56
Rob94hawk
watzupkenAt the end of the day, this product will have its place if the pricing is right. However, it is likely that the price is not going to be enticing. Nvidia refused to comment on pricing, and instead directed the question to their AIB partners. This is not a good sign; it means Nvidia is saying to their AIB partners, "price whatever you want, but your upper limit is the RTX 3060."

Anyway, I feel we have come full circle, back to TSMC 12 nm again. In my opinion, the initial GPU shortage happened because Nvidia dropped TSMC 12 nm-based Turing way too early, while at the same time their Samsung 8 nm-based Ampere was still struggling with supply.


Do you have an AGP board to run it? ;)
Still have the board in my sig. :D
Posted on Reply
#57
Selaya
SSGBryanThis may come as a great shock to you, but people can actually use video cards for more than just mining or playing video games. For those of us who do things like 3D rendering, it is exactly what the doctor ordered in these trying times.
True, but wouldn't you be better served by something like a 3060? (I guess it all depends on the price...)
Posted on Reply
#58
Mussels
Freshwater Moderator
This is good news for 2060 owners, as you're gunna get driver support for longer now :D
Posted on Reply
#59
sith'ari
:love: Brilliant :love: move by nVIDIA, if they play their cards (price) well...
Posted on Reply
#60
chrcoluk
"Nvidia have you stopped supporting the low budget market"

"No we contribute to something called the Switch"

:)
Posted on Reply
#61
MentalAcetylide
SSGBryanThis may come as a great shock to you, but people can actually use video cards for more than just mining or playing video games. For those of us that do things like 3d rendering - it is exactly what the doctor ordered in these trying times.
Depending on what kind of stuff you're rendering, that's still going to be a shit card to use for anything substantial with regard to Iray. With more VRAM but without the CUDA cores to go with it, you're still going to have long render times.
MusselsThis is good news for 2060 owners, as you're gunna get driver support for longer now :D
I'm not quite understanding the point of having all of these different versions. For example, I can understand the need for a low, mid, high, and ultra-high tier (the RTX 3090, for example), but having all of this in-between crap is just ridiculous. A 2080 Ti has roughly double the CUDA cores in comparison while having 1 GB less VRAM. Doesn't make any sense to me.
Posted on Reply
#62
sith'ari
MentalAcetylideI'm not quite understanding the point of having all of these different versions. For example, I can understand the need for a low, mid, high, and ultra-high tier (the RTX 3090, for example), but having all of this in-between crap is just ridiculous. A 2080 Ti has roughly double the CUDA cores in comparison while having 1 GB less VRAM. Doesn't make any sense to me.
I have already said my opinion about the RTX 2060 12GB on YouTube, so since I'm bored of writing the same things, I'll copy-paste my comment:
....about the RTX 2060 12GB: in my opinion, this is the most critical release of the entire year, if nVIDIA releases this product at the proper price. nVIDIA has the manufacturing process (12 nm) to push massive production quantities into a stagnant gaming market, and with the proper pricing they can make Intel's upcoming lineup obsolete/unnecessary (to gamers) before it even shows up... To my mind, the release of the RTX 2060 12GB in these market conditions seems: brilliant
Posted on Reply
#63
MentalAcetylide
sith'ariI have already said my opinion about the RTX 2060 12GB on YouTube, so since I'm bored of writing the same things, I'll copy-paste my comment:
That still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
Posted on Reply
#64
sith'ari
MentalAcetylideThat still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
My guess is... partly for marketing purposes, and partly out of necessity.
6 GB is borderline sufficient for future games, and if I understand the design restrictions correctly, nVIDIA could either stay with that amount or double it and go to 12 GB.
Since staying at 6 GB isn't an option, 12 GB is the only choice.
But marketing is an equally important factor.
Posted on Reply
#65
80-watt Hamster
MentalAcetylideThat still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
Perhaps to make up for the narrower memory bus. Or they figured that 6 GB wasn't enough, whether for marketing or performance, and the design doesn't support memory capacities between 6 and 12.
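
That capacity gap follows from the chip count: a 192-bit bus means six 32-bit GDDR6 chips, and the commonly shipped densities are 8 Gb (1 GB) and 16 Gb (2 GB) per chip. A small sketch of the options (this ignores clamshell or mixed-density layouts, which would be atypical for a card in this class):

BUS_WIDTH_BITS = 192
CHIP_BITS = 32                        # each GDDR6 package is a 32-bit channel
chips = BUS_WIDTH_BITS // CHIP_BITS   # -> 6 chips
for density_gb in (1, 2):             # 8 Gb and 16 Gb parts
    print(f"{chips} chips x {density_gb} GB = {chips * density_gb} GB")
# 6 chips x 1 GB = 6 GB
# 6 chips x 2 GB = 12 GB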
Posted on Reply
#66
Vayra86
Well yay, more core power and RT on top of what was already in perfect balance on a 256-bit bus, now handicapped with a 192-bit bus.

Well done, Nvidia. You just made a shit GPU that was surpassed by Pascal back in 2016. Well done making yet another 1080 that is effectively worse. I'll order none.
MusselsThis is good news for 2060 owners, as you're gunna get driver support for longer now :D
I love your optimism, I really do. Lol
Posted on Reply
#67
Mussels
Freshwater Moderator
MentalAcetylideThat still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
They would have had more of that VRAM available than the newer, higher-performance stuff, simple as that.

They make up their speed tiers, and then need new launches every year for investors/system builders, to throw some new name or logo in.
Then with the parts shortage they've had to get even weirder, re-using older designs since more parts were available. They don't want to sell 30-series cards for less yet, so why not sell a 20-series at a profit and keep the 30s as 'premium' parts everyone wants?

They kinda did this before with the 1650 and 1660, so it's not a super new concept
Posted on Reply
#68
watzupken
SelayaWorse than the 2060S.
Sadge man. Why couldn't they just have revived the 2060S instead of making up this abomination w/ VRAM no one will ever make use of.

Oof.
The decision is deliberate for sure, because this is meant to create another product tier below the RTX 3060. If you look at existing RTX 3060 performance, it is generally close to an RTX 2070, which happens to be about the same as the RTX 2060S. If they were to revive the 2060S, it would be too close for comfort to the lackluster RTX 3060, and would likely leave too big a performance gap between the RTX 2060 and the 3050 series.
Posted on Reply
#69
Vayra86
Musselsnot a super new concept
I see what you did there
Posted on Reply
#70
Linguica
I bit the bullet and bought a used 2060 Super for $600 back in August, when scalper prices reached their low ebb. One thing I noticed in Deathloop was that trying to use DLSS in combination with my 4K display resulted in the VRAM being overloaded, to the point that the game kept complaining unless I set the textures to "Low", which looked awful. Assuming DLSS continues to be a feature that helps lower-end cards punch above their weight and uprez to higher resolutions, the bump from 8 to 12 GB could help "future-proof" them a little bit against running into hard video memory limits.
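
For context on why DLSS doesn't relieve VRAM pressure much: the internal render resolution shrinks, but output-resolution buffers and the texture pool stay full-size. A rough sketch (the scale factors are DLSS's published per-axis ratios for each mode; the single RGBA8 buffer is a simplification, since real pipelines keep many buffers at varying precision):

# Size of one RGBA8 (4 bytes/pixel) buffer at a given resolution, in MB.
def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

OUT_W, OUT_H = 3840, 2160  # 4K output
for mode, scale in [("Native 4K", 1.0), ("DLSS Quality", 2 / 3), ("DLSS Performance", 0.5)]:
    w, h = int(OUT_W * scale), int(OUT_H * scale)
    print(f"{mode}: {w}x{h} internal, {buffer_mb(w, h):.1f} MB per buffer")
# Native 4K: 3840x2160 internal, 31.6 MB per buffer
# DLSS Quality: 2560x1440 internal, 14.1 MB per buffer
# DLSS Performance: 1920x1080 internal, 7.9 MB per buffer

Texture memory, which is what overflowed in Deathloop above, is independent of render resolution, so the jump from 8 GB to 12 GB helps in exactly this scenario.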
Posted on Reply
#71
80-watt Hamster
Owen19821. To the surprise of no one, it's been on the rumor mill for a while now.

2. It's still rubbish though! And why use fab capacity to build something old that nobody really wants? Are they using an old node that nobody is using?
MentalAcetylideThat still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
Vayra86Well yay, more core power and RT on top of what was already in perfect balance on a 256-bit bus, now handicapped with a 192-bit bus.

Well done, Nvidia. You just made a shit GPU that was surpassed by Pascal back in 2016. Well done making yet another 1080 that is effectively worse. I'll order none.
I don't understand this recurring line of thinking. Is this an instance of old = bad or something? Lots of folks, myself included, would love to have a card that performs on the level of a 2060, or a 2060 straight up. Even in its original form, performance is only about 20% behind the 3060, which nobody's calling out as a "shit card." Maybe they are, I dunno. "No bad cards, only bad prices" still holds, IMO. Granted, there are nothing but bad prices right now, and for the foreseeable future. But look at it this way: The more cards that can get shoved on the market, the further secondhand prices (and presumably retail ASPs) will fall come the next crypto crash. 'Sides, nobody's forcing anyone to buy the thing.

Posted on Reply
#72
Vayra86
80-watt HamsterI don't understand this recurring line of thinking. Is this an instance of old = bad or something? Lots of folks, myself included, would love to have a card that performs on the level of a 2060, or a 2060 straight up. Even in its original form, performance is only about 20% behind the 3060, which nobody's calling out as a "shit card." Maybe they are, I dunno. "No bad cards, only bad prices" still holds, IMO. Granted, there are nothing but bad prices right now, and for the foreseeable future. But look at it this way: The more cards that can get shoved on the market, the further secondhand prices (and presumably retail ASPs) will fall come the next crypto crash. 'Sides, nobody's forcing anyone to buy the thing.

The 2060 is an RT-capable GPU without the oomph to drive it properly, with 6 GB instead of 8 GB, and yet it's being sold at an MSRP close to that of a 2016 GPU that was better in every way.

I'm not sure what's there to cheer about. Sure, if you're happy buying yesterday's technology at an inflated price with a small handicap... jump on them. But they're going to be surpassed by much better stuff at similar price points sooner rather than later, and they already are today. The only issue is supply, and buying into a shit product is not a way to fix that. All it gets you is something that can game reasonably for a year and a half, maybe two, before it becomes utterly obsolete. At the end of the day you've still overpaid for something, so what's the progress here exactly?
Posted on Reply
#73
sith'ari
Vayra86The 2060 is an RT-capable GPU without the oomph to drive it properly, with 6 GB instead of 8 GB, and yet it's being sold at an MSRP close to that of a 2016 GPU that was better in every way.

I'm not sure what's there to cheer about.
It's an RTX 2060 in name only.
The original RTX 2060 was a TU106 die, while this one will be a TU104, probably with performance around the RTX 2060 SUPER's.
--By the way, do you know what its price is? Because as far as I know, it hasn't been announced yet...
Posted on Reply
#74
Vayra86
sith'ariIt's an RTX 2060 in name only.
The original RTX 2060 was a TU106 die, while this one will be a TU104, probably with performance around the RTX 2060 SUPER's.
--By the way, do you know what its price is? Because as far as I know, it hasn't been announced yet...
It'll have about 13% more shaders, so what do we expect here? Miracles? It's also got a tighter bus, so it won't matter in the end.
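
Putting numbers on that (shader counts and bandwidth figures are from the article above):

shaders = {"RTX 2060": 1920, "RTX 2060 12 GB": 2176}
bandwidth = {"RTX 2060": 336, "RTX 2060 12 GB": 336}   # GB/s at 14 Gbps

s = shaders["RTX 2060 12 GB"] / shaders["RTX 2060"] - 1
b = bandwidth["RTX 2060 12 GB"] / bandwidth["RTX 2060"] - 1
print(f"{s:+.0%} shaders, {b:+.0%} bandwidth vs the original RTX 2060")
# +13% shaders, +0% bandwidth vs the original RTX 2060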
Posted on Reply