
NVIDIA GeForce RTX 2060 12GB Has CUDA Core Count Rivaling RTX 2060 SUPER

Source? Recent eBay US sales range from $400-550, though I don't know how to tell who's buying them for what purpose. 6GB 1060s are going for the mid-$300s, so you could be right. Don't know what else would account for that price differential.

Miners buy up what is suitable for mining. This leaves a shortage of everything else, so the price of "everything else" goes up. Normal supply and demand at work.

trog
 
So let's assume you're right and see how dumb this theoretical miner is.

Your definition of which of those configurations suck and which don't is arbitrary. If they can wait 400 days to turn a profit on a 3060 rig, they'll wait 500 days for a 580 one as well.

But you're ignoring what I said. Look it up: whether or not they're dumb, people are still building 580 rigs:
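A rough sketch of that payback comparison (every number below is a hypothetical placeholder, just to illustrate the 400- vs 500-day point):

Code:
# Hypothetical mining payback estimate. Card costs, daily revenue and
# power costs are made-up placeholders, not measured figures.
def payback_days(card_cost_usd, daily_revenue_usd, daily_power_usd):
    """Days until the card earns back its purchase price."""
    return card_cost_usd / (daily_revenue_usd - daily_power_usd)

print(payback_days(800, 2.50, 0.50))  # scalped 3060: 400.0 days
print(payback_days(250, 1.00, 0.50))  # used RX 580:  500.0 days

Either way the miner eventually breaks even, which is the point: a longer payback doesn't stop the rig from being built.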
 
So they increased the CUDA core count and the VRAM size to justify a higher price for gamers while giving some gaming performance increase. Then they left the memory bandwidth the same as the original 2060's to minimise Ethereum mining performance and keep it equivalent to the older, lower-spec card. It will appeal to gamers and not really to miners. This was the intention.
They didn't increase the core count; it's a 2060 SUPER with more memory. It just would have been silly to call it the 2060 SUPER 12GB.
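On the bandwidth claim in the quote above: Ethash is memory-bandwidth-bound, and peak bandwidth is just bus width times effective data rate. A quick sanity check using the cards' 14 Gbps GDDR6:

Code:
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective Gbps
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 14))  # RTX 2060 and 2060 12GB: 336.0 GB/s
print(bandwidth_gb_s(256, 14))  # RTX 2060 SUPER:         448.0 GB/s

Same 336 GB/s as the original 2060, so roughly the same hashrate no matter how many cores it has.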
 
They didn't increase the core count; it's a 2060 SUPER with more memory. It just would have been silly to call it the 2060 SUPER 12GB.
Not exactly. 12 GB of VRAM means six 2 GB chips on a 192-bit bus, which probably brings only 48 ROPs instead of 64. I'm curious how that will translate into real-world performance.
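The arithmetic behind that, assuming the usual Turing back-end layout (one 32-bit memory controller with 8 ROPs per GDDR6 chip) carries over to this card:

Code:
# Turing ties ROPs to memory controllers: each 32-bit controller carries
# 8 ROPs and feeds one GDDR6 chip. Assumes the 12GB card keeps that layout.
def turing_backend(bus_width_bits, gb_per_chip):
    controllers = bus_width_bits // 32
    return controllers, controllers * gb_per_chip, controllers * 8

print(turing_backend(192, 2))  # (6 chips, 12 GB, 48 ROPs) - 2060 12GB?
print(turing_backend(256, 1))  # (8 chips,  8 GB, 64 ROPs) - 2060 SUPER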
 
Rebinning of higher-SKU failed dies?
You haven't been paying attention at all, have you? These are 12 nm, so that can't possibly be the case.
 
Worse than the 2060S.
Sadge, man. Why couldn't they have just revived the 2060S instead of making up this abomination with VRAM no one will ever make use of?

Oof.
This may come as a great shock to you, but people can actually use video cards for more than just mining or playing video games. For those of us who do things like 3D rendering, it is exactly what the doctor ordered in these trying times.
 
At the end of the day, this product will have its place if the pricing is right. However, it is unlikely that the price is going to be enticing. Nvidia refused to comment on pricing and instead directed the question to their AIB partners. This is not a good sign; it means Nvidia is telling their AIB partners, "price it however you want, but your upper limit is the RTX 3060."

Anyway, I feel we have come full circle, back to TSMC 12 nm again. In my opinion, the initial GPU shortage was because Nvidia dropped TSMC 12 nm-based Turing way too early while their Samsung 8 nm-based Ampere was still struggling with supply.


Do you have an AGP board to run it? ;)

Still have the board in my sig. :D
 
This may come as a great shock to you, but people can actually use video cards for more than just mining or playing video games. For those of us who do things like 3D rendering, it is exactly what the doctor ordered in these trying times.
True, but wouldn't you be better served with something like a 3060? (I guess it all depends on the price...)
 
This is good news for 2060 owners, as you're gonna get driver support for longer now :D
 
:love: Brilliant :love: move by nVIDIA if they play their cards (price) well...
 
"Nvidia have you stopped supporting the low budget market"

"No we contribute to something called the Switch"

:)
 
This may come as a great shock to you, but people can actually use video cards for more than just mining or playing video games. For those of us who do things like 3D rendering, it is exactly what the doctor ordered in these trying times.
Depending on what kind of stuff you're rendering, that's still going to be a shit card to use for anything substantial in Iray. With more VRAM but without the CUDA cores to go with it, you're still going to have long render times.

This is good news for 2060 owners, as you're gonna get driver support for longer now :D
I'm not quite understanding the point of having all of these different versions. I can understand the need for low, mid, high, and ultra-high tiers (e.g., the RTX 3090), but having all of this in-between crap is just ridiculous. A 2080 Ti has about double the CUDA cores in comparison while having 1 GB less VRAM. Doesn't make any sense to me.
 
I'm not quite understanding the point of having all of these different versions. I can understand the need for low, mid, high, and ultra-high tiers (e.g., the RTX 3090), but having all of this in-between crap is just ridiculous. A 2080 Ti has about double the CUDA cores in comparison while having 1 GB less VRAM. Doesn't make any sense to me.
I have already given my opinion about the RTX 2060 12GB on YouTube, so since I'm bored of writing the same things, I'll copy-paste my comment:

...about the RTX 2060 12GB: in my opinion, this is the most critical release of the entire year, if nVIDIA releases this product at the proper price. nVIDIA has the manufacturing process (12 nm) to push massive production quantities into a stagnant gaming market, and with proper pricing they can make Intel's upcoming lineup obsolete/unnecessary (to gamers) before it even shows up... To my mind, the release of the RTX 2060 12GB in these market conditions seems brilliant.
 
I have already given my opinion about the RTX 2060 12GB on YouTube, so since I'm bored of writing the same things, I'll copy-paste my comment:
That still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
 
That still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
My guess is... partly for marketing purposes, and partly out of necessity.
6 GB is borderline sufficient for future games, and if I understand the design restrictions correctly, nVIDIA could either stay with that amount or double it and go to 12 GB.
Since 6 GB isn't really an option going forward, 12 GB is the only choice.
But marketing is an equally important factor.
 
That still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.

Perhaps to make up for the narrower memory bus. Or they figured that 6 GB wasn't enough, whether for marketing or performance, and the design doesn't support memory capacities between 6 and 12 GB.
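For what it's worth, that gap falls straight out of GDDR6 chip densities. A sketch, assuming one chip per 32-bit channel and no clamshell mounting:

Code:
# GDDR6 ships in 8 Gb (1 GB) and 16 Gb (2 GB) densities, and a 192-bit
# bus feeds one chip per 32-bit channel, so capacity jumps 6 GB -> 12 GB.
chips = 192 // 32
for gb_per_chip in (1, 2):
    print(f"{chips} chips x {gb_per_chip} GB = {chips * gb_per_chip} GB")
# 6 chips x 1 GB = 6 GB
# 6 chips x 2 GB = 12 GB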
 
Well yay, more core power and RT on top of what was already in perfect balance with a 256-bit bus, now handicapped with a 192-bit bus.

Well done, Nvidia. You just made a shit GPU that was already surpassed by Pascal back in 2016. Well done making yet another 1080 that is effectively worse. I'll order none.

This is good news for 2060 owners, as you're gonna get driver support for longer now :D

I love your optimism, I really do. Lol
 
That still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.
They would have had more of that VRAM available than the newer, higher-performance stuff, simple as that.

They make up their speed tiers, and then they need new launches every year to give investors/system builders some new name or logo to throw around.
Then with the parts shortage they've had to get even weirder, re-using older designs since more parts were available. They don't want to sell 30-series cards for less yet, so why not sell a 20-series card at a profit and keep the 30s as the 'premium' parts everyone wants?

They kinda did this before with the 1650 and 1660, so it's not a super new concept
 
Worse than the 2060S.
Sadge, man. Why couldn't they have just revived the 2060S instead of making up this abomination with VRAM no one will ever make use of?

Oof.
The decision is deliberate for sure, because this is meant to create another product tier below the RTX 3060. If you look at existing RTX 3060 performance, it is generally close to an RTX 2070, which happens to be about the same as the RTX 2060S. If they were to revive the 2060S, it would sit too close for comfort to the lackluster RTX 3060, and there would likely be too big a performance gap between the RTX 2060 and the 3050 series.
 
I bit the bullet and bought a used 2060 Super for $600 back in August when the scalper prices reached their low ebb. One thing I have noticed was in Deathloop trying to use DLSS in combination with my 4K display resulted in the VRAM being overloaded to the point the game kept complaining unless I set the textures to "Low", which looked awful. Assuming DLSS continues to be a feature to help lower-end cards punch above their weight and uprez to higher resolutions, the bump from 8 to 12GB could help "future-proof" them a little bit against running into hard video memory limits.
 
1. To the surprise of no one, it's been on the rumor mill for a while now.

2. It's still rubbish though! And why use fab capacity to build something old that nobody really wants? Are they using an old node that nobody is using?

That still doesn't explain the purpose of putting more VRAM on a shit card. Like I said before, if you don't have the extra CUDA cores to go with it, all that extra VRAM probably won't make much of a difference in regards to performance.

Well yay, more core power and RT on top of what was already in perfect balance with a 256-bit bus, now handicapped with a 192-bit bus.

Well done, Nvidia. You just made a shit GPU that was already surpassed by Pascal back in 2016. Well done making yet another 1080 that is effectively worse. I'll order none.

I don't understand this recurring line of thinking. Is this an instance of old = bad or something? Lots of folks, myself included, would love to have a card that performs on the level of a 2060, or a 2060 straight up. Even in its original form, performance is only about 20% behind the 3060, which nobody's calling out as a "shit card." Maybe they are, I dunno. "No bad cards, only bad prices" still holds, IMO. Granted, there are nothing but bad prices right now, and for the foreseeable future. But look at it this way: The more cards that can get shoved on the market, the further secondhand prices (and presumably retail ASPs) will fall come the next crypto crash. 'Sides, nobody's forcing anyone to buy the thing.

 
I don't understand this recurring line of thinking. Is this an instance of old = bad or something? Lots of folks, myself included, would love to have a card that performs on the level of a 2060, or a 2060 straight up. Even in its original form, performance is only about 20% behind the 3060, which nobody's calling out as a "shit card." Maybe they are, I dunno. "No bad cards, only bad prices" still holds, IMO. Granted, there are nothing but bad prices right now, and for the foreseeable future. But look at it this way: The more cards that can get shoved on the market, the further secondhand prices (and presumably retail ASPs) will fall come the next crypto crash. 'Sides, nobody's forcing anyone to buy the thing.


The 2060 is an RT-capable GPU without the oomph to drive it properly, with 6GB instead of 8GB, and yet it's being sold at an MSRP close to that of a 2016 GPU that was better in every way.

I'm not sure what's there to cheer about. Sure, if you're happy buying yesterday's technology at an inflated price with a small handicap... jump on them. But they're going to be surpassed by much better stuff at similar price points sooner rather than later; some of it is already here today. The only issue is supply, and buying into a shit product is not a way to fix that. All it gets you is something that can game reasonably for a year and a half, maybe two, before it becomes utterly obsolete. At the end of the day you've still overpaid for something, so what's the progress here exactly?
 
The 2060 is an RT-capable GPU without the oomph to drive it properly, with 6GB instead of 8GB, and yet it's being sold at an MSRP close to that of a 2016 GPU that was better in every way.

I'm not sure what's there to cheer about.
It's an RTX 2060 in name only.
The original RTX 2060 was a TU106 die, while this one will be a TU104, probably around RTX 2060 SUPER performance.
By the way, do you know its price? As far as I know, it hasn't been announced yet...
 
It's an RTX 2060 in name only.
The original RTX 2060 was a TU106 die, while this one will be a TU104, probably around RTX 2060 SUPER performance.
By the way, do you know its price? As far as I know, it hasn't been announced yet...

It'll have 10% more shaders at best, so what do we expect here? Miracles? It's also got a tighter bus, so it won't matter in the end.
 