# NVIDIA GeForce RTX 2060 12 GB



## W1zzard (Jan 3, 2022)

NVIDIA stealth-launched the GeForce RTX 2060 12 GB last month. We bought a card at retail, so we can find out how much of a difference doubling the VRAM from 6 GB to 12 GB makes, and how much of the performance gain can be attributed to the increased GPU core count.

*Show full review*


----------



## Vecix6 (Jan 3, 2022)

At conclusion page:



> The NVIDIA GeForce RTX 3060 is 12% faster, but has only 8 GB VRAM—a tradeoff that I'd make any day.



The RTX 3060 has 12 GB; 8 GB VRAM is the RTX 3060 Ti, but that card is 40% faster at 1080p, so I think this sentence is not correct. The "too" in the next sentence isn't correct either...



> Last but not least, the aging AMD Radeon RX 5700 and RX 5700 XT are 8% and 20% faster at 1440p, offering 8 GB VRAM, too


----------



## LeonNg (Jan 3, 2022)

Do you know that NVIDIA made this RTX 2060 12GB only for mining? If you're buying this for gaming, you're going in the wrong direction. I've seen scalpers in a mining group who have already set up mining rigs with over 100 new RTX 2060 cards.


----------



## GerKNG (Jan 3, 2022)

Great! A new entry-level mining card...


----------



## Outback Bronze (Jan 3, 2022)

LeonNg said:


> RTX 2060 12GB only for mining



I wouldn't touch one for mining. If you are buying this for mining you are in the wrong direction.


----------



## Ravenmaster (Jan 4, 2022)

LeonNg said:


> Do you know that NVIDIA made this RTX 2060 12GB only for mining? If you're buying this for gaming, you're going in the wrong direction. I've seen scalpers in a mining group who have already set up mining rigs with over 100 new RTX 2060 cards.


Hopefully one of those cards will catch fire and burn their establishment to the ground.


----------



## Garrus (Jan 4, 2022)

6nm is going to crush this for gaming. Wait for tomorrow.


----------



## mechtech (Jan 4, 2022)

"At 1080 Full HD, AMD's Radeon RX 6600 is 8% faster than the RTX 2060 12 GB, but the gap shrinks to 2% at 1440p, still a solid alternative, especially if you're focusing on 1080p"

Hmmm, the RX 6600 does have an MSRP of $330; what's the MSRP for this card? Not that it matters much with all the gouging, scalping, etc. going on.

I guess the more important question is: is it available on the shelf?

Newegg.ca: nothing. A few 6 GB ones going for about $1,000 CAD, mainly shipped from China and not by Newegg. Pass on that.


----------



## Selaya (Jan 4, 2022)

A brilliant showcase that, for the vast majority of cases, memory bandwidth > memory capacity when it comes to GPUs (assuming capacity isn't starved to the extreme).
Great review, very informative.


----------



## Chrispy_ (Jan 4, 2022)

LeonNg said:


> Do you know that NVIDIA made this RTX 2060 12GB only for mining? If you're buying this for gaming, you're going in the wrong direction. I've seen scalpers in a mining group who have already set up mining rigs with over 100 new RTX 2060 cards.


It's not made for miners; the scalpers are just scalpers, and they'll scalp anything they can.

ETH mining needs as much bandwidth as possible, then low purchase costs, and then low power consumption - in that order.

Compared to a 2060 6GB, it's the same hashrate but costs more to buy and significantly more to run.
Compared to a 2060S 8GB, it's a much worse hashrate, still costs more to buy, and still costs a little more to run.

It wasn't given 12GB for mining. It was given 12GB because there's a global shortage of 8Gbit GDDR6 chips and the more profitable 3070/3080 cards get first dibs.
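The priority ordering described above (bandwidth first, then purchase cost, then power draw) boils down to a lexicographic sort key. Here is a minimal sketch; the bandwidth figures are the stock 2060/2060S memory specs, while the prices and wattages are illustrative placeholders, not real market data:

```python
# Rank cards by the ETH-mining priorities described above:
# highest memory bandwidth first, ties broken by lower purchase
# cost, then by lower power draw. Prices and wattages below are
# illustrative placeholders, not real market data.
cards = [
    {"name": "2060 6GB",  "bandwidth_gbs": 336, "price": 550, "watts": 160},
    {"name": "2060 12GB", "bandwidth_gbs": 336, "price": 599, "watts": 185},
    {"name": "2060S 8GB", "bandwidth_gbs": 448, "price": 620, "watts": 175},
]

# Lexicographic key: negate bandwidth so bigger sorts first.
ranked = sorted(cards, key=lambda c: (-c["bandwidth_gbs"], c["price"], c["watts"]))

for c in ranked:
    print(c["name"])
```

With these placeholder numbers the 2060S ranks first on bandwidth alone, and the 6 GB card beats the 12 GB card on price and power, matching the conclusion that the 12 GB model is the worst of the three for mining.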



mechtech said:


> hmmm the RX6600 does have an MSRP of $330, what's the MSRP for this card?  Not that it matters much with all the gouging, scalping, etc. going on.


There is no MSRP, intentionally.

I guess they would also struggle to list a sensible MSRP, because it would obviously need to sit between the $349 of the 2060 below it in the product stack and the $399 of the 2060S above it.

I'm guessing now, but I suspect even tier1 OEMs are paying more than that just in component costs. They were complaining about making 30-series MSRP cards at a loss even way back at launch and component prices have only become worse since then.


----------



## DrCR (Jan 4, 2022)

Desperate measures, perhaps, to feed the market, but they could surely have done a less unimpressive job.


----------



## Mussels (Jan 4, 2022)

I was looking forward to the most useless GPU launch of the year


----------



## W1zzard (Jan 4, 2022)

Vecix6 said:


> At conclusion page


Fixed, thanks


----------



## Space Lynx (Jan 4, 2022)

I hope Elon succeeds in colonizing Mars, humans need another shot... sigh... regression in so many sectors...


----------



## Meanhx (Jan 4, 2022)

Chrispy_ said:


> It wasn't given 12GB for mining. It was given 12GB because there's a global shortage of 8Gbit GDDR6 chips and the more profitable 3070/3080 cards get first dibs.



One correction here: the 3070 Ti and above don't use GDDR6 chips; they use GDDR6X.


----------



## Vayra86 (Jan 4, 2022)

Hey, look, yet another 1080! Now with extra fashionable 12GB


----------



## mb194dc (Jan 4, 2022)

2022 and Moore's Law is dead? In fact, we're pretty much going backwards.


----------



## Vayra86 (Jan 4, 2022)

mb194dc said:


> 2022 and Moore's Law is dead? In fact, we're pretty much going backwards.



Maybe they started applying Moore's Law to the pricing, not the transistor count now?

Seems to line up quite well


----------



## Chomiq (Jan 4, 2022)

lynx29 said:


> I hope Elon succeeds in colonizing Mars, humans need another shot... sigh... regression in so many sectors...


Not unless he figures out hibernation.


mb194dc said:


> 2022 and Moore's Law is dead? In fact, we're pretty much going backwards.


Technically this was released in 2021.


----------



## MrMilli (Jan 4, 2022)

The fact that this uses even more power than the 2060S says a lot about the power efficiency of the RX 6000 series.


----------



## arni-gx (Jan 4, 2022)

Well, I just hope this new card gets plenty of stock across the GPU market in my country...


----------



## Frick (Jan 4, 2022)

This card is actually a better buy than a 3070 Ti because of the 12 GB.


----------



## Chrispy_ (Jan 4, 2022)

Meanhx said:


> One correction here: the 3070 Ti and above don't use GDDR6 chips; they use GDDR6X.


On desktop, yes.
All current 30-series laptop GPUs use regular GDDR6, though, and laptops outsell desktops 3:1 or something like that.


----------



## dirtyferret (Jan 4, 2022)

_"Additional VRAM makes no noteworthy difference vs. RTX 2060 and RTX 2060 Super"_

I haven't been this shocked by a statement since the Pope announced he was Catholic! And after all those idiot fanboys (you know who you are) kept posting "more memory equals better gaming" with no regard for the GPU chip in the forum threads, too!


----------



## Chrispy_ (Jan 4, 2022)

DrCR said:


> Desperate measures, perhaps, to feed the market, but they could surely have done a less unimpressive job.


This is a zero-effort release that promised no additional performance over the original 2060.

The extra VRAM is a side effect of the supply shortage, nothing else.

The extra CUDA cores are because TSMC12 yields improved enough in 2019 to retire the 1920-core version completely and replace it with a 2176-core version. They're just using the dies they have, with the boards they have, with the only VRAM they can find that fits.



dirtyferret said:


> _"Additional VRAM makes no noteworthy difference vs. RTX 2060 and RTX 2060 Super"_
> 
> I haven't been this shocked by a statement since the Pope announced he was Catholic! And after all those idiot fanboys (you know who you are) kept posting "more memory equals better gaming" with no regard for the GPU chip in the forum threads, too!


I did actually get rid of my 2060 6GB because it didn't have enough VRAM for a couple of games.

That isn't the only reason though, I also got rid of it because it was too slow at 1440p/4K and it couldn't raytrace very well even at 1080p, and the 2060FE had shitty 1200rpm idle fan speeds that were always annoying. So adding more VRAM would have fixed only one of the 2060's problems.

I think the 3070 may be the most shortsighted card in terms of VRAM. 8GB is fine for now but the 3070 is powerful enough that it will still be a viable GPU even when games need to turn down a lot of settings to fit in 8GB.


----------



## seth1911 (Jan 5, 2022)

$599 for an RTX 2060


----------



## Chrispy_ (Jan 5, 2022)

seth1911 said:


> $599 for an RTX 2060


Which is the market value. The 2060 6GB sells for $600-700 on eBay, used, because that is repeatedly what people will pay for one.
It may not be pleasing to hear, but $599 is an accurate price for the performance on offer here, perhaps even good value, as it undercuts the current market value. That would be relevant if the $599 price were realistic, but I suspect it will rapidly become unavailable at that price, because undercutting the market makes it a prime target for scalpers.

It's hard to get angry at Nvidia or Zotac for selling something at market value. That is how trade has worked for the past 5000 years since the earliest documented trade records appeared, and it probably worked that way beforehand too, just without people keeping records.


----------



## W1zzard (Jan 5, 2022)

Chrispy_ said:


> Which is the market value. The 2060 6GB sells for $600-700 on eBay, used, because that is repeatedly what people will pay for one.
> It may not be pleasing to hear, but $599 is an accurate price for the performance on offer here, perhaps even good value, as it undercuts the current market value. That would be relevant if the $599 price were realistic, but I suspect it will rapidly become unavailable at that price, because undercutting the market makes it a prime target for scalpers.
> 
> It's hard to get angry at Nvidia or Zotac for selling something at market value. That is how trade has worked for the past 5000 years since the earliest documented trade records appeared, and it probably worked that way beforehand too, just without people keeping records.


qft


----------



## Assimilator (Jan 5, 2022)

This is an offensive cash-grab product at a time when the market needs less, not more, of such products. The only reason it has 12GB of memory is to outdo the Radeon 6600 series, which is absolutely the worst example of marketing driving wasteful and unnecessary product design.



Chrispy_ said:


> Which is the market value. The 2060 6GB sells for $600-700 on eBay, used, because that is repeatedly what people will pay for one.
> It may not be pleasing to hear, but $599 is an accurate price for the performance on offer here, perhaps even good value, as it undercuts the current market value. That would be relevant if the $599 price were realistic, but I suspect it will rapidly become unavailable at that price, because undercutting the market makes it a prime target for scalpers.
> 
> It's hard to get angry at Nvidia or Zotac for selling something at market value. That is how trade has worked for the past 5000 years since the earliest documented trade records appeared, and it probably worked that way beforehand too, just without people keeping records.



People aren't angry at the market value. They're angry that the market value is completely bogus and that the people who are able to control that value are doing nothing but push it higher. Nobody likes greed.


----------



## Chrispy_ (Jan 5, 2022)

Assimilator said:


> People aren't angry at the market value. They're angry that the market value is completely bogus and that the people who are able to control that value are doing nothing but push it higher. Nobody likes greed.


You don't seem to understand the GPU market at all. The people who control that market are not Nvidia or Zotac; it is called a free market for a reason: nobody controls it, and it is regulated by the balance of supply and demand.

Nvidia are making GPUs at the maximum possible rate. They are fabless so they can't build more fabs to fix the problem and they can't just pick up a chip like TU106 and move it to a competing foundry.
Zotac are making graphics cards at their maximum possible rate. They need GPUs from Nvidia, memory chips from Hynix/Samsung/Micron, and various other SMCs that are also in short supply and more expensive than usual.

There's no conspiracy here; graphics card production is at maximum capacity. In 2020, Nvidia sold as many GPUs as they did in 2019 despite production shutdowns. In 2021 they sold 37% more GPUs than in 2020. Supply has increased, but demand has increased more.

I'm not going to turn this into an Econ 101 wall of text; I'm just asking you to understand that there is a ton of credible evidence that Nvidia are producing and selling GPUs as fast as possible. That is all they can do; they do not control the market. If you don't really understand free-market capitalism, then you should probably educate yourself on it, because it's useful information that will benefit you throughout your life.

The people controlling the market are *us*. We decide what the demand is. The problem is that "us" is suddenly a much bigger demographic than it used to be: a larger contingent of greedy miners buying multiple cards, scalpers, people stuck indoors during lockdown wanting to play more games, people with more disposable income because they couldn't travel choosing to spend it on GPUs... The list goes on.


----------



## Vayra86 (Jan 5, 2022)

Chrispy_ said:


> You don't seem to understand the GPU market at all. The people who control that market are not Nvidia or Zotac; it is called a free market for a reason: nobody controls it, and it is regulated by the balance of supply and demand.
> 
> Nvidia are making GPUs at the maximum possible rate. They are fabless so they can't build more fabs to fix the problem and they can't just pick up a chip like TU106 and move it to a competing foundry.
> Zotac are making graphics cards at their maximum possible rate. They need GPUs from Nvidia, memory chips from Hynix/Samsung/Micron, and various other SMCs that are also in short supply and more expensive than usual.
> ...



While true, these factors aren't mutually exclusive either. Meet human hypocrisy: the source of all conflict.

I see it on this forum a lot, too, and most of the time we don't even realize it. 'You can't have your cake and eat it too' is a concept people seem unwilling to grasp; rather, they blame someone else for taking the cake. The same thing applies to how we vote. We're never happy, and it's never our own fault.

Take crypto; it needs no further explanation. It's a technology born and carried by us. It's even the whole reason some people preach it like the next best thing. Somehow, when _we_ call it 'just a hobby', all is well in the world. How about taking that little mirror out and using it for once? The same thing applies to 'enthusiasts' always buying the fattest GPU money can buy while complaining when Nvidia introduces a price hike across the entire lineup. Hello? More examples: pre-ordering games, then complaining about day-one patches of 10 GB and up. Wanting to cheat in online gaming, then wondering why it's gone to shit. Heck, it even echoes in games themselves: instant-gratification 'gameplay loops' and more of that beautiful psychological manipulation, like lootboxes.

And then you inevitably come to the big one: climate. We're all literally submerged in wasteful practices, so what's the way forward? Even more of it, because 'it doesn't matter anyway'? An interesting choice.

Do we really still control the market, or has the market come to control us? Are we slaves to capitalism?

I think we are.


----------



## Assimilator (Jan 5, 2022)

Chrispy_ said:


> You don't seem to understand the GPU market at all. The people who control that market are not Nvidia or Zotac; it is called a free market for a reason: nobody controls it, and it is regulated by the balance of supply and demand.
> 
> Nvidia are making GPUs at the maximum possible rate. They are fabless so they can't build more fabs to fix the problem and they can't just pick up a chip like TU106 and move it to a competing foundry.
> Zotac are making graphics cards at their maximum possible rate. They need GPUs from Nvidia, memory chips from Hynix/Samsung/Micron, and various other SMCs that are also in short supply and more expensive than usual.
> ...



I'm sure it hasn't slipped past your notice, oh economics professor, that both NVIDIA and AMD have posted record profits over the past couple of years.

I'm sure you also haven't failed to notice that graphics cards are the only PC component that has increased in price by such a large factor.

Yes, there's a component shortage. Yes, it's going to cause price increases. But it's absolutely not significant enough to cause the magnitude of price increases we've seen in the graphics card space. The problem is scalpers sitting on hoards of GPUs.

NVIDIA could literally fix this tomorrow by choosing to sell its Founders Edition cards at a loss for a month. All the scalpers would immediately be stuck with massive stocks of GPUs that they would now have to sell at a loss, as a result said scalpers would go bust and the market would be flooded with the GPUs they've been hoarding, and prices would almost instantly drop down to something approaching sanity.

Of course, NVIDIA isn't obligated to do anything like that, and they won't because they're run by shareholders. But if I were ol' Leather Jacket, I'd happily put my company in the red for a month just for the opportunity to kick every greedy scalper in the nuts.


----------



## W1zzard (Jan 5, 2022)

Assimilator said:


> NVIDIA could literally fix this tomorrow by choosing to sell its Founders Edition cards at a loss for a month





Assimilator said:


> Of course, NVIDIA isn't obligated to do anything like that, and they won't because they're run by shareholders


NVIDIA is legally obligated to bring profits to shareholders, like any other public company


----------



## Chrispy_ (Jan 5, 2022)

Assimilator said:


> NVIDIA could literally fix this tomorrow by choosing to sell its Founders Edition cards at a loss for a month. All the scalpers would immediately be stuck with massive stocks of GPUs that they would now have to sell at a loss, as a result said scalpers would go bust and the market would be flooded with the GPUs they've been hoarding, and prices would almost instantly drop down to something approaching sanity.


There is so much crazy and utterly incorrect in this one paragraph that I think I'm going to just quote it and leave it at that.
It's not my job to educate you just like it's not Nvidia's job to become a charity and fix everything wrong with the GPU economy.



Vayra86 said:


> Do we really still control the market, or has the market come to control us? Are we slaves to capitalism?


The answer to that is an unequivocal 'yes'; we really do still control the market.

The issue is that "we" is not you, me, or groups of like-minded people.
The "we" is "we the consumers", and that includes the overwhelming ignorance of the masses as well as the unstoppable amorality of the opportunists.

Nvidia at least deserve some credit for two attempts to curb mining demand and reduce the influence of amoral opportunists on the market. Whether you think that was for their own benefit or for gamers is irrelevant; there were both philanthropic and selfish reasons to do so. They were, however, two clear attempts to restore the market back to how it was, and both attempts failed miserably because of human nature. I'll give them credit for trying, but it really does feel like they were pissing in the wind, as anything they did was always likely to be circumvented in some way.


----------



## sith'ari (Jan 6, 2022)

DrCR said:


> Desperate measures, perhaps, *to feed the market*, but they could surely have done a less unimpressive job.



This was my hope for this release: that thanks to the older 12 nm TSMC node, NVIDIA could build huge capacity of a low-cost product and "flood" the market with an affordable (and decent-performing) card that would drastically increase NVIDIA's market share for their RTX brand.
This would give NVIDIA a dominant market share, boosting their proprietary tech such as RTX/DLSS while making Intel's offerings unnecessary to gamers before they even launch.

My hope for the RTX 2060 12GB, then, was: 1) huge quantities, and thus... 2) affordable pricing. But so far we've gotten... neither!!


----------



## VeqIR (Jan 6, 2022)

Thank you for the great reviews!  I always refer to them first.

It might also be useful to add some tests that show relative rendering performance for heavily GPU-accelerated tasks, like DxO DeepPrime noise reduction with AI for photo processing, not only game performance. Basically, some other real-world tasks that rely on GPU processing for speed (though slower CPU processing is possible), like photo and video rendering times.


----------



## Vayra86 (Jan 6, 2022)

Chrispy_ said:


> The answer to that is an unequivocal 'yes'; we really do still control the market.
> 
> The issue is that "we" is not you, me, or groups of like-minded people.
> The "we" is "we the consumers", and that includes the overwhelming ignorance of the masses as well as the unstoppable amorality of the opportunists.



The issues you mention result in 'us' not being in control. Now that GPUs have been connected to speculation, all bets are off. 'We' is now a target market that encompasses everyone. I think we can safely agree that we don't really control the markets or the state of capitalism right now. We're just moving the goalposts ever further into debt because it's too big to fail. Nobody even remotely dares to disturb the status quo, and if they do, it's only to gain an advantage over the rest.



Chrispy_ said:


> Nvidia at least deserve some credit for two attempts to curb mining demand and reduce the influence of amoral opportunists on the market. Whether you think that was for their own benefit or for gamers is irrelevant; there were both philanthropic and selfish reasons to do so. They were, however, two clear attempts to restore the market back to how it was


Meh, I don't give a company credit for a good and, evidently, perfectly predictable PR move, because that's what that was: trying to find goodwill among gamers left out in the cold, all while price-bumping their chips for AIBs. Hello? The honest story would have been "as long as there is crypto and speculation, you're screwed." Because really, what have we got now? A bunch of LHR GPUs that are less capable in some fields but have done zip to reduce mining. The resale value of that product is lower than that of any regular version of said GPU. Who's doing whom a favor now, really?


----------



## W1zzard (Jan 6, 2022)

VeqIR said:


> Thank you for the great reviews!  I always refer to them first.
> 
> It might also be useful to add some tests that show relative rendering performance for heavily-GPU accelerated tasks, like DxO DeepPrime noise reduction with AI for photo processing—not only game performance.  Basically some other real world tasks that rely on GPU processing for speed (though slower CPU processing is possible), like photo and video rendering times.


Thanks. We tried including some compute tests from time to time and the traffic was extremely underwhelming, so I'd rather focus the testing time on data that people are actually interested in.


----------



## Chrispy_ (Jan 6, 2022)

Vayra86 said:


> Because really, what have we got now? A bunch of LHR GPUs that are less capable in some fields but have done zip to reduce mining. The resale value of that product is lower than that of any regular version of said GPU. Who's doing whom a favor now, really?


It's not obvious from the high price of everything, but LHR cards were selling for less than their regular, full-hashrate siblings at the time of switchover. I don't mine on Nvidia cards but the mining discord server I use is very active and has always had a buying/pricing assistance channel with realtime feedback on whether any given card is worth buying at the price asked.

As bad as the pricing is, without LHR models they could easily be 30% higher.


----------



## Mussels (Jan 6, 2022)

I had to double- and triple-check to make sure I didn't miss it, but how come no VRAM usage numbers were shown?


----------



## W1zzard (Jan 7, 2022)

Mussels said:


> I had to double- and triple-check to make sure I didn't miss it, but how come no VRAM usage numbers were shown?


Now that you mention it, could have been interesting to include


----------



## Chrispy_ (Jan 7, 2022)

W1zzard said:


> Now that you mention it, could have been interesting to include


The two games that forced me to drop image quality due to a lack of VRAM on the 2060 were Shadow of the Tomb Raider and Doom Eternal, specifically at 1440p. Both ran, but SOTR performance fell off a cliff into a stuttery mess whenever I turned to face a new direction in the game world, while Doom Eternal simply refused to load Ultra settings, citing a shortage of VRAM, and this was before the ray-tracing patch, too. I can only assume that the addition of ray-tracing effects requires further VRAM?


----------



## W1zzard (Jan 7, 2022)

Chrispy_ said:


> I can only assume that the addition of Raytracing effects requires further VRAM?


Correct, and this is visible in the FPS numbers with RT on vs. off: with RT on, performance falls off a cliff, while with RT off it's fine.


----------



## watzupken (Jan 18, 2022)

I feel the re-release of the RTX 2060 with a slight bump in specs really makes the current RTX 3060 look very bad, despite the latter having a much higher CUDA core count and a more advanced node. While power consumption is generally very close, with the RTX 3060 offering better performance, the step up in performance isn't fantastic, at least in my opinion. Ultimately, it is the pricing of this card that will do it in. Where I live, the RTX 2060 refresh costs a little more than an RX 6600 XT, and the latter tends to edge it out in most games. One can argue that the RTX 2060 benefits from DLSS, but I feel the RX 6600 XT may have a slightly longer runway and a decent enough FSR to fall back on.


----------



## Chrispy_ (Jan 18, 2022)

watzupken said:


> I feel the re-release of the RTX 2060 with a slight bump in specs really makes the current RTX 3060 look very bad, despite the latter having a much higher CUDA core count and a more advanced node. While power consumption is generally very close, with the RTX 3060 offering better performance, the step up in performance isn't fantastic, at least in my opinion. Ultimately, it is the pricing of this card that will do it in. Where I live, the RTX 2060 refresh costs a little more than an RX 6600 XT, and the latter tends to edge it out in most games. One can argue that the RTX 2060 benefits from DLSS, but I feel the RX 6600 XT may have a slightly longer runway and a decent enough FSR to fall back on.


You can't compare a Turing CUDA core count to an Ampere CUDA core count.

A Turing "CUDA core" was made up of an integer block and an FP32 block. Technically, in a perfectly designed and perfectly scheduled workload, Turing could execute INT and FP32 simultaneously on the same "core".

Ampere's integer blocks can also run FP32, which means the two blocks previously counted as one "INT or FP32" CUDA core are now each counted as a CUDA core. It's why Nvidia seemed to double the core count in a single generation, when in reality all they did was extend their INT cores slightly to allow them to run FP32 _*instead*_ (not at the same time). The downside is that only half of Ampere's CUDA cores can run integer math _at all._

So, in a completely hypothetical FP32-only situation, the 3060 has 3584 FP32 blocks compared to the 2060S's 2176. Clock for clock, the *3060 is 65% better*.
And, in a completely hypothetical INT-only situation, the 3060 has 1792 INT blocks compared to the 2060S's 2176. Clock for clock, the *3060 is 18% worse*.

In reality, workloads are a mix of FP32 and INT, so the 3060 falls somewhere between 18% worse and 65% better. A lot of gaming workload is FP32, but you have to offset that 65% advantage because the 3060 has 25% fewer texture units and ROPs than the 2060S. Remember that Ampere's core counts were doubled just by tweaking existing INT blocks; if the 3060 were counted by INT+FP32 blocks like Turing, it would be a 1792-core part, with the reduced TMU, ROP, and L2 cache counts being the obvious side effect of the reduced SM count.

*TL;DR
Looking at matchups in reviews, it's fairly accurate to say that in the current suite of games people are testing, an Ampere CUDA core is only worth about two-thirds the performance of a similarly-clocked Turing CUDA core. If you want to compare on paper specs, a 3000-core Ampere card will roughly match a 2000-core Turing card. Simples!*
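As a quick sanity check on the arithmetic in the post above, the clock-for-clock block ratios can be computed directly. A minimal sketch using only the core counts quoted there, with the TL;DR's two-thirds rule of thumb applied at the end:

```python
# Block counts quoted in the post above.
turing_2060s = 2176          # Turing: each CUDA core = 1 INT block + 1 FP32 block
ampere_3060_fp32 = 3584      # Ampere: every CUDA core can run FP32...
ampere_3060_int = 3584 // 2  # ...but only half of them can run INT

# Clock-for-clock throughput ratios, with the 2060S as the baseline.
fp32_ratio = ampere_3060_fp32 / turing_2060s  # ~1.65x in a pure-FP32 workload
int_ratio = ampere_3060_int / turing_2060s    # ~0.82x in a pure-INT workload

print(f"FP32-only: {fp32_ratio:.2f}x")
print(f"INT-only:  {int_ratio:.2f}x")

# Rule of thumb from the TL;DR: one Ampere core ~ 2/3 of a Turing core,
# so ~3000 Ampere cores should roughly match ~2000 Turing cores.
print(f"3000 Ampere cores ~= {3000 * 2 / 3:.0f} Turing-core equivalents")
```

This is just the raw block math; real-game results also depend on the TMU/ROP/cache differences mentioned above.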


----------



## r9 (Feb 3, 2022)

I would rather buy the 2060 $100 cheaper with 6 GB; same for the 6700 XT, if they had an 8 GB version.


----------



## sliderider (May 6, 2022)

LeonNg said:


> Do you know that NVIDIA made this RTX 2060 12GB only for mining? If you're buying this for gaming, you're going in the wrong direction. I've seen scalpers in a mining group who have already set up mining rigs with over 100 new RTX 2060 cards.


It is not only for mining. Miners want the cards that deliver the highest hash rates, so they can accumulate coins faster, and this isn't one of those cards. This card is in the same performance class as the GTX 1080, with the advantages that it has 12 GB vs. 8 GB and GDDR6 vs. GDDR5X; even though the memory bus is narrower, the faster GDDR6 makes up for the difference. The maximum power draw may be similar, but under an identical workload this card will draw less than a GTX 1080, due to greater efficiencies in the core; 12 nm vs. 16 nm matters. The card is shorter than most GTX 1080 cards, so it will fit in more cases, and it runs cooler. It won't take the sizable hit while ray tracing that the GTX cards take, and it can use DLSS in games that have it enabled, which the GTX cards can't. If you were considering a used GTX 1080 in this market, or any of the GTX 16x0 cards, this is a much better option than any of those.


----------

