
NVIDIA GeForce RTX 2060 12 GB

599$ for a RTX 2060 :wtf:
Which is the market value. The 2060 6GB sells for $600-700 on eBay, used, because that is repeatedly what people will pay for one.
It may not be pleasing to hear, but $599 is an accurate price for the performance on offer here, perhaps even good value as it undercuts the current market value. That would be relevant if the $599 price were sustainable, but I suspect it will rapidly become unavailable at that price, because undercutting the market makes it a prime target for scalpers.

It's hard to get angry at Nvidia or Zotac for selling something at market value. That is how trade has worked for the past 5000 years since the earliest documented trade records appeared, and it probably worked that way beforehand too, just without people keeping records.
 
Which is the market value. The 2060 6GB sells for $600-700 on ebay, used, because that is repeatedly what people will pay for one.
qft
 
This is an offensive cash-grab product at a time when the market needs less, not more, of such products. The only reason it has 12GB of memory is to outdo the Radeon 6600 series, which is absolutely the worst example of marketing driving wasteful and unnecessary product design.

Which is the market value. The 2060 6GB sells for $600-700 on ebay, used, because that is repeatedly what people will pay for one.

People aren't angry at the market value. They're angry that the market value is completely bogus and that the people who are able to control that value are doing nothing but push it higher. Nobody likes greed.
 
People aren't angry at the market value. They're angry that the market value is completely bogus and that the people who are able to control that value are doing nothing but push it higher. Nobody likes greed.
You don't seem to understand the GPU market at all; The people who control that market are not Nvidia or Zotac. It is called a free market for a reason, nobody controls it and the market is regulated by a balance of supply and demand.

Nvidia are making GPUs at the maximum possible rate. They are fabless so they can't build more fabs to fix the problem and they can't just pick up a chip like TU106 and move it to a competing foundry.
Zotac are making graphics cards at their maximum possible rate. They need GPUs from Nvidia, memory chips from Hynix/Samsung/Micron, and various other surface-mount components that are also in short supply and more expensive than usual.

There's no conspiracy theory here; Graphics card production is at maximum capacity. In 2020, Nvidia sold as many GPUs as they did in 2019 despite production shutdowns. In 2021 they sold 37% more GPUs than in 2020. Supply has increased but demand has increased more.

I'm not going to turn this into an Econ 101 wall of text; I'm just asking you to understand that there is a ton of credible-source evidence that Nvidia are producing and selling GPUs as fast as possible. That is all they can do; they do not control the market. If you don't really understand free-market capitalism then you should probably educate yourself on that, because it's useful information that will benefit you throughout your life.

The people controlling the market are us. We decide what the demand is. The fact that "us" is suddenly a much bigger demographic than it used to be is the problem. "Us" now has a larger contingent of greedy miners buying multiple cards, scalpers, people stuck indoors during lockdown wanting to play more games, people with more disposable income because they couldn't travel or whatever choosing to spend it on GPUs.... The list goes on.
 
The people controlling the market are us. We decide what the demand is. The fact that "us" is suddenly a much bigger demographic than it used to be is the problem. "Us" now has a larger contingent of greedy miners buying multiple cards, scalpers, people stuck indoors during lockdown wanting to play more games, people with more disposable income because they couldn't travel or whatever choosing to spend it on GPUs.... The list goes on.

While true, these factors aren't mutually exclusive either. Meet human hypocrisy ;) The source of all conflict.

I see it on this forum a lot, too, and most of the time we don't even realize it. 'You can't have your cake and eat it too' is a concept people seem unwilling to grasp; rather, they blame someone else for taking the cake. The same thing applies to how we vote, too. We're never happy, and it's never our own fault.

Take crypto; it needs no further explanation. It's a technology born and carried by us. It's even the whole reason some people preach it like the next best thing. Somehow, when we call it 'just a hobby', all is well in the world. How about taking that little mirror and using it for once? The same thing applies to 'enthusiasts' always straight up buying the fattest GPU money can buy, while complaining when Nvidia introduces a price hike across the entire lineup. Hello? More examples: pre-ordering games, and then complaining about day-one patches of 10 GB and up. Wanting to cheat in online gaming, and then wondering why it's gone to shit. Heck, it even echoes in games themselves: instant-gratification 'gameplay loops' and more of that beautiful psychological manipulation, like lootboxes.

And then you inevitably come to the big one... climate. We're all literally submerged in wasteful practices, so what's the way forward? Even more of it, because 'it doesn't matter anyway'? An interesting choice :)

Do we really still control the market, or has the market come to control us? Are we slaves to capitalism?

I think we are.
 
You don't seem to understand the GPU market at all; The people who control that market are not Nvidia or Zotac. It is called a free market for a reason, nobody controls it and the market is regulated by a balance of supply and demand.

I'm sure it hasn't slipped past your notice, oh economics professor, that both NVIDIA and AMD have posted record profits over the past couple of years.

I'm sure you also haven't failed to notice that graphics cards are the only PC component that has increased in price by such a large factor.

Yes, there's a component shortage. Yes, it's going to cause price increases. But it's absolutely not significant enough to cause the magnitude of price increases we've seen in the graphics card space. The problem is scalpers sitting on hoards of GPUs.

NVIDIA could literally fix this tomorrow by choosing to sell its Founders Edition cards at a loss for a month. All the scalpers would immediately be stuck with massive stocks of GPUs that they would now have to sell at a loss, as a result said scalpers would go bust and the market would be flooded with the GPUs they've been hoarding, and prices would almost instantly drop down to something approaching sanity.

Of course, NVIDIA isn't obligated to do anything like that, and they won't because they're run by shareholders. But if I were ol' Leather Jacket, I'd happily put my company in the red for a month just for the opportunity to kick every greedy scalper in the nuts.
 
NVIDIA could literally fix this tomorrow by choosing to sell its Founders Edition cards at a loss for a month
Of course, NVIDIA isn't obligated to do anything like that, and they won't because they're run by shareholders
NVIDIA is legally obligated to bring profits to shareholders, like any other public company
 
NVIDIA could literally fix this tomorrow by choosing to sell its Founders Edition cards at a loss for a month. All the scalpers would immediately be stuck with massive stocks of GPUs that they would now have to sell at a loss, as a result said scalpers would go bust and the market would be flooded with the GPUs they've been hoarding, and prices would almost instantly drop down to something approaching sanity.
There is so much crazy and utterly incorrect in this one paragraph that I think I'm going to just quote it and leave it at that.
It's not my job to educate you just like it's not Nvidia's job to become a charity and fix everything wrong with the GPU economy.

Do we really still control the market, or has the market come to control us? Are we slaves to capitalism?
The answer to that is an unequivocal 'yes'; We really do still control the market.

The issue is that "we" is not you, me, or groups of like-minded people.
The we is "we the consumers" and that includes the overwhelming ignorance of the masses, as well as the unstoppable amorality of the opportunists.

Nvidia at least deserve some credit for two attempts to curb mining demand and reduce the influence of amoral opportunists on the market. Whether you think that was for their own benefit or for gamers is irrelevant; there were both philanthropic and selfish reasons to do so. They were, however, two clear attempts to restore the market to how it was, and both failed miserably because of human nature. I'll give them credit for trying, but it really does feel like they were pissing in the wind, as anything they did was always likely to be circumvented in some way.
 
Desperate measures, perhaps, to feed the market, but they could surely have done a less unimpressive job.

This has been my hope for this release.
That due to the older 12 nm TSMC node, NVIDIA could create huge capacity of a low-cost product and "flood" the market with an affordable (and decent-performing) product that would drastically increase NVIDIA's market share for their new RTX-based brand.
This would lead NVIDIA to a dominant market share, boosting their proprietary tech such as RTX/DLSS while making Intel's offerings unnecessary to gamers before they even launch.

This has been my hope for the RTX 2060 12GB: 1) huge quantities, thus... 2) affordable, but so far we get :oops::D neither!!
 
Thank you for the great reviews! I always refer to them first.

It might also be useful to add some tests that show relative rendering performance for heavily GPU-accelerated tasks, like DxO DeepPRIME noise reduction with AI for photo processing, not only game performance. Basically, some other real-world tasks that rely on GPU processing for speed (though slower CPU processing is possible), like photo and video rendering times.
 
The answer to that is an unequivocal 'yes'; We really do still control the market.

The issue is that "we" is not you, me, or groups of like-minded people.
The we is "we the consumers" and that includes the overwhelming ignorance of the masses, as well as the unstoppable amorality of the opportunists.

The issues you mention result in 'us' not being in control. Now that the GPU has become connected to speculation, all bets are off. 'We' is now a target market that encompasses everyone. I think we can safely agree that we don't really control the markets or the state of capitalism right now. We're just moving the goalposts ever further into debt because it's too big to fail. Nobody even remotely dares to disturb the status quo, and those who do, do it only to gain an advantage over the rest.

Nvidia at least deserve some credit for two attempts to curb mining demand and reduce the influence of amoral opportunists on the market. Whether you think that was for their own benefit or for gamers is irrelevant; There were both philanthropic and selfish reasons to do so. It was, however, two clear attempts to restore the market back to how it was
Meh, I don't give a company credit for a good and evidently, perfectly predictable PR-move. Because that's what that was. Trying to find goodwill among gamers left out in the cold. All while price bumping their chips for AIBs. Hello? The honest story would have been 'as long as there is crypto and speculation, you're screwed'. Because really, what have we got now? A bunch of LHR GPUs that are less capable in some fields but have done zip to reduce mining. The resale value of that product is lower than of any regular version of said GPU. Who's doing who a favor now, really?
 
Thank you for the great reviews! I always refer to them first.

It might also be useful to add some tests that show relative rendering performance for heavily GPU-accelerated tasks, not only game performance.
Thanks. We tried including some compute tests from time to time and the traffic was extremely underwhelming, so I'd rather focus the testing time on data that people are actually interested in.
 
Because really, what have we got now? A bunch of LHR GPUs that are less capable in some fields but have done zip to reduce mining. The resale value of that product is lower than of any regular version of said GPU. Who's doing who a favor now, really?
It's not obvious from the high price of everything, but LHR cards were selling for less than their regular, full-hashrate siblings at the time of switchover. I don't mine on Nvidia cards but the mining discord server I use is very active and has always had a buying/pricing assistance channel with realtime feedback on whether any given card is worth buying at the price asked.

As bad as the pricing is, without LHR models they could easily be 30% higher.
 
I had to double- and triple-check to make sure I didn't miss it, but how come no VRAM usage numbers were shown?
 
I had to double- and triple-check to make sure I didn't miss it, but how come no VRAM usage numbers were shown?
Now that you mention it, could have been interesting to include
 
Now that you mention it, could have been interesting to include
The two games that forced me to drop image quality due to a lack of VRAM on the 2060 were SOTR and Doom Eternal, specifically at 1440p. Both ran, but SOTR performance fell off a cliff into a stuttery mess whenever I turned to face a new direction in the game world, while Doom Eternal just refused to load Ultra settings, citing a shortage of VRAM, and this was in the pre-raytracing patch too. I can only assume that the addition of raytracing effects requires further VRAM?
 
I can only assume that the addition of Raytracing effects requires further VRAM?
Correct, and this is visible in the FPS numbers with RT on vs. off: with RT on, performance falls off a cliff, while with RT off it's fine.
 
I feel the re-release of the RTX 2060 with a slight bump in specs really makes the current RTX 3060 look bad, despite its much higher CUDA core count and more advanced node. While power consumption is generally very close, with the RTX 3060 offering better performance, the step up isn't fantastic, at least in my opinion. Ultimately, it is the pricing of this card that will do it in. Where I live, the RTX 2060 refresh costs a little more than an RX 6600 XT, and the latter tends to edge it out in most games. One can argue that the RTX 2060 benefits from DLSS, but I feel the RX 6600 XT may have a slightly longer runway, and a decent enough FSR to fall back on.
 
I feel the re-release of the RTX 2060 with a slight bump in specs really made the current RTX 3060 look very bad despite having a much higher CUDA count, and more advanced node.
You can't compare a Turing CUDA core count to an Ampere CUDA core count.

A Turing "CUDA core" was made up of an integer block and an FP32 block. Technically, in a perfectly-designed and perfectly-scheduled workload, Turing could execute INT and FP32 simultaneously on the same "core".

Ampere's integer blocks can also run FP32, which means the two blocks that were counted as one "INT or FP32" CUDA core are now each counted as a CUDA core. It's why Nvidia seemed to double the core count in one generation, when in reality all they did was extend their INT blocks slightly to allow them to run FP32 instead (not at the same time). The downside is that only half of Ampere's CUDA cores can run integer math at all.

So, in a completely hypothetical FP32-only situation, the 3060 has 3584 FP32 blocks compared to the 2060S's 2176 FP32 blocks. Clock for clock, the 3060 is 65% better.
And, in a completely hypothetical INT-only situation, the 3060 has 1792 INT blocks compared to the 2060S's 2176 INT blocks. Clock for clock, the 3060 is about 18% worse.

In reality, workloads are a mix of FP32 and INT, so the 3060 falls somewhere between 18% worse and 65% better. A lot of gaming workload is FP32, but you have to offset that 65% advantage because the 3060 has 25% fewer texture units and ROPs than the 2060S. Remember that Ampere's core counts were doubled just by tweaking existing INT blocks. If the 3060 were counted by INT & FP32 blocks like Turing, it would be a 1792-core part, with the reduced counts of TMUs, ROPs, and L2 cache being the obvious side effect of having fewer SMs.

TL;DR
Looking at matchups in reviews, it's fairly accurate to say that in the current suite of games people are testing, an Ampere CUDA core is only worth about two-thirds the performance of a similarly-clocked Turing CUDA core. If you want to compare on paper specs, a 3000-core Ampere card will roughly match a 2000-core Turing card. Simples!
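The back-of-envelope math above can be sketched in a few lines. This is just my own illustration using the block counts quoted in the post; the linear FP32/INT blend is a naive assumption for intuition, not how a real GPU scheduler behaves:

```python
# Per-clock shader throughput comparison using the block counts quoted above
# (RTX 3060 vs RTX 2060 SUPER). Ignores clocks, TMUs/ROPs, caches and
# scheduling, so treat it as an upper/lower bound, not a performance model.

RTX_3060 = {"fp32_blocks": 3584, "int_blocks": 1792}
RTX_2060S = {"fp32_blocks": 2176, "int_blocks": 2176}

def relative(a, b, key):
    """Per-clock throughput of card `a` relative to card `b` for one unit type."""
    return a[key] / b[key]

fp32_ratio = relative(RTX_3060, RTX_2060S, "fp32_blocks")  # ~1.65, i.e. 65% better
int_ratio = relative(RTX_3060, RTX_2060S, "int_blocks")    # ~0.82, i.e. ~18% worse

def blended(fp32_share):
    """Naive linear blend for a workload that is `fp32_share` FP32, rest INT."""
    return fp32_share * fp32_ratio + (1 - fp32_share) * int_ratio

print(f"FP32-only: {fp32_ratio:.2f}x, INT-only: {int_ratio:.2f}x")
print(f"70% FP32 / 30% INT blend: {blended(0.7):.2f}x")
```

A 70/30 FP32-heavy mix lands at about 1.4x per clock, which is roughly consistent with the "an Ampere core is worth about two-thirds of a Turing core" rule of thumb once the 3584-vs-2176 paper counts are taken into account.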
 
Last edited:
I would rather buy the 2060 $100 cheaper with 6 GB; same for the 6700 XT, if they had an 8 GB version.
 
Do you know that Nvidia made this RTX 2060 12GB only for mining? If you're buying this for gaming, you're heading in the wrong direction. I've seen a scalper on a mining group who has already set up a mining rig with over 100 new RTX 2060 cards.
It is not only for mining. Miners want the cards that deliver the highest hash rates, so they can accumulate coins faster, and this isn't one of those cards.

This card is in the same performance class as the GTX 1080, with the advantages that it has 12 GB vs 8 GB and GDDR6 vs GDDR5X; even though the memory bus is narrower, the faster GDDR6 makes up for the difference. The maximum power draw may be similar, but under an identical workload this card will draw less than a GTX 1080, due to greater efficiencies in the core; 12 nm vs 16 nm matters. The card is shorter than most GTX 1080 cards, so it will fit in more cases, and it runs cooler. It won't take the sizable hit while raytracing that the GTX cards take, and it can use DLSS in games that support it, which the GTX cards can't.

If you were considering a used GTX 1080 in this market, or any of the GTX 16x0 cards, this is a much better option than any of those.
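The narrower-bus point checks out on paper. A quick sketch using the commonly published reference specs (the GTX 1080 uses 10 Gbps GDDR5X on a 256-bit bus; the 2060 12GB uses 14 Gbps GDDR6 on a 192-bit bus):

```python
# Peak memory bandwidth = bus width (in bytes) * per-pin data rate (Gbps).
# Specs below are the widely published reference figures for each card.

def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * gbps_per_pin

gtx_1080 = bandwidth_gb_s(256, 10)       # GDDR5X: 320 GB/s
rtx_2060_12gb = bandwidth_gb_s(192, 14)  # GDDR6:  336 GB/s

print(f"GTX 1080:      {gtx_1080:.0f} GB/s")
print(f"RTX 2060 12GB: {rtx_2060_12gb:.0f} GB/s")
```

So the faster GDDR6 more than compensates for the 192-bit bus, giving the 2060 12GB a slight edge in raw bandwidth over the 1080.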
 