Monday, September 13th 2021

NVIDIA Reportedly Readies RTX 2060 12 GB SKUs for Early 2022 Launch

Videocardz, citing its own industry sources, claims that NVIDIA is readying a resurrection of sorts for the popular RTX 2060 graphics card. One of the hallmarks of the raytracing era, the Turing-based RTX 2060 routinely ranks as the second most popular graphics card in Steam's hardware survey. Considering the still-ongoing semiconductor shortages and the demand that continues to stretch logistics and supply lines thin, NVIDIA would thus be looking at a slight specs bump (doubling the GDDR6 memory to 12 GB) as a marketing point for the revised RTX 2060. This would also help the company deliver mainstream-performance graphics cards in high enough volume while it keeps reaping the benefits of the current Ampere line-up's higher ASP (Average Selling Price) across the board.

Videocardz' sources claim the revised RTX 2060 will make use of the PG116 board, recycled from the original GTX 1660 Ti design it was born into. Apparently, NVIDIA has already warned board partners that the final design and specifications might be ready by year's end, with a potential re-release for January 2021. While the usefulness of a 12 GB memory footprint on an RTX 2060 graphics card is debatable, NVIDIA has to have some marketing flair to attach to such a release. Remember that the RTX 2060 was already given a second lease on life earlier this year as a stopgap solution toward getting more gaming-capable graphics cards on the market; NVIDIA had allegedly moved its RTX 2060 manufacturing allocation back to Ampere, but now it seems we'll witness a doubling-down on the RTX 2060. Now we just have to wait for secondary-market pricing to come down from its current $500 average... for a $349-MSRP, 2019 graphics card.
Source: Videocardz

65 Comments on NVIDIA Reportedly Readies RTX 2060 12 GB SKUs for Early 2022 Launch

#2
Raevenlord
News Editor
ZoneDymoPC gaming is dying
It is becoming less democratic, for sure.

However, only when it comes to hardware. PC gamers can always opt for streaming as well. It's not the same - of course - but it is something.
#3
Vya Domus
Upcoming mining replacement for the 1060, since 6 GB will eventually be insufficient.
#4
P4-630
Vya DomusUpcoming mining replacement to the 1060 since 6GB will eventually be insufficient.
IMO Miners can have these...
#5
Chomiq
Waste of silicone.

Edit.
T!ts.
#6
Sybaris_Caesar
Jeez. I hope my card dies in the near future so I can get this card as a warranty replacement. Although it sure as hell won't be as fast as my current card, I reckon. I heard later 2060s were cut from TU104 chips and are slower in mining compared to native TU106 for some reason. Nvidia's LHR won't help either, imo.
#7
LTUGamer
Mining madness has stopped the evolution of GPUs. New GPUs used to be introduced every single year; now they're introduced once every two years. That is because everyone buys outdated stuff anyway.
#8
DonKnotts
RaevenlordPC gamers can always opt for streaming as well. It's not the same - of course - but it is something.
I can't. Neither can millions of others. Low-latency, high-speed internet service is still a pipe dream in large swathes of the US. I'm not even in a rural area, and I can't get ANY wired high-speed internet service at all. Extremely unreliable and slow 4G, and equally slow and outrageously overpriced fixed wireless, is all I have to choose from. They can barely handle a 720p youtube video without buffering; I certainly won't be able to use a game streaming service.
#9
Space Lynx
Astronaut
ZoneDymoPC gaming is dying
literally and metaphorically. really a shame... :/
#10
Unregistered
The first thought for every new GPU now seems to be: how good is it for mining? No wonder gaming is seen as dying. Miners need to fucking die
#12
rtwjunkie
PC Gaming Enthusiast
LTUGamerMining madness stop evolution of GPUs. New gpus were introduced every single year, now it is introduced once per two years. That is because everyone buys outdated stuff anyways
It was always an illusion meant to convince you that you needed a new card every year. Even now with everything stalled and stagnant you will find many people even on this enthusiast site who are getting great graphics and are very satisfied with their “older” hardware. Games never advanced as fast as the GPU’s did.
#13
silentbogo
Saw this rumor a few days ago, and I strongly suspect that these are actually their infamous Turing-based CMP mining cards. The "cheapified" 1660 PCB is a dead giveaway.
Probably the CMP 30HX, which made no sense given that good-ole Pascal with proper maintenance gives the same result at a fraction of the cost.
Can't sell it to miners? Sell it to gamers. Just slap 12 GB of VRAM on it to make it look more modern and appealing to the non-tech-savvy and call it a day (even though it makes no sense from a performance standpoint).
#14
Selaya
Xex360Why the 2060 and not the 2070s or something better.
because nv knows you'll buy it anyway, even if it's some anemic shit like the 2060 KEKW
#15
defaultluser
Xex360Why the 2060 and not the 2070s or something better.
Because it's the only Turing RTX card that they can afford to sell at around $200 (and it should be faster than Intel's 1060 performance level).

And right now Intel is threatening with RTX (it might be slower at it than the 1660 Ti with the same effects turned on, but the PR is strong with Intel).

Of course, unless Ethereum goes proof-of-stake before this launches, this will all be pointless.
#16
Richards
This is good because the majority of the market buys this range of card
#17
silentbogo
rtwjunkieIt was always an illusion meant to convince you that you needed a new card every year. Even now with everything stalled and stagnant you will find many people even on this enthusiast site who are getting great graphics and are very satisfied with their “older” hardware. Games never advanced as fast as the GPU’s did.
I only partially agree. For existing PC users, definitely. I've been on a 3-4 year cycle with my hardware, and recently even downgraded to a 1070 Ti because it fits my needs and requirements (on a 4K display, mind you). But new users, not so much. One of my friends wanted to build a decent gaming rig for his son's birthday back in January, and he still can't believe that a decent GPU currently costs more than the two servers I built him a while ago (combined!!!), and a whole gaming rig is in the ballpark of a used car. The only "affordable" options in my area are a newly-introduced Intel DG1 4G that no one can use (next to impossible to find a compatible board), or a puny RX 560 4G that two years ago was worth pennies and can barely handle any modern title.
#18
Sisyphus
More demand leads to higher prices; not much to discuss here. For a long time GPUs were pure gaming products. Now there are larger groups of prospective buyers. GPU producers will earn more money, more will be invested, and production capacity will go up with a 2-3 year delay. Everything's fine. There will be times when mining goes down, with overcapacity and price consolidation. I went from a GTX 970 to an RTX 2070 and will jump to an RTX 4070 in 2023. Upgrade every second generation, or every 4 years.
#19
neatfeatguy
Yeah! A card slower than the 3060 that has just as much VRAM and won't really be able to use it all. Sounds like a waste of RAM.

Nvidia scared that Intel will make bank off the sub-RTX 3060 gaming segment or something?
#20
thebluebumblebee
Raevenlordwith a potential re-release for January 2021
Ooo, time travel.

Actually, why will it take so long to bring it back?
#21
Vayra86
ZoneDymoPC gaming is dying
And music, and common sense, and the environment, fiat money, general normal behaviour between people, open discussions... level of intelligence...

I'm spotting a trend here

So far PC gaming survived them all. Let's see about this one :D
neatfeatguyNvidia scared that Intel will make bank off the sub-RTX 3060 gaming segment or something?
I think you're right and this could be the strongest indicator that Intel is actually going to release something along those lines, at a competitive price.

Stronger than any Xe news so far.
RichardsThis is good because majority of the market buys this range of card
This performance has been readily available for five years now, at very competitive price points. If you haven't got it yet, you're just new to gaming; but honestly, if you are, why on earth would you step into it today? I can't say this is good. Maybe if they sell it at a $150-200 MSRP?

What it really is, is two steps back. 2022 should be the 4xxx series, with a midrange card 60-odd percent faster than this.
#22
P4-630
Where Intel's GPUs compete with


#23
BSim500
rtwjunkieIt was always an illusion meant to convince you that you needed a new card every year. Even now with everything stalled and stagnant you will find many people even on this enthusiast site who are getting great graphics and are very satisfied with their “older” hardware.
I have so much in the backlog to keep the old GTX 1660 busy, I can easily wait 3-5 years for a new GPU and not notice.
rtwjunkieGames never advanced as fast as the GPU’s did.
Outside of the "excessive sparkly bits vs MOAR GPU" rat-race, in many cases recent PC games have been "advancing" in all the wrong areas:-

1990's vs 1980's innovation = Birth of 3D, entire new genres (FPS, RTS, etc) and the modding scene, simple dungeon crawlers turned into Baldur's Gate epics, hundreds of entirely new AAA franchises being created (Hitman, Tomb Raider, Far Cry, Doom, Wolfenstein, Civ, etc), going from "PC Speaker beeps" to orchestral soundtracks and Aureal 3D positioning, 16-colour CGA to 16.7m colour SVGA+, etc.

2010's vs 2000's innovation = How to shoehorn trashy mobile "Freemium" monetization mechanics into full priced PC games and get away with it, "surprise mechanics", explosion in cheating for online multi-player, how to pull off a Call of Duty with almost every mainstream AAA franchise and keep recycling the same 15-30 year old IP over & over because 'new ideas are risky', etc...

Honestly, part of me is glad PC gaming hardware is being screwed up now and not 20 years ago.
#24
Vayra86
rtwjunkieIt was always an illusion meant to convince you that you needed a new card every year. Even now with everything stalled and stagnant you will find many people even on this enthusiast site who are getting great graphics and are very satisfied with their “older” hardware. Games never advanced as fast as the GPU’s did.
At the same time, the steady stream of updated cards kept developers on an improvement cycle too with regard to graphics. Games certainly didn't advance as fast, but they sure did advance. Because of the yearly cadence, projects could be planned ahead. Look at Crysis: it's a game that relied solely on further advances in PC graphics and overall GPU performance. And there were many more of those. Today, they don't exist. Additionally, it was a chicken-and-egg game being played, and in that way gaming on PC helped GPU sales; there was always something interesting and new to see, at a similar price point later down the line.

Right now, what you're getting is technologies to REDUCE graphics load, and detail along with it. And it's not even exclusive to RT effects being the trigger/requirement to do so. At the same time, graphics technology was already at a point, in 2016-17, where for almost everyone "it was enough to look good". That is why these older cards last. Nvidia then quickly, over 2-3 generations (if we include Pascal), moved the "playable" bar up to 4K60, while we came from 1080p, alongside some high refresh if lucky. Gaming certainly advanced, it just didn't in ways we can make use of. We have no DLSS. No RTX. Some don't even have MFAA or DX12 support.

It's going to be very interesting to see how the market adjusts to this, because it will adjust, and it already kind of is. A 2060 re-release is practically Nvidia saying the market looks the same as it did when they launched it. People just couldn't move "up".

What's a given, though, is that any effects that heavily increase the load on a GPU are going to be frowned upon, for now and for the next half decade. It was VERY telling that Nvidia told devs to start using RT proper; usually it's the devs asking for it, and that was NOT the case here, as you can clearly see from the initial responses at events and the subsequent adoption rate. And that push might backfire hard.
#25
TheinsanegamerN
BSim500I have so much in the backlog to keep the old GTX 1660 busy, I can easily wait 3-5 years for a new GPU and not notice.


Outside of the "excessive sparkly bits vs MOAR GPU" rat-race, in many cases recent PC games have been "advancing" in all the wrong areas:-

1990's vs 1980's innovation = Birth of 3D, entire new genres (FPS, RTS, etc) and the modding scene, simple dungeon crawlers turned into Baldur's Gate epics, hundreds of entirely new AAA franchises being created (Hitman, Tomb Raider, Far Cry, Doom, Wolfenstein, Civ, etc), going from "PC Speaker beeps" to orchestral soundtracks and Aureal 3D positioning, 16-colour CGA to 16.7m colour SVGA+, etc.

2010's vs 2000's innovation = How to shoehorn trashy mobile "Freemium" monetization mechanics into full priced PC games and get away with it, "surprise mechanics", explosion in cheating for online multi-player, how to pull off a Call of Duty with almost every mainstream AAA franchise and keep recycling the same 15-30 year old IP over & over because 'new ideas are risky', etc...

Honestly, part of me is glad PC gaming hardware is being screwed up now and not 20 years ago.
Modern AAA gaming has been trash; the entire PS4/XBONE generation was a huge disappointment to me, especially after the 2008-2012 period where we were getting one banger after another.

My worry is that even if you are interested in older games or less demanding titles, the price of a 1650 Super or 5500 XT is absolutely bonkers. If you don't have a dGPU already, you're just plain screwed until prices finally fix themselves. If my Vega 64 were to die on me, I'd have no recourse for a replacement even if I wanted to downgrade.