Monday, September 17th 2018
NVIDIA GTX 1060 and GTX 1050 Successors in 2019; Turing Originally Intended for 10nm
NVIDIA could launch successors to its GeForce GTX 1060 series and GTX 1050 series only in 2019, according to a statement by an ASUS representative speaking with PC Watch. This could mean that the high-end RTX 2080 Ti, RTX 2080, and RTX 2070 could be the only new SKUs from NVIDIA for Holiday 2018, alongside cut-rate GeForce GTX 10-series SKUs. This could be due to a combination of swelling inventories of 10-series GPUs and insufficient volumes of mid-range RTX 20-series chips, should NVIDIA even decide to extend real-time ray tracing to mid-range graphics cards.
The way NVIDIA designed the RTX 2070 around the physically smaller TU106 chip instead of TU104 leads us to believe that NVIDIA could carve out the GTX 1060-series successor from this chip, since the RTX 2070 maxes it out and NVIDIA needs to do something with imperfect chips. An even smaller chip (probably half a TU104?) could power the GTX 1050-series successor.
The PC Watch interview also states that NVIDIA's "Turing" architecture was originally designed for Samsung's 10-nanometer silicon fabrication process, but faced delays and had to be redesigned for the 12 nm process. This partially explains why NVIDIA hasn't kept up with the generational power-draw reduction curve of the previous four generations. NVIDIA has left the door open for a future optical shrink of Turing to the 8 nm silicon fabrication node, an extension of Samsung's 10 nm node with smaller transistor sizes.
Sources:
PC Watch, PCOnline.com.cn, Dylan on Reddit
39 Comments on NVIDIA GTX 1060 and GTX 1050 Successors in 2019; Turing Originally Intended for 10nm
I'm looking at this generation purely as a beta test. I will pass on it for sure; the TPU vote also shows that I'm not the only one thinking like that. Arguing that mid-tier cards won't get tensor cores? IMHO even the flagships will barely manage to deliver proper FPS with the features enabled, unless they are dumbed down.
No upgrades for this year, I guess.
All opinions (especially on price-performance) aside, the 20-series cards are faster than 10-series cards and will sell regardless of the high price.
At the same time, they (and frankly, the industry) want ray-tracing adoption in one form or another - DXR, Vulkan's RT extensions, OptiX/ProRender and other proprietary APIs.
Lack of competition in the normal rendering space allowed them to do this in the extreme way we see.
The inclusion of RT hardware was a pretty well-kept secret, or at least the extent of that hardware. Clearly developers got their hands on the cards very late. I suspect the RTX 2080 Ti delay is meant to allow at least some game developers to add RTX features (even if it is just DLSS) to games to bolster sales, not a card shortage or anything else.
The first generation of any new thing will suck. The next one will be better. And moving to 10 nm or 7 nm should give them a performance boost as well, even if everything else remains the same.
It will be a mess, just as tessellation was at the start.
How stupid do they think ppl are...
That will be the RTX reception for ya when the NDA lifts, basically nothing to even talk about.
How else do you think people will "save money" by buying more of these cards?
8 nm at ~65 Mtr/mm² provides an 18% improvement over 10 nm at 55 Mtr/mm².
Intel's 10 nm provides 100 Mtr/mm². Graphics cards using that node will debut in 2020. 14 nm graphics is unlikely, but who knows.
TSMC's 7 nm is 100 Mtr/mm² too.
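The density figures above are straightforward to sanity-check. A minimal sketch (the Mtr/mm² values are taken from the comment, not from any official datasheet):

```python
# Density figures as quoted in the comment (Mtr/mm^2 = million transistors per mm^2)
samsung_10nm = 55
samsung_8nm = 65
intel_10nm = 100
tsmc_7nm = 100

# Relative improvement of Samsung 8 nm over Samsung 10 nm
improvement = (samsung_8nm / samsung_10nm - 1) * 100
print(f"8 nm over 10 nm: {improvement:.0f}%")  # prints "8 nm over 10 nm: 18%"

# Intel 10 nm and TSMC 7 nm relative to Samsung 8 nm
print(f"Intel 10 nm over Samsung 8 nm: {(intel_10nm / samsung_8nm - 1) * 100:.0f}%")
```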
So the cards that we didn't get this time will get smaller next time. Pretty cool.
You do know that their upcoming Navi is another refinement of GCN, right? Node names are all marketing at this point; Intel's 10 nm is denser than Samsung's 10 nm.
Also, designs are usually not shrunk proportionally. Usually colder parts are shrunk more, hotter parts are hardly shrunk at all.