Monday, September 17th 2018
NVIDIA GTX 1060 and GTX 1050 Successors in 2019; Turing Originally Intended for 10nm
NVIDIA could launch successors to its GeForce GTX 1060 series and GTX 1050 series only in 2019, according to a statement by an ASUS representative speaking with PC Watch. This could mean that the high-end RTX 2080 Ti, RTX 2080, and RTX 2070 are the only new SKUs from NVIDIA for Holiday 2018, alongside cut-rate GeForce GTX 10-series SKUs. This could be due to a combination of swelling inventories of 10-series GPUs and insufficient volumes of mid-range RTX 20-series chips, should NVIDIA even decide to extend real-time ray-tracing to mid-range graphics cards.
The way NVIDIA designed the RTX 2070 around the physically smaller TU106 chip instead of TU104 leads us to believe that NVIDIA could carve out the GTX 1060-series successor from this chip, since the RTX 2070 maxes it out and NVIDIA needs to do something with imperfect chips. An even smaller chip (perhaps half a TU104?) could power the GTX 1050-series successor.

The PC Watch interview also states that NVIDIA's "Turing" architecture was originally designed for Samsung's 10-nanometer silicon fabrication process, but faced delays and was redesigned for the 12 nm process. This partially explains why NVIDIA hasn't kept up with the generational power-draw reduction curve of the previous four generations. NVIDIA has left the door open for a future optical shrink of Turing to the 8 nm silicon fabrication node, an extension of Samsung's 10 nm node with smaller transistor sizes.
Sources:
PC Watch, PCOnline.com.cn, Dylan on Reddit
39 Comments on NVIDIA GTX 1060 and GTX 1050 Successors in 2019; Turing Originally Intended for 10nm
Hell, RTX ON/OFF is even a meme now. It's being ridiculed... and it's not rocket science to figure out why. It falls squarely in the same corner as VR, stereoscopic 3D and all those other gimmicks that won't last: high cost, uncanny results, low benefit and virtually zero adoption rate, which creates the eternal chicken/egg situation many new technologies die from.
Also, people seem to forget that much of the performance is known and can be calculated - you don't need Nvidia slides to tell you this. And the reality is that only in a select few use cases does Turing improve on perf/dollar AT ALL. In most cases it's complete stagnation or worse. That already cuts out most Pascal owners from a decent deal. And do you really think those who skipped Pascal are going to spend big on features they'll never need? En masse? Naaah - Pascal on discount is a far better deal for them, and has a much friendlier price tag too.
Nvidia is in a very strange position right now, and they've kinda dug their own hole.
I hear the spin cycle starting and it's shaking hard.
That world is already beautiful, and adding more immersion to it would be even better. It also helps that it isn't a competitive title, so the FPS can sit a tad on the lower-but-tolerable side compared to BFV.
Thanks for the best laugh of the day.
What specifically proves that Turing is stagnating or worse than Pascal?
Which hole is Nvidia in right now? The only valid complaint so far for Turing is pricing, everything else is grumpy AMD fans.