Thursday, September 6th 2018

NVIDIA TU106 Chip Support Added to HWiNFO, Could Power GeForce RTX 2060

We are all still waiting to see how NVIDIA's RTX 2000 series of GPUs fares in independent reviews, but that has not stopped the rumor mill from extrapolating. There have already been alleged leaks of the RTX 2080 Ti's performance, and now we see HWiNFO add support for an unannounced NVIDIA Turing microarchitecture chip, the TU106. As a reminder, the currently announced members of the RTX series are based on the TU102 (RTX 2080 Ti) and TU104 (RTX 2080, RTX 2070) chips. Going by NVIDIA's history, it is logical to expect a smaller die for upcoming RTX cards, and we may well see an RTX 2060 using the TU106 chip.

This addition to HWiNFO should be taken with a grain of salt, however, as the tool has been wrong before: only recently, it added support for what was then speculated to be an NVIDIA Volta microarchitecture chip, which we now know as Turing. That has not stopped others from speculating further, as we see 3DCenter.org give its best estimates on how the TU106 may fare in terms of die size, shader and TMU count, and more. Given that TSMC's 7 nm node will likely be preoccupied with Apple iPhone production through the end of this year, NVIDIA may well use the same 12 nm FinFET process on which the TU102 and TU104 are being manufactured. The mainstream GPU segment is NVIDIA's bread and butter for gross revenue, so we may see an announcement, and possibly even retail availability, towards the end of Q4 2018 to target holiday shoppers.
Source: HWiNFO Changelog

56 Comments on NVIDIA TU106 Chip Support Added to HWiNFO, Could Power GeForce RTX 2060

#51
efikkan
Vayra86It's even more saddening to see others yell 'pre order nao!' before performance is known. That just needs culling and this is what we're doing. It's all about peer pressure - that is why Nvidia tries to get 'influencers' on the net to join their dark side, using free hardware and bags of money.
Tsukiyomi91 did no such thing, and if you read his/her posts you'll see it was criticism of anyone prejudging the performance.
Vayra86an architecture such as Pascal literally just sold itself because it was a leap forward. Turing really is not and this is clear as day.
Pascal had the advantage of a node shrink, and it enabled much more aggressive boosting. Turing is a refinement of Volta, which is the first major architecture since Maxwell; that is clear as day.
Vayra86Realistically, looking at the numbers, historically we have always been very accurate at making assumptions on performance. Simply because the low-hanging fruit in CUDA really is gone now. We won't see tremendous IPC jumps, and if we do, they will cost something else that also provides performance (such as clock speeds). It's really not rocket science, and the fact is, if you can't predict Turing performance with some accuracy, you just don't know enough about GPUs.
And those estimates have usually been off when comparing across architectures. Most estimates for Kepler were completely off; that was back when Nvidia moved from their old "hot clock" design to a new, redesigned SM. Maxwell was not in line with predictions either.
Turing has the largest SM change since Kepler. Throughput is theoretically doubled; expect >50% gains in compute workloads and probably somewhat less in gaming.
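To keep the percentages above straight, here is a minimal sketch in Python; the 2.0x and ~1.5x factors are the claimed figures, while the gaming factor is purely an assumed number for illustration:

```python
# Illustrative arithmetic only: converting speedup factors to percent gains.
# "Double" is a 2.0x factor, i.e. a +100% gain; a ">50%" gain is roughly 1.5x.
# The gaming factor below is an assumption for illustration, not a benchmark.

def pct_gain(speedup: float) -> float:
    """Convert a speedup factor (e.g. 2.0 for 'double') into a percent gain."""
    return (speedup - 1.0) * 100.0

for label, speedup in [("theoretical throughput (2.0x)", 2.0),
                       ("compute workloads (~1.5x)", 1.5),
                       ("gaming, assumed (~1.3x)", 1.3)]:
    print(f"{label}: +{pct_gain(speedup):.0f}%")
```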
Posted on Reply
#52
TheoneandonlyMrK
efikkanTsukiyomi91 did no such thing, and if you read his/her posts you'll see it was criticism of anyone prejudging the performance.


Pascal had the advantage of a node shrink, and it enabled much more aggressive boosting. Turing is a refinement of Volta, which is the first major architecture since Maxwell; that is clear as day.


And those estimates have usually been off when comparing across architectures. Most estimates for Kepler were completely off; that was back when Nvidia moved from their old "hot clock" design to a new, redesigned SM. Maxwell was not in line with predictions either.
Turing has the largest SM change since Kepler. Throughput is theoretically doubled; expect >50% gains in compute workloads and probably somewhat less in gaming.
Hype, heard of it? You're contradictory too: double is not 50%, and as ever that's in select workloads. Using the same methods and its doubled compute, my Vega's a winner, but it is not in reality while gaming, though it's not as bad as some say either.

Hype sells.
And picking out edge cases such as you have there doesn't help; when hot-clock shaders came in, they also brought hot chips, which you neglected to mention, and their own drama too.
People don't guess too wide of the mark here, imho.
Posted on Reply
#53
cucker tarlson
DLSS performance could potentially be very important to me since I use a lot of AA even at 1440p 24", but in order to say that 2x figure applies to what I experience, it's gonna have to be supported in more than a bunch of games. Still, if the 20 series has an advantage in DLSS, Vulkan/DX12, and HDR-mode performance, on top of let's say 40% over last gen, its overall performance increase compared to Pascal will look pretty good across the board, even without adding RT into the equation.
Posted on Reply
#54
nemesis.ie
Even if all of that positive stuff happens, is it still worth the massive price increase, though?
Posted on Reply
#55
londiste
Prince ValiantNo one is looking at paper specs when referring to RT performance; they're basing it on how it actually performed. I'd say it's a lot worse to be extolling Turing's virtues than it is to maintain healthy skepticism when performance is unknown.
That "actually performed" has a huge asterisk next to it. This opinion is based on early versions of Shadow of the Tomb Raider, Metro Exodus, and Battlefield V, in that order, and assumes that developers (who had the actual cards for around two weeks before the event) will do no further optimization. Conveniently, the things that do work get ignored. Pica Pica comes to mind, as do a number of professional pieces of software that do not necessarily do real-time ray tracing but offer considerable speedups in normal rendering, Autodesk Arnold for example. Because the Quadro release was a bit earlier and software companies were more involved, they had more time to implement the RT bits and pieces.

Anyhow, slides will be out on Friday and reviews next Monday. We will see how things fare.
Games and RT-wise, we will probably have to wait a couple of months until the first games with DXR support are out.
nemesis.ieEven if all of that positive stuff happens, is it still worth the massive price increase, though?
Nope. It is a balancing act, though. RTX cards are priced into the enthusiast range and beyond, and this is the target market that has traditionally been willing to pay more for state-of-the-art hardware. Whether Nvidia's gamble will pay off in this generation remains to be seen.
cucker tarlsonDLSS performance could potentially be very important to me since I use a lot of AA even at 1440p 24", but in order to say that 2x figure applies to what I experience, it's gonna have to be supported in more than a bunch of games. Still, if the 20 series has an advantage in DLSS, Vulkan/DX12, and HDR-mode performance, on top of let's say 40% over last gen, its overall performance increase compared to Pascal will look pretty good across the board, even without adding RT into the equation.
40% is optimistic. No doubt Nvidia chose the best performers to put on their slides. Generational speed increases have been around 25% for a while, and it is likely to be around that mark this time as well, at least across a large number of titles.
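To put those percentages in concrete terms, a quick sketch; the 100 fps Pascal-class baseline is purely an assumed number:

```python
# Illustrative only: what +25% (a typical generational gain) vs +40%
# (the optimistic figure) means against an assumed 100 fps baseline.
baseline_fps = 100.0  # hypothetical Pascal-class frame rate

for label, gain in [("typical generational gain (+25%)", 0.25),
                    ("optimistic claim (+40%)", 0.40)]:
    print(f"{label}: {baseline_fps * (1.0 + gain):.0f} fps")
```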

DLSS is the real unknown here. In theory, it is free-ish AA (due to being run on Tensor cores), which would be awesome considering how problematic AA is in contemporary renderers. Whether it will pan out as well as the current marketing blabber makes it out to be remains to be seen, but the potential is there.
Posted on Reply
#56
Prince Valiant
londisteThat "actually performed" has a huge asterisk next to it. This opinion is based on early versions of Shadow of the Tomb Raider, Metro Exodus, and Battlefield V, in that order, and assumes that developers (who had the actual cards for around two weeks before the event) will do no further optimization. Conveniently, the things that do work get ignored. Pica Pica comes to mind, as do a number of professional pieces of software that do not necessarily do real-time ray tracing but offer considerable speedups in normal rendering, Autodesk Arnold for example. Because the Quadro release was a bit earlier and software companies were more involved, they had more time to implement the RT bits and pieces.

Anyhow, slides will be out on Friday and reviews next Monday. We will see how things fare.
Games and RT-wise, we will probably have to wait a couple of months until the first games with DXR support are out.
It's still the only real information we have on these cards. I'd be against pre-ordering these even if we had more information, because it's daft to pre-order a GPU. We should hopefully have 3DMark's RT benchmark at or around launch.
Posted on Reply