Thursday, September 6th 2018

NVIDIA TU106 Chip Support Added to HWiNFO, Could Power GeForce RTX 2060

We are all still waiting to see how NVIDIA's RTX 2000 series of GPUs will fare in independent reviews, but that has not stopped the rumor mill from extrapolating. There have been alleged leaks of the RTX 2080 Ti's performance, and now we see HWiNFO add support for an unannounced NVIDIA Turing microarchitecture chip, the TU106. As a reminder, the currently announced members of the RTX series are based on the TU102 (RTX 2080 Ti) and the TU104 (RTX 2080, RTX 2070). Based on NVIDIA's history, it is logical to expect a smaller die for upcoming RTX cards, and we may well see an RTX 2060 using the TU106 chip.

This addition to HWiNFO should be taken with a grain of salt, however, as such entries have been wrong before. As recently as this year, HWiNFO added support for what was, at the time, speculated to be NVIDIA's Volta microarchitecture, which we now know as Turing. That has not stopped others from speculating further, as we see 3DCenter.org give its best estimates on how the TU106 may fare in terms of die size, shader and TMU count, and more. Given that TSMC's 7 nm node will likely be preoccupied with Apple iPhone production through the end of this year, NVIDIA may well be using the same 12 nm FinFET process on which the TU102 and TU104 are manufactured. The mainstream GPU segment is NVIDIA's bread and butter for gross revenue, so it is possible we will see an announcement, and perhaps even retail availability, towards the end of Q4 2018 to target holiday shoppers.
Source: HWiNFO Changelog

56 Comments on NVIDIA TU106 Chip Support Added to HWiNFO, Could Power GeForce RTX 2060

#1
cucker tarlson
No tensor cores or RT features on the 2060; if there are, it's pure stupidity.
#2
VSG
Editor, Reviews & News
cucker tarlson: No tensor cores or RT features on the 2060; if there are, it's pure stupidity.
Would you not want to see RTX support on the most popular GPUs? If the majority of end users will not use it, why will developers support it?
#3
R0H1T
Either way RTX is a cash grab; it would also be interesting to know if this is the new normal for Nvidia, since they seem to believe they're in Apple's shoes now.
#4
Raendor
VSG: Would you not want to see RTX support on the most popular GPUs? If the majority of end users will not use it, why will developers support it?
What for, if even the 2080/Ti struggles to deliver 1080p@60 with RT on? So you could enjoy cinematic 24 fps with a 2060?
#5
cucker tarlson
VSG: Would you not want to see RTX support on the most popular GPUs? If the majority of end users will not use it, why will developers support it?
Because the 2070 is probably too weak to support it in the majority of games. I reckon the 2070 will only be able to run RTX in one or two games out of five, the best optimized only.
Raendor: What for, if even the 2080/Ti struggles to deliver 1080p@60 with RT on? So you could enjoy cinematic 24 fps with a 2060?
Well, to be honest, we don't know how optimized SotTR was at that point; 40 fps avg. tells us nothing. If that is done on top of a 120 fps game, that's a ball-wrecking hit on performance. If that was done on a 60-70 fps game, then an avg. in the mid-40s is not that bad. I'd prefer moar fps either way; if I happen to get an RTX 2070/2080 I'll probably just use it for Ansel.
#6
agent_x007
If AMD's Navi sucks, we'll probably get an RTX 2060 (because it's cheap to produce and does have RT cores).
Best example: the 8600 series all over again... too slow for anything to be useful (but a good HTPC card in retrospect).
If Navi is good enough, we may get a GTX 2060 (without RT cores), which will refresh the Pascal high-ish end stuff while offering better non-ray-traced capabilities (DLSS, for example).
That would give NV the best position market-wise, since it would mean they can address both the ray-traced segment (RTX series) and those who want the best, relatively cheap stuff without that unfinished/not-fast-enough ray tracing.
#7
T4C Fantasy
CPU & GPU DB Maintainer
VSG: Would you not want to see RTX support on the most popular GPUs? If the majority of end users will not use it, why will developers support it?
It's the RTX 2070, TU106 that is.
#8
cucker tarlson
T4C Fantasy: It's the RTX 2070, TU106 that is.
I can't really give it any credibility, since the full TU104 is 3072 shaders, and the 2070 is then a cut 104 with 2304. The 106 chips are usually half of a 104, with less memory. Same amount of VRAM and 75% of the full core: that's a cut 104 on xx70 cards.
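
For reference, the shader-count arithmetic in that argument works out as follows; a minimal sketch in Python, using the rumored (unconfirmed) figures quoted in this thread:

```python
# Rumored Turing shader counts discussed in this thread (not confirmed specs).
FULL_TU104 = 3072  # full TU104 die
RTX_2070 = 2304    # rumored RTX 2070 shader count

# 2304 / 3072 = 0.75, so the rumored 2070 looks like a 75% cut of TU104,
# whereas a traditional x06 chip sits near half of the full x04 die.
print(f"2070 vs. full TU104: {RTX_2070 / FULL_TU104:.0%}")       # -> 75%
print(f"Half of TU104 (typical 106): {FULL_TU104 // 2} shaders")  # -> 1536
```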
#9
T4C Fantasy
CPU & GPU DB Maintainer
cucker tarlson: I can't really give it any credibility, since the full TU104 is 3072 shaders, and the 2070 is then a cut 104 with 2304. The 106 chips are usually half of a 104, with less memory. Same amount of VRAM and 75% of the full core: that's a cut 104 on xx70 cards.
AFAIK the 2060 won't be Turing, and I talk to these program developers (HWiNFO and AIDA): the 2070 is TU106.
I guess they are not posting the names to avoid criticism.
#10
cucker tarlson
Weird to see a 106 at 75% of a 104. Will they only do 2944/3072? Because they cut the sh*t out of GP104 last time and they all sold like hotcakes.
#11
T4C Fantasy
CPU & GPU DB Maintainer
cucker tarlson: Weird to see a 106 at 75% of a 104. Will they only do 2944/3072? Because they cut the sh*t out of GP104 last time and they all sold like hotcakes.
My guess is the 2070 Ti will be 104, and there will be a 2080+ or a 2080 with the full 3072 cores in the distant future.

And if there is a TU106 2060, it will be 128-bit at 14 Gbps, which is a little faster than 9 Gbps at 192-bit (GDDR6 vs. GDDR5, of course).
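
For reference, that bandwidth comparison can be sanity-checked with the usual formula, bus width times effective data rate divided by 8; a minimal sketch using the rumored figures above, not confirmed specs:

```python
# Peak memory bandwidth = bus width (bits) * effective data rate (Gbps) / 8.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

gddr6 = bandwidth_gb_s(128, 14)  # rumored 128-bit GDDR6 @ 14 Gbps -> 224.0 GB/s
gddr5 = bandwidth_gb_s(192, 9)   # 192-bit GDDR5 @ 9 Gbps -> 216.0 GB/s
print(f"128-bit @ 14 Gbps: {gddr6:.0f} GB/s")
print(f"192-bit @ 9 Gbps:  {gddr5:.0f} GB/s")  # GDDR6 comes out ~3.7% ahead
```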
#12
cucker tarlson
T4C Fantasy: My guess is the 2070 Ti will be 104, and there will be a 2080+ or a 2080 with the full 3072 cores in the distant future.

And if there is a TU106 2060, it will be 128-bit at 14 Gbps, which is a little faster than 9 Gbps at 192-bit (GDDR6 vs. GDDR5, of course).
There's no space for a 2070 Ti if the 2070 is still 75% of the full 104 with the same 8 GB GDDR6 at 256-bit while the 2080 is cut; there's only gonna be 10-15% between them. The 1070 Ti only existed because there was +20% between it and the 1080, because they used G5X and non-X G5 on the same chip, and because there was the V56.
#13
ikeke
RTX 2080 Ti: TU102
RTX 2080: TU104
RTX 2070: TU106 (no NVLink)
#14
atomicus
RTX 2060... pfft. It's hardly going to be worthy of the RTX moniker. Nvidia are in real trouble here.

This is akin to Ferrari putting their name to a 1-litre 75bhp "supercar". Doubtful anyone is going to trust Nvidia after this... even the fanboys are going to have a hard time defending them. But then this is what happens when you don't have any competition in the marketplace... it was inevitable.
#15
john_
$299-$349. And the prices keep going up.
#16
Venger
Raendor: What for, if even the 2080/Ti struggles to deliver 1080p@60 with RT on? So you could enjoy cinematic 24 fps with a 2060?
So much THIS^. I game on a 1440p 165 Hz G-Sync monitor; 1080p@60 with RT on is pointless for me...
#17
TheoneandonlyMrK
Seems crazy to me, this: no RTX and no NVLink for close to 1070 money.

However, it could show just how potent Nvidia's shaders and cores are when unhindered by the requirements of tensor and RT cores; i.e., it could possibly be the first to ship above 2 GHz. Maybe that's its thing.
#18
Darksword
$349.99 Founders Edition price means about $399.99 AIB prices.... for a freaking xx60 card.

:banghead:
#19
sweet
Darksword: $349.99 Founders Edition price means about $399.99 AIB prices.... for a freaking xx60 card.

:banghead:
Nah. FE price is not the base price anymore.
#20
Prince Valiant
sweet: Nah. FE price is not the base price anymore.
All the board partners are pricing their cards higher than FE prices from what I've seen.

Having RTX features on the x60 card seems pointless, unless the Tomb Raider RT demo was just extremely lacking in optimization.
#21
TheGuruStud
Consumers “2070 is even more useless than 2080”

Nvidia “Hold my beer, here’s the 2060”
#22
crazyeyesreaper
Not a Moderator
If there is an RTX 2060, it's not for us; it is likely an OEM card, so the big players can say their systems have ray tracing support, etc. Over-promise on features, under-deliver on actual performance.
#23
Nkd
VSG: Would you not want to see RTX support on the most popular GPUs? If the majority of end users will not use it, why will developers support it?
Do you really expect it to run anything with RTX on? The RTX 2080 Ti is struggling to run 1080p at 60+ frames. They are trying to lower the resolution of the ray tracing; I mean, they are finding workarounds to make it faster on a $1,200 card. I highly doubt the 2060 will be able to give you anything playable with RTX on. If you keep lowering the quality of RTX, at some point just turn it off and play the game lol.
#24
TheGuruStud
Nkd: Do you really expect it to run anything with RTX on? The RTX 2080 Ti is struggling to run 1080p at 60+ frames. They are trying to lower the resolution of the ray tracing; I mean, they are finding workarounds to make it faster on a $1,200 card. I highly doubt the 2060 will be able to give you anything playable with RTX on. If you keep lowering the quality of RTX, at some point just turn it off and play the game lol.
It's to the point now that other methods are probably far superior. And the games are going to need 12 threads to run properly, b/c RT has to be processed on the CPU. Surprise, the vid card is worthless for what it was supposedly made for.

I told you, they're scam cores, and it's all for marketing. Real-time reflections aren't new, but robbing this much perf is lol.

It's time for AMD to have the devs add Vega support and release an RT driver lol. I'm practically convinced that AMD perf can't be worse.
#25
Nkd
TheGuruStud: It's to the point now that other methods are probably far superior. And the games are going to need 12 threads to run properly, b/c RT has to be processed on the CPU. Surprise, the vid card is worthless for what it was supposedly made for.

I told you, they're scam cores, and it's all for marketing. Real-time reflections aren't new, but robbing this much perf is lol.

It's time for AMD to have the devs add Vega support and release an RT driver lol. I'm practically convinced that AMD perf can't be worse.
LOL, honestly, when I saw the Battlefield demo I was like "oh, cool," and the Metro demo too. But then I played Strange Brigade and I was like, is this shit doing ray tracing on my GTX 1080? Hahaha. I guess I was just paying too much attention to shadows, because it was all about light and shadows. I was like, but Strange Brigade looks just like that, it's so shiny and has great lighting. Hahaha.