Sunday, May 14th 2023
NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code
NVIDIA is launching 8 GB and 16 GB variants of its upcoming GeForce RTX 4060 Ti graphics card, with the 8 GB model debuting later this month and the 16 GB model slated for July, as we learned in an older article. We are now learning what else sets the two apart. Both are based on the 5 nm "AD106" silicon, with 34 out of the 36 SM physically present on the die enabled, which works out to 4,352 out of 4,608 CUDA cores. While the 8 GB model has the ASIC code "AD106-350," the 16 GB model gets the ASIC code "AD106-351."
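As a quick back-of-the-envelope check on those shader counts, the sketch below assumes the standard Ada Lovelace figure of 128 CUDA cores per SM (an assumption on our part, not something stated above):

```python
# Sanity check of the CUDA core counts quoted above.
# Assumption: 128 CUDA cores per SM (the standard Ada Lovelace figure).
CORES_PER_SM = 128

total_sms = 36    # SMs physically present on the AD106 die
enabled_sms = 34  # SMs enabled on the RTX 4060 Ti

print(enabled_sms * CORES_PER_SM)  # 4352 -> the 4,352 enabled CUDA cores
print(total_sms * CORES_PER_SM)    # 4608 -> the 4,608 cores of the full die
```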
The 16 GB model of the RTX 4060 Ti also has a slightly higher TDP, rated at 165 W, compared to 160 W for the 8 GB model. This is the TDP of the silicon, and not the TGP (typical graphics power), which takes into account power drawn by the entire board. The 16 GB model is sure to have a higher TGP on account of its higher-density memory. NVIDIA is likely to use four 32 Gbit (4 GB) GDDR6 memory chips to achieve the 16 GB (as opposed to eight 16 Gbit ones with two chips piggybacked per 32-bit path).
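For illustration, here is a rough sketch of the two memory layouts contrasted above. The chip densities and the piggyback (clamshell) arrangement come from the paragraph itself; the helper function and its names are hypothetical.

```python
# Two ways to reach 16 GB on AD106's 128-bit memory bus, per the article.
def memory_config(chips, gbit_per_chip, chips_per_32bit_path=1):
    capacity_gb = chips * gbit_per_chip / 8                 # 8 Gbit = 1 GB
    bus_width_bits = (chips // chips_per_32bit_path) * 32   # one 32-bit path per chip group
    return capacity_gb, bus_width_bits

# Four 32 Gbit (4 GB) chips, one per 32-bit path:
print(memory_config(chips=4, gbit_per_chip=32))                          # (16.0, 128)

# Eight 16 Gbit (2 GB) chips, two piggybacked (clamshell) per 32-bit path:
print(memory_config(chips=8, gbit_per_chip=16, chips_per_32bit_path=2))  # (16.0, 128)
```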
Source: VideoCardz
59 Comments on NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code
If both the 8 GB 4060 Ti and RX 7600 arrive for less than $300, they will sell and they will sell very well indeed. Might even buy one to keep as a plaything (RX 7600, that is).
The 4060 Ti last gen (3060 Ti) was $400 and the rumors were of it being higher this gen.
An 8GB 4060Ti at $420-450 isn't really attractive or affordable. Yeah, the problem is Nvidia wants a more premium price on the xx60 Ti series (last gen it was $400, and rumors have it starting close to $450 this gen), but 8GB is far from premium.
So most likely, good luck to us getting a 4060Ti from the green team under 400€.
WTF is the point of a 16GB 4060ti? As it is, the 4070ti isn't a 4K card. Can't wait to see the reviews - I'm pretty sure the engine itself will limit the performance, and the 16GB is just candy. In fact, it wouldn't surprise me if it was 100% marketing to pitch it against AMD's higher memory offerings.
They also did it with Kepler's 670 ;) Back then they had a 4GB iteration, primarily for SLI.
Today you have that core power in a single card. The 4070ti is essentially what the 660ti was in Kepler; the similarities are striking, and so is the VRAM problem - the 660ti was notorious for being VRAM starved as well, but then mostly on bandwidth (4070ti: same issue). The 670 is really what Ada's 4080 is now; the 680 is again bandwidth/VRAM constrained, as evidenced by the 770 that goes faster purely on better memory.
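For what it's worth, the bandwidth point is easy to put in numbers. The specs plugged in below (192-bit / ~6 Gbps GDDR5 for the 660 Ti, 192-bit / 21 Gbps GDDR6X for the 4070 Ti) are commonly cited figures I'm assuming, not anything from this article:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 6))   # GTX 660 Ti: ~144 GB/s
print(bandwidth_gbs(192, 21))  # RTX 4070 Ti: ~504 GB/s
```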
The 16GB will show its true colors 3-4 years from now, just like 8GB Ampere cards getting surpassed, despite a stronger core, by higher-VRAM equivalents.
Will that be 'worth the price difference today'... time will tell. However, regardless of price, it proves once again that Nvidia uses VRAM to make sure you buy your next card in a timely manner. And then you pay a premium, again, to buy a VRAM-starved card, again. You've just proven the point ;) Isn't 800 quid enough for a 20GB equivalent?
Also, 4070ti is not a 4K card? What?
The cognitive dissonance is DAMN strong, wow. So since Ada, the only conceivable card that does register as 'a 4K card' is the 4090 now? The rest is within spitting distance. I don't understand the madness, sorry...
Now, with that said, someone could make the same argument about AMD: their cards are RT-performance starved to force you to upgrade. The thing is, anyone with a 3080 or a 3090 will feel that the 7900xtx is a sidegrade in some areas and a downgrade in others, which it is when it comes to losing DLSS and the RT performance.
The real question is whether you even need to think of upgrades with a 3090; with a 3080 a 7900XTX is a straight upgrade in every possible metric.
I mean, I had a 1080ti; I upgraded to a 3090 for basically 3 games: RDR2 / Cyberpunk / Control. Everything else could be played very decently on my 1080ti.
The 7900xtx is a sidegrade / slight upgrade when it comes to RT performance sadly.
We need to stop parroting Nvidia's marketing, because it is ab-so-lu-te BS. You either have crippled perf with the full box of nonsense maxed out, or you pay north of 1500 for a 4090, to play those 2-3 titles that are fully Nvidia sponsored. Everything else runs like a dream on anything in the high end either red or green. Except when compiling shaders ;)
This is the same category of bullshit I read when I see those endless discussions on DLSS/FSR where you need a magnifying glass and frozen imagery to spot the differences, just to lend credence to a supposed major difference between camps. Except all those sheep are actually knee deep in Red/Green marketing stories and barely register it, apparently.
I trust HU will have us covered and will compare the 16GB 4060Ti with the 3070 in those VRAM-limited scenarios (as well as others). It sucks that with all the info already in the wild we still have to wait until July.
It's quite the terrifying difference, and I bet it's had a lot to do with people's sudden change of heart on 16 GB RAM + 8 GB GPU PCs in recent months.
I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40ish in games like that. FSR would sacrifice some image quality to keep that on the upper end of that range, closer to 60 really.
Why isn't he, for example, testing AMD vs Nvidia on the new PT Cyberpunk upgrade? I really wonder. Generally speaking, sure, but Cyberpunk specifically, for some freaking reason, is a horrible experience for me at 40 fps. I've played games at 30 fps and it doesn't bother me that much, but 40 in Cyberpunk feels atrocious.