Sunday, May 14th 2023

NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

NVIDIA is launching 8 GB and 16 GB variants of its upcoming GeForce RTX 4060 Ti graphics card, with the 8 GB model debuting later this month and the 16 GB model slated for July, as we learned in an older article. We are now learning what else sets the two apart. Both are based on the 5 nm "AD106" silicon, with 34 out of the 36 SM physically present on the chip enabled, which works out to 4,352 out of 4,608 CUDA cores. While the 8 GB model carries the ASIC code "AD106-350," the 16 GB model gets the ASIC code "AD106-351."
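
For those keeping score, the core counts follow directly from the SM count. A minimal sketch of the arithmetic, assuming Ada's usual 128 CUDA cores per SM (a figure not stated in this article):

```python
# Sanity check of the CUDA core counts quoted above.
# Assumption (not from the article): each Ada SM carries 128 CUDA cores.
CORES_PER_SM = 128

sm_total = 36    # SMs physically present on the AD106 die
sm_enabled = 34  # SMs enabled on both RTX 4060 Ti variants

print(sm_total * CORES_PER_SM)    # 4608 CUDA cores on the full die
print(sm_enabled * CORES_PER_SM)  # 4352 CUDA cores enabled
```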

The 16 GB model of the RTX 4060 Ti also has a slightly higher TDP, rated at 165 W, compared to 160 W for the 8 GB model. This is the TDP of the silicon, not the TGP (typical graphics power), which takes into account power drawn by the entire board. The 16 GB model is sure to have a higher TGP on account of its higher-density memory. NVIDIA is likely to use four 32 Gbit (4 GB) GDDR6 memory chips to achieve 16 GB (as opposed to eight 16 Gbit ones, with two chips piggybacked per 32-bit path).
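
Either memory configuration reaches 16 GB over the RTX 4060 Ti's 128-bit memory bus. A rough sketch of the math behind the two options, treating the chip counts and densities as speculation rather than a confirmed board layout:

```python
# Two hypothetical ways to reach 16 GB on a 128-bit GDDR6 bus,
# following the article's speculation (not a confirmed board design).
BUS_WIDTH_BITS = 128
BITS_PER_CHANNEL = 32  # each GDDR6 package occupies a 32-bit channel

channels = BUS_WIDTH_BITS // BITS_PER_CHANNEL  # 4 channels

# Option A: one 32 Gbit (4 GB) chip per channel
capacity_a_gb = channels * 32 // 8

# Option B: two 16 Gbit (2 GB) chips per channel ("clamshell"/piggybacked)
capacity_b_gb = channels * 2 * 16 // 8

print(capacity_a_gb, capacity_b_gb)  # 16 GB either way
```
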
Source: VideoCardz

59 Comments on NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

#1
Unregistered
The 8GB model makes no sense to exist, except to sell customers a GPU that runs out of VRAM and make them look for another card.
#2
Dr. Dro
Engage: The 8GB model makes no sense to exist, except to sell customers a GPU that runs out of VRAM and make them look for another card.
It will sell, as will the RX 7600. 8 GB isn't entirely unworkable for 1080p gaming where these will be used, and people are desperate for an affordable graphics card.

If both the 8 GB 4060 Ti and RX 7600 arrive for less than $300, they will sell and they will sell very well indeed. Might even buy one to keep as a plaything (RX 7600, that is).
#3
tussinman
Dr. Dro: It will sell, as will the RX 7600. 8 GB isn't entirely unworkable for 1080p gaming where these will be used, and people are desperate for an affordable graphics card.
Well that's the problem. At least with the 7600 it's rumored to be a $250-280 card.

The 4060 Ti last gen (3060 Ti) was $400 and the rumors were of it being higher this gen.

An 8GB 4060 Ti at $420-450 isn't really attractive or affordable.
Engage: The 8GB model makes no sense to exist, except to sell customers a GPU that runs out of VRAM and make them look for another card.
Yeah the problem is Nvidia wants a more premium price on the xx60 Ti series (last gen it was $400 and rumors of it starting close to $450 this gen) but 8GB is far from premium.
#4
Prima.Vera
Unless those are less than $299, they will probably be DOA...
#5
eidairaman1
The Exiled Airman
tussinman: Well that's the problem. At least with the 7600 it's rumored to be a $250-280 card.

The 4060 Ti last gen (3060 Ti) was $400 and the rumors were of it being higher this gen.

An 8GB 4060 Ti at $420-450 isn't really attractive or affordable.

Yeah the problem is Nvidia wants a more premium price on the xx60 Ti series (last gen it was $400 and rumors of it starting close to $450 this gen) but 8GB is far from premium.
People that buy them at 400+ are suckers
#6
R0H1T
Well good thing there's a sucker born every minute then!
#7
N/A
I can't find any 32Gb memory chips listed by any company. If a 4352-CUDA, 128-bit card can make use of 16 GB, then aren't we supposed to expect 32 GB on an 8704-CUDA, 256-bit GPU?
#8
sepheronx
eidairaman1: People that buy them at 400+ are suckers
most of the PC community are suckers.
#9
Arkz
N/A: I can't find any 32Gb memory chips listed by any company. If a 4352-CUDA, 128-bit card can make use of 16 GB, then aren't we supposed to expect 32 GB on an 8704-CUDA, 256-bit GPU?
Yeah pretty sure the biggest package GDDR6 chips come in is 16Gb. Dunno where OP pulled 32Gb from... AFAIK the only 32Gb ones are the GDDR6W Samsung made, but I'm not even aware of any consumer card having them.
#10
ixi
tussinman: The 4060 Ti last gen (3060 Ti) was $400 and the rumors were of it being higher this gen.

An 8GB 4060 Ti at $420-450 isn't really attractive or affordable.
Yep, where I live the cheapest 3060 Ti is 410€, and that's a Zotac whose fans never stop (no 0 rpm mode)...

So most likely, good luck to us getting a 4060 Ti from the green team under 400€.
#11
The Von Matrices
Arkz: Yeah pretty sure the biggest package GDDR6 chips come in is 16Gb. Dunno where OP pulled 32Gb from... AFAIK the only 32Gb ones are the GDDR6W Samsung made, but I'm not even aware of any consumer card having them.
Yeah, I was wondering that too considering I haven't even seen a press release about 32Gb chips.
#12
the54thvoid
Intoxicated Moderator
For reasons, I bought a 4070ti. My max budget (self-imposed for FU reasons) was £800. I got a very quiet custom card. It has 12GB ram.

WTF is the point of a 16GB 4060ti? As it is, the 4070ti isn't a 4k card. Can't wait to see the reviews - I'm pretty sure the engine itself will limit the performance, and the 16GB is just candy. In fact, it'd not surprise me if it was 100% marketing to pitch it against AMD's higher memory offerings.
#13
Dr. Dro
tussinman: Well that's the problem. At least with the 7600 it's rumored to be a $250-280 card.

The 4060 Ti last gen (3060 Ti) was $400 and the rumors were of it being higher this gen.

An 8GB 4060 Ti at $420-450 isn't really attractive or affordable.

Yeah the problem is Nvidia wants a more premium price on the xx60 Ti series (last gen it was $400 and rumors of it starting close to $450 this gen) but 8GB is far from premium.
Yeah, agreed. Nvidia is probably going to pitch their rich feature set and efficiency lead to charge a premium, but I think this is the sweet spot where the 7600 would prove to be a better product even if it's not faster - it's critical that we have a capable gaming card that doesn't cost stacks of money, and that makes it the segment's winner by default for me, as long as AMD releases it at $300 or less.
#14
Vayra86
the54thvoid: For reasons, I bought a 4070ti. My max budget (self-imposed for FU reasons) was £800. I got a very quiet custom card. It has 12GB ram.

WTF is the point of a 16GB 4060ti? As it is, the 4070ti isn't a 4k card. Can't wait to see the reviews - I'm pretty sure the engine itself will limit the performance, and the 16GB is just candy. In fact, it'd not surprise me if it was 100% marketing to pitch it against AMD's higher memory offerings.
Unlikely, Nvidia did the same thing with Ampere, that gen is just as VRAM starved as Ada and your 4070ti.
They also did it with Kepler's 670 ;) Back then they had that primarily for SLI, a 4GB iteration.
Today you have that core power in a single card, the 4070ti is essentially what the 660ti was in Kepler, the similarities are striking and so is the VRAM problem - the 660ti was notorious for being VRAM starved as well, but then mostly on bandwidth (4070ti : same issue). The 670 is really what Ada's 4080 is now; the 680 is again bandwidth/VRAM constrained, as evident by the 770 that goes faster purely on better memory.



The 16GB will show its true colors 3-4 years from now, just like 8GB Ampere cards are getting surpassed, despite a stronger core, by higher-VRAM equivalents.
Will that be 'worth the price difference today'... time will tell. However regardless of price it does prove every time Nvidia uses VRAM to make sure you buy your next card in a timely manner. And then you pay a premium, again, to buy a VRAM starved card, again. You've just proven the point ;) Isn't 800 quid enough for a 20GB equivalent?

Also, 4070ti is not a 4K card? What?




The cognitive dissonance is DAMN strong, wow. So since Ada, the only conceivable card that does register as 'a 4K card' is the 4090 now? The rest is within spitting distance. I don't understand the madness, sorry...
#15
fevgatos
Vayra86: Unlikely, Nvidia did the same thing with Ampere, that gen is just as VRAM starved as Ada and your 4070ti.
They also did it with Kepler's 670 ;) Back then they had that primarily for SLI, a 4GB iteration.
Today you have that core power in a single card, the 4070ti is essentially what the 660ti was in Kepler, the similarities are striking and so is the VRAM problem - the 660ti was notorious for being VRAM starved as well, but then mostly on bandwidth (4070ti : same issue). The 670 is really what Ada's 4080 is now; the 680 is again bandwidth/VRAM constrained, as evident by the 770 that goes faster purely on better memory.



The 16GB will show its true colors 3-4 years from now, just like 8GB Ampere cards are getting surpassed, despite a stronger core, by higher-VRAM equivalents.
Will that be 'worth the price difference today'... time will tell. However regardless of price it does prove every time Nvidia uses VRAM to make sure you buy your next card in a timely manner. And then you pay a premium, again, to buy a VRAM starved card, again. You've just proven the point ;) Isn't 800 quid enough for a 20GB equivalent?
I don't think nvidia cares much about whether or not you buy their new card. They're not putting much VRAM in, to prevent pros from going for the mainstream models instead of the Quadros that sell for much more.

Now with that said, someone could make the same argument about AMD, they are RT performance starved to force you to upgrade. The thing is, anyone with a 3080 or a 3090 will feel that the 7900xtx is a sidegrade in some areas, and a downgrade in others, which it is when it comes to losing DLSS and the RT performance.
#16
Vayra86
fevgatos: Now with that said, someone could make the same argument about AMD, they are RT performance starved to force you to upgrade. The thing is, anyone with a 3080 or a 3090 will feel that the 7900xtx is a sidegrade in some areas, and a downgrade in others, which it is when it comes to losing DLSS and the RT performance.
If you let yourself get forced by RT to upgrade, you're simply doing it wrong, buying into early tech instead of considering basic principles of specs required for your next GPU (which is VRAM & raster perf). But to each their own.

The real question is whether you even need to think of upgrades with a 3090; with a 3080 a 7900XTX is a straight upgrade in every possible metric.
#17
fevgatos
Vayra86: If you let yourself get forced by RT to upgrade, you're simply doing it wrong, buying into early tech instead of considering basic principles of specs required for your next GPU (which is VRAM & raster perf). But to each their own.

The real question is whether you even need to think of upgrades with a 3090; with a 3080 a 7900XTX is a straight upgrade in every possible metric.
I get what you are saying, but most games without RT can be played with decent settings on older cards. Yeah you won't max them out, but you can enjoy a really, really good experience. The games you can't / won't be able to run easily will exclusively be the RT games. And I know there won't be a lot this gen that make extensive use of RT effects, maybe 2-3 games, 5 best case scenario. But it's almost always the case that people upgrade their cards for a handful of games all the time, because most games can be played on their older cards as well.

I mean I had a 1080ti, I upgraded to a 3090 for basically 3 games: RDR2 / Cyberpunk / Control. Everything else could be played very decently on my 1080ti.

The 7900xtx is a sidegrade / slight upgrade when it comes to RT performance sadly.
#18
Dr. Dro
Vayra86: Unlikely, Nvidia did the same thing with Ampere, that gen is just as VRAM starved as Ada and your 4070ti.
They also did it with Kepler's 670 ;) Back then they had that primarily for SLI, a 4GB iteration.
Today you have that core power in a single card, the 4070ti is essentially what the 660ti was in Kepler, the similarities are striking and so is the VRAM problem - the 660ti was notorious for being VRAM starved as well, but then mostly on bandwidth (4070ti : same issue). The 670 is really what Ada's 4080 is now; the 680 is again bandwidth/VRAM constrained, as evident by the 770 that goes faster purely on better memory.



The 16GB will show its true colors 3-4 years from now, just like 8GB Ampere cards are getting surpassed, despite a stronger core, by higher-VRAM equivalents.
Will that be 'worth the price difference today'... time will tell. However regardless of price it does prove every time Nvidia uses VRAM to make sure you buy your next card in a timely manner. And then you pay a premium, again, to buy a VRAM starved card, again. You've just proven the point ;) Isn't 800 quid enough for a 20GB equivalent?

Also, 4070ti is not a 4K card? What?




The cognitive dissonance is DAMN strong, wow. So since Ada, the only conceivable card that does register as 'a 4K card' is the 4090 now? The rest is within spitting distance. I don't understand the madness, sorry...
Nvidia has always been stingy with VRAM. Remember the GTX 295 with only 896 MB? Either way, I wouldn't call it within spitting distance (the 4090 leaves everything else in the dust, with an extreme generational uplift over the 3090 even in its badly cut-down configuration), but IMHO, as long as you're consistently hitting 60 fps or above, I'd argue it is well 4K capable. The 4070 Ti would have that 12 GB VRAM problem that the RTX 3090 doesn't have, but otherwise, it does okay.
fevgatos: I don't think nvidia cares much about whether or not you buy their new card. They're not putting much VRAM in, to prevent pros from going for the mainstream models instead of the Quadros that sell for much more.

Now with that said, someone could make the same argument about AMD, they are RT performance starved to force you to upgrade. The thing is, anyone with a 3080 or a 3090 will feel that the 7900xtx is a sidegrade in some areas, and a downgrade in others, which it is when it comes to losing DLSS and the RT performance.
Agreed. The 7900 XTX is not a worthy upgrade from the 3090. I calculated everything, and despite minor raw frame rate improvements, it's generally a downgrade feature- and driver-support-wise. People are baffled when I tell them that I already had, three years ago, what they now think is surreal, so it doesn't look anywhere near as impressive to me, but it's the truth.
#19
Vayra86
fevgatos: I get what you are saying, but most games without RT can be played with decent settings on older cards. Yeah you won't max them out, but you can enjoy a really, really good experience. The games you can't / won't be able to run easily will exclusively be the RT games. And I know there won't be a lot this gen that make extensive use of RT effects, maybe 2-3 games, 5 best case scenario. But it's almost always the case that people upgrade their cards for a handful of games all the time, because most games can be played on their older cards as well.

I mean I had a 1080ti, I upgraded to a 3090 for basically 3 games: RDR2 / Cyberpunk / Control. Everything else could be played very decently on my 1080ti.

The 7900xtx is a sidegrade / slight upgrade when it comes to RT performance sadly.
Dude, even Cyberpunk is perfectly playable with RT/FSR on a 7900XT. What the hell are we even talking about.
We need to stop parroting Nvidia's marketing, because it is ab-so-lu-te BS. You either have crippled perf with the full box of nonsense maxed out, or you pay north of 1500 for a 4090, to play those 2-3 titles that are fully Nvidia sponsored. Everything else runs like a dream on anything in the high end either red or green. Except when compiling shaders ;)

This is the same category of bullshit I read when I see those endless discussions on DLSS/FSR where you need a magnifying glass and frozen imagery to spot the differences, just to lend credence to a supposed major difference between camps. Except all those sheep are actually knee deep in Red/Green marketing stories and barely register it, apparently.
#20
Dr. Dro
Vayra86: Dude, even Cyberpunk is perfectly playable with RT/FSR on a 7900XT. What the hell are we even talking about.
We need to stop parroting Nvidia's marketing, because it is ab-so-lu-te BS. You either have crippled perf with the full box of nonsense maxed out, or you pay north of 1500 for a 4090, to play those 2-3 titles that are fully Nvidia sponsored. Everything else runs like a dream on anything in the high end either red or green. Except when compiling shaders ;)
It's really not BS, but you are right there, it is playable as long as you compromise here or there, and I'm 100% gonna agree that it's not a big deal, but they are compromises nonetheless. If you want the real deal, ultimate ultraaaaa giga hyper super performance, the 4090's your only choice I'm afraid.
#21
tvshacker
tussinman: Well that's the problem. At least with the 7600 it's rumored to be a $250-280 card.

The 4060 Ti last gen (3060 Ti) was $400 and the rumors were of it being higher this gen.

An 8GB 4060 Ti at $420-450 isn't really attractive or affordable.

Yeah the problem is Nvidia wants a more premium price on the xx60 Ti series (last gen it was $400 and rumors of it starting close to $450 this gen) but 8GB is far from premium.
I'm really curious about the price/performance of the entire 4060 lineup. The 6700XT can (finally) be had for just over 380€ and the 6750XT below 410€, so at the 400~450€(?) price range they have big shoes to fill.
I trust HU will have us covered and will compare the 16GB 4060 Ti with the 3070 in those VRAM-limited scenarios (as well as others); it sucks that with all the info already in the wild, we still have to wait until July.
#22
Dr. Dro
tvshacker: I trust HU will have us covered and will compare the 16GB 4060 Ti with the 3070 in those VRAM-limited scenarios (as well as others); it sucks that with all the info already in the wild, we still have to wait until July.
They've somewhat already done this, comparing the 3070 to its pro counterpart; if you missed it, you might want to watch it:


It's quite the terrifying difference, and I bet it's had a lot to do with people's sudden change of heart on 16 GB RAM + 8 GB GPU PCs in recent months.
#23
fevgatos
Vayra86: Dude, even Cyberpunk is perfectly playable with RT/FSR on a 7900XT. What the hell are we even talking about.
We need to stop parroting Nvidia's marketing, because it is ab-so-lu-te BS. You either have crippled perf with the full box of nonsense maxed out, or you pay north of 1500 for a 4090, to play those 2-3 titles that are fully Nvidia sponsored. Everything else runs like a dream on anything in the high end either red or green. Except when compiling shaders ;)

This is the same category of bullshit I read when I see those endless discussions on DLSS/FSR where you need a magnifying glass and frozen imagery to spot the differences, just to lend credence to a supposed major difference between camps. Except all those sheep are actually knee deep in Red/Green marketing stories and barely register it, apparently.
How is it playable on a 7900xt? It averages 70-80 fps on my 4090 with dlss on.
#24
Dr. Dro
fevgatos: How is it playable on a 7900xt? It averages 70-80 fps on my 4090 with dlss on.
Contrary to popular belief, 80fps at native settings isn't necessary for it to be considered playable :laugh:

I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40ish in games like that. FSR would sacrifice some image quality to keep that on the upper end of that range, closer to 60 really.
#25
fevgatos
Dr. Dro: They've somewhat already done this, comparing the 3070 to its pro counterpart; if you missed it, you might want to watch it:


It's quite the terrifying difference, and I bet it's had a lot to do with people's sudden change of heart on 16 GB RAM + 8 GB GPU PCs in recent months.
His videos lately are just BS. He chooses settings to hog the VRAM of the 3070, and then acts surprised it stutters like crazy. The actual question is, what is the image quality impact in those games if you drop textures to high instead of ultra? Not a lot, I'd imagine; that's why he is not testing it. It won't generate as many clicks as just pooping on nvidia will.

Why isn't he, for example, testing AMD vs Nvidia on the new PT Cyberpunk upgrade? I really wonder.
Dr. Dro: Contrary to popular belief, 80fps at native settings isn't necessary for it to be considered playable :laugh:

I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40ish in games like that. FSR would sacrifice some image quality to keep that on the upper end of that range, closer to 60 really.
Generally speaking, sure, but in Cyberpunk specifically, for some freaking reason, 40 fps is a horrible experience to me. I've played games at 30 fps and it doesn't bother me that much, but 40 in Cyberpunk feels atrocious.