Tuesday, May 9th 2023

NVIDIA GeForce RTX 4060 Ti Available as 8 GB and 16 GB This Month, RTX 4060 in July

In what could explain the greater attention from leaky taps on the GeForce RTX 4060 Ti compared to its sibling, the RTX 4060, NVIDIA is preparing a staggered launch for its RTX 4060 series. We're also learning that there are as many as three SKUs in the series: the RTX 4060 Ti 8 GB, the RTX 4060 Ti 16 GB, and the RTX 4060. All three will be announced later this month; however, only the RTX 4060 Ti 8 GB will be available to purchase at the time. The RTX 4060 Ti 16 GB and RTX 4060 will be available from July.

At this point, little is known about what separates the 8 GB and 16 GB variants of the RTX 4060 Ti besides memory size. The RTX 4060 Ti 8 GB is rumored to feature 34 of the 36 streaming multiprocessors (SM) physically present on the 5 nm "AD106" silicon, which gives NVIDIA some theoretical headroom to enable a few more shaders. Those 34 SM work out to 4,352 CUDA cores, while a fully unlocked AD106 has 4,608. The RTX 4060 is a significantly different SKU based on a maxed-out "AD107" silicon with 30 SM, or 3,840 CUDA cores, although it should be possible for some RTX 4060 cards to be based on a heavily cut-down AD106.
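As a sanity check on those numbers, here is a minimal sketch of the arithmetic, assuming Ada Lovelace's 128 CUDA cores per SM:

# CUDA core counts follow directly from SM counts on Ada Lovelace,
# which packs 128 CUDA cores into each SM.
CORES_PER_SM = 128

configs = {
    "RTX 4060 Ti (34 of 36 SM, AD106)": 34,
    "Fully unlocked AD106 (36 SM)": 36,
    "RTX 4060 (full AD107, 30 SM)": 30,
}

for name, sm in configs.items():
    print(f"{name}: {sm * CORES_PER_SM:,} CUDA cores")
# RTX 4060 Ti (34 of 36 SM, AD106): 4,352 CUDA cores
# Fully unlocked AD106 (36 SM): 4,608 CUDA cores
# RTX 4060 (full AD107, 30 SM): 3,840 CUDA cores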
Sources: MEGAsizeGPU (Twitter), VideoCardz

120 Comments on NVIDIA GeForce RTX 4060 Ti Available as 8 GB and 16 GB This Month, RTX 4060 in July

#26
kondamin
TumbleGeorge: I wonder how much VRAM will be needed in just a few years, when the world's first holographic PC games become available? I mean real volume-projected, full-color, high-definition holography, not some imitation.
Few? If you are in your teens, you might see it in your lifetime.
I no longer expect to see it in mine.
#27
tvshacker
N3utro: A 16 GB 4060 Ti would be a big "f you" in the face of all 4070 and 4070 Ti owners (I'm one of them :p)
Depends; if they go for an MSRP of $500 (or above) for the 16 GB version, you'll be the one laughing.
#28
Daven
I’m going to guess the 4060 non-Ti 8 GB will be the lowest GPU in the 4000 product stack. Nvidia seems less and less interested in the sub-$300 budget GPU market.
#29
tvshacker
Daven: I’m going to guess the 4060 non-Ti 8 GB will be the lowest GPU in the 4000 product stack. Nvidia seems less and less interested in the sub-$300 budget GPU market.
That would be an "honest" move, as long as it's better than the 3060 8 GB. The RTX 3050 served little purpose: it was worse than the 2060 and more expensive.
#30
Pumper
The 4070 12 GB needs a price cut, and a 16 GB version should replace it at $600.
tvshacker: The 3070 only recently (last week or so) dropped below €500 where I live, so if the 4060 Ti goes for the rumoured $450, it had better be the 16 GB version, or the 8 GB should perform considerably better than the 3070...
Oh, don't worry, it will be 3x****************** faster than the 3070 in Nvidia marketing.
#31
tvshacker
Pumper: The 4070 12 GB needs a price cut, and a 16 GB version should replace it at $600.
I think the likely move is a price cut on the 4070 to $500 and the 4070 Ti to $600-650, plus the release of a 16 GB 4070 Super/Ultra made from a cut-down 4080 chip at $800 in about a year from now.
#32
TumbleGeorge
tvshacker: 16 GB 4070 Super/Ultra made from a cut-down 4080 chip at $800 in about a year from now.
No way! With a new graphics card generation bringing GDDR7 and double the VRAM less than six months away, there won't be buyers paying $800 for that garbage.
#33
bug
Daven: I’m going to guess the 4060 non-Ti 8 GB will be the lowest GPU in the 4000 product stack. Nvidia seems less and less interested in the sub-$300 budget GPU market.
It's not that they're less interested; it's that RT is barely tractable on mid-range cards, so they don't know what the low end should look like. Remember how they had to come up with the 1660?
#34
BoboOOZ
tvshacker: And less bandwidth than the 3070 as well.
Given that the whole purpose of these textures is to stay resident in VRAM for a long time (as opposed to being reloaded for each frame), the memory bandwidth is not really used. The only case where bandwidth matters is when there is not enough VRAM and the GPU tries to load the textures every frame; then it fails miserably, as on the 3070 Ti, in spite of decent bandwidth, because the quantity of data to be loaded is simply too large to move on a per-frame basis.
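A rough back-of-the-envelope version of that argument (the working-set size and frame rate below are made-up assumptions):

# Resident textures cost bandwidth once; textures that don't fit in VRAM
# must cross the PCIe bus again every frame, in the worst case.
texture_working_set_gb = 6   # assumed resident texture data for a scene
target_fps = 60

worst_case_gbps = texture_working_set_gb * target_fps
print(f"Worst-case re-streaming demand: {worst_case_gbps} GB/s")  # 360 GB/s

# PCIe 4.0 x16 tops out around ~32 GB/s, so once VRAM overflows,
# frame times collapse no matter how fast the VRAM itself is.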
#35
docnorth
Any info about memory bus? Can it be upgraded to 256-bit?
#36
Legacy-ZA
ixi: RTX 4060 Ti with 16 GB, what kind of miracle is this?
Yes yes, but for a mere $2,000 extra! ;D
#37
Daven
bug: It's not that they're less interested; it's that RT is barely tractable on mid-range cards, so they don't know what the low end should look like. Remember how they had to come up with the 1660?
That’s right. The 16 series was the last budget line from Nvidia built without RT compute units. Just imagine if Nvidia did that all the way up the stack and designed high-end GPUs both with and without RT. The non-RT units would be cheaper, along the lines of historic pricing. Guess how many people would spend the extra $100s for RT-enabled GPUs. I’m thinking of a number close to, but not quite, zero.

This situation reminds me of the 32/64-bit hybrid x86-64 CPUs. That was done right: AMD widened the existing registers to 64 bits and extended the file from 8 to 16 general-purpose registers, so 32-bit software simply uses the lower halves of the same hardware. Just imagine if AMD had instead bolted on a separate, non-overlapping bank of 64-bit registers next to the 32-bit ones; the extra transistors would have increased costs.
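As a toy model of that register sharing (illustrative Python, not real hardware, though the zero-extension behavior mirrors actual x86-64):

MASK32 = (1 << 32) - 1
MASK64 = (1 << 64) - 1

class GPR:
    """One general-purpose register: 64 bits of storage shared by both modes."""
    def __init__(self):
        self.value = 0

    def write64(self, v):
        # 64-bit software uses the full width
        self.value = v & MASK64

    def write32(self, v):
        # 32-bit software reuses the same storage; on real x86-64,
        # a 32-bit write zero-extends into the upper half
        self.value = v & MASK32

rax = GPR()
rax.write64(0xDEADBEEF_CAFEBABE)
rax.write32(0x12345678)            # same transistors, narrower view
assert rax.value == 0x12345678     # no second register bank needed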

This is why I do not like RT functionality. In Nvidia's case, the functional units for raster and RT do not overlap. As more parts of a gaming scene use RT, fewer of the raster compute units are used. If I understand things correctly, a full path-traced game would leave the huge majority of the GPU's compute units idle.

So now we are saddled with expensive GPUs because of the extra RT compute units and the lack of overlapping functionality, and with the loss of budget GPUs, because Nvidia knows that a choice between RT-enabled and non-RT GPUs would not go their preferred way, costing them their supposed competitive advantage.
#39
wolf
Better Than Native
Wonder if we'll see higher-memory versions of other SKUs...
64K: All I can tell you is that Nvidia works in mysterious ways. They do oddball things from time to time, and their logic is unfathomable when they do it.
They all do, mate; it's hard to account for half the things AMD does too.

Oh wait, money.
#40
Colddecked
docnorth: Any info about memory bus? Can it be upgraded to 256-bit?
The memory bus is not going to be upgraded; they'll just use higher-capacity chips.
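A quick sketch of the capacity math (the 128-bit bus matches the AD106 rumors; the 2 GB chip density and the clamshell option are assumptions, not confirmed board details):

# Symmetric GDDR6: capacity = channels x chips-per-channel x chip density.
def vram_gb(bus_width_bits, gb_per_chip=2, clamshell=False):
    channels = bus_width_bits // 32              # one 32-bit channel per chip
    chips = channels * (2 if clamshell else 1)   # clamshell pairs two chips per channel
    return chips * gb_per_chip

print(vram_gb(128))                  # 8 GB: the rumored 4060 Ti 8 GB layout
print(vram_gb(128, clamshell=True))  # 16 GB: double the capacity, same 128-bit bus
print(vram_gb(256))                  # 16 GB: but this would take a different die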
#41
Mahboi
whereismymind: 16 GB is just flat-out overkill... My guess is that the 4060 Ti 16 GB will cost a pretty 100-200 USD above the normal Ti, and that'll be encroaching into 4070 Ti territory. The 4070 already has 12 GB of VRAM, so if you wanted to game, it's either a worse card with more RAM or a better card with less RAM. Why not just make the 4060 Ti 10 GB?
Doesn't matter. If it were a budget card (lol, an Nvidia budget card) with 16 GB, I'd advocate for it to anyone trying to save. A weaker chip that doesn't consume too much power and has 16 GB basically has everything you'd want in a future-proofing, low-expenditure card. Perfect for paying as little as possible and still using it in, say, 5 years with every new AAA at medium or high, without compromising on textures or the low-computation stuff.

No RT, no extreme lighting, no precise details, but a decent-looking game that isn't gimped and never really chokes on anything. And the 16 GB is a great added bonus for all kinds of productivity tasks too.
It'd be a budget monster if it weren't for Nvidia. If AMD sells a card with this much VRAM for 25% less, I'm not even considering this card worthwhile unless you need CUDA.
#42
ixi
tvshacker: What shenanigans will they come up with to not cripple the professional line?
They already did that with the 4070 Ti, as it is weaker than the 3080 without DLSS :D.
#43
TheinsanegamerN
whereismymind: 16 GB is just flat-out overkill... My guess is that the 4060 Ti 16 GB will cost a pretty 100-200 USD above the normal Ti, and that'll be encroaching into 4070 Ti territory. The 4070 already has 12 GB of VRAM, so if you wanted to game, it's either a worse card with more RAM or a better card with less RAM. Why not just make the 4060 Ti 10 GB?
Because 10 GB would either require A) an asymmetric VRAM setup, which people will lose their shat over, or B) a redesigned GPU with a wider memory bus, which would mean two different GPUs with the same name, which people will lose their shat over.

So just double the VRAM, which people will ALSO lose their shat over, but which is the cheapest to implement.
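Putting rough numbers on that: with a symmetric GDDR6 layout (the same assumed 2 GB chip on every 32-bit channel), capacity is locked to bus width, so 10 GB has no clean fit:

# Symmetric GDDR6: capacity = (bus width / 32) channels x chip size.
CHIP_GB = 2  # assumed 16 Gbit chips, the common density at the time

for bus_bits in (96, 128, 160, 192, 256):
    print(f"{bus_bits:>3}-bit bus -> {bus_bits // 32 * CHIP_GB} GB")
#  96-bit bus -> 6 GB
# 128-bit bus -> 8 GB   (the rumored 4060 Ti bus)
# 160-bit bus -> 10 GB  (option B: a redesigned, wider memory controller)
# 192-bit bus -> 12 GB
# 256-bit bus -> 16 GB
# 10 GB on a 128-bit bus would need mixed chip densities: option A,
# the asymmetric setup.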
#44
AusWolf
Solaris17: I'm willing to bet it was a late SKU they had no intention of initially releasing, until the backlash regarding their VRAM practices.
It looks like there needs to be backlash about everything if we want Nvidia to make any effort.

My answer to the poll is: nothing. The 4060 Ti is overpriced enough already. $450 would be a nice price for a 16 GB 4070 Ti.
#45
JimmyDoogs
Oh wow, so budget gamers might actually be able to run games at 1440p with this thing, with 16 GB and DLSS on.
#46
AusWolf
JimmyDoogs: Oh wow, so budget gamers might actually be able to run games at 1440p with this thing, with 16 GB and DLSS on.
If you call $450 plus change "budget"... For me, this is the line between high-end and enthusiast tier.
#47
remunramu
So does this mean the 4070-class refresh next year will also get 16 GB? SMH, my 4070 Ti could well be obsolete within a year :twitch:
#49
BoboOOZ
AusWolf: It looks like there needs to be backlash about everything if we want Nvidia to make any effort.

My answer to the poll is: nothing. The 4060 Ti is overpriced enough already. $450 would be a nice price for a 16 GB 4070 Ti.
Same answer here; I just looked at the poll. Nvidia's lineup is both more expensive and more cut down this year. We'll see in six months who wins, Nvidia or the gamers.
#50
playerlorenzo
A non-VRAM-starved NVIDIA GPU? That's new…

Hopefully the upfront price of such additional VRAM doesn't make that version practically moot.