Tuesday, May 9th 2023

NVIDIA GeForce RTX 4060 Ti Available as 8 GB and 16 GB, This Month. RTX 4060 in July

In what could explain the greater attention by leaky taps on the GeForce RTX 4060 Ti compared to its sibling, the RTX 4060, NVIDIA is preparing a staggered launch for its RTX 4060 series. We're also learning that there are as many as three SKUs in the series: the RTX 4060 Ti 8 GB, the RTX 4060 Ti 16 GB, and the RTX 4060. All three will be announced later this month; however, only the RTX 4060 Ti 8 GB will be available to purchase at that time. The RTX 4060 Ti 16 GB and RTX 4060 will be available from July.

At this point, little is known about what separates the 8 GB and 16 GB variants of the RTX 4060 Ti besides memory size. The RTX 4060 Ti 8 GB is rumored to feature 34 out of the 36 streaming multiprocessors (SM) physically present on the 5 nm "AD106" silicon, which gives NVIDIA some theoretical headroom to enable a few more shaders. These 34 SM work out to 4,352 CUDA cores, while a fully unlocked AD106 has 4,608. The RTX 4060 is a significantly different SKU that's based on a maxed-out "AD107" silicon, with 30 SM, or 3,840 CUDA cores, although it should be possible for some RTX 4060 cards to be based on a heavily cut-down AD106.
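As a quick sanity check on those core counts, here is a rough sketch; the only assumption is the 128 CUDA cores per Ada SM that NVIDIA's core counts are derived from:

```python
# Ada SMs carry 128 CUDA cores each, which is where these core counts come from.
CORES_PER_SM = 128

for name, sm_count in [("RTX 4060 Ti (rumored)", 34),
                       ("Fully unlocked AD106", 36),
                       ("RTX 4060 (rumored)", 30)]:
    print(f"{name}: {sm_count} SM -> {sm_count * CORES_PER_SM} CUDA cores")
# 34 SM -> 4352, 36 SM -> 4608, 30 SM -> 3840
```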
Sources: MEGAsizeGPU (Twitter), VideoCardz

120 Comments on NVIDIA GeForce RTX 4060 Ti Available as 8 GB and 16 GB, This Month. RTX 4060 in July

#51
Punkenjoy
TheinsanegamerN: Because 10GB would either require A) an asymmetric VRAM setup which people will lose their shat over, or B) a re-designed GPU with a wider memory bus, which would mean you have two different GPUs with the same name, which people will lose their shat over.

So just double the VRAM, which people will ALSO lose their shat over, but will be the cheapest to implement.
Yep

The main issue was that Nvidia had to use overpriced memory (GDDR6X) since their GPUs needed all that bandwidth to perform. Since it was pricey, the amount of memory on Nvidia GPUs stayed stable for a few years and people got used to it. The other issue is memory size versus bus size.

Memory chips are 32-bit. So you take your bus size, divide by 32, multiply by your memory chip size, and you get your total RAM.

What could help is non-binary chip sizes: 3 GB chips could give the 4060 12 GB.
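A minimal sketch of that math (the 3 GB chip is hypothetical, as noted above; everything else uses the common 2 GB GDDR6 density):

```python
# Total VRAM falls out of the bus width: one 32-bit chip per 32-bit slice of the bus.
#   total VRAM = (bus width / 32) * capacity per chip
def vram_gb(bus_width_bits: int, chip_gb: float) -> float:
    channels = bus_width_bits // 32   # number of 32-bit memory channels
    return channels * chip_gb

print(vram_gb(128, 2))   # 4060-class, 2 GB chips          -> 8 GB
print(vram_gb(128, 3))   # hypothetical 3 GB chips         -> 12 GB
print(vram_gb(192, 2))   # 192-bit bus (3060 12 GB, 4070)  -> 12 GB
```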

Most GPUs could use more memory and would just run fine. VRAM is like RAM: you always want to have enough, and it's better to have free space than to be tight.

The GPU can always swap data in and out, but that uses precious time and bandwidth you could spend on something else. And even with high main system memory bandwidth and PCIe 4.0 x16 bandwidth, it still takes time to load what you need. This is why you get stutter when you lack VRAM.

In the past, VRAM was mostly used to store the frame buffer (rendered frames waiting to be displayed) and textures. This was the main reason why someone would say you need X amount of memory to run a given resolution: at that time, the frame buffer used a significant portion of the VRAM.

Today, it must store way more. Almost every shader you run will generate some data that will need to be reused at some point. Also, most games now use various buffers to exchange data with the CPU, and there is also the need to keep data from previous frames for all the temporal effects. That is one of the reasons why memory requirements between resolutions are much closer than their actual pixel counts would suggest.

GPUs get more powerful as they compute more stuff. That stuff needs to be stored somewhere. I wouldn't buy the 8 GB variant. I find that the 6 GB 3060 was already too low on memory. The 4060 might cost more, but it will last way longer.
#52
ixi
BoboOOZ: Same answer here, just looked at the poll. Nvidia's lineup is both more expensive and more cut down this year. We'll see in 6 months who wins, Nvidia or the gamers.
Kinda off topic, but I don't get the people who bought Nvidia, or are buying it, just because they can eRTeX with shitty performance without DLSS. Native is native and it will be better than fake resolution.
#53
AusWolf
BoboOOZ: Same answer here, just looked at the poll. Nvidia's lineup is both more expensive and more cut down this year. We'll see in 6 months who wins, Nvidia or the gamers.
I've already won with the 6750 XT. I'm not gonna buy another Nvidia card as long as they keep selling cut-down versions for gold just to upsell the proper stuff even more.
ixi: Kinda off topic, but I don't get the people who bought Nvidia, or are buying it, just because they can eRTeX with shitty performance without DLSS. Native is native and it will be better than fake resolution.
Agreed. I also don't get the people who voted more than $10 on this poll. $450 is high enough already. How much more do you guys want to pay for a mid-range card?
#54
JimmyDoogs
It's not a great price for a 60 Ti card, but I'd still really like to see an Nvidia vs. AMD $1k build comparison this summer in 1440p gaming.
#55
ValenOne
whereismymind: 16 GB is just flat-out overkill... my guess is that the 4060 Ti 16G will cost a pretty 100-200 USD above the normal Ti, and that'll be harboring into 4070 Ti territory. The 4070 already has 12G VRAM, so if you wanted to game, it's either a worse card with more RAM or a better card with less RAM. Why not just make the 4060 Ti at 10 GB?
The RTX 4060 Ti has about 22 TFLOPS, which is higher than the 16 GB, GA104-based RTX A4000's 19.17 TFLOPS.

Without the desktop PC's IGP enabled, Windows 10/11 DWM and non-gaming apps will consume VRAM, so the game doesn't get the full 12 GB. Check your Windows Task Manager's dedicated GPU memory usage before you run a game.

Four 2 GB 32-bit chips give 8 GB. For 16 GB on a 128-bit bus, you would need four 4 GB density chips, which don't exist. The clamshell configuration doubles the chip count instead.

RTX 3060 12GB used a 192-bit bus with six 2GB density chips.

NVIDIA's 48 GB VRAM cards are in a clamshell configuration which uses 2GB x 24 chips.

GDDR6W has a 4GB chip density.
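A small illustrative sketch of the configurations listed above, including the clamshell option (two chips sharing one 32-bit channel):

```python
# Capacity for a normally populated board vs. a clamshell board,
# using the chip densities quoted above (GB per 32-bit chip).
def capacity_gb(bus_width_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32      # one chip per 32-bit channel...
    if clamshell:
        chips *= 2                    # ...or two chips per channel, back to back
    return chips * chip_gb

print(capacity_gb(128, 2))                   # 8 GB  (four 2 GB chips)
print(capacity_gb(128, 2, clamshell=True))   # 16 GB (eight 2 GB chips)
print(capacity_gb(192, 2))                   # 12 GB (RTX 3060, six 2 GB chips)
print(capacity_gb(384, 2, clamshell=True))   # 48 GB (24 x 2 GB chips)
```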
#56
MarsM4N
16 GB on a 4060 Ti is just wasted potential. You pay for something extra that benefits you in maybe 0.01% of games. :rolleyes: Nvidia's lineup doesn't make any sense.
If they copied AMD (who have been more generous for years, without senseless overstacking), their lineup would look like this:

RTX 4090 (24GB) / RTX 4090ti (24GB)
RTX 4080 (20GB) / RTX 4080ti (20GB)
RTX 4070 (16GB) / RTX 4070ti (16GB)
RTX 4060 (12GB) / RTX 4060ti (12GB)
RTX 4050 (10GB) / RTX 4050ti (10GB)
#57
bug
remunramu: So does this mean the 4070-class refresh next year will also get 16 GB? smh, my 4070 Ti surely could become obsolete within a year :twitch:
If this sells well, we may get new 4070 16GB models. But there are still many unknowns.
ixi: Kinda off topic, but I don't get the people who bought Nvidia, or are buying it, just because they can eRTeX with shitty performance without DLSS. Native is native and it will be better than fake resolution.
The only way you wouldn't get this is if, in your mind, there were zero possibility that people are actually curious to try out RT.

I know I'm a particular case, but unless AMD can come up with something really awesome, I'm not getting an AMD card, simply because I'm nauseated by the hordes of fanboys out there blaming Nvidia for every move they make.
#58
Punkenjoy
MarsM4N: 16 GB on a 4060 Ti is just wasted potential. You pay for something extra that benefits you in maybe 0.01% of games. :rolleyes: Nvidia's lineup doesn't make any sense.
If they copied AMD (who have been more generous for years, without senseless overstacking), their lineup would look like this:

RTX 4090 (24GB) / RTX 4090ti (24GB)
RTX 4080 (20GB) / RTX 4080ti (20GB)
RTX 4070 (16GB) / RTX 4070ti (16GB)
RTX 4060 (12GB) / RTX 4060ti (12GB)
RTX 4050 (10GB) / RTX 4050ti (10GB)
So, since memory chips are populated per 32-bit portion of the memory bus, you would have:

RTX 4090 could potentially be 192 or 384 bit
RTX 4080 could potentially be 160 or 320 bit
RTX 4070 could potentially be 128 or 256 bit
RTX 4060 could potentially be 96 or 192 bit

For the 4050, well, it would need to be 160-bit minimum for 10 GB.

Unless they figure out how to produce non-binary-size memory chips (e.g. 3 GB instead of 2 GB per 32-bit channel).
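For what it's worth, here is that arithmetic run in reverse against the wishlist above, as a rough sketch assuming 2 GB chips and allowing for clamshell boards:

```python
# Which bus widths could reach a target capacity with 2 GB chips:
# one chip per 32-bit channel normally, or two per channel in clamshell mode.
def bus_options(target_gb: int, chip_gb: int = 2) -> list[int]:
    chips = target_gb // chip_gb          # chips needed at this density
    options = [chips * 32]                # normal population
    if chips % 2 == 0:                    # clamshell needs an even chip count
        options.insert(0, chips // 2 * 32)
    return options

for gb in (24, 20, 16, 12, 10):
    print(f"{gb} GB -> {bus_options(gb)} bit")
# 24 -> [192, 384], 20 -> [160, 320], 16 -> [128, 256], 12 -> [96, 192], 10 -> [160]
```

Which lines up with the 160-bit minimum for 10 GB, since five chips can't be split evenly into clamshell pairs.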
#59
Aretak
remunramu: So does this mean the 4070-class refresh next year will also get 16 GB? smh, my 4070 Ti surely could become obsolete within a year :twitch:
Not unless they change the silicon. The full AD104 die has a 192-bit bus, so 16GB isn't going to happen. GDDR6X is only available in 1GB and 2GB modules, so unless Nvidia start using cut-down AD103 dies for the 70-series cards then 12GB is already the maximum (six 2GB modules). But using the larger die and upping the bus to 256-bit would have a large (positive) impact on performance and make the 4080 (even more of) a joke, so that doesn't seem likely. Nor does cutting AD104 down to 128-bit, as that'd cripple it. They've rather backed themselves into a corner by trying to get away with a 192-bit bus on 70-class cards for the first time ever.
#60
pavle
MarsM4N: 16 GB on a 4060 Ti is just wasted potential. You pay for something extra that benefits you in maybe 0.01% of games. :rolleyes: Nvidia's lineup doesn't make any sense.
If they copied AMD (who have been more generous for years, without senseless overstacking), their lineup would look like this:

RTX 4090 (24GB) / RTX 4090ti (24GB)
RTX 4080 (20GB) / RTX 4080ti (20GB)
RTX 4070 (16GB) / RTX 4070ti (16GB)
RTX 4060 (12GB) / RTX 4060ti (12GB)
RTX 4050 (10GB) / RTX 4050ti (10GB)
That would indeed be a very good lineup, but instead Nvidia is offering a broken lineup with a re-baked 4060 Ti. What losers. To teach them a lesson, one has to hit them where it hurts the most, their wallet: do not buy such cards. It is, as you well said, wasted potential, with added fake_frames™ to increase the latency of when buyer's remorse hits you. Quite appalling, really.
#61
Punkenjoy
There are people here that don't understand that memory capacity and bus size are linked.

#62
MarsM4N
Punkenjoy: There are people here that don't understand that memory capacity and bus size are linked.

Pretty sure most people understand the Nvidia issue. ;) Memory capacity >< Bus size >< Greed.

Or do you wanna tell me Nvidia, the absolute market leader in GPUs, is incapable of developing GPUs with adequate bus sizes for adequate memory sizes? Come on.
#63
droopyRO
ixi: RTX 4060 Ti with 16 GB, what kind of miracle is this?
Maybe they have some info on what AMD is going to do with their unreleased 7000 series GPUs.
#64
bug
Punkenjoy: There are people here that don't understand that memory capacity and bus size are linked.

AMD didn't put that into their slides. They only told users to complain about memory size. So that's what the users are doing.
Understanding is so '90s.
#65
rv8000
For such a successful company, the 4000-series lineup has been quite stupid in terms of segmentation. They're on track for more SKUs than the 2000 and 3000 series combined at this point.

Releasing a 16 GB 60-series card while your more premium 4070 cards get 12 GB? They're going to be all over the place if they start releasing larger-capacity SKUs between the 4070/4080, with Super and Ti models at 16/20 GB VRAM capacities. Apparently Nvidia enjoys bullets in their feet.
#66
Why_Me
People complained that the card didn't have enough VRAM, and now they're complaining again when the card will offer 16 GB of VRAM. Go figure, eh.
#67
MrDweezil
It would be the first card in the lineup I'd consider buying, if they can keep the price premium to $50.
#68
AusWolf
bug: If this sells well, we may get new 4070 16GB models. But there are still many unknowns.
I have my doubts. The "choose VRAM or performance but not both" strategy played well for Nvidia in the 30-series. I don't see why it wouldn't work now. Besides, introducing a 16 GB 4070 would kill sales of 8 GB cards. That's basically why the 16 GB 3070 Ti got cancelled.
bug: The only way you wouldn't get this is if, in your mind, there were zero possibility that people are actually curious to try out RT.

I know I'm a particular case, but unless AMD can come up with something really awesome, I'm not getting an AMD card, simply because I'm nauseated by the hordes of fanboys out there blaming Nvidia for every move they make.
Do you buy your upgrades for yourself, or to please/anger the fanboys? Personally, I don't give a damn how much people hate Nvidia or AMD; I buy whatever suits my needs and piques my interest, and right now that is AMD. Nvidia has been offering the same architecture with slight changes and low VRAM for three generations now, not to mention the prices, which are a joke. $450 for an 8 GB midrange card? Ridiculous!
#69
rv8000
Why_Me: People complained that the card didn't have enough VRAM, and now they're complaining again when the card will offer 16 GB of VRAM. Go figure, eh.
I'll make it simple for you. Nvidia attempting to appease people with higher VRAM across the board creates the necessity for xx70 and xx80/Ti SKUs with different bus widths. This leads to different SM configs where you have 4 or more 4070-class SKUs. We do not need 4-5 SKUs per tier of card. It has nothing to do with complaining about VRAM, and everything to do with their lineup making no sense and making models potentially DOA or obsolete within 5-6 months.

Why you don't find this comical is weird. I feel for their board partners. While I miss EVGA, the shenanigans Nvidia is currently partaking in are wild, and I can definitely see why someone wouldn't want to work with them as an AIB.
#70
Quicks
It will be a bit of a waste if used on a 128-bit bus.
#71
Warigator
No meaningful improvement in specs/price at all. The best thing to do was to buy a GTX 1660 Super when it was new, like my friend did, and just keep it until things finally start moving again [at a constant price]. The RTX 3060 brought an improvement in memory/price and that's it. Remember that, according to Steam statistics, the GTX 1650 is the most popular gaming graphics card. It's between the 1050 Ti and 1060 3GB in performance. It also has 4 GB of VRAM.
#72
Dragokar
So this confirms that the 7700 XT will also come with a 16 GB model later on, at the release alongside the 7800.
#73
tussinman
Solaris17: I'm willing to bet it was a late SKU they had no intention of initially releasing, until the backlash regarding their VRAM practices.
It also doesn't help that they were originally planning on pricing the 4060 Ti in the 6700 XT and 7700 XT price tier, so 8 GB only was an odd choice.
#74
JimmyDoogs
Dragokar: So this confirms that the 7700 XT will also come with a 16 GB model later on, at the release alongside the 7800.
7700 XT 16 GB, please come out so we can finally make those under-$1k beasts again like back in 2019 and save PC gaming.
#75
ixi
JimmyDoogs: 7700 XT 16 GB, please come out so we can finally make those under-$1k beasts again like back in 2019 and save PC gaming.
CPUs, mobos, RAM, and PSUs should lower their prices too. Freaking weakest chipsets going above €100 :D. What a time to be alive... and then again... Z68 for €100... sometimes it would be great to go back in time.