Thursday, July 6th 2023

16GB Variant of GeForce RTX 4060 Ti Launches July 18

NVIDIA is preparing the launch of its third and final RTX 4060-series graphics card SKU, the GeForce RTX 4060 Ti 16 GB, for July 18, 2023. Going by past convention, reviews of RTX 4060 Ti 16 GB cards priced at the steep $499 MSRP will go live on July 17, and those of cards priced above the MSRP on July 18, alongside market availability. The RTX 4060 Ti 16 GB is essentially a memory variant of the RTX 4060 Ti 8 GB; it offers 16 GB of video memory across the card's 128-bit wide memory interface.

According to the spec sheet put out by NVIDIA on the May 18 launch date for the RTX 4060 series, besides memory size there are no other differences between the RTX 4060 Ti 16 GB and the current RTX 4060 Ti 8 GB. In particular, there is no change in core configuration or clock speed, as the shader compute throughput of both models is listed at the same 22 TFLOPS. Even the memory speed is the same, at 18 Gbps (GDDR6-effective), at which the GPU has 288 GB/s of memory bandwidth at its disposal. It will be interesting to see the performance impact of 16 GB of memory.
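That bandwidth figure follows directly from the data rate and bus width; a minimal sanity check, using only the specs quoted above:

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# RTX 4060 Ti (both 8 GB and 16 GB): 18 Gbps GDDR6 on a 128-bit bus.
print(mem_bandwidth_gb_s(18, 128))  # 288.0 GB/s, matching NVIDIA's figure
```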
Sources: MEGAsizeGPU (Twitter), VideoCardz

60 Comments on 16GB Variant of GeForce RTX 4060 Ti Launches July 18

#1
Dan.G
Would like a GDDR6X variant, similar to what the 3060 Ti has. But even with 21 Gbps GDDR6X, you'd still get only 336 GB/s because of the narrow 128-bit bus. Sounds better, though. Like prices: the 4060 Ti is $399, not $400.
Posted on Reply
#2
chrcoluk
Where is the 16 gig 4070ti?
Posted on Reply
#3
bug
Next to these, the 8GB models will look like they're selling in droves.
Posted on Reply
#4
Vayra86
chrcolukWhere is the 16 gig 4070ti?
Don't go around presenting sensible configurations now, come on. Nvidia needs that handicapped stack to fight AMD! And they have a newly discounted 4080 to sell you
Posted on Reply
#5
Legacy-ZA
So it can choke on an additional 8GB of VRAM? WTF.
Posted on Reply
#6
wolf
Better Than Native
chrcolukWhere is the 16 gig 4070ti?
They'd need to use AD103 for that, or cut down the memory bus from 192-bit to 128-bit, which would hurt performance.

The way they've named their GPUs relative to the dies this time round has put them in a pretty average spot for memory configs. I feel that moving from Samsung's node to TSMC netted more improvement than consumers expected, so they want to charge more for lower-tier products.

Personally I'd have preferred to see

4090 24GB AD102
4080 20GB AD102
4070 16GB AD103
4060 12GB AD104
4050 8GB AD106

But it seems like they, as well as AMD, want to condition the market to expect linear pricing relative to the top product. It used to be that the xx80 was 10-20% behind the top dog for roughly half the price, but now half the performance of a 4090 costs half as much, and so on. I don't see this changing; both camps seem happy with lacklustre sales if it means we're used to it for future generations that will bring their own generational improvements.

Edit: AD106 where I put AD107.
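wolf's linear-pricing point can be made concrete with a quick perf-per-dollar sketch; the prices and relative-performance figures below are illustrative assumptions for the sake of the comparison, not measured data:

```python
# Perf-per-dollar under the old and new pricing models (all numbers assumed).
stacks = {
    "old top dog": (100, 1200),  # (relative performance, USD)
    "old xx80":    (85,  600),   # ~15% behind the top for half the price
    "4090-like":   (100, 1600),
    "4080-like":   (75,  1200),  # ~3/4 the performance for ~3/4 the price
}

for name, (perf, price) in stacks.items():
    print(f"{name:>12}: {perf / price * 1000:.0f} perf points per $1000")
# Old model: value improves sharply as you go down the stack (83 -> 142).
# New model: perf-per-dollar stays roughly flat (62 -> 62), i.e. "linear pricing".
```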
Posted on Reply
#7
chrcoluk
wolfThey'd need to use AD103 for that, or cut down the memory bus from 192-bit to 128-bit, which would hurt performance.

The way they've named their GPUs relative to the dies this time round has put them in a pretty average spot for memory configs. I feel that moving from Samsung's node to TSMC netted more improvement than consumers expected, so they want to charge more for lower-tier products.

Personally I'd have preferred to see

4090 24GB AD102
4080 20GB AD102
4070 16GB AD103
4060 12GB AD104
4050 8GB AD107

But it seems like they, as well as AMD, want to condition the market to expect linear pricing relative to the top product. It used to be that the xx80 was 10-20% behind the top dog for roughly half the price, but now half the performance of a 4090 costs half as much, and so on. I don't see this changing; both camps seem happy with lacklustre sales if it means we're used to it for future generations that will bring their own generational improvements.
Yeah, I am not buying into this crappy ecosystem they have set up.

I have learnt that on newer Windows builds the VRAM situation gets even worse, as some native Windows processes are now GPU-accelerated, meaning more VRAM usage as a baseline. My plan is to use the iGPU for the desktop when I do my platform upgrade, so all the GPU-accelerated desktop stuff loads into DRAM instead of VRAM, which will buy me anywhere from about 500 MB to a couple of gigs of VRAM depending on what apps are running. The dGPU will just be for rendering games. Would be cool if NVIDIA did render-only drivers.

For those who don't know, the NVIDIA Control Panel has an option in its top menu to enable a systray icon that shows which apps are using the GPU.

Right now on my PC, the following are all using my GPU, and as such also VRAM: WFC, firefox, searchapp, onedrive, textinputhost, and msedge. Discord will also use it by default, but that can be disabled; Steam likewise.
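A scriptable alternative to the systray icon (a minimal sketch, assuming NVIDIA's nvidia-smi utility, which ships with the driver, is on the PATH):

```python
# Prints nvidia-smi's output; the "Processes" table at the bottom lists each
# PID, its type (C = compute, G = graphics), and its GPU memory usage.
import subprocess

result = subprocess.run(["nvidia-smi"], capture_output=True, text=True)
print(result.stdout)
```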
Posted on Reply
#8
bug
wolfThey'd need to use AD103 for that, or cut down the memory bus from 192-bit to 128-bit, which would hurt performance.

The way they've named their GPUs relative to the dies this time round has put them in a pretty average spot for memory configs. I feel that moving from Samsung's node to TSMC netted more improvement than consumers expected, so they want to charge more for lower-tier products.

Personally I'd have preferred to see

4090 24GB AD102
4080 20GB AD102
4070 16GB AD103
4060 12GB AD104
4050 8GB AD107

But it seems like they, as well as AMD, want to condition the market to expect linear pricing relative to the top product. It used to be that the xx80 was 10-20% behind the top dog for roughly half the price, but now half the performance of a 4090 costs half as much, and so on. I don't see this changing; both camps seem happy with lacklustre sales if it means we're used to it for future generations that will bring their own generational improvements.
It used to be that the xx80 was the top dog. Tbh to me it still is, I was never able to see the xx90 as anything other than some sort of tech demo. It's just the old Titan, rebadged.
Posted on Reply
#9
ZoneDymo
bugIt used to be that the xx80 was the top dog. Tbh to me it still is, I was never able to see the xx90 as anything other than some sort of tech demo. It's just the old Titan, rebadged.
And that would be true if the 4090 had actual professional hardware/software additions like the Titan had, and also cost a lot more, like $2,000+ today.

But instead it only costs a bit more, and in terms of frames per dollar I think it was even a better deal than the 4080... you get less for your money going down the stack, which is how the sellers like it.
Posted on Reply
#11
bug
Titanium_Carbide$100 for 8GB VRAM? :oops:
Goes well with $400 for a $250 card, I guess.
Posted on Reply
#13
wNotyarD
ZoneDymoBut instead it only costs a bit more, and in terms of frames per dollar I think it was even a better deal than the 4080... you get less for your money going down the stack, which is how the sellers like it.
The more you buy, the more you save.
Posted on Reply
#14
Outback Bronze
Yep, this is piss poor.

More GB than the 4070 & 4070 Ti, and equal to the 4080.

Joke release. They really want us to upgrade our hardware every gen it seems.

I suppose no worse than a 3060 with more GB than a 3070 & 3080 :confused:
Posted on Reply
#15
Broken Processor
I can't believe there's not more backlash with the 4060 considering it's really a 4050 using AD107 silicon.
Posted on Reply
#16
bug
Broken ProcessorI can't believe there's not more backlash with the 4060 considering it's really a 4050 using AD107 silicon.
I can't believe there's not more backlash with all the people getting hung up on numbers printed on boxes. But what can you do?
Posted on Reply
#17
gurusmi
Why should I pay €200+ more for an NVIDIA graphics card with 16 GB when I can have an Intel Arc A770, also with 16 GB? My desktop doesn't care which one drives it; it will be displayed equally well and equally fast on both.
Posted on Reply
#18
Chrispy_
Are they still launching it at $499?
The 8GB failed so hard at being a sellable product, they dropped prices within 3 weeks of launch.
At $499 it's not $100 for 8GB more, it's now $120, and I've seen MIRs on the $379 models in the past (but can't find one right now).
I'll buy one to see how bad the 128-bit bus is for CUDA workflows. I strongly suspect the 12GB 3060 will be a compelling alternative with 25% more bandwidth for half the price, meaning that we'll keep buying 3060s.

Edit:
I'm not sure I'll even buy one. The 4060 8GB's results in SPECviewperf are abysmal. It's slower than the 3060 Ti on average, often tied, sometimes considerably worse. Adding more VRAM won't change the performance results...
128-bit cards are garbage for anything other than light gaming at low resolutions, it would seem.
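For reference, the bandwidth gap works out as follows, a quick check using the two cards' published memory specs (15 Gbps on a 192-bit bus for the 3060 12GB, 18 Gbps on a 128-bit bus for the 4060 Ti):

```python
# Peak bandwidth = data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
bw_3060    = 15 * 192 / 8   # 360.0 GB/s
bw_4060_ti = 18 * 128 / 8   # 288.0 GB/s
print(f"3060 bandwidth advantage: {bw_3060 / bw_4060_ti - 1:.0%}")  # 25%
```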
Posted on Reply
#19
bug
Chrispy_Are they still launching it at $499?
The 8GB failed so hard at being a sellable product, they dropped prices within 3 weeks of launch.
At $499 it's not $100 for 8GB more, it's now $120, and I've seen MIRs on the $379 models in the past (but can't find one right now).
I'll buy one to see how bad the 128-bit bus is for CUDA workflows. I strongly suspect the 12GB 3060 will be a compelling alternative with 25% more bandwidth for half the price, meaning that we'll keep buying 3060s.

Edit:
I'm not sure I'll even buy one. The 4060 8GB's results in SPECviewperf are abysmal. It's slower than the 3060 Ti on average, often tied, sometimes considerably worse. Adding more VRAM won't change the performance results...
128-bit cards are garbage for anything other than light gaming at low resolutions, it would seem.
Unfortunately, the 4060 Ti is still selling for $400. Dips below that are rare, and the MSRP hasn't changed.

I've always been buying mid-range cards, but the last time I touched a 128-bit bus was back in the 6600GT days (great card, though). The GTX 260 has a bus as wide as 448 bits!
Posted on Reply
#20
TheinsanegamerN
bugI can't believe there's not more backlash with all the people getting hung up on numbers printed on boxes. But what can you do?
Yeah, they're just numbers! Just buy the card, money is just numbers too!

:slap: :nutkick:
Posted on Reply
#21
bug
TheinsanegamerNYeah, they're just numbers! Just buy the card, money is just numbers too!

:slap: :nutkick:
There's numbers that matter and numbers that don't.

Actual gaming numbers and price matter.
Efficiency matters for some. Bus width, VRAM size, manufacturing process only matter for very specific needs.
Numbers on the box don't matter at all. I used to have a 6600GT, then I had a GTX 260, now I have a 1060... I haven't bought a single one because of the model number or the codename of the silicon die.
Posted on Reply
#22
InVasMani
I'd be more interested in a Intel ARC A770 32GB Special Edition with 360 AIO.
Posted on Reply
#23
bug
InVasManiI'd be more interested in a Intel ARC A770 32GB Special Edition with 360 AIO.
I'll have to admit, Arc is looking pretty good, save for its power (in)efficiency. Battlemage can't come soon enough.
Posted on Reply
#24
WorringlyIndifferent
bugThere's numbers that matter and numbers that don't.

Actual gaming numbers and price matter.
Efficiency matters for some. Bus width, VRAM size, manufacturing process only matter for very specific needs.
Numbers on the box don't matter at all. I used to have a 6600GT, then I had a GTX 260, now I have a 1060... I haven't bought a single one because of the model number or the codename of the silicon die.
The issue isn't us not understanding the numbers. People who visit sites like TechPowerUp are in the top 1% when it comes to knowing and caring about PCs and their components. The point is that the numbers are dishonest, by design, to trick consumers. Mislead them. You know, lying.

"Oh but you should just read the actual numbers" isn't a valid excuse for a massive company to mislead the public. But we all know people are going to rush to the defense of a multi-billion dollar corporation. Who cares if it's hurting consumers and ultimately you, right? The precious corporations must be protected and defended online, for free.
Posted on Reply
#25
cbb
lol, I'm just waiting for NV and AMD to conclude that the dGPU market is dying rather than realizing their sales are tanking because they pushed crappy products at inflated prices. Not that the dGPU market isn't in decline; there are a whole lot of consoles out there these days. Price, convenience, and they (relatively, compared to the chaos of PC configs) mostly just work.

I suspect the lackluster sales of this crappy gen of dGPUs are only gonna accelerate that transition. Idk.

I find this a bit regrettable, as the higher res/higher perf and more complex controls of the PC suit me; games designed to be played with six buttons* are very different from ones with keyboard & mouse (or joystick, etc.).

*whatever the actual number is. I don't own one, and there are a couple of brands out there selling well.
Posted on Reply