Thursday, July 6th 2023

16GB Variant of GeForce RTX 4060 Ti Launches July 18

NVIDIA is preparing the launch of its third and final RTX 4060-series graphics card SKU, the GeForce RTX 4060 Ti 16 GB, for July 18, 2023. Going by past convention, reviews of RTX 4060 Ti 16 GB cards priced at the steep $499 MSRP will go live on July 17, and those priced above the MSRP on July 18, alongside market availability. The RTX 4060 Ti 16 GB is essentially a memory variant of the RTX 4060 Ti 8 GB, offering 16 GB of video memory across the card's 128-bit wide memory interface.

According to the specs sheet put out by NVIDIA when the RTX 4060 series was announced on May 18, besides memory size there are no other differences between the RTX 4060 Ti 16 GB and the current RTX 4060 Ti 8 GB. In particular, there is no change in core configuration or clock speeds, since the shader compute throughput of both models is listed at the same 22 TFLOPS. Even the memory speed is the same, at 18 Gbps (GDDR6-effective), at which the GPU has 288 GB/s of memory bandwidth at its disposal. It will be interesting to see the performance impact of the 16 GB of memory.
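For reference, the 288 GB/s figure follows directly from the bus width and the effective data rate. A minimal sketch of that arithmetic in Python (the function name is just illustrative):

def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Bytes moved per transfer (bus width / 8) times effective transfer rate in Gbps
    return (bus_width_bits / 8) * data_rate_gbps

# RTX 4060 Ti, 8 GB and 16 GB alike: 128-bit bus, 18 Gbps effective GDDR6
print(memory_bandwidth_gb_s(128, 18))  # 288.0 GB/s

For comparison, the same formula gives 504 GB/s for the RTX 4070's 192-bit bus at 21 Gbps.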
Sources: MEGAsizeGPU (Twitter), VideoCardz

60 Comments on 16GB Variant of GeForce RTX 4060 Ti Launches July 18

#26
bug
WorringlyIndifferent: The issue isn't us not understanding the numbers. People who visit sites like TechPowerUp are the top 1% of knowledge/caring about PCs and their components. The point is that the numbers are dishonest, by design, to trick consumers. Mislead them. You know, lying.

"Oh but you should just read the actual numbers" isn't a valid excuse for a massive company to mislead the public. But we all know people are going to rush to the defense of a multi-billion dollar corporation. Who cares if it's hurting consumers and ultimately you, right? The precious corporations must be protected and defended online, for free.
Trick customers how? Mislead them into what?
#27
InVasMani
bug: I'll have to admit, Arc is looking pretty good, save for its power (in)efficiency. Battlemage can't come soon enough.
I think the hardware specs of Arc are impressive, but software support is lacking relative to the competition. At least that's my impression looking at it. Still, I suspect Arc would be a strong GPU for ReShade within its price range, given the more robust hardware specs.
#28
holyprof
That card might be popular among video editors on a budget, just like the 3060 12 GB before it. The 12 GB 3060 beats or ties the 3060 Ti in every single video test for a slightly lower price.
  • Relatively low price (compared to 4070 and 4080) - check;
  • Same Nv video codec (as 4070 and 4080) - check;
  • AI and RTX cores used in video editing - check;
  • More VRAM than the 4070 - check.
For gaming, I'd say the 4070 would be a much better investment, or, if you don't bother with useless RT, an AMD GPU.
#29
enb141
Titanium_Carbide: $100 for 8 GB VRAM? :oops:
Apple charges $200 for 8 GB standard DDR RAM. :)
gurusmi: Why should I pay €200+ for an NVIDIA graphics card with 16 GB when I can have an Intel Arc 770, also with 16 GB? My desktop doesn't care which one it is; it will be displayed just as well and just as fast on both.
Because:
- Arc only supports DirectX 12 in hardware; older DirectX versions are emulated.
- Intel drivers suck.
#30
Icon Charlie
Legacy-ZA: So it can choke on an additional 8 GB of VRAM? WTF.
Yeah, this is a FFFFFFFFFFFFFFk moment. All they had to do was open the bus up a bit more and they would have had it... but noooooo, let's slap on moar memory (at effing reduced cost from the foundries) and call it a day.

GOD... I hate Silicon Valley.
#31
N/A
wolf: They'd need to use AD103 for that, or cut down the memory bus from 192-bit to 128-bit, which will hurt performance.
Ironically, that is what the RTX 5070 is going to be: 128-bit is here to stay and climb the ladder into the 70-tier. Think of a 104-die 3N shrink of the 4080; since the node is 70% denser, it will barely be able to fit a 128-bit bus, the same fate as the 4060 Ti, which is somewhat of a 2070 with dual-issue 2304 shaders and 32 MB of L2$. But the transistor count is that of a 2080 Ti, so that is better suited for comparison, and it packs 4608 CUDA cores as well.

Well, forget it, this gen is a flop. NVIDIA will not yield no matter what, and even if they do, it's too late.
#32
Ryrynz
If these had 21 Gbps or 23 Gbps memory, they'd probably sell and perform a fair bit better. Seems like a real missed opportunity.
Awaiting a good sale price on this, or an overclocked model, or maybe a 4060 Super with GDDR6X... doubt that'll happen though.

A 60-series card should not be 8 GB in 2023 for starters. NVIDIA is really doing its best to make bank with this gen's 60 series, and people aren't having any of it.
InVasMani: I'd be more interested in an Intel Arc A770 32 GB Special Edition with a 360 AIO.
Battlemage is where it's at... Not far off.
#33
gurusmi
enb141: Because:
- Arc only supports DirectX 12 in hardware; older DirectX versions are emulated.
- Intel drivers suck.
1. Does that matter when the machine runs Linux? Both cards display the desktop. Two or three times a year I use a 3D scanner; otherwise it's LibreCalc, Excel and Lazarus/Gambas. More rarely, I recode a video to put it on YouTube.

2. Intel drivers might suck when playing games on Windows, but I don't play games. NVIDIA drivers suck on Linux. NVIDIA or Intel with Linux is like choosing between the plague and cholera. ;) On Linux it is best to choose AMD. But try to find a more powerful two-slot card, no matter if it's AMD, Intel or NVIDIA.
#34
enb141
gurusmi: 1. Does that matter when the machine runs Linux? Both cards display the desktop. Two or three times a year I use a 3D scanner; otherwise it's LibreCalc, Excel and Lazarus/Gambas. More rarely, I recode a video to put it on YouTube.

2. Intel drivers might suck when playing games on Windows, but I don't play games. NVIDIA drivers suck on Linux. NVIDIA or Intel with Linux is like choosing between the plague and cholera. ;) On Linux it is best to choose AMD. But try to find a more powerful two-slot card, no matter if it's AMD, Intel or NVIDIA.
I can't comment on Linux because I don't use it, but on Windows, AMD and especially Intel suck.
#35
ixi
16 GB, that is nice. But I have already settled on next gen: waiting for AMD/Intel/NVIDIA's next big things.
#36
gurusmi
enb141: I can't comment on Linux because I don't use it, but on Windows, AMD and especially Intel suck.
If you don't use Linux, how are you able to know that Intel drivers suck on Linux? I don't care about Windows. I own only one Windows notebook, for my 3D scanner; the rest of my systems run Linux.
#37
Chrispy_
bug: There's numbers that matter and numbers that don't.

Actual gaming numbers and price matter.
Efficiency matters for some. Bus width, VRAM size, manufacturing process only matter for very specific needs.
Numbers on the box don't matter at all. I used to have a 6600GT, then I had a GTX 260, now I have a 1060... I haven't bought a single one because of the model number or the codename of the silicon die.
If you ignore the name of the card, the 4060 Ti 16GB is going to perform about the same as other products from AMD and Nvidia at $350. There's literally a 50% 40-series tax.
#38
bug
Chrispy_: If you ignore the name of the card, the 4060 Ti 16GB is going to perform about the same as other products from AMD and Nvidia at $350. There's literally a 50% 40-series tax.
I wouldn't judge the "tax" for the whole series based on one SKU, but yeah. You get DLSS3 and better efficiency for that "tax", but that's it...
Fwiw, I might still pull the trigger on a 4060Ti 8GB, but I will definitely not pay $500 (+tax) for the 16GB version.
#39
gffermari
I think the 16 GB model does not target gamers, but rather users who use the GPU for other stuff too.
Either work or hobby or gaming, but definitely mixed use.
#40
wNotyarD
gffermari: I think the 16 GB model does not target gamers, but rather users who use the GPU for other stuff too.
Either work or hobby or gaming, but definitely mixed use.
The issue with the 4060 Ti for that is its lack of memory bandwidth.
#41
enb141
gurusmi: If you don't use Linux, how are you able to know that Intel drivers suck on Linux? I don't care about Windows. I own only one Windows notebook, for my 3D scanner; the rest of my systems run Linux.
That's what I'm saying: I don't know Linux because I don't use it, but on WINDOWS, AMD and Intel drivers suck.
#42
Chrispy_
wNotyarD: The issue with the 4060 Ti for that is its lack of memory bandwidth.
Exactly. It's often far worse in productivity workloads than the older, cheaper 3060 Ti because it doesn't have the bandwidth to stretch its computational advantage.
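For context on that bandwidth gap, here is a quick back-of-the-envelope comparison of peak memory bandwidth from the published bus widths and memory speeds, including the faster-GDDR6 scenario floated earlier in the thread; note this ignores the 4060 Ti's much larger L2 cache, which claws back some of the raw deficit in games but, as the posts above suggest, less so in productivity workloads with large working sets:

# Peak bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in Gbps
cards = {
    "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14),
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18),
    "Hypothetical 128-bit card at 21 Gbps": (128, 21),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")  # 448, 288 and 336 GB/s respectively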
#43
gurusmi
enb141: That's what I'm saying: I don't know Linux because I don't use it, but on WINDOWS, AMD and Intel drivers suck.
I don't think you will have problems with any driver just showing a desktop, no matter which OS. And that is exactly what I wrote about, and I stated it clearly in my first post already. How useful is your reaction with those facts in mind? Did you read my posts? Or did you just react allergically to the words "Intel" and "Arc"?
#44
enb141
gurusmi: I don't think you will have problems with any driver just showing a desktop, no matter which OS. And that is exactly what I wrote about, and I stated it clearly in my first post already. How useful is your reaction with those facts in mind? Did you read my posts? Or did you just react allergically to the words "Intel" and "Arc"?
Yes I do: with AMD and Intel I can't set 10-bit color for my smart TV, only with NVIDIA cards.
#45
Chrispy_
enb141: Yes I do: with AMD and Intel I can't set 10-bit color for my smart TV, only with NVIDIA cards.
I think your TV or HDMI cable is garbage, or there's some PEBKAC occurring; AMD has made plenty of mistakes, but driver support for colour depth isn't one of them.

I can get 10 or 12-bit colour on my 6700XT, 6800XT, and Steam Deck. Across hundreds of machines at the office and homes, I've never once had issues that couldn't be attributed to shit displays or damaged cables.
bug: I wouldn't judge the "tax" for the whole series based on one SKU, but yeah. You get DLSS3 and better efficiency for that "tax", but that's it...
Fwiw, I might still pull the trigger on a 4060Ti 8GB, but I will definitely not pay $500 (+tax) for the 16GB version.
What's a 6800XT cost in your region?
Raster performance of a 6800XT is vastly superior to the 4060 Ti, and the RT performance is similar at worst, much better otherwise.

I'm not trying to discourage you from buying Nvidia, but unless you need CUDA and DLSS frame gen, it's hugely overpriced compared to AMD and also to Ampere. You'll struggle to buy a 3080 in many regions (including the UK), but the 3080 is equivalent in value to a 6800XT on the used/refurb market.
#46
enb141
Chrispy_: I think your TV or HDMI cable is garbage, or there's some PEBKAC occurring; AMD has made plenty of mistakes, but driver support for colour depth isn't one of them.

I can get 10 or 12-bit colour on my 6700XT, 6800XT, and Steam Deck. Across hundreds of machines at the office and homes, I've never once had issues that couldn't be attributed to shit displays or damaged cables.
I tried with different cables, even high-end 8K cables, both HDMI and DisplayPort-to-HDMI; none work.

With my old 1050 Ti I had no issues with 10-bit on my smart TV.
#47
gurusmi
enb141: Yes I do: with AMD and Intel I can't set 10-bit color for my smart TV, only with NVIDIA cards.
Is there a visible difference between 8-bit and 10-bit on a normal desktop whilst editing a worksheet in LibreCalc/Excel, or in an IDE for programming like Gambas, Lazarus or Visual Studio? I use two 32" monitors with a native resolution of 2560x1440. My daily-used sheet has around 2,500 lines of code in LibreCalc. I want to see the macro IDE on one screen and the original worksheet on the other when developing the macros I need. When developing software, I want the debugged code on one monitor and the debugged program on the other. Is it visible when slicing a 3D object for a 3D printer?

It's much harder to find a fast graphics card that is at most two slots thick, has at least 16 GB of RAM and can output to two DisplayPorts. I want one more HDMI and one DisplayPort output as an additional free option. My new rig has a case with a glass side; I install the graphics card with a riser card. The two slots are needed because the case does not support taller cards. If possible, the card should also have a decent look with no eye-catching bling-bling. Have fun searching for one.
#48
enb141
gurusmi: Is there a visible difference between 8-bit and 10-bit on a normal desktop whilst editing a worksheet in LibreCalc/Excel, or in an IDE for programming like Gambas, Lazarus or Visual Studio? I use two 32" monitors with a native resolution of 2560x1440. My daily-used sheet has around 2,500 lines of code in LibreCalc. I want to see the macro IDE on one screen and the original worksheet on the other when developing the macros I need. When developing software, I want the debugged code on one monitor and the debugged program on the other. Is it visible when slicing a 3D object for a 3D printer?

It's much harder to find a fast graphics card that is at most two slots thick, has at least 16 GB of RAM and can output to two DisplayPorts. I want one more HDMI and one DisplayPort output as an additional free option. My new rig has a case with a glass side; I install the graphics card with a riser card. The two slots are needed because the case does not support taller cards. If possible, the card should also have a decent look with no eye-catching bling-bling. Have fun searching for one.
If you need a high-end video card with just two slots, then get a water-cooled one; those are super thin.
#49
gurusmi
enb141: If you need a high-end video card with just two slots, then get a water-cooled one; those are super thin.
Did I say that I need a high-end card? To show a desktop? It would be possible to get a 4070 Ti with only two slots, but I don't want to; it's a waste of time. In former times I always used Matrox cards. That Intel bla bla bla card is enough to show my desktop without any issue. And so we are back at my first post and my question: why should I pay more? With Intel I get a good card to show a desktop. Additionally, I can recode videos for e.g. YouTube with that Intel card quite fast. And on top of that it has device drivers for Linux Mint. More I do not need.
#50
enb141
gurusmi: Did I say that I need a high-end card? To show a desktop? It would be possible to get a 4070 Ti with only two slots, but I don't want to; it's a waste of time. In former times I always used Matrox cards. That Intel bla bla bla card is enough to show my desktop without any issue. And so we are back at my first post and my question: why should I pay more? With Intel I get a good card to show a desktop. Additionally, I can recode videos for e.g. YouTube with that Intel card quite fast. And on top of that it has device drivers for Linux Mint. More I do not need.
I was looking for a cheap card with no issues, so to me that would be a 4050/4060. My current AMD 6400 sucks (drivers, not performance) and Intel also sucks (no 10-bit color or VRR, and emulated older DirectX).

NVIDIA is the only one that works for me and gives my smart TV 10-bit.