Thursday, July 6th 2023
16GB Variant of GeForce RTX 4060 Ti Launches July 18
NVIDIA is preparing the launch of its third and final RTX 4060-series graphics card SKU, the GeForce RTX 4060 Ti 16 GB, for July 18, 2023. Going by past convention, reviews of RTX 4060 Ti 16 GB cards priced at the steep $499 MSRP will go live on July 17, and those of cards priced above the MSRP on July 18, alongside market availability. The RTX 4060 Ti 16 GB is essentially a memory variant of the RTX 4060 Ti 8 GB, offering 16 GB of video memory across the card's 128-bit wide memory interface.
According to the specs sheet NVIDIA put out on the May 18 launch date of the RTX 4060 series, besides memory size there are no other differences between the RTX 4060 Ti 16 GB and the current RTX 4060 Ti 8 GB. In particular, there is no change in core configuration or clock speeds, since the shader compute throughput of both models is listed at the same 22 TFLOPS. Even the memory speed is the same, at 18 Gbps (GDDR6-effective), giving the GPU 288 GB/s of memory bandwidth. It will be interesting to see the performance impact of 16 GB of memory.
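As a sanity check of the figures above, peak memory bandwidth follows directly from bus width and per-pin data rate; this small sketch (function name is mine, values are from NVIDIA's published specs) reproduces the 288 GB/s number for both RTX 4060 Ti variants:

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4060 Ti (both 8 GB and 16 GB): 128-bit bus, 18 Gbps GDDR6
print(memory_bandwidth_gbs(128, 18.0))  # 288.0
```

The same formula shows why a narrow bus caps bandwidth regardless of how much memory hangs off it: doubling capacity to 16 GB leaves the 128-bit interface, and thus the 288 GB/s figure, unchanged.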
Sources:
MEGAsizeGPU (Twitter), VideoCardz
60 Comments on 16GB Variant of GeForce RTX 4060 Ti Launches July 18
The way they've named their GPUs relative to the dies this time round has put them in a pretty average spot for memory configs. I feel that moving from Samsung's node to TSMC's netted more improvement than consumers expected, so they want to charge more for products lower down the stack.
Personally I'd have preferred to see
4090 24GB AD102
4080 20GB AD102
4070 16GB AD103
4060 12GB AD104
4050 8GB AD106
But it seems like they, as well as AMD, want to condition the market to expect linear pricing relative to the top product. It used to be that the xx80 was 10-20% off the top dog for roughly half the price, but now half the performance of a 4090 costs half as much, and so on. I don't see this changing; both camps seem happy with lacklustre sales if it means we're used to it for future generations that will bring their own generational improvements.
Edit: AD106 where I put AD107
I have learnt that on newer Windows builds VRAM usage gets even worse, as some native Windows processes are now GPU-accelerated, meaning more VRAM usage as a baseline. My plan is to use the iGPU for the desktop when I do my platform upgrade, so all GPU-accelerated desktop stuff loads into DRAM instead of VRAM, which will buy me anywhere from about 500 MB to a couple of gigs of VRAM depending on what apps are running. The dGPU will just be for rendering games. Would be cool if NVIDIA did render-only drivers.
For those who don't know, in the NVIDIA control panel there is an option in the top menu to enable a systray icon which shows which apps are using the GPU.
Right now on my PC, the following are all using my GPU and, as such, also VRAM: WFC, Firefox, SearchApp, OneDrive, TextInputHost, and MSEdge. Discord will also use it by default but can be disabled; Steam likewise.
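A rough command-line way to get similar information is NVIDIA's `nvidia-smi` utility. The sketch below parses its CSV output into per-process memory usage; it works on a captured sample so it runs anywhere, and the function name and sample data are mine. Note that `--query-compute-apps` only lists compute (CUDA) processes; graphics clients like the desktop apps above show up in nvidia-smi's default table instead.

```python
import csv
import io

def gpu_processes(csv_text: str) -> list[dict]:
    """Parse `nvidia-smi --query-compute-apps=... --format=csv` output
    into a list of per-process dicts keyed by the CSV header fields."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header = [h.strip() for h in rows[0]]
    return [dict(zip(header, (c.strip() for c in row))) for row in rows[1:] if row]

# On a machine with an NVIDIA GPU you could capture live output with e.g.:
#   subprocess.check_output(["nvidia-smi",
#       "--query-compute-apps=pid,process_name,used_gpu_memory",
#       "--format=csv"], text=True)
# Here we use a hypothetical captured sample instead:
sample = """pid, process_name, used_gpu_memory [MiB]
1234, firefox.exe, 412 MiB
5678, dwm.exe, 96 MiB
"""
for proc in gpu_processes(sample):
    print(proc["pid"], proc["process_name"], proc["used_gpu_memory [MiB]"])
```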
but instead it only costs a bit more, and in terms of frames per dollar I think it was even a better deal than the 4080... you get less for your money going down the stack, which is how the sellers like it.
More GB than 4070 & Ti and equals 4080.
Joke release. They really want us to upgrade our hardware every gen it seems.
I suppose no worse than a 3060 with more GB than a 3070 & 3080 :confused:
The 8GB failed so hard at being a sellable product, they dropped prices within 3 weeks of launch.
At $499 it's not $100 for 8GB more, it's now $120, and I've seen MIRs on the $379 models in the past (but can't find one right now).
I'll buy one to see how bad the 128-bit bus is for CUDA workflows. I strongly suspect the 12GB 3060 will be a compelling alternative with 25% more bandwidth for half the price, meaning that we'll keep buying 3060s
edit:
I'm not sure I'll even buy one. The 4060 8GB's results in SPECviewperf are abysmal. It's slower than the 3060 Ti on average, often tied, sometimes considerably worse. Adding more VRAM won't change the performance results...
128-bit cards are garbage for anything other than light gaming at low resolutions, it would seem.
I've always been buying mid-range cards, but the last time I touched a 128bit bus was back in 6600GT days (great card, however). The 260 has a bus as wide as 448 bits!
:slap: :nutkick:
Actual gaming numbers and price matter.
Efficiency matters for some. Bus width, VRAM size, manufacturing process only matter for very specific needs.
Numbers on the box don't matter at all. I used to have a 6600GT, then I had a GTX 260, now I have a 1060... I haven't bought a single one because of the model number or the codename of the silicon die.
"Oh but you should just read the actual numbers" isn't a valid excuse for a massive company to mislead the public. But we all know people are going to rush to the defense of a multi-billion dollar corporation. Who cares if it's hurting consumers and ultimately you, right? The precious corporations must be protected and defended online, for free.
I suspect the lackluster sales of this crappy gen of dGPUs is only gonna accelerate that transition. Idk.
I find this a bit regrettable, as the higher res/higher perf and more complex controls of PC suit me; games designed to be played with six buttons* are very different from ones with keyboard & mouse (or joystick, or... etc).
*whatever the actual number is. I don't own one, and there are a couple brands out there selling well.