Friday, July 7th 2023
NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch
It looks like the only reason the GeForce RTX 4060 Ti 16 GB even exists is to answer naysayers who think 8 GB is too little memory for the original RTX 4060 Ti. Andreas Schilling of HardwareLuxx.de stated in a tweet that NVIDIA add-in card (AIC) partners tell him very few of them are interested in promoting the RTX 4060 Ti 16 GB, which goes on sale starting July 18. Board partners have very few custom-design graphics card models in their product stacks, and much of this has to do with the card's steep $499 MSRP.
Priced $100 above the original RTX 4060 Ti for essentially double the memory size and no other difference in specs, the RTX 4060 Ti 16 GB is hard enough to sell at MSRP; premium overclocked models would end up priced around the $550 mark, just $50 short of the $599 MSRP of the RTX 4070. The RTX 4070 is around 30% faster than the RTX 4060 Ti 8 GB, and we doubt that the additional memory will narrow the gap by more than a couple of percentage points.
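Some back-of-envelope math illustrates the value problem. Using the MSRPs above and the ~30% performance gap, and assuming (our guess, for illustration) that the extra memory adds only a couple of percent on average, a simple price-per-performance comparison looks like this:

    # Price per unit of performance, indexed to the RTX 4060 Ti 8 GB.
    # MSRPs and the ~30% RTX 4070 uplift are from this article; the 16 GB
    # card's +2% average uplift is an assumption for illustration.
    cards = {
        "RTX 4060 Ti 8 GB":  (399, 1.00),
        "RTX 4060 Ti 16 GB": (499, 1.02),  # assumed: memory alone adds ~2%
        "RTX 4070":          (599, 1.30),  # ~30% faster than the 8 GB card
    }
    for name, (price, perf) in cards.items():
        print(f"{name}: ${price / perf:.0f} per performance unit")

On those assumptions the 16 GB card works out to roughly $489 per performance unit, against $399 for the 8 GB card and $461 for the RTX 4070, making it the worst value of the three.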
Sources:
Andreas Schilling (Twitter), VideoCardz
79 Comments on NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch
"Yes, once again the 8GB VRAM buffer of the RTX 3070 isn't enough once we enable ray tracing, but rather than stutter like mad, crash, or perform poorly, it just fails to load textures at all, leaving you with this blurry mess."
Is that considered "fine" to you? See, I would consider that "unacceptable" for a video card that costs more than an Xbox Series S and nearly as much as the PS5, which this game runs fine on. We are not talking about the 12-24 GB GPUs; we are talking, specifically, about the 8 GB 4060/Ti and why they should not exist as-is. You seem to really be struggling with this, so let me help you out.
The 4060 Ti, with its 8 GB VRAM buffer, cannot handle current games at 1080p. Forget 4K. Forget 2K. You cannot run games at high, with RT, at 1080p, on this $400 GPU. 1080p is out of range of this card.
1080p
1080p
1080p
Hope this helps you. If you still can't understand this VRAM argument, and why it applies to 8 GB cards, I can only suggest getting that lobotomy reversed.
There are several modern game releases that already run into issues with 8 GB VRAM buffers, even at 1080p. The most notable issue is massive frame-time spikes that tank the 1% lows, which is probably one of the most noticeable/worst things to experience in-game.
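For anyone unfamiliar with the metric: one common definition of the 1% low is the average FPS over the slowest 1% of frames. A minimal sketch (frame times invented for illustration) shows why a VRAM-starved card can post a fine average while feeling awful:

    # One bad frame per hundred barely moves the average but craters the 1% low.
    frame_times_ms = [16.7] * 99 + [120.0]   # invented sample: 99 smooth frames, 1 stutter

    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)            # the slowest 1% of frames
    low_1pct_fps = 1000 * n / sum(worst[:n])

    print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps")
    # -> average: ~56 fps, 1% low: ~8 fps

That single 120 ms frame is exactly the kind of hitch an 8 GB buffer produces when it has to swap assets over PCIe mid-frame.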
(I also made a typo and wrote 3080 instead of 3070, my bad.)
In all seriousness though, this thing is even more DOA than the original 4060 Ti was/is, any way you look at it. Especially considering the 6800 XT and 6700/50 XT are still available.
The sword told the samurai the box is a vice.
The samurai committed seppuku.
For the price, Arse.
Are those companies so afraid to stand up to the chip makers? Would NVIDIA really be happy if they all left and it could only sell its Founders Edition models?
• Does not have RT (I know).
• Does not support DLSS.
• Is about a third slower; in 2023 titles, the gap can be as wide as 2x.
You were basically getting something similar to GTX 1070 Ti but with RT/DLSS for less money than you had to pay for a basic 1070.
But yeah, the 2080 and 2080 Ti were overpriced. The 2070 just made no sense, because either a used 1080 Ti was cheaper, or the 2080 Ti was actually faster than anything Pascal.
Oh, and the 2060's competitor, the 5600 XT, was the same speed at $280. So the 2060 was an OK card and the cheapest tech demo for RT and DLSS, but nothing special price-wise, as you paid more for the added features, just like the rest of the 2000 series.
The logic is off entirely. I'm sure you can tweak settings, fantastic, but it doesn't change a thing: with Ada, people are buying cards that are brutally underpowered for way too much money.
You do you... I know I won't. But stop living in denial when there are various examples AT LAUNCH of a new gen of cards getting crushed by VRAM caps while they could run the game fine in every other way. You can play at lower settings; it doesn't change the facts. 8 GB is shit.
Also, about what you could expect from a card: I ran 3440x1440 on a bloody x80 Pascal. Of course you can and should be able to max most games at 1440p on these cards. What are you on about?! And even then, tweaking stuff left and right, you will find games that just won't be satisfactory any longer, much like I did recently. This is a crazy argument that even counters what you yourself are saying about these cards: that they 'run out of juice' at whatever resolution because of lacking core power, just the same as they do on VRAM. This test shows they don't; there is in fact a lot of untapped core power that just doesn't get a chance to show itself without more than 8 GB.
Similarly for RT: Callisto Protocol's example, even at 1080p + RT, shows a serious lead for the 16 GB RX 6800, while at 1440p with RT the Ampere card falls apart completely. It has the shading/RT horsepower; it lacks the VRAM.
This also puts a different lens on the idea that games are not released with features or quality settings that cards cannot run. You could run these settings on a 12 GB midrange card, but not on an 8 GB one that is otherwise identical. If you own an 8 GB card in 2023, you're in that shitty corner no developer wants to wait for any longer. Consoles have more, and an ever-growing share of GPUs will have more as time passes, even within the current gen, as RDNA2 cards with more VRAM are still out there selling, while the 8 GB cards that have been around since 2016 (!) are slowly checking out.
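Some napkin math (every number below is an assumption; real games vary wildly) shows how a single texture-quality notch can clear a 12 GB card and bury an 8 GB one:

    # Illustrative only: BC7-compressed textures are 1 byte/texel, and a
    # full mip chain adds roughly a third on top. Material/map counts invented.
    def bc7_texture_mb(size_px):
        return size_px * size_px * (4 / 3) / 1024**2

    materials, maps = 150, 3                      # assumed: albedo + normal + roughness
    for size, label in [(4096, "4K maps"), (2048, "2K maps")]:
        pool_gb = bc7_texture_mb(size) * materials * maps / 1024
        print(f"{label}: texture pool ~{pool_gb:.1f} GB")
    # -> 4K maps: ~9.4 GB; 2K maps: ~2.3 GB

Add a couple of GB for render targets, geometry, and BVH data, and the 4K set fits comfortably on a 12 GB card while blowing straight past 8 GB.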
Also - and yes, I know this is clearly just the maximum load on it - I have to say that in virtually every newish game I get into now, I'm seeing well north of 10 GB in use. Every 8 GB card will be pushed that much harder on its L2 and its overall bandwidth in every game going forward. Even that incurs a performance penalty all on its own, even if it won't stutter; that is probably part of the reason the x60 isn't going places performance-wise relative to Ampere, as Ampere has a lot more bandwidth. Another side effect of lacking VRAM/bandwidth is that engines now dynamically adjust asset quality on the fly, so you will get more pop-in even if you do keep the frames. Darktide is a good example: I played it on an 8 GB 1080 and had blurry textures left and right while frame rates were good. On the 20 GB card, those issues are gone.
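To make that dynamic-adjustment behavior concrete, here's a toy sketch of stream-on-budget logic (every name and number is invented): when the wanted texture set exceeds the VRAM budget, the streamer drops the top mip of everything, which keeps frame rates but ships the blur and pop-in described above:

    # Toy streamer: dropping the top mip level quarters a texture's footprint
    # (half the width, half the height). Sizes and budgets are invented.
    def fit_to_budget(textures_mb, budget_mb):
        mips_dropped = 0
        while sum(textures_mb) > budget_mb:
            textures_mb = [mb / 4 for mb in textures_mb]
            mips_dropped += 1
        return mips_dropped

    wanted = [21.3] * 400                        # ~8.5 GB of 4K materials requested
    for budget_mb in (18 * 1024, 6 * 1024):      # 20 GB card vs 8 GB card, minus other buffers
        dropped = fit_to_budget(wanted, budget_mb)
        print(f"{budget_mb} MB budget: {dropped} mip level(s) below native")
    # -> 18432 MB budget: 0 below native; 6144 MB budget: 1 below native

Same game, same settings: the 20 GB card renders native-resolution textures, while the 8 GB card quietly serves the next mip down, which is exactly the blurry-texture behavior from the 1080/Darktide example.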