Friday, July 7th 2023
NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch
It looks like the only reason the GeForce RTX 4060 Ti 16 GB even exists is to answer naysayers who think 8 GB is too little memory for the original RTX 4060 Ti. Andreas Schilling of HardwareLuxx.de stated in a tweet that NVIDIA add-in card (AIC) partners tell him that very few of them are interested in promoting the RTX 4060 Ti 16 GB, which goes on sale starting July 18. Board partners have very few custom-design graphics card models in their product stacks, and much of this has to do with the card's steep $499 MSRP.
Priced $100 above the original RTX 4060 Ti for essentially double the memory size and no other difference in specs, the RTX 4060 Ti 16 GB is hard enough to sell at MSRP; premium overclocked models would end up priced around the $550 mark, just $50 short of the $599 MSRP of the RTX 4070. The RTX 4070 is around 30% faster than the RTX 4060 Ti 8 GB, and we doubt that the additional memory will narrow that gap by more than a couple of percentage points.
Sources:
Andreas Schilling (Twitter), VideoCardz
79 Comments on NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch
Plenty of reviewers and YouTubers have analysed resolution/texture scaling on the 4060 and 4060 Ti, and it's clear that NVIDIA's cache increase isn't very effective at masking the paltry bandwidth. It's significantly worse than even the older RDNA2 Infinity Cache on far cheaper, slower cards like the 128-bit 6600 XT.
The partners aren't exactly clean in all this: they made insane amounts of money, they scalped prices too, and they ordered insane quantities of 3000-series cards out of greed. Seems very hypocritical of them, to be honest. F' them and f' NVIDIA. Oh, and AMD.
The 16 GB 4060 Ti will excel in the same situations the 16 GB A4000 did in this video. Most buyers should gravitate towards the 8 GB model due to cost concerns, but the way I see it, this GPU will only realize its full potential in the 16 GB model. We agree, AMD included; see the defense force holding the line on their contractual block of DLSS in their sponsored titles.
I don't believe for a second that NVIDIA was compelled to introduce 16 GB variants of 60-class GPUs to satisfy the naysayers' appetite. nV is 101% profit-oriented and all decisions are based on making money; seeing that games are using more than 8 GB, and that everyone (reviewers, and even the competition, surprisingly and belligerently) made a case for it, they chose to cash in! But what a tight-arse move: 16 GB on a crappy 60-class variant strapped to a 128-bit bus for half a thousand dollars. How much? HALF A THOUSAND DOLLARS!! Someone pinch me!!
About the naysayers: if broader consumer sentiment were even remotely a concern, the answer wouldn't have been 16 GB bolted onto a shit-show but, more importantly, lower prices for a crappy, skimped 50-class product dressed up as 60-class. What they should have done was significantly reduce prices across the board and drop a 70-class 16 GB variant on a 192-bit bus as a bare minimum (for $500). VRAM was always a secondary concern next to NVIDIA's ridiculous performance segmentation and extravagant pricing.
For those who claim 8 GB is still reasonable/feasible at the "mid-range": even if your favourite graphics-hungry games can run on ULTRA, that doesn't always mean you're getting the best graphics package available at that preset. Dynamic adjustments are made (skimped) depending on how much VRAM is available, whereby "ultra" gets quietly redefined (the ultra illusion). Another thing worth noting: why should developers be constrained in 2023 by 8 GB limits across the broader mid-performance segment? Why not increase the drawing palette for game devs to unleash the long-awaited finer graphics prowess that is widely available but denied by skimped hardware bottlenecks? Nowadays devs can't keep everyone happy; newer titles just pack too much of a punch in the graphics department, and all the dynamic pruning is getting way too obvious.
Moral of the story: NVIDIA partners are absolutely spot-on for not entertaining nV's dirty laundry. When you've got a 12 GB, 192-bit 6700 XT/6750 XT for just over $300, that's a very expensive $500 bucketload of dirty, soiled pants.
The 12 GB x70s primarily have a bandwidth issue before a VRAM one, I think, simply because 12 GB is still sufficient to comfortably load everything a game needs. We just haven't seen much of it yet.
But... this is the bandwidth I had on a 2016 card.
And this is a 4060ti.
Indeed, x50 class nonsense. And just let it sink in here... not even GDDR6X to save its paltry 128 bit just a little. Nvidia could have easily done that. They didn't - not even on the Ti. Theft in broad daylight...
Here is the perf-equivalent RX 6700 (to the 4060, mind, which has 272 GB/s)
Cut up like hell, but well balanced in capacity and bandwidth.
and the RX 6800 is already at the 4070 Ti level; beyond it, in fact, in capacity and bandwidth.
And note... both architectures now carry a heavy L2$ solution.
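The bandwidth figures being compared in this thread all fall out of the same formula: bus width in bits times per-pin data rate, divided by 8. A quick sketch (the card specs below are the commonly published ones, listed purely for illustration):

```python
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# Commonly published specs (bus width, per-pin data rate):
cards = {
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18.0),
    "RTX 4060    (128-bit, 17 Gbps GDDR6)": (128, 17.0),
    "RX 6800     (256-bit, 16 Gbps GDDR6)": (256, 16.0),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {mem_bandwidth_gbps(bus, rate):.0f} GB/s")
```

Putting the 4060 Ti's 288 GB/s next to the RX 6800's 512 GB/s makes the "x50-class" complaint easy to see in numbers: no amount of L2 cache changes the raw figure.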
This is how Nvidia pulls its customers towards DLSS. You just can't work without it anymore, sooner rather than later on these cards.
RDNA, RDOA, and ADoA GPUs all feature the same "I literally die at high resolutions" attitude, caused by insane skimping on bus width. This cache is neither efficient nor effective at higher resolutions. You need real bandwidth. They don't offer it.
That's why RX 5700 XT is better at 1440p compared to RX 6600 despite worse architecture per se.
That's why RTX 3080 Ti is future proof and RTX 4070 Ti isn't.
That's why RX 6700 XT is faster than 3060 Ti at 1080p and worse at 2160p.
Sometimes it feels like they want 1080p to stay on the top for literally forever so they don't need to make better products.
But RDOA? Seems fine to me. The 6950 XT only has 576 GB/s, while being a mere 13 percent slower than the 7900 XT.
RDOA has low memory bandwidth compared to higher-tier RTX 3000 series GPUs, which run at close to or beyond 1 TB/s. That's why the higher the resolution, the smaller the difference between the RX 7900 XT and RTX 3090. That's why the only case where the RX 6900 XT could possibly make sense is 2560x1440. One step back (2560x1080) and the RTX 3080 is ideal; one step forward (3440x1440) and the RTX 3080 Ti is ideal.
Not to mention the RX 7600, whose price competition is the RX 6700 XT, which has 1.5x the VRAM and almost 1.5x the effective VRAM speed. You know who is tortured by 1440p and who is killed by it.
Just kidding. Most games are playable at 4K framerate-wise even if you got a lower class GPU such as RX 6700 XT but the older the game the lower the chance their developers even thought of this game being run on a 4K display. Makes for weird UI, unreadable text and other nuisances.
I think we all agree that for raster games the 16 GB will rarely make an improvement, but in some cases it does. However, if nV wants to showcase the new features of the 40xx series, it needs to do a better job.
nV should launch the 4060 Ti 16 GB at the original 8 GB price, and reduce the 8 GB price by $60.
Done.
ignoring pricing for a moment, as it is different for different markets etc; no one would pass on this at $199 on a sale.
more than 50% of the planets gamers are on below 1080p res, so probably will run at FHD, maybe 2K,
and very much likely with lower settings for newer titles, as ppl are already used to reduce stuff to get games to work (with existing hw).
+50% of those arent playing those AAA titles at 4K.
+80% of gamers are neither enthusiasts like us, nor are they on forums like this, knowing about things in detail.
not everyone wants to wait for another gen, or in case of components death, forced to get something "now".
not every market still offers other gens for purchase.
while i dont buy most things based on brand (name) there are things where i wont get a different (brand),
the same way someone interested in say a BMW, wont buy a Merc or Lexus, just because its 5k cheaper.
and it doesnt mean they are "defenders" of a brand or they actions.
from all the folks i know, friends with, play online in teams together, +80% have a 1060 or less and are running games at 720/1080p,
a couple have a 1080ti/or newer stuff, mainly because existing hw broke and they were (forced) to get something,
of course using this to get something better.
all their games would run better, probably at the same FHD or 2K res, and probably "offer" other things they didnt have before,
so outside the pricing, a 4060ti would still be a great upgrade for them.
besides all that: not a single person here will pay for someone's upgrade from 4060ti to a 4070,
nor will ppl have a bigger budget, just because ("you" think) the 4060ti is useless,
as especially younger folks might be saving for almost a year, putting together everything they have just to be able to get a 500$ card.
they wont magically have 100$ more, to get a 4070, so i wont be souring the 4060ti for them,
just because it wont make sense/is crappy positioned/overpriced/not enough xxx/not worth vs next step up etc. to me/"us" that know better.
i remember more than one at least not being "optimized" for DLSS, which to me is as effective as a block,
as you wont get the perf (fps etc) you could get.
$500 for a 4060? Not on my watch, I'm surely not touching that. That's the price the 4080 should've been sold for.
4080 - $500
4070 - $350
4060 - $200 (maybe $250 for 16 GB)
4050 - $130
4030 - <$100
Squeeze the Titanium cards in that structure.
Let's not forget the GPU is ONE PC component, and not the entire PC.
Since 16 GB GPU requirements are slowly about to become "mainstream", logic dictates that "mainstream" GPUs should come equipped with that amount of VRAM without premiums that put them out of reach for regular people.
Oh, and I'm talking about cards with good raster performance. Ray-Tracing shouldn't be pushed as much as nVidia is trying to. Also, 4060 shouldn't be considered a 4k gaming GPU. Stick to 1440p at most. Most people use 1080p so that should provide some "future - proofing".
They've run out of things to show us, so they've made Ray-Tracing a central part of their marketing strategy.
except the company making/selling a product is the one responsible for pricing/names etc,
and not based on what "we" want/think/it should be, nor do they have to follow logic.
its like everything else: no one forces you to buy it, doesnt mean no one else should.
"value" of an item is mainly connected to the size of the wallet.
Off topic and haven’t done enough research to compare, but curious nonetheless
CPU prices have indeed scaled with the inflation rate, whereas GPU prices don't grow with inflation alone; they also get boosted by tier-shifting (i.e. nGreedia selling what is effectively a 4050 (scenarios where it's slower than the 3060 DO exist) as a 4060, at a price that was always considered adequate for **60 Super/Ti cards), plus COVID, mining, floods and whatever other made-up bollocks, to one extent or another.
You can buy CPUs for less than their MSRP within a couple of months of release. RTX 3000 series cards are STILL sold for more than their original MSRP in some countries, unlike Intel's newer 12th gen, which can be had for ~90 percent of MSRP in one click worldwide.
This is why folks are mad.
Oh, and by the way, a $130 CPU from 2021 (i3-12100) obliterates a $310 CPU from 2017 (i7-7700K), whereas there isn't a single GPU at $X that obliterates a seven-year-old GPU that cost $2.5X (the $300 RTX 4060 really isn't much faster than the $800 GTX 1080 Ti).
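The CPU-vs-GPU value argument can be put into numbers. The performance indices below are hypothetical placeholders, not benchmark results; `value_gain` itself is just arithmetic on assumed figures:

```python
def value_gain(old_perf: float, old_price: float,
               new_perf: float, new_price: float) -> tuple[float, float]:
    """Return (absolute speedup, perf-per-dollar improvement)."""
    speedup = new_perf / old_perf
    value = (new_perf / new_price) / (old_perf / old_price)
    return speedup, value

# Hypothetical indices: assume the i3-12100 is ~30% faster than the i7-7700K,
# and the RTX 4060 ~10% faster than the GTX 1080 Ti (illustrative only).
cpu = value_gain(100, 310, 130, 130)   # 7700K ($310) -> 12100 ($130)
gpu = value_gain(100, 800, 110, 300)   # 1080 Ti ($800) -> 4060 ($300)
print(f"CPU: {cpu[0]:.2f}x faster, {cpu[1]:.2f}x perf/$")
print(f"GPU: {gpu[0]:.2f}x faster, {gpu[1]:.2f}x perf/$")
```

With these assumed numbers, perf-per-dollar improved by a similar factor in both cases, but the CPU is outright faster while also being far cheaper; the GPU's absolute gain over six years is marginal, which is the point being made.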
Either way, it's a PR disaster and an L for AMD for not being transparent and quelling the rumors. They could simply come clean, say they aren't doing it, and state that their contractual clauses do not forbid or discourage competing techs, but they have refused to do so thus far.
These days going from a $300 to a $600 CPU doubles your multi-threaded performance.
Another thing is that $200 has pretty much always given you a CPU capable of 60+ FPS gaming. Can't say the same about GPUs.