Friday, July 7th 2023

NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch

It looks like the only reason the GeForce RTX 4060 Ti 16 GB even exists is to answer naysayers who think 8 GB is too little memory for the original RTX 4060 Ti. Andreas Schilling of HardwareLuxx.de stated in a tweet that NVIDIA add-in card (AIC) partners tell him very few of them are interested in promoting the RTX 4060 Ti 16 GB, which goes on sale starting July 18. Board partners have very few custom-design graphics card models in their product stacks, and much of this has to do with the card's steep $499 MSRP.

Priced $100 above the original RTX 4060 Ti for essentially double the memory size and no other difference in specs, the RTX 4060 Ti 16 GB is hard enough to sell at MSRP; premium overclocked models would end up priced around the $550 mark, just $50 short of the $599 MSRP of the RTX 4070. The RTX 4070 is around 30% faster than the RTX 4060 Ti 8 GB, and we doubt that the additional memory will narrow that gap by more than a couple of percentage points.
Sources: Andreas Schilling (Twitter), VideoCardz

79 Comments on NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch

#51
Chrispy_
At this point I'm unconvinced that the card has enough memory bandwidth to even benefit from more VRAM.

Plenty of reviewers and YouTubers have analysed resolution/texture scaling on the 4060 and 4060 Ti, and it's clear that Nvidia's cache increase isn't very effective at masking the paltry bandwidth. It's significantly worse than even the older RDNA2 Infinity Cache on far cheaper, slower cards like the 128-bit 6600 XT.
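
To make the cache argument concrete, here's a minimal sketch of a first-order "effective bandwidth" model: a large on-die cache only helps to the extent that it raises the hit rate for a game's working set. The cache throughput and per-resolution hit rates below are illustrative assumptions, not measured figures.

```python
# First-order effective-bandwidth model: blend cache and VRAM throughput
# by hit rate. All numbers are illustrative assumptions, not measurements.
def effective_bandwidth(vram_gb_s: float, cache_gb_s: float, hit_rate: float) -> float:
    """Average bandwidth seen by the GPU, ignoring latency effects."""
    return hit_rate * cache_gb_s + (1.0 - hit_rate) * vram_gb_s

# RTX 4060 Ti: 288 GB/s of GDDR6; assume ~2000 GB/s out of L2, with a hit
# rate that falls as the working set outgrows the cache at higher resolutions.
for res, hit in [("1080p", 0.55), ("1440p", 0.45), ("2160p", 0.30)]:
    print(f"{res}: ~{effective_bandwidth(288, 2000, hit):.0f} GB/s effective")
```

Under those assumed hit rates, the effective figure collapses towards the raw 288 GB/s exactly where the card is weakest, which is consistent with the scaling reviewers observed.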
#52
Bomby569
The stupid pricing will not change until the endless supply of 3060, 3060 Ti and 3070 cards finally comes to an end, somewhere around the year 2030.

The partners aren't exactly clean in all this: they made insane amounts of money, they scalped prices too, and they ordered insane quantities of 3000-series cards out of greed. It seems very hypocritical of them, to be honest. F'them and f'Nvidia. Oh, and AMD.
#53
N/A
Chrispy_At this point I'm unconvinced that the card has enough memory bandwidth to even benefit from more VRAM.

Plenty of reviewers and YouTubers have analysed resolution/texture scaling on the 4060 and 4060 Ti, and it's clear that Nvidia's cache increase isn't very effective at masking the paltry bandwidth. It's significantly worse than even the older RDNA2 Infinity Cache on far cheaper, slower cards like the 128-bit 6600 XT.
Actually, that may not be entirely true. In TechPowerUp's RTX 4060 Dual OC review, the 7600 is on par with the 4060 at 1080p and 1440p, but at 4K it drops to 89% while the 6600 XT drops to 79%. Clearly the Nvidia L2 cache must be better than the Infinity Cache, even in the gimped 24 MB desktop version; AD107-mobile is said to contain 32 MB.

#54
Dr. Dro
[embedded video: 16 GB RTX A4000 comparison]
The 16 GB 4060 Ti will excel in the same situations the 16 GB A4000 did in this video. Most buyers should gravitate towards the 8 GB model due to cost concerns, but the way I see it, this GPU only realizes its full potential in the 16 GB configuration.
WorringlyIndifferentActively distrusting and questioning corporations (along with billionaires, politicians, journalists, or anyone else who has influence over you) should be the default behavior for all humans. It's absolutely baffling that it isn't, and a day doesn't go by that I don't interact with someone who legitimately just blindly trusts their preferred company or billionaire or whatever. They just accept whatever they hear, no critical thought, no doubt, just "whatever they said is true." Just mind boggling. The thought never even crosses their mind that whoever they're listening to might be lying, or might just be wrong. I can't imagine living like that.
We agree. AMD included, see: the defense force holding the line on their contractual block of DLSS in their sponsored titles.
#55
wheresmycar
Nvidia has completely cocked things up with the 40-series (IMO). It all starts with the 4080 and that ridiculous MSRP of $1,200 (not even gonna mention the halo/luxury 4090). The second Nvidia decided to intensify profits on the 80-class, it was game over for the mid and bottom-barrel performance tiers. The trickle-down effect!

I don't believe for a second that Nvidia was compelled to introduce 16 GB variants on 60-class GPUs to satisfy the naysayers' appetite. nV is 101% profit-oriented and all decisions are based on "making money", and seeing that games are utilizing more than 8 GB and everyone - reviewers and, surprisingly and belligerently, even the competition - made a case for it, they chose to cash in! But what a tight-arse move: 16 GB on a crappy 60-class variant strapped to a 128-bit bus for half a thousand dollars. How much, HALF A THOUSAND DOLLARS!! Someone pinch me!!

About the naysayers: if broader consumer sentiment were even remotely a concern, the answer wouldn't have been bolting 16 GB onto a shit-show but, more importantly, lower prices for a crappy, skimped 50-class product dressed up as 60-class. What they should have done was significantly reduce prices across the board and drop a 70-class 16 GB variant on a 192-bit bus as a bare minimum (for $500). VRAM was always a secondary concern next to Nvidia's ridiculous performance segmentation and extravagant pricing.

For those who claim 8 GB is still reasonable/feasible at the "mid-range": even if your favourite graphics-hungry games can run on ULTRA, that doesn't always mean you're getting the best graphics package available on that preset. Dynamic adjustments are made (skimped) depending on how much VRAM is available, whereby "ultra" gets quietly redefined (the ultra illusion). Another thing worth noting: why should developers be constrained in ~2023 by 8 GB limits across the broader mid-performance segment? Why not increase the drawing palette for game devs to unleash the long-awaited finer graphics prowess which is widely available but denied by skimped hardware bottlenecks? Nowadays devs can't keep everyone happy; newer titles just pack too much of a punch in the graphics department, and all that dynamic pruning is getting way too obvious.

Moral of the story: Nvidia partners are absolutely spot-on for not entertaining nV's dirty laundry. When you've got a 12 GB, 192-bit 6700 XT/6750 XT for just over $300, that's a very expensive $500 bucketload of dirty, soiled pants.
#56
Ruru
S.T.A.R.S.
TheinsanegamerNWhy? The card is not bandwidth starved, it is capacity starved.
I disagree with that. In some cases, the 8GB model gets beaten even by a 3060 Ti.

#57
Vayra86
Chrispy_At this point I'm unconvinced that the card has enough memory bandwidth to even benefit from more VRAM.

Plenty of reviewers and YouTubers have analysed resolution/texture scaling on the 4060 and 4060 Ti, and it's clear that Nvidia's cache increase isn't very effective at masking the paltry bandwidth. It's significantly worse than even the older RDNA2 Infinity Cache on far cheaper, slower cards like the 128-bit 6600 XT.
Yes, Ada has both problems: bandwidth and VRAM.
The 12 GB x70s primarily have a bandwidth issue before a VRAM one, I think, simply because 12 GB is still sufficient to comfortably load everything a game needs. We just haven't seen much of it yet.

But... this is the bandwidth I had on a 2016 card. [screenshot]

And this is a 4060 Ti. [screenshot]
Indeed, x50-class nonsense. And just let it sink in here... not even GDDR6X to save its paltry 128-bit bus just a little. Nvidia could have easily done that. They didn't, not even on the Ti. Theft in broad daylight...

Here is the perf-equivalent RX 6700 (to the 4060, mind, which has 272 GB/s): [screenshot]
Cut up like hell, but well balanced in capacity and bandwidth.

And the RX 6800 is already at the 4070 Ti level; beyond it, in fact, in capacity and bandwidth.

And note... both architectures now carry a heavy L2$ solution.

This is how Nvidia pulls its customers towards DLSS. You just can't work without it anymore, sooner rather than later on these cards.
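
The bandwidth numbers being compared here follow directly from bus width and per-pin data rate; a quick sketch (memory configurations as published in the cards' spec sheets):

```python
# GDDR bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
def mem_bandwidth(bus_bits: int, gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps

cards = [
    ("RTX 4060    (128-bit, 17 Gbps GDDR6)", 128, 17.0),  # ~272 GB/s
    ("RTX 4060 Ti (128-bit, 18 Gbps GDDR6)", 128, 18.0),  # ~288 GB/s
    ("RX 6700     (160-bit, 16 Gbps GDDR6)", 160, 16.0),  # ~320 GB/s
    ("RX 6800     (256-bit, 16 Gbps GDDR6)", 256, 16.0),  # ~512 GB/s
]
for name, bus, rate in cards:
    print(f"{name}: {mem_bandwidth(bus, rate):.0f} GB/s")
```

Even swapping in 21 Gbps GDDR6X on the same 128-bit bus would only yield 336 GB/s, which is the point being made: the bus width is the hard ceiling.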
#58
Macro Device
Vayra86well balanced in capacity and bandwidth.
Once you stop playing at low resolutions (read: turn at least quasi-4K on), this "well balanced" becomes "does this GPU even have VRAM bandwidth?"

RDNA, RDOA and ADoA GPUs all feature the same "I literally die at high resolutions" attitude, brought on by insane skimping on bus width. This cache is not efficient, nor is it effective at higher resolutions. You need real bandwidth. They don't offer it.

That's why the RX 5700 XT is better at 1440p than the RX 6600 despite a worse architecture per se.
That's why the RTX 3080 Ti is future-proof and the RTX 4070 Ti isn't.
That's why the RX 6700 XT is faster than the 3060 Ti at 1080p and worse at 2160p.

Sometimes it feels like they want 1080p to stay on top forever so they don't need to make better products.
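
The intuition can be put in rough numbers: a fixed-size on-die cache covers an ever smaller share of the frame's working set as resolution rises. A minimal sketch, where the render-target count, pixel format and 32 MB cache size are all assumptions for illustration:

```python
# Why a fixed-size cache loses effectiveness at higher resolutions: the
# render-target working set grows with pixel count, the cache does not.
# Target count, pixel format and cache size below are assumptions.
def rt_working_set_mib(width: int, height: int,
                       bytes_per_pixel: int = 8, num_targets: int = 5) -> float:
    """Rough G-buffer/render-target working set in MiB."""
    return width * height * bytes_per_pixel * num_targets / 2**20

CACHE_MIB = 32  # e.g. a 32 MB last-level cache pool
for res, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                    "2160p": (3840, 2160)}.items():
    ws = rt_working_set_mib(w, h)
    print(f"{res}: ~{ws:.0f} MiB working set, cache covers ~{100 * CACHE_MIB / ws:.0f}%")
```

Under those assumptions, coverage falls from roughly 40% at 1080p to about 10% at 4K, so the miss traffic lands on whatever raw VRAM bandwidth is left.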
#59
Vayra86
Beginner Micro DeviceOnce you stop playing at low resolutions (read: turn at least quasi-4K on), this "well balanced" becomes "does this GPU even have VRAM bandwidth?"

RDNA, RDOA and ADoA GPUs all feature the same "I literally die at high resolutions" attitude, brought on by insane skimping on bus width. This cache is not efficient, nor is it effective at higher resolutions. You need real bandwidth. They don't offer it.

That's why the RX 5700 XT is better at 1440p than the RX 6600 despite a worse architecture per se.
That's why the RTX 3080 Ti is future-proof and the RTX 4070 Ti isn't.
That's why the RX 6700 XT is faster than the 3060 Ti at 1080p and worse at 2160p.

Sometimes it feels like they want 1080p to stay on top forever so they don't need to make better products.
~500 still isn't world class, I agree, but it's better than 288 :D

But RDOA? Seems fine to me. The 6950 XT only has 576 GB/s while being a mere 13 percent slower than the 7900 XT.
#60
Macro Device
Vayra86it's better than 288
"OH WOW MY **70 CARD HAS MORE MEM B/W THAN **70 CARD FROM 10 YEARS AGO" kinda achievement.

RDOA has low memory bandwidth compared to higher-tier RTX 3000 series GPUs, which run beyond 1 TB/s. That's why the higher the resolution, the smaller the difference between the RX 7900 XT and the RTX 3090. That's why the only case where the RX 6900 XT could possibly make sense is 2560x1440. One step back (2560x1080) and the RTX 3080 is ideal; one step forward (3440x1440) and the RTX 3080 Ti is ideal.

Not to mention the RX 7600, whose price competitor is the RX 6700 XT, which has 1.5x the VRAM and almost 1.5x the effective VRAM speed. You know who is tortured by 1440p and who is killed by it.
#61
Vayra86
Beginner Micro Device"OH WOW MY **70 CARD HAS MORE MEM B/W THAN **70 CARD FROM 10 YEARS AGO" kinda achievement.

RDOA has low memory bandwidth compared to higher-tier RTX 3000 series GPUs, which run beyond 1 TB/s. That's why the higher the resolution, the smaller the difference between the RX 7900 XT and the RTX 3090. That's why the only case where the RX 6900 XT could possibly make sense is 2560x1440. One step back (2560x1080) and the RTX 3080 is ideal; one step forward (3440x1440) and the RTX 3080 Ti is ideal.

Not to mention the RX 7600, whose price competitor is the RX 6700 XT, which has 1.5x the VRAM and almost 1.5x the effective VRAM speed. You know who is tortured by 1440p and who is killed by it.
We don't disagree. You just described perfectly why I'm staying away from 4K :) Too. Much. Effort.
#62
Macro Device
Vayra86Too. Much. Effort.
Why, just casually throw $2K on a 4090, throw another $700 on PSU and cooling, waste another $900 on a rig which actually fuels all the HP in this GPU and play at 4K@Low@DLSS: Performance@FrameGenerationOn, and you'll probably get enough framerates XDDDDDDDDDDD

Just kidding. Most games are playable at 4K framerate-wise even if you've got a lower-class GPU such as the RX 6700 XT, but the older the game, the lower the chance its developers ever thought of it being run on a 4K display. That makes for weird UI, unreadable text and other nuisances.
#63
lemonadesoda
DLSS needs memory. The 40xx series is all about RT and DLSS.

I think we all agree that for raster games the 16 GB will rarely make an improvement, but in some cases it does. However, if nV wants to showcase the new features of the 40xx, it needs to do a better job.

nV should launch the 4060 Ti 16 GB at the original 8 GB price, and reduce the 8 GB price by $60.

Done.
#64
Waldorf
Funny how lots of people here forgot a few things.
Ignoring pricing for a moment, as it differs between markets etc., and because no one would pass this up on sale at $199.

More than 50% of the planet's gamers are below 1080p res, so they'll probably run at FHD, maybe 2K, and very likely with lower settings for newer titles, as ppl are already used to reducing settings to get games to work (with existing hw).

50%+ of those aren't playing those AAA titles at 4K.

80%+ of gamers are neither enthusiasts like us, nor on forums like this knowing about things in detail.

Not everyone wants to wait for another gen, and in the case of component death some are forced to get something "now".

Not every market still offers other gens for purchase.

While I don't buy most things based on brand (name), there are things where I won't get a different brand, the same way someone interested in, say, a BMW won't buy a Merc or Lexus just because it's 5k cheaper.
And it doesn't mean they are "defenders" of a brand or its actions.

Of all the folks I know, am friends with, or play online with in teams, 80%+ have a 1060 or less and are running games at 720/1080p; a couple have a 1080 Ti or newer stuff, mainly because existing hw broke and they were (forced) to get something, of course using that to get something better.
All their games would run better, probably at the same FHD or 2K res, and probably "offer" other things they didn't have before, so pricing aside, a 4060 Ti would still be a great upgrade for them.


Besides all that: not a single person here will pay for someone's upgrade from a 4060 Ti to a 4070, nor will ppl have a bigger budget just because ("you" think) the 4060 Ti is useless. Especially younger folks might be saving for almost a year, putting together everything they have just to be able to get a $500 card. They won't magically have $100 more to get a 4070, so I won't be souring the 4060 Ti for them just because it doesn't make sense/is crappily positioned/overpriced/not enough xxx/not worth it vs. the next step up etc. to me/"us" who know better.
#65
R0H1T
Dr. Droholding the line on their contractual block of DLSS in their sponsored titles.
Nope, you made that up; while there is conjecture, there's no concrete evidence as of now!
#66
Waldorf
@R0H1T While I don't care about any "current" titles (as in 12 months +/- of the present), I remember more than one at least not being "optimized" for DLSS, which to me is effectively a block, as you won't get the perf (fps etc.) you could get.
#67
3ogdy
The problem is not the 6 in 16GB, but the 6 in 4060.

$500 for a 4060? Not on my watch; I'm surely not touching that. That's the price the 4080 should've been sold for.

4080 - $500
4070 - $350
4060 - $200 (maybe $250 for 16 GB)
4050 - $130
4030 - <$100

Squeeze the Titanium cards in that structure.
Let's not forget the GPU is ONE PC component, and not the entire PC.

Since 16 GB GPU requirements are slowly becoming "mainstream", logic dictates that "mainstream" GPUs should come equipped with that amount of VRAM, without premiums that put them out of reach of regular people.

Oh, and I'm talking about cards with good raster performance. Ray tracing shouldn't be pushed as hard as nVidia is trying to push it. Also, the 4060 shouldn't be considered a 4K gaming GPU; stick to 1440p at most. Most people use 1080p, so that should provide some "future-proofing".

They've run out of things to show us, so they've made Ray-Tracing a central part of their marketing strategy.
#68
Waldorf
@3ogdy
Except that the company making/selling a product is the one responsible for pricing/names etc., and it's not based on what "we" want/think it should be, nor do they have to follow logic.

It's like everything else: no one forces you to buy it, and that doesn't mean no one else should.
The "value" of an item is mainly connected to the size of the wallet.
#69
claes
Just realizing now that CPU prices have reflected inflation for quite some time, and suddenly GPU prices reflect it too and people are up in arms.

Off topic and haven’t done enough research to compare, but curious nonetheless
#70
Macro Device
claeshaven’t done enough research
This is too true for you to be good.

CPU prices have indeed scaled with the inflation rate, whereas GPU prices don't grow with inflation alone; they also get boosted by tier-shifting (i.e. nGreedia selling a 4050 (scenarios where it's weaker than the 3060 DO exist) as a 4060, at a price that was always considered adequate for **60 Super/Ti cards) and by COVID, mining, floods and whatever other, to one extent or another made-up, bollocks.

You can buy CPUs for less than MSRP within a couple of months of release. The RTX 3000 series is STILL sold for more than its original MSRP in some countries, unlike Intel's newer 12th gen, which can be had for ~90 percent of MSRP in one click worldwide.

This is why folks are mad.

Oh, and by the way, a $130 CPU from 2021 (i3-12100) obliterates a $310 CPU from 2017 (i7-7700K), whereas there isn't a single GPU at $X that obliterates a 7-year-old GPU that cost $2.5X (the $300 RTX 4060 really is not much faster than the $800 GTX 1080 Ti).
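
As a back-of-the-envelope check on that claim, here's a perf-per-dollar sketch. The relative-performance indices are illustrative assumptions (older part = 100), not benchmark results:

```python
# Rough perf-per-dollar comparison; performance indices are assumed
# round numbers for illustration, not measured benchmark data.
def perf_per_dollar(perf_index: float, price_usd: float) -> float:
    return perf_index / price_usd

cpu_2017 = perf_per_dollar(100, 310)  # i7-7700K, 2017 MSRP
cpu_2021 = perf_per_dollar(250, 130)  # i3-12100, assuming ~2.5x the 7700K
gpu_2017 = perf_per_dollar(100, 800)  # GTX 1080 Ti, 2017 MSRP
gpu_2023 = perf_per_dollar(115, 300)  # RTX 4060, assuming ~15% faster

print(f"CPU perf/$ gain, 2017 -> 2021: {cpu_2021 / cpu_2017:.1f}x")
print(f"GPU perf/$ gain, 2017 -> 2023: {gpu_2023 / gpu_2017:.1f}x")
```

Even with those generous assumptions, the GPU side improves by roughly half the CPU side's factor, which is the asymmetry being complained about.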
#71
Dr. Dro
R0H1TNope, you made that up; while there is conjecture, there's no concrete evidence as of now!
I didn't make anything up, but the AMD fanbase IS in full defensive mode over this, and it was called out even by the tech press for its obnoxious clubism. There's no concrete evidence in 9 out of 10 cases of "ngreedia's" evildoing, but people will hate them for things they did 10+ years ago anyway.

Either way, it's a PR disaster and an L for AMD for not being transparent and quelling the rumors. They could simply come clean, say they aren't doing it, and state that contractual clauses do not forbid or discourage competing techs - but they have refused to do so thus far.
#72
Vayra86
claesJust realizing now that CPU prices have reflected inflation for quite some time, and suddenly GPU prices reflect it too and people are up in arms.

Off topic and haven’t done enough research to compare, but curious nonetheless
CPU availability AND pricing have always been maintained along a much more sensible, gradual line.
#73
THU31
High-end CPUs have actually gotten much cheaper. Just compare the lowest and highest i7s back in the Nehalem era, or go back as far as the Pentium 4. Clock speed was pretty much the only difference between different models, yet one would cost $300 and another $1000.
These days going from a $300 to a $600 CPU doubles your multi-threaded performance.

Another thing is that $200 has pretty much always given you a CPU capable of 60+ FPS gaming. Can't say the same about GPUs.
#74
Kyan
Vayra86This is how Nvidia pulls its customers towards DLSS. You just can't work without it anymore, sooner rather than later on these cards.
I've never thought about it, but it scares me to think that this is really what they're planning.
#75
Bomby569
THU31High-end CPUs have actually gotten much cheaper. Just compare the lowest and highest i7s back in the Nehalem era, or go back as far as the Pentium 4. Clock speed was pretty much the only difference between different models, yet one would cost $300 and another $1000.
These days going from a $300 to a $600 CPU doubles your multi-threaded performance.

Another thing is that $200 has pretty much always given you a CPU capable of 60+ FPS gaming. Can't say the same about GPUs.
Nothing changed there; a 60 FPS GPU was always more expensive than a 60 FPS CPU. The only exception was on the bleeding edge, if you were buying those stupid waste-of-money Extreme Intel CPUs.