Friday, July 7th 2023

NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch

It looks like the only reason the GeForce RTX 4060 Ti 16 GB even exists is to prove wrong the naysayers who think 8 GB is too little memory for the original RTX 4060 Ti. Andreas Schilling of HardwareLuxx.de in a tweet stated that NVIDIA add-in card (AIC) partners tell him that very few of them are interested in promoting the RTX 4060 Ti 16 GB, which goes on sale starting July 18. Board partners have very few custom-design graphics card models in their product stacks, and much of that has to do with the card's steep $499 MSRP.

Priced $100 above the original RTX 4060 Ti for essentially double the memory size and no other difference in specs, the RTX 4060 Ti 16 GB is hard enough to sell at MSRP; premium overclocked models would end up priced around the $550 mark, just $50 short of the $599 MSRP of the RTX 4070. The RTX 4070 is around 30% faster than the RTX 4060 Ti 8 GB, and we doubt the additional memory will narrow that gap by more than a couple of percentage points.
Sources: Andreas Schilling (Twitter), VideoCardz

79 Comments on NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch

#1
Unregistered
Easy: name it properly as a 4050 and price it as such, and it would sell.
#2
wNotyarD
Would you look at that? Seems Nvidia's partners and us consumers have finally come to agree on something.
#3
SCP-001
Considering how little the other SKUs are selling, I'm not surprised that even the AIB partners are unenthusiastic about promoting the 40 series in general.
#4
MrDweezil
There's no interest because the price for the 8GB model is bad, and $100 on top of that is insane.
#5
bug
wNotyarDWould you look at that? Seems Nvidia's partners and us consumers have finally come to agree on something.
Read the news piece. Even Nvidia agrees, they just did it to show you don't really need 16GB VRAM, like AMD says. They don't need you to buy it, they just need to get it in the hands of reviewers. And then they can bury it.
#6
wNotyarD
bugRead the news piece. Even Nvidia agrees, they just did it to show you don't really need 16GB VRAM, like AMD says. They don't need you to buy it, they just need to get it in the hands of reviewers. And then they can bury it.
Andreas Schilling of HardwareLuxx.de in a tweet stated that NVIDIA add-in card (AIC) partners tell him that very few of them are interested in promoting the RTX 4060 Ti 16 GB
Yes, I did read it. And no, it's not Nvidia's own stance.
#7
bug
wNotyarDYes, I did read it. And no, it's not Nvidia's own stance.
the only reason the GeForce RTX 4060 Ti 16 GB even exists is to prove wrong the naysayers who think 8 GB is too little memory for the original RTX 4060 Ti
Nvidia (not AIC partners) did it to disprove the naysayers.
#8
Unregistered
The AIBs were already handed a poop sandwich; can't say I'm surprised they don't want to smear more on.
#9
neatfeatguy
bugRead the news piece. Even Nvidia agrees, they just did it to show you don't really need 16GB VRAM, like AMD says. They don't need you to buy it, they just need to get it in the hands of reviewers. And then they can bury it.
With how poorly the 4060 Ti 8GB was received (and even the 4060 8GB), Nvidia will now claim that 16GB didn't sell, and therefore they will stand firm that 16GB isn't a necessary step forward because no one is buying a card with 16GB on it. People will either opt for the 8GB version at $100 less or move up to the 4070 12GB cards that'll offer upwards of 30% performance gains and still have more than 8GB of RAM.

Technically, Nvidia will be correct. No one wants the 16GB card, but not for the reason they will claim (that 16GB isn't needed). It will be because it's an overpriced POS that doesn't make any sense to buy given its cost compared to the other cards around it.
#10
Dahita
A 4070 Ti with 16GB, sitting between the 4070 Ti and the 4080, would have worked for me. Being a little future-proof at 1440p requires 16GB, and right now the entry price for that is the 4080. Upgrading a lower-end card when its upgrade (the 4070) has less memory for 30% more performance is nonsense.
#11
bug
neatfeatguyWith how poorly the 4060 Ti 8GB was received (and even the 4060 8GB), Nvidia will now claim that 16GB didn't sell, and therefore they will stand firm that 16GB isn't a necessary step forward because no one is buying a card with 16GB on it. People will either opt for the 8GB version at $100 less or move up to the 4070 12GB cards that'll offer upwards of 30% performance gains and still have more than 8GB of RAM.

Technically, Nvidia will be correct. No one wants the 16GB card, but not for the reason they will claim (that 16GB isn't needed). It will be because it's an overpriced POS that doesn't make any sense to buy given its cost compared to the other cards around it.
If people would act on what Nvidia says, yes. We will have reviews showing how wasteful 16GB really is.
#12
Quicks
The only way this will make enough of a difference is to upgrade the bus to 256-bit.

Good for everyone showing Nvidia greed the middle finger!
#13
TheinsanegamerN
bugRead the news piece. Even Nvidia agrees, they just did it to show you don't really need 16GB VRAM, like AMD says. They don't need you to buy it, they just need to get it in the hands of reviewers. And then they can burry it.
Except the modded 3070 with 16GB showed major performance improvements. Woops.

Guess we forgot that existed? Games are gonna be using more than 8GB. Why do people insist on arguing about that?
QuicksThe only way this will make enough of a difference is to upgrade the bus to 256-bit.

Good for everyone showing Nvidia greed the middle finger!
Why? The card is not bandwidth starved, it is capacity starved.
#14
bug
TheinsanegamerNExcept the modded 3070 with 16GB showed major performance improvements. Woops.
It did? Where?
TheinsanegamerNGuess we forgot that existed? Games are gonna be using more than 8GB. Why do people insist on arguing about that?
Nobody's arguing that. The argument is whether getting more VRAM is more future proof than getting a more potent GPU.
#16
TheinsanegamerN
bugIt did? Where?
www.techpowerup.com/307724/modded-nvidia-geforce-rtx-3070-with-16-gb-of-vram-shows-impressive-performance-uplift

www.techspot.com/images2/news/bigimage/2023/06/2023-06-01-image-17.jpg

OOF look at those 1% gains.
One could also look at the 3070 vs 6800 revisit, where the AMD card, once overall slower than the 3070, absolutely CRUSHES it in more than half the games tested because, wait for it, the VRAM runs out. And that's not counting games that either hard crashed or developed noticeable visual issues on the 8GB card, like textures not loading.

www.techspot.com/article/2661-vram-8gb-vs-16gb/
bugNobody's arguing that. The argument is whether getting more VRAM is more future proof than getting a more potent GPU.
Potency of the GPU doesn't matter if you don't have the VRAM capacity to hold the game. This isn't the first time this has happened. The 512MB 9800 GTX? The 2GB 770? One could even say the 3.5GB 970 showed how bad running out of high-speed VRAM was. VRAM usage will only increase from here, so unless you want disastrous stuttering and huge gulfs between average and 1% lows, VRAM will always be a better investment.

The 4060ti is effectively a 3070. The 3070 is shown to struggle with games not because the GPU is at its limit, but because it runs out of memory. Therefore, buying an 8GB 4060ti is absolutely pointless; you cannot use its current potency, let alone anything faster, with only 8GB of VRAM. And the problem will only get worse from here. And yes, even at 1080p, this matters. 8GB should be the realm of sub-$200 display adapters only; anything meant for any form of gaming should have, bare minimum, 10GB.

It's not a matter of opinion, it is a demonstrable fact, with evidence to back it up, that 8GB is not enough to play current games, and will not be enough for future titles. On a $300+ card, this is inexcusable.
#17
WorringlyIndifferent
Beginner Micro Device"yOu Do NoT nEeD sIxTeEn GiGaByTeS oF rAm"

Nvidia apologists will never stop protecting their precious multi-billion dollar company, for free
#18
Quicks
TheinsanegamerNExcept the modded 3070 with 16GB showed major performance improvements. Woops.

Guess we forgot that existed? Games are gonna be using more than 8GB. Why do people insist on arguing about that?


Why? The card is not bandwidth starved, it is capacity starved.
How about to justify the $100 increase in price? It would also make it more desirable if it bumps the performance 10-15%...
#19
TheinsanegamerN
QuicksHow about to justify the $100 increase in price? It would also make it more desirable if it bumps the performance 10-15%...
So you're suggesting they sell a 4080 as a 4060ti? Because that's the only way you are getting a 106 die with a 256 bit bus.

It also doesn't need it. Again, look at the 4070. The 4060ti doesn't need more bandwidth to max out the chip, it needs capacity. More bits != faster. We learned that in the 90s....

Better idea: the 16GB model should be the $399 product, and the 8GB should be reduced to $299.
#20
wNotyarD
TheinsanegamerNIt also doesn't need it. Again, look at the 4070. The 4060ti doesn't need more bandwidth to max out the chip, it needs capacity. More bits != faster. We learned that in the 90s....
For general gaming, where the VRAM buffer is what's needed, maybe. For anything else, both the 4060 and the Ti are absolute jokes due to the narrow bus, regardless of 8 or 16GB.
#21
TheinsanegamerN
wNotyarDFor general gaming, where the VRAM buffer is what's needed, maybe. For anything else, both the 4060 and the Ti are absolute jokes due to the narrow bus, regardless of 8 or 16GB.
In what situations is the 4060/Ti bandwidth starved? I haven't seen that in reviews.
#22
Macro Device
WorringlyIndifferentNvidia apologists will never stop protecting their precious multi-billion dollar company, for free
And I will never stop blaming nGreedia even if they start gifting me multi-billion dollars' worth of stuff. This is just the "burn corpo shit" attitude I sport.
#23
bug
TheinsanegamerNwww.techpowerup.com/307724/modded-nvidia-geforce-rtx-3070-with-16-gb-of-vram-shows-impressive-performance-uplift

www.techspot.com/images2/news/bigimage/2023/06/2023-06-01-image-17.jpg

OOF look at those 1% gains.
One could also look at the 3070 vs 6800 revisit, where the AMD card, once overall slower than the 3070, absolutely CRUSHES it in more than half the games tested because, wait for it, the VRAM runs out. And that's not counting games that either hard crashed or developed noticeable visual issues on the 8GB card, like textures not loading.

www.techspot.com/article/2661-vram-8gb-vs-16gb/
That's not a massive improvement; it's just 10 fps on average. It does better with the occasional dip, but that's expected when you pick titles that don't fit into VRAM.
TheinsanegamerNPotency of the GPU doesn't matter if you don't have the VRAM capacity to hold the game.
This is where we disagree. I can go from Ultra/Very High textures to just High to fix a VRAM issue. But if I don't have the shading or RT horsepower, I'm screwed.
#24
TheinsanegamerN
bugThat's not a massive improvement; it's just 10 fps on average.
I know math is hard for you, but really, dude? Going from 37 to 49 FPS is a ~32% increase. That's not massive? :slap: Also, no mention of the removal of microstuttering? :laugh::roll::laugh:
bugIt does better with the occasional dip, but that's expected when you pick titles that don't fit into VRAM.
Oh, so now we're claiming cherry-picking. Bruh, modern games are frequently hitting this wall. You can't "well you picked that specific game" your way out of this one. And I'd say going from 7 FPS minimum to 40 FPS minimum is a LOT more than "better with the occasional dip". Unless your definition of "picking titles" is using modern AAA games in any capacity, which, just LMFAO.
bugThis is where we disagree. I can go from Ultra/Very High textures to just High to fix a VRAM issue. But if I don't have the shading or RT horsepower, I'm screwed.
If you don't have the VRAM you're not using RT either LMFAO :nutkick: Did you not read the TechSpot review? As an example: Callisto Protocol. Going from ultra to ultra + RT, the 3070 went from dead even with the 6800 to more than a generation BEHIND the 6800. With RT.



Just take the L man. You need VRAM capacity to use the GPU's capability, whether that be shaders, pixels, RT, whatever. If you buy a $400 GPU and have to immediately turn settings down at 1080p to avoid running out of VRAM, that is, objectively, a shat product, and shows the VRAM buffer is too small for what the GPU can do. The 1080p scene was settled by $200 GPUs 7 YEARS ago. A $400 GPU should not be struggling with this.
#25
bug
TheinsanegamerNI know math is hard for you, but really, dude? Going from 37 to 49 FPS is a ~32% increase. That's not massive? :slap: Also, no mention of the removal of microstuttering? :laugh::roll::laugh:

Oh, so now we're claiming cherry-picking. Bruh, modern games are frequently hitting this wall. You can't "well you picked that specific game" your way out of this one. And I'd say going from 7 FPS minimum to 40 FPS minimum is a LOT more than "better with the occasional dip". Unless your definition of "picking titles" is using modern AAA games in any capacity, which, just LMFAO.

If you don't have the VRAM you're not using RT either LMFAO :nutkick: Did you not read the TechSpot review? As an example: Callisto Protocol. Going from ultra to ultra + RT, the 3070 went from dead even with the 6800 to more than a generation BEHIND the 6800. With RT.



Just take the L man. You need VRAM capacity to use the GPU's capability, whether that be shaders, pixels, RT, whatever. If you buy a $400 GPU and have to immediately turn settings down at 1080p to avoid running out of VRAM, that is, objectively, a shat product, and shows the VRAM buffer is too small for what the GPU can do. The 1080p scene was settled by $200 GPUs 7 YEARS ago. A $400 GPU should not be struggling with this.
The 3070 is the worst-case scenario (and even then, if you look at Forspoken, it's still fine).
With Ada, only the 4060 and 4060 Ti come with 8GB VRAM; everything else gets 12-24GB. Since the 4060 is not meant to push 4K or maxed-out QHD, I still don't get why people use "Nvidia doesn't offer enough VRAM" as a blanket statement.