Friday, July 7th 2023

NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch

It looks like the only reason the GeForce RTX 4060 Ti 16 GB even exists is to prove a point to naysayers who think 8 GB is too little memory for the original RTX 4060 Ti. Andreas Schilling of HardwareLuxx.de stated in a tweet that NVIDIA add-in card (AIC) partners tell him very few of them are interested in promoting the RTX 4060 Ti 16 GB, which goes on sale starting July 18. Board partners have very few custom-design graphics card models in their product stacks, and much of this has to do with the card's steep $499 MSRP.

Priced $100 above the original RTX 4060 Ti for essentially double the memory size and no other difference in specs, the RTX 4060 Ti 16 GB is hard enough to sell at MSRP; premium overclocked models would end up priced around the $550 mark, just $50 short of the $599 MSRP of the RTX 4070. The RTX 4070 is around 30% faster than the RTX 4060 Ti 8 GB, and we doubt the additional memory will narrow that gap by more than a couple of percentage points.
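For a rough sense of how those numbers stack up, here is a minimal Python sketch comparing price per unit of performance, using the MSRPs quoted above (the $399 for the 8 GB card is implied by the "$100 above" figure), the article's ~30% estimate for the RTX 4070, and the assumption that the extra memory adds essentially nothing to average performance. The figures are illustrative, not benchmark data.

# Rough price-to-performance comparison using the figures quoted above.
# Assumptions: the RTX 4060 Ti 8 GB is the 1.00 baseline, the 16 GB card
# performs essentially the same, and the RTX 4070 is ~30% faster.
cards = {
    "RTX 4060 Ti 8 GB ($399 MSRP)":        (399, 1.00),
    "RTX 4060 Ti 16 GB ($499 MSRP)":       (499, 1.00),
    "RTX 4060 Ti 16 GB OC (~$550)":        (550, 1.00),
    "RTX 4070 ($599 MSRP)":                (599, 1.30),
}

for name, (price_usd, relative_perf) in cards.items():
    # Dollars paid per baseline unit of performance (lower is better)
    print(f"{name:34s} ~${price_usd / relative_perf:.0f} per baseline unit of performance")

Even on those assumptions, the RTX 4070 works out to roughly $461 per baseline unit of performance versus $499-550 for the 16 GB card, which is the gap board partners appear to be reacting to.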
Sources: Andreas Schilling (Twitter), VideoCardz

79 Comments on NVIDIA Partners Not Too Enthusiastic About GeForce RTX 4060 Ti 16 GB Launch

#26
TheinsanegamerN
bug3080 is the worst case scenario
3080? This is looking at a 3070, you know, the GPU that has almost the same performance as the 4060ti and also only has 8GB of VRAM. Perhaps you are a little confused?
bug(and even then if you look at Forspoken, it's still fine).
From the techspot review:

"Yes, once again the 8GB VRAM buffer of the RTX 3070 isn't enough once we enable ray tracing, but rather than stutter like mad, crash, or perform poorly, it just fails to load textures at all, leaving you with this blurry mess."

Is that considered "fine" to you? See, I would consider that "unacceptable" for a video card that costs more than an Xbox Series S and nearly as much as the PS5 this game runs fine on.
bugWith Ada, only 4060 and 4060 Ti come with 8GB VRAM, everything else gets 12-24.
We are not talking about the 12-24 GB GPUs; we are talking, specifically, about the 8 GB 4060/Ti and why they should not exist as is.
bugSince 4060 is not meant to push 4k or maxxed out QHD, I still don't get why people use "Nvidia doesn't offer enough VRAM" as a blanket statement.
You seem to really be struggling with this, so let me help you out.

The 4060 Ti, with its 8GB VRAM buffer, cannot handle current games at 1080p. Forget 4K. Forget 2K. You cannot run games at high, with RT, at 1080p, on this $400 GPU. 1080p is out of range of this card.

[Three 1080p benchmark charts]

Hope this helps you. If you still can't understand this VRAM argument, and why it applies to 8GB cards, I can only suggest getting that lobotomy reversed.
Posted on Reply
#27
rv8000
bug3080 is the worst case scenario (and even then if you look at Forspoken, it's still fine).
With Ada, only 4060 and 4060 Ti come with 8GB VRAM, everything else gets 12-24. Since 4060 is not meant to push 4k or maxxed out QHD, I still don't get why people use "Nvidia doesn't offer enough VRAM" as a blanket statement.
He just linked/provided examples of 1080p/1440p, and you start talking about 4K???

There are several modern game releases that already run into issues with 8gb vram buffers, even at 1080p. The most notable issue being massive 1% low spikes, which is probably one of the most noticeable/worst things to experience in game.
Posted on Reply
#28
sLowEnd
It's plainly obvious why they wouldn't be too enthusiastic about this card. The MSRP is stupid, even for the current generation of cards not known for having great value. Worst among the bad.
Posted on Reply
#29
bug
rv8000He just linked/provided examples of 1080p/1440p, and you start taking about 4K???

There are several modern game releases that already run into issues with 8gb vram buffers, even at 1080p. The most notable issue being massive 1% low spikes, which is probably one of the most noticeable/worst things to experience in game.
I was talking about settings and maxxing out things. If the game doesn't know how to manage textures properly and loads them into VRAM just because, it will load them regardless of the resolution you're playing at.
(I also made a typo and wrote 3080 instead of 3070, my bad.)
Posted on Reply
#30
Gmr_Chick
The 4060 was already a shitshow to begin with, but Nvidia said "hold my beer" and decided to release Episode II: The Shitshow Strikes Back (Now With 16GB!) :roll:

In all seriousness though, this thing is even more DOA than the original 4060 Ti was/is. Any way you look at it. Especially considering the 6800 XT and 6700/50 XT are still available.
Posted on Reply
#31
bug
Gmr_ChickThe 4060 was already a shitshow to begin with, but Nvidia said "hold my beer" and decided to release Episode II: The Shitshow Strikes Back (Now With 16GB!) :roll:

In all seriousness though, this thing is even more DOA than the original 4060 Ti was/is. Any way you look at it. Especially considering the 6800xt and 6700/50xt is still available.
That's what I think, too. The 8GB was priced just above what could be considered a buy. The 16GB version is just ridiculous. I mean: www.tomshardware.com/news/gddr6-vram-prices-plummet
Posted on Reply
#32
rv8000
bugI was talking about settings and maxxing out things. If the game doesn't know how to manage textures properly and loads them into VRAM just because, it will load them regardless of the resolution you're playing at.
(I also made a typo and wrote 3080 instead of 3070, my bad.)
Two different scenarios. One being GPU throughput and the other VRAM limitations. Obviously you're not buying a 4060, 7600, 3070, 6600-6700 to play modern 4K games. The purchase scenario for those GPUs will be 1080p/1440p max to high settings. With all of them being relatively close in rasterization horsepower, it's been made quite plain which GPUs fall behind SOLELY due to VRAM buffer size.
Posted on Reply
#33
LabRat 891
IMHO, board partners aren't happy because a 16GB card has longer legs, meaning a longer period before a customer becomes a return customer.
Posted on Reply
#34
neatfeatguy
Gmr_ChickThe 4060 was already a shitshow to begin with, but Nvidia said "hold my beer" and decided to release Episode II: The Shitshow Strikes Back (Now With 16GB!) :roll:

In all seriousness though, this thing is even more DOA than the original 4060 Ti was/is. Any way you look at it. Especially considering the 6800xt and 6700/50xt is still available.
Based on the sales history of the 4060 Ti 8GB and the 4060 8GB in Japan, at least one person will be there when the sale embargo is lifted at 10pm to buy one of these. Hopefully Nvidia at least provides 1 of these cards to whatever AIB that sells in Japan so their 1 customer sales record can continue to hold strong in that market.
Posted on Reply
#35
Beginner Macro Device
neatfeatguyHopefully Nvidia at least provides 1 of these cards to whatever AIB that sells in Japan so their 1 customer sales record can continue to hold strong in that market.
An old samurai noticed a green box
The sword told samurai the box is a vice
Samurai committed to seppuku
Posted on Reply
#37
THU31
I don't understand how NVIDIA can force the MSRP of cards using the same chip. AIBs are buying the AD106 chips, so why can't they do whatever they want with them? Why can't they put 16 GB and simply add the cost of the memory to the price of the card?

Are those companies so afraid to stand up to the chip makers? Would NVIDIA really be happy if they all left and they could only sell their founder's edition models?
Posted on Reply
#38
SCP-001
THU31I don't understand how NVIDIA can force the MSRP of cards using the same chip. AIBs are buying the AD106 chips, so why can't they do whatever they want with them? Why can't they put 16 GB and simply add the cost of the memory to the price of the card?

Are those companies so afraid to stand up to the chip makers? Would NVIDIA really be happy if they all left and they could only sell their founder's edition models?
tbh I think Nvidia would prefer that. It's already known that Nvidia treats their AIB partners like ass (see EVGA). I'm sure they would love to keep all the chips for themselves. As for why the AIBs don't stand up for themselves: if they did, I'm pretty sure Nvidia would just blacklist them and stop selling GPUs to those companies in any form, desktop or mobile.
Posted on Reply
#39
WorringlyIndifferent
Beginner Micro DeviceAnd I will never stop blaming nGreedia even if they start gifting me multi billion worth of stuff. This is just the "burn corpo shit" attitude which I sport.
Actively distrusting and questioning corporations (along with billionaires, politicians, journalists, or anyone else who has influence over you) should be the default behavior for all humans. It's absolutely baffling that it isn't, and a day doesn't go by that I don't interact with someone who legitimately just blindly trusts their preferred company or billionaire or whatever. They just accept whatever they hear, no critical thought, no doubt, just "whatever they said is true." Just mind boggling. The thought never even crosses their mind that whoever they're listening to might be lying, or might just be wrong. I can't imagine living like that.
Posted on Reply
#40
Beginner Macro Device
WorringlyIndifferentActively distrusting
Yes. My old SSD is also on this phony list.

Posted on Reply
#41
Dahita
TheinsanegamerNI know math is hard for you, but really dude? Going from 37 to 49 FPS is a 33% increase. That's not massive? :slap: Also, no mention of the removal of microstuttering? :laugh::roll::laugh:

Oh, so now we're claiming cherry picking. Bruh, modern games are frequently hitting this wall. You can't "well you picked that specific game" your way out of this one. And I'd say going from 7 FPS minimum to 40 FPS minimum is a LOT more than "better with the occasional dip". Unless your definition of "picking titles" is using modern AAA games in any capacity, which just LMFAO.

If you don't have the VRAM you're not using RT either LMFAO :nutkick: Did you not read the techspot review? As an example: Callisto Protocol. Going from ultra to ultra + RT, the 3070 went from dead even with the 6800 to more than a generation BEHIND the 6800. With RT.



Just take the L man. You need VRAM capacity to use the GPU's capability, whether that be shaders, pixels, RT, whatever. If you buy a $400 GPU and have to immediately turn settings down at 1080p to avoid running out of VRAM, that is, objectively, a shat product, and shows the VRAM buffer is too small for what the GPU can do. The 1080p scene was settled by $200 GPUs 7 YEARS ago. A $400 GPU should not be struggling with this.
Dude, maybe calm down a bit; you seem really aggressive for no reason. He's got a point. No matter how much VRAM you have, if you don't have the horsepower to use it, you're screwed. That is not in contradiction with your graphs. Cheer up.
Posted on Reply
#42
eidairaman1
The Exiled Airman
MrDweezilThere's no interest because the price for the 8GB model is bad, and $100 on top of that is insane.
Their pricing is ridiculous to begin with. Started with RTX 2000.
Posted on Reply
#43
Beginner Macro Device
eidairaman1Started with RTX 2000.
Not completely true. RTX 2060 was a banger for its $349. 1060 6 GB was only 50 dollars cheaper and...
• Does not have RT (I know).
• Does not support DLSS.
• Is about a third slower. In the games of 2023, the difference is up to 2 times.

You were basically getting something similar to GTX 1070 Ti but with RT/DLSS for less money than you had to pay for a basic 1070.

But yeah, the 2080 and 2080 Ti were overpriced. The 2070 just made no sense: either a used 1080 Ti was cheaper, or you stepped up to the 2080 Ti, the card actually faster than anything Pascal.
Posted on Reply
#44
tommesfps
"Medium or High graphics settings in a video game at whatever resolution are ok sweetie" #JustMediumGraphicsPositivity #JustHighGraphicsPositivity #8GBVRAMPositivity #PLZNO8GBVRAMCybermobbingFFS #CANTsleepbecausemydreamsdontfitinto8GBVRAM #whydoconsoleportsevenhavedifferentgraphicsettingspresetsquestionmark
Posted on Reply
#45
Lew Zealand
Beginner Micro DeviceNot completely true. RTX 2060 was a banger for its $349. 1060 6 GB was only 50 dollars cheaper and...
• Does not have RT (I know).
• Does not support DLSS.
• Is about a third slower. In the games of 2023, the difference is up to 2 times.

You were basically getting something similar to GTX 1070 Ti but with RT/DLSS for less money than you had to pay for a basic 1070.

But yeah, the 2080 and 2080 Ti were overpriced. The 2070 just made no sense: either a used 1080 Ti was cheaper, or you stepped up to the 2080 Ti, the card actually faster than anything Pascal.
1060 was $100 cheaper: $250. That's a big single-gen markup, though there was the 1660 and 1660 Super for less than the 1060, but no extra features. Only the Founder's Edition 1060 was $300.

Oh and the 2060's competitor, the 5600XT, was the same speed at $280. So the 2060 was an OK card and the cheapest tech demo for RT and DLSS but nothing special pricewise as you paid more for the added features, just like the rest of the 2000 series.
Posted on Reply
#46
eidairaman1
The Exiled Airman
Lew Zealand1060 was $100 cheaper: $250. That's a big single-gen markup, though there was the 1660 and 1660 Super for less than the 1060, but no extra features. Only the Founder's Edition 1060 was $300.

Oh and the 2060's competitor, the 5600XT, was the same speed at $280. So the 2060 was an OK card and the cheapest tech demo for RT and DLSS but nothing special pricewise as you paid more for the added features, just like the rest of the 2000 series.
$349 for a low-end card is stupid.
Posted on Reply
#47
watzupken
It is a pointless product for the price that it commands. 100 bucks more gets you an RTX 4070 that is more capable in every sense. Sure, it's got 4 GB less VRAM, but for the RTX 4060 Ti, which primarily targets 1080p resolution, 16 GB is mostly redundant. Moreover, its performance is just around that of a 3060 Ti.
Posted on Reply
#48
N/A
It's closer to the 3070 than to the 3060 Ti, with 91-100-104% at 1440p and 89-100-101% at 1080p, and it is definitely viable for 1440p with an average of 97 FPS at the highest preset across 25 game titles. On Normal or High it's even better. This is the 3070 16 GB that Hardware Unboxed forced into existence, only not that cheap.
Posted on Reply
#49
Vayra86
bug3080 is the worst case scenario (and even then if you look at Forspoken, it's still fine).
With Ada, only 4060 and 4060 Ti come with 8GB VRAM, everything else gets 12-24. Since 4060 is not meant to push 4k or maxxed out QHD, I still don't get why people use "Nvidia doesn't offer enough VRAM" as a blanket statement.
Just no. Sorry, but it's truly time to leave your old notions at the door on this subject. This train was underway and it has now arrived at 'my VRAM is insufficient just a single gen later' for Ampere cards in the midrange. Then we got Ada with samey VRAM relative to core power, and people keep saying this is fine.

The logic is off entirely. I'm sure you can tweak settings, fantastic, but it doesn't change a thing: people buy cards that are brutally underpowered for way too much money in Ada.

You do you... I know I won't. But stop living in denial when there are various examples AT LAUNCH of a new gen that simply get crushed by VRAM caps while they could run the game fine in every other way. You can play at lower settings. It doesn't change facts. 8GB is shit.

Also, about what you could expect on a card... I ran 3440x1440 on a bloody x80 Pascal. Of course you can and should be able to max most games at 1440p on these cards. What are you on about?! And even thén, tweaking stuff left and right, you will find games that just won't be satisfactory any longer, much like I did recently.
bugThat's not a massive improvement, it's just 10fps on average. It does better with the occasional dip, but that's expected when you pick titles that don't fit into VRAM.

This is where we disagree. I can go from Ultra/Very high textures to just High to fix a VRAM issue. But if I don't have the shading or RT horse power, I'm screwed.
This is a crazy argument that even counters what you yourself are saying about these cards: that they 'run out of juice' at whatever resolution because of lacking core power, just the same as they do on VRAM. This test shows they don't; there is in fact a lot of untapped core power that just doesn't get a chance to show itself without more than 8GB.

Similarly, for RT: Callisto Protocol's example even at 1080p + RT shows a serious lead for the 16 GB-equipped RX 6800, while at 1440p with RT the Ampere card falls apart completely. It has the shading/RT horsepower! It lacks the VRAM.

This also puts a different lens on the idea that games are not released with features or quality settings that cards cannot run. You could run these settings on a 12GB midrange card. But not an 8GB one that is otherwise identical. If you own an 8GB card in 2023, you're in that shitty corner no developer wants to wait for any longer. Consoles have more, and an ever growing amount of GPUs will have more as time passes, even within the current gen, as RDNA2 cards with more are still out there selling, while the 8GB cards that have been there since 2016 (!) are slowly checking out.

Also - and yes, I know, this is clearly just the maximum load on it - I have to say that in virtually every new'ish game I get into now, I'm seeing well north of 10GB in use. Every 8GB card will be that much harder pushed on its L2 and its overall bandwidth in every game going forward. Even thát incurs a performance penalty all on its own, even if it won't stutter. That is probably part of the reason the x60 isn't going places performance wise relative to Ampere - Ampere has a lot more bandwidth. Another side effect of lacking VRAM / bw is that engines dynamically adjust quality of assets on the fly now, so you will have more pop in too even if you do keep the frames. Darktide is a good example... I played that on an 8GB 1080, and had blurry textures left and right, while frames were good. On the 20GB card, those issues are gone.
Posted on Reply
#50
lightning70
This card is just a waste of sand. No one would prefer this card when there is a more powerful, AD104-based RTX 4070 available close to this price.
Posted on Reply