
NVIDIA GeForce RTX 4060 Ti 16 GB

I think a lack of VRAM is holding this card back; apparently, a lack of VRAM holds all cards back.
 
I think a lack of VRAM is holding this card back; apparently, a lack of VRAM holds all cards back.

64-bit bus, 288 GB/s "effective" bandwidth, 32 GB VRAM, here we come! RTX 5060 SUPER Ti... have they done a SUPER Ti yet, or is the marketing team sleeping on making it sound more impressive than it actually is? That is, after all, what they specialize in.
 
It does make me wonder, with the bigger titles at least, if it is a matter of the games originating on consoles. Both the PS5 and Xbox Series X have 16 GB of memory. They also have at least 256-bit buses. The Xbox Series X uses a split memory arrangement, with 10 GB accessible at full speed and 6 GB at a lower speed, and both consoles have much larger bandwidth than this card.
Major releases are primarily optimized for consoles, as those have more uniform, standardised hardware. Optimizations for PC hardware, with its diverse VRAM and performance tiers, come second. Devs don't have the time and money to cover everything perfectly. The obvious victim could be 8 GB cards in an increasing number of cases: not all textures load, and the game automatically looks much better on consoles.
 
Sooo, all the crap talking and it means nothing. Where dem Green bois at?
 
>10 GB and >500 GB/s is where you must be for any new purchase or you are wasting time and precious money.
I don't think this statement is valid if you're investing $40 in an RX 480 or something like that. Yeah, this card sucks, but $40 is also a joke of a price. A good joke, unlike the 4060 Ti's.

I still have this GPU as my backup and it can still play most games, if not everything. Even demanding ones like Cyberpunk 2077 are more or less playable (with an asterisk, but still).

There are also games which don't need a serious GPU. StarCraft 2, for example: it's very CPU-taxing, but even an HD 6950 is enough, and that GPU is 13 years old or so. Some gamers just ignore graphics settings and can live with everything on low if the framerate and frametimes are comfortable. CS:GO is a clear example of that approach.

But yeah, if you need ultimate 1080p performance, you have to get something with a double-digit VRAM amount and at least 500 GB/s of VRAM bandwidth. RX 6800/RTX 3080/RTX 4070 as the minimum, if we consider 1% lows below 60 FPS unacceptable.
 
Thanks for the review, but you should include DLSS 3 frame-gen-enabled FPS in your game benchmark graphs for the games which support it, like Cyberpunk.

Because who is going to buy a 4xxx series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike and then testing it only with the motor switched off.

For people who are not familiar with the latest technologies and only look at graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while the 4060 crushes the 3070 with FG.

But at least this review shows properly, for all the people crying on the net about NVIDIA cards not having enough VRAM, that currently 16 GB or 8 GB on midrange cards gives the same performance. Of course, in the following years more VRAM will be beneficial, the same way a 6 GB 1060 is way better than a 3 GB 1060 nowadays, but it won't come to that for a few years.

In other words, for people who change their GPU every 2-3 years there is no reason to pay an extra $100 for the 16 GB 4060 Ti.
 
The test results show why NVIDIA isn't too keen on submitting the product for review. Except for a few poorly optimized games, the 16 GB model doesn't make a difference in performance.
Assuming the intended use of this card is Full HD, it won't matter for a long time.
In short, this GPU is a complete waste of sand.
 
Portal RTX would be a good title to test and compare 8 GB vs 16 GB.

Problem is, 16 GB helps most at higher resolutions, but the 4060 Ti already struggles at resolutions above 1080p due to its narrow bus. There are too many other bottlenecks before it hits the VRAM pool size limit.
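A quick way to sanity-check the bus-width argument is to compute raw bandwidth yourself. A minimal Python sketch, using the publicly listed bus widths and per-pin data rates (the 3060 Ti comparison is my own pick, not something from the review):

```python
# Raw VRAM bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# This is the raw figure only; NVIDIA's "effective bandwidth" marketing number
# additionally assumes a certain L2 cache hit rate.

def raw_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14.0),
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {raw_bandwidth_gb_s(bus, rate):.0f} GB/s")

# Prints 448 GB/s for the 3060 Ti vs 288 GB/s for the 4060 Ti:
# the faster memory doesn't make up for halving the bus.
```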
 
Question.
Is this THE WORST or the 2nd worst video card released by NVIDIA, price/performance-wise?
 
To anyone who thinks this should have been a 50-tier card... No. Even the 4060 (which I'll grant could be argued as a 4050 ti) holds up reasonably well against its predecessors. Yes, it costs too much, by at least USD50, and has too narrow a memory bus. But let's look at the last 9 generations of x60 cards.

I collated the results from TPU testing at the second-highest tested resolution. 960 is listed at two resolutions as the transition point to 1440p. The 560, 660, 960, 1060 and 1660 are all within 5 FPS of 60 and range from USD200-250 and 120-150W. The 760 was an outlier pre-Turing. It's also tied for hungriest here, as well as tied for most expensive until the 2060.

Note that with the 2060, price and power both took a big jump to achieve the accompanying performance increase. This was where Nvidia decided to redefine what 60-series, and all performance tiers, meant. If one simply looks at model numbers, yeah, the 4060 is a slap in the face relative to the 3060. But look at power. Gen-on-gen, performance of x60 cards pre-RT was pretty consistent relative to contemporary titles, as was power and price (again, 760 excepted). Anyone expecting the 4060 Ti or even 4060 to be sold for USD 200 is bordering on delusional. But at $250 and, say, $280? Whole different ball game. Downward price pressure is nearly always weaker than upward. Barring another crypto-style event, we could see "normal" 60-series P/P and pricing in a gen or two.

Model | Price (USD) | Watts | VRAM | Bus width | Resolution | Avg FPS
560 | 200 | 150 | 1 GB | 256-bit | 1920x1200 | 60
660 | 230 | 140 | 2 GB | 192-bit | 1920x1200 | 63
760 | 250 | 170 | 2 GB | 256-bit | 1920x1080 | 73
960 | 200 | 120 | 2 GB | 128-bit | 1920x1080 | 65
960 | 200 | 120 | 2 GB | 128-bit | 2560x1440 | 43
1060 | 250 | 120 | 6 GB | 192-bit | 2560x1440 | 55
1660 | 220 | 120 | 6 GB | 192-bit | 2560x1440 | 56
2060 | 350 | 160 | 6 GB | 192-bit | 2560x1440 | 85
3060 | 330 | 170 | 12 GB | 192-bit | 2560x1440 | 85
4060 | 300 | 115 | 8 GB | 128-bit | 2560x1440 | 70

I will beat this drum until people start listening: Performance expectations are, in many cases, simply too high. Beyond resolutions increasing, rendering demand rises right along with it. x60 cards were meant, pre-Turing (and for Turing if you count the 1660), to make around 60 FPS in mainstream to upper-mainstream resolutions for 150W/$250 or less. Then Nvidia belched out RTX, and PC gamers got Stockholm Syndrome and let NV nudge "mainstream" up the ladder. Hell, by the above numbers, the 3050 should have launched as a 60-series card: $250*. 130W. 60fps@1440p. Inflation adjusted, it's practically a dead ringer for the 960.

*Yes, I know the 3050 never actually listed for its "launch" price
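If anyone wants to play with the table above, here is a throwaway script that turns it into FPS per $100 and FPS per watt. The numbers are just the ones in the table, so treat the output as back-of-the-envelope only:

```python
# FPS per $100 and FPS per watt for each x60 card in the table above.
rows = [
    # model, price (USD), watts, avg FPS at the listed resolution
    ("560",  200, 150, 60),
    ("660",  230, 140, 63),
    ("760",  250, 170, 73),
    ("960",  200, 120, 43),   # the 2560x1440 row
    ("1060", 250, 120, 55),
    ("1660", 220, 120, 56),
    ("2060", 350, 160, 85),
    ("3060", 330, 170, 85),
    ("4060", 300, 115, 70),
]

for model, price, watts, fps in rows:
    print(f"{model:>4}: {fps / price * 100:5.1f} FPS per $100, {fps / watts:.2f} FPS/W")
```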

Question.
Is this THE WORST or the 2nd worst video card released by NVIDIA, price/performance-wise?

4th worst at 1080p, 3rd worst above (sticking to Ada, anyway).

 
Is this THE WORST or the 2nd worst video card released by NVIDIA, price/performance-wise?
Laaaaaaadies and gentlemen, and all the other 65535 genders, welcome to our Ridiculous Crap Chart 2023!

Today's participants are Ada Lovelace GPUs and the question is who is giving the least bang per buck!

We are measuring their bangs in relative performance at 4K in video games, taken from the 16-gigabyte 4060 Ti's TPU review, the one and only to this moment!
We are measuring their bucks in illegal transients United States dollars. $20 is $20.

RTX 4060 is 60% the price and 78% the performance. Not bad considering how puny and cheeky this GPU looks.
RTX 4060 Ti 8 GB is 80% the price and 98% the performance. Record breaking efficiency here! Call in the cops!
RTX 4060 Ti 16 GB is 100% the price and 100% the performance. How come!
RTX 4070 is 120% the price and 139% the performance. Not like we didn't expect this but... Smells fishy.
RTX 4070 Ti is 160% the price and 169% the performance. This participant tried their best to disappoint us and our jury rated their attempts high, yet not high enough.
RTX 4080 is 240% the price and 213% the performance. That's how we laze!
RTX 4090 is 320% the price and 274% the performance. This participant has been banned from participating due to bearing too many ARMs, i.e. being unable to run at 100% potential given the current state of CPU performance.

So our golden-tier underperformer is the RTX 4090, yet its award has to be revoked because tango is only a thing if both partners perform, and CPUs nowadays are too slow to make completely fair use of a 4090. Thus, the golden award is bestowed upon the RTX 4080, which has snatched the fail from the jaws of victory. And the silver award, indeed, goes to the 4060 Ti 16 GB, the second-worst performer of this generation.

The bronze award should have been given to the RTX 4070 Ti, but all things considered, this GPU doesn't deserve the credit.
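For the record, the ranking above is nothing more than relative 4K performance divided by relative price. A tiny script to reproduce the ordering from the quoted percentages (the percentages are the ones in this post, not TPU's official performance-per-dollar chart):

```python
# Performance-per-price index for the Ada lineup, worst value first,
# using the price % / performance % figures quoted above.
lineup = {
    "RTX 4060":          (60, 78),
    "RTX 4060 Ti 8 GB":  (80, 98),
    "RTX 4060 Ti 16 GB": (100, 100),
    "RTX 4070":          (120, 139),
    "RTX 4070 Ti":       (160, 169),
    "RTX 4080":          (240, 213),
    "RTX 4090":          (320, 274),
}

for card, (price_pct, perf_pct) in sorted(lineup.items(),
                                          key=lambda kv: kv[1][1] / kv[1][0]):
    print(f"{card}: {perf_pct / price_pct:.2f} performance per unit of price")
```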
 
So much for games needing more than 8 GB VRAM. In 4K with RT on, perhaps.
It is not as simple as that; many games just straight up don't load some assets when they run out of VRAM, meaning the image might not be the same on both cards.
Also, it depends on the scene: often the game starts out smooth, but as VRAM fills up, the 8 GB card can start stuttering. So it also depends on how long the benchmark run is.
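One crude way to see this yourself is to log VRAM usage while the benchmark runs and check whether it only fills up late in the run. A sketch that polls nvidia-smi once per second (assumes nvidia-smi is on the PATH and a single GPU; the five-minute window is arbitrary):

```python
# Sample total VRAM usage (MiB) once per second during a benchmark run.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(300):          # ~5 minutes at 1 sample/s
    samples.append(vram_used_mib())
    time.sleep(1)

print(f"start: {samples[0]} MiB, peak: {max(samples)} MiB, end: {samples[-1]} MiB")
```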
 
"The fact is that NVIDIA owns the GPU market"

Sums up everything for me with your statement W1z for the 4xxx release.

Thanks for the purchase and review.
 
To anyone who thinks this should have been a 50-tier card... No. Even the 4060 (which I'll grant could be argued as a 4050 ti) holds up reasonably well against its predecessors.
Whilst performance-wise you're more correct than not, there are other things to consider.

0. All Ada Lovelace products, especially the lower-tier ones, lack a reasonable VRAM amount and, most importantly, bandwidth. This shifts their real tier one step lower.
1. All Ada Lovelace products are more cut down than their predecessors. The 3060 uses the same share of the full Ampere silicon as the 4070 does of full Ada. But hey, the 4070 suddenly costs almost twice as much despite being nowhere near twice as fast!
2. All Ada Lovelace products launched at a time when almost nobody wants a GPU. There is little demand, and most of the remaining buyers are either picky, or broke, or both. There are almost no miners around any more with their "please give us another million GPUs at whatever price". They're gone. NVIDIA has to consider this fact as well.
3. If everyone submits to NVIDIA's policy and pays what they ask, they will see no reason to improve. Now, seeing the worst demand ever, they are forced to think. The mining-fever hangover is still in the air, but it will fade away some day. And then NVIDIA will sober up their pricing, unless AMD and Intel fail to achieve anything. And they are currently achieving very little, with only two meaningful products available, namely the A770 from Intel and the RX 7900 XTX from AMD. The rest is uncompetitive. Or non-existent.
4. You can't take a GPU's naming seriously if it is not faster than its predecessor in every possible scenario. And yes, all 4060-series GPUs do lose to their 3060-series predecessors in certain tasks, which has never happened before.
 
... only two meaningful products available, namely the A770 from Intel and the RX 7900 XTX from AMD. The rest is uncompetitive. Or non-existent.
I agree with your points, except for this. RDNA 2 is still alive and kicking and deserving of attention. Both the 6600 and 6700 series are great value, even the 6800 and 6900 series are okay with the right discount.
 
This card has hidden value for plenty of gamers on a budget. They will see the value one or two years from now, when the price drops. DLSS + frame gen + Reflex + the large L2 cache + 16 GB VRAM is a deadly combo for future games, which will be better adapted to this generation. The combo was very well designed by NVIDIA.
It is good that the 16 GB of VRAM went to this model, because 60-class cards are used by the gamers who keep their cards for many years, unlike 4070/4080 buyers, who change cards for new ones more often.

Another point gamers are overlooking is that we have plenty of options for managing FPS using game settings (even without frame gen), but only a few for managing VRAM utilization/allocation.

16 GB could be immediately used by any modern open-world game. CP77 is constantly reading game data from disk at an average speed of 30 MB/s on Ultra settings.
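If you want to check a figure like that on your own machine, one rough approach is to poll the game process's cumulative disk reads with psutil. A sketch only (the PID is a placeholder you would look up yourself, and psutil cannot read other processes' I/O counters on macOS):

```python
# Estimate a process's average disk read rate over one minute.
import time
import psutil

GAME_PID = 12345  # placeholder: substitute the actual game's PID

proc = psutil.Process(GAME_PID)
start_bytes = proc.io_counters().read_bytes
t0 = time.time()
time.sleep(60)                       # sample over one minute
delta = proc.io_counters().read_bytes - start_bytes
print(f"average read rate: {delta / (time.time() - t0) / 1e6:.1f} MB/s")
```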

Graphics hardware is not responsible for the software we run on it; the software needs to be adapted to the hardware. A bad product simply does not exist; every one has its place.
All the comments I see on the web about this generation have almost zero value for me.
 
Both the 6600 and 6700 series are great value, even the 6800 and 6900 series are okay with the right discount.
No clue (asterisk) about pricing abroad, but in my particular corner of the globe the whole RDNA2 lineup makes zero sense to buy. The RX 6600 is absolutely destroyed by aftermarket 2080s; the RX 6600 XT is trashed by the 3060 12 GB; the 6700 does not exist; the 6700 XT is somehow priced like a 3070, which is far from passing the reality check; the 6800 is priced a bit lower than the 3080 (and y'know, the 3080 DESTROYS the 6800 in RT and doesn't let it relax in non-RT); the 6800 XT is priced like a 3080 (but do you remember the lack of RT performance and DLSS? I do); and 6900s are priced so high it makes no sense to even consider them. I'm talking about both BNIB and aftermarket cards. They're equally skewed, and not in favour of AMD's perf/price ratio.

Half a year ago, yes, my purchase of a 6700 XT was questionable, yet somewhat reasonable. In today's market, I'd be balling on a used 3080, or perhaps a 6800 if my [Barter 95] knocked some sense into a vendor.

* just checked Newegg.
The RX 6600 XT is more expensive than the RTX 3060 12 GB. Fail.
The RX 6700 non-XT is the same price as the 4060/3060 Ti, yet noticeably slower than them, especially in RT-heavy scenarios. Fail.
The RX 6700 XT is 10 bucks more expensive. Fair enough, but "great value" is a very ambitious way to describe it.
The RX 6800 makes no sense since the XT version is only 50 bucks more expensive and delivers significantly more performance.
The XT version, though, makes sense.
 
Maybe the upcoming Radeon RX 7800 XT and RX 7700 XT can change that.

All the leaks point to non-XT versions. We might see the XT ones once NVIDIA refreshes its line-up sometime next year.
 
Thanks for the review, but you should include DLSS 3 frame-gen-enabled FPS in your game benchmark graphs for the games which support it, like Cyberpunk.

Because who is going to buy a 4xxx series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike and then testing it only with the motor switched off.

For people who are not familiar with the latest technologies and only look at graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while the 4060 crushes the 3070 with FG.

But at least this review shows properly, for all the people crying on the net about NVIDIA cards not having enough VRAM, that currently 16 GB or 8 GB on midrange cards gives the same performance. Of course, in the following years more VRAM will be beneficial, the same way a 6 GB 1060 is way better than a 3 GB 1060 nowadays, but it won't come to that for a few years.

In other words, for people who change their GPU every 2-3 years there is no reason to pay an extra $100 for the 16 GB 4060 Ti.
Wrong, DLSS 3 is not a universal feature, so no. It needs its own chart, just as a feature like HairWorks did way back. It's proprietary + game-specific + hardware-specific (Ada only), and its performance is even driver/patch/update-specific. It should have no effect on the overall averages.

"The fact is that NVIDIA owns the GPU market"

Sums up everything for me with your statement W1z for the 4xxx release.

Thanks for the purchase and review.
GPUs in x86 consoles amount to an equal market share and drive most gaming forward. PC-first releases these days are a rarity rather than commonplace, on top of that. A good thing to keep in mind. This is also the main reason we see games ported to PC as badly as they are now, and why features like RT are still an afterthought more often than not, even though there are three generations of RT-capable hardware from this supposed GPU market "owner"...
 
Wrong, DLSS 3 is not a universal feature, so no. It needs its own chart, just as a feature like HairWorks did way back. It's proprietary + game-specific + hardware-specific (Ada only), and its performance is even driver/patch/update-specific. It should have no effect on the overall averages.
Yup .. with the new H2 2023 test platform I've added testing in 3 games to the DLSS page:

Didn't feel it was relevant for the 16 GB model, so I didn't test.

This will be included in all new releases going forward, but I have no plans to bring DLSS performance results into our "normal" testing.

This card has hidden value for plenty of gamers on a budget. They will see the value one or two years from now, when the price drops.
No doubt, if the price is right, this will be an interesting choice. /doubt on value in a few years, though; how's the 3080 12 GB doing these days? Seems to me most of the action on the used market is for the 10 GB model (I could be wrong though)
 
No clue (asterisk) about pricing abroad, but in my particular corner of the globe the whole RDNA2 lineup makes zero sense to buy. The RX 6600 is absolutely destroyed by aftermarket 2080s; the RX 6600 XT is trashed by the 3060 12 GB; the 6700 does not exist; the 6700 XT is somehow priced like a 3070, which is far from passing the reality check; the 6800 is priced a bit lower than the 3080 (and y'know, the 3080 DESTROYS the 6800 in RT and doesn't let it relax in non-RT); the 6800 XT is priced like a 3080 (but do you remember the lack of RT performance and DLSS? I do); and 6900s are priced so high it makes no sense to even consider them. I'm talking about both BNIB and aftermarket cards. They're equally skewed, and not in favour of AMD's perf/price ratio.

Half a year ago, yes, my purchase of a 6700 XT was questionable, yet somewhat reasonable. In today's market, I'd be balling on a used 3080, or perhaps a 6800 if my [Barter 95] knocked some sense into a vendor.

* just checked Newegg.
The RX 6600 XT is more expensive than the RTX 3060 12 GB. Fail.
The RX 6700 non-XT is the same price as the 4060/3060 Ti, yet noticeably slower than them, especially in RT-heavy scenarios. Fail.
The RX 6700 XT is 10 bucks more expensive. Fair enough, but "great value" is a very ambitious way to describe it.
The RX 6800 makes no sense since the XT version is only 50 bucks more expensive and delivers significantly more performance.
The XT version, though, makes sense.
At Scan UK, the 6650 XT is £100 cheaper than the 3060, which is about a third off the price. The 6700 XT is the same price as the 3060. The 6800 is on par with the 3070. There is an XFX 6950 XT that is £50 cheaper than the 3080 10 GB right now. So yeah, I think AMD offers damn good value right now, at least here.
 
It is a *50-tier card, rebadged to mislead customers into thinking it belongs to a higher tier. No.

Card | Chip | Die area
RTX 4060 | AD107 | 159 mm^2
RTX 3050 | GA106 | 276 mm^2
GTX 1650 S | TU116 | 284 mm^2
GTX 1650 | TU117 | 200 mm^2
GTX 1050 | GP107 | 132 mm^2
GTX 950 | GM206 | 228 mm^2
 
I agree with your points, except for this. RDNA 2 is still alive and kicking and deserving of attention. Both the 6600 and 6700 series are great value, even the 6800 and 6900 series are okay with the right discount.
Sure, but having to watch out for PSU limitations, with RDNA3 and Ada to compete with, makes them a bit problematic imho.
 
Sure, but having to watch out for PSU limitations, with RDNA3 and Ada to compete with, makes them a bit problematic imho.
What PSU limitations? A basic 500 W unit (not the cheapest kind) can drive a 6600 XT with any CPU.

If by "Ada", you mean the 4060, then sure, I agree. The rest are too expensive, imo.
 
To anyone who thinks this should have been a 50-tier card... No. Even the 4060 (which I'll grant could be argued as a 4050 ti) holds up reasonably well against its predecessors. Yes, it costs too much, by at least USD50, and has too narrow a memory bus. But let's look at the last 9 generations of x60 cards.

I collated the results from TPU testing at the second-highest tested resolution. 960 is listed at two resolutions as the transition point to 1440p. The 560, 660, 960, 1060 and 1660 are all within 5 FPS of 60 and range from USD200-250 and 120-150W. The 760 was an outlier pre-Turing. It's also tied for hungriest here, as well as tied for most expensive until the 2060.

Note that with the 2060, price and power both took a big jump to achieve the accompanying performance increase. This was where Nvidia decided to redefine what 60-series, and all performance tiers, meant. If one simply looks at model numbers, yeah, the 4060 is a slap in the face relative to the 3060. But look at power. Gen-on-gen, performance of x60 cards pre-RT was pretty consistent relative to contemporary titles, as was power and price (again, 760 excepted). Anyone expecting the 4060 Ti or even 4060 to be sold for USD 200 is bordering on delusional. But at $250 and, say, $280? Whole different ball game. Downward price pressure is nearly always weaker than upward. Barring another crypto-style event, we could see "normal" 60-series P/P and pricing in a gen or two.

Model | Price (USD) | Watts | VRAM | Bus width | Resolution | Avg FPS
560 | 200 | 150 | 1 GB | 256-bit | 1920x1200 | 60
660 | 230 | 140 | 2 GB | 192-bit | 1920x1200 | 63
760 | 250 | 170 | 2 GB | 256-bit | 1920x1080 | 73
960 | 200 | 120 | 2 GB | 128-bit | 1920x1080 | 65
960 | 200 | 120 | 2 GB | 128-bit | 2560x1440 | 43
1060 | 250 | 120 | 6 GB | 192-bit | 2560x1440 | 55
1660 | 220 | 120 | 6 GB | 192-bit | 2560x1440 | 56
2060 | 350 | 160 | 6 GB | 192-bit | 2560x1440 | 85
3060 | 330 | 170 | 12 GB | 192-bit | 2560x1440 | 85
4060 | 300 | 115 | 8 GB | 128-bit | 2560x1440 | 70

I will beat this drum until people start listening: Performance expectations are, in many cases, simply too high. Beyond resolutions increasing, rendering demand rises right along with it. x60 cards were meant, pre-Turing (and for Turing if you count the 1660), to make around 60 FPS in mainstream to upper-mainstream resolutions for 150W/$250 or less. Then Nvidia belched out RTX, and PC gamers got Stockholm Syndrome and let NV nudge "mainstream" up the ladder. Hell, by the above numbers, the 3050 should have launched as a 60-series card: $250*. 130W. 60fps@1440p. Inflation adjusted, it's practically a dead ringer for the 960.

*Yes, I know the 3050 never actually listed for its "launch" price



4th worst at 1080p, 3rd worst above (sticking to Ada, anyway).


Major releases are primarily optimized for consoles, as those have more uniform, standardised hardware. Optimizations for PC hardware, with its diverse VRAM and performance tiers, come second. Devs don't have the time and money to cover everything perfectly. The obvious victim could be 8 GB cards in an increasing number of cases: not all textures load, and the game automatically looks much better on consoles.

The problem is that in the past you could always lower settings a bit and run at higher resolutions. The 4060 Ti is clearly, 100 percent, only a 1080p card. In fact, Hardware Unboxed took it further and said the 4060 Ti is only a 720p card, because it can't load full textures in a lot of games at 1080p.

I don't think expectations are too high. We have had two console generations in a row that support 4K gaming in one form or another. A 4060 Ti doesn't get you 4K at all, and the 16 GB versions cost up to over $500. Here is a metric for you: the first 1080p graphics card was the 8800 GTS, released in 2007. Sixteen years later, PC gamers are still expected to pay $500 or more to game at 1080p, just on the GPU side of things, while for $400 or $500 you can buy a PS5 or Xbox Series X with a disc drive that games at 4K.

Also remember that the 4070 Ti was originally supposed to be a 4080 model. That means the whole product stack was originally planned to sit one tier higher. Those plans also included a price $100 higher for the "4080" that eventually became the 4070 Ti. That means the 4070 was supposed to be the 4070 Ti, the 4060 Ti was supposed to be the 4070, and the 4060 was supposed to be the 4060 Ti.

That also means they meant to sell what is now the 4060 Ti at probably $600. That is a total joke.
 