
NVIDIA GeForce RTX 4060 Ti 16 GB

It's like I said, if you thought that the review for the RTX 4060 Ti 8GB was the worst that you've ever seen (and it certainly was the worst that I can remember), just wait, because the 16GB review will be even worse.

Sure enough...

I swear man, the ones in charge over at nVidia must be smoking moon rocks because putting this card up against the RX 6800 XT is asking for a Romulan Bloodbath (green blood will flow).

The 6800 XT is irrelevant to the matter at hand; it's a much higher-segment, previous-generation card built on a far larger and much more advanced GPU that draws twice as much power. Of course both it and the RTX 3080 are going to clobber the 4060 Ti.

The price analogy doesn't work very well because neither of those cards is manufactured any longer; availability relies on leftover stock that hasn't been sold yet. In essence, these cards don't matter. Stocks of any remaining new units are depleting fast.

I'll go out on a limb and say that the ones smoking moon rocks were those kvetching about Nvidia not putting enough VRAM on their GPUs, chiefly an AMD-camp complaint. Nvidia's own lack of interest in this SKU makes it look like they put it out just to prove the naysayers wrong.

Of course, that's a convenient way to sidestep @Vayra86's excellent point that "the GPU is only as strong as its weakest link": this cut-down AD106 on a 128-bit bus should never have been sold as anything other than an RTX 4050, but that's a problem this entire generation is facing.

I just find it bizarre that this card has effectively the same fundamental design flaw as the RX 6500 XT: an overreliance on cache to make up for extremely anemic memory bandwidth. That is to say, both are low-end, power-efficient chips you'd otherwise find in a budget laptop that your average League of Legends player would have.
 
The 6800 XT is irrelevant to the matter at hand; it's a much higher-segment, previous-generation card built on a far larger and much more advanced GPU that draws twice as much power. Of course both it and the RTX 3080 are going to clobber the 4060 Ti.

The price analogy doesn't work very well because neither of those cards is manufactured any longer; availability relies on leftover stock that hasn't been sold yet. In essence, these cards don't matter. Stocks of any remaining new units are depleting fast.

No.
AMD has nothing to replace these cards with. They are still in production and will be in production for a very long time.
You can undervolt the RX 6800 and approach the RTX 4060 Ti power consumption.
 
No.
AMD has nothing to replace these cards with. They are still in production and will be in production for a very long time.
You can undervolt the RX 6800 and approach the RTX 4060 Ti power consumption.

You're wrong: the 6800 and 6900 series have already been replaced by the 7900 XT and 7900 XTX. They're even releasing a 7900 GRE, a 7900 XT with a 4-MCD/16 GB configuration, for the Chinese market.

The RX 7800 and 7700 based on Navi 32 are imminent at this point, even with all the delays. RDNA 2 is three years old; these 6800 cards are old news by now. They're not even easy to find anymore in most of the world; here, only the models that no one wanted to buy are left in stock. The MBA and the nice AIB models that you'd want to have are all gone already.
 
these 6800 cards are old news by now.

I know. But...

They're not even easy to find anymore in most of the world; here, only the models that no one wanted to buy are left in stock. The MBA and the nice AIB models that you'd want to have are all gone already.

Here I see 7 models for the RX 6800 XT:

ASUS Radeon RX 6800 XT O16GB GDDR6 (TUF-RX6800XT-O16G-GAMING) - available in 1 store
ASRock RX 6800 XT 16GB OC (RX6800XT PG 16GO) - available in 27 stores
ASRock Radeon RX 6800 XT 16GB GDDR6 256bit (RX6800XT PGD 16GO) - available in 26 stores
SAPPHIRE Radeon RX 6800 XT 16GB GDDR6 256bit (11304-03-20G) - available in 1 store
PowerColor RX 6800 XT Red Dragon 16GB GDDR6 256bit (AXRX 6800XT 16GBD6-3DHR/OC) - available in 27 stores
BIOSTAR Radeon Extreme RX 6800 XT 16GB GDDR6 256bit (VA68T6TMP2) - available in 1 store (1 unit left)
XFX Radeon Speedster MERC 319 RX 6800 XT 16GB GDDR6 256bit (RX-68XTALFD9) - available in 1 store

The RX 7800 and 7700 based on Navi 32 are imminent at this point, even with all the delays.

It's still July; these are rumoured for a September release, with availability possibly even later...
 
More and more games will need lots of VRAM. More VRAM means more future-proofing. AMD chose to equip the RX 6800 with as much as 16 GB and it will pay off in the long run - fine wine.

[Attachments: 4K benchmark charts]
It's worth pointing out that these are 4K results, far outside the scope, context, and capabilities of this paltry 128-bit card, which already chokes at 1440p in many of those titles regardless of whether it's the 8 GB or 16 GB variant. There's no getting around the damage caused by nvidia halving the memory bus width from last generation; it's truly brutal for a GPU with this much compute power to be held back by such a lame cost-cutting measure.
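For concreteness, a quick back-of-the-envelope check on that bus cut, using the public specs (a minimal sketch; real-world throughput also depends on the enlarged L2 cache):

```python
# Raw memory bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps per pin)
rtx_3060_ti = 256 / 8 * 14   # 448 GB/s: 256-bit GDDR6 at 14 Gbps
rtx_4060_ti = 128 / 8 * 18   # 288 GB/s: 128-bit GDDR6 at 18 Gbps

print(rtx_4060_ti / rtx_3060_ti)  # ~0.64: the bus was halved, faster GDDR6
                                  # claws some back, raw bandwidth drops ~36%
```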
 
Thanks for the review, but you should include DLSS 3 frame-gen FPS in your game benchmark graphs for the games that support it, like Cyberpunk.

Who is going to buy a 4xxx-series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike and then only testing it with the motor off.

For people who are not familiar with the latest technologies and only look at graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while the 4060 crushes the 3070 with FG.

But at least this review shows properly, for all the people crying on the net about nvidia cards not having enough VRAM, that right now 16 GB and 8 GB on a midrange card deliver the same performance. Of course, in the following years more VRAM will be beneficial, the same way a 6 GB 1060 is way better than a 3 GB 1060 nowadays, but it won't come to that for a few years.

In other words, for people who change their GPU every 2-3 years there is no reason to pay an extra $100 for a 16 GB 4060.
Not much point in focusing on a scenario that only shows up in a handful of games (DLSS3, RTX).

The 7900 XTX draws about as much power as the 4090 but is slower, and by a huge margin.
The 7600 draws about as much as the 4060 Ti but loses to the 4060 speed-wise.

RDNA3 is utter garbage in terms of energy efficiency compared to Ada, and RDNA2 is even worse. That's why their value is lower. Not everyone is keen on spending extra cash on cooling and electricity bills.

Oh, and by the way, transients on RDNA2 are humongous. They are way higher than on ANYTHING else, especially on early 6900 XTs, which can perform 750 W transient-spike kickflips.
Not everyone is in desperate need of penny-pinching on power when buying an almost $1000 graphics card either... Those that are don't have their priorities in line, or live somewhere with exorbitant energy prices.
 
Those that are don't have their priorities in line
So you're telling me that wanting not to spend your entire net worth, just because manufacturers stopped caring about power consumption, is weird prioritising?

Everyone, even rich kids who can afford a high-tier card, has the right not to spend extra. But nVidia and AMD decided they're the only ones with that right.
 
Should have been a 12 GB card with a 192-bit bus the entire time. This generation will probably go down in history as one of the worst. Nvidia and AMD are milking everyone and it's pathetic. I'm glad I got my 3080 Ti when I did, because otherwise I'd have had to wait another 2 years for something decent.

You mean the 4070 that should really be a 4060 Ti at best, and ideally just the standard 4060?
 
It's certainly a GPU scenario where I'd like to see how memory kit performance impacts the narrow memory bus. That seems like an area where you might be able to alleviate some of the overall problem, though you can only expect to gain so much from a better-binned memory kit in terms of bandwidth uplift and latency reduction.
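As a rough ceiling on that idea (simple arithmetic over common GDDR6 speed bins, assuming the 128-bit bus is fixed):

```python
# Peak bandwidth on a fixed 128-bit bus for a few GDDR6 speed bins
for gbps in (18, 20, 21):
    print(f"{gbps} Gbps -> {128 / 8 * gbps:.0f} GB/s")  # 288, 320, 336

# Even the 21 Gbps bin is only ~17% above stock, so better-binned memory
# can soften the narrow-bus problem but never solve it.
```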
 
You'd only notice it at low framerates. The lower-performing the GPU, the more noticeable the added input latency of DLSS 3 will be. Once you get down to sub-60 FPS with DLSS 2, it becomes noticeable, as DLSS 3 essentially reverts your input latency right back to where it was at native resolution, sometimes worse, rarely better. In other words, if you were getting 60 FPS with DLSS 2 and switched to DLSS 3, your input latency would be about what ~30-40 FPS feels like.

You also seem to be confused about what exactly DLSS 3 is. DLSS 3 is DLSS 2 + FG. You don't need to clarify "FG"; it's the only distinguishing factor between DLSS 2 and 3.
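To put toy numbers on that (purely illustrative; actual latency depends on Reflex, the game, and frame-generation overhead):

```python
# Frame generation roughly doubles *displayed* FPS, but input is still
# sampled at the rendered rate, and one rendered frame is held back so
# the generated frame can be interpolated between two real ones.
base_fps = 40                           # what the GPU actually renders (DLSS 2)
render_frame_time_ms = 1000 / base_fps  # 25 ms per real frame
displayed_fps = base_fps * 2            # ~80 FPS shown with FG enabled
approx_latency_ms = render_frame_time_ms * 2  # ~50 ms: render + held-back frame

# The picture looks like 80 FPS while the inputs feel closer to 30-40 FPS,
# which is the "reverts your input latency" effect described above.
```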

You're evading my question. I bet you've never tried it yourself, yet you talk like you know more than people who have. Fine, keep thinking what you want, I don't care.
 
You're evading my question. I bet you've never tried it yourself, yet you talk like you know more than people who have. Fine, keep thinking what you want, I don't care.
Some people might value an independent professional and knowledgeable source of information over their own limited amateur viewpoint.
 
The 6800 XT at the same price point as this x60 Ti needs much more power. A 6600 XT is as weak a card as the x60 is; it's not even worth looking at, imho. The 6700 is in an okay place in that sense too, still doing OK without needing a PSU rated above its tier.

The x60 is also too expensive; it's a penny-wise, pound-foolish purchase. This gen you either go upper midrange or better, or you really shouldn't bother at all. Rather, get something dirt cheap second-hand and sit it out until 2025 or something.

The 6600 XT is in fact similar to my old 1080 in relative performance, or a 2060. It's ancient by these standards, near obsolete unless you drop the bar to the bottom. You're looking at medium settings at 1080p.
The 6600 XT is significantly faster than the 1080. TechSpot found that the RX 6600 is 20% faster than the 1080 at 1080p, so the 6600 XT should be 35-40% faster than the 1080.
 
Precisely.
How is it a 60 Ti? You're getting ahead of time. One could argue that it only earns 60-class status as a second-gen refresh on the same node. Relax.

RTX 4070/4070 Ti is 294 mm². The 4070 Ti was supposed to be the 4080, and the 4080 16GB was the oddball: kind of like the GTX 980 but way, way ahead of time, built to counter the weak RX 7900. It is meant to be the mobile 4090.
4070: same as GTX 670-680, 294 mm²
4070: same as GTX 1070-1080, 314 mm²

GTX 760-770: 294 mm², second gen on the same 28 nm node.
Then you have a third gen on the same node: GTX 970-980, 390 mm², and only then can you expect it to be down to earth as a third iteration (cough, 4080 16GB).
GTX 1660-1660 Ti: 284 mm², second gen on the same 12/16 nm node.

288 GB/s is +50% effective, so it's a 192-bit bus in disguise.
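A quick sanity check on that last line, taking the claimed ~50% effective-bandwidth gain from the large L2 cache at face value:

```python
raw = 128 / 8 * 18            # 288 GB/s raw on the 128-bit bus
effective = raw * 1.5         # 432 GB/s if the cache really adds ~50%
real_192bit = 192 / 8 * 18    # 432 GB/s: an actual 192-bit bus at 18 Gbps

assert effective == real_192bit  # hence "a 192-bit bus in disguise"
```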
 
It's worth pointing out that these are 4K results, far outside the scope, context, and capabilities of this paltry 128-bit card, which already chokes at 1440p in many of those titles regardless of whether it's the 8 GB or 16 GB variant. There's no getting around the damage caused by nvidia halving the memory bus width from last generation; it's truly brutal for a GPU with this much compute power to be held back by such a lame cost-cutting measure.

TPU rates it as so-so but fine even for 4K. The yellow colour means it depends on the settings and use case.

[Attached: TPU resolution suitability chart]


I believe that the sweet spot for potential RTX 4060 Ti owners is 1440p screens, but some enthusiasts may move to higher-quality 2160p screens.
 
The review doesn't paint the whole picture. Either the chip itself is so slow that it cannot make use of more than 8 GB, or nvidia cheats and automatically lowers image quality to fit in the available VRAM buffer.

I thought this was proven in several reviews. I didn't read everything, but I did see screenshots from multiple sources where some of the textures looked bad. Really bad.

If the 8 GB cards are showing signs of trouble now, even if it's in just one or two titles, how is it going to look in a year or two? Maybe it's a question of $100 extra today, or several times that in a year or so.

Let's hope we never find out. I hope every single card gathers dust on the shelves for all time as a nice reminder for the greedy. All of them, not only nVidia.
 
It needed to be released with a 256-bit bus and PCIe x16.
192-bit, x8, 12 GB, and this card would've been pretty good.
x16 for this class of product is pointless for the vast majority of users, and 256-bit plus x16 would've increased the price well beyond what it already costs now.
 
Thanks for the review, but you should include DLSS 3 frame-gen FPS in your game benchmark graphs for the games that support it, like Cyberpunk.

Who is going to buy a 4xxx-series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike and then only testing it with the motor off.

For people who are not familiar with the latest technologies and only look at graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while the 4060 crushes the 3070 with FG.

But at least this review shows properly, for all the people crying on the net about nvidia cards not having enough VRAM, that right now 16 GB and 8 GB on a midrange card deliver the same performance. Of course, in the following years more VRAM will be beneficial, the same way a 6 GB 1060 is way better than a 3 GB 1060 nowadays, but it won't come to that for a few years.

In other words, for people who change their GPU every 2-3 years there is no reason to pay an extra $100 for a 16 GB 4060.
Frame Generation is awful. Especially with mouse and keyboard. Enough said. I have used it, found out it was awful, then turned it off. That's my experience with the RTX 4070 and RTX 4080.

We have reached the "PS5 port to PC" part of the generation. It took a long time, but honestly most new games don't work right without the Ryzen 7800X3D and the RTX 4070 at minimum. Sucks. That's the entry point for quality PC gaming in 2023.
 
So you're telling me that wanting not to spend your entire net worth, just because manufacturers stopped caring about power consumption, is weird prioritising?

Everyone, even rich kids who can afford a high-tier card, has the right not to spend extra. But nVidia and AMD decided they're the only ones with that right.
I think the point was that power consumption isn't your first priority when buying a top-tier graphics card. It's like ordering the new Corvette ZR1 and asking the dealer about its fuel economy. No one does that, ever.
 
I don't understand how people can be so upset that their green cards suck so badly and at the same time make bold statements like "the current RX 7000 sucks just as badly" or "AMD is milking everyone". The hatred and disdain for AMD is just unreal when AMD has nothing to do with this piece of trash. It seems as if green-team fans are mad that AMD is offering more VRAM at a very competitive price, yet somehow "AMD is milking" everyone.

AMD RADEON cards that don't suck:

6700XT 12GB $300
6750XT 12GB $320
6800 16GB $430
6800XT 16GB $480
6950XT 16GB $570
7900XT 20GB $700
7900XTX 24GB $850

You have personal and deeper issues if you don't acknowledge the VRAM and price-to-performance on offer and somehow "missed out" on these deals.
 
I don't understand how people can be so upset that their green cards suck so badly and at the same time make bold statements like "the current RX 7000 sucks just as badly" or "AMD is milking everyone". The hatred and disdain for AMD is just unreal when AMD has nothing to do with this piece of trash. It seems as if green-team fans are mad that AMD is offering more VRAM at a very competitive price, yet somehow "AMD is milking" everyone.

AMD RADEON cards that don't suck:

6700XT 12GB $300
6750XT 12GB $320
6800 16GB $430
6800XT 16GB $480
6950XT 16GB $570
7900XT 20GB $700
7900XTX 24GB $850

You have personal and deeper issues if you don't acknowledge the VRAM and price-to-performance on offer and somehow "missed out" on these deals.
Especially if you consider that today this 7900 XT rivals a 4080 in raster performance for nearly half the price, or better, since an AIB 4080 easily goes for above 1200. I bought mine at 899 and I still can't say I'm unhappy. It's highly competitive, to say the very least. There is also definitely room now for a 7800 XT at around 600 with 16 GB and a 10-15% performance deficit. Even in RT these two cards would do just fine comparatively. Without proprietary BS as a bonus: no game-ready driver or DLSS 3 updates required, the performance is just there out of the box, as it should be.

It's a no-brainer to me, tbh.
 
But wouldn't faster system RAM help with performance somehow? I mean, DDR5-6000 provides 96 GB/s in dual channel, which is fairly decent, I'd say.
That's a third of the 4060 Ti's actual bandwidth. They could use system RAM to store distant textures and dedicated VRAM for everything in close proximity (if that's even possible?).
I know graphics cards already do this (borrowing system RAM), but I believe there's some unexplored potential here: the difference should be more significant going from 51.2 GB/s to 96 GB/s.
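For reference, here are the peak figures being compared, plus the link the GPU actually reaches system RAM over: the 4060 Ti's PCIe 4.0 x8 interface, which is the real limiter (a rough sketch with theoretical numbers):

```python
# Theoretical peak bandwidths, in GB/s
ddr4_3200_dual = 3.2 * 8 * 2   # 51.2: DDR4-3200, two 64-bit (8-byte) channels
ddr5_6000_dual = 6.0 * 8 * 2   # 96.0: DDR5-6000, dual channel
vram_4060_ti   = 128 / 8 * 18  # 288:  128-bit GDDR6 at 18 Gbps
pcie4_x8       = 8 * 2         # ~16:  PCIe 4.0 at ~2 GB/s per lane, 8 lanes

# However fast the DRAM is, texture traffic to the GPU is capped near 16 GB/s
# by the PCIe link, so faster system RAM alone can't close the gap to VRAM.
```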
 