
NVIDIA GeForce RTX 4060 Ti 16 GB

7900GRE performance is fine for a $500-600 target price, and it ought to be more power-efficient on smaller Navi32 silicon rather than the big Navi31's worst yields with tons of dead die area.

10-20% faster than a 6800XT at 10-20% lower power draw is a win. Pricing should (hopefully) be in line with current market price of RDNA2 - and it's no faster than the 6900XT which has been selling for sub-$600 for a long time now.

It's only competing against RDNA2 and Ampere. It's not competing against the 4070/Ti in this segment, because this far into the enthusiast price range, features like DLSS3 and RT performance are actually relevant. With ray tracing and DLSS3 in play, even comparing a 4070 to a 7900XTX is basically no contest: upscaling is required, so DLSS is worth having over FSR. Frame gen is a bonus on top of that which AMD doesn't have at all, and Nvidia's base RT performance is significantly ahead of AMD's, so the end result is an unplayable 20-30fps on one and a silky-smooth 80fps on the other, at better image quality to boot...

10% faster than a 6800 XT, bearing in mind that the 6800 XT is a Navi 21 processor that has 10% of its execution units (72/80 present) disabled, while offering practically no new features whatsoever, after 3 years is... a disaster, and that's being nice. It's no wonder that the 6950 XT beats it.
 
1. The logic I'm applying is that you're dishonestly comparing a previous-generation product that's going for clearance prices to a current-generation product that is indubitably priced high, but priced exactly where the manufacturer intended all along. Have you forgotten the 6950 XT's original $1199 MSRP?
I do remember, and at the time, I crapped all over it. It wasn't worth anywhere close to that, especially since it was only slightly faster than my RX 6800 XT, which cost a lot less. The difference was that, at launch, AMD was very candid about how the RX 6900 XT (and by extension the 6950 XT) was not a card for anyone who wanted any kind of value; that's what the RX 6800 XT was for. As far as the generation is concerned, what difference does it make? A new card with high performance is a new card with high performance, and it would be far more dishonest to ignore it than to talk about it. If you care about generation, then you're in a small minority, because the sales numbers of the RTX 4060 Ti vis-à-vis the sales numbers of the RX 6700 XT paint a picture that pretty much screams that nobody cares what generation something is; they care about what it can do for what they have to pay, and I don't blame them for that. For gamers, especially ones that don't have more money than brains, price-to-performance is very important.

Since you brought up the RX 6950 XT, let's look at it and compare it to the "that which should not be compared because it's a newer-gen" RTX 4070 Ti.
XFX Radeon RX 6950 XT MERC319 OC 16GB - $630 at Newegg
PNY GeForce RTX 4070 Ti 12GB - $790 at Best Buy
Performance delta between the two cards: the RTX 4070 Ti is all of 4% faster than the RX 6950 XT and has 4 fewer gigabytes of VRAM (TPU GPU database numbers). I'm also comparing a factory-OC XFX to what appears to be a vanilla PNY, so the performance difference is probably even less than that, but I don't care; I'll still say that it's 4%.

Now, I don't know about you, and I can only speak for myself in this situation, but... I would have a REALLY hard time spending an extra $160 (an extra 25%) for only 4% extra performance. To be fair, I'm not the least bit impressed with the current implementations of RT, and I have little to no interest in using the fake frames of DLSS3 or FSR3, because I'm not buying software here, I'm buying hardware. There's also the fact that I don't care about upscaling, because these are high-end cards, and by the time they need upscaling, DLSS, FSR and XeSS will be completely different from what they are now, just as they are now completely different from their iterations from three years ago. That's why I never understood people caring about what upscaler a card has when it's a high-end card. I mean, sure, DLSS can be a big deal, but keep in mind that both XeSS and FSR also look good if they're all that you have.
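For what it's worth, the arithmetic behind that value comparison can be written out explicitly. A quick sketch using the two street prices and the 4% TPU delta quoted in this post:

```python
# Price-vs-performance check for the two cards compared above.
# Prices are the street prices quoted in the post; the 4% delta is
# the TPU relative-performance figure cited there.
def value_delta(price_a, price_b, perf_gain_pct):
    """Return (extra cost %, extra performance %) of card B over card A."""
    extra_cost_pct = (price_b - price_a) / price_a * 100
    return extra_cost_pct, perf_gain_pct

extra_cost, extra_perf = value_delta(630, 790, 4)
print(f"Extra cost: {extra_cost:.0f}%, extra performance: {extra_perf}%")
```

(790 − 630) / 630 ≈ 25.4% more money for roughly 4% more speed, which is where the "extra 25%" figure comes from.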

Now sure, the RTX 4070 Ti probably uses way less electricity, but if people actually cared about that, nobody would be buying Intel 13th-gen CPUs, which use about twice as much juice as comparable Zen 4 Ryzens; but they do, so they don't. Not only that, even in the UK, one of the most expensive places in the world for electricity right now, it would take about 6 years just to break even on the power cost, and all costs are less painful when amortised over long periods of time.
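That break-even claim is easy to sanity-check. Here is a minimal sketch: the $160 price gap is from the comparison above, while the 100 W power difference, 3 hours of gaming a day, and 0.30/kWh tariff are my illustrative assumptions (price gap and tariff assumed to be in the same currency):

```python
# Break-even time for the more efficient card's price premium.
# price_gap is the $160 from the comparison above; the power delta,
# daily hours, and tariff are illustrative assumptions.
def breakeven_years(price_gap, watt_delta, hours_per_day, price_per_kwh):
    kwh_saved_per_year = watt_delta / 1000 * hours_per_day * 365
    return price_gap / (kwh_saved_per_year * price_per_kwh)

years = breakeven_years(price_gap=160, watt_delta=100,
                        hours_per_day=3, price_per_kwh=0.30)
print(f"Break-even after ~{years:.1f} years")
```

With these particular inputs it lands near 5 years; slightly different assumptions about usage or tariff put it around the 6-year figure mentioned above.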

This is why I don't care about what gen a card is (and from what I've seen, neither do most people) so calling that comparison dishonest is completely out of touch with what most people would consider simple reality. You can call it dishonest all you want, but believe it or not, it doesn't get more true just because someone keeps repeating it.
2. Good, buy the AMD card while you still can. I'm merely pointing out that this is the exception and not the norm, once RDNA 2 and Ampere stocks deplete you will not be able to do this
Absolutely. When the situation changes, my opinion will change to meet the new conditions. I'm only talking about right now.
3. Nvidia GPUs outsell AMD ones 9 to 1, it's irrelevant. Nvidia themselves have shown little interest in this 16 GB model, as I have mentioned earlier, it's more of an SKU they've released to prove a point than anything else.
So... You're saying that they released something pointless to prove a point? :roll:
Ok... I'll bite... What point was nVidia trying to prove? That they could make a worse release than the 8GB RTX 4060 Ti? (I'll grant them that they totally succeeded at that!) ;)
4. You could have linked to the video where HUB tested the 8 GB 3070 v. the 16 GB RTX A4000 to make a fairer point, as those cards share the exact same architecture and feature set, but like I mentioned on the forum before, the situation between the 4060 Ti 8 GB and 4060 Ti 16 GB is exactly the same as described in the aforementioned video. The 16 GB card is better when you need the memory... but with this anemic a configuration, it matters less than ever; it's just a bad card all around
I couldn't agree more. It's the worst version of the card that itself is likely the worst release in GPU history. I'd like to say that it can't get worse but I don't want to jinx anything.
5. Never said you didn't or did dump on the 6500 XT, and my point wasn't to demean it as a product, just to say that Nvidia committed the same reckless cost-saving measure that cost this card the performance it could have had; the 64-bit bus in Navi 24 is the primary bottleneck even on such a small die
Well I sure as hell have no problem demeaning the RX 6500 XT, because as a product, it's incomplete. It's like back in the day when Hyundais were awful. Sure, they were junk, but since they were dirt cheap, they still sold like crazy. The RX 6500 XT is a very niche product that doesn't have enough VRAM, has a PCIe 4.0 x4 connection and is about as potent as a GTX 980. If you want to sell a glorified video adapter, I don't have a problem with that. What I did have a problem with was AMD trying to market it as a gaming card and pricing it as such. I realise that they were just trying to get something out there, but they would've been better off with cut-down versions of the RX 6600.
6. Agreed that the prices are obnoxiously high; at $500 this GPU is simply not worth the price of admission, even if the last-gen 6800 XT and RTX 3080 weren't, for the time being, available at this price
Yep. The fact that the RX 6800 XT is available at $500 (I haven't seen a new RTX 3080 in forever) only makes things even worse.

I find it hilarious and annoying at the same time watching Steve make up excuse after excuse for how the nVidia GeForce RTX 3070 8GB didn't suck, even though his own data says the 6800 is the superior card. He also chose the 3070 over the 6800 in his day-one review because MuH rAy tRaCiNg.
I know, and in some cases he dismissed the VRAM disparity because "We only review for today." which made me scratch my head and think "Where is the Steve Walton that I know and love from so many years ago?". :(

Cards need to be readily available for them to be relevant in a comparison, and the 6800 XTs are slowly running out. It's a while-stocks-last thing, and on top of that, local availability can be worse, making the price of such highly competitive cards skyrocket. I see them go for over 800 EUR here for some AIB models, and a rare couple over 1200 (!!). Theory vs. practice...
When the stock runs out, then they won't be readily available and I'll be totally in agreement with you. Right now though... They are an obstacle to both companies and I'm glad that they are because it gives consumers a better option than to buy the crap that they're putting in front of us and it's forcing the prices down. Years from now, we might be thanking RDNA2 for having blocked nVidia and AMD from just charging whatever they felt like.
They really should be pushing that 7800 (XT) button just about now; the momentum is there given how poorly the 8GB cards have turned out. Even a 12GB 7700 would be marvellous... if they position it at $450, it will sell. But like @Beginner Micro Device stated... Always Mighty Delayed...
That RX 7800 XT is going to be a pointless product because if the RX 7900 GRE is comparable to the RX 6800 XT, then the RX 7800 XT is going to be inferior to the RX 6800 XT and AMD's going to have to sell it for no more than $450. Even if they do that, they'll still have lots of egg on their faces for releasing a card that was inferior to its predecessor.

It's like I said, if you thought that the review for the RTX 4060 Ti 8GB was the worst that you've ever seen (and it certainly was the worst that I can remember), just wait, because the 16GB review will be even worse.

Sure enough...

I swear man, the ones in charge over at nVidia must be smoking moon rocks because putting this card up against the RX 6800 XT is asking for a Romulan Bloodbath (green blood will flow).
You know, I think that Daniel Owen has been reading my "Romulan Bloodbath" posts because...

This is why I don't understand anyone who calls the RX 6800 XT "irrelevant" to the RTX 4060 Ti 16GB, because it's obviously relevant while it's still out there! :laugh:
 
I think you and most reviewers are failing to see the point of the 16GB of VRAM. This is not a gaming GPU; I think this is Nvidia's entry-level generative AI GPU. With 16GB of VRAM you get to run up to a 13B-parameter LLM, and you'll have no issues with VRAM while using Stable Diffusion; even the larger SDXL models will work fine. Generative AI is about VRAM capacity, not VRAM speed, because you need to load the entire model into the available VRAM. Your next bet for a generative AI GPU after this is the much pricier 4080 or even the absurd 4090.

Not saying it's good value or a wise decision, but I think there is reason behind the madness...
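For context on the 13B figure: a rough rule of thumb is that weights take roughly parameter count × bytes per parameter, plus some overhead for activations and the KV cache. A minimal sketch (the 20% overhead figure is my assumption, not a measured value):

```python
# Back-of-the-envelope VRAM estimate for running an LLM locally:
# weights ≈ parameters × bytes per parameter, plus overhead for
# activations and KV cache (the 20% overhead is a rough assumption).
def vram_gb(params_billion, bytes_per_param, overhead=0.20):
    return params_billion * bytes_per_param * (1 + overhead)

for name, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"13B @ {name}: ~{vram_gb(13, bpp):.1f} GB")
```

By this estimate, fp16 weights alone blow well past 16 GB, so it's 8-bit or 4-bit quantization that makes a 13B model fit on this card.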
 
The review doesn't paint the whole picture. Either the chip itself is so slow that it can't make use of more than 8 GB, or Nvidia cheats and automatically lowers image quality to fit within the available VRAM buffer.

There is a difference if you know where to look.

View attachment 306114
Boring, only a 6 FPS difference here; if I were to build 2 identical rigs and game on them, you couldn't tell me which one is the higher-FPS rig. VRR makes this even more difficult to judge. And what is the sigma deviation/range here?

NV marketing has a bunch of peeps brainwashed.
 
You know, I think that Daniel Owen has been reading my "Romulan Bloodbath" posts because...

This is why I don't understand anyone who calls the RX 6800 XT "irrelevant" to the RTX 4060 Ti 16GB, because it's obviously relevant while it's still out there! :laugh:
That guy harps on RT and productivity too much for a gaming channel, and now he's gotten big with over 100K subs. He's just like HUB, picking RT over VRAM and raw performance.
 
This, is not a gaming GPU I think this is Nvidia's entre level Generative AI GPU.
It is still a gaming GPU, with plenty of potential to play upcoming games, which will be more VRAM-hungry and will use frame gen routinely.
Frame gen plus an additional 8GB is a perfect combo, even on a 128-bit bus. This is simply a card for the masses still playing at 1080p. The price is adequate.
People and reviewers made their decision too soon.
It's also perfect for non-gaming industries, as you mentioned.
 
That RX 7800 XT is going to be a pointless product because if the RX 7900 GRE is comparable to the RX 6800 XT, then the RX 7800 XT is going to be inferior to the RX 6800 XT and AMD's going to have to sell it for no more than $450. Even if they do that, they'll still have lots of egg on their faces for releasing a card that was inferior to its predecessor.

What amazes me is that you actually reached the conclusion I was pushing you towards yet you still felt a compulsive need to justify it for AMD. Romulan bloodbath, eh
 
That RX 7800 XT is going to be a pointless product because if the RX 7900 GRE is comparable to the RX 6800 XT, then the RX 7800 XT is going to be inferior to the RX 6800 XT and AMD's going to have to sell it for no more than $450. Even if they do that, they'll still have lots of egg on their faces for releasing a card that was inferior to its predecessor.


The GRE is not comparable to the 6800 XT; why would it be? It's noticeably faster even if it just matches a 4070 Ti, and first results show it to be faster than a 4070, which is about equal to a 6800 XT.

There is 35% of performance there to play with; that's more than a full tier of GPU (which I consider to be a 25-30% gap). It also won't require a larger-than-life PSU to fight transient spikes.
 
It is still a gaming GPU, with plenty of potential to play upcoming games, which will be more VRAM-hungry and will use frame gen routinely.
Frame gen plus an additional 8GB is a perfect combo, even on a 128-bit bus. This is simply a card for the masses still playing at 1080p. The price is adequate.
People and reviewers made their decision too soon.
It's also perfect for non-gaming industries, as you mentioned.
You could make an argument for AI, but for gaming, you would be better off going with a 4070 or a RDNA2/RDNA3 equivalent.
 
You could make an argument for AI, but for gaming, you would be better off going with a 4070 or a RDNA2/RDNA3 equivalent.

Except the 4070 is two hundred dollars more, a full 50% more expensive for a single step up the ladder. And it's not even close to 50% faster. If the 4060 Ti is truly as poor a value as everyone's saying, it's weird to recommend a card whose price/performance ratio is even worse.
 
Except the 4070 is two hundred dollars more, a full 50% more expensive for a single step up the ladder. And it's not even close to 50% faster. If the 4060 Ti is truly as poor a value as everyone's saying, it's weird to recommend a card whose price/performance ratio is even worse.
The poster I replied to was praising the 4060 Ti 16 GB, not its cheaper sibling. That is only $100 less than the 4070, and the gulf in performance between them is far more than 20%. In fact, two variants of the 4070 are available for $590 right now, and others have sold for a bit less than that recently.

1691003020183.png
 
You could make an argument for AI, but for gaming, you would be better off going with a 4070 or a RDNA2/RDNA3 equivalent.
A 4070 or an RDNA2/RDNA3 equivalent is a good choice for people who plan to sell it in 2-3 years, or simply put, who change cards more often.
But for a long-term strategy (6+ years), 16GB and frame-gen support are the better bet at the price we see now on the 4060 Ti 16GB.

The 4060 Ti can easily make up a 30% performance deficit to the 4070 by adjusting game settings. To cut VRAM consumption by 4GB (16 -> 12), we would need to sacrifice a lot more settings (especially in new games six years from now).
My prediction is that the existence of a 16GB version in the low-end segment will change how developers set minimum VRAM recommendations in the future. Games are already big now; there is no problem allocating even 24GB of VRAM in today's games... because games (mostly open worlds) are constantly streaming data from the SSD, e.g. CP77 at ~30MB/s.
 
A 4070 or an RDNA2/RDNA3 equivalent is a good choice for people who plan to sell it in 2-3 years, or simply put, who change cards more often.
But for a long-term strategy (6+ years), 16GB and frame-gen support are the better bet at the price we see now on the 4060 Ti 16GB.

The 4060 Ti can easily make up a 30% performance deficit to the 4070 by adjusting game settings. To cut VRAM consumption by 4GB (16 -> 12), we would need to sacrifice a lot more settings (especially in new games six years from now).
My prediction is that the existence of a 16GB version in the low-end segment will change how developers set minimum VRAM recommendations in the future. Games are already big now; there is no problem allocating even 24GB of VRAM in today's games... because games (mostly open worlds) are constantly streaming data from the SSD, e.g. CP77 at ~30MB/s.
The gap between the 4060 Ti and the 4070 is bigger than the gap between the 7900 XTX and the 4090. Even after adjusting settings, it would still be slower than a 4070 at the same settings. I reiterate: for gaming, this SKU makes no sense. The only use would be AI models that hunger for VRAM.

I agree with one point. With time, games will use more VRAM, but given that both AMD and Nvidia have opted to equip their mass-market GPUs with only 8 GB, the time for 16 GB isn't here yet.
 
The gap between the 4060 Ti and the 4070 is bigger than the gap between the 7900 XTX and the 4090. Even after adjusting settings, it would still be slower than a 4070 at the same settings. I reiterate: for gaming, this SKU makes no sense. The only use would be AI models that hunger for VRAM.

I agree with one point. With time, games will use more VRAM, but given that both AMD and Nvidia have opted to equip their mass-market GPUs with only 8 GB, the time for 16 GB isn't here yet.
Why are you comparing that gap with another gap (7900 XTX / 4090)?
 
To show how big the gap is. It's also bigger than the gap between the 4060 and the 4060 Ti.
View attachment 307379
Comparing gaps like you did does not make sense.

The point is, in the long term or very long term (x060 gamers), the 4070 with 12GB will be useless. It is better to grab a slower card with larger VRAM. You do not have many options to decrease VRAM usage in games, but you have tons of options to increase fps (core performance).
 
Comparing gaps like you did does not make sense.

The point is, in the long term or very long term (x060 gamers), the 4070 with 12GB will be useless. It is better to grab a slower card with larger VRAM. You do not have many options to decrease VRAM usage in games, but you have tons of options to increase fps (core performance).
I don't understand what is nonsensical about showing that the 4060 Ti 16 GB is the worst gaming GPU based upon the fps-per-$ metric. If you're a gamer worried about VRAM, then buy a 7800 next month. I agree that lowering VRAM usage isn't easy, and the option that lowers it most is the one that harms image quality the most: texture quality.
 
I don't understand what is nonsensical about showing that the 4060 Ti 16 GB is the worst gaming GPU based upon the fps-per-$ metric. If you're a gamer worried about VRAM, then buy a 7800 next month. I agree that lowering VRAM usage isn't easy, and the option that lowers it most is the one that harms image quality the most: texture quality.
The fps-per-$ metric gives us a false illusion of product value. It has a very narrow field of view in the decision process. It mostly reflects software performance, not hardware performance, features, and benefits. It can be used for a short-term scenario with a specific player's needs in mind.
The 4060 Ti 16GB is a valuable product as a long-term strategy at the price we see now. The short-term fps/$ metric is not able to capture this value.

The 7800 is out of the running, as it still has no frame-gen possibility. Again, a false illusion for future games (it is not only about VRAM; everything counts if you are making a complex decision).
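For what it's worth, the fps-per-$ metric being argued over here is trivial to compute, and writing it out shows its narrowness: it collapses a card to a single ratio and VRAM never enters the formula. The fps, price, and VRAM figures below are made-up placeholders, not benchmark results:

```python
# fps-per-dollar is just average fps divided by street price.
# Note that VRAM never appears in the formula, which is exactly
# the narrowness being debated above. All numbers are illustrative.
def fps_per_dollar(avg_fps, price):
    return avg_fps / price

# (avg fps, price $, VRAM GB) -- hypothetical figures
cards = {
    "Card A (8 GB)":  (60, 400, 8),
    "Card B (16 GB)": (62, 500, 16),
}
for name, (fps, price, vram) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps/$ "
          f"({vram} GB ignored by the metric)")
```

A longevity-weighted version would need assumptions about future VRAM requirements, which is where the two sides of this argument part ways.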
 
I think you and most reviewers are failing to see the point of the 16GB of VRAM. This is not a gaming GPU; I think this is Nvidia's entry-level generative AI GPU. With 16GB of VRAM you get to run up to a 13B-parameter LLM, and you'll have no VRAM issues while using Stable Diffusion; even the larger SDXL models will work fine. Generative AI is about VRAM capacity, not VRAM speed, because you need to load the entire model into VRAM. Your next bet for a generative AI GPU after this is the much pricier 4080 or even the absurd 4090.

Not saying it's good value or a wise decision, but I think there is reason behind the madness...
I kinda think that the reason behind the madness is that Jensen knows that there are a lot of people who will buy a card in a green box no matter how good or bad the value is. I'm pretty sure that the reason behind all of nVidia's madness is profit, pure and simple.

It makes no sense to have the 4060 Ti have 16GB of VRAM but the 4070 and 4070 Ti, two cards that would be a lot more useful for AI, only have 12.

That guy harps on RT and productivity too much for a gaming channel, and now he's gotten big with over 100K subs. He's just like HUB, picking RT over VRAM and raw performance.
You're right, he does, but at the same time, he had no qualms about showing everyone the impossible position that the RTX 4060 Ti 16GB was forced into by nVidia. He showed that he has much bigger cojones than HUB because they somehow managed to "neglect" to do this comparison. There's no excuse for that because the comparison was immediately obvious to anyone with more than two brain cells to rub together and HUB isn't that stupid.

I'm sure that there's another reason why they didn't do that comparison. Based on Daniel Owen's results, we can be sure that the only ones who wouldn't be pleased with such a comparison would be nVidia. Now, I don't know for certain why HUB didn't do that ridiculously obvious comparison, but when one does the math...

What amazes me is that you actually reached the conclusion I was pushing you towards yet you still felt a compulsive need to justify it for AMD. Romulan bloodbath, eh
What are you talking about? You weren't pushing me anywhere and I didn't justify a damn thing for AMD. I predicted a bloodbath and since nVidia is green, I ironically called it a "Romulan Bloodbath".

You're really starting to make me wonder about you. :kookoo:

The GRE is not comparable to the 6800 XT; why would it be? It's noticeably faster even if it just matches a 4070 Ti, and first results show it to be faster than a 4070, which is about equal to a 6800 XT.

There is 35% perf there to play with, that's more than a full tier of GPU (which I consider to be a 25-30% gap). It also won't require a larger than life PSU to fight transient spikes.
That's not exactly what I meant. What I meant was that there's only a 20% performance difference between it and the RX 6800 XT. Where does that leave the RX 7800 XT? People are going to expect more than a 10% performance uplift between the 7800 and 7900 cards. Even if the RX 7800 XT is 10% faster than the RX 6800 XT, they're going to have to price it pretty low (like $450-$500) if it's going to sell, because the market is currently saturated around the $500-$600 price point.

There just isn't enough of a performance delta for the RX 7800 XT to have a proper place in the market because the RX 6950 XT still exists. AMD knew this all too well, which is why they released the RX 7600 first. They were hoping that the RX 6800 XT and RX 6950 XT would sell out before they were forced to release the RX 7800 XT, but that hasn't happened.

If it is 10% faster than the RX 6800 XT, then it would be compelling at $500 but not so much at $550, because it would be 10% more expensive for 10% more performance. That's not exactly something to get excited about, especially if the RX 6800 XT's price gets cut even further. At $450, it would be the de facto market leader and would be the most praised card of this generation regardless of what colour box it comes in.

Of course, that would require foresight and AMD has demonstrated the exact opposite of that for this entire generation so far. :kookoo:

AMD will shoot themselves in the foot yet again. I'd like to say that it's stupidity but they're not stupid people. This whole situation was caused by an anti-consumer play by AMD that they failed to get away with and I'm really glad that they didn't. AMD is going to suffer pretty badly this generation and it's 100% their fault. I'm just going to sit back and laugh at them as they receive exactly what they deserve.

If the RX 7800 XT releases at $450, then I'll be the first to happily take back everything that I said. I would love to be wrong here but I wouldn't even bet a dime on it.
 
What are you talking about? You weren't pushing me anywhere and I didn't justify a damn thing for AMD. I predicted a bloodbath and since nVidia is green, I ironically called it a "Romulan Bloodbath".

You're really starting to make me wonder about you. :kookoo:

For a bloodbath (or at the very least, a much needed price war) to occur, first AMD needs to have a better product, and then they need to sell that product to the masses and earn the mindshare of the people.

Currently, they do not meet any of these requirements. But don't worry, AMD's been granted another generational reprieve, the AI nuts are gonna buy up all the Radeon stock that gamers won't, so maybe RDNA 4 will be a better AI accelerator if not a better gaming card. They'll persist. :)

 
For a bloodbath (or at the very least, a much needed price war) to occur, first AMD needs to have a better product, and then they need to sell that product to the masses and earn the mindshare of the people.

Currently, they do not meet any of these requirements. But don't worry, AMD's been granted another generational reprieve, the AI nuts are gonna buy up all the Radeon stock that gamers won't, so maybe RDNA 4 will be a better AI accelerator if not a better gaming card. They'll persist. :)

That piece about one AI-focused company buying up Radeons could be troublesome if it actually turns into a trend.
 
That piece about one AI-focused company buying up Radeons could be troublesome if it actually turns into a trend.

The AI trend is only starting, and it may grow larger than the crypto one ever did. Once Ethereum went proof-of-stake, GPU mining practically died overnight. Prices still haven't fully normalized, especially at the budget end, which is flooded with expensive overstock purchased by distributors during the price high, and with low-quality recycled hardware and mining refuse. This has also kept the performance segment (4060, 4060 Ti, RX 7600) and the budget segment (RX 6400, RX 6500 XT, GTX 1630 and 1650, Arc A380) at basically twice the price they should currently be at. GPUs from the Arc A750 up have already been priced at near-normal (pre-mining, pre-pandemic) levels; usually the markup is no more than 25-30%.

It was one of the things I took into consideration when I decided to sell my 3090 and upgrade. I couldn't bear having it for another full generation with no hope of upgrading due to even crazier prices, and were I a betting man, I would say that this will happen within the year.
 
The fps-per-$ metric gives us a false illusion of product value. It has a very narrow field of view in the decision process: it mostly reflects software performance, not hardware performance, features, and benefits. It can work for short-term scenarios with specific players' needs.
The 4060 Ti 16GB is a valuable product in a long-term strategy at that price. The short-term fps/$ metric can't capture that value.

The 7800 is out of the running; still no frame-gen capability. Again, a false illusion for future games (it's not only about VRAM; everything counts when you're making a complex decision).
No, the 16GB 4060 Ti still has far too little bandwidth to properly use the large framebuffer in most games. This product is handicapped beyond just VRAM capacity. Compare that to the 4070 and you know its 12GB is a much better offer even at a somewhat higher price. Still not optimal, but certainly not as poor as this 4060 Ti, which is a total dud.
 
No, the 16GB 4060 Ti still has far too little bandwidth to properly use the large framebuffer in most games. This product is handicapped beyond just VRAM capacity. Compare that to the 4070 and you know its 12GB is a much better offer even at a somewhat higher price. Still not optimal, but certainly not as poor as this 4060 Ti, which is a total dud.
Check the memory bus utilization during gameplay; there is a big reserve. Frame gen and the L2 cache make the reserve even bigger.
The main purpose of VRAM is fast storage for game assets. Most of the hard work is done using caches. There's no need to worry that low bandwidth can't properly use a framebuffer (of any size).
Loading assets through PCIe is way slower. Having larger VRAM is simply more beneficial, especially for open-world games.
Owners of the 4060 Ti 16GB will be able to play modern titles easily even after 5-7 years. Of course, only at 1080p. But that is the point: $500 now, many years of play in the future.

I was watching CP77 gameplay on a GTX 1060 6GB @ 1080p. That card still isn't penalized for its VRAM capacity, and imagine if there were frame gen for that card. I see a very bright future for the 4060 Ti 16GB, much brighter than the 4070 12GB's.
 
Check the memory bus utilization during gameplay; there is a big reserve. Frame gen and the L2 cache make the reserve even bigger.
The main purpose of VRAM is fast storage for game assets. Most of the hard work is done using caches. There's no need to worry that low bandwidth can't properly use a framebuffer (of any size).
Loading assets through PCIe is way slower. Having larger VRAM is simply more beneficial, especially for open-world games.
Owners of the 4060 Ti 16GB will be able to play modern titles easily even after 5-7 years. Of course, only at 1080p. But that is the point: $500 now, many years of play in the future.

I was watching CP77 gameplay on a GTX 1060 6GB @ 1080p. That card still isn't penalized for its VRAM capacity, and imagine if there were frame gen for that card. I see a very bright future for the 4060 Ti 16GB, much brighter than the 4070 12GB's.
The 1060 6GB has about 200GB/s, while the 4060 Ti is 162% faster with a mere 288GB/s.

It's definitely not going to be able to use all that core advantage with even 2x the memory capacity of a 1060 filled up. The cache isn't large, so it will simply choke on complex scenes when more data is needed at once. You say it right: 1080p only, yes. But even then, there is more data. The VRAM required by games at 1080p versus 4K often differs by no more than 2GB. The 16GB will help in games that exceed 8GB, but it won't resolve the lack of bandwidth everywhere else.

The reason the 1060 6GB isn't penalized for its 6GB is that every game pegs the core at 100% anyway, and that card has relatively solid bandwidth to feed it. Comparatively, it has more real bandwidth than whatever the 4060 Ti has in bandwidth + cache advantage. The same thing applies throughout Pascal's stack. The 1080 already sits at 320GB/s while being just 60%-ish faster than a 1060 6GB, and here's the kicker: if you OC a 1080's memory to 11Gbps (a +500 on the rev. 1; every single one can do it), you gain real frames, consistently. So even that card still has core to spare for its given bandwidth. It will choke only when games exceed its framebuffer and require large amounts of new data on top, a situation that applies, for example, to TW Warhammer 3 at 1440p ultra/high and up, which will readily use north of 10GB of VRAM.
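That bandwidth-per-performance argument can be sketched with rough numbers. The relative-performance figures below are this thread's own estimates (1060 6GB = 1.0, 4060 Ti "162% faster", 1080 "60%-ish faster"), not measurements; the stock bandwidths are 192GB/s (1060 6GB, 8Gbps on 192-bit), 288GB/s (4060 Ti) and 320GB/s (1080, 10Gbps GDDR5X on 256-bit), and the 4060 Ti's large L2 cache is deliberately ignored:

```python
# GB/s of raw memory bandwidth per unit of 1060-relative performance.
# rel_perf values are the thread's rough estimates, not benchmarks; the
# 4060 Ti's 32MB L2 cache (which offsets raw bandwidth) is ignored here.
cards = {
    "GTX 1060 6GB": {"bw": 192, "rel_perf": 1.00},
    "RTX 4060 Ti":  {"bw": 288, "rel_perf": 2.62},  # "162% faster"
    "GTX 1080":     {"bw": 320, "rel_perf": 1.60},  # "60%-ish faster"
}

def bw_per_perf(card: dict) -> float:
    """Raw bandwidth divided by relative performance."""
    return card["bw"] / card["rel_perf"]

for name, card in cards.items():
    print(f"{name}: {bw_per_perf(card):.0f} GB/s per unit of performance")
```

In this sketch the 4060 Ti ends up with roughly 110GB/s per unit of performance versus the 1060's 192, which is the imbalance being described; how much of that gap the L2 cache recovers depends on the workload.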

As for bright futures, I think they're both pretty weak offers; the 4060 Ti and the 4070 are both priced too high. With the 4060 Ti you spend $500 for what is virtually entry-level hardware.
 