
NVIDIA GeForce RTX 4060 Ti 16 GB

No.
AMD has nothing to replace these cards with. They are still in production and will be in production for a very long time.
You can undervolt the RX 6800 and approach the RTX 4060 Ti power consumption.
The 6800 XTs are definitely running thin over here in NL. A few are available at around 560 EUR at this time; some are priced at double that for reasons unknown. It signifies they're EOL. The 6800 and 6700 are still better stocked, though.
 
But wouldn't faster system RAM help with performance, somehow? I mean, DDR5-6000 provides 96 GB/s in dual-channel - fairly decent, I'd say.
It's 1/3 of what the actual bandwidth of the 4060 Ti is. They could use the system RAM to store textures in the distance, and dedicated VRAM for the closer proximity (if that's possible?).
I know graphics cards do this (taking system RAM), but I believe there's some unexplored potential here: differences should be more significant going from 51.2 GB/s to 96 GB/s.
No it won't as the GPU can't access that RAM at anywhere close to 96 GB/s. The 4060 Ti has 8 lanes of PCIe 4 so it won't be able to access system RAM at more than 15.75 GB/s. Real bandwidth will be lower than that.
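If anyone wants to sanity-check those numbers, here's a rough back-of-the-envelope calculation in Python (theoretical peak rates only; real transfers will be lower, and I'm assuming the 4060 Ti's stock 18 Gbps GDDR6 on its 128-bit bus):

```python
# Back-of-the-envelope peak bandwidth figures (theoretical, no overhead).

# DDR5-6000, dual channel: 6000 MT/s * 8 bytes per 64-bit channel * 2 channels
ddr5_dual = 6000e6 * 8 * 2 / 1e9                  # ~96 GB/s

# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding, 8 lanes (per direction)
per_lane = 16e9 * (128 / 130) / 8 / 1e9           # ~1.97 GB/s per lane
pcie4_x8 = per_lane * 8                           # ~15.75 GB/s

# RTX 4060 Ti VRAM: 18 Gbps GDDR6 across a 128-bit bus
gddr6_4060ti = 18e9 * 128 / 8 / 1e9               # 288 GB/s

print(f"DDR5-6000 dual channel : {ddr5_dual:7.2f} GB/s")
print(f"PCIe 4.0 x8 link       : {pcie4_x8:7.2f} GB/s")
print(f"4060 Ti GDDR6, 128-bit : {gddr6_4060ti:7.2f} GB/s")
```

So even ideal DDR5-6000 sits at a third of the card's own VRAM bandwidth, and the PCIe 4.0 x8 link it would actually have to cross is another 6x slower than that.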
 
But wouldn't faster system RAM help with performance, somehow? I mean, DDR5-6000 provides 96 GB/s in dual-channel - fairly decent, I'd say.
It's 1/3 of what the actual bandwidth of the 4060 Ti is. They could use the system RAM to store textures in the distance, and dedicated VRAM for the closer proximity (if that's possible?).
I know graphics cards do this (taking system RAM), but I believe there's some unexplored potential here: differences should be more significant going from 51.2 GB/s to 96 GB/s.
Not when you're limited to PCI-e x8.

No it won't as the GPU can't access that RAM at anywhere close to 96 GB/s. The 4060 Ti has 8 lanes of PCIe 4 so it won't be able to access system RAM at more than 15.75 GB/s. Real bandwidth will be lower than that.
Oops, didn't see your comment before posting. Sorry. :ohwell:
 
I bet you've never tried it yourself
I have a 4090 & I've tried FG & I was also very underwhelmed. It was really interesting to try: when the base fps is low (the fps before FG slots in the generated frames), I could really feel the input lag, yet I was still seeing a pretty smooth image. To me it felt a bit like gaming on an old TV with high input lag - the image looks smooth but everything just feels a bit like it's in jelly.
It's a nice option to have & I hope a version that's supported by all GPUs comes soon, but for me personally it's only useful when I've got a decent frame rate already anyway (100+ fps).
 
@W1zzard Sorry if it was asked before, but are there any notable differences in PCB complexity (number of layers etc.) between the 8GB and 16GB versions? I've read that the PCB complexity is also a big factor in the price increase, according to some sources. Which is complete nonsense IMO, but I'd like to confirm.
 
I've read that the PCB complexity is also a big factor in the price increase
Yes, if we compare the PCBs of a 3090 vs a 3060. In this case, the difference is maybe $5, if it exists at all.
 
The 6800 XT is irrelevant to the matter at hand; it is a much higher-segment, previous-generation card built on a far larger and much more advanced GPU that draws twice as much power, so of course both it and the RTX 3080 are going to clobber the 4060 Ti.
I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.
The price analogy doesn't work very well because neither of those cards are manufactured any longer, availability is relying on leftover stock which hasn't been sold yet. In essence, these cards don't matter. Stocks of any remaining new units are depleting fast.
Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.
I'll go out on a limb and say that the ones smoking moon rocks were those kvetching about Nvidia not putting enough VRAM on their GPUs, chiefly, an AMD camp complaint.
Does this look like AMD to you?
It sure doesn't look like AMD to me.
Nvidia's own lack of interest in this SKU makes it look like they just put this out to prove naysayers wrong.
Well, they failed in that endeavour because their sales numbers are in the toilet. The amount of faith that they and their AIBs have in the RTX 4060 Ti is so low that they didn't even sample it out or give it any fanfare. It's like when they silently released the RTX 3060 8GB.
Of course, convenient detail to hide @Vayra86's excellent point of "the GPU is only as strong as its weakest link", and this cutdown AD106 on 128-bit should never have been sold as anything other than an RTX 4050, but that's a problem this entire generation is facing.
I haven't seen the 128-bit bus preventing the RTX 4060 Ti from addressing and using all 16GB of its VRAM. What I have seen, though, is that it suffers bigger performance drops when the resolution is increased than cards with buses wider than 128-bit.
I just find it bizarre that this card has effectively the same fundamental design flaw as the RX 6500 XT: an overreliance on cache to make up for extremely anemic memory bandwidth... that is to say, both are low-end, power-efficient chips you'd otherwise find in a budget laptop owned by your average League of Legends player.
Oh don't kid yourself, I personally dumped all over the RX 6500 XT. Whoever it was at AMD who thought that the RX 6500 XT was a good idea should've been fired. However, it had its "moment in the crapper" long ago. Now it's the RTX 4060 Ti's turn. It's not about who did what, it's about who did what and when.

I actually bought an RX 6500 XT. Not because I thought that it was any good but because I thought that an R9 Fury in my mother's HTPC was a waste of electricity. She doesn't need a 275W card to be a glorified video adapter and Canada Computers happened to have a Powercolor Radeon RX 6500 XT ITX model on clearance for what was at the time, the equivalent of ~$113USD. At that price, even the RX 6500 XT looks good and since she doesn't need a hardware encoder, a 3D accelerator or more than 4GB of VRAM (she wouldn't know what to do with any of those anyway), at that price, it was perfect for her.

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.
 
Whilst performance-wise you're more correct than not, there are other things.

0. All Ada Lovelace products, and especially the lower-tier ones, lack a reasonable amount of VRAM and, most importantly, bandwidth. This shifts their real tier one tier lower.
1. All Ada Lovelace products are more cut down than their predecessors. The 3060 represents the same share of Ampere's potential as the 4070 does of Ada's. But hey, the 4070 suddenly costs almost twice as much despite being nowhere near twice as fast!
2. All Ada Lovelace products have been launched at a time when almost nobody wants a GPU. There is little demand, and most would-be buyers are now either picky, or broke, or both. There are almost no miners around any more with their "please give us another million GPUs at whatever price". They're gone. nVidia has to consider this fact as well.
3. If everyone submits to nVidia's policy and pays what they ask for, they will see no reason for improvement. Now, seeing the worst demand ever, they are forced to think. The mining-fever hangover is still in the air but it will fade away some day. And then nVidia will sober up their pricing, unless AMD and Intel fail to achieve anything. And they are currently achieving very little, with only two meaningful products available, namely the A770 from Intel and the RX 7900 XTX from AMD. The rest is uncompetitive. Or non-existent.
4. You can't take a GPU's naming seriously if it is not faster than its predecessor in every possible scenario. And yes, all 4060-series GPUs do lose to their 3060-series predecessors in certain tasks, which has never happened before.

Ya, Ada 60 should, or at least could, be better and/or cheaper. As mentioned, I can see the justification behind the argument for the 4060 being a 4050 ti (and/or 4060 ti being a 4060), and the memory bandwidth definitely seems to be hampering AD10[6,7]. When it comes to model naming, performance matters. Nvidia's not going to throw anyone a bone unless absolutely forced. New take (but probably not an original one) after mulling this whole thing over for another day: AD10[6,7] are bandwidth-constrained on purpose. A 192-bit AD60 family may have been too competitive with the 4070, and cannibalized those higher-margin sales. If AD70 is moving slower than NV would like, the above would make some sense. Beyond that, lots of buyers don't cross-shop brands, so NV might figure they can save a few bucks on the BOM without driving said buyers AMD's way.

Now for my argument against the 4060 ti being a 50-series card. The 4060 fits the mold in some ways (power, bus width), but the ti just doesn't. The sole thing it shares is memory bus. It literally DOUBLES contemporary performance relative to pre-RTX x50 cards, and clobbers the 3050 (which, btw, performed on par with pre-RTX x60 parts) by a full 50%.

Model     Price   Watts   VRAM   Bus width   Resolution   Avg FPS
550 Ti    $150    116     1 GB   128-bit     1680x1050    51
650 Ti    $150    110     1 GB   128-bit     1680x1050    50
750 Ti    $150    60      2 GB   128-bit     1600x900     58
950       $160    90      2 GB   128-bit     1600x900     65
950       $160    90      2 GB   128-bit     1080p        52
1050 Ti   $140    75      4 GB   128-bit     1080p        48
1650      $150    75      4 GB   128-bit     1080p        55
3050      $250    130     8 GB   128-bit     1080p        85
4060      $300    115     8 GB   128-bit     1080p        101
4060 Ti   $400    160     8 GB   128-bit     1080p        120
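Just to make the comparison easier to eyeball, here's a quick Python sketch computing fps-per-dollar and fps-per-watt from the rows above (avg FPS taken as listed, mixed test resolutions and all, so treat it as illustrative only):

```python
# (model, launch price USD, board power W, avg FPS) - rows from the table above
cards = [
    ("550 Ti",  150, 116,  51),
    ("650 Ti",  150, 110,  50),
    ("750 Ti",  150,  60,  58),
    ("950",     160,  90,  65),
    ("1050 Ti", 140,  75,  48),
    ("1650",    150,  75,  55),
    ("3050",    250, 130,  85),
    ("4060",    300, 115, 101),
    ("4060 Ti", 400, 160, 120),
]

print(f"{'Model':10s} {'FPS/$':>7s} {'FPS/W':>7s}")
for model, price, watts, fps in cards:
    print(f"{model:10s} {fps / price:7.2f} {fps / watts:7.2f}")
```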

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.

Nice to see that somebody gets it.
 
I hope this is the very last generation of 8GB cards from both vendors.
Isn't the 4050 rumoured to have 6GB? That would make me think that maybe the 5050 would still have 8GB, or less. Though I suppose that's more excusable on a 50-series card, if it's priced accordingly.
 
RTX 2060 6GB
RTX 2060 12GB (increased TMUs, increased cores)

RTX 3060 8GB
RTX 3060 12GB (increased bus)

RTX 3080 10GB
RTX 3080 12GB (increased bus, increased TMUs, increased cores)

RTX 4060 TI 8GB
RTX 4060 TI 16GB (nothing)

Yep absolutely nothing, just slapped on 16 gigs and called it a night. Reminds me of those funny looking hybrid blokes in the gym. BIGUPPERBOD.littleskinnylegs - it just doesn't work together.

Anyway, you can dress her up as you want, it's still and always will be a praiseworthy x50-class cutback with a huge gap from Lovelace's best. Actually forget captain halo, even the 4080 sees an enormous 80-115% increase in perf over the 4060 "ti" (that's crazy!). Both 20 and 30 series cards carried the x60 and x80 subdivisions on a well-received 30-40% relative perf discrepancy - what on earth were they thinking with 40-series? To make matters tremendously worse, the awarded MSRP of $500 is just sinfully offensive.
 
A 192-bit AD60 family may have been too competitive with the 4070
No. The 4060 Ti's core is too weak to compete with the 4070. They would still be miles away from each other, just like the GTX 1080 vs the GTX 1070, with the latter being up to 30 percent slower despite having much the same 256-bit VRAM bus.
It literally DOUBLES contemporary performance relative to pre-RTX x50 cards
And it more than doubles their price, making its speed not all that impressive. It's also almost twice as watt-hungry as an average xx50 card (the GTX 750 is either 60 or 75W, the 950 is 90W, the 1050 is 75W, the 1650 is 75W, whereas the 4060 Ti eats up to 160W).
clobbers the 3050
Not an achievement. Almost everything clobbers the 3050; with 3070 Ti being the only Ampere GPU which is worse in Perf/W*USD ratio.
the 4060 ti being a 50-series card
It STILL technically is. It loses to the plain OG RTX 3060, the 12 GB one, in a couple of scenarios where 8 GB of VRAM just doesn't cut it and 12 GB allows for something despite the very slow GPU. Playing at 15 FPS is better than not playing at all, y'know.
and/or cheaper
The GTX 1070 from 2016 was priced at $380 at launch (which roughly translates to 2022's $460 considering inflation). The only game which wasn't comfortable for this card at 1080p was Deus Ex: Mankind Divided; the others were playable at 60+ FPS. https://www.techpowerup.com/review/msi-gtx-1070-quick-silver-oc/12.html
The RTX 4060 series already has a handful of games which are COMPLETELY unplayable at 1080p at ultra settings. Do I need to remind you that this was not the case for the GTX 1060 ($250, or $305 in 2022 dollars), which was merely slow in some titles but did not run into slideshow and texture-quality shenanigans?
This was probably not the case for the GTX 1050 Ti either. Just a reminder: that was a $140 ($170 as of 2022) GPU. https://www.techpowerup.com/review/asus-gtx-1050-ti-strix-oc/12.html
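For what it's worth, those inflation conversions roughly check out. A minimal sketch below; the ~1.21 cumulative US CPI multiplier for 2016 to 2022 is my own assumption, while the launch prices are the ones quoted above:

```python
# Rough 2016 -> 2022 inflation adjustment.
# NOTE: the ~1.21 cumulative CPI factor is an assumed round number, not an official figure.
CPI_2016_TO_2022 = 1.21

launch_prices_2016 = {
    "GTX 1070":    380,
    "GTX 1060":    250,
    "GTX 1050 Ti": 140,
}

for card, usd_2016 in launch_prices_2016.items():
    usd_2022 = usd_2016 * CPI_2016_TO_2022
    print(f"{card:12s} ${usd_2016} (2016)  ->  ~${usd_2022:.0f} (2022)")
```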

And now, with 10+ GB VRAM GPUs becoming more and more popular, game studios will definitely increase the VRAM tax in their future products. That means an utterly garbage state of affairs for the RTX 4060 series, with the 4060 Ti 16 GB as the sole survivor able to actually play games from, say, 2026. But when you pay half a thousand dollars for a card you expect it to run them, not to snail them.
 
No. The 4060 Ti's core is too weak to compete with the 4070. They would still be miles away from each other, just like the GTX 1080 vs the GTX 1070, with the latter being up to 30 percent slower despite having much the same 256-bit VRAM bus.

30% is significant, but not miles IMO. Let's assume for the moment that a better memory subsystem would have gotten it to within 20%, or at least alleviated the 1% and 0.1% lows. Nvidia's already asking buyers to spend $200, an extra 50%, over the 4060 Ti to get that 30%. If the performance delta were smaller, you don't think a portion of potential 4070 buyers would step down? Yeah, it gets you that magic 60 fps number at 4K, so if you bleed green and are set on 4K, AD60's not gonna do it for ya. For the rest of us, 115 vs 85 at 1440p doesn't seem like a good trade for those two Benjamins. At least not to me. (Going from 115 down to 85 is a ~26% drop, which I guess rounds up to 30% if you're going in tens.)

And it more than doubles their price, making its speed not all that impressive. It's also almost twice as watt-hungry as an average xx50 card (the GTX 750 is either 60 or 75W, the 950 is 90W, the 1050 is 75W, the 1650 is 75W, whereas the 4060 Ti eats up to 160W).

We all know the price is out of whack. Re: power, 160W being too much for x50 is what I'm saying here.

Not an achievement. Almost everything clobbers the 3050; with 3070 Ti being the only Ampere GPU which is worse in Perf/W*USD ratio.

Because NV abandoned the midrange. The 3050 performed relative to contemporary titles at the same level pre-RTX x60 cards did relative to theirs. Even on spec, it's very similar to the 960, which didn't get anything like the 3050's hate then or now.

It STILL technically is. It loses to the plain OG RTX 3060, the 12 GB one, in a couple of scenarios where 8 GB of VRAM just doesn't cut it and 12 GB allows for something despite the very slow GPU. Playing at 15 FPS is better than not playing at all, y'know.

Gotta disagree there. I'm going to go out on a limb and say that if a game can't at least run, and ideally be configurable to run at ~30fps, on all current-gen cards at the game's release, that's on the game.

The GTX 1070 from 2016 was priced at $380 at launch (which roughly translates to 2022's $460 considering inflation). The only game which wasn't comfortable for this card at 1080p was Deus Ex: Mankind Divided; the others were playable at 60+ FPS. https://www.techpowerup.com/review/msi-gtx-1070-quick-silver-oc/12.html
The RTX 4060 series already has a handful of games which are COMPLETELY unplayable at 1080p at ultra settings. Do I need to remind you that this was not the case for the GTX 1060 ($250, or $305 in 2022 dollars), which was merely slow in some titles but did not run into slideshow and texture-quality shenanigans?

Pascal was really good. This is well known. It's also known that pricing is completely out of whack for Nvidia cards, so I'm not sure what you're getting at here.

What games are unplayable on AD60 at 1080p Ultra? Everything but two in the TPU suite hits at least 60 average on the 4060, and those don't miss by much.

This was probably not the case for the GTX 1050 Ti either. Just a reminder: that was a $140 ($170 as of 2022) GPU. https://www.techpowerup.com/review/asus-gtx-1050-ti-strix-oc/12.html

Yes, I know. Every x50 was in that range or less until Ampere.

And now, with 10+ GB VRAM GPUs becoming more and more popular, game studios will definitely increase the VRAM tax in their future products. That means an utterly garbage state of affairs for the RTX 4060 series, with the 4060 Ti 16 GB as the sole survivor able to actually play games from, say, 2026. But when you pay half a thousand dollars for a card you expect it to run them, not to snail them.

Let me be very clear: I am not claiming AD60 is good, especially not at RRP. The Ti definitely should have had 12. What I'm claiming is that it, the 4060 Ti in particular, is not 50-series, regardless of the number of bits on the bus. From Fermi to Turing, x50 cost significantly less than 200 bucks, pulled less than 120 watts, and provided playable performance at mainstream resolutions. The 4060 Ti is $400 (should be no more than $300), draws 160W, and puts up a reasonable showing at 1440p. Not everything quite hits 60, but with every AAA title apparently trying to be the next Crysis, that's hardly surprising.

Actually forget captain halo, even the 4080 sees an enormous 80-115% increase in perf over the 4060 "ti" (that's crazy!). Both 20 and 30 series cards carried the x60 and x80 subdivisions on a well-received 30-40% relative perf discrepancy - what on earth were they thinking with 40-series?

4080 beats 4060 ti by 80-115% by pulling 100% more power. I'm not sure why this is surprising. There was the same gap between the 3060 and 3080: +95% performance for +90% watts. Turing's 60-80 delta was as you describe.
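To put a single number on that, here's the same comparison expressed as a perf-gain-versus-power-gain ratio (a quick sketch using only the percentages quoted in this thread, not fresh benchmark data):

```python
# Relative perf-per-watt = (1 + perf gain) / (1 + power gain), deltas as quoted above.
deltas = {
    "4080 vs 4060 Ti (low end of range)":  (0.80, 1.00),
    "4080 vs 4060 Ti (high end of range)": (1.15, 1.00),
    "3080 vs 3060":                        (0.95, 0.90),
}

for label, (perf_gain, power_gain) in deltas.items():
    ratio = (1 + perf_gain) / (1 + power_gain)  # >1.0 means the bigger card is more efficient
    print(f"{label:36s} {ratio:.2f}x the perf-per-watt of the smaller card")
```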
 
30% is significant, but not miles IMO.

30% is more than the delta between the RTX 4080 and 4090, despite the latter's much wider core and abundance of execution resources (roughly +70% resources for +20-25% performance); it didn't make sense to get one, even from an enthusiast point of view. Ada's smoothest chips imo are AD104 and AD103. Neither is wasteful and both come in well-balanced configurations, even if AD104 doesn't quite live up to the "RTX 4080" branding.

I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.

Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.

Does this look like AMD to you?
It sure doesn't look like AMD to me.

Well, they failed in that endeavour because their sales numbers are in the toilet. The amount of faith that they and their AIBs have in the RTX 4060 Ti is so low that they didn't even sample it out or give it any fanfare. It's like when they silently released the RTX 3060 8GB.

I haven't seen the 128-bit bus preventing the RTX 4060 Ti from addressing and using all 16GB of its VRAM. What I have seen, though, is that it suffers bigger performance drops when the resolution is increased than cards with buses wider than 128-bit.

Oh don't kid yourself, I personally dumped all over the RX 6500 XT. Whoever it was at AMD who thought that the RX 6500 XT was a good idea should've been fired. However, it had its "moment in the crapper" long ago. Now it's the RTX 4060 Ti's turn. It's not about who did what, it's about who did what and when.

I actually bought an RX 6500 XT. Not because I thought that it was any good but because I thought that an R9 Fury in my mother's HTPC was a waste of electricity. She doesn't need a 275W card to be a glorified video adapter and Canada Computers happened to have a Powercolor Radeon RX 6500 XT ITX model on clearance for what was at the time, the equivalent of ~$113USD. At that price, even the RX 6500 XT looks good and since she doesn't need a hardware encoder, a 3D accelerator or more than 4GB of VRAM (she wouldn't know what to do with any of those anyway), at that price, it was perfect for her.

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.

1. The logic I'm applying is that you're dishonestly comparing a previous-generation product that's going for clearance prices to a current-generation product that is indubitably priced high, but where the manufacturer intended it to be priced all along. Have you forgotten the 6950 XT's original $1199 MSRP?

2. Good, buy the AMD card while you still can. I'm merely pointing out that this is the exception and not the norm; once RDNA 2 and Ampere stocks deplete, you will not be able to do this.

3. Nvidia GPUs outsell AMD ones 9 to 1, it's irrelevant. Nvidia themselves have shown little interest in this 16 GB model, as I have mentioned earlier, it's more of an SKU they've released to prove a point than anything else.

4. You could have linked to the video where HUB tested the 8 GB 3070 v. the 16 GB RTX A4000 to make a fairer point as those cards share the exact same architecture and feature set, but like I mentioned on the forum before, the situation between 4060 Ti 8 GB and 4060 Ti 16 GB is exactly the same as described in the aforementioned video. The 16 GB card is better, when you need the memory... but with this anemic a configuration, it matters less than ever - it's just a bad card all around

5. Never said you did or didn't dump on the 6500 XT, and my point wasn't to demean it as a product, just to say that Nvidia committed the same reckless cost-saving measure that cost that card the performance it could have had; the 64-bit bus in Navi 24 is the primary bottleneck even on such a small die.

6. Agreed that the prices are obnoxiously high; at $500 this GPU is simply not worth the price of admission, even if the last-gen 6800 XT and RTX 3080 weren't, for the time being, accessible at this price.
 
Nah, they didn't. They just moved it up by a few hundred bucks together with all the other tiers. :roll:

At least they released something there... AMD didn't yet :(
 
Especially if you consider today this 7900XT rivals a 4080 at raster perf for nearly half the price. Or better - an AIB 4080 goes for above 1200 with ease. I bought it at 899 and I still can't say I'm unhappy. It's highly competitive to say the very least. There is also definitely room now for a 7800XT at around 600 with 16GB and a -10/15% perf deficit. And even in RT these two cards would do just fine comparatively. Without proprietary BS as a bonus - no game-ready driver or DLSS3 updates required, the perf is just there out of the box. As it should be.

It's a no-brainer to me tbh
The 7900XT is less than 4% behind the 4080 at 1440p and 8% at 4K. Both the XT & XTX got a huge FPS boost from recent drivers in FH5 and TLOU.



Since the 4060Ti is set at $400/$500 and the 4070 at $600, AMD can easily take over these 3 lower-mid to mid-range slots. They can rebrand the 6950XT and 6800XT as the 7800XTX and 7800XT, and the 6800 as the 7800, and charge $600, $500 and $400. Of course with 16GB, lower power draw and RDNA features. Do you think that's fair?
 
AMD didn't yet
Because they are on their mission! The Very Important Mission of losing in every price segment as hard as possible. It's as if nVidia bribed them with four times the US national debt to never be competitive.
30% is significant, but not miles
It's just enough to justify the fact that these cards don't belong to the same segment. And if you look at 1% lows, it's more than 30%. Giving the 4060 its 192-bit bus back would've shrunk this difference a bit, but the 4070 would still be significantly better despite being a de facto rebranded 4060... Ti, perhaps? Nah, I'm paying too much respect. A rebranded 4060 it is.
If the performance delta were smaller, you don't think a portion of potential 4070 buyers would step down?
Yes, but this doesn't matter, because if nGreedia had put 1.5 times more RAM on the 4060s they'd still have a whole lot of profit, simply because Adas are cheaper to produce than "equivalent" Amperes and sell at historical maximums (MSRP-wise, not street-price-wise).
Because NV abandoned the midrange
Incorrect. They abandoned reasonable pricing and adopted misleading branding.
at least 60 average
It doesn't matter how many millions you have in your average framerate if the 1% low is a single-digit number. And by 2025, the number of games allergic to 8 GB cards will by no means be a single-digit one (pun intended).
What I'm claiming is that it, the 4060 Ti in particular, is not 50-series
And you're incorrect. A real 4060 must beat the 3060 in EVERY POSSIBLE scenario. If there is ONE case where the 4060 is slower than the 3060, it's a fail, and you (meaning nVidia, not you personally) should've named it at least half a tier lower, namely 4050 Super or Ti. Or at least stopped pretending that $400 is not twice the reasonable price.
Pascal was really good
Maxwell, if we don't count the GTX 960 (a really BAD product all things considered), was even better considering how far it came from Kepler.
 
I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.
I find it hilarious and annoying at the same time watching Steve make up excuse after excuse for how the nVidia GeForce RTX 3070 8GB didn't suck, even though his own data says the 6800 is the superior card. He also chose the 3070 over the 6800 in his day-one review because MuH rAy tRaCiNg.
 
TPU states that it's so-so but fine even for 4K. The yellow colour means it depends on the settings and use case.

I believe the sweet spot for potential RTX 4060 Ti owners is a 1440p screen, but some enthusiasts may move to better-quality 2160p screens.
Personally I don't think people buying $500 graphics cards are gaming at 60Hz. If they are, then it's a 1440p card at best, and unsuitable for 4K IMO.

If you have a high-refresh display like most gamers do these days, it's only matching the display's capabilities in older games and esports titles. Here's the kicker - if you are only playing older games and esports, you don't need a $500 graphics card.
 
Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.
Cards need to be readily available for them to be relevant in a comparison; the 6800XTs are slowly running out. It's a while-stocks-last thing, and on top of that local availability can be worse, making the price of such highly competitive cards skyrocket. I see some AIB models go for over 800 EUR here, and a rare couple over 1200 (!!). Theory vs practice...

At least they released something there... AMD didn't yet :(
They really should be pushing that 7800(XT) button just about now; the momentum is there given how poorly the 8GB cards are turning out. Even a 12GB 7700 would be marvellous... if they position that at 450 they will sell. But like @Beginner Micro Device stated... Always Mighty Delayed...
 
Quite obvious from the results that Nvidia's 8GB cards are more comparable to AMD's 10GB ones. Look at the 7600: in some cases it loses 90% of its performance moving from 1440p to 4K, while the 4060 8GB is doing fine.

More and more games will need lots of VRAM. More VRAM means more future-proofing. AMD chose to equip the RX 6800 with as much as 16 GB and it will pay off in the long run - fine wine.

What fine wine are you drinking? RDNA 2 is slow as heck due to RT being non-existent.

Especially if you consider today this 7900XT rivals a 4080 at raster perf for nearly half the price.
It doesn't? The XT is in fact closer to the 4070ti in raster performance than it is to the 4080. So, the 4070ti rivals the 7900xt in raster.
 
4080 beats 4060 ti by 80-115% by pulling 100% more power. I'm not sure why this is surprising.

With this sort of context it gets uglier... e.g. if a next-gen x60 vs x80 saw a ~200% perf discrepancy alongside a 200% power variation, it wouldn't mean the x60 is a good fit.

Anyway, perf-per-watt is off-topic to what was being referenced. What was being suggested is that it's crazy seeing an "x60 (Ti)" condensed this massively, with an 80-115% perf difference coming from the 4080. Keep in mind that's without mentioning the wider 4090 delta, or the standard x60 (non-Ti) vs x80 with a 100-135% cutback in perf. This sort of widely elongated performance disparity cannot be justified for an x60-class product, only an x50. Whether it's gimping on bandwidth, bus width, core segment, memory, etc., sometimes it's simply sufficient to look at raw-performance comparables alone to conclude something is amiss... or out of character... or in this case, out of class!

There was the same gap between the 3060 and 3080: +95% performance for +90% watts.

You might want to check x60 (TI) 30-series benchmarks again.
 
Quite obvious from the results that Nvidia's 8GB cards are more comparable to AMD's 10GB ones. Look at the 7600: in some cases it loses 90% of its performance moving from 1440p to 4K, while the 4060 8GB is doing fine.


What fine wine are you drinking? RDNA 2 is slow as heck due to RT being non-existent.


It doesn't? The XT is in fact closer to the 4070ti in raster performance than it is to the 4080. So, the 4070ti rivals the 7900xt in raster.

Depends where you look ;)

As for RT... sure, but that's old news, and x60 is not really RT-capable.
 
Depends where you look ;)

As for RT... sure, but that's old news, and x60 is not really RT-capable.
No, it doesn't depend where you look; you just cherry-picked. I can find a game where the 4070ti is similar to or better than the XTX. Doesn't mean much. The XT certainly does not rival the 4080, not in raster or anything else for that matter.
 