
5060 Ti 8GB DOA

Is there? Historically, the price increase for a larger VRAM chip has been considerably less than the increase in capacity. IIRC the rule of thumb was: double the capacity and the price goes up 30-40%.

With all the loud moaning about VRAM capacities, I would think using 3GB GDDR7 chips would have been really beneficial for Nvidia in some consumer SKUs. A 12GB 5060 Ti would likely be OK for most reviewers, techtubers and us. Similarly, a 24GB 5080 would be much less controversial.
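To put rough numbers on that rule of thumb, here's a back-of-the-envelope sketch. The $18 module price and the ~1.18x premium for a 1.5x capacity step are made-up placeholders, not actual GDDR7 pricing:

```cpp
// Back-of-the-envelope BOM math for the "double the capacity, pay 30-40%
// more" rule of thumb. The $18 module price and the ~1.18x premium for a
// 1.5x capacity step (2 GB -> 3 GB) are made-up placeholders, not real
// GDDR7 pricing.
#include <cstdio>

int main() {
    const double price2GB = 18.0;             // assumed 2 GB GDDR7 module price
    const double price3GB = price2GB * 1.18;  // interpolated from the rule
    const int modules = 4;                    // 128-bit bus = four modules

    std::printf("8 GB  (4 x 2 GB): $%5.2f\n", modules * price2GB);
    std::printf("12 GB (4 x 3 GB): $%5.2f (+$%.2f)\n",
                modules * price3GB, modules * (price3GB - price2GB));
}
```

Under those assumptions, the 8GB-to-12GB step adds maybe $10-15 to the bill of materials, which is the point: the capacity bump costs far less than the capacity gained.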
If Nvidia can sell out of every 8/12/16GB card they make, why would they spend more money turning them into 12/18/24GB? While I doubt there was enough 3GB IC volume for the early 50-series launches, there certainly was for the xx60 cards. They can also do refreshes with 3GB ICs next year if they want to.

Without the controversy equating to lost sales, I don't see any reason for them to care. Just look at how Intel dragged their feet on the degrading 13th/14th Gen CPUs, which was extremely serious for anyone affected or looking to buy. If it's cheaper for the company, it's a safe bet that's the choice they'll make.
 
If Nvidia can sell out of every 8/12/16GB card they make, why would they spend more money turning them into 12/18/24GB? While I doubt there was enough 3GB IC volume for the early 50-series launches, there certainly was for the xx60 cards. They can also do refreshes with 3GB ICs next year if they want to.

Without the controversy equating to lost sales, I don't see any reason for them to care. Just look at how Intel dragged their feet on the degrading 13th/14th Gen CPUs, which was extremely serious for anyone affected or looking to buy. If it's cheaper for the company, it's a safe bet that's the choice they'll make.
It'd be a pretty cheap and easy way to upsell the next generation, given they're basically causing the problem and selling the solution at this point. Of course, that assumes they want to sell to gamers; these days they just push all production to datacenter instead, so they don't actually have to do anything.
 
It doesn't launch on 8gb cards? You sure about that? So if I post a video playing on 3 different 8gb cards, what do I get?
I'm just going off what the TPU graph says, but it does seem to launch on 8GB AMD cards. Doesn't look like a fun experience though, you're definitely getting VRAM starved.
 
That was them blatantly trolling. Which is why I skipped over that nonsense.
People have the weird expectation that every card on the market should be able to play all games (even the heaviest ones) at maxed-out settings. That's where the whole "8gb not enough" idea comes from. When my 4090 barely manages to play 4K maxed out (even with DLSS), it stands to reason that a 4060 Ti 8GB should be playing 4K low/medium, because it's literally 1/5th the cost. I do not expect a 60-tier card to max out anything modern/heavy. Historically that has always been the case, even going 20 years back.

Now people will point at the price, but the price is still fine, relatively speaking. You can't say $400 is too much when the 5080 is $1,400 and the 5090 is $2.5k++. The price of every card is too high; that has nothing to do with 8 or 16GB or what have you.

I'm just going off what the TPU graph says, but it does seem to launch on 8GB AMD cards. Doesn't look like a fun experience though, you're definitely getting VRAM starved.
You are talking about trying to play it maxed out? Man... we are really talking about how the cheapest current-gen card cannot max out the heaviest game in existence, as a negative. A-okay. This is pointless.

EG1. I've played Indiana Jones on a 3060 Ti, btw.
 
People have the weird expectation that every card on the market should be able to play all games (even the heaviest ones) at maxed-out settings. That's where the whole "8gb not enough" idea comes from. When my 4090 barely manages to play 4K maxed out (even with DLSS), it stands to reason that a 4060 Ti 8GB should be playing 4K low/medium, because it's literally 1/5th the cost. I do not expect a 60-tier card to max out anything modern/heavy. Historically that has always been the case, even going 20 years back.
In the past, with new budget cards, you'd just turn down shaders, particles and other stuff to get a playable framerate, then turn up textures and get an acceptable gaming experience. There is zero historical precedent for having to turn down textures on a new midrange GPU.

The Switch 2 has effectively more VRAM now. It's time to stop the cope.
Now people will point at the price, but the price is still fine, relatively speaking. You can't say $400 is too much when the 5080 is $1,400 and the 5090 is $2.5k++. The price of every card is too high; that has nothing to do with 8 or 16GB or what have you.
No it's not. The 7800 XT and 7700 XT exist and have been sold for less than the 5060 Ti's RRP for years now.
You are talking about trying to play it maxed out? Man... we are really talking about how the cheapest current-gen card cannot max out the heaviest game in existence, as a negative. A-okay. This is pointless.

EG1. I've played Indiana Jones on a 3060 Ti, btw.
The 16GB version gets 60+ FPS at 1440p with RT enabled in that game. Not an argument.

Cool. You're VRAM starved.
 
It doesn't launch on 8gb cards? You sure about that? So if I post a video playing on 3 different 8gb cards, what do I get?
I believe they just said it was crashing on anything above 1080p/medium settings (native resolution). I know it crashed on ultra, which the 16GB model had zero issue with while getting >60 fps minimums. It's really just a problem with the game's design and Vulkan, where the game crashes when it runs out of VRAM. Most DX12 games won't do this; they'll purge VRAM by unloading/downgrading textures and whatnot.
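For what it's worth, the difference is visible at the API level. A minimal sketch, assuming a naive engine allocator (this is illustrative only, not this game's actual code):

```cpp
// Illustrative sketch only (not the game's actual engine code): why a
// Vulkan title can hard-crash when VRAM runs out. vkAllocateMemory reports
// VK_ERROR_OUT_OF_DEVICE_MEMORY; an engine that treats that as fatal
// crashes to desktop, while a streaming engine would evict or downgrade
// textures instead.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstdlib>

VkDeviceMemory allocateTextureMemory(VkDevice device, VkDeviceSize size,
                                     uint32_t memoryTypeIndex) {
    VkMemoryAllocateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
    info.allocationSize = size;
    info.memoryTypeIndex = memoryTypeIndex;

    VkDeviceMemory memory = VK_NULL_HANDLE;
    VkResult result = vkAllocateMemory(device, &info, nullptr, &memory);
    if (result == VK_ERROR_OUT_OF_DEVICE_MEMORY) {
        // The "crash" path: no fallback. A robust engine would instead
        // evict least-recently-used textures or drop to lower mip levels
        // and retry, which is roughly what most DX12 titles end up doing.
        std::fprintf(stderr, "Out of VRAM, aborting\n");
        std::abort();
    }
    return memory;
}
```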
 
In the past, with new budget cards, you'd just turn down shaders, particles and other stuff to get a playable framerate, then turn up textures and get an acceptable gaming experience. There is zero historical precedent for having to turn down textures on a new midrange GPU.

The Switch 2 has effectively more VRAM now. It's time to stop the cope.

No it's not. The 7800 XT and 7700 XT exist and have been sold for less than the 5060 Ti's RRP for years now.

The 16GB version gets 60+ FPS at 1440p with RT enabled in that game. Not an argument.

Cool. You're VRAM starved.
Obviously I'm VRAM starved if I try to max out Indiana Jones. Every card is starved of something, be it shaders, bandwidth or VRAM; otherwise we would be getting thousands of fps.

But I don't get what the point is; the card exists in a 16gb version, buy that if you feel like you need the vram. Why do you want to remove my option to buy an 8gb card when I don't feel like I need it? I'd never pay $70 more just to set textures to ultra, lol.

I believe they just said it was crashing on anything above 1080p/medium settings (native resolution). I know it crashed on ultra, which the 16GB model had zero issue with while getting >60 fps minimums. It's really just a problem with the game's design and Vulkan, where the game crashes when it runs out of VRAM. Most DX12 games won't do this; they'll purge VRAM by unloading/downgrading textures and whatnot.
I know it crashes on ultra (it can barely play on 16gb cards, mind you), but the question is why focus on ultra with the cheapest current-gen card.
 
But I don't get what the point is; the card exists in a 16gb version, buy that if you feel like you need the vram.
Indeed.
Why do you want to remove my option to buy an 8gb card when I don't feel like I need it?
Having fewer SKUs reduces cost. Clamshelling the memory is an additional expense. You don't lose out by having more than you need.
I'd never pay $70 more just to set textures to ultra, lol.
In the past, you didn't have to.
 
Indeed.

Having fewer SKUs reduces cost. Clamshelling the memory is an additional expense. You don't lose out by having more than you need.

In the past, you didn't have to.
Of course you lose out if you have to pay more for something you don't need. It's like me complaining about the 16gb card existing. Like, who cares? I'm glad it exists for people like you that need the vram; enjoy it. I'm not spending $80 for ultra textures. To each their own.

TLOU 1 has the best-looking textures right now. Textures set to high play fine on 8gb cards and they still look super crisp. If there is a game where you need to set textures to low for it to work on 8gb, then those low textures should look at least as good as TLOU's. In which case, that's great; why the heck would I care if it's low or high or ultra when the end result is amazing...
 
Right. You might still want to double-check all the settings those benches ran at before the conclusion about your general level of intelligence on this subject becomes undeniable. Just saying. You do you ;)

Any new comments on the TPU review? I'm just curious how stupid you think you are now.
 
Any new comments on the TPU review? I'm just curious how stupid you think you are now.
Seems pretty clear, doesn't it, if you haven't got your tunnel vision going? It already starts with the review title: 'So many compromises'.

Then there is the conclusion:
"Especially in VRAM-heavy titles the performance drops, but in many cases it's slightly faster than the more expensive 16 GB version, thanks to improved power efficiency."

It's clear as day that 8GB is the limiting factor for the lifetime or quality settings of this card. We now have the first TPU benches where 8GB even at 1080p (!!!) is already showing a massive cut in performance in several games. That's a unique situation; you can go back in time, gen to gen, to compare a similar set of cards, and you won't find one. What you think is supposed to be normal for a product in this range is really not normal. You're merely coping.

Now, if your argument is that because it's an x60 it can indeed play games that already exist at 1080p properly, then yes, you are absolutely right, thanks for that immensely valuable nugget of knowledge... But the reality is, this capacity struggles regardless of how fast the core of the GPU is. The product is poorly balanced, overpriced, and specced for games that were released two or more years ago.

Also, the review is not looking at quality reductions due to lower VRAM. It's just measuring FPS.
 
Of course you lose out if you have to pay more for something you don't need. It's like me complaining about the 16gb card existing. Like, who cares? I'm glad it exists for people like you that need the vram; enjoy it. I'm not spending $80 for ultra textures. To each their own.

TLOU 1 has the best-looking textures right now. Textures set to high play fine on 8gb cards and they still look super crisp. If there is a game where you need to set textures to low for it to work on 8gb, then those low textures should look at least as good as TLOU's. In which case, that's great; why the heck would I care if it's low or high or ultra when the end result is amazing...
I care. If it means the 5060 Ti 16GB gets a $400 MSRP, then I'm more than willing to make you pay $20 more so I can pay $30 less. No offense. The reason you don't care about 16GB cards existing is that they don't actually affect you, while the opposite isn't true. Oh no, more VRAM, spooky.

Even with games released years ago, VRAM was an issue, and it's more apparent now that current GPUs are otherwise more than capable of running those games maxed out.

Also, the review is not looking at quality reductions due to lower VRAM. It's just measuring FPS.
Do they really? That's pretty sad.
 
Seems pretty clear, doesn't it, if you haven't got your tunnel vision going? It already starts with the review title: 'So many compromises'.

Then there is the conclusion:
"Especially in VRAM-heavy titles the performance drops, but in many cases it's slightly faster than the more expensive 16 GB version, thanks to improved power efficiency."

It's clear as day that 8GB is the limiting factor for the lifetime or quality settings of this card. We now have the first TPU benches where 8GB even at 1080p (!!!) is already showing a massive cut in performance in several games. That's a unique situation; you can go back in time, gen to gen, to compare a similar set of cards, and you won't find one. What you think is supposed to be normal for a product in this range is really not normal. You're merely coping.

Now, if your argument is that because it's an x60 it can indeed play games that already exist at 1080p properly, then yes, you are absolutely right, thanks for that immensely valuable nugget of knowledge... But the reality is, this capacity struggles regardless of how fast the core of the GPU is. The product is poorly balanced, overpriced, and specced for games that were released two or more years ago.

Also, the review is not looking at quality reductions due to lower VRAM. It's just measuring FPS.

5 years old and stupid
 
I care. If it means the 5060 Ti 16GB gets a $400 MSRP, then I'm more than willing to make you pay $20 more so I can pay $30 less. No offense. The reason you don't care about 16GB cards existing is that they don't actually affect you, while the opposite isn't true. Oh no, more VRAM, spooky.

Even with games released years ago, VRAM was an issue, and it's more apparent now that current GPUs are otherwise more than capable of running those games maxed out.


Do they really? That's pretty sad.
Textures, asset LoDs, and geometry can all be dynamically adjusted, for example giving you the higher-detail model only when you get closer. These tricks are as old as gaming itself, except now you no longer control all of them with a slider; they're just there, directed by an FPS target. That's how a lot of 'optimization' is done these days, too.
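Roughly like this, as a hand-wavy sketch; the names and thresholds are invented for illustration, not taken from any particular engine:

```cpp
// Hand-wavy sketch of FPS-target-driven detail scaling. The names and
// thresholds are invented for illustration; real engines are fancier.
#include <algorithm>
#include <cstdio>

struct QualityScaler {
    double targetMs = 16.7; // frame-time budget, ~60 FPS
    double lodBias  = 0.0;  // 0 = full detail; higher = coarser mips/models

    void update(double frameMs) {
        // Over budget: shed detail quickly. Under budget: restore it
        // slowly, so quality doesn't visibly oscillate frame to frame.
        if (frameMs > targetMs * 1.10)
            lodBias = std::min(lodBias + 0.25, 4.0);
        else if (frameMs < targetMs * 0.90)
            lodBias = std::max(lodBias - 0.05, 0.0);
    }
};

int main() {
    QualityScaler scaler;
    // Simulate a few slow frames (VRAM-starved, say 25 ms each).
    for (int i = 0; i < 4; ++i) scaler.update(25.0);
    std::printf("lodBias after 4 slow frames: %.2f\n", scaler.lodBias);
}
```

The upshot is the review's FPS numbers can look fine while the engine is quietly serving you coarser assets, which is exactly the quality reduction a pure FPS benchmark won't show.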

5 years old and stupid
Mhm, so stupid. It's fine if you can't grasp what's happening. Live and learn, you'll get there eventually.

People have the weird expectation that every card on the market should be able to play all games (even the heaviest ones) at maxed-out settings. That's where the whole "8gb not enough" idea comes from. When my 4090 barely manages to play 4K maxed out (even with DLSS), it stands to reason that a 4060 Ti 8GB should be playing 4K low/medium, because it's literally 1/5th the cost. I do not expect a 60-tier card to max out anything modern/heavy. Historically that has always been the case, even going 20 years back.

Now people will point at the price, but the price is still fine, relatively speaking. You can't say $400 is too much when the 5080 is $1,400 and the 5090 is $2.5k++. The price of every card is too high; that has nothing to do with 8 or 16GB or what have you.


You are talking about trying to play it maxed out? Man... we are really talking about how the cheapest current-gen card cannot max out the heaviest game in existence, as a negative. A-okay. This is pointless.

EG1. I've played Indiana Jones on a 3060 Ti, btw.
It's not strange at all to expect to max out anything at 1080p on an x60 at this point if you're doing that at 4K on a 4090, because that's easily four times the GPU and four times the resolution. And then you figure out what's really going on... VRAM requirements do not have a linear relationship to the core power on tap. You just need a baseline amount for the core to do its work properly in every situation, and if you fall short, the penalty is pretty high. The same thing applies to the top end: you're fine with 16GB, even at 4K Ultra.

The reality is that for GPUs, the entire stack really does need to be positioned close to the requirements of the time it gets released in. So yes, it does indeed make a lot more sense economically to make fewer SKUs with 'sufficient' memory on them. Ideally, the entire Blackwell stack would've had 12-16GB across the x60~x80 and 24GB on the x90. That would have made sense. What you get now is a reality where the 8GB x60s are really positioned as, and have the lifetime of, an x50 release, no ifs or buts, but you're still paying $400.

The price isn't fine, even relatively speaking. The die is tiny, the TDP is low and the VRAM is below par. You've lost the plot, honestly.

I'd avoid this segment entirely this gen myself, if I were in the market for something new, even if the budget didn't allow for more. Better off waiting for the Supers or something.
 
Indiana Jones and the Great Circle.
Works fine here. I tested it with a 5060 Ti 8 GB; check the paragraph in my conclusion.

The "crashed" in the chart is at "max" texture settings, which will crash on cards with 8 GB.
 
Mhm, so stupid. It's fine if you can't grasp what's happening. Live and learn, you'll get there eventually.

You spent that time defending clickbait trash and calling other people stupid.
Now you've had to go searching for nuances in a good, actual review, and you still can't see how stupid you are with this. A real idiot troll from the internet if I ever saw one.
 
You spent that time defending clickbait trash and calling other people stupid.
Now you've had to go searching for nuances in a good, actual review, and you still can't see how stupid you are with this. A real idiot troll from the internet if I ever saw one.
Hey, I don't need to defend anything, I've got a card. If you want to buy 8GB cards in 2025, more power to you. Go buy two; the more you buy, the more you save!

Ignorance is bliss, they say. Just don't go and keep complaining about how bad the dGPU market has become, then. You're only advocating for it to get worse ;)
All I see here is that the predictions I've been making over the last few years are now slowly turning into truths proven by numbers. It's not rocket science, you know. All of this is easy to see coming if you know where to look. I'm pointing people to it. It's their problem if they don't want to know, not mine :)
 
Textures, asset LoDs, and geometry can all be dynamically adjusted, for example giving you the higher-detail model only when you get closer. These tricks are as old as gaming itself, except now you no longer control all of them with a slider; they're just there, directed by an FPS target. That's how a lot of 'optimization' is done these days, too.


Mhm, so stupid. It's fine if you can't grasp what's happening. Live and learn, you'll get there eventually.


It's not strange at all to expect to max out anything at 1080p on an x60 at this point if you're doing that at 4K on a 4090, because that's easily four times the GPU and four times the resolution. And then you figure out what's really going on... VRAM requirements do not have a linear relationship to the core power on tap. You just need a baseline amount for the core to do its work properly in every situation, and if you fall short, the penalty is pretty high. The same thing applies to the top end: you're fine with 16GB, even at 4K Ultra.

The reality is that for GPUs, the entire stack really does need to be positioned close to the requirements of the time it gets released in. So yes, it does indeed make a lot more sense economically to make fewer SKUs with 'sufficient' memory on them. Ideally, the entire Blackwell stack would've had 12-16GB across the x60~x80 and 24GB on the x90. That would have made sense. What you get now is a reality where the 8GB x60s are really positioned as, and have the lifetime of, an x50 release, no ifs or buts, but you're still paying $400.

The price isn't fine, even relatively speaking. The die is tiny, the TDP is low and the VRAM is below par. You've lost the plot, honestly.

I'd avoid this segment entirely this gen myself, if I were in the market for something new, even if the budget didn't allow for more. Better off waiting for the Supers or something.
So the problem is the name? If it was an x50 it would be fine?

Historically, not every card could max out every game. Not even high-end cards could. If the argument is that every current card should be able to max out games, then we have a fundamental disagreement.

Again, we'll go back to the price, but the reality is that in the current market $400 is an entry-level card. It's a lot of money, but that is the situation we are in. When the xx60 was sold for $300, the xx80 was at $700 or less. Now the latter is $1k+.
 
So the problem is the name? If it was an x50 it would be fine?

Historically, not every card could max out every game. Not even high-end cards could. If the argument is that every current card should be able to max out games, then we have a fundamental disagreement.

Again, we'll go back to the price, but the reality is that in the current market $400 is an entry-level card. It's a lot of money, but that is the situation we are in. When the xx60 was sold for $300, the xx80 was at $700 or less. Now the latter is $1k+.
In the current market, $400 is an entry-level card? I think you need to double-check that for a moment. Entry-level cards start at 199,-, also carrying 8GB, and are being offered by AMD and Intel. Nvidia itself? Starts at 319,- with a 4060 8GB. 326 EUR gets you a 12GB B580. 299,- gets you a 16GB 7600 XT.

This is an x60 Ti specced like a true entry-level card (in that, you are correct...) and priced like an x60 Ti ~ x70. That's the problem, and it should be pretty obvious.
 
Um, 2005ish? Name ONE GeForce xx60-series card that was not the budget offering since then. Seriously, go ahead...
Wrong line of thought. If you can't remember, there used to be a whole range from the 10-tier to the 50-tier (the 700 series in 2013 was a prime example - 710, 720, 730, 740, 750, 750 Ti - six cards below the 760, and four above, made it smack-dab mid-range).
Yes, those were budget-oriented cards. I know many enthusiasts wouldn't even consider them "gaming" cards, but there's something called the "third world".
 
People have the weird expectation that every card on the market should be able to play all games (even the heaviest ones) at maxed-out settings.
I agree, it's total nonsense. Jeff over at Craft Computing just did a video on this very subject with an 8GB 3050.
If THAT 8GB card can do some good and reasonable gaming, the 8GB 5060 and 5060 Ti will do just fine, as W1zzard's review clearly shows.

The "crashed" in the chart is at "max" texture settings, which will crash on cards with 8 GB.
Exactly. Turn some settings down, all is well, which most people who get this card will do anyway.

Wrong line of thought. If you can't remember, there used to be a whole range from the 10-tier to the 50-tier (the 700 series in 2013 was a prime example - 710, 720, 730, 740, 750, 750 Ti - six cards below the 760, and four above, made it smack-dab mid-range).
Yes, those were budget-oriented cards. I know many enthusiasts wouldn't even consider them "gaming" cards, but there's something called the "third world".
Sure, those were low-end budget cards. The "6" series has always been the lower-mid-tier, upper-budget-tier cards. And they still are. The "7" series are the mid-tier to upper-mid-tier cards. "8"+ cards were high-end to top-tier. That has been Nvidia's formula for the last 20 years at minimum.
 
Sure, those were low-end budget cards. The "6" series has always been the lower-mid-tier, upper-budget-tier cards. And they still are. The "7" series are the mid-tier to upper-mid-tier cards. "8"+ cards were high-end to top-tier. That has been Nvidia's formula for the last 20 years at minimum.
Hard disagree. 20 years ago would put us at the 7-series, and there was no 7-tier regularly (it only became regular with the 200-series in 2008). Until then, the 6-tier was intermediate, below the 8 in its many variants (GT, GTS, GTX, Ultra, and whatever else Nvidia named its sub-tiering).
From the 200-series forward until the 1000-series, the 6-tier was lower high-end at best, square mid-range at worst. It was always GTX, never GT or GTS.
 
In the current market, $400 is an entry-level card? I think you need to double-check that for a moment. Entry-level cards start at 199,-, also carrying 8GB, and are being offered by AMD and Intel. Nvidia itself? Starts at 319,- with a 4060 8GB. 326 EUR gets you a 12GB B580. 299,- gets you a 16GB 7600 XT.

This is an x60 Ti specced like a true entry-level card (in that, you are correct...) and priced like an x60 Ti ~ x70. That's the problem, and it should be pretty obvious.
When the high end is $3k and the midrange $800 to $900, yeah, $400 is entry level. When the high end was $800 (1080 Ti), entry level was $150 to $200. It's all relative.
 