
5060 Ti 8GB DOA

Hard disagree.
Ok. You do you.
20 years ago would put us at the 7-series, and there was no 7-tier
That would have been the 6000 series;
And yes, there was a 6700XL.
The 7000 series didn't hit market until late 2005 and early 2006.
From the 200 series forward until the 1000 series, the 6-tier was lower high-end at best
No they weren't. I'm not going to debate this point to death. The "60" series cards are the lower mid-range budget offerings and have been for a looooong time. This is unlikely to change.
 
any new comments on the TPU review? i'm just curious how stupid you think you are now.
I've read the whole thing, but it says it right there in the title: so many compromises. Too many for a 400€ card.

Also, do you find it absolutely necessary to call people who disagree with you stupid and "5 years old"?
 
No they weren't. I'm not going to debate this point to death. The "60" series cards are the lower mid-range budget offerings and have been for a looooong time. This is unlikely to change.
It doesn't even matter what it was historically; that's surface-level criticism. Are we bothered by the name? Currently the 8GB 5060 Ti is the 2nd lowest (including the yet-unreleased 5060) out of the 7 offerings.

It's perfectly normal for the 2nd-lowest product to play with a mix of ultra/high/medium settings at 1080p and 1440p, depending on how heavy the game is.
 
It's perfectly normal for the 2nd-lowest product to play with a mix of ultra/high/medium settings at 1080p and 1440p, depending on how heavy the game is.
While it's not expected that the second-lowest card (or third, if the 5050 comes to fruition) can game at the highest resolutions and settings reliably, I'd expect such a card to not cost more than twice what a similarly positioned card did a few generations back (the 1650S's MSRP was $159, and there's no amount of inflation that could turn that into $380 today).
If the 5060 and 5060 Ti are the de facto low-end, they should be priced accordingly.
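For what it's worth, the inflation side of that claim is easy to sanity-check. A rough sketch, using a hypothetical ~25% cumulative US inflation figure since the 1650S's 2019 launch (an assumption, not an official CPI number; the exact rate doesn't change the conclusion):

```python
# Hedged sanity check: the 25% cumulative-inflation figure below is a
# rough assumption, not an official statistic.
msrp_2019 = 159                      # 1650S launch MSRP, per the post above
assumed_cumulative_inflation = 0.25  # hypothetical round figure since 2019

adjusted = msrp_2019 * (1 + assumed_cumulative_inflation)
print(f"${adjusted:.0f}")  # ~$199, nowhere near $380
```

Even doubling the assumed inflation rate leaves the adjusted price well short of $380.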
 
While it's not expected that the second-lowest card (or third, if the 5050 comes to fruition) can game at the highest resolutions and settings reliably, I'd expect such a card to not cost more than twice what a similarly positioned card did a few generations back (the 1650S's MSRP was $159, and there's no amount of inflation that could turn that into $380 today).
If the 5060 and 5060 Ti are the de facto low-end, they should be priced accordingly.
And we go back to the pricing yet again. When the 1650 was $159, the top-end card cost $649. Now the top end costs $2k MSRP. This is going around in circles, isn't it...
 
And we go back to the pricing yet again. When the 1650 was $159, the top-end card cost $649. Now the top end costs $2k MSRP. This is going around in circles, isn't it...
If we consider the entirety of the 16/20-series lineup, the top card was the 2080 Ti (yes, I know it came later, but as I said, I'm considering the whole generation sans the Titan RTX, which has the same number of cores as the 2080 Ti). Now, the MSRP for the 2080 Ti was $999. So with the 1650S we had roughly 28% of the cores of the 2080 Ti at ~16% of its MSRP. Were I to consider the Titan ($2499), that'd be 6% of the MSRP.
With the 5060 Ti 8GB/16GB we have ~24% of the cores at 19%/22% of the MSRP against the 5090. That's a regression, but at least it's better than the 4060 Ti was against the 4090 (27% of the cores at 25% or 31% of the price).
Naming only becomes an issue because of the perception it carries. Back then, the 5-tier meant compromising, not the 6, and this analysis only means that both the 4060 Ti and 5060 Ti are xx50-tier cards in disguise.
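The price-share arithmetic in that post reduces to a trivial helper. A sketch, using the MSRPs the post quotes; the 4060 Ti figures of $399/$499 against the 4090's $1599 are my recollection rather than numbers stated in this thread, so treat those as assumptions. Core counts are omitted, since the post doesn't say which tallies it used:

```python
def share(part: float, whole: float) -> int:
    """Percentage of `whole` represented by `part`, rounded to a whole percent."""
    return round(100 * part / whole)

print(share(159, 999))   # 1650S vs 2080 Ti MSRP -> 16
print(share(159, 2499))  # 1650S vs Titan RTX MSRP -> 6
print(share(399, 1599))  # 4060 Ti 8GB vs 4090 MSRP (assumed figures) -> 25
print(share(499, 1599))  # 4060 Ti 16GB vs 4090 MSRP (assumed figures) -> 31
```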
 
If we consider the entirety of the 16/20-series lineup, the top card was the 2080 Ti (yes, I know it came later, but as I said, I'm considering the whole generation sans the Titan RTX, which has the same number of cores as the 2080 Ti). Now, the MSRP for the 2080 Ti was $999. So with the 1650S we had roughly 28% of the cores of the 2080 Ti at ~16% of its MSRP. Were I to consider the Titan ($2499), that'd be 6% of the MSRP.
With the 5060 Ti 8GB/16GB we have ~24% of the cores at 19%/22% of the MSRP against the 5090. That's a regression, but at least it's better than the 4060 Ti was against the 4090 (27% of the cores at 25% or 31% of the price).
Naming only becomes an issue because of the perception it carries. Back then, the 5-tier meant compromising, not the 6, and this analysis only means that both the 4060 Ti and 5060 Ti are xx50-tier cards in disguise.

I mean, if you look back at reviews of cards like the 750 Ti, 950 Ti, 1050 Ti and 1650 non-Super, none of those managed 60 fps at 1080p in most games. More like the 30s-40s. The 60-series cards now still provide a substantially better experience than those cards did then.
 
I mean, if you look back at reviews of cards like the 750 Ti, 950 Ti, 1050 Ti and 1650 non-Super, none of those managed 60 fps at 1080p in most games. More like the 30s-40s. The 60-series cards now still provide a substantially better experience than those cards did then.
I can't recall when 1080p became the mainstream resolution, though. As far as I recall (and I may admittedly be wrong here), 1366x768 was way more common as the resolution for such users back then.
I also reckon that may be a third world bias I have. I used a 1680x1050 monitor through the span of at least three different budget cards from 2009 until 2018 (HD4350, HD7770 and RX460).
 
I can't recall when 1080p became the mainstream resolution, though. As far as I recall (and I may admittedly be wrong here), 1366x768 was way more common as the resolution for such users back then.

I don't either actually, I used to have a 750 Ti and I think I had a 1680x1050 monitor at the time.

Here are the lowest resolutions TPU used. For the 750 Ti it was 1600x900. In Battlefield 4 the 750 Ti only managed 37 fps (or 28 at 1080p).


GTX 950 (not Ti, my mistake) they used 1600x900 as well.

From 1050 Ti they used 1080p.

I know people are looking at how it compares proportionally to the very high end and working backwards from there; the trouble with that is that the current high end is out of sight faster than the lower cards, to a degree the high end never used to be.
 
I don't either actually, I used to have a 750 Ti and I think I had a 1680x1050 monitor at the time.

Here are the lowest resolutions TPU used. For the 750 Ti it was 1600x900. In Battlefield 4 the 750 Ti only managed 37 fps (or 28 at 1080p).


GTX 950 (not Ti, my mistake) they used 1600x900 as well.

From 1050 Ti they used 1080p.

I know people are looking at how it compares proportionally to the very high end and working backwards from there; the trouble with that is that the current high end is out of sight faster than the lower cards, to a degree the high end never used to be.
Funny thing is: at the time, 30 fps was the bar for things to be considered "playable" at the low end; 60 fps was the holy grail. On that metric with current standards (1080p60 being the "playable"/"acceptable" minimum), the 5060 Ti stands at that exact same position. So why did prices seemingly skyrocket?
 
Funny thing is: at the time, 30 fps was the bar for things to be considered "playable" at the low end; 60 fps was the holy grail. On that metric with current standards (1080p60 being the "playable"/"acceptable" minimum), the 5060 Ti stands at that exact same position. So why did prices seemingly skyrocket?

I dunno, I think on PC 60fps has been the standard for 20 years, at least in genres like shooters. Nobody was happy playing Unreal Tournament / UT2003+4, Quake 3, Call of Duty, Battlefield etc. at 30fps. I agree that 1080p wasn't mainstream until more recently though (but still around 10 years ago). 1024x768 or 1280x1024 were mainstream back then, 'HD' and 16:9 came later after the Xbox 360 took off I think, coinciding with the popularity of HD flat screen televisions. (Those are all multiplayer and remember CoD was 60fps even on the Xbox 360.)

Cards like the RTX 4060 non-Ti aren't REALLY that expensive, neither the MSRP nor the actual price you pay. Prices took off during the bitcoin craze and Covid and have come down a lot since; inflation-adjusted, they really aren't particularly high, and the 5060 series has actually got a price reduction both nominally and in real terms. They might not be fantastic value, but they aren't terrible either, by any means.
 
When the high end is $3k and the midrange $800 to $900, yeah, $400 is entry level. When the high end was $800 (1080 Ti), entry level was $150 to $200. It's all relative.
It's all relative... to historical pricing and performance. Other GPUs exist. It's not entry level, last gen is entry level. This is a midrange GPU just like the 7700 XT was a midrange GPU. If you're gonna argue that $400 is entry level because nonexistent $2000 halo products exist then all computers are entry level because billion dollar mainframes exist.

ACTUAL entry-level GPUs have been replaced by APUs.
 
idk, I was doing 4K 60 in like 2013. Bear in mind it was on two 290Xs, but still, it's insane to think that we could do 4K back then with a 4GB framebuffer and now 8GB is the bare minimum. Maybe we just need to stop outsourcing games to slophouses using UE.
 
It's all relative... to historical pricing and performance. Other GPUs exist. It's not entry level, last gen is entry level. This is a midrange GPU just like the 7700 XT was a midrange GPU. If you're gonna argue that $400 is entry level because nonexistent $2000 halo products exist then all computers are entry level because billion dollar mainframes exist.

ACTUAL entry-level GPUs have been replaced by APUs.
Billion-dollar mainframes do not play games. The 5060 Ti 8GB is literally the lowest/cheapest current-gen GPU that exists out of the 6 Nvidia GPUs. Arguing that that is mid-range is plainly dishonest. It's like finishing last in a race and then arguing that you were middle of the pack. No, you were not.
 
It does change the point.... the 9060 XT die is limited to 4 x 32-bit controllers, i.e. 128-bit.... lol.

32 CU card. It competes directly with the 181mm2 GB206, except worse off since it's on GDDR6. I expect a similar <200mm2 size.

AMD would have to strip down the 357mm2 NAVI 48 die currently used for the 9070 and 9070 XT.

There's no GB205 competitor on AMD's side.

No it doesn’t?

1) Clamshell 16GB model and don’t release an 8GB
2) Design it with a 192-bit bus in the first place
3) Source other IC or memory types allowing for higher memory capacity overall.

The point, and it’s sad I have to restate this, is that the card should never have been released (5060 Ti and 9060 XT) with 8GB variants. It has absolutely nothing to do with how it’s possible; take your semantics somewhere else.
 
Billion-dollar mainframes do not play games. The 5060 Ti 8GB is literally the lowest/cheapest current-gen GPU that exists out of the 6 Nvidia GPUs. Arguing that that is mid-range is plainly dishonest. It's like finishing last in a race and then arguing that you were middle of the pack. No, you were not.

Finish the idea.

In the 3000 series the 3080 existed, the 3070, the 3060, and the last chronological release, the 3050.

The 4000 series ended with the 4060...because the 4050 never released. Maybe if you grasp at straws you can say it didn't exist...rather than that Nvidia decided not to bring it to market given the profit margins. It's not true, but you could say it.

We just got the 5000 series this year. It's doing the same releases from higher end down to lower end, and the 5060 just launched. There's no indication that there won't be a 5050...but a month ago the "lowest end available" 5000 series card was the 5070. By your logic, last month the 5070 was the same failure to meet the expectations of the low-end market. So, this is the point where you either admit that a very narrow time window is silly, or that you just think Nvidia's offerings suck for anything short of astronomical pricing. Me, I'm in the latter camp. You do you though.
 
Jensen big scam :rockout:
 
And we go back to the pricing yet again. When the 1650 was $159, the top-end card cost $649. Now the top end costs $2k MSRP. This is going around in circles, isn't it...
News flash:

You're talking out of your ass


Adjusted for inflation that's a (waaaay...) beyond 2k card right there.

Should I go on, or?
The lesson of the day for you in GPU history here is that if you drop your green goggles, you can actually see that expensive GPUs have always lived right next to competitive budget offerings. What we are seeing now is not business as usual; it's a market lacking competition and owned by a practical monopolist. You can sugar-coat that with piles of cognitive dissonance, or you can just be honest with yourself ;) The vast majority has already figured this out, you know.
 
Finish the idea.

In the 3000 series the 3080 existed, the 3070, the 3060, and the last chronological release, the 3050.

The 4000 series ended with the 4060...because the 4050 never released. Maybe if you grasp at straws you can say it didn't exist...rather than that Nvidia decided not to bring it to market given the profit margins. It's not true, but you could say it.

We just got the 5000 series this year. It's doing the same releases from higher end down to lower end, and the 5060 just launched. There's no indication that there won't be a 5050...but a month ago the "lowest end available" 5000 series card was the 5070. By your logic, last month the 5070 was the same failure to meet the expectations of the low-end market. So, this is the point where you either admit that a very narrow time window is silly, or that you just think Nvidia's offerings suck for anything short of astronomical pricing. Me, I'm in the latter camp. You do you though.
Uhm, of course Nvidia cards (and to the same extent AMD ones, but whatever) suck for the asking price. All of them, top to bottom. That's why I didn't buy a 5xxx. But that's not relevant to whether or not an 8GB card at the entry-level tier is or isn't good.

News flash:

You're talking out of your ass


Adjusted for inflation that's a (waaaay...) beyond 2k card right there.

Should I go on, or?
The lesson of the day for you in GPU history here is that if you drop your green goggles, you can actually see that expensive GPUs have always lived right next to competitive budget offerings. What we are seeing now is not business as usual; it's a market lacking competition and owned by a practical monopolist. You can sugar-coat that with piles of cognitive dissonance, or you can just be honest with yourself ;) The vast majority has already figured this out, you know.
Dual-die GPUs to make a point. Green goggles, lol. I'll be saying the same things next month when AMD releases their 8GB card, don't you worry.
 
These cards were never meant for a gaming station but for basic tasks, like driving monitors in Windows 11.
(...)

Some want to use a monitor with their computers and therefore need a dedicated graphics card with several display outputs, dedicated memory, and dedicated video encoders and decoders.
Sorry but that's nonsense. If you want a video output for basic tasks, in 2025, you use your iGPU.
Or, if you don't have one/need something slightly beefier, you get a 50 tier GPU. Anything above that is overpriced.
 
The 5060 Ti 8GB is literally the lowest/cheapest current-gen GPU that exists out of the 6 Nvidia GPUs. Arguing that that is mid-range is plainly dishonest. It's like finishing last in a race and then arguing that you were middle of the pack. No, you were not.
Using that logic, the RTX 5080 was an entry-level card for six weeks prior to the RTX 5070 launch, since it was the lowest/cheapest current-gen GPU out of the two RTX 5xxx Nvidia GPUs. Arguing that it was a high-end card while finishing last to the RTX 5090 would be incorrect, as, to paraphrase an online philosopher, "no are you not".
 
No it doesn’t?

1) Clamshell 16GB model and don’t release an 8GB
2) Design it with a 192-bit bus in the first place
3) Source other IC or memory types allowing for higher memory capacity overall.

The point, and it’s sad I have to restate this, is that the card should never have been released (5060 Ti and 9060 XT) with 8GB variants. It has absolutely nothing to do with how it’s possible; take your semantics somewhere else.

1) Increases logistical cost (8 x 2GB ICs), especially capacitors near the back-side ICs.

The entire point of a smaller die is to incrementally cram more performance (SM/CU) into a smaller package and allow the same or better performance relative to the previous generation. Yield per wafer obviously matters too: AMD/NVIDIA can extract way more 200mm2 units than stuff nearing 400mm2 per 300mm wafer.

MC layout is sacrificed for more SM/CU.

The obvious adaptation for "low end" is 3/4GB IC on GDDR7. NVIDIA is just being greedy as hell right now.

Both AMD and NVIDIA have the same goals and motives. It's clear as day how they're segmenting the newer "60" classes. I'm sorry, it's not 2016 anymore. 128 (4 x 32) bit is the new "192 bit" config for future cards.

2) AMD completely skipped the "mid tier" and went a little bigger this generation via NAVI 48 (357mm2). It's almost the same size as GB203 (378mm2), but denser transistor-wise.

The last "big" monolithic AMD die was 520mm2 on TSMC 7nm, 80 CU. A lot of the density increase on N48 seems to be RT-related; the same should be true for Navi44.

In reference to the 5060 Ti specifically, 128-bit @ 448 GB/s with 28 Gbps GDDR7 shouldn't exactly bottleneck a card at 1080p/1440p. I guess I'm just tired of this "bit bus" nonsense when bandwidth and cache are the only things that matter in this aspect. You're just mad you're getting less relative to legacy segmentation, and I can understand that.

Both the 5060 Ti (36 SM) and the 9060 XT (32 CU) are inherently weaker, at least relative to what NAVI 48 (64 CU) and GB203 (84 SM) are. All of these are fully enabled and used interchangeably with Quadros and whatever AMD brands theirs as.

AMD using GDDR6 puts them behind NVIDIA in the real world bandwidth metric, regardless of double sided PCB implementation. It will end up more expensive long term as GDDR6 will start to phase out. They dug themselves in a hole, but prob don't care as much given sales have been good with Navi48.

3) What other ICs? They're not gonna use HBM again (SoC design regardless), and GDDR6X is an NVIDIA/Micron deal. The memory controller on AMD's side is designed for GDDR6... We're kind of stuck with the 2GB config in 8/16 layouts given the die layout on AMD's end.

I agree with you... the 5060 Ti shouldn't have released with 8GB or 16GB. It should have been delayed and launched with 3GB ICs (12GB). 448 GB/s with the inherent cache pool won't bottleneck a 36 SM card.

Even if AMD launches a 192-bit GDDR6 model (9070 GRE? cut from N48?), the bandwidth would be almost even with NVIDIA's 128-bit GDDR7.

Semantics are fine. It paints the why. Hurr durr "we need 192 bit" means nothing.
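The bandwidth figure this post leans on (448 GB/s on a 128-bit bus) follows from the standard formula: peak bandwidth = (bus width in bits / 8) x per-pin data rate in Gbps. A quick sketch; the 20 Gbps GDDR6 rate for the hypothetical 192-bit model is an assumption, since the thread doesn't specify it:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(128, 28))  # 5060 Ti: 128-bit GDDR7 @ 28 Gbps -> 448.0
print(peak_bandwidth_gbs(192, 20))  # hypothetical 192-bit GDDR6 @ 20 Gbps -> 480.0
```

This is why the post argues bus width alone is meaningless: a narrower bus on faster memory can land within a few percent of a wider bus on slower memory.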
 
Uhm, of course Nvidia cards (and to the same extent AMD ones, but whatever) suck for the asking price. All of them, top to bottom. That's why I didn't buy a 5xxx. But that's not relevant to whether or not an 8GB card at the entry-level tier is or isn't good.


Dual-die GPUs to make a point. Green goggles, lol. I'll be saying the same things next month when AMD releases their 8GB card, don't you worry.

Do you need me to quote you again? I mean, I can:
"...The 5060 Ti 8GB is literally the lowest/cheapest current-gen GPU that exists out of the 6 Nvidia GPUs. Arguing that that is mid-range is plainly dishonest. It's like finishing last in a race and then arguing that you were middle of the pack. No, you were not."


I read the comment before, and you're absolutely cranky about the xx60 levels of performance. You clearly want to die on that hill, but your death is cleanly defined as only viable in this exact moment...because reasons. That's a pretty dishonest point of contention, but I really don't feel that you have the introspection to slow down and admit your own bias. The reason I gave you the out of just being price frustrated was because it's an easy end to your rant without having to walk anything back because of a dishonest argument.

Consider the invitation walked back from my side. Continue to rant and rave. The tariffs in the US, barring some specific items or stuff from China, are at 10%. Despite that, the Microcenter in Charlotte, NC had the $889 model 9070 XTs in stock on Monday...and sold out Tuesday. Pricing sucks because everyone is gouging; they all think they can gouge and then blame an outside force for their business's profitability level being adjusted up...and people will take it. I look at this and only see greed, which will keep me on my current hardware until the inevitable crash, which will reset the price of goods down to the detriment of companies who view the greed-based reset as acceptable. As far as I'm concerned, either stop whining or make your own. In about 1-2 years, all of the extra capacity will crater the market anyway...so I look forward to the 2027 return to gaming not sucking raw eggs through a lemon straw.
 
Using that logic, the RTX 5080 was an entry-level card for six weeks prior to the RTX 5070 launch, since it was the lowest/cheapest current-gen GPU out of the two RTX 5xxx Nvidia GPUs. Arguing that it was a high-end card while finishing last to the RTX 5090 would be incorrect, as, to paraphrase an online philosopher, "no are you not".
Past experience should have made you assume that there would be plenty of cards on a lower tier than the 5080. That is not the case with the 8GB 5060 Ti. Even after the full 5xxx lineup is launched, it will remain bottom of the barrel. Even if Nvidia releases a 5050 and all the Super refreshes, the 5060 Ti 8GB will be among the lowest offerings and nowhere near the middle of the stack. Out of the 11 or 12 cards of the generation, it will be 9th or 10th.
 
Past experience should have made you assume that there would be plenty of cards on a lower tier than the 5080. That is not the case with the 8GB 5060 Ti.
lol, name one series from Nvidia where the x060 Ti was the last card in the series. You are just trolling the forums and spewing gibberish on a topic you have clearly shown to know nothing about.
 