
NVIDIA Pushes GeForce RTX 5060 Ti Launch to Mid-April, RTX 5060 to May

Indeed.

These 8GB comments always happen in topics where new 8GB cards are announced ;) I think they should be viewed in that context. It's blatantly obvious to me, but apparently not to everyone. You're not going to 'un-buy' your 8GB card, are you?
I don't know, I'm reading a lot of comments (not just on TPU) that 8GB cards can't play AAA games, and I'm like, huh, you haven't tried it, have you? Sure, ultra settings are usually a pipe dream (unless you drop to 1080p and maybe use upscaling on top), but ultra settings are usually so much heavier than high that dropping to the high preset gives you something like 50 to 100% extra performance.

Also, what people are missing is that if, say, you can run TLOU with high textures on 8 GB of VRAM, then even in games where you have to drop to low, the texture quality should be at least equal to TLOU at high. If texture quality is way worse in some new game while it hogs 8+ GB of VRAM, then there is clearly an issue with the game. Textures can't keep getting worse and worse while requiring more and more VRAM without it being the fault of the devs.
 
I don't know, I'm reading a lot of comments (not just on TPU) that 8GB cards can't play AAA games, and I'm like, huh, you haven't tried it, have you? Sure, ultra settings are usually a pipe dream (unless you drop to 1080p and maybe use upscaling on top), but ultra settings are usually so much heavier than high that dropping to the high preset gives you something like 50 to 100% extra performance.
Oh well, the 1060 6GB was running fine until a few years back and has now officially shit the bed; these things take time. But nobody in the market for GPUs is buying them for today, and if you HAVE an 8GB card, I don't think you'll be interested in upgrading to another one either. And in THAT context, which attracts a lot of viewers because it's potentially everyone in the market for a new GPU, 8GB is not enough to play properly.

Nobody wants to buy a new card and immediately drop to bottom-end settings. So I do know why I'm reading a lot of those comments. Context. Especially the crowd that hasn't got a lot to spend will want to make their GPUs last. They're all sides of the same coin, really.
 
Oh well, the 1060 6GB was running fine until a few years back and has now officially shit the bed; these things take time. But nobody in the market for GPUs is buying them for today, and if you HAVE an 8GB card, I don't think you'll be interested in upgrading to another one either. And in THAT context, which attracts a lot of viewers because it's potentially everyone in the market for a new GPU, 8GB is not enough to play properly.

Nobody wants to buy a new card and immediately drop to bottom-end settings. So I do know why I'm reading a lot of those comments. Context.
Do you think the 1060 died because of VRAM, though? I think it's just too slow to run any new game at decent settings regardless of VRAM.
 
Do you think the 1060 died because of VRAM, though? I think it's just too slow to run any new game at decent settings regardless of VRAM.
No, it was quite well balanced, but the full-fat 6GB version (not the lower-shader one) certainly could still run games at (raster) ultra 1080p and just didn't quite have the VRAM for it. Total War is one of those games; hell, it even killed my 1080 with 8GB while still pushing 40+ FPS at native res. Stutters happen.
 
Do you think the 1060 died because of VRAM, though? I think it's just too slow to run any new game at decent settings regardless of VRAM.
That's exactly what I'm noticing in some of the games: even if I DO drop textures to low, so it's definitely not a VRAM issue, the overall performance is still meh, so not even more VRAM would save this card in those cases.
Also, I've never said I support the idea of upgrading to an 8GB card in 2025, but like you said, if you already have one, then it's still usable without having to completely murder your in-game settings.

Personally, I've wanted to upgrade to a 4070 Super, but the prices went totally bonkers in my country, and they are also hard to find even on the second-hand market where I usually buy my stuff.
The 5070 is too expensive for what it is, and the 16 GB 5060 Ti probably won't have enough of a raster performance uplift to be worth it for me, but I'll see when it's out. 'that and most likely the pricing will be crap as usual'
 
Except that I honestly don't give a damn what a card is positioned at or used to be.
I check the overall performance and the pricing in my country and THEN decide what to buy, not whatever number it's called, and in my case it's usually the 60 series. Heck, I used to run a GTX 950 for nearly 3 years, and that never stopped me from playing any of my games at the time, nor does my 3060 Ti now, though I do notice that UE5 games are pretty difficult to run. 'hence why I want an upgrade later this year'

To me it's TAA that looks like total ass, with its flickering and image instability in most games, and I notice that right away; the so-called DLSS 'quality' blur I do not, so there's that.
Also, there is no such thing as tiers of gaming, except maybe in the eyes of an elitist/snob; gaming is gaming as long as one is enjoying it. 'some ppl argue that console gamers can't possibly enjoy their locked 30 fps and yet most of them actually do or simply don't care'

No, I would never replace PC gaming with console gaming, because first of all I despise controllers for 90%+ of game genres, and I also enjoy tinkering/tweaking with my games, which I can't do on a console, so no thanks. :)
In a practical sense, though, you do still agree with me, because you'd never buy another 8GB card, right? You've mentioned you'd have bought a 4070 if you could've. And of course, what settings and perf level are palatable to you is a you thing. Nobody is contesting that, I think. The real and only point here is the stagnation and the killed upgrade path: back in the day you could reasonably move through x60 cards every generation and keep finding more performance and more games that become playable. That's no longer the case; now that only happens if you also get cool DLSS 4 support, so you can still suffer the same latency but with more frames.
 
That's exactly what I'm noticing in some of the games: even if I DO drop textures to low, so it's definitely not a VRAM issue, the overall performance is still meh, so not even more VRAM would save this card in those cases.
Also, I've never said I support the idea of upgrading to an 8GB card in 2025, but like you said, if you already have one, then it's still usable without having to completely murder your in-game settings.

Personally, I've wanted to upgrade to a 4070 Super, but the prices went totally bonkers in my country, and they are also hard to find even on the second-hand market where I usually buy my stuff.
The 5070 is too expensive for what it is, and the 16 GB 5060 Ti probably won't have enough of a raster performance uplift to be worth it for me, but I'll see when it's out. 'that and most likely the pricing will be crap as usual'
I know, man. The game that started the whole "8GB VRAM is not enough" thing was Hogwarts Legacy, and I actually finished the entire game on a 3060 Ti. Everything ultra + maxed-out RT except shadows, textures @ high, 3440x1440 at DLSS Quality resolution. FPS was hovering around 60 to 70. That's not just decent settings; that's absolutely great considering RT. But YouTubers used everything ultra, and oh well... let's create some clickbait videos.
 
I know, man. The game that started the whole "8GB VRAM is not enough" thing was Hogwarts Legacy, and I actually finished the entire game on a 3060 Ti. Everything ultra + maxed-out RT except shadows, textures @ high, 3440x1440 at DLSS Quality resolution. FPS was hovering around 60 to 70. That's not just decent settings; that's absolutely great considering RT. But YouTubers used everything ultra, and oh well... let's create some clickbait videos.
But that's really the issue with most discussions on our forum: they're defined by clickbait bullshit and only work to increase polarization and 'different opinions' when in fact we all feel the same about it. If we buy something, it needs to be worth a damn.
 
But that's really the issue with most discussions on our forum: they're defined by clickbait bullshit and only work to increase polarization and 'different opinions' when in fact we all feel the same about it. If we buy something, it needs to be worth a damn.
Well, that's the good thing about written reviews like TPU's: they don't need clickbait titles and thumbnails.
 
In a practical sense, though, you do still agree with me, because you'd never buy another 8GB card, right? You've mentioned you'd have bought a 4070 if you could've. And of course, what settings and perf level are palatable to you is a you thing. Nobody is contesting that, I think. The real and only point here is the stagnation and the killed upgrade path: back in the day you could reasonably move through x60 cards every generation and keep finding more performance and more games that become playable. That's no longer the case; now that only happens if you also get cool DLSS 4 support, so you can still suffer the same latency but with more frames.
It depends. Personally I wouldn't, because I'm a variety gamer, so I do plan on playing the heavier games too, like Borderlands 4 later this year. I guess if someone is only playing lighter games, then a new 8 GB card is still okay if the price is right. Aight, I can't really post on a phone while sitting on a bus, so I'm out for now.
 
Indeed.

These 8GB comments always happen in topics where new 8GB cards are announced ;) I think they should be viewed in that context. It's blatantly obvious to me, but apparently not to everyone. You're not going to 'un-buy' your 8GB card, are you?
This.

Owning an 8GB card doesn't automatically invalidate it; it's likely a few years old now, and having to tweak/tune/edit launch parameters etc. to get past the GPU's limitations is somewhat expected for ageing GPUs in the latest games.

What's stupid is buying a relatively expensive (probably $350+) GPU in 2025 that already lacks the VRAM to survive game launches in multiple titles from 2023-2025 (with increasing regularity) and expecting it to be an acceptable card for the next 3-5 years. The games industry has moved on from 8GB baselines; 8GB cards aren't going to start suffering in the future, they're already suffering now.
 
Why bother, just scrap them before production starts.
 
The games industry has moved on from 8GB baselines; 8GB cards aren't going to start suffering in the future, they're already suffering now.
See, this is the part that I have issues with. The industry hasn't moved on from 8GB baselines. Not even close. Handhelds are becoming more and more popular and have way less than 8 GB of VRAM. Baseline means the minimum, and 8 GB ain't it. They aren't suffering now either. Suffering means being in a dire condition, having to drop everything to low or something. That's far from the truth.
 
See, this is the part that I have issues with. The industry hasn't moved on from 8GB baselines. Not even close. Handhelds are becoming more and more popular and have way less than 8 GB of VRAM. Baseline means the minimum, and 8 GB ain't it. They aren't suffering now either. Suffering means being in a dire condition, having to drop everything to low or something. That's far from the truth.
The industry is not a homogeneous entity; there are games that do and an increasing number that won't or simply can't. It really depends what developers choose to aim for in reach and sales. Handhelds with iGPU-range specs certainly help in creating market share for that. But those optimizations ARE tweaking towards the very bottom end just to get 30-odd FPS at, say, 720p, or maybe 1080p low at 30. We are also looking at 15-25 W chips here. Pairing that with 8GB makes sense.
 
See, this is the part that I have issues with. The industry hasn't moved on from 8GB baselines. Not even close. Handhelds are becoming more and more popular and have way less than 8 GB of VRAM. Baseline means the minimum, and 8 GB ain't it. They aren't suffering now either. Suffering means being in a dire condition, having to drop everything to low or something. That's far from the truth.
Dunno, there was a series of articles here on TPU last year about how well things run on the Ally/Deck, and week after week it was like "nope, nope, not great, only plugged in", etc.
Here you go: 10 games on Steam Deck, only 2 run acceptably (30 FPS native), 4 are questionable sub-30 FPS experiences even with upscaling, and 4 are just absolutely no good on Deck.

 
Dunno, there was a series of articles here on TPU last year about how well things run on the Ally/Deck, and week after week it was like "nope, nope, not great, only plugged in", etc.
Plugged in doesn't increase the VRAM; it increases the clock speeds. You just made my point for me: it isn't VRAM that's holding it back but actual performance.
 
Plugged in doesn't increase the VRAM; it increases the clock speeds. You just made my point for me: it isn't VRAM that's holding it back but actual performance.
It's obviously NOT VRAM holding a 25 W APU back when you've got 8GB, no.

Have you been drinking, or? You should take a brief look at the other specs of these handhelds, I think. The point about 8GB cards is about discrete cards with 4-5x the performance. It's about good balance.

But regardless of anything, if you give a chip more power, it will produce more frames, irrespective of VRAM. That doesn't say jack shit about whether its VRAM capacity is actually holding the chip back. Frametime consistency is one big thing here, for example, with the bandwidth of the VRAM being a big influence on that. Another is the overall quality level and the actual framerate you get. These blanket statements do nobody any favors. It's odd having to explain this to you, frankly; I think you know better ;)
 
It's obviously NOT VRAM holding a 25 W APU back when you've got 8GB, no.

Have you been drinking, or? You should take a brief look at the other specs of these handhelds, I think. The point about 8GB cards is about discrete cards with 4-5x the performance. It's about good balance.

But regardless of anything, if you give a chip more power, it will produce more frames, irrespective of VRAM. That doesn't say jack shit about whether its VRAM capacity is actually holding the chip back. Frametime consistency is one big thing here, for example, with the bandwidth of the VRAM being a big influence on that. Another is the overall quality level and the actual framerate you get. These blanket statements do nobody any favors. It's odd having to explain this to you, frankly; I think you know better ;)
At this point I think we just agree to disagree. We're not going to change his mind. Let him spend $350 on a 5060 or $450 on a 5060 Ti 8GB and mute him when he starts moaning about all the new releases that fail to even launch on 8GB cards throughout 2025, 2026, etc.

At some point developers will just stop caring about 8GB GPU PC owners, because they'll be too small a slice of the audience for the effort it takes to make all the compromises just for them. It's already started happening, and it'll only get worse over time.
 
The latest heavy AAA game just came out, and the 4060 Ti 8GB vs 16GB perform identically, even at minimum FPS, even at 4K ultra settings. Even at 4K ultra settings + RT. 'Nough said, I think.

[chart: minimum FPS at 3840x2160]
 
The latest heavy AAA game just came out, and the 4060 Ti 8GB vs 16GB perform identically, even at minimum FPS, even at 4K ultra settings. Even at 4K ultra settings + RT. 'Nough said, I think.

[chart: minimum FPS at 3840x2160]
The 4060 Ti is neutered by its abysmal bandwidth before even getting to that point; the extra VRAM is just sipping power that can't go to core clocks. The 16GB version is never faster.

It's also just this game; for example, the A770, with much better bandwidth and 16 GB, swaps places in the hierarchy in KCD2, even though KCD2's VRAM requirement is supposed to be higher, and you'd expect the polar opposite there. It's entirely engine/scene dependent what you get at this point. We're also looking at sub-20 FPS.
 
Latest AAA heavy game just came out, 4060ti 8gb vs 16gb perfom identical even at minimum fps even at 4k Ultra settings. Even at 4k ultra settings + RT. Nough said I think

Nah, you haven't said enough, because you picked one game. Now look at the graph below. The 4060 Ti 16GB is 31.1% faster than the 4060 Ti 8GB at 2560x1440 with RT.

31.1% is literally two tiers, lol. Oh, and it'll only get worse in the coming months with the way the trajectory has been.

Enough said? No. In a year or two, when the gap is more than 50%, maybe enough will have been said.
 
Nah, you haven't said enough, because you picked one game. Now look at the graph below. The 4060 Ti 16GB is 31.1% faster than the 4060 Ti 8GB at 2560x1440 with RT.

31.1% is literally two tiers, lol. Oh, and it'll only get worse in the coming months with the way the trajectory has been.

Enough said? No. In a year or two, when the gap is more than 50%, maybe enough will have been said.
Because obviously you'll be buying an 8GB low-end card to play RT games at ultra settings. Bud, this is just getting stupid. Yes, obviously, if you want to play ULTRA with RT on top of that, you don't get an 8GB card. In fact, you don't get an xx60-tier card regardless of VRAM. You are doing exactly what I predicted on page 1: you are looking at ultra maxed-out settings and claiming 8GB is obsolete. Well, duh. In the same vein, a 5090 is obsolete; it barely manages 25 FPS in Cyberpunk maxed out. Way to go, man.
 
In a practical sense, though, you do still agree with me, because you'd never buy another 8GB card, right? You've mentioned you'd have bought a 4070 if you could've. And of course, what settings and perf level are palatable to you is a you thing. Nobody is contesting that, I think. The real and only point here is the stagnation and the killed upgrade path: back in the day you could reasonably move through x60 cards every generation and keep finding more performance and more games that become playable. That's no longer the case; now that only happens if you also get cool DLSS 4 support, so you can still suffer the same latency but with more frames.
Aight, I finally got the chance to actually reply properly. 'long-ass day, huh..'
I do agree in a sense that buying a brand-new 8 GB card in 2025, especially considering the prices nowadays, is indeed unwise, unless the person actually has a list of specific games to play and those games are fine with 8 GB at the given resolution.
What performance and settings are fine for each person is subjective, of course, and that's why I refuse to argue that part, since it's just pointless.

Stagnation, yeah, I've noticed that, but let's be real, it's happening to both AMD and Nvidia nowadays.
Like, 3060/Ti to 4060/4060 Ti is a whatever upgrade, but the same goes for RX 6600/XT to 7600; heck, even 6800 XT to 7800 XT was kind of meh, and even now with the 9000 series the biggest uplift was in the RT department, and FSR 4 is also exclusive to them as of now.
I'm not as tech-savvy as some of you here; I always skip the first few pages of a review where they break that part down, since I don't exactly care what node a GPU is on and all that stuff, since it doesn't affect my choice of GPU in the end. 'if it works well enough for my needs and fits my budget range then its good enough for me + preferably its not over 250 W max power draw'
So with that out of the way, idk if this is the reason, but I've read that the stagnation is also because shrinking the node is becoming increasingly difficult, and that's why we don't see those big jumps between generations anymore, and the focus is more on other features like RT and upscaling.

Personally, I don't hate any of those features nor blame them for anything, really; to me it's part of progressing tech in general, and they ain't going anywhere either. 'Devs being lazy and refusing to optimize their games is an entirely different can of worms that I would not like to address here..'
Upscaling is everywhere now, from consoles to handhelds; heck, I even use FSR on my mobile in the UE4-based gacha I'm playing, and on the PC client I'm using Transformer DLSS Quality via the Nvidia app, and it's definitely better than whatever crap built-in AA the game has. 'its not listed, just simply called AA, but it does look like a crappy implementation of TAA or maybe even FXAA'
Frame gen I can't say much about, since I can only try AMD frame gen, and that's a mixed bag so far, but it actually works kind of okay in Stalker 2 with Nvidia Reflex enabled on top; I tried it and it gave me a flat frametime at my capped 75 FPS 'monitor's refresh rate', and the latency was still good enough for me to pop ppl in the head with a pistol, so eh, it depends on the game, I guess.
 
Because obviously you'll be buying an 8GB low-end card to play RT games at ultra settings. Bud, this is just getting stupid. Yes, obviously, if you want to play ULTRA with RT on top of that, you don't get an 8GB card. In fact, you don't get an xx60-tier card regardless of VRAM. You are doing exactly what I predicted on page 1: you are looking at ultra maxed-out settings and claiming 8GB is obsolete. Well, duh. In the same vein, a 5090 is obsolete; it barely manages 25 FPS in Cyberpunk maxed out. Way to go, man.

Slow down, bud, you're missing the point. I didn't say it's not manageable with 8GB, nor did I state that 8GB is obsolete, so stop putting words in my mouth. I stated earlier that when faced with VRAM limitations, of course there are workarounds by fiddling with settings. Some deal with it, and some hate it (for a multitude of reasons).

The point I was making is that, unlike your example of only one game, I provided the whole suite. And the difference is 31% when going from 8GB to 16GB today. Guess what it was at the same resolution at the 4060 Ti 16GB's launch, less than two years ago? Around 2%.

So the overarching point is that the 8GB framebuffer limitation is getting significantly worse over time and will require more management moving forward. It's just fact.
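For anyone wondering where a figure like "31% faster" comes from, it's just the relative FPS delta between the two cards in the same benchmark run. A quick sketch (the FPS numbers here are made up for illustration, not TPU's actual data):

```python
# Hypothetical example: relative advantage of a 16GB card over its 8GB twin.
def vram_gap_percent(fps_16gb: float, fps_8gb: float) -> float:
    """Percentage by which the 16GB card outpaces the 8GB card."""
    return (fps_16gb / fps_8gb - 1.0) * 100.0

# e.g. if the 16GB card averaged 52.4 FPS and the 8GB card 40.0 FPS:
print(round(vram_gap_percent(52.4, 40.0), 1))  # -> 31.0
```

Run the same formula on launch-day numbers and on today's numbers, and the growth of the gap is the trajectory being argued about.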
 
Remember when the 8700G reviews came out and some outlets used the 6500 XT to blow it away? It's all about what you want to play and at what resolution. Most racing games play fine, and 1080p is not demanding if you are not playing what we call AAA.
 