Thursday, March 20th 2025

NVIDIA Pushes GeForce RTX 5060 Ti Launch to Mid-April, RTX 5060 to May
NVIDIA is reportedly pushing back the launch dates of its upcoming mid-range GeForce RTX 5060-series graphics cards by a couple of weeks each. The faster RTX 5060 Ti is now expected to launch sometime in mid-April 2025, while the RTX 5060 is slated for a month later, in mid-May, just before the media gears up for Computex 2025 later that month. The RTX 5060 Ti comes in 16 GB and 8 GB memory variants, and both are expected to launch around the same time, if not on the same day, while the RTX 5060 is expected to come only in an 8 GB variant.
Both SKUs are expected to be based on the "GB206" silicon, which probably features 36 or 40 streaming multiprocessors (SM), of which the RTX 5060 Ti is configured with 36, yielding 4,608 CUDA cores at 128 cores per SM. The RTX 5060 is significantly cut down, with 30 SM enabled for 3,840 CUDA cores. The silicon features a 128-bit GDDR7 memory interface, and both SKUs are expected to be configured with 28 Gbps memory speeds, giving them 448 GB/s of memory bandwidth.
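These rumored numbers are internally consistent: the CUDA-core count is the SM count multiplied by the cores per SM, and peak bandwidth is the bus width multiplied by the per-pin data rate, divided by eight to convert bits to bytes. A quick Python sanity check of the figures (the 128 cores-per-SM value is the standard figure for recent NVIDIA architectures, assumed here rather than confirmed by the source):

# Back-of-the-envelope check of the rumored GB206 specifications
CORES_PER_SM = 128  # assumed; standard for recent NVIDIA GPUs

def cuda_cores(sm_count: int) -> int:
    # total shaders = enabled SMs x CUDA cores per SM
    return sm_count * CORES_PER_SM

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # bits moved per second across the whole bus, divided by 8 for bytes
    return bus_width_bits * data_rate_gbps / 8

print(cuda_cores(36))           # 4608 -> RTX 5060 Ti
print(cuda_cores(30))           # 3840 -> RTX 5060
print(bandwidth_gb_s(128, 28))  # 448.0 GB/s for both SKUs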
Source: VideoCardz
72 Comments on NVIDIA Pushes GeForce RTX 5060 Ti Launch to Mid-April, RTX 5060 to May
Most games are being developed primarily for systems with 9-14 GB of VRAM available (Xbox and PS5). After that majority market is satisfied, the devs go back and modify things for the PC market, but we're second-class citizens: console gamers outnumber us, and more importantly they outspend us, and that's the only thing that really matters to game developers.
I don't get this "8 GB VRAM is not enough" talk. Are people actually suggesting that you can't play modern AAA games with an 8 GB card? Because I've got plenty of those (6600 XT, 3060 Ti) and they do great.
But of course the moment I mention that I'm using DLSS on Quality, my post is rendered invalid by those people, because to them it butchers the image quality, which it does NOT, especially with DLSS 4 and the Transformer model; it actually looks better than the built-in TAA in most if not all modern games. But hey, let them believe whatever they want, and I'll just keep playing the games that I'm supposedly not able to run on a 60-series 8 GB card. :rolleyes: Oh, and I even have RT force-enabled in one of my gacha games that runs on UE4, and it's all good.
I think a number of users around here on this enthusiast site and the like could use a reality check when it comes to users with budget to mid-range hardware, because no, they aren't only playing old games or e-sports, and they're having the same fun playing those games as anyone else. Again, my Indiana Jones experience is not a stuttery, low-FPS mess with missing textures or anything; it looks pretty damn good with my tweaked settings and usually runs at 60-75 FPS, which matches my refresh rate anyway.
My main reason for wanting an upgrade is that the raster performance of the 3060 Ti is really starting to show its age in UE5 games; even with low textures they just run like crap, so it's not a VRAM issue whatsoever, at least not at my resolution. I'm also a massive Borderlands fan, and 4 will be on UE5 and is coming later this year, so I could use a new GPU for that game alone.
Here's a benchmark proving that all the 8 GB and 10 GB cards completely shit the bed at launch, even at 1080p. The problem was that the developer targeted a 12 GB baseline and had to go back a few weeks after launch to tune the game for 8 GB cards by hacking image quality down even further.
Under no circumstance should a 6700 XT be twice as fast as an RTX 3080 unless the 3080 has run out of VRAM. In games where 10 GB is enough, the 3080 is a good 25-50% faster than the 6700 XT, not 50% slower.
I've experienced it in person, but these benches are all over the web and not hard to find. This isn't the first game to hate 8GB cards either, and it absolutely will not be the last.
Btw, the Game Pass version won't even let me enable RT in that game, though I wouldn't dare to use it there anyway. Also, the biggest issue in Indiana Jones is the draw-distance pop-in, which is only fixable with path tracing enabled or by using a console command that works on any system, as long as you're fine with a performance hit.
Again, lower-tier card users DO tweak their settings and most of the time use upscaling, so benches like that are completely moot in that regard, and guess what: the game still looks perfectly fine and runs with no issues whatsoever.
Like, really, do I have to record a video of myself playing the game or what? (Even though I can only use NVIDIA's software for that, so that might be another slight performance hit.)
I guess the Transformer model might be okay, but prior to DLSS 4, 1080p looked pretty horrible to me with any upscaling at all. This is obviously subjective, but I think that's an opinion shared by the majority. I don't think a 3080 owner who likely paid $1,000 for their GPU would call 720p upscaled "perfectly fine".
As for RT in Indiana Jones, it's on by default and you can't turn it off; the game simply can't run on non-RT cards like the 1080 Ti or 5700 XT.
Sure, you can't completely turn RT off, but the advanced settings are greyed out for me, and that's what I meant.
3060 12G is a fine card.
6700 10G was amazing value when you could find it in 2021/2022, before the GRE replaced it.
6700XT has always been a great recommendation.
edit - the 4060Ti 16GB is looking less stupid by the day
Every review of an 8GB card in the last few years has mentioned VRAM as a potential caveat going forwards, and often used VRAM as a reason to recommend alternative options in the conclusions. The issue was certainly very rare back in 2023 when you could count the number of games that sucked on 8GB on the fingers of one hand, but it's not 2023 any more.
We had this same tiresome discussion about 4 GB vs. 8 GB cards back in 2016/2017, with all of the Polaris cards vs. the 6 GB GTX 1060. All of those 4 GB cards turned out to have too little VRAM, and they've struggled to run many of the current-generation console games, where the 1060 and the 8 GB Polaris 480/570/580 cards have all handled those games just fine. I'm genuinely surprised that you think this situation with 8 GB vs. 12 GB cards is any different, other than to vehemently defend the 8 GB cards that are clearly unfit for some purposes in 2025. What about the 5060 8 GB that will be purchased in 2025 and expected to remain usable in games in 2028, 2029, or 2030?
What I meant is that such people and communities are perfectly fine playing whatever new games on such hardware, because they can and will tweak their settings, and they actually appreciate upscaling as a tech, unlike around here. :rolleyes:
I've had zero issues with my 3060 Ti in the past two and a half years, and no, VRAM was never an issue; it's more the raster performance in newer games, like I mentioned before. (Finished Hellblade 2 on this card.)
That's why I was looking at the 4070 Super as a possible upgrade, because I'm all good with 12 GB given that kind of raster and RT uplift over my 3060 Ti, but sadly those cards are gone or insanely expensive now.
I'm not saying it's impossible to play those games on a potato if that's all you have access to, but when a used 12GB GPU can be purchased for the cost of only 3 AAA games, why are you continuing to buy games you can't run when the hardware to run them costs so little?
Each to their own....
Exactly, he usually goes for the comical side and pushes to the extremes, or tries to hit 60 FPS, which I don't care about, so I usually play at higher settings and/or upscaling quality than he does.
There is NOT a single game on the market at the moment that I can't play with my 3060 Ti at my resolution at reasonable settings with quality upscaling, or at least I've yet to run into one, unless you count HL2 RTX. And like I said, I have Game Pass Ultimate, so I can try whatever is on there. (It was a gift; I didn't pay for it myself.)
Also, if you think that quality upscaling with DLSS is a blurry soup, then let's agree to disagree, and have fun with your TAA or whatever other crap. :)
You'd have to look at channels that cover image quality, like Digital Foundry, but I'm not sure they'd even cover Indiana Jones on lowest settings, because it simply didn't run on 8 GB cards at all. A quick Google search says that texture caching was reworked to accommodate 8 GB cards, at a performance hit, but that sure sounds better than 2-3 second pauses followed by a crash to desktop.
My friend who wanted to run it on a 4060 says it's now okay at 1080p medium, when it was an unplayable mess at launch.
Sorry, I meant to say "the scalpers are all getting discounts". Scalper prices will remain the same, because they're scalpers.
The point here is that the midrange has been at a practical standstill since Ampere (your 3060 Ti) in terms of overall performance. Hell, it's even worse: sometimes cards perform worse than their predecessors in a newer generation. You might even be lucky there with your relatively high-bandwidth Ampere card. Things have gotten worse: as long as games play nicely with the larger cache sizes, these cards can get by, but even that cache gets flooded, and then you're looking at raw bandwidth on top of that 8 GB that is similar to what you got in 2016 on that same range of cards. Cards that have gotten more expensive, too.
It's just a horrible, horrible deal, and somehow people feel adamant about parroting the "8 GB is enough" bullshit when, in reality, developers have been jumping through hoops to make their games fit into it ever since the PS5 and especially the Xbox Series S were released. With varying success... and ALWAYS with a hidden quality reduction that is sold as "optimization". Gamers are happy... but then you might want to think long and hard about what they're really happy about: stagnation, because their own hard-earned GPUs are actually way past their expiry date, yet still bought at full price.
x60 is therefore penny wise, pound foolish. The world has already moved way past this segment. Buy a console! You're wasting cash upgrading your 3060 Ti to a new x60. All you will get is more DLSS for your money and literally less GPU. The dies are tiny, the VRAM is lacking, the price is beyond sanity, and the resale value is gone.
And about DLSS at Quality mode, i.e. rendering below native res... to each their own, but I recognize that blurfest from miles away (yes, even DLSS, and even the Transformer model), and the motion resolution is terrible. The vaseline effect is there. If you play in that mode all the time, then sure, you can unsee it. But it's a race to the bottom. Stop deluding yourself. We never liked FXAA either, and it's remarkably similar. It was a necessary evil at best.
Let's call it what it is... you are rendering 720p like it's 2010, trying to get the newest game to run on your new card. Don't sugar-coat it: you've almost reached the absolute bottom of what gaming really is in 2025, and that is absolutely NOT where x60 was positioned historically. x60 was always the first card in the stack that could game properly; out of the box it certainly wasn't struggling with recent games on day one, and it most DEFINITELY wasn't rendering below native res to get playable frames.
And if all this is lost on you... it's really your loss, your money, and your choice. But the defense of it is getting pretty silly. NVIDIA is actively killing this segment, if you haven't noticed. You're defending the smoking crater of what this segment used to be.
TL;DR: if you have an 8 GB VRAM card, you're perfectly fine; enjoy your AAA games. Just don't buy one at a high price in 2025.
These 8 GB comments always happen in topics where new 8 GB cards are announced. ;) I think they should be viewed in that context. It's blatantly obvious to me, but apparently not to everyone. You're not going to 'un-buy' your 8 GB card, are you?
I check the overall performance and the pricing in my country and THEN decide what to buy, not based on whatever number the card is called, and in my case that's usually the 60 series. Heck, I used to run a GTX 950 for nearly three years, and that never stopped me from playing any of my games at the time, nor does my 3060 Ti now, though I do notice that UE5 games are pretty difficult to run. (Hence why I want an upgrade later this year.)
To me, it's TAA that looks like total ass, with its flickering and image instability in most games, and that I notice right away, whereas the so-called DLSS "quality" blur I do not, so there's that.
Also, there is no such thing as tiers of gaming, except maybe in the eyes of an elitist/snob; gaming is gaming as long as one is enjoying it. (Some people argue that console gamers can't possibly enjoy their locked 30 FPS, and yet most of them actually do, or simply don't care.)
No, I would never replace PC gaming with console gaming, because first of all I despise controllers for 90%+ of game genres, and I also enjoy tinkering with and tweaking my games, which I can't do on a console, so no thanks. :)
Also, what people are missing is this: if you can run TLOU with high textures on 8 GB of VRAM, then even in games where you have to drop to low, the texture quality should be at least equal to TLOU at high. If the texture quality is way worse in some new game while it hogs 8+ GB of VRAM, then there is clearly an issue with the game. It's impossible for textures to keep getting worse and worse while requiring more and more VRAM without it being the devs' fault.