Saturday, February 1st 2025
NVIDIA GeForce RTX 5060 and RTX 5060 Ti Rumored to Launch in March 2025
A recently leaked slide from the Taiwanese company Chaintech has seemingly confirmed the launch dates for the RTX 5060 and RTX 5060 Ti GPUs. Previous leaks have hinted at an early Q2 launch for the mid-range gaming GPUs, in both 8 GB and 16 GB VRAM flavors. Chaintech's slide does not reveal any specifications regarding the GPUs, although we do have a pretty good idea of what the upcoming GPUs will bring to the table.
As per recent leaks, the RTX 5060 and 5060 Ti are both expected to sport the GB206 GPU, paired with 8 or 16 GB of VRAM on a 128-bit bus. Despite employing the speedy new GDDR7 standard, there is no denying that 8 GB of VRAM is far from sufficient for a comfortable ray-traced gaming experience in 2025, and perhaps even less so in the near future. Considering that the Arc B580 ships with 50% more VRAM, the entry-level RTX 5060 is more than likely to be a hard sell for many people, unless, of course, the RTX 5060/Ti somehow pulls off impressive performance uplifts.
Source:
@wxnod
45 Comments on NVIDIA GeForce RTX 5060 and RTX 5060 Ti Rumored to Launch in March 2025
For ~$250, you have the option of many 11/12GB cards (B580, 6700 XT, 2080 Ti). None of those cards are 'bad' for 1080p, they're pretty cheap, and then VRAM isn't a worry.
For ~$400, you will likely have the option of a 7800 XT/9070 in the same time frame. The 9070 is rumored to be $400, and the 7800 XT may well drop even lower until it's gone; both are 16GB.
The $400 cards are 1440p cards in reality, 1080p for RT.
Why buy 8GB when you can get 11-12GB? Why buy a ridiculously priced 16GB card when you can buy a reasonably priced 16GB card? I can't see this out-performing a 7800 XT, even realistically in RT.
I'm not even arguing about 8GB. The point is there really isn't any reason to buy one anymore. If you want to argue a 2080ti vs a 6700xt/B580 (11GB vs 12GB), I'm with you on the 11GB of green.
Besides that, I just want to mention that I bought a Zotac for the 5-year warranty they provide. How lucky are people with how long their GPU lasts... realistically, under normal use, more than, let's say, 6-7 years? Not to mention that after a good full four years with a mid-range card, you very likely need to upgrade to be able to play the newest games without issues.
Either they can pay for it because it's that necessary for their job, or they are, e.g., a student using a NASA computer paid for with tax money :) (which is totally OK when something good comes out of it, which in the end it will; it's a numbers game).
No need to spend more than $300 for a product that covers 95% of the market.
I still use the same screen, a WQHD + FreeSync ASUS PA278QV.
I bought a Radeon 6600 XT in the first days; the GPU mining days made it expensive. This was with a 5800X + 2x32 GiB DDR4 RAM.
I switched the processor to what is basically a sidegrade, a Ryzen 7600X. (I mention that because the AM4 platform I sold and my current AM5 platform are very close, so it is not really an apples-to-bananas comparison. Of course, the AM5 platform used Windows 23H2, now 24H2, while the AM4 platform used Windows 10, so the operating system changes.)
I could see the benefit of more VRAM in various free games with the Radeon 6800 non-XT. That card was one of the last cards of the product cycle. VRAM matters, even for the free Epic Games giveaway games I play and enjoy most of the time.
I'm kinda happy with the PowerColor 7800 XT Hellhound. (That's the point of why I wrote this whole text wall: the 7800 XT.)
Raytracing is a nonsense feature. Hardly any of my games support it, except those free "trash" games AMD gave away: Avatar Pandora / The Last of Us / Star Wars Jedi: Survivor. (I never bought a Steam, Ubisoft, or EA game. My last game purchase was in the old days on WORM media.)
Why do you want raytracing with an entry-level graphics card? The NVIDIA 3070 was too weak; the Radeon 6800 non-XT and 7800 XT are too weak.
If you want raytracing, you should buy at least a 7900 XT or go higher.
RT (raytracing) is just the killer argument. I do not see a difference or any improvement with raytracing; I always see the same technical demonstration software (tech demos, in short) shown as work in progress. For entry-level gaming, a 7800 XT is a fairly decent card. It's an expensive card, but mine is at least quiet.
The cards above are for those who are willing to pay twice as much for a graphics card.
It was already hard to justify buying a graphics card that cost as much as my mainboard + CPU bundle and the expensive 2x32 GiB DDR5 DRAM.
I also believe these higher-end graphics cards draw more than the 220 watts I see. Of course, you get "up to" double the frames with "up to" 450 watts or 670 watts for a graphics card.
The idle consumption of ~47 watts for the NVIDIA 5090, according to the Gamers Nexus video (please check the video yourself, I'm not sure if it was 40 or 47, but either way much too high compared to my ~7-16 watts in idle), is another flaw. I trust them more because they measure, with verified equipment, all power going into a graphics card.
-- That VRAM topic is also settled. Keep staying in your 8 GiB or 12 GiB VRAM bubble if you want to. I saw a clear difference in certain games going from the 6600 XT 8GB to the 6800 non-XT 16GB card. I play the older games from Epic Games and such; with recent games I most likely would not use a 7800 XT, but something much better. I want to see how your 8GB card does at WQHD in Avatar Pandora. That game is demanding and was a free "trash" AMD giveaway game, with buggy quests and such. My statement is only valid for AMD graphics cards; I tested a 4GB NVIDIA card for driver quality in 2023. Intel is not really worth considering, in my point of view.
The reason I bought the 7800 XT Hellhound was the low noise in various tests. The MSI Radeon 6800 Z Trio and ASRock Challenger 6600 XT 8GB D were far too loud, in my point of view. That is one of many reasons why those cards had to go to the second-hand market.
Edit: the buyer should be aware that an 8GB VRAM card limits them to certain games, certain frame rates, certain display resolutions, and certain game settings. I would not call a 16GB card future-proof, but acceptable for games made up to the year ~2021 (feel free to determine the year limit yourself).
It's already been shown that 8GB isn't enough for just basic raster in most games. So when users enable RT, it will use more VRAM than the card has, which will cripple FPS. Then these NVIDIA-gaslit users will enable MFG to try and claw some FPS back, but the base FPS will be so low (e.g. 30 or less) that the lag will be terrible, with 100-200 ms of lag on your mouse movements. If the current nGreedia upsell trend continues, the 5060 will actually be a 5050, like the 4060 was a 4050.
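The latency claim above can be sanity-checked with back-of-envelope arithmetic: input lag roughly tracks the base (pre-generation) frame time multiplied by however many frames sit in the pipeline. The buffer depth of 4 frames below is an illustrative assumption, not a measured figure for NVIDIA's frame-generation pipeline.

```python
# Rough input-latency estimate for frame generation at a given base FPS.
# Assumes latency ~ buffered frames * base frame time; the buffer depth
# is a hypothetical value chosen for illustration.

def latency_ms(base_fps: float, frames_buffered: int = 4) -> float:
    """Approximate input latency in milliseconds."""
    return frames_buffered * 1000.0 / base_fps

print(f"{latency_ms(30):.0f} ms at 30 base FPS")  # ~133 ms
print(f"{latency_ms(60):.0f} ms at 60 base FPS")  # ~67 ms
```

At a 30 FPS base, a few buffered frames already put you in the 100+ ms range the comment describes, while a 60 FPS base roughly halves it.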
But as others in this thread have reminded me, that's why we have the B570/580 and the soon-to-be-released 9070. There is zero reason to buy anything lower than the 5070 from NVIDIA. I sincerely hope the 5000 generation bites nGreedia in the only thing they understand: their bottom line.
Though at least with GDDR7, bandwidth isn't that bad at 128-bit.
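The bandwidth point is simple arithmetic: peak bandwidth is the bus width (in pins) times the per-pin data rate, divided by 8 bits per byte. The per-pin rates below are assumptions (a commonly reported ~28 Gbps for GDDR7 versus ~18 Gbps for RTX 4060-class GDDR6), not confirmed RTX 5060 specs.

```python
# Peak memory bandwidth for a given bus width and per-pin data rate.
# The data rates are assumed figures for illustration, not official specs.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: pins * per-pin Gbps / 8 bits-per-byte."""
    return bus_width_bits * gbps_per_pin / 8

gddr7 = bandwidth_gb_s(128, 28.0)  # hypothetical 28 Gbps GDDR7
gddr6 = bandwidth_gb_s(128, 18.0)  # e.g. 18 Gbps GDDR6
print(f"GDDR7 @ 128-bit: {gddr7:.0f} GB/s")  # 448 GB/s
print(f"GDDR6 @ 128-bit: {gddr6:.0f} GB/s")  # 288 GB/s
```

Under those assumed rates, the same narrow 128-bit bus moves roughly 55% more data with GDDR7, which is why the bus width alone doesn't tell the whole story.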
Right product placement:
5050 - 8GB
5060 - 12GB
5060 Ti - 16GB
5070 - 16GB
5070 Ti - 20GB
5080 - 24GB
5090 - 32GB
Natively.
8400GS/210 with a 32-bit bus, with "TurboCache" tech (if you want something newer).
There are always exceptions to rules, current cards just aren't worthy of such title (yet) ;)