Saturday, February 1st 2025

NVIDIA GeForce RTX 5060 and RTX 5060 Ti Rumored to Launch in March 2025

A recently leaked slide from the Taiwanese company Chaintech has seemingly confirmed the launch dates for the RTX 5060 and RTX 5060 Ti GPUs. Previous leaks have hinted at an early Q2 launch for the mid-range gaming GPUs, in both 8 GB and 16 GB VRAM flavors. Chaintech's slide does not reveal any specifications regarding the GPUs, although we do have a pretty good idea of what the upcoming GPUs will bring to the table.

As per recent leaks, the RTX 5060 and 5060 Ti are both expected to sport the GB206 GPU, paired with 8 or 16 GB of VRAM on a 128-bit bus. Despite employing the speedy new GDDR7 standard, there is no denying that 8 GB of VRAM is far from sufficient for a comfortable ray-traced gaming experience in 2025, perhaps even less so in the near future. Considering that the Arc B580 ships with 50% more VRAM, the entry-level RTX 5060 is more than likely to be a hard sell for many people, unless, of course, the RTX 5060/Ti somehow pulls off impressive performance uplifts.
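For context, the bandwidth implied by that 128-bit bus is easy to sketch. The leak does not state the memory data rate, so the 28 Gbps GDDR7 figure below is purely an assumption, with 18 Gbps GDDR6 shown for comparison:

```python
# Back-of-the-envelope bandwidth estimate for a 128-bit memory bus.
# Assumption (not stated in the leak): 28 Gbps GDDR7 modules.
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gbs(128, 28.0))  # 448.0 GB/s with the assumed GDDR7 rate
print(memory_bandwidth_gbs(128, 18.0))  # 288.0 GB/s with 18 Gbps GDDR6, for comparison
```

Even under these assumptions, the narrow bus recovers a good deal of bandwidth from the faster standard; the VRAM capacity, not bandwidth, remains the main concern.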
Source: @wxnod

45 Comments on NVIDIA GeForce RTX 5060 and RTX 5060 Ti Rumored to Launch in March 2025

#26
alwayssts
Onyx TurbineAn 8 GB VRAM card in 2025 costing more than 250 dollars would only be warranted if you exclusively play games with low GPU demands, so to say.
Otherwise, 10 GB+ of VRAM is mandatory. The thing no one mentions is that team green's VRAM usually holds up better than team red's, which slightly saves team green's 12 GB VRAM cards; 8 GB, no more, and things are on the edge.
All I'm saying is that these will likely cost (at least) $300/400.

For ~$250, you have the option of many 11/12GB cards (B580, 6700xt, 2080Ti). None of those cards are 'bad' for 1080p, pretty cheap, and then RAM isn't a worry.
For ~$400, you will likely have the option of a 7800xt/9070 in the same time frame. The 9070 is rumored to be $400, and the 7800xt will likely drop even lower until it's gone; both are 16GB.
The $400 cards are 1440p cards in reality, 1080p for RT.

Why 8GB when you can have 11-12? Why buy a ridiculous 16GB card when you can buy a non-ridiculous 16GB card? I can't see this out-performing a 7800xt, even in RT, realistically.

I'm not even arguing about 8GB. The point is there really isn't any reason to buy one anymore. If you want to argue a 2080ti vs a 6700xt/B580 (11GB vs 12GB), I'm with you on the 11GB of green.
Posted on Reply
#27
InVasMani
alwaysstsProbably right around a stock 3080 best-case. I certainly wouldn't want to use it for RT in newer titles.

In no possible scenario do I envision these being better nor a better value than a 7800xt and/or 9070.
Yeah, with the 8GB limitation it would ideally need to land at least above an RTX 3070 Ti, and ideally a bit above the RTX 3080, and be priced low enough that people actually buy the card in the first place. I don't see it as far-fetched that it could beat the RTX 3080, at least at lower resolution targets. That doesn't leave much room for future-proofing, though, even at 1080p.
Posted on Reply
#28
Onyx Turbine
alwaysstsAll I'm saying is that these will likely cost (at least) $300/400.

For ~$250, you have the option of many 11/12GB cards (B580, 6700xt, 2080Ti). None of those cards are 'bad' for 1080p, pretty cheap, and then RAM isn't a worry.
For ~$400, you will likely have the option of a 7800xt/9070 in the same time frame. The 9070 is rumored to be $400, and the 7800xt will likely drop even lower until it's gone; both are 16GB.
The $400 cards are 1440p cards in reality, 1080p for RT.

Why 8GB when you can have 11-12? Why buy a ridiculous 16GB card when you can buy a non-ridiculous 16GB card? I can't see this out-performing a 7800xt, even in RT, realistically.

I'm not even arguing about 8GB. The point is there really isn't any reason to buy one anymore. If you want to argue a 2080ti vs a 6700xt/B580 (11GB vs 12GB), I'm with you on the 11GB of green.
8 GB for 1080p... not that future-proof, but doable. I think we figured it out: the smart idea behind it is to automatically motivate 1440p players to buy 12GB+ and 4K players 16GB+.
Besides that, I just want to mention that I bought a Zotac for the 5-year warranty they provide. How lucky are people with how long their GPU will last? Realistically, under normal use, more than, let's say, 6-7 years? Not to mention that after a good full four years with a mid-range card, you will very likely need to upgrade to be able to play the newest games without issues.
Posted on Reply
#29
InVasMani
Not much to figure there: if you play at 1440p/4K, leaning towards at least 12GB to 16GB is pretty evident. Not to mention if you have any aspirations around local AI training, though FP4 kind of puts a wrinkle in the AI aspect.
Posted on Reply
#30
Onyx Turbine
Any use of a GPU besides gaming is a different use case. When someone needs high VRAM for their private AI endeavours, I suppose
either they can pay for it because it's that necessary for their job, or they are e.g. a student and use
a NASA computer paid for with tax money :) (which is totally OK when something good comes out of it, as it will in the end; it's a numbers game)
Posted on Reply
#31
SSGBryan
Onyx Turbine8 GB for 1080p... not that future-proof, but doable. I think we figured it out: the smart idea behind it is to automatically motivate 1440p players to buy 12GB+ and 4K players 16GB+.
Besides that, I just want to mention that I bought a Zotac for the 5-year warranty they provide. How lucky are people with how long their GPU will last? Realistically, under normal use, more than, let's say, 6-7 years? Not to mention that after a good full four years with a mid-range card, you will very likely need to upgrade to be able to play the newest games without issues.
It is why I bought a b580.

No need to spend more than $300 for a product that covers 95% of the market.
Posted on Reply
#32
Onyx Turbine
SSGBryanIt is why I bought a b580.

No need to spend more than $300 for a product that covers 95% of the market.
A well-priced blue alternative.
Posted on Reply
#33
_roman_
alwaysstsWhy 8GB when you can 11-12? Why buy a ridiculous 16GB card when you can buy a non-ridiculous 16GB card? I can't see this out-performing a 7800xt, even realistically in RT.
Come on.

I still use the same screen, a WQHD + FreeSync ASUS PA278QV.

I bought a Radeon 6600XT in its first days. The GPU mining days were expensive. This was with a 5800X + 2x32 GiB DDR4 RAM.

I switched the processor to something that is basically a sidegrade, a Ryzen 7600X. (I mention that because the AM4 platform I sold and my current AM5 platform are very close, so it is not really an apples-to-bananas comparison. Of course, the AM5 platform used Windows 23H2, now 24H2; the AM4 platform used Windows 10. The operating system changed.)

I could see the benefit of more VRAM in various free games with the Radeon 6800 non-XT. That card was one of the last cards of the product cycle. VRAM matters, even for the free Epic Games giveaway games, which I play and enjoy most of the time.

I'm kinda happy with the PowerColor 7800XT Hellhound. (That is the point of why I wrote this whole text wall: the 7800XT.)
Ray tracing is a nonsense feature. Hardly any of my games support it, except those free trash games AMD gave away: Avatar Pandora / The Last of Us / Star Wars Jedi: Survivor. (I never bought a Steam game, Ubisoft game, or EA game. My last game purchase was in the old days on WORM media.)

Why do you want ray tracing with an entry-level graphics card? The NVIDIA 3070 was too weak; the Radeon 6800 non-XT and 7800XT are too weak.

If you want ray tracing, you should buy at least a 7900XT or go higher.

RT = ray tracing is just the killer argument. I do not see a difference or any improvement with ray tracing. I always see the same technical demonstration software used to show the work in progress (tech demo, in short). For entry-level gaming, a 7800XT is a fairly decent card. It's an expensive card, but mine is at least quiet.

The cards above are for those who are willing to pay twice as much for a graphics card.

It was already hard to justify buying a graphics card that cost as much as my mainboard + CPU bundle and the expensive 2x32 GiB DDR5 DRAM.

I also believe these higher-end graphics cards take more than the 220 watts I see. Of course, you get "up to" double the frames with "up to" 450 watts or 670 watts for a graphics card.
The idle consumption of ~47 watts (please check the video yourself; not sure if it was 40 or 47, but much too high compared to my ~7-16 watts at idle) according to the Gamers Nexus video for the NVIDIA 5090 is another flaw. I trust them more because they measure, with verified equipment, all power going into a graphics card.

That VRAM topic is also settled. Keep staying in your 8 GiB or 12 GiB VRAM bubble if you want to. I saw a clear difference in certain games going from the 6600XT 8GB to the 6800 non-XT 16GB card. I play the older games from Epic Games and such. With recent games, I most likely would not use a 7800XT, but something much better. I want to see how your 8GB card does in WQHD in Avatar Pandora. That game is demanding, and it was a free "trash" AMD giveaway game, with buggy quests and such. My statement is only valid for AMD graphics cards. I tested a 4GB NVIDIA card for driver quality in 2023. Intel is not really worth considering, in my view.

The reason I bought the 7800XT Hellhound was the low noise in various tests. The MSI Radeon 6800 Z Trio and ASRock Challenger 6600XT 8GB D were far too loud in my view; that is one of many reasons those cards had to go to the second-hand market.

edit: the buyer should be aware that with an 8GB VRAM card, he limits himself to certain games, certain frame rates, certain display resolutions, and certain game settings. I would not call a 16GB card future-proof, but acceptable for games made up to the year ~2021 (feel free to determine the year limit yourself).
Posted on Reply
#34
Zazigalka
agent_x007And after 3GB G7 chips come into wide use, NV will launch a 5060 Super with 12GB VRAM on 128-bit and "will live happily ever after".
There are no bad cards, there are only bad prices. A 5060 Super with 12G 192-bit G7 actually sounds quite well balanced, provided it's only a slightly cut 5060 Ti.
SSGBryanIt is why I bought a b580.

No need to spend more than $300 for a product that covers 95% of the market.
What I like about Intel is that they price their cards like they always intended to; they're not waiting for NVIDIA's prices like AMD, only to be caught with their pants down after they've already sent the cards to retailers.
Posted on Reply
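The capacity arithmetic behind the 3 GB-chip scenario in the comment above can be sketched with a simplified model: each GDDR chip occupies a 32-bit slice of the memory bus, so capacity follows directly from bus width and chip density. (This deliberately ignores clamshell designs, which put two chips on each slice to reach the 16 GB variants.)

```python
# Simplified VRAM capacity model: each GDDR chip sits on a 32-bit slice
# of the bus, so capacity = (bus width / 32) * density per chip.
# Clamshell layouts (two chips per slice, as on 16 GB cards) are ignored.
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    return (bus_width_bits // 32) * chip_density_gb

print(vram_gb(128, 2))  # 8 GB: 2 GB chips on a 128-bit bus
print(vram_gb(128, 3))  # 12 GB: the "5060 Super" scenario with 3 GB chips
print(vram_gb(192, 2))  # 12 GB: the same capacity from a wider 192-bit bus
```

This is why 3 GB GDDR7 chips would let NVIDIA reach 12 GB without widening the 128-bit bus, and why the same 12 GB figure also falls out of a 192-bit bus with today's 2 GB chips.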
#35
Ruru
S.T.A.R.S.
AusWolf5060 Ti with 16 GB or 5070 with 12 GB? Another slap in the face from Nvidia.
This kinda started already with the 30 series; the 3060 had 12GB, the 3060 Ti and 3070 had 8GB, and the initial 3080 had 10GB.
Posted on Reply
#36
b1k3rdude
Welp, I'm just waiting for the inevitable sh!tshow when users try to use MFG on a 60-series card. The base FPS will be so low that it will make anything more than regular FG unplayable, even in SP games.

It's already been shown that 8GB isn't enough for just basic raster in most games. So when users enable RT, it will use more VRAM than the card has, which will cripple FPS. Then these nvidia-gaslit users will enable MFG to try and claw some FPS back, but the base FPS will be so low (e.g. 30 or less) that the lag will be awful, with 100-200 ms of lag on your mouse movements. If the current nGreedia upsell trend continues, the 5060 will actually be a 5050, like the 4060 was a 4050.

But as others in this thread have reminded me, that's why we have the B570/580 and the soon-to-be-released 9070. There is zero reason to buy anything lower than the 5070 from NVIDIA. I sincerely hope the 5000 generation bites nGreedia in the only thing they understand: their bottom line.
Posted on Reply
#37
iameatingjam
dartuil5060 still on 128bit not 160 or 192?
NV is slapping faces
tbh, I was worried NVIDIA would use 3GB chips to make the 5070 128-bit as well. It seemed to be the trend...
though at least with GDDR7, bandwidth isn't that bad at 128-bit.
Posted on Reply
#38
Zazigalka
AusWolf5060 Ti with 16 GB or 5070 with 12 GB? Another slap in the face from Nvidia.
still more useful than 16gb on 7600xt
Posted on Reply
#39
AusWolf
oxrufiioxoTitanium.

I'm sure the naming for the Titan was a derivative of that though.
Oh... I thought it was tits. :( (only joking)
Zazigalkastill more useful than 16gb on 7600xt
I'm not quite sure. Let's wait for the specs and benchmarks. It might as well be a 4060 v2.0, judging by the 5080.
RuruThis kinda started already on the 30 series; 3060 had 12GB, 3060 Ti and 3070 had 8GB and the initial 3080 had 10GB.
Do you want your GPU to become obsolete quickly because of its lack of GPU power, or because of its lack of VRAM? The choice is yours. :p
Posted on Reply
#40
PaddieMayne
3valatzyRight product placement:
5050 - 12GB
5060 - 16GB
5060 Ti - 16GB
5070 - 21GB
5070 Ti - 24GB
5080 - 30GB
5090 - 32GB
Here's how I see it...

Right product placement:
5050 - 8GB
5060 - 12GB
5060 Ti - 16GB
5070 - 16GB
5070 Ti - 20GB
5080 - 24GB
5090 - 32GB
Posted on Reply
#41
Prima.Vera
My good ol' SVGA CRT monitor cannot wait for this 8GB card to play my games in 1024x768 resolution.
Natively.
Posted on Reply
#42
agent_x007
Zazigalkathere are no bad cards, there are only bad prices. 5060 Super with 12G 192-bit g7 actually sounds quite well balanced, provided it's only a slightly cut 5060ti.
I guess you never used early 3D cards: AT3D, "NV1", Virge 3D, Matrox G100, etc.
Or the 8400GS/210 with a 32-bit bus and "TurboCache" tech (if you want something newer).
There are always exceptions to rules; current cards just aren't worthy of such a title (yet) ;)
Posted on Reply
#43
Bomby569
At this point I just don't care anymore. I think I'll give up on this madness and run my 3060 Ti for a couple more years.
Posted on Reply
#44
Hxx
AusWolfAre you suggesting that Nvidia should also use the numbers between 1 and 5? :wtf: Blasphemy!
I’m holding out for a 5010 TI super in white to match my build bro
Posted on Reply
#45
docnorth
At least NVIDIA managed to temper (almost extinguish) the excitement and impatience about the 'new' generation. They may release the mainstream GPUs in 1, 2, or 3 months, but it's just an Ada refresh. The NVIDIA 4000 series and the AMD 9000 series (probably more efficient than the 7000 series) might be better options, depending on price.
Posted on Reply