Wednesday, March 12th 2025

NVIDIA Reportedly Prepares GeForce RTX 5060 and RTX 5060 Ti Unveil Tomorrow
NVIDIA is set to unveil its RTX 5060 series graphics cards tomorrow, according to VideoCardz, which claims NVIDIA shared launch information with some media outlets today. The announcement will include two desktop models: the RTX 5060 and RTX 5060 Ti, confirming leaks from industry sources last week. The upcoming lineup will feature three variants: RTX 5060 Ti 16 GB, RTX 5060 Ti 8 GB, and RTX 5060. All three cards will utilize identical board designs and the same GPU, allowing manufacturers to produce visually similar Ti and non-Ti models. Power requirements are expected to range from 150 to 180 W. NVIDIA's RTX 5060 Ti will ship with 4608 CUDA cores, a modest 6% increase over the previous-generation RTX 4060 Ti. The most significant improvement comes from the move to GDDR7 memory, which could deliver over 50% higher bandwidth than its predecessor if NVIDIA maintains the expected 28 Gbps memory speed across all variants.
The standard RTX 5060 will feature 3840 CUDA cores paired with 8 GB of GDDR7 memory. This configuration delivers 25% more GPU cores than its predecessor and marks an upgrade in GPU tier from AD107 (XX7) to GB206 (XX6). The smaller GB207 GPU is reportedly reserved for the upcoming RTX 5050. VideoCardz's sources indicate the RTX 5060 series will hit the market in April. Tomorrow's announcement is strategically timed as an update for the Game Developers Conference (GDC), which begins next week. All models in the series will maintain the 128-bit memory bus of their predecessors while delivering significantly improved memory bandwidth—448 GB/s compared to the previous generation's 288 GB/s for the Ti model and 272 GB/s for the standard variant. The improved bandwidth stems from the introduction of GDDR7 memory.
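The reported bandwidth figures follow directly from the bus width and per-pin data rate. A quick sanity check of the numbers above (assuming 28 Gbps GDDR7 for the 5060 series and the Ada cards' 18 Gbps / 17 Gbps GDDR6):

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Reported RTX 5060 series: 128-bit bus with 28 Gbps GDDR7
print(memory_bandwidth_gbps(128, 28))  # 448.0 GB/s
# Previous generation on the same 128-bit bus:
print(memory_bandwidth_gbps(128, 18))  # 288.0 GB/s (RTX 4060 Ti, 18 Gbps GDDR6)
print(memory_bandwidth_gbps(128, 17))  # 272.0 GB/s (RTX 4060, 17 Gbps GDDR6)
# Generational uplift for the Ti model
print(f"{448 / 288 - 1:.0%}")  # 56% -- consistent with "over 50% higher bandwidth"
```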
Source:
VideoCardz
40 Comments on NVIDIA Reportedly Prepares GeForce RTX 5060 and RTX 5060 Ti Unveil Tomorrow
I mean I am all for more ~$500 cards out there. However, I doubt there is going to be a ton of these available. Plus they are probably going to be marked up in price once the 15 available get bought by all the scalpers.
And if you don't want to, you will definitely be turning down settings. At 1080p. In 2025. On a brand new card costing upwards of 350 bucks.
It's no different than when Ada launched, and it has gotten progressively worse. The number of games that really want more VRAM to run optimally is increasing, as is the number of tweaks Nvidia needs to apply to keep the card from falling apart. We also know lacking VRAM will dynamically reduce image quality, for example.
If we are getting single- to low-double-digit gains that diminish in value over time due to higher prices and newer gaming technologies, then I agree with the other commenter that these products shouldn't exist.
What I think most consumers are hoping for is that the 4060->5060 jump will be big enough to be of value and enough of a reason to upgrade. Of course, many people skip a gen or two before upgrading, so there are a lot of people who will be going from a 2060 or 3060 to a 5060. I'm betting that increase will be very solid.
@Darc Requiem
See above. I comprehended your point just fine, I just chose to disregard it and address a more interesting one. "Mainstream cards can't handle contemporary AAA titles maxed out" isn't the controversial take you think it is, it's just the way of things.
If 8 GB isn't enough for 1080p, 16 GB is surely not enough for 4K, and no one seems concerned while buying 5080s, 5070s, or 9070s.
8GB for 1080p is enough, stop the elitist movement on this forum.
Then there is nothing to discuss at all, is there? *shrug* Just idly musing about the lack of VRAM on budget GPUs is all well and good, but is meaningless in the end, got old probably a year or two ago and demonstrably doesn’t affect the sales of the cards with perceived lack of VRAM nor, seemingly, the satisfaction of their owners outside of a very limited circle of enthusiasts. The whole debate is, while quite droll, essentially a waste of everyone’s time and energy.