Wednesday, March 12th 2025

NVIDIA Reportedly Prepares GeForce RTX 5060 and RTX 5060 Ti Unveil Tomorrow

NVIDIA is set to unveil its RTX 5060 series graphics cards tomorrow, according to VideoCardz, which reports that NVIDIA shared launch information with select media outlets today. The announcement will cover two desktop models, the RTX 5060 and RTX 5060 Ti, confirming leaks from industry sources last week. The lineup will comprise three variants: RTX 5060 Ti 16 GB, RTX 5060 Ti 8 GB, and RTX 5060. All three cards will use identical board designs and the same GPU, allowing manufacturers to produce visually similar Ti and non-Ti models. Power requirements are expected to range from 150 to 180 W. The RTX 5060 Ti will ship with 4608 CUDA cores, a modest 6% increase over the previous-generation RTX 4060 Ti. The most significant improvement comes from the move to GDDR7 memory, which could deliver over 50% higher bandwidth than its predecessor if NVIDIA maintains the expected 28 Gbps memory speed across all variants.

The standard RTX 5060 will feature 3840 CUDA cores paired with 8 GB of GDDR7 memory. This configuration delivers 25% more CUDA cores than its predecessor and marks a step up in GPU tier, from AD107 (xx7) to GB206 (xx6). The smaller GB207 GPU is reportedly reserved for the upcoming RTX 5050. VideoCardz's sources indicate the RTX 5060 series will hit the market in April. Tomorrow's announcement is strategically timed ahead of the Game Developers Conference (GDC), which begins next week. All models in the series will maintain the 128-bit memory bus of their predecessors while delivering significantly improved memory bandwidth: 448 GB/s, compared to the previous generation's 288 GB/s for the Ti model and 272 GB/s for the standard variant. The improved bandwidth stems from the introduction of GDDR7 memory.
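The figures above are easy to sanity-check: peak memory bandwidth is just the bus width in bytes multiplied by the per-pin data rate. A minimal sketch, assuming the 28 Gbps GDDR7 rate from the article and the 18/17 Gbps GDDR6 rates of the published RTX 4060 Ti / RTX 4060 specs:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 5060 series: 128-bit bus, 28 Gbps GDDR7
print(bandwidth_gb_s(128, 28))  # 448.0 GB/s, matching the reported figure
# RTX 4060 Ti: 128-bit bus, 18 Gbps GDDR6
print(bandwidth_gb_s(128, 18))  # 288.0 GB/s
# RTX 4060: 128-bit bus, 17 Gbps GDDR6
print(bandwidth_gb_s(128, 17))  # 272.0 GB/s

# Generational uplifts implied by the leaked numbers
print(f"Ti bandwidth:     +{448 / 288 - 1:.1%}")   # +55.6%, i.e. "over 50%"
print(f"non-Ti bandwidth: +{448 / 272 - 1:.1%}")   # +64.7%
print(f"Ti cores:         +{4608 / 4352 - 1:.1%}") # +5.9%, the "modest 6%"
print(f"non-Ti cores:     +{3840 / 3072 - 1:.1%}") # +25.0%
```

The arithmetic bears out the article's claims: the bandwidth gain comes entirely from the faster memory, since the bus width is unchanged at 128 bits.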
Source: VideoCardz

40 Comments on NVIDIA Reportedly Prepares GeForce RTX 5060 and RTX 5060 Ti Unveil Tomorrow

#26
GhostRyder
I mean, the only part I hate is when they do 8 GB and 16 GB versions of the same model number (AMD does it too; I'm not singling Nvidia out). As for the amount of VRAM on these, I think the 5060 Ti should be 16 GB and the others should be 8 GB. I'm judging them on the assumption that the price is going to be sub-$500.

I mean, I am all for more ~$500 cards out there. However, I doubt there are going to be a ton of these available. Plus, they are probably going to be marked up in price once the 15 available units get bought by all the scalpers.
Posted on Reply
#27
scooze
I don't see a problem with 8GB on the 5060. The main problem will be the price. If it offers 25% more cores, it could be more expensive than the 4060 and that would be terrible.
Posted on Reply
#28
wNotyarD
5050 should be slot-powered for it to make more sense
Posted on Reply
#29
G777
Rexter: It saddens me that they are releasing that slop for such a price. Single digit generational uplift. Absolutely not worth the money.
It's even more ironic that Nvidia is sitting on a "real" 5060:
(quick note: I am not affiliated with that particular youtube channel or anything of the like).
The video is about a franken GPU from AliExpress that pairs a 4090M GPU with 16 GB of VRAM. In the benchmarks in the video it consistently performs identically to or faster than the 5070. Oh, and it uses around 100 W of power. Why in buttcrack's name isn't this the one that Nvidia is releasing? 5070 performance!? 16 GB VRAM!? 100 W at load!? IT'S THERE! IT'S RIGHT THERE NVIDIA! JUST SELL IT OFFICIALLY!
The 4090 mobile gpu uses the same die as the 4080. It’s basically a heavily power-limited 4080.
Posted on Reply
#30
Vayra86
Onasi: No, what you people are talking about is RT which, even at 1080p, is a punishing feature to enable for ANY GPU. Doesn’t matter what NV says or thinks, absolutely anyone remotely sane understands that RT is still not ready to be a permanent tentpole feature, at least not in the way it’s currently implemented. You can add however more VRAM to the 4060 - it will not make it truly RT capable. That’s just how things are. So RT is a no go, what else is there? Insane optional texture packs that make 0 difference at 1080p anyway? That’s not to mention the fact that I already mentioned in another thread - most people playing on PC and buying relatively budget cards don’t even play graphically advanced titles. Can a 5060 pull high FPS in insert e-sports of choice and also perform reasonably well on medium-high settings in AAA? Yes? Congratulations, the card has a reason to exist and absolutely will become a commercial success. I will never tire to repeat this, but what PC enthusiasts THINK should be and what the actual reality of the market IS pretty much never aligns.
No, you can readily use >8GB on 1080p without RT just fine.

And if you don't want to, you will definitely be turning down settings. At 1080p. In 2025. On a brand new card costing upwards of 350 bucks.

It's no different than when Ada launched, and it has gotten progressively worse. The number of games that really need more VRAM to run optimally is increasing, as is the number of tweaks Nvidia needs to apply to keep the card from falling apart. We also know that lacking VRAM will dynamically reduce image quality, for example.
Posted on Reply
#31
Daven
lexluthermiester: While the performance of RTX 5000 has been disappointing, your suggestion is (and I'm being nice here) daft as a brush. :slap::rolleyes:

The 5060/5050 offerings are still needed to fill in market segments the rest do not fit into.
It would be nice to see performance of the low end on one graph over time to see if we are truly getting any value in this market segment or the same performance is just being repackaged over and over.

If we are getting single to low double digit gains that diminish in value over time due to higher prices and newer gaming technologies then I agree with the other commenter that these products shouldn’t exist.
Posted on Reply
#32
lexluthermiester
Vayra86: No, you can readily use >8GB on 1080p without RT just fine.
8GB on 1080p is more than enough for RT as has been proven by the 3 previous generations of RTX cards that only had 8GB of VRAM.
Vayra86: And if you don't want to, you will definitely be turning down settings. At 1080p. In 2025.
Depends on the game. The benchmarks will tell the tale.
Daven: It would be nice to see performance of the low end on one graph over time to see if we are truly getting any value in this market segment or the same performance is just being repackaged over and over.
@W1zzard This sounds like an idea that could be useful. Just a thought.
Posted on Reply
#33
Vayra86
lexluthermiester: 8GB on 1080p is more than enough for RT as has been proven by the 3 previous generations of RTX cards that only had 8GB of VRAM.

Depends on the game. The benchmarks will tell the tale.


@W1zzard This sounds like an idea that could be useful. Just a thought.
*was. And you would still be turning stuff down (or running below render res) anyway, wouldn't you, or 60 fps is a fairy tale.
Posted on Reply
#34
lexluthermiester
Vayra86: *was.
Is. I build PCs frequently and demonstrate them for customers to see at 1080p. I know what I'm talking about. 8GB is still enough for 1080p at mostly high/ultra settings, again, depending on the game.
Vayra86: And you would still be turning down stuff (or run below render res) anyway wouldn't you, or 60 fps is a fairy tale.
True. However, with every generational uptick at the same resolution, more settings can be left on or at higher levels. The differences between 2060->3060 were outstanding. The differences between 3060->4060, while not as impressive, were still a very good increase.

What I think most consumers are hoping for is that the 4060->5060 jump will be big enough to be of value and enough of a reason to upgrade. Of course, many people skip a gen or two before upgrading, so there are a lot of people who will be going from a 2060 or 3060 to a 5060. I'm betting that increase will be very solid.
Posted on Reply
#35
Darc Requiem
Darc Requiem: That's the thing, they aren't. These are 1080p cards. 1080p cards shouldn't require you to turn down settings at 1080p because of a lack of VRAM. Games at 1080p are pushing up against and spilling over the 8 GB VRAM buffers. Nvidia pushes ray tracing, which increases VRAM usage further, and yet skimps on VRAM. It reminds me of when Sony launched the PS3. They included Talladega Nights to show off the difference between Blu-ray and DVD, but they only included composite cables with the console, so the Blu-ray's image quality would look just like a DVD's. Even in that silly situation, the consumer would at least be able to buy better cables.
Onasi: No, what you people are talking about is RT which, even at 1080p, is a punishing feature to enable for ANY GPU. Doesn’t matter what NV says or thinks, absolutely anyone remotely sane understands that RT is still not ready to be a permanent tentpole feature, at least not in the way it’s currently implemented. You can add however more VRAM to the 4060 - it will not make it truly RT capable. That’s just how things are. So RT is a no go, what else is there? Insane optional texture packs that make 0 difference at 1080p anyway? That’s not to mention the fact that I already mentioned in another thread - most people playing on PC and buying relatively budget cards don’t even play graphically advanced titles. Can a 5060 pull high FPS in insert e-sports of choice and also perform reasonably well on medium-high settings in AAA? Yes? Congratulations, the card has a reason to exist and absolutely will become a commercial success. I will never tire to repeat this, but what PC enthusiasts THINK should be and what the actual reality of the market IS pretty much never aligns.

Edit: @lexluthermiester kinda said what I did in a more concise and based way.
You may want to comprehend what I said before you go off on a tangent. I said cards are pushing up against and spilling over 8GB of VRAM at 1080p, and RT increases VRAM usage further. I.e., it's not enough even before RT is taken into account. You've gone on some RT rant, completely skipping over the fact that 8GB is insufficient without it. I brought up how Nvidia is pushing the feature as a selling point while skimping on VRAM, which makes the feature look worse than it should. That is counterproductive to them trying to sell customers on its importance. The example I listed was just to show a previous instance of another company doing something similarly stupid.
Posted on Reply
#36
Onasi
lexluthermiester: True. However, with every generational uptick at the same resolution, more settings can be left on or at higher levels. The differences between 2060->3060 were outstanding. The differences between 3060->4060, while not as impressive, were still a very good increase.
It’s not even the negative point people think it is. You have to, in certain cases, turn down settings on a mainstream GPU running the mainstream resolution. Okay. And this is bad or new… how? I remember running a 6600GT. I did not max things out reliably at 1280x1024. Neither did the 8600GTS in 2007 with your Bioshocks and Crysises. 1080p is not some sort of 640x480 relic that enthusiasts THINK it is. It’s THE mainstream resolution, and games (that are ever pushing forward with graphics because gamers are morons) do run into issues resulting in a settings drop on mainstream cards. As they ALWAYS did. And no, the anomaly that was Pascal and the 1060 doesn’t disprove anything. There will not be another one.

@Darc Requiem
See above. I comprehended your point just fine, I just chose to disregard it and address a more interesting one. “Mainstream cards can’t handle contemporary AAA titles maxed out” isn’t the controversial take you think it is; it’s just the way of things.
Posted on Reply
#37
Bomby569
what's the point if the performance is basically the same as my 3060ti? you can **** *** Nvidia

If 8GB isn't enough for 1080p, 16GB is surely not enough for 4K and no one seems concerned while buying 5080's, 5070's or 9070's.
8GB for 1080p is enough, stop the elitist movement on this forum.
Posted on Reply
#38
Darc Requiem
Onasi: See above. I comprehended your point just fine, I just chose to disregard it and address a more interesting one. “Mainstream cards can’t handle contemporary AAA titles maxed out” isn’t the controversial take you think it is; it’s just the way of things.
You are making an assumption on what I think and an incorrect one. I don't think it's a controversial take at all.
Posted on Reply
#39
Onasi
@Darc Requiem
Then there is nothing to discuss at all, is there? *shrug* Just idly musing about the lack of VRAM on budget GPUs is all well and good, but is meaningless in the end, got old probably a year or two ago and demonstrably doesn’t affect the sales of the cards with perceived lack of VRAM nor, seemingly, the satisfaction of their owners outside of a very limited circle of enthusiasts. The whole debate is, while quite droll, essentially a waste of everyone’s time and energy.
Posted on Reply
#40
lexluthermiester
Onasi: 1080p is not some sort of 640x480 relic that enthusiasts THINK it is.
Exactly. It's been a standard for a decade because it is a sweet-spot resolution. Not since the CRT analogue TV broadcast era has a standard stuck around for so long. And that is a good thing.
Posted on Reply