Monday, April 7th 2025

NVIDIA GeForce RTX 5060 Ti & 5060 128-bit Memory Interfaces "Confirmed" by Leaked Shipping Manifest

Last month, PG152 board designs were linked to NVIDIA's rumored lineup of upcoming GeForce RTX 5060 Ti, RTX 5060, and RTX 5050 "Blackwell" GPUs. Despite the emergence of fairly legitimate-looking (albeit incomplete) technical information, the claimed 128-bit memory bus spec (for all lower-end cards) did not sit well with a portion of the PC gaming hardware community. In theory, Team Green could roll out truly next-generation budget offerings with 192-bit buses, rather than repeat some of its GeForce RTX 4060 "Ada Lovelace" series homework. Two weeks ago, a GeForce RTX 5060 Ti-specific "full specification" leak reiterated the design's (alleged) 128-bit GDDR7 memory interface.

Earlier today, VideoCardz unearthed another example—sourced from shipping manifests—of NVIDIA outfitting PG152 boards with a 128-bit memory bus. The "PG152 SKU 25" and "PG152 SKU 10" identifiers seem to confirm the existence of GeForce RTX 5060 and GeForce RTX 5060 Ti graphics cards (respectively)—the latter design is reportedly due for launch next week. The "wallet-friendly" end of Team Green's "Blackwell" GPU spectrum is expected to utilize GDDR7 memory, thus elevating new-gen options above preceding hardware. The generational leap grants a bandwidth of 448.0 GB/s, up from the RTX 4060 Ti's 288.0 GB/s.
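The quoted bandwidth figures line up with a 128-bit bus at 28 Gb/s per pin (the rumored GDDR7 data rate) versus 18 Gb/s per pin (the RTX 4060 Ti's GDDR6 configuration). A quick sketch of the arithmetic, assuming those per-pin data rates:

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s:
    bus width (bits) x per-pin data rate (Gb/s) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Rumored RTX 5060 Ti: 128-bit GDDR7 at 28 Gb/s per pin
print(memory_bandwidth_gbps(128, 28.0))  # 448.0 GB/s
# RTX 4060 Ti: 128-bit GDDR6 at 18 Gb/s per pin
print(memory_bandwidth_gbps(128, 18.0))  # 288.0 GB/s
```

The same formula shows why a hypothetical 192-bit GDDR7 card would land at 672.0 GB/s, which is what some community members were hoping for.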
Source: VideoCardz

8 Comments on NVIDIA GeForce RTX 5060 Ti & 5060 128-bit Memory Interfaces "Confirmed" by Leaked Shipping Manifest

#2
narutoramox
I don't think I'll swap my 3070 for a 5060 Ti; it's the same 8 GB, and frame gen adds latency? Thanks, no.
#3
hsew
Francoporto: RTX 3070 Ti + a few %... ouch.
Looking at the TPU DB:

Remember when a new x60 would challenge the previous x80? 2060 vs 1080 anyone?

Suddenly, the 4070 lagged behind the 3080, and now it’s looking like it’ll take 2 generations to see the 5060/Ti tie the 3070/Ti.
#4
3DVCash
hsew: Remember when a new x60 would challenge the previous x80? 2060 vs 1080 anyone?
I remember!

So this thing basically brings 3070 performance for the 3070's MSRP about 5 years later...

Does this feel like a dying hobby to anyone else?
#5
N/A
5060 exists to upsell the 5070.
3DVCash: Does this feel like a dying hobby to anyone else?
Enjoy your 4080 for longer. Win-win. N3/N2 second iteration, and then the N1.4 node.
#6
Vayra86
3DVCash: I remember!

So this thing basically brings 3070 performance for the 3070's MSRP about 5 years later...

Does this feel like a dying hobby to anyone else?
Why is it a dying hobby? Because you can't keep buying things that try to convince you they really are faster all the time?

I've said it before... computer graphics are done, have been done for a few years now. This is why we get stuck in this RT bullshit. It's not because of AI or anything. The diminishing returns are just too great, unless of course you believe looking at virtually the same image, except now with lighting brute-forced in real time, is actually a massive difference.

It's a commercial clusterfuck, and it clearly ain't going places. Only crutches will get playable RT, and you'll be paying progressively more to get those crutches. If that's the sign of a dying hobby, then yeah, I guess so, but the revenue numbers in the gaming market disprove that completely: it's growth, YoY, not even just mobile, but PC too, despite a worsening GPU market. I'm just in this to play games that look decent enough to sell me the illusion, and the immersion - and that has been possible and is being done for well over a decade now.

We didn't even need DX12 for graphics, to be honest, although the improved threading is extremely nice. That was really the biggest hurdle gaming still had to cross: multithreaded CPU usage. We're there now. This shit is done, and we've rapidly moved towards the same place smartphones are in now: innovation for the sake of trying something new. Not because there's a real demand - you just need a new phone and they'll all do the same things; which one you get is just a matter of minor preferences and budget.

Realities change; nothing keeps on giving. The eternal car comparison... horse and carriage evolved to cars, and at some point cars got a maximum speed that we considered 'enough'. And sure, you can drive faster cars. But more often than not, they're not even road legal - that's where RT is at too. You can fake a fast car in RT by using MFG. But at its core, the tech is just looking for a problem that doesn't exist; you could already drive from A to B just fine, and fast, too.

Reality is like a wall that you could momentarily push aside, but in the end, it's there, and it'll keep blocking you. That's why GPUs need to go to ridiculous power usage to get playable RT, as well. That's the wall, right there. You can pull 450 W to push it aside, though. The question is, is that really worth it at this point? That, on top of a $2,000 or $3,000 price tag for playing a few games?

We're well into Ferrari territory here - just the happy few will do it, and use a different car to cross a regular speed bump, or get groceries, or go on holiday with, because the Ferrari is practically pointless for that. That is a direct analogy to all those epeen topics where people ooh and ah over their 8K TAA DLSS18 super-duper RT experience - they ain't even playing, they're just wowing over their supposed advantages, even if there's a half-broken game underneath. If that's supposed to be the hobby... man. What a desolate sadness that is.
#7
Macro Device
hsew: ...to see the 5060/Ti tie the 3070/Ti.
I think it'll trail the 4070 ever so slightly, being much closer to the 4070 than to the 3070 Ti. It has both more cores than the 4060 Ti and faster clocks. Not that that's going to make it any good value; no, it's still going to be a very bad product, just not as bad as you think it will be.
#8
3DVCash
Vayra86: Reality is like a wall that you could momentarily push aside, but in the end, it's there, and it'll keep blocking you. That's why GPUs need to go to ridiculous power usage to get playable RT, as well. That's the wall, right there. You can pull 450 W to push it aside, though. The question is, is that really worth it at this point? That, on top of a $2,000 or $3,000 price tag for playing a few games?
I mean, that's kind of my point. The GPU market has largely stagnated for the last 5 years, except at the bleeding edge, where you have to spend thousands of dollars to see any uplift at all.

Sure, some of that is the slowing of tech advances, but that's not the only reason. NVIDIA absolutely could have delivered a better product stack this gen, but that would come at the cost of their precious margins. So instead, they're content to use smaller, more compromised die configs in place of previous-gen SKUs that offer roughly the same performance. The end result: no real shift in the market at all.

5 years later and it's still almost impossible to build a "console killing" PC for the same price. What is there to even get excited about?