Friday, March 21st 2025

NVIDIA GeForce RTX 5060 Ti GPU "Full Specification" Leaks Out

A ramped-up flow of leaks in early-to-mid March, regarding the upcoming NVIDIA GeForce RTX 5060 Ti and RTX 5060 graphics cards, suggested an official pre-GTC 2025 unveiling of lower-end Blackwell gaming GPUs. Speculative specifications appeared online earlier in the month, but some key technical talking points seemed to be missing. As reported yesterday, insiders believe that Team Green has adjusted its new product release schedule. Leaked roadmaps have outlined GeForce RTX 5060 Ti cards arriving by mid-April 2025, with less potent RTX 5060 models launching around the middle of May. Despite the alleged delay, VideoCardz has continued its investigation into pre-launch conditions. Their latest report points to full GeForce RTX 5060 Ti specifications being distributed to board partners, at least in the recent past.

Leaked details seemingly reconfirm the existence of 16 GB and 8 GB variants (on a 128-bit memory bus), both utilizing the same GB206-300-A1 GPU with 4608 CUDA cores. VideoCardz disclosed a couple of finer (new) details: "based on the specs we have, both models will ship with 28 Gbps memory. This means that the bandwidth is 448 GB/s, which is 55% higher than the last-gen model. Moving on to GPU clocks, NVIDIA has set a 2407 MHz base clock and a 2572 MHz boost clock for this GB206-based model. This means that the base clock is 97 MHz and the boost is 37 MHz higher than the RTX 4060 Ti." The fresh leak suggests that a few of Team Green's AIBs will be configuring their custom designs with 8-pin power connectors, sufficient for a reported 180 W TDP-rated product. VideoCardz anticipates that the vast majority of GeForce RTX 5060 Ti models will utilize 16-pin connectors. Unfortunately, finalized price guides were not discovered during recent sleuthing sessions.
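As a quick sanity check on those quoted figures, here is a minimal back-of-the-envelope sketch; it assumes the RTX 4060 Ti's 288 GB/s (18 Gbps GDDR6 on the same 128-bit bus) as the last-gen reference point.

```python
# Back-of-the-envelope check of the leaked bandwidth figure.
# Assumption: the "last-gen model" is the RTX 4060 Ti at 18 Gbps GDDR6 on a 128-bit bus (288 GB/s).

def bandwidth_gb_per_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

rtx_5060_ti = bandwidth_gb_per_s(28, 128)  # leaked spec: 28 Gbps GDDR7, 128-bit bus
rtx_4060_ti = bandwidth_gb_per_s(18, 128)  # previous gen: 18 Gbps GDDR6, 128-bit bus

print(f"RTX 5060 Ti: {rtx_5060_ti:.0f} GB/s")                         # 448 GB/s
print(f"Uplift vs RTX 4060 Ti: {rtx_5060_ti / rtx_4060_ti - 1:.0%}")  # ~56%, quoted as 55%
```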
Source: VideoCardz

65 Comments on NVIDIA GeForce RTX 5060 Ti GPU "Full Specification" Leaks Out

#26
Chrispy_
lexluthermiester: I'm going to go out on a limb here and suggest that this might be the positive thing about the 5000 series launch. As long as the benchmarks show a good performance for the price.
It's always about the price. At least the 5070 sets an upper limit on how greedy Nvidia can be for the 16GB variant of the 5060Ti. I'm hoping for $450 and that would seem to be a realistic hope grounded in reality rather than an optimistic pipe-dream.
Posted on Reply
#27
Caring1
Still only a 128-bit bus, and I suppose the 15 W uplift in power is due to the VRAM.
Posted on Reply
#28
Sithaer
Chrispy_: It's always about the price. At least the 5070 sets an upper limit on how greedy Nvidia can be for the 16GB variant of the 5060Ti. I'm hoping for $450 and that would seem to be a realistic hope grounded in reality rather than an optimistic pipe-dream.
Yeah, if the price were right then I would think about it, but considering that even the second-hand but new 5070s are between 750-830 Euro around here, I really have no hope left, and the similar-performance 4000 series is either gone or about the same price anyway. 'Before anyone says just go AMD: guess what, the 9070 is the same price, if not more expensive.'
Posted on Reply
#29
ZoneDymo
Rightness_1: Why would the more expensive 5070 have less VRAM than a 5060ti?


They (AMD+NV) are colluding together. Intel is the outsider, and if they could release a faster card, they would be a real threat, but Intel is too incompetent.
Personally I don't agree that Intel is too incompetent... well, for GPUs at least. Drivers are a monumental task, and they were two decades behind the competition, yet in about two years they pretty much caught up.
Then the GPUs themselves: the Arc Alchemist line was actually really impressive hardware, but they made poor choices/assumptions/gambles about what the games needed, which they learned from with Battlemage. You can see it in the B570 vs. the A770: much less hardware, yet better performance.
So they are learning, and fast. I think it's just a strategic choice not to release a B770 and instead power through with Celestial for some higher-end cards.

I'm looking forward to getting an Intel GPU in the future; right now I have an RX 7800 XT, so there's no upgrade path there yet.
Posted on Reply
#30
Red Hood
The number of ROPs is missing from the spec; is that because they vary? :D

The specs suggest it's slower than a 4070, but with current pricing I wouldn't be surprised if it costs as much as a 4070 did.
Sithaer: Yeah, if the price were right then I would think about it, but considering that even the second-hand but new 5070s are between 750-830 Euro around here, I really have no hope left, and the similar-performance 4000 series is either gone or about the same price anyway. 'Before anyone says just go AMD: guess what, the 9070 is the same price, if not more expensive.'
The 9060 XT 16GB could potentially be a decent upgrade if sanely priced
Posted on Reply
#31
Knight47
Will they fix the drivers by the time this comes out? I was hoping that the 5060 Ti would be significantly faster than the 4060 Ti, but supposedly they both have the same number of ROPs.
Posted on Reply
#32
Onasi
Knight47: Will they fix the drivers by the time this comes out? I was hoping that the 5060 Ti would be significantly faster than the 4060 Ti, but supposedly they both have the same number of ROPs.
"Significantly" would be a stretch considering the state the rest of the stack released in, but the number of ROPs isn't and wouldn't have been the deciding factor anyway. What a weird metric to focus on.
Posted on Reply
#33
Ruru
S.T.A.R.S.
8 GB again for the non-Ti... come on, the 3060 already had 12 GB two generations ago. Are 3 GB chips a thing yet, since they clearly don't want the card to have a 192-bit bus?
scooze: The main problem of the 4060 Ti wasn't the throughput, but the pathetic 4352 cores. Even the controversial and not very good 4070 had a monstrous amount of them in comparison, almost 6K.
I'd say that the shader count itself isn't that important since you can't compare different architectures by just comparing the shader count.
Posted on Reply
#34
Athena
Rightness_1: They (AMD+NV) are colluding together. Intel is the outsider, and if they could release a faster card, they would be a real threat, but Intel is too incompetent.
Colluding how?

It's still a supply issue that all makers would have, and if Intel actually had a card ready, they would all be in the same boat: not enough fabs with the right tech to make the chips.
Posted on Reply
#35
Beermotor
Legacy-ZA: Yeah, no; you have a great card, however the 5070 Ti is 20-30% faster and will especially be so with the upcoming DirectX features at the end of April from Microsoft.
I think that elicits a follow-up question for potential 5060/90x0 buyers: when will we see neural rendering/DXR features in games?
Can it be leveraged in current titles to improve the way existing content is rendered?
Posted on Reply
#36
Nostras
The 5060 looks really impressive; considering the number of CUDA cores, bumped TGP, and much higher memory bandwidth, I'd expect it to be about 30% faster than a 4060.
If the 5060's MSRP is no more than $300 I'd be happy. Any less is even better.

There's no saving the 5060Ti. The 8GB model is unrecommendable at what price it's going to sell and the 16GB model is going to be much too expensive.
The only way the 5060Ti is not DOA is if it costs no more than 300$, which also means that the 5060 has to cost significantly less. Not going to happen lol.
Posted on Reply
#37
RandallFlagg
While everyone is distracted with the 5060 Ti..

The 5060 non-Ti looks to be a runaway winner based on the specs. Of course, this all depends on the actual price. That is a big boost in both memory bandwidth (+65%) and cores (25%) on the standard 5060 though.
Posted on Reply
#38
dicobalt
Why is it using expensive GDDR7? This is an entry-level card. It's like they don't have any idea what they're doing.
Posted on Reply
#39
dartuil
The 5060 will be minimum $350.
The Ti, $450.
I have no faith in NV to be cheap.
Posted on Reply
#40
scooze
Chrispy_: Pathetic?
The 4070 cost 50% more money and only had 35% more cores. The 4060Ti was way better value than the 4070 if you're looking at core counts or performance.
That's right, pathetic.
35% is kind of a lot; I would say it's a decent amount, it's a big difference; maybe you just don't count well. For example, the 3060 Ti looks very decent against the 3070, while the 4060 Ti looks like a tattered piece of junk next to the already dubious 4070. The 4060 Ti 8/16 GB are terrible, and the 5060 Ti 8/16 GB will be just as terrible if their price is not adjusted. They will get an increase in memory bandwidth, and the terrible 5070 will mock them all the way. And don't give me examples of the 5060-16 floating up from the bottom somewhere with ultra textures turned up.
Chrispy_: If anything, the 4070 was the pathetic card, pushing prices per core and overall prices for the "midrange" above $600 for the first time ever whilst also dropping the 70 series down to 192-bit from 256 for the first time since 2012.
The 4070 Ti Super got the bigger bus width, and what did it give it? Nothing, a whole 8%. Which came from the difference in those same cores.
Ruru: I'd say that the shader count itself isn't that important
I've never heard anything stupider. When they sell you a 5090, they are selling you first and foremost 22 thousand shaders on one die, and everything else is secondary.
Posted on Reply
#41
tussinman
RandallFlagg: While everyone is distracted with the 5060 Ti..

The 5060 non-Ti looks to be a runaway winner based on the specs. Of course, this all depends on the actual price. That is a big boost in both memory bandwidth (+65%) and cores (25%) on the standard 5060 though.
True, although the decent boost in performance is really just because the 4060 was an xx50-tier chip, not because the 5060 is anything special (good chance it's slower than or barely on par with a 2020-era 3070).

If the original 4060 had had at least 3060 Ti-level bandwidth and performance, or slightly higher, then this 5060 would really look marginal generation-wise.
Posted on Reply
#42
Chrispy_
scooze: That's right, pathetic.
35% is kind of a lot; I would say it's a decent amount, it's a big difference; maybe you just don't count well. For example, the 3060 Ti looks very decent against the 3070, while the 4060 Ti looks like a tattered piece of junk next to the already dubious 4070.
I don't count well? What's that supposed to mean?
    5888 shaders on the 4070,
    4352 shaders on the 4060Ti,
    5888 / 4352 = 1.35, which is 35% more.
    I didn't do the counting, a calculator did.

If you're concerned about performance for the name, then you're a bit late to the party. The entire community has been moaning about this reduction of silicon config per tier for years already. It gets mentioned by someone in almost every thread about Nvidia because it's such a hard pill to swallow. In this thread, it was me, post #25.
scooze: The 4070 Ti Super got the bigger bus width, and what did it give it? Nothing, a whole 8%. Which came from the difference in those same cores.
It's power-limited. That's 8% more performance from 0% more power. Nvidia restricted the power of the 4070TiS to the exact same 285 W as the 4070Ti to keep the 4070TiS from stealing sales from the 4080 card.

If you look at TechPowerUp's reviews of the 4070TiS you'll see that the Gigabyte model with a max power limit of 320W (exactly the same as a 4080) gets a much better 19% improvement over the base 4070Ti performance, and is only 6% behind the 4080.

Also, if you look at the overall results by resolution (www.techpowerup.com/review/gigabyte-geforce-rtx-4070-ti-super-gaming-oc/32.html) you'll see that the 256-bit 4070TiS scales better with resolution than the 192-bit 4070 Ti, since 4K needs more bandwidth and the 192-bit 4070Ti is starting to struggle.
Posted on Reply
#43
Ruru
S.T.A.R.S.
scooze: I've never heard anything stupider. When they sell you a 5090, they are selling you first and foremost 22 thousand shaders on one die, and everything else is secondary.
And I haven't heard anything stupider than buying a card just because of its shader count. With that logic, the old Radeon Fury X must be almost as good as the RX 9070 XT since they both have 4096 shaders. :rolleyes:
Posted on Reply
#44
Chrispy_
dicobalt: Why is it using expensive GDDR7? This is an entry-level card. It's like they don't have any idea what they're doing.
Because the puny 128-bit bus lacks bandwidth using GDDR6.

Also, GDDR7 *is* more expensive than GDDR6, but it's still a very small part of the overall cost of the whole GPU. The difference between GDDR6 and GDDR7 is likely about $20 at retail, closer to $5-6 for the AIB, which is less expensive than increasing the bit-depth of the GPU and moving to a PCB that supports 192 bits instead of 128 bits. There are two ways to increase VRAM bandwidth: make the bus wider using the same memory, or make the existing memory faster; both options achieve the same result.
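To put rough numbers on that trade-off, here is an illustrative sketch; the 20 Gbps GDDR6 speed and the hypothetical 192-bit board are assumptions chosen only to show how a wider, slower configuration and a narrower, faster one land in the same bandwidth ballpark.

```python
# Illustration of the two routes to more VRAM bandwidth described above:
# widen the bus with slower memory, or keep the narrow bus and use faster memory.
# The 20 Gbps GDDR6 figure and the 192-bit board are assumptions for illustration only.

def bandwidth_gb_per_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate times bus width, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

wider_bus_gddr6 = bandwidth_gb_per_s(20, 192)   # hypothetical 192-bit GDDR6 board -> 480 GB/s
narrow_bus_gddr7 = bandwidth_gb_per_s(28, 128)  # rumoured 128-bit GDDR7 config    -> 448 GB/s

print(wider_bus_gddr6, narrow_bus_gddr7)  # similar bandwidth, reached by different routes
```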
Ruru: And I haven't heard anything stupider than buying a card just because of its shader count. With that logic, the old Radeon Fury X must be almost as good as the RX 9070 XT since they both have 4096 shaders. :rolleyes:
In his defence, he was talking about two GPUs from the same 40-series with the same architecture. You're both right, but you're not arguing about the same context.
Posted on Reply
#45
AnarchoPrimitiv
It's incredible to me that after everything Nvidia just put you people through, the first comments I see here are instantly taking Nvidia back... no wonder we never get anything more than we get.
ZoneDymo: Behold, the new midrange: 500+ euro for a 60-class card... from both AMD and Nvidia.

sigh pls Intel, shake these bastards up
You're really holding a company with 9% market share to equal responsibility with a company with 90% market share for the current state of the market?
Posted on Reply
#46
Knight47
Nostras: The 5060 looks really impressive,

There's no saving the 5060Ti. The 8GB model is unrecommendable at what price it's going to sell and the 16GB model is going to be much too expensive.
The only way the 5060Ti is not DOA is if it costs no more than 300$, which also means that the 5060 has to cost significantly less. Not going to happen lol.
The 4060/5060 isn't great if you're on a 2060 Super or better GPU. I'm considering a second-hand 3080 since that's close to 4070 performance. The 5060 will cost more and will be worse.

The 5060 Ti 16 GB needs to be close to the 4070 to not be DOA. The 5060 will be DOA unless it's under $200.
Posted on Reply
#47
Chrispy_
Knight47: The 5060 Ti 16 GB needs to be close to the 4070 to not be DOA. The 5060 will be DOA unless it's under $200.
I suspect the 5060Ti 16G will still be 10% slower than a vanilla 4070, though it'd be nice to be wrong.

The 5060 will sell just fine at $250+ because it's better than the 4060 which was still selling like hotcakes at $290 before all the stock dried up a couple of months ago. If anything, the shortage of 40-series GPUs and the delayed launch of the 5060-series means that there'll be massive demand that outstrips supply and pushes prices up even higher. Don't be surprised to see 5060s selling for almost $400 in the first few months of launch.

I know I've shouted that >$200 is too much for an 8GB GPU until I'm blue in the face for the last 2+ years, but the masses keep buying them and that's why they've climbed the ranks in the Steam Hardware Survey with absolutely massive marketshare.
Posted on Reply
#48
RandallFlagg
Chrispy_: I suspect the 5060Ti 16G will still be 10% slower than a vanilla 4070, though it'd be nice to be wrong.

The 5060 will sell just fine at $250+ because it's better than the 4060 which was still selling like hotcakes at $290 before all the stock dried up a couple of months ago. If anything, the shortage of 40-series GPUs and the delayed launch of the 5060-series means that there'll be massive demand that outstrips supply and pushes prices up even higher. Don't be surprised to see 5060s selling for almost $400 in the first few months of launch.

I know I've shouted that >$200 is too much for an 8GB GPU until I'm blue in the face for the last 2+ years, but the masses keep buying them and that's why they've climbed the ranks in the Steam Hardware Survey with absolutely massive marketshare.
The 5060 should be at 4060 Ti / 3070 performance levels, so it will be very popular. Ever since the 3060 Ti, there has been an abnormally huge performance gap between the XX60 and XX60 Ti, mostly because the non-Ti had really minimal performance bumps. Based on the specs, this should get that back to normal.

You are right though, a lot of folks just default to an XX60 card and have for many years. Nvidia has kinda been hosing them for the past two generations. To be fair, Nvidia blew it out of the water with the 2060 though. There was a huge gaping hole between a 1060 and a 2060, and the Super made that hole even bigger.

Posted on Reply
#49
ZoneDymo
AnarchoPrimitiv: You're really holding a company with 9% market share to equal responsibility with a company with 90% market share for the current state of the market?
I'm comparing one billion-dollar company to another, and apart from that I fail to see how market share matters at all in this discussion.
Posted on Reply
#50
Chrispy_
RandallFlagg: To be fair, Nvidia blew it out of the water with the 2060 though. There was a huge gaping hole between a 1060 and a 2060, and the Super made that hole even bigger.
That was partly because the 2060 was the most egregious price bump the x60 series has ever had. The x60 series had been hovering a little north of $200 for a while and then BOOM, $349.

$199 GTX 460 (768MB)
$199 GTX 560 (1GB)
$229 GTX 660 (1.5GB)
$249 GTX 760 (2GB)
$229 GTX 960 (2GB*)
$249 GTX 1060 (6GB)

We were hoping for $249, expecting $279, and not ready for $349. Also, this was near the beginning of the miserly VRAM allocations. Nvidia were ahead of the curve with VRAM until the 2060, which was the first card I've ever sold prematurely for running out of VRAM in new games whilst it was still basically brand new. Yes, I was playing on maximum settings at 1440p, but that's no excuse for a graphics card that was already 40% more expensive than expected, and no 60-series GPU had ever failed to meet the VRAM requirements of current-gen games before. It was truly a landmark failure in my eyes.

As disappointed as many people were with the 3060, I thought it was a great card. It brought the price back down from the 2060's $349, it solved the VRAM issues of the 2060, and it was just well-rounded, or at least a more rounded, balanced GPU than the alternatives at the time.
Posted on Reply