
NVIDIA GeForce RTX 5060 Ti 8 GB

Joined
Jul 15, 2020
Messages
1,054 (0.60/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
TL;DR

If you're considering this 8 GB card, wait for the 12 GB 5050ryx. It will do better.
 
Joined
Mar 21, 2016
Messages
2,676 (0.80/day)
The cooler looked disappointing, especially since the card seemed to OC decently otherwise, and it had a real impact on the GPU design itself. The 4K results, and the path tracing in Alan Wake in particular, were awful. 1440p looks iffy and probably a bit hit or miss, especially once you factor in the 4K results and think about mods and post-processing with ReShade. 1080p seems pretty much fine, though, with not much concern at all.
 
Joined
Feb 26, 2019
Messages
89 (0.04/day)
While most people are hung up on the VRAM discussion, I think the real discussion should be that, two generations later, a 60 Ti card cannot beat an 80-class card. That is laughable and an insult.

Granted, the 5060 Ti should have been a 5060 at most when looking at its specs, but it is still losing to a 3080 lmao.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,632 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
but are you planning on doing a performance review of Indiana Jones?
No plans, the game has just been out for way too long, no way to justify the time=$ investment

A problem for reviewers is that not only are drivers and features changing gen over gen, but so are the games
and that's why I retest everything every few months, check the list at the end of the setup page

How is it sometimes losing so badly to the 8 GB 4060 Ti? Surely that should be impossible: same architecture, same amount of memory but faster, more cores, yet it's losing like this?
I retested the 5060 Ti 8 GB and the results are better now. The charts have been remade (also the summaries)
Not sure why it was so slow on the first runs

This game has some strange and possibly random behavior with 8GB cards.
This, thanks for reporting the odd numbers
 
Last edited:
Joined
Jul 5, 2013
Messages
30,546 (7.07/day)
Cherry picking as I understand it is to choose data points that support a conclusion while ignoring the ones that don't. Highlighting an outlier and examining why it's an outlier is not cherry picking: it's analysis.
Except when there are only a couple of example data points. Then it comes off as cherry-picking, especially when the same user has been bashing the very same product for a while before the performance numbers for the review were published.

I retested the 5060 Ti 8 GB and the results are better now. The charts have been remade (also the summaries)
Not sure why it was so slow on the first runs
Of course there is this.
EDIT: Not seeing the difference. What was corrected?
EDIT2: Any plans to review a 5060 non-ti 8GB card?

Talk of cherry-picking, you're ignoring all of the RT tests
How so? Nothing stands out. Care to state an example?
 
Last edited:
Joined
Mar 21, 2016
Messages
2,676 (0.80/day)
Except when there are only a couple of example data points. Then it comes off as cherry-picking, especially when the same user has been bashing the very same product for a while before the performance numbers for the review were published.


Of course there is this.
EDIT: Not seeing the difference. What was corrected?
EDIT2: Any plans to review a 5060 non-ti 8GB card?


How so? Nothing stands out. Care to state an example?

I think it was to do with the performance of the 5060 Ti 8 GB relative to the 4060 Ti 8 GB.
 
Joined
Jun 7, 2023
Messages
222 (0.32/day)
Location
Holy Roman Empire
System Name DeathStar /// DeathStar II
Processor AMD Ryzen 7 7700 "Raphael" /// AMD Ryzen 9 5900X "Vermeer"
Motherboard Asus ROG Strix X670E-E Gaming /// Asus ROG Strix B550-E Gaming
Cooling Noctua NH-D15S chromax.black /// Thermalright Royal Knight 120 SE
Memory Kingston Fury Beast RGB 6000C30 2x16GB /// G.Skill Trident Z RGB 3200C14 4x8GB
Video Card(s) Zotac GeForce RTX 4080 Trinity 16GB /// Palit GeForce RTX 3090 GameRock 24GB
Storage Kingston KC3000 2TB + Samsung 970EVO 2TB + 2x WD Red 4TB /// Samsung 970EVO 1TB + WD Red 4TB
Display(s) Gigabyte M27Q /// Dell U2715H
Case Fractal Define 7 Black /// Fractal Define R5 Black
Power Supply Seasonic SS-750KM3 /// Seasonic SS-660KM
Mouse Logitech MX Master 2S /// Logitech MX Master
Keyboard Logitech K270 /// Logitech K270
Software Linux Debian 12 x64 + MS Windows 10 Pro x64
It will surely have a reserved spot on the "Disappointment of the Year" tour T-shirt provided by GN.
 
Joined
Jun 14, 2020
Messages
4,931 (2.77/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Is a lower quality experience in 1/4 of your newer games worth saving $60-70 or 15-20%? Some of this may be mitigated with settings changes and some may not, it's up to everyone to assess the cost benefit.
Yes, absolutely. If $60-70 only gets me a move from high to ultra textures, I'm saving that $60-70 and laughing all the way to the bank.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,632 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Any plans to review a 5060 non-ti 8GB card?
I want to, but I'm not sure about the timing and mechanics. It'll probably be a launch without reviews, during Computex, so reviewers don't have time to review it properly. I'll buy one eventually, of course.

I retested the 5060 Ti 8 GB and the results are better now. The charts have been remade (also the summaries)
Not sure why it was so slow on the first runs
Ok so I actually retested the wrong card t.t

Just tested back to back, RTX 5060 Ti 8 GB and RTX 4060 Ti 8 GB, and the 4060 Ti really does run considerably faster in TLOU at 4K. This must be some memory management issue on Blackwell, because, as you correctly noted, the expectation should be that the 5060 Ti 8 GB is faster, since it has the same or more of everything.
 
Joined
Sep 15, 2011
Messages
7,015 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) KUROUTOSHIKOU RTX 5080 GALAKURO
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Is it me, or is this card faster than the 16 GB variant at 1080p?
I think for $350, this is a good card for 1080p gaming.
 
Joined
May 17, 2021
Messages
3,842 (2.66/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
It's a stupid card that shouldn't exist at this price point, but it seems like it's fine for 1080p.
This would be kind of a 2025 GTX 1050 Ti, maybe. The price should be a lot lower: $200 accounting for inflation, and that's being insanely generous considering that one launched at $139.
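For context, here is a back-of-the-envelope check of that inflation claim, as a minimal sketch; the ~1.32 cumulative US CPI factor from late 2016 to 2025 is an assumed round figure, not an official number:

```python
# Rough inflation adjustment of the GTX 1050 Ti's $139 launch price.
# ASSUMED_CPI_FACTOR is a hypothetical round figure, not official CPI data.
GTX_1050_TI_MSRP_2016 = 139
ASSUMED_CPI_FACTOR = 1.32

adjusted = GTX_1050_TI_MSRP_2016 * ASSUMED_CPI_FACTOR
print(f"$139 in 2016 is roughly ${adjusted:.0f} in today's money")  # ~$183
```

Under that assumption, the $200 figure above is indeed already on the generous side.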
 
Joined
Feb 18, 2013
Messages
2,202 (0.49/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine Reborn :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) ASRock ARC B580 Challenger 12GB (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
Software Tweaked Windows 11 Professional x64 (Update 24H2) OS
So, being roughly 2x faster than the Arc B580 at both 1080p and 1440p while also being a card that can actually be bought *at* MSRP is both rare and somewhat expected. No one is going to consider the 5060 after this, considering that card will be just as good as or worse than the B580 at both resolutions.
 
Joined
Jul 5, 2013
Messages
30,546 (7.07/day)
I think it was to do with the performance of the 5060 Ti 8 GB relative to the 4060 Ti 8 GB.
That is a curiosity. I'm starting to suspect that the latest driver issues are part of that odd result.

I want to, but I'm not sure about the timing and mechanics. It'll probably be a launch without reviews, during Computex, so reviewers don't have time to review it properly. I'll buy one eventually, of course.
Fair enough. I look forward to it when it happens.

Just tested back to back, RTX 5060 Ti 8 GB and RTX 4060 Ti 8 GB, and the 4060 Ti really does run considerably faster in TLOU at 4K. This must be some memory management issue on Blackwell, because, as you correctly noted, the expectation should be that the 5060 Ti 8 GB is faster, since it has the same or more of everything.
NVIDIA is having some fairly serious driver issues currently; might that be related somehow? It seems plausible.
 
Last edited:
Joined
Nov 1, 2008
Messages
495 (0.08/day)
System Name It does stuff
Processor Ryzen 5700X3D
Motherboard B550 Gaming X V2
Cooling Thermalright Assassin X SE
Memory 16GB DDR4 3600
Video Card(s) RX 6700XT
Storage Too much
Display(s) 31.5" 2560x1440, 27" 1920x1080, 24" 1200x1920
Case Antec 300
Audio Device(s) Xonar DGX
Power Supply Corsair 750W
Tiny 181 mm² 5050 die, sold as a 5060
 
Joined
Jul 24, 2024
Messages
562 (1.99/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
I've read multiple reviews here and there, and yet again, TPU's results tend to differ, and not by a small margin.

Why is the DLSS section missing? It was present for the 16 GB variant. Is this intentional?

One doesn't need to cherry-pick in order to prove this card has serious problems.
Compared to the two-generations-older RTX 3070, this card is absolutely pathetic.
This card should have been named RTX 5050 Ti; knock 40% off the price and it would be just okay.
Even mighty Nvidia did not want this card to be reviewed alongside the 16 GB version. Now we know why.

Blackwell was cooked too quickly, with minimum focus on gamers. The still-ongoing RTX 5000 driver disaster is a direct consequence of that.
 

demonicmaniac

New Member
Joined
Mar 15, 2024
Messages
6 (0.01/day)
As I said before, I don't hate the card - I find it stupid from NVIDIA's standpoint, as it is fairly obvious the GPU could do so much more if it had more VRAM. HUB did some testing at different quality settings with frame generation, DLSS and RT enabled, and it showed the card averaging 30% slower due to VRAM limits in a lot of games, and up to 90% lower 1% lows with frame generation enabled due to running out of VRAM, even at medium settings. That's why I argue it doesn't make "sense" from a technical standpoint: NVIDIA is clearly bottlenecking the card, and a 5060 with fewer units and 8 GB makes more sense. From a pricing standpoint, filling the slot between the 5060 and the 5060 Ti 16 GB makes sense for consumer perception and for having something at each price point, but technically the card is simply heavily bottlenecked and unable to utilize its power. Compared to the other disappointments, in the EU the 5060 Ti being available at release for MSRP with no scalping, after the previously insane 40-series prices since they stopped manufacturing, is a good thing for sure, and finally something below 700€ for people to buy. It's really just a case of "why couldn't it have been so much better", and 8 GB of VRAM can't cost NVIDIA that much.
At least we finally have something between 300€ for the B580/7600 and 700€ for the 4070/5070/9700 that offers at least no worse value than previous gens.
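For readers unfamiliar with the metric, here is a minimal sketch of one common way "1% lows" are computed (conventions differ between reviewers; the frame times below are made up to mimic VRAM-induced stutter, not measured data):

```python
import statistics

def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    """One common convention: the 99th-percentile frame time
    (the threshold the slowest 1% of frames exceed), expressed as fps."""
    p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]  # 99th percentile
    return 1000.0 / p99_ms

# Hypothetical capture: mostly smooth ~60 fps frames with a few long VRAM-swap spikes.
frames = [16.7] * 990 + [120.0] * 10
print(f"average fps: {1000 / statistics.mean(frames):.0f}")   # ~56 fps, looks fine
print(f"1% low fps:  {one_percent_low_fps(frames):.0f}")      # ~8 fps, feels like stutter
```

This is why an average-fps bar can look acceptable while the 1% lows collapse once the card starts swapping textures over PCIe.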
 

Calicifer

New Member
Joined
Apr 4, 2025
Messages
23 (0.85/day)
While I do agree with the sentiment about VRAM, this launch highlighted the general cluelessness of tech reviewers and gaming enthusiasts. I often see it from YouTube tech channels, and a little bit here too. Why the hell is an xx60-class card being measured at 4K? Of course it is going to perform poorly there! You are taking a 1080p card, which could be tested at 1440p too, and purposefully putting it in situations it is not designed for, only to create forced examples where it is broken.

It seems that people have grown very entitled recently. I remember getting my first GPU, a GTX 760, and running Crysis 3 to really stretch it out. It ran at around 30-40 FPS at 1080p and I was so happy. Now it's 4K at 60 FPS from your xx60 card or it's trash. Like, wtf, people?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,632 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Why the hell xx60 class card is being measured at 4k?
I can only speak for my own testing, but I test all cards at three resolutions to provide the data. If you've read my conclusion, I'm very clear that the 5060 Ti is not a 4K card; even 1440p will be difficult in some titles.
 

Calicifer

New Member
Joined
Apr 4, 2025
Messages
23 (0.85/day)
Yes, I had read that too. This is why I tried not to mention you directly. If you want a direct example, you can check Hardware Unboxed's 'review' of this GPU. They start with a cherry-picked 4K game in order to promote their own personal point. I have no doubt that there are many more examples out there.
 
D

Deleted member 187435

Guest
Yes, I had read that too. This is why I tried not to mention you directly. If you want a direct example, you can check Hardware Unboxed's 'review' of this GPU. They start with a cherry-picked 4K game in order to promote their own personal point. I have no doubt that there are many more examples out there.

The trouble is, they have to do that, because otherwise you get people saying 'VRAM doesn't matter, because by the time you exceed 8GB, the card isn't powerful enough for it to make any difference anyway'. A lot of people say that; it's considered conventional wisdom at this point. And yet it isn't true, because there are multiple cases where the 16 GB version of these cards (4060 Ti and 5060 Ti alike) provides a good or playable framerate at 4K in some games, but the 8 GB card seriously struggles.

As for why 4K should be used to review cards in this class, it's to show prospective buyers whether the card is up to the task. Of course, it often isn't, but in some games it actually is. Even the 8 GB card does provide 60+ fps in some games at 4K. So if you play those games, and/or are prepared to turn down some settings to play the others, it might still be a viable card for you. It's about letting people make an informed buying decision.
 

demonicmaniac

New Member
Joined
Mar 15, 2024
Messages
6 (0.01/day)
Yes, I had read that too. This is why I tried not to mention you directly. If you want a direct example, you can check Hardware Unboxed's 'review' of this GPU. They start with a cherry-picked 4K game in order to promote their own personal point. I have no doubt that there are many more examples out there.
The HUB review clearly showcases a good dozen games where, at 1080p, 1440p and 4K, the 16 GB variant at various settings achieves a 50+ fps base and >30 fps lows, whereas the 8 GB variant falls below 30 fps and even 10 fps. They don't cherry-pick anything to prove a point; they show where 8 GB bottlenecks and hard-limits the card versus 16 GB, especially when using 50-series features. There are plenty of examples in the HUB review where the 8 GB card has to drop to low or medium at 1080p or 1440p to avoid frame stutters and <30 fps lows, while the 16 GB card delivers 60 fps lows. Since they are the same card aside from VRAM, the point is simply to show that, paired with this much compute power, 8 GB hobbles the card badly, and given the cost of the die, the card is also very expensive for how crippled it is by the VRAM. So you can argue this is a bad product compared to something like a 5060 with 16 GB filling the spot between the 5060 Ti 16 GB and the 5060 8 GB at a lower price point, as the GPU die would be much cheaper.

To make it clear: 8 GB isn't the problem; pairing a die of this class and a base price of $300+ with only 8 GB of RAM hobbles the card and costs too much. A 5060 8 GB is likely fine because there isn't enough compute oomph for 8 GB to become an obvious hard limiting factor. It's just unbalanced, and the price is too high given the size of the die. I'm not arguing the card should cost under $300, since that's bonkers for the die used and NVIDIA would shoot themselves in the foot margin-wise, but that's on NVIDIA for making such an unbalanced product that technically doesn't make sense.
 
Last edited:

Calicifer

New Member
Joined
Apr 4, 2025
Messages
23 (0.85/day)
I think it should be put in its own separate sub-topic in a review, like we do with ray tracing performance. Starting your entire review at 4K with a cherry-picked example feels like a biased review that focuses on pushing a narrative rather than reviewing the actual product. This is why I dislike how the reviews for this card went, even if I agree that 8 GB for such a card is really bad. Again, this is a critique of HUB in this case.

And again, we shouldn't even be testing this card at 4K, as it is not its intended resolution. We only test it at 4K the way we test CPUs at 1080p: to show their longevity. It is not a practical scenario under which anyone will be gaming; well, at least they shouldn't. And to emphasize the point again, the problem is that they started their review with it and didn't properly separate it into its own sub-category. This is a question of GPU longevity, and the actual benchmarking shows that this card is mostly fine.
 
Joined
Jul 24, 2024
Messages
562 (1.99/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
While I do agree with the sentiment about VRAM, this launch highlighted the general cluelessness of tech reviewers and gaming enthusiasts. I often see it from YouTube tech channels, and a little bit here too. Why the hell is an xx60-class card being measured at 4K? Of course it is going to perform poorly there! You are taking a 1080p card, which could be tested at 1440p too, and purposefully putting it in situations it is not designed for, only to create forced examples where it is broken.

It seems that people have grown very entitled recently. I remember getting my first GPU, a GTX 760, and running Crysis 3 to really stretch it out. It ran at around 30-40 FPS at 1080p and I was so happy. Now it's 4K at 60 FPS from your xx60 card or it's trash. Like, wtf, people?
It's perfectly okay to test a card outside of its intended usage scenarios. You do it to find its limits. Sure, no one sane would buy an RTX 5060 Ti-class card to play all the newest titles at 4K 60 Hz. That does not mean it should not be tested at 4K. Take a look here:
[attached chart: 4K performance comparison]

You can clearly see noticeable performance uplifts even when the VRAM amount is the same.
If you take a look at 1080p, which is 5060 Ti territory, you won't notice such performance progress:
[attached chart: 1080p performance comparison]

Thus, resolution provides a great way of testing the limits of a GPU of any class. It shows you how the technology progresses over time.

Also, at lower resolutions you can easily get bottlenecked by your CPU. Check Far Cry 5 tested at 1080p with an RX 6800.
[attached chart: Far Cry 5 at 1080p, RX 6800 vs. other GPUs]


How come the old GTX 1080 Ti or the old RX 5700 XT beats the RX 6800 at 1080p?
In this CPU-limited scenario, you either upgrade the CPU or increase the rendered resolution.
Higher resolution means lower fps, and lower fps means less load on the CPU.
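That relationship is easy to illustrate with a toy model (all frame-rate numbers below are invented for illustration, not measurements): the frame rate you observe is roughly the lower of what the CPU can prepare and what the GPU can render, so once the CPU cap is reached, faster GPUs stop pulling ahead.

```python
# Toy bottleneck model: observed fps is capped by the slower of CPU and GPU.
# All numbers are hypothetical, purely to illustrate the effect described above.
def observed_fps(cpu_fps_cap: float, gpu_fps: float) -> float:
    return min(cpu_fps_cap, gpu_fps)

cpu_fps_cap = 160  # assume the CPU can prepare ~160 frames per second in this game

gpu_fps_1080p = {"older GPU": 155, "newer GPU": 230}  # both bump into the CPU cap
gpu_fps_4k    = {"older GPU": 48,  "newer GPU": 75}   # GPU-bound, the gap reappears

for res, table in (("1080p", gpu_fps_1080p), ("4K", gpu_fps_4k)):
    for name, fps in table.items():
        print(f"{res:>5} {name}: {observed_fps(cpu_fps_cap, fps):.0f} fps")
```

At 1080p both cards land within a few fps of the CPU cap, which is exactly why an older card can appear to match or "beat" a newer one there, while at 4K the GPUs are the limit again and the real difference shows.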
 

Calicifer

New Member
Joined
Apr 4, 2025
Messages
23 (0.85/day)
So why did 4K testing become the most important thing in the HUB review rather than its own sub-topic? And why does its 4K performance decide the review's conclusion outright? Sure, it sucks and has short legs, but HUB bases its entire conclusion on it and makes it the most important thing in the review.

It is like what Nvidia wanted to make them do with RT back in the day: make it the most prominent talking point. They refused to do that, but now they are doing the exact same thing with the VRAM amount. It is not a fair review of hardware. Even VRAM-limited cards are not worthless.
 
Joined
Nov 13, 2024
Messages
354 (2.09/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) RX 9070XT XFX
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
So why did 4K testing become the most important thing in the HUB review rather than its own sub-topic? And why does its 4K performance decide the review's conclusion outright? Sure, it sucks and has short legs, but HUB bases its entire conclusion on it and makes it the most important thing in the review.

It is like what Nvidia wanted to make them do with RT back in the day: make it the most prominent talking point. They refused to do that, but now they are doing the exact same thing with the VRAM amount. It is not a fair review of hardware. Even VRAM-limited cards are not worthless.
Why are we discussing the HUB review in a TPU thread? Please post it under the video you are talking about, as feedback to the person you are criticizing. Thanks.
 