
NVIDIA Launches GeForce RTX 5060 Series, Beginning with RTX 5060 Ti This Week

Yeah, in the above examples, it's running 1440p ultimate, which I think is going to push past the limits of 8GB quite often on a modern title.
Yet that didn't happen in the JegsTV video. He ran 1440p Ultra or max in all of those games and VRAM usage never showed itself to be a problem. A 5060/ti is a better GPU than the 4060/ti and will suffer even less than was shown. TPU's benchmark numbers support that conclusion.

I’m not sure Nvidia needs to be “kept in check”, nor is it a journalist’s job to do so.

Finally, Linus is far from being a journalist.
To a point they do. However, NVidia and AIB partners are under no obligation to grant anyone review samples at all. You're right on the point that Linus and LTT are entertainers, not professional journalists.
 
I'll take TPU and JegsTV's numbers over HUB all day, any day, every day.

I might start taking HUB seriously when they fire Steve. Until then they are on the “pander to fanboys for clicks” list.
 
I quit watching HUB like 18 months ago, or so.. the fake rage was too much. Buy it or don't.. but crying about it won't change anything, and it hasn't in all these years lol. Both companies are still making 8GB cards in 2025 lol.
 
I might start taking HUB seriously when they fire Steve. Until then they are on the “pander to fanboys for clicks” list.
Steve Walton IS HUB. He owns it, at least as far as I know. There's no getting rid of him; there's only taking everything they say with a HUGE grain of salt and not trusting any of it without supporting, independent verification.

Buy it or don't.. but crying about it won't change anything
Right there, that's the answer! Vote with the wallet. I'd bet real money plenty of people are going to buy the 5060/ti 8GB cards and will be happy with them, when they're in stock and the prices come back down.
 
I'm not being hostile, if I were, you would have zero doubt of it. You're sidestepping the issue. Please explain how the PS5 RAM scheme has anything to do with this discussion, I'm genuinely curious.

Not from what I've seen. TPU benchmarks don't show that, and neither does anyone with credibility.
An example from someone whose testing is actually trustworthy:
This gentleman shows the numbers and the settings used in real time.
There is very little difference between 8GB and 16GB at 1080p or 1440p.

I'll take TPU and JegsTV's numbers over HUB all day, any day, every day.
That's a video from almost 2 full years ago. Nobody is claiming that 16GB was particularly helpful in June 2023 - there were a few situations where 8GB cards completely fell over, but they were rare and didn't represent the majority of games.

Fast forward to April 2025 and there are plenty of games that demonstrate issues on 8GB, as well as a couple of games that flat out refused to even run on 8GB GPUs until the issue was highlighted and patched. Developers have been increasingly vocal about how 8GB GPUs are holding them back, and the time wasted trying to juggle fidelity around to make an 8GB GPU viable is effort that could instead have been spent on more content, polish, or more significant bug fixes.

Even in games where 8GB GPUs seem to run fine while 12GB and 16GB cards allocate >8GB of VRAM, you can now see the damage that buffering out to system RAM causes: spikes in the frametime graph, lower minimum FPS numbers, and slightly slower average FPS. In hindsight, a new 8GB GPU purchased less than 18 months ago, such as a 4060 Ti, 4060, or 7600, is looking like a mistake: it will be hampered by VRAM shortages and require compromises before the warranty has even run out.
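If anyone wants to check their own logs rather than eyeball a reviewer's graph, the arithmetic behind those numbers is simple. Here's a minimal sketch in Python, assuming a per-frame frametime trace in milliseconds (the kind of data PresentMon or CapFrameX can export) and using one common definition of "1% lows" (the FPS equivalent of the average of the slowest 1% of frames); the two example traces are made up purely to illustrate the pattern:

```python
# Minimal sketch: how VRAM spill to system RAM shows up in the stats.
# Assumes frametimes_ms is a per-frame render-time trace in milliseconds.

def summarize(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # One common "1% low" definition: FPS equivalent of the average
    # of the slowest 1% of frames.
    slowest = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# Hypothetical traces: a flat 16GB-card run vs an 8GB run where 1% of
# frames stall while data shuffles over PCIe to/from system RAM.
steady = [16.7] * 1000                 # ~60 FPS, flat frametime graph
spiky = [16.7] * 990 + [100.0] * 10    # same game, periodic 100 ms spikes

for name, trace in (("16GB-like", steady), ("8GB-like", spiky)):
    avg, low = summarize(trace)
    print(f"{name}: avg {avg:.1f} FPS, 1% low {low:.1f} FPS")
# 16GB-like: avg 59.9 FPS, 1% low 59.9 FPS
# 8GB-like:  avg 57.0 FPS, 1% low 10.0 FPS
```

Note how the average barely moves while the 1% lows collapse; that's exactly the signature an 8GB card shows once its working set spills out of VRAM.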

For Nvidia to offer 8GB cards even now, when a sizeable fraction of commonly-benchmarked games in 2025 already suffers to some extent with 8GB, is awful, because these lower-end GPUs are bought by more casual gamers who have much longer upgrade cycles than most of us here. Hiding those 8GB GPUs from the review cycle is dishonest, because Nvidia knows that in a typical 2025 review suite of games, 8GB cards are going to have issues that didn't really exist in the typical 2023 suite of games when the 4060 Ti came out.
 
Steve Walton IS HUB. He owns it, at least as far as I know. There's no getting rid of him; there's only taking everything they say with a HUGE grain of salt and not trusting any of it without supporting, independent verification.
Oh, bummer, I thought the reasonable one - Tim - would be the boss.

Hiding those 8GB GPUs from the review cycle is dishonest
Any reviewer is free to buy an 8GB card, do what they want with it, and write what they want about it.

Nvidia is under no obligation to give any specific product to any specific person.

Seriously, you need to calm down. Having products at multiple price points for people of varying budgets is a good thing.
 
Any reviewer is free to buy an 8GB card, do what they want with it, and write what they want about it.
Reviews don't just pop into existence the instant a reviewer has a card.

The reviews you're reading today are for cards that have been with reviewers for a couple of weeks already. Nvidia intentionally prevented 8GB reviews from landing today by blocking access to 8GB cards over the last two weeks or so. Multiple outlets have independently confirmed this, and it's a douchebag anti-consumer move, while also being pretty unfair on reviewers, who will now have to purchase a GPU out of their own pocket and crunch/rush to get a timely review out.
 
Calling HW Unboxed AMD fanboys is hilarious; you know they have good reviews when they upset the Nvidia mindshare.
And voting with the wallet won't work either; Nvidia knows people will buy the 8GB cards regardless of reviewers not being allowed to have review cards. Only allowing the 16GB version puts the xx60 Ti tier in its best light and is very anti-consumer. It is sad that people will still defend Nvidia for such nonsense.
 
That's a video from almost 2 full years ago. Nobody is claiming that 16GB was particularly helpful in June 2023 - there were a few situations where 8GB cards completely fell over, but they were rare and didn't represent the majority of games.
You're missing the context; 2 years ago doesn't matter. The resolutions and GPU in question matter. The situation hasn't changed. Game devs are STILL targeting 8GB cards and they are optimizing for them.
8GB is still fine, and I'd bet real money that when W1z gets his 8GB cards in, those tests will show that. Most gamers are still on 1080p, with some moving to 1440p. Very few are any higher. For those resolutions, 8GB is enough for 99.9% of games. If someone wants to play something that needs more, then they should buy something more.

It is sad that people will still defend Nvidia for such nonsense.
I would say the exact same thing about AMD's cards. The upcoming 9060/XT cards will have 8GB variants and I'd bet the same real money that they will also be just fine. This particular debated point is not about NVidia or AMD specifically, as it relates to them both.
 
have been with reviewers for a couple of weeks already. Nvidia intentionally prevented 8GB reviews from landing today by blocking access to 8GB cards over the last two weeks or so. Multiple outlets have independently confirmed this.
2023 98% of games were fine on 8GB
2024 95% of games were fine on 8GB
2025 85% of games are fine on 8GB.

Today's GPUs aren't disposable purchases that will be discarded in 2026 when the 5060 Super refresh arrives; they need to work in 2029 or beyond.
 
2023 99.5% of games were fine on 8GB
2024 99% of games were fine on 8GB
2025 99% of games are fine on 8GB.
Fixed that for you. Until meaningful and verifiable data is presented, that will be my stance.
they need to work in 2029 or beyond.
That is expecting a bit much. 2027 or maybe 2028. A lot of people upgrade every 2 to 3 years.
 
Fixed that for you. Until meaningful and verifiable data is presented, that will be my stance.
That's an opinion you're entitled to hold, though I suspect you won't have to wait long, as reviews of the 8GB 5060 Ti and comparisons to the 16GB variant are likely to start surfacing in the next few days now that reviewers have access to the 8GB cards. The suite of games tested in today's reviews is largely 2024 games, with a few 2025 titles thrown in. I'm very much expecting the 8GB to underperform in several of them, not 1%.
A lot of people upgrade every 2 to 3 years.
Steam hardware survey proves conclusively that an even greater number do not.
 
That's an opinion you're entitled to hold, though I suspect you won't have to wait long, as reviews of the 8GB 5060 Ti and comparisons to the 16GB variant are likely to start surfacing in the next few days now that reviewers have access to the 8GB cards.
Yeah, we'll see. My personal AND professional guess is that not much will change.
The suite of games tested in today's reviews is largely 2024 games, with a few 2025 titles thrown in. I'm very much expecting the 8GB to underperform in several of them, not 1%.
That is possible. However, the point above is still valid. Most gaming PCs still have, and will continue to have for some time to come, 8GB of VRAM. So games are going to continue being optimized for that amount.
Steam hardware survey proves conclusively that an even greater number do not.
The Steam survey can't show those kinds of numbers. They're not even implied, as there is no way for them to publish that kind of data without running afoul of legal codes in various places in the world.
 
Reviews don't just pop into existence the instant a reviewer has a card.

The reviews you're reading today are for cards that have been with reviewers for a couple of weeks already. Nvidia intentionally prevented 8GB reviews from landing today by blocking access to 8GB cards over the last two weeks or so. Multiple outlets have independently confirmed this, and it's a douchebag anti-consumer move, while also being pretty unfair on reviewers, who will now have to purchase a GPU out of their own pocket and crunch/rush to get a timely review out.
How exactly did Nvidia “block” this access?

And who cares? Anti-consumer because they want their best face to show on day one? Will you be this riled up when AMD finally gets around to competing in this segment? (I’m willing to bet not.)

Again, what is the problem with giving more choice to consumers? You not liking the particular choices being provided doesn’t mean others shouldn’t have the choice.

Let’s see how you felt about AMD offering 16GB cards that would run out of horsepower long before they ran out of memory, a true anti-consumer move.
 
To be fair, you said not one graph shows the 16GB card being any better. That's misleading, isn't it?

Multiple different websites and YouTubers have produced similar results of 8GB cards collapsing in performance when VRAM is exceeded, where the 16GB card sails along unaffected. Yes, people can turn down some settings and mitigate the problem, but on a 16GB card you don't have to - that's the point. Pretending the two cards are all but identical in performance with nary a sliver between them is false.
Misleading indeed. I won't bother replying to him. Just ignore him. He argues totally invalid points. Then gets mods to remove posts that disagree with him.
For those that are actually open minded, Daniel Owen showed how two of the nine games he tested already had problems with 8GB cards at 1080p medium settings. Not 4K. Not even 1440p. 1080p medium, and the 8GB card achieved only 60% of the 16GB card's 1% lows. Let that sink in. In one game at 1080p max (non-RT, btw) the 8GB card completely crashed. It will only get worse from here, and anyone arguing otherwise has an agenda, not truth, in their mind when they type.

I moved from 8GB to 11GB in 2021 because even back then I had started noticing the problems. Others are catching up now.
It's all the more bizarre that Nvidia is so stingy with VRAM. Memory is cheap. Incredibly cheap compared to the silicon itself.
It's not CPU cache, where you can't just double or quadruple the L1 or L2 due to hit rate and latency issues. VRAM does not have these restrictions.
 
You're missing the context; 2 years ago doesn't matter. The resolutions and GPU in question matter. The situation hasn't changed. Game devs are STILL targeting 8GB cards and they are optimizing for them.
8GB is still fine, and I'd bet real money that when W1z gets his 8GB cards in, those tests will show that. Most gamers are still on 1080p, with some moving to 1440p. Very few are any higher. For those resolutions, 8GB is enough for 99.9% of games. If someone wants to play something that needs more, then they should buy something more.

Here's another one where the difference is notable: the 16GB card is 23% faster than the 8GB. At 1080p.

[Chart: RT performance at 1920×1080]
 
Steve Walton IS HUB. He owns it, at least as far as I know. There's no getting rid of him; there's only taking everything they say with a HUGE grain of salt and not trusting any of it without supporting, independent verification.
It's rather unfortunate salty Steve is at the helm, and I truly believe it's taken a toll on subscribership.

I've likely said it before, but my thoughts on HUB are this;

I have no reason to doubt the actual numbers they provide; they seem largely in line with other publications when you account for their test system and testing methodology, specifically paying attention to each game's tested settings.

My issues are threefold.
  1. The testing methodology constantly changes, and it's becoming evident Steve does so to prove whatever point he wants to on the day, for example running some games within one video on reduced or competitive settings but others with texture packs or max RT, just to cripple cards with VRAM amounts he personally doesn't agree with (as if anyone would actually play that way).
  2. His lengthy subjective commentaries at the beginnings and ends of videos; I come for the numbers, not his (sometimes borderline unhinged) rants that feel disconnected from the actual testing performed.
  3. An extension of 2, specifically: when he gets dragged down to toxic fanboy level, taking flame bait and feeling compelled to respond with cringey "I told you so" commentary or even entire videos.
Somehow, Tim manages to entirely avoid 2 and 3, and while his methodology can change, it's leagues more consistent overall. He's just the more balanced, objective person, who doesn't come across as salty at all and just kind of sticks to the point.

Then gets mods to remove posts that disagree with him.
Mods don't do that here; the only reason they would remove those posts is if they otherwise violated the rules in some manner. If you drew a Venn diagram there might be crossover, but it's not accurate, and it's rather insulting to the mods imo to claim they do this.
 
Calling HW Unboxed AMD fanboys is hilarious; you know they have good reviews when they upset the Nvidia mindshare.
And voting with the wallet won't work either; Nvidia knows people will buy the 8GB cards regardless of reviewers not being allowed to have review cards. Only allowing the 16GB version puts the xx60 Ti tier in its best light and is very anti-consumer. It is sad that people will still defend Nvidia for such nonsense.
When are you going to post your specs?
 
Using RT is optional
Using Max settings is optional
Using brains is optional

Games work fine with 8GB VRAM GPUs if you use your brain, turn off RT, and lower settings

I know that, and everyone pointing out the 8GB cards' shortcomings has acknowledged it too. The reason for posting this evidence is to counter those who claim there is no difference between the cards. You do want buyers to make a truly informed decision don't you? And not be disappointed?
 
I know that, and everyone pointing out the 8GB cards' shortcomings has acknowledged it too. The reason for posting this evidence is to counter those who claim there is no difference between the cards. You do want buyers to make a truly informed decision don't you? And not be disappointed?
To be perfectly honest, we can argue 8GB this, 8GB that, as there are certainly arguments for both sides (more VRAM is ideal for future-proofing, but most games run fine with 8GB right now, yadada), but the real issue is that the amount of VRAM in what are effectively entry-level cards now (xx60, x600, etc.) is stagnating, which is gonna catch up and become noticeable eventually even if it isn't right now. That's where my opinion lies right now.

My 2080S is fine.. for now, but it will eventually be keelhauled for realsies instead of just in some very specific edge cases. Probably in another 2 to 4 years.. honestly, as much as I don't agree with the belief that 8GB isn't enough *now*, it won't be enough eventually.

I think NVIDIA (and AMD) should start targeting 192-bit memory buses, or 160-bit for the entry-level cards. Intel set a good example imo. There is definitely some concern about how that would affect the rest of the performance, but Intel pulled it off (though who knows how that would affect the price).
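The link between bus width and VRAM capacity is just channel arithmetic, which is why those bus numbers matter. A rough sketch, assuming one GDDR6/GDDR7 chip per 32-bit channel with the common 2GB (16Gb) chip density (the helper function is hypothetical, and clamshell mounting is how 128-bit cards like the 5060 Ti reach 16GB):

```python
# Rough sketch of why bus width pins down VRAM capacity on GDDR cards.
# Assumption: one memory chip per 32-bit channel, 2GB (16Gb) per chip.

def vram_options(bus_width_bits, chip_gb=2):
    chips = bus_width_bits // 32   # one chip per 32-bit channel
    normal = chips * chip_gb
    clamshell = normal * 2         # two chips per channel doubles capacity
    return normal, clamshell

for bus in (128, 160, 192):
    normal, clam = vram_options(bus)
    print(f"{bus}-bit: {normal}GB, or {clam}GB in clamshell")
# 128-bit: 8GB, or 16GB in clamshell  <- the 5060 Ti split
# 160-bit: 10GB, or 20GB in clamshell
# 192-bit: 12GB, or 24GB in clamshell
```

So a wider bus gets you past 8GB without the cost of clamshell boards, which is the argument for 160-bit or 192-bit at the entry level.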

Honestly, I just want to see the VRAM in entry-level cards go up again. (Unrelated: it's sad that cards that are $350+ USD are called entry level now.)
 
Here's another one where the difference is notable: the 16GB card is 23% faster than the 8GB. At 1080p.
Ooo, ok. So ONE more game, out of hundreds (thousands?) of games that have been released this decade.

Just ignore him.
Yes, please do.
He argues totally invalid points.
I argue against misinformed, meritless and silly opinions. There's a difference, one you've missed.
Then gets mods to remove posts that disagree with him.
I have no such influence or power as the mods themselves would be all too happy to verify. That statement alone shows just how full of "nonsense" you are.

Using RT is optional
Using Max settings is optional
Using brains is optional

Games work fine with 8GB VRAM GPUs if you use your brain, turn off RT, and lower settings
Exactly! No one runs their games on Max/Ultra settings. People with budget and mid-range cards always adjust their settings to get good looks and reasonable frame rates. That's how PC gaming has always worked.
 
Ooo, ok. So ONE more game, out of hundreds (thousands?) of games that have been released this decade.
If you're playing decade-old games, you don't need to buy a new graphics card at all.

There are several AAA games from 2023 and 2024 that run worse on 8GB cards. It's not about whether they are unplayable at max settings; it's about them simply being worse than cards with more VRAM, which is very likely to reduce their useful lifespan in terms of playing new games in the future, and the reduced performance/$ hurts their value, which is a big part of how reviewers conclude whether a GPU is good or bad.

There's no need to move the goalposts; the discussion on VRAM is that 8GB is unquestionably becoming a problem as games evolve and VRAM capacity does not. Right now, how bad 8GB is depends on what games you intend to play and at what settings, but I'm making the assumption that someone buying a new graphics card wants games to look their best, because if they were okay with dialled-back settings, there really aren't many games that can't be played to a decent standard on affordable GPUs from 5+ years ago.

Conversely, if you are buying a new system and don't have an older GPU to run newer AAA games at medium/low settings then $200 8GB cards are still viable. 8GB is the new minimum devs are targeting and as long as you are okay with minimum settings, they still (currently) have a place.

Using RT is optional
That stopped being true for some games over a year ago.
I'm not a fan of forced RT from game developers, but I'm not going to ignore facts. There are now games that no longer run at all on perfectly capable cards like the RX 5700 or 1660 Super.
 
If you're playing decade-old games, you don't need to buy a new graphics card at all.
I said this decade, as in the last 4 years and change. Context is important. Not answering the rest of your comment because it is out of context.
 
I said this decade, as in the last 4 years and change. Context is important. Not answering the rest of your comment because it is out of context.
So you did. I misread it as last decade because you said hundreds or thousands of games.

In the context of games that require a new GPU, it's really not that many. Each year only a few dozen titles are AAA enough to push hardware. Of the 15,000 Steam games added in 2024, 14,000 of them run on a potato and a further ~950 still run well on an older graphics card like the GTX 1060, which leaves only around 50 titles a year that genuinely push hardware.
 