
Rumor: 5060 Ti 16GB VRAM and 5060 8GB VRAM

5060 with 8!!!!!!!
oh boy not even 10 or 12gb…

go nvidia ya scum
Jensen Huang, how much VRAM does the RTX 5060 need?

[meme image]
 

You have to make the Ti version relevant, or everyone would just buy the *060 like they always do.
Usually the Ti versions would come later; now it's just another way to segment the cards.

... and they even added the Super now. 5060 super duper Ti platinum XXX
 
you have to make the Ti version relevant.
Yes, with performance worthy of the asking price. I got my 4060 Ti about a year ago for $299 (the price it should have been at launch).
 
16GB should be the lowest entry point for anything that costs three digits, if you want to keep your GPU for longer than one generation at best.
No. VRAM is important, but 12GB would be more than enough for a $150 graphics card, and there's not much point in getting extra VRAM when your GPU isn't powerful enough to use it effectively. 8GB is going obsolete, but it's still very rare to need more than that at 1080p (there are some games that use more, and they are becoming more common, but they won't be the majority for at least a couple of generations). It's not worth the cost of putting 16GB on a GPU that costs less than about $250, except for specialised workstation tasks; the overwhelming majority of people would be better off with a more powerful GPU with 12GB.
This will change in the future, as VRAM becomes cheaper and games become more demanding, but 16GB on entry-level GPUs will be a waste of money until 4GB GDDR7 chips (so you can get 16GB on a 128-bit bus without clamshelling) become available, which probably won't be until 2027 at the earliest.
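
To make the chip math concrete, here's a rough back-of-the-envelope sketch (assuming the usual one 32-bit channel per GDDR6/GDDR7 chip, with clamshell hanging two chips off a channel; the code is illustrative only):

Code:
# Rough VRAM capacity math: each GDDR6/GDDR7 chip occupies one 32-bit channel;
# clamshell puts two chips on a channel, doubling capacity but not bandwidth.
def vram_gb(bus_width_bits, chip_gb, clamshell=False):
    channels = bus_width_bits // 32
    chips = channels * (2 if clamshell else 1)
    return chips * chip_gb

print(vram_gb(128, 2))                  # 8 GB  -- 2 GB (16 Gbit) chips, e.g. the 4060
print(vram_gb(128, 3))                  # 12 GB -- 3 GB (24 Gbit) chips
print(vram_gb(128, 4))                  # 16 GB -- 4 GB (32 Gbit) chips, no clamshell needed
print(vram_gb(128, 2, clamshell=True))  # 16 GB -- clamshell, the 4060 Ti 16GB approach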
 
No. VRAM is important, but 12GB would be more than enough for a $150 graphics card...

At 1080p, 8GB is still enough today unless you want the cinematic experience of 30fps or less at ultra/epic settings. 12GB should be more than enough.
 
... and they even added the Super now. 5060 super duper Ti platinum XXX
Ti, Super and AI are just there to "muddy the waters" and fool/confuse customers. And it works.
 
I do agree, but it's very idiotic to test these cards on epic settings (a completely irrelevant and pointless setting, and HU even made a video about it) and get shit framerates anyway even if you have 16GB.
Sure, but it's a shame that a 4-year-old 6700 XT 12GB can perform OK in a slow-paced game like this.
A high framerate is not necessary to play this kind of game. Some people can live with 30-40 FPS. I did for years.
And we are talking about the basic 1080p. The entry level resolution.
This is an indicator of where games are going.

$400 MSRP for a 4060 Ti 8GB, and it can't max out 1080p.
BTW, the same game test tries "high" settings too. It's much better, but a $400 card still can't max it out.
What will happen a year from now?

How much will the 5060 cost?

16GB is unnecessary for 1080p.
12GB is OK, but it should be the minimum for 2025 onwards, unless you want your customers to need an upgrade within a year.
 
Sometimes memory count doesn't matter if the card doesn't have enough horsepower. Looking at you, 4070 Ti.
RTX 3060 12 GB
RX 6700/6750 12 GB
RTX 4060 Ti 16 GB
Even as great as the Intel B580 is, it doesn't have the horsepower for 12GB: it loses out to the top 8GB cards at 1080p and 1440p and only matches them at 4K (at which point you're already playing around with settings to get FPS up anyway).

An RTX 5060 8GB (at RTX 4060 Ti performance) at $249 would be fine (just fine), but unless we're taking the trolley to the land of make-believe, Nvidia will be launching that card at $300-330, which is a joke, and if the performance is only a marginal improvement over the RTX 4060, then it's a calamity.

[Relative performance chart at 3840×2160]
 
High framerate is not necessary to play this kind of game. Some people can live with 30-40FPS. I was for years.

There are very few games where 30fps is an acceptable experience, and the trade-off of reducing settings to get at least 60fps is always worth it.

PS: I'm not excusing 8GB cards in 2024/25 in any way.
 
Should just go back to the Maxwell/Pascal era of segmentation. The 5090 is pretty much its own thing on its own die. Put the 5080 and 5070 on the same silicon with the same memory configuration (16GB in this case). Then with the 60-class, drop down to the next die and cut the memory config to 192-bit for 12GB. If you're still going to have a 128-bit 8GB card, make it a 5050, and it should be below $200.

I guess in a way we are kind of getting that with the rumored 5070 Ti being a cut-down GB203 (like the 4070 Ti Super is a cut-down AD103). But below that, the stack definitely suffers similarly to the 40-series. The 4070 was in no way of similar performance characteristics to, say, a 3070 or 1070. In my mind the 4070 is what should have been the 4060 Ti, and I would have considered the 4070 Ti really just the 4070. But hey, call it a Ti and jack up the price. It's the Nvidia way!
 
Because you are comparing this gen vs. next gen. If AMD releases another 8GB card against the 5060, it'll be a dust collector. Any next-gen card with 8GB that costs more than $199 will be a ripoff. Intel has figured it out; AMD had better as well.
Not only that, but wasn't the 7600 XT 16GB MSRP only like $330? That's a lot different from "if you want more than 8GB, you've got to pay $450+ for the premium edition".

Back on topic: I'm not shocked they are sticking with 8GB. If the gen-to-gen performance increase is the same as last gen (which is being very generous, considering this card will be on the same node as last gen), the 5060 would still be slightly slower than a 3070, which is an 8GB card from 2020.
Those 3GB memory chips can't arrive soon enough; this needs at least 12GB now. Hopefully people vote with their wallets.
Yes, 12GB would have been a great successor to the 3060!
 
Back on topic: I'm not shocked they are sticking with 8GB...
Same old shit they did in the 900 line (cough970)
 
I've heard a rumbling that the 5060ti will be 16GB and the 5060 will be 12GB. How that memory bus would work is unknown, but it seems unlikely that they'd do 8GB for the 5060. The 5050 is likely to be the 8GB card..
 
I've heard a rumbling that the 5060ti will be 16GB and the 5060 will be 12GB. How that memory bus would work is unknown, but it seems unlikely that they'd do 8GB for the 5060. The 5050 is likely to be the 8GB card..
By gimping 5060 with 96-bit memory bus!
 
I've heard a rumbling that the 5060ti will be 16GB and the 5060 will be 12GB. How that memory bus would work is unknown, but it seems unlikely that they'd do 8GB for the 5060. The 5050 is likely to be the 8GB card..
Clamshell?
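
Purely speculative, but if the 5060 really were 12GB, the same 32-bit-per-chip arithmetic leaves only a few layouts (which one, if any, Nvidia would pick is unknown):

Code:
# Hypothetical ways a 12 GB card could be wired, one 32-bit channel per chip:
layouts = {
    "192-bit, 2 GB chips":          (192 // 32) * 2,     # 6 chips
    "96-bit, 4 GB chips":           (96 // 32) * 4,      # 3 chips
    "96-bit clamshell, 2 GB chips": (96 // 32) * 2 * 2,  # 6 chips, two per channel
}
for name, gb in layouts.items():
    print(f"{name}: {gb} GB")  # every layout lands on 12 GB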
 
But there are cache improvements, so that will help.
Most of the time, memory bus width is more important than VRAM size. If they do it, it will be another quiet downgrade.
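
For a rough sense of what bus width buys or costs, peak bandwidth is just bus width times data rate. A quick sketch (the 28 Gbps GDDR7 figure is an assumption; final clocks aren't known):

Code:
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(128, 17))  # 272 GB/s -- RTX 4060: 128-bit @ 17 Gbps GDDR6
print(bandwidth_gb_s(96, 28))   # 336 GB/s -- hypothetical 96-bit card @ 28 Gbps GDDR7
print(bandwidth_gb_s(128, 28))  # 448 GB/s -- hypothetical 128-bit card @ 28 Gbps GDDR7

So a narrower bus on much faster memory can still come out ahead of last gen on paper, and a bigger L2 cache cuts how often the GPU has to go out to VRAM at all, but neither adds capacity.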
 
If new games have texture upscaling using Turing cores, then that VRAM may be enough. But not for normal raster.
 
I've heard a rumbling that the 5060ti will be 16GB and the 5060 will be 12GB. How that memory bus would work is unknown, but it seems unlikely that they'd do 8GB for the 5060. The 5050 is likely to be the 8GB card..
If we're lucky. We don't get xx50 cards all that often anymore (I still think there's a place for them, since the ultra-budget options right now are limited). If they actually do that, it'll cement my distaste for the 40 series as a very boring generation (at least below the high end).
 
By gimping 5060 with 96-bit memory bus!
Maybe. That would suck. But if they use fast enough VRAM, it won't matter much for that tier of GPU.

But there are cache improvements, so that will help.
There's also this.

If were lucky. We don't get xx50 cards all that often anymore (I still think theres a place for them, since the ultra budget options rn are limited) . If they actually do that it'll cement my distaste for the 40 series as being a very boring generation (at least at a non high-end level)
Like I said, it was just a rumbling of a rumor. Seems logical though.
 