Monday, December 16th 2024
32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec
Zotac has apparently prematurely published webpages for the entire NVIDIA GeForce RTX 5000 series GPU line-up that will launch in January 2025. According to the leak, spotted by Videocardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month, including the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. The premature listing has seemingly been removed by Zotac, but screenshots taken by Videocardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.
It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or 5090D. Last time we checked in with the RTX 5070 Ti, leaks suggested it would have but 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest that the likes of the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to other, particularly Ti-spec, RTX 5000 GPUs should make them better suited for the ever-increasing demands from modern PC games.
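The machine-learning angle is easy to put a rough number on. Below is a purely illustrative back-of-the-envelope sketch in Python; the 13-billion-parameter model size is an assumption for the example and not part of the leak.

```python
# Illustrative only: weight footprint of a hypothetical 13B-parameter model in fp16.
params = 13e9            # assumed model size, not from the leak
bytes_per_param = 2      # fp16/bf16
weights_gib = params * bytes_per_param / 2**30
print(f"~{weights_gib:.0f} GiB for weights alone")   # ~24 GiB
# Activations and KV cache come on top, so 24 GB cards are tight while 32 GB leaves headroom.
```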
Sources:
VideoCardz, Wccftech
174 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec
Only a denier would look at 7.6GB used on an 8GB card and say, yeah, that ought to be enough... You did? How? You've provided zero links to disprove those YouTubers. Who are people gonna believe?
A random guy with a few hundred posts on a forum he joined a few weeks ago, or a person with decades of experience and millions of subs? Only an Nvidia fanboy would say something as moronic as that. I think everyone who's rational would agree that Nvidia is ahead, but Intel is not generations behind with "primitive software". They have all the same features as Nvidia. What holds Intel back is their own management and the low volume of those cards. Oh, so now they're poor, or 1080p is an edge case. Didn't you say earlier that 1080p is the most used res? Now suddenly it's an edge case... Oh, so you looked at average framerate numbers and concluded that the 4060 has zero issues.
What about 1% lows, frametimes, and image quality (textures not loading in), etc.?
Average framerate is only one metric, and by that measure even ancient SLI was totally OK with zero issues... From TPU's own chart, the 480 is 4% slower than the 1060 6GB and 7% faster than the 1060 3GB. I guess that's competitive. It is you who has nothing to offer in this discussion: calling 1080p an edge case, ignoring mounting evidence of 8GB problems, and cheering for Nvidia.
I hope at least your green check clears.
And even others are mocking you for your delusions. But sure, go on buying your high-end cards and believing 8GB is enough.
I'm done arguing with a fanboy. And don't bother replying, you're on my blocklist now.
This year alone, 27% of TPU-tested games exhibited this behavior, and it will only increase each year. Frame gen and RT increase VRAM usage even more. Yes, upscaling lowers it, but only enough to counter the increase from FG/RT.
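For anyone who wants to check a reading like "7.6 GB used on an 8 GB card" on their own machine, here is a minimal sketch using NVIDIA's NVML bindings. It assumes the pynvml package is installed, and note that it reports allocated VRAM, which is not exactly the same as what a game strictly needs.

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total/used/free, in bytes
print(f"VRAM used: {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```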
Do you know that you have no idea what you're talking about even in the slightest, or?
Just curious.
The outlying factor here is that there is a shift from 1080p to 1440p and 4K. You can get a half-decent 165-180Hz 1440p monitor for as low as $150 BNIB in 2024. 4K 120-144Hz is only $300-400 now. The average 1440p 144Hz monitor used to cost $400-500 in 2020.
Textures in modern games are just too dense to ignore, especially at what is now a mainstream resolution (1440p).
You're also completely wrong and think you're smarter than you actually are, but I digress. Not going to bother arguing with someone who quote-replies literally everyone.
Still dislike that the double-up VRAM card has been up-tiered for Ada and now Blackwell. It reeks of, as was muttered many times with every Ada release, Nvidia quietly up-badging every card in the lineup and whistling as they slink away. What would have made a perfectly good 5050 is now the 5060, with no good justification; the only comprehensible reason is that they want to leverage the legacy of the product stacks that came before to sell an inferior product.
To be honest, the newly launched Intel cards are the best cards one can buy right now. They have 12GB of VRAM, and they are fairly priced. Those cards are perfect for 1080p gaming.
I sincerely see no point in buying an x060 card over those.
Just don't buy it, problem solved.
This whole topic is full of Nvidia hate and QQ whining. And you can't even OC it without huge hassle; there are also big problems playing anything other than AAA/benchmark games, sometimes even those have problems, and there's no DLSS.
But like I said, there's a lot of hate against Nvidia...
All the haters are in the forums.
And on the topic of 'hate', am I not justified in complaining about what I see as an attempt to fleece unaware buyers of their cash by intentionally and misleadingly altering the naming conventions of a product stack? Nvidia is slowly building up a history of such egregious tactics, ranging from the 4080 12GB to the 3050 6GB to the 4060/4060Ti 8GB; the only way to keep Nvidia accountable is to continually make our displeasure known and inform people about these disingenuous practices so that they aren't—and I'll say the quiet part out loud here—cheated out of their money.
But it's a good time to sell an old 4090. I don't mind what Nvidia releases, 3050, 4080 12GB, 4060 or Ti, at least there is something for everyone.
If you don't like it, you don't buy it.
A 5060 Ti 16GB can be good if the price/perf is right.
edit: to make this a bit more clear, I'm gonna explain what the game review data shows for the RTX 4060 - and what it does not show:
- games are all tested at Ultra settings, so in a lot of cases *beyond* the sweet spot of the 4060, settings which should not be used for a 4060, in other words, or only with DLSS activated. Worst-case scenario, you could say.
- despite this beyond-sweet-spot usage, the card never had terrible fps (< 30 or even < 10 fps), which would show that the vram buffer is running into shortcomings, usually a very obvious sign.
- the only "problem" the card had was that in some games at 1080p (and I only talk about 1080p here for this card), it had less than 60 fps.
- the min fps scaled with the average fps it had in that game, so if the avg fps was already under 60 fps, obviously the min fps wouldn't be great either - not related to vram.
- also there are games like Starfield which generally have issues with min fps that are visible in that review, and not only with the RTX 4060. Also not related to vram.
- the card generally behaved like it should, it was *not* hampered by 8 GB vram in any of the games. Just proving that what I said all along is true.
- furthermore 8 GB vram is also proven to be mostly stable for even 1440p+, as the vram requirements barely increase with resolution alone (see the rough arithmetic sketch after this list). There are a variety of parameters that will increase vram usage, for example world size, ray tracing, texture complexity, optimisation issues. A lot of games don't even have problems in 4K (!) with 8 GB vram. That is the funny thing here. :) The problems start when the parameters get more exotic, so to say.
- so saying 8 GB vram isn't enough today for 1080p or even 1440p, is just wrong. Can it have select issues? Yes. It's not perfect, if you go beyond 1080 it will have more issues, but it will still mostly be fine. In 1080p, which this discussion was about, it's 99.9% fine on the other hand, making this discussion unnecessary.
- as it's still easily enough for 1080p, it will also be enough for a new low-end card for the foreseeable future.

haha, now let's stay in reality. What I can tell you is this: if you have a ton of vram it can be used, and it can be useful even in games that don't *need* it, just so your PC never has to reload vram again. It's basically luxury: less stutter than cards with 16 GB, for example. I experienced this while switching from my older GPU to the new one; Diablo 4 ran that much smoother, there was basically no reloading anymore while zoning.

The issue here is that you're comparing two different companies. AMD used to run an "I give you extra vram" marketing angle against Nvidia. The fact is that, historically, Nvidia's upper mid-range GTX 1070 and semi-high-end GTX 1080 used 8 GB back then. Whether AMD put a ton of vram on a mid-range card does not change that. The same AMD chip originally started with 4 GB, as per the link you provided yourself, for everyone to see. AMD also used marketing tactics like this on the R9 390X, so you can go even further back to GPUs that never needed 8 GB in their relevant lifetime. By the time the R9 390X "needed" 8 GB, it was already too slow to really utilise it, making 4 GB the sweet spot for that GPU, not 8 GB (averaged over its lifetime, of course). :)

And as was already said, by multiple people and not just me, Nvidia's vram management is simply better than AMD's - this has been true for a very long time now, making an AMD vs Nvidia comparison kinda moot. AMD has historically always needed a bit more vram to do the same things Nvidia does (not have lag or stutter). As someone said, this is probably due to some software optimisation.
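To put a rough number on the "resolution alone barely moves vram" point in the list above: screen-sized render targets grow with pixel count, but they are tiny next to the texture pool. A back-of-the-envelope sketch in Python, where the target count and the ~5 GiB texture pool are illustrative assumptions rather than measured values:

```python
def render_target_mib(width, height, bytes_per_pixel=4, num_targets=8):
    """Approximate memory for a G-buffer-style set of screen-sized targets."""
    return width * height * bytes_per_pixel * num_targets / 2**20

# Assumed: ~8 screen-sized RGBA8 targets and a ~5 GiB texture/asset pool (illustrative only).
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    rt = render_target_mib(w, h)
    print(f"{name}: ~{rt:.0f} MiB of render targets vs ~5120 MiB of textures")
```

Going from 1080p to 4K adds only a couple of hundred MiB of render targets here, while world size, ray tracing structures, and texture quality can move the total by gigabytes.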
I've seen a post from someone who thinks the "8 gig is enough" crowd is basing it on a bygone era, when it was the norm for GPUs to have loads of VRAM that had no effect in games. Times have changed.
The fact that we have streaming tech in game engines tells us what we need to know: it is primarily a VRAM mitigation tool, and without it, GPUs would need dozens of gigs of VRAM to load everything up all the time. These engines collapse when they simply have to do too much to mitigate extremely low amounts of VRAM, like on 8-10 gig cards. You end up with missing textures, low-detail textures that are only meant to be used at distance, and other issues. But it's "I still get 400fps dude playing my shooter where I don't care about details, it's all good".
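As an illustration of the behaviour described above, here is a minimal sketch of a streaming-budget heuristic, not taken from any real engine: textures get mip levels dropped once the pool exceeds a VRAM budget. The class name, sizes, and budget are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class StreamedTexture:
    name: str
    full_mib: float      # size of the full-resolution mip chain
    distance: float      # rough distance from the camera
    dropped_mips: int = 0

    @property
    def resident_mib(self) -> float:
        # Each dropped mip level roughly quarters the memory footprint.
        return self.full_mib / (4 ** self.dropped_mips)

def enforce_budget(textures: list[StreamedTexture], budget_mib: float) -> None:
    """Drop mips on the most distant textures first until the pool fits the budget."""
    while sum(t.resident_mib for t in textures) > budget_mib:
        candidates = [t for t in textures if t.dropped_mips < 3]
        if not candidates:
            break  # nothing left to degrade; the streamer is out of options
        furthest = max(candidates, key=lambda t: t.distance)
        furthest.dropped_mips += 1  # shows up as a blurrier texture in-game

# Illustrative pool: the same assets fit easily in a large budget,
# but a small one forces visible quality loss.
pool = [StreamedTexture("hero_character", 512, 2.0),
        StreamedTexture("building_facade", 1024, 15.0),
        StreamedTexture("terrain", 2048, 40.0)]
enforce_budget(pool, budget_mib=1500)
for t in pool:
    print(t.name, f"{t.resident_mib:.0f} MiB", f"(dropped {t.dropped_mips} mips)")
```

Once every texture hits its mip cap the heuristic simply gives up, which is the "collapse" described above: nothing is left to evict, so missing or low-detail textures and streaming hitches are the only outcomes left.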
My last upgrade was pretty much down to VRAM; it wouldn't surprise me if my next one is as well. I would have gone with the 4070 Ti Super instead of the 4080 Super if there had been an FE version of it. I can also now use VRAM-guzzling apps with ease; before, I had to shut everything down aggressively to get every byte of VRAM I could, and I even had a plan to use the iGPU for the desktop to save VRAM.
And people are too forgiving of a massive, rich company skimping on VRAM. The 5060 should really launch with about 12GB, but it will no doubt be 8GB. 12GB should at least let it run games for the next few years without running into any issues. Nvidia is just greedy and stingy. The fact that my 3080 only came with 10GB, with an empty spot on the PCB where another 2GB module could fit, is a physical display of their greed. My 6700 XT cost far less and came with 12GB. The 6800 XT and 6900 XT both had 16GB. NV is lagging behind and people excuse it with the "most games work fine" argument. If a new card is coming out now, I should expect it to work with any game just fine; a new product shouldn't have issues right away, rare as they may be.