Monday, December 16th 2024
32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec
Zotac has apparently prematurely published webpages for the entire NVIDIA GeForce RTX 5000 series GPU line-up that will launch in January 2025. According to the leak, spotted by Videocardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month, including the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. The premature listing has seemingly been removed by Zotac, but screenshots taken by Videocardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.
It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or the 5090D. Last time we checked in with the RTX 5070 Ti, leaks suggested it would have just 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest that the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to other RTX 5000 GPUs, particularly the Ti-spec models, should make them better suited to the ever-increasing demands of modern PC games.
Sources:
VideoCardz, Wccftech
173 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec
8GB is still enough for 1080p at high/max settings (I can't confirm this year's AAA titles, because I haven't played any of them yet)
But even if it weren't enough, you can still turn graphics settings down a notch and get similar quality to a PS5/Series X
So, 8GB is still enough?
Yes
Would I buy a 50xx card with 8GB for 1080p?
No (except if I were on a budget, but in that case I'd go for a 40xx card, AMD, Intel, or a used card instead)
DLSS does not lower VRAM utilization, and if by some miracle it does, the difference is within the margin of error. Go check the internet.
Even Intel has realized that 8 GB is not enough in the low end.
The less cache capacity there is, the more bandwidth is needed to compensate for it. And bandwidth is not just about the GPU's memory bus width: think about where graphics assets are stored, and from where and through which other components they have to be pulled into VRAM.
Sometimes you need space, not bandwidth. Bandwidth is of no use when there isn't enough space to fit things into.
As soon as devs move on to higher-quality textures, 8 GB will become insufficient even for GPUs aimed at 1080p (a rough estimate of why is sketched below).
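A back-of-the-envelope sketch of what that can look like in practice; the texture counts, sizes, and compression figures below are made-up illustrative assumptions, not measurements from any real game:

```python
# Rough VRAM estimate for a texture set.
# All numbers here are illustrative assumptions, not data from a real game.

def texture_size_mb(width, height, bytes_per_pixel=1.0, mip_overhead=1.33):
    """Approximate footprint of one texture in MB.

    bytes_per_pixel=1.0 roughly models BC7-style block compression (8 bits/px);
    mip_overhead=1.33 accounts for the full mip chain.
    """
    return width * height * bytes_per_pixel * mip_overhead / (1024 ** 2)

# Hypothetical scene: the same 300 materials at 2K and at 4K.
count = 300
for res in (2048, 4096):
    total_gb = count * texture_size_mb(res, res) / 1024
    print(f"{count} x {res}x{res} textures ~ {total_gb:.1f} GB")

# Prints roughly 1.6 GB at 2K but about 6.2 GB at 4K, before the framebuffer,
# geometry, render targets, and the OS/driver take their share. That is why
# higher-quality texture packs squeeze 8 GB cards first.
```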
1) It's proven by data that 8 GB is enough; anyone of a different opinion seems to be fishing for engagement (YouTubers) or looking for worst-case scenarios (not practical, not a good argument to have)
2) The two companies with essentially 100% of the dGPU market currently still sell 8 GB cards (Intel's A750 / A770 also come in 8 GB VRAM variants and have no issues worth talking about), so they evidently consider it enough; otherwise they wouldn't do it
3) Only a cynic would reduce a technical decision to "omg, it's just about money! They're saving money by reducing VRAM, or it's planned obsolescence". I don't envy people who are that cynical, and cynicism leads you down roads that aren't worth traveling.
4) Let's be real and honest here for a second: Nvidia is a pretty great company that is known for quality and not for nonsense, so if they ship 8 GB video cards in 2022, and probably still in 2025, they have good reasons for it that go beyond "omg I have 1 trillion dollars, I need 50 million dollars more!". When exactly was the last time an Nvidia video card really didn't work out because the VRAM amount wasn't enough? Right, never? Good to know. There's a ton of baseless talk here, and the few data points provided point in the direction that 8 GB is enough today and for the foreseeable future, for 1080p at least.
I'm not talking about 1440p, and I give no guarantees there. It's well known that the 3070 is (or was?) a 1440p card with some VRAM issues, but that's another topic; it's not a low-end card, and this is about 1080p. Nobody buys a 4060 for 1440p, at least not without accepting compromises, and not while thinking "this will be perfectly fine!". Anyone who buys a 4060, or soon a 5060, will think: "this is a low-end card, it is what it is, I'll accept compromises, or I'll stay firmly at 1080p and accept compromises there too if everything I play is AAA games."
As soon as the discussion moves away from worst-case scenarios, which are the moot points the opponents of 8 GB VRAM keep bringing up, and we're talking about AA or even A games, VRAM isn't even a topic anymore. Smart use of settings also eviscerates the opponents' points, and that's what a low-end card owner should be doing anyway. :) "Ultra Ultra Ultra" is usually for people who own more expensive, higher-end cards. Honestly, Ultra is an enthusiast setting, so why would I force it on a low-end card? I'll use it if I can, and if the game is the slightest bit laggy, it gets lowered to "High" after 5 seconds. That's the reality.
As well, many games are still made to the lowest common denominator, which is consoles, and those are generally only equal to mid-range GPUs, so in theory most games should work with limited VRAM (though of course they don't always). In this generation, the XSS is the limiting factor with 8GB allocated to its GPU, while the XSX allocates up to 10GB to its GPU and the PS5 lets most of its 16GB be used by the GPU depending on game optimization. Of those, the XSS has shown actual issues with its allocated 8GB, given that most games were designed for the XSX rather than for it, so additional tweaks had to be made to offer settings that run on an XSS at lower quality than on an XSX. The 2 extra GB on the XSX make a lot of difference, and the PS5 uses its larger available memory to make up for slightly inferior GPU specs relative to the XSX.
Now if the 5060 is at the level of, say, 4060Ti, this is what we're looking at:
Not insignificant. So essentially, the games where the 4060 was bottlenecked by its frame buffer will have the same upper limit on graphics settings on the 5060.
The funny thing was when one guy tried to use my own data against me: I studied the data again, quickly pointed out that it supports all my claims, and thanked him for it. "This supports multiple of my claims, thanks." 8 GB opponents are just fishing for edge cases and exceptions instead of looking at the broad rule, which is that the card works with zero issues when used normally. :)
You guys generally have one thing in common: edge cases, low-percentage usage cases, impractical usage of a low-end card. This all just makes it easier for me to put this to bed. :)
You could also say a Fiat will break down if you always drive it at 180 km/h. That's basically the argument you guys are making: a very unrealistic scenario that makes zero sense and will never happen.
It is like witnessing a wrong way crash...
You're saying others aren't providing proof, whereas I see a lot of people posting other reviews, but you state you only trust TPU. So that's the review I'm basing my statement on.
You see, for some people that might absolutely be a deal-breaker, but for you it seems perfectly fine, since you're defending Nvidia's position on shipping 8GB cards. Opinions can obviously differ, but seemingly most people are in the former camp, where they absolutely don't see 8GB as enough, because it's only going to get worse down the line. Maybe you're fine when a random texture goes missing and you realise you need to turn down the settings, or turn them down altogether when enabling RT while watching your frame buffer, etc. But for most, it's a major no-no.
By now you should just agree to disagree and move on. There's enough evidence of the 8GB vs 16GB debate on numerous websites, HUB for one did a lot of testing and the results speak for themselves.
Wish I could get one soon after launch!
Would be fun to run "AI" stuff on it.
Btw, do also check other reviews when something new is released, then form an opinion. For instance, W1zzard reviewed the 9800X3D and found it not to be that much better than its predecessor, while other sites reported higher inter-generational performance gains, and not by a small amount (5% vs. 10% is a huge difference). I'm not trying to say that W1zzard's reviews are bad; it's just that he uses his own metrics and techniques, and other sites have their own: different software tested, different games tested. These days a significant performance loss or gain can easily come from a different Windows 11 version, which is frankly ridiculous, because that's not the difference between generational OS releases (like Win10 vs. Win11) but just half-yearly feature updates.
That's a 20% performance uplift for the 16GB version. Note that it's not a flat 20% uplift: the 8GB version is fine and playable until it isn't, in areas where 8GB is maxed out and it becomes an awful, choppy mess.
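To illustrate why an average-FPS gap understates that kind of choppiness, here is a toy sketch comparing the average against the 1% lows; the frametime traces are invented for illustration, not benchmark data:

```python
# Toy illustration of why average FPS can hide VRAM-related stutter.
# The frametime traces below are invented, not benchmark data.

def fps_stats(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frametimes in ms."""
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    slowest = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    one_pct_low = 1000 / (sum(slowest) / len(slowest))
    return avg_fps, one_pct_low

fits_in_vram  = [12.5] * 1000                 # steady 80 FPS, everything resident
spills_to_ram = [12.5] * 990 + [120.0] * 10   # 1% of frames stall on PCIe transfers

for name, trace in (("16 GB", fits_in_vram), ("8 GB", spills_to_ram)):
    avg, low = fps_stats(trace)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")

# Both traces average in the 70-80 FPS range, but the one that spills drops
# to about 8 FPS in its worst 1% of frames. That is the "fine until it isn't"
# behaviour the averages gloss over.
```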
I don't know why you keep repeating "you guys are just wrong". There's no right or wrong here: when someone says 8GB is absolutely not enough, he's right, because there are many cases where 8GB is shown to be a limiting factor. But if someone says 8GB is enough, he isn't necessarily wrong either; maybe his games don't max out 8GB, or he's willing to reduce settings in edge cases.
All I'm trying to say is that 8GB is already maxed out in a few instances today, and it will get worse down the line. There's no argument to be had here.
You guys pick extreme cases, with Ultra settings on a low-end card: worst-case scenarios. That makes your point moot at best, and completely irrelevant and out of touch at worst. Nobody has problems with the 4060, because nobody uses the card the way you think they would. Where are the million unhappy customers?
I literally said this the last time I responded to you, and you didn't care; you seem completely oblivious to arguments in general. Stay in your bubble then. :) This discussion is over anyway. The opponents just use extreme cases and think that's an argument to make about a low-end card that nearly nobody (or really, nobody) uses like that. Again, there is more to PC usage than just Ultra settings; try to be realistic for once. And even in the worst-case scenarios the card works flawlessly 99% of the time. You have no point at all.
If you need extreme edge cases as an argument, it only proves you don't have real arguments; general rule in life. This has been debunked so many times already. Guessing about the future isn't an argument you can realistically make, because you don't know it; that's another entirely moot point. If Nvidia brings out a new 8 GB card soon, it will probably work fine for 2-4 years, easily. I'd bet on it, and you would lose that bet. After that, the card is overstretched anyway and details will be reduced; within the card's lifetime, the VRAM will never become the problem before the GPU runs out of horsepower. This has already happened countless times in the history of GPUs, btw.
1) I'm not sure where you're getting the 80% number from, or the "stay in your bubble" part, because I thought I covered your points of substance. Do you mean the one other point you made about the "unrealistic Ultra settings" scenario? I was literally countering that very point, saying these aren't unrealistic settings because the framerates are playable, as the Ratchet & Clank graph I posted shows. How is that an extreme edge case? You know what, that might subjectively be an extreme edge case for you, so how about this: surely over 100 FPS isn't an extreme edge case, yet the 8GB card is unplayable here?
What was the other point I ignored? All I saw was you saying that people who disagree with you are panicking and angry, while people who agree with you seem like chill, well-balanced people. That's a given, considering people who agree with you won't argue with you, and vice versa. Not that I've seen that in this thread anyway, but just saying.
2) There aren't millions of unhappy customers because most games were fine until recent months. See the side-by-side graphs from when the 4060 Ti 16GB was released versus the latest one:
See how a negligible 3% difference turned into over 30% in a year? It's easy to predict the trajectory this is on.
What's the last point you're "debunking"? My point was that 8GB is being maxed out in a few instances today, and I literally provided graphs (two now) of those instances. I also just showed in the last graph that it's getting worse down the line: memory requirements for games haven't decreased over the years, nor will they anytime soon. This isn't predicting the future, it's an educated assumption based on historical trends. How are you debunking that, by typing "debunked"?
Sure, we don't know the future 5060's performance, but it's certainly going to be higher than the 4060's, which will only expose the 8GB limitation further. Don't count on Nvidia bringing some magic VRAM-management technique, if that's what you're implying, and if they do, more power to them. History says otherwise, though, and speaking of history, since you already went down that road: the GTX 680 and Fury X were easily VRAM-limited a year or two down the line. I owned both, so I know, and I'm sure there are others (and other cases) too.
With 2K for a GeForce GPU you're already solidly in fantasy land; stick any number on it, who cares. It's not even a full die, lmao. Lol, bud, if you think 8GB is enough, you're free to knock yourself out on x60s all day long.
To each their own, they say.
Nvidia could easily make the 5060 a 128-bit card with 12GB thanks to 3GB GDDR7 chips (4 × 3GB). If they don't do this, it's only because they want to earn more on higher models and delay progress as much as possible, in order to keep earning from it in the long term. And what are you defending here, 8GB? Memory cost? Or maybe you think card prices are fair and dictated by their production cost? Don't be naive. If they could, they would sell the 4060 for 2000 USD and produce it for a fraction of that amount. And believe me, there would be people defending that price because "production costs have increased for Nvidia".
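For what it's worth, the capacity arithmetic behind that claim is easy to sketch; this assumes the usual one GDDR7 module per 32-bit channel and the 2 GB / 3 GB module densities currently in production:

```python
# Capacity options on a 128-bit memory bus, assuming one GDDR7 module per
# 32-bit channel (standard practice) and 2 GB / 3 GB module densities.

bus_width_bits = 128
channels = bus_width_bits // 32          # 4 modules in normal (non-clamshell) mode

for density_gb in (2, 3):                # 16 Gbit and 24 Gbit GDDR7 modules
    normal = channels * density_gb
    clam = 2 * channels * density_gb     # clamshell doubles modules on the same bus
    print(f"{density_gb} GB modules: {normal} GB normal, {clam} GB clamshell")

# So 8 GB vs. 12 GB on the same 128-bit bus comes down purely to module
# density, which is the point being made above about a possible 12 GB 5060.
```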
For a large number of people, 30-40 frames per second is playable; add to that stuff like RT, DLSS and frame generation, and 8GB will have a problem even though the chip itself could run everything at better settings. And don't write that it's a low-end card and you just have to reduce the details, because it's a graphics card just like the others, and everyone wants the best graphics possible. Who knows, maybe "tomorrow" the low end will be a card called an xx90 (by name) and people will be writing about reducing details at 1080p. Nvidia's dream.