Monday, December 16th 2024

32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Zotac has apparently published webpages prematurely for NVIDIA's entire GeForce RTX 5000 series GPU line-up, which will launch in January 2025. According to the leak, spotted by VideoCardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month: the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. Zotac has seemingly removed the premature listings, but screenshots taken by VideoCardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.

It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or the 5090D. When we last checked in on the RTX 5070 Ti, leaks suggested it would carry just 16 GB of GDDR7, and murmurings of a 32 GB RTX 5090 date back to September. Other leaks from Wccftech suggest that the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to the other RTX 5000 GPUs, particularly the Ti models, should make them better suited to the ever-increasing demands of modern PC games.
Sources: VideoCardz, Wccftech

173 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

#126
k0vasz
3060 Ti user here.

8 GB is still enough for 1080p at high/max settings (I can't vouch for this year's AAA titles, as I haven't played any of them yet).
But even if it weren't enough, you could still turn the graphics settings down a notch and get quality similar to a PS5/Series X.

So, is 8 GB still enough?
Yes.
Would I buy a 50xx card with 8 GB for 1080p?
No (unless I were on a budget, but in that case I'd go for a 40xx card, AMD, Intel, or a used card instead).
Posted on Reply
#127
LittleBro
AcEThey only play triple-A games on Ultra settings, with exactly the same one VRAM-heavy level over and over again, just to prove the dramatubers right. /jk :)))))

It does afaik; DLSS lowers VRAM usage, but I used the DLSS argument in general, for performance, not only VRAM. With Frame Gen, I'm not sure whether it keeps VRAM usage flat or increases it.

Edge cases and things normal users (so, most people on the planet) barely care about, as long as the game is mostly fine and they have no problems. You're basically citing a luxury problem to argue that 8 GB isn't enough; with your argument you can only say 8 GB is "suboptimal", but this topic was about *not enough*, not "suboptimal", so your argument is firmly beside the point. You're also citing the 4060 Ti, which isn't the main point of this discussion; this is about the 5060, 4060 and other low-end cards that will 100% be fine with 8 GB of VRAM.

A strawman argument that I already refuted in #29.

And those were lower-midrange / midrange cards from 2016, with 6 GB / 3 GB of VRAM, yes, not 8 GB. If a mid or semi-high-end card from 2016 has 8 GB, it just proves that the high end of 2016 is now the low end. That's natural evolution, very normal. Nothing special.

Otherwise read my other posts; nothing you said wasn't already answered multiple times.
While I think further arguing in this thread is pointless, I shall add one more comment of mine, for fun.

DLSS does not lower VRAM utilization, and if by some miracle it does, it's within the margin of error. Go check the internet.

Even Intel realized that 8 GB is not enough at the low end.
The smaller the cache, the more memory bandwidth is needed to compensate for it. And bandwidth is not just about the GPU's memory bus width:
think about where graphics assets are stored, and through which components they have to be pulled into VRAM.

Sometimes you need space, not bandwidth. Bandwidth is of no use when there is not enough space to fit things into.

As soon as devs move on to higher-quality textures, 8 GB will become insufficient even for 1080p-targeted GPUs.
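To put rough numbers on the texture point (a back-of-the-envelope sketch in Python; the texture count and sizes are illustrative assumptions, not measurements from any game):

```python
# Rough texture-VRAM math: BC7-compressed textures use 1 byte per texel,
# and a full mip chain adds roughly one third on top of the base level
# (geometric series: 1 + 1/4 + 1/16 + ... = 4/3).
def texture_mib(width, height, bytes_per_texel=1.0, mips=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base
    return total / 2**20

# Illustrative scene: 500 resident textures at 2K, then the same count at 4K.
for size, label in ((2048, "2K"), (4096, "4K")):
    total = 500 * texture_mib(size, size)
    print(f"500 x {label} BC7 textures: {total:,.0f} MiB")
# ~2,667 MiB at 2K vs ~10,667 MiB at 4K: one texture-quality step can
# quadruple the texture budget, which is the point about 8 GB above.
```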
Posted on Reply
#128
Bwaze
LittleBroWhile I think further arguing in this thread is pointless, I shall add one more comment of mine, for fun.

DLSS does not lower VRAM utilization, and if by some miracle it does, it's within the margin of error. Go check the internet.
But it goes hand in hand: if you're already lowering your resolution and upscaling, you might as well lower the graphics settings. There is no game on the market that can't somehow be enjoyed even on integrated graphics, so users of expensive discrete graphics cards should just stay silent! (Never mind that you have to resort to this at a relatively low resolution, with ray tracing and other fancy settings off, on a card that costs half a monthly salary.)
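On the side question of whether DLSS lowers VRAM use: render-target memory scales with the internal resolution, while textures and the display-resolution output chain do not. A minimal sketch, assuming a simple 20-bytes-per-pixel G-buffer layout (illustrative, not measured from any engine):

```python
# Render targets scale with internal resolution; textures and the
# display-resolution output chain do not. Assumed layout (illustrative):
# 4 G-buffer targets + depth at render res (~20 B/px), plus an
# 8 B/px HDR output chain at display res.
def render_target_mib(render, display, render_bpp=20, display_bpp=8):
    rw, rh = render
    dw, dh = display
    return (rw * rh * render_bpp + dw * dh * display_bpp) / 2**20

native = render_target_mib((3840, 2160), (3840, 2160))
dlss_q = render_target_mib((2560, 1440), (3840, 2160))  # "Quality" ~= 1440p internal
print(f"native 4K: {native:.0f} MiB, upscaled from 1440p: {dlss_q:.0f} MiB")
# ~221 MiB vs ~134 MiB: upscaling trims render targets by well under
# 100 MiB here, while the (much larger) texture pool stays untouched.
```

Under these assumptions, both sides are partly right: upscaling does save some VRAM, but only on the render-target slice, not on the texture budget that dominates at 8 GB.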
Posted on Reply
#129
TSiAhmat
LittleBroAs soon as devs move on to higher-quality textures, 8 GB will become insufficient even for 1080p-targeted GPUs.
Yep. I'm also curious how the 16 GB version of the 4060 Ti ages. Maybe it becomes relevant in a few years.
Posted on Reply
#130
AcE
ArkzPeople have been having issues with Indiana Jones on 8GB cards in 1080p. That's right now, not the future.
Already said it multiple times in this thread: if you have an opinion, provide proof; otherwise it's just talk, and we've had more than enough talk here already. :)
LittleBroDLSS does not lower VRAM utilization, and if by some miracle it does, it's within the margin of error. Go check the internet.
That's your "argument", so I'm not going to check anything. If you want to make an argument, provide data; talk isn't worth much in technical discussions, not at this point at least. :) Afaik DLSS does lower VRAM usage, but if it doesn't, it wasn't really part of my argument anyway; the data is also without DLSS, and I only used DLSS as a side point, not in the main argument.
LittleBroEven Intel realized that 8 GB is not enough at the low end.
Speculation. Intel never said anything like that; otherwise, again, provide proof. We are not in bible studies here, where you see "oh, Intel's lowest card has 12 GB now, so Intel must think 8 GB isn't enough!". That's called *interpretation*; be honest about it and don't pretend it's fact. (Edit: the B580 isn't even Intel's lowest card this generation, as the name already implies, so your argument will never make sense to begin with.) The most obvious interpretation isn't yours, either; it's simply that a 192-bit bus takes either 6 GB (too little) or 12 GB, and they chose 12 (just as NVIDIA did with the 3060 back then). It's a very basic decision that has nothing to do with whether 8 GB is enough. :) But again, it's already been shown numerous times that 8 GB *IS* enough for low-end gaming on a low-end card, so there's that. :) I'm slowly going to quit this discussion as well, and will probably soon ignore "@"s that are just talk without proper arguments containing facts.

1) It's proven by data that 8 GB is enough; anyone of a different opinion seems to be fishing for engagement (YouTubers) or looking for worst-case scenarios (not practical, and not a good argument to have).

2) The two companies that between them hold 100% of the dGPU market currently still sell 8 GB cards (Intel's A750 / A770 are also 8 GB and have no issues worth talking about), so they confirm that it is enough; otherwise they wouldn't do it.

3) Only a cynic would reduce that technical decision to "omg, it's just about money! They're saving money by cutting VRAM, or it's planned obsolescence." I don't envy people who are so cynical; cynicism leads you down roads that aren't worth traveling.

4) Let's be real and honest here for a second: NVIDIA is a pretty great company, known for quality and not for nonsense, so if they ship 8 GB video cards in 2022, and probably still in 2025, they have good reasons for it beyond "omg, I have 1 trillion dollars, I need 50 million more!". When exactly was the last time an NVIDIA video card really didn't work out because the VRAM amount wasn't enough? Right, never? Good to know. We have a ton of baseless talk here, and the few data points provided point in the direction that 8 GB is enough today, and for the foreseeable future, at least for 1080p.

I'm not talking about 1440p, and I give no guarantees there; it's well known that the 3070 is (or was?) a 1440p card with some VRAM issues, but that's another topic: it's not a low-end card, and this is about 1080p. Nobody buys a 4060 for 1440p, at least not without accepting compromises, and not while thinking "that will be perfectly fine!". Anyone who buys a 4060, or soon a 5060, will think: "this is a low-end card, it is what it is, I'll accept compromises, or I'll stay firmly at 1080p and accept compromises even there, if all I play is AAA games."

As soon as the discussion moves away from worst-case scenarios, which are the moot points the opponents of 8 GB VRAM brought, to AA or even A gaming, VRAM isn't even a topic anymore. The use of smart settings eviscerates the 8 GB opponents' points as well, and that is usually what a low-end card owner should do. :) "Ultra Ultra Ultra" is usually for people who own more expensive, higher-end cards. Tbh, Ultra is an enthusiast setting, so why should I force it on a low-end card? I'll use it if I can, and if the game is even slightly laggy, it's lowered to "High" after 5 seconds. That's the reality.
Posted on Reply
#131
TechLurker
Most of the issues with modern games stem from poor optimization, and games can only be optimized so far on PC given the wide array of possible system configurations. Most well-optimized games can run at higher settings with less VRAM, while poorly optimized games rely on the brute force of a bigger GPU plus more VRAM to reach the same quality as an optimized game with less. It seems likely that game companies will keep shifting to the easier brute-force method, which will also demand more VRAM to make up for the shortcomings of chasing "ultra-realistic" settings on stock game engines, since those work "well enough".

As well, many games are still made for the lowest common denominator, the consoles, which are generally only equal to mid-range GPUs; so in theory most games should work with limited VRAM (but of course, they don't always). In this generation the XSS is the limiting factor, with 8 GB allocated to its GPU, while the XSX allocates up to 10 GB and the PS5 lets games use most of its 16 GB for the GPU depending on optimization. Of those, the XSS has shown real issues with its allocated 8 GB, since most games were designed for the XSX rather than the XSS, so additional tweaks had to be made to offer settings that run on the XSS at lower quality than on the XSX. The 2 extra GB on the XSX make a lot of difference, and the PS5 uses its larger available memory to make up for slightly inferior GPU specs relative to the XSX.
Posted on Reply
#132
mkppo
I think the 8 GB argument is pointless and we are going in circles. One person seems adamant that it's enough while a few others say it isn't. I'm firmly in the latter camp, simply because there are instances with the 4060 where the 8 GB frame buffer is maxed out, especially with RT, at which point lowering graphics settings becomes a necessity, otherwise framerates are terrible and choppy. The 5060 will only make this worse because it's (hopefully) faster.

Now if the 5060 is at the level of, say, the 4060 Ti, this is what we're looking at:


Not insignificant. So essentially the games where the 4060 was bottlenecked by its frame buffer will have the same upper limit on graphics settings with the 5060.
Posted on Reply
#133
AcE
mkppoI think the 8 GB argument is pointless and we are going in circles. One person seems adamant
It's especially pointless because the opponents are just talking and are unable to give proof or data that supports their claims. :) "One person"? Sadly no; multiple people are on my side, plus 2-3 companies: AMD, NVIDIA and Intel. They all still sell 8 GB video cards.

The funny thing was when one guy tried to use my own data against me; I studied the data again, quickly pointed out that it supports all my claims, and thanked him for it.
mkppothere are instances with the 4060 where the 8 GB frame buffer is maxed out, especially with RT
This supports several of my claims, thanks. The 8 GB opponents keep fishing for edge cases and exceptions instead of looking at the broad rule: the card works with zero issues when used normally. :)

You guys generally have one thing in common: edge cases, low-percentage usage cases, impractical usage of a low-end card. This all just makes it easier for me to put this to bed. :)

You could also say a Fiat will break down if you always drive it at 180 km/h. That is essentially the argument you're making: a very unrealistic scenario that makes no sense and will never happen.
Posted on Reply
#134
tommesfps
The only thing this thread proves is that what looks like an ace isn't really an ace at all. :(

It's like witnessing a wrong-way crash...
Posted on Reply
#135
mkppo
AcEThis supports several of my claims, thanks. The 8 GB opponents keep fishing for edge cases and exceptions instead of looking at the broad rule: the card works with zero issues when used normally. :)

You guys generally have one thing in common: edge cases, low-percentage usage cases, impractical usage of a low-end card. This all just makes it easier for me to put this to bed. :)

You could also say a Fiat will break down if you always drive it at 180 km/h. That is essentially the argument you're making: a very unrealistic scenario that makes no sense and will never happen.
Well, not all games will be able to max out the 8 GB frame buffer of a card with the performance characteristics of a 4060 while framerates are still playable. But there are already many instances where 8 GB is maxed out while framerates would otherwise be playable, and because of that limitation the game becomes a choppy mess. It will only be worse on the 5060, as the 4060 Ti 16 GB shows a greater-than-30% average uplift over the 8 GB version at 1440p with RT. That's not an edge case at all, and it's very much playable in many games, just not on the 8 GB version unless you lower textures and such.

You say others aren't providing proof, whereas I see a lot of people posting other reviews; you state you only trust TPU, so that's the review I'm basing my statement on.

You see, for some people that might absolutely be a deal breaker, but for you it seems perfectly fine, since you're justifying NVIDIA's stance on selling 8 GB cards. Opinions can obviously differ, but seemingly most people are in the latter camp and absolutely don't see 8 GB as enough, because it's only going to get worse down the line. Maybe you are fine when a random texture goes missing and you realize you need to turn down the settings, or turn them down outright when enabling RT, watching your frame buffer, etc. But for most, it's a major no-no.

By now you should just agree to disagree and move on. There's enough evidence in the 8 GB vs 16 GB debate on numerous websites; HUB for one did a lot of testing, and the results speak for themselves.
Posted on Reply
#136
Arkz
AcEAlready said it multiple times in this thread: if you have an opinion, provide proof; otherwise it's just talk, and we've had more than enough talk here already. :)
Go tell Alex he's wrong then.
Posted on Reply
#137
csendesmark
5090 comes with 32GB GDDR7?!
Wish I could get it soon after lunch!
Would be fun to run "AI" stuff on it.
Posted on Reply
#138
wheresmycar
csendesmarkWish I could get it soon after lunch!
You'll have better chance after breakfast before stock runs out (hehe) :roll:
Posted on Reply
#139
Solid State Soul ( SSS )
The 5060 8 GB is manufactured e-waste. How is it acceptable that the 3060 has 12 GB while the 4060 and the 5060 have only 8?
Posted on Reply
#140
Dr. Dro
Solid State Soul ( SSS )The 5060 8 GB is manufactured e-waste. How is it acceptable that the 3060 has 12 GB while the 4060 and the 5060 have only 8?
The only reason the 3060 has 12 GB is that 6 GB really wasn't enough, and its 192-bit memory bus can only do 6 or 12.
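The arithmetic behind that, as a quick sketch (the chip densities reflect common GDDR6/GDDR7 sizes; clamshell configurations, which double capacity, are left out):

```python
# GDDR attaches one chip per 32-bit channel, so the bus width fixes the
# chip count, and per-chip density fixes the capacities you can build
# (clamshell mode, which doubles capacity, is ignored here).
def vram_options_gb(bus_width_bits, densities_gb=(1, 2, 3)):
    chips = bus_width_bits // 32
    return [chips * d for d in densities_gb]

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {vram_options_gb(bus)} GB")
# 128-bit -> [4, 8, 12]  (a 5060-class card: 8 GB with 2 GB chips, or 12 with 3 GB GDDR7)
# 192-bit -> [6, 12, 18] (the 3060 / B580 case: 6 GB is too little, so 12)
# 256-bit -> [8, 16, 24]
```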
Posted on Reply
#141
AcE
ArkzGo tell Alex he's wrong then.
Who? Go tell W1zzard that he's wrong. :) Data > any person on the planet. I don't care to bring personalities into this. Logic > *
mkppoBut there are already many instances where 8 GB is maxed out while framerates would otherwise be playable, and because of that limitation the game becomes a choppy mess.
Impossible; that would show in the average FPS. And your argument is solely focused on worst-case Ultra settings, which are only a small part of the general argument and already debunked; maybe consider more than one setting? Also, the outcry of the masses would have arisen, and it never did, not even for the RX 7600, which has worse VRAM management than NVIDIA's cards. You guys are simply wrong; accept it and move on. :) Endless deflecting will certainly never impress me. On the contrary, the people who agreed with me all seemed like chilled, well-balanced people, while the ones who didn't more often than not behaved panicked and angry. There's an interesting pattern there. I guess being positive is just better than being sceptical and negative and coloring the world with it (and then seeing too many things not as they really are; but above everything else, there will be *balance*, the zen mantra). But I digress; this discussion is too long in the tooth now.
Posted on Reply
#142
LittleBro
A man can give you ANY fact; you just deny it and continue mumbling your own words. Look, there's a reason most people don't agree with you.

Btw, do check other reviews when something new is released, and then form an opinion. For instance, W1zzard reviewed the 9800X3D and found it not much better than its predecessor, while other sites reported higher generational gains, and not by a small amount (5% vs. 10% is a huge difference). I'm not trying to say W1zzard's reviews are bad; he just uses his own metrics and techniques, and other sites have theirs: different software tested, different games tested. These days a significant performance loss or gain can easily come from a different Windows 11 version, which is frankly ridiculous, because we're not talking about generational OS releases (like Win10 vs. Win11) but about half-year feature updates.
Posted on Reply
#143
mkppo
AcEImpossible; that would show in the average FPS. And your argument is solely focused on worst-case Ultra settings, which are only a small part of the general argument and already debunked; maybe consider more than one setting? Also, the outcry of the masses would have arisen, and it never did, not even for the RX 7600, which has worse VRAM management than NVIDIA's cards. You guys are simply wrong; accept it and move on. :) Endless deflecting will certainly never impress me. On the contrary, the people who agreed with me all seemed like chilled, well-balanced people, while the ones who didn't more often than not behaved panicked and angry. There's an interesting pattern there. I guess being positive is just better than being sceptical and negative and coloring the world with it (and then seeing too many things not as they really are; but above everything else, there will be *balance*, the zen mantra). But I digress; this discussion is too long in the tooth now.
You replied to what I said with "impossible"; well, here's your trusted TPU chart for one game. Note that there are others:


That's a 20% performance uplift for the 16 GB version. Note that it's not a flat 20% uplift; the 8 GB version is fine and playable until it isn't, in areas where 8 GB is maxed out and the game becomes an awful, choppy mess.

I don't know why you keep repeating "you guys are just wrong". There's no right or wrong here. When someone says 8 GB is absolutely not enough, he's right: there are many cases where 8 GB is shown to be a limiting factor. But if someone says 8 GB is enough, he isn't necessarily wrong: maybe his games don't max out 8 GB, or he's willing to reduce settings in edge cases.

All I'm trying to say is that 8 GB is already maxed out in a few instances today, and it'll be worse down the line. There's no argument to be had here.
Posted on Reply
#144
AcE
mkppoYou replied to what I said with "impossible"; well, here's your trusted TPU chart for one game. Note that there are others:
You ignore 80% of my arguments and think you've made a point; maybe read more carefully, or be more honest?

You guys pick extreme cases: Ultra settings on a low-end card, worst-case scenarios. This makes your point moot at best, and completely irrelevant and out of touch at worst. Nobody has problems with the 4060, because nobody uses the card the way you think they would. Where are the million unhappy customers?

I literally said this the last time I responded to you, with no acknowledgment from your side; you seem completely oblivious to arguments in general. Stay in your bubble then. :) This discussion is over anyway. The opponents just use extreme cases and think that's an argument to make about a low-end card that nearly nobody (or really, nobody) uses like that. Again, there is more to PC usage than Ultra settings; try to be realistic for once. And even in the worst-case scenarios the card works flawlessly 99% of the time. You have no point at all.

If you need extreme edge cases as an argument, it just proves you don't have real arguments anyway. General rule in life.
mkppoAll I'm trying to say is that 8 GB is already maxed out in a few instances today, and it'll be worse down the line. There's no argument to be had here.
This was already debunked as well, many times. Future-guessing isn't an argument you can realistically make, because you don't know the future; that's another entirely moot point. If NVIDIA brings out a new 8 GB card soon, it will probably work for 2-4 years, easily. I'd bet on it, and you would lose that bet. After that the card is overstretched anyway and details will be reduced; within the card's lifetime, the VRAM will never become the problem before the GPU runs out of grunt. This has already happened countless times in GPU history, btw.
Posted on Reply
#145
mkppo
AcEYou ignore 80% of my arguments and think you've made a point; maybe read more carefully, or be more honest?
I wasn't even being combative, yet you're the one pointing fingers here. Never mind that. A few things before I leave this "discussion":

1) I'm not sure where you're getting the 80% number from, or the "stay in your bubble" part, because I thought I covered your points of substance. Do you mean your other point about the "unrealistic Ultra settings" scenario? I was literally countering that very point, saying these aren't unrealistic settings because the framerates are playable, as the Ratchet & Clank graph I posted shows. How is that an extreme edge case? You know what, that might subjectively be an extreme edge case for you, so how about this: surely over 100 FPS isn't an extreme edge case, yet the 8 GB card is unplayable here?


What was the other point I ignored? All I saw was you saying that people who disagree with you are panicked and angry while people who agree with you seem like chill, well-balanced people. That's a given, considering people who agree with you won't argue with you, and vice versa. Not that I've seen that in this thread anyway, but just saying.

2) There aren't millions of unhappy customers because most games were fine until recent months. See the side-by-side graphs from when the 4060 Ti 16 GB was released and from now:


See how a negligible 3% difference turned into over 30% in a year? It's easy to predict the trajectory this is on.

What's the last point you're "debunking"? My point was that 8 GB is being maxed out in a few instances today, and I literally provided graphs (two now) of those instances. I also just showed in the last graph that it's getting worse down the line; memory requirements for games have never decreased through the ages, nor will they anytime soon. This isn't predicting the future, it's an educated assumption based on historical trends. How are you debunking that, by typing "debunked"?

Sure, we don't know the future 5060's performance, but it's certainly going to be higher than the 4060's, which will only expose the 8 GB limitation further. Don't count on NVIDIA bringing some magic VRAM-management technique, if that's what you're implying, and if they do, more power to them. History says otherwise, though, and speaking of history, since you went down that road already: the GTX 680 and Fury X were easily VRAM-limited a year or two down the line. I owned both, so I know, and I'm sure there are others (and other cases) too.
Posted on Reply
#146
LittleBro
Who's next in the queue to reason with the unreasonable one?
Posted on Reply
#147
AcE
mkppoHow is that an extreme edge case?
You picked the one (1) game that everyone knows has heavy, heavy VRAM usage. It's also the one (1) game that magically makes the 7600 XT a worthwhile buy. :D
mkppoThat's a given, considering people who agree with you won't argue with you, and vice versa.
No, they just posted their posts; they did not quote me. Some of them posted good real-life examples of how 8 GB video cards are not an issue.
mkppoWhat was the other point I ignored?
1) The 4060 only, low-end cards, not the 4060 Ti and higher. I never said the 4060 Ti is fine with 8 GB; that's a different matter, as it's a lower-midrange GPU. 2) Where I said Ultra shouldn't be the main setting a low-end user uses. 3) Where I said not everyone with a 4060 is a triple-A gamer; most are not (!), I'm pretty sure. So these Ultra benchmarks across all the triple-A games aren't that relevant; they show worst-case scenarios, and the 4060 still works through them with almost zero issues.
mkppoSee how a negligible 3% difference turned into over 30% in a year? It's easy to predict the trajectory this is on.
Irrelevant to the discussion, because I was only talking about low-end cards like the 4060 and 7600, not the 4060 Ti. I never said the 4060 Ti is great; the 7700 XT is better IMO, for multiple reasons.
mkppoSure, we don't know the future 5060's performance, but it's certainly going to be higher than the 4060's, which will only expose the 8 GB limitation further.
Unlikely. There's currently no problem at 1080p, and it will stay that way for the foreseeable future. Low-end buyers are also far less often triple-A gamers; they mostly play competitive or sub-AAA games, which are far less demanding.
Posted on Reply
#148
Vayra86
Dr. DroMy hard limit is $1999, and not a dollar more. I'm still hopeful it'll slot in at the same $1599 price point of the RTX 4090, since tech is supposed to advance while keeping prices at the same level, but I know better than that.
Yeah, right. I bet in 2025 you'll be saying "oh, but inflation", so now I can go up to 2100.

With $2k for a GeForce GPU you're already solidly in fantasy land; stick any number on it, who cares. It's not even a full die, lmao.
AcEAlready said it multiple times in this thread: if you have an opinion, provide proof; otherwise it's just talk, and we've had more than enough talk here already. :)
Lol bud, if you think 8 GB is enough, you're free to knock yourself out on x60s all day long.

To each their own, they say.
Posted on Reply
#149
altla
The typical consumer wants to buy as much as possible as cheaply as possible. This applies equally to buyers of 4060 cards and buyers of 4090 cards. Everyone buys what they can afford or what they need; in the case of gamers, it's mostly "what they can afford". A 4060 owner can complain just as much as a 4090 owner. The lack of complaints from 4060 owners doesn't prove anything, just as with the 4090. Everyone would like a stronger card (I'm leaving out those who deliberately bought a weak one).
NVIDIA could easily make a 128-bit card with 12 GB out of the 5060 thanks to 4x 3 GB GDDR7 chips. If they don't, it's only because they want to earn more on the higher models and delay progress as much as possible in order to profit from it long-term. And what are you defending here? 8 GB? Memory cost? Or maybe you think card prices are fair and dictated by production cost? Don't be naive. If they could, they'd sell the 4060 for $2,000 and produce it for a fraction of that amount. And believe me, there would be people defending that price because "production costs have increased for NVIDIA".
For a large number of people 30-40 frames are playable; add to that stuff like RT, DLSS and FG, and 8 GB will have a problem despite the chip being able to run everything at better settings. And don't write that it's a low-end card and you have to reduce the details, because it's a graphics card like any other, and everyone wants the best graphics possible. Who knows, maybe "tomorrow" the low end will be a card like the xx90 (by name) and people will write about reducing details at 1080p. NVIDIA's dream.
Posted on Reply
#150
Dr. Dro
Vayra86Yeah, right. I bet in 2025 you'll be saying "oh, but inflation", so now I can go up to 2100.

With $2k for a GeForce GPU you're already solidly in fantasy land; stick any number on it, who cares. It's not even a full die, lmao.
The way the exchange rate is going... I might not be able to get a card for a while. :(
Posted on Reply