Monday, December 16th 2024

32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Zotac has apparently prematurely published webpages for the entire NVIDIA GeForce RTX 5000 series GPU line-up that will launch in January 2025. According to the leak, spotted by Videocardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month, including the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. The premature listing has seemingly been removed by Zotac, but screenshots taken by Videocardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.

It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or 5090D. Last time we checked in with the RTX 5070 Ti, leaks suggested it would have just 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest that the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to other RTX 5000 GPUs, particularly the Ti models, should make them better suited to the ever-increasing demands of modern PC games.
Sources: VideoCardz, Wccftech

174 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

#101
Tomorrow
AcESo you basically agree with me then? Cool. I already said this is natural evolution. But 8 GB is still not at the end. Go check performance of 6500 XT and you will see what "end" means. :) You seem to lack footing in reality.
Now you're changing your tune? You flat out denied that the same happened with 2GB, 4GB and 6GB. Now you're saying that it's natural evolution and you agree with me that it will happen with 10GB, 12GB etc in the future? Stop flip-flopping and make up your mind.
AcEIt's not in critical zone, it's in the "it's enough" zone, which is a notch above it. And "denial" is not relevant for people like me, I use high end cards since 2014. :) This is purely a technical discussion to me. Not a emotional one as it is to you. :)
So this means you don't even know what you're talking about, since you've never experienced this yourself.
Only a denier would look at 7.6 GB on an 8 GB card and say - yeah, that oughta be enough...
AcE"Trying"? I already did. Just because you're losing the argument doesn't mean you have to get mad and call me "delusional" btw. :)
Did how? Provided zero links to disprove those youtubers. Who are people gonna believe?
A random guy with a few hundred posts on a forum he joined a few weeks ago, or a person with decades of experience and millions of subs?
AcEYes and the people will buy Nvidia, 90% and then AMD and Intel will get 5 and then later 0% like last time. Their products are just too far behind and their software stack is primitive.
Only an Nvidia fanboy would say something as moronic as that. I think everyone who's rational would agree that Nvidia is ahead, but Intel is not generations behind with "primitive software". They have all the same features as Nvidia. What holds Intel back is their own management and the low volume of those cards.
AcEDoesn't make much sense, cause the people who buy those mostly don't have more money or they don't care about your edge cases, they go by fine with these video cards. :)
Oh, so now they're poor, or 1080p is an edge case. Didn't you say earlier that 1080p is the most used res? Now suddenly it's an edge case...
AcEI checked all the links, 4060 has 0 issues in all the games. :) Reading and understanding seem to be 2 different things. The 4060 behaved perfectly normal in all those games, in fact. Thanks for proving all my points correct. :)
Oh so you looked at average framerate numbers and concluded that 4060 has zero issues.
What about 1% lows, frametimes, and image quality (textures not loading in), etc.?
Average framerate is only one metric and by that even ancient SLI was totally ok with zero issues...
AcE480 was firmly competitive with 1060. And your assessment is wrong. 1070 and 1080 are upper midrange and semi high end and used 8 GB vram, which is a historical fact btw and I won't debate this with you. :)
From TPU's own chart. 480 is 4% slower than 1060 6GB and 7% faster than 1060 3GB. I guess that's competitive.
AcEIf you got nothing to provide in this discussion, maybe stay away? Trolling isn't great, and to be honest, this is quite the easy discussion for me, as I have already said. He's not refuting one single word of mine, nothing. :) To the contrary, he provided all the TPU links to prove me right, thanks a lot. =)
It is you who has nothing to provide in this discussion: calling 1080p an edge case, ignoring mounting evidence of 8 GB problems, and cheering Nvidia.
I hope at least your green check clears.
And even others mocking you for your delusions. But sure. Go on buying your high end cards and believing 8GB is enough.
I'm done arguing with a fanboy. And don't bother replying. You're on my blocklist now.
Posted on Reply
#102
lexluthermiester
AcE32 GB is irrelevant though unless you do work with the 5090 that involves heavy vram usage.
There are rumors of a number of games that will push beyond the 24 GB limit of the 4090. If true, then 32 GB will be relevant.
Posted on Reply
#103
MCJAxolotl7
Hecate91I would be fine with 1080p medium if the price reflected being limited to 1080p 60fps medium at $200, but I really doubt that is going to happen since the 4060 8GB is $299.
But I think ultra settings is dumb in most cases for newer titles, not sure who brought up ultra. 10 or 12GB needs to be what an x60 card comes with since Intel has proven it can be done for less than $300.
Nobody specifically brought up ultra, but that is the only condition in which 8 GB cards at 1080p would be bad, and people are saying that 1080p on 8 GB cards is bad.
QuicksEven high settings on certain games takes more than 8GB and I have to switch to medium settings.

8GB is simply not enough today. 12GB is bare minimum and is next on the chopping block when the next console update comes. Probably in the next 2 years.
Medium settings can be fine; also, frame gen, AI upscaling, and DLSS/XeSS/FSR exist, for god's sake.
Posted on Reply
#104
AusWolf
What a big leak, wow! A total of 5 model names that we always knew would exist at some point in time. :rolleyes:
A&P211My kids love playing outside.
I thought that was a thing of the past. I'm so glad it isn't. :)
Posted on Reply
#105
Tomorrow
MCJAxolotl7Nobody specifically brought up ultra, but that is the only condition in which 8 GB cards at 1080p would be bad, and people are saying that 1080p on 8 GB cards is bad.
Look at TPU game reviews. Some games go over 8 GB even at 1080p lowest settings with no FG/RT.
This year already 27% of TPU tested games exhibited this behavior and it will only increase each year.
MCJAxolotl7also, frame gen, AI upscaling, and DLSS/XeSS/FSR exist, for god's sake
Frame gen and RT increase VRAM usage even more. Yes, upscaling lowers it, but only enough to counter the increase from FG/RT.
Posted on Reply
#106
wheresmycar
Forget the cunningly calculated performance tiers: when it comes to price, Intel's B580 at $250 sets the benchmark for this range. You get 12 GB of VRAM without skimping on memory bandwidth, which should be the standard moving forward. Yes, 8 GB cards are still relevant, but no savvy consumer should be rolling out the red carpet for this kind of compromise at this price point.
Posted on Reply
#107
Predictable
Gah, I love when anything whatsoever about GPUs makes headlines. These are the best comment sections in all of tech. I get so excited when there's 150 comments about something NVidia or AMD related.
Posted on Reply
#108
Rjc31
PredictableGah, I love when anything whatsoever about GPUs makes headlines. These are the best comment sections in all of tech. I get so excited when there's 150 comments about something NVidia or AMD related.
This is me when these articles pop up!
Posted on Reply
#109
nexxusty
freeagentMy kid loves gaming on his 8GB card. These are not AMD GPU's lol.. NV does things differently.
Lol. What?

Do you know that you have no idea what you're talking about even in the slightest, or?

Just curious.
Posted on Reply
#110
jaszy
BwazeWe also have a 4TB SSD ceiling for half a decade now, anything over that, and there's only 8 TB for all that time, costs twice as much per TB - for the whole duration.

Jensen Huang, September 2022:

"Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
We can argue diminishing returns for both, but the average game is now 50-100 gig, even esports/competitive stuff. You could get away with gaming on a 512GB SSD in 2016 without much real concern of filling it up.

The outlying factor here is the shift from 1080p to 1440p and 4K. You can get a half-decent 165-180 Hz 1440p monitor for as low as $150 BNIB in 2024. 4K 120-144 Hz is only $300-400 now. The average 1440p 144 Hz monitor used to cost $400-500 in 2020.

Textures in modern games are just too dense to ignore, especially at what is now a mainstream resolution (1440p).
Posted on Reply
#111
Dawora
AusWolfWhat a big leak, wow! A total of 5 model names that we always knew would exist at some point in time. :rolleyes:


I thought that was a thing of the past. I'm so glad it isn't. :)
Yeah laptops.. now they can also play outside
Posted on Reply
#112
jaszy
AcEOff topic. The drama here is about low end GPUs having 8 GB, 2016 was a 1080 and 1070, that's semi high end and upper mid range, so completely different cards that have nothing to do with this discussion other than saying "oh 8 GB was also used back then on completely different cards".

It's only crazy if you think of it in non-technical terms. If you understand what 8 GB vram buffer is, in technical terms, it's not crazy at all. :)
www.techpowerup.com/gpu-specs/radeon-rx-480.c2848

You're also completely wrong and think you're smarter than you actually are, but I digress. Not going to bother arguing with someone who quote-replies literally everyone.
Posted on Reply
#113
yfn_ratchet
Smeh. 5060Ti 16GB is... interesting? in concept, but it'll need to be a knockout to be worth it to me. Better mem bandwidth than the 3060 12GB, full x16 link, and ~4070S performance for ~$425 MSRP and I'll keep my eye on it.

Still dislike that the double-up VRAM card has been up-tiered for Ada and now Blackwell. It reeks of, like was muttered many times with every Ada release, Nvidia quietly up-badging every card in the lineup and whistling as they slink away. What would have worn the 5050 name well is now the 5060, with no good justification—and the only comprehensible reason being that they want to leverage the legacy of the product stacks that came before to sell an inferior product.
Posted on Reply
#114
Prima.Vera
mxthunderIm kinda pumped for 512 bit bus. GTX280 was my favorite card of all time
You will get even more pumped up when you see its price.

To be honest, the newly launched Intel cards are the best cards one can buy now. They have 12 GB of VRAM, and they are priced fairly. Those cards are perfect for 1080p gaming.
I sincerely see no point in buying an x060 card over those.
Posted on Reply
#115
Dawora
yfn_ratchetSmeh. 5060Ti 16GB is... interesting? in concept, but it'll need to be a knockout to be worth it to me. Better mem bandwidth than the 3060 12GB, full x16 link, and ~4070S performance for ~$425 MSRP and I'll keep my eye on it.

Still dislike that the double-up VRAM card has been up-tiered for Ada and now Blackwell. It reeks of, like was muttered many times with every Ada release, Nvidia quietly up-badging every card in the lineup and whistling as they slink away. What would have worn the 5050 name well is now the 5060, with no good justification—and the only comprehensible reason being that they want to leverage the legacy of the product stacks that came before to sell an inferior product.
When the VRAM is fine, you go looking for something else not to like...
Just don't buy it, problem solved.

This whole topic is full of Nvidia hate and QQ whining.
Prima.VeraYou will get even more pumped up when you see its price.

To be honest, the newly launched Intel cards are the best cards one can buy now. They have 12 GB of VRAM, and they are priced fairly. Those cards are perfect for 1080p gaming.
I sincerely see no point in buying an x060 card over those.
And you can't even OC it without huge hassle; there are also big problems playing anything other than AAA/benchmark games, sometimes even those have problems, and there's no DLSS.

But like I said, there's a lot of hate against Nvidia...
All the haters are in the forums.
Posted on Reply
#116
A&P211
AusWolfWhat a big leak, wow! A total of 5 model names that we always knew would exist at some point in time. :rolleyes:


I thought that was a thing of the past. I'm so glad it isn't. :)
I live on about 15 acres, 30 miles from the city. I don't mind driving 30 min for work for them.
Outback BronzeI'm trying to get my kid playing Sonic on the Sega Mega Drive. He won't play damn it.



I'm trying to get him inside playing games. He stays cleaner inside playing games :)

Good comparison here of 8GB vs 16Gb:

Kids still need exercise, not just exercising their thumbs; get some chickens and a mule, aka a jackass.
Posted on Reply
#117
Dr. Dro
mkppoa $2500 for 5090 is expected because, well, they can.
My hard limit is $1999, and not a dollar more. I'm still hopeful it'll slot in at the same $1599 price point of the RTX 4090, since tech is supposed to advance while keeping prices at the same level, but I know better than that.
Posted on Reply
#118
yfn_ratchet
DaworaWhen the VRAM is fine, you go looking for something else not to like...
Just don't buy it, problem solved.

This whole topic is full of Nvidia hate and QQ whining.
It's not like I'm bouncing in my seat waiting to buy it, man. I have my checklist for what my next upgrade will be, and any of the three colors have a chance of slotting in where my 3060 is next year. It's down to meeting my performance expectations at my price expectation, bar none.

And on the topic of 'hate', am I not justified in complaining about what I see as an attempt to fleece unaware buyers of their cash by intentionally and misleadingly altering the naming conventions of a product stack? Nvidia is slowly building up a history of such egregious tactics, ranging from the 4080 12GB to the 3050 6GB to the 4060/4060Ti 8GB; the only way to keep Nvidia accountable is to continually make our displeasure known and inform people about these disingenuous practices so that they aren't—and I'll say the quiet part out loud here—cheated out of their money.
Posted on Reply
#119
Dawora
Dr. DroMy hard limit is $1999, and not a dollar more. I'm still hopeful it'll slot in at the same $1599 price point of the RTX 4090, since tech is supposed to advance while keeping prices at the same level, but I know better than that.
The 5090's specs are so off the charts it can cost $2000-2900.
But it's a good time to sell an old 4090.
yfn_ratchetIt's not like I'm bouncing in my seat waiting to buy it, man. I have my checklist for what my next upgrade will be, and any of the three colors have a chance of slotting in where my 3060 is next year. It's down to meeting my performance expectations at my price expectation, bar none.

And on the topic of 'hate', am I not justified in complaining about what I see as an attempt to fleece unaware buyers of their cash by intentionally and misleadingly altering the naming conventions of a product stack? Nvidia is slowly building up a history of such egregious tactics, ranging from the 4080 12GB to the 3050 6GB to the 4060/4060Ti 8GB; the only way to keep Nvidia accountable is to continually make our displeasure known and inform people about these disingenuous practices so that they aren't—and I'll say the quiet part out loud here—cheated out of their money.
I don't mind what Nvidia releases (3050, 4080 12GB, 4060 or Ti); at least there is something for everyone.
If you don't like it, you don't buy it.

The 5060 Ti 16 GB can be good if the price/perf is right.
Posted on Reply
#120
Quicks
MCJAxolotl7Nobody specifically brought up ultra, but that is the only condition in which 8 GB cards at 1080p would be bad, and people are saying that 1080p on 8 GB cards is bad.


Medium settings can be fine; also, frame gen, AI upscaling, and DLSS/XeSS/FSR exist, for god's sake.
Ironically, enabling those settings also requires more VRAM, so your point is rather pointless, TBH.
Posted on Reply
#121
Knight47
jaszyTextures in modern games are just too dense to ignore, especially at what is now a mainstream resolution (1440p).
If 1440p is mainstream, why are people worshipping the 1080p B580 card?
Posted on Reply
#122
AcE
TomorrowNow you're changing your tune? You flat out denied that the same happened with 2GB, 4GB and 6GB. Now you're saying that it's natural evolution and you agree with me that it will happen with 10GB, 12GB etc in the future? Stop flip-flopping and make up your mind.

So this means you don't even know what you're talking about, since you've never experienced this yourself.
Only a denier would look at 7.6 GB on an 8 GB card and say - yeah, that oughta be enough...

Did how? Provided zero links to disprove those youtubers. Who are people gonna believe?
A random guy with a few hundred posts on a forum he joined a few weeks ago, or a person with decades of experience and millions of subs?

Only an Nvidia fanboy would say something as moronic as that. I think everyone who's rational would agree that Nvidia is ahead, but Intel is not generations behind with "primitive software". They have all the same features as Nvidia. What holds Intel back is their own management and the low volume of those cards.

Oh, so now they're poor, or 1080p is an edge case. Didn't you say earlier that 1080p is the most used res? Now suddenly it's an edge case...

Oh so you looked at average framerate numbers and concluded that 4060 has zero issues.
What about 1% lows, frametimes, and image quality (textures not loading in), etc.?
Average framerate is only one metric and by that even ancient SLI was totally ok with zero issues...

From TPU's own chart. 480 is 4% slower than 1060 6GB and 7% faster than 1060 3GB. I guess that's competitive.

It is you who has nothing to provide in this discussion: calling 1080p an edge case, ignoring mounting evidence of 8 GB problems, and cheering Nvidia.
I hope at least your green check clears.
And even others mocking you for your delusions. But sure. Go on buying your high end cards and believing 8GB is enough.
I'm done arguing with a fanboy. And don't bother replying. You're on my blocklist now.
I'm not going endlessly into arguments with you that you have 100% lost. By the arguments provided, and even by the links and proof I was talking about from the beginning, you were firmly wrong; now you're only deflecting, and I have no time for this senseless and endless debate. For you this is just about "being right" now, not about technology, interest, and technical facts. 8 GB is 100% enough for even most 1440p games today, so it will easily be enough for 1080p for the foreseeable future. End of story. Also, Nvidia would 100% not risk an 8 GB card if you were right, but surprise, you're obviously not. -> Otherwise debate this with @W1zzard perhaps. It's his data. Maybe you'd listen to him.

edit: to make this a bit more clear, I'm gonna explain what the game review data shows for the RTX 4060 - and what it does not show:

- games are all tested in Ultra settings, so in a lot of cases *beyond* sweet spot of the 4060, settings which should not be used for a 4060, in other words, or only with DLSS activated. Worst case scenario, you could say.
- despite this beyond-sweet-spot usage, the card never had terrible fps (<30 or even <10 fps), which would show that the vram buffer is running into shortcomings, a very obvious sign usually.
- the only "problem" the card had was in some games in 1080p (and I only talk about 1080p here for this card), it had less than 60 fps.
- the min fps scaled with the average fps it had in that game, so if the avg fps was already under 60 fps, obviously the min fps wouldn't be great either - not related to vram.
- also there are games like Starfield which generally have issues with min fps that are visible in that review, and not only with the RTX 4060. Also not related to vram.
- the card generally behaved like it should, it was *not* hampered by 8 GB vram in any of the games. Just proving that what I said all along is true.
- furthermore 8 GB vram is also proven to be mostly stable for even 1440p+, as the vram requirements barely increase with resolution alone. There are a variety of parameters that will increase vram usage, for example world size, ray tracing, texture complexity, optimisation issues. A lot of games don't even have problems in 4K (!) with 8 GB vram. That is the funny thing here. :) The problems start when the parameters get more exotic, so to say.
- so saying 8 GB vram isn't enough today for 1080p or even 1440p is just wrong. Can it have select issues? Yes. It's not perfect; if you go beyond 1080p it will have more issues, but it will still mostly be fine. In 1080p, which this discussion was about, it's 99.9% fine on the other hand, making this discussion unnecessary.
- as it's still easily enough for 1080p, it will also be enough for a new low end card, for the foreseeable future.
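The resolution point in the list above can be put into rough numbers. A minimal back-of-envelope sketch, assuming a hypothetical six full-screen render targets at 8 bytes per pixel and a fixed 5 GB texture streaming budget (all figures are illustrative assumptions, not measurements from any game):

```python
# Back-of-envelope estimate of how render-target memory scales with
# resolution versus a fixed texture budget. All numbers here are
# illustrative assumptions, not measurements from any particular game.

def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    """Approximate memory for full-screen render targets (G-buffer,
    depth, post-processing), assuming `targets` buffers at
    `bytes_per_pixel` bytes each."""
    return width * height * targets * bytes_per_pixel / 1024**2

# Hypothetical resolution-independent texture streaming budget.
texture_budget_mb = 5000

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    rt = render_target_mb(w, h)
    print(f"{name}: render targets ~{rt:.0f} MB, "
          f"total ~{rt + texture_budget_mb:.0f} MB")
```

Under these assumptions, going from 1080p to 4K adds only a few hundred MB of render targets on top of a multi-gigabyte texture footprint, which is why world size, textures, and RT move vram usage far more than resolution alone.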
lexluthermiesterThere are rumors of a number of games that will push beyond the 24 GB limit of the 4090. If true, then 32 GB will be relevant.
Haha, now let's stay in reality. What I can tell you is this: if you have a ton of vram it can be used; it can be useful even in games that don't *need* it, just so your PC never has to reload vram again. It's basically luxury - less stutter than cards with 16 GB, for example. I experienced this while switching from my older GPU to the new one: Diablo 4 ran that much smoother, there was basically no reloading anymore while zoning.
jaszyYou're also completely wrong and think you're smarter than you actually are, but I digress. Not going to bother arguing with someone who quote-replies literally everyone.
The issue here is that you're comparing two different companies. AMD used to do an "I give you extra vram" play for marketing vs Nvidia. The fact here is that, historically, Nvidia's upper-midrange GTX 1070 and semi-high-end GTX 1080 used 8 GB back then; this is a fact. Whether AMD used a ton of vram on a mid-range card does not change this fact. The same AMD chip originally started with 4 GB, as per the link you provided yourself, as everyone can see. AMD also used marketing tactics like this on the R9 390X, so you can go even further back to GPUs that never needed 8 GB in their relevant lifetime. When the R9 390X "needed" 8 GB, it was already too slow to really utilise it, making 4 GB the sweet spot for the GPU, not 8 GB (averaged over the lifetime, of course). :) And as was already said, by multiple people not just me, Nvidia vram management is simply better than AMD's - this has been true for a very long time now, making an AMD vs Nvidia comparison kinda moot. AMD will historically always need a bit more vram to do the same things Nvidia does (not have lags or stutter). As someone said, this is probably due to some software optimisation.
Posted on Reply
#123
chrcoluk
TomorrowLook at TPU game reviews. Some games go over 8 GB even at 1080p lowest settings with no FG/RT.
This year already 27% of TPU tested games exhibited this behavior and it will only increase each year.

Frame gen and RT increase VRAM usage even more. Yes, upscaling lowers it, but only enough to counter the increase from FG/RT.
Of course they will; for some reason, so many people think rendering resolution is a big driver of VRAM usage.

I've seen a post from someone who thinks the "8 gig is enough" crowd are basing it on a bygone era, when it was the norm for GPUs to have loads of VRAM that had no effect in games. Times have changed.

The fact we have streaming tech in game engines tells us what we need to know: it is primarily a VRAM-mitigation tool. Without it, GPUs would need dozens of gigs of VRAM to load everything up all the time. These engines collapse when they simply have to do too much to mitigate extremely low amounts of VRAM, like on 8-10 gig cards. You end up with missing textures, low-detail textures that are meant to be used only at a distance, and other issues. But it's "I still get 400 fps dude, playing my shooter where I don't care about details, it's all good".

My last upgrade was pretty much down to VRAM; it wouldn't surprise me if my next one is as well. I would have gone 4070 Ti Super instead of 4080 Super if there had been an FE version of it. I can also now use VRAM-guzzling apps with ease; before, I had to shut everything down aggressively to get every byte of VRAM I could, and had a plan to use the iGPU for the desktop to save VRAM.
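The streaming idea above can be sketched as a toy resident-set manager with a fixed budget and least-recently-used eviction. The class, texture names, and sizes here are all hypothetical; a real engine would typically drop a texture to lower mip levels rather than evict it outright:

```python
# Toy sketch of the VRAM-mitigation idea behind texture streaming:
# keep recently used textures resident and, when a fixed budget is
# exceeded, evict the least recently used ones. Names and numbers
# are illustrative, not from any real engine.

from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB, in LRU order

    def request(self, name, size_mb):
        """Mark a texture as needed this frame; evict least-recently-used
        textures until it fits in the budget. Returns the evicted names."""
        if name in self.resident:
            self.resident.move_to_end(name)  # refresh its LRU position
            return []
        evicted = []
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            old, _ = self.resident.popitem(last=False)  # drop LRU texture
            evicted.append(old)
        self.resident[name] = size_mb
        return evicted

streamer = TextureStreamer(budget_mb=1024)
streamer.request("terrain", 400)
streamer.request("buildings", 400)
print(streamer.request("characters", 400))  # over budget: evicts "terrain"
```

When the budget is far too small for the working set, this loop churns every frame - the toy analogue of the missing and low-detail textures described above.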
Posted on Reply
#124
mkppo
igormpFor stuff like LLMs the jump will be waaaay higher due to all the extra memory bandwidth and also the extra VRAM. For this case the 4090 was a minor upgrade compared to the 3090 since the memory subsystem was still pretty much the same.
And you can bet tons of people will be buying this GPU solely for this reason.
Oh yeah, sorry, I meant strictly gaming performance. I skipped the 4090 over the 3090 solely for the reason you mention. The 5090 in this regard seems like a beast.
Posted on Reply
#125
Arkz
AcEI'm not going endlessly into arguments with you that you have 100% lost. By the arguments provided, and even by the links and proof I was talking about from the beginning, you were firmly wrong; now you're only deflecting, and I have no time for this senseless and endless debate. For you this is just about "being right" now, not about technology, interest, and technical facts. 8 GB is 100% enough for even most 1440p games today, so it will easily be enough for 1080p for the foreseeable future. End of story. Also, Nvidia would 100% not risk an 8 GB card if you were right, but surprise, you're obviously not. -> Otherwise debate this with @W1zzard perhaps. It's his data. Maybe you'd listen to him.

edit: to make this a bit more clear, I'm gonna explain what the game review data shows for the RTX 4060 - and what it does not show:

- games are all tested in Ultra settings, so in a lot of cases *beyond* sweet spot of the 4060, settings which should not be used for a 4060, in other words, or only with DLSS activated. Worst case scenario, you could say.
- despite this beyond-sweet-spot usage, the card never had terrible fps (<30 or even <10 fps), which would show that the vram buffer is running into shortcomings, a very obvious sign usually.
- the only "problem" the card had was in some games in 1080p (and I only talk about 1080p here for this card), it had less than 60 fps.
- the min fps scaled with the average fps it had in that game, so if the avg fps was already under 60 fps, obviously the min fps wouldn't be great either - not related to vram.
- also there are games like Starfield which generally have issues with min fps that are visible in that review, and not only with the RTX 4060. Also not related to vram.
- the card generally behaved like it should, it was *not* hampered by 8 GB vram in any of the games. Just proving that what I said all along is true.
- furthermore 8 GB vram is also proven to be mostly stable for even 1440p+, as the vram requirements barely increase with resolution alone. There are a variety of parameters that will increase vram usage, for example world size, ray tracing, texture complexity, optimisation issues. A lot of games don't even have problems in 4K (!) with 8 GB vram. That is the funny thing here. :) The problems start when the parameters get more exotic, so to say.
- so saying 8 GB vram isn't enough today for 1080p or even 1440p is just wrong. Can it have select issues? Yes. It's not perfect; if you go beyond 1080p it will have more issues, but it will still mostly be fine. In 1080p, which this discussion was about, it's 99.9% fine on the other hand, making this discussion unnecessary.
- as it's still easily enough for 1080p, it will also be enough for a new low end card, for the foreseeable future.

Haha, now let's stay in reality. What I can tell you is this: if you have a ton of vram it can be used; it can be useful even in games that don't *need* it, just so your PC never has to reload vram again. It's basically luxury - less stutter than cards with 16 GB, for example. I experienced this while switching from my older GPU to the new one: Diablo 4 ran that much smoother, there was basically no reloading anymore while zoning.

The issue here is that you're comparing two different companies. AMD used to do an "I give you extra vram" play for marketing vs Nvidia. The fact here is that, historically, Nvidia's upper-midrange GTX 1070 and semi-high-end GTX 1080 used 8 GB back then; this is a fact. Whether AMD used a ton of vram on a mid-range card does not change this fact. The same AMD chip originally started with 4 GB, as per the link you provided yourself, as everyone can see. AMD also used marketing tactics like this on the R9 390X, so you can go even further back to GPUs that never needed 8 GB in their relevant lifetime. When the R9 390X "needed" 8 GB, it was already too slow to really utilise it, making 4 GB the sweet spot for the GPU, not 8 GB (averaged over the lifetime, of course). :) And as was already said, by multiple people not just me, Nvidia vram management is simply better than AMD's - this has been true for a very long time now, making an AMD vs Nvidia comparison kinda moot. AMD will historically always need a bit more vram to do the same things Nvidia does (not have lags or stutter). As someone said, this is probably due to some software optimisation.
Really VRAM-hungry games are indeed very rare. That doesn't mean they don't matter. People have been having issues with Indiana Jones on 8 GB cards at 1080p. That's right now, not the future. This is going to become more common. I'm sure there are ways of dealing with it, like lowering textures, shadows, and whatever else. But the issue is there now. If all the common games don't have it, great. But if it's a game someone wants to play on their new card and they discover it's running crappy because it's hitting the ceiling, then... well, that just sucks.

And people are too forgiving of a massive rich company skimping on VRAM. The 5060 should really launch with about 12GB, but will no doubt be 8GB. 12GB should at least let it run games for the next few years without running into issues. Nvidia is just greedy and stingy. The fact that my 3080 only came with 10GB, and an empty slot on the PCB where another 2GB module could fit, physically shows their greed. My 6700 XT cost far less and came with 12GB. The 6800 XT and 6900 XT both had 16GB. NV is lagging behind, and people excuse it with the "most games work fine" argument. If a new card is coming out now, I should expect it to work with any game just fine; a new product shouldn't have an issue right away, rare as it may be.
Posted on Reply