Monday, December 16th 2024

32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Zotac has apparently published webpages prematurely for NVIDIA's entire GeForce RTX 5000 series GPU line-up, which will launch in January 2025. According to the leak, spotted by VideoCardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month: the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. The premature listing has since been removed by Zotac, but screenshots taken by VideoCardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.

It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or the 5090D. The last time we checked in on the RTX 5070 Ti, leaks suggested it would have just 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest that the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to other RTX 5000 GPUs, particularly the Ti-spec ones, should make them better suited to the ever-increasing demands of modern PC games.
Sources: VideoCardz, Wccftech

173 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

#26
Outback Bronze
nguyenMy kid loves playing games on Intel integrated graphics
I'm trying to get my kid playing Sonic on the Sega Mega Drive. He won't play damn it.
A&P211My kids love playing outside.
I'm trying to get him inside playing games. He stays cleaner inside playing games :)

Good comparison here of 8 GB vs 16 GB:

Posted on Reply
#27
jaszy
Mainstream 8GB GPUs first appeared in 2016.

Pretty crazy that we've had an 8GB low-to-mid range for nearly a decade.
Posted on Reply
#28
AcE
TomorrowWell except people like you who think it's normal to sell 8GB for 300+ and argue against progress.
Try bringing technical arguments instead of empty words and drama-queen talk.
TomorrowThis has happened many times before. It happened with 2GB, 4GB and 6GB cards too.
No, it didn't. 8 GB has stayed relevant far longer than those ever did. 4 GB, for example, was quickly outdated because 4 GB is just a low amount, whereas 8 GB isn't. 6 GB was simply "replaced" by 8 GB; it just vanished from the market. 2 GB had the same fate as 4 GB. Apples and kiwis.
TomorrowFor some reason i no longer see anyone arguing today that 4GB is enough if you lower enough settings.
4 GB already reached the critical zone years ago, while 8 GB is still far from that. You're just technically wrong, and because you have no technical arguments you're just talking endlessly.
TomorrowThat's your problem. You refuse to watch videos and then claim i have no practical arguments or links. How lazy can a person be.
Lazy? No, I just don't like videos; text is far easier to digest. You are the lazy one. It is YOUR argument, so make it or lose the argument; life is simple. So far you haven't refuted any of my arguments, so it's quite easy going for me.
TomorrowAnd what do you mean by "shit storm"?
If you were right, there would be millions of unhappy 8 GB video card users, namely owners of the 4060, 4060 Ti, 7600, 6600, and 6600 XT. But there aren't, because you're just making up drama and your words have no merit. :)
TomorrowComplain and nitpick on their chip as much as you do but every 8GB card released from now on will be compared to B580
Good joke. Nvidia surely won't compare their cards against those of a company with 0% market share. This is like saying Apple will compare their phones with a brand nobody is buying. Yeah, makes a ton of sense. ^^
Hecate91Except low end cards should be capable of 1440p now, as 1440p monitors are very affordable.
No, and 1080p is still the most widely used resolution. Edit: they are capable of it if you include upscaling, or if you play older games / optimised settings. But 1440p will never be a must on a low-end card, not in the foreseeable future at least. A lot of people are still happy with 1080p, and until that changes 1440p will not make 1080p obsolete.
Hecate91Honestly good luck with that, I like to have written content as an option, but if you want to compare how games look and perform with an analytical comparison you need to see it in a video.
I don't need luck; Nvidia will sell the 5060 with 8 GB because 8 GB isn't an issue (assuming the rumours are true). The technical side (the tech companies) is on mine, aside from reviewers like W1zzard proving it.
Hecate91There was criticism from users and reviewers when AMD released 8GB cards, that same criticism doesn't happen with Nvidia cards,
Sure, the good old Nvidia mindshare, but the criticism towards AMD was more because the 7600 is nearly the same chip as the 6600 XT and barely faster, not because of 8 GB of VRAM. Also, you're wrong: there was a lot of unnecessary drama over the 4060 Ti 8 GB (because of VRAM), which a lot of people then compared to the 7700 XT, saying the 7700 XT is better. Maybe it is, but that's another topic, because for me this discussion is mainly (99%) about the 5060, not a mid-range card like the 4060 Ti.
Hecate91A majority of reviews and comments from users have been positive on the Intel B580
We will see in the market share; comments and reviews are irrelevant. :) Buyers vote, and everything else really is irrelevant. Probably 5% market share and then back to 0%, like last time.
Outback BronzeGood comparison here of 8 GB vs 16 GB:
So why is he calling it a joke if 16 GB is so great? Maybe because 16 GB is 99% useless on a card that is mainly used for 1080p. :) 8 GB largely works fine at 1440p too, btw.
jaszyMainstream 8GB GPU's first appeared in 2016.
Off topic. The drama here is about low-end GPUs having 8 GB; in 2016 that was the 1080 and 1070, i.e. semi-high-end and upper-mid-range, so completely different cards that have nothing to do with this discussion beyond saying "oh, 8 GB was also used back then on completely different cards".
jaszyPretty crazy that we have decade long 8GB low-mid range.
It's only crazy if you think of it in non-technical terms. If you understand what an 8 GB VRAM buffer is, in technical terms, it's not crazy at all. :)
Posted on Reply
#29
PaddieMayne
I'm going to apply to my bank to remortgage the family home so I can afford a shiny new 5000 series nvidia card, as I'm out of kidneys.
Posted on Reply
#30
Ibotibo01
I suppose the 5060 will be priced around $259 or $269. Anything higher than that and it will be another DOA product. 8 GB isn't enough for ray-tracing games; you can find plenty of articles and videos confirming this. Even DLSS 3 can't save the 4060 if its VRAM is fully used. Nvidia will likely say, "It's okay, we have new compression techniques, and our 8GB is equivalent to AMD's and Intel's 12GB." I actually had hopes that Nvidia would release this card with 12GB on a 128-bit bus, possibly leveraging GDDR7. Anyway, just wait 3 or 4 months after the 5060 8GB release, and we'll probably see a 5060 Super or a 5060 12GB variant. Also, the 5060 8GB could be on par with the 3070 Ti.
Posted on Reply
#31
Bwaze
jaszyMainstream 8GB GPUs first appeared in 2016.

Pretty crazy that we've had an 8GB low-to-mid range for nearly a decade.
We've also had a 4 TB SSD ceiling for half a decade now; the only option above that, 8 TB, has cost twice as much per TB for that entire time.

Jensen Huang, September 2022:

"Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
Posted on Reply
#32
nguyen
Outback BronzeI'm trying to get my kid playing Sonic on the Sega Mega Drive. He won't play damn it.


I'm trying to get him inside playing games. He stays cleaner inside playing games :)
Is he asking for a 4080 to play fortnite :p

Yeah I would prefer my daughter to stay inside rather than outside
Posted on Reply
#33
AcE
Bwaze"Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
It's true. The new gen is on "4nm", which is a derivative of the 5nm they used last time, and Moore's Law can't be upheld if your progress is that slim. Datacenter progress, on the other hand, is better, so I'd say he's maybe half right.
Posted on Reply
#34
Tomgang
Really, Nvidia? Still 8 GB of VRAM on the RTX 5060? It's already DOA at launch, at least in my opinion.
If a possible RTX 5050 comes with only 6 GB, that will be DOA at launch too, as there are now games out there that literally refuse to start with less than 8 GB of VRAM. If I remember correctly, the new Indiana Jones game would not start on a 6 GB GPU; it just gave a VRAM error message. I saw that in a YouTube video a few days ago.

The RTX 5060 should have at least 10 GB, and preferably 12 GB.
Posted on Reply
#35
Dr. Dro
There were some edge cases where 24 GB of VRAM would be maxed out (modded Bethesda games); 32 GB should rectify this and is welcome for high-resolution (4K and beyond) gamers. The one thing that bothers me is the 5080 staying with the 256-bit, 16 GB configuration of the previous generation. It's substantially worse than the 5090's, and the gulf between these two cards will be simply insane, which may have severe implications for pricing (as in, the 5090 might end up supremely expensive).

On the low end, 8 GB is tight, but serviceable. The x60 cards are for 1080p and below gamers. Sure, they might be "good enough" for some basic 1440p gaming, but that's where they get you.
Posted on Reply
#36
kneblasch
Legacy-ZASpeaking of Zotac, is the brand any good? I never tried them before.
I've been using a Zotac card since January and have had no issues at all so far. I only chose it because it looked better than other cards in the same price range, but for my first Zotac card, I'm really happy with it.
Posted on Reply
#37
Dr. Dro
kneblaschI've been using a Zotac card since January and have had no issues at all so far. I only chose it because it looked better than other cards in the same price range, but for my first Zotac card, I'm really happy with it.
The Zotac Ada cards really are beautiful this time around. I like how they look on a finished system. Might not be as pretty as a ROG Strix, but they certainly don't carry their price tag...
Posted on Reply
#38
AcE
Dr. DroThe one thing that bothers me is the 5080 staying with the 256-bit, 16 GB configuration of the previous generation. It's substantially worse than the 5090's and the gulf between these two cards will be simply insane, which may have severe implications on price (as in, the 5090 might end up supremely expensive).
True, the gap will widen, so the 5090 will probably cost at least $2000 and the 5080 stay at $1200-ish. But this time the gap is so wide that the 5080 is actually worth buying, rather than "oh, it's only $400 more? I'll get the 4090 instead". This is what happens when there is zero competition; Nvidia does whatever they want. Quite sad!
Posted on Reply
#39
john_
It's going to be expensive. Putting 32 GB on the 5090 means it can do more in AI, so I guess the price will be close to $2000. I would say more than $2000 if Nvidia weren't discontinuing the 4090.
Posted on Reply
#40
kondamin
A full 512-bit card, that's going to be one expensive mofo.
Kinda feel like Nvidia is expecting the AI gravy train to slow down, to be willing to bring this level of complexity to the masses.
Maybe the 60 series of cards will have HBM at the consumer level.
Posted on Reply
#41
AcE
john_It's going to be expensive. Putting 32GBs on the 5090 means it can do more in AI, so I guess the price will be close to $2000. I would be saying more than $2000 if Nvidia wasn't discontinuing the 4090.
Could very well start at $2000, and with not enough chips, real pricing will be closer to $2500: same as with the 4090, about $500 over MSRP.
Posted on Reply
#42
Dr. Dro
AcETrue, the gap will widen, so the 5090 will probably cost at least 2000$ then and 5080 stay at 1200ish, but this time gap is so wide, 5080 actually worth to buy and not just "oh, it's only 400$ more? I get the 4090 then instead". This is what happens when there is 0 competition, Nvidia does whatever they want. Quite sad!
If it's $1999 MSRP, should be doable. Expensive, but I can buy one. But the 5080 isn't worth purchasing this time around, not if you have a premium AIB model 4080 or 4080S. Even if it's ~20% faster, it's not worth it.
Posted on Reply
#43
Bwaze
john_It's going to be expensive. Putting 32GBs on the 5090 means it can do more in AI, so I guess the price will be close to $2000. I would be saying more than $2000 if Nvidia wasn't discontinuing the 4090.
I think there's about zero chance the RTX 5090 will be below $2000. This is the price movement of the RTX 4090; it went ballistic before they allegedly stopped production. But it's still widely available, so it's not scarcity.

This AI focus is going to be worse than cryptomadness, and people are still pretending it doesn't affect them.

Posted on Reply
#44
AcE
Dr. DroIf it's $1999 MSRP, should be doable. Expensive, but I can buy one. But the 5080 isn't worth purchasing this time around, not if you have a premium AIB model 4080 or 4080S. Even if it's ~20% faster, it's not worth it.
It's not, yes. 256 more shaders, or 512 in the case of the vanilla 4080; it's not worth it, it will maybe be 10-20% faster tops. Nvidia is doing exactly this because they know they have zero competition; otherwise this strategic move would be impossible. Otherwise the 5080 would be bigger, and this "5080" would really be a 5070 Ti, nothing else. Welcome to a 100% monopoly in the high end. The same disaster as the RTX 2000 era again, but I think even worse this time.
Posted on Reply
#45
Prima.Vera
I don't know what kind of games you guys are playing, but I have no issues running any game on a GTX 1080 with 8 GB of VRAM at 1080p. Basically ANY game I've played hovers above 50 fps... And that's with MAX texture details.
Posted on Reply
#46
LittleBro
AcEActivate DLSS and it is, however, as I said this discussion is about 5060, 4060 and 7600 (low end cards), not 3070 Ti, which is like a rat case of its own and I never said 3070 Ti is fine. :) 3070 / 3070 Ti are not fine, why did you buy it? RX 6800 ~ same price, way better card. Here we can start blaming the users who bought a bad video card, tbh.
It does not work like that. Even DLSS requires VRAM for caching. VRAM utilization rises with DLSS/FSR/XeSS, and even more with frame generation.
Shit starts to hit the fan when the display driver can't satisfy the caching requirements; stuttering and fps drops are then inevitable.
It has been proven multiple times, especially with the RTX 4060 Ti 8 GB vs 16 GB, that the system with less VRAM delivers lower 0.1% and 1% low fps and also uses more system RAM (about 2-3 GB more).

8GB is definitely not okay even for the lower mainstream these days; it was already present in mainstream SKUs in 2016.
Game devs can't move forward with the quality of graphics assets when they are still constrained by low VRAM amounts.
The RTX 3060 is the most popular GPU according to the Steam hardware survey, and I'd say it's not the 12 GB variant but rather the 8 GB one.

Now, with Intel introducing 12 GB in the low-end segment (B580), Nvidia should really reconsider going past 8 GB. GDDR6(X) chips are not expensive today; you can get 8 GB for $22.
In the low end and lower mainstream, it does not actually matter whether you have GDDR7 or GDDR6. The card most probably won't have enough performance to fully utilize the memory bandwidth, whether the chips run at 21 Gbps or 28 Gbps. More important than bandwidth is capacity.
The bigger the capacity, the more you can fit into the cache. The lower the capacity, the more you depend on the primary storage device's speed and on memory bandwidth.

12 GB + 192-bit or 16 GB + 128-bit should become the standard even in the low end and lower mainstream.
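For reference, peak memory bandwidth for those configurations is simple to work out: bus width divided by 8 bits per byte, times the per-pin data rate. A quick sketch (the bus widths and data rates are just the rumoured figures discussed above, not confirmed specs):

```python
def gddr_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 192-bit bus with 21 Gbps GDDR6 (the 12 GB + 192-bit option)
print(gddr_bandwidth_gbps(192, 21))   # 504.0 GB/s
# 128-bit bus with 28 Gbps GDDR7 (the 16 GB + 128-bit option)
print(gddr_bandwidth_gbps(128, 28))   # 448.0 GB/s
# 512-bit bus with 28 Gbps GDDR7 (the rumoured 5090 configuration)
print(gddr_bandwidth_gbps(512, 28))   # 1792.0 GB/s
```

So the 12 GB + 192-bit GDDR6 option would actually out-bandwidth the 16 GB + 128-bit GDDR7 one, which is why capacity matters more than the memory generation at this tier.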

Btw, if you recall, the GTX x60 series were never low-end cards; they were lower-mainstream cards. Nvidia used to make x50 cards and the absolute low-end x30.
Since RTX 2000 that has changed: the RTX x70 series is now what the x60 series was meant to be. The RTX 4080 12 GB was luckily renamed to the 4070 after heavy criticism.
It goes the same way on the AMD side. With the HD 6000 series, things changed: the HD 6850/6870 was not the real successor to the HD 5850/5870; the HD 6950/6970 was.

As for the RTX 5090, forget about a <$2k price tag; expect $2299 or more. We saw there was demand for the RTX 4090 even when it cost above $2k.
Posted on Reply
#47
RayneYoruka
I'd love to snag a 4080S or a 4090 if they were going down in price in the usual stores where I buy tech, but I guess the same thing is happening again: new gen incoming, no new stock whatsoever~~ And this is how I ended up with a 3080 10G when the 40 series came out. Hopefully this time I'm not as stupid, buying so late lmao.

If the price is worth it, and as long as there's 16GB of VRAM, I'll most likely end up with a 5080. I'm hitting the VRAM ceiling in some 3D stuff I do, and with RT, but the worst offender has been VR, where it crashes SteamVR, causing the worst sickness I've ever experienced lol
Posted on Reply
#48
Hecate91
BwazeWe've also had a 4 TB SSD ceiling for half a decade now; the only option above that, 8 TB, has cost twice as much per TB for that entire time.

Jensen Huang, September 2022:

"Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
With every company focusing on AI, the consumer market suffers, and I only expect things to get worse with SSDs and video cards. Nvidia has been treating mainstream gamers as an afterthought, yet people still get excited for these new cards.

Also, Jensen saying that is an admission that prices can only go up; leather jacket man won't be even slightly generous with VRAM or bandwidth unless you buy a flagship card.
Posted on Reply
#49
AcE
Prima.VeraI don't know what kind of games you guys are playing, but I have no issues running any game on a GTX 1080 with 8 GB of VRAM at 1080p. Basically ANY game I've played hovers above 50 fps... And that's with MAX texture details.
They only play triple-A games on Ultra settings, with exactly the same one VRAM-heavy level over and over again, just to prove the dramatubers right. /jk :)))))
LittleBroIt does not work like that. Even DLSS requires VRAM for caching. VRAM utilization raises with DLSS/FSR/XeSS and further more with frame generation.
It does, afaik; DLSS lowers VRAM usage. But the way I used the DLSS argument was general, about performance, not only VRAM. With frame gen, I'm not sure whether VRAM usage stays flat or increases.
LittleBroIt has been proven multiple times, especially RTX 4060 Ti 8 GB vs Ti 16 GB, that system with less VRAM delivers less 0.1% and 1% lows fps and also uses more system RAM (about 2-3 GB more).
Edge cases, and things normal users (so most people on the planet) barely care about, as long as the game is mostly fine and they have no problems. You're basically citing a luxury problem to try to make the point that 8 GB isn't enough; with that argument you can only say 8 GB is "suboptimal", but this topic was about *not enough*, not "suboptimal", so your argument is firmly beside the point. Also, you're citing the 4060 Ti, which isn't the main point of this discussion; this is about the 5060, 4060 and other low-end cards that will 100% be fine with 8 GB of VRAM.
LittleBro8GB is definitely not okay even for lower mainstream these days, it was already present in mainstream SKUs in 2016.
A strawman argument that I've already refuted in #29.
LittleBroBtw, if you recall, GTX x60 series were never low-end cards, they were lower mainstream cards.
And those were lower-midrange / midrange cards from 2016, with 6 GB / 3 GB of VRAM, yes, not 8 GB. If a card that was mid-range or semi-high-end in 2016 has 8 GB, it just proves that the high end of 2016 is now the low end. That's just natural evolution, very normal. Nothing special.

Otherwise, read my other posts; nothing you said hasn't already been answered multiple times.
Posted on Reply
#50
TSiAhmat
BwazeI think there's about zero chance RTX 5090 will be below $2000. This is the price movement of RTX 4090, it went ballistic before they allegedly stopped production. But it's still widely available, so it's not scarcity.

This AI focus is going to be worse than cryptomadness, and people are still pretending it doesn't affect them.

For the love of god, let the 8000 series be bad at crypto... I meant AI, oops.
Posted on Reply