Monday, December 16th 2024

32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Zotac has apparently prematurely published webpages for the entire NVIDIA GeForce RTX 5000 series GPU line-up that will launch in January 2025. According to the leak, spotted by VideoCardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month, including the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. The premature listing has since been removed by Zotac, but screenshots taken by VideoCardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.

It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or 5090D. Last time we checked in with the RTX 5070 Ti, leaks suggested it would have but 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest that the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to other RTX 5000 GPUs, particularly the Ti-spec models, should make them better suited to the ever-increasing demands of modern PC games.
Sources: VideoCardz, Wccftech

173 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

#76
Solid State Brain
If you like to play around with LLMs, the latest AI video/image models, and even finetuning them for fun, 32GB might not even be enough. No need to be a professional.
#77
MCJAxolotl7
Hecate91It isn't exaggerated at all, Nvidia has stagnated on 8GB of VRAM for way too long, there are plenty of videos explaining why 8GB isn't enough even at 1080P if you don't care to read the reasons why.
I also don't expect many reviewers to say 8GB is a problem when Nvidia often gets excuses for things like not enough VRAM on the low and mid range.
People act like ultra settings are the only option. 8gb is fine at 1080p if you turn down the settings.
#78
Quicks
MCJAxolotl7People act like ultra settings are the only option. 8gb is fine at 1080p if you turn down the settings.
Even high settings in certain games take more than 8GB, and I have to switch to medium settings.

8GB is simply not enough today. 12GB is bare minimum and is next on the chopping block when the next console update comes. Probably in the next 2 years.
#79
Hecate91
MCJAxolotl7People act like ultra settings are the only option. 8gb is fine at 1080p if you turn down the settings.
I would be fine with a card limited to 1080p 60 fps medium if the price reflected that limitation at $200, but I really doubt that's going to happen since the 4060 8GB is $299.
But I think ultra settings are dumb in most cases for newer titles; not sure who brought up ultra. 10 or 12GB needs to be what an x60 card comes with, since Intel has proven it can be done for less than $300.
#80
igormp
Solid State BrainIf you like to play around with LLMs, the latest AI video/image models, and even finetuning them for fun, 32GB might not even be enough. No need to be a professional.
This, pretty much. A 70B model at q4 is ~35GB already, not even considering context, so you're stuck with either really expensive GPUs (like the A6000) or going dual GPU (my case).

2x32GB would be really lovely and would allow me to jump into 100B+ models.
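The ~35GB figure above is just parameter count times bytes per weight. A minimal back-of-the-envelope sketch (ignoring the KV cache/context and per-block quantization overhead, both of which push the real number higher):

```python
def weights_vram_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate VRAM for the weights alone: parameters x bits, in GB.
    Ignores context (KV cache) and quantization block overhead."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

# 70B parameters at 4-bit quantization:
print(weights_vram_gb(70, 4))  # -> 35.0
```

Real q4 formats (e.g. GGUF's Q4_K variants) store scales alongside the weights, so effective bits per weight land closer to 4.5-5, which is why practical footprints come out a bit above this estimate.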
#81
Vader
8 GB on a XX60 card again?? Lol nvidia will be ridiculed by all reviewers again. Hopefully there will be good cards from AMD at this price point with more vram than that. The difference in texture quality at 1080p is noticeable enough to make a difference.
#82
Dr. Dro
Vader8 GB on a XX60 card again?? Lol nvidia will be ridiculed by all reviewers again. Hopefully there will be good cards from AMD at this price point with more vram than that. The difference in texture quality at 1080p is noticeable enough to make a difference.
Rightfully so, but it remains that 8 GB is still relatively adequate for the 1080p market this card is targeted at. My laptop's RTX 3050 has 4 GB only - managing that is a challenge but even then it's doable in a lot of games. If you just play eSports games like League of Legends or Dota 2 it's not even a concern. Ultimately these are the people they mean to sell the RTX 5060 to and their experience will be great.

It seems that SKU numbers were bumped up and the x50 series was effectively eliminated (there was never a desktop RTX 4050; the 4060 already taps into the smallest AD107 silicon, and the xx107 tier was previously used in "50 Ti" SKUs) - so to all intents and purposes, treat the 4060 and 5060 as descendants of the GTX 750 Ti and GTX 1050 Ti cards.
#83
TSiAhmat
Dr. DroIt seems that SKU numbers were bumped up and the x50 series was effectively eliminated (there was never a desktop RTX 4050; the 4060 already taps into the smallest AD107 silicon, and the xx107 tier was previously used in "50 Ti" SKUs) - so to all intents and purposes, treat the 4060 and 5060 as descendants of the GTX 750 Ti and GTX 1050 Ti cards.
Yeah, there is nothing wrong with 8GB on a card if it doesn't need more to perform noticeably better. The problem is the price on these things: way too expensive for basically no future-proofing, and the second you want to upgrade to WQHD, UHD or UWHD (ultrawide), they bottleneck themselves hard.

Hopefully Intel & AMD can take those prices down a peg. But probably *looking at the results of the past* not.
#84
gffermari
Unless nvidia introduces a DLSS 5.0 for texture compression/reconstruction or whatever, 8GB for a 5060 is a joke.
It should be 10GB, if not 12.
We shouldn't forget that the x60 series has to perform and play everything, with normal compromises. It's not the x50s.
#85
TSiAhmat
gffermariUnless nvidia introduces a DLSS 5.0 for texture compression-reconstruction or whatever, the 8GB for a 5060 is a joke.
It should be 10GB, if not 12.
We shouldn't forget that the x60 series have to perform and play everything, with normal compromises. It's not x50s.
Weren't the 4060 Ti 8GB and 16GB versions basically the same in relative performance? Has something changed from then to now? I agree for higher-tier SKUs, but the x60 doesn't seem to benefit from more VRAM yet; that might change with the 50 series, but it's too early to say. At least in my opinion.
#86
gffermari
When you run low on VRAM, you either get a slideshow framerate or a reduced framerate with textures that fail to load.
There are many occasions where the 8GB model had issues with loading textures.
#87
TSiAhmat
gffermariWhen you run low on VRAM, you either get a slideshow framerate or a reduced framerate with textures that fail to load.
There are many occasions where the 8GB model had issues with loading textures.
errrr okay, but wouldn't that decrease the "Relative Performance" of the GPU?

I have tested neither the 4060 Ti 8GB nor the 16GB version, so I can't talk about the textures... But wouldn't reviewers (like W1zzard) mention textures not loading when they benchmark GPUs for a review?
#88
Hecate91
TSiAhmatYeah, there is nothing wrong with 8GB on a card if it doesn't need more to perform noticeably better. The problem is the price on these things: way too expensive for basically no future-proofing, and the second you want to upgrade to WQHD, UHD or UWHD (ultrawide), they bottleneck themselves hard.

Hopefully Intel & AMD can take those prices down a peg. But probably *looking at the results of the past* not.
Agreed, pricing is the main issue and the performance isn't representative of the price. If all someone is going to play are esports games, that's fine, although many esports titles can be played on an iGPU.
Intel has already put up solid competition with the B580, AMD needs to be competitive with RDNA4.
TSiAhmatWeren't the 4060 Ti 8GB and 16GB versions basically the same in relative performance? Has something changed from then to now? I agree for higher-tier SKUs, but the x60 doesn't seem to benefit from more VRAM yet; that might change with the 50 series, but it's too early to say. At least in my opinion.
Hardware Unboxed has a comparison video of the 4060 Ti 8GB vs. 16GB; most of the performance gains aren't massive, but it still shows the 8GB model is VRAM limited, and 1% lows are better with the 16GB as well.
#89
Makaveli
AcE32 GB is irrelevant though unless you do work with the 5090 that involves heavy vram usage.
For anyone using LLM's and AI even 32GB of VRAM is not enough.
#90
TSiAhmat
Hecate91Hardware Unboxed has a comparison video of the 4060 Ti 8GB vs. 16GB; most of the performance gains aren't massive, but it still shows the 8GB model is VRAM limited, and 1% lows are better with the 16GB as well.
Oh, good to know, thanks for sharing the link. But 2 frames on average and 3 frames in the 1% lows at 1440p seems almost like a rounding error... (and that's for a price increase of $100 at the time [around 25% higher])
#91
Hecate91
TSiAhmatOh, good to know, thanks for sharing the link. But 2 frames on average and 3 frames in the 1% lows at 1440p seems almost like a rounding error... (and that's for a price increase of $100 at the time [around 25% higher])
Yeah, it's not a very good example of more VRAM on a card; the 4060 Ti is too bandwidth limited to make use of 16GB. I think 10 or 12GB would be better and wouldn't raise the price by $100.
#92
TSiAhmat
Hecate91Yeah, it's not a very good example of more VRAM on a card; the 4060 Ti is too bandwidth limited to make use of 16GB. I think 10 or 12GB would be better and wouldn't raise the price by $100.
But even then, is it really worth it? Even a $10 price increase is probably too much for that performance difference... (at least on this card)

Edit: What I want to say is:

The problem with the card is not the VRAM but the price. I'm not implying it has no other faults, but I think it's more important to focus on the price.
#93
mxthunder
AcEWell 8 GB still enough for the low end model, otherwise it’s nice to see bumps, 32 GB is irrelevant though unless you do work with the 5090 that involves heavy vram usage. Apparently they still needed to go to 512 bit despite using GDDR7, which is way faster than G6X. GTX 280 says hello, that was the last time Nvidia used a 512 bit bus, completely different times.
I'm kinda pumped for the 512-bit bus. The GTX 280 was my favorite card of all time.
#94
Vayra86
Prima.VeraImagine telling people that their brand new overly expensive laptop, with its mobile RTX 4070 GPU and 8GB of VRAM sucks! :laugh: :laugh: :laugh:
And yet there is absolutely no game in existence that doesn't play properly on those laptops.
My cousin has one and use it as a multimedia/gaming station while on the 6 month ship voyage tour, and he is having better FPS on his laptop (1080p), then me with an RTX 3080 in 1440p. :)
Imagine that.
We are talking cost effectiveness here. I'll agree "waste of sand" is exaggerating, but it serves to drive the point home about the lower midrange, which is really entry-level crap these days.

It's strange that when we get high up the stack for GPUs, we compare them down to single-digit percentages in bang/buck and FPS gaps, but when it's an x60, 'because it's the cheapest real GPU you can get', that suddenly doesn't fly, and the argument becomes 'as long as it runs, settings be damned', with bang/buck being relatively worse, or at best equal to much better cards, not being something that should change your perspective.

That, to me, is short-sighted and, especially for those on a budget, just not smart at all. Mobile is on a different plane altogether, but even there: what did he pay for that laptop and how long will it really last gaming proper? Be real. Most of these devices are dead in 4 years. A solid dGPU lasts 8+.

Just look at this chart. The 4060 ain't leading, AND it has no resale value 3 years from now, while better similar-perf/$ cards do. You don't need to be a scientist to figure this out; it is penny wise, pound foolish.

TSiAhmatYeah, there is nothing wrong with 8GB on a card if it doesn't need more to perform noticeably better. The problem is the price on these things: way too expensive for basically no future-proofing, and the second you want to upgrade to WQHD, UHD or UWHD (ultrawide), they bottleneck themselves hard.

Hopefully Intel & AMD can take those prices down a peg. But probably *looking at the results of the past* not.
Ofc. But we all know this 5060 won't cost $200. Or $250. It will cost more.
#95
TSiAhmat
Vayra86Ofc. But we all know this 5060 won't cost $200. Or $250. It will cost more.
Yeah, sadly you & I don't have to be fortune tellers to know that... I assume it will be €330-350 for maybe 20% more performance.

Also, I know NVIDIA doesn't need to lower current prices because the performance-per-dollar list has pretty much flat-lined (if you disregard the Intel GPUs, which are actually really cool; happy to see them. Hope they keep it up and don't stomp them into the ground/kill the project).
#96
adilazimdegilx
Let's not be hasty about judging 8GB of VRAM on a $300 (assuming) xx60-class card in 2025. Maybe Nvidia has other plans. They can justify the 8GB on the 5060 just by making it slower than the current 4060. Then no one would complain about the lack of VRAM.




/s
#97
Sir Beregond
Legacy-ZAAcE; as a previous 3070Ti owner, I can tell you, 8GB VRAM is not enough for High/Ultra textures, the GPU itself is capable, but the VRAM ceiling, when you hit it, performance tanks, and fast. It's simply not acceptable today. nGreedia promotes these(did) cards as high-end, when it's in fact, mid-range/entry at best.

They have confused themselves with their own lies/product stack so much over the years, they can't even keep track anymore and they want their customers to swallow their bs.
Hell, I've hit VRAM ceilings with 12GB at 4K. I somewhat regret getting the 3080 Ti and not just ponying up for the 3090 back then. But I did get it used at a non-scalper price, so I guess I can't complain; I just hate that I'm now thinking of needing to upgrade again solely to get more VRAM on a card that otherwise meets my needs for now.

As for 8GB... it still has a place, but not on any product at $300 or more. They should just hold off on releasing a 128-bit 5060 with 8GB until higher-density GDDR7 is available and they can make it at least 12GB.
#98
Vayra86
adilazimdegilxLet's not be hasty about judging 8GB of VRAM on a $300 (assuming) xx60-class card in 2025. Maybe Nvidia has other plans. They can justify the 8GB on the 5060 just by making it slower than the current 4060. Then no one would complain about the lack of VRAM.




/s
You might be on to something; the 4060 was rivalling Ampere x60 performance already. Maybe that's the trick: they'll make the x60 progressively worse so that the rest looks relatively better! :roll::roll:
#99
mkppo
A $2500 price for the 5090 is expected because, well, they can. Maybe that'll actually allow people to get one and it won't be scalped? Who knows. I do know it'll be fast though, just not on the level of the 3090 > 4090 jump.

8GB on a 5060 is a god-awful joke though. Newer games utilize larger framebuffers, and 1440p monitors, where 8GB is absolutely not enough, are very cheap now. The weak-ass 4060 could already fill that up, so I'm assuming the 5060 will be further bottlenecked. Buying a slower GPU is better than one that occasionally maxes out its framebuffer, because the end result is unpredictable.
#100
igormp
mkppoI do know it'll be fast though, just not on the level of the 3090 > 4090 jump.
For stuff like LLMs the jump will be waaaay higher due to all the extra memory bandwidth and also the extra VRAM. For this case the 4090 was a minor upgrade compared to the 3090 since the memory subsystem was still pretty much the same.
And you can bet tons of people will be buying this GPU solely for this reason.
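The bandwidth point can be made concrete with a common rule of thumb: in single-stream LLM decoding, every generated token requires streaming roughly all the weights from VRAM once, so token throughput is capped near memory bandwidth divided by model size. A rough sketch; the example figures below are illustrative round numbers, not exact specs of any card:

```python
def max_tokens_per_s(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on single-batch decode speed: each token read
    streams approximately the whole weight file from VRAM once."""
    return bandwidth_gb_s / model_size_gb

# A ~35 GB model on ~1000 GB/s of bandwidth is bounded
# around 1000 / 35, i.e. roughly 28-29 tokens/s, before overheads.
```

This is why a bandwidth bump can matter more than raw compute for this workload, as noted above.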