Monday, December 16th 2024
32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec
Zotac has apparently published webpages prematurely for NVIDIA's entire GeForce RTX 5000 series GPU line-up, which is expected to launch in January 2025. According to the leak, spotted by VideoCardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month: the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. Zotac has since removed the listings, but screenshots taken by VideoCardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.
It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or the 5090D. When we last checked in on the RTX 5070 Ti, leaks suggested it would have only 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest that the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to the other RTX 5000 GPUs, particularly the Ti variants, should make them better suited to the ever-increasing demands of modern PC games.
Sources:
VideoCardz, Wccftech
173 Comments
8GB is simply not enough today. 12GB is the bare minimum and is next on the chopping block when the next console update comes, probably in the next 2 years.
But I think ultra settings are dumb in most cases for newer titles; not sure who brought up ultra. 10 or 12GB needs to be what an x60 card comes with, since Intel has proven it can be done for less than $300.
2x32GB would be really lovely, and would allow me to jump into 100B+ models
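To put rough numbers on that claim, here is a back-of-the-envelope sketch (my own arithmetic, not an official sizing tool) of the VRAM needed just for the weights of a large language model at a given quantization. It ignores KV cache and activation overhead, which add more on top, but it shows why ~64 GB across two cards puts a 4-bit 100B-parameter model within reach.

```python
def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate decimal GB of VRAM for model weights alone."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# Weight footprint of a 100B-parameter model at common quantization levels
for bits in (16, 8, 4):
    print(f"100B params @ {bits}-bit ~= {weight_vram_gb(100, bits):.0f} GB")
```

At 4-bit quantization the weights alone come to roughly 50 GB, which fits in 2x32 GB with some headroom for context, while 16-bit (~200 GB) clearly does not.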
It seems that since SKU numbers were bumped up and the x50 series effectively eliminated (there was never a desktop RTX 4050; the 4060 already taps into the smallest AD107 silicon, and the xx107 tier was previously used in "50 Ti" SKUs), for all intents and purposes you should treat the 4060 and 5060 as descendants of the GTX 750 Ti and GTX 1050 Ti cards.
Hopefully Intel & AMD can take those prices down a peg. But probably *looking at the results of the past* not.
It should be 10GB, if not 12.
We shouldn't forget that the x60 series have to perform and play everything, with normal compromises. It's not x50s.
There are many occasions where the 8GB model had issues with loading textures.
I tested neither the 4060 Ti 8GB nor the 16GB version, so I can't talk about the textures... But wouldn't the reviewers (like W1zzard) mention textures not loading on their GPUs when they benchmark them for a review?
Intel has already put up solid competition with the B580, AMD needs to be competitive with RDNA4. Hardware Unboxed has a comparison video of the 4060Ti 8GB vs. 16GB, most of the performance gains aren't massive but it still shows the 4060 8GB is VRAM limited, 1% lows are better with the 16GB as well.
Edit: What I want to say is:
The problem with the card is not the VRAM but the price. I'm not implying it has no other faults, but I think it's more important to focus on the price.
It's strange that when we get high up the stack for GPUs, we compare them down to single-digit percentages in bang/buck and FPS gaps, but when it's an x60, 'because it's the cheapest real GPU you can get', that suddenly doesn't fly. The argument becomes 'as long as it runs, settings be damned', and bang/buck being relatively worse than, or at best equal to, much better cards is apparently not something that should change your perspective.
That, to me, is short-sighted and, especially for those on a budget, just not smart at all. Mobile is on a different plane altogether, but even there: what did he pay for that laptop, and how long will it really last gaming proper? Be real. Most of these devices are dead in 4 years. A solid dGPU lasts 8+.
Just look at this chart. The 4060 ain't leading, AND it will have no resale value 3 years from now, while better cards with similar perf/$ will. You don't need to be a scientist to figure this out; it is penny wise, pound foolish.
Ofc. But we all know this 5060 won't cost $200. Or $250. It will cost more.
Also, I know Nvidia doesn't need to lower its current prices because the performance-per-$ list has pretty much flat-lined (if you disregard the Intel GPUs, which are actually really cool; happy to see them. Hope they keep it up and don't stomp the project into the ground/kill it.)
/s
As for 8GB... it still has a place, but not on any product at $300 or more. They should just hold off on releasing a 128-bit 5060 with 8GB until higher-density GDDR7 is available and they can make it at least 12GB.
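The bus-width/density arithmetic behind that point can be sketched quickly. Each GDDR memory chip sits on a 32-bit slice of the bus, so a 128-bit card takes four chips, and total capacity is just chips times chip size. (This is a simplified sketch ignoring clamshell configurations; the 2 GB and 3 GB figures correspond to the 16 Gbit and announced 24 Gbit GDDR7 densities.)

```python
def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    """Total VRAM for one chip per 32-bit channel (no clamshell)."""
    chips = bus_width_bits // 32
    return chips * chip_gb

print(vram_gb(128, 2))  # 128-bit bus, 2 GB chips -> 8 GB
print(vram_gb(128, 3))  # same bus, 3 GB chips -> 12 GB
```

This is why a 128-bit 5060 is stuck at 8 GB until 3 GB GDDR7 chips ship: with the denser modules, the exact same bus yields 12 GB.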
8GB on a 5060 is a god-awful joke though. Newer games utilize larger frame buffers, and 1440p monitors are very cheap now, where 8GB is absolutely not enough. The weak-ass 4060 could already fill that up, so I'm assuming the 5060 will be further bottlenecked. Buying a slower GPU is better than one that occasionally maxes out its frame buffer, because the end result is unpredictable.
And you can bet tons of people will be buying this GPU solely for this reason.