Tuesday, December 31st 2024

NVIDIA RTX 5000 Blackwell Memory Amounts Confirmed by Pre-Built PC Maker

By now, it's a surprise to almost nobody that NVIDIA plans to launch its next-generation RTX 5000-series "Blackwell" gaming graphics cards at the upcoming CES 2025 event in Las Vegas in early January. Previous leaks and rumors gave us a full run-down of expected VRAM amounts and other specifications and features for the new GPUs, but these have yet to be confirmed by NVIDIA, for obvious reasons. Now, though, it looks as though iBuyPower has jumped the gun, revealing the new specifications for its updated line-up of pre-built gaming PCs with RTX 5000-series GPUs ahead of NVIDIA's official announcement. The offending product pages have since been removed, but they give us confirmation both of the previously leaked VRAM amounts and of the expected release cadence for RTX 5000, which will reportedly see the RTX 5070 Ti and RTX 5080 launch before the RTX 5090 flagship.

On iBuyPower's now-pulled pages, the NVIDIA GeForce RTX 5080 16 GB and GeForce RTX 5070 Ti 16 GB can be seen as the GPUs powering two different upcoming Y40 pre-built gaming PCs from the system integrator. The VRAM specifications here coincide with what we have previously seen from other leaks. Unfortunately, while an archived version of the page for the pre-built containing the RTX 5080 appears to show the design of an ASUS TUF Gaming RTX 5080 with a triple-fan cooler, it looks like iBuyPower is using the same renders for both the RTX 5080 and RTX 5070 Ti versions of the pre-built PCs. What's also interesting is that iBuyPower looks to be pairing the next-gen GPUs with 7000-series AMD X3D CPUs, as opposed to the newly released AMD Ryzen 9000 X3D chips that have started making their way out into the market.
Sources: iBuyPower (via Archive.org), iBuyPower (via Archive.org)

63 Comments on NVIDIA RTX 5000 Blackwell Memory Amounts Confirmed by Pre-Built PC Maker

#26
3valatzy
AusWolfI'm gonna get a 9070 XT. I'm on Linux, and AMD makes my life easier there with the kernel-integrated drivers. Not to mention, 7900 GRE/XT-level performance with upgraded RT, 200-300 W power draw and 16 GB VRAM for ~500 is the sweet spot. I don't need anything more expensive or beefier.
Clever plan, but you can also consider the lower tier cards - RX 9600, RX 9600 XT, or RX 9700.
Posted on Reply
#27
AusWolf
3valatzyClever plan, but you can also consider the lower tier cards - RX 9600, RX 9600 XT, or RX 9700.
I could, but with a 6750 XT, they wouldn't be so much of an upgrade.

Also, I've bought too much PC hardware in recent years; I want to slow down a bit and plan something long-term for a change. :)
Posted on Reply
#28
freeagent
3valatzyeven a 3080 10 GB can't run it at max settings at 1080p.
Because they recommend a 3080 Ti, i.e. 12 GB cards.
Posted on Reply
#29
mate123
3valatzyCP 2077 Phantom Liberty, a 2023 game, uses 18.3 GB VRAM at 4K with PT and DLSS 3 frame gen, without texture mods from the community.
Alan Wake 2, a 2023 game: 17.8 GB.
Indiana Jones and the Great Circle has very high VRAM usage; even a 3080 10 GB can't run it at max settings at 1080p.
Avatar: Frontiers of Pandora with Unobtanium settings at 3440x1440 also uses 18-19 GB on a 4090.

For such expensive cards, 16 GB is outrageous and DOA.

Vote with your wallet.
VRAM allocation does not necessarily mean it really needs 18.3 GB of VRAM; I think for gaming alone it is gonna be fine.

They are not giving more than 16 GB VRAM because they don't want people running LLMs/AI art generators/AI training locally on these cards; those capacities are reserved for the Quadro-class professional cards, which cost I don't even want to know how much these days. They obviously want to sell those at a premium, but if they had 32-64 GB cards at reasonable prices, companies would buy them in bulk for AI training and we would be back to where we were when everyone was mining on GPUs in their basements...
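The local-AI angle is easy to sanity-check with back-of-the-envelope math: VRAM needed just to hold a model's weights is roughly parameter count times bytes per parameter. A minimal sketch (the model sizes and precision are illustrative assumptions, and this ignores KV cache, activations, and framework overhead, which all add more on top):

```python
# Rough estimate of VRAM (in GiB) needed just to store model weights.
# Ignores KV cache, activations, and runtime overhead.

def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GiB required to hold the weights alone."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model in fp16 (2 bytes per parameter):
print(round(weight_vram_gb(7, 2), 1))   # 13.0 -> squeezes into a 16 GB card
# A 13B-parameter model in fp16:
print(round(weight_vram_gb(13, 2), 1))  # 24.2 -> does not fit in 16 GB
```

This is why the 16 GB ceiling lands right at the boundary: small models fit, anything bigger pushes buyers toward the much pricier professional cards.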
Posted on Reply
#30
3valatzy
mate123VRAM allocation
No one talks about VRAM allocation. Enough with this inappropriate excuse.
mate123They are not giving more than 16 GB VRAM because they don't want people running LLMs/AI art generators/training AI locally on these cards
mate123but if they had 32-64 GB cards
There is a very substantial difference between the number 16 and the number 32, even more so between the number 16 and the number 64.
We know that they are not giving more because they are cutting corners, and because of greed.
Posted on Reply
#31
Prima.Vera
3valatzyCP 2077 Phantom Liberty, a 2023 game, uses 18.3 GB VRAM at 4K with PT and DLSS 3 frame gen, without texture mods from the community.
Alan Wake 2, a 2023 game: 17.8 GB.
Indiana Jones and the Great Circle has very high VRAM usage; even a 3080 10 GB can't run it at max settings at 1080p.
Avatar: Frontiers of Pandora with Unobtanium settings at 3440x1440 also uses 18-19 GB on a 4090.

For such expensive cards, 16 GB is outrageous and DOA.

Vote with your wallet.
I played ALL those games you mentioned at 3440x1440 with my 10 GB 3080, and I had absolutely ZERO issues maintaining 60+ fps at max details.
Again, stop being brainwashed by all those stupid YouTube videos and crappy articles. Just test the game yourself.
And again, for the millionth time, the game engines cache almost all the VRAM, but that doesn't mean the game uses it all.
Stop believing all this propaganda and just verify, if possible, by yourself.
Actually, TPU has very good benchmarks where all those VRAM-hungry games are tested. It's all here.
Posted on Reply
#32
nguyen
Prima.VeraI keep saying the same thing over and over.
The RTX 5080 should have been a 320-bit card with 20GB VRAM.
This 5080 with 256-bit/16GB scam pulled by nVidia is actually the real 5070, while the 5070 Ti, is actually a 5060 Ti for the price of a 5080.
Weird, since historically the xx80 cards usually have a 256-bit bus; the 3080 is more of an outlier (because RX 6000 was too close in performance, or Samsung 8 nm yields were just terrible, or both)

GTX 980 - 256-bit bus
GTX 1080 - 256-bit bus
RTX 2080/2080 Super - 256-bit bus
RTX 3080 - 320-bit bus
RTX 4080 - 256-bit bus
RTX 5080 - 256-bit bus
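The bus widths in that list map directly onto both bandwidth and capacity, which is why a 320-bit 5080 would have implied 20 GB. A quick sketch of the arithmetic (the 28 Gbps per-pin rate and 2 GB module size are assumptions for illustration, not confirmed RTX 5000 specs):

```python
# Relationship between memory bus width, peak bandwidth, and VRAM capacity.

def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

def capacity_gb(bus_width_bits: int, module_gb: int = 2) -> int:
    """Total VRAM when each 32-bit channel carries one memory module."""
    return (bus_width_bits // 32) * module_gb

print(bandwidth_gb_s(256, 28))  # 896.0 GB/s on a 256-bit bus at 28 Gbps
print(capacity_gb(256))         # 16 GB with 2 GB modules
print(capacity_gb(320))         # 20 GB -> a 320-bit card naturally gets 20 GB
```

In other words, with 2 GB GDDR7 modules the capacity is locked to the bus width unless NVIDIA doubles up modules in clamshell mode or uses higher-density chips.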
Posted on Reply
#33
mate123
3valatzyNo one talks about VRAM allocation. Enough with this inappropriate excuse.
VRAM allocation is exactly what we are talking about

You can clearly see that 8 GB is indeed not enough here anymore, but there is no difference between a 12 GB card and a 24 GB card, which means that while the game allocates 18 GB when available, it does not in fact need more than 12 GB
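The allocation-vs-usage distinction can be made concrete with a toy model (purely illustrative, with hypothetical numbers): an engine may opportunistically grab most of the available VRAM as a texture cache, while the working set it actually touches each frame is far smaller.

```python
# Toy model: "allocated" VRAM (cache the engine reserves) vs the working
# set it actually samples each frame. Numbers are hypothetical.

class TextureCache:
    def __init__(self, vram_gb: float, reserve_gb: float = 1.0):
        # Opportunistically claim almost all VRAM as cache space.
        self.allocated_gb = vram_gb - reserve_gb
        self.resident = {}      # texture name -> size in GB, kept in cache
        self.touched = set()    # textures actually sampled this frame

    def load(self, name: str, size_gb: float) -> None:
        self.resident[name] = size_gb

    def sample(self, name: str) -> None:
        self.touched.add(name)

    def working_set_gb(self) -> float:
        return sum(self.resident[n] for n in self.touched)

cache = TextureCache(vram_gb=24)      # e.g. a 24 GB card
for i in range(18):
    cache.load(f"tex{i}", 1.0)        # the engine fills 18 GB of cache
for i in range(11):
    cache.sample(f"tex{i}")           # but only 11 GB is used per frame

print(cache.allocated_gb)      # 23.0 -> what monitoring tools report
print(cache.working_set_gb())  # 11.0 -> what the game actually needs
```

A tool like Afterburner sees the 23 GB allocation; performance only suffers when the per-frame working set itself exceeds the card's VRAM.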
Posted on Reply
#34
docnorth
This could drive more users to the 5070 Ti than the 5080, depending of course on the price. Even with GDDR7, 16 GB seems more 'appropriate' for the 5070 Ti.
Posted on Reply
#35
Zazigalka
Prima.Verathe 5070 Ti, is actually a 5060 Ti for the price of a 5080.
a 5060 Ti card on a 300 W+ GB103 die? Your thinking is really amusingly naive. Or maybe you've seen too much clickbait on YouTube telling you a 4080 was really a 4060 Ti.
Was the 7900 XTX supposed to be a 7600 XT then?
Posted on Reply
#36
Prima.Vera
Zazigalkaa 5060 Ti card on a 300 W+ GB103 die? Your thinking is really amusingly naive. Or maybe you've seen too much clickbait on YouTube telling you a 4080 was really a 4060 Ti.
Was the 7900 XTX supposed to be a 7600 XT then?
I'm not watching YouTube junk videos, but it seems you are, since you are so brainwashed.
It doesn't matter. You can make the leather jacket guy even richer, while I enjoy my games for another 2 or 3 years and spend my money on a nice holiday or something, instead of wasting it on some overpriced hardware.
Posted on Reply
#37
3valatzy
Zazigalkaa 5060 Ti card on a 300 W+ GB103 die? Your thinking is really amusingly naive. Or maybe you've seen too much clickbait on YouTube telling you a 4080 was really a 4060 Ti.
Was the 7900 XTX supposed to be a 7600 XT then?
The RX 7900 XTX is indeed a real RX 7800 XTX in the best case. In the worst case it trades blows with the RTX 4070 and RTX 4070 Ti, so calling the "4080" a 4060 Ti is plausible.
Forgot the 4080 12 GB so fast?
Posted on Reply
#38
Legacy-ZA
Geeze Jensen,

You could at least give your 5070 Ti Super 20 GB, the same for the 5080 Ti, and 24 GB for the 5080 Ti Super. You know it takes a lot for me to start hating someone, but you, Jensen, I hate a lot.
Posted on Reply
#39
AusWolf
Legacy-ZAGeeze Jensen,

You could at least give your 5070 Ti Super 20 GB, the same for the 5080 Ti, and 24 GB for the 5080 Ti Super. You know it takes a lot for me to start hating someone, but you, Jensen, I hate a lot.
The message is clear: you either buy a 5090, or you're a barefoot peasant and should suffer the consequences of your choice. I'm not a fan of any company, but this makes Nvidia quite unsympathetic in my eyes. I'm not saying that 16 GB isn't enough for mainstream gaming, but if you fork out the cash for a 5080, you should be getting a tad more for a bit of longevity, imo.
Posted on Reply
#40
Knight47
3valatzyCP 2077 Phantom Liberty, a 2023 game, uses 18.3 GB VRAM at 4K with PT and DLSS 3 frame gen, without texture mods from the community.
Alan Wake 2, a 2023 game: 17.8 GB.
Indiana Jones and the Great Circle has very high VRAM usage; even a 3080 10 GB can't run it at max settings at 1080p.
Avatar: Frontiers of Pandora with Unobtanium settings at 3440x1440 also uses 18-19 GB on a 4090.

For such expensive cards, 16 GB is outrageous and DOA.

Vote with your wallet.
I choose you, 5060 Ti 16 GB! Is my vote valid?
Posted on Reply
#41
Why_Me
A 5080 20GB would make sense.
Posted on Reply
#42
eldon_magi
AusWolfThe message is clear: you either buy a 5090, or you're a barefoot peasant and should suffer the consequences of your choice.
I agree.
Posted on Reply
#43
dismuter
I would have hoped for a bit more critical thinking here. Both have "Super" suffixes, so they're just as likely to be typos for 4070 Ti Super and 4080 Super, which means that nothing has been confirmed.
Posted on Reply
#45
freeagent
eldon_magii want a nvidia 5000 turbo
You're gonna pay the big dollas for it :D
Posted on Reply
#46
cerulliber
Well, that's terrible news, but as expected. I can barely fit Star Wars Outlaws with secret settings at 2K in 16 GB of VRAM.
Via the DRAMeXchange website, prices are sub-$3 per 8 Gb.
I also found that "With current-gen GDDR6 expected to get cheaper by 8-13%, next-gen GDDR7 is likely to become cheaper by 5% at most. But don't expect graphics card makers like Nvidia and AMD to pass that benefit on to consumers, for two reasons: they (Nvidia) want to keep their high profit margins, and they have likely signed long-term deals with VRAM makers. This means price decreases of graphics cards will only follow the usual trends, nothing unusual. The reason for all this is simple: lower demand for computers, mobiles, notebooks and datacenters worldwide. This has ensured that prices go down everywhere."
conclusion: vote with your wallet 2025 edition
Posted on Reply
#47
oxrufiioxo
AusWolfThe message is clear: you either buy a 5090, or you're a barefoot peasant and should suffer the consequences of your choice. I'm not a fan of any company, but this makes Nvidia quite unsympathetic in my eyes. I'm not saying that 16 GB isn't enough for mainstream gaming, but if you fork out the cash for a 5080, you should be getting a tad more for a bit of longevity, imo.
I get it, though; Nvidia does not want companies using GeForce cards for professional work, and instead wants them to buy insanely high-margin professional cards. It just sucks that gamers get the short end of the stick over it. A lack of any real competition is probably a factor too, but a really distant second.
Posted on Reply
#48
JustBenching
Prima.VeraIt's not about charity, it's about playing fair with the group that made them who they are today. Renaming an x070 card to an x080 is beyond callous, especially when they doubled the price in just a couple of years.
The problem is that almost nobody calls them out. The so-called tech influencers keep praising the company (naturally, because of the big paychecks they receive for good reviews), while unbiased tech sites don't emphasise those practices enough, afraid they will be cut off from free review samples or something...
How the heck is the 4080 just a 4070 when the 4080 is just as fast as the 7900 XTX in raster and faster in RT? That doesn't make sense, man.
Posted on Reply
#49
oxrufiioxo
JustBenchingHow the heck is the 4080 just a 4070 when the 4080 is just as fast as the 7900 XTX in raster and faster in RT? That doesn't make sense, man.
Cuz the 7900 XTX is the real RX 6800 replacement, duh... even though it uses a ton of silicon...

Honestly, the only issue with the 4080 was its price... Even 50% more expensive than the 3080 would have been a massive improvement over the 70% it ended up being. It did allow the 7900 XTX, the real 6800 successor, to be priced like a 6900 XT though, lol...
Posted on Reply
#50
AusWolf
oxrufiioxoI get it, though; Nvidia does not want companies using GeForce cards for professional work, and instead wants them to buy insanely high-margin professional cards. It just sucks that gamers get the short end of the stick over it. A lack of any real competition is probably a factor too, but a really distant second.
What professional cards? Quadro is dead. The x90 GeForce cards are the professional cards now.

This is why Nvidia has such a high gaming GPU market share. All of their GPUs are gaming GPUs now. Technically...
Posted on Reply