
Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
Excellent delivery and a perfect proof!

Just check if any game consumes 20+ GB while staying playable (1% lows strictly above 30 FPS and an average strictly above 45 FPS) on a 4090.
...there's no such game.

Post 159 in this thread...
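For anyone who wants to actually run that check, here is a minimal sketch (assuming a single Nvidia GPU and nvidia-smi on the PATH) that logs VRAM usage once a second while the game runs - keep in mind it reports allocated VRAM, which is not quite the same thing as what the game strictly needs:

Code:
import subprocess
import time

# Polls nvidia-smi once a second and prints dedicated VRAM usage.
# Assumes a single Nvidia GPU and nvidia-smi available on the PATH.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    used, total = (int(x) for x in out.split(","))
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}%)")
    time.sleep(1)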
 
The new changes to DirectX have some news sites talking about gaming PCs following the current consoles, relying more and more on VRAM and using less system RAM.
Low-end systems won't be affected at all, since they use system RAM to top off the VRAM anyway - but the newer approach does take some strain off the CPU.


The argument with no answer was comparing RAM and VRAM for bandwidth and latency and the balance between them: the pros and cons of a system with only one type (like the consoles), and how it's theoretically possible for a PC with almost no RAM, if it could boot, to use VRAM the way consoles do - we already have iGPUs with no VRAM that use system RAM, so we've got real-world examples of both methods being done.

There's a never-ending push to move more and more processing over to the GPU, and now into VRAM, and it's curious to think where it will stop.

This reminds me of how much I wanted one of those Subor Z+ APU boxes.
I'm not sure how, but it was a (normal?) Windows PC that used VRAM for both system memory and graphics.

I have actually wondered if I could use some of the HBM2 VRAM on my 16GB Vega as 'system memory'. As-is, only the opposite is supported (HBCC).
 
A bit of a side question: how well have graphics cards like the Radeon VII and the various professional 16GB models, and especially the 32GB and 2x32GB ones, aged? How do they feel in use nowadays?
 
All I can learn from that is that you had 18 FPS, which is way below 30. Play this game on settings which drain 20 to 23 GB of VRAM, and if it never drops below 30 FPS, I will add The Witcher III as the first game to be fully optimised for VRAM-abundant gaming GPUs.

My point is that the average game requires ~8 GB at 1080p, ~10 GB at 1440p, and ~15 GB at 2160p. 8K isn't considered because it's too rare.

This means pushing beyond 16 GB of VRAM is feasible in terms of getting more room for overclocking, since it lowers the VRAM usage percentage, but... making the GPUs faster is the highest priority.

That being said, the 6950 XT is balanced, the RX 6800 has too much VRAM for its slow core, the RTX 3070 Ti is severely capped by its 8 GB, and the RTX 4090 might lose almost nothing if cut to 20 GB while gaining more from a more complex GPU with higher IOPS and FLOPS counts.
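As a rough sanity check on why VRAM needs don't track resolution alone: the screen-sized render targets themselves are only a small slice of those totals, and most of the growth comes from higher-detail assets being streamed in. A back-of-envelope sketch (the ~40 bytes per pixel across all screen-sized buffers is an assumed figure for a modern deferred renderer, not a measurement):

Code:
# Rough VRAM taken by screen-sized buffers alone at each resolution.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
    "8K": (7680, 4320),
}
BYTES_PER_PIXEL = 40  # assumed total across colour, depth, G-buffer, post FX

for name, (w, h) in RESOLUTIONS.items():
    gib = w * h * BYTES_PER_PIXEL / 1024**3
    print(f"{name:>6}: ~{gib:.2f} GiB of screen-sized buffers")

Even at 8K that is only a little over 1 GiB, so the jump from ~8 GB to ~15 GB is mostly asset data, not the framebuffer.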
 
A bit of a side question: how well have graphics cards like the Radeon VII and the various professional 16GB models, and especially the 32GB and 2x32GB ones, aged? How do they feel in use nowadays?

I went from:
R9 290X 4GB (512-bit GDDR5) to RX580 8GB (256-bit GDDR5) to 6500XT 4GB (64-bit GDDR6) (it was quantifiably slightly faster than the 580, BTW), then to a Vega 64 8GB (2048-bit HBM2),
and currently am on a Radeon Instinct MI25 flashed into a WX9100 16GB (2048-bit HBM2).

The Vega 64 and 'WX9100' both made games that were FPS-wise playable but didn't feel smooth, actually feel smooth. (I run a 144 Hz 1080p VA LCD.)
They have "aged" better than most hardware I've bought, approaching the longevity of my i7-3770K (or an X79 build). Also, I swear AA is less penalizing w/ HBM2 (but that may well be a projection of expectations).
-I absolutely love my HBM2 cards :love:
If there weren't higher priorities for my meager discretionary spending, I'd 'stock up' on $75-100 MI25s, re-flash them, 'bin' them, and hang onto them as collectables. IMO, "HBM" on consumer GPUs was a 'thing' that will never happen again. (separate topic: but I feel the same about Optane as it's liquidated and retired. I have 'stocked up' on "16GB" M10 drives :D)

There is a sad part, though: all Vega 20-based cards (especially the Radeon VII) are pre-destined to kill themselves. (Think: the early-run Xbox 360 RRoD cause, taken to another level.)
The only saving grace would be an owner who is aware of Vega 20's thermal warping and underclocks+undervolts the card.
Pro cards are often in thermally disadvantaged environments, so even the cooler, lower-clocked Apple and Pro Vega 20 products have a decent chance of eventually breaking themselves. Even water cooling will not save Vega 20; VIIs under water have also had endemic-level problems with the GPU itself warping away from the PCB. This problem isn't typical "cold solder joints": the arrangement of the BGA contacts and the thickness of the GPU's packaging, along with Vega's current and thermal transients, lend towards warping.
^TBQH, I am deeply concerned that this isn't going to end up isolated to Vega 20. Other than the details of the GPU chip's construction, the newest cards from Nvidia and AMD both seem at risk of having this happen 3, 5, 7 years down the line.
 
All I can learn from that is that you had 18 FPS, which is way below 30. Play this game on settings which drain 20 to 23 GB of VRAM, and if it never drops below 30 FPS, I will add The Witcher III as the first game to be fully optimised for VRAM-abundant gaming GPUs.

My point is that the average game requires ~8 GB at 1080p, ~10 GB at 1440p, and ~15 GB at 2160p. 8K isn't considered because it's too rare.

This means pushing beyond 16 GB of VRAM is feasible in terms of getting more room for overclocking, since it lowers the VRAM usage percentage, but... making the GPUs faster is the highest priority.

That being said, the 6950 XT is balanced, the RX 6800 has too much VRAM for its slow core, the RTX 3070 Ti is severely capped by its 8 GB, and the RTX 4090 might lose almost nothing if cut to 20 GB while gaining more from a more complex GPU with higher IOPS and FLOPS counts.

I'm going to have to assume that you are new to reading OSDs. "Mem" is the VRAM, and it is 100% maxed out. Then if you look at the frametime graph, you can see that it is extremely thick, due to the VRAM swapping taking place... and yes, this level of VRAM swapping will always drastically reduce the FPS as well...

But just to prove how rather ****** your statement was, I lowered texture quality, and lo and behold...

[image: q88TgRW.jpg]


As for the 6800 having too much VRAM... a GPU can't have "too much VRAM". That's like saying a car has too big a gas tank. At a certain point it might not NEED to be bigger, but it being bigger doesn't have any downsides. And case in point: high quality textures use a whole lot of VRAM while using barely any GPU power, which in other words enables the 6800 to continue using the nice textures, while its Nvidia counterparts have to lower them to garbage quality. The 6800 will age very well precisely because of its VRAM...
 
too much vram". That's like saying a car has a too big gas tank.
The devil's in the details, diminishing returns, etc...

I agree with your overall sentiments, but I have a few experiences that counter it.

Anyone else remember the "256MB 64-bit bus" cards in the early DX9 era? Also the multi-GB DDR2 equipped entry-level DX9-11 cards?

Which leads to the reason I sought out an HBM GPU: VRAM can become useless with time if it simply cannot I/O 'fast enough'.
Another example: the lack of VRAM compression on the FX 5200 and 5500 is what really crippled those cards in DX9. While loud, inefficient, etc., the FX 5700 and friends could perform well in early DX9.
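To put rough numbers on 'fast enough': peak VRAM bandwidth is just bus width times per-pin data rate, divided by eight. A quick sketch using ballpark data rates for the cards mentioned earlier in the thread (the Gbps figures are approximate, not exact specs):

Code:
# Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8.
# Data rates below are approximate figures, not official specs.
cards = [
    ("R9 290X (512-bit GDDR5)",     512, 5.0),
    ("RX 580 (256-bit GDDR5)",      256, 8.0),
    ("RX 6500 XT (64-bit GDDR6)",    64, 18.0),
    ("Vega 64 (2048-bit HBM2)",    2048, 1.89),
]
for name, bus_bits, gbps in cards:
    print(f"{name:<28} ~{bus_bits * gbps / 8:.0f} GB/s")

Which is the point: a huge VRAM pool behind a 64-bit bus can still be starved, while a modest pool on a 2048-bit HBM2 interface is not.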
 

Yeah, in the early days of the GPU industry there were a lot of "goofy" cards. But that's mostly a thing of the distant past at this point.

I'd say it depends on what you are filling the VRAM with - textures don't really need that much bandwidth (or processing, for that matter), so you can get away with a fairly narrow bus width and it will still be OK. You'd really have to have an extreme amount of VRAM and textures stored in it before it would begin to pose an issue. For example, I reckon an RTX 3050 would have been able to use ultra HD texture packs just fine (like in Far Cry) if it had had 16GB of VRAM rather than 8GB.
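A rough sketch of why textures eat VRAM without eating GPU power: the cost is almost entirely storage. The bytes-per-texel figures below are the standard block-compression rates (BC1 = 0.5, BC7 = 1, uncompressed RGBA8 = 4), a full mip chain adds about a third, and the 500-texture count is just an assumed example:

Code:
# Approximate VRAM footprint of a set of 4096x4096 textures.
TEXELS = 4096 * 4096
MIP_OVERHEAD = 4 / 3      # full mip chain adds ~33%
NUM_TEXTURES = 500        # assumed asset count, for illustration only

formats = {"BC1": 0.5, "BC7": 1.0, "RGBA8 (uncompressed)": 4.0}
for fmt, bytes_per_texel in formats.items():
    gib = TEXELS * bytes_per_texel * MIP_OVERHEAD * NUM_TEXTURES / 1024**3
    print(f"{fmt:<22} ~{gib:.1f} GiB for {NUM_TEXTURES} 4K textures")

Sampling those textures costs the shader core roughly the same whether they are 1K or 4K; only the storage (and the streaming traffic) grows.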

But Nvidia has always been doing these "just barely enough" VRAM cards to force people to upgrade more regularly. Most of their real shenanigans started with Kepler, though.
 
Not always; in the good old times, when they were competing with 3dfx, they used to do the reverse: give you weak cards with lots of cheap, slow RAM that was impossible to use at the time. I know, I fell for that. Of course, textures didn't exist in the modern sense; everything was a pattern or a color repeated ad nauseam.
But nowadays high quality textures are the cheapest way, performance-wise, to make games look good, so skimping on VRAM is a really ugly move from Nvidia, especially since they have very tight links with game developers and they must know how much existing low-VRAM GPUs are holding back game developers.
 

Yeah, as said, they really started with all their shenanigans in the Kepler days - VRAM amongst many other things.
 
Tested games on a Radeon RX 7900 XTX 24 GB:
Spider-Man, max settings, RT on, 8K + FSR Quality - avg. 30 FPS - avg. VRAM usage 20 GB.
Uncharted 4, max settings, 8K, no FSR - avg. 30 FPS - avg. VRAM usage 17 GB. The graphics quality at these settings is incredible in both titles.
Will you REALLY play at 8K... in 2023? I say yes to 4K for sure, but 8K... let alone those nasty tricks from the big boys lol
 
Sorry, no. Better get something with 64GB of VRAM.

You will need it to make the upcoming port of The Last of Us Part 2 playable beyond potato settings.
Maybe even 128GB of VRAM just to get to The Last of Us Part 2 launch menu

Luckily for us, Nvidia has released its new 11 kV cable for these GPUs
[image: 0saQ5WnisYv-iaL4nR6AQhvxHa2eREmXT12JrG4WlZg.jpg]


And ATX 4.0 / PCIe 6.0 PSUs are starting to come out

[image: nuclear-power-station-with-steaming-cooling-towers-and-canola-field.jpg]
 

Funny, except The Last of Us is far from as demanding as the interwebz mob would have you believe. And yes, this is at 8K...

[image: YRs4fRV.jpg]


But it's the same thing every time - once the hate train is rolling, no amount of facts will make people sway from their narrative of "THIS GAME IS TERRIBLE!!!!!!1111".
 
Maybe even 128GB of VRAM just to get to The Last of Us Part 2 launch menu

Luckily for us, Nvidia has released its new 11 kV cable for these GPUs
[image: 0saQ5WnisYv-iaL4nR6AQhvxHa2eREmXT12JrG4WlZg.jpg]


And ATX 4.0 / PCIe 6.0 PSUs are starting to come out

[image: nuclear-power-station-with-steaming-cooling-towers-and-canola-field.jpg]
All I see are pictures alluding to what should have been. :p

Where are my nuclear nonproliferable microreactors, dammit!

IIRC, an Alaskan city was supposed to get a Lockheed Martin unit, but between Bush and Obama, the project was scrapped.
 
Funny, except The Last of Us is far from as demanding as the interwebz mob would have you believe. And yes, this is at 8K...

[image: YRs4fRV.jpg]


But it's the same thing every time - once the hate train is rolling, no amount of facts will make people sway from their narrative of "THIS GAME IS TERRIBLE!!!!!!1111".
Sure, it's not demanding at all because it runs on a 4090. We get it. If anyone has any issue with a game, they only need to throw some money at it. Problem solved.
 

If you use your brain just a bit... just a tiny wee bit... when I'm running 8K, then obviously you need A LOT less GPU power and VRAM to run at something like 1440p or 1080p...

How about you watch @Frag_Maniac's video, running the game on a GTX 1080 with high settings, at 60 FPS...


 
Funny, except The Last of Us is far from as demanding as the interwebz mob would have you believe
I've read the TechSpot reviews and how the hardware demands drop significantly at high and medium settings.
And yes, this is at 8K...
ok :wtf:

The game itself is not really my cup of tea, but people can enjoy it at 720p, 1080p, 1440p, 4K, 8K, and every resolution in between.
 

Indeed - the whole debacle stems from people being allowed to skip the shader compile, and a lot of people expecting to always run ultra settings regardless of their hardware.

Don't skip the shader compile, use reasonable settings, and the game's hardware requirements are actually quite... reasonable.
 
Sure, it's not demanding at all because it runs on a 4090. We get it. If anyone has any issue with a game, they only need to throw some money at it. Problem solved.
FYI, it's one of the most demanding games out there, especially when it comes to the CPU. It easily beats Cyberpunk with RT on, and that says a lot.

Also, the VRAM usage in TLOU is absolute garbage. A Plague Tale looked better and required 4.5GB at 4K native ultra. LOL!?
 

Eh, no it ain't. I can get quite a bit higher FPS when lowering the resolution to a CPU-limited scenario in TLOU than I can in Cyberpunk with RT on... and many games are substantially more GPU-intensive than TLOU.
 
I'm kinda in the same boat.

Still running a Vega 64 XTX with its 8GB of HBM2. I've found I've had to enable HBCC to get newer games to run smoothly. I usually use 12-14GB for the HBCC cache. Works fine.

So, I'll have to vote "Yes" on this one. lol
 
Eh, no it ain't. I can get quite a bit higher FPS when lowering the resolution to a CPU-limited scenario in TLOU than I can in Cyberpunk with RT on...
Yes it is.


Look at the power draw: not even Cyberpunk pushes the CPU to that high a usage.
 
assume that you are new to reading OSDs
Wrong. I can see you ran out of VRAM, so that framerate can't be taken seriously, and that's why I asked you for a screenshot with ~90% VRAM usage, so it would be comprehensible.
But just to prove how rather ****** your statement was,
Congratulations. I'd agree with you if the situation were similar in other games; alas, they drain the performance way earlier than they eat the VRAM up. Frame generation with DLSS 3 enabled is rather questionable, since some input lag still remains.
which in other words enables the 6800 to continue using the nice textures
...paired with low FPS, whatever quality the textures are. Its GPU is very slow for the games which demand over 12 GB of VRAM. It's the absolute minority of projects which scale with VRAM amount first.
while its Nvidia counterparts have to lower them to garbage quality
That's because 12 GB is optimal for cards of this speed. A 4 GB deficit on the green side and 4 GB sitting idle on the red side. Absolute harmony.
The 6800 will age very well precisely because of its VRAM.
As a 6700 XT user, I already face the issue of running out of steam way before my "funny" 12 GB gets shredded. RDR2: 45 FPS, 7 GB usage. Cyberpunk: 45 FPS, 6.5 GB usage. GTA V: 45 FPS, 7.5 GB usage. It can by no means stay playable with 12 gigs maxed out. 45 FPS is already on the verge; even one frame lower and you're calling it a freak show of lag and stutter. The 6800 only has 1.5x the die complexity and runs at ~15% lower clocks no matter what, which gives us a ~30 percent theoretical boost. So if we want to play rock-solid 60 FPS, our VRAM usage will never exceed 11 or maybe 12 gigs on an RX 6800. The 6950 XT, yes, it handles 16 gigs and might be capable of 18-gig scenarios (alas, we don't know for sure since it only has 16 onboard), but the 6800 is too slow.

And just for your information, I have never said it's bad to have more VRAM than the GPU can actually handle. I only stated that it's meaningless to only improve the VRAM; the GPU is the main bottleneck in virtually every scenario. In yours, too. 48 FPS is too little to be taken seriously. 60 is the lowest number we can actually call business.
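For what it's worth, the arithmetic behind that 6700 XT to RX 6800 estimate, as a minimal sketch (the 1.5x compute ratio and ~15% clock deficit are the figures quoted above, not benchmark results):

Code:
# Rough theoretical scaling from a 6700 XT to an RX 6800, using the
# assumed ratios from the post above rather than measured data.
compute_ratio = 1.5    # ~1.5x the die complexity / CU count
clock_ratio = 0.85     # ~15% lower clocks

boost = compute_ratio * clock_ratio   # ~1.28, i.e. roughly +30%
for title, fps_6700xt in [("RDR2", 45), ("Cyberpunk", 45), ("GTA V", 45)]:
    print(f"{title}: ~{fps_6700xt * boost:.0f} FPS estimated on an RX 6800")

Which lands around 57 FPS - close to, but not comfortably over, the 60 FPS bar the post is using.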
 
Yes it is.


Look at the power draw: not even Cyberpunk pushes the CPU to that high a usage.

...... -_-

The usage DOES NOT MATTER - some games will bottleneck on a single thread, others will use as much as they can. But the FPS at which you get CPU-bottlenecked is what matters. You were running 130-170 FPS there. Do a video of yourself driving around the city in Cyberpunk with RT enabled, and show us if the FPS is higher than that...
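The test being described works because frame rate is capped by whichever of the CPU or GPU takes longer per frame. A toy model of it (the frame times below are made-up numbers for illustration only):

Code:
# Toy model: FPS is limited by the slower of CPU and GPU per-frame work.
# Lowering resolution only shrinks the GPU time, so once you are CPU-bound
# the FPS stops rising - that plateau is the number that matters.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 6.5  # assumed CPU frame time (~154 FPS ceiling)
for res, gpu_ms in [("4K", 12.0), ("1440p", 6.0), ("1080p", 3.5)]:
    print(f"{res:>6}: {fps(cpu_ms, gpu_ms):.0f} FPS")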
 