
Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
Tested games on Radeon RX 7900 XTX 24 GB:
Spider-Man, max settings, RT on, 8K + FSR Quality: avg. 30 FPS, avg. VRAM usage 20 GB.
Uncharted 4, max settings, 8K, no FSR: avg. 30 FPS, avg. VRAM usage 17 GB. Graphics quality at these settings is incredible in both titles.
 
i got 16 GB at 1080p. XD most games use around 9-10 GB of VRAM

i think 24 GB for 8K is on the low side
 
24 GB is about the right size to target ultra-quality 4K with raytracing plus the general requirements of applications on high-resolution monitors. If it weren't needed, these GPUs wouldn't have it.
 
Not really, it's an enthusiast thing; 24 GB is only needed at the highest resolutions, and there the graphics cards run out of steam anyway unless they use trickery (DLSS/FSR).
 
Nope, gimme a 1TB GPU, or gimme death, like, yesterday......hehehehe :)
 
The options are too black and white. Like Dr. Dro said, 24 GB is absolutely necessary if you're targeting super-high-res with maxed graphics, but 1440p only needs about 16 GB to max graphics in most games, and 1080p only needs ~10-12 GB. Different VRAM amounts for different use cases. Once again, it comes down to knowing how you're going to use your system.
 
[attached image: Forspoken VRAM usage chart]

for this rpg pc, yes it does.....
 
Maybe for 4K only. But in general, 16 GB of VRAM is enough.
 
For 4K, it's still overkill, but game developers have gotten used to dumping textures into the GPU's VRAM, and as textures get larger you are seeing more and more usage.
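As a rough sanity check on how resolution alone scales memory, here is a back-of-envelope sketch; the bytes-per-pixel layout (HDR color, depth, a four-target G-buffer) is an illustrative assumption, not any particular engine's real configuration:

```python
# Hypothetical render-target layout (assumed, not from any real game):
# RGBA16F HDR color (8 B/px) + 32-bit depth (4 B/px) + four G-buffer
# targets at 4 B/px each = 28 bytes per pixel.
BYTES_PER_PIXEL = 8 + 4 + 4 * 4

def render_targets_gib(width, height):
    """Estimated render-target memory at a given resolution, in GiB."""
    return width * height * BYTES_PER_PIXEL / 2**30

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    print(f"{name}: ~{render_targets_gib(w, h):.2f} GiB")
```

Even under these assumptions, render targets at 8K only come to about 0.9 GiB, so the bulk of the 17-20 GB figures in the opening post would be texture and asset pools, which is exactly the "dumping textures into VRAM" pattern described above.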
 
Either way, the next generation of graphics cards will arrive with GDDR7 chips that will likely have double the capacity of the latest and greatest GDDR6/X chips used in consumer graphics cards right now. So we should see a doubling of VRAM in graphics cards, starting with the launch of the next generation in late 2024.
 
24GB means either 8+8+8 or 16+8, both of which are highly unrecommended, as only 67% of your RAM will run in dual-channel mode.

Go for 32, 48 or 64.

In fact, I've had 64 for many years now.
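For what it's worth, the 67% figure in the post above does check out for system RAM, assuming a two-channel platform with flex-mode interleaving (the capacity matched across both channels runs dual-channel, the remainder runs single-channel); a quick sketch:

```python
def dual_channel_fraction(channel_a_gb, channel_b_gb):
    """Fraction of total RAM that runs interleaved across both channels."""
    total = channel_a_gb + channel_b_gb
    interleaved = 2 * min(channel_a_gb, channel_b_gb)  # matched capacity
    return interleaved / total

# 16 GB + 8 GB: only the first 8 GB on each channel interleaves.
print(f"{dual_channel_fraction(16, 8):.0%} dual-channel")  # -> 67%
```

That said, the poll is about VRAM on the graphics card, not system memory, so the channel math doesn't really apply to the question.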

I'm not going to vote because I don't understand the poll question. It's not about "worth", it's about performance and rationale.
The OP's question is about VRAM size, not system RAM.
 
[attached image: Forspoken VRAM usage chart]
for this rpg pc, yes it does.....
this is for the umpteenth time, but: just because a game allocates a certain amount of vram (usually because of caching) doesn't mean that amount of vram is necessary for the game to perform well.
if you look at the fps numbers you'll realise that it isn't; if it were, the lower-vram models with, like, 8 GiB would be falling off a cliff.

if you look at the forspoken benches you'll realise that none of the gpus exhibit this kind of behaviour.

stop spreading fud.
 
More memory has always been better than less, that's for sure.
 
There is more than one game today that needs a big VRAM capacity. Usually, published game requirements only cover minimum and recommended specs (for 1080p30 and 1080p60), not ultra settings at the highest resolutions or VR. Yes, there are games whose 4K requirements are publicly known, but not all of them. And where are requirements for 8K gameplay ever published?
 
this is for the umpteenth time, but: just because a game allocates a certain amount of vram (usually because of caching) doesn't mean that amount of vram is necessary for the game to perform well.
if you look at the fps numbers you'll realise that it isn't; if it were, the lower-vram models with, like, 8 GiB would be falling off a cliff.

if you look at the forspoken benches you'll realise that none of the gpus exhibit this kind of behaviour.

stop spreading fud.

Forspoken is apparently underperforming on the 3060 Ti and 3070 against the 3060 12 GB, and exhibiting texture streaming problems when raytracing is activated and higher texture quality settings are enabled, even at 1080p. A few media outlets whose reviews I've read (notably Computerbase) and people I spoke to on Discord apparently ran into the problem. But given how heavy the game is, using raytracing on this class of hardware is probably a very bad idea. Luminous Studio has officially recommended a 12 GB GPU for the game, too.

I still say that the fierce rejection of the idea of needing high VRAM GPUs is very similar to the whole "you don't need more than 16 GB of RAM" holdout crew to this day: it's just the PC toaster race talking, the true face of "elitism". Yet... you'll find midrangers with 12 GB today. We're always evolving, even if some people will stick to their old guns until the bitter end.
 
Forspoken is apparently underperforming on the 3060 Ti and 3070 against the 3060 12 GB, and exhibiting texture streaming problems when raytracing is activated and higher texture quality settings are enabled, even at 1080p. A few media outlets (notably Computerbase) and people I spoke to apparently ran into the problem. But given how heavy the game is, using raytracing on this class of hardware is probably a very bad idea. Luminous Studio has officially recommended a 12 GB GPU for the game, too.

Nvidia is notorious for skimping on memory on all but the absolutely highest end cards though.

I still say that the fierce rejection of the idea of needing high VRAM GPUs is very similar to the whole "you don't need more than 16 GB of RAM" holdout crew to this day: it's just the PC toaster race talking, the true face of "elitism". Yet... you'll find midrangers with 12 GB today. We're always evolving, even if some people will stick to their old guns until the bitter end.

People are always in denial about this sort of stuff, memory requirements increase all the time.
 
Forspoken is apparently underperforming on the 3060 Ti and 3070 against the 3060 12 GB, and exhibiting texture streaming problems when raytracing is activated and higher texture quality settings are enabled, even at 1080p. A few media outlets whose reviews I've read (notably Computerbase) and people I spoke to on Discord apparently ran into the problem. But given how heavy the game is, using raytracing on this class of hardware is probably a very bad idea. Luminous Studio has officially recommended a 12 GB GPU for the game, too.

I still say that the fierce rejection of the idea of needing high VRAM GPUs is very similar to the whole "you don't need more than 16 GB of RAM" holdout crew to this day: it's just the PC toaster race talking, the true face of "elitism". Yet... you'll find midrangers with 12 GB today. We're always evolving, even if some people will stick to their old guns until the bitter end.
that is true of course (and tbf i didn't even look at rtx numbers, so yeah, mea culpa in that case), but it is usually quite obviously noticeable when you're actually running out of vram.

also, on the flip side, i'd argue that a game that does not allocate as much vram as the gpu has is inefficient: if there are resources you can use to cache more, you should.
but i digress.
 
Yes, the leather jacket fanbase that said in 2020 that the 3080 10GB would stay relevant for the next 3-4 years keeps saying that. I'm wondering if the 3080 10GB will be able to hold the front for nearly two more years, until September 17, 2024. Of course, I mean with the new games that will hit the market between now and then.
 
More memory has always been better than less, that's for sure.
Sure, if the GPU has the power to actually use it, which is not always the case.

It's a marketing number as much as anything is.

Arc has 16 GB of memory; does it perform better than a 10 GB 3080? I don't think so. How about an 8 GB 2080? Still no.
 
Sure, if the GPU has the power to actually use it, which is not always the case.

It doesn't matter, it's there. Better to have plenty than run out of it.

It's a marketing number as much as anything is.

It's a real metric, and surprise surprise, people do more than gaming on their video cards. I use my card for compute; you can bet those 20 GB are not just marketing numbers for me.
 
It doesn't matter, it's there. Better to have plenty than run out of it.
Of course it matters.

GDDR, especially GDDR6X, takes power, lots of it. That power generates heat and contributes significantly to the TDP of the card. It also costs money to add memory.

If it were as simple as you'd like to imply, every card would come with the maximum amount of memory its bus supports.
 
Hi,
I sure haven't run into a need for that much vmem = No.

But then again, I'm not blowing $$ on new games or new GPUs either.
I'm mostly just fiddling with freebies off Epic; it's about the best free-games site option there is. GOG, not so much; I've only run across one game there.
Steam sux freebies-wise.

Play on :cool:
 
If it were as simple as you'd like to imply, every card would come with the maximum amount of memory its bus supports.

If it has N memory controllers, then obviously that card was designed to have N memory chips attached. It's as simple as that. You think you know better than the people who design these things? You think they do it because of "marketing"?

Of course it matters.

GDDR, especially GDDR6X, takes power, lots of it. That power generates heat and contributes significantly to the TDP of the card. It also costs money to add memory.

Your logic is just bizarre, to say the least. Everything generates heat, a faster GPU for instance. You must think that's just marketing as well, right?
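The controller-count point can be made concrete with standard GDDR6 figures (each chip presents a 32-bit interface, and current 16 Gb density means 2 GB per package); a rough sketch:

```python
CHIP_BUS_BITS = 32   # each GDDR6 chip has a 32-bit interface
CHIP_GB = 2          # 2 GB per chip at current 16 Gb density

def max_capacity_gb(bus_width_bits, clamshell=False):
    """Capacity implied by a bus width; clamshell pairs two chips per channel."""
    chips = bus_width_bits // CHIP_BUS_BITS
    if clamshell:
        chips *= 2
    return chips * CHIP_GB

print(max_capacity_gb(384))                  # 384-bit card -> 24 GB
print(max_capacity_gb(384, clamshell=True))  # clamshell    -> 48 GB
```

This matches the 24 GB cards in the thread (the RX 7900 XTX and RTX 4090 both pair a 384-bit bus with twelve 2 GB chips), while workstation parts use clamshell mode to double the capacity on the same bus.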
 