Games still have to run on current consoles (which is why 8GB is still usable), but what everyone is talking about is 1440p and greater on PC, while base console resolution shrinks to 1080p or lower at under 60fps.
Add in the (currently) mostly PC-exclusive features, and 1440p and higher with those higher-end settings require more compute and more buffer in step.
FWIW, I haven't watched any of their vids in a while (because I've been worn out from trying to decipher all the 50-series bs, not bc of them), and last I recall they suggested 16GB; that's a fair opinion.
That has always been the argument. Wizard has been on the '8GB is enough' train forever (perhaps bc of the former rationale),
which is the joke. People just realize at different times how low the console (base) settings keep dropping.
I say it truly all depends, but when you factor in things like upscaling, RT, FG, GI, PT, RR, etc., 12GB becomes a limitation and the ideal sits closer to 18GB. That's all I'm trying to get across.
SWO is a perfect example of where we're going. That game runs at 720p on current consoles (8GB is in fact enough...for 720p) and will use 18GB with all the options turned on at 1440p.
I like using SWO; some like Hogwarts (which is >12GB at any rez w/ RT) at 4k native. It's pretty clear next-gen is aimed at 1440p->4k upscaled (non-RT) and 1080p->4k RT w/ 16GB or more, worst-case.
Really ~14.5-17.5GB, but who's that precise?
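If you want to see how I land in that ballpark, here's a minimal back-of-the-envelope sketch. Every number in it is an assumption I made up for illustration (not a measurement from SWO, Hogwarts, or anything else); the point is only how quickly features stack on top of a base 1440p footprint.

```python
# Hypothetical VRAM budget. Every number is an assumption made up for
# illustration, not a measurement from any real game.
budget_gb = {
    "base game @ 1440p (textures, geometry, render targets)": 9.0,
    "ray tracing (BVH + extra render targets)": 2.5,
    "frame generation (extra frame buffers)": 1.0,
    "upscaler intermediate buffers": 0.5,
    "high-res texture pack": 2.5,
    "OS / overlays / reserve": 1.0,
}

total = 0.0
for item, gb in budget_gb.items():
    total += gb
    print(f"{item:55s} {gb:4.1f} GB")
print(f"{'total':55s} {total:4.1f} GB")   # ~16.5 GB, right in that 14.5-17.5 window
```

Swap in your own numbers; the stacking is the point.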

Point is, we're headed to the >12-18GB range. Conceivably >16GB and then 18GB for the GPU, but probably not at first (bc that would ruin the player base). 16GB > 12GB is my point.
It's totally fine if you don't play demanding games or use those settings, but if you want to be prepared for the reality of base 1440p, or even 1080p RT upscaled to 1440p/4k, you want at least 16GB.
For the one millionth and 42nd time, this is why 12GB GPUs are a sham. I don't want people believing they'll be a good option for 1440p (or 1080p RT upscaled) going forward (in some/many instances).
It's the same way you shouldn't expect 16GB to be enough for native 4k or 1440p->4k (esp. RT). This is why the 5080 is a sham. I know some will argue the GDDR7 bandwidth, so let's argue bandwidth vs. buffer and stutters.
It gives me another reason to bring up Destiny. The reason you need a larger buffer allocation is so you don't have to constantly refill the texture cache, which can cause stutter. The 970 says hi.
While that RAM can be fast (like on a 5080, which uses bandwidth to stand in for the extra RAM it actually needs), if you need a texture right now and it isn't resident in the buffer, the swap can (and often does) cause a stutter.
This is why Destiny puts hallways/streets/whatever between biomes: to swap the buffer over so it doesn't stutter. Not all games are designed that way and/or have that luxury, esp. large open-world ones.
I know, it's a struggle. Believe me, I've been very familiar with this ever since thoroughly testing the 970 back in the day and trying to figure out wtf was happening. It's made sense ever since.
I figure this is probably why Radeons are pretty much designed to never run out of buffer (relative to their compute capability). I'm thoroughly convinced ATi/AMD sizes them that way to avoid exactly that.
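To put rough numbers on the bandwidth-vs-buffer argument, here's a tiny sketch with assumed figures (a made-up 200MB texture chunk, ballpark PCIe 4.0 x16 and GDDR7-class throughput; none of it measured from a real game) showing why data that isn't already resident blows the frame budget no matter how fast the local VRAM is.

```python
# Assumed figures only; the point is the ratio, not the exact numbers.
frame_budget_ms = 1000 / 60            # ~16.7 ms per frame at 60 fps

texture_mb = 200                       # assumption: data that suddenly needs to be resident
pcie_gb_s = 25                         # assumption: effective PCIe 4.0 x16 throughput
vram_gb_s = 960                        # assumption: GDDR7-class local bandwidth

pcie_ms = texture_mb / 1024 / pcie_gb_s * 1000
vram_ms = texture_mb / 1024 / vram_gb_s * 1000

print(f"frame budget at 60fps: {frame_budget_ms:5.1f} ms")
print(f"pulled over PCIe:      {pcie_ms:5.1f} ms  <- roughly half the frame, i.e. a visible hitch")
print(f"already in VRAM:       {vram_ms:5.2f} ms  <- why buffer beats bandwidth here")
```

And that's optimistic; real streaming also pays for decompression and copy scheduling, which is exactly where the hitching comes from.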
Curious if it can keep 4k60 DLSS MAX RT mins. Haven't looked; are you saying it forces TLAA?
I wonder if it's another one of those "no hotspot monitoring" situations. If it looks bad, hide it by making it not possible. I told y'all the DLSS4 performance hit would impact generally accepted settings.
That said, if 4k60 mins with DLSS MAX RT aren't possible on a 4090, I would argue Capcom needs to optimize the game to that standard. That's on them imho, not the 4090.
Apologies if this is later in the thread, I'm just taking it in and registering thoughts as I go.
I asked bc I ordered it on PS, given my whole friends list is there and it's just easier. I don't need a benchmark to tell you the beta runs very 'not well' (bc the PS5 is obviously getting pretty outdated).
Hopefully they can polish it up decently for release. It would be nice if it hit 1080p ~55fps minimum (limited by CPU clock speed on the base PS5 versus the Pro [and maybe PS6?]), as that'd be okay with VRR.
That should be their goal if it's possible.
I honestly haven't studied all their settings, but I see there's a 120Hz mode on console. Is that 40fps? It should be if it isn't.
This is one of those games they should've optimized for whatever they can do @ 1080p/40 locked on a base PS5 imho.
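For anyone wondering why 40fps specifically belongs in a 120Hz mode, the frame-pacing math is trivial (nothing here is game-specific, just standard display arithmetic):

```python
# Standard display arithmetic; nothing game-specific assumed.
target_fps = 40
for refresh_hz in (60, 120):
    refreshes_per_frame = refresh_hz / target_fps    # refreshes each game frame spans
    even = refreshes_per_frame == int(refreshes_per_frame)
    print(f"{refresh_hz:3d} Hz: each {1000/target_fps:.0f} ms frame spans "
          f"{refreshes_per_frame:.1f} refreshes -> "
          f"{'even pacing' if even else 'uneven pacing (judder)'}")
```

A 40fps frame covers exactly three 120Hz refreshes, so pacing stays even; on a 60Hz output it falls between refreshes and judders.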
I guess that would potentially screw over 12GB GPU owners @ 1440p though, and I get that for this game they want the user base as large as humanly possible...so it makes sense not to do that.
That's my whole point though... It would be better if it were, but then people would need 16GB GPUs for a good base 1440p experience. Some games already are that way, and many more WILL be.
The way to see it is to think of the PS5/XSX CPU. If the PS5 CPU is 8 cores @ 3.5GHz, the PS5 Pro is 8 @ 3.85GHz, and the PS6 is maybe 12 cores at around the per-core level of the PS5 Pro, what's the common-sense base setting?
It's around 1080p 30-40fps, right (or 1536x864 @ 60fps)? Now scale that to 1440p/60fps. That would require more than 12GB. That's not what this game is (so 12GB isn't discouraged here), but that's where we're going.
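Very rough sketch of what I mean by "scale that to 1440p" (every number here is an assumption I'm making up to illustrate the shape of it, not data from any particular game): the render-target portion of the budget grows roughly with pixel count, while textures/geometry/BVH mostly carry over.

```python
# Illustrative assumptions only: a hypothetical 12 GB budget at 1080p, split so
# that render targets scale with resolution while the rest mostly doesn't.
scale = (2560 * 1440) / (1920 * 1080)
print(f"pixel-count scale, 1080p -> 1440p: {scale:.2f}x")      # ~1.78x

render_targets_gb = 3.0    # assumed: grows roughly linearly with pixel count
everything_else_gb = 9.0   # assumed: textures, geometry, BVH; mostly resolution-independent

at_1440p = render_targets_gb * scale + everything_else_gb
print(f"same settings at 1440p: ~{at_1440p:.1f} GB, before adding any new features")
```

Swap in your own split; the point is that ~1.78x the pixels, plus the features layered on top, pushes past 12GB quickly.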
This is shown by the use of the high-res texture pack... Is it strange, or rather a prime example of the whole point of what I've been trying to explain? You be the judge.
But I don't want to hear about the 12GB stutter struggle, because you'll have the non-high-rez option available.
I guess in this game it's more like the 12GB "monsters-are-fucked-up-nightmare-inducing-rudimentary-polygons" struggle, bc they don't want it to stutter. Same point applies.
Please don't get too hyped up about what I'm saying; it's a conversation. If you think I'm wrong, prove it (which helps everyone). I'm pushing for better understanding/preparation for everyone; that's all.