Yes
In game reviews? I've been testing VRAM since forever, but only allocation. This is a graphics card review, not a game review though.
There is no way to track true usage. Each frame is different, each one touches different resources, and many come from the various caches in the GPU; this changes from frame to frame, even when standing still. GPU vendors have software that can capture the state of the GPU and everything related, and they can replay the command buffers and analyze a single frame. This is how they design their next-gen GPUs. Nothing like that is accessible outside of those companies.
I know that, which is why trusting "usage" numbers is sketchy. Isn't that literally what PresentMon/RTSS attempt to do, though, report usage versus allocation?
Your whole take on this is completely backwards. Instead of blaming crap developers for making s**tty PC ports of console games, you blame W1zzard for blaming those developers. I cannot stress enough how wrong-headed your view is; I've come across some dumb takes in my 2 decades online, but this one is up there with the worst.
It is not NVIDIA's fault that developers make s**tty ports, nor is NVIDIA required (and nor should they be) to build their hardware to cater to s**tty ports.
Stop posting this stuff, sit in the dunce corner, and for once THINK about what you're posting BEFORE you post it.
I find this quite humorous. A port using next-gen features the current consoles can't support is a crappy port? Or one using higher-res textures and keeping them loaded? Which? I'm lost.
It's not the game. It's the cards. You're blaming the wrong thing.
I'm sorry you can't see the limitations imposed on each card. They're very apparent if you're looking.
Again, I challenge W1zzard to post 1440p RT minimums for games that average ~60fps at 1440p RT, and 4K quality-upscale minimums.
What is less than 1440p? 1080p. That's correct. Or 960p, if you use quality upscaling to 1440p, which is less than 1080p. Do you need to run those settings at 1440p/60fps? You do not. Are they selling it this way? Yes.
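To make that arithmetic concrete, here is a minimal sketch. It assumes the common DLSS/FSR convention of a 1/1.5 per-axis render scale for "Quality" mode (1/1.7 for Balanced, 1/2 for Performance); actual titles can use different factors.

```python
# Internal render resolution for common upscaler quality modes.
# Scale factors follow the usual DLSS/FSR convention (an assumption here):
# Quality = 1/1.5 per axis, Balanced = 1/1.7, Performance = 1/2.
SCALE = {"quality": 1.5, "balanced": 1.7, "performance": 2.0}

def internal_res(out_w, out_h, mode):
    """Return the (width, height) actually rendered before upscaling."""
    f = SCALE[mode]
    return round(out_w / f), round(out_h / f)

# Quality upscaling to 1440p renders at 960p -- below native 1080p.
print(internal_res(2560, 1440, "quality"))   # (1707, 960)
# Quality upscaling to 4K renders at 1440p internally.
print(internal_res(3840, 2160, "quality"))   # (2560, 1440)
```

So "1440p with quality upscaling" is really a 960p render, and "4K quality upscale" is really a 1440p render, which is the point being argued above.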
Is W1zzard capitulating to this? You be the judge.
Will AMD be able to run 1080p? Probably yes. This is why their card makes just as much sense.
Is this the devs' fault, or NVIDIA keeping you on a string to upgrade when, again, the thresholds are very apparent? They inch everything along. If you can't see this, you're blind.
You can see where 45TF and 16GB both become requirements (the 4070 Ti and 5070 both fall short on purpose). This is already apparent and will soon become more so.
You can see where 60TF and >16GB both become requirements across multiple scenarios. Are their cards basically all just over or under those lines? Yep. Can you overclock a 5080 above 60TF? Yes. Is it still RAM-limited? Yes.
Is the 9070 XT clearing the 45TF/12GB limitation (one AMD themselves imposed on the 7800 XT, which sits around 45TF even with 16GB)? Yep. Will that make playable a lot of settings that are beyond 12GB cards but the same as on GB203? Probably.
Is it still below 60TF raster? Yep. Is the 7900 XT limited to ~60TF? Yep. Is that where you need more than 16GB? Yep. Again, it's very purposeful product segmentation... NVIDIA's is just more blatant.
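For anyone wondering where these TF figures come from: peak FP32 throughput is conventionally 2 FLOPs (one fused multiply-add) per shader per clock. A sketch using public spec-sheet shader counts and approximate boost clocks follows; note that RDNA 3 parts can dual-issue FP32, so AMD's paper figures aren't directly comparable to these.

```python
def fp32_tflops(shaders, clock_ghz):
    """Peak FP32 TFLOPS: 2 FLOPs (one FMA) per shader per clock."""
    return 2 * shaders * clock_ghz / 1000

# RTX 5080: 10752 shaders at ~2.62 GHz boost -> ~56 TF, under 60.
stock = fp32_tflops(10752, 2.62)
# A ~2.9 GHz overclock pushes the same chip past the 60 TF mark.
oc = fp32_tflops(10752, 2.9)
# RTX 4070 Ti: 7680 shaders at ~2.61 GHz -> ~40 TF, under 45.
ti = fp32_tflops(7680, 2.61)
print(f"5080 stock: {stock:.1f} TF, 5080 OC: {oc:.1f} TF, 4070 Ti: {ti:.1f} TF")
```

This is why a stock 5080 sits just under the 60TF line and an overclocked one sits just over it, while the 4070 Ti lands just under 45TF.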
I don't like the idea of AMD selling 1080p-to-4K upscaling as "4K" either. That's my point. But literally nothing can hold 60fps minimums at 1440p RT outside of incredibly expensive 90-class cards. This will eventually change.
But when it does, do you not expect the bottom to rise? Would it not make sense if >60TF and >16GB then become what you want even for 1080p RT upscaling with frame generation, etc.? I think this makes tremendous sense.
This is why their 'middle-ground' card may not hold up for the settings you want there, either. This is where you would want a higher-clocked 24GB 5080, which doesn't exist right now.
But it surely will, because they will create that gap, and then you will see it more clearly. Before long you will need more raster (4090-class or better) for 1440p. It's pretty clear how things will probably evolve if you look at it.
And NVIDIA will be there to sell you every damn card they can below those thresholds, until they have no choice but to cross them.
Ask yourself why an 8-cluster (~12288 SP) 24GB NVIDIA card does not exist, when it would clear many hurdles AD103/GB203 do not. 9728, 10240, 10752: all 16GB, all with different limitations.
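The cluster arithmetic behind that ~12288 figure, as a sketch: NVIDIA's recent architectures pack 128 FP32 shaders per SM, and a full GPC ("cluster") in these layouts typically carries 12 SMs, i.e. 1536 shaders per cluster (the 12-SM-per-GPC figure is an assumption about the layout, not a quote from any spec).

```python
# Shader count from cluster configuration (Ada/Blackwell-style layout,
# assuming the common 12 SMs per GPC and 128 FP32 shaders per SM).
SM_PER_GPC = 12
SHADERS_PER_SM = 128

def shader_count(gpcs):
    """Total FP32 shaders for a given number of full clusters (GPCs)."""
    return gpcs * SM_PER_GPC * SHADERS_PER_SM

# Seven full clusters give 10752 (the full GB203 configuration);
# an 8-cluster part would carry 12288, above every 16GB config listed.
print(shader_count(7))  # 10752
print(shader_count(8))  # 12288
```

So the hypothetical 8-cluster card sits one cluster above everything currently sold with 16GB.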
AMD too, apparently... but at least they're trying to make a well-balanced card within what's affordably possible this generation. They are more a victim of circumstance than anything else. For NVIDIA, it's clearly planned.