Are you sure you didn't "buy" it from torrents?
They worked a lot on optimization, especially on VRAM requirements. It seems a bit doubtful that you played it well when the game required 13GB at 1080p, yet you can't play it now that it needs less than 10GB.
In future games, as history shows, the GPU limit appears before the VRAM limit. In other words, more than 8GB is useless if the GPU can't render a decent framerate. Whether you like it or not, you are forced to reduce details, except for the cases where various "influencers" try to convince us how great the extra memory is while playing at 25 fps.
So the TPU reviews (and not only those) must be wrong when they show that the 6700 XT was not helped by its 12GB of VRAM in the games tested in 2023. The gap to the 3070 is the same, with one exception, which (coincidentally or not) was also brought up in this very topic. More than two years have passed since their launch; how long are we supposed to wait?
Depends on the game.
I think Nvidia have quite simply got it wrong, and what I mean by that is the TFLOPS vs VRAM ratio.
In my case, in the games I play, VRAM has hit me first. Once I was on the 1080 Ti I was almost completely free of GPU bottlenecking; I didn't need the 3080 for performance, I got it due to the state of the market and because £650 wouldn't be around again for the xx80 class for a while (so far proven right). On my 3080 I have yet to see GPU utilisation stay above around 70% in any sustained way, yet I have had multiple instances of VRAM capacity issues.
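If you want to check which limit you're actually hitting, one rough way is to log GPU load and VRAM use side by side while playing. Below is a minimal sketch, assuming an Nvidia card with nvidia-smi on the PATH (any overlay like MSI Afterburner shows the same numbers, this is just a scriptable version):

```python
# Minimal sketch: poll nvidia-smi once a second and log GPU load vs VRAM use,
# so you can see which limit a game hits first. Assumes an Nvidia card and
# nvidia-smi available on the PATH; query fields are standard nvidia-smi options.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def sample():
    # Returns (gpu_util_percent, vram_used_mib, vram_total_mib) for GPU 0.
    out = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    util, used, total = (int(x) for x in out.split(", "))
    return util, used, total

if __name__ == "__main__":
    while True:
        util, used, total = sample()
        flag = "  <-- near VRAM limit" if used / total > 0.9 else ""
        print(f"GPU {util:3d}%  VRAM {used}/{total} MiB{flag}")
        time.sleep(1)
```

Keep in mind memory.used is what's allocated, not what's strictly needed, and it includes other processes; a game can sit near the full budget without being starved. Stutters and texture swapping while the GPU sits well under 100% are the better tell.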
But I don't play with uncapped FPS (60 is as high as I go), and I don't play shooters. I have also yet to play any RT or DLSS game, simply because the games I want to play don't implement those technologies.
There are two crowds here. One crowd just wants extreme frame rates (gimme more frames!!!), they don't care much about visual fidelity but want that 500 fps, and Nvidia has catered to them nicely. Then there's the crowd who likes great textures and is happy to play at 60 fps or less; things like texture pop-in, dynamic LOD stuck at low detail, missing textures, and of course stutters caused by excessive asset swapping annoy this crowd a lot.
If you think about what Nvidia is doing with DLSS, and the lopsided balance of TFLOPS vs VRAM, they're catering to high frame rate gamers. RT is the exception, as that's a visual thing, but ironically Nvidia's own RT feature is showing up their own cards, because as it turns out, according to TPU testing, RT loads up the VRAM lol.
In 2023, 8 gig should be on the bottom-end cards as the bare minimum, 16 gig mid-range, and 24/32 enthusiast. Nvidia have got it very badly wrong. 16 gig should be considered the natural baseline now, because that is what consoles have. Nvidia also appear to have scrapped anything below the xx60 tier (looks like they lost interest there), so the 4060 is their new bottom-end card. But there is a new issue: pricing.
If these cards were cheap relative to previous generations, they could maybe, possibly be excused for VRAM skimping, but they're not, they're the opposite. An RTX 4060 can be had for maybe £300, so fine, 8 gig on that. But a 4070 Ti is at high-end pricing; that card should absolutely 100% have 16 gigs of VRAM, no ifs, no buts, no excuses, 16 gigs. The 4080 is now at enthusiast pricing, and I think it should have at least 24 gigs. If Nvidia want to shuffle their product stack upwards on pricing, then the specs need to shuffle with it.
I am not paying £800 for a 12 gig card in 2023, even if it were a 4090 with 12 gigs. Right now, for me, VRAM matters more than compute power.
Nvidia know what they're doing: those buying their 12 gig cards today will be needing an upgrade within 2 years. I still remember their interview on PCPer during the Maxwell era: "Our biggest competitor is our own previous gen; we need to find a way to sell new products to existing customers."