Maybe I'm still not going to give Nvidia a pass for charging 800+ for a 12GB card just because they want you to spend 1100+ on a 16GB one.
I know two people who bought a 4070 Ti and they are more than satisfied, and they paid more like 650-700 dollars, not 800+
Both of them run 1440p, which seems perfect for this card. It performs like a 3090 Ti there (which was 1999 dollars just 9-10 months before the 4070 Ti came out), just with half the power usage and little to no heat
12GB is more than enough VRAM for 1440p and even 4K/UHD, and I doubt this will change before the next console generation (not refreshes)
In a few years you won't be able to max out games at 4K with a 4070 Ti anyway, so you'll be turning settings down, which lowers VRAM usage too
Even my 4090 will probably be considered mediocre by 2025-2026, regardless of its 24GB of VRAM.
The 4070 Ti with 12GB beats the 3090 with 24GB in most new games coming out, even at 4K. Just 2-3 years ago the 3090 was considered a 4K beast with "VRAM for years" ... futureproofing is pointless
In this game, even the 3090 Ti is beaten by the 4070 Ti at 4K on max settings, in both average and minimum fps. Twice the VRAM, zero difference
Ada has better memory compression and a much larger L2 cache than Ampere (roughly ten times the size), which improves hit rates; all of this lowers actual memory usage
Yep, by 2027-2028, when the RTX 6000 series probably comes out, 12GB might be too little to max out demanding games at 4K. Yet today's GPUs will be horribly slow by then regardless of having enough VRAM, including the 4090. This is why futureproofing is pointless. If you want proper performance and optimization (from both game devs and AMD/Nvidia), you stay on newer architectures.
I can hit 18.5GB of reported usage on my 4090 in Cyberpunk fully maxed out (FOV 100, Path Tracing, DLSS 3.5 + FG), yet a 4070 Ti can run the same settings without running out of VRAM.
More VRAM = more allocation. Tons of game engines allocate a given percentage of the available VRAM. You can't really trust software readings anyway; you simply look at minimum fps, because that drops fast when you're VRAM-starved, and from what I see no game needs more than 12GB at 4K yet
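To illustrate the "more VRAM = more allocation" point, here's a minimal C++ sketch of how an engine might size its texture streaming pool as a fixed fraction of whatever VRAM budget Windows reports. QueryVideoMemoryInfo is a real DXGI call, but the 80% fraction, the function name and the pool logic are made up for the example; this is not how any specific engine actually does it.

// Sketch: size a texture streaming pool from the OS-reported VRAM budget.
// Build on Windows, link dxgi.lib. Assumption: 80% cap is arbitrary.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdint>
#include <cstdio>

using Microsoft::WRL::ComPtr;

uint64_t QueryStreamingPoolBytes()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);             // adapter 0 = primary GPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(
        0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);  // dedicated VRAM segment

    // Pool size scales with the reported budget, so a 24GB card simply
    // gets a bigger pool than a 12GB card, even if the actual texture
    // working set would fit in far less. That's why monitoring tools
    // show higher "usage" on bigger cards.
    const uint64_t poolBytes =
        static_cast<uint64_t>(info.Budget * 0.80);   // hypothetical 80% cap

    std::printf("VRAM budget: %llu MB, streaming pool: %llu MB\n",
                (unsigned long long)(info.Budget >> 20),
                (unsigned long long)(poolBytes >> 20));
    return poolBytes;
}

That's also why the minimum-fps check is the better test: a percentage-based pool fills up on any card, but performance only collapses once the working set genuinely exceeds what's physically there.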
Let's see if Nvidia pushes Neural Texture Compression to developers. They confirmed that you don't need a huge VRAM buffer to deliver the best textures. Maybe game developers should get smarter instead of lazier.
PCs get faster and faster, yet more and more games are poorly optimized