Not really. You can always adjust settings while retaining most of the IQ, and the faster GPU will push more frames in the end.
It's funny how people think the PC gaming market is mostly about playing AAA games at absolute max settings at 4K/UHD or higher. In reality very few people care about this, and the ones that do often buy high-end stuff anyway. The 4090 is the king of 4K+ gaming and that probably won't change before the 5090.
In reality ~95% of PC gamers play at 1440p or lower, and the most popular PC games are not even demanding in terms of VRAM. Esports titles and popular multiplayer games in general are made for the masses, and 96% of Steam users have 12GB of VRAM or less. Do you think developers build games for the other 4%? They want to actually sell games.
Unless you want to push heavy RT or even path tracing at native 4K/UHD with frame gen on top, pretty much no one needs more than 8-12GB, and won't for years. VRAM requirements won't really change before the next-gen consoles hit in 2028, meaning four years from now, and by then every GPU sold today will be considered mid-range or even low-end anyway. The GPU itself will simply be too weak.
No, you have not. You have allocated that amount. Allocation does not mean required amount. The fact that you don't even know this simple distinction is just sad.
TechPowerUp's Avatar: Frontiers of Pandora performance review (www.techpowerup.com) takes a closer look at image quality, VRAM usage, and performance across a wide selection of graphics cards; the game features stunning visuals that recreate the movie franchise's unique universe, with support for AMD FSR 3 Frame Generation and NVIDIA DLSS.
12GB cards stomp your RX 6800. Even the 3070 8GB beats your 6800 16GB, even in the 4K minimum fps numbers.
This is an AMD-sponsored game on top, and a great-looking one. The 4090 beats the 7900 XTX by more than 50% at 4K/UHD, and by ~60% in minimum fps.
The 4070 Ti beats the 7900 XT and easily stomps the entire last-gen 6800 and 6900 series.
"But but but only 12GB VRAM!!!111"
Seems like a lot of people on this forum should read up on actual VRAM requirements vs. allocation.
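For anyone who wants to see what those overlay numbers actually are, here is a minimal sketch, assuming an NVIDIA card and the nvidia-ml-py (pynvml) package, that reads the same kind of per-GPU counter Afterburner-style overlays show. It reports memory currently allocated on the device across all processes, which says nothing about how much a game strictly requires to run well.

```python
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total / used / free, in bytes
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):                     # older pynvml versions return bytes
        name = name.decode()
    # "used" is VRAM currently ALLOCATED on the device (all processes combined),
    # which is what monitoring overlays display. It is not the amount a game needs.
    print(f"{name}: {mem.used / 2**30:.1f} GiB allocated of {mem.total / 2**30:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```

Run a game, run this, and you'll see the allocated number climb toward whatever the card has, regardless of whether the game would actually stutter with less.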
That said, the 6500 XT is probably the worst GPU released this century.
Proof? Because this is one of the biggest gripes about AM5. Former Intel owners are used to lightning-fast boot times, and AM5 at release took 2+ minutes to boot. Now most boards are down to 45-75 seconds, and even on the newest firmware + AGESA, with MCR and every one of those features enabled, the majority still sit at 30-45 seconds.