Four months in isn't early. Telling people to lower the settings isn't the right solution, even if it is a good afterthought.
No one said this happens in every game.
You don't know that. Do you know how I know that? Because it all happened during the worst global price hike for graphics cards in history. MSRP meant shit. During other releases, prices have been adjusted according to lots of factors, but that wasn't feasible at the time. Or deduct $40 for the extra 8 GB and you're close enough. We can of course look at the prices after that period, but no one knows how much those were affected by previous events.
In my example it was, for a few seconds, and that's the most frustrating thing to see. If your GPU is slow, fine, you have to upgrade. But when you go from this
View attachment 329912
to this within seconds, you're gonna get pissed. The GPU is up to the task, obviously. And by that point it doesn't matter whether we're comparing to a 6800 or a 6700 XT, as both will do more than 25 fps.
View attachment 329913
And of course there might be better examples out there than just this game, but you know it's going to happen again, because it's happened before. So many 3-4 GB cards in the past could have lasted longer.
It's just a shame that the 3070 couldn't have both. You shouldn't have to choose.
That's a different matter, but since DLSS is better overall, it might compensate for the lack of VRAM in the short run when it comes to popularity among buyers, though perhaps not in the long run.
I highly doubt AFMF will deliver, or AsFMotherF as I call it, but I guess we'll soon find out.
Well, tons of other tests show The Callisto Protocol running just fine on a 3070/3070 Ti:
"In this test, we present the final annual review for The Callisto Protocol, which was tested at maximum graphics quality settings with ray tracing." (gamegpu.tech)
Show 4K/UHD only and you'll see that the 3070 Ti beats the 6800 16GB here, at MAX + RT.
2nd Link, TPU Testing->
https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html
The 3070 beats the 6800 while costing 80 dollars less, even at 4K/UHD on Ultra, and yeah, the Radeon 6800 was more expensive than the 3070 around release, no matter what the GPU market was like. Higher MSRP = higher price. The 6700 XT was the 3070 competitor. AMD obviously don't price a 3070 competitor at 579 dollars.
Like I said, cherrypicking is not that hard; there are all kinds of weird results to find if you want.
In reality, VRAM is not really an issue for many people, especially not if you use 1440p or less like 95% of PC gamers. However, no last-gen mid-range offerings are going to max out the most demanding games at 1440p today without upscaling, which lowers the VRAM requirement anyway.
It's also always funny when people dig up demanding games running at native 4K/UHD, without upscaling and with ray tracing maxed out, to make a point about how much VRAM matters, when the 16GB cards from the same generation are struggling hard as well.
The GPU has always been the most important part, and generally people don't need more than 8GB at 1440p or less. 12GB will last for many years here.
If you actually use a 4K/UHD+ monitor, refuse to use upscaling and want to max out every single demanding game for the next 4-5 years, which GPU would you buy? Because not even my 4090 will do it.
I probably have enough VRAM until 2028-2030, but the GPU will already seem dated when the 5090/5080 hit "soon".
VRAM never futureproofed a GPU and never will.
The 7900XT would have been a better GPU with more cores/power and less VRAM; 16GB would have been more than fine. The 7900XTX, again, would have been better with more cores/power and 16-20GB VRAM. 24GB is not doing ANYTHING on the 7900XTX, and never will. It looks good on paper, but the GPU can't even do ray tracing or path tracing well, which is the stuff that eats VRAM.
Also, AMD keeps using GDDR6 instead of GDDR6X because GDDR6X chips use more power, and GDDR6 is kinda cheap.
AMD started this whole VRAM talk with their marketing on the subject. TLOU and RE4 were rushed console ports sponsored by AMD. Ultra was not possible on 8GB cards at release. This has since been fixed. That "bug" was probably not an accident, as it came RIGHT AFTER AMD started talking about how much VRAM matters.
AMD has done this before, back in the Shadow of Mordor days, when they released a 6GB Texture Pack ->
https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/
The texture pack did NOTHING but max out VRAM on all cards with less than 6GB, meaning most Nvidia cards at the time. Graphics looked identical.
Also, people are generally too quick to assume that VRAM usage on a 4090/7900XTX in game testing means the actual VRAM requirement. Allocation is a big part of the number you see in tests; reported usage does NOT mean the required amount. This is nothing new.
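For anyone curious, here's a minimal sketch of the kind of counter overlays and review charts are reading (assuming the nvidia-ml-py / pynvml package; nothing here is game-specific). It reports memory allocated on the device, not the working set a game actually needs each frame:

```python
# Minimal sketch, assuming the nvidia-ml-py (pynvml) package is installed.
# NVML reports device-wide *allocated* memory; a game on a 24GB card can
# happily allocate far more than it strictly needs, so this number is not
# a VRAM requirement.
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
mem = nvmlDeviceGetMemoryInfo(handle)       # totals for the whole device

print(f"total:     {mem.total / 2**30:.1f} GiB")
print(f"allocated: {mem.used / 2**30:.1f} GiB")   # allocation, not requirement
print(f"free:      {mem.free / 2**30:.1f} GiB")

nvmlShutdown()
```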
And most games today also have pretty much identical-looking textures at the high and ultra presets.
Ultra often means uncompressed and high means slightly compressed. When playing you don't see the difference at all.
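Just to put rough numbers on it (back-of-envelope, assuming a single 4096x4096 texture, uncompressed RGBA8 at 4 bytes per texel vs BC7 block compression at 1 byte per texel, and a full mip chain adding about a third on top):

```python
# Back-of-envelope sketch of why "uncompressed ultra" textures balloon VRAM.
# Assumptions (illustrative only, not from any specific game): RGBA8 is
# 4 bytes per texel, BC7 is 16 bytes per 4x4 block (1 byte per texel),
# and a full mip chain adds roughly 1/3 on top of the base level.
def texture_vram_mib(width: int, height: int, bytes_per_texel: float,
                     with_mips: bool = True) -> float:
    base = width * height * bytes_per_texel
    total = base * (4 / 3) if with_mips else base   # geometric series for mips
    return total / 2**20

uncompressed = texture_vram_mib(4096, 4096, 4.0)    # RGBA8, "uncompressed ultra"
bc7 = texture_vram_mib(4096, 4096, 1.0)             # BC7, "slightly compressed high"

print(f"4K texture, RGBA8: ~{uncompressed:.0f} MiB")   # ~85 MiB
print(f"4K texture, BC7:   ~{bc7:.0f} MiB")            # ~21 MiB
```

Multiply that across the thousands of textures a modern game streams and the gap between presets can end up in the gigabytes, for a difference you mostly can't see in motion.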
But sure, keep praising VRAM as the single thing that will make your GPU futureproof.
Nvidia already showcased long ago that their Neural Texture Compression tech can make graphics look much better while using a lot less VRAM. The tech exists.