I'll repeat the results of the last review once again. The gap between the 6800 and the 3070 Ti has remained the same as two years ago. New games were added and nothing changed. What did 16GB of VRAM gain over 8GB? Nothing at all. In 2-3 years both cards will be history, and not because of memory.
The 2080 Ti, the king of kings before Ampere, is now trading blows with the 4060. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!
View attachment 293221
Hey, maybe if you repeat those same graphs 10 more times, you'll magically grow 4GB on that fantastic GPU of yours and make it last as long as its core.
If not, keep fooling yourself, because obviously you'll decide the core power was insufficient sooner rather than later, and suddenly you must have a 12GB 4070.
We all know where this ends. I'm very happy to see you're not coping at all though. On to the next shit purchase it is!
I don't demand any less: I won't upgrade unless I'm looking at at least a +20% FPS uplift. I just don't fret over particular specs.
It's always about timing, whether or not that is a good idea.
And the timing we're on is console timing. We're halfway through the current console generation, which can push 12GB or more, and you know as well as I do that most games don't get the TLC/optimization they could. If consoles demand 10-12GB, and frankly that is where things are moving now, and fast, and if newer engines work a lot better with more than 8GB, you can rest assured this is the new normal.
Being below the mainstream norm with any graphics card is a definite push to upgrade. You might work your way through several more years of gaming, but you'll also feel forced to skip content left and right because it's just not playable enough. I've done that lately because I couldn't bring myself to any upgrade path or deal at today's absurd pricing, not to mention that past generations have been notoriously weak. We keep telling ourselves we're content playing the backlog, and that is also true, but there's also that thing you can't do and kinda do want.
Take note of the fact that the 8GB PS4 released a few years before Pascal, which promptly pushed Maxwell's 4GB midrange to an 8GB norm from the x70 onwards (a DOUBLING... and that after 2-3GB Kepler, so +50% and then +100% within the space of three generations), with even a pretty generous 6GB on the much weaker 1060. It's now really starting to show its limits, and in ever worse ways, and not because of a lacking core: even Darktide was perfectly playable with FSR, and not a mission was lost to frametime spikes, even if framerates hovered around 40-50, much like in Cyberpunk with FSR. But both games do love to eat 6-8GB. The imbalance you get now on Ada, and on a supposed 8GB RX x600 GPU, is tremendous, and it's absolutely a major difference from what we've seen in the past.
I've been beating this drum since Turing started cutting VRAM relative to core performance, and here we are today: it's happening. New midrange cards already have to cut back on settings on release.
And if you keep cards for longer than three years, the worst-case scenario is likely to happen, because you'll be looking at a PS6 with the full 16GB addressable as VRAM, for sure. Perhaps even more on some epeen version of it.