It's quite evident that, although these VRAM numbers were measured, they aren't strictly necessary for good performance; it seems they can be overshot by 2GB or more.
On cards with good bandwidth, most likely. That also relates to the 3080's place in the charts.
That's because they're lower-powered GPUs, not a VRAM limitation.
Every GPU is about balance. The 3080 has a mere 10GB, but covers that up nicely with a royal amount of bandwidth; compared to Ada, it has a LOT more relative to its core power.
That's how it can sit side by side with the 12GB 4070 Ti (below 4K), which has lower bandwidth and a clearly stronger core, but can't properly use all of its core performance because it combines insufficient VRAM with lower bandwidth. Exactly as one could have predicted from the relative hardware of Ada's 4070 Ti vs its 4070, or from the 4080, which makes a notable jump because it covers the VRAM requirement and can therefore use all of its core power.
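To put rough numbers on that bandwidth-vs-core-power point, here's a quick sketch using public boost-clock FP32 TFLOPS and rated memory bandwidth figures (real-world throughput varies, so treat the ratios as illustrative, not a definitive metric):

```python
# Rough bandwidth-to-compute ratios from public spec-sheet numbers
# (approximate boost-clock FP32 TFLOPS, rated bandwidth in GB/s).
specs = {
    # name: (FP32 TFLOPS, memory bandwidth GB/s)
    "RTX 3080 10GB": (29.8, 760.3),
    "RTX 4070 Ti":   (40.1, 504.2),
    "RTX 4080":      (48.7, 716.8),
}

for name, (tflops, bandwidth) in specs.items():
    # GB/s of memory bandwidth available per TFLOP of core compute
    print(f"{name}: {bandwidth / tflops:.1f} GB/s per TFLOP")
```

By this crude measure the 3080 ends up with roughly double the bandwidth per unit of core power of the 4070 Ti, which is the imbalance being described above.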
Another aspect that cannot be overstated: Alan Wake 2 is clearly an Nvidia-first title, so it will be heavily optimized/tailored to cover a VRAM deficit, something implicitly shown by the position of AMD cards relative to Nvidia's; even if the gap isn't huge compared to numbers elsewhere, it's there. For those (not you) who have trouble with comprehension: I'm
not saying Nvidia crippled AMD cards here, just that it made sure its own don't run into issues.
The 10GB 3080 beats the 16GB 7800XT & 6900XT at 1440p/4K (rasterization) in BOTH average and minimum fps.
The 10GB 3080 beats the 20GB 7900XT at 1080p/1440p (ray tracing) in BOTH average and minimum fps.
Anything above that is useless... even the 7900XT runs below 30fps with path tracing.
IN OTHER WORDS, 10GB is not an issue.
I love how you guys twist things up... Nvidia is clearly doing better in this game at any realistic settings, even with lower VRAM.
Ha? Did you even look at the benchmarks and the fps?!
The 10GB 3080 beats the 16GB 7800XT & 6900XT at 1440p/4K (rasterization) in BOTH average and minimum fps.
The 10GB 3080 beats the 20GB 7900XT at 1080p/1440p (ray tracing) in BOTH average and minimum fps.
Path tracing is pointless... the fps is poor even on the top AMD GPU.
The 8GB 3060 Ti beats the 12GB 6700XT at all resolutions (rasterization) in both average and minimum fps.
Also, the 8GB 3060 Ti beats the 6700XT at 1080p with RT, though both cards get poor fps... any higher resolution is a slideshow on both cards.
First of all, people will not be running this game at extreme settings with path tracing on a midrange GPU anyway.
Second of all, 8GB Nvidia GPUs are performing well in this game compared to same-generation 12GB AMD GPUs. For example, the 3060 Ti beats the 12GB 6700XT in rasterization (all resolutions) and at 1080p with normal RT (not path tracing). Anything above that runs below 20fps on the 6700XT, and it doesn't matter who wins when both cards are below 20 or 30fps, because nobody will play at those settings anyway.
Are you guys even looking at the benchmarks?!
You turned a discussion about VRAM into a discussion about different GPU architectures & brands. I'm looking exclusively at the order of cards on the Nvidia side.
This isn't a pissing contest or a camp battle, it's a performance analysis.
And if you're really still making a case for sub-12GB cards... all I can say is: you do you. I've moved on.
The requirements don't lie.