"Looking at our test results, the 12 GB VRAM config is perfectly sufficient for all games except Alan Wake 2 RT in 4K, which runs at sub-30 FPS, even with more memory. This means that you'll have to turn on DLSS upscaling either way, which lowers the game's render resolution and the VRAM requirements accordingly."
A single game at 4K native (and one that runs out of raw GPU performance anyway).
Yeah, this is exactly what I'm saying. Pushing any current-gen midrange or higher card into VRAM issues will make the GPU buckle anyway; VRAM alone won't save you going forward.
Alan Wake 2 is pretty much designed around upscaling too. Running it at native UHD makes no sense at all, especially not with path tracing, which makes even a 4090 struggle hard.
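Just to put numbers on what upscaling does to the render resolution, here's a rough Python sketch using the standard published DLSS scale factors. The pixel-count figures are only indicative; how much VRAM you actually save depends on how each game sizes its buffers.

```python
# Rough sketch: internal render resolution per DLSS mode at a 3840x2160 output,
# using the standard per-axis DLSS scale factors. Pixel counts are illustrative
# only; actual VRAM savings vary by game.
OUTPUT_W, OUTPUT_H = 3840, 2160
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

for mode, scale in DLSS_MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    pct = w * h / (OUTPUT_W * OUTPUT_H) * 100
    print(f"{mode:>17}: {w}x{h} internal (~{pct:.0f}% of native 4K pixels)")
```

At Performance mode the game is effectively rendering 1080p internally, which is why the VRAM pressure drops so much at "4K".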
12-16 GB is plenty for 1440p-2160p, and I doubt that will change any time soon. Maybe in four years or two generations we will start seeing games push past 16 GB at 1440p on max settings, but you will need the GPU power to run them as well; VRAM alone won't save you here.
The only reason the 7900 XTX and 4090 have 24 GB is their 384-bit bus. Both AMD and Nvidia knew it was pure overkill, for gaming at least.
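For anyone wondering why capacity tracks bus width so directly: each 32-bit memory controller drives one GDDR6/GDDR6X package, and current packages top out at 2 GB, so the math is fixed. Quick sketch below (clamshell designs can double the capacity, but consumer gaming cards normally don't bother):

```python
# Each 32-bit memory controller drives one GDDR6/GDDR6X package (2 GB max today),
# so the bus width fixes the sensible capacity options.
def vram_gb(bus_width_bits, gb_per_chip=2):
    return (bus_width_bits // 32) * gb_per_chip

for card, bus in [("RTX 4070 Ti", 192),
                  ("RTX 4070 Ti Super", 256),
                  ("RTX 4090 / RX 7900 XTX", 384)]:
    print(f"{card}: {bus}-bit -> {bus // 32} chips -> {vram_gb(bus)} GB")
```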
The 4070 Ti losing ground against both the 4080 and the 7900 XT shows me that the increased cache doesn't help enough. We should also consider that Nvidia traditionally does better than AMD at 1440p and higher resolutions, so losing ground to the rival card is even worse for them. Moreover, seeing the same results repeated by pretty much every outlet convinces me further that 192-bit just doesn't cut it.

You obviously looked at the charts too, since you mentioned the 4070 Ti and 3090 being identical at 4K. But I don't know why you omit that the 4070 Ti is actually faster than the 3090 at both 1440p (7%) and 1080p (10%) in the charts on the same page. Some individual results are even worse than that: in Elden Ring at 1080p the 4070 Ti easily beats the 3090 (16%) and the 3090 Ti (10%), yet it actually gets beaten by the 3090 (2%) at 4K. That's an 18-point swing just from increasing the render resolution, while the 4080 only loses about 4% against the 3090 in the same game, so this is obviously not an architectural issue either.
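To spell out the Elden Ring math (using only the percentages quoted above, nothing new):

```python
# Using the percentages cited above: the 4070 Ti goes from +16% vs the 3090 at
# 1080p to -2% at 4K. That's the ~18-point swing; multiplicatively its relative
# standing drops by roughly 15% just from raising the render resolution.
rel_1080p = 1.16  # 4070 Ti / 3090 at 1080p
rel_2160p = 0.98  # 4070 Ti / 3090 at 4K
drop = 1 - rel_2160p / rel_1080p
print(f"Relative standing lost from 1080p to 4K: {drop:.1%}")  # ~15.5%
```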
I don't want to get into what counts as high res, but isn't that even worse for the 4070 series if you don't consider native 1440p high res? It is already starting to lose ground at those render resolutions.
I already said the new core specs will give about a 7% increase in performance, and the 4070 Super results indicate that, but I need to add that Nvidia didn't bump the TDP with the Ti Super like it did with the 4070 Super, so 7% is actually the best-case scenario from the increased core count alone. Any improvement above that would come from the wider bus, so we will see. I also need to add that if it only gets 7%, it actually cannot beat the 7900 XT in raster, which would be a shame really; it needs about 10% just to match it. Still, I expect it to beat it by 2-3% with the help of the 256-bit bus.
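Here's the back-of-the-envelope version of that projection. Every figure in it is an estimate stated in this post, not a benchmark:

```python
# All figures are estimates from this post, not measurements: ~7% best case
# from the extra cores at the same TDP, and the 7900 XT sitting roughly 10%
# ahead of the 4070 Ti in raster. Anything above the cores-only figure would
# have to come from the 256-bit bus.
ti_baseline = 1.00   # RTX 4070 Ti, normalized
cores_gain = 0.07    # best-case uplift from added SMs alone
rx7900xt = 1.10      # ~10% ahead in raster

ti_super_cores_only = ti_baseline * (1 + cores_gain)
needed_from_bus = rx7900xt / ti_super_cores_only - 1
print(f"Cores only: {ti_super_cores_only:.2f} vs 7900 XT at {rx7900xt:.2f}")
print(f"Extra needed from the wider bus just to match: {needed_from_bus:.1%}")  # ~2.8%
```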
Lastly, the 7900 XT is already selling at $709 (again), so that's already about $100 less than the 4070 Ti Super's MSRP. I don't know how we are not on the same page on that.
The 4070 Ti performs like a 3090 Ti at 1440p and a 3090 at 2160p when you look at the overall picture. Considering that the 4070 Ti draws far less power (285 W TDP versus the 3090 Ti's 450 W), I don't really see this as an issue for most PC gamers, most of whom don't play at 2160p to begin with. Also, you get full DLSS 3 support and Frame Generation on top.
The NVIDIA GeForce RTX 4070 Super launches today, boasting a solid increase in GPU cores, ROP units and cache. This enhancement brings its performance much closer to that of the GeForce RTX 4070 Ti, at a price point of $600 that signals strong competition for AMD's Radeon RX 7800 XT.
Let's have a look at TechPowerUp's recent AAA game testing; it seems to me that Ada is only getting faster.
4070 Ti beats 3090 in Avatar 4K Ultra
Avatar: Frontiers of Pandora features stunning visuals that recreate the movie franchise's unique universe. There's also support for AMD FSR 3 Frame Generation and NVIDIA DLSS. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a wide selection...
4070 Ti beats 3090 in Alan Wake 2 4K Ultra
Alan Wake 2 is out soon, with incredible graphics, but demanding hardware requirements, too. There's forced DLSS/FSR, but we show you how to tweak the config files to get native back. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a wide...
4070 Ti beats 3090 in Lords of the Fallen 4K Ultra
Lords of the Fallen has been released, showcasing the incredible power of Unreal Engine 5. This souls-like game offers breathtaking visuals that rank among the most impressive we've seen, but it demands powerful hardware, too. In our performance review, we're taking a closer look at image...
4070 Ti very easily beats the 3090 in AC Mirage at 4K Ultra, and even beats the 3090 Ti as well.
Assassin's Creed Mirage delivers a fresh experience by stripping away RPG elements, allowing players to fully engage in the series' core stealth gameplay. In our performance review, we're taking a closer look at image quality, VRAM usage, and performance on a wide selection of modern graphics cards.
A 192-bit bus and 12 GB of VRAM seem to beat 384-bit and 24 GB of VRAM in all these games; the 3090's real problem is a lack of GPU power.
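The raw bandwidth math behind that, using the public memory specs (back-of-the-envelope only):

```python
# Peak bandwidth = (bus width / 8) * data rate per pin. On paper the 3090 has
# nearly double the bandwidth, but Ada's much larger L2 cache (48 MB on AD104
# vs 6 MB on GA102) cuts how often the 4070 Ti actually has to hit VRAM.
def peak_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 3090   (384-bit, 19.5 Gbps GDDR6X)": peak_bandwidth_gbs(384, 19.5),
    "RTX 4070 Ti (192-bit, 21 Gbps GDDR6X)": peak_bandwidth_gbs(192, 21.0),
}
for name, bw in cards.items():
    print(f"{name}: ~{bw:.0f} GB/s")
```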
On top of that, the 4070 Ti has full DLSS 3 feature support including Frame Generation, which the 3090 doesn't support.
I'd personally not buy either for native 4K gaming, but if you made me choose between these two cards, I'd pick the 4070 Ti over the 3090 for sure. I'll take a faster GPU with newer features and much lower power usage over "more VRAM". Personally, I don't care much about 4K gaming; 1440p is still the absolute sweet spot for me.

If I were a 4K gamer, I'd buy a 4090.