All the people crying "herp derp it's only whatever % faster than 2080 Ti" are missing the point entirely: the RTX 3080 is the first graphics card whose lead over its predecessors actually grows at higher resolutions. That is nucking futs!
Not to mention that RTX is finally able to render at framerates almost matching rasterised performance (or exceeding it - see Control @ 4K with RTX). Yes, it still requires DLSS to get there, but the fact that this literally wasn't even a possibility two years ago (before the Turing launch) is just mindblowing. The next generation will almost certainly bring performance parity between RTX and rasterised rendering, and that's insane: we will have gone from "ray-traced rendering is an academic novelty" to "ray-traced rendering is the standard" in probably half a decade.
This is the world's first true 4K graphics card and NVIDIA should be congratulated for it. My only question is, what happens to us non-4K plebs now? Because NVIDIA is building GPUs that are literally too powerful for the rest of our systems!
While a lot of what you say here is true, the first point needs some moderation: this is mostly down to CPU limitations and/or architectural bottlenecks preventing some games from scaling to higher FPS at lower resolutions, not down to 4K performance itself increasing more than lower-resolution performance in a vacuum. Of course we can't get faster CPUs than what exists today, game engines won't magically perform better, and the Ampere architecture can't re-balance itself for certain games on demand, so it is what it is in terms of real-world performance - but you're arguing as if the GPU alone caused this, rather than these contextual factors.
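To put some purely made-up, illustrative numbers on that (not measured from any actual game): say a game's CPU work takes 7 ms per frame, capping it at ~143 FPS no matter what the GPU does. If the old GPU needs 10 ms per frame at 1440p and 20 ms at 4K, and the new one halves both to 5 ms and 10 ms, then at 4K you see the full 2x uplift (50 -> 100 FPS), but at 1440p you only go from 100 FPS up to the 143 FPS CPU ceiling - a 43% "gain" from the exact same GPU speedup. The 4K results look disproportionately better even though the GPU scaled identically at both resolutions.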
RTX performance with DLSS looks excellent, and I'm definitely looking forward to seeing how this, in combination with RT-enabled consoles, will affect games going forward. The big questions now are not only whether AMD can match Nvidia's RT performance, but also whether they can provide any kind of alternative to DLSS.
Also, this is a perfectly suitable 1440p120 or 1440p144 GPU. No need for 4K, though it's obviously also great for that.
One question though: going by the meager perf/W increases, was anything beyond die sizes on 12nm stopping Nvidia from delivering this kind of performance with Turing?
A 20GB version of the RTX 3080 will likely be launched in the next two months:
NVIDIA GeForce RTX 3080 20GB: "NVIDIA hints that RTX 3080 with a different memory layout could come out later. It is no secret that NVIDIA is also preparing a 20GB model of the GeForce RTX 3080 graphics card. It has been speculated for weeks now and we had no problems confirming this..." (videocardz.com)
Ouch, that's going to piss off a lot of early buyers. Not that there's any real reason to be upset - those extra 10GB are most likely going to be entirely decorative for the lifetime of the GPU - but people tend not to like being sold "the best product ever, go buy it now" and then seeing the same company supersede it just a few months later.