You can't expect every new generation to be 100% faster than the last one; technology naturally slows down after hitting a peak. I really don't understand why people think the 20 series is a rip-off. You have high expectations for the upcoming AMD cards, but it will be the same thing, a minor improvement, and die shrinks won't change that. Better get used to the fact that we've hit a peak and new generations will only be a "minor" improvement over the last ones.
It's quite simple: perf/dollar. It used to improve with every generation, and with Turing, it does not. And that is bad, seeing as we've already been looking at Pascal's performance for quite some time now. Turing is relatively late and advances nothing in terms of absolute performance. It just added a new 1200 dollar tier on top of the stack.
As an added bonus you get functionality that is currently implemented in ONE game and never really makes a difference to its overall experience, and the majority of the Turing product stack doesn't even have enough RT cores to make proper use of it.
As my sources point out, if you read the numbers right, it's not even a minor improvement at all. GPU performance increases have ground to a complete halt at every price point except 1200 bucks. All to facilitate some new feature that nobody knows will still matter, or end up vaporware, in a few years.
Now, the most important part of all this is what you've just said yourself: it's not realistic to keep expecting massive performance jumps ad infinitum. In that vein, isn't it especially strange for Nvidia to spend 30% of the die space not on absolute performance, but on some new feature that only works in a tiny selection of games?
There is only one logic behind all of this, and that is maximizing the cash flow generated from already-deployed tech. Turing's RT is no more than repurposed Volta technology, and Nvidia just threw it at the gaming wall to see if it sticks. So to me, the only real question here is "Do you want to support that business practice or not?" I don't, because I know it will make future GPUs that much more expensive for questionable benefits.
Let's judge the value when we see the pricing and actual performance.
The performance level suggested in the "leak" puts this close to GTX 1080, and also Vega 64. If this is accurate, and it's priced below Vega 64, it will be a much better buy.
I agree. If it provides 1080/V64 perf at $350 (and not a hamstrung variant with a bad VRAM/bus choice or a cut-down chip version... yeah, that's a lot of caveats Nvidia has built into Turing these days!) then we have an actual perf/dollar improvement. Minor, but present.
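Just to make the perf/dollar point concrete, here's a rough back-of-envelope sketch. The relative performance figures are illustrative (GTX 1080 = 100, not real benchmarks), the prices are approximate launch MSRPs from memory, and the $350 card is the purely hypothetical one discussed above:

```python
# Back-of-envelope perf/dollar comparison.
# "perf" is a rough relative-performance index (GTX 1080 = 100), not benchmark data.
# Prices are approximate launch MSRPs; the $350 card is hypothetical.
cards = {
    "GTX 1080":                 {"perf": 100, "price": 599},
    "RTX 2070":                 {"perf": 105, "price": 499},
    "hypothetical $350 card":   {"perf": 100, "price": 350},
}

for name, c in cards.items():
    perf_per_dollar = c["perf"] / c["price"]
    print(f"{name:24s} perf/dollar = {perf_per_dollar:.3f}")

# With these assumed numbers, a 1080-class card at $350 works out to roughly a
# third better perf/dollar than the 2070, which is why the price point matters
# far more than the raw performance figure in the leak.
```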
We all know, however, that won't be the case, because the 2070 already takes that spot. So instead of fooling ourselves with some weird, illogical 'leaks' that misrepresent actual performance (as I've pointed out with the 1080 Ti stock vs 2080 OC examples), let's just get real.