Well, the 2060 is slightly faster than a 1070 Ti, and the 2070 is better than a 1080. The reason we're not seeing the full expected generational performance shift is RTRT: with Turing, we're trading some of that expected performance for a whole new GFX tech. I just don't see what the fuss is about.
If you take RTRT out of the equation, even the 1660 is a good deal faster than the 1060 for about the same price.
The 1660 is a decent deal, absolutely, but it's sadly the clear exception to the rule in terms of Turing and value. The 1650 is downright terrible value, and the higher tiers provide no real-world value gains for normal users. You're right that we gain new tech, but that tech works in, what, five games now, six months after the launch of the "people's RTX" 2060? I doubt RTX will provide any actual value for users before the majority of these cards are obsolete, sadly - which means buyers are giving up a generational perf/$ gain for a feature with very, very limited use.

And in two years or so, when we have far more games with RTRT, will an RTX 2060 be powerful enough to run them at 1080p60 without very significant compromises? That sounds unlikely to me - but of course, predicting the future is impossible, and I might be entirely wrong. It just seems like a poor bet from an end-user standpoint, and a poor showing from Nvidia, who are essentially saying "No, you're not getting a faster GPU this time around, but you're getting a feature that you might - if you're lucky - see some IQ gains from in future games." It's also rather odd given that GPUs tend to be upgraded on a 2-3 year cycle, meaning that first-gen products like these inevitably get superseded by newer solutions long before the tech becomes relevant or useful.
Now, don't get me wrong, I'm all for more realistic lighting, reflections, and spatial audio (realism can foster immersion, after all), but I'm not a fan of paying an effective premium for the promise that all of this might become a reality at some point in the future.