So as it stands: up to a year ago, ray tracing as a technology was exclusive to movie making because of the exceptionally high computing cost. Nvidia "decided" that ray tracing is the way of the future, managed to create RT Cores, and integrated them into their high-end TU102, enabling some RT features to run at playable frame rates at a decent resolution. (I'm not saying the technology is exclusive to Nvidia; AMD could incorporate RT in their next-gen cards if it made economic sense to them, and I suspect it doesn't.)
The problem? RT Cores cost die space. TU102 is a massive 754mm2 chip. I don't recall any consumer-grade/gaming card ever sold with such a massive die; these sizes have been exclusive to professional-grade Quadro cards, which sold at much higher prices. The reason has always been yields. This chip is nearly twice the size of the one in something like the GTX 980, and that doesn't equate to twice the chip cost, it costs many times more, because the bigger the die, the more likely it is to catch a fatal defect, and the fewer dies fit on a wafer in the first place. I'm not saying a $1200 card isn't profitable for them, it definitely is, probably even more profitable than previous generations if we talk margin percentages, but I don't imagine it's by a huge margin; it's not the rip-off it seems to be. This is a high-end card, and it would always have come at a premium given the lack of competition.
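To see why cost grows faster than die area, here's a rough back-of-the-envelope sketch using the standard Poisson yield model. The die areas are public (GM204 in the GTX 980 is ~398mm2, TU102 is ~754mm2); the defect density and the edge-loss formula for dies per wafer are illustrative assumptions, not TSMC's actual numbers:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per 300mm wafer (common edge-loss formula)."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2=0.001):
    """Poisson yield model: chance a die has zero fatal defects.
    0.001 defects/mm2 (0.1 per cm2) is an assumed, illustrative density."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

def relative_cost(small_mm2, big_mm2, d=0.001):
    """Cost per *good* big die relative to a good small die, same wafer price."""
    good_small = dies_per_wafer(small_mm2) * poisson_yield(small_mm2, d)
    good_big = dies_per_wafer(big_mm2) * poisson_yield(big_mm2, d)
    return good_small / good_big

# GM204 (GTX 980) vs TU102: ~1.9x the area, but roughly 3x the cost per good die
print(round(relative_cost(398, 754), 2))
```

With these assumptions, a die that's 1.9x larger comes out roughly 3x more expensive per working chip, since you get fewer candidates per wafer and lose a larger fraction of them to defects. The real gap is likely worse on a newer, less mature process.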
The technology is still in its infancy, and Nvidia wanted to make sure they were first. The decision to include a novel, unproven technology in their high-end cards, driving up manufacturing costs with a massive die, and to pass those costs to consumers might seem a bit premature. However, the timing is perfect given the lack of competition. Nvidia couldn't have afforded to do this if AMD were on top of its game, and I suspect Nvidia saw AMD making a push with its edge in 7nm tech and the development of Infinity Fabric and MCM cards, which were widely expected to be the tech behind Navi up until last June.
For anyone not interested in RT in its current state, Pascal cards are still on the market. I know they are previous-gen cards still sold at a premium with an extended life cycle, but that's only because there was no real competition from AMD. Progress in chip making has slowed as Moore's law fades, and in that context it's hard to say whether AMD performed badly or Nvidia performed exceptionally well in the previous generation.