It's not that they're less interested; it's that RT is barely tractable on mid-range cards, so they don't know what the low end should look like. Remember how they had to come up with the 1660?
That's right. The GTX 16 series was the last budget line where Nvidia built a GPU without RT compute units. Just imagine if Nvidia did that all the way up the stack and designed high-end GPUs both with and without RT. The non-RT parts would be cheaper, along the lines of historic pricing. Guess how many people would spend the extra hundreds of dollars for RT-enabled GPUs. I'm thinking of a number close to, but not quite, zero.
This situation reminds me of the 32/64-bit hybrid CPU architectures, which got this right. AMD64 has 16 general-purpose registers; 32-bit software uses the lower halves of the original 8, while 64-bit software uses all 16 at full width. The same physical storage serves both modes. Just imagine if AMD had instead bolted on a separate 64-bit register bank that couldn't overlap with the 32-bit one. The extra transistors would have increased costs for no benefit.
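The overlap being described can be sketched as a toy model. (This is illustrative only: the register count and aliasing behavior follow x86-64 conventions, but the `RegisterFile` class and its method names are made up for this sketch, not any real API.)

```python
class RegisterFile:
    """Toy model: 32-bit register names alias the low halves of the
    same physical 64-bit registers, so widening the architecture did
    not require a second, disjoint register bank."""

    def __init__(self, count=16):
        self.regs = [0] * count  # one shared 64-bit slot per register

    def write64(self, i, value):
        self.regs[i] = value & 0xFFFFFFFFFFFFFFFF

    def write32(self, i, value):
        # x86-64 zero-extends 32-bit writes into the full register;
        # either way, no extra storage is needed for the 32-bit view.
        self.regs[i] = value & 0xFFFFFFFF

    def read64(self, i):
        return self.regs[i]

    def read32(self, i):
        return self.regs[i] & 0xFFFFFFFF


rf = RegisterFile()
rf.write64(0, 0x1122334455667788)
assert rf.read32(0) == 0x55667788   # 32-bit view is the low half of the same slot
rf.write32(0, 0xDEADBEEF)
assert rf.read64(0) == 0xDEADBEEF   # shared storage, not a separate bank
```

The point of the sketch: both views hit the same `self.regs` storage, which is the opposite of Nvidia's disjoint raster/RT units.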
This is why I do not like RT functionality. In Nvidia's case, the raster and RT functional units do not overlap: as more of a gaming scene uses RT, fewer of the raster compute units do useful work. If I understand things correctly, a fully path-traced game would leave the large majority of the GPU's compute units idle.
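A back-of-the-envelope model of that claim: if raster and RT work can only run on their own unit type, average die utilization falls whenever the workload mix doesn't match the area split. The 60/40 split below is a made-up illustrative number, not a real Nvidia figure.

```python
# Hypothetical fractions of compute area devoted to each unit type.
RASTER_AREA = 0.6
RT_AREA = 0.4


def utilization(rt_share):
    """Average fraction of compute area busy over a frame, assuming
    each unit type can only execute its own kind of work.

    rt_share: fraction of the frame's work that is RT (0.0 to 1.0).
    """
    raster_share = 1.0 - rt_share
    # Time each unit type needs to finish its work (work / capacity).
    t_raster = raster_share / RASTER_AREA
    t_rt = rt_share / RT_AREA
    frame_time = max(t_raster, t_rt)  # slower unit type sets the frame time
    if frame_time == 0:
        return 0.0
    # Work done divided by the capacity-time that elapsed.
    return (raster_share + rt_share) / frame_time


print(utilization(0.0))  # pure raster: RT area sits idle
print(utilization(0.4))  # mix matches the area split: everything busy
print(utilization(1.0))  # fully path-traced: only the RT area works
```

Under this toy model, a fully path-traced frame keeps only the RT fraction of the die busy, which is the scenario the paragraph above describes.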
So now we are saddled with expensive GPUs because of the extra RT compute units and the lack of overlapping functionality, and with the loss of budget GPUs, because Nvidia knows that a choice between RT and non-RT models would not go their preferred way and would undercut their supposed competitive advantage.