I suppose it doesn't hurt that these are all that the current-gen consoles can manage because, well, AMD didn't take RT seriously this gen.
I think that's ... how to put it ... an unreasonably harsh take. "Didn't take RT seriously" does not seem like a fitting description of a sequence of events that goes something like: "RTRT was considered unrealistic in consumer products in the near future -> Nvidia stuns people by launching it -> AMD responds with their own alternative the next generation, two years later." Even if both were most likely working on this in their R&D labs at roughly the same time (somewhat likely, at least), Nvidia's vastly larger R&D budget tells us it's highly unlikely that AMD had the resources to really prioritize RTRT before Turing. Nvidia also had no external pressure to produce this, meaning they could hold off on launching it until they deemed it ready - a luxury AMD didn't have once Nvidia moved first. Managing to put out a solution that more or less matches Nvidia's first-generation effort, even if Nvidia simultaneously launched a significantly improved second-generation effort? That's relatively impressive overall, especially considering the resource differences in play. Summing that up as "AMD didn't take RT seriously" is just not a reasonable assessment of that development cycle.
That obviously doesn't change the fact that Nvidia's RT implementation is currently significantly faster - that's just facts. But that's also what you get from having massively superior resources and the first-mover advantage that often comes along with them. AMD's current implementation is still a decent first-gen effort, especially considering what must have been a relatively rushed development cycle. That doesn't mean it's good enough - but neither is Ampere's RT, really. It's just better.
As for AMD paying developers to dumb down their RT implementations - something like that is likely happening, yes: at minimum paying "marketing support" and providing some degree of development/engineering support aimed at optimizing RTRT for current-gen consoles (specifically: not implementing features these consoles simply can't handle at all, and instead focusing on more scalable features that work in lighter-weight modes on the consoles). But there's also an inherent incentive to make use of console hardware (and not exceed it by too much) just due to the sheer market force of console install bases. I don't for a second doubt that AMD will take any advantage they can get wherever they can get it - they're a corporation seeking profits, after all - but even despite their growth and success in recent years, I don't think they have the funds to throw money at external problems the way Nvidia has been doing for decades. Some? Sure. Enough to, say, contractually bar developers from implementing additional RTRT modes on PC, on top of the console ones, that might make AMD look bad? Doubtful IMO. It's quite likely IMO that they're putting some pressure on developers in this direction, but a more likely explanation is simpler: once a developer is already implementing a given set of RT features, implementing more, different RT features (especially more complex ones) is an additional cost on top of that, and one that only pays off for a relatively small subset of customers (PC gamers with Nvidia RTX GPUs - and, for very performance-intensive features, only those with an RTX 2080 or faster). At some point, the cost of those features outweighs the possible benefits.