Looking at DirectX 12: all DirectX 12 technologies are automatically compatible with DirectX 12 GPUs, and the latest cards support it as standard, no matter whether they are AMD, Intel or Nvidia.
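To illustrate the vendor-neutral part, here is a minimal C++ sketch (error handling stripped, standard Windows SDK headers assumed) of how an application asks any DirectX 12 GPU which feature level it supports; the same query works on AMD, Intel and Nvidia hardware:

```cpp
// Minimal sketch: query the highest DirectX 12 feature level the first GPU exposes.
// Assumes the Windows SDK headers and linking against d3d12.lib/dxgi.lib; HRESULTs unchecked.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);            // first adapter; any vendor works the same way

    // Create a device at the D3D12 baseline, then ask which levels it actually supports.
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D_FEATURE_LEVEL levels[] = { D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_12_0,
                                   D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_2 };
    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels = sizeof(levels) / sizeof(levels[0]);
    info.pFeatureLevelsRequested = levels;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &info, sizeof(info));

    std::printf("Max supported feature level: 0x%x\n", info.MaxSupportedFeatureLevel);
}
```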
If I owned an AMD 7000 or 6000 series card right now, I would use OptiScaler; there are numerous tutorials on YouTube. It lets you enable DLSS/XeSS/FSR4 on both older and more modern games, even though it is beta software.
While the 7900 XTX is a very powerful GPU with much more VRAM, its AI capabilities only go up to FP16, and yes, at FP16 it is more powerful than the 9070 XT. But the 9000 series also does FP8, and there it is more powerful than any previous AMD GPU, because the older 7000/6000 series simply can't run FP8.
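A rough sketch of why FP8 support matters (not a benchmark, and the 100M-parameter figure is made up purely for illustration): halving the bits per weight roughly doubles how many values fit in cache/VRAM and how many the matrix units can process per clock.

```cpp
// Illustrative only: memory footprint of the same hypothetical AI weight tensor
// at different precisions. FP16 is what RDNA 2/3 accelerate; FP8 is what the
// RX 9000 series (RDNA 4) adds on top.
#include <cstdio>
#include <cstdint>

int main() {
    const std::uint64_t weights = 100'000'000;   // hypothetical 100M-parameter model
    const double mib = 1024.0 * 1024.0;

    std::printf("FP32: %8.1f MiB\n", weights * 4 / mib);
    std::printf("FP16: %8.1f MiB\n", weights * 2 / mib);
    std::printf("FP8 : %8.1f MiB\n", weights * 1 / mib);
}
```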
The RX 9000 series doesn't use chiplets; these are monolithic GPUs, like Nvidia's.
The 9070 XT that I own holds its own at 4K depending on what game you want to play; there are games where at 4K not even a 5080 can get more than 50 fps. In those cases you use the GPU's upscaler to render at 1440p or 1080p and scale the image back up, no matter whether it's a 5080, a 5070 Ti or a 9070 XT, and performance will be higher.
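Here is a small sketch of what those upscaler quality modes actually mean for a 4K output. The scale factors follow the commonly published FSR/DLSS presets (Quality 1.5x, Balanced 1.7x, Performance 2.0x per axis); treat the exact numbers as illustrative.

```cpp
// Rough sketch: map a 4K output resolution to the internal render resolution
// used by typical upscaler quality modes. The GPU shades far fewer pixels,
// then reconstructs the 4K frame from them.
#include <cstdio>

struct Mode { const char* name; double scale; };

int main() {
    const int outW = 3840, outH = 2160;   // 4K output
    const Mode modes[] = { {"Quality", 1.5}, {"Balanced", 1.7}, {"Performance", 2.0} };

    for (const Mode& m : modes) {
        int renderW = static_cast<int>(outW / m.scale);
        int renderH = static_cast<int>(outH / m.scale);
        std::printf("%-12s -> render %dx%d (%.0f%% of the pixels)\n",
                    m.name, renderW, renderH,
                    100.0 * (renderW * (double)renderH) / (outW * (double)outH));
    }
}
```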
Frame generation (frame multipliers) on both AMD and Nvidia really only works well if you already average at least 30 fps, and even then I wouldn't use it. Reason one: it puts extra load on the GPU. Reason two: it increases latency. Reason three: it drastically increases the GPU's power consumption.
If you already have 30 fps or 60 fps natively, you don't need more; if you can get 80 fps, that would be awesome. If you have an 80 Hz or higher display, 60 fps is actually excellent; any additional generated frames only add latency.
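A back-of-the-envelope sketch of the latency point: an interpolating frame generator has to hold back the newest real frame while the generated in-between frame is shown, so input lag grows by roughly one real frame-time. The numbers below are illustrative, not measurements of any specific implementation.

```cpp
// Why frame interpolation raises input latency: the newest real frame is delayed
// by about one real frame-time so the generated frame can be displayed first.
#include <cstdio>

int main() {
    const double realFps[] = { 30.0, 60.0, 80.0 };

    for (double fps : realFps) {
        double frameTimeMs  = 1000.0 / fps;               // time between real frames
        double baseLatency  = frameTimeMs;                // simplified: ~one frame of render lag
        double withFrameGen = baseLatency + frameTimeMs;  // hold back one extra real frame
        std::printf("%5.0f real fps: ~%5.1f ms native, ~%5.1f ms with interpolation\n",
                    fps, baseLatency, withFrameGen);
    }
}
```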
What you need to worry about is whether your GPU can run textures not only at Ultra, but also at Cinematic quality, which is a step above Ultra, at the resolution you're playing at, and whether your GPU is actually designed for that.
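To give a feel for why that matters, here is a rough VRAM estimate. It assumes BC7 block compression (about 1 byte per texel) plus a full mip chain (~1/3 extra); the texture count is a made-up round number, just to show how quickly higher-resolution texture tiers eat VRAM.

```cpp
// Rough VRAM estimate per texture quality tier. Assumptions: BC7 (~1 byte/texel),
// full mip chain (+1/3), and a hypothetical count of resident textures.
#include <cstdio>
#include <cstdint>

double textureMiB(std::uint64_t side) {
    double base = static_cast<double>(side) * side * 1.0;     // BC7: ~1 byte per texel
    return base * (4.0 / 3.0) / (1024.0 * 1024.0);            // + mip chain
}

int main() {
    struct Tier { const char* name; std::uint64_t side; } tiers[] = {
        { "Ultra (4K textures)",     4096 },
        { "Cinematic (8K textures)", 8192 },
    };
    const int residentTextures = 200;   // hypothetical number of unique textures loaded

    for (const auto& t : tiers)
        std::printf("%-24s %6.1f MiB each, ~%5.1f GiB for %d resident textures\n",
                    t.name, textureMiB(t.side),
                    textureMiB(t.side) * residentTextures / 1024.0, residentTextures);
}
```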
People think ray tracing or path tracing makes their game look better.
To explain: cinematic-quality textures have nothing to do with ray tracing or path tracing. But let's clarify something here: there are games created and optimized for Nvidia where cinematic-quality textures were removed as an option from the texture settings and only the ray tracing option was included, to justify the technology by implying that ray tracing improves the game's textures. That's not the case. In games not paid for by Nvidia, cinematic-quality textures are an additional option, and ray tracing does nothing more than cast light along specific rays and trace shadow paths.
Ray tracing and path tracing don't improve textures; they simply add extra lighting to the game. But that won't make a square Minecraft texture into a work of art like the Mona Lisa.
Games sponsored by Nvidia will always be optimized for higher performance on Nvidia, and games sponsored by AMD will be optimized for AMD. On the Nvidia side, Cyberpunk 2077 is an example; on the AMD side, Black Ops 6, where the 9070 XT gets +30 fps over the 5070 Ti and matches a 5080.