@WhenMusicAttacks let me illustrate something that's very relevant to what you said about AMD being so much better under $1000 thanks to VRAM for RT and textures.
Back in 2020 I placed an order for a 3080 10G, but the card wasn't available and the retailer kept me waiting forever. So I made a fuss, and as compensation they gave me a 250eur discount on a 3060 Ti 8G. I promptly took the offer, given I was using a 1070, which just wasn't good enough anymore.

The 3060 Ti ran great, until I bought Dying Light 2, which looks pretty poor unless you turn on RT, which transforms it entirely. I mean, it's like another game. The problem was that the 3060 Ti often ran out of VRAM in some areas, despite good performance in others. It either stuttered or dropped to 20 fps. My choice was either to lower textures or get a different card. I took the internet's advice and traded the card for a used 6800 someone was selling. The 6800 could easily fit everything into its VRAM, but with the same settings (incl. RT) it dropped to 40-50 fps in areas where the 3060 Ti was getting 60-70 fps whenever it wasn't hitting the VRAM limit. So, in hindsight, the internet was wrong: medium textures with RT would have worked out better on the 3060 Ti than max textures with RT on the 6800. Indiana Jones is an even clearer example, since its texture presets look almost identical from medium all the way up to supreme.
Here's medium textures + full RT (path tracing) vs. supreme textures with rasterization. Both use about 11.3GB of VRAM.
Am I really wrong that RT is more worth it than textures?