I blame reviewers honestly - they're so focused on making sure all the settings are bone stock and we're comparing raster to raster, or RT to RT, that they don't even talk about the features.
I've not seen a review that's like "these are the real settings a gamer would use in these games, and this one has settings that provide better IQ and FPS, while this one doesn't". It's always "well, if you run native raster at 4K ultra, the 7900 XTX gets a 40 fps slideshow, which is 100% faster than the 20 fps slideshow the 4070 Ti gets, so therefore the 7900 XTX is double the speed in this title".
The graphs are misleading.
I don't blame reviewers, because their job is already hard enough trying to cater to consumers who come in with their own personal biases.
A good example: even though I think DLSS is superior to FSR 99.9% of the time, I'm glad it isn't used in reviews, because I think that's something the consumer has to decide for themselves, and it should always be a bonus on top, not a crutch for developers' optimization.
The RT thing is hard as well, because I'd be willing to bet 50%+ of PC gamers don't use it, and there are still so many games shipping with terrible implementations, like the RE4 remake, where the crappy cube maps look better.
They really are doing the best they can and are in a no-win situation.
Even the 8GB card coverage on TPU I don't agree with, but I can fully understand the point they're trying to make: in the here and now, 99% of games are fine.
This all makes it really hard for the consumer as well with pricing being so terrible.
Lucky for me, I already had a 3080 Ti, so I know full well how a 7900 XTX generally performs in RT, because in most games they're similar. Having had that GPU for nearly two years prior probably made me more disappointed than most in how RDNA3 handles RT.