I recommend viewing the comparison slider images with one side set to max RT and the other set to max settings without RT.
I know RT is more accurate, but man, seeing how max settings add shadow detail here and there that isn't there with RT (which, again, is probably more realistic)... it makes the non-RT version look more believable. With RT, a lot of it is just so flat and ugly in how it meshes together (mostly referring to the 2nd image, the one outside).
The more important thing:
Look at those 1440p RT (with 960p upscaling) mins for the 4070 Ti.
If you spent a not-dissimilar amount of money on a 7900 XTX (for absurdly better raster), you could actually keep 60fps mins (with a little OC)! Or even run 4K native without RT. The 4080 isn't even close at 4K native.
Look at those 4K RT (1440p upscaling) mins with the 4090/5080. Ooo... missed it by that much. Totally not tuned that way on purpose so that even overclocking still leaves you below 60. Time to upgrade!
Not good enough for native 1440p RT on the 5080 either.
The 6800 XT is good enough for 1440p native max without RT. Cheap as nuts.
Ain't that a b. Or rather an L for the overpriced Nvidia cards.
...and it proves my point EXACTLY about what Nvidia does.
Or, you know, exactly why you shouldn't invest in ray tracing until 3nm (built toward next-gen consoles)... which I have said for six damn years and will continue to say until both companies launch 3nm parts.
I will then declare it has arrived, and we can all play together on whatever cards can actually run the damn thing for more than one generation of games.
...But I can't promise it won't still be ugly and/or won't be worth the performance trade-off.