Thanks for the benchmarks, I don't even want to think about the amount of work that goes into doing something like this.
As someone who owns and plays with both a 3080 and a 6800 XT (at 1440p), I can confirm that on average they perform very similarly, but in individual games I've seen up to a 20-25% difference in framerate, so it would be easy to "improve" the result in either GPU's favor just by selecting the right games.
For me, the 6800 XT is actually the GPU I prefer. I've had fewer issues with my monitors on the 6800 XT than on the 3080, and the drivers (so far) have been solid in the games I've played. The 3080 had flickering issues on some drivers, and a driver a few months ago also caused textures to not load correctly in some games. Using two monitors also increases VRAM usage a bit (usually by 200-700 MB).
If someone had asked me, before I tried the 6800 XT for myself, which GPU I would rather buy at MSRP, the 6800 XT or the 3080, my answer would've been the 3080. Today it would actually be the 6800 XT. Sure, having DLSS and better-performing RT is nice, but DLSS below 4K is overrated imo (at 1440p I have always preferred native whenever I compared them), and with RT I only notice reflections looking better; other effects such as GI, shadows, etc. just look "different" in most cases. But the biggest issue for me with the 3080 is how much hotter my room gets while playing games. I don't know if the Nvidia GPU also makes my 5900X run hotter, but the amount of "extra" heat generated feels like more than it should be. I have a reference 6800 XT (300W) that I OC slightly and a 3080 Aorus Master (375W factory OC), and the difference feels more like 150W than 75W.