In what titles does a 6900XT have superior raster performance (by a non-insignificant margin, shall we say 10%+?) while consuming 200W to the 3090's 350W? I don't think I've ever seen that.
Well, Metro Exodus comes to mind, along with Dirt 5, Forza 5, Resident Evil Village, and Red Dead Redemption, to name a few that I know deliver higher frame rates at 4K and 1440p. I'm not saying there aren't Nvidia-optimized titles (Horizon Zero Dawn, for instance) that perform better, but generally the 6900XT is within 10fps of the 3090 in non-RT scenarios, while using 150W less power at a much lower price point.
More materially to my point, the 6900XT enjoyed advantages in the metrics I quoted to the tune of 88% and 61%. I don't think the 3090 enjoys a single win over the 6900XT to the tune of even 61%, let alone 88%, though I'm sure if you dig hard enough you might find an unrealistic niche example or two where that's the case.
At the end of the day, the only things that really matter are how much you have to pay for a given level of performance and how much power it takes to get there. The 3090 is 50% more expensive, with 50% higher power draw, and sometimes, in certain games at certain settings, it performs marginally better.
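To put rough numbers on that, here's a back-of-the-envelope sketch. The prices are the launch MSRPs ($999 vs $1,499) and the power figures are the official board-power ratings (300W vs 350W); the 5% performance edge for the 3090 is purely an illustrative assumption, not a benchmark result:

```python
# Rough value comparison using assumed figures (MSRPs, rated board power,
# and a hypothetical 5% average performance edge for the 3090).
cards = {
    "6900 XT": {"price_usd": 999,  "power_w": 300, "relative_perf": 1.00},
    "RTX 3090": {"price_usd": 1499, "power_w": 350, "relative_perf": 1.05},
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price_usd"] * 1000  # perf per $1000 spent
    perf_per_watt = c["relative_perf"] / c["power_w"] * 100       # perf per 100 W drawn
    print(f"{name}: {perf_per_dollar:.2f} perf/$1000, {perf_per_watt:.2f} perf/100W")
```

Under those assumptions the 3090's ~5% frame-rate edge buys it roughly 30% worse performance per dollar and ~10% worse performance per watt, which is the whole point about value.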
From what I know, the 6900XT enjoyed a minor lead at 1080p (less than 10% on average), the two were roughly at par at 1440p, and the 3090 enjoyed a minor lead at 4K (less than 10% on average).
It's very much title-dependent; some games are optimized in favor of one card or the other, but overall the averages favor the 6900XT.
That's the reality I remember, except most publications don't really test DLSS, at least not in like-for-like testing, because then it wouldn't be like for like... so the 3090 trounces a 6900XT in RT, and then you have DLSS to widen the gap even further.
That was true prior to FSR being a thing, but it's more commonplace now, particularly since FSR 2/2.1 can so easily be modded into any title with DLSS and used on pretty much any GPU made in the past five years. Yes, Nvidia has the lead in RT, and yes, I'm one of those idiots who spent stupid money buying a card during a pandemic in the EU specifically for that feature, simply because when I started playing around with CAD apps 25 years ago it took 36 hours to render a single RT light source on a blank background. While the nerd in me loves the fact that it's a thing, the reality is that it's RARELY noticeable in practice while gaming, beyond the fact that your performance has dropped by 2-3x. It's still in the realm of curiosity; it has been gaining traction over the past year, but I think it's going to be another year or two before we really see its full potential.
If I were you, I'd brace for AMD being all too happy to follow this trend; hell, it's already started.
From what I know, the 79xx cards are going to have a power limit of 300-400W, with a better than 50% uplift in performance per watt and in RT performance. In 2008 I had a pair of 4870 X2s, each of which consumed 285-350W for 2.4 teraflops of rendering performance, so AMD has managed to keep power consumption fairly consistent. Given that the 4090 costs $1,600, if you can find it in stock and don't care about maybe setting your computer on fire, if AMD manages 75% of the performance for under $1,000, they'll be in a very good position.
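The value math behind that, as a quick sketch: both the sub-$1,000 price and the 75% performance figure are my speculation, not confirmed specs:

```python
# Hypothetical value comparison based on the assumptions above (speculative, not benchmarks).
rtx_4090  = {"price_usd": 1600, "relative_perf": 1.00}
amd_guess = {"price_usd": 999,  "relative_perf": 0.75}  # assumed 75% of 4090 performance

for name, card in (("RTX 4090", rtx_4090), ("hypothetical RDNA3 card", amd_guess)):
    dollars_per_perf = card["price_usd"] / card["relative_perf"]
    print(f"{name}: ${dollars_per_perf:.0f} per unit of 4090-level performance")
# -> 4090: $1600 per unit; hypothetical card: ~$1332 per unit, i.e. roughly 17% cheaper per frame.
```

If those assumptions hold, AMD undercuts the 4090 on cost per frame even while giving up the performance crown, which is exactly the 6900XT-vs-3090 dynamic repeating itself.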
Now, despite everything I've said, if money were no object and I didn't consider anything beyond the raw performance numbers, I'd buy Nvidia in a heartbeat. The end result is incredibly impressive; the means of achieving it is just disappointing.