Honestly both cards are hugely underwhelming: the 7900XT is barely 22% faster than the 6900XT at 1440p and 4K, there's huge power draw in multi-monitor setups, overall efficiency is worse than the 4080, and RT performance is piss-poor.
You're comparing the 2nd-tier model with last generation's flagship and disappointed that it's only 22% faster?
How is that different to Nvidia, or previous AMD generations going back at least half a decade?
As for multi-monitor power draw, that's likely a bug with launch-day drivers and should be patched soon. Neither the drivers nor the reference card's power delivery look great, but at the same time it's not as if either of Nvidia's last two launch generations has been problem-free. That's why we get driver updates!
Efficiency is a big one that won't be solved with software or drivers; the fact it's not as efficient as Nvidia is potentially down to the chiplet design, which adds energy cost overheads and is one of the main reasons we only get monolithic AMD CPUs for laptops. Chiplet design increases the physical distance between bits of silicon that have to communicate with each other, and the additional interfaces between chiplets all have some internal resistance. It's small, but it adds up. Even if it doesn't beat the 4080, it's still MUCH more efficient than the previous generation.
You have to remember that every design decision has implications and drawbacks. Chiplet design reduces costs at the expense of some efficiency (among other things), and with RDNA3 we are seeing cheaper cards. Unlike the 6900 series, the XTX is ~$400 less than the cheapest 4080 cards, and the XT, even at its "incorrect" price, is better performance/$ than the 4080 by a decent margin. We'll have to see what price Nvidia launches the 4080 12GB 4070 Ti at to truly compare the 7900XT against the competition, though.
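For anyone who wants to check the performance/$ argument themselves, the arithmetic is trivial. The prices and relative-performance figures below are made-up placeholders purely to show the calculation (real street prices and benchmark deltas vary by region and by which reviews you average), so don't read them as measured data:

```python
def perf_per_dollar(rel_perf: float, price: float) -> float:
    """Relative performance points per dollar spent.

    rel_perf is an arbitrary index (e.g. 100 = some baseline card),
    so only the *ratio* between two cards is meaningful.
    """
    return rel_perf / price

# Placeholder numbers for illustration only -- plug in real
# street prices and your preferred review's performance index.
cards = {
    "7900 XT":  {"price": 899,  "rel_perf": 100},
    "RTX 4080": {"price": 1199, "rel_perf": 105},
}

for name, c in cards.items():
    v = perf_per_dollar(c["rel_perf"], c["price"])
    print(f"{name}: {v * 1000:.1f} perf points per $1000")
```

With numbers like these, a card that's ~5% faster but ~33% more expensive still loses clearly on value, which is the shape of the argument above.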