You've made some flawed assumptions. The 5800X3D does give up a tiny bit of multi. That doesn't mean they sacrificed multi for gaming. And you've then carried this over to the next-gen series. Again, flawed assumptions about AMD's motives. It's got lower temp limits and OC limits due to the 3D cache. That is called a trade-off for the gains from the 3D cache. And taking the 5800X3D as an example, in its case it's like 500 points out of a 15K score in R23. Stop making something out of nothing. And the scheduler is a whole other ball of wax...
You make it sound like the 5800X3D only trails the 5800X in synthetics like R23. TechPowerUp tested it in 38 applications:
"Averaged over our application test suite, the Ryzen 7 5800X3D falls 3% behind the original Ryzen 7 5800X because the 5800X3D runs at lower clocks than its sibling. AMD confirmed that the 3D V-Cache die is limited to 1.35 V maximum operating voltage, a limit that applies to the whole processor due to the way power is routed into the CPU, including the compute cores. Since Zen 3 does require 1.5 V and above to reach the highest boost clocks, AMD had to reduce the core clocks a bit to ensure stability at all times."
Remember, this is the more expensive CPU, trailing the less expensive one across a really wide assortment of applications. And some applications take a much greater hit than the calculated average - but it's more exciting to talk about the selected games that gain the biggest boost.
I know it's a trade-off. But I believe it's a trade-off for something gamers usually wouldn't even notice without benchmarking:
"Averaged over our 10 games at the CPU-bottlenecked 720p resolution, the Ryzen 7 5800X3D can gain an impressive 10% in performance over its 5800X counterpart."
A whole 10%, at the eye-watering resolution of 720p! Of course, this gap gets progressively smaller as we raise the resolution to real-world ones: at 1440p we're down to 5%, and at 4K we're well below the 3% productivity loss we traded away.
So, the 5800X3D in my eyes gives up a tiny bit in productivity and gains a tiny bit in gaming. A trade-off - but it's not priced like one. It's priced as an improvement, and people were jumping on it like it really is a noticeable one.
And is comparing the previous generation of X3D with the newer one flawed? Sure, it might be. But all the hopes that AMD somehow overcame the problems of adding an extra layer of cache are slowly evaporating as we see the same impact on specs as with the 5800X3D. And it might have an even greater impact on productivity - we don't know yet what the 6-degree-lower Tjmax will bring. But I think it can't be insignificant; otherwise AMD wouldn't push for CPUs that jump to 95 degrees, a temperature that not long ago surely meant a badly mounted cooler.