Here's the thing. Datacenters don't buy CPUs based on reviews.
In fact, most PC owners don't read reviews, nor would they understand them. Even many gamers don't.
Yes, this is an important issue, but many of you are overestimating its importance for the whole market.
I mean: isn't the main argument of Intel's critics that each generation brings only a ~5% improvement? That it's
nothing, marginal, irrelevant? That Kaby Lake is just a revamped Skylake or something?
So now we have a CPU design flaw whose fix, on average, sets us back a generation. Why is it suddenly such a big deal to the same people?
I like the +5% yearly, so I should be pissed off when it's taken away from me. And I might be, but I'm waiting for the patch. We'll see what it does to my PC. Possibly (hopefully) not much.
Of course it's a serious flaw, but there's a difference between 5% and 30%, that's all I'm saying. If it's 5%, most people won't even notice.
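Where a given machine lands between those numbers depends on workload: the mitigation (kernel page-table isolation) adds cost to every user/kernel transition, so syscall-heavy work pays far more than code that stays in user space. If you'd rather measure than guess, here's a rough Python sketch you could run before and after the update; the run count and the os.stat() target are arbitrary choices for illustration:

```python
# Minimal sketch: compare a syscall-heavy loop with a compute-bound one.
# Run it before and after the patch; any gap should widen mainly on the
# syscall side. RUNS and the os.stat() target are arbitrary choices.
import os
import time

RUNS = 500_000

def time_it(fn, runs=RUNS):
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return time.perf_counter() - start

# Each os.stat() crosses the user/kernel boundary, which is exactly
# where page-table isolation adds its overhead.
syscall_time = time_it(lambda: os.stat("."))

# Pure user-space arithmetic; the patch should barely register here.
compute_time = time_it(lambda: sum(range(100)))

print(f"syscall-heavy: {syscall_time:.2f}s")
print(f"compute-bound: {compute_time:.2f}s")
```

On a patched system, expect the syscall-heavy number to move noticeably while the compute-bound one barely changes; games and most desktop apps behave more like the second loop.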
BTW:
Bullseye on auto-update. I have nothing against it. Actually, since I moved to W10 on all my PCs, I stopped worrying about updates, stopped reading their descriptions, and so on. It saves a lot of time. You think less about technical issues and more about actual problems. It's like moving from C++ to C# (though C++ has evolved too).
Think about how inconsistent people are. Almost no one cares how a new OS version differs from the previous one. Yet so many people freak out about updates.
I know it slightly bruises your ego: you're "an enthusiast", you want control over your PC, and so on. But productivity-wise, it really saves a lot of time. I trust a third-party cleaning company with my suits, so why wouldn't I trust Microsoft or Intel with my PC?
No, it will affect all PCs. But the issue itself is more severe on servers, security-wise.
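If you want to verify your own machine once the update lands: on Linux 4.15+ the kernel reports its mitigation status under a standard sysfs directory, and the sketch below simply dumps those files (on Windows 10, Microsoft's SpeculationControl PowerShell module gives the equivalent report):

```python
# Minimal sketch: print the kernel's self-reported mitigation status.
# Requires Linux 4.15+, which added this sysfs directory; on older
# kernels or on Windows the directory simply won't exist.
import os

VULN_DIR = "/sys/devices/system/cpu/vulnerabilities"

if os.path.isdir(VULN_DIR):
    for name in sorted(os.listdir(VULN_DIR)):
        with open(os.path.join(VULN_DIR, name)) as f:
            print(f"{name}: {f.read().strip()}")
else:
    print("No sysfs vulnerability report on this system.")
```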
But here's some consolation, in case you're worrying too much: your desktop's performance has been compromised by server-specific needs anyway, ever since the architectures were unified.