There seems to be a (potentially very cool, or very janky, depending on your perspective) BIOS-level fix for this. I don't know if it's in current BIOSes or upcoming ones, but they've announced a toggle ("Legacy game compatibility mode" or something like that) that, when enabled, lets you press Scroll Lock to park and effectively disable the E-cores on the fly, avoiding Denuvo issues and the like. I'm a bit shocked that this doesn't hard-crash the system, but I guess it automatically migrates all threads off of those cores before shutting them down?
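For reference, here's a minimal sketch of the OS-level analogue on Linux, which is presumably similar in spirit to whatever that toggle does: the kernel's CPU hotplug interface migrates running threads off a core before taking it offline. The CPU IDs below are hypothetical placeholders - check lscpu or the topology files under /sys/devices/system/cpu to see which logical CPUs are actually E-cores on a given chip.

```python
#!/usr/bin/env python3
# Sketch of the OS-level analogue on Linux (not necessarily what the BIOS
# toggle actually does): the kernel's CPU hotplug interface migrates running
# threads off a core before taking it offline. Needs root.
from pathlib import Path

# Hypothetical logical CPU IDs for the E-core cluster -- check `lscpu` or
# /sys/devices/system/cpu/cpuN/topology/ to find the real ones on your chip.
E_CORES = range(16, 24)

def set_online(cpu: int, online: bool) -> None:
    """Toggle a logical CPU via its sysfs 'online' knob."""
    Path(f"/sys/devices/system/cpu/cpu{cpu}/online").write_text("1" if online else "0")

if __name__ == "__main__":
    for cpu in E_CORES:
        set_online(cpu, False)   # park the E-cores; the scheduler moves threads off first
    # ... run the legacy / Denuvo-protected game here ...
    for cpu in E_CORES:
        set_online(cpu, True)    # bring them back afterwards
```

Presumably the BIOS toggle does something similar at the firmware level, just bound to a hotkey instead of a sysfs write.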
The underdog effect has a lot to do with AMD's current success, that's for sure. And yes, AMD has historically been everywhere from competitive to clearly ahead to clearly behind, but they have always been a much smaller company than Intel, with all the competitive drawbacks that brings - including (but not limited to) various illegal anticompetitive behaviour on Intel's part. That "Dell is the best friend money can buy" quote from internal Intel communications, which came to light in one of the antitrust cases, summarizes things pretty eloquently IMO.

There's one major factor that hasn't been mentioned though: the size of the total PC market and how that affects things. Intel's late-2000s performance advantage with Core, combined with the tail end of their various dubiously legal exclusivity deals, arrived right at the point where a) laptops became properly viable for everyday use, and b) the internet had become a properly commonplace thing. So Intel came into a rapidly expanding market (lower prices per unit, but volumes increased dramatically) while also rapidly becoming the only viable option for servers. They had some damn good luck with their timing there, and IMO that contributed quite a bit to their dominance in the 2010s - it wasn't just that Bulldozer and its ilk were bad. Now, Bulldozer was pretty bad, but in a less drastically unequal competitive situation that could have evened out in some way.

Instead AMD nearly went bankrupt, but thankfully stayed alive thanks to their GPU division still churning out great products. As such, I guess we can thank ATI, and AMD's decision to buy them, for the competitive and interesting CPU market we have today. And I don't think we would have gotten the underdog effect without the GPUs either - their ability to deliver good-value GPUs likely contributed significantly to that.
Paragraphs, man. Paragraphs!
You're right about that, but my point was based on current reality, which is that both major vendors have had IMCs for at least two generations that can reliably handle XMP settings at 3600 or higher. I don't believe I've ever heard of a recent-ish Intel or Zen2/3 CPU that can't handle memory at those speeds. Zen and Zen+ on the other hand ... nah. My 1600X couldn't run my 3200c16 kit at XMP, and I saw intermittent crashes at anything above 2933. But AMD's IMCs have improved massively since then. So while you're right on paper, the reality is that any current CPU is >99% likely to handle XMP just fine (given that, as I've said all along, you stick to somewhat reasonable speeds).
Yeah, that makes sense. Zen and Zen+ IMCs were pretty bad, and running two DIMMs per channel definitely didn't help. If those DIMMs had been dual-rank as well, you might not have been able to boot at all. Thankfully AMD stepped up their IMC game drastically with Zen2 and beyond, and hopefully they're able to keep that up for DDR5.
I don't doubt it, but that's definitely niche territory.
Yes, obviously, but my point was that compression/decompression workloads tend to be intermittent and relatively short-lived (unless you're working on something that requires on-the-fly decompression of some huge dataset, in which case you're looking at a much more complex workload to begin with). They might be tens or even hundreds of GB, but that doesn't take all that much time with even a mid-range CPU.
I said "roughly even", didn't I? Slower in one, faster in the other, that's roughly even to me.
Yeah, they were good if you had a low budget and workloads that were specifically MT-heavy and integer-only (with that shared FP unit per pair of cores effectively halving the core count in FP loads). So, yes, they were great value for very specific workloads, but they were quite poor overall - and while this gave them a good niche to live in, the lack of flexibility is part of why Intel's perceived advantage grew so huge (in addition to the inefficiency and resulting lack of scaling, of course).
Hm, thanks for linking those. Seems the board partners have been even worse than I thought - and, of course, Intel deserves some flak for not ensuring that they meet the requirements of the platform. This is simply not acceptable IMO.
In theory you're right, but if you're buying a high-end CPU for heavy, sustained workloads, you really should also buy something with a VRM to match. If not, you're making a poor choice, whether that's due to ignorance or poor judgement. That could still be an H510 board of course - it's just that nobody makes an H510 board with that kind of VRM. That doesn't take away Intel's and the board manufacturers' responsibility to ensure boards meet spec - i.e. every board should be able to handle PL1 power draws indefinitely for the most power-hungry part on the platform, and should also handle PL2 reasonably well as long as conditions aren't terrible (i.e. there is some ventilation cooling the VRMs and ambient temperatures are below 30°C or thereabouts). The issue, and why I said you shouldn't buy that combo, is that all the H510 boards are bargain-basement boards that generally can't be expected to do this, and are almost guaranteed to leave performance on the table - which undermines the "I only need CPU perf, not I/O" argument for going that route in the first place.
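As a side note on what PL1/PL2 mean in practice, here's a minimal sketch (assuming a Linux box with the intel_rapl driver loaded and a single package exposed at /sys/class/powercap/intel-rapl:0) that reads back the long-term and short-term package power limits the firmware/board has actually programmed:

```python
#!/usr/bin/env python3
# Minimal sketch: read back the package power limits (PL1/PL2) the firmware/
# board has programmed, via Linux's intel_rapl powercap interface. Assumes a
# single package exposed at /sys/class/powercap/intel-rapl:0.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")

def read_watts(filename: str) -> float:
    """Read a microwatt sysfs value and return it in watts."""
    return int((RAPL / filename).read_text()) / 1_000_000

if __name__ == "__main__":
    # constraint_0 is the long-term limit (PL1), constraint_1 the short-term one (PL2).
    pl1 = read_watts("constraint_0_power_limit_uw")
    pl2 = read_watts("constraint_1_power_limit_uw")
    tau = int((RAPL / "constraint_0_time_window_us").read_text()) / 1_000_000
    print(f"PL1 (sustained): {pl1:.0f} W over a {tau:.0f} s window")
    print(f"PL2 (boost):     {pl2:.0f} W")
```

On a board that follows Intel's guidance you'd expect the long-term limit to sit around the rated base power and the short-term one at the boost limit; plenty of boards just set both to effectively unlimited instead, which ties right into the board-partner complaints above.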
*raises hand* I used my C2Q 9450 from 2008 till 2017
You're entirely right that these chips lasted a
long time. However, a huge part of that is that MSDT platforms went quite rapidly from 1 to 4 cores and then stopped there for a decade, meaning software was never really made to use more than that. HEDT platforms had more cores, but were clocked lower, meaning you had to OC them to match the ST perf of MSDT parts - and they still topped out at 6 or 8 cores. Now that we've had 8+ core MSDT chips for four years, we're seeing a lot more software make use of them, which is great as it actually makes things faster, rather than the <10% perf/year ST improvements we got for most of the 2010s.
There's definitely a point there, but the shrinking of the HEDT market shows us that the most important thing for most of these customers has been CPU performance, with I/O being secondary. And of course PCIe 4.0 and current I/O-rich chipsets alleviate that somewhat, as you can now get a lot of fast storage and accelerators even on a high-end MSDT platform. X570 gives you 20 native lanes of PCIe 4.0 plus a bunch off the chipset, and there are
very few people who need more than a single GPU/accelerator + 3-4 SSDs. They exist, but there aren't many of them - and at that point you're likely considering going for server hardware anyhow.
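To put rough numbers on that, here's a quick back-of-the-envelope lane budget for a hypothetical maxed-out X570 build - the 16+4 CPU lane split and the dedicated x4 chipset uplink are the standard AM4 layout, but the parts list itself is made up for illustration:

```python
# Rough PCIe lane budget for a hypothetical high-end X570 build.
# The 16 + 4 CPU lane split and the dedicated x4 chipset uplink are the
# standard AM4 layout; the specific parts are made up for illustration.
cpu_lanes = {
    "GPU / accelerator (x16 slot)": 16,
    "NVMe SSD #1 (CPU-attached M.2)": 4,
}

chipset_devices = {
    # Everything here shares the chipset's own x4 link back to the CPU.
    "NVMe SSD #2": 4,
    "NVMe SSD #3": 4,
    "10GbE NIC": 4,
}

print(f"CPU lanes used: {sum(cpu_lanes.values())}/20 "
      f"(the chipset hangs off a separate, dedicated x4 link)")
print(f"Chipset-attached devices: {sum(chipset_devices.values())} lanes' worth, "
      f"all sharing one x4 uplink -- only a problem if they all peak at once")
```

Point being, a single GPU plus three or four SSDs fits comfortably without ever touching HEDT-class lane counts.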
That's quite different from what I'm used to seeing across these and other forums + in my home markets. Here, Celerons and Pentiums have been nearly nonexistent since ... I want to say 2017, but it might have been 2018, when Intel's supply crunch started. Ryzen 3000G chips have come and gone in waves, being out of stock for months at a time, particularly the 3000G and 3200G, and the Ryzen 3100, 3300 and 3500 have been nearly unobtainable for their entire life cycle. These chips seem to have been distributed almost entirely to OEMs from what I've seen, but clearly there are regional differences.
Oh, absolutely. But that just underscores what a massive turnaround this has been. They've always had a small core of fans, but that suddenly turning into a dominant group is downright shocking.
Hm, that sounds ... yes, mind-boggling. At least from that description it sounds like a society with a significant fixation on wealth and prestige - not that that's uncommon, though. Still, in Norway and Sweden too we've gone from a flagship phone at 4,000-5,000 NOK being considered expensive a decade ago to 10,000-15,000 NOK flagships selling like hotcakes these days. Of course, people's usage patterns and dependency on their phones have changed massively, so it makes some sense for phones to be valued more highly and for people to be willing to spend more on them, but that's still a staggering amount of money IMO. PC parts haven't seen quite the same rise, possibly due to being a more established market and more of a niche. There have always been enthusiasts buying crazy expensive stuff, but there's a reason the GTX 1060 is still the most used GPU globally by a country mile - cheaper stuff that performs well is still a much easier sell overall.
I think the promise of Intel HEDT-like performance (largely including the ST performance, though of course you could OC those HEDT chips) was much of the reason here, as well as the 5+ year lack of real competition making an opening for a compelling story of an up-and-coming company. If nothing else, Zen was
exciting, as it was new, different, and competitive. You're mistaken about the power consumption though:
Zen was very efficient from the start. That was perhaps the biggest shock - that AMD went from (discontinued) 250+W 8c CPUs that barely kept up with an i5 to ~100W 8c CPUs that duked it out with more power hungry Intel HEDT parts, and even competed on efficiency with Intel MSDT (though their ST perf was significantly behind).
I also think the surprise at "Intel being good again" is mainly due to Intel first not delivering a meaningful performance improvement for a solid four years (incremental clock bumps don't do much), and then failing to get their new nodes working well. Rocket Lake underscored that with an IPC regression in some workloads - though that was predictable given the drawbacks of backporting a 10nm design to 14nm. Still, this is the first time in more than half a decade that Intel has delivered a true improvement in the desktop space. Their mobile parts have been far better (Ice Lake was pretty good, Tiger Lake was rather impressive), but they've had clear scaling issues above 50-ish watts and beyond the very short boost durations laptops allow. They've finally demonstrated that they haven't lost what made them great previously - it just took a
long time.