You should be able to. And if you're running Windows 10 and you have problems with glitches or bugs, it might help. The P-cores are the shining part of Alder Lake anyway, so the loss of the E-cores won't hurt your overall performance much. However, if you're on Windows 11, leave them on.
That is the general consensus. However, Denuvo is the overwhelming offender here, and most devs are issuing patches to remove it.
There seems to be a (potentially very cool, or very janky, depending on perspective) BIOS-level fix for this. Don't know if it's on current BIOSes or upcoming, but they've announced a toggle ("Legacy Game Compatibility Mode" or something) that, when enabled, lets you press Scroll Lock to park and effectively disable the E-cores on the fly, avoiding Denuvo issues and the like. I'm a bit shocked that this doesn't hard-crash the system, but I guess it automatically migrates all threads off of the cores before shutting them down?
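For what it's worth, you can approximate that trick per-process from software today just by restricting CPU affinity - the scheduler migrates the threads off the excluded cores on the fly, no crash involved. A minimal sketch in Python using psutil, assuming the common Alder Lake enumeration where logical CPUs 0-15 are the eight hyperthreaded P-cores and 16-23 are the E-cores on a 12900K (check your own topology first; the name pin_to_p_cores is just mine for illustration):

```python
# Hedged sketch: confine an existing process (e.g. a Denuvo game) to the
# P-cores only. The OS migrates its threads off the E-cores, much like
# the announced BIOS toggle presumably does system-wide.
import psutil

# Assumption: logical CPUs 0-15 back the P-cores on a 12900K.
P_CORE_CPUS = list(range(16))

def pin_to_p_cores(pid: int) -> None:
    proc = psutil.Process(pid)
    proc.cpu_affinity(P_CORE_CPUS)  # exclude the E-cores (CPUs 16-23 here)

# Usage: pin_to_p_cores(some_game_pid)
```

The BIOS toggle is presumably nicer since it's global and actually powers the cores down, but the per-process version shows why no crash is needed.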
An even better summary (in my view) is that AMD has always been good at offering performance competitive with Intel's, usually at lower prices and with lower thermals. They only messed up the FX series, which took them a good 5-6 years to get out of, while Intel rode its high horse with 4-core chips. Now that Ryzen is good, AMD receives so much unnecessary hype that they can afford to ride the same high horse Intel did. Now that Alder Lake is good too, hopefully AMD gets the message and lowers prices to competitive levels.
In my opinion, a lot of AMD's hype comes from the "underdog effect". Though I agree that new Ryzen is good, it's good in a very different way than Intel is good. This is the kind of competition and diversification that I like to see.
The underdog effect has *a lot* to do with AMD's current success, that's for sure. And yes, AMD has historically been everywhere from competitive to clearly ahead to clearly behind, but they have always been a much smaller company than Intel, with all the competitive drawbacks of that - including (but not limited to) various illegal anticompetitive behaviour on Intel's part. That "Dell is the best friend money can buy" quote from internal Intel communications that came to light in one of the antitrust cases summarizes things pretty eloquently IMO.

There's one major factor that hasn't been mentioned though: the size of the total PC market and how this affects things. Intel's late-2000s performance advantage with Core, combined with the tail end of their various dubiously legal exclusivity deals, arrived at the point where a) laptops became properly viable for everyday use, and b) the internet had become a properly commonplace thing. So Intel came into a rapidly expanding market (lower prices per unit, but volume increased dramatically) while also rapidly becoming the only viable server alternative. They had some damn good luck with their timing there, and IMO that contributed quite a bit to their dominance in the 2010s - not just that Bulldozer and its ilk were bad. Now, it was pretty bad, but in a less drastically unequal competitive situation things could have evened out in some way.

Instead AMD nearly went bankrupt, but thankfully stayed alive thanks to their GPU division still churning out great products. As such, I guess we can thank ATI, and AMD's decision to buy them, for the competitive and interesting CPU market we have today. And I don't think we would have gotten the underdog effect without the GPUs either - their ability to deliver good-value GPUs likely contributed significantly to that.
Actually, AMD has been a big deal more or less forever. AMD used to make parts similar to Intel's, just later - they clocked them higher, sold them cheaper, and competed that way. Usually they were strong competitors. Around the K5 era Intel had enough and patent-trolled AMD away, so AMD had to do their own thing and develop their own architecture, as well as boards. And while K5 was mostly a loss for AMD, they started to make strides with K6. It flopped, but once they improved it with the K6-2 and started to push 3DNow! instructions, they became strong in the market.

When they moved to the K7 architecture, AMD launched the Athlon chips - the fastest CPUs around - and soon won the 1 GHz race. At that time AMD was doing exceptionally well and crushing Intel. They later made the Athlon XP chips, which were much cheaper, vastly improved and gained another GHz, but weren't faster than the Pentium 4. Still, with Intel catching loads of flak for the lackluster Pentium 4 launch, AMD was perceived very well.

Then came the K8 architecture: vastly improved, with slightly higher clocks, great thermals and power consumption, and now with support for 64-bit software and more than 4 GB of RAM, while still remaining affordable. Intel only managed to raise clock speeds and meddle with Rambus and FSB speeds to keep the Pentium 4 on life support, while AMD was beating them on literally every front - power use, thermals, performance, value. Only in cache-heavy tasks was AMD a bit behind; everywhere else it was a slaughter of the Pentium 4. But AMD still wasn't selling many of those Athlons, due to Intel's OEM exclusivity agreements and very heavy advertising of the Pentium 4 and its clock speeds.

Then AMD came out with the Athlon 64 FX, a fully unlocked, server-derived chip made for consumers. It was very fast and outcompeted the Pentium 4 Extreme Edition (the "Emergency Edition", as it was promptly nicknamed), but at a $1000 price and with the requirement to buy a server board, it failed to gain traction. Still, AMD soon came out with the first consumer-grade dual-core chips, the Athlon 64 X2s. Intel had nothing to counter them with and had to glue two Pentium 4s together. A single Pentium 4 was already dying from heat (https://arstechnica.com/civis/viewtopic.php?t=752419), with awful thermals and horrendous power usage - so why not put two of them into one chip? What a swell idea it was, until reviews came out and showed that the Pentium D was slow, ran awfully hot, and literally warped motherboards from heat. Intel only regained traction with the later Core 2 Duo release.

So, all in all, AMD was always a competent rival, with their worst beating of Intel coming around 1999/2000 and staying strong until 2005. And it's not like AMD made crap until Ryzen either. There were underwhelming releases, but generally AMD wasn't far behind Intel. In the K10 era, AMD was big on value and managed to sell quad-core chips for as little as $100 - something that previously cost over $300 - and you could overclock them too. Soon they came out with six-core chips at $200. Not exactly cheap, but way better than over $500 for the Intel one. And with the FX chips, AMD sold six cores for the price of an i3, and eight cores for the price of a higher-end i3. That really doesn't sound awful to me.

AMD also tried to drum up hype by breaking overclocking world records with Phenom II and FX chips - records that still stand today. On the saner side, they made APUs that could run AAA games at respectable resolutions and nearly 60 fps, which was unprecedented for a single chip, or for integrated graphics in general. They managed to change the perception of the iGPU from a mere display adapter to a budget gaming powerhouse. AMD was also making very competent graphics cards at the time, with the Radeon HD 6000 series a strong competitor to Nvidia, who only had the obscenely hot Fermi (aka "Thermi") cards out - and, even worse, followed it up with a second generation of Thermi offering very modest improvements. The Radeon HD 6950 consumed literally half the power of the Fermi-based GTX 580.

So AMD only lacked prestige because their best CPUs didn't quite measure up to Intel's, but they weren't far behind and were unbeatable on a budget. Through good times and bad, AMD historically sold chips at low prices (flagships excepted); it's only with Ryzen that AMD went full wanker with pricing, and I'm glad they finally got a kick in the balls to bring prices down a bit. To put things in perspective: in 2012 they sold six-core chips for $120 and eight-core chips for $160, while the 5600X launched at $300 despite being a physically smaller die. It's really unbelievable how inflated AMD's prices became, mostly on the back of hype that was partially unmerited.
Paragraphs, man. Paragraphs!
I think you are missing the point.
XMP is about setting a profile that the memory is capable of, but not necessarily the memory controller. Most of the stability issues people are having are actually related to their CPU sample, and to the fact that the memory controller (like any other silicon) degrades over time, depending on use etc.
So no, choosing an XMP profile supported by the motherboard and the memory module doesn't guarantee stability - it may not even complete POST. And over time it will become more unstable; some motherboards may gradually turn down the clock speed or revert to a lower profile after system crashes or failed POSTs, and the end user may not even know it's happening.
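Which is why it's worth actually stress-testing after enabling XMP rather than trusting a successful POST. Dedicated tools (MemTest86, TestMem5 and the like) are the right way to do it; purely to illustrate the principle, here's a crude pattern-test sketch in Python - a toy, since real instability often shows up as crashes or WHEA errors rather than polite bit flips, and the function name and defaults are my own:

```python
# Crude illustration of what memory pattern testers do: write a known
# bit pattern across a large buffer and verify it reads back intact.
# Real tools hammer many patterns and addresses for hours.
import sys

def quick_pattern_test(size_mb: int = 512, passes: int = 4) -> int:
    n = size_mb * 1024 * 1024
    buf = bytearray(n)
    errors = 0
    for p in range(passes):
        pattern = 0xAA if p % 2 == 0 else 0x55  # alternating bit patterns
        buf[:] = bytes([pattern]) * n           # fill the whole buffer
        errors += n - buf.count(pattern)        # any mismatch = flipped bits
    return errors

if __name__ == "__main__":
    errs = quick_pattern_test()
    print("bit errors:", errs)
    sys.exit(1 if errs else 0)
```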
You're right about that, but my point was based on current reality, which is that both major vendors have had IMCs for at least two generations that can reliably handle XMP settings at 3600 or higher. I don't believe I've ever heard of a recent-ish Intel or Zen 2/3 CPU that can't handle memory at those speeds. Zen and Zen+ on the other hand ... nah. My 1600X couldn't run my 3200C16 kit at XMP, and saw intermittent crashes at anything above 2933. But AMD IMCs have improved massively since then. So while you're right on paper, the reality is that any current CPU is >99% likely to handle XMP just fine (given that, as I've said all along, you stick to somewhat reasonable speeds).
It's a 2600X on a B450 board. I had been looking into moving to a newer-gen Ryzen, but the used market is horrible right now, probably due to the problems in the brand-new market. The BIOS after the one I'm using added memory compatibility fixes for Zen+, but since Proxmox is now stable, I decided not to rock the boat.
Also, it's a 4-DIMM setup, and when it was stable on Windows it was only 2 DIMMs (should have mentioned that), so take that into account. The official spec sheets for Zen+ and the original Zen show a huge drop in supported memory speed with 4 DIMMs - if I remember right, original Zen only officially supports 1866 MHz with 4 DIMMs?
My current 9900K handles the same RAM at XMP speeds that my 8600K couldn't manage; I suspect i5s might have lower-binned IMCs than i7s and i9s.
Yeah, that makes sense. Zen and Zen+ IMCs were pretty bad, and 2DPC (two DIMMs per channel) definitely didn't help. If those DIMMs had been dual-rank as well, you might not have been able to boot at all. Thankfully AMD stepped up their IMC game drastically with Zen 2 and beyond, and hopefully they're able to keep that up for DDR5.
Well, I used to. At some point I ran 3 machines all day: 2 of the 3 with native Windows BOINC loads, and one with a Linux VM running BOINC loads in both Linux and Windows. I don't do that anymore, but when you start out in crunching, you quickly realize how a generally decent everyday CPU suddenly becomes relatively inadequate. Soon you start to want Opterons or Xeons, and then you realize what a rabbit hole you've ended up in.
I don't doubt it, but that's definitely niche territory.
That's just one type of decompression.
Yes, obviously, but my point was that compression/decompression workloads tend to be intermittent and relatively short-lived (unless you're working on something that requires on-the-fly decompression of some huge dataset, in which case you're looking at a much more complex workload to begin with). They might be tens or even hundreds of GB, but that doesn't take all that much time with even a mid-range CPU.
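To put a rough number on that: even single-threaded zlib, hardly the fastest codec around, chews through an in-memory blob fast enough that tens of GB is minutes rather than hours (zstd or multithreaded 7-Zip are quicker still). A quick, unscientific sketch - the data mix and sizes are arbitrary, and the numbers will be whatever your machine gives:

```python
# Time single-threaded zlib decompression of an in-memory blob.
# The point is the order of magnitude, not a rigorous benchmark.
import os
import time
import zlib

# 128 MiB of mixed-entropy data: half random, half zeroes.
raw = os.urandom(32 * 1024 * 1024) + bytes(96 * 1024 * 1024)
blob = zlib.compress(raw, level=6)

t0 = time.perf_counter()
out = zlib.decompress(blob)
dt = time.perf_counter() - t0

assert out == raw
print(f"decompressed {len(raw) / 2**20:.0f} MiB in {dt:.2f} s "
      f"({len(raw) / 2**30 / dt:.2f} GiB/s)")
```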
Or did it? The overclocked FX 83xx was slower in the first pass, but faster in the second.
I said "roughly even", didn't I? Slower in one, faster in the other, that's roughly even to me.
Don't forget that the FX octa-core chips cost nearly three times less than the i7. And that was close to an ideal situation for the i7, as that workload clearly benefited from HT - some workloads take a performance hit from HT. And the FX had real cores, so the performance of an overclocked FX should have been far more predictable than with the i7. But power consumption...
Yeah, that was rough. But still, even at stock speeds the FX was close to the i7, and in the second-pass benchmark it beat the i7. FX chips were also massively overvolted from the factory - 0.3 V undervolts were achievable on nearly any chip. The chip was power-hungry to begin with, and AMD did it no favours by setting the voltage so unreasonably high.
Yeah, they were good if you had a low budget and workloads that were specifically MT-heavy and integer-only (with that shared FP unit per two cores making it effectively half the core count in FP loads). So, yes, they were great value for very specific workloads, but they were quite poor overall - and while this gave them a good niche to live in, the lack of flexibility is part of why Intel's perceived advantage grew so huge (in addition to the inefficiency and thus lack of scaling, of course).
The Phenom II X6 was decent. It had the single-core performance of the first-gen FX chips, which translates to somewhere in between a Core 2 Quad and the first-gen Core i series. It was closer to the i7 in that regard than the 3970X is to the 5950X today. And the Phenom II X6 1055T sold for nearly half the price of an i7, so the value proposition was great.
Seems very sketchy - the boards clearly overheated, but I'm not sure whether it was just boost that got cut, or drops below base speed too. In the 3900X video the CPU clearly throttled below base clock; that's a fail by any definition. On the Intel side it's even worse:
The Gigabyte B560 D2V board throttled the i9 way below base speed, and some boards were just so-so. On the Z490 side:
The ASRock Phantom Gaming 4 was only just technically not throttling. I guess that's a pass, but in a hotter climate it would be a fail. And that's not even a really cheap board - there are many H410 boards with even worse VRMs, which are a complete gamble as to whether they'd cope with an i9 or not. I wouldn't have much confidence that they would.
All in all, there are plenty of shit claims from motherboard manufacturers. They mostly don't get flak for it because the media only tests mid-tier or high-end boards, but if the media cared about low-end stuff, board makers would be facing lawsuits. I guess that's still an improvement over the AM3+ era, when certain MSI boards melted or caught fire and many low-end boards throttled straight to 800 MHz.
Hm, thanks for linking those. Seems the board partners have been even worse than I thought - and, of course, Intel deserves some flak for not ensuring that they meet the requirements of the platform. This is simply not acceptable IMO.
I don't see anything bad about putting an i9 K on an H510 board. After all, the manufacturers claim they're compatible. If you're fine with fewer features, a lower-end chipset and so on, you may as well not pay for a fancier board. Also, some people start with an i3 and upgrade the same system to an i7 later; today (with an i9) that would be a throttlefest. I don't see anything unreasonable about upgrading the CPU later, and I don't think those people deserve to have their VRMs burning.
In theory you're right, but if you're buying a high-end CPU for heavy, sustained workloads, you really should also buy something with a VRM to match. If not, you're making a poor choice, whether due to ignorance or poor judgement. That could still be an H510 board, of course - it's just that nobody makes an H510 board with that kind of VRM. That doesn't take away the responsibility of Intel and the board manufacturers to ensure boards meet spec: every board should be able to handle PL1 power draws indefinitely for the most power-hungry part on the platform, and should also handle PL2 reasonably well as long as conditions aren't terrible (i.e. there is some ventilation cooling the VRMs and ambient temperatures are below 30°C or thereabouts). The issue, and why I said you shouldn't buy that combo, is that all the H510 boards are bargain-basement boards that generally can't be expected to do this, and are almost guaranteed to leave performance on the table - which undermines the "I only need CPU perf, not I/O" argument for going that route in the first place.
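As an aside, you can check what PL1/PL2 a given board and firmware combo is actually enforcing. On Windows, HWiNFO shows the limits; on Linux the intel-rapl powercap interface exposes them in sysfs. A hedged sketch, assuming a mainline kernel with the intel_rapl driver loaded and package 0 showing up as intel-rapl:0 (verify the paths on your own system):

```python
# Read the package power limits (PL1/PL2) and the PL1 time window (Tau)
# from the intel-rapl powercap sysfs interface. Linux-only.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # CPU package 0

def watts(name: str) -> float:
    # Values are exposed in microwatts; convert to watts.
    return int((RAPL / name).read_text()) / 1e6

if __name__ == "__main__":
    print(f"PL1 (long term):  {watts('constraint_0_power_limit_uw'):.0f} W")
    print(f"PL2 (short term): {watts('constraint_1_power_limit_uw'):.0f} W")
    tau = int((RAPL / "constraint_0_time_window_us").read_text()) / 1e6
    print(f"Tau (PL1 window): {tau:.0f} s")
```

If a cheap board silently clamps PL1 way down (or winds it down as the VRM heats up), this is where you'd see it.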
On the other hand, you could have bought a non-K i5 or i7 and seen it last nearly a decade with excellent performance. It was unprecedented stagnation, but it wasn't entirely good or bad. Even Core 2 Quad and Phenom II X4 users saw their chips last a lot longer than expected, since game makers kept their games runnable on that hardware too. Now the core race has restarted, and I don't think we'll see chips with usable lifespans as long as Sandy, Ivy or Haswell again. You may say that's good. Maybe for servers and HEDT it is, but for the average consumer it means more unnecessary upgrading.
*raises hand* I used my C2Q 9450 from 2008 till 2017
You're entirely right that these chips lasted a *long* time. However, a huge part of that is that MSDT platforms went quite rapidly from 1 to 4 cores, and then stopped there for a decade, meaning software wasn't made to truly make use of more. HEDT platforms had more cores, but were clocked lower, meaning you had to OC them to match the ST perf of MSDT parts - and they still topped out at 6 or 8 cores. Now that we've had 8+ core MSDT for four years, we're seeing a lot more software make use of it, which is great as it actually makes things faster, rather than the <10% perf/year ST improvements we got for most of the 2010s.
Well, I made a point about VRMs, more RAM channels, more PCIe lanes and so on. HEDT boards were clearly made for professional use, and those who migrated to mainstream are essentially not getting the full experience - just the performance. Is that really good? Or is it just some people pinching pennies and buying on performance alone?
There's definitely a point there, but the shrinking of the HEDT market shows us that the most important thing for most of these customers has been CPU performance, with I/O being secondary. And of course PCIe 4.0 and current I/O-rich chipsets alleviate that somewhat, as you can now get a lot of fast storage and accelerators even on a high-end MSDT platform. X570 gives you 20 native lanes of PCIe 4.0 plus a bunch off the chipset, and there are *very* few people who need more than a single GPU/accelerator + 3-4 SSDs. They exist, but there aren't many of them - and at that point you're likely considering server hardware anyhow.
Not at all - there used to be Ryzen 3100s, Ryzen 3200G-3400G APUs, and various Athlons. On the Intel side, Celerons and Pentiums were always available without issues; now they've become unobtainium - well, except the Pentium 4. Budget CPUs are nearly wiped out as a concept, along with budget GPUs. They don't really exist anymore, but they did in 2018.
That's quite different from what I'm used to seeing across these and other forums + in my home markets. Here, Celerons and Pentiums have been nearly nonexistent since ... I want to say 2017, but it might have been 2018, when Intel's supply crunch started. Ryzen 3000G chips have come and gone in waves, being out of stock for months at a time, particularly the 3000G and 3200G, and the Ryzen 3100, 3300 and 3500 have been nearly unobtainable for their entire life cycle. These chips seem to have been distributed almost entirely to OEMs from what I've seen, but clearly there are regional differences.
Maybe, but AMD has fanboys - and never underestimate fanboys and their appetite for being ripped off.
Oh, absolutely. But that just underscores what a massive turnaround this has been. They've always had a small core of fans, but that suddenly turning into a dominant group is downright shocking.
Or is it? I find my country's society mind-boggling at times. I was reading comments in a phone store about various phones and found out that the S20 FE is a "budget" phone and that the A52 is basically a poverty phone. Those people were talking about Z Flips and Folds as if they were somewhat expensive but normal, meanwhile the average wage in this country is less than half of what the latest Z Flip costs. And yet this exact same country loves to bitch and whine about how everything is bad and how everyone is poor or close to poverty. I really don't understand Lithuanians. It makes me think that buying a 5950X may be far more common than I'd like to admit, and that those two stores may be a reasonable reflection of society.
Hm, that sounds ... yes, mind-boggling. At least from that description it sounds like a society with a significant fixation on wealth and prestige - not that that's uncommon though. Still, in Norway and Sweden too we've gone from a flagship phone at 4-5 000NOK being considered expensive a decade ago to 10-15 000NOK flagships selling like hotcakes these days. Of course people's usage patterns and dependency on their phones have changed massively, so it makes some sense for them to be valued higher and people being more willing to spend more on them, but that's still a staggering amount of money IMO. PC parts haven't had quite the same rise, possibly due to being more of an established market + more of a niche. There have always been enthusiasts buying crazy expensive stuff, but there's a reason why the GTX 1060 is still the most used GPU globally by a country mile - cheaper stuff that performs well is still a much easier sell overall.
I generally dislike public perception, but you have to admit that whatever AMD did with marketing was genius. People drank AMD's kool-aid about those barely functional, buggy excuses for CPUs and hyped Zen 1 to the moon, despite it being really similar to how FX launched: focus on more cores, poor single-threaded performance, worse power consumption than Intel. I genuinely thought Ryzen would be FX v2, but nah. Somehow buying several kits of RAM to find a compatible one, having the computer turn on once in 10 tries, and being overall inferior to Intel suddenly became acceptable - and not only that, but desirable. Later gens were better, but it was the first gen that built most of Ryzen's reputation. And people quickly became militant against the idea that Intel was still better; some of them would burn you at the stake for such heresy. And now people are surprised that Intel is good again, as if Intel hadn't been the clear market leader for half a century.
I think the promise of Intel HEDT-like performance (largely including the ST performance, though of course you could OC those HEDT chips) was much of the reason here, as well as the 5+ year lack of real competition making an opening for a compelling story of an up-and-coming company. If nothing else, Zen was *exciting*, as it was new, different, and competitive. You're mistaken about the power consumption though:
Zen was very efficient from the start. That was perhaps the biggest shock - that AMD went from (discontinued) 250+W 8c CPUs that barely kept up with an i5 to ~100W 8c CPUs that duked it out with more power hungry Intel HEDT parts, and even competed on efficiency with Intel MSDT (though their ST perf was significantly behind).
I also think the surprise at "Intel being good again" is mainly down to Intel first not delivering a meaningful performance improvement for a solid 4 years (incremental clock boosts don't do much), and then failing to get their new nodes working well. Rocket Lake underscored that with an IPC regression in some workloads - though that was predictable given the disadvantages of backporting a 10nm design to 14nm. Still, this is the first time in more than half a decade that Intel has delivered *true* improvement in the desktop space. Their mobile parts have been far better (Ice Lake was pretty good, Tiger Lake rather impressive), but they've had clear scaling issues above 50-ish watts and beyond the very short boost periods of laptops. They've finally demonstrated that they haven't lost what made them great previously - it just took a *long* time.