Monday, January 3rd 2022
Intel to Disable Rudimentary AVX-512 Support on Alder Lake Processors
Intel is reportedly disabling the rudimentary AVX-512 instruction-set support on its 12th Gen Core "Alder Lake" processors through a firmware/ME update, reports Igor's Lab. Intel does not advertise AVX-512 for Alder Lake, even though the instruction set was heavily publicized for a couple of its past-generation client-segment chips, namely the 11th Gen "Rocket Lake" and 10th Gen "Cascade Lake-X" HEDT processors. The company will likely reserve AVX-512 as a feature that sets apart its next-gen HEDT processors derived from "Sapphire Rapids," its upcoming enterprise microarchitecture.
AVX-512 is technically not advertised for Alder Lake, but software that calls for these instructions can utilize them on certain 12th Gen Core processors when paired with older versions of the Intel ME firmware. The ME version Intel releases to OEMs and motherboard vendors alongside its upcoming 65 W Core desktop processors and the Alder Lake-P mobile processors will prevent AVX-512 from being exposed to software. Intel's reason to deprecate what few client-relevant AVX-512 instructions it had for Core processors could have as much to do with energy efficiency as with the lukewarm reception from client software developers. The instruction set is more relevant to the HPC and cloud-computing markets. Many thanks to TheoneandonlyMrK for the tip.
Source:
Igor's Lab
49 Comments on Intel to Disable Rudimentary AVX-512 Support on Alder Lake Processors
"I hope AVX512 dies a painful death, and that Intel starts fixing real problems instead of trying to create magic instructions to then create benchmarks that they can look good on."
-Linus Torvalds
And it does.
I remember the useless discussion around that PS3 emulator news; it isn't the first time Intel has omitted an extension for one reason or another. These extensions are a waste of silicon in a consumer environment, and a coder who decides to rely on them really shows that he doesn't grasp who his real user base is, and hides his inability to code normally using alternative aids, be it CUDA or OpenCL.
Disabling support for an instruction does not improve energy efficiency.
Intel screwed up by having different ISA support on the slow and fast cores of Alder Lake, which I warned would be a headache.
BTW, AVX-512 support can be enabled on certain motherboards with the slow cores disabled, at least with early firmware. So what if the core drops a few hundred MHz? It's still more than twice as fast as AVX2. You need to comprehend that performance is what matters.
www.igorslab.de/en/intel-deactivated-avx-512-on-alder-lake-but-fully-questionable-interpretation-of-efficiency-news-editorial/
This too
www.igorslab.de/en/efficiency-secret-tip-avx-512-on-alder-lake-the-returned-command-set-in-practice-test/
I reckon newer ADL CPUs will have AVX-512 fused off/disabled.
Dr. Ian Cutress
Just to clarify, in Roman's video here: Intel's engineers (Ari) specifically stated at Architecture Day (Aug) that AVX-512 was fused off. I explicitly asked whether AVX-512 would work in any instance, and they said no.
To the topic: With all due respect, Linus Torvalds had no idea what he was talking about, and his followers just repeat him without thinking. In my humble opinion, AVX-512 is an instruction-set development we should all thank Intel for, as they are trying to improve things. It might not be the best first try, and it sure could be improved, but it is definitely not something that should "die a painful death".
It has register orthogonality, so code using the new instructions can use them on 128- or 256-bit registers too, with no down-clocking needed; awesome enhanced vector extensions, embedded broadcasting, mask registers, and the list goes on for quite a while.
AVX-512 is not only about power-hungry, CPU-downclocking floating-point calculations; it is so much more.
People cry for innovation day and night, and when they get it, they wish for its death... idontwanttoliveonthisplanetanymore.jpg
I do like Intel leaning on this RUDIMENTARY (ahahaa, my arse) statement; the information gleaned from the web makes them seem disingenuous about this. The E cores didn't have it, but the P cores have third-gen AVX-512, no?! And that's "rudimentary"? Whatever, Intel.
They're just after segregation again, the gits.
Not saying a power hungry instruction set is what the masses needed either. Just that I disagree with Linus on this one.
On a more serious note, bonkers power limits cook CPUs, not AVX-512. Instead of thinking about disabling an instruction set, I'd much rather recommend enforcing a power limit that actually makes sense.
I'm a believer in "it's hot or you're wasting it". I don't mind hot, in use, doing something useful; did that not come across?
I'm OK with innovative new tech like AVX-512.
Not OK with fusing off working features, or labeling them rudimentary to lessen the blow, though.
This particular case illustrates the circus going on at Intel, where some engineering head doesn't know what their marketing arse does.
It is a bad show. AVX-512 is not meant to run all the time, but in reality some APIs try to hammer it all the time, and the concept fails at its core, so Intel has to do something about it, i.e. disable it; meanwhile creating more fragmentation and making their Xeon offerings more appealing to those rare people who actually need the instruction set. And by no means will they let those people save money on cheaper, mere-mortal desktop offerings that could do the same.
As for the others... AMD still doesn't bother, and neither does Apple with the productivity suites on its own silicon; they ditched Intel, with all its fancy AVX-512, for a reason.
Torvalds is right most of the time. He speaks honestly about induced corporate nonsense, be it Intel or NVIDIA.
TDP became complicated the moment CPUs learned to adjust frequency (and voltage) on the fly. It just cannot be reduced to a single number anymore.
Power use can easily be limited to one number. Mobile parts, T-series parts, and even normal desktop parts have their power draw limited. In fact, such limits are a thing according to Intel. Intel, however, is very mushy on the actual PL2 power-draw and time limits, things that should be enforced by default and then turned off for OC, not the other way around. Most importantly, they need to be consistent, as right now all these board makers can be "in spec" yet have wildly different power draws and time limits.
This wasn't an issue before the boost wars; boost timing and power-draw limits were pretty clear in the Nehalem/Sandy Bridge era. AMD today is still more stringent on how much juice Ryzen can pull to boost. Intel has been playing fast and loose for years, and it's a headache to keep track of.
But you can't just publish the highest figure; the vast majority of users don't run high-end heatsinks, so they'll never see that.
To give you an example, let's forget how the pandemic and crypto mining affected the prices of GPUs, and let's just focus on NVIDIA's product line at intended MSRP (for the sake of the argument, let's also ignore that MSRP itself might have been a lie too). They made the 3080 and put a 700-ish price tag on it; the card was a beast when it came out (perhaps still is), there can be very little argument about that. They gave us a ton of proprietary features too, pushing the limits of computer graphics to new heights. Things like RT cores (I don't care about ray tracing much, but I do believe most people still don't realize how big a step real-time full path-traced graphics really is, how it will change computer graphics, and how we will perceive shadow maps and all the other ancient, terrible fake things 10 years from now), DLSS, which lets me play games at ~30% more fps in resolutions my card couldn't even do at 60 without it, and all of that with more detail(!) than it would have at native resolution, etc. They are in the business of making graphics cards, the best they can, so they also made the 3090. They took their current tech to the absolute limit, gave it all the cores, RAM, whatever they could find on the shelves, and put a stupidly high price tag on it. Nobody was forced to buy the 3090, to have 24 GB of VRAM, to use ray tracing, to use DLSS, etc., but we had the option, and I'm glad we did. We only live once, and I want all the cool tech and I want it now, thank you very much.
But what did people do? They whined that it was overpriced, whined that only 5 games supported those new features, etc., and the tech sites agreed. Yeah, let's not have any of those because they are segregating... Well, sorry, but I disagree.
I'm really tired of the new trend of bashing companies who give us new things. I'm sad AVX-512 is going away now. Who cares if AMD doesn't have it, or if it segregates Intel's lineup, if it is a good thing? Who cares if it eats lots of power (most of AVX-512 doesn't, btw) if it makes some stuff better?
Intel already eats a lot of power because its CPUs haven't been power-efficient for a long time; the AVX-512 logic just builds on top of that bad design, so it eats even more power. Is that bad? Yes! Shall we get our pitchforks and "hope" that it "dies a painful death"? I think not.
I'm a computer enthusiast and I welcome every new feature they give us. I'm grateful, and I'm willing to pay for it if it is a good one, just as I'm willing to pay for, and use, more electricity for faster processors (and I will try to lower my carbon footprint in other areas of my life to make up for it, of course).
If Torvalds wants better products than Intel and AVX-512, he shouldn't "hope" for the death of new instructions; he should hope for competition like the Apple M1 instead, which shows Intel (and NVIDIA) how inefficient their stuff really is. True competition is our only hope against these monsters with their prices and segregation techniques, not death wishes on instructions.
PS: i3 processors had ECC support until 9th Gen, but nobody bothered (motherboard makers dropped it because there was zero market for it). I personally think it would be good to have, but apparently most users think otherwise (they are probably enthusiasts like me and want faster RAM, which is a lot harder to do with ECC). :)