Tuesday, March 8th 2022

Apple Unveils M1 Ultra, the World's Most Powerful Chip For a Personal Computer

Apple today announced M1 Ultra, the next giant leap for Apple silicon and the Mac. Featuring UltraFusion — Apple's innovative packaging architecture that interconnects the die of two M1 Max chips to create a system on a chip (SoC) with unprecedented levels of performance and capabilities — M1 Ultra delivers breathtaking computing power to the new Mac Studio while maintaining industry-leading performance per watt.

The new SoC consists of 114 billion transistors, the most ever in a personal computer chip. M1 Ultra can be configured with up to 128 GB of high-bandwidth, low-latency unified memory that can be accessed by the 20-core CPU, 64-core GPU and 32-core Neural Engine, providing astonishing performance for developers compiling code, artists working in huge 3D environments that were previously impossible to render, and video professionals who can transcode video to ProRes up to 5.6x faster than with a 28-core Mac Pro with Afterburner.
"M1 Ultra is another game changer for Apple silicon that once again will shock the PC industry. By connecting two M1 Max die with our UltraFusion packaging architecture, we're able to scale Apple silicon to unprecedented new heights," said Johny Srouji, Apple's senior vice president of Hardware Technologies. "With its powerful CPU, massive GPU, incredible Neural Engine, ProRes hardware acceleration and huge amount of unified memory, M1 Ultra completes the M1 family as the world's most powerful and capable chip for a personal computer."

Groundbreaking UltraFusion Architecture

The foundation for M1 Ultra is the extremely powerful and power-efficient M1 Max. To build M1 Ultra, the die of two M1 Max are connected using UltraFusion, Apple's custom-built packaging architecture. The most common way to scale performance is to connect two chips through a motherboard, which typically brings significant trade-offs, including increased latency, reduced bandwidth and increased power consumption. However, Apple's innovative UltraFusion uses a silicon interposer that connects the chips across more than 10,000 signals, providing a massive 2.5 TB/s of low-latency, inter-processor bandwidth — more than 4x the bandwidth of the leading multi-chip interconnect technology. This enables M1 Ultra to behave and be recognised by software as one chip, so developers don't need to rewrite code to take advantage of its performance. There's never been anything like it.
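
To illustrate what "behave and be recognised by software as one chip" means in practice, the short Swift sketch below queries the CPU topology and memory that macOS reports through standard sysctl keys (the hw.perflevel names require macOS 12 or later); on an M1 Ultra system these describe a single 20-core SoC and one pool of unified memory rather than two separate packages.

```swift
import Foundation

// Minimal sketch: ask macOS how the SoC presents itself to software.
// Because UltraFusion exposes the two dies as one chip, these standard
// sysctl keys report a single unified CPU and memory pool (e.g. 20 cores
// on M1 Ultra), not two separate 10-core packages.
func sysctlInt(_ name: String) -> Int {
    var value = 0
    var size = MemoryLayout<Int>.size
    guard sysctlbyname(name, &value, &size, nil, 0) == 0 else { return -1 }
    return value
}

let logicalCPUs      = sysctlInt("hw.ncpu")                    // all CPU cores
let performanceCores = sysctlInt("hw.perflevel0.physicalcpu")  // P-cores (macOS 12+)
let efficiencyCores  = sysctlInt("hw.perflevel1.physicalcpu")  // E-cores (macOS 12+)
let memoryBytes      = sysctlInt("hw.memsize")                 // unified memory

print("CPU cores: \(logicalCPUs) (\(performanceCores)P + \(efficiencyCores)E)")
print("Unified memory: \(memoryBytes / (1 << 30)) GB")
```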

Unprecedented Performance and Power Efficiency

M1 Ultra features an extraordinarily powerful 20-core CPU with 16 high-performance cores and four high-efficiency cores. It delivers 90 per cent higher multithreaded performance than the fastest available 16-core PC desktop chip in the same power envelope. Additionally, M1 Ultra reaches the PC chip's peak performance using 100 fewer watts. That astounding efficiency means less energy is consumed and fans run quietly, even as apps like Logic Pro rip through demanding workflows, such as processing massive amounts of virtual instruments, audio plug-ins and effects.

For the most graphics-intensive needs, like 3D rendering and complex image processing, M1 Ultra has a 64-core GPU — 8x the size of M1 — delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power.

Apple's unified memory architecture has also scaled up with M1 Ultra. Memory bandwidth is increased to 800 GB/s, more than 10x the latest PC desktop chip, and M1 Ultra can be configured with 128 GB of unified memory. Compared with the most powerful PC graphics cards that max out at 48 GB, nothing comes close to M1 Ultra for graphics memory to support enormous GPU-intensive workloads like working with extreme 3D geometry and rendering massive scenes.
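
From a programmer's perspective, the unified memory architecture surfaces through Metal as a single GPU device whose storage is shared with the CPU. The following Swift sketch uses standard Metal APIs to inspect that; the figures it prints simply reflect whichever Mac it runs on.

```swift
import Metal

// Illustrative sketch: on Apple silicon the GPU reports unified memory,
// and a .storageModeShared buffer is directly visible to both CPU and GPU,
// so no separate staging copy or upload step is needed.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

print("GPU: \(device.name)")
print("Unified memory: \(device.hasUnifiedMemory)")
print("Recommended max working set: \(device.recommendedMaxWorkingSetSize / (1 << 30)) GB")

// Allocate 64 MB that the CPU can fill and the GPU can read in place.
if let buffer = device.makeBuffer(length: 64 << 20, options: .storageModeShared) {
    print("Allocated \(buffer.length) bytes of shared CPU/GPU memory")
}
```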

The 32-core Neural Engine in M1 Ultra runs up to 22 trillion operations per second, speeding through the most challenging machine learning tasks. And, with double the media engine capabilities of M1 Max, M1 Ultra offers unprecedented ProRes video encode and decode throughput. In fact, the new Mac Studio with M1 Ultra can play back up to 18 streams of 8K ProRes 422 video — a feat no other chip can accomplish. M1 Ultra also integrates custom Apple technologies, such as a display engine capable of driving multiple external displays, integrated Thunderbolt 4 controllers and best-in-class security, including Apple's latest Secure Enclave, hardware-verified secure boot and runtime anti-exploitation technologies.
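
Apps don't drive the media engines directly; they reach them through frameworks such as VideoToolbox. As a hedged illustration using standard VideoToolbox and CoreMedia symbols, an application can ask whether a given codec is hardware-decodable on the machine it is running on:

```swift
import CoreMedia
import VideoToolbox

// Sketch: report which codecs the current machine can decode in hardware.
// The constants and VTIsHardwareDecodeSupported are standard system APIs;
// the answers depend entirely on the silicon the code runs on.
let codecs: [(String, CMVideoCodecType)] = [
    ("ProRes 422", kCMVideoCodecType_AppleProRes422),
    ("HEVC",       kCMVideoCodecType_HEVC),
    ("H.264",      kCMVideoCodecType_H264),
]

for (name, codec) in codecs {
    let supported = VTIsHardwareDecodeSupported(codec)
    print("\(name): hardware decode \(supported ? "available" : "not available")")
}
```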

macOS and Apps Scale Up to M1 Ultra

Deep integration between hardware and software has always been at the heart of the Mac experience. macOS Monterey has been designed for Apple silicon, taking advantage of M1 Ultra's huge increases in CPU, GPU and memory bandwidth. Developer technologies like Metal let apps take full advantage of the new chip, and optimisations in Core ML utilise the new 32-core Neural Engine, so machine learning models run faster than ever.
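
On the Core ML side, an app opts into the Neural Engine simply by not restricting the allowed compute units. A minimal sketch, where MyModel stands in for whatever .mlmodel class an app actually ships (a hypothetical name, not a real API):

```swift
import CoreML

// Sketch: let Core ML schedule work on the CPU, GPU and Neural Engine.
// .all is the default, but setting it explicitly documents the intent.
let config = MLModelConfiguration()
config.computeUnits = .all

// Hypothetical usage with a generated model class named MyModel:
// let model = try MyModel(configuration: config)
// let output = try model.prediction(input: someInput)
```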

Users have access to the largest collection of apps ever for Mac, including iPhone and iPad apps that can now run on Mac, and Universal apps that unlock the full power of the M1 family of chips. Apps that have not yet been updated to Universal will run seamlessly with Apple's Rosetta 2 technology.
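
Apple also documents a sysctl key, sysctl.proc_translated, that lets a process check at runtime whether it is running natively or under Rosetta 2 translation. A small Swift version of that documented check:

```swift
import Foundation

// Returns true when the current process is being translated by Rosetta 2.
// Native arm64 (Universal) builds report 0; Intel-only builds running on
// Apple silicon report 1; on Intel Macs the key does not exist at all.
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return false // key missing: native process, e.g. on an Intel Mac
    }
    return translated == 1
}

print(isRunningUnderRosetta() ? "Running under Rosetta 2 translation" : "Running natively")
```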

Another Leap Forward in the Transition to Apple Silicon

Apple has introduced Apple silicon to nearly every Mac in the current line-up, and each new chip — M1, M1 Pro, M1 Max and now M1 Ultra — unleashes amazing capabilities for the Mac. M1 Ultra completes the M1 family of chips, powering the all-new Mac Studio, a high-performance desktop system with a re-imagined compact design made possible by the industry-leading performance per watt of Apple silicon.

Apple Silicon and the Environment

The energy efficiency of Apple's custom silicon helps Mac Studio use less power over its lifetime. In fact, while delivering extraordinary performance, Mac Studio consumes up to 1,000 kilowatt-hours less energy than a high-end PC desktop over the course of a year.

Today, Apple is carbon-neutral for global corporate operations, and by 2030, plans to have net-zero climate impact across the entire business, which includes manufacturing supply chains and all product life cycles. This means that every chip Apple creates, from design to manufacturing, will be 100 per cent carbon-neutral.
Source: Apple

122 Comments on Apple Unveils M1 Ultra, the World's Most Powerful Chip For a Personal Computer

#26
wolf
Better Than Native
M1 Ultra has a 64-core GPU — 8x the size of M1 — delivering faster performance than even the highest-end PC GPU
I'd love to see that claim put to the test, something tells me it won't quite be apples to apples.
#27
Unregistered
wolf: I'd love to see that claim put to the test, something tells me it won't quite be apples to apples.
"delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power." Bollocks.

Me too, against a 3090/Ti.
#28
watzupken
Steevo: A lot of performance claims that if true are amazing.
But the GPU one got me, I'm sure it will play Candy Crush great, and some ports, but this is going to push Mac gaming back to the stone ages again.
I don't think this is targeted at gamers to begin with. The GPU power may allow you to play some games, but that is not the primary purpose here. Professionals and content creators will love this chip and its GPU.
#29
Fourstaff
Very interested in the performance of this chip. Apple went from launching M1 in 2020 to "we are comparing M1 Ultra with the Intel i9" in 2022.
#30
evelynharthbrooke
Selaya: :thinking: this isn't x86, so this isn't a PC ...
It actually is a PC. A PC doesn't have to be based on the x86 architecture to be a PC. The Raspberry Pi is based on the ARM architecture as well, but it's still a PC, and it's the size of a credit card. Just like how if RISC-V ever makes its way into the mainstream and ends up being used in laptops and prebuilts, they would still be PCs. The CPU architecture that is used doesn't define the PC.
#31
lexluthermiester
evelynmarie: It actually is a PC. A PC doesn't have to be based on the x86 architecture to be a PC. The Raspberry Pi is based on the ARM architecture as well, but it's still a PC, and it's the size of a credit card. Just like how if RISC-V ever makes its way into the mainstream and ends up being used in laptops and prebuilts, they would still be PCs. The CPU architecture that is used doesn't define the PC.
This is true. Well said. However, Apple's statement is still completely dishonest.
#32
Vader
I feel like Apple talks to Apple users. If this SoC is the fastest in common scenarios for them, like media production, they will agree with Apple's claims; it's all a matter of perspective.

However, as a hardware enthusiast, I do not like what Apple is doing here. Their chip is really focused on media production, with average to lackluster performance in other areas (which aren't that many, tbh), so the title "fastest CPU" might not apply to you depending on your needs.
#33
Valantar
wolf: I'd love to see that claim put to the test, something tells me it won't quite be apples to apples.
Given that this is already an MCM GPU (which is impressive in its own right), I would be extremely surprised if those numbers applied to anything that wasn't GPGPU. I have no doubt Apple could manage a transparent multi-GPU solution that could work for games if they wanted to (their cash reserves are massive, after all), but gaming isn't a focus (or even interesting) to them, and making this work in GPGPU applications is bound to be a lot easier overall. Still, whether or not this scales as advertised, it can't be denied that Apple is now beating all established GPU makers to an actual, real-world, transparent multi-GPU implementation. And that's impressive in and of itself. We'll just have to see how well it performs in real life, as there are many, many factors affecting that.

Of course, it also bears mentioning the differences in design philosophies here; consumer GPUs are designed to chase peak performance pretty much regardless of efficiency, while this is designed to hit very specific performance targets with little regard for production costs. The M1 Ultra being >4x the transistor count of a GA102, even if that isn't all GPU, says quite a bit about the differences here. If your GPU is twice as wide and clocked half as high, it's not a surprise if you're equally fast at less than half the power, after all. But the cost of entry is also going to be really high, as you're just throwing transistors at the problem rather than clock cycles and power.
#34
Prima.Vera
Uskompuf: For the most graphics-intensive needs, like 3D rendering and complex image processing, M1 Ultra has a 64-core GPU — 8x the size of M1 — delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power.
I call bullshit on this statement. Really, did they actually claim that???
#35
Valantar
Prima.Vera: I call bullshit on this statement. Really, did they actually claim that???
As with the M1 Pro and Max, that statement is likely true in a highly selective set of tests under relatively optimal conditions - but also not true in most cases. It just depends on what is important to you, and to Apple, those tests (which are mostly related to video and media production) are what matters. And TBH, with the silicon area and transistor count of this thing, it really shouldn't be difficult to beat the woefully inefficient, pushed-to-the-max 3090. That doesn't mean it will perform well in games or other real-time 3D applications, but that isn't Apple's priority either.

As an example: The M1 Max already scored ~950 in PugetBench Premiere, which beats a 32-core Threadripper+5700 XT system, but trails most desktop GPUs (2060 and up) by 100-200 points. A 3090 with that CPU scores ~1140. If the new Ultra is able to bump those scores up by just a few hundred points, that's a win for Apple in a workload relevant to them and their customer base. Could they achieve that at slightly above 100W? I wouldn't be too surprised, given the massive wide-and-slow design of this chip. Remember: 4x the transistor count.
#37
GreiverBlade
Well, it's good for Apple users, of course... (for their wallet, less so...)

But a closed ecosystem becoming even more proprietary and closed is not what I like to see, no matter what Apple defenders have to say about personal computers... (phones and tablets are fine, well, physically fine... not financially).

As a small addition, if I pay a premium for something, I usually also like to have more freedom in my "tech muesli".

In short, the title would only be true as "Apple Unveils M1 Ultra, the World's Most Powerful Chip For a closed-ecosystem Personal Computer, in a highly selective set of tests under relatively optimal conditions"... but, well, less appealing, eh?
#38
Cutechri
GreiverBlade: Well, it's good for Apple users, of course... (for their wallet, less so...)

But a closed ecosystem becoming even more proprietary and closed is not what I like to see, no matter what Apple defenders have to say about personal computers... (phones and tablets are fine, well, physically fine... not financially).

As a small addition, if I pay a premium for something, I usually also like to have more freedom in my "tech muesli".

In short, the title would only be true as "Apple Unveils M1 Ultra, the World's Most Powerful Chip For a closed-ecosystem Personal Computer, in a highly selective set of tests under relatively optimal conditions"... but, well, less appealing, eh?
It's weird how leather jacket man divorced Apple man when they both love proprietary stuff so much.
#39
evelynharthbrooke
lexluthermiester: This is true. Well said. However, Apple's statement is still completely dishonest.
May I kindly ask how you came to this conclusion, given that the Mac Studio with the M1 Ultra doesn't come out until the 18th, so there are no reviews confirming or denying Apple's performance claims? Innocent until proven guilty, but in this case honest until proven wrong.
#40
Steevo
evelynmarie: May I kindly ask how you came to this conclusion, given that the Mac Studio with the M1 Ultra doesn't come out until the 18th, so there are no reviews confirming or denying Apple's performance claims? Innocent until proven guilty, but in this case honest until proven wrong.
It’s two M1 Max chips glued together and we have performance numbers for those, the only change being a new process node that will make it more energy efficient.


Looking at it, it truly is an efficient CPU with hardware acceleration for a few things and an FPGA, which I'm sure they will program well for a few applications to make them run faster.

But what happens when it ages a few years and new/different acceleration hardware is required?
#41
lexluthermiester
evelynmarie: May I kindly ask how you came to this conclusion, given that the Mac Studio with the M1 Ultra doesn't come out until the 18th, so there are no reviews confirming or denying Apple's performance claims? Innocent until proven guilty, but in this case honest until proven wrong.
Oh, simple. There is no way they have a CPU that beats AMD & Intel. CPUs from both of those companies will kick the snot out of Apple's new chip. It's dishonest because it makes a claim that is both completely untrue and deliberately misleading.

Apple is blatantly lying.
#42
trparky
Alright people... Let's take a step back from the Apple-hating bar that we've somehow managed to find ourselves at, because this particular discussion needs to take a look at hardware as we're going to see it in the future.

In the past, we used to brute-force our way through issues by essentially throwing more hardware at the problem and hoping it would shave off some time in processing said data. Then as time went on, more specialized hardware came about; things like dedicated graphics cards with specific sections of the silicon to do certain things. Case in point, NVIDIA RTX chips, which have specific silicon to process ray tracing. Nearly all GPUs have hardware-based decoding for things like H.264, H.265 and even old MPEG-4, simply because trying to decode that kind of highly compressed video in real time using standard x86/x64 hardware will bring even the highest-end Intel 12th-gen chip to its knees begging for you to stop.

As computing jobs get more complex, eventually we're going to see even more specialized hardware to do specialized jobs. We can't just continue throwing hardware at a problem, thereby brute-forcing our way out of the issue, simply because the amount of time it'll take to complete will be prohibitive or the energy costs will be hideous. We're already seeing that right now in how much power a 12th-gen chip uses at full power. As much as I've called out Apple for doing some really stupid shit in years past, the M1 chip is not one of them. They're seeing the future of computing, where eventually every one of us will be. We'll all have more specialized hardware inside our systems, and as time goes on, more will be added (not less).
#43
lexluthermiester
trparky: Let's take a step back from the Apple-hating bar
I'm not hating. I'm calling out BS and a blatant lie. They're not even trying to candy-coat it with provisos. They're just throwing out a complete load of crap expecting people to buy into it blindly. It's a clear attempt to abuse market power.
#44
Unregistered
They're seeing the future of computing where eventually every one of us will use an Apple machine :laugh:
#45
trparky
For certain use cases, they’re not lying. I’ve seen several YouTube videos that showcased even the A15 Bionic absolutely beating the snot out of a high-end Intel-based system in certain workloads (especially digesting 4K and 8K video).

Does that mean that these Apple systems will kick the ass of Intel in everything? No. However, if it kicks the snot out of them in the workloads that most people buy these systems for, then in that case Apple isn't lying. They're just… massaging the truth.
#46
Unregistered
Trouble is for them, most normal bods won't pay the Apple-tax prices for their stuff. Especially when some of it is proprietary and non-upgradeable. The new machine even has a soldered CPU, RAM and SSDs; screw that, if something fails you get screwed on repair costs again at an Apple store.
#47
trparky
Tiggern: some of it is proprietary and non-upgradeable. The new machine even has a soldered CPU, RAM and SSDs
The problem is that even some regular notebook computers are coming like that. Case in point, my Acer notebook; I can’t even replace the damn battery.

As for the memory, it’s why the M1 gets such ungodly amounts of memory bandwidth that even the likes of DDR5 can’t touch. I’ll admit you’re right about the SSD though; that’s bullshit.
#48
Valantar
lexluthermiester: Oh, simple. There is no way they have a CPU that beats AMD & Intel. CPUs from both of those companies will kick the snot out of Apple's new chip. It's dishonest because it makes a claim that is both completely untrue and deliberately misleading.

Apple is blatantly lying.
That's... actually not accurate at all. The M1 family has been demonstrated to be highly competitive in ST performance with every competing architecture previous to ADL. In MT it has been decent but not amazing, though doubling the P-core count will no doubt go some way towards alleviating that, even if core-to-core latencies will be a mess. ADL (and likely Zen 4) will keep winning in ST, as it's unlikely the design has much more to go on in terms of clock speed, but it'll still be decent, and an MT beast.
#49
lexluthermiester
trparkyI’ve seen several YouTube videos that showcased even the A15 Bionic absolutely beating the snot out of a high-end Intel-based system in certain workloads (especially digesting 4K and 8K video).
Let's see those video's. I suspect something...
ValantarThat's... actually not accurate at all. The M1 family has been demonstrated to be highly competitive in ST performance with every competing architecture previous to ADL.
Prove up. Let's see some like to like benchmarks.

I'm not saying that Apple's new hotness isn't good; I'm saying it's a far cry from being the "World's Most Powerful Chip For a Personal Computer". In that specific context, they're lying through their teeth...
#50
trparky
I call the SSD situation bullshit mainly because if the main system board dies, there goes your data. Should have had a backup. Do you want an iCloud account?
lexluthermiester: I'm saying it's a far cry from being the "World's Most Powerful Chip For a Personal Computer". In that specific context, they're lying through their teeth...
And I agree with that.