
Apple Unveils M1 Ultra, the World's Most Powerful Chip For a Personal Computer

Taking into account the GPU investment (architecture team, R&D, etc.), an interesting strategy at 3 nm, although it won't happen, would be to compete in the console space (if they booked enough volume at TSMC) with a traditional console business model at the start, without excluding multiple SKUs/price brackets, e.g. $299/$499/$999, if the performance/specs in each bracket are right relative to the competition. That would let them build their own gaming audience and get familiar with, evaluate, and build closer relationships with the developers and publishers who don't specialise in mobile gaming (and of course eventually buy some of them to build their internal studios/IPs), because as we recently saw with the MS and Sony acquisitions and the trend that is building, Apple may gradually miss their opportunity to compete and build something similar to what MS, for example, is trying to build!

I'm thinking of a traditional console-like business model because the alternative would be either Mac-related, which is too unfocused, too expensive and too time-consuming to attract traditional gamers, or cloud-based à la Stadia, which would fail like Google's did; it's too early for services like these to prevail over traditional console models, and time is of the essence in order to acquire talent and IP.
 
M1 Ultra has a 64-core GPU — 8x the size of M1 — delivering faster performance than even the highest-end PC GPU
I'd love to see that claim put to the test, something tells me it won't quite be apples to apples.
 
I'd love to see that claim put to the test, something tells me it won't quite be apples to apples.

delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power.
Bollocks.

Me too, against a 3090/ti
 
A lot of performance claims that, if true, are amazing.
But the GPU one got me. I'm sure it will play Candy Crush great, and some ports, but this is going to push Mac gaming back to the stone age again.
I don’t think this is targeted for gamers to begin with. The GPU power may allow you to play some games, but that is not the primary purpose here. Professionals and content creators will love this chip with the GPU.
 
Very interested in the performance of this chip. Apple went from launching the M1 in 2020 to "we are comparing the M1 Ultra with the Intel i9" in 2022.
 
:thinking:
this isn't x86, so this isn't a PC ...
It actually is a PC. A PC doesn’t have to be based on the x86 architecture to be a PC. The Raspberry Pi is based on the ARM architecture as well, but it’s still a PC and it’s the size of a credit card. Just like how if RISC-V ever makes its way into the mainstream and ends up being used in laptops and prebuilts, they would still be a PC. The CPU architecture that is used doesn’t define the PC.
 
It actually is a PC. A PC doesn’t have to be based on the x86 architecture to be a PC. The Raspberry Pi is based on the ARM architecture as well, but it’s still a PC and it’s the size of a credit card. Just like how if RISC-V ever makes its way into the mainstream and ends up being used in laptops and prebuilts, they would still be a PC. The CPU architecture that is used doesn’t define the PC.
This is true. Well said. However, Apple's statement is still completely dishonest.
 
I feel like Apple talks to Apple users. If this SoC is the fastest in common scenarios for them, like media production, they will agree with Apple's claims; it's all a matter of perspective.

However, as a hardware enthusiast, I do not like what Apple is doing here. Their chip is really focused on media production, with average to lackluster performance in other areas (which aren't that many, tbh), so the title of "fastest CPU" might not apply to you, depending on your needs.
 
I'd love to see that claim put to the test, something tells me it won't quite be apples to apples.
Given that this is already an MCM GPU (which is impressive in its own right), I would be extremely surprised if those numbers applied to anything that wasn't GPGPU. I have no doubt Apple could manage a transparent multi-GPU solution that could work for games if they wanted to (their cash reserves are massive, after all), but gaming isn't a focus for them (or even interesting to them), and making this work in GPGPU applications is bound to be a lot easier overall. Still, whether or not this scales as advertised, it can't be denied that Apple is now beating all established GPU makers to an actual, real-world, transparent multi-GPU implementation. And that's impressive in and of itself. We'll just have to see how well it performs in real life, as there are many, many factors affecting that.

Of course, it also bears mentioning the differences in design philosophies here; consumer GPUs are designed to chase peak performance pretty much regardless of efficiency, while this is designed to hit very specific performance targets with almost no regard for production costs. The M1 Ultra being >4x the transistor count of a GA102, even if that isn't all GPU, says quite a bit about the differences here. If your GPU is twice as wide and clocked half as high, it's not a surprise if you're equally fast at less than half the power, after all. But the cost of entry is also going to be really high, as you're throwing transistors at the problem rather than clock cycles and power.
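To put rough numbers on that wide-and-slow trade-off, here's a minimal sketch of the classic dynamic-power relation (P ≈ C·V²·f). The 2x-width, 0.5x-clock and 0.7x-voltage figures are illustrative assumptions for a typical DVFS range, not measured M1 Ultra or GA102 values:

```python
# Rough dynamic-power sketch: P ~ C * V^2 * f (capacitance, voltage, frequency).
# All numbers are relative to a 1.0 baseline design and purely illustrative.

def dynamic_power(width, voltage, freq):
    """Relative dynamic power for a design `width` times as wide,
    running at relative `voltage` and `freq`."""
    return width * voltage**2 * freq

def throughput(width, freq):
    """Idealized throughput: assumes perfect scaling with width."""
    return width * freq

baseline = dynamic_power(width=1.0, voltage=1.0, freq=1.0)

# Twice as wide, half the clock; lower clocks usually permit lower voltage
# (assume ~0.7x here, a rough guess within the usual DVFS window).
wide_slow = dynamic_power(width=2.0, voltage=0.7, freq=0.5)

print(f"throughput: {throughput(2.0, 0.5):.2f}x vs {throughput(1.0, 1.0):.2f}x")
print(f"power:      {wide_slow / baseline:.2f}x")  # ~0.49x at equal throughput
```

Under those assumptions you land at the same throughput for roughly half the power, which is exactly the "equally fast at less than half power" point above; the price is twice the silicon.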
 
For the most graphics-intensive needs, like 3D rendering and complex image processing, M1 Ultra has a 64-core GPU — 8x the size of M1 — delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power.
I call bullshit on this statement. Really, did they actually claim that???
 
I call bullshit on this statement. Really, did they actually claim that???
As with the M1 Pro and Max, that statement is likely true in a highly selective set of tests under relatively optimal conditions - but also not true in most cases. It just depends on what is important to you, and to Apple, those tests (which are mostly related to video and media production) are what matters. And TBH, with the silicon area and transistor count of this thing, it really shouldn't be difficult to beat the woefully inefficient, pushed-to-the-max 3090. That doesn't mean it will perform well in games or other real-time 3D applications, but that isn't Apple's priority either.

As an example: The M1 Max already scored ~950 in PugetBench Premiere, which beats a 32-core Threadripper+5700 XT system, but trails most desktop GPUs (2060 and up) by 100-200 points. A 3090 with that CPU scores ~1140. If the new Ultra is able to bump those scores up by just a few hundred points, that's a win for Apple in a workload relevant to them and their customer base. Could they achieve that at slightly above 100W? I wouldn't be too surprised, given the massive wide-and-slow design of this chip. Remember: 4x the transistor count.
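For illustration, a back-of-the-envelope points-per-watt comparison using the PugetBench scores quoted above. The power figures are rough assumptions for the sake of the sketch, not measurements:

```python
# Points-per-watt from the PugetBench Premiere scores mentioned above.
# Wattages are assumed ballpark figures, not measured system power.

systems = {
    "M1 Max":              {"score": 950,  "watts": 90},   # assumed package power
    "Threadripper + 3090": {"score": 1140, "watts": 600},  # assumed CPU+GPU power
}

for name, s in systems.items():
    print(f"{name:22s} {s['score'] / s['watts']:5.1f} points/W")
```

Even granting generous assumptions to the desktop system, the efficiency gap is an order of magnitude, which is why a few hundred extra points at ~100 W would be a genuine win for Apple's target workloads.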
 
well, it's good for Apple users ofc ... (for their wallet ... much less so ...)

but a closed ecosystem becoming even more proprietary and closed is not what I like to see, no matter what Apple defenders have to say about personal computers... (phones and tablets are fine, well, physically fine... not financially)

as a small addition, if I pay a premium for something I usually also like to have more freedom in my "tech muesli"

in short, the title should read: "Apple Unveils M1 Ultra, the World's Most Powerful Chip For a closed-ecosystem Personal Computer, in a highly selective set of tests under relatively optimal conditions" but... well, less appealing, eh?
 
well, it's good for Apple users ofc ... (for their wallet ... much less so ...)

but a closed ecosystem becoming even more proprietary and closed is not what I like to see, no matter what Apple defenders have to say about personal computers... (phones and tablets are fine, well, physically fine... not financially)

as a small addition, if I pay a premium for something I usually also like to have more freedom in my "tech muesli"

in short, the title should read: "Apple Unveils M1 Ultra, the World's Most Powerful Chip For a closed-ecosystem Personal Computer, in a highly selective set of tests under relatively optimal conditions" but... well, less appealing, eh?
It's weird how leather jacket man divorced Apple man when they both love proprietary stuff so much.
 
This is true. Well said. However, Apple statement is still completely dishonest.
May I kindly ask how you came to this conclusion, given that the Mac Studio with the M1 Ultra doesn't come out until the 18th, so there are no reviews yet confirming or denying Apple's performance claims? Innocent until proven guilty; in this case, honest until proven wrong.
 
May I kindly ask how you came to this conclusion, given that the Mac Studio with the M1 Ultra doesn't come out until the 18th, so there are no reviews yet confirming or denying Apple's performance claims? Innocent until proven guilty; in this case, honest until proven wrong.
It's two M1 Max chips glued together, and we have performance numbers for those; the only change is a new process node that will make it more energy efficient.


Looking at it, it truly is an efficient CPU with hardware acceleration for a few things, and an FPGA I'm sure they will program well for a few applications to make them run faster.

But what happens when it ages a few years and new/different acceleration hardware is required?
 
May I kindly ask how you came to this conclusion, given that the Mac Studio with the M1 Ultra doesn't come out until the 18th, so there are no reviews yet confirming or denying Apple's performance claims? Innocent until proven guilty; in this case, honest until proven wrong.
Oh, simple. There is no way they have a CPU that beats AMD & Intel. CPUs from both of those companies will kick the snot out of Apple's new chip. It's dishonest because it makes a claim that is both completely untrue and deliberately misleading.

Apple is blatantly lying.
 
Alright people... Let's take a step back from the Apple hating bar that we've somehow managed to find ourselves at because this particular discussion needs to take a look at hardware as we're going to see it in the future.

In the past, we used to brute-force our way through issues by essentially throwing more hardware at the problem and hoping that it would shave some time off processing said data. Then, as time went on, more specialized hardware came about; things like dedicated graphics cards with specific sections of the silicon to do certain things. Case in point: NVIDIA RTX chips, which have specific silicon to process ray tracing. Nearly all GPUs have hardware-based decoding for things like H.264, H.265, and even old MPEG-4, simply because trying to decode that kind of highly compressed video in real time using standard x86/x64 hardware will bring even the highest-end Intel 12th gen chip to its knees begging for you to stop.
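As a concrete illustration of leaning on that fixed-function silicon, here's a minimal sketch that hands decoding off to the hardware block through ffmpeg. It assumes ffmpeg is installed and built with VideoToolbox support (macOS); "clip.mp4" is a placeholder input, and on other platforms you'd swap the -hwaccel backend (e.g. "cuda" or "vaapi"):

```python
# Minimal sketch: decode a clip on the SoC's hardware decoder via ffmpeg.
# Assumes ffmpeg with VideoToolbox support is on PATH; "clip.mp4" is a
# placeholder. This is a decode-only run that discards the output, which
# makes it a rough benchmark of the fixed-function block.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-hwaccel", "videotoolbox",  # use the hardware decode block, not the CPU
        "-i", "clip.mp4",
        "-f", "null", "-",           # decode and throw the frames away
    ],
    check=True,
)
```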

As computing jobs get more complex, eventually we're going to see even more specialized hardware to do specialized jobs. We can't just keep throwing hardware at a problem and brute-forcing our way out of the issue, simply because the amount of time it'll take to complete will be prohibitive, or the energy costs will be hideous. We're already seeing that right now in how much power a 12th gen chip draws at full load. As much as I've called out Apple for doing some really stupid shit in years past, the M1 chip is not one of them. They're seeing the future of computing, where eventually every one of us will be. We'll all have more specialized hardware inside our systems, and as time goes on, more will be added (not less).
 
Let's take a step back from the Apple hating bar
I'm not hating. I'm calling out BS and a blatant lie. They're not even trying to candy-coat it with provisos. They're just throwing out a complete load of crap expecting people to buy into it blindly. It's a clear attempt to abuse market power.
 
For certain use cases, they’re not lying. I’ve seen several YouTube videos that showcased even the A15 Bionic absolutely beating the snot out of a high-end Intel-based system in certain workloads (especially digesting 4K and 8K video).

Does that mean that these Apple systems will kick the ass of Intel in everything? No. However, if it kicks the snot out of them in the workloads that most people buy these systems for, then Apple isn't lying. They're just… massaging the truth.
 
Trouble is, for them, most normal bods won't pay the Apple-tax prices for their stuff. Especially when some of it is proprietary and non-upgradeable. The new machine even has soldered CPU, RAM and SSDs; screw that, if something fails you get screwed on repair costs again at an Apple store.
 
some of it is proprietary and non-upgradeable. The new machine even has soldered CPU, RAM and SSDs
The problem is that even some regular notebook computers are coming like that. Case in point, my Acer notebook; I can’t even replace the damn battery.

As for the memory, it’s why the M1 gets such ungodly amounts of memory bandwidth that even the likes of DDR5 can’t touch. I’ll admit you’re right about the SSD though; that’s bullshit.
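The bandwidth gap is simple arithmetic: peak theoretical bandwidth is bus width times transfer rate. A quick sketch using the commonly cited bus widths and data rates (treat these as approximations of the marketing figures):

```python
# Peak theoretical bandwidth = bus width (bytes) * transfer rate (MT/s).
# Bus widths and rates are the commonly cited figures, not measurements.

def peak_gbps(bus_bits, mtps):
    """Peak bandwidth in GB/s for a bus of `bus_bits` at `mtps` MT/s."""
    return bus_bits / 8 * mtps / 1000

print(f"Dual-channel DDR5-6400:        {peak_gbps(128, 6400):6.1f} GB/s")  # ~102
print(f"M1 Max (512-bit LPDDR5-6400):  {peak_gbps(512, 6400):6.1f} GB/s")  # ~410
print(f"M1 Ultra (1024-bit LPDDR5):    {peak_gbps(1024, 6400):6.1f} GB/s") # ~819
```

The on-package LPDDR5 is simply a much wider bus, which is what soldering the memory next to the SoC buys you; the trade-off is exactly the non-upgradeability complained about above.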
 
Oh, simple. There is no way they have a CPU that beats AMD & Intel. CPUs from both of those companies will kick the snot out of Apple's new chip. It's dishonest because it makes a claim that is both completely untrue and deliberately misleading.

Apple is blatantly lying.
That's... actually not accurate at all. The M1 family has been demonstrated to be highly competitive in ST performance with every competing architecture previous to ADL. In MT it has been decent, but not amazing, but doubling the P core count will no doubt go some way towards alleviating that, even if core-to-core latencies will be a mess. ADL (and likely Zen 4) will keep winning in ST, as it's unlikely the design has much more to go on in terms of clock speed, but it'll still be decent, and an MT beast.
 
I’ve seen several YouTube videos that showcased even the A15 Bionic absolutely beating the snot out of a high-end Intel-based system in certain workloads (especially digesting 4K and 8K video).
Let's see those videos. I suspect something...

That's... actually not accurate at all. The M1 family has been demonstrated to be highly competitive in ST performance with every competing architecture previous to ADL.
Prove it. Let's see some like-for-like benchmarks.

I'm not saying that Apple's new hotness isn't good, I'm saying it's a far cry from being the "World's Most Powerful Chip For a Personal Computer". In that specific context, they're lying through their teeth...
 