Tuesday, March 8th 2022

Apple Unveils M1 Ultra, the World's Most Powerful Chip For a Personal Computer

Apple today announced M1 Ultra, the next giant leap for Apple silicon and the Mac. Featuring UltraFusion — Apple's innovative packaging architecture that interconnects the die of two M1 Max chips to create a system on a chip (SoC) with unprecedented levels of performance and capabilities — M1 Ultra delivers breathtaking computing power to the new Mac Studio while maintaining industry-leading performance per watt.

The new SoC consists of 114 billion transistors, the most ever in a personal computer chip. M1 Ultra can be configured with up to 128 GB of high-bandwidth, low-latency unified memory that can be accessed by the 20-core CPU, 64-core GPU and 32-core Neural Engine, providing astonishing performance for developers compiling code, artists working in huge 3D environments that were previously impossible to render, and video professionals who can transcode video to ProRes up to 5.6x faster than with a 28-core Mac Pro with Afterburner.
"M1 Ultra is another game changer for Apple silicon that once again will shock the PC industry. By connecting two M1 Max die with our UltraFusion packaging architecture, we're able to scale Apple silicon to unprecedented new heights," said Johny Srouji, Apple's senior vice president of Hardware Technologies. "With its powerful CPU, massive GPU, incredible Neural Engine, ProRes hardware acceleration and huge amount of unified memory, M1 Ultra completes the M1 family as the world's most powerful and capable chip for a personal computer."

Groundbreaking UltraFusion Architecture

The foundation for M1 Ultra is the extremely powerful and power-efficient M1 Max. To build M1 Ultra, the die of two M1 Max are connected using UltraFusion, Apple's custom-built packaging architecture. The most common way to scale performance is to connect two chips through a motherboard, which typically brings significant trade-offs, including increased latency, reduced bandwidth and increased power consumption. However, Apple's innovative UltraFusion uses a silicon interposer that connects the chips across more than 10,000 signals, providing a massive 2.5 TB/s of low-latency, inter-processor bandwidth — more than 4x the bandwidth of the leading multi-chip interconnect technology. This enables M1 Ultra to behave and be recognised by software as one chip, so developers don't need to rewrite code to take advantage of its performance. There's never been anything like it.
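As a quick sanity check on those figures, the per-signal data rate implied by Apple's numbers works out to about 2 Gb/s. This is a rough sketch only: it assumes decimal units and exactly 10,000 signals, which Apple states only as a lower bound.

```python
# Back-of-envelope check of the UltraFusion figures quoted above:
# 2.5 TB/s aggregate bandwidth over "more than 10,000 signals".
TOTAL_BYTES_PER_S = 2.5e12   # 2.5 TB/s aggregate interconnect bandwidth
SIGNAL_COUNT = 10_000        # stated lower bound on interposer signal count

bits_per_signal = TOTAL_BYTES_PER_S * 8 / SIGNAL_COUNT
print(f"{bits_per_signal / 1e9:.1f} Gb/s per signal")  # 2.0 Gb/s per signal
```

A 2 Gb/s per-pin rate is modest by high-speed-serial standards, which is consistent with the low-latency, low-power goals of a wide-and-slow interposer link.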

Unprecedented Performance and Power Efficiency

M1 Ultra features an extraordinarily powerful 20-core CPU with 16 high-performance cores and four high-efficiency cores. It delivers 90 per cent higher multithreaded performance than the fastest available 16-core PC desktop chip in the same power envelope. Additionally, M1 Ultra reaches the PC chip's peak performance using 100 fewer watts. That astounding efficiency means less energy is consumed and fans run quietly, even as apps like Logic Pro rip through demanding workflows, such as processing massive amounts of virtual instruments, audio plug-ins and effects.

For the most graphics-intensive needs, like 3D rendering and complex image processing, M1 Ultra has a 64-core GPU — 8x the size of M1 — delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power.

Apple's unified memory architecture has also scaled up with M1 Ultra. Memory bandwidth is increased to 800 GB/s, more than 10x the latest PC desktop chip, and M1 Ultra can be configured with 128 GB of unified memory. Compared with the most powerful PC graphics cards that max out at 48 GB, nothing comes close to M1 Ultra for graphics memory to support enormous GPU-intensive workloads like working with extreme 3D geometry and rendering massive scenes.

The 32-core Neural Engine in M1 Ultra runs up to 22 trillion operations per second, speeding through the most challenging machine learning tasks. And, with double the media engine capabilities of M1 Max, M1 Ultra offers unprecedented ProRes video encode and decode throughput. In fact, the new Mac Studio with M1 Ultra can play back up to 18 streams of 8K ProRes 422 video — a feat no other chip can accomplish. M1 Ultra also integrates custom Apple technologies, such as a display engine capable of driving multiple external displays, integrated Thunderbolt 4 controllers and best-in-class security, including Apple's latest Secure Enclave, hardware-verified secure boot and runtime anti-exploitation technologies.

macOS and Apps Scale Up to M1 Ultra

Deep integration between hardware and software has always been at the heart of the Mac experience. macOS Monterey has been designed for Apple silicon, taking advantage of M1 Ultra's huge increases in CPU, GPU and memory bandwidth. Developer technologies like Metal let apps take full advantage of the new chip, and optimisations in Core ML utilise the new 32-core Neural Engine, so machine learning models run faster than ever.

Users have access to the largest collection of apps ever for Mac, including iPhone and iPad apps that can now run on Mac, and Universal apps that unlock the full power of the M1 family of chips. Apps that have not yet been updated to Universal will run seamlessly with Apple's Rosetta 2 technology.

Another Leap Forward in the Transition to Apple Silicon

Apple has introduced Apple silicon to nearly every Mac in the current line-up, and each new chip — M1, M1 Pro, M1 Max and now M1 Ultra — unleashes amazing capabilities for the Mac. M1 Ultra completes the M1 family of chips, powering the all-new Mac Studio, a high-performance desktop system with a re-imagined compact design made possible by the industry-leading performance per watt of Apple silicon.

Apple Silicon and the Environment

The energy efficiency of Apple's custom silicon helps Mac Studio use less power over its lifetime. In fact, while delivering extraordinary performance, Mac Studio consumes up to 1,000 kilowatt-hours less energy than a high-end PC desktop over the course of a year.
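Spread over a full year of continuous operation, that figure implies an average power difference of roughly 114 W. This is an illustrative back-of-envelope only; real desktops don't run 24/7, so the instantaneous difference under load would be larger.

```python
# Rough sense check of the "up to 1,000 kWh less per year" claim,
# assuming the saving is spread over continuous 24/7 operation.
KWH_SAVED_PER_YEAR = 1000
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

avg_watts_saved = KWH_SAVED_PER_YEAR * 1000 / HOURS_PER_YEAR
print(f"~{avg_watts_saved:.0f} W average power difference")  # ~114 W
```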

Today, Apple is carbon-neutral for global corporate operations, and by 2030, plans to have net-zero climate impact across the entire business, which includes manufacturing supply chains and all product life cycles. This means that every chip Apple creates, from design to manufacturing, will be 100 per cent carbon-neutral.
Source: Apple

122 Comments on Apple Unveils M1 Ultra, the World's Most Powerful Chip For a Personal Computer

#101
Unregistered
Just think about Apple phones: no memory card slot; instead you pay a lot for more storage. They could have put memory card slots in their phones; Android has had them since the very early models with no detriment, and memory cards are pretty fast now, so why not? Instead you pay more for a phone whose extra storage costs far less to build in than Apple charges for it.
#102
Valantar
lexluthermiesterOh yes. Under Tim Cook's sad leadership Apple computers have steadily lost upgradeability one little bit at a time. It's really pathetic.
That's pretty much a universal trend across the industry though. It's true that Apple has been a champion of this for quite some time, but they're not unique by any means - they just take it a bit further than most. (It's also by no means unique to the Tim Cook era of Apple - Jobs championed the first mass-market sealed-battery phone, after all.) Their well-funded opposition to right to repair is much more of a problem imo.

Imo part of the problem is that the paradigms of what constitutes "upgradeable" are severely out of sync with the real-world benefits of tight integration and non-modularity in certain respects (the speed and power advantages of LPDDR vs DDR, for example). If there were a functional market for replacement parts, including things like motherboards, then this would be far less of an issue (though soldered storage is particularly egregious, and largely unique to Apple). I would be quite happy with a world where most laptops were mostly non-upgradeable but you could easily swap the motherboard for one with a more powerful CPU, more RAM, etc., with the trade-in value of your existing board lowering the price of the upgrade. But of course this would need significant manufacturer support, as well as a broad network of third-party workshops doing swaps and selling parts, and it would require manufacturers to stick to the same motherboard design for a long time, which most of them really don't want to.

Of course the real kicker is that Apple with their tight control, massive size, and vertical integration could implement such a system relatively easily. But they would much rather sell you an entire new device.
#103
Aquinus
Resident Wat-man
ValantarThat's pretty much a universal trend across the industry though. It's true that Apple has been a champion of this for quite some time, but they're not unique by any means - they just take it a bit further than most. (It's also by no means unique to the Tim Cook era of Apple - Jobs championed the first mass market sealed-battery phone, after all.) Their well funded opposition to right to repair is much more of a problem imo. Imo part of the problem is that the paradigms of what constitutes "upgradeable" are severely out of sync with the real-world benefits of tight integration and non-modularity in certain respects (the speed and power advantages of LPDDR vs DDR, for example). If there was a functional market for replacement parts, including things like motherboards, then this would be far less of an issue (though soldered storage is particularly egregious, and largely unique to Apple). I would be quite happy with a world where most laptops were mostly non-upgradeable but you could easily swap its motherboard for one with a more powerful CPU, more RAM, etc, with the trade-in value of your existing board lowering the price of the upgrade. But of course this would need significant manufacturer support, as well as a broad network of third party workshops doing swaps and selling parts, and it would require manufacturers to stick to the same motherboard design for a long time, which most of them really don't want to.

Of course the real kicker is that Apple with their tight control, massive size, and vertical integration could implement such a system relatively easily. But they would much rather sell you an entire new device.
There is also the bit about whether something like the M1 Ultra is even feasible without that level of integration. I agree with storage; I can't say that I'm very happy about that, particularly in laptops and desktops. I think there is a case to be made with phones, though, since size and efficiency tend to be of paramount concern. Look at it this way: if you take EPYC 7763, you're getting what, half the bandwidth of the M1 Max? If Apple were to make something like memory a user-replaceable part, how would you suggest that we get to 800 GB/s with existing tech? You're not getting that with DDR5 as it's packaged for PCs today. Let's also talk about SD cards. A V90 card only guarantees 90 MB/s of sustained writes, and in practice that's roughly your ceiling. Apple's interface within their phones is NVMe (albeit probably not at the speed you're expecting from a computer). However, that gives you a lot more potential beyond what an SD card can offer you.

So yeah, Apple could modularize their hardware more, but at what cost? SD cards and replaceable memory are probably not the best choice for the same reason: they would hamper performance and result in bigger devices. That doesn't sound like a win to me.
#104
Unregistered
AquinusThere is also the bit about whether something like the M1 Ultra is even feasible without that level of integration. I agree with storage, I can't say that I'm very happy about that, particularly in laptops and desktops. I think there is a case to be made with phones though since size and efficiency tends to be of paramount concern. Look at it this way, if you take EPYC 7763, you're getting what, half of the bandwidth as the M1 Max? If Apple were to make something like memory a user replaceable part, how would you suggest that we get to 800GB/s with existing tech? You're not getting that with DDR5 as it's packaged for PCs today. Let's also talk about SD cards. A V90 card is going to get you 90 MB/s at best and that's your upper bound. Apple's interface within their phones is NVMe (albeit, probably not at the speed you're expecting from a computer.) However, that gives you a lot more potential beyond what an SD card can offer you.

So yeah, Apple could modularize their hardware more, but at what cost? SD card adoption and replaceable memory is probably not the best choice for the same reason, it would hamper performance and result in bigger devices. That doesn't sound like a win to me.
With phones they could have base storage as NVMe and extra via an SD card. I have never heard anyone complain that their extra SD storage is slow on any phone. If they made the base say 28 with an SD slot, imo that would be fine.
#105
Aquinus
Resident Wat-man
TiggerWith phones they could have base storage as NVME and extra with an SD card. I have never heard anyone complain that their extra storage via SD is slow on any phone. If they made base as say 28 with a SD slot imo that would be fine.
Extended storage being significantly slower than internal storage would make it hard for Apple to guarantee proper behavior should an app expect NVMe levels of performance. That sounds like a very real impediment, because even the V90 card in my Canon EOS RP is by no means fast when it comes to transfers. You would really have to limit what that extended storage could be used for, and that doesn't sound like a good user experience to me.
#106
Valantar
AquinusWith extended storage being significantly slower than internal memory would make it hard for Apple to guarantee proper behavior should an app be expecting NVMe level of performance. That sounds like a very real impediment because even the V90 card in my Canon EOS RP is by no means fast when it comes to transfers. You would really have to limit what that extended storage could be used for and that doesn't sound like a good user experience to me.
Didn't even Android stop placing apps on SD years ago, precisely due to this? Apple certainly isn't alone in using NVMe storage in their phones, though most Android phones use UFS of some kind these days AFAIK - and the faster versions of UFS can still be fast. Still, any reasonably smart OS should be able to juggle that automatically, just shunt over some photos or other low-bandwidth data to the SD card if you're running out of app space.
AquinusThere is also the bit about whether something like the M1 Ultra is even feasible without that level of integration. I agree with storage, I can't say that I'm very happy about that, particularly in laptops and desktops. I think there is a case to be made with phones though since size and efficiency tends to be of paramount concern. Look at it this way, if you take EPYC 7763, you're getting what, half of the bandwidth as the M1 Max? If Apple were to make something like memory a user replaceable part, how would you suggest that we get to 800GB/s with existing tech? You're not getting that with DDR5 as it's packaged for PCs today.
Yeah, precisely. It just isn't doable. After all, it has a 1024-bit RAM interface, with regular DDR5 being 2x32-bit. So ... you'd need 16 channels to match Apple's LPDDR5 setup. Latency for the DDR5 would be lower, but ... it doesn't matter at that point. You'd be talking a massive server motherboard to fit all of those DIMMs, not to mention the way high layer count to ensure signal integrity. There are definitely major advantages to be found in this level of tight integration - but the drawbacks are also real.
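That bus-width arithmetic can be sketched in a few lines. This assumes the widely reported LPDDR5-6400 configuration for M1 Ultra; the exact speed bin isn't confirmed in the article above, so the numbers are illustrative.

```python
# Sketch of the bus-width arithmetic in the post above.
BUS_WIDTH_BITS = 1024        # M1 Ultra's unified memory interface width
TRANSFER_RATE = 6400e6       # LPDDR5-6400: transfers per second (assumed)

bandwidth_gbs = BUS_WIDTH_BITS * TRANSFER_RATE / 8 / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 819.2 GB/s - the quoted ~800 GB/s

# A standard DDR5-6400 DIMM (2 x 32-bit sub-channels = 64 bits) delivers:
dimm_gbs = 64 * TRANSFER_RATE / 8 / 1e9  # 51.2 GB/s per DIMM
print(f"{bandwidth_gbs / dimm_gbs:.0f} DIMM-width channels to match")  # 16
```

Sixteen DIMM-equivalent channels is server-board territory, which is the point being made: you can't get there with socketed memory in a desktop-class machine.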

One cool solution would be a hybrid memory architecture, with X amount of memory on-package (could be HBMx, could be LPDDR) and a desired number of ancillary channels of "second order" RAM for when that fills up. Of course this would be pretty difficult for the OS to manage without performance inconsistencies, but it's definitely doable.
AquinusLet's also talk about SD cards. A V90 card is going to get you 90 MB/s at best and that's your upper bound. Apple's interface within their phones is NVMe (albeit, probably not at the speed you're expecting from a computer.) However, that gives you a lot more potential beyond what an SD card can offer you.
Didn't Samsung try to launch a UFS card standard at some point? I wonder if we'll ever see the (seemingly dead elsewhere) SD Express standard take root in phones - it essentially makes SD cards into PCIe 3.0 x1 SSDs. Of course heat and controller complexity (and die size) would be an issue for this in MicroSD, but I think the standard supports that size.
AquinusSo yeah, Apple could modularize their hardware more, but at what cost? SD card adoption and replaceable memory is probably not the best choice for the same reason, it would hamper performance and result in bigger devices. That doesn't sound like a win to me.
I don't think Apple is ever going to go modular storage for phones, unless they were mandated to do so by some major government. Heck, they don't even support dual SIM. PCs, though? Mandating that would actually be kind of reasonable (with some exemptions for industrial PCs or other special cases). Soldered storage in a PC just isn't acceptable. Soldered RAM kind of is, but IMO we really need some organized system for access to spare parts, repairs and upgrades. Plus, ideally, something like a mandated period of keeping motherboard designs (for portables) compatible - 3 years? It would be pretty awesome to be able to buy a laptop, then 3 years later either order a new motherboard or send it in to have it upgraded to a new CPU and RAM, with your existing board as a trade-in. Of course this is still more wasteful than upgrading individual components, but such a system would obviously need to be tied into spare parts networks and repair providers (and official refurbishers) to afford re-use of those traded-in boards. Even low-power laptops today are so powerful that a decent 3-year-old laptop is more than enough for most users, so getting a system like this running would be immensely beneficial in many ways.
#107
Aquinus
Resident Wat-man
ValantarStill, any reasonably smart OS should be able to juggle that automatically, just shunt over some photos or other low-bandwidth data to the SD card if you're running out of app space.
Believe it or not, if I'm shooting jpeg + raw, my EOS RP will fill the buffer and slow down fairly quickly, even with a V90 card. Transfer speeds are pretty terrible as well. I can't imagine the experience in something like Apple's Photo app to perform well if these images are being read from really slow storage. I mean, come on, I have rotational media drives that have better sustained throughput. :p
ValantarYeah, precisely. It just isn't doable. After all, it has a 1024-bit RAM interface, with regular DDR5 being 2x32-bit. So ... you'd need 16 channels to match Apple's LPDDR5 setup. Latency for the DDR5 would be lower, but ... it doesn't matter at that point. You'd be talking a massive server motherboard to fit all of those DIMMs, not to mention the way high layer count to ensure signal integrity. There are definitely major advantages to be found in this level of tight integration - but the drawbacks are also real.
I think what we're seeing though is that Apple doesn't have an option if they want to maintain the numbers they're currently getting from this kind of setup. It's not like there is another option and with lack of standardization, they might as well roll it themselves like they did. They certainly have the resources to do it, so all in all, I'd call it a win, at least to show what such a device can do.
ValantarOne cool solution would be a hybrid memory architecture, with X amount of memory on-package (could be HBMx, could be LPDDR) and a desired number of ancillary channels of "second order" RAM for when that fills up. Of course this would be pretty difficult for the OS to manage without performance inconsistencies, but it's definitely doable.
I would totally be happy to see a SoC with HBM2(x?) as a first tier of memory and a second tier with more capacity at a slower speed. It's really no different than something like multiple cache levels, but that does have a complexity cost. It might be easier to just do what Apple did with a single fast pool. To me, it seems like what Apple did was a less risky solution because the cost of mispredicting when something should get bumped out of faster memory could have some huge ramifications that could just be avoided by doing what Apple already did. So from that perspective, I'm not sure if the added complexity would be worth it, although I'd still love to see something like this.
ValantarSoldered storage in a PC just isn't acceptable.
There really is no defending this. I agree. If I want to add more storage to my MBP, I either have to use a Thunderbolt 3 device (which would be plenty fast, I might add, but it comes at a hell of a cost) or USB 3.1 devices (since my 2019 model doesn't support 3.2), and that limits me to essentially SATA3 (6G) speeds. So yes, I really wish storage could be replaced; however, there might even be a reason for that. Are there any NVMe drives that go up to 8 TB? Because that's what you can get soldered into a Mac.
#108
TheoneandonlyMrK
AquinusBelieve it or not, if I'm shooting jpeg + raw, my EOS RP will fill the buffer and slow down fairly quickly, even with a V90 card. Transfer speeds are pretty terrible as well. I can't imagine the experience in something like Apple's Photo app to perform well if these images are being read from really slow storage. I mean, come on, I have rotational media drives that have better sustained throughput. :p

I think what we're seeing though is that Apple doesn't have an option if they want to maintain the numbers they're currently getting from this kind of setup. It's not like there is another option and with lack of standardization, they might as well roll it themselves like they did. They certainly have the resources to do it, so all in all, I'd call it a win, at least to show what such a device can do.

I would totally be happy to see a SoC with HBM2(x?) as a first tier of memory and a second tier with more capacity at a slower speed. It's really no different than something like multiple cache levels, but that does have a complexity cost. It might be easier to just do what Apple did with a single fast pool. To me, it seems like what Apple did was a less risky solution because the cost of mispredicting when something should get bumped out of faster memory could have some huge ramifications that could just be avoided by doing what Apple already did. So from that perspective, I'm not sure if the added complexity would be worth it, although I'd still love to see something like this.

There really is no defending this. I agree. If I want to add more storage to my MBP, I either have to use a Thunderbolt 3 device (which would be plenty fast I might add, but it comes at a hell of a cost,) or USB 3.1 devices (since my 2019 model doesn't support 3.2,) and that limits me to essentially SATA3 (6G) speeds. So yes, I really wish storage could be replaced however there might even be a reason for that. Are there any NVMe drives that go up to 8TB because that's what you can get soldered into a Mac.
That's where I draw the line with Apple, though: no replaceable storage options and I won't buy. It's rarely the case that the storage available for a system at launch utilises all the bandwidth available to it, plus the memory card issue is beyond ridiculous; it's 2022.
#109
Aquinus
Resident Wat-man
TheoneandonlyMrKit's rarely the case that the storage available for a system at launch utilises all the bandwidth available to it
I have not experienced that to be the case with modern Apple devices. My MBP has disk access speeds that you'd expect from a high-end PCIe 3.0 NVMe drive. The newer machines perform like a drive that's almost saturating PCIe 4.0. So from an "available bandwidth" perspective, I do think that Apple is squeezing all they can out of it already, so I wouldn't call this a valid concern. However, what is a valid concern is the SSD failing, because you'd essentially have to replace the whole damn circuit board or desolder the parts being replaced. That's my primary gripe. My secondary gripe is that, in retrospect, I should have gotten 2TB instead of 1TB, and now I have to live with that decision. With that said, something like a USB 3.1 SSD is probably more than enough to handle my photo library, so there is that too.

All in all, when you buy Apple, you're buying something that should "just work." That's part of what you're paying for by going into this ecosystem and it's not for everyone. Your typical Mac user doesn't want to screw with the hardware. Your typical enthusiast is going to hate Apple because of how locked down it is, but honestly, this is a case where you can't have your cake and eat it too if you want to control quality from top to bottom.
#110
Unregistered
AquinusI have not experienced that to be the case with modern Apple devices. My MBP has disk access speeds that you'd expect from a high end PCIe 3.0 NVMe drive. The newer devices act like what you would expect from a device almost saturating PCIe 4.0 NVMe cards. So from an "available bandwidth" perspective, I do think that Apple is squeezing all they can out of it already, so I wouldn't call this a valid concern. However, what is a valid concern is if the SSD fails because you'd essentially have to replace the whole damn circuit board or desolder the parts that'd be getting replaced. That's my primary gripe. My secondary gripe is that in retrospect, I should have gotten 2TB instead of 1TB and now I have to live with that decision. However with that said, something like a USB 3.1 SSD is probably more than enough to handle my photo library, so there is that too.

All in all, when you buy Apple, you're buying something that should "just work." That's part of what you're paying for by going into this ecosystem and it's not for everyone. Your typical Mac user doesn't want to screw with the hardware. Your typical enthusiast is going to hate Apple because of how locked down it is, but honestly, this is a case where you can't have your cake and eat it too if you want to control quality from top to bottom.
How much would the jump from 1 TB to 2 TB have cost, just out of interest?
#111
Aquinus
Resident Wat-man
TiggerHow much would the jump from 1 to 2 TB have cost just out of interest?
At the time, with the discount I had, probably about $360 USD iirc (something like $400 without, I think). So significantly more than a user-replaceable part would cost, if you include the price of the 1TB on top of the upgrade cost to 2TB. That wasn't a pill I was willing to swallow at the time, but given that my machine cost about $4k, an additional $400 wouldn't have been that much in the grand scheme of things. Also, I've picked up photography since buying it, so I wasn't accounting for the size of full-frame raw images either. With that said though, I've spent a good $2,000 on camera hardware, and something like a 2TB external SSD would do just fine for this purpose and cost only about $230 USD, so it's not like it's that big of a deal. I'm still also only at about 50% of my disk used after cleaning out everything I didn't care about.

So tl;dr: I didn't feel like a 10% increase in price was worth it for 2TB over 1TB when I didn't think (at the time) that I needed it. I would have bought it, though, had I been able to account for my interest in photography. Hindsight is always 20/20 though.

Edit: With that said though, I do still use my Linux tower for some games. It's just that the Mac is nice in the sense that it never gives me trouble and it always works when I need it to. That's really important when I'm traveling, working, or both.
#112
lexluthermiester
TheoneandonlyMrKThat's where I draw the line though with Apple, no replaceable storage options and I won't buy, it's rarely the case that the storage available for a system at launch utilises all the bandwidth available to it, plus the memory card issue is beyond ridiculous, it's 2022.
And no replaceable batteries... Not that such is exclusive to Apple, but they did popularize it.
#113
trparky
lexluthermiesterAnd no replaceable batteries... Not that such is exclusive to Apple, but they did popularize it.
But non-user-replaceable batteries have allowed batteries to become smaller, more energy-dense, and better overall. Yes, we've lost some things, but we've gained in the long run.
#114
lexluthermiester
trparkysmaller, more energy-dense, and other such overall improvements.
Don't care. I'd rather have a slightly larger phone that isn't made instantly useless by a dead/dying battery. It's so very wasteful and not green.
#115
Valantar
AquinusBelieve it or not, if I'm shooting jpeg + raw, my EOS RP will fill the buffer and slow down fairly quickly, even with a V90 card. Transfer speeds are pretty terrible as well. I can't imagine the experience in something like Apple's Photo app to perform well if these images are being read from really slow storage. I mean, come on, I have rotational media drives that have better sustained throughput. :p
Oh, believe me, that's no surprise. That's a 26MP camera, so I'd expect ~30MB RAW files (that's what I get from my 24MP Pentax, at least), plus 5-10MB/jpg. At even a few frames a second, that will exceed the speed of any SD card quickly. But the same just doesn't apply to a smartphone camera - not only will it never shoot photos as quickly as a DSLR, but nobody in their right mind would use a phone camera that way. Not necessarily because of the quality (which can be decent), but because of handling and UX issues.

As for editing: most apps load low(er) quality previews of images before loading the entire file, so it's likely not an issue. You're not likely to do tons of edits every second either, so the only use case where this becomes an issue is when blasting through your album quickly, in which case the previews are typically more than sufficient. Batch copying/import/export is where the bottleneck will be felt the most - and that's a large part of why many pro cameras are moving to CFExpress.
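The buffer arithmetic above can be sketched roughly as follows. The ~30 MB RAW and ~8 MB JPEG sizes come from the figures in this discussion; the 5 fps burst rate is an assumed round number for illustration, not a quoted camera spec.

```python
# Rough illustration of why burst shooting outruns a V90 SD card.
RAW_MB, JPEG_MB = 30, 8     # approximate per-frame sizes from the thread
FPS = 5                     # assumed burst rate, for illustration
V90_SUSTAINED_MB_S = 90     # guaranteed minimum sustained write for V90

incoming = (RAW_MB + JPEG_MB) * FPS  # 190 MB/s of new image data
print(f"incoming {incoming} MB/s vs {V90_SUSTAINED_MB_S} MB/s sustained")
# Once the camera's internal buffer starts draining to the card, it still
# fills at roughly incoming - sustained = 100 MB/s, so bursts stall quickly.
```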
AquinusI think what we're seeing though is that Apple doesn't have an option if they want to maintain the numbers they're currently getting from this kind of setup. It's not like there is another option and with lack of standardization, they might as well roll it themselves like they did. They certainly have the resources to do it, so all in all, I'd call it a win, at least to show what such a device can do.
Yeah, it's definitely a win for them. But I wouldn't call it a lack of standardization - they're using "standard" LPDDR5, after all (though seemingly in custom packages?). The issue is that you just can't reasonably make a standard for replaceable memory at those bandwidths - the pin counts, trace routing, and PCB quality demands would be astronomical, making it impossible in practice. That's why we have soldered standards like HBM and LPDDR.
AquinusI would totally be happy to see a SoC with HBM2(x?) as a first tier of memory and a second tier with more capacity at a slower speed. It's really no different than something like multiple cache levels, but that does have a complexity cost. It might be easier to just do what Apple did with a single fast pool. To me, it seems like what Apple did was a less risky solution because the cost of mispredicting when something should get bumped out of faster memory could have some huge ramifications that could just be avoided by doing what Apple already did. So from that perspective, I'm not sure if the added complexity would be worth it, although I'd still love to see something like this.
Yeah, they're definitely avoiding a lot of complexity going this way. I can only imagine the issues of juggling data across two memory architectures with drastically different speeds and latencies, and making that transparent to apps. Seen that way, on-board "RAM" in a system like this is probably better implemented as a (huge) L4 cache.
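The misprediction cost Aquinus mentions can be made concrete with a toy model: a small "fast" tier (think HBM) in front of a large "slow" tier, managed with an LRU policy. Everything here is hypothetical, with made-up latencies, just to show how a hostile access pattern makes a tiered design worse than it looks on paper:

```python
# Toy two-tier memory model: FAST_SLOTS pages of fast memory backed by a
# slow tier, with LRU eviction. Latencies are invented for illustration.

from collections import OrderedDict

FAST_SLOTS = 4               # pages that fit in the fast tier
FAST_NS, SLOW_NS = 10, 100   # assumed per-page access latencies

def run(accesses):
    fast = OrderedDict()     # page -> None, ordered by recency (LRU)
    total = 0
    for page in accesses:
        if page in fast:
            fast.move_to_end(page)
            total += FAST_NS
        else:
            total += SLOW_NS              # miss: fetch from slow tier
            fast[page] = None
            if len(fast) > FAST_SLOTS:
                fast.popitem(last=False)  # evict least recently used
    return total

friendly = [0, 1, 2, 3] * 10   # working set fits: mostly fast hits
hostile = list(range(5)) * 8   # 5 pages cycling over 4 slots: LRU always
                               # evicts exactly the page needed next
print("friendly:", run(friendly), "ns")  # 760 ns
print("hostile: ", run(hostile), "ns")   # 4000 ns - every access misses
```

The hostile pattern is the classic LRU-thrash case: the working set is one page larger than the fast tier, so the policy mispredicts on every single access and you pay slow-tier latency throughout. A single uniform pool, like Apple's, simply can't hit that failure mode.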
AquinusThere really is no defending this. I agree. If I want to add more storage to my MBP, I either have to use a Thunderbolt 3 device (which would be plenty fast, I might add, but at a hell of a cost) or USB 3.1 devices (since my 2019 model doesn't support 3.2), and that limits me to essentially SATA3 (6G) speeds. So yes, I really wish storage could be replaced, though there might even be a reason it isn't. Are there any NVMe drives that go up to 8TB? Because that's what you can get soldered into a Mac.
At the time they first launched an 8TB option there weren't any 8TB NVMe drives on the market, no. And AFAIK all current ones are relatively slow QLC, but they do exist. Still, it's not like their soldered drives use any special NAND, so they could just as easily have made this an NVMe card. It might not have fit a standard m.2 22xx form factor, but there are other compliant form factors they could have used: going 30mm wide buys more board space while staying within spec (and compatible with standard SSDs, as long as those aren't too long). We could also look at the Mac Pro, which uses what look like mSATA SSDs (though clearly with a proprietary interface), yet those are tied to the T2 security chip and non-replaceable for some reason. Apple claims this is a security feature, but that argument doesn't hold up: the data-security implications of someone stealing, or temporarily borrowing and cloning, your SSD are near zero, because at that point they already have physical access to your device and all bets are off anyway. Soldered storage is still denser - you can always pack things more tightly on one PCB than on several plus connectors - but the difference isn't really noticeable. They could have made it work if they wanted to. And that's the real problem.
lexluthermiesterDon't care. I'd rather have a slightly larger phone that isn't made instantly useless by a dead/dying battery. It's so very wasteful and not green.
That's true, though again Apple is hardly the worst of the bunch here - at least their phones are reasonably easily disassembled and they consistently use stretch-release adhesive for their batteries, making swaps possible for someone with a few basic tools and a modicum of patience and hand-eye coordination. It's far from perfect, but there are far worse examples out there (looking at you, Samsung). IMO there are valid arguments for not designing your products around easy modularity - the benefits of non-modularity are very much real. What there aren't valid arguments for is not designing for repairability. There's no reason why a non-modular design can't still be relatively easily repairable. Of course, companies like Fairphone and Framework are demonstrating that you can still make highly repairable and upgradeable phones and laptops without that much of a sacrifice.
#116
trparky
ValantarThat's true, though again Apple is hardly the worst of the bunch here - at least their phones are reasonably easily disassembled and they consistently use stretch-release adhesive for their batteries, making swaps possible for someone with a few basic tools and a modicum of patience and hand-eye coordination.
And if we go by the latest iFixIt teardown video of some new Samsung phone, they don't even give you the benefit of pull tabs to get the battery out. You damn near have to chisel it out while risking a literal fire. So, Apple isn't the worst here.
lexluthermiesterDon't care. I'd rather have a slightly larger phone that isn't made instantly useless by a dead/dying battery. It's so very wasteful and not green.
I'm talking about in notebook PCs, non-removable batteries make for more dense, smaller batteries.
#117
Unregistered
trparkyAnd if we go by the latest iFixIt teardown video of some new Samsung phone, they don't even give you the benefit of pull tabs to get the battery out. You damn near have to chisel it out while risking a literal fire. So, Apple isn't the worst here.

I'm talking about in notebook PCs, non-removable batteries make for more dense, smaller batteries.
I have changed batteries on some phones in the past and been terrified of them bursting into flames as I try to get the old one out. Now I just heat the back till it's near too hot to touch.
#118
Valantar
trparkyAnd if we go by the latest iFixIt teardown video of some new Samsung phone, they don't even give you the benefit of pull tabs to get the battery out. You damn near have to chisel it out while risking a literal fire. So, Apple isn't the worst here.
Yep. I linked to that, didn't I? ;)
trparkyI'm talking about in notebook PCs, non-removable batteries make for more dense, smaller batteries.
That's actually debatable. In phones there are few alternatives to adhesive, but in a laptop you can relatively easily fit a basic frame with screw holes to your battery without sacrificing any meaningful amount of density. As the size of the device increases, the benefits of adhesive shrink, as the relative size of X number of screws shrinks compared to the battery. There are quite a few thin-and-light laptops out there with really densely packed batteries that still have them held in with screws.
#119
trparky
ValantarThat's actually debatable. In phones there are few alternatives to adhesive, but in a laptop you can relatively easily fit a basic frame with screw holes to your battery without sacrificing any meaningful amount of density. As the size of the device increases, the benefits of adhesive shrink, as the relative size of X number of screws shrinks compared to the battery. There are quite a few thin-and-light laptops out there with really densely packed batteries that still have them held in with screws.
No, I'm referring to how notebook batteries were once fully separate components that you could flip a lever and remove it vs what they are today.
#120
Valantar
trparkyNo, I'm referring to how notebook batteries were once fully separate components that you could flip a lever and remove it vs what they are today.
Ah, okay. Yeah, I agree with that - that battery bay always messed with the structural rigidity of the case and required more stiffening and more material to make the laptop hold together well, so integrating it is definitely a boon. But gluing it down inside is just unnecessary.
#122
lexluthermiester
trparkyI'm talking about in notebook PCs, non-removable batteries make for more dense, smaller batteries.
Still, I'd rather have a replaceable battery and a slightly larger device than have to buy a new unit when (not if) the battery dies.

Making a device of any kind with a battery that cannot be easily replaced is sheer stupidity, and it's about as environmentally unfriendly as can be. But I digress, we've wandered a bit off-topic.