Friday, April 19th 2024

AMD "Strix Halo" Zen 5 Mobile Processor Pictured: Chiplet-based, Uses 256-bit LPDDR5X

Enthusiasts on the ChipHell forum scored an alleged image of AMD's upcoming "Strix Halo" mobile processor, and used it to create some highly plausible, though speculative, schematic slides. While "Strix Point" is the mobile processor that succeeds the current "Hawk Point" and "Phoenix" processors, "Strix Halo" is in a category of its own: it aims to offer gaming experiences comparable to discrete GPUs in the ultraportable form factor, where powerful discrete GPUs are generally not possible. "Strix Halo" also goes head-on against Apple's M3 Pro and M3 Max processors powering the latest crop of MacBook Pros, and as a single-chip solution it has the same advantages as the M3 Max.

The "Strix Halo" silicon is a chiplet-based processor, although a very different one from "Fire Range." The "Fire Range" processor is essentially a BGA version of the desktop "Granite Ridge" processor: the same combination of one or two "Zen 5" CCDs talking to a client I/O die (cIOD), meant for performance-through-enthusiast segment notebooks. "Strix Halo," on the other hand, uses the same one or two "Zen 5" CCDs, but pairs them with a large SoC die featuring an oversized iGPU and 256-bit LPDDR5X memory controllers not found on the cIOD. This is key to what AMD is trying to achieve: CPU and graphics performance in the league of the M3 Pro and M3 Max at comparable PCB and power footprints.
The iGPU of the "Strix Halo" processor is based on the RDNA 3+ graphics architecture, and features a massive 40 compute units. These work out to 2,560 stream processors, 80 AI accelerators, 40 ray accelerators, 160 TMUs, and an unknown number of ROPs (we predict at least 64). The slide predicts an iGPU engine clock as high as 3.00 GHz.
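
As a sanity check on those figures, here is a minimal Python sketch deriving them from the CU count using the standard RDNA 3 per-CU ratios (64 stream processors, 2 AI accelerators, 1 ray accelerator, and 4 TMUs per CU). The FP32 throughput numbers are our own back-of-the-envelope estimates, not figures from the slides.

```python
# Back-of-the-envelope check of the leaked iGPU figures, assuming
# standard RDNA 3 per-CU ratios. The TFLOPS numbers are estimates.
compute_units = 40
stream_processors = compute_units * 64  # 2,560
ai_accelerators = compute_units * 2     # 80
ray_accelerators = compute_units * 1    # 40
tmus = compute_units * 4                # 160

# Nominal FP32 throughput at the predicted 3.00 GHz engine clock.
# RDNA 3 can dual-issue FP32, which doubles the headline figure.
clock_ghz = 3.00
tflops = stream_processors * 2 * clock_ghz / 1000  # ~15.4 TFLOPS
tflops_dual_issue = tflops * 2                     # ~30.7 TFLOPS

print(f"{stream_processors} SPs, {ai_accelerators} AI accel., "
      f"{ray_accelerators} ray accel., {tmus} TMUs")
print(f"FP32: ~{tflops:.1f} TFLOPS (single-issue), "
      f"~{tflops_dual_issue:.1f} TFLOPS (dual-issue)")
```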

Graphics is an extremely memory-sensitive application, so AMD is using a 256-bit (quad-channel, or octa-subchannel) LPDDR5X-8533 memory interface, for an effective cached bandwidth of around 500 GB/s. The memory controllers are cushioned by a 32 MB L4 cache located on the SoC die. The way we understand this cache hierarchy, the CCDs (CPU cores) can treat it as a victim cache, while the iGPU treats it like an L2 cache (similar to the Infinity Cache found in RDNA 3 discrete GPUs).
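
For context, the raw DRAM bandwidth of such an interface works out to roughly 273 GB/s; the quick arithmetic below is ours, and the ~500 GB/s figure in the slide presumably blends raw bandwidth with hits in the 32 MB L4 cache.

```python
# Raw bandwidth of the claimed 256-bit LPDDR5X-8533 memory interface.
transfer_rate_mtps = 8533  # mega-transfers per second, per pin
bus_width_bits = 256
raw_gb_s = transfer_rate_mtps * bus_width_bits / 8 / 1000  # ~273 GB/s
print(f"Raw LPDDR5X bandwidth: ~{raw_gb_s:.0f} GB/s")
# The slide's ~500 GB/s "effective cached bandwidth" presumably also
# counts accesses served from the 32 MB on-die L4 cache.
```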

The iGPU isn't the only logic-heavy, memory-sensitive device on the SoC die; there's also an NPU. From what we gather, this is the exact same NPU found in "Strix Point" processors: it is based on the XDNA 2 architecture developed by AMD's Xilinx team, and delivers around 45-50 AI TOPS.
The SoC I/O of "Strix Halo" isn't as comprehensive as that of "Fire Range," because the chip has been designed around the idea that the notebook will rely on its large iGPU. It has PCIe Gen 5, but only 12 lanes in total: 4 toward an M.2 NVMe slot, and 8 to spare for a discrete GPU (if present), although these can be used to connect any PCIe device, including additional M.2 slots. There's also integrated 40 Gbps USB4 and 20 Gbps USB 3.2 Gen 2x2.
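
To make the lane budget concrete, here is a purely illustrative sketch of the 12-lane Gen 5 split; the device labels are hypothetical placeholders, as only the 4 + 8 division comes from the leak.

```python
# Illustrative split of the 12 PCIe Gen 5 lanes; labels are hypothetical.
lane_budget = {
    "M.2 NVMe SSD": 4,                 # primary storage slot
    "dGPU or other PCIe devices": 8,   # spare lanes, e.g. extra M.2 slots
}
assert sum(lane_budget.values()) == 12  # matches the 12-lane total
print(lane_budget)
```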

As for the CPU, since "Strix Halo" uses one or two "Zen 5" CCDs, its CPU performance should be similar to "Fire Range." You get up to 16 "Zen 5" CPU cores, with 32 MB of L3 cache per CCD, or 64 MB of total CPU L3 cache. The CCDs connect to the SoC die either over conventional IFOP (Infinity Fabric over Package), just like in "Fire Range" and "Granite Ridge," or possibly over Infinity Fanout links like those on some of AMD's chiplet-based RDNA 3 discrete GPUs.
Lastly, there are some highly speculative performance predictions for the "Strix Halo" iGPU, which put it competitive with the GeForce RTX 4060M and RTX 4070M.
Sources: ChipHell Forums, harukaze5719 (Twitter)

109 Comments on AMD "Strix Halo" Zen 5 Mobile Processor Pictured: Chiplet-based, Uses 256-bit LPDDR5X

#101
nullington
Daven: Do you really need 16 cores in such a beast? Or put another way, how are they fitting all this into such a small package?
It is much harder to create the hardware. Only 2 companies in the world can do it and bring a product to market (AMD and Intel). So generally the hardware has to come before widespread usage in software. Nobody is going to put much effort into creating software when there is no market/users for it.
Posted on Reply
#102
8086
Daven: The bad AMD driver quality misinformation is an internet myth perpetuated by bad players. There is a lot of speculation on the who with regard to these bad players, from viral Nvidia marketing to brand loyalists. But rest assured, as you have found out, there is no truth to it.

There's also thinking out there that if company A does something better than company B, then it means company B has bad quality control or doesn't know how to make good products. This relates to super sampling and ray tracing for the current discussion. These two things are features which Nvidia simply does better. It has no relationship to drivers or driver quality. If these features are not important to you, paying the extra premium priced into Nvidia products for said features would be a waste of money.
AMD has been making good drivers for over 20 years now. Prior to the ATi 9700 Pro, the claim that ATi drivers are not good could have held true, but since the 9700 Pro and AMD's takeover, the quality has done nothing but improve. People need to stop spreading what they don't know and have only heard.
Posted on Reply
#103
AusWolf
8086: AMD has been making good drivers for over 20 years now. Prior to the ATi 9700 Pro, the claim that ATi drivers are not good could have held true, but since the 9700 Pro and AMD's takeover, the quality has done nothing but improve. People need to stop spreading what they don't know and have only heard.
I have to add, I did have driver-related problems with the 5700 XT. But before and after that, my experience with the drivers has been flawless.
Posted on Reply
#104
Neo_Morpheus
Kohl Baas: I read on a forum that it evolved from the "nVidia is faster but ATI has a better image" parable, which itself originated in the pre-DX9 era, when some rendering/imaging methods were not yet standardized and the manufacturers did their own separate solutions (don't ask, I wasn't really into this back then; it was around the DX10-11 times when I read it). It held on, relatively falsely, during DX9 because of the different HDR profiling they used.
Actually, that's even older, and I recall that it went like this, quality-wise:


1- Matrox
2- ATI
3- Ngreedia

And it held all the time, during regular usage and not just gaming: better color reproduction, sharpness, image stability, etc.

Really miss those days.
Posted on Reply
#105
bug
Neo_Morpheus: Actually, that's even older, and I recall that it went like this, quality-wise:


1- Matrox
2- ATI
3- Ngreedia

And it held all the time, during regular usage and not just gaming: better color reproduction, sharpness, image stability, etc.

Really miss those days.
There were also STB and S3 before that.
Posted on Reply
#106
ToTTenTranz
Neo_Morpheus: Actually, that's even older, and I recall that it went like this, quality-wise:


1- Matrox
2- ATI
3- Ngreedia

And it held all the time, during regular usage and not just gaming: better color reproduction, sharpness, image stability, etc.

Really miss those days.
That's from the time we used analog RGB on monitors, and Matrox used higher quality external RAMDAC chips in their Millennium range.
It had little to do with the GPU architecture or drivers.
Posted on Reply
#107
8086
AusWolf: I have to add, I did have driver-related problems with the 5700 XT. But before and after that, my experience with the drivers has been flawless.
No one ever talks about it, but over the years Nvidia has had its share of driver issues, including one infamous update that was killing GPUs left and right.

Matrox is still alive.

Posted on Reply
#108
Neo_Morpheus
ToTTenTranz: That's from the time we used analog RGB on monitors, and Matrox used higher quality external RAMDAC chips in their Millennium range.
Indeed, and I recall hearing/reading about them using superior/better components.
ToTTenTranz: It had little to do with the GPU architecture or drivers.
I never stated those specific points, simply brands/companies, as it was discussed back in the day.
Posted on Reply
#109
8086
I had a high-end Matrox card back in the day and the image quality was stunning. Better black levels than any other graphics card I owned in the early 2000s.
Posted on Reply