Friday, April 19th 2024

AMD "Strix Halo" Zen 5 Mobile Processor Pictured: Chiplet-based, Uses 256-bit LPDDR5X

Enthusiasts on the ChipHell forum scored an alleged image of AMD's upcoming "Strix Halo" mobile processor, and set out to create some highly plausible schematic slides. These are speculative. While "Strix Point" is the mobile processor that succeeds the current "Hawk Point" and "Phoenix" processors, "Strix Halo" is in a category of its own: it aims to offer gaming experiences comparable to discrete GPUs in ultraportable form-factors where powerful discrete GPUs are generally not possible. "Strix Halo" also goes head-to-head against Apple's M3 Pro and M3 Max processors powering the latest crop of MacBook Pros, and it enjoys the same advantages of a single-chip solution as the M3 Max.

The "Strix Halo" silicon is a chiplet-based processor, although very different from "Fire Range". The "Fire Range" processor is essentially a BGA version of the desktop "Granite Ridge" processor—it's the same combination of one or two "Zen 5" CCDs that talk to a client I/O die, and is meant for performance-thru-enthusiast segment notebooks. "Strix Halo," on the other hand, use the same one or two "Zen 5" CCDs, but with a large SoC die featuring an oversized iGPU, and 256-bit LPDDR5X memory controllers not found on the cIOD. This is key to what AMD is trying to achieve—CPU and graphics performance in the league of the M3 Pro and M3 Max at comparable PCB and power footprints.
The iGPU of the "Strix Halo" processor is based on the RDNA 3+ graphics architecture, and features a massive 40 compute units. These work out to 2,560 stream processors, 80 AI accelerators, 40 ray accelerators, 160 TMUs, and an unknown number of ROPs (we predict at least 64). The slide predicts an iGPU engine clock as high as 3.00 GHz.
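Assuming the standard RDNA 3 per-CU ratios of 64 stream processors, four TMUs, two AI accelerators, and one ray accelerator (AMD hasn't publicly detailed how RDNA 3+ differs), the figures above fall straight out of the 40-CU count. A minimal sketch of that arithmetic, with all variable names ours:

```python
# Back-of-the-envelope check of the rumored iGPU configuration,
# assuming RDNA 3-style per-CU ratios (not confirmed for RDNA 3+).
compute_units = 40

stream_processors = compute_units * 64   # 2,560
tmus              = compute_units * 4    # 160
ai_accelerators   = compute_units * 2    # 80
ray_accelerators  = compute_units * 1    # 40

print(stream_processors, tmus, ai_accelerators, ray_accelerators)
```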

Graphics is an extremely memory-sensitive application, so AMD is using a 256-bit (quad-channel, or octa-subchannel) LPDDR5X-8533 memory interface, for an effective cached bandwidth of around 500 GB/s. The memory controllers are cushioned by a 32 MB L4 cache located on the SoC die. The way we understand this cache hierarchy, the CCDs (CPU cores) can treat it as a victim cache, while the iGPU treats it like an L2 cache (similar to the Infinity Cache found on RDNA 3 discrete GPUs).
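For context, the raw (pre-cache) bandwidth of a 256-bit LPDDR5X-8533 interface works out to roughly 273 GB/s; the ~500 GB/s figure above is presumably an effective number that assumes frequent hits in the 32 MB L4. A rough sketch of the raw-bandwidth math (our own calculation, not from the slides):

```python
# Raw (pre-cache) bandwidth of the rumored 256-bit LPDDR5X-8533 interface.
# The ~500 GB/s "effective cached" figure presumably adds the 32 MB L4's
# contribution on top of this; the hit rate needed for that is unknown.
bus_width_bits = 256
data_rate_mtps = 8533                      # mega-transfers per second per pin

raw_bandwidth_gbs = (bus_width_bits / 8) * data_rate_mtps / 1000
print(f"Raw DRAM bandwidth: {raw_bandwidth_gbs:.0f} GB/s")   # ~273 GB/s
```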

The iGPU isn't the only logic-heavy, memory-sensitive device on the SoC die; there's also an NPU. From what we gather, this is the exact same NPU found in "Strix Point" processors, with a performance of around 45-50 AI TOPS, and is based on the XDNA 2 architecture developed by AMD's Xilinx team.
The SoC I/O of "Strix Halo" isn't as comprehensive as that of "Fire Range," because the chip has been designed around the idea that the notebook will use its large iGPU. It has PCIe Gen 5, but only a total of 12 Gen 5 lanes: 4 toward an M.2 NVMe slot, and 8 to spare for a discrete GPU (if present), although these can be used to connect any PCIe device, including additional M.2 slots. There's also integrated 40 Gbps USB4 and 20 Gbps USB 3.2 Gen 2x2 connectivity.

As for the CPU, since "Strix Halo" uses one or two "Zen 5" CCDs, its CPU performance will be similar to that of "Fire Range." You get up to 16 "Zen 5" CPU cores, with 32 MB of L3 cache per CCD, or 64 MB of total CPU L3 cache. The CCDs are connected to the SoC die either using conventional IFOP (Infinity Fabric over Package), just like "Fire Range" and "Granite Ridge," or possibly using Infinity Fanout links like those on some of AMD's chiplet-based RDNA 3 discrete GPUs.
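Assuming standard "Zen 5" CCDs with eight cores and 32 MB of L3 each, plus the 32 MB L4 on the SoC die described above, the rumored cache budget adds up as follows (a sketch under those assumptions, not a confirmed configuration):

```python
# Rumored "Strix Halo" CPU-side totals, assuming two standard "Zen 5" CCDs
# (8 cores / 32 MB L3 each) plus the 32 MB L4 on the SoC die.
ccds          = 2
cores_per_ccd = 8
l3_per_ccd_mb = 32
soc_l4_mb     = 32

total_cores = ccds * cores_per_ccd                     # 16 cores
total_l3_mb = ccds * l3_per_ccd_mb                     # 64 MB CPU L3
cpu_visible_cache_mb = total_l3_mb + soc_l4_mb         # 96 MB, if the L4 acts as a CPU victim cache

print(total_cores, total_l3_mb, cpu_visible_cache_mb)
```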
Lastly, there are some highly speculative performance predictions for the "Strix Halo" iGPU, which put it as competitive with the GeForce RTX 4060M and RTX 4070M.
Sources: ChipHell Forums, harukaze5719 (Twitter)

109 Comments on AMD "Strix Halo" Zen 5 Mobile Processor Pictured: Chiplet-based, Uses 256-bit LPDDR5X

#26
TristanX
Any confirmation that it is chiplet-based? Monolithic could be much more justified (both use N4), as additional controllers and PHYs take space and power, and are costly in packaging.
#27
SL2
Noyand: AMD chiplets are tiny. If the schematics are accurate,
Most likely not. Compare the IOD size with the one in your Dragon Range pic; that's a tiny increase for a beefy graphics upgrade (20 times more CUs), even if not the same node.

Also:
These are speculative.
TristanX: Monolithic could be much more justified (both use N4), as additional controllers and PHYs take space and power, and are costly in packaging
If they want to go above 8 cores, it makes sense, just as it does on desktop CPUs. Otherwise they have to make some huge dies with everything in one, which will cost a lot and give lower yields.
I guess going >300 mm² isn't something AMD does on CPUs/APUs these days.
#28
G777
So it's configurable up to 150 or 175W; that's about the same as (actually it's more than) the max total power draw of a 2024 Zephyrus G14 laptop, and it's projected to achieve about the same performance GPU-wise. Even the lowest power configuration at 45W is more than the standard for ultrabooks.

This product seems quite niche to me. I suppose you can build slightly more compact high-performance laptops by saving some space on the dGPU and the associated components, but given the high idle power consumption AMD's chiplet-based processors tend to have, battery life may not be great. It may also be used in the most premium desktop-replacement laptops just so they can have the best components, but then those would come with the downside of soldered RAM. I suspect that it will be too power-hungry to compete against the M3/M4 Max.
#29
wNotyarD
G777: So it's configurable up to 150 or 175W; that's about the same as (actually it's more than) the max total power draw of a 2024 Zephyrus G14 laptop, and it's projected to achieve about the same performance GPU-wise. Even the lowest power configuration at 45W is more than the standard for ultrabooks.

This product seems quite niche to me. I suppose you can build slightly more compact high-performance laptops by saving some space on the dGPU and the associated components, but given the high idle power consumption AMD's chiplet-based processors tend to have, battery life may not be great. It may also be used in the most premium desktop-replacement laptops just so they can have the best components, but then those would come with the downside of soldered RAM. I suspect that it will be too power-hungry to compete against the M3/M4 Max.
Ain't every Zen 4 mobile model monolithic? Why would Zen 5 mobile be chiplet based?
#31
G777
wNotyarD: Ain't every Zen 4 mobile model monolithic? Why would Zen 5 mobile be chiplet based?
Strix Halo is chiplet-based as described in the article, as are Dragon Range (Ryzen 7x45HX) and the upcoming Fire Range (both are essentially BGA versions of the desktop client processors).
#32
wNotyarD
G777: Strix Halo is chiplet-based as described in the article, as are Dragon Range (Ryzen 7x45HX) and the upcoming Fire Range (both are essentially BGA versions of the desktop client processors).
I stand corrected then.
#33
AnotherReader
TumbleGeorge: LPDDR5X ok, soldered RAM. Not ok for me.
Unfortunately, LPDDR5X is far more efficient than DDR5, so soldered RAM is the tradeoff for better power efficiency.
#34
TheinsanegamerN
Daven: The bad AMD driver quality misinformation is an internet myth perpetuated by bad players. There is a lot of speculation on who these bad players are, from viral Nvidia marketing to brand loyalists. But rest assured, as you have found out, there is no truth to it.

There's also thinking out there that if company A does something better than company B, then it means company B has bad quality control or is ignorant about making good products. This relates to super sampling and ray tracing for the current discussion. These two things are features which Nvidia simply does better. It has no relationship to drivers or driver quality. If these features are not important to you, paying the extra premium priced into Nvidia products for said features would be a waste of money.
It's ironic: the ones screaming "MuH mIsInFoRmAtIoN" are themselves guilty of misinformation. Every time. Remember: Nvidia made FCAT and falsified the results to perpetuate the myth! LMAO

BTW, Lisa isn't going to send you a thank-you card for meatshielding them.
SRB151: I've always wondered about this statement. I ran Nvidia cards for years until they pulled that GeForce Partner Program, when I switched to AMD (also as much for the price/performance ratio; I can't afford $1500-2k for a GPU). I can't think of a single bug which was really a show stopper with either of them. The most annoying problem I ever had was the power draw with dual monitors, and eventually that got fixed.
You skipped a HUGE portion of AMD's driver history. After buying ATi, AMD never put sufficient resources into their graphics driver development. From the HD 2000 series through the HD 6000 series, it was standard practice to keep multiple drivers on hand, depending on what game you wanted to play. New drivers would break older games more often than not. When the HD 7000s came out, the HD 6000s and older were left to wither, cut off half a decade before Nvidia's Fermi was.

There were the frame pacing issues that AMD swore up and down didn't exist until Nvidia proved they did, the 200 and 300 series black screen issues that were never truly resolved, and tons of optimization issues. In 2017, after the launch of Polaris, AMD committed to fixing their drivers, and by 2019 they were in a much better state.

To most reasonable people, it's understandable why nearly two decades of poor-quality drivers (ATI wasn't a whole lot better) have turned a large portion of the market against AMD. A few years of good drivers don't immediately fix that. And AMD still hasn't escaped controversy. Dropping GPUs after only six years (Fury X), not releasing drivers for the RX 6000s for nearly three months because of the RX 7000s (something Nvidia has never done), and the more recent minor feather-ruffling with TinyBuild don't help their image.

They're worlds away from where they were, and for most users they operate without issue. Perceptions, however, do not change on a dime; they take a lot of time and dedication to change.
progste: That GPU looks BEEFY!
Would be interesting to see this in a portable gaming machine like the steam deck, but maybe it's too power-hungry for that?
A thin gaming laptop with this could turn out very nice.
I can imagine a 13-14" laptop would be able to handle it.

Razer, get on it!
AnotherReader: Unfortunately, LPDDR5X is far more efficient than DDR5, so soldered RAM is the tradeoff for better power efficiency.
Don't forget WAY higher speeds. 9500 MT/s is available now and will supposedly be used with Zen 5 mobile, and 10700 MT/s has been shown off by Samsung.
#35
Daven
G777: So it's configurable up to 150 or 175W; that's about the same as (actually it's more than) the max total power draw of a 2024 Zephyrus G14 laptop, and it's projected to achieve about the same performance GPU-wise. Even the lowest power configuration at 45W is more than the standard for ultrabooks.

This product seems quite niche to me. I suppose you can build slightly more compact high-performance laptops by saving some space on the dGPU and the associated components, but given the high idle power consumption AMD's chiplet-based processors tend to have, battery life may not be great. It may also be used in the most premium desktop-replacement laptops just so they can have the best components, but then those would come with the downside of soldered RAM. I suspect that it will be too power-hungry to compete against the M3/M4 Max.
The power numbers probably include the RAM as well. Clocks can be set to different power envelopes. The entire SoC would be close to the whole system power draw. The 12-core, 32 CU version would be even lower.
#36
Denver
I hope this NPU has some use beyond fueling Microsoft's megalomaniacal dreams of having AI at our necks all the time.
#37
Carillon
TumbleGeorge: LPDDR5X ok, soldered RAM. Not ok for me.
If the MC is like the one in Phoenix, it should be able to run DDR5.
I doubt any laptop manufacturer would put in four SO-DIMM slots, but they might, just to sell them with only one stick.

Also, if the article is correct, this is all speculation from some randoms on the internet who have seen an image of a thing and came up with some other images themselves.
#38
umdterps71
Anyone else think this might be part of an Xbox handheld or a lower power version?
#39
Cheeseball
Not a Potato
I still think this is possibly going in the purported PS5 Pro. The embedded and shared LPDDR5X RAM kinda gives it away.

The specs are direct upgrades from the Zen 2/Oberon APU of the PS5 and Zen 2/Scarlett APU of the Series X too.

However, I too really wish this goes in some sort of ultra-portable or lightweight gaming laptop.
umdterps71: Anyone else think this might be part of an Xbox handheld or a lower power version?
I wish, but anything above 100W TDP in a handheld is going to push it out of its category for sure.
#40
Zareek
TheinsanegamerN: You skipped a HUGE portion of AMD's driver history. After buying ATi, AMD never put sufficient resources into their graphics driver development. From the HD 2000 series through the HD 6000 series, it was standard practice to keep multiple drivers on hand, depending on what game you wanted to play. New drivers would break older games more often than not. When the HD 7000s came out, the HD 6000s and older were left to wither, cut off half a decade before Nvidia's Fermi was.

There were the frame pacing issues that AMD swore up and down didn't exist until Nvidia proved they did, the 200 and 300 series black screen issues that were never truly resolved, and tons of optimization issues. In 2017, after the launch of Polaris, AMD committed to fixing their drivers, and by 2019 they were in a much better state.

To most reasonable people, it's understandable why nearly two decades of poor-quality drivers (ATI wasn't a whole lot better) have turned a large portion of the market against AMD. A few years of good drivers don't immediately fix that. And AMD still hasn't escaped controversy. Dropping GPUs after only six years (Fury X), not releasing drivers for the RX 6000s for nearly three months because of the RX 7000s (something Nvidia has never done), and the more recent minor feather-ruffling with TinyBuild don't help their image.

They're worlds away from where they were, and for most users they operate without issue. Perceptions, however, do not change on a dime; they take a lot of time and dedication to change.
Not in my experience. I ran an HD 3870 and an HD 5830 card, and I never had to change drivers to play different games or experienced any of these purported driver issues. I've been switching between ATI/AMD and Nvidia cards for nearly 30 years, and I can honestly say I've had more driver issues with Nvidia cards than I ever did with ATI/AMD. It's more of a minor annoyance than anything, but with my 3060 Ti, right now I randomly get checkerboards that pop up here and there on webpages. They only appear for 10 seconds or so and disappear; it's also very random. I've seen it with Brave, Chrome, Edge, FF and Vivaldi. This only started when I installed 552.12. I guess I need to try 552.22. I'm constantly installing a new driver with Nvidia cards, at least it feels that way. Updating drivers once every three to six months feels more reasonable to me; I guess I'm in the minority on that.
#41
Noyand
TumbleGeorge: LPDDR5X ok, soldered RAM. Not ok for me.
I think that it would be pretty hard to reach the 500 GB/s memory bandwidth with SO-DIMMs, especially when they can't reach high frequencies. I'm not even sure that the current iteration of CAMM can reach that amount.
#42
dir_d
Denver: I hope this NPU has some use beyond fueling Microsoft's megalomaniacal dreams of having AI at our necks all the time.
Me too. I really think the NPU is like a Xilinx FPGA. I really hope many programs like HandBrake and others get on board to use the NPU to speed up work processing, and it's not just some dumb Microsoft-only AI cash grab.
#43
Daven
Zareek: Not in my experience. I ran an HD 3870 and an HD 5830 card, and I never had to change drivers to play different games or experienced any of these purported driver issues. I've been switching between ATI/AMD and Nvidia cards for nearly 30 years, and I can honestly say I've had more driver issues with Nvidia cards than I ever did with ATI/AMD. It's more of a minor annoyance than anything, but with my 3060 Ti, right now I randomly get checkerboards that pop up here and there on webpages. They only appear for 10 seconds or so and disappear; it's also very random. I've seen it with Brave, Chrome, Edge, FF and Vivaldi. This only started when I installed 552.12. I guess I need to try 552.22. I'm constantly installing a new driver with Nvidia cards, at least it feels that way. Updating drivers once every three to six months feels more reasonable to me; I guess I'm in the minority on that.
No matter how many times we say drivers from both AMD and Nvidia are of similar quality, with tons and tons of proof, there continue to be some calling us liars. Oh well.

Ironically, many of these naysayers are not AMD GPU users and are just spreading FUD.

With regard to what you were saying, I too have been switching back and forth between Nvidia and AMD without significant issues from one or the other. I started with 3dfx products, moved to the TNT2, GeForce4 Ti 4200 and then the Radeon 9800 Pro. Between then and now I've owned a dozen cards from both companies in my primary and secondary builds.
#44
G777
One thing of note: with a 256-bit memory bus and the iGPU inactive, the CPU may have access to much higher memory bandwidth than desktop processors. I wonder if there will be cases where Strix Halo can outperform desktop Granite Ridge.
#45
stimpy88
Such low memory bandwidth, but it would be interesting if you scaled this up to 512-bit, beefed up the GPU just a little more, and made a console out of it...
#46
Zareek
Daven: No matter how many times we say drivers from both AMD and Nvidia are of similar quality, with tons and tons of proof, there continue to be some calling us liars. Oh well.

Ironically, many of these naysayers are not AMD GPU users and are just spreading FUD.

With regard to what you were saying, I too have been switching back and forth between Nvidia and AMD without significant issues from one or the other. I started with 3dfx products, moved to the TNT2, GeForce4 Ti 4200 and then the Radeon 9800 Pro. Between then and now I've owned a dozen cards from both companies in my primary and secondary builds.
Wow, the memories. I think I went Riva TNT, Riva TNT2 Ultra, GeForce2 GTS, Radeon 9700 Pro, GeForce 6800 GT, HD 3870, HD 5830, GeForce GTX 970, Vega 64 and finally RTX 3060 Ti. From about the 9700 Pro forward, there were usually at least two and sometimes three other cards or laptop GPUs in use, some new, some hand-me-downs. Typically, we had at least one AMD/ATI and one Nvidia card running at the same time. Right now, my wife is running an RX 6600M with zero driver issues. She loves her Minisforum HX100G.
#47
AusWolf
How do you fit 40 CUs onto an SoC die? This is insane! :eek:
#48
R-T-B
Daven: The bad AMD driver quality misinformation is an internet myth perpetuated by bad players. There is a lot of speculation on who these bad players are, from viral Nvidia marketing to brand loyalists. But rest assured, as you have found out, there is no truth to it.
Thanks for invalidating a metric butt-ton of otherwise valid experiences, mine included.
Daven: Ironically, many of these naysayers are not AMD GPU users and are just spreading FUD.
Yeah, some maybe. Not all. And that's all it takes.

*Points at build*
#49
R0H1T
umdterps71: Anyone else think this might be part of an Xbox handheld or a lower power version?
No, it looks like a competitor to the M4/Pro/Max and above! It's also possible AMD/Intel may finally get around to enabling quad-channel memory on normal (high-end) desktops to fend off Apple & Co. in the future. Right now they do have the slight advantage of more "choice" wrt RAM or SSD, but they're quite a bit behind in memory bandwidth against Apple's top-end parts.
#50
Zareek
R-T-B: Thanks for invalidating a metric butt-ton of otherwise valid experiences, mine included.


Yeah, some maybe. Not all. And that's all it takes.

*Points at build*
So wait, your 7900XTX Linux build is proof that AMD's Windows drivers are crappy?