Wednesday, August 9th 2023

Suppliers Amp Up Production, HBM Bit Supply Projected to Soar by 105% in 2024

TrendForce highlights in its latest report that memory suppliers are boosting their production capacity in response to escalating orders from NVIDIA and from cloud service providers (CSPs) for their in-house designed chips. These efforts include the expansion of TSV production lines to increase HBM output. Forecasts based on current production plans from suppliers indicate a remarkable 105% annual increase in HBM bit supply in 2024. However, because TSV expansion takes time, with equipment delivery and testing alone running 9 to 12 months, the majority of the new HBM capacity is not expected to come online until 2Q24.

TrendForce analysis indicates that 2023 to 2024 will be pivotal years for AI development, triggering substantial demand for AI training chips and thereby boosting HBM utilization. However, as the focus pivots to inference, the annual growth rate for AI training chips and HBM is expected to taper off slightly. The imminent boom in HBM production presents suppliers with a difficult balancing act: they need to meet customer demand to expand market share while avoiding a surplus from overproduction. Another concern is the potential risk of overbooking, as buyers anticipating an HBM shortage might inflate their orders.
HBM3 is slated to considerably elevate HBM revenue in 2024 with its superior ASP
The HBM market in 2022 was marked by sufficient supply. However, an explosive surge in AI demand in 2023 has prompted clients to place advance orders, stretching suppliers to their capacity limits. Looking ahead, TrendForce forecasts that, thanks to aggressive expansion by suppliers, the HBM sufficiency ratio will rise from -2.4% in 2023 to 0.6% in 2024.
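
For readers unfamiliar with the metric, here is a minimal sketch of what those sufficiency figures imply for the supply/demand balance, assuming the common definition of sufficiency as (supply - demand) / demand; the -2.4% and 0.6% values are simply TrendForce's numbers restated.

```python
# Back-of-envelope reading of the HBM sufficiency ratio, assuming the
# common definition: sufficiency = (supply - demand) / demand.
def supply_vs_demand(sufficiency_ratio: float) -> float:
    """Return supply expressed as a fraction of demand."""
    return 1.0 + sufficiency_ratio

for year, ratio in [("2023", -0.024), ("2024", 0.006)]:
    print(f"{year}: supply covers {supply_vs_demand(ratio):.1%} of demand")

# 2023: supply covers 97.6% of demand  (slight undersupply)
# 2024: supply covers 100.6% of demand (slight oversupply)
```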

TrendForce believes that primary demand is shifting from HBM2e to HBM3 in 2023, with anticipated demand shares of approximately 50% and 39%, respectively. As more chips that use HBM3 hit the market, demand in 2024 will lean heavily toward HBM3, eclipsing HBM2e with a projected share of 60%. This expected surge, coupled with a higher ASP, is likely to trigger a significant increase in HBM revenue next year.
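
To illustrate why a mix shift toward HBM3 lifts revenue even without extra bit growth, here is a minimal blended-ASP sketch. The generation shares are the approximate figures cited above; the per-GB prices and the residual "other" shares are purely hypothetical placeholders, not TrendForce data, chosen only to show the mechanism.

```python
# Illustrative blended-ASP calculation. Shares for HBM2e/HBM3 follow the
# report; prices and the "other" remainder are HYPOTHETICAL assumptions.
asp_per_gb = {"HBM2e": 10.0, "HBM3": 15.0, "other": 8.0}  # assumed, for illustration only

mix_2023 = {"HBM2e": 0.50, "HBM3": 0.39, "other": 0.11}
mix_2024 = {"HBM2e": 0.30, "HBM3": 0.60, "other": 0.10}   # HBM3 at 60% per the forecast

def blended_asp(mix: dict) -> float:
    return sum(share * asp_per_gb[gen] for gen, share in mix.items())

print(f"blended ASP 2023: ${blended_asp(mix_2023):.2f}/GB (hypothetical)")
print(f"blended ASP 2024: ${blended_asp(mix_2024):.2f}/GB (hypothetical)")
# Even with flat per-generation pricing, the richer HBM3 mix raises the blend.
```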

SK hynix currently holds the lead in HBM3 production, serving as the principal supplier for NVIDIA's server GPUs. Samsung, on the other hand, is focusing on satisfying orders from other CSPs. The gap in market share between Samsung and SK hynix is expected to narrow significantly this year due to an increasing number of orders for Samsung from CSPs. Both firms are predicted to command similar shares in the HBM market sometime between 2023 and 2024, collectively occupying around 95%. However, variations in shipment performance may arise across different quarters due to their distinct customer bases. Micron, which has focused mainly on HBM3e development, may see a slight decrease in market share over the next two years due to the aggressive expansion plans of these two South Korean manufacturers.

Prices for older HBM generations are expected to drop in 2024, while HBM3 prices may remain steady
From a long-term perspective, TrendForce notes that the ASP of HBM products gradually decreases year on year. Given HBM's high-profit nature and a unit price far exceeding that of other DRAM products, suppliers aim to incrementally reduce prices to stimulate customer demand, leading to a price decline for HBM2e and HBM2 in 2023.

Even though suppliers have yet to finalize their pricing strategies for 2024, TrendForce doesn't rule out the possibility of further price reductions for HBM2 and HBM2e products, given a significant improvement in the overall HBM supply and suppliers' endeavors to broaden their market shares. Meanwhile, HBM3 prices are forecast to stay consistent with 2023. Owing to its significantly higher ASP compared to HBM2e and HBM2, HBM3 is poised to bolster suppliers' HBM revenue, potentially propelling total HBM revenue to a whopping US$8.9 billion in 2024—a 127% YoY increase.
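
As a quick sanity check on the headline figure, the implied 2023 baseline can be backed out from the US$8.9 billion forecast and the 127% growth rate; this is a rough arithmetic sketch, not a TrendForce-published number.

```python
# Back out the implied 2023 HBM revenue from the 2024 forecast.
revenue_2024_busd = 8.9   # TrendForce forecast, US$ billion
yoy_growth = 1.27         # 127% YoY increase

implied_2023 = revenue_2024_busd / (1 + yoy_growth)
print(f"implied 2023 HBM revenue: ~US${implied_2023:.1f} billion")
# ~US$3.9 billion; 3.9 * 2.27 ≈ 8.9, consistent with the stated growth rate.
```
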
Source: TrendForce

12 Comments on Suppliers Amp Up Production, HBM Bit Supply Projected to Soar by 105% in 2024

#1
Denver
AMD must be feeling bad about not charging for HBM IP.
#2
Philaphlous
Maybe we'll finally see HBM adopted for GPUs? Finally in laptops too saving a bunch of space....
#3
Denver
Philaphlous: Maybe we'll finally see HBM adopted for GPUs? Finally in laptops too saving a bunch of space....
Huh? With the (already expensive) price of HBM increasing by 2-3x, do you expect that?

HBM is efficient, but it is complex and expensive to produce, and this only gets worse with each iteration. Servers and AI are the perfect market for this type of memory, because everything there is expensive and carries very high profit margins.
#5
TumbleGeorge
Denver: it is complex and expensive to produce
The same can be said for literally any modern electronics. But if you present me with actual manufacturing costs per bit for HBM vs. GDDR, I'll agree.
#6
Denver
TumbleGeorge: The same can be said for literally any modern electronics. But if you present me with actual manufacturing costs per bit for HBM vs. GDDR, I'll agree.

"One of the things that plagues high bandwidth memory right now is cost,” said Marc Greenberg, group director for product marketing in the IP group at Cadence. “3D stacking is expensive. There’s a logic die that sits at the base of the stack of dies, which is an additional piece of silicon you have to pay for. And then there’s a silicon interposer, which goes under everything under the CPU or GPU, as well as the HBM memories. That has a cost. Then, you need a larger package, and so on. There are a number of system costs that take HBM as it exists today out of the consumer domain and put it more firmly in the server room or the data center. By contrast, graphics memories like GDDR6, while they don’t offer as much performance as the HBM, do so at significantly less cost. The performance per unit cost on GDDR6 is actually much better than HBM, but the maximum bandwidth of a GDDR6 device doesn’t match the maximum bandwidth of an HBM.”

source: semiengineering.com/hbms-future-necessary-but-expensive/

By the nature of how HBM is produced, it is also expected to have a higher defect rate.
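
To make the "performance per unit cost" point in the quote concrete, here is a minimal sketch comparing bandwidth per dollar for a single GDDR6 device versus an HBM2e stack. The bus widths and pin rates are typical datasheet-class figures, but the unit costs are hypothetical placeholders, so only the shape of the trade-off matters.

```python
# Illustrative bandwidth-per-dollar comparison. The prices are HYPOTHETICAL
# placeholders; the point is the trade-off described in the quote above:
# GDDR6 wins on bandwidth per dollar, HBM wins on bandwidth per device.
devices = {
    # name: (bandwidth in GB/s per device, assumed unit cost in US$)
    "GDDR6 (16 Gb/s x32 device)": (64, 12.0),    # 16 Gb/s * 32 bits / 8
    "HBM2e stack (1024-bit)":     (460, 120.0),  # ~3.6 Gb/s * 1024 bits / 8
}

for name, (bandwidth, cost) in devices.items():
    print(f"{name}: {bandwidth} GB/s, ~{bandwidth / cost:.1f} GB/s per $ (hypothetical)")
```
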
#8
Denver
TumbleGeorge: Oh, that's a very good example that doesn't include numbers like: X costs 0.0001 cents per bit to produce and Y costs 0.0003 cents per bit. Market prices aren't an argument about what the BOM numbers are.
By the logic of having more components and complexity, production costs are higher, but it is not easy for the general public to get accurate data on how large this difference is.

The best technical and cost comparison I've seen of HBM2 vs. GDDR was the one done by GamersNexus, but it's been years, and the cost of HBM has only gone up since then instead of down:

"The next question is what GDDR5 costs. A recent DigiTimes report pegs GDDR5 at about $6.50 for an 8Gb module, though also shows pricing for August onward at $8.50 per module. With old pricing, that’s around $52 cost for an 8GB card, or $68 with new pricing. We do not presently know GDDR5X cost. This puts us at around 3x the cost for HBM2 which, even without factoring in yields or the large GPU die, shows why AMD’s margins are so thin on Vega. We also know that AMD is passing along its HBM2 cost to partners at roughly a 1:1 rate – they’re not upcharging it, which is what typically happens with GDDR. There’s no room to upcharge the HBM2 with Vega’s price target.

Ignoring GPU cost and cost of less significant components, like the VRM and cooler, we’re at $100-$130 more than 8GB of GDDR5 cost to build. This is also ignoring other costs, like incalculable R&D or packaging costs."

The Cost of HBM2 vs. GDDR5 & Why AMD Had to Use It | GamersNexus - Gaming PC Builds & Hardware Benchmarks
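
A quick back-of-envelope from the figures quoted above; this is just a sketch reproducing the arithmetic in the excerpt, not new data. Eight 8 Gb (1 GB) modules make an 8 GB card, and the implied HBM2-plus-interposer cost follows from the quoted "3x" multiple and the "$100-$130 more" delta.

```python
# Reproduce the arithmetic in the GamersNexus quote above.
modules_per_8gb = 8                  # eight 8 Gb (1 GB) modules per 8 GB card

gddr5_old = 6.50 * modules_per_8gb   # ~$52 at the older module price
gddr5_new = 8.50 * modules_per_8gb   # ~$68 at the August-onward price
print(f"8 GB GDDR5: ${gddr5_old:.0f} (old pricing) to ${gddr5_new:.0f} (new pricing)")

# The quote pegs HBM2 at roughly 3x the GDDR5 cost, and separately at
# $100-$130 more once the interposer/packaging delta is included.
print(f"~3x GDDR5: ${3 * gddr5_old:.0f} to ${3 * gddr5_new:.0f}")
print(f"GDDR5 + $100-$130 delta: ${gddr5_old + 100:.0f} to ${gddr5_new + 130:.0f}")
```
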
#9
persondb
One thing to consider about HBM is that it also reduces complexity at the PCB level, which is increasing at an alarming rate. You can already see how tight things are getting when you look at how close a 4090's memory modules sit to the GPU.

With that said, I am not really hopeful that there will ever be consumer GPUs with HBM again, but maybe something like what Apple does with its M-series chips, integrating the memory in the package, could work? I don't know.
#10
Flyordie
Denver:
"One of the things that plagues high bandwidth memory right now is cost,” said Marc Greenberg, group director for product marketing in the IP group at Cadence. “3D stacking is expensive. There’s a logic die that sits at the base of the stack of dies, which is an additional piece of silicon you have to pay for. And then there’s a silicon interposer, which goes under everything under the CPU or GPU, as well as the HBM memories. That has a cost. Then, you need a larger package, and so on. There are a number of system costs that take HBM as it exists today out of the consumer domain and put it more firmly in the server room or the data center. By contrast, graphics memories like GDDR6, while they don’t offer as much performance as the HBM, do so at significantly less cost. The performance per unit cost on GDDR6 is actually much better than HBM, but the maximum bandwidth of a GDDR6 device doesn’t match the maximum bandwidth of an HBM.”

source: semiengineering.com/hbms-future-necessary-but-expensive/

By the nature of how HBM is produced it is also expected to have a higher defect rate.
My thinking on HBM2 is this: it should still be present on at least the high-end GPUs. As for why? Those already have high asking prices.

I'm still running my Vega 64 XTX. It's got Samsung HBM2 rated at 512 GB/s. With how mature HBM2 is, they could start using it again, like HBM2e, and make cards smaller and more power efficient.
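
For context on the 512 GB/s figure, here is a minimal sketch of how HBM bandwidth is derived, assuming Vega 64's two-stack, 2048-bit interface. A 2.0 Gb/s pin rate is what yields 512 GB/s; the reference card's stock rate is a bit lower (around 1.89 Gb/s), so treat the numbers as illustrative.

```python
# HBM bandwidth = total interface width (bits) * per-pin data rate (Gb/s) / 8.
def hbm_bandwidth_gbs(stacks: int, bits_per_stack: int, pin_rate_gbps: float) -> float:
    return stacks * bits_per_stack * pin_rate_gbps / 8

# Vega 64: two HBM2 stacks, 1024 bits each. 2.0 Gb/s per pin yields the
# 512 GB/s figure mentioned above; the stock pin rate is slightly lower.
print(hbm_bandwidth_gbs(stacks=2, bits_per_stack=1024, pin_rate_gbps=2.0))   # 512.0
print(hbm_bandwidth_gbs(stacks=2, bits_per_stack=1024, pin_rate_gbps=1.89))  # ~484
```
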
#11
maxfly
What exactly is it being used for that's creating such a huge, supposed increase in demand?
#12
mahirzukic2
maxfly: What exactly is it being used for that's creating such a huge, supposed increase in demand?
AI, what else?