News Posts matching #SK Hynix

NAND Flash Market Landscape to Change, Reports TrendForce

With suppliers effectively cutting production, memory prices are rebounding and the semiconductor memory market is finally showing signs of recovery. From the perspective of market dynamics and demand changes, NAND Flash, as one of the two major memory products, is experiencing a new round of changes. Since 3Q23, NAND Flash chip prices have risen for several consecutive months. TrendForce believes that, assuming a conservative demand outlook for 2024, chip price trends will depend on suppliers' production capacity utilization.

There have been frequent developments across the NAND flash memory supply chain, with some manufacturers signaling a willingness to raise prices or increase production capacity utilization. Wallace C. Kou, General Manager of NAND Flash supplier SIMO, stated that NAND Flash contract prices for the second quarter have already been settled and will rise by 20%; some suppliers began turning a profit in the first quarter, and most suppliers should be profitable from the second quarter onward.

Samsung Reportedly Acquiring New Equipment Due to Disappointing HBM Yields

Industry insiders reckon that Samsung Electronics is transitioning to molded underfill (MR-MUF) production techniques—rival memory manufacturer, SK Hynix, champions this chip making technology. A Reuters exclusive has cited claims made by five industry moles—they believe that Samsung is reacting to underwhelming HBM production yields. The publication proposes that: "one of the reasons Samsung has fallen behind (competing producers) is its decision to stick with chip making technology called non-conductive film (NCF) that causes some production issues, while Hynix switched to the mass reflow molded underfill (MR-MUF) method to address NCF's weakness." The report suggests that Samsung is in the process of ordering new MUF-related equipment.

One anonymous source stated: "Samsung had to do something to ramp up its HBM (production) yields... adopting MUF technique is a little bit of swallow-your-pride type thing for (them), because it ended up following the technique first used by SK Hynix." Reuters managed to extract a response from the giant South Korean multinational—a company spokesperson stated: "we are carrying out our HBM3E product business as planned." They indicated that NCF technology remains in place as an "optimal solution." Post-publication, another official response was issued: "rumors that Samsung will apply MR-MUF to its HBM production are not true." Insiders propose a long testing phase—Samsung is rumored to be sourcing MUF materials, but mass production is not expected to start this year. Three insiders allege that Samsung is planning to "use both NCF and MUF techniques" for a new-generation HBM chip.

NVIDIA's Selection of Micron HBM3E Supposedly Surprises Competing Memory Makers

SK Hynix believes that it leads the industry with the development and production of High Bandwidth Memory (HBM) solutions, but rival memory manufacturers are working hard on equivalent fifth generation packages. NVIDIA was expected to select SK Hynix as the main supplier of HBM3E parts for utilization on H200 "Hopper" AI GPUs, but a surprise announcement was issued by Micron's press team last month. The American firm revealed that HBM3E volume production had commenced: "(our) 24 GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, which will begin shipping in the second calendar quarter of 2024. This milestone positions Micron at the forefront of the industry, empowering artificial intelligence (AI) solutions with HBM3E's industry-leading performance and energy efficiency."

According to a Korea JoongAng Daily report, this boast has reportedly "shocked" the likes of SK Hynix and Samsung Electronics. They believe that Micron's: "announcement was a revolt from an underdog, as the US company barely held 10 percent of the global market last year." The article also points out some behind-the-scenes legal wrangling: "the cutthroat competition became more evident when the Seoul court sided with SK Hynix on Thursday (March 7) by granting a non-compete injunction to prevent its former researcher, who specialized in HBM, from working at Micron. He would be fined 10 million won for each day in violation." SK Hynix is likely pinning its next-gen AI GPU hopes on a 12-layer DRAM stacked HBM3E product—industry insiders posit that evaluation samples were submitted to NVIDIA last month. The outlook for these units is said to be very positive—mass production could start as early as this month.

JEDEC Agrees to Relax HBM4 Package Thickness

JEDEC is currently presiding over standards for 6th generation high bandwidth memory (AKA HBM4)—the 12 and 16-layer DRAM designs are expected to reach mass production status in 2026. According to a ZDNET South Korea report, involved manufacturers are deliberating over HBM4 package thicknesses—allegedly, decision makers have settled on 775 micrometers (μm). This is thicker than the previous generation's measurement of 720 micrometers (μm). Samsung Electronics, SK Hynix and Micron are exploring "hybrid bonding," a new packaging technology—where onboard chips and wafers are linked directly to each other. Hybrid bonding is expected to be quite expensive to implement, so memory makers are carefully considering whether HBM4 warrants its usage.

ZDNET believes that JEDEC's agreement—settling on 775 micrometers (μm) for 12-layer and 16-layer stacked HBM4—could have: "a significant impact on the future packaging investment trends of major memory manufacturers. These companies have been preparing a new packaging technology, hybrid bonding, keeping in mind the possibility that the package thickness of HBM4 will be limited to 720 micrometers. However, if the package thickness is adjusted to 775 micrometers, 16-layer DRAM stacking HBM4 can be sufficiently implemented using existing bonding technology." A revised schedule could delay the rollout of hybrid bonding—perhaps pushed back to coincide with a launch of seventh generation HBM. The report posits that Samsung Electronics, SK Hynix and Micron memory engineers are about to focus on the upgrading of existing bonding technologies.

HBM3 Initially Exclusively Supplied by SK Hynix, Samsung Rallies Fast After AMD Validation

TrendForce highlights the current landscape of the HBM market, which as of early 2024, is primarily focused on HBM3. NVIDIA's upcoming B100 or H200 models will incorporate advanced HBM3e, signaling the next step in memory technology. The challenge, however, is the supply bottleneck caused by both CoWoS packaging constraints and the inherently long production cycle of HBM—extending the timeline from wafer initiation to the final product beyond two quarters.

The current HBM3 supply for NVIDIA's H100 solution is primarily met by SK hynix, leading to a supply shortfall in meeting burgeoning AI market demands. Samsung's entry into AMD's supply chain with its 1Znm HBM3 products in late 2023, though initially minor, signifies its breakthrough in this segment.

SK Hynix To Invest $1 Billion into Advanced Chip Packaging Facilities

Lee Kang-Wook, Vice President of Research and Development at SK Hynix, has discussed the increased importance of advanced chip packaging with Bloomberg News. In an interview with the media company's business section, Lee referred to a tradition of prioritizing the design and fabrication of chips: "the first 50 years of the semiconductor industry has been about the front-end." He believes that the latter half of production processes will take precedence in the future: "...but the next 50 years is going to be all about the back-end." He outlined a "more than $1 billion" investment into South Korean facilities—his department is hoping to "improve the final steps" of chip manufacturing.

SK Hynix's Head of Packaging Development pioneered a novel method of packaging the third generation of high bandwidth technology (HBM2E)—that innovation secured NVIDIA as a high-profile and long term customer. Demand for Team Green's AI GPUs has boosted the significance of HBM technologies—Micron and Samsung are attempting to play catch up with new designs. South Korea's leading memory supplier is hoping to stay ahead in the next-gen HBM contest—supposedly 12-layer fifth generation samples have been submitted to NVIDIA for approval. SK Hynix's Vice President recently revealed that HBM production volumes for 2024 have sold out—currently company leadership is considering the next steps for market dominance in 2025. The majority of the firm's newly announced $1 billion budget will be spent on the advancement of MR-MUF and TSV technologies, according to their R&D chief.

NVIDIA Reportedly Sampling SK Hynix 12-layer HBM3E

South Korean tech insiders believe that SK Hynix has sent "12-layer DRAM stacked HBM3E (5th generation HBM)" prototype samples to NVIDIA—according to a ZDNET.co.kr article, initial examples were shipped out last month. Reports from mid-2023 suggested that Team Green had sampled 8-layer HBM3E units around summertime—with SK Hynix receiving approval notices soon after. Another South Korean media outlet, DealSite, reckons that NVIDIA's memory qualification process has exposed HBM yield problems across a number of manufacturers. SK Hynix, Samsung and Micron are competing fiercely on the HBM3E front—with hopes of getting their respective products attached to NVIDIA's H200 AI GPU. DigiTimes Asia proposed that SK Hynix is ready to "commence mass production of fifth-generation HBM3E" at some point this month.

SK Hynix is believed to be leading the pack—insiders believe that yield rates are good enough to pass early NVIDIA certification, and advanced 12-layer samples are expected to be approved in the near future. ZDNET reckons that SK Hynix's forward momentum has placed it in an advantageous position: "(They) supplied 8-layer HBM3E samples in the second half of last year and passed recent testing. Although the official schedule has not been revealed, mass production is expected to begin as early as this month. Furthermore, SK Hynix supplied 12-layer HBM3E samples to NVIDIA last month. This sample is an extremely early version and is mainly used to establish standards and characteristics of new products. SK Hynix calls it UTV (Universal Test Vehicle)... Since Hynix has already completed the performance verification of the 8-layer HBM3E, it is expected that the 12-layer HBM3E test will not take much time." SK Hynix's Vice President recently revealed that his company's 2024 HBM production volumes were already sold out, and leadership is already preparing innovations for 2025 and beyond.

Enterprise SSD Industry Hits US$23.1 Billion in Revenue in 4Q23, Growth Trend to Continue into Q1 This Year

The third quarter of 2023 witnessed suppliers dramatically cutting production, which underpinned enterprise SSD prices. The fourth quarter saw a resurgence in contract prices, driven by robust buying activity and heightened demand from server brands and buoyed by optimistic capital expenditure forecasts for 2024. This, combined with increased demand from various end products entering their peak sales period and ongoing reductions in OEM NAND Flash inventories, resulted in some capacity shortages. Consequently, fourth-quarter enterprise SSD prices surged by over 15%. TrendForce highlights that this surge in demand and prices led to a 47.6% QoQ increase in enterprise SSD industry revenues in 4Q23, reaching approximately $23.1 billion.

The stage is set for continued fervor as we settle into the new year and momentum from server brand orders continues to heat up—particularly from Chinese clients. On the supply side, falling inventory levels and efforts to exit loss-making positions have prompted enterprise SSD prices to climb, with contract prices expected to increase by over 25%. This is anticipated to fuel a 20% revenue growth in Q1.

NAND Flash Industry Revenue Grows 24.5% in Q4 2023, Expected to Increase Another 20% in Q1

TrendForce reports a substantial 24.5% QoQ increase in NAND Flash industry revenue, hitting US$11.49 billion in 4Q23. This surge is attributed to a stabilization in end-demand spurred by year-end promotions, along with an expansion in component market orders driven by price chasing, leading to robust bit shipments compared to the same period last year. Additionally, the corporate sector's continued positive outlook for 2024 demand—compared to 2023—and strategic stockpiling have further fueled this growth.

Looking ahead to 1Q24, despite it traditionally being an off-season, the NAND Flash industry is expected to see a continued increase in revenue by another 20%. This anticipation is underpinned by significant improvements in supply chain inventory levels and ongoing price rises, with clients ramping up their orders to sidestep potential supply shortages and escalating costs. The ongoing expansion of order sizes is expected to drive NAND Flash contract prices up by an average of 25%.

JEDEC Publishes GDDR7 Graphics Memory Standard

JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, is pleased to announce the publication of JESD239 Graphics Double Data Rate (GDDR7) SGRAM. This groundbreaking new memory standard is available for free download from the JEDEC website. JESD239 GDDR7 offers double the bandwidth over GDDR6, reaching up to 192 GB/s per device, and is poised to meet the escalating demand for more memory bandwidth in graphics, gaming, compute, networking and AI applications.

JESD239 GDDR7 is the first JEDEC standard DRAM to use the Pulse Amplitude Modulation (PAM) interface for high frequency operations. Its PAM3 interface improves the signal to noise ratio (SNR) for high frequency operation while enhancing energy efficiency. By using 3 levels (+1, 0, -1) to transmit 3 bits over 2-cycles versus the traditional NRZ (non-return-to-zero) interface transmitting 2 bits over 2-cycles, PAM3 offers higher data transmission rate per cycle resulting in improved performance.
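As a rough illustration of the arithmetic behind that comparison, the sketch below works out the bits-per-cycle advantage of PAM3 over NRZ and the generic per-device bandwidth formula; the x32 device width and 48 Gb/s per-pin rate are illustrative assumptions, not figures taken from the JEDEC announcement.

```python
# Signaling math behind the PAM3 vs. NRZ comparison above.
# The x32 device width and 48 Gb/s per-pin rate are illustrative
# assumptions, not figures taken from the JEDEC announcement.

nrz_bits_per_cycle = 2 / 2    # NRZ: 2 bits over 2 cycles -> 1.0 bit per cycle
pam3_bits_per_cycle = 3 / 2   # PAM3: 3 bits over 2 cycles -> 1.5 bits per cycle
print(f"PAM3 carries {pam3_bits_per_cycle / nrz_bits_per_cycle:.1f}x the data per cycle of NRZ")

# Per-device bandwidth = per-pin data rate x device width / 8 bits per byte
pins = 32          # assumed x32 GDDR7 device
per_pin_gbps = 48  # assumed per-pin data rate in Gb/s
print(f"{per_pin_gbps} Gb/s x {pins} pins = {per_pin_gbps * pins / 8:.0f} GB/s per device")  # 192 GB/s
```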

DRAM Industry Sees Nearly 30% Revenue Growth in 4Q23 Due to Rising Prices and Volume

TrendForce reports a 29.6% QoQ increase in DRAM industry revenue for 4Q23, reaching US$17.46 billion, propelled by revitalized stockpiling efforts and strategic production control by leading manufacturers. Looking ahead to 1Q24, the intent to further enhance profitability is evident, with a projected near 20% increase in DRAM contract prices—albeit with a slight decrease in shipment volumes due to the traditional off-season.

Samsung led the pack with the highest revenue growth among the top manufacturers in Q4 as it jumped 50% QoQ to hit $7.95 billion, largely due to a surge in 1alpha nm DDR5 shipments, boosting server DRAM shipments by over 60%. SK hynix saw a modest 1-3% rise in shipment volumes but benefited from the pricing advantage of HBM and DDR5, especially from high-density server DRAM modules, leading to a 17-19% increase in ASP and a 20.2% rise in revenue to $5.56 billion. Micron witnessed growth in both volume and price, with a 4-6% increase in each, resulting in a more moderate revenue growth of 8.9%, totaling $3.35 billion for the quarter due to its comparatively lower share of DDR5 and HBM.

Gauss Labs and SK hynix Publish the Latest Results on AI-based Semiconductor Metrology Technology

SK hynix Inc. and Gauss Labs announced today that they participated in the SPIE AL 2024, an international conference held in San Jose, California, and presented two papers based on the latest technology for AI-based metrology. SK hynix has been collaborating closely with Gauss Labs in various areas to increase semiconductor yield and productivity, and the results of this collaboration are published in these two papers.

In the paper "Model Aggregation for Virtual Metrology for High-Volume Manufacturing," Gauss Labs introduces "aggregated AOM", an algorithm that increases the prediction accuracy of its AI-based virtual metrology solution, Panoptes VM (Virtual Metrology). Since its adoption in December 2022, SK hynix has used Panoptes VM to conduct virtual measurements on more than 50 million wafers, which translates to more than one wafer per second. The company was able to improve process variability by 29% thanks to this technology.
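Conceptually, virtual metrology predicts a physical measurement (such as a film thickness) from equipment sensor data instead of physically measuring every wafer, and model aggregation blends several such predictors into one estimate. The sketch below only illustrates that general idea with generic tools; the features, models, and plain averaging step are hypothetical and do not represent Gauss Labs' actual aggregated AOM algorithm or Panoptes VM.

```python
# Generic sketch of virtual metrology with model aggregation.
# Features, models, and the simple averaging step are hypothetical
# illustrations, not Gauss Labs' aggregated AOM algorithm.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                             # per-wafer equipment sensor summaries (toy data)
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=1000)   # "true" metrology value, e.g. film thickness

X_train, y_train, X_new = X[:800], y[:800], X[800:]

# Train several base predictors (in practice these might be per chamber, recipe, or time window)
models = [Ridge(alpha=1.0), GradientBoostingRegressor(random_state=0)]
for model in models:
    model.fit(X_train, y_train)

# Aggregate by averaging predictions; a production system would weight and
# update the models far more carefully (drift handling, equipment context, etc.)
virtual_measurements = np.mean([model.predict(X_new) for model in models], axis=0)
print(virtual_measurements[:5])  # predicted values for wafers that were never physically measured
```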

SK Hynix VP Reveals HBM Production Volumes for 2024 are Sold Out

SK Hynix Vice President Kitae Kim presides over the company's HBM Sales & Marketing (S&M) department—an official leadership blog profile reveals that the executive played a key role in making the South Korean supplier's high bandwidth memory (HBM) product line "a superstar of the semiconductor memory industry in 2023." Growing demand for powerful AI processors has placed SK Hynix in a more comfortable position, following recessionary spells—including a major sales downturn in 2022. NVIDIA is the market leader in AI processing chips, and many of its flagship enterprise designs are fitted with cutting-edge SK Hynix memory modules. Kim noted that his firm has many notable international clients: "HBM is a revolutionary product which has challenged the notion that semiconductor memory is only one part of an overall system...in particular, SK Hynix's HBM has outstanding competitiveness. Our advanced technology is highly sought after by global tech companies."

The VP outlined how artificial intelligence industries are fuelling innovations: "With the diversification and advancement of generative AI services, demand for HBM, an AI memory solution, has also exploded. HBM, with its high-performance and high-capacity characteristics, is a monumental product that shakes the conventional wisdom that memory semiconductors are only a part of the overall system. In particular, SK Hynix HBM's competitiveness is outstanding." Business is booming, so much so that nothing can be added to this year's HBM order books: "Proactively securing customer purchase volumes and negotiating more favorable conditions for our high-quality products are the basics of semiconductor sales operations. With excellent products in hand, it's a matter of speed. Our planned production volume of HBM this year has already sold out. Although 2024 has just begun, we've already started preparing for 2025 to stay ahead of the market."

Kioxia Reportedly Presents Japanese Chipmaking Deal to SK Hynix

Japan's Jiji news agency has cottoned onto a major computer memory industry rumble—a Friday Reuters report suggests that Kioxia has offered an olive branch to SK Hynix, perhaps in a renewed push to get its proposed (and once rejected) merger with Western Digital over the finishing line. The South Korean memory manufacturing juggernaut took great issue with the suggested formation of a mighty Japanese-American 3D NAND memory chip conglomerate—SK Hynix's opposition reportedly placed great pressure on Western Digital (WD), and discussions with Kioxia ended last October.

Kioxia is seemingly eager to resume talks with WD, but requires a thumbs up from SK Hynix—according to Jiji's insider source(s), the Tokyo-headquartered manufacturer is prepared to offer its South Korean rival a nice non-volatile memory production deal. Kioxia's best Japanese 3D NAND fabrication facilities could play host to SK Hynix designs, although it is too early to tell whether this bid has been accepted. The Yokkaichi and Kitakami plants are set to receive a 150 billion yen Government subsidy—Kioxia and WD's joint venture is expected to move into cutting-edge semiconductor production. The Japanese government is hoping to secure its native operations in times of industry flux.

ASML High-NA EUV Twinscan EXE Machines Cost $380 Million, 10-20 Units Already Booked

ASML has revealed that its cutting-edge High-NA extreme ultraviolet (EUV) chipmaking tools, called High-NA Twinscan EXE, will cost around $380 million each—over twice as much as its existing Low-NA EUV lithography systems that cost about $183 million. The company has taken 10-20 initial orders from the likes of Intel and SK Hynix and plans to manufacture 20 High-NA systems annually by 2028 to meet demand. The High-NA EUV technology represents a major breakthrough, enabling an improved 8 nm imprint resolution compared to 13 nm with current Low-NA EUV tools. This allows chipmakers to produce transistors that are nearly 1.7 times smaller, translating to a threefold increase in transistor density on chips. Attaining this level of precision is critical for manufacturing sub-3 nm chips, an industry goal for 2025-2026. It also eliminates the need for complex double patterning techniques required presently.
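A rough back-of-the-envelope check of those resolution and density figures, treating density as scaling with the square of the linear feature shrink (real gains also depend on design rules, which is why the quoted numbers run slightly higher):

```python
# Back-of-the-envelope check of the High-NA vs. Low-NA figures quoted above.
low_na_resolution_nm = 13
high_na_resolution_nm = 8

linear_shrink = low_na_resolution_nm / high_na_resolution_nm  # ~1.6x
density_gain = linear_shrink ** 2                             # ~2.6x (area scales with the square)

print(f"Linear shrink: ~{linear_shrink:.2f}x, density gain: ~{density_gain:.1f}x")
# The quoted ~1.7x / ~3x figures are a touch higher because they also
# account for design optimizations, not just raw imprint resolution.
```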

However, superior performance comes at a cost - literally and figuratively. The hefty $380 million price tag for each High-NA system introduces financial challenges for chipmakers. Additionally, the larger High-NA tools require completely reconfiguring chip fabrication facilities. Their halved imaging field also necessitates rethinking chip designs. As a result, adoption timelines differ across companies - Intel intends to deploy High-NA EUV at an advanced 1.8 nm (18A) node, while TSMC is taking a more conservative approach, potentially implementing it only in 2030 and not rushing the use of these lithography machines, as the company's nodes are already developing well and on time. Interestingly, the installation process of ASML's 150,000-kilogram High-NA Twinscan EXE system required 250 crates, 250 engineers, and six months to complete. So, production is every bit as complex as the installation and operation of this delicate machinery.

TSMC & SK Hynix Reportedly Form Strategic AI Alliance, Jointly Developing HBM4

Last week SK Hynix revealed ambitious plans for its next wave of High Bandwidth Memory (HBM) products—their SEMICON Korea 2024 presentation included an announcement about cutting-edge HBM3E entering mass production within the first quarter of this year. True next-gen HBM development has already kicked off—TPU's previous report outlines an HBM4 sampling phase in 2025, followed by full production in 2026. South Korea's Pulse News believes that TSMC has been roped into a joint venture (with SK Hynix). An alleged "One Team" strategic alliance has been formed according to reports emerging from Asia—this joint effort could focus on the development of HBM4 solutions for AI fields.

Reports from last November pointed to a possible SK Hynix and NVIDIA HBM4 partnership, with TSMC involved as the designated fabricator. We are not sure if the emerging "One Team" progressive partnership will have any impact on previously agreed upon deals, but South Korean news outlets reckon that the TSMC + SK Hynix alliance will attempt to outdo Samsung's development of "new-generation AI semiconductor packaging." Team Green's upcoming roster of—"Hopper" H200 and "Blackwell" B100—AI GPUs is linked to a massive pre-paid shipment of SK Hynix HBM3E parts. HBM4 products could be fitted on a second iteration of NVIDIA's Blackwell GPU, and the mysterious "Vera Rubin" family. Notorious silicon industry tipster, kopite7kimi, believes that "R100 and GR200" GPUs are next up in Team Green's AI-cruncher queue.

SK hynix Unveils Roadmap for Use of Recycled Materials

SK hynix announced today that it has established a roadmap to actively utilize recycled and renewable materials in production, marking the first time a semiconductor company has laid out such a mid- to long-term plan. "In order to achieve Net Zero, or Carbon Neutrality, establishing a circular economy system centered on resource recycling has become an important task for countries and companies around the world," SK hynix said. "In line with this trend, we have decided to preemptively establish and faithfully implement the goal of increasing the use of recycled materials in stages."

Through this roadmap, SK hynix aims to raise the proportion of recycled materials used in the products currently manufactured by the company to 25% by 2025 and more than 30% (based on weight) by 2030. As part of the plan, SK hynix will start with essential metals for semiconductor production, such as copper, tin, and gold, and replace them with recycled materials. Industry experts point out that metal materials are the most effective when it comes to resource circulation as they account for a large proportion of the weight of finished memory products and are difficult to replace with other materials.

SK Hynix Targets HBM3E Launch This Year, HBM4 by 2026

SK Hynix has unveiled ambitious High Bandwidth Memory (HBM) roadmaps at SEMICON Korea 2024. Vice President Kim Chun-hwan announced plans to mass produce the cutting-edge HBM3E within the first half of 2024, touting 8-layer stack samples already supplied to clients. This iteration makes major strides towards fulfilling surging data bandwidth demands, offering 1.2 TB/s per stack and 7.2 TB/s in a 6-stack configuration. VP Kim Chun-hwan cites the rapid emergence of generative AI, forecast to grow at a 35% CAGR, as a key driver. He warns that "fierce survival competition" lies ahead across the semiconductor industry amidst rising customer expectations. With limits approaching on conventional process node shrinks, attention is shifting to next-generation memory architectures and materials to unleash performance.
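For reference, a stack's bandwidth is its interface width multiplied by the per-pin data rate; the 9.2 Gb/s per-pin figure below is an assumption used to reproduce the quoted number, not something stated in the presentation.

```python
# Sanity check of the quoted HBM3E bandwidth figures.
# The 9.2 Gb/s per-pin rate is an assumption, not stated above.
interface_bits = 1024   # HBM3E stack interface width
per_pin_gbps = 9.2      # assumed per-pin data rate

per_stack_tbs = interface_bits * per_pin_gbps / 8 / 1000  # ~1.18 TB/s, in line with the quoted 1.2 TB/s
six_stack_tbs = 1.2 * 6                                   # quoted per-stack figure x 6 stacks = 7.2 TB/s
print(f"~{per_stack_tbs:.2f} TB/s per stack, {six_stack_tbs:.1f} TB/s in a six-stack configuration")
```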

SK Hynix has already initiated HBM4 development for sampling in 2025 and mass production the following year. According to Micron, HBM4 will leverage a wider 2048-bit interface compared to previous HBM generations to increase per-stack theoretical peak memory bandwidth to over 1.5 TB/s. To achieve these high bandwidths while maintaining reasonable power consumption, HBM4 is targeting a data transfer rate of around 6 GT/s. The wider interface and 6 GT/s speeds allow HBM4 to push bandwidth boundaries significantly compared to prior HBM versions, meeting the needs of high-performance computing and AI workloads. But power efficiency is carefully balanced by avoiding impractically high transfer rates. Additionally, Samsung is aligned on a similar 2025/2026 timeline. Beyond pushing bandwidth boundaries, custom HBM solutions will become increasingly crucial. Samsung executive Jaejune Kim reveals that over half its HBM volume already comprises specialized products. Further tailoring HBM4 to individual client needs through logic integration presents an opportunity to cement leadership. As AI workloads evolve at breakneck speeds, memory innovation must keep pace. With HBM3E prepping for launch and HBM4 in the pipeline, SK Hynix and Samsung are gearing up for the challenges ahead.
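The "over 1.5 TB/s" figure follows directly from the interface width and transfer rate given above; a quick check using only those stated numbers:

```python
# HBM4 per-stack peak bandwidth from the figures quoted above:
# interface width (bits) x data rate (GT/s) / 8 bits per byte.
interface_bits = 2048   # HBM4 interface width per stack
data_rate_gts = 6       # targeted data transfer rate in GT/s

peak_gbs = interface_bits * data_rate_gts / 8  # 1536 GB/s
print(f"{interface_bits}-bit x {data_rate_gts} GT/s = {peak_gbs:.0f} GB/s (~{peak_gbs / 1000:.1f} TB/s) per stack")
```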

SK Hynix to Show Off its GDDR7, 48GB 16-layer HBM3E Stacks, and LPDDR5T-10533 Memory at IEEE-SSCC

Samsung isn't the only Korean memory giant showing off its latest tech at the upcoming IEEE Solid State Circuit Conference (SSCC) in February, 2024; it will be joined by SK Hynix, which will demo competing tech across both its volatile and non-volatile memory lines. To begin with, SK Hynix will be the second company to show off a GDDR7 memory chip, after Samsung. The SK Hynix chip is capable of 35.4 Gbps speeds, which is lower than the 37 Gbps Samsung is showing off, but at the same 16 Gbit density. This density allows the deployment of 16 GB of video memory across a 256-bit memory bus. Not all next-generation GPUs will max out at 37 Gbps; some may run at lower memory speeds, and those will find suitable options in the SK Hynix product stack. Much like Samsung, SK Hynix is implementing PAM3 I/O signaling, and a proprietary low-power architecture (though the company wouldn't elaborate on whether it's similar to the four low-speed clock states of the Samsung chips).
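The capacity and bandwidth figures are easy to verify; the x32 device width below is a typical GDDR configuration and is assumed here rather than stated in the announcement.

```python
# How 16 Gbit GDDR7 chips yield 16 GB on a 256-bit bus, and the resulting
# bandwidth at SK Hynix's quoted 35.4 Gb/s per pin. The x32 device width
# is a typical GDDR configuration, assumed rather than stated above.
bus_width_bits = 256
device_width_bits = 32
device_density_gbit = 16
per_pin_gbps = 35.4

devices = bus_width_bits // device_width_bits       # 8 chips
capacity_gb = devices * device_density_gbit / 8     # 16 GB
bandwidth_gbs = bus_width_bits * per_pin_gbps / 8   # ~1133 GB/s
print(f"{devices} chips, {capacity_gb:.0f} GB, ~{bandwidth_gbs:.0f} GB/s over a {bus_width_bits}-bit bus")
```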

GDDR7 is bound to dominate the next generation of graphics cards across the gaming and pro-vis segments; however the AI HPC processor market will continue to bank heavily on HBM3E. SK Hynix has innovated here, and will show off a new 16-high 48 GB (384 Gbit) HBM3E stack design that's capable of 1280 GB/s over a single stack. A processor with even four such stacks will have 192 GB of memory at 5.12 TB/s of bandwidth. The stack implements a new all-around power TSV (through silicon via) design, and a 6-phase RDQS (read data queue strobe) scheme, for TSV area optimization. Lastly, the SK Hynix sessions will also include the first demo of its ambitious LPDDR5T (LPDDR5 Turbo) memory standard aimed at smartphones, tablets, and thin-and-light notebooks. This chip achieves a data-rate of 10.5 Gb/s per pin, and a DRAM voltage of 1.05 V. Such high data speeds are possible thanks to a proprietary parasitic capacitance reduction technology, and a voltage offset calibrated receiver tech.
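Scaling the quoted 16-high stack figures to the four-stack processor example mentioned above:

```python
# Scaling the quoted 16-high HBM3E stack figures to a four-stack processor.
stack_capacity_gb = 48       # one 16-high stack (384 Gbit)
stack_bandwidth_gbs = 1280   # per-stack bandwidth quoted above
stacks = 4

print(f"{stacks} stacks: {stacks * stack_capacity_gb} GB, "
      f"{stacks * stack_bandwidth_gbs / 1000:.2f} TB/s")  # 192 GB, 5.12 TB/s
```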

Intel Lunar Lake-MX to Embed Samsung LPDDR5X Memory on SoC Package

According to sources close to Seoul Economy, as reported by DigiTimes, Intel has chosen Samsung as a supplier for its next-generation Lunar Lake processors, set to debut later this year. The report notes that Samsung will provide LPDDR5X memory devices for integration into Intel's processors. This collaboration could be a substantial win for Samsung, given Intel's projection to distribute millions of Lunar Lake CPUs in the coming years. However, it's important to note that this information is based on a leak and has not been officially confirmed. Designed for ultra-portable laptops, the Lunar Lake-MX platform is expected to feature 16 GB or 32 GB of LPDDR5X-8533 memory directly on the processor package. This on-package memory approach aims to minimize the platform's physical size while enhancing performance over traditional memory configurations. With Lunar Lake's exclusive support for on-package memory, Samsung's LPDDR5X-8533 products could see a significant boost in sales.

While Samsung is currently in the spotlight, it remains unclear if it will be the sole LPDDR5X memory provider for Lunar Lake. Intel's strategy involves selling processors with pre-validated memory, leaving the door open for potential validation of similar memory products from competitors like Micron and SK Hynix. Thanks to a new microarchitecture, Intel has promoted its Lunar Lake processors as a revolutionary leap in performance-per-watt efficiency. The processors are expected to utilize a multi-chiplet design with Foveros technology, combining CPU and GPU tiles, a system-on-chip tile, and dual memory packages. The CPU component is anticipated to include up to eight cores, a mix of four high-performance Lion Cove and four energy-efficient Skymont cores, alongside advanced graphics, cache, and AI acceleration capabilities. Apple's use of on-package memory in its M-series chips has set a precedent in the industry, and with Intel's Lunar Lake MX, this trend could extend across the thin-and-light laptop market. However, systems requiring more flexibility in terms of configuration, repair, and upgrades will likely continue to employ standard memory solutions like SODIMMs and/or the new CAMM2 modules that offer a balance of high performance and energy efficiency.

SK hynix Reports Financial Results for 2023, 4Q23

SK hynix Inc. announced today that it recorded an operating profit of 346 billion won in the fourth quarter of last year amid a recovery of the memory chip market, marking the first quarter of profit following four straight quarters of losses. The company posted revenues of 11.31 trillion won, operating profit of 346 billion won (operating profit margin at 3%), and net loss of 1.38 trillion won (net profit margin at negative 12%) for the three months ended December 31, 2023. (Based on K-IFRS)

SK hynix said that the overall memory market conditions improved in the last quarter of 2023 with demand for AI server and mobile applications increasing and average selling price (ASP) rising. "We recorded the first quarterly profit in a year following efforts to focus on profitability," it said. The financial results of the last quarter helped narrow the operating loss for the entire year to 7.73 trillion won (operating profit margin at negative 24%) and net loss to 9.14 trillion won (with net profit margin at negative 28%). The revenues were 32.77 trillion won.

HBM Industry Revenue Could Double by 2025 - Growth Driven by Next-gen AI GPUs Cited

Samsung, SK hynix, and Micron are considered to be the top manufacturing sources of High Bandwidth Memory (HBM)—the HBM3 and HBM3E standards are becoming increasingly in demand, due to a widespread deployment of GPUs and accelerators by generative AI companies. Taiwan's Commercial Times proposes that there is an ongoing shortage of HBM components—but this presents a growth opportunity for smaller manufacturers in the region. Naturally, the big name producers are expected to dive in head first with the development of next generation models. The aforementioned financial news article cites research conducted by the Gartner group—they predict that the HBM market will hit an all-time high of $4.976 billion (USD) by 2025.

This estimate is almost double that of projected revenues (just over $2 billion) generated by the HBM market in 2023—the explosive growth of generative AI applications has "boosted" demand for the most performant memory standards. The Commercial Times report states that SK Hynix is the current HBM3E leader, with Micron and Samsung trailing behind—industry experts believe that stragglers will need to "expand HBM production capacity" in order to stay competitive. SK Hynix has shacked up with NVIDIA—the GH200 Grace Hopper platform was unveiled last summer; outfitted with the South Korean firm's HBM3e parts. In a similar timeframe, Samsung was named as AMD's preferred supplier of HBM3 packages—as featured within the recently launched Instinct MI300X accelerator. NVIDIA's HBM3E deal with SK Hynix is believed to extend to the internal makeup of Blackwell GB100 data-center GPUs. The HBM4 memory standard is expected to be the next major battleground for the industry's hardest hitters.

SK Hynix Throws a Jab: CAMM is Coming to Desktop PCs

In a surprising turn of events, SK Hynix has hinted at the possibility of the Compression Attached Memory Module (CAMM) standard, initially designed for laptops, being introduced to desktop PCs. This revelation came from a comment made by an SK Hynix representative to Korean tech media outlet ITSubIssub at CES 2024 in Las Vegas. According to the SK Hynix representative, a first implementation is underway, but there are no specific details yet. CAMM, an innovative memory standard developed by Dell in 2022, was certified to replace SO-DIMM as the official standard for laptop memory. However, the transition to desktop PCs could significantly disrupt the desktop memory market. The CAMM modules, unlike the vertical DRAM sticks currently in use, are horizontal and are screwed into a socket. This design change would necessitate a complete overhaul of the desktop motherboard layout.

The thin, flat design of the CAMM modules could also limit the number that can be installed on an ATX board. However, the desktop version of the standard CAMM2 was announced by JEDEC just a month ago. It is designed for DDR5 memory, but it is expected to become mainstream with the introduction of DDR6 around 2025. While CAMM allows for higher speeds and densities for mobile memory, its advantages for desktops over traditional memory sticks are yet to be fully understood. Although low-power CAMM modules could offer energy savings, this is typically more relevant for mobile devices than desktops. As we move towards DDR6 and DDR7, more information about CAMM for desktops will be needed to understand its potential benefits. JEDEC's official words on the new standard indicate that "DDR5 CAMM2s are intended for performance notebooks and mainstream desktops, while LPDDR5/5X CAMM2s target a broader range of notebooks and certain server market segments." So, we can expect to see CAMM2 in both desktops and some server applications.

V-COLOR Announces DDR5-7200 192GB RDIMM Kit for AMD TRX50 Threadripper Platform

V-COLOR announces the launch of its DDR5 overclocking R-DIMM memory for TRX50 motherboards powered by AMD Ryzen Threadripper 7000 series processors. The 192 GB (4x 48 GB) kits are ready to hit the market at speeds ranging from 6400 MT/s up to 7200 MT/s for end users who require maximum capacity and speed. This package is geared toward content creators, intensive 3D modelers, AI programmers, trading machines, and HFT (high-frequency trading) firms.

With AMD EXPO support, v-color's DDR5 OC R-DIMM memory is ready to reach its full potential. It is designed for a diverse user base that includes both non-overclocking users and overclocking enthusiasts, with a specific focus on content creators, intensive 3D modelers, AI programmers, trading machines, and HFT (high-frequency trading) companies. The kit is meticulously crafted with the most advanced SK Hynix DDR5 chips, sorted through automated binning for greater reliability and endurance, and continuously tested for full compatibility with all TRX50 motherboards.

SK hynix to Exhibit AI Memory Leadership at CES 2024

SK hynix Inc. announced today that it will showcase the technology for ultra-high performance memory products, the core of future AI infrastructure, at CES 2024, the most influential tech event in the world taking place from January 9 through 12 in Las Vegas. SK hynix said that it will highlight its future vision, represented by its "Memory Centric" concept, at the show and promote the importance of memory products in accelerating technological innovation in the AI era, as well as its competitiveness in the global memory markets.

The company will run a space titled SK Wonderland jointly with other major SK Group affiliates including SK Inc., SK Innovation and SK Telecom, and showcase its major AI memory products including HBM3E. SK hynix plans to provide HBM3E, the world's best-performing memory product that it successfully developed in August, to the world's largest AI technology companies by starting mass production from the first half of 2024.