News Posts matching #SK Hynix


SK Hynix Develops PCB01 NVMe SSD for AI PCs

SK hynix announced today that it has developed PCB01, an SSD with the industry's best specifications for on-device AI PCs. The product is the industry's first to adopt fifth-generation PCIe technology with eight channels, bringing innovation to performance, including data processing speed. The company expects this latest advancement in the NAND solution space to add to its success stories in high-performance DRAM led by HBM, enhancing its leadership across the overall AI memory space.

With a validation process underway with a global PC customer, SK hynix plans to mass produce and start shipping the product to both corporate customers and general consumers within this year. PCB01 delivers sequential read and write speeds of 14 GB/s and 12 GB/s, respectively, bringing SSD performance to a level unseen before. These speeds allow a large language model (LLM) used for AI training and inference to be loaded in about a second.
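As a rough sanity check of that claim, the arithmetic below estimates how long streaming an LLM's weights from the drive would take at the quoted read speed; the model size used is an assumed, illustrative figure, not one given by SK hynix.

```python
# Back-of-the-envelope check of the "load an LLM in about a second" claim.
# The model size below is an assumption for illustration, not from SK hynix.
model_size_gb = 13      # e.g. a ~13 GB quantized large language model (assumed)
seq_read_gb_s = 14      # PCB01 sequential read speed (GB/s), as quoted above

load_time_s = model_size_gb / seq_read_gb_s
print(f"Time to stream {model_size_gb} GB of weights: ~{load_time_s:.2f} s")
# -> roughly 0.93 s, consistent with loading a mid-sized LLM in about a second
```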

Samsung, SK Hynix, and Micron Compete for GDDR7 Dominance

Competition among Samsung, SK Hynix, and Micron is intensifying, with a focus on enhancing processing speed and efficiency in graphics DRAM (GDDR) for AI accelerators and cryptocurrency mining. Compared with High Bandwidth Memory (HBM), GDDR7 offers fast data processing speeds at a relatively low price. Since NVIDIA is expected to use next-generation GDDR7 in its GeForce RTX 50-series "Blackwell" GPUs, competition will likely be as strong as the demand, as the pace of GDDR7 announcements over the past two years shows.

In July 2023, Samsung Electronics developed the industry's first 32 Gbps GDDR7 DRAM, capable of processing up to 1.5 TB of data per second, a 1.4-times speed increase and 20% better energy efficiency compared to GDDR6. In February 2024, Samsung demonstrated a GDDR7 DRAM with a pin rate of 37 Gbps. On June 4, Micron launched its new GDDR7 at Computex 2024, with speeds of up to 32 Gbps, a 60% increase in bandwidth, and a 50% improvement in energy efficiency over the previous generation. Shortly after, SK Hynix introduced a 40 Gbps GDDR7, also showcased at Computex 2024, doubling the previous generation's per-chip bandwidth to 128 GB per second and improving energy efficiency by 40%.
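To put those per-pin figures in context, the sketch below converts pin rates into per-chip and per-card bandwidth; the 384-bit card bus width is an assumption chosen for illustration (a common high-end GPU configuration), not a figure from the announcements.

```python
# How per-pin data rates translate into per-device and per-card bandwidth.
def per_chip_bandwidth_gb_s(pin_rate_gbps: float, chip_bus_bits: int = 32) -> float:
    """Bandwidth of one GDDR chip in GB/s (pin rate x bus width / 8 bits per byte)."""
    return pin_rate_gbps * chip_bus_bits / 8

for rate in (32, 37, 40):
    print(f"{rate} Gbps per pin -> {per_chip_bandwidth_gb_s(rate):.0f} GB/s per chip")

# An assumed 384-bit card uses 12 x 32-bit chips; at 32 Gbps that is ~1.5 TB/s,
# matching the figure quoted for Samsung's 32 Gbps GDDR7.
card_bw_tb_s = per_chip_bandwidth_gb_s(32) * 12 / 1000
print(f"Assumed 384-bit card (12 chips) at 32 Gbps: ~{card_bw_tb_s:.2f} TB/s")
```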

SK hynix Showcases Its New AI Memory Solutions at HPE Discover 2024

SK hynix has returned to Las Vegas to showcase its leading AI memory solutions at HPE Discover 2024, Hewlett Packard Enterprise's (HPE) annual technology conference. Held from June 17-20, HPE Discover 2024 features a packed schedule with more than 150 live demonstrations, as well as technical sessions, exhibitions, and more. This year, attendees can also benefit from three new curated programs on edge computing and networking, hybrid cloud technology, and AI. Under the slogan "Memory, The Power of AI," SK hynix is displaying its latest memory solutions at the event including those supplied to HPE. The company is also taking advantage of the numerous networking opportunities to strengthen its relationship with the host company and its other partners.

The World's Leading Memory Solutions Driving AI
SK hynix's booth at HPE Discover 2024 consists of three product sections and a demonstration zone which showcase the unprecedented capabilities of its AI memory solutions. The first section features the company's groundbreaking memory solutions for AI, including HBM solutions. In particular, the industry-leading HBM3E has emerged as a core product to meet the growing demands of AI systems due to its exceptional processing speed, capacity, and heat dissipation. A key solution from the company's CXL lineup, CXL Memory Module-DDR5 (CMM-DDR5), is also on display in this section. In the AI era where high performance and capacity are vital, CMM-DDR5 has gained attention for its ability to expand system bandwidth by up to 50% and capacity by up to 100% compared to systems only equipped with DDR5 DRAM.
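The percentages translate into simple arithmetic; the baseline DDR5-only figures in this sketch are assumed values chosen purely to make the "up to 50% bandwidth, up to 100% capacity" claim concrete.

```python
# Illustrative arithmetic for the CMM-DDR5 claim: the baseline numbers below
# are assumptions chosen only to make the quoted percentages concrete.
baseline_bw_gb_s = 300   # assumed DDR5-only system bandwidth
baseline_cap_gb = 512    # assumed DDR5-only system capacity

with_cmm_bw = baseline_bw_gb_s * 1.5   # up to +50% bandwidth
with_cmm_cap = baseline_cap_gb * 2.0   # up to +100% capacity

print(f"Bandwidth: {baseline_bw_gb_s} -> {with_cmm_bw:.0f} GB/s (up to +50%)")
print(f"Capacity : {baseline_cap_gb} -> {with_cmm_cap:.0f} GB (up to +100%)")
```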

SK hynix's Partner Company Mimir IP Sues Micron

Mimir IP, a South Korean patent management company, bought around 1,500 chip-related patents from SK hynix in May. It has now filed a lawsuit against the U.S. memory company Micron, accusing it of using these patents without permission, TrendForce reported. If Mimir wins, it could receive up to USD 480 million in damages. The lawsuit, filed on June 3, also targets Tesla, Dell, HP, and Lenovo for using Micron's products. The patents in question relate to circuits, voltage measurement devices, and non-volatile memory devices.

The case is being heard in the US District Court for the Eastern District of Texas and the US International Trade Commission. This is the first time a South Korean company that acquired patents from domestic chipmakers has filed a lawsuit against a US semiconductor company. Officials from the involved companies have not commented. Micron, Samsung, and SK hynix have been changing how they deal with their patents recently, so this is not really a surprise move. In March 2023, Micron transferred over 400 chip-related patents to Lodestar Licensing Group. In June 2023, Samsung transferred 96 US chip patents, including the right to file patent infringement complaints, to IKT, an affiliate of Samsung Display.

Contract Price Increases Offset Seasonal Slump, Boosting DRAM Q1 Revenue by 5.1%

TrendForce reveals that the DRAM industry experienced a 5.1% revenue increase in 1Q24 compared to the previous quarter. This growth—reaching US$18.35 billion—was driven by rising contract prices for mainstream products, with the price increase being more significant than in 4Q23. As a result, most companies in the industry continued to see revenue growth.

The top three suppliers experienced a decline in shipments in the first quarter, demonstrating the industry's off-season effect. Additionally, downstream companies had higher inventory levels, which led to a significant reduction in procurement volume. As for ASP, the top three suppliers continued to benefit from contract price increases seen in 4Q23. With inventory levels still healthy, there was a strong intention to raise prices.

SK Hynix Targets Q1 2025 for GDDR7 Memory Mass Production

The race is on for memory manufacturers to bring the next generation GDDR7 graphics memory into mass production. While rivals Samsung and Micron are aiming to have GDDR7 chips available in Q4 of 2024, South Korean semiconductor giant SK Hynix revealed at Computex 2024 that it won't kick off mass production until the first quarter of 2025. GDDR7 is the upcoming JEDEC standard for high-performance graphics memory, succeeding the current GDDR6 and GDDR6X specifications. The new tech promises significantly increased bandwidth and capacities to feed the appetites of next-wave GPUs and AI accelerators. At its Computex booth, SK Hynix showed off engineering samples of its forthcoming GDDR7 chips, with plans for both 16 Gb and 24 Gb densities.

The company is targeting blazing-fast 40 Gbps data transfer rates with its GDDR7 offerings, outpacing the 32 Gbps rates its competitors are starting with on 16 Gb parts. If realized, higher speeds could give SK Hynix an edge, at least initially. While trailing a quarter or two behind Micron and Samsung isn't ideal, SK Hynix claims having working samples now validates its design and allows partners to begin testing and qualification. Mass production timing for standardized memories also doesn't necessarily indicate a company is "late" - it simply means another vendor secured an earlier production window with a specific customer. The GDDR7 transition is critical for SK Hynix and others, given the insatiable demand for high-bandwidth memory to power AI, graphics, and other data-intensive workloads. Hitting its stated Q1 2025 mass production target could ensure SK Hynix doesn't fall too far behind in the high-stakes GDDR7 race, with faster and higher-density chips to potentially follow shortly after volume ramp.
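A quick back-of-the-envelope comparison of the quoted pin rates, assuming the standard 32-bit interface per GDDR7 device:

```python
# Rough comparison of SK Hynix's targeted 40 Gbps parts against the 32 Gbps
# parts competitors are starting with, per 32-bit GDDR7 device.
for vendor, pin_rate in (("competitors (initial)", 32), ("SK Hynix (target)", 40)):
    gb_s = pin_rate * 32 / 8   # 32 data pins per device, 8 bits per byte
    print(f"{vendor}: {pin_rate} Gbps/pin -> {gb_s:.0f} GB/s per device")
# 160 GB/s vs 128 GB/s per device, a 25% headline advantage if realized.
```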

SK hynix Showcases Its Next-Gen Solutions at Computex 2024

SK hynix presented its leading AI memory solutions at COMPUTEX Taipei 2024 from June 4-7. As one of Asia's premier IT shows, COMPUTEX Taipei 2024 welcomed around 1,500 global participants including tech companies, venture capitalists, and accelerators under the theme "Connecting AI". Making its debut at the event, SK hynix underlined its position as a first mover and leading AI memory provider through its lineup of next-generation products.

"Connecting AI" With the Industry's Finest AI Memory Solutions
Themed "Memory, The Power of AI," SK hynix's booth featured its advanced AI server solutions, groundbreaking technologies for on-device AI PCs, and outstanding consumer SSD products. HBM3E, the fifth generation of HBM, was among the AI server solutions on display. Offering industry-leading data processing speeds of 1.18 terabytes (TB) per second, vast capacity, and advanced heat dissipation capability, HBM3E is optimized to meet the requirements of AI servers and other applications. Another technology which has become crucial for AI servers is CXL, as it can increase system bandwidth and processing capacity. SK hynix highlighted the strength of its CXL portfolio by presenting its CXL Memory Module-DDR5 (CMM-DDR5), which significantly expands system bandwidth and capacity compared to systems only equipped with DDR5. Other AI server solutions on display included the server DRAM products DDR5 RDIMM and MCR DIMM. In particular, SK hynix showcased its tall 128-gigabyte (GB) MCR DIMM for the first time at an exhibition.
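The 1.18 TB/s figure can be cross-checked against the standard 1,024-bit HBM interface; the short calculation below derives the implied per-pin rate (roughly 9.2 Gbps), which is an inference from the quoted number rather than a figure stated above.

```python
# Sanity check of the 1.18 TB/s per-stack figure for HBM3E: an HBM stack has
# a 1,024-bit interface, so the implied per-pin rate is about 9.2 Gbps.
stack_bw_tb_s = 1.18
interface_bits = 1024
per_pin_gbps = stack_bw_tb_s * 1000 * 8 / interface_bits
print(f"Implied per-pin rate: ~{per_pin_gbps:.1f} Gbps")   # ~9.2 Gbps
```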

SK hynix & NEOWIZ Collaborate on Limited Edition Lies of P T31 Solid State Drive

Today NEOWIZ, the publisher of hit action-adventure game Lies of P, and SK hynix, one of the world's leading computer hardware manufacturers, announced a collaboration for a Limited Edition Lies of P T31 Solid State Drive (SSD), set to be available this summer. The Tube T31 from SK hynix, first released earlier this year in March, is a stick-type SSD that can be used without a cable.

The Tube T31 Lies of P Limited Edition SSD comes in two varieties, each with distinct artwork featuring the iconic Legion Arm from Neowiz's hit souls-like game. Both drives are fully compatible with PCs and consoles, allowing players to add extra storage and transfer it between systems (PlayStation 5/4, Xbox Series X|S, Xbox One, Windows, and macOS). The T31 also received a Global Product Design award at the 2024 Red Dot Awards. The Tube T31 Lies of P Edition will be officially unveiled at a private booth jointly operated by SK hynix and Hitachi-LG Data Storage, Inc. (SK hynix's global sales and manufacturing partner for SSDs) at Computex, held in Taiwan from June 4 to June 7. It will be available in limited quantities globally starting in July.

NVIDIA Devouring Chips Faster than South Korea's Supply, Lowest Inventory in 10 Years

South Korea's stock of semiconductor chips saw its biggest drop since 2014. This steep decrease shows that customers are buying chips faster than companies can make them, as they need more equipment for developing artificial intelligence (AI) technology. Official data released on May 31 revealed that in April, chip inventories fell by 33.7% compared to a year earlier - the largest drop since late 2014. This is the fourth month in a row that inventories have declined, while at the same time South Korea's exports of semiconductors have gone up again. Additionally, South Korea's production of chips rose 22.3% in April, which is less than the 30.2% increase from the previous month. Shipments from factories grew 18.6%, up from March's 16.4% growth.

South Korea is home to the two biggest memory chipmakers in the world (Samsung and SK Hynix), and they are competing to supply chips to NVIDIA, the latter of which has an insatiable appetite for more and more chips. These two Korean companies are in a race to develop a more advanced and more profitable version of high-bandwidth memory, or HBM. During the memory chip boom from 2013-2015, inventories didn't increase for about a year and a half. In the 2016-2017 cycle, inventory declines lasted nearly a year. A report from South Korea's central bank expects the latest surge in chip demand to continue at least until the first half of next year. This is because the "artificial intelligence boom" is driving up demand, similarly to how cloud servers caused an expansion in 2016, along with the now mostly forgotten crypto-mining fever. South Korea will release its latest export data on June 1.

Growing Demand for High-Capacity Storage Propels Enterprise SSD Revenue Up by Over 60% in 1Q24

TrendForce reports that a reduction in supplier production has led to unmet demand for high-capacity orders since 4Q23. Combined with procurement strategies aimed at building low-cost inventory, this has driven orders and significantly boosted enterprise SSD revenue, which reached US$3.758 billion in 1Q24—a staggering 62.9% QoQ increase.

TrendForce further highlights that demand for high-capacity products, driven by AI servers, has surged. North American clients are increasingly adopting high-capacity QLC SSDs to replace HDDs, leading to over 20% growth in Q2 enterprise SSD bit procurement. This has also driven up Q2 enterprise SSD contract prices by more than 20%, with revenue expected to grow by another 20%.

Details Revealed about SK Hynix HBM4E: Computing and Caching Features Integrated Directly

SK Hynix, the leader in HBM3E memory, has now shared more details about HBM4E. Based on fresh reports by Wccftech and ET News, SK Hynix plans to make an HBM memory type that integrates multiple functions, such as computing, caching, and network memory, within the same package. This would set SK Hynix apart from its competitors. The idea is still in the early stages, but SK Hynix has started gathering the design information it needs to support its goals. The reports say that SK Hynix wants to lay the groundwork for a versatile HBM with its upcoming HBM4 design. The company reportedly plans to include an on-board memory controller, which would enable new computing abilities with its seventh-generation HBM4E memory.

With SK Hynix's method, everything is unified into a single package. This not only makes data transfer faster, because there is less distance between parts, but also makes it more energy-efficient. Back in April, SK Hynix announced that it has been working with TSMC to produce the next generation of HBM and to improve how logic chips and HBM work together through advanced packaging. In late May, SK Hynix disclosed yield details regarding HBM3E for the first time, with the memory giant reporting that it has cut the time needed for mass production of HBM3E chips by 50% while getting closer to its target yield of 80%. The company plans to keep developing HBM4, which is expected to enter mass production in 2026.

Micron DRAM Production Plant in Japan Faces Two-Year Delay to 2027

Last year, Micron unveiled plans to construct a cutting-edge DRAM factory in Hiroshima, Japan. However, the project has faced a significant two-year delay, pushing back the initial timeline for mass production of the company's most advanced memory products. Originally slated to begin mass production by the end of 2025, Micron now aims to have the new facility operational by 2027. The complexity of integrating extreme ultraviolet lithography (EUV) equipment, which enables the production of highly advanced chips, has contributed to the delay. The Hiroshima plant will produce next-generation 1-gamma DRAM and high-bandwidth memory (HBM) designed for generative AI applications. Micron expects the HBM market, currently dominated by rivals SK Hynix and Samsung, to experience rapid growth, with the company targeting a 25% market share by 2025.

The project is expected to cost between 600 and 800 billion Japanese yen ($3.8 to $5.1 billion), with Japan's government covering one-third of the cost. Micron has received a subsidy of up to 192 billion yen ($1.2 billion) for construction and equipment, as well as a subsidy to cover half of the necessary funding to produce HBM at the plant, amounting to 25 billion yen ($159 million). Despite the delay, the increased investment in the factory reflects Micron's commitment to advancing its memory technology and capitalizing on the growing demand for HBM. An indication of that is the fact that customers have pre-ordered 100% of the HBM capacity for 2024, not leaving a single HBM die unused.

NAND Flash Industry Revenue Grew 28.1% in 1Q24, Growth Expected to Continue into Q2

TrendForce reports that adoption of enterprise SSDs by AI servers began in February, which subsequently led to large orders. Additionally, PC and smartphone customers have been increasing their inventory levels to manage rising prices. This trend drove up NAND Flash prices and shipment levels in 1Q24 and boosted quarterly revenue by 28.1% to US$14.71 billion.

There were significant changes in market rankings this quarter, with Micron overtaking Western Digital to claim the fourth spot. Micron benefited from a lower base of prices and shipments than its competitors in 4Q23, resulting in 51.2% QoQ revenue growth to $1.72 billion in 1Q24—the highest among its peers.

NVIDIA Reportedly Having Issues with Samsung's HBM3 Chips Running Too Hot

According to Reuters, NVIDIA is having some major issues with Samsung's HBM3 chips, as NVIDIA hasn't managed to finalise its validation of the chips. Reuters is citing multiple sources familiar with the matter, and if the sources are correct, Samsung is having some serious issues with its HBM3 chips. Not only do the chips run hot, which is itself a big issue due to NVIDIA already having trouble cooling some of its higher-end products, but the power consumption is apparently not where it should be either. Samsung is said to have been trying to get its HBM3 and HBM3E parts validated by NVIDIA since sometime in 2023, according to Reuters' sources, which suggests that there have been issues for at least six months, if not longer.

The sources claim there are issues with both the 8- and 12-layer stacks of HBM3E parts from Samsung, suggesting that NVIDIA might only be able to source parts from Micron and SK Hynix for now, the latter of which has been supplying HBM3 chips to NVIDIA since the middle of 2022 and HBM3E chips since March of this year. It's unclear whether this is a production issue at Samsung's DRAM fabs, a packaging-related issue, or something else entirely. The Reuters piece goes on to speculate that Samsung has not had enough time to develop its HBM parts compared to its competitors and that it is a rushed product, but Samsung issued a statement to the publication saying it is a matter of customising the product for its customers' needs. Samsung also said that it is in "the process of optimising its products through close collaboration with customers", without going into which customer(s). Samsung issued a further statement saying that "claims of failing due to heat and power consumption are not true" and that testing was going as expected.

HBM3e Production Surge Expected to Make Up 35% of Advanced Process Wafer Input by End of 2024

TrendForce reports that the three largest DRAM suppliers are increasing wafer input for advanced processes. Following a rise in memory contract prices, companies have boosted their capital investments, with capacity expansion focusing on the second half of this year. It is expected that wafer input for 1alpha nm and above processes will account for approximately 40% of total DRAM wafer input by the end of the year.

HBM production will be prioritized due to its profitability and increasing demand. However, limited yields of around 50-60% and a wafer area 60% larger than DRAM products mean a higher proportion of wafer input is required. Based on the TSV capacity of each company, HBM is expected to account for 35% of advanced process wafer input by the end of this year, with the remaining wafer capacity used for LPDDR5(X) and DDR5 products.
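The arithmetic below illustrates why those yield and die-size figures translate into a disproportionate share of wafer input; the conventional-DRAM yield used as a reference is an assumed value, not one reported by TrendForce.

```python
# Rough illustration of why HBM consumes a disproportionate share of wafer
# input. The conventional-DRAM yield below is an assumed reference value.
hbm_area_factor = 1.6   # HBM die area is ~60% larger than regular DRAM products
hbm_yield = 0.55        # mid-point of the reported 50-60% yield
dram_yield = 0.85       # assumed mature-node DRAM yield (illustrative only)

# Good dice per wafer scale with yield and inversely with die area,
# so the wafer input needed per good HBM die relative to DRAM is:
relative_wafer_input = hbm_area_factor * dram_yield / hbm_yield
print(f"HBM needs ~{relative_wafer_input:.1f}x the wafer input per good die")
```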

TSMC Unveils Next-Generation HBM4 Base Dies, Built on 12 nm and 5 nm Nodes

At the European Technology Symposium 2024, TSMC announced its readiness to manufacture next-generation HBM4 base dies using both 12 nm and 5 nm nodes. This significant development is expected to substantially improve the performance, power consumption, and logic density of HBM4 memory, catering to the demands of high-performance computing (HPC) and artificial intelligence (AI) applications. The shift from a traditional 1024-bit interface to an ultra-wide 2048-bit interface is a key aspect of the new HBM4 standard. This change will enable the integration of more logic and higher performance while reducing power consumption. TSMC's N12FFC+ and N5 processes will be used to produce these base dies, with the N12FFC+ process offering a cost-effective solution for achieving HBM4 performance and the N5 process providing even more logic and lower power consumption at HBM4 speeds.

The company is collaborating with major HBM memory partners, including Micron, Samsung, and SK Hynix, to integrate advanced nodes for HBM4 full-stack integration. TSMC's base die, fabricated using the N12FFC+ process, will be used to install HBM4 memory stacks on a silicon interposer alongside system-on-chips (SoCs). This setup will enable the creation of 12-Hi (48 GB) and 16-Hi (64 GB) stacks with per-stack bandwidth exceeding 2 TB/s. TSMC's collaboration with EDA partners like Cadence, Synopsys, and Ansys ensures the integrity of HBM4 channel signals, thermal accuracy, and electromagnetic interference (EMI) in the new HBM4 base dies. TSMC is also optimizing CoWoS-L and CoWoS-R for HBM4 integration, meaning that massive high-performance chips are already utilizing this technology and getting ready for volume manufacturing.
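Working backwards from the quoted stack figures gives a feel for the per-die capacity and the per-pin rate implied by the 2,048-bit interface; these derived numbers are inferences from the figures above, not separate disclosures.

```python
# Working through the HBM4 stack figures quoted above.
per_die_gb = 48 / 12                   # 12-Hi stack at 48 GB -> 4 GB per DRAM die
print(f"Per-die capacity: {per_die_gb:.0f} GB")
print(f"16-Hi capacity  : {16 * per_die_gb:.0f} GB")   # matches the 64 GB figure

interface_bits = 2048                  # HBM4 widens the interface to 2,048 bits
target_bw_tb_s = 2.0
per_pin_gbps = target_bw_tb_s * 1000 * 8 / interface_bits
print(f"Per-pin rate for >2 TB/s: >~{per_pin_gbps:.1f} Gbps")   # ~7.8 Gbps
```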

SK hynix Develops Next-Generation Mobile NAND Solution ZUFS 4.0

SK hynix announced today that it has developed Zoned UFS, or ZUFS 4.0, a mobile NAND solution product for on-device AI applications. SK hynix said that ZUFS 4.0, optimized for on-device AI in mobile devices such as smartphones, is the industry's best of its kind. The company expects the latest product to help expand its AI memory leadership into the NAND space, extending its success in high-performance DRAM represented by HBM.

ZUFS is a differentiated technology that classifies and stores data generated from smartphones in different zones according to its characteristics. Unlike a conventional UFS, the latest product groups and stores data with similar purposes and update frequencies in separate zones, boosting the speed of a smartphone's operating system and the management efficiency of the storage device.
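As a conceptual illustration only (not SK hynix's implementation or the UFS specification), the sketch below shows the zoning idea: writes are grouped by category so that data with a similar purpose and update frequency lands in the same zone, which simplifies wear management and garbage collection. The category names and zone mapping are hypothetical.

```python
# A conceptual sketch (not SK hynix's implementation) of the zoning idea:
# data with a similar purpose and update frequency is grouped into the same
# zone, so whole zones can be erased and garbage-collected together instead
# of scattering unrelated data across blocks.
from collections import defaultdict

# Hypothetical zone assignment by data category (illustrative only)
ZONE_FOR_CATEGORY = {
    "os_image": 0,        # rarely rewritten
    "app_binary": 1,      # updated occasionally
    "user_media": 2,      # written once, read often
    "ai_model_cache": 3,  # large, frequently replaced on-device AI data
}

zones = defaultdict(list)

def zoned_write(category: str, payload: bytes) -> None:
    """Append a write to the zone matching its category."""
    zone_id = ZONE_FOR_CATEGORY.get(category, len(ZONE_FOR_CATEGORY))  # default zone
    zones[zone_id].append(payload)

zoned_write("ai_model_cache", b"layer-0-weights")
zoned_write("user_media", b"photo.jpg")
print({zone: len(items) for zone, items in zones.items()})
```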

SK hynix Presents CXL Memory Solutions Set to Power the AI Era at CXL DevCon 2024

SK hynix participated in the first-ever Compute Express Link Consortium Developers Conference (CXL DevCon) held in Santa Clara, California from April 30-May 1. Organized by a group of more than 240 global semiconductor companies known as the CXL Consortium, CXL DevCon 2024 welcomed a majority of the consortium's members to showcase their latest technologies and research results.

CXL is a technology that unifies the interfaces of different devices in a system such as semiconductor memory, storage, and logic chips. As it can increase system bandwidth and processing capacity, CXL is receiving attention as a key technology for the AI era in which high performance and capacity are essential. Under the slogan "Memory, The Power of AI," SK hynix showcased a range of CXL products at the conference that are set to strengthen the company's leadership in AI memory technology.

SK hynix CEO Says HBM from 2025 Production Almost Sold Out

SK hynix held a press conference unveiling its vision and strategy for the AI era today at its headquarters in Icheon, Gyeonggi Province, to share the details of its investment plans for the M15X fab in Cheongju and the Yongin Semiconductor Cluster in Korea and the advanced packaging facilities in Indiana, U.S.

The event, hosted by Chief Executive Officer Kwak Noh-Jung, three years before the May 2027 completion of the first fab in the Yongin Cluster, was attended by key executives including the Head of AI Infra Justin (Ju-Seon) Kim, Head of DRAM Development Kim Jonghwan, Head of the N-S Committee Ahn Hyun, Head of Manufacturing Technology Kim Yeongsik, Head of Package & Test Choi Woojin, Head of Corporate Strategy & Planning Ryu Byung Hoon, and Chief Financial Officer Kim Woo Hyun.

SK hynix Strengthens AI Memory Leadership & Partnership With Host at the TSMC 2024 Tech Symposium

SK hynix showcased its next-generation technologies and strengthened key partnerships at the TSMC 2024 Technology Symposium held in Santa Clara, California on April 24. At the event, the company displayed its industry-leading HBM AI memory solutions and highlighted its collaboration with TSMC involving the host's CoWoS advanced packaging technology.

TSMC, a global semiconductor foundry, invites its major partners to this annual conference in the first half of each year so they can share their new products and technologies. Attending the event under the slogan "Memory, the Power of AI," SK hynix received significant attention for presenting the industry's most powerful AI memory solution, HBM3E. The product has recently demonstrated industry-leading performance, achieving input/output (I/O) transfer speed of up to 10 gigabits per second (Gbps) in an AI system during a performance validation evaluation.
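At the standard 1,024-bit HBM interface, the quoted 10 Gbps per-pin I/O speed implies the per-stack bandwidth shown below; this is a derived figure, not one stated in the announcement.

```python
# What a 10 Gbps per-pin I/O speed implies for one HBM3E stack with the
# standard 1,024-bit interface.
per_pin_gbps = 10
interface_bits = 1024
stack_bw_tb_s = per_pin_gbps * interface_bits / 8 / 1000
print(f"Per-stack bandwidth: {stack_bw_tb_s:.2f} TB/s")   # 1.28 TB/s
```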

SK Hynix Announces 1Q24 Financial Results

SK hynix Inc. announced today that it recorded 12.43 trillion won in revenues, 2.886 trillion won in operating profit (with an operating margin of 23%), and 1.917 trillion won in net profit (with a net margin of 15%) in the first quarter. With revenues marking an all-time high for a first quarter and operating profit the second-highest ever for the period, after the record set in the first quarter of 2018, SK hynix believes that it has entered the phase of a clear rebound following a prolonged downturn.

The company said that an increase in the sales of AI server products, backed by its leadership in AI memory technology including HBM, and continued efforts to prioritize profitability led to a 734% on-quarter jump in operating profit. With the sales ratio of eSSD, a premium product, and average selling prices both rising, the NAND business also achieved a meaningful turnaround in the same period.
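The quoted margins and the 734% on-quarter jump can be checked directly from the reported figures:

```python
# Checking the quoted margins and the 734% on-quarter operating-profit jump.
revenue_twon = 12.43     # trillion won
op_profit_twon = 2.886
net_profit_twon = 1.917

print(f"Operating margin: {op_profit_twon / revenue_twon:.0%}")   # ~23%
print(f"Net margin      : {net_profit_twon / revenue_twon:.0%}")  # ~15%

# A 734% on-quarter jump implies the prior quarter's operating profit was:
prior_q_op_profit = op_profit_twon / (1 + 7.34)
print(f"Implied 4Q23 operating profit: ~{prior_q_op_profit:.2f} trillion won")
```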

SK hynix to Produce DRAM from M15X in Cheongju

SK hynix Inc. announced today that it plans to expand production capacity for next-generation DRAM, including HBM, a core component of AI infrastructure, in response to the rapidly increasing demand for AI semiconductors. With the board of directors approving the plan, the company will build the M15X fab in Cheongju, North Chungcheong Province, as a new DRAM production base and invest about 5.3 trillion won in fab construction.

The company plans to start construction at the end of April, aiming for completion in November 2025 to enable early mass production. With a gradual increase in equipment investment planned, the total investment in the new production base will exceed 20 trillion won over the long term. As a global leader in AI memory, SK hynix expects the expanded investment to contribute to revitalizing the domestic economy, while refreshing Korea's reputation as a semiconductor powerhouse.

SK hynix Collaborates with TSMC on HBM4 Chip Packaging

SK hynix Inc. announced today that it has recently signed a memorandum of understanding with TSMC for collaboration to produce next-generation HBM and enhance logic and HBM integration through advanced packaging technology. The company plans to proceed with the development of HBM4, or the sixth generation of the HBM family, slated to be mass-produced from 2026, through this initiative.

SK hynix said the collaboration between the global leader in the AI memory space and TSMC, a top global logic foundry, will lead to more innovations in HBM technology. The collaboration is also expected to enable breakthroughs in memory performance through trilateral collaboration between product design, foundry, and memory provider. The two companies will first focus on improving the performance of the base die that is mounted at the very bottom of the HBM package. HBM is made by stacking core DRAM dies on top of a base die that features TSV technology, with a fixed number of DRAM layers vertically connected to the base die through TSVs to form an HBM package. The base die located at the bottom is connected to the GPU, which controls the HBM.
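For readers less familiar with the structure, here is a minimal conceptual model (illustrative only, not an SK hynix design artifact) of the stack described above: core DRAM dies sit on a TSV-equipped base die, and the stack's capacity is simply the die count times the per-die capacity. The 12-Hi, 3 GB-per-die example is an assumption matching a typical HBM3E configuration.

```python
# Conceptual model of an HBM package: core DRAM dies stacked on a base die
# and linked vertically with through-silicon vias (TSVs); the base die is
# what connects to the GPU. Illustrative only.
from dataclasses import dataclass

@dataclass
class HBMStack:
    core_dies: int          # number of stacked DRAM dies (e.g. 8, 12, 16)
    die_capacity_gb: int    # capacity of each core DRAM die
    base_die: str = "TSV base die (interface logic, connects to the GPU)"

    @property
    def capacity_gb(self) -> int:
        return self.core_dies * self.die_capacity_gb

stack = HBMStack(core_dies=12, die_capacity_gb=3)  # assumed 12-Hi, 3 GB/die example
print(f"{stack.core_dies}-Hi stack, {stack.capacity_gb} GB, on a {stack.base_die}")
```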

SK hynix Signs Investment Agreement of Advanced Chip Packaging with Indiana

SK hynix Inc., the world's leading producer of High-Bandwidth Memory (HBM) chips, announced today that it will invest an estimated $3.87 billion in West Lafayette, Indiana to build an advanced packaging fabrication and R&D facility for AI products. The project, the first of its kind in the United States, is expected to drive innovation in the nation's AI supply chain, while bringing more than a thousand new jobs to the region.

The company held an investment agreement ceremony with officials from Indiana State, Purdue University, and the U.S. government at Purdue University in West Lafayette on the 3rd and officially announced the plan. At the event, officials from each party including Governor of Indiana Eric Holcomb, Senator Todd Young, Director of the White House Office of Science and Technology Policy Arati Prabhakar, Assistant Secretary of Commerce Arun Venkataraman, Secretary of Commerce State of Indiana David Rosenberg, Purdue University President Mung Chiang, Chairman of Purdue Research Foundation Mitch Daniels, Mayor of city of West Lafayette Erin Easter, Ambassador of the Republic of Korea to the United States Hyundong Cho, Consul General of the Republic of Korea in Chicago Junghan Kim, SK vice chairman Jeong Joon Yu, SK hynix CEO Kwak Noh-Jung and SK hynix Head of Package & Test Choi Woojin, participated.

SK Hynix Plans a $4 Billion Chip Packaging Facility in Indiana

SK Hynix is planning a large $4 billion chip-packaging and testing facility in Indiana, USA. The company is still in the planning stage of its decision to invest in the US. "[the company] is reviewing its advanced chip packaging investment in the US, but hasn't made a final decision yet," a company spokesperson told the Wall Street Journal. The primary product focus for this plant will be stacked HBM memory meant to be consumed by the AI GPU and self-driving automobile industries. The plant could also focus on other exotic memory types, such as high-density server memory, and perhaps even compute-in-memory. The plant is expected to start operations in 2028 and will create up to 1,000 skilled jobs. SK Hynix is counting on state and federal tax incentives, under government initiatives such as the CHIPS Act, to propel this investment. SK Hynix is a significant supplier of HBM to NVIDIA for its AI GPUs. Its HBM3E features in the latest NVIDIA "Blackwell" GPUs.