News Posts matching #DRAM


JEDEC Releases New Standard for LPDDR5/5X Serial Presence Detect (SPD) Contents

JEDEC Solid State Technology Association, the global leader in standards development for the microelectronics industry, today announced the publication of the JESD406-5 LPDDR5/5X Serial Presence Detect (SPD) Contents V1.0, consistent with the updated contents of JESD401-5B DDR5 DIMM Label and JESD318 DDR5/LPDDR5 Compression Attached Memory Module (CAMM2) Common Standard.

JESD406-5 documents the contents of the SPD non-volatile configuration device included on all JEDEC standard memory modules using LPDDR5/5X SDRAMs, including the CAMM2 standard designs outlined in JESD318. The JESD401-5B standard defines the content of standard memory module labels using the other two standards, assisting end users in selecting compatible modules for their applications.

Samsung to Install High-NA EUV Machines Ahead of TSMC in Q4 2024 or Q1 2025

Samsung Electronics is set to make a significant leap in semiconductor manufacturing technology with the introduction of its first High-NA (0.55 NA) EUV lithography tool. The company plans to install the ASML Twinscan EXE:5000 system at its Hwaseong campus between Q4 2024 and Q1 2025, marking a crucial step in developing next-generation process technologies for logic and DRAM production. This move positions Samsung about a year behind Intel but ahead of rivals TSMC and SK Hynix in adopting High-NA EUV technology. The system is expected to be operational by mid-2025, primarily for research and development purposes. Samsung is not just focusing on the lithography equipment itself but is building a comprehensive ecosystem around High-NA EUV technology.

The company is collaborating with several key partners like Lasertec (developing inspection equipment for High-NA photomasks), JSR (working on advanced photoresists), Tokyo Electron (enhancing etching machines), and Synopsys (shifting to curvilinear patterns on photomasks for improved circuit precision). The High-NA EUV technology promises significant advancements in chip manufacturing. With an 8 nm resolution capability, it could make transistors about 1.7 times smaller and increase transistor density by nearly three times compared to current Low-NA EUV systems. However, the transition to High-NA EUV comes with challenges. The tools are more expensive, costing up to $380 million each, and have a smaller imaging field. Their larger size also requires chipmakers to reconsider fab layouts. Despite these hurdles, Samsung aims for commercial implementation of High-NA EUV by 2027.
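
As a rough sanity check on those figures, the quoted 8 nm resolution and near-tripling of density follow from the standard Rayleigh scaling relation. The sketch below assumes typical EUV values (13.5 nm wavelength, k1 around 0.33); the k1 factor is an assumption, not something stated in the announcement.

# Back-of-the-envelope check of the High-NA figures above, assuming the
# standard Rayleigh criterion R = k1 * lambda / NA with typical EUV values
# (13.5 nm wavelength, k1 ~ 0.33 -- the k1 value is an assumption).

wavelength_nm = 13.5
k1 = 0.33

for na in (0.33, 0.55):  # current Low-NA optics vs High-NA optics
    resolution_nm = k1 * wavelength_nm / na
    print(f"NA {na}: ~{resolution_nm:.1f} nm resolution")

# "About 1.7 times smaller" transistors implies roughly a 1.7^2 density gain.
linear_shrink = 1.7
print(f"Density gain from a {linear_shrink}x linear shrink: ~{linear_shrink ** 2:.1f}x")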

Imec Demonstrates Logic and DRAM Structures Using High NA EUV Lithography

Imec, a world-leading research and innovation hub in nanoelectronics and digital technologies, presents patterned structures obtained after exposure with the 0.55 NA EUV scanner in the joint ASML-imec High NA EUV Lithography Lab in Veldhoven, the Netherlands. Random logic structures down to 9.5 nm (19 nm pitch), random vias with 30 nm center-to-center distance, 2D features at 22 nm pitch, and a DRAM-specific layout at a 32 nm pitch were printed after a single exposure, using materials and baseline processes that were optimized for High NA EUV by imec and its partners in the framework of imec's Advanced Patterning Program. With these results, imec confirms the readiness of the ecosystem to enable single-exposure, high-resolution High NA EUV lithography.

Following the recent opening of the joint ASML-imec High NA EUV Lithography Lab in Veldhoven, the Netherlands, customers now have access to the TWINSCAN EXE:5000 High NA EUV scanner to develop private High NA EUV use cases leveraging the customer's own design rules and layouts.

Samsung Electronics Begins Mass Production of Industry's Thinnest LPDDR5X DRAM Packages

Samsung Electronics, the world leader in advanced memory technology, today announced it has begun mass production of the industry's thinnest 12 nanometer (nm)-class, 12-gigabyte (GB) and 16 GB LPDDR5X DRAM packages, solidifying its leadership in the low-power DRAM market. Leveraging its extensive expertise in chip packaging, Samsung is able to deliver ultra-slim LPDDR5X DRAM packages that create additional space within mobile devices, facilitating better airflow. This supports easier thermal control, a factor that is becoming increasingly critical, especially for high-performance applications with advanced features such as on-device AI.

"Samsung's LPDDR5X DRAM sets a new standard for high-performance on-device AI solutions, offering not only superior LPDDR performance but also advanced thermal management in an ultra-compact package," said YongCheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. "We are committed to continuous innovation through close collaboration with our customers, delivering solutions that meet the future needs of the low-power DRAM market."

addlink Unveils NAS D60 and D20 SSDs Optimized for Flash-based NAS

addlink Technology Co., Ltd., a global leader in high-performance storage solutions, proudly announces the launch of its groundbreaking new SSDs designed specifically for NAS storage systems: the NAS D60 and D20 SSDs. These cutting-edge SSDs are poised to set new standards in speed, reliability, and capacity, transforming the NAS storage landscape.

addlink's NAS D60 and D20 SSDs exemplify the company's dedication to innovation and quality. These products are meticulously designed to meet the rigorous demands of modern NAS users, providing reliable, high-performance storage solutions that enhance productivity and data management. Whether for home use or enterprise-level applications, the NAS D60 and D20 offer unmatched performance, endurance, and value.

Samsung Electronics Announces Results for Second Quarter of 2024

Samsung Electronics today reported financial results for the second quarter ended June 30, 2024. The Company posted KRW 74.07 trillion in consolidated revenue and operating profit of KRW 10.44 trillion as favorable memory market conditions drove higher average sales price (ASP), while robust sales of OLED panels also contributed to the results.

Memory Market Continues To Recover; Solid Second Half Outlook Centered on Server Demand
The DS Division posted KRW 28.56 trillion in consolidated revenue and KRW 6.45 trillion in operating profit for the second quarter. Driven by strong demand for HBM as well as conventional DRAM and server SSDs, the memory market as a whole continued its recovery. This increased demand is a result of the continued AI investments by cloud service providers and growing demand for AI from businesses for their on-premise servers.

Marvell Introduces Breakthrough Structera CXL Product Line to Address Server Memory Bandwidth and Capacity Challenges in Cloud Data Centers

Marvell Technology, Inc., a leader in data infrastructure semiconductor solutions, today launched the Marvell Structera product line of Compute Express Link (CXL) devices that enable cloud data center operators to overcome memory performance and scaling challenges in general-purpose servers.

To address memory-intensive applications, data center operators add extra servers to get higher memory bandwidth and higher memory capacity. The compute capabilities from the added processors are typically not utilized for these applications, making the servers inefficient from cost and power perspectives. The CXL industry standard addresses this challenge by enabling new architectures that can efficiently add memory to general-purpose servers.
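
To illustrate the inefficiency the paragraph describes, the sketch below compares how many servers a purely DRAM-bound deployment would need against one that attaches CXL memory expanders; all capacity figures are hypothetical examples, not Marvell numbers.

# Illustrative sketch of the trade-off described above (hypothetical numbers,
# not Marvell data): how many general-purpose servers are needed to reach a
# memory-capacity target with and without CXL-attached memory expansion.
import math

target_memory_tb = 24        # memory the workload needs (assumed)
dram_per_server_tb = 2       # native DRAM per server (assumed)
cxl_per_server_tb = 4        # additional CXL-attached memory per server (assumed)

servers_dram_only = math.ceil(target_memory_tb / dram_per_server_tb)
servers_with_cxl = math.ceil(target_memory_tb / (dram_per_server_tb + cxl_per_server_tb))

print(f"Servers needed, DRAM only: {servers_dram_only}")         # 12
print(f"Servers needed with CXL expansion: {servers_with_cxl}")  # 4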

Micron Introduces 9550 NVMe Data Center SSD

Micron Technology, Inc., today announced availability of the Micron 9550 NVMe SSD - the world's fastest data center SSD and industry leader in AI workload performance and power efficiency. The Micron 9550 SSD showcases Micron's deep expertise and innovation by integrating its own controller, NAND, DRAM and firmware into one world-class product. This integrated solution enables class-leading performance, power efficiency and security features for data center operators.

The Micron 9550 SSD delivers best-in-class performance with 14.0 GB/s sequential reads and 10.0 GB/s sequential writes, providing up to 67% better performance than similar competitive SSDs and enabling industry-leading performance for demanding workloads such as AI. In addition, its random reads of 3,300 KIOPS are up to 35% better and random writes of 400 KIOPS are up to 33% better than competitive offerings.
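
Assuming each "up to X% better" figure applies to the metric it is quoted alongside, the implied competitor baselines can be backed out as follows; this is an arithmetic illustration, not Micron's published comparison data.

# Backing out the implied competitor figures from the "up to X% better"
# claims above, assuming each percentage applies to the metric it is quoted
# with (an arithmetic illustration, not Micron's published comparison data).

claims = {
    "sequential read (GB/s)": (14.0, 0.67),
    "random read (KIOPS)": (3300, 0.35),
    "random write (KIOPS)": (400, 0.33),
}

for metric, (micron_value, advantage) in claims.items():
    implied_competitor = micron_value / (1 + advantage)
    print(f"{metric}: Micron {micron_value} vs implied competitor ~{implied_competitor:.1f}")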

Memory Industry Revenue Expected to Reach Record High in 2025 Due to Increasing Average Prices and the Rise of HBM and QLC

TrendForce's latest report on the memory industry reveals that DRAM and NAND Flash revenues are expected to see significant increases of 75% and 77%, respectively, in 2024, driven by increased bit demand, an improved supply-demand structure, and the rise of high-value products like HBM.

Furthermore, industry revenues are projected to continue growing in 2025, with DRAM expected to increase by 51% and NAND Flash by 29%, reaching record highs. This growth is anticipated to revive capital expenditures and boost demand for upstream raw materials, although it will also increase cost pressure for memory buyers.

Micron Technology Unveils MRDIMMs to Scale Up Memory Densities on Servers

Micron Technology, Inc., today announced it is now sampling its multiplexed rank dual inline memory modules (MRDIMMs). The MRDIMMs will enable Micron customers to run increasingly demanding workloads and obtain maximum value out of their compute infrastructure. For applications requiring more than 128 GB of memory per DIMM slot, Micron MRDIMMs outperform current TSV RDIMMs by delivering the highest bandwidth and largest capacity with the lowest latency and improved performance per watt, accelerating memory-intensive virtualized multi-tenant, HPC and AI data center workloads. The new memory offering is the first generation in the Micron MRDIMM family and will be compatible with Intel Xeon 6 processors.

"Micron's latest innovative main memory solution, MRDIMM, delivers the much-needed bandwidth and capacity at lower latency to scale AI inference and HPC applications on next-generation server platforms," said Praveen Vaidyanathan, vice president and general manager of Micron's Compute Products Group. "MRDIMMs significantly lower the amount of energy used per task while offering the same reliability, availability and serviceability capabilities and interface as RDIMMs, thus providing customers a flexible solution that scales performance. Micron's close industry collaborations ensure seamless integration into existing server infrastructures and smooth transitions to future compute platforms."

Samsung Completes Validation of Industry's Fastest LPDDR5X for Use With MediaTek's Flagship Mobile Platform

Samsung Electronics, the world leader in advanced memory technology, today announced it has successfully completed verification of the industry's fastest 10.7 gigabit-per-second (Gbps) Low Power Double Data Rate 5X (LPDDR5X) DRAM for use on MediaTek's next-generation Dimensity platform.

The 10.7 Gbps operation speed verification was carried out using Samsung's LPDDR5X 16-gigabyte (GB) package on MediaTek's upcoming flagship Dimensity 9400 System on Chip (SoC), scheduled to be released in the second half of this year. The two companies have closely collaborated to complete the verification within just three months.
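
For context, 10.7 Gbps is a per-pin data rate; assuming a typical 64-bit-wide LPDDR5X package interface (a bus-width assumption, not stated in the announcement), the implied peak bandwidth works out as follows.

# 10.7 Gbps is a per-pin data rate; assuming a typical 64-bit LPDDR5X package
# interface (bus width is an assumption, not stated in the announcement),
# the implied peak bandwidth per package is:

per_pin_gbps = 10.7
bus_width_bits = 64

peak_gb_per_s = per_pin_gbps * bus_width_bits / 8
print(f"~{peak_gb_per_s:.1f} GB/s per package")  # ~85.6 GB/s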

Nanya and Winbond Boost Memory Production Amid Rising Demand and Prices

As memory prices and volumes increase, manufacturers Nanya and Winbond have ended the production cuts they implemented last year, with output now back to normal levels. Market research agencies and supply chain analysts indicate that memory shipments are expected to continue recovering in Q3 2024. Memory fabs are currently operating at 90% to full capacity utilization, significantly higher than the 60% to 70% utilization of wafer foundries running mature processes. Last year, Nanya cut its production volume by up to 20%. This year, output has gradually increased, with utilization reaching 70% to over 80% in the second quarter, and production has now returned to normal levels.

Nanya anticipates that DRAM market conditions and prices will improve quarter by quarter, with the overall industry trending positively and the company potentially turning losses into profits in the third quarter. Nanya announced yesterday that its consolidated revenue for June was NT$3.363 billion, a monthly increase of 0.35% and an annual increase of 36.83%, setting a new high for the year. Cumulative consolidated revenue for the first half of the year was NT$19.424 billion, up 44.4% from the same period last year. Nanya will hold a press conference on July 10 to announce its second-quarter financial results and operating outlook.

Panmnesia Uses CXL Protocol to Expand GPU Memory with Add-in DRAM Card or Even SSD

South Korean startup Panmnesia has unveiled an interesting solution to address the memory limitations of modern GPUs. The company has developed a low-latency Compute Express Link (CXL) IP that could help expand GPU memory with an external add-in card. Current GPU-accelerated applications in AI and HPC are constrained by the fixed amount of memory built into GPUs. With data sizes growing roughly 3x yearly, GPU clusters must keep getting larger just to fit applications in local memory, which is what keeps latency and token generation rates acceptable. Panmnesia's proposed approach leverages the CXL protocol to expand GPU memory capacity using PCIe-connected DRAM or even SSDs. The company has overcome significant technical hurdles, including the absence of CXL logic fabric in GPUs and the limitations of existing unified virtual memory (UVM) systems.

At the heart of Panmnesia's solution is a CXL 3.1-compliant root complex with multiple root ports and a host bridge featuring a host-managed device memory (HDM) decoder. This system effectively tricks the GPU's memory subsystem into treating PCIe-connected memory as native system memory. Extensive testing has demonstrated impressive results. Panmnesia's CXL solution, CXL-Opt, achieved two-digit-nanosecond round-trip latency, significantly outperforming both UVM and earlier CXL prototypes. In GPU kernel execution tests, CXL-Opt showed execution times up to 3.22 times faster than UVM. Older CXL memory extenders recorded around 250 nanoseconds of round-trip latency, while CXL-Opt could potentially achieve less than 80 nanoseconds. As with CXL in general, the usual problem is that memory pools add latency and degrade performance, and these CXL extenders tend to add to the cost model as well. However, Panmnesia's CXL-Opt could find a use case, and we are waiting to see if anyone adopts it in their infrastructure.
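
Taking the quoted figures at face value, the short sketch below shows the relative improvement CXL-Opt would represent over earlier CXL memory extenders and UVM; these are the numbers reported by Panmnesia, not independent measurements.

# Relative comparison of the latency and speedup figures quoted above
# (numbers as reported by Panmnesia, not independent measurements).

older_cxl_rtt_ns = 250      # older CXL memory extenders, round-trip latency
cxl_opt_rtt_ns = 80         # CXL-Opt, "potentially less than 80 ns"
uvm_kernel_slowdown = 3.22  # UVM kernel execution time relative to CXL-Opt

print(f"Round-trip latency reduction vs older extenders: ~{older_cxl_rtt_ns / cxl_opt_rtt_ns:.1f}x")
print(f"Kernel execution speedup vs UVM: up to {uvm_kernel_slowdown}x")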
Below are some benchmarks by Panmnesia, as well as the architecture of the CXL-Opt.

SK Hynix to Invest $75 Billion by 2028 in Memory Solutions for AI

South Korean giant SK Group has unveiled plans for substantial investments in AI and semiconductor technologies worth almost $75 billion. SK Group subsidiary, SK Hynix, will lead this initiative with a staggering 103 trillion won ($74.6 billion) investment over the next three years, with plans to realize the investment by 2028. This commitment is in addition to the ongoing construction of a $90 billion mega fab complex in Gyeonggi Province for cutting-edge memory production. SK Group has further pledged an additional $58 billion, bringing the total investment to a whopping $133 billion. This capital infusion aims to enhance the group's competitiveness in the AI value chain while funding operations across its 175 subsidiaries, including SK Hynix.

While specific details remain undisclosed, SK Group is reportedly exploring various options, including potential mergers and divestments. SK Group has signaled that its business practices need to change amid shifting geopolitical situations and the massive boost that AI is bringing to the overall economy. We may see more interesting products from SK Group in the coming years as it potentially enters new markets centered around AI. This strategic pivot comes after SK Hynix reported its first loss in a decade in 2022. However, the company has since shown signs of recovery, fueled by the surging demand for memory solutions for AI chips. The company currently has a 35% share of the global DRAM market and plans to have an even stronger presence in the coming years. The massive investment aligns with the South Korean government's recently announced $19 billion support package for the domestic semiconductor industry, which will be distributed across companies like SK Hynix and Samsung.

Micron Technology, Inc. Reports Results for the Third Quarter of Fiscal 2024

Micron Technology, Inc. (Nasdaq: MU) today announced results for its third quarter of fiscal 2024, which ended May 30, 2024.

Fiscal Q3 2024 highlights
  • Revenue of $6.81 billion versus $5.82 billion for the prior quarter and $3.75 billion for the same period last year
  • GAAP net income of $332 million, or $0.30 per diluted share
  • Non-GAAP net income of $702 million, or $0.62 per diluted share
  • Operating cash flow of $2.48 billion versus $1.22 billion for the prior quarter and $24 million for the same period last year
"Robust AI demand and strong execution enabled Micron to drive 17% sequential revenue growth, exceeding our guidance range in fiscal Q3," said Sanjay Mehrotra, President and CEO of Micron Technology. "We are gaining share in high-margin products like High Bandwidth Memory (HBM), and our data center SSD revenue hit a record high, demonstrating the strength of our AI product portfolio across DRAM and NAND. We are excited about the expanding AI-driven opportunities ahead, and are well positioned to deliver a substantial revenue record in fiscal 2025."

DRAM Prices Expected to Increase by 8-13% in Q3

TrendForce reports that a recovery in demand for general servers—coupled with an increased production share of HBM by DRAM suppliers—has led suppliers to maintain their stance on hiking prices. As a result, the ASP of DRAM in the third quarter is expected to continue rising, with an anticipated increase of 8-13%. The price of conventional DRAM is expected to rise by 5-10%, a slightly smaller increase than in the second quarter.

TrendForce notes that buyers were more conservative about restocking in the second quarter, and inventory levels on both the supplier and buyer sides did not show significant changes. Looking ahead to the third quarter, there is still room for inventory replenishment for smartphones and CSPs, and the peak production season is soon to commence. Consequently, it is expected that smartphones and servers will drive an increase in memory shipments in the third quarter.

CSPs to Expand into Edge AI, Driving Average NB DRAM Capacity Growth by at Least 7% in 2025

TrendForce has observed that in 2024, major CSPs such as Microsoft, Google, Meta, and AWS will continue to be the primary buyers of high-end AI servers, which are crucial for LLM and AI modeling. After establishing significant AI training server infrastructure in 2024, these CSPs are expected to actively expand into edge AI in 2025. This expansion will include the development of smaller LLM models and the deployment of edge AI servers to facilitate AI applications across various sectors, such as manufacturing, finance, healthcare, and business.

Moreover, AI PCs or notebooks share a similar architecture to AI servers, offering substantial computational power and the ability to run smaller LLM and generative AI applications. These devices are anticipated to serve as the final bridge between cloud AI infrastructure and edge AI for small-scale training or inference applications.

Kingston Intros FURY Renegade RGB Limited Edition DDR5 Memory

Kingston today formally launched the FURY Renegade RGB Limited Edition DDR5 memory kits, which were shown at the company's Computex 2024 booth earlier this month. The modules' design involves a two-tone die-cast metal shroud over the aluminium heat-spreaders, crowned by silicone diffusers for the RGB LEDs. The modules have a 19-preset lighting controller, and you control the lighting using the first-party FURY CTRL software. Kingston says the design of these modules is inspired by race cars.

The Kingston FURY Renegade RGB Limited Edition is available in only one density—48 GB (2x 24 GB kit), and in only one speed variant, DDR5-8000, with timings of CL36-48-48, and DRAM voltage of 1.45 V. The module also includes profiles for DDR5-7200 and DDR5-6400, with tighter timings. The modules pack an Intel XMP 3.0 SPD profile that enables the advertised speeds on Intel platforms. Kingston has extensively tested the modules on the latest Intel platforms, such as the 14th Gen Core "Raptor Lake Refresh" for compatibility with the advertised XMP speeds. The company didn't reveal pricing.
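
For readers weighing the loose-looking CL36 timing against the high transfer rate, absolute CAS latency can be estimated with the usual DDR conversion; the calculation below is a generic rule of thumb, not a Kingston specification.

# Generic estimate of absolute CAS latency for the kit above; the I/O clock
# of DDR memory runs at half the transfer rate, so t_CAS = CL / (rate / 2).
# This is a rule-of-thumb calculation, not a Kingston specification.

def cas_latency_ns(cl: int, data_rate_mt_s: int) -> float:
    io_clock_mhz = data_rate_mt_s / 2
    return cl / io_clock_mhz * 1000  # cycles divided by MHz, converted to ns

print(cas_latency_ns(36, 8000))  # DDR5-8000 CL36 -> 9.0 ns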

SK hynix Showcases Its New AI Memory Solutions at HPE Discover 2024

SK hynix has returned to Las Vegas to showcase its leading AI memory solutions at HPE Discover 2024, Hewlett Packard Enterprise's (HPE) annual technology conference. Held from June 17-20, HPE Discover 2024 features a packed schedule with more than 150 live demonstrations, as well as technical sessions, exhibitions, and more. This year, attendees can also benefit from three new curated programs on edge computing and networking, hybrid cloud technology, and AI. Under the slogan "Memory, The Power of AI," SK hynix is displaying its latest memory solutions at the event including those supplied to HPE. The company is also taking advantage of the numerous networking opportunities to strengthen its relationship with the host company and its other partners.

The World's Leading Memory Solutions Driving AI
SK hynix's booth at HPE Discover 2024 consists of three product sections and a demonstration zone which showcase the unprecedented capabilities of its AI memory solutions. The first section features the company's groundbreaking memory solutions for AI, including HBM solutions. In particular, the industry-leading HBM3E has emerged as a core product to meet the growing demands of AI systems due to its exceptional processing speed, capacity, and heat dissipation. A key solution from the company's CXL lineup, CXL Memory Module-DDR5 (CMM-DDR5), is also on display in this section. In the AI era where high performance and capacity are vital, CMM-DDR5 has gained attention for its ability to expand system bandwidth by up to 50% and capacity by up to 100% compared to systems only equipped with DDR5 DRAM.

Gigabyte Promises 219,000 TBW for New AI TOP 100E SSD

Gigabyte has quietly added a new SSD to its growing lineup, and this time around it's something quite different. The drive is part of Gigabyte's new AI TOP (Trillions of Operations per Second) family and was announced at Computex with little fanfare. At the show, the company only said that it would have 150x the TBW of regular SSDs and that it was built specifically for AI model training. What that 150x means in practice is that the 2 TB version of the AI TOP 100E SSD will deliver no less than 219,000 TBW (TeraBytes Written), whereas most high-end 2 TB consumer NVMe SSDs end up somewhere around 1,200 TBW. The 1 TB version promises 109,500 TBW, and both drives have an MTBF of 1.6 million hours and a five-year warranty.
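
To put 219,000 TBW in perspective, it can be converted into drive writes per day (DWPD) over the five-year warranty using the standard formula; the roughly 1,200 TBW consumer baseline is the figure quoted above.

# Converting the TBW ratings above into drive writes per day (DWPD) over the
# five-year warranty, using the standard DWPD = TBW / (capacity * days) formula.

def dwpd(tbw_tb: float, capacity_tb: float, warranty_years: int = 5) -> float:
    return tbw_tb / (capacity_tb * warranty_years * 365)

print(f"AI TOP 100E 2 TB: ~{dwpd(219_000, 2):.0f} DWPD")                   # ~60 DWPD
print(f"AI TOP 100E 1 TB: ~{dwpd(109_500, 1):.0f} DWPD")                   # ~60 DWPD
print(f"Typical 1,200 TBW consumer 2 TB SSD: ~{dwpd(1_200, 2):.2f} DWPD")  # ~0.33 DWPD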

Gigabyte didn't reveal the host controller or the exact NAND used, but the drives are said to use 3D NAND flash, and both have an LPDDR4 DRAM cache of 1 or 2 GB depending on the drive size. However, the pictures of the drive suggest it might be a Phison-based reference design. The AI TOP 100E SSDs are standard PCIe 4.0 drives, so the sequential read speed tops out at 7,200 MB/s, with the write speed for the 1 TB SKU being up to 6,500 MB/s and the 2 TB SKU slightly behind at 5,900 MB/s. No other performance figures were provided. The drives are said to draw up to 11 Watts in use, which seems very high for PCIe 4.0 drives. No word on pricing or availability as yet.

SK hynix Showcases Its Next-Gen Solutions at Computex 2024

SK hynix presented its leading AI memory solutions at COMPUTEX Taipei 2024 from June 4-7. As one of Asia's premier IT shows, COMPUTEX Taipei 2024 welcomed around 1,500 global participants including tech companies, venture capitalists, and accelerators under the theme "Connecting AI". Making its debut at the event, SK hynix underlined its position as a first mover and leading AI memory provider through its lineup of next-generation products.

"Connecting AI" With the Industry's Finest AI Memory Solutions
Themed "Memory, The Power of AI," SK hynix's booth featured its advanced AI server solutions, groundbreaking technologies for on-device AI PCs, and outstanding consumer SSD products. HBM3E, the fifth generation of HBM1, was among the AI server solutions on display. Offering industry-leading data processing speeds of 1.18 terabytes (TB) per second, vast capacity, and advanced heat dissipation capability, HBM3E is optimized to meet the requirements of AI servers and other applications. Another technology which has become crucial for AI servers is CXL as it can increase system bandwidth and processing capacity. SK hynix highlighted the strength of its CXL portfolio by presenting its CXL Memory Module-DDR5 (CMM-DDR5), which significantly expands system bandwidth and capacity compared to systems only equipped with DDR5. Other AI server solutions on display included the server DRAM products DDR5 RDIMM and MCR DIMM. In particular, SK hynix showcased its tall 128-gigabyte (GB) MCR DIMM for the first time at an exhibition.

Mnemonic Electronic Debuts at COMPUTEX 2024, Embracing the Era of High-Capacity SSDs

On June 4th, COMPUTEX 2024 was successfully held at the Taipei Nangang Exhibition Center. Mnemonic Electronic Co., Ltd., the Taiwanese subsidiary of Longsys, showcased industry-leading high-capacity SSDs under the theme "Embracing the Era of High-Capacity SSDs." The products on display included the Mnemonic MS90 8TB SATA SSD, FORESEE ORCA 4836 series enterprise NVMe SSDs, FORESEE XP2300 PCIe Gen 4 SSDs, and rich product lines comprising embedded storage, memory modules, memory cards, and more. The company offers reliable industrial-grade, automotive-grade, and enterprise-grade storage products, providing high-capacity solutions for global users.

High-Capacity SSDs
For SSDs, Mnemonic Electronic presented products in various form factors and interfaces, including PCIe M.2, PCIe BGA, SATA M.2, and SATA 2.5-inch. The Mnemonic MS90 8 TB SATA SSD supports the SATA interface with a speed of up to 6 Gb/s (Gen 3) and is backward compatible with Gen 1 and Gen 2. It also supports various SATA low-power states (Partial/Sleep/Device Sleep) and can be used for nearline HDD replacement, surveillance, and high-speed rail systems.

Patriot Shows 14 GB/s PCIe 5.0 NVMe SSD and 11,500 MT/s DDR5 Memory at Computex 2024

At Computex 2024, we paid a visit to the Patriot booth and found a few new product announcements from the company. From record-shattering DDR5 memory speeds to next-generation Gen 5 SSDs, the company has prepared it all. Headlining the showcase is the Viper Xtreme 5 DDR5 memory series, achieving regular speeds of up to 8,200 MT/s and an astonishing 11,500 MT/s when overclocked. Patriot is also launching something for professional workstations with its overclockable ECC RDIMM modules, offering error correction, larger capacities, and the ability to exceed industry specifications through overclocking.
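
As a rough translation of those transfer rates into bandwidth, assuming a standard 64-bit (8-byte) DDR5 module interface:

# Translating the quoted transfer rates into theoretical peak bandwidth,
# assuming a standard 64-bit (8-byte) DDR5 module interface.

def peak_bandwidth_gb_s(data_rate_mt_s: int, bus_width_bytes: int = 8) -> float:
    return data_rate_mt_s * bus_width_bytes / 1000

for rate in (8200, 11500):
    print(f"DDR5-{rate}: ~{peak_bandwidth_gb_s(rate):.1f} GB/s per module")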

Biwin Brings New PC Memory and Flash Storage Lineup Under its Own Brand to Computex

Biwin is a licensee that builds SSDs, PC memory, and flash storage products for some of the biggest PC brands out there, including Acer and HP. This year, the company decided to launch a whole product stack under its own brand, so it can sell to the retail channel directly. We also spotted several licensed products under the coveted Acer Predator brand. Let's start our tour with them: the company showed us an Acer Predator Hera memory kit with 48 GB (2x 24 GB), which does an impressive DDR5-8000 at 40-48-48-128 and 1.35 V. The kit includes a DDR5-8000 @ 1.35 V XMP profile. Each module features a mirror-finish metal heat spreader and an RGB illuminated top. There are also 32 GB (2x 16 GB) kits in the series that go up to DDR5-8200.

Biwin's own first-party brand isn't too far behind the Predator Hera. The company showed us the Biwin DW100 RGB (Editor's note: the Wookong branding is local to Asia and will not be part of international branding), a high-end memory series with kit capacities ranging from 32 GB (2x 16 GB) to 64 GB (2x 32 GB), speeds ranging from DDR5-6000 to DDR5-8200, and vDIMM going up to 1.45 V on the top-spec kit. There's also the DX100, which trades a little performance for a more elaborate RGB LED setup; it comes in capacities up to 64 GB (2x 32 GB) and speeds of up to DDR5-8000. The HX100 is the mid-range series: it lacks any lighting, capacities range up to 64 GB, and speeds reach DDR5-7200, with timings that aren't as tight as those on the DX100. Biwin also showed an LPCAMM2 module with capacities of up to 64 GB and speeds of up to 9,600 MT/s. Most of Biwin's DRAM products launch in July 2024.

Samsung Strike Has No Immediate Impact on Memory Production, with No Shipment Shortages

The Samsung Electronics Union is reportedly planning to strike on June 7. TrendForce reports that this strike will not impact DRAM and NAND Flash production, nor will it cause any shipment shortages. Additionally, spot prices for DRAM and NAND Flash had been declining prior to the strike announcement, and there has been no change in this downtrend since the announcement.

Samsung's global share of DRAM and NAND Flash output in 2023 was 46.8% and 32.4%, respectively. Even though its South Korean plants account for all of its 46.8% share of global DRAM production and about 17.8% of global NAND Flash production, TrendForce identifies four reasons why this strike will not impact production. Firstly, the strike involves employees at Samsung's headquarters in Seocho, Seoul, where union participation is higher, but these employees do not directly engage in production. Secondly, the strike is planned for only one day, which falls within the flexible scheduling range for production.