News Posts matching #DRAM


Asgard Intros Thor DDR5-9600 48 GB CUDIMM Memory

Asgard has joined V-Color in unveiling its first high-frequency DDR5 memory, the Thor DDR5-9600. V-Color recently launched its DDR5-9200 memory, and Asgard has one-upped it with DDR5-9600. Both are CUDIMMs, which are UDIMMs that feature a client clock driver (CKD) component. These modules use the highest bins of DDR5-9600 chips from SK Hynix. The advertised DDR5-9600 speed is achieved with timings of 44-56-56-136 and a scorching DRAM voltage of 1.50 V. The modules include an XMP 3.0 profile to enable these settings on Intel platforms. With a single-rank configuration and 24 GB module density, the dual-channel kit gives you 48 GB. Asgard revealed that it is working on a DDR5-10000 kit. The company didn't announce US availability.
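For context, the advertised speed and primary CAS latency translate into theoretical figures as in the quick sketch below. This assumes the standard 64-bit data path per DDR5 UDIMM and both modules of the kit installed; these are paper numbers, not benchmark results.

```python
# Rough sketch: theoretical numbers for the Thor DDR5-9600 kit described above.
# Assumes a 64-bit data path per UDIMM (two 32-bit subchannels) and a two-module kit;
# real-world throughput and latency will differ.

transfer_rate_mts = 9600        # MT/s (DDR5-9600)
cas_latency_clk   = 44          # CL from the 44-56-56-136 primary timings
bus_width_bits    = 64 * 2      # two UDIMMs at 64 bits each

peak_bandwidth_gbs = transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9
cas_latency_ns     = cas_latency_clk * (2000 / transfer_rate_mts)   # clock period in ns = 2000 / (MT/s)

print(f"Theoretical peak bandwidth: {peak_bandwidth_gbs:.1f} GB/s")   # ~153.6 GB/s
print(f"First-word CAS latency:     {cas_latency_ns:.2f} ns")         # ~9.17 ns
```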

SK hynix Presents Upgraded AiMX Solution at AI Hardware and Edge AI Summit 2024

SK hynix unveiled an enhanced Accelerator-in-Memory based Accelerator (AiMX) card at the AI Hardware & Edge AI Summit 2024 held September 9-12 in San Jose, California. Organized annually by Kisaco Research, the summit brings together representatives from the AI and machine learning ecosystem to share industry breakthroughs and developments. This year's event focused on exploring cost and energy efficiency across the entire technology stack. Marking its fourth appearance at the summit, SK hynix highlighted how its AiM products can boost AI performance across data centers and edge devices.

Booth Highlights: Meet the Upgraded AiMX
In the AI era, high-performance memory products are vital for the smooth operation of LLMs. However, as these LLMs are trained on increasingly larger datasets and continue to expand, there is a growing need for more efficient solutions. SK hynix addresses this demand with its PIM product AiMX, an AI accelerator card that combines multiple GDDR6-AiMs to provide high bandwidth and outstanding energy efficiency. At the AI Hardware & Edge AI Summit 2024, SK hynix presented its updated 32 GB AiMX prototype, which offers double the capacity of the original card featured at last year's event. To highlight the new AiMX's advanced processing capabilities in a multi-batch environment, SK hynix held a demonstration of the prototype card with the Llama 3 70B model, an open-source LLM. In particular, the demonstration underlined AiMX's ability to serve as a highly effective attention accelerator in data centers.

SK Hynix Develops PEB110 E1.S SSD for Data Centers

SK hynix Inc. announced today that it has developed PEB110 E1.S (PEB110), a high-performance solid-state drive (SSD) for data centers. With the advent of the AI era, customer demand for high-performance NAND solutions such as SSDs for data centers, as well as ultra-fast DRAM chips including high bandwidth memory (HBM), is growing. In line with this trend, the company has developed and introduced a new product with improved data processing speed and power efficiency by applying the fifth-generation (Gen 5) PCIe specifications.

SK hynix expects to meet diverse customer needs with a more robust SSD portfolio, as the introduction of PEB110 follows the successful mass production of the PS1010. The company is currently in the qualification process with a global data center customer and plans to begin mass production of the product in the second quarter of next year, pending qualification. PCIe Gen 5, applied to the new product, provides twice the bandwidth of the fourth generation (Gen 4), enabling PEB110 to achieve data transfer rates of up to 32 gigatransfers per second (GT/s). This allows PEB110 to double the performance of the previous generation and improve power efficiency by more than 30%.
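As a rough illustration of the Gen 4-to-Gen 5 jump described above, the sketch below computes raw PCIe link bandwidth. It assumes a x4 link width, which is typical for E1.S drives but not confirmed in the announcement, and accounts only for 128b/130b line encoding, ignoring other protocol overheads.

```python
# Minimal sketch of PCIe link bandwidth for a Gen 5 E1.S SSD such as the PEB110.
# Assumes a x4 link (typical for E1.S, not stated in the announcement) and
# 128b/130b encoding; packet/protocol overhead is ignored.

def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    encoding_efficiency = 128 / 130                       # 128b/130b used by PCIe Gen 3/4/5
    return gt_per_s * lanes * encoding_efficiency / 8     # GT/s per lane -> GB/s total

gen4 = pcie_bandwidth_gbs(16.0, 4)   # ~7.9 GB/s
gen5 = pcie_bandwidth_gbs(32.0, 4)   # ~15.8 GB/s
print(f"Gen 4 x4: {gen4:.2f} GB/s, Gen 5 x4: {gen5:.2f} GB/s ({gen5 / gen4:.1f}x)")
```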

Micron Announces 12-high HBM3E Memory, Bringing 36 GB Capacity and 1.2 TB/s Bandwidth

As AI workloads continue to evolve and expand, memory bandwidth and capacity are increasingly critical for system performance. The latest GPUs in the industry need the highest performance high bandwidth memory (HBM), significant memory capacity, as well as improved power efficiency. Micron is at the forefront of memory innovation to meet these needs and is now shipping production-capable HBM3E 12-high to key industry partners for qualification across the AI ecosystem.

Micron's industry-leading HBM3E 12-high 36 GB delivers significantly lower power consumption than our competitors' 8-high 24 GB offerings, despite having 50% more DRAM capacity in the package
Micron HBM3E 12-high boasts an impressive 36 GB capacity, a 50% increase over current HBM3E 8-high offerings, allowing larger AI models like Llama 2 with 70 billion parameters to run on a single processor. This capacity increase allows faster time to insight by avoiding CPU offload and GPU-GPU communication delays. Micron HBM3E 12-high 36 GB delivers significantly lower power consumption than competitors' HBM3E 8-high 24 GB solutions. Micron HBM3E 12-high 36 GB offers more than 1.2 terabytes per second (TB/s) of memory bandwidth at a pin speed greater than 9.2 gigabits per second (Gb/s). These combined advantages allow Micron HBM3E to offer maximum throughput with the lowest power consumption, ensuring optimal outcomes for power-hungry data centers. Additionally, Micron HBM3E 12-high incorporates fully programmable MBIST that can run system-representative traffic at full spec speed, providing improved test coverage for expedited validation, enabling faster time to market, and enhancing system reliability.
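The bandwidth figure above follows directly from the quoted pin speed; a minimal sketch, assuming the standard 1024-bit data interface per HBM3E stack:

```python
# Sanity check of the ">1.2 TB/s" per-stack figure from the quoted pin speed.
# Assumes the standard 1024-bit HBM3E data interface per stack.

pin_speed_gbps  = 9.2     # Gb/s per pin ("greater than 9.2 Gb/s" per Micron)
interface_width = 1024    # data bits per HBM3E stack

bandwidth_tbs = pin_speed_gbps * interface_width / 8 / 1000
print(f"Per-stack bandwidth at 9.2 Gb/s: {bandwidth_tbs:.2f} TB/s")
# ~1.18 TB/s at exactly 9.2 Gb/s; pin speeds slightly above that clear the 1.2 TB/s mark.
```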

Spot Market for Memory Struggles in First Half of 2024; Price Challenges Loom in Second Half

TrendForce reports that memory module makers have been aggressively increasing their DRAM inventories since 3Q23, with inventory levels rising to 11-17 weeks by 2Q24. However, demand for consumer electronics has not rebounded as expected. For instance, smartphone inventories in China have reached excessive levels, and notebook purchases have been delayed as consumers await new AI-powered PCs, leading to continued market contraction.

This has led to a weakening in spot prices for memory products primarily used in consumer electronics, with Q2 prices dropping over 30% compared to Q1. Although spot prices remained disconnected from contract prices through August, this divergence may signal potential future trends for contract pricing.

SK Hynix Develops Industry's First 1c (10nm-class) DDR5 Memory

SK hynix announced today that it has developed the industry's first 16 Gb DDR5 built using its 1c node, the sixth generation of the 10 nm-class process. The achievement marks the start of extreme scaling toward the 10 nm level in memory process technology. Shrinking 10 nm-class DRAM has become more difficult with each generation, but SK hynix says it is the first in the industry to overcome these technological limitations by raising the level of design completeness, building on its industry-leading 1b node, the fifth generation of the 10 nm-class process.

SK hynix said it will be ready for mass production of the 1c DDR5 within the year and will begin volume shipments next year. To reduce potential errors when advancing the process and to carry over the strengths of the 1b, widely regarded as its best-performing DRAM, as efficiently as possible, the company extended the 1b DRAM platform for the development of 1c. The new product also improves cost competitiveness compared with the previous generation by adopting a new material in certain extreme ultraviolet (EUV) process steps while optimizing the overall EUV application process. SK hynix also enhanced productivity by more than 30% through technological innovation in design.

Micron is Buying More Production Plants in Taiwan to Expand HBM Memory Production

Micron has been on a spending spree in Taiwan, where the company has been looking for new facilities. Micron has agreed to buy no fewer than three LCD plants from display maker AUO, located in the central Taiwanese city of Taichung, and is set to pay NT$8.1 billion (~US$253.3 million). Initially, Micron was interested in buying another plant in Tainan from Innolux, but was turned down, so it turned to AUO for the purchases. Earlier this year, TSMC spent NT$17 billion (~US$531.6 million) to buy a similar facility from Innolux, but it seems that Innolux wasn't willing to part with any more facilities this year.

The three AUO plants are said to have produced LCD colour filters, and two of them ceased production earlier this month. The plant that is still in operation will reportedly be leased back by AUO, which will continue producing colour filters there. The larger plant measures 146,033 square metres, with the smaller one measuring 32,500 square metres. As for Micron's plans, not much is known at this point, but the company has announced that it is planning to use at least some of the space for front-end wafer testing, and that the new plants will support its current and upcoming DRAM production fabs in Taichung and Taoyuan, which the company is currently expanding. Market sources in Taiwan are quoted as saying that the focus will be on HBM memory, due to high demand from various AI products on the market, not least from NVIDIA. The deal is expected to be finalised by the end of the year.

JEDEC Releases New Standard for LPDDR5/5X Serial Presence Detect (SPD) Contents

JEDEC Solid State Technology Association, the global leader in standards development for the microelectronics industry, today announced the publication of the JESD406-5 LPDDR5/5X Serial Presence Detect (SPD) Contents V1.0, consistent with the updated contents of JESD401-5B DDR5 DIMM Label and JESD318 DDR5/LPDDR5 Compression Attached Memory Module (CAMM2) Common Standard.

JESD406-5 documents the contents of the SPD non-volatile configuration device included on all JEDEC standard memory modules using LPDDR5/5X SDRAMs, including the CAMM2 standard designs outlined in JESD318. The JESD401-5B standard defines the content of labels for standard memory modules built to the other two standards, assisting end users in selecting compatible modules for their applications.

Samsung to Install High-NA EUV Machines Ahead of TSMC in Q4 2024 or Q1 2025

Samsung Electronics is set to make a significant leap in semiconductor manufacturing technology with the introduction of its first High-NA 0.55 EUV lithography tool. The company plans to install the ASML Twinscan EXE:5000 system at its Hwaseong campus between Q4 2024 and Q1 2025, marking a crucial step in developing next-generation process technologies for logic and DRAM production. This move positions Samsung about a year behind Intel but ahead of rivals TSMC and SK Hynix in adopting High-NA EUV technology. The system is expected to be operational by mid-2025, primarily for research and development purposes. Samsung is not just focusing on the lithography equipment itself but is building a comprehensive ecosystem around High-NA EUV technology.

The company is collaborating with several key partners like Lasertec (developing inspection equipment for High-NA photomasks), JSR (working on advanced photoresists), Tokyo Electron (enhancing etching machines), and Synopsys (shifting to curvilinear patterns on photomasks for improved circuit precision). The High-NA EUV technology promises significant advancements in chip manufacturing. With an 8 nm resolution capability, it could make transistors about 1.7 times smaller and increase transistor density by nearly three times compared to current Low-NA EUV systems. However, the transition to High-NA EUV comes with challenges. The tools are more expensive, costing up to $380 million each, and have a smaller imaging field. Their larger size also requires chipmakers to reconsider fab layouts. Despite these hurdles, Samsung aims for commercial implementation of High-NA EUV by 2027.
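The density claim above is essentially area scaling; a minimal sketch showing that a roughly 1.7x linear shrink in both dimensions works out to the "nearly three times" density figure:

```python
# Area scaling behind the "~1.7x smaller transistors, nearly 3x density" claim above:
# shrinking both dimensions by 1.7x yields about 1.7^2 more devices per unit area.

linear_shrink = 1.7
density_gain = linear_shrink ** 2
print(f"Approximate density gain: {density_gain:.2f}x")   # ~2.89x
```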

Imec Demonstrates Logic and DRAM Structures Using High NA EUV Lithography

Imec, a world-leading research and innovation hub in nanoelectronics and digital technologies, presents patterned structures obtained after exposure with the 0.55 NA EUV scanner in the joint ASML-imec High NA EUV Lithography Lab in Veldhoven, the Netherlands. Random logic structures down to 9.5 nm (19 nm pitch), random vias with 30 nm center-to-center distance, 2D features at 22 nm pitch, and a DRAM-specific layout at a 32 nm pitch were printed after single exposure, using materials and baseline processes that were optimized for High NA EUV by imec and its partners in the framework of imec's Advanced Patterning Program. With these results, imec confirms the readiness of the ecosystem to enable single-exposure, high-resolution High NA EUV lithography.

Following the recent opening of the joint ASML-imec High NA EUV Lithography Lab in Veldhoven, the Netherlands, customers now have access to the TWINSCAN EXE:5000 High NA EUV scanner to develop private High NA EUV use cases leveraging the customer's own design rules and layouts.

Samsung Electronics Begins Mass Production of Industry's Thinnest LPDDR5X DRAM Packages

Samsung Electronics, the world leader in advanced memory technology, today announced it has begun mass production for the industry's thinnest 12 nanometer (nm)-class, 12-gigabyte (GB) and 16 GB LPDDR5X DRAM packages, solidifying its leadership in the low-power DRAM market. Leveraging its extensive expertise in chip packaging, Samsung is able to deliver ultra-slim LPDDR5X DRAM packages that can create additional space within mobile devices, facilitating better airflow. This supports easier thermal control, a factor that is becoming increasingly critical especially for high-performance applications with advanced features such as on-device AI.

"Samsung's LPDDR5X DRAM sets a new standard for high-performance on-device AI solutions, offering not only superior LPDDR performance but also advanced thermal management in an ultra-compact package," said YongCheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. "We are committed to continuous innovation through close collaboration with our customers, delivering solutions that meet the future needs of the low-power DRAM market."

addlink Unveils NAS D60 and D20 SSDs Optimized for Flash-based NAS

addlink Technology Co., Ltd., a global leader in high-performance storage solutions, proudly announces the launch of its groundbreaking new SSDs designed specifically for NAS storage systems: the NAS D60 and D20 SSDs. These cutting-edge SSDs are poised to set new standards in speed, reliability, and capacity, transforming the NAS storage landscape.

addlink's NAS D60 and D20 SSDs exemplify the company's dedication to innovation and quality. These products are meticulously designed to meet the rigorous demands of modern NAS users, providing reliable, high-performance storage solutions that enhance productivity and data management. Whether for home use or enterprise-level applications, the NAS D60 and D20 offer unmatched performance, endurance, and value.

Samsung Electronics Announces Results for Second Quarter of 2024

Samsung Electronics today reported financial results for the second quarter ended June 30, 2024. The Company posted KRW 74.07 trillion in consolidated revenue and operating profit of KRW 10.44 trillion as favorable memory market conditions drove higher average sales price (ASP), while robust sales of OLED panels also contributed to the results.

Memory Market Continues To Recover; Solid Second Half Outlook Centered on Server Demand
The DS Division posted KRW 28.56 trillion in consolidated revenue and KRW 6.45 trillion in operating profit for the second quarter. Driven by strong demand for HBM as well as conventional DRAM and server SSDs, the memory market as a whole continued its recovery. This increased demand is a result of the continued AI investments by cloud service providers and growing demand for AI from businesses for their on-premise servers.

Marvell Introduces Breakthrough Structera CXL Product Line to Address Server Memory Bandwidth and Capacity Challenges in Cloud Data Centers

Marvell Technology, Inc., a leader in data infrastructure semiconductor solutions, today launched the Marvell Structera product line of Compute Express Link (CXL) devices that enable cloud data center operators to overcome memory performance and scaling challenges in general-purpose servers.

To address memory-intensive applications, data center operators add extra servers to get higher memory bandwidth and higher memory capacity. The compute capabilities from the added processors are typically not utilized for these applications, making the servers inefficient from cost and power perspectives. The CXL industry standard addresses this challenge by enabling new architectures that can efficiently add memory to general-purpose servers.

Micron Introduces 9550 NVMe Data Center SSD

Micron Technology, Inc., today announced availability of the Micron 9550 NVMe SSD - the world's fastest data center SSD and industry leader in AI workload performance and power efficiency. The Micron 9550 SSD showcases Micron's deep expertise and innovation by integrating its own controller, NAND, DRAM and firmware into one world-class product. This integrated solution enables class-leading performance, power efficiency and security features for data center operators.

The Micron 9550 SSD delivers best-in-class performance with 14.0 GB/s sequential reads and 10.0 GB/s sequential writes to provide up to 67% better performance over similar competitive SSDs and enables industry-leading performance for demanding workloads such as AI. In addition, its random reads of 3,300 KIOPS are up to 35% better and random writes of 400 KIOPS are up to 33% better than competitive offerings.
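To relate the random-IOPS figures above to throughput, here is a small sketch; it assumes the customary 4 KiB block size for random I/O specifications, which the announcement does not state explicitly.

```python
# Rough conversion of the 9550's quoted random IOPS into throughput.
# Assumes 4 KiB random I/O blocks (customary for data center SSD specs,
# not stated explicitly in the announcement).

read_iops   = 3_300_000   # 3,300 KIOPS random reads
write_iops  =   400_000   # 400 KIOPS random writes
block_bytes = 4 * 1024

print(f"Random read throughput:  {read_iops  * block_bytes / 1e9:.1f} GB/s")  # ~13.5 GB/s
print(f"Random write throughput: {write_iops * block_bytes / 1e9:.1f} GB/s")  # ~1.6 GB/s
```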

Memory Industry Revenue Expected to Reach Record High in 2025 Due to Increasing Average Prices and the Rise of HBM and QLC

TrendForce's latest report on the memory industry reveals that DRAM and NAND Flash revenues are expected to see significant increases of 75% and 77%, respectively, in 2024, driven by increased bit demand, an improved supply-demand structure, and the rise of high-value products like HBM.

Furthermore, industry revenues are projected to continue growing in 2025, with DRAM expected to increase by 51% and NAND Flash by 29%, reaching record highs. This growth is anticipated to revive capital expenditures and boost demand for upstream raw materials, although it will also increase cost pressure for memory buyers.

Micron Technology Unveils MRDIMMs to Scale Up Memory Densities on Servers

Micron Technology, Inc., today announced it is now sampling its multiplexed rank dual inline memory modules (MRDIMMs). The MRDIMMs will enable Micron customers to run increasingly demanding workloads and obtain maximum value out of their compute infrastructure. For applications requiring more than 128 GB of memory per DIMM slot, Micron MRDIMMs outperform current TSV RDIMMs by delivering the highest bandwidth and largest capacity with the lowest latency and improved performance per watt, accelerating memory-intensive virtualized multi-tenant, HPC, and AI data center workloads. The new memory offering is the first generation in the Micron MRDIMM family and will be compatible with Intel Xeon 6 processors.

"Micron's latest innovative main memory solution, MRDIMM, delivers the much-needed bandwidth and capacity at lower latency to scale AI inference and HPC applications on next-generation server platforms," said Praveen Vaidyanathan, vice president and general manager of Micron's Compute Products Group. "MRDIMMs significantly lower the amount of energy used per task while offering the same reliability, availability and serviceability capabilities and interface as RDIMMs, thus providing customers a flexible solution that scales performance. Micron's close industry collaborations ensure seamless integration into existing server infrastructures and smooth transitions to future compute platforms."

Samsung Completes Validation of Industry's Fastest LPDDR5X for Use With MediaTek's Flagship Mobile Platform

Samsung Electronics, the world leader in advanced memory technology, today announced it has successfully completed verification of the industry's fastest 10.7 gigabit-per-second (Gbps) Low Power Double Data Rate 5X (LPDDR5X) DRAM for use on MediaTek's next-generation Dimensity platform.

The 10.7 Gbps operation speed verification was carried out using Samsung's LPDDR5X 16-gigabyte (GB) package on MediaTek's upcoming flagship Dimensity 9400 System on Chip (SoC), scheduled to be released in the second half of this year. The two companies have closely collaborated to complete the verification within just three months.
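For reference, the headline pin speed maps to per-package bandwidth as in the sketch below, which assumes a 64-bit package interface (typical for flagship-phone LPDDR5X, though the bus width is not stated in the announcement).

```python
# Peak-bandwidth math for the 10.7 Gbps LPDDR5X package described above.
# Assumes a 64-bit package interface, typical for flagship smartphones;
# the announcement does not state the bus width.

pin_rate_gbps  = 10.7
bus_width_bits = 64

peak_bw_gbs = pin_rate_gbps * bus_width_bits / 8
print(f"Peak package bandwidth: {peak_bw_gbs:.1f} GB/s")   # ~85.6 GB/s
```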

Nanya and Winbond Boost Memory Production Amid Rising Demand and Prices

As memory prices and volumes increase, manufacturers Nanya and Winbond have ceased the production cuts they implemented last year, with production now back to normal levels. Market research agencies and supply chain analysts indicate that memory shipments are expected to continue recovering in Q3 2024. Currently, memory factories are operating at capacity utilization rates of 90% to full capacity, significantly higher than the 60% to 70% utilization of wafer foundries running mature processes. Last year, Nanya cut its production volume by up to 20%. This year, production has gradually increased, with utilization reaching 70% to over 80% in the second quarter, and it has now returned to normal levels.

Nanya anticipates that DRAM market conditions and prices will improve quarter by quarter, with the overall industry trending positively and potentially turning losses into profits in the third quarter. Nanya announced yesterday that its consolidated revenue for June was NT$3.363 billion, marking a monthly increase of 0.35% and an annual increase of 36.83%, setting a new high for the year. Cumulative consolidated revenue for the first half of the year was NT$19.424 billion, a 44.4% increase compared to the same period last year. Nanya will hold a press conference on July 10 to announce its second-quarter financial results and operating outlook.

Panmnesia Uses CXL Protocol to Expand GPU Memory with Add-in DRAM Card or Even SSD

South Korean startup Panmnesia has unveiled an interesting solution to address the memory limitations of modern GPUs. The company has developed a low-latency Compute Express Link (CXL) IP that could help expand GPU memory with an external add-in card. Current GPU-accelerated applications in AI and HPC are constrained by the fixed amount of memory built into GPUs. With data sizes growing by 3x yearly, GPU networks must keep getting larger just to fit the application in local memory, which benefits latency and token generation. Panmnesia's proposed fix leverages the CXL protocol to expand GPU memory capacity using PCIe-connected DRAM or even SSDs. The company has overcome significant technical hurdles, including the absence of CXL logic fabric in GPUs and the limitations of existing unified virtual memory (UVM) systems.

At the heart of Panmnesia's solution is a CXL 3.1-compliant root complex with multiple root ports and a host bridge featuring a host-managed device memory (HDM) decoder. This system effectively tricks the GPU's memory subsystem into treating PCIe-connected memory as native system memory. Extensive testing has demonstrated impressive results: Panmnesia's CXL solution, CXL-Opt, achieved two-digit-nanosecond round-trip latency, significantly outperforming both UVM and earlier CXL prototypes. In GPU kernel execution tests, CXL-Opt showed execution times up to 3.22 times faster than UVM. Older CXL memory extenders recorded around 250 nanoseconds of round-trip latency, while CXL-Opt potentially achieves less than 80 nanoseconds. As with CXL in general, the usual concern is that memory pools add latency and degrade performance, and the extenders add cost as well. Still, Panmnesia's CXL-Opt could find a use case, and we are waiting to see if anyone adopts it in their infrastructure. Panmnesia has also shared benchmarks and an overview of the CXL-Opt architecture.
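A small sketch comparing the round-trip latencies quoted above on a per-access basis; this is a simplification that ignores caching, prefetching, and the overlapping of outstanding requests.

```python
# Per-access comparison of the round-trip latencies quoted above.
# Simplified: ignores caching, prefetching, and overlapping of outstanding requests.

latencies_ns = {
    "earlier CXL memory extenders": 250,   # ~250 ns round trip
    "Panmnesia CXL-Opt (claimed)":   80,   # "less than 80 ns"
}
baseline = latencies_ns["earlier CXL memory extenders"]
for name, ns in latencies_ns.items():
    print(f"{name}: {ns} ns ({baseline / ns:.1f}x vs. the 250 ns baseline)")
```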

SK Hynix to Invest $75 Billion by 2028 in Memory Solutions for AI

South Korean giant SK Group has unveiled plans for substantial investments in AI and semiconductor technologies worth almost $75 billion. SK Group subsidiary SK Hynix will lead this initiative with a staggering 103 trillion won ($74.6 billion) investment over the next three years, to be realized by 2028. This commitment is in addition to the ongoing construction of a $90 billion mega fab complex in Gyeonggi Province for cutting-edge memory production. SK Group has further pledged an additional $58 billion, bringing the total investment to a whopping $133 billion. This capital infusion aims to enhance the group's competitiveness in the AI value chain while funding operations across its 175 subsidiaries, including SK Hynix.

While specific details remain undisclosed, SK Group is reportedly exploring various options, including potential mergers and divestments. SK Group has signaled that business practices need change amid shifting geopolitical situations and the massive boost that AI is bringing to the overall economy. We may see more interesting products from SK Group in the coming years as it potentially enters new markets centered around AI. This strategic pivot comes after SK Hynix reported its first loss in a decade in 2022. However, the company has since shown signs of recovery, fueled by the surging demand for memory solutions for AI chips. The company currently has a 35% share of the global DRAM market and plans to have an even stronger presence in the coming years. The massive investment aligns with the South Korean government's recently announced $19 billion support package for the domestic semiconductor industry, which will be distributed across companies like SK Hynix and Samsung.

Micron Technology, Inc. Reports Results for the Third Quarter of Fiscal 2024

Micron Technology, Inc. (Nasdaq: MU) today announced results for its third quarter of fiscal 2024, which ended May 30, 2024.

Fiscal Q3 2024 highlights
  • Revenue of $6.81 billion versus $5.82 billion for the prior quarter and $3.75 billion for the same period last year
  • GAAP net income of $332 million, or $0.30 per diluted share
  • Non-GAAP net income of $702 million, or $0.62 per diluted share
  • Operating cash flow of $2.48 billion versus $1.22 billion for the prior quarter and $24 million for the same period last year
"Robust AI demand and strong execution enabled Micron to drive 17% sequential revenue growth, exceeding our guidance range in fiscal Q3," said Sanjay Mehrotra, President and CEO of Micron Technology. "We are gaining share in high-margin products like High Bandwidth Memory (HBM), and our data center SSD revenue hit a record high, demonstrating the strength of our AI product portfolio across DRAM and NAND. We are excited about the expanding AI-driven opportunities ahead, and are well positioned to deliver a substantial revenue record in fiscal 2025."

DRAM Prices Expected to Increase by 8-13% in Q3

TrendForce reports that a recovery in demand for general servers—coupled with an increased production share of HBM by DRAM suppliers—has led suppliers to maintain their stance on hiking prices. As a result, the ASP of DRAM in the third quarter is expected to continue rising, with an anticipated increase of 8-13%. The price of conventional DRAM is expected to rise by 5-10%, showing a slight contraction compared to the increase in the second quarter.

TrendForce notes that buyers were more conservative about restocking in the second quarter, and inventory levels on both the supplier and buyer sides did not show significant changes. Looking ahead to the third quarter, there is still room for inventory replenishment for smartphones and CSPs, and the peak production season is about to commence. Consequently, smartphones and servers are expected to drive an increase in memory shipments in the third quarter.

CSPs to Expand into Edge AI, Driving Average NB DRAM Capacity Growth by at Least 7% in 2025

TrendForce has observed that in 2024, major CSPs such as Microsoft, Google, Meta, and AWS will continue to be the primary buyers of high-end AI servers, which are crucial for LLM and AI modeling. After establishing significant AI training server infrastructure in 2024, these CSPs are expected to actively expand into edge AI in 2025. This expansion will include the development of smaller LLM models and the deployment of edge AI servers to facilitate AI applications across sectors such as manufacturing, finance, healthcare, and business.

Moreover, AI PCs or notebooks share a similar architecture to AI servers, offering substantial computational power and the ability to run smaller LLM and generative AI applications. These devices are anticipated to serve as the final bridge between cloud AI infrastructure and edge AI for small-scale training or inference applications.

Kingston Intros FURY Renegade RGB Limited Edition DDR5 Memory

Kingston today formally launched the FURY Renegade RGB Limited Edition DDR5 memory kits, which were shown at the company's Computex 2024 booth earlier this month. The modules' design involves a two-tone die-cast metal shroud over the aluminium heat-spreaders, crowned by silicone diffusers for the RGB LEDs. The modules have a 19-preset lighting controller, and you control the lighting using the first-party FURY CTRL software. Kingston says that the design of these modules is inspired by race cars.

The Kingston FURY Renegade RGB Limited Edition is available in only one density, 48 GB (2x 24 GB kit), and in only one speed variant, DDR5-8000, with timings of CL36-48-48 and a DRAM voltage of 1.45 V. The modules also include profiles for DDR5-7200 and DDR5-6400 with tighter timings. The modules pack an Intel XMP 3.0 SPD profile that enables the advertised speeds on Intel platforms. Kingston has extensively tested the modules on the latest Intel platforms, such as the 14th Gen Core "Raptor Lake Refresh," for compatibility with the advertised XMP speeds. The company didn't reveal pricing.