News Posts matching #LPCAMM


AMD "Strix Halo" Processor Boosts up to 5.35 GHz, Geekbenched

AMD's upcoming "Strix Halo" mobile processor, which features up to 16 "Zen 5" CPU cores and a powerful iGPU with 40 compute units, is beginning to surface in online benchmark databases. We've gone into the juicy technical bits of the processor in our older articles, but put simply, it is a powerful mobile processor meant to square off against the likes of the Apple M3 Pro and M3 Max. A chiplet-based processor, much like the upcoming "Granite Ridge" desktop processor and "Fire Range" mobile processor, "Strix Halo" features up to 16 full-sized "Zen 5" cores, as it uses up to two of the same "Eldora" CCDs as them, but wires them to a large I/O die that contains the oversized iGPU and an NPU, besides the memory controllers. The iGPU has 40 compute units (2,560 stream processors) and is based on the RDNA 3.5 graphics architecture, while the NPU is the same 50 TOPS-class unit carried over from "Strix Point."

A prototype HP laptop powered by a "Strix Halo" processor that uses a single 8-core "Zen 5" CCD was spied on the web. This chip has eight full-sized "Zen 5" cores that share a 32 MB L3 cache. The iGPU on the I/O die has its own 32 MB Infinity Cache memory that cushions memory transfers. In our older reports, we speculated about the memory interface of "Strix Halo." It turns out that the chip exclusively features a 256-bit wide LPDDR5X memory interface, double the bus width of "Strix Point." This is essentially what a "quad-channel DDR5" memory interface would be, and AMD is using a memory speed standard of at least LPDDR5X-8000. From the machine's point of view, this would be just a few hardwired LPDDR5X chips, or a pair of LPCAMM2 modules. Back to the benchmarks: we are shown a single-thread CPU score of 2099 to 2177 points, and a multithreaded score ranging from 5477 to 13993 points. The laptop was tested with an unknown version and distribution of Linux. The CPU cores are shown boosting up to 5.35 GHz.
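As a quick sanity check on those figures, here is the peak theoretical bandwidth a 256-bit LPDDR5X-8000 interface works out to; a back-of-the-envelope sketch, with the 128-bit "Strix Point" width implied by the "double the bus width" statement above:

```python
# Peak theoretical memory bandwidth: bus width in bytes times transfer rate.
def peak_bandwidth_gbs(bus_width_bits: int, rate_gts: float) -> float:
    """Bandwidth in GB/s for a given bus width (bits) and data rate (GT/s)."""
    return bus_width_bits / 8 * rate_gts

# "Strix Halo": 256-bit LPDDR5X-8000 (8.0 GT/s)
print(peak_bandwidth_gbs(256, 8.0))  # 256.0 GB/s
# "Strix Point": 128-bit at the same speed, so half the bandwidth
print(peak_bandwidth_gbs(128, 8.0))  # 128.0 GB/s
```

At 256 GB/s, the iGPU would have bandwidth in the neighborhood of discrete mid-range graphics cards, which helps explain the oversized 40 CU configuration.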

CSPs to Expand into Edge AI, Driving Average NB DRAM Capacity Growth by at Least 7% in 2025

TrendForce has observed that in 2024, major CSPs such as Microsoft, Google, Meta, and AWS will continue to be the primary buyers of high-end AI servers, which are crucial for LLM and AI modeling. After establishing significant AI training server infrastructure in 2024, these CSPs are expected to actively expand into edge AI in 2025. This expansion will include developing smaller LLM models and setting up edge AI servers to facilitate AI applications across various sectors, such as manufacturing, finance, healthcare, and business.

Moreover, AI PCs or notebooks share a similar architecture to AI servers, offering substantial computational power and the ability to run smaller LLM and generative AI applications. These devices are anticipated to serve as the final bridge between cloud AI infrastructure and edge AI for small-scale training or inference applications.

SK hynix Showcases Its New AI Memory Solutions at HPE Discover 2024

SK hynix has returned to Las Vegas to showcase its leading AI memory solutions at HPE Discover 2024, Hewlett Packard Enterprise's (HPE) annual technology conference. Held from June 17-20, HPE Discover 2024 features a packed schedule with more than 150 live demonstrations, as well as technical sessions, exhibitions, and more. This year, attendees can also benefit from three new curated programs on edge computing and networking, hybrid cloud technology, and AI. Under the slogan "Memory, The Power of AI," SK hynix is displaying its latest memory solutions at the event including those supplied to HPE. The company is also taking advantage of the numerous networking opportunities to strengthen its relationship with the host company and its other partners.

The World's Leading Memory Solutions Driving AI
SK hynix's booth at HPE Discover 2024 consists of three product sections and a demonstration zone which showcase the unprecedented capabilities of its AI memory solutions. The first section features the company's groundbreaking memory solutions for AI, including HBM solutions. In particular, the industry-leading HBM3E has emerged as a core product to meet the growing demands of AI systems due to its exceptional processing speed, capacity, and heat dissipation. A key solution from the company's CXL lineup, CXL Memory Module-DDR5 (CMM-DDR5), is also on display in this section. In the AI era where high performance and capacity are vital, CMM-DDR5 has gained attention for its ability to expand system bandwidth by up to 50% and capacity by up to 100% compared to systems only equipped with DDR5 DRAM.

SK hynix Showcases Its Next-Gen Solutions at Computex 2024

SK hynix presented its leading AI memory solutions at COMPUTEX Taipei 2024 from June 4-7. As one of Asia's premier IT shows, COMPUTEX Taipei 2024 welcomed around 1,500 global participants including tech companies, venture capitalists, and accelerators under the theme "Connecting AI". Making its debut at the event, SK hynix underlined its position as a first mover and leading AI memory provider through its lineup of next-generation products.

"Connecting AI" With the Industry's Finest AI Memory Solutions
Themed "Memory, The Power of AI," SK hynix's booth featured its advanced AI server solutions, groundbreaking technologies for on-device AI PCs, and outstanding consumer SSD products. HBM3E, the fifth generation of HBM, was among the AI server solutions on display. Offering industry-leading data processing speeds of 1.18 terabytes (TB) per second, vast capacity, and advanced heat dissipation capability, HBM3E is optimized to meet the requirements of AI servers and other applications. Another technology which has become crucial for AI servers is CXL, as it can increase system bandwidth and processing capacity. SK hynix highlighted the strength of its CXL portfolio by presenting its CXL Memory Module-DDR5 (CMM-DDR5), which significantly expands system bandwidth and capacity compared to systems only equipped with DDR5. Other AI server solutions on display included the server DRAM products DDR5 RDIMM and MCR DIMM. In particular, SK hynix showcased its tall 128-gigabyte (GB) MCR DIMM for the first time at an exhibition.
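The 1.18 TB/s figure lines up with HBM's standard 1024-bit per-stack interface. A minimal sketch, assuming a per-pin data rate of 9.2 Gb/s (the per-pin rate is not stated in the article):

```python
# HBM stacks expose a 1024-bit interface (per the JEDEC HBM standard).
# Assuming a per-pin data rate of 9.2 Gb/s, per-stack bandwidth comes out to:
bus_width_bits = 1024
pin_rate_gbps = 9.2                      # assumed per-pin rate
bandwidth_gbs = bus_width_bits / 8 * pin_rate_gbps
print(round(bandwidth_gbs / 1000, 2))    # ~1.18 TB/s
```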

XPG to Launch Handheld Gaming Device with LPCAMM2 Support

Handheld gaming devices are a dime a dozen these days, and more and more companies are joining the fray on an almost weekly basis. At Computex, XPG was showing its upcoming handheld gaming device—currently known as the NIA—and it has several interesting features that most of its competitors haven't mentioned so far. The potentially most interesting feature XPG has implemented is an LPCAMM2 module with support for up to 64 GB of LPDDR5X memory. XPG didn't list how much RAM the NIA will ship with, but 16 or 32 GB seem like the logical choices.

The device will be powered by AMD's Phoenix APU, but no further details were given. XPG has implemented support for foveated rendering, which the company claims is an exclusive feature. This is courtesy of a front-facing camera with eye tracking, but it's unclear how exactly it'll work, since it won't be exactly the same as in a VR headset. The NIA will ship with an XPG Gammix S55 SSD, which is an M.2 2230 PCIe 4.0 NVMe drive in capacities of up to 2 TB. XPG also claims that the NIA is built for a "circular computing product lifecycle," whatever that means, but we're guessing it has something to do with using recycled materials and being recyclable. The screen size of the 1080p, 120 Hz display wasn't mentioned, but the panel can be tilted for better ergonomics and is supposed to deliver up to 500 nits of brightness. The NIA also has a built-in kickstand.

Neo Forza Shows LPCAMM2 Modules and 14 GB/s SSD at Computex 2024

At Computex 2024, Neo Forza showcased its latest innovations in memory and storage products. One of the standout products unveiled was the CUDIMM, a DDR5 memory module tailored for advanced computing applications. With capacity options ranging from 8 GB to 48 GB and a data rate of 6400 MT/s, the CUDIMM promises to deliver high bandwidth and low latency, making it an ideal choice for gaming rigs, servers, and workstations.
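For reference, the peak bandwidth a standard 64-bit DDR5 module delivers at that data rate can be worked out directly; a back-of-the-envelope sketch:

```python
# A DDR5 DIMM has a 64-bit data path (organized as two 32-bit sub-channels).
# Peak bandwidth in MB/s = bytes per transfer * transfers per second (MT/s).
bus_width_bits = 64
rate_mts = 6400
bandwidth_mbs = bus_width_bits // 8 * rate_mts
print(bandwidth_mbs)  # 51200 MB/s, i.e. 51.2 GB/s per module
```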

Another highlight was the Thoth 5 Series, a state-of-the-art solid-state drive (SSD) lineup that prioritizes exceptional performance. Available in various form factors and capacities up to 7.68 TB, these SSDs boast read/write bandwidths of up to 14,000/8,800 MB/s, along with impressive random read and write IOPS. Designed for data centers and enterprise environments, the Thoth 5 Series combines speed, reliability, and endurance.

LPDDR6 LPCAMM2 Pictured and Detailed Courtesy of JEDEC

Yesterday we reported on DDR6 memory hitting new heights of performance, and it looks like LPDDR6 will follow suit, at least based on details in a JEDEC presentation. LPDDR6 will, just like LPDDR5, be available as solder-down memory, but it will also come in a new LPCAMM2 module. The bus speed of LPDDR5 on LPCAMM2 modules is expected to peak at 9.2 GT/s based on JEDEC specifications, but LPDDR6 will extend this to 14.4 GT/s, roughly a 50 percent increase. However, the fastest (and only) LPCAMM2 modules on the retail market today, which use LPDDR5X, come in at 7.5 GT/s, which suggests that launch speeds of LPDDR6 will end up quite far from the peak speeds.

There will be some other interesting changes to LPDDR6 CAMM2 modules, as there will be a move from 128 bits per module to 192 bits per module, and each channel will go from 32 bits to 48 bits. Part of the reason for this is that LPDDR6 is moving to a 24-bit channel width, consisting of two 12-bit sub-channels, as mentioned in yesterday's news post. This might seem odd at first, but in reality it's fairly simple: LPDDR6 will have native ECC (Error Correction Code) or EDC (Error Detection Code) support, though it's currently not entirely clear how this will be implemented at a system level. JEDEC is also looking at developing a screwless solution for CAMM2 and LPCAMM2 memory modules, but at the moment there's no clear solution in sight. We might also get to see LPDDR6 via LPCAMM2 modules on the desktop, although the presentation only mentions CAMM2 for the desktop, something we've already seen MSI working on.
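Put as numbers, the width and speed changes compound. A sketch using the figures above, kept in integer MB/s so the arithmetic stays exact:

```python
# Peak per-module LPCAMM2 bandwidth from bus width and data rate.
def module_bandwidth_mbs(width_bits: int, rate_mts: int) -> int:
    return width_bits // 8 * rate_mts

# LPDDR5 on LPCAMM2, at the 9.2 GT/s JEDEC peak mentioned above:
lpddr5_peak = module_bandwidth_mbs(128, 9200)    # 147200 MB/s (147.2 GB/s)
# LPDDR6 on LPCAMM2, 192-bit module at 14.4 GT/s:
lpddr6_peak = module_bandwidth_mbs(192, 14400)   # 345600 MB/s (345.6 GB/s)

# Channel layout of a 192-bit LPDDR6 module: 24-bit channels, each made of
# two 12-bit sub-channels, giving 192 / 24 = 8 channels per module.
channels_per_module = 192 // 24
print(lpddr5_peak, lpddr6_peak, channels_per_module)
```

The wider module and faster bus together more than double peak per-module bandwidth, well beyond the roughly 50 percent gain from the data rate alone.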

Mnemonic and Foresee Showcase Several New Enterprise SSD Models

During the COMPUTEX 2024 exhibition from June 4th to 7th, Mnemonic Electronic Co., Ltd. (hereinafter referred to as Mnemonic), Longsys's Taiwan subsidiary, will showcase a series of high-capacity SSD products under the theme "Embracing the Era of High-capacity SSDs," providing solutions for global users of high-capacity SSD products.

The lineup of high-capacity products presented by Mnemonic includes the ORCA 4836 series enterprise NVMe SSDs and the UNCIA 3836 series enterprise SATA SSDs. These products are equipped with the latest enterprise-grade 128-layer TLC NAND flash memory, offering high performance, low latency, adjustable power consumption, and high reliability storage solutions for enterprise-grade users such as servers, cloud computing, and edge computing, with a maximum capacity of up to 7.68 TB.

Micron Delivers Crucial LPCAMM2 with LPDDR5X Memory for the New AI-Ready Lenovo ThinkPad P1 Gen 7 Workstation

Micron Technology, Inc., today announced the availability of Crucial LPCAMM2, the disruptive next-generation laptop memory form factor that features LPDDR5X mobile memory to level up laptop performance for professionals and creators. Consuming up to 58% less active power and with a 64% space savings compared to DDR5 SODIMMs, LPCAMM2 delivers higher bandwidth and dual-channel support with a single module. LPCAMM2 is an ideal high-performance memory solution for handling AI PC and complex workloads and is compatible with the powerful and versatile Lenovo ThinkPad P1 Gen 7 mobile workstations.
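The "dual-channel support with a single module" claim follows from the form factor's width: one LPCAMM2 carries a 128-bit data path, matching a pair of 64-bit DDR5 SODIMMs. A sketch, assuming the 7500 MT/s speed grade of current retail LPCAMM2 parts (the data rate is not stated in this announcement):

```python
# One LPCAMM2 module (128-bit) matches two 64-bit DDR5 SODIMMs in bus width.
lpcamm2_width_bits = 128
sodimm_pair_width_bits = 2 * 64
assert lpcamm2_width_bits == sodimm_pair_width_bits

# Assumed 7500 MT/s, the speed of current retail LPCAMM2 parts:
bandwidth_mbs = lpcamm2_width_bits // 8 * 7500
print(bandwidth_mbs)  # 120000 MB/s, i.e. 120 GB/s from a single module
```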

"LPCAMM2 is a game-changer for mobile workstation users who want to enjoy the benefits of the latest mobile high performance memory technology without sacrificing superior performance, upgradeability, power efficiency or space," said Jonathan Weech, senior director of product marketing for Micron's Commercial Products Group. "With LPCAMM2, we are delivering a future-proof memory solution, enabling faster speeds and longer battery life to support demanding creative and AI workloads."

Lenovo Unveils Its New AI-Ready ThinkPad P1 Gen 7 Mobile Workstation

Today, Lenovo launched its latest mobile workstation offerings meticulously crafted to deliver the exceptional power and performance essential for handling complex workloads. Lenovo's ThinkPad P1 Gen 7, P16v i Gen 2, P16s i Gen 3, and P14s i Gen 5, with their cutting-edge AI technologies, are set to transform the way professionals engage with AI workflows. By collaborating with industry partners, Intel, NVIDIA, and Micron, Lenovo has introduced powerful and performance-packed AI PCs that meet the demands of modern-day AI-intensive tasks. The inclusion of the Intel Core Ultra processors with their integrated neural processing unit (NPU) and NVIDIA RTX Ada Generation GPUs signifies a major advancement in AI technology, boosting overall performance and productivity capabilities.

The latest ThinkPad P series mobile workstations powered by Intel Core Ultra processors and NVIDIA RTX Ada Generation GPUs deliver flexible, high-performance, and energy-efficient AI-ready PCs. The integrated NPU is dedicated to handling light, continuous AI tasks, while the NVIDIA GPU runs more demanding day-to-day AI processing. This combination enables smooth and reliable functioning of AI technologies, serving professionals engaged in diverse tasks ranging from 3D modeling and scene development to AI inferencing and training.

Samsung Electronics' Industry-First LPCAMM Ushers in Future of Memory Modules

Samsung Electronics, a world leader in advanced memory technology, today announced that it has developed the industry's first Low Power Compression Attached Memory Module (LPCAMM) form factor, which is expected to transform the DRAM market for PCs and laptops - and potentially even data centers. Samsung's groundbreaking 7.5 gigabit-per-second (Gbps) LPCAMM has completed system verification on Intel's platform. Historically, PCs and laptops have used LPDDR DRAM or DDR-based So-DIMMs. While LPDDR is compact, it's permanently attached to the motherboard, making it challenging to replace during repairs or upgrades. On the other hand, So-DIMMs can be attached or detached easily but have limitations in performance and other physical features.

LPCAMM overcomes the shortcomings of both LPDDR and So-DIMMs, addressing the increased demand for more efficient yet compact devices. Being a detachable module, LPCAMM offers enhanced flexibility for PC and laptop manufacturers during the production process. Compared to So-DIMM, LPCAMM occupies up to 60% less space on the motherboard. This allows more efficient use of devices' internal space while also improving performance by up to 50% and power efficiency by up to 70%. LPDDR's power-saving features have made it an attractive option for servers, since it could potentially improve total cost of operation (TCO) efficiency. However, using LPDDR can create operational difficulties such as the need to replace the entire motherboard when upgrading a server's DRAM specifications. LPCAMM offers a solution to these challenges, creating significant potential for it to become the solution of choice for future data centers and servers.