News Posts matching #CAMM


CSPs to Expand into Edge AI, Driving Average NB DRAM Capacity Growth by at Least 7% in 2025

TrendForce has observed that in 2024, major CSPs such as Microsoft, Google, Meta, and AWS will continue to be the primary buyers of high-end AI servers, which are crucial for LLM training and AI modeling. After establishing significant AI training server infrastructure in 2024, these CSPs are expected to actively expand into edge AI in 2025. This expansion will include the development of smaller LLMs and the deployment of edge AI servers to facilitate AI applications across various sectors, such as manufacturing, finance, healthcare, and business.

Moreover, AI PCs or notebooks share a similar architecture to AI servers, offering substantial computational power and the ability to run smaller LLM and generative AI applications. These devices are anticipated to serve as the final bridge between cloud AI infrastructure and edge AI for small-scale training or inference applications.

SK hynix Showcases Its New AI Memory Solutions at HPE Discover 2024

SK hynix has returned to Las Vegas to showcase its leading AI memory solutions at HPE Discover 2024, Hewlett Packard Enterprise's (HPE) annual technology conference. Held from June 17-20, HPE Discover 2024 features a packed schedule with more than 150 live demonstrations, as well as technical sessions, exhibitions, and more. This year, attendees can also benefit from three new curated programs on edge computing and networking, hybrid cloud technology, and AI. Under the slogan "Memory, The Power of AI," SK hynix is displaying its latest memory solutions at the event, including those supplied to HPE. The company is also taking advantage of the numerous networking opportunities to strengthen its relationship with the host company and its other partners.

The World's Leading Memory Solutions Driving AI
SK hynix's booth at HPE Discover 2024 consists of three product sections and a demonstration zone which showcase the unprecedented capabilities of its AI memory solutions. The first section features the company's groundbreaking memory solutions for AI, including HBM solutions. In particular, the industry-leading HBM3E has emerged as a core product to meet the growing demands of AI systems due to its exceptional processing speed, capacity, and heat dissipation. A key solution from the company's CXL lineup, CXL Memory Module-DDR5 (CMM-DDR5), is also on display in this section. In the AI era where high performance and capacity are vital, CMM-DDR5 has gained attention for its ability to expand system bandwidth by up to 50% and capacity by up to 100% compared to systems only equipped with DDR5 DRAM.
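
To put the quoted gains in perspective, here is a back-of-the-envelope sketch of how a CXL memory expander adds bandwidth and capacity on top of direct-attached DDR5. All baseline numbers (channel count, data rate, module and expander capacities) are hypothetical assumptions chosen for illustration, not SK hynix specifications.

```python
# Rough illustration of how a CXL memory expander (e.g. CMM-DDR5) can add
# bandwidth and capacity on top of direct-attached DDR5. All baseline numbers
# below are hypothetical assumptions, not SK hynix specifications.

DDR5_CHANNELS = 8            # assumed direct-attached DDR5 channels
DDR5_SPEED_MTPS = 4800       # assumed DDR5 data rate (MT/s)
DDR5_BUS_BYTES = 8           # 64-bit data bus per channel
DDR5_CAPACITY_GB = 8 * 64    # assumed 8x 64 GB RDIMMs

ddr5_bw_gbps = DDR5_CHANNELS * DDR5_SPEED_MTPS * DDR5_BUS_BYTES / 1000  # GB/s

# A PCIe 5.0 x8 CXL link offers roughly 32 GB/s per direction.
CXL_LINK_BW_GBPS = 32
CXL_DEVICES = 4              # assumed number of CMM-DDR5 expanders
CXL_CAPACITY_GB = 4 * 128    # assumed 128 GB per expander

total_bw = ddr5_bw_gbps + CXL_DEVICES * CXL_LINK_BW_GBPS
total_cap = DDR5_CAPACITY_GB + CXL_CAPACITY_GB

print(f"Baseline DDR5:   {ddr5_bw_gbps:.0f} GB/s, {DDR5_CAPACITY_GB} GB")
print(f"With CXL added:  {total_bw:.0f} GB/s "
      f"(+{100 * (total_bw / ddr5_bw_gbps - 1):.0f}%), "
      f"{total_cap} GB (+{100 * (total_cap / DDR5_CAPACITY_GB - 1):.0f}%)")
```

With these assumed figures, the expanders add roughly 40% more bandwidth and double the capacity; the exact gains depend on the platform and the number of CXL devices attached.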

SK hynix Showcases Its Next-Gen Solutions at Computex 2024

SK hynix presented its leading AI memory solutions at COMPUTEX Taipei 2024 from June 4-7. As one of Asia's premier IT shows, COMPUTEX Taipei 2024 welcomed around 1,500 global participants including tech companies, venture capitalists, and accelerators under the theme "Connecting AI". Making its debut at the event, SK hynix underlined its position as a first mover and leading AI memory provider through its lineup of next-generation products.

"Connecting AI" With the Industry's Finest AI Memory Solutions
Themed "Memory, The Power of AI," SK hynix's booth featured its advanced AI server solutions, groundbreaking technologies for on-device AI PCs, and outstanding consumer SSD products. HBM3E, the fifth generation of HBM, was among the AI server solutions on display. Offering industry-leading data processing speeds of 1.18 terabytes (TB) per second, vast capacity, and advanced heat dissipation capability, HBM3E is optimized to meet the requirements of AI servers and other applications. Another technology which has become crucial for AI servers is CXL, as it can increase system bandwidth and processing capacity. SK hynix highlighted the strength of its CXL portfolio by presenting its CXL Memory Module-DDR5 (CMM-DDR5), which significantly expands system bandwidth and capacity compared to systems only equipped with DDR5. Other AI server solutions on display included the server DRAM products DDR5 RDIMM and MCR DIMM. In particular, SK hynix showcased its tall 128-gigabyte (GB) MCR DIMM for the first time at an exhibition.
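
The 1.18 TB/s figure follows directly from HBM3E's per-pin data rate and the width of the HBM stack interface. A minimal sketch of that arithmetic, assuming the 1,024-bit interface and 9.2 Gb/s per pin that SK hynix has publicly cited for HBM3E:

```python
# Per-stack bandwidth of HBM3E derived from per-pin speed and interface width.
# Assumes a 1,024-bit wide stack interface and 9.2 Gb/s per pin, the figures
# SK hynix has publicly cited for HBM3E.

PINS_PER_STACK = 1024        # HBM interface width in bits
PIN_SPEED_GBPS = 9.2         # data rate per pin in Gb/s

bandwidth_gbs = PINS_PER_STACK * PIN_SPEED_GBPS / 8   # bytes per second
print(f"Per-stack bandwidth: {bandwidth_gbs:.1f} GB/s "
      f"(~{bandwidth_gbs / 1000:.2f} TB/s)")
# -> Per-stack bandwidth: 1177.6 GB/s (~1.18 TB/s)
```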

Mnemonic Electronic Debuts at COMPUTEX 2024, Embracing the Era of High-Capacity SSDs

On June 4, COMPUTEX 2024 opened at the Taipei Nangang Exhibition Center. Mnemonic Electronic Co., Ltd., the Taiwanese subsidiary of Longsys, showcased industry-leading high-capacity SSDs under the theme "Embracing the Era of High-Capacity SSDs." The products on display included the Mnemonic MS90 8 TB SATA SSD, FORESEE ORCA 4836 series enterprise NVMe SSDs, FORESEE XP2300 PCIe Gen 4 SSDs, and a rich lineup of embedded storage, memory modules, memory cards, and more. The company offers reliable industrial-grade, automotive-grade, and enterprise-grade storage products, providing high-capacity solutions for global users.

High-Capacity SSDs
For SSDs, Mnemonic Electronic presented products in various form factors and interfaces, including PCIe M.2, PCIe BGA, SATA M.2, and SATA 2.5-inch. The Mnemonic MS90 8 TB SSD supports the SATA interface at speeds of up to 6 Gb/s (SATA Gen 3) and is backward compatible with Gen 1 and Gen 2. It also supports various SATA low-power states (Partial/Slumber/Device Sleep) and can be used for nearline HDD replacement, surveillance, and high-speed rail systems.
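
For context, the usable payload rate of a SATA link is lower than the raw 6 Gb/s line rate because of 8b/10b encoding. The sketch below shows the generic SATA arithmetic; these are interface-level figures, not numbers taken from Mnemonic's product sheet.

```python
# Effective SATA throughput after 8b/10b line encoding. Generic SATA
# arithmetic, not a figure taken from Mnemonic's product sheet.

LINE_RATES_GBPS = {"Gen 1": 1.5, "Gen 2": 3.0, "Gen 3": 6.0}
ENCODING_EFFICIENCY = 8 / 10   # 8b/10b: 8 payload bits per 10 line bits

for gen, rate in LINE_RATES_GBPS.items():
    payload_mbs = rate * 1e9 * ENCODING_EFFICIENCY / 8 / 1e6
    print(f"SATA {gen}: {rate} Gb/s line rate -> ~{payload_mbs:.0f} MB/s payload")
```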

ASUS Shows Off Latest and Upcoming PC Hardware at Computex 2024

ASUS today announced the innovative concepts and upcoming designs that it will be demonstrating at its Computex 2024 booths, including AI PCs, prototype AMD and Intel motherboards, as well as fresh CPU coolers.

AI PC-ready hardware
With all the groundbreaking AI-powered apps and features available today, and many more on the horizon, PC enthusiasts the world over are searching for hardware that will let them run AI tools on their own AI PC in addition to using cloud-based subscriptions. At Computex 2024, ASUS is showing off the PC hardware that builders can leverage to explore the future of this revolution. An AI PC is an entire platform ready to unlock advanced AI features—in turn, revolutionizing the way that people build, customize, and use such a device.

GeIL to Showcase Extended New and Enhanced Memory Products at COMPUTEX 2024

At Computex 2024, GeIL will feature the EVO V series and TUF GAMING ALLIANCE MEMORY with an Active Dual Fan Cooling System, ensuring optimal stability and performance. GeIL enhances memory stability and reliability with CKD (Client Clock Driver) chips in its CUDIMM and CSODIMM products. The CAMM2 and LPCAMM2 modules offer faster speeds and larger capacities for next-gen compact devices, making them ideal for AI PCs and server applications requiring high capacity and quick access.

GeIL TUF GAMING ALLIANCE MEMORY
In addition to the well-known EVO V series, GeIL TUF GAMING ALLIANCE MEMORY includes the Active Dual Fan Cooling System design, which greatly improves heat dissipation. The enhanced cooling efficiency ensures optimal stability and performance under the most demanding conditions. To provide users with a variety of product options, GeIL TUF GAMING ALLIANCE MEMORY is also offered in an RGB SKU without fans. Both models will be demonstrated at GeIL's booth at Computex 2024.

LPDDR6 LPCAMM2 Pictured and Detailed Courtesy of JEDEC

Yesterday we reported on DDR6 memory hitting new heights of performance, and it looks like LPDDR6 will follow suit, at least based on details in a JEDEC presentation. Like LPDDR5, LPDDR6 will be available as solder-down memory, but it will also be offered in a new LPCAMM2 module. The bus speed of LPDDR5 on LPCAMM2 modules is expected to peak at 9.2 GT/s based on JEDEC specifications, but LPDDR6 will extend this to 14.4 GT/s, roughly a 50 percent increase. However, the fastest (and so far only) LPCAMM2 modules on the retail market, which use LPDDR5X, come in at 7.5 GT/s, which suggests that launch speeds of LPDDR6 modules will end up quite far from the peak specification.

There will be some other interesting changes to LPDDR6 CAMM2 modules: the per-module bus width will move from 128 bits to 192 bits, and each channel will go from 32 bits to 48 bits. Part of the reason for this is that LPDDR6 is moving to a 24-bit channel width, consisting of two 12-bit sub-channels, as mentioned in yesterday's news post. This might seem odd at first, but the explanation is fairly simple: LPDDR6 will have native ECC (Error Correction Code) or EDC (Error Detection Code) support, though it is currently not entirely clear how this will be implemented at the system level. JEDEC is also looking at developing a screwless retention solution for CAMM2 and LPCAMM2 memory modules, but at the moment there is no clear solution in sight. We might also get to see LPDDR6 on LPCAMM2 modules on the desktop, although the presentation only mentions CAMM2 for the desktop, something MSI is already working on as we have previously covered.
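
Combined with the wider module, the faster transfer rate translates into a substantially larger jump in peak per-module bandwidth than the roughly 50 percent bus-speed increase alone. A minimal sketch of that arithmetic, using the bus widths and transfer rates cited above:

```python
# Peak per-module LPCAMM2 bandwidth: today's LPDDR5X figures vs. the LPDDR6
# numbers from the JEDEC presentation. Purely illustrative arithmetic.

def module_bandwidth_gbs(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s for a module of the given width and data rate."""
    return bus_width_bits * transfer_rate_gtps / 8

lpddr5x = module_bandwidth_gbs(128, 9.2)   # 128-bit LPDDR5X module at 9.2 GT/s
lpddr6 = module_bandwidth_gbs(192, 14.4)   # 192-bit LPDDR6 module at 14.4 GT/s

print(f"LPDDR5X LPCAMM2 peak: {lpddr5x:.1f} GB/s")  # 147.2 GB/s
print(f"LPDDR6  LPCAMM2 peak: {lpddr6:.1f} GB/s")   # 345.6 GB/s
print(f"Per-module increase:  {100 * (lpddr6 / lpddr5x - 1):.0f}%")
```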

Next Crop of MSI Project Zero Motherboards to Implement CAMM2 DDR5 Memory

The CAMM2 and LPCAMM2 form-factors were originally designed for thin-and-light notebooks, to provide them with memory replacements/upgrades without compromising on the Z-Height tolerances of the device's design. It looks like MSI sees a future for the CAMM2 form-factor on desktops, specifically the ones without cables sticking out. The company's next round of motherboards under its Project Zero banner will replace the conventional DDR5 DIMM slots with DDR5 CAMM2 slots. The company is joining forces with Kingston Technology for the effort.

Kingston is readying a new line of performance-segment CAMM2 modules under its FURY Impact brand, which it has so far used for performance SO-DIMMs meant for gaming notebooks. MSI's next-gen Project Zero motherboard features contact points for a DDR5 CAMM2 module. A single CAMM2 module utilizes the entire 160-bit memory bus width of the Socket LGA1700 processor (that's both channels and their sub-channels). Kingston may release CAMM2 modules in the most common memory sizes (such as 32 GB, 48 GB, 64 GB, and 96 GB) and the most common DDR5 OC speeds for the platform (ranging between DDR5-6000 and DDR5-8000).
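
To put the quoted speed range in perspective, the sketch below computes the peak bandwidth a single CAMM2 module would deliver across the full dual-channel bus. It counts only the 128 data bits of the two DDR5 channels; treating the 160-bit figure above as including ECC lanes is an assumption on our part.

```python
# Peak bandwidth of one DDR5 CAMM2 module spanning both channels of a client
# platform. Only the 128 data bits are counted; the 160-bit width quoted in
# the article is assumed to include ECC lanes.

DATA_BUS_BITS = 128  # 2 channels x 2 sub-channels x 32 data bits each

for speed_mtps in (6000, 6400, 7200, 8000):
    bandwidth_gbs = DATA_BUS_BITS * speed_mtps / 8 / 1000
    print(f"DDR5-{speed_mtps}: {bandwidth_gbs:.1f} GB/s peak")
```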

Dell XPS Roadmap Leak Spills Beans on Several Upcoming Intel, AMD, and Qualcomm Processors

A product roadmap leak at leading PC OEM Dell disclosed the tentative launch dates of several future generations of processors from Intel, AMD, and Qualcomm. The leaked slide details hardware platforms for future revisions of the company's premium XPS notebooks. Given that Dell remains one of the largest PC OEMs, the dates revealed in the slides are highly plausible.

In chronological order, Dell expects Intel's Core Ultra 200V series "Lunar Lake-MX" processor in September 2024, which should mean product unveilings at Computex. It's interesting to note that Intel is only designing "Lunar Lake" for the -MX memory-on-package segment. This chip squares off against Apple's M3, M4, and possibly even the M3 Pro. Intel also has its ambitious "Arrow Lake" architecture planned for the second half of 2024, hence the lack of product overlap—there won't be an "Arrow Lake-MX."

Micron Delivers Crucial LPCAMM2 with LPDDR5X Memory for the New AI-Ready Lenovo ThinkPad P1 Gen 7 Workstation

Micron Technology, Inc., today announced the availability of Crucial LPCAMM2, the disruptive next-generation laptop memory form factor that features LPDDR5X mobile memory to level up laptop performance for professionals and creators. Consuming up to 58% less active power and occupying 64% less space than DDR5 SODIMMs, LPCAMM2 delivers higher bandwidth and dual-channel support with a single module. LPCAMM2 is an ideal high-performance memory solution for handling AI PC and other complex workloads and is compatible with the powerful and versatile Lenovo ThinkPad P1 Gen 7 mobile workstations.

"LPCAMM2 is a game-changer for mobile workstation users who want to enjoy the benefits of the latest mobile high performance memory technology without sacrificing superior performance, upgradeability, power efficiency or space," said Jonathan Weech, senior director of product marketing for Micron's Commercial Products Group. "With LPCAMM2, we are delivering a future-proof memory solution, enabling faster speeds and longer battery life to support demanding creative and AI workloads."

SK hynix Strengthens AI Memory Leadership & Partnership With Host at the TSMC 2024 Tech Symposium

SK hynix showcased its next-generation technologies and strengthened key partnerships at the TSMC 2024 Technology Symposium held in Santa Clara, California on April 24. At the event, the company displayed its industry-leading HBM AI memory solutions and highlighted its collaboration with TSMC involving the host's CoWoS advanced packaging technology.

TSMC, a global semiconductor foundry, invites its major partners to this annual conference in the first half of each year so they can share their new products and technologies. Attending the event under the slogan "Memory, the Power of AI," SK hynix received significant attention for presenting the industry's most powerful AI memory solution, HBM3E. The product recently demonstrated industry-leading performance, achieving input/output (I/O) transfer speeds of up to 10 gigabits per second (Gbps) in an AI system during a performance validation evaluation.

Montage Technology Pioneers the Trial Production of DDR5 CKDs

Montage Technology, a leading data processing and interconnect IC company, today announced that it has taken the lead in the trial production of 1st-generation DDR5 Clock Driver (CKD) chips for next-generation client memory. This new product aims to enhance the speed and stability of memory data access to match the ever-increasing CPU operating speed and performance.

Previously, clock driver functionality was integrated into the Registering Clock Driver (RCD) chips used on server RDIMM and LRDIMM modules, and was not deployed in client PCs. In the DDR5 era, as data rates climb to 6400 MT/s and above, the clock driver has emerged as an indispensable component for client memory as well.

Intel Lunar Lake-MX to Embed Samsung LPDDR5X Memory on SoC Package

According to sources close to Seoul Economy, and reported by DigiTimes, Intel has reportedly chosen Samsung as a supplier for its next-generation Lunar Lake processors, set to debut later this year. The report notes that Samsung will provide LPDDR5X memory devices for integration into Intel's processors. This collaboration could be a substantial win for Samsung, given Intel's projection to distribute millions of Lunar Lake CPUs in the coming years. However, it's important to note that this information is based on a leak and has not been officially confirmed. Designed for ultra-portable laptops, the Lunar Lake-MX platform is expected to feature 16 GB or 32 GB of LPDDR5X-8533 memory directly on the processor package. This on-package memory approach aims to minimize the platform's physical size while enhancing performance over traditional memory configurations. With Lunar Lake's exclusive support for on-package memory, Samsung's LPDDR5X-8533 products could significantly boost sales.

While Samsung is currently in the spotlight, it remains unclear whether it will be the sole LPDDR5X memory provider for Lunar Lake. Intel's strategy involves selling processors with pre-validated memory, leaving the door open for potential validation of similar memory products from competitors like Micron and SK Hynix. Thanks to a new microarchitecture, Intel has promoted its Lunar Lake processors as a revolutionary leap in performance-per-watt efficiency. The processors are expected to utilize a multi-chiplet design built with Foveros packaging technology, combining CPU and GPU tiles, a system-on-chip tile, and dual memory packages. The CPU component is anticipated to include up to eight cores, a mix of four high-performance Lion Cove and four energy-efficient Skymont cores, alongside advanced graphics, cache, and AI acceleration capabilities. Apple's use of on-package memory in its M-series chips has set a precedent in the industry, and with Intel's Lunar Lake-MX, this trend could extend across the thin-and-light laptop market. However, systems requiring more flexibility in terms of configuration, repair, and upgrades will likely continue to employ standard memory solutions like SODIMMs and/or the new CAMM2 modules that offer a balance of high performance and energy efficiency.
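
For reference, the peak bandwidth such an on-package configuration would deliver can be estimated with a simple calculation. The sketch below assumes a conventional 128-bit LPDDR5X interface; the actual bus width of Lunar Lake-MX had not been confirmed at the time of writing.

```python
# Estimated peak bandwidth of on-package LPDDR5X-8533 on Lunar Lake-MX.
# The 128-bit bus width is an assumption, not a confirmed Intel specification.

BUS_WIDTH_BITS = 128
DATA_RATE_MTPS = 8533

bandwidth_gbs = BUS_WIDTH_BITS * DATA_RATE_MTPS / 8 / 1000
print(f"LPDDR5X-{DATA_RATE_MTPS} x {BUS_WIDTH_BITS}-bit: ~{bandwidth_gbs:.1f} GB/s peak")
# -> ~136.5 GB/s
```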

SK Hynix Throws a Jab: CAMM is Coming to Desktop PCs

In a surprising turn of events, SK Hynix has hinted at the possibility of the Compression Attached Memory Module (CAMM) standard, initially designed for laptops, being introduced to desktop PCs. This revelation came from a comment made by an SK Hynix representative at CES 2024 in Las Vegas to the Korean tech media outlet ITSubIssub. According to the representative, a first implementation is underway, but there are no specific details yet. CAMM, an innovative memory standard developed by Dell in 2022, was certified to replace SO-DIMM as the official standard for laptop memory. However, the transition to desktop PCs could significantly disrupt the desktop memory market. Unlike the vertical DRAM sticks currently in use, CAMM modules lie flat and are screwed into a socket. This design change would necessitate a complete overhaul of the desktop motherboard layout.

The thin, flat design of the CAMM modules could also limit the number that can be installed on an ATX board. However, the desktop version of the standard, CAMM2, was announced by JEDEC just a month ago. It is designed for DDR5 memory but is expected to become mainstream with the introduction of DDR6 around 2025. While CAMM allows for higher speeds and densities for mobile memory, its advantages for desktops over traditional memory sticks are yet to be fully understood. Although low-power CAMM modules could offer energy savings, this is typically more relevant for mobile devices than desktops. As we move towards DDR6 and DDR7, more information about CAMM for desktops will be needed to understand its potential benefits. JEDEC's official word on the new standard indicates that "DDR5 CAMM2s are intended for performance notebooks and mainstream desktops, while LPDDR5/5X CAMM2s target a broader range of notebooks and certain server market segments." So, we can expect to see CAMM2 in both desktops and some server applications.

Micron First to Market With LPDDR5X-based LPCAMM2 Memory

Micron Technology, Inc. (Nasdaq: MU), today unveiled the industry's first standard low-power compression attached memory module (LPCAMM2) available in capacities from 16 GB to 64 GB, which delivers higher performance, energy-efficiency, space savings and modularity for PCs. Sampling now with production in the first half of 2024, LPCAMM2 is the first disruptive new form factor for client PCs since the introduction of small outline dual inline memory modules (SODIMMs) in 1997. Micron's LPDDR5X DRAM incorporated into the innovative LPCAMM2 form factor will provide up to 61% lower power and up to 71% better performance for PCMark 10 essential workloads such as web browsing and video conferencing, along with a 64% space savings over SODIMM offerings.

As generative artificial intelligence (GAI) use cases proliferate to client PCs, the performance of the memory subsystem becomes more critical. LPCAMM2 delivers the performance required to process AI workloads on PCs and the potential to scale to applications needing a high-performance, low-power solution in a compact and modular form factor, with the ability to upgrade low-power DRAM for the first time as customer needs evolve.

JEDEC Publishes New CAMM2 Memory Module Standard

JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, today announced the publication of JESD318: Compression Attached Memory Module (CAMM2) Common Standard. This groundbreaking standard defines the electrical and mechanical requirements for both Double Data Rate, Synchronous DRAM Compression-Attached Memory Modules (DDR5 SDRAM CAMM2s) and Low Power Double Data Rate, Synchronous DRAM Compression-Attached Memory Modules (LPDDR5/5X SDRAM CAMM2s) in a single, comprehensive document. JESD318 CAMM2 is available for download from the JEDEC website.

DDR5 and LPDDR5/5X CAMM2s cater to distinct use cases. DDR5 CAMM2s are intended for performance notebooks and mainstream desktops, while LPDDR5/5X CAMM2s target a broader range of notebooks and certain server market segments.

Samsung Electronics Holds Memory Tech Day 2023 Unveiling New Innovations To Lead the Hyperscale AI Era

Samsung Electronics Co., Ltd., a world leader in advanced memory technology, today held its annual Memory Tech Day, showcasing industry-first innovations and new memory products to accelerate technological advancements across future applications—including the cloud, edge devices and automotive vehicles.

Attended by about 600 customers, partners and industry experts, the event served as a platform for Samsung executives to expand on the company's vision for "Memory Reimagined," covering long-term plans to continue its memory technology leadership, outlook on market trends and sustainability goals. The company also presented new product innovations such as the HBM3E Shinebolt, LPDDR5X CAMM2 and Detachable AutoSSD.

SK hynix Displays Next-Gen Solutions Set to Unlock AI and More at OCP Global Summit 2023

SK hynix showcased its next-generation memory semiconductor technologies and solutions at the OCP Global Summit 2023 held in San Jose, California from October 17-19. The OCP Global Summit is an annual event hosted by the world's largest data center technology community, the Open Compute Project (OCP), where industry experts gather to share various technologies and visions. This year, SK hynix and its subsidiary Solidigm showcased advanced semiconductor memory products that will lead the AI era under the slogan "United Through Technology".

SK hynix presented a broad range of its solutions at the summit, including its leading HBM (HBM3/3E), CXL, and AiM products for generative AI. The company also unveiled some of the latest additions to its product portfolio, including its DDR5 RDIMM, MCR DIMM, enterprise SSD (eSSD), and LPDDR CAMM devices. Visitors to the HBM exhibit could see HBM3, which is utilized in NVIDIA's H100, a high-performance GPU for AI, and also check out the next-generation HBM3E. Due to their low power consumption and ultra-high performance, these HBM solutions are more eco-friendly and particularly suitable for power-hungry AI server systems.

ADATA Launches New Products at Computex 2023

ADATA, the world's leading memory module and flash memory brand, in association with its subsidiary gaming performance brand XPG, will take part in the world's largest computer trade show, Computex Taipei, held in Taipei from May 30 to June 2, 2023. As an industry leader, ADATA continues to implement the company's brand philosophy of "Innovating the Future" through its businesses. In addition to providing new technological products and solutions, ADATA is also committed to promoting the sustainable development of corporations in order to lead the future.

To realize this year's theme of "The Xtreme Future Leader - Infinite Innovation," ADATA has leveraged emerging technologies to launch new products that are sustainable and extraordinarily innovative, including its most powerful Gen 5 SSD, an industry-leading CXL (Compute Express Link) memory module, CAMM (Compression Attached Memory Module) products, the XPG titanium power supply (XPG FUSION 1600 W), and professional industrial solutions. Be sure to visit Computex to experience the full range of new products!

CAMM to Replace the Decades-old SO-DIMM Laptop Memory, JEDEC and Dell Argue

Laptop memory has been a controversial topic for many years. Its long-serving standard, SO-DIMM, has shown signs of aging, as the decades-old JEDEC form factor hasn't kept pace with growing capacity and speed requirements. Today, JEDEC and Dell argue that the future of laptop memory lies in the new CAMM standard that both organizations are working on. CAMM was introduced by Dell, where Senior Distinguished Engineer Tom Schnell is leading the effort. "We have unanimous approval of the 0.5 spec," Schnell told PCWorld. One of the problems SO-DIMM faces is capacity and speed: current DDR5 SO-DIMM memory tops out at around 6400 MT/s. CAMM, on the other hand, starts at that speed and works its way up, while also offering higher capacities.

When Dell introduced CAMM in its laptops, it had no intention of creating a proprietary solution, but rather an expandable and upgradable memory platform with various benefits. With JEDEC's involvement in finalizing it, the CAMM standard is slowly on its way to becoming a viable option for other laptop manufacturers. Dell's Tom Schnell didn't reveal which companies are involved in creating the final specification; however, we know that 32 of them are participating, including Apple. If others join, the standard could take over future laptop designs and offer higher speeds and capacities, especially in the mobile workstation space where it matters most. Below is an example of a CAMM memory module, along with a patent illustration comparing SO-DIMM (upper left) to CAMM (lower right) and showing CAMM's shorter trace paths. With shorter traces, latency also goes down, so the new standard will bring additional efficiency. Additionally, devices based on LPDDR memory could gain an upgrade path with the adoption of CAMM.

Dell's DDR5 CAMM Appears in More Detail, Comes in Several Shapes, Won't be Proprietary

Last week the first details of Dell's CAMM (Compression Attached Memory Module) made an early appearance courtesy of a product leak, but now official details have emerged, and the good news is that Dell says it won't be a proprietary solution. The Compression Connector looks unlike anything used in consumer computers today, and Dell is said to be hoping it will become the next industry standard for memory modules, according to PCWorld. The interposer mentioned in the previous news article also makes an appearance; it allows a pair of DDR5 SO-DIMMs to be used instead, albeit with a much taller Z-height.

Dell is apparently planning on getting CAMM approved by JEDEC, the standards organization for memory. However, even if the CAMM format is accepted as a JEDEC standard, Dell holds patents and is likely to charge some kind of royalty fee to interested parties. That said, if it becomes a JEDEC standard, Dell has to follow RAND, or Reasonable and Non-Discriminatory, terms, so the royalty fees would have to be reasonable for JEDEC to agree to making CAMM a standard. The main benefit of Dell's CAMM is that the memory traces end up being shorter and more direct, since the CAMM has a single-sided interface, whereas SO-DIMMs are interfaced on both sides, just like standard DIMMs. This would allow for higher-speed memory interfaces without the need for signal re-drivers or re-timers.