
AAEON MAXER-2100 Inference Server Integrates Both Intel CPU and NVIDIA GPU Tech

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,194 (7.56/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
AAEON (Stock Code: 6579), a leading provider of advanced AI solutions, has released the inaugural offering of its AI Inference Server product line, the MAXER-2100. The MAXER-2100 is a 2U rackmount AI inference server powered by the Intel Core i9-13900 processor, designed to meet high-performance computing needs.
The MAXER-2100 also supports both 12th and 13th Generation Intel Core LGA 1700 socket CPUs up to 125 W, and features an integrated NVIDIA GeForce RTX 4080 SUPER GPU. While the product ships with the NVIDIA GeForce RTX 4080 SUPER by default, it is also compatible with, and an NVIDIA-Certified Edge System for, both the NVIDIA L4 Tensor Core and NVIDIA RTX 6000 Ada GPUs.

Given the MAXER-2100 is equipped with both a high-performance CPU and industry-leading GPU, a key feature highlighted by AAEON upon the product's launch is its capacity to execute complex AI algorithms and datasets, process multiple high-definition video streams simultaneously, and utilize machine learning to refine large language models (LLMs) and inferencing models.



Given the need for low-latency operation in such areas, the MAXER-2100 offers up to 128 GB of DDR5 system memory through dual-channel SODIMM slots. For storage, it includes an M.2 2280 M-Key for NVMe and two hot-swappable 2.5" SATA SSD bays with RAID support. The system also provides extensive functional expansion options, including one PCIe x16 slot, an M.2 2230 E-Key for Wi-Fi, and an M.2 3042/3052 B-Key with a micro SIM slot.

For peripheral connectivity, the server boasts a total of four RJ-45 ports, two running at 2.5GbE and two at 1GbE speed, along with four USB 3.2 Gen 2 ports running at 10 Gbps. In terms of industrial communication options, the MAXER-2100 grants users RS-232/422/485 via a DB-9 port. Multiple display interfaces are available, thanks to HDMI 2.0, DP 1.4, and VGA ports, which leverage the exceptional graphic capability of the server's NVIDIA GeForce RTX 4080 SUPER GPU.

Given the combined thermal output of its 1000 W power supply, 125 W CPU, integrated NVIDIA GeForce RTX 4080 SUPER GPU, and potential additional add-on cards, the MAXER-2100 is remarkably compact at just 17" x 3.46" x 17.6" (431.8 mm x 88 mm x 448 mm). This is made possible by a novel cooling architecture utilizing three fans, prioritizing airflow around the CPU and key chassis components. Fan placement within the MAXER-2100 chassis also serves to reduce system noise.

AAEON has indicated that the system caters to three primary user bases: edge computing clients, central management clients, and enterprise AI clients.
The first of these refers to organizations and businesses that require scalable, server-grade edge inferencing for applications such as automated optical inspection (AOI) and smart city solutions.

"The MAXER-2100 can be used to run multiple AI models across multiple high-definition video streams simultaneously, via either its onboard peripheral interfaces or scaled up via network port integration," said Alex Hsueh, Associate Vice President of AAEON's Smart Platform Division. "Its high-performance CPU, powerful GPU, large memory capacity, and high-speed network interfaces make it well-equipped to handle the acquisition and processing of 50-100+ high-definition video streams, making it an ideal solution for applications requiring real-time video analysis," Hsueh added.
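The fan-out pattern described in the quote — many independent streams, each feeding an inference model concurrently — can be sketched in a few lines. The following is a minimal, CPU-only illustration, not AAEON's software: `run_inference` is a hypothetical stand-in for a real model call (on hardware like this, decode and inference would typically run on the GPU), and the frame queues are simulated.

```python
import threading

def run_inference(frame):
    # Placeholder for a real model invocation (e.g. a GPU inference engine);
    # here we simply tag the frame as processed.
    return ("processed", frame)

def stream_worker(stream_id, frames, results, lock):
    # Each worker drains one stream's frames, mimicking per-stream decode + inference.
    for frame in frames:
        out = run_inference((stream_id, frame))
        with lock:
            results.append(out)

def process_streams(num_streams=8, frames_per_stream=30):
    """Fan inference out across many simulated video streams, one thread per stream."""
    results, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=stream_worker,
                         args=(s, range(frames_per_stream), results, lock))
        for s in range(num_streams)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

if __name__ == "__main__":
    out = process_streams()
    print(len(out))  # 8 streams x 30 frames = 240 results
```

A production pipeline handling 50-100+ streams would replace the thread-per-stream loop with hardware video decode and batched GPU inference, but the scheduling shape — independent ingest, shared accelerator — is the same.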

AAEON's second target market is those seeking remote multi-device management functions, such as running diagnostics, deploying or refining AI models, or storing local data on edge devices. On the topic of the product's suitability for such clients, Mr. Hsueh remarked, "With the MAXER-2100, our customers can utilize K8S, over-the-air, and out-of-band management to update and scale edge device operations across smart city, transportation, and enterprise AI applications, addressing key challenges faced by our customers when managing multiple AI workloads at the edge."

For enterprise AI clients, AAEON indicates that by leveraging the MAXER-2100, companies can effectively harness their data to build and deploy advanced AI solutions powered by LLMs. This includes applications in natural language processing, content generation, and customer interaction automation. The key benefits that the MAXER-2100 brings to such setups are the security provided by data being stored and processed at the edge and the system's ability to train and refine inference models during operations.

For more information and detailed specifications, please visit the MAXER-2100 product page.

 
2 sodimms?
ai workloads

I mean the connectivity is nice but limiting the platform by going with laptop ram and just 2 slots
 