Wednesday, August 7th 2024
AIC Partners with Unigen to Launch Power-Efficient AI Inference Server
AIC, a global leader in the design and manufacture of industrial-strength servers, has partnered with Unigen Corporation to launch the EB202-CP-UG, an ultra-efficient artificial intelligence (AI) inference server delivering over 400 trillion operations per second (TOPS). The server is built around the EB202-CP, a 2U Genoa-based storage server featuring a removable storage cage. By replacing the standard E1.S SSDs with eight Unigen Biscotti E1.S AI modules, AIC offers a specialized AI configuration, the EB202-CP-UG: an air-cooled AI inference server with an exceptional performance-per-watt ratio that translates into long-term cost savings.
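The headline figure can be sanity-checked with simple arithmetic. This is a hedged back-of-envelope sketch, assuming the publicly rated 26 TOPS per Hailo-8 accelerator (a figure from Hailo's product material, not stated in this article):

```python
# Back-of-envelope check of the "over 400 TOPS" claim.
# Assumption: 26 TOPS is the vendor-rated throughput per Hailo-8 accelerator.
MODULES = 8                  # Unigen Biscotti E1.S AI modules in the server
ACCELERATORS_PER_MODULE = 2  # twin Hailo-8 chips per module
TOPS_PER_HAILO8 = 26         # assumed per-accelerator rating

total_tops = MODULES * ACCELERATORS_PER_MODULE * TOPS_PER_HAILO8
print(total_tops)  # 416 -- consistent with "over 400 TOPS"
```

Under that assumption, the sixteen accelerators account for roughly 416 TOPS in aggregate, matching the quoted total.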
"We are excited to partner with AIC to introduce innovative AI solutions," said Paul W. Heng, Founder and CEO of Unigen. "Their commitment to excellence in every product, especially their storage servers, made it clear that our AI technology would integrate seamlessly."Michael Liang, President and CEO of AIC, stated, "By collaborating with Unigen to carve out a technological niche in AI, we have successfully demonstrated an efficient, powerful, air-cooled server that aligns perfectly with our customers' stringent requirements."
The EB202-CP-UG is built on the Capella motherboard platform, which accommodates the AMD EPYC (Genoa) CPU. It features a daughter-card/baseboard designed specifically for EDSFF signaling from E1.S modules, combined with 128 GB of high-speed DDR5 memory and dual power-efficient modular power supplies. Powered by eight Unigen Biscotti E1.S AI modules, each carrying twin Hailo-8 AI inference accelerators, the server achieves 21,500 frames per second (FPS) on ResNet-50 (Resnet_V1_50). Leveraging the AVX instruction-set extensions of the AMD server CPU, the EB202-CP-UG can decode one hundred 720p video streams at 25 frames per second each while running AI analytics on every frame, with significant processing headroom.
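The "significant processing headroom" claim follows from the figures quoted above. A minimal sketch, using only the article's own numbers:

```python
# Rough headroom estimate from the figures quoted in the article.
streams = 100          # 720p video streams decoded by the CPU
fps_per_stream = 25    # frames per second per stream
resnet50_fps = 21_500  # aggregate ResNet-50 inference rate across the server

frames_to_analyze = streams * fps_per_stream   # 2,500 frames/s arriving
headroom = resnet50_fps / frames_to_analyze    # spare inference capacity
print(frames_to_analyze, round(headroom, 1))   # 2500 8.6
```

At 2,500 incoming frames per second against a 21,500 FPS inference rate, the accelerators have roughly 8.6x spare capacity (assuming ResNet-50-class analytics per frame, as in the benchmark cited).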
When paired with a Linux Ubuntu OS and VMS/AI software from Network Optix, this AI inference server offers unparalleled safety, security, and peace of mind for the most discerning IT and security professionals.

The EB202-CP-UG AI Inference Server is available for purchase now.
Sources:
AIC, Unigen
"We are excited to partner with AIC to introduce innovative AI solutions," said Paul W. Heng, Founder and CEO of Unigen. "Their commitment to excellence in every product, especially their storage servers, made it clear that our AI technology would integrate seamlessly."Michael Liang, President and CEO of AIC, stated, "By collaborating with Unigen to carve out a technological niche in AI, we have successfully demonstrated an efficient, powerful, air-cooled server that aligns perfectly with our customers' stringent requirements."
The EB202-CP-UG is built on the Capella Motherboard platform, which accommodates the AMD EPYC (Genoa) CPU. It features a unique daughter-card/baseboard specifically designed for EDSFF signals from E1.S modules, combined with 128 GB of high-speed DDR5 memory and dual power-efficient modular power supplies. Driven by eight Unigen Biscotti E1.S AI modules, each equipped with twin Hailo-8 AI Inference Accelerators, the server achieves an impressive 21,500 frames per second (FPS) in Resnet_V1_50. Leveraging the new AVX technology in the AMD server CPU, the EB202-CP-UG is capable of decoding one hundred 720p video streams at 25 frames per second each, while conducting AI analytics on each frame with significant processing headroom.
When paired with a Linux Ubuntu OS and VMS/AI software from Network Optix, this AI inference server offers unparalleled safety, security, and peace of mind for the most discerning IT and security professionals.The EB202-CP-UG AI Inference Server is available for purchase now.
1 Comment on AIC Partners with Unigen to Launch Power-Efficient AI Inference Server
Still pretty nice. Genoa is getting used a lot (love them pcie lanes/$)