Supermicro, Inc., a Total IT Solution Manufacturer for AI, Cloud, Storage, and 5G/Edge, is expanding its portfolio of AI solutions, allowing customers to leverage the power and capability of AI in edge locations such as public spaces, retail stores, or industrial infrastructure. Supermicro application-optimized servers with NVIDIA GPUs make it easier to fine-tune pre-trained models and deploy AI inference solutions at the edge, where the data is generated, improving response times and decision-making.
"Supermicro has the broadest portfolio of Edge AI solutions, capable of supporting pre-trained models for our customers' edge environments," said Charles Liang, president and CEO of Supermicro. "The Supermicro Hyper-E server, based on the dual 5th Gen Intel Xeon processors, can support up to three NVIDIA H100 Tensor Core GPUs, delivering unparalleled performance for Edge AI. With up to 8 TB of memory in these servers, we are bringing data center AI processing power to edge locations. Supermicro continues to provide the industry with optimized solutions as enterprises build a competitive advantage by processing AI data at their edge locations."
With these server advancements, users no longer need to send data back to the cloud for processing and then retrieve the results at the edge, where they are required. Customers can now use pre-trained large language models (LLMs), optimized for performance and available with NVIDIA AI Enterprise, at their edge locations, enabling accurate, real-time decision-making close to where the data originates.
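As a rough illustration of what serving a pre-trained LLM on-site can look like, the Python sketch below loads an open model with Hugging Face Transformers and answers requests locally on the edge server's GPUs. The model name, prompt, and generation settings are placeholder assumptions for the sketch and are not part of Supermicro's or NVIDIA's announced software stack.

```python
# Illustrative sketch only: local LLM inference on an edge GPU server,
# avoiding a round trip to the cloud. Model and prompt are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # placeholder; swap in the model you actually deploy

# Load the tokenizer and model once at startup; device_map="auto" spreads the
# weights across whatever GPUs the edge node exposes.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # half precision to fit within edge GPU memory
    device_map="auto",
)

def answer(prompt: str, max_new_tokens: int = 128) -> str:
    """Run a single inference request locally at the edge."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(answer("Summarize today's shelf-stock sensor readings in one sentence."))
```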
"Businesses across industries, including healthcare, retail, manufacturing, and auto, are increasingly looking to leverage AI at the edge," said Kevin Connors, vice president of partner alliances at NVIDIA. "The new Supermicro NVIDIA-Certified Systems, powered by the NVIDIA AI platform, are built to deliver the highest-performing accelerated computing infrastructure, as well as NVIDIA AI Enterprise software to help run edge AI workloads."
For example, Supermicro's Hyper-E server, the SYS-221HE, is optimized for edge training and inferencing and supports dual-socket CPUs in a short-depth, front I/O system. The system holds up to three double-width NVIDIA Tensor Core GPUs, including the NVIDIA H100, A10, L40S, A40, and A2 GPUs. These GPUs give the Supermicro Hyper-E sufficient computing power to process AI workloads in edge environments where data is collected, analyzed, and stored. The Supermicro SYS-221HE is available with front or rear servicing options, allowing this server to be installed in various environments. As an example of the power and flexibility of the Supermicro Hyper-E server, partners such as Eviden are creating Edge AI solutions that enhance the customer experience in traditional retail outlets.
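Before pinning inference work to a multi-GPU node like the Hyper-E, an operator would typically verify which accelerators the system exposes. The short PyTorch sketch below is an illustrative assumption, not a Supermicro tool; it simply enumerates the visible CUDA devices and their memory.

```python
# Illustrative sketch only: enumerate the GPUs a multi-GPU edge node exposes
# (e.g. a system fitted with up to three double-width accelerators) before
# scheduling inference work onto them. Uses standard PyTorch CUDA queries.
import torch

def describe_gpus() -> list[dict]:
    """Return name and total memory for each visible CUDA device."""
    gpus = []
    for idx in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(idx)
        gpus.append({
            "index": idx,
            "name": props.name,
            "total_memory_gib": round(props.total_memory / 2**30, 1),
        })
    return gpus

if __name__ == "__main__":
    for gpu in describe_gpus():
        print(f"GPU {gpu['index']}: {gpu['name']} ({gpu['total_memory_gib']} GiB)")
```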
Supermicro's advanced edge servers also include:
- The Supermicro SYS-621C-TNR12R (CloudDC family) is an all-in-one rackmount platform for Cloud Data Centers. This compact 2U system supports up to two double-width GPUs in a 25.5" chassis and 4 to 12 SATA/SAS drive bays with optional full NVMe support.
- The Supermicro SYS-111E-FWTR, a high-density 1U edge system featuring a 5th Gen Intel Xeon processor and two PCIe 5.0 x16 FHFL slots, is ideal for a broad range of networking and edge applications.
- The compact Supermicro SYS-E403-13E delivers data center-level performance to the edge in a box PC form factor, featuring a 5th Gen Intel Xeon processor and up to three NVIDIA GPUs. Its small footprint allows the system to be deployed in tight spaces, such as a wall-mounted cabinet, or used as a portable device.
- The ultra short-depth SYS-211E-FRN2T, with a system depth of 300 mm, is specifically designed to fit in space-constrained environments found at the networking edge and features up to a 5th Gen Intel Xeon processor. The SYS-211E-FRN2T is available with either AC or DC power options.
- The powerful and versatile SuperEdge system, the SYS-211SE-31D/A, is a multi-node system featuring three independent nodes, each with a 5th Gen Intel Xeon processor, three PCIe 5.0 x16 slots, and up to 2 TB of DDR5 memory. This 2U system also features front I/O and a broad operating temperature range, and its short depth makes it an excellent fit for deployment outside a data center.
- The SYS-E300-13AD is a compact IoT server featuring a 13th Gen Intel Core processor and measuring just 265 x 226 mm, making it the smallest system to fit an NVIDIA GPU. The server is ideal for delivering distributed AI capabilities to the edge.
View at TechPowerUp Main Site