News Posts matching #Server


GIGABYTE Leads MLPerf Training v3.0 Benchmarks with Top-Performing Accelerators in GIGABYTE Servers

GIGABYTE Technology: The latest MLPerf Training v3.0 benchmark results are out, and the GIGABYTE G593-SD0 server has emerged as a leader in this round of testing. Going head-to-head against impressive systems, GIGABYTE's servers secured top positions in various categories, showcasing their prowess in handling real-world machine learning use cases. With an unparalleled focus on performance, efficiency, and reliability, GIGABYTE has once again proven its commitment to driving progress in the field of AI.

GIGABYTE, one of the founding members of MLCommons, has been actively contributing to the organization's efforts to design fair and representative benchmarks. Understanding the importance of replicating real-world scenarios in AI development, GIGABYTE's collaboration with MLCommons has been instrumental in shaping the benchmark tasks to encompass critical use cases such as image recognition, object detection, speech-to-text, natural language processing, and recommendation engines. By actively engaging with end applications, GIGABYTE ensures that its servers are designed to meet the highest standards, delivering supreme performance and facilitating meaningful comparisons between different ML systems.

Alphacool Launches Mobile and Modular Open-Frame Server Racks

Flexibility with a server rack? With customizable depth, the Alphacool ES 19" Open Frame Server Rack delivers a mobile and modular solution for server systems. Made of sturdy steel, it not only provides secure carrying capacity for server units but is also effortless to move thanks to the included casters.

Thanks to the open-frame rack's variable depth of 59 cm up to 104 cm (23 - 40.9 inches), server units can be installed in a space-saving way. The standard-compliant mounting holes allow easy installation of the individual units, and the rack can be moved effortlessly on its 360° rotating casters.

Microsoft Releases FY23 Q4 Earnings, Xbox Hardware Revenue Down 13%

Microsoft Corp. today announced the following results for the quarter ended June 30, 2023, as compared to the corresponding period of last fiscal year:
  • Revenue was $56.2 billion and increased 8% (up 10% in constant currency)
  • Operating income was $24.3 billion and increased 18% (up 21% in constant currency)
  • Net income was $20.1 billion and increased 20% (up 23% in constant currency)
  • Diluted earnings per share was $2.69 and increased 21% (up 23% in constant currency)
"Organizations are asking not only how - but how fast - they can apply this next generation of AI to address the biggest opportunities and challenges they face - safely and responsibly," said Satya Nadella, chairman and chief executive officer of Microsoft. "We remain focused on leading the new AI platform shift, helping customers use the Microsoft Cloud to get the most value out of their digital spend, and driving operating leverage."

Solidigm Introduces D5-P5336 - the World's Highest Capacity PCIe SSD

Solidigm over the weekend announced the Solidigm D5-P5336 line of enterprise SSDs. The company earns bragging rights for having the highest capacity among production SSDs, with these coming in capacities of up to 61.44 TB—an incredible amount of storage that HDDs have yet to catch up to. These drives are targeted at the data center, where they not only replace HDDs thanks to their sheer density, but also provide the advantage of responsive flash storage. The drives aren't gunning to top SSD performance charts; the design goal is simply capacity and reliability. The D5-P5336 series features enterprise-grade QLC NAND flash memory that's been extensively tested for reliability and write endurance.

The Solidigm D5-P5336 series comes in three data-center-relevant form factors—15 mm-thick U.2, 7.5 mm-thick E3.S, and the E1.L (ruler) form factor. Capacities start at 7.68 TB for the U.2 and E3.S models, and 15.36 TB for the E1.L. The maximum capacity on offer is 61.44 TB for the U.2 and E1.L form factors, and 30.72 TB for the E3.S, with 15.36 TB and 30.72 TB variants filling out the range. The drives use a proprietary controller that takes advantage of the PCI-Express 4.0 x4 host interface and the NVMe 1.4c protocol. The star attraction, however, is the 192-layer 3D QLC NAND flash memory made in-house by Solidigm.

AIC Launches HA401-TU, a New High-availability Server Model

AIC has launched the new high-availability storage server HA401-TU, which is optimized for mission-critical, enterprise-level storage applications. This cluster-in-a-box solution features an active-active failover design that eliminates single points of failure. HA401-TU is a 4U high-availability (HA) server with two controller nodes and support for 24 3.5" SAS 12 Gb/s drives. Each controller node is equipped with an AIC Tucana server board powered by dual 3rd Gen Intel Xeon Scalable processors and the Intel C621A chipset, which supports UPI speeds up to 11.2 GT/s. HA401-TU provides enterprise users with a number of crucial benefits: its redundant hardware components ensure that there is no single point of failure.

With the hot-swappable functionality, the controller canisters protect enterprises from the loss of revenue that can occur when access to mission-critical data or applications is disrupted. Both controller nodes process data input/output (I/O) operations and users can experience simultaneous and balanced access to logical devices. In the event of failover, the secondary node will automatically take over the devices, client connections and all the processes and services running in the system. This high-availability design significantly enhances the overall performance of clusters, enabling seamless handling of demanding workloads.
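As a rough illustration of how an active-active pair behaves, the sketch below shows a generic heartbeat-and-takeover loop. The node names, timeout, and resource hand-off are hypothetical and are not a representation of AIC's controller firmware.

# Generic sketch of active-active failover monitoring.
# Node names, the timeout, and the takeover step are illustrative assumptions,
# not AIC's implementation.
import time

HEARTBEAT_TIMEOUT = 5.0  # seconds without a heartbeat before takeover

class Node:
    def __init__(self, name: str):
        self.name = name
        self.last_heartbeat = time.monotonic()
        self.owned_resources = {f"lun-{name}"}

    def heartbeat(self):
        self.last_heartbeat = time.monotonic()

    def is_alive(self) -> bool:
        return time.monotonic() - self.last_heartbeat < HEARTBEAT_TIMEOUT

def monitor(local: Node, peer: Node):
    """Run on each controller: if the peer stops responding, take over its resources."""
    if not peer.is_alive():
        local.owned_resources |= peer.owned_resources
        peer.owned_resources.clear()
        print(f"{local.name} took over peer resources: {local.owned_resources}")

# Example: node B misses its heartbeat window, so node A takes over its resources.
a, b = Node("A"), Node("B")
b.last_heartbeat -= 10  # simulate a stalled peer
monitor(a, b)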

Supermicro Adds 192-Core ARM CPU Based Low Power Servers to Its Broad Range of Workload Optimized Servers and Storage Systems

Supermicro, Inc., a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is announcing several new servers that join its already broad application-optimized product line. These new servers incorporate the new AmpereOne CPU, with up to 192 single-threaded cores and up to 4 TB of memory capacity. Applications such as databases, telco edge, web servers, caching services, media encoding, and video game streaming will benefit from increased cores, faster memory access, higher performance per watt, scalable power management, and the new cloud security features. Additionally, cloud-native, microservice-based applications will benefit from the lower latencies and power usage.

"Supermicro is expanding our customer choices by introducing these new systems that incorporate the latest high core count CPUs from Ampere Computing," said Michael McNerney, vice president of Marketing and Security, Supermicro. "With high core counts, predictable latencies, and up to 4 TB of memory, users will experience increased performance for a range of workloads and lower energy use. We continue to design and deliver a range of environmentally friendly servers that give customers a competitive advantage for various applications."

Phanteks Unveils Enthoo Pro 2 Server Edition Full-tower Case

Phanteks is excited to announce the launch of the Enthoo Pro 2 Server Edition, designed to deliver unparalleled performance and versatility. With innovative features like Phanteks' "High-performance Fabric" and a spacious interior, the Enthoo Pro 2 Server Edition can accommodate any high-end configuration, whether that's SSI-EEB motherboards, extreme water cooling, or extensive storage requirements.

With support for both consumer- and server-grade hardware, the Enthoo Pro 2 Server Edition enables the seamless integration of diverse components, allowing users to build a cost-efficient server system that meets their specific requirements. The Pro 2 Server Edition expands support for more PCIe devices, accommodating server-grade hardware and providing exceptional cooling options for optimal performance. An additional left-side fan bracket provides direct cooling for the most extreme server builds.

BioWare Insists that Star Wars: The Old Republic has a Bright Future

A BioWare Austin developer is openly discussing the transfer of their long-running "Star Wars: The Old Republic" MMORPG to an external studio—games news sites picked up on insider information earlier this month alleging that the studio and its parent company (EA) were holding meetings with Broadsword Online Games. An EA spokesperson responded to the leak (at the time) and explained: "We're evaluating how we give the game and the team the best opportunity to grow and evolve, which includes conversations with Broadsword, a boutique studio that specializes in delivering online, community-driven experiences. Our goal is to do what is best for the game and its players." It seems that the involved parties have agreed upon terms for "handing off" responsibilities, according to a developer's recent statements.

Keith Kanneg - executive producer of Star Wars: The Old Republic (SWTOR) - has this week provided a comprehensive update about future plans on the game's discussion board: "Appreciate your patience with us as we continue to navigate the future shift of SWTOR's development team to a third party studio. We're working through the changes right now so I'll share with you the details I have. With 7.3 now live, our priority is continuing to prepare for Game Updates 7.3.1 and 7.4 along with planning for 2024 and 2025 with a focus on content and continued modernization initiatives." It is interesting that Broadsword is not named or directly referred to.

Major CSPs Aggressively Constructing AI Servers and Boosting Demand for AI Chips and HBM, Advanced Packaging Capacity Forecasted to Surge 30~40%

TrendForce reports that explosive growth in generative AI applications like chatbots has spurred significant expansion in AI server development in 2023. Major CSPs, including Microsoft, Google, and AWS, as well as Chinese enterprises like Baidu and ByteDance, have invested heavily in high-end AI servers to continuously train and optimize their AI models. This reliance on high-end AI servers necessitates the use of high-end AI chips, which in turn is expected not only to drive up demand for HBM during 2023~2024, but also to boost growth in advanced packaging capacity by 30~40% in 2024.

TrendForce highlights that to augment the computational efficiency of AI servers and enhance memory transmission bandwidth, leading AI chip makers such as Nvidia, AMD, and Intel have opted to incorporate HBM. Presently, Nvidia's A100 and H100 chips boast up to 80 GB of HBM2e and HBM3, respectively. In its latest integrated CPU and GPU, the Grace Hopper Superchip, Nvidia expanded a single chip's HBM capacity by 20%, hitting a mark of 96 GB. AMD's MI300 also uses HBM3: the MI300A's capacity remains at 128 GB like its predecessor, while the more advanced MI300X has ramped up to 192 GB, marking a 50% increase. Google is expected to broaden its partnership with Broadcom in late 2023 to produce its ASIC AI accelerator chip, the TPU, which will also incorporate HBM memory, in order to extend AI infrastructure.
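The quoted capacity jumps are easy to double-check; the snippet below simply verifies the percentages using the figures given in the paragraph above.

# Verify the quoted HBM capacity increases using only the figures in the text above.
def pct_increase(old_gb: int, new_gb: int) -> float:
    return (new_gb / old_gb - 1) * 100

print(f"H100 (80 GB) -> Grace Hopper Superchip (96 GB): +{pct_increase(80, 96):.0f}%")  # 20%
print(f"MI300A (128 GB) -> MI300X (192 GB): +{pct_increase(128, 192):.0f}%")            # 50%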

Chinese Tech Firms Buying Plenty of NVIDIA Enterprise GPUs

TikTok developer ByteDance, and other major Chinese tech firms including Tencent, Alibaba and Baidu are reported (by local media) to be snapping up lots of NVIDIA HPC GPUs, with even more orders placed this year. ByteDance is alleged to have spent enough on new products in 2023 to match the expenditure of the entire Chinese tech market on similar NVIDIA purchases for FY2022. According to news publication Jitwei, ByteDance has placed orders totaling $1 billion so far this year with Team Green—the report suggests that a mix of A100 and H800 GPU shipments have been sent to the company's mainland data centers.

The older Ampere-based A100 units were likely ordered prior to trade sanctions enforced on China post-August 2022, with further wiggle room allowed—meaning that shipments continued until September. The H800 GPU is a cut-down variant of 2022's flagship "Hopper" H100 model, designed specifically for the Chinese enterprise market—with reduced performance in order to meet export restriction standards. The H800 costs around $10,000 (average sale price per accelerator) according to Tom's Hardware, so it must offer some level of potency at that price. ByteDance has ordered roughly 100,000 units—with an unspecified split between H800 and A100 stock. Despite the development of competing HPC products within China, it seems that the nation's top-flight technology companies are heading directly to NVIDIA to acquire the best-of-the-best and highly mature AI processing hardware.

Supermicro Expands AMD Product Lines with New Servers and New Processors Optimized for Cloud Native Infrastructure

Supermicro, Inc., a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is announcing that its entire line of H13 AMD-based systems is now available with support for 4th Gen AMD EPYC processors based on the "Zen 4c" architecture, and 4th Gen AMD EPYC processors with AMD 3D V-Cache technology. Supermicro servers powered by 4th Gen AMD EPYC processors for cloud-native computing, with leading thread density and 128 cores per socket, deliver impressive rack density and scalable performance with energy efficiency to deploy cloud-native workloads in more consolidated infrastructure. These systems are targeted at cloud operators looking to meet the ever-growing demands of user sessions and deliver new AI-enabled services. Servers featuring AMD 3D V-Cache technology excel in running technical applications in FEA, CFD, and EDA. The large Level 3 cache enables these types of applications to run faster than ever before. Over 50 world record benchmarks have been set with AMD EPYC processors over the past few years.

"Supermicro continues to push the boundary of our product lines to meet customers' requirements. We design and deliver resource-saving, application-optimized servers with rack scale integration for rapid deployments," said Charles Liang, president, and CEO of Supermicro. "With our growing broad portfolio of systems fully optimized for the latest 4th Gen AMD EPYC processors, cloud operators can now achieve extreme density and efficiency for numerous users and cloud-native services even in space-constrained data centers. In addition, our enhanced high performance, multi-socket, multi-node systems address a wide range of technical computing workloads and dramatically reduce time-to-market for manufacturing companies to design, develop, and validate new products leveraging the accelerated performance of memory intensive applications."

Giga Computing Expands Support for 4th Gen AMD EPYC Processors

Giga Computing, a subsidiary of GIGABYTE and an industry leader in high-performance servers, server motherboards, and workstations, today announced support for the latest 4th Gen AMD EPYC processors. The new processors, based on the "Zen 4c" architecture or featuring AMD 3D V-Cache technology, enhance GIGABYTE's enterprise solutions, enabling superior performance and scalability for cloud-native and technical computing applications. To date, more than thirty unique GIGABYTE systems and platforms support the latest generation of AMD EPYC 9004 processors. Over time, Giga Computing will roll out additional GIGABYTE models for this platform, including more SKUs for immersion-ready servers and direct liquid cooling systems.

"For every new generation of AMD EPYC processors, GIGABYTE has been there, offering diverse platform options for all workloads and users," said Vincent Wang, Sales VP at Giga Computing. "And with the recent announcement of new AMD EPYC 9004 processors for technical computing and cloud native computing, we are also ready to support them at this time on our current AMD EPYC 9004 Series platforms."

ASUS Unveils ESC N8-E11, an HGX H100 Eight-GPU Server

ASUS today announced ESC N8-E11, its most advanced HGX H100 eight-GPU AI server, along with a comprehensive PCI Express (PCIe) GPU server portfolio—the ESC8000 and ESC4000 series empowered by Intel and AMD platforms to support higher CPU and GPU TDPs to accelerate the development of AI and data science.

ASUS is one of the few HPC solution providers with comprehensive in-house resources, consisting of the ASUS server business unit, Taiwan Web Service (TWS), and ASUS Cloud—all part of the ASUS group. This uniquely positions ASUS to deliver in-house AI server design, data-center infrastructure, and AI software-development capabilities, plus a diverse ecosystem of industrial hardware and software partners.

Seagate HAMR 32 TB Capacity Drives Arriving Later This Year, 40+ TB in 2024

Seagate has recently published a preview of its next-generation hard drive lineup, which utilizes heat-assisted magnetic recording (HAMR) technology. A company roadmap indicates that the first commercial release of 32 TB capacity HAMR Mach 2 drives is penciled in for a Q3 2023 window, with a short hop to increased storage (40 TB) models predicted for launch in 2024. Seagate is also expected to release 24 TB and 28 TB capacity HDDs - based on the older perpendicular magnetic recording (PMR) technology - at some point in the near future. Technology news outlets anticipate that these two product ranges will co-exist for a while, until Seagate decides to favor its more advanced thermal magnetic storage solution. A lucky data center client was reported in late April to be getting hands-on time with evaluation HAMR hardware, and Seagate has since supplied other enterprise customers with unspecified HAMR HDD models.

Executives at Seagate have been openly discussing their HAMR products - destined to sit in new Corvault server equipment. Gianluca Romano, the company's chief financial officer, mentioned several models during a presentation at the Bank of America 2023 Global Technology conference: "When you go to HAMR, our 32-terabyte (model) is based on 10 disks and 20 heads. So same number of disks and head of the current 20-terabyte PMR...So all the increase is coming through areal density. The following one, 40-terabyte, still (has) the same 10 disks and 20 heads. And also the 50 (TB model), we said at our earnings release, in our lab, we are already running individual disk at 5 terabytes."
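The areal-density claim in the quote can be sanity-checked with quick arithmetic: dividing each quoted drive capacity by the same 10-disk configuration yields the per-platter capacities Romano describes. A minimal sketch:

# Per-disk (per-platter) capacity implied by the quoted drive capacities,
# assuming the same 10-disk, 20-head configuration throughout.
DISKS_PER_DRIVE = 10

for total_tb in (20, 32, 40, 50):  # current PMR flagship, then the quoted HAMR steps
    print(f"{total_tb} TB drive -> {total_tb / DISKS_PER_DRIVE:.1f} TB per disk")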

Kingston Technology Releases Server Premier 5600MT/s and 5200MT/s DDR5 ECC UDIMMs and ECC SODIMMs

Kingston Technology Company, Inc., a world leader in memory products and technology solutions, today announced the release of its 32 GB and 16 GB Server Premier DDR5 5600MT/s and 5200MT/s ECC Unbuffered DIMMs and ECC SODIMMs.

Server Premier UDIMM and ECC SODIMMs
For over 35 years Kingston has been the memory brand trusted by leading server manufacturers and the world's largest data centers. Server Premier is Kingston's industry-standard, server-class memory solution sold by specification for use in white-box systems, and it is Intel platform validated and qualified by leading motherboard/system manufacturers. Featuring a locked BOM (Bill of Materials) to provide a consistent brand and revision of primary components (including DRAM, PMIC, SPD hub, thermal sensors, and PCB), all Kingston server memory solutions are 100% tested and undergo a rigorous dynamic burn-in process designed to catch early-life failures at the factory.

Kingston Brings XS1000 External SSD and Non-Binary DDR5 to Computex 2023

Kingston was thrilled to be back at Computex to showcase some of its new and upcoming products, including the new XS1000 External SSD and some of its new and current Fury DDR5 memory modules and kits. In addition to these new products, Kingston also showcased its dedication to the enterprise and server market with the DC600M enterprise SSD, industrial SD cards, and Server Premier DDR5 memory, as well as its focus on creators, gamers, and those on the move, with the Fury DDR5 memory, Fury Renegade SSDs, the new DT microDuo and DTMax flash drives, SD and microSD cards, and more.

Set to be available in Q3 2023, the Kingston XS1000 External SSD aims to bring pocket-sized portability without compromising performance. It will be available in 1 TB and 2 TB capacities, feature a USB 3.2 Gen 2 interface, and peak at 1000 MB/s sequential read and write performance. Kingston is also announcing an updated Fury Renegade DDR5 RGB memory lineup, which will be available in 16 GB to 48 GB capacities, bring non-binary kits, and range from 6000 to 7200 MT/s. In addition, there is also the FURY Renegade PCIe 4.0 M.2 SSD, coming in capacities of up to 4 TB and reaching sequential read and write speeds of up to 7,300 and 7,000 MB/s, respectively.

Giga Computing Goes Big with Green Computing and HPC and AI at Computex

Giga Computing, a subsidiary of GIGABYTE and an industry leader in high-performance servers, server motherboards, and workstations, today announced a major presence at Computex 2023, held May 30 to June 2. The GIGABYTE booth showcases more than fifty servers that span GIGABYTE's comprehensive enterprise portfolio, including green computing solutions that feature liquid-cooled servers and immersion cooling technology. The international computer expo attracts over 100,000 visitors annually, and GIGABYTE will be ready with a spacious and attractive booth to draw in curious minds, along with plenty of knowledgeable staff to answer questions about how its products are being utilized today.

The slogan for Computex 2023 is "Together we create," and just like parts that make a whole, GIGABYTE's own slogan, "Future of COMPUTING," embodies all of its distinct computing products, from consumer to enterprise applications. For the enterprise business unit, there will be sections with three themes: "Win Big with AI HPC," "Advance Data Centers," and "Embrace Sustainability." Each theme will show off cutting-edge technologies that span x86 and ARM platforms, with great attention placed on solutions that address the challenges that come with more powerful computing.

NVIDIA Collaborates With Microsoft to Accelerate Enterprise-Ready Generative AI

NVIDIA today announced that it is integrating its NVIDIA AI Enterprise software into Microsoft's Azure Machine Learning to help enterprises accelerate their AI initiatives. The integration will create a secure, enterprise-ready platform that enables Azure customers worldwide to quickly build, deploy and manage customized applications using the more than 100 NVIDIA AI frameworks and tools that come fully supported in NVIDIA AI Enterprise, the software layer of NVIDIA's AI platform.

"With the coming wave of generative AI applications, enterprises are seeking secure accelerated tools and services that drive innovation," said Manuvir Das, vice president of enterprise computing at NVIDIA. "The combination of NVIDIA AI Enterprise software and Azure Machine Learning will help enterprises speed up their AI initiatives with a straight, efficient path from development to production."

Supermicro Launches Industry's First NVIDIA HGX H100 8-GPU and 4-GPU Servers with Liquid Cooling

Supermicro, Inc., a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, continues to expand its data center offerings with liquid-cooled NVIDIA HGX H100 rack-scale solutions. Advanced liquid cooling technologies entirely from Supermicro reduce the lead time for a complete installation, increase performance, and result in lower operating expenses, while significantly reducing the PUE of data centers. Power savings for a data center are estimated at 40% when using Supermicro liquid cooling solutions compared to an air-cooled data center. In addition, up to an 86% reduction in direct cooling costs compared to existing data centers may be realized.
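For context, PUE (power usage effectiveness) is total facility power divided by IT power. The sketch below uses assumed, illustrative wattages, not Supermicro's measured data, to show how cutting cooling overhead lowers PUE.

# PUE = total facility power / IT equipment power.
# The wattage figures below are illustrative assumptions, not Supermicro measurements.
def pue(it_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    return (it_kw + cooling_kw + other_overhead_kw) / it_kw

it_load = 1000.0  # kW of IT equipment
air_cooled_pue = pue(it_load, cooling_kw=450.0, other_overhead_kw=100.0)
liquid_cooled_pue = pue(it_load, cooling_kw=100.0, other_overhead_kw=100.0)

print(f"Air-cooled PUE (assumed overheads):    {air_cooled_pue:.2f}")     # ~1.55
print(f"Liquid-cooled PUE (assumed overheads): {liquid_cooled_pue:.2f}")  # ~1.20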

"Supermicro continues to lead the industry supporting the demanding needs of AI workloads and modern data centers worldwide," said Charles Liang, president, and CEO of Supermicro. "Our innovative GPU servers that use our liquid cooling technology significantly lower the power requirements of data centers. With the amount of power required to enable today's rapidly evolving large scale AI models, optimizing TCO and the Total Cost to Environment (TCE) is crucial to data center operators. We have proven expertise in designing and building entire racks of high-performance servers. These GPU systems are designed from the ground up for rack scale integration with liquid cooling to provide superior performance, efficiency, and ease of deployments, allowing us to meet our customers' requirements with a short lead time."

Killer Instinct Migrating to Modern Server Infrastructure

Dear Combo Breakers - this winter will be Killer Instinct's 10th year on Xbox Live. Throughout that time, Xbox platforms and services have been constantly evolving. We have been able to keep KI operating and adapting to many of these changes behind the scenes. Over the last five years, however, it has become more challenging to deal with issues that crop up due to KI's reliance on legacy services. We have been working on solutions to address these concerns. Thank you for being patient and amazing fans through any bumps in the road.

We have started migrating KI's legacy services to PlayFab services, a process which will happen over the next several months. This is a quality-of-life (QOL) update and does not have any new content or tuning changes: it ensures the game you know and love continues to provide the best possible player experience. Today, we've kicked off the first migration build for Xbox consoles, Windows PC, and Steam.

Samsung Trademark Applications Hint at Next Gen DRAM for HPC & AI Platforms

The Korea Intellectual Property Rights Information Service (KIPRIS) has been processing a batch of trademark applications submitted by Samsung Electronics in recent weeks. News outlets pointed out earlier this month that the South Korean multinational manufacturing conglomerate was attempting to secure the term "Snowbolt" as a moniker for an unreleased HBM3P DRAM-based product. Industry insiders and Samsung representatives have indicated that this high bandwidth memory (offering 5 TB/s of bandwidth per stack) will be featured in upcoming cloud servers and high-performance and AI computing platforms slated for release later in 2023.

A Samsung-focused news outlet, SamMobile, reported on May 15 on further trademark applications for next-generation DRAM (Dynamic Random Access Memory) products. Samsung has filed for two additional monikers - "Shinebolt" and "Flamebolt" - and details published online show that these products share the same "designated goods" descriptors with the preceding "Snowbolt" registration: "DRAM modules with high bandwidth for use in high-performance computing equipment, artificial intelligence, and supercomputing equipment" and "DRAM with high bandwidth for use in graphic cards." Kye Hyun Kyung, CEO of Samsung Semiconductor, has been talking up his company's ambitions of competing with rival TSMC in providing cutting-edge component technology, especially in the field of AI computing. It is too early to determine whether these "-bolt" DRAM products will be part of that competitive move, but it is good to know that speedier memory is on the way - future-generation GPUs are set to benefit.

Server Shipments to Fall an Estimated 2.85% YoY in 2023

TrendForce reveals that alongside the four major CSPs reducing their procurement volumes, OEMs like Dell and HPE have also scaled back their annual shipment volume forecasts at some point between February and April, predicting YoY declines of 15% and 12%, respectively. Furthermore, server demand in China is facing headwinds due to geopolitical and economic challenges. Consequently, TrendForce projects a downward revision in global server shipment volumes for this year—a 2.85% YoY decrease, to 13.835 million units.
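A quick back-of-the-envelope check, using only the two figures in the forecast, shows the 2022 baseline that the 2.85% decline implies:

# Implied 2022 shipment volume from the 2023 forecast and the projected YoY decline.
forecast_2023_m = 13.835  # million units
yoy_decline = 0.0285      # 2.85%

implied_2022_m = forecast_2023_m / (1 - yoy_decline)
print(f"Implied 2022 shipments: {implied_2022_m:.2f} million units")  # ~14.24 million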

TrendForce emphasizes that the server market in 1H23 remains pessimistic, with 1Q23 shipments experiencing a 15.9% QoQ decrease due to off-season factors and end-user inventory adjustments. The expected industry boom in 2Q23 failed to materialize, leading to a modest QoQ growth estimate of only 9.23%. Persistent influences on server shipments include OEMs lowering shipment volumes, subdued domestic demand in China, and continuous supply chain inventory adjustments. ESG issues have also led CSPs to prolong server lifecycles and reduce procurement volume. Moreover, OEMs are lengthening support periods for older platforms as businesses seek to control capital expenditures, further contributing to market strain.

Supermicro Announces New Eight- and Four-Socket 4th Gen Intel Xeon Servers

Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Provider for Cloud, AI/ML, Storage, and 5G/Edge, is introducing the most powerful server in its lineup for large-scale database and enterprise applications. The Multi-Processor product line includes the 8-socket server, ideal for in-memory databases requiring up to 480 cores and 32 TB of DDR5 memory for maximum performance. In addition, the product line includes a 4-socket server, which is ideal for applications that require a single system image of up to 240 cores and 16 TB of high-speed memory.

These powerful systems all use 4th Gen Intel Xeon Scalable processors. Compared with the previous generation of 8-socket and 4-socket servers, the systems have 2X the core count, 1.33X the memory capacity, and 2X the memory bandwidth. Also, these systems deliver up to 4X the I/O bandwidth compared to previous generations of systems for connectivity to peripherals. The Supermicro 8-socket system has attained the highest performance ratings ever for a single system based on the SPECcpu2017 FP Rate benchmarks, for both the base and peak results. In addition, the Supermicro 8-socket and 4-socket servers demonstrate performance leadership on a wide range of SPEC benchmarks.
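As a quick sanity check, the per-socket resources implied by those totals work out as follows (a simple sketch using only the figures quoted above):

# Per-socket resources implied by the quoted system totals.
for sockets, total_cores, total_mem_tb in ((8, 480, 32), (4, 240, 16)):
    print(f"{sockets}-socket system: {total_cores // sockets} cores and "
          f"{total_mem_tb / sockets:.0f} TB of DDR5 per socket")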

Report: DRAM and NAND Flash Prices Expected to Fall Further in 2Q23 Due to Weak Server Shipments and High Inventory Levels

TrendForce's latest research indicates that, as production cuts to DRAM and NAND Flash have not kept pace with weakening demand, the ASP of some products is expected to decline further in 2Q23. DRAM prices are projected to fall 13~18%, while NAND Flash prices are expected to fall 8~13%.

TrendForce reports that the significant drop in DRAM prices was mostly attributed to high inventory levels of DDR4 and LPDDR5, as PC DRAM, server DRAM, and mobile DRAM collectively account for over 85% of DRAM consumption. Meanwhile, the market share for DDR5 remains relatively low.

Nfina Technologies Releases Two New 3rd Gen Intel Xeon Scalable Processor-based Systems

Nfina announces the addition of two new server systems to its lineup, customized for small to medium businesses and virtualized environments. Featuring 3rd Gen Intel Xeon Scalable Processors, these scalable server systems fill a void in the marketplace, bringing exceptional multi-socket processing performance, easy setup, operability, and Nfina's five-year warranty.

"We are excited to add two new 3rd generation Intel systems to Nfina's lineup. Performance, scalability, and flexibility are key deciding factors when expanding our offerings," says Warren Nicholson, President and CEO of Nfina. "Both servers are optimized for high- performance computing, virtualized environments, and growing data needs." He continues by saying, "The two servers can also be leased through our managed services division. We provide customers with choices that fit the size of their application and budget - not a one size fits all approach."