News Posts matching #Power Consumption


Alleged ARM Cortex-X5 Underperformance Linked to Power Consumption Concerns

ARM's in-progress fifth-generation "Blackhawk" Cortex design is allegedly going through a troubled phase of development, according to Chinese insider sources. A Revegnus (@Tech_Reve) social media post highlights ongoing issues: "It's reported that the Cortex X5 architecture is underperforming compared to expectations. It's speculated that the high-frequency power consumption has surged explosively. Therefore, if performance is reduced for lower power consumption, the Geekbench 6 multi-core score of Dimensity 9400 may not achieve a score of 9,400 points." A recent Moor Insights & Strategy analysis piece proposed that "Blackhawk" would become "the most powerful option available at launch" later this year, yet mobile chipsets leveraging ARM's Cortex-X5 design are expected to face tough next-generation competition from Qualcomm and Apple.

Revegnus also weighed in on a rival SoC: "While Snapdragon 8 Gen 4 is seen to have minor issues, there is no evidence to support this claim. There might be a problem with low-frequency power consumption not showing clear superiority over ARM's middle cores." According to insiders, Qualcomm's next flagship is performing admirably: an engineering sample managed to score 10,628 points in an alleged Geekbench 6 multi-core run. Late last month, Digital Chat Station leaked prototype clocks, claiming that a Snapdragon 8 Gen 4 high-performance "big" core was capable of reaching 4.0 GHz. Prior to the latest news, MediaTek's Dimensity 9400 SoC was observed achieving multi-core Geekbench 6 scores of around 10,000; leaked CPU cluster details point to a single "big" Cortex-X5 unit operating alongside three Cortex-X4 cores.

AI Power Consumption Surge Strains US Electricity Grid, Coal-Powered Plants Make a Comeback

The artificial intelligence boom is driving a sharp rise in electricity use across the United States, catching utilities and regulators off guard. In northern Virginia's "data center alley," demand is so high that the local utility temporarily halted new data center connections in 2022. Nationwide, electricity consumption at data centers alone could triple by 2030, to 390 terawatt-hours (TWh). Add in new electric vehicle battery factories, chip plants, and other clean tech manufacturing spurred by federal incentives, and demand over the next five years is forecast to rise at around 1.5% per year, the fastest rate since the 1990s. Unable to keep pace, some utilities are scrambling to revise projections and reconsider previous plans of closing fossil fuel plants, even as the Biden administration pushes for more renewable energy. Some older coal power plants will stay online until the grid adds more production capacity. The result could be increased emissions in the near term and a risk of rolling blackouts if infrastructure continues to lag behind demand.
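For a sense of scale, here is a quick back-of-the-envelope check of those projections in Python (the current baseline figure is inferred from the article's numbers rather than stated in it):

```python
# Implied baseline: if consumption triples to 390 TWh by 2030,
# today's data center consumption is roughly a third of that.
projected_2030_twh = 390
implied_baseline_twh = projected_2030_twh / 3
print(f"Implied current consumption: ~{implied_baseline_twh:.0f} TWh")  # ~130 TWh

# Overall demand growing at ~1.5% per year compounds over five years.
total_growth = 1.015 ** 5 - 1
print(f"Five years at 1.5%/yr: +{total_growth * 100:.1f}% total demand")  # +7.7%
```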

The situation is especially dire in Virginia, the world's largest data center hub. The state's largest utility, Dominion Energy, was forced to pause new data center connections for three months last year due to surging demand in Loudoun County. Though connections have resumed, Dominion expects its load to almost double over the next 15 years. With data centers, EV factories, and other power-hungry tech continuing to expand rapidly, experts warn that the US electricity grid is poorly equipped to handle the spike. Substantial investments in new transmission lines and generation are urgently needed to avoid businesses being turned away, or blackouts in some regions. Though many tech companies aim to power their operations with clean energy, factories are increasingly open to any available power source.

AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

AMD Radeon RX 6000 and RX 7000 series graphics cards, based on the RDNA 2 and RDNA 3 GPU architectures, have been benchmarked by the folks over at ComputerBase. However, these weren't regular performance benchmarks, but rather power consumption measurements. According to their latest results, enabling Variable Refresh Rate (VRR) can lower the idle power consumption of AMD Radeon cards. Using a 4K display with a 144 Hz refresh rate, ComputerBase benchmarked the Radeon RX 6800, RX 6700 XT, and RX 7900 XTX, covering both last-generation and current-generation graphics cards. The comparison also includes the Intel Arc A770 and the NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.

The tests compare desktop idle consumption, dual-monitor power consumption, window movement, YouTube with SDR at 60 FPS, and YouTube with HDR at 60 FPS, all done on a 4K 144 Hz monitor setup. You can see the comparison below, with the most significant reduction in power consumption being the Radeon RX 7900 XTX using 81% less power in a single-monitor and 71% less power in a dual-monitor setup.
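As a minimal illustration of how such percentages are derived (the watt figures below are hypothetical placeholders, not ComputerBase's measurements):

```python
def reduction_pct(before_w: float, after_w: float) -> float:
    """Percentage drop in power draw after enabling VRR."""
    return (before_w - after_w) / before_w * 100

# e.g. a card idling at 100 W that drops to 19 W with VRR enabled:
print(f"{reduction_pct(100, 19):.0f}% less power")  # -> 81% less power
```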

AMD Confirms that Instinct MI300X GPU Can Consume 750 W

AMD recently revealed its Instinct MI300X GPU at its Data Center and AI Technology Premiere event on Tuesday (June 13). The keynote presentation did not provide any details about the new accelerator model's power consumption, but that did not stop one tipster - Hoang Anh Phu - from obtaining this information from Team Red's post-event footnotes. A comparative observation was made: "MI300X (192 GB HBM3, OAM Module) TBP is 750 W, compared to last gen, MI250X TBP is only 500-560 W." A leaked Giga Computing roadmap from last month anticipated server-grade GPUs hitting the 700 W mark.

NVIDIA's Hopper H100, with a maximum rating of 700 W, was until now the most power-hungry data center enterprise GPU. The MI300X's OCP Accelerator Module-based design surpasses Team Green's flagship with a slightly higher rating. AMD's new "leadership generative AI accelerator" sports 304 CDNA 3 compute units, a clear upgrade over the MI250X's 220 CDNA 2 CUs. Engineers have also introduced new 24 GB HBM3 stacks, so the MI300X can be specced with up to 192 GB of memory, whereas the MI250X is limited to 128 GB with its slower HBM2E stacks. We hope to see sample units producing benchmark results very soon, with the MI300X pitted against the H100.
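Put as simple arithmetic, the generation-over-generation deltas from the figures quoted above work out as follows (a quick sketch; the TBP comparison uses the upper end of the MI250X's 500-560 W range):

```python
# MI250X -> MI300X deltas, using the numbers quoted in the article.
specs = {
    "compute units": (220, 304),  # CDNA 2 -> CDNA 3
    "memory (GB)":   (128, 192),  # HBM2E -> HBM3
    "TBP (W)":       (560, 750),  # upper end of the MI250X range
}
for name, (old, new) in specs.items():
    print(f"{name}: {old} -> {new} (+{(new - old) / old * 100:.0f}%)")
# compute units: +38%, memory: +50%, TBP: +34%
```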

Giga Computing Leaked Server Roadmap Points to 600 W CPUs & 700 W GPUs

A leaked roadmap, seemingly authored by Giga Computing, provides an interesting peek into the future of next-generation enterprise-oriented CPUs and GPUs. TDP details of Intel, AMD, and NVIDIA hardware are featured within the presentation slide, and all indications point to a trend of continued power consumption growth. Intel's server CPU lineups, including fourth-generation Sapphire Rapids-SP and fifth-generation Emerald Rapids-SP Xeon chips, are projected to hit maximum TDPs of 350 W by mid-2024. Team Blue's sixth-generation Granite Rapids is expected to arrive in the latter half of 2024, and Gigabyte's leaked roadmap points to a push into 500 W territory going into 2025.

AMD's Zen 5-based Turin server CPUs are expected to ship by the second half of 2024, with power consumption estimated to hit a maximum of 600 W - a 50% increase over the Zen 4-based Genoa family, which tops out at 400 W. The 2024 NVIDIA PCIe GPU lineup is likely hitting TDPs of up to 500 W; it is rumored that these enterprise cards will be based on the Blackwell architecture, set to succeed current-generation H100 "Hopper" PCIe accelerators (featuring 350-450 W TDPs). AMD's Instinct-class PCIe accelerator family, rated at up to 400 W, could become the direct competition. The AMD Instinct MI250 OAM category has a maximum rating of 560 W, and the NVIDIA Grace and Grace Hopper Superchips are said to feature 600 W and 1000 W TDPs, respectively.

Kyocera's New "On-Board Optics Module" Achieves World-Record Bandwidth, Reduces Power Consumption for Data Centers

Kyocera Corporation today announced it has developed an On-Board Optics Module that achieves a world-record bandwidth of 512 Gbps. The module is expected to support high-speed network applications, such as data centers. Additionally, by converting electrical signals into optical signals, the module uses much less power than conventional alternatives, helping decrease data center power consumption and promote sustainability.

Kyocera's prototype module is miniaturized for installation on a printed circuit board near the processor, allowing electronic data to be converted into optical signals instantaneously. In addition, the product is designed to create unprecedented improvements in signal-to-noise ratio, virtually eliminating the signal loss caused by conventional electrical conductors. As a result of these technological advances, Kyocera's On-Board Optics Module has achieved world-record bandwidth of 512 gigabits per second (Gbps) and is expected to help data centers and supercomputers save power while increasing bandwidth and data transfer rates.

Latest AMD Radeon 21.6.1 Drivers Apparently Fix High Idle Power Consumption

Anecdotal reports of fixed idle power consumption for AMD's graphics cards are filling the internet following the release of their latest Radeon drivers, version 21.6.1. Besides introducing support for FSR (which we have already tested, with great results, using new image-comparison code from our resident W1zzard), users are reporting reduced power consumption in idle or low-intensity workloads - an issue we had already covered in our latest AMD reviews, which showed higher power consumption compared to NVIDIA.

Users are reporting drops from around 30 W to ~8 W - nothing to scoff at, and enough to bring AMD's offerings in line with NVIDIA's on that particular metric. While this remains anecdotal evidence for now, rest assured that we will be testing these new drivers so as to definitively confirm the improvement (or the lack thereof).

What AMD Didn't Tell Us: 21.4.1 Drivers Improve Non-Gaming Power Consumption By Up To 72%

AMD's recently released Radeon Software Adrenalin 21.4.1 WHQL drivers lower non-gaming power consumption, our testing finds. AMD did not mention these reductions in the changelog of its new driver release. We did a round of testing comparing the previous 21.3.2 drivers with 21.4.1, using Radeon RX 6000 series SKUs, namely the RX 6700 XT, RX 6800, RX 6800 XT, and RX 6900 XT. Our results show significant power-consumption improvements in certain non-gaming scenarios, such as system idle and media playback.

The Radeon RX 6700 XT shows no idle power draw reduction, but the RX 6800, RX 6800 XT, and RX 6900 XT posted big drops in idle power consumption at 1440p, going down from 25 W to 5 W. Multi-monitor power draw is unchanged. Media playback sees up to 30% lower power consumption for the RX 6800, RX 6800 XT, and RX 6900 XT. This is a huge improvement for builders of media PC systems, as lower power draw also means less heat and noise.
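To put that idle-power drop in perspective, here is a rough estimate of the yearly savings for an always-on media PC (the electricity rate is a hypothetical example value):

```python
# 25 W -> 5 W idle reduction on an always-on machine.
saved_watts = 25 - 5
hours_per_year = 24 * 365
kwh_saved = saved_watts * hours_per_year / 1000
print(f"~{kwh_saved:.0f} kWh saved per year")         # ~175 kWh
print(f"~${kwh_saved * 0.15:.2f}/year at $0.15/kWh")  # illustrative rate
```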

Intel's Comet Lake Absence at CES Reportedly Related to Power Consumption Wall

Reports are flooding the web regarding Intel's total lack of reference to its upcoming Comet Lake family of CPUs, which will be branded as the Intel Core 10000 series. As reports would have it, motherboard makers had stock of LGA 1200 motherboards ready to showcase at CES, but were told to pull them at what amounts to the logistical last minute. It seems that both Intel's lack of commitment to Comet Lake in its CES presentation and the absence of any ecosystem showcase at this year's event might come down to something close to embarrassment on Intel's part.

Comet Lake will increase the maximum core count of Intel's desktop CPUs to 10 cores and 20 logical threads. But being built on the same 14 nm process as every Intel generation since Skylake, there isn't much that can be done to offset the increased power consumption. This is why industry sources claim Intel decided to skip Comet Lake at this CES: a difficulty in reining in the processors' power consumption in time for the event, with power consumption reportedly hitting 300 W. And with Intel's Core i9-10900K configured with a PL2 (Power Limit 2) of 250 W, a maximum of 300 W under full load seems more than plausible.
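For readers unfamiliar with Intel's power limits, the sketch below models, in simplified form and with illustrative numbers, how PL1 and PL2 interact: the chip may burst up to PL2, but only until a moving average of package power exhausts the budget and the sustained PL1 limit takes over. Actual firmware behavior is more involved than this.

```python
# Minimal sketch of Intel's PL1/PL2 power-limit behavior (simplified model;
# exact firmware behavior is more complex). All numbers are illustrative.
PL1 = 125.0  # W, sustained long-duration limit
PL2 = 250.0  # W, short-duration turbo limit
TAU = 56.0   # s, time window of the weighted moving average

def allowed_power(requested_w: float, ewma_w: float, dt: float) -> tuple[float, float]:
    """Clamp requested package power so the moving average of power stays
    at or below PL1 while instantaneous power never exceeds PL2."""
    power = min(requested_w, PL2)             # hard cap at PL2
    alpha = dt / TAU                          # smoothing factor
    ewma = ewma_w + alpha * (power - ewma_w)  # tentative average
    if ewma > PL1:                            # budget exhausted: fall back to PL1
        power = PL1
        ewma = ewma_w + alpha * (power - ewma_w)
    return power, ewma

# A CPU asking for 240 W sustains turbo only until the average reaches PL1.
ewma = 60.0  # start near idle
for t in range(120):
    p, ewma = allowed_power(240.0, ewma, dt=1.0)
    if t % 20 == 0:
        print(f"t={t:3d}s  power={p:6.1f} W  avg={ewma:6.1f} W")
```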

NVIDIA Releases GeForce 416.81 WHQL Drivers Fixing "Turing" Power Consumption

NVIDIA today released GeForce 416.81 WHQL drivers. These drivers provide optimization for "Battlefield V," which appears to be available to Origin Access users. In addition, the drivers significantly reduce idle and multi-monitor power consumption of GeForce RTX 20-series graphics cards. They also correct G-Sync issues with "Turing" GPUs, and fix stuttering noticed on the RTX 2080 Ti when playing back HEVC videos. A number of game-specific issues in "ARK: Survival Evolved," "Shadow of the Tomb Raider," "Witcher 3: Wild Hunt," "Monster Hunter World," and "Far Cry 5" were also fixed. Grab the drivers from the link below.
DOWNLOAD: NVIDIA GeForce 416.81 WHQL

The change-log follows.

NVIDIA Finally Fixes Multi-Monitor Power Consumption of Turing GeForce 20. Tested on RTX 2070, 2080 and 2080 Ti.

Today, NVIDIA released their GeForce 416.81 drivers, which, among other things, contain the following changelog entry: "[Turing GPU]: Multi-monitor idle power draw is very high. [2400161]". Back at launch in September, Turing was plagued with very high non-gaming power consumption, in both single-monitor and multi-monitor idle.

The company was quick to fix single-monitor power consumption, which we tested promptly. Unfortunately, at the time, multi-monitor power draw wasn't improved and people were starting to get worried that there might be some kind of unfixable issue present on Turing that would prevent NVIDIA from fixing multi-monitor power draw.

NVIDIA Fixes RTX 2080 Ti & RTX 2080 Power Consumption. Tested. Better, But Not Good Enough

While conducting our first reviews of NVIDIA's new GeForce RTX 2080 and RTX 2080 Ti, we noticed surprisingly high non-gaming power consumption from NVIDIA's latest flagship cards. Back then, we reached out to NVIDIA, who confirmed that this was a known issue that would be fixed in an upcoming driver.

Today the company released version 411.70 of their GeForce graphics driver, which, besides adding Game Ready support for new titles, includes the promised fix for the RTX 2080 & RTX 2080 Ti.

We gave this new version a quick spin, using our standard graphics card power consumption testing methodology, to check how things have been improved.

ADATA Announces IUSP33F PCIe BGA SSD

ADATA Technology, a leading manufacturer of high-performance DRAM modules and NAND flash products, today launched the ADATA IUSP33F PCIe ball grid array (BGA) solid state drive (SSD). The SSD sports a form factor that is 80 percent more compact than M.2 2242 SSDs. Combined with a PCIe Gen3 x2 interface and 3D Flash memory for excellent performance and durability, the IUSP33F is an ideal solution for slim-form-factor tablets, notebooks, hybrids, mini-PCs, thin clients, and wearables.

"We are thrilled to be introducing the new IUSP33F SSD, a compact solution that will enable next-generation tablets, ultrabooks, and other slim devices, but without compromising on performance and reliability," said Hedi Huang, Sales Directorof ADATA. "But the versatility of the IUSP33F goes beyond just these applications, and are also well-suited for new emerging applications in areas such as robotics, augmented and virtual reality, and automotive.

Upcoming Windows 10 Task Manager Update to Show Power Usage, Power Usage Trend per Process

One nifty new feature currently being deployed to Windows 10 Fast Ring users is the ability to see exactly how much power a given process is consuming across your system's hardware (CPU, GPU & disk). The new feature, which appears as two additional columns in Task Manager's Processes tab, shows the instantaneous power usage of a given process and also features a trend indicator that covers a two-minute interval. This should be pretty handy, provided the measurement is close enough to the real power consumption. It could even serve as another flag for cryptomining malware or mining scripts on a given webpage. You can check the source for the additional changes brought to build 17704 of the Windows Insider Program.
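Microsoft has not documented exactly how the two-minute trend is computed; as a rough mental model, it behaves like a moving average over recent per-process samples. A minimal sketch, assuming periodic sampling (the class and its numbers are hypothetical, not Windows' actual implementation):

```python
from collections import deque

class PowerTrend:
    """Keep a sliding window of per-process power samples and average them."""
    def __init__(self, window_seconds: int = 120, sample_interval: int = 2):
        self.samples = deque(maxlen=window_seconds // sample_interval)

    def add_sample(self, milliwatts: float) -> None:
        self.samples.append(milliwatts)

    def trend(self) -> float:
        """Average draw over the retained window (up to two minutes)."""
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

trend = PowerTrend()
for reading in (120, 180, 450, 430, 410):  # hypothetical per-process readings
    trend.add_sample(reading)
print(f"2-minute trend: {trend.trend():.0f} mW")
```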

Cryptocurrency Mining Consumes More Power Than a Country of 17 Million People

So, yes, the headline is accurate. We all know that cryptocurrency mining has now reached an all-time high, which has affected the availability and pricing of most graphics cards from both AMD and NVIDIA. Who doesn't want to make a quick buck here and there? So long as it's profitable, right?

Well, that kind of thinking has already brought global mining power consumption to unprecedented levels (some might also say demented ones). The two top cryptocurrencies by market cap right now, Bitcoin and Ethereum, are responsible for annualized consumption figures of 14.54 TWh and 4.69 TWh, respectively. As of now, Ethereum consumes almost as much power as the 120th most power-consuming country, Moldova, which has a population of around 3 million. Bitcoin, on the other hand, stands at 81st on the list, in between Mozambique and Turkmenistan, the latter of which has an estimated population of 5.17 million people. Combined, Ethereum and Bitcoin consume more power than Syria, which had an estimated 2014 population above 17 million.
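The comparison reduces to straightforward arithmetic on the annualized figures quoted above:

```python
# Annualized consumption estimates quoted in the article.
bitcoin_twh = 14.54
ethereum_twh = 4.69
combined_twh = bitcoin_twh + ethereum_twh
print(f"Combined: {combined_twh:.2f} TWh/year")  # 19.23 TWh, ahead of Syria

# Rough per-capita sense check against Turkmenistan (~5.17 M people):
kwh_per_person = bitcoin_twh * 1e9 / 5.17e6  # 1 TWh = 1e9 kWh
print(f"Bitcoin alone ≈ {kwh_per_person:,.0f} kWh per Turkmen resident per year")
```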