News Posts matching #SDR


AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

AMD Radeon RX 6000 and RX 7000 series GPUs, based on the RDNA 2 and RDNA 3 architectures, have been benchmarked by the folks over at ComputerBase. However, these weren't regular performance benchmarks, but rather power consumption measurements. According to the latest results, enabling Variable Refresh Rate (VRR) can lower the idle power consumption of AMD Radeon cards. Using a 4K display with a 144 Hz refresh rate, ComputerBase benchmarked the Radeon RX 6800/6700 XT and RX 7900 XT, covering both last-generation and current-generation graphics cards. The performance matrix also includes a comparison to the Intel Arc A770, NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.

Regarding the figures, the tests compare desktop idle consumption, dual-monitor power consumption, window movement, YouTube with SDR at 60 FPS, and YouTube with HDR at 60 FPS, all done on a 4K 144 Hz monitor setup. You can see the comparison below, with the most significant reduction in power consumption coming from the Radeon RX 7900 XTX, which draws 81% less power in a single-monitor setup and 71% less in a dual-monitor setup.
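For readers who want a rough, reproducible idle-power reading of their own on Linux, the sketch below polls the board power reported by the amdgpu driver through hwmon. It is only an approximation of what ComputerBase measured (they measure the whole card, and at different workloads), and the power1_average sensor name, sample count, and polling interval are assumptions that may vary by kernel and card.

import glob
import time

def find_amdgpu_power_file():
    # Look for the hwmon node registered by the amdgpu driver; the hwmon
    # index differs from system to system, so search rather than hard-code.
    for name_path in glob.glob("/sys/class/hwmon/hwmon*/name"):
        with open(name_path) as f:
            if f.read().strip() == "amdgpu":
                return name_path.replace("name", "power1_average")
    return None

def average_idle_power(samples=30, interval=1.0):
    # Average the reported board power (in watts) over several readings.
    power_file = find_amdgpu_power_file()
    if power_file is None:
        raise RuntimeError("No amdgpu hwmon sensor found")
    readings = []
    for _ in range(samples):
        with open(power_file) as f:
            readings.append(int(f.read()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"Average idle power: {average_idle_power():.1f} W")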

AVerMedia Launches the Live Gamer EXTREME 3 with VRR Support

AVerMedia Technologies, Inc., a leader in digital video and audio solutions, is thrilled to announce the launch of the Live Gamer EXTREME 3 (GC551G2). As AVerMedia's latest plug-and-play external 4K capture card, it can capture up to 4K 30fps in SDR while passing through gameplay at up to 4K 60fps in HDR. It is also AVerMedia's first capture card that supports variable refresh rate (VRR).

With VRR, the Live Gamer EXTREME 3 brings out the best performance in a competitive streaming session. VRR lets the monitor and the gaming device synchronize each frame, providing smooth, tear-free gameplay so gamers can focus solely on the action without being distracted by stutter or tearing. Additionally, the Live Gamer EXTREME 3's ultra-low latency feature enables zero lag between the gaming device and the monitor, ensuring the captured signal is delivered to the live stream just as fast. These two features let gamers enjoy the highest video quality in both their gameplay and their streams. And thanks to the use of the UVC standard, the Live Gamer EXTREME 3 works out of the box without the need to install any drivers.
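Because the card enumerates as a standard UVC device, any application that can open a webcam can pull frames from it. The sketch below uses OpenCV in Python to preview the capture feed; the device index and the 4K30 capture mode are assumptions and depend on how the card shows up on a given system.

import cv2

# Open the capture card as a regular UVC video device (index is system-dependent).
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 3840)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 2160)
cap.set(cv2.CAP_PROP_FPS, 30)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Capture preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()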

Seagate Ramps 20TB HDD Shipments Answering Mass Data Growth

Data drives today's most innovative technology and business breakthroughs. Maximizing the value of an organization's data is dependent on the ability to store, access, and activate as much data as possible. Today, Seagate Technology Holdings plc, a world leader in mass-data storage infrastructure solutions, launched the new Exos X20 20 TB and IronWolf Pro 20 TB conventional magnetic recording (CMR)-based hard disk drives (HDDs), increasing mass-capacity data storage capabilities.

Seagate's Exos X20 enterprise HDD is designed for maximum storage capacity and the highest rack-space efficiency. Built with cloud storage in mind, the 20 TB Exos X20 delivers performance for hyperscale data centers and massive scale-out applications. With a low latency of 4.16 ms and repeatable response times, the Exos X20 provides enhanced caching that performs up to three times better than solutions that only utilize read or write caching. The Exos X20 also delivers an increased sustained data rate (SDR) of up to 285 MB/s.
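As a rough back-of-the-envelope figure: if the drive could hold that 285 MB/s rate across its entire surface (an optimistic assumption, since sustained transfer rates fall toward the inner tracks), writing the full 20 TB would take about 20,000,000 MB / 285 MB/s ≈ 70,000 seconds, or roughly 19.5 hours.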

Performance Penalty from Enabling HDR at 4K Lower on AMD Hardware Versus NVIDIA

The folks over at Computerbase.de have taken it upon themselves to study exactly how much of an impact (if any) activating HDR on a 4K panel has on performance across different hardware configurations. Supposedly, HDR shouldn't impose any performance penalty on GPUs that were designed with that output in mind at the hardware level; however, as we know, expectations can sometimes be wrong.

NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the default factory values for brightness, contrast, and even gamma on the SDR monitor, compromising the image quality it was actually able to convey. Resetting the monitor to its factory values resulted in a noticeably less muted image on the SDR monitor than before, which points to a deliberate attempt to reduce image quality on the SDR presentation. Now granted, perceptions of image quality when comparing SDR to HDR are to some extent personal and subjective; however, brightness, contrast, and gamma being set outside even their factory levels (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
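To illustrate why those three settings in particular matter, the sketch below shows how a simple brightness/contrast/gamma curve remaps 8-bit SDR pixel values. The formula and the example values are purely illustrative; they are not the settings used in the demo described above.

import numpy as np

def adjust(pixels, brightness=0.0, contrast=1.0, gamma=1.0):
    # Apply a simple brightness/contrast/gamma curve to 8-bit values.
    x = pixels.astype(np.float64) / 255.0                  # normalize to 0..1
    x = np.clip(contrast * (x - 0.5) + 0.5 + brightness, 0.0, 1.0)
    x = x ** (1.0 / gamma)                                 # gamma curve
    return np.round(x * 255.0).astype(np.uint8)

ramp = np.arange(0, 256, 32, dtype=np.uint8)               # a small grey ramp
print("factory-like:", ramp)
print("muted:      ", adjust(ramp, brightness=-0.1, contrast=0.8, gamma=0.8))

Lowering contrast and gamma like this compresses highlights and darkens midtones, producing exactly the kind of washed-out picture that makes a neighboring HDR panel look far more impressive than it otherwise would.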