News Posts matching #2160p


AVerMedia Launches the Live Gamer EXTREME 3 with VRR Support

AVerMedia Technologies, Inc., a leader in digital video and audio solutions, is thrilled to announce the launch of the Live Gamer EXTREME 3 (GC551G2). As AVerMedia's latest plug-and-play external 4K capture card, it can capture up to 4K 30fps in SDR while passing through gameplay at up to 4K 60fps in HDR. It is also AVerMedia's first capture card that supports variable refresh rate (VRR).

With VRR, the Live Gamer EXTREME 3 brings out the best performance for a competitive streaming session. VRR allows the monitor and gaming device to synchronize each frame to provide smooth, tear-free gameplay, so gamers can focus solely on the action without being distracted by graphical delays. Additionally, the Live Gamer EXTREME 3's ultra-low latency feature enables zero lag between the gaming device and the monitor, ensuring the capture signal is delivered to the live stream equally fast. These two features allow gamers to enjoy the highest video quality for their gameplay and streams. And thanks to the use of the UVC standard, the Live Gamer EXTREME 3 works out of the box without the need to install any drivers.
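As an illustration of what the UVC, driverless claim means in practice, a device that enumerates as a standard UVC camera can be opened like any webcam. Below is a minimal Python/OpenCV sketch; the device index and the requested capture mode are assumptions for illustration, and this is not AVerMedia's own software.

```python
# Minimal sketch: open a UVC capture device (e.g. an external capture card)
# just like a webcam. Device index 1 and the 4K30 mode are assumptions.
import cv2

cap = cv2.VideoCapture(1)                      # assumed index of the capture card
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 3840)        # request a 4K frame size
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 2160)
cap.set(cv2.CAP_PROP_FPS, 30)                  # the card captures up to 4K at 30 fps

while True:
    ok, frame = cap.read()                     # grab one frame from the feed
    if not ok:
        break
    cv2.imshow("Live Gamer EXTREME 3", frame)  # simple preview window
    if cv2.waitKey(1) & 0xFF == ord("q"):      # press q to stop
        break

cap.release()
cv2.destroyAllWindows()
```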

Intel Core i5-13600K and Core i7-13700K QS CPUs Benchmarked

Is there anything better than yet another benchmark leak of upcoming products? This time around we don't have to make do with Geekbench or some other useless benchmark, as a bilibili user in the PRC has posted a video where he put the upcoming Intel Core i5-13600K and Core i7-13700K CPUs through 10 different games, plus 3DMark Fire Strike and Time Spy. The tests were run at 1080p, 1440p and 2160p, using a GeForce RTX 3090 Ti graphics card. Both CPUs are QS or Qualification Samples, which means they should be close to identical to retail chips, unless some last-minute issues are discovered. The CPUs were tested using an ASRock Z690 Steel Legend WiFi 6E motherboard, well, two actually, as both a DDR4 and a DDR5 version were used. The DDR4 RAM was running at 3600 MHz with slow-ish timings of 18-22-22 in gear 1, whereas the DDR5 memory was running at 5200 MHz, most likely at 40-40-40 timings, although the modules were rated for 6400 MHz. In both cases we're looking at 32 GB of memory.

Courtesy of @harukaze5719, we have some much easier-to-read graphs than those provided by the person who tested the two CPUs, but we've included the full graphs below as well. Each CPU was compared to its current SKU equivalent from Intel, and in many of the games tested the gain ranged from a mere percent or less up to three or four percent. However, in some games at specific resolutions, especially when paired with DDR5 memory, the performance gain was as much as 15-20 percent. In a few of the games tested, such as Far Cry 6 at 4K, the game ends up being GPU limited, so a faster CPU doesn't help, as you'll see in the graphs below. There are some odd results as well, where the DDR5-equipped systems saw a regression in performance, so it's hard to draw any final conclusions from this test. That said, as long as the game in question isn't GPU limited, both CPUs should offer a decent performance gain of around five percent at 1440p when paired with DDR5 memory.
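For reference, the uplift figures quoted above are simply the relative difference in average frame rates. A minimal sketch in Python, using made-up FPS numbers rather than values from the leaked video:

```python
# Sketch of how a "percent gain" figure is derived from two average-FPS results.
# The FPS values below are placeholders, not numbers from the leaked test.
def percent_gain(new_fps: float, old_fps: float) -> float:
    """Return the performance uplift of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical example: next-gen CPU vs current CPU in a CPU-bound title at 1080p
print(f"{percent_gain(168.0, 150.0):.1f}%")   # -> 12.0%
```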

YouTube Updates Server Infrastructure With Custom ASICs for Video Transcoding

Video streaming can look a bit like magic. The uploader sends a video to the platform in one resolution and encoding format, while each viewer requests it in the resolution and encoding format suited to the device it is streamed on. YouTube knows this best, as it is the world's largest video platform, with over 2 billion users visiting it each month. That places a massive load on the server infrastructure in Google's data centers that host the service. About 500 hours' worth of video content is uploaded to the platform every minute, and regular hardware is no longer enough to handle it all.

That is why YouTube has developed custom ASICs called VCUs, or Video (trans)Coding Units. Transcoding is a major burden for Google's data centers: each uploaded video needs to be adapted to the platform and the viewer's desired specifications, and doing that on general-purpose hardware is a struggle. By using ASIC devices such as VCUs, Google can keep up with the demand and deliver the best possible quality. Codenamed Argos, the chip delivers a 20-33x improvement in efficiency compared to a regular server platform. In data centers, the VCU is implemented as a regular PCIe card, with two chips under the heatsinks.
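To give an idea of the workload being offloaded, the sketch below re-encodes one uploaded master into several resolution and codec variants using the ffmpeg CLI on ordinary hardware. This is purely an illustration of the transcoding job, not Google's VCU pipeline; the file names and the encoding ladder are assumptions.

```python
# Sketch: transcode one uploaded master into a small "ladder" of variants,
# the kind of work a VCU accelerates. Uses the ffmpeg CLI for illustration.
import subprocess

SOURCE = "upload_master.mp4"                          # hypothetical uploaded file
LADDER = [(2160, "libvpx-vp9"), (1080, "libx264"), (480, "libx264")]

for height, codec in LADDER:
    out = f"stream_{height}p.webm" if codec == "libvpx-vp9" else f"stream_{height}p.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", f"scale=-2:{height}",                  # scale to target height, keep aspect ratio
        "-c:v", codec,                                # target video codec for this rung
        "-c:a", "libopus" if codec == "libvpx-vp9" else "aac",
        out,
    ], check=True)
```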

BenQ Launches SW321C 32-inch Monitor

BenQ today announced the latest addition to its monitor family designed for professional use. The SW321C, as it is called, is a 32-inch monitor with an IPS panel of 4K (3840×2160) resolution. The panel itself is a 60 Hz screen with 250 nits of brightness, a 1000:1 contrast ratio, a 5 ms GtG response time, and 178-degree viewing angles, which is standard for IPS panels. When it comes to color coverage and the ability to represent colors accurately, the SW321C covers 95% of the DCI-P3, 99% of the Adobe RGB, and 100% of the sRGB color gamut. It has a 16-bit 3D look-up table (LUT) and is calibrated to DeltaE ≤ 2.

The monitor supports HDR10; however, due to its brightness of 250 nits, it is not capable of any serious HDR content editing. Another interesting note is that this monitor supports the Hybrid Log-Gamma (HLG) standard, which is an uncommon one. For input, the monitor has one DisplayPort 1.4, two HDMI 2.0, and one USB-C port. There is a dual-port USB hub with an SD card reader right next to it, making it a very useful feature for photographers. Exact pricing and availability of this monitor are unknown; however, it is supposed to hit the market soon.
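As a side note, the DeltaE ≤ 2 claim refers to the color difference between a reference color and what the panel actually reproduces. A minimal sketch of the classic CIE76 formula in Python, with made-up Lab values purely for illustration:

```python
# Sketch of the CIE76 DeltaE formula: the Euclidean distance between two
# colors in CIELAB space. The Lab values below are made up for illustration.
import math

def delta_e_cie76(lab1, lab2):
    """Return the CIE76 color difference between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target   = (53.2, 80.1, 67.2)    # hypothetical reference red
measured = (53.9, 79.0, 66.5)    # hypothetical value measured off the panel
print(round(delta_e_cie76(target, measured), 2))   # ~1.48, i.e. under the DeltaE 2 spec
```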

Bethesda Releases Final Specs Listing for Doom Eternal

Bethesda today released the final system requirements for its upcoming massacre-fest Doom Eternal. The game, which is slated for release just 10 days from now (March 20th), promises to be one of the most impressive (and fluid) games in recent times, if the original, modern Doom is anything to go by.

Bethesda has even gone so far as to list preferred specs for gamers who want to play in 4K at 60 FPS or in 1440p at 120 FPS, and these are pretty demanding, with an NVIDIA GeForce RTX 2080 Ti being required - likely because of its gargantuan 11 GB of VRAM. AMD's Ryzen 7 3700X or Intel's Core i9-9900K are the CPU requirements here, alongside 16 GB of system RAM. Check after the break for a breakdown of recommended specs for other resolutions and quality settings, and for Bethesda's trailer showing off customization options for your DOOM slayer. Do you have what it takes to run the game?