News Posts matching #Comparison

Intel "Sierra Forest" Xeon System Surfaces, Fails in Comparison to AMD Bergamo

Intel's upcoming Sierra Forest Xeon server chip has debuted on Geekbench 6, showcasing its potential in multi-core performance. Slated for release in the first half of 2024, Sierra Forest is equipped with up to 288 Efficiency cores, positioning it to compete with AMD's Zen 4c-based Bergamo server CPUs and ARM-based server chips like those from Ampere for the favor of cloud service providers (CSPs). In the Geekbench 6 benchmark, a dual-socket configuration featuring two 144-core Sierra Forest CPUs was tested. The benchmark revealed a notable multi-core score of 7,770, surpassing most dual-socket systems powered by Intel's high-end Xeon Platinum 8480+, which typically score between 6,500 and 7,500. However, Sierra Forest's single-core score of 855 points was considerably lower, not even reaching half of the 8480+'s 1,897 points.

The difference in single-core performance comes down to a design choice: Sierra Forest uses Crestmont-derived Sierra Glen E-cores, which are more power- and area-efficient than the Golden Cove P-cores in the Sapphire Rapids-based 8480+. This trade-off is particularly advantageous for server environments where high core counts are crucial, as CSPs usually partition their instances by the number of CPU cores. However, compared to AMD's Bergamo CPUs, which use Zen 4c cores, Sierra Forest lacks raw compute performance, especially in multi-core workloads. Sierra Forest also lacks Hyper-Threading, while Bergamo offers SMT, giving the 128-core SKU 256 threads. Set against the Geekbench 6 scores of AMD's Bergamo EPYC 9754, the Sierra Forest results look a lot less impressive: Bergamo scored 1,597 points in single-core, almost double Sierra Forest's result, and 16,455 points in multi-core, more than double. This is a significant advantage of the Zen 4c core, which cuts down on cache rather than being an entirely different core design, as is the case with Intel's split between P-cores and E-cores. However, these are just preliminary numbers; we must wait for real-world benchmarks to see the actual performance.
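
For a rough sense of the gaps implied by the reported scores, here is a small illustrative Python snippet that works out the ratios quoted above. The score values are taken from the reported results; the dictionary layout and the use of 7,500 as the 8480+ multi-core figure (the upper end of the reported range) are assumptions for illustration only.

```python
# Illustrative only: ratios implied by the reported Geekbench 6 scores.
scores = {
    "Sierra Forest (2x 144C)": {"single": 855, "multi": 7_770},
    "Xeon Platinum 8480+ (2P)": {"single": 1_897, "multi": 7_500},  # upper end of the reported 6,500-7,500 range
    "EPYC 9754 Bergamo": {"single": 1_597, "multi": 16_455},
}

sierra = scores["Sierra Forest (2x 144C)"]
for name, result in scores.items():
    if name == "Sierra Forest (2x 144C)":
        continue
    single_ratio = result["single"] / sierra["single"]
    multi_ratio = result["multi"] / sierra["multi"]
    print(f"{name}: {single_ratio:.2f}x single-core, {multi_ratio:.2f}x multi-core vs. Sierra Forest")
```

Running it puts Bergamo at roughly 1.9x the single-core and 2.1x the multi-core score, matching the "almost double" and "more than double" figures above.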

Intel LGA 7529 Socket Photographed Again, Comparisons Show Gargantuan Physical Footprint

A set of detailed photos has been uploaded to a blog on the Chinese Bilibili site, and the subject matter is an engineering sample of a motherboard that features Intel's next-generation LGA 7529 socket. Specifications and photos relating to this platform have cropped up in the past, but the latest leak offers many new tidbits of information. The Bilibili blogger placed a Sapphire Rapids Xeon processor on top of the new socket, and this provides an interesting point of reference - it demonstrates the expansive physical footprint that the fifth-generation platform occupies on the board.

This year's Sapphire Rapids LGA 4677 (Socket E) is already considered a sizeable prospect, measuring 61 × 82 mm. The upcoming Mountain Stream platform (LGA 7529) is absolutely huge in comparison, with eyeball estimates putting its rough dimensions (including the retention arm) at 66 × 92.5 mm. The fifth-generation platform is designed to run Intel's Granite Rapids and Sierra Forest CPUs - this family of scalable Xeons is expected to launch in 2024. The code name "Avenue City" has been given to a reference platform that features a dual-socket configuration.
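
As a quick back-of-the-envelope check on how much extra board area the new socket demands, the sketch below multiplies out the two sets of dimensions quoted above, treating both sockets as plain rectangles (which ignores the actual package outline, so the figures are approximate):

```python
# Rectangular approximation of the two socket footprints, in millimetres.
lga_4677 = (61.0, 82.0)    # Sapphire Rapids, Socket E
lga_7529 = (66.0, 92.5)    # eyeball estimate, including the retention arm

area_4677 = lga_4677[0] * lga_4677[1]   # ~5,002 mm^2
area_7529 = lga_7529[0] * lga_7529[1]   # ~6,105 mm^2

print(f"LGA 4677: {area_4677:,.0f} mm^2")
print(f"LGA 7529: {area_7529:,.0f} mm^2, roughly {area_7529 / area_4677 - 1:.0%} larger")
```

That works out to roughly 22% more board area for the new socket, going by the eyeballed dimensions.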

NVIDIA Launches Image Comparison & Analysis Tool (ICAT)

Through NVIDIA's development of DLSS, we've found that the best way to evaluate image quality is by comparing videos - it best reflects what gamers actually experience. But video image quality analysis is time-consuming and tedious, so we built an Image Comparison & Analysis Tool, which we call ICAT, to make life easier for our testers and engineers. Our teams found the tool so useful that we've decided to make it available to everyone, making accurate and fast assessments of image quality between multiple images or videos far, far easier.

NVIDIA ICAT allows users to easily compare up to four screenshots or videos with sliders, side-by-sides, and pixel-peeping zoom-ins. Align comparisons spatially and temporally, examine the differences, and draw your conclusions. To compare different scaling technologies, we encourage you to first use ICAT to find the most comparable image quality modes, and then look at the performance gains. Note that just because two quality modes share the same name doesn't mean they are equivalent. While this varies by game and resolution, we've found that DLSS Performance mode is best compared to the Ultra Quality mode of spatial upscalers. We call this approach "ISO-Quality", as performance is compared at equivalent image quality levels. The NVIDIA ICAT program is now available to download from the link below.
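
To make the "ISO-Quality" idea a bit more concrete, here is a purely hypothetical Python sketch: the quality ratings and frame rates are invented placeholders, and the mode names merely mirror those mentioned in the text. The point is only to show the order of operations NVIDIA describes - match modes by image quality first, then compare performance at that matched quality.

```python
# Hypothetical illustration of "ISO-Quality": pair upscaling modes with comparable
# image quality first, then compare frame rates at that matched quality level.
# Quality ratings (0-10) and fps figures below are made-up placeholders.
dlss_modes = {"Quality": (9.0, 95), "Balanced": (8.5, 110), "Performance": (8.0, 130)}
spatial_modes = {"Ultra Quality": (8.0, 105), "Quality": (7.0, 120), "Balanced": (6.5, 135)}

def closest_by_quality(target_quality: float, modes: dict) -> tuple:
    """Pick the mode whose (subjective) quality rating is closest to the target."""
    return min(modes.items(), key=lambda item: abs(item[1][0] - target_quality))

dlss_quality, dlss_fps = dlss_modes["Performance"]
match_name, (match_quality, match_fps) = closest_by_quality(dlss_quality, spatial_modes)

print(f"DLSS Performance (quality {dlss_quality}, {dlss_fps} fps) vs "
      f"spatial {match_name} (quality {match_quality}, {match_fps} fps)")
```

In practice the quality judgement is made visually inside ICAT rather than from numeric ratings; the snippet only mirrors the decision order described above.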

Cyberpunk 2077 Graphics Comparison Video Between 2018 and 2020 Builds Shows Many Differences

Cyberpunk 2077 is the year's most awaited game release, and it has already been hit with not one, but two delays. Originally expected to ship in April of this year, it was first postponed to September, and now to November 19th, on account of extra optimization and bug quashing from developer CD Projekt Red. However, the recent gameplay videos released by the developer showcase the amount of work that has gone into the engine since 2018, when we were first treated to gameplay footage.

The video after the break comes courtesy of YouTube user 'Cycu1', who set up the 2018 and 2020 trailers side by side. In it, you can see substantial improvements to overall level and character detail (some of which can certainly be attributed to the lower-quality compression of the 2018 video). The video also showcases some lighting differences (whether these work out for better or worse is subjective, but the new footage supposedly makes use of ray tracing). Another point I'd like to call your attention to is the environment differences between the two versions - some environments appear to have been simplified compared to 2018, such as in the "Going Pro" mission, where the chair and panels were removed and replaced by what looks like a garage door. Whether this was done to improve performance is something only CD Projekt Red can say.

A Case for Windows Defender: Triad of Perfect Scores in AV-Test

Here's a strange thing: a case for a free, bundled software solution being better (in the metrics evaluated) than its paid, third-party counterparts. We're writing of none other than Microsoft's own Windows Defender suite, which is bundled with Windows and offers a security solution integrated into the OS. While the "paid is always better" philosophy has been proven wrong time and again and no longer drives users' thinking as much as it once did, the fact is that Windows Defender has somewhat been taken for granted as an undesirable presence on users' computers. However, a comparison by AV-Test, which pits many of the cybersecurity solutions available on the market against one another, has found Microsoft's Windows Defender worthy of a triad of perfect scores.

The results for Windows Defender include perfect (6.0) scores in the "Protection", "Performance" and "Usability" categories. The testing period covers May through June of this year, and only F-Secure SAFE 17, Kaspersky Internet Security 19 and Norton Security 22.17 managed to get the same perfect scores as Windows Defender Version 4.18. Check out the link for the score of your cybersecurity solution of choice. It's clear that, at least where this period is concerned, Windows Defender ran circles around some paid solutions.

NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the default factory values for brightness, contrast, and even gamma on the SDR monitor, which compromised the image quality it was actually able to convey. Resetting the monitor to its factory values resulted in a far less muted image on the SDR display than before, which points to a deliberate attempt to degrade the SDR side of the presentation. Granted, perceptions of image quality when comparing SDR to HDR are partly subjective and vary from viewer to viewer; however, brightness, contrast and gamma being set away from their factory defaults (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.