News Posts matching #Benchmark

AMD Zen 2 12-Core, 24-Thread Matisse CPU Spotted in UserBenchmark

A new development could shake up our expectations for AMD's upcoming Zen 2-based Ryzen CPUs: if true, it would confirm previous rumors of much-increased core counts at the top of AMD's offerings. User TUM_APISAK, who has been behind multiple information leaks and hardware-scouting finds, has dug deep enough to find a UserBenchmark submission that screams of a 12-core, 24-thread AMD Matisse part (an engineering sample at that, so take the presented clock speeds with a grain of salt).

The benchmark lists the CPU under the product code 2D3212BGMCWH2_37 / 34_N (the H2 suffix is indicative of a Matisse CPU), with a base clock of 3.4 GHz and an average boost clock of 3.6 GHz. The rest of the system specs are very, very basic, with 4 GB of 1333 MHz DDR4 memory on a new AMD platform based on the Myrtle-MTS chipset. The processor is listed as having a 105 W TDP and 32 MB of L3 cache.

Basemark GPU 1.1 Update Released, Adds DirectX 12 Support

Today Basemark releases version 1.1 of its multi-platform graphics hardware evaluation tool, Basemark GPU. Basemark GPU is free to download and use for personal users. Additionally, Basemark provides professional versions for Benchmark Development Program members as well as corporate and commercial users.

Basemark GPU 1.1 Benchmark offers unparalleled, objective comparisons between Vulkan, OpenGL, OpenGL ES and now DirectX 12 for graphics performance analysis across both mobile and desktop platforms. Our desktop Linux version of Basemark GPU 1.1 will be available in the next few days utilizing the easily installable universal Flatpak delivery format.

Basemark GPU is available for download now.

Final Fantasy XV Benchmark Gets DLSS Update, GeForce RTX 2080 Performance Tested

Square Enix has just updated their Final Fantasy XV Benchmark to version 1.2, adding support for NVIDIA's DLSS (Deep Learning Super-Sampling) technology. The new release will still allow users to test any graphics card(s) they have just as it did before. That said, owners of NVIDIA's RTX 2070, 2080, and 2080 Ti get the benefit of having access to DLSS for improved image quality and performance. NVIDIA claims that performance will improve by up to 38% with DLSS alone. In order to verify that claim, we ran a few tests of our own.

Preliminary testing was done using Corsair's Vengeance 5180 Gaming PC, which is equipped with an Intel Core i7-8700, 16 GB of 2666 MHz DDR4, and an NVIDIA GeForce RTX 2080. At 3840x2160 with the highest possible settings, DLSS offered a 36% increase in performance. This is very close to NVIDIA's specified increase and within the expected margin of error. When compared to the older GTX 1080 Ti, which was paired with a stock Intel Core i7-8700K and 32 GB of 3466 MHz memory, we see the GeForce RTX 2080 and GTX 1080 Ti offer roughly the same level of performance. Therefore, DLSS really is the difference maker here, allowing for better performance and image quality. It should also be noted that both systems used the same NVIDIA 416.94 WHQL drivers.
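For context, the percentages above are simply ratios of average frame rates with and without DLSS. A minimal sketch of that arithmetic, using purely hypothetical FPS values (the benchmark reports an aggregate score, and we quote only the relative gain):

    # Minimal sketch of how a "DLSS is X% faster" figure is derived.
    # The FPS values are hypothetical placeholders, not measured results.
    def percent_uplift(baseline_fps, dlss_fps):
        return (dlss_fps / baseline_fps - 1.0) * 100.0

    native_taa = 25.0   # assumed 4K average FPS without DLSS
    with_dlss = 34.0    # assumed 4K average FPS with DLSS
    print(f"DLSS uplift: {percent_uplift(native_taa, with_dlss):.0f}%")  # prints 36%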

AMD Vega 20 Possible Performance Spotted in Final Fantasy XV Benchmark

It would appear AMD's 7 nm Vega 20 has been benchmarked in Final Fantasy XV. While the details are scarce, what we do know is that the hardware device ID 66AF:C1 can be linked to Vega 20 via the Linux patches from back in April. Considering AMD has not confirmed any 7 nm Vega graphics cards for consumers, it is more likely this is an engineering sample for the new Radeon Instinct or Pro series cards.

Alleged AMD RX 590 3D Mark Time Spy Scores Surface

Benchmark scores for 3DMark's Time Spy have surfaced, purported to represent the performance level of an unidentified "Generic VGA" - which is being identified as AMD's new 12 nm Polaris revision. The RX 590 product name makes almost as much sense as it doesn't, though; for one, there's no real reason to release another entire RX 600 series, unless AMD is giving the 12 nm treatment to the entire lineup (which likely wouldn't happen, due to the investment in fabrication process redesign and node capacity required for such a move). As such, the RX 590 moniker makes sense if AMD is only looking to increase its competitiveness in the sub-$300 space as a stop-gap until they finally have a new graphics architecture up their shader sleeves.

In Wake of Controversy, Intel-Paid Principled Technologies Retesting AMD Ryzen Processors

Well, that proverbial storm of dirty undies did serve to rile up some people over at Intel and their paid-for stint with Principled Technologies, whose name begs for a change for honesty's sake. In the wake of the controversy regarding its... flawed... testing of AMD's Ryzen 7 2700X performance in gaming workloads, Principled Technologies has now come forward to say it's retesting AMD's processors in less... biased circumstances.

Let's start with the glass-half-full part of this retesting: initial reports of memory timings on AMD's system being set in an almost "whatever" kind of way apparently weren't fair, since Principled Technologies has said it used D.O.C.P. settings for AMD's XMP-equivalent memory configuration (not properly disclosed in the initial report, so it's their own fault this happened). The good stuff ends there, though; numerous other flaws remained in the methodology, such as the use of AMD's stock cooling solution against a Noctua cooler on the Intel system (which they'll now fix upon retesting), and the use of AMD's Game Mode on the consumer Ryzen processor, which meant the normally 8-core processor was working in a 4-core mode (really, now?). The company will now retest both CPUs on a more even footing. How's that for a change?

Intel's 9th Gen Core Gaming Benchmarks Flawed and Misleading

At its 9th Generation Core processor launch extravaganza earlier this week, Intel posted benchmark numbers to show just how superior its processors are to AMD 2nd generation Ryzen "Pinnacle Ridge." PC enthusiasts worth their salt were quick to point out that Intel's numbers are both flawed and misleading as they misrepresent both test setups - by optimizing Intel processors beyond their out-of-the-box performance, and by running AMD processors with sub-optimal settings.

Intel paid Principled Technologies, a third-party performance testing agency, to obtain performance numbers comparing the Core i9-9900K with the Ryzen 7 2700X across a spectrum of gaming benchmarks, instead of testing the two chips internally, and posted the test setup data in end-notes, as if to add a layer of credibility/deniability to the charade. The agency posted its numbers, which were almost simultaneously re-posted by PCGamesN under the headline "Up to 50% Faster than Ryzen at Gaming." You could fertilize the Sahara with this data.

Latest 3DMark Update adds Night Raid DX12 Benchmark for Integrated Graphics

With update 2.6.6174, released today, 3DMark now includes a new benchmark dubbed Night Raid. This latest addition to the popular 3DMark suite offers DX12 performance testing for laptops, tablets, and other devices with integrated graphics. It also offers full support for ARM-based processors in the latest always-connected PCs running Microsoft's Windows 10 on ARM. Users running 3DMark Basic Edition, which is free, will have access to this latest addition upon installing the update.

The Night Raid benchmark continues the trend of offering two graphics tests and a CPU test. While not as visually stunning as previous entries, this is to be expected considering it is targeted at integrated graphics processors and entry-level systems. Even so, it makes use of numerous graphical features, with graphics test 1 including dynamic reflections, ambient occlusion, and deferred rendering. Graphics test 2 features tessellation, complex particle systems, and depth-of-field effects with forward rendering. Finally, the CPU test measures performance through a combination of physics simulation, occlusion culling, and procedural generation.

NVIDIA RTX 2080 / 2080 Ti Results Appear For Final Fantasy XV

The online results database for the Final Fantasy XV Benchmark has been partially updated to include NVIDIA's RTX 2080 and 2080 Ti. Scores for both standard and high quality settings at 2560x1440 and 3840x2160 are available, while data for the 1920x1080 and lite quality tests is not.

Taking a look at the RTX 2080 Ti results shows it beating out the GTX 1080 Ti by 26% and 28% in the standard and high quality tests, respectively, at 2560x1440. Increasing the resolution to 3840x2160 again shows the RTX 2080 Ti ahead, this time by 20% and 31%, respectively. The RTX 2080 offers a similar improvement over the GTX 1080 at 2560x1440, where it delivers 28% and 33% better performance in the same standard and high quality tests. Once again, increasing the resolution to 3840x2160 results in performance being 33% and 36% better than the GTX 1080. Overall, both graphics cards are shaping up to be around 30% faster than the previous generation without any special features. With Final Fantasy XV getting DLSS support in the near future, it is likely the performance of the RTX series will further improve compared to the previous generation.
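For readers who want to re-derive those relative figures from raw database scores, the short sketch below shows the arithmetic; the score pairs are hypothetical stand-ins chosen only to illustrate the quoted 26% and 28% gaps at 2560x1440:

    # Percentage uplift from raw benchmark scores (hypothetical score pairs).
    score_pairs = {
        "2560x1440 Standard": (9800, 7780),  # (RTX 2080 Ti, GTX 1080 Ti), ~26% apart
        "2560x1440 High": (8300, 6480),      # ~28% apart
    }
    for preset, (new_card, old_card) in score_pairs.items():
        print(f"{preset}: {(new_card / old_card - 1) * 100:.0f}% faster")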

First Time Spy Benchmark of Upcoming NVIDIA RTX 2080 Graphics Card Leaks

A Time Spy benchmark score of one of NVIDIA's upcoming RTX 20-series graphics cards has come out swinging in a new leak. We say "one of NVIDIA's" because we can't say for sure which core configuration this graphics card packs: the only effective specs we have are the 8 GB of GDDR6 memory working at 14 Gbps, which translates to either NVIDIA's RTX 2070 or RTX 2080 graphics cards. If we were of the betting type, we'd say these scores are likely from an NVIDIA RTX 2080, simply because the performance improvement over the last-generation GTX 1080 (which usually scores around the 7,300s) sits pretty at some 36% - more or less what NVIDIA has been delivering with its new generation introductions.

The 10,030 points scored in Time Spy by this NVIDIA RTX graphics card brings it up to GTX 1080 Ti performance levels, and within spitting distance of the behemoth Titan Xp. This should put to rest questions regarding improved performance in typical (read: non-raytracing) workloads on NVIDIA's upcoming RTX series. It remains to be seen, given the die size, how much of this improvement stems from actual per-core rasterization performance gains, and how much comes simply from an increased number of execution units (NVIDIA says it isn't just the latter, by the way).
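As a quick sanity check on that ~36% figure, here's the arithmetic using the leaked score and an assumed GTX 1080 baseline of 7,350 points (an assumption sitting in the "around the 7,300s" range mentioned above):

    # Uplift of the leaked Time Spy score over an assumed GTX 1080 baseline.
    leaked_rtx_score = 10030
    gtx_1080_baseline = 7350  # assumed typical GTX 1080 graphics score
    uplift = (leaked_rtx_score / gtx_1080_baseline - 1) * 100
    print(f"Uplift over GTX 1080: ~{uplift:.0f}%")  # prints ~36%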

UL's Raytracing Benchmark Not Based on Time Spy, Completely New Development

After we covered news of UL's (previously known as Futuremark, makers of 3DMark) move to include a raytracing benchmark mode in Time Spy, the company has contacted us and other members of the press to clarify its message and intentions. As it stands, the company will not be updating its Time Spy testing suite with raytracing technologies. Part of the reason is that this would require an immense rewrite of the benchmark itself, which would be counterproductive - and this leads to the rest of the reason: such a significant change would invalidate previous results, which were obtained without a raytracing mode.

As such, UL has elected to develop a totally new benchmark, built from the ground up to use Microsoft's DirectX Raytracing (DXR). This new benchmark will be added to the 3DMark app as an update. The new test will produce its own benchmarking scores, very much like Fire Strike and Time Spy do, and will provide users with yet another ladder to climb on their way to the top of the benchmarking scene. Other details are scarce - which makes sense - but the test should still be available on or around the time of NVIDIA's 20-series launch, come September 20th.

NVIDIA Releases First Internal Performance Benchmarks for RTX 2080 Graphics Card

NVIDIA today released the first official performance numbers for its new generation of GeForce products - particularly, the RTX 2080. The RTX 20 series of graphics cards, according to the company, offers some 50% performance improvement (on average) from architectural improvements alone, on a per-core basis. This number is then built upon with the performance added by the new RTX hardware, which allows the RTX 2080 to extend its advantage over the last-generation GTX 1080 to up to 2x - while using the new DLSS technology. PUBG, Shadow of the Tomb Raider, and Final Fantasy XV are seeing around 75 percent or more improved performance when using this tech.

NVIDIA is also touting the newfound ability to run games at 4K resolution at over 60 FPS, making the RTX 2080 the card to get if that's your preferred resolution (especially if paired with one of those dazzling OLED TVs...). Of course, IQ settings aren't revealed in the slides, so there's an important piece of the puzzle still missing. But considering NVIDIA's performance claims, and comparing them against achievable performance on last-generation hardware, it's fair to say these FPS scores refer to the high or highest IQ settings for each game.

Denuvo's Impact on Game Performance Benchmarked

Denuvo's impact on gaming performance has been discussed immensely - as has always been the case for any and all DRM solutions that find their way into games. However, the evidence on whether or not Denuvo really impacts performance always seemed somewhat anecdotal - for a while, the inability to compare games with Denuvo implemented and then officially removed (which, unsurprisingly, isn't the same as it being cracked) was a major obstacle to any sort of serious testing.

Now, courtesy of Overlord's YouTube channel, we can see whether or not Denuvo impacts performance. A total of seven games were tested on a platform with a stock Intel Core i7-2600K (to adequately test whether Denuvo impacts the CPU more than any other system component) paired with a stock-clocked GTX 1080 Ti. You really should take a look at the video; it's a short, informative one, but the gist of it is this: some games revealed performance improvements with Denuvo removed. Mass Effect: Andromeda saw a huge boost from an average of 57 FPS all the way to 64 FPS due to the removal of the DRM solution, and Mad Max saw a more meager increase, from 54 to 60 FPS. The other games (which included Hitman, Abzu, and others) didn't see any performance difference.
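Expressed in relative terms, using only the averages quoted above, those deltas work out to roughly 11-12%:

    # Average FPS with and without Denuvo, as quoted from the video.
    results = {
        "Mass Effect: Andromeda": (57, 64),  # (with DRM, DRM removed)
        "Mad Max": (54, 60),
    }
    for game, (with_drm, drm_removed) in results.items():
        gain = (drm_removed / with_drm - 1) * 100
        print(f"{game}: +{gain:.1f}% average FPS after Denuvo removal")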

Basemark Launches Free Multiplatform GPU Benchmark

Basemark today launched Basemark GPU, a new graphics performance evaluation tool for systems with the Vulkan 1.0, OpenGL 4.5, or OpenGL ES 3.1 graphics APIs. This tool enables the industry to objectively and reliably quantify and compare the graphics performance of next-generation mobile, automotive, and desktop processors.

"We have poured all of our soul and expertise in making this product. The work started two and half years ago and this massive project has culminated in today's launch of a true state-of-the-art product," said Arto Ruotsalainen, CEO of Basemark. "We believe Basemark GPU will become an essential tool for anyone tasked to evaluate graphics performance in systems ranging from smart phones, smart TVs and cars to PCs."

First Benchmarks, CPU-Z Screenshots of AMD Ryzen Threadripper 32-core CPU Surface

First benchmarks and CPU-Z screenshots of AMD's upcoming 32-core Ryzen Threadripper monster have surfaced, courtesy of HKEPC. The on-time-for-launch (as AMD puts it) 12 nm "Pinnacle Ridge" processor has apparently been christened "Threadripper 2990X", which does make sense - should AMD be thinking of keeping the 2920X moniker for 12 cores and 2950X for 16 cores, then it follows that a 20-core part would be the 2960X, a 24-core the 2970X, a 28-core the 2980X, and the aforementioned 32-core the 2990X. Whether AMD would want to offer such a tiered lineup of HEDT processors, however, is another matter entirely, and certainly open for discussion - too much of a good thing can actually happen, at least where the ASP of the Threadripper portfolio is concerned.

In the CPU-Z screenshot, the 2990X is running at a 3.4 GHz base clock with up to 4.0 GHz XFR, and carries a 250 W TDP - a believable and very impressive achievement, a testament to the 12 nm process and the low leakage it apparently produces. The chip was then overclocked up to 4.2 GHz on all cores, which caused some thermal throttling, since performance was lower than when the chip was clocked at just 4 GHz on all cores. Gains on this particular piece of silicon held up to 4.12 GHz - the jump to 4.2 GHz must have required another bump in voltage, which led to the aforementioned throttling. At 4.12 GHz, the chip scored 6,399 points in Cinebench - a remarkable achievement.

FutureMark Corporation Sees Its Name Changed to... Parent Company's "UL"

Futuremark, makers of probably the most recognizable benchmarks out there, have announced they are getting a company rebrand. The announcement came via a blog post on their site, citing a "new home" for the company. UL, a company that has existed for more than a hundred years and specializes in testing, inspection, auditing, and certification services and solutions, purchased Futuremark back in 2014, and is now looking to streamline the company's branding to its own.

The move will see Futuremark rebrand itself to "UL Benchmarks" as soon as April 23rd. Everything but the branding - and the hosting websites - should remain the same; Futuremark's benchmark databases will be relocated to benchmarks.UL.com. The company has reaffirmed that everything else will stay as it is, save for the new coat of paint: the mission and intention behind UL Benchmarks will be the same as with Futuremark. I'd say there's something of a missed opportunity here, though; "Underwriters Laboratories" (the meaning of "UL") seems much better suited to a benchmarking company, whilst also keeping thematic relevance with the parent company's name. But oh well.

Square Enix Puts Final Fantasy XV Up for Pre-order and Releases Benchmark Tool

Final Fantasy XV Windows Edition will be released on March 6, but hardcore fans can pre-order the game now through Steam, Origin, or the Microsoft Store. The $49.99 price tag remains the same independent of the place of purchase. However, the preorder bonus differs hugely. Consumers who pre-order through Steam will receive the "FFXV Fashion Collection" which contains a collection of four t-shirts with different buffs for Noctis to wear. The buffs include strength enhancement, HP recovery rate acceleration, critical hit rate boost, and maximum HP increase. Microsoft Store preorders, on the other hand, come with a "FFXV Powerup pack" that provides players with a sleek Dodanuki sword which reduces enemy defense with each slash, 10 phoenix downs, and 10 elixirs. Origin seems to draw the short straw this time around. The Origin preorder bonus is the "FFXV Decal Selection" comprised of different decals to modify the cosmetics of the Regalia car.

Square Enix has graciously released a benchmark application to help future Final Fantasy XV Windows Edition players configure the game to get the best performance out of their systems. The zip file weighs around 3.37 GB and is available for download right here at TechPowerUp, or, if you don't like our servers for some unfathomable reason... from the developer's website. The benchmark only runs on 64-bit versions of Windows and requires Microsoft .NET Framework 4.6 to be present on the system. Users also need a graphics card that supports DirectX 11 and a monitor with a minimum resolution of 1280 x 720. Unfortunately, NVIDIA SLI and AMD CrossFire configurations are not supported at this time. The application lets users choose between the Lite, Standard, and High quality presets, and resolutions of 1280 x 720, 1920 x 1080, and 3840 x 2160. The entire run takes around 6 minutes and 30 seconds, with the score and performance evaluation provided at the end.

NVIDIA GeForce 390.65 Driver with Spectre Fix Benchmarked in 21 Games

The Meltdown and Spectre vulnerabilities have been making many headlines lately. So far, security researchers have identified three variants. Variant 1 (CVE-2017-5753) and Variant 2 (CVE-2017-5715) are Spectre, while Variant 3 (CVE-2017-5754) is Meltdown. According to their security bulletin, NVIDIA has no reason to believe that their display driver is affected by Variant 3. In order to strengthen security against Variant 1 and 2, the company released their GeForce 390.65 driver earlier today, so NVIDIA graphics card owners can sleep better at night.

Experience tells us that some software patches come with performance hits, whether we like it or not. We were more than eager to find out if this was the case with NVIDIA's latest GeForce 390.65 driver. Therefore, we took to the task of benchmarking this revision against the previous GeForce 388.71 driver in 21 different games at the 1080p, 1440p, and 4K resolutions. We even threw in an Ethereum mining test for good measure. Our test system is powered by an Intel Core i7-8700K processor overclocked to 4.8 GHz, paired with G.Skill Trident-Z 3866 MHz 16 GB memory on an ASUS Maximus X Hero motherboard. We're running the latest BIOS, which includes fixes for Spectre, and Windows 10 64-bit with Fall Creators Update, fully updated, which includes the KB4056891 Meltdown Fix.

NVIDIA's Latest Titan V GPU Benchmarked, Shows Impressive Performance

NVIDIA pulled a rabbit out of its proverbial hat late last week, with the surprise announcement of the gaming-worthy Volta-based Titan V graphics card. The Titan V is another one in a flurry of Titan cards from NVIDIA as of late, and while the healthiness of NVIDIA's nomenclature scheme can be put to the sword, the Titan V's performance really can't.

In the Unigine Superposition benchmark, the $3,000 Titan V managed to deliver 5,222 points in the 8K Optimized preset, and 9,431 points in the 1080p Extreme preset. Compare that to an extremely overclocked GTX 1080 Ti running at 2,581 MHz under liquid nitrogen, which hit 8,642 points in the 1080p Extreme preset, and the raw power of NVIDIA's Volta hardware is easily identified. The Titan V also delivers an average of 126 FPS in the Unigine Heaven benchmark at 1440p. Under gaming workloads, the Titan V is reported to achieve between 26% and 87% improvements in raw performance, which isn't too shabby, now is it?
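For the 1080p Extreme preset specifically, the stock Titan V's lead over that liquid-nitrogen-overclocked GTX 1080 Ti works out to roughly 9%, straight from the two scores quoted above:

    # Superposition 1080p Extreme scores quoted above.
    titan_v_stock = 9431
    gtx_1080_ti_ln2_oc = 8642
    lead = (titan_v_stock / gtx_1080_ti_ln2_oc - 1) * 100
    print(f"Stock Titan V vs. LN2-overclocked GTX 1080 Ti: +{lead:.0f}%")  # ~9%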

Futuremark Introduces "Cyan Room" DX12 VR Benchmark to VRMark

Adding to its stable of benchmarking suites, Futuremark has announced the upcoming release of its latest benchmark, dubbed "Cyan Room". This is a Bioshock-esque benchmark made for DX12-powered, VR-enabled workloads, and it should let users know just how much processing grunt they have at their disposal. It uses a pure DirectX 12 engine built in-house and optimized for VR, and features "a large, complex environment and many eye-catching effects."

Cyan Room can be explored at the user's leisure through its "Experience Mode", a benchmarking experience where users can change the rendering resolution and other settings to make the scene more or less demanding, on the fly. This should allow users to truly gauge the difference in experience according to achieved performance in the benchmark - the company says "using Experience mode with a VR headset is a great way to see how system performance affects your VR experience." With its massive 5K rendering resolution and spectacular volumetric lighting effects, the benchmark sets a high bar for future hardware generations, according to the company. Cyan Room will be released on November 22 as a free update for VRMark Advanced Edition and VRMark Professional Edition.

Futuremark Celebrates Newegg Partnership with Huge Discounts - $5 for 3DMark

Futuremark, the developer of the world's most widely used benchmarking software, today announced a new partnership with Newegg, the leading tech-focused e-retailer in North America. The partnership sees Newegg complement its comprehensive selection of PC components and complete systems with Futuremark's popular 3DMark, VRMark and PCMark 10 benchmarks. It's a winning combination: everything you need to build and benchmark a new PC in one place.

Newegg has long been the preferred destination for tech-savvy PC users when buying or building a new PC or upgrading individual components. Futuremark benchmark tests have helped millions of people test, compare and understand PC performance. Now for the first time, PC enthusiasts can buy Futuremark benchmarks from the same place they buy their components and accessories.

Intel, AMD MCM Core i7 Design Specs, Benchmarks Leaked

Following today's surprise announcement of an Intel-AMD collaboration (which seems to leave NVIDIA as the only company in a somewhat more fragile position), there have already been a number of benchmark leaks for the new Intel + AMD devices. While Intel's original announcement was cryptic enough - to be expected, given the nature of the product and the ETA before its arrival to market - some details are already pouring out onto the world wide web.

The new Intel products are expected to carry the "Kaby Lake G" codename, where the G goes hand in hand with the much-increased graphics power of these solutions compared to other, less exotic ones - meaning those not packing AMD Radeon graphics. For now, the known product names point to an Intel Core i7-8705G and an Intel Core i7-8809G. Board names for these are 694E:C0 and 694C:C0, respectively.

NVIDIA GTX 1070 Ti 3DMark Benchmark Results Appear Online

NVIDIA's GeForce 10 series, codenamed Pascal, has been on the market since May of 2016. NVIDIA released both the GTX 1080 and the GTX 1070 using TSMC's then-new 16 nm FinFET manufacturing technology. When they debuted, the GTX 1070 became a popular choice among gamers, initially because it was the more budget-friendly option of the two. Earlier this year, NVIDIA released the GTX 1080 Ti, aimed primarily at the higher-end enthusiast crowd.

We have reported about the soon-to-be launched GTX 1070 Ti before, and we also saw a render of the Gigabyte offering yesterday. Adding to the fervor today, benchmark results for the GTX 1070 Ti emerged for 3DMark Fire Strike Extreme and Time Spy on the web. Although rumored to not overclock well, the GTX 1070 Ti paints a pretty picture for those looking to upgrade their gaming rigs. According to these early leaks, the GTX 1070 Ti bests AMD's Radeon RX Vega 56 in the Time Spy benchmark in both Turbo and Balanced modes for the latter, while trading blows in Fire Strike Extreme in balanced mode and losing to it in Turbo mode. Keep in mind, these are early leaks and more are sure to come as we inch closer to its release.

Futuremark Releases 3DMark v2.4.3819 with "Time Spy Extreme" Benchmark

Futuremark today released the latest update to the 3DMark graphics benchmark suite. Version 2.4.3819, released to the public today, introduces the new "Time Spy Extreme" benchmark for machines running Windows 10 and DirectX 12-compatible graphics cards. With a rendering resolution of 4K Ultra HD (3840 x 2160 pixels), the new benchmark applies enough stress to put today's 4K UHD gaming PCs through their paces. You don't need a 4K monitor to run the test; however, your graphics card must feature at least 4 GB of video memory.

Time Spy Extreme also comes with a new CPU benchmark that is up to 3 times more taxing than the older CPU tests. It can take advantage of practically any number of CPU cores you can throw at it, and benefits from the AVX2 instruction set. "Time Spy Extreme" isn't available in the free version of 3DMark; you will need at least 3DMark Advanced, with a license purchased after July 14, 2016, to get it as a free upgrade. The update also improves the API overhead tests.
DOWNLOAD: Futuremark 3DMark v2.4.3819

Futuremark Readies 3DMark TimeSpy Extreme Benchmark

Futuremark is giving final touches to its top-tier GPU benchmark, 3DMark "TimeSpy Extreme." This benchmark tests your graphics hardware's performance at the 4K Ultra HD (3840 x 2160 pixels) resolution, with the latest DirectX 12 API. You will need a graphics card with at least 4 GB of video memory to run the test. The benchmark will put not just the fastest graphics cards through their paces, but is also designed to take advantage of today's multi-core processors.

3DMark "TimeSpy" Extreme can take advantage of processors with 8 or more CPU cores, and will benefit from the processors supporting the AVX2 instruction-set. Futuremark claims that the CPU tests of "TimeSpy" Extreme will be 3 times more demanding. The company also mentions that it developed the new benchmark while taking inputs from AMD, NVIDIA, and Intel. The benchmark is expected to launch on the 11th of October as an update.