Test System
| Test System | |
|---|---|
| Processor | Intel Core i9-9900K @ 5.0 GHz (Coffee Lake, 16 MB Cache) |
| Motherboard | EVGA Z390 DARK (Intel Z390) |
| Memory | 16 GB DDR4 @ 3867 MHz, 18-19-19-39 |
| Storage | 2x 960 GB SSD |
| Power Supply | Seasonic Prime Ultra Titanium 850 W |
| Cooler | Cryorig R1 Universal with 2x 140 mm fans |
| Software | Windows 10 Professional 64-bit, Version 1903 (May 2019 Update) |
| Drivers | AMD: Radeon Software 19.11.1 Beta; NVIDIA: GeForce 441.12 WHQL |
| Display | Acer CB240HYKbmjdpr 24", 3840x2160 |
Benchmark scores in other reviews are only comparable when this exact same configuration is used.
We tested the public Battle.net release version of Call of Duty: Modern Warfare (not a press pre-release). We also installed the latest drivers from AMD and NVIDIA, both of which include game-ready support for the title.
Graphics Memory Usage
Using a GeForce RTX 2080 Ti, which has 11 GB of VRAM, we measured the game's memory usage at the highest settings.
Just like previous Call of Duty titles, the game seems very memory-hungry at first, but the performance results from our weaker cards show that these large allocations happen preemptively. The philosophy here seems to be "fill up the VRAM as much as possible, in case the data is needed later."
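For readers who want to watch this behavior themselves, VRAM allocation can be polled from outside the game. The snippet below is a minimal sketch using NVIDIA's NVML through the pynvml Python bindings; it is an illustration, not the exact tooling behind our charts. Keep in mind that NVML reports memory allocated on the card, which, as described above, can be far higher than what the game actively touches each frame.

```python
# Minimal sketch: poll VRAM allocation once per second via NVML.
# Requires the pynvml bindings (pip install nvidia-ml-py).
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # NVML reports bytes; values reflect allocations, not active use.
        print(f"VRAM allocated: {mem.used / 2**20:.0f} MB / {mem.total / 2**20:.0f} MB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```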
FPS Analysis
In this new section, we compare each card's performance against the average FPS it achieves in our graphics card reviews, which is based on a mix of 22 games. That should provide a realistic "average" covering a wide range of APIs, engines, and genres.
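The comparison itself is simple arithmetic: divide each card's FPS in this game by its 22-game average and express the difference as a percentage. A small sketch follows; all FPS numbers in it are made up for illustration, and the real values are in our charts.

```python
# Illustration of the relative-performance calculation.
# All FPS numbers below are hypothetical; see the charts for real data.
avg_fps_22_games = {"RTX 2080 Ti": 120.0, "GTX 1080 Ti": 90.0, "RX 5700 XT": 85.0}
fps_modern_warfare = {"RTX 2080 Ti": 132.0, "GTX 1080 Ti": 81.0, "RX 5700 XT": 93.0}

for card, baseline in avg_fps_22_games.items():
    delta = (fps_modern_warfare[card] / baseline - 1.0) * 100.0
    print(f"{card}: {delta:+.1f}% vs. its 22-game average")
```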
We can clearly see that NVIDIA's Pascal generation of GPUs is at a large disadvantage here, while Turing does much better. It seems NVIDIA focused its optimization efforts on Turing, possibly by leveraging the architecture's ability to execute FP32 and INT32 instructions concurrently.
On the AMD side, the improvements look fairly consistent across generations, which suggests AMD took a more general approach. Only the Radeon VII falls behind; we're not sure why.