RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review


Conclusion

After hundreds of individual benchmarks, we now have a fair idea of just how much the GeForce RTX 4090 "Ada" graphics card is bottlenecked by the CPU. Our testing should help answer some fundamental questions, the biggest of which is whether a pre-Alder Lake machine, such as one based on AMD "Zen 3" or Intel "Rocket Lake," is good enough for the RTX 4090. The answer has to be no, but there are many things to consider.

At our highest resolution, 4K Ultra HD, the Core i9-12900K is 6.5% faster than the Ryzen 7 5800X when averaged across all games, but you have to pay close attention to the individual tests. In highly CPU-bound games such as Anno 1800, the framerate jumps by a whopping 28%, and in Far Cry 5, by 26%. Several other games post 15-20% performance gains that are hard to ignore—Hitman 3, Halo Infinite, Death Stranding, Dota 2, AoE IV, you name it. But then there are many other games, including recent ones built on the latest game engines and APIs, where the performance difference is under 5%, or in quite a few cases even under 3%. Control, for example, posts a meager 0.2% gain. This bulk of games drags the overall average down to 6.5%. The games with minimal improvements are GPU-limited at 4K, and this should be the "normal" situation: you want your games to be GPU-limited, because the GPU is much more expensive than the CPU. If you run CPU-limited, you've wasted money on the graphics card, because the GPU sits idle waiting for the CPU to push out more frames. There are no situations at 4K where the 5800X ends up faster than the 12900K by a full percentage point.
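The arithmetic behind that 6.5% figure is worth making explicit: a few big CPU-bound outliers get diluted by the many GPU-limited titles. A minimal sketch with hypothetical per-game uplift figures (not the review's actual data) shows the effect:

```python
# Hypothetical per-game performance uplifts (%, Intel over AMD) at 4K.
# Three strongly CPU-bound outliers, twelve GPU-limited titles near zero.
gains = [28.0, 26.0, 18.0] + [2.0] * 12

# A simple arithmetic mean: the twelve near-zero entries dominate,
# pulling the overall average far below the headline outliers.
average = sum(gains) / len(gains)
print(f"average uplift: {average:.1f}%")  # prints "average uplift: 6.4%"
```

With just three big wins among fifteen games, the average lands near 6%, even though the outliers alone suggest a 20%+ gap—which is exactly the pattern in the 4K results above.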

Things get interesting at 1440p. First off, the RTX 4090 really isn't designed for 1440p—you can max out 1440p even with older-generation cards such as the RTX 3070 Ti—but those with high-refresh-rate monitors should pay attention. Averaged across all game tests, the 12900K ends up a significant 13.8% faster than the 5800X. All the titles that posted 20-25% performance gains at 4K now show 30-45% gains! The 5800X really does pose a severe CPU bottleneck here. Even the games that were GPU-bottlenecked at 4K now clock 3-10% performance gains, which goes a long way toward lifting the overall average. There are a few cases where the 5800X ends up faster than the 12900K—Devil May Cry 5, AC: Valhalla, GTA V, and Prey. DMC-5 could be a case of the game not being fully optimized for the Intel hybrid architecture, with Thread Director misplacing the game's worker threads onto the slower E-cores.

1080p is of purely academic significance with a graphics card of the RTX 4090's performance level. Even the high-refresh-rate crowd would do well to save a ton of money by opting for a cheaper graphics card around the $700 mark, such as the RX 6900 XT or RTX 3080 10 GB. With the games thoroughly CPU-bottlenecked, the high frequency and IPC of the 12900K lend it a large 15.7% performance lead over the 5800X when averaged across all games at 1080p. Several titles such as Guardians of the Galaxy, Halo Infinite, Warhammer III, and Watch_Dogs Legion gain performance in the 35-50% range. Several others clock 15-30% gains, but the bulk still push 0-10% gains, which drags down the average. DMC-5 loses as much as 59% of its frame-rate, for the same reason it does at 1440p.

Intel's massive frame-rate gains aren't just a function of the 15-20% higher IPC of the 12900K's "Golden Cove" P-cores versus the "Zen 3" cores of the 5800X. Several other factors lubricate the graphics rendering pipeline, such as the larger caches available to the P-cores (1.25 MB L2 per core, 30 MB shared L3) and the faster main memory, running at DDR5-6000 compared to DDR4-4000 on the AMD machine. These have a cumulative impact on the CPU-level bottleneck.

Comparing the 3D graphics APIs at 4K UHD, we can't really see a pattern where Intel's architecture favors one API over another. DirectX 12 and Vulkan are supposed to offer better parallelization on the CPU side, while NVIDIA has optimized its DirectX 11 code-path for parallelization over the years. Accordingly, all three APIs show 6.4-6.6% performance gains, which make up the 6.5% average.

Older games are expected to be slightly less parallelized, particularly those released in 2018 or earlier, which were most likely developed on quad-core machines. Here, the 12900K shows a 6.4% higher framerate than AMD on average, likely owing to its sheer IPC and clock-frequency uplift; the memory-level bottleneck is less pronounced. Surprisingly, games released between 2019 and 2020 are only 5.5% faster on the Intel machine—this is probably around the time game developers were discovering higher CPU core-counts and building for 6-core and 8-core machines. The latest games (released between 2021 and 2022) favor Intel in a bigger way, averaging 7.9%. These games are clearly getting more parallelized, taking advantage of newer instruction sets, and perhaps Intel's Thread Director is able to harness the power of the E-cores in some of them.

That Intel's performance leads over AMD reach 30-50% in some cases, despite its IPC being only ~20% higher than "Zen 3," proves that the AMD platform is also being held back by slower memory. Intel's 12th Gen, AMD's Ryzen 7000 "Zen 4," and Intel's upcoming 13th Gen, paired with DDR5 memory, would be the ideal platforms for the RTX 4090. Someone still on a Ryzen 5000-series (or older) platform should brace for performance losses arising from the CPU bottleneck, but it's VERY game-dependent.

All said, the GeForce RTX 4090 is a next-generation graphics card that demands a next-generation platform. Our 13th Gen Core "Raptor Lake" reviews will go live in the coming days, so you'll be able to make an informed choice.