
RTX 4090 & 53 Games: Core i9-13900K vs Ryzen 7 5800X3D Review


Conclusion

With each such article, we're loving the Ryzen 7 5800X3D more. This processor is a fantastic technological bet that paid off for AMD, and the company should seriously consider scaling up supply of this specific SKU, so that gamers on the Socket AM4 platform have a genuine upgrade path without having to move to desktops powered by next-generation processors. AMD can easily sell these chips at their current price of $329.

Jumping straight into the action and starting with the highest resolution, 4K Ultra HD, which is the native stomping ground of the GeForce RTX 4090, we see that the Ryzen 7 5800X3D matches the Core i9-13900K "Raptor Lake" very well. Averaged across all 53 games, the i9-13900K is a negligible 1.3% faster than the 5800X3D. If you recall from our 5800X vs. i9-12900K TPU50 article, the 5800X was about 6.5% slower at this resolution, and the 5800X3D was 6.8% faster than the 5800X, which put the 5800X3D ahead of the i9-12900K and gaining on the i9-13900K. Despite all its might, the RTX 4090 still presents us with a GPU-limited scenario at 4K (in most games), so the key takeaway is that even with the RTX 4090, the i9-13900K isn't all that much faster than a 5800X3D. So if you're currently on a Socket AM4 platform and are considering upgrading the whole system, the 5800X3D could be an interesting upgrade path. You also have to consider that the i9-13900K in our testing enjoys the advantage of faster DDR5-6000 memory, while the 5800X3D makes do with DDR4-3600, albeit at tighter latencies. Looking at the individual tests at 4K, we only see five titles (out of 53) where the i9-13900K posts a performance lead that isn't negligible: AoE IV, Anno 1800, Divinity 2, Hitman 3, and Sniper Ghost Warrior: Contracts 2. Civilization VI seems to need optimization for Intel's Hybrid architecture, as the 5800X3D posts a 13% performance lead over the i9-13900K in this test. Every other test shows a performance "lead" so small that it could be dismissed as a rounding error.
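
To make that percentage chaining explicit, here is a minimal Python sketch using the −6.5% and +6.8% figures quoted above. Which chip serves as the 100% baseline for each figure isn't restated here, so both readings are shown; the arithmetic is ours and purely illustrative, not part of the original testing.

```python
# Minimal sketch: chaining the relative deltas quoted above (-6.5% and +6.8% at 4K).
# Which chip is the 100% baseline for each figure is our assumption, so both readings are shown.

# Reading 1: "5800X is 6.5% slower" taken as i9-12900K = 1.065 x 5800X
reading_1 = 1.068 / 1.065 - 1      # ~ +0.3%, 5800X3D marginally ahead of the i9-12900K

# Reading 2: "5800X is 6.5% slower" taken as 5800X = 0.935 x i9-12900K
reading_2 = 1.068 * 0.935 - 1      # ~ -0.1%, 5800X3D effectively level with the i9-12900K

print(f"Reading 1: 5800X3D vs i9-12900K at 4K: {reading_1:+.1%}")
print(f"Reading 2: 5800X3D vs i9-12900K at 4K: {reading_2:+.1%}")
```

Either way you read it, the 5800X3D lands roughly level with the i9-12900K at 4K, which is what the averages in this article bear out.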

Moving down to the 1440p resolution, we see that the RTX 4090 overpowers the rest of the system, and the bottleneck begins to shift toward the CPU. As we noted in our main review, the RTX 4090 really isn't meant for 1440p gaming, but if you must, then a faster CPU helps. Even so, averaged across all 53 tests, the i9-13900K ends up only 4.7% faster than the 5800X3D. There are several more games where the Intel chip leads the AMD one by a bigger margin than at 4K, because this resolution is much more CPU-limited than 4K. Pay attention to tests such as Ace Combat, Civ VI, Far Cry 6, AoE IV, etc., which post double-digit percentage gains for Raptor Lake. There are a couple of games where the 5800X3D gets ahead, too, such as Far Cry 5 (very memory sensitive) and DMC 5 (which seems to have issues running on Intel, at 1080p as well). In most other tests, the Intel chip is ahead by a low-to-mid single-digit percentage.

Our lowest game testing resolution for these articles, 1080p, is also the most popular gaming resolution even in 2022, roughly 15 years after it came into being, and it is the most CPU-limited resolution we test today. For the RTX 4090, it holds mostly academic relevance, similar to how we test 720p in our CPU reviews. The bottleneck is now firmly in the CPU's court: the RTX 4090 is way too overpowered and renders 1080p frames at a diabolical rate, with the processor unable to keep up with its end of the graphics rendering stack. Even so, the mighty i9-13900K is only 6.2% faster than the 5800X3D, averaged across all 53 tests. There are a dozen or so benchmarks where the i9-13900K posts a double-digit percentage lead over the 5800X3D, but the sheer size of the data set, with 53 games in it, drags the average down to the mid single digits. There's also the curious case of Devil May Cry 5, which is clearly handicapped on Intel in some way. I retested it several times; it really does run considerably slower, but still achieves hundreds of FPS. Most other game tests land in the mid-to-high single-digit percentage range in favor of the i9-13900K, and so we end up with the +6.2% average.
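
As an aside on how 53 individual results condense into a single "X% faster on average" figure, here is an illustrative Python sketch using a geometric mean of per-game FPS ratios. The FPS numbers are invented for demonstration, and whether the averaging here uses a geometric or arithmetic mean of ratios is an assumption on our part, not a statement of the testing methodology.

```python
# Illustrative sketch (not the review's exact method): condensing per-game results
# into one "X% faster on average" figure via the geometric mean of FPS ratios.
# The FPS values below are hypothetical, purely to show the calculation.
from math import prod

games = {
    # game: (i9-13900K FPS, 5800X3D FPS) at 1080p -- made-up numbers
    "Game A": (245.0, 230.0),
    "Game B": (180.0, 178.0),
    "Game C": (310.0, 262.0),   # a title with a double-digit Intel lead
    "Game D": (140.0, 149.0),   # a title where the 5800X3D pulls ahead
}

ratios = [intel / amd for intel, amd in games.values()]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"i9-13900K vs 5800X3D, averaged: {geomean - 1:+.1%}")
```

A few outlier titles with big leads get diluted this way, which is exactly why the 53-game average sits in the mid single digits despite a dozen double-digit results.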

Based on the numbers we have for the vanilla 5800X, it should end up at least 10-15% behind the i9-13900K, which means the 5800X3D overcomes a large portion of this performance deficit even without DDR5 memory, placing its gaming performance somewhere between the i9-12900K and the i9-13900K. This is fantastic news for those on the AM4 platform who were dreading the prospect of not only having to buy a $400-ish processor, but also a $200+ motherboard and, potentially, new DDR5 memory. The 5800X3D is supported on all AMD chipset generations for this socket, and BIOS support is easy to come by. The 5800X3D is also a very efficient chip, with a TDP of 105 W, which means cooling it is a lot easier than cooling the i9-13900K, and it also means you can buy a cheap motherboard, even one without a super powerful VRM configuration. The lower power draw is also great news if you're running an older, weaker PSU; no upgrade needed there. If, however, you want the fastest processor there is, money no object, and you'd like to rule out even the slightest CPU-level bottlenecks, then by all means, upgrade to Raptor Lake. Of course, you'll also want a powerful GPU to make sure your CPU gets used to the fullest. If applications are important to you in addition to gaming, the i9-13900K has a clear advantage there, too. AMD is working on Ryzen 7000 paired with 3D V-Cache, which will probably have a huge impact on the gaming capabilities of Zen 4. The coming months will be interesting!
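
To put a rough number on "overcoming a large portion of this performance deficit," the short sketch below combines the ~10-15% estimate for the vanilla 5800X with the measured 6.2% 1080p average; the midpoint assumption and the back-of-the-envelope arithmetic are ours.

```python
# Rough sketch (our arithmetic, using figures quoted in the text): how much of the
# vanilla 5800X's gaming deficit the 5800X3D claws back against the i9-13900K at 1080p.

lead_over_5800x   = 0.125   # assumed midpoint of the ~10-15% i9-13900K lead over the 5800X
lead_over_5800x3d = 0.062   # i9-13900K lead over the 5800X3D, averaged across 53 games

closed = 1 - lead_over_5800x3d / lead_over_5800x
print(f"Share of the gap closed by 3D V-Cache: ~{closed:.0%}")   # roughly half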