
God of War Benchmark Test & Performance Review

It's almost as if hell froze over—Sony has released God of War on the PC platform. The game is really good and definitely worth checking out. Using DirectX 11 exclusively with a custom in-house engine by Sony Santa Monica, God of War looks good—check out our screenshots. As in most games, the level of detail varies from area to area, but textures are generally sharp and models are richly detailed. Certainly not "next-gen," but perfectly fine even for 2022. That a game released on the PS4 back in 2018 still looks this decent today says something about the pace of graphical progress in recent years. What supports the visual experience is the outstanding production quality. Yes, the game uses pre-scripted events quite liberally, but I was still impressed and reminded that this is a flagship title meticulously crafted by serious professionals.

For the PC version, Sony added support for ultra-wide 21:9 displays, an unlocked frame rate, AMD FSR, and NVIDIA DLSS, and various rendering details have been optimized. For example, shadows and lighting have been improved: ambient occlusion now takes the direction of incoming light into account, and volumetric lights are more detailed than in the PlayStation original. It's a shame that there is no exclusive fullscreen setting in God of War, though. You can only pick "windowed" and "borderless," which slightly complicates refresh-rate locking and can add input lag.

To mitigate that, the developers added support for NVIDIA's Reflex technology, which brings latencies down, but only on supported hardware. Considering that this is a single-player title that isn't extremely fast-paced, I'm not sure that was the best use of developer time. A genuinely useful addition is support for NVIDIA's DLSS upscaling technology; AMD FSR is supported, too—very nice. We took a closer look at both in a separate article here.

While playing the game, I noticed a little bit of pop-in, but it was not nearly as bad as in some Unreal Engine-based titles. From time to time there's a little bit of stutter, especially right after starting the game. During my playthrough I also ran into quite a few application crashes, a dozen or so, which is definitely distracting. Maybe it was because I alt-tabbed out quite often, but it still shouldn't happen. Another annoyance is that you can't turn off the depth of field effect—motion blur and film grain can be disabled, though.

In terms of performance, God of War at Ultra is quite demanding on your hardware. For 1080p at 60 FPS, you'll need a GeForce RTX 3060, Radeon RX 6600 XT, or Radeon RX 5700 XT. Fluid 1440p can be achieved with the RTX 3060 Ti and RX 6700 XT, and for 4K60 you'll need a Radeon RX 6800 XT or GeForce RTX 3080. Considering the graphics on offer, I'd say optimization could be a bit better. NVIDIA's cards run considerably faster than AMD's Radeons, despite the fact that the Sony PlayStation 4 GPU is based on AMD's GCN architecture. Across hardware generations, older architectures don't seem to suffer from specific performance issues, nor do newer ones gain a disproportionate boost.

What is noteworthy is that due to VRAM requirements, cards with 4 GB aren't doing so well. The Radeon RX 5500 XT especially, and the RX 6500 XT to a slightly lesser extent, take serious performance hits, even at 1080p. With 6 GB of VRAM or more, performance is fine, though, even at 4K. While the game allocates almost 8 GB of VRAM at 4K, it seems not all of these assets are in constant use, so performance isn't affected in a significant way—the GTX 1070 8 GB vs. GTX 1660 Ti 6 GB matchup is a good way to keep an eye on that. Of course, there are plenty of settings to adjust if you want to reduce VRAM pressure a little bit.