
Assassin's Creed Mirage: DLSS vs FSR vs XeSS Comparison Review


Introduction

Assassin's Creed Mirage is the latest installment in the Assassin's Creed franchise from Ubisoft. Mirage runs exclusively on DirectX 12 and uses the Anvil engine, the same engine that powered Assassin's Creed Valhalla and prior Assassin's Creed titles. This PC release does not support any advanced ray tracing features, but it is the first Assassin's Creed title to support NVIDIA's DLSS Super Resolution, NVIDIA's Deep Learning Anti-Aliasing (DLAA), Intel's Xe Super Sampling 1.2 (XeSS 1.2) and AMD's FidelityFX Super Resolution 2.2 (FSR 2.2) from day one. XeSS and FSR 2.2 also have a "Native AA" (Native Anti-Aliasing) quality mode that runs the game at native resolution without upscaling, similar to NVIDIA's DLAA. Running this game at maximum graphics settings and reasonable framerates at native resolution requires quite a powerful GPU, which is why upscaling solutions are so important. But depending on the game, there can be differences in the implementations of NVIDIA's DLSS/DLAA, Intel's XeSS, and AMD's FSR, so we are keen to take a look at these temporal upscalers in this title.
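As a quick refresher on how these quality modes relate to resolution: each upscaling mode renders the game internally at a fraction of the output resolution and reconstructs the final image from it, while the "Native AA"/DLAA modes keep the render resolution at 100%. The sketch below computes typical internal render resolutions from the per-axis scale factors that NVIDIA and AMD publish for DLSS and FSR 2 (Quality ≈ 67%, Balanced ≈ 58%, Performance = 50%, Ultra Performance ≈ 33%); XeSS uses its own ratios, and we did not verify the exact factors Mirage uses, so treat these numbers as assumptions for illustration only.

```cpp
#include <cstdio>

// Typical per-axis render-scale factors for the common quality modes
// (published DLSS / FSR 2 values; XeSS uses its own ratios and the exact
// factors in Mirage were not verified -- treat these as assumptions).
struct QualityMode { const char* name; double scale; };

int main()
{
    const QualityMode modes[] = {
        { "Native AA / DLAA",  1.000 },  // no upscaling, anti-aliasing only
        { "Quality",           0.667 },
        { "Balanced",          0.580 },
        { "Performance",       0.500 },
        { "Ultra Performance", 0.333 },
    };

    const int outW = 3840, outH = 2160;  // 4K output target
    std::printf("Output resolution: %dx%d\n", outW, outH);
    for (const auto& m : modes)
    {
        // Internal render resolution before the upscaler reconstructs the output.
        const int renderW = static_cast<int>(outW * m.scale + 0.5);
        const int renderH = static_cast<int>(outH * m.scale + 0.5);
        std::printf("%-18s -> %4dx%4d\n", m.name, renderW, renderH);
    }
    return 0;
}
```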



Below, you will find comparison screenshots at 4K, 1440p and 1080p, in the various DLSS, XeSS and FSR quality modes; native TAA, FSR, XeSS and DLAA screenshots are also available in the dropdown menu. For those who want to see how these technologies perform in motion, watch our side-by-side comparison video, which helps uncover issues such as shimmering or temporal instability that may not be visible in the screenshots.

All tests were made using a GeForce RTX 4080 GPU at the Ultra High graphics settings; motion blur, chromatic aberration and depth of field were disabled for clearer image comparisons. DLSS Super Resolution in this game shipped with the rather old version 2.3.1; during this round of testing, we manually updated DLSS to version 3.5.0.
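Updating the DLSS version in a game usually comes down to replacing the nvngx_dlss.dll file in the game's installation folder with a newer build. Below is a minimal sketch of that step; the folder paths are hypothetical placeholders rather than the game's actual install location, and the original DLL is backed up first.

```cpp
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main()
{
    // Hypothetical paths -- adjust to your actual game install folder and to
    // wherever the newer nvngx_dlss.dll was extracted. These are assumptions.
    const fs::path gameDll = R"(C:\Games\AC Mirage\nvngx_dlss.dll)";
    const fs::path newDll  = R"(C:\Downloads\dlss_3.5.0\nvngx_dlss.dll)";

    std::error_code ec;

    // Keep a backup of the DLL the game shipped with, then drop in the new one.
    fs::copy_file(gameDll, gameDll.string() + ".bak",
                  fs::copy_options::overwrite_existing, ec);
    if (!ec)
        fs::copy_file(newDll, gameDll, fs::copy_options::overwrite_existing, ec);

    std::cout << (ec ? "Copy failed: " + ec.message() : "DLSS DLL replaced") << '\n';
    return ec ? 1 : 0;
}
```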

Screenshots




Side-by-Side Comparison Video


Conclusion

In Assassin's Creed Mirage, the in-game TAA solution and the DLAA, DLSS, XeSS and FSR 2.2 implementations all apply a sharpening filter in the render path, and the game lets you tweak the sharpening strength through separate sliders. By default, each upscaling and anti-aliasing solution uses a value of 60; however, the actual amount of sharpening applied differs considerably even at equal values. In our testing we kept the default value of 60 for each solution, and with these settings the DLSS image has the sharpest look, FSR 2.2 is the complete opposite with a very soft overall image, and XeSS sits in the middle between DLSS and FSR 2.2 in terms of sharpness.
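The game does not document how each slider value maps to actual filter strength, and the sharpening passes themselves differ between the upscalers (FSR, for example, is normally paired with AMD's RCAS pass). Purely to illustrate why the same slider value can look so different, here is a minimal unsharp-mask sketch in which a 0-100 slider is mapped to a strength factor; both the mapping and the kernel are assumptions for illustration, not the game's actual filters.

```cpp
#include <algorithm>
#include <array>
#include <cstdio>

// Map a 0-100 in-game slider to a filter strength. The 0.02 factor is an
// arbitrary assumption; each upscaler could use a completely different mapping,
// which is why identical slider values can produce very different sharpness.
double sliderToStrength(int slider) { return slider * 0.02; }

// Minimal unsharp mask on a 1-D luminance row: out = c + strength * (c - blur).
template <size_t N>
std::array<double, N> sharpenRow(const std::array<double, N>& row, double strength)
{
    std::array<double, N> out = row;
    for (size_t i = 1; i + 1 < N; ++i)
    {
        const double blur = (row[i - 1] + row[i] + row[i + 1]) / 3.0;  // 3-tap box blur
        // Add back the high-frequency difference, clamped to a valid range.
        out[i] = std::clamp(row[i] + strength * (row[i] - blur), 0.0, 1.0);
    }
    return out;
}

int main()
{
    const std::array<double, 7> edge = { 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8 };  // soft edge
    for (int slider : { 0, 60, 100 })
    {
        const auto sharpened = sharpenRow(edge, sliderToStrength(slider));
        std::printf("slider %3d:", slider);
        for (double v : sharpened) std::printf(" %.2f", v);
        std::printf("\n");
    }
    return 0;
}
```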

XeSS comes with three upscaling kernels that are optimized for different architectures. The first is the kernel used on Intel Arc GPUs with XMX engines; it runs the "Advanced XeSS upscaling model," which not only performs better in terms of FPS but also offers the best upscaling quality. Intel also provides a kernel optimized for Intel integrated graphics, and a compatibility kernel used on all other architectures that support Shader Model 6.4, e.g. all recent AMD and NVIDIA cards. These two use the "Standard XeSS upscaling model," which is somewhat simpler, with lower performance and quality than the Advanced XeSS upscaling model (we used the compatibility kernel on our RTX 4080). If DP4a instructions aren't available, as on the Radeon RX 5700 XT, slower INT24 instructions are used instead.
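Which path a GPU can take ultimately depends on what the driver exposes to the application. As a rough sketch, this is how an application could query Shader Model 6.4 support through D3D12, the capability gate for the DP4a compatibility kernel mentioned above; this is a generic feature check, not code from the XeSS SDK, which performs its own capability detection internally.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
// Link against d3d12.lib; requires a Windows SDK recent enough to know SM 6.4.

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter (assumes a D3D12-capable GPU).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12 device available");
        return 1;
    }

    // Ask the runtime to clamp our requested shader model to what the driver supports.
    D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_4 };
    const bool sm64Supported =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_SHADER_MODEL, &sm, sizeof(sm))) &&
        sm.HighestShaderModel >= D3D_SHADER_MODEL_6_4;

    std::puts(sm64Supported
        ? "Shader Model 6.4 reported: the XeSS DP4a compatibility kernel can be used"
        : "Shader Model 6.4 not reported: XeSS would have to fall back further");
    return 0;
}
```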

The XeSS implementation in Assassin's Creed Mirage uses the latest version 1.2, which brought significant improvements to overall image detail, stability in motion and performance, especially when running the Standard XeSS upscaling model (DP4a). With XeSS in "Quality" mode you can expect only a modest reduction in overall image detail: distant shadows are slightly more unstable, and tree leaves and vegetation in the distance have a softer look at lower resolutions, such as 1080p, compared to the native image. However, it is important to note that the XeSS implementation does not suffer from ghosting or shimmering/flickering issues in this game. With XeSS running in "Native AA" mode you can expect improved anti-aliasing quality, which means less visible pixelation and fewer jaggies, in vegetation in particular, at a slight cost to performance compared to the TAA solution.

The FSR 2.2 image quality in this game suffers from shimmering and flickering on tree leaves, vegetation, distant shadows and thin lines, and these shimmering issues are visible even when standing still, across all resolutions and quality modes. Ghosting issues, mainly on small particle effects, and a small amount of disocclusion artifacts around the main character and NPCs are also present with FSR 2.2 enabled. In addition, the overall image has a very soft look with FSR 2.2 in "Quality" mode across all resolutions, even with the sharpening slider set to 100. With FSR 2.2 running in "Native AA" mode you can expect a sharper image, but it doesn't help with the shimmering issues, which remain noticeable even when standing still.

In Assassin's Creed Mirage, the DLSS Super Resolution implementation offers the best image quality across all resolutions and quality modes when upscaling is enabled. With DLSS in "Quality" mode you can expect slightly improved rendering of details on tree leaves and vegetation in general, a sharp overall image with excellent stability in motion and on small particle effects, and no visible ghosting or shimmering artifacts. With DLAA enabled, image quality improves even further, rendering additional detail and offering the best graphical experience overall compared to the TAA solution, FSR, DLSS or XeSS.

Speaking of performance, the XeSS 1.2 implementation is practically identical to FSR 2.2 in terms of performance gain over native TAA across all resolutions, with DLSS being slightly slower than both XeSS and FSR, which is quite unusual. Most XeSS implementations we've tested delivered around 10-13% lower performance gains than the competing NVIDIA and AMD upscalers when using the compatibility kernel that works on all GPU architectures, but in this game XeSS essentially matches FSR 2.2's performance gains while producing better image quality on the DP4a instruction set, which is quite an impressive achievement. The DLAA solution has a performance cost of around 6% compared to the TAA solution but offers the best graphical experience overall, and running FSR 2.2 or XeSS at native resolution without upscaling has a performance cost of around 11%.
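To put those percentages in concrete terms, here is a quick back-of-the-envelope calculation; the 100 FPS TAA baseline is a hypothetical number used purely for illustration, not a measured result from the game.

```cpp
#include <cstdio>

int main()
{
    // Hypothetical TAA baseline, used only to illustrate the percentage
    // costs quoted above -- not a measured result from the game.
    const double taaFps = 100.0;

    const double dlaaFps     = taaFps * (1.0 - 0.06);  // ~6% cost for DLAA
    const double nativeAAFps = taaFps * (1.0 - 0.11);  // ~11% cost for FSR 2.2 / XeSS Native AA

    std::printf("TAA baseline:           %.1f FPS\n", taaFps);
    std::printf("DLAA (~6%% cost):        %.1f FPS\n", dlaaFps);
    std::printf("Native AA (~11%% cost):  %.1f FPS\n", nativeAAFps);
    return 0;
}
```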