Shadow of the Tomb Raider: XeSS vs. DLSS Comparison Review


Introduction

Intel's Xe Super Sampling (XeSS) technology is finally available, debuting with the latest version of Shadow of the Tomb Raider. Announced earlier this spring, XeSS is Intel's performance-enhancement technology, rivaling NVIDIA DLSS and AMD FSR 2.0, and it lets you improve framerates at minimal cost to image quality. XeSS, DLSS, and FSR 2.0 all work on the same principle: the game renders everything except the HUD and post-processing effects at a lower resolution than the display output, and a sophisticated upscaling algorithm then makes the result look as if it were rendered at native resolution. Depending on the game, there are subtle differences between the implementations of Intel's Xe Super Sampling (XeSS) and NVIDIA's Deep Learning Super Sampling (DLSS), so we are keen to take a look at both in this title.
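The relationship between output resolution and internal render resolution can be sketched in a few lines. The per-axis scale factors below are the commonly published values for the Quality, Balanced, and Performance modes of these upscalers; they are approximations for illustration, not values read out of this game's settings.

```python
# Illustrative sketch: how a temporal upscaler derives its internal render
# resolution from the output resolution. The scale factors are assumed
# typical values, not taken from Shadow of the Tomb Raider itself.
SCALE_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, mode):
    """Return the approximate internal render resolution for a quality mode."""
    f = SCALE_FACTORS[mode]
    return round(out_w / f), round(out_h / f)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

In other words, "4K Quality mode" is really a 1440p-class render that the algorithm reconstructs to 3840x2160 before the HUD is composited on top.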



Below, you will find comparison screenshots at 4K, 1440p, and 1080p in the various XeSS and DLSS quality modes. For those who want to see how DLSS and XeSS perform in motion, watch our side-by-side comparison video, which can help uncover issues like shimmering or temporal instability that are simply not visible in screenshots.

All tests were conducted on a GeForce RTX 3080 GPU at the Ultra graphics settings with ray tracing enabled; motion blur and depth of field were disabled for easier image comparison. DLSS was manually updated to version 2.4.12 by swapping the DLL file.

Screenshots




Side-by-Side Comparison Video


Conclusion

In Shadow of the Tomb Raider, none of the anti-aliasing and upscaling solutions uses a sharpening filter in its render path. You can, however, enable AMD FidelityFX CAS when TAA is active; for our testing, we left it disabled. It is also important to note that Shadow of the Tomb Raider can launch in either DirectX 11 or DirectX 12 mode, and XeSS only supports the DirectX 12 API in this game. If you have been playing in DirectX 11 mode, you have to switch to DirectX 12 in order to use XeSS.

Compared to native TAA, XeSS image quality is a very noticeable upgrade across all resolutions. The in-game TAA solution produces a blurry overall image at every resolution except 4K and renders small object detail, such as tree leaves or fishing nets, very poorly. XeSS resolves all of these TAA issues. Compared to DLSS, XeSS image quality comes very close to what DLSS can output, with some differences in temporal stability. The most noticeable difference between XeSS and DLSS is how water puddles render: with XeSS, puddles appear at noticeably lower resolution and look very jittery, which some people may find quite distracting. The jittery puddles are visible even in 4K XeSS Quality mode, and the lower the internal resolution, the more visible the issue becomes. The second-most-noticeable difference is hair rendering, which looks pixelated in motion with XeSS and can be distracting. There are also some differences in how XeSS deals with ghosting compared to DLSS. Overall, XeSS handles ghosting similarly to DLSS at 1440p and above; 1080p is a bit different, as XeSS shows more ghosting on small objects such as falling leaves or distant walking NPCs.

XeSS ships with three upscaling kernels, each optimized for a different class of hardware. The first kernel runs on Intel Arc GPUs with XMX engines. It is the most advanced model, delivering both the highest FPS and the best upscaling quality; Intel calls it the "Advanced XeSS upscaling model." Intel also provides a kernel optimized for Intel integrated graphics and a compatibility kernel for all other architectures that support Shader Model 6.4, e.g., all recent AMD and NVIDIA cards. These two use the "Standard XeSS upscaling model," which is a bit simpler, with lower performance and quality than what you get on Arc GPUs; this is the model running on our RTX 3080. If DP4a instructions aren't available, as on the Radeon RX 5700 XT, slower INT24 instructions are used instead.
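To see why DP4a support matters for the compatibility kernel, here is a minimal model of what the instruction computes: a dot product of four packed 8-bit integers, accumulated into a 32-bit value, all in a single hardware instruction. This Python function is an illustration of the semantics only; on a real GPU the whole operation executes in one cycle-level instruction rather than a loop.

```python
# Sketch of the DP4a operation the Standard XeSS kernel relies on:
# dot product of four signed 8-bit integers, accumulated into 32 bits.
def dp4a(a_bytes, b_bytes, acc):
    """Model of DP4a: acc + sum(a[i] * b[i]) over four int8 pairs."""
    assert len(a_bytes) == len(b_bytes) == 4
    return acc + sum(a * b for a, b in zip(a_bytes, b_bytes))

# One fused multiply-accumulate over four int8 pairs:
print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], 10))  # 1*5 + 2*6 + 3*7 + 4*8 + 10 = 80
```

A neural-network upscaler performs enormous numbers of these small integer dot products, which is why a GPU that must emulate them with wider integer math, as on the RX 5700 XT, pays a speed penalty, and why Arc's dedicated XMX matrix engines are faster still.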

Interestingly, the performance gains from XeSS differ substantially from those of DLSS and FSR 2.0, which have delivered essentially equal gains in most games. Since we are testing XeSS on an RTX 3080, which lacks the XMX instructions designed to accelerate XeSS workloads on Intel's Arc GPUs, the performance gains are lower than what Arc GPUs can be expected to achieve, so keep that in mind. That said, the actual difference in performance gains between XeSS and DLSS is about 10% at 4K Quality mode, in favor of DLSS. Compared to native 4K, however, XeSS manages to deliver up to 40% more performance while using the DP4a instruction path that is compatible with all GPU architectures, which is still quite a decent uplift.
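The uplift percentages above can be reproduced with simple arithmetic. The framerate numbers in this sketch are hypothetical placeholders chosen to match the roughly 40% figure quoted in the text; only the percentages themselves come from our measurements.

```python
# Sketch of the uplift arithmetic used in the conclusion.
def uplift_pct(fps_upscaled, fps_native):
    """Percentage gain of the upscaled framerate over the native one."""
    return (fps_upscaled / fps_native - 1.0) * 100.0

native_4k = 50.0     # hypothetical native-4K framerate
xess_quality = 70.0  # hypothetical XeSS Quality framerate
print(f"{uplift_pct(xess_quality, native_4k):.0f}%")  # 40%
```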
