
Hogwarts Legacy: FSR 2.1 vs. XeSS vs. DLSS 3 Comparison

Hogwarts Legacy is out now, with support for NVIDIA's DLSS Super Resolution (DLSS 2.5), NVIDIA's DLSS Frame Generation (also known as DLSS 3), NVIDIA's Deep Learning Anti-Aliasing (DLAA), Intel's Xe Super Sampling (XeSS), and AMD's FidelityFX Super Resolution 2.1 (FSR 2.1). In this mini-review we compare the image quality and performance gains offered by these technologies.

 
"Speaking of performance, Hogwarts Legacy is a very CPU intensive game, especially with ray tracing enabled, as the CPU usage is mostly single-threaded on PC due to a very poor implementation of Unreal Engine 4 DirectX 12, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game, even at 4K. We've seen these issues before in other recent Unreal Engine 4 games, like The Callisto Protocol or Gotham Knights. In such a CPU limited scenario, a very welcome help comes from the DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect more than doubled performance at 4K and 1440p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with the input latency."

This is why I hate the likes of DLSS: it's being used as an excuse not to optimize games. We all saw this coming.
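To put rough numbers on the frame-generation claim in the review excerpt quoted above, here is a toy sketch. The ~2x factor and the small pacing overhead are assumptions for illustration, not measurements; only the ~57 FPS CPU cap comes from this thread.

```python
# Toy model: DLSS Frame Generation interpolates one extra frame per rendered
# frame, so it can lift the presented frame rate past a CPU-imposed cap.

def presented_fps(render_fps: float, frame_gen: bool, overhead: float = 0.05) -> float:
    """Presented FPS with an assumed ~2x multiplier and small pacing overhead."""
    if not frame_gen:
        return render_fps
    return render_fps * 2 * (1 - overhead)

cpu_capped = 57.0
print(presented_fps(cpu_capped, frame_gen=False))  # ~57 FPS, CPU limited
print(presented_fps(cpu_capped, frame_gen=True))   # ~108 FPS presented
# Input latency still tracks the rendered rate rather than the presented rate,
# which is why Reflex is bundled alongside frame generation.
```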
 
I like how there's no performance difference between the quality modes, lol.
 
"Speaking of performance, Hogwarts Legacy is a very CPU intensive game, especially with ray tracing enabled, as the CPU usage is mostly single-threaded on PC due to a very poor implementation of Unreal Engine 4 DirectX 12, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game, even at 4K. We've seen these issues before in other recent Unreal Engine 4 games, like The Callisto Protocol or Gotham Knights. In such a CPU limited scenario, a very welcome help comes from the DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect more than doubled performance at 4K and 1440p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with the input latency."

This is why I hate the likes of DLSS: it's being used as an excuse not to optimize games. We all saw this coming.
It can be used as an excuse.

Just like having faster hardware can be used as an excuse not to optimize games.

Having a useful tool doesn't always favour laziness; it's just how some people are.
 
Sweet, but having some basic performance charts for easy comparison would also be great.
 
Would like to see XeSS tested on an Arc GPU as well. There is little reason to use it over FSR on an NVIDIA GPU; they're very close.
 
"Speaking of performance, Hogwarts Legacy is a very CPU intensive game, especially with ray tracing enabled, as the CPU usage is mostly single-threaded on PC due to a very poor implementation of Unreal Engine 4 DirectX 12, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game, even at 4K. We've seen these issues before in other recent Unreal Engine 4 games, like The Callisto Protocol or Gotham Knights. In such a CPU limited scenario, a very welcome help comes from the DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect more than doubled performance at 4K and 1440p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with the input latency."

This is why I hate the likes of DLSS: it's being used as an excuse not to optimize games. We all saw this coming.

I don't hate DLSS, but as you point out, it should not be used as a crutch, particularly DLSS 3.0, which comes with latency drawbacks.



As for the quality comparison, I'm seeing DLSS getting a bit more foliage detail in some cases and FSR a bit more detail in the rock wall to the right.
 
Does anyone else think that things like DLSS, and especially frame generation, will make it more difficult for consumers to accurately compare video cards? For example, in pure raster we can assume that a frame from GPU A is equal to a frame from GPU B, an apples-to-apples comparison (though there were some claims of better image quality on AMD vs. NVIDIA several years ago). But with DLSS, XeSS, and especially frame generation, is a generated frame equal to a rasterized frame? Are these software tricks going to muddy the waters for consumers and make it much more difficult to confidently know which is the better product?
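One way reviewers could keep that comparison honest is to report rendered and presented frame rates separately. A hypothetical sketch follows; the configurations and all figures in it are invented purely for illustration.

```python
# Separate the frames a GPU actually rendered from frames an interpolator
# synthesized. All numbers below are made up for illustration.
results = [
    # (configuration, rendered_fps, presented_fps)
    ("GPU A, pure raster",           72, 72),
    ("GPU B, upscaling + frame gen", 60, 114),
]

for name, rendered, presented in results:
    generated = presented - rendered
    share = generated / presented if presented else 0.0
    print(f"{name:32s} rendered={rendered:3d}  presented={presented:3d}  "
          f"generated={generated:3d} ({share:.0%} of output)")
```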
 
XeSS looks blurrier to me than FSR and DLSS (all Quality). Between FSR and DLSS, one looks sharper on the rocks and the other on the grass...
 
What does the sharpening filter do exactly, and why not max it out? I haven't gotten to use DLSS yet, but in at least 75% of the comparisons of these technologies I see, DLSS appears blurrier than FSR to me, as if DLSS were running with a lower-resolution texture pack or something. Is DLSS working well in this game? It looks identical to TAA, maybe even marginally worse!? Everything is 56/57 FPS across all resolutions and all quality settings; are any of these even properly working?
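On the first question: the slider most likely drives a sharpening pass applied after upscaling (FSR 2, for example, ships an RCAS pass), and conceptually that is an unsharp-mask-style operation that boosts local contrast. A minimal sketch of the idea follows; it is not the game's actual filter, and the parameter names are my own.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, amount: float = 0.5, radius: float = 1.0) -> np.ndarray:
    """Sharpen a grayscale float image in [0, 1] by adding back the difference
    between it and a blurred copy. `amount` is roughly what a 0-100% in-game
    slider would scale."""
    blurred = gaussian_filter(image, sigma=radius)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)

# Maxing the slider exaggerates edges but also produces halos and shimmer on
# fine detail such as foliage and hair, which is why defaults stay modest.
```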
 
Is the implementation buggy, so there's no gain from changing the quality? lol
 
My wife got this game on Saturday. She said it did a good job detecting her hardware and automatically adjusting the settings to Ultra with RT off (she's got a Ryzen 5800X3D and an RX 6900 XT).

When I saw her playing, my impression was that it's a very good-looking game. I assumed it was rendering at native 4K, but I wanted to see which settings were dialed in because it seemed to be running too smoothly for what I read in W1z's review. Sure enough, it was actually 1440p with FSR enabled. Well, it fooled me; I thought it looked like native 4K.
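That lines up with FSR 2's documented scaling presets: Quality renders at 1/1.5 of the output resolution per axis, so a 4K output is fed by a 1440p internal frame. A quick calculator, assuming the game uses the stock presets:

```python
# Per-axis scale factors from AMD's FSR 2 documentation.
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    scale = FSR2_SCALE[mode]
    return round(out_w / scale), round(out_h / scale)

print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```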
 
Not sure why the FPS is all the same in the review?

If I change between Quality DLSS and Performance DLSS, my frame rate changes from 57-60 to 42-46.
 
Not sure why the FPS is all the same in the review?

If I change between Quality DLSS and Performance DLSS, my frame rate changes from 57-60 to 42-46.

It would be my guess that the 10700F used in this test is the limiting factor, maxing out at under 60 FPS no matter the settings.
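That would explain the flat numbers: the effective frame rate is roughly the minimum of what the CPU and the GPU can each sustain, and resolution only moves the GPU side. A toy illustration; the per-resolution GPU figures are invented, only the ~57 FPS cap comes from this thread.

```python
# Whichever side takes longer per frame sets the pace.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

CPU_FPS = 57  # single-thread-bound game logic and draw submission
for resolution, gpu_fps in [("1080p", 160), ("1440p", 120), ("4K", 75)]:
    print(resolution, effective_fps(CPU_FPS, gpu_fps))  # 57 everywhere
# Upscaling raises only gpu_fps, so once the CPU is the smaller number the
# result stops moving; frame generation sidesteps this by presenting extra
# frames without extra CPU work per frame.
```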
 
57 FPS CPU limit at every resolution. :D

To be fair, by the time 4090-class performance reaches the mainstream, CPUs will be much faster as well, so I guess it's fine.
 
57 FPS CPU limit at every resolution. :D

To be fair, by the time 4090-class performance reaches the mainstream, CPUs will be much faster as well, so I guess it's fine.

Not a problem on Intel.
 
Yeah... The more I read into Hogwarts Legacy, the worse it looks in terms of quality, optimization, and general polish...

So, a normal AAA launch.
 
So, a normal AAA launch.
Standard brute-force approach, sprinkled with upscaling techniques to make up for the shortfalls. Wondering how the Unreal engine would fare with Nanite and Lumen.

If you want a challenging scene, go to Hogwarts at night and do the butterfly challenge with RT on. Fortunately that only lasts about 3 seconds, but you definitely feel the dip in performance, even on a 4090 Suprim Liquid with a 7700X at a 5.65 GHz all-core OC.
The game looks best at 4K Ultra settings with NVIDIA DLAA anti-aliasing, upscaling off, and frame generation on, for the best balance between latency and visual quality, except for that one incident mentioned after 20% progress completion, FYI. IMO.
 
Looking at the 1080p screenshots, the upscaling quality is surprisingly fine. Not as great as native, but fine.

With that said, hair looks weird with FSR, and the whole image seems washed out with XeSS. I don't see specific issues with DLSS at first glance. Interesting that performance is the same all around.

I'm even more tempted to try the game now. :)
 
XeSS looks blurrier to me than FSR and DLSS (all Quality). Between FSR and DLSS, one looks sharper on the rocks and the other on the grass...
I agree, XeSS is definitely softer, and the details also look lower-res. It's more noticeable at 1440p and 1080p.
 
Not a problem on Intel.
Another minimal-effort clickbait article. Copy-pasta from Twitter, never bothering to rectify the false info.

 
One common theme among all those AAA titles that have issues is that they all seem to use Unreal Engine 4 pushed to its maximum. I think that is the main issue.

We are at a point where we need a next-generation engine to really bring up the performance and visual quality.
 
One common theme among all those AAA titles that have issues is that they all seem to use Unreal Engine 4 pushed to its maximum. I think that is the main issue.

We are at a point where we need a next-generation engine to really bring up the performance and visual quality.

UE 5 is already a thing

 
How Avalanche Software and other studios' meetings go:
Employee: Boss, we need people and money to optimize for the PC platform.
Boss: How cheaply can we achieve this?
Employee: We can use some PC technologies, but they lower resolution and increase latency.
Boss: Fu..ck them, they will crack it anyway in a month or two. Make it so.
 