
Ratchet & Clank Rift Apart: DLSS vs FSR vs XeSS Comparison

DLSS wins again, as usual. As someone who doesn't care about ray tracing, DLSS, or frame generation, I am completely content paying less for an AMD GPU personally.

In the future I can see myself going with an RTX 5080 Ti or something though, assuming Nvidia keeps improving and winning on all these fronts. That's still 3 years away though, so we will see how it all goes.
 
Thank you for adding this comparison @maxus24

DLSS wins again, as usual. As someone who doesn't care about ray tracing, DLSS, or frame generation, I am completely content paying less for an AMD GPU personally.

I'm kinda in the same boat, as I'm someone who doesn't care about upscaling at all, be it FSR or DLSS. And I don't want it used as a crutch so developers don't optimize their games.
 
What is the explanation for a game that was just released using an old version of FSR? lol
 
If DLSS is bad/doesn't work:
DLSS 3 is a scam ngreedia! hahahah!

DLSS 3 works great and doubles performance at 1440p and 4K:
[ insert copium ]
 
If DLSS is bad/doesn't work:
DLSS 3 is a scam ngreedia! hahahah!

DLSS 3 works great and doubles performance at 1440p and 4K:
[ insert copium ]

DLSS 3 works well in these kinds of games for sure, but I still wouldn't use it in a competitive shooter.
 
DLSS 3 works well in these kinds of games for sure, but I still wouldn't use it in a competitive shooter.

I personally don't use it in any shooter (or driving game) -- it's still a great tech tho.

In competitive shooters I turn everything down anyway; it makes it a lot harder to hide in the shadows when there are no shadows.
 
I've done some tests with AMD hardware, and with these drivers the only way to improve performance is a better GPU.
More memory, lower RAM latency, higher CPU clocks, 3D cache, Smart Access Memory... none of them help. Maybe these things will change in future driver / game updates.

 
I usually don't have any issue at all using FSR, but it looks like hot garbage in this game. The flickering on distant objects and disocclusion artifacts are nuts, at least at 1440p using the Quality option. XeSS looks much better, but the performance bump is much smaller. I ended up going with IGTI on my 1080 Ti. It provides a better image than FSR and a much larger performance boost than XeSS.
 
I've done some tests with AMD hardware, and with these drivers the only way to improve performance is a better GPU.
More memory, lower RAM latency, higher CPU clocks, 3D cache, Smart Access Memory... none of them help. Maybe these things will change in future driver / game updates.

Digital Foundry also mentioned they ran into some weird, unexplained bottlenecks.
 
"awww shit, here we go again".JPG
 
I don’t see one damn bit of difference. I don’t use any of it anyway. 4K, 1440p, and 1080p work fine for me instead of all this fake-frame stuff and whatever.
 
Using FSR 2.1 and, seemingly, XeSS 1.0 doesn't help their causes. XeSS 1.1 is shaping up to be excellent (as performant as FSR but with better IQ), and FSR 2.2 makes notable improvements to its shortcomings with disocclusion artefacts.

Yet again we see DLSS in a league of its own, besting native + TAA, and then DLAA wiping the floor with everything for a small performance cost.

Credit to Nixxes for this port. It's of course not perfect, but the state at launch was generally way better than some of this year's earlier garbage, and it has already been patched twice since launch, fixing many of the little niggling issues. I hope that updating FSR and XeSS makes it onto the list of updates; people with non-RTX hardware deserve the best versions of those solutions.
 
I've been using DLDSR on my 1440p monitor to set my display output to 4K, then using DLSS on Balanced. I seem to get better image quality than native while keeping my power usage lower than it would be at just 4K. It only uses ~125 W on my 4090 and looks great. I haven't tried any of the other upscalers, though I hear XeSS is coming along nicely.
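For anyone who wants the rough arithmetic behind that combo, here's a quick sketch. The per-axis factors (1.5x for DLDSR 2.25x, ~0.58 for DLSS Balanced, ~0.667 for DLSS Quality) are the commonly quoted ones, not official figures, so treat the exact numbers as assumptions:

```python
# Back-of-the-envelope render resolutions for DLDSR 2.25x + DLSS Balanced on a 1440p panel.
# Scaling factors are assumptions based on commonly quoted values, not official specs.
native = (2560, 1440)                                          # 1440p monitor
dldsr_out = tuple(round(d * 1.5) for d in native)              # 2.25x pixels -> 3840 x 2160
balanced_internal = tuple(round(d * 0.58) for d in dldsr_out)  # ~2227 x 1253 rendered
quality_at_native = tuple(round(d * 0.667) for d in native)    # ~1708 x 960 rendered

print("DLDSR output:", dldsr_out)
print("DLSS Balanced internal res:", balanced_internal)
print("Plain DLSS Quality at native:", quality_at_native)
```

So, if those factors hold, the combo actually renders more pixels internally than plain DLSS Quality at native 1440p, which is one plausible reason it can look better than native while still drawing less power than rendering true 4K.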
 
I've been using DLDSR on my 1440p monitor to set my display output to 4K, then using DLSS on Balanced. I seem to get better image quality than native while keeping my power usage lower than it would be at just 4K.
This was the moment I knew, 100% irrefutably, that this tech was capable of exceeding native IQ; my quote from ~18 months ago:
Enable DLDSR 2.25x in NVCP using the latest 511.23 driver.

Set your DLSS-supported game to that resolution; for me, using a native 3440x1440 monitor, this is 5120x2160.

Now enable DLSS in Performance mode; for me this is now an internal render res of 2560x1080.
So that Performance mode resolution is higher than what DLSS Quality mode would be if you output at native, and the magic it's doing under the hood to push that res 2.25x higher and then downsample it back to native gives a properly chef's-kiss result... it's really something.
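To put concrete numbers on that claim, here's a quick sketch using the resolutions from the steps above; the ~50% Performance and ~66.7% Quality per-axis factors are the commonly cited ones and are an assumption here, not something from the review:

```python
# Arithmetic behind "Performance mode at the DLDSR 2.25x resolution renders higher
# than Quality mode at native". Factors are commonly cited values, not official specs.
native = (3440, 1440)          # ultrawide monitor from the quoted steps
dldsr_out = (5120, 2160)       # DLDSR 2.25x target reported in the quote

perf_internal = tuple(d // 2 for d in dldsr_out)                # 2560 x 1080
quality_at_native = tuple(round(d * 0.667) for d in native)     # ~2294 x 960

print("DLSS Performance internal (DLDSR output):", perf_internal)
print("DLSS Quality internal (native output):", quality_at_native)
```

2560x1080 versus ~2294x960, so Performance against the DLDSR output really would feed DLSS more pixels than Quality against native, before everything gets downsampled back to the panel.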
 
This was the moment I knew, 100% irrefutably, that this tech was capable of exceeding native IQ; my quote from ~18 months ago:

So that Performance mode resolution is higher than what DLSS Quality mode would be if you output at native, and the magic it's doing under the hood to push that res 2.25x higher and then downsample it back to native gives a properly chef's-kiss result... it's really something.

This gives me a headache. lol @maxus24, care to do some testing in this regard to see if it actually results in improved visuals at less cost to performance? I call BS on this until a reviewer like you does a deep dive on it :roll:
 
There is some rough spot in Ratchet & Clank that absolutely kills CPU performance; having Frame Generation helps tremendously.

Yes, FPS fell as low as 28 FPS on a 13700K with tuned 6000 MT/s DDR5 :D
 
This gives me a headache. lol @maxus24, care to do some testing in this regard to see if it actually results in improved visuals at less cost to performance? I call BS on this until a reviewer like you does a deep dive on it :roll:
Oh, it definitely works, no BS about it. I've just found it hard to prove with screenshots, because they always come out at the higher output resolution rather than at native, so they're hard to compare apples to apples.

The numbers and method do make it sound more confusing than it is, though: deep learning super sampling, only to run an AI upscaler at values now altered relative to native because of the supersampled output res, which then gets downsampled to fit on your monitor... lol o_O o_O o_O

But when you do it in practice it's easy, and the results speak for themselves.
 
XeSS ghosting/smearing (visible on the spacecraft in the background) is not mentioned in the conclusions for this one either. Am I the only one who finds these distracting?
 
What is happening with DLSS Quality at 4K? The textures are very blurry. DLSS Balanced at 4K shows much more detail than DLSS Quality at 4K. Examples are Ratchet's shoes, his helmet, the flowers in the grass, etc.
 
This gives me a headache. lol @maxus24, care to do some testing in this regard to see if it actually results in improved visuals at less cost to performance? I call BS on this until a reviewer like you does a deep dive on it :roll:
Well, idk about performance, since my card is way overspec'd for my monitor so I'm running at 60 FPS no matter what, but I usually use DLDSR at 4K for anti-aliasing purposes. I started just with games with bad AA, but then I started doing it for every game. And then from there, if there is DLSS, I think, why not use it to lessen power consumption? I mean, it does do a really good job of filling in the blanks. Have I pixel-peeped this game in particular? Not really. But I've found that this combo works well for my primary concern (anti-aliasing) in other games, so it's just become a habit by now.
 
Any measurements on the latency increase when using DLSS Frame Generation?

People playing with a controller may not care, but I've found I can't stand going back to games with high render latencies, as the input lag just feels far too sluggish.
 
Any measurements on the latency increase when using DLSS Frame Generation?

People playing with a controller may not care, but I've found I can't stand going back to games with high render latencies, as the input lag just feels far too sluggish.
I tried DLSS Frame Generation in this game. There is a more floaty feeling to the mouse when it is on, and there are also artefacts on the crosshair when you zoom the sights on a weapon and move the mouse quickly. With controllers this shouldn't be noticeable, because there is no sporadic mouse movement; camera motion control is smoother and slower, so frame generation doesn't produce as many artefacts.
 