
Ratchet & Clank Rift Apart: DLSS vs FSR vs XeSS Comparison

I tried to use DLSS frame generation in this game. There is a more floaty feel to the mouse when it is on, and there are also artefacts on the crosshair when you aim down sights with the weapons and move the mouse quickly. With controllers this shouldn't be noticeable because there is no sporadic mouse movement; camera motion on a controller is smoother and slower, so frame generation doesn't produce as many artefacts.
That floaty feel is the problem. I'm down to <6ms render latency, and I'm seeing people say it "only" adds about 30ms.
There's a discussion here about CP2077 with frame gen and how 30-40ms total latency is fast and frame gen at 60-70ms is totally peachy.

I run at ~12ms system latency and can't understand how these people can even aim at anything. These same people are probably on OLED displays and bragging about how fast and responsive it is.

Comments like this just scare me if they think 50ms is the 'best case', but I'm not seeing many figures in the actual reviews/articles on the games.
[attached screenshots: the latency figures being quoted in that discussion]



How can people even play like this? It'd be so incredibly floaty




For comparison:

BL2 (Vulkan; it's system latency but worded as render latency since it thinks it's DX9)
[screenshot: BL2 latency overlay]


DRG in DX12 with reflex (proving the PC is indeed keeping the system latency that low)
[screenshot: DRG latency overlay]


I can't imagine having 5x to 10x the render/input latency and thinking that's the best there is.
(Yes, if your render latency is 5x slower you're reacting to something 5x older - so it IS input latency, just not caused by the input devices.)
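
If anyone wants to sanity-check the numbers, here's the rough back-of-the-envelope model I use. The fixed input/display term and the frame-gen cost (roughly one extra source frame held back, plus a few ms of processing) are my own illustrative assumptions, not measured or vendor figures:

Code:
# Rough end-to-end latency arithmetic - illustrative assumptions, not measurements.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

def est_total_latency_ms(render_ms: float,
                         input_and_display_ms: float = 6.0,  # assumed lump sum for polling/OS/display
                         frame_gen: bool = False,
                         fg_overhead_ms: float = 3.0) -> float:
    """Interpolation-style frame gen has to hold back a real frame before it can
    show the in-between one, so assume it adds ~one render frame time plus overhead."""
    total = render_ms + input_and_display_ms
    if frame_gen:
        total += render_ms + fg_overhead_ms
    return total

base = frame_time_ms(144)   # ~6.9 ms per frame around a 143-144 FPS cap
print(f"144 FPS source, no frame gen:   ~{est_total_latency_ms(base):.0f} ms")                   # ~13 ms
print(f"144 FPS source, with frame gen: ~{est_total_latency_ms(base, frame_gen=True):.0f} ms")   # ~23 ms

low = frame_time_ms(40)     # a CPU-bound rough spot
print(f"40 FPS source, with frame gen:  ~{est_total_latency_ms(low, frame_gen=True):.0f} ms")    # ~59 ms

Same model either way: a 5x slower render latency means the frame you're reacting to is roughly 5x older, and a low source frame rate plus frame gen lands in the same ballpark as those 50-70ms figures people are posting.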
 
I usually don't have any issue at all using FSR, but it looks like hot garbage in this game. The flickering on distant objects and disocclusion artifacts are nuts, at least at 1440p using the Quality option. XeSS looks much better, but the performance bump is much smaller. I ended up going with IGTI with my 1080 Ti. It provides a better image than FSR and a much larger performance boost than XeSS.
Typical for Nixxes ports; FSR looks bad in the Spider-Man games too. Terrible aliasing and flickering.
 
The gap in IQ between DLAA and the rest of the options is massive. Compared to the native 4K image, it tidies up the softness and retains all of the details, such as the better-quality highlights, the details under the light bridge, the very, very fine detail on all the foliage and the windows on the background buildings. With DLAA you can actually see the leather grain on the gloves and helmet, whereas that gets smudged away with DLSS Q or native.

If only the in-game TAA solutions looked as good as the one NV has with DLAA, there would never be a case of DLSS Quality looking better than native; the bulk of the quality uplift is the much better TAA solution NV has, plus a little bit of sharpening.
 
There are some rough spots in Ratchet & Clank that absolutely kill CPU performance; having Frame Generation helps tremendously.

Yes, FPS fell as low as 28 FPS on a 13700K with tuned 6000 MT/s DDR5 :D

Great, but please don't be part of the problem by actively being positive about crappy optimization to promote some technology that picks up the slack.
This is exactly what people were worried about with upscaling technologies... and it's happening already?

(That said, I don't get this. Nixxes is an excellent PC porting company, so I'm expecting patches. It should have been better on release, but it is what it is.)

I usually don't have any issue at all using FSR, but it looks like hot garbage in this game. The flickering on distant objects and disocclusion artifacts are nuts, at least at 1440p using the Quality option. XeSS looks much better, but the performance bump is much smaller. I ended up going with IGTI with my 1080 Ti. It provides a better image than FSR and a much larger performance boost than XeSS.

I only really have experience with FSR (2.2) in Warframe, but it's really bad there as well.
Just smeary stuff where I guess the game does not provide proper motion vectors. It happens on items that animate in place; they ghost all over the map, but if you start moving the camera they are sharp, or at least sharp enough.

So moving the camera sort of forces or provides motion vectors that aren't there normally, making it all look... well, bad.
Not unusable, and a godsend for really low-tier GPUs, but still, lots of work to be done.


But I watched a friend play that crappy Callisto Protocol game on an RTX 2080 with FSR enabled to get some usable frames, and there it was more than fine.
 
The gap in IQ between DLAA
DLAA slows things down almost as much as frame generation does


Deep Rock Galactic, in-game 143 FPS cap, 4K all settings ultra, Reflex set to "Enabled+boost" and NULL enabled in NVCP (doesn't work in DX12).
In DX11, Nvidia's Ultra Low Latency setting works. In DX12, only Reflex works.


DX11 on the left, DX12 on the right.

DLSS quality, my normal settings
This game uses DLSS as a form of anti-aliasing and genuinely looks better with DLSS quality on.
[screenshots: DX11 vs DX12 latency/power overlays, DLSS Quality]

Pretty equal - tiny variances within the game can explain the small differences.

DLSS disabled
A 0.9ms latency difference is not noticeable to a human, but I do like that the better visual fidelity option above used 19 watts less.
[screenshots: DX11 vs DX12 latency/power overlays, DLSS disabled]

DX11 vs 12 are again comparable


DLAA enabled, which prevents DLSS from being enabled
In this case not unbearable at all, but it's still a 30% increase in latency for 50W more power.
Without the FPS cap and GPU undervolt, it'd come at a considerably larger hit for what is effectively anti-aliasing.
[screenshots: DX11 vs DX12 latency/power overlays, DLAA enabled]

DX12 + DLAA? Uh oh.

GPU usage is higher (hitting 100%) and the latency has decided it's mighty morphin doublin' time (go go GPU power consumption rangers, or something. I need sleep.)



There's so much to test, and not enough time for reviewers to test it all out - but when DLAA and DLSS frame gen all add latency together, we're moving from the golden age of gaming that DLSS hinted at into the brown age of bouncing our mouse and keyboard signals off Starlink to control a game being played on the ISS.


Oh, here are the "normal" settings that everyone seems to recommend:
No FPS cap
Vsync off
DLSS quality
All that FPS, but with the higher latency, what's the point? You can't benefit from it.
It'd be like watching a movie that goes from 24 FPS Blu-ray to 60 FPS YouTube, but with a 50ms audio delay slapped on top so mouth movements and voices never match.
Left is Reflex off, right is Reflex on - not that you can tell, since it only works when you have GPU performance to spare.
[screenshots: Reflex off vs Reflex on]




Today's ranting shall now end, and I'll go play vidya games.
 
DLAA enabled, which prevents DLSS from being enabled
Totally the opposite. DLAA and DLSS are the very same technology with one difference - DLAA renders at native resolution, skipping the upscaling part. That's why you don't get a performance uplift with DLAA (it actually reduces FPS, since DLAA/DLSS processing takes time per frame), but you also don't get upscaling artifacts and loss of detail.
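
To put rough numbers on that: the only knob that changes between DLAA and the DLSS presets is the internal resolution the same reconstruction pass gets fed. Quick sketch using the commonly cited DLSS 2.x per-axis scale factors (approximate, and individual games can override them):

Code:
# Internal render resolution per mode at 4K output, using the commonly cited
# DLSS 2.x per-axis scale factors (approximate; individual games may differ).
SCALE = {
    "DLAA":              1.0,    # native resolution, AA pass only
    "Quality":           0.667,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 0.333,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

OUT_W, OUT_H = 3840, 2160
for mode in SCALE:
    w, h = internal_res(OUT_W, OUT_H, mode)
    pct = 100 * (w * h) / (OUT_W * OUT_H)
    print(f"{mode:>17}: {w}x{h}  (~{pct:.0f}% of native pixels)")

So DLAA pays the full native shading cost plus the reconstruction pass, while DLSS Quality shades well under half the pixels and hands the rest to the upscaler - which is where both the FPS gain and the potential detail loss come from.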
 
All this makes me glad I just play the game at native res.
 
DLAA slows things down almost as much as frame generation does


Deep Rock Galactic, in-game 143 FPS cap, 4K all settings ultra, Reflex set to "Enabled+boost" and NULL enabled in NVCP (doesn't work in DX12).
In DX11, Nvidia's Ultra Low Latency setting works. In DX12, only Reflex works.


DX11 on the left, DX12 on the right.

DLSS quality, my normal settings
This game uses DLSS as a form of anti-aliasing and genuinely looks better with DLSS quality on.

That's odd, since I am under the impression DLSS uses DLAA, which is really the only reason some people claim it looks better than native. The only difference with only using DLAA is that you're using it at your native resolution instead of a fake res/upscaling.
 
The awful TAA implementations in modern games are what made me go with a lower-performance, higher-priced 4080 vs a 7900 XTX.

You just can't trust game devs anymore, and at least, even if not perfect, DLAA and DLSS end up being better than native due to that shit TAA technique.

AMD is dead in 3 years unless they can come out with FSR 3 soon and it's much improved over FSR 2; DLSS has become almost mandatory in a lot of games.
 
Totally the opposite. DLAA and DLSS are the very same technology with one difference - DLAA renders at native resolution, skipping the upscaling part. That's why you don't get a performance uplift with DLAA (it actually reduces FPS, since DLAA/DLSS processing takes time per frame), but you also don't get upscaling artifacts and loss of detail.
That's odd, since I am under the impression DLSS uses DLAA, which is really the only reason some people claim it looks better than native. The only difference with only using DLAA is that you're using it at your native resolution instead of a fake res/upscaling.

DLDSR is the one that added the massive latency; I checked that one this morning. A consistent 3-4x increase.

DLAA isn't as bad as I thought, as I was mixing the two up (but the images I showed still do indicate it can increase latency).
In some of the testing I did, DLSS was higher latency than DLSS off - this didn't occur at 4K, but did occur at 1080p. Some upscaling bugs are game- and resolution-dependent.
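
The pixel counts line up with that, too. Rough sketch below: the 1.78x/2.25x DLDSR factors are the total-pixel multipliers the driver exposes, and the simplification that shading cost scales roughly with pixels rendered is my own assumption:

Code:
# Relative pixel load vs native output, assuming shading cost scales ~linearly
# with pixels rendered (ignores fixed per-frame costs and the DL filter passes).
NATIVE = 2560 * 1440  # example 1440p output

pixel_load = {
    "DLSS Quality (0.667x per axis)": (2560 * 0.667) * (1440 * 0.667),
    "Native / DLAA (1.0x)":           NATIVE * 1.0,
    "DLDSR 1.78x (total pixels)":     NATIVE * 1.78,
    "DLDSR 2.25x (total pixels)":     NATIVE * 2.25,
}

for name, px in pixel_load.items():
    print(f"{name:>32}: ~{px / NATIVE:.2f}x native")

DLAA is just native plus the reconstruction pass, so a small FPS/latency hit makes sense; DLDSR is rendering roughly double the pixels and then downscaling, which is a much bigger jump even before any driver-side overhead.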
 
DLAA isn't as bad as I thought, as I was mixing the two up (but the images I showed still do indicate it can increase latency).
There's an FPS drop with DLAA, so it will increase latency a bit; that's expected. The question is what do you want - image quality (DLAA) or FPS/latency (DLSS)? That's why it's good to have both DLSS and DLAA available in games; it gives you flexibility without having to add DLAA via mods :D
 
There's an FPS drop with DLAA, so it will increase latency a bit; that's expected. The question is what do you want - image quality (DLAA) or FPS/latency (DLSS)? That's why it's good to have both DLSS and DLAA available in games; it gives you flexibility without having to add DLAA via mods :D
DLSS quality often gives you the best of both, since it replaces TAA with a better implementation.
 
DLSS quality often gives you the best of both, since it replaces TAA with a better implementation.
DLSS is blurry AF compared to DLAA or DLDSR, as it will always be a lower input resolution to upscale.
 
DLSS quality often gives you the best of both, since it replaces TAA with a better implementation.

Makes it look smudged and kills detail vs DLAA, which is the best option for IQ.

It is shocking how the shoddy TAA hurts the native image so much that DLSS Q is comparable in IQ.

This "DLSS is better than or equal to native rendering at 4K" stuff is purely down to the much better TAA solution, and DLAA proves it. How much cleaner the gloves look with DLAA vs native or DLSS Q is shocking, and well worth 2 FPS vs native. 30 FPS vs DLSS Q is more of a question, but for this kind of game I would probably say yes. For a faster-paced game, maybe not.
 
Every time one of these image quality comparisons comes out, it just makes me miss MSAA. Sure, it was a hard hit to performance, and I understand why most renderers don't use it now, but it would consistently improve visual quality with no temporal artifacts if your card was good enough. Now most games come with a built-in blur filter that you can't even turn off, unless you use DLAA in the handful of games that include it.
 
MSAA doesn't work well with modern rendering techniques, leaving a lot of aliasing untouched, so it's not used anymore for good reason.
 
DLSS is blurry AF compared to DLAA or DLDSR, as it will always be a lower input resolution to upscale.
Not always true - comes down to the game itself.

DRG has a pretty poor FSR2 implementation, but its DLSS is one of the best I've seen, with it genuinely looking equal to or better than native res since it works better than Unreal Engine's default TAA implementation.
 