
Ratchet & Clank Rift Apart: DLSS vs FSR vs XeSS Comparison

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.90/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750V (375W down to 250W)
Storage 2TB WD SN850 NVMe + 1TB Samsung 970 Pro NVMe + 1TB Intel 6000P NVMe USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I tried to use DLSS frame generation in this game. The mouse feels floatier when it's on, and there are artefacts on the crosshair when you sight-zoom with a weapon and move the mouse quickly. With controllers this shouldn't be noticeable: there's no sporadic mouse movement, and camera motion is smoother and slower, so frame generation doesn't produce as many artefacts.
That floaty feel is the problem. I'm down to <6ms render latency, and I'm seeing people say frame generation "only" adds about 30ms.
There's a discussion here about CP2077 with frame gen, claiming 30-40ms total latency is fast and frame gen at 60-70ms is totally peachy.

I run at ~12ms system latency and can't understand how these people can even aim at anything. These same people are probably on OLED displays, bragging about how fast and responsive they are.

Comments like this just scare me if they think 50ms is the 'best case', but I'm not seeing many figures in the actual reviews/articles on the games.
[attached screenshots]
How can people even play like this? It'd be so incredibly floaty




For comparison:

BL2 (Vulkan; it's system latency but labelled as render latency since the game thinks it's DX9)
[attached screenshot]


DRG in DX12 with Reflex (proving the PC is indeed keeping system latency that low)
[attached screenshot]


I can't imagine having 5x to 10x the render/input latency and thinking that's the best there is.
(Yes, if your render latency is 5x slower you're reacting to something 5x older - so it IS input latency, just not caused by the input devices.)
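The arithmetic behind that complaint can be sketched out. Interpolating frame generation has to buffer one real frame before it can display the generated in-between frame, so it adds at least one base frame time on top of any processing cost. This is a rough sketch, not a measurement; `processing_ms` is an assumed placeholder overhead:

```python
def frame_time_ms(fps: float) -> float:
    """Time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

def framegen_latency_floor_ms(base_fps: float, processing_ms: float = 3.0) -> float:
    """Rough lower bound on latency added by interpolated frame generation.

    The interpolator must hold back one rendered frame to blend between it
    and the next, so the floor is ~one base frame time; processing_ms is an
    assumed overhead, not a measured figure.
    """
    return frame_time_ms(base_fps) + processing_ms

# At a 60FPS base, the held-back frame alone is ~16.7ms of extra delay.
added = framegen_latency_floor_ms(60.0)

# Going from ~12ms system latency to 60-70ms total is a 5-6x increase,
# which is the "reacting to something 5x older" point above.
ratio = 65.0 / 12.0
```

Under those assumptions, the ~30ms figure people quote is plausible at 60FPS base rates once render queueing is included, and the penalty shrinks as the base frame rate rises.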
 
Joined
Jan 5, 2008
Messages
158 (0.03/day)
Processor Intel Core i7-975 @ 4.4 GHz
Motherboard ASUS Rampage III GENE
Cooling Noctua NH-D14
Memory 3x4 GB GeIL Enhance Plus 1750 MHz CL 9
Video Card(s) ASUS Radeon HD 7970 3 GB
Storage Samsung F2 500 GB
Display(s) Samsung SyncMaster 2243LNX
Case Antec Twelve Hundred V3
Audio Device(s) VIA VT2020
Power Supply Enermax Platimax 1000 W Special OC Edition
Software Microsoft Windows 7 Ultimate SP1
I usually don't have any issue at all using FSR, but it looks like hot garbage in this game. The flickering on distant objects and disocclusion artifacts are nuts, at least at 1440p using the Quality option. XeSS looks much better, but the performance bump is much smaller. I ended up going with ITGI with my 1080 Ti. It provides a better image than FSR and a much larger performance boost than XeSS.
Typical for Nixxes ports; FSR looks bad in the Spider-Man games too. Terrible aliasing and flickering.
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
The gap in IQ between DLAA and the rest of the options is massive. Compared to the native 4K image it tidies up the softness and retains all of the details, such as the better quality highlights, the details under the light bridge, the very fine detail on all the foliage, and the windows on the background buildings. With DLAA you can actually see the leather grain on the gloves and helmet, whereas that gets smudged away with DLSS Q or native.

If only the in-game TAA solutions looked as good as the one NV have with DLAA, there would never be a case of DLSS Quality looking better than native; the bulk of the quality uplift is NV's much better TAA solution plus a little sharpening.
 
Joined
Feb 11, 2009
Messages
5,584 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom watercooling loop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
There are some rough spots in Ratchet & Clank that absolutely kill CPU performance; having Frame Generation helps tremendously

Yes, FPS fell as low as 28FPS on a 13700K with tuned 6000MT DDR5 :D

Great, but please don't be part of the problem by actively praising crappy optimization just to promote some technology that picks up the slack.
This is exactly what people were worried about with upscaling technologies... and it's happening already?

(That said, I don't get this; Nixxes is an excellent PC porting company, so I'm expecting patches. It should have been better at release, but it is what it is.)

I usually don't have any issue at all using FSR, but it looks like hot garbage in this game. The flickering on distant objects and disocclusion artifacts are nuts, at least at 1440p using the Quality option. XeSS looks much better, but the performance bump is much smaller. I ended up going with ITGI with my 1080 Ti. It provides a better image than FSR and a much larger performance boost than XeSS.

I only really have experience with FSR (2.2) in Warframe, but it's really bad there as well.
Just smeary stuff where I guess the game doesn't provide proper motion vectors; it happens on items that animate in place, and they ghost all over the map. But if you start moving the camera, they are sharp, or at least sharp enough.

So moving the camera sort of forces or provides motion vectors that aren't there normally, making it all look... well, bad.
Not unusable, and a godsend for really low-tier GPUs, but still, lots of work to be done.

That said, I watched a friend play that crappy Callisto Protocol game on an RTX 2080 with FSR enabled to get some usable frames, and there it was more than fine.
 

Mussels

Freshwater Moderator
The gap in IQ between DLAA
DLAA slows things down almost as much as frame generation does


Deep Rock Galactic, in-game 143FPS cap, 4K, all settings ultra, Reflex set to "Enabled+boost" and NULL enabled in NVCP (which doesn't work in DX12).
In DX11, Nvidia's Ultra Low Latency setting works. In DX12, only Reflex works.


DX11 on the left, DX12 on the right.

DLSS quality, my normal settings
This game uses DLSS as a form of anti-aliasing and genuinely looks better with DLSS quality on.
[attached screenshots]

Pretty equal - tiny variances in the game can explain the small changes.

DLSS disabled
0.9ms latency is not noticeable to a human, but I do like that the better visual fidelity option above used 19 watts less.
[attached screenshots]

DX11 vs 12 are again comparable


DLAA enabled, which disables DLSS from being enabled
In this case not unbearable at all, but it's still a 30% increase in latency for 50W more power.
Without the FPS cap and GPU undervolt, it'd be a much larger hit for what is effectively anti-aliasing.
[attached screenshots]

DX12 + DLAA? Uh oh.

GPU usage is higher (hitting 100%) and the latency has decided it's mighty morphin' doublin' time (go go GPU power consumption rangers, or something. I need sleep.)



There's so much to test, and not enough time for reviewers to test it all - but when DLAA and DLSS frame gen all add latency together, we're moving from the golden age of gaming DLSS hinted at into the brown age of bouncing our mouse and keyboard signals off Starlink to control a game being played on the ISS.


Oh, here's the "normal" settings everyone seems to recommend:
No FPS cap
Vsync off
DLSS Quality
All that FPS, but with the higher latency, what's the point? You can't benefit from it.
It'd be like watching a movie that goes from Blu-ray 24FPS to YouTube 60FPS, but with a 50ms audio delay slapped on top so mouth movements and voices never match.
Left is Reflex off, right is Reflex on - not that you can tell, since it only works when you have GPU performance to spare.
[attached screenshots]




Today's ranting shall now end, and I'll go play vidya games.
 
Joined
Jan 5, 2008
Messages
158 (0.03/day)
DLAA enabled, which disables DLSS from being enabled
Totally the opposite. DLAA and DLSS are the very same technology with one difference: DLAA renders at native resolution, skipping the upscaling part. That's why you don't get a performance uplift with DLAA (it actually reduces FPS, since DLAA/DLSS processing takes time per frame), but you also don't get upscaling artifacts or loss of detail.
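To make the relationship concrete: DLSS and DLAA share one pipeline, and the only knob is the input-resolution scale. A quick sketch using NVIDIA's commonly published per-axis scale factors (games can override these, so treat them as typical defaults, not guarantees):

```python
# Typical per-axis input-resolution scale factors; DLAA is just the same
# pipeline with scale 1.0, i.e. no upscaling step.
DLSS_SCALE = {
    "DLAA": 1.0,
    "Quality": 2.0 / 3.0,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1.0 / 3.0,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before DLSS reconstructs."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# At 4K output: DLAA renders the full 3840x2160, while Quality renders
# 2560x1440 internally and reconstructs up - hence the FPS gap between them.
```

That scale difference is the whole story: DLAA pays the model's per-frame cost at full resolution with no pixel savings, while DLSS earns its FPS back by shading fewer pixels.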
 
Joined
Dec 28, 2012
Messages
3,980 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
All this makes me glad I just play the game at native res.
 
Joined
Nov 15, 2022
Messages
6 (0.01/day)
DLAA slows things down almost as much as frame generation does


Deep rock galactic, in game 143FPS cap, 4K all settings ultra, reflex set to "Enabled+boost" and NULL enabled in NVCP (doesnt work in DX12)
In DX11, Nvidas ultra low latency setting works. In DX12, only reflex works.


DX11 on the left, DX12 on the right.

DLSS quality, my normal settings
This game uses DLSS as a form of anti-aliasing and genuinely looks better with DLSS quality on.

That's odd, since I am under the impression DLSS uses DLAA, which is really the only reason some people claim it looks better than native. The only difference with using only DLAA is that you're running it at your native resolution instead of a fake/upscaled res.
 
Joined
Nov 20, 2021
Messages
50 (0.04/day)
Processor 7800x3D
Motherboard Asus B650E-E
Cooling Arctic liquid freezer III 420
Memory 32GB 6000 CL28 DDR5
Video Card(s) RTX 4090
Storage 990 Pro 4tb
Display(s) Dell AW2725DF QD-OLED 360hz + LG Dual UP + 27GL850B
Audio Device(s) Topping DX7 PRO+
Power Supply Corsair HX850i
Mouse G-Wolves HSK pro 4k
Keyboard Wooting 80HE
Software 10 x64
The awful TAA implementations in modern games are what made me go with a lower-performance, higher-priced 4080 vs a 7900 XTX.

You just can't trust gamedevs anymore, and at least, even if not perfect, DLAA and DLSS end up better than native because of that shit TAA technique.

AMD is dead in 3 years unless they come out with FSR3 soon and it's much improved over FSR2; DLSS has become almost mandatory in a lot of games.
 

Mussels

Freshwater Moderator
Totally opposite. DLAA and DLSS is the very same technology with one difference - DLAA renders at native resolution, skipping the upscaling part. That's why you don't get performance uplift with DLAA (it actually reduces FPS since DLAA/DLSS processing takes time per frame) but you also don't get upscaling artifacts and loss of detail.
That's odd since I am under the impression dlss uses dlaa, which is really the only reason some people claim it looks better than native. The only difference with only using dlaa is that you're using it at your native resolution instead of fake res/upscaling.

DLDSR is the one that added the massive latency; I checked that one this morning. Consistent 3-4x increase.

DLAA isn't as bad as I thought, as I was mixing the two up (but the images I showed still indicate it can increase latency).
In some of the testing I did, DLSS had higher latency than DLSS off - this didn't occur at 4K, but did at 1080p. Some upscaling bugs are game- and resolution-dependent.
 
Joined
Jan 5, 2008
Messages
158 (0.03/day)
DLAA Isn't as bad as i thought, as i was mixing the two up (But the images i showed still do indicate it can increase the latency)
There's an FPS drop with DLAA, so it will increase latency a bit; that's expected. The question is what you want: image quality (DLAA) or FPS/latency (DLSS)? That's why it's good to have both DLSS and DLAA available in games; it gives you flexibility without having to add DLAA via mods :D
 

Mussels

Freshwater Moderator
There's an FPS drop with DLAA so it will increase latency a bit, that's expected. Question is what do you want - image quality (DLAA) or FPS/latency (DLSS)? That's why it's good to have both DLSS and DLAA available in games, gives you flexibity without having to add DLAA via mods :D
DLSS quality often gives you the best of both, since it replaces TAA with a better implementation.
 
Joined
Nov 20, 2021
Messages
50 (0.04/day)
Processor 7800x3D
Motherboard Asus B650E-E
Cooling Arctic liquid freezer III 420
Memory 32GB 6000 CL28 DDR5
Video Card(s) RTX 4090
Storage 990 Pro 4tb
Display(s) Dell AW2725DF QD-OLED 360hz + LG Dual UP + 27GL850B
Audio Device(s) Topping DX7 PRO+
Power Supply Corsair HX850i
Mouse G-Wolves HSK pro 4k
Keyboard Wooting 80HE
Software 10 x64
DLSS quality often gives you the best of both, since it replaces TAA with a better implementation.
DLSS is blurry AF compared to DLAA or DLDSR, as it will always have a lower input resolution to upscale.
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
DLSS quality often gives you the best of both, since it replaces TAA with a better implementation.

Makes it look smudged and kills detail vs DLAA, which is the best option for IQ.

It is shocking how the shoddy TAA hurts the native image so much that DLSS Q is comparable in IQ.

This "DLSS is better than or equal to native rendering at 4K" stuff is purely down to the much better TAA solution, and DLAA proves it. How much cleaner the gloves look with DLAA vs native or DLSS Q is shocking, and well worth 2fps vs native. 30fps vs DLSS Q is more of a question, but for this kind of game I would probably say yes. For a faster-paced game, maybe not.
 
Joined
May 21, 2023
Messages
38 (0.06/day)
Processor Intel Core i7-4790
Memory 2x8GB DDR3-1600
Video Card(s) AMD Radeon RX 5700 XT Reference Card
Storage Samsung 870 EVO 500GB, 2TB + WD Black 1TB
Power Supply Corsair CX550M
Software Fedora Workstation 40 / Microsoft Windows 10 Pro
Every time one of these image quality comparisons comes out, it just makes me miss MSAA. Sure, it was a hard performance hit, and I understand why most renderers don't use it now, but it would consistently improve visual quality if your card was good enough, with no temporal artifacts. Now most games ship with a built-in blur filter that you can't even turn off, unless you use DLAA in the handful of games that include it.
 
Joined
Jan 5, 2008
Messages
158 (0.03/day)
MSAA doesn't work well with modern rendering techniques - it only smooths geometry edges, so shader and specular aliasing are left untouched, and it's expensive with deferred renderers - so it's not used anymore for a good reason.
 

Mussels

Freshwater Moderator
dlss is blurry AF compared to dlaa or dldsr as it will be always a lower input resolution to upscale.
Not always true - it comes down to the game itself.

DRG has a pretty poor FSR2 implementation, but its DLSS is one of the best I've seen, genuinely looking equal to or better than native res, since it works better than Unreal Engine's default TAA implementation.
 