I'm afraid I'm going against the grain here, but I actually vary what I use for both game-specific and system-specific reasons.
I'll give a few examples to show how I usually think about it.
In something like Call of Duty: Black Ops 6, I use DLAA for the campaign and DLSS Quality outside of it. There is some image degradation with DLSS, so you can think of it as "prioritize image quality" versus "prioritize performance while keeping it as close to native as possible". Given how demanding the game is to run (I've already lowered settings for non-campaign content to reach a very high frame rate), every bit helps.
I also have quite a few games that are graphically demanding (e.g. Cyberpunk 2077, Witcher 3) when ray tracing is enabled. My card's 8GB of VRAM is fairly limited all things considered, and I'm also dealing with the increased pixel count of ultrawide 2560x1080 instead of the standard 1920x1080. I'd do anything to get these games at least responsive and smooth. I've noticed that once I've dialed in my desired settings (which, surprisingly, aren't too far off what the NVIDIA App's auto-optimize function would suggest), it's entirely possible to hit an 80FPS target or more with a bit of performance to spare, if I enable both DLSS Quality and frame generation. In my experience, it's actually harder to reach 60FPS without FG than it is to reach 80 with it. Funnily enough, even "weaker" cards can end up running a significant amount of RT effects this way.
There's one more game, Wuthering Waves, where I keep DLSS Quality enabled at all times even though the hardware is otherwise capable of running it at native with no performance concerns. The normal TAA has edge-flicker issues, and DLAA isn't supported (yet), so this ends up echoing what others have already said. The game will receive an RT update in the near future, which also means there's a bit more headroom to spend later on. I did watch the demonstration video NVIDIA posted, and I think the general improvements are likely to be worth it (although they described it as RT reflections, it looked like it also affected general lighting).
If the game isn't demanding enough to actually need DLSS Quality and DLAA were available, I'd use that instead. There's something else interesting, though: if you can't tell the difference, there's a potential power-saving angle when the game is running with a frame rate cap, which I usually set in single-player games. Upscaling also shaves VRAM usage in games with resolution-dependent buffers. I've noticed this helps somewhat in the newest Indiana Jones game, which has a high baseline VRAM usage, and dramatically so in Stalker 2, which had a nasty memory leak when I last played it while checking out new titles on Game Pass.
I think that's about it for the games I've played recently.
For the "system-specific" part, I also sometimes play on a laptop with just a 2050 (mostly because it barely cost more than a very similar laptop with no dGPU, and it's at a price tier where the CPUs don't come with high-end iGPUs). Here, even after discounting games that will never fit within a 4GB buffer, DLSS Quality, or even more aggressive upscaling, becomes critical to getting a decently smooth experience in the games I'd play on it.
So in the end... I don't really have a set answer, to be honest, but if I had to pick a firm selection, it'd be DLSS Quality, plus FG wherever it's available and won't cause a VRAM-related crash.