| System Name | The de-ploughminator Mk-III |
|---|---|
| Processor | 9800X3D |
| Motherboard | Gigabyte X870E Aorus Master |
| Cooling | DeepCool AK620 |
| Memory | 2x32GB G.Skill 6400 MT/s CL32 |
| Video Card(s) | Asus RTX 4090 TUF |
| Storage | 4TB Samsung 990 Pro |
| Display(s) | 48" LG OLED C4 |
| Case | Corsair 5000D Air |
| Audio Device(s) | KEF LSX II LT speakers + KEF KC62 Subwoofer |
| Power Supply | Corsair HX850 |
| Mouse | Razer DeathAdder V3 |
| Keyboard | Razer Huntsman V3 Pro TKL |
| Software | Windows 11 |
Yeah, I agree The Witcher 3 looks better, but it's game dependent. It's still nothing like a native 4K image, though. When I played CP2077 the first time around, I think the DLSS implementation was very early. I never really noticed it that much, but I also had RT on lower settings (or maybe it dropped them anyway because of DLSS?). In any case, I had bought a G-Sync monitor, so a single-player FPS didn't need high frame rates; 40-45 FPS was smooth enough for me.
DLSS has evolved to the point where a newer version like 2.5.1 will give the best IQ across all DLSS 2 games, better than the stock DLL that shipped with each game. CP2077 originally shipped with version 2.1, and it didn't even apply a negative mipmap (LOD) bias, which Nvidia advises should be used with DLSS.
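If anyone wants to try the swap: the usual trick is to replace the game's stock nvngx_dlss.dll with the newer DLL (TechPowerUp hosts an archive of them) and keep a backup of the original. A minimal sketch in Python, with made-up paths you'd adjust to your own install:

```python
import shutil
from pathlib import Path

# Hypothetical paths -- adjust to your own game install and download location.
game_dir = Path(r"C:\Games\Cyberpunk 2077\bin\x64")   # folder containing the game's nvngx_dlss.dll
new_dll = Path(r"C:\Downloads\nvngx_dlss_2.5.1.dll")  # the newer DLSS DLL

stock_dll = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

# Back up the stock DLL once, then drop in the newer version under the original name.
if not backup.exists():
    shutil.copy2(stock_dll, backup)
shutil.copy2(new_dll, stock_dll)
print(f"Swapped in {new_dll.name}; original saved as {backup.name}")
```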
With the latest DLSS 2.5.1 you can also try a notch-lower DLSS setting (e.g. Quality --> Balanced), and the resulting IQ is comparable to an older version at the higher setting. Sure, you can play the game at 40-45 FPS, but getting 10-15% higher FPS without any noticeable IQ loss is nice.
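For reference, each DLSS 2 mode renders internally at a roughly fixed fraction of the output resolution, and Nvidia's programming guide recommends a texture mip bias of log2(render width / display width). A quick sketch of what that works out to at 3440x1440 (the Balanced ratio is approximate, so treat the exact numbers as illustrative):

```python
import math

# Approximate DLSS 2 per-axis render scale for each quality mode.
modes = {
    "Quality": 1 / 1.5,            # ~66.7% of output resolution
    "Balanced": 1 / 1.724,         # ~58%
    "Performance": 0.5,            # 50%
    "Ultra Performance": 1 / 3,    # ~33.3%
}

display_w, display_h = 3440, 1440  # example: ultrawide 1440p

for name, scale in modes.items():
    render_w, render_h = round(display_w * scale), round(display_h * scale)
    # Nvidia's recommended texture mip bias is log2(render_w / display_w),
    # i.e. a negative LOD bias so textures stay sharp at the lower internal res.
    mip_bias = math.log2(render_w / display_w)
    print(f"{name:17s} {render_w}x{render_h}  mip bias {mip_bias:+.2f}")
```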
I just spent about 10 hours playing The Witcher 3 on my old rig with a 2080 Ti at 3440x1440 with DLSS Performance, and it looks good with RT and plays nicely enough at 40-45 FPS.