| System Name | DLSS / YOLO-PC |
| --- | --- |
| Processor | i5-12400F / 10600KF |
| Motherboard | Gigabyte B760M DS3H / Z490 Vision D |
| Cooling | Laminar RM1 / Gammaxx 400 |
| Memory | 32 GB DDR4-3200 / 16 GB DDR4-3333 |
| Video Card(s) | RX 6700 XT / R9 380 2 GB |
| Storage | A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD |
| Display(s) | Compit HA2704 / MSi G2712 |
| Case | Matrexx 55 / Junkyard special |
| Audio Device(s) | Want loud, use headphones. Want quiet, use satellites. |
| Power Supply | Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup] |
| Mouse | Don't disturb, cheese eating in progress... |
| Keyboard | Makes some noise. Probably onto something. |
| VR HMD | I live in real reality and don't need a virtual one. |
| Software | Windows 10 and 11 |
I decided to test my hardware in Cyberpunk 2077 at various resolutions and FSR* quality levels.
*FSR here means FSR 2.2 via the DLSS-spoofing mod, which fixes the excessive ghosting on fast-moving objects at the cost of some minor glitching/flickering. Ghosting annoys me far more, so I went with the modded version instead of vanilla. I can't tell any performance difference between "normal" FSR and the modded FSR-as-DLSS, which is a big plus.
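For context on the FSR presets used in the results below, here's a minimal sketch of the internal render resolutions, assuming the standard FSR 2 per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the DLSS-spoofing mod shouldn't change these, but take the exact numbers with a grain of salt:

```python
# Rough internal render resolution per FSR preset (assuming the standard
# FSR 2 per-axis scale factors; the DLSS-spoofing mod shouldn't change them).
FSR_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

OUTPUTS = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}

for res_name, (out_w, out_h) in OUTPUTS.items():
    for preset, scale in FSR_SCALE.items():
        render_w, render_h = round(out_w / scale), round(out_h / scale)
        print(f"{res_name:>5} {preset:>17}: ~{render_w}x{render_h} -> {out_w}x{out_h}")
```

For example, 4K Ultra Performance renders at roughly 1280x720, which goes a long way toward explaining both the FPS jump and the IQ drop in that run.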
System specs:
8-core 16-thread Rocket Lake @ 3500 MHz fixed clock (definitely a bottleneck even in higher resolutions).
8+8 GB DDR4-3200 16-20-20-38 CR2 (possibly a bottleneck in lower resolutions).
RX 6700 XT @ OC/UV [2715—2815 MHz core; 2140 MHz VRAM; default timings, default power limit, 1150 mV] (definitely a bottleneck in higher resolutions).
ReBAR: enabled.
31.5" 1080p display (definitely a bottleneck in higher resolutions).
The game is installed on a SATA SSD.
Game version: 1.63 (GOG)
Crowd density: medium.
HDD mode: enabled.
Additional mod info:
• UHD texture pack (about 650 to 750 MB VRAM additional usage);
• Traffic speed and density increase;
• Some QoL mods which shouldn't affect performance.
⛔️ 4K High native render: 32 AVG (100%) 25 MIN (100%); image quality is excellent, yet image stability is lackluster, probably because of the low frame rate.
4K High FSR Quality: 53 AVG (166%) 44 MIN (176%); better than native.
4K High FSR Balanced: 63 AVG (197%) 51 MIN (204%); better than native overall, but had two minor glitches along edges where surfaces meet.
4K High FSR Performance: 76 AVG (237%) 47 MIN (188%); insignificant image quality loss + said glitches.
⚡️ 4K High FSR Ultra Performance: 101 AVG (315%) 40 MIN (160%); noticeably worse image quality + said glitches.
1440p High native render: 69 AVG (215%) 48 MIN (192%); not sure it looks any different from 4K.
⚡️ 1440p High FSR Quality: 103 AVG (320%) 43 MIN (172%); image quality didn't worsen compared to native 1440p.
⚡️ 1440p High FSR Balanced: 114 AVG (356%) 56 MIN (280%); noticeable changes to IQ: minor surface-contact glitches + some textures look excessively crisp.
⚡️ 1080p High native render: 105 AVG (328%) 50 MIN (200%); good IQ overall, yet textures lose detail because their resolution is higher than a 1080p output can resolve.
⚡️ 1080p High FSR On: makes no sense. The CPU is too slow to benefit from it since the GPU is already coasting at 1080p, which the stupidly low minimum FPS figures prove. Besides, 1080p is a resolution where FSR or DLSS is ill-advised anyway due to the low pixel count.
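All percentages are relative to the 4K High native baseline (32 AVG / 25 MIN = 100%). Here's a minimal sketch of the arithmetic, with two of the runs above as examples (I rounded the listed figures by hand, so a percent here and there may not match exactly):

```python
# Relative performance vs. the 4K High native baseline (32 AVG / 25 MIN FPS).
BASELINE_AVG, BASELINE_MIN = 32, 25

def relative(avg_fps: float, min_fps: float) -> tuple[int, int]:
    """Return (AVG %, MIN %) relative to the 4K High native run."""
    return round(avg_fps / BASELINE_AVG * 100), round(min_fps / BASELINE_MIN * 100)

print(relative(53, 44))   # 4K High FSR Quality  -> (166, 176)
print(relative(63, 51))   # 4K High FSR Balanced -> (197, 204)
```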
⚡️= very high performance level (above 85 FPS)
= good performance level (60 to 85 FPS)
= sensible performance level, yet suboptimal (40 to 60 FPS)
⛔️ = unplayable performance level (below 40 FPS)
= excellent image quality
= high image quality
= reasonable image quality
= bad image quality
RX 6700 XT is a little bit faster than:
• 2080 Super
• 3060 Ti*
and is a little bit slower than:
• 2080 Ti
• 3070
• 3070 Ti
• 4060 Ti
*this wouldn't hold at stock 6700 XT settings.
Ray tracing has been left out of the equation because AMD + RT = no performance. Who wants 20 FPS at 1080p? You? I doubt it.
Please check in with your therapist if you read this whole thing. xD