So by default NVIDIA injects a huge number of additional video modes on top of the EDID-provided data from your monitor, which is where oddball modes like 1280x1024 on modern flat-panel monitors generally come from, even if you use a tool to override your EDID. These modes are added via the NV_Modes registry key; updating any one instance of the key updates all of them, so it's effectively a globally shared value.
These additional modes are still filtered against each monitor's capabilities, but since they're injected on top of the EDID data there's no way to keep them from showing up if your monitor can display them. This can leave a lot of older games with resolution lists a mile long, listing every mode at every refresh rate.
At a minimum, to avoid having a billion options in the list, including just the primary monitor's desktop resolution would be sufficient, since all modes listed in the EDID are still allowed, and the EDID values can be changed on the fly with CRU among other utilities. For example, if the primary monitor's desktop resolution is 2560x1440:
{*}SHV 2560x1440x8,16,32,64=1;
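For anyone wanting to inspect what a value like that actually encodes, here's a minimal sketch of a parser for this style of entry. The interpretation of the syntax (a `{*}` display mask, an `SHV` prefix, space-separated `WxHxdepth-list` groups, and a trailing `=1;` flag) is my assumption based on the observed values, not anything NVIDIA documents:

```python
import re

def parse_nv_modes(entry: str):
    """Parse an NV_Modes-style entry, e.g.
    '{*}SHV 2560x1440x8,16,32,64=1;', into (width, height, bpp) tuples.
    Syntax interpretation is assumed from observed values, not documented."""
    # Strip the leading '{*}' display-mask token and the trailing '=1;' flag
    body = re.sub(r'^\{\*\}\s*', '', entry)
    body = re.sub(r'=\d+;?\s*$', '', body)
    modes = []
    for part in body.split():
        if part == 'SHV':   # mode-class prefix, assumed and skipped
            continue
        m = re.match(r'(\d+)x(\d+)x([\d,]+)$', part)
        if m:
            w, h = int(m.group(1)), int(m.group(2))
            for bpp in m.group(3).split(','):
                modes.append((w, h, int(bpp)))
    return modes
```

So the single 2560x1440 entry above expands to four modes, one per listed color depth.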
The only additional 'advanced' feature beyond this that might be helpful is a user-specified range of integer decimations of the resolution. For my 4K monitor, for example, I can use:
{*}SHV 3840x2160x8,16,32,64 1920x1080x8,16,32,64 1280x720x8,16,32,64 960x540x8,16,32,64 768x432x8,16,32,64 640x360x8,16,32,64 480x270x8,16,32,64 384x216x8,16,32,64=1;
This is a fairly extreme 1,2,3,4,5,6,8,10 decimation selection, since I play a lot of older low-res games and this gives pretty much all of them something playable, with the 384x216 mode handling old 320x200 adventure games quite well.
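Building a mode string like the one above by hand is tedious, so here's a small sketch that generates it from a base resolution and a divisor list. Divisors that don't divide both dimensions evenly are skipped so every mode scales cleanly; the `{*}SHV ... =1;` wrapper just follows the format of the examples above, and its exact meaning is assumed:

```python
def decimated_modes(width, height, divisors, depths=(8, 16, 32, 64)):
    """Build an NV_Modes-style mode string from integer decimations of a
    base resolution. The '{*}SHV ... =1;' wrapper copies the observed
    entry format; its precise semantics are an assumption."""
    groups = []
    for d in divisors:
        if width % d or height % d:
            continue  # keep only exact integer decimations
        w, h = width // d, height // d
        groups.append(f"{w}x{h}x{','.join(map(str, depths))}")
    return '{*}SHV ' + ' '.join(groups) + '=1;'

# The 4K example above, regenerated:
print(decimated_modes(3840, 2160, [1, 2, 3, 4, 5, 6, 8, 10]))
```

Running it with divisors 1 through 10 as shown reproduces the 3840x2160 entry above exactly, down to the 384x216 mode.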