Heh, I was just tapping out of the argument, no reason to leave the discussion.
Anyway, the benefit of the blocky pixel look is that text is sharper and easier to read than it would be if it were interpolated, so this is about more than just games and emulation.
This is especially relevant on high-DPI displays such as 4K TVs and monitors, because the alternative is Windows DPI scaling, which is still very much a mixed bag in terms of quality and consistency: only about half of the Windows 10 interface, and even fewer applications, actually DPI-scale gracefully. Even where DPI scaling looks better, the jarring contrast between one application rendering text at native 4K with subpixel AA (ClearType) and another running at 1080p with bilinear filtering looks far worse than if the whole image were scaled with the same method.
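To make the sharpness difference concrete, here's a tiny sketch (plain Python, no real graphics API, just my own illustration) of a black/white text edge being upscaled 2x along one axis: nearest-neighbour/integer scaling repeats samples so the edge stays a hard step, while bilinear filtering blends neighbouring samples into a grey ramp, which is exactly the smearing you see on interpolated text.

```python
def integer_scale_1d(src, factor):
    # Nearest-neighbour along one axis: every source sample is repeated
    # 'factor' times, so a hard edge stays a hard edge.
    return [src[x // factor] for x in range(len(src) * factor)]

def bilinear_1d(src, factor):
    # Linear interpolation along one axis: destination samples fall
    # between source samples, so a hard edge becomes a grey ramp.
    out = []
    for x in range(len(src) * factor):
        s = (x + 0.5) / factor - 0.5            # map back into source space
        s = min(max(s, 0.0), len(src) - 1.0)    # clamp at the borders
        i = int(s)
        f = s - i
        j = min(i + 1, len(src) - 1)
        out.append(round(src[i] * (1 - f) + src[j] * f))
    return out

edge = [0, 0, 255, 255]                  # one row across a glyph edge
print(integer_scale_1d(edge, 2))         # [0, 0, 0, 0, 255, 255, 255, 255]
print(bilinear_1d(edge, 2))              # [0, 0, 0, 64, 191, 255, 255, 255]
```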
Integer scaling faces one additional issue on the desktop, and that's subpixel AA.
With a typical RGB stripe, the effective horizontal resolution is tripled by using the individual colour subpixels as edges - look up ClearType if that's not something you're already familiar with - and integer scaling falls apart there, because the physical RGB stripe doesn't integer-scale along with the image.
For 1080p on a 4K monitor, integer scaling would only play nicely with subpixel AA if the physical stripe ran RRGGBB across each pair of horizontal pixels instead of RGBRGB - naive 2x duplication puts each rendered subpixel coverage value on the wrong physical subpixel, so the hinting turns into colour fringing. The easy option is to just disable subpixel AA, but a smarter option would be for the graphics driver to be aware of subpixel AA and the physical stripe layout, so it could remap the rendered RGB subpixel output onto the RGBRGB physical subpixels (using 2x integer scaling as the example there).
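As far as I know no driver actually exposes anything like this, so treat the following as a hypothetical sketch only: one row of 1080p ClearType output is really three horizontal coverage samples per pixel (at the R, G and B subpixel positions), and a "stripe-aware" 2x integer scaler could resample that coverage signal onto the six physical subpixels each source pixel now covers, instead of blindly duplicating the (R,G,B) triple.

```python
def stripe_aware_scale_2x(row):
    # Hypothetical driver-side remap (a sketch, not any real API).
    # 'row' is one row of subpixel-AA pixels as (r, g, b) coverage triples.
    # Treat the row as a coverage signal sampled at 3x horizontal resolution,
    # then resample it onto the 6 physical subpixels per source pixel that a
    # 2x integer scale produces, instead of duplicating each triple verbatim.
    coverage = [c for (r, g, b) in row for c in (r, g, b)]
    out = []
    for x in range(len(row) * 2):                 # two output pixels per source pixel
        pixel = []
        for sub in range(3):                      # physical R, G, B subpixels
            s = (x * 3 + sub + 0.5) / 2 - 0.5     # centre of this physical subpixel
            s = min(max(s, 0.0), len(coverage) - 1.0)
            i = int(s)
            f = s - i
            j = min(i + 1, len(coverage) - 1)
            pixel.append(round(coverage[i] * (1 - f) + coverage[j] * f))
        out.append(tuple(pixel))
    return out

# Naive integer scaling would just output each (r, g, b) triple twice;
# this spreads the rendered coverage over the RGBRGB stripe instead.
print(stripe_aware_scale_2x([(255, 128, 0), (0, 0, 0)]))
```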
The ZSNES HQX option (above) would seem to be really well-suited to desktop UIs - certainly better than integer scaling (which some prefer to the default bilinear scaling).
If a GPU driver gave me an HQX option, I'd switch to it immediately - I'm just not sure what the GPU overhead would be, but you save so much by dropping from 4K rendering to 1080p rendering that I don't think it would matter on any GPU.
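For reference, HQX itself is table-driven and fairly involved, but its simpler relative Scale2x (EPX) shows the basic idea of the family: look at a pixel's four neighbours and extend matching edges into the 2x2 output block instead of blending. This is a minimal Python sketch of Scale2x, purely my own illustration rather than ZSNES's or any driver's code:

```python
def scale2x(src):
    # Scale2x / EPX: each pixel P becomes a 2x2 block; a corner of that block
    # copies a neighbour only where two adjacent neighbours agree, which
    # extends diagonal edges without introducing any new (blended) colours.
    h, w = len(src), len(src[0])
    dst = [[None] * (w * 2) for _ in range(h * 2)]
    for y in range(h):
        for x in range(w):
            p = src[y][x]
            a = src[y - 1][x] if y > 0 else p      # above
            b = src[y][x + 1] if x < w - 1 else p  # right
            c = src[y][x - 1] if x > 0 else p      # left
            d = src[y + 1][x] if y < h - 1 else p  # below
            e0 = a if (c == a and c != d and a != b) else p  # top-left
            e1 = b if (a == b and a != c and b != d) else p  # top-right
            e2 = c if (d == c and d != b and c != a) else p  # bottom-left
            e3 = d if (b == d and b != a and d != c) else p  # bottom-right
            dst[2 * y][2 * x], dst[2 * y][2 * x + 1] = e0, e1
            dst[2 * y + 1][2 * x], dst[2 * y + 1][2 * x + 1] = e2, e3
    return dst
```

HQ2x works on the same neighbourhood idea but compares all eight neighbours through a lookup table with colour-distance thresholds and interpolation, which is where the extra quality (and the extra GPU cost) comes from.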