German site Prad.de reports that sources close to monitor panel manufacturers told them that the production cost of a 27" 4K 3840x2160 panel is now lower than, or at least equal to, that of a 27" 2560x1440 QHD panel. This drives monitor manufacturers to use 4K panels in monitors that are specified as QHD, when QHD panel supply is low or monitor demand is high.
The sources did not mention any specific monitor manufacturer or model, but it's highly probable that some 1440p monitors in the hands of customers today use a 4K panel. Obviously, you're not going to get 4K resolution when paying for a QHD monitor. Rather, the panel firmware is configured to report its maximum capability as 1440p and to internally scale the input signal accordingly, which may result in reduced image quality.
In order to scale a 2560x1440 image to 3840x2160, the scaling factor is 1.5x. This means that a single pixel in the lower-resolution source image gets mapped onto one and a half panel pixels, which forces interpolation and increases blurriness. This is vastly different from a 4K display running with 1920x1080 input, where each source pixel simply gets doubled in width and height, so a clean integer mapping exists and everything stays sharp.
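A minimal sketch of why the two cases differ, assuming simple edge-aligned mapping of source pixels onto the panel grid (the function and sample values are illustrative, not from Prad's report): at 2.0x every source pixel covers a whole number of panel columns, while at 1.5x every other source pixel straddles a panel-pixel boundary and has to be interpolated.

```python
# Illustrative sketch (not from the Prad report): compare how source pixels map
# onto panel pixels for integer (2.0x) vs. fractional (1.5x) upscaling.

def source_coverage(scale, num_source_pixels=4):
    """Print which panel columns each source pixel covers along one axis."""
    for s in range(num_source_pixels):
        start = s * scale        # left edge of the source pixel on the panel grid
        end = (s + 1) * scale    # right edge of the source pixel on the panel grid
        aligned = float(start).is_integer() and float(end).is_integer()
        note = "aligned" if aligned else "straddles a panel pixel -> interpolated/blurry"
        print(f"  source pixel {s}: panel columns {start:4.1f} .. {end:4.1f}  ({note})")

print("1080p input on a 4K panel (2.0x):")
source_coverage(2.0)

print("1440p input on a 4K panel (1.5x):")
source_coverage(1.5)
```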
When looking closely, visual quality differences could appear in text rendering, although text gets anti-aliased by all modern operating systems anyway, so it comes with some inherent blurriness to begin with. Media playback and gaming shouldn't be affected in any noticeable way. One potential method to detect such a monitor is to look at the pixel pitch specification, which should be around 0.23 mm for a 27" QHD panel. For a 27" 4K panel that number is roughly 0.16 mm, so if the specification of your 1440p monitor lists the latter, it probably comes with a 4K panel.
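Those pixel pitch figures follow directly from the panel geometry; a quick back-of-the-envelope check (assuming a 27" diagonal and square pixels) reproduces the quoted numbers:

```python
# Sanity-check the quoted pixel pitch numbers for a 27" panel.
import math

def pixel_pitch_mm(diagonal_inches, width_px, height_px):
    """Pixel pitch = physical diagonal divided by the pixel-count diagonal (square pixels)."""
    diagonal_mm = diagonal_inches * 25.4
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_mm / diagonal_px

print(f'27" 2560x1440 (QHD): {pixel_pitch_mm(27, 2560, 1440):.3f} mm')  # ~0.233 mm
print(f'27" 3840x2160 (4K):  {pixel_pitch_mm(27, 3840, 2160):.3f} mm')  # ~0.156 mm
```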
The image below (by Prad) shows a simulated monitor test image, with native 1440p on top and 1440p scaled to 4K on bottom.
View at TechPowerUp Main Site