Thursday, June 21st 2018
Your 1440p Monitor Could Be Using a 4K Panel
German site Prad.de reports that sources close to monitor panel manufacturers told them that the production cost of a 27" 3840x2160 4K panel is equal to or even lower than that of a 27" 2560x1440 QHD panel. This drives monitor manufacturers to use 4K panels in monitors specified as QHD when QHD panel supply is low or monitor demand is high.
The sources did not mention any specific monitor manufacturer or model, but it's highly probable that some 1440p monitors in the hands of customers today use a 4K panel. Obviously you won't get 4K resolution when paying for a QHD monitor; rather, the panel firmware is configured to report its maximum capability as 1440p and to internally scale the input signal accordingly, which may result in reduced image quality.
In order to scale a 2560x1440 image to 3840x2160, the scaling factor is 1.5x. This means a single pixel of the lower-resolution source has to cover one and a half panel pixels in each direction, so pixel boundaries no longer line up and the image gets blurrier. This is vastly different from a 4K display running with a 1920x1080 input, where each pixel simply gets doubled in width and height, so a clean integer mapping exists and everything stays sharp.
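As a quick illustration of that arithmetic, here is a minimal sketch in Python (our own, with assumed 16:9 input resolutions, not anything from Prad's report):

    # Scale factors for common 16:9 inputs on a 3840x2160 panel.
    panel_w, panel_h = 3840, 2160

    for src_w, src_h in [(2560, 1440), (1920, 1080)]:
        scale = panel_w / src_w  # same as panel_h / src_h for 16:9 inputs
        print(f"{src_w}x{src_h} -> {panel_w}x{panel_h}: "
              f"scale {scale}x, integer pixel mapping: {scale.is_integer()}")

    # 2560x1440 -> 3840x2160: scale 1.5x, integer pixel mapping: False (needs interpolation)
    # 1920x1080 -> 3840x2160: scale 2.0x, integer pixel mapping: True (clean 2x2 blocks)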
When looking closely, visual quality differences would most likely show up in text, although all modern operating systems apply font smoothing, so text comes with some inherent blurriness anyway. Media playback and gaming shouldn't be affected in any noticeable way. One potential method to detect such a monitor is to check the pixel pitch specification: a genuine 27" QHD panel should come in at around 0.23 mm, while a 27" 4K panel measures about 0.16 mm. If the specification of your 1440p monitor lists the latter number, it probably comes with a 4K panel.
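For reference, the pixel pitch follows directly from the diagonal size and the resolution; a quick back-of-the-envelope check (our own Python sketch, not a figure from the report) reproduces both numbers for a 27" screen:

    import math

    def pixel_pitch_mm(diagonal_inches, width_px, height_px):
        # Pixel pitch = physical diagonal divided by the diagonal measured in pixels.
        return diagonal_inches * 25.4 / math.hypot(width_px, height_px)

    print(round(pixel_pitch_mm(27, 2560, 1440), 3))  # ~0.233 mm: genuine 27" QHD panel
    print(round(pixel_pitch_mm(27, 3840, 2160), 3))  # ~0.156 mm: 27" 4K panel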
The image below (by Prad) shows a simulated monitor test image, with native 1440p on top and 1440p scaled to 4K on bottom.
Source:
Prad
46 Comments on Your 1440p Monitor Could Be Using a 4K Panel
Sadly, that's also incredibly common.
I mean... as an ex-factory production guy I get it... this actually makes a TON of sense if you're trying to keep the cash steady by leapfrogging an already dead standard (1440p, which is a clear stopgap) - but whoever made the decision clearly just didn't understand how scaling works.
If I were them... I would offer existing customers some sort of support ticket where they could complain and have the monitor flashed to 4K, **then label the new boxes**. Most people won't notice it... and the ones that do... get a free upgrade to 4K since they care about that.
Offer new customers a 'DLC' upgrade to the new resolution... It's the McDonald's way.
Obviously, if making a commercial product, you'd hope they would configure it to scale right.
I think that whoever made our hypothetical monitor would be unlikely to do the decent thing and offer a flash to 4K, since it would cannibalise sales of their more expensive 4K monitors. You never know, though. That's what gets me: all they have to do is map one signal pixel to four real pixels and that's it, but no, they insist on anti-aliasing the picture, which looks crap. And of course, there's no option to turn it off since "it looks better" according to them. :rolleyes: I've seen a friend's 4K monitor do that with a 1080p signal. The only saving grace is that the smoothing artefacts are smaller due to the hi-res panel.
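Just to show what I mean by mapping one signal pixel to four real pixels, here's a rough Python/NumPy sketch (made-up frame data; the monitor's scaler obviously does this in hardware):

    import numpy as np

    # Fake 1080p frame, random RGB data purely for illustration.
    frame_1080p = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

    # Integer scaling: repeat every pixel twice along both axes, so each source pixel
    # becomes an exact 2x2 block on the 4K panel - no interpolation, no added blur.
    frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
    assert frame_4k.shape == (2160, 3840, 3)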
*That's a real cramped desktop! :p
Other than that, I doubt these panels are in any high refresh gaming monitors, but rather in cheaper 1440p@60Hz ones.
Thank you for making me feel young; my first monitor was a 15" CRT with an 800x600 maximum resolution, back in the '90s... :D
(1) Monochrome
(2) CGA
(3) EGA
(4) VGA
(5) Hercules
(ESC) Back to DOS
Think about it: you increase the resolution to 4K, and whatever ratio the previous native pixel size has to the new pixels gets you a higher band-limit.
Note: written under the effect of this epic blogpost.