An LCD extracts light quite differently from an emissive display because it is transmissive. The backlight, the filters, and the coatings change how much of each wavelength passes through, which primarily affects saturation. There is no direct scaling: if you start extracting more light, the gamut enlarges and you have to adjust the internal LUT or ICC profile in order to keep the same standard colors. You cannot exchange its internals and call it a 125% sRGB range monitor. You have to calibrate it manually, even from the factory. Ambient light color affects the image, making it either duller in daytime or brighter and flatter at night.
I cannot even find the YouTube lectures I watched about gamma compensation of display brightness. It is a huge topic with huge repercussions for power consumption.
If you don't believe me: that is why LCDs don't come with ambient light sensors, because it cannot be done yet.
... again, what is the relevance of this? Nothing you are saying here is a) new or b) relevant. This monitor
is calibrated for sRGB from the factory, to a deltaE of <2. The 119% sRGB spec is obviously not measured in this calibrated mode (achieving an average delta error of less than 2 across such a wide gamut would be very, very difficult), but is the total color gamut the display is capable of outputting. You're right that both gamut output and calibration are brightness dependent, but ... so what? This monitor is calibrated. Likely at 200 nits, which is typical, though it can be at any brightness, really. That brightness is typically listed on the included calibration report.
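For anyone unfamiliar with the two numbers being thrown around here: deltaE is just a distance in a perceptual color space, and "119% sRGB" style specs are (typically) ratios of triangle areas in a chromaticity diagram. A rough Python sketch, using the simple CIE76 formula (calibration reports often use the more elaborate CIEDE2000, but the idea is the same), with Display P3 standing in for a generic wide-gamut panel:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space.
    (CIEDE2000 adds perceptual weighting, but is the same idea.)"""
    return math.dist(lab1, lab2)

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area spanned by three chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard published CIE 1931 xy primaries.
SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

# A "119% sRGB"-type marketing spec is an area ratio like this one
# (note: an area ratio is not the same thing as coverage of sRGB).
ratio = triangle_area(*DCI_P3) / triangle_area(*SRGB)
```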
There's no real difference between self-emissive or backlit displays in this regard - no matter what, you need
both a wide gamut (i.e. the ability to output a wide range of light wavelengths)
and a well calibrated display driver if you want a good display. And again: where is anything in this saying that a wide gamut is a handicap for an entry-level monitor not targeting gamers? You still haven't backed up that statement whatsoever.
As for "you cannot exchange its internals and call it a 125% sRGB range monitor" - why not? If changing the backlight or adding a nanoparticle intermediate layer changes the total range of wavelengths output by the backlight, that will change the gamut of the monitor. Again, there is
nothing saying that a gamut specification in any way so much as implies accurate colors. Calibration and gamut are
entirely separate factors, with the only direct link being that you can't get a good calibration within a given color space if the panel in question can't output at least 100% of that color space. By your logic any 100% P3 monitor would be deeply problematic for ordinary sRGB desktop use, which, well, they aren't. A monitor capable of producing any color gamut >100% of any other color gamut can be calibrated to be accurate within the smaller gamut
as long as the display driver and controller support a sufficient level of calibration. These are entirely separate parts of the monitor that can be combined pretty much at will. There are well calibrated narrow gamut monitors (though not
that narrow, but I've seen pretty good deltaEs with ~90% of sRGB, for example), well calibrated wide gamut monitors, poorly calibrated narrow gamut monitors, and poorly calibrated wide gamut monitors. All combinations exist.
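The core of calibrating a smaller color space onto a wider-gamut panel is, at heart, just a 3x3 matrix applied in linear light. A minimal Python sketch, assuming Display P3 primaries as a stand-in for the panel's native gamut (real monitor firmware also handles per-channel gamma via 1D LUTs, and often a full 3D LUT, so this is only the linear core):

```python
import numpy as np

# Published RGB-to-XYZ matrices (D65 white point). The sRGB matrix is
# from IEC 61966-2-1; Display P3 stands in for the panel's native primaries.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ   = np.array([[0.4866, 0.2657, 0.1982],
                        [0.2290, 0.6917, 0.0793],
                        [0.0000, 0.0451, 1.0439]])

# The remapping step: express sRGB stimuli in the panel's native primaries.
SRGB_TO_PANEL = np.linalg.inv(P3_TO_XYZ) @ SRGB_TO_XYZ

def srgb_linear_to_panel(rgb):
    """Map a linear-light sRGB triplet to panel-native drive values."""
    out = SRGB_TO_PANEL @ np.asarray(rgb, dtype=float)
    # In-gamut sRGB colors never actually clip on a >100% sRGB panel.
    return np.clip(out, 0.0, 1.0)
```

This is exactly why a 100% P3 panel poses no problem for sRGB desktop use: every sRGB stimulus lands comfortably inside the panel's drivable range (pure sRGB red comes out to roughly (0.82, 0.03, 0.02) in panel-native linear values).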
And what is that about ambient light sensors? Do you mean for brightness adjustment? There are a few monitors that have those, though not many. But ... have you heard of laptops? Yeah, those have light sensors. Nearly all of them. If what you meant is light sensors for adjusting color temperature on the fly to account for ambient light, well, Apple has a thing or two to tell you about the IPS LCD displays in their laptops, tablets and phones (they call it True Tone). The reason professional monitors don't have features like this is because
they aren't accurate enough to be trusted. If you want to control for perceived color shift due to ambient light, you control the ambient light. There's a reason why people doing color grading typically work in blacked-out rooms, use monitor hoods, etc.

Making an ambient light sensor advanced and accurate enough to do color temperature and brightness adjustments on the fly in a way suitable for professional use would be essentially impossible. It would require not only a 3D LUT for the monitor, but cameras (not light sensors, full RGB cameras) on both the front and the back of the monitor (to account for lighting both behind the monitor and behind the user), each calibrated with its own separate 3D LUT, plus a highly complex algorithm to balance how to prioritize the different readings for the best result. And if you're in a scenario with mixed color temperature lighting, you're not going to get it accurate no matter what. So doing this would be a fool's errand - it would require
massive amounts of work and would still not be good enough for professional use.
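For what it's worth, the 3D LUT itself is conceptually simple; populating one accurately is the hard part. A minimal trilinear-interpolation sketch in Python (the lattice size and the identity-table setup are purely illustrative):

```python
import numpy as np

def apply_3d_lut(lut, rgb):
    """Trilinear lookup in an (n, n, n, 3) LUT spanning the [0, 1] RGB cube.
    This is the same kind of table a monitor's hardware 3D LUT implements."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.minimum(pos.astype(int), n - 2)  # lower corner of the lattice cell
    f = pos - lo                             # fractional position inside it
    out = np.zeros(3)
    # Blend the 8 surrounding lattice points.
    for corner in range(8):
        d = [(corner >> i) & 1 for i in range(3)]
        w = np.prod([f[i] if d[i] else 1 - f[i] for i in range(3)])
        out += w * lut[lo[0] + d[0], lo[1] + d[1], lo[2] + d[2]]
    return out

# Identity LUT (maps every color to itself) as a sanity check.
n = 9
g = np.linspace(0, 1, n)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
```

Note that this covers only the playback side; the calibration problem is measuring enough patches to fill the table accurately, and doing that per sensor, on the fly, is where the "fool's errand" comes in.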
Don't be so boastful; you are the one posting walls of text.
Not boasting about anything, just stating a fact - I do indeed take enjoyment from that. Comes from being very stubborn, I think. For the record, I never said you were a troll, though this discussion has come up frequently due to your frequent raising of ... well, dubious arguments with varying levels of ability to argue for them, and typically with sources that don't say what you claim they do. As we have seen here. Also, is there any correlation between post lengths and whether or not one is a troll? I've seen plenty of trolls post one-word posts, so ...
PS: take a calibrated display and even slightly touch its brightness, and those deep shades will be off. This is harder than you think.
Again, I'm fully aware of this, and really don't see the relevance to this discussion. To recap: you made one general claim, and expanded on that by making a specific claim.
The general claim:
Color gamut is a handicap, rather than an advantage in the domain of nanoIPS technology
While the specific claim was:
Ever heard of the 'red shift'?
"Slow red decay" to be more precise...
So, you claim that a nanoIPS monitor having a wide color gamut is (always, as there were no reservations in your statement) a handicap, and never an advantage. You then claim that this is due to slow red decay.
We have seen from your linked reviews that slow red decay is only an issue in the nanoIPS monitors where it occurs if
both high refresh rates (144/165Hz)
and ULMB/backlight strobing are activated, and is not seen at lower refresh rates or without ULMB. The monitor in question for this thread does not refresh above 60Hz, and does not have any advertised backlight strobing feature, so there is no reason whatsoever to expect that slow red decay will be an issue.
Of course, adding to that, slow red decay as demonstrated in the linked reviews is an issue only when displaying fast motion, which is generally not very relevant for professional workloads. Film is typically 24fps, online video is 30 or at most 60, and a lot of editing work is with static views. Photo editing, graphical design, etc., are all mostly static. So an issue with red color fringing on fast moving high contrast images ... isn't likely to be very bothersome, and certainly won't make the color reproduction of the monitor inaccurate in any way.
So again: the issue you brought up has no relevance to the monitor we are discussing here, nor its intended use case. Nor is there any factual basis for your broad-reaching claim that the wide color gamut of nanoIPS is a handicap rather than an advantage.