You're over-dramatizing the issue. It's not a big deal. I haven't used a PC with the newest LG TVs such as the C1, but there's probably a standard protocol to put the TV to sleep or wake it when you turn on your PC. How is it possible to turn on the TV and also the sound system, such as a soundbar, just by turning on a PS5 or an Apple TV? It could be the same deal with a new PC.
Nope, this is literally not possible. TVs do not understand the signalling protocols PCs use for automatic monitor sleep/wake, and do not have this functionality built in. PC GPUs do not have HDMI-CEC functionality either (though it might be possible to add it through third-party software - I've never checked). Either way, this is not a way in which PCs and TVs communicate.
What do you really mean by "over-processing their signals"?!? That sounds really odd to be honest. Everything is digital - zeroes and ones - there is no analogue "signal" to process; the way you say it sounds like you're talking about analogue TVs or something. Also, you're a bit wrong about monitors too, because almost all monitors don't have good calibration, and they have picture modes with different values that don't represent accuracy, such as photo modes or movie modes or gaming mode.
Uh ... digital signals are also signals.
Changing data is signal processing. Over-processing means processing the image data to make it look different; whether it's analogue or digital is irrelevant. For TVs, this typically means motion smoothing/interpolation, drastic color adjustments, contrast adjustments, algorithmic/AI upscaling or "image enhancement" processes, black level adjustments, denoising, and a whole host of other "features" (I'd rather call them bugs, but that's just me). Why do you think TVs have a "game mode"? Because that mode cuts out all this processing to reduce input lag, as the processing causes significant lag. Monitors have essentially none of these features, in part because such processing is fundamentally unsuitable for crucial PC use cases such as text rendering.
And I never said "all monitors have good calibration". I specifically said that they don't:
Most decent quality monitors also have color profiles (sRGB, P3 D65, possibly AdobeRGB), though of course the quality of calibration varies a lot
So:
decent monitors have useful color profiles, but the quality of calibration varies a lot across all monitors - even the decent ones, and particularly the bad ones. It is often crap, just as it typically is with most modes on most TVs - just in different ways (TVs tend towards overprocessing, oversaturation and excessive contrast; monitors tend towards muted, dull and just plain inaccurate colors).
That is not true; you will not get burn-in. TVs like the C1 are specifically marketed for gaming. They have a whole gaming mode and a gaming UI that even has an FPS counter.
"They have a gaming mode and gaming UI and are marketed for gaming" is in no way proof that burn-in is impossible through gaming. You understand that, right? That marketing can ... mislead? Lie? Just hide inconvenient facts, or gamble that they don't apply to enough people to be a real problem? Like, there is zero causal relationship between the things you are presenting as an argument here and the issue at hand. The existence of gaming modes is not proof of the absence of burn-in - it literally can't be proof of that, as there is no causal relation between the two.
Now, it takes a lot of time to cause burn-in on a modern OLED if it is used for many different things, but if run at high brightness with static imagery (including static portions of an image, such as a game UI), it will still burn in - and it can even be quite quick. How long it takes, and how severe it is, depends on the frequency of use, the brightness, the specific design of the static elements, whether those UI elements are covered by anti-burn-in features (such as auto logo dimming), and more. None of this means it won't happen; it just means that it's less likely to happen than it was a few years ago, and that it won't happen to everyone. If you play lots of different games and don't use the TV for web browsing or work, you might never see meaningful retention. But if you play 4+ hours of a single game every day, and don't use the TV for much else? It'll likely happen sooner rather than later.
As for the processing thing and the lack of DP support, thankfully that's no longer an issue:
LG has lifted the lid on the UltraGear 48GQ900, a huge gaming monitor that will be available this summer. The UltraGear 48GQ900 features a 10-bit WOLED panel, along with modern I/O and a matte finish.
www.notebookcheck.net
Rubbish! TVs use the exact same protocols as PC displays. The HDMI and DP standards require it. Quit talking nonsense.
Nope. PC GPUs generally do not support HDMI-CEC (the protocol TVs, set-top boxes, Blu-ray players and the like use to signal each other to turn on/off, adjust volume, etc.), and while TVs might understand PC display sleep/wake signals, they do not respond to them and have no way of enabling that behaviour.
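For the curious, here's roughly what a CEC message looks like on the wire - a minimal Python sketch using the spec-defined logical addresses and opcodes from the HDMI-CEC standard. This is purely illustrative: actually transmitting these frames requires hardware most GPUs simply don't expose, which is why third-party solutions (e.g. a USB-CEC adapter driven by libcec) exist to fill the gap.

```python
# Sketch only: composing minimal HDMI-CEC frames, to show what a
# "turn on"/"standby" message actually is. Logical addresses and
# opcodes are taken from the HDMI-CEC specification.

# Spec-defined logical addresses
TV = 0x0
PLAYBACK_DEVICE_1 = 0x4  # e.g. a Blu-ray player or games console

# Spec-defined opcodes
IMAGE_VIEW_ON = 0x04  # "wake up and switch to my input"
STANDBY = 0x36        # "go to sleep"

def cec_frame(initiator: int, destination: int, opcode: int) -> bytes:
    """Build a minimal CEC frame: one header byte (initiator in the
    high nibble, destination in the low nibble) followed by the opcode."""
    header = (initiator << 4) | destination
    return bytes([header, opcode])

# A console telling the TV to wake up, and later to go to standby:
wake = cec_frame(PLAYBACK_DEVICE_1, TV, IMAGE_VIEW_ON)
sleep = cec_frame(PLAYBACK_DEVICE_1, TV, STANDBY)
print(wake.hex())   # → "4004"
print(sleep.hex())  # → "4036"
```

The point being: this is a separate low-bandwidth bus pin in the HDMI connector, not something carried over the video signal, so a GPU driver can't just "send" it without dedicated CEC hardware.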