> Generally speaking, everything can be considered a signal, but that is completely beside the point. The point is that you said the "signal" is "processed" and that you can't do anything about it, which is completely untrue! What you mean to say is that the data or the image has been processed after the TV has received the digital data, which is called post processing, a completely different thing from what you meant by "signal processing", which is not a term used for any digital gadget such as mobile phones and TVs. Digital TVs receive packets of data, and after that data has been received the TV applies post processing. This is an important point, because as a user you can choose which processing to apply or not to apply. In the case of the C1 you can probably turn off all post processing.
Okay, so you admit that data that is transferred (anywhere) constitutes a signal, but it ... ceases to be a signal when it is being processed? Yeah, sorry, your logic is wildly incoherent. And "signal processing" is a
very broadly applicable term across nearly all fields where it might be relevant, including digital AV equipment and computers. It literally does not matter whatsoever whether the processing is done algorithmically by an IC or through some analog device - both are equally well described by "signal processing". I mean, what do you imagine happens after the TV's SoC is done processing it? It's transmitted to the display panel.
As a signal. So, you have data that is transferred to a TV as some sort of signal, but it conveniently ceases to be a signal just for the fraction of a second it takes to process it, before once again becoming a signal when transferred from the display controller SoC to the panel? Yeah, sorry, that logic is selective, incoherent, and ultimately arbitrary.
I've also never said that "you can't do anything about it", I've said that TVs up until the advent of game modes have often had no option to entirely turn off their (often quite aggressive) processing, which has resulted in high input lag and usability problems with PCs (some TVs have had "PC modes" but even those often didn't disable
all processing, leaving in place things like sharpening and denoising).
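That residual-processing point can be sketched as a toy model. Everything below (the stage names, the latency figures, and which stages are user-disableable) is an invented assumption for illustration, not a measurement of any real TV:

```python
# Toy model of a TV's processing chain. Every stage name, latency figure,
# and "optional" flag below is an invented assumption for illustration.

# Each stage: (name, added latency in ms, user-disableable?)
PIPELINE = [
    ("decode", 5, False),                # decoding can never be skipped
    ("motion_interpolation", 50, True),  # the classic input-lag offender
    ("sharpening", 10, True),
    ("denoising", 10, True),
]

def total_latency(disabled):
    """Sum the latency of every stage that still runs.

    Mandatory stages always count; optional stages count unless named
    in `disabled`. This mirrors old "PC modes" that dropped some
    processing but left sharpening and denoising in place.
    """
    return sum(ms for name, ms, optional in PIPELINE
               if not (optional and name in disabled))

# A proper game mode disables everything optional:
game_mode = total_latency({"motion_interpolation", "sharpening", "denoising"})
# An old-style "PC mode" that only drops motion interpolation:
pc_mode = total_latency({"motion_interpolation"})
print(game_mode, pc_mode)  # 5 25 - the leftover stages still add lag
```

The numbers are made up, but the shape of the problem is real: as long as any stage stays enabled, its latency is in the path between source and panel.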
Also, your application of the term "post processing" here is ... strange.
Post (as in "after") relative to what, precisely? After the image is "done", i.e. finished, encoded, and transmitted? If that is the criterion, then by that definition all post processing is signal processing. However, the conventional use of "post processing" is not in relation to signals, but rather in relation to stored data - photoshopping an image, adding a filter to a video, adding a sharpening filter or AA to a game - before that image data is transmitted; the "post" then refers to after the creation of the data (often an image), but crucially before this data is considered "finished" and is transmitted anywhere. You could of course insist that any processing done by the TV is "post processing" because it happens after the signal source has sent the signal, but then we're talking about a distinction without a difference, and your argument collapses.
> That's beside the point, I'm just talking about the most modern TVs, like the C1.
I'm sorry, but you've got this twisted around. You're not arguing a point of your own here. You're arguing against my points. Any delineation of your arguments as somehow separate from mine is meaningless, as you are responding to my claims. Thus, the basis of my claims must be the basis for any further claims made - otherwise we're just not talking about the same thing, making the discussion meaningless.
What is my fundamental misconception?
Let's see:
- The absurd idea that signal processing is a term only applicable to analog signals (which you seem to have abandoned for now, which I guess is progress?)
- The misconception that there is a meaningful difference between your unusual application of "post processing" and my use of "signal processing"
- The idea that users have total control over their consumer electronics. Heck, even LG's OLEDs have widely reported user-inaccessible settings (which some users are gaining access to through a service menu)
> That I keep telling you that all these TVs do is apply post processing, and that the "signal" they receive isn't "processed" as you claim it to be?
Once again: that sentence explicitly contradicts itself. If a TV receives a signal, and then processes the data contained in that signal, before displaying it? It is then processing the signal, which we might perhaps call signal processing? I really can't understand why you're so gung-ho about this immaterial use of broadly applicable terminology. It ultimately doesn't make any kind of difference to my argument.
> I watched it and Linus clearly said "It shows no signs of burn-in whatsoever. And actually you would find plenty of testimonials from users online"
Like ... what kind of reality distortion bubble do you live in?
1:39 and forward: "This burn-in is a result of snapping my windows into the four corners of my display." While he is saying that, we can
clearly see the image retention on-screen. 2:00 and forward, Wendell: "It is amazing, but what's not amazing is that I'm starting to see signs of burn-in." Do you need more?
Yes, the pixel refresher program alleviates the problem - but it's a band-aid that "fixes" it by literally wearing down
every pixel so that the localized wear is less apparent. It is not a long-term solution, as it will literally wear out the display. The video makes this abundantly clear. That is in fact the entire message of the video: beware that depending on your use case, even brand-new OLEDs can burn in quite quickly. That doesn't mean it
will happen - again, it depends on your use case - but they are by no means image retention-proof.
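The band-aid mechanism is easy to show with a deliberately crude toy model. The numbers here are invented and the mechanism is simplified down to the point the video makes: evening out per-pixel wear hides the retention, but only by increasing total panel wear.

```python
# Crude toy model of a pixel-refresher cycle. Numbers are invented, and
# the mechanism is simplified: burn-in is hidden by wearing every pixel
# down to the level of the most-worn one.

def run_refresher(wear):
    """Even out per-pixel wear by wearing every pixel up to the maximum."""
    worst = max(wear)
    return [worst for _ in wear]

wear = [0.10, 0.10, 0.30, 0.10]    # one region more burned in than the rest
evened = run_refresher(wear)

visible_before = max(wear) - min(wear)     # ~0.20: visible retention
visible_after = max(evened) - min(evened)  # 0.0: retention "fixed"
total_before = sum(wear)                   # ~0.60
total_after = sum(evened)                  # ~1.20: the whole panel aged
```

The localized difference disappears, but total wear doubles - which is exactly why it's a band-aid rather than a long-term solution.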