Friday, March 3rd 2023
![LG Electronics](https://tpucdn.com/images/news/lg-v1723747195848.png)
LG Display Claims Samsung's QD OLED More Susceptible to Screen Burn Than LG's WOLED
Welcome to the battle of the Korean OLED display makers: LG Display is now claiming that Samsung's newish QD OLED displays are far more susceptible to screen burn than its own WOLED displays. In a way, this is LG getting back at Samsung, as the latter has criticised LG over screen burn on its OLED displays for quite some time, despite the fact that Samsung didn't have any OLED products of its own until last year. LG Display bases much of its claims on testing by Rtings that isn't yet publicly available, but the company also offers a technical explanation for it all.
Both LG's and Samsung's OLED panels are based around RGB subpixels, just like most LCD panels, the difference being that OLED panels don't have a backlight, since the pixels themselves emit the light. However, RGB subpixels on larger screens tend to fall short on brightness, which is why LG added white subpixels to its WOLED panels; that in turn was a source of criticism from Samsung. Samsung's QD OLED displays instead use a blue OLED layer behind a quantum dot layer, which is meant to produce a brighter image than LG's WOLED panels. LG now claims that because Samsung went down the path of pure RGB subpixels, each subpixel is subjected to far more stress on static images than in its own WOLED design, which in turn causes screen burn. LG Display apparently didn't go into much more detail than that at the online press conference it called last week, so we'll have to wait and see what Rtings reveals in the next update of its long-term testing, which is supposed to arrive sometime this month.
Update Mar 3rd 15:08 UTC: Rtings reached out to us and explained that they didn't provide any data to LG Display. Instead, LG Display based its assumptions on photos posted by Rtings on its website. Rtings provided the following statement:
We didn't send any information to LG Display. We published our two-month data and pictures in two waves on February 6th and 16th. It appears LG took these images from our reviews when they were released publicly.
Further to that point, LG Display also did not reach out to us prior to their press call where they referenced our test and images.
Sources:
Forbes, Rtings
125 Comments on LG Display Claims Samsung's QD OLED More Susceptible to Screen Burn Than LG's WOLED
I live in the EU and no
You're right, the EU mandatory warranty is two years. In Spain and Portugal we have three years, so I thought that was an EU law. We also have an obligation of 10 years of replacements and spare parts.
However, the point is the same: no one-year-and-crack TVs hehe.
at least their units run Android, have motion processing that doesn't make me sick (and can actually be turned off completely),
while not selling TVs with an external control box that dies after being mounted for not even 6 months (store display being moved).
@konga
60 with VRR is just fine. not everyone plays CS:GO at 400 FPS :D
ignoring that there are many games that don't need/improve going above 60.
@Readlight
running non-4K content will still look better.
the panel is always 4K, no matter the content res, and most TVs do a decent enough job at upscaling,
so it will look better than on any equally sized FHD TV.
that was the easiest way for me to sell (4K) TVs:
not a single customer (up to 70+ years old) claiming 4K TVs were useless because of the lack of content stuck with their statement,
after watching (single-source) 1080p content on "identical" models mounted side by side, where the only difference was the res (FHD vs UHD).
@kondamin
ever been to the movies?
bright room?
right...
60 Hz is just fine for people who have never gone above it.
i do prefer to have it, not on auto tho.
i guess you assume everyone plays only shooting/flight-sim games.
more than 50% of my games aren't even capable of doing more than 60,
nor do i see the reason for things like 20-year-old sims doing more than 60.
not saying it's not "better" to have higher Hz, but saying it's only for folks who have never seen more than 60 is crap.
looking at global numbers for gamers, iirc it's mainly 1080p/60-75 and maybe 100/120.
virtually all my shooters run at 60-120 with VRR, not to reduce latency, but for smoother movement when looking around.
so far, i prefer (some form of) synced 60/75 Hz over any non-synced 100 (or higher).
Just like transistors that suffer from electromigration and NBTI, OLEDs operate on the same principle of moving charge/ions through a barrier, here the organic layers sandwiched between a metal-oxide anode and a metallic cathode. Eventually the material is displaced by the non-zero momentum of electron flow.
So yes, make it hot and it will wear faster, but even cooled by LN2 it will still definitely suffer electromigration in use.
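The temperature dependence in that comment can be sketched with the Arrhenius factor from Black's equation for electromigration lifetime, MTTF ∝ J⁻ⁿ·exp(Ea/kT). To be clear, Black's equation describes metal interconnects, and the activation energy below is an illustrative placeholder, not a measured OLED value; this is only a back-of-the-envelope sketch of the "hotter wears faster, colder only slows it" argument:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def relative_mttf(temp_k, ref_temp_k, activation_energy_ev=0.7):
    """Lifetime relative to a reference temperature, from the
    Arrhenius factor in Black's equation: MTTF ∝ exp(Ea / (k*T)).
    Current density is held constant, so the J^-n term cancels.
    Ea = 0.7 eV is a placeholder, not a real OLED parameter."""
    return math.exp(activation_energy_ev / (K_BOLTZMANN_EV * temp_k)
                    - activation_energy_ev / (K_BOLTZMANN_EV * ref_temp_k))

# A panel running 20 K hotter has a markedly shorter relative lifetime:
print(relative_mttf(330.0, 310.0))  # < 1.0
# Even at LN2 temperature the wear only slows down; it never stops:
print(relative_mttf(77.0, 310.0))   # huge, but still finite
```

The point the numbers make: the exponential means modest temperature increases cost a lot of lifetime, while cooling buys time without eliminating the mechanism.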
If the efficiency thing is true (I can't see the physics reason why it would be), then yeah, a more efficient blue photon source activating a red QD means you need less power, and therefore less heat, to get the same intensity of red as a less efficient red photon source. That part at least would make sense. I'm guessing now, but maybe the organic cathode for red OLED is leakier?
sure, but it depends on the game. and i haven't even included the many games that have fixed and/or limited rates.
e.g. Minecraft won't look any different at higher rates, it'll just waste more energy and produce more heat (hw),
and especially console conversions (at least until this gen) like NFS will only do 30, or it affects gameplay (speeding things up).
my rig is virtually identical to a friend's (2080S vs Ti), him running fixed 120/144 Hz, me synced 60,
and while walking/running/raising weapons is better on his, i still prefer mine for a smoother L<>R experience.
Your one-layer solution would be JOLED's RGB OLED, which is still maturing and not available in high-refresh-rate panels yet.
I don't know if it's possible for quantum dots to go from a lower-energy wavelength to a higher-energy one; that might be a reason why they went with blue instead of green or red.
Still, as I've said above, pushing blue OLEDs is still the worst thing. I'm curious about what they did about that.
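Whether a quantum dot can shift light the "wrong" way comes down to photon energy, E = hc/λ: shorter wavelengths carry more energy per photon, so a blue photon can be down-converted to green or red (the surplus is shed, mostly as heat), while going from red up to blue would require the dot to add energy. A quick check of the numbers (the wavelengths are typical ballpark peaks, not actual panel specs):

```python
# Photon energy E = h*c / lambda, expressed in electron-volts.
PLANCK_H = 6.62607015e-34   # Planck constant, J*s
LIGHT_C = 2.99792458e8      # speed of light, m/s
EV_IN_J = 1.602176634e-19   # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon at the given wavelength, in eV."""
    return PLANCK_H * LIGHT_C / (wavelength_nm * 1e-9) / EV_IN_J

blue, green, red = 450.0, 530.0, 630.0  # rough peak wavelengths, nm
print(f"blue:  {photon_energy_ev(blue):.2f} eV")   # ~2.76 eV
print(f"green: {photon_energy_ev(green):.2f} eV")  # ~2.34 eV
print(f"red:   {photon_energy_ev(red):.2f} eV")    # ~1.97 eV

# Blue -> red down-conversion is downhill in energy, so a QD can do it;
# red -> blue would need an energy source, which a passive QD layer isn't.
assert photon_energy_ev(blue) > photon_energy_ev(green) > photon_energy_ev(red)
```

This is also why a blue emitter behind the QD layer is the natural choice: it sits above both green and red in energy, so one OLED color can feed all three subpixels.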