Friday, March 3rd 2023

LG Display Claims Samsung's QD OLED More Susceptible to Screen Burn Than LG's WOLED

Welcome to the battle of the Korean OLED display makers, where LG Display is now claiming that Samsung's relatively new QD OLED displays are far more susceptible to screen burn than its own WOLED displays. In a way, this is LG getting back at Samsung, as the latter has criticised LG over screen burn on its OLED displays for quite some time, despite the fact that Samsung didn't have any OLED products of its own until last year. LG Display is basing much of its claim on testing by Rtings that isn't yet publicly available, but the company also has a technical explanation behind it all.

Both LG's and Samsung's OLED panels are based around RGB subpixels, just like most LCD panels, the difference being that OLED panels don't have a backlight, as the pixels themselves emit the light. However, RGB subpixels on larger screens tend to lack brightness, which is why LG added white subpixels to its WOLED panels, something that has also been a source of criticism from Samsung. Samsung's QD OLED displays instead use a blue OLED layer behind a Quantum Dot layer, which is meant to produce a brighter image than LG's WOLED panels. LG now claims that because Samsung went down the path of using pure RGB subpixels, each subpixel is subjected to a lot more stress on static images than in its own WOLED design, which in turn causes screen burn. LG Display apparently did not go into much more detail than that at the online press conference it called last week, so we'll have to wait and see what Rtings reveals in the next update of its long-term testing, which is supposed to take place sometime this month.
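To make the stress argument a little more concrete, here is a minimal, purely illustrative sketch in Python. It compares the relative drive each subpixel would need to show a static white patch on an RGB-only layout against a WRGB layout in which a dedicated white subpixel carries most of the luminance. The 70% white share and the unit luminance target are assumptions chosen for illustration; they are not figures from LG, Samsung or Rtings.

```python
# Purely illustrative: relative per-subpixel drive for a static white patch on
# an RGB-only layout (QD OLED-style) versus a WRGB layout (WOLED-style).
# The 70% white share is an assumed figure, not a measured panel characteristic.

def rgb_subpixel_drive(target: float) -> dict:
    """Every coloured subpixel must contribute its full share of the white."""
    return {"R": target, "G": target, "B": target}


def wrgb_subpixel_drive(target: float, white_share: float = 0.7) -> dict:
    """A dedicated white subpixel carries `white_share` of the luminance,
    so the coloured emitters are driven far less hard on static white content."""
    coloured = target * (1.0 - white_share)
    return {"W": target * white_share, "R": coloured, "G": coloured, "B": coloured}


if __name__ == "__main__":
    print("RGB  layout:", rgb_subpixel_drive(1.0))
    print("WRGB layout:", wrgb_subpixel_drive(1.0))
```

The gist of LG's claim is that lower per-emitter drive on static content means less cumulative wear; whether the real-world difference is as dramatic as LG suggests is exactly what Rtings' long-term testing should show.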

Update Mar 3rd 15:08 UTC: Rtings reached out to us and explained that they didn't provide any data to LG Display. Instead, LG Display based its assumptions on photos posted by Rtings on its website. Rtings provided the following statement:
We didn't send any information to LG Display. We published our two-month data and pictures in two waves on February 6th and 16th. It appears LG took these images from our reviews when they were released publicly.

Further to that point, LG Display also did not reach out to us prior to their press call where they referenced our test and images.
Sources: Forbes, Rtings

125 Comments on LG Display Claims Samsung's QD OLED More Susceptible to Screen Burn Than LG's WOLED

#51
R-T-B
clopeziIn the European Union we have a three-year warranty on all products, so it's hard to believe the people claiming high failure rates in a short time, no matter the manufacturer.
I've owned my B9 for over 3 years now. I kind of find it hard to believe that they fail quickly too, but we shall see.
Posted on Reply
#52
z1tu
clopeziIn the European Union we have a three-year warranty on all products, so it's hard to believe the people claiming high failure rates in a short time, no matter the manufacturer.
Citation needed*
I live in the EU and no
Posted on Reply
#55
Denver
clopezieuropa.eu/youreurope/business/dealing-with-customers/consumer-contracts-guarantees/consumer-guarantees/index_en.htm

You're right, the EU mandates two years. In Spain and Portugal we have three years, so I thought that was an EU-wide law. We also have an obligation of 10 years of replacements and spare parts.

However, the point is the same: no TVs cracking after just one year, hehe.
This is a law that should be adopted in other countries. Unfortunately, here in Brazil the law only obliges companies to offer a minimum one-year warranty. I think that's too little for expensive products like big TVs.
Posted on Reply
#56
Waldorf
@TheLostSwede/Prima.Vera
at least their units run Android, have motion processing that doesn't make me sick (and can actually be turned off completely),
while not selling TVs with an external control box that dies after being mounted for not even 6 months (store display being moved).

@konga
60 with vrr is just fine. not everyone plays csgo at 400FPS :D
ignoring that there are many games that don't need or improve from going above 60.

@Readlight
running non-4K content will still look better.
the panel is always 4K, no matter the content resolution, and most TVs do a decent enough job at upscaling,
so it will look better than on any equally sized FHD TV.

that was the easiest way for me to sell (4K) TVs.
not a single customer (up to 70+ years old) claiming 4K TVs were useless because of no content stuck with their statement,
after running (single-source) 1080p content on "identical" models mounted side by side, where only the resolution was different (FHD vs UHD).

@kondamin
ever been to the movies?
bright room?
right...
Posted on Reply
#57
z1tu
Fry178@TheLostSwede/Prima.Vera
at least their units run Android, have motion processing that doesn't make me sick (and can actually be turned off completely),
while not selling TVs with an external control box that dies after being mounted for not even 6 months (store display being moved).

@konga
60 with vrr is just fine. not everyone needs to play csgo at 400FPS, ignoring there are many games that don't improve going above 60.
Only people who don't know better use motion processing.
60 Hz is just fine for people who have never gone above it.
Posted on Reply
#58
Waldorf
for some content not coming from a disc or not shot with cinematics in mind,
i do prefer to have it, not on auto tho.

i guess you assume everyone plays only shooting/flightsim games.
more than 50% of my games are not even capable of doing more than 60,
nor do i see the reason for things like 20y old sims doing more than 60.
Posted on Reply
#59
Chrispy_
bugI have serious doubts QD-OLED is the more efficient solution. Instead of one layer (OLED), it uses two (OLED+QD). And then the electronics need to keep both in sync...
But that's just the (uninformed) engineer in me, we'll just have to wait and see.
Zero electronics are needed for the QD; it's a printed layer that's akin to the phosphor dots in a CRT. Rather than an electron gun exciting phosphor dots, it's a blue OLED exciting a QD. It's a 100% passive solution with no electronics and no moving parts for the QDs.
Posted on Reply
#60
z1tu
Fry178for some content not coming from a disc or not shot with cinematics in mind,
i do prefer to have it, not on auto tho.

i guess you assume everyone plays only shooting/flightsim games.
more than 50% of my games are not even capable of doing more than 60,
nor do i see the reason for things like 20y old sims doing more than 60.
I don't assume everyone plays that, but you're in a fringe minority; you could just be playing your library on a 17-inch CRT.
Posted on Reply
#61
Waldorf
@z1tu
not saying it's not "better" to have higher Hz, but saying it's only for folks who have never seen more than 60 is crap.
when looking at global numbers for gamers, iirc it's mainly 1080p/60-75 and maybe 100/120.

virtually all my shooters are running at 60-120 with vrr, not to reduce latency, but for smoother movement when looking around.
so far, i prefer (some form of) synced 60/75 Hz over any non-synced 100 (or higher).
Posted on Reply
#62
bug
Chrispy_Zero electronics are needed for the QD; it's a printed layer that's akin to the phosphor dots in a CRT. Rather than an electron gun exciting phosphor dots, it's a blue OLED exciting a QD. It's a 100% passive solution with no electronics and no moving parts for the QDs.
Ah, that's great. If it's passive, then the only challenge left for QD-OLED would be that blue OLEDs still have the shortest lifespan.
R-T-BQD layer is pretty much dumb and needs very minimal control. But I still agree with you.

Another issue, though, is that it also has bleed-through to neighboring pixels in single-pixel light tests.

How much that matters is debatable, but it certainly does not happen on WOLED.
Most likely not a problem for TVs, but it could mess with text on a monitor.
Posted on Reply
#63
Vya Domus
I wonder why these manufacturers are so inept at solving these problems, especially considering that they also make the OLED displays that have been used in smartphones for years and for some reason those work just fine.
Posted on Reply
#64
Chrispy_
dgianstefaniHeat is what kills OLED pixels, not the amount of energy. If blue OLED is more efficient than the other colours, then this isn't necessarily true.
Source? I can't prove otherwise but I always understood that heat was merely an acceleration factor, not the primary mechanism.

Just like transistors that suffer from electromigration and NBTI, OLEDs operate on the same principle of moving charge/ions across a barrier between a metallic oxide anode and an organic cathode. Eventually the oxide is displaced by the non-zero momentum of the electron flow.

So yes, make it hot and it will wear faster, but even cooled by LN2 it will still definitely suffer electromigration in use.

If the efficiency thing is true (I can't see the physics reason why it would be), then yeah, a more efficient blue photon source activating a red QD means you need less power, and therefore less heat, to get the same intensity of red as a less efficient red photon source. That part at least would make sense. I'm guessing now, but maybe the organic cathode for red OLED is leakier?
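For what it's worth, temperature-accelerated wear of this kind is often described with an Arrhenius-style model, in which heat multiplies a degradation rate that remains non-zero even when the device runs cool, which is consistent with the "acceleration factor, not primary mechanism" view. The sketch below is generic; the 0.7 eV activation energy is an assumed placeholder, not a measured OLED parameter.

```python
# Generic Arrhenius-style acceleration factor, often used to model
# temperature-accelerated degradation. The activation energy is an assumed
# placeholder value, not a measured OLED parameter.
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K


def acceleration_factor(temp_c: float, ref_temp_c: float = 25.0,
                        activation_energy_ev: float = 0.7) -> float:
    """How much faster wear proceeds at temp_c relative to ref_temp_c."""
    t = temp_c + 273.15
    t_ref = ref_temp_c + 273.15
    return math.exp((activation_energy_ev / BOLTZMANN_EV) * (1.0 / t_ref - 1.0 / t))


if __name__ == "__main__":
    for temp in (25, 40, 60, 80):
        print(f"{temp} C: wear rate roughly {acceleration_factor(temp):.1f}x the 25 C rate")
```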
Posted on Reply
#65
z1tu
Fry178@z1tu
not saying it's not "better" to have higher Hz, but saying it's only for folks who have never seen more than 60 is crap.
when looking at global numbers for gamers, iirc it's mainly 1080p/60-75 and maybe 100/120.

virtually all my shooters are running at 60-120 with vrr, not to reduce latency, but for smoother movement when looking around.
so far, i prefer (some form of) synced 60/75 Hz over any non-synced 100 (or higher).
I agree that synced is always better, but why not high refresh and synced? :D I just can't go back to 60 Hz; I can notice it.
Posted on Reply
#66
Laykun
Chrispy_I hate WOLED, period.

It's only good for inflating the real specs, just like dynamic contrast used to.

Can a 2000-nit WOLED display 2000-nits of Red, Green, or Blue? No. More like 700 nits. BUT, the marketing can claim 2000 nits because that's how bright it can get as it's burning out your retinas with an inaccurate faded white version of whatever colour it's supposed to be displaying.
Isn't the same true for an RGB OLED? Given that nits measure light output per square meter, if you only turn on the red subpixels, don't you only achieve a third of the output brightness vs. all the subpixels turned on?
Posted on Reply
#67
Waldorf
@z1tu
sure, but it depends on the game, and i haven't even included the many games that have fixed and/or limited rates.
e.g. Minecraft won't look any different at higher rates, it will just waste more energy and produce more heat (hw),
and especially console conversions (at least until this gen) like NFS will only do 30, or it affects gameplay (speeding things up).

my rig is virtually identical to a friend's (2080S vs Ti), him running a fixed 120/144 Hz, me synced 60,
and while walking/running/raising weapons is better on his, i still prefer mine with a smoother L<>R experience.
Posted on Reply
#68
z1tu
Fry178@z1tu
sure, but it depends on the game, and i haven't even included the many games that have fixed and/or limited rates.
e.g. Minecraft won't look any different at higher rates, it will just waste more energy and produce more heat (hw),
and especially console conversions (at least until this gen) like NFS will only do 30, or it affects gameplay (speeding things up).

my rig is virtually identical to a friend's (2080S vs Ti), him running a fixed 120/144 Hz, me synced 60,
and while walking/running/raising weapons is better on his, i still prefer mine with a smoother L<>R experience.
I don't know about you, but I can immediately spot the difference on phones as well, between one that has 60 Hz and one that has 120 Hz.
Posted on Reply
#69
Crackong
bugI have serious doubts QD-OLED is the more efficient solution. Instead of one layer (OLED), it uses two (OLED+QD). And then the electronics need to keep both in sync...
But that's just the (uninformed) engineer in me, we'll just have to wait and see.
We are talking about LG's WOLED, which is also a two-layer design (white OLED + color filter).

Your one-layer solution would be JOLED's RGB OLED, which is still maturing and not yet available in high-refresh-rate panels.

Posted on Reply
#70
trsttte
bugI have serious doubts QD-OLED is the more efficient solution. Instead of one layer (OLED), it uses two (OLED+QD). And then the electronics need to keep both in sync...
OLED is also not just one layer; it requires color filters, which waste a big part of the energy that's emitted (only the correct color reaches your eyes). That's one of the great efficiency advantages of QD-OLED: instead of filtering light out and wasting power, quantum dots just "convert" the wavelength to the correct color and let almost all of the original energy through.
Chrispy_If the efficiency thing is true (I can't see the physics reason why it would be), then yeah, a more efficient blue photon source activating a red QD means you need less power, and therefore less heat, to get the same intensity of red as a less efficient red photon source. That part at least would make sense. I'm guessing now, but maybe the organic cathode for red OLED is leakier?
Blue has a shorter wavelength and higher energy than the other colors; to produce the white "backlight" in WOLED, they need to drive the other colors much harder to get the same energy, only to then waste that energy in the color filter.

I don't know if it's possible for quantum dots to go from a lower-energy wavelength to a higher one; that might be a reason why they went with blue instead of green or red.
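As a back-of-the-envelope illustration of the filtering-versus-conversion point, using only the fact that photon energy scales with 1/wavelength: an ideal quantum-dot conversion gives up only the Stokes-shift difference between the absorbed blue photon and the emitted one, while an ideal colour filter discards everything outside its passband, roughly two thirds of a white spectrum. The wavelengths and the one-third figure below are idealised assumptions, not measurements of any actual panel.

```python
# Idealised comparison: energy retained when producing green/red light either
# by colour-filtering white light or by quantum-dot conversion of blue light.
# Photon energy is proportional to 1/wavelength.

BLUE_NM, GREEN_NM, RED_NM = 450, 530, 630  # assumed representative wavelengths


def qd_conversion_retained(target_nm: float, pump_nm: float = BLUE_NM) -> float:
    """Fraction of the pump photon's energy kept by an ideal one-to-one
    blue-to-target conversion (only the Stokes shift is lost)."""
    return pump_nm / target_nm


def filter_retained() -> float:
    """Crude idealisation: a colour filter passes roughly one third of a
    white spectrum and absorbs the rest."""
    return 1.0 / 3.0


if __name__ == "__main__":
    for name, nm in (("green", GREEN_NM), ("red", RED_NM)):
        print(f"{name}: QD conversion keeps ~{qd_conversion_retained(nm):.0%}, "
              f"filtering keeps ~{filter_retained():.0%}")
```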
Posted on Reply
#71
bug
trsttteOLED is also not just one layer; it requires color filters, which waste a big part of the energy that's emitted (only the correct color reaches your eyes). That's one of the great efficiency advantages of QD-OLED: instead of filtering light out and wasting power, quantum dots just "convert" the wavelength to the correct color and let almost all of the original energy through.
I'm not sure why you'd say that. When you convert from blue to a longer wavelength, you're getting waves with lower energy. So there's loss, like in any other conversion. But if Samsung managed to keep this loss to a minimum, kudos to them.

Still, as I've said above, pushing blue OLEDs is still the worst thing. I'm curious about what they did about that.
Posted on Reply
#72
mechtech
kongaUnfortunately, you can't just divide the cost of a 48" 4K display by 4 and say that's how much a 24" 1080p display should cost. There are many fixed per-unit costs at every step along the way, and I wouldn't be surprised if the real cost was up to one half that of a 48" 4K display. You also picked the cheapest OLED display LG makes (the "A" series). I don't think most gamers would be happy with a 60hz display, which is what that is.
Gamers do not make up 100% of the market, though. And if they can make a 48" 4K OLED TV for a grand, then an OLED monitor a quarter of the size should not be the same price.
Posted on Reply
#73
trsttte
bugI'm not sure why you'd say that. When you convert from blue to a longer wavelength, you're getting waves with lower energy. So there's loss, like in any other conversion. But if Samsung managed to keep this loss to a minimum, kudos to them.
You're getting waves with lower energy but more of them, so the total energy stays about the same (there are still some losses, but much less than with simple filtering).
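As a quick sanity check on the photon-count question, under the idealised assumption of a one-photon-in, one-photon-out conversion (real quantum-dot quantum yields are below 100%): photon energy is roughly 1240 eV·nm divided by the wavelength, and energy conservation caps how many lower-energy photons one absorbed photon could yield. For blue to red that cap is one, so an ideal conversion keeps about 70% of the energy per photon, which is still well ahead of ideal filtering. The wavelengths below are assumed representative values.

```python
# Energy-conservation sanity check on the "more photons" question (idealised).
# Photon energy in eV is approximately 1240 / wavelength in nm.

def photon_energy_ev(wavelength_nm: float) -> float:
    return 1240.0 / wavelength_nm


def max_photons_out(pump_nm: float, target_nm: float) -> int:
    """Upper bound on emitted photons per absorbed pump photon,
    set purely by energy conservation."""
    return int(photon_energy_ev(pump_nm) // photon_energy_ev(target_nm))


if __name__ == "__main__":
    blue, red = 450.0, 630.0  # assumed representative wavelengths
    print(f"blue photon: {photon_energy_ev(blue):.2f} eV, red photon: {photon_energy_ev(red):.2f} eV")
    print("max red photons per blue photon:", max_photons_out(blue, red))
```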
Posted on Reply
#74
bug
trsttteYou're getting waves with lower energy but more of them, so the total energy stays about the same (there are still some losses, but much less than with simple filtering).
More of them? So for every photon emitted by the OLED layer, the QD layer emits another one? I find that hard to believe, the QD layer is not emissive, afaik.
Posted on Reply
#75
trsttte
bugMore of them? So for every photon emitted by the OLED layer, the QD layer emits another one? I find that hard to believe, the QD layer is not emissive, afaik.
Quantum dots are... weird. They are "emissive" in the sense that, when excited by light of one wavelength, they will emit light at a different one.
Posted on Reply