Friday, June 5th 2020
LG's 48-inch OLED Gaming TV with G-SYNC Goes on Sale This Month
LG is preparing to launch the latest addition to its lineup of gaming displays, and this time it is going big. Launching this month is LG's 48-inch OLED Gaming TV with a 120 Hz refresh rate and G-SYNC support. Rounding out the impressive feature set, LG has priced the panel at $1,499, which makes it a pricey but tempting buy. Featuring a 1 ms response time and low input lag, the 48CX TV is designed for gaming and fits NVIDIA's Big Format Gaming Display (BFGD) philosophy. Interestingly, the TV uses LG's a9 Gen3 AI processor for content upscaling, so everything can look nice and crisp. AI is used to "authentically upscale lower resolution content, translating the source to 4K's 8.3+ million pixels. The technology is so good, you might mistake non-4K for true 4K".
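For scale, 4K UHD works out to 3840 x 2160 = 8,294,400 pixels, which is the "8.3+ million" figure LG quotes. LG's AI upscaler is proprietary, but a minimal non-AI sketch of resizing a source image to that target resolution, using Pillow (the file names here are placeholders), might look like this:

from PIL import Image

TARGET = (3840, 2160)  # 4K UHD: 3840 * 2160 = 8,294,400 pixels

src = Image.open("input_1080p.png")  # placeholder file name
# Lanczos is a conventional resampling filter; LG's a9 Gen3 swaps this
# step for a learned (AI) model, which this sketch does not attempt.
out = src.resize(TARGET, Image.LANCZOS)
out.save("output_4k.png")
print(f"Upscaled to {TARGET[0] * TARGET[1]:,} pixels")  # 8,294,400

This is plain interpolation, of course; the point is only the pixel arithmetic, not the quality claim.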
"The technology is so good, you might mistake non-4K for true 4K "
I highly doubt it unless there are a bunch of asterisks.
This figure is actually from LG corporate research on their panel lifetime. It's not bad. It's the closest to the present model I can find, though technically we won't see figures for this exact panel for at least a year or two.
Older model (2018) panel specs, for reference:
Regardless of use case, you're looking at around 50 years to 50% brightness. They stopped running that test a while ago, and even they admitted it isn't really relevant to gaming if you "don't game the same game constantly with a static HUD".
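The arithmetic behind a "50 years to 50% brightness" claim is simple enough to sketch; note that the half-life figure below is a hypothetical placeholder chosen to make the example work, not a number from LG's research:

# Hypothetical illustration only: RATED_HALF_LIFE_HOURS is a placeholder,
# NOT LG's actual test figure.
RATED_HALF_LIFE_HOURS = 100_000
HOURS_PER_DAY = 5.5  # assumed average daily use

years = RATED_HALF_LIFE_HOURS / (HOURS_PER_DAY * 365)
print(f"~{years:.0f} years to 50% brightness")  # ~50 years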
Had IPSes before; can't stand the silverish tint and glow. Internet pages look fancy, though.
I have a 48Hz gaming monitor, just to be pedantic ;)
For whatever reason, after a few hours of non-gaming work in front of a large IPS monitor I always need to go lie down for a bit.
My next monitor will be a high-speed 27" TN at 1440p, preferably curved.
It is indeed a small issue, though. I'm aware; look at the charts I posted. Big changes over the years.
8 bits is OK for brightness of up to ~80 nits in a dark room; any brighter and you can easily see (greyscale) banding in dark scenes.
If we assume the same "dark room" setting, then to raise the maximum brightness without introducing banding into equally dark scenes, you need one extra bit of brightness information every time you double the max brightness in nits.
So 10 bits is fine for 320 nits or so, 12 bits for ~1,280 nits, etc.
For colour gradation, 8 bits was OK for Rec.709; just one bit more actually gives you better colour gradation for the Rec.2020 gamut.
Combining the two, 10 bits is fine for Rec.2020 at 160 nits max brightness, and 12 bits would be OK for up to 640 nits. This is true for "raw data", i.e. PC use. The encoded HDR formats used in films etc. do all kinds of trickery, such as also giving the min and max brightness for each frame, so for most content just 10 bits is enough.
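The rule of thumb above reduces to a one-line formula. Here's a minimal sketch formalising that reasoning (my own formalisation of the post's logic, not an official standard):

import math

def bits_needed(max_nits: float, rec2020: bool = False) -> int:
    # 8 bits covers ~80 nits; add one bit per doubling of max brightness,
    # and one more for Rec.2020's wider gamut, per the reasoning above.
    bits = 8 + math.ceil(math.log2(max_nits / 80))
    return bits + (1 if rec2020 else 0)

print(bits_needed(80))                 # 8  -> baseline, Rec.709
print(bits_needed(320))                # 10
print(bits_needed(1280))               # 12
print(bits_needed(160, rec2020=True))  # 10 -> matches the combined figure
print(bits_needed(640, rec2020=True))  # 12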
LCD screens are better in this regard, as the backlight is a separately controllable variable, and adjusting it does not effectively reduce the panel's usable bit depth.
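To illustrate that point with a toy example (my own, not panel-specific): halving brightness digitally collapses 8-bit code values, while halving the backlight leaves all of them distinct.

codes = range(256)  # 8-bit panel code values

# OLED-style dimming: halve brightness by halving the code values.
# Integer division merges pairs of codes into one output level.
digital_dim = {c // 2 for c in codes}
print(len(digital_dim))  # 128 -> one bit of gradation lost

# LCD-style dimming: halve the backlight, leave the codes untouched.
# Every code still maps to a distinct (darker) output level.
backlight = 0.5
analog_dim = {c * backlight for c in codes}
print(len(analog_dim))  # 256 -> full 8-bit gradation kept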
Also, put me in the crowd of people who are fine with OLED as a monitor. I had a first-gen LG OLED that I replaced with a current model after four years (I damaged it), with zero burn-in. RTINGS showed the level of use it takes to cause burn-in, and that's with HEAVY use of a specific type of content. Varied content with "normal" usage should be fine.