Thursday, January 10th 2019

CES 2019: Alienware Saves the PC Monitor World With a 55" OLED Gaming Monitor

So, that news title may be slightly too flashy for the actual product, but bear with me here: OLED is such an improvement over current mainstream display technologies that its arrival in the PC monitor space is one of the most sought-after unicorns in this market. Alienware, via a partnership with LG (an almost obvious pairing), is the one bringing this particular unicorn to reality. The Alienware 55 OLED gaming monitor will feature 4K resolution @ 120 Hz, Variable Refresh Rate support via HDMI 2.1 (FreeSync? G-Sync? - all's still up in the air), DisplayPort 1.4, and 98 percent coverage of the DCI-P3 color space, with HDR support thrown into the mix.
If those specs sound familiar, that's because they are: Alienware is naturally using an LG OLED panel for this product, and the specs sit in the same ballpark as LG's own announced sets. Even the diagonal, so incredibly big by PC monitor standards (unless you count NVIDIA's BFGD displays), is the smallest LG's OLED panels go. The lack of a TV tuner seems to be one of the signs pointing towards this set's classification as a gaming monitor. Whether there are more specific, under-the-hood improvements for a PC-centric monitor, or if this is just a repurposed LG OLED television with Alienware's branding and a gaudy light strip on the back, isn't a question we're equipped to answer at the moment. Alienware does say, however, that the monitor ships with several modes tailored to different game genres.
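For a sense of why HDMI 2.1 matters for the headline spec, a rough back-of-envelope sketch of the active-pixel data rate follows (our own illustration, not figures from Alienware's or LG's spec sheets; blanking intervals and protocol overhead are ignored, so real requirements are somewhat higher):

```python
# Back-of-envelope bandwidth check for 4K @ 120 Hz (active pixel data only).
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for bpp, label in [(24, "8-bit RGB"), (30, "10-bit RGB (HDR)")]:
    rate = data_rate_gbps(3840, 2160, 120, bpp)
    print(f"4K @ 120 Hz, {label}: ~{rate:.1f} Gbit/s")

# For comparison, approximate effective payload of the two links:
#   DisplayPort 1.4 (HBR3, 4 lanes, after 8b/10b coding): ~25.9 Gbit/s
#   HDMI 2.1 (FRL, 48 Gbit/s link, after 16b/18b coding): ~42.7 Gbit/s
# 10-bit 4K120 therefore needs HDMI 2.1, or DSC / chroma subsampling on DP 1.4.
```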

Linus Tech Tips recorded the Alienware monitor in action at 4K resolution, so you can check that video (with YouTube compression, sadly) below.

Source: Digital Trends

53 Comments on CES 2019: Alienware Saves the PC Monitor World With a 55" OLED Gaming Monitor

#26
INSTG8R
Vanguard Beta Tester
bug: QLED manages to suggest both OLED and quantum dot at the same time, while having nothing to do with either one.
I have a quantum dot panel and, to be perfectly honest, I'm actually quite disappointed with it. My previous IPS panel (MG279Q) looked way better and had far less backlight bleed than this one.
Posted on Reply
#27
bug
INSTG8R: I have a quantum dot panel and, to be perfectly honest, I'm actually quite disappointed with it. My previous IPS panel (MG279Q) looked way better and had far less backlight bleed than this one.
You don't have a quantum dot panel. No one does (outside labs).
Quantum dot is supposed to work much like OLED: self-emitting, wide gamut individual pixels. What we have today is what @londiste described above: standard LCD panels with a layer of quantum dots that are not emitting, but act as a filter to enhance color gamut.
Posted on Reply
#28
Axaion
Just give us a 24" version .. even 1080p high refresh rate id pay dumb amounts of money for
Posted on Reply
#29
INSTG8R
Vanguard Beta Tester
bug: You don't have a quantum dot panel. No one does (outside labs).
Quantum dot is supposed to work much like OLED: self-emitting, wide gamut individual pixels. What we have today is what @londiste described above: standard LCD panels with a layer of quantum dots that are not emitting, but act as a filter to enhance color gamut.
So Samsung is lying to me then??
www.samsung.com/us/computing/monitors/gaming/27--chg70-gaming-monitor-with-quantum-dot-lc27hg70qqnxza/
I get that this may just be “marketing speak”, but why call it that if, as you say, it doesn't exist!?
Posted on Reply
#30
bug
INSTG8R: So Samsung is lying to me then??
www.samsung.com/us/computing/monitors/gaming/27--chg70-gaming-monitor-with-quantum-dot-lc27hg70qqnxza/
I get that this may just be “marketing speak”, but why call it that if, as you say, it doesn't exist!?
I meant that the potential of quantum dot (QLED) tech is far greater than what we have now. Today we only get a hybrid, stopgap solution. That's all.

Edit: if it's not obvious enough, today we get the Prius, but the true potential is Model S ;)
Posted on Reply
#31
Vayra86
Mats: Bring down the size? There are plenty to choose from if you want smaller.

Can PC monitors only be placed on desks? ;) This is clearly not made for desks: at 80 ppi, its pixel density is lower than a 24" 1080p monitor's (92 ppi).
So it's a TV, and we already had quite a few of those :)
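For reference, those ppi figures are just the usual diagonal-pixel arithmetic; a quick sketch (plain geometry, nothing specific to these particular panels):

```python
# Pixel density from resolution and diagonal size.
def ppi(width_px, height_px, diagonal_in):
    return (width_px**2 + height_px**2) ** 0.5 / diagonal_in

print(round(ppi(3840, 2160, 55)))  # ~80 ppi for the 55" 4K panel
print(round(ppi(1920, 1080, 24)))  # ~92 ppi for a 24" 1080p monitor
```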
INSTG8R: So Samsung is lying to me then??
www.samsung.com/us/computing/monitors/gaming/27--chg70-gaming-monitor-with-quantum-dot-lc27hg70qqnxza/
I get that this may just be “marketing speak”, but why call it that if, as you say, it doesn't exist!?
Yes Samsung is lying to you. QLED is 99% marketing.
Posted on Reply
#32
INSTG8R
Vanguard Beta Tester
Vayra86: So it's a TV, and we already had quite a few of those :)
Yes, Samsung is lying to you. QLED is 99% marketing.
Well, I did manage to trick it into doing native 4K@30, so it does have some tricks up its sleeve. I wish I'd taken a picture, but I did have a witness. I was setting up my old Fury for a friend and was having DP issues, so I switched to HDMI and ran the Time Spy demo to test, and it was definitely in 4K. I can't reproduce it on my Vega over DP, though.
Posted on Reply
#33
Vayra86
INSTG8R: Well, I did manage to trick it into doing native 4K@30, so it does have some tricks up its sleeve. I wish I'd taken a picture, but I did have a witness. I was setting up my old Fury for a friend and was having DP issues, so I switched to HDMI and ran the Time Spy demo to test, and it was definitely in 4K. I can't reproduce it on my Vega over DP, though.
It supports the resolution, apparently, but it cannot display native 4K. Probably some scaling going on inside.

Posted on Reply
#34
INSTG8R
Vanguard Beta Tester
Bro, I saw it with my own eyes: it was 4K@30, 3840x2160@30 (HDMI limitations). The OSD is not gonna lie, and you're giving it too much credit for upscaling. Pretty sure they'd list that in the specs if it was a feature.
Posted on Reply
#35
londiste
INSTG8R: Well, I did manage to trick it into doing native 4K@30, so it does have some tricks up its sleeve. I wish I'd taken a picture, but I did have a witness. I was setting up my old Fury for a friend and was having DP issues, so I switched to HDMI and ran the Time Spy demo to test, and it was definitely in 4K. I can't reproduce it on my Vega over DP, though.
CHG70? The monitor you linked? Hell no. You did not trick it into doing native 4K@30. This is quite literally impossible.
You might have "tricked it into" downscaling the 4K@30 input into 1440p.
Posted on Reply
#36
bug
INSTG8R: Bro, I saw it with my own eyes: it was 4K@30, 3840x2160@30 (HDMI limitations). The OSD is not gonna lie, and you're giving it too much credit for upscaling. Pretty sure they'd list that in the specs if it was a feature.
It's possible it accepted the 4K input and downscaled it. No way to prove it, unfortunately, short of displaying a grid of one-pixel-wide lines and checking how it looks. Not sure whether it matters, though.
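A minimal sketch of such a test pattern (assuming Pillow is installed; the filename is arbitrary): alternating one-pixel black/white columns at 3840x2160. Shown 1:1 on a true 4K panel the lines stay crisp; a 1440p panel downscaling the input smears them into grey mush or moiré.

```python
# Alternating 1-px black/white columns at 4K, for eyeballing whether a display
# really resolves native 3840x2160 or is downscaling the input.
from PIL import Image

W, H = 3840, 2160
img = Image.new("RGB", (W, H), (0, 0, 0))   # start fully black
px = img.load()
for x in range(0, W, 2):                    # paint every other column white
    for y in range(H):
        px[x, y] = (255, 255, 255)
img.save("4k_line_test.png")
```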
Posted on Reply
#37
INSTG8R
Vanguard Beta Tester
londiste: CHG70? The monitor you linked? Hell no. You did not trick it into doing native 4K@30. This is quite literally impossible.
You might have "tricked it into" downscaling the 4K@30 input into 1440p.
ORLY?
Posted on Reply
#38
goodeedidid
The price: sell your family into slavery, and then you can probably afford this... lol
Posted on Reply
#40
INSTG8R
Vanguard Beta Tester
londiste: Try something like this:
hifi-writer.com/wpblog/?p=4400
Man, I just switched back to DP... It will only do it over HDMI. I'll bookmark it and try again later, but the OSD never lies about input. If it's 1080 it says 1080, 1440, etc.
Posted on Reply
#41
londiste
INSTG8R: Man, I just switched back to DP... It will only do it over HDMI. I'll bookmark it and try again later, but the OSD never lies about input. If it's 1080 it says 1080, 1440, etc.
It doesn't lie about the input. It just doesn't reflect what the panel actually shows.
Downscaling the input is common enough in TVs and in monitors aimed at TV/console use.
Posted on Reply
#42
INSTG8R
Vanguard Beta Tester
londiste: It doesn't lie about the input. It just doesn't reflect what the panel actually shows.
Downscaling the input is common enough in TVs and in monitors aimed at TV/console use.
While I agree, this is quite the opposite, isn't it? Over DP I can't select anything higher than 1440.
Posted on Reply
#43
Valantar
Raevenlord: Variable Refresh Rate support via HDMI 2.1 (FreeSync? G-Sync? - all's still up in the air)
No, it's not. HDMI 2.1 VRR is an adaptation of VESA Adaptive Sync, i.e. FreeSync over HDMI with some (likely minor) tweaks. Certainly no G-sync shenanigans here.


Other than that, while this is indeed very nice looking, OLED still has burn-in. Are we supposed to constantly hide our taskbars, move around windows, and have our games regularly move their UI elements? 'Cause otherwise, after a few years it's welcome to burn-in town, population: everyone with this monitor.
Posted on Reply
#44
efikkan
londiste: Samsung should be spanked for naming those QLED.
Yes, they should, but it's the same problem as most LCD TVs being branded as "LED TVs" in an attempt to make people think they're OLED.
londiste: What LCD has over OLED is brightness. This gets paid for in contrast. IPS has 1000:1 contrast, VAs have 3000-5000:1. The only reasonable solution for extending that contrast is FALD, which inevitably has halo/blooming issues.

OLED is a conditional choice. If you use the screen in a well-lit room or are worried about retention/burn-in, get LCD (LED, QLED, nano-LED, whichever).

When looking at the screen in a dark or dim room, OLED's dynamic range is very noticeable. Black blacks effectively mean infinite contrast/range.
Well, there are certainly pros and cons to every technology, but there is no doubt that OLED is in a different league in terms of picture quality.
LCDs haven't really improved much in the last 10 years, and HDR LCD TVs cheat by using local dimming, which introduces all sorts of visual artifacts.

OLEDs have improved in many ways over the last few years, but the main drawbacks versus plasma remain pretty much unchanged: terrible shadow detail (despite deep blacks), blurry motion, and trouble holding saturation in bright colors. Despite all this, I would still choose an OLED today.
Valantar: Other than that, while this is indeed very nice looking, OLED still has burn-in. Are we supposed to constantly hide our taskbars, move around windows, and have our games regularly move their UI elements? 'Cause otherwise, after a few years it's welcome to burn-in town, population: everyone with this monitor.
Well, technically neither OLED nor plasma has ever had burn-in.
What people call "burn-in" on OLED (and plasma) is uneven wear of the pixels, unrelated to the actual burn-in that could happen on CRTs with static images. On OLED and plasma, some regions simply wear faster than others and therefore become dimmer. This is an important distinction, because it has nothing to do with whether the picture is static. You can have a constantly moving bright dot spinning in a circle on an OLED, and after a few hundred hours you will get a circle of so-called "burn-in", despite the picture never being static. Static images with bright elements are a common cause, but the culprit is the brightness, not the static content: you can game for thousands of hours with static UI elements with no problem, as long as they are not bright, yet if you watch a TV channel all day with a bright scrolling news ticker, you will get so-called "burn-in". Understanding this helps you judge whether it applies to your usage and, if you own one, how to take care of it. For most people who watch varied content, there is no issue at all.
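A toy model of that point (our own illustration, not a measurement of any real panel): if per-pixel wear simply accumulates in proportion to emitted brightness each frame, a bright dot that never stops moving still leaves a worn ring along its path, even though the image is never static.

```python
# Toy wear model: wear accrues with emitted brightness per frame, regardless of
# whether the image is static. A dot circling the screen leaves a worn ring.
import numpy as np

H, W = 200, 200
wear = np.zeros((H, W))                      # cumulative per-pixel "wear"
cx, cy, radius = W // 2, H // 2, 60

for t in range(60_000):                      # arbitrary number of frames
    angle = 2 * np.pi * t / 600              # the dot completes a loop every 600 frames
    x = int(cx + radius * np.cos(angle))
    y = int(cy + radius * np.sin(angle))
    wear[y - 2:y + 3, x - 2:x + 3] += 1.0    # small bright dot; rest of the frame is black

print("wear along the dot's path:", wear.max())
print("wear in the untouched background:", wear[0, 0])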
Posted on Reply
#45
Valantar
efikkan: Yes, they should, but it's the same problem as most LCD TVs being branded as "LED TVs" in an attempt to make people think they're OLED.

Well, there are certainly pros and cons to every technology, but there is no doubt that OLED is in a different league in terms of picture quality.
LCDs haven't really improved much in the last 10 years, and HDR LCD TVs cheat by using local dimming, which introduces all sorts of visual artifacts.

OLEDs have improved in many ways over the last few years, but the main drawbacks versus plasma remain pretty much unchanged: terrible shadow detail (despite deep blacks), blurry motion, and trouble holding saturation in bright colors. Despite all this, I would still choose an OLED today.

Well, technically neither OLED nor plasma has ever had burn-in.
What people call "burn-in" on OLED (and plasma) is uneven wear of the pixels, unrelated to the actual burn-in that could happen on CRTs with static images. On OLED and plasma, some regions simply wear faster than others and therefore become dimmer. This is an important distinction, because it has nothing to do with whether the picture is static. You can have a constantly moving bright dot spinning in a circle on an OLED, and after a few hundred hours you will get a circle of so-called "burn-in", despite the picture never being static. Static images with bright elements are a common cause, but the culprit is the brightness, not the static content: you can game for thousands of hours with static UI elements with no problem, as long as they are not bright, yet if you watch a TV channel all day with a bright scrolling news ticker, you will get so-called "burn-in". Understanding this helps you judge whether it applies to your usage and, if you own one, how to take care of it. For most people who watch varied content, there is no issue at all.
From now on, I'll use "the phenomenon previously known as burn-in". Better? ;)

And to a degree you're right: the problem is that what an image displays and how bright it is are linked, and the wear you describe happens at the color/subpixel level. In other words, if your entire display only showed blue for a year, the blue subpixels would dim noticeably, but not the red or green ones. When you have a static UI element, such as a taskbar, where certain colors are displayed in certain patterns statically, only the relevant subpixels for each colored field would dim. In your example, with a spinning bright dot, all the colors would dim, not just one or two (even if they'd likely not wear evenly). While annoying, that would be less of an issue than a negative image of the Chrome logo constantly being displayed in the bottom left of your screen due to uneven wear of the subpixels. What causes this, and whether it ought to be called burn-in or not, is rather irrelevant IMO.
Posted on Reply
#46
ZoneDymo
Ehhhh, I care more about MicroLED, tbh... or lazor TV/screen tech, but that seems dead :(
Posted on Reply
#47
Valantar
ZoneDymo: Ehhhh, I care more about MicroLED, tbh... or lazor TV/screen tech, but that seems dead :(
MicroLEDs are still faaaaar from small enough to do 100+ dpi for regular PC usage (think 24" 1080p), let alone modern resolutions like 4K at reasonable sizes - and shrinking them to that level is very likely to bring issues (heat, manufacturing difficulty, brightness) with it. It's a promising tech, but I doubt we'll see it below large TV sizes in the coming years.
Posted on Reply
#48
Gungar
Valantar: MicroLEDs are still faaaaar from small enough to do 100+ dpi for regular PC usage (think 24" 1080p), let alone modern resolutions like 4K at reasonable sizes - and shrinking them to that level is very likely to bring issues (heat, manufacturing difficulty, brightness) with it. It's a promising tech, but I doubt we'll see it below large TV sizes in the coming years.
Companies have already shown 100 dpi MicroLED screens.
Posted on Reply
#49
Valantar
Gungar: Companies have already shown 100 dpi MicroLED screens.
Source? "Companies"?

Also, to pull in a relevant quote from another thread:
Vayra86: tweakers.net/nieuws/147538/samsung-toont-prototype-van-75-inch-4k-microled-tv.html
Samsung says 2-5 years for this 75-inch model to hit the shelves. Or, in other words: they haven't got the slightest clue, but it's far off.
Right now it's basically a patchwork of smaller displays, and it has nowhere near the quality needed for mass production. And don't get me started on the pixel density... look at the space between the LEDs there; it's bigger than the LEDs themselves. Ouch.
If that's the best Samsung (a tech giant with near unlimited resources, and leading display R&D expertise) is able to pull off for a CES demo (i.e. about as one-off as you're able to get), that really doesn't bode well for others doing better. Sure, some small manufacturer might be able to make a better solution for bespoke, low-volume applications where cost is no issue, but that doesn't mean consumers will see anything like this for quite a few years.

HiSense's "ULED XD" (despite the dumb name) looks far more promising, and is almost ready for prime time. The tech would be rather trivial to shrink down to smaller panel sizes too.
Posted on Reply