Monday, February 14th 2022
Alienware's 34-inch QD-OLED Monitor Gets a Price
Remember that 34-inch QD-OLED monitor that Alienware announced at CES earlier this year? The company has finally worked out how much it's going to charge for it, although there is still no firm availability date. At US$1,299, the AW3423DW will be $100 pricier than the AW3821DW, which sports a 38-inch Nano IPS panel with a resolution of 3840x1600, whereas the AW3423DW uses a 34-inch QD-OLED panel with a resolution of 3440x1440.
Obviously the two display technologies aren't directly comparable, but it's at least an indication of how pricey QD-OLED will be initially compared with more conventional panel types. Both displays feature G-Sync Ultimate, so it's not as if Dell has cut any corners here. The AW3423DW does offer a higher refresh rate of 175 Hz versus 144 Hz for the AW3821DW, which may be an advantage to some, but its official HDR certification is, oddly enough, only HDR 400 versus HDR 600, despite Dell claiming the panel can deliver up to 1,000 cd/m². That said, the black levels of the AW3423DW should be vastly superior, as should the colour gamut. The display is said to be available sometime in early spring, presumably in the US market first.
Sources:
@Alienware, via TFT Central
135 Comments on Alienware's 34-inch QD-OLED Monitor Gets a Price
Indeed, QD-OLED is cheaper to produce because it greatly simplifies the emissive layer (only a blue emitter instead of the blue and yellow emitters in traditional OLED) and the front-layer structure; fewer layers means easier manufacturing, so ultimately lower production cost.
For sure LG are in a rough spot with their WOLED, because no matter how much they try to optimise it, QD-OLED's inherent advantages over WOLED (burn-in resistance, higher brightness, better color accuracy) will always prevail. Well, the issue is that there has never been a real 32" 2160p market; yes, there have been products here and there, but until now 27" 2160p has been and still is the norm, and only very recently have we started seeing 32" 2160p high-refresh products. Good 32" 2160p high-refresh LCD panels such as the M32U can be found for around $800, so there is plenty of room for a sub-$1,000 32" QD-OLED.
QD-OLED being disruptive is not only a possibility, it's a certainty, as showcased by the Alienware monitor, which sells for much less than its mini-LED LCD competition (while being a superior product). As I previously mentioned, if we factor in the Alienware branding, the ultrawide format and the G-Sync module, which always come at a premium, it is reasonable to expect a sub-$1,000 32" QD-OLED either from Dell, from Samsung themselves, or from another manufacturer supplied by Samsung Display.
I think it will be in high demand because no one else has made OLED PC monitors, and it will be for those that have the GPU power to push 3440x1440 ultrawide at 144 Hz+.
"not designed for static content": Gaming is not static content.. duhhh lol Not true, auto-AI brightness can be turned off.
Also, while image retention is far less of a problem than previously, it is entirely possible to cause it in a relatively short span of time - though it is obviously entirely dependent on your use case. I wouldn't worry for a 2-5-hours-a-day gaming/entertainment display, but for a 10+ hours a day productivity+gaming display? I'd be very worried, especially if the balance of those uses leans more towards productivity. High contrast screen borders and white background applications are prime candidates for causing image degradation on OLEDs - and, crucially, for making it visible when it happens, unlike in a visually complex and dynamic game scene. I assume you're familiar with the concept of UI elements? Health bars, minimaps, ammo counters, hotbars, skill selectors, etc.? Those are pretty static, definitely a lot more static than most TV content, unless your TV is constantly showing the news. TV channel logos are notorious for burning in; game UIs do the same.
You're entirely right that TVs generally have better image quality for video and HDR in particular, though your description of LG's filmmaker mode is wrong: it's a mode with very accurate color reproduction, and it is not content-adaptive in any way. It's literally just a well-calibrated mode that removes most of the image processing that most TVs add in their standard modes. My TV is constantly set to Samsung's equivalent of that mode, as IMO that's the only sensible setting for any display. Most decent-quality monitors also have color profiles (sRGB, P3 D65, possibly AdobeRGB), though of course the quality of calibration varies a lot, and there's the issue that Windows isn't a color-aware OS, meaning that color accuracy is left to apps to handle (and thus most apps don't, leading to a lot of inaccurate colors). But filmmaker mode is ultimately a solution to the problem of TVs over-processing their signals and thus misrepresenting the content they're displaying, something that is a TV-only problem - monitors do not add any real processing to their images. Thus, presenting filmmaker mode as an advantage over monitors, which don't process their images at all, is ... rather odd.
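To make the "color accuracy is left to apps to handle" point concrete, here's a minimal sketch of what a color-aware application has to do on its own, using Pillow's ImageCms (LittleCMS) bindings. This is just an illustration, not anything from the thread; the "monitor.icc" filename is hypothetical and would normally be whatever ICC profile is installed for the display.

```python
# Minimal sketch: app-level color management with Pillow's ImageCms module.
# A non-color-aware app skips this conversion and sends sRGB values straight
# to a wide-gamut panel, which is what produces oversaturated colors.
from PIL import Image, ImageCms

img = Image.open("screenshot.png").convert("RGB")  # content authored in sRGB

srgb_profile = ImageCms.createProfile("sRGB")             # built-in sRGB profile
display_profile = ImageCms.getOpenProfile("monitor.icc")  # hypothetical display ICC profile

# Map the image from the sRGB content space into the display's own space --
# the step the app itself is responsible for on a non-color-managed desktop.
corrected = ImageCms.profileToProfile(img, srgb_profile, display_profile)
corrected.save("screenshot_corrected.png")
```

In a real application you'd build the transform once (ImageCms.buildTransform) and reuse it rather than converting image by image, but the principle is the same.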
But: I never said that either of these were deal breakers. I said that they are annoying downsides of using a TV as a monitor, adding nuance to your black-and-white "TVs are superior" stance. Different products are made for different uses, and have different features. Which are more important is a matter of personal preference.
It's also quite well established that TVs are much better value than monitors, which stands to reason as they also sell in massively larger volumes, and ultimately producing a larger display panel isn't all that much more expensive as long as the infrastructure to do so is in place. They've got the economies of scale firmly on their side. What is relatively new is that this doesn't come with massive image quality issues (thanks to the appearance of LG's filmmaker mode and similar solutions), though WOLED still isn't ideal for text or static 2D PC content - but it's pretty good.
And yes, console games obviously also have UIs, and those also burn in over time. Did I indicate that this was somehow exclusive to PCs? I was talking about use cases, not hardware. If you play CoD on your OLED at moderate to high brightness for hours each day, then in the end you will inevitably get an imprint of the static UI elements in your panel, whether that CoD is from a PC, console, or whatever. I was simply saying that while "gaming" in general isn't static content, the vast majority of games have significant static UI elements, which contradicts what you said in the first place.
Out of the box, LG OLEDs have some of the most inaccurate picture settings, because the white balance is set to 0, which is too blue (that's why on YouTube you'll see plenty of people gaming on their OLEDs and the picture on the TV is blue AF). You can correct it by setting it to Warm 50.
And I never said "all monitors have good calibration". I specifically said that they don't. So: decent monitors have useful color profiles; the quality of calibration (for all monitors) varies a lot - even on the decent ones, but particularly on the bad ones - and it is often crap, just as it typically is with most modes on most TVs, only in different ways (TVs tend towards overprocessing, oversaturation and excessive contrast; monitors tend to be muted, dull and just plain inaccurate). "They have a gaming mode and a gaming UI and are marketed for gaming" is in no way proof of burn-in being impossible through gaming. You understand that, right? That marketing can ... mislead? Lie? Just hide inconvenient facts, or gamble on them not applying to enough people to be a real problem? Like, there is zero causal relationship between the things you are presenting as an argument here and the issue at hand. The existence of gaming modes is not proof of the absence of burn-in - it literally can't be, as there is no causal relation between the two.
Now, it takes a lot of time to cause burn-in on a modern OLED if it is used for many different things, but if run at high brightness with static imagery (including portions of an image being static, such as a game UI), this will still burn in - and it can even be quite quick. How long, and how severely, is dependent on the frequency of use, the brightness, the specific design of the static elements, whether these UI elements are covered by anti-burn in features (such as auto logo dimming), and more. None of this means it won't happen, it just means that it's less likely to happen than it was a few years ago, and that it won't happen to everyone. If you play lots of different games and don't use it for web browsing or work, you might never see meaningful retention. But if you play 4+ hours of a single game every day, and don't use the TV for much else? It'll likely happen sooner rather than later.
As for the processing thing and the lack of DP support, thankfully that's no longer an issue:
www.notebookcheck.net/LG-UltraGear-48GQ900-48-inch-gaming-monitor-announced-with-a-WOLED-panel-HDMI-2-1-120-Hz-support-and-a-matte-finish.605806.0.html#8362734

Nope. PC GPUs generally do not support HDMI CEC (how TVs, STBs, BD players and the like signal each other to turn on/off, adjust volume, etc.), and while TVs might understand PC display sleep/wake signals, they do not respond to them, nor do they have any way of enabling this.
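For anyone wondering what that missing CEC signalling actually looks like: the usual workaround (not something mentioned above) is an external USB-CEC adapter such as a Pulse-Eight dongle driven by libCEC, since the GPU's own HDMI port carries no CEC traffic. A rough sketch, assuming cec-client is installed and the adapter is connected:

```python
# Rough sketch, assuming libCEC's cec-client and a USB-CEC adapter are present.
# The GPU itself sends no CEC, so the power commands go out over the adapter.
import subprocess

def send_cec(command: str) -> None:
    """Pipe a single CEC command ('on' or 'standby') to cec-client."""
    subprocess.run(
        ["cec-client", "-s", "-d", "1"],  # -s: single-command mode, -d 1: errors only
        input=f"{command} 0\n",           # logical address 0 = the TV
        text=True,
        check=True,
    )

send_cec("on")         # wake the TV when the PC resumes
# send_cec("standby")  # put it to sleep when the PC suspends
```

Hooked into the OS's suspend/resume events, that gets you most of the auto on/off behaviour a monitor gives you over DisplayPort.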
But generally speaking it's indeed a disadvantage many people only find out about too late.
So I apologize for being a bit aggressive and blunt.