Monday, February 14th 2022
Alienware's 34-inch QD-OLED Monitor Gets a Price
Remember that 34-inch QD-OLED monitor that Alienware announced at CES earlier this year? The company has finally worked out how much it's going to charge for it, although there is still no fixed availability date. At US$1,299 the AW3423DW will be $100 pricier than the AW3821DW, which sports a 38-inch Nano IPS panel with a resolution of 3840x1600, rather than the AW3423DW's 34-inch QD-OLED panel at 3440x1440.
Obviously the two display technologies aren't comparable, but it's at least an indication of how pricey QD-OLED will be initially compared to more traditional display technologies. Both displays feature G-Sync Ultimate, so it's not as if Dell has tried to cut any corners here. The AW3423DW does offer a higher refresh rate of 175 Hz vs. 144 Hz for the AW3821DW, which may be an advantage to some, but the official HDR certification is, oddly enough, only HDR 400 vs. HDR 600, despite the fact that Dell claims it can deliver up to 1,000 cd/m². That said, the black levels of the AW3423DW should be vastly superior, as should the colour gamut. The display is said to be available sometime early this spring, presumably in the US market first.
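For a rough sense of how the two panels compare on paper, here is a quick back-of-the-envelope sketch using the figures from the article (marketing diagonals are rounded, so treat the numbers as approximate):

```python
# Quick back-of-the-envelope comparison of the two panels mentioned above.
# Figures are taken from the article; marketing diagonals are rounded.
from math import hypot

panels = {
    "AW3821DW (Nano IPS)": (3840, 1600, 38.0),
    "AW3423DW (QD-OLED)":  (3440, 1440, 34.0),
}

for name, (w, h, diagonal_in) in panels.items():
    megapixels = w * h / 1e6
    ppi = hypot(w, h) / diagonal_in  # pixels along the diagonal / diagonal length
    print(f"{name}: {megapixels:.2f} MP, ~{ppi:.0f} PPI")
```

Pixel density works out nearly identical (about 109-110 PPI for both), so the extra money is going towards the QD-OLED panel itself rather than more pixels per inch.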
Sources:
@Alienware, via TFT Central
135 Comments
If these were dedicated to PC use, they would have:
- DisplayPort
- PC compatible sleep/wake
- An option to bypass the "smarts" entirely, as it would have no use
- Other PC-focused hardware features, like USB hubs (which all of LG's monitors except for the very cheapest have)
All of which their newly announced 48" Ultragear monitor has. That one is dedicated to PCs (though they also advertise it for consoles). No problem :) I'm still under the impression that this is very rare as a TV feature, though maybe it's more common in US models and firmwares? It would definitely be interesting to find out the logic (if there is any!) behind this.
I said motion smoothing looks like garbage. Motion smoothing, or interpolation, is an image-processing feature of TVs that smooths out motion by inserting interpolated "fake" frames in between the actual image data of whatever is being displayed. Remember those "480Hz" and "600Hz" TVs in the mid-2010s? That's motion interpolation. (And those AFAIK all had 60Hz panels.) Motion interpolation relates to the image signal, but not directly to the display panel.
To expand on that point: the OLEDs are 120Hz first and foremost because motion smoothing on a 60Hz OLED doesn't really work - the lightning-fast pixel response times of OLED undermine the effect of any interpolation added to the signal. Doubling the number of frames allows for smoother transitions and thus smoother motion, which aligns better with conventions and expectations for TVs.
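To make the "fake frames" idea concrete, here is a minimal sketch of the simplest possible interpolation, a 50/50 blend of neighbouring frames (the function name and dummy data are mine; real TV motion smoothing uses motion estimation/compensation rather than a plain cross-fade):

```python
# Minimal illustration of frame interpolation: a 60 Hz stream is roughly
# doubled by inserting one synthesized in-between frame per original pair.
# Real TVs use motion-estimation/motion-compensation (MEMC), not a blend;
# this only shows where the "fake" frames sit in the sequence.
import numpy as np

def interpolate_stream(frames_60hz):
    """frames_60hz: list of HxWx3 float arrays in [0, 1]. Returns a ~120 Hz list."""
    out = []
    for a, b in zip(frames_60hz, frames_60hz[1:]):
        out.append(a)
        out.append(0.5 * (a + b))  # interpolated "fake" frame
    out.append(frames_60hz[-1])
    return out

# Two dummy 4x4 frames: the image goes from black to white.
src = [np.zeros((4, 4, 3)), np.ones((4, 4, 3))]
doubled = interpolate_stream(src)
print(len(src), "->", len(doubled), "frames; in-between frame mean =", doubled[1].mean())
```

The extra frames only help if the panel can actually display them, which ties into the point above about the OLEDs being 120Hz first and foremost.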
I have at no point differentiated between "bad" and "good" 120Hz, or any categorization of refresh rates whatsoever. Okay, we need a dictionary here: dedicated means "wholly committed to a particular course of thought or action; devoted." It can also be formulated as "designed for a particular use or function", but in that formulation it carries a strong implication of being designed only or mainly for that use or function (and not simply as one more feature on top of a pile of others). If a TV is dedicated to gaming, then that is its main if not only use. This is not the case for any current LG OLED, despite them being good at gaming and having features that further improve this. Your TV might of course be dedicated to gaming in your usage, if that's your only or main use for it, but that doesn't mean that this type of TV is dedicated to gaming in general.
I'm not downplaying anything, I'm trying to adjust your perspective to match the reality of these devices being general-purpose TVs with good gaming features that a relatively small subset of users buy them for, but which are not the main purpose of the product in general.
This image sums this up pretty succinctly, taken from LG's C1 web site:
www.lg.com/us/images/TV-OLED-C1-10-OLED-Value-4S-Desktop.jpg
Gaming is presented as one of four use cases, showcasing one of four qualities they want to highlight. Beyond that, you have to scroll more than halfway down the page before it gets into gaming. That is not how a dedicated gaming device is marketed.
For contrast: on LG's Ultragear gaming monitors, "gaming" is in the title for every single product, and the first (and often near-exclusive) focus on their product pages is gaming.

I have never contradicted that. I have simply tried to add some nuance to a simplistic perspective, highlighting that while TVs are typically cheaper than monitors and often deliver superior image quality, monitors have other features and other qualities that TVs (typically/often) lack, there are UX challenges with using a TV as a monitor, and not all TV features (image processing especially, hence the need for "game mode") are suited to PC usage, whether that is gaming or web browsing.
The prevalence of heavy-handed signal processing, and how it both degrades image quality on static PC content and causes massive input lag (often to the tune of 100+ ms), is precisely why TVs today have game modes - because they're borderline unusable for anything requiring reaction speed without such a mode, and desktop use would be quite terrible due to text dithering issues, oversharpening, etc.
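For scale, here is a trivial sketch converting that kind of processing latency into frames of delay at 60 Hz (the smaller values are just illustrative game-mode-class figures, not measurements of any particular set):

```python
# Convert processing latency into "frames behind" at a given refresh rate.
def frames_of_lag(latency_ms, refresh_hz):
    frame_time_ms = 1000 / refresh_hz
    return latency_ms / frame_time_ms

for lag_ms in (100, 16, 5):  # heavy processing vs. illustrative game-mode figures
    print(f"{lag_ms:>3} ms at 60 Hz = {frames_of_lag(lag_ms, 60):.1f} frames behind")
```

At 100 ms you are a full six frames behind the source at 60 Hz, which is why those modes exist in the first place.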
I mean, you might as well say it has a "movie dedicated" feature set. Which would be equally nonsensical. It is a multi-purpose product with many features that fit many use cases, and obviously some features align better with some use cases than others - some even only make sense for some uses, like adaptive sync. That does not amount to the TV being dedicated to that one use case.

Worth noting: it doesn't. It supports VESA adaptive sync over HDMI, an extension made by AMD which Nvidia then came to support as "G-Sync Compatible". This is a feature also supported by Xbox consoles for quite a few years now.

Yes, but to a much, much, much lesser degree, typically only amounting to minor adjustments of color balance and contrast. They don't add sharpening, motion interpolation, AI upscaling/image enhancements, or any of the other signal processing techniques applied by TVs.

See above. I have also said this in several previous posts.

That is literally the same thing, though in slightly different contexts. Post-processing refers to processing an image after it is "done" (a digital image is rendered, a film image is developed and scanned, etc.). As a "finished" image is a requirement for there being an image signal to transfer to a TV or monitor (it's "done" by default as it's being transmitted to the display device), the "post" is omitted as it's just meaningless at that point. All signal processing is post processing.

This just shows that you have no idea what you're talking about, sorry. Monitors offer color balance and contrast adjustments on a basic level. Some offer black level adjustments, and (ironically) some gaming monitors are increasing the amount of processing available for various "gaming" features (though most of this is just marketing fluff). Some gaming monitors also add in fps counters and crosshairs, though those are simple graphics overlaid on the image - still, that's also a form of processing. But all of this is lightweight. It still has the potential to drastically change how the image looks, but it requires little processing power and adds very little input lag.
TVs, on the other hand, have long since started offering a host of features to "improve" image quality: motion smoothing/interpolation, algorithmic/AI upscaling, denoising, sharpening, all kinds of nebulously named "image enhancement" settings. This is not even close to an exhaustive list, and TVs also have all of the features above (typically except for the gaming-centric ones, though some do). What makes this distinct from what monitors do is the complexity of the processing and the processing power involved. TVs, even "dumb" ones, have relatively powerful display driver chipsets in order to perform this processing. Monitors mostly have very basic display controller chipsets, fundamentally incapable of performing image processing of this kind.

I have given you concrete, real-world examples demonstrating otherwise. If you have proof that these examples are invalid, please provide it.

Again: I have given you plentiful examples of the user experience niggles and annoyances of using a TV as a monitor. You might not care about them, but they exist.

Done. Seriously, if you missed this, what have you been reading?
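To illustrate the gap in processing cost described above, here is a rough sketch contrasting a monitor-style per-pixel adjustment with one TV-style enhancement (unsharp-mask sharpening). This is not how any particular scaler or TV SoC is implemented; it just shows the difference in work per pixel:

```python
# Illustrative contrast between the two classes of processing discussed above:
# a per-pixel lookup (monitor-style contrast/gamma tweak) versus a neighborhood
# filter (TV-style sharpening via unsharp mask). The point is the difference in
# work per pixel, not image quality; algorithms and numbers are simplified.
import numpy as np

rng = np.random.default_rng(0)
frame = rng.random((1080, 1920)).astype(np.float32)   # one grayscale frame

# Monitor-style: one table lookup per pixel (e.g. a gamma/contrast curve).
lut = (np.linspace(0, 1, 256) ** 0.9).astype(np.float32)
adjusted = lut[(frame * 255).astype(np.uint8)]

# TV-style: unsharp mask = original + amount * (original - blurred).
# The blur alone touches a whole neighborhood per pixel - many reads per pixel.
def box_blur(img, radius=2):
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

sharpened = np.clip(frame + 0.8 * (frame - box_blur(frame)), 0, 1)
print(adjusted.shape, sharpened.shape)
```

The lookup is one memory read per pixel and can run on a bare-bones scaler; the sharpening pass alone reads a whole neighbourhood per pixel, and a TV stacks several such passes per frame, which is where both the beefier SoC and the added input lag come from.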
I also think the entire field of electrical engineering might want to have a word with you.
I mean, what do you imagine a stream of data between a source and receiver constitutes, if not a signal?

Not necessarily. Before the advent of game mode on TVs, many TVs had no way of disabling all processing.

It's fascinating to see someone who has absolutely zero idea what they are talking about constantly double down on their fundamental misconceptions. Like, do you believe that most consumer electronics grant you access to every possible setting? News flash: they don't. Most lock them down tight, exposing just a few. TV processing is much like this - there is a lot that isn't adjustable. Also, a lot of settings are adjustable but can't be disabled entirely - this of course depends on the specific TV and firmware. But thankfully, with PC and game modes, many TVs now allow processing to be disabled entirely.
Also: you asked me for "proof" above, and examples. Not that I hadn't given them before, but they were repeated. Care to respond to that? 'Cause right now it just looks like you're working quite hard to dodge things that contradict your "arguments".
48" OLEDs sell for about 700 Euro atm.
LG promised the 42" C2 to come in March.

Curious sentence, especially given that we are already in "hurts my eyes" territory even with OLED TVs (which are watched from quite a distance), which only gets worse when you sit next to the screen. So, dropping the promised higher brightness, what is that new stuff that is worth more money than normal OLEDs?

- Not sure why one needs DisplayPort. So as to NOT be able to pass sound over it?
- CEC wakes it up over HDMI just fine; can't modern GPUs do that?
- A "gaming mode" bypassing all the smarts is a given these days
- As has been discussed at length above, PCs don't generally support CEC.
- Pretty much, yes. I don't know how many times I've brought up Game Mode in the posts above, but it must be dozens. Your point?
Tl;dr: it's less about the peak brightness and more about how this extends the range of safe long-term usage of the panel, as well as increased versatility.
Also worth noting: Samsung's layer diagrams for their QD-OLEDs have very few layers and even omit a polarizing layer, which further increases light transmission efficiency. And of course (in time) the move to a single color of subpixel over AMOLED (WOLED already has this) has the potential to drastically simplify production - printing/growing a pattern of identical dots is dramatically easier than doing the same three times over with highly accurate spacing. WOLED has this, but needs a color filter layer, which is much less efficient than quantum dots.
And you do have static content in games (bright HUD). Not to mention the static content you have while not gaming (desktop & windowed apps). ;) Linus got burn-in.
Also, they apparently talk about a model called the CX, which I think is not as high-end a TV as the C1 that I'm using. The C1 has even less chance of burn-in, even over incredibly prolonged usage periods. Also, the C1 can be configured to not have auto-dimming or anything auto. On the C1, the auto-dimming and auto-brightness are controlled by the "AI", but you can turn that "AI" off and have almost complete control over the whole image, something that Valantar does not agree with.
I've also never said that "you can't do anything about it", I've said that TVs up until the advent of game modes have often had no option to entirely turn off their (often quite aggressive) processing, which has resulted in high input lag and usability problems with PCs (some TVs have had "PC modes" but even those often didn't disable all processing, leaving in place things like sharpening and denoising).
Also, your application of the term "post processing" here is ... strange. Post (as in: "after") as to what, precisely? After the image is "done", i.e. finished, encoded, and transmitted? If that is the criterion, then by that definition all post processing is signal processing. However, the conventional use of "post processing" is not in relation to signals, but rather in relation to stored data - photoshopping an image, adding a filter to a video, adding a sharpening filter or AA to a game before that image data is transmitted; the "post" then refers to after the creation of the data (often an image), but crucially happens before this is considered "finished" and is transmitted anywhere. You could of course insist that any processing done by the TV is "post processing" because it happens after the signal source has sent the signal, but then we're talking about a distinction without a difference, and your argument collapses.

I'm sorry, but you've got this twisted around. You're not arguing a point here. You're arguing against my points. Whatever delineation of your arguments you are claiming is meaningless, as you are arguing against my claims. Thus, the basis of my claims must be the basis for any further claims made - otherwise we're just not talking about the same thing, making the discussion meaningless. Let's see:
- The absurd idea that signal processing is a term only applicable to analog signals (which you seem to have abandoned for now, which I guess is progress?)
- The misconception that there is a meaningful difference between your unusual application of "post processing" and my use of signal processing
- The idea that users have total control over their consumer electronics. Heck, even LG's OLEDs have widely reported user-inaccessible settings (that some users are gaining access to through a service menu).

Once again: that sentence explicitly contradicts itself. If a TV receives a signal, and then processes the data contained in that signal before displaying it? It is then processing the signal, which we might perhaps call signal processing? I really can't understand why you're so gung-ho on this immaterial use of broadly applicable terminology. It ultimately doesn't make any kind of difference to my argument.

Like ... what kind of reality distortion bubble do you live in? 1:39 and forward: "This burn-in is a result of snapping my windows into the four corners of my display." While he is saying that, we can clearly see the image retention on-screen. 2:00 and forward, Wendell: "It is amazing, but what's not amazing is that I'm starting to see signs of burn-in." Do you need more?
Yes, the pixel refresher program alleviates the problem - but it's a band-aid that "fixes" it by literally wearing down every pixel so that the localized wear is less apparent. It is not a long-term solution, as it will literally wear out the display. The video makes this abundantly clear. That is in fact the entire message of the video: beware that depending on your use case, even brand-new OLEDs can burn in quite quickly. That doesn't mean it will happen - again, it depends on your use case - but they are by no means image retention-proof.
I mean, this discussion only started going off the rails after I pointed out that LG's filmmaker and game modes aren't an advantage but rather features meant to overcome previous disadvantages of TVs compared to monitors. Whether you are only talking about the C1 is irrelevant in regards to that. I've never argued that the C series forces overprocessing on users - that's what filmmaker mode and game mode exist to remove! - I've said that these features exist specifically to alleviate what were previously inherent disadvantages for TVs compared to monitors (and which are still a disadvantage for many TVs).

Okay. And this happens in the TV's SoC, right? How, then, does this data reach your eyes? The SoC isn't a display, after all. It has decoded the image signal, processed it, and encoded it into a new format. Which is then transmitted, over wires, either through a display controller, or if this is integrated into the SoC, directly to the display panel. Which, again, makes it a signal. Heck, the photons carrying information to your eyes are a signal. This is precisely what I pointed out in a previous post: you're arbitrarily delineating a fraction of a second where the data is decoded and changed as it "not being a signal" despite this being an inherent part of a signal transmission and display pipeline. IMO, it would only meaningfully cease to be a signal if the data was at rest for any significant period of time - i.e. some form of storage (not caching). This never happens. Thus, what the SoC performs is signal processing, as the purpose of what it does is to process the image signal before it reaches the display. You can call that post-processing, though the shortness of that Wiki article you shared does go some way towards indicating that this is not the most common usage of that term (it's more common in real-time 3D graphics, for example). But back to what I've been saying quite a few times now: this is a distinction without a difference. Whether we call this signal or post processing changes nothing.

What? Where have I suggested that? I've said that many/most TVs don't allow you to disable it, as the settings aren't available to do so. I've also said that the most common usage of post processing, such as in games, is in relation to changes baked into a signal before it is sent (you can't make a TV or monitor remove an AA pass from a game, for example, as the required data to do so doesn't exist at that point in the pipeline). But I've never, ever said that the processing done by TVs is baked into anything apart from the signal path from TV SoC to display panel.

Did you seriously miss the part where I gave you specific time codes from that video where both Linus and Wendell confirm that they are seeing burn-in? Again: Linus at 1:39, Wendell at ~2:00. (In case you can't see it, those time codes are both links to that point in the video. Try clicking them, please.) I mean, the entire video is about them both getting burn-in on their new OLEDs after relatively short spans of time (and whether or not this is fixable). You're working really hard to pull out misleadingly selective quotes to support your argument here.
They're already available for order if you're signed up for Dell Premier (or whatever it's called); the official launch is March 9th in the US.
Subpixel layout, compared to JOLED: [images not shown]
Now a flat-screen 16:9 version? Heck yeah! I'd take two.