Monday, February 14th 2022

Alienware's 34-inch QD-OLED Monitor Gets a Price

Remember the 34-inch QD-OLED monitor that Alienware announced at CES earlier this year? The company has finally worked out how much it's going to charge for it, although there is still no firm availability date. At US$1,299, the AW3423DW is going to be $100 pricier than the AW3821DW, which sports a 38-inch Nano IPS panel with a resolution of 3840x1600, rather than the AW3423DW's 34-inch QD-OLED panel with a resolution of 3440x1440.

Obviously the two display technologies aren't directly comparable, but it's at least an indication of how pricey QD-OLED will be initially compared to more traditional display technologies. Both displays feature G-Sync Ultimate, so it's not as if Dell has tried to cut any corners here. The AW3423DW does offer a higher refresh rate of 175 Hz vs. 144 Hz for the AW3821DW, which may be an advantage to some, but its official HDR certification is, oddly enough, only HDR 400 vs. HDR 600, despite the fact that Dell claims it can deliver up to 1,000 cd/m². That said, the black levels of the AW3423DW should be vastly superior, as should the colour gamut. The display is said to be available sometime in early spring, presumably in the US market first.
Sources: @Alienware, via TFT Central

135 Comments on Alienware's 34-inch QD-OLED Monitor Gets a Price

#76
Space Lynx
Astronaut
ValantarThat's part of my issue with OLED monitors - I'd need to use it for work, and while I do have a second monitor that I'd be keeping, I wouldn't be comfortable doing most of my work on a vertical 24" 1080p panel. (That secondary monitor + ho-hum game support is also why I've put aside any thought of an ultrawide, but that's another subject entirely.) So for me to go OLED, I'd need some real assurance that it won't have retention issues for at least 5 years with a lot of time spent on Word documents, pdfs and web pages. Given that my current LCD, which cost me about $500 when new, has lasted more than a decade without issues, anything significantly less than that at more than twice the price would be unacceptable.
I keep my work laptop/tablet separate from my personal life. I don't really need to, but it helps me mentally to have separate work and personal machines.
Posted on Reply
#77
Valantar
CallandorWoTI keep my work laptop/tablet separate from my personal life. I don't really need to, but it helps me mentally to have separate work and personal machines.
Yeah, I've considered getting a second small desktop for work just for the mental separation (my work-provided laptop is too slow for my tastes; it's mainly for portability and touchscreen apps), but since I WFH for the foreseeable future (that happens when your job is in a different country than where you live :P ) and I don't have room for two desks, it is what it is in terms of monitors. You're completely right about the mental aspect of separating work and free time, though; that's incredibly important for your well-being.
Posted on Reply
#78
Vayra86
CallandorWoTyeah if I buy this $1300 one, I will be babying it... keeping it turned off when not gaming, or movie watching. actual web browsing and desktop stuff will still be done on my laptop or tablet.
Don't put vaseline on it.
Posted on Reply
#79
RH92
MarsM4NI don't see it. ;) Samsung didn't buy LG's OLED panels either; instead they tried to "survive" with their inferior & expensive QLED TVs while LG's OLEDs were selling like hot cakes.

I did read a comment on one of the QD-OLED videos on YouTube from a guy who seems to have some knowledge of OLED production. He mentioned that QD-OLED will be easier to produce, with better yields, which means cheaper prices. If true, it will be Samsung's "turn the tide" tech & LG's OLEDs will collect dust on the shelves.
Yeah, it is unlikely, but it would not be as absurd as, for example, expecting Samsung to sell their chips to Apple or vice versa.

Indeed, QD-OLED is cheaper to produce because it greatly simplifies the emissive layer (only a blue emitter, instead of blue and yellow in traditional OLED) and the frontal layer structure; fewer layers means easier manufacturing and ultimately lower production cost.

For sure, LG are in a rough spot with their WOLED, because no matter how much they try to optimise it, QD-OLED's inherent advantages over WOLED (burn-in resistance, higher brightness, better colour accuracy) will always prevail.
ValantarHave you looked at the 32" 2160p monitor market recently? The cheapest high refresh rate monitors there are still close to $1000, though they are creeping down - 27-28" variants are $100-200 cheaper. It's entirely possible that Samsung aims for QD-OLED to be disruptive on price, but I see no reason for the $1300 pricing of this to be indicative of sub-$1000 32" 16:9 panels. Also, 1440p/1600p ultrawides are generally far cheaper than 2160p 16:9 panels.
Well, the issue is that there has never been a real 32'' 2160p market. Yes, there have been products here and there, but until now 27'' 2160p has been and still is the norm; only very recently have we started seeing 32'' 2160p high refresh products. Good 32'' 2160p high refresh LCD panels such as the M32U can be found for around $800, so there is plenty of room for a sub-$1000 32'' QD-OLED.

QD-OLED being disruptive is not only a possibility, it's a certainty, as showcased by the Alienware monitor, which sells for much cheaper than its mini-LED LCD competition (while being a superior product). As I previously mentioned, if we factor in the Alienware branding, the ultrawide format and the G-Sync module, which always come at a premium, it is reasonable to expect a sub-$1000 32'' QD-OLED from either Dell, Samsung themselves or another manufacturer supplied by Samsung Display.
Posted on Reply
#81
Nater
No way it sells for $1300. You'll be lucky to get one at that price. Scalpers will scoop up everything Dell has at launch, and they'll all be on third-party Amazon/Newegg/eBay sellers for north of $2000. Bet.
Posted on Reply
#82
Makaveli
NaterNo way it sells for $1300. You'll be lucky to get one at that price. Scalpers will scoop up everything Dell has at launch, and they'll all be on third-party Amazon/Newegg/eBay sellers for north of $2000. Bet.
Since when has scalping monitors been a thing?

I think it will be in high demand because no one else has made OLED PC monitors, and it will be for those who have the GPU power to push 3440x1440 ultrawide at 144 Hz+.
Posted on Reply
#83
goodeedidid
MarsM4NBurn-in proof, ultrawide, curved (QD) OLED, finally a dream comes true. :cool: Expected the price to be closer to 2k instead of 1k.
Not into "Alien" branding, G-Sync tax & "G4M3R" finish, so I will wait for versions from other manufacturers (Samsung, LG, Lenovo, etc. get on it!)

Btw. here's the product video: [embedded video]
Sounds like you do not understand (new) tech & love to throw out money on products that are not designed for static content and will get damaged in such use cases.
(Non-QD) OLED will even get burn-in from basic TV content, as the Real Life OLED Burn-In Test on 6 TVs shows.

Never wondered why they haven't released "wicked gaming" OLED monitors until now (QD-OLED)? Hint: Burn-In ;)

That is not really true; the C1 and C2 are designed exclusively with gaming in mind, and burn-in isn't really an issue anymore. You can hardly, very very hardly, cause burn-in on the C1 even if you tried to. So yes the upcoming C2 will be a wicked gaming TV with so much more value than a dumb overpriced monitor.

"not designed for static content": Gaming is not static content.. duhhh lol
TomorrowNot everyone has space for a tall 42" model. Plus it has aggressive auto brightness that can't be turned off.

How so? $1300 ($1100 without the G-Sync module) is roughly the same price I see the current 48" CX model being sold for. Even cheaper than larger OLEDs.
So the price is very competitive, like others have said. Sure, it has a lower resolution (3440x1440), but it's also more compact to fit on a desk and has a higher 175 Hz refresh rate.
Not true; the auto-AI brightness can be turned off.
Posted on Reply
#84
Valantar
goodeedididSo yes the upcoming C2 will be a wicked gaming TV with so much more value than a dumb overpriced monitor.
But also a ton of annoying features that make it less suited to being a monitor - no auto sleep mode when the PC turns off the monitor or goes to sleep; no auto wake when the opposite happens; no easy input switching; no USB hub or other PC-focused functionality; no DP inputs. Everything has tradeoffs.

Also, while image retention is far less of a problem than previously, it is entirely possible to cause it in a relatively short span of time - though it is obviously entirely dependent on your use case. I wouldn't worry for a 2-5-hours-a-day gaming/entertainment display, but for a 10+ hours a day productivity+gaming display? I'd be very worried, especially if the balance of those uses leans more towards productivity. High contrast screen borders and white background applications are prime candidates for causing image degradation on OLEDs - and, crucially, for making it visible when it happens, unlike in a visually complex and dynamic game scene.
goodeedidid"not designed for static content": Gaming is not static content.. duhhh lol
I assume you're familiar with the concept of UI elements? Health bars, minimaps, ammo counters, hotbars, skill selectors, etc.? Those are pretty static, definitely a lot more static than most TV content, unless your TV is constantly showing the news. TV channel logos are notorious for burning in; game UIs do the same.
Posted on Reply
#85
dir_d
I use my C1 for about 8 hours a day, but I keep it at around 110 nits while working, and I don't have any problems with Excel, Word or any other content. I believe if you go higher than 120 nits you will have issues with programs like Excel over multiple hours.
Posted on Reply
#86
goodeedidid
ValantarBut also a ton of annoying features that make it less suited to being a monitor - no auto sleep mode when the PC turns off the monitor or goes to sleep; no auto wake when the opposite happens; no easy input switching; no USB hub or other PC-focused functionality; no DP inputs. Everything has tradeoffs.

Also, while image retention is far less of a problem than previously, it is entirely possible to cause it in a relatively short span of time - though it is obviously entirely dependent on your use case. I wouldn't worry for a 2-5-hours-a-day gaming/entertainment display, but for a 10+ hours a day productivity+gaming display? I'd be very worried, especially if the balance of those uses leans more towards productivity. High contrast screen borders and white background applications are prime candidates for causing image degradation on OLEDs - and, crucially, for making it visible when it happens, unlike in a visually complex and dynamic game scene.

I assume you're familiar with the concept of UI elements? Health bars, minimaps, ammo counters, hotbars, skill selectors, etc.? Those are pretty static, definitely a lot more static than most TV content, unless your TV is constantly showing the news. TV channel logos are notorious for burning in; game UIs do the same.
Not that your examples are wrong or anything, but those are just not good examples. C1 and now C2 are exclusively gaming TVs. I mean, console games like on the PS5 don't have UI in gameplay? Also, about the turning off and on, I'm not sure you're entirely correct, because for example when I turn off my Apple TV, the TV itself turns off, and even the sound bar, and all of that happens through the ARC protocols. I'm not sure if it is the same with PCs, but to be honest this is such an incredibly small detail that it doesn't matter. I really doubt you're willing to pay $1,000 more just because the TV doesn't go into standby when you turn off your PC. I mean, is this the best thing gaming monitors have going for them? The auto standby mode?!? What about viewing media such as Netflix and movies? Gaming monitors are not good in that regard at all. The LG and other newer OLED TVs have the Filmmaker mode that automatically configures the colors of the TV to match the content you're watching as it was meant to be by the film producers. So again, I'm pretty sure, especially now with that new C2 42-incher, that you're getting so much more value for your money than with a simple lousy monitor.
Posted on Reply
#87
Valantar
goodeedididNot that your examples are wrong or anything, but those are just not good examples. C1 and now C2 are exclusively gaming TVs. I mean, console games like on the PS5 don't have UI in gameplay? Also, about the turning off and on, I'm not sure you're entirely correct, because for example when I turn off my Apple TV, the TV itself turns off, and even the sound bar, and all of that happens through the ARC protocols. I'm not sure if it is the same with PCs, but to be honest this is such an incredibly small detail that it doesn't matter. I really doubt you're willing to pay $1,000 more just because the TV doesn't go into standby when you turn off your PC. I mean, is this the best thing gaming monitors have going for them? The auto standby mode?!? What about viewing media such as Netflix and movies? Gaming monitors are not good in that regard at all. The LG and other newer OLED TVs have the Filmmaker mode that automatically configures the colors of the TV to match the content you're watching as it was meant to be by the film producers. So again, I'm pretty sure, especially now with that new C2 42-incher, that you're getting so much more value for your money than with a simple lousy monitor.
PCs do not communicate with monitors over the same protocols that TVs and TV-connected devices use. PC monitors go into a sleep mode when the PC is off, asleep, or has switched off the monitor, a function TVs lack entirely - they are either on or off, and need to boot up an OS and other things that take some time. As for this being a "small thing" - is it really? Having to pick up your remote to turn off the monitor every time you turn off or leave your PC (unless you want to leave it to the no signal auto-off timer, which is typically ~15m), turning the act of turning on your PC from a one-button affair (or just touching your mouse or keyboard if it's asleep) into, again, requiring a remote control? That's a severe UX regression, harkening back to the AT PC days. Is it a deal-breaker? That depends on who you're asking. It's definitely a hassle (as is needing a remote for anything on your desk).

You're entirely right that TVs generally have better image quality for video and HDR in particular, though your description of LG's filmmaker mode is wrong: it's a mode with very accurate color reproduction, it is not content-adaptive in any way. It's literally just a well calibrated mode that removes most of the image processing that most TVs add in their standard modes. My TV is constantly set to Samsung's equivalent of that mode, as IMO that's the only sensible setting for any display. Most decent quality monitors also have color profiles (sRGB, P3 D65, possibly AdobeRGB), though of course the quality of calibration varies a lot, and there's the issue that Windows isn't a color-aware OS, meaning that color accuracy is left to apps to handle (and thus most apps don't, leading to a lot of inaccurate colors). But filmmaker mode is ultimately a solution to the problem of TVs over-processing their signals and thus misrepresenting the content they're displaying, something that is a TV-only problem - monitors do not add any real processing to their images. Thus, presenting filmmaker mode as an advantage over monitors, which don't process their images at all, is ... rather odd.
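To make "color accuracy is left to apps to handle" concrete, here's a minimal sketch of application-level colour management using Pillow's ImageCms module; the file names are hypothetical, and this is just one way an app can do the conversion the OS won't do for it:

```python
# Sketch: app-level colour management, the kind of work Windows leaves to
# applications. File names ("photo.jpg", "my_display.icc") are hypothetical.
from PIL import Image, ImageCms

img = Image.open("photo.jpg")                        # assume an sRGB-encoded source
srgb = ImageCms.createProfile("sRGB")                # Pillow's built-in sRGB profile
display = ImageCms.getOpenProfile("my_display.icc")  # the monitor's ICC profile

# Convert pixel values from sRGB into the display's actual colour space.
# A non-colour-aware app skips this step and sends sRGB values straight to
# the panel, which is why wide-gamut monitors look oversaturated in most
# Windows software.
corrected = ImageCms.profileToProfile(img, srgb, display)
corrected.save("corrected.png")
```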

But: I never said that either of these were deal breakers. I said that they are annoying downsides of using a TV as a monitor, adding nuance to your black-and-white "TVs are superior" stance. Different products are made for different uses, and have different features. Which are more important is a matter of personal preference.

It's also quite well established that TVs are much better value than monitors, which stands to reason as they also sell in massively larger volumes, and ultimately producing a larger display panel isn't all that much more expensive as long as the infrastructure to do so is in place. They've got the economies of scale firmly on their side. What is relatively new is that this doesn't come with massive image quality issues (thanks to the appearance of LG's filmmaker mode and similar solutions), though WOLED isn't ideal for text or static 2D PC content still - but it's pretty good.

And yes, console games obviously also have UIs, and those also burn in over time. Did I indicate that this was somehow exclusive to PCs? I was talking about use cases, not hardware. If you play CoD on your OLED at moderate to high brightness for hours each day, then in the end you will inevitably get an imprint of the static UI elements in your panel, whether that CoD is from a PC, console, or whatever. I was simply saying that while "gaming" in general isn't static content, the vast majority of games have significant static UI elements, which contradicts what you said in the first place.
Posted on Reply
#88
Chomiq
goodeedididC1 and now C2 are exclusively gaming TVs
Says who? Plenty of people use the C series primarily for media consumption. Gaming isn't even the main marketing angle on LG's page dedicated to the C1. Or do you mean to say that the 65", 77" and 83" C1s are meant to be used for gaming only?
goodeedididThe LG and other newer OLED TVs have the Filmmaker mode that automatically configures the colors of the TV to match the content you're watching as it was meant to be by the film producers.
No, Filmmaker mode does no such thing. What it does is turn off the majority of image processing and bring the white point to the correct D65 value:
Disables all post-processing (e.g. motion smoothing) for both SDR & HDR content preserving the correct aspect ratios, colors and frame rates for a more cinematic experience on Ultra HD TVs.


Image & Display Parameters
  • Maintain source content frame rate & aspect ratio
  • White Point: D65
  • Motion Smoothing/Interpolation: OFF
  • Overscan: Only if signaled with the image
  • Sharpening: OFF
  • TV Noise Reduction: OFF
  • Other image “enhancement” processing: OFF
Also, Filmmaker mode is not exclusive to OLED TVs. It also needs (when operating in auto-detection mode) proper metadata on the media you're playing (telling the TV it's dealing with CINEMA content). That's fine for streaming services like Netflix or Disney+, but it causes problems with physical Blu-ray discs (older discs won't have this metadata).

Out of the box, LG OLEDs have some of the most inaccurate picture settings, because the white balance is set to 0, which is too blue (that's why on YouTube you'll see plenty of people gaming on their OLEDs with a picture that is blue AF). You can correct it by setting it to Warm 50.
Posted on Reply
#89
goodeedidid
ValantarPCs do not communicate with monitors over the same protocols that TVs and TV-connected devices use. PC monitors go into a sleep mode when the PC is off, asleep, or has switched off the monitor, a function TVs lack entirely - they are either on or off, and need to boot up an OS and other things that take some time. As for this being a "small thing" - is it really? Having to pick up your remote to turn off the monitor every time you turn off or leave your PC (unless you want to leave it to the no signal auto-off timer, which is typically ~15m), turning the act of turning on your PC from a one-button affair (or just touching your mouse or keyboard if it's asleep) into, again, requiring a remote control? That's a severe UX regression, harkening back to the AT PC days. Is it a deal-breaker? That depends on who you're asking. It's definitely a hassle (as is needing a remote for anything on your desk).
You're over-dramatizing the issue. It's not a big deal. I haven't used a PC with the newest LG TVs such as the C1, but probably there is a standard protocol to put the TV to sleep or wake it when you turn on your PC. How is it possible to turn on the TV and also the sound system, such as a sound bar, just by turning on a PS5 or an Apple TV? Could be the same deal with a new PC.
ValantarBut filmmaker mode is ultimately a solution to the problem of TVs over-processing their signals and thus misrepresenting the content they're displaying, something that is a TV-only problem - monitors do not add any real processing to their images. Thus, presenting filmmaker mode as an advantage over monitors, which don't process their images at all, is ... rather odd.
What do you really mean by "over-processing their signals"?!? That sounds really odd, to be honest; everything is digital, zeroes and ones. There is no analogue "signal" to process, and the way you say it sounds like you're talking about analogue TVs or something. Also, you're a bit wrong about monitors too, because almost all monitors don't have good calibration, and they have picture modes with different values that don't represent accuracy, such as photo modes or movie modes or gaming modes.
ValantarIf you play CoD on your OLED at moderate to high brightness for hours each day, then in the end you will inevitably get an imprint of the static UI elements in your panel, whether that CoD is from a PC, console, or whatever.
That is not true; you will not get burn-in. Those TVs like the C1 are specifically marketed for gaming. They have a whole gaming mode and a gaming UI that even has an FPS counter.
Posted on Reply
#90
lexluthermiester
ValantarPCs do not communicate with monitors over the same protocols that TVs and TV-connected devices use.
Rubbish! TVs use the exact same protocols as PC displays. HDMI and DP standards require it. Quit talking nonsense.
Posted on Reply
#91
Valantar
goodeedididYou're over-dramatizing the issue. It's not a big deal. I haven't used a PC with the newest LG TVs such as the C1, but probably there is a standard protocol to put the TV to sleep or wake it when you turn on your PC. How is it possible to turn on the TV and also the sound system, such as a sound bar, just by turning on a PS5 or an Apple TV? Could be the same deal with a new PC.
Nope, this is literally not possible. TVs do not understand the signalling protocols PCs use for auto sleep/wake of monitors, and they do not have this functionality built in. PCs do not have HDMI-CEC functionality (though it might be possible to add it through third-party software - I've never checked). But this is not a way in which PCs and TVs communicate.
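For the curious, third-party software plus a dedicated adapter can indeed bolt CEC onto a PC. A minimal sketch, assuming a Pulse-Eight style USB-CEC adapter and the python-cec bindings (add-on hardware, not something a GPU's HDMI port does by itself):

```python
# Sketch: putting a TV into standby from a PC over HDMI-CEC.
# Requires a USB-CEC adapter and the python-cec package; a plain GPU
# HDMI output carries no CEC traffic, so this won't work without one.
import cec

cec.init()                          # open the first CEC adapter found
tv = cec.Device(cec.CECDEVICE_TV)   # logical address 0 is always the TV
print("TV is on:", tv.is_on())
tv.standby()                        # ask the TV to go to standby
# tv.power_on() would wake it back up
```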
goodeedididWhat do you really mean by "over-processing their signals"?!? That sounds really odd, to be honest; everything is digital, zeroes and ones. There is no analogue "signal" to process, and the way you say it sounds like you're talking about analogue TVs or something. Also, you're a bit wrong about monitors too, because almost all monitors don't have good calibration, and they have picture modes with different values that don't represent accuracy, such as photo modes or movie modes or gaming modes.
Uh ... digital signals are also signals. Changing data is signal processing. Over-processing means processing the image data to make it look different. Whether it's analogue or digital is irrelevant. For TVs, typically this means motion smoothing/interpolation, drastic color adjustments, contrast adjustments, algorithmic/AI upscaling or "image enhancement" processes, black level adjustments, denoising, and a whole host of other "features" (I'd rather call them bugs, but that's just me). Why do you think TVs have a "game mode"? Because this mode cuts out all this processing to cut down on input lag, as this processing causes significant lag. Monitors have essentially none of these features, in part because such processing is fundamentally unsuitable for crucial PC use cases such as text rendering.
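To make "changing data is signal processing" concrete: the crudest form of the motion interpolation TVs do is just arithmetic on frame data. A toy sketch in NumPy, not how a TV's video processor actually implements it:

```python
import numpy as np

def blend_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Naive motion smoothing: synthesise an in-between frame by linearly
    blending two real frames. Real TVs use motion-vector search instead,
    but the principle is the same: maths applied to digital pixel data."""
    mix = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mix.astype(np.uint8)

# Two hypothetical 1080p RGB frames: one black, one white.
a = np.zeros((1080, 1920, 3), dtype=np.uint8)
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
halfway = blend_frames(a, b)  # a mid-grey synthetic in-between frame
```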

And I never said "all monitors have good calibration". I specifically said that they don't:
ValantarMost decent quality monitors also have color profiles (sRGB, P3 D65, possibly AdobeRGB), though of course the quality of calibration varies a lot
So: decent monitors have useful color profiles; the quality of calibration (for all monitors) varies a lot - even the decent ones, but particularly the bad ones. And is often crap, just as it typically is with most modes on most TVs - just in different ways (TVs tend towards overprocessing, oversaturation and excessive contrast; monitors tend to be muted, dull and just plain inaccurate).
goodeedididThat is not true; you will not get burn-in. Those TVs like the C1 are specifically marketed for gaming. They have a whole gaming mode and a gaming UI that even has an FPS counter.
"They have a gaming mode and gaming UI and are marketed for gaming" is in no way proof of burn-in being impossible through gaming. You understand that, right? That marketing can ... mislead? Lie? Just hide inconvenient facts, or gamble on that they don't apply to enough people to be a real problem? Like, there is zero causal relationship between the things you are presenting as an argument here and the issue at hand. The existence of gaming modes is not proof of the absence of burn-in - it literally can't be proof of that, as there is no causal relation between the two.

Now, it takes a lot of time to cause burn-in on a modern OLED if it is used for many different things, but if run at high brightness with static imagery (including portions of an image being static, such as a game UI), this will still burn in - and it can even be quite quick. How long, and how severely, is dependent on the frequency of use, the brightness, the specific design of the static elements, whether these UI elements are covered by anti-burn in features (such as auto logo dimming), and more. None of this means it won't happen, it just means that it's less likely to happen than it was a few years ago, and that it won't happen to everyone. If you play lots of different games and don't use it for web browsing or work, you might never see meaningful retention. But if you play 4+ hours of a single game every day, and don't use the TV for much else? It'll likely happen sooner rather than later.


As for the processing thing and the lack of DP support, thankfully that's no longer an issue:
www.notebookcheck.net/LG-UltraGear-48GQ900-48-inch-gaming-monitor-announced-with-a-WOLED-panel-HDMI-2-1-120-Hz-support-and-a-matte-finish.605806.0.html#8362734
lexluthermiesterRubbish! TVs use the exact same protocols as PC displays. HDMI and DP standards require it. Quit talking nonsense.
Nope. PC GPUs generally do not support HDMI-CEC (how TVs, STBs, BD players and the like signal to each other to turn on/off, adjust volume, etc.), and while TVs might understand PC display sleep/wake signals, they do not respond to these signals nor have any way of enabling this.
Posted on Reply
#92
trsttte
ValantarNope. PC GPUs generally do not support HDMI-CEC (how TVs, STBs, BD players and the like signal to each other to turn on/off, adjust volume, etc.), and while TVs might understand PC display sleep/wake signals, they do not respond to these signals nor have any way of enabling this.
Some TVs do support VESA Display Power Management Signaling (the thing DisplayPort and monitors generally use); I think some of the LG OLED stuff is included (although search results include several people complaining about it not working very well).

But generally speaking it's indeed a disadvantage many people only find out about too late.
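For what it's worth, DPMS is easy to poke at from a PC if you want to test whether a given display honours it. A minimal sketch assuming a Linux/X11 machine with the standard xset utility (it won't work under Wayland):

```python
# Sketch: force the connected display into DPMS sleep from a PC.
# Assumes Linux with X11 and the xset utility in PATH.
import subprocess

subprocess.run(["xset", "dpms", "force", "off"], check=True)
# A monitor blanks and sleeps immediately; many TVs instead show a
# "no signal" banner and only power down after their own timeout.
```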
Posted on Reply
#93
lexluthermiester
ValantarNope. PC GPUs generally do not support HDMI-CEC (how TVs, STBs, BD players and the like signal to each other to turn on/off, adjust volume, etc.), and while TVs might understand PC display sleep/wake signals, they do not respond to these signals nor have any way of enabling this.
Absolute nonsense. I have a TV connected to the very PC I'm currently typing on, and it responds to every signal it is supposed to. If you were correct, it would not behave properly. So either prove it with the HDMI/DP spec documentation, or....
Posted on Reply
#94
Valantar
lexluthermiesterAbsolute nonsense. I have a TV connected to the very PC I'm currently typing on, and it responds to every signal it is supposed to. If you were correct, it would not behave properly. So either prove it with the HDMI/DP spec documentation, or....
So the TV goes to sleep when you turn off the PC or it otherwise 'turns off' its 'monitor'? Or does it just lose its signal and go into a 'no signal' pattern for 15 minutes before shutting off? Because that's what >99% of TVs do. And none of those wake from an off state when you turn on the PC either - again, because PCs don't support HDMI CEC.
trsttteSome TVs do support VESA Display Power Management Signaling (the thing DisplayPort and monitors generally use); I think some of the LG OLED stuff is included (although search results include several people complaining about it not working very well).

But generally speaking it's indeed a disadvantage many people only find out about too late.
That's interesting! Do they actually turn on and off like a monitor? That didn't apply to the LG CX that LTT did a video about using as a monitor, at least.
Posted on Reply
#95
R-T-B
CallandorWoT@nguyen @R-T-B what do you think?
I have not seen enough reviews to comment either way yet.
Posted on Reply
#96
Space Lynx
Astronaut
R-T-BI have not seen enough reviews to comment either way yet.
Damnit, I was hoping you had just posted the pre-order link or something... why'd y'all get me excited :(
Posted on Reply
#97
R-T-B
CallandorWoTDamnit, I was hoping you had just posted the pre-order link or something... why'd y'all get me excited :(
lol, I ceased to follow this news after I got my B9 and have been happy ever since, FWIW.
Posted on Reply
#98
lexluthermiester
ValantarSo the TV goes to sleep when you turn off the PC or it otherwise 'turns off' its 'monitor'?
Yup.
ValantarAnd none of those wake from an off state when you turn on the PC either - again, because PCs don't support HDMI CEC.
Wrong again. This TV and the 4K model I have in the living room react the same way. Of course, it does that for ANY signal going live or shutting off.
ValantarDo they actually turn on and off like a monitor?
Yes, seriously.
ValantarThat didn't apply to the LG CX that LTT did a video about using as a monitor, at least.
Linus has a habit of missing things. MOST TVs have a "PC" area in the settings menu to adjust how the TV will react. For some, it's very simple. For others, it's more fine-grained.
Posted on Reply
#99
Valantar
lexluthermiesterYup.

Wrong again. This TV and the 4K model I have in the living room react the same way. Of course, it does that for ANY signal going live or shutting off.

Yes, seriously.
That's pretty cool! But also rare. What brand is it?
lexluthermiesterLinus has a habit of missing things. MOST TVs have a "PC" area in the settings menu to adjust how the TV will react. For some, it's very simple. For others, it's more fine-grained.
Weird. Our Samsung Q80 definitely doesn't have any such feature, nor did any of the other TVs we considered before we bought that one (and yes, I nagged sales reps in stores to give me the remote so I could check the options menus). I've seen "PC mode" toggles on lots of TVs, but those typically just turn off various types of image processing.
Posted on Reply
#100
lexluthermiester
ValantarThat's pretty cool! But also rare. What brand is it?

Weird. Our Samsung Q80 definitely doesn't have any such feature, nor did any of the other TVs we considered before we bought that one (and yes, I nagged sales reps in stores to give me the remote so I could check the options menus). I've seen "PC mode" toggles on lots of TVs, but those typically just turn off various types of image processing.
One is an Insignia (a Best Buy brand) and the 4K one is a Samsung. I've been doing some reading; you might be right, and I might be really lucky. Though I suspect it isn't a case of one of us being dead wrong and the other right. I think reality is somewhere in between: TV makers implement certain features or not, and it varies from maker to maker and model to model.

So I apologize for being a bit aggressive and blunt.
Posted on Reply