
Gigabyte Launches 55-inch Android Powered Gaming Monitor

All current-gen GPUs have HDMI 2.1, and so do current-gen PCs with IGPs.

A display like this would be targeted at users like myself.

Because if you have an older PC, 4K 120 Hz won't be your target anyway. Even if limited to 4K 60 Hz, you'd only need HDMI 2.0, which even GPUs as far back as the GTX 9xx series and AMD RX 4xx series support.
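(For context, a rough back-of-the-envelope bandwidth check of that claim, assuming standard CTA 4K timings and 8-bit RGB; the payload figures below are approximations, not spec quotes:)

```python
# Rough HDMI bandwidth check (approximate figures, 8-bit RGB assumed).
# CTA 4K timings use a total raster of roughly 4400 x 2250 including blanking.

def video_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# Approximate usable payload after line coding:
# HDMI 2.0 TMDS: 18 Gbit/s raw, 8b/10b coding  -> ~14.4 Gbit/s
# HDMI 2.1 FRL:  48 Gbit/s raw, 16b/18b coding -> ~42.7 Gbit/s
HDMI_2_0 = 18.0 * 8 / 10
HDMI_2_1 = 48.0 * 16 / 18

for hz in (60, 120):
    rate = video_rate_gbps(4400, 2250, hz)
    print(f"4K{hz}: {rate:.1f} Gbit/s -> "
          f"fits HDMI 2.0: {rate <= HDMI_2_0}, fits HDMI 2.1: {rate <= HDMI_2_1}")

# 4K60  comes out around 14.3 Gbit/s: it just fits HDMI 2.0.
# 4K120 comes out around 28.5 Gbit/s: HDMI 2.1 (or DSC / chroma subsampling) is needed.
```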
Sorry, but this is inaccurate. Even though the IGP does in theory support HDMI 2.1, most computers don't, as it requires extra components which add extra cost, so both the motherboard makers and notebook makers are cutting corners. Most notebooks that don't have a dedicated GPU do in fact seem to ship with HDMI 1.4 for some stupid reason.
 
Sorry, but this is inaccurate. Even though the IGP does in theory support HDMI 2.1, most computers don't, as it requires extra components which add extra cost, so both the motherboard makers and notebook makers are cutting corners. Most notebooks that don't have a dedicated GPU do in fact seem to ship with HDMI 1.4 for some stupid reason.
Possibly. I know all the laptops we receive and send out, which are basic HPs and Dells, have HDMI 2.0.
 
Possibly. I know all the laptops we receive and send out, which are basic HPs and Dells, have HDMI 2.0.
Business models vs. consumer models. The manufacturers save every cent they can when their customers don't really know what they're buying.
 
Business models vs. consumer models. The manufacturers save every cent they can when their customers don't really know what they're buying.
Manufacturers are so cheap, it makes you wonder how much they saved by not adding a price to this PR. :D :laugh:

TBH, IDK why any console player would buy one of these when they can use their 60-inch normal TV. If it can load apps like Plex or Peacock, someone will find this TV useful.
 
Business models vs. consumer models. The manufacturers save every cent they can when their customers don't really know what they're buying.
All of our units are consumer. It's cheaper than business.
 
All of our units are consumer. It's cheaper than business.
Do they have discrete GPUs?

Here's a brand new Asus model with HDMI 1.4.

A brand new Dell with HDMI 1.4.

And a brand new Lenovo with HDMI 1.4.
 
Do they have discrete GPUs?

Here's a brand new Asus model with HDMI 1.4.

A brand new Dell with HDMI 1.4.

And a brand new Lenovo with HDMI 1.4.
No, they all mostly use the Ryzen IGPs.

Do they have discrete GPUs?

Here's a brand new Asus model with HDMI 1.4.

A brand new Dell with HDMI 1.4.

And a brand new Lenovo with HDMI 1.4.
Those support HDMI 2.0 and 2.1 over USB-C
 
No, they all mostly use the Ryzen IGPs.
Right, that could be why then.
Those support HDMI 2.0 and 2.1 over USB-C
That's hardly relevant, is it though? The native ports are HDMI 1.4 and I thought that was what we were talking about, not using some third party adapter.
 
Right, that could be why then.

That's hardly relevant, is it though? The native ports are HDMI 1.4 and I thought that was what we were talking about, not using some third party adapter.
Most of ours include the USB dock.

But my main argument was that anyone looking to purchase this will have a setup that can support it, given it's a 4K 120 Hz display.
 
Most of ours include the USB dock.

But my main argument was that anyone looking to purchase this will have a setup that can support it, given it's a 4K 120 Hz display.
Save for the fact that almost no GPU in existence can comfortably push 4K at over 60 fps, yes, everyone will have a setup like that.
This is a TV in disguise (probably with a side of horrible latency included) that is aimed squarely at console owners.
 
What amazes me is that all these display manufacturers are still failing to put an eARC socket on any of their products. In a modern world where soundbars and AVRs rely on eARC to supply enough bandwidth to support Dolby Atmos, this is a huge oversight.

Sure, you can run a second HDMI 2.1 cable from your GPU directly to the soundbar or AVR, but this inadvertently creates a second 'phantom' monitor within Windows. Which is a pain in the ass, because your mouse cursor can now fall into the abyss of a non-existent screen if you move it too far to the right.
 
Well, another impressive Gigafail product, but did they include a remote detonator this time (an app would be OK)?

:)
 
Better get that Alienware QD-OLED.
It's not 4K though.

What amazes me is that all these display manufacturers are still failing to put an eARC socket on any of their products. In a modern world where soundbars and AVRs rely on eARC to supply enough bandwidth to support Dolby Atmos, this is a huge oversight.

Sure, you can run a second HDMI 2.1 cable from your GPU directly to the soundbar or AVR, but this inadvertently creates a second 'phantom' monitor within Windows. Which is a pain in the ass, because your mouse cursor can now fall into the abyss of a non-existent screen if you move it too far to the right.
It does have eARC and ARC, I just didn't include it, since it seems to be standard for anything with HDMI 2.1 to have at least one eARC-capable port.
 
Then you buy an LCD below 1000. And why and for what kind of work is OLED useless?
Short answer: Any kind of work which requires any kind of static elements being displayed for extended periods of time, so any real world work. My wife got a Thinkpad X1 with an OLED screen a while ago; after just a few weeks you could easily see Excel's interface layered on everything, and after three months the screen was, for all intents and purposes, defective, with permanently burned-in areas. The replacement fared even worse and had to be trashed after just two months.
In light of this, my counter-question would be: For what purpose is OLED superior and therefore should occupy a higher price class? Aside from the obvious, people who like over-saturated colours and kids with "pro-gamer" fantasy believing that a faster pixel response will git them gud? After calibration, any reasonable LCD can achieve almost perfect colour rendering without the need for periodic re-calibration due to uneven subpixel burnout (blue ones go out first on OLED) and doesn't have built-in planned obsolescence.
 
Why do I need Android in my monitor? It's a gimped TV with DP and no TV tuner.
I have Android in my TV; as a tech nerd, it's perfect.
You are totally correct tho, this is a gaming TV minus the tuner. (And that answers almost every other question people asked in the thread)
 
I have Android in my TV; as a tech nerd, it's perfect.
You are totally correct tho, this is a gaming TV minus the tuner. (And that answers almost every other question people asked in the thread)
Most people also didn't get my hints in the news post that it was a TV without a tuner...
Plenty of those on sale here already, by the likes of TCL and even Toshiba I think.
 
Short answer: Any kind of work which requires any kind of static elements being displayed for extended periods of time, so any real world work.
Rtings did a test years ago. Granted, that was with static TV content. It took thousands of hours displaying the same content to even start noticing any burn-in. And that was years ago with older panels. Recently someone did a test on the Nintendo Switch OLED version, where again it took thousands of hours of displaying static content to start noticing burn-in.
My wife got a Thinkpad X1 with an OLED screen a while ago; after just a few weeks you could easily see Excel's interface layered on everything, and after three months the screen was, for all intents and purposes, defective, with permanently burned-in areas. The replacement fared even worse and had to be trashed after just two months.
Laptop panels are not the same as monitor/TV panels.
In light of this, my counter-question would be: For what purpose is OLED superior and therefore should occupy a higher price class?
Anything relating to HDR content. Also situations where pixel response time is important. Also situations where extreme contrast is needed.
Aside from the obvious, people who like over-saturated colours and kids with "pro-gamer" fantasy believing that a faster pixel response will git them gud?
You ask a question only to disqualify gaming right away?
After calibration, any reasonable LCD can achieve almost perfect colour rendering without the need for periodic re-calibration due to uneven subpixel burnout (blue ones go out first on OLED) and doesn't have built-in planned obsolescence.
And OLED can't be calibrated?
And no. LCDs need periodic re-calibration too as they age. Switching to another GPU/system may also require re-calibration.

For example Samsung QD-OLED does not even have blue diodes. Or green ones, or red ones. It has base white diodes that are then converted to colors via QD layer.
So there is no blue diode that would wear out.
 
It's not 4K though.


It does have eARC and ARC, I just didn't include it, since it seems to be standard for anything with HDMI 2.1 to have at least one eARC-capable port.
But the Alienware model has much higher image quality, and much better HDR.
 
But the Alienware model has much higher image quality, and much better HDR.
Hardly apples to apples though, is it?
Someone that wants to game on a 55-inch 4K "monitor" would never consider the Alienware and possibly vice versa.
 
Because they launched it. It was shown at CES, but now it's a real product. Just because you can't buy it doesn't make it less of a real product.
Also, about 80 percent of all press releases that announce new products as launched don't include pricing.
Well, sorry, I didn't mean to make it personal; I'm aware lots of products are "launched" like that.

Anyway, there are lots of products shown at CES that never make it to retail.
 
Rtings did a test years ago. Granted, that was with static TV content. It took thousands of hours displaying the same content to even start noticing any burn-in. And that was years ago with older panels. Recently someone did a test on the Nintendo Switch OLED version, where again it took thousands of hours of displaying static content to start noticing burn-in.
I did phone repairs for a few years, and would often see phones with burned-in OLED screens (heck, even LCDs at times).
We also see photos and video proof of people with burn-in online regularly too.

1. Temporary image retention can be mistaken for burn-in, but it's still unacceptable to a user
2. The settings on the screen and even the ambient weather matter. Testing it in a 20°C dry environment will give very different results to people in hotter countries (my back brick walls were at 65°C this summer)

1,000 hours is only about 42 days of always-on use. Let's say 8 hours a day and you're looking at 125 days, or about 4 months, before you could be experiencing permanent burn-in, by your example.
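(The arithmetic behind those numbers, with the 1,000-hour onset taken as an assumption from the post above rather than a measured spec:)

```python
# Quick arithmetic for the burn-in timeline above.
# The 1,000-hour onset figure is the example quoted in the thread, not a measured spec.

burn_in_onset_hours = 1_000

always_on_days = burn_in_onset_hours / 24   # ~41.7 days of 24/7 use
days_at_8h = burn_in_onset_hours / 8        # 125 days
months_at_8h = days_at_8h / 30              # ~4.2 months

print(f"{always_on_days:.0f} days always-on, "
      f"or {days_at_8h:.0f} days (~{months_at_8h:.1f} months) at 8 hours/day")
```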
 
I did phone repairs for a few years, and would often see phones with burned-in OLED screens (heck, even LCDs at times).
We also see photos and video proof of people with burn-in online regularly too.

1. Temporary image retention can be mistaken for burn-in, but it's still unacceptable to a user
2. The settings on the screen and even the ambient weather matter. Testing it in a 20°C dry environment will give very different results to people in hotter countries (my back brick walls were at 65°C this summer)

1,000 hours is only about 42 days of always-on use. Let's say 8 hours a day and you're looking at 125 days, or about 4 months, before you could be experiencing permanent burn-in, by your example.
First - Portable devices do not use the same OLED panels as stationary OLED devices. They also have fewer pixels to use as backup. So there could be a difference there.
Second - 8 hours, but are you watching static content for 4-8 months at 8 hours a day? I doubt it. If a mobile OLED is active, it is most likely in active use, not idling at the desktop/home screen.
Third - The temperature thing might have merit. I live in a fairly cold environment. We're lucky if we have 20°C days in summer. Mostly it's sub-15°C with overcast and/or rain/wind.
 
First - Portable devices do not use the same OLED panels as stationary OLED devices. They also have fewer pixels to use as backup. So there could be a difference there.
Second - 8 hours, but are you watching static content for 4-8 months at 8 hours a day? I doubt it. If a mobile OLED is active, it is most likely in active use, not idling at the desktop/home screen.
Third - The temperature thing might have merit. I live in a fairly cold environment. We're lucky if we have 20°C days in summer. Mostly it's sub-15°C with overcast and/or rain/wind.
1. Irrelevant, as it's still an example of OLED burn-in
2. Yeah, a lot of people do that and leave screens on for extremely long periods of time - especially with TVs and shared screens for the family, since multiple people can use the display
3. Yeah, we pass 40°C here; electronics do not enjoy it. 45°C with a thunderstorm and bushfires was a fun, fun time.
 