Friday, September 17th 2021

Gigabyte Launches the M32U 4K Gaming Monitor

The M32U will be Gigabyte's fifth gaming monitor and the second not to carry the Aorus brand. Design-wise it looks identical to the M32Q and the specs are very similar too, as both monitors rely on a 31.5-inch Super Speed IPS panel. The big difference here is obviously the resolution, as the M32U sports a 4K 3840x2160 panel, and this time it seems the backlight has improved slightly, as we're looking at a DisplayHDR 400 certification, even though both models deliver a typical brightness of 350 cd/m².

On the other hand, colour saturation isn't quite as good, at 90% DCI-P3 and 123% sRGB, even though this is an 8-bit + FRC panel rather than the M32Q's 8-bit panel. The refresh rate is up to 144 Hz for PCs and 120 Hz for consoles, with a 1 ms MPRT response time. Inputs consist of two HDMI 2.1 ports, one DisplayPort 1.4 with DSC and a USB-C port with DP Alt Mode. The display also has one USB 3.0 Type-B input, three Type-A outputs and a headphone jack, and as with previous Gigabyte displays, this one supports KVM functionality. Finally, we have a pair of built-in 3 W speakers. No word on pricing or availability.
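As a back-of-the-envelope aside (our arithmetic, not Gigabyte's), the DSC support on the DisplayPort 1.4 input is not optional at this resolution and refresh rate — the raw pixel stream alone outruns the link:

```python
# Active-pixel data rate in Gbit/s, ignoring blanking overhead.
def raw_video_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

rate_8bit = raw_video_rate_gbps(3840, 2160, 144, 24)   # 8-bit RGB, ~28.7 Gbit/s
rate_10bit = raw_video_rate_gbps(3840, 2160, 144, 30)  # 10-bit output, ~35.8 Gbit/s

# DisplayPort 1.4 (HBR3) carries roughly 25.92 Gbit/s of payload after
# 8b/10b encoding, so even 8-bit 4K144 needs compression - hence DSC.
DP14_PAYLOAD_GBPS = 25.92
print(rate_8bit > DP14_PAYLOAD_GBPS)  # True
```

Blanking intervals push the real requirement a bit higher still, which only makes the case for DSC stronger.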
Source: Gigabyte

53 Comments on Gigabyte Launches the M32U 4K Gaming Monitor

#26
MxPhenom 216
ASIC Engineer
Tardian: Why?
I play more competitive fps than anything else
Posted on Reply
#28
Mussels
Freshwater Moderator
LETSGO: THAT'S NOT AN OLED GAMING MONITOR/TV...IT'S MEANT FOR VISUAL ARTISTS...LG WILL HOWEVER BE OFFERING A 42 INCH OLED GAMING MONITOR/TV IN MARCH AND ITS EXACTLY WHAT I'M GETTING


JUST WAIT TILL MARCH LIKE I AM AND GET THE NEW LG 42 INCH OLED...IT WILL BE THE BEST OF BOTH WORLDS AND SHALL BE AWESOMELY AMAZING
You might need to start a help thread to figure out how to turn capslock off
Posted on Reply
#29
Valantar
Tardian: www.thecoldwire.com/how-many-fps-can-the-human-eye-see/

I can see a clear difference between 60 and 120 fps, but I remain unconvinced that 240 fps is more than marketing except for a select few. You may be one of them.
Wow, that article was terrible. It doesn't cite a single source, makes nothing but bombastic claims, is extremely poorly written, and includes some truly out-there assertions like "When FPS is lower, around 30, you will notice that the game seems to move a little slower, and images don't look quite as realistic." (my emphasis). Whatever it is you're looking for, that article shows zero signs of being a trustworthy source.

There are clear indications that humans can quite clearly distinguish between 144Hz and 240 or even 360Hz, with quite a lot of (unscientific) blind testing done. I sadly haven't seen any proper studies on this at all, which means unscientific blind testing is the best we've got. There are clearly diminishing returns as you move upwards, but I've seen nothing convincing suggesting that human vision is that limited. IMO, part of this is the question being wrong in the first place: it presupposes that human vision functions in the same way as a camera, capturing whole, discrete frames in sequence and presenting them relatively unmodified. This is not necessarily the case, and without clarification of that the question can't be answered. Especially given the vast amount of processing our brains do on the signals coming in through our optic nerves (seamlessly merging two very wide-angle images into one, filling in blind spots, and much more), the assumption that human vision is understandable through a simplistic question like "how many fps can it see" is deeply flawed.
Posted on Reply
#30
Mussels
Freshwater Moderator
Humans can see one frame different in thousands, as tested by the USAF decades ago

It's not about processing the entire image and every frame, it's about spotting that ONE piece of change as fast as possible, be it a single pixel or half the screen
60Hz is 16.6ms per frame
240Hz is ~4.17ms per frame

(This is ignoring Vsync off and the potential for certain pixels to refresh faster than others with smearing/tearing, and assuming a Vsync-on scenario because the math is easier to explain. The advantage always goes to 240Hz anyway)

Depending on the moment that pixel changes, there is a maximum ~12.5ms advantage to the 240Hz player. For a casual player or someone on Australian internet, that's nothing. To a pro gamer on LAN at a tournament? It's everything.
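The arithmetic above can be sketched in a few lines (assuming, as the post does, whole-frame Vsync-on updates):

```python
def frame_time_ms(refresh_hz):
    """Time between refreshes in milliseconds."""
    return 1000.0 / refresh_hz

t60 = frame_time_ms(60)    # ~16.67 ms
t240 = frame_time_ms(240)  # ~4.17 ms

# Worst case: the on-screen change lands just after a 60Hz refresh but is
# caught by the very next 240Hz refresh - roughly one 60Hz frame time minus
# one 240Hz frame time.
max_advantage_ms = t60 - t240
print(round(max_advantage_ms, 1))  # 12.5
```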
Posted on Reply
#31
nguyen
Mussels: Humans can see one frame different in thousands, as tested by the USAF decades ago

It's not about processing the entire image and every frame, it's about spotting that ONE piece of change as fast as possible, be it a single pixel or half the screen
60Hz is 16.6ms per frame
240Hz is ~4.17ms per frame

(This is ignoring Vsync off and the potential for certain pixels to refresh faster than others with smearing/tearing, and assuming a Vsync-on scenario because the math is easier to explain. The advantage always goes to 240Hz anyway)

Depending on the moment that pixel changes, there is a maximum ~12.5ms advantage to the 240Hz player. For a casual player or someone on Australian internet, that's nothing. To a pro gamer on LAN at a tournament? It's everything.
144Hz is the best, 6.9ms baby :roll:.
Posted on Reply
#32
Shou Miko
Chomiq: Well there's 27", 144 Hz miniLED but I guess that one will be "not at that price".
tftcentral.co.uk/news/lg-display-latest-panel-development-plans-july-2021

I wonder what will they pull off with that "IPS Black".

Back to the OG topic - this is the same panel as the one used for FI32U, it's going to be cheaper than that and it will have a "better" stand (one you can easily ditch for a VESA mount and not cry over RGB loss) and possibly better tuned overdrive mode (maybe they will learn something from FI32U).
It's also available for preorder at Newegg for $799 with... no refunds and no returns.
Mini-LED sounds nice for a monitor, but I'm afraid of the price, because they want it certified for HDR1000.

I know the certifications exist for a reason, but they cost so much that companies skip them from time to time and just have the calibration done professionally without much extra cost.

Because buying a £1000-2000 gaming monitor is a no for me. Even the Asus ROG Strix XG27UQ I own was bought because I fell in love with it, but after Asus proved themselves useless yet again when I contacted their support about this monitor and nobody really had an answer, I am done paying Asus for overpriced stuff.

With that said, I can understand the Alan Walker collaboration costs something because it's a custom Zephyrus G14; I just wish they had used an RTX 3060 or something instead, because you can get the regular Zephyrus G14 with the same specs but an RTX 3060 for the same price or a bit over that of the AW edition.
Posted on Reply
#33
Valantar
Mini-LED looks great, but I can't imagine it becoming even remotely affordable for quite a few years yet. Am I the only one wondering why we haven't seen more (or any, really) dual-LCD panels? The ones with a front high resolution, full-color LCD layer, and a lower resolution, black and white panel behind this for improved contrast and local dimming? I get that those lose a significant amount of brightness over single layer ones (20-30% IIRC) which makes them less efficient and in need of a more powerful backlight, but this just seems so much simpler than mini-LED or other similar tech. Given how cheap lower resolution IPS panels are, and how relatively simple to drive a b/w panel would be, I can't imagine this not being a good solution overall. The rear panel would need to be accurately aligned (can't imagine that's a problem, really), and would need to roughly match the response times of the front panel (could make for some weird artefacting otherwise), and I guess the main challenge would be a driver that can handle both properly (including intelligently averaging light levels for each 4-pixel grid). But how hard can that be? It certainly doesn't sound like an insurmountable challenge. I'd certainly be willing to live with "only" 4-pixel square dimming zones if that meant 100 000:1 contrast at a good price point.
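The per-zone averaging described above is simple enough to sketch. Assuming a hypothetical rear mono panel at half the front panel's resolution, each rear cell could take the mean of the 2x2 block of front-pixel luminance values in front of it:

```python
import numpy as np

def rear_panel_levels(front_luma):
    """front_luma: 2D luminance array with even dimensions.
    Returns half-resolution drive levels for a rear mono panel,
    one cell per 2x2 block of front pixels."""
    h, w = front_luma.shape
    blocks = front_luma.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

# A lone bright pixel dims its whole 2x2 zone to 25% backlight:
front = np.array([[1.0, 0.0],
                  [0.0, 0.0]])
print(rear_panel_levels(front))  # [[0.25]]
```

A real driver would more likely take the block maximum rather than the mean, so a single bright highlight isn't crushed by its dark neighbours — which hints at why "intelligently" is doing some work in that sentence.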
Posted on Reply
#34
Tardian
The answer is more complex than any of the answers above. There is a clear difference between gaming at 70fps and 300fps on a 60Hz monitor. There are advantages in some (but not all) games to increasing the refresh rate from 60 to 240Hz. However, input lag from the keyboard and mouse also needs to be considered. Less seasoned gamers probably get the most benefit. A great gamer will usually prevail regardless of equipment (to a certain degree). Internet ping means some countries just don't stand a chance. Reaction times, game training, muscle memory, etc. all count. I'd rather game on a 120Hz OLED than a 240Hz IPS; however, others would disagree. I don't game professionally or we would starve.
Posted on Reply
#35
Valantar
Tardian: The answer is more complex than any of the answers above. There is a clear difference between gaming at 70fps and 300fps on a 60hz monitor. There are advantages in some (but not all) games of increasing the Hz rate from 60 to 240. However, input lag from the keyboard and mouse also needs to be considered. Less seasoned gamers probably get the most benefit. A great gamer will usually prevail regardless of equipment (to a certain degree). Ping rates for the internet mean some countries just don't stand a chance. Reaction times, game training, muscle memory, etc all count. I'd rather game on 120hz OLED than a 240hz IPS, however, others would disagree. I don't game professionally or we would starve.
I don't think anyone here is arguing that other factors don't play a significant role, including both other equipment, rendered frame rates, and player skills, preferences and training. None of that takes away from the fact that, all else being equal, a higher refresh rate allows for faster reaction times and smoother motion tracking. Of course, all else is very rarely equal, and no two LCD panels have identical characteristics, let alone when comparing between panel technologies. OLED has unbeatable pixel response times, which for some people might be preferable to a higher refresh rate LCD with slower response times even at a lower refresh rate. All of these factors are tightly interconnected, so nothing you can read off a spec sheet (or even an in-depth review) will give any type of definitive answer - especially as user perceptions are highly contextual and variable, especially over time. I've never used a 240Hz (or higher) monitor, and I kind of doubt I ever will - I don't play those types of games, and other factors matter more to me. The main "benefit" of such a high refresh rate would likely be to highlight how poor my reaction times and aim are :P And there are of course diminishing returns as you go higher, as the Hz-to-ms ratio gets ever smaller - 60 to 120Hz goes from 16.7 to 8.3ms - a reduction of over 8ms and as such a major change, but on the other hand going from 240Hz to 360Hz is just a 1.39ms reduction from 4.17 to 2.78ms. The proposed future high-end refresh rate of 480Hz is an even smaller 0.7ms reduction. So while these changes are perceptible, and can give an advantage given that the player has sufficiently fast reaction speeds and training to make use of this faster perception, the gains are ever smaller, and the benefits are ever more specialized.
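The diminishing-returns curve described here falls straight out of the frame-time arithmetic:

```python
def frame_time_ms(hz):
    return 1000.0 / hz

# Each step up in refresh rate saves less absolute time than the one before.
for lo, hi in [(60, 120), (120, 240), (240, 360), (360, 480)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz: {saved:.2f} ms less per frame")
# 60 -> 120 Hz: 8.33 ms less per frame
# 120 -> 240 Hz: 4.17 ms less per frame
# 240 -> 360 Hz: 1.39 ms less per frame
# 360 -> 480 Hz: 0.69 ms less per frame
```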
Posted on Reply
#36
nguyen
Valantar: I don't think anyone here is arguing that other factors don't play a significant role, including both other equipment, rendered frame rates, and player skills, preferences and training. None of that takes away from the fact that, all else being equal, a higher refresh rate allows for faster reaction times and smoother motion tracking. Of course, all else is very rarely equal, and no two LCD panels have identical characteristics, let alone when comparing between panel technologies. OLED has unbeatable pixel response times, which for some people might be preferable to a higher refresh rate LCD with slower response times even at a lower refresh rate. All of these factors are tightly interconnected, so nothing you can read off a spec sheet (or even an in-depth review) will give any type of definitive answer - especially as user perceptions are highly contextual and variable, especially over time. I've never used a 240Hz (or higher) monitor, and I kind of doubt I ever will - I don't play those types of games, and other factors matter more to me. The main "benefit" of such a high refresh rate would likely be to highlight how poor my reaction times and aim are :p And there are of course diminishing returns as you go higher, as the Hz-to-ms ratio gets ever smaller - 60 to 120Hz goes from 16.7 to 8.3ms - a reduction of over 8ms and as such a major change, but on the other hand going from 240Hz to 360Hz is just a 1.39ms reduction from 4.17 to 2.78ms. The proposed future high-end refresh rate of 480Hz is an even smaller 0.7ms reduction. So while these changes are perceptible, and can give an advantage given that the player has sufficiently fast reaction speeds and training to make use of this faster perception, the gains are ever smaller, and the benefits are ever more specialized.
Higher refresh doesn't always mean better perceived motion clarity though: a monitor with a 120Hz ULMB mode (backlight strobing) will have better motion clarity than a 240Hz screen without ULMB.
Now here comes the monster e-sports screen, the Acer VX259Q at 390Hz with ULMB (it's called VRB now)
Posted on Reply
#37
Valantar
nguyen: Higher refresh doesn't always mean better perceived motion clarity though: a monitor with a 120Hz ULMB mode (backlight strobing) will have better motion clarity than a 240Hz screen without ULMB.
Now here comes the monster e-sports screen, the Acer VX259Q at 390Hz with ULMB (it's called VRB now)
That's why I said all else being equal ;) You're entirely right though - backlight strobing can drastically improve LCD motion clarity (at the cost of brightness), though it is highly dependent on being implemented well. As far as I know, ULMB/ELMB/VRB/whatever (and their -Sync variants) are all just different brands' marketing terms for refresh-rate-synced backlight strobing.
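The clarity gain from strobing comes down to persistence: perceived smear scales with how long each frame stays lit while your eye tracks motion. A rough illustration (the speed and pulse-width numbers are assumptions, not measurements of any specific monitor):

```python
def blur_px(speed_px_per_s, persistence_ms):
    """Approximate smear width, in pixels, for eye-tracked motion:
    how far the object moves while a single frame is lit."""
    return speed_px_per_s * persistence_ms / 1000.0

SPEED = 1000  # a fast-moving object, in px/s

# Sample-and-hold 240Hz: each frame stays lit for the full ~4.17 ms.
print(blur_px(SPEED, 1000 / 240))  # ~4.2 px of smear

# 120Hz with a ~1 ms backlight strobe: lit only during the pulse.
print(blur_px(SPEED, 1.0))         # ~1 px - why strobed 120Hz can look clearer
```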
Posted on Reply
#38
Metroid
SLObinger: I have a 55" LG OLED and it absolutely kicks ass as a gaming monitor and TV (seriously amazing) but not so good for just about anything else on a PC. Hard to explain but text just looks bad.
55" is too big for a desktop PC even at 4K; at 8K it would be all right, but the pixel density is not as good as on smaller panels. 8K at 60Hz or 4K at 120Hz? I would prefer 4K at 120Hz, because we don't have GPUs that can handle 8K with good framerates yet, and that will take some time - six years, give or take.
Posted on Reply
#39
Valantar
Metroid: 55" is too big for a desktop PC even at 4K; at 8K it would be all right, but the pixel density is not as good as on smaller panels. 8K at 60Hz or 4K at 120Hz? I would prefer 4K at 120Hz, because we don't have GPUs that can handle 8K with good framerates yet, and that will take some time - six years, give or take.
I don't think 55" is reasonably usable at desk viewing distances regardless of the resolution - it'll still cause severe neck strain if you are to have even the faintest hope of making use of the whole display. At 1m viewing distance (which is a bit longer than the recommended arm's length, typically 70-80cm) the focal area of human stereoscopic vision (where visual acuity is sufficient to read etc., which is 30°) is ~50cm across. Our central field of view, where our eye is focused, is ~10° wide, or less than 20cm across. Our near peripheral vision covers ~60°, or ~120cm at 1m. A 55" TV is 122cm wide. That means that at 1m - which is already a bit farther than ideal for comfort - you're focused on just 1/6th of the width of the display at any given time, can see less than half of it with reasonable sharpness, and can just see the rest in binocular peripheral vision (assuming your head is pointed towards the middle of the display). Monocular peripheral vision spans way beyond this, but having a monitor in peripheral vision is ... kind of useless unless its job is to show a huge blinking alert or something similar. That means you'd need very significant neck rotation - up to 30° to each side - to make full use of a 55" TV as a monitor. If that kind of movement is maintained frequently for hours at a time, you're likely looking at a serious neck injury before long. There are those "lucky" enough to avoid that, but I still don't see how using a 55" TV as a monitor is anything but very uncomfortable.
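The visual-angle figures above check out under simple trigonometry - the width subtended by an angle θ at distance d is 2·d·tan(θ/2):

```python
import math

def span_cm(angle_deg, distance_cm=100):
    """Width, in cm, covered by a given visual angle at a given distance."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg / 2))

print(round(span_cm(30)))  # ~54 cm: the sharp stereoscopic reading zone at 1 m
print(round(span_cm(10)))  # ~17 cm: the central focused field
print(round(span_cm(60)))  # ~115 cm: near peripheral vision, vs a 122 cm wide 55" TV
```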

Also, of course, 8k gaming is really stupid. Even 2160p gaming is. There are barely any combinations of screen size and viewing distance where there is a perceptible difference in visual quality between 1440p and 2160p - they exist, but they're rare. There are no such combinations for ... 4320p? - you'd either be sitting too far away to notice, or so close that you're not seeing the whole display. Not to mention the 4x increase in render complexity from an already ridiculously taxing 2160p. There might be exceptions, like extremely slow-paced games (Anno, Civilization, etc.), but ... is there any real benefit still? I sincerely hope 4320p gaming will never, ever be a thing. It's just unnecessary and wasteful. We're reaching a reasonable point of diminishing returns in gaming resolution with 2160p for any practical monitor size.
Posted on Reply
#40
Shou Miko
Monday to Friday I go from IPS (my 4K monitor), to OLED (iPhone), to VA (the 1080p monitor at work), to IPS again, then the 65" LG CX OLED TV at my dad's house, and then home to my IPS monitor - and sometimes I use my LG 48" CX OLED TV.

I worry about burn-in on my dad's LG OLED TV; he doesn't really, even though I told him about it and went through his menu settings. This is kind of why I hope mini-LED will be the stepping stone where I don't have to worry about burn-in, because I love the colors on my OLED TV more than on my IPS monitor. Even my LG 55" TV in the bedroom is IPS too.
Posted on Reply
#41
dj-electric
Sooo.... Need confirmation here
Is this basically an FI32U, but much cheaper?

Is Gigabyte really doing this again?
Posted on Reply
#42
Valantar
dj-electric: Sooo.... Need confirmation here
Is this basically an FI32U, but much cheaper?

Is Gigabyte really doing this again?
It's certainly an FI32U with ... let's see. Differences:
Valantar: except for the Aorus having a mic jack, one less USB port, 20W less "AC input max" (is that the PSU's rating, or peak consumption? no idea.), a pivoting stand (but with less swivel angle), and a bundled USB-C cable.
That's it, at least from the spec sheet. Perhaps the Aorus has better binned panels? Either way, it sure looks like a cheaper version of the same monitor.
Posted on Reply
#43
Tartaros
Valantar: That's why I said all else being equal ;) You're entirely right though - backlight strobing can drastically improve LCD motion clarity (at the cost of brightness), though it is highly dependent on being implemented well. As far as I know ULMB/ELMB/VRB/whatever (and their -Sync variants) are all marketing terms for various brands for refresh rate synced backlight strobing.
ULMB is the holy grail, but anytime I want to use it I have to turn off the lights - it just halves the brightness on my monitors. It's a real pity.
Posted on Reply
#44
dj-electric
Valantar: It's certainly an FI32U with ... let's see. Differences:

That's it, at least from the spec sheet. Perhaps the Aorus has better binned panels? Either way, it sure looks like a cheaper version of the same monitor.
Gigabyte really has to stop doing that. It was silly the first 3 times, and now it's just selling an "AORUS" logo for $500+.
Posted on Reply
#46
Unregistered
SLObinger: I have a 55" LG OLED and it absolutely kicks ass as a gaming monitor and TV (seriously amazing) but not so good for just about anything else on a PC. Hard to explain but text just looks bad.
LG OLEDs use a WRGB subpixel layout; I believe Windows' subpixel text rendering doesn't handle it properly.
#49
phanbuey
I use a CX48 for work and gaming -- the fonts at 125% scaling, with me sitting 3-3.5' away, are sharper than on my 1440p 27". I just leave it on 4:2:0 during desktop SDR work and that actually makes fonts perfect - HDR puts it automatically to 4:4:4 for gaming.

Gaming is just another planet.
Posted on Reply
#50
preseznik
Valantar: I don't think 55" is reasonably usable at desk viewing distances regardless of the resolution - it'll still cause severe neck strain if you are to have even the faintest hope of making use of the whole display. At 1m viewing distance (which is a bit longer than the recommended arm's length, typically 70-80cm) the focal area of human stereoscopic vision (where visual acuity is sufficient to read etc., which is 30°) is ~50cm across. Our central field of view, where our eye is focused, is ~10° wide, or less than 20cm across. Our near peripheral vision covers ~60°, or ~120cm at 1m. A 55" TV is 122cm wide. That means that at 1m - which is a bit too far for eye strain and comfort - you're at any time focused on just 1/6th of the width of the display at any given time, can see less than half of it with reasonable sharpness, and can just see the rest in binocular peripheral vision (assuming your head is pointed towards the middle of the display). Monocular peripheral vision spans way beyond this, but having a monitor in peripheral vision is ... kind of useless unless its job is to show a huge blinking alert or something similar. That means you'd need very significant neck rotation - up to 30° to each side - to make full use of a 55" TV as a monitor. If that kind of movement is maintained frequently for hours at a time, you're likely looking at a serious neck injury in a bit of time. There are those "lucky" enough to avoid that, but I still don't see how using a 55" TV as a monitor is anything but very uncomfortable.

Also, of course, 8k gaming is really stupid. Even 2160p gaming is. There are barely any combinations of screen size and viewing distance where there is a perceptible difference in visual quality between 1440p and 2160p - they exist, but they're rare. There are no such combinations for ... 4320p? - you'd either be sitting too far away to notice, or so close that you're not seeing the whole display. Not to mention the 4x increase in render complexity from an already ridiculously taxing 2160p. There might be exceptions, like extremely slow-paced games (Anno, Civilization, etc.), but ... is there any real benefit still? I sincerely hope 4320p gaming will never, ever be a thing. It's just unnecessary and wasteful. We're reaching a reasonable point of diminishing returns in gaming resolution with 2160p for any practical monitor size.
I've been using my LG CX 55'' on my desk since launch and a 4k Samsung 55'' for a few years prior to that. Regular depth computer desk, too.
You're just wrong.
It's super comfortable and it's annoying having to go to anything smaller (at work, LAN parties, for example).
No "neck rotation", ever, in all of these years.
Posted on Reply