Thursday, March 22nd 2018

NVIDIA Expects Partners to Release 4K, 144 Hz G-Sync Displays With HDR in April

Reports have started doing the rounds that users might finally see NVIDIA's dream of 4K, 144 Hz gaming come to fruition as early as next month. NVIDIA's approach to establishing a platform for a premium 4K gaming experience meant that manufacturers - ASUS and Acer being the two foremost examples for this story - had to opt for a single panel solution, based on AU Optronics' M270QAN02.2 AHVA panel. This is because NVIDIA wanted gamers to be treated to a fully integrated solution boasting a 3840×2160 resolution, a 144 Hz refresh rate, a 1,000-nit peak brightness, a direct LED backlighting system with 384 local dimming zones, and a quantum dot film to enable HDR10 and coverage of the DCI-P3 color gamut.

However, with such stringent requirements, NVIDIA's monitor partners had to accept whatever constraints arose on the panel manufacturer's side of the equation, which ultimately pushed the manufacturers' models - the Acer Predator X27 and ASUS ROG Swift PG27UQ - from a 2017 release date to what is now expected to be a firm April 2018 one. Gamers might thus be in for the impending release of some of the best monitors in the industry when it comes to a premium, high refresh-rate gaming experience. Now, where are those mainstream OLED panels with at least 900 nits of brightness I wanted to get my hands on?
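As a rough illustration of why a spec sheet like this pushes current display connections to their limits, here is a back-of-the-envelope sketch in Python. The 10-bit color depth, the ~20% blanking overhead, and DisplayPort 1.4's ~25.92 Gbit/s effective payload figure are assumptions of this illustration, not details from the report:

# Back-of-the-envelope data-rate estimate for a 4K 144 Hz HDR signal.
# Assumptions (not from the article): 10 bits per color channel for HDR,
# ~20% overhead for horizontal/vertical blanking, and DisplayPort 1.4's
# effective payload of ~25.92 Gbit/s (32.4 Gbit/s raw, 8b/10b encoding).

def required_gbps(width, height, refresh_hz, bits_per_channel=10, blanking_overhead=0.20):
    bits_per_pixel = 3 * bits_per_channel      # R, G and B channels
    pixel_rate = width * height * refresh_hz   # active pixels per second
    total_bits = pixel_rate * bits_per_pixel * (1 + blanking_overhead)
    return total_bits / 1e9                    # gigabits per second

DP14_EFFECTIVE_GBPS = 25.92

needed = required_gbps(3840, 2160, 144)
print(f"Estimated link rate needed: {needed:.1f} Gbit/s")
print(f"DisplayPort 1.4 payload:    {DP14_EFFECTIVE_GBPS:.1f} Gbit/s")
print("Fits without compromises" if needed <= DP14_EFFECTIVE_GBPS
      else "Exceeds DP 1.4 - expect chroma subsampling or a lower refresh rate")

Under those assumptions the signal works out to roughly 43 Gbit/s, well above what DisplayPort 1.4 can carry, which is why driving such panels at the full 144 Hz typically involves chroma subsampling or a reduced refresh rate.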
Source: AnandTech

64 Comments on NVIDIA Expects Partners to Release 4K, 144 Hz G-Sync Displays With HDR in April

#26
nickbaldwin86
rather have a 34" UW with higher res and at least 144hz... I have a 34" UW panel that does 166hz (low res :( ) but going down from that would be painful and going smaller than a 34" UW is a no go

I just hope this means that the bandwidth limitation is going to be lifted soon... i.e. a 2080 Ti... I am not about SLI; too many times I have done it and too many times I have dealt with its shortfalls.
Posted on Reply
#27
BadFrog
TheLostSwede: Are you saying that extra one inch is the deal breaker between 1440p and 4k?
I wouldn't say deal breaker. More personal preference? I installed a 27 inch 1080p monitor for the president of my company and it looked pixelated (IMO), then we installed a 28 inch 2160p and it almost seemed too small (IMO), then we installed a 27 inch 1440p and it seemed just right. Not big (pixelated) and not too small (he's old, about 65, and needed a monitor that had a lot of real estate to show his vast empire of properties to his rich friends) lol
Posted on Reply
#28
jabbadap
BadFrog: I wouldn't say deal breaker. More personal preference? I installed a 27 inch 1080p monitor for the president of my company and it looked pixelated (IMO), then we installed a 28 inch 2160p and it almost seemed too small (IMO), then we installed a 27 inch 1440p and it seemed just right. Not big (pixelated) and not too small (he's old, about 65, and needed a monitor that had a lot of real estate to show his vast empire of properties to his rich friends) lol
Well, this is a gaming monitor and a very bad option for your use case anyway. But I would say it depends more on how the OS handles high DPI than anything else.
Posted on Reply
#29
TheLostSwede
News Editor
BadFrog: I wouldn't say deal breaker. More personal preference? I installed a 27 inch 1080p monitor for the president of my company and it looked pixelated (IMO), then we installed a 28 inch 2160p and it almost seemed too small (IMO), then we installed a 27 inch 1440p and it seemed just right. Not big (pixelated) and not too small (he's old, about 65, and needed a monitor that had a lot of real estate to show his vast empire of properties to his rich friends) lol
Again, each to their own.

In addition to my 27" 4k display, I have a 25" Dell 1440p display and I think that's about the right size for that resolution. Prior to that I had a 23" 2048x1152 display and that felt about right for that size...

I just like to have a lot of desktop real estate when I'm not playing games on my screen, as I actually use it for work as well as playing games on it. I don't want to have to move my head around just to see the entire screen, so going much larger doesn't seem practical to me in that sense.
Posted on Reply
#30
jabbadap
TheLostSwede: Again, each to their own.

In addition to my 27" 4k display, I have a 25" Dell 1440p display and I think that's about the right size for that resolution. Prior to that I had a 23" 2048x1152 display and that felt about right for that size...

I just like to have a lot of desktop real estate when I'm not playing games on my screen, as I actually use it for work as well as playing games on it.
Ooh true 2K 16:9 display, some CRT maybe?
Posted on Reply
#31
TheLostSwede
News Editor
jabbadap: Ooh true 2K 16:9 display, some CRT maybe?
Nope, Samsung made a couple and Acer made a couple. Mine was a Samsung. The backlight slowly faded though, good old days of fluorescent backlights...
Posted on Reply
#32
BadFrog
jabbadap: Well, this is a gaming monitor and a very bad option for your use case anyway. But I would say it depends more on how the OS handles high DPI than anything else.
I thought we were having a side conversation about 4k and sizes rather than the actual monitor. But I agree, I wouldn't be installing this monitor for business work.
TheLostSwede: Again, each to their own.

In addition to my 27" 4k display, I have a 25" Dell 1440p display and I think that's about the right size for that resolution. Prior to that I had a 23" 2048x1152 display and that felt about right for that size...

I just like to have a lot of desktop real estate when I'm not playing games on my screen, as I actually use it for work as well as playing games on it. I don't want to have to move my head around just to see the entire screen, so going much larger doesn't seem practical to me in that sense.
Yes, to each their own. Not all eyes are equal.
Posted on Reply
#33
Paganstomp
When I see benchmarks with the newest games that can push 144 fps+ @ 4K with MAX settings, I might raise an eyebrow. But I know that is going to be a good long while.
Posted on Reply
#34
ensabrenoir
....I honestly believe that NVIDIA has been sandbagging for so long that they actually do have something that can handle this......for a price of course
Posted on Reply
#35
jabbadap
Paganstomp: When I see benchmarks with the newest games that can push 144 fps+ @ 4K with MAX settings, I might raise an eyebrow. But I know that is going to be a good long while.
Uhm, right, why? It has G-Sync, so staying within the 30-144 range (though preferably over 40 fps) should be enough, and that's the crucial point of having VRR in the first place. Those high frame rates are usually only needed for competitive gaming anyway, and lowering graphical fidelity is no problem in those kinds of use cases (it might even be preferable, to avoid the distraction of unneeded graphical effects).
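To put that 30-144 range into frame-time terms, here is a minimal sketch using plain 1000 ms / fps arithmetic; the sample rates listed are illustrative and not taken from the post above:

# Frame times across a 30-144 Hz variable refresh rate (VRR) window.
# Simple arithmetic only: frame_time_ms = 1000 / fps.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 40, 60, 100, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# Within the window, a G-Sync display refreshes whenever a new frame arrives,
# so any of these rates is presented without tearing or added judder.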
Posted on Reply
#36
nickbaldwin86
ensabrenoir: ....I honestly believe that NVIDIA has been sandbagging for so long that they actually do have something that can handle this......for a price of course
But they are the leader and AMD has nothing close to offer, so they have no reason to release anything new when they are making BANK! on the current gens and pumping them out. They are only competing against themselves; I am sure they could release a 2080 Ti today, but why would they?
Posted on Reply
#37
HammerON
The Watchful Moderator
I will be interested in seeing where these are priced. Still waiting to upgrade from my old 30" Dell.
Posted on Reply
#38
evernessince
Nordic: My grandfather can't tell the difference between 480p and 1080p at any distance. A person's vision is a big factor.
I don't think taking an extreme example is a good idea when advising the average person. I can see the benefits of 4K and my vision is awful. That's largely thanks to the existence of glasses.
Posted on Reply
#39
Slacker
Coming from a 27" 4k monitor to a 38" 3840x1600 monitor, I don't see no text difference being less crisp than the 4k monitor. I'd say the 38" is better than the 27" in almost all categories. A 40" 4k monitor with 144hz HDR is a much better sell. But knowing that it has G-Sync it would be hella expensive
Posted on Reply
#40
Prima.Vera
Once you go 21:9 you cannot go back.
Posted on Reply
#41
rtwjunkie
PC Gaming Enthusiast
jabbadap: Uhm, right, why? It has G-Sync, so staying within the 30-144 range (though preferably over 40 fps) should be enough, and that's the crucial point of having VRR in the first place. Those high frame rates are usually only needed for competitive gaming anyway, and lowering graphical fidelity is no problem in those kinds of use cases (it might even be preferable, to avoid the distraction of unneeded graphical effects).
And how many people actually game competitively?

Meanwhile, in the real world, people want all the details on, the way a game was designed to be seen. The GPU that can do that, especially with these monitor specs, is that elusive white whale (or unicorn, for those who prefer).

Edit: I realized there might be some differing views on “competitive”. To me, this means the pros, who get paid. Otherwise I see it as recreational.
Posted on Reply
#42
HopelesslyFaithful
ironcerealbox
Maybe other people play older games or games that don't require an ass ton of GPU? I mean, I do have 1200+ games and I have been using 4K for gaming forever, but it's at 60hz and I can't wait to get an ULMB 120hz screen for RTS, Total War gaming and RPGs. Those never require a lot of GPU... minus the newest Total War games.
Posted on Reply
#43
StrayKAT
I can't even get Nvidia to work properly on 4K TVs (I've heard it's not a prob with AMD). Windows HDR mode creates an awful dimming of the screen. I've seen others complaining of the same thing for the past year too.

So yeah, screw them. When Vega gets cheaper, I'm jumping ship.
Posted on Reply
#44
jabbadap
rtwjunkie: And how many people actually game competitively?

Meanwhile, in the real world, people want all the details on, the way a game was designed to be seen. The GPU that can do that, especially with these monitor specs, is that elusive white whale (or unicorn, for those who prefer).

Edit: I realized there might be some differing views on “competitive”. To me, this means the pros, who get paid. Otherwise I see it as recreational.
Yeah, I should have said fast-paced multiplayer games or esports (god I hate that term). If the game is not made by id Software, ~100 fps at 4K is a kind of rare imaginary animal.
Posted on Reply
#45
Vayra86
I simply refuse to pay for Gsync due to its vendor lock-in. It's a bad practice for any consumer that cares about the marketplace. You keep monitors for multiple GPU upgrades, so it just doesn't make proper sense to me.

And having played on high refresh, you do get accustomed to a snappiness in everything that 60 fps/Hz or lower just can't offer, Gsync or not, while 4K resolution detail is lost for the most part in moving images. I just don't see the merit of adding such a huge GPU performance requirement for it.
Posted on Reply
#46
nickbaldwin86
Vayra86: I simply refuse to pay for Gsync due to its vendor lock-in. It's a bad practice for any consumer that cares about the marketplace. You keep monitors for multiple GPU upgrades, so it just doesn't make proper sense to me.

And having played on high refresh, you do get accustomed to a snappiness in everything that 60 fps/Hz or lower just can't offer, Gsync or not, while 4K resolution detail is lost for the most part in moving images. I just don't see the merit of adding such a huge GPU performance requirement for it.
Once you play on Gsync and 144hz+ you cannot go back. I can make sense of what you are saying, but I still think Gsync is worth the extra pennies I spent.
Posted on Reply
#47
HopelesslyFaithful
Vayra86: I simply refuse to pay for Gsync due to its vendor lock-in. It's a bad practice for any consumer that cares about the marketplace. You keep monitors for multiple GPU upgrades, so it just doesn't make proper sense to me.

And having played on high refresh, you do get accustomed to a snappiness in everything that 60 fps/Hz or lower just can't offer, Gsync or not, while 4K resolution detail is lost for the most part in moving images. I just don't see the merit of adding such a huge GPU performance requirement for it.
Let me know when monitor manufacturers get off their ass and make built-in strobe backlighting with a decent panel. Only BenQ, as far as I know, has done that, and the panels they used in those are trash compared to what is being offered today. When was the last one even sold? It was a trash TN and years ago last I saw.

Only Nvidia supports backlight strobing via ULMB, so let me know when manufacturers actually make a damn product that has it built in... otherwise my options are ULMB.... or no ULMB..... the latter isn't a valid option.....
nickbaldwin86: Once you play on Gsync and 144hz+ you cannot go back. I can make sense of what you are saying, but I still think Gsync is worth the extra pennies I spent.
Gsync is lame. ULMB is where it is at...though there is supposedly a way to trick it so both work at once.
Posted on Reply
#48
TristanX
Announcement in April (on the 1st), sales in July :)
They will announce only the price, so everyone can be prepared when sales begin :)
Posted on Reply
#49
Vayra86
nickbaldwin86: Once you play on Gsync and 144hz+ you cannot go back. I can make sense of what you are saying, but I still think Gsync is worth the extra pennies I spent.
I'm not contesting the use of Gsync, especially (in fact, exclusively) when you dip below 60 FPS; the advantages are clear. But high refresh and Gsync in fact exclude one another: you don't need it when you are pushing 100+ FPS. At that point you can just use Fast Sync, which is free AND combines well with all sorts of other monitor goodies like a strobing backlight.
Posted on Reply
#50
nickbaldwin86
HopelesslyFaithful: Let me know when monitor manufacturers get off their ass and make built-in strobe backlighting with a decent panel. Only BenQ, as far as I know, has done that, and the panels they used in those are trash compared to what is being offered today. When was the last one even sold? It was a trash TN and years ago last I saw.

Only Nvidia supports backlight strobing via ULMB, so let me know when manufacturers actually make a damn product that has it built in... otherwise my options are ULMB.... or no ULMB..... the latter isn't a valid option.....

Gsync is lame. ULMB is where it is at...though there is supposedly a way to trick it so both work at once.
I honestly cannot tell you the difference between the two; the difference is SO small. But a monitor with and without Gsync... totally obvious! A monitor at 60hz and one at 144hz, clear as a night vs day difference.
TristanX: Announcement in April (on the 1st), sales in July :)
They will announce only the price, so everyone can be prepared when sales begin :)
So in April I stop eating to save money for July purchases.... got it!
Posted on Reply