Thursday, March 22nd 2018

NVIDIA Expects Partners to Release 4K, 144 Hz G-Sync Displays With HDR in April

Reports have started doing the rounds that users might finally see NVIDIA's dream of 4K, 144 Hz gaming come to fruition as early as next month. NVIDIA's approach to establishing a platform for a premium 4K gaming experience meant that manufacturers - of which ASUS and Acer are two of the foremost examples for this story - had to opt for a single panel solution, based on AU Optronics' M270QAN02.2 AHVA panel. This is because NVIDIA wanted gamers to be treated to a fully integrated solution boasting a 3840×2160 resolution, a 144 Hz refresh rate, 1000-nit brightness, a direct-LED backlighting system with 384 zones, and a quantum dot film to enable HDR10 and coverage of the DCI-P3 color gamut.

However, with such stringent requirements, NVIDIA's monitor partners had to accept whatever constraints arose from the panel manufacturer's side of the equation, which ultimately resulted in a delay for the manufacturers' models - the Acer Predator X27 and ASUS ROG Swift PG27UQ - from a 2017 release date to what is now expected to be a firm April 2018 launch. Gamers might thus be in for the impending release of some of the best monitors in the industry when it comes to a premium, high refresh-rate gaming experience. Now, where are those mainstream OLED panels with at least 900 nits brightness I wanted to get my hands on?
Source: AnandTech

64 Comments on NVIDIA Expects Partners to Release 4K, 144 Hz G-Sync Displays With HDR in April

#51
Vayra86
nickbaldwin86: I honestly cannot tell you the difference between the two; the difference is SO small. But a monitor with and without G-Sync... totally obvious! A monitor at 60 Hz and one at 144 Hz: clear as night vs. day difference.

So in April I stop eating to save money for July purchases... got it!
If you haven't done so already, do a small experiment on your 144 Hz G-Sync monitor: disable G-Sync, tweak your game for 100+ FPS stable, and enable Fast Sync :)
Posted on Reply
#52
nickbaldwin86
Vayra86: If you haven't done so already, do a small experiment on your 144 Hz G-Sync monitor: disable G-Sync, tweak your game for 100+ FPS stable, and enable Fast Sync :)
I will give it a shot! I set up my games to run at 166 Hz because that is the refresh rate of my monitor. I will give up 20X AA or SXAA, blah blah blah, for high refresh rates so I don't miss a shot.
Posted on Reply
#53
Vayra86
nickbaldwin86: I will give it a shot! I set up my games to run at 166 Hz because that is the refresh rate of my monitor. I will give up 20X AA or SXAA, blah blah blah, for high refresh rates so I don't miss a shot.
Let us know about your experience :) This will give you a very accurate view on the merit of G-Sync at high refresh.
Posted on Reply
#54
nickbaldwin86
Vayra86: Let us know about your experience :) This will give you a very accurate view on the merit of G-Sync at high refresh.
What titles are best to test on? I really only play PUBG now :\
Posted on Reply
#55
Vayra86
nickbaldwin86: What titles are best to test on? I really only play PUBG now :\
Overwatch is the first thing that comes to mind. It is extremely stable and if there is a tear in your screen refresh, you're likely to notice it (bright colors and simple geometry). Another nice test is any isometric perspective game that runs well, because again, scrolling over a map will make tearing visible in no time.
Posted on Reply
#56
jabbadap
HopelesslyFaithful: Let me know when monitor manufacturers get off their ass and make built-in strobed backlighting with a decent panel. Only BenQ, as far as I know, has done that, and the panels they used in those are trash compared to what is being offered today. When was the last one even sold? It was a trash TN, and years ago last I saw.

Only NVIDIA supports backlight strobing via ULMB, so let me know when manufacturers actually make a damn product that has it built in... otherwise my options are ULMB... or no ULMB... the latter isn't a valid option...

G-Sync is lame. ULMB is where it is at... though there is supposedly a way to trick it so both work at once.
Uhm. I don't think that is even physically possible. ULMB works at a static refresh rate, while G-Sync works at a variable refresh rate. So to actually make them work together, the backlight would have to strobe in step with a changing frame rate.
Posted on Reply
#57
HopelesslyFaithful
nickbaldwin86: I honestly cannot tell you the difference between the two; the difference is SO small. But a monitor with and without G-Sync... totally obvious! A monitor at 60 Hz and one at 144 Hz: clear as night vs. day difference.

So in April I stop eating to save money for July purchases... got it!
I don't know what you're smoking, but ULMB vs. no ULMB is night and day.
120 Hz

120 Hz ULMB

Not sure how that isn't noticeable to you...
www.blurbusters.com/faq/60vs120vslb/
jabbadap: Uhm. I don't think that is even physically possible. ULMB works at a static refresh rate, while G-Sync works at a variable refresh rate. So to actually make them work together, the backlight would have to strobe in step with a changing frame rate.
There was a video of a guy demonstrating it with a game. I don't recall where. This was about a year ago; I remember someone posting about a video they found of this guy glitching the two to work together. Either here or on HardForum, I saw the YouTube video being posted.
Posted on Reply
#58
nickbaldwin86
HopelesslyFaithful: I don't know what you're smoking, but ULMB vs. no ULMB is night and day.
120 Hz
120 Hz ULMB
Not sure how that isn't noticeable to you...
www.blurbusters.com/faq/60vs120vslb/

There was a video of a guy demonstrating it with a game. I don't recall where. This was about a year ago; I remember someone posting about a video they found of this guy glitching the two to work together. Either here or on HardForum, I saw the YouTube video being posted.
Yeah, it is VERY clear there, but in game it isn't that noticeable. That's the difference between real-world gameplay and a benchmark. During benchmarks, the benchmark looks so good with G-Sync on but results in low marks... turn off G-Sync and get great marks, but it looks meh.

I don't notice it in game; maybe my eyes don't see 200 fps as yours do? After all, I am getting old :(
Posted on Reply
#59
HopelesslyFaithful
nickbaldwin86: Yeah, it is VERY clear there, but in game it isn't that noticeable. That's the difference between real-world gameplay and a benchmark. During benchmarks, the benchmark looks so good with G-Sync on but results in low marks... turn off G-Sync and get great marks, but it looks meh.

I don't notice it in game; maybe my eyes don't see 200 fps as yours do? After all, I am getting old :(
In gameplay it's horribly obvious. Just turn ULMB on and off and play around. The biggest difference is in RTSes. Have units move across the screen; pan left to right in an RTS and see how bad everything looks as you scan the map with and without ULMB. Can you see them clearly? Can you read the names of the units as they move across the screen? I guarantee you can't unless ULMB is on, and even then it's still blurry-ish and has ghosting artifacts, but 10 times better than no ULMB. These 1000-nit screens mean we can start using 10% or 20% ULMB pulse widths vs. 100% and get even closer to CRT-level motion blur!

The blurring is very obvious... you just have to pay attention. It is the same thing as an uncalibrated screen: once you see the difference, it's obvious.

Video games look as bad as the screenshots if you pay attention, hence why I use ULMB 24/7.
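The 1000-nit point above comes down to simple duty-cycle arithmetic: a strobed backlight is only lit for a fraction of each refresh interval, so perceived brightness scales with the pulse width. A rough sketch (the 10%/20% pulse widths are the figures from this post, not official ULMB settings):

```python
def effective_nits(peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged brightness of a strobed backlight.

    peak_nits:  panel peak brightness while the backlight is on
    duty_cycle: fraction of each refresh interval the backlight is lit
                (e.g. 0.10 = a 10% pulse width)
    """
    return peak_nits * duty_cycle

# A 1000-nit panel strobed at a 10% pulse width still averages 100 nits,
# while the shorter pulse means far less sample-and-hold motion blur.
print(effective_nits(1000, 0.10))  # 100.0
print(effective_nits(1000, 0.20))  # 200.0
```

This is why higher peak brightness enables shorter strobes: the same average brightness survives a much smaller duty cycle.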
Posted on Reply
#60
nickbaldwin86
HopelesslyFaithful: In gameplay it's horribly obvious. Just turn ULMB on and off and play around. The biggest difference is in RTSes. Have units move across the screen; pan left to right in an RTS and see how bad everything looks as you scan the map with and without ULMB. Can you see them clearly? Can you read the names of the units as they move across the screen? I guarantee you can't unless ULMB is on, and even then it's still blurry-ish and has ghosting artifacts, but 10 times better than no ULMB. These 1000-nit screens mean we can start using 10% or 20% ULMB pulse widths vs. 100% and get even closer to CRT-level motion blur!

The blurring is very obvious... you just have to pay attention. It is the same thing as an uncalibrated screen: once you see the difference, it's obvious.

Video games look as bad as the screenshots if you pay attention, hence why I use ULMB 24/7.
I only play first-person shooters... I have not played any RTS games since StarCraft or Tiberian Sun. So when I say I don't see a difference, it is in PUBG or CS:GO or Titanfall 2... those are really the only games I play any more... I don't notice any blurring or ghosting, just butter smooth.

Thanks for the info, you seem to know what you are talking about.
Posted on Reply
#61
John Naylor
1. The human eye with normal vision, at normal viewing distances, can start to see individual pixels below about 96 ppi; the lower you go, the more noticeable it is. Currently, the Acer/ASUS panels from AU Optronics with 165 Hz IPS screens deliver 109 ppi; these new panels will deliver 163 ppi. A 4K screen could be 40" in size and deliver the same ppi as those 1440p panels. We recommend no larger than the following (for folks with normal vision):

1080p - 23.6 - 24"
1440p - 27"
2160p - 40"
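Those ppi figures follow directly from the resolution and the panel diagonal; a quick sketch of the arithmetic:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

print(round(ppi(2560, 1440, 27)))  # 109 -- the current 165 Hz 1440p panels
print(round(ppi(3840, 2160, 27)))  # 163 -- the new 4K 27" panels
print(round(ppi(3840, 2160, 40)))  # 110 -- 4K at 40" ~= 1440p at 27"
```

The last line confirms the point above: a 40" 4K panel lands at roughly the same pixel density as a 27" 1440p one.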

2. If you have the GFX horsepower, you can use ULMB... unfortunately, the GFX card industry has some catching up to do, as today's top games are pushing 4K fps down to the mid 40s even with a 1080 Ti. But if you can afford this screen, you won't blink at getting a second Ti... SLI scaling on the 10xx series (including games with no SLI support) averages only 50%, but that will ensure 60+ in everything I have seen so far.

3. I keep seeing posts claiming that faster screens need bigger cards even if you're not at the full refresh range, but I have not seen a significant impact on fps with monitors at 120, 144, or 165 Hz when game fps is in the 80s.

4. I don't see FreeSync as a relevant issue here for one big reason... despite most perceptions, they are not the same thing, as G-Sync comes with a hardware module to provide motion blur reduction and FreeSync does not. G-Sync and FreeSync are great below 70 fps, and while they continue to have an effect above 70 fps, the impact fades fast. This link is a great way to see the effects of motion blur... the soccer ball option makes it most apparent, I think:

frames-per-second.appspot.com/

5. I really don't get the sandbagging comment... With the last few generations of cards, NVIDIA has really been pushing limits according to TPU's game test suite.

1080 Ti was 57% faster than 980 Ti
980 Ti was 41% faster than 780 Ti

Compare that to Intel, where going back 5-6 generations we top out at 5%... maybe... from generation to generation.

Though I will say sandbagging is likely the case with SLI... 10xx-series scaling is less than half what the 9xx series was at 1440p, and my guess is that's an attempt to stop the cannibalization of top-tier card sales by twin x70s. Until AMD offers something to compete at the top tiers, I see this as their means of getting more top-tier card sales, on which they make much higher margins.

6. I went 21:9 and I went back... I liked it initially for the sense of immersion, but without that AU Optronics panel, colors appeared washed out and brightness was lacking. The size is nice, but not worth the reduction in panel quality.

7. I have yet to see a good "gaming TV"
Posted on Reply
#62
HopelesslyFaithful
nickbaldwin86: I only play first-person shooters... I have not played any RTS games since StarCraft or Tiberian Sun. So when I say I don't see a difference, it is in PUBG or CS:GO or Titanfall 2... those are really the only games I play any more... I don't notice any blurring or ghosting, just butter smooth.

Thanks for the info, you seem to know what you are talking about.
Not sure why you can't see it in those either. I play Natural Selection 2, and ULMB makes a world of difference when aiming for aliens or marines. I went from 10-15% to 20-25% accuracy with the switch.

If you play NS2, it's a ridiculously fast-paced flick-and-shoot game. The clearer the image, the better, because when an alien is blurring across your screen... it's hard to aim.

Case in point: this guy is pretty good.

ULMB is a massive difference...
Posted on Reply
#63
Vego
jabbadap: It's a G-Sync monitor, if you use G-Sync in gaming (why would you even consider this monitor if not using that, though?). The refresh rate will be equal to the FPS in the 40-144 range and doubled in the 30-40 range.
I'm pretty sure I will be able to keep 100+ at 4K; that's why I will be using it at 100 Hz, because in my personal experience I don't see much difference between 100 and 144 Hz, and there is a 40% difference in power consumption and heat generation.
Posted on Reply
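The variable-refresh behavior jabbadap describes (the panel matching fps in the 40-144 band, and doubling the refresh below it) can be sketched as a simple frame-multiplication rule. The thresholds here are the figures from the quoted post, not official NVIDIA documentation:

```python
def gsync_refresh(fps: float, vrr_min: int = 40, vrr_max: int = 144) -> float:
    """Effective panel refresh for a given game fps, per the behavior
    described in the thread: inside the VRR window the panel tracks the
    frame rate one-to-one; below the window, each frame is shown twice
    so the panel stays above its minimum refresh."""
    if fps >= vrr_max:
        return vrr_max      # capped at the panel's maximum refresh
    if fps >= vrr_min:
        return fps          # refresh matches fps exactly
    return fps * 2          # 30-40 fps: each frame displayed twice

print(gsync_refresh(100))  # 100
print(gsync_refresh(35))   # 70
print(gsync_refresh(160))  # 144
```

The frame-doubling trick is why a 35 fps game still drives the panel at 70 Hz rather than dropping below the refresh range the panel can physically sustain.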
#64
HopelesslyFaithful
Vego: I'm pretty sure I will be able to keep 100+ at 4K; that's why I will be using it at 100 Hz, because in my personal experience I don't see much difference between 100 and 144 Hz, and there is a 40% difference in power consumption and heat generation.
If you run 100 Hz ULMB vs. 144 Hz without ULMB, the 100 Hz ULMB will look and feel faster. That is a plain fact. It will also have fewer stutters too. I am going to be switching to 100 Hz ULMB soon from 120 Hz ULMB, because it will prevent a lot of stuttering from CPU/GPU bottlenecks.

Posted on Reply