Monday, May 29th 2023

NVIDIA Announces G-SYNC ULMB 2: Over 1000Hz Of Effective Motion Clarity

In 2015, NVIDIA launched Ultra Low Motion Blur (ULMB), a technique for G-SYNC monitors that delivers extra motion clarity in competitive games. Today, we're launching G-SYNC Ultra Low Motion Blur 2 (ULMB 2), which provides over 1000 Hz of effective motion clarity, the best motion blur reduction available to competitive gamers. Compared to the original, ULMB 2 delivers full refresh rate backlight strobing, nearly 2x higher brightness, and practically zero crosstalk. ULMB 2 is available now, for free, on capable 1440p 360 Hz G-SYNC monitors through a single-click firmware updater!

When NVIDIA launched the original ULMB technology in 2015, monitor response times (the time it takes for a pixel to transition colors) were relatively slow, causing substantial ghosting and blurry images, resulting in poor motion clarity. Motion clarity is best described as the ability to clearly see and comprehend objects in motion. Sharp edges and non-blurry details are hallmarks of good motion clarity. To improve motion clarity, ULMB enabled a technique called backlight strobing (more on that in a moment).
To achieve backlight strobing, ULMB keeps the backlight off 75% of the time. This 25% duty cycle on a panel with a 300 nit maximum meant that images were clear, but less bright. Because pixel response times were slower in 2015, the original ULMB also had to wait longer for the pixels to finish transitioning before turning the backlight on. To compensate, ULMB would reduce the refresh rate to give the pixels more time to transition. Because of these drawbacks, competitive gamers often chose not to use the feature, since the full refresh rate and a brighter image were more desirable.
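As a rough illustration of the brightness trade-off, the time-averaged brightness of a strobed backlight scales with its duty cycle. The sketch below assumes a simple linear relationship (real panels can also drive the strobe harder to compensate); the 300 nit and 25% figures come from the paragraph above, and the function name is purely illustrative.

# Rough model: perceived brightness of a strobed backlight scales with its duty cycle.
# Assumes a purely linear relationship; real panels may boost strobe intensity to compensate.

def perceived_nits(panel_peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged brightness when the backlight is lit for `duty_cycle` of each frame."""
    return panel_peak_nits * duty_cycle

print(perceived_nits(300, 0.25))  # original ULMB on a 300 nit panel -> 75 nits perceived
print(perceived_nits(300, 1.00))  # backlight always on -> the full 300 nits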

Enter G-SYNC Ultra Low Motion Blur 2 (ULMB 2)
ULMB 2 provides full refresh rate backlight strobing and significantly brighter images, all while maintaining pristine image quality. With the panel response time improvements from our partners at AUO, ULMB 2 gives competitive gamers the motion clarity needed to perform at peak levels by keeping them in the game when moments get chaotic.
With these improvements, ULMB 2 gives gamers an effective motion clarity of over 1000 Hz, calculated as the monitor's refresh rate multiplied by one over the duty cycle: Effective Motion Clarity = Refresh Rate * (1 / Duty Cycle).

For a 360 Hz monitor with ULMB 2, the effective motion clarity is 1440 Hz. That means that to obtain the same level of motion clarity without ULMB 2, gamers would need a conventional, non-strobed panel capable of 1440 Hz.
To show this in action, we set up a test panel in our lab. Below is an example of a 120 Hz monitor with backlight strobing compared to a 480 Hz monitor without backlight strobing: Effective Motion Clarity = 120 * (1 / 0.25) = 480 Hz.
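The formula is easy to check directly. Here is a minimal Python sketch using the numbers above; the function name is ours, for illustration only.

# Effective Motion Clarity = Refresh Rate * (1 / Duty Cycle), per the formula above.

def effective_motion_clarity_hz(refresh_hz: float, duty_cycle: float) -> float:
    """Refresh rate an always-on (sample-and-hold) panel would need for equivalent clarity."""
    return refresh_hz * (1.0 / duty_cycle)

print(effective_motion_clarity_hz(360, 0.25))  # 1440.0 -- the ULMB 2 monitor example
print(effective_motion_clarity_hz(120, 0.25))  # 480.0  -- the lab test example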

How Does ULMB 2 Work?
First, let's explain how LCD panels work. LCD panels consist of two main layers:
  • The liquid crystal pixels, which chemically change to adjust the color of light shining through them
  • The backlight, which produces the light that shines through the pixels
When a new frame needs to be displayed on the monitor, a new color value is sent to each pixel. At this point, the pixel will start to transition to its new color over time. During this process, the backlight is on the entire time so the gamer visually sees the full transition.

In addition, the image is "held" before and after transition which causes the human visual system to blur the two images together. The combination of the "motion hold" and visible transition is what causes display-based motion blur (not to be confused with an in-game motion blur setting).
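To put a rough number on the motion hold effect, the width of the smear perceived while the eye tracks a moving object is approximately its on-screen speed multiplied by how long each frame remains visible. The sketch below is an illustrative approximation, not a measurement; the 1920 pixels-per-second speed is an arbitrary example value.

# Illustrative approximation: while the eye tracks a moving object, each held frame is smeared
# across roughly (object speed * time the frame is visible) pixels.

def perceived_blur_px(speed_px_per_s: float, refresh_hz: float, duty_cycle: float = 1.0) -> float:
    """Approximate blur width in pixels for a given refresh rate and backlight duty cycle."""
    visible_time_s = (1.0 / refresh_hz) * duty_cycle  # how long each frame is actually shown
    return speed_px_per_s * visible_time_s

speed = 1920  # example: an object crossing a 1920 px wide screen in one second
print(perceived_blur_px(speed, 360))        # sample-and-hold 360 Hz -> ~5.3 px of blur
print(perceived_blur_px(speed, 360, 0.25))  # strobed at 25% duty    -> ~1.3 px of blur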

With ULMB 2, the backlight is only turned on when each pixel is at its correct color value. The idea is to not show the pixels transitioning, and only show them when their color is accurate.

But this technique creates a challenge: backlights generally light up all pixels at the same time, whereas pixels are updated on a rolling scanout. At any given point in time, a portion of the screen will show double images (also known as crosstalk).

The solution to this problem is what sets G-SYNC's ULMB 2 apart from other backlight strobing techniques: with G-SYNC, we're able to control the response time depending on where the vertical scan is, such that the pixels throughout the panel are at the right level at precisely the right time for the backlight to be flashed. We call this "Vertical Dependent Overdrive".

With Vertical Dependent Overdrive, ULMB 2 delivers great image quality even at high refresh rates where the optimal window for backlight strobing is small.
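NVIDIA has not published the internals of Vertical Dependent Overdrive, but the timing problem it addresses can be sketched with a toy model: each row is written at a different point in the scanout, so each row has a different amount of settle time before the single backlight flash, and overdrive strength can be varied with row position accordingly. All numbers and the linear interpolation rule below are illustrative assumptions, not NVIDIA's actual tuning.

# Toy model of the timing problem: with a rolling scanout and one global backlight flash per
# frame, rows written later have less time to settle before the flash, so they need stronger
# overdrive. The figures and the linear interpolation rule are purely illustrative.

REFRESH_HZ = 360
FRAME_MS = 1000.0 / REFRESH_HZ   # ~2.78 ms per refresh
SCANOUT_MS = FRAME_MS * 0.9      # assume scanout occupies ~90% of the frame
ROWS = 1440                      # 1440p panel
STROBE_MS = FRAME_MS             # assume the flash fires at the end of the frame

def settle_ms(row: int) -> float:
    """Time between when `row` receives its new value and when the backlight flashes."""
    written_at_ms = (row / (ROWS - 1)) * SCANOUT_MS
    return STROBE_MS - written_at_ms

def overdrive_gain(row: int, min_gain: float = 1.0, max_gain: float = 2.0) -> float:
    """Illustrative rule: rows with less settle time get proportionally stronger overdrive."""
    longest, shortest = settle_ms(0), settle_ms(ROWS - 1)
    frac = (longest - settle_ms(row)) / (longest - shortest)  # 0 for the top row, 1 for the bottom
    return min_gain + frac * (max_gain - min_gain)

for row in (0, ROWS // 2, ROWS - 1):
    print(f"row {row:>4}: {settle_ms(row):.2f} ms to settle, overdrive gain x{overdrive_gain(row):.2f}")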

ULMB 2 Is Available Now
For ULMB 2 capability, monitors must meet the following requirements:
  • Deliver over 1000 Hz of effective motion clarity
  • Drive ULMB 2 at the monitor's full refresh rate
  • Deliver over 250 nits of brightness with minimal crosstalk or double images
Already, two ULMB 2 capable monitors are on the market, and another two are launching in the near future:
Available Today:
  • Acer Predator XB273U F - 27" 1440p 360 Hz
  • ASUS ROG Swift 360Hz PG27AQN - 27" 1440p 360 Hz
Available Soon:
  • ASUS ROG Swift Pro PG248QP - 25" 1080p 540 Hz
  • AOC AGON AG276QSG G-SYNC Monitor - 27" 1440p 360 Hz
Simply follow the links above to download the one-click firmware updater and add G-SYNC ULMB 2 to a capable monitor. Please pay close attention to the directions in the update.

30 Comments on NVIDIA Announces G-SYNC ULMB 2: Over 1000Hz Of Effective Motion Clarity

#2
qlum
Let me guess it's still locked to nvidia gpu's even though there is no technical reason for it.
#3
GoldenTiger
qlum: Let me guess it's still locked to nvidia gpu's even though there is no technical reason for it.
Why would they give away millions in r&d to a competitor?
#4
Vayra86
Joker tech in a world of oled
#5
qlum
GoldenTiger: Why would they give away millions in r&d to a competitor?
Because other vendors' backlight strobing tech is not too different and works on AMD, and by blocking them, they create a worse product. Similar to how adaptive sync changed to the vesa standard on gsync monitors.

They still have to sell their modules; as it stands they have worse compatibility than monitor makers' own modules.
#6
KrazyT
@Space Lynx:
you can delay your 4090, it won't be enough to play Cyberpunk at 165 Hz! ;)
Now you want at least 360 Hz! :)
#7
ZoneDymo
GoldenTiger: Why would they give away millions in r&d to a competitor?
idk how many times this needs to be addressed but:
They don't "give it away", they ensure it gets put to good use, wide adoption, some actual value for the investment, otherwise we just get silly crap like PhysX (which could have been great) or indeed Gsync when introduced... Nvidia just never learns this.
#8
Speedyblupi
"over 1000 Hz of effective motion clarity"
This is such marketing bull that it sounds like a parody.
#9
Tomorrow
Vayra86: Joker tech in a world of oled
Indeed. I would rather take OLED+BFI than LCD+ULMB2. I do have an older LCD that is 1440p 165Hz, and the few times I tried ULMB on it, it was a big meh. Even with the brightness reduction I could barely notice a difference.
#10
Chomiq
Vayra86: Joker tech in a world of oled


Really?

And that's not even with ULMB 2 on the AQN.

For better motion clarity OLED would need the equivalent of backlight strobing (BFI in the case of OLED), and this can't be done RN due to the low max brightness.

Here's Viewsonic with PureXP+:
#11
Tomorrow
Chomiq: For better motion clarity OLED would need backlight strobing, and this can't be done RN due to the low max brightness.
OLED has no backlight (to strobe). Each pixel is self-emissive and natively reacts faster than any LCD. Already a 240Hz OLED is equivalent to 360Hz LCD.
OLED uses BFI or Black Frame Insertion.
#12
Chomiq
Tomorrow: OLED has no backlight (to strobe). Each pixel is self-emissive and natively reacts faster than any LCD. Already a 240Hz OLED is equivalent to 360Hz LCD.
OLED uses BFI or Black Frame Insertion.
Yeah, I meant BFI. LG ditched that from their TVs and I do wonder if it will ever return.
#13
Upgrayedd
ZoneDymo: idk how many times this needs to be addressed but:
They don't "give it away", they ensure it gets put to good use, wide adoption, some actual value for the investment, otherwise we just get silly crap like PhysX (which could have been great) or indeed Gsync when introduced... Nvidia just never learns this.
PhysX is open source now.
qlum: Because other vendors' backlight strobing tech is not too different and works on AMD, and by blocking them, they create a worse product. Similar to how adaptive sync changed to the vesa standard on gsync monitors.

They still have to sell their modules; as it stands they have worse compatibility than monitor makers' own modules.
I don't really see how they create a worse product. It might be exclusive. But nothing was made worse.
#14
Tomorrow
Upgrayedd: I don't really see how they create a worse product. It might be exclusive. But nothing was made worse.
Yes, thankfully Nvidia removed the restriction with G-Sync v2 modules and AMD cards can now also use Adaptive Sync. Before, with v1 modules, you HAD to have an Nvidia card. Otherwise using AMD+v1 would mean no Adaptive Sync.
#15
ZoneDymo

Actually looks pretty solid, I don't think it would matter for me or so, I'm not superman, but still.
#16
qlum
Upgrayedd: PhysX is open source now.


I don't really see how they create a worse product. It might be exclusive. But nothing was made worse.
A feature that is limited to one vendor is strictly worse than one that is not. Given the choice, no one would choose the more limited option unless it has redeeming qualities such as performance or price.

I will add here that the customers for gsync modules are monitor makers and not end consumers, and the gsync module generally costs more than not having it. At that point you would have an increase in price and a smaller addressable market.
#17
Space Lynx
Astronaut
coozie78: Why did they bother?
I don't know.

OLED is still king.
#18
wolf
Performance Enthusiast
Space Lynx: OLED is still king.
Depends what you want, for competitive gaming this is clearly a step up, check out the video linked here
#19
Mussels
Freshwater Moderator
ZoneDymo: idk how many times this needs to be addressed but:
They don't "give it away", they ensure it gets put to good use, wide adoption, some actual value for the investment, otherwise we just get silly crap like PhysX (which could have been great) or indeed Gsync when introduced... Nvidia just never learns this.
Nvidia never made PhysX - they bought it from another company, made it exclusive to themselves and murdered it.

Simple fact: if it's exclusive companies don't want to sell it. They don't want to sell a product that only works for SOME of their customers - they want a product that they'll all buy.
#20
wolf
Performance Enthusiast
Mussels: Simple fact: if it's exclusive companies don't want to sell it. They don't want to sell a product that only works for SOME of their customers - they want a product that they'll all buy.
While I can understand not liking this concept, everyone needs to understand it, and realize that 99.99999% of people or companies in the same position would do the same. If they could do it, they almost certainly would do it.
#21
Ne01_OnnA
Space Lynx: I don't know.

OLED is still king.
Hard to deny that :D
It's just Unreal... I've had it for 3 weeks now & still can't believe my eyes.
#22
Space Lynx
Astronaut
Ne01_OnnA: Hard to deny that :D
It's just Unreal... I've had it for 3 weeks now & still can't believe my eyes.
now I want you to upgrade that rig in a year or two, and max out those games at 240hz 180+ fps!!!! you can do it buddy! OLED will be my mistress someday... but not yet, cause my wallet lol
#23
Ne01_OnnA
Space Lynx: now I want you to upgrade that rig in a year or two, and max out those games at 240hz 180+ fps!!!! you can do it buddy! OLED will be my mistress someday... but not yet, cause my wallet lol
The truth about OLED is that I can't see much difference between 70-90-120-240 Hz (FPS), tested in Dead Cells.
On all FPS caps (via RTSS) the game looks very good & crisp. Nahh, you don't need to change anything yet ;)
#24
Upgrayedd
Mussels: Nvidia never made PhysX - they bought it from another company, made it exclusive to themselves and murdered it.

Simple fact: if it's exclusive companies don't want to sell it. They don't want to sell a product that only works for SOME of their customers - they want a product that they'll all buy.
PhysX is open source
#25
Space Lynx
Astronaut
Ne01_OnnA: The truth about OLED is that I can't see much difference between 70-90-120-240 Hz (FPS), tested in Dead Cells.
On all FPS caps (via RTSS) the game looks very good & crisp. Nahh, you don't need to change anything yet ;)
An indie game like Dead Cells won't show the fps increases as well as, say, a third-person action game with AAA graphics or an FPS game. I'd recommend trying something like BioShock Infinite; you should be able to tell a difference more easily.