• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

Been thinking of jumping on the hdr gsync 120hz bandwagon, but hesitant due to potential ghosting, motion blur, imperfections

Joined
Jan 17, 2010
Messages
12,327 (2.25/day)
Location
Oregon
System Name Juliette // My HTPC
Processor Intel i7 9700K // AMD Ryzen 5 5600G
Motherboard ASUS Prime Z390X-A // ASRock B550 ITX-AC
Cooling Noctua NH-U12 Black // Stock
Memory Corsair DDR4 3600 32gb //G.SKILL Trident Z Royal Series 16GB (2 x 8GB) 3600
Video Card(s) ASUS RTX4070 OC// ASUS RTX 4060 OC
Storage Samsung 970 EVO NVMe 1Tb, Intel 665p Series M.2 2280 1TB // Samsung 1Tb SSD
Display(s) ASUS VP348QGL 34" Quad HD 3440 x 1440 // 55" LG 4K SK8000 Series
Case Seasonic SYNCRO Q7// Silverstone Granada GD05
Audio Device(s) Focusrite Scarlett 4i4 // HDMI to Samsung HW-R650 sound bar
Power Supply Seasonic SYNCRO 750 W // CORSAIR Vengeance 650M
Mouse G903 and a Master Mouse MM710 // No mouse, MS game controller
Keyboard EVGA / Logitech K400
Software Windows 11 Pro // Windows 10 Pro
Joined
Nov 27, 2010
Messages
924 (0.18/day)
System Name future xeon II
Processor DUAL SOCKET xeon e5 2686 v3 , 36c/72t, hacked all cores @3.5ghz, TDP limit hacked
Motherboard asrock rack ep2c612 ws
Cooling case fans,liquid corsair h100iv2 x2
Memory 96 gb ddr4 2133mhz gskill+corsair
Video Card(s) 2x 1080 sc acx3 SLI, @STOCK
Storage Hp ex950 2tb nvme+ adata xpg sx8200 pro 1tb nvme+ sata ssd's+ spinners
Display(s) philips 40" bdm4065uc 4k @60
Case silverstone temjin tj07-b
Audio Device(s) sb Z
Power Supply corsair hx1200i
Mouse corsair m95 16 buttons
Keyboard microsoft internet keyboard pro
Software windows 10 x64 1903 ,enterprise
Benchmark Scores fire strike ultra- 10k time spy- 15k cpu z- 400/15000
Nah, it's 100 Hz and 2K, and a bit pricey. The form factor is on the too-wide side.
 
Necro'ing this thread to mention I pulled the trigger on this:

[Attached image: 07032020-202538.jpg]


Considering all the good advice in this thread, I still jumped the gun. I will test and evaluate the monitor, and I still have my old one, fully functional, as a spare.
Another thing to tweak would be the ASUS Windows software with its per-application settings profiles(!), and the ClearType tuning tool to account for text clarity in a BGR subpixel configuration instead of RGB.
 
Joined
Mar 18, 2015
Messages
2,963 (0.83/day)
Location
Long Island
What I first considered was the ROG Strix PG43UQ, recently released, but it being pricey I went to the next best thing, the XG438Q, which can be found refurbished for $800. It is VA, 120 Hz, G-Sync, HDR, 43" 4K: from what it seems, perfection. However, many issues seem to plague these panels (shared with the Acer Predator CG437K): people complain about motion blur, skipped frames, smearing, backlight bleed and what have you, and even text seems to not render perfectly due to the BGR subpixel configuration rather than RGB. My other option is to sit on the money with my 40" 60 Hz regular VA, which does not suffer from any of those issues, and wait for the technology to mature, maybe to 165 Hz or more in a few years. The favourable reviews and the itch are strong in this matter, though.
Just throwing these models out there so people can know what is available in value segment for gaming large format displays.

1. Since their introduction, we have been told that G-Sync and FreeSync, as a package, are the same thing from two different vendors. Nothing could be further from the truth.

G-Sync does its best work between 30 and 70 fps ... the effects of adaptive sync continue past 70 fps, but the user impact begins tailing off at this point.
FreeSync does its best work between 30 and 70 fps ... the effects of adaptive sync continue past 70 fps, but the user impact begins tailing off at this point.

This is where the similarity ends. G-Sync includes a hardware module which is responsible for the cost difference between the two; FreeSync does not have one. The hardware module allows you to turn off G-Sync and instead use ULMB (Ultra Low Motion Blur). FreeSync monitors may provide a similar blur-reduction mode of their own, but results are inconsistent, so FreeSync as such does not guarantee motion blur reduction. Generally, if your card(s) can maintain 75-80 fps, you should try turning G-Sync off and using ULMB. For the best description I have seen of how the technologies differ, read this:

"On the plus side, by removing the traditional scaler it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming. NVIDIA G-sync screens with the hardware module generally have a nice wide variable refresh rate (VRR) range. You will often see this listed in the product spec as something like "40 - 144Hz", or confirmed via third party testing. We have seen lots of FreeSync screens, particularly from the FreeSync 1 generation, with far more limited VRR ranges. NVIDIA also seem to be at the forefront of bringing the highest refresh rate gaming monitors to market first, so you will often see the latest and greatest models with G-sync support a fair while before alternative FreeSync options become available.

The presence of this module, and absence of a traditional scaler has allowed previously much slower panels to be successfully overclocked to higher refresh rates. ..... The G-sync module allowed a very good boost in refresh rate, and some excellent performance improvements as a result. This pattern continues today, as you will often see screens featuring the G-sync module advertised with a normal "native" refresh rate, and then an overclocked refresh rate where the panel has been boosted. For instance there's quite a lot of 144Hz native screens which can be boosted to 165Hz or above thanks to the G-sync module.

From our many tests of screens featuring the hardware G-sync module, the response times of the panels and the overdrive that is used seem to be generally very reliable and consistent, producing strong performance at both low and high refresh rates. This seems to be more consistent than what we have seen from FreeSync screens so far, where often the overdrive impulse is impacted negatively by changes to the screen's refresh rate. NVIDIA also talk about how their G-sync technology allows for "variable overdrive", where the overdrive is apparently tuned across the entire refresh rate range for optimal performance.

G-sync modules also often support a native blur reduction mode, dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming. It cannot be used at the same time as G-sync since ULMB operates at a fixed refresh rate only, but it's a useful extra option for many of these G-sync module gaming screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works with specific G-sync compatible NVIDIA graphics cards. While you can still use a G-sync monitor from an AMD/Intel graphics card for other uses, you can't use the actual G-sync or ULMB functions.

It should be noted that the real benefits of G-sync really come into play when viewing lower frame rate content, around 45 - 60fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature if it's been included, which is not available when using G-sync. Higher end gaming machines might be able to push out higher frame rates more consistently and so you might find less benefit in using G-sync. The ULMB could then help in another very important area, helping to reduce the perceived motion blur caused by LCD displays. It's nice to have both G-sync and ULMB available to choose from certainly on these G-sync enabled displays. "
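The rule of thumb running through this post (adaptive sync for low or uneven frame rates, strobing once the card holds a high steady frame rate) can be sketched as a toy helper. The 78 fps cutoff is just the 75-80 fps figure mentioned above, not anything official:

```python
# Toy sketch of the G-Sync vs ULMB rule of thumb from the post above.
# The 78 fps threshold is an assumption taken from the 75-80 fps figure quoted here.
def choose_mode(sustained_fps: float, has_ulmb: bool = True) -> str:
    if has_ulmb and sustained_fps >= 78:
        return "ULMB"    # fixed refresh + strobed backlight: less perceived motion blur
    return "G-Sync"      # variable refresh: smooths out lower or uneven frame rates

print(choose_mode(60))   # G-Sync
print(choose_mode(120))  # ULMB
```

Purely illustrative; the real decision also depends on whether the monitor exposes ULMB at the refresh rate you want.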

2. To my eyes ... I have not seen a 4K screen of any size that checks all the boxes ... As of yet, there's not a graphics card that can adequately drive them under ULMB at 120 Hz (165 Hz G-Sync). Until 4K screens can do ULMB it's not a purchase I would personally consider. Last I read, we didn't even have a cable in production that can carry the necessary bandwidth.
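The cable-bandwidth point can be sanity-checked with rough arithmetic. This is a sketch only: it ignores blanking overhead and DSC compression (both change the real figure), and the DisplayPort 1.4 payload number used for comparison is an assumption on my part:

```python
# Back-of-envelope: uncompressed pixel data rate for 4K at 120 Hz, 10 bits per channel.
# Ignores blanking intervals and DSC, so the true link requirement is somewhat higher.
def data_rate_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

four_k_120_10bit = data_rate_gbps(3840, 2160, 120, 10)
DP14_PAYLOAD_GBPS = 25.92  # DisplayPort 1.4 HBR3 payload after 8b/10b encoding

print(round(four_k_120_10bit, 1))                  # ~29.9 Gbps
print(four_k_120_10bit > DP14_PAYLOAD_GBPS)        # True: doesn't fit uncompressed
```

Which is consistent with the claim at the time: 4K 120 Hz 10-bit didn't fit down a single pre-HDMI-2.1 cable without compression or chroma subsampling.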

3. Would I upgrade?

Well, we have different goals. Again, to my eyes, the best experience I have had in gaming has come at 1440p on a 165 Hz IPS monitor running ULMB @ 120 Hz w/ 10-bit color. At 4K you're pushing 2.25 times as many pixels, so that requires significantly more graphics horsepower to drive. Again, for me, 4K 60 Hz @ 8-bit ==> 1440p 165 Hz was a H U G E upgrade.
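The pixel-count ratio between 4K and 1440p is easy to verify:

```python
# 4K UHD vs QHD (1440p) pixel counts
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
print(uhd / qhd)    # 2.25
```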

4. Be aware that advertised lag and response-time figures are completely bogus. Seen 1 ms advertised? Go look at its test report.

The Asus XG438Q was listed as a 1 ms monitor above ... well:

The spec sheet says response time = 4 ms.
Let's see how it tested:

Overdrive Level 4 @ 120 Hz produced low levels of overshoot (RTC Error Average = 0%) and response times of 10.2 ms ... Overdrive Level 5 @ 120 Hz dropped response time to 5.0 ms, but with overshoot (RTC Error Average = 15.30%).

So this gives you an idea of how fake advertised response times are ... and the lower the quality of the panel, the greater the level of exaggeration. In addition, this monitor suffers from overshoot issues, which would put a thick red strikethrough through it on any potential list of acceptable choices.

The Samsung C49RG90 is another one I have seen recommended as a 1 ms monitor.
The spec sheet says response time = 4 ms.
Let's see how it tested:

Depending upon the mode chosen, there were different results ... 11.4, 11.0 and 10.1 ms ... not exactly in the neighborhood of 1.0 or 4.0, is it?

So anytime you are looking at a spec sheet and see a claim of low response times, it's not likely to be real. "Show me" should be the order of the day.

 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
Depending upon the mode chosen, there were different results ... 11.4, 11.0 and 10.1 ms ... not exactly in the neighborhood of 1.0 or 4.0, is it?
This is not a theoretical measurement. Perceived motion response time depends not just on the pixel response time but also on the refresh rate, and they are inversely correlated: a fast TN might curtail perceived blur at 120 Hz, whereas a slow VA might need the extra upgrade to 240 Hz to benefit from a shorter raster interval.
You cannot have all your eggs in the same basket. If you check back at TFTCentral, the cheapest VA in its latest budget-monitor review roundup comes with the best VA overdrive they have yet tested, so quality doesn't track product placement. A VA with top-quality overdrive is better off with blur reduction than VRR, which is how it is supposed to be on the red fence: vsync plus MBR to maximise the raster refresh period, since FreeSync does not synchronize that setting.
G-Sync isn't all you said, either. Over at OCN, people were discussing that it has a frame buffer. G-Sync is fast due to its consistently short raster interval throughout its VRR range. Also, people on the green fence drop vsync, whereas on the red fence they put up with vsync, since that is how it operates with the least lag.
Green teamers are supposed to set a framerate limiter under the maximum refresh rate to avoid GPU pipeline breaks.
All these are courtesy of Linus and some other snippets. The review, if you want to check on it:
[Image: 5-Figure3-1.png]

Okay, remembered where it was.
Book on LCD:
MPRT ≈ sqrt(GtG² + (0.8 × 1/Hz)²), with both terms in the same units (the second term is 0.8 of a frame time).
Lots of formulas intertwine on this subject. I wondered what "response time" means here: GtG, BtB, or an average.
PS: it says an average of 7 GtG values.

So, looking at the subject budget VA example, fastest overdrive at 144 Hz rounds to 8.1 ms MPRT if I'm correct. Updating the refresh rate to 240 Hz would bring that to about 6.8 ms, so I would be inclined to say 8.1 ms > 6.8 ms perceived MPRT is worth going forward with.
Summary: if the panel's GtG average is 5.9 ms, there is benefit to overclocking the LCD to 240 Hz. Optimising the refresh rate is all a matter of matching the GtG with the MPRT value.
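The formula above is quick to check numerically. This uses the 5.9 ms GtG average from the review discussed in this thread; the helper name is mine:

```python
import math

def mprt_ms(gtg_ms: float, refresh_hz: float) -> float:
    """Perceived motion response time: sqrt(GtG^2 + (0.8 * frame_time)^2), all in ms."""
    frame_time_ms = 1000.0 / refresh_hz
    return math.sqrt(gtg_ms ** 2 + (0.8 * frame_time_ms) ** 2)

print(round(mprt_ms(5.9, 144), 1))  # 8.1
print(round(mprt_ms(5.9, 240), 1))  # 6.8
```

At 240 Hz the frame-time term (0.8 × 4.17 ms ≈ 3.3 ms) is well below the 5.9 ms GtG, so pixel response, not refresh rate, becomes the limiting factor, which is the "matching" point made above.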
 
Joined
Sep 17, 2014
Messages
22,830 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
1. Since their introduction, we have been told that G-Sync and Freesync , as package, are the same thing from two different vendows. Nothing could be further from the truth

G-sync does its best thing between 30 and 70 fps ... the affects of Adaptive Sync continue on past 70 fps, but the user impact begins tailing off at this point
Free-sync does its best thing between 30 and 70 fps ... the affects of Adaptive Sync continue on past 70 fps, but the user impact begins tailing off at this point

This is where the similarity ends. G-sync includes a hardware module which is responsible for the cost difference between the two; Freesync does not have one. Th ehardware module allows you to turn off G-Sync and instead use ULMB (Ultra Low Motion Blur) . Freesync monnitpr may provide a simiar hardwarte module of their own, but results are inconsistent, So freesync does nit have motion blur reduction. Generally, if youre card(s) can maintain 75-80 fps, you should try turning G-Sync off and using ULMB. Foie the best description as to how the technologies differ that I have seen, read this:

"On the plus side, by removing the traditional scaler it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming. VIDIA G-sync screens with the hardware module generally have a nice wide variable refresh rate (VRR) range. You will often see this listed in the product spec as something like "40 - 144Hz", or confirmed via third party testing. We have seen lots of FreeSync screens, particularly from the FreeSync 1 generation, with far more limited VRR ranges. NVIDIA also seem to be at the forefront of bringing the highest refresh rate gaming monitors to market first, so you will often see the latest and greatest models with G-sync support a fair while before alternative FreeSync options become available.

The presence of this module, and absence of a traditional scaler has allowed previously much slower panels to be successfully overclocked to higher refresh rates. ..... The G-sync module allowed a very good boost in refresh rate, and some excellent performance improvements as a result. This pattern continues today, as you will often see screens featuring the G-sync module advertised with a normal "native" refresh rate, and then an overclocked refresh rate where the panel has been boosted. For instance there's quite a lot of 144Hz native screens which can be boosted to 165Hz or above thanks to the G-sync module.

From our many tests of screens featuring the hardware G-sync module, the response times of the panels and the overdrive that is used seems to be generally very reliable and consistent, producing strong performance at both low and high refresh rates. This seems to be more consistent than what we have seen from FreeSync screens so far where often the overdrive impulse is impacted negatively by changes to the screens refresh rate. NVIDIA also talk about how their G-sync technology allows for "variable overdrive" where the overdrive is apparently tuned across the entire refresh rate range for optimal performance.

G-sync modules also often support a native blur reduction mode, dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming. It cannot be used at the same time as G-sync since ULMB operates at a fixed refresh rate only, but it's a useful extra option for many of these G-sync module gaming screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works with specific G-sync compatible NVIDIA graphics cards. While you can still use a G-sync monitor from an AMD/Intel graphics card for other uses, you can't use the actual G-sync or ULMB functions.

It should be noted that the real benefits of G-sync really come into play when viewing lower frame rate content, around 45 - 60fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature if it's been included, which is not available when using G-sync. Higher end gaming machines might be able to push out higher frame rates more consistently and so you might find less benefit in using G-sync. The ULMB could then help in another very important area, helping to reduce the perceived motion blur caused by LCD displays. It's nice to have both G-sync and ULMB available to choose from certainly on these G-sync enabled displays. "

2. To my eyes ... I have not seen a 4k screen of any size that checks all the boxes at 4k ... As of yet, there's not a GFX card than can adequately drive them under ULMB ap at 120 hz (165 Hz G-Sync). Until 4k screens can do ULMB it's not a purchase I would personally consider. last I read, we don't even have a cable in production that can carry the necessary bandwidth.

3. Would I upgrade ?

Well we have different goals. Again yo my eyes, the best experience I have had in gaming has come at 1440p on a 165 Hz IPS monitor running ULMB @ 120 Hz w/ 10 bit color. At 4k, your pushing 2.5 times a smany pixels so , that requires significantly more GFX horsepower to drive. Again, for me 4k 60Hz @ 8 bit ==> 1440p 165Hz - H U G E upgrade

4, Be aware that advertised lag and refresh rates are completely bogus. Seen 1 ms advertised ... go look at it's test report.

Asus XG438Q was listed as a 1 ms monitor above ... well

Spec sheet says response Time = 4 ms
Lets see how it tested:

Overdrive Level 4 @ 120 Hz produced low levels of overshoot (RTC Error Average = 0%), and response times of 10.2 ms ... Overdrive Level 5 @ 120 Hz droped response time to 5,0 ms but overshoot (RTC Error Average = 15.30 %)

So this gives ya and idea of how fake advertised response times are ... an the lower quality the panel, the greater the level od exaggeration. Ina ddition, this monitor suffers from overshoot issues which which would a a strikethough on any potential list of choaices.

Samsung C49RG90 is another one I have seen recommended as a 1 ms monitor
Spec sheet says response Time = 4 ms
Lets see how it tested:

Depending upon the mode chosen, there were differen7 results .... 11.4, 11.0 and 10.1 ... not exactly in the neighborhood of 1.0 or 4.0 is it ?

So anytime you are are lookoing at a spec sheet of see a claim of low resonse times,



1. Since their introduction, we have been told that G-Sync and Freesync , as package, are the same thing from two different vendows. Nothing could be further from the truth

G-sync does its best thing between 30 and 70 fps ... the affects of Adaptive Sync continue on past 70 fps, but the user impact begins tailing off at this point
Free-sync does its best thing between 30 and 70 fps ... the affects of Adaptive Sync continue on past 70 fps, but the user impact begins tailing off at this point

This is where the similarity ends. G-sync includes a hardware module which is responsible for the cost difference between the two; Freesync does not have one. Th ehardware module allows you to turn off G-Sync and instead use ULMB (Ultra Low Motion Blur) . Freesync monnitpr may provide a simiar hardwarte module of their own, but results are inconsistent, So freesync does nit have motion blur reduction. Generally, if youre card(s) can maintain 75-80 fps, you should try turning G-Sync off and using ULMB. Foie the best description as to how the technologies differ that I have seen, read this:

"On the plus side, by removing the traditional scaler it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming. VIDIA G-sync screens with the hardware module generally have a nice wide variable refresh rate (VRR) range. You will often see this listed in the product spec as something like "40 - 144Hz", or confirmed via third party testing. We have seen lots of FreeSync screens, particularly from the FreeSync 1 generation, with far more limited VRR ranges. NVIDIA also seem to be at the forefront of bringing the highest refresh rate gaming monitors to market first, so you will often see the latest and greatest models with G-sync support a fair while before alternative FreeSync options become available.

The presence of this module, and absence of a traditional scaler has allowed previously much slower panels to be successfully overclocked to higher refresh rates. ..... The G-sync module allowed a very good boost in refresh rate, and some excellent performance improvements as a result. This pattern continues today, as you will often see screens featuring the G-sync module advertised with a normal "native" refresh rate, and then an overclocked refresh rate where the panel has been boosted. For instance there's quite a lot of 144Hz native screens which can be boosted to 165Hz or above thanks to the G-sync module.

From our many tests of screens featuring the hardware G-sync module, the response times of the panels and the overdrive that is used seems to be generally very reliable and consistent, producing strong performance at both low and high refresh rates. This seems to be more consistent than what we have seen from FreeSync screens so far where often the overdrive impulse is impacted negatively by changes to the screens refresh rate. NVIDIA also talk about how their G-sync technology allows for "variable overdrive" where the overdrive is apparently tuned across the entire refresh rate range for optimal performance.

G-sync modules also often support a native blur reduction mode, dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming. It cannot be used at the same time as G-sync since ULMB operates at a fixed refresh rate only, but it's a useful extra option for many of these G-sync module gaming screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works with specific G-sync compatible NVIDIA graphics cards. While you can still use a G-sync monitor from an AMD/Intel graphics card for other uses, you can't use the actual G-sync or ULMB functions.

It should be noted that the real benefits of G-sync really come into play when viewing lower frame rate content, around 45 - 60fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature if it's been included, which is not available when using G-sync. Higher end gaming machines might be able to push out higher frame rates more consistently and so you might find less benefit in using G-sync. The ULMB could then help in another very important area, helping to reduce the perceived motion blur caused by LCD displays. It's nice to have both G-sync and ULMB available to choose from certainly on these G-sync enabled displays. "

2. To my eyes ... I have not seen a 4K screen of any size that checks all the boxes ... As of yet, there's not a GFX card that can adequately drive them under ULMB at 120 Hz (165 Hz G-Sync). Until 4K screens can do ULMB it's not a purchase I would personally consider. Last I read, we don't even have a cable in production that can carry the necessary bandwidth.

3. Would I upgrade ?

Well we have different goals. Again, to my eyes, the best experience I have had in gaming has come at 1440p on a 165 Hz IPS monitor running ULMB @ 120 Hz w/ 10-bit color. At 4K you're pushing 2.25 times as many pixels, so that requires significantly more GFX horsepower to drive. Again, for me 4K 60 Hz @ 8-bit ==> 1440p 165 Hz was a H U G E upgrade

4. Be aware that advertised lag and response times are completely bogus. Seen 1 ms advertised? Go look at its test report.

Asus XG438Q was listed as a 1 ms monitor above ... well

Spec sheet says response time = 4 ms
Let's see how it tested:

Overdrive Level 4 @ 120 Hz produced low levels of overshoot (RTC Error Average = 0%) and response times of 10.2 ms ... Overdrive Level 5 @ 120 Hz dropped response time to 5.0 ms but pushed overshoot up (RTC Error Average = 15.30%)

So this gives you an idea of how fake advertised response times are ... and the lower quality the panel, the greater the level of exaggeration. In addition, this monitor suffers from significant overshoot issues, which would result in a thick red strikethrough on any potential list of acceptable choices to consider.

Samsung C49RG90 is another one I have seen recommended as a 1 ms monitor
Spec sheet says response time = 4 ms
Let's see how it tested:

Depending upon the mode chosen, there were different results ... 11.4, 11.0 and 10.1 ms ... not exactly in the neighborhood of 1.0 or 4.0, is it?

So anytime you are looking at a spec sheet and see a claim of low response times, it's not likely to be real. "Show me" should be the order of the day.
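Review sites derive those numbers from measured luminance transitions rather than the spec sheet. A rough sketch of how a grey-to-grey response time and RTC overshoot error could be computed from a captured transition; the 10%/90% thresholds follow the common measurement convention, and the sample trace is made up:

```python
def gtg_response(samples, start, target, dt_ms=1.0):
    """Return (response_time_ms, overshoot_pct) for one rising transition.
    Response time: 10% -> 90% of the luminance swing (common convention).
    Overshoot (RTC error): peak excursion past the target, as % of the swing.
    Assumes one sample per dt_ms; real photodiode captures are far denser."""
    swing = target - start
    lo, hi = start + 0.1 * swing, start + 0.9 * swing
    t10 = next(i for i, v in enumerate(samples) if v >= lo)
    t90 = next(i for i, v in enumerate(samples) if v >= hi)
    overshoot = max(0.0, (max(samples) - target) / swing * 100.0)
    return (t90 - t10) * dt_ms, overshoot

# Made-up trace for a grey 50 -> grey 200 transition with visible overshoot:
trace = [50, 60, 90, 140, 190, 230, 215, 205, 200, 200]
rt, os_pct = gtg_response(trace, start=50, target=200)
```

This also shows why "fast" overdrive settings can score well on the ms number while failing on the RTC error: the peak past the target is what you see as inverse ghosting.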

I didn't read the better half of this Great Wall, but the gist is 'response times aren't as advertised'... yes... we know, and it's common sense by now. Just like most other spec sheet numbers. Are you going to explain that dynamic contrast of 10000000:1 isn't real, either? Or that HD Ready wasn't Full HD, and... :p

There is much, much more to all those monitor specs than you can capture within the spec number itself. That is why we keep adding terminology, also at Blur Busters, to denote all those characteristics. The same goes for VRR, which is obviously a box of trickery and tweaking applied to existing technology. The approaches are slightly different, and if you look at it long enough, sure, you will see through it. It's getting to a point where one might question whether we're not too obsessed with those minute differences; we're talking about fractions of a second here.

In the end each display tech has nuances and sweet spots; that also applies to refreshes, sample-and-hold tech, the way the backlight ramps up and down, etc. The simple 'ms' number is just a tiny drop in this sea of information. In addition, almost all panels reach a baseline response time that is just fine, even lower cost panels. It is becoming OEM territory to produce 75 Hz as standard, and I reckon this will quickly move up to 90 or 120 Hz. It's already happened for OLED, and LCD can't stay behind; after all, it's an inferior tech to begin with, so they will grab every little advantage that exists to keep selling. Even phones are getting higher refreshes in more mainstream models, slowly.

That is also why you see these absurd specs pop up with 240 Hz and beyond; the VESA HDR spec likewise exists only to keep non-OLED somewhat relevant. Yes, theoretically you should see an advantage... but refresh on its own tells you just about nothing about how nice a panel is to look at overall. I would not directly categorize VRR under this as well, but it's pretty close, because it's still possible to just set a static refresh and keep FPS above or around that number, eliminating any VRR requirement or trouble in one go and with no investment.
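Those "fractions of a second" do translate directly into visible smear, which is what the persistence rule of thumb popularized by Blur Busters captures: on a sample-and-hold panel, perceived blur width is roughly motion speed times persistence, and persistence is roughly the frame time. A minimal sketch, with illustrative numbers:

```python
def motion_blur_px(refresh_hz, speed_px_per_s, strobe_duty=1.0):
    """Approximate perceived blur width on a sample-and-hold display.
    Persistence ~= frame time x backlight duty cycle (1.0 = no strobing);
    blur ~= motion speed x persistence. A simplification of the common
    'persistence' rule of thumb, ignoring pixel response time entirely."""
    persistence_s = strobe_duty / refresh_hz
    return speed_px_per_s * persistence_s

# A 960 px/s panning motion:
blur_60   = motion_blur_px(60, 960)                     # 16 px of smear
blur_120  = motion_blur_px(120, 960)                    # 8 px: double refresh halves blur
blur_ulmb = motion_blur_px(120, 960, strobe_duty=0.25)  # strobing cuts it further
```

It also shows why strobing (ULMB/MBR) beats raw refresh increases for blur alone: a short duty cycle shrinks persistence far more than going from 120 Hz to 240 Hz does.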
 
Last edited:
Joined
Nov 27, 2010
Messages
924 (0.18/day)
System Name future xeon II
Processor DUAL SOCKET xeon e5 2686 v3 , 36c/72t, hacked all cores @3.5ghz, TDP limit hacked
Motherboard asrock rack ep2c612 ws
Cooling case fans,liquid corsair h100iv2 x2
Memory 96 gb ddr4 2133mhz gskill+corsair
Video Card(s) 2x 1080 sc acx3 SLI, @STOCK
Storage Hp ex950 2tb nvme+ adata xpg sx8200 pro 1tb nvme+ sata ssd's+ spinners
Display(s) philips 40" bdm4065uc 4k @60
Case silverstone temjin tj07-b
Audio Device(s) sb Z
Power Supply corsair hx1200i
Mouse corsair m95 16 buttons
Keyboard microsoft internet keyboard pro
Software windows 10 x64 1903 ,enterprise
Benchmark Scores fire strike ultra- 10k time spy- 15k cpu z- 400/15000
OK, now I'm really psyched about this display, excited to try the new technologies like HDR, VRR, 120 Hz, G-sync. I'm definitely in for an upgrade, my display is five years old.
Never mind that I usually buy my hardware to be future proof for 10 years, as I did in 2008 with the Core i7 965 Extreme, but eventually I upgrade after five-six years, which is good as well by any standard.
 
Joined
Sep 17, 2014
Messages
22,830 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
OK, now I'm really psyched about this display, excited to try the new technologies like HDR, VRR, 120 Hz, G-sync. I'm definitely in for an upgrade, my display is five years old.
Never mind that I usually buy my hardware to be future proof for 10 years, as I did in 2008 with the Core i7 965 Extreme, but eventually I upgrade after five-six years, which is good as well by any standard.

Rightly so, there is a lot of nice tech flying around and that spec list seems sensible to me as well - though I would not expect anything of HDR if I were you. It's like 8K content: you might just stumble on some of it in your lifetime :) For all the rest, just use the regular stuff. A well calibrated standard color space is mighty fine on its own. Emphasis on well calibrated; no oversaturation and correct contrast make a huge difference. It's something you notice over time - an image that is out of balance will eventually urge you to tweak it down a bit. Best to get that right straight away with a simple calibration run, even just by the naked eye. There are sites that can help you with that; http://www.lagom.nl/lcd-test/ is one of them.

If anything, with monitors you need to have a hassle free return policy and/or try before you buy. That still works best at home in your own space, with your own lighting conditions, so you can spot nuisances like IPS glow, smearing, blur, backlight bleed and overall panel quality/uniformity. Don't hesitate to return one that has glaring issues, they will keep bugging you.
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
OK, now I'm really psyched about this display, excited to try the new technologies like HDR, VRR, 120 Hz, G-sync. I'm definitely in for an upgrade, my display is five years old.
Never mind that I usually buy my hardware to be future proof for 10 years, as I did in 2008 with the Core i7 965 Extreme, but eventually I upgrade after five-six years, which is good as well by any standard.
Funny thing is, the marketers don't know how to adapt to the new checklist. The new 1000R 'optically ergonomic' screens change everything. If display geometry is 16:9 it is just a 3% gain, but 32:9 screens are out the door and they cover 13% more screen space than a flat screen. There is a marketing gap between what is advertised and what matters. I think we will see a shift similar to how cryptomining eventually crept into the GPU sphere. They are currently going after the VESA DisplayHDR standard, which I also look forward to, however there is an even shorter path to marketing this: contrast ratios. A display never attains the suggested contrast ratio if the pixel is wrong. There is an inherent balance between how clear-cut a pixel is and how glossy the panel's glare coating can be.

VA is the break-even point, imo. It has nice blocky pixels, it has less glare because of that and also due to the screen curvature effect, and apart from viewing-angle concerns that handicap available contrast even at dead-center alignment, we could have greater HDR designations.


Apart from that, quantum color filters and quantum lightguides have the potential to increase light efficiency 3x and 2x, respectively. That would not only increase emitted light up to 6x, it would also decrease the halo effect of backlight modulation where dynamic contrast is a valid option.
So, apart from the last 2 suggestions requiring hardware changes, displays have come a long way toward the mainstream HDR targets.
 
Last edited:
Joined
Jun 1, 2011
Messages
4,723 (0.95/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
Joined
Jun 13, 2012
Messages
1,416 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Gsync has a licensing fee while freesync does not
Yes, G-sync costs more because the module has to be bought, on top of the fact that Nvidia did all the R&D work to make it work as they intended from the start. Freesync was cheaper because it was just put out as free, but the monitor makers had to do their own R&D to get it to work with little to no ghosting. It's taken a few years for freesync to get to where it is, whereas G-sync worked pretty much as intended from day 1, so you paid extra for that. If you look at early-day freesync monitors, they were also limited in many ways, like the refresh ranges they could do, but ones from the last year or two have pretty much solved that and are on par. Diss Nvidia all you want for the extra $, but it worked as they wanted from the start versus what happened with early-day freesync.
 
Last edited:
Joined
Nov 27, 2010
Messages
924 (0.18/day)
System Name future xeon II
Processor DUAL SOCKET xeon e5 2686 v3 , 36c/72t, hacked all cores @3.5ghz, TDP limit hacked
Motherboard asrock rack ep2c612 ws
Cooling case fans,liquid corsair h100iv2 x2
Memory 96 gb ddr4 2133mhz gskill+corsair
Video Card(s) 2x 1080 sc acx3 SLI, @STOCK
Storage Hp ex950 2tb nvme+ adata xpg sx8200 pro 1tb nvme+ sata ssd's+ spinners
Display(s) philips 40" bdm4065uc 4k @60
Case silverstone temjin tj07-b
Audio Device(s) sb Z
Power Supply corsair hx1200i
Mouse corsair m95 16 buttons
Keyboard microsoft internet keyboard pro
Software windows 10 x64 1903 ,enterprise
Benchmark Scores fire strike ultra- 10k time spy- 15k cpu z- 400/15000
my panel will soon land on my porch, so I have been looking into what to expect.


This article shows 1080 Ti SLI in high-refresh-rate comparison benchmarks. Having 1080 Superclocked cards in SLI, will I see smooth gameplay with G-sync on my display at, say, 90~100 fps, even if the display runs @ 120 Hz? Or might it stutter? I also wonder if 10-bit color will matter for 4K gaming, albeit not @ 120 Hz HDR but lower due to connection bandwidth issues; DSC is not present on this display, and I won't shell out $1500 for it, the newer panel also not being in stock for now.
some info I found to answer some of these questions:

https://www.reddit.com/r/nvidia/comments/4noifn
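The bandwidth worry above can be checked with back-of-the-envelope math. This sketch counts active pixels only (blanking intervals push real requirements roughly 10% higher), and the link figures are the nominal DP 1.4 HBR3 and HDMI 2.0 raw rates reduced by 8b/10b encoding overhead:

```python
def required_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed video bandwidth for active pixels only, in Gbit/s.
    Ignores blanking, so treat the result as a lower bound."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

need = required_gbps(3840, 2160, 120, 10)   # 4K 120 Hz 10-bit RGB: ~29.9 Gbps
dp14_payload   = 32.4 * 8 / 10              # HBR3 after 8b/10b: ~25.9 Gbps
hdmi20_payload = 18.0 * 8 / 10              # HDMI 2.0 after 8b/10b: 14.4 Gbps
fits_dp14 = need <= dp14_payload            # False: needs DSC or 4:2:2/4:2:0
```

So 4K 120 Hz at 10-bit RGB genuinely does not fit DP 1.4 without DSC or chroma subsampling, which matches the post's reasoning for dropping to a lower refresh or bit depth.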
 
Joined
Mar 18, 2015
Messages
2,963 (0.83/day)
Location
Long Island
I didn't read the better half of this Great Wall, but the gist is 'response times aren't as advertised'... yes... we know, and it's common sense by now. Just like most other spec sheet numbers. Are you going to explain that dynamic contrast of 10000000:1 isn't real, either? Or that HD Ready wasn't Full HD, and... :p

Why? ... I was responding only to the thread topic's inclusion of the words "ghosting, motion blur, imperfections", each of which is impacted by refresh rates ... and to monitor recommendations which presented false numbers.

So, looking at the subject budget VA example, the fastest overdrive at 144 Hz rounds to 8.1 ms if I'm correct. There is a potential 2 ms benefit if we were to update the refresh rate to 240 Hz, so I would be inclined to say 8 ms > 6 ms perceived MPRT is okay to go forward with.

It's not a matter of being correct, it's simply not relevant ... using OD has impacts, unpleasant impacts ... and when that's the case, as it is here, the results only count if they are for the settings you are going to use. If the fastest OD has issues, you are not going to use it, so its results at that setting are irrelevant. Also, you don't get average response time by picking a point in the middle between lowest and highest.



144 Hz (165 w/ OC) monitors with AU Optronics panels, as used in the Acer XB271HU and Asus PG279Q, routinely have faster response times than those other panels, with an average of 5.0 ms, ranging from lows of 4.0 to highs of 6.5.
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
It's not a matter of being correct, it's simply not relevant ... using OD has impacts, unpleasant impacts ... and when that's the case, as it is here, the results only count if they are for the settings you are going to use. If the fastest OD has issues, you are not going to use it, so its results at that setting are irrelevant. Also, you don't get average response time by picking a point in the middle between lowest and highest.



144 Hz (165 w/ OC) monitors with AU Optronics panels, as used in the Acer XB271HU and Asus PG279Q, routinely have faster response times than those other panels, with an average of 5.0 ms, ranging from lows of 4.0 to highs of 6.5.
It just needs the 'average' because it takes the average into consideration. The IPS panels are all in the clear; it is only VA panels that have response times slower than the 7 ms frame period. Obviously, there are some shortcomings. However, MBR supposedly lets it proceed unnoticed. I suppose that holds for RTC errors, not the slowness of response times.
Since the MBR mode is an extension of the fastest overdrive mode, separate from VRR mode, the toggle between the two is very straightforward.
 
Joined
Aug 20, 2007
Messages
21,589 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Gsync and Freesync aren't really needed for monitors with high refresh rate. There is no tearing when FPS is lower than refresh rate.

I mean there is, but some don't see it. You are apparently in that group. It's not like this was born out of an absolute lack of need.
 

Regeneration

NGOHQ.COM
Joined
Oct 26, 2005
Messages
3,134 (0.45/day)
I mean there is, but some don't see it. You are apparently in that group. It's not like this was born out of an absolute lack of need.

It took a while until LCDs broke the 60 Hz, 5 ms barrier. Tearing and ghosting were a big deal then.

Current 144hz+ 1ms GTG monitors refresh so quickly you don't notice any tearing. Even at 300 fps.
 
Joined
Aug 20, 2007
Messages
21,589 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Current 144hz+ 1ms GTG monitors refresh so quickly you don't notice any tearing. Even at 300 fps.

Again, this is your experience. It is not globally accepted truth.
 
Joined
Mar 10, 2015
Messages
3,984 (1.11/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
This is where the similarity ends. G-sync includes a hardware module which is responsible for the cost difference between the two; Freesync does not have one. The hardware module allows you to turn off G-Sync and instead use ULMB (Ultra Low Motion Blur). Freesync monitors may provide a similar mode of their own, but results are inconsistent, so Freesync does not have motion blur reduction.

So my Freesync monitor with MBR doesn't have MBR? I'm confused.
 
Joined
Jul 19, 2020
Messages
35 (0.02/day)
Gsync and Freesync aren't really needed for monitors with high refresh rate. There is no tearing when FPS is lower than refresh rate.
hey man, i just registered so i can say how wrong you are, gsync/freesync is a godsend, and if you don't see the difference between it on/off there's something wrong with you
 
Joined
Jun 29, 2009
Messages
2,012 (0.35/day)
Location
Heart of Eutopia!
System Name ibuytheusedstuff
Processor 5960x
Motherboard x99 sabertooth
Cooling old socket775 cooler
Memory 32 Viper
Video Card(s) 1080ti on morpheus 1
Storage raptors+ssd
Display(s) acer 120hz
Case open bench
Audio Device(s) onb
Power Supply antec 1200 moar power
Mouse mx 518
Keyboard roccat arvo
not necessary to talk like this just because ya have a different opinion.

i too think that this is mostly beneficial for low frames in slower paced and/or strategy games, but it's the same discussion as vsync on versus off, or do i need 120 fps at 120 Hz to see it.

we all see/hear/smell differently.
 
Joined
Aug 6, 2017
Messages
7,412 (2.73/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I don't get this thread
First, how is blur or any other motion imperfection gonna get worse coming from 60 to 120 Hz?
Second, half of this thread is a guy who says adaptive sync is not needed because having unsynced frames doesn't produce tearing.

only thing I'd be worried about is VAs
most gaming VAs are still smearing

z27.jpg
 
Last edited:
Joined
Nov 27, 2010
Messages
924 (0.18/day)
System Name future xeon II
Processor DUAL SOCKET xeon e5 2686 v3 , 36c/72t, hacked all cores @3.5ghz, TDP limit hacked
Motherboard asrock rack ep2c612 ws
Cooling case fans,liquid corsair h100iv2 x2
Memory 96 gb ddr4 2133mhz gskill+corsair
Video Card(s) 2x 1080 sc acx3 SLI, @STOCK
Storage Hp ex950 2tb nvme+ adata xpg sx8200 pro 1tb nvme+ sata ssd's+ spinners
Display(s) philips 40" bdm4065uc 4k @60
Case silverstone temjin tj07-b
Audio Device(s) sb Z
Power Supply corsair hx1200i
Mouse corsair m95 16 buttons
Keyboard microsoft internet keyboard pro
Software windows 10 x64 1903 ,enterprise
Benchmark Scores fire strike ultra- 10k time spy- 15k cpu z- 400/15000
i just might live most of the time in 100hz territory of the screen, to avoid overshoot, not sweat my gpus, and not lose 10 bit color....
certainly on desktop
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
I don't get this thread
First, how is blur or any other motion imperfection gonna get worse coming from 60 to 120 Hz?
Second, half of this thread is a guy who says adaptive sync is not needed because having unsynced frames doesn't produce tearing.

only thing I'd be worried about is VAs
most gaming VAs are still smearing

View attachment 162604
That list is not up to date. You'll have to look for the budget contest review, which makes up for it with the fastest VA yet reviewed on TFT Central - the 2458-C.
Also, what is so hard to get about dropping VRR? VA is only fast enough in its fastest duty cycle.
 
Top