I can only see G-Sync being useful or desired by consumers if it does the following:
1. Increased Frame Rate Performance. If this gimmick actually increased your FPS, as if there were some form of loss or leakage in performance that G-Sync prevented from happening. For example, instead of getting 60 FPS on Tomb Raider with maxed-out graphics settings, you're only getting 32 FPS, and G-Sync would push it closer to 60 FPS.
To imply that NVidia users need a hardware component in the monitor to improve video fidelity also implies that NVidia cards still suffer from their own form of micro stutter and screen tearing.
I want to clarify what G-Sync is and why it is different from the other technologies other posters have presented.
G-Sync is meant to be used in conjunction with frame pacing; the two solve different problems. G-Sync does not address an NVidia-specific issue; it addresses one faced by any display with a fixed refresh rate. Monitors currently draw only whole frames, at a fixed interval. Therefore, if the frame rate output by your video card is lower than your monitor's refresh rate, some frames will be duplicated and others will not be, resulting in judder.
Let me explain the difference using an example where you are displaying a game at 45fps on a 60Hz monitor. This should be helpful to anyone who is confused about the purpose of G-Sync. Let's look at a small portion of this scenario (1/15 of a second, or four 60Hz refreshes).
Scenario 1 has no frame pacing or G-Sync:
2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
16.7ms - Frame 2 is displayed on the monitor since it is the most recent frame
33.4ms - Frame 2 is displayed on the monitor again since it is still the most recent frame
35ms - Frame 3 is written to the video buffer
50ms - Frame 3 is displayed on the monitor since it is the most recent frame
66.7ms - Frame 3 is displayed on the monitor again since it is still the most recent frame
In Scenario 1, the effective frame rate is 30fps since only two distinct frames were displayed by the monitor. The frames were displayed for identical periods of time, so the framerate appears smooth.
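If it helps to see it play out, here is a minimal Python sketch of Scenario 1. The frame completion times are just the hypothetical 2ms/4ms/35ms values from this example, and the code is only a toy model of the idea, not anything from an actual driver: with no frame pacing, each 60Hz refresh simply shows whichever frame finished most recently.

```python
# Toy model of Scenario 1 (no frame pacing, no G-Sync).
frame_times = {1: 2.0, 2: 4.0, 3: 35.0}  # frame number -> completion time (ms)

refresh_interval = 1000 / 60  # ~16.7ms between 60Hz refreshes

for tick in range(1, 5):
    now = tick * refresh_interval
    # The monitor scans out whichever completed frame is newest.
    newest = max(f for f, t in frame_times.items() if t <= now)
    print(f"{now:.1f}ms: display Frame {newest}")
# -> Frame 2, Frame 2, Frame 3, Frame 3 (only two distinct frames shown = 30fps)
```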
Scenario 2 has frame pacing but no G-Sync:
2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
16.7ms - Frame 1 is displayed on the monitor since it is the oldest frame in the video buffer
33.4ms - Frame 2 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 1, and Frame 1 is deleted from the video buffer
35ms - Frame 3 is written to the video buffer
50ms - Frame 3 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 2, and Frame 2 is deleted from the video buffer
66.7ms - Frame 3 is displayed on the monitor again since no newer frames are available
In Scenario 2, the effective frame rate is 45fps since three distinct frames were displayed by the monitor. However, the third frame was displayed for twice as long as either of the first two, so judder is experienced.
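The same sketch with frame pacing bolted on (again, a toy model using the example's hypothetical times): each 60Hz refresh now shows the oldest frame that hasn't been displayed yet, instead of the newest one available.

```python
# Toy model of Scenario 2 (frame pacing, no G-Sync).
frame_times = {1: 2.0, 2: 4.0, 3: 35.0}  # frame number -> completion time (ms)

refresh_interval = 1000 / 60  # ~16.7ms between 60Hz refreshes
last_shown = 0                # number of the most recently displayed frame

for tick in range(1, 5):
    now = tick * refresh_interval
    # Frames that are finished but have not been shown yet.
    ready = [f for f, t in frame_times.items() if t <= now and f > last_shown]
    if ready:
        last_shown = min(ready)  # oldest undisplayed frame
    print(f"{now:.1f}ms: display Frame {last_shown}")
# -> Frame 1, Frame 2, Frame 3, Frame 3 (Frame 3 held twice as long = judder)
```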
Scenario 3 has frame pacing and G-Sync:
0ms - G-Sync realizes that the graphics card is generating frames at an average of 45fps and adjusts the monitor's refresh rate to 45Hz
2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
22.2ms - Frame 1 is displayed on the monitor since it is the oldest frame in the video buffer
35ms - Frame 3 is written to the video buffer
44.4ms - Frame 2 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 1, and Frame 1 is deleted from the video buffer
66.6ms - Frame 3 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 2, and Frame 2 is deleted from the video buffer
In Scenario 3, the effective frame rate is 45fps since three distinct frames were displayed by the monitor. In addition, all three frames were displayed for equal amounts of time, so no judder is experienced.
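And a sketch of Scenario 3, same hypothetical numbers: once the refresh interval is stretched to match the ~45fps output (1000/45 ≈ 22.2ms), each refresh picks up the next frame in order, so every frame gets equal screen time. This is only a toy model of the timeline described above, not how the actual G-Sync hardware works internally.

```python
# Toy model of Scenario 3 (frame pacing plus a variable refresh rate).
frame_times = {1: 2.0, 2: 4.0, 3: 35.0}  # frame number -> completion time (ms)

refresh_interval = 1000 / 45  # ~22.2ms once the refresh rate tracks the frame rate

for tick, frame in enumerate(sorted(frame_times), start=1):
    now = tick * refresh_interval
    # In this example every frame is ready before its refresh, so each one
    # is shown exactly once and held on screen for the same length of time.
    print(f"{now:.1f}ms: display Frame {frame}")
# -> Frame 1 at 22.2ms, Frame 2 at 44.4ms, Frame 3 at 66.7ms (no judder)
```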
The whole point of this is to have a variable refresh rate on the monitor. Standard frame pacing can't adjust the monitor's refresh rate; it only helps to make sure that all the frames generated by the graphics card are displayed.
Nothing currently available can remove all judder when the frame rate is different from the monitor's refresh rate. This is what G-Sync aims to solve.
G-Sync won't improve frame rate, but it will make lower frame rates look better. This is something that cannot be solved with frame pacing alone.
Wasn't Adaptive V-Sync supposed to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...
Adaptive V-Sync was meant to reduce the performance impact of V-Sync; it doesn't change anything visually. The idea behind Adaptive V-Sync is that when the frame rate output of your graphics card drops below your monitor's refresh rate, V-Sync just forces the card to wait on the next refresh, which can push the effective frame rate even lower. Adaptive V-Sync turns V-Sync off in this situation, so you get that performance back when the frame rate is low.
See http://www.geforce.com/hardware/technology/adaptive-vsync/technology
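As a rough sketch of that on/off rule (purely illustrative, the function is hypothetical and not NVidia's actual driver logic): V-Sync stays on while the card keeps up with the monitor and is switched off when the frame rate falls below the refresh rate.

```python
def adaptive_vsync_enabled(current_fps: float, refresh_rate_hz: float) -> bool:
    # Keep V-Sync on only while the card can match the monitor's refresh rate.
    return current_fps >= refresh_rate_hz

print(adaptive_vsync_enabled(75, 60))  # True  -> V-Sync on, prevents tearing at high fps
print(adaptive_vsync_enabled(45, 60))  # False -> V-Sync off, no waiting on the next refresh
```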