You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when the GPU renders a frame and when the next monitor refresh comes and that frame is displayed. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur in the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big of an issue as a frame being displayed at the wrong time.
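To put rough numbers on those definitions, here's a toy simulation of my own (made-up frame times, all the names and numbers are mine, not anything from NVidia's material): a fixed 60Hz monitor shows the newest completed frame at each refresh, "lag" is how stale that frame is, and "stutter" is the variance of that lag. Two sources delivering the same 60fps can have very different stutter if one paces its frames unevenly:

    # "lag"     = time from when a frame finishes rendering to the refresh that displays it
    # "stutter" = variance in that lag from refresh to refresh (NVidia's usage, as I read it)
    refresh_hz = 60.0
    refresh_times = [i / refresh_hz for i in range(1, 61)]   # one second of fixed 60Hz refreshes

    def lags(frame_done_times, refresh_times):
        """Lag at each refresh = refresh time minus completion time of the newest frame ready."""
        out = []
        for r in refresh_times:
            ready = [f for f in frame_done_times if f <= r]
            if ready:                       # no new frame ready -> old frame repeats, skip sample
                out.append(r - max(ready))
        return out

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    even   = [i / 60.0 + 0.003 for i in range(60)]                          # steady 60fps pacing
    uneven = [i / 60.0 + (0.001 if i % 2 else 0.012) for i in range(60)]    # uneven 60fps pacing

    print(variance(lags(even, refresh_times)))     # ~0: constant lag, no perceived "stutter"
    print(variance(lags(uneven, refresh_times)))   # clearly non-zero: same fps, visible "stutter"

Both streams deliver exactly the same number of frames per second; only the unevenly paced one produces the timing mismatch you actually notice.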
I hope you understand that this proves exactly what you said about running Quake III at an insanely high frame rate; this reduces "lag", i.e. the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync achieves the same lag reduction without the need to waste GPU power.
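Some back-of-the-envelope numbers for that Quake III point (hypothetical render rates of my own choosing, assuming frames finish at a perfectly steady pace and a 60Hz monitor simply shows the newest completed frame at each refresh):

    refresh_hz = 60

    for render_fps in (60, 125, 300, 1000):        # hypothetical Quake III frame rates
        avg_lag_ms = 1000.0 / (2 * render_fps)     # newest frame is ~half a render interval old
        wasted = max(0, render_fps - refresh_hz)   # frames rendered but never displayed
        print(f"{render_fps:>4} fps: ~{avg_lag_ms:4.1f} ms average lag, ~{wasted} frames/s thrown away")

So brute-forcing 300+ fps does shrink the lag, but most of that work goes straight in the bin; G-Sync's pitch is to put the frame on screen as soon as it is ready instead.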
That's because Nvidia are yet again using ideal world scenarios for practical demos.
My entire point was that G-sync fixes nothing, which is still true -- after watching that presentation, I'm even more sure than before. The side-by-side comparisons even show the monitor without G-sync tearing but displaying frames faster than the one with G-sync, which was clearly skipping/jumping frames and "jittering". Watching a pendulum on a screen (Nvidia's pointless demo) or the pointless slow-turning Borderlands 2 demonstration is of no value whatsoever -- I'd love to see someone try using G-sync in a fast-motion FPS like BF3 and see just how much input lag it adds to the already delayed engine that game uses. That G-sync module is nothing more than a hardware-based v-sync framebuffer with extra memory for the monitor -- maybe worth $30 on its best selling day.
This is not even counting the fact that this entire problem of tearing is non-existent on fast 120Hz+ panels, with the exception of a few games running on old engines that suffer from uneven frame pacing in general, regardless of whether they run on one or two graphics chips (case in point, COD4).
The only scenario I can imagine where this G-sync module would be of any use is in multi-monitor setups where frames may be fed unevenly to each monitor -- but then the question is, is that a problem worth shelling out $175 per monitor for? Absolutely not, and anybody disagreeing with that is insane ($175 apiece for a three-monitor setup is $525, not even including the GPUs or any of the monitors).
For some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.
I'm not sure what you're trying to say about rendering 60fps on a 120Hz screen? Yeah, you'll see constant judder (I tried it with the half-refresh-rate vsync option in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. It depends a bit on the monitor, the game, your eyes, lighting, etc. But basically, you see constant judder because the movement only advances every other refresh.
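To spell out what I mean by the movement only advancing every other refresh, here's a toy timeline (made-up numbers, nothing measured from any real monitor or game):

    refresh_hz, render_fps = 120, 60                   # half-rate vsync: one new frame per two refreshes
    step = 10                                          # arbitrary distance the object moves per rendered frame

    for refresh in range(8):
        frame = refresh * render_fps // refresh_hz     # each rendered frame is scanned out twice
        print(f"refresh {refresh}: frame {frame}, object at x={frame * step}")
    # x goes 0,0,10,10,20,20,... -> the image only steps forward on every other refresh,
    # which is the "doubled image"/judder effect I'm describing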
I found the Nvidia G-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit, which applies to any form of sampling. It's also an edited version, at 24 minutes.
NVIDIA 4K and G-Sync Demo - YouTube
See my above answer. I saw that demo in full and it proves my point further that it will not improve anything -- certainly nothing worth its price, and certainly not on any decent TN gaming monitor available these days. His entire argument is that excessive/delayed frames lead to uneven draws by the monitor, so his G-sync module merely gives the monitor an extra-large frame buffer that is fed once each frame is ready from the GPU, as well as some proprietary calls from G-sync to the GPU to stop it rendering more frames -- at best a gimmicky solution worth $30, and again nothing revolutionary or worth writing home about.
Now let's look at the real-world behaviour of this half-arsed solution -- a capped frame rate means a light GPU load (on Kepler GPUs, which rely almost entirely on boost under load to hit their advertised clocks), which means the GPU sits in a less-than-optimal power state and downclocks; then, when more complex scenes are rendered, it struggles with the load and has to clock back up, causing a delay and therefore stutter (rinse and repeat). This is going to need a lot of driver-side support on a title-by-title basis to work properly, and I seriously doubt they are going to dedicate many -- if any -- man-hours to making it work. That makes it nothing more than a gimmick in my book, and an insanely overpriced one at that, and I have yet to be proven wrong on this, unfortunately -- in practice or in theory.