
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Are you looking to get the 780 Ti now, if you're interested in G-SYNC?

Maybe. I also want to try out LightBoost. It all depends on price/performance though; I've been very happy with my 7970 and I'm not partial to any brand. Thing is, it wouldn't make sense for me to buy a new monitor.
 

Yeah, it wouldn't. I want to see Asus release a 27" 1440p IPS with G-SYNC; then I would most definitely be interested.
 
I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.

I'll second that. This is how I splashed out £400 on 3D Vision (glasses and monitor) way back in 2009, running it off my GTX 285 at the time.

I remember "just happening" to go round to Novatech and checking it out. I barely even played the game and within 5 minutes they had my money. :laugh: Even my friend who knows nothing about computers and doesn't do gaming was impressed with it.

I don't doubt that G-Sync will deliver a similar kind of awesome which will be more in the way it feels when you control the action with the keyboard and mouse than anything else.

EDIT: you'll love LightBoost and you don't even need an NVIDIA card with the ToastyX utility, either.
 
I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor to the GPU, i.e. the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute-forcing the problem with a large excess of frames per second rendered by the GPU can really help, even though the frame times are still variable.
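
To put a rough number on that brute-force effect (a back-of-the-envelope illustration, not anything NVIDIA has published): with vsync off and the GPU free-running, the newest completed frame is on average about half a frame interval old whenever the monitor grabs it, so piling on frames directly shrinks that age.

Code:
# Illustrative only: average "age" of the newest completed frame at the moment the
# monitor samples the GPU's output, assuming the GPU free-runs (vsync off) and is
# not locked to the monitor's refresh. Expected age is ~half the frame interval.
for fps in (60, 120, 300, 600):
    age_ms = 1000.0 / (2 * fps)
    print(f"{fps:4d} fps -> newest frame is ~{age_ms:.2f} ms old on average")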

Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it. As it so happens now, we can only speculate on how useful it will be and that's exactly what I'm doing.

I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have an even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, so it does not fix anything on a practical level if you're capping the GPU to render just "enough" frames and changing the refresh rate accordingly. Go and run COD4 with a max FPS cap of, say, 60. Run it with that cap at a refresh rate of 120Hz and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same thing this monitor offers, minus the gigantic price tag (none whatsoever).

Your last statement contradicts your previous one, from what I read. I'm giving you this example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work. You cannot have one without the other: frame rate synced to refresh rate means uneven GPU load, so you still get stuttering and tearing; the normal way -- render as many frames as you can -- means maximum possible GPU load, which lessens the tearing/stuttering but then becomes directly linked to monitor tearing if the refresh rate is too low. No add-in monitor circuit board will un-link the effects of this.

And the above does not even take into account that a lot -- and I do mean A LOT (the vast majority) -- of old competitive games, like Quake III and Unreal, increase movement speed and/or sensitivity as the frame rate gets higher, due to the way the old engines work, so Mr Carmack, of all people, should know better than to endorse this worthless crap. Any of the old-school competitive gamers will give you exactly the same reasons for why it won't work.
 

I think your example is unrealistic. Nvidia never said that your solution of a locked maximum frame rate wouldn't achieve a similar goal. Instead, they said it was a bad solution because it requires developers to build games with very little detail so that they never drop below a monitor's refresh rate. This isn't practical since any game has some scenes more complicated than others, and it makes no sense to run the GPU at 1/4 load 99.9% of the time just so that it never drops below the monitor's refresh rate the remaining 0.1% of the time.
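
To put that headroom argument in numbers (purely illustrative, reusing the 1/4-load figure from the paragraph above):

Code:
# Illustrative numbers only: if the rare worst-case scene costs ~4x the GPU work of a
# typical one, sizing detail so the worst case still fits one 60 Hz refresh leaves the
# GPU around 25% loaded the rest of the time.
refresh_budget_ms = 1000.0 / 60.0        # 16.67 ms per frame at 60 Hz
worst_case_ms = refresh_budget_ms        # the worst case must just fit the budget
typical_ms = worst_case_ms / 4.0         # assumed: a typical scene is 4x cheaper
print(f"typical frame: {typical_ms:.2f} ms of a {refresh_budget_ms:.2f} ms budget "
      f"({typical_ms / refresh_budget_ms:.0%} GPU load)")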

You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when the GPU finishes rendering a frame and when the next monitor refresh comes along and displays it. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-Sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur in the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big an issue as a frame being displayed at the wrong time.

I hope you understand that this fits exactly with what you said about running Quake III at an insanely high frame rate: doing so reduces "lag", the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync delivers the same lag reduction without the need to waste GPU power.
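
That lag/stutter distinction can be made concrete with a toy simulation (my own sketch with invented frame times; it ignores scanout time and the panel's maximum refresh rate, and is not NVidia's model):

Code:
# Toy model: "lag" = time from frame completion to display, "stutter" = variation in
# that lag. Frame times are invented (12-24 ms, i.e. uneven GPU load).
import math
import random
import statistics

def simulate(variable_refresh, frames=5000, refresh_hz=60.0, seed=1):
    random.seed(seed)
    refresh_dt = 1.0 / refresh_hz
    t, lags = 0.0, []
    for _ in range(frames):
        t += random.uniform(0.012, 0.024)                    # uneven GPU frame times
        if variable_refresh:
            display = t                                      # panel refreshes when the frame is ready
        else:
            display = math.ceil(t / refresh_dt) * refresh_dt # wait for the next fixed refresh
        lags.append(display - t)
    return 1000 * statistics.mean(lags), 1000 * statistics.pstdev(lags)

for label, variable in (("fixed 60 Hz refresh", False), ("refresh-on-demand", True)):
    mean_lag, stutter = simulate(variable)
    print(f"{label:20s} mean lag {mean_lag:5.2f} ms, lag variation {stutter:5.2f} ms")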
 
Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it.

Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.


I'm not sure what you're trying to say about rendering 60fps on a 120Hz screen? Yeah, you'll see constant judder (I tried it with the half-refresh vsync option in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than judder. It depends a bit on the monitor, the game, your eyes, lighting etc., but basically you see constant judder because the movement only advances every other refresh.
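
The "movement only every other refresh" effect is easy to see just by counting refreshes (no assumptions beyond 60 fps content on a 120 Hz panel):

Code:
# 60 fps content on a 120 Hz panel: each rendered frame covers two refreshes,
# so on-screen motion only advances on every other scan.
refresh_hz, content_fps = 120, 60
for scan in range(8):
    frame_shown = scan * content_fps // refresh_hz
    print(f"refresh {scan}: showing rendered frame {frame_shown}")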

I found the NVIDIA g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples the GPU's output, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube
 

I get it, but I don't think it's the game changer some are saying, and IMHO it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.

What I'm saying is that 2-3 high-end cards like the Titan, plus three g-spot (like NV didn't want and expect this nickname) enabled monitors, puts this tech out of reach of all but the highest-paid or most enthusiastic PC gamers.
Niche, that's it.
 

Whether g-spot is a game changer remains to be seen, I quite agree. However, my feeling on it is that it will be. We'll soon know for sure.

The improvement is equally good on any monitor and resolution configuration as far as I can see. However, I'd have to compare them to really disagree with your point. :)

I do find it ironic that when the GPU is rendering faster than the monitor's highest refresh rate, say 120Hz, g-spot works just like normal vsync would, lol. A modern PC with a decent graphics card or cards will often achieve this, especially when the game is an older one.
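
That fallback can be written down in a couple of lines (a toy model of how I read the posts in this thread, not NVIDIA's actual implementation): the panel follows the GPU but never refreshes faster than its own maximum rate, which is exactly the pacing a vsync cap gives you when the GPU is quicker than the panel.

Code:
# Toy model, not NVIDIA's implementation: refresh when a frame is ready, but never
# sooner than the panel's minimum refresh interval. A GPU that is always faster than
# the panel therefore gets paced at the maximum refresh rate, just like vsync.
MAX_REFRESH_HZ = 120.0
MIN_REFRESH_INTERVAL = 1.0 / MAX_REFRESH_HZ

def next_refresh(frame_ready_time, last_refresh_time):
    return max(frame_ready_time, last_refresh_time + MIN_REFRESH_INTERVAL)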

Oh and it's expensive? Never! :laugh:
 
You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when the GPU finishes rendering a frame and when the next monitor refresh comes along and displays it. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-Sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur in the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big an issue as a frame being displayed at the wrong time.

I hope you understand that this fits exactly with what you said about running Quake III at an insanely high frame rate: doing so reduces "lag", the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync delivers the same lag reduction without the need to waste GPU power.

That's because Nvidia are yet again using ideal world scenarios for practical demos.

My entire point was that G-Sync fixes nothing, which is still true -- after watching that presentation, I'm even more sure than before. The side-by-side comparisons even show the monitor without G-Sync tearing but displaying things faster than the one with G-Sync, which was clearly skipping/jumping frames and "jittering". Watching a pendulum on a screen (Nvidia's pointless demo) or the pointless slow-turning Borderlands 2 demonstration is of no value whatsoever -- I'd love to see someone try G-Sync in a fast-motion shooter like BF3 and see just how much input lag it adds to the already delayed engine that game uses. That G-Sync module is nothing more than a hardware-based vsync frame buffer with extra memory for the monitor -- maybe worth $30 on its best selling day.

This is not even including the fact that this entire problem of tearing is non-existent on fast 120Hz+ panels, with the exception of a few games that run on old engines and suffer from uneven frame pacing in general, regardless of whether they run on one or two graphics chips (case in point, COD4).

The only imaginable scenario I can think of where this G-Sync module would be of any use is in multi-monitor setups where the frames may be fed unevenly to each monitor -- but then the question is, is it a problem worth shelling out $175 per monitor for? Absolutely not, and anybody disagreeing with that is insane ($175 apiece for a 3-monitor setup is $525, not even including the GPUs or the monitors themselves).


See my above answer. I saw that demo in full and it proves my point further that it will not improve anything -- certainly not at its price, or on any decent TN gaming monitor available these days. His entire argument is that excessive/delayed frames lead to uneven draws by the monitor, so his G-Sync module merely gives the monitor an extra-large frame buffer that is fed once each frame is ready from the GPU, plus some proprietary calls from G-Sync to stop the GPU rendering more frames -- at best a $30 gimmicky solution, and again, nothing revolutionary or worth writing home about.

Now let's look at the real-world case of this half-arsed solution: a capped frame rate means a light GPU load (on Kepler GPUs, which rely almost entirely on load to run at advertised clocks), which means the GPU sits in a less-than-optimal power state and downclocks; then, when more complex scenes come along, it struggles with the load and has to clock back up, causing a delay and therefore stutter (rinse and repeat). This is going to need a lot of driver-side support on a title-by-title basis to work properly, and I seriously doubt they are going to dedicate many -- if any -- man-hours to making that happen. That makes it nothing more than a gimmick in my book, and an insanely overpriced one at that, and I have yet to be proven wrong on this, unfortunately -- in practice or in theory.
 
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card, that's all. The monitor simply continues to show the same picture until it gets another lot of display data -- that's how LCD monitors work already. The only difference is that the fresh display data currently arrives at regular intervals, whereas with G-Sync on it arrives at irregular intervals.

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it: the video itself is captured and played back at a fixed frame rate, so it can't reproduce what a variable refresh looks like in person.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.
 
The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.
 
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync?

He might be referring to G-Sync on-board memory

The pictures show that the FPGA is paired with a trio of 2Gb DDR3 DRAMs, giving it 768MB of memory for image processing and buffering.
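
The memory figure quoted there checks out, for what it's worth (Gb = gigabits, MB = megabytes):

Code:
# 3 chips x 2 gigabits each = 6 gigabits; divide by 8 bits per byte = 768 MB
print(3 * 2 * 1024 // 8, "MB")   # -> 768 MB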
 
The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.

Good point. In the end it's always swings and roundabouts, lol.

He might be referring to G-Sync on-board memory

Duh, I missed that. :) I'd love to see a white paper on G-Sync explaining all the technical details of it.
 
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync?

In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:

http://images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg


From my understanding, the G-Sync module waits for the GPU to complete a frame, holds it in its own buffer until the monitor has fully completed the last frame, then passes it on to the monitor for another full draw in order to prevent tearing, and so on.
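
For what it's worth, that described flow can be sketched in a few lines (this is the poster's reading paraphrased as code, not a confirmed description of how the G-Sync module actually works):

Code:
# Sketch of the flow described above (the poster's model, not a confirmed implementation):
# hold the completed frame in the module's memory until the panel finishes its current
# draw, then hand it over for the next full scan-out.
def handoff(frame_ready_time, panel_busy_until, scanout_time):
    start = max(frame_ready_time, panel_busy_until)   # wait for the previous draw to finish
    return start, start + scanout_time                # (draw starts, draw completes)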

By all means, explain to me which "technical details" I'm not understanding that you are, or how you think it works, because I'm pretty certain I understand it perfectly.

He might be referring to G-Sync on-board memory

Bingo.
 

Ok, so I missed the bit about the memory buffer on the G-Sync module, but that doesn't actually change the principle of what I'm saying. As the NVIDIA CEO said himself, the system is simple in principle, but complex to implement properly in practice. It's similar to jet engines: not too complex in principle, but fiendishly difficult to make one that works properly.

All that diagram shows is how things currently work with vsync off and a standard monitor. Of course you see tearing. Why didn't you show the one with the irregular GPU outputs that the monitor syncs to with G-Sync on? That would have been much more relevant.

Another way to think about G-Sync is Adaptive vsync without the tearing, although there are important subtleties there, such as the reduction in latency.

One thing to realize is that if the GPU is putting out frames faster than the fastest refresh of the monitor (say, 144Hz), the system goes back to a standard vsync-on arrangement, i.e. as if G-Sync weren't there and the GPU reverts to being synced with the monitor. This will typically be the case when playing old games on modern hardware.

However, with modern demanding games and high resolutions we know that a solid 144Hz cannot be maintained. That's where G-Sync syncs the monitor to the GPU, giving the advantages I explained previously. If you want me to repeat it all here, you're out of luck. We're going round in circles already.
 
Re:

So, simple and short: G-Sync is going to regulate the refresh rate of the monitor so there aren't any dropped frames. The monitor is told by G-Sync when a frame is coming in, because it is communicating with the GPU. A frame gets sent by the GPU to the monitor, and the frame gets properly placed into a 16.667 ms scan window... Still reminds me of the Dynamic Frame Control in the RadeonPro beta for AMD users. It's just something that's being done at a hardware level versus a software level.

So the issue is still the monitor's static refresh rate. Seems like it would be better if display manufacturers tried to increase the refresh rate above 144 Hz and make it dynamic instead of static. This would probably be more ideal for AMD graphics cards in CrossFireX than NVidia's SLI; CrossFireX has a bad habit of shooting frames out like a Gatling gun with more than two GPUs. Maybe add a secondary frame buffer to the monitor in case the frame rate drops under 30 fps (frame times of 33.33 ms and above). That way the previous frame could be stored until a new frame arrives, or have a component in the monitor that skips a scan with the previous scan still displayed...

I still think the EIZO FORIS FG2421 240 Hz gaming monitor is more innovative than NVidia G-Sync. 120 Hz scans with a black-out period after the frame is displayed in each 8.33 ms window seems a lot more... creative. I may invest in this monitor or three.
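
For reference, the scan-window figures quoted in this post all come from the same arithmetic (1000 ms divided by the refresh rate):

Code:
# length of one scan window = 1000 ms / refresh rate
for hz in (30, 60, 120, 144, 240):
    print(f"{hz:3d} Hz -> {1000.0 / hz:6.2f} ms per refresh")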

For the NVidia users, I hope G-Sync does the job with little to no latency.
 
http://www.google.com/patents/US20080055318 Seems ATI already has the patent on this technology? I'm no expert but sounds like the same principle described. The other filings are specifically listed as under ATI. I assume AMD has these now?

It's not exactly the same thing from what I can tell. That technology seems to adjust the display's refresh rate to match the source material, but not on a frame-by-frame basis. It seems more applicable to syncing video (a fixed frame rate) to the display and to lowering power consumption by setting the monitor to a lower refresh rate, which lowers GPU load.
 
Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....
 

Low res is preventing you from seeing tearing and smoothness differences? Really? :confused: The tech is proven, tested, reviewed and available - not much room for propaganda.
 