Friday, October 18th 2013

NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.
"Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (60 Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's render rate rarely matches the monitor's fixed refresh rate, persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) eliminates tearing, but causes increased lag and stutter as the GPU is forced to wait for the monitor's fixed refresh cycle.

G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.
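To picture the difference, here is a minimal timing sketch (made-up frame times; it ignores the way real V-SYNC also stalls the GPU, and it illustrates the principle rather than NVIDIA's implementation):

REFRESH_MS = 1000.0 / 60.0            # fixed 60 Hz refresh: one tick every ~16.7 ms
render_ms = [12, 22, 15, 30, 14]      # hypothetical per-frame GPU render times

def vsync_display_times(render_ms, refresh_ms=REFRESH_MS):
    """With V-SYNC, each finished frame waits for the next fixed refresh tick."""
    t, last_shown, shown = 0.0, 0.0, []
    for r in render_ms:
        t += r                                          # frame finishes rendering
        tick = (int(t // refresh_ms) + 1) * refresh_ms  # next refresh boundary
        tick = max(tick, last_shown + refresh_ms)       # at most one frame per refresh
        shown.append(round(tick, 1))
        last_shown = tick
    return shown

def gsync_display_times(render_ms):
    """With a GPU-driven refresh, each frame is scanned out the moment it is ready."""
    t, shown = 0.0, []
    for r in render_ms:
        t += r
        shown.append(round(t, 1))
    return shown

print(vsync_display_times(render_ms))   # [16.7, 50.0, 66.7, 83.3, 100.0] -- gaps snap to 16.7 ms steps
print(gsync_display_times(render_ms))   # [12.0, 34.0, 49.0, 79.0, 93.0] -- gaps match the render times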

G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

Leading Game Developers Blown Away
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, EPIC Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, iD Software

Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." -Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.

147 Comments on NVIDIA Introduces G-SYNC Technology for Gaming Monitors

#126
MxPhenom 216
ASIC Engineer
erocker said: Maybe. I also want to try out LightBoost. It all depends on price/performance though; I've been very happy with my 7970, but I'm not particular to any brand. Thing is, it wouldn't make sense for me to buy a new monitor.
Yeah, it wouldn't. I want to see ASUS release a 27" 1440p IPS with G-SYNC; then I would most definitely be interested.
#127
qubit
Overclocked quantum bit
erocker said: I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.
I'll second that. This is how I splashed out £400 on 3D Vision (glasses and monitor) way back in 2009, running off my GTX 285 at the time.

I remember "just happening" to go round to Novatech and checking it out. I barely even played the game and within 5 minutes they had my money. :laugh: Even my friend who knows nothing about computers and doesn't do gaming was impressed with it.

I don't doubt that G-Sync will deliver a similar kind of awesome which will be more in the way it feels when you control the action with the keyboard and mouse than anything else.

EDIT: you'll love LightBoost and you don't even need an NVIDIA card with the ToastyX utility, either.
#128
Am*
qubit said: I don't think you properly understand what G-Sync does on a technical level or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, ie synced the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.
Well, no offense, but neither do you, nor anyone else here who hasn't actually demoed it. As it stands, we can only speculate on how useful it will be, and that's exactly what I'm doing.

I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have an even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, so it does not fix anything on a practical level if you're capping the GPU to render just "enough" frames and changing refresh rates accordingly. Go and run COD4 with a max fps cap of, say, 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same thing as what this monitor offers, minus the gigantic price tag (none whatsoever).

Your last statement contradicts your previous one, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work. You cannot have one without the other: frame rate synced with refresh rate = uneven GPU load, so you still get stuttering and tearing, versus the normal way, which is render as many frames as you can = maximum possible GPU load, which lessens the effect of tearing/stuttering but then becomes directly linked to monitor tearing if the refresh rate is too low. No add-in monitor circuit board will un-link the effects of this.

And the above does not even take into account that a lot, and I do mean A LOT (vast majority) of old competitive games, like Quake III and Unreal, increase movement speed and/or sensitivity the higher the frame rate gets, due to the way the old game engines work, so Mr Carmack, of all people, should know better than to endorse this worthless crap. Any of the old school competitive gamers will give you exactly the same reasons for why it won't work.
#129
The Von Matrices
Am* said: I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag.

Your last statement contradicts your previous, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work.
I think your example is unrealistic. Nvidia never said that your solution of a locked maximum frame rate wouldn't achieve a similar goal. Instead, they said it was a bad solution because it required developers to program games with very little detail so that they never drop below a monitor's refresh rate. This isn't practical since any game has some scenes more complicated than others, and it makes no sense to run the GPU at 1/4 load 99.9% of the time just so that it never drops below the monitor's refresh rate the remaining 0.1% of the time.

You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when a GPU renders a frame and when the next monitor refresh comes and that frame is displayed. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur over the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big of an issue as a frame being displayed at the wrong time.

I hope you understand that this explains exactly what you said about running Quake III at an insanely high frame rate: it reduces "lag", the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync does the same lag reduction without the necessity of wasting GPU power.
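A small numerical sketch of those two definitions (hypothetical frame times, not NVidia's measurements): the wait from a frame finishing until the next fixed 60 Hz tick is the "lag", and the variance of that wait is the "stutter".

import statistics

REFRESH_MS = 1000.0 / 60.0                    # fixed 60 Hz refresh interval, ~16.7 ms
finish_ms = [10.0, 31.0, 45.0, 70.0, 95.0]    # hypothetical frame completion times

def lag_per_frame(finish_ms, refresh_ms=REFRESH_MS):
    """Wait from frame completion to the next fixed refresh tick (the "lag" above)."""
    return [(int(t // refresh_ms) + 1) * refresh_ms - t for t in finish_ms]

lags = lag_per_frame(finish_ms)
print([round(l, 1) for l in lags])            # [6.7, 2.3, 5.0, 13.3, 5.0] -- an uneven wait
print(round(statistics.pvariance(lags), 1))   # the variance of that wait is the "stutter"

# With a refresh driven by the GPU, each frame is scanned out as soon as it finishes,
# so the wait is ~0 ms for every frame and its variance collapses to ~0 -- even though
# the frames themselves still arrive at uneven intervals.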
#130
qubit
Overclocked quantum bit
Am* said: Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it.
Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.
Am* said: As it so happens now, we can only speculate on how useful it will be and that's exactly what I'm doing.

I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag (none whatsoever).

Your last statement contradicts your previous, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work. You cannot have one without the other; frame rate synced with refresh rate = uneven GPU load, so you still get stuttering & tearing, vs the normal way, which is render as many frames as you can = maximum possible GPU load, which lessens the effect of tearing/stuttering but which only then becomes directly linked to the monitor tearing if refresh rate is too low. No add-in monitor circtuit board will un-link the effects of this.

And the above does not even take into account that a lot, and I do mean A LOT (vast majority) of old competitive games, like Quake III and Unreal, increase movement speed and/or sensitivity the higher the frame rate gets, due to the way the old game engines work, so Mr Carmack, of all people, should know better than to endorse this worthless crap. Any of the old school competitive gamers will give you exactly the same reasons for why it won't work.
I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.
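(Quick numbers for that case: at 120 Hz the panel refreshes every ~8.3 ms, but with 60 fps content each rendered image is held for two refreshes, so the motion only advances in ~16.7 ms steps while the eye tracks continuously -- which is roughly why it can read as a doubled edge rather than smooth movement.)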

I found the NVIDIA g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube
#131
TheoneandonlyMrK
qubit said: Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.




I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.

I found th nvidia g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube
I get it, but I don't think it's the game changer some are saying, and IMHO it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.

What I'm saying is that 2-3 high-end cards like the Titan plus 3 g-spot (like NV didn't want and expect this nickname) enabled monitors puts this tech out of reach of all but the highest-paid or most enthusiastic PC gamer.
Niche, that's it.
#132
qubit
Overclocked quantum bit
theoneandonlymrk said: I get it, but I don't think it's the game changer some are saying, and IMHO it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.

What I'm saying is that 2-3 high-end cards like the Titan plus 3 g-spot (like NV didn't want and expect this nickname) enabled monitors puts this tech out of reach of all but the highest-paid or most enthusiastic PC gamer.
Niche, that's it.
Whether g-spot is a game changer remains to be seen, I quite agree. However, my feeling on it is that it will be. We'll soon know for sure.

The improvement is equally good on any monitor and resolution configuration as far as I can see. However, I'd have to compare them to really disagree with your point. :)

I do think it's ironic that when the GPU is rendering faster than the monitor's highest refresh, say 120Hz, g-spot works just like normal vsync would, lol. A modern PC with a decent graphics card or cards will often achieve this, especially when the game is an older one.

Oh and it's expensive? Never! :laugh:
#133
Am*
The Von Matrices said: You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when a GPU renders a frame and when the next monitor refresh comes and that frame is displayed. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur over the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big of an issue as a frame being displayed at the wrong time.

I hope you understand that this proves exactly what you said about running Quake III at an insanely high frame rate; this reduces "lag" or the time between the newest frame is generated and the time it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync does the same lag reduction as this without the necessity of wasting GPU power.
That's because Nvidia are yet again using ideal world scenarios for practical demos.

My entire point was, G-sync fixes nothing, which is still true -- after watching that presentation, I'm even more sure than before. The side by side comparisons even show one monitor without G-sync is tearing but displaying stuff faster than the one with G-sync, which was clearly skipping/jumping frames and "jittering". Watching a pendulum on a screen (Nvidia's pointless demo) or the pointless slow-turning Borderlands 2 demonstration are of no value whatsoever -- I'd love to see someone try using G-sync for a fast motion FPS shooter like BF3 and see just how much input lag it will add to the already delayed engine that the game uses. That G-sync module is nothing more than a hardware-based v-sync framebuffer with extra memory for the monitor -- maybe worth $30 on its best selling day.

This is not even including the fact that this entire problem of tearing is non-existent on fast 120Hz+ panels, with the exception of a few games that run on old engines that suffer from uneven frame pacing in general, regardless of whether the game is running on two graphics chips or one (case in point, COD4).

The only imaginable scenario I can think of where this G-sync module would be of any use is purely in multi-monitor setups where the frames may be fed unevenly to each different monitor -- but then the question is, is it a problem worth shelling out $175 per monitor for? Absolutely not, and anybody disagreeing with that is insane ($175 apiece for a 3-monitor setup is $525, not even including the GPUs or any of the monitors).
qubit said: Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.

I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.

I found th nvidia g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube
See my above answer. I saw that demo in full and it further proves my point that it will not improve anything -- certainly not for what it costs, or on any decent TN gaming monitor available these days. His entire argument is excessive/delayed frames > uneven draws by the monitor, so his G-sync module merely gives the monitor an extra-large frame buffer that is fed once each frame is ready from the GPU, plus some proprietary draw calls from G-sync back to the GPU to stop it rendering more frames -- at best a gimmicky solution worth maybe $30, and again, nothing revolutionary or worth writing home about.

Now let's look at the real-world case of this half-arsed solution -- a capped frame rate means a light GPU load (on Kepler GPUs, which are almost solely reliant on load to run at advertised clocks), which means the GPU runs at a less-than-optimal power state and down-clocks; then, when more complex scenes are being rendered, it struggles with the load and has to clock back up, causing a delay and therefore stutter (rinse and repeat). This is going to need a lot of driver-side support on a title-by-title basis in order to work properly, and I seriously doubt they are going to dedicate many -- if any -- man-hours to making it work. That makes it nothing more than a gimmick in my book, and an insanely overpriced one at that, and I have yet to be proven wrong on this, unfortunately -- in practice or in theory.
#134
qubit
Overclocked quantum bit
Am*, you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card, that's all. The monitor continues to show the same picture until it gets another lot of display data; that's how LCD monitors work already. The only difference is that with G-Sync on, the fresh display data comes in at irregular intervals instead of at regular ones.
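A loose sketch of that "hold the picture until new data arrives" idea (illustrative only, not NVIDIA's firmware; the ~30 fps floor is the figure quoted earlier in the thread):

MIN_RATE_HZ = 30.0                       # assumed floor: repeat the held image below this
MAX_HOLD_MS = 1000.0 / MIN_RATE_HZ       # ~33.3 ms

def scanout_times(frame_arrivals_ms):
    """Given when frames arrive from the GPU, return when the panel scans out."""
    scans, last = [], 0.0
    for arrival in frame_arrivals_ms:
        # if the GPU keeps us waiting too long, re-display the held image meanwhile
        while arrival - last > MAX_HOLD_MS:
            last += MAX_HOLD_MS
            scans.append((round(last, 1), "repeat held image"))
        last = arrival
        scans.append((round(last, 1), "new frame"))
    return scans

print(scanout_times([10.0, 26.0, 80.0]))   # hypothetical, irregular arrival times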

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with, however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.
#135
Wile E
Power User
qubit said: Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card that's all. The monitor continues to show the same picture until it gets another lot of display data, that's all. That's how LCD monitors work already. The only difference is that currently the fresh display data comes in at regular intervals instead of irregular intervals with G-Sync on.

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.
The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.
#136
Xzibit
qubit said: Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync?
He might be referring to the G-Sync on-board memory:
The pictures show that the FPGA is paired with a trio of 2Gb DDR3 DRAMs, giving it 768MB of memory for image processing and buffering.
#137
qubit
Overclocked quantum bit
Wile E said: The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.
Good point. In the end it's always swings and roundabouts, lol.
Xzibit said: He might be referring to G-Sync on-board memory
Duh, I missed that. :) I'd love to see a white paper on G-Sync explaining all the technical details of it.
#138
Am*
qubit said: Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card that's all. The monitor continues to show the same picture until it gets another lot of display data, that's all. That's how LCD monitors work already. The only difference is that currently the fresh display data comes in at regular intervals instead of irregular intervals with G-Sync on.

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.
In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:

[slide image: images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg]

From my understanding, the G-Sync module waits for the frame to complete on the GPU, holds it in its own buffer until the monitor has fully completed the last frame, and then passes it on to the monitor for another full draw, in order to prevent tearing, and so on.

By all means, explain to me what "technical details" I'm not understanding that you are on this or how you think it works. Because I'm pretty certain I'm understanding it perfectly.
Xzibit said: He might be referring to G-Sync on-board memory
Bingo.
#139
qubit
Overclocked quantum bit
Am* said: In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:

images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg

From my understanding, G-sync module waits for the frame to complete from the GPU, puts it in its own buffer waiting for the monitor to fully complete the last frame, before passing it onto the monitor to complete another full draw in order to prevent tearing and so on.

By all means, explain to me what "technical details" I'm not understanding that you are on this or how you think it works. Because I'm pretty certain I'm understanding it perfectly.
Ok, so I missed the bit about the memory buffer on the G-Sync module, but that doesn't actually change the principles of what I'm saying. As the NVIDIA CEO said himself, the system is simple in principle, but complex to implement properly in practice. This is similar to the situation with how jet engines work, for example. Not too complex in principle, but fiendishly complex and difficult to make one that works properly.

All that diagram shows is how things currently work with vsync off and a standard monitor. Of course you see tearing. Why didn't you show the one with the irregular GPU outputs that the monitor syncs to with G-Sync on? That would have been much more relevant.

Another way to think about G-Sync is Adaptive vsync without the tearing, although there are important subtleties there, such as the reduction in latency.

One thing to realize is that if the GPU is putting out frames faster than the fastest refresh of the monitor (say, 144Hz) then the system goes back to a standard vsync-on arrangement ie like G-Sync wasn't there and the GPU reverts to being synced with the monitor. This will typically be the case when playing old games on modern hardware.
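A one-line way to put that clamping behaviour (a sketch; the 144 Hz ceiling is from this post and the ~30 fps floor is the figure quoted earlier, and both are really properties of the particular panel):

def effective_interval_ms(frame_ms, max_hz=144.0, min_hz=30.0):
    """Scanout interval follows the frame time, clamped to the panel's range (sketch)."""
    return min(max(frame_ms, 1000.0 / max_hz), 1000.0 / min_hz)

print(effective_interval_ms(5.0))    # GPU faster than 144 Hz -> held at ~6.9 ms, i.e. vsync-like
print(effective_interval_ms(20.0))   # in range -> the monitor follows the GPU (50 fps pacing)
print(effective_interval_ms(50.0))   # slower than 30 fps -> the panel repeats at ~33.3 ms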

However, with modern demanding games and high resolutions we know that a solid 144Hz cannot be maintained. That's where G-Sync syncs the monitor to the GPU, giving the advantages I explained previously. If you want me to repeat it all here, you're out of luck. We're going round in circles already.
#140
Serpent of Darkness
Re:

So, simple and short: G-Sync is going to regulate the refresh rate on the monitor so there aren't any dropped frames. The monitor is told by G-Sync when a frame is coming in because it is communicating with the GPU. A frame gets sent by the GPU to the monitor, and the frame gets properly placed into a 16.667 ms scan window... Still reminds me of the Dynamic Frame Control in the RadeonPro beta for AMD users. It's just something that's being done at the hardware level versus the software level.
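(The 16.667 ms figure is just 1000 ms / 60 Hz; by the same arithmetic a 120 Hz scan window is 1000 / 120 ≈ 8.333 ms and a 30 fps frame time is 1000 / 30 ≈ 33.333 ms, the numbers used later in this post.)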

So the issue is still the monitor's static refresh rate. Seems like it would be better if display manufacturers tried to increase the refresh rate above 144 Hz and make it dynamic instead of static. This would probably be more ideal for AMD graphics cards in CrossFireX than NVidia's SLI; CrossFireX has a bad habit of shooting frames out like a Gatling gun with more than two GPUs. Maybe add a secondary frame buffer to the monitor in case the frame rate falls under 30 fps (frame times of 33.33 ms and above). That way the previous frame could be stored until a new frame arrives, or have a component in the monitor that skips a scan with the previous scan still displayed...

I still think the EIZO FORIS FG2421 240 Hz gaming monitor is more innovative than NVidia G-Sync. 120 Hz scans with a black-out period after the frame is displayed in each 8.33 ms window seems a lot more... creative. I may invest in this monitor, or three.

For the NVidia users, I hope G-Sync does the job with little to no latency.
#142
quoloth
www.google.com/patents/US20080055318 Seems ATI already has the patent on this technology? I'm no expert but sounds like the same principle described. The other filings are specifically listed as under ATI. I assume AMD has these now?
#143
The Von Matrices
quoloth said: www.google.com/patents/US20080055318 Seems ATI already has the patent on this technology? I'm no expert but sounds like the same principle described. The other filings are specifically listed as under ATI. I assume AMD has these now?
It's not exactly the same thing from what I can tell. This technology seems to be adjusting the frame rate to match the source material, but not on a frame by frame basis. It seems to be more applicable to syncing video (fixed frame rate) to the display and to lowering power consumption through setting the monitor to a lower refresh rate, lowering GPU load.
#146
Prima.Vera
Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....
#147
BiggieShady
Prima.Vera said: Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....
Low res is preventing you from seeing tearing and smoothness differences? Really? :confused: The tech is proven, tested, reviewed and available - not much room for propaganda.