# NVIDIA Introduces G-SYNC Technology for Gaming Monitors



## Cristian_25H (Oct 18, 2013)

Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology which, for the first time, enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.


"Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (hertz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. As the GPU seeks to synchronize with the monitor, persistent tearing occurs. Turning on V-SYNC (or Vertical-SYNC) eliminates tearing but causes increased lag and stutter as the GPU and monitor refresh at different rates.
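The tradeoff described here can be sketched numerically. Below is a toy timing model I'm adding for illustration only: the helper names and frame times are invented, and the numbers are not measurements of any real hardware.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # a fixed 60 Hz monitor: one scanout every ~16.7 ms

def frame_completions(frame_times_ms):
    """Cumulative completion times for a list of per-frame render times."""
    t, out = 0.0, []
    for ft in frame_times_ms:
        t += ft
        out.append(t)
    return out

def tearing_scanouts(completions):
    """No V-SYNC: count scanouts interrupted by a mid-draw buffer flip."""
    tears, k = 0, 0
    while k * REFRESH_MS < completions[-1]:
        lo, hi = k * REFRESH_MS, (k + 1) * REFRESH_MS
        if any(lo < c < hi for c in completions):
            tears += 1  # a flip landed while this scanout was in progress
        k += 1
    return tears

def vsync_display_times(completions):
    """V-SYNC on: each frame is shown at the first scanout after it is done."""
    return [math.ceil(c / REFRESH_MS) * REFRESH_MS for c in completions]

# A GPU taking a steady 24 ms per frame (~42 fps of rendering)
done = frame_completions([24.0] * 10)
print(tearing_scanouts(done))            # without V-SYNC: frequent tearing
shown = vsync_display_times(done)
gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
print(gaps)                              # with V-SYNC: uneven 16.7/33.3 ms gaps
```

With the render rate out of step with the refresh rate, every frame either flips mid-scanout (tearing) or waits for the next refresh, so identical 24 ms frames end up on screen for alternating 16.7 ms and 33.3 ms (stutter).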

G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.

G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

*Leading Game Developers Blown Away*
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, Epic Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, id Software

*Rollout Plans by Monitor Manufacturers*
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

*Enthusiasm by System Builders and Integrators*
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." - Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.

*View at TechPowerUp Main Site*


----------



## de.das.dude (Oct 18, 2013)

just saw this on facebook!


----------



## Deleted member 24505 (Oct 18, 2013)

It's going to put a £100+ premium on the price of the monitor though I reckon.


----------



## Am* (Oct 18, 2013)

Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available, by forcing it to drop excessive frames or simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implement something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250 FPS.

Also, have these guys ever heard of Dxtory? 

Please GTFO with more of this proprietary, overpriced and useless bullshit to further fragment PC gaming, Nvidia.

EDIT: and good lord, that stupid name...G-SYNC? Sounds like the name of an Nsync tribute band.


----------



## rokazs1 (Oct 18, 2013)

Don't forget the 780 Ti!


----------



## SIGSEGV (Oct 18, 2013)

sigh, more proprietary crap from nvidia


----------



## Xzibit (Oct 18, 2013)

I was wondering what Nvidia was going to do with all the unsold Tegra 4 chips.

Besides, Nvidia users don't have stuttering or tearing... right guys???


----------



## Renald (Oct 18, 2013)

I get 60+ FPS in most games with a 200€ card.

Why would it be useful to have that? It won't resolve multi-GPU problems, and it's useless for anything else.


I give up; it's too stupid, even from them.


----------



## Prima.Vera (Oct 18, 2013)

How much did nVidia pay all those guys for those North Korean-style adulation declarations about this new crap??


----------



## RejZoR (Oct 18, 2013)

Does it even work with AMD GPUs? If not, it's as useless as it can get.


----------



## SIGSEGV (Oct 18, 2013)

RejZoR said:


> Does it even work with AMD GPU's ? If not, it's as useless as it can get.





Cristian_25H said:


> G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)



hope it helps..


----------



## Recus (Oct 18, 2013)

Desperate AMD fanboys like AMD's "exclusive" features: onscreen tearing, stuttering and lag. 



Am* said:


> Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available by forcing it to drop excessive frames or simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implements something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250FPS.
> 
> Also, have these guys ever heard of Dxtory?
> 
> ...



Today the local pharmacy closed early. Did you attack them because they sell useless, proprietary and gimmicky drugs from pharma companies?



RejZoR said:


> Does it even work with AMD GPU's ? If not, it's as useless as it can get.



Mantle won't work on Nvidia. Why isn't that useless?


----------



## TheMailMan78 (Oct 18, 2013)

Cadaveca and I were talking about this about a year ago. Sometimes (no matter the mfg of the GPU) the "flow" of the animations and movement on the screen would be so smooth that it gave the same feeling as watching a movie on a 120 Hz HD monitor. However the "sensation", for lack of a better word, was always short lived. He tried to figure out what was causing it and so did I. He also asked W1zz about it but it's a very hard thing to explain. You either "know" the feeling or you don't. After reading this I think NVIDIA might have narrowed it down to a hardware level, judging by what the developers were saying in the PR. If so, then this is gonna be awesome.


----------



## Am* (Oct 18, 2013)

Recus said:


> Desperate AMD fanboys likes AMD's "exclusive" features onscreen tearing, stuttering and lag.



Butthurt Nvidiot strikes again, what a surprise...might want to check my system specs before you embarrass yourself any further.


----------



## MxPhenom 216 (Oct 18, 2013)

I think this could be pretty cool. I'd be interested to try out a monitor with a G-Sync module, that's for sure.

Asus 27" 1440p monitor with G-Sync......anyone?


----------



## Recus (Oct 18, 2013)

Am* said:


> Butthurt Nvidiot strikes again, what a surprise...might want to check my system specs before you embarrass yourself any further.



And who can confirm your specs? You'd better go and write a petition to AMD asking them to stop driver updates, because game-related problems aren't the GPU maker's problem, they're the game developer's.


----------



## wickerman (Oct 18, 2013)

I really like this idea, but if it requires me to replace my u2711 I see that as a bit of a problem. Sure 2560x1440 panels have come down in price significantly since I bought mine, but it seems like a bit of a waste to replace my current panel with something that is the same resolution. I'd rather jump on the 4k train when the prices become more reasonable. 

I suppose if I had a friend/family member willing to buy mine and the replacement monitor offered benefits in other areas (color accuracy, response time, lower power, etc) then I could be talked into replacing my u2711 with another 1440p/1600p panel that supported this tech.

*edit*
Also if this is Nvidia-exclusive tech, that is a bit annoying. I tend to keep my monitors for a while, but I could switch back and forth between AMD and Nvidia graphics quite frequently. It would be kind of annoying if we came across a generation down the line where AMD has the superior performance but I have to wait for Nvidia to catch up just to take advantage of the reason I upgraded my monitor.


----------



## The Von Matrices (Oct 18, 2013)

I look forward to learning more about this technology and seeing it implemented.  It's basically a dynamic refresh rate for monitors.  I do see this making a huge difference for people like me who like to turn up the details at the expense of having the frame rate frequently drop below the monitor's refresh rate.


----------



## Am* (Oct 18, 2013)

Recus said:


> And who can confirm your specs, mind invalid. You better go and write petition to AMD asking them to stop driver updates because games related problems aren't GPU makers problems it's game developers problem.



My PC is barely mid range compared to the systems some people here have, and you want me to prove my specs? Daaymn, you must be pretty broke to even consider saying that, no wonder you're mindlessly trolling the forums.

P.S. will attach a CPU-Z/whatever system validation is quickest, when I can be arsed to do it.


----------



## the54thvoid (Oct 18, 2013)

It looks to be a good thing but for God's sake don't be proprietary with it.  And wtf with the tone of this thread?


----------



## Crap Daddy (Oct 18, 2013)

It is proprietary and will cost money compared to Mantle which is proprietary but comes for free albeit with just one game. Guess we will soon have to own two systems for gaming depending on what games we like, one NV and one AMD.


----------



## The Von Matrices (Oct 18, 2013)

My hope is that their pricing estimates are not too far off.  I would be willing to pay $50 extra per monitor for this.  If NVidia has this working with their 3D Surround implementation, then I would seriously consider replacing my graphics cards and monitors for an upgrade.


----------



## 1d10t (Oct 18, 2013)

Remember nVidia's previous hype, boasting that "3D gaming is the future"? And where is all of that now? 

Is this nVidia's response to AMD Mantle? 

OR... obviously nVidia can't reach above 60 FPS at the highest detail, but instead of admitting it, they restricted it  

OR... they aren't good at 4K. But instead of making another competitive card, they make their own monitor that only does 1080p 

-= edited=-



erocker said:


> *Waits for demonstration by a 3rd party*



I bet it'll be organized by Origin PC


----------



## Hilux SSRG (Oct 18, 2013)

If Nvidia can eliminate stuttering and provide ultrafast response, I'm very interested.  I hope we get to see some videos soon.


----------



## erocker (Oct 18, 2013)

*Waits for demonstration by a 3rd party*


----------



## Solidstate89 (Oct 18, 2013)

Am* said:


> Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available by forcing it to drop excessive frames or simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implements something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250FPS.
> 
> Also, have these guys ever heard of Dxtory?
> 
> ...


Really? You think bidirectional communication, with the monitor able to control the frame delivery of the GPU so it's completely in sync with the monitor, can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make an inaccurate and ballsy statement such as that. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on Anandtech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on *ANY* of today's monitors without that specific hardware bundle.


----------



## Crap Daddy (Oct 18, 2013)

Solidstate89 said:


> Really? You think bi-lateral communication with the monitor's ability to control the frame rate delivery of the GPU so it's completely in sync with the monitor can just be easily implemented via a firmware update?
> 
> The official white paper hasn't even been released yet and you have to gall to make such an inaccurate and ballsy statement such as that. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on Anandtech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on *ANY* of today's monitors without that specific hardware bundle.



This is needed in the monitor:

http://img.techpowerup.org/131018/gsync-module.png


----------



## Deleted member 24505 (Oct 18, 2013)

Could they not do something similar to this that goes between the card and monitor in the cable, maybe a box, so it will work without a new monitor?


----------



## The Von Matrices (Oct 18, 2013)

Crap Daddy said:


> This is needed in the monitor:
> 
> http://img.techpowerup.org/131018/gsync-module.png



The mounting holes are evenly spaced around the "processor," which indicates to me that this needs a heatsink when operating.  I hope it doesn't need active cooling; having a small, loud fan in a monitor is certainly not a good thing.


----------



## Deleted member 24505 (Oct 18, 2013)

The Von Matrices said:


> The mounting holes are evenly spaced around the "processor," which indicates to me that this needs a heatsink when operating.  I hope it doesn't need active cooling; having a small, loud fan in a monitor is certainly not a good thing.



Looks like an MXM card for laptops


----------



## Serpent of Darkness (Oct 18, 2013)


I can only see G-Sync being useful or desired by consumers if it did the following:

1.  Increased frame rate performance.  If this gimmick actually increased your FPS, as if there were some form of loss or leakage in performance that G-Sync prevented from happening.  Say that instead of getting 60 FPS in Tomb Raider with maxed-out graphics settings, you're only getting 32 FPS; G-Sync would push it closer to 60 FPS.

To imply that NVidia users need a hardware component in the monitor to improve video fidelity also implies that NVidia cards still suffer from their own form of micro-stutter and screen tearing.

Sadly, in my opinion, this is the only innovative niche NVidia could come up with in such a short period of time, especially when the spotlight just got wider and was placed on AMD, since the R9 series, consoles, and AMD Mantle have been the top buzz in the news as of late.  AMD is gaining momentum.  NVidia wants to push its own game bundle.  No surprise there.  Copying the concept from AMD to continue competition, and reacting to the current situation.  AMD is pushing Mantle (which can be used by both AMD and NVidia), TrueAudio, DX11.2 support, CrossFireX over the PCIe bus, and the buzz about the R9 290X possibly outperforming the GTX Titan.  It seems like AMD lit a match under NVidia's foot, besides Microsoft.  Now NVidia is reacting...

If this niche is proprietary, nobody is going to buy it unless third-party benchers like Techpowerup.com, Anandtech.com, and others glorify it.  Otherwise, I see this as an unnecessary, supplemental feature that will cause the NVidia consumer base to purchase more NVidia products at outrageous prices with very little improvement...


----------



## MuhammedAbdo (Oct 18, 2013)

NVIDIA will allow people to purchase the G-SYNC module and install it by modding their monitors; you don't actually have to buy a new monitor at all!




> Alternatively, if you’re a dab hand with a Philips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.
> 
> If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.


http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming


----------



## FordGT90Concept (Oct 18, 2013)

Unless it is GPU manufacturer neutral and standardized, it's a gimmick just like the rest.  If NVIDIA was serious about improving monitor tech, it would create a forum so all manufacturers can implement it (even if there is a reasonable licensing fee involved).


----------



## MuhammedAbdo (Oct 18, 2013)

Serpent of Darkness said:


> I can only see G-Sync being useful or desired by consumers if it did the following:
> 
> 1.  Increased Frame Rate Performance.  If this gimmick actually increased your FPS.  As if there was some form of loss or leakage in performance, and G-Sync prevented that from happening.  Instead of actually getting 60 FPS on Tomb Raider with maxed out graphics settings, you're only getting 32 FPS.  G-Sync would push it closer to 60 FPS.
> 
> ...


Seriously, AMD is the one suffering from a colossal crapload of dropped frames, tearing, frame interleaving and micro-stuttering in their CF systems; that's why they are trying to mitigate the issue with CF over PCIe.

TrueAudio doesn't mean squat, and the R9 290X is not really the answer to Titan at all; you will see that when the review comes out. The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not.

On the other hand, NVIDIA has always been the brute force of advancement in the PC space; AMD just plays catch-up, and today's announcements just cement that idea.

NVIDIA was the first to introduce the GPU, SLi, frame pacing (FCAT), GPU Boost, Adaptive V-Sync; first to reach a unified shader architecture on PCs; first with GPGPU, CUDA, PhysX, OptiX (for real-time ray tracing); first with 3D gaming (3D Vision); first with Optimus for mobile flexibility; first with the GeForce Experience program, SLi profiles, ShadowPlay (for recording) and game streaming.

They had better support with CSAA, FXAA, TXAA, driver-side Ambient Occlusion and HBAO+, the TWIMTBP program, better Linux and OpenGL support... and now G-SYNC! All of these innovations are sustained and built upon to this day.

And even when AMD beat them to a certain invention, like Eyefinity, NVIDIA didn't stop until they topped it with more features: they answered with Surround, then did 3D Surround and now 4K Surround.

NVIDIA was at the forefront in developing all of these technologies and they continue to sustain and expand them; AMD just follows suit. They fight back with stuff that they don't really sustain, so it ends up forgotten and abandoned, even generating more trouble than it's worth. Just look at the pathetic state of their own Eyefinity and all of its CF frame problems.

In short, AMD is the one feeling the heat, NOT NVIDIA. Heck, NVIDIA now fights two generations of AMD cards with only one generation of their own: Fermi held off the HD 5870 and 6970, Kepler is holding off the 7970 and the 290X! And who knows about Maxwell!


----------



## RejZoR (Oct 18, 2013)

Recus said:


> Desperate AMD fanboys likes AMD's "exclusive" features onscreen tearing, stuttering and lag.
> 
> 
> 
> ...



You can't compare awesome Mantle with useless G-Sync. Besides, isn't Adaptive V-Sync supposed to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...


----------



## MuhammedAbdo (Oct 18, 2013)

It's funny when the greatest minds of game development speak highly of G-Sync and its applications, while AMD fanboys try to shoot it down on the basis of nothing but their ignorance!


----------



## The Von Matrices (Oct 18, 2013)

Serpent of Darkness said:


> I can only see G-Sync being useful or desired by consumers if it did the following:
> 
> 1.  Increased Frame Rate Performance.  If this gimmick actually increased your FPS.  As if there was some form of loss or leakage in performance, and G-Sync prevented that from happening.  Instead of actually getting 60 FPS on Tomb Raider with maxed out graphics settings, you're only getting 32 FPS.  G-Sync would push it closer to 60 FPS.
> 
> To imply that NVidia users need a hardware component in the monitor to improve video fidelity, also implies that NVidia cards still suffer from their own form of micro stutters and screen tearing.



*I want to clarify what G-Sync is and how it differs from the other technologies posters have mentioned.*

G-Sync is meant to be used in conjunction with frame pacing; the two solve different problems.  G-Sync has nothing to do with any NVidia-specific issue; the problem is one faced by any output with a fixed refresh rate.  Monitors currently only draw whole frames at a fixed interval.  Therefore, if the framerate output by your video card is lower than your monitor's refresh rate, some frames will be duplicated and others not, resulting in judder.

Let me explain the difference using an example where you are displaying a game at 45fps on a 60Hz monitor.  This should be helpful to anyone who is confused about the purpose of G-Sync.  Let's use a small portion of this scenario (1/15 second, or four 60Hz refreshes).


*Scenario 1 has no frame pacing or G-Sync:*

2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
16.7ms - Frame 2 is displayed on the monitor since it is the most recent frame
33.4ms - Frame 2 is displayed on the monitor again since it is still the most recent frame
35ms - Frame 3 is written to the video buffer
50ms - Frame 3 is displayed on the monitor since it is the most recent frame
66.7ms - Frame 3 is displayed on the monitor again since it is still the most recent frame

*In Scenario 1, the effective frame rate is 30fps* since only two distinct frames were displayed by the monitor.  The frames were displayed for identical periods of time, so the framerate appears smooth.


*Scenario 2 has frame pacing but no G-Sync:*

2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
16.7ms - Frame 1 is displayed on the monitor since it is the oldest frame in the video buffer
33.4ms - Frame 2 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 1, and Frame 1 is deleted from the video buffer
35ms - Frame 3 is written to the video buffer
50ms - Frame 3 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 2, and Frame 2 is deleted from the video buffer
66.7ms - Frame 3 is displayed on the monitor again since no newer frames are available

*In Scenario 2, the effective frame rate is 45fps* since three distinct frames were displayed by the monitor.  However, the third frame was displayed for twice as long as either of the first two, so *judder is experienced.*


*Scenario 3 has frame pacing and G-Sync:*

0ms - G-Sync realizes that the graphics card is generating frames at an average of 45fps and adjusts the monitor's refresh rate to 45Hz
2ms - Frame 1 is written to the video buffer
4ms - Frame 2 is written to the video buffer
22.2ms - Frame 1 is displayed on the monitor since it is the oldest frame in the video buffer
35ms - Frame 3 is written to the video buffer
44.4ms - Frame 2 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 1, and Frame 1 is deleted from the video buffer
66.6ms - Frame 3 is displayed on the monitor since it is the oldest frame in the video buffer that is still newer than Frame 2, and Frame 2 is deleted from the video buffer

*In Scenario 3, the effective frame rate is 45fps* since three distinct frames were displayed by the monitor.  In addition, all three frames were displayed for equal amounts of time, so *no judder is experienced*.


The whole point of this is to have a variable refresh rate on the monitor.  Standard frame pacing can't adjust the monitor's refresh rate; it only helps to make sure that all the frames generated by the graphics card are displayed.  *Nothing currently can remove all judder when the frame rate is different from the monitor's refresh rate.  This is what G-Sync aims to solve.*

G-Sync won't improve frame rate, but it will make lower frame rates look better.  This is something that cannot be solved with frame pacing alone.
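The first two timelines above can be reproduced in a few lines. This is an illustrative sketch with invented names; the policies are simplified models of what the post describes, not NVIDIA's actual algorithms.

```python
# Frame completion times from the example above: frames 1-3 finish rendering
# at 2 ms, 4 ms and 35 ms; the 60 Hz monitor scans out at fixed times.
RENDERED = [(1, 2.0), (2, 4.0), (3, 35.0)]   # (frame id, completion time ms)
SCANOUTS_60HZ = [16.7, 33.4, 50.0, 66.7]

def most_recent_policy(scanouts, rendered):
    """Scenario 1: at each refresh, show the newest finished frame."""
    return [max(fid for fid, ct in rendered if ct <= t) for t in scanouts]

def frame_pacing_policy(scanouts, rendered):
    """Scenario 2: at each refresh, show the oldest not-yet-shown frame,
    or repeat the last one if nothing newer is ready."""
    shown, pending = [], list(rendered)
    for t in scanouts:
        ready = [fid for fid, ct in pending if ct <= t]
        if ready:
            shown.append(ready[0])
            pending = [(fid, ct) for fid, ct in pending if fid != ready[0]]
        else:
            shown.append(shown[-1])  # assumes a frame was ready at the start
    return shown

print(most_recent_policy(SCANOUTS_60HZ, RENDERED))   # -> [2, 2, 3, 3]
print(frame_pacing_policy(SCANOUTS_60HZ, RENDERED))  # -> [1, 2, 3, 3]
# Scenario 3 (G-Sync): the monitor refreshes when a frame is ready
# (22.2, 44.4, 66.6 ms), so each frame is shown once, for equal durations.
```

Scenario 1 shows only two distinct frames in four refreshes (effectively 30fps); frame pacing shows all three but holds the last one twice as long (judder); only the variable refresh of scenario 3 gives both 45fps and even pacing.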



RejZoR said:


> Adaptive V-Sync suppose to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...



Adaptive V-sync was meant to reduce the performance impact of V-sync; it doesn't change anything visually.  The idea behind Adaptive V-sync is that if your monitor's refresh rate is higher than the frame rate output of your graphics card at any given time, then V-sync does nothing but waste the extra computational power it requires.  Adaptive V-sync turns off V-sync in this situation, thus resulting in extra performance when the frame rate is low.

See http://www.geforce.com/hardware/technology/adaptive-vsync/technology
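The rule described above fits in a few lines. This is a sketch of the decision only, with invented names; it is not NVIDIA's driver logic.

```python
def adaptive_vsync(current_fps, refresh_hz=60):
    """Keep V-sync on only while the GPU can match the refresh rate;
    below that, turn it off so no render time is spent waiting."""
    return "vsync on" if current_fps >= refresh_hz else "vsync off"

# The GPU's output dips below 60 fps mid-scene and then recovers:
print([adaptive_vsync(fps) for fps in (90, 61, 59, 45, 72)])
```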


----------



## Solidstate89 (Oct 18, 2013)

RejZoR said:


> You can't compare awesome Mantle with useless G-Sync. Besides, isn't Adaptive V-Sync suppose to solve all the image tearing problems? I guess it wasn't as wonderful after all if they need to design some crappy special monitor to counter that...


You clearly have no idea what the point of Adaptive V-sync is. It's to keep your frame rate from being forced down to 30 whenever it drops below 60 (which is what standard V-sync does; it steps in intervals of 30). Basically it was a way to get the best of both worlds.
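The "intervals of 30" behavior comes from every finished frame having to wait for the next whole refresh: with classic double-buffered V-sync on a 60 Hz panel, the delivered rate snaps to 60/n. A toy calculation under that assumption (the function name is mine):

```python
import math

def vsynced_fps(render_ms, refresh_hz=60):
    """Delivered fps when every finished frame must wait for the next
    refresh boundary (classic double-buffered V-sync)."""
    interval_ms = 1000.0 / refresh_hz
    wait_slots = math.ceil(render_ms / interval_ms)  # refreshes per frame
    return round(1000.0 / (wait_slots * interval_ms))

# Render times just under and just over the 16.7 ms budget:
print([vsynced_fps(ms) for ms in (10, 17, 25, 40)])  # -> [60, 30, 30, 20]
```

Adaptive V-sync avoids the 60-to-30 cliff by disabling the wait whenever the render time exceeds the refresh interval, accepting some tearing instead.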


----------



## NeoXF (Oct 18, 2013)

LOL, hypocrisy at its finest level: nVidia saying PC gaming is awesome this and that... then releasing sum'more proprietary crapware to fragment it even further.

Jesus, I kinda wish Intel did pursue that dGPU thing... 3 GPU players = pretty much no one able to afford fragmenting an already mother-less and scattered segment of modern video gaming...


AMD might not be perfect, but at least they are more open-minded and aware that walled-garden features aren't the future... They're the Linux of computer hardware...

Give me SteamOS/Mint + OGL w/ good driver support & a low-level API and I'm game.


----------



## SoundChaos (Oct 18, 2013)

I have been waiting for this feature in a monitor for years now, tearing and stuttering have always annoyed me more than overall FPS / Picture quality. If this really can fix these issues, I would pay just about anything to get it.

In regard to some other posts, this cannot be done with a firmware upgrade for monitors, and no current video card/monitor combination is free from the terrors of constant or even occasional micro-stuttering / tearing... Some people's eyes seem to be less sensitive to it, though.

Can't wait for a test drive, but it will make me sad to have to trade away my CrossFire 7950 setup..


----------



## The Von Matrices (Oct 18, 2013)

NeoXF said:


> LOL, hypocrisy at it's finest level, nVidia saying PC gaming is awesome this and that... then releasing sum'more proprietary crapware to fragment it even further.



Can we at least let NVidia confirm that it's proprietary before you complain?  The only thing announced was that it was running on NVidia hardware and an NVidia display controller.  NVidia said nothing about disallowing other manufacturers from producing control boards.


----------



## NeoXF (Oct 18, 2013)

Recus said:


> Mantle won't work on Nvidia. Why it's not useless?



Please link me to where it says it won't, cause I'm reaaaaaly curious just where the BS spring blows from. And even if it wouldn't, AMD is in a position to slipstream a DX11 HLSL port of pretty much every multi-platform game coming out in the next half-decade or so, so having a 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, w/ or w/o a leading share in the market.



The Von Matrices said:


> Can we at least let NVidia confirm that it's proprietary before you complain?  The only thing announced was that it was running on NVidia hardware and an NVidia display controller.  NVidia said nothing about disallowing other manufacturers from producing control boards.



Name one nVidia branded tech that ISN'T proprietary.


----------



## The Von Matrices (Oct 18, 2013)

NeoXF said:


> Name one nVidia branded tech that ISN'T proprietary.



There's no point in debating an assumption.  I'm going to wait to hear more before I make any conclusions.


----------



## GC_PaNzerFIN (Oct 18, 2013)

You know what this means...

 good bye SLi microstuttering. Give me naooo 

I can't believe you guys are calling this useless. 

Guru3D - NVIDIA G-SYNC Overview - Shaky Cam Voice ...


----------



## NeoXF (Oct 18, 2013)

The Von Matrices said:


> There's no point in debating an assumption.  I'm going to wait to hear more before I make any conclusions.



Fair enough.



GC_PaNzerFIN said:


> You know what this means...
> 
> good bye SLi microstuttering. Give me naooo
> 
> ...



Not useless, but walled-garden gaming isn't my idea of PC gaming... or entertainment in general... Also, it's very debatable how effective or easy to implement this will actually be (yay, from the already small number of choices for gaming displays, now we'll get an even smaller palette!). If TXAA, 3D Vision or PhysX are anything to go by... it will be a stillborn horse that nVidia (with its massive budget) will keep beating for years and years, even after it's basically just a pile of bones.


----------



## acerace (Oct 19, 2013)

Like always, a hate fest! Why am I not surprised? Typical..


----------



## Xzibit (Oct 19, 2013)

Another question is...

With the current Asus VG248QE going for as low as $249-$269, is the $130+ mark-up for G-Sync ($399) worth it?
Or are you better off saving the difference toward a second monitor, since it's about 50% of the price?

Will it be a set price, or will it scale with screen size $$$$?


----------



## kn00tcn (Oct 19, 2013)

as a red team guy, the hatred in this thread is ridiculous (on a related note, amd setting up a 290x demo across the street from nv's event made me uneasy)

name non proprietary nv tech? someone was blind on a videocardz thread as well, physx on the cpu is multiplatform & in many games+consoles! the new FCAT method runs anywhere, FXAA (although i want to say that's one guy), how about helping devs add DX10 or 11 support? not all nv sponsored titles have locks on them, amd has done the same in helping codemasters games be good looking & efficient

sure GPU physx & cuda are very annoying, not doubting that, it's not 'good guy nvidia', but many things start out proprietary to show that they work

we should be pressuring VESA/displayport/hdmi for an adaptive refresh technique like this 

i dont get why nv doesnt license things out so that everyone can enjoy instead of locking down to only their platform (look at bluray, it's standard, just a royalty is paid, it's not locked to sony products)

just cuz we didnt hit a perfect or open world doesnt mean we should destroy it, this is still better than deciding between matrox+rendition+3dfx+whoever else all at once if you want to play various games in the 90s


----------



## anonymous6366 (Oct 19, 2013)

I'm skeptical, I would like to sit down and play some games with this to see if I can actually tell a difference


----------



## haswrong (Oct 19, 2013)

instead of braving the 60+ fps, they aim for 30fps@30Hz hahahahahaha   




anonymous6366 said:


> I'm skeptical, I would like to sit down and play some games with this to see if I can actually tell a difference


well, dont try that with a crt monitor or you end up in epileptical convulsion in no time


----------



## haswrong (Oct 19, 2013)

NeoXF said:


> Fair enough.
> Not useless, but walled garden ... it's basically just a pile of bones.



if nv did this in the range of 140-85Hz, id call that a gamers benefit 
but dropping frames to what? 5fps@5Hz and calling it a holy grail????? 
gimme a f*cking break!  

im too old to jump this bait, so im kinda happy i cant afford to buy nvidia g-card


----------



## Xzibit (Oct 19, 2013)

Can I upgrade it to a 780M ?


----------



## 1d10t (Oct 19, 2013)

Crap Daddy said:


> This is needed in the monitor:
> 
> http://img.techpowerup.org/131018/gsync-module.png





tigger said:


> Looks like a MXM card for laptops
> 
> http://img15.imageshack.us/img15/522/mxmtypes.jpg





Xzibit said:


> Can I upgrade it to a 780M ?
> 
> http://www.techpowerup.com/gpudb/images/2128.jpg



interesting... so G-Sync could be nVidia's lowest-end mobile SKU that doesn't sell, or it might be a defective Tegra 4 from the nVidia Shield  /sarcasm


----------



## The Von Matrices (Oct 19, 2013)

kn00tcn said:


> as a red team guy, the hatred in this thread is ridiculous (on a related note, amd setting up a 290x demo across the street from nv's event made me uneasy)
> 
> name non proprietary nv tech? someone was blind on a videocardz thread as well, physx on the cpu is multiplatform & in many games+consoles! the new FCAT method runs anywhere, FXAA (although i want to say that's one guy), how about helping devs add DX10 or 11 support? not all nv sponsored titles have locks on them, amd has done the same in helping codemasters games be good looking & efficient
> 
> ...



Thank you for this post.  It seems like for every neutral post on any AMD, NVidia or Intel press release there are two from people who read only the first sentence of the article and immediately claim the technology is evil based on some argument that would be disproved if they had read the entire article.  If only more people approached these topics from a neutral point of view and took time to understand what they were criticizing, this forum would be full of intelligent discussion.  I guess this is the internet and you can't expect more... (I always wished there was a "No Thanks" button on TPU to flag irrelevant posts, but I can imagine how it would be abused.)

I support technological advancement, whether it comes from NVidia, AMD, Intel, or some other company, and G-Sync looks promising.  This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.


----------



## ItsKlausMF (Oct 19, 2013)

MuhammedAbdo said:


> Seriously , AMD is the one suffering from a colossal crap load of dropped frames, tearing, frame interleaving and micro-stuttering in their CF systems , that's why they are trying to mitigate the issue with CF over PCIe.
> 
> TrueAudio doesn't mean squat , and R290X is not really the answer to Titan at all , you will see that when the review comes about.The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not .
> 
> ...



Audio does matter, better audio = better experience.

Yeah nv first with SLI, but not the first to do it "right". FCAT was just an anti AMD tool to bottleneck AMD sales. PhysX wasn't even nvidias technology, they just bought it to make better marketing than ATI. Better Linux support? Read this and cry. OpenGL and OpenCL is pretty much AMD territory at this moment.

Then what do you mean by nvidia fighting two generations by AMD with only one generation? GF10*(gen1) GF11*(Gen2). Maxwell is far FAR away(late 2014) and by then AMD will be as well hitting their next generation.


----------



## The Von Matrices (Oct 19, 2013)

ItsKlausMF said:


> Audio does matter, better audio = better experience.



No argument there.  How much TrueAudio actually improves audio is still up for debate until AMD releases drivers for it.



ItsKlausMF said:


> Yeah nv first with SLI, but not the first to do it "right".



What in the world does this even mean?



ItsKlausMF said:


> FCAT was just an anti AMD tool to bottleneck AMD sales.



So you're arguing that we shouldn't have any way to measure frame pacing just because AMD couldn't do it properly?



ItsKlausMF said:


> PhysX wasn't even nvidias technology, they just bought it to make better marketing than ATI.



PhysX was doomed to fail even before NVidia took it over.  There was no way that dedicated physics accelerator cards were going to take off just like dedicated sound cards had died out in the years prior.



ItsKlausMF said:


> Better Linux support? Read this and cry. OpenGL and OpenCL is pretty much AMD territory at this moment.



I don't know enough about these to make a comment.



ItsKlausMF said:


> Then what do you mean by nvidia fighting two generations by AMD with only one generation? GF10*(gen1) GF11*(Gen2). Maxwell is far FAR away(late 2014) and by then AMD will be as well hitting their next generation.



Maxwell is speculated to launch on 28nm in early 2014 and then move to 20nm in late 2014, according to reports.  And what does it matter how many generations each manufacturer puts out?  All that matters is performance per price, which is why I don't care one bit about the rebrands both sides do, as long as the rebrands move pricing down.


----------



## Xzibit (Oct 19, 2013)

1d10t said:


> interesting... so G-Sync could be nVidia's lowest-end mobile SKU that doesn't sell, or it might be a defective Tegra 4 from the nVidia Shield  /sarcasm



I actually wasn't kidding when I said



Xzibit said:


> I was wondering what Nvidia was going to do with all the un-sold Tegra 4 chips
> 
> Besides Nvidia users don't have stuttering nor tearing....right guys ???




It reminded me of the Tegra 3 short board modules. If they were full boards they'd be almost twins.

NVIDIA GeForce GT 630M

It's at the lower end for sure, size-wise.


----------



## RejZoR (Oct 19, 2013)

Solidstate89 said:


> You clearly have no idea what the point of Adaptive V-sync is. It's to keep your frame rate from reducing to 30 if it ever drops below 60 (which is what it does, intervals of 30). Basically it was a way to get the best of both worlds.



Clearly neither do you, then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid halving the framerate when FPS drops below a certain level, the way normal V-Sync does

So, why do we need G-Sync to remove image tearing again?
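The behavior the two posts are arguing about can be sketched in a few lines of Python. This is a toy model, not NVIDIA's implementation: plain double-buffered V-Sync quantizes the frame interval to multiples of the 60 Hz refresh, while Adaptive V-Sync simply drops the sync below 60 fps.

```python
import math

REFRESH = 1 / 60  # fixed 60 Hz refresh interval, in seconds

def vsync_interval(render_time):
    """Plain double-buffered V-Sync: the finished frame waits for the
    next refresh tick, so the effective frame interval is always a
    whole multiple of 1/60 s (60, 30, 20 fps...)."""
    return math.ceil(render_time / REFRESH) * REFRESH

def adaptive_vsync_interval(render_time):
    """Adaptive V-Sync: sync only while the GPU keeps up; below 60 fps
    it behaves like V-Sync off, so the rate no longer halves (but
    tearing can return)."""
    return REFRESH if render_time <= REFRESH else render_time

# A GPU averaging 45 fps: 30 fps under V-Sync, 45 fps under Adaptive V-Sync
print(round(1 / vsync_interval(1 / 45)))           # 30
print(round(1 / adaptive_vsync_interval(1 / 45)))  # 45
```

Note neither model lets the monitor refresh at anything other than 60 Hz, which is the gap G-Sync claims to fill.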



NeoXF said:


> Please link me to where it says it won't, cause I'm reaaaaaly curious just from where the BS spring blows. And even if it wouldn't, AMD is in the position to slipstream DX11 HLSL port pretty much every multi-platform game coming out in the next half-decade or so, so having 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, w/ or w/o a leading share in the market.
> 
> Name one nVidia branded tech that ISN'T proprietary.



Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).


----------



## Xzibit (Oct 19, 2013)

RejZoR said:


> Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.
> 
> High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).



Anyone remember Glide wrappers?

Back then it was possible, but nowadays it'd be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower and buggier, but at least you weren't locked out of the API entirely.

*Nvidia G-Sync FAQ*



> *Q: Does NVIDIA G-SYNC work for all games?*
> 
> A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.



Boo!!!




> *Q: Does G-SYNC work with FCAT?*
> 
> A: FCAT requires a video capture card to catch the output graphics stream going to a monitor. Since G-SYNC is DP only and G-SYNC manipulates DP in new ways, it is very unlikely that existing capture cards will work. Fortunately, FRAPS is now an accurate reflection of the performance of a G-SYNC enabled system, and FRAPS output can be directly read by FCAT for comparative processing.







> *Q: How much more does G-SYNC add to the cost of a monitor?*
> 
> A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.


----------



## Doc41 (Oct 19, 2013)

This looked like an interesting thing to try, as I just bought the VG248QE a while ago, but it looks like it'll cost a kidney and might not be available for me. Will see when it's released,
and it looks like I might need a new system too, with a Kepler card.


----------



## RejZoR (Oct 19, 2013)

@Xzibit
A Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers exist so you could play games that would then look better. Besides, being able to emulate a Voodoo card doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... vertex shaders had software emulation, but it was slower; pixel shaders were not even possible to emulate without getting one frame per minute... and neither was available on any Voodoo card...

Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at point zero?


----------



## Frick (Oct 19, 2013)

You guys make my hatred for booth babes and Bethsoft seem rational and levelheaded (which it totally is, btw). 

Anyway wasn't qubit going on about something like this some time ago?


----------



## remixedcat (Oct 19, 2013)

Ok what the hell does this have to do with Glidewrapper?

Explain this like you are teaching it to a class, not like you are rabid fandogs or salespeople.


----------



## GC_PaNzerFIN (Oct 19, 2013)

haswrong said:


> instead of braving the 60+ fps, they aim for 30fps@30Hz hahahahahaha
> 
> 
> 
> well, dont try that with a crt monitor or you end up in epileptical convulsion in no time



This is going to 144Hz monitors too, meaning they aim to sync everything between 30 to 144fps@144Hz (refresh rate varies with fps). You obviously didn't read a thing about this, instead came in raging like any typical nvidia hater would.

No matter what this company is releasing it is always going to get this same hatred. Doesn't matter what the greatest minds of game development think.

G-SYNC board will support Lightboost too.  
http://www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

edit:
30Hz is minimum, variable between 30Hz to 144Hz (obvious max limit)

https://twitter.com/ID_AA_Carmack/status/391300283672051713
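To put numbers on the 30-144 Hz window described above (illustrative arithmetic based on the figures in those links, not NVIDIA's spec): a variable-refresh display would vary its refresh interval between roughly 6.9 ms and 33.3 ms, matching any render rate inside that band.

```python
MAX_HZ, MIN_HZ = 144, 30  # claimed variable-refresh band on a 144 Hz panel

def refresh_interval_ms(fps):
    """Refresh interval (ms) a variable-refresh monitor would use for a
    given render rate, clamped to the panel's supported band."""
    clamped = max(MIN_HZ, min(MAX_HZ, fps))
    return 1000.0 / clamped

print(refresh_interval_ms(144))  # ~6.9 ms at the 144 Hz ceiling
print(refresh_interval_ms(60))   # ~16.7 ms, tracking 60 fps exactly
print(refresh_interval_ms(20))   # clamped to the 30 Hz floor: ~33.3 ms
```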


----------



## john_ (Oct 19, 2013)

I'm all AMD and I hate Nvidia with their locks, but this really looks interesting, and I believe AMD could follow in the future with something similar (who doesn't like to sell more hardware?).


----------



## Am* (Oct 19, 2013)

GC_PaNzerFIN said:


> This is going to 144Hz monitors too, meaning they aim to sync everything between 30 to 144fps@144Hz (refresh rate varies with fps). You obviously didn't read a thing about this, instead came in raging like any typical nvidia hater would.
> 
> No matter what this company is releasing it is always going to get this same hatred. *Doesn't matter what the greatest minds of game development think.*
> 
> https://twitter.com/ID_AA_Carmack/status/391300283672051713



No, it doesn't, especially since they're in Nvidia's pockets, being paid to spread testimonials about useless "tech" like this, if you can even call it that.



Solidstate89 said:


> Really? You think bi-lateral communication with the monitor's ability to control the frame rate delivery of the GPU so it's completely in sync with the monitor can just be easily implemented via a firmware update?
> 
> The official white paper hasn't even been released yet and you have to gall to make such an inaccurate and ballsy statement such as that. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on Anandtech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on *ANY* of today's monitors without that specific hardware bundle.



Give me one reason why it can't work the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display-standards bodies comes up with a non-proprietary way of doing this in the next 5 years.



Xzibit said:


> *Nvidia G-Sync FAQ*
> 
> Q: How much more does G-SYNC add to the cost of a monitor?
> 
> ...








Wait, what... $175 for more useless bullshit with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor unless they run at lower-than-console garbage settings, and 95% of the time you'll need to turn off the "exclusive" stuff you've paid out the ass for.  And people are supporting this... if Nvidia (not directed at you, BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple).



kn00tcn said:


> just cuz we didnt hit a perfect or open world doesnt mean we should destroy it, this is still better than deciding between matrox+rendition+3dfx+whoever else all at once if you want to play various games in the 90s



I would take those days over the price fixing bullshit Nvidia are playing now any day -- at least competition was fierce back then, and you still had various ways to run proprietary tech yourself (with Glide wrappers etc).


----------



## 1d10t (Oct 19, 2013)

The Von Matrices said:


> I support technological advancement, whether it comes from NVidia, AMD, Intel, or some other company, and G-Sync looks promising.  This is the type of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.



Remember the monitor-related technology nVidia introduced back in 2010? Yep... 3D Vision, which required very expensive 3D shutter glasses, an emitter and a monitor approved by nVidia. I got one pair of glasses, one emitter and a Samsung 223RZ 22-inch LCD for the price of $800, not to mention another $350 for a GTX 470. As far as I remember, 3D only "works" on specific nVidia cards and specific monitors. Later I learned nVidia had just adopted 3D FPR from LG and brought it to the desktop. For the same $1200 I switched to an HD 5850 + 42-inch LG 240Hz and had the same effect. Meanwhile, a 3D Vision kit and an Asus VG236H will cost you $650 and only work with a high-end GTX, or you can grab a $250 LG D2343P-BN and pair it with every graphics card out there. Where is that "3D gaming is the future" now?

Personally, I don't hate nVidia for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech.



ItsKlausMF said:


> Audio does matter, better audio = better experience.
> Yeah nv first with SLI, but not the first to do it "right". FCAT was just an anti AMD tool to bottleneck AMD sales.



A better DSP doesn't translate to better audio; it's only a small fragment. You need a better amplifier that can decode the DSP's digital signal, and better speakers to translate the analog signal from the amplifier.
FCAT was surely anti-AMD, because AMD uses AFR rather than nVidia's SFR. Look on the bright side: AMD is now working on better drivers to address the frame pacing issues.


----------



## NeoXF (Oct 19, 2013)

The Von Matrices said:


> I don't know enough about these to make a comment



Ha! Probably the smartest comment in this thread so far. You deserve a medal.
I know how hard I try to shut up about things I don't know much about yet.




RejZoR said:


> Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.



I know what you're saying, but:

1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are far more complex and adaptable creatures than they used to be back then... and even then:

2. You could use wrappers to run Glide on non-3dfx hardware...

Ah, I still remember how I finally got Carmageddon 2 to work through a Glide wrapper on nV hardware; a world of difference in graphics...


----------



## 15th Warlock (Oct 19, 2013)

I really hope nVidia releases a version of this tech that works with existing monitors, something like an HDMI dongle between the monitor and graphics card, and makes it hardware-agnostic, please (though I don't see that last part happening...)

Most PC gamers have invested a lot of moola in their monitors (myself included), and if nVidia really wants to see this tech take off, so to speak, they must make it available to as many customers as they can, not just people who invest in new monitors starting next year...


----------



## Solidstate89 (Oct 19, 2013)

haswrong said:


> if nv did this in the range of 140-85Hz, id call that a gamers benefit
> but dropping frames to what? 5fps@5Hz and calling it a holy grail?????
> gimme a f*cking break!
> 
> im too old to jump this bait, so im kinda happy i cant afford to buy nvidia g-card



It won't drop below 30 you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.



Am* said:


> Give me one reason why it can't work the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display-standards bodies comes up with a non-proprietary way of doing this in the next 5 years.



Are you like, intentionally playing dumb or are you just not getting this? The refresh rate of the monitor will always be 60Hz, there is no way right now to modulate it, dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. Doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync simply removes the restriction on frame rates when the rate drops below 60. So instead of hard-locking your game into either 60FPS or 30FPS, if it drops below 60 it simply reacts as if V-Sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate, on the fly, to 45Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware-agnostic standard... at some point. We don't know when, or whether any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single monitor that works the way you describe these days, and no firmware update is going to change that. Ever.
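A toy timeline makes the distinction in the post above concrete: with a fixed 60 Hz scanout a finished frame waits for the next tick, while a G-SYNC-style display starts scanout as soon as the frame is ready, subject only to a minimum interval. All numbers here are illustrative, not NVIDIA's spec.

```python
import math

def present_times(frame_ready, mode, refresh=1 / 60, min_interval=1 / 144):
    """On-screen times for frames finished at `frame_ready` (seconds).
    'fixed':    the frame waits for the next fixed refresh tick (V-Sync).
    'variable': scanout starts when the frame is ready, but no sooner than
                `min_interval` after the previous scanout (G-SYNC-style)."""
    shown, last = [], float("-inf")
    for t in frame_ready:
        if mode == "fixed":
            s = max(math.ceil(t / refresh) * refresh,  # next 60 Hz tick
                    last + refresh)                    # one frame per tick
        else:
            s = max(t, last + min_interval)
        shown.append(round(s, 4))
        last = s
    return shown

frames = [0.012, 0.030, 0.041]            # uneven GPU finish times
print(present_times(frames, "fixed"))     # snapped to 1/60 ticks: judder
print(present_times(frames, "variable"))  # shown as soon as each is ready
```

In the fixed case every frame lands on a tick regardless of when it finished, which is exactly the stutter/lag tradeoff being described; in the variable case presentation tracks the GPU.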


----------



## haswrong (Oct 19, 2013)

GC_PaNzerFIN said:


> edit: 30Hz is minimum, variable between 30Hz to 144Hz (obvious max limit)
> https://twitter.com/ID_AA_Carmack/status/391300283672051713



imagine a group of players playing at 120fps and a group struggling at 30-50fps.. who's going to get more frags? the first group can react after 1/120th of a second, whereas the others only after 1/30th-1/50th.. their reactions will be twice as slow. where's the benefit for gamers? and let me repeat the crucial thing: nvidia wouldn't dare to present this on a crt monitor. and as you posted the carmack link, if you move your view so that the pixel needs to be refreshed faster than 1/30th of a second, it sends a duplicate frame which doesn't contain updated information in the scene, aka a player leaning out of the corner. this can lead to up to four identical frames sent to you, which makes for an up to 4*1/30 ≈ 133 millisecond later response on your side versus a 120fps guy. is that clearer now? this technology is nvidia's sorry excuse for being too lazy to make graphics cards that can render 60+ fps at 1600p+ resolution. nothing more, nothing less. so now you see why i'm upset. and they are able to sell this crap even to you so smoothly. unbelievable.
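For what it's worth, the arithmetic in the post checks out even if the conclusion is debatable. Note that the 30 Hz floor and the four-duplicate scenario are the post's own assumptions, not confirmed G-SYNC behavior:

```python
def repeat_span_ms(duplicates, floor_hz=30):
    """Time (ms) covered by `duplicates` repeated frames at the refresh floor."""
    return duplicates * 1000.0 / floor_hz

print(repeat_span_ms(4))       # ~133.3 ms: the figure quoted above
print(repeat_span_ms(1, 120))  # ~8.3 ms per frame for the 120 fps player
```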





Solidstate89 said:


> It won't drop below 30 you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.



thanks. as soon as i realized what this technology is (and i realized it as soon as they started talking about lowering the refresh frequency of the monitor), i wasn't exactly in the mood to start searching for where the lowest acceptable limit is for those guys.. i'm inclined to think they really have no limit if it boils down to getting money from you, lol!


----------



## Solidstate89 (Oct 19, 2013)

Then maybe you shouldn't have made a hyperbolic inaccurate statement on something you knew nothing about. You know, like most people would.

I've never set up a WC loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs on something I clearly know nothing about.


----------



## 15th Warlock (Oct 19, 2013)

haswrong said:


> imagine a group of players playing at 120fps and a group struggling at 30-50fps.. who's going to get more frags? the first group can react after 1/120th of a second, whereas the others only after 1/30th-1/50th.. their reactions will be twice as slow. where's the benefit for gamers? and let me repeat the crucial thing: nvidia wouldn't dare to present this on a crt monitor. and as you posted the carmack link, if you move your view so that the pixel needs to be refreshed faster than 1/30th of a second, it sends a duplicate frame which doesn't contain updated information in the scene, aka a player leaning out of the corner. this can lead to up to four identical frames sent to you, which makes for an up to 4*1/30 ≈ 133 millisecond later response on your side versus a 120fps guy. is that clearer now? this technology is nvidia's sorry excuse for being too lazy to make graphics cards that can render 60+ fps at 1600p+ resolution. nothing more, nothing less. so now you see why i'm upset. and they are able to sell this crap even to you so smoothly. unbelievable.



Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that all people can only buy graphic cards and monitors that are capped at 30FPS so no one can have an unfair advantage, who needs more alternatives anyways?!!

Who do these people think they are by offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare them?!!

/S


----------



## Frick (Oct 19, 2013)

@haswrong: Isn't that how it is now though?


----------



## haswrong (Oct 19, 2013)

Frick said:


> @haswrong: Isn't that how it is now though?



nvidia said that gamers will love it.. this sounds like notebook gamers will love it.. are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesn't make your response any faster?

ok, i have to give you one thing, you can now watch your ass being fragged without stutter 





15th Warlock said:


> ...the audacity to sell this technology and make a profit! How dare them?!!
> /S



exactly.. if i decide to please someone, i do it for free..

remember when your momma asked you to take out the litter back then? did you ask like $270 for the service? the whole world will be soon eaten by the false religion that is the economy, you just wait.


----------



## 1d10t (Oct 19, 2013)

I think somebody forgot that fps are not Hz, so 60 fps doesn't mean 60 Hz. The displayed image consists of many frames rendered in one second, while a monitor's refresh rate depends on three factors: horizontal frequency, resolution and response time.
Care to explain where G-Sync fits into this?


----------



## Mistral (Oct 19, 2013)

I'm surprised Carmack sounded so positive about this thing. I respect him as much as the next guy, but I can't see it that way at the moment.

I am curious though: how will G-Sync do in, say, fighting games that require to-the-frame accuracy to pull off the best combos? It could be either a real boon or a curse for them.


----------



## NeoXF (Oct 19, 2013)

15th Warlock said:


> Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that all people can only buy graphic cards and monitors that are capped at 30FPS so no one can have an unfair advantage, who needs more alternatives anyways?!!
> 
> Who do these people think they are by offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare them?!!
> 
> /S



Yes, how dare a corporation care about anything other than making money!

/S 


Just because the world is the way it is now doesn't mean it's any good. But what the Hell, you guys can salute your new green, blue, red or whatever overlords any way you want; it's not me being dead inside, shielding myself further and further from the things that should count more with consumerism gimmicks.

This makes me sick to my stomach, but what can I do...


----------



## 15th Warlock (Oct 19, 2013)

haswrong said:


> exactly.. if i decide to please someone, i do it for free..
> 
> remember when your momma asked you to take out the litter back then? did you ask like $270 for the service? the whole world will be soon eaten by the false religion that is the economy, you just wait.





NeoXF said:


> Yes, how dare corporation care about anything else other than making money!
> 
> /S
> 
> ...



Aww, how nice of you both, but let me ask you a few questions: Do you have a job? If you do, do you work for free? I mean, do you offer your services without expecting to be remunerated for them? And if that's the case, how do you support yourself and/or your family? Through charity/welfare? 

I mean, you have to find a way to pay your bills somehow, am I right?

I would assume that people who work for these companies (and please note that this applies to any given company in our "evil economy") expect some sort of compensation for their work, wouldn't they?

Anyway, I'm not going to discuss the basics of how our society works; this isn't the right forum for it. I just found your counter-argument really amusing. Besides, no one is pointing a gun at your face forcing you to buy these new monitors, so there's no reason to get all worked up about this superfluous piece of technology when there are obviously way more important things to worry about and fix, like the state of the economy, world hunger, world peace and other serious matters...

Back to topic: it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I, for one, am excited to see someone focusing research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.


----------



## The Von Matrices (Oct 19, 2013)

haswrong said:


> nvidia said that gamers will love it.. this sounds like notebook gamers will love it.. are you ready to ditch your $1300 monitor for an even more expensive one with the variable refresh rate, which doesn't make your response any faster?
> 
> ok, i have to give you one thing, you can now watch your ass being fragged without stutter



By that logic there should be no reason why we should buy new graphics cards to run at more detailed settings because the increased detail doesn't make you respond any faster.  Have fun running at "low" settings on your integrated graphics.  There's a lot more to video gaming than competition.



15th Warlock said:


> Back on topic: it sounds like NVIDIA is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing research on this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.



Thank you.  It seems as if for some people NVidia producing any technology first makes it inherently evil.  I don't care who invented it, I support the technology.  Kudos to NVidia for doing it first.

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:



> Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.



Just because no one else supports it yet doesn't mean that no one else ever will.  Someone has to be first.  Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.


----------



## 15th Warlock (Oct 19, 2013)

The Von Matrices said:


> By that logic there should be no reason why we should buy new graphics cards to run at more detailed settings because the increased detail doesn't make you respond any faster.  Have fun running at "low" settings on your integrated graphics.  There's a lot more to video gaming than competition.
> 
> 
> 
> ...



Exactly, it seems like they have taken addressing this problem to heart, with innovations like Adaptive V-Sync, FCAT and now G-Sync.


----------



## Xzibit (Oct 19, 2013)

The Von Matrices said:


> For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:



I think people read Nvidia's own FAQ on G-Sync:



> Q: Does NVIDIA G-SYNC work with other competitive products?
> 
> A: NVIDIA G-SYNC *only* works with NVIDIA GPUs and G-SYNC enabled monitors.



You can draw a conclusion there


----------



## Fluffmeister (Oct 19, 2013)

Sounds like a GTX xx0 card combined with a G-SYNC enabled monitor will offer a pretty damn sweet BF4 experience.

Oh nVidia, you big meanies, no wonder peeps here are mad.


----------



## The Von Matrices (Oct 19, 2013)

Xzibit said:


> I think people read Nvidia own FAQ on G-Sync
> 
> You can draw a conclusion there



Your comment does not disprove mine.  As I quoted, the signaling technology should be possible to reverse engineer.  At that point anyone can produce monitors or video outputs that comply with that standard.  G-Sync will be NVidia-exclusive for a few years just because no one has had time to dissect it.  It doesn't mean that there will never be generic components compatible with it.  The only difference is that third parties won't use the trademarked term "G-Sync."

You also need to consider the source when you read the quote from the FAQ.  All manufacturers advertise that their products only work with first-party accessories.  It doesn't mean that third parties can't make compatible accessories.


----------



## Xzibit (Oct 19, 2013)

The Von Matrices said:


> You also need to consider the source when you read the quote from the FAQ.  All manufacturers advertise that their products only work with first-party accessories.  It doesn't mean that third parties can't make compatible accessories.



Which is the accessory, the GPU or the G-Sync module?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?


----------



## The Von Matrices (Oct 19, 2013)

Xzibit said:


> Which is the accessory, the GPU or the G-Sync module?
> 
> Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?



Once you know what commands are being sent over the cable then you can implement them into your own drivers or hardware.  For example, if you create a monitor that can read all the signals sent via the G-sync protocol and respond to them just like a genuine G-sync monitor, then why would this matter to the drivers?  A properly reverse engineered product should be no different than the genuine device.  I doubt NVidia wants manufacturers to do this, but I see no reason, engineering or legal, that third party manufacturers cannot, and the driver shouldn't be able to tell otherwise.

The only hurdle would be the investment required to reverse engineer the protocol, and if genuine G-sync doesn't catch on, then there will be no financial incentive and no third party will bother to do it.


----------



## Assimilator (Oct 19, 2013)

Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.

When was the last time AMD released ANYTHING game-changing? No, Mantle doesn't count, because the world doesn't need another API; we have DirectX and it works just great. No, TrueAudio doesn't count, because no-one gives a shit.


----------



## qubit (Oct 20, 2013)

So, all nvidia have done is reverse the sync direction, making the monitor sync with the card's varying frame rate output instead. A simple enough change technically, but the visual impact looks big, judging by the PR and articles I've read.

Couple of things that might be worse though are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature which my monitor has. The PR doesn't mention LightBoost anywhere, so I expect both of these motion artifacts to be present. The motion blur in particular is horrible and I'd rather have a bit of lag and occasional stutter than put up with this. I'd have to see G-Sync in action to properly judge it, though.

Also, it would be interesting to see this varying video signal on an oscilloscope.

*To check out the shape distortion, just open a window on the desktop, make it stretch from top to bottom, but be rather thin, then move it from side to side with the mouse. The shape will change, with the top leading the bottom - moving the mouse faster makes the effect stronger. This is due to the scanning nature of the video signal, where the bottom part of the window (the whole picture, in fact) is quite literally drawn later than the top part. Note that the slower the monitor refresh, the worse the effect. Note also that it's separate from the tearing artifact that you're likely to see as well.

LightBoost strobing blanks the display and only shows the completed picture, eliminating this effect. Of course, this comes at the expense of maxed out lag. At least the lag is very short at 120Hz. Sometimes you just can't win, lol.
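The size of that distortion is easy to estimate: the bottom of the window is scanned out later than the top, so the horizontal lean is just speed multiplied by scanout time. A rough sketch of my own (simplifying assumption: the panel scans the full frame over exactly one refresh period, which real panels only approximate):

```python
def scanout_skew_px(speed_px_s, refresh_hz, window_h_px, screen_h_px):
    """Horizontal skew of a tall window dragged sideways at speed_px_s,
    caused by the top being drawn earlier than the bottom on a scanning
    display (no strobing)."""
    scanout_s = 1.0 / refresh_hz          # time to scan one full frame (assumed)
    frac = window_h_px / screen_h_px      # fraction of the frame the window spans
    return speed_px_s * scanout_s * frac

# A full-height window dragged at 1000 px/s on a 60 Hz screen leans by
# roughly 16.7 px; at 120 Hz the lean halves, matching the observation
# that slower refresh makes the effect worse.
```

The numbers also show why 120Hz helps: doubling the refresh rate halves the skew.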


----------



## NC37 (Oct 20, 2013)

This honestly isn't worth it, nVidia. V-Sync is not such a terrible problem that it needs a special dedicated chip which you aren't opening to the entire industry, which will increase production costs, and which will likely only make the situation worse later when someone comes out with an alternative that does it without all the negatives.

You should have just made the tech and licensed it for everyone to use, then enjoyed the royalties for years. I seriously doubt it requires a Kepler GPU. We already know PhysX will work on non-NV cards. This isn't something special either. You're just setting yourself up for the fall later when someone, maybe even AMD, does it better and for everyone to use.


----------



## Xzibit (Oct 20, 2013)

This would have been better instead of replacing monitors

Leadtek NVIDIA QUADRO SYNC






NVIDIA Quadro Sync

Nvidia Quadro Sync User Guide

Nvidia Quadro G-Sync II User Guide


----------



## Lionheart (Oct 20, 2013)

This site has gone to shit with all the fanboys & trolls, Jesus Christ lolz


----------



## NeoXF (Oct 20, 2013)

15th Warlock said:


> [...]



Yeah, 'kay. You've obviously got everything figured out, and the only real problem in your life seems to be that you need a better-paying job. Righahahahahat... 


Again, open or bust.

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.


But I digress, we shall see. I'm far away from getting a gaming monitor anytime soon either way.


----------



## Frick (Oct 20, 2013)

NeoXF said:


> It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.



Just want to point out that no. Not in the least. Alternatives are incoming and to an extent already here, but right now? No way, no how.

EDIT: And I just can't fathom the depths to which this place has plunged. All this rage.. For something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.


----------



## Am* (Oct 20, 2013)

Solidstate89 said:


> Are you like, intentionally playing dumb or are you just not getting this? The refresh rate of the monitor will always be 60Hz, there is no way right now to modulate it, dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. Doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.
> 
> Adaptive V-Sync, simply removes the restriction on frame rates when it drops below 60. So instead of hard locking your game into either 60FPS or 30FPS, if it drops below, it simply reacts like V-sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and can not control it. Why do you even think Adaptive V-sync can change the monitor's refresh rate? How do you expect Adaptive V-sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.
> 
> That is what G-Sync is. Maybe one of the consortiums can come up with a hardware agnostic standard...at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days and no firmware update is going to change that. Ever.



Well, quite clearly you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me a single person suffering from their monitor's higher refresh rate in any game that never even exceeds it? It has never been a problem, and Nvidia are yet again trying to fix a problem that never existed in the first place. Quite clearly you are being ignorant, ignoring simple facts, and have no experience with what causes tearing. 

If a monitor is running at a refresh rate above the framerate of the GPU, then unless the monitor does some image post-processing, scaling or frame duplication (like those 240Hz TVs), the monitor will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless, because it is trying to fix a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz, it will only draw the frames that it has; if the framerate is less than the refresh rate, it won't affect the monitor either way.

The ONLY problem that currently exists that has anything remotely to do with monitors is that when an old game runs too fast, you have to choose between A) running at a higher framerate and experiencing tearing, or B) capping the framerate with V-Sync.

The biggest problem with V-Sync is that it drops GPU utilization to the point where a GPU can barely distinguish between idle and 3D load, and this is a problem that occurs only with Nvidia cards, even on single-GPU setups (because they have too many clock profiles to switch between), which causes stuttering. AMD doesn't have this problem, because they have idle, 2D (Blu-ray) and 3D load clock profiles, nothing more. Every Nvidia GPU from the 200 to the 700 series has had this problem, and Maxwell will continue to have it until they address every affected game individually in the drivers 1-2 years after initial release. My GTX 285 had this problem, my 460 had it until they fixed most games two years back, and my 660 had it until I returned it. When they can work out a way to scale their GPU cores/clusters to imitate old cards, they will solve the problem instantly.

This gimmick does nothing whatsoever to fix that. G-SYNC does not affect this problem in either a positive or a negative way, therefore it is worthless (even more so to 120Hz/144Hz fast gaming monitor users). By all means, feel free to explain any benefits from this tech that I am not seeing.



Assimilator said:


> Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.



Please give it a rest, bud. This G-Sync and Shield (the joke of an Android tablet slapped onto a 360 controller, with about 30 games on its support list, which almost nobody outside of North America even knows or gives a single shit about) are not innovations in the slightest, and they are exactly why Nvidia is slowly losing its consumer GPU market share to AMD, as well as the reason why hardcore PC gamers buying into this crap will continue to get ridiculed by our casual PC and console gaming brethren. Instead of investing in features that matter, they continue churning out pricey gimmicks. If that's what you're into, more power to you; continue buying Nvidia. I, for one, see these "innovations" as gimmicks: they add no value whatsoever to their GPUs or anything else employing this sort of tech, which will come at a premium because of it in comparison to AMD.

Nvidia (and AMD) deserve praise for a lot of things -- G-Sync and Shield are not among them.


----------



## 1d10t (Oct 20, 2013)

The Von Matrices said:


> Thank you.  It seems as if for some people NVidia producing any technology first makes it inherently evil.  I don't care who invented it, I support the technology.  Kudos to NVidia for doing it first.
> For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:
> Just because no one else supports it yet doesn't mean that no one else ever will.  Someone has to be first.  Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.



If G-Sync tries to manipulate the TMDS and DDC clocks over DisplayPort, there's a high probability this will leave DPCP (DisplayPort Content Protection) exposed. Now that will be unpleasant for some. And unless G-Sync communicates over both return channels, TMDS and DDC, this will cripple frame-sequential rendering across the two channels. Let me guess: this will not work with SLI and/or stereoscopic 3D.



Assimilator said:


> Innovations like SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.



Ah yes... that's great. Let me ask a simple question: do you own an nVidia Shield, or have you at least tried one? Do you use Android phones? Have you ever tried running games on the much cheaper Google Nexus 4?


----------



## qubit (Oct 20, 2013)

Thinking about it a little more: while this reduces lag, you can't eliminate it like nvidia claims, and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on; what happens? The monitor waits for the GPU, not refreshing until it gets the command from the GPU to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame and crucially only ever doing so _once_.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU is slowed down to around 15-20fps (can easily happen). You'll still have lag corresponding to this variable frame rate and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved however since the monitor displays the frame as soon as it's ready, but more importantly perhaps there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.

The only thing I wonder is if the player will notice dynamic artifacts with the game responsiveness, since the frame rate and synced refresh rate vary continuously and significantly? It might lead to some weird effect where the player feels disoriented perhaps? Maybe even inducing nausea in some people? I don't know, but this is something to look out for and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.
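To put a number on the wait-for-refresh component I'm describing, here's a toy model of my own (simplifying assumptions: frames finish at uniformly random times, and scanout time and the panel's minimum refresh interval are ignored):

```python
import random

def avg_display_wait(render_times, refresh_hz=60.0, gsync=False):
    """Average time a finished frame sits waiting before it appears on screen."""
    period = 1.0 / refresh_hz
    waits = []
    for t in render_times:
        if gsync:
            waits.append(0.0)  # monitor refreshes the moment the frame is ready
        else:
            next_refresh = ((t // period) + 1) * period  # next fixed refresh tick
            waits.append(next_refresh - t)
    return sum(waits) / len(waits)

random.seed(1)
times = [random.uniform(0.0, 1.0) for _ in range(10000)]
# Fixed 60 Hz: a finished frame waits about half a period (~8.3 ms) on
# average before it is displayed. G-Sync removes that scheduling wait,
# but the render time itself (the other lag component) is untouched.
```

The G-Sync figure being zero in this model is exactly the point: only the wait for the next refresh tick disappears, while the GPU's render time remains as lag.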



Am* said:


> Well quite clearly, you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor, would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me, a single person suffering from their monitor's higher refresh rate in any games that never even exceed it? It has never been a problem and Nvidia are yet again trying to fix a problem that never even existed in the first place, and quite clearly you are being ignorant by ignoring simple facts and have not had any experience with what causes tearing.
> 
> If a monitor is running at a refresh rate above the framerate of the GPU, unless the monitor does some image post-processing, scaling or duplicates frames (like those 240Hz TVs), the monitor will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless because it is trying to show a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz it will only draw the frames that it has -- if it is less than the refresh rate, it won't affect the monitor either way.



I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.


----------



## shb- (Oct 20, 2013)

RejZoR said:


> Clearly neither do you then. Adaptive V-Sync is there to:
> a) remove image tearing
> b) avoid halving the framerate if FPS drops below a certain level, like normal V-Sync does



Nvidia page:


> NVIDIA's Adaptive VSync fixes both problems by *unlocking the frame rate when below* the VSync cap, which reduces stuttering, and by locking the frame rate when performance improves once more, thereby minimizing tearing.


Below 60 (120) fps you get no stuttering (basically vsync turns off), but see tearing. Above 60 (120) fps vsync is on as usual, resulting in no tearing and no visible stutter.
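In code, the difference in effective display rate looks roughly like this (a sketch of the behaviour described above; real drivers also juggle buffering and frame pacing):

```python
import math

def classic_vsync_fps(render_fps, refresh_hz=60):
    """Classic V-Sync: the effective rate snaps to an integer divisor of
    the refresh rate, so anything just under 60 fps collapses to 30."""
    return refresh_hz / math.ceil(refresh_hz / render_fps)

def adaptive_vsync_fps(render_fps, refresh_hz=60):
    """Adaptive V-Sync: synced (capped, no tearing) at or above the
    refresh rate, unlocked (rate floats, tearing possible) below it."""
    return refresh_hz if render_fps >= refresh_hz else render_fps
```

E.g. a game rendering 45 fps displays at 30 fps under classic V-Sync, but at the full 45 fps under Adaptive V-Sync.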

Really, why would Nvidia develop something that's already solved, and why would a company like Asus join them? Nobody's that stupid. This tech is also praised by various tech journalists and devs like John Carmack who have seen it in action.

The only sh1tty thing about this is vendor lock-in. We need this for AMD and Intel too, on every TV and monitor. Let's hope Nvidia won't be stupid, and opens it up.


----------



## Deleted member 24505 (Oct 20, 2013)

qubit said:


> Thinking about it a little more, while this reduces lag, you can't eliminate it like nvidia claims and here's why.
> 
> At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.
> 
> ...



Couple of interesting posts, mate, much more so than the rest of the fanboy crap and uninformed arguing from others.


----------



## qubit (Oct 20, 2013)

Frick said:


> EDIT: And I just can't fathom the depths to which this place has plunged. All this rage.. For something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.



+1 there. Why do things like this induce such foaming at the mouth? It's fucking ridiculous. If someone doesn't want it, then just don't buy it. No one's forcing them.



tigger said:


> Couple of interesting posts mate, much more so than the rest of the fan boy crap and uninformed arguing from others.



+1 again.

Another technical thing I've just thought of about G-Sync.

Regardless of how moving pictures are being displayed, they are still _sampled_, just like audio. This means that the Nyquist limit or Nyquist frequency applies.

Hence, for fast-moving objects, e.g. during frenetic FPS gaming, you want that limit to be as high as possible, since an object moving fast enough will not just be rendered with only a few frames, but will display sampling artefacts similar to the "reverse spokes" effect in cowboy movies of old. In gaming, you may not even see the object, or it may appear in completely the wrong place and, of course, be heavily lagged. If the GPU drops to its minimum of 30fps, then you can bet you'll see this effect, and in a twitchy FPS that can easily mean the difference between fragging and being fragged.

So again, while G-Sync looks like a great innovation to me, there remains no substitute for a high framerate as well.
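To make the Nyquist point concrete, here's the classic wagon-wheel calculation (a simplified sketch of my own that treats the display as an ideal sampler of a periodic motion):

```python
import math

def apparent_rate_hz(true_rate_hz, sample_hz):
    """Aliased rate of a periodic motion sampled at sample_hz; rates past
    the Nyquist limit (sample_hz / 2) fold over and can turn negative,
    i.e. the motion appears to run backwards ("reverse spokes")."""
    a = math.fmod(true_rate_hz, sample_hz)
    if a > sample_hz / 2:
        a -= sample_hz  # folded past Nyquist: apparent motion reverses
    return a

# A wheel spinning at 25 rev/s looks correct when sampled at 60 fps,
# but at a sustained 30 fps it appears to spin backwards at 5 rev/s --
# the lower the frame rate, the lower the Nyquist limit.
```

The same folding applies to any fast in-game motion, which is why dropping to the 30fps floor makes these artefacts so much more likely.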


----------



## jagd (Oct 20, 2013)

Which innovations? A company acting like a headless chicken because it's losing its main market (GPUs)? Nvidia is having a hard time between AMD APUs, the rise of tablets and smartphones, dropping PC sales, etc., and is trying to find new markets, nothing more, nothing less.

Btw, who are you to decide about Mantle and TrueAudio on behalf of all the people on Earth? I don't remember making you spokesperson. What you don't get is that I want Microsoft-free gaming = no DirectX, FYI.



Assimilator said:


> Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.


----------



## remixedcat (Oct 20, 2013)

It's all frothing rabid fandogs who see the "enemy" release something innovative their faction doesn't have, so they attack and hate it even though they haven't seen it, used it, or understood it.


----------



## PopcornMachine (Oct 21, 2013)

This may be a bit better than AMD droning on about sound, but not much.

I remain disappointed by the lack of innovation relevant to the consumer's current sound system and monitor.

In the end, all we have really received from these two esteemed institutions is re-branded cards.

...


----------



## MikeMurphy (Oct 21, 2013)

Neat idea, but I'd rather spend that money on a video card that will do 60fps at my desired resolution.


----------



## Wile E (Oct 21, 2013)

NeoXF said:


> Yeah, 'kay. You obviously 've got everything figured out. And your only real problem in your life seems to be that you need a better paying job. Righahahahahat...
> 
> 
> Again, open or bust.
> ...


Almost everything that's open started as a response to a proprietary tech. For example, OpenCL wouldn't be nearly as far along as it is if it weren't for CUDA.

Innovation is innovation, and should be respected as such. This little piece of tech, if it proves worthwhile, will spawn further ideas and refinements, and maybe even an open standard.


----------



## Solaris17 (Oct 21, 2013)

So, as an Nvidia supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color repro like? Why not get a 120Hz or 240Hz monitor and not give a f@#$?


----------



## Wile E (Oct 21, 2013)

Solaris17 said:


> So, as an Nvidia supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color repro like? Why not get a 120Hz or 240Hz monitor and not give a f@#$?



This is to eliminate stutter *AND* tearing. 120Hz or 240Hz does you no good if you can't run those framerates in your games. Run below that, and you get stuttering and tearing.


----------



## Steevo (Oct 22, 2013)

My TV already does frame scaling and interpolation to reduce tearing. Even at 13 FPS, when I have overloaded my GPU, I barely see it. 


This doesn't fix stutter or GPU stalls, it covers them up like bad lag by redrawing old frames.


----------



## qubit (Oct 22, 2013)

Steevo said:


> This doesn't fix stutter or GPU stalls, it covers them up like bad lag by redrawing old frames.



It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.


----------



## 1d10t (Oct 22, 2013)

qubit said:


> It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.
> 
> I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.
> 
> If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.



As in my previous post, such things already exist in consumer electronics. LG introduced the Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010. Both are capable of adjusting basic timing to the nearest NTSC, PAL or SECAM standard regardless of the source. They also had a Motion Estimation / Motion Compensation (MEMC) mechanism, "similar" to nVidia's LightBoost. Other features like Motion Interpolation (MI) can double or quadruple any source's refresh rate to match the characteristics of the Film Patterned Retarder (FPR) 3D effect.

Fundamentally speaking, there is a major difference between desktop displays and consumer electronics. There's no GPU in consumer electronics, so the source stays locked at NTSC 29.97 fps or PAL 25 fps; there are only two factors to keep up with, rather than the "virtually infinite" variation on the desktop. Most HDTVs have to deal with various inputs such as A/V, composite, S-Video, SCART, RGB D-Sub and HDMI, each with a different clock. AFAIK, the desktop only has a few clocks, 30Hz, 60Hz and the recently introduced 4K modes, spread across DVI, HDMI and DisplayPort.

I stated before that I have an LG panel that does the 240Hz FPR 3D effect, yet I admit there is severe judder if I crank motion interpolation to the max. It's only natural that an HDTV can't manage the massive task of rendering a full white or pitch black frame in RGB mode with MEMC and MI switched on. In addition, my panel doesn't carry EDID, DDC or desktop-style timings and clocks, so it only does 120Hz or 240Hz interpolation modes.

Bottom line, it's good to see nVidia is aware of these matters. G-Sync could be a game changer in bringing advanced MEMC and MI to the desktop. But then again, don't expect LG and Samsung not to bring their tech to the desktop at a reasonable price.


----------



## qubit (Oct 23, 2013)

1d10t said:


> As my previous post,such thing already exist in the consumer electric.LG introduce Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010.Both of them are capable of adjusting basic timing to nearest standard NTSC,PAL or SECAM regardless any source.They also had Motion Estimates Motion Compensate (MEMC) mechanism,"similar" with nVidia's Lightboost.Another features like Motion Interpolation (MI) have capabilities to doubling or quadrupling any source refresh rate to match standard characteristic 3D effect Frame Patterned Retarder (FPR).
> 
> Fundamentally speaking there's is major difference between desktop and consumer electric.There's no GPU in consumer electric,so the source keep locking in NTSC 29,97 fps or PAL 25fps.There's only two factor to keep up rather than "virtually infinite" in desktop.Most HDTV had to deal with various input such as A/V,Composite,S-Video,SCART,RGB D-Sub and HDMI which has different clock.AFAIK, dekstop only had 3 clock, 30Hz,60Hz and recently introduced 4K's 33Hz spread across DVI,HDMI and Display Port.
> 
> ...



I'm not sure why you think this already exists? This has not been done before in any commercial product.

G-Sync synchronizes the monitor _dynamically_ to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.

It's very different with video cards, because there you have interaction between what the user does and what appears on the screen. In this case, things like lag, refresh rate, stutter and screen tearing matter a great deal, and G-Sync addresses all of them simply by syncing the monitor to the varying framerate of the graphics card, rather than the other way round (syncing the graphics card to the fixed framerate of the monitor, as happens now).
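To put numbers on the difference, here's a toy Python sketch (my own illustration, not NVIDIA's actual implementation): under a fixed refresh a finished frame has to wait for the next scheduled tick, while an idealized variable refresh displays it the moment it's done.

```python
import math

REFRESH_MS = 1000.0 / 60  # a 60 Hz monitor refreshes every ~16.67 ms

def display_delay_fixed(render_done_ms):
    """Fixed refresh (v-sync): a finished frame waits for the next
    scheduled refresh tick before it can appear on screen."""
    next_tick = math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS
    return next_tick - render_done_ms

def display_delay_variable(render_done_ms):
    """G-Sync-style variable refresh (idealized): the monitor refreshes
    the moment the frame is ready, so no scheduling delay is added."""
    return 0.0

# A frame that finishes 1 ms after a tick waits ~15.7 ms under v-sync:
wait = display_delay_fixed(17.67)
```

The made-up frame time (17.67 ms) is just for illustration; the point is that the fixed-refresh wait varies from frame to frame, which is what shows up as stutter.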


----------



## The Von Matrices (Oct 23, 2013)

qubit said:


> I'm not sure why you think this already exists? This has not been done before in any commercial product.



I think he's confused between the refresh rate and the TMDS clock rate, which will vary depending on resolution.


----------



## Steevo (Oct 23, 2013)

qubit said:


> It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.
> 
> I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.
> 
> If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.



So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here. 


So let's do some math.

60 FPS = 1/60 ≈ 0.0167 seconds per frame.

If a gamer notices that the onscreen frame hasn't been updated because the frame buffer didn't receive the next frame in time, that's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native input lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here (see http://www.mvps.org/directx/articles/fps_versus_frame_time.htm) is an increase in the number of milliseconds it takes to render one or more frames to the output frame buffer.
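That arithmetic is just the reciprocal of the frame rate; a trivial helper (my own, not from any NVIDIA material) makes the relationship explicit:

```python
def frame_time_ms(fps):
    """Time spent displaying/rendering each frame, in milliseconds."""
    return 1000.0 / fps

# 60 fps ≈ 16.67 ms per frame; a dip to 45 fps adds about 5.6 ms per frame,
# which is the kind of per-frame time increase that reads as stutter.
delta = frame_time_ms(45) - frame_time_ms(60)
```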

The G-Spot runner, or whatever they want to call it, can only work in one of two ways to do what they're claiming:

A) Hold frames in a secondary buffer and display them at a set frequency that matches V-sync or whatever display rate it chooses. They could do what most TVs do: interpolate between two frames with motion-adaptive/vector blending to create a new frame.

B) Or keep re-driving the pixels for more milliseconds with the same data, which by definition is lag... so... yeah.
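As a toy illustration of option A's interpolation idea: the crudest possible in-between frame is a per-pixel blend (real TV MEMC engines estimate motion vectors rather than averaging, so this is only a caricature, and mine, not anything NVIDIA describes):

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Naive in-between frame: a per-pixel linear blend of two rendered
    frames. Real motion-compensated (MEMC) interpolation estimates
    motion vectors instead of averaging; this only shows the idea of
    synthesizing a frame that was never actually rendered."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

# Blend two tiny 3-pixel "frames" halfway between each other:
mid = interpolate([0, 100, 200], [100, 100, 0])
```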


There is this constant we experience called time, and unless they have started using quantum technology to keep us frozen in time while we wait for the GPU to render more fresh frames, all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

http://en.wikipedia.org/wiki/Multiple_buffering

Triple buffering with V-sync... been there, done that; it was crap and caused lag.


----------



## Wile E (Oct 23, 2013)

Steevo said:


> So how does it fix stutter and lag?
> 
> A) By redrawing old frames on the screen.
> B) Magic and pixie dust.
> ...



Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.


----------



## Steevo (Oct 23, 2013)

Wile E said:


> Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.



So you would like to pay for lag? Seriously, I must be in the wrong business.


----------



## Wile E (Oct 23, 2013)

Steevo said:


> So you would like to pay for lag? Seriously, I must be in the wrong business.



Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.


----------



## Steevo (Oct 23, 2013)

Wile E said:


> Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.



I know about the time constant, and buffering, and other attempts, and they all still have that time issue.


I will also wait to see it in action.


----------



## Xzibit (Oct 23, 2013)

The only thing that's been shown is NVIDIA's demo and a slow camera rotation in Tomb Raider with Lara Croft by herself.

We don't even know the specs of the systems they were run on.

NVIDIA's Tom Petersen also said it was game dependent. Some games will not work with it, and 2D is not a focus.

Until someone tests this in several game scenarios, I'll remain skeptical, much like with the 3D Vision Surround craze.


----------



## 1d10t (Oct 23, 2013)

qubit said:


> I'm not sure why you think this already exists? This has not been done before in any commercial product.
> 
> G-Sync synchronizes the monitor _dynamically_ to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.
> 
> It's very different with video cards, because here you have interaction between what the user does and their output on the screen. In this case, things like lag, refresh rate, stutter and screen tearing are very important and G-Sync addresses all these things simply by syncing the monitor to the varying framerate of the graphics card rather than the other way, syncing the graphics card to the fixed framerate of the monitor, as happens now.



Please read my explanation above. I am aware of all the desktop considerations, in a manner of speaking.
If you're trying to distinguish between desktop and consumer electronics, let me ask you a question: have you tried an FPR 3D TV, a passive 3D TV and NVIDIA 3D Vision, and can you tell the difference between them?
TV-wise, they _dynamically upsample_ the source to produce a 120 Hz or 240 Hz display, the opposite of G-Sync, which _dynamically downsamples_ the display to match the GPU's output.
As further evidence that this already has a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is backing this? Only desktop-specific vendors are listed: Asus, BenQ, Philips and ViewSonic.
It's only a matter of time before LG and Samsung make a monitor supporting this _dynamic downsampling_ feature; on the other hand, NVIDIA couldn't make a better dishwasher.



The Von Matrices said:


> I think he's confused between the refresh rate and the TMDS clock rate, which will vary depending on resolution.



Yeah... I don't even know the difference between 60 Hz and 60 fps... silly me.


----------



## qubit (Oct 23, 2013)

Steevo said:


> So how does it fix stutter and lag?
> 
> A) By redrawing old frames on the screen.
> B) Magic and pixie dust.
> ...



I'm sorry, but you're wrong on all counts. I've already written an explanation of how this works, so I'm not gonna go round in circles with you trying to explain it again, especially after having seen your further replies since the one to me. Click the link below, where I explained it in three posts (just scroll down to see the other two).

http://www.techpowerup.com/forums/showthread.php?p=3000195#post3000195


Mind you, I like the idea of the pixie dust. 




1d10t said:


> Please read my explanation above. I am aware of all the desktop considerations, in a manner of speaking.
> If you're trying to distinguish between desktop and consumer electronics, let me ask you a question: have you tried an FPR 3D TV, a passive 3D TV and NVIDIA 3D Vision, and can you tell the difference between them?
> TV-wise, they _dynamically upsample_ the source to produce a 120 Hz or 240 Hz display, the opposite of G-Sync, which _dynamically downsamples_ the display to match the GPU's output.
> As further evidence that this already has a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is backing this? Only desktop-specific vendors are listed: Asus, BenQ, Philips and ViewSonic.
> It's only a matter of time before LG and Samsung make a monitor supporting this _dynamic downsampling_ feature; on the other hand, NVIDIA couldn't make a better dishwasher.



Yes, not only have I tried 3D Vision, but I have it and it works very well too. I think active shutter glasses were around before NVIDIA brought out their version. How is this relevant?

I think the link I posted above for Steevo will help you understand what I'm saying, as well. Again though, G-Sync is a first by NVIDIA, as it's irrelevant for watching video. It's the interaction caused by gaming that makes all the difference.


----------



## markybox (Oct 23, 2013)

qubit said:


> Couple of things that might be worse though are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature which my monitor has. The PR doesn't mention LightBoost anywhere


Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode?  Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.

So G-SYNC includes an improved LightBoost sequel, officially sanctioned by NVIDIA (but not yet announced). Probably with better colors (and sharper motion than LightBoost=10%), and easily enabled via the OSD menus.

You can only choose G-SYNC mode, or strobed mode, though.  
But I'm happy it will be even better than LightBoost, officially for 2D mode.
This is NVIDIA's secret weapon.  Probably has an unannounced brand name.


----------



## qubit (Oct 23, 2013)

markybox said:


> Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode?  Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.
> 
> So G-SYNC includes an improved LightBoost sequel, officially sanctioned by nVIDIA (but not yet announced).  Probably with better colors (and sharper motion than LightBoost=10%), and very easily enabled via OSD menus.
> 
> ...



Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.

Be interesting to see exactly how NVIDIA address this. Got a link?
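The low-framerate flicker problem can be put in rough numbers (a sketch of my own; the ~60 Hz flicker-fusion figure is a commonly cited approximation, not an NVIDIA spec): a strobed backlight flashes once per displayed frame, so at low frame rates the flashing drops below the fusion threshold and becomes visible.

```python
FLICKER_FUSION_HZ = 60  # rough, commonly cited threshold; varies by person

def strobe_visible(frame_rate_hz):
    """A strobed backlight flashes once per displayed frame, so the
    flicker frequency equals the frame rate; below the flicker-fusion
    threshold the individual flashes become perceptible."""
    return frame_rate_hz < FLICKER_FUSION_HZ
```

Which would explain why the strobed mode is pitched at constant high framerates (85/120/144 Hz) rather than variable ones.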


----------



## markybox (Oct 23, 2013)

qubit said:


> Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.
> Be interesting to see exactly how NVIDIA address this. Got a link?


It's an "either-or" proposition, according to John Carmack:
CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!

It is currently a selectable choice:
G-SYNC Mode: Better for variable framerates (eliminate stutters/tearing, more blur)
Strobe Mode: Better for constant max framerates (e.g. 120fps @ 120Hz, eliminates blur)
Also, 85Hz and 144Hz strobing is mentioned on the main page.


----------



## qubit (Oct 23, 2013)

markybox said:


> It's an "either-or" proposition, according to John Carmack:
> CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!
> 
> It is currently a selectable choice:
> ...



Cheers matey. I'm busy right now, but I'll read it later and get back to you.

I found these two links in the meantime:

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate

http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA


----------



## erocker (Oct 23, 2013)

I would love to see Nvidia have these things setup at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.


----------



## MxPhenom 216 (Oct 23, 2013)

erocker said:


> I would love to see Nvidia have these things setup at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.



You looking to get the 780Ti now if you are interested in G-SYNC?


----------



## erocker (Oct 23, 2013)

MxPhenom 216 said:


> You looking to get the 780Ti now if you are interested in G-SYNC?



Maybe. I also want to try out LightBoost. It all depends on price/performance though; I've been very happy with my 7970, and I'm not partial to any brand. Thing is, it wouldn't make sense for me to buy a new monitor.


----------



## MxPhenom 216 (Oct 23, 2013)

erocker said:


> Maybe. I also want to try out lightboost. It all depends on price/performance though, I've been very happy with my 7970 but I'm not particular to any brand. Thing is, it wouldn't make sense for me to buy a new monitor.



Yeah, it wouldn't. I want to see Asus release a 27" 1440p IPS with G-SYNC; then I would most definitely be interested.


----------



## qubit (Oct 23, 2013)

erocker said:


> I would love to see Nvidia have these things setup at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.



I'll second that. This is how I splashed out £400 on 3D Vision (glasses and monitor) way back in 2009, running off my GTX 285 at the time.

I remember "just happening" to go round to Novatech and checking it out. I barely even played the game and within 5 minutes they had my money.  Even my friend who knows nothing about computers and doesn't do gaming was impressed with it.

I don't doubt that G-Sync will deliver a similar kind of awesome which will be more in the way it feels when you control the action with the keyboard and mouse than anything else.

EDIT: you'll _love_ LightBoost and you don't even need an NVIDIA card with the ToastyX utility, either.


----------



## Am* (Oct 24, 2013)

qubit said:


> *I don't think you properly understand what G-Sync does on a technical level or what causes tearing and stutters in the first place.* Remember, the only way* to remove stutters is to have the GPU and monitor synced, it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU ie synced the opposite way round to how it's done now.
> 
> Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.
> 
> *Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.



Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it. As it so happens now, we can only speculate on how useful it will be and that's exactly what I'm doing.

I read your comment in full and it still does not sufficiently explain how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have an even GPU load during a scene/map in a game, especially in multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, so it does not fix anything on a practical level if you're capping the GPU to render just "enough" frames and changing refresh rates accordingly. Go and run COD4 with a max fps cap of, say, 60. Run it at a refresh rate of 120Hz with this limit and then at 60Hz, and tell me what "perceivable difference" you get from effectively doing the exact same thing this monitor offers, minus the gigantic price tag (none whatsoever).

Your last statement contradicts your previous one, from what I read. I'm giving you this example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work. You cannot have one without the other: frame rate synced with refresh rate = uneven GPU load, so you still get stuttering and tearing, versus the normal way, which is render as many frames as you can = maximum possible GPU load, which lessens the effect of tearing/stuttering but then becomes directly linked to monitor tearing if the refresh rate is too low. No add-in monitor circuit board will un-link the effects of this.

And the above does not even take into account that a lot, and I do mean A LOT (vast majority) of old competitive games, like Quake III and Unreal, increase movement speed and/or sensitivity the higher the frame rate gets, due to the way the old game engines work, so Mr Carmack, of all people, should know better than to endorse this worthless crap. Any of the old school competitive gamers will give you exactly the same reasons for why it won't work.


----------



## The Von Matrices (Oct 24, 2013)

Am* said:


> I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag.
> 
> Your last statement contradicts your previous, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work.



I think your example is unrealistic.  Nvidia never said that your solution of a locked maximum frame rate wouldn't achieve a similar goal.  Instead, they said it was a bad solution because it required developers to program games with very little detail so that they never drop below a monitor's refresh rate.  This isn't practical since any game has scenes more complicated than others, and it makes no sense to run the GPU at 1/4 load 99.9% of the time just so that it never drops below the monitor's refresh rate the remaining 0.1% of the time.

You also define "stuttering" and "lag" differently than NVidia does.  NVidia refers to "lag" as the time between when the GPU finishes rendering a frame and when the next monitor refresh comes and that frame is displayed.  "Stutter", in NVidia's terms, is the variance in "lag".  The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-Sync.  However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter".  Humans anticipate what will occur in the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived.  The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30 fps) is not nearly as big an issue as a frame being displayed at the wrong time.

I hope you see that this is exactly why running Quake III at an insanely high frame rate works, as you said: it reduces "lag", the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power.  G-Sync achieves the same lag reduction without the necessity of wasting GPU power.
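Using those definitions, the distinction can be made concrete with a toy model (my own sketch; the 60 Hz refresh and the sample frame-completion times are made up): compute each frame's wait for the next fixed refresh tick, then take the spread of those waits as the "stutter".

```python
import math
import statistics

REFRESH_MS = 1000.0 / 60  # fixed 60 Hz refresh interval, for illustration

def vsync_lag(render_times_ms):
    """Per-frame wait between render completion and the next refresh
    tick ("lag" in the sense described above, under a fixed refresh)."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS - t for t in render_times_ms]

# Made-up completion times for four frames (ms from some start point):
lags = vsync_lag([10.0, 30.0, 45.0, 70.0])

# "Stutter" as the variation in lag. On an idealized G-Sync display
# every lag would be ~0 ms, so this spread vanishes.
stutter = statistics.pstdev(lags)
```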


----------



## qubit (Oct 25, 2013)

Am* said:


> Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it.



Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.




Am* said:


> As it so happens now, we can only speculate on how useful it will be and that's exactly what I'm doing.
> 
> I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag (none whatsoever).
> 
> ...



I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.
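The arithmetic behind that doubled-image effect is simple (a trivial helper of my own): when the refresh rate is an exact multiple of the frame rate, each rendered frame is scanned out several times in a row.

```python
def refreshes_per_frame(refresh_hz, frame_fps):
    """How many consecutive refreshes show the same rendered frame when
    the frame rate divides evenly into the refresh rate."""
    return refresh_hz // frame_fps

# 60 fps on a 120 Hz panel: every frame is scanned out twice, so
# movement only advances on every other refresh (the doubled image).
doubling = refreshes_per_frame(120, 60)
```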

I found the NVIDIA g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube


----------



## TheoneandonlyMrK (Oct 25, 2013)

qubit said:


> Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.
> 
> 
> 
> ...



I get it, but I don't think it's the game changer some are saying. IMHO it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.

What I'm saying is that 2-3 high-end cards like the Titan, plus 3 G-Spot-enabled monitors (like NV didn't want and expect this nickname), put this tech out of reach of all but the highest-paid or most enthusiastic PC gamers.
A niche, that's it.


----------



## qubit (Oct 25, 2013)

theoneandonlymrk said:


> I get it, but I don't think it's the game changer some are saying. IMHO it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.
> 
> What I'm saying is that 2-3 high-end cards like the Titan, plus 3 G-Spot-enabled monitors (like NV didn't want and expect this nickname), put this tech out of reach of all but the highest-paid or most enthusiastic PC gamers.
> A niche, that's it.



Whether g-spot is a game changer remains to be seen, I quite agree. However, my feeling on it is that it will be. We'll soon know for sure.

The improvement is equally good on any monitor and resolution configuration as far as I can see. However, I'd have to compare them to really disagree with your point. 

I do find it ironic that when the GPU is rendering faster than the monitor's highest refresh, say 120 Hz, g-spot works just like normal vsync would, lol. A modern PC with a decent graphics card or cards will often achieve this, especially when the game is an older one.

Oh and it's expensive? Never!


----------



## Am* (Nov 2, 2013)

The Von Matrices said:


> You also define "stuttering" and "lag" differently than NVidia does.  NVidia refers to "lag" as the time between the a GPU renders a frame and when the next monitor refresh comes and that frame is displayed.  "Stutter", in NVidia's terms, is the variance in "lag".  The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-sync.  However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter".  Humans anticipate what will occur in the few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived.  The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big of an issue as a frame being displayed at the wrong time.
> 
> I hope you understand that this proves exactly what you said about running Quake III at an insanely high frame rate; this reduces "lag" or the time between the newest frame is generated and the time it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power.  G-Sync does the same lag reduction as this without the necessity of wasting GPU power.



That's because Nvidia are yet again using ideal world scenarios for practical demos.

My entire point was that G-sync fixes nothing, which is still true -- after watching that presentation, I'm even more sure than before. The side-by-side comparisons even show the monitor without G-sync tearing but displaying frames faster than the one with G-sync, which was clearly skipping/jumping frames and "jittering". Watching a pendulum on a screen (Nvidia's pointless demo) or the pointless slow-turning Borderlands 2 demonstration is of no value whatsoever -- I'd love to see someone try using G-sync in a fast-motion FPS like BF3 and see just how much input lag it adds to the already delayed engine that game uses. That G-sync module is nothing more than a hardware-based v-sync frame buffer with extra memory for the monitor -- maybe worth $30 on its best selling day.

This is not even including the fact that this entire problem of tearing is non-existent on fast 120Hz+ panels, with the exception of a few games that run on old engines and suffer from uneven frame pacing in general, regardless of whether they're running on two graphics chips or one (case in point, COD4).

The only imaginable scenario I can think of where this G-sync module would be of any use, is purely in multi-monitor setups where the frames may be being fed unevenly on each different monitor -- but then the question is, is it a problem worth shelling out $175 per monitor for? Absolutely not, and anybody disagreeing with that, is insane ($175 a piece for a 3 monitor setup is $525, not even including the GPUs or any of the monitors).



qubit said:


> Some things you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.
> 
> I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.
> 
> ...



See my answer above. I saw that demo in full and it further proves my point that it will not improve anything -- certainly not for its price, or on any decent TN gaming monitor available these days. His entire argument is excessive/delayed frames > uneven draws by the monitor, so his G-sync module merely gives the monitor an extra-large frame buffer that is fed once each frame is ready from the GPU, plus some proprietary draw calls from G-sync to stop the GPU rendering more frames -- at best a $30 gimmick, and again, nothing revolutionary or worth writing home about.

Now let's look at the real-world case for this half-arsed solution: a capped frame rate means a light GPU load (on Kepler GPUs, which are almost solely reliant on load to run at advertised clocks), which means the GPU runs at a less-than-optimal power state and downclocks; then, when more complex scenes are rendered, it struggles with the load and has to clock back up, causing a delay and therefore stutter (rinse and repeat). This is going to need a lot of driver-side support on a title-by-title basis to work properly, and I seriously doubt they are going to dedicate many -- if any -- man-hours to making that work. That makes it nothing more than a gimmick in my book, and an insanely overpriced one at that, and I am yet to be proven wrong on this, unfortunately -- in practice or in theory.


----------



## qubit (Nov 2, 2013)

Am*, you really don't seem to get the technical details of this. I dunno why you think the monitor needs a frame buffer to work with G-Sync. It just syncs the monitor to the graphics card, that's all. The monitor continues to show the same picture until it gets another lot of display data; that's how LCD monitors work already. The only difference is that the fresh display data currently comes in at regular intervals, whereas with G-Sync on it comes in at irregular intervals.

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature.  :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.


----------



## Wile E (Nov 2, 2013)

qubit said:


> Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card that's all. The monitor continues to show the same picture until it gets another lot of display data, that's all. That's how LCD monitors work already. The only difference is that currently the fresh display data comes in at regular intervals instead of irregular intervals with G-Sync on.
> 
> Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.
> 
> ...


The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.


----------



## Xzibit (Nov 2, 2013)

qubit said:


> Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync?



He might be referring to G-Sync on-board memory



> The pictures show that the FPGA is paired with a trio of 2Gb DDR3 DRAMs, giving it 768MB of memory for image processing and buffering.


----------



## qubit (Nov 2, 2013)

Wile E said:


> The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.
> 
> That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves: if it fails, it's purely their loss; if it succeeds, we'll get other options in the market.



Good point. In the end it's always swings and roundabouts, lol.



Xzibit said:


> He might be referring to G-Sync on-board memory



Duh, I missed that.  I'd love to see a white paper on G-Sync explaining all the technical details of it.


----------



## Am* (Nov 2, 2013)

qubit said:


> Am*, you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card, that's all. The monitor keeps showing the same picture until it gets the next lot of display data; that's how LCD monitors work already. The only difference is that fresh display data currently comes in at regular intervals, whereas with G-Sync on it comes in at irregular intervals.
> 
> Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.
> 
> ...



In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:

http://images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg

From my understanding, the G-Sync module waits for a frame to complete on the GPU, holds it in its own buffer until the monitor has fully completed the last frame, and then passes it on to the monitor for another full draw, preventing tearing, and so on.
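Put as a toy model, that description amounts to something like the sketch below. The class and method names are purely illustrative, since NVIDIA hasn't published the module's actual logic:

```python
# Toy model of the buffering handshake described above. Names and
# structure are illustrative only; the real G-Sync module's behaviour
# is not publicly documented.

class Panel:
    """A display panel that draws whatever complete frame it is given."""
    def __init__(self):
        self.displayed = None

    def scan(self, frame):
        # One full top-to-bottom draw of a single complete frame.
        self.displayed = frame

class GSyncModule:
    """Holds the last completed frame until the panel finishes scanning."""
    def __init__(self):
        self.buffer = None

    def on_frame_complete(self, frame):
        # GPU finished rendering: stash the whole frame in module memory.
        self.buffer = frame

    def on_scan_complete(self, panel):
        # Panel finished its previous draw: release the buffered frame,
        # so every scan shows one complete image (no mid-scan swap,
        # hence no tearing).
        if self.buffer is not None:
            panel.scan(self.buffer)
            self.buffer = None

panel = Panel()
module = GSyncModule()
module.on_frame_complete("frame 42")   # GPU delivers a completed frame
module.on_scan_complete(panel)         # panel becomes ready; frame is shown
print(panel.displayed)
```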

By all means, explain to me what "technical details" I'm missing that you aren't, or how you think it works. Because I'm pretty certain I'm understanding it perfectly.



Xzibit said:


> He might be referring to G-Sync on-board memory



Bingo.


----------



## qubit (Nov 2, 2013)

Am* said:


> In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:
> 
> http://images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg
> 
> ...



Ok, so I missed the bit about the memory buffer on the G-Sync module, but that doesn't actually change the principles of what I'm saying. As the NVIDIA CEO said himself, the system is simple in principle but complex to implement properly in practice. It's similar to a jet engine: not too complex in principle, but fiendishly difficult to build one that works properly.

All that diagram shows is how things currently work with vsync off and a standard monitor. Of course you see tearing. Why didn't you show the one with the irregular GPU outputs that the monitor syncs to with G-Sync on? That would have been much more relevant.

Another way to think about G-Sync is as adaptive vsync without the tearing, although there are important subtleties there, such as the reduction in latency.

One thing to realize is that if the GPU is putting out frames faster than the monitor's maximum refresh rate (say, 144 Hz), then the system goes back to a standard vsync-on arrangement, i.e., as if G-Sync weren't there, and the GPU reverts to being synced to the monitor. This will typically be the case when playing old games on modern hardware.

However, with modern demanding games and high resolutions we know that a solid 144Hz cannot be maintained. That's where G-Sync syncs the monitor to the GPU, giving the advantages I explained previously. If you want me to repeat it all here, you're out of luck. We're going round in circles already.
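That two-regime behaviour can be sketched in a few lines. The numbers and the function name here are illustrative only, not from any NVIDIA spec:

```python
# Rough sketch of the fallback behaviour described above: when the GPU
# outruns the panel, pacing caps at the panel's maximum refresh (like
# ordinary vsync); otherwise the refresh interval simply follows the
# GPU's frame time. Illustrative only.

MAX_REFRESH_HZ = 144
MIN_FRAME_TIME_MS = 1000.0 / MAX_REFRESH_HZ  # ~6.94 ms at 144 Hz

def refresh_interval_ms(gpu_frame_time_ms):
    if gpu_frame_time_ms < MIN_FRAME_TIME_MS:
        # GPU is faster than the panel can scan: fall back to
        # vsync-style pacing at the panel's maximum rate.
        return MIN_FRAME_TIME_MS
    # Otherwise the panel refreshes the moment the frame is ready.
    return gpu_frame_time_ms

print(round(refresh_interval_ms(4.0), 2))   # old game, GPU-bound at 250 fps -> 6.94
print(round(refresh_interval_ms(13.5), 2))  # demanding game, ~74 fps -> 13.5
```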


----------



## Serpent of Darkness (Nov 3, 2013)

So, simple and short: G-Sync is going to regulate the monitor's refresh rate so there aren't any dropped frames.  The monitor is told by G-Sync when a frame is coming in, because the module is communicating with the GPU.  A frame gets sent by the GPU to the monitor, and the frame gets properly placed into a 16.667 ms scan window...  It still reminds me of the Dynamic Frame Control in the RadeonPro beta for AMD users.  It's just something being done at a hardware level versus a software level.
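For reference, the 16.667 ms figure is just one scan window at 60 Hz (1000 / 60). A minimal sketch of how a fixed-refresh panel quantizes frame display times, which is the stutter a variable refresh avoids (names are illustrative):

```python
import math

# A fixed 60 Hz panel opens a new scan window every 1/60 s. A frame
# that misses its window has to wait for the next one, so its display
# time is rounded up to a window boundary.

REFRESH_HZ = 60
WINDOW_MS = 1000.0 / REFRESH_HZ  # 16.667 ms per scan window

def display_time_ms(frame_ready_ms):
    """Time the frame actually appears on a fixed-refresh panel:
    rounded up to the next scan-window boundary."""
    return math.ceil(frame_ready_ms / WINDOW_MS) * WINDOW_MS

print(round(WINDOW_MS, 3))              # 16.667
print(round(display_time_ms(20.0), 3))  # frame ready at 20 ms shows at 33.333
```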

So the issue is still the monitor's static refresh rate.  It seems like it would be better if display manufacturers tried to increase the refresh rate above 144 Hz and make it dynamic instead of static.  This would probably be more ideal for AMD graphics cards in CrossFireX than NVidia's SLI.  CrossFireX has a bad habit of shooting frames out like a Gatling gun with more than two GPUs. Maybe add a secondary frame buffer to the monitor in case the frame rate drops below 30 fps (frame times of 33.33 ms and above).  That way the previous frame could be stored until a new frame arrives, or the monitor could have a component that skips a scan while the previous scan stays displayed...

I still think the EIZO FORIS FG2421 240 Hz gaming monitor is more innovative than NVidia G-Sync.  120 Hz scans with a black-out period after the frame is displayed in each 8.33 ms window seems a lot more... creative.  I may invest in this monitor, or three.

For the NVidia users, I hope G-Sync does the job with little to no latency.


----------



## Xzibit (Nov 3, 2013)

This is interesting...

[WCCF Tech] Nvidia G-Sync Will Only Come to ASUS Monitors – Will Come to Other Brands After Q3 2014 [RUMOR]



> Update: Status Changed to RUMOR. *However we have confirmation that G-Sync will stay with ASUS till sometime next year when it launches in full force*.


----------



## quoloth (Dec 13, 2013)

http://www.google.com/patents/US20080055318 It seems ATI already has a patent on this technology? I'm no expert, but it sounds like the same principle being described. The other filings are specifically listed under ATI. I assume AMD owns these now?


----------



## The Von Matrices (Dec 13, 2013)

quoloth said:


> http://www.google.com/patents/US20080055318 Seems ATI already has the patent on this technology? I'm no expert but sounds like the same principle described. The other filings are specifically listed as under ATI. I assume AMD has these now?



It's not exactly the same thing, from what I can tell.  That technology seems to adjust the refresh rate to match the source material, but not on a frame-by-frame basis.  It seems more applicable to syncing fixed-frame-rate video to the display, and to lowering power consumption by setting the monitor to a lower refresh rate, which lowers GPU load.


----------



## Doc41 (Dec 13, 2013)

Not sure if this was posted, but Tom's Hardware detailed this some more, and they got a sample monitor with G-Sync:
http://www.tomshardware.com/reviews/g-sync-v-sync-monitor,3699.html#xtor=RSS-998


----------



## BiggieShady (Jan 3, 2014)

Here are some nice videos of a 144 Hz G-Sync monitor, recorded with a slow-motion camera at 240 fps, demonstrating the benefits:

http://techreport.com/review/25788/a-first-look-at-nvidia-g-sync-display-tech/3


----------



## Prima.Vera (Jan 3, 2014)

Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....


----------



## BiggieShady (Jan 4, 2014)

Prima.Vera said:


> Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....



Low res is preventing you from seeing tearing and smoothness differences? Really?  The tech is proven, tested, reviewed and available; not much room for propaganda.


----------

