Friday, October 18th 2013

NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Solving the decades-old problem of onscreen tearing, stuttering and lag, NVIDIA today unveiled NVIDIA G-SYNC technology, which for the first time enables perfect synchronization between the GPU and the display. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies.

Several years in the making, G-SYNC technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. Scenes appear instantly. Objects are sharper. Game play is smoother.

"Our commitment to create a pure gaming experience led us to G-SYNC," said Jeff Fisher, senior vice president of the GeForce business unit at NVIDIA. "This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-SYNC monitor, you'll never want to go back."

Since their earliest days, displays have had fixed refresh rates -- typically 60 times a second (60 Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's render rate rarely aligns with the monitor's fixed refresh rate, persistent tearing occurs. Turning on V-SYNC (Vertical SYNC) eliminates tearing, but introduces lag and stutter because the GPU must wait for the monitor's next fixed refresh before a completed frame can be displayed.

G-SYNC eliminates this tradeoff. It perfectly syncs the monitor to the GPU, regardless of frame rate, leading to uncompromised PC gaming experiences.
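The timing argument above can be made concrete with a small illustrative sketch. This is not NVIDIA's implementation, and the render times below are made up; it simply models a fixed 60 Hz display where V-SYNC forces each finished frame to wait for the next refresh tick, versus a variable-refresh display that scans out the moment a frame is ready:

```python
import math

# Illustrative sketch only -- not NVIDIA's implementation. Models the
# tradeoff described above: with V-SYNC on a fixed 60 Hz display, a
# finished frame must wait for the next refresh tick; a variable-refresh
# display (the G-SYNC idea) shows each frame the moment it is ready.

REFRESH = 1 / 60  # fixed refresh interval, in seconds

# Hypothetical, deliberately uneven GPU render times for ten frames (seconds)
render_times = [0.012, 0.020, 0.015, 0.025, 0.018,
                0.011, 0.022, 0.016, 0.019, 0.014]

def vsync_display(times):
    """Each finished frame waits for the next fixed refresh tick."""
    t, shown = 0.0, []
    for r in times:
        t += r  # frame finishes rendering at time t...
        shown.append(math.ceil(t / REFRESH) * REFRESH)  # ...shown on next tick
    return shown

def variable_refresh_display(times):
    """The display refreshes the moment each frame is ready."""
    t, shown = 0.0, []
    for r in times:
        t += r
        shown.append(t)
    return shown

v = vsync_display(render_times)
g = variable_refresh_display(render_times)
# Milliseconds each frame sits idle waiting for the fixed refresh tick:
lag = [round((a - b) * 1000, 2) for a, b in zip(v, g)]
print(lag)
```

A real panel has a minimum refresh rate and other electrical constraints, so this is only a sketch of the timing argument, but it shows where V-SYNC's added latency and uneven frame pacing come from: the per-frame wait varies from tick to tick, which is perceived as stutter.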

G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)

Leading Game Developers Blown Away
Game developers have quickly embraced the benefits of G-SYNC technology, which enables their games to be played seamlessly.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better." - Tim Sweeney, founder, Epic Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!" - Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter." - John Carmack, co-founder, id Software

Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality." - Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level." - Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one." - Sean Shih, global product marketing director, TPV Technology, (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays." - Jeff Volpe, president, ViewSonic

Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good." - Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter." - Elan Raja III, co-founder, Scan Computers.

147 Comments on NVIDIA Introduces G-SYNC Technology for Gaming Monitors

#1
de.das.dude
Pro Indian Modder
just saw this on facebook!
#2
Unregistered
It's going to put a £100+ premium on the price of the monitor though I reckon.
#3
Am*
Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available, by forcing it to drop excessive frames or by simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implement something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250 FPS.

Also, have these guys ever heard of Dxtory?

Please GTFO with more of this proprietary, overpriced and useless bullshit to further fragment PC gaming, Nvidia.

EDIT: and good lord, that stupid name...G-SYNC? Sounds like the name of an Nsync tribute band.
#4
rokazs1
Don't forget 780ti !:cool:
#5
SIGSEGV
sigh, another proprietary crap from nvidia :wtf:
#6
Xzibit
I was wondering what Nvidia was going to do with all the un-sold Tegra 4 chips

Besides Nvidia users don't have stuttering nor tearing....right guys ???
#7
Renald
I get 60+ FPS in most games with a 200€ card.

Why would it be useful to have that? It won't solve multi-GPU problems, and it's useless for anything else.


I must surrender, it's too stupid, even from them. :respect:
#8
Prima.Vera
How much did nVidia pay all those guys to deliver North Korean-style adulation declarations for this new crap??
#9
RejZoR
Does it even work with AMD GPU's ? If not, it's as useless as it can get.
#10
SIGSEGV
RejZoR said: Does it even work with AMD GPU's ? If not, it's as useless as it can get.
Cristian_25H said: G-SYNC technology includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. (A full list of GPUs that support NVIDIA G-SYNC is available here.)
hope it helps..
#11
Recus
Desperate AMD fanboys like AMD's "exclusive" features: onscreen tearing, stuttering and lag. :rolleyes:
Am* said: Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available by forcing it to drop excessive frames or simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implements something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250FPS.

Also, have these guys ever heard of Dxtory?

Please GTFO with more of this proprietary, overpriced and useless bullshit to further fragment PC gaming, Nvidia.

EDIT: and good lord, that stupid name...G-SYNC? Sounds like the name of an Nsync tribute band.
Today local pharmacy closed earlier. Did you attacked them because they are selling useless, proprietary and gimmicky pharmacy companies drugs?
RejZoR said: Does it even work with AMD GPU's ? If not, it's as useless as it can get.
Mantle won't work on Nvidia. Why isn't that useless?
#12
TheMailMan78
Big Member
Cadaveca and I were talking about this about a year ago. Sometimes (no matter the mfg of the GPU) the "flow" of the animations and movement on the screen would be so smooth that it gave the same feeling as watching a movie on a 120 Hz HD monitor. However the "sensation", for lack of a better word, was always short lived. He tried to figure out what was causing it and so did I. He also asked W1zz about it but it's a very hard thing to explain. You either "know" the feeling or you don't. After reading this I think NVIDIA might have narrowed it down to a hardware level, judging by what the developers were saying in the PR. If so, then this is gonna be awesome.
#13
Am*
Recus said: Desperate AMD fanboys likes AMD's "exclusive" features onscreen tearing, stuttering and lag. :rolleyes:
Butthurt Nvidiot strikes again, what a surprise...might want to check my system specs before you embarrass yourself any further.
#14
MxPhenom 216
ASIC Engineer
I think this could be pretty cool. I'd be interested to try out a monitor with a G-Sync module, that's for sure.

Asus 27" 1440p monitor with G-Sync......anyone?
#15
Recus
Am* said: Butthurt Nvidiot strikes again, what a surprise...might want to check my system specs before you embarrass yourself any further.
And who can confirm your specs, mind invalid. You better go and write petition to AMD asking them to stop driver updates because games related problems aren't GPU makers problems it's game developers problem. :eek:
#16
wickerman
I really like this idea, but if it requires me to replace my u2711 I see that as a bit of a problem. Sure 2560x1440 panels have come down in price significantly since I bought mine, but it seems like a bit of a waste to replace my current panel with something that is the same resolution. I'd rather jump on the 4k train when the prices become more reasonable.

I suppose if I had a friend/family member willing to buy mine and the replacement monitor offered benefits in other areas (color accuracy, response time, lower power, etc) then I could be talked into replacing my u2711 with another 1440p/1600p panel that supported this tech.

*edit*
Also, if this is Nvidia-exclusive tech, that is a bit annoying. I tend to keep my monitors for a while, but I could switch back and forth between AMD and Nvidia graphics quite frequently. It would be kind of annoying if we came across a generation down the line where AMD has the superior performance but I have to wait for Nvidia to catch up just to take advantage of the reason I upgraded my monitor.
#17
The Von Matrices
I look forward to learning more about this technology and seeing it implemented. It's basically dynamic refresh rate for monitors. I do see this making a huge difference for people like me who like to turn up the details at the expense of having the frame rate frequently drop below the monitor's refresh rate.
#18
Am*
Recus said: And who can confirm your specs, mind invalid. You better go and write petition to AMD asking them to stop driver updates because games related problems aren't GPU makers problems it's game developers problem. :eek:
My PC is barely mid range compared to the systems some people here have, and you want me to prove my specs? Daaymn, you must be pretty broke to even consider saying that, no wonder you're mindlessly trolling the forums.

P.S. will attach a CPU-Z/whatever system validation is quickest, when I can be arsed to do it.
#19
the54thvoid
Super Intoxicated Moderator
It looks to be a good thing but for God's sake don't be proprietary with it. And wtf with the tone of this thread?
#20
Crap Daddy
It is proprietary and will cost money compared to Mantle which is proprietary but comes for free albeit with just one game. Guess we will soon have to own two systems for gaming depending on what games we like, one NV and one AMD.
#21
The Von Matrices
My hope is that their pricing estimates are not too far off. I would be willing to pay $50 extra per monitor for this. If NVidia has this working with their 3D Surround implementation, then I would seriously consider replacing my graphics cards and monitors for an upgrade.
#22
1d10t
I reckon the previous hype was nVidia boasting "3D games are the future". And where is all of that now? :p

Is this nVidia's response to AMD Mantle? :p

OR...obviously nVidia can't reach above 60 FPS at the highest detail, but instead of admitting it they restricted it :p

OR...they aren't good at 4K. But instead of making another competitive card they make their own monitor that only does 1080p :p

-= edited=-
erocker said: *Waits for demonstration by a 3rd party*
I bet it will be organized by Origin PC :p
#23
Hilux SSRG
If Nvidia can eliminate stuttering and provide ultrafast response, I'm very interested. I hope we get to see some videos soon.
#24
erocker
*
*Waits for demonstration by a 3rd party*
#25
Solidstate89
Am* said: Yet again, more useless, proprietary and gimmicky crap from Nvidia. This is nothing that can't be implemented as a firmware upgrade for literally every monitor available by forcing it to drop excessive frames or simply stopping the GPU from rendering more than the refresh rate allows in the drivers. I'm pretty certain several monitors, including mine, already implements something like this unofficially, as I see no tearing even in ancient games like COD4 and Q3A that run well over 250FPS.

Also, have these guys ever heard of Dxtory?

Please GTFO with more of this proprietary, overpriced and useless bullshit to further fragment PC gaming, Nvidia.

EDIT: and good lord, that stupid name...G-SYNC? Sounds like the name of an Nsync tribute band.
Really? You think bidirectional communication, with the monitor able to control the frame rate delivery of the GPU so it's completely in sync with the monitor, can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make such an inaccurate and ballsy statement. What standard do you think it could already use? There isn't a display protocol capable of doing that. From what I've read on AnandTech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on ANY of today's monitors without that specific hardware bundle.