
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
I think somebody forgot that fps are not Hz, so 60 fps doesn't mean 60 Hz. The displayed image consists of however many frames are rendered in one second, while a monitor's refresh rate depends on three factors: horizontal frequency, resolution and response time.
Care to explain where G-Sync takes part in this?
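For what it's worth, the fps-vs-Hz relationship is easy to sketch numerically. This is a rough toy model (my own, assuming perfectly even frame pacing): with a fixed 60 Hz refresh, the monitor simply repeats the last completed frame whenever fewer than 60 frames are rendered per second.

```python
def refreshes_showing_new_frame(refresh_hz: int, render_fps: int) -> int:
    """Over one second, count how many of the refresh_hz scan-outs present
    a frame that wasn't already shown (the remainder repeat an old frame)."""
    shown = set()
    new = 0
    for i in range(refresh_hz):
        t = i / refresh_hz                 # time of this refresh tick
        latest = int(t * render_fps)       # index of last frame completed by time t
        if latest not in shown:
            shown.add(latest)
            new += 1
    return new

print(refreshes_showing_new_frame(60, 60))  # 60: every refresh shows a new frame
print(refreshes_showing_new_frame(60, 40))  # 40: the other 20 refreshes repeat one
```

So the panel always refreshes 60 times per second; only the number of *distinct* frames shown tracks the render rate.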
 
Joined
Feb 23, 2008
Messages
1,064 (0.17/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
I'm surprised Carmack sounded so positive about this thing. I respect him as much as the next guy, but I can't see it that way at the moment.

I am curious though: how will G-Sync do in, say, fighting games that require to-the-frame accuracy to pull off the best combos? It could be either a real boon or a curse for them.
 
Joined
Sep 19, 2012
Messages
615 (0.14/day)
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that all people can only buy graphics cards and monitors that are capped at 30 FPS so no one can have an unfair advantage; who needs more alternatives anyways?!!

Who do these people think they are by offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare them?!!

/S :rolleyes:

Yes, how dare corporation care about anything else other than making money!

/S :rolleyes:


Just because the world is how it is now, doesn't mean it's any good. But what the Hell, you guys can salute your new green, blue, red or WTF ever overlords in any way you want, it's not me being dead inside while shielding myself further and further away from things that should count more, with consumerism gimmicks.

This makes me sick to my stomach, but what can I do...
 
Joined
Aug 16, 2004
Messages
3,285 (0.44/day)
Location
Sunny California
Processor AMD Ryzen 7 9800X3D
Motherboard Gigabyte Aorus X870E Elite
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6000MHz Corsair Vengeance
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 990 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply beQuiet Straight Power 12 1500W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
exactly.. if i decide to please someone, i do it for free..

remember when your momma asked you to take out the litter back then? did you ask for like $270 for the service? the whole world will soon be eaten by the false religion that is the economy, you just wait.

Yes, how dare corporation care about anything else other than making money!

/S :rolleyes:


Just because the world is how it is now, doesn't mean it's any good. But what the Hell, you guys can salute your new green, blue, red or WTF ever overlords in any way you want, it's not me being dead inside while shielding myself further and further away from things that should count more, with consumerism gimmicks.

This makes me sick to my stomach, but what can I do...

Aww, how nice of you both, but let me ask you a few questions: Do you have a job? If you do, do you work for free? I mean, do you offer your services without expecting to be remunerated for them? And if that's the case, how do you support yourself and/or your family? Through charity/welfare?

I mean, you have to find a way to pay your bills somehow, am I right?

I would assume that people who work for these companies (and please note that this applies to any given company in our "evil economy") expect some sort of compensation for their work, wouldn't they?

Anyway, I'm not going to discuss the basics of how our society works, this isn't the right forum to do so, but I just found your counterargument really amusing. Besides, no one is pointing a gun at your face forcing you to buy these new monitors, so there's no reason to get all worked up about this superfluous piece of technology when there are obviously way more important things to worry about and fix, like the state of the economy, world hunger, world peace and other serious matters...:rolleyes:

Back to topic: it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
nvidia said that gamers will love it.. this sounds like notebook gamers will love it.. are you ready to ditch your $1300 monitor for an even more expensive one with variable refresh rate, which doesn't make your response any faster?

ok, i have to give you one thing, you can now watch your ass being fragged without stutter :D

By that logic there should be no reason why we should buy new graphics cards to run at more detailed settings because the increased detail doesn't make you respond any faster. Have fun running at "low" settings on your integrated graphics. There's a lot more to video gaming than competition.

Back to topic: it sounds like nVidia is genuinely interested in fixing this V-Sync problem that has existed for so long. I for one am excited to see someone focusing research on addressing this issue; I just hope they offer the results of their findings to more customers than just owners of new monitors and/or Kepler-based cards.

Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.

Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.
 
Joined
Aug 16, 2004
Messages
3,285 (0.44/day)
Location
Sunny California
Processor AMD Ryzen 7 9800X3D
Motherboard Gigabyte Aorus X870E Elite
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6000MHz Corsair Vengeance
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 990 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply beQuiet Straight Power 12 1500W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
By that logic there should be no reason why we should buy new graphics cards to run at more detailed settings because the increased detail doesn't make you respond any faster. Have fun running at "low" settings on your integrated graphics. There's a lot more to video gaming than competition.



Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.

For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:


Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.

Exactly, it seems like they have taken addressing this problem to heart, with innovations like Adaptive V-Sync, FCAT and now G-Sync.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Sounds like a GTX xx0 card combined with an G-SYNC enabled monitor will offer a pretty damn sweet BF4 experience.

Oh nVidia, you big meanies, no wonder peeps here are mad.
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I think people should read Nvidia's own FAQ on G-Sync.

You can draw a conclusion there.

Your comment does not disprove mine. As I quoted, the signaling technology should be possible to reverse engineer. At that point anyone can produce monitors or video outputs that comply with that standard. G-Sync will be NVidia-exclusive for a few years just because no one has had time to dissect it; it doesn't mean that there will never be generic components compatible with it. The only difference is that third parties won't use the trademarked term "G-Sync."

You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
You also need to consider the source when you read the quote from the FAQ. All manufacturers advertise that their products only work with first-party accessories. It doesn't mean that third parties can't make compatible accessories.

Which is the accessory, the GPU or the G-Sync module?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
Which is the accessory, the GPU or the G-Sync module?

Since G-Sync will be talking to the driver, when was the last time Nvidia let outsiders tinker with that?

Once you know what commands are being sent over the cable, you can implement them in your own drivers or hardware. For example, if you create a monitor that can read all the signals sent via the G-Sync protocol and respond to them just like a genuine G-Sync monitor, why would this matter to the drivers? A properly reverse-engineered product should be no different from the genuine device. I doubt NVidia wants manufacturers to do this, but I see no reason, engineering or legal, that third-party manufacturers couldn't, and the driver shouldn't be able to tell the difference.

The only hurdle would be the investment required to reverse engineer the protocol, and if genuine G-sync doesn't catch on, then there will be no financial incentive and no third party will bother to do it.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.

When was the last time AMD released ANYTHING game-changing? No, Mantle doesn't count, because the world doesn't need another API; we have DirectX and it works just great. No, TrueAudio doesn't count, because no-one gives a shit.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
So, all nvidia have done is reverse the sync direction, making the monitor sync with the card's varying frame-rate output instead. A simple enough change technically, but it looks like the visual impact is big, judging by the PR and articles I've read.

Couple of things that might be worse though are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature which my monitor has. The PR doesn't mention LightBoost anywhere, so I expect both of these motion artifacts to be present. The motion blur in particular is horrible and I'd rather have a bit of lag and occasional stutter than put up with this. I'd have to see G-Sync in action to properly judge it, though.

Also, it would be interesting to see this varying video signal on an oscilloscope.

*To check out the shape distortion, just open a window on the desktop, make it stretch from top to bottom, but be rather thin, then move it from side to side with the mouse. The shape will change with the top leading the bottom - moving the mouse faster makes the effect stronger. This is due to the scanning nature of the video signal, where the bottom part of the window (the whole picture, in fact) is quite literally drawn later than the top part. Note that the slower the monitor refresh, the worse the effect. Note that it's separate to the tearing artifact that you're also likely to see.
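The size of that lean is easy to estimate. A back-of-the-envelope sketch (my numbers, not from the post above), assuming the full-frame scan-out takes about one refresh period and the window spans the full panel height:

```python
def skew_pixels(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate horizontal lean of a full-height moving edge.

    The bottom scanline is drawn ~one refresh period after the top,
    so by the time the scan-out reaches the bottom the edge has moved
    speed * period pixels: the top visibly leads the bottom.
    """
    return speed_px_per_s / refresh_hz

print(skew_pixels(1000, 60))   # ~16.7 px lean dragging at 1000 px/s on a 60 Hz panel
print(skew_pixels(1000, 120))  # ~8.3 px at 120 Hz: slower refresh means worse skew
```

This matches the observation that the effect gets worse at lower refresh rates: halving the refresh rate doubles the lean for the same drag speed.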

LightBoost strobing blanks the display and only shows the completed picture, eliminating this effect. Of course, this comes at the expense of maxed out lag. At least the lag is very short at 120Hz. Sometimes you just can't win, lol.
 
Joined
Oct 30, 2008
Messages
1,768 (0.30/day)
System Name Lailalo
Processor Ryzen 9 5900X Boosts to 4.95Ghz
Motherboard Asus TUF Gaming X570-Plus (WIFI
Cooling Noctua
Memory 32GB DDR4 3200 Corsair Vengeance
Video Card(s) XFX 7900XT 20GB
Storage Samsung 970 Pro Plus 1TB, Crucial 1TB MX500 SSD, Segate 3TB
Display(s) LG Ultrawide 29in @ 2560x1080
Case Coolermaster Storm Sniper
Power Supply XPG 1000W
Mouse G602
Keyboard G510s
Software Windows 10 Pro / Windows 10 Home
This honestly isn't worth it, nVidia. VSYNC is not such a terrible thing that it needs a special dedicated chip which you aren't opening to the entire industry, which will increase production costs, and which will likely only make the situation worse later when someone comes out with an alternative that does it without all the negatives.

You should have just made the tech and licensed it for everyone to use, then enjoyed the royalties for years. I seriously doubt it requires a Kepler GPU to use. We already know PhysX will work on non-NV cards. This isn't something special either. You're just setting yourself up for the fall later when someone, maybe even AMD, does it and does it better, and for everyone to use.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
This would have been a better approach, instead of replacing monitors:

Leadtek NVIDIA QUADRO SYNC


NVIDIA Quadro Sync

Nvidia Quadro Sync User Guide

Nvidia Quadro G-Sync II User Guide
 
Joined
Apr 30, 2008
Messages
4,897 (0.81/day)
Location
Multidimensional
System Name Boomer Master Race
Processor Intel Core i5 12600H
Motherboard MinisForum NAB6 Lite Board
Cooling Mini PC Cooling
Memory Apacer 16GB 3200Mhz
Video Card(s) Intel Iris Xe Graphics
Storage Kingston 512GB SSD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case MinisForum NAB6 Lite Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply 120w External Power Brick
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 11 Home 64bit
Benchmark Scores Don't do them anymore.


:banghead::banghead:

This site has gone to shit with all the fanboys & trolls, Jesus Christ lolz
 
Joined
Sep 19, 2012
Messages
615 (0.14/day)
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS

Yeah, 'kay. You've obviously got everything figured out, and your only real problem in life seems to be that you need a better-paying job. Righahahahahat... :roll:


Again, open or bust.

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.


But I digress, we shall see. I'm far from getting a gaming monitor anytime soon either way.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,571 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.

Just want to point out that no. Not in the least. Alternatives are incoming and to an extent already here, but right now? No way, no how.

EDIT: And I just can't fathom the depths to which this place has plunged. All this rage... for something they have not seen IRL. And if this does what it says it does, you do have to see it IRL before you can pass judgement.
 

Am*

Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
Are you like, intentionally playing dumb or are you just not getting this? The refresh rate of the monitor will always be 60Hz, there is no way right now to modulate it, dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. Doesn't work that way. But you can still get some screen tearing, and most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync, simply removes the restriction on frame rates when it drops below 60. So instead of hard locking your game into either 60FPS or 30FPS, if it drops below, it simply reacts like V-sync isn't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and can not control it. Why do you even think Adaptive V-sync can change the monitor's refresh rate? How do you expect Adaptive V-sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware agnostic standard...at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor that works the way you describe these days and no firmware update is going to change that. Ever.
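The V-Sync vs. Adaptive V-Sync behaviour described in the quote above can be sketched with a toy model (illustrative numbers only, assuming classic double-buffered V-Sync on a 60 Hz panel):

```python
def vsync_effective_fps(refresh_hz: float, render_fps: float) -> float:
    """Classic double-buffered V-Sync: a frame that misses a refresh waits
    for the next one, so the effective rate snaps to refresh_hz / n for the
    smallest integer n the GPU can keep up with (refresh_hz / n <= render_fps)."""
    n = 1
    while refresh_hz / n > render_fps:
        n += 1
    return refresh_hz / n

def adaptive_vsync_fps(refresh_hz: float, render_fps: float) -> float:
    """Adaptive V-Sync: capped at the refresh rate when the GPU is fast
    enough, but simply uncapped (tearing and all) when it falls below."""
    return refresh_hz if render_fps >= refresh_hz else render_fps

print(vsync_effective_fps(60, 45))  # 30.0 -- hard lock to the next divisor of 60
print(adaptive_vsync_fps(60, 45))   # 45   -- runs at the GPU's own rate instead
```

Note that neither function touches the monitor: the refresh rate stays at 60 Hz throughout, which is exactly the point being made.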

Well quite clearly, you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me a single person suffering from their monitor's higher refresh rate in games that never even exceed it? It has never been a problem, and Nvidia are yet again trying to fix a problem that never existed in the first place; quite clearly you are the one being ignorant, ignoring simple facts without any experience of what causes tearing.

If a monitor is running at a refresh rate above the framerate of the GPU, then unless the monitor does some image post-processing, scaling or frame duplication (like those 240Hz TVs), it will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless, because it is trying to fix a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz, it will only draw the frames that it has -- if the framerate is less than the refresh rate, it won't affect the monitor either way.

The ONLY problem that currently exists that has anything remotely to do with monitors is that when an old game runs too fast, you have to choose between A. running at a higher framerate and experiencing tearing or B. capping the framerate with V-sync.

The biggest problem with V-sync is that it drops GPU utilization to the point where a GPU can barely distinguish between idle and 3D load, and this is a problem that occurs only with Nvidia cards, even on single-GPU setups (because they have too many clock profiles to switch between), which causes stuttering. AMD doesn't have this problem because they have idle, 2D (Blu-ray) and 3D load clock profiles, nothing more. This gimmick does nothing whatsoever to fix that, and every Nvidia GPU up to this point, from the 200 to the 700 series, has had this problem; Maxwell will continue to have it until they address every affected game individually in the drivers 1-2 years after initial release. My GTX 285 had this problem, my 460 had it until they fixed most of them 2 years back, and my 660 had it, until I returned it. When they can work out a way to scale their GPU cores/clusters to imitate old cards, they will solve the problem instantly.

This G-SYNC crap does not affect this problem in either a positive or a negative way, therefore it is worthless (even more so to 120Hz/144Hz fast gaming monitor users). By all means feel free to explain any benefits from this tech that I am not seeing.

Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.

Please give it a rest, bud. This G-Sync and Shield (the joke of an Android tablet slapped onto a 360 controller, with about 30 games on its support list, which almost nobody outside of North America even knows or gives a single shit about) are not innovations in the slightest, and they are exactly why Nvidia are slowly losing their consumer GPU market share to AMD, as well as the reason why hardcore PC gamers buying into this crap will continue to get ridiculed by our casual PC and console gaming brethren. Instead of investing in features that matter, they continue churning out more pricey gimmicks. If that is what you're into, more power to you; continue buying Nvidia. I, for one, see these "innovations" as gimmicks that add no value whatsoever to their GPUs, or to anything else employing this sort of tech that will come at a premium because of it in comparison to AMD.

Nvidia (and AMD) deserve praise for a lot of things -- G-Sync and Shield are not among them.
 
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
Thank you. It seems as if for some people NVidia producing any technology first makes it inherently evil. I don't care who invented it, I support the technology. Kudos to NVidia for doing it first.
For those who claim this is proprietary with no evidence, I quote Anandtech stating exactly the opposite:
Just because no one else supports it yet doesn't mean that no one else ever will. Someone has to be first. Most standards aren't designed by committee but by everyone adopting one manufacturer's implementation for simplicity.

If G-Sync tries to manipulate the TMDS and DDC clocks over DisplayPort, there's a high probability it will leave DPCP (DisplayPort Content Protection) exposed. Now that will be unpleasant for some. And unless G-Sync communicates over both return channels, TMDS and DDC, it will cripple frame-sequence rendering across the two channels. And let me guess: this will not work with SLI and/or stereoscopic 3D.

Innovations like SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.

Ah yes... that's great. Let me ask a simple question: do you own an Nvidia Shield, or have you at least ever tried one? Do you use Android phones? Have you ever tried running games on the much cheaper Google Nexus 4?
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Thinking about it a little more: while this reduces lag, you can't eliminate it like NVIDIA claims, and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on. What happens? The monitor waits for the GPU, not refreshing until it gets the command from the GPU to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame and, crucially, only ever doing so once.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU is slowed down to around 15-20fps (can easily happen). You'll still have lag corresponding to this variable frame rate and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved however since the monitor displays the frame as soon as it's ready, but more importantly perhaps there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.
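To make the difference concrete, here's a toy timing model of the wait qubit describes: with a fixed refresh, a finished frame sits in the buffer until the next scheduled refresh tick, while a G-Sync-style display shows it immediately. The function names and the 60 Hz/17.67 ms numbers are illustrative assumptions, not anything from NVIDIA's implementation, and this models only the display-side wait, not the GPU's render time.

```python
# Toy timing model contrasting a fixed-refresh display with a
# display that refreshes the moment a frame is ready.

def display_latency_fixed(frame_ready_ms, refresh_hz=60):
    """Time a finished frame waits for the next scheduled refresh tick."""
    period = 1000.0 / refresh_hz
    # Index of the last refresh tick at or before the frame finishing.
    ticks_elapsed = frame_ready_ms // period
    next_refresh = (ticks_elapsed + 1) * period
    return next_refresh - frame_ready_ms

def display_latency_synced(frame_ready_ms):
    """With the monitor waiting on the GPU, the frame is shown at once."""
    return 0.0

# A frame finishing 1 ms after a 60 Hz refresh tick waits almost a
# full refresh period (~15.7 ms) before anyone sees it.
wait = display_latency_fixed(frame_ready_ms=17.67)
print(f"fixed refresh adds {wait:.2f} ms of wait")
print(f"synced refresh adds {display_latency_synced(17.67):.2f} ms")
```

Note this only removes the display-side wait; the render time itself, which is the point above, is untouched either way.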

The only thing I wonder is if the player will notice dynamic artifacts with the game responsiveness, since the frame rate and synced refresh rate vary continuously and significantly? It might lead to some weird effect where the player feels disoriented perhaps? Maybe even inducing nausea in some people? I don't know, but this is something to look out for and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.

Well quite clearly, you're the one playing dumb, seeing how you have yet to explain to me why someone running a 120Hz or even a 75Hz monitor, would benefit from DROPPING the refresh rate to the frame rate of the game. Can you point out to me, a single person suffering from their monitor's higher refresh rate in any games that never even exceed it? It has never been a problem and Nvidia are yet again trying to fix a problem that never even existed in the first place, and quite clearly you are being ignorant by ignoring simple facts and have not had any experience with what causes tearing.

If a monitor is running at a refresh rate above the framerate of the GPU, unless the monitor does some image post-processing, scaling or frame duplication (like those 240Hz TVs), the monitor will only draw the frames it has. End of. That renders this G-SYNC gimmick worthless, because it is trying to solve a problem that was never there in the first place. Whether your monitor runs at 30Hz or 120Hz, it will only draw the frames that it has -- if the framerate is less than the refresh rate, it won't affect the monitor either way.

I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. synced the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.
 
Joined
Sep 24, 2010
Messages
56 (0.01/day)
Processor i7 950
Motherboard Asus P6T Deluxe V2
Cooling Zalman CNPS10X Extreme
Memory Kingston HyperX 3x 2GiB
Video Card(s) MSI GTX 570
Storage OCZ Vertex 3 120GB + 8TB
Power Supply Chieftec 750W
Clearly neither do you then. Adaptive V-Sync is there to:
a) remove image tearing
b) avoid halving the framerate when FPS drops below a certain level, as normal V-Sync does

Nvidia page:
NVIDIA's Adaptive VSync fixes both problems by unlocking the frame rate when below the VSync cap, which reduces stuttering, and by locking the frame rate when performance improves once more, thereby minimizing tearing.
Below 60 (120) fps you get no stuttering (basically vsync turns off), but you see tearing. Above 60 (120) fps vsync is on as usual, resulting in no tearing and no visible stutter.
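The halving effect Adaptive VSync avoids can be sketched with a simple model: classic vsync makes each frame wait a whole number of refresh periods, so a 45 fps render rate on a 60 Hz panel collapses to 30 fps, while adaptive vsync disengages below the refresh rate and lets the raw framerate through (accepting tearing). This is an illustrative approximation of the behaviour described above, not a driver-accurate model.

```python
import math

def effective_fps(render_fps, refresh_hz=60, adaptive=False):
    """Approximate displayed frame rate under a toy vsync model."""
    if adaptive and render_fps < refresh_hz:
        return render_fps  # vsync disengaged: no cap, but tearing possible
    # Classic vsync: each frame occupies a whole number of refresh periods.
    frame_time = 1.0 / render_fps
    period = 1.0 / refresh_hz
    refreshes_per_frame = math.ceil(frame_time / period)
    return refresh_hz / refreshes_per_frame

print(effective_fps(45))                 # classic vsync: halved to 30.0
print(effective_fps(45, adaptive=True))  # adaptive: stays at 45
print(effective_fps(120))                # above refresh: capped at 60.0
```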

Really, why would NVIDIA develop something that's already solved, and why would a company like Asus join them? Nobody is that stupid. Also, this tech is praised by various tech journalists and devs like John Carmack, who have seen it in action.

The only sh1tty thing about this is vendor lock-in. We need this for AMD and Intel too, on every TV and monitor. Let's hope NVIDIA won't be stupid and will open it up.
 
D

Deleted member 24505

Guest
Thinking about it a little more, while this reduces lag, you can't eliminate it like nvidia claims and here's why.

At the moment, whether vsync is on or off, once the GPU has rendered the frame, it doesn't get displayed until the next monitor refresh (this is independent of the refresh frequency, of course) so you get lag. The amount of lag will vary too, depending when in that cycle the GPU has rendered the frame. If vsync is off, then you'll get tearing and stutters, regardless of whether the GPU is rendering faster or slower than the refresh rate. Remember there's still lag in the system due to the rendering time of the GPU and general propagation delay through the controller and also the rest of the computer.

Now you turn G-Sync on. What happens? The monitor waits for the GPU, not refreshing until it gets the command from the GPU to do so. This results in the frame being displayed the instant it's ready. Lag may be reduced, but not eliminated. Tearing and stuttering will be eliminated, however, because the monitor will be displaying every frame and, crucially, only ever doing so once.

Lag isn't eliminated, because the GPU still requires time to build the frame. Imagine an extreme case where the GPU is slowed down to around 15-20fps (can easily happen). You'll still have lag corresponding to this variable frame rate and therefore the game will feel horribly laggy. Responsiveness to your controls will be improved however since the monitor displays the frame as soon as it's ready, but more importantly perhaps there will be no tearing or stutters, which is a significant benefit, because both of these effects look bloody awful. NVIDIA obviously gets this.

The only thing I wonder is if the player will notice dynamic artifacts with the game responsiveness, since the frame rate and synced refresh rate vary continuously and significantly? It might lead to some weird effect where the player feels disoriented perhaps? Maybe even inducing nausea in some people? I don't know, but this is something to look out for and I will when I get my hands on some demo hardware. NVIDIA are obviously not gonna tell you about this in a press release, lol.

So, in short, while I can see this technology improving the game play experience, there's still no substitute for putting out frames as fast as possible. I'm currently gaming at 120Hz with very few dropped frames which makes a world of difference over 60Hz. (Adding LightBoost strobing to the mix with its blur elimination then takes this experience to another level altogether). That doesn't change with G-Sync. It would be really interesting to get my hands on demo hardware and see G-Sync for myself.



I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor with the GPU, i.e. synced the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute forcing the problem with a large excess number of frames per second rendered by the GPU can really help, even though those frames are variable.

Couple of interesting posts mate, much more so than the rest of the fan boy crap and uninformed arguing from others.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
EDIT: And I just can't fathom the depths to which this place has plunged. All this rage.. For something they have not seen irl. And if this does what it says it does, you do have to see it irl before you can pass judgement.

+1 there. Why do things like this induce such foaming at the mouth? It's fucking ridiculous. If someone doesn't want it, then just don't buy it. No one's forcing them.

Couple of interesting posts mate, much more so than the rest of the fan boy crap and uninformed arguing from others.

+1 again.

Another technical thing I've just thought of about G-Sync.

Regardless of how moving pictures are being displayed, they are still sampled, just like audio. This means that the Nyquist limit or Nyquist frequency applies.

Hence, for fast moving objects eg during frenetic FPS gaming, you want that limit to be as high as possible, since an object moving fast enough will not just be rendered with only a few frames, but will display sampling artefacts similar to the "reverse spokes" effect in cowboy movies of old. In gaming, you may not even see the object, or it may appear in completely the wrong place and of course, be heavily lagged. If the GPU drops to its minimum of 30fps, then you can bet you'll see this effect and in a twitchy FPS, that can easily mean the difference between fragging or being fragged.

So again, while G-Sync looks like a great innovation to me, there remains no substitute for a high framerate as well.
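The "reverse spokes" aliasing above can be shown with a few lines of arithmetic: once a wheel's rotation rate exceeds half the sample rate (the Nyquist limit), the per-frame angular step wraps past 180° and the wheel appears to step backwards. All numbers here are illustrative assumptions for the sake of the example.

```python
# Temporal aliasing sketch: a wheel spinning just above half the
# sample rate appears to rotate backwards between frames.

def apparent_step_deg(rotation_hz, sample_fps):
    """Per-frame change in wheel angle, wrapped to (-180, 180]."""
    step = (rotation_hz / sample_fps) * 360.0
    wrapped = step % 360.0
    return wrapped - 360.0 if wrapped > 180.0 else wrapped

# At 30 fps the Nyquist limit is 15 Hz: a 16 Hz wheel aliases to a
# negative (backwards) step, while a 10 Hz wheel reads correctly.
print(apparent_step_deg(16, 30))  # negative: appears to reverse
print(apparent_step_deg(10, 30))  # positive: appears correct
print(apparent_step_deg(30, 30))  # zero: appears frozen
```

The same wrap-around is why a fast-moving object at a low, G-Synced framerate can still land in "the wrong place" perceptually, as described above.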
 
Joined
Jul 10, 2009
Messages
467 (0.08/day)
Location
TR
Processor C2duo e6750@ 2800mhz
Motherboard GA-P43T-ES3G
Cooling Xigmatek S1283
Memory 2x2Gb Kingstone HyperX DDR3 1600 (KHX1600C9D3K2/4GX )
Video Card(s) HIS HD 6870
Storage samsung HD103SJ
Case Xigmatek Utgard
Audio Device(s) X-FI Titanium Pci-e
Power Supply Xigmatek NRP-PC702 700W
Software win7 ultimate -64bit
Which innovations? A company acting like a headless chicken because it's losing its main market (GPUs)? NVIDIA is having a hard time between AMD APUs, rising tablets and smartphones, dropping PC sales, etc., and is trying to find new markets. Nothing more, nothing less.

Btw, who are you to decide about Mantle and TrueAudio on behalf of everyone on earth? I don't remember making you spokesperson :slap: . What you don't get is that I want Microsoft-free gaming = no DirectX, FYI.

Innovations like G-SYNC and SHIELD streaming show exactly why NVIDIA titles themselves "World Leader in Visual Computing Technologies". Because they are.
 