
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
Jul 19, 2006
Messages
43,604 (6.51/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Are you looking to get the 780 Ti now, if you are interested in G-SYNC?

Maybe. I also want to try out LightBoost. It all depends on price/performance, though; I've been very happy with my 7970, but I'm not partial to any particular brand. Thing is, it wouldn't make sense for me to buy a new monitor.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.50/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Maybe. I also want to try out LightBoost. It all depends on price/performance, though; I've been very happy with my 7970, but I'm not partial to any particular brand. Thing is, it wouldn't make sense for me to buy a new monitor.

Yeah, it wouldn't. I want to see Asus release a 27" 1440p IPS with G-SYNC; then I would most definitely be interested.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.

I'll second that. This is how I splashed out £400 on 3D Vision (glasses and monitor) way back in 2009, running off my GTX 285 at the time.

I remember "just happening" to go round to Novatech and checking it out. I barely even played the game and within 5 minutes they had my money. :laugh: Even my friend who knows nothing about computers and doesn't do gaming was impressed with it.

I don't doubt that G-Sync will deliver a similar kind of awesome, more in the way it feels when you control the action with the keyboard and mouse than anything else.

EDIT: you'll love LightBoost, and with the ToastyX utility you don't even need an NVIDIA card, either.
 

Am*

Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
I don't think you properly understand what G-Sync does on a technical level, or what causes tearing and stutters in the first place. Remember, the only way* to remove stutters is to have the GPU and monitor synced; it doesn't matter which way round that sync goes. NVIDIA have simply synced the monitor to the GPU, i.e. the opposite way round to how it's done now.

Check out my explanation above and in my previous post a few posts back. Getting rid of stutters and tearing is a big deal, possibly even more so than the slight lag reduction. This feature may be proprietary for now, but it's no gimmick.

*Not strictly true, sort of. If I run an old game which is rendering at several hundred fps and leave vsync off, then it looks perfectly smooth, with very little judder or tearing - and noticeably less lag. It seems that brute-forcing the problem with a large excess of frames per second rendered by the GPU can really help, even though those frame times are variable.

Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it. As it so happens now, we can only speculate on how useful it will be and that's exactly what I'm doing.

I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag (none whatsoever).

Your last statement contradicts your previous, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work. You cannot have one without the other; frame rate synced with refresh rate = uneven GPU load, so you still get stuttering & tearing, vs the normal way, which is render as many frames as you can = maximum possible GPU load, which lessens the effect of tearing/stuttering but which only then becomes directly linked to the monitor tearing if the refresh rate is too low. No add-in monitor circuit board will un-link the effects of this.

And the above does not even take into account that a lot, and I do mean A LOT (vast majority) of old competitive games, like Quake III and Unreal, increase movement speed and/or sensitivity the higher the frame rate gets, due to the way the old game engines work, so Mr Carmack, of all people, should know better than to endorse this worthless crap. Any of the old school competitive gamers will give you exactly the same reasons for why it won't work.
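For reference, a frame-rate cap like the one described above does nothing more exotic than this rough sketch (plain Python standing in for an engine loop, not COD4's actual code; the stand-in render call and numbers are made up):

```python
# Minimal sketch of a "maxfps"-style cap: render a frame, then idle out the
# remainder of the target frame window. Illustrative only, not engine code.
import time

def run_capped(render_one_frame, max_fps=60, frames=10):
    """Render `frames` frames, sleeping so the loop never exceeds max_fps."""
    target = 1.0 / max_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()                               # stand-in for the real render call
        leftover = target - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                         # GPU/CPU sit idle for the rest of the window

run_capped(lambda: None)                                 # dummy "render" just to show the shape
```

The point of contention is what happens to the display while the loop sleeps: with a fixed refresh, the monitor keeps ticking on its own schedule, cap or no cap.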
 
Last edited:
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag.

Your last statement contradicts your previous, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work.

I think your example is unrealistic. Nvidia never said that your solution of a locked maximum frame rate wouldn't achieve a similar goal. Instead, they said it was a bad solution because it required developers to program games with very little detail so that they never drop below a monitor's refresh rate. This isn't practical, since any game has some scenes more complicated than others, and it makes no sense to run the GPU at 1/4 load 99.9% of the time just so that it never drops below the monitor's refresh rate the remaining 0.1% of the time.

You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when a GPU renders a frame and when the next monitor refresh comes and that frame is displayed. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-Sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur in the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big of an issue as a frame being displayed at the wrong time.

I hope you understand that this is exactly what you described with running Quake III at an insanely high frame rate: doing so reduces "lag", i.e. the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync achieves the same lag reduction without the need to waste GPU power.
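To make those definitions concrete, here's a rough back-of-envelope sketch (invented frame-completion times; it ignores the fact that vsync also stalls the GPU and is only meant to illustrate the two terms, not how G-Sync is implemented):

```python
# "Lag" here = wait from when the GPU finishes a frame until a refresh shows it;
# "stutter" = how much that wait varies frame to frame. Times are invented.
import statistics

finish_times_ms = [10, 27, 45, 70, 88, 110]        # hypothetical GPU frame-completion times

def lag_fixed_refresh(finish_ms, refresh_ms=1000 / 60):
    """On a fixed 60 Hz monitor, each finished frame waits for the next scheduled refresh."""
    return [((t // refresh_ms) + 1) * refresh_ms - t for t in finish_ms]

def lag_refresh_on_ready(finish_ms):
    """If the display refreshes whenever a frame is ready, the wait is (ideally) zero."""
    return [0.0 for _ in finish_ms]

for name, lags in (("fixed 60 Hz     ", lag_fixed_refresh(finish_times_ms)),
                   ("refresh-on-ready", lag_refresh_on_ready(finish_times_ms))):
    print(name, [round(l, 1) for l in lags], "ms, variance =",
          round(statistics.pvariance(lags), 1))
```

The same arithmetic shows why the Quake III trick works: at several hundred fps on a fixed refresh, the wait to the next tick shrinks toward zero anyway, just at the cost of throwing most frames away.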
 
Last edited:

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Well no offense, but neither do you, nor anyone else here that hasn't actually demoed it.

For some things, you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.


As it so happens now, we can only speculate on how useful it will be and that's exactly what I'm doing.

I read your comment in full and it still does not explain sufficiently how this monitor is going to fix stutter/tearing -- be it on a technical or a practical level. Variable frame rates happen because it is impossible to have even GPU load during a scene/map in a game, especially multiplayer -- if the GPU load is uneven, stuttering and lag are going to happen, therefore it does not fix it on a practical level if you're capping the GPU to render just "enough" frames and change refresh rates accordingly. Go and run COD4 at a maxfps rate of say 60. Run it with a refresh rate of 120Hz with this limit and then at 60Hz and tell me what "perceivable difference" you get from effectively doing the exact same function as what this monitor offers, minus the gigantic price tag (none whatsoever).

Your last statement contradicts your previous, from what I read. I'm giving you this as an example because I spent far too much time tweaking crap like this to run optimally on my older desktops to know that it doesn't work. You cannot have one without the other; frame rate synced with refresh rate = uneven GPU load, so you still get stuttering & tearing, vs the normal way, which is render as many frames as you can = maximum possible GPU load, which lessens the effect of tearing/stuttering but which only then becomes directly linked to the monitor tearing if the refresh rate is too low. No add-in monitor circuit board will un-link the effects of this.

And the above does not even take into account that a lot, and I do mean A LOT (vast majority) of old competitive games, like Quake III and Unreal, increase movement speed and/or sensitivity the higher the frame rate gets, due to the way the old game engines work, so Mr Carmack, of all people, should know better than to endorse this worthless crap. Any of the old school competitive gamers will give you exactly the same reasons for why it won't work.

I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.
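To put a toy example on that (made-up positions, just to show the cadence):

```python
# Why 60 fps content on a 120 Hz panel reads as movement "every other frame":
# each rendered frame simply gets scanned out on two consecutive refreshes.
refresh_hz, content_fps = 120, 60
repeats = refresh_hz // content_fps               # = 2 scanouts per rendered frame

positions = [i * 10 for i in range(5)]            # hypothetical per-frame object positions (px)
scanouts = []
for pos in positions:
    scanouts.extend([pos] * repeats)              # same image shown twice in a row

print(scanouts)                                   # [0, 0, 10, 10, 20, 20, 30, 30, 40, 40]
```

Whether that reads as judder or as a smoothly moving doubled image is exactly the perceptual point above.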

I found the nvidia g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
For some things, you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.




I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.

I found the nvidia g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube

I get it, but I don't think it's the game changer some are saying, and imho it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.

What I'm saying is that 2-3 high-end cards like the Titan plus 3 g-spot (like NV didn't want and expect this nickname) enabled monitors puts this tech out of reach of all but the highest-paid or most enthusiastic PC gamer.
Niche, that's it.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I get it, but I don't think it's the game changer some are saying, and imho it's mostly of use at UHD and beyond with three screens, because in that case most GPU setups are being run at the limit of what they can do.

What I'm saying is that 2-3 high-end cards like the Titan plus 3 g-spot (like NV didn't want and expect this nickname) enabled monitors puts this tech out of reach of all but the highest-paid or most enthusiastic PC gamer.
Niche, that's it.

Whether g-spot is a game changer remains to be seen, I quite agree. However, my feeling on it is that it will be. We'll soon know for sure.

The improvement is equally good on any monitor and resolution configuration as far as I can see. However, I'd have to compare them to really disagree with your point. :)

I do find it ironic that when the GPU is rendering faster than the monitor's highest refresh, say 120Hz, g-spot works just like normal vsync would, lol. A modern PC with a decent graphics card or cards will often achieve this, especially when the game is an older one.

Oh and it's expensive? Never! :laugh:
 

Am*

Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
You also define "stuttering" and "lag" differently than NVidia does. NVidia refers to "lag" as the time between when a GPU renders a frame and when the next monitor refresh comes and that frame is displayed. "Stutter", in NVidia's terms, is the variance in "lag". The "stutter" you speak of, where frames are generated unevenly, will not be fixed by G-Sync. However, the truth is that what you call "stutter" does not affect human perception nearly as much as uneven "lag", NVidia's "stutter". Humans anticipate what will occur in the next few frames, and when what is displayed on the screen does not match the anticipated timing, "stutter" is perceived. The lack of a frame being displayed (down to a reasonable minimum, NVidia says 30fps) is not nearly as big of an issue as a frame being displayed at the wrong time.

I hope you understand that this is exactly what you described with running Quake III at an insanely high frame rate: doing so reduces "lag", i.e. the time between when the newest frame is generated and when it is displayed on the monitor, at the cost of discarding a ton of frames and wasting computational power. G-Sync achieves the same lag reduction without the need to waste GPU power.

That's because Nvidia are yet again using ideal world scenarios for practical demos.

My entire point was, G-sync fixes nothing, which is still true -- after watching that presentation, I'm even more sure than before. The side by side comparisons even show one monitor without G-sync is tearing but displaying stuff faster than the one with G-sync, which was clearly skipping/jumping frames and "jittering". Watching a pendulum on a screen (Nvidia's pointless demo) or the pointless slow-turning Borderlands 2 demonstration are of no value whatsoever -- I'd love to see someone try using G-sync for a fast motion FPS shooter like BF3 and see just how much input lag it will add to the already delayed engine that the game uses. That G-sync module is nothing more than a hardware-based v-sync framebuffer with extra memory for the monitor -- maybe worth $30 on its best selling day.

This is not even including the fact that this entire problem of tearing is non-existent on fast 120Hz+ panels, with the exception of a few games that run on old engines that suffer from uneven frame pacing in general, regardless of whether they run on one or two graphics chips (case in point, COD4).

The only imaginable scenario I can think of where this G-sync module would be of any use is purely in multi-monitor setups, where the frames may be fed unevenly to each monitor -- but then the question is, is it a problem worth shelling out $175 per monitor for? Absolutely not, and anybody disagreeing with that is insane ($175 apiece for a 3-monitor setup is $525, not even including the GPUs or any of the monitors).

For some things, you don't need to see the demo to understand the concept, although it helps. I think this is where you're tripping up.

I'm not sure what you're trying to say about rendering 60fps on a 120Hz refreshing screen? Yeah, you'll see constant judder (I tried it with the half vsync in the driver control panel). In fact, because it's happening so fast, it tends to look more like a doubled image moving smoothly than a judder. Depends a bit on the monitor, the game, your eyes, lighting etc. But basically, you see constant judder as the movement only happens every other frame.

I found the nvidia g-spot sync presentation on YouTube. It's explained by the CEO and has a few diagrams which might help you understand it better and realize that it's not a gimmick. It's not quite as detailed as what I explained, but then it is a marketing presentation lol, not a scientific paper that explores it from every angle. For example, he does explain that the monitor samples, but not the Nyquist limit which applies to any form of sampling. It's also an edited version, at 24 minutes.

NVIDIA 4K and G-Sync Demo - YouTube

See my above answer. I saw that demo in full and it proves my point further that it will not improve anything -- certainly not for its price, or on any decent TN gaming monitor available these days. His entire argument is excessive/delayed frames > uneven draws by the monitor, so his G-sync module merely gives the monitor an extra-large frame buffer that is fed once each frame is ready from the GPU, as well as some proprietary draw calls from G-sync back to the GPU to stop it rendering more frames -- at best a gimmicky solution worth maybe $30, and again, nothing revolutionary or worth writing home about.

Now let's look at the real-world case of this half-arsed solution -- a capped frame rate means light GPU load (on Kepler GPUs, which rely almost solely on load to run at advertised clocks), which means the GPU runs at a less-than-optimal power state and downclocks, which means that when more complex scenes are being rendered it struggles with the load and has to clock back up, causing a delay and therefore stutter (rinse and repeat). This is going to need a lot of driver-side support on a title-by-title basis in order to work properly, and I seriously doubt they are going to dedicate many -- if any -- man-hours to making this work. That makes it nothing more than a gimmick in my book, and an insanely overpriced one at that, and I am yet to be proven wrong on this, unfortunately -- in practice or in theory.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card, that's all. The monitor continues to show the same picture until it gets another lot of display data. That's how LCD monitors work already. The only difference is that the fresh display data currently comes in at regular intervals, whereas with G-Sync on it comes in at irregular intervals.
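To put that behaviour in rough pseudo-code (my own sketch; the ~30 fps self-refresh floor is taken from the figure NVIDIA quotes, and the exact re-scan rule is an assumption, not something from a G-Sync spec):

```python
# Monitor-side rule being described: scan out whenever a new frame arrives,
# otherwise keep showing the held image, re-scanning it at the assumed floor rate.
MAX_HOLD_MS = 1000 / 30                        # assumed ~30 fps minimum-refresh floor
frame_arrivals_ms = [8, 21, 30, 95, 108]       # hypothetical times the GPU delivers frames

def scanout_times(arrivals, max_hold=MAX_HOLD_MS):
    refreshes, last = [], 0.0
    for t in arrivals:
        while t - last > max_hold:             # GPU gone quiet: re-show the held frame
            last += max_hold
            refreshes.append((round(last, 1), "re-show held frame"))
        refreshes.append((t, "new frame"))     # fresh data: refresh immediately
        last = t
    return refreshes

for when, what in scanout_times(frame_arrivals_ms):
    print(f"{when:6.1f} ms  {what}")
```

Note the refreshes land at irregular times; that is the whole point.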

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.67/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card, that's all. The monitor continues to show the same picture until it gets another lot of display data. That's how LCD monitors work already. The only difference is that the fresh display data currently comes in at regular intervals, whereas with G-Sync on it comes in at irregular intervals.

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.
The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync?

He might be referring to G-Sync on-board memory

The pictures show that the FPGA is paired with a trio of 2Gb DDR3 DRAMs, giving it 768MB of memory for image processing and buffering.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
The plus side is that if the proprietary features work well and are desired, an open alternative will likely come into being and nVidia will likely support that as well.

That's why I have no issues with companies releasing proprietary ideas like this. They take on the financial gamble themselves. If it fails, it's purely their loss, if it is successful, we'll get other options in the market.

Good point. In the end it's always swings and roundabouts, lol.

He might be referring to G-Sync on-board memory

Duh, I missed that. :) I'd love to see a white paper on G-Sync explaining all the technical details of it.
 

Am*

Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
Am* you really don't seem to get the technical details on this. I dunno why you think the monitor needs a frame buffer to work with G-Sync? It just syncs the monitor to the graphics card, that's all. The monitor continues to show the same picture until it gets another lot of display data. That's how LCD monitors work already. The only difference is that the fresh display data currently comes in at regular intervals, whereas with G-Sync on it comes in at irregular intervals.

Also, you should realize that you can't go by how smooth the demo looks on the YouTube video. It's really quite obvious why if you think about it.

I still think you don't get it, but in the end it doesn't matter, because the reviews will tell us just how well it works and when it goes on sale you'll be able to see for yourself.

The one thing I do agree with however, is the price markup. I'm sure it'll be the usual price-gouging markup we see from NVIDIA for a proprietary feature. :shadedshu Just look at the price premium on LightBoost monitors, for example. It's a shame that there isn't something like a JEDEC-style standards body for displays that would allow this technology to be introduced by all players at reasonable prices.

In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:

http://images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg

From my understanding, the G-Sync module waits for the frame to be completed by the GPU, puts it in its own buffer until the monitor has fully completed drawing the last frame, and then passes it on to the monitor for another full draw, in order to prevent tearing, and so on.

By all means, explain to me which "technical details" you understand about this that I don't, or how you think it works. Because I'm pretty certain I'm understanding it perfectly.

He might be referring to G-Sync on-board memory

Bingo.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
In that case, feel free to correct me. This is the slide straight from that presentation explaining the problem of tearing:

http://images.eurogamer.net/2013/articles//a/1/6/2/5/7/2/2/Nvidia5.jpg.jpg

From my understanding, the G-Sync module waits for the frame to be completed by the GPU, puts it in its own buffer until the monitor has fully completed drawing the last frame, and then passes it on to the monitor for another full draw, in order to prevent tearing, and so on.

By all means, explain to me which "technical details" you understand about this that I don't, or how you think it works. Because I'm pretty certain I'm understanding it perfectly.

Ok, so I missed the bit about the memory buffer on the G-Sync module, but that doesn't actually change the principles of what I'm saying. As the NVIDIA CEO said himself, the system is simple in principle, but complex to implement properly in practice. This is similar to the situation with how jet engines work, for example. Not too complex in principle, but fiendishly complex and difficult to make one that works properly.

All that diagram shows is how things currently work with vsync off and a standard monitor. Of course you see tearing. Why didn't you show the one with the irregular GPU outputs that the monitor syncs to with G-Sync on? That would have been much more relevant.

Another way to think about G-Sync is Adaptive vsync without the tearing, although there are important subtleties there, such as the reduction of latency.

One thing to realize is that if the GPU is putting out frames faster than the fastest refresh of the monitor (say, 144Hz), then the system goes back to a standard vsync-on arrangement, i.e. as if G-Sync wasn't there, and the GPU reverts to being synced with the monitor. This will typically be the case when playing old games on modern hardware.

However, with modern demanding games and high resolutions we know that a solid 144Hz cannot be maintained. That's where G-Sync syncs the monitor to the GPU, giving the advantages I explained previously. If you want me to repeat it all here, you're out of luck. We're going round in circles already.
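In other words, the effective refresh interval just gets clamped at the panel's ceiling; a short sketch (illustrative numbers only):

```python
# Below the panel's maximum rate the display follows the GPU; at or above it,
# you are effectively back to ordinary vsync at the panel's ceiling.
PANEL_MAX_HZ = 144

def effective_interval_ms(gpu_frame_time_ms, panel_max_hz=PANEL_MAX_HZ):
    return max(gpu_frame_time_ms, 1000 / panel_max_hz)   # can't refresh faster than the panel

for ft in (4.0, 6.9, 10.0, 25.0):                        # hypothetical GPU frame times (ms)
    print(f"frame time {ft:5.1f} ms -> display interval {effective_interval_ms(ft):5.2f} ms")
```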
 
Joined
Sep 29, 2013
Messages
97 (0.02/day)
Processor Intel i7 4960x Ivy-Bridge E @ 4.6 Ghz @ 1.42V
Motherboard x79 AsRock Extreme 11.0
Cooling EK Supremacy Copper Waterblock
Memory 65.5 GBs Corsair Platinum Kit @ 666.7Mhz
Video Card(s) PCIe 3.0 x16 -- Asus GTX Titan Maxwell
Storage Samsung 840 500GBs + OCZ Vertex 4 500GBs 2x 1TB Samsung 850
Audio Device(s) Soundblaster ZXR
Power Supply Corsair 1000W
Mouse Razer Naga
Keyboard Corsair K95
Software Zbrush, 3Dmax, Maya, Softimage, Vue, Sony Vegas Pro, Acid, Soundforge, Adobe Aftereffects, Photoshop
Re:

So, simple and short: G-Sync is going to regulate the refresh rate on the monitor so there aren't any dropped frames. The monitor is told by G-Sync when a frame is coming in, because it is communicating with the GPU. A frame gets sent by the GPU to the monitor, and the frame gets properly placed into a 16.667 ms scan window... Still reminds me of the Dynamic Frame Control in the RadeonPro Beta for AMD users. It's just something that's being done on a hardware level versus a software level.

So the issue is still the monitor's static refresh rate. Seems like it would be better if display manufacturers tried to increase the refresh rate above 144 Hz and make it dynamic instead of static. This would probably be more ideal for AMD graphics cards in CrossfireX than NVidia's SLI; CrossfireX has a bad habit of shooting frames out like a gatling gun with more than two GPUs. Maybe add in a secondary frame buffer to the monitor in case the frame rate drops under 30 fps (frame times of 33.33 ms and above). That way the previous frame could be stored until a new frame arrives, or have a component in the monitor that skips a scan with the previous scan still displayed...

I still think the EIZO FORIS FG2421 240 Hz gaming monitor is more innovative than NVidia G-Sync. 120 Hz scans with a black-out period after the frame is displayed in each 8.33 ms window seems a lot more... creative. I may invest in this monitor or three.

For the NVidia users, I hope G-Sync does the job with little to no latency.
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
http://www.google.com/patents/US20080055318 Seems ATI already has the patent on this technology? I'm no expert but sounds like the same principle described. The other filings are specifically listed as under ATI. I assume AMD has these now?

It's not exactly the same thing from what I can tell. That technology seems to adjust the display's refresh rate to match the source material, but not on a frame-by-frame basis. It seems to be more applicable to syncing video (a fixed frame rate) to the display, and to lowering power consumption by setting the monitor to a lower refresh rate, lowering GPU load.
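The distinction, roughly: that kind of scheme picks one refresh rate to suit a fixed-rate source, while G-Sync retimes every individual frame. A toy sketch of the former (the supported-rate list is just an example, not anything from the patent):

```python
# Pick a single refresh rate that is an integer multiple of a fixed source rate,
# e.g. for video playback or power saving -- a one-time choice, not per-frame timing.
SUPPORTED_HZ = [24, 30, 48, 50, 60, 72, 120]

def pick_refresh_for_source(source_fps):
    for hz in sorted(SUPPORTED_HZ):
        if hz % source_fps == 0:
            return hz                 # lowest supported multiple of the source rate
    return max(SUPPORTED_HZ)          # fall back to the panel's fastest mode

print(pick_refresh_for_source(24))    # 24 (film content)
print(pick_refresh_for_source(30))    # 30 (30 fps video)
```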
 
Joined
Sep 25, 2010
Messages
536 (0.10/day)
Location
Bahrain
System Name Lazyman 9000+
Processor AMD Ryzen 7 3800X
Motherboard X570 Aorus Ultra
Cooling Arctic LF II 280
Memory 2x8GB Crucial Ballistix 3600 MHz DDR4
Video Card(s) PowerColor Red Devil 5700XT
Storage Never Enough
Display(s) ASUS VG248QE+ Dell E2420H
Case LIAN LI Lancool II Mesh RGB
Audio Device(s) Sennheiser HD518 / Logitech Z623
Power Supply FSP AURUM PT 850W
Mouse Logitech G600
Keyboard Logitech G810 Orion Spectrum
Software Win 11
Benchmark Scores All over the bench
Joined
Feb 8, 2012
Messages
3,014 (0.65/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Joined
Sep 15, 2011
Messages
6,716 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....
 
Joined
Feb 8, 2012
Messages
3,014 (0.65/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Neh, you can't see schmit on those low res videos. Smells more and more like nvidian propaganda....

Low res is preventing you from seeing tearing and smoothness differences? Really? :confused: The tech is proven, tested, reviewed and available - not much room for propaganda.
 