
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
May 13, 2010
Messages
6,065 (1.14/day)
System Name RemixedBeast-NX
Processor Intel Xeon E5-2690 @ 2.9Ghz (8C/16T)
Motherboard Dell Inc. 08HPGT (CPU 1)
Cooling Dell Standard
Memory 24GB ECC
Video Card(s) Gigabyte Nvidia RTX2060 6GB
Storage 2TB Samsung 860 EVO SSD//2TB WD Black HDD
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + Dell E2013H 20 in @1600x900
Case Dell Precision T3600 Chassis
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 630w Dell T3600 PSU
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Linux Mint 20
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite Router/Sw:Meraki MX64 MS220-8P
It's all frothing rabid fandogs who see the "enemy" release something innovative their faction doesn't have, and they attack and hate it even though they haven't seen it, used it, or understood it.
 
Joined
Aug 17, 2009
Messages
1,585 (0.28/day)
Location
Los Angeles/Orange County CA
System Name Vulcan
Processor i6 6600K
Motherboard GIGABYTE Z170X UD3
Cooling Thermaltake Frio Silent 14
Memory 16GB Corsair Vengeance LPX 16GB (2 x 8GB)
Video Card(s) ASUS Strix GTX 970
Storage Mushkin Enhanced Reactor 1TB SSD
Display(s) QNIX 27 Inch 1440p
Case Fractal Design Define S
Audio Device(s) On Board
Power Supply Cooler Master V750
Software Win 10 64-bit
This may be a bit better than AMD droning on about sound, but not much.

I remain disappointed by the lack of innovation relevant to the consumer's current sound system and monitor.

In the end, all we really have received from these two esteemed institutions is re-branded cards.

...
 
Joined
Feb 3, 2005
Messages
499 (0.07/day)
Neat idea, but I'd rather spend that money on a video card that will do 60fps at my desired resolution.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.67/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Yeah, 'kay. You've obviously got everything figured out, and the only real problem in your life seems to be that you need a better-paying job. Righahahahahat... :roll:


Again, open or bust.

It might have taken a lot of years, but Microsoft is finally on the way out as the de facto PC gaming platform... and why? Yeah, I'll let someone else figure this one out.


But I digress, we shall see. I'm far away from getting a gaming monitor anytime soon either way.
Almost everything that's open started as a response to a proprietary tech. For example, OpenCL wouldn't be nearly as far along as it is if it weren't for CUDA.

Innovation is innovation, and should be respected as such. This little piece of tech, if it performs well, will spawn further ideas and refinements, and maybe even an open standard.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,925 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) 49" Philips Evnia OLED (49M2C8900)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Gunnr
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Moondrop Luna lights
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
So, as an Nvidia supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color reproduction like? Why not get a 120Hz or 240Hz monitor and not give a f@#$?
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.67/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
So, as an Nvidia supporter, this is the most ridiculous thing ever. Why pay the $$? What's the failure rate? Does it work on the good panels? What's the color reproduction like? Why not get a 120Hz or 240Hz monitor and not give a f@#$?

This is to eliminate stutter AND tearing. 120Hz or 240Hz does you no good if you can't run those framerates in your games. Run below that, and you get stuttering and tearing.
 
Joined
Nov 4, 2005
Messages
11,980 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
My TV already does frame scaling and interpolation to reduce tearing. Even at 13 FPS, when I have overloaded my GPU, I barely see it.


This doesn't fix stutter or GPU stalls; it covers them up, like bad lag, by redrawing old frames.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
This doesn't fix stutter or GPU stalls; it covers them up, like bad lag, by redrawing old frames.

It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.
 
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.

As I said in my previous post, such things already exist in consumer electronics. LG introduced the Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010. Both are capable of adjusting the basic timing to the nearest NTSC, PAL or SECAM standard regardless of the source. They also have a Motion Estimation, Motion Compensation (MEMC) mechanism, "similar" to nVidia's LightBoost. Other features like Motion Interpolation (MI) can double or quadruple any source refresh rate to match the Frame Patterned Retarder (FPR) 3D effect.

Fundamentally speaking, there is a major difference between desktop and consumer electronics. There's no GPU in consumer electronics, so the source stays locked to NTSC 29.97 fps or PAL 25 fps. There are only two factors to keep up with, rather than the "virtually infinite" range on the desktop. Most HDTVs have to deal with various inputs such as A/V, Composite, S-Video, SCART, RGB D-Sub and HDMI, each with a different clock. AFAIK, desktop only has three clocks, 30Hz, 60Hz and the recently introduced 4K 33Hz, spread across DVI, HDMI and DisplayPort.

I stated before that I have an LG panel that does the 240Hz FPR 3D effect. Yet I admit there is severe judder if I crank motion interpolation to the max. That's only natural for an HDTV, which can't handle the massive task of rendering a full white or pitch black frame in RGB mode with MEMC and MI switched on. In addition, my panel doesn't carry EDID, DDC or desktop-style timings and clocks, so it only does 120Hz or 240Hz interpolation mode.

Bottom line, it's good to see nVidia is aware of these matters. G-Sync could be a game changer, bringing advanced MEMC and MI to the desktop. But then again, don't expect LG and Samsung not to bring their own tech to the desktop at a reasonable price.
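To illustrate the basic MI idea, here's a toy sketch in Python (my own simplified illustration with made-up frame data, not LG's or Samsung's actual MEMC algorithm): an intermediate frame is blended between two real frames so a 24 fps source can be shown at 48 fps.

# Toy illustration only: TV-style motion interpolation (MI) synthesizes an
# intermediate frame between two real frames, doubling the effective refresh
# rate of the source. Real MEMC also estimates motion vectors; this just blends.
def interpolate(frame_a, frame_b, weight=0.5):
    """Blend two frames pixel by pixel (a crude stand-in for MEMC)."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

# Two dummy "frames" of three pixels each, standing in for a 24 fps source.
frames_24 = [[0, 0, 0], [10, 20, 30]]
# 48 fps output: real frame, synthesized frame, real frame.
frames_48 = [frames_24[0], interpolate(frames_24[0], frames_24[1]), frames_24[1]]
print(frames_48)  # [[0, 0, 0], [5.0, 10.0, 15.0], [10, 20, 30]]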
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
As I said in my previous post, such things already exist in consumer electronics. LG introduced the Dynamic Motion Clarity Index and Samsung had Clear Motion Rate in late 2010. Both are capable of adjusting the basic timing to the nearest NTSC, PAL or SECAM standard regardless of the source. They also have a Motion Estimation, Motion Compensation (MEMC) mechanism, "similar" to nVidia's LightBoost. Other features like Motion Interpolation (MI) can double or quadruple any source refresh rate to match the Frame Patterned Retarder (FPR) 3D effect.

Fundamentally speaking, there is a major difference between desktop and consumer electronics. There's no GPU in consumer electronics, so the source stays locked to NTSC 29.97 fps or PAL 25 fps. There are only two factors to keep up with, rather than the "virtually infinite" range on the desktop. Most HDTVs have to deal with various inputs such as A/V, Composite, S-Video, SCART, RGB D-Sub and HDMI, each with a different clock. AFAIK, desktop only has three clocks, 30Hz, 60Hz and the recently introduced 4K 33Hz, spread across DVI, HDMI and DisplayPort.

I stated before that I have an LG panel that does the 240Hz FPR 3D effect. Yet I admit there is severe judder if I crank motion interpolation to the max. That's only natural for an HDTV, which can't handle the massive task of rendering a full white or pitch black frame in RGB mode with MEMC and MI switched on. In addition, my panel doesn't carry EDID, DDC or desktop-style timings and clocks, so it only does 120Hz or 240Hz interpolation mode.

Bottom line, it's good to see nVidia is aware of these matters. G-Sync could be a game changer, bringing advanced MEMC and MI to the desktop. But then again, don't expect LG and Samsung not to bring their own tech to the desktop at a reasonable price.

I'm not sure why you think this already exists? This has not been done before in any commercial product.

G-Sync synchronizes the monitor dynamically to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.

It's very different with video cards, because here you have interaction between what the user does and the output on the screen. In this case, things like lag, refresh rate, stutter and screen tearing are very important, and G-Sync addresses all of them simply by syncing the monitor to the varying framerate of the graphics card, rather than the other way around (syncing the graphics card to the fixed refresh rate of the monitor), as happens now.
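If it helps, here's a rough sketch in Python of the difference (my own simplified model with made-up frame times, not NVIDIA's implementation): with a fixed 60Hz refresh and vsync, a frame that misses a refresh is held until the next one, so its on-screen time jumps in 16.7 ms steps; with the refresh driven by the GPU, each frame is shown for exactly as long as it took to render.

REFRESH = 1000 / 60  # fixed refresh interval in ms (60Hz)

def shown_durations_fixed_vsync(frame_times_ms):
    """How long each frame stays on screen with a fixed 60Hz refresh + vsync:
    a frame is held until the first refresh boundary after it finishes."""
    durations = []
    for t in frame_times_ms:
        refreshes_waited = max(1, -(-t // REFRESH))  # ceiling division
        durations.append(round(refreshes_waited * REFRESH, 1))
    return durations

def shown_durations_gpu_driven(frame_times_ms):
    """With the monitor refreshed by the GPU, display time equals render time."""
    return frame_times_ms

render_times = [14, 18, 25, 15]  # made-up per-frame render times in ms
print(shown_durations_fixed_vsync(render_times))  # [16.7, 33.3, 33.3, 16.7] -> uneven pacing
print(shown_durations_gpu_driven(render_times))   # [14, 18, 25, 15] -> matches the GPU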
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I'm not sure why you think this already exists? This has not been done before in any commercial product.

I think he's confused between the refresh rate and the TMDS clock rate, which will vary depending on resolution.
 
Joined
Nov 4, 2005
Messages
11,980 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
It doesn't cover things up, it fixes stutter and lag. You misunderstand how it works.

I made three previous posts in this thread explaining the technical aspects of how G-Sync works which should clear that up for you if you'd like to check them out.

If you got hitching without it, then you'll get this with it too. It's not made for fixing that, which is caused by lots of things, quite often by other background software running on the PC such as a virus scanner kicking in at the wrong moment, Windows paging etc.

So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here.


So let's do some math.

60 FPS = 0.01667 seconds per frame (rounded).

If a gamer notices the lack of updated frame(s) being displayed because the "frame buffer" doesn't get updated with the next frame, that's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here is an increase in the number of ms it takes to render one or more of the frames to the output frame buffer: http://www.mvps.org/directx/articles/fps_versus_frame_time.htm

The G-Spot ruiner, or whatever they want to call it, can only work in one of two ways to do what they are claiming:

A) Hold frames in a secondary buffer and display them at a set frequency that matches Vsync or whatever display rate it chooses. They can do things like most TVs do, interpolating two frames with motion-adaptive/vector blending to create a new frame.

B) Or they can keep re-rendering the frame and keep the pixels driven for more ms with the same data, which by definition is lag... so... yeah.


There is this constant that we experience, called time, and unless they have started using quantum technology to keep us frozen in time while we wait for the GPU to render more fresh frames, all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

http://en.wikipedia.org/wiki/Multiple_buffering

Triple buffering with Vsync... been there, done that; it was crap and caused lag.
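For reference, a simplified model in Python (my own toy numbers, not actual DirectX or OpenGL swap-chain code) of why queued frames under triple buffering with Vsync add lag: every completed frame sitting in a back buffer waits roughly one extra refresh interval before it reaches the screen.

from collections import deque

REFRESH_MS = 1000 / 60  # 60Hz refresh interval in ms

class SwapChain:
    """Very rough model of a vsynced swap chain; not a real graphics API."""
    def __init__(self, buffers=3):
        self.queue = deque()          # finished frames waiting to be scanned out
        self.capacity = buffers - 1   # one buffer is always the front buffer

    def render(self, frame_id):
        """GPU finishes a frame; it queues if a back buffer is free."""
        if len(self.queue) < self.capacity:
            self.queue.append(frame_id)
            return True
        return False                  # no free back buffer: the GPU stalls

    def scanout(self):
        """One monitor refresh: show the oldest queued frame, or repeat the last."""
        return self.queue.popleft() if self.queue else None

chain = SwapChain(buffers=3)
for f in range(4):
    chain.render(f)  # frames 0 and 1 queue up; 2 and 3 stall behind them
# Every frame sitting in the queue is roughly one refresh interval of extra lag.
print(round(len(chain.queue) * REFRESH_MS, 1), "ms of latency queued up")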
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.67/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here.


So let's do some math.

60 FPS = 0.01667 seconds per frame (rounded).

If a gamer notices the lack of updated frame(s) being displayed because the "frame buffer" doesn't get updated with the next frame, that's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here is an increase in the number of ms it takes to render one or more of the frames to the output frame buffer: http://www.mvps.org/directx/articles/fps_versus_frame_time.htm

The G-Spot ruiner, or whatever they want to call it, can only work in one of two ways to do what they are claiming:

A) Hold frames in a secondary buffer and display them at a set frequency that matches Vsync or whatever display rate it chooses. They can do things like most TVs do, interpolating two frames with motion-adaptive/vector blending to create a new frame.

B) Or they can keep re-rendering the frame and keep the pixels driven for more ms with the same data, which by definition is lag... so... yeah.


There is this constant that we experience, called time, and unless they have started using quantum technology to keep us frozen in time while we wait for the GPU to render more fresh frames, all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.
 
Joined
Nov 4, 2005
Messages
11,980 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Or, it can alter the refresh rate of the monitor to match the output it's receiving from the video card, eliminating the need to hold back or duplicate frames, which is exactly what it's claiming to do.

So you would like to pay for lag? Seriously, I must be in the wrong business.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.67/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
So you would like to pay for lag? Seriously, I must be in the wrong business.

Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.
 
Joined
Nov 4, 2005
Messages
11,980 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Supposedly it comes with little to no lag hit. I don't know, but I do know I tend to actually see a product in action before I go about making judgments and accusations on it.

I know about the time constant, and buffering, and other attempts, and they all still have that time issue.


I will also wait to see it in action.
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
The only thing that's been shown is Nvidia's demo and a slow camera rotation in Tomb Raider with Lara Croft by herself.

We don't even know the specs of the systems they were run on.

Nvidia's Tom Petersen also said it is game dependent. Some games will not work with it, and 2D is not a focus.

Until someone tests this in several game scenarios, I'll remain skeptical, much like with the 3D Vision Surround craze.
 
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
I'm not sure why you think this already exists? This has not been done before in any commercial product.

G-Sync synchronizes the monitor dynamically to the variable GPU framerate which changes from instant to instant, or from one frame to another. This feature is irrelevant for TVs, which are just reproducing video, hence just need to run at a fixed rate to match the video source.

It's very different with video cards, because here you have interaction between what the user does and the output on the screen. In this case, things like lag, refresh rate, stutter and screen tearing are very important, and G-Sync addresses all of them simply by syncing the monitor to the varying framerate of the graphics card, rather than the other way around (syncing the graphics card to the fixed refresh rate of the monitor), as happens now.

Please read my explanation above. I am aware of all the desktop considerations, in a manner of speaking.
If you are trying to distinguish between desktop and consumer electronics, let me ask you a question: have you tried 3D TV FPR, 3D TV Passive and nVidia 3D Vision, and can you tell the difference between them?
TV-wise, they dynamically upsample the source to produce a 120Hz or 240Hz display, the opposite of G-Sync dynamically downsampling the display to match the GPU output.
As for this already having a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is supporting this? Only desktop-specific vendors are listed, such as Asus, BenQ, Philips and ViewSonic.
It's only a matter of time before LG and Samsung make a monitor supporting this dynamic downsampling feature; on the other hand, nVidia couldn't make a better dishwasher.

I think he's confused between the refresh rate and the TMDS clock rate, which will vary depending on resolution.

Yeah... I don't even know the difference between 60Hz and 60fps... silly me :)
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
So how does it fix stutter and lag?

A) By redrawing old frames on the screen.
B) Magic and pixie dust.



Don't know about you, but I am going with A here.


So let's do some math.

60 FPS = 0.01667 seconds per frame (rounded).

If a gamer notices the lack of updated frame(s) being displayed because the "frame buffer" doesn't get updated with the next frame, that's "lag". http://www.hdtvtest.co.uk/news/input-lag All TVs and monitors have some native lag; this doesn't remove that, so don't be confused on that point.

The lag or stutter here is an increase in the number of ms it takes to render one or more of the frames to the output frame buffer: http://www.mvps.org/directx/articles/fps_versus_frame_time.htm

The G-Spot ruiner, or whatever they want to call it, can only work in one of two ways to do what they are claiming:

A) Hold frames in a secondary buffer and display them at a set frequency that matches Vsync or whatever display rate it chooses. They can do things like most TVs do, interpolating two frames with motion-adaptive/vector blending to create a new frame.

B) Or they can keep re-rendering the frame and keep the pixels driven for more ms with the same data, which by definition is lag... so... yeah.


There is this constant that we experience, called time, and unless they have started using quantum technology to keep us frozen in time while we wait for the GPU to render more fresh frames, all we are going to see is lag or blur. Your choice, but now apparently you can pay for it.

http://en.wikipedia.org/wiki/Multiple_buffering

Triple buffering with Vsync... been there, done that; it was crap and caused lag.

I'm sorry, but you're wrong on all counts. I've already written an explanation of how this works, so I'm not gonna go round in circles with you trying to explain it again, especially after having seen your further replies since the one to me. Click the link below, where I explained it in three posts (just scroll down to see the other two).

http://www.techpowerup.com/forums/showthread.php?p=3000195#post3000195


Mind you, I like the idea of the pixie dust. ;)


Please read my explanation above. I am aware of all the desktop considerations, in a manner of speaking.
If you are trying to distinguish between desktop and consumer electronics, let me ask you a question: have you tried 3D TV FPR, 3D TV Passive and nVidia 3D Vision, and can you tell the difference between them?
TV-wise, they dynamically upsample the source to produce a 120Hz or 240Hz display, the opposite of G-Sync dynamically downsampling the display to match the GPU output.
As for this already having a long history in consumer electronics, have you noticed that no major panel maker such as LG, Samsung or Sharp is supporting this? Only desktop-specific vendors are listed, such as Asus, BenQ, Philips and ViewSonic.
It's only a matter of time before LG and Samsung make a monitor supporting this dynamic downsampling feature; on the other hand, nVidia couldn't make a better dishwasher.

Yes, not only have I tried 3D Vision, but I have it and it works very well too. I think active shutter glasses were around before NVIDIA brought out their version. How is this relevant?

I think the link I posted above for Steevo will help you understand what I'm saying, as well. Again though, G-Sync is a first by NVIDIA, as it's irrelevant for watching video. It's the interaction caused by gaming that makes all the difference.
 

markybox

New Member
Joined
Sep 6, 2013
Messages
4 (0.00/day)
A couple of things that might be worse, though, are motion blur and the shape distortion* of moving objects, both of which are currently fixed by nvidia's LightBoost strobing backlight feature, which my monitor has. The PR doesn't mention LightBoost anywhere.
Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode? Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.

So G-SYNC includes an improved LightBoost sequel, officially sanctioned by nVIDIA (but not yet announced). Probably with better colors (and sharper motion than LightBoost=10%), and very easily enabled via OSD menus.

You can only choose G-SYNC mode, or strobed mode, though.
But I'm happy it will be even better than LightBoost, officially for 2D mode.
This is NVIDIA's secret weapon. Probably has an unannounced brand name.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Are you aware that G-SYNC has a superior, low-persistence strobe backlight mode? Google for it -- pcper report, Blur Busters article, John Carmack youtube, etc.

So G-SYNC includes an improved LightBoost sequel, officially sanctioned by nVIDIA (but not yet announced). Probably with better colors (and sharper motion than LightBoost=10%), and very easily enabled via OSD menus.

You can only choose G-SYNC mode, or strobed mode, though.
But I'm happy it will be even better than LightBoost.

Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.

It'll be interesting to see exactly how NVIDIA addresses this. Got a link?
 

markybox

New Member
Joined
Sep 6, 2013
Messages
4 (0.00/day)
Yes, it's obvious that at the lower frame rates, a strobed backlight would be highly visible and intensely annoying.
It'll be interesting to see exactly how NVIDIA addresses this. Got a link?
It's an "either-or" proposition, according to John Carmack:
CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!

It is currently a selectable choice:
G-SYNC Mode: Better for variable framerates (eliminates stutters/tearing, more blur)
Strobe Mode: Better for constant max framerates (e.g. 120fps @ 120Hz, eliminates blur)
Also, 85Hz and 144Hz strobing is mentioned on the main page.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
It's an "either-or" proposition, according to John Carmack:
CONFIRMED: nVidia G-SYNC includes a strobe backlight upgrade!

It is currently a selectable choice:
G-SYNC Mode: Better for variable framerates (eliminates stutters/tearing, more blur)
Strobe Mode: Better for constant max framerates (e.g. 120fps @ 120Hz, eliminates blur)
Also, 85Hz and 144Hz strobing is mentioned on the main page.

Cheers matey. I'm busy right now, but I'll read it later and get back to you.

I found these two links in the meantime:

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate

http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA
 
Joined
Jul 19, 2006
Messages
43,604 (6.51/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.50/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
I would love to see Nvidia have these things set up at retailers. The only way I could make a purchasing decision on something like this is to see it running in person.

You looking to get the 780Ti now if you are interested in G-SYNC?
 