
NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

Joined
Aug 6, 2017
Messages
7,412 (2.78/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
You just can't take whatever it is you find written on the internet by someone who calls themselves an expert and use it to argue against the personal experience of just about anyone who has jumped from 60 Hz to 120 Hz or higher. This is ridiculous. But I know you'd rather take a French leave when people point out the obvious flaws in your thinking and the gaps in your experience and knowledge than admit that even an expert can be 100% wrong, and therefore you can be too.

And why is it that people's first reaction when someone calls them out on their statements is to block or ignore? I've seen dozens of people point out my bad thinking, and among hundreds of things I don't agree with I can still find a lot that changed my perspective. One can't be 100% right at all times, but blocking whatever you see that you don't like is cowardly.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,199 (4.66/day)
Location
Kepler-186f
Yeah, people have no fuckin' idea what G-Sync and FreeSync do. Running a monitor at 120 Hz+ without VRR technology doesn't make tearing go away; you need to sync up the frames between the video card and the monitor.


I have never once seen tearing on my G-Sync monitor. I use RivaTuner to cap the frame rate 4 FPS below the maximum refresh, so G-Sync never turns off.
 
Joined
Aug 2, 2011
Messages
1,458 (0.30/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 Pro 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Alienware AW3423DWF, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502X Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between physically moving the mouse and the action taking place on screen.

Head on over to www.rtings.com and read through some of their reviews, and look at the response time section. They are no-nonsense and will give you just the facts on the TVs.
 
Joined
Jan 8, 2017
Messages
9,434 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

Never said input lag doesn't exist, I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms, as in 120 Hz synchronized vs. not. We are talking 8 ms or less; that's simply minuscule.
 
Joined
Jun 25, 2018
Messages
2 (0.00/day)
Never said input lag doesn't exist, I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms, as in 120 Hz synchronized vs. not. We are talking 8 ms or less; that's simply minuscule.
It doesn't matter what you believe. http://www.100fps.com/how_many_frames_can_humans_see.htm Tests with Air Force pilots have shown that they could identify the plane in a picture that was flashed for only 1/220th of a second.
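For scale, that 1/220th of a second works out to roughly 4.5 ms, which is shorter than a single frame at any of the common refresh rates being argued about here. A quick back-of-the-envelope check (plain arithmetic, nothing from the linked page beyond the 1/220 figure):

```python
# Duration of the flashed image from the pilot test, in milliseconds.
flash_ms = 1000 / 220  # ~4.5 ms

# Time a single frame stays on screen at common refresh rates.
frame_ms = {hz: 1000 / hz for hz in (60, 120, 144, 240)}

print(f"flash: {flash_ms:.2f} ms")
for hz, ms in frame_ms.items():
    print(f"{hz} Hz -> {ms:.2f} ms per frame")
```

Only a 240 Hz frame (~4.2 ms) is shorter than the flash the pilots could still identify, which at least suggests single-digit-millisecond events are not automatically below human perception.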
 
Joined
Feb 14, 2012
Messages
2,355 (0.50/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
There is no way that VRR needs a $2,000 FPGA; there is something wrong with this info. There has been plenty of time to spin a custom chip to do what's needed. VRR is not magic, it's a very basic concept: show the current frame until the next one arrives. That does not require loads of tricky transistors.
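The "show the current frame until the next one arrives" idea really is compact enough to sketch in a few lines. This is a toy simulation with made-up timings, not a claim about how the actual module works; the only wrinkle it models is that panels have a minimum refresh rate, so a very late frame forces the panel to repeat the current one:

```python
def vrr_display(frame_arrivals_ms, max_hold_ms=33.3):
    """Simulate 'show the current frame until the next one arrives'.

    frame_arrivals_ms: times (ms) at which the GPU delivers frames.
    max_hold_ms: the panel's minimum refresh rate expressed as a
    maximum hold time (e.g. a 30 Hz floor -> ~33.3 ms); past that,
    the panel must self-refresh the same frame again.
    Returns (frame_index, shown_at_ms) pairs, one per panel refresh.
    """
    refreshes = []
    for i, t in enumerate(frame_arrivals_ms):
        refreshes.append((i, t))  # scan out the new frame on arrival
        nxt = frame_arrivals_ms[i + 1] if i + 1 < len(frame_arrivals_ms) else None
        hold = t
        # If the next frame is late, repeat the current one to stay
        # above the panel's minimum refresh rate.
        while nxt is not None and nxt - hold > max_hold_ms:
            hold += max_hold_ms
            refreshes.append((i, hold))  # self-refresh, same frame
    return refreshes

# Frames at 0 ms, 10 ms, then a 50 ms gap: the 50 ms gap forces one
# repeat of frame 1 before frame 2 arrives.
print(vrr_display([0, 10, 60]))
```

The point being made stands: the control logic is simple. What the toy ignores (and what real scalers spend silicon on) is everything else a monitor mainboard does, which is the other poster's counterpoint below.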
 
Joined
Mar 10, 2014
Messages
1,793 (0.46/day)
Never said input lag doesn't exist, I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms, as in 120 Hz synchronized vs. not. We are talking 8 ms or less; that's simply minuscule.

Well, look at TFT Central's reviews; they have quite a good explanation of input lag. In short, it is tied to panel refresh rate: the higher the refresh rate, the lower the input lag has to be. So having a 60 Hz screen with about 16 ms of input lag is better than having a 120 Hz screen with 15 ms of input lag.
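That comparison is easy to check with arithmetic, because part of any measured lag is just scan-out time, which scales as 1000/Hz. A rough sketch, assuming (my assumption, not the post's) that lag is measured at mid-screen, so half a refresh interval is baked into the number:

```python
def scanout_to_middle_ms(hz):
    # Lag is commonly measured at mid-screen, so half a refresh
    # interval of scan-out is baked into the measured figure.
    return 0.5 * 1000.0 / hz

# The post's example: 16 ms measured at 60 Hz vs 15 ms at 120 Hz.
overhead_60 = 16.0 - scanout_to_middle_ms(60)    # ~7.7 ms of processing
overhead_120 = 15.0 - scanout_to_middle_ms(120)  # ~10.8 ms of processing

print(f"60 Hz: {overhead_60:.1f} ms overhead, 120 Hz: {overhead_120:.1f} ms overhead")
```

Once the unavoidable scan-out component is subtracted, the 120 Hz monitor in the example is spending noticeably more of its budget on signal processing, which is the sense in which 15 ms at 120 Hz is the worse result of the two.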

There is no way that VRR needs a $2,000 FPGA; there is something wrong with this info. There has been plenty of time to spin a custom chip to do what's needed. VRR is not magic, it's a very basic concept: show the current frame until the next one arrives. That does not require loads of tricky transistors.

People seem to forget that it replaces the monitor's whole mainboard, so it's not an automatic +$xxx over a non-G-Sync monitor. And yeah, there's no chance in hell that Nvidia is buying those FPGAs at $2,000 each either.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between physically moving the mouse and the action taking place on screen.

Head on over to www.rtings.com and read through some of their reviews and look at the response time section. They are no nonsense and will give you just the facts on the TVs.
If a TV's slow enough, you may even notice it when pressing the buttons on the remote control lol.
 
Joined
Sep 1, 2009
Messages
1,232 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling ARCTIC Liquid Freezer II 240
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
Nvidia will be forced to pick up the Adaptive-Sync standard and stop using an FPGA. HDMI 2.1 and consoles are now starting to use VRR. Nvidia's response is their big-format gaming displays, but who wants to buy an OLED for movies and an overpriced Nvidia TV for games? Samsung has already begun shipping VRR in their TVs, and Nvidia will feel the squeeze because most people can't buy more than one TV.
 
Joined
Aug 6, 2017
Messages
7,412 (2.78/day)
Never said input lag doesn't exist, I simply find it unbelievable that people can seriously claim to perceive differences on the order of 10-20 ms, as in 120 Hz synchronized vs. not. We are talking 8 ms or less; that's simply minuscule.
Of course they can. I mean, if you play at 60 fps with v-sync and constantly get tons of lag, then adding another 10-20 ms will not be as noticeable. The point we're making here is that when you play at 30 ms and get used to it, adding 20 ms on top of that is gonna feel bad instantly. Another example of how you just don't understand perspectives and always try to find some equivalency between cases that are completely different.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
60 fps v-sync felt great when I was running games on a 60 Hz display, but it has never felt the same since I got used to 90+ fps with G-Sync. That doesn't mean 60 fps is all you'll ever need.
 
Joined
Jun 25, 2018
Messages
3 (0.00/day)
Location
Boise, Idaho
System Name Office desktop setup
Processor 3950x
Motherboard gigabyte x570 itx
Cooling custom watercooled
Memory 32gb 3600mhz cl 16
Video Card(s) Rtx 2080 ti
Storage Evo 970 1tb
Display(s) 3x acer 27in 1440p 144hz
Case Ncase M1
Audio Device(s) N/A
Power Supply Dagger 650w
Mouse Logitech G502
Keyboard Logitech G613
This is pathetic, I can't believe people have to pay this kind of money for this. Every time I see something about G-Sync I want to throw up in my mouth. I would love to buy G-Sync, but at this pricing it is completely and utterly stupid to do so. I have had Nvidia GPUs for quite some time because AMD cannot give me the performance I want in gaming. So I am stuck with a 35" ultrawide with no G-Sync because the damn thing is too expensive for the technology.

FYI - just venting
 
Joined
Aug 6, 2017
Messages
7,412 (2.78/day)
I agree Nvidia is charging an arm and a leg for those, but you get ULMB plus adaptive sync in NVCP, and those who want great response with no blur know it's friggin' worth it. If I could choose between one with G-Sync implemented like FreeSync for a lower price and a normal G-Sync one with ULMB and the whole 30-1xx/2xx Hz range guaranteed to work, I'd gladly pay the premium.
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.54/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
@Vya Domus You can see the input lag at your desktop if you hook your computer up to a regular TV that doesn't have low response times.

One of my PCs is hooked up to my TV every now and then, and there's always a heartbeat between physically moving the mouse and the action taking place on screen.

Head on over to www.rtings.com and read through some of their reviews and look at the response time section. They are no nonsense and will give you just the facts on the TVs.
This is true. The Xbox just upped its refresh rate to 120 Hz in the latest update to address this.

https://news.xbox.com/en-us/2018/04/20/may-xbox-update/
 
Joined
Jan 24, 2008
Messages
888 (0.14/day)
System Name Meshify C Ryzen 2019
Processor AMD Ryzen 3900X
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3200Mhz )
Video Card(s) AMD Radeon RX6800 ( 2400Mhz/2150Mhz )
Storage Samsung Evo 960
Display(s) Pixio PX275h
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
FreeSync and G-Sync are very welcome technologies. It is completely up to the individual whether they notice the advantages or not. These adaptive synchronization technologies can eliminate microstutter and tearing without the added input lag that regular vertical synchronization induces, at any supported refresh rate defined by the monitor at hand.

My personal opinion is that tearing is still noticeable at refresh rates above 120 Hz, but there it doesn't bother me, unlike at 60 Hz, where it certainly does. I can definitely enjoy games with such barely noticeable tearing. However, depending on the game being played, microstutter can occur for various reasons, and adaptive sync is often able to reduce or completely eliminate it.

You think Nvidia cares about that? If they can keep this up for a couple of years, they'll have earned millions. So what if the technology doesn't survive?
 
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
Two pages of posts about G-Sync technology without a mention of the hardware module's MBR function?

G-Sync - The term "G-Sync" oddly is still used even when the user turns off G-Sync. A "G-Sync" monitor can be used in two ways, and using adaptive sync to match frame rates is only one of them. The sync technology has its most noticeable impact from 30 to 75-ish fps, at which point users will often choose to switch to ULMB, the use of which requires G-Sync to be disabled. The Nvidia hardware module provides the backlight strobing required for motion blur reduction (MBR).

FreeSync - FreeSync provides a similar adaptive sync technology and has its most significant impact from 40 to 75-ish fps. Like G-Sync it continues to have an impact above 75 fps, but in both instances it trails off quickly. But here's the kicker: FreeSync has no hardware module and is therefore incapable of providing the motion blur reduction technology (backlight strobing) that virtually eliminates ghosting. Some monitor manufacturers have provided such technology on their own; problem is, there's a myriad of designs and results vary. And when it's done well, the cost of the necessary MBR hardware erases FreeSync's cost advantage.

If you are running a top-tier Nvidia card on an Acer XB271HU or Asus PG279Q and have never disabled G-Sync to use ULMB instead, you're missing out. When the new 4K 144 Hz versions of those monitors drop, I'd expect users to bounce between the two settings depending on frame rates.
 
Joined
Feb 13, 2009
Messages
350 (0.06/day)
Location
NYC
Processor Intel Core i7 8700K
Motherboard ROG STRIX Z370-G GAMING AC
Cooling Corsair H115i Pro RGB
Memory G.Skill Trident Z RGB 16GB DDR4 3200Mhz
Video Card(s) Gigabyte GTX 1070 G1
Storage Samsung 970 Evo 500GB
Display(s) Dell S2417DG 165Hz
Case NZXT H400i
Power Supply Corsair AX760
Mouse Razer Deathadder Chroma
Keyboard Cooler Master - Masterkeys Pro L RGB
Software Windows 10 Pro 64Bit
I love my G-Sync monitor. My second monitor is only 60 Hz while my G-Sync monitor runs at 165 Hz. The difference in smoothness without screen tearing is immense. Even desktop usage is so much better.

I use G-Sync mode; I have tried ULMB mode, but meh, I still prefer G-Sync.

Don't knock it till you try it.
 
Joined
Nov 4, 2005
Messages
11,979 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
I'd love to know what G-SYNC does so much better than FreeSync to achieve a similar result that it takes so much more engineering and money. I can only assume the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up having to make GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
It's running security checks to ensure only Nvidia cards are connected, that requires a VM, which needs RAM, and a fast processor.......
 
Joined
Nov 29, 2016
Messages
670 (0.23/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
I'd love to know what G-SYNC does so much better than FreeSync to achieve a similar result that it takes so much more engineering and money. I can only assume the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up having to make GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.

To get G-Sync certification, minimum technical specifications and features must be present and certified to work correctly. FreeSync is a free-for-all.

https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia

It's running security checks to ensure only Nvidia cards are connected, that requires a VM, which needs RAM, and a fast processor.......

Stop lying. You can use G-Sync monitors with AMD cards.
 

bug

Joined
May 22, 2015
Messages
13,755 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I'd love to know what G-SYNC does so much better than FreeSync to achieve a similar result that it takes so much more engineering and money. I can only assume the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up having to make GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
First-gen G-Sync did ULMB and didn't have the annoying FPS-range restriction that came with FreeSync.

But $500 for a G-Sync 2 module is actually great news. It means the death of G-Sync is that much closer, so we can settle on one standard, like, you know, sane people.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
First-gen G-Sync did ULMB and didn't have the annoying FPS-range restriction that came with FreeSync.

But $500 for a G-Sync 2 module is actually great news. It means the death of G-Sync is that much closer, so we can settle on one standard, like, you know, sane people.
Sane people. Hmmmm....
 

Joined
Feb 18, 2017
Messages
688 (0.24/day)
I'd love to know what G-SYNC does so much better than FreeSync to achieve a similar result that it takes so much more engineering and money. I can only assume the solution is much more sophisticated, but there doesn't seem to be that much difference in reviews.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up having to make GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
Just check how many FreeSync monitors are on the market versus how many G-Sync ones. Even the second-biggest TV manufacturer enabled FreeSync support on some of their 2018 models. People can taunt AMD, but getting a much cheaper monitor with the same specifications is good for everyone.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.46/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I laugh my ass off every time I see this. The average human reaction time is something like 250 ms; whoever seriously thinks that a time frame of 16 ms or less can make a perceivable difference is being delusional.

I remember playing the COD WWII beta, using v-sync at 60 fps and a shitty wireless mouse, and I was pretty much always in the top 3. With all that "unbearable lag", go figure.
It's persistence of vision, or the lack thereof. Reaction time isn't just the visual component; it's understanding what you're looking at, then the brain telling the muscles to move, and then the muscles actually moving. Persistence of vision being even *slightly* off can cause motion sickness or dizziness.

In the end, the expensive G-SYNC vs. free FreeSync difference will ensure that G-SYNC eventually dies out while FreeSync becomes the de facto standard, which is happening already. NVIDIA really needs to work on the pricing of this technology, or they'll end up having to make GeForce cards that support FreeSync eventually. That would be no bad thing for the consumer, presumably.
NVIDIA would rather sell you a bridge to nowhere than put driver resources and certification testing into adopting Adaptive-Sync.
 