
NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I won't argue with that, but I'll STILL take a 48-60Hz VRR range over fixed 60Hz every single time. That's all my 4K television offers, but it makes a huge difference even if it doesn't support LFC.

I'm looking at a slim/portable 14" laptop and I'm noticing one with a 3700U and 48-60Hz VRR panel. Being able to run a game at 48Hz instead of 60Hz is equivalent to a 25% performance boost when seen in the context of avoiding stuttering, which is the number one concern for me when low-end gaming. Hell, it might even be the sort of jump that allows running at native 1080p instead of non-native 720p. That sort of difference is HUGE.
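
(For anyone checking that math, a quick Python sketch; numbers are illustrative only:)

# Frame-time budget at a fixed cadence: 1000 ms divided by the target rate.
for hz in (60, 48):
    print(f"{hz}Hz -> {1000 / hz:.2f} ms per frame")  # 16.67 ms vs 20.83 ms
# Extra GPU time per frame when targeting 48 instead of 60:
print(f"headroom: {60 / 48 - 1:.0%}")  # 25%, matching the claim above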
To sum it up, G-Sync shouldn't exist because you'll take any amount of variable refresh? :p
 
Joined
Dec 10, 2014
Messages
1,327 (0.36/day)
Location
Nowy Warsaw
System Name SYBARIS
Processor AMD Ryzen 5 3600
Motherboard MSI Arsenal Gaming B450 Tomahawk
Cooling Cryorig H7 Quad Lumi
Memory Team T-Force Delta RGB 2x8GB 3200CL16
Video Card(s) Colorful GeForce RTX 2060 6GV2
Storage Crucial MX500 500GB | WD Black WD1003FZEX 1TB | Seagate ST1000LM024 1TB | WD My Passport Slim 1TB
Display(s) AOC 24G2 24" 144hz IPS
Case Montech Air ARGB
Audio Device(s) Massdrop + Sennheiser PC37X | QKZ x HBB
Power Supply Corsair CX650-F
Mouse Razer Viper Mini | Cooler Master MM711 | Logitech G102 | Logitech G402
Keyboard Drop + The Lord of the Rings Dwarvish
Software Windows 10 Education 22H2 x64
1) FreeSync is VESA Adaptive-Sync when run on an AMD card, and G-Sync Compatible on an Nvidia card.
2) FreeSync over HDMI is an AMD-specific port of VESA Adaptive-Sync. Depending on HDMI version and bandwidth, there are up to 120Hz HDMI FreeSync monitors available.
3) G-Sync over HDMI is Nvidia implementing the HDMI Forum VRR spec on HDMI 2.1 and later monitors. Since AMD is also a member of the HDMI Forum, it's only a matter of time before AMD implements it too.
4) While no current graphics cards are HDMI 2.1, according to a reddit post some features of 2.1 can be back-ported to HDMI 2.0.
5) This news piece is about Nvidia opening up G-Sync modules to take the AMD FreeSync signal in the future.

1) FreeSync 1st gen is trash because there's no certification from AMD. Or if any monitors are certified (on AMD's official list on their website), they aren't tested.
2) In that case, officially G-Sync Compatible monitors (from the list) are the better bet, because they were tested by Nvidia.
3) But being certified also means Nvidia doesn't want the monitor OSD to have any mention of "FreeSync". It should just show up in the Nvidia driver when you plug it in, with no fiddling in the OSD to turn on "FreeSync" like on my current monitor.
4) So FreeSync monitors that aren't certified can still run in G-Sync Compatible mode; they just aren't on the list.
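
If you want to see what refresh range a panel actually advertises, the base EDID carries it in the "Display Range Limits" descriptor (tag 0xFD). A minimal sketch, assuming Linux sysfs paths and the simple byte layout (no range-limit offsets; some monitors report their VRR range elsewhere, so treat the output as a hint):

import glob

def vertical_range(edid: bytes):
    # The base EDID block holds four 18-byte descriptors at fixed offsets.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors start with three zero bytes; tag 0xFD = range limits.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]  # min/max vertical refresh in Hz
    return None

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    blob = open(path, "rb").read()
    if len(blob) >= 128:
        print(path, "->", vertical_range(blob))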
 
Joined
Feb 20, 2019
Messages
8,277 (3.94/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
To sum it up, G-Sync shouldn't exist because you'll take any amount of variable refresh? :p
Correction: I'll take any amount of variable refresh if the alternative is none :)

My main gaming display supports LFC with a reasonably wide VRR window (down to 48Hz), and it's a significant improvement over 48-60Hz, just as 48-60Hz was a significant improvement over no VRR at all. Nvidia would like you to believe that VRR is a binary feature and that it's pointless having less than full G-Sync VRR support. I'm trying to educate people that Nvidia is wrong: any VRR is better than no VRR.
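
The usual rule of thumb here: a driver can only do LFC when the window is wide enough to repeat every frame and still land inside it, i.e. the max has to be at least about double the min. A sketch (Python; the threshold is the commonly cited approximation, not a vendor spec):

def supports_lfc(vmin_hz: float, vmax_hz: float) -> bool:
    # Each frame must be showable at least twice within the VRR window.
    return vmax_hz >= 2 * vmin_hz

print(supports_lfc(48, 60))   # False: the 48-60Hz laptop panel above can't do LFC
print(supports_lfc(48, 144))  # True: a wide window can double frames into range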
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Correction: I'll take any amount of variable refresh if the alternative is none :)

My main gaming display supports LFC with a reasonably wide VRR window (down to 48Hz), and it's a significant improvement over 48-60Hz, just as 48-60Hz was a significant improvement over no VRR at all. Nvidia would like you to believe that VRR is a binary feature and that it's pointless having less than full G-Sync VRR support. I'm trying to educate people that Nvidia is wrong: any VRR is better than no VRR.
I don't think Nvidia ever implied it's a binary feature.
If you're not paying for it, you'll probably take any amount. But if you're paying the G-Sync premium, I suspect you're not that easily pleased. Nvidia is simply guaranteeing you're getting something for your $$$.
 
Joined
Sep 17, 2014
Messages
22,438 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Sure but still irrelevant and OT.

People seem to think I have some side in this. They seem to forget I had an exclusively Ryzen/AMD system until a year ago, and I love AMD as much as is healthy. That's not an excuse to bring them up where irrelevant, or worship them as anything more than what they are: a profit-driven company.

It's a strange world where you have to support sensible arguments with the addendum that you have an AMD rig... man.

People need to grow up already. FreeSync / G-Sync and who did it better... it's a different approach; be happy you had a choice to make instead of a green and a red sticker that are identical in every way. G-Sync was first and AMD followed. G-Sync commanded a premium through quality and support that FreeSync could never achieve because its first incarnation was weak. We all know this; fanboying has no place. Both companies are in it for $$$, yes, even AMD, because it is a mindshare battle too.

And guess what: now we have more options than ever, and high refresh + VRR has even entered TV territory. What's not to like?

Meaning the game's own thread(s) caps the frame rate.

Got a source? As far as I know, the game's thread is fundamentally capped by performance, which will vary all the time unless there is always ample headroom in the entire pipeline. I also fail to see the relation to manipulating the monitor's refresh rate.

EDIT: just read the page full of similar responses, I think we can put that to bed.
 
Joined
Aug 20, 2007
Messages
21,453 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
It really depends on the game... not much with the monitor. For example, when I get a frame drop of 30 fps when a large battle occurs in Total War games, and it jumps up high when I zoom in, it's flawlessly smooth with FreeSync... no new monitor tech is going to change that... so no, he is simply wrong. FreeSync still matters and helps.

Technically speaking, FreeSync should handle that worse than G-Sync, because at such low frame rates it will fall back to unsynced video.

i.e., it may appear smooth, but screen tearing will return... Unsure I'd prefer that over NVIDIA's frame-duplication trick that the G-Sync module does.

Opening up G-Sync is a step in the right direction but I still don't understand why G-Sync monitors are being made today.

Rather than this (now completely unnecessary) G-Sync/Freesync branding - monitor manufacturers should just list the refresh rate like this:

60Hz
48-60Hz
40-75Hz
120Hz
30-144Hz

etc....

It's not rocket science to work out which ones have variable refresh rates and which ones don't, and there's zero ambiguity. At the moment you have "144Hz monitors" that may or may not support VRR, the existence of which is often buried deep in the detailed specifications as an afterthought, and you're lucky if the VRR refresh range is even mentioned.

Because the behavior of module-based G-Sync and FreeSync is actually different at the low-end exit point of the range.
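
A hypothetical sketch of that difference at the exit point (names, numbers and the fallback behavior are illustrative, not either vendor's actual driver logic):

def below_window(fps: float, vmin: float, vmax: float, frame_doubling: bool):
    if fps >= vmin:
        return f"{fps:.0f}Hz, synced (inside the VRR window)"
    if frame_doubling:  # G-Sync module / LFC-style behavior
        n = 2
        while fps * n < vmin:  # repeat frames until back inside the window
            n += 1
        if fps * n <= vmax:
            return f"{fps * n:.0f}Hz, synced (each frame shown {n}x)"
    return f"{vmax:.0f}Hz fixed: tearing or vsync stutter returns"

print(below_window(30, 48, 60, frame_doubling=False))   # falls out of sync
print(below_window(30, 48, 144, frame_doubling=True))   # 60Hz, frames doubled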
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,209 (4.66/day)
Location
Kepler-186f
I am using a FreeSync monitor right now in G-Sync Compatible mode, and I have no tearing at all. Feels the same as when I had an official G-Sync screen. My range is 30 to 144Hz though, so maybe I just got a good monitor.
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
I am using a FreeSync monitor right now in G-Sync Compatible mode, and I have no tearing at all. Feels the same as when I had an official G-Sync screen. My range is 30 to 144Hz though, so maybe I just got a good monitor.
See the comment above yours. The problem is with monitors having a narrow refresh range. FreeSync doesn't impose a range, whereas G-Sync does.
 
Joined
Aug 20, 2007
Messages
21,453 (3.40/day)
I am using a FreeSync monitor right now in G-Sync Compatible mode, and I have no tearing at all. Feels the same as when I had an official G-Sync screen. My range is 30 to 144Hz though, so maybe I just got a good monitor.

Yeah your range is pretty darn good too. That may have a lot to do with it.
 
Joined
Nov 9, 2018
Messages
57 (0.03/day)
Nice. I do wonder what the motivation behind this is.

The LG OLED lineup and their implementation of HDMI 2.1 VRR. Pretty soon (like within a year or so) all GPUs and gaming consoles will have a perfectly usable implementation built in (HDMI Forum VRR) via the HDMI 2.1 chips they will all include. With displays also coming onto the market en masse with the feature just "built in", Nvidia knows it's time to open up and support the future. If not, it'd be like them sticking to DVI after HDMI / DP came about.

I'm already taking advantage of HDMI 2.1 VRR with my 65" C9 OLED and I'll never go back, and I don't think anyone else as a gamer will want to either once it's standard on all next-gen consoles and 2020 TVs.

Games that let you limit the frame rate lower than what your monitor can do prevent tearing, and VRR isn't even necessary. I could be glossing over a few edge cases, but most games I play today support in-game limits, and I replaced my Asus G-Sync monitor with a Samsung 32" non-G-Sync one, and I 1) play without Vsync, 2) use in-game limits, 3) never get tearing or input lag.

Could be there are other benefits, but I absolutely don't miss G-Sync. I can actually enable G-Sync now with the latest drivers, but I see no benefits; in fact, enabling FreeSync on my Samsung messes with/lowers the brightness for some damn reason, so there's just no reason for me to enable it and use it in G-Sync Compatible mode.
You must be playing some pretty low-spec games at sub-2K resolutions and at a max of 60 fps to always be at the cap you set.

Me, I have a 2080 Ti and a 4K OLED with G-Sync compatibility, and I play many of today's best games. I cannot keep the frame rate maxed out at the in-game limits I set; it flies anywhere between 75 and 120 at 1440p, and 40 and 60 at 4K.

Even with in-game limits set to my display's proper refresh rate minus a couple, I'm still going to get tearing without G-Sync.
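
A rough illustration of the point being argued: with sync off, a frame cap alone doesn't pin the buffer flip to a safe spot, so the tear line just drifts down the screen (plain Python, made-up numbers):

refresh_hz, cap_fps = 60.0, 59.7  # cap deliberately a hair off the refresh rate
scanout_ms, frame_ms = 1000 / refresh_hz, 1000 / cap_fps
for i in range(5):
    flip_t = i * frame_ms  # when the capped game presents each frame
    # How far down the screen the scanout is when the flip lands:
    print(f"frame {i}: flip at {(flip_t % scanout_ms) / scanout_ms:.1%} of the screen")
# Only vsync (wait for blanking) or VRR (the panel waits for the frame)
# keeps the flip out of the visible scanout.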
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,209 (4.66/day)
Location
Kepler-186f
The LG OLED lineup and their implementation of HDMI 2.1 VRR. Pretty soon (like within a year or so) all GPUs and gaming consoles will have a perfectly usable implementation built in (HDMI Forum VRR) via the HDMI 2.1 chips they will all include. With displays also coming onto the market en masse with the feature just "built in", Nvidia knows it's time to open up and support the future. If not, it'd be like them sticking to DVI after HDMI / DP came about.

I'm already taking advantage of HDMI 2.1 VRR with my 65" C9 OLED and I'll never go back, and I don't think anyone else as a gamer will want to either once it's standard on all next-gen consoles and 2020 TVs.


You must be playing some pretty low-spec games at sub-2K resolutions and at a max of 60 fps to always be at the cap you set.

Me, I have a 2080 Ti and a 4K OLED with G-Sync compatibility, and I play many of today's best games. I cannot keep the frame rate maxed out at the in-game limits I set; it flies anywhere between 75 and 120 at 1440p, and 40 and 60 at 4K.

Even with in-game limits set to my display's proper refresh rate minus a couple, I'm still going to get tearing without G-Sync.

There are LG OLEDs out already that have G-Sync VRR over HDMI 2.1. Linus just did a video on it last week, and put one in his personal house.
 
Joined
Nov 9, 2018
Messages
57 (0.03/day)
There are LG OLEDs out already that have G-Sync VRR over HDMI 2.1. Linus just did a video on it last week, and put one in his personal house.

I know this; I have the 65" LG C9 OLED myself. He got the lower-end model, the B9, from LG.

I was pissed when he put that Alienware in his living room, because he was talking crap about the LG OLEDs at the time, even though it had already been announced that we would have VRR in just a couple of weeks.

I had just picked mine up a month earlier and had a good feeling I would get something akin to G-Sync, but figured it wouldn't be until sometime in the new year, before the new consoles. Imagine my surprise when Christmas came early lol.

Now Linus is talking about going back to the LG OLED in his living room, and it's one of the reasons I hate these videos these guys make where they take a sponsored product and put it in their home. It's not like they really chose the product because it's the best; they chose it because someone decided to give it to them, possibly with a bag of money.

I was in the comments talking about how I got all the features of that Alienware for $3,500 less (I paid about $1,700 for my 65"; those are like $6,000), but everyone in the comments was just like "Linus got rid of an OLED for this, it's obviously better."

Yeah, so much better that he's replacing it with another OLED less than 6 weeks later lol.
 
Joined
Aug 8, 2016
Messages
89 (0.03/day)
System Name Evo PC
Processor AMD Ryzen 5800X3D
Motherboard Asus X470 Prime Pro
Cooling Noctua NH15
Memory G.Skill Ripjaws 32GB 3,600MHz
Video Card(s) Gigabyte Aorus RX 7900XTX Gaming OC
Storage 1TB WD SN750/512GB Samsung 870 EVO
Display(s) LG UltraGear 27 2K 165Hz
Case Fractal Define C Glass
Audio Device(s) Creative X-Fi Titanium
Power Supply Supernova G2 1000
Mouse MSI Vigor GM11
Keyboard Corsair K55 Pro RGB
I don't trust your eyes.

Wow so clueless lol


Technically speaking, FreeSync should handle that worse than G-Sync, because at such low frame rates it will fall back to unsynced video.

i.e., it may appear smooth, but screen tearing will return... Unsure I'd prefer that over NVIDIA's frame-duplication trick that the G-Sync module does.



Because the behavior of module-based G-Sync and FreeSync is actually different at the low-end exit point of the range.

That feature is called LFC, aka Low Framerate Compensation, and several FreeSync monitors support it; it's meant to extend VRR at the lower end of the spectrum. The Samsung C27FG70 27" QLED monitor supports it and allows VRR to run as low as 40FPS while feeling exactly like 60; once the 60FPS threshold is reached, it stops and runs regular VRR all the way to 144Hz. FreeSync 2 has several imposed standards that need to be met to get certified, compared to the more liberal FreeSync 1.
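
LFC in numbers, roughly (a sketch; the 48-144Hz window is illustrative, not the C27FG70's exact range):

def lfc(fps: float, vmin: float = 48):
    n = 1
    while fps * n < vmin:  # repeat each frame until the panel is back in range
        n += 1
    return n, fps * n

for fps in (40, 35, 20):
    n, hz = lfc(fps)
    print(f"{fps} fps -> each frame shown {n}x, panel refreshes at {hz:.0f}Hz")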
 
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
The thread starts with something that looks like it came out of Washington DC's Office of Misinformation. Comparing G-Sync and FreeSync is like comparing 4WD and AWD ... 99% of folks think they are the same thing. Nothing could be further from the truth.

My wife has AWD ... at least 3 times every winter, I tow her AWD out of the snow with my 4WD. Go off-road out in Moab w/ AWD ... only if ya wanna risk ya life. G-Sync and FreeSync do what they do about equally from 40-75 fps. Hard to notice any difference between the two, except nVidia seems to have an edge below 40 fps. But like 4WD, where I can turn a switch on the dashboard and lock all 4 wheels, G-Sync can do something FreeSync can't do... Motion Blur Reduction (MBR) ... and that's one of the reasons why nVidia's market share is 5 times AMD's.

If I am at 75-80 fps or more, I have G-Sync turned off and it's ULMB only for me. Waited 2 years for the 4K 144Hz panels, and when they came to market w/o ULMB, I passed. Yes, you can buy FreeSync monitors w/ MBR technology, but it's not from AMD ... it's a hodgepodge of different systems.

The move by nVidia here is consistent with the GFX card strategy ... take the top spot, win mindshare, and work your way down, gobbling more and more market share. With the 7xx series, they had the top 2 tiers ... with 9xx they took another w/ the 970 ... with 10xx they took another with the 1060 ... with 20xx they have edged AMD in every market segment down to $200. AMD almost held on with the 5700 XT, but when both cards are OC'd ... nVidia has the edge ... in performance, power, heat and noise. They are doing the same thing w/ monitors. AMD had a niche that they owned in the budget market ... now the discussion in the boardroom isn't, as suggested, "let's give up" ... that discussion is "here's a segment we haven't taken yet, let's jump in here too".
 