Monday, November 25th 2019

NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

In the wars of variable refresh rates, much ink has already been spilled over the open AMD FreeSync approach and NVIDIA's proprietary G-Sync modules. The war started to show its first signs of abatement once NVIDIA seemed to throw in the towel by officially supporting VESA's VRR (Variable Refresh Rate) technology on its graphics cards, essentially opening the way for NVIDIA graphics cards to operate correctly with previously AMD FreeSync-branded monitors. Now, it seems one more step will be taken down that road, one which could be the final whiff of G-Sync's proprietary approach: according to a report from TFT Central, confirmed by NVIDIA, the company will enable VRR support on upcoming monitors equipped with its G-Sync module. This will essentially enable AMD graphics cards to work with NVIDIA-branded G-Sync monitors.
This move will only work for future monitor releases, mind you - a firmware update distributed among monitor makers will enable upcoming G-Sync monitors to support VESA's VRR standard. This will not, apparently, happen with already-released G-Sync modules, whether they carry NVIDIA's first take on the technology or the v2 G-Sync module. It's not a perfect solution, and current adopters of G-Sync remain locked in to NVIDIA graphics cards for VRR support on their monitors. It is, however, a definite step forward. Or a step backwards from a proprietary, apparently unneeded technology - you can really look at it either way.
Whether or not this makes sense from a product standpoint will only become clear once pricing on future NVIDIA G-Sync monitors surfaces - but we find it a hard sell for monitor makers to keep investing much in the G-Sync module going forward, since there are no practical, user-observable differences aside from final product cost.
Source: TFT Central

66 Comments on NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

#1
xkm1948
Nice. I do wonder what the motivation behind this is.
#2
adulaamin
This move will only work for future monitor releases
Bummer... :(
#3
bug
xkm1948: Nice. I do wonder what the motivation behind this is.
Clearly they were moving so many SKUs, they couldn't cope with it anymore. </sarcasm>

This may be a sad day, because, as hated as it was, G-Sync did one thing right: it imposed refresh limits on manufacturers. FreeSync doesn't and the logo can be awarded to monitors that do VRR over a 1Hz range as a result. Before you ask, I don't know of a monitor with such a narrow range, but monitors with unusable ranges are everywhere.
Then again, who knows, once everybody backs FreeSync, maybe FreeSync3 or 4 will eventually enforce some ranges, like G-Sync does.

(Yes, I know I used "FreeSync" a little liberally; I am too lazy to keep writing VRR/AdaptiveSync.)
#4
HisDivineOrder
I was hoping this was Nvidia finally announcing that they'll allow pseudo-FreeSync via G-Sync over HDMI instead of just over DisplayPort. Well, besides on LG OLED TVs. I guess they'll wait till NEXT YEAR for that announcement.
#5
bug
HisDivineOrder: I was hoping this was Nvidia finally announcing that they'll allow pseudo-FreeSync via G-Sync over HDMI instead of just over DisplayPort. Well, besides on LG OLED TVs. I guess they'll wait till NEXT YEAR for that announcement.
That's exactly what is announced. Go read the source ;)
#6
zlobby
Add this one to the countless technological advances brought forth by AMD.
#7
R-T-B
zlobby: Add this one to the countless technological advances brought forth by AMD.
Uh... I love AMD too and all, but NVIDIA opened their modules to AMD, not vice versa.
#8
Vya Domus
R-T-B: Uh... I love AMD too and all, but NVIDIA opened their modules to AMD, not vice versa.
How do I put this ... AMD's FreeSync was open for everyone from day one, including for Nvidia. It's even in the name: FreeSync.
#9
R-T-B
Vya Domus: How do I put this ... AMD's FreeSync was open for everyone from day one, including Nvidia. It's even in the name: FreeSync.
That's also completely irrelevant to this news article about opening G-Sync (a proprietary standard) to FreeSync monitors.

AMD had nothing to do with this other than having an open standard. NVIDIA could've sat on G-Sync's ecosystem a lot longer. It probably was profit that drove them not to do so, yes, but crediting this move (which involves new G-Sync module firmware at minimum) to AMD in any way is an absurd thought exercise.
#11
R-T-B
Vya Domus: Irrelevant but accurate.
Sure but still irrelevant and OT.

People seem to think I have some side in this. They seem to forget I had an exclusively Ryzen/AMD system but a year ago, and love AMD as much as is healthy. That's not an excuse to bring them up where irrelevant, or worship them as anything more than what they are: A profit driven company.
#12
CrAsHnBuRnXp
Is there really a reason for both G-Sync and Adaptive-Sync anymore?
#13
Camm
Now that Nvidia has finished co-opting VRR branding, it can afford to open it up to competitors, lol.
#14
BArms
xkm1948: Nice. I do wonder what the motivation behind this is.
G-Sync/FreeSync are both becoming less relevant because games with in-thread rate limits are becoming a standard option, kinda like AA/AF, which is by far the best way to handle this. In-thread limits have virtually no costs and no downsides, so for any game that supports them there's simply no need for Any-sync. Both Unity and Unreal Engine now support it, so it's trivial to add to any new game, from indies to blockbusters.
#15
Vya Domus
BArms: G-Sync/FreeSync are both becoming less relevant because games with in-thread rate limits are becoming a standard option, kinda like AA/AF, which is by far the best way to handle this. In-thread limits have virtually no costs and no downsides, so for any game that supports them there's simply no need for Any-sync. Both Unity and Unreal Engine now support it, so it's trivial to add to any new game, from indies to blockbusters.
What's an "in-thread rate limit"?
#16
Apocalypsee
xkm1948: Nice. I do wonder what the motivation behind this is.
With FreeSync being basically 'free' on most new monitors, and even big green supporting it through DP, there's a VERY small number of people who will buy a G-Sync monitor. By opening this up to AMD there's a larger market. To be honest, I'd buy a G-Sync monitor that is open and not tied to just them (Nvidia). G-Sync is more tightly regulated than FreeSync (better VRR range, backlight strobing, etc.).
#17
Fluffmeister
Apocalypsee: With FreeSync being basically 'free' on most new monitors, and even big green supporting it through DP, there's a VERY small number of people who will buy a G-Sync monitor. By opening this up to AMD there's a larger market. To be honest, I'd buy a G-Sync monitor that is open and not tied to just them (Nvidia). G-Sync is more tightly regulated than FreeSync (better VRR range, backlight strobing, etc.).
Yeah, all credit to Nvidia for creating G-Sync, then helping raise the standard of the countless FreeSync trash by validating them.
#18
BArms
Vya Domus: What's an "in-thread rate limit"?
Meaning the game's own thread(s) caps the frame rate.
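For what it's worth, here's a minimal sketch of that kind of in-engine frame cap, assuming a hypothetical 60 FPS target and placeholder simulate/render/is_running callables standing in for a real engine (Python used purely for illustration):

import time

TARGET_FPS = 60                # hypothetical cap a game might expose in its settings
FRAME_TIME = 1.0 / TARGET_FPS

def game_loop(simulate, render, is_running):
    # The game's own loop paces itself so it never produces frames faster
    # than TARGET_FPS; note this caps frame output, it does not synchronize
    # frames to the display's refresh.
    next_deadline = time.perf_counter() + FRAME_TIME
    while is_running():
        simulate(FRAME_TIME)   # advance game state by one frame's worth of time
        render()               # submit the frame
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)       # burn off the rest of the frame budget
            next_deadline += FRAME_TIME
        else:
            # Missed the budget; resync rather than trying to catch up.
            next_deadline = time.perf_counter() + FRAME_TIME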
#19
Space Lynx
Astronaut
BArms: Meaning the game's own thread(s) caps the frame rate.
Incorrect. The image is still smoother with FreeSync.
#20
BArms
lynx29: Incorrect. The image is still smoother with FreeSync.
Have any proof?
#21
Space Lynx
Astronaut
BArms: Have any proof?
yeah my eyes.
#22
BArms
lynx29: yeah my eyes.
I don't trust your eyes.
#23
Apocalypsee
Fluffmeister: Yeah, all credit to Nvidia for creating G-Sync, then helping raise the standard of the countless FreeSync trash by validating them.
NV was forced to do this in the first place because of competition; otherwise people won't buy G-Sync because it's expensive. I bet you can't even list that 'countless FreeSync trash' you mentioned.
#24
R-T-B
lynx29: yeah my eyes.
Different monitor panels will make more difference than the tech really.

G-Sync modules do offer some advantages, but they are so small that the panel should matter exponentially more.
#25
Vya Domus
BArms: Meaning the game's own thread(s) caps the frame rate.
Capping the framerate does not eliminate tearing, nor is it a substitute for a variable refresh rate.