Monday, November 25th 2019
NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support
In the wars of variable refresh rates, much ink has already been spilled regarding the open AMD FreeSync approach and NVIDIA's proprietary G-Sync modules. The war showed its first signs of abatement when NVIDIA seemed to throw in the towel by officially supporting VESA's VRR (Variable Refresh Rate) technology on its graphics cards, opening the way for NVIDIA graphics cards to operate correctly with previously AMD FreeSync-branded monitors. Now it seems one more step will be taken down that road, one that could be the final whiff of G-Sync's proprietary approach: according to a report from TFT Central, confirmed by NVIDIA, the company will enable VRR support on future releases of monitors equipped with its G-Sync module. This will essentially enable AMD graphics cards to work with NVIDIA-branded G-Sync monitors.
This move will only apply to future monitor releases, mind you - a firmware update distributed amongst monitor makers will enable upcoming G-Sync monitors to support VESA's VRR standard. It will not, apparently, reach already-released G-Sync modules, whether they carry NVIDIA's first take on the technology or the v2 G-Sync module. It's not a perfect solution, and current adopters of G-Sync remain locked in to NVIDIA graphics cards for VRR support on their monitors. It is, however, a definite step forward. Or a step backwards from a proprietary, apparently unneeded technology - you can really look at it either way.
Whether or not this makes sense from a product standpoint will only become clear once pricing on future NVIDIA G-Sync monitors surfaces - but we find it a hard sell for monitor makers to invest much in the G-Sync module going forward, since there are no practical, user-observable differences aside from final product cost.
Source:
TFT Central
2) FreeSync over HDMI is an AMD-specific port of VESA Adaptive-Sync. Depending on HDMI version and bandwidth, HDMI FreeSync monitors of up to 120 Hz are available.
3) G-Sync over HDMI is Nvidia implementing the HDMI Forum VRR spec on HDMI 2.1 and later monitors. Since AMD is also a member of the HDMI Forum, it's only a matter of time before AMD implements it as well.
4) While no current graphics cards are HDMI 2.1, according to a Reddit post some features of 2.1 can be back-ported to HDMI 2.0.
5) This news piece is about Nvidia opening up G-Sync modules to accept an AMD FreeSync signal in the future.
1) First-gen FreeSync is trash because there's no certification from AMD. Or, if any monitors are certified (per AMD's official list on their website), they aren't actually tested.
2) In this case, official G-Sync Compatible monitors (from the list) are a better bet because they were tested by Nvidia.
3) Being certified also means Nvidia doesn't want the monitor's OSD to have any mention of "FreeSync". It should just show up in the Nvidia driver when you plug it in, with no need to fiddle with the OSD to turn on "FreeSync" like on my current monitor.
4) So FreeSync monitors that aren't certified can often still work as G-Sync Compatible monitors do.
My main gaming display supports LFC with a reasonably wide VRR window that goes down to 48 Hz, and it's a significant improvement over 48-60 Hz, just as 48-60 Hz was a significant improvement over no VRR at all. Nvidia would like you to believe that VRR is a binary feature and that anything less than full G-Sync VRR support is pointless. I'm trying to educate people that Nvidia is wrong: any VRR is better than no VRR.
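To illustrate what LFC is doing at the low end, here is a rough sketch - the window bounds, function name, and logic are illustrative assumptions on my part, not any vendor's actual implementation: once the frame rate drops below the panel's minimum VRR refresh, each frame is scanned out two or more times so the effective refresh lands back inside the supported window.

```python
def lfc_refresh(frame_rate_hz, vrr_min_hz=48, vrr_max_hz=144):
    # Inside the window: the panel simply follows the frame rate
    # (capped at the panel maximum).
    if frame_rate_hz >= vrr_min_hz:
        return min(frame_rate_hz, vrr_max_hz), 1

    # Below the window: show each frame 2, 3, ... times until the
    # effective refresh (frame_rate * multiplier) is back inside it.
    multiplier = 2
    while frame_rate_hz * multiplier < vrr_min_hz:
        multiplier += 1
    return frame_rate_hz * multiplier, multiplier

# A 30 fps scene on a 48-144 Hz panel is refreshed at 60 Hz, 2 scans per frame;
# 20 fps becomes 60 Hz with 3 scans; 90 fps is passed through unchanged.
print(lfc_refresh(30))  # (60, 2)
print(lfc_refresh(20))  # (60, 3)
print(lfc_refresh(90))  # (90, 1)
```

A real implementation also has to keep the duplicated refresh under the window's top, which is only possible when the range spans roughly a 2:1 ratio or more - which is exactly why a wide VRR window matters so much at low frame rates.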
If you're not paying for it, you'll probably take any amount. But if you're paying the G-Sync premium, I suspect you're not that easily pleased. Nvidia is simply guaranteeing you're getting something for your $$$.
People need to grow up already. FreeSync / G-Sync and who did it better... each is a different approach; be happy you had a choice to make instead of a green sticker and a red sticker that are identical in every way. G-Sync was first and AMD followed. G-Sync commanded a premium through quality and support that FreeSync could never achieve because its first incarnation was weak. We all know this; fanboying has no place here. Both companies are in it for the $$$ - yes, even AMD - because it is a mindshare battle too.
And guess what, now we have more options than ever, and high refresh + VRR has even entered TV territory. What's not to like?
Got a source? As far as I know, the game's thread is fundamentally capped by performance, which will vary all the time unless there is always ample headroom in the entire pipeline. I also fail to see how that relates to manipulating the monitor's refresh rate.
EDIT: just read the page full of similar responses; I think we can put that to bed.
i.e., it may appear smooth, but screen tearing will return... I'm not sure I'd prefer that over NVIDIA's frame duplication that the G-Sync module does, because the behavior of module-based G-Sync and FreeSync is actually different at the low-end exit point of the range.
I'm already taking advantage of HDMI 2.1 VRR with my 65" C9 OLED and I'll never go back, and I don't think any other gamer will want to either once it's standard on all next-gen consoles and 2020 TVs.
You must be playing some pretty low-spec games at sub-2K resolutions and at a max of 60 fps to always be at the cap you set.
Me, I have a 2080 Ti and a 4K OLED with G-Sync compatibility, and I play many of today's best games; I cannot keep my frame rate maxed out at the in-game limits I set, and it flies anywhere between 75 and 120 at 1440p and 40 and 60 at 4K.
Even with in-game limits set to my display's refresh rate minus a couple, I'm still going to get tearing without G-Sync.
I was pissed when he put that Alienware in his living room because he was talking crap about the LG OLEDs at the time, even though it had already been announced that we would have VRR in just a couple of weeks.
I had just picked mine up a month earlier and had a good feeling I would get something akin to G-Sync, but figured it wouldn't be until sometime in the new year, before the new consoles. Imagine my surprise when Christmas came early, lol.
Now Linus is talking about going back to the LG OLED in his living room, and it's one of the reasons I hate these videos these guys make where they take a sponsored product and put it in their home. It's not like they really chose the product because it's the best; they chose it because someone decided to give it to them, possibly with a bag of money.
I was in the comments talking about how I got all the features of that Alienware for $3,500 less (I paid about $1,700 for my 65"; those are like $6,000), but everyone in the comments was just like "Linus got rid of an OLED for this, it's obviously better."
Yeah, so much better that he's replacing it with another OLED less than six weeks later, lol.
My wife has AWD... at least 3 times every winter, I tow her AWD out of the snow with my 4WD. Go off-road out in Moab with AWD... only if ya wanna risk ya life. G-Sync and FreeSync do what they do about equally from 40 - 75 fps. It's hard to notice any difference between the two, except nVidia seems to have an edge below 40 fps. But like 4WD, where I can turn a switch on the dashboard and lock all 4 wheels, G-Sync can do something FreeSync can't do... Motion Blur Reduction (MBR)... and that's one of the reasons why nVidia's market share is 5 times AMD's.
If I am at 75-80 fps or more, I have G-Sync turned off and it's ULMB only for me. I waited 2 years for the 4K 144 Hz panels, and when they came to market without ULMB, I passed. Yes, you can buy FreeSync monitors with MBR technology, but it's not from AMD... it's a hodgepodge of different systems.
The move by nVidia here is consistent with their graphics card strategy... take the top spot, win mindshare, and work your way down, gobbling more and more market share. With the 7xx series they had the top 2 tiers... with the 9xx they took another with the 970... with the 10xx they took another with the 1060... with the 20xx they have edged AMD in every market segment down to $200. AMD almost held on with the 5700 XT, but when both cards are OC'd, nVidia has the edge in performance, power, heat, and noise. They are doing the same thing with monitors. AMD had a niche that they owned in the budget market... now the discussion in the boardroom isn't, as suggested, "let's give up"... that discussion is "here's a segment we haven't taken yet, let's jump in here too."