Monday, January 7th 2019

NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology
NVIDIA finally got around to realizing that monitors with VESA Adaptive-Sync overwhelmingly outnumber those supporting NVIDIA G-Sync, and is going ahead with adding support for Adaptive-Sync monitors. This, however, comes with a big rider: NVIDIA is not immediately going to unlock Adaptive-Sync on all monitors, just the ones it has tested and found to work "perfectly" with its hardware. NVIDIA announced that it has found a handful of the 550+ monitor models on the market that support Adaptive-Sync and has enabled support for them. Over time, as it tests more monitors, support for additional models will be added through GeForce driver updates as they become "certified".
At its CES event, the company provided a list of monitors it has already tested and that fulfill all requirements. G-Sync support for these models from Acer, ASUS, AOC, Agon and BenQ will be enabled automatically with a driver update on January 15th.
Update: We received word from NVIDIA that you can manually enable G-SYNC on all Adaptive-Sync monitors, even non-certified ones: "For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too."
Update 2: NVIDIA has released these new Adaptive-Sync-capable drivers, and we tested G-SYNC on a FreeSync monitor.
231 Comments on NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology
G-Sync was enabled by default. No bad stuff happening, no blanking of the screen or pulsing.
Honestly I can't tell if it's working.
The old G-SYNC Pendulum Demo does not work. (Obviously it's outdated - released in 2015).
This is good news and, in a way, expected.
In the one game I've tested (Supreme Commander: Forged Alliance, which has a 100 FPS cap) I'm seeing some very faint flickering of brightness - but it's only noticeable if I look for it really hard in dim background areas.
At least I have no negative issues.
Everything is smooth, but it looked that way before :)
drive.google.com/drive/folders/0B0RkAW7Y4oRSd1gtSkFPcXB6RGM
Not tested yet with my XG270HU but for sure I will.
Edit: It's working like a charm!
- less input lag
- an overdrive (well hmm ok lol)
- refresh rate range starts lower than with FreeSync or Adaptive-Sync
- profits? (for NVIDIA of course)
It's billed as high-end, but it's just marketing stuff that adds like $200 for something you may not be able to notice compared to FreeSync screens.
The thing is, G-Sync screens may have less (to no) ghosting compared to some FreeSync 1 monitors; I don't know about plain Adaptive-Sync screens. Outside of a benchmark it may not even be noticeable to some people, by the way.
Edit: I just want to say I'm not a fanboy or anything.
The only ATI card (that's AMD, for the young people) I owned was an X800 GTO, and I bought my screen just for 1440p @ 144 Hz; I never planned to use FreeSync on it. Now that NVIDIA lets me use G-Sync on it, I'm happy, because there's no way I would have bought a G-Sync screen for this kind of small feature.
Small feature, because I think it's only really useful when you drop below 60 fps, to keep the experience smooth. Right now I've tried it in The Division and AC Origins, but I only drop below 60 in The Division with the NVIDIA custom shadows in some areas. I used those shadows just for this test and the sync worked well; that said, anything below 70 fps is a no-go for me (I went back to the ultra shadows instead of the special ones), so I don't really need the FreeSync/G-Sync feature. I enabled it because, why not? It will be good for unoptimized console ports, I guess... until I switch the 1080 Ti for something more powerful.
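A minimal sketch of the frame-pacing effect being described above, assuming a game rendering around 45 fps on a 60 Hz panel (the frame rates and helper functions are illustrative assumptions, not measurements from this thread):

```python
# Illustrative sketch only: compares frame display intervals on a fixed 60 Hz
# panel (V-Sync) versus a variable-refresh panel (G-Sync / Adaptive-Sync)
# when a game renders at ~45 fps. All numbers are assumptions.

def frame_intervals_fixed(n, render_fps=45.0, refresh_hz=60.0):
    """V-Sync on a fixed-refresh panel: a finished frame waits for the next
    vblank, so some frames stay on screen for one refresh period (~16.7 ms)
    and others for two (~33.3 ms) -> visible judder."""
    render_dt = 1000.0 / render_fps
    refresh_dt = 1000.0 / refresh_hz
    shown, ready, vblank = [], 0.0, 0.0
    for _ in range(n):
        ready += render_dt              # frame finishes rendering
        while vblank < ready:           # wait for the next scanout slot
            vblank += refresh_dt
        shown.append(vblank)
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

def frame_intervals_vrr(n, render_fps=45.0):
    """VRR: the panel refreshes when the frame is ready, so every frame is
    shown for the same ~22.2 ms -> even pacing at the same frame rate."""
    return [round(1000.0 / render_fps, 1)] * (n - 1)

print(frame_intervals_fixed(8))  # a mix of ~16.7 ms and ~33.3 ms intervals (judder)
print(frame_intervals_vrr(8))    # constant ~22.2 ms intervals (smooth)
```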
Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself). There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster. That's no fault of the technology though.
Those tests make people think. I never had trust in this G-sync tech by the way.
(Real) competitive players certainly won't play at 4K/60 fps/60 Hz with G-Sync and V-Sync ON, so the small input lag will never be noticeable on a recent monitor in the games where a graphics card might struggle to keep the frame rate high. I say this because input lag can only be an issue in competitive games. I don't know anybody who can actually notice it, even on a TV screen, same as with ghosting, but cyborgs might exist :D
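For a rough sense of the magnitudes being argued about here, a back-of-the-envelope calculation (the refresh rates and the "up to one buffered frame" V-Sync assumption are illustrative, not measurements from this thread):

```python
# Back-of-the-envelope only: how big "one frame" of extra delay is at common
# refresh rates. The assumption that V-Sync can add up to roughly one buffered
# frame of latency is illustrative, not a measurement from this thread.
for hz in (60, 144, 240):
    frame_ms = 1000.0 / hz  # duration of one refresh cycle
    print(f"{hz:>3} Hz: one refresh = {frame_ms:4.1f} ms "
          f"(worst-case extra V-Sync delay on the order of one frame)")
```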
No way can somebody play a competitive game on ultra graphics if it kills the min/max fps, but lies have certainly been told about the G-Sync module.
By the way, I've read it's better to leave NVIDIA V-Sync at its default in the NVIDIA Control Panel BUT turn it OFF in games, so G-Sync can do the job as intended. It appears there is a sort of certification; they said the screens pass something like 300 quality tests, so you'd pay roughly $200 extra to be sure the "thing" works.
Now that they've tested the FreeSync panels, I see no reason to pay that extra if the monitor you want passes the NVIDIA tests (maybe not all 300 of them, I don't know, I didn't read much about those). They list the Acer XG270HU as a 2560x1080 panel... it's 2560x1440, so you can see a lack of professionalism.
I do understand why they kept G-Sync locked on monitors without the module: 1. money / 2. fewer bugs to fix.
Now let's remember that ATI, hmm, AMD said you could pair different cards in your computer for compute (games or anything else), like a 980 Ti + an RX 560X for example. Like SLI without any synchronization. It's all about drivers. Imagine you could pair your 2080 Ti with a 1080 Ti; I don't see why you couldn't. Right now the drivers only let you use one of the cards for PhysX, but think about it, it's a driver lock.
My entire previous post, since you didn't notice, was a rebuttal to the "What solutions do you propose?" bit. Recycling a distraction instead of rebutting a post is another tired rhetorical strategy.
Also, trying to condemn me because of who decides to like my posts is really cheesy, along with language like "dear god".
Maybe a few more posturing bits like "dear god" and bolded text will increase the relevance.
Oh, I see you've added one: "walls of text are pointless". Since it's clear you have nothing on-topic to discuss I'm out.
www.techpowerup.com/forums/threads/nvidia-g-sync-now-supports-freesync-vesa-adaptive-sync-technology.251237/page-6#post-3971901
There is no substance here. Just a lot of words to convey the fact that you don't like the tone of this discussion. OK. We got the memo - except everyone responding to you was already asking for that substance, including myself. We were listening - waiting for you to make your point or drive it home. My 'dear god' got in there because, after a few responses from others, we were still not clear on the point of the post I linked above.
You can leave whenever you want to... but all this was just a bit of miscommunication. Literally nobody responded to your post's content; it's up to you to figure out why. Or you can turn around and leave. All the same to me...