Wednesday, August 29th 2018
NVIDIA GPUs Can be Tricked to Support AMD FreeSync
Newer generations of NVIDIA GPUs such as "Pascal" and "Maxwell" meet or exceed the hardware requirements of AMD FreeSync, as they feature DisplayPort connectors that include the features of DisplayPort 1.2a, required for VESA Adaptive-Sync. In a bid to promote its own G-SYNC technology, NVIDIA doesn't expose this capability to monitors or software that support FreeSync. Redditor "bryf50" may have found a way around this. The trick is deceptively simple; however, you'll need games that support on-the-fly switching of the rendering GPU, and an AMD Radeon graphics card at hand.
When poking around with the system settings in "World of Warcraft: Battle for Azeroth," bryf50 discovered that you can switch the "rendering GPU" on the fly, without having to physically connect your display to the newly selected GPU. You can start the game with your display connected to VGA1 (an AMD Radeon GPU), and switch the renderer in-game to VGA2 (an NVIDIA GPU). FreeSync should continue to work, while you enjoy the performance of the NVIDIA GPU. In theory, this should allow you to pair your high-end GTX 1080 Ti with a $50 RX 550 that supports FreeSync, instead of paying the $200+ G-SYNC tax.
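The trick boils down to rendering on one adapter while the display output (and its adaptive-sync link) stays on another. Below is a minimal sketch, assuming Windows with D3D11/DXGI, of how an application can enumerate GPUs and create its rendering device on an adapter other than the one the monitor is attached to; the adapter index picked here is purely hypothetical and says nothing about how Blizzard implements its in-game GPU switch.

    // Minimal sketch (assumptions: Windows, MSVC, D3D11, DXGI 1.1).
    // Enumerates all adapters and creates the D3D11 device on a chosen one,
    // which need not be the adapter the monitor is physically connected to.
    #include <dxgi.h>
    #include <d3d11.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")
    #pragma comment(lib, "d3d11.lib")

    int main()
    {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"Adapter %u: %s\n", i, desc.Description);

            // Hypothetical choice: render on adapter 1 (e.g. the GeForce card)
            // while the FreeSync display stays attached to adapter 0 (the Radeon).
            if (i == 1)
            {
                ID3D11Device* device = nullptr;
                ID3D11DeviceContext* ctx = nullptr;
                HRESULT hr = D3D11CreateDevice(
                    adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                    nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &ctx);
                if (SUCCEEDED(hr))
                    wprintf(L"  -> created rendering device on this adapter\n");
                if (ctx) ctx->Release();
                if (device) device->Release();
            }
            adapter->Release();
        }
        factory->Release();
        return 0;
    }

The presented frames still have to be copied to the adapter that owns the display output, which is exactly what Windows' cross-adapter presentation path (and, apparently, the in-game renderer switch) handles behind the scenes.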
Sources:
Reddit, PC Perspective
94 Comments on NVIDIA GPUs Can be Tricked to Support AMD FreeSync
As well as HDMI 2.1 with its (again optional) VRR.
edit: For the record, I think they deserved that chance. They just failed.
Btw, GSync in laptops uses eDP's Adaptive Sync.
Currently, G-Sync is already a pain in the arse for them; it is constantly broken in Insider builds because it interacts with the whole display driver model.
Microsoft already made such a move in the past by spanking Creative.
Somebody needs to have a serious talk about this. Most importantly, the user experience suffers from this proprietary nonsense: when buying a panel you have to choose between G-Sync, FreeSync, etc., consumer choices are limited, and most people don't even understand the difference. A pure circus from both AMD and NVIDIA tbh... some things should be left common and standard, just like graphics APIs.
I'm on mITX, so that's a different story. But based on System Specs most people here use ATX, so...?
I'm using ATX, but I'm kind of limited (of course, I'm also using a Core-X without all the PCIe lanes the i9 has... so one of my slots is disabled as it is, and this Vega takes up the space of two slots).
If you buy into G-Sync for the ULMB, you've lost the plot...
You have to realize that other people have a brain too, not just you.
It's still an important little detail when it comes to emulating FreeSync on an NVIDIA card.
If anything, it's you that needs to drop the condescending tone, not just here, but in general. No one was talking about ULMB in G-Sync mode until you tried to convince people here that I was.
Relax man. Shit.
You first posted with "no ULMB, no thanks"... which really is off-topic here.