Wednesday, August 29th 2018

NVIDIA GPUs Can be Tricked to Support AMD FreeSync

Newer generations of NVIDIA GPUs such as "Pascal" and "Maxwell" meet or exceed the hardware requirements of AMD FreeSync, as their DisplayPort connectors support the optional Adaptive-Sync feature introduced with DisplayPort 1.2a, on which FreeSync is built. In a bid to promote its own G-SYNC technology, however, NVIDIA doesn't expose this capability to monitors or software that support FreeSync. Redditor "bryf50" may have found a way around this. The trick is deceptively simple, but you'll need a game that supports on-the-fly switching of the rendering GPU, and an AMD Radeon graphics card at hand.

While poking around in the system settings of "World of Warcraft: Battle for Azeroth," bryf50 discovered that you can switch the "rendering GPU" on the fly, without having to physically connect your display to the newly selected GPU. You can start the game with your display connected to VGA1 (an AMD Radeon GPU), then switch the renderer in-game to VGA2 (an NVIDIA GPU). FreeSync should continue to work, while you enjoy the performance of the NVIDIA GPU. In theory, this should allow you to pair your high-end GTX 1080 Ti with a $50 RX 550 that supports FreeSync, instead of paying the $200+ G-SYNC tax.
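
The report itself contains no code, but the mechanism it describes, rendering on one GPU while a different GPU (the one the FreeSync monitor is plugged into) keeps driving the display, corresponds to something a Windows game can do through ordinary DXGI adapter selection. Below is a minimal, hypothetical C++/D3D11 sketch of that idea; the vendor-ID check, the file name, and the overall structure are illustrative assumptions on our part, not anything published by bryf50 or Blizzard.

```cpp
// Hypothetical sketch: create the D3D11 rendering device on one adapter
// (e.g. a GeForce card) even though the monitor is physically attached to
// another adapter (e.g. a cheap FreeSync-capable Radeon).
// Build with: cl /EHsc pick_adapter.cpp d3d11.lib dxgi.lib
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> renderAdapter;  // the GPU we want to render on
    for (UINT i = 0; ; ++i)
    {
        ComPtr<IDXGIAdapter1> adapter;
        if (factory->EnumAdapters1(i, &adapter) == DXGI_ERROR_NOT_FOUND)
            break;  // no more adapters

        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        printf("Adapter %u: %ls (vendor 0x%04X)\n", i, desc.Description, desc.VendorId);

        // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = AMD. Pick the NVIDIA
        // adapter for rendering, regardless of which adapter the display
        // cable is plugged into.
        if (desc.VendorId == 0x10DE && !renderAdapter)
            renderAdapter = adapter;
    }
    if (!renderAdapter)
        return 1;  // no NVIDIA GPU present

    // When an explicit adapter is passed, the driver type must be
    // D3D_DRIVER_TYPE_UNKNOWN.
    ComPtr<ID3D11Device> device;
    ComPtr<ID3D11DeviceContext> context;
    if (FAILED(D3D11CreateDevice(renderAdapter.Get(), D3D_DRIVER_TYPE_UNKNOWN,
                                 nullptr, 0, nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &context)))
        return 1;

    // A game would now create its swap chain for the window shown on the
    // FreeSync monitor; Windows copies each finished frame to the adapter
    // that owns that output, which continues to drive the display.
    printf("Rendering device created on the NVIDIA adapter.\n");
    return 0;
}
```

Whether FreeSync actually stays engaged when the frames come from another adapter is exactly the part bryf50's report leaves unverified, and it may depend on driver behavior rather than anything the application controls.
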
Sources: Reddit, PC Perspective

94 Comments on NVIDIA GPUs Can be Tricked to Support AMD FreeSync

#26
StrayKAT
medi01: If you'd read the OP, you'd notice it is done using the graphics card integrated into an APU.


You must be new here. nVidia has its own level, named after its CEO.
I got sidetracked by the OP mentioning a 550+1080Ti combo, I guess. :)
#27
bug
This whole debacle would go away if VESA would buck up and make FreeSync a mandatory part of DisplayPort.
#28
londiste
bug: This whole debacle would go away if VESA would buck up and make FreeSync a mandatory part of DisplayPort.
You mean DisplayPort Adaptive-Sync, right? :)
As well as HDMI 2.1 with its (again optional) VRR.
#29
StrayKAT
bug: This whole debacle would go away if VESA would buck up and make FreeSync a mandatory part of DisplayPort.
That's what it is? Although they call it Adaptive Sync. What am I missing?
#30
John Naylor
In theory, this should allow you to pair your high-end GTX 1080 Ti with a $50 RX 550 that supports FreeSync, instead of paying the $200+ G-SYNC tax.
By G-Sync tax I assume you mean the hardware module missing from FreeSync monitors unless the monitor manufacturer provides it? Yeah, the module that adds a significant increase in cost when those manufacturers do provide it? Buying a FreeSync monitor to run an nVidia card is like going off-roading in a vehicle without 4WD. No ULMB, no thanks.
#31
StrayKAT
John Naylor: By G-Sync tax I assume you mean the hardware module missing from FreeSync monitors unless the monitor manufacturer provides it? Yeah, the module that adds a significant increase in cost when those manufacturers do provide it? Buying a FreeSync monitor to run an nVidia card is like going off-roading in a vehicle without 4WD. No ULMB, no thanks.
It's not like the cards couldn't comply. They just wanted to own the display market.

edit: For the record, I think they deserved that chance. They just failed.
#33
londiste
Adaptive Sync has been in embedded DisplayPort since 2009. It was added to DisplayPort 1.2a in 2014 as an optional feature.
Btw, GSync in laptops uses eDP's Adaptive Sync.
#34
bug
londiste: Adaptive Sync has been in embedded DisplayPort since 2009. It was added to DisplayPort 1.2a in 2014 as an optional feature.
Btw, GSync in laptops uses eDP's Adaptive Sync.
Yeah, I meant it was defined and functional. I'm pretty sure HDMI had something similar as well, but I can't dig anything up atm.
#35
londiste
bug: Yeah, I meant it was defined and functional. I'm pretty sure HDMI had something similar as well, but I can't dig anything up atm.
HDMI did not have VRR support until the upcoming 2.1. Both GSync and FreeSync are doing proprietary stuff for HDMI support.
#36
Ferrum Master
The only one who can actually scold NVIDIA and bring them to their knees is Microsoft itself, for screwing around inside Microsoft's OS and monetizing OS feature fragmentation.

Currently, G-Sync is already a pain in the arse for them; it is constantly broken in Insider builds because it interacts with the whole display driver model.

Microsoft already made such a move in the past by spanking Creative.

Somebody needs to have a serious talk with them. Most importantly, the user experience suffers from this proprietary nonsense: when buying a panel you have to weigh G-Sync, FreeSync and so on, consumer choices are limited, and most people don't even understand that. A pure circus from both AMD and nVidia tbh... there are things that should be left alone and common, just like graphics APIs.
#37
StrayKAT
Ferrum Master: The only one who can actually scold NVIDIA and bring them to their knees is Microsoft itself, for screwing around inside Microsoft's OS and monetizing OS feature fragmentation.

Currently, G-Sync is already a pain in the arse for them; it is constantly broken in Insider builds because it interacts with the whole display driver model.

Microsoft already made such a move in the past by spanking Creative.

Somebody needs to have a serious talk with them. Most importantly, the user experience suffers from this proprietary nonsense: when buying a panel you have to weigh G-Sync, FreeSync and so on, consumer choices are limited, and most people don't even understand that. A pure circus from both AMD and nVidia tbh... there are things that should be left alone and common, just like graphics APIs.
So apparently it's not just Linus with the problems :P

#38
londiste
GSync's days are numbered the moment DP Adaptive Sync or more likely HDMI VRR (due to TVs using HDMI) starts being mandatory. Unfortunately, this is not too likely right now because there will be low-end monitor/TV manufacturers who will not want to have a (more expensive) scaler capable of that in their displays. We will get there some day though.
#39
StrayKAT
londiste: GSync's days are numbered the moment DP Adaptive Sync or more likely HDMI VRR (due to TVs using HDMI) starts being mandatory. Unfortunately, this is not too likely right now because there will be low-end monitor/TV manufacturers who will not want to have a (more expensive) scaler capable of that in their displays. We will get there some day though.
Well, for now, I get to use FreeSync over HDMI as well (it's on all new Samsungs). 4K is limited to 60 Hz though... unlike those new G-Sync BFGDs that are coming.
#40
notb
StrayKAT: Sure (and I wish they did), but I'm just talking about the topic at hand. Nvidia. These slots don't grow on trees!
Are PCIe slots so valuable, really? :-)
I'm on mITX, so that's a different story. But based on System Specs most people here use ATX, so...?
#41
StrayKAT
notb: Are PCIe slots so valuable, really? :)
I'm on mITX, so that's a different story. But based on System Specs most people here use ATX, so...?
They are to me..

I'm using ATX, but I'm kind of limited (of course, I'm also using a Core-X chip without all the PCIe lanes the i9 has... so one of my slots is disabled as it is... and this Vega hogs up the space of two slots).
#42
Vayra86
cucker tarlson: No ULMB, no thanks.
LOL, you realize G-Sync and ULMB are never active together, do you?

If you buy into G-Sync for the ULMB, you've lost the plot...
#43
cucker tarlson
Vayra86: LOL, you realize G-Sync and ULMB are never active together, do you?

If you buy into G-Sync for the ULMB, you've lost the plot...
I'm going to leave that alone rather than engage in the dumb conversation you've just tried to start.
You have to realize that other people have a brain too, not just you.
#44
Vayra86
OK..

It's still an important little detail when it comes to emulating FreeSync on an NV card.
#45
cucker tarlson
Vayra86: OK..

It's still an important little detail when it comes to emulating FreeSync on an NV card.
What is an important detail?
#46
Vayra86
cucker tarlson: What is an important detail?
That you aren't missing anything FreeSync would offer if you lack ULMB. But I see you are in resistance mode; that's fine, just drop it.
#47
cucker tarlson
Vayra86: That you aren't missing anything FreeSync would offer if you lack ULMB. But I see you are in resistance mode; that's fine, just drop it.
No, but you are not getting 100% functionality by getting a FreeSync monitor over a G-Sync one either. And strobing is a BIG one.

If anything, it's you that needs to drop the condescending tone, not just here but in general. No one was talking about ULMB in G-Sync mode until you tried to convince people here that I was.
#48
bug
Vayra86: LOL, you realize G-Sync and ULMB are never active together, do you?

If you buy into G-Sync for the ULMB, you've lost the plot...
The caveat is that when people can sustain high fps, they tend to prefer ULMB over adaptive refresh rates.
#49
Vayra86
cucker tarlson: No, but you are not getting 100% functionality by getting a FreeSync monitor over a G-Sync one either. And strobing is a BIG one.

If anything, it's you that needs to drop the condescending tone, not just here but in general. No one was talking about ULMB in G-Sync mode until you started thinking I was.
Fair enough, but that's not the topic here. The topic is FreeSync and opening up the use of those monitors for NVIDIA cards. And on top of that, you can find monitors with strobing and no G-Sync too.

Relax man. Shit.

You first posted with "no ULMB, no thanks"... which really is off-topic here.
#50
cucker tarlson
Well, IS FreeSync really working properly? Or is it just one guy in one game screaming "it works!"? Maybe it's the monitor OSD malfunctioning, displaying FreeSync mode when it's not active. The only way is to verify it empirically; readings can be wrong.