Wednesday, August 29th 2018

NVIDIA GPUs Can be Tricked to Support AMD FreeSync

Newer generations of NVIDIA GPUs, such as "Pascal" and "Maxwell," meet or exceed the hardware requirements of AMD FreeSync, as their DisplayPort connectors include the features of DisplayPort 1.2a required for VESA Adaptive-Sync. In a bid to promote its own G-SYNC technology, NVIDIA doesn't expose this capability to monitors or software that support FreeSync. Redditor "bryf50" may have found a way around this. The trick is deceptively simple; however, you'll need games that support on-the-fly switching of the rendering GPU, and an AMD Radeon graphics card at hand.

While poking around with system settings in "World of Warcraft: Battle for Azeroth," bryf50 discovered that you can switch the "rendering GPU" on the fly, without having to physically connect your display to the newly selected GPU. You can start the game with your display connected to VGA1 (an AMD Radeon GPU), and switch the renderer in-game to VGA2 (an NVIDIA GPU). FreeSync continues to work, while you enjoy the performance of the NVIDIA GPU. In theory, this should let you pair your high-end GTX 1080 Ti with a $50 RX 550 that supports FreeSync, instead of paying the $200+ G-SYNC tax.
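As an aside, a loosely analogous split between rendering GPU and display GPU is officially supported on Linux: Mesa and NVIDIA's proprietary driver expose "render offload" environment variables that render an application on one GPU while the display is driven by another. A minimal sketch, using the variable names documented by Mesa and NVIDIA (driver 435 or newer for the NVIDIA variables); actual behavior depends on your dual-GPU setup:

```shell
# Mesa PRIME render offload: render on the secondary GPU,
# while the display stays on the GPU the monitor is plugged into.
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# NVIDIA's documented render-offload variables: render on the
# NVIDIA GPU, output through the GPU driving the display.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
```

Whether variable-refresh signaling survives such offload depends on the output GPU's driver, which is exactly the property the trick above exploits on Windows.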
Sources: Reddit, PC Perspective

94 Comments on NVIDIA GPUs Can be Tricked to Support AMD FreeSync

#2
SetsunaFZero
do I see a NGreedia Hotfix driver incoming
#3
londiste
No, Nvidia GPUs cannot be tricked into supporting FreeSync.
An image rendered by the Nvidia GPU can be output through a Radeon GPU that inherently supports FreeSync.
#4
eidairaman1
The Exiled Airman
SetsunaFZero: do I see a NGreedia Hotfix driver incoming
Forced firmware update on GPU.
#5
londiste
I do not really see a good way for Nvidia to "fix" this without shooting themselves in the foot. Improved support for multiple GPUs, including selecting the rendering GPU separately from the output GPU, should be a standard Windows (10) feature.

There is slightly increased input lag (completely irrelevant at around 7 ms more, according to PC Perspective's tests) and possibly a slight performance hit; so far, all solutions that buffer output through a different (i)GPU have had one. Other than that, awesome!
#6
StrayKAT
Even if someone wanted to do this, it sounds like a good waste of a [precious] PCIe slot.
#7
eidairaman1
The Exiled Airman
londiste: I do not really see a good way for Nvidia to "fix" this without shooting themselves in the foot. Improved support for multiple GPUs, including selecting the rendering GPU separately from the output GPU, should be a standard Windows (10) feature.

There is slightly increased input lag (completely irrelevant at around 7 ms more, according to PC Perspective's tests) and possibly a slight performance hit; so far, all solutions that buffer output through a different (i)GPU have had one. Other than that, awesome!
Nvidia would do it to force G-Sync.
#8
cucker tarlson
We don't know if this thing really works, or if the guy just thinks it does. The monitor can show it's working in FreeSync mode, but is it really?
#10
HimymCZe
... never understood that tax...
You sit through 4+ hours of conference full of $h!t about what their HW will support,
but IF you wanna utilize it, you need multiple $200 "DLCs"
#11
notb
StrayKAT: Even if someone wanted to do this, it sounds like a good waste of a [precious] PCIe slot.
Not really. If the Intel IGP supported FreeSync (being built on an AMD chip or not), you could do this without the second card.
#14
StrayKAT
notb: Not really. If the Intel IGP supported FreeSync (being built on an AMD chip or not), you could do this without the second card.
Sure (and I wish they did), but I'm just talking about the topic at hand. Nvidia. These slots don't grow on trees!
#15
cucker tarlson
What is the cheapest used GPU that supports FreeSync?

BTW, you can connect it to an x1 slot via a riser.
#16
StrayKAT
cucker tarlson: What is the cheapest used GPU that supports FreeSync?

BTW, you can connect it to an x1 slot via a riser.
The R7 260, I think? Could be wrong... edit: apparently the HD series supports it, but only for video or power-saving features.
#17
ShurikN
GCN 2 and newer
The cheapest one should be 7790
#19
Dammeron
eidairaman1: Forced firmware update on GPU.
Yep. They already did it once, when people were pairing a high-end Radeon card with a cheap GeForce to handle PhysX. Then nV blocked the drivers so they would not work if an AMD card was detected.
#20
cucker tarlson
Dammeron: Yep. They already did it once, when people were pairing a high-end Radeon card with a cheap GeForce to handle PhysX. Then nV blocked the drivers so they would not work if an AMD card was detected.
Can't do it now, when multi-GPU is working in several games and AMD has APUs.
#21
Octopuss
Dammeron: Yep. They already did it once, when people were pairing a high-end Radeon card with a cheap GeForce to handle PhysX. Then nV blocked the drivers so they would not work if an AMD card was detected.
What.
That's next level of being assholes.
#22
TheDeeGee
So it's not "Free" cuz you need to buy a second GPU.
#23
medi01
StrayKAT: Even if someone wanted to do this, it sounds like a good waste of a [precious] PCIe slot.
If you'd read the OP, you'd notice it is done using the graphics card integrated into an APU.
Octopuss: What.
That's next level of being assholes.
You must be new here. nVidia has its own level, named after its CEO.
#24
GreiverBlade
Well, duh... of course they can... and I am 100% sure, even without needing an AMD GPU to do the trick... Unlike G-Rapsync errrr G-Sync, FreeSync is a royalty-free open standard, so Nvidia could even support it if they weren't total greedy @$$.
TheDeeGee: So it's not "Free" cuz you need to buy a second GPU.
uh? ... well ... with G-Sync you don't even need a second GPU to pay more :laugh:
medi01: If you'd read the OP, you'd notice it is done using the graphics card integrated into an APU.
And here I was about to answer: what waste? If not used (SLI/CFX aren't technically useful lately), PCIe x16 slots are a waste of space... I have 2 x16 and 3 x1 slots completely free: I don't like SLI, I don't need a sound card, PCIe SSDs are too expensive, and not many daughter cards that use an x1 slot are useful to me.
#25
medi01
GreiverBlade: uh? ... well ... with G-Sync you don't even need a second GPU to pay more :laugh:
:roll: