Wednesday, August 29th 2018
NVIDIA GPUs Can be Tricked to Support AMD FreeSync
Newer generations of NVIDIA GPUs such as "Pascal" and "Maxwell" meet or exceed the hardware requirements of AMD FreeSync, as they feature DisplayPort 1.4 connectors that include the features of DisplayPort 1.2a required for VESA Adaptive-Sync. In a bid to promote its own G-SYNC technology, NVIDIA doesn't expose this capability to monitors or software that support FreeSync. Redditor "bryf50" may have found a way around this. The trick is deceptively simple; however, you'll need games that support on-the-fly switching of the rendering GPU, and an AMD Radeon graphics card at hand.
When poking around with system settings in "World of Warcraft: Battle for Azeroth," bryf50 discovered that you can switch the "rendering GPU" on the fly, without having to physically connect your display to the newly selected GPU. You can start the game with your display connected to VGA1 (an AMD Radeon GPU), and switch the renderer in-game to VGA2 (an NVIDIA GPU). FreeSync should continue to work, while you enjoy the performance of that NVIDIA GPU. In theory, this should allow you to pair your high-end GTX 1080 Ti with a $50 RX 550 that supports FreeSync, instead of paying the $200+ G-SYNC tax.
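For the curious, here is a minimal sketch (C++ with DXGI/Direct3D 11) of the same idea at the API level: enumerate the installed GPUs, note which one actually owns a display output, and create the rendering device on a different one, letting Windows' hybrid-GPU presentation path copy finished frames to the GPU the monitor is plugged into. This is not bryf50's tool or the game's code; the adapter-selection logic is an assumption for illustration only.

#include <d3d11.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> renderAdapter; // the GPU we want to render on
    for (UINT i = 0; ; ++i)
    {
        ComPtr<IDXGIAdapter1> adapter;
        if (factory->EnumAdapters1(i, &adapter) == DXGI_ERROR_NOT_FOUND)
            break;

        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);

        // Does a monitor hang off this adapter? (EnumOutputs succeeds only then.)
        ComPtr<IDXGIOutput> output;
        bool ownsDisplay = SUCCEEDED(adapter->EnumOutputs(0, &output));
        wprintf(L"Adapter %u: %s, owns display: %d\n", i, desc.Description, ownsDisplay);

        // Pick the first adapter that does NOT drive the monitor as the renderer,
        // mirroring "render on the NVIDIA card, display through the Radeon".
        if (!ownsDisplay && !renderAdapter)
            renderAdapter = adapter;
    }
    if (!renderAdapter)
        return 1;

    // Create the D3D11 device on the non-display adapter; the desktop compositor
    // handles the cross-adapter copy to the display GPU at presentation time.
    ComPtr<ID3D11Device> device;
    D3D_FEATURE_LEVEL fl;
    HRESULT hr = D3D11CreateDevice(renderAdapter.Get(), D3D_DRIVER_TYPE_UNKNOWN,
                                   nullptr, 0, nullptr, 0, D3D11_SDK_VERSION,
                                   &device, &fl, nullptr);
    return SUCCEEDED(hr) ? 0 : 1;
}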
Sources:
Reddit, PC Perspective
94 Comments on NVIDIA GPUs Can be Tricked to Support AMD FreeSync
www.techpowerup.com/245463/nvidia-g-sync-hdr-module-adds-usd-500-to-monitor-pricing
I used to have "TriFire" HD 5850s and, if I remember right, ran a GTX 260 as the PhysX card.
TL;DR: NVIDIA isn't doing FreeSync in these cases; it's rendering the frames, and Windows is transferring them to a FreeSync-enabled GPU, which signals the adaptive sync features of the panel.
There's absolutely no technical reason why AMD couldn't drive G-Sync and NVIDIA couldn't drive FreeSync. It's all software and legalese.
edit:
AMD --
"Adaptive-Sync is the underlying standard, but makes no qualitative demands of adopters. As far as the spec is concerned, a 2Hz range is Adaptive-Sync, for example. Obviously that is useless for gaming. We have specific tests for backlight bleed, DRR range, motion blur, backlight flicker, pixel persistence, etc. We want monitors that bear our brand and logo to clear a certain quality threshold for the users that might buy them. To put it bluntly: the rubric is not available because this is a competitive industry and we're not interested in having our rubric needlessly and pettily nitpicked."
Still, FreeSync is AMD's name for their implementation. There's no standards body that adopted FreeSync that I know of.
A mandated wide frequency range is one thing Nvidia definitely got right with G-Sync from the get-go, along with frame doubling (a rough sketch of that follows below).
Granted though, you're right about the rest (I don't have experience working with standards :p).
edit: Yes, de facto standard. Brainfart. :)
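The frame doubling mentioned above works out to simple arithmetic: once the frame rate drops below the panel's minimum VRR rate, each frame is scanned out two or more times so the refresh rate lands back inside the supported window. A rough C++ illustration, assuming an example 48-144 Hz panel (real drivers use more elaborate heuristics):

#include <cstdio>

// Returns the refresh rate the scanout could run at for a given frame rate.
double effectiveRefresh(double fps, double panelMinHz, double panelMaxHz)
{
    if (fps >= panelMinHz)
        return fps;                        // inside the VRR window: refresh tracks frame rate
    int multiplier = 2;
    while (fps * multiplier < panelMinHz)  // show each frame 2x, 3x, ... as needed
        ++multiplier;
    double hz = fps * multiplier;
    return hz <= panelMaxHz ? hz : panelMaxHz;
}

int main()
{
    // A 30 fps frame on a 48-144 Hz panel is scanned out twice, at 60 Hz.
    printf("30 fps -> %.0f Hz\n", effectiveRefresh(30.0, 48.0, 144.0));
    // 20 fps needs tripling: 3 x 20 fps = 60 Hz.
    printf("20 fps -> %.0f Hz\n", effectiveRefresh(20.0, 48.0, 144.0));
    return 0;
}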
"In essence, a standard is an agreed way of doing something. It could be about making a product, managing a process, delivering a service or supplying materials—standards can cover a huge range of activities undertaken by organizations and used by their customers. " well yes ... G-Sync is another one ...
The key sentence was actually this one:
"... include several industry standards and proprietary standards: AMD FreeSync, Nvidia G-Sync, VESA Adaptive Sync, HDMI 2.1 VRR, Apple ProMotion"
Standard branding, then... :laugh: Oh well, who cares about standards... only VESA VRR is, okay, a "de facto standard" then... (well, it still includes the word standard... :roll: )
I didn't just copy-paste, no worries :p (removed all the hyperlinks... that was a p.i.t.a. :laugh: ). Actually, nothing legal is involved... Nvidia just doesn't support the open, royalty-free equivalent of their tech (well... :laugh: not really equivalent, since it doesn't need supplementary hardware to work), partially because G-Sync has a wider rate range, but mostly because it adds $200-500 to their pocket for each monitor sold with it... (an overpriced frame buffer... okay, okay, it also has an SoC and a few gimmicks to make it seem worth it...)
I'm not just arguing for the sake of it, since the definition of "standard" is vague at best :laugh: