Tuesday, January 7th 2014
AMD Responds to NVIDIA G-Sync with FreeSync
At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display fluidity technology that's an evolution of V-Sync, and one we've seen with our own eyes make a tangible difference. AMD, in a back room of its CES booth, demoed what various sources are calling "FreeSync," a technology competitive with G-Sync, but one that doesn't require specialized hardware or licensing fees from display makers. AMD didn't give out too many details on the finer workings of FreeSync, but here's what we make of it.
FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. The feature lets the GPU spool down the refresh rate to save power without triggering a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate), on supported displays. Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that support it, AMD Catalyst drivers already use dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into their displays.
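To illustrate the basic idea (this is our own sketch, not AMD's code, and the rate table and function names are assumptions), here's a minimal C example of how a driver might pick a lower refresh rate for power savings or low-frame-rate content without a full mode-set:

```c
/* A minimal sketch (hypothetical names, not AMD's code) of the basic idea
 * behind dynamic refresh rates: pick a lower refresh rate when the content
 * doesn't need a high one, without a full display re-initialization. */
#include <stdio.h>
#include <stdbool.h>

/* Assumed: rates the panel reports it can switch between seamlessly. */
static const int supported_hz[] = { 60, 48, 40, 30 };

static int pick_refresh_hz(bool display_idle, int content_fps)
{
    int n = (int)(sizeof supported_hz / sizeof supported_hz[0]);

    if (display_idle)
        return supported_hz[n - 1];   /* lowest rate saves the most power */

    /* Walk from the lowest rate up; take the first one that covers the content. */
    for (int i = n - 1; i >= 0; i--)
        if (supported_hz[i] >= content_fps)
            return supported_hz[i];

    return supported_hz[0];           /* content is faster than the panel's max */
}

int main(void)
{
    printf("idle desktop -> %d Hz\n", pick_refresh_hz(true, 0));
    printf("24 fps video -> %d Hz\n", pick_refresh_hz(false, 24));
    printf("55 fps game  -> %d Hz\n", pick_refresh_hz(false, 55));
    return 0;
}
```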
According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, which is why NVIDIA had to deploy external hardware. Although the results of FreeSync should be close to those of G-Sync, NVIDIA's technology will likely have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow-chart. The goal of both technologies is the same: to make the display's refresh rate follow the GPU's frame rate, rather than the other way around (as with V-Sync).
In AMD's implementation, the VBLANK length (the interval between two refresh cycles during which the GPU isn't putting out new frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds onto a VBLANK until the next frame is received. With G-Sync, the GPU sends out whatever frame rate the hardware can manage, while the monitor handles the "sync" part. With FreeSync, the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead on the host system; in NVIDIA's implementation, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping the cost of implementing the technology down: display makers simply have to implement something that VESA is already deliberating over, and the Toshiba laptops AMD used in its FreeSync demo at CES already do.
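As a rough illustration of the driver-side guesswork described above, here's a minimal C sketch that predicts the next frame time from a short history and derives a VBLANK length from it. The moving-average heuristic, the panel limits, and all names are illustrative assumptions on our part, not AMD's documented method:

```c
/* Minimal sketch of the kind of speculation the article describes on the
 * driver side: predict how long the next frame will take, then stretch
 * VBLANK so the refresh interval matches it. Heuristic and names are
 * illustrative assumptions, not AMD's actual implementation. */
#include <stdio.h>

#define HISTORY 4

/* Predict the next frame time as the average of the last few frame times. */
static double predict_frame_ms(const double *history, int count)
{
    double sum = 0.0;
    for (int i = 0; i < count; i++)
        sum += history[i];
    return sum / count;
}

/* Stretch VBLANK so that (scanout + VBLANK) matches the predicted frame
 * time, clamped to the panel's minimum and maximum refresh intervals. */
static double next_vblank_ms(double predicted_ms, double scanout_ms,
                             double min_interval_ms, double max_interval_ms)
{
    double interval = predicted_ms;
    if (interval < min_interval_ms) interval = min_interval_ms;
    if (interval > max_interval_ms) interval = max_interval_ms;
    return interval - scanout_ms;
}

int main(void)
{
    double recent[HISTORY] = { 18.0, 21.0, 19.5, 20.5 };   /* ~50 fps game */
    double predicted = predict_frame_ms(recent, HISTORY);

    /* Assumed panel limits: 144 Hz max (6.94 ms interval), 30 Hz min
     * (33.3 ms interval), with ~6 ms spent actually scanning out a frame. */
    double vblank = next_vblank_ms(predicted, 6.0, 6.94, 33.3);

    printf("predicted frame time: %.2f ms, VBLANK to program: %.2f ms\n",
           predicted, vblank);
    return 0;
}
```

If the prediction is wrong (the next frame arrives much earlier or later than the guess), the display either refreshes before the frame is ready or repeats the old one, which is the software overhead and quality trade-off the article alludes to.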
Sources:
The TechReport, AnandTech
53 Comments on AMD Responds to NVIDIA G-Sync with FreeSync
Now we know that it's mostly a dud (as with most software optimizations in this area), but it was a hot selling point for their chipset back in the day.
I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.
Barely could get through Uncharted and a few other PS3 games because vsync wasn't enabled due to my HDfury adapter breaking it. No, I don't have an HDCP monitor. I bought the gen before everyone suddenly made it standard. Lousy DRM companies.
Also, vsync does not get rid of stuttering, it creates it, because stuttering is caused by having vsync enabled when your frame rate drops under the monitor's refresh rate.
AMD comes out with a tech that they'll let everyone take advantage of.
It remains to be seen which one is superior. I'm guessing NV.
: www.google.com/patents/US20080055318
It's called Dynamic frame rate adjustment.
Perhaps this is why AMD doesn't care too much about G-Sync,
'cuz they had it 11 years ago.
...oh wait.
I love anything that is free btw
Having vsync turned on with a badly coded game engine COULD cause stuttering. That's the game's fault, since the developers assumed no one had hardware fast enough to run past 60 FPS and thus never looked into the issue.
AMD are just being clueless here. NVIDIA demos its polished G-Sync, probably on some real nice hardware. AMD clearly rushed out a demo on a crappy little laptop, showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on some high-end hardware in an AAA title.
Just a few examples:
False: fudzilla.net/home/item/33570-kaveri-presentation-leaked-fails-to-impress
True: fudzilla.net/home/item/33558-nvidia-heading-for-a-spanking-this-year
So very, very, very typical of ATi... history repeats itself for the nth time. It sounds like pretty much every baseline hardware block/GPU-use implementation outside of hardware T&L for the last decade: ATi takes an idea, implements it, and pushes for it to become a standard in DX/OGL while it goes unused (because, by definition, an invention is initially proprietary); nvidia makes a version much later that's based on that initial idea but developed further and pushed harder (because of marketing, or because newer fab processes afford them the space to implement it), usually at that point in a needlessly proprietary manner, and then eventually it becomes a standard.
Another entry in ATi's list of forward-thinking but poorly capitalized-upon technologies. May they forever be the guinea pigs that break the ice and allow nvidia to publicize it, so we all eventually benefit. Hopefully the open version of the tech, now that it's in fashion, is further realized and adopted.
I'mma file this right next to TRUFORM and CTM, and hope this turns out equally as well as those eventually did and will.
I mean, there really aren't a lot of options here: you either match the monitor to the GPU, or you match the GPU to the monitor. Since you can't just magically "improve" performance on the GPU, this is a hardware problem, not a software one.
I get that G-sync works by adjusting the monitor's refresh rate to match the GPU's frame rate - OK, fairly simple and straightforward, I get this.
But what is AMD's method here? According to my understanding of the article, AMD tries to insert "blank/fake" frames to artificially boost the frame rate to match that of the monitor. Is that what they are doing here?
If I remember correctly, this is exactly the same method Lucidlogix tried to implement with their Virtu software, which was nothing but a gimmick. It only 'looked' good in benchmark scores, but it didn't fix the problem; if anything, it created more stuttering and decreased overall performance.