Tuesday, January 7th 2014

AMD Responds to NVIDIA G-Sync with FreeSync

At CES, various display makers exhibited their gaming-grade monitors featuring NVIDIA G-Sync, a display-fluidity technology that's an evolution of V-Sync, and one we've seen with our own eyes make a tangible difference. AMD, at the back room of its CES booth, demoed what various sources are calling "FreeSync," a technology competitive with G-Sync, but one that doesn't require specialized hardware or licensing fees from display makers. AMD didn't give out many details on the finer workings of FreeSync, but here's what we make of it.

FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. The feature allows the GPU to spool down the display's refresh rate to save power, without entailing a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate), on supported displays. Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that support it, AMD Catalyst drivers already run dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into their displays.
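As a rough illustration of the power-saving side of dynamic refresh, here's a minimal sketch (our own, not AMD's driver code) of a driver picking the lowest refresh rate that still covers what's on screen, assuming a hypothetical panel that can switch between a handful of rates without a mode re-initialization:

```python
# Illustrative sketch only -- not AMD's implementation. Assumes a hypothetical
# panel that accepts these refresh rates without a mode-set (and thus no flicker).
SUPPORTED_RATES_HZ = [30, 48, 60]

def pick_refresh_rate(content_fps: float) -> int:
    """Return the lowest supported refresh rate that still covers the content,
    trading refresh activity (and power) for headroom."""
    for rate in sorted(SUPPORTED_RATES_HZ):
        if rate >= content_fps:
            return rate
    return max(SUPPORTED_RATES_HZ)

print(pick_refresh_rate(15))   # 30 -- mostly static desktop, drop to 30 Hz
print(pick_refresh_rate(45))   # 48 -- enough for 45 FPS video playback
print(pick_refresh_rate(75))   # 60 -- fast game content, stay at the panel's maximum
```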

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware. Although the results of FreeSync will be close to those of G-Sync, NVIDIA's technology will have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow-chart. The goal of both technologies is the same: to make the display's refresh rate a slave to the GPU's frame-rate, rather than the other way around (as with V-Sync).

In AMD's implementation, the VBLANK length (the interval between two refresh cycles during which the GPU isn't putting out a "new" frame, a sort of placebo frame) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds its VBLANK until the next frame is received. With NVIDIA, the GPU sends out whatever frame-rate the hardware can manage, while the monitor handles the "sync" part. With AMD, the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead for the host system; in NVIDIA's implementation, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping implementation costs down: display makers simply have to implement something that VESA is already deliberating over, and the Toshiba laptops AMD used in its FreeSync demo at CES already support it.
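To make the contrast concrete, here's a rough simulation (purely our own sketch, not code from either vendor) of the two strategies described above: a driver that has to guess the next frame's VBLANK length from recent frame times, versus a display that simply holds its VBLANK until the next frame arrives:

```python
import random

random.seed(1)
# Simulated GPU frame times in milliseconds for a game hovering around 40-55 FPS.
frame_times_ms = [random.uniform(18.0, 26.0) for _ in range(10)]

def speculative_vblank(frame_times):
    """Driver-side speculation (the FreeSync-style approach described above): the
    driver predicts how long to stretch VBLANK for the upcoming frame; a naive
    predictor assumes the next frame will take as long as the previous one."""
    guess_ms = 1000 / 60.0        # start out assuming 60 FPS
    mispredictions = 0
    for actual_ms in frame_times:
        if actual_ms > guess_ms:  # frame wasn't ready when the refresh fired,
            mispredictions += 1   # so the previous frame gets shown again
        guess_ms = actual_ms      # predict the next frame from this one
    return mispredictions

def hold_until_frame(frame_times):
    """Display-side waiting (the G-Sync-style approach described above): the panel
    holds VBLANK until the frame shows up, so no prediction is needed."""
    return 0                      # every refresh carries a freshly completed frame

print("refreshes showing a stale frame (speculative):", speculative_vblank(frame_times_ms))
print("refreshes showing a stale frame (hold-until-frame):", hold_until_frame(frame_times_ms))
```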
Sources: The TechReport, AnandTech

53 Comments on AMD Responds to NVIDIA G-Sync with FreeSync

#1
arterius2
btarunrFreeSync taps into a lesser known feature that AMD Radeon GPUs have had for the past three generations (i.e. since Radeon HD 5000 series), called dynamic refresh rates.
Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.
btarunr...the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware.
Oh, you mean like the adaptive v-sync setting in the Nvidia control panel?

NVIDIANVIDIA Adaptive VSync makes your gaming experience smoother and more responsive by eliminating frame rate stuttering and screen tearing
#2
btarunr
Editor & Senior Moderator
arterius2Past 3 generations eh? So they waited until now to tell us this?
Yes, because nobody cared about dynamic refresh rates until now (with G-Sync).
#3
RCoon
arterius2Past 3 generations eh? So they waited until now to tell us this? I call BS, typical AMD PR stunt.
Why so surprised? AMD had something useful and they didn't market it, because their marketing skills are abysmal, and they assumed nobody would care about it. Kinda like that entire section they sold off that's now incredibly successful. They're only quietly marketing it now because NVidia is set to make a killing on GSync monitors.
#4
buggalugs
BAM!!....and there they are again lol.
#5
arterius2
btarunrYes, because nobody cared about dynamic refresh rates until now (with G-Sync).
What makes you think nobody cared about it? People have been complaining about vsync on/off tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu anyone?
#6
Solidstate89
arterius2What makes you think nobody cared about it? people have been complaining over vsync tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu anyone?
What did Virtu do for refresh rates? I thought they only worked on hybridizing GPUs.
#7
arterius2
Solidstate89What did Virtu do for refresh rates? I thought they only worked on hybridizing GPUs.
google "hyperformance" and "virtual vsync"

Now we know that it's mostly a dud (as with most software optimizations in this area), but it was a hot selling point for their chipsets back in the day.
#8
SIGSEGV
arterius2What makes you think nobody cared about it? people have been complaining over vsync tearing/stuttering issues ever since its implementation. Not to name any names, but Lucid Virtu anyone?
Lucid Virtu makes almost all of my games run with horrible stuttering, even on a single monitor. It's only good at creating wonderful benchmark scores. I guess no one uses Lucid Virtu while playing games.
#9
NC37
arterius2What makes you think nobody cared about it? people have been complaining over vsync tearing/stuttering issues ever since its implementation.
Ummm...you do know vsync helps get rid of tearing/stuttering issues, not cause them, right?

I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.

Barely could get through Uncharted and a few other PS3 games because vsync wasn't enabled due to my HDfury adapter breaking it. No, I don't have an HDCP monitor. I bought the gen before everyone suddenly made it standard. Lousy DRM companies.
#10
arterius2
SIGSEGVlucid virtu makes almost my games running horribly stuttering even on single monitor. It's just play well on creating a wonderful benchmark score. I guess no one using lucid virtu while playing games.
Exactly, which is why I said a software solution to a hardware problem genuinely doesn't work. Unless this is some massive breakthrough that we missed a few years ago and nobody knew about, I won't be convinced until I see some real results.
#11
arterius2
NC37Ummm...you do know vsync helps gets rid of tearing/stuttering issues, not cause them right?

I honestly don't see how people can play without it. Just seeing one screen tear sends me into inis, control panels, and scrounging the web to find solutions to enable vsync in a game that forgot to add it.

Barely could get through Uncharted and a few other PS3 games because vsync wasn't enabled due to my HDfury adapter breaking it. No, I don't have an HDCP monitor. I bought the gen before everyone suddenly made it standard. Lousy DRM companies.
I never said that vsync was creating the tearing - it removes it, yes. I simply summarized the issue in one sentence - I thought people would get the point, but now I have to type this entire paragraph just to explain it again. Vsync was traditionally known for stuttering/lag on some video cards/games; it was never the perfect solution, as gamers had to choose between tearing or lag.

Also, vsync does not get rid of stuttering, it creates it, because stuttering is caused by having vsync enabled when your frame rate drops under the monitor's refresh rate.
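To put rough numbers on that last point, here's a minimal sketch (our own illustration, assuming a 60 Hz panel and classic double-buffered vsync) of how a frame that misses the refresh deadline gets held for the next refresh, snapping the effective frame rate down:

```python
import math

REFRESH_HZ = 60
FRAME_BUDGET_MS = 1000 / REFRESH_HZ   # ~16.7 ms per refresh at 60 Hz

def effective_fps(render_ms: float) -> float:
    """With double-buffered vsync, a finished frame waits for the next refresh,
    so its on-screen time rounds up to a whole number of refresh intervals."""
    refreshes_used = math.ceil(render_ms / FRAME_BUDGET_MS)
    return REFRESH_HZ / refreshes_used

print(effective_fps(15.0))  # 60.0 -- the frame fits within one refresh
print(effective_fps(20.0))  # 30.0 -- barely missing the deadline halves the rate (judder)
```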
#12
Sasqui
Hmm... NVidia is certainly better at capitalizing on their own ideas. If you want to take advantage of G-Sync, you'll have to buy multiple NV products. Somewhat reminiscent of SLI, no?

AMD comes out with a tech that they'll let everyone take advantage of.

It remains to be seen which one is superior. I'm guessing NV.
#13
Rahmat Sofyan
BTW, AMD/ATi already patented much the same tech as this sync stuff in 2002:

www.google.com/patents/US20080055318

It's called dynamic frame rate adjustment.

Perhaps this is why AMD doesn't care too much about G-Sync.

Because they had it 11 years ago.
#14
Wittermark
Sasqui...you'll have to buy multiple NV products. Somewhat reminiscent of SLI, no?

AMD comes out with a tech that they'll let everyone take advantage of.
like Mantle and crossfire?

...oh wait.
#15
Prima.Vera
We need more info about this FreeSync, and also a comparison test sometime...
I love anything that is free btw
#16
Mussels
Freshwater Moderator
arterius2I never said the vsync was creating the tear -it removes them yes. I simply summarized the issue with one sentence - I thought people would get the point, but now I have to type this entire paragraph just to explain it again. vsync was traditionally known for stuttering/lag on some video cards/games, it was never the perfect solution as gamers had to choose between tear or lag.

Also, vsync does not get rid of stuttering, it creates them because stuttering is caused by having vsync enabled when your frame rates drops under the monitor refresh rate.
Having vsync off causes tearing.

Having vsync turned on with a badly coded game engine COULD cause stuttering. That's the game's fault, since they assumed no one had hardware fast enough to run past 60 FPS, and thus never looked into the issue.
#17
Big_Vulture
Does this also work (between 20-30 FPS), where NVIDIA G-Sync isn't good?
#20
RCoon
Prima.VeraMan, that is the worst analysis ever. It doesn't provide any concrete example or data, it just trashes AMD for free. Plus there are some really immature statements in that "article" :)
My thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better on a demo after seeing Freesync in a backroom on a tiny laptop NOT DESIGNED FOR GAMES".

AMD are just complete retards. NVidia demos their polished GSync probably on some real nice hardware. AMD clearly rushed out a demo on a crappy little laptop, showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on some high-end hardware in an AAA title.
#21
Mussels
Freshwater Moderator
RCoonMy thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better better on a demo after seeing Freesync in a backroom on a tiny laptop NOT DESIGNED FOR GAMES".

AMD are just complete retards. NVidia demoes their polished GSync probably on some real nice hardware. AMD clearly quickly rushed out a demo of a crappy little laptop showing off technology they haven't even bothered to spit-shine. They would have been better off waiting, polishing it up, and then showcasing it on a nice HD monitor with a 290X or something, you know, REALLY marketing it hard, showing how awesome it is on some high end hardware in a AAA title.
way to miss the point - they wanted to show it CAN work on existing hardware, including where it matters most - crap low-end hardware that can't push 60 FPS.
#22
RCoon
Musselsway to miss the point - they wanted to show it CAN work on existing hardware, including where it matters most - crap low end hardware that cant push 60FPS.
Freesync is supposed to be competing against GSync right? I'm fairly certain people buying GSync monitors have enough cash to splash on a good system considering they're spending so much on a GSync monitor. If this isn't for the high end market like GSync monitors, then it isn't competing at all, just another bit of free stuff for everyone.
#23
Recus
Prima.VeraMan, that is the worst analysis ever. It doesn't provide any concrete example or data, it just trashes AMD for free. Plus there are some really immature statements in that "article" :)
RCoonMy thoughts exactly, that guy seems like a douche, and provided no information other than "I think GSync looks better better on a demo after seeing Freesync in a backroom on a tiny laptop NOT DESIGNED FOR GAMES".
Ending AMD hype - worst analysis ever.

Just a few examples:
False: fudzilla.net/home/item/33570-kaveri-presentation-leaked-fails-to-impress
True: fudzilla.net/home/item/33558-nvidia-heading-for-a-spanking-this-year
#24
alwayssts
Patented 7-11 years ago by ATi and implemented 3 years ago in AMD GPUs, along with something they're pushing for in the VESA standard, but never improved or capitalized on because it conflicts with their marketing budget/open-standard mentality?

So very, very, very typical of ATi...history repeats itself for the nth time. It sounds like pretty much every baseline hardware block/GPU-use implementation outside hardware T&L for the last decade. ATi takes an idea, implements it, and pushes for it to be a standard in DX/OGL while it goes unused because, being a new invention, it's initially proprietary; NVIDIA makes a version much later that's based on that initial idea but developed further and pushed harder (because of marketing, or newer fab processes affording them the space to implement it), usually at that point in a needlessly proprietary manner, and then eventually it becomes a standard.

Another entry in the forward-thinking but badly capitalized-on technology of ATi. May they forever be the guinea pigs that break the ice and allow NVIDIA to publicize it so we all eventually benefit. Hopefully the open version of the tech, now that it's in fashion, is further realized and adopted.

I'mma file this right next to TRUFORM and CTM, and hope this turns out equally as well as those eventually did and will.
#25
arterius2
alwaysstsPatented 7-11 years ago by ati and implemented 3 years ago in AMD gpus, along with something they're pushing for in the vesa standard, but didn't improve/capitalize on because conflicts with marketing budget/open-standard mentality?
Wait, what exactly has AMD implemented here regarding Vsync? I'm not understanding this fully.

I mean, there really aren't a lot of options here: you either match the monitor to the GPU, or you match the GPU to the monitor. Since you can't just magically "improve" performance on the GPU, this is a hardware problem, not a software one.

I get that G-sync works by adjusting the monitor's refresh rate to match the GPU's frame rate - OK, fairly simple and straightforward, I get this.

But what is AMD's method here? According to my understanding of the article, AMD tries to insert "blank/fake" frames to artificially boost the frame rate to match that of the monitor - is that what they are doing here?

If I remember correctly, this is exactly the same method Lucidlogix tried to implement with their Virtu software, which was nothing but a gimmick: it only 'looked' good in benchmark scores, but it didn't fix the problem; if anything, it created more stuttering and decreased overall performance.