Tuesday, January 7th 2014

AMD Responds to NVIDIA G-Sync with FreeSync
At CES, various display makers exhibited gaming-grade monitors featuring NVIDIA G-Sync, a display-fluidity technology that's an evolution of V-Sync, and one we've seen with our own eyes make a tangible difference. AMD, in a back room of its CES booth, demoed what various sources are calling "FreeSync," a competing technology to G-Sync, but one that doesn't require specialized hardware or licenses for display makers. AMD didn't give out too many details on the inner workings of FreeSync, but here's what we make of it.
FreeSync taps into a lesser-known feature that AMD Radeon GPUs have had for the past three generations (i.e., since the Radeon HD 5000 series), called dynamic refresh rates. The feature allows GPUs to spool down refresh rates to save power, without entailing a display re-initialization (the flicker that happens when a digital display is sent a signal with a new resolution and refresh rate) on supported displays. Dynamic refresh is reportedly also a proposed addition to the VESA specifications, and some (if not most) display makers have implemented it. On displays that support it, AMD Catalyst drivers already run dynamic refresh rates. For display makers, supporting the technology won't require buying licenses or integrating specialized hardware into their displays.
According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware. Although the results of FreeSync will be close to those of G-Sync, NVIDIA's technology could have an edge in output quality, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow-chart. The goal for both technologies is the same: to make a display's refresh rate slave to the GPU's frame-rate, rather than the other way around (as with V-Sync).
In AMD's implementation, the VBLANK length (the interval between two refresh cycles in which the GPU isn't putting out "new" frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; in NVIDIA's implementation, the display holds a VBLANK until the next frame is received. With NVIDIA, the GPU sends out whatever frame-rate the hardware can manage, while the monitor handles the "sync" part. With AMD, the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead on the host system; in NVIDIA's implementation, that overhead is transferred to the display. We're looking forward to AMD's whitepaper on FreeSync. AMD holds the advantage when it comes to keeping implementation costs down: display makers simply have to implement something VESA is already deliberating over, and the Toshiba laptops AMD used in its FreeSync demo at CES already do.
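Neither vendor has published its actual logic, but the driver-side speculation described above can be sketched roughly as follows. All names and the averaging heuristic are our own illustration, not AMD's algorithm:

```python
# Hypothetical sketch of driver-side VBLANK speculation (the AMD-style
# approach described above). The driver must guess when the next frame
# will arrive and program a refresh interval to match; NVIDIA's approach
# instead lets the display hold VBLANK until the frame actually lands.

def predict_vblank_us(recent_frame_times_us, base_refresh_us=16667):
    """Guess the refresh interval (in microseconds) for the next frame.

    recent_frame_times_us: how long the last few frames took to render.
    base_refresh_us: the panel's fastest refresh interval (~60 Hz here).
    """
    if not recent_frame_times_us:
        return base_refresh_us
    # Naive predictor: assume the next frame takes about as long as
    # the average of the recent ones. A real driver would be smarter.
    predicted = sum(recent_frame_times_us) / len(recent_frame_times_us)
    # Never refresh faster than the panel's maximum refresh rate.
    return max(predicted, base_refresh_us)
```

The guesswork is the source of the host-side overhead the article mentions: a mispredicted interval means the frame arrives before or after the programmed refresh.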
Sources:
The TechReport, AnandTech
53 Comments on AMD Responds to NVIDIA G-Sync with FreeSync
G-Sync is a hardware add-on you pay extra for that locks you into Nvidia's ecosystem, disables some of the monitor's features, and is limited to DisplayPort, which is itself a VESA standard (irony).
FreeSync uses a VESA standard which can be active on all monitors.
Both ATI and Nvidia were on the VESA committee when these standards were made. Nvidia is just trying to charge you hundreds extra for it.
www.geforce.com/hardware/technology/adaptive-vsync/technology
The dynamism isn't where you think it is.
ATI has always been good at pushing; their failure has always been in following through on the reality of it all.
Nvidia is much better at forcing and pushing, and their market share shows it.
G-Sync works like this: it sends your frame to the display and then asks the display to refresh itself. If your framerate drops below 30 fps, a G-Sync monitor duplicates the last frame, but that doesn't cause stutter, because the next update can come even a few milliseconds after the previous one. So even a framerate that fluctuates between 20 and 40 is smooth on G-Sync.
So when you are using G-Sync there is no vSync or vBlank, ever, period.
It seems like some AMD fanboys wish that FreeSync was as good as G-Sync, but it's much crappier tech.
Also, it needs new hardware in monitors and a new VESA specification, because the current VESA specification is for integrated displays only, like in laptops. That's why they are showing it on those Toshiba laptops.
FreeSync is probably cheaper tech than G-Sync, but you don't get the latency reduction you get with G-Sync.
With G-Sync you probably get better CPU usage, because the CPU doesn't have to wait for anything, but with FreeSync there is still some sort of sync, I think, so there must be some CPU overhead.
NVidia added frame duplication because LCD displays really don't work well under 30 Hz.
Because of that, I think you can't go under 30 Hz with FreeSync. That means a framerate fluctuating between 20 and 40 fps isn't going to be smooth with FreeSync.
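The frame-repeat behaviour described for G-Sync can be sketched like this. The 30 Hz panel minimum matches the figure quoted in the thread; the function and its name are illustrative, not any vendor's implementation:

```python
# Hedged sketch of frame duplication below a panel's minimum refresh
# rate: if the GPU's frame interval would hold the panel longer than
# it tolerates (~33 ms at a 30 Hz minimum), the last frame is scanned
# out again until the next frame arrives.

PANEL_MIN_HZ = 30
MAX_INTERVAL_MS = 1000 / PANEL_MIN_HZ  # ~33.3 ms between scanouts

def scanouts_for_frame(frame_interval_ms):
    """Return how many times a frame is displayed before the next one
    arrives, repeating it so each scanout stays above the panel minimum."""
    repeats = 1
    while frame_interval_ms / repeats > MAX_INTERVAL_MS:
        repeats += 1
    return repeats
```

So a 40 fps frame (25 ms) is shown once, while a 20 fps frame (50 ms) is scanned out twice at an effective 40 Hz, which is how the monitor stays smooth below its native minimum.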
Shortening and extending vBlank has been part of a VESA standard (CVT) since 2003. You can also switch between refresh rates on the fly; companies were already doing that in products as far back as 2011. Look up Intel Seamless Display Refresh Rate Switching.
The G-Sync module is a TCON of the kind used for PSR (Panel Self Refresh). There's nothing in G-Sync that wasn't already a VESA standard.
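The arithmetic behind this is simple: a panel's refresh rate is the pixel clock divided by the total timing (active plus blanking), so stretching the blanking lines lengthens the frame. The numbers below are generic CEA-861-style 1080p figures used only for illustration:

```python
# Illustrative timing arithmetic: refresh rate = pixel clock / total
# pixels per frame. Extending vertical blanking at a fixed pixel clock
# lengthens the frame, which is the mechanism variable-refresh schemes
# exploit. These figures are generic, not from any specific CVT mode.

def refresh_hz(pixel_clock_hz, h_total, v_total):
    """Refresh rate for a mode with the given total timing."""
    return pixel_clock_hz / (h_total * v_total)

# 1920x1080 active with typical blanking at a 148.5 MHz pixel clock:
base = refresh_hz(148_500_000, 2200, 1125)       # 60.0 Hz
# Add 1125 extra blank lines -> the frame takes twice as long:
stretched = refresh_hz(148_500_000, 2200, 2250)  # 30.0 Hz
```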
However, Nvidia had shitloads of Tegra 4s sitting around and thought they would slap their fanbase (just like with NV 3D) with something that included a royalty fee to them. Hell, you never know, it might catch on... or all monitor makers will realise they can do the same without paying Nvidia a penny, and Nvidia will likely roll the support in quietly later via a driver and/or new GPUs with SPECIAL G-Sync built in, only 500 extra notes.
- A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time will tell the screen to grab data from the card and display it immediately.
So everything I wrote in my post stands. I've owned both NVIDIA and AMD graphics cards. Both are good cards, but this time NVIDIA surely got the better tech with G-Sync.
Maybe you should watch this
I think you need to understand how a frame and its timing are composed.
Nice first 2 posts tho
:toast:
why does anybody need gsync on their screen when the vesa standards already have systems in place to do the same thing?
because it helps feed the stock prices.....
amd have shown that the system already works in tech that's already out, which costs nothing. nvidia think you are stupid enough to go out and buy a new nvidia-approved screen to plug into your nvidia gpu to get a feature which is already out and free.
nvidia fanbois are nearly getting as gullible as apple fanbois, damn.
Any person can argue that your statement alone makes you biased to one side. One, from what I have read on Anandtech.com, TVs and displays that meet the VESA standard or compliance are capable of using FreeSync. Just like G-Sync, AMD Mantle, and TrueAudio, it will be enabled, most likely through a driver, and it will also be tweaked with drivers on AMD's end. The cost of "FreeSync" is already in the retail price of the TV or monitor you own, have owned, and will own. Any AMD GPU from the 5000-series generation onward is capable of using FreeSync. It isn't limited to just laptop displays.
Second, I think a lot of people didn't get the indirect point AMD made with the two Toshiba Satellite laptops. You have one laptop using it, and another laptop not using it. Laptop A had a set FPS, and the other had a dynamically changing FPS, because FreeSync changes the static refresh rate to a dynamic one. So at the core of AMD's indirect point: the tech is there, it works, and it's not limited to any specific brand name, display model, or type the way NVidia's G-Sync is. That's part of its allure. I don't think G-Sync will improve or hinder CPU usage; G-Sync isn't designed to improve CPU usage. AMD Mantle, on the other hand, will improve CPU usage. The NVidia GPU is controlling G-Sync, not the CPU, if I am not mistaken.
In addition, I don't think FreeSync will do the same. The AMD GPU will probably control FreeSync, unless it is stated elsewhere...
ftp://ftp.cis.nctu.edu.tw/pub/csie/Software/X11/private/VeSaSpEcS/VESA_Document_Center_Monitor_Interface/DMTv1r11.pdf
Where do you think Nvidia got the idea for Gsync? Coordinated Video Timings-Reduced Blanking
Much wow, so read, others work, blind profit!
It's used as a synonym for the vertical blanking interval. That's why I said there is no vBlank anymore on a monitor with G-Sync. It's sad that people use the wrong term for VBI, but what can you do.
There is no vSync or vBlank/VBI on the monitor when you are using G-Sync. That means there is no fixed interval for display refresh. With G-Sync, the graphics card asks for a display refresh. With FreeSync, the display still controls the refresh rate, but the graphics driver sets that interval by guessing.
Seriously, I expected more from you of all people.
NVidia are a business, as are AMD. The whole point of a business is to sell people something they probably don't necessarily need. This keeps profits up and stockholders happy. NVidia happen to have a marketing team, so they can successfully charge people for something they don't really need, because they went ahead and marketed G-Sync. AMD did nothing but sit on good tech for a few years, and they marketed it poorly. As far as I can conjure, FreeSync (variable vBlank) is not yet a complete VESA standard, so not all monitors have it. People might call it "Free"Sync, but there's nothing stopping display manufacturers adopting it while it's not a standard, advertising their screens as having such technology, and charging more for it. It just means AMD missed out on some possible profit.
Things haven't changed.
That's called the VFP (Vertical Front Porch). G-Sync is just extending or reducing the lines of the VFP and VBP to fit the timing of the closest refresh cycle. The VBI is the VBP of one frame plus the VFP of the next.
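The vertical timing breakdown referred to above can be sketched with a few lines of arithmetic. The line counts are typical 1080p figures chosen for illustration only:

```python
# Sketch of how a frame's vertical timing is composed: active lines
# plus front porch (VFP), sync pulse, and back porch (VBP). Extending
# the porch lines stretches the blanking interval between two frames
# without touching the active picture, which is the adjustment
# described in the comment above.

def v_total(active, vfp, vsync, vbp):
    """Total lines per frame, active plus blanking."""
    return active + vfp + vsync + vbp

base = v_total(1080, 4, 5, 36)         # 1125 total lines
# Porch extended by 1125 lines: the frame is held roughly twice as long
held = v_total(1080, 4 + 1125, 5, 36)  # 2250 total lines
```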
nvidia are just out to make more money, from old rope, for things which the rest of their segment were working together to fix in an attempt to keep costs down.
idk about amd missing the ball, they were playing ball, waiting for the vesa standard which is now implemented in some screens. i mean amd cards have had the ability to do this for a few years now (5k cards and later). nvidia are again trying to make people spend again, as their old, yet still powerful, cards do not support it (650 and below...).
as for the fanbois comment, how else can you describe someone who is blind to the truth in front of them? if you think for a second that nv are doing this for our good, then it is already too late for you.
Quote from Guru3D: www.guru3d.com/news_story/nvidia_responds_to_amd_freesync.html Basically, to sum it up, the reason why AMD showcased it on some crappy laptop is that it's the only system that could utilize this feature atm. Why is it free? Because AMD hasn't invested a lot of research, nor could it offer any form of polished product at the given moment. So I guess this ends all discussions then.
and as far as me being wrong before, I might as well add: no, I was right, regardless of the how and what with. All these things hold the displayed frame until the next one is sent by the GPU, instead of updating regardless, without a new frame to show. It's really that simple.
Oh and Arterias, exactly where is this tech going to benefit AMD the most? APU-powered laptops, duh. Most high-end systems just switch V-Sync on and buy enough GPU to power their 60 fps 1080p gaming at a smooth rate, and that's the bit I don't get: all this talk of G-Sync keeping it smooth between 60 and 20 fps, when I only see 20 fps in a few benchmarks, and I wouldn't pay top whack for a monitor for that.
Case in point: my guy bought the whole Nvidia 3D setup and paid 360 UK notes just for the monitor (19 inch too, and not that long ago), an Asus Nvidia special with active glasses. He finally has the GPU grunt to power most things easily, but I haven't seen him use 3D in two years. I bought a 70-quid 22" Hanns G, so (a) I didn't get ripped off and (b) I haven't messed up any carpets with vomit.
I'm no more impressed by G-Sync, or FreeSync for that matter, than I was by 3D.
Look up PSR (Panel Self Refresh), a far better tech that is both 2D and 3D compliant, with power savings from letting the panel refresh itself while the video link is idle.