Thursday, March 19th 2015

AMD Announces FreeSync, Promises Fluid Displays More Affordable than G-SYNC

AMD today officially announced FreeSync, an open-standard technology that makes video and games with fluctuating frame-rates look more fluid on PC monitors. A logical next step from V-Sync, and analogous in function to NVIDIA's proprietary G-SYNC technology, FreeSync is a dynamic display refresh-rate technology that lets monitors sync their refresh-rate to the frame-rate the GPU is able to put out, resulting in fluid display output.

FreeSync is an evolution of V-Sync, a feature that syncs the GPU's frame-rate to the display's refresh-rate to prevent "frame tearing" when the frame-rate exceeds the refresh-rate; V-Sync, however, is known to cause input-lag and stutter when the GPU cannot keep up with the refresh-rate. FreeSync works on both ends of the cable, keeping refresh-rate and frame-rate in sync, to fight both frame tearing and input-lag.
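The difference is easiest to see in frame-time arithmetic: with V-Sync on a fixed 60 Hz panel, a frame that misses the 16.7 ms refresh deadline is held until the next refresh, while a variable-refresh panel simply refreshes when the frame is ready. A minimal sketch of that timing model (illustrative only, not AMD's implementation):

```python
# Illustrative timing model: fixed-refresh V-Sync vs. a variable-refresh panel.
# Not AMD's implementation -- just the arithmetic behind the stutter argument.
import math

REFRESH_HZ = 60
SCAN_MS = 1000.0 / REFRESH_HZ  # ms between refreshes on a fixed-rate panel

def vsync_display_ms(render_ms: float) -> float:
    """With V-Sync, a finished frame waits for the next refresh boundary."""
    return math.ceil(render_ms / SCAN_MS) * SCAN_MS

def vrr_display_ms(render_ms: float, min_hz: float = 40, max_hz: float = 144) -> float:
    """With variable refresh, the panel refreshes when the frame is ready,
    clamped to the refresh window it supports (here an assumed 40-144 Hz)."""
    return min(max(render_ms, 1000.0 / max_hz), 1000.0 / min_hz)

for render_ms in (10.0, 18.0, 25.0, 40.0):
    print(f"render {render_ms:4.1f} ms -> V-Sync shows it for "
          f"{vsync_display_ms(render_ms):4.1f} ms, VRR for {vrr_display_ms(render_ms):4.1f} ms")
```

An 18 ms frame (a hair over 55 FPS) gets held for a full 33.3 ms under 60 Hz V-Sync, but displays at its natural 18 ms on a variable-refresh panel.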
What makes FreeSync different from NVIDIA G-SYNC is that it is AMD's implementation of a VESA-standard feature, Adaptive-Sync, which is part of the DisplayPort 1.2a specification, currently supported only by AMD Radeon GPUs and Intel's upcoming "Broadwell" integrated graphics. Unlike G-SYNC, FreeSync requires no proprietary hardware and carries no licensing fees. Monitor manufacturers that support DP 1.2a don't have to pay a dime to AMD, and there's no special hardware involved in supporting FreeSync either, just support for the open-standard, royalty-free DP 1.2a.
AMD announced that no fewer than 12 monitors from major display manufacturers have already been announced, or will be shortly, with support for FreeSync. A typical 27-inch display with a TN-film panel, a 40-144 Hz refresh-rate range, and WQHD (2560 x 1440 pixels) resolution, such as the Acer XG270HU, should cost US $499. There are also Ultra-Wide 2K (2560 x 1080 pixels) 34-inch and 29-inch monitors, such as the LG xUM67 series, starting at $599; these displays offer refresh-rates of up to 75 Hz. Samsung is leading the 4K Ultra HD pack for FreeSync, with the UE590 series 24-inch and 28-inch, and UE850 series 24-inch, 28-inch, and 32-inch Ultra HD (3840 x 2160 pixels) monitors, offering refresh-rates of up to 60 Hz. ViewSonic is offering a full-HD (1920 x 1080 pixels) 27-incher, the VX2701mh, with refresh-rates of up to 144 Hz. On the GPU end, FreeSync is currently supported on the Radeon R9 290 series (R9 290, R9 290X, R9 295X2), R9 285, R7 260X, R7 260, and AMD "Kaveri" APUs. Intel's Core M processors should, in theory, support FreeSync, as their integrated graphics supports DisplayPort 1.2a.
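Those refresh windows translate directly into the frame-time band the GPU has to stay inside for dynamic refresh to engage. A quick conversion for the ranges quoted above (the minimum refresh rates for the LG and Samsung models are not stated in the announcement, so the floors below are assumptions):

```python
# Frame-time windows implied by the refresh ranges quoted above.
# Only the Acer's full range is stated; the other floors are assumed values.
panels = {
    "Acer XG270HU (40-144 Hz)": (40, 144),
    "LG ultra-wide (assumed 48-75 Hz)": (48, 75),        # 48 Hz floor is an assumption
    "Samsung UE590/UE850 (assumed 40-60 Hz)": (40, 60),  # 40 Hz floor is an assumption
}

for name, (lo_hz, hi_hz) in panels.items():
    print(f"{name}: frame times {1000 / hi_hz:.1f}-{1000 / lo_hz:.1f} ms "
          f"(GPU must deliver {lo_hz}-{hi_hz} FPS)")
```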
On the performance side of things, AMD claims that FreeSync carries a smaller performance penalty than NVIDIA G-SYNC, and delivers more consistent performance. The company put out a few of its own benchmarks to back that claim.
On AMD GPUs, FreeSync support will arrive with the upcoming Catalyst 15.3 drivers.

96 Comments on AMD Announces FreeSync, Promises Fluid Displays More Affordable than G-SYNC

#51
ZoneDymo
So after all of that, seeing that last image... we still are not out of the woods...

Either you get tearing or you get input lag. I'm sure it's all mitigated a little, but damn it, why can't we just have neither already?
#52
Cybrnook2002
ZoneDymo: So after all of that, seeing that last image... we still are not out of the woods...

Either you get tearing or you get input lag. I'm sure it's all mitigated a little, but damn it, why can't we just have neither already?
Only when you're outside the boundaries of adaptive sync. This is a good thing: you get to choose whether you want V-Sync on or off when you dip below. And honestly, hopefully you're not gaming at less than 30 FPS.
#53
MxPhenom 216
ASIC Engineer
Dj-ElectriC: Minimum required 40 FPS on some monitors does not make me happy at all.
Currently, to stay above 40 FPS at 1440p you need enough GPU horsepower to use decent settings in many new games; otherwise it won't be worth much.

I hoped to see it in the low 30s. Anyway, with a new top-end generation of AMD cards this (hopefully) shouldn't be the case, if performance is as leaked.
770/280x or better is all you need.
#54
Patriot
Yeah, the marketing wording makes it a touch confusing... A FreeSync-certified panel is just a panel that has adaptive sync. FreeSync is the driver being aware of the adaptive sync and handling the delivery of frames to take advantage of it. So FreeSync and adaptive sync are two sides of the same coin. Anyone can make an adaptive-sync-aware driver and use a FreeSync-certified monitor... There are no whitelists or blacklists, only a free and open standard that AMD has pushed into VESA.

But if you want to get technical and call FreeSync the driver section... and call it proprietary, then you are probably correct in that AMD will not do your work for you and make you a driver... but if you are on the green team you probably think that is a good thing... so you can stop your bitching. :P
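In code terms, the split described above might look something like this minimal sketch (all names hypothetical, not AMD's actual driver API): the panel advertises a refresh range over DisplayPort, and any driver that reads it can use it.

```python
# Hypothetical sketch of the driver/panel split described above: "Adaptive-Sync"
# is the panel side, "FreeSync" is the driver side. No whitelist -- the driver
# just checks what the display reports over DisplayPort. Names are illustrative.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DisplayInfo:
    dp_version: Tuple[int, int, str]             # e.g., (1, 2, "a") for DP 1.2a
    refresh_range_hz: Optional[Tuple[int, int]]  # (min, max) if the panel advertises one

def adaptive_sync_usable(d: DisplayInfo) -> bool:
    """Any panel that speaks DP 1.2a+ and reports a refresh range qualifies."""
    return d.dp_version >= (1, 2, "a") and d.refresh_range_hz is not None

print(adaptive_sync_usable(DisplayInfo((1, 2, "a"), (40, 144))))  # True
print(adaptive_sync_usable(DisplayInfo((1, 2, ""), None)))        # False: plain DP 1.2
```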

The troll unsubbed ... Darn. :)
#55
semantics
The 40-48 Hz minimum refresh range seems like that expensive NVIDIA module actually does something, seeing as they can do a 30 Hz minimum. Hard to compare apples to apples on cost when performance isn't apples to apples. Still waiting for numbers not produced by AMD, but a 40 Hz minimum is pretty disappointing; still waiting for a monitor that will do a 20 Hz minimum.
#56
Patriot
semantics: The 40-48 Hz minimum refresh range seems like that expensive NVIDIA module actually does something, seeing as they can do a 30 Hz minimum. Hard to compare apples to apples on cost when performance isn't apples to apples. Still waiting for numbers not produced by AMD, but a 40 Hz minimum is pretty disappointing; still waiting for a monitor that will do a 20 Hz minimum.
FreeSync can go down to 9 Hz... It is the panels that are lacking, not adaptive sync or FreeSync...

As it is an open standard... it is up to the individual manufacturer how they implement it in the panel.
#57
semantics
Patriot: FreeSync can go down to 9 Hz... It is the panels that are lacking, not adaptive sync or FreeSync...

As it is an open standard... it is up to the individual manufacturer how they implement it in the panel.
The comment was just a snide remark about people clamoring about things being cheaper yet equal. There is always a price to pay; till things actually hit consumers it's all just hypotheticals, and pointless to me. I'm not investing in either technology till I see monitors hitting a 20 Hz minimum or better. With AMD that could be whenever, I suppose, given it is up to the manufacturers to invest. With NVIDIA, I suppose it's when they release an update to their module, probably with a cheaper ASIC instead of repurposed chips. So whoever does that first gets my money.

www.guru3d.com/articles-pages/amd-freesync-review-with-the-acer-xb270hu-monitor,3.html
Q: What is the supported range of refresh rates with FreeSync and DisplayPort Adaptive-Sync?
A: AMD Radeon graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.
Seems like I'll be waiting for 21-144 Hz.
#58
Patriot
semantics: The comment was just a snide remark about people clamoring about things being cheaper yet equal. There is always a price to pay; till things actually hit consumers it's all just hypotheticals, and pointless to me. I'm not investing in either technology till I see monitors hitting a 20 Hz minimum or better. With AMD that could be whenever, I suppose, given it is up to the manufacturers to invest. With NVIDIA, I suppose it's when they release an update to their module, probably with a cheaper ASIC instead of repurposed chips. So whoever does that first gets my money.
There are pros and cons to a walled garden... AMD's open solution is more flexible; NVIDIA's is more consistent. FreeSync also allows you to choose how it acts when outside the panel's adaptive-sync range: you can have V-Sync kick in or not. G-Sync doesn't have that option. Frankly, if you are much below 40 FPS you are not going to have enjoyable gameplay... and around 30 you start getting panel flicker.

While the spec may allow for as low as 9 Hz... it is going to take some magic on the panel side to make it work.
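The out-of-range choice described above boils down to a simple per-frame policy: inside the panel's window, the refresh tracks the frame; below the floor, the user picks tearing or waiting. A minimal sketch of that policy (illustrative names, not AMD's driver code):

```python
# Illustrative sketch of the out-of-range policy described above -- not AMD's
# driver code. Inside the panel's window the refresh tracks the frame; below
# the floor the user chooses between tearing (V-Sync off) and waiting (V-Sync on).

def present(render_ms: float, panel_min_hz: float, panel_max_hz: float,
            vsync_below_range: bool) -> str:
    fastest = 1000.0 / panel_max_hz  # shortest refresh interval the panel supports
    slowest = 1000.0 / panel_min_hz  # longest interval before leaving the window
    if render_ms < fastest:
        return f"{render_ms:.1f} ms: above max refresh -> hold to the {fastest:.1f} ms floor"
    if render_ms <= slowest:
        return f"{render_ms:.1f} ms: in range -> refresh exactly when the frame is ready"
    if vsync_below_range:
        return f"{render_ms:.1f} ms: below range, V-Sync on -> wait (adds lag/stutter)"
    return f"{render_ms:.1f} ms: below range, V-Sync off -> scan out now (may tear)"

for ms in (5.0, 12.0, 30.0):
    print(present(ms, panel_min_hz=40, panel_max_hz=144, vsync_below_range=True))
```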
#59
semantics
Patriot: There are pros and cons to a walled garden... AMD's open solution is more flexible; NVIDIA's is more consistent. FreeSync also allows you to choose how it acts when outside the panel's adaptive-sync range: you can have V-Sync kick in or not. G-Sync doesn't have that option.
Not really a feature I care about; I'm buying such a monitor to eliminate tearing. Neither solution can deal with frame rates above the monitor's refresh rate, and I'm pretty sure they never will, so I'll just cap FPS. Seems like a waste of my money if I accept screen tearing anyway.
#60
the54thvoid
Super Intoxicated Moderator
MxPhenom 216: 770/280x or better is all you need.
Alas no, the 280(x) is not supported.
#61
RejZoR
Of course it's not free. You have to buy a monitor that has FreeSync...
#62
Recus
RejZoR: Of course it's not free. You have to buy a monitor that has FreeSync...
Yeah. I wonder, when you buy an MSI GTX 980, do you get the Military Class caps and Samsung memory for free? :laugh:
#63
Sony Xperia S
Only 11 compatible displays, with Samsung being the only brand with 4K offering(s).

Good, AMD, but please expand the support to more than this.
#64
RCoon
Funny, the monitors that have FreeSync enabled in the UK are dreadfully overpriced considering this is supposed to be a free VESA standard; they're almost as expensive as similar G-Sync-enabled offerings. Perhaps we should title it almost-as-expensive-sync.
#65
jigar2speed
RCoon: Funny, the monitors that have FreeSync enabled in the UK are dreadfully overpriced considering this is supposed to be a free VESA standard; they're almost as expensive as similar G-Sync-enabled offerings. Perhaps we should title it almost-as-expensive-sync.
Just because of the UK??
#66
Yorgos
MakeDeluxe"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing
It works,
a leaked driver for aus, iirc, made an nVidia GPU connected w/ eDP (laptop) to enable G-sync.
What nvidia doesn't tell you is that their G-sync is DOA and already supported in old h/w e.g. all laptops w/ eDP.
#67
BiggieShady
What happens when using FreeSync and the FPS drops below the minimum supported adaptive refresh rate? Does the screen go black like in the leaked NV G-Sync laptop driver?
#68
the54thvoid
Super Intoxicated Moderator
Yorgos: What nVidia doesn't tell you is that their G-Sync is DOA and already supported on old hardware, e.g. all laptops with eDP.
Coherent proof requested, otherwise the post is invalid. I've seen enough posts saying things are possible with 'x' hardware when they're not.
Also, proof of it happening plus actual critical dissection by a neutral source is required.
If the above requirements can't be fulfilled, then it's little more than trolling.

What we can say is that the adaptive-sync pathway looks to be better for all involved (except perhaps Nvidia).
#69
Sony Xperia S
the54thvoid: Coherent proof requested, otherwise the post is invalid. I've seen enough posts saying things are possible with 'x' hardware when they're not.
Also, proof of it happening plus actual critical dissection by a neutral source is required.
If the above requirements can't be fulfilled, then it's little more than trolling.

What we can say is that the adaptive-sync pathway looks to be better for all involved (except perhaps Nvidia).
Sure, you are violating your own requirements by being so neutral and not a troll. Not a small troll, a big one, actually. :D

Just say that it is a lie, the same way someone else said earlier today that NVIDIA's $999 price is a lie. We could have accused them of big trolling...
#70
GhostRyder
semantics: Not really a feature I care about; I'm buying such a monitor to eliminate tearing. Neither solution can deal with frame rates above the monitor's refresh rate, and I'm pretty sure they never will, so I'll just cap FPS. Seems like a waste of my money if I accept screen tearing anyway.
But most of the problems arise when the FPS is changing, which is why most of these monitors are ones with above-60 Hz refresh rates. It is harder to maintain the higher refresh rates than it is to hold around 60, so to me this tech's purpose is at the high, extreme end rather than the lower one. Even with G-Sync, when you start dipping that low you're not having a great experience as it is, and the same will go for FreeSync; but when you are in the 75-144 range, you're going to be playing at a smooth rate while the FPS is constantly changing.
Cybrnook2002: Only when you're outside the boundaries of adaptive sync. This is a good thing: you get to choose whether you want V-Sync on or off when you dip below. And honestly, hopefully you're not gaming at less than 30 FPS.
Yeah, gaming below a certain point is still going to produce a bad experience. Where the line gets crossed is the main question; for me, I would probably not want to go much below 50, but I have heard that down to 30 with these features is not too bad, though I would not shoot for that. I have seen some reviews already which claim it works, so I am happy, honestly, but I am still waiting to see it for myself. I would love to see a decently priced one available in the U.S. already, and I may try one, but I still also need to wait for the CFX support next month.

Either way, this tech sounds cool and seems to work so I am interested.
#71
Captain_Tom
MakeDeluxe"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing
I guarantee Intel will support it by Skylake. Then Nvidia will be forced to support it by the end of 2016. Mark my words.
#72
Ferrum Master
Captain_Tom: I guarantee Intel will support it by Skylake. Then Nvidia will be forced to support it by the end of 2016. Mark my words.
Where did you get your crystal ball?
#73
HalfAHertz
Yorgos: It works;
a leaked driver, for ASUS laptops IIRC, made an nVidia GPU connected via eDP enable G-Sync.
What nVidia doesn't tell you is that their G-Sync is DOA and already supported on old hardware, e.g. all laptops with eDP.
I disagree. Think about it: only GCN 1.1 cards support FreeSync, but all of NVIDIA's cards support G-Sync. Obviously there's some kind of component that does the hardware communication, which AMD decided to integrate into their newer cards while NVIDIA decided to keep it external. Both options have pros and cons.
Maybe the 980 has the hardware built in and doesn't need the external solution.
#74
Fluffmeister
HalfAHertz: I disagree. Think about it: only GCN 1.1 cards support FreeSync, but all of NVIDIA's cards support G-Sync. Obviously there's some kind of component that does the hardware communication, which AMD decided to integrate into their newer cards while NVIDIA decided to keep it external. Both options have pros and cons.
Maybe the 980 has the hardware built in and doesn't need the external solution.
Indeed, and there is apparently some ghosting going on with FreeSync too:

PCPer: The ROG Swift animates at 45 FPS without any noticeable ghosting at all. The BenQ actually has a very prominent frame ghost, though the image still remains sharp and in focus. The LG 34UM67 shows multiple ghost frames and causes the blade to appear smudgy and muddled a bit.

The question now is: why is this happening, and does it have anything to do with G-Sync or FreeSync? NVIDIA has stated on a few occasions that there is more that goes into a VRR monitor than simply integrated vBlank extensions, and has pointed to instances like this as an example of why. Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.

It’s impossible now to know if that is the cause for the difference seen above. But with the ROG Swift and BenQ XL2730Z sharing the same 144 Hz TN panel specifications, there is obviously something different about the integration.

[...]

FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet.
Source: www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
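The per-display tuning PCPer describes is, conceptually, overdrive compensation that has to vary with the instantaneous refresh rate rather than being calibrated once for a fixed one. A rough sketch of the idea, with entirely made-up calibration numbers:

```python
# Rough sketch of refresh-rate-dependent overdrive, per PCPer's explanation:
# a panel tuned for one fixed refresh rate needs different pixel-drive strength
# at other rates, or ghosting appears. All numbers below are invented.
import bisect

# (refresh_hz, overdrive_gain) calibration points -- hypothetical values
CALIBRATION = [(40, 1.00), (60, 1.10), (85, 1.22), (120, 1.38), (144, 1.50)]

def overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate overdrive strength between calibration points."""
    hz = [p[0] for p in CALIBRATION]
    gain = [p[1] for p in CALIBRATION]
    if refresh_hz <= hz[0]:
        return gain[0]
    if refresh_hz >= hz[-1]:
        return gain[-1]
    i = bisect.bisect_left(hz, refresh_hz)
    t = (refresh_hz - hz[i - 1]) / (hz[i] - hz[i - 1])
    return gain[i - 1] + t * (gain[i] - gain[i - 1])

# A panel calibrated only for 144 Hz would under-drive pixels at 45 Hz,
# which is the kind of mismatch the quoted ghosting observations point to.
print(f"gain needed at  45 Hz: {overdrive_gain(45):.2f}")
print(f"gain needed at 144 Hz: {overdrive_gain(144):.2f}")
```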

#75
Captain_Tom
Ferrum Master: Where did you get your crystal ball?
Common sense.