# AMD Announces FreeSync, Promises Fluid Displays More Affordable than G-SYNC



## btarunr (Mar 19, 2015)

AMD today officially announced FreeSync, an open-standard technology that makes video and games with fluctuating frame-rates look more fluid on PC monitors. A logical next step beyond V-Sync, and analogous in function to NVIDIA's proprietary G-SYNC technology, FreeSync is a dynamic refresh-rate technology that lets a monitor sync its refresh rate to the frame rate the GPU is able to put out, resulting in fluid display output. 

FreeSync is an evolution of V-Sync, a feature that syncs the GPU's frame rate to the display's refresh rate to prevent "frame tearing" when the frame rate exceeds the refresh rate, but which is known to cause input lag and stutter when the GPU cannot keep up with the refresh rate. FreeSync works on both ends of the cable, keeping refresh rate and frame rate in sync, to fight both tearing and input lag. 
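The behavior described above can be sketched with a toy simulation (purely illustrative, not AMD's or NVIDIA's actual pipeline; the function names and the 40-144 Hz window are assumptions):

```python
# Toy model: compare when frames reach the screen on a fixed-refresh
# V-Sync'd display vs. an adaptive-refresh (FreeSync-style) display.

def vsync_present(render_done, refresh_hz=60.0):
    """Each frame waits for the next fixed refresh tick (causes judder/lag)."""
    tick = 1.0 / refresh_hz
    return [(int(t / tick) + 1) * tick for t in render_done]

def adaptive_present(render_done, min_hz=40.0, max_hz=144.0):
    """The panel refreshes when the frame is ready, clamped to its VRR window."""
    shown, last = [], 0.0
    for t in render_done:
        earliest = last + 1.0 / max_hz   # panel can't refresh faster than max_hz
        latest = last + 1.0 / min_hz     # panel must refresh by the min_hz deadline
        last = min(max(t, earliest), latest)
        shown.append(last)
    return shown

# A steady 48 fps game: on a 60 Hz V-Sync'd panel, frames land on an uneven
# 16.7 ms / 33.3 ms cadence (judder); the adaptive panel shows them evenly.
frames = [i / 48.0 for i in range(1, 7)]
```

With those numbers, `vsync_present(frames)` produces uneven frame-to-frame gaps, while `adaptive_present(frames)` keeps a constant ~20.8 ms gap between frames; that evenness is the "fluid display output" described above.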

What sets FreeSync apart from NVIDIA G-SYNC is that it is AMD's specialization of a VESA-standard feature that is part of the DisplayPort feature-set, advanced by the DisplayPort 1.2a standard, and currently featured only on AMD Radeon GPUs and Intel's upcoming "Broadwell" integrated graphics. Unlike G-SYNC, FreeSync requires no proprietary hardware and carries no licensing fees. Monitor manufacturers that support DP 1.2a don't have to pay AMD a dime. There's no special hardware involved in supporting FreeSync, either, just support for the open-standard, royalty-free DP 1.2a. 

AMD announced that no fewer than 12 monitors from major display manufacturers have already been announced, or will be shortly, with support for FreeSync. A typical 27-inch display with a TN-film panel, a 40-144 Hz refresh-rate range, and WQHD (2560 x 1440 pixels) resolution, such as the Acer XG270HU, should cost US $499. There are also Ultra-Wide 2K (2560 x 1080 pixels) 34-inch and 29-inch monitors, such as the LG xUM67 series, starting at $599; these displays offer refresh rates of up to 75 Hz. Samsung is leading the 4K Ultra HD pack for FreeSync, with the UE590 series 24-inch and 28-inch, and UE850 series 24-inch, 28-inch, and 32-inch Ultra HD (3840 x 2160 pixels) monitors, offering refresh rates of up to 60 Hz. ViewSonic is offering a full-HD (1920 x 1080 pixels) 27-incher, the VX2701mh, with refresh rates of up to 144 Hz. On the GPU end, FreeSync is currently supported on the Radeon R9 290 series (R9 290, R9 290X, R9 295X2), R9 285, R7 260X, R7 260, and AMD "Kaveri" APUs. Intel's Core M processors should, in theory, support FreeSync, as their integrated graphics supports DisplayPort 1.2a.

On the performance side of things, AMD claims that FreeSync carries a smaller performance penalty than NVIDIA G-SYNC, and delivers more consistent performance. The company put out a few of its own benchmarks to back that claim.

On AMD GPUs, FreeSync support will arrive with the upcoming Catalyst 15.3 drivers.

*View at TechPowerUp Main Site*


----------



## Cybrnook2002 (Mar 19, 2015)

!!!Thanks btarunr!!!

Why does this have to be labeled "More affordable than G-Sync"? Even if it's true, all it's going to do is be click-bait for fanboy arguments about why AMD blows and why FreeSync should come in the form of a firmware update..... We know FreeSync is coming out (a BenQ monitor is already out), so why poke that horse again? I can already hear it: "FreeSync is not free, long live green."

Anyways, super excited to try it out. Once some of the Asus monitors drop, I'm going after IPS + 120 Hz + adaptive sync.


----------



## Ferrum Master (Mar 19, 2015)

Cybrnook2002 said:


> !!!Thanks btarunr!!!
> 
> Why does this have to be labeled as "More affordable than G-Sync" , even if it's true all it is going to do is be click bait for fan boy arguments on why AMD blows and Freesync should come in the form of a firmware update..... We know Freesync is coming out (BenQ monitor is already out), why poke that horse again? I can already here it "Freesync is not free, long live green"
> 
> Anyways, super excited to try it out. Once some of the Asus monitors drop, I am going after IPS + 120hz + adaptive sync.



Because it is... a manufacturer doesn't need to invest in an additional module, design in additional board space, or supply additional current to that board... It just works as a native VESA mode.


----------



## MakeDeluxe (Mar 19, 2015)

"No proprietary hardware"
Gosh, I so wish that meant it will work on NVIDIA (and Intel) GPUs, but I know I'll be wrong.

Also, dang, those LG ultrawides look enticing.


----------



## dj-electric (Mar 19, 2015)

A minimum required 40 fps on some monitors does not make me happy at all.
Currently, to stay above 40 fps at 1440p you've got to have enough GPU horsepower to use decent settings in many new games. Otherwise it won't be worth much.

I hoped to see it in the low 30s. Anyway, with a new top-end generation of AMD cards this hopefully won't be the case, if performance is as leaked.


----------



## esrever (Mar 19, 2015)

Just gotta wait till Intel supports its own version of adaptive sync. I doubt it would be hard for them to develop their own solution based on DP 1.3.


----------



## the54thvoid (Mar 19, 2015)

Cybrnook2002 said:


> !!!Thanks btarunr!!!
> 
> Why does this have to be labeled as "More affordable than G-Sync" , even if it's true all it is going to do is be click bait for fan boy arguments on why AMD blows and Freesync should come in the form of a firmware update..... We know Freesync is coming out (BenQ monitor is already out), why poke that horse again? I can already here it "Freesync is not free, long live green"
> 
> Anyways, super excited to try it out. Once some of the Asus monitors drop, I am going after IPS + 120hz + adaptive sync.



Wait....

You have CrossFired 295X2s and (if your specs are up to date) a 1080p 144 Hz display?  Are you insane?  FreeSync is for when fps drops below your refresh rate.  What at _1080p_, with the power of four 290Xs, drops below 144 fps?  I cap BF4 at 90 fps on SLI 780 Tis at 1440p; you should be storming past 144 fps at 1080p.

If I were you, I'd wait for a good 4K FreeSync monitor at 60 Hz.  You'd get ultra-smooth motion with your graphics cards and FreeSync then.


----------



## Ferrum Master (Mar 19, 2015)

esrever said:


> Just gotta wait till intel supports their own version of adaptive sync. I doubt it would be hard for them to develop their own solution based on DP1.3.



Yeah, it would be nice to see an Intel GPU pulling 60 fps max in Alien: Isolation at 1440p.


----------



## Cybrnook2002 (Mar 19, 2015)

Ferrum Master said:


> Because it is... manufacturer doesn't need to invest in additional module, designing additional space and providing additional current to that board... It just works as native VESA mode.


I got that. I was more commenting on the bashing between NVIDIA and AMD that always happens. I was just saying that with that title, we're sure to draw an argumentative crowd.

But yes, it's free from the DP 1.2a spec, as long as the hardware driving the display supports it.


----------



## btarunr (Mar 19, 2015)

MakeDeluxe said:


> "No proprietary hardware"
> Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.
> 
> Also, dang those LG ultrawides look enticing



As I mentioned in the article, all you need is DP 1.2a (and above). NVIDIA chose HDMI 2.0 for its "Maxwell" GPUs, but held back on advancing DP; they didn't want to end up killing G-Sync. 

Intel's "Broadwell" IGP supports DP 1.2a.


----------



## Cybrnook2002 (Mar 19, 2015)

the54thvoid said:


> Wait....
> 
> You have crossfired 295X2 and a (if your specs are up to date) 1080p(144hz) capable display?  Are you insane?  Free-sync is for when fps drop below your refresh rate.  What at _1080p_ with the power of 4 290X's drops below 144fps?  I cap BF4 at 90fps on sli 780ti at 1440p, you should be storming past 144fps on 1080p?
> 
> If I was you, I'd wait for a good 4K Free-sync monitor at 60Hz.  You'd get ultra smooth motion with your graphics cards and freesync then.


Oh I am. I'm actually running three displays at the moment (1080p). I have my eye on the Asus MG279Q when it comes out; I want to run three of those. 1440p IPS 120 Hz+ with adaptive sync (DP 1.2a), or, like you said, break down and get a super-wide 4K. The only thing about those is I want higher than a 60 Hz refresh; that's the only killer for me.... and since I'm a compulsive early adopter, I'll likely be on the 1440p train.
And yes, the cards are on cruise control at 1080p. Doesn't stop me anyway.


----------



## HossHuge (Mar 19, 2015)

Still out of my price range.


----------



## Cybrnook2002 (Mar 19, 2015)

@btarunr,

Am I reading that right? AMD is supporting 9-240 Hz with FreeSync, not the speculated bottom limit of 30/40?


----------



## Patriot (Mar 19, 2015)

Cybrnook2002 said:


> @btarunr,
> 
> Am I reading that right, AMD is supporting 9 - 240 hz with freesync? Not the speculated bottom limit of 30/40?



The bottom limit seems to be up to the individual panel.

Most seem to bottom out at 40, though.   Getting the backlight not to flicker at 9 Hz might be tricky.



MakeDeluxe said:


> "No proprietary hardware"
> Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.
> 
> Also, dang those LG ultrawides look enticing



I actually wouldn't be surprised to see Intel adopt this with a driver update.
It's an open standard; they can use it if they want to.   
NVIDIA won't, because G-Sync would die.
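A conceptual workaround for that hard bottom limit (a sketch only, not a description of either vendor's actual firmware or driver; the function name and defaults are made up) is to redraw each frame several times, so the physical refresh never drops below what the panel and its backlight tolerate:

```python
# Hypothetical sketch: keep the physical refresh rate inside the panel's
# VRR window by repeating (redrawing) frames when the game's frame rate
# drops below the panel minimum. Names and defaults are illustrative.

def refresh_plan(fps, panel_min_hz=40, panel_max_hz=144):
    """Return (physical_refresh_hz, redraws_per_frame) inside the window."""
    if fps >= panel_max_hz:
        return panel_max_hz, 1           # cap at the panel maximum
    redraws = 1
    while fps * redraws < panel_min_hz:  # below the window: repeat each frame
        redraws += 1
    return fps * redraws, redraws
```

Under this scheme a 25 fps source would drive the panel at 50 Hz with every frame shown twice, and even a 9 fps signal maps to 45 Hz with five redraws, sidestepping the backlight-flicker problem.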


----------



## HM_Actua1 (Mar 19, 2015)

Sub-par performance compared to G-Sync: a smaller effective VRR margin than G-Sync, and ghosting showing on three of the reviewed FreeSync monitors.


----------



## Patriot (Mar 19, 2015)



Hitman_Actual said:


> Sub par performance compared to Gsync.  Smaller affective VRR margin then Gsync and showing ghosting on 3 of the reviewd freesync monitors.



Larger range; the panels used are the limiting factor.


----------



## arbiter (Mar 19, 2015)

Cybrnook2002 said:


> @btarunr,
> 
> Am I reading that right, AMD is supporting 9 - 240 hz with freesync? Not the speculated bottom limit of 30/40?



What the real limit of the G-Sync module is, I don't think anyone knows. The BIOS on the module is 30-144 Hz, but for all anyone knows NVIDIA could release a BIOS update that supports a larger range, or at least update the BIOS on monitors that support a larger range. At the moment there's no point, though, when the displays can't do it and the DP connection can't. It just ends up being paper specs: the standard can do it, but nothing exists that can support it. It's more future-proofing, since it's there; a standard is hard to change, whereas NVIDIA can change their module whenever they need to, whenever the displays allow it.



btarunr said:


> On the performance side of things, AMD claims that FreeSync has lesser performance penalty compared to NVIDIA G-SYNC, and has more performance consistency. The company put out a few of its own benchmarks to make that claim.



The problem with that statement is the "AMD claims" part. The only realistic way that penalty could happen is if the GPU had to wait for a display update window, and anyone can see that would be the same for FreeSync as well. I watched most of that video this morning; wow, that guy is supposed to be an expert, but he said some of the dumbest stuff I've ever heard. In that image he says you can turn V-Sync off with FreeSync but can't turn it off with G-Sync due to its always-on effect. If you look at that slide, turning off V-Sync fundamentally disables the very thing FreeSync and G-Sync exist to get rid of: SCREEN TEARING. He said G-Sync has input latency because of that V-Sync behavior, but independent testers have already shown G-Sync doesn't have any of that unless you go above the monitor's maximum refresh rate.



Cybrnook2002 said:


> Why does this have to be labeled as "More affordable than G-Sync" , even if it's true all it is going to do is be click bait for fan boy arguments on why AMD blows and Freesync should come in the form of a firmware update.....



If you don't own one of the supported AMD GPUs, calling it more affordable is a complete lie; it ends up costing you just as much or more.


----------



## Petey Plane (Mar 19, 2015)

Cool, now if only AMD could get their power levels under control and their drivers more consistent.


----------



## Xzibit (Mar 19, 2015)

arbiter said:


> Problem with that statement is, "AMD Claims" Only realistic way that penalty could happen is if the gpu had to wait for display update window, which anyone would see that would be same for freesync as well. I watched most of that video this morning, just wow that guy suposed to be an expert but he said some of the dumbest crap I have ever heard. In that image he says you can turn v-sync off on freesync but you can't turn it off on g-sync due its effect of being on. If you look at that slide turning off v-sync fundamentally disabled what freesync and what g-sync is there to get rid of, SCREEN TEARING. He said g-sync has input latency cause that v-sync but Independent testers have already proven there g-sync doesn't have any of that less you go above the max refresh rate of the monitor.



You mean like this ?


----------



## qubit (Mar 19, 2015)

I absolutely love open standards. Looks like proprietary G-Sync is gonna go away and if NVIDIA prevent their cards from supporting FreeSync it will really hurt their sales.

G-Sync could potentially live as a premium option if it has a compelling performance improvement over FreeSync, but does it?


----------



## Xzibit (Mar 19, 2015)

qubit said:


> I absolutely love open standards. Looks like proprietary G-Sync is gonna go away and if NVIDIA prevent their cards from supporting FreeSync it will really hurt their sales.
> 
> G-Sync could potentially live as a premium option if it has a compelling performance improvement over FreeSync, but does it?



So far it looks like the difference shows when it dips below 30 fps:

G-Sync = flickers
FreeSync = tears/lags (V-Sync off/V-Sync on), depending on which option you choose

Above the monitor's refresh rate, FreeSync's option to have V-Sync on or off looks like a better trade-off than G-Sync's always-on behavior, which adds lag.


----------



## arbiter (Mar 19, 2015)

qubit said:


> I absolutely love open standards. Looks like proprietary G-Sync is gonna go away and if NVIDIA prevent their cards from supporting FreeSync it will really hurt their sales.
> 
> G-Sync could potentially live as a premium option if it has a compelling performance improvement over FreeSync, but does it?



Um, might want to go look at the market-share numbers. When the numbers are something like 75-25 in favor of NVIDIA, switching camps will cost a bit, so FreeSync is more likely to be the one that goes away first.



Xzibit said:


> So far it looks like the differences is when it dips below 30fps
> 
> G-Sync = flickers
> Freesync = tears/lags (V-sync off/V-Sync On)  depends on which option you choose



The only time I heard of NVIDIA flickering was on a laptop using an unreleased driver on a non-G-Sync display. G-Sync has memory in it, so when fps drops that low it still has image data to update the screen with.



Xzibit said:


> You mean like this ?



That is AMD-provided data, so how reliable are those numbers? I doubt the AMD card would gain performance in every game; that's what really makes those numbers questionable.

All those percentages are within what you could call a margin of error, as not all playthroughs are the same and small things can affect performance numbers. AMD could have cherry-picked the ones that looked best for them.


----------



## Xzibit (Mar 19, 2015)

arbiter said:


> Only time i heard that nvidia flickered was when it was on a laptop using an unreleased driver on none g-sync display. g-sync has memory in it so when fps drops that low it still has image data to update screen with.



Which in turn adds lag at the higher end.  I think the Blur Busters article proved that.  NVIDIA's Tom Petersen pointed out it adds a backlog if it's consistently redrawing.

Flickering has been in G-Sync since its inception.

*PC Perspective - A Look into Reported G-Sync Display Flickering*

You can find a lot more by googling or looking through forums.


----------



## metalslaw (Mar 19, 2015)

At the moment, FreeSync is an "AMD and Intel" GPU technology, while G-Sync is an NVIDIA GPU technology.

Without either camp budging, this may turn into a monitor battle as well, with monitors shipping with either FreeSync _or_ G-Sync only (thus locking in the consumer's choice of graphics manufacturer).

The only real hope for consumers is if monitor makers start shipping monitors with both FreeSync and G-Sync enabled in the same monitor.


----------



## qubit (Mar 19, 2015)

arbiter said:


> Um, might want to go look at market share numbers, when numbers are like 75-25 in favor of nvidia um switching camps well cost a bit so freesync likely to be one that gonna go away first.


No, a market-share lead like that won't make FreeSync go away any time soon. Even if the share stayed the same, there are still plenty of AMD and Intel customers who will buy it and keep it alive.

Now, since this is a killer feature and cheaper to get than G-Sync, it's likely to take sales away from NVIDIA. Of course, NVIDIA will counter in some way, and it will be interesting to see how they do it.

Oh, and btw, that shiny, new, and seriously overpriced Titan X is looking even more lacklustre now for not supporting it. 



metalslaw said:


> Atm, all that freesync is a 'ati and intel' gpu technology. With g-sync being a nvidia gpu technology.
> 
> Without either camp budging, this may end up turning into a monitor battle as well. With monitors coming with either freesync, _or_ g-sync only (thus locking in the consumers choice of graphics manufacturer).
> 
> *The only real hope for consumers, is if monitor makers start shipping monitors with both freesync and g-sync enabled in the same monitor.*


Unfortunately, NVIDIA is sure to have a clause in their contracts which prevents both standards being implemented. Hopefully it will be seen as anti-competitive by the competition commission, or whatever they're called today, and be unenforceable by NVIDIA.


----------



## arbiter (Mar 19, 2015)

qubit said:


> No, a marketshare lead like that won't make FreeSync go away any time soon. If the share stayed the same, then there are still plenty of AMD and Intel customers who will buy it and keep it alive.
> 
> Now, since this is a killer feature and cheaper to get than G-Sync, then it's likely to take sales away from NVIDIA. Of course, NVIDIA will counter in some way and it will be interesting to see how they do this.
> 
> Oh and btw that shiny, new and seriously overpriced Titan X is looking even more lacklustre now for not supporting it.



On the one side, the AMD guy said their monitor was $499 while the G-Sync version was $599, $100 over the FreeSync one. When the G-Sync module was released, it was $200. So either the G-Sync module got cheaper, or FreeSync carries a premium? Either way, it's only a matter of time before it gets even cheaper.


Had a bit of a thought: since AMD has to certify all monitors as FreeSync, they have pretty much locked FreeSync to AMD-only GPUs. The G-Sync module could possibly be updated via firmware to support it, but the FreeSync software, as it stands, is AMD-proprietary. AMD has, in a sense, done the same thing everyone rips on NVIDIA for; they just did it under everyone's noses.


----------



## Xzibit (Mar 19, 2015)

arbiter said:


> Had a bit of a thought, Since AMD has to certify all monitors to be freesync. They have pretty much locked up freesync to AMD only gpu's. Since g-sync module could possible be updated firmware to support it, but since freesync software is as it stands amd proprietary software. AMD in sense has done same thing everyone rips on nvidia for, they just did it under everyones nose's.




AMD doesn't have to.  They offer it free, as a courtesy, for marketing.  The important part is VESA Adaptive-Sync certification, which doesn't have to involve AMD at all.  Once it's VESA-certified, anyone can piggy-back off it.

As far as software goes, NVIDIA just has to change the way it communicates through DisplayPort, change the method by which it syncs.  Nothing is stopping them.


----------



## m6tzg6r (Mar 19, 2015)

As a gamer who always maintains 60 fps, is there anything about G-Sync or FreeSync that can do anything positive for me, or are they only intended for people whose frame rates fall below 60 during gaming?


----------



## arbiter (Mar 19, 2015)

Xzibit said:


> AMD doesn't have to.  They offer it free, like a courtesy for marketing.  The important part is VESA Adaptive-Sync certification.  That doesn't have to reach AMD at all.  Once its VESA certified anyone can piggy-back off of it.
> 
> As far as software goes Nvidia just has to change the way it communicates through DisplayPort, change the method in which it syncs.  Nothing stopping them.



AMD claims it's free and a standard, but it's NOT the standard; the software is proprietary and locked to AMD GPUs. They did pretty much the same thing NVIDIA did, while whining about what NVIDIA did. AMD spouts that FreeSync and Adaptive-Sync are one and the same, which AMD fans believe and repeat because they think it's the truth, and that NVIDIA can support FreeSync, which they can't, since it's proprietary AMD software.


----------



## GhostRyder (Mar 19, 2015)

I like that it's now in an official announcement, finally.  The only downside I can see so far is that the first driver (15.3) reportedly won't support CFX yet, sadly. 

I'm interested in seeing it in action in the real world, as I want some comparisons so we can choose.

I have to say, though, these advancements (G-Sync and FreeSync) have one major purpose to me: when the FPS is constantly changing.  Situations like that mostly arise on 144 Hz displays, whether 1440p or 1080p, so I personally don't see much point on a 60 Hz panel, even at 4K, given how difficult 4K is to run.  That said, I'll be interested in the pricing of a 4K 60 Hz panel just for fun, but I'm more interested in comparing these, since no matter which you choose, it's probably going to lock you to one side of the GPU market.


----------



## Xzibit (Mar 19, 2015)

arbiter said:


> AMD claims its free and a standard, which its NOT the standard and software is proprietary and locked to AMD gpu's. They did pretty much same thing nvidia did while whining about what nvidia did.



AMD didn't outsource a T-Con to Altera.  They proposed that changes which had been in eDP 1.1-1.3 be implemented into DP 1.2a through VESA, and it happened.  NVIDIA is part of the VESA governing body, but they didn't do that.  They chose to use DP 1.2 in its current state and develop a communication method off of that, while outsourcing the T-Con module to Altera.  The benefit of that is being first to market and locking down a VRR ecosystem.

You sound like you want AMD to start making Adaptive-Sync drivers for everyone.


----------



## Devon68 (Mar 19, 2015)

I've never seen frame tearing before. Heard about it, but never seen it.


----------



## Cybrnook2002 (Mar 19, 2015)

arbiter said:


> AMD claims its free and a standard, which its NOT the standard and software is proprietary and locked to AMD gpu's. They did pretty much same thing nvidia did while whining about what nvidia did. AMD spouts freesync/adaptive sync are one and same, which all amd fans believe and and repeat it cause they think its the truth and nvidia can support freesync which they can't since its proprietary amd software.


Not to feed you, but I just want to comment on a few pieces here.

- AMD is NOT charging manufacturers licensing fees or royalties, so YES, it is free.

- AMD has NOT locked FreeSync (which I think could just as easily be called adaptive sync) to JUST AMD. I'm sorry if your NVIDIA cards don't support it, but you should look at your green gods at NVIDIA and throw the negativity their way. NVIDIA chose not to implement DisplayPort 1.2a. Can't blame AMD for that.

- AMD is not charging any markup on monitors that have Adaptive-Sync/FreeSync. That doesn't even make sense (since when does AMD make monitors?). It's very easy: manufacturers can double dip. As long as NVIDIA fans buy G-Sync monitors, they can keep the price high. Now that AMD has FreeSync (or just call it VESA Adaptive-Sync and take AMD's name out altogether), they're lowering the cost of basically the same panel by $100. Sure, why not? If I could sell something $100 cheaper than the competition and still put an extra $100 in my pocket, I would do the same. That's basic supply/demand, without drowning out the inventory of G-Sync monitors.

I appreciate you chiming in with an NVIDIA-slanted response to basically everyone's post in every AMD thread ever, but I had to correct just these few pieces for you.


----------



## 64K (Mar 19, 2015)

Cybrnook2002 said:


> Not to feed you, but I just want to comment on a few pieces here.
> 
> - Amd is NOT charging any manufacturers a licensing fee or charging royalties, so YES it is free.
> 
> ...



Well said Cybrnook2002.


----------



## Cybrnook2002 (Mar 19, 2015)

64K said:


> Well said Cybrnook2002.


As long as one person gets it, then I'm happy. So thanks for that.


----------



## TheGuruStud (Mar 19, 2015)

arbiter said:


> Had a bit of a thought, Since AMD has to certify all monitors to be freesync. They have pretty much locked up freesync to AMD only gpu's. Since g-sync module could possible be updated firmware to support it, but since freesync software is as it stands amd proprietary software. AMD in sense has done same thing everyone rips on nvidia for, they just did it under everyones nose's.



False on all points.

Nothing has to be certified for it to work. Certification is a formality for branding, because people are stupid and have no idea what adaptive sync is.


----------



## the54thvoid (Mar 19, 2015)

This is all good news, basically.  AnandTech has a nice article on it (it kinda kiboshes the whole G-Sync performance-hit claim, though - the hit is ludicrously small, to the point of being imperceptible).  The point is, will it make G-Sync cheaper? Maybe a wee bit, but G-Sync only works with NVIDIA, and FreeSync only works with what supports DP 1.2a (which notably isn't supported by the R9 280/X or 270/X - a bummer for some).

If you buy NVIDIA currently, you probably don't mind paying a bit more for G-Sync - after all, it's cheaper than buying a new AMD GPU and a FreeSync monitor.  If you have a GCN 1.1(?) GPU now, it's great for you as well.  

There's no need for folks to get all antsy about it and leap to either side's defence.


----------



## Cybrnook2002 (Mar 19, 2015)

TheGuruStud said:


> False.


To add: FreeSync certification does not lock anything up. It's already public knowledge that it's free to have your panel FreeSync-certified (to put the sticker on the box), and manufacturers can choose to do so if they wish. But that does not mean ONLY FreeSync monitors will work with adaptive sync. AMD has stated its cards will support the same technology on basic adaptive-sync panels as well, so no whitelist/blacklist.


----------



## Cybrnook2002 (Mar 19, 2015)

the54thvoid said:


> This is all good news basically.  Anandtech has a nice article on it (kinda kybosh the whole G-Sync perf hit though - the hit is ludicrously small, as to be imperceptible).  The point is, will it make G-Sync cheaper? - Maybe a a wee bit but G-Sync only works for Nvidia, Free-Sync only works with what supports DP1.2a (which notably isn't supported by R9 280/X or 270/X - bummer for some).
> 
> If you buy Nvidia currently, you probably don't mind paying a bit more for G-Sync - after all it is cheaper than buying a new AMD GPU and Free-Sync monitor.  If you have a GCN 1.1(?) gpu now, it's great for you as well.
> 
> There's no needs for folks to get all antsy about it and leap to either sides defence.


Agreed, sigh...

I just like the thought that no matter what camp you're in (green, red, or Intel), we can all benefit from baby-smooth frame-rate gaming. That's what it should really all be about.


----------



## arbiter (Mar 19, 2015)

Cybrnook2002 said:


> - Amd has NOT locked free-sync (which I think could just as easily be called adaptive sync) to JUST AMD. I am sorry if your Nvidia cards don't support it, but you should look at your green gods Nvidia, and throw the negativity their way. Nvidia chose to not implement display port 1.2a. Can't blame AMD for that.



Um, it's NOT the standard; it's proprietary AMD software that uses the standard, so it's AMD-locked.

http://support.amd.com/en-us/search/faq/214 <--



> DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like AMD FreeSync™ technology. AMD FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync





Cybrnook2002 said:


> As long as one person get's it, then I am happy. So thanks for that



You are not one of those who gets it, I see.



Cybrnook2002 said:


> I appreciated you chiming in with an nvidia slanted response to basically everyone's post is every AMD thread ever, but I had to correct just these few pieces for you.



I guess that goes without saying for AMD fans that flame and troll NVIDIA threads. It's sad, all the people that bite on every word AMD claims, even when it's not true. I guess the 75% market share NVIDIA holds shows how well those lies are working out for AMD.



the54thvoid said:


> This is all good news basically. Anandtech has a nice article on it (kinda kybosh the whole G-Sync perf hit though - the hit is ludicrously small, as to be imperceptible). The point is, will it make G-Sync cheaper? - Maybe a a wee bit but G-Sync only works for Nvidia, Free-Sync only works with what supports DP1.2a (which notably isn't supported by R9 280/X or 270/X - bummer for some).



It only works with AMD cards, as it's a proprietary use of the spec. It's an optional part of DP 1.2a, not a required one, so not everything with 1.2a will support it.

/me clicks unwatch on this thread to not see idiot response email updates.


----------



## Cybrnook2002 (Mar 19, 2015)

arbiter said:


> Um its NOT the standard its proprietary amd software that uses the standard so its AMD locked.
> 
> http://support.amd.com/en-us/search/faq/214 <--
> 
> ...







Read it a little slower and closer; it's pretty easy to see what it's saying. Adaptive-Sync is a key ingredient in making FreeSync work. The way it reads, FreeSync (worded as a software option) needs Adaptive-Sync (the VESA standard in DP 1.2a, a hardware option) to work. Where is it technology-locked? Unless you mean you can't run an NVIDIA card with AMD drivers using FreeSync, then yes, that won't work. But that does not mean only AMD cards can use Adaptive-Sync (which FreeSync needs). It's open to anyone who wants to adopt it.


----------



## kn00tcn (Mar 19, 2015)

inb4 DP 1.3 or 1.2b or whatever the next version is becomes an updated multi-platform version of FreeSync, just like what Vulkan did with Mantle.


----------



## the54thvoid (Mar 19, 2015)

arbiter said:


> Um its NOT the...._< all the crap I'm dribbling>_.....port it.
> 
> /me clicks unwatch on this thread to not see idiot response email updates.



lol.

I recently had to ignore a chappy like this from the red camp.  This must be his green brother.  I literally don't know what he's saying.

titter titter tee hee


----------



## Xzibit (Mar 19, 2015)

the54thvoid said:


> lol.
> 
> I recently had to ignore a chappy like this from the red camp.  This must be his green brother.  I literally don't know what he's saying.
> 
> titter titter tee hee



What's with you and family relationships lately.  

/looks around for his "twin"


----------



## 64K (Mar 19, 2015)

the54thvoid said:


> lol.
> 
> I recently had to ignore a chappy like this from the red camp.  This must be his green brother.  I literally don't know what he's saying.
> 
> titter titter tee hee



Every site should have a case of these on hand


----------



## the54thvoid (Mar 19, 2015)

64K said:


> Every site should have a case of these on hand



Gonna need a bigger can...


----------



## natr0n (Mar 19, 2015)

Anyone know when the windmill demo will be available?


----------



## Cybrnook2002 (Mar 19, 2015)

Reading up on the refresh rates, it appears the AMD FreeSync driver supports 9 to 240 Hz off the Adaptive Sync standard (likely as an adaptive sync range). But as has already been said, current panel technology operates within the 30-144 Hz range. BUT, what's cool about this is that it gives panel manufacturers breathing room to keep developing better panel technologies and broaden our usable range below 30 and above 144 (while still keeping great input times) using the syncing technology we already have today.

We are getting there; we are starting to see IPS technology creep into the 120+ Hz range (in the mainstream, that is; yes, I know Korean overclocking panels have been around). This is good for everyone: for FreeSync (Adaptive Sync) nothing would change except the hardware being released by the panel manufacturers. For G-Sync, a potential update to the G-Sync module could broaden its range (which would of course be installed on whatever new panel technology comes out), since you won't be able to update your existing monitor's G-Sync module to support ranges beyond what the integrated panel can handle. Either way a new monitor would be required......

Just wanted to add that.
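The driver range vs. panel range idea above amounts to a simple intersection. A toy sketch in Python (the function name and the 9-240 Hz constant are illustrative, taken from AMD's quoted figures, not real driver code):

```python
# Sketch of the range negotiation described above: the FreeSync driver
# spans a wide window (9-240 Hz per AMD's figures), but the usable
# variable-refresh window is its overlap with what the panel reports.

DRIVER_RANGE_HZ = (9, 240)  # span the driver side can address

def effective_vrr_range(panel_min_hz, panel_max_hz):
    """Intersect the driver's span with the panel's reported range."""
    lo = max(DRIVER_RANGE_HZ[0], panel_min_hz)
    hi = min(DRIVER_RANGE_HZ[1], panel_max_hz)
    if lo > hi:
        raise ValueError("panel range does not overlap driver range")
    return lo, hi

# A typical 2015-era panel advertising 30-144 Hz:
print(effective_vrr_range(30, 144))  # -> (30, 144)
```

Widening the usable window is then purely a panel-side job: lower `panel_min_hz` or raise `panel_max_hz` and the intersection grows, with no driver change needed.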


----------



## HumanSmoke (Mar 19, 2015)

Cybrnook2002 said:


> To add, FreeSync certification does not lock anything up. It's already public knowledge that it's free to have your panel FreeSync certified (put the sticker on the box) and manufacturers can choose to do so if they wish. But that does not mean that ONLY FreeSync monitors will work with Adaptive Sync. AMD has stated AMD cards will support the same technology on basic Adaptive Sync panels as well, so no whitelist/blacklist.


Pretty much. As far as most people are concerned, FreeSync is Adaptive Sync by another name. The hardware and implementation remain the same but the branding attempts to drive differentiation where there is none ( Kind of like how PCI-E x4 b/w sourced M.2 is "Ultra M.2" or "Turbo M.2" depending on who's trying to sell it to you). It allows AMD to keep their branding front-and-centre, but tends to obfuscate the actual implementation. Even AMD's FAQ alludes to the fact that the two are different but provides no actual example of the differences:


> * How are DisplayPort Adaptive-Sync and AMD FreeSync™ technology different*?
> DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like AMD FreeSync™ technology. AMD FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.
> *Is DisplayPort Adaptive-Sync the industry-standard version of AMD FreeSync™ technology?*
> The DisplayPort Adaptive-Sync specification was ported from the Embedded DisplayPort specification through a proposal to the VESA group by AMD. DisplayPort Adaptive-Sync is an ingredient feature of a DisplayPort link and an industry standard that enables technologies like AMD FreeSync™ technology.



I suspect that once DP1.3 becomes entrenched, the separate branding will largely dissolve and Adaptive Sync will just become another ubiquitous feature found on all but the most basic feature-set monitors.


----------



## Cybrnook2002 (Mar 20, 2015)

HumanSmoke said:


> Pretty much. As far as most people are concerned, FreeSync is Adaptive Sync by another name. The hardware and implementation remain the same but the branding attempts to drive differentiation where there is none ( Kind of like how PCI-E x4 b/w sourced M.2 is "Ultra M.2" or "Turbo M.2" depending on who's trying to sell it to you). It allows AMD to keep their branding front-and-centre, but tends to obfuscate the actual implementation. Even AMD's FAQ alludes to the fact that the two are different but provides no actual example of the differences:
> 
> 
> I suspect that once DP1.3 becomes entrenched, the separate branding will largely dissolve and Adaptive Sync will just become another ubiquitous feature found on all but the most basic feature-set monitors.


Yes, I agree with your conclusion that Adaptive Sync and FreeSync are the same thing. Having helped VESA create the standard, AMD is now able to offer certification on the technology. (And they should be able to.)

I think the devil in the details will be those monitors that haven't gone through the AMD certification program but offer Adaptive Sync: they MAY give users a sub-par experience, which people might confuse with and throw back at AMD, calling it FreeSync. FreeSync monitors are only those monitors AMD has sanctioned to perform within AMD spec and deliver AMD's sought-after experience in the Adaptive Sync realm. This is where I think we might see a fork in the road.....

Keep in mind a "FreeSync" experience will involve a driver portion from AMD, so they do have skin in the game. This is why they also have the cert program, to make sure it's all correct. Running an Intel Broadwell with a plain Adaptive Sync monitor, though, is not AMD anymore.

I am just glad we are finally getting rid of the v-sync vs. input lag vs. tearing trade-off across the board. (Or at least are working towards it.)


----------



## fullinfusion (Mar 20, 2015)

Good for you, AMD, and I can't wait. Did I see a 15.3 driver mentioned? Hmm, cool. Testing time.


----------



## ZoneDymo (Mar 20, 2015)

So after all of that, seeing that last image..... we still are not out of the woods.....

Either you get tearing or you get input lag. I'm sure it's all mitigated a little, but damn it, why can't we just have neither already?


----------



## Cybrnook2002 (Mar 20, 2015)

ZoneDymo said:


> So after all of that, seeing that last image..... we still are not out of the woods.....
> 
> Either you get tearing or you get input lag, im sure tis all mitigated a little but damn it, why cant we just have neither already?


Only when you're outside the boundaries of adaptive sync. This is a good thing: you get to choose whether you want v-sync on or off when you dip below. And honestly, hopefully you're not gaming at less than 30 fps.


----------



## MxPhenom 216 (Mar 20, 2015)

Dj-ElectriC said:


> Minimum required 40FPS on some monitors does not make me happy at all.
> Currently, to have always above 40FPS on 1440P you got to have enough GPU horsepower to use decent settings on many new games. Otherwise it won't be worth it much.
> 
> Hoped to see is in the low 30s. Anyway, with a new top end gen of AMD cards this shouldn't (hopefully) be the case if performance are as leaked.


770/280x or better is all you need.


----------



## Patriot (Mar 20, 2015)

Yeah, the marketing wording makes it a touch confusing... A FreeSync-certified panel is just a panel that has Adaptive Sync. FreeSync is the driver being aware of the adaptive sync and handling the delivery of frames to take advantage of it. So FreeSync and Adaptive Sync are two sides of the same coin. Anyone can make an Adaptive Sync aware driver and use a FreeSync-certified monitor... There are no whitelists or blacklists, only a free and open standard that AMD pushed into VESA.

But if you want to get technical, call FreeSync the driver portion and call it proprietary, then you are probably correct in that AMD will not do your work for you and make you a driver... but if you are on the green team you probably think that is a good thing... so you can stop your bitching.

The troll unsubbed ... Darn.


----------



## semantics (Mar 20, 2015)

The 40-48 Hz minimum refresh range suggests that expensive Nvidia module actually does something, seeing as they can do a 30 Hz minimum. Hard to compare apples to apples on cost when performance isn't apples to apples. Still waiting for numbers not produced by AMD, but a 40 Hz minimum is pretty disappointing; still waiting for a monitor that will do a 20 Hz minimum.


----------



## Patriot (Mar 20, 2015)

semantics said:


> 40-48 min refresh range seems like that expensive nvidia module actually does something seeing as they can do 30 min. Hard to say apples to apples in cost when performance isn't apples to apples. Still wait for numbers not produced by amd but 40 min is pretty disappointing, still waiting for a monitor that will do 20 min.



FreeSync can go down to 9... It is the panels that are lacking, not adaptive sync or FreeSync...

As it is an open standard... it is up to the individual manufacturer how they implement it in the panel.


----------



## semantics (Mar 20, 2015)

Patriot said:


> Freesync can go down to 9 ... It is the panels that are lacking not the adaptive or freesync...
> 
> As it is an open standard.... it is up to the individual manufacturer as to how them implement it in the panel.


The comment was just a snide remark about people clamoring about things being cheaper yet equal. There is always a price to pay; till things actually hit consumers it's all just hypotheticals and pointless to me. I'm not investing in either technology till I see monitors hitting a 20 Hz minimum or better. With AMD that could be whenever, I suppose, given it is up to the manufacturers to invest. With Nvidia I suppose it's when they release an update to their module, probably with a cheaper ASIC instead of repurposed chips. So whoever does that first gets my money.

http://www.guru3d.com/articles-pages/amd-freesync-review-with-the-acer-xb270hu-monitor,3.html


> *Q: What is the supported range of refresh rates with FreeSync and DisplayPort Adaptive-Sync?*
> *A:* AMD Radeon graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.


Seems like I'll be waiting for 21-144 Hz.


----------



## Patriot (Mar 20, 2015)

semantics said:


> The comment was just a snide remark about people clamoring about things being cheaper yet equal. There is always a price to pay till things actually hit consumers it's all just hypotheticals and pointless to me. I'm not investing in either technology till i see monitors hitting 20 min or better. With AMD that could be when ever i suppose given it is up to the manufactures to invest. With nvidia i suppose it's when they release an update to their module probably with a cheaper ASCI instead of repurposed chips. So who ever does that first wins gets my money.



There are pros and cons to a walled garden... AMD's open solution is more flexible; Nvidia's is more consistent. FreeSync also allows you to choose how it acts when outside the panel's range of adaptive sync: you can have v-sync kick in or not. G-Sync doesn't have that option. Frankly, if you are much below 40 fps you are not going to have enjoyable gameplay, and around 30 you start getting panel flicker.

While the spec may allow for as low as 9... it's going to take some magic on the panel side to make it work.
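The choose-your-behaviour idea described here can be sketched roughly like this (hypothetical function and flag names, purely to illustrate the options, not AMD's actual API):

```python
def present_mode(fps, vrr_min, vrr_max, vsync_below_range=True):
    """Pick how a frame is presented, per the behaviour described above.

    Inside the panel's VRR window the refresh rate tracks the frame rate.
    Below it, the user chooses v-sync (no tearing, added lag) or no v-sync
    (tearing, minimal lag). Above it, output is pinned to the panel max.
    """
    if vrr_min <= fps <= vrr_max:
        return ("adaptive", fps)          # refresh follows the GPU
    if fps < vrr_min:
        if vsync_below_range:
            return ("vsync", vrr_min)     # no tearing, some lag
        return ("tearing", vrr_min)       # minimal lag, tearing
    return ("vsync", vrr_max)             # capped at panel maximum

print(present_mode(75, 40, 144))  # -> ('adaptive', 75)
print(present_mode(25, 40, 144))  # -> ('vsync', 40)
```

The `vsync_below_range` flag stands in for the user-facing toggle; G-Sync, as described, simply has no equivalent branch.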


----------



## semantics (Mar 20, 2015)

Patriot said:


> There are pros and cons of a walled garden...    AMD's open solution is more flexible, Nvidia's is more consistent.   *  Freesync also allows you to chose how it acts when outside the panels range of adaptive sync.   You can have vsync kick in or not.   Gsync doesn't have that option.*


Not really a feature I care about, I'm buying such a monitor to eliminate tearing. Neither solution can deal with frame rates above monitor refresh rate, pretty sure they never will. So I'll just cap fps, seems like a waste of my money if I accept screen tearing anyways.


----------



## the54thvoid (Mar 20, 2015)

MxPhenom 216 said:


> 770/280x or better is all you need.



Alas no. 280(x) is not supported.


----------



## RejZoR (Mar 20, 2015)

Of course it's not free. You have to buy monitor that has FreeSync...


----------



## Recus (Mar 20, 2015)

RejZoR said:


> Of course it's not free. You have to buy monitor that has FreeSync...



Yeah. And I wonder, when you buy an MSI GTX 980, do you get the Military Class caps and Samsung memory for free?


----------



## Sony Xperia S (Mar 20, 2015)

Only 11 compatible displays, with Samsung being the only brand with 4k offering(s).

Good, AMD, just expand the support, please, to more than this.


----------



## RCoon (Mar 20, 2015)

Funny, the monitors that have Freesync enabled in the UK are dreadfully overpriced considering this is something that's supposed to be a free VESA standard, almost as expensive as similar GSync enabled offerings. Perhaps we should title it almost-as-expensive-sync.


----------



## jigar2speed (Mar 20, 2015)

RCoon said:


> Funny, the monitors that have Freesync enabled in the UK are dreadfully overpriced considering this is something that's supposed to be a free VESA standard, almost as expensive as similar GSync enabled offerings. Perhaps we should title it almost-as-expensive-sync.



Just because of the UK??


----------



## Yorgos (Mar 20, 2015)

MakeDeluxe said:


> "No proprietary hardware"
> Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.
> 
> Also, dang those LG ultrawides look enticing


It works.
A leaked driver (for an ASUS laptop, IIRC) enabled G-Sync on an nVidia GPU connected via eDP.
What nvidia doesn't tell you is that their G-Sync is DOA and already supported on old h/w, e.g. all laptops with eDP.


----------



## BiggieShady (Mar 20, 2015)

What happens when using free sync and fps drops below minimal supported adaptive refresh rate? Does the screen go black like in leaked nv g-sync laptop driver?


----------



## the54thvoid (Mar 20, 2015)

Yorgos said:


> What nvidia doesn't tell you is that their G-sync is DOA and already supported in old h/w e.g. all laptops w/ eDP.



Coherent proof requested, otherwise the post is invalid. I've seen enough posts saying things are possible with 'x' hardware when they're not.
Also, proof of it happening plus actual critical dissection by a neutral source is required.
If the above requirements can't be fulfilled, then it's little more than trolling.

What we can say is the adaptive v-sync pathway looks to be better for all involved (except perhaps Nvidia).


----------



## Sony Xperia S (Mar 20, 2015)

the54thvoid said:


> Coherent proof requested, otherwise post is invalid. Seen enough posts saying things are possible with 'x' hardware when they're not.
> Also, proof of it happening plus actual critical dissection by neutral source required.
> If the above requirements can't be fulfilled, then its little more than Trolling.
> 
> What we can say is the adaptive v-sync pathway looks to be better for all involved (except perhaps Nvidia).



Sure, and you are violating your own requirements while being so neutral and not a troll. Not a small troll, a big one, actually.

Just say that it is a lie, the same way someone else said earlier today that nvidia's $999 price is a lie. We could have accused them of big trolling....


----------



## GhostRyder (Mar 20, 2015)

semantics said:


> Not really a feature I care about, I'm buying such a monitor to eliminate tearing. Neither solution can deal with frame rates above monitor refresh rate, pretty sure they never will. So I'll just cap fps, seems like a waste of my money if I accept screen tearing anyways.


But most of the problems arise when the FPS is changing, which is why most of these monitors are ones with above-60 Hz refresh rates. It is harder to maintain the higher refresh rates than it is around 60, so to me its purpose is at the upper extremes rather than the lower ones. Even with G-Sync, when you start dipping that low you're not having a great experience, and the same will go for FreeSync; but when you are in the 75-144 range you're going to get smooth play while the FPS is constantly changing.



Cybrnook2002 said:


> Only when your outside of the boundaries of adaptive sync. This is a good thing, you get to choose whether you want v sync on or off when you dip below. And honestly, hopefully your not gaming at less than 30 fps.


Yea, gaming below a certain point is still going to produce a bad experience. Where the line gets crossed is the main problem; for me I would probably not want to go much below 50, but I have heard down to 30 with these features is not too bad, though I would not shoot for that. I have seen some reviews already which claim it works, so I am happy, honestly, but I am still waiting to see it for myself. I would love to see a decently priced one available in the U.S. already and I may try one, but I still also need to wait for the CFX support next month.

Either way, this tech sounds cool and seems to work so I am interested.


----------



## Captain_Tom (Mar 20, 2015)

MakeDeluxe said:


> "No proprietary hardware"
> Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.
> 
> Also, dang those LG ultrawides look enticing



I guarantee Intel will support it by Skylake.  Then Nvidia will be forced to support it by the end of 2016.  Mark my words.


----------



## Ferrum Master (Mar 20, 2015)

Captain_Tom said:


> I guarantee Intel will support it by Skylake.  Then Nvidia will be forced to support it by the end of 2016.  Mark my words.



Where did you get your crystal ball?


----------



## HalfAHertz (Mar 20, 2015)

Yorgos said:


> It works,
> a leaked driver for aus, iirc, made an nVidia GPU connected w/ eDP (laptop) to enable G-sync.
> What nvidia doesn't tell you is that their G-sync is DOA and already supported in old h/w e.g. all laptops w/ eDP.



I disagree. Think about it: only GCN 1.1 cards support FreeSync, but all of nvidia's cards support G-Sync. Obviously there's some kind of component that does the hardware communication, which AMD decided to integrate into their newer cards while nvidia decided to keep external. Both options have pros and cons.
Maybe the 980 has the hardware built in and doesn't need the external solution.


----------



## Fluffmeister (Mar 20, 2015)

HalfAHertz said:


> I disagree. Think about it - only GCN 1.1 cards support freesync. But all of nvidia's cards support g-sync. Obviously  there's some kind of a component that does the hardware communication, that Amd decided to integrate into their newer cards while nvidia decided to keep it external. Both options have pros and cons.
> Maybe the 980 has the hardware built in and doesn't need the external solution.



Indeed, there is apparently some ghosting going on with Freesync too:












			
PCPer said:

> The ROG Swift animates at 45 FPS without any noticeable ghosting at all. The BenQ actually has a very prominent frame ghost though the image still remains sharp and in focus. The LG 34UM67 shows multiple ghost frames and causes the blade to appear smudgy and muddled a bit.
> 
> The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync? *NVIDIA has stated on a few occasions that there is more that goes into a VRR monitor than simply integrated vBlank extensions and have pointed to instances like this as an example as to why. Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects.* But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. *NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.*
> 
> ...



Source: http://www.pcper.com/reviews/Displa...hnical-Discussion/Gaming-Experience-FreeSync-


----------



## Captain_Tom (Mar 20, 2015)

Ferrum Master said:


> Where did you get your crystal ball?



Common sense.


----------



## Captain_Tom (Mar 20, 2015)

HalfAHertz said:


> I disagree. Think about it - only GCN 1.1 cards support freesync. But all of nvidia's cards support g-sync. Obviously  there's some kind of a component that does the hardware communication, that Amd decided to integrate into their newer cards while nvidia decided to keep it external. Both options have pros and cons.
> Maybe the 980 has the hardware built in and doesn't need the external solution.



Wrong.  Everything GTX 650 and up supports it.


----------



## Phobia9651 (Mar 20, 2015)

Still waiting for this monitor:
24-25 inch size
IPS/AHVA/PLS display panel
WQHD resolution
120-144 Hz refresh rate
1 ms response time
AdaptiveSync support
no PWM flickering
(quasi) bezelless design

Come on LG.Display/Samsung/AU.Optronics I know you can do it!


----------



## Yorgos (Mar 20, 2015)

the54thvoid said:


> Coherent proof requested, otherwise post is invalid. Seen enough posts saying things are possible with 'x' hardware when they're not.
> Also, proof of it happening plus actual critical dissection by neutral source required.
> If the above requirements can't be fulfilled, then its little more than Trolling.
> 
> What we can say is the adaptive v-sync pathway looks to be better for all involved (except perhaps Nvidia).


There's a link to a forum post where an actual nvidia customer claims it:
http://www.pcper.com/reviews/Graphics-Cards/Mobile-G-Sync-Confirmed-and-Tested-Leaked-Alpha-Driver



HalfAHertz said:


> I disagree. Think about it - only GCN 1.1 cards support freesync. But all of nvidia's cards support g-sync. Obviously  there's some kind of a component that does the hardware communication, that Amd decided to integrate into their newer cards while nvidia decided to keep it external. Both options have pros and cons.
> Maybe the 980 has the hardware built in and doesn't need the external solution.


I don't see anything there to actually disagree with.
I am not stating an opinion.
What I am saying is that nVidia is not telling you that G-Sync already exists under a different name.
They have not invented the wheel; they are trying to re-invent it and charge you for it.
The part about GCN or GM1xx or GM2xx or some VLIW architecture supporting some-Sync technology goes like this:
if eDP vX.X is supported by a GPU, then that GPU is able to support this some-Sync.
Now, the DisplayPort you have on your monitor is very different from eDP. The standard makes the adaptive refresh rate optional, which means no one bothered to bring it to their monitors and GPUs.
On the other hand, eDP can save you a lot of power by forcing the makers to give you control over the refresh rate of your screen. It is not the same as DP, but there the feature is NOT optional. It has more features because it is aimed at special devices.

For example, FEC (forward error correction) is optional in the IEEE 802.3-2008 standard, but no one bothers to support it except a few companies that sell IP or h/w to research centers or data centers... not to customers like you and me.


----------



## Yorgos (Mar 20, 2015)

Captain_Tom said:


> Wrong.  Everything GTX 650 and up supports it.


As I said before, everything that implements that particular feature of the standard supports it.


----------



## ZoneDymo (Mar 20, 2015)

urza26 said:


> Still waiting for this monitor:
> 24-25 inch size
> IPS/AHVA/PLS display panel
> WQHD resolution
> ...



Waiting for the same, but in 32/34 inch.


----------



## Captain_Tom (Mar 20, 2015)

ZoneDymo said:


> waiting for the same but then in 32/34 inch



Make it 5K Ultra Wide and then we are talking.


----------



## Prima.Vera (Mar 21, 2015)

urza26 said:


> Still waiting for this monitor:
> 24-25 inch size
> *IPS/AHVA/PLS display panel*
> WQHD resolution
> ...



No, they cannot. At least not with the panel types you listed. Currently not even a crappy TN panel can do a true 1 ms, and that's only grey-to-grey. The only monitors capable of 1 ms or less at this moment are the defunct plasma or CRT types. LCDs just cannot switch that fast. *It's pure physics and chemistry*.


----------



## AsRock (Mar 21, 2015)

Dj-ElectriC said:


> Minimum required 40FPS on some monitors does not make me happy at all.
> Currently, to have always above 40FPS on 1440P you got to have enough GPU horsepower to use decent settings on many new games. Otherwise it won't be worth it much.
> 
> Hoped to see is in the low 30s. Anyway, with a new top end gen of AMD cards this shouldn't (hopefully) be the case if performance are as leaked.



From what I have read on their site it's anything above 29 fps, but you do say some, so better watch out.

I'll wait until 40" HDTVs have it, as I have no interest in going back to a lil monitor any time soon; by that time hopefully shit will have matured.


----------



## aasim (Mar 21, 2015)

Big thumbs up for AMD. You did great by delivering a technology without charging any premium. Really nice to see someone making the effort and opening it up to others.

Last but not least, I am an Nvidia user, but I hate when they make everything closed, like PhysX and G-Sync.

Currently own a 750 Ti; hope it will soon get adaptive sync like Nvidia's new mobile GPUs did.


----------



## ZoneDymo (Mar 21, 2015)

AsRock said:


> What i have read on their site it's any thing above 29fps, but you do say some so better watch out.
> 
> I'll wait until 40" HDTV's have it as i have no interest going back to a lil monitor any time soon, by that time hopefully shit will have matured.



The technology's minimum is 9 fps.
It's the monitors that don't go that low.
Televisions, I think, all go down to 24 fps.


----------



## TheoneandonlyMrK (Mar 21, 2015)

Well, all the pieces are coming together for a couple of upgrades to my PC, i.e. an R9 390X and a BIG pixel screen. Now if I could just step out of poverty I'm on it; I would consider swapping a kidney for three 4Ks and a few GPUs. Any offers?


----------



## 1d10t (Mar 22, 2015)

*Forgot login pass to TPU

Yeah, thank you AMD for bringing us some affordable technologies.
Good luck to G-Sync though...



theoneandonlymrk said:


> Well all the pieces are coming together for a couple of upgrades to my pc ie an rd 390x and a BIG pixel screen now if I could just step out of poverty im onit, i wwould consider swapping a kidney for three 4ks and a few gpus any offers.


----------



## NympH (Mar 22, 2015)

Now we just need some monitors....


----------



## newconroer (Mar 22, 2015)

Good to see it finally rolling out. However, I feel a bit misled. Unless I misunderstood it previously, their version of adaptive v-sync was supposed to run on any monitor; all it required was an AMD *FreeSync*-compatible GPU or APU.
Now I come to find out that you have to purchase a monitor where they've baked adaptive v-sync support into the DisplayPort 1.2a connection.


That's a strike against it, because like G-Sync, it costs you more money.

Adaptive v-sync will be standard on DisplayPort 1.3 anyway, so don't rush out and buy a monitor that supports FreeSync unless you really like the monitor itself. When 1.3 hits, your selection will increase.

Lastly, this advanced adaptive v-sync craze is really aimed at the mainstream gamer who knows very little about frame rendering times and frame latency. The problem is that frame latency, frame time, and the other variables of frame rendering mechanics are only now being discussed in GPU and performance reviews.
It's not common knowledge, and I question how they expect to sell people on the idea that FreeSync makes your game smoother when most people didn't even notice their gaming was 'unsmooth' to begin with.
Additionally, FreeSync is far from automatic at the driver level. If a lot of people still struggle to find their Catalyst Control Panel... or know what a monitor OSD is, where does that leave them when trying to fiddle about setting up FreeSync?



Once you do some research and learn a bit about how an image is rendered, you find out that for the most part you can achieve much the same effect as G-Sync/FreeSync without needing to buy additional hardware.
Making use of tools such as RadeonPro, RTSS and CRU (to create custom resolutions) is the key to achieving a much smoother experience, all while using the hardware you currently own.


Setting up a custom monitor resolution and/or capping your frame rate takes about the same amount of time as it does to enable FreeSync.
The former option is free; the latter is costly.



EDIT: In addition to what I posted, the arguments over market share are laughable. This 'technology' is not game-breaking. While I wouldn't call it a gimmick, it's not some new architecture that will take us into a grand age of computer graphics and performance.
It's a bonus feature if nothing else, and not every monitor, GPU or APU is going to support it.


----------



## semantics (Mar 22, 2015)

I think it's a real feature for IQ, just like AA is. Some are more sensitive to such things than others. I'd list it sorta like PhysX on high: you probably wouldn't care if you never had it, but if you had it and it was removed, you'd notice and maybe care.

Latency from v-sync is such a fake issue when playing games online, and a non-issue when you're playing single-player games. If people really cared so much that they had to turn off v-sync because of the latency, they'd choose their monitors more carefully, look at overall monitor latency, and take steps to reduce overall system latency, meaning turning off things in the BIOS, etc.

The real issue was keeping sync without the stutter of 15/30/60 fps locks. Adaptive v-sync, debuted by nvidia, gave you sync when you could afford it and turned it off when it would probably cause stutters; a minor solution. G-Sync, FreeSync and Adaptive Sync (really confusing name choice, imo) are solutions with greater scope.


----------



## Relayer (Mar 23, 2015)

metalslaw said:


> Atm, all that freesync is a 'ati and intel' gpu technology. With g-sync being a nvidia gpu technology.
> 
> Without either camp budging, this may end up turning into a monitor battle as well. With monitors coming with either freesync, _or_ g-sync only (thus locking in the consumers choice of graphics manufacturer).
> 
> The only real hope for consumers, is if monitor makers start shipping monitors with both freesync and g-sync enabled in the same monitor.


If nVidia wants variable refresh in laptops, they'll support it.


----------



## Relayer (Mar 23, 2015)

arbiter said:


> Had a bit of a thought, Since AMD has to certify all monitors to be freesync. They have pretty much locked up freesync to AMD only gpu's. Since g-sync module could possible be updated firmware to support it, but since freesync software is as it stands amd proprietary software. AMD in sense has done same thing everyone rips on nvidia for, they just did it under everyones nose's.



A monitor doesn't have to have the FreeSync branding to work on AMD cards. There is no DRM check or anything. Asus is supposed to be releasing a compatible monitor that's not FreeSync certified but will work just fine.


----------



## Relayer (Mar 23, 2015)

m6tzg6r said:


> As a gamer who always maintains 60 fps, is there anything about G-Sync or FreeSync that can do anything positive for me, or is it intended only for people whose frame rates fall below 60 during gaming?


I think it will be even more useful for people who have 120+ Hz monitors. There's a big space between 60 Hz and 120-144 Hz (FreeSync is actually certified up to 240 Hz) where G-Sync/FreeSync can improve the gameplay experience.
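One way to picture that space: inside the panel's variable-refresh window, the refresh rate simply tracks the GPU, and only outside it does clamping (or tearing/stutter) come back. A rough sketch, using a hypothetical 40-144 Hz window and ignoring low-framerate compensation:

```python
def panel_refresh_hz(fps, vrr_lo=40, vrr_hi=144):
    """Refresh rate a variable-refresh panel would run at for a given
    frame rate: it matches the GPU inside the window and clamps to the
    nearest bound outside it (low-framerate compensation ignored)."""
    return min(max(fps, vrr_lo), vrr_hi)

# Anywhere in the 40-144 range the panel matches exactly: 61, 90, or
# 137 fps all display without tearing or quantization, which a fixed
# 60 Hz or 144 Hz monitor cannot do.
```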


----------



## Relayer (Mar 23, 2015)

the54thvoid said:


> This is all good news, basically.  Anandtech has a nice article on it (which kind of puts the kibosh on the whole G-Sync performance-hit claim; the hit is ludicrously small, as to be imperceptible).  The point is, will it make G-Sync cheaper? Maybe a wee bit, but G-Sync only works with NVIDIA, and FreeSync only works with what supports DP 1.2a (which notably isn't supported by the R9 280/X or 270/X; bummer for some).
> 
> *If you buy NVIDIA currently, you probably don't mind paying a bit more for G-Sync*; after all, it is cheaper than buying a new AMD GPU and FreeSync monitor.  If you have a GCN 1.1(?) GPU now, it's great for you as well.
> 
> There's no need for folks to get all antsy about it and leap to either side's defence.



Salesperson: How about an $800 TN monitor to go with your $1000 graphics card?
Customer: Can I, please?


I'm hoping you'll be able to get a 390 with a 1440p FreeSync-capable monitor for $1000.


----------



## Relayer (Mar 23, 2015)

ZoneDymo said:


> So after all of that, seeing that last image... we are still not out of the woods.
> 
> Either you get tearing or you get input lag. I'm sure it's all mitigated a little, but damn it, why can't we just have neither already?


Limit your FPS to the refresh rate.
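A minimal sketch of that advice: a frame limiter just sleeps out whatever is left of each frame's time budget, so the loop can never outpace the refresh rate (the 100 fps cap and trivial render step here are only illustrative):

```python
import time

def run_capped(target_fps, n_frames, render=lambda: None):
    """Render n_frames, sleeping out the rest of each frame's time
    budget so the loop never exceeds target_fps."""
    budget = 1.0 / target_fps           # per-frame budget, e.g. 10 ms at 100 fps
    frame_times = []
    for _ in range(n_frames):
        start = time.perf_counter()
        render()                        # the game's work for one frame
        spent = time.perf_counter() - start
        if spent < budget:
            time.sleep(budget - spent)  # wait out the remaining budget
        frame_times.append(time.perf_counter() - start)
    return frame_times
```

With the cap at or just below the monitor's refresh rate, V-Sync never has a fully rendered frame queued up waiting for the next refresh, which is what trims the input lag.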


----------



## P-40E (Mar 25, 2015)

This is BS! They are purposely not allowing FreeSync support on the R9 270 and R9 280 models, because that would make it also work with 7000-series cards! This is not fair to people who bought brand-new R9 series cards! When you buy a new product, you expect it to be current! This is very shady of AMD! And because of this little marketing stunt, I will never buy from AMD again! I was going to buy an R9 290 this spring instead of the GTX 970, but now I am going to buy the GTX 970! And I will throw my R9 270 in the trash where it belongs! Or just give it away to anyone who needs it, if they pay the shipping.


----------



## Relayer (Mar 26, 2015)

P-40E said:


> This is BS! They are purposely not allowing FreeSync support on the R9 270 and R9 280 models, because that would make it also work with 7000-series cards! This is not fair to people who bought brand-new R9 series cards! When you buy a new product, you expect it to be current! This is very shady of AMD! And because of this little marketing stunt, I will never buy from AMD again! I was going to buy an R9 290 this spring instead of the GTX 970, but now I am going to buy the GTX 970! And I will throw my R9 270 in the trash where it belongs! Or just give it away to anyone who needs it, if they pay the shipping.


Are you certain that it's not simply that the display controllers aren't capable? I think you might be mistaken and overreacting.


----------

