# FreeSync or G-Sync or both?



## VulkanBros (Feb 11, 2016)

Hi,

I am about to buy a new monitor (2560 x 1440, under 5 ms grey-to-grey) with at least 120 Hz.
I have a GTX 970 - so is it better to buy a G-Sync monitor? Or is there a monitor that has both technologies?

I don't know if my next GPU will be an AMD or nVidia - so could it be better to buy a neutral monitor with 144 Hz (no FreeSync and no G-Sync)?

And how does a monitor with FreeSync behave on nVidia, or G-Sync with an AMD GPU?

Suggestions?


----------



## Beastie (Feb 11, 2016)

You need an Nvidia GPU to work with Gsync.

Or you need an AMD GPU (or any other GPU with the appropriate driver) to run a Freesync monitor.

So you're stuck on one technology or the other at the moment.

Freesync monitors are cheaper - Nvidia take a substantial cut for their proprietary tech whereas AMD are giving Freesync away.
So you aren't really out of pocket if you go for the Freesync option even if you are using an Nvidia GPU. It will just work as a regular monitor.

I've got a G-Sync monitor and the tech does work exactly as advertised. Your 970 would be a great card to match with a G-Sync monitor IMO.


----------



## trog100 (Feb 11, 2016)

the way g-sync seems to work for me is it enables smoother running at lower fps.. at 1440 your 970 is not going to be banging out high frame rates.. in fact it will struggle..

with a more powerful card just a 144 hz monitor without dynamic syncing does a good enough job..

beastie has said it about right..

trog


----------



## VulkanBros (Feb 11, 2016)

I see you have an Acer and an ASUS - the ASUS being slightly more expensive (in Denmark, that is) .... trog, you say my GPU will struggle -
would it be better to buy a 1920 x 1080 144 Hz G-Sync monitor?

And I assume that there is no such monitor with both FreeSync and G-Sync - Google can't find it..... what a shame


----------



## the54thvoid (Feb 11, 2016)

VulkanBros said:


> And I assume that there is no such monitor with both FreeSync and G-Sync - Google can't find it..... what a shame



Correct - not going to happen.  Nvidia wouldn't want it to happen, and despite what AMD folksies say, FreeSync is just as proprietary in that you need a specific GCN arch card to run it.


----------



## GhostRyder (Feb 11, 2016)

VulkanBros said:


> Hi,
> 
> I am about to buy a new monitor (2560 x 1440, under 5 ms grey-to-grey) with at least 120 Hz.
> I have a GTX 970 - so is it better to buy a G-Sync monitor? Or is there a monitor that has both technologies?
> ...


 Currently Nvidia only supports G-Sync and AMD only supports FreeSync (based on Adaptive-Sync).

G-Sync is an added module inside the monitor that only works (and probably only ever will) with an Nvidia card that supports G-Sync.
FreeSync is AMD proprietary as well, but the tech uses the monitor's Adaptive-Sync scaler to handle it.  So while the FreeSync name and software are AMD's, the scaler tech can be implemented by anyone - even Intel is (supposedly) working to support it - and there is more of a chance Nvidia will support it than of AMD supporting G-Sync (of course I am not saying it's the better choice, just that it's more likely than the other way around the way things are).

I have an Acer FreeSync monitor and it's great, though I have also used the Asus ROG Swift, which is just as great a tech.  In both cases I find them on equal footing, with the base range currently better on G-Sync than FreeSync.

If you want to use G-Sync or FreeSync where they shine - which is anywhere above 60 Hz - you're going to need more GPU power (though at 1080p 144 Hz you will probably be OK; it's 1440p 144 Hz where it gets demanding).  I would say if you want to keep the card for a while, G-Sync is the way to go, since that's what you can run now, and you can just add another GTX 970 in the future and be fine on performance.  Or wait and see what the new cards bring and decide from there.  FreeSync generally does not add as much to the price as G-Sync does, so if you buy a G-Sync monitor you're probably going to want to keep it for a while and stick with Nvidia, whereas FreeSync is not as expensive an upcharge, which may benefit you in the long run if you switch sides (or if things change).


----------



## VulkanBros (Feb 11, 2016)

That's the thing - I don't want to be bound to nVidia or AMD GPUs - I want freedom ;-) 
- if my next build had an AMD GPU and I had purchased a G-Sync monitor - that would be like tossing money out of the window.....


----------



## INSTG8R (Feb 11, 2016)

Honestly, owning a Freesync monitor, I think it's just a "gimmick". The monitor is 144Hz, so why would I need Freesync if I'm already doing over 60 FPS as it is?  It only applies when you're frame limited. Set to 144Hz it never tears, regardless of whether it's 30 FPS or 300...


----------



## EarthDog (Feb 11, 2016)

VulkanBros said:


> I see you have an Acer and an ASUS - the ASUS being slightly more expensive (in Denmark that is) .... trog you say my GPU will struggle,
> would it be better to buy a 1920 x 1080 144 Hz G-Sync monitor?


Or a better card... for 2560x1440, I would go GTX 980, or 290X and higher on the AMD side. The 970 won't last long at 2560x1440 with IQ settings on high or greater.


----------



## GhostRyder (Feb 11, 2016)

VulkanBros said:


> That's the thing - I don't want to be bound to nVidia or AMD GPUs - I want freedom ;-)
> - if my next build had an AMD GPU and I had purchased a G-Sync monitor - that would be like tossing money out of the window.....


 Here is a roughly equivalent set of monitors (well, at least close, from what I saw available currently - there may be better selections out there), one with G-Sync and the other with Freesync.

Freesync (I own this monitor)
http://www.newegg.com/Product/Product.aspx?Item=N82E16824009769

G-Sync
http://www.newegg.com/Product/Product.aspx?Item=N82E16824009848

Basically there is a premium on monitors with G-Sync and not as much on Freesync.  If you really don't want to be tied down, I would get Freesync, since the price difference for a 1440p 144Hz monitor with or without it is not that big (it's actually getting a little more difficult to find one without either tech installed now).  But if you want to experience it, get the G-Sync monitor and just stick with Nvidia, or sell the monitor once you switch.


----------



## trog100 (Feb 11, 2016)

VulkanBros said:


> I see you have an Acer and an ASUS - the ASUS being slightly more expensive (in Denmark, that is) .... trog, you say my GPU will struggle -
> would it be better to buy a 1920 x 1080 144 Hz G-Sync monitor?
> 
> And I assume that there is no such monitor with both FreeSync and G-Sync - Google can't find it..... what a shame




i think so and i have a nice non g-sync to spare.. 24 inch 1080 acer predator 144 hz.. its not very old, bought last summer, and would suit your 970 card well..  let me know if you are interested.. i can post it to you.. you can have it for £100 plus shipping..

trog


----------



## VulkanBros (Feb 11, 2016)

trog100 said:


> i think so and i have a nice non g-sync to spare.. 24 inch 1080 acer predator 144 hz.. its not very old, bought last summer, and would suit your 970 card well..  let me know if you are interested.. i can post it to you.. you can have it for £100 plus shipping..
> 
> trog



Thanx for the offer - is it this one?:  Acer 24" LED Predator GN246HLB http://www.amazon.co.uk/dp/B00IG0Z0HY/?tag=tec053-21


----------



## Cybrnook2002 (Feb 11, 2016)

If you are going to buy a new monitor, DO NOT compromise and buy it for the 970 you have now (think smarter, think ahead). Typically a monitor is something you keep for a few years. GPUs come and go, but that monitor will stay.

If you invest in a monitor, get the resolution you want (1440p is a good balance of eye candy without needing TOO much power to drive it).


----------



## Beastie (Feb 11, 2016)

Cybrnook2002 said:


> If you are going to buy a new monitor, DO NOT compromise it and buy it for the 970 you have now (think smarter, think ahead). Typically a monitor is something you keep for a few years. GPU's come and go, but that monitor will stay.
> 
> If you invest in a monitor, get the resolution you want (1440p is a good balance on eye candy, but not needing TOO much power to see it).


OP said they were looking at 1440p monitors.


----------



## EarthDog (Feb 11, 2016)

Beastie said:


> OP said they were looking at 1440p monitors.


OP also inquired about stepping down to 1080p 144Hz monitors...


----------



## Cybrnook2002 (Feb 11, 2016)

EarthDog said:


> OP also inquired about stepping down to 1080p 144Hz monitors...


It was post #4 actually, but thanks for understanding what I was saying (you get it). And it also seems he is being offered an Acer 1080p from Trog.

Is the investment really worth it?


----------



## hat (Feb 11, 2016)

In all honesty I wouldn't worry about buying either. I feel proprietary technologies like this generally fail. Look at PhysX. It was cool, it was a nice feature, but how many games actually supported it? And now, if I'm not wrong, PhysX is basically dead. Look at AMD's Mantle. Great technology, worked/helped where applicable... but it didn't exactly take the gaming world by storm. I will, however, credit it with somehow contributing to/being incorporated into the development of DirectX 12, as I've read. 

In short, just buy a monitor and enjoy it. If you must get FreeSync or G-Sync, then get G-Sync. Again, I don't like the idea of buying into proprietary tech, but given the choice of one or the other, especially in your situation, I'd take G-Sync. You already have a GTX 970, so you can use it right away. And, though I don't know what the future holds, I generally prefer Nvidia over AMD. Their cards since the 5xx series, or at least the 6xx series, have overall been beating AMD in both performance and power consumption, so if I were planning a new GPU purchase I'd probably be looking at Nvidia anyway.


----------



## Niteblooded (Feb 12, 2016)

G-Sync and FreeSync are successors to Vsync that are superior in every single way.   Everyone wanted to love Vsync, but the input lag it introduces just makes it pointless, so the chances of G-Sync and FreeSync going away _anytime soon_ are pretty much nil.   Another reason it is not going to go away is exactly why the OP is hesitant... it locks you in.   AMD is very smart to make FreeSync free for monitor manufacturers because, as Cybrnook2002 said, graphics cards come and go but monitors almost always stay.   So you buy a FreeSync monitor and guess what your next video card purchase is most likely going to be?   Right.   But they can get away with it because the tech is freaking amazing.   My previous monitor did nothing but screen tear in every single game.  I love it, it's a great monitor, but the screen tearing is super annoying.   G-Sync removes that, and the 144Hz refresh rate keeps it buttery smooth.

I'm currently driving my 2560x1440 G-Sync monitor with a GTX 970, and so far I keep every setting maxed out in games.   Now, I haven't downloaded Tomb Raider yet, so that may change, but I'm pretty sure the games I can't max out will still run on high quality settings.  I do plan on moving this card to my backup machine when the Pascal cards release, but a 970 can drive a 1440p monitor at this current time.   I most definitely would *not* buy a 1080p monitor, because monitors are big purchases that last a while.   Don't buy one for your current graphics card but for the future.   Obviously you want to make sure your current card can drive it, but don't stay with 1080p just because you have a 970.


While you will see that pretty much all monitor manufacturers have a FreeSync monitor, it looks like Samsung and LG are making the top tier FreeSync monitors.   Neither has a G-Sync monitor, so if you like those brands you may want to consider FreeSync and AMD.   Acer and Asus are making the top tier G-Sync monitors.   Dell and BenQ seem to have 1 or 2 models of each, so they are probably riding the fence, but neither has a standout model.   It's good to know who's going to bed with whom before you make a decision.   Nothing wrong with holding out either.  Honestly, if I didn't need/want a monitor so badly right now, I would have waited to see what 2016 brings.   That said, I'm quite happy with the Acer XB271HU.

While I'm not happy about being locked in, the way I see it is that whether you go Nvidia or AMD you are still going to get a great card, and the FPS difference is rarely big enough to go crazy over.    Out of all the variables in computer parts, the one constant is that the ever-raging battle of Nvidia vs AMD is always close.   A lot of people make it sound like a greater difference than it really is, but the truth is they trade blows.    No card totally dominates at all games.


----------



## trog100 (Feb 12, 2016)

VulkanBros said:


> Thanx for the offer - is it this one?:  Acer 24" LED Predator GN246HLB http://www.amazon.co.uk/dp/B00IG0Z0HY/?tag=tec053-21



yes its that one.. i changed to the asus rog swift because i needed an ips panel for accurate colour photo editing.. its had about four months use.. i still have the box and it came from scan..

trog


----------



## arbiter (Feb 12, 2016)

Beastie said:


> You need an Nvidia GPU to work with Gsync.
> Or you need an AMD GPU (or any other GPU with the appropriate driver) to run a Freesync monitor.
> So you're stuck on one technology or the other at the moment.
> Freesync monitors are cheaper - Nvidia take a substantial cut for their proprietary tech whereas AMD are giving Freesync away.
> ...


The reason Nvidia takes a cut is that inside that monitor is a module Nvidia made and perfected. They also did the legwork of testing pretty much every LCD panel to find out which ones react well to VRR (variable refresh rates) with minimal to no ghosting. AMD's way, on the other hand, is indeed free, but it leaves the monitor makers to do all the work to make it work right, and there are a few things they can't do, like frame doubling, which G-Sync does in the monitor whereas FreeSync relies on the video card for it. So that added cost people love to point out came from Nvidia doing the work, leaving monitor companies to do almost zero work except put the module in the monitor.  FreeSync had bad ghosting when you started to push it, and when you got below the monitor's minimum it would tear. Monitor makers have since worked on the ghosting - it's still not at G-Sync level yet, but it's gotten better than it was.


----------



## xfia (Feb 12, 2016)

yeah.. freesync is a part of the adaptive sync ecosystem that was in part developed by amd for free use by anyone in the industry. yes it was meant for nv too but they are picky about using amd tech that doesnt make them a pile of cash. 
you can get plenty of real information on all the industry standards that amd works on with microsoft, samsung, apple, JEDEC etc.. 
the 970 starts falling off at 1440p in vram-limited situations. 
i would sell the 970 and get a nano with a 1440p freesync monitor and add another nano down the road for max eye candy.. could even go with 1080p eyefinity or 1440p ultrawide or 4k. crossfire will handle it but 1440p eyefinity is not really for gamers imo.. unless you're showing off what you can do on normalish settings or less demanding games.


----------



## FordGT90Concept (Feb 12, 2016)

It was developed by VESA (they were behind VGA, DVI, and DisplayPort).  FreeSync is AMD's adaptation of the standard (which they also extended to HDMI), where G-Sync is entirely proprietary.  G-Sync's days are numbered where adaptive sync is just starting.

G-sync may be technically better now but the cost is prohibitive and the number of manufacturers for G-sync monitors will fade to nothing over the next several years.


As noted by other posters, adaptive sync only really matters in the 30-60 fps range.  You really won't see an advantage if you're running something that exceeds that (at least not from adaptive sync).


Having a GTX 970, your only option is G-sync.  NVIDIA hardware presently doesn't support the VESA adaptive sync standard.  If you're dead set on getting adaptive sync, I'd wait for the 14/16nm AMD cards (coming this year) and a FreeSync monitor.


----------



## xfia (Feb 12, 2016)

FordGT90Concept said:


> It was developed by VESA (they were behind VGA, DVI, and DisplayPort).  FreeSync is AMD's adaptation of the standard (which they also extended to HDMI), where G-Sync is entirely proprietary.  G-Sync's days are numbered where adaptive sync is just starting.
> 
> G-sync may be technically better now but the cost is prohibitive and the number of manufacturers for G-sync monitors will fade to nothing over the next several years.
> 
> ...


yeah, they have been working with vesa for what, 20 years or more.. it doesnt matter if freesync shuts off at higher frame rates on some monitors by oem choice - there is logic behind that which keeps the price down. plus you get things like ulmb, and you can use vsync at the same time.


----------



## VulkanBros (Feb 12, 2016)

Thank you all for shedding some light on this subject - my colleague says he has a spare GTX 970 (MSI) that he might sell me (he got a 980 Ti),
so I think I will go the SLI way and wait a few months before I decide what monitor to buy......


----------



## Dethroy (Feb 12, 2016)

I'll probably just wait this out and see who will reign supreme.
Pretty sure Nvidia will try to milk G-Sync as long as possible, and you can't really blame 'em. Right now they have the bigger GPU market share and the superior VRR technology. The outcome of the next GPU generation war will probably play a big role in the outcome of the VRR tech war. But I'd say we won't see a definitive winner until the 2nd iteration of Adaptive Sync.

My guess is we won't see a 2nd iteration of G-Sync. Nvidia is smart when it comes to business and they will adopt Adaptive Sync when the window of opportunity to make some sweet $ with G-Sync is about to close.

________________________________________________________________________________________________________________________
*Edit:* I myself am waiting for better monitor offerings (not happy with the current options) and Pascal + Polaris.

I have high hopes on *Samsung's new 144Hz offerings* and will also wait for Dell's U3415 successor and LG's new offerings.


----------



## FordGT90Concept (Feb 12, 2016)

AMD has the console market.  I guarantee you next generation consoles will have adaptive sync because it makes a huge difference.  In fact, I wouldn't be at all surprised if Nintendo's, Sony's, and Microsoft's interest is the reason why AMD pursued adaptive sync the way they did.  They undeniably have the most to gain.  As TVs turn to adaptive sync, so too will monitors.

Yes, NVIDIA, using their ~80% market share, is driving card buyers into a walled garden of G-sync monitors but, as I said previously, G-sync monitors are prohibitively priced, which keeps the market tiny.  I'd argue the market for FreeSync monitors is already larger simply because of their lower price--despite AMD's smaller market share.

Bear in mind that there is a third player that dwarfs AMD and NVIDIA: Intel.  Intel announced adaptive sync support is coming.  G-sync will only exist as long as NVIDIA wants to keep funding it.  The decision has already been made that adaptive sync is the future.  No one wants to license G-sync from NVIDIA.


----------



## Dethroy (Feb 12, 2016)

FordGT90Concept said:


> AMD has the console market.  I guarantee you next generation consoles will have adaptive sync because it makes a huge difference.  In fact, I wouldn't be at all surprised if Nintendo's, Sony's, and Microsoft's interest is the reason why AMD pursued adaptive sync the way they did.  They undeniably have the most to gain.  As TVs turn to adaptive sync, so too will monitors.


Good catch!



FordGT90Concept said:


> Bear in mind that there is a third player that dwarfs AMD and NVIDIA: Intel.  Intel announced adaptive sync support is coming.  *G-sync will only exist as long as NVIDIA wants to keep funding it. The decision has already been made that adaptive sync is the future.*  No one wants to license G-sync from NVIDIA.


That is exactly how I expect it to pan out:


Dethroy said:


> My guess is we won't see a 2nd iteration of G-Sync. Nvidia is smart when it comes to business and they will adopt Adaptive Sync when the window of opportunity to make some sweet $ with G-Sync is about to close.


----------



## trog100 (Feb 12, 2016)

i have made a few recent changes.. i was running a pair of 970 cards with a non adaptive sync 1080 144 hz 24 inch gaming panel.. the combination worked fine, i never saw tearing or any problems..

i then got rid of the 970 cards and bought a couple of 980 TI cards.. again the combination worked fine for gaming.. total overkill at 1080 but it worked fine..

i then decided i needed an IPS panel for my photo editing.. i was never entirely happy with the colour gamut on the TN gaming panel.. so just to make sure i got the best of both worlds.. i bought an asus rog swift 1440 panel.. it was f-cking expensive..

i have no problems with nvidia g-sync.. it works fine.. but all i really wanted was a fast IPS style panel.. the g-sync just happened to come with it..

i often buy things just to see.. i am often not "wowed" by the results.. but i do think its the only real way to find out.. its also part of the fun for me..

when i have done my "finding out" i stop spending money and simply use the stuff, and for a f-cking long time.. i have only just stopped using my last system, put together back in 2007.. 

my 2007 hardware was still okay.. what made it not okay was the software generations left it behind.. DX being an obvious example.. i expect the same thing to happen again in the years to come.. new software will leave my old hardware behind.. tis eventually what happens to good hardware..

one thing i am sure of.. 1440 (2K) is too much for a 970 level card.. 1080 (1K) gaming is still quite viable and from a bang for buck point of view makes the most sense of all.. the other thing is 4K gaming aint here yet even with the best and most expensive hardware..

trog


----------



## Dethroy (Feb 12, 2016)

trog100 said:


> [...]
> trog



I don't wanna sound mean or anything, but in which way does your comment contribute to the thread? At least you could've told us whether you are happy with your Asus RoG Swift monitor and whether G-Sync makes a difference 

And please refer to 2560 x 1440 pixels as QHD or 1440p. Whenever I hear 1440p called 2K I feel a certain sudden urge ...


----------



## trog100 (Feb 12, 2016)

dude do not tell me what i should and should not do.. it aint your place..

i did use the K reference in brackets and it does make sense.. 4 K being 4 times as many pixels as 1K.. and needing 4 times the power to run..

but if you want to take advantage of what i have learned.. ask me a simple question and i will do my best to answer it.. 

if not your own comments have no relevance to the thread.. 

trog


----------



## Dethroy (Feb 12, 2016)

trog100 said:


> dude do not tell me what i should and should not do.. it aint your place..


I didn't mean to sound so harsh*. My apologies! I actually liked your comment and effort to share your personal experience. I just felt it was lacking valuable info that would have added substance to the question at hand...

* English is not my first language and sometimes it's hard to get your intentions/message across in written language that is not your mother tongue.



trog100 said:


> i did use the K reference in brackets and it does make sense.. 4 K being 4 times as many pixels as 1K.. and needing 4 times the power to run..


Question: Does 2K need double the amount of power of 1K?
Hint: It does not 
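The pixel arithmetic is easy to check - a quick sketch (the resolution labels are the usual marketing names; the script itself is just mine for illustration):

```python
# Pixel counts behind the "K" shorthand, relative to 1080p.
RESOLUTIONS = {
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (UHD '4K')": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    # Ratio of total pixels vs 1080p - a rough proxy for GPU load.
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x 1080p")
```

1440p is roughly 1.78x the pixels of 1080p, not 2x, while UHD really is 4x - which is why "2K" as a power estimate misleads.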



trog100 said:


> but if you want to take advantage of what i have learned.. ask me a simple question and i will do my best to answer it..


I kinda did... Are you happy with your monitor purchase? Do you notice tangible benefits from G-Sync?


----------



## EarthDog (Feb 12, 2016)

Dethroy said:


> Whenever I hear 1440p 2K I feel a certain sudden urge ...


You and me both... it's not right. It's like calling an apple an orange....

2K = 2048x1080, for the record. 1080p is 1920x1080. I've never heard of 1K... because it doesn't seem to exist.

2K is NOT 2560x1440!!!! 1440p or QHD is the correct way to refer to it. 

Here is a reference: https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg


----------



## Dethroy (Feb 12, 2016)

EarthDog said:


> 2K = 2048x1080, for the record. 1080p is 1920x1080. I've never heard of 1K... because it doesn't seem to exist.
> 
> 2K is NOT 2560x1440!!!!
> 
> Here is a reference: https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg


Was referring to


trog100 said:


> one thing i am sure of.. 1440 (2K) is too much for a 970 level card..


Wasn't aware of that strange 17:9 resolution though. Thanks!


----------



## EarthDog (Feb 12, 2016)

Well aware of what you were referring to.


----------



## 64K (Feb 12, 2016)

EarthDog said:


> 2K = 2048x1080, for the record. 1080p is 1920x1080. I've never heard of 1K... because it doesn't seem to exist.
> 
> 2K is NOT 2560x1440!!!!
> 
> Here is a reference: https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg



 It doesn't help the confusion when big e-tailers like Newegg classify monitors in odd ways. They say 2K monitors include the following:

2560 x 1440 (2K) (397)
2048 x 1536 (2K) (2)
2048 x 768 (3)
2560 x 1024 (8)
2560 x 1080 (2K) (110)
2560 x 1600 (2K) (60)
3440 x 1440 (2K) (29)

http://www.newegg.com/LCD-LED-Monitors/SubCategory/ID-20


----------



## EarthDog (Feb 12, 2016)

64K said:


> It doesn't help the confusion when big e-tailers like Newegg classify monitors in odd ways. They say 2K monitors are the following


No doubt... SO annoying......

Sorry, for the threadjack... LOL!


----------



## Dethroy (Feb 12, 2016)

I'd rather see them use (W)QHD, UWQHD and so on or using the pixel height together with the corresponding aspect ratio, e.g. 1440p 21:9.


----------



## Frick (Feb 12, 2016)

Huh. You can get a Freesync monitor for €170.



Dethroy said:


> I'd rather see them use (W)QHD, UWQHD and so on or using the pixel height together with the corresponding aspect ratio, e.g. 1440p 21:9.



OR they can just write it as number x number, as god intended.


----------



## Dethroy (Feb 12, 2016)

Frick said:


> OR they can just write it as number x number, as god intended.


Too easy! And too much data to process for the average Joe.


----------



## trog100 (Feb 12, 2016)

"I kinda did... Are you happy with your monitor purchase? Do you notice tangible benefits from G-Sync?"

i am.. but it was expensive.. compared to my high frame rate non g-sync 144 hz monitor i cant say as i notice much difference.. 

i think that dynamic syncing allows lower frame rates to be run with everything still looking smooth.. if the frame rates are high enough i dont think there is much to be gained from it.. 

my monitor will do 165 hz but i run it at 120 hz.. my graphics cards with a game like mad max at 1440 will do 160 frames per second but i run it frame rate capped at about 90 fps..

which means (with g-sync) my monitor refresh is 90 hz.. its all still super smooth at 90 fps and i save 200 watts of power compared to running the game at 160 fps..

i am not so sure that g-sync is worth the price premium it currently carries.. but the amd version is much cheaper.. building from scratch, the whole scene currently favours the amd solution.. 

sorry i cant be more definite but not everything has a simple black and white type answer..

is it worth buying.. yes, but only if you can afford the price premium.. its not a must have thing and i think (starting from scratch) the extra money would be better spent on the graphics cards to push the frame rates higher in the first place.. 

but in the sense it enables smooth gaming at lower frame rates it has to be a good thing.. shame about the hefty nvidia price premium.. maybe that will come down in the future..

trog



----------



## xfia (Feb 12, 2016)

i would have to say that newegg and what seems to be a number of others dont factor 720p into the equation at all anymore, or count it as part of hd.. 

sorry 720p, you getn thrown overboard.. we sinkn and dont want none


----------



## VulkanBros (Feb 12, 2016)

wow - I think I got my answers ...... to sum up:

1) could it be better to buy a neutral monitor with 144 Hz (no FreeSync and no G-Sync)? - I think yes (no dependence on a manufacturer)
2) or is there a monitor that has both technologies? - there isn't and prob. never will be, so no
3) I have a GTX 970 - so is it better to buy a G-Sync monitor? - yes and no, as I don't know which will be my next GPU purchase
4) and how does a monitor with FreeSync behave on nVidia, or G-Sync with an AMD GPU? - they cannot use each other's technologies, so prob. not a good idea - tossing money out of the window

I have decided not to buy my colleague's GTX 970 - I'll wait and see what AMD and nVidia come up with next - I think I'll go with a FreeSync/G-Sync-free monitor with a decent response time and refresh rate.

Thank You all for the "enlightenment"


----------



## xfia (Feb 12, 2016)

while nv doesnt support adaptive sync now, they will in the future. couldnt really say how long.. maybe 2yrs'ish.
good idea to hold out for the new gpus if you can, and i would have to say that an adaptive sync or freesync monitor is a pretty neutral choice.. much more so than gsync, and it will have better compatibility in the future. 
they all have the ability to enable or disable adaptive sync. adaptive sync is a part of the displayport standard and now the same is coming to hdmi.


----------



## MxPhenom 216 (Feb 12, 2016)

trog100 said:


> the way g-sync seems to work for me is it enables smoother running at lower fps.. at 1440 your 970 is not going to be banging out high frame rates.. in fact it will struggle..
> 
> with a more powerful card just a 144 hz monitor without dynamic syncing does a good enough job..
> 
> ...



Freesync and G-Sync are essentially the same thing. They allow for dynamic refresh rates, so the refresh rate is tied to the number of frames per second sent over by the GPU.
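As a toy model of that "refresh follows frame rate" behaviour - the 40 Hz and 144 Hz window bounds below are illustrative assumptions, not any particular panel's spec:

```python
def vrr_refresh_hz(fps: float, min_hz: float = 40.0, max_hz: float = 144.0) -> float:
    """Toy model of a variable-refresh (G-Sync/FreeSync-style) display.

    Inside the panel's VRR window the refresh rate simply tracks the
    GPU's frame rate; outside it, the panel falls back to a boundary rate.
    """
    if fps >= max_hz:
        return max_hz   # above the window: capped at the max refresh rate
    if fps < min_hz:
        return min_hz   # below the window: the panel can't refresh any slower
    return fps          # inside the window: refresh tracks frame rate

# e.g. a 90 fps cap means the panel refreshes at 90 Hz
print(vrr_refresh_hz(90))
```

This is the whole trick: no fixed 60/120/144 Hz tick to miss, so no tearing and no Vsync-style stutter inside the window.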


----------



## Niteblooded (Feb 13, 2016)

Way too early to wave buhbye to G-Sync.   Don't let the rest of this post undermine how big of a win it was for AMD to get Intel's backing on FreeSync - after all they are rivals, so that is HUGE.   Buuut... the win is for integrated graphics.   Kaby Lake could possibly be the first Intel CPU whose integrated graphics can take advantage of FreeSync, but more likely we are looking at Cannonlake, so this gives Nvidia time to make some political power plays.   And even then, how many people running on integrated graphics are going to be aiming for gaming monitors?   Integrated graphics has come a long way, but offering good FPS, especially on new titles, is going to be more miss than hit.

This is extremely important when you compare how the FreeSync and G-Sync technologies differ.   Monitors have a minimum refresh rate, and G-Sync and FreeSync currently handle dips below it differently.   Right now, when your FPS dips below that minimum on a G-Sync panel, the refresh rate remains variable, so it keeps gameplay very smooth.   FreeSync, on the other hand, stays locked at the minimum refresh rate, which results in both tearing and judder.   So if the monitor's minimum refresh rate is 40 and your FPS dips to 20, it will stay locked at 40Hz.   If the FPS you are getting in a game is above the monitor's minimum refresh rate, the experience will be butter smooth, just like G-Sync.   I'm sure at some point AMD will fix this, probably by doubling the refresh rate and adding a repeated frame like G-Sync does.   G-Sync is also superior when it comes to overshoot, so there is less blurring.   Currently G-Sync does everything FreeSync can do, but better, and the added bonus is that the module in the monitor handles a lot of the calculations, reducing load on the GPU or CPU.
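The frame-repeating behaviour described above can be sketched roughly like this - the 40 Hz minimum and the simple multiply-until-in-range loop are illustrative assumptions, not the actual module logic:

```python
def lfc_refresh_hz(fps: float, min_hz: float = 40.0, max_hz: float = 144.0):
    """Rough sketch of low-framerate compensation (frame doubling).

    When fps drops below the panel's minimum refresh rate, repeat each
    frame enough times that the effective refresh rate lands back inside
    the VRR window, keeping motion smooth instead of tearing/juddering.
    Returns (effective_refresh_hz, times_each_frame_is_shown).
    """
    if fps >= min_hz:
        return min(fps, max_hz), 1   # already in range: show each frame once
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1              # repeat each frame one more time
    return fps * multiplier, multiplier

# 20 fps on a 40 Hz-minimum panel: show every frame twice -> 40 Hz effective
print(lfc_refresh_hz(20))
```

The key point of the post stands: with this compensation the panel never has to fall out of variable-refresh mode, which is what FreeSync at the time reportedly lacked below its minimum.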

Though if G-Sync did lose, it wouldn't be the first time superior tech has lost, so it really depends on how much pride gets in the way of Nvidia's future decisions.   Their track record in this regard is pretty bad, which supports the feeling that G-Sync will fail.   It would be a shame too, because not only is G-Sync the better alternative, Nvidia has an extremely large fan base with the pockets to pay for gaming monitors.   If they reduced (or better, removed) the cost they impose on monitor manufacturers, it would put them in a position to go toe-to-toe with AMD and Intel, because again, people who care about gaming monitors will have a dedicated graphics card.


It should be noted that if you did invest in G-Sync today, your investment isn't out the window if Nvidia suddenly threw its hands up.   The module in the monitor handles pretty much everything, so your panel will still perform as advertised.   It's not like the HD DVD and Blu-ray war, because that war involved additional costs - buying HD DVDs instead of Blu-ray movies.   You're still buying the same games, your PC is still the same, and you interact with everything the same way.   The only difference, though not to be understated, is that you are locked into buying Nvidia graphics cards as long as you wish to take advantage of G-Sync.

Right now there is no clear-cut winner.   AMD is making all the right moves, and getting Intel's backing is almost unheard of.   For the most part their technology is on par with Nvidia's, with the limitations described above.   Nvidia has the superior tech and the market share, but Nvidia is going to have to drop some of its pride and play like they're scared.   The technology on both sides is definitely worth it, so we gamers are left waiting to see who leaves the colosseum alive.   Right now the lowest-cost bidder is leading.


----------



## FordGT90Concept (Feb 13, 2016)

Think of G-sync as the modern-day Betamax.

All hardware with DisplayPort 1.3 ports will support adaptive sync and that includes NVIDIA.  We don't know yet if Pascal cards will have DisplayPort 1.3 or not.

There is no competition here.  NVIDIA developed G-sync to operate on the backbone of DisplayPort 1.2.  DisplayPort 1.2a and newer have the feature built in.  The relationship is like a person competing with their own heart.

AMD is not at the core here, VESA is.  Look at the back of your computer and count how many VESA ports there are.  Mine has 3 DVI and 3 DisplayPort.

The embedded DisplayPort (eDP) chip is a lot cheaper than the G-sync module; eDP will also get cheaper over time as adaptive sync becomes the norm for all displays.


----------



## Vayra86 (Feb 13, 2016)

VulkanBros said:


> Thats the thing - I dont want to be bound to nVidia or AMD GPU´s - I want freedom ;-)
> - if my next build would be an AMD GPU and I have purchased a G-Sync monitor - that would be like tossing money out of the window.....



Then there is the simple solution of a frame rate cap and enough hardware to keep running at a stable FPS. It has always worked very well, it is the lowest input-lag solution in every single situation, and you can freely adapt it to any hardware setup and any game.

That's how I do it, and I will keep doing it until both AMD and Nvidia get their heads out of their asses - they are selling us tech that is readily available already, since it is part of the standards now. It will inevitably come around someday, and the current solutions mean buying into an ecosystem, which is exactly why we don't game on Apple or a console. Any gamer with a decent set of brains and insight into the market dynamics steers clear of either solution. Sorry if I offended those who buy into this crap (though not really...).


----------



## Niteblooded (Feb 13, 2016)

FordGT90Concept said:


> Think of G-sync as the modern-day Betamax.
> 
> All hardware with DisplayPort 1.3 ports will support adaptive sync and that includes NVIDIA.  We don't know yet if Pascal cards will have DisplayPort 1.3 or not.
> 
> ...


Those analogies are just bad.   You don't need to buy G-Sync games, so the Betamax analogy doesn't hold.   Investing in Betamax and then switching to VHS meant a library overhaul.   Going with G-Sync now and then switching to FreeSync with a future monitor purchase is just that - a monitor switch, and people do that all the time.  The competing-with-its-own-heart bit also eludes me.   DisplayPort 1.3 won't make G-Sync's job any harder; it will just make FreeSync's job easier.   How much, who knows at this point.

Too much is in the air at this point.   What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's minimum refresh rate?  Of lesser importance, how will it catch up on overshoot?  How will Nvidia address the proprietary concerns and price premium?   Right now I will agree with you that AMD has the upper hand, but it's not as simple as you think it is.   The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long-term investment vs. short) and can prove to monitor manufacturers that it's worth their time and money to have a G-Sync lineup.


edit:
The Apple and console analogies are also terrible.   You are not buying an ecosystem.   You are not buying a system.   It is supporting tech that you can decide to take advantage of or not.   People see the word proprietary and immediately put blinders on and think it's all the same - it's not.


----------



## xfia (Feb 13, 2016)

FordGT90Concept said:


> Think of G-sync as the modern-day Betamax.
> 
> All hardware with DisplayPort 1.3 ports will support adaptive sync and that includes NVIDIA.  We don't know yet if Pascal cards will have DisplayPort 1.3 or not.
> 
> ...


guess working hand in hand with vesa on dp like no one else did means nothing at all


----------



## Vayra86 (Feb 13, 2016)

Niteblooded said:


> edit:
> Apple and console analogy is also terrible.   You are not buying an ecosystem.   You are not buying a system.   It is supporting tech that you can decide to take advantage of or not.   People see the word proprietary and immediately put blinders on and think its all the same - it's not.



Did you really think about what you've just said?

You would first go out, pay a premium for either Free- or Gsync and then opt for a GPU of the competing brand that won't support the very feature you just paid a premium for?

Sense - it makes none.


----------



## trog100 (Feb 13, 2016)

it seems to me that currently AMD is a "poor man's choice" (hardcore fan base apart) people buy AMD (cpu or gpu) because it's the cheaper option not because it's the better option..

looked at this way g-sync costing more than free-sync just follows the otherwise normal pattern..

i think most people tend to look at a monitor as an extra and not as an integral part of their system.. its the way i used to be.. carefully plan out a system and then chuck in a cheap keyboard mouse and monitor more as an after-thought more than anything else..

i have g-sync but never "bought into it" it just came as part and parcel of the high end monitor i decided to buy.. i aint gonna knock it but aint gonna enthuse over it either..

i run g-sync and a frame rate cap.. a practice that doesnt seem to be that common.. i aint quite figured out what are the best frame rates to run at but i am getting there..

trog


----------



## Niteblooded (Feb 13, 2016)

Vayra86 said:


> Did you really think about what you've just said?


You're really going to make me do all the work for you aren't you?

*Your comparison to Apple...*
- With an Apple ecosystem you are hard locked to all software you can use on that computer
- With an Apple ecosystem you are hard locked to any hardware it does and does not accept
- You cannot forego anything to remove those limitations

*Your comparison to Consoles...*
- With a console all your hardware is locked (except possible HDD upgrades)
- With a console you can only play games designed for that console
- You cannot forego anything to remove those limitations

*G-Sync*
- You need an Nvidia GPU to take advantage of the G-Sync technology
- You can forego an Nvidia GPU and simply use the monitor as a 144Hz (or whatever the refresh rate is) monitor
- It does not impact your ability to use any software you want
- It does not remove your ability to use any hardware you want



Vayra86 said:


> You would first go out, pay a premium for either Free- or Gsync and then opt for a GPU of the competing brand that won't support the very feature you just paid a premium for?



Is it the best or even the smartest move?   Of course not, but the option is still available.  But that wasn't the point.   The point is your analogies don't apply here.   They are flawed.   Either you are trying too hard to think of analogies or your understanding is lacking.   I'm guessing the former despite your rudeness, but perhaps you should take your own advice.


----------



## Vayra86 (Feb 13, 2016)

Of course it is *possible*, but it is not *sensible*... We are posting in a thread where the question posed is 'Freesync or Gsync'. I'll agree the analogy is slightly flawed, but you get the point. It makes no sense to pay a premium for features that you cannot even use, and that can be a reason to not buy into either technology and wait, or to not pay any premium for it at all. At *that* point, buying either tech is sensible. And that lack of premium will only happen when adaptive sync gets adopted as the standard by both GPU vendors.

With regards to ecosystems, this is also a function of the mind/perception, because you can easily mod an Apple computer to run Windows or build a Hackintosh; it just takes some extra effort. When you buy G-Sync, there is a very slim chance you will ever buy an AMD GPU to go with it - it doesn't fit the perception of what you bought earlier. That is effectively buying into an ecosystem as well. I can also connect my Apple stuff to a non-Apple PC, can't I? It just won't work as well, and that effectively pushes you into buying same-brand stuff.


----------



## FordGT90Concept (Feb 13, 2016)

Niteblooded said:


> You don't need to buy G-Sync games so the analogy with Betamax is not the same.  Investing in Betamax and then switching to VHS meant a library overhaul.


You have to buy a Betamax player (NVIDIA card) and Betamax cassettes (G-sync equipped monitor).  Switching to adaptive sync means new monitors and cards (at least until NVIDIA supports it).  The similarities are undeniably there.



Niteblooded said:


> Going with G-Sync now and then switching to FreeSync with a future monitor purchase is just that - a monitor switch and people do that all the time.  The competing with its own heart also alludes me.   DisplayPort 1.3 won't make G-Sync's job any harder, it will just make FreeSync's job easier.   How much who knows at this point.


Presently, G-sync does not support VESA adaptive sync, yet G-sync only exists on the back of VESA's DisplayPort standard (and AMD has already extended the VESA standard to HDMI as well).  The analogy stems from the fact that no competition can exist because VESA DisplayPort is not going to compete with VESA DisplayPort.  G-sync is a cobbled-together solution placed on top of the adaptive sync standard that has to go away, because it requires a device that is not directly compliant with the eDP standard.  The G-sync brand could appear on eDP hardware, but doing so would create a lot of confusion in terms of GPU support.  It would be best if the G-sync brand were abandoned, but I don't know if NVIDIA will do that--kind of moot until they debut a DisplayPort 1.3 card.



Niteblooded said:


> What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's min refresh rate?


Reduce settings to get reasonable framerates.  Who, seriously, thinks 0 frames is acceptable?  Additionally, there are some indications that eDP does take minimum refresh rate into consideration, but manufacturers are forced to provide a minimum refresh rate via EDID, so they give a higher number than the eDP can handle.  More info here.



Niteblooded said:


> The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long term investment vs short) and can prove to monitor manufacturers that its worth their time and money to have a G-Sync lineup.


They cannot.  G-sync is not a standard where adaptive sync is.  The former cannot exist in a market where the latter does--at least not without huge and continued investment from NVIDIA, which they'll have to take a loss on to keep it going.


There's an important distinction to be made here:  AMD FreeSync branding is moot.  FreeSync is an implementation of the adaptive sync standard.  It's kind of like how the NX bit is called "XD bit" by Intel and "Enhanced Virus Protection" by AMD.  They're different names for the same thing.  G-sync, presently, does not fit VESA's adaptive sync mold.  Sure, monitor manufacturers advertise "FreeSync" to catch the eye of potential buyers but that is a misnomer.  If you buy a FreeSync monitor and plug it into an Intel DisplayPort 1.3 port, it will work just like it would on an AMD card.  Adaptive sync is vendor neutral--just like plug and play functionality of all DisplayPort monitors.




xfia said:


> guess working hand in hand with vesa on dp like no one else did means nothing at all


AMD did the same with Mantle and Vulkan.  Consumers benefit hugely from AMD advancing all of these open standards because they can use any hardware they want (even NVIDIA).  How is that a bad thing?  The obligatory meme:





NVIDIA can't ignore DisplayPort 1.3 because it comes with a huge boost to bandwidth 4K displays need.  If they support DP 1.3 and deliberately axe/ignore adaptive sync in the name of G-sync, they deserve to burn in hell.


----------



## GhostRyder (Feb 13, 2016)

Niteblooded said:


> Too much is in the air at this point.   What will AMD's fix be for FreeSync dropping its variable refresh rate below a monitor's min refresh rate?  On lesser importance, how it will it catch up on overshoot?  How will Nvidia address the proprietary concerns and price premium?   Right now I will agree with you that AMD has the upper hand but its not as simplistic as you think it is.   The two technologies can co-exist if Nvidia removes the price premium in favor of knowing G-Sync monitors will keep its clientele (long term investment vs short) and can prove to monitor manufacturers that its worth their time and money to have a G-Sync lineup.


I am confused, because they both do that: once you exit the range you lose FreeSync/G-Sync; G-Sync just goes lower on the monitors currently available.




Niteblooded said:


> You're really going to make me do all the work for you aren't you?
> 
> *Your comparison to Apple...*
> - With an Apple ecosystem you are hard locked to all software you can use on that computer
> ...


However, the problem in purchasing a G-Sync monitor is the expense, as they cost extra (quite a bit) over a non-sync or Adaptive Sync (FreeSync) monitor.  Even though you can use the monitor as a normal monitor, it's also more difficult, as most (I believe I have seen only one so far with more than one) have only a single DisplayPort input, making them harder to use for other tasks without adapters and such (or if you want to have multiple things hooked into them).  They are strictly designed to be a one-computer gaming monitor for Nvidia cards, which is fine, but it makes them hard to recommend unless people don't mind selling off an expensive monitor if they decide to switch vendors, or keeping it without being able to use the tech they paid extra for.



FordGT90Concept said:


> NVIDIA can't ignore DisplayPort 1.3 because it comes with a huge boost to bandwidth 4K displays need.  If they support DP 1.3 and deliberately axe/ignore adaptive sync in the name of G-sync, they deserve to burn in hell.



I have a feeling they are going to block it, for no reason other than wanting to push G-Sync, even when they add DP 1.3.


----------



## FordGT90Concept (Feb 14, 2016)

GhostRyder said:


> I have a feeling they are going to block it, not because of anything more than they want to push G-Sync even when they add DP 1.3.


I hope not but that possibility exists.  I have a feeling they won't block it because display manufacturers won't play ball unless NVIDIA greases their palms.  NVIDIA goes where the profits are and I don't see profits in G-sync's future--I see losses.

There are only 9 G-sync monitors available for sale now:
http://www.geforce.com/hardware/technology/g-sync/where-to-buy-g-sync-monitors-and-modules

There are over 50 adaptive sync monitors (9 are HDMI):
http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

Note the number of manufacturers too.  Adaptive sync already has broad industry backing where G-Sync does not.


Additionally, the FreeSync page says right on it that Polaris will have DisplayPort 1.3 support.  I'm 90% sure Pascal will too.

I think it is very possible 2016 is the end for G-sync.  It will be put on legacy support.


----------



## xfia (Feb 14, 2016)

FordGT90Concept said:


> I hope not but that possibility exists.  I have a feeling they won't block it because display manufacturers won't play ball unless NVIDIA greases their palms.  NVIDIA goes where the profits are and I don't see profits in G-sync's future--I see losses.
> 
> There's only 9 G-sync monitors available for sale now:
> http://www.geforce.com/hardware/technology/g-sync/where-to-buy-g-sync-monitors-and-modules
> ...


the next nv fad will be an operating system.. the next amd venture will be getting another hsa member to make gpus and push out nv.. poor nv needs to wear a helmet.. even apple is making jokes about them


----------



## AsRock (Feb 14, 2016)

EarthDog said:


> You and me both... its not right. Its like calling an apple a orange....
> 
> 2K = 2048x1080, for the record. 1080p is 1920x1080. Ive never heard of 1K...because it doesn't seem to exist.
> 
> ...



2K is 1920x1080, just as 4K is 3840x2160, which as you can see is less than 4000, so 4K is further off its label than 2K is.

That said, 1920x1080 has every right to be so-called 2K, as 4K is way off its mark with 3840. He meant 1920x1080 being 1K when really it's not; they just started a new naming scheme is all.


----------



## FordGT90Concept (Feb 14, 2016)

*1K* (*1024* x 576) -- Probably only exists on cheap phones.
Full HD/*2K* (*1920* x 1080)
Ultra HD/*4K* (*3840* x 2160)
*5K* (*5120* x 2880)

Note they are all 16:9 and they all round to their #K.


For the record, I think calling a resolution by any name other than its resolution is stupid.


----------



## trog100 (Feb 14, 2016)

you are probably right.. maybe even more so now that extra wide screens are becoming popular.. the term 4K is accepted and commonly used.. it is easy to remember and write down..

it's become the new must-have buzz word for some..  4K is roughly 8 million pixels.. 1920 x 1080 is roughly 2 million pixels.. 2560 x 1440 is roughly, or a little less than, 4 million pixels..

the ratio of 1 2 4 makes sense from the point of view of the power needed to move them all about, or the amount of pixels on a screen.. hence my incorrect use of 1K 2K and 4K.. tis just the way my poor old brain makes sense of it..
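The pixel math above checks out, give or take: a quick sketch (resolution names and the comparison against 1080p are just for illustration) shows the ratio is roughly 1 : 1.8 : 4 rather than exactly 1 : 2 : 4, since 1440p is a bit under twice the pixels of 1080p.

```python
def megapixels(width, height):
    # Total pixel count in millions.
    return width * height / 1e6

resolutions = {
    "1080p": (1920, 1080),   # ~2.07 MP
    "1440p": (2560, 1440),   # ~3.69 MP
    "4K":    (3840, 2160),   # ~8.29 MP
}

base = megapixels(1920, 1080)
for name, (w, h) in resolutions.items():
    mp = megapixels(w, h)
    print(f"{name}: {mp:.2f} MP ({mp / base:.2f}x the pixels of 1080p)")
```

So a GPU pushing 1440p is moving nearly twice the pixels of 1080p, and 4K is exactly four times 1080p, which lines up with the "power needed to move them all about" intuition.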

i go back to the 640 x 480 days.. he he

trog


----------



## Niteblooded (Feb 14, 2016)

*@Vayra86*
Even if Nvidia concedes that FreeSync wins, they will still provide backward-compatibility support for G-Sync.   You only have two options for graphics cards anyway; it's one or the other.  It's not like you are losing out on a breadth of options.  The reason Nvidia dominates market share is that most of its user base is loyal and will keep buying Nvidia cards.  This was a thing before G-Sync and will most likely still be a thing.  So being locked into Nvidia isn't as big a negative to many (read: not all) of them.

While the price premium does suck (not arguing that), at least you get something for that premium - superior tech.   Many people are quite OK with paying a premium for better technology.  I am.   I bought SSDs when they were new and expensive.  If I get something for my money, it's not as evil as many make it out to be.  To be clear, since people are analogy-happy, I am *not* comparing G-Sync to SSDs - only saying I personally don't mind paying a price premium for better tech.


*@FordGT90Concept*
The Betamax player analogy is really pointless.  You should not need me to tell you that Betamax cassettes != a friggin monitor.   The similarities exist only in your head.   Stop with the analogies.  We get it - you hate G-Sync and Nvidia.  Instead of trying to think of analogies, focus on the differences in the technology.

Now if you go back and read my posts, you will see that I agree with you that AMD has Nvidia by the short ones and that Nvidia has a track record of bad habits.  What I don't agree with you on is that it's as simple as you make it out to be.   There is a very clear and distinct difference there.

First, you make it sound as if some recent information suddenly sounded the death knell for G-Sync - using DisplayPort 1.3 as your argument.  Truth is, what you are referring to is old news - well, for the computer industry.  VESA created Adaptive Sync in 2009, but it was not implemented.  Nvidia took the opportunity to develop and release G-Sync.  In response, AMD announced FreeSync, which is VESA's Adaptive Sync.  So all of a sudden, a technology that wasn't being pushed at all was used to combat Nvidia's release of G-Sync.  Adaptive Sync was actually supported by DisplayPort 1.2a - so no, 1.3 is not when Adaptive Sync was first supported.  It should be noted that 1.2a was released in 2014, as was the spec for 1.3.  Even with that, you still need a FreeSync-enabled monitor, as it needs the chip.  FreeSync monitors win on price.   G-Sync monitors win on tech.  Since Nvidia is not supporting Adaptive Sync, we are back at square one: you need an AMD card for FreeSync and an Nvidia card for G-Sync.  So we can discuss DisplayPort all we want, but in the end very little has changed in that regard.  In spite of the DisplayPort changes, guess what - monitor manufacturers are still releasing G-Sync monitors.

A noteworthy change is Intel's integrated graphics will now support AdaptiveSync, but how many people running integrated graphics will be buying a gaming monitor?

You have AMD's triumph with DisplayPort 1.3 a little off.   The big win with DisplayPort 1.3 is that it enables 5120x2880 displays at a 60Hz refresh rate; 1080p monitors will go up to a 240Hz refresh rate, 170Hz for HDR 1440p screens, and 144Hz for 3840x2160.   The upper-end displays will most likely be announced toward the end of the year - in case anyone is curious.
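A back-of-envelope bandwidth check makes those refresh-rate claims plausible. This is a rough sketch, not a spec calculation: it counts only the uncompressed 24-bit pixel payload against DP 1.3's roughly 25.92 Gbit/s effective rate (32.4 Gbit/s raw HBR3 minus 8b/10b overhead) and ignores blanking intervals, so modes that land near or over the line, like 4K at 144Hz, would in practice need reduced blanking or chroma subsampling.

```python
EFFECTIVE_GBPS = 25.92  # DP 1.3 HBR3 payload after 8b/10b line-code overhead

def payload_gbps(width, height, hz, bpp=24):
    # Uncompressed pixel payload only; real links also carry blanking,
    # so true limits are a bit tighter than this estimate.
    return width * height * hz * bpp / 1e9

modes = [(5120, 2880, 60), (1920, 1080, 240), (2560, 1440, 170), (3840, 2160, 144)]
for w, h, hz in modes:
    gbps = payload_gbps(w, h, hz)
    verdict = "fits" if gbps <= EFFECTIVE_GBPS else "needs reduced blanking/subsampling"
    print(f"{w}x{h}@{hz}Hz: {gbps:.1f} Gbit/s - {verdict}")
```

5K at 60Hz comes out around 21.2 Gbit/s, comfortably under the limit, while 4K at 144Hz is around 28.7 Gbit/s, which is why that last mode is the squeeze.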

The above is great, but honestly what I feel is the biggest win for AMD is getting FreeSync to work over HDMI, since not all monitors support DisplayPort.   This will definitely increase the number of monitors that are FreeSync certified.  Granted, this is a low-cost solution, since HDMI doesn't have the bandwidth DisplayPort does, so serious gamers will still go for a DisplayPort monitor.   But it does open up options in the lower price tier of monitors.   So now low-cost game systems can enjoy variable refresh rates on low-cost monitors.   AMD has always kind of dominated on low-cost GPUs, but now there is even more reason for people looking for a low-cost GPU to go with AMD instead of Nvidia.


As far as your clarification of FreeSync and Adaptive Sync goes, you should realize that the two, while related, are not interchangeable.  A FreeSync-certified monitor is always based on Adaptive Sync, but not every Adaptive Sync monitor is FreeSync certified.  Your same paragraph makes it sound like FreeSync is completely plug-and-play with DisplayPort 1.3.  Again, you need an AMD graphics card for FreeSync to work, just like you need an Nvidia graphics card for G-Sync.  See how we came full circle again?   Once Intel releases its first line of CPUs that are Adaptive Sync enabled, your statement becomes true, but DisplayPort 1.3 does not magically enable FreeSync on its own.

Also, you should not assume every monitor's minimum refresh rate is 30Hz, because some are actually 40Hz and above.  It's not hard to dip below 40 FPS on a recent title.   If you are going to buy a FreeSync monitor, this is one of the most important things to look up.


That list of G-Sync monitors you posted is out of date.   AMD still has the advantage, but there are more than 9 G-Sync monitors:
- List of FreeSync monitors
- List of G-Sync monitors

Again, the important distinction between your POV and mine is that I don't think it's as clear-cut as you do.  As I said, AMD has the upper hand, and Nvidia has a bad track record with the tech it likes to push.  In fact, I think I provided more examples of how AMD has the upper hand.   But the analogies in this thread are severely flawed, your information is a bit off, and you paint a picture where anyone who invests in a G-Sync monitor is screwed - not true.


----------



## FordGT90Concept (Feb 14, 2016)

Niteblooded said:


> VESA created Adaptive Sync in 2009 but it was not implemented.


Only in eDP (think laptop monitors)...


			
VESA said:

> http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
> Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. *Newly introduced to the DisplayPort 1.2a specification for external displays*, this technology is now formally known as DisplayPort Adaptive-Sync.



DisplayPort 1.2a was designed specifically for AMD to push FreeSync out the door.  DisplayPort 1.3 is to put everyone (AMD, Intel, NVIDIA, and so on) on the same page with support for external adaptive sync.  There are no DisplayPort 1.3 devices out yet--they're coming this year en masse.



Niteblooded said:


> A noteworthy change is Intel's integrated graphics will now support AdaptiveSync, but how many people running integrated graphics will be buying a gaming monitor?


Businesses, everywhere.  When nothing changes on the screen (think of your typical workstation screen), the GPU can literally shut off, because the display already has everything it needs thanks to the onboard memory in the eDP.  This tech goes far beyond high refresh rates.


The only reason AMD bothered with FreeSync over HDMI is consoles...likely beginning with the Nintendo NX.  It's not really aimed at the computer market - other than the Nano, which is aimed at HTPCs.  Remember, the Nano has 3 DisplayPorts, each capable of Multi-Stream Transport, where its sole HDMI port can only power one display.  AMD is very committed to DisplayPort, and the only reason they put up with HDMI at all is that home theaters are sticking with it.




Niteblooded said:


> As far as your clarification of FreeSync and AdaptiveSync, you should realize that the two while the same are not mutually inclusive.  A FreeSync certified monitor is always based on AdaptiveSync but not every AdaptiveSync monitor is FreeSync certified.  Your same paragraph makes it sound like FreeSync is completely plug-n-play with DisplayPort 1.3.  Again, you need an AMD graphics card for FreeSync to work just like you need an Nvidia graphics card for G-Sync.  See how we came full circle again?   Now when Intel releases its first line of CPUs that are AdaptiveSync enabled than your statement becomes true but DisplayPort 1.3 does not magically enable FreeSync on its own.


Negative.  Adaptive sync is agnostic.  If you have an adaptive sync graphics processor and an adaptive sync monitor, adaptive sync will be enabled by default.  The purpose of the technology is to act without user input.  Again, the goal is to reduce bandwidth requirements as well as idle power consumption.  The branding matters not.

Of course this isn't true of G-Sync, in its current state, because it is non-standard.




Niteblooded said:


> Also you should not assume every monitor's min refresh rate is 30Hz because some are actually 40Hz and above. It's not hard to dip below 40 FPS on a recent title. If you are going to buy a FreeSync monitor this is one of the most important things you should look up.


That's describing the panel, which indirectly describes the minimum rate the eDP will refresh at.  The frame rate from the GPU can be lower--eDP will fill in the gaps to keep the refresh rate at or above the minimum.


----------



## Niteblooded (Feb 14, 2016)

Ya I addressed that...



Niteblooded said:


> Nvidia took the opportunity to develop and release G-Sync.  In response to that AMD announced FreeSync, which was VESA's Adaptive Sync.  So all of a sudden a technology that wasn't pushed at all was used to combat Nvidia's release of G-Sync.  Adaptive Sync was actually supported by DisplayPort 1.2a - so no... 1.3 is not when Adaptive Sync was first supported.


----------



## FordGT90Concept (Feb 14, 2016)

I never said 1.3 is when it was first supported.  It will be the first supported by NVIDIA, Intel, and the rest of the industry.



FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:




http://videocardz.com/57776/amd-launches-radeon-software-crimson-driver

G-sync lost its technical edge with a driver update.  eDP/adaptive sync is just that awesome. 


Edit: Interesting caveat there: "greater than or equal to 2.5 times the minimum refresh rate."
30 Hz -> 75 Hz
35 Hz -> 87.5
40 Hz -> 100 Hz
42 Hz -> 105 Hz
47 Hz -> 117.5 Hz
48 Hz -> 120 Hz
56 Hz -> 140 Hz

That's *definitely* something buyers should be aware of.

Edit: Looks like LFC should work on all 144 Hz displays.
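The 2.5x caveat in the table above reduces to a one-line rule. A minimal sketch (the function name is my own; the threshold is taken from the Crimson notes quoted above) for checking whether a given FreeSync range qualifies for Low Framerate Compensation:

```python
def lfc_supported(vrr_min, vrr_max):
    """AMD's Low Framerate Compensation reportedly requires the panel's
    max refresh to be at least 2.5x its minimum refresh rate."""
    return vrr_max >= 2.5 * vrr_min
```

So a 48-144Hz panel qualifies (144 >= 120), while a common 40-75Hz FreeSync panel does not (75 < 100), which is exactly why the minimum refresh rate is worth checking before buying.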


----------



## the54thvoid (Feb 14, 2016)

Without sounding trollish - I'd only buy a FreeSync or G-Sync monitor if I was a brand loyalist OR had no problem buying a new monitor when I changed graphics cards.

I've owned Nvidia since I left my 7970s behind, but I still wouldn't splash out on G-Sync.  And likewise, I'm not buying an AMD card just to buy a FreeSync monitor.

Unless Nvidia support adaptive sync (in which case G-Sync dies), it's a lottery between cards and monitors.  Without knowing what the next gen cards are doing, the F/G sync options are like technology prisons you pay to lock yourself into.  No thanks.  I'll stick with monitor agnostic powerful cards instead (be it Fury X or 980ti).


----------



## GreiverBlade (Feb 14, 2016)

Dethroy said:


> *Edit:* I myself am waiting for better monitor offerings (not happy with the current options) and Pascal + Polaris..


same here ... it seems my 980 and a 60Hz 1080p monitor are enough, and G-Sync/FreeSync aren't really worth anything ... (especially G-Sync ... seriously, proprietary + added cost? no thanks nVidia ) with all my apologies to those who think G-Sync is "tha bomb"

(i mean, what's the point of that tech when my framerate is already stable enough and there's no stutter no matter what game i play? )

if i decide one day to go 1440p 144Hz (the day the manufacturers are "less nuts" on pricing ...) maybe i will consider a G-Sync or FreeSync monitor depending on my GPU (or the 3rd tech that will be "open", replace both, and add no cost to an already expensive enough monitor)
well ... also any "ROG SWIFT" or "gaming" 144Hz 1440p monitor costs a little more than my GPU ... while my Philips 27E3LH cost 1/3 of it, i know i am surely missing something ...



the54thvoid said:


> Without sounding trollish - I'd only buy a Freesync or G-Sync monitor if I was a brand loyalist OR had no problem buying a new monitor when i changed gfx cards.).


totally true 



the54thvoid said:


> Unless Nvidia support adaptive sync (in which case G-Sync dies), it's a lottery between cards and monitors.  Without knowing what the next gen cards are doing, the F/G sync options are like technology prisons you pay to lock yourself into.  No thanks.  I'll stick with monitor agnostic powerful cards instead (be it Fury X or 980ti).


and true (and G-Sync need to die anyway, but nVidia want to make the most money out of the brand's loyalists )


----------



## xfia (Feb 14, 2016)

FordGT90Concept said:


> For the record, I think calling a resolution by any name other than it's resolution is stupid.


----------



## trog100 (Feb 14, 2016)

liking a certain brand doesn't always turn a person into a flag waving mindless zealot.. i like intel and i like nvidia.. i do so for what i think are good reasons.. 

in days long past i used to like amd and ati.. i did have good reasons back then.. reasons which sadly dont exist any more.. 

having spent a fair bit of dosh on a good gaming IPS panel which happens to have g-sync.. i will be a bit pissed off if support for it does die in the near future.. 

i could live without it.. but i do find it useful..

trog


----------



## xfia (Feb 14, 2016)

the only chance of gsync surviving is if they support both and position it as the highest premium gaming experience, with every monitor coming perfectly calibrated.. 1ms.. rimless.. curved.. and then there is the realm of 5k.


----------



## trog100 (Feb 14, 2016)

xfia said:


> the only chance of gsync surviving is if they support both and only make them as highest premium gaming experience on every monitor coming perfectly calibrated.. 1ms.. rimless.. curved and there is the realm of 5k.



it will definitely die off then.. he he

i reckon i am gonna turn mine off just to see what perceptible difference it actually makes.. i wont be entirely surprised if i dont see any difference.. but unlike most i can at least find out for real.. he he

trog

ps.. running mad max at 90 fps at 1440 resolution with the monitor refresh rate set at 120 hz and g-sync off i cant see any noticeable difference.. i will leave it off for longer and look f-cking harder.. he he he


----------



## VulkanBros (Feb 14, 2016)

Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who doesn't switch monitors every second year?? I maybe switch GPU every second or third year,
but my monitor ..... well the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.

All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.


----------



## AsRock (Feb 14, 2016)

Not at all, I keep mine as long as possible.  My HDTV is about 4 years old now and I cannot see it being replaced any time soon unless it breaks down.  I'm in no rush - G-Sync and FreeSync are not even matured yet, and on top of that Home Theater has not caught up yet.

I just wish AMD drivers would recognize that it's a 10bit panel lol.

So I am just waiting to see what happens.


----------



## Beastie (Feb 14, 2016)

I'd hope to get 5+ yrs out of a monitor


----------



## FordGT90Concept (Feb 15, 2016)

VulkanBros said:


> Hmm......this thread is getting more and more interesting. I just wonder - am I the only who not switch monitor every  second year?? I maybe switch GPU every second or third year,
> but my monitor ..... well the one I primarily use now is 5+ years - and it works great for the games I play (the newest is FallOut 4) - but then again I only run at 1920 x 1080.
> 
> All I get out of this thread is still: buy a monitor that have a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.


I haven't changed monitors in probably 7+ years.  I change GPUs every 5 or fewer years.  I'm going to keep using what I've got until it dies.  At which point, I'm hoping for an affordable HDR, IPS, adaptive sync panel.  I'm hoping mine doesn't die soon because affordable HDR is only just happening.  HDR and IPS are definitely more important to me than adaptive sync.


----------



## Peter1986C (Feb 15, 2016)

But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment IPS is not going to be (much) better than TN while still having the response time trade off. Although I suspect you buy only $300+ monitors.


----------



## trog100 (Feb 15, 2016)

i just swapped from a new-ish 144 hrz 1 ms TN gaming panel to an IPS 4 ms gaming panel.. i did it for photo editing.. colour reproduction and viewing angles are far better on the IPS style panel and the blacks are deeper.. 

but it does come down to what folks are used to.. i expect to keep the monitor i now have for quite some time.. i have not the slightest desire to move to 4K even though i do have the gpu power to do it.. 

trog


----------



## FordGT90Concept (Feb 15, 2016)

Peter1986C said:


> But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment IPS is not going to be (much) better than TN while still having the response time trade off. Although I suspect you buy only $300+ monitors.


My TN right now is 5ms.  Most IPS panels can match that.  IPS always has better viewing angles than TN.  Most TVs and smartphones have an IPS display.


----------



## Xzibit (Feb 15, 2016)

Peter1986C said:


> But response times on IPS and PLS are slow. *And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours*. So unless you stay out of the cheaper segment IPS is not going to be (much) better than TN while still having the response time trade off. Although I suspect you buy only $300+ monitors.



TNs are 6-bit where IPS are 8-bit.  Most TNs found on the higher-end gaming monitors are still 6-bit+FRC, with colour coverage between 72-75%.  You can easily find an IPS panel for $150 with 75% colour coverage.  Even an eco or value IPS panel at 65% will look better than a TN panel, and those monitors you can get for $120 and under.


----------



## Niteblooded (Feb 15, 2016)

FordGT90Concept said:


> Negative.  Adaptive sync is agnostic.  If you have an adaptive sync graphics processor and an adaptive sync monitor, adapative sync will be enabled by default.  The purpose of the technology is to act without user input.  Again, the goal is to reduce bandwidth requirements as well as reduce idle power consumption.  The branding matters not.
> 
> That's describing the panel which inadvertantly describes the minimum refreshrate the eDP will refresh at.  The frame rate can be lower from the GPU--eDP will fill in the gaps to keep it at or above minimum.



Looks like you realized your mistake on the last paragraph but we'll address that soon. 

AdaptiveSync is agnostic but again... FreeSync is not.   And the point of this topic is FreeSync vs G-Sync.

*From AMD:* _FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits_

*More from AMD...*
*Q: What are the requirements to use FreeSync?*
*A:* To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, *a compatible AMD Radeon™ GPU* with a DisplayPort™ connection, *and a compatible AMD Catalyst™ graphics driver*. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.​
Yup, agnostic. 



FordGT90Concept said:


> I never said 1.3 is when it was first supported.  It will be the first supported by NVIDIA, Intel, and the rest of the industry.


Actually you did say it.   You also said DisplayPort 1.3 will allow it to work natively without any other hardware.   Want me to quote that too?   You made a big stink about how DisplayPort 1.3 will be the end of G-Sync, but nothing in DisplayPort 1.3's spec supports that claim of yours.


FordGT90Concept said:


> All hardware with DisplayPort 1.3 ports *will support* adaptive sync and that includes NVIDIA.







FordGT90Concept said:


> FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:
> 
> G-sync lost its technical edge with a driver update.  eDP/adaptive sync is just that awesome.
> 
> ...



Did you just find out about the driver update?     There are downsides to it being a driver update, such as needing driver updates for new monitors, some monitors being left out (and some are, btw), and it definitely not being the most elegant solution, which leaves room for bugs (some users experience overshoot, flickering and ghosting).   Then you have the caveat that you mentioned.   The upper-end resolutions are where this will have the most effect.

*Verdict:* A good start, but still room for improvement

So no, Nvidia didn't lose the technical edge ...yet.

Also here is the video in which they recommended those changes to AMD.










This debate is tired.   I said what I wanted to say.
You know the jargon but your information is off.
Do your pre-emptive victory cheer.   Get the last word in.   I'm done.   We are not getting anywhere.   You never see anyone else's point of view but your own.


----------



## FordGT90Concept (Feb 15, 2016)

Niteblooded said:


> Looks like you realized your mistake on the last paragraph but we'll address that soon.
> 
> AdaptiveSync is agnostic but again... FreeSync is not.   And the point of this topic is FreeSync vs G-Sync.
> 
> ...


Intel will undeniably call it something else.  My point is that the name does not matter.  If you buy a monitor that is branded as "FreeSync compatible," you'll be able to plug it into an AMD graphics card and use "FreeSync," or plug it into an Intel IGP and it will use "[insert name here]."  The branding doesn't matter.  If your GPU supports external adaptive sync (be it DisplayPort 1.2a or newer, or whatever HDMI backbone AMD is using for that) and your monitor supports adaptive sync, it will work.  The name used to market it doesn't matter.  You cited AMD, so AMD naturally uses FreeSync branding.  It is still agnostic--part of the DisplayPort standard.



Niteblooded said:


> Actually you did say it.   You also said with DisplayPort 1.3 will allow it to work natively without any other hardware.   Want me to quote that too?   You made a big stink about how DisplayPort 1.3 will be the end of G-Sync but nothing in DisplayPort 1.3's spec supports that claim of yours.


facepalm.jpg

Instead of arguing, I'm just going to be very blunt.  External adaptive sync was/is started at...
AMD: DisplayPort 1.2a
Everyone else: DisplayPort 1.3

DisplayPort 1.3 FAQs


			
VESA said:

> *Q: Does the release of DisplayPort 1.3 mean that DisplayPort 1.2 products are obsolete?*
> 
> A: Not at all. VESA develops and publishes standards like DisplayPort prior to their actual deployment in the field. DisplayPort 1.2a represents the latest interconnect technology now available to consumers from manufacturers. The new DisplayPort capabilities included in DisplayPort 1.3 have begun the cycle of hardware development that will result in such technology becoming available to consumers in a range of products over the next few years. And like other new versions of DisplayPort, *DisplayPort 1.3 is backward compatible with earlier DisplayPort standards*.


The reason why 1.2a exists is because AMD wanted support in 2014 and didn't want to wait until 2016 for 1.3 because G-sync was already out the door.  AMD...and VESA...couldn't wait.



Niteblooded said:


> Did you just find out about the driver update?     There are downsides to it being a driver update such as needing driver updates for new monitors, some monitors may be left out (and some are btw) and it is definitely not the most elegant solution which leaves room for bugs (some experience overshoot, flickering and ghosting).   Then you have the caveat that you mentioned.   The upper end resolutions is where this will have the most effect.


The "driver update" is AMD only.  It requires no changes in the monitor.  Note the last bullet on the slide: "No user configuration or proprietary monitor hardware required."  If you're wondering why that is, it is because of the eDP chip inside the monitor.  The LFC bug was purely in software because AMD didn't code in a solution to the "FPS < Min Refresh" problem.  That's been fixed.



Niteblooded said:


> *Verdict:* Good first start but still room for improvement





> Despite those couple of complaints, my first impression of LFC with the Crimson driver is one of satisfied acceptance. It's a great start, there is room to improve, but AMD is listening and fixing technologies to better compete with NVIDIA as we head into 2016.


AMD will fix the bugs in time and do so through software updates.  There's nothing wrong with the hardware in the card nor the monitor.  Their algorithms are still just a little rough around the edges.  Such is the nature of working with a complex industry standard.

Intel and NVIDIA are likely to go through the same iterations to apply and improve external adaptive sync support.
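The LFC trick itself is simple enough to sketch. This is a toy illustration of the idea only (mine, not AMD's driver code): when the frame rate falls below the panel's minimum variable refresh rate, repeat each frame an integer number of times so the effective refresh rate lands back inside the window.

```python
# Toy sketch of Low Framerate Compensation (NOT AMD's actual driver code).
# When the game's frame rate drops below the panel's minimum variable
# refresh rate, the driver repeats each frame an integer number of times
# so the effective refresh rate lands back inside the VRR window.

def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (repeats per frame, effective refresh rate in Hz)."""
    if fps >= vrr_min:
        return 1, fps  # already inside the window: one scanout per frame
    multiplier = 1
    while fps * multiplier < vrr_min and fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
    return multiplier, fps * multiplier

# e.g. 25 fps on a 40-144 Hz panel: each frame is scanned out twice and
# the panel runs at an effective 50 Hz, back inside the VRR range.
```

This also shows why LFC wants the panel's maximum refresh to be at least roughly double its minimum - with a 40 Hz floor and a 144 Hz ceiling there is always an integer multiple available, which is why 144 Hz panels are a comfortable fit.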


----------



## medi01 (Feb 15, 2016)

FreeSync monitors come "for free" (and that's a problem for AMD, in some ways) since most scaler chips out there support the tech out of the box.
AMD's way is FREE FOR EVERYONE to use, being an industry standard. (please spare me the "adaptive sync is not FreeSync" argument; in this context it's the same thing).

Tests show FreeSync to be slightly superior to G-Sync (the former increases FPS a bit, the latter decreases it).

nVidia's stuff comes at a price (more chips to integrate into the monitor), AND IT LIMITS THE MONITOR TO HAVING ONLY ONE PORT: a G-Sync enabled DISPLAYPORT. (it will still work as a normal DisplayPort though).
FreeSync has no such restriction. (AMD even promised FreeSync over HDMI!)

Since nVidia officially announced they are NOT licensing the G stuff to anyone (and frankly there being no good reason to endorse G over F anyhow), Intel or AMD jumping on the G-Sync wagon is extremely unlikely. (nVidia exclusiveness is its only point)

All that is needed for nVidia to jump on the FreeSync wagon is for AMD to regain GPU market share (which is shamelessly low given how many great products they have on the market at the moment, but that's a different discussion).


----------



## Dethroy (Feb 15, 2016)

Let's all keep our mouths shut until the arrival of Pascal and Polaris


----------



## 64K (Feb 15, 2016)

Dethroy said:


> Let's all keep our mouths shut until the arrival of Pascal and Polaris



What? Without speculation there wouldn't be much posting at all in tech forums. I'm speculating that of course.


----------



## Beastie (Feb 15, 2016)

medi01 said:


> Tests show FreeSync to be slightly superior to G-Sync (former increases FPS a bit, latter decreases).


 Which tests? Everything I've read gives gsync a slight advantage at the moment.


----------



## FordGT90Concept (Feb 15, 2016)

90% sure Polaris will support DisplayPort 1.3 and 60% on Pascal.  The future of GPUs is higher resolutions and refresh rates.  The best path to those is DP 1.3.

I think the reason why Fiji doesn't have HDMI 2.0 is because AMD wanted HDMI 2.0a for external adaptive sync.  AMD had no intent of ever using HDMI 2.0.  That inevitably means HDMI 2.1 is likely to make external adaptive sync a standard too but it will come some time after Polaris (or announced at about the same time).


----------



## Dethroy (Feb 15, 2016)

64K said:


> What? Without speculation there wouldn't be much posting at all in tech forums. I'm speculating that of course.


Speculation, huh? I thought my _rather provocative*_ and puny attempt at being funny would lead to even more posts. Trolls attract attention, right? 
_*self-irony_


FordGT90Concept said:


> 90% sure Polaris will support DisplayPort 1.3 and 60% on Pascal.


Found this:


> _This release time frame will coincide with AMD’s and NVIDIA’s launch of next generation of GPUs which will be equipped with DisplayPort 1.3._


Source: http://120hzmonitors.com/samsung-144hz-3440x1440-ultrawide-monitors/


----------



## medi01 (Feb 15, 2016)

Haha, can one be DP 1.3 compliant without FreeSync (Adaptive Sync) thing? 



Beastie said:


> Everything I've read gives gsync a slight advantage at the moment


Everything you read... where? And what "slight advantage"?
Tests show GS reducing FPS a bit (about -2%), while FS gives an (even smaller) bump. 
http://www.anandtech.com/show/9097/the-amd-freesync-review/4

But, hell yeah, then there is FUDzilla, fueled quite a bit by payrolled nTrolls, about OMG nVidiaaaa's infinite greatness.
The effectiveness of troll tactics is depressing. The 380 is about 10% faster than the 960 and consumes about 20% more power, yet is regarded as "you can fry... on it". (applies to most AMD vs nVidia products).


----------



## GhostRyder (Feb 15, 2016)

VulkanBros said:


> Hmm......this thread is getting more and more interesting. I just wonder - am I the only who not switch monitor every  second year?? I maybe switch GPU every second or third year,
> but my monitor ..... well the one I primarily use now is 5+ years - and it works great for the games I play (the newest is FallOut 4) - but then again I only run at 1920 x 1080.
> All I get out of this thread is still: buy a monitor that have a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.


 Monitors are something I don't upgrade very often, with the only exception being that I switched from a 2160p monitor to a 1440p 144hz monitor because it was better to me for gaming and I wanted to try FreeSync on it.  I normally change my cards every other generation, with some exceptions in the past, but I try to stick to this method now.  Also, if a monitor says "FreeSync" it means it is adaptive sync; they are only using that name because it will be more recognized by gamers, but it's the exact same thing (FreeSync is just AMD's design on their cards to take advantage of it).

The problem is whether you want to pay for it or not at the end of the day.  Adaptive Sync (FreeSync) monitors are generally more like standard monitors - they are not proprietary and contain extra hookups - whereas a G-Sync monitor is going to have one DP input and be locked exclusively to Nvidia hardware.  It's a hard sell right now either way because DP 1.3 monitors could come out this year and be supported by all (heck, maybe we could get lucky and 1.2a monitors will work on them with Adaptive Sync), and then you will have a much more obvious choice.  Otherwise (or if Nvidia locks it out) we're still going to be stuck with the same debate.  It's not really a matter of which is better, as within their ranges they perform equally well (though again, G-Sync goes down to 30 Hz and the others are generally down to 40, with one I believe at 35 currently), so it's a matter of whether you want to pay for it, what GPU you have, and how long you want to keep both.


----------



## Beastie (Feb 15, 2016)

medi01 said:


> Everything you read... where? And what "slight advantage"?
> Tests show GS reducing FPS a bit (about -2%), while FS gives (even smaller) bump.
> http://www.anandtech.com/show/9097/the-amd-freesync-review/4
> 
> ...


from the summary of the link you posted-

"Except for a glitch with testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even showed a measurable 2.5% performance increase with G-SYNC and Tomb Raider. But again let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync meanwhile shows results that are well within the margin of error."

So they found a glitch with one game, Alien Isolation.


 Here's some gsync vs freesync articles

http://www.tomshardware.co.uk/amd-freesync-versus-nvidia-g-sync-reader-event,review-33278.html

http://www.pcworld.com/article/2974...rate-displays-make-pc-games-super-smooth.html

http://www.techradar.com/news/compu...cards/nvidia-g-sync-vs-amd-freesync-1289637/2

http://www.digitaltrends.com/computing/nvidia-g-sync-or-amd-freesync-pick-a-side-and-stick-with-it/


I'm not saying gsync is a better option-there are benefits to both. Freesync being cheaper is a big plus in my view.
But mostly it depends on what GPU is used.


----------



## FordGT90Concept (Feb 15, 2016)

medi01 said:


> Haha, can one be DP 1.3 compliant without FreeSync (Adaptive Sync) thing?


No in regard to the hardware (by installing a DisplayPort 1.3 connector, the hardware is there to do it) and yes in regard to the software (the drivers determine under what conditions to send a signal to the monitor).  Most of the magic occurs in the eDP chip in the display, but the drivers still have to instruct it what to do by sending it a signal.

There is a real danger that NVIDIA cards could look for a G-sync module and if it doesn't find it, disable adaptive sync.  I'm thinking this might actually be likely in Pascal for two reasons:
1) NVIDIA wants to milk the G-sync cash cow a while longer.
2) NVIDIA needs time to not only implement adaptive sync in their drivers but also fine tune it so it matches or exceeds AMD's implementation.

Why would NVIDIA deliberately put itself behind AMD when they have a competitive product that is smoother despite being more expensive? I can only think of one reason why they wouldn't: cost.  Obviously I don't have insider knowledge to know if that is a factor or not.


What isn't known is how Intel plays into this.  They don't have a horse in the race yet.  Are they going to push their implementation out the door and tweak it like AMD, or are they going to bide their time perfecting it before they push it out the door?  I'm 50/50 on that one.  I don't know.


----------



## medi01 (Feb 15, 2016)

Beastie said:


> So they found a glitch with one game, Alien Isolation.


Check other bits, it's just easier to see there:
http://www.anandtech.com/Gallery/Album/4315#6



Beastie said:


> Here's some gsync vs freesync articles


Sorry, I missed which one of them is saying GS is faster than FS.



Beastie said:


> I'm not saying gsync is a better option-there are benefits to both. Freesync being cheaper is a big plus in my view.
> But mostly it depends on what GPU is used.



G-Sync has no visible technological advantage ('oh, the monitor's manufacturer needs to do bla' isn't technical), yet it costs about $100 to support and restricts the number of ports to one.
It's an attempt to leverage a dominant market position to bar competitors, akin to PhysX.
There is no (customer) benefit in it. nVidia could benefit from it, if AMD didn't strike back with FS. It still can if AMD goes bust.



FordGT90Concept said:


> There is a real danger that NVIDIA cards could look for a G-sync module and if it doesn't find it, disable adaptive sync.


So, "Dear Asus, put this chip of mine into your monitor, or else I disable adaptive sync"?
But then, which tech will be used for adaptive sync? And if it is GS (in which case chip is really really needed) what about the problem that caused all Monitors with GS support to have only single port, did they somehow overcome it?


----------



## FordGT90Concept (Feb 15, 2016)

medi01 said:


> So, "Dear Asus, put this chip of mine into your monitor, or else I disable adaptive sync"?
> But then, which tech will be used for adaptive sync? And if it is GS (in which case chip is really really needed) what about the problem that caused all Monitors with GS support to have only single port, did they somehow overcome it?


All displays with the VESA-compliant eDP chip.  It's really eDP versus the G-sync module.

I doubt NVIDIA has any interest in adding more than one port to G-sync displays.  I mean, the module makes it expensive - why would you buy the display and not use it? 

I guess there is a third option I didn't mention: NVIDIA could sell their own eDP chip that is G-sync compliant as well.  The G-sync module could check for non-NVIDIA cards and block adaptive sync functionality if found.

I don't know what scares me more: the fact that they could do this shady stuff, or that I feel it is likely they will.  If they do, I hope VESA and AMD pile on lawsuits.  AMD would be in a better position because it would be anti-competitive behavior.


----------



## trog100 (Feb 15, 2016)

i own a g-sync monitor so far i cant notice much of a difference with it (g-sync not the monitor) on or off.. mind you i dont play games at 30 or 40 fps.. 

to be honest i read that much copy and paste or entirely speculative bollocks on this place i am beginning not to trust my own judgement.. 

trog


----------



## Peter1986C (Feb 16, 2016)

trog100 said:


> i just swapped from a new-ish 144 hrz 1 ms TN gaming panel to an IPS 4 ms gaming panel.. i did it for photo editing.. colour reproduction and viewing angles are far better on the IPS style panel and the blacks are deeper..
> 
> but it does come down to what folks are used to.. i expect to keep the monitor i now have for quite some time.. i have not the slightest desire to move to 4K even though i do have the gpu power to do it..
> 
> trog





FordGT90Concept said:


> My TN right now is 5ms.  Most IPS panels can match that.  IPS always has better viewing angles than TN.  Most TVs and smartphones have an IPS display.





Xzibit said:


> TNs are 6-bit where IPS are 8-bit.  Most TNs found on the higher gaming monitors are still 6-bit+FRC color is between 72-75%.  You can easily find a IPS panel for $150 with color range 75%. Even a eco or value IPS panel which has a 65% will look better than a TN panel and those monitors you can get for $120 and under.



Note: I was talking about actual "independent" test results and not manufacturer specs. Specs can be outright lies, especially with monitors and fans. I have been reading reviews on hardware.info recently, and they warned quite clearly about that (they are of the kind who grab colorimeters and oscilloscopes to test the screens they review).
And Windows reserves 2 bits of eight for stuff other than actual colour info (all monitors are 16-17 million colours max).


----------



## FordGT90Concept (Feb 16, 2016)

Uh....no... There are only a handful of color choices in Windows: 8-bit (256 color palette), 16-bit (5 bits per color, green may have 6 bits), 24-bit (8 bits per color), or 32-bit (8 bits per color + alpha).

6-bit displays still get 24 (if not 32) bits of color information.  The display simply can't reproduce that many colors.
For 10-bit displays, the graphics driver has to override Windows' color settings: NVIDIA, AMD

AMD is pushing 10-bit support to become the norm through HDR.
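As an aside, the 6-bit+FRC panels mentioned earlier in the thread fake the missing two bits with temporal dithering. A toy sketch of the idea (mine, not any vendor's actual algorithm):

```python
# Toy sketch of 6-bit + FRC (frame rate control) temporal dithering:
# the panel alternates between the two nearest 6-bit levels over a
# 4-frame cycle so the time-average matches the requested 8-bit level.

def frc_frames(value_8bit):
    """Return the 6-bit level driven on each of 4 consecutive frames."""
    base = value_8bit >> 2         # nearest 6-bit level at or below
    remainder = value_8bit & 0b11  # 0-3 quarter-steps toward the next level
    # drive `base + 1` on `remainder` of the 4 frames, `base` on the rest
    return [base + 1] * remainder + [base] * (4 - remainder)

# e.g. 8-bit level 130 -> frames [33, 33, 32, 32]: the average 6-bit
# level is 32.5, which is exactly 130/4.
```

That averaging is why a 6-bit+FRC panel still accepts a full 24-bit signal but can shimmer on some gradients - the extra resolution exists only over time, not in any single frame.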


----------



## Dethroy (Feb 16, 2016)

Peter1986C said:


> And Windows reserves 2 bits of eight for stuff other than actual colour info (all monitors are 16-17 milion colours max).


Please verify these sorts of things next time to avoid embarrassment in the future.


----------



## Peter1986C (Feb 16, 2016)

Although "embarrassment" is not the thing I feel right now, I see your point.

Although I have to say that FordGT90Concept gave a better answer than you by showing I was wrong, through explanation. That usually works better than personal attacks, especially if his reply was enough to let me realise I have had a derp moment.

BTW, if you still wish to "spank" me for being stupid, please do so in PM (directed at Dethroy , not Ford).


----------



## Dethroy (Feb 16, 2016)

A personal attack? It was meant as advice, nothing else. But I can see how it may sound different (the written word is not very good at conveying intentions) when I look at my statement from your point of view.


----------

