# G-Sync is Dead. VESA Adds Adaptive-Sync to DisplayPort Standard



## btarunr (May 12, 2014)

The Video Electronics Standards Association (VESA) today announced the addition of 'Adaptive-Sync' to its popular DisplayPort 1.2a video interface standard. This technology delivers several important capabilities to computer users: Adaptive-Sync provides smoother, tear-free images for gaming and judder-free video playback. It also significantly reduces power consumption for static desktop content and low frame rate video.

Computer monitors normally refresh their displays at a fixed frame rate. In gaming applications, a computer's CPU or GPU output frame rate will vary according to the rendering complexity of the image. If a display's refresh rate and a computer's render rate are not synchronized, visual artifacts (tearing or stuttering) can be seen by the user. DisplayPort Adaptive-Sync enables the display to dynamically match a GPU's rendering rate, on a frame-by-frame basis, to produce a smoother, low-latency gaming experience. In applications where the display content is static, such as surfing the web, reading email, or viewing a slide presentation, DisplayPort Adaptive-Sync allows the display refresh rate to be reduced seamlessly, lowering system power and extending battery life.
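[Editor's note: the mismatch the paragraph describes can be sketched in a few lines of Python. This is a toy model, not anything from VESA's spec; the function names and the 30-144 Hz refresh window are invented for illustration.]

```python
import math

def vsync_display_times(render_done, refresh_hz=60.0):
    """Fixed refresh: each finished frame waits for the next vsync tick,
    so e.g. a steady 50 fps render on a 60 Hz panel appears on screen
    with an uneven mix of 16.7 ms and 33.3 ms steps (judder)."""
    period = 1.0 / refresh_hz
    return [math.ceil(t / period) * period for t in render_done]

def adaptive_display_times(render_done, min_hz=30.0, max_hz=144.0):
    """Adaptive sync: the panel refreshes when the frame is ready,
    clamped to its supported refresh window [1/max_hz, 1/min_hz]."""
    shortest, longest = 1.0 / max_hz, 1.0 / min_hz
    shown, last = [], 0.0
    for t in render_done:
        s = max(t, last + shortest)   # cannot refresh faster than max_hz
        s = min(s, last + longest)    # panel must self-refresh by min_hz
        shown.append(s)
        last = s
    return shown

# A steady 50 fps render: completion timestamps every 20 ms.
frames = [0.02 * i for i in range(1, 11)]
```

With these inputs, `adaptive_display_times` yields a uniform 20 ms on-screen cadence, while `vsync_display_times` mixes two different step sizes, which is the stutter the article refers to.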



During the playback of lower frame rate video content, Adaptive-Sync allows the source to optimize transport of the video format leveraging OS and DisplayPort interfaces. In addition to providing smoother video playback, the lower frame rate enabled by Adaptive-Sync also reduces power demand, extending battery life.

"DisplayPort Adaptive-Sync enables a new approach in display refresh technology," said Syed Athar Hussain, Display Domain Architect, AMD and VESA Board Vice Chairman. "Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user's content, enabling power efficient transport over the display link and a fluid, low-latency visual experience."

Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

"VESA is constantly evaluating new methods and technologies that add value to both the end user and our OEM member companies. Adaptive-Sync delivers clearly visible advantages to the user for gaming and live video, and contributes to the development of sleeker mobile system designs by reducing battery power requirements," said Bill Lempesis, VESA Executive Director. "VESA has developed a test specification to certify Adaptive-Sync compliance. Systems that pass Adaptive-Sync compliance testing will be allowed to feature the official Adaptive-Sync logo on their packaging, informing consumers which DisplayPort-certified displays and video sources offer Adaptive-Sync."

Implementation of DisplayPort Adaptive-Sync is offered to VESA members without any license fee.

*View at TechPowerUp Main Site*


----------



## HM_Actua1 (May 12, 2014)

Now it's all on the manufacturers to produce their monitors with the capability.


----------



## renz496 (May 12, 2014)

so when we can see real game demo?


----------



## Casecutter (May 12, 2014)

Good, we don't need proprietary... Nice to see some folks pushing back and making companies stop this "we own that" BS.

Let's hope Mantle, even if it does nothing more, gets MS off its "dead gaming azzes"... They're lost on the Xbox market, and I see Mantle as having made them jump on DX12 improvements quicker. I don't want Mantle so much for itself as just to move MS. If it's supposedly open (not exclusive) and moves performance forward, it's done its job!


----------



## TheDeeGee (May 12, 2014)

Hitman_Actual said:


> Now it's all on the manufacturers to produce their monitors with the capability.



Which sucks...

Why can't they make something that sits between the two ends of the cable, so it can be used on all DisplayPort monitors?


----------



## erocker (May 12, 2014)

Svarog said:


> Which sucks...
> 
> Why can't they make something that's between the two ends of the cable so it can be used on all DisplayPort Monitors.


If there's a demand, someone will eventually sell it.


----------



## GhostRyder (May 12, 2014)

It could be cool so long as monitors with the capabilities come out to work on the specification. I have been waiting to see this implemented since around the time AMD's FreeSync was announced. It will only come across as a cool platform if the monitors supporting it come along as well. I wonder if any of the coming 4K monitors (specifically the ASUS) will have this built in.


----------



## HM_Actua1 (May 12, 2014)

erocker said:


> If there's a demand, someone will eventually sell it.



That's the thing: Nvidia did the footwork already and created their own logic board/G-Sync, so the manufacturers didn't have to do the work.

It will be interesting to see what happens down the road.


----------



## erocker (May 12, 2014)

Hitman_Actual said:


> Thats the thing, Nvidia did the foot work already and created their own logic board/Gsync so the manufacturers didn't have to do the work.
> 
> it will be interesting to see what happens down the road.


From what I've read the implementations are different. I just hope the results are similar.


----------



## HM_Actua1 (May 12, 2014)

erocker said:


> From what I've read the implementations are different. I just hope the results are similar.


Yeah, when AMD countered with "FreeSync" after the G-Sync unveiling, Nvidia responded saying that there is a lot more to G-Sync than what the VR-standard-driven FreeSync is doing.
We'll see what happens.


----------



## GhostRyder (May 12, 2014)

Hitman_Actual said:


> yah when AMD countered with "freesync" after the G-sync unveiling Nvidia responded saying that there is a lot more to G-sync then what VR standard driven freesync is doing.
> We'll see what happens.


But of course they are going to say that; I mean, you wouldn't want your new device to sound like it's been outdone by something that counts as "free". But in reality the G-Sync module is probably going to be better (even if it turns out to be only slightly). It's just going to come down to how close FreeSync is to G-Sync as to whether it's going to be worth it to buy the G-Sync monitor.

It will be interesting to see the list of monitors supporting the proprietary one as well as FreeSync.


----------



## HM_Actua1 (May 12, 2014)

Hitman_Actual said:


> yah when AMD countered with "freesync" after the G-sync unveiling Nvidia responded saying that there is a lot more to G-sync then what VR standard driven freesync is doing.





GhostRyder said:


> But of course they are going to say that, I mean you wouldn't want your new device to sound likes its been out done by something that is counted as "Free".  But in reality the G-Sync module is probably going to be better (even if it turns out to be slightly).  Its just going to come down to how close freesync is to G-Sync as to whether its going to be worth it to buy the G-Sync monitor.
> 
> Will be interesting to see the monitor list of support.



Yeah, we'll see. It still boils down to the manufacturers making monitors capable of VR. The standard doesn't mean anything until we see that.


----------



## Fierce Guppy (May 12, 2014)

Oh, happy days!  Thanks to AMD and nVidia trying to undercut each other, we will soon have adaptive sync in monitors and Mantle-like improvements in DX12.  Times are good.


----------



## GhostRyder (May 12, 2014)

Hitman_Actual said:


> Yah well see, It's still boils down to the manufacturers to make monitors capable of VR. The standard doesn't mean anything until we see that.


Yeah, which is the primary problem. The reality is that the tools are all there, so now the manufacturers need to bite the bullet.


----------



## HM_Actua1 (May 12, 2014)

We shall see... there are a lot of steps involved in actually getting adoption. In the meantime, G-Sync works today.


----------



## Fluffmeister (May 12, 2014)

This seems to be more of a means to an end; say hello to AMD's G-Sync alternative, called "Project FreeSync":



			
TechReport said:

> *Q: How are DisplayPort™ Adaptive-Sync and Project FreeSync different?*
> A: DisplayPort™ Adaptive-Sync is an ingredient DisplayPort™ feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. *Project FreeSync is a unique AMD hardware/software solution* that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.



Source: http://techreport.com/news/26451/adaptive-sync-added-to-displayport-spec


----------



## Xzibit (May 12, 2014)

Hitman_Actual said:


> yah when AMD countered with "freesync" after the G-sync unveiling Nvidia responded saying that there is a lot more to G-sync then what VR standard driven freesync is doing.
> We'll see what happens.



You referring to this ?

*January 8, 2014 - The TechReport - Nvidia responds to AMD's "free sync" demo*



			
Tom Petersen interview said:

> However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, *with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS* or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.
> That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to* nudge the industry in the right direction*.



He might have forgotten how G-Sync works.

*October 21, 2013 - PC Perspective - Nvidia G-Sync Overview and Explanation with Tom Petersen*
@1:03:xx (question about resolution)


			
Tom Petersen said:

> *Right now our module is LVDS*



*May 12, 2014 - VESA - VESA® Adds ‘Adaptive-Sync’ to Popular DisplayPort™ Video Standard*
Board of Directors 
Craig Wiley, Chairman, Parade Technologies
*Syed Athar Hussain, Vice Chairman, AMD*
Brian Fetz, Secretary/Treasurer, Agilent Technologies
Simon Ellis, Intel
Alan Kobayashi, MegaChips
Gourgen Oganessyan, Hirose Electric USA
*Pablo Ortega, NVIDIA*



			
VESA said:

> “VESA is constantly evaluating new methods and technologies that add value to both the end user and our OEM member companies. Adaptive-Sync delivers clearly visible advantages to the user for gaming and live video, and contributes to the development of sleeker mobile system designs by reducing battery power requirements,” said Bill Lempesis, VESA Executive Director. *“VESA has developed a test specification to certify Adaptive-Sync compliance. Systems that pass Adaptive-Sync compliance testing will be allowed to feature the official Adaptive-Sync logo on their packaging, informing consumers which DisplayPort-certified displays and video sources offer Adaptive-Sync.”*
> Implementation of DisplayPort Adaptive-Sync is offered to VESA members without any license fee.



DisplayPort Adaptive-Sync doesn't look like it's limited to full-screen 3D mode like G-Sync. Now that it's officially a standard, it won't be limited to single-monitor use like G-Sync.


----------



## CoD511 (May 13, 2014)

Xzibit said:


> Display Port Adaptive-Sync doesn't look like its limited to Full Screen 3D mode like G-Sync.  Now that is officially standard it wont be limited to single monitor use like G-Sync.



Could you clarify what you mean by full-screen 3D mode? I'm a bit bamboozled by your meaning. If you meant limited to 144Hz-capable monitors, then that one would be a no; it isn't, and it can be used on any monitor. There was a demo by Nvidia with G-Sync on an ASUS 4K 60Hz monitor at CES to demonstrate as much.


----------



## Xzibit (May 13, 2014)

CoD511 said:


> Could you clarify what you mean by full-screen 3D mode, if you could? A bit bamboozled by your meaning  If you meant limited to 144Hz capable monitors, then that one would be a no, it isn't and can be used on any monitor. There was a demo with G-sync on a ASUS 4K, 60hz monitor at CES by Nvidia to demonstrate as such.



G-Sync only works in full-screen 3D mode. It doesn't work in 2D or windowed 3D mode.


----------



## erocker (May 13, 2014)

Hitman_Actual said:


> We shall see.. there are a lot of steps involved to actually get adoption.. In the meantime G-Sync works today


It does, and I gotta say, it's pretty damn awesome. You just can't seem to buy the kits anywhere, and they only work with Nvidia cards. This technology is something everyone should experience. So, again, I hope it's done right.


----------



## CoD511 (May 13, 2014)

erocker said:


> It does, and I gotta say, it's pretty damn awesome.. You just can't seem to buy the kits anywhere.. and they only work with Nvidia cards. This technology is something everyone should experience. So, again I hope it's done right.



The flood of monitors should hopefully come with G-Sync very soon; the first one I know of is being released in the UK in two weeks. I can see most other 1080p 144Hz monitors, like that AOC one being released, being right behind it. The ROG Swift a little longer perhaps, but sometime in Q2...


----------



## pr0n Inspector (May 13, 2014)

Xzibit said:


> G-Sync only works in fullscreen 3D mode.  Doesn't work in 2D or window 3D mode.


But the entire desktop has been a 3D surface since Vista...


----------



## Xzibit (May 13, 2014)

pr0n Inspector said:


> But the entire desktop has been a 3D surface since Vista...



Time to petition Nvidia then.

*Guru3D - NVIDIA G-Sync Explored and Explained*



			
Guru3D said:

> Okay, we slowly get to the point where we will install the monitor and drivers. Right now there are some requirements for G-Sync:
> 
> Windows 7/8.1 Desktop PC.
> GeForce GPU – GTX 650 Ti Boost or Higher with DisplayPort Connector (we used a GTX 760 for G-SYNC testing.)
> ...



G-Sync also disables the audio pass-through in DisplayPort.


----------



## erocker (May 13, 2014)

CoD511 said:


> The flood of monitors should hopefully come with G-sync very soon, the first one I know of is being released in the UK in two weeks. I can see most other 1080p 144hz monitors like that AOC one being released being anything but right behind it. The ROG Swift a little longer perhaps but sometime in Q2...


As much as I'd like to get one, I just can't stop using my 1440p OC'd monitor. I'd kind of rather wait until the end of the year when I plan to replace my current system.


----------



## The Von Matrices (May 13, 2014)

Xzibit said:


> G-Sync only works in fullscreen 3D mode.  Doesn't work in 2D or window 3D mode.



If you're running things in a window, then the usefulness of adaptive sync is questionable.  The only reason you would run a program in windowed mode would be if you cared about the content of more than one window; however, each window will be rendering/refreshing at a different rate.  If your software synchronizes to one window with a variable refresh rate, then every other window's cadence looks even worse than with a fixed refresh rate.  For example, if you're trying to display a video and a 3D rendering, then focusing upon the 3D rendering will cause the video to become a stuttering mess.  The highest fixed refresh rate that a monitor can display will be the best compromise for a multiple-window environment.
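[Editor's note: the cadence argument above can be made concrete with a toy calculation. Nothing here comes from the thread; the function name and all the timestamps are invented for illustration.]

```python
def next_refresh_delay(content_times, refresh_times):
    """For each content frame, how long it waits for the next refresh
    opportunity (0.0 means it is shown the instant it is ready)."""
    return [min(r for r in refresh_times if r >= t) - t
            for t in content_times]

# 24 fps video frames over the first ~0.09 s.
video = [i / 24 for i in range(3)]

# A fixed 60 Hz grid: every frame waits at most one refresh period.
fixed_60hz = [i / 60 for i in range(7)]

# Refreshes slaved to a 3D window that stalls mid-run (made-up times):
# a video frame landing in the stall must wait far longer than 1/60 s.
game_driven = [0.005, 0.015, 0.025, 0.080, 0.090]
```

Under the fixed grid, no video frame waits longer than one 60 Hz period; under the game-driven refresh times, the frame that lands in the stall waits several times that, which is the "stuttering mess" described above.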


----------



## Recus (May 13, 2014)

Serious question: how do you feel when paying for a free product?



> There are few AMD cards, which already support this technology, but unfortunately you won’t find a single monitor which is DisplayPort 1.2a compatible.





> So how will users get FreeSync working on their systems?
> 
> To use Project FreeSync, users will require: a monitor compatible with DisplayPort Adaptive-Sync, a compatible AMD Radeon GPU with a DisplayPort connection, and a compatible AMD Catalyst graphics driver. AMD will release a compatible graphics driver to coincide with the introduction of the first DisplayPort Adaptive-Sync monitors.


----------



## Breit (May 13, 2014)

The Von Matrices said:


> If you're running things in a window, then the usefulness of adaptive sync is questionable.  The only reason you would run a program in windowed mode would be if you cared about the content of more than one window; however, each window will be rendering/refreshing at different rates.  If you software synchronizes to one window with a variable refresh rate, then every other window's cadence looks even worse than with a fixed refresh rate.  For example if you're trying to display a video and a 3D rendering, then focusing upon the 3D rendering will cause the video to become a stuttering mess.  The highest fixed refresh rate that a monitor can display will be the best compromise for a multiple window environment.



I think he is talking about running a game, for instance, in windowed fullscreen mode and some other apps on a secondary monitor. This is especially useful if you want to switch between the game and some other apps (like TS or a chat tool). This is something you can't do with an app running in true fullscreen mode, at least on Windows.
The use case for a game in a small window and a video playing on the same monitor may be very limited.


----------



## Mussels (May 13, 2014)

Woooo, another reason for PC gaming master race to lord it over the console peasants!


(goddamnit, now where do i get a 46", 4K HDTV with this tech?)


----------



## Steevo (May 13, 2014)

Mussels said:


> Woooo, another reason for PC gaming master race to lord it over the console peasants!
> 
> 
> (goddamnit, now where do i get a 46", 4K HDTV with this tech?)


I think you mean a 52" sir.


On a side note, it's amazing how a partial implementation, costing enough money that I could instead afford a newer or additional card to remove the need for it, gets so much support, and people fall over themselves to talk about how great it is. Reminds me of Apple peasants.


----------



## Mussels (May 14, 2014)

Steevo said:


> I think you mean a 52" sir.
> 
> 
> On a side note its amazing how a partial implementation that costs enough money that if I were to purchase it I could afford a newer or additional card to remove the need for it gets so much support, and people fall over themselves to talk about how great it is, reminds me of apple peasants.



46" fits my desk nicely at present. Maybe 50" if the bezel is thinner.


----------



## birdie (May 14, 2014)

What's up with *the open NVIDIA hatred* on this website? Do you envy them or what?

Can you tone down the titles of your future articles to make them less derogatory?

G-Sync offers fine control over the display and decreased lag; that kind of thing is *not* available in the updated VESA standard. You just cannot implement G-Sync's features as a software update to the standard. Deal with it and stop pouring sh*t on NVIDIA.

Yes, they may look greedy, but they are not a non-profit organization; they have to earn money to survive.


----------



## OneCool (May 15, 2014)

This must be a new thing called "nVidia gets their ass handed to them week". I like it.


----------



## Prima.Vera (May 15, 2014)

birdie said:


> What's up with* the open NVIDIA hatred *on this website? Do you envy them or what?



It's only natural when you start asking ridiculous prices for your hardware and locking down proprietary tech.


----------



## birdie (May 15, 2014)

I remember almost everyone here loved Mantle (and DirectX before it) and both are 100% proprietary technologies.

Your comment makes no sense.


----------



## Mussels (May 15, 2014)

birdie said:


> I remember almost everyone here loved Mantle (and DirectX before it) and both are 100% proprietary technologies.
> 
> Your comment makes no sense.



Mantle doesn't require you to buy a new graphics card AND monitor to use it. 3D monitors died for that same reason. DirectX and Mantle support a lot more existing hardware.


----------



## birdie (May 15, 2014)

Mussels said:


> mantle doesnt require you to buy a new graphics card AND monitor to use it. 3D monitors died for that same reason. directX and *mantle support a lot more existing hardware*.



Mantle is not supported by either NVIDIA or Intel.

G-Sync is not supported by either AMD or Intel.

For G-Sync you don't have to buy a new GPU either; it's supported by every GPU from Kepler onwards.

Now tell me, why is Mantle better?


----------



## Mussels (May 15, 2014)

birdie said:


> Mantle is not supported by both NVIDIA and Intel.
> 
> G-Sync is not supported by both AMD and Intel.
> 
> ...



Because Mantle doesn't require me to buy a new monitor.

You've gone far beyond apples and oranges here; you're comparing coconuts to coconut crabs.


----------



## rvalencia (May 22, 2014)

birdie said:


> What's up with* the open NVIDIA hatred *on this website? Do you envy them or what?
> 
> Can you tone down the titles of your future articles to make them less derogatory?
> 
> ...


Your "that kind of thing is *not* available in the updated VESA standard. You just cannot implement G-Sync features as a software update to the standard" statement is wrong.

For gaming adaptive sync support (i.e. the refresh rate being driven by the GPU), AMD's FreeSync/VESA's A-Sync requires desktop GCN 1.1 video cards. From GCN 1.1's point of view, it requires a new monitor and driver software. VESA's DP 1.2a with the A-Sync feature was based on the existing eDP 1.0 A-Sync standard.

Older GCNs have A-Sync for video and power-saving modes.


----------



## SQUIDZILLA (May 28, 2014)

Hitman_Actual said:


> Thats the thing, Nvidia did the foot work already and created their own logic board/Gsync so the manufacturers didn't have to do the work.
> 
> it will be interesting to see what happens down the road.



I don't think Nvidia did anything but try to make a quick buck off an un-utilized, already-present technology, by telling people that adaptive frame times for monitors are essentially unique to Nvidia's technology. Because they convinced people that their G-Sync technology was unique, they charged a premium for their unnecessary hardware, which did nothing but add cost and make variable refresh rate a proprietary technology. Even AMD mentioned that they didn't understand why Nvidia did what they did with G-Sync.


----------



## Mussels (May 28, 2014)

What Nvidia did, really, was make their own connection standard.

At the time they did it, DVI and DP couldn't do it, so they made a new 'connector' that required a compatible GPU and monitor.

Now that DP can do it, we still need a new GPU and monitor; it's just not locked to Nvidia.


----------



## SQUIDZILLA (May 28, 2014)

Mussels said:


> what nvidia did, was made their own connection standard, really.
> 
> at the time they did it, DVI and DP couldnt do it - so they made a new 'connector' that required a compatible GPU and monitor.
> 
> now that DP can do it, we still need a new GPU and monitor - its just not locked to nvidia.



I'm pretty sure it was supported before Nvidia revealed G-Sync. It just wasn't available on external desktop monitors, but it could have been.

"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."


----------



## SQUIDZILLA (May 28, 2014)

Mussels said:


> what nvidia did, was made their own connection standard, really.
> 
> at the time they did it, DVI and DP couldnt do it - so they made a new 'connector' that required a compatible GPU and monitor.
> 
> now that DP can do it, we still need a new GPU and monitor - its just not locked to nvidia.



In fact, this entire article is just copied from displayport.org's website, from the article where they announced the new DisplayPort standard.


----------



## Xzibit (Jun 7, 2014)

*Anandtech - Computex 2014: AMD Demonstrates First FreeSync Monitor Prototype*

Monitor is a Nixeus IPS 2560x1440 *Nixeus Vue 27D* with a firmware update


----------



## rvalencia (Jun 7, 2014)

Xzibit said:


> *Anandtech - Computex 2014: AMD Demonstrates First FreeSync Monitor Prototype
> *
> 
> 
> ...


It looks like birdie's comment was wrong. A firmware update LOL.


----------



## Xzibit (Jun 7, 2014)

Yeah, they got it to work on a current monitor, with an IPS panel no less.

Monitors are already adopting DP 1.2a. Dell released 3 new monitors, all with DP 1.2a. Don't know if they will be "FreeSync" capable, but it goes to show DP 1.2a is being adopted and hit retail before an official G-Sync monitor did.


----------



## mousespecial14 (Jul 16, 2014)

This article I found makes a good point about the different syncs out there, though: "AMD claims that monitors will be available in 6-12 months. If there are delays, then this could leave G-Sync viable for a couple of years." -- http://toptengamer.squidoo.com/top-g-sync-monitors . And then we have to remember that nothing's really free; still, FreeSync has to be cheaper.


----------



## Xzibit (Jul 17, 2014)

mousespecial14 said:


> This article I found makes a good point about the different syncs out there though.   "AMD claims that monitors will be available in 6-12 months. If there are delays, then this could leave G-Sync viable for a couple of years." -- http://toptengamer.squidoo.com/top-g-sync-monitors .  And then we have to remember that nothing's really free; still freesync has to be cheaper.



Nvidia has to adopt DP 1.2a or higher and rework G-Sync. One of the advantages of VESA Adaptive-Sync is being a standard. Unlike G-Sync, it won't have to use the AUX bandwidth to communicate, and thus won't sacrifice the audio component of DisplayPort. Not to mention that's probably one of the quirks G-Sync is facing, since they haven't been able to get it to work with MST so far. There are already DP 1.2a MST monitors on the market.

They also don't bring up or address that Nvidia will have to compete with TCON companies, which are well established and at a competitive or lower price, to break into that market. Nvidia likes to sell stuff at a high premium; we saw its short-lived cycle in the US mobile phone market because of it.

I'd wait for the 2nd gen of these products to make a decision myself. I wouldn't want to get a G-Sync monitor only for Nvidia to switch to the DP 1.2a+ standard, or before the first kinks are worked out of VESA Adaptive-Sync, if there are any. Things will move along faster once Nvidia and AMD are playing on standards they are both part of.


----------



## HM_Actua1 (Jul 17, 2014)

"AMD is actively working with these scaler vendors to bring Adaptive-Sync support into their higher-end scalers, and the company expects the mainstream scalers to gain support for the features very soon as well. The process isn’t progressing as slowly as we thought, although the AMD spokesman did make it clear that it was still too early to discuss exact commitments."

Let me know when that is ACTUALLY TRUE. Until then I say BS; NV tried and couldn't get them to budge, so they created a way to do it themselves. Too early to discuss commitments because there are ZERO to discuss. If scaler companies don't make the hardware, how far are you going to get? About as far as NV did, I suspect. If it was progressing quickly, why would you put a couple of YEARS? Also, since there are moles everywhere, I'm sure NV will know when they decide to cooperate, and at that point they can choose to drop G-Sync or lower the price to force more sales (it does sell a GPU also). It's up to them to figure out which way is better for their business, and I'm sure their bean counters will be hard at work... LOL.

http://www.blurbusters.com/gsync/list-of-gsync-monitors/
The list of announced G-Sync monitors is growing. Not sure when it was last updated, but I'm sure more will be announced in the next 5 months. The only reason we don't see more already is the amount of time it takes to tune it for each panel (hence the cost). It would sure speed things up if they got a rev2 out or something that can just be applied to all monitors easily. Considering the difficulty in even doing it NV's way, I'm still wondering how GOOD AMD's solution will really be when someone finally is allowed to test it gaming across a dozen titles or so.

"couple years down the road"
This is a jump on Gsync that is ALREADY HERE? ROFL. "speed to market" means nothing if it takes years to actually GET to market. AMD said nothing here, no clearing the air, just more of what they HOPE will happen. Meanwhile monitors with gsync will be out in quite good numbers for xmas (meaning on a good number of monitors). Your comment is crazy. In order to have a jump on gsync you have to be FIRST don't you? Not in a COUPLE OF YEARS, right? Is it 6-12 months or a couple of years now?

"We can expect to see almost all mainstream and high-end monitors support Adaptive-Sync in the future."
Umm...There's that scaler problem that has to be worked out with the 4-5 vendors the AMD guy mentioned...Remember, NV said they tried and nobody would budge (that R&D costs money), so again, this is why they did it themselves. It is also why you DON'T give it away freely after doing that R&D. Make no mistake the scaler companies will charge the monitor people, and the monitor people will in turn charge YOU. The same thing happened with gsync. This is no different, it's just not AMD doing it to you, it's the scalers/monitor makers who will.

"If we wanted to do something over HDMI right now, it would have to be proprietary, and we would rather not do that."
ROFL...

So in other words "Nvidia had no choice but to do it proprietary because that is all they had available to work with, and since we don't want to be blamed for charging you, we'll wait for years maybe until scalers cooperate so it can be blamed on them or monitor makers"...LOL. 

If scaler makers move at all it will be due to them losing sales because Gsync is included INSTEAD of their scaler. At that point (say xmas or so when all the monitors that we know are coming with gsync are out in great numbers), they may be willing to at least do the work and charge a minimal amount for it, but they won't go for FREE, just cheaper than NV probably to win back sales from Gsync monitors. You see, without vast numbers of gsync selling yet, they have no fears, but that ends at xmas. You could say, AMD's success at getting it into monitors is solely based on Nvidia's success at selling Gsync this xmas...ROFL. If NV succeeds you'll see scaler vendors ramp up some R&D to get new scalers out the door to stop gsync from taking all their sales. It's that simple. Then again, if NV can drive the cost down as sales ramp up they may lose anyway. That's how cuda got entrenched. By the time AMD actually did something they already had years in cuda and owned 90% of the market.

What air got cleared? No commitments discussed and no "it will be out on X day", so what got cleared up? The 4-5 scaler makers still haven't committed here either, or it would be in the post. All I see is "it's taking long, so we thought we'd make some more fluff noise and keep saying words like FREE when we know it isn't FREE for the scaler or monitor vendors".
Even the monitor makers have to pay some R&D to get their monitor to pass for the label. Why the heck would AMD not reveal a monitor that CAN be used today with Adaptive-Sync unless it, well, CAN'T? Are you unable to purchase it with this different firmware that can use it because brand X wants to make you buy a new monitor?

Fuzzy, fuzzy, fuzzy...Not clear at all.


----------



## HM_Actua1 (Jul 17, 2014)

Xzibit said:


> Nvidia has to adopt DP 1.2a or higher and rework G-Sync.  One of the advantages of VESA Adaptive-Sync is being a standard.  Unlike G-Sync, it won't have to use the AUX bandwidth to communicate, and thus won't sacrifice the audio component of DisplayPort.  Not to mention that's probably one of the quirks G-Sync is facing, since they haven't been able to get it to work with MSTs so far.  There are already DP 1.2a MST monitors on the market.
> 
> They also don't bring up or address that Nvidia will have to compete with TCON companies, which are well established and at competitive or lower prices, to break into that market.  Nvidia likes to sell stuff at a high premium; we saw its short-lived cycle in the US mobile phone market because of it.
> 
> I'd wait for the 2nd gen of these products to make a decision myself.  I wouldn't want to get a G-Sync monitor only for Nvidia to switch to the DP 1.2a+ standard, or before the first kinks are worked out of VESA Adaptive-Sync, if there are any.  Things will move along faster once Nvidia and AMD are playing on the standards they're both part of.



That's assuming you think DP isn't backwards compatible?


----------



## HM_Actua1 (Jul 17, 2014)

I've never read so many misunderstood statements about a tech.


----------



## Xzibit (Jul 17, 2014)

Hitman_Actual said:


> That's assuming you think DP isn't backwards compatible?





DisplayPort will be, but you won't get the benefits of the extra bandwidth.  Nvidia will have to change the calls going through AUX to DP 1.2a or newer to free up the AUX channel, so they have to change it. Better sooner than later.  Unless you expect all future G-Sync monitors to be DP 1.2 while every other monitor is DP 1.2a or DP 1.3. How are they going to drive a 4K monitor past 60 Hz if they stick to DP 1.2?  They either have to adapt or keep using the AUX channel out of stubbornness, and they might run into trouble with that.  It will become a selling point between monitor vendors and tech; it will be marketed as the "Standard".
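A quick back-of-the-envelope check of that 4K-past-60Hz claim. The DP 1.2 (HBR2) link rate and 8b/10b coding figures are the published ones; the 5% blanking overhead and the helper function are my own rough assumptions, not anything from the thread:

```python
# Rough estimate: maximum refresh rate a DisplayPort link can sustain at a
# given mode. Assumes 24 bpp and ~5% reduced-blanking timing overhead.

def max_refresh_hz(link_gbps_effective, width, height, bpp=24, blanking=0.05):
    """Highest refresh rate the link's effective data rate can carry."""
    bits_per_frame = width * height * bpp * (1 + blanking)
    return link_gbps_effective * 1e9 / bits_per_frame

# DP 1.2 (HBR2): 4 lanes x 5.4 Gbit/s, 8b/10b coding -> ~17.28 Gbit/s effective
dp12 = 4 * 5.4 * 0.8
print(round(max_refresh_hz(dp12, 3840, 2160)))  # ~83 Hz theoretical ceiling
```

So DP 1.2 can in theory squeeze a little past 60 Hz at 4K with aggressive reduced blanking, but there's almost no headroom, which is why shipping 4K DP 1.2 monitors topped out at 60 Hz; the higher per-lane rate in DP 1.3 is what makes >60 Hz at 4K comfortable.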

Nvidia announced G-Sync on Oct 18, 2013, 9 months ago (1 day shy of 9 months).  No monitors yet.  Maybe in a few days.

VESA Adaptive-Sync was announced May 12, 2014, only 2 months ago.  They have 7 months of delay wiggle room to be on par.


----------



## HM_Actua1 (Jul 17, 2014)



Xzibit said:


> DisplayPort will be, but you won't get the benefits of the extra bandwidth.  Nvidia will have to change the calls going through AUX to DP 1.2a or newer to free up the AUX channel, so they have to change it. Better sooner than later.  Unless you expect all future G-Sync monitors to be DP 1.2 while every other monitor is DP 1.2a or DP 1.3. How are they going to drive a 4K monitor past 60 Hz if they stick to DP 1.2?  They either have to adapt or keep using the AUX channel out of stubbornness, and they might run into trouble with that.  It will become a selling point between monitor vendors and tech; it will be marketed as the "Standard".
> 
> Nvidia announced G-Sync on Oct 18, 2013, 9 months ago (1 day shy of 9 months).  No monitors yet.  Maybe in a few days.
> 
> VESA Adaptive-Sync was announced May 12, 2014, only 2 months ago.  They have 7 months of delay wiggle room to be on par.



How would a user benefit from extra bandwidth at a fixed resolution?

The refresh rate is fixed? I can only see that mattering when the resolution goes up and when the GPU can support that.

I'm not following you there?


----------



## Xzibit (Jul 17, 2014)

Hitman_Actual said:


> How would a user benefit from extra bandwidth at a fixed resolution?
> ...



It wouldn't.

It's not so much the bandwidth as the way they are using the AUX channel to communicate.  They can stick with their current method, lose the audio pass-through that DisplayPort provides for those who need it, and live with the compatibility issues with MSTs.  That's up to them.

AMD got VESA to standardize their spec in DP 1.2a and going forward, so those issues will likely not be present.


----------



## LeonVolcove (Jul 18, 2014)

I don't really know about this G-Sync or FreeSync stuff, since I connect my PC to my monitor via HDMI and never looked back.

Sorry if my posting is trash.


----------

