Good to see it finally rolling out - however, I feel a bit misled. Unless I misunderstood it previously, their version of adaptive v-sync was supposed to run on any monitor; all it required was an AMD *FreeSync*-compatible GPU or APU.
Now I come to find out that you have to purchase a monitor where adaptive sync support has been baked into the DisplayPort 1.2 connection.
- That's a strike against it, because like G-Sync, it costs you more money.
- Adaptive-Sync will be standard in DisplayPort 1.3 anyway, so don't rush out and buy a monitor that supports FreeSync unless you really like the monitor itself. When 1.3 hits, your selection will increase.
- Lastly, this adaptive v-sync craze is really aimed at the mainstream gamer who knows very little about frame rendering times and frame latency. The problem is that frame latency, frame time and the other variables of frame rendering mechanics are only now being discussed in GPU and performance reviews.
It's not common knowledge, and I question how they expect to sell people on the idea that FreeSync makes your games smoother when most people never even noticed their gaming was 'unsmooth' to begin with.
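For anyone unfamiliar with the frame-time point above, here's a quick illustrative sketch (the numbers are made up): frame time in milliseconds is just 1000 divided by fps, and two runs can report the exact same average fps while feeling completely different, because perceived smoothness depends on how *consistent* the individual frame times are.

```python
# Frame time (ms) = 1000 / fps. Two runs with identical average fps
# can feel very different if per-frame times are inconsistent.
steady = [20.0] * 6        # every frame delivered in 20 ms
uneven = [10.0, 30.0] * 3  # same average frame time, visible stutter

def avg_fps(times_ms):
    # Average fps over a run: 1000 ms divided by the mean frame time.
    return 1000 / (sum(times_ms) / len(times_ms))

print(avg_fps(steady))  # 50.0 fps
print(avg_fps(uneven))  # 50.0 fps - same number, worse experience
```

That's why an fps counter alone tells you very little about smoothness, and why reviews are only now moving to frame-time plots.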
Additionally, FreeSync is far from automatic at the driver level. If a lot of people still struggle to find their Catalyst Control Center, or don't know what a monitor OSD is, where does that leave them when trying to fiddle about setting up FreeSync?
Once you do some research and learn a bit about how an image is rendered, you find that, for the most part, you can achieve the same effect as G-Sync/FreeSync without needing to buy additional hardware.
Making use of tools such as Radeon Pro, RTSS and CRU (to create custom resolutions) is the key to achieving a much smoother experience - all while using the same hardware you currently own.
Setting up a custom monitor resolution and/or capping your frame rate takes about the same amount of time as enabling FreeSync.
The former option is entirely free; the latter costs you money.
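For the curious, a frame-rate cap of the kind RTSS applies is conceptually just a wait inserted into the render loop so no frame is presented ahead of its time budget, which keeps frame delivery at a steady cadence. A rough sketch (purely illustrative - RTSS actually hooks the graphics API rather than doing anything like this in Python):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def render_frame():
    # Stand-in for the real rendering work a game would do.
    time.sleep(0.005)  # pretend the frame finished in 5 ms

def capped_loop(num_frames):
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        # If the frame finished early, sleep out the rest of the budget
        # so frames go out evenly spaced instead of in bursts.
        remaining = FRAME_BUDGET - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)

capped_loop(10)  # 10 frames paced at a steady ~60 fps
```

The point is that evenly spaced frames are what a cap buys you; it trades a little raw throughput for consistency, which is most of what people perceive as "smooth".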
EDIT: In addition to what I posted, the arguments over market share are laughable. This 'technology' is not ground-breaking. While I wouldn't call it a gimmick, it's not some new architecture that will usher us into a grand age of computer graphics and performance.
It's a bonus feature, if nothing else, and not every monitor, GPU or APU is going to support it.