Until we can game at a minimum of 80 fps (using ULMB) in every game on a 144 Hz IPS monitor, with acceptable lag and response times at 4K, for less than a third of the current ~$4,500 cost in graphics cards and monitor, I'm not going there.
The current problem with 4K gaming is that it requires a lot of money:
- the monitor itself: prices have come down, but they are still expensive
- the extra cost of sync technologies: FreeSync is much cheaper than G-Sync, but it still adds to the price of the monitor
- the only manufacturer (nVidia) currently able to provide good fps at 4K sells the cards that can do it (I'm referring to 1080-class and above) at very high prices, putting them out of reach for most people
- the mining craze did NOT help one bit, driving away further potential 4K gaming adopters
- the only alternative to nVidia's 4K solutions (at lower quality settings), though cheaper, still costs too much for most people
- it requires a good CPU so games aren't bottlenecked: for gaming, Intel is still the best choice in terms of fps, but also the most expensive, driving the total cost up further
- RAM is still very pricey, which doesn't help one bit
No wonder 4K gaming adoption is slow! A rough tally of what such a build runs is sketched below.
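To put a number on that list, here is a back-of-the-envelope tally in Python. The only figures actually taken from this thread are the ~$4,500 GPU-plus-monitor cost above and the $2,500 panel mentioned later; every other line item is a placeholder guess for this era of hardware, not a quote.

```python
# Rough illustrative tally of a high-refresh 4K build's extra costs.
# Prices marked "guess" are placeholders, not quotes; only the monitor
# and GPU figures echo numbers cited elsewhere in this thread.

cost_usd = {
    "144 Hz G-Sync 4K monitor": 2500,      # figure cited later in the thread
    "GTX 1080-class card(s)":   2000,      # enough GPU horsepower for 4K
    "high-end Intel CPU":        400,      # avoid a CPU bottleneck (guess)
    "16 GB RAM at inflated prices": 200,   # mining/RAM-shortage era (guess)
}

total = sum(cost_usd.values())
print(f"Total: ${total:,}")  # Total: $5,100 -- well past most gaming budgets
```

Even with generous guesses on the cheaper items, the total lands far above what most gamers will spend on a whole system.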
All true ... almost... FreeSync isn't going to do anything for ya with a 144 Hz 4K monitor.
From one perspective... the two technologies are very similar:
G-Sync provides adaptive sync starting at about 30 fps, and its impact trails off greatly after 60 fps.
FreeSync provides adaptive sync starting at about 40 fps, and its impact trails off greatly after 60 fps.
From another perspective... the two technologies are very dissimilar.
G-Sync monitors include a hardware module, which is the reason for the difference in cost between the two technologies. When the first G-Sync-ready monitors came out, you could buy the module as a $200 add-on. Adaptive sync is intended to eliminate the display issues that occur when the frame rate drops below 60 fps. The hardware module also provides Motion Blur Reduction (MBR), which is quite useful above 60 fps, where the problems adaptive sync solves are no longer a significant issue. Of course, an average fps means the minimum fps is lower, so some cushion is needed; when fps averages 70 or so, the MBR technology provides a superior visual experience. The typical 1440p 144 / 165 Hz owner, assuming they have the GFX horsepower, is playing with G-Sync OFF and ULMB ON, except for that rare game that won't let them maintain fps above 60.
FreeSync monitors are not equipped with this hardware module. Yes, adaptive sync continues to work above 60 fps, but its effects are greatly diminished as fps increases. And there is no high-fps alternative to switch to, because FreeSync includes no MBR hardware module.
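To make the trade-off concrete, here is a minimal Python sketch of the decision rule described above. The function name and exact thresholds (30 / 40 fps sync floors, a ~70 fps average as the cushion for switching to ULMB) are my own illustration of this thread's numbers, not anything published by nVidia or AMD.

```python
# Hypothetical sketch of the mode-selection logic described above; the
# fps thresholds are taken from this thread, not from any vendor spec.

def pick_display_mode(avg_fps: float, sync_tech: str) -> str:
    """Pick between adaptive sync and Motion Blur Reduction (ULMB).

    sync_tech is "gsync" or "freesync"; only G-Sync monitors carry
    the hardware module that provides MBR/ULMB.
    """
    sync_floor = 30 if sync_tech == "gsync" else 40  # approximate lower bounds

    # Average fps overstates the worst case, so demand a ~10 fps cushion
    # above 60 before trading adaptive sync for blur reduction.
    if sync_tech == "gsync" and avg_fps >= 70:
        return "G-Sync OFF, ULMB ON"  # MBR gives the better experience here

    if avg_fps >= sync_floor:
        return f"adaptive sync ({sync_tech}) ON"  # within the sync window

    # Below the sync floor, neither technology helps much.
    return "below sync range: lower settings or upgrade the GPU"


if __name__ == "__main__":
    for fps, tech in [(45, "gsync"), (85, "gsync"), (45, "freesync"), (85, "freesync")]:
        print(fps, tech, "->", pick_display_mode(fps, tech))
```

Run it and the asymmetry is obvious: above ~70 fps average the G-Sync owner switches to ULMB, while the FreeSync owner has nowhere better to go.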
So, when ya are ready to go 4K ....
With AMD, grab a decent 60 / 75 Hz panel and wait for a next-gen AMD card that can handle the games you want to play at 40 - 75 fps.
With nVidia, wait till ya can afford what is now a $2,500 144 Hz panel, as well as reasonably affordable nVidia card(s) that can handle the games you want to play at 75 - 144 Hz.
Personally, I won't give up 144 Hz ULMB to go 4K until I can do it at reasonable expense .... I'm thinking 2020.