I would reckon it is coming either next year or, perhaps more likely, the year after on a broad scale. Curious if anyone else agrees with my chicken-and-egg correlative thinking. That is to say: the average decent PC display costs about the same as the graphics adapter needed to drive it in contemporary gaming (currently excluding 4K, obviously).
Currently the graphics card market is divided more or less like so:
~4K or a triple-1080p setup @ ~60fps (SLI/CF)
2560x1440/1600 @ ~60fps (high-end GPU)
1080p @ ~60fps (midrange GPU)
On down from there, obviously... but those tiers are increasingly irrelevant.
Obviously 30fps/60fps are somewhat arbitrary cutoffs, but it makes sense to lay it out like that, especially if you factor in 3D/120Hz etc. requiring roughly double the horsepower.
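To put rough numbers on that doubling claim, here's a quick back-of-the-envelope sketch. Pixels-per-second is a crude proxy for required horsepower (it ignores shader load, AA, and how fill-rate-bound a given game is), and the exact resolution for each tier is my assumption:

```python
# Rough pixel-throughput comparison for each market tier.
# pixels * refresh rate, as a crude proxy for GPU horsepower needed.
tiers = {
    "4K (3840x2160) @ 60":   (3840 * 2160, 60),
    "tri-1080p (5760x1080)": (5760 * 1080, 60),
    "2560x1440 @ 60":        (2560 * 1440, 60),
    "1080p @ 60":            (1920 * 1080, 60),
    "1080p @ 120Hz / 3D":    (1920 * 1080, 120),
}

baseline = 1920 * 1080 * 60  # midrange 1080p60 as the reference point
for name, (pixels, hz) in tiers.items():
    throughput = pixels * hz
    print(f"{name:24} {throughput / 1e6:7.1f} Mpix/s  ({throughput / baseline:.2f}x 1080p60)")
```

The 2.00x on the 120Hz line is the "roughly double the horsepower" point; the real-world multiplier will wander around that depending on the game.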
With the next generation on the same process, that layout will probably be more or less solidified (60fps at those resolutions).
Come 20nm, each market will probably drop a notch (and conceivably a limited one above the former could be born). With it, the market for those displays grows. Yadda yadda, economics, trickle-down magic (in a broad sense beyond Chinese eBay specials): 2560x1440/1600 becomes more mainstream, 2560 @ 120Hz becomes obtainable, and 4K starts poking its head in the door.

If 2560 or 1080p120 does not catch on in a broad sense, and 1080p -> 4K becomes the de facto progression, this is why we could see the 128-bit GPU die off. CPUs will handle 1080p by 14nm, a decent (or high-bandwidth) GPU is needed for 4K, and the middle ground is a weird niche that may, in the grand scheme of things, be passed over and consolidated.
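For what it's worth, the raw pixel math backs up why that middle ground gets squeezed (again treating pixel count as the proxy, which is a simplification):

```python
# Pixel counts per resolution: the 1080p -> 4K jump is a clean 4x,
# while 1440p/1600p sit awkwardly at ~1.8-2x -- the "weird niche" in between.
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "1600p": 2560 * 1600,
    "4K":    3840 * 2160,
}
base = resolutions["1080p"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP ({px / base:.2f}x 1080p)")
```

If the progression really does skip straight from ~2 MP to ~8 MP, the cards built for the in-between step lose their reason to exist.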