It has already been said that 8K will be the final standard because it reaches the limits of human vision.
I strongly suggest reading the link I posted.
tftcentral said:
If you consider that you only have maximum visual acuity in the foveola, 8K seems quite a waste: out of the full 7680x4320 frame, you can only see a circle about 69 pixels in diameter with maximum accuracy at any one time.
On computer displays there is definitely something to be said for 4K. You can display a lot of information simultaneously, and you usually only have to focus on a small area at a time, which means the higher detail really does add value. Furthermore, the short viewing distance allows a wide field of view without the need for extremely large displays.
With televisions it's a different story. To really profit from 4K you'd need an extremely large screen, or sit extremely close. And 8K is just plain ridiculous: at a 250 cm viewing distance you'd need a 595 x 335 cm screen.
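Those figures are easy to sanity-check. Below is a rough back-of-the-envelope sketch; the constants are my own assumptions for illustration (1 arcminute per pixel as the classic 20/20 acuity limit, a foveola of roughly 1.15 degrees, the ~100 degree horizontal field of view usually cited as the Super Hi-Vision design target, and a hypothetical 27" monitor at 60 cm for the desktop comparison), not tftcentral's published method:

```python
import math

# Sanity check of the figures quoted above. All constants are assumptions
# for illustration: 1 arcmin per pixel (the classic 20/20 acuity limit),
# a foveola of ~1.15 degrees, and a ~100 degree horizontal field of view
# (the figure usually cited for NHK's Super Hi-Vision viewing model).
ACUITY_PX_PER_DEG = 60   # 1 arcmin per pixel -> 60 px per degree
FOVEOLA_DEG = 1.15       # approximate angular size of the foveola
HFOV_DEG = 100           # 8K reference horizontal field of view
DISTANCE_CM = 250        # TV viewing distance from the quote above
ASPECT = 16 / 9

# 1) The "sharp circle": pixels the foveola covers at the acuity limit.
foveola_px = ACUITY_PX_PER_DEG * FOVEOLA_DEG
sharp_area = math.pi * (foveola_px / 2) ** 2
print(f"sharp circle: ~{foveola_px:.0f} px diameter "
      f"({sharp_area / (7680 * 4320):.2%} of the 8K frame)")

# 2) Screen size needed to fill 100 degrees at 250 cm.
tv_width = 2 * DISTANCE_CM * math.tan(math.radians(HFOV_DEG / 2))
print(f"TV screen needed: {tv_width:.0f} x {tv_width / ASPECT:.0f} cm")

# 3) Desktop comparison: a hypothetical 27 inch 4K monitor at 60 cm.
mon_width = 27 * 2.54 * 16 / math.hypot(16, 9)
mon_hfov = 2 * math.degrees(math.atan(mon_width / 2 / 60))
print(f"27in 4K at 60 cm: {mon_hfov:.0f} deg wide, "
      f"{3840 / mon_hfov:.0f} px per degree")
```

That reproduces the ~69 px circle (about 0.01% of the frame is ever fully sharp), comes out at roughly 596 x 335 cm for the TV case, and shows a 27" 4K desktop monitor already spanning ~53 degrees at ~73 px per degree, which is why short viewing distances make the extra detail worthwhile.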
Mixing PC standards with TV Broadcasting standards is never a good idea.
So, it's a bad idea for companies to sell 4K monitors, 1080p monitors, and so on? The standards have already been mixed; the only notable recent exception is 1440p.
What doesn't make sense is a lack of standardization. Making HDTV and desktop monitor resolutions equivalent drives prices down through larger production volumes. (That is the one major good thing about 4K TV, although it does have significant drawbacks.) It also makes beneficial standards easier to implement. The difference between TVs and monitors has diminished recently and will continue to diminish now that plasma is dead.
The main issue now is input lag, particularly since 4K monitors and 4K TVs have become the standard, which gives console and PC gaming a single resolution to target for top-level performance. TVs also need to adopt variable refresh rate technology like FreeSync/G-Sync, faster pixel response times, strobing backlights, and so on for better gaming.
Going forward, desktop monitors will differentiate themselves through wider color gamuts (something TV standards should have focused on rather than 4K), better uniformity, anti-glare treatment (in some cases, especially for office use), and sophisticated calibration features like programmable hardware LUTs. For many others, low price will be the differentiator. LED-backlit LCD, though, is technology common to both HDTVs and monitors, which makes small details like those the deciding factors. How well OLED will hold up to monitor usage remains to be seen, especially since the 4K craze has left us with so many minuscule pixels. If image retention and burn-in turn out to be significant problems for OLED on the desktop because of 4K, that will certainly be a shame, since LCD can't match OLED's black level, at least not with a comparable viewing-angle range.
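For anyone unfamiliar with programmable hardware LUTs, here is a minimal sketch of the idea: the monitor stores a per-channel correction curve and remaps every incoming value through it, so the correction happens at the LUT's full precision instead of being crushed into the 8-bit signal leaving the GPU. The 2.4-to-2.2 gamma correction below is a made-up example target, not any particular monitor's firmware:

```python
import numpy as np

LUT_SIZE = 1024  # hardware 1D LUTs are commonly 10-14 bits per channel

# Hypothetical correction: linearize a native ~2.4 gamma panel response,
# then re-encode it to a desired 2.2 display gamma.
x = np.linspace(0.0, 1.0, LUT_SIZE)
lut = (x ** 2.4) ** (1.0 / 2.2)

def apply_lut(value: float) -> float:
    """Remap a normalized [0, 1] input through the stored curve."""
    index = round(value * (LUT_SIZE - 1))
    return float(lut[index])

print(apply_lut(0.5))  # mid-gray after correction, ~0.47
```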