4K or 8K or 16K.....
All of them are marketing hoaxes.
Manufacturers try to dazzle the "rich idiots".
Manufacturers try to earn MORE by selling INEFFICIENT technologies.
GPUs can't even keep up with 4K, and they already intend to offer 8K!
It is very clear that "THEY WANT YOUR MONEY", "THEY WANT MORE MONEY".
Apart from high-end gaming laptops, more than 80% of laptops still ship with a 1366x768 resolution.
Before 1080p ever became a standard, manufacturers started producing 4K, and now 8K.
1920x1080 has never become a standard.
1920x1080 should have become a standard before 4K technology arrived.
USB 3.0 was announced in 2008, yet manufacturers still offer USB 2.0 products.
USB 3.0 has never become a standard.
And USB 3.1 was announced before USB 3.0 ever became a standard.
And then there are cameras.
Canon had only one camera capable of recording 1080p at 60fps.
It was the 1D C.
And a few weeks ago the Canon EOS 7D Mark II was announced, also capable of recording 1080p at 60fps.
So the giant Canon has only two cameras capable of recording 1080p at 60fps.
Yet manufacturers now offer 4K cameras.
1080p 60fps video recording should have become a standard for cameras before 4K.
Before UHS-I became a standard, UHS-II was announced.
New technologies are released TOO FAST.
But they become the standard TOO SLOWLY.
Never forget: if a technology does not become a standard, it will always remain LUXURY and UNATTAINABLE.
A technology must become a standard before it can become AFFORDABLE.
I code on my desktop. 4K and 8K for me would be a massive improvement just in text quality. Sure, a few programs scale blurrily, but I can live with a few blurry programs while everything important (text editors, IDEs, browsers, the DE) scales properly.
Single GPUs can't drive 4K yet, but a pair of 970s in SLI can. There will be a bigger Maxwell chip soon, much like there was with Kepler. A single one of those should manage 60+fps at 4K, and a pair in SLI should manage 45+fps on triple-screen 4K. With 3 or 4 GPUs, 8K is entirely doable. Expensive, but doable. It's also potentially unnecessary, depending on how anti-aliasing (and the need for it) scales with pixel density.
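To put rough numbers on that, here is a back-of-the-envelope sketch in Python comparing pixel counts per frame. It assumes GPU load scales roughly linearly with pixel count, which is only an approximation, and the 120fps baseline is a made-up figure purely for illustration:

# Back-of-the-envelope pixel-throughput comparison.
# Assumes GPU load scales roughly linearly with pixel count, which is
# only an approximation (shaders, AA and memory bandwidth all scale
# differently in practice).

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

BASELINE_FPS_AT_1080P = 120  # hypothetical single-GPU figure

base_pixels = RESOLUTIONS["1080p"][0] * RESOLUTIONS["1080p"][1]

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    factor = pixels / base_pixels
    est_fps = BASELINE_FPS_AT_1080P / factor
    print(f"{name:>5}: {pixels / 1e6:5.1f} MP, {factor:4.1f}x the 1080p load, "
          f"~{est_fps:5.1f} fps if 1080p runs at {BASELINE_FPS_AT_1080P}")

4K works out to 4x the pixels of 1080p and 8K to 16x, which lines up with one big GPU handling 4K and needing 3 or 4 of them for 8K.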
Before the netbook era drove prices stupidly low (and stalled display improvements for 6-10 years... don't get me started on this crap), laptops had been increasing resolution steadily. Nowadays, 4K is entirely accessible on a laptop, and 1080p is (finally!) considered the baseline resolution by most reviewers and general users. I expect 1366x768 panels to die out completely in the next 2-3 years, especially now that Ultrabooks have breathed new life into the laptop market. Looking at the Steam hardware survey, the 1920x1080 category is both the largest and the fastest-growing. 1080p is very much the standard target resolution, and 4K will be the next one. Graphics cards show the same trend: they're all optimized for 1080p60 to 1080p120 and are now starting to target the 4K segment.
USB3 is already standardised; you just need to pay for it. USB2 products still exist because they are cheap. Take a thumbdrive as an example: you only need USB3 for a fast one, so why make a slow drive use more expensive USB3 parts and design time instead of just reusing existing USB2 parts? It's the same reason screens that only need single-link DVI ship with single-link DVI cables rather than dual-link DVI or DisplayPort, or why you don't put a Noctua NH-D15 on an Atom C2750.
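A quick sketch of the thumbdrive point. The sustained speeds below are ballpark assumptions for typical drives, not the 480 Mbit/s and 5 Gbit/s signalling rates on the box; the point is that the cheap drive's flash, not the port, is the bottleneck:

# Why USB3 only matters for fast drives: the flash in a cheap
# thumbdrive can't saturate USB2 anyway.
# Sustained speeds are ballpark assumptions, in MB/s.

FILE_SIZE_GB = 8  # e.g. a movie or a disk image

sustained_mb_per_s = {
    "cheap USB2 thumbdrive": 15,    # budget flash, nowhere near the USB2 ceiling
    "good USB2 thumbdrive":  35,    # close to the practical USB2 limit
    "good USB3 thumbdrive":  200,   # flash fast enough to actually need USB3
}

for drive, speed in sustained_mb_per_s.items():
    minutes = FILE_SIZE_GB * 1024 / speed / 60
    print(f"{drive:>22}: ~{minutes:4.1f} min to copy {FILE_SIZE_GB} GB")

Pairing USB3 with flash that can't exceed ~35 MB/s buys you nothing, which is why budget drives stay on USB2.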
Cameras couldn't do 4K and 8K or high FPS for a long time because of physics: you need to extract more detail from the same amount of light. The more detail you try to pull out of the same amount of light, the more noise becomes an issue, and that's unacceptable for anyone worth their salt. Sensors are only now good enough that cameras are shipping with them. Compare the RED One camera to the RED EPIC DRAGON. The sensor on the DRAGON is higher resolution (6K vs 4K) and, more importantly, has an SNR of 80dB vs the 66dB of the RED ONE. That SNR difference is what allows the DRAGON to hit 150fps at 4K (with the help of a shitton of extra light from spot lamps) while the ONE has to make do with only 30fps. Don't argue with physics, you will lose.

As for dSLR sensors, they are not geared towards video and consequently don't work anywhere near as well. Oh, and the storage on dSLRs is crap compared to a real video camera: SD/XQD/CF cards vs a raw digital feed to a real server with capture cards and RAIDed and/or PCIe SSDs. It's correspondingly more expensive.

And finally, to put things into perspective: cinema-grade 2K video currently runs at 250MB/s, or about 2Gbit/s, and that's after compression. Meanwhile Blu-rays have to make do with ~50Mbit/s at most due to space constraints. For that level of quality, forget about consumer gear; even top-end gear isn't fast enough for a lot of producers.
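To make the storage side concrete, here's a rough sketch. The bit depth, frame rate and sustained write speeds are illustrative assumptions, not the specs of any particular camera or card:

# Why raw high-res video needs serious storage: uncompressed bitrate
# vs what common media can sustain. All numbers are ballpark assumptions.

def raw_bitrate_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed video bitrate in Gbit/s."""
    return width * height * bits_per_pixel * fps / 1e9

# 4K at 30fps, 12-bit raw sensor data (before any compression)
raw_4k30 = raw_bitrate_gbps(4096, 2160, 12, 30)

sustained_write_gbps = {
    "UHS-I SD card":     0.7,   # ~90 MB/s
    "fast CF card":      1.2,   # ~150 MB/s
    "RAIDed PCIe SSDs": 16.0,   # ~2 GB/s, capture-rig territory
}

print(f"raw 4K30 (12-bit): {raw_4k30:.1f} Gbit/s")
for medium, gbps in sustained_write_gbps.items():
    verdict = "keeps up" if gbps >= raw_4k30 else "too slow"
    print(f"{medium:>17}: {gbps:5.1f} Gbit/s -> {verdict}")

# For scale: cinema-grade 2K at ~2 Gbit/s (post-compression) vs a
# Blu-ray's ~0.05 Gbit/s budget is roughly a 40x gap.
print(f"cinema 2K vs Blu-ray bitrate: ~{2 / 0.05:.0f}x")

The memory cards only cope because in-camera codecs compress heavily; a raw feed is exactly what the SSD-backed capture rigs above exist for.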
All in all, it's not that corps don't want steady improvements, but more that this thing called real world physics gets in the way of steady improvements.