# 8K A Great Challenge: NVIDIA and AMD



## btarunr (Oct 16, 2014)

Even as 4K Ultra HD (3840 x 2160) is beginning to enter the consumer mainstream, with 28-inch displays priced around $600, and Apple toying with 5K (5120 x 2880) in its next-generation iMac with Retina display, Japanese display maker Sharp threw a spanner in the works by unveiling a working prototype of its 8K (7680 x 4320 pixels) display at the CEATEC trade show in Japan.

Two of the industry's biggest graphics processor makers, NVIDIA and AMD, reacted similarly to the development, calling 8K "a great challenge." Currently, neither company has a GPU that can handle the resolution. 8K has four times as many pixels as 4K. Driving an Ultra HD display over DVI needs two TMDS links, and DisplayPort 1.2 and HDMI 2.0 have just enough bandwidth to drive Ultra HD at 60 Hz. To drive 8K, both NVIDIA and AMD believe you would need more than one current-generation GPU, the display would have to connect to the cards over independent connectors, and the system would somehow have to treat the single display as four Ultra HD displays. We imagine Sharp demoed its display at a very low refresh rate to compensate for the bandwidth limitation. After 10 years of Full HD tyranny, display resolutions are finally beginning to develop at their normal rate. It's time for GPU developers and display interconnects to keep up.
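For those curious about the numbers behind "just enough bandwidth", here is a back-of-the-envelope sketch in plain Python. It counts raw pixel data only and ignores blanking intervals and protocol overhead, so real link requirements run somewhat higher; the link rates used are the commonly cited effective payload figures for DisplayPort 1.2 and HDMI 2.0.

```python
def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s (pixel data only, no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_60 = raw_gbps(3840, 2160, 60)     # 4K Ultra HD at 60 Hz: ~11.9 Gbit/s
eightk_60 = raw_gbps(7680, 4320, 60)  # 8K at 60 Hz: ~47.8 Gbit/s

DP_1_2_GBPS = 17.28   # DisplayPort 1.2 effective payload (HBR2, 4 lanes)
HDMI_2_0_GBPS = 14.4  # HDMI 2.0 effective payload

# 4K60 squeezes under both links; 8K60 is several times beyond either one.
print(f"4K60: {uhd_60:.1f} Gbit/s, 8K60: {eightk_60:.1f} Gbit/s")
```

Even before blanking overhead, 8K at 60 Hz wants roughly three times what a single DisplayPort 1.2 link can carry, which is why the GPU makers talk about splitting the panel across multiple connectors.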







----------



## techy1 (Oct 16, 2014)

"Currently, neither company has a GPU that can handle the *4K* resolution" - that would be more correct... cuz there is no real price/performance increase over the last generations... just a bunch of rebrands, an efficiency and performance increase that was coupled with a price increase... in terms of 4K gaming we are just where we were in 2012, and that was: "expensive future - multi-GPU setup, and even then it is not guaranteed that gaming will be a solid 40 frames"... and now they are talking about 8K and running 5K on rebranded mobile GPUs (Apple) (facepalm)


----------



## RejZoR (Oct 16, 2014)

Full HD tyranny? Really? I have a 24-inch 1080p and I see absolutely nothing wrong with it. The only time I wanted 4K was when I hooked up a 42-inch LCD TV to my PC and played games on it at a distance of 0.5 m. It was awesome, because the screen was so big I could only see in-game stuff and not the room around it. But 1080p was really not enough for such a small distance on such a huge screen. For 24-inch screens, though, 4K is nearly completely unnecessary, apart from those 3 people who actually really need such a high resolution on such a small screen... So, stuffing 8K into anything smaller than 40 inches is absolute nonsense at any distance.


----------



## Naito (Oct 16, 2014)

Really!? I haven't even moved to 1440p yet! Even though I would love to!


----------



## erocker (Oct 16, 2014)

I'm fine with this. If it'll run 8K, it will run 4K and lower even better.


----------



## Frick (Oct 16, 2014)

Naito said:


> Really!? I haven't even moved to 1440p yet! Even though I would love to!



That is what needs to be done IMO. Sub €200 1440p monitor please.


----------



## Roel (Oct 16, 2014)

techy1 said:


> "Currently, neither company has a GPU that can handle the *4K* resolution" - that would be more correct...



A midrange GPU with DisplayPort handles 4K perfectly. You are only looking at framerates in demanding games, which is not what they meant.


----------



## capolavoro (Oct 16, 2014)

Naito said:


> Really!? I haven't even moved to 1440p yet! Even though I would love to!



I know that feel bro.. Still stuck with 1080p even though I have a GTX 690 .-.


----------



## Naito (Oct 16, 2014)

Reader said:


> 4K or 8K or 16K.....
> All of them are marketing hoaxes.
> Producers try to fascinate the "rich idiots".
> Producers try to earn MORE thanks to INEFFICIENT technologies.
> ...



There is nothing wrong with the _cutting edge_ of technology. Whether it's just a concept or demonstration or actually comes to fruition, technology needs to progress one way or another. Standards are often slow to implement, because you're pretty much trying to regulate entire industries, so it is not uncommon to see new standards announced before current ones are finalized. It can take many years.


----------



## natr0n (Oct 16, 2014)

I bet we will need 512-bit+ cards for 8K.


----------



## Sony Xperia S (Oct 16, 2014)

Reader said:


> 4K or 8K or 16K.....
> All of them are marketing hoaxes.



A dull take from you. Go and check your eyes, and once fully healthy, come again and say whether or not you see a dramatic image quality improvement with higher-PPI displays.

You need such big resolutions in order to deliver a crisp, clear retina-grade image.



Reader said:


> Producers try to fascinate the "rich idiots"..



Yeah, wondering which group you belong to.


----------



## Rojan (Oct 16, 2014)

I think it's not just the hardware we need; we also need the software. Any advancement in resolution (relative to pixel density) is futile unless the OS can adapt and scale to it properly. If Android, iOS and OS X can do it, why can't Windows? I mean, I bet Android is ready for 10" 2160p/2400p in terms of scaling.



RejZoR said:


> But for 24 inch screens, 4K is nearly completelly unnecessary apart from those 3 people who actually really need such high resolution on such small screen...



Not entirely. 24" 2160p is the same as 12" 1080p in pixel density (~183 PPI), not that ridiculous if you think about it. While this pixel density wouldn't be great for text reading, it would benefit a lot in gaming in terms of sharpness. Well, that's all I can think of.
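Rojan's density figure is easy to verify; a minimal sketch in plain Python (the `ppi` helper is mine, not a standard library function):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Halving the diagonal while halving each pixel dimension keeps density equal.
print(f'24" 4K:    {ppi(3840, 2160, 24):.1f} PPI')   # ~183.6
print(f'12" 1080p: {ppi(1920, 1080, 12):.1f} PPI')   # ~183.6, same density
```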


----------



## Sony Xperia S (Oct 16, 2014)

Rojan said:


> Not entirely. 24" 2160p is the same as 12" 1080p in pixel density (~183 PPI), not that ridiculous if you think about it. While this pixel density wouldn't be great for text reading, it would benefit a lot in gaming in terms of sharpness. Well, that's all I can think of.



Do not listen to them; they are evil people who try to spoil the advancement. I'm not gonna torture my eyes with low-resolution, low-pixel-density displays, because my eyes are strong enough to sense it.

I need my 24 inch 4K monitor NOWWWWW!!!


----------



## rooivalk (Oct 16, 2014)

> Producers try to earn MORE





> It is very clear that " THEY WANT YOUR MONEY", "THEY WANT MORE MONEY".


lol hippie, they're businesses, they want moar money.



> 1920X1080p resolution must have been a standard before 4K technology.


Standard here means _de facto standard_ = market dominance = probably not 100% adoption. It's not like SI vs. Imperial units, which shouldn't be used interchangeably (it's messy and prone to error) within one institution or project.
That's why there's also the term _competing standards_.



> Except the hi-end gaming laptops still more than 80% of the laptops offer 1366X768 pixel resolution.


1080p and 720p are both standards, dominant standards in their own area (big/small computer display).



> USB 3.0 has never become a standard.


It's a _new standard_. Again, a standard doesn't need 100% adoption. It could be 0% = a failed standard.



> Never forget that "If a technology does not become a standard it will always be LUXURY and UNATTAINABLE."
> A technology must become a standard to be able to be AFFORDABLE.


Fact: 1080p (your so-called 'not yet a standard') has become so cheap, you can find a decent one for less than a 15" 1024x768 cost years ago.



> Technological improvements are TOO FAST to be released.
> But Technological improvements are TOO SLOW to become the standard.


It could be, but in this case that's because of your skewed/forced definition of a standard in the display industry.


----------



## Tardian (Oct 16, 2014)

8K is unlikely to be feasible for gaming for some time. An immediate use is for photographers with 36 MP DSLRs such as the Nikon D810 or Sony A7R, or for those who use image stitching for large, high-quality photographs. Given how quickly 4K has become available to videographers (e.g. Panasonic, GoPro, etc.), 8K video may be available soon.


----------



## RejZoR (Oct 16, 2014)

There is no problem churning out a display with a 50K resolution. But you'll have a VERY hard time delivering enough compute power to work with it when most graphics cards don't even meet the 1080p performance levels I'd treat as adequate. If a modern game doesn't run at 150+ fps even at 1080p (well, some of us have high-speed 144 Hz monitors, you know), then doing 8K on the same graphics card would equal a PowerPoint slideshow. Thanks but no thanks.


----------



## jigar2speed (Oct 16, 2014)

RejZoR said:


> Full HD tyranny? Really? I have a 24-inch 1080p and I see absolutely nothing wrong with it. The only time I wanted 4K was when I hooked up a 42-inch LCD TV to my PC and played games on it at a distance of 0.5 m. It was awesome, because the screen was so big I could only see in-game stuff and not the room around it. But 1080p was really not enough for such a small distance on such a huge screen. For 24-inch screens, though, 4K is nearly completely unnecessary, apart from those 3 people who actually really need such a high resolution on such a small screen... So, stuffing 8K into anything smaller than 40 inches is absolute nonsense at any distance.



Couldn't agree more. I game on my 42" LED, and I do feel that at a 2-foot distance a 42" 4K screen is a viable solution, but at the same distance, if I am gaming on a 24-inch Full HD screen, it's more than enough.


----------



## Sony Xperia S (Oct 16, 2014)

jigar2speed said:


> Couldn't agree more. I game on my 42" LED, and I do feel that at a 2-foot distance a 42" 4K screen is a viable solution, but at the same distance, if I am gaming on a 24-inch Full HD screen, it's more than enough.



Do you understand what I'm telling you or not?

You have NO right to tell others what "enough" is, because your requirements and quality as a person may be lower than those of others.

Actually, you need to make the standard higher and if it is higher for you all right, but you CAN'T and SHOULDN'T make the standard lower for everyone because there will always be someone for whom it is NOT enough.

Now understand?


----------



## John Mellinger (Oct 16, 2014)

What is the point of thinking about 8K for PC gamers when you have Microsoft and Sony paying companies like Ubisoft to cripple the PC versions?

http://www.dsogaming.com/news/report-microsoft-sony-pressuring-ubisoft-for-30fps-on-pc/


----------



## RejZoR (Oct 16, 2014)

Sony Xperia S said:


> Do you understand what I'm telling you or not?
> 
> You have NO right to tell the others what "enough" is because your requirements and quality as a person may be lower than those others.
> 
> ...



It's not WE who are telling you what to use or what not to use; the framerate will forcibly tell you that. Unless you enjoy gaming at sub-20 fps or lowering game details. I know I don't...


----------



## ZoneDymo (Oct 16, 2014)

RejZoR said:


> Full HD tyranny? Really? I have a 24-inch 1080p and I see absolutely nothing wrong with it. The only time I wanted 4K was when I hooked up a 42-inch LCD TV to my PC and played games on it at a distance of 0.5 m. It was awesome, because the screen was so big I could only see in-game stuff and not the room around it. But 1080p was really not enough for such a small distance on such a huge screen. For 24-inch screens, though, 4K is nearly completely unnecessary, apart from those 3 people who actually really need such a high resolution on such a small screen... So, stuffing 8K into anything smaller than 40 inches is absolute nonsense at any distance.



you are the most peasanty of the pc masterrace yo


----------



## ZoneDymo (Oct 16, 2014)

John Mellinger said:


> What is the point of thinking about 8K for PC gamers when you have Microsoft and Sony paying companies like Ubisoft to cripple the PC versions?
> 
> http://www.dsogaming.com/news/report-microsoft-sony-pressuring-ubisoft-for-30fps-on-pc/



 better put on your tinfoil hat man


----------



## RejZoR (Oct 16, 2014)

ZoneDymo said:


> you are the most peasanty of the pc masterrace yo



We are the master race. The PC world did just fine when consoles were doing their own march in their own world. And consoles were doing just fine as well. As soon as they started porting shit from consoles to PC and consoles became the primary target platform (which is still, ironically, coded on PCs), everything went to, well, shit.


----------



## Sony Xperia S (Oct 16, 2014)

With this current philosophy from the major GPU makers, it's no wonder it's a great challenge.

They need to open their eyes to completely new techniques like Unlimited Detail.


----------



## Easy Rhino (Oct 16, 2014)

i'm sorry but all of your points are irrelevant. give me a fun game to play FIRST and then maybe i will consider if it requires a 4K or 8K resolution over my 1080p monitor. Focus on game play and fun, not on uber graphics.


----------



## de.das.dude (Oct 16, 2014)

i think they need to push higher resolutions on larger screens rather than small ones. 1080p on my 21.5" is perfectly acceptable.


----------



## GhostRyder (Oct 16, 2014)

I think the only problem is how much of this we can actually see past a certain point.

I mean, really, I thought 4K was going to be the limit of what we can perceive with our eyes, so the only reason I see these resolutions being better is for bigger screens getting higher pixel density, which could keep things a bit clearer. Though I think we are a ways off from 8K being feasible for gamers; 4K is doable with two top-end cards or more right now, but that is about the minimum.


----------



## 64K (Oct 16, 2014)

I can't imagine what kind of cards could handle 8K in gaming. That's a little over 33 million pixels. Equal to powering four 4K monitors or 16 1080p monitors. Apparently LG has an 8K TV though. I'm sure it's going to be very expensive for early adopters.
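64K's arithmetic checks out; a quick sanity check in plain Python:

```python
pixels_8k = 7680 * 4320       # 33,177,600 pixels, a little over 33 million
pixels_4k = 3840 * 2160       # 8,294,400
pixels_1080p = 1920 * 1080    # 2,073,600

# 8K is exactly four 4K panels, or sixteen 1080p panels, worth of pixels.
assert pixels_8k == 4 * pixels_4k == 16 * pixels_1080p
```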


----------



## nickbaldwin86 (Oct 16, 2014)

When I read that title I was like WOOT 8K, because 8 is bigger than 4. WO0T MOAR PPI!!!!

Then I started reading people's comments and went "Am I on TechPowerUp or am I on some annoying babies website"

Really, are you all afraid of the future?????

8K is amazing, it moves the future forward... we don't need faster cards to run 1080p, we need cards that will run 8K like butter; then you can play 4K without an issue, maxed out on the latest and greatest.

If I had the money I would have gotten a 4K monitor yesterday, and enough video cards to run it. I have been saving, and my next purchase for my PC is a new monitor... a 1440p 144Hz ROG Swift or a 4K monitor are my two options.

I don't know how anyone claims to be a high-end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up an Xbox One and plug it into your 720p TV.

Anyways, I am super excited and I can't wait to see what AMD and NV have in store for 8K.


----------



## Patriot (Oct 16, 2014)

Frick said:


> That is what needs to be done IMO. Sub €200 1440p monitor please.



You can get 1440p sub-$400 in the US, no problem.
I got my 1600p last year for $600.


----------



## RejZoR (Oct 16, 2014)

This "Unlimited Detail" has been nothing but vaporware consisting of a bunch of amazing videos and not a single real-time demonstration. I'll believe them when they run their magical stuff in front of the media (journalists) in real time. Until then, vaporware...


----------



## RCoon (Oct 16, 2014)

Films and GPUs haven't even made it to 4K yet. When I watch a 720p or 1080p YouTube clip on my 1440p monitor, it looks bad. When I watch a Blu-ray on my 1440p monitor, it reminds me of watching DVDs on an HD TV. The quality is fine, but the crispness is lacking when scaled up to those resolutions. 4K needs to be widely accepted (and supported) first, so video files look crisp again.

Then there's games: textures aren't compressible beyond what we currently see, and games are already bordering on the 50 GB mark. Imagine Steam games with 4K textures downloading for days at a time. Not to mention SSDs still aren't ready to be used as game storage devices, as their pricing still isn't right. GPUs can't push 4K at more than 45 FPS, and consoles most definitely can't push anything beyond 900p currently.

Not only are we severely unprepared for 8K, we're still catching up to get 4K ready within the next few years. Not to mention these panels simply aren't affordable. You need to wait for the tech at the bottom to catch up (consoles, GPUs, SSDs and internet speeds/bandwidth) before you start pushing even more pixels.


----------



## Animalpak (Oct 16, 2014)

I have a 1440p monitor, and I must say my 780 Ti handles it pretty well, but not exceptionally, like 1080p.

At 1440p, constantly staying over the 100 fps wall is not an easy task for a single GPU.

An SLI setup will do.


----------



## Disparia (Oct 16, 2014)

Good stuff.

Full-HD couldn't hold us down forever!


----------



## de.das.dude (Oct 16, 2014)

nickbaldwin86 said:


> When I read that title I was like WOOT 8k because 8 is bigger than 4 WO0T MOAR PPI !!!!
> 
> Then I started reading people's comments and went "Am I on TechPowerUp or am I on some annoying babies website"
> 
> ...



People like you make it easy for companies. You guys will literally buy anything, even if it doesn't make any sense.


----------



## Sony Xperia S (Oct 16, 2014)

RCoon said:


> internet speeds/bandwidth



Some countries don't have problems with their internet connections; actually, it is best where it is least expected to be.

Ask your stupid internet providers in the western world for higher speeds, to finally drive progress.


----------



## Hilux SSRG (Oct 16, 2014)

8K sounds good. This will push companies to develop better screens, technology, etc., moving the industry away from 1080p towards 1440p and ultimately 4K as the new standard.


----------



## Frick (Oct 16, 2014)

Sony Xperia S said:


> Some countries don't have problems with the internet connection, actually it is the best there where it is least expected to be so.
> 
> Ask your stupid internet providers in the western world for higher speeds finally to drive progress.



It's not that easy to lay fiber optics across entire continents... especially when many places are remote. In the cities and such it might not be a problem, but what about the suckers not living there? Sweden is generally very good in this regard, but on the edges people still can't get speeds faster than old phone connections. No wireless coverage either. And hell, if you're on a gigabit connection, your mechanical HDD won't keep up anyway. Which was the point: it's not just ONE bottleneck (even though internet speed is the lowest common denominator at this point), there are several.



Patriot said:


> You can get 1440p sub $400 in the US no problem.
> I got my 1600p last year for $600



€200 would mean $200 once taxes are accounted for. Give or take.


----------



## ZeDestructor (Oct 16, 2014)

Reader said:


> 4K or 8K or 16K.....
> All of them are marketing hoaxes.
> Producers try to fascinate the "rich idiots".
> Producers try to earn MORE thanks to INEFFICIENT technologies.
> ...



I code on my desktop. 4K and 8K for me would be a massive improvement just in text quality. Sure, a few programs scale blurrily, but I can live with a few blurry programs while everything important (text editors, IDEs, browsers, the DE) scales.

Single GPUs can't drive 4K in games yet, but 970s in SLI can. There will be a bigger Maxwell chip soon, much as with Kepler. A single one of those should be able to get 60+ fps at 4K, and a pair in SLI should be able to do 45+ fps on triple-screen 4K. With 3 or 4 GPUs, 8K is entirely doable. Expensive, but doable. Also potentially unnecessary, depending on how AA and the need for AA scale.

Before the netbook era drove prices stupidly low (and stalled display improvements for 6-10 years... don't get me started on this crap...), laptops had been increasing resolution steadily. Nowadays, 4K is entirely accessible on a laptop, and 1080p is (finally!) considered the baseline resolution by most reviewers and general users. I expect the 1366x768 size to die out completely in the next 2-3 years, especially now that Ultrabooks have breathed new life into the laptop market. Looking at the Steam hardware survey, the 1920x1080 category is the largest and growing the fastest. 1080p is very much the standard target resolution, and 4K will be the next target standard. Graphics cards also show that: they're all optimized for 1080p60 to 1080p120, and are now starting to target the 4K segment.

USB3 is already standardised. You just need to pay for it. USB2 products still exist because they are cheap. Taking a thumb drive as an example: you only need USB3 for a fast one, so why make a slow one use more expensive USB3 parts and design time instead of just reusing current USB2 parts? Same reason screens ship with single-link DVI and SL-DVI cables rather than DL-DVI or DP, or why you don't put a Noctua NH-D15 on an Atom C2750.

Cameras couldn't do 4K and 8K or high FPS for a long time because of physics: you need to get more detail out of the same amount of light. The more detail you try to extract from the same amount of light, the more noise becomes an issue, and that's unacceptable for anyone worth their salt. Sensors are finally good enough now that cameras are shipping with them. Compare the RED ONE camera to the RED EPIC DRAGON. The sensor on the DRAGON is higher resolution (6K vs 4K) and, more importantly, has an SNR of 80dB vs the 66dB of the RED ONE. The SNR difference is what allows the DRAGON to hit 150fps of 4K (with the help of a shitton of extra light from spot lamps) while the ONE has to make do with only 30fps. Don't argue with physics, you will lose. As for dSLR sensors, they are not geared towards video and consequently don't work anywhere near as well. Oh, and the storage on dSLRs is crap compared to a real video camera: SD/XQD/CF cards vs a raw digital feed to a real server with capture cards and RAIDed and/or PCIe SSDs. It's correspondingly more expensive. And finally, to put it into perspective, cinema-grade 2K video is currently at 250MB/s, or about 2Gbit/s. And that's after compression. Meanwhile, Blu-rays have to make do with ~50Mbit/s at most due to space constraints. For that level of quality, forget about consumer gear; even top-end gear isn't fast enough to cope for a large number of producers.

All in all, it's not that corps don't want steady improvements, but more that this thing called real world physics gets in the way of steady improvements.
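That final bitrate comparison reduces to a unit conversion; a quick sketch in plain Python, using the post's own round figures (and treating MB as 10^6 bytes, as those round numbers imply):

```python
# Cinema-grade 2K at 250 MB/s vs a ~50 Mbit/s Blu-ray ceiling.
cinema_2k_MBps = 250
cinema_2k_gbps = cinema_2k_MBps * 8 / 1000   # 250 MB/s * 8 bits = 2.0 Gbit/s
bluray_mbps = 50

# The cinema feed carries ~40x the data rate of a Blu-ray's ceiling.
ratio = (cinema_2k_gbps * 1000) / bluray_mbps
print(f"{cinema_2k_gbps} Gbit/s cinema vs {bluray_mbps} Mbit/s Blu-ray: {ratio:.0f}x")
```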


----------



## Rojan (Oct 16, 2014)

Sony Xperia S said:


> I need my 24 inch 4K monitor NOWWWWW!!!



UP2414Q, been out for months.



Easy Rhino said:


> i'm sorry but all of your points are irrelevant. give me a fun game to play FIRST and then maybe i will consider if it requires a 4K or 8K resolution over my 1080p monitor. Focus on game play and fun, not on uber graphics.



I'm sorry if someone is gonna be mad at what I'm about to say, but gameplay is really irrelevant to technical advancement in display resolution. I mean, it doesn't even have to be about video games at all! High PPI's most practical usage is gaming, but for sheer resolution it could be used in other scenarios (i.e. productivity, with more real estate without the use of multiple panels, and photography work).


----------



## Tallencor (Oct 16, 2014)

nickbaldwin86 said:


> I don't know how anyone clams to be a high end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up a XboXOne and plug it in to your 720p TV


Tell that to the pro CRT guys. I have a 1080p monitor and downsample some games instead of using AA. Although I don't profess to be a "high-end PC gamer", I do take pride in the equipment I have and find inventive ways of using it to its full potential. Your statement is unjust.


----------



## Bansaku (Oct 16, 2014)

nickbaldwin86 said:


> Then I started reading peoples comments and went "Am I on TechPowerUp or am I on some annoying babies website"
> 
> Really, are you all afraid of the future?????
> 
> ...




Nice attitude; sounds like you are the baby. Most people here at TPU are realists, like myself. Fact: 4K was only ever meant for broadcasting and media production. The push to 4K was not a logical step forward, but rather a push from manufacturers to reignite a slumping market with 'new' technologies. 3D flopped, smart TV is a joke, and 120Hz is so 2007. All it took was a Chinese company releasing a (sub-par) 4K consumer TV for the end user, like yourself, to say "OMG I WANT".

" After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up. "

This statement makes absolutely no sense. LCD panel manufacturers as well as the designers of modern GPUs have enough challenges with the manufacturing process and limitations of today's 'current' technology. It has nothing to do with politics (tyranny) and everything to do with feasibility. While market demand may push for a certain technology, they can't just 'do it' and make it work. We would all be driving flying cars and traveling to far-away solar systems if that were the case.

It takes two top-tier GPUs to effectively play a game at 4K with moderate settings at a decent frame rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that could fully render 8K with all of the rendering goodies?


----------



## Sony Xperia S (Oct 16, 2014)

Bansaku said:


> " After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up. "
> 
> This statement makes absolutely no sense. LCD panel manufacturers as well as the designers of modern GPUs have enough challenges with the manufacturing process and limitations of today's 'current' technology. It has nothing to do with politics (tyranny) and everything to do with feasibility. While market demand may push for a certain technology, they can't just 'do it' and make it work. We would all be driving flying cars and traveling to far-away solar systems if that were the case.



The usual political excuses. Just say it: YOU or the manufacturers don't want to do it, instead of this nonsense. You make it sound as if it is so difficult, even impossible, but people said 'impossible is nothing', and you can effectively replace your "feasibility" with 'stupidity' and be fully correct.

Just look at your smartphone, its display quality, PPI, image quality, etc., and you will understand what I mean, probably...


----------



## The Von Matrices (Oct 16, 2014)

I look forward to 8K.  With 8K we can finally get rid of subpixel text rendering (and its resultant color fringing) and anti-aliasing.


----------



## RadFX (Oct 16, 2014)

Bansaku said:


> It takes 2 top tier GPUs to effectively play a game at 4K with moderate settings at a decent frame-rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that can fully render 8K with all of the rendering goodies?



With all due respect, I game with a pair of Radeons (7970 GHz Edition and R9 280) at 1.1 GHz core and 1.5 GHz memory at 4K, and it's fine. There are but a few games that would be unplayable with this setup at max or near-max settings (no AA).


----------



## lemonadesoda (Oct 16, 2014)

8K is what we want, yes! 

This, with G-Sync and on-the-fly resolution scaling, would be perfect: 8K for desktop productivity with retina-quality typefaces, and perfect for photo editing, etc. Then, for gaming, the GPU would downscale and G-Sync to obtain the best refresh rate it can, always shooting for 60 fps+. By rendering at 2K and then upscaling to 8K, it is doable. To minimise the bandwidth, the upscaling would need to be done by the TFT.


----------



## ZeDestructor (Oct 16, 2014)

lemonadesoda said:


> 8K is what we want, yes!
> 
> This, with G-sync and on the fly resolution scaling, would be perfect. 8K for desktop, productivity with retina quality typefaces, and perfect for photoediting etc. then for gaming the GPU would down-scale and G-sync to obtain the best refresh rate it can always shooting for 60fps+.  By rendering at 2K, and then upscaling to 8K, it is doable.  To minimise the bandwidth, the upscaling will need to be done by the TFT.



Nah, have the GPU do the scaling and leave the screen with a really simple, really dumb controller for minimal latency. When actually pushing a signal from the GPU to the screen, the bandwidth matters little.


----------



## RealNeil (Oct 17, 2014)

Sony Xperia S said:


> Do you understand what I'm telling you or not?
> 
> You have NO right to tell the others what "enough" is because your requirements and quality as a person may be lower than those others.
> 
> ...



He said that if ~~he~~ was gaming,...................................Quoting him: "if i am gaming on a 24 inch Full HD screen it's more than enough".
So I took that to mean that it's good for himself, and that he wasn't dictating to others.

Is it possible that you're being a little sensitive or reactionary?
My standard is whatever I can afford at the time,...........but I really want it all, and I want it NOW!! 

Peace,.................


----------



## Fx (Oct 17, 2014)

Reader said:


> 4K or 8K or 16K.....
> All of them are marketing hoaxes.
> Producers try to fascinate the "rich idiots".
> Producers try to earn MORE thanks to INEFFICIENT technologies.
> ...



Yeah, screw it. I say let's go back to the 1024x768 days......

Seriously though: yes, technologies advance all the time. Yes, some technologies aren't all that great. Yes, companies want to make mad money. 4K is a step in the right direction, and isn't a gimmick. My eyes can easily perceive a difference; perhaps yours can't.


----------



## FordGT90Concept (Oct 17, 2014)

facepalm.jpg

There are two monumental problems with 8K:

#1: No cable can carry it. I have my doubts about whether DisplayPort can even be expanded to handle it.

#2: If #1 were solved and the workload were only 2D, GPUs could handle 8K today without a problem. Where the "monumental problem" comes in is that any load greater than desktop software is going to make any GPU croak at 8K, and the _only_ way to combat that is with more transistors, which means bigger chips, which means more power, which means more heat. Unless there is some breakthrough (perhaps Unlimited Detail Technology), displays are going to run away from graphics technology, because graphics can't scale at the rate LCD panels do.

The demand for these panels is coming from the film and TV industry, where the GPU's only task is to render video frames, not process massive amounts of triangles. I don't think gaming will see reasonable 4K for a long time, never mind 8K. These things are mostly for film enthusiasts and professionals, not gamers. Games will have to be played at a lower-than-native resolution for acceptable frame rates.

Oh, and speaking of the film industry, HDMI is going to have to be kicked to the curb, and a new standard (probably adapted from DisplayPort) will have to replace it to handle 8K. Phasing out HDMI in favor of something newer is going to take a very long time.


----------



## ZeDestructor (Oct 17, 2014)

FordGT90Concept said:


> facepalm.jpg
> 
> There's two monumental problems with 8K:
> 
> ...



#1: DisplayPort can do ~25 Gbit/s right now, and according to the DisplayPort page on Wikipedia, 8K 24-bit @ 60 Hz lies around 50-60 Gbit/s. For 8K 30-bit @ 120 Hz, 125-150 Gbit/s should be the bandwidth we're looking at. CAT-8 cabling (4-pair, 8 wires) is currently being finalized to provide 40 Gbit/s over 100 m. DP is a 4-lane cable (one "pair" per lane). Using CAT-8-grade cabling with the right transceivers over the usual 5 m max needed length of DP cabling (a 5 m CAT-8 cable should be good for at least 100 Gbit/s), 8K is perfectly feasible with current tech. Expensive, but feasible. Hell, odds are that CAT-8 cabling will be good for 100 Gbit/s over the full 100 m length thanks to new electronics... Pennsylvania State University people theorized in 2007 that 32 or 22 nm circuits would do 100 Gbit/s over 100 m of CAT-7A.

#2 Most of us care about 8K for productivity, not games. I for one am happy with the current 2K screens for games, but olawd the text is nasty. VR will push things a lot harder, though: the Oculus people want more than 8K. I personally doubt screens will scale beyond 8K, too. It's already stalling in phones, where 5" 440 ppi screens are considered by almost everyone to be more than necessary for most people.

And HDMI can go diaf. It's a shitty hack based off DVI, with a licensing fee (wuuut, with free-to-use DisplayPort also around?!) and limited to external interfaces only. Personally, the only place I need HDMI is TVs. If TVs had DisplayPort inputs, I wouldn't need HDMI at all!


----------



## FordGT90Concept (Oct 17, 2014)

But the infrastructure is not built for CAT-8 (CAT-6 is still kind of rare); it is built for DisplayPort and especially HDMI, both of which use smaller-gauge cable and more wires.  No spec presently can handle 8K without multiple cables.  The industry needs to decide whether higher and higher resolution displays will be the new norm, and if so, create a standard that can respond in kind (think DVI with single and dual link: the standard had room for expansion even though few displays needed it).


----------



## ZeDestructor (Oct 17, 2014)

FordGT90Concept said:


> But the infrastructure is not for CAT-8 (CAT-6 is still kind of rare).  It is for DisplayPort and especially HDMI.  Both of which use smaller gauge cable and more wires.  There are no specs presently that can handle 8K without multiple cables.  The industry needs to decide if higher and higher resolution displays will be the new norm and if so, they need to create a standard that can respond to it in kind (think DVI with single and dual-link: standard had room for expansion even though few displays needed it).



My point was more that, in terms of bandwidth, the tech is here and largely usable (much like Ethernet, and even more closely PCIe, DisplayPort uses a packet-based architecture, which is also why I chose Ethernet as the example). The standard needs a bit of updating, but not as much as you would think. If copper cabling becomes an issue, fibre will replace it, but DisplayPort at its core will likely remain unchanged (besides allowing higher bit rates within the protocol). This is also the most likely place where 8K gets stuck: deciding whether the world is ready to jump to all-fibre. I for one am all for it.

Oh, and high-res displays will become the norm: the industry has just found a new cash cow, and engineers are more than happy to have fun designing that stuff!


----------



## JDG1980 (Oct 17, 2014)

Bansaku said:


> It takes 2 top tier GPUs to effectively play a game at 4K with moderate settings at a decent frame-rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that can fully render 8K with all of the rendering goodies?



It isn't all about gaming. Once you've worked with a high-DPI smartphone or tablet, the text reproduction on a standard PC monitor looks crude and blurry in comparison. And the only reason it even looks as good as it does is a lot of terrible hacks, like font hinting and sub-pixel AA. Once screens move to 300 PPI and beyond, those hacks can be done away with. We will finally have print-quality reproduction on a monitor screen with no compromises.


----------



## ZeDestructor (Oct 17, 2014)

JDG1980 said:


> It isn't all about gaming. Once you've worked with a high-DPI smartphone or tablet, the text reproduction on a standard PC monitor looks crude and blurry in comparison. And the only reason it even looks as good as it does is because of a lot of terrible hacks, like font hinting and sub-pixel AA. Once screens move to 300 PPI and beyond, these hacks can be done away with. We will finally have print-quality reproduction on a monitor screen with no compromises.



Font hinting, sub-pixel rendering and AA are all amazing; they just work much better on high-DPI screens than on low-DPI screens, as evidenced by all manner of devices with high-DPI screens (they all still do font smoothing).


----------



## Katanai (Oct 17, 2014)

Most of the posts here are simply wrong. You're viewing 8K from too narrow a PC-centric perspective. 8K is first a video format resolution, and from that perspective there is nothing wrong with it. There are 8K cameras right now that take awesome footage, the best-looking footage ever shot, comparable only to the large IMAX cameras. Hook one of those 8K cameras up to an 8K display and it can drive that display at that resolution perfectly. Not at a low refresh rate, as the writer of the article imagined: 8K cameras can record at 120 frames per second and, if the monitor supports it, display that footage at 120 Hz. Any 8K monitor that exists right now (and there are a few; this is not the first one) can display 8K footage at 60 Hz without any problems. When this resolution will be available to PC users as a viable alternative, I don't know...


----------



## ZeDestructor (Oct 17, 2014)

Katanai said:


> Most of the posts here are simply wrong. You're viewing 8K from too narrow a PC-centric perspective. 8K is first a video format resolution, and from that perspective there is nothing wrong with it. There are 8K cameras right now that take awesome footage, the best-looking footage ever shot, comparable only to the large IMAX cameras. Hook one of those 8K cameras up to an 8K display and it can drive that display at that resolution perfectly. Not at a low refresh rate, as the writer of the article imagined: 8K cameras can record at 120 frames per second and, if the monitor supports it, display that footage at 120 Hz. Any 8K monitor that exists right now (and there are a few; this is not the first one) can display 8K footage at 60 Hz without any problems. When this resolution will be available to PC users as a viable alternative, I don't know...



I'm expecting 8K displays in the next 24 months myself.

We have 5K screens already.


----------



## Katanai (Oct 17, 2014)

ZeDestructor said:


> I'm expecting 8K displays in the next 24months myself.
> 
> We have 5K screens already



The writer of this article is mainly responsible for the confusion here, as he presented this as something completely new. He should have researched it a bit more before posting. Here is a clip from 2011 showing a working 8K display at 60 Hz, lol.


----------



## FordGT90Concept (Oct 17, 2014)

Using what interface?  I came across this, and if the cable coming out is the one I think it is, it looks proprietary (almost like SATA).  I'm also pretty sure it is encoding via HEVC from 30 Gbps down to 88 Mbps, which is sent to the display and decoded to 4K.
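If those figures are right, the implied compression ratio is enormous; a quick back-of-envelope check (the numbers are taken straight from the post above, so treat them as assumptions):

```python
# Implied HEVC compression ratio for the quoted figures.
raw_gbps = 30.0      # uncompressed camera feed, Gbit/s
encoded_mbps = 88.0  # HEVC-encoded stream, Mbit/s

ratio = raw_gbps * 1000 / encoded_mbps
print(f"~{ratio:.0f}:1")  # roughly 341:1
```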


----------



## Katanai (Oct 17, 2014)

FordGT90Concept said:


> Using what interface?  I came across this, and if the cable coming out is the one I think it is, it looks proprietary (almost like SATA).  I'm also pretty sure it is encoding via HEVC from 30 Gbps down to 88 Mbps, which is sent to the display and decoded to 4K.



I don't know what interface is being used, but here is an 8K 120 Hz panel from LG.


----------



## FordGT90Concept (Oct 17, 2014)

They have to be using multiple HDMI/DisplayPort or something proprietary.


----------



## ZeDestructor (Oct 17, 2014)

Katanai said:


> The writer of this article is mainly responsible for the confusion here as he presented this as being something completely new. He should have researched this a bit more before posting this. Here is a clip from 2011 showing a working 8K display at 60Hz lol.



10bit too! HNNNG!



FordGT90Concept said:


> Using what interface?  I came across this, and if the cable coming out is the one I think it is, it looks proprietary (almost like SATA).  I'm also pretty sure it is encoding via HEVC from 30 Gbps down to 88 Mbps, which is sent to the display and decoded to 4K.



Looks like twin-axial cabling sending raw uncompressed frames (at 30 Gbit/s) to a splitter box that converts them into 17 much smaller frame strips, sent as SDI signals to the encoders.

PS: SATA 3.0 (6Gbit/s) uses twin-axial cabling for the differential pairs, hence the similarity.


----------



## ZeDestructor (Oct 17, 2014)

FordGT90Concept said:


> They have to be using multiple HDMI/DisplayPort or something proprietary.



Nope, just twin-ax from the camera to the splitter box. Yes, 30 Gbit/s over a single pair.

You commonly see twin-ax in 40 and 100 Gigabit Ethernet for very short runs (7 m or less) that don't require fibre, running at either 10.3125 Gbit/s per lane (one twin-ax cable per lane) for 100GBASE-CR10, or 25.78125 Gbit/s per lane for 100GBASE-CR4.


----------



## FordGT90Concept (Oct 17, 2014)

Still a proprietary cable.  Cable standards need to catch up to the proposed display standards.


----------



## ZeDestructor (Oct 17, 2014)

FordGT90Concept said:


> Still a proprietary cable.  Cable standards need to catch up to the proposed display standards.



That was a professional video camera. Those always use strange standards, yet are almost completely interoperable; SDI and HD-SDI, for instance, are common interconnects there, while HDMI and DisplayPort are used for nothing besides connecting TVs and computer monitors. The only possibly proprietary cable in that video is the camera-to-splitter-box cable. Everything else will be running on other standards that consumers never interact with.

Consumer cabling will follow later, with a lot less reliability features, much like SATA vs SAS.


----------



## The Von Matrices (Oct 17, 2014)

ZeDestructor said:


> #1 Diplayport Can do ~25Gbit/s right now, and according to the DiaplyPort page on wikipedia, 8K*24bit@60Hz lies around 50-60Gbit/s. For 8K*30bit@120Hz, 125-150Gbit/s should be the bandwidth we're looking at. CAT-8 cabling (4-pair, 8wires) is currently being finalized to provide 40Gbit/s, over 100m. DP is a 4-lane cable (one "pair" per lane). Using CAT-8 grade of cabling with the right transceivers over the usual 5m max needed length (a 5m CAT-8 cable should be good for at least 100Gbit/s) of DP cabling, 8K is perfectly feasible with current tech. Expensive, but feasible. Hell, odds are that CAT-8 cabling will be good for 100Gbit/s over the full 100m length thanks to new electronics... Pennsylvania State University people theorized that 32 or 22nm circuits will do 100Gbit/s over 100m of CAT-7A in 2007.



You're looking at the wrong side of the issue.  The cables can be cheap, but the transceivers are what make things expensive.  Look at Thunderbolt: the transceivers, not the cable between them, are why you can't get a Thunderbolt cable for less than $30, no matter how short it is.  Similarly, I would love to have 10Gb Ethernet connecting me to the storage server in my house, but I can't justify paying $150 for each 10GBase-T NIC.


----------



## ZeDestructor (Oct 18, 2014)

The Von Matrices said:


> You're looking at the wrong side of the issue.  The cables can be cheap, but the transceivers are what make things expensive.  Look at Thunderbolt: the transceivers, not the cable between them, are why you can't get a Thunderbolt cable for less than $30, no matter how short it is.  Similarly, I would love to have 10Gb Ethernet connecting me to the storage server in my house, but I can't justify paying $150 for each 10GBase-T NIC.



Fair point, although I think that the transceiver issue can be solved fairly easily by scaling production up.

Also, where do you find a 10GBASE-T NIC for $150?! The lowest I've seen an X540-T1 go for was around the $300 mark... and then I'd need to get a new switch...


----------



## Sony Xperia S (Oct 18, 2014)

Rojan said:


> UP2414Q, been out for months



Good, but I realised I don't want or need this brand, and it's expensive at $700. Meh 

*I need my 22-inch 4K monitor for $300 NOWWW!!!*


----------



## The Von Matrices (Oct 18, 2014)

ZeDestructor said:


> Fair point, although I think that the transceiver issue can be solved fairly easily by scaling production up.
> 
> Also, where do you fin a 10GBASE-T NIC for $150?! The lowest I've seen an X540-T1 was around the $300 mark... and then I need to get a new switch...


You're right in that NICs cost closer to $300 than the $150 I stated.  The cheapest NIC I could find is a Marvell-based StarTech card that costs $284.  The switches, though, are relatively cheap per port compared to the NICs; an 8-port switch costs $824, or $103 per port.


----------



## ZeDestructor (Oct 18, 2014)

The Von Matrices said:


> You're right in that NICs cost closer to $300 than the $150 I stated.  The cheapest NIC I could find is a Marvell-based StarTech card that costs $284.  The switches, though, are relatively cheap per port compared to the NICs; an 8-port switch costs $824, or $103 per port.



Not bad..

Though I think I'll be waiting another 2-5 years before I move up. If I'm buying completely new, I want something quiet; if second-hand, at least 24 ports.


----------



## anonymous6366 (Oct 18, 2014)

First off, 1080p 120Hz > 4K 60Hz.
As for 8K, well, as other people said, there isn't a video card that could handle it.
I would also agree with the comments about this being a marketing strategy to trick rich people who don't know anything and think the most expensive product is always the best (I know people like this). I'm still in the stone age with a 1680x1050 monitor, so what do I know, lol.


----------



## ZeDestructor (Oct 18, 2014)

anonymous6366 said:


> First off, 1080p 120Hz > 4K 60Hz.
> As for 8K, well, as other people said, there isn't a video card that could handle it.
> I would also agree with the comments about this being a marketing strategy to trick rich people who don't know anything and think the most expensive product is always the best (I know people like this). I'm still in the stone age with a 1680x1050 monitor, so what do I know, lol.



I have better vision than most people (excluding people with glasses), and all I want is nicer text. 4K is just an in-between step IMO; 8K is where I would stop, for screens. VR needs higher... Seriously, just look at normal 8pt text on a 100% (100 dpi/ppi) scale screen at 800% magnification... so horrible, and I can see the horribleness on a normal 24" 1920x1200 (100 ppi) screen 

As for the marketing, well, it's marketing.. It's always gonna be aimed at rich people with more money than sense


----------



## Sony Xperia S (Oct 18, 2014)

ZeDestructor said:


> As for the marketing, well, it's marketing.. It's always gonna be aimed at rich people with more money than sense



Marketing is not specifically aimed at a given class, be it the poor who have just enough for something or the rich looking for somewhere to waste their money; it is invented to show products in a better light than they deserve.

OK, classically it should just showcase the product, but nowadays it's more like imagination and fantasy...


----------



## Hood (Oct 18, 2014)

Naito said:


> Really!? I haven't even moved to 1440p yet! Even though I would love to!


I took the plunge a few months ago and ordered a Crossover 27QW Perfect Pixel model from NewEgg (about $400).  It looks amazing. The 27" 2560x1440 IPS panel is a definite step up from 1080p, and games have no problem even with my single GPU (660 Ti).  These monitors ship direct from South Korea and arrive in 3 days.  No dead pixels, and it's still working perfectly.  Prices have dropped to about $350.  The panels used are the same Samsung or LG panels found in the expensive name brands ($700-$900).  The stand isn't adjustable, but that was easily remedied by setting it on a 3 inch riser which doubles as a place to slide my keyboard under (for more desk space).


----------



## johnspack (Oct 19, 2014)

Funny, I'm sure these arguments were had over 1080p a few years ago in here.  Eventually 4K will be the norm, and further down the line, 8K.  I'm still pissed over the 1200p thing; why wasn't that the norm?  Video cards will catch up, 4K monitors will become standard, and we'll all be laughing at the old days when we only had 1080p.  Let's allow advancement, so it becomes the norm...


----------



## newconroer (Oct 19, 2014)

*" and somehow treat the single display as four Ultra HD displays. " *This is the concerning part... and 4k faced it already until I think Iiyama figured out a resolution ..
*
*


johnspack said:


> Funny,  I'm sure these arguments were had over 1080 a few years ago in here.  Eventually 4k will be the norm,  and further down the line 8k.  I'm still pissed over the 1200p thing,  why wasn't that the norm?  Video cards will catch up,  4k monitors will become standard,  and we'll all be laughing at the old days when we only had 1080p.  Let's allow advancement,  so it will become the norm.......



Well, quite a few of us have been laughing at 1080p for a long time now. The computer graphics and monitor market is to the audio-visual industry what Formula 1 is to the auto industry: all the cutting-edge stuff influences what becomes common in the marketplace.

The problem with 1080p is that it's outlived its welcome and would have been replaced by 1440p/1600p if television manufacturers and broadcast networks weren't so lazy or behind the times. It also doesn't help that the popular console systems have only just now gotten to 1080p.

1080p needs to die, and die quick.


----------



## nexus_a (Oct 19, 2014)

newconroer said:


> *" and somehow treat the single display as four Ultra HD displays. " *This is the concerning part... and 4k faced it already until I think Iiyama figured out a resolution ..
> *
> *
> 
> ...


1080p is like XP, it will be lingering around for the next 10 years. Sadly.


----------



## xenocide (Oct 20, 2014)

johnspack said:


> Funny,  I'm sure these arguments were had over 1080 a few years ago in here.  Eventually 4k will be the norm,  and further down the line 8k.  I'm still pissed over the 1200p thing,  why wasn't that the norm?  Video cards will catch up,  4k monitors will become standard,  and we'll all be laughing at the old days when we only had 1080p.  Let's allow advancement,  so it will become the norm.......


 
There were 2560x1600 CRTs, if I recall correctly.  1080p was never really new for PC monitors the way it was for televisions (which used to be essentially 640x480).  What was a huge jump for televisions was actually a massive stagnation for PC resolutions.  The introduction of LCDs caused a lot of problems in the PC world: we had shitty TN panels for quite a while, and they are only just catching up to where CRTs left off.  With 4K we're not talking about a slight bump, we're talking about a massive increase in the amount of power needed (going from 1080p to 4K is a 4x increase in the number of pixels), and going to 8K is another gigantic leap.
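The pixel counts bear that out; plain arithmetic, nothing assumed:

```python
# Total pixel counts for the resolutions under discussion,
# expressed as multiples of 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h // base}x 1080p)")
# 4K is 4x the pixels of 1080p; 8K is 4x again, i.e. 16x 1080p.
```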


----------



## ZeDestructor (Oct 20, 2014)

nexus_a said:


> 1080p is like XP, it will be lingering around for the next 10 years. Sadly.



1280x???? is still around for a lot of people, and that is at least 10 years old.


----------



## Frick (Oct 20, 2014)

newconroer said:


> 1080p needs to die, and die quick.



Naah, it's 1440p/1600p that needs to be adopted. 1080p is fine on 24-inch monitors. Put 1440p on those and most people would have to mess with scaling, which Windows 7 doesn't do very well. Many programs suck at it too.


----------



## jihadjoe (Oct 20, 2014)

The Von Matrices said:


> I look forward to 8K.  With 8K we can finally get rid of subpixel text rendering (and its resultant color fringing) and anti-aliasing.



This!

A producer at the BBC actually wanted to skip 4K and have us go directly to 8K, because 8K was "retina" for large screens. No more FSAA required. I believe the same producer was behind the BBC/NHK SHV broadcast of the 2012 London Olympics.

SHV = 8K video + 22.2 audio


----------



## newconroer (Oct 20, 2014)

Frick said:


> Naah, it's 1440/1600p that needs to be adopted. 1080p is fine on 24 inch monitors. 1440p on those and most people would have to mess with scaling, and Windows 7 doesn't do that very good. Many programs suck at it too.


Under normal circumstances and growth cycles you'd be correct, but it seems like 1080p is just holding progress back at this point.
I'm not certain we can push 1440/1600 AND continue to foster 1080p.


Either way it's good we're talking about it.


----------



## ZeDestructor (Oct 21, 2014)

newconroer said:


> Under normal circumstances and growth cycles you'd be correct, but it seems like 1080p is just holding progress back at this point.
> I'm not certain we can push 1440/1600 AND continue to foster 1080p.
> 
> 
> Either way it's good we're talking about it.



1080p will move down to the mid-range, so the GTX *60 and Radeon **70 will take over for high-quality, single-screen 1080p, while the high end moves gradually to 8K. GPU makers are aware of where the high end is going and are working on it, but it takes time to build new archs and get onto new nodes.


----------



## DayKnight (Oct 21, 2014)

I would jump to 1440p, but lol, no monitor in my country goes above 1080p.

4K should become the norm pretty soon. 8K will take 4K's place (the wow factor) in a year or two, IMO.


----------



## xenocide (Oct 22, 2014)

DayKnight said:


> 4k should become a norm pretty soon. 8k will take 4k's place (wow factor) in a year or two, IMO.


 
You're insane if you believe that.


----------



## DayKnight (Oct 22, 2014)

xenocide said:


> You're insane if you believe that.



I am? Which part don't you agree with?

The way technology moves, I DO think 4K will become the norm. So many GPUs already support 4K.


----------



## xenocide (Oct 22, 2014)

DayKnight said:


> I am? Which part don't you agree with?
> 
> The way technology moves, I DO think 4K will become the norm. So many GPUs already support 4K.


 
4K has been in the works for 3-5 years already, and is probably 3 more years from legitimate adoption.  As it stands now it's prohibitively expensive, and the hardware isn't available to make it shine (I don't consider $1200-1500 worth of GPUs for decent performance acceptable).  It probably won't be on par with 1080p until 2018-ish, and won't see widespread acceptance until 2020-ish.  So when all is said and done, it will have taken at least 10 years for 4K to really catch on, and you think jumping to 8K will only take 1-2 years after that?

There's a massive difference between something being supported and being viable.  Hardware has supported ray tracing and voxels for decades now, but only in recent years has it become usable, and even then in very niche scenarios.


----------



## DayKnight (Oct 22, 2014)

xenocide said:


> 4K has been in the works for 3-5 years already, and is probably 3 more years from legitimate adoption.  As it stands now it's prohibitively expensive, and there isn't the hardware available to make it shine (I don't consider $1200-1500 worth of GPU's for decent performance acceptable).  It probably won't be on par for 1080p until 2018ish, and won't see wide spread acceptance until 2020ish.  So when all is said and done it will have taken at least 10 years for 4K to really catch on, and you think jumping to 8K will only take 1-2 years after that?



Let's wait and watch, then. 

Edit: Please stop exaggerating. 2x GTX 980 = $1100; 2x GTX 970 = $700.


----------



## Prima.Vera (Oct 22, 2014)

newconroer said:


> .., but it seems like 1080p is just holding progress back and this point.
> I'm not certain we can increase 1440/1600 AND continue to foster 1080p.


 
Actually, it's not entirely the monitor makers' fault; it's the GPU makers' too. Right now you can barely play at 1440p with the top video cards, so demand for 1440p monitors is very low. Until we get the same or better performance at 1440p as we do at 1080p, we won't see any changes happening soon...


----------



## xenocide (Oct 22, 2014)

DayKnight said:


> Edit: Please stop exaggerating. 2xGTX 980=1100$. 2xGTX 970= 700$.


 
GTX 980s in SLI still push under 60 fps in a modern game like Battlefield 4 at 4K, without anti-aliasing and at high-ish settings.  There are also the issues of microstutter and frame times when using SLI setups.  You would need something like 3 GTX 970s or 980s to play at a steady 60 fps at 4K.


----------



## DOA (Nov 15, 2014)

I LOL when people lust after 1440p; 4K is what to buy now.
You can pay a LOT more by making this two steps and buying two monitors to get to 4K, or you can recognize that TV resolutions drive the monitor market and buy 4K now. Asus makes a great 60 Hz 4K monitor that plays well at lower resolutions until your GPU(s) catch up to 4K. Some may see the difference between 60 Hz and higher frequencies (the only reason to buy 1440p instead of 4K); I cannot.
As for 8K: probably not a desktop resolution for a long time. 4K is noticeably better than 1080p or even 1440p on a 30-inch monitor at 2 ft. From 4K to 5K and above, the difference is not noticeable. Perhaps larger monitors or full-360° headsets will be the norm in the distant future, and 8K will be needed then.
Rich boys' toys? Why are you here if you are not willing to prioritize your computing power?
BTW, two 290Xs with a mild OC push 4K quite nicely.


----------



## Steevo (Nov 15, 2014)

DOA said:


> I LOL when people lust after 1440, 4K is what to buy now.
> You can pay a LOT more by making this two steps and buying two monitors to get to 4K or you can just realize TV resolutions drive the monitor market and buy 4K now. Asus makes a great 60Hz 4K monitor that plays well at lower resolutions until your GPU(s) catch up to 4K. Some may see the difference between 60Hz and higher frequencies (the only reason to buy  1440 instead of 4K), I cannot.
> As for 8K - probably not a desktop resolution for a long time. 4K is noticeably better on a 30 inch monitor at 2 ft than 1080P or even 1440. From 4K to 5K and above is not noticeable. Perhaps larger monitors or full 360 view headsets will be the norm in the distant future and 8K will be needed.
> Rich boys toys? Why are you here if you are not willing to prioritize your computing power?
> BTW, two 290X with mild OC pushed 4K quite nicely.


It's been more about getting the quality of display we are looking for. A cheap 4K offers much worse color reproduction, and a tiled display gives the timings a chance to be slightly off, resulting in tearing; topped off with mediocre frame rates because the refresh rates of many displays were too low, plus a lack of support from games, so the image stretch done by the display produces artifacts, or more GPU power is needed to scale it correctly.

We went through much the same issues moving from great-quality CRTs to LCD displays. I had a tough time justifying an LCD when I had a nice CRT with great color, scaling without artifacts, wide viewing angles, and an 85 Hz refresh rate without ghosting.


----------



## DOA (Nov 16, 2014)

I agree, Steevo; those were problems until Asus put out its gamer 4K, 1 ms grey-to-grey, and to my eyes the color is perfect.
http://www.tomshardware.com/reviews/asus-pb287q-4k-monitor,3832.html
I still go back to 1080p for Counter-Strike so I get a constant 60 Hz, but all other games I play at 4K.


----------



## ZeDestructor (Nov 17, 2014)

Steevo said:


> Its been more about getting the quality of display we are looking for. a cheap 4K offers much worse color reproduction, and tiled display gives it a chance for the timings to be slightly off resulting in tearing, topped off with mediocre frame rates as the refresh rates of many displays were too low, and the lack of support from games, so the resulting image stretch done by the display results in artifacts, or more GPU power requirements to scale it correctly.
> 
> We went through much of the same issues moving from great quality CRT's to LCD displays, I had a tough time justifying a LCD for use when I had a nice CRT that had great color, scaling without artifacts, view angle, and 85Hz refresh rate without ghosting.



Give it a few more months. We'll see single-tile 4K wide-gamut IPS panels then.


----------

