Thursday, October 16th 2014
8K A Great Challenge: NVIDIA and AMD
Even as 4K Ultra HD (3840 x 2160) begins to enter the consumer mainstream, with 28-inch displays priced around $600, and Apple toys with 5K (5120 x 2880) for its next-generation Retina iMac desktops, Japanese display maker Sharp threw a spanner in the works by unveiling a working prototype of its 8K (7680 x 4320 pixels) display at the CEATEC trade show in Japan.
Two of the industry's biggest graphics processor makers, NVIDIA and AMD, reacted similarly to the development, calling 8K "a great challenge." Currently, neither company has a GPU that can handle the resolution. 8K packs four times as many pixels as 4K. Driving an Ultra HD display over DVI needs two TMDS links, and DisplayPort 1.2 and HDMI 2.0 have just enough bandwidth to drive Ultra HD at 60 Hz. To drive 8K, both NVIDIA and AMD believe you would need more than one current-generation GPU, with the display connected to each card over independent connectors and somehow treated as four Ultra HD displays. We imagine Sharp demoed its display at a very low refresh rate to compensate for the bandwidth limitation. After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up.
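To put rough numbers on the bandwidth problem, here is a quick back-of-the-envelope check in Python. It is a simplified sketch: the effective link rates for DisplayPort 1.2 and HDMI 2.0 are approximations, and blanking overhead is ignored, so real requirements are somewhat higher.

# Raw pixel data rate vs. approximate effective link throughput.
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

links = {
    "DisplayPort 1.2 (4 lanes HBR2, after 8b/10b)": 17.28,  # approx. effective Gbit/s
    "HDMI 2.0 (after 8b/10b)": 14.4,                        # approx. effective Gbit/s
}
modes = {
    "4K UHD @ 60 Hz": video_bandwidth_gbps(3840, 2160, 60),  # ~11.9 Gbit/s
    "8K @ 60 Hz": video_bandwidth_gbps(7680, 4320, 60),      # ~47.8 Gbit/s
}
for mode, need in modes.items():
    for link, cap in links.items():
        verdict = "fits" if need <= cap else "does not fit"
        print(f"{mode} needs ~{need:.1f} Gbit/s; {link} offers {cap} Gbit/s -> {verdict}")

The numbers bear out the point above: a single current link has just enough headroom for Ultra HD at 60 Hz, while 8K at 60 Hz needs roughly four times that.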
Source: Expreview
93 Comments on 8K A Great Challenge: NVIDIA and AMD
I mean, really, I thought 4K was going to be the limit of what we can perceive with our eyes, so the only reason I see these resolutions being better is for bigger screens to get higher pixel density, which could keep things a bit clearer. Though I think we are a ways off from 8K being feasible for gamers; 4K is doable with two or more top-end cards right now, but that is about the minimum.
Then I started reading people's comments and went, "Am I on TechPowerUp or am I on some annoying babies' website?"
Really, are you all afraid of the future?
8K is amazing; it moves the future forward... we don't need faster cards to run 1080p, we need cards that will run 8K like butter. Then you can play 4K without an issue, maxed out on the latest and greatest.
If I had the money I would have gotten a 4K monitor yesterday, and enough video cards to run it. I have been saving, and my next purchase for my PC is a new monitor... a 1440p 144 Hz ROG Swift or a 4K monitor are my two options.
I don't know how anyone claims to be a high-end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up an Xbox One and plug it into your 720p TV.
Anyways, I am super excited and I can't wait to see what AMD and NV have in store for 8K.
I got my 1600p last year for $600
Then there's games: textures aren't compressible beyond what we currently see, and games are already bordering on the 50 GB mark. Imagine 4K textures on Steam games downloading for days at a time (rough numbers below). Not to mention SSDs still aren't ready to be used as game storage devices, as their prices still aren't quite right. GPUs can't push 4K at more than 45 FPS, and consoles most definitely can't push anything beyond 900p currently.
Not only are we severely unprepared for 8K, we're still catching up to get 4K ready within the next few years. Not to mention these displays simply aren't affordable. You need to wait for the tech at the bottom to catch up (consoles, GPUs, SSDs, and internet speeds/bandwidth) before you start pushing even more pixels.
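To put the download-time worry above into perspective, here is a rough estimate. The 200 GB figure for a hypothetical 4K-texture build is purely illustrative, not a real game size, and the connection speeds are just typical consumer tiers.

def download_hours(size_gb, speed_mbps):
    """Hours to download size_gb gigabytes at speed_mbps megabits per second."""
    return size_gb * 8 * 1000 / speed_mbps / 3600

# Today's ~50 GB games vs. a guessed 200 GB build with 4K textures.
for size_gb in (50, 200):
    for speed_mbps in (10, 50, 100):
        print(f"{size_gb} GB at {speed_mbps} Mbit/s: ~{download_hours(size_gb, speed_mbps):.1f} hours")

At 10 Mbit/s the larger build would take close to two days, which is where the "downloading for days" complaint comes from.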
At 1440p, consistently hitting over 100 fps is not an easy task for a single GPU.
An SLI setup will do.
Full-HD couldn't hold us down forever!
Ask your stupid internet providers in the western world to finally deliver higher speeds and drive progress.
Single GPUs can't drive 4K yet, but SLI 970s can. There will be a bigger Maxwell chip soon, much like Kepler. A single one of those should be able to get 60+fps on 4K, and a pair in SLI should be able to do 45+fps on triple-screen 4K. With 3 or 4 GPUs, 8K is entirely doable. Expensive, but doable. Also potentially unnecessary depending on how AA and the need for AA scales.
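For what it's worth, here is the crude arithmetic behind that kind of claim, assuming frame time scales roughly linearly with pixel count and multi-GPU scaling is near-perfect; both are optimistic simplifications, and the 120 fps baseline at 1080p is just a hypothetical reference point.

BASE_PIXELS = 1920 * 1080   # 1080p reference
BASE_FPS = 120              # hypothetical single-GPU result at 1080p

resolutions = {"1440p": (2560, 1440), "4K UHD": (3840, 2160), "8K": (7680, 4320)}

for name, (w, h) in resolutions.items():
    scale = (w * h) / BASE_PIXELS
    for gpus in (1, 2, 4):
        est_fps = BASE_FPS / scale * gpus
        print(f"{name}: {scale:.1f}x the pixels of 1080p, {gpus} GPU(s) -> ~{est_fps:.0f} fps")

8K works out to 16 times the pixels of 1080p and 4 times those of 4K, which is why even four GPUs only land in the neighborhood of playable frame rates.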
Before the netbook era drove prices stupidly low (and stalled display improvements for 6-10 years... don't get me started on this crap...), laptops had been increasing resolution steadily. Nowadays, 4K is entirely accessible on a laptop, and 1080p is (finally!) considered the baseline resolution by most reviewers and general users. I expect the 1366x768 size to die out completely in the next 2-3 years, especially now that Ultrabooks have breathed new life into the laptop market. Looking at the Steam hardware survey, the 1920x1080 category is the largest and growing the fastest. 1080p is very much the standard target resolution, and 4K will be the next target standard. Graphics cards also show that: they're all optimized for 1080p60 to 1080p120, and are now starting to target the 4K segment.
USB3 is already standardised. You just need to pay for it. USB2 products still exist because they are cheap. Taking a thumbdrive as an example: you only need USB3 for a fast one, so why bother making a slow one use more expensive USB3 parts and design time instead of just reusing current USB2 parts? Same reason screens only use DVI-SL with DVI-SL cables rather than DL-DVI or DP, or why you don't put a Noctua NH-D15 on an Atom C2750.
Cameras couldn't do 4K and 8K or high FPS for a long time because of physics: you need to get more detail out of the same amount of light. The more detail you try to extract from the same amount of light, the more noise becomes an issue, and that's unacceptable for anyone worth their salt. Sensors are finally good enough now that cameras are shipping with them. Compare the RED ONE camera to the RED EPIC DRAGON. The sensor on the DRAGON is higher resolution (6K vs 4K) and, more importantly, has an SNR of 80 dB vs the 66 dB of the RED ONE. The SNR difference is what allows the DRAGON to hit 150 fps of 4K (with the help of a shitton of extra light from spot lamps) while the ONE has to make do with only 30 fps. Don't argue with physics, you will lose. As for dSLR sensors, they are not geared towards video, and consequently don't work anywhere near as well. Oh, and the storage on dSLRs is crap compared to a real video camera: SD/XQD/CF cards vs a raw digital feed to a real server with capture cards and RAIDed and/or PCIe SSDs. It's correspondingly more expensive. And finally, to put things into perspective, cinema-grade 2K video is currently at 250 MB/s, or about 2 Gbit/s (quick math below). And that's after compression. Meanwhile Blu-rays have to make do with ~50 Mbit/s at most due to space constraints. For that level of quality, forget about consumer gear; even top-end gear isn't fast enough to cope for a large number of producers.
All in all, it's not that corps don't want steady improvements, but more that this thing called real world physics gets in the way of steady improvements.
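A quick unit check on the bitrates quoted above (the figures are the ones in the comment, nothing measured here):

cinema_mb_per_s = 250                     # cinema-grade 2K feed, MB/s (as quoted)
cinema_mbit_per_s = cinema_mb_per_s * 8   # = 2000 Mbit/s, i.e. ~2 Gbit/s
bluray_mbit_per_s = 50                    # rough Blu-ray ceiling (as quoted)

print(f"Cinema-grade 2K: {cinema_mbit_per_s} Mbit/s (~{cinema_mbit_per_s / 1000:.0f} Gbit/s)")
print(f"Blu-ray: {bluray_mbit_per_s} Mbit/s, roughly "
      f"{cinema_mbit_per_s / bluray_mbit_per_s:.0f}x less")

That is about a 40x gap even before you scale up from 2K to 8K.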
" After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up. "
This statement makes absolutely no sense. LCD panel manufacturers, as well as the designers of modern GPUs, have enough challenges with the manufacturing process and the limitations of today's technology. It has nothing to do with politics (tyranny) and everything to do with feasibility. While market demand may push for a certain technology, they can't just 'do it' and make it work. We would all be driving flying cars and traveling to faraway solar systems if that were the case.
It takes 2 top-tier GPUs to effectively play a game at 4K with moderate settings at a decent frame rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that can fully render 8K with all of the rendering goodies?
Just look at your smartphone (its display quality, PPI, image quality, etc.) and you will understand what I mean, probably... :laugh:
This, with G-Sync and on-the-fly resolution scaling, would be perfect. 8K for desktop productivity, with retina-quality typefaces, and perfect for photo editing, etc. Then for gaming the GPU would downscale and G-Sync to obtain the best refresh rate it can, always shooting for 60 fps+. By rendering at 2K and then upscaling to 8K, it is doable. To minimise the bandwidth, the upscaling would need to be done by the TFT.
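A minimal sketch of that dynamic-resolution idea, assuming a 60 fps target and a panel-side upscaler; the controller logic and all constants here are hypothetical, not any vendor's actual algorithm.

NATIVE_W, NATIVE_H = 7680, 4320   # native 8K panel resolution
TARGET_FRAME_MS = 1000 / 60       # aim for 60 fps

def adjust_render_scale(scale, last_frame_ms, step=0.05, lo=0.25, hi=1.0, headroom=0.9):
    """Lower the internal render scale when frames run long, raise it when
    there is comfortable headroom, and clamp to a sane range."""
    if last_frame_ms > TARGET_FRAME_MS:
        scale -= step
    elif last_frame_ms < TARGET_FRAME_MS * headroom:
        scale += step
    return max(lo, min(hi, scale))

# Toy frame loop with made-up frame times (ms) standing in for GPU measurements.
scale = 1.0
for frame_ms in (25.0, 22.0, 19.0, 17.0, 15.0, 14.0):
    scale = adjust_render_scale(scale, frame_ms)
    w, h = int(NATIVE_W * scale), int(NATIVE_H * scale)
    print(f"frame took {frame_ms:.1f} ms -> render at {w}x{h} (scale {scale:.2f}), "
          f"panel upscales to {NATIVE_W}x{NATIVE_H}")

The point is simply that only the reduced render resolution has to cross the GPU and the link; the final upscale to 8K happens in the display electronics.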
So I took that to mean that it's good for himself, and that he wasn't dictating to others.
Is it possible that you're being a little sensitive or reactionary?
My standard is whatever I can afford at the time,...........but I really want it all, and I want it NOW!! :banghead:
Peace,.................:rockout:
Seriously though -- yes, technologies do advance all the time. Yes, some technologies aren't all that great. Yes, companies want to make mad money. 4K is a step in the right direction, and isn't a gimmick. My eyes can easily perceive a difference; perhaps your eyes can't.
There are two monumental problems with 8K:
#1: No cable that can carry it. I have my doubts whether or not DisplayPort can even be expanded to handle it.
#2: If #1 were solved and the workload were only 2D, GPUs could handle 8K today without a problem. Where the "monumental problem" comes in is that any load greater than desktop software is going to make any GPU croak at 8K, and the only way to combat that is with more transistors, which means bigger chips, which means more power, which means more heat. Unless there is some breakthrough (perhaps Unlimited Detail Technology), displays are going to run away from graphics technology, because graphics can't scale at the rate LCD panels do.
The demand for these panels is coming from the film and TV industry, where the GPU's only task is to render the video frames, not to process massive amounts of triangles. I don't think gaming will see reasonable 4K for a long time, never mind 8K. These things are mostly for film enthusiasts and professionals, not gamers. Games will have to be played at a lower-than-native resolution for acceptable frame rates.
Oh, and speaking of the film industry, HDMI is going to have to be kicked to the curb, and a new standard (probably adapted from DisplayPort) will have to replace it to handle 8K. Phasing out HDMI in favor of something newer is going to take a very long time.
#2: Most of us care about 8K for productivity, not games. I for one am happy with only the current 2K screens for games, but olawd the text is nasty :(. VR, however, will push things a lot harder... the Oculus people want more than 8K. Also, I personally doubt that screens will scale beyond 8K. It's already stalling in phones, where 5-inch 440 ppi screens are considered by almost everyone to be enough, and more than necessary for most people.
And HDMI can go diaf. It's a shitty hack based off DVI with a licensing fee (wuuut, with free-to-use DisplayPort also around?!) and limited to external interfaces only. Personally, the only place I need HDMI is for TVs. If TVs had DisplayPort inputs, I wouldn't need HDMI at all!