Thursday, October 16th 2014

8K A Great Challenge: NVIDIA and AMD

Even as 4K Ultra HD (3840 x 2160) is beginning to enter the consumer mainstream, with 28-inch displays priced around $600, and Apple is toying with 5K (5120 x 2880) on its next-generation iMac Retina desktops, Japanese display maker Sharp threw a spanner in the works by unveiling a working prototype of its 8K (7680 x 4320 pixels) display at the CEATEC trade show in Japan.

Two of the industry's biggest graphics processor makers, NVIDIA and AMD, reacted similarly to the development, calling 8K "a great challenge." Currently, neither company has a GPU that can handle the resolution. 8K has four times as many pixels as 4K. Driving an Ultra HD display over DVI needs two TMDS links (dual-link), and DisplayPort 1.2 and HDMI 2.0 have just enough bandwidth to drive Ultra HD at 60 Hz. To drive 8K, both NVIDIA and AMD believe you would need more than one current-generation GPU, with the display connected to each card over independent connectors and somehow treated as four Ultra HD displays. We imagine Sharp demoed its display at a very low refresh rate to compensate for the bandwidth limitation. After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up.
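For a rough sense of the numbers involved, here is a back-of-the-envelope sketch of the raw pixel-data rates. It ignores blanking intervals and link encoding, and the effective link rates noted in the comments are approximations, not vendor figures:

```python
# Back-of-the-envelope pixel-data rates, ignoring blanking and encoding overhead.
# Approximate effective link rates: DP 1.2 ~17.3 Gbit/s, HDMI 2.0 ~14.4 Gbit/s.
def pixel_rate_gbps(width, height, bpp=24, refresh_hz=60):
    """Raw pixel data rate in Gbit/s for an RGB signal."""
    return width * height * bpp * refresh_hz / 1e9

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("Ultra HD", (3840, 2160)),
                     ("8K", (7680, 4320))]:
    print(f"{name:9s} {pixel_rate_gbps(w, h):5.1f} Gbit/s at 24 bpp / 60 Hz")

# Ultra HD works out to ~11.9 Gbit/s of pixel data, which fits (barely) once
# blanking is added; 8K needs ~47.8 Gbit/s before any overhead, far beyond a
# single DisplayPort 1.2 or HDMI 2.0 link.
```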
Source: Expreview

93 Comments on 8K A Great Challenge: NVIDIA and AMD

#26
GhostRyder
I think the only question is how much of this we can actually perceive at some point.

I mean, really, I thought 4K was going to be the limit of what we can perceive with our eyes, so the only benefit I see in these higher resolutions is that bigger screens can reach higher pixel density, which could keep things a bit clearer. Though I think we are a ways off from 8K being feasible for gamers; 4K is doable with two or more top-end cards right now, but that is about the minimum.
Posted on Reply
#27
64K
I can't imagine what kind of cards could handle 8K in gaming. That's a little over 33 million pixels. Equal to powering four 4K monitors or 16 1080p monitors. Apparently LG has an 8K TV though. I'm sure it's going to be very expensive for early adopters.
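A quick check of that arithmetic (nothing assumed beyond the resolutions themselves):

```python
# Pixel counts behind the figures above.
pixels_8k = 7680 * 4320            # 33,177,600 - a little over 33 million
print(pixels_8k // (3840 * 2160))  # 4  -> four 4K (Ultra HD) panels
print(pixels_8k // (1920 * 1080))  # 16 -> sixteen 1080p panels
```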
Posted on Reply
#28
nickbaldwin86
When I read that title I was like WOOT 8K, because 8 is bigger than 4. WO0T, MOAR PPI!!!!

Then I started reading people's comments and went, "Am I on TechPowerUp or am I on some annoying babies' website?"

Really, are you all afraid of the future?????

8K is amazing, it moves the future forward... we don't need faster cards to run 1080p, we need cards that will run 8K like butter, then you can play 4K without an issue, maxed out on the latest and greatest.

If I had the money I would have gotten a 4K monitor yesterday, and enough video cards to run it. I have been saving, and my next purchase for my PC is a new monitor... a 1440p 144 Hz ROG Swift or a 4K monitor are my two options.

I don't know how anyone claims to be a high-end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up an Xbox One and plug it into your 720p TV.

Anyway, I am super excited and I can't wait to see what AMD and NV have in store for 8K.
Posted on Reply
#29
Patriot
FrickThat is what needs to be done IMO. Sub €200 1440p monitor please.
You can get 1440p for sub-$400 in the US, no problem.
I got my 1600p last year for $600.
Posted on Reply
#30
RejZoR
This "Unlimited Detail" has been nothing but a vaporware consisting of bunch of amazing videos and not a single realtime demonstration. I'll believe them when they run their magical stuff in front of media (journalists) in real-time. Until then, vaporware...
Posted on Reply
#31
RCoon
Films and GPUs haven't even made it to 4K yet. When I watch a 720p or 1080p YouTube clip on my 1440p monitor, it looks bad. When I watch a Blu-ray on my 1440p monitor, it reminds me of watching DVDs on an HD TV. The quality is fine, but the crispness is lacking when scaled up to those resolutions. 4K needs to be widely accepted (and supported) first, so video files look crisp again.

Then there are games: textures aren't compressible beyond what we currently see, and games are already bordering on the 50 GB mark. Imagine 4K textures on Steam games downloading for days at a time. Not to mention SSDs still aren't ready to be used as game storage devices, as they're not yet price-competitive. GPUs can't push 4K at more than 45 FPS, and consoles most definitely can't push anything beyond 900p currently.

Not only are we severely unprepared for 8K, we're still catching up to get 4K ready within the next few years. Not to mention these panels simply aren't affordable. You need to wait for the tech at the bottom to catch up (consoles, GPUs, SSDs and internet speeds/bandwidth) before you start pushing even more pixels.
Posted on Reply
#32
Animalpak
I have a 1440p monitor and I must say that my 780 Ti handles it pretty well, but not exceptionally like 1080p.

At 1440p, constantly staying over 100 fps is not an easy task for a single GPU.

An SLI setup will do.
Posted on Reply
#33
Disparia
Good stuff.

Full-HD couldn't hold us down forever!
Posted on Reply
#34
de.das.dude
Pro Indian Modder
nickbaldwin86When I read that title I was like WOOT 8K, because 8 is bigger than 4. WO0T, MOAR PPI!!!!

Then I started reading people's comments and went, "Am I on TechPowerUp or am I on some annoying babies' website?"

Really, are you all afraid of the future?????

8K is amazing, it moves the future forward... we don't need faster cards to run 1080p, we need cards that will run 8K like butter, then you can play 4K without an issue, maxed out on the latest and greatest.

If I had the money I would have gotten a 4K monitor yesterday, and enough video cards to run it. I have been saving, and my next purchase for my PC is a new monitor... a 1440p 144 Hz ROG Swift or a 4K monitor are my two options.

I don't know how anyone claims to be a high-end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up an Xbox One and plug it into your 720p TV.

Anyway, I am super excited and I can't wait to see what AMD and NV have in store for 8K.
People like you make it easy for companies. You guys will literally buy anything, even if it doesn't make any sense.
Posted on Reply
#35
Sony Xperia S
RCooninternet speeds/bandwidth
Some countries don't have problems with internet connections; actually, it is best where it is least expected to be.

Ask your stupid internet providers in the Western world for higher speeds, to finally drive progress.
Posted on Reply
#36
Hilux SSRG
8K sounds good. This will push companies to develop better screens, technology, etc., in order to move the industry away from 1080p and toward 1440p and ultimately 4K as the new standard.
Posted on Reply
#37
Frick
Fishfaced Nincompoop
Sony Xperia SSome countries don't have problems with internet connections; actually, it is best where it is least expected to be.

Ask your stupid internet providers in the Western world for higher speeds, to finally drive progress.
It's not that easy to dig fiber optics all over entire continents ... especially when many places are remote. In the cities and such it might not be a problem, but what about the suckers not living there? Sweden generally is very good in this regard, but on the edges people still can't get speeds faster than old phone connections. No wireless coverage either. And hell, if you're on a gigabit connection your mechanical HDD won't keep up anyway. Which was the point: it's not just ONE bottleneck (even though internet speed is the lowest common denominator at this point), there are several.
PatriotYou can get 1440p for sub-$400 in the US, no problem.
I got my 1600p last year for $600.
€200 would mean about $200 once taxes are accounted for. Give or take.
Posted on Reply
#38
ZeDestructor
Reader4K or 8K or 16K...
All of them are marketing hoaxes.
Producers try to fascinate the "rich idiots".
Producers try to earn MORE thanks to INEFFICIENT technologies.
GPUs are not enough for 4K, and they intend to offer 8K!
It is very clear that "THEY WANT YOUR MONEY", "THEY WANT MORE MONEY".
Except for high-end gaming laptops, more than 80% of laptops still offer 1366x768 resolution.
Before 1080p became a standard, producers started producing 4K and now 8K.
1920x1080 has never become a standard.
1920x1080 should have become a standard before 4K technology.
USB 3.0 was announced in 2008, but producers still offer USB 2.0 products.
USB 3.0 has never become a standard.
And USB 3.1 was announced before USB 3.0 became a standard.
On the other hand, consider cameras.
Canon had only one camera capable of recording 1080p 60fps.
It was the 1D C.
And a few weeks ago the Canon EOS 7D Mark II was announced, capable of recording 1080p 60fps.
Giant Canon has only two cameras capable of recording 1080p 60fps.
But nowadays producers offer 4K cameras.
1080p 60fps video recording should have become a standard for cameras before 4K.
Before UHS-I technology became a standard, UHS-II was announced.
Technological improvements are released TOO FAST.
But technological improvements become standards TOO SLOWLY.
Never forget that "if a technology does not become a standard, it will always be LUXURY and UNATTAINABLE."
A technology must become a standard to be AFFORDABLE.
I code on my desktop. 4K and 8K for me would be a massive improvement just in text quality. Sure, a few programs scale blurrily, but I can live with a few blurry programs while everything important (text editors, IDEs, browsers, DE) scales.

Single GPUs can't drive 4K yet, but SLI 970s can. There will be a bigger Maxwell chip soon, much like with Kepler. A single one of those should be able to get 60+ fps at 4K, and a pair in SLI should be able to do 45+ fps on triple-screen 4K. With 3 or 4 GPUs, 8K is entirely doable. Expensive, but doable. Also potentially unnecessary, depending on how AA and the need for AA scale.

Before the netbook era drove prices stupidly low (and stalled display improvements for 6-10 years... don't get me started on this crap...), laptops had been increasing resolution steadily. Nowadays, 4K is entirely accessible on a laptop, and 1080p is (finally!) considered the baseline resolution by most reviewers and general users. I expect the 1366x768 size to die out completely in the next 2-3 years, especially now that Ultrabooks have breathed new life into the laptop market. Looking at the Steam hardware survey, the 1920x1080 category is growing the fastest and is the largest. 1080p is very much the standard target resolution, and 4K will be the next target standard. Graphics cards also show that: they're all optimized for 1080p60 to 1080p120, and are now starting to target the 4K segment.

USB3 is already standardised. You just need to pay for it. USB2 products still exist because they are cheap. Take a thumbdrive as an example: you only need USB3 for a fast one, so why make a slow one use more expensive USB3 parts and design time instead of just reusing current USB2 parts? Same reason basic screens ship with single-link DVI and single-link DVI cables rather than dual-link DVI or DP, or why you don't put a Noctua NH-D15 on an Atom C2750.

Cameras couldn't do 4K and 8K or high FPS for a long time because of physics: you need to get more detail out of the same amount of light. The more detail you try to extract from the same amount of light, the more noise becomes an issue, and that's unacceptable for anyone worth their salt. Sensors are finally good enough now that cameras are shipping with them. Compare the RED ONE camera to the RED EPIC DRAGON. The sensor on the DRAGON is higher resolution (6K vs 4K) and, more importantly, has an SNR of 80 dB vs the 66 dB of the RED ONE. The SNR difference is what allows the DRAGON to hit 150 fps at 4K (with the help of a shitton of extra light from spot lamps) while the ONE has to make do with only 30 fps. Don't argue with physics, you will lose. As for dSLR sensors, they are not geared towards video, and consequently don't work anywhere near as well. Oh, and the storage on dSLRs is crap compared to a real video camera: SD/XQD/CF cards vs a raw digital feed to a real server with capture cards and RAIDed and/or PCIe SSDs. It's correspondingly more expensive. And finally, to put things into perspective, cinema-grade 2K video is currently at 250 MB/s, or about 2 Gbit/s. And that's after compression. Meanwhile Blu-rays have to make do with ~50 Mbit/s at most due to space constraints. For that level of quality, forget about consumer gear; even top-end gear isn't fast enough to cope for a large number of producers.
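To put those two bitrates side by side, here is a small sketch using only the figures quoted above (250 MB/s cinema-grade 2K, ~50 Mbit/s Blu-ray); the per-hour storage numbers are a derived illustration, not quoted figures:

```python
# Compare the quoted cinema-grade 2K data rate with a typical Blu-ray bitrate.
cinema_mb_s = 250                               # MB/s, as quoted above
bluray_mbit_s = 50                              # Mbit/s, as quoted above

cinema_gbit_s = cinema_mb_s * 8 / 1000          # = 2.0 Gbit/s
ratio = cinema_gbit_s * 1000 / bluray_mbit_s    # = 40x more data per second

cinema_gb_per_hour = cinema_mb_s * 3600 / 1000         # ~900 GB per hour of footage
bluray_gb_per_hour = bluray_mbit_s / 8 * 3600 / 1000   # ~22.5 GB per hour

print(cinema_gbit_s, ratio, cinema_gb_per_hour, bluray_gb_per_hour)
```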

All in all, it's not that corps don't want steady improvements, but more that this thing called real-world physics gets in the way.
Posted on Reply
#39
Rojan
Sony Xperia SI need my 24 inch 4K monitor NOWWWWW!!!
The UP2414Q has been out for months.
Easy RhinoI'm sorry, but all of your points are irrelevant. Give me a fun game to play FIRST and then maybe I will consider whether it requires a 4K or 8K resolution over my 1080p monitor. Focus on gameplay and fun, not on uber graphics.
I'm sorry if someone is going to be mad about what I'm about to say, but gameplay is really irrelevant to technical advancement in display resolution. I mean, it doesn't even have to be about video games at all! High PPI's most obvious use is gaming, but sheer resolution could be used in other scenarios (i.e. productivity, with more screen real estate without the use of multiple panels, and photography work).
Posted on Reply
#40
Tallencor
TPU's First Patreon
nickbaldwin86I don't know how anyone claims to be a high-end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up an Xbox One and plug it into your 720p TV.
Tell that to the pro CRT guys. I have a 1080p monitor and downsample some games instead of using AA. Although I don't profess to be a "high-end PC gamer," I do take pride in the equipment I have and find inventive ways of using it to its full potential. Your statement is unjust.
Posted on Reply
#41
Bansaku
nickbaldwin86Then I started reading people's comments and went, "Am I on TechPowerUp or am I on some annoying babies' website?"

Really, are you all afraid of the future?????

I don't know how anyone claims to be a high-end PC gamer and then goes and buys a 1080p monitor and says that is good enough... sounds more like you should go pick up an Xbox One and plug it into your 720p TV.

Anyway, I am super excited and I can't wait to see what AMD and NV have in store for 8K.
Nice attitude; sounds like you are the baby. Most people here at TPU are realists, like myself. Fact: 4K was only ever meant for broadcasting and media production. The push to 4K was not a logical step forward, but rather a push from manufacturers to reignite a slumping market with 'new' technologies. 3D flopped, smart TV is a joke, and 120 Hz is so 2007. All it took was a Chinese company releasing a (sub-par) 4K consumer TV for the end user, like yourself, to say "OMG I WANT".

" After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up. "

This statement makes absolutely no sense. LCD panel manufacturers, as well as the designers of modern GPUs, have enough challenges with the manufacturing process and the limitations of today's 'current' technology. It has nothing to do with politics (tyranny) and everything to do with feasibility. While market demand may push for a certain technology, they can't just 'do it' and make it work. We would all be driving flying cars and traveling to faraway solar systems if that were the case.

It takes two top-tier GPUs to effectively play a game at 4K with moderate settings at a decent frame rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that can fully render 8K with all of the rendering goodies?
Posted on Reply
#42
Sony Xperia S
Bansaku" After 10 years of Full-HD tyranny, display resolutions are finally beginning to see their normal rate of development. It's time now for GPU developers and display interconnects to keep up. "

This statement makes absolutely no sense. LCD panel manufacturers, as well as the designers of modern GPUs, have enough challenges with the manufacturing process and the limitations of today's 'current' technology. It has nothing to do with politics (tyranny) and everything to do with feasibility. While market demand may push for a certain technology, they can't just 'do it' and make it work. We would all be driving flying cars and traveling to faraway solar systems if that were the case.
The usual political excuses. Just say that YOU or the manufacturers don't want to do it, instead of this nonsense. You make it sound as if it is so difficult as to be impossible, but people have said 'impossible is nothing', and you could effectively replace your "feasibility" with 'stupidity' and be fully correct.

Just look at your smartphone, its display quality, PPI, image quality, etc., and you will understand what I mean, probably... :laugh:
Posted on Reply
#43
The Von Matrices
I look forward to 8K. With 8K we can finally get rid of subpixel text rendering (and its resultant color fringing) and anti-aliasing.
Posted on Reply
#44
RadFX
BansakuIt takes two top-tier GPUs to effectively play a game at 4K with moderate settings at a decent frame rate. I don't see 8K coming any time soon. Could you imagine the heat coming off a GPU that can fully render 8K with all of the rendering goodies?
With all due respect, I game with a pair of Radeons (a 7970 GHz Edition and an R9 280) at 1.1 GHz core and 1.5 GHz memory at 4K, and it's fine. There are but a few games that would be unplayable with this setup at max or near-max settings (no AA).
Posted on Reply
#45
lemonadesoda
8K is what we want, yes!

This, with G-Sync and on-the-fly resolution scaling, would be perfect: 8K for desktop and productivity with retina-quality typefaces, and perfect for photo editing, etc. Then, for gaming, the GPU would downscale and G-Sync to obtain the best refresh rate it can, always shooting for 60 fps+. By rendering at 2K and then upscaling to 8K, it is doable. To minimise the bandwidth, the upscaling would need to be done by the TFT.
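As a minimal sketch of the render-low, upscale-to-native idea (assuming plain nearest-neighbour integer scaling and a 1080p source, which divides evenly into 8K; a real scaler, GPU- or panel-side, would be more sophisticated):

```python
import numpy as np

# Nearest-neighbour upscale of a 1080p frame to 8K: each source pixel becomes
# a 4x4 block, since 7680/1920 = 4320/1080 = 4.
frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
frame_8k = frame_1080p.repeat(4, axis=0).repeat(4, axis=1)
print(frame_8k.shape)  # (4320, 7680, 3)
```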
Posted on Reply
#46
ZeDestructor
lemonadesoda8K is what we want, yes!

This, with G-Sync and on-the-fly resolution scaling, would be perfect: 8K for desktop and productivity with retina-quality typefaces, and perfect for photo editing, etc. Then, for gaming, the GPU would downscale and G-Sync to obtain the best refresh rate it can, always shooting for 60 fps+. By rendering at 2K and then upscaling to 8K, it is doable. To minimise the bandwidth, the upscaling would need to be done by the TFT.
Nah, have the GPU do the scaling and let the screen have a really simple, really dumb controller for minimized latency. When you're actually pushing a signal from the GPU to the screen, the bandwidth matters little.
Posted on Reply
#47
RealNeil
Sony Xperia SDo you understand what I'm telling you or not?

You have NO right to tell the others what "enough" is because your requirements and quality as a person may be lower than those others.

Actually, you need to make the standard higher and if it is higher for you all right, but you CAN'T and SHOULDN'T make the standard lower for everyone because there will always be someone for whom it is NOT enough.

Now understand?
He said that if he himself was gaming... Quoting him: "If I was gaming on a 24 inch Full HD screen it's more than enough."
So I took that to mean that it's good enough for him, and that he wasn't dictating to others.

Is it possible that you're being a little sensitive or reactionary?
My standard is whatever I can afford at the time... but I really want it all, and I want it NOW!! :banghead:

Peace,.................:rockout:
Posted on Reply
#48
Fx
Reader4K or 8K or 16K...
All of them are marketing hoaxes.
Producers try to fascinate the "rich idiots".
Producers try to earn MORE thanks to INEFFICIENT technologies.
GPUs are not enough for 4K, and they intend to offer 8K!
It is very clear that "THEY WANT YOUR MONEY", "THEY WANT MORE MONEY".
Except for high-end gaming laptops, more than 80% of laptops still offer 1366x768 resolution.
Before 1080p became a standard, producers started producing 4K and now 8K.
1920x1080 has never become a standard.
1920x1080 should have become a standard before 4K technology.
USB 3.0 was announced in 2008, but producers still offer USB 2.0 products.
USB 3.0 has never become a standard.
And USB 3.1 was announced before USB 3.0 became a standard.
On the other hand, consider cameras.
Canon had only one camera capable of recording 1080p 60fps.
It was the 1D C.
And a few weeks ago the Canon EOS 7D Mark II was announced, capable of recording 1080p 60fps.
Giant Canon has only two cameras capable of recording 1080p 60fps.
But nowadays producers offer 4K cameras.
1080p 60fps video recording should have become a standard for cameras before 4K.
Before UHS-I technology became a standard, UHS-II was announced.
Technological improvements are released TOO FAST.
But technological improvements become standards TOO SLOWLY.
Never forget that "if a technology does not become a standard, it will always be LUXURY and UNATTAINABLE."
A technology must become a standard to be AFFORDABLE.
Yeah, screw it. I say let's go back to the 1024x768 days...

Seriously though -- yes, technologies do advance all the time. Yes, some technologies aren't all that great. Yes, companies want to make mad money. 4K is a step in the right direction, and isn't a gimmick. My eyes can easily perceive a difference; perhaps your eyes can't.
Posted on Reply
#49
FordGT90Concept
"I go fast!1!11!1!"
facepalm.jpg

There are two monumental problems with 8K:

#1: No cable that can carry it. I have my doubts whether or not DisplayPort can even be expanded to handle it.

#2: If #1 were solved and the workload were only 2D, GPUs could handle 8K today without a problem. Where the "monumental problem" comes from is that any load greater than desktop software is going to make any GPU croak at 8K, and the only way to combat that is with more transistors, which means bigger chips, which means more power, which means more heat. Unless there is some breakthrough (perhaps Unlimited Detail Technology), displays are going to run away from graphics technology because graphics can't scale at the rate LCD panels do.

The demand for these panels is coming from the film and TV industry, where the GPU's only task is to render video frames, not to process massive amounts of triangles. I don't think gaming will see reasonable 4K for a long time, never mind 8K. These things are mostly for film enthusiasts and professionals, not gamers. Games will have to be played at a lower-than-native resolution for acceptable frame rates.

Oh, and speaking of the film industry, HDMI is going to have to be kicked to the curb, and a new standard (probably adapted from DisplayPort) will have to replace it to handle 8K. It's going to take a very long time to phase out HDMI in favor of something newer.
Posted on Reply
#50
ZeDestructor
FordGT90Conceptfacepalm.jpg

There are two monumental problems with 8K:

#1: No cable that can carry it. I have my doubts whether or not DisplayPort can even be expanded to handle it.

#2: If #1 were solved and the workload were only 2D, GPUs could handle 8K today without a problem. Where the "monumental problem" comes from is that any load greater than desktop software is going to make any GPU croak at 8K, and the only way to combat that is with more transistors, which means bigger chips, which means more power, which means more heat. Unless there is some breakthrough, displays are going to run away from graphics technology because graphics can't scale at the rate LCD panels do.

The demand for these panels is coming from the film and TV industry, where the GPU's only task is to render video frames, not to process massive amounts of triangles. I don't think gaming will see reasonable 4K for a long time, never mind 8K. These things are mostly for film enthusiasts and professionals, not gamers. Games will have to be played at a lower-than-native resolution for acceptable frame rates.

Oh, and speaking of the film industry, HDMI is going to have to be kicked to the curb, and a new standard (probably adapted from DisplayPort) will have to replace it to handle 8K. It's going to take a very long time to phase out HDMI in favor of something newer.
#1 DisplayPort can do ~25 Gbit/s right now, and according to the DisplayPort page on Wikipedia, 8K at 24-bit color and 60 Hz lies around 50-60 Gbit/s. For 8K at 30-bit color and 120 Hz, 125-150 Gbit/s should be the bandwidth we're looking at. CAT-8 cabling (4-pair, 8 wires) is currently being finalized to provide 40 Gbit/s over 100 m. DP is a 4-lane cable (one "pair" per lane). Using CAT-8-grade cabling with the right transceivers over the usual 5 m maximum needed length of DP cabling (a 5 m CAT-8 cable should be good for at least 100 Gbit/s), 8K is perfectly feasible with current tech. Expensive, but feasible. Hell, odds are that CAT-8 cabling will be good for 100 Gbit/s over the full 100 m length thanks to new electronics... Pennsylvania State University people theorized back in 2007 that 32 or 22 nm circuits would do 100 Gbit/s over 100 m of CAT-7A.
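Those figures check out as a straight pixel-clock estimate; the sketch below uses a flat 20% blanking allowance, which is an assumption rather than a real reduced-blanking timing:

```python
# Rough link-rate estimate for the 8K modes discussed above. The 20% blanking
# allowance is an assumption; actual reduced-blanking timings will differ.
def link_rate_gbps(width, height, bits_per_component, refresh_hz, blanking=0.20):
    raw = width * height * bits_per_component * 3 * refresh_hz / 1e9  # RGB
    return raw, raw * (1 + blanking)

print(link_rate_gbps(7680, 4320, 8, 60))    # ~47.8 raw, ~57.3 with blanking
print(link_rate_gbps(7680, 4320, 10, 120))  # ~119.4 raw, ~143.3 with blanking
```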

#2 Most of us care about 8K for productivity, not games. I for one am happy with only the current 2K screens for games, but oh lawd, the text is nasty :(. VR, however, will push things a lot harder... the Oculus people want more than 8K. Also, I personally doubt that screens will scale beyond 8K. It's already stalling in phones, where 5", 440 PPI screens are considered by almost everyone to be enough, and more than necessary for most people.

And HDMI can go diaf. It's a shitty hack based off DVI, with a licensing fee (wuuut, with free-to-use DisplayPort also around?!) and limited to external interfaces only. Personally, the only place I need HDMI is TVs. If TVs had DisplayPort inputs, I wouldn't need HDMI at all!
Posted on Reply