Tuesday, September 20th 2011
Super-High 4096 x 4096 Display From An IGP? The Upcoming Ivy Bridge Can Do It
The new Ivy Bridge processors, due out in about six months, have one apparently overlooked but important feature. No, it's not the greatly increased speed (about double that of Sandy Bridge, or more) or the advanced feature set. It's actually the super-high resolution capability: specifically 4096 x 4096 pixels. This astonishing capability is beyond what any of the top-end discrete graphics cards, such as the NVIDIA GTX 590 or AMD HD 6990, can drive over a single monitor port. It's so high, in fact, that there's almost no content at that resolution and no monitor that can handle it. The IGP can actually play multiple 4K video streams, too. Intel, unsurprisingly, is talking up the gaming possibilities at such a resolution; I'd like to see what kind of monster GPU could actually handle it. It will be interesting to see what uses this capability gets put to generally, and just how much the whole setup will cost.
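For a sense of scale, here is a quick back-of-envelope comparison of 4096 x 4096 against common video frame sizes (the DCI "4K" and Full HD dimensions are standard figures, not from the article):

```python
# How large is 4096 x 4096, relative to familiar frame sizes?
panel = 4096 * 4096      # the Ivy Bridge IGP's claimed maximum
dci_4k = 4096 * 2160     # DCI "4K" cinema frame
full_hd = 1920 * 1080    # 1080p frame

print(f"4096x4096 = {panel / 1e6:.1f} MP")            # 16.8 MP
print(f"= {panel / dci_4k:.2f} DCI 4K frames")        # 1.90
print(f"= {panel / full_hd:.2f} 1080p frames")        # 8.09
```

In other words, one such screen holds roughly two cinema-4K frames, or eight 1080p frames, of pixels.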
Source: VR-ZONE
54 Comments
Wow. Already unnecessarily powerful... now with even more POWER!!! :rockout: Man, I love Intel :rockout::rockout::rockout:
Yeah, yeah, yeah, nothing on earth right now can effectively use this. Think of it as future-proofing for the next decade. Or hologram-ready :) Yeah, that's the ticket.
On the flip side, we have seen higher-than-4K resolutions from AMD Eyefinity. Granted, it was always across more than one LCD and more than one GPU. In the end, this will just be some digits on a spec sheet, ignored and overlooked once they admit it serves no purpose for lack of content. And if you do run a 4K video, I am sure it will fall well below the 24 FPS movies use. Don't expect this on-die GPU to rise beyond the sub-$100 performance segment. So yes, I am going to hate a little here, but somebody has to do it.
Eyefinity 6 can do a higher resolution (7680x3200, 24.6 MP) but over six discrete cables--Intel might simply be leaving out that little bit of information.
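To put the two claims side by side, here is a quick sketch (the Eyefinity 6 figure is the one quoted above; Intel's is the single-port claim from the article):

```python
# Single-port Ivy Bridge claim vs. AMD Eyefinity 6 (six cables, six panels).
ivy_bridge = 4096 * 4096
eyefinity6 = 7680 * 3200

print(f"Ivy Bridge IGP: {ivy_bridge / 1e6:.1f} MP over one port")    # 16.8 MP
print(f"Eyefinity 6:    {eyefinity6 / 1e6:.1f} MP over six cables")  # 24.6 MP
```

Eyefinity still wins on total pixels, but the notable part of Intel's claim is doing it all over a single connection.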
P.S. It is no longer called LightPeak.
I know HDMI has a "Professional" version of its spec taped out, including a proprietary connector, which is supposed to address the disparity between HDMI and DVI as far as resolution capability, but the last I read about it, no device was actually using the "pro" connector. It probably has sky-high licensing fees, too. I imagine if they spec'd higher-quality cabling and pushed out a new revision of the specification, new devices could operate on a combination of the 29-pin HDMI connector and higher clock rates. Although DVI and DisplayPort could do so just as easily.
There is no disparity in resolution between DVI and HDMI. The only reason the specs differ is that DVI is much older and was designed with 4:3 computer monitors in mind, while HDMI was designed mainly for 16:9 with all display systems in mind.
And as far as I know, there has never been an "HDMI Pro." Sounds like some marketing-gimmick BS from Monster Cables to sell their $100 cables to folk who don't know better.
Thunderbolt is copper-based; LightPeak is optical/fiber-based. Optical is ideal for sending imagery, but it isn't simple or cheap--maybe Intel had a breakthrough. And 4096x4096 at an abnormal 33 Hz? 2560x1600 is the maximum dual-link DVI can handle at 60 Hz, and 60 Hz is the standard for computers.
Quad-link DVI is an impossibility: there aren't enough physical connections in the DVI standard. You might be thinking of 2 x dual-link DVI (literally two inputs on the monitor), which a lot of very high-resolution (5+ MP) professional monitors use.
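As a rough sanity check on those link limits, here is a naive ceiling calculation (assuming dual-link DVI's nominal 2 x 165 MHz pixel clock and ignoring blanking overhead, so real-world figures come out lower):

```python
def max_refresh_hz(width: int, height: int, pixel_clock_hz: float) -> float:
    """Naive refresh ceiling: pixel clock divided by pixels per frame."""
    return pixel_clock_hz / (width * height)

DUAL_LINK_DVI = 2 * 165e6  # two TMDS links at a nominal 165 MHz each

print(f"2560x1600: {max_refresh_hz(2560, 1600, DUAL_LINK_DVI):.0f} Hz ceiling")  # 81 Hz
print(f"4096x4096: {max_refresh_hz(4096, 4096, DUAL_LINK_DVI):.0f} Hz ceiling")  # 20 Hz
```

Even before blanking, dual-link DVI tops out around 20 Hz at 4096x4096, which is why newer connectors (DisplayPort, or the rumored optical links) keep coming up in this discussion.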
Also, Intel != drivers.
Until that problem is solved, this is just another Intel "marketing" point.
Originally conceived as an optical technology, Thunderbolt switched to electrical connections to reduce costs and to supply up to 10W of power to connected devices.[14]
In 2009, Intel officials said the company was "working on bundling the optical fibre with copper wire so Light Peak can be used to power devices plugged into the PC."[15] In 2010, Intel said the original intent was "to have one single connector technology" that would allow "electrical USB 3.0 […] and piggyback on USB 3.0 or 4.0 DC power."[16]
In January 2011, Intel's David Perlmutter told Computerworld that initial Thunderbolt implementations would be based on copper wires.[17] "The copper came out very good, surprisingly better than what we thought," he said.[18]
Intel and industry partners are still developing optical Thunderbolt hardware and cables.[19] The optical fiber cables are to run "tens of meters" but will not supply power, at least not initially.[20][21][22] They are to have two 62.5-micron-wide fibers to transport an infrared signal up to 100 metres (330 ft).[23] The conversion of electrical signal to optical will be embedded into the cable itself, allowing the current DisplayPort socket to be future compatible, but eventually Intel hopes for a purely optical transceiver assembly embedded in the PC.
As of now, Thunderbolt is PCIe x4 and DisplayPort rolled into one. Bandwidth is 10 Gbit/s bi-directional. Switch that to a single direction and you could currently get 20 Gbit/s, putting it just north of DisplayPort's max.
Reference: www.intel.com/technology/io/thunderbolt/ under the "What is Thunderbolt" section
24-bit, 4096x4096 @ 60 Hz = 24.159191040 Gb/s
32-bit, 4096x4096 @ 60 Hz = 32.212254720 Gb/s
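Those two figures can be reproduced directly (raw pixel data only, no blanking or link overhead):

```python
def display_gbps(width: int, height: int, bpp: int, hz: int) -> float:
    """Raw pixel-data rate in Gbit/s (decimal giga)."""
    return width * height * bpp * hz / 1e9

print(display_gbps(4096, 4096, 24, 60))  # 24.15919104
print(display_gbps(4096, 4096, 32, 60))  # 32.21225472
```

Both land above even the 20 Gbit/s single-direction Thunderbolt figure quoted above, so the copper link as described couldn't carry 4096x4096 at 60 Hz uncompressed.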
And HDMI "B" is nothing special; it was just a revision designation. When they update 1.4a, it will become 1.4b. If they change something major, or there is a planned upgrade in bandwidth or performance, it will be 1.5. I did say maybe. LightPeak is not some grand scheme or goal; it was just a code name. They may reuse the code name, but I really, really doubt they will.