Wednesday, April 8th 2015
HDMI Forum, Inc. Releases HDMI 2.0a Specification
HDMI Forum, Inc., a non-profit, mutual benefit corporation, today announced the completion and release of Version 2.0a of the HDMI Specification. It is available to current HDMI 2.0 Adopters via the HDMI Adopter Extranet.
The specification has been updated to enable transmission of HDR formats, which provide enhanced picture quality by simultaneously enabling greater detail for both the dark and bright parts of an image. The HDR-related updates include references to CEA-861.3, CEA's recently published update of HDR Static Metadata Extensions.
"We recognized that HDR would be a critical feature as the industry evolves. Our support for HDR enables our 800+ HDMI 2.0 Adopters to develop market-leading products that include HDR and will maintain interoperability across the entire HDMI ecosystem," said Robert Blanchard, President of the HDMI Forum, Inc. "Along with the publication of the CEA extensions, the HDMI Forum continues to update the HDMI Specification and remain closely aligned with leading CE standards organizations."
"By adding HDR, the HDMI Specification continues its history of supporting the latest formats and technologies planned for Hollywood content," said Arnold Brown, Chairman of the HDMI Forum, Inc. Board of Directors.
For more information about HDMI technology, please visit this page.
28 Comments on HDMI Forum, Inc. Releases HDMI 2.0a Specification
...and perhaps Thunderbolt...
Whereas HDMI 1.4 can support 4K30 (which is all many need), HDMI 2.0 uses HDCP 2.2, which broke compatibility with a ton of equipment. And of course the Silicon Image 9679 chipset, the first and for a long time only solution to support it, only allows 4:2:0 (not RGB or 4:4:4 8-bit, which the standard supports) at that spec. It took forever to catch up to DP, and they still managed to botch its rollout, which is truly quite impressive given the mess they created for themselves and the compatibility fracturing they inflicted on consumers.
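To put rough numbers on the 4:2:0 limitation above, here's a back-of-envelope check. The 4400x2250 total timing (active plus blanking for 4K60) and the nominal TMDS link rates are the standard published figures, but treat this as an illustrative sketch rather than an exact link-budget calculation:

```python
# Rough bandwidth check: why early HDMI 2.0 silicon fell back to 4:2:0.
# Assumes the standard CTA-861 4K60 timing (4400x2250 total, incl. blanking)
# and nominal HDMI TMDS rates with 8b/10b coding overhead.

def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Raw video payload in Gbit/s for a given display timing."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# 4K60 RGB / 4:4:4 at 8 bit: 24 bits per pixel
rgb = required_gbps(4400, 2250, 60, 24)          # ~14.26 Gbps

# 4:2:0 subsampling halves the chroma: 12 bits per pixel on average
ycbcr420 = required_gbps(4400, 2250, 60, 12)     # ~7.13 Gbps

# Effective payload after 8b/10b coding (x0.8 of the raw link rate)
hdmi14_payload = 10.2 * 0.8                      # 8.16 Gbps
hdmi20_payload = 18.0 * 0.8                      # 14.4 Gbps

print(f"4K60 RGB needs   {rgb:.2f} Gbps (HDMI 2.0 payload: {hdmi20_payload:.1f})")
print(f"4K60 4:2:0 needs {ycbcr420:.2f} Gbps (HDMI 1.4 payload: {hdmi14_payload:.2f})")
```

The takeaway: 4:2:0 squeezes 4K60 into HDMI 1.4-class bandwidth, while full RGB only just fits the HDMI 2.0 payload rate, which is consistent with early chipsets shipping 4:2:0-only support.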
I use it, but truly... fuck HDMI. There is so much wrong with it on both fundamental and practical levels, and they don't appear to be getting their shit together in the slightest. Considering that DP 1.3 has an HDMI 2.0/HDCP 2.2 compatibility mode, I don't see why something can't be worked out that makes consumers AND Hollywood happy.
I just find it so weird HDMI 2.0 even has a footing. Obviously some companies (like LG and Samsung) get it: they keep up to date with supported features and even went out of their way to support adaptive sync/vblank.
What is holding everyone else back, incompetence?
The HDMI Forum should really be creating a new standard that doesn't rely on two-decade-old DVI micro-packets. HDMI has always been half-assed.
www.startech.com/AV/Displayport-Converters/DisplayPort-to-VGA-Adapter-Converter~DP2VGA
www.amazon.com/dp/B0025ZUF8K/?tag=tec06d-20
www.amazon.com/dp/B00BI3YEQO/?tag=tec06d-20
www.amazon.com/dp/B004C9M7UG/?tag=tec06d-20
DisplayPort has the best compatibility you could wish for!
Changing from HDMI to DisplayPort would presumably require a lot of those $20 converters; it's not practical. The HDMI standard is garbage because it requires per-connector royalties and lacks the cable quality controls that VGA, DVI, and DisplayPort have. An HDMI cable can claim it will work at 300 yards, but virtually every HDMI cable of that length can only carry a tiny resolution and refresh rate. A 300-yard DVI cable, on the other hand, will work as advertised because it was tested using VESA's methodology.
The problem with the micro-packet architecture is that it lacks diversity. It was designed in the 1990s for a single purpose: digital video. For example, for HDMI to add audio support, they literally had to add two more wires, because the micro-packet architecture does not tolerate audio packets. DisplayPort has no such limitation; its packet architecture can carry video, audio, and even USB 3.
DP to DVI-D should require just an adapter, because both are digital. DP to VGA or DVI-A should require a converter, since only DP is digital. With all that said, a DP-to-VGA/DVI-A "converter" should have an external power source and some kind of microchip to convert the signal. If you look at the products above, none of them has that, which leads me to believe they are in fact "adapters". I know it sounds weird and illogical, but that is all I can gather. USB Type-C can also carry a DisplayPort signal, which is kind of interesting. What if you have a keyboard connected to a display, the display's DP connector is connected to a USB Type-C adapter (similar to this: store.apple.com/us/product/MJ1K2AM/A/usb-c-digital-av-multiport-adapter), and that adapter is connected to the PC? You are essentially carrying (USB signals in DisplayPort signals) in USB signals.
Here's a detailed post about active versus passive:
www.overclock.net/t/721931/active-vs-passive-displayport-adapters-the-truth
In short:
Active: has a converter built in
Passive: relies on a converter in the graphics card (no converter, no worky)
armdevices.net/2014/09/19/8k-120hz-demo-by-nhk-japan/
For gaming, there is no combination of GPUs that can handle 4K @ 144 Hz, and probably no CPU for a while that could feed the GPUs fast enough. Maybe that will improve with DX12; we'll see. As far as 8K goes, LG supposedly outed Apple as releasing an 8K iMac later this year. The story is that the image on the screen will be lifelike, which I guess has its appeal for professionals, but not for entertainment.
I don't see 8K movies and cable as a reality until serious effort is put into making the necessary broadband bandwidth practical and affordable. Bear in mind that an 8K TV would have the same number of pixels as 16 Full HD TVs.
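The 16x figure in the comment above checks out with simple pixel arithmetic:

```python
# Sanity check on the "8K = 16x Full HD" claim: compare total pixel counts.
full_hd = 1920 * 1080      # 2,073,600 pixels
uhd_8k = 7680 * 4320       # 33,177,600 pixels
print(uhd_8k // full_hd)   # -> 16
```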
For gaming at 8K @ 120 Hz, that is a looooong way off imo. We will probably see a 3-way high-end SLI/CrossFire setup that could handle it one day, but what CPU could feed those GPUs fast enough?
Edit: I went and looked at some reviews, and the CPU side of things should not be a problem for most games, even as games get more demanding. Some RTS and open-world games will probably have issues at 120/144 FPS, though.
Also, it seems DisplayPort 1.4a will support 8K@120Hz in 2016 (DP 1.3 already supports 8K@60Hz).
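For a sense of scale, here's what uncompressed 8K@120 would demand versus DP 1.3's payload rate. The HBR3 figure (32.4 Gbps raw, 8b/10b coded) is the published DP 1.3 link rate; the video figure below counts active pixels only at 8-bit RGB, so the real timing would need even more. Either way, such modes clearly imply chroma subsampling or compression rather than raw transport:

```python
# Back-of-envelope: uncompressed 8K @ 120 Hz payload vs. DP 1.3 capacity.
# Active pixels only (no blanking), 8-bit RGB = 24 bits per pixel.
raw = 7680 * 4320 * 120 * 24 / 1e9   # Gbit/s of raw video data
dp13_payload = 32.4 * 0.8            # HBR3 x4 lanes after 8b/10b coding

print(f"8K120 RGB raw: {raw:.1f} Gbps vs DP 1.3 payload: {dp13_payload:.2f} Gbps")
```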
That way, both resolution and refresh rate would be taken out of the equation as limiting factors for quality.
Cable and movies don't offer much promise anytime soon.
As for gaming have a look at the Steam Hardware Survey. I know it's not accurate but it can give an idea of where we are right now as far as gaming.
store.steampowered.com/hwsurvey?platform=pc click on Primary Display Resolution
27.38% of gamers on 1366 X 768 resolution
34.28% of gamers on 1920 X 1080 resolution
0.05% of gamers on 3840 X 2160 resolution (1 out of 2,000)
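That last share converts to the "1 out of 2,000" figure like so:

```python
# Convert a survey percentage to a "1 out of N" figure.
share_percent = 0.05
one_out_of = round(1 / (share_percent / 100))
print(one_out_of)   # -> 2000
```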
My point is that there isn't a lot of pressure to push the HDMI/DisplayPort specs much at this time.
Because I'm pretty much 100% sure it will all be streaming by then, and seeing as some people already have 1 Gbps connections, that should work out just fine.
It takes time before something new becomes a standard and people adopt it; right now, because of the current limitations of video cards, cables, screens, etc., people do not adopt.
I know I won't adopt 4K until A) we can do it at 120 Hz (not the BS 30 Hz or the dated 60 Hz we have now), and B) video cards have the power, and then some, to actually run games at those kinds of specs (none of which exist today; the Titan X is, for me, massively disappointing in that regard).