Sunday, December 20th 2009
SAPPHIRE Solves Eyefinity Issue with Active DisplayPort Adapter
A feature of the latest SAPPHIRE HD 5000 series of graphics cards is the new ATI Eyefinity mode, which enables games and other applications to be run on three screens treated as one continuous display area. Now with the SAPPHIRE Eyefinity adapter, standard DVI monitors can be used for all three screens.
In addition to spectacular image clarity, speed and visual effects, the SAPPHIRE HD 5000 series supports the new multi-monitor mode known as ATI Eyefinity. This allows a single GPU to display a choice of images over an array of several monitors. The cards in this series support up to three monitors, with a resolution of up to 7680 x 1600. This opens up exciting possibilities not just for multi-screen gaming, but also for information systems, multimedia systems and promotional displays. However, with this family of cards, although two of the screens can be connected directly to the DVI or HDMI outputs on the card, the third display has to be driven from the DisplayPort output. Some users have expressed concern about the cost or availability of DisplayPort-compatible monitors, or a wish to match existing monitor styles.
Now SAPPHIRE has introduced an active powered DisplayPort to DVI converter, which allows any standard DVI monitor to be used as the third screen and enabled in Eyefinity mode. This allows users to update their existing multi-monitor setup to use Eyefinity, or to add a lower cost DVI monitor to complete their Eyefinity system. The SAPPHIRE converter is simply plugged into the DisplayPort output of the graphics card, together with a standard USB connection to provide power. A standard DVI monitor cable can then be connected to the female DVI connector on the unit.
This series of cards is supported by AMD's DirectX 11 WHQL certified graphics driver which delivers support for all of the key DirectX 11 level features required for new gaming experiences and acceleration of next generation high performance applications.
79 Comments on SAPPHIRE Solves Eyefinity Issue with Active DisplayPort Adapter
basically you should soon be able to run your PC monitor on the same cable as your 1080p 50" OLED HDTV :D
Also, consider the fact that it's cheaper to buy several slightly smaller screens and hook them all together than it is to buy one larger screen (TVs excepted).
In games, it's nice to be able to see in more than one direction at a time.
Most likely all they're doing is repackaging another company's DisplayPort adapter.
and it will be available wherever you can buy Sapphire cards, so no need to order from the US!!!
no need to go with the Dell or Apple adapters that don't work 88% of the time ;P
Good for Sapphire though; there is a solution for Eyefinity people without a DP monitor. Cool, I guess?
I couldn't find an active DisplayPort adapter anywhere in the UK; now I can go out and buy another monitor that matches the two I already have :)
I don't know if this helps anyone, but I read somewhere that the 5*** series only has two clock generators of the kind needed for HDMI/DVI, meaning you can only send two HDMI/DVI signals (through whichever ports) and the third must be native DisplayPort.
I think the goal for DisplayPort is to replace both. I think it is too soon to tell if it will succeed. Virtually nothing supports DisplayPort now and zero backwards compatibility support is a major blow to its implementation. If it does become the unified standard, it won't be for at least five years.
Look how many people are using HDMI for their entertainment system. HDCP doesn't benefit them at all but they clearly don't care or they'd still be using component cables.
I hope no one buys DisplayPort, especially in the entertainment industry. If the entertainment industry sticks with HDMI, the computer industry will most likely stick with DVI. Given enough time of very limited usage, the standard will die.
HDMI was old before it was even conceptualized. The computing industry needs to create a new benchmark that maintains backwards compatibility (like DVI did with analog) and yet moves forward. DisplayPort falls short on backwards compatibility. If it had that, I think it would be a reasonable path forward.
1. Cheap adapter to convert back to DVI. Big plus.
2. Audio. While useless to gamers, it's a big boost for home theatre users and consoles.
3. Long cables, thin plug (DVI was massive!). I'm running 15 m of HDMI cable between rooms. Try that with DVI.
There are limitations with HDMI audio, but we can't help that. Stupid HDCP limits TV/screen outputs to stereo, so even if you run HDMI, you need it to reach your sound system BEFORE it reaches your TV/monitor if you want 5.1 audio.
I think DP has at least a few advantages over DVI: a more convenient connector, a little more bandwidth (8.64 Gbit/s vs. DVI's 7.92 Gbit/s), fiber optic support... And it supports 8-channel, 24-bit/192 kHz audio transfer. Also, v1.2 should double the bandwidth. Of course, it depends on how it actually works in practice, rather than on specs.
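To put those quoted data rates in perspective, here is a rough back-of-envelope check of how much bandwidth a single high-resolution panel actually needs. The 20% blanking overhead and 24 bits per pixel are illustrative assumptions, not official spec figures:

```python
# Rough link-bandwidth sanity check; numbers are illustrative assumptions.

def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
    """Approximate raw video bandwidth in Gbit/s.
    `blanking` models an assumed ~20% overhead for blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

# A single 2560x1600 panel at 60 Hz (one third of a 7680x1600 Eyefinity wall):
need = required_gbps(2560, 1600, 60)
print(f"2560x1600@60: {need:.2f} Gbit/s")  # about 7.08 Gbit/s

# Against the data rates quoted above:
DP_1_1 = 8.64   # DisplayPort 1.1 effective data rate (Gbit/s)
DVI_DL = 7.92   # dual-link DVI effective data rate (Gbit/s)
print("fits DP 1.1:", need <= DP_1_1)           # True
print("fits dual-link DVI:", need <= DVI_DL)    # True
```

So under these assumptions both links can carry one 2560x1600 panel, with DisplayPort leaving a little more headroom.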
That UDI looked really good; shame they cancelled it. 16 Gbit/s.
img.hexus.net/v2/internationalevents/idf2006march/udi_cable.JPG
Pre-post EDIT: But UDI didn't support audio transfer...
To topic:
Wouldn't this work? $15...
www.startech.com/item/DP2DVI-DisplayPort-to-DVI-Cable-Adapter.aspx
#2 Digital audio sounds like crap so I think the only real advantage there is one less cable to mess with.
#3 [rant]DVI is capable of exceeding the maximum length of HDMI because DVI cables are generally well shielded by comparison. I mean, before HDMI showed up, DVI cables were thick, high-quality beasts that nothing short of a microwave could penetrate. Most DVI cables (especially those packaged with monitors) you see now have the same internal wiring as an HDMI cable (cheap, limited/no shielding). HDMI lowered the DVI standards of signal attenuation.[/rant] Since they now use equally crappy cable, they get equally crappy distance.
HDMI plugs are obviously smaller, but look at what you are giving up. Instead of pins, they use sliding contacts. Pins were huge up through DVI because the signal degradation with pins is far less than with sliding contacts. Pins were, therefore, critical to keeping analog in the DVI standard. If you are going to run 100'+ of video cable, most likely you are using analog component video or DVI with boosters, not HDMI. HDMI was meant for the home theater, not professional applications. HDMI plugs, therefore, are a step down, not up. Sure, DVI plugs take longer to install, but once those screws are in, there's not much chance they are going to come undone. HDMI either pops out or it breaks. Cheap plastics for home theater versus durable, long-lasting metal with plastic casing for professional use. Royalties are a PITA, but manufacturers will go where the money is. They can't just start selling DisplayPort products because everyone not using Apple will still buy HDMI. In computers, audio is routed from a sound card (higher SNR and discrete audio processing) to the speakers (rarely incorporated in the monitor unless it is some cheap monitor for business, buy-by-the-dozen use).
In home theaters, audio is often handled by a receiver. There's also the nagging issue of audio sounding best through analog systems, so professional installations (and I don't mean sub-$10,000 USD) still opt for keeping audio analog as much as possible, which means they don't want it in the same cable as video.
Bottomline: UDI would be the best solution for computer video. Whether or not the home entertainment industry picks it up like they did with DVI is up to them.
There are monitors out there right now pushing resolutions beyond 5 megapixels. DisplayPort may make sense for TVs, but it doesn't make sense for computers. That adapter is only single-link DVI. It is unclear whether this Sapphire adapter is dual-link or single-link; if it were dual-link, that might justify the price differential.
HDMI's main advantage is that everything uses it: consoles, set-top boxes, DVD players, PCs, camcorders (mini HDMI) and so on.
While it may not have any LARGE advantages, that's a good reason for it to become a standard - look at USB for an example. This is the video version of USB, which everything supports. Just because you can find situations where it DOESN'T have advantages doesn't mean the situations where it does don't exist.
Monitors progress at a lightning rate compared to TVs. When TV was doing 320x240 analog, computers were doing 640x480 analog. When TVs were doing 640x480 analog, computers were doing 2048x1536 analog. When TVs were doing 1080p digital, computers were doing 3840x2160 digital.
Everyone wants to unify everything, and it is ruining quality on a per-device basis. First there was HDMI, where they stuck audio with the DVI signal. Now DisplayPort isn't even required to carry anything at all: video (optional), audio (optional). There are no choices anymore. You plug it in and the equipment tells everything connected what it is allowed to do, regardless of what you, the consumer, actually want. This is getting plain silly. Before long, you won't be able to do anything without signing a contract and connecting it to the Internet, with which they see, hear, and whatever else they conjure up, without you having any authority beyond installing the equipment or not (and we know people would blindly agree--the end justifies the means). The door is wide open to that doomsday prophecy with DisplayPort, and once that door is open, it won't close.
Information is more valuable than platinum and more addicting than heroin. If you don't believe me, ask Google. That's the foundation of their success story.
I'm not a fan of USB either (largely due to bandwidth allocation) but that is a discussion for another time.
Oh, and HDMI, DisplayPort, and UDI connectors all prevent 100% shielded coverage by design. There is always leakage at the connectors, where DVI has none.
HDMI is good to go in the entertainment industry but it is time for the computer industry to move on. HDMI is old for that segment of the market.