Sunday, December 20th 2009

SAPPHIRE Solves Eyefinity Issue with Active DisplayPort Adapter

A feature of the latest SAPPHIRE HD 5000 series of graphics cards is the new ATI Eyefinity mode, which enables games and other applications to be run on three screens treated as one continuous display area. Now with the SAPPHIRE Eyefinity adapter, standard DVI monitors can be used for all three screens.

In addition to spectacular image clarity, speed and visual effects, the SAPPHIRE HD 5000 series supports the new multi-monitor mode known as ATI Eyefinity. This allows a single GPU to display a choice of images over an array of several monitors. The cards in this series support up to three monitors, with a resolution of up to 7680 x 1600. This opens up exciting possibilities not just for multi-screen gaming, but also for information systems, multimedia systems and promotional displays.
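To put that 7680 x 1600 figure in perspective, here is a rough sketch of the uncompressed pixel data involved. The 60 Hz refresh and 24-bit colour are illustrative assumptions, not figures from the article, and real links add blanking and line-coding overhead on top of this:

```python
# Rough, uncompressed pixel-data estimate. Ignores blanking intervals and
# 8b/10b link coding, so real-world link requirements are somewhat higher.
def data_rate_gbps(width, height, refresh_hz=60, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for one video mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

per_monitor = data_rate_gbps(2560, 1600)   # one 2560x1600 panel
eyefinity   = data_rate_gbps(7680, 1600)   # three panels side by side

print(f"per monitor: {per_monitor:.1f} Gbit/s")   # ~5.9 Gbit/s
print(f"full 7680x1600: {eyefinity:.1f} Gbit/s")  # ~17.7 Gbit/s
```

Since dual-link DVI carries at most about 7.92 Gbit/s of pixel data, the full desktop cannot travel over one legacy link, which is why each monitor gets its own output on the card.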
However, although two of the screens can be connected directly to the card's DVI or HDMI outputs, the third display has to be driven from the DisplayPort output. Some users have expressed concern about the cost or availability of DisplayPort-compatible monitors, or a wish to match their existing monitors' styling.

Now SAPPHIRE has introduced an active powered DisplayPort to DVI converter, which allows any standard DVI monitor to be used as the third screen and enabled in Eyefinity mode. This allows users to update their existing multi-monitor setup to use Eyefinity, or to add a lower cost DVI monitor to complete their Eyefinity system. The SAPPHIRE converter is simply plugged into the DisplayPort output of the graphics card, together with a standard USB connection to provide power. A standard DVI monitor cable can then be connected to the female DVI connector on the unit.

This series of cards is supported by AMD's WHQL-certified DirectX 11 graphics driver, which delivers support for all of the key DirectX 11 features required for new gaming experiences and for the acceleration of next-generation high-performance applications.

79 Comments on SAPPHIRE Solves Eyefinity Issue with Active DisplayPort Adapter

#26
wiak
DVI needs a sync signal or something, DisplayPort does not. btw, did you know DisplayPort will later support chaining displays on ONE cable? makes sense for it to replace DVI. btw it even supports sound, so no need for both HDMI and DVI anymore :p

basically you should soon be able to run your PC monitor on the same cable as your 1080p 50" OLED HDTV :D
Posted on Reply
#27
SummerDays
buggalugs: "ye its not as cool as it first seemed but i dont need 3 monitors anyway. 97% of the market will only use 1 monitor."
Good quality LCD monitors are becoming much cheaper, and some people have TVs which need to be driven by a graphics card at the same time.

Also, consider that it's cheaper to buy several slightly smaller screens and hook them all together than it is to buy one larger screen (TVs excepted).

In games, it's nice to be able to see in more than one direction at a time.
Posted on Reply
#29
SummerDays
This post makes it sound like Sapphire has solved the issue.

Most likely all they're doing is repackaging another company's DisplayPort adapter.
Posted on Reply
#30
eidairaman1
The Exiled Airman
Seems all people around here can do is bitch and complain. Smart move, Sapphire.
Posted on Reply
#31
wiak
this sapphire adapter should be cheaper than most other DP to DVI adapters too :p
and it will be available wherever you can buy sapphire cards, so no need to order from the US!!!
no need for the Dell or Apple adapters that 88% of the time don't work ;P
Posted on Reply
#32
department76
i see DP and DVI being like USB vs. FireWire... eventually one will die off and be pretty pointless to have, like my untouched (and most likely staying that way) FireWire port hahahaha.

good for sapphire though, there is a solution for eyefinity people without a DP monitor. cool i guess?
Posted on Reply
#33
GSG-9
department76: "i see DP and DVI being like usb vs. firewire... eventually one will die off and be pretty pointless to have."
You mean DP and HDMI?
Posted on Reply
#34
Conflict0s
Thank you Sapphire! :D
I couldn't find an active DisplayPort adapter anywhere in the UK; now I can go out and buy another monitor that matches the two I already have :)

I don't know if this helps anyone but I read somewhere that the 5*** series only has 2 clock generators that are needed for HDMI/DVI, meaning you can only send 2 HDMI/DVI signals (through whichever ports) and the third must be native display port.
Posted on Reply
#35
FordGT90Concept
"I go fast!1!11!1!"
GSG-9: "You mean DP and HDMI?"
Computer monitors don't need audio which is what differentiates between HDMI and DVI. DVI is currently far more popular than HDMI primarily because it supports larger displays with dual-link capability and it has been around longer. DisplayPort is trying to replace DVI, but while they're at it, they're trying to kill HDMI as well. Again, DisplayPort makes absolutely no sense but, because of the companies currently backing it, it won't die like it should. DisplayPort is here to stay, no matter how useless and unnecessary it is.
Posted on Reply
#36
GSG-9
FordGT90Concept: "Computer monitors don't need audio ... DisplayPort is here to stay, no matter how useless and unnecessary it is."
Yes, but HDMI is being pushed as well. No one is pushing DVI; I don't see it surviving much longer with only customers supporting it. They will revise the HDMI and DisplayPort specifications to address larger monitors.
Posted on Reply
#37
FordGT90Concept
"I go fast!1!11!1!"
DVI is currently the computing standard whereas HDMI is the home-theater standard.

I think the goal for DisplayPort is to replace both. I think it is too soon to tell if it will succeed. Virtually nothing supports DisplayPort now and zero backwards compatibility support is a major blow to its implementation. If it does become the unified standard, it won't be for at least five years.
Posted on Reply
#38
GSG-9
DP does not have any real edge over HDMI besides that it has content protection. Which people traditionally hate.
Posted on Reply
#39
department76
FordGT90Concept: "DisplayPort is here to stay, no matter how useless and unnecessary it is."
my point exactly, just like firewire. hahahaha
Posted on Reply
#40
GSG-9
department76: "my point exactly, just like firewire. hahahaha"
I sure hope it's something like that. HP's monitors with DP are way more expensive for no reason, and personally I don't want it.
Posted on Reply
#41
FordGT90Concept
"I go fast!1!11!1!"
GSG-9: "DP does not have any real edge over HDMI besides that it has content protection. Which people traditionally hate."
HDMI/DVI has HDCP support. DisplayPort has HDCP and DPCP support.

Look how many people are using HDMI for their entertainment system. HDCP doesn't benefit them at all but they clearly don't care or they'd still be using component cables.

I hope no one buys DisplayPort, especially in the entertainment industry. If the entertainment industry sticks with HDMI, the computer industry will most likely stick with DVI. Given enough time of very limited usage, the standard will die.
Posted on Reply
#42
DrPepper
The Doctor is in the house
Meh, I'd prefer it if AMD had used 2 x DVI and 2 x HDMI, or 1 x DVI and 3 x HDMI. Assuming there aren't any technical limitations, those would be the most logical choices since DP has so few applications.
Posted on Reply
#43
FordGT90Concept
"I go fast!1!11!1!"
I'm not sold on HDMI being the best for computers yet, for several reasons: the audio signal is worthless in most applications, HDCP, limited shielding, and the royalties required.

HDMI was old before it was even conceptualized. The computing industry needs to create a new benchmark that maintains backwards compatibility (like DVI did with analog) and yet moves forward. DisplayPort falls down on backwards compatibility. If it had that, I think it would be a reasonable path forward.
Posted on Reply
#44
Mussels
Freshwater Moderator
FordGT90Concept: "I'm not sold on HDMI being the best for computers yet ..."
HDMI has a few advantages. they may be SMALL advantages, but i'll list them.


1. Cheap adaptor to convert back to DVI. Big plus.

2. Audio. While useless to gamers, it's a big boost to home theatre users and consoles.

3. Long cables, thin plug (DVI was massive!) - I'm running 15 m of HDMI cable between rooms. Try that with DVI.


There are limitations with HDMI audio, but we can't help that. Stupid HDCP limits TV/screen outputs to stereo, so even if you run HDMI, you need to have it reach your sound system BEFORE it reaches your TV/monitor if you want 5.1 audio.
Posted on Reply
#45
Meizuman
FordGT90Concept: Composite or component video? Composite video is just a matter of extracting the three colors and outputting a matching value on horizontal and vertical refresh rates. That can be done on the fly with a matter of a few ms delay (not detectable by the eyes). Component is more involved because you have to convert binary into analog.

The problem with DisplayPort -> DVI signaling is that they have different communication standards. Still, I'm sure it is not impossible to engineer a chip that would perform the conversion in a time frame that it can't be detected by human senses. It just cost more--a lot more than just rearranging a few pins into a different arrangement.


The point being is that AMD made a bad call. They should have taken a hint from Apple users with all their DisplayPort connectivity issues. Truth be told, I'm as bitter with DisplayPort as I am with ATSC -> NTSC. In fact, I am more so bitter about DisplayPort than anything else. Simply put, it is a bad standard (very limited backwards compatibility, limited cable length, introduces a new form of DRM: DPCP, very little bandwidth gain compared to dual-link DVI, and the list goes on). DisplayPort is "replacing" DVI because industry leaders (HP, Dell, Apple, Intel, to name a few) insist, not because it makes any sense. If it were up to me, we'd be talking about Unified Display Interface (the true successor to DVI), not DisplayPort.
I haven't really been keeping track of the different monitor connection standards, but now that you've brought it up, I must say I'm a bit worried about what the future holds. First I thought that DP would be the new standard... and could possibly take over from HDMI because of the lack of royalty fees.

I think DP has at least a few aspects better than DVI... a more convenient connector, a little more bandwidth (8.64 Gbit/s vs. DVI's 7.92 Gbit/s), fiber optic support... And it supports 8-channel, 24-bit/192 kHz audio. Also, v1.2 should double the bandwidth. Of course it depends on how it actually works, rather than on the specs.

That UDI looked really good; shame they cancelled it. 16 Gbit/s!
img.hexus.net/v2/internationalevents/idf2006march/udi_cable.JPG

Pre-post EDIT: But UDI didn't support audio transfer...

To topic:

Wouldn't this work? $15...
www.startech.com/item/DP2DVI-DisplayPort-to-DVI-Cable-Adapter.aspx
Posted on Reply
#46
FordGT90Concept
"I go fast!1!11!1!"
Mussels: 1. Cheap adaptor to convert back to DVI. Big plus.

2. Audio. While useless to gamers, its a big boost to home theatre users and consoles.

3. Long cables, thin plug (DVI was massive!) - i'm running 15M of HDMI cable between rooms. try that with DVI.
#1 It is kind of ironic that DVI was developed for computers and was such a good standard that they used it to create HDMI; however, their roles still remain distinct. The cheap adapters are merely a result of the shared underlying standard, nothing more. I agree, but it's really more a coincidence than an intentional advantage.

#2 Digital audio sounds like crap so I think the only real advantage there is one less cable to mess with.

#3 [rant]DVI is capable of exceeding the maximum length of HDMI because DVI cables are generally well shielded by comparison. I mean, before HDMI showed up, DVI cables were thick, high quality beasts that nothing short of a microwave could penetrate. Most DVI cables you see now (especially those packaged with monitors) have the same internal wiring as an HDMI cable (cheap, limited/no shielding). HDMI lowered the DVI standards of signal attenuation.[/rant] Since they now use equally crappy cable, they get equally crappy distance.

HDMI plugs are obviously smaller, but look at what you are giving up. Instead of pins, they use sliding contacts. Pins were huge up to DVI because the signal degradation with pins is far less than with sliding contacts. Pins were, therefore, critical to keeping analog in the DVI standard. If you are going to run 100'+ of video cable, most likely you are using analog component video or DVI with boosters, not HDMI. HDMI was meant for the home theater, not professional applications. HDMI plugs, therefore, are a step down, not up. Sure, DVI plugs take longer to install, but once those screws are in, there's not much chance they are going to come undone. HDMI either pops out or it breaks. Cheap plastics for home theater versus durable, long-lasting metal with plastic casing for professional use.
Meizuman: "First I thought that DP will be the new standard... and could possibly take over hdmi because of no royalty fees."
Royalties are a PITA but manufacturers will go where the money is at. They can't just start selling DisplayPort products because everyone not using Apple will still buy HDMI.
Meizuman: "Pre-post EDIT: But UDI didn't support audio transfer..."
In computers, audio is routed from a sound card (higher SNR and discrete audio processing) to the speakers (rarely incorporated in the monitor unless it is some cheap buy-by-the-dozen business monitor).

In home theaters, audio is often handled by a receiver. There's also that nagging issue of audio sounding best through analog systems, so professional installations (and I don't mean sub-$10,000 USD) still opt for keeping audio analog as much as possible, which means they don't want it in the same cable as video.


Bottom line: UDI would be the best solution for computer video. Whether or not the home entertainment industry picks it up like they did with DVI is up to them.

There are monitors out there right now pushing resolutions beyond 5 megapixels. DisplayPort may make sense for TVs, but it doesn't make sense for computers.
Meizumanwww.startech.com/item/DP2DVI-DisplayPort-to-DVI-Cable-Adapter.aspx
That adapter is only single-link DVI. It is unclear whether this Sapphire adapter is dual-link or single-link. If it is dual-link, that may justify the price differential.
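For anyone wondering where that single-link ceiling sits, here is a rough check against single-link DVI's 165 MHz pixel clock. The 1.05 blanking factor is an assumed approximation for illustration (reduced-blanking timings add roughly 5-12% over the active pixel rate):

```python
# Back-of-the-envelope check of whether a 60 Hz mode fits single-link DVI.
SINGLE_LINK_MHZ = 165.0  # single-link DVI TMDS pixel clock limit

def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.05):
    """Approximate pixel clock in MHz, with an assumed blanking factor."""
    return width * height * refresh_hz * blanking / 1e6

for mode in [(1920, 1200), (2560, 1600)]:
    clock = approx_pixel_clock_mhz(*mode)
    link = "single-link OK" if clock <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@60: ~{clock:.0f} MHz -> {link}")
```

By this estimate, 1920x1200 squeezes under the single-link limit while 2560x1600 does not, which is why a single-link adapter caps the third Eyefinity screen at the smaller panel sizes.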
Posted on Reply
#47
Mussels
Freshwater Moderator
ATI have said that only active adaptors work on their cards. passive ones only work on other products.


HDMI's main advantage is that everything uses it: consoles, set top boxes, DVD players, PCs, camcorders (mini HDMI) and so on.

While it may not have any LARGE advantages, that's a good reason for it to become a standard - look at USB for an example. This is the video version of USB, which everything supports. Just because you can find situations where it DOESN'T have advantages doesn't mean the situations where it does don't exist.
Posted on Reply
#48
FordGT90Concept
"I go fast!1!11!1!"
How often does one connect their console, set top box, DVD player, or camcorder to their computer? Camcorder, I'll give you; that makes sense. You need a fast plug-in/unplug connection there. The rest ought to be connected directly to your TV or receiver. Ehm, HDMI on a computer only makes sense as a video/audio input--not an output.

Monitors progress at a lightning rate compared to TVs. When TV was doing 320x240 analog, computers were doing 640x480 analog. When TVs were doing 640x480 analog, computers were doing 2048x1536 analog. When TVs were doing 1080p digital, computers were doing 3840x2160 digital.

Everyone wants to unify everything and it is ruining quality on a per-device basis. First there was HDMI, where they stuck audio with the DVI signal. Now DisplayPort isn't even meant to carry anything at all: video (optional), audio (optional). There are no choices anymore. You plug it in and the equipment tells everything connected what it is allowed to do, regardless of what you, the consumer, actually want. This is just getting plain silly. Before long, you won't be able to do anything without signing a contract and connecting it to the Internet, with which they see, hear, and whatever else they conjure up, without you having any authority beyond installing the equipment or not (and we know people would blindly agree--the end justifies the means). The door is wide open to that doomsday prophecy with DisplayPort, and once that door is open, it won't close.

Information is more valuable than platinum and more addicting than heroin. If you don't believe me, ask Google. That's the foundation of their success story.


I'm not a fan of USB either (largely due to bandwidth allocation) but that is a discussion for another time.


Oh, and HDMI, DisplayPort, and UDI connectors all preclude 100% shield coverage by design. There is always leakage at the connectors, whereas DVI has none.
Posted on Reply
#49
Mussels
Freshwater Moderator
FordGT90Concept: "How often does one connect their console, set top box, DVD player, or camcorder to their computer? ... HDMI on a computer only makes sense as a video/audio input--not an output."
i never mentioned anything about a COMPUTER. all those devices can be hooked up to a TV or a PC display - when they share inputs, there is no real distinction.
Posted on Reply
#50
FordGT90Concept
"I go fast!1!11!1!"
I'm trying to show why the two need separate, but compatible, standards. Monitors are intended for a viewing distance of less than 2 feet, whereas TVs are designed for 6+ feet. In order to get a good picture at less than two feet away, you need a really high resolution (dpi). Conversely, the farther away you get from the display, the lower the dpi necessary to make it look exactly the same to the eye. More DPI means more bandwidth, and more bandwidth means more robust cables. I think it is a bad idea to attempt to merge the two. I mean, HDMI is just now getting market acceptance when DVI has been around a long time.

HDMI is good to go in the entertainment industry but it is time for the computer industry to move on. HDMI is old for that segment of the market.
Posted on Reply