# SAPPHIRE Solves Eyefinity Issue with Active DisplayPort Adapter



## btarunr (Dec 20, 2009)

A feature of the latest SAPPHIRE HD 5000 series of graphics cards is the new ATI Eyefinity mode, which enables games and other applications to be run on three screens treated as one continuous display area. Now with the SAPPHIRE Eyefinity adapter, standard DVI monitors can be used for all three screens.

In addition to spectacular image clarity, speed and visual effects, the SAPPHIRE HD 5000 series supports the new multi-monitor mode known as ATI Eyefinity. This allows a single GPU to display a choice of images over an array of several monitors. The cards in this series support up to three monitors, with a resolution of up to 7680 x 1600. This opens up exciting possibilities not just for multi-screen gaming, but also for information systems, multimedia systems and promotional displays.
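As a quick sanity check on the numbers above, the 7680 x 1600 figure is simply three 2560 x 1600 panels side by side (a back-of-the-envelope calculation, not anything stated beyond the article):

```python
# The 7680 x 1600 Eyefinity surface quoted above is three 2560 x 1600
# panels arranged side by side.
per_screen_w, per_screen_h = 2560, 1600
columns = 3

total_w = per_screen_w * columns
total_h = per_screen_h
megapixels = total_w * total_h / 1e6

print(f"{total_w} x {total_h} = {megapixels:.1f} megapixels")
# prints: 7680 x 1600 = 12.3 megapixels
```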

However, with this family of cards, although two of the screens can be connected directly to the DVI or HDMI outputs on the card, the third display has to be driven from the DisplayPort output. Some users have expressed concern about the cost or availability of DisplayPort compatible monitors, or a wish to match existing monitor styles.

Now SAPPHIRE has introduced an active powered DisplayPort to DVI converter, which allows any standard DVI monitor to be used as the third screen and enabled in Eyefinity mode. This allows users to update their existing multi-monitor setup to use Eyefinity, or to add a lower cost DVI monitor to complete their Eyefinity system. The SAPPHIRE converter is simply plugged into the DisplayPort output of the graphics card, together with a standard USB connection to provide power. A standard DVI monitor cable can then be connected to the female DVI connector on the unit.

This series of cards is supported by AMD's DirectX 11 WHQL-certified graphics driver, which delivers support for all of the key DirectX 11 features required for new gaming experiences and the acceleration of next-generation high-performance applications.

*View at TechPowerUp Main Site*


----------



## btarunr (Dec 20, 2009)

All yours for GBP 78.

Many thanks to jagd for sending this in.


----------



## Solaris17 (Dec 20, 2009)

omg bta 11k posts GJ


----------



## Lionheart (Dec 20, 2009)

I never knew there was an issue in the first place, but then again, ati driver's lol


----------



## angelkiller (Dec 20, 2009)

For the money, I'd just get a third LCD with a Display Port instead of this.


----------



## kylzer (Dec 20, 2009)

CHAOS_KILLA said:


> I never knew there was an issue in the first place, but then again, ati driver's lol



Clever post


----------



## aj28 (Dec 20, 2009)

Dunno that I would call it an issue, to be fair... DisplayPort is simply a standard before its time, for now at least, and backwards compatibility would only serve to hold it back by unnecessarily increasing its complexity. I'd like to get my hands on some compatible hardware in the near future...


----------



## Zubasa (Dec 20, 2009)

CHAOS_KILLA said:


> I never knew there was an issue in the first place, but then again, ati driver's lol


LOL at you.
Go figure out what DisplayPort is before the driver bashing.


----------



## VulkanBros (Dec 20, 2009)

Zubasa said:


> LOL at you.
> Go figure out what DisplayPort is before the driver bashing.



+1 for that ;-)


----------



## FordGT90Concept (Dec 20, 2009)

I hope there isn't any lag using that adapter.  AMD made a bad call there seeing as DisplayPort just barely got its foot in the door and, to make matters worse, DisplayPort is in no way compatible with DVI/HDMI.


----------



## 1Kurgan1 (Dec 20, 2009)

FordGT90Concept said:


> I hope there isn't any lag using that adapter.  AMD made a bad call there seeing as DisplayPort just barely got its foot in the door and, to make matters worse, DisplayPort is in no way compatible with DVI/HDMI.



I'm not sure what the similarities are between the setup I run and this. But I run an adapter from my PS3 (Composite out) to VGA, and I don't sense any lag compared to running it through HDMI before. I can't imagine this would be any different.


----------



## FordGT90Concept (Dec 20, 2009)

Composite or component video?  Composite video is just a matter of extracting the three colors and outputting a matching value on horizontal and vertical refresh rates.  That can be done on the fly with a matter of a few ms delay (not detectable by the eyes).  Component is more involved because you have to convert binary into analog.

The problem with DisplayPort -> DVI signaling is that they have different communication standards.  Still, I'm sure it is not impossible to engineer a chip that performs the conversion in a time frame that can't be detected by human senses.  It just costs more--a lot more than just rearranging a few pins into a different arrangement.


The point is that AMD made a bad call.  They should have taken a hint from Apple users with all their DisplayPort connectivity issues.  Truth be told, I'm as bitter with DisplayPort as I am with ATSC -> NTSC.  In fact, I am more bitter about DisplayPort than anything else.  Simply put, it is a bad standard (very limited backwards compatibility, limited cable length, introduces a new form of DRM: DPCP, very little bandwidth gain compared to dual-link DVI, and the list goes on).  DisplayPort is "replacing" DVI because industry leaders (HP, Dell, Apple, Intel, to name a few) insist, not because it makes any sense.  If it were up to me, we'd be talking about Unified Display Interface (the true successor to DVI), not DisplayPort.


----------



## Mussels (Dec 20, 2009)

For those that don't get it: the point is that you need three monitors for Eyefinity.

Since DP and DVI aren't compatible, you need a special adaptor - this adaptor lets you use three DVI screens (or HDMI screens, with further adaptors) for Eyefinity.


----------



## Hayder_Master (Dec 20, 2009)

2 ports only? Does that mean a maximum of 4 LCD displays?


----------



## Mussels (Dec 20, 2009)

hayder.master said:


> 2 ports only? Does that mean a maximum of 4 LCD displays?



its one display port to one DVI port - the other plug is USB for power.


To be honest, this does confuse me.

With 2x DVI, 1x HDMI and 1x DP, I don't see why users can't use an HDMI to DVI adaptor/cable for Eyefinity without needing this expensive adaptor.

Is there some limitation with one of those DVI ports not working if the HDMI port is in use?


----------



## Zubasa (Dec 20, 2009)

Mussels said:


> its one display port to one DVI port - the other plug is USB for power.
> 
> 
> 
> ...


Exactly, that HDMI port is just an internal DVI to HDMI adaptor.
This is really a bummer, TBH.


----------



## FordGT90Concept (Dec 20, 2009)

I'm guessing it is because of how the card is wired internally.  The HDMI out is probably linked to DVI1 or DVI2.  The DisplayPort, on the other hand, is akin to DVI3--it is not linked to DVI1 or DVI2 in any way.  I'm guessing AMD put some crazy hardware in there in order to support DisplayPort (it being so incompatible with DVI and HDMI) and needed a way to market it as something special; thus, Eyefinity was born.


----------



## Mussels (Dec 20, 2009)

It's probably linked to the one next to it, actually.


So your combinations for eyefinity are:

DVI, HDMI, DP
DVI, DVI, DP
DVI, DVI, DVI (with this fancy adaptor)


----------



## lemonadesoda (Dec 20, 2009)

Wow. There I was, just about to buy a 5770 to drive three identical DVI monitors, and now this... discovery of marketing spin, and that Eyefinity is NOT compatible with 3x DVI without extra (relatively significant) expense AND DONGLES.

I'm pretty embarrassed, because somewhere else in this forum I might have even _recommended_ someone swap out an asymmetric GPU setup for a single 5770 solution to drive 3 monitors. Man, ATI made some bad decisions there. And why? ALL FOR BLXXDY BLU-RAY DRM. ATI are giant tits, because the people that would find Eyefinity a feature... e.g. workstation use... do not want DRM or Blu-ray compatibility. Big boyzTM don't do Blu-ray on their workstations. They've got a home cinema for that...


----------



## Mussels (Dec 20, 2009)

tis a bit daft - was too early for displayport for a requirement, i think


----------



## jagd (Dec 20, 2009)

It is nothing; finding an adapter was harder than finding a 58** card. The problem was that only DELL (the OEM is Bizlink, same adapter) carried active DP adapters (the Apple adapters had problems), and not outside the US (even Canadians could not find one; think about the rest of the world), plus one Netherlands firm in Europe: http://kabeltje.com/accell-displayport-naar-dvid-dual-link-adapter-25cm-p-1628.html . The Sapphire adapter will be more widespread and the price will drop over time, I hope.

If you have a monitor with DP you are OK and don't need an adapter, but DP monitors are limited and not widespread.


lemonadesoda said:


> Wow. There I was, just about to buy a 5770 to drive three identical DVI monitors, and now this... discovery of marketing spin, and that Eyefinity is NOT compatible with 3x DVI without extra (relatively significant) expense AND DONGLES.


----------



## kylew (Dec 20, 2009)

CHAOS_KILLA said:


> I never knew there was an issue in the first place, but then again, ati *driver's *lol



Your punctuation, lol.

Then your stupidity. DisplayPort is a display output; what's it got to do with drivers? (No ' this time.)


----------



## SummerDays (Dec 20, 2009)

Mussels said:


> its one display port to one DVI port - the other plug is USB for power.
> 
> 
> 
> ...



No, you can't use both DVI ports and the HDMI port at the same time.  What you really need is for the third monitor to support DisplayPort.

BTW, DisplayPort also comes with copy protection.

There is a version using Mini DisplayPort that will support up to 6, yes 6, monitors!


----------



## SummerDays (Dec 20, 2009)

FordGT90Concept said:


> I'm guessing it is because of how the card is wired internally.  The HDMI out is probably linked to DVI1 or DVI2.  The DisplayPort, on the other hand, is akin to DVI3--it is not linked to DVI1 or DVI2 in any way.  I'm guessing AMD put some crazy hardware in there in order to support DisplayPort (it being so incompatible with DVI and HDMI) and needed a way to market it as something special; thus, Eyefinity was born.



That would be a pretty good guess.  They needed to add compatibility with DisplayPort, and then they realised what it meant, said "Mein Gott!", slapped themselves in the forehead and ran around shouting "we can now support three monitors!".

However, you can hook up a DisplayPort device using a passive DP to DVI adaptor.


----------



## Disparia (Dec 20, 2009)

Wiki said:
			
		

> The display unit on the Evergreen family of GPUs was completely replaced with one that has two DACs which are used to drive the DVI ports in analog mode (for example, when a DVI to VGA converter is attached to a DVI port), six digital transmitters that can output either a DisplayPort signal or a TMDS signal which is used for either DVI or HDMI, and two clock signals needed to drive the digital outputs in TMDS mode. Dual-link DVI displays use two of the six TMDS/DisplayPort transmitters and one clock signal each. Single-link DVI displays and HDMI displays use one TMDS/DisplayPort transmitter and one clock signal each. DisplayPort displays use one TMDS/DisplayPort transmitter and zero clock signals. A DisplayPort adaptor or dongle can be used to convert a DisplayPort signal to another type of signal like VGA, single or dual link DVI, or HDMI if more than two non-DisplayPort displays need to be connected to a normal Radeon HD 5870 or Radeon HD 5850 card.[7] The table below shows the maximum possible configurations on a normal Radeon HD 5800/5700 series add-in card.



I don't know enough about their implementation to offer an explanation "why"; I can only give a consumer's rant.

DAMNIT! I would have gladly given up a DAC or two for a third "clock signal", you know, a feature that would have made it easy to utilize Eyefinity. Or at least allow the use of ports/signals on other cards. More cards = more money for your greedy black hearts!
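The transmitter and clock budget described in that quote can be turned into a quick feasibility check. This is a hypothetical model built only from the quoted description (the counts and per-display costs come from it, not from AMD documentation):

```python
# Output budget on an HD 5800-series card, per the quoted description:
# six TMDS/DisplayPort transmitters, but only two TMDS clock signals.
TRANSMITTERS = 6
CLOCKS = 2
MAX_DISPLAYS = 3  # Eyefinity drives at most three screens on these cards

# (transmitters, clocks) consumed by each display type
COST = {
    "single-link DVI": (1, 1),
    "dual-link DVI": (2, 1),
    "HDMI": (1, 1),
    "DisplayPort": (1, 0),  # DP is self-clocked: no TMDS clock needed
}

def can_drive(displays):
    """True if this combination fits within the card's output budget."""
    tx = sum(COST[d][0] for d in displays)
    clk = sum(COST[d][1] for d in displays)
    return len(displays) <= MAX_DISPLAYS and tx <= TRANSMITTERS and clk <= CLOCKS

print(can_drive(["single-link DVI", "HDMI", "DisplayPort"]))      # True
print(can_drive(["single-link DVI", "single-link DVI", "HDMI"]))  # False: needs 3 clocks
```

Any combination with three TMDS-clocked outputs fails the clock budget, which is consistent with why the third screen must be native DisplayPort (or sit behind an active adapter that appears to the card as a DisplayPort device).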


----------



## buggalugs (Dec 20, 2009)

Yeah, it's not as cool as it first seemed, but I don't need 3 monitors anyway. 97% of the market will only use 1 monitor.


----------



## wiak (Dec 20, 2009)

DVI needs a sync signal or something; DisplayPort does not. By the way, did you know that later on DisplayPort can support chaining of displays on ONE cable? It makes sense for it to replace DVI. It even supports sound, so no need for both HDMI and DVI anymore.

Basically, you should soon be able to run your PC monitor on the same cable as your 1080p 50" OLED HDTV.


----------



## SummerDays (Dec 20, 2009)

buggalugs said:


> Yeah, it's not as cool as it first seemed, but I don't need 3 monitors anyway. 97% of the market will only use 1 monitor.



Good quality LCD monitors are becoming much lower in price, and some people have TVs which need to be driven by a graphics card at the same time.

Also, consider the fact that it's cheaper to buy several slightly smaller screens and then hook them all together than it is to buy one larger screen.  (tvs excepted).

In games, it's nice to be able to see in more than one direction at a time.


----------



## WarEagleAU (Dec 20, 2009)

Nice move Sapphire.


----------



## SummerDays (Dec 20, 2009)

This post makes it sound like Sapphire has solved the issue.

Most likely all they're doing is repackaging another company's DisplayPort adaptor.


----------



## eidairaman1 (Dec 20, 2009)

Seems all people around here can do is bitch and complain.  Smart move, Sapphire.


----------



## wiak (Dec 20, 2009)

This Sapphire adapter should be cheaper than most other DP to DVI adapters too, and it will be available wherever you can buy Sapphire cards, so no need to order from the US!!!
No need to go for Dell or Apple adapters that don't work 88% of the time ;P


----------



## department76 (Dec 20, 2009)

I see DP and DVI being like USB vs. FireWire... eventually one will die off and be pretty pointless to have... such as my untouched (and most likely staying that way) FireWire port, hahahaha.

Good for Sapphire though; there is a solution for Eyefinity people without a DP monitor. Cool, I guess?


----------



## GSG-9 (Dec 20, 2009)

department76 said:


> I see DP and DVI being like USB vs. FireWire... eventually one will die off and be pretty pointless to have.



 You mean DP and HDMI?


----------



## Conflict0s (Dec 20, 2009)

Thank you Sapphire! 
I couldn't find an active DisplayPort adapter anywhere in the UK; now I can go out and buy another monitor that matches the two I already have.

I don't know if this helps anyone, but I read somewhere that the 5*** series only has 2 clock generators, which are needed for HDMI/DVI, meaning you can only send 2 HDMI/DVI signals (through whichever ports) and the third must be native DisplayPort.


----------



## FordGT90Concept (Dec 21, 2009)

GSG-9 said:


> You mean DP and HDMI?


Computer monitors don't need audio, which is what differentiates HDMI from DVI.  DVI is currently far more popular than HDMI, primarily because it supports larger displays with dual-link capability and it has been around longer.  DisplayPort is trying to replace DVI, but while they're at it, they're trying to kill HDMI as well.  Again, DisplayPort makes absolutely no sense but, because of the companies currently backing it, it won't die like it should.  DisplayPort is here to stay, no matter how useless and unnecessary it is.


----------



## GSG-9 (Dec 21, 2009)

FordGT90Concept said:


> Computer monitors don't need audio, which is what differentiates HDMI from DVI.  DVI is currently far more popular than HDMI, primarily because it supports larger displays with dual-link capability and it has been around longer.  DisplayPort is trying to replace DVI, but while they're at it, they're trying to kill HDMI as well.  Again, DisplayPort makes absolutely no sense but, because of the companies currently backing it, it won't die like it should.  DisplayPort is here to stay, no matter how useless and unnecessary it is.



Yes, but HDMI is being pushed as well. No one is pushing DVI; I don't see it surviving much longer with only customers supporting it. They will revise HDMI and DisplayPort specifications to address larger monitors.


----------



## FordGT90Concept (Dec 21, 2009)

DVI is currently the computing standard whereas HDMI is the home-theater standard.

I think the goal for DisplayPort is to replace both.  I think it is too soon to tell if it will succeed.  Virtually nothing supports DisplayPort now and zero backwards compatibility support is a major blow to its implementation.  If it does become the unified standard, it won't be for at least five years.


----------



## GSG-9 (Dec 21, 2009)

DP does not have any real edge over HDMI besides the fact that it has content protection, which people traditionally hate.


----------



## department76 (Dec 21, 2009)

FordGT90Concept said:


> DisplayPort is here to stay, no matter how useless and unnecessary it is.



my point exactly, just like firewire. hahahaha


----------



## GSG-9 (Dec 21, 2009)

department76 said:


> my point exactly, just like firewire. hahahaha



I sure hope its something like that, HP's monitors with DP are way more expensive for no reason and personally I don't want it.


----------



## FordGT90Concept (Dec 21, 2009)

GSG-9 said:


> DP does not have any real edge over HDMI besides that it has content protection. Which people traditionally hate.


HDMI/DVI has HDCP support.  DisplayPort has HDCP and DPCP support.

Look how many people are using HDMI for their entertainment system.  HDCP doesn't benefit them at all but they clearly don't care or they'd still be using component cables.

I hope no one buys DisplayPort, especially in the entertainment industry.  If the entertainment industry sticks with HDMI, the computer industry will most likely stick with DVI.  Given enough time of very limited usage, the standard will die.


----------



## DrPepper (Dec 21, 2009)

Meh, I'd prefer if AMD had used 2 x DVI and 2 x HDMI, or 1 x DVI and 3 x HDMI. Assuming there aren't any technical limitations, those would be the most logical since DP has so few applications.


----------



## FordGT90Concept (Dec 21, 2009)

I'm not sold on HDMI being the best for computers yet, namely for several reasons: audio signal is worthless in most applications, HDCP, limited shielding, and the royalties required.

HDMI was old before it was even conceptualized.  The computing industry needs to create a new benchmark that maintains backwards compatibility (like DVI did with analog) and yet moves forward.  DisplayPort messes up on backwards compatibility.  If it had that, I think it would be a reasonable path forward.


----------



## Mussels (Dec 21, 2009)

FordGT90Concept said:


> I'm not sold on HDMI being the best for computers yet, namely for several reasons: audio signal is worthless in most applications, HDCP, limited shielding, and the royalties required.
> 
> HDMI was old before it was even conceptualized.  The computing industry needs to create a new benchmark that maintains backwards compatibility (like DVI did with analog) and yet moves forward.  DisplayPort messes up on backwards compatibility.  If it had that, I think it would be a reasonable path forward.



HDMI has a few advantages. They may be SMALL advantages, but I'll list them.


1. Cheap adaptor to convert back to DVI. Big plus.

2. Audio. While useless to gamers, it's a big boost to home theatre users and consoles.

3. Long cables, thin plug (DVI was massive!) - I'm running 15 m of HDMI cable between rooms. Try that with DVI.


There are limitations with HDMI audio, but we can't help that. Stupid HDCP limits TV/screen outputs to stereo, so even if you run HDMI, you need to have it reach your sound system BEFORE it reaches your TV/monitor if you want 5.1 audio.


----------



## Meizuman (Dec 21, 2009)

FordGT90Concept said:


> Composite or component video?  Composite video is just a matter of extracting the three colors and outputting a matching value on horizontal and vertical refresh rates.  That can be done on the fly with a matter of a few ms delay (not detectable by the eyes).  Component is more involved because you have to convert binary into analog.
> 
> The problem with DisplayPort -> DVI signaling is that they have different communication standards.  Still, I'm sure it is not impossible to engineer a chip that would perform the conversion in a time frame that it can't be detected by human senses.  It just cost more--a lot more than just rearranging a few pins into a different arrangement.
> 
> ...



I haven't really been keeping track of the different monitor connection standards, but now that you've brought it up I must say I am a bit worried about what the future holds. First I thought that DP would be the new standard... and could possibly take over from HDMI because of the lack of royalty fees.

I think DP has at least a few better aspects than DVI... a more convenient connector, a little more bandwidth (8.64 Gbit/s vs. DVI's 7.92 Gbit/s), fiber optic support... and it supports 8-channel, 24-bit 192 kHz audio transfer. Also, v1.2 should double the bandwidth. Of course it depends on how it actually works, rather than the specs.
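For context on those link rates, here is some rough arithmetic for what a single large panel of the era actually needs. The 25% blanking overhead is my own round-number assumption, so treat the result as an estimate rather than a spec figure:

```python
def data_rate_gbit(width, height, hz, bits_per_pixel=24, blanking=1.25):
    """Approximate uncompressed video data rate in Gbit/s,
    padded ~25% for blanking intervals (an assumed round number)."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

# A 30" 2560 x 1600 panel at 60 Hz:
rate = data_rate_gbit(2560, 1600, 60)
print(f"~{rate:.1f} Gbit/s")
```

That works out to roughly 7.4 Gbit/s, sitting just under dual-link DVI's 7.92 Gbit/s and inside DP's 8.64 Gbit/s, which is why the bandwidth gap between the two feels so small at these resolutions.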

That UDI looked really good; shame that they cancelled it. 16 Gbit/s:
http://img.hexus.net/v2/internationalevents/idf2006march/udi_cable.JPG

Pre-post EDIT: But UDI didn't support audio transfer...

To topic:

Wouldn't this work? $15...
http://www.startech.com/item/DP2DVI-DisplayPort-to-DVI-Cable-Adapter.aspx


----------



## FordGT90Concept (Dec 21, 2009)

Mussels said:


> 1. Cheap adaptor to convert back to DVI. Big plus.
> 
> 2. Audio. While useless to gamers, it's a big boost to home theatre users and consoles.
> 
> 3. Long cables, thin plug (DVI was massive!) - I'm running 15 m of HDMI cable between rooms. Try that with DVI.


#1 It is kind of ironic that DVI was developed for computers and was such a good standard that they used it to create HDMI; however, their roles still remain distinct.  The cheap adapters are merely a result of the same underlying standard, really no more.  I agree, but it is really more a coincidence than an intentional advantage.

#2 Digital audio sounds like crap so I think the only real advantage there is one less cable to mess with.

#3 [rant]DVI is capable of exceeding the maximum length of HDMI because DVI cables are generally well shielded by comparison.  I mean, before HDMI showed up, DVI cables were thick, high-quality beasts that nothing short of a microwave could penetrate.  Most DVI cables (especially those packaged with monitors) you see now have the same internal wiring as an HDMI cable (cheap, limited/no shielding).  HDMI lowered the DVI standards of signal attenuation.[/rant]  Since they now use equally crappy cable, they get equally crappy distance.

HDMI plugs are obviously smaller, but look at what you are giving up.  Instead of pins, they use sliding contacts.  Pins were huge up to DVI because the signal degradation with pins is far less than with sliding contacts.  Pins were, therefore, critical to keeping analog in the DVI standard.  If you are going to run 100'+ of video cable, most likely you are using analog component video or DVI with boosters, not HDMI.  HDMI was meant for the home theater, not professional applications.  HDMI plugs, therefore, are a step down, not up.  Sure, DVI plugs take longer to install, but once those screws are in, there's not much chance they will come undone.  HDMI either pops out or it breaks.  Cheap plastics for home theater versus durable, long-lasting metal with a plastic casing for professional use.




Meizuman said:


> I haven't really been keepining a track of different monitor connecting standards, but now that you brought it up I must say I am a bit worried about what the future holds.. First I thought that DP will be the new standard... and could possibly take over hdmi because of no royalty fees.


Royalties are a PITA but manufacturers will go where the money is at.  They can't just start selling DisplayPort products because everyone not using Apple will still buy HDMI.




Meizuman said:


> Pre-post EDIT: But UDI didn't support audio transfer...


In computers, audio is routed from a sound card (higher SNR and discrete audio processing) to the speakers (rarely incorporated in the monitor unless it is some cheap buy-by-the-dozen monitor for business use).

In home theaters, audio is often handled by a receiver.  There's also that nagging issue of audio sounding best through analog systems, so professional installations (and I don't mean sub-$10,000 USD) still opt for keeping audio analog as much as possible, which means they don't want it in the same cable as the video.


Bottom line: UDI would be the best solution for computer video.  Whether or not the home entertainment industry picks it up like they did with DVI is up to them.

There are monitors out there right now pushing resolutions beyond 5 megapixels.  DisplayPort may make sense for TVs, but it doesn't make sense for computers.




Meizuman said:


> http://www.startech.com/item/DP2DVI-DisplayPort-to-DVI-Cable-Adapter.aspx


That adapter is only single-link DVI.  It is unclear whether this Sapphire adapter is dual-link or single-link.  If it were dual-link, that might justify the price differential.


----------



## Mussels (Dec 21, 2009)

ATI have said that only active adaptors work on their cards. passive ones only work on other products.


HDMI's main advantage is that everything uses it. Consoles, set-top boxes, DVD players, PCs, camcorders (mini HDMI) and so on.

While it may not have any LARGE advantages, that's a good reason for it to become a standard - look at USB for an example. This is the video version of USB, which everything supports. Just because you can find situations where it DOESN'T have advantages doesn't mean the situations where it does don't exist.


----------



## FordGT90Concept (Dec 21, 2009)

How often does one connect their console, set top box, DVD player, or camcorder to their computer?  Camcorder, I'll give you; that makes sense, since you need a fast (plug in/unplug) connection there.  The rest ought to be connected directly to your TV or receiver.  Ehm, HDMI on a computer only makes sense as a video/audio input--not an output.

Monitors progress at a lightning rate compared to TVs.  When TV was doing 320x240 analog, computers were doing 640x480 analog.  When TVs were doing 640x480 analog, computers were doing 2048x1536 analog.  When TVs were doing 1080p digital, computers were doing 3840x2160 digital.

Everyone wants to unify everything and it is ruining quality on a per-device basis.  First there was HDMI where they stuck audio with DVI signal.  Now DisplayPort isn't even meant to carry anything at all: video (optional), audio (optional).  There are no choices anymore.  You plug it in and the equipment tells everything connected what it is allowed to do regardless of what you, the consumer, actually want.  This is just getting plain silly.  Before long, you won't be able to do anything without signing a contract and connecting it to the Internet with which, they see, hear, and whatever else they conjure up without you having any authority beyond installing the equipment or not (and we know people would blindly agree--end justifies the means).  The door is wide open to that doom's day prophecy with DisplayPort and once that door is open, it won't close.

Information is more valuable than platinum and more addicting than heroin.  If you don't belive me, ask Google.  That's the foundation of their success story.


I'm not a fan of USB either (largely due to bandwidth allocation) but that is a discussion for another time.


Oh, and HDMI, DisplayPort, and UDI connectors all inhibit 100% shilded coverage by design.  There is always leakage on the connectors where DVI does not.


----------



## Mussels (Dec 21, 2009)

FordGT90Concept said:


> How often does one connect their console, set top box, DVD player, or camcorder to their computer?  Camcorder, I'll give you; that makes sense, since you need a fast (plug in/unplug) connection there.  The rest ought to be connected directly to your TV or receiver.  Ehm, HDMI on a computer only makes sense as a video/audio input--not an output.
> 
> Monitors progress at a lightning rate compared to TVs.  When TV was doing 320x240 analog, computers were doing 640x480 analog.  When TVs were doing 640x480 analog, computers were doing 2048x1536 analog.  When TVs were doing 1080p digital, computers were doing 3840x2160 digital.
> 
> ...




I never mentioned anything about a COMPUTER. All those devices can be hooked up to a TV or a PC display - when they share inputs, there is no real distinction.


----------



## FordGT90Concept (Dec 21, 2009)

I'm trying to show why the two need separate, but compatible, standards.  Monitors are intended for a viewing distance of less than 2 feet, whereas TVs are designed for 6+ feet.  In order to get a good picture at less than two feet away, you need a really high resolution (dpi).  Conversely, the farther away you get from the display, the lower the dpi necessary to make it look exactly the same to the eye.  More DPI means more bandwidth, and more bandwidth means more robust cables.  I think it is a bad idea to attempt to merge the two.  I mean, HDMI is just now getting market acceptance when DVI has been around a long time.

HDMI is good to go in the entertainment industry but it is time for the computer industry to move on.  HDMI is old for that segment of the market.


----------



## Mussels (Dec 21, 2009)

FordGT90Concept said:


> I'm trying to show why the two need separate, but compatible, standards.  Monitors are intended for a viewing distance of less than 2 feet, whereas TVs are designed for 6+ feet.  In order to get a good picture at less than two feet away, you need a really high resolution (dpi).  Conversely, the farther away you get from the display, the lower the dpi necessary to make it look exactly the same to the eye.  More DPI means more bandwidth, and more bandwidth means more robust cables.  I think it is a bad idea to attempt to merge the two.  I mean, HDMI is just now getting market acceptance when DVI has been around a long time.
> 
> HDMI is good to go in the entertainment industry but it is time for the computer industry to move on.  HDMI is old for that segment of the market.



Your argument holds flawed logic.

Since keyboards and mice don't need the bandwidth of USB and are designed for different things than flash drives, they should go back to PS/2 connectors - it's been around longer.


Fair enough that you have your own opinion (you don't care about HDMI), but your logic contradicts precedent - people prefer one unifying standard even if it does limit them slightly. It's better than everyone having their own incompatible standards.


----------



## mdm-adph (Dec 21, 2009)

FordGT90Concept said:


> There are no choices anymore.  You plug it in and the equipment tells everything connected what it is allowed to do regardless of what you, the consumer, actually want.  This is just getting plain silly.  Before long, you won't be able to do anything without signing a contract and connecting it to the Internet with which, they see, hear, and whatever else they conjure up without you having any authority beyond installing the equipment or not (and we know people would blindly agree--end justifies the means).  The door is wide open to that doom's day prophecy with DisplayPort and once that door is open, it won't close.



Are you honestly surprised?  What did you expect?


----------



## Scrizz (Dec 21, 2009)

I agree with GT90Concept.


----------



## FordGT90Concept (Dec 21, 2009)

Mussels said:


> since keyboards and mice dont need the bandwidth of USB and are designed for different things than flash drives, they should go back to PS/2 connectors - its been around longer.


You know how a keyboard works, right?  It has a series of interrupts and each key is assigned to a specific interrupt group.  PS/2 is practically the same as USB in terms of wiring.  If PS/2 were replaced with a new cable which supports one interrupt per key, there would never be a problem with interrupt collisions again.  Instead of improving the keyboard connector, we threw it into a trash bin with all the other flawed connection standards.

Mice don't need as many interrupts as keyboards, so a more suitable connector than USB would be difficult to come by--unless something revolutionary happened.


The advantage of PS/2 mice and keyboards is that, if they plug in, you can be guaranteed they work so long as they aren't defective.




Mussels said:


> Fair enough that you have your own opinion (dont care about HDMI) but your logic contradicts precedents - people prefer one unifying standard even if it does limit them slightly. its better than everyone having their own incompatible standards.


Standards = compatibility.




mdm-adph said:


> Are you honestly surprised?  What did you expect?


I'm not surprised, no.  It should be illegal internationally to develop standards that serve business rather than consumer interests.


Edit: The USB hub often isn't initialized soon enough.  For example, my DFI LP X58 motherboard doesn't initialize the USB hub until an OS is loading.  This means that, with a USB keyboard, the BIOS is not accessible.  I have to plug in a PS/2 keyboard (which is initialized before POST begins) in order to access the BIOS.  Yet another reason why keyboards need their own dedicated cable, connectors, and protocols.


----------



## pr0n Inspector (Dec 22, 2009)

PS/2 can do full n-key rollover. USB cannot.
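For anyone curious why that is: a USB keyboard speaking the HID *boot protocol* sends a fixed 8-byte report with room for only six key codes, while PS/2 streams an independent make/break scancode per key. A minimal Python sketch of that report format (the function and constants here are illustrative, not from any real driver):

```python
# Sketch: why USB boot-protocol keyboards cap out at 6 simultaneous keys
# while PS/2 does not.
#
# A USB HID boot-protocol keyboard report is 8 bytes:
#   byte 0:    modifier bitmask (Ctrl/Shift/Alt/GUI)
#   byte 1:    reserved
#   bytes 2-7: up to SIX concurrently pressed key usage codes
# PS/2, by contrast, streams an individual make/break scancode per key,
# so any number of keys can be "down" at once (full n-key rollover).

MAX_BOOT_KEYS = 6
ERR_ROLLOVER = 0x01  # HID usage code reported in every slot on overflow

def boot_report(modifiers: int, pressed: list[int]) -> bytes:
    """Build an 8-byte boot-protocol report for the given pressed keys."""
    if len(pressed) > MAX_BOOT_KEYS:
        # Real keyboards signal the "phantom state" by filling every
        # slot with the ErrorRollOver usage code.
        slots = [ERR_ROLLOVER] * MAX_BOOT_KEYS
    else:
        slots = pressed + [0] * (MAX_BOOT_KEYS - len(pressed))
    return bytes([modifiers, 0] + slots)

# Six keys fit in one report...
print(boot_report(0, [0x04, 0x05, 0x06, 0x07, 0x08, 0x09]).hex())
# ...a seventh overflows into the phantom state:
print(boot_report(0, [0x04, 0x05, 0x06, 0x07, 0x08, 0x09, 0x0A]).hex())
```

Worth noting the limit is specific to the boot protocol: a USB keyboard with a custom report descriptor can implement n-key rollover too, so "USB cannot" really applies to the lowest-common-denominator mode that BIOSes rely on.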


----------



## Hayder_Master (Dec 22, 2009)

Mussels said:


> its one display port to one DVI port - the other plug is USB for power.
> 
> 
> 
> ...



thanks for the info


----------



## Munchy (Dec 22, 2009)

Excellent, I have found people that actually know what they are on about.

I have been through this saga on several forums now and wondered if anyone can kick me in the right direction.

I have an HD 5870.

3 monitors, all working.

2 x DVI and 1 passive DisplayPort to DVI adapter.

Now I can only use 2 at a time, so is this really because I need the expensive adapter?

Please do note I do not want one big extended desktop; I only want monitor one and monitor two in extended mode and monitor three mirroring monitor 2.

I figured as it is actually working on all three monitors (only 2 at a time) there must be a way to use 3 at the same time.

lol, I know my answer but I just need someone to shout it at me so I believe them.

I feel a little hard done by, shelling out 220 on the card and then another 80 for the adapter.

PS: has anyone seen the active converters for sale anywhere?

Edit,

New train of thought:

Seeing as the passive DP to DVI adapter is working, is it possible that they won't all duplicate due to the third monitor being a different size?


----------



## jagd (Dec 23, 2009)

Correct, you need an active adapter for the 3rd monitor: http://www.amd.com/us/Documents/ATI_Eyefinity_Technology_Brief.pdf - download the PDF and read pages 7 and 8.


http://www.widescreengamingforum.com/forum/viewtopic.php?t=16792&postdays=0&postorder=asc&start=570
Read it directly from Dave Bauman (product manager at AMD/ATI).

Munchy, I wish you had written where you are living. You can find Dell/blizzlink adapters in the US, and Accell and Sapphire adapters in Europe. I gave a link to an Accell adapter on the first page, from a Netherlands company (Accell = blizzlink, a retail brand according to info I found).
http://www.amd-news.com/assets/files/amd-cn/Eyefinity_SetupGuide_v1_AMD.pdf - setup guide


Munchy said:


> Now I can only use 2 at a time, so is this really because I need the expensive adapter?


----------



## Mussels (Dec 23, 2009)

and all three monitors must be the same resolution, that's another requirement


----------



## GSG-9 (Dec 23, 2009)

Mussels said:


> and all three monitors must be the same resolution, that's another requirement




and 1920x1080 max correct? Or can all 3 be run @ say 2048x1152.


----------



## Mussels (Dec 23, 2009)

GSG-9 said:


> and 1920x1080 max correct? Or can all 3 be run @ say 2048x1152.



i didnt see a max res, but that doesnt mean there isnt one


----------



## FordGT90Concept (Dec 23, 2009)

The safest max res would probably be 1920x1200 @ 60 Hz (single-link DVI).  It might be able to go to dual-link DVI res (2560×1600 @ 60 Hz) but don't count on it.
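The ceiling described above comes down to pixel clock: single-link DVI tops out at a 165 MHz pixel clock, and dual-link doubles the TMDS capacity. A rough back-of-the-envelope check in Python, using approximate CVT reduced-blanking totals for each mode (the helper function is just for this sketch):

```python
# Rough check of the single-link vs dual-link DVI limits: single-link
# carries at most a 165 MHz pixel clock. Totals below are approximate
# CVT reduced-blanking timings (active pixels plus blanking) at 60 Hz.

SINGLE_LINK_MAX_HZ = 165_000_000

def pixel_clock_hz(h_total: int, v_total: int, refresh_hz: int) -> int:
    # pixel clock = total horizontal pixels * total lines * refresh rate
    return h_total * v_total * refresh_hz

modes = {
    # name: (h_total, v_total) per CVT-RB @ 60 Hz
    "1920x1200": (2080, 1235),
    "2560x1600": (2720, 1646),
}

for name, (h_total, v_total) in modes.items():
    clk = pixel_clock_hz(h_total, v_total, 60)
    link = "single-link" if clk <= SINGLE_LINK_MAX_HZ else "dual-link"
    print(f"{name} @ 60 Hz: {clk / 1e6:.1f} MHz -> needs {link} DVI")
```

1920x1200 at 60 Hz lands around 154 MHz, just under the single-link limit, while 2560x1600 needs roughly 269 MHz, which only dual-link can carry.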


----------



## jagd (Dec 23, 2009)

Of course it is possible - ATI demoed Eyefinity at 2560x1600 (3 monitors = 7680x1600 and a 6 monitor = 7680x3200 config). There are a lot of reviews with 2560x1600.
http://www.anandtech.com/video/showdoc.aspx?i=3635


FordGT90Concept said:


> It might be able to go to dual-link DVI res (2560×1600 @ 60 Hz) but don't count on it.


----------



## GSG-9 (Dec 23, 2009)

jagd said:


> Of course it is possible - ATI demoed Eyefinity at 2560x1600 (3 monitors = 7680x1600 and a 6 monitor = 7680x3200 config). There are a lot of reviews with 2560x1600.
> http://www.anandtech.com/video/showdoc.aspx?i=3635



Sweet, im glad to hear it. Does anyone know if you can still rotate monitors when using eyefinity? I have not seen anything about that either.


----------



## Mussels (Dec 24, 2009)

GSG-9 said:


> Sweet, im glad to hear it. Does anyone know if you can still rotate monitors when using eyefinity? I have not seen anything about that either.



since they all need the same resolution, they'd all have to have the same orientation


----------



## GSG-9 (Dec 24, 2009)

Mussels said:


> since they all need the same resolution, they'd all have to have the same orientation



Yes, but they can all be rotated 90 degrees, which you can imagine results in a completely different aspect ratio. The TripleHead2Go could do this; that's the only reason I ask.


----------



## nugzo (Dec 25, 2009)

jagd said:


> Of course it is possible - ATI demoed Eyefinity at 2560x1600 (3 monitors = 7680x1600 and a 6 monitor = 7680x3200 config). There are a lot of reviews with 2560x1600.
> http://www.anandtech.com/video/showdoc.aspx?i=3635



Before Eyefinity can be used for gaming, xfire must be supported. No way one graphics card can game great at 7680x1600; you almost need xfire/SLI even for 2560x1600.
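The arithmetic behind that concern: an Eyefinity surface multiplies the pixels the GPU must shade every frame. A quick illustrative calculation (the function name is just for this sketch):

```python
# How much more work a GPU does per frame at Eyefinity resolutions
# versus a single common monitor, measured in raw pixel count.

def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

base = megapixels(1920, 1080)  # ~2.07 MP, a single 1080p screen
for w, h in [(2560, 1600), (7680, 1600), (7680, 3200)]:
    mp = megapixels(w, h)
    print(f"{w}x{h}: {mp:.1f} MP, {mp / base:.1f}x the pixels of 1080p")
```

A 7680x1600 surface is roughly 12.3 megapixels, almost six times the fill and shading load of a single 1080p screen, which is why CrossFire support matters here.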


----------



## Disparia (Dec 25, 2009)

That support came with the 9.12 Hotfix drivers.

Haven't seen anyone here post their experience, but over at WidescreenGamingForum and HardOCP some people have posted that it is working. No benches yet.


----------



## nugzo (Dec 25, 2009)

Jizzler said:


> That support came with the 9.12 Hotfix drivers.
> 
> Haven't seen anyone here post their experience, but over at WidescreenGamingForum and HardOCP some people have posted that it is working. No benches yet.



If that's the case, that's awesome. I wonder if it's possible to use the DVI or HDMI on the 2nd or 3rd video card? Or is it not physically possible..


----------



## GSG-9 (Dec 25, 2009)

Jizzler said:


> That support came with the 9.12 Hotfix drivers.
> 
> Haven't seen anyone here post their experience, but over at WidescreenGamingForum and HardOCP some people have posted that it is working. No benches yet.



Rotating monitor support?


----------



## Disparia (Dec 26, 2009)

I think that support has been there since 9.10 (when Eyefinity got support):

http://www.amd.com/us/products/technologies/eyefinity/Pages/eyefinity.aspx


----------



## bryantalpinerunner (Feb 24, 2010)

*Please Help!! Almost Got It!!!*

PLEASE HELP!!! I bought the EFX version of ATI's 5670 card. As you know, it has DVI, HDMI and DisplayPort, so without the active adapter I couldn't have all 3 monitors simultaneously.  I found Sapphire's version of the same 5670 card; the only difference is it has DVI, HDMI and VGA! I thought I had found a loophole and wouldn't need the active adapter. After installing the card and the driver it came with, I'm still having the same issue, requiring me to disable one of my monitors to use the 3rd one! NOW I'M REALLY UPSET! If anyone has any solution or explanation for this I would really appreciate it!! Thanks in advance  - Bryant


----------



## Mussels (Feb 24, 2010)

the models with VGA dont support eyefinity - only two screens will work.


----------



## bryantalpinerunner (Feb 24, 2010)

Mussels said:


> the models with VGA dont support eyefinity - only two screens will work.



Thanks for the info, I knew it was too good to be true. Could anyone suggest the cheapest place to get the active DisplayPort adapter that's been confirmed to work?


----------



## jagd (Feb 24, 2010)

Are you sure, Mussels? Because I saw the opposite mentioned in some reviews.
bryantalpinerunner, you could get an active adapter for $64 from Dell before; I don't know if it is still possible or not (coupon + Bing discount), but it was 5 months ago.
http://slickdeals.net/forums/showthread.php?t=1582920
Anyway, there are not many active adapters on the market. Dell/blizzlink (= accell)/Sapphire - look them up and get the cheapest one. Good luck.



Mussels said:


> the models with VGA dont support eyefinity - only two screens will work.


----------



## Mussels (Feb 24, 2010)

quite sure, i mean hell - if the high end cards need an adapter since they dont have a third timing source, of course the low end cards wont


----------



## jagd (Feb 24, 2010)

Of course low end cards need an adapter. The problem is I missed that the Sapphire 5670 has no DP output, so no Eyefinity, correct (I thought it had a DP output). My bad.


----------



## Mussels (Feb 24, 2010)

jagd said:


> Of course low end cards need an adapter. The problem is I missed that the Sapphire 5670 has no DP output, so no Eyefinity, correct (I thought it had a DP output). My bad.



thats my point - they include a VGA port, which you CANT get an active adapter for.


----------



## sreweti (May 5, 2010)

*Need help advice with setup*

Hello everyone, I need your tech advice. I am building a PCATD (PC-based aviation training device) for our aviation school and got a healthy research grant to set it up.
I have bought a grunty PC: an i7 950 with TWO! Radeon 5970 cards and three 37-inch LG LCD screens (the only hassle is these screens only have HDMI input, no DisplayPort input). The issue is that I think I need a Mini DisplayPort to HDMI converter for the third screen to achieve true Eyefinity. I have got two screens running off the two DVI ports with Eyefinity, no problems. Connecting the third screen to one of the DVI ports on the second 5970 card won't work. The interesting thing is that all the forums say you need an active DisplayPort to DVI connector (active as in USB powered). However, all of the ATI-approved Mini DisplayPort to HDMI adapters are passive and don't need power. Is this correct? Will a Mini DisplayPort to HDMI adapter give me that third Eyefinity screen, or will I have to buy an active DisplayPort to DVI connector and then convert it to HDMI?
Any ideas would be most appreciated.


----------

