Sunday, May 8th 2016
NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support
NVIDIA appears to have done away with support for the legacy D-Sub (VGA) analog connector with its latest GeForce GTX 1080 graphics card. The card's DVI connector has no wiring for analog signals, retail cards won't include DVI-to-D-Sub dongles, and even aftermarket dongles won't work. What you get instead are one dual-link DVI-D connector, an HDMI 2.0b port, and three DisplayPort 1.4 connectors. NVIDIA's rival AMD did away with D-Sub support on its high-end graphics cards back in 2013, with the Radeon R9 290 series.
81 Comments on NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support
www.startech.com/Server-Management/KVM-Switches/2-Port-Dual-DisplayPort-USB-KVM-Switch-with-Audio-and-USB-20-Hub~SV231DPDDUA
MST is apparently broken when switching, probably because of the plug-and-play handshake, which is the same reason converters don't handle switching well. DisplayPort needs a keep-alive standard, and I don't think any of the newer versions of DisplayPort add that functionality.
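To illustrate why switching looks like an unplug to the OS: here's a minimal sketch, assuming a Linux/X11 box with xrandr on the PATH, that polls for output connect/disconnect events. The DP output names and the one-second polling interval are my own assumptions for illustration, not anything specific to that KVM.

```python
#!/usr/bin/env python3
"""Sketch: watch display outputs for hot-unplug events by polling `xrandr -q`.

Assumptions (mine): a Linux/X11 system with xrandr installed. The point is
that when a DisplayPort KVM switches away, the OS sees a genuine disconnect,
so MST topology and window layouts get torn down instead of kept alive.
"""
import re
import subprocess
import time

def connected_outputs() -> set[str]:
    out = subprocess.run(["xrandr", "-q"], capture_output=True,
                         text=True, check=True).stdout
    # Lines look like: "DP-1 connected 2560x1440+0+0 ..." or "DP-2 disconnected ..."
    return {m.group(1) for m in re.finditer(r"^(\S+) connected", out, re.M)}

prev = connected_outputs()
while True:
    time.sleep(1)
    cur = connected_outputs()
    for gone in prev - cur:
        print(f"{gone} disconnected (KVM switched away?), desktop will be rearranged")
    for back in cur - prev:
        print(f"{back} reconnected, modes must be renegotiated from scratch")
    prev = cur
```

Run it, flip the KVM, and you'll see the output drop and reappear; a keep-alive standard would make that disconnect invisible to the OS.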
Will I not be able to use a 1080 card then?
Next step - get rid of HDMI. I know that's only wishful thinking and highly unlikely to happen. But we should have dreams, right?
Something like this:
www.gefen.com/kvm/ext-dvi-2-vgan.jsp?prod_id=9569
or
www.monoprice.com/product?p_id=8214
For the 0.1% who need it, there are options.
I'm using VGA to drive a Mitsubishi Diamond Pro CRT at 110 Hz as my second monitor, to play CS:GO and other fast FPS games.
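For what it's worth, driving a CRT at 110 Hz on Linux usually means adding a custom modeline. Here's a minimal sketch, assuming a Linux/X11 system with cvt and xrandr installed; the output name VGA-0 and the 1024x768 resolution are placeholders, so check `xrandr -q` for yours.

```python
#!/usr/bin/env python3
"""Sketch: create and apply a 110 Hz modeline for a VGA-connected CRT.

Assumptions (mine, not the poster's setup): Linux/X11 with `cvt` and
`xrandr` available, and an analog output named VGA-0.
"""
import subprocess

OUTPUT = "VGA-0"                 # hypothetical output name; see `xrandr -q`
WIDTH, HEIGHT, HZ = 1024, 768, 110

# Ask cvt for timing parameters at the requested refresh rate.
cvt = subprocess.run(["cvt", str(WIDTH), str(HEIGHT), str(HZ)],
                     capture_output=True, text=True, check=True)
# cvt prints a comment line, then: Modeline "1024x768_110.00" <timings...>
modeline = next(l for l in cvt.stdout.splitlines() if l.startswith("Modeline"))
_, name, *params = modeline.split()
name = name.strip('"')

# Register the mode, attach it to the output, and switch to it.
subprocess.run(["xrandr", "--newmode", name, *params], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)
```

Obviously only attempt refresh rates the CRT's documentation says it can handle.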
For CRT users, I think the GTX 9XX series will be the one to keep. And when it comes to true contrast / black levels and input lag, CRTs are still state-of-the-art until OLED monitors become affordable (which will probably take no less than five years), so I'll be holding out with my GTX 960 for quite some time.
Not all countries are first world economies.