Sunday, May 8th 2016
NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support
NVIDIA appears to have done away with support for the legacy D-Sub (VGA) analog connector with its latest GeForce GTX 1080 graphics card. The card's DVI connector has no wiring for analog signals, so retail cards won't include DVI to D-Sub dongles, and even aftermarket dongles won't work. What you get on the card instead are one dual-link DVI-D, one HDMI 2.0b, and three DisplayPort 1.4 connectors. NVIDIA's rival AMD did away with D-Sub support on its high-end graphics cards back in 2013, with the Radeon R9 290 series.
81 Comments on NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support
It doesn't surprise me that they axed analog. AMD did a long time ago.
I guess even monitors with DVI/HDMI aren't that expensive anymore, under $100, so I could just buy a new secondary monitor. But I think that's wasteful just because the GPU doesn't support it. And sacrifice precious system RAM!?!
So it now supports FreeSync monitors?
Well, it's about time. I get that some people need it, but it's getting to the point where it's really not necessary. I'd still prefer that DVI got dropped as well, making it all one row of DisplayPorts and HDMI. I think it would look a lot cleaner.
Some people are stuck with old monitors until they can afford to upgrade.
Any monitor with a decent refresh rate and size is going to be around $500 and up.
The main difference between DVI and VGA (apart from dual-link DVI supporting higher resolutions and refresh rates than VGA or single-link DVI) is that DVI carries a digital signal, whereas VGA carries an analogue signal that the monitor has to convert back to digital. Depending on the quality of the monitor and the cable, the result can look anywhere from terrible to just as good as DVI. That's why many people still keep high-end screens that only have VGA inputs.
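To put rough numbers on the "higher resolutions/refresh" part: a display mode's pixel clock has to fit within the link's limit, which is 165 MHz for single-link DVI and 330 MHz for dual-link. Here's a minimal back-of-the-envelope sketch in Python, assuming a roughly 20% blanking overhead (in the ballpark of typical CVT timings, not an exact figure); VGA has no hard digital cap like this, its practical ceiling is analog signal quality.

# Rough check of which DVI link type can drive a given mode.
# Assumptions: ~20% blanking overhead, 165/330 MHz single/dual-link pixel-clock limits.

LIMITS_MHZ = {
    "single-link DVI": 165,
    "dual-link DVI": 330,
}

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.20):
    # Active pixels plus assumed blanking, times refresh rate, in MHz.
    total_pixels = width * height * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

for width, height, refresh in [(1920, 1080, 60), (2560, 1440, 60), (3840, 2160, 60)]:
    clock = required_pixel_clock_mhz(width, height, refresh)
    fits = [name for name, limit in LIMITS_MHZ.items() if clock <= limit]
    print(f"{width}x{height}@{refresh}Hz needs ~{clock:.0f} MHz -> {', '.join(fits) or 'neither link'}")

The exact figures depend on the blanking standard the monitor uses, but the pattern is why 2560x1440 and 2560x1600 panels needed dual-link DVI while 1080p60 was fine over a single link.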