Sunday, May 8th 2016

NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support

NVIDIA appears to have done away with support for the legacy D-Sub (VGA) analog connector with its latest GeForce GTX 1080 graphics card. The card's DVI connector does not have wiring for analog signals, so retail cards won't include DVI-to-D-Sub dongles, and even aftermarket dongles won't work. What you get instead on the card are one dual-link DVI-D connector, an HDMI 2.0b port, and three DisplayPort 1.4 connectors. NVIDIA's rival AMD did away with D-Sub support on its high-end graphics cards way back in 2013, with the Radeon R9 290 series.

81 Comments on NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support

#26
GoldenX
Dj-ElectriCPeople in outdated countries then shouldn't buy a GTX 1080 for their VGA monitor.
That's not what I mean. If, for example, the "GT 1020" doesn't have VGA support, that would be a problem. Nobody uses a GTX 980 with a VGA monitor here.
Posted on Reply
#27
dj-electric
If a lower-end card doesn't include VGA, they will surely be in trouble, no doubt.
Hopefully, it won't happen.
Posted on Reply
#28
efikkan
And why don't they get rid of the DVI as well? (HDMI and DVI are compatible.) It's just sitting there blocking 20% of the exhaust.
Posted on Reply
#30
TheinsanegamerN
UbersonicA $65 DVI monitor will be urinated all over by a decent monitor using VGA.
That has nothing to do with it being VGA; that is all down to the panel quality. VGA is dead tech: it's been around since the 1990s, it's been replaced several times (DVI, HDMI, DisplayPort), and it has no place on a modern graphics card. VGA should have died with the CRT.

And if anyone is still using a VGA monitor, they don't need a 1080, or even a 1070. Their resolutions are far too low to take advantage of new GPUs. The HIGHEST they could realistically be is 1200p, which a 970 runs just fine. People can't expect one video cable to stick around forever, not when resolutions continue to increase. I'd say a 25-year run for a video cable is pretty darn good, especially considering analogue displays to go with them are no longer made.
Posted on Reply
#31
cdawall
where the hell are my stars
GoldenXThat's not what I mean. If, for example, the "GT 1020" doesn't have VGA support, that would be a problem. Nobody uses a GTX 980 with a VGA monitor here.
When NVIDIA doesn't just rebrand the bottom end of its cards, it's time to worry about that.
Posted on Reply
#32
Red elf
Darn! I can't use my CRT ViewSonic P227fB with any upcoming high-end graphics cards (believe me, converters don't do it justice; it stutters like hell). I... may also be one of the very few people who uses a high-res VGA-only monitor.
Posted on Reply
#33
cdawall
where the hell are my stars
Red elfDarn! I can't use my CRT ViewSonic P227fB with any upcoming high-end graphics cards (believe me, converters don't do it justice; it stutters like hell). I... may also be one of the very few people who uses a high-res VGA-only monitor.
Nope, you and the Sony owners will be SOL.
Posted on Reply
#34
Frick
Fishfaced Nincompoop
Not even I use that connection anymore.
Red elfDarn! I can't use my CRT ViewSonic P227fB with any upcoming high-end graphics cards (believe me, converters don't do it justice; it stutters like hell). I... may also be one of the very few people who uses a high-res VGA-only monitor.
That is actually a monitor I'd be willing to own.
Posted on Reply
#35
neatfeatguy
newtekie1Never tried it with VGA, but my DP to HDMI adapter seems to work fine. But that is digital to digital; it might be a whole lot worse going from digital to analog.
+1

I have three monitors that support HDMI/DVI/VGA. I have two connected to my 980 Ti with DP-to-HDMI adapters and I don't have any issues. The third connects with DVI. All is good for me.

Digital to digital shouldn't give any issues. Digital to analog can be problematic.
Posted on Reply
#36
FordGT90Concept
"I go fast!1!11!1!"
TheLostSwedeSo you're saying you've tested every adapter on the market? I've never had any problems going mini DP to HDMI. My girlfriend has been using a mini DP to VGA adapter regularly with her work Mac without any problems. It seems like you've been unlucky and got bum adapters. Maybe you should try buying a quality product instead of the cheapest ones on DX...
The problem is that DisplayPort is plug and play, but so are HDMI, DVI, and VGA. What happens is that DisplayPort has to wait for activity, which turns on the HDMI/DVI/VGA converter, which then also has to wait for activity before the monitor will detect the link. That gap can cause issues, especially where rapid switching is required (e.g. a KVM). Here's pretty much what happens: KVM switch -> operating system detects the disconnect, so it unplugs the device -> operating system detects the device and enables it. On Mac, this likely isn't a problem because the Mac restores the desktop on monitor discovery. Windows does not (not even 10). In that <1 second gap, everything that was on that monitor moves to another monitor, and you have to move it back manually or use software that attempts to do so (far from flawless). All because DisplayPort turns the converter off instead of keeping it powered.

All of my adapters are Startech, as is my KVM.

If you never disconnect your monitor, they aren't a problem, but if you do, expect problems. I found it was better to install a second graphics card that still has VGA support than to deal with OS/converter issues.
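
Purely as an illustration of the kind of "move it back" helper mentioned above, here is a minimal, hypothetical Win32 sketch (the class name, the 5-second snapshot timer, and the restore-on-WM_DISPLAYCHANGE heuristic are assumptions for this example, not any particular tool): it periodically records visible top-level window positions while all monitors are present, and pushes them back once the display configuration recovers.

```cpp
// Hypothetical sketch only: remember where visible top-level windows are and
// put them back after a display-configuration change (e.g. the DisplayPort
// link dropping and returning during a KVM switch). No error handling;
// maximized/minimized windows and multi-monitor edge cases are ignored.
#include <windows.h>
#include <map>

static std::map<HWND, RECT> g_saved;   // last known window rectangles
static int g_monitors;                 // monitor count at last good snapshot

// EnumWindows callback: record the position of every visible top-level window.
static BOOL CALLBACK SaveProc(HWND hwnd, LPARAM)
{
    RECT rc;
    if (IsWindowVisible(hwnd) && GetWindowRect(hwnd, &rc))
        g_saved[hwnd] = rc;
    return TRUE;                       // keep enumerating
}

// Move every remembered window back to its saved rectangle.
static void RestoreAll()
{
    for (const auto& [hwnd, rc] : g_saved)
        if (IsWindow(hwnd))
            SetWindowPos(hwnd, nullptr, rc.left, rc.top,
                         rc.right - rc.left, rc.bottom - rc.top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
}

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_TIMER:
        // Periodic snapshot, but only while no monitor is missing, so we
        // don't overwrite good positions with already-shuffled ones.
        if (GetSystemMetrics(SM_CMONITORS) >= g_monitors) {
            g_monitors = GetSystemMetrics(SM_CMONITORS);
            g_saved.clear();
            EnumWindows(SaveProc, 0);
        }
        return 0;
    case WM_DISPLAYCHANGE:
        // Display layout changed; if the monitor came back, restore windows.
        if (GetSystemMetrics(SM_CMONITORS) >= g_monitors)
            RestoreAll();
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int)
{
    WNDCLASS wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.lpszClassName = TEXT("WindowRestoreSketch");
    RegisterClass(&wc);

    // Hidden, ordinary top-level window (not message-only), so it still
    // receives the broadcast WM_DISPLAYCHANGE message.
    HWND hwnd = CreateWindow(wc.lpszClassName, TEXT(""), WS_OVERLAPPED,
                             0, 0, 0, 0, nullptr, nullptr, hInst, nullptr);
    g_monitors = GetSystemMetrics(SM_CMONITORS);
    SetTimer(hwnd, 1, 5000, nullptr);  // snapshot every 5 seconds

    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
        DispatchMessage(&msg);
    return 0;
}
```

A real utility would also have to cope with maximized windows, per-monitor DPI, and the OS re-arranging windows again after the restore, which is part of why such tools are far from flawless.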
btarunrThe adapter your GF is using could be Thunderbolt-to-VGA (PCIe) rather than DisplayPort-to-VGA?
That could be, too. In that case, it is a mini GPU powering the VGA output, eliminating the converter issue.
Posted on Reply
#38
GhostRyder
nickbaldwin86remove the DVI so I can single slot the card!
That's what I want!
Posted on Reply
#39
efikkan
The only valid reason for needing D-Sub 15 I can come up with would be someone using this card for CUDA in several computers and needing a KVM switch. But short of the 10 people who need that, I don't think anyone really needs D-Sub 15 anymore.
Posted on Reply
#40
cdawall
where the hell are my stars
efikkanThe only valid reason for needing D-Sub 15 I can come up with would be someone using this card for CUDA in several computers and needing a KVM switch. But short of the 10 people who need that, I don't think anyone really needs D-Sub 15 anymore.
They have KVM switches with DisplayPort, HDMI, and DVI.
Posted on Reply
#41
efikkan
cdawallThey have KVM switches with DisplayPort, HDMI, and DVI.
Yes, but they tend not to work that well, and they tend to be pricey.
Posted on Reply
#42
cdawall
where the hell are my stars
efikkanYes, but they tend not to work that well, and they tend to be pricey.
I use a cheap one at work every single day; it works fine.
Posted on Reply
#43
Ubersonic
TheinsanegamerNThat has nothing to do with it being VGA; that is all down to the panel quality.
That's the point: there are many people still using high-end 1080p/1200p panels that only have VGA input as a second or third screen. It was lame when AMD cut costs by dropping DVI-I ports in favour of cheaper DVI-D ports, but to see Nvidia go the budget route too sucks.
Posted on Reply
#44
qubit
Overclocked quantum bit
Great that NVIDIA has kept support for VGA for this long. There are still people using it for one reason or another, even if it's just for testing, trying things out, or comparisons.
Posted on Reply
#45
Solidstate89
UbersonicThat's the point: there are many people still using high-end 1080p/1200p panels that only have VGA input as a second or third screen. It was lame when AMD cut costs by dropping DVI-I ports in favour of cheaper DVI-D ports, but to see Nvidia go the budget route too sucks.
It's not "going the budget" route, it's removing support for a connector that hasn't seen widespread support in almost a decade. We don't use IDE anymore either because better standards came along.

You and everyone else should accept the fact that these old unused standards don't stick around forever. They should have dropped VGA years ago. It hasn't been the dominant protocol in years.
Posted on Reply
#46
cdawall
where the hell are my stars
Solidstate89It's not "going the budget route"; it's removing support for a connector that hasn't seen widespread support in almost a decade. We don't use IDE anymore either because better standards came along.

You and everyone else should accept the fact that these old unused standards don't stick around forever. They should have dropped VGA years ago. It hasn't been the dominant protocol in years.
Yep. Hopefully all of the people with "high end" parallel port printers don't get mad.
Posted on Reply
#47
GoldenX
The problem with the digital standards is the competition between them: HDMI, DVI, DisplayPort. VGA was the only analog output; the digital ones should be merged into a single standard.

I have a working Epson LX-300; it's cheaper than a laser :P
Posted on Reply
#48
Frick
Fishfaced Nincompoop
cdawallI use a cheap one at work every single day; it works fine.
Which brand and how much? Here it's super hard to find KVM switches with DVI and USB for less than €100, while VGA/PS2 and even VGA/USB can be had for pennies.
Posted on Reply
#49
cdawall
where the hell are my stars
FrickWhich brand and how much? Here it's super hard to find KVM switches with DVI and USB for less than €100, while VGA/PS2 and even VGA/USB can be had for pennies.
Iogear, I think it was like $80 when I snagged it.
Posted on Reply
#50
FordGT90Concept
"I go fast!1!11!1!"
cdawallThey have KVM switches with DisplayPort, HDMI, and DVI.
Are there any DisplayPort KVMs that support MST?
Posted on Reply