
Why Aren't USB C Video Cables More Common

A long time ago (in a galaxy far, far away...) AMD brought us the amazing HD 5870 Eyefinity with six video outputs.


It had 6x Mini DisplayPort outputs and could drive 6x 1080p monitors.

When Nvidia launched the Turing generation (RTX 20 series), we saw DP over USB-C. But since then we haven't seen any USB-C outputs on consumer cards.

I'm curious: why don't we have 100% USB-C outputs on graphics cards now? They seem compact and no more prone to disconnection than an HDMI connector without a screw or clip. In the same vein, a single GPU can output significantly more total pixels than the old 5870, so why don't we have something like that in USB-C, such as an Eyefinity card with 16x 1080p USB-C outputs?
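To put that "more total pixels" bit in rough numbers, here's a quick back-of-the-envelope in Python (the four-output 4K card is just an assumed example, not any specific model):

```python
# Rough pixel-count comparison: HD 5870 Eyefinity vs. an assumed modern card.
def total_pixels(outputs, width, height):
    """Total pixels driven across all outputs of one card."""
    return outputs * width * height

eyefinity_5870 = total_pixels(6, 1920, 1080)    # 6x 1080p, as on the Eyefinity 6 edition
modern_assumed = total_pixels(4, 3840, 2160)    # assumption: four outputs, each at 4K
hypothetical_16 = total_pixels(16, 1920, 1080)  # the hypothetical 16x 1080p card above

print(f"HD 5870 Eyefinity (6x 1080p): {eyefinity_5870 / 1e6:.1f} MP")
print(f"Assumed modern card (4x 4K):  {modern_assumed / 1e6:.1f} MP")
print(f"Hypothetical 16x 1080p card:  {hypothetical_16 / 1e6:.1f} MP")
```

Interestingly, 16x 1080p works out to the same pixel count as 4x 4K, which many current cards with four outputs can already drive.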

Your thoughts are appreciated.
 
Cost.

Why is there only one HDMI connector, but usually three DisplayPort connectors?
Answer - DisplayPort is royalty-free. HDMI carries a fee and a royalty. USB-C carries a fee and a royalty. If you were running a GPU company, which would you be using?

The reality here is that unless you've got a very specific use case, the DP connector is cheap, and that's why it outclasses everything else.
 
When Nvidia launched the Turing generation (RTX 20 series), we saw DP over USB-C. But since then we haven't seen any USB-C outputs on consumer cards.
Sorry to be that guy, but that wasn't a standard USB-C port; it was a VirtualLink port, a standard that is now dead. Yes, it did do DP, but the pin implementation didn't follow the standard USB-C DP Alt Mode configuration.
I'm curious: why don't we have 100% USB-C outputs on graphics cards now? They seem compact and no more prone to disconnection than an HDMI connector without a screw or clip. In the same vein, a single GPU can output significantly more total pixels than the old 5870, so why don't we have something like that in USB-C, such as an Eyefinity card with 16x 1080p USB-C outputs?

Your thoughts are appreciated.
Mostly due to a lack of inputs on monitors, I would say. Plus it costs an extra couple of bucks per port to add, because of the extra chips needed to make it all work, mostly down to the fact that you can insert the cable in either orientation, which the device you're plugging into has to detect.
There's seemingly no incentive for the board makers to further increase the cost of graphics cards, and it makes for a slightly more complex PCB layout, even more so if you expect USB PD support as well. USB-C is technically only required to deliver 7.5 W.
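To illustrate the detection point: a sink works out plug orientation and the source's default current offer from which CC pin sees a pull-up and how strong that pull-up is. A minimal Python sketch of that lookup, assuming the nominal Rp values from the Type-C spec for a 5 V pull-up (56 kΩ = default USB, 22 kΩ = 1.5 A, 10 kΩ = 3.0 A); the 1.5 A tier at 5 V is where the 7.5 W figure comes from:

```python
# Sketch: how a USB-C sink infers orientation and source current capability
# from the CC pins. Rp values assume a 5 V pull-up (nominal figures:
# 56 kΩ = default USB (~0.5 A), 22 kΩ = 1.5 A, 10 kΩ = 3.0 A).

RP_TO_CURRENT_A = {56_000: 0.5, 22_000: 1.5, 10_000: 3.0}

def read_cc(cc1_rp_ohms, cc2_rp_ohms):
    """Return (orientation, advertised watts) based on which CC pin is pulled up."""
    if cc1_rp_ohms and not cc2_rp_ohms:
        orientation, rp = "normal", cc1_rp_ohms
    elif cc2_rp_ohms and not cc1_rp_ohms:
        orientation, rp = "flipped", cc2_rp_ohms
    else:
        return "unknown", 0.0
    return orientation, 5.0 * RP_TO_CURRENT_A.get(rp, 0.5)

# A 22 kΩ pull-up on CC2 means the plug is flipped and the source offers 7.5 W.
print(read_cc(0, 22_000))   # ('flipped', 7.5)
```

That detection logic (plus the mux that re-routes the high-speed pairs for the flipped case) is part of the extra per-port silicon mentioned above.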

Cost.

Why is there only one HDMI connector, but usually three DisplayPort connectors?
Answer - DisplayPort is royalty-free. HDMI carries a fee and a royalty. USB-C carries a fee and a royalty. If you were running a GPU company, which would you be using?

The reality here is that unless you've got a very specific use case, the DP connector is cheap, and that's why it outclasses everything else.
There's no fee or royalty for USB ports. However, you do have to certify the product if you want to add the USB logo and get a unique vendor ID; those carry a yearly fee, which most of the graphics card makers pay anyhow, since they make other products with USB ports.
 
DP over USB-C is a fringe choice outside the third-party accessory ecosystems built up around particular platforms, or industrial-level uses. Casting, or whatever term gets marketed for over-the-air screen sharing, is what's more likely to keep cropping up in the consumer sphere.
 
Gotta be compatible with everything else in enterprise markets OOTB; you're not going to convince big companies to change all their monitors when they get new hardware. Dell has been using DisplayPort on everything since DisplayPort existed; we have 4:3 1280x1024 monitors in circulation that have DisplayPort on them. Everyone has gotten used to HDMI and DisplayPort and they work fine, one of those "don't fix what isn't broken" situations.
For the types of devices where the form factor and capabilities of USB-C matter, you can just get Type-C-to-whatever cables.

So in the hypothetical transition you'd have to ship cards with some Type-C video ports alongside the existing display standards, and if you then reduce the number of existing ports, to 99% of customers, both consumer and business, you've just made a card with fewer ports. They're not going to use the Type-C ports; it would take years of adoption before the standard was relevant enough that people reached for it ahead of the other ports.
This is why S-Video was on every GPU until around 2008: enough people were still running their PCs into old TVs that entirely omitting the port and its associated circuitry was dangerous from a sales standpoint, because suddenly your hardware isn't compatible with the 50% of offices still using projectors that weren't VGA-compatible.
The same applies now: drop DisplayPort and, sure, that's every single Lenovo/Dell/HP monitor that no longer works with the new computers. But then that's also every HDMI projector, smartboard, or networked CCTV setup that now needs some adapter or other hardware change.

Then there's the consumer side of things, where we've all gotten used to Type-C charging our phones/laptops/vapes/blenders, and a surprisingly small number of people actually use things like Thunderbolt docks or plug laptops directly into monitors with DP over Type-C. Most people just aren't aware Type-C can do that, and would probably just see their GPU as an inconvenient place to charge their phone from.
 
Gotta be compatible with everything else in enterprise markets OOTB; you're not going to convince big companies to change all their monitors when they get new hardware. Dell has been using DisplayPort on everything since DisplayPort existed; we have 4:3 1280x1024 monitors in circulation that have DisplayPort on them. Everyone has gotten used to HDMI and DisplayPort and they work fine, one of those "don't fix what isn't broken" situations.
For the types of devices where the form factor and capabilities of USB-C matter, you can just get Type-C-to-whatever cables.

So in the hypothetical transition you'd have to ship cards with some Type-C video ports alongside the existing display standards, and if you then reduce the number of existing ports, to 99% of customers, both consumer and business, you've just made a card with fewer ports. They're not going to use the Type-C ports; it would take years of adoption before the standard was relevant enough that people reached for it ahead of the other ports.
This is why S-Video was on every GPU until around 2008: enough people were still running their PCs into old TVs that entirely omitting the port and its associated circuitry was dangerous from a sales standpoint, because suddenly your hardware isn't compatible with the 50% of offices still using projectors that weren't VGA-compatible.
The same applies now: drop DisplayPort and, sure, that's every single Lenovo/Dell/HP monitor that no longer works with the new computers. But then that's also every HDMI projector, smartboard, or networked CCTV setup that now needs some adapter or other hardware change.

Then there's the consumer side of things, where we've all gotten used to Type-C charging our phones/laptops/vapes/blenders, and a surprisingly small number of people actually use things like Thunderbolt docks or plug laptops directly into monitors with DP over Type-C. Most people just aren't aware Type-C can do that, and would probably just see their GPU as an inconvenient place to charge their phone from.
This is a bit of a flawed argument though, as you can get simple USB-C to DP adapters which work with all DP monitors.
On top of that, most corporations don't put consumer graphics cards in their computers, and consumer graphics cards are what the OP was asking about, not corporate PCs.
 
Wasn't the USB-C in RTX 20 series meant mostly for VR?
 
On top of that, most corporations don't put consumer graphics cards in their computers, and consumer graphics cards are what the OP was asking about, not corporate PCs.

I made that crossover because it exists. Not referencing corporate, industrial uses.

The hard truth is that USB standards are a mess that has isolated consumers. Why get a high-end mobo/GPU with capabilities that are unattainable until long after the realistic lifespan of such a component has elapsed? Also, _____ who removed all but a single port from their devices, creating a failed coup.
 
Wasn't the USB-C in RTX 20 series meant mostly for VR?
If you'd read my reply above, you would've seen it was called VirtualLink, which is now a dead USB-C standard.

I made that crossover because it exists. Not referencing corporate, industrial uses.

The hard truth is that USB standards are a mess that has isolated consumers. Why get a high-end mobo/GPU with capabilities that are unattainable until long after the realistic lifespan of such a component has elapsed? Also, _____ who removed all but a single port from their devices, creating a failed coup.
Still, a single USB-C port on a graphics card is not the same as dropping all the current ports.
Also, I said most corporations and I was thinking office computers, not other uses.
 
Thank God it isn't!
This standard is a confusing mess that desperately needs someone sane enough to shed all the asinine "features" it has accumulated over the last decade.
It's not "universal" if two identical ports aren't guaranteed to function the same way or even close. Alt modes are stupid and they should never have been implemented.
Version differences in fixed-purpose standards are already confusing enough.
 
If you'd read my reply above, you would've seen it was called VirtualLink, which is now a dead USB-C standard.
Yeah, missed that. I actually thought it was just a marketing name for a normal USB-C connector dedicated to a VR headset (instead of it using one of the motherboard's connectors).
 
Aside from the technical or monetary considerations, I assume that in general DisplayPort and HDMI ports are both more resistant to physical stress compared to the puny USB-C port.
 
It's not "universal" if two identical ports aren't guaranteed to function the same way or even close.
Yeah, I've seen an expensive peripheral where the power and data connections are both USB-C. I hope they were smart enough to put protection on the data port, but I wouldn't want to test it myself.
 
Sorry to be that guy, but that wasn't a standard USB-C port; it was a VirtualLink port, a standard that is now dead. Yes, it did do DP, but the pin implementation didn't follow the standard USB-C DP Alt Mode configuration.

Mostly due to a lack of inputs on monitors, I would say. Plus it costs an extra couple of bucks per port to add, because of the extra chips needed to make it all work, mostly down to the fact that you can insert the cable in either orientation, which the device you're plugging into has to detect.
There's seemingly no incentive for the board makers to further increase the cost of graphics cards, and it makes for a slightly more complex PCB layout, even more so if you expect USB PD support as well. USB-C is technically only required to deliver 7.5 W.


There's no fee or royalty for USB ports. However, you do have to certify the product if you want to add the USB logo and get a unique vendor ID; those carry a yearly fee, which most of the graphics card makers pay anyhow, since they make other products with USB ports.
  • DisplayPort Alt Mode: Many USB-C devices support this mode, allowing direct connection to DisplayPort monitors.
    • Licensing: While the DisplayPort specification is generally considered royalty-free, the DisplayPort website states that obtaining DisplayPort certification requires VESA membership and a VESA Trademark License Agreement to use DisplayPort trademarks and logos.
  • HDMI Alt Mode: This was designed to allow native HDMI signals over USB-C, but it faced limited adoption and is no longer being developed, according to Ars Technica.
    • Licensing: Using the HDMI specification and its trademarks requires being an HDMI Adopter and paying annual fees and per-unit royalties, according to Symmetry Electronics.
USB-C requires interpretation mechanisms. I called it a fee and royalty because, while the underlying connection could theoretically be had without cost, you cannot effectively do anything with it.

You mean "Greed".

No, he meant cost.

@lexluthermiester is 100% correct here.

This is the same reason that my GPU not coming with a bunch of extra cables doesn't bug me, and it's why I'm cool with the compromise. $0.20 per unit in licensing fees isn't a lot when you consider the cards can cost $500 on the low end... but that's not what we'd get charged. Implement a new standard, include a new connector, provide wiring and subcomponents, and then add on another $0.20 fee. Once you add everything up, I'd be paying another $20-30 once all is said and done so that I could theoretically hook my GPU up... which I'd do a grand total of two or three times before the GPU gets used to death and recycled.
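As a back-of-the-envelope for that jump from cents to tens of dollars, here's a sketch of how an added per-port cost can compound through typical markups (every figure below is an assumption for illustration, not an actual licensing fee or vendor margin):

```python
# Sketch: how a small per-unit cost addition can snowball into the retail price.
# All numbers are illustrative assumptions, not real licensing or margin figures.

def retail_delta(bom_delta, margins=(1.3, 1.2, 1.15)):
    """Extra retail cost if each stage (board partner, distributor, retailer)
    applies its usual markup to the added bill-of-materials cost."""
    cost = bom_delta
    for m in margins:
        cost *= m
    return cost

# Assumed extra BOM per card: fee/cert share + Alt Mode mux/PD controller + layout/validation share
added_bom = 0.20 + 3.50 + 2.00
print(f"Added BOM:    ${added_bom:.2f}")
print(f"Added retail: ${retail_delta(added_bom):.2f}")   # roughly $10 with these assumed markups
```

Whether the final delta is $10 or $30 depends entirely on those assumptions; the point is just that the retail impact ends up a multiple of the BOM line item.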

If you're going to charge me an extra twenty for something I cannot use, then at least let me get something good. Like a slightly better constructed fan or shroud.

Barring all of that, do you remember how bad HydraVision sucked? It was back before we really had good multi-monitor support, it was only on the egregiously expensive cards, and in 2002 we were still rocking a lot of CRT monitors. Today a single QHD display would blow those setups out of the water in terms of raw pixels pushed... so I'm 100% sold on the cheaper cards with "only" four ports being a reasonable cost saving. Whether that saving is passed directly to consumers is hard to tell, since the choice of which features to support is made at the outset, so it never really shows up as a line item in the budget.
 
Screenmaxing is an illness.

On a more serious note, the demand for four or fewer displays is high. The demand for five or more displays is negligible. That's why companies don't bother implementing more compact connectors in place of what already works fine as it is.
 
Yeah, missed that. I actually thought it was just a marketing name for a normal USB-C connector dedicated to a VR headset (instead of it using one of the motherboard's connectors).
It did most of the things a modern USB-C port can do, with the addition of a specific VR-headset mode which almost no VR headsets supported.
It was a very odd decision by Nvidia to add that connector to its cards.
Some AMD graphics cards had a standard USB-C port, but only on the AMD reference design; the board makers all dropped it.

  • DisplayPort Alt Mode: Many USB-C devices support this mode, allowing direct connection to DisplayPort monitors.
    • Licensing: While the DisplayPort specification is generally considered royalty-free, the DisplayPort website states that obtaining DisplayPort certification requires VESA membership and a VESA Trademark License Agreement to use DisplayPort trademarks and logos.
  • HDMI Alt Mode: This was designed to allow native HDMI signals over USB-C, but it faced limited adoption and is no longer being developed, according to Ars Technica.
    • Licensing: Using the HDMI specification and its trademarks requires being an HDMI Adopter and paying annual fees and per-unit royalties, according to Symmetry Electronics.
USB-C requires interpretation mechanisms. I called it a fee and royalty because, while the underlying connection could theoretically be had without cost, you cannot effectively do anything with it.
Eh? I think you're seriously mixing things up now.
USB-C doesn't have to support DP Alt Mode; it's an optional feature. But most companies that do USB-C also make DP products, so they're most likely members of both organisations.
In fact, you don't have to certify your cables at all, but then you're also not allowed to use any of the official logos or claim compatibility.
The HDMI Alt Mode died a long time ago, as per your text.

All of these organisations, regardless of standard, expect you to pay a yearly membership fee if you want to claim compatibility. This goes for all modern computer interfaces, so unless you want to go back to serial and parallel ports, it doesn't really matter, as someone owns the standard.
The big difference is the per-unit fee, which is nasty and shouldn't be a thing for any organisation that owns a hardware standard, and not really for software either.
 
It did most of the things a modern USB-C port can do, with the addition of a specific VR-headset mode which almost no VR headsets supported.
It was a very odd decision by Nvidia to add that connector to its cards.
Some AMD graphics cards had a standard USB-C port, but only on the AMD reference design; the board makers all dropped it.


Eh? I think you're seriously mixing things up now.
USB-C doesn't have to support DP Alt Mode; it's an optional feature. But most companies that do USB-C also make DP products, so they're most likely members of both organisations.
In fact, you don't have to certify your cables at all, but then you're also not allowed to use any of the official logos or claim compatibility.
The HDMI Alt Mode died a long time ago, as per your text.

All of these organisations, regardless of standard, expect you to pay a yearly membership fee if you want to claim compatibility. This goes for all modern computer interfaces, so unless you want to go back to serial and parallel ports, it doesn't really matter, as someone owns the standard.
The big difference is the per-unit fee, which is nasty and shouldn't be a thing for any organisation that owns a hardware standard, and not really for software either.
"...I'm curious why don't we have 100% USB C outputs on graphics cards now?..."


You cannot have a signal without generating it. To generate it, you need to encode and decode to some standard, which then travels along a cable... and the cable gives not one crap about the signals unless there's some sort of baked-in active adapter.

In short, the cards needing all this stuff is why it doesn't show up on the cards. Theoretically I could bastardize a nice section of Cat 5e to carry a bunch of UART signals... but they'd be useless if I plugged that into the port of a home router... assuming it didn't suddenly have voltage issues. That's why I'm saying the protocols, and thus the licensing fees, are why we don't see USB-C carrying video over cables... not some inherent issue with USB-C. That gets hinted at, but as always the issue boils down to money.
 
USB-C DP Alt Mode is not "micro DP" nor "DP-C". It is an optional configuration for the four (DP/SuperSpeed) "lanes" contained within a "full spec" USB 3.1 Type-C link.
It requires more circuitry^ than mDP to implement, and is often* lower bandwidth vs. mDP/DP.

MiniDP is still used to this day on many display-output cards, for good reason.
mDP/DP are (usually) passively split-able, and where they're not, "MST hubs" exist.

(Ignore the extremely outdated "summary"; mDP's bandwidth capabilities match DP's.)
[attached images: DP / mDP summary tables]

Also, mDP and DP are pin-for-pin compatible and passively adaptable.

^ [attached image]

USB-C DP Alt Mode requires a switching/handshake IC on both ends, regardless of whether all four lanes are used for DisplayPort or two are left for SuperSpeed*.

* [attached image]
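To put rough numbers on the "often* lower bandwidth" point, here's a quick Python sketch of the raw link rates when Alt Mode gets all four high-speed pairs versus when two are reserved for USB data (per-lane figures are the nominal HBR2/HBR3/UHBR10 rates; treat this as a back-of-the-envelope illustration, not a spec quote):

```python
# Sketch: raw DisplayPort link bandwidth vs. lanes assigned in USB-C DP Alt Mode.
# Per-lane rates (Gbps): HBR2 = 5.4, HBR3 = 8.1, UHBR10 = 10 (nominal figures).

LANE_RATE_GBPS = {"HBR2": 5.4, "HBR3": 8.1, "UHBR10": 10.0}

def link_bandwidth(rate, lanes):
    """Raw (pre-overhead) link bandwidth in Gbps for a given rate and lane count."""
    return LANE_RATE_GBPS[rate] * lanes

for rate in LANE_RATE_GBPS:
    full = link_bandwidth(rate, 4)   # all four pairs carry DisplayPort
    split = link_bandwidth(rate, 2)  # two pairs DisplayPort + two pairs USB SuperSpeed
    print(f"{rate}: 4-lane {full:.1f} Gbps, 2-lane (with USB data) {split:.1f} Gbps")
```

A full-size DP or mDP port always has all four lanes available for video, which is part of why Alt Mode over a shared USB-C port can end up as the lower-bandwidth option.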
 

USB-C DP Alt is *not* micro DP; it's an alternate function mode for USB that's highly dependent on cable quality.

And even more highly dependent on how much a given firmware build prioritizes that functionality.
DP is and always has been highly sensitive to cable quality.
 
Screenmaxing is an illness.

On a more serious note, the demand for four or fewer displays is high. The demand for five or more displays is negligible. That's why companies don't bother implementing more compact connectors in place of what already works fine as it is.
MST works around the problem either way, so there's no reason to have more compact connectors unless you need more bandwidth than what your hub can provide.
 
MST works around the problem either way, so there's no reason to have more compact connectors unless you need more bandwidth than what your hub can provide.
High Hz, HDR, and especially VESA Adaptive-Sync / FreeSync are not well supported, are unreliable, or just don't work on even the best MST hubs, last I checked.

But unless you're building a sharable simpit / not-a-holodeck, I don't foresee it being a problem. :laugh:
 
High Hz, HDR, and especially VESA Adaptive-Sync / FreeSync are not well supported, are unreliable, or just don't work on even the best MST hubs, last I checked.

But unless you're building a sharable simpit / not-a-holodeck, I don't foresee it being a problem. :laugh:
That's true. I wish there were more DP 2.1 hubs; a lot of them are still on 1.2. At least there are some 1.4 options around that support HDR, but most of those hubs are USB-C and I'm not sure how well they'd work with a DP adapter.
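For a rough sense of why the hub's DP version matters, here's a small Python sketch of how many identical streams fit through one upstream link (the payload figures are approximate post-coding rates, and the ~3.5 Gbps per 1080p60 stream is an assumed ballpark including blanking):

```python
# Sketch: how many identical streams fit through one MST hub's upstream link.
# Payload figures are approximate effective rates after line coding; the per-stream
# figure (~3.5 Gbps for 1080p60 at 8-bit, including blanking) is a rough assumption.

UPSTREAM_PAYLOAD_GBPS = {"DP 1.2 (HBR2)": 17.3, "DP 1.4 (HBR3)": 25.9, "DP 2.1 (UHBR20)": 77.4}

def streams_that_fit(upstream_gbps, per_stream_gbps=3.5):
    """Whole number of identical streams the shared upstream link can carry."""
    return int(upstream_gbps // per_stream_gbps)

for version, payload in UPSTREAM_PAYLOAD_GBPS.items():
    print(f"{version}: ~{streams_that_fit(payload)} x 1080p60 streams")
```

Higher refresh rates, 10-bit HDR, and larger resolutions eat into that shared budget quickly.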
 
I'm curious: why don't we have 100% USB-C outputs on graphics cards now? They seem compact and no more prone to disconnection than an HDMI connector without a screw or clip. In the same vein, a single GPU can output significantly more total pixels than the old 5870, so why don't we have something like that in USB-C, such as an Eyefinity card with 16x 1080p USB-C outputs?

Your thoughts are appreciated.
Better question: why would you want 100% USB-C outputs on graphics cards?

Going USB-C won't let you do 16 monitors. Most consumer cards are limited in their monitor output by their firmware; Nvidia consumer cards were limited to three monitors per GPU (I don't know if they still are). With 3 DP ports you can technically run 12 monitors, but good luck with software support. Since GPUs are not size-restricted like phones, the size benefit is of no help. Instead, you would need to include adapters with every card, since 99.999999% of the market uses DisplayPort or HDMI. So you're incurring higher costs for zero benefit. On connectivity, yeah, USB-C is just as prone to disconnects. DisplayPort, however, has provisions for clips to hold it in place. So going USB-C would be a major downgrade.

A manufacturer could easily just put 4x DisplayPorts on a card, and an end user could then buy 4-port MST hubs to drive 16 monitors, four per port. But that is such a ridiculously niche use case that there's no business reason to do so. For the customer, you gain nothing over using DP, and you move to an objectively inferior connector for the application while breaking compatibility and requiring adapters for your monitors.
 