Ok, you keep thinking that. I had a PCI to USB 1.2 adapter BITD and it worked fine. Drivers worked fine, card worked fine. Output was 1024x768@30Hz. Can't find a link for it, some no-name brand that doesn't exist anymore. But it worked great and without any issues. Making such an adapter for PCIe would take some work, sure, but it would still be trivial.
1. USB 1.2 does not exist and never did. USB 1.0 had a 1.1 revision, then it went to 2.0. You might be thinking of USB 1.1 Full Speed?
2. 1024x768 @ 30Hz with 8-bit color is 188+ megabits of data per second. USB 1.1 Full Speed was limited to 12 Mbps, so it couldn't possibly drive such a resolution. Even limiting ourselves to black and white (1 bit per pixel) we're still nearly twice over the bandwidth limit (see the quick calculation at the end of this list). For such an adapter USB 2.0 would be required.
3. Such an adapter would have to actively convert between the USB protocol and the PCI protocol, and would thus be expensive. It would not be trivial.
4. PCI to USB? Yet you say you got a graphics output? So you mean USB to PCI, with a GPU then connected in the PCI slot?
5. Without proof of such an adapter ever existing, we're left with null and void claims. I have to side with @Yukikaze on this, as I also don't think such a product would be trivial to produce, and the market for it would be extremely limited because there's little point to it. If video is all you're looking for, why the need for an eGPU in the first place? One of those USB to HDMI adapters you linked does the job just fine. If you're looking to actually take advantage of a desktop-grade GPU, USB is not the way to go for many reasons; latency being the primary one.
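For anyone who wants to check the math in point 2, here's a quick back-of-the-envelope sketch in Python. It counts raw pixel bits only, with no blanking intervals or USB protocol overhead, so the real requirement would be even higher:

```python
# Back-of-the-envelope bandwidth check for an uncompressed framebuffer
# pushed over USB (raw pixels only; overhead would make it worse).

USB_LIMITS_MBPS = {
    "USB 1.1 Full Speed": 12,
    "USB 2.0 Hi-Speed": 480,
}

def required_mbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Raw bits per second for an uncompressed video stream, in megabits."""
    return width * height * refresh_hz * bits_per_pixel / 1e6

for bpp, label in [(8, "8-bit color"), (1, "1-bit mono")]:
    need = required_mbps(1024, 768, 30, bpp)
    print(f"1024x768@30Hz, {label}: {need:.1f} Mbps needed")
    for name, limit in USB_LIMITS_MBPS.items():
        verdict = "fits" if need <= limit else f"{need / limit:.1f}x over"
        print(f"  vs {name} ({limit} Mbps): {verdict}")
```

Running it: 8-bit color needs ~188.7 Mbps (over 15x what Full Speed's 12 Mbps can carry), and even 1-bit mono needs ~23.6 Mbps, about 2x over. USB 2.0's 480 Mbps fits both.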
While we can see where you're going with this, it's not really what we were talking about. We were talking about an adapter that lets you plug a standard PCIe GPU into a USB-connected PCIe slot. He was saying it's not possible. I disagreed, as the topic of this very article has shown out-of-the-box thinking: using something successfully in a way it was not intended.
The topic of this very article is connecting a PCIe device to a PCIe slot. Whoopdedoo. That is exactly how the protocol was intended to be used. What's 'strange' is that the PCIe slot happens to be in another form factor, one commonly used only for storage/wifi/bt. It's not out-of-the-box thinking; it's using the slot for a slightly abnormal, but totally supported, use case. Converting USB to PCIe and having it function like a regular ol' PCIe slot would be in another ballpark entirely, which is what @Yukikaze was trying to explain to you.
May I remind you, my original post was...
Did I write anything about it being plugged into the motherboard or any available USB port and functioning properly for "regular" data transfer? I can swap out that cable for any cheap USB 3.0 Type-A cable and it still works normally, so there's nothing "proprietary" about the bespoke cable. Yeah, it's weird, but whoever bought that card knows what they're doing.
Oh and one more thing,
Yukikaze said you couldn't run an eGPU from USB 3.0 (indicating THEY were talking about USB 3.0 being the source, as in "on the motherboard or any available USB port"), to which you replied "But akschually" and linked something that could NOT be used to connect PCIe to USB 3.0. I pointed out it is only using the cable because of the wires, not because it's USB.
What do you guys call that cable? And also...
If it's not HDMI, then what type of port did they use? mPCIeMI?
Number 1 is a USB to serial port adapter.
Number 2 is a proprietary cable that uses the form factor and wires of an HDMI cable but cannot be used for HDMI purposes. If you followed the link provided in the post, you'd see Tongban mentioned it is used only for their eGPU solution; it is not and cannot be used for HDMI. It simply uses the wires in the cable to transfer PCIe signals, and happens to terminate in an HDMI male end.
That's literally just a PCIe 3.0 x4 to PCIe 3.0 x4 "riser". Very minimal loss in signal integrity. All it does is convert form factors.
The same performance drop-off you get from running M.2 over USB is what you'd expect of an external GPU in practice. It can be done, but performance won't match a directly slotted card. The point about USB polling-rate interrupts is fair; I wouldn't expect peak performance out of it. Moonlight streaming, I think, is a better option in general where possible.
I believe a USB-C to M.2 portable enclosure, followed by an M.2-to-PCIe riser adapter, might work in practice, but not at the peak performance you'd expect of the PCIe device. If it's just the M.2-to-PCIe riser, though, it should perform near-identically in practice, up to the PCIe 3.0/4.0 x4 specification of the motherboard's M.2 slot, which is physically wired to PCIe in the first place. M.2 is simply a form-factor slot, while NVMe, which normally runs over it, is a protocol.
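If it helps, here's a rough sketch of where the bottleneck lands in each chain. The link speeds are published theoretical maxima; real-world throughput is lower:

```python
# Rough bottleneck check for the two chains discussed above.
# The slowest link caps the whole chain.

LINK_GBPS = {
    "PCIe 3.0 x4": 31.5,   # ~3.94 GB/s after 128b/130b encoding
    "PCIe 4.0 x4": 63.0,
    "USB 3.2 Gen 2": 10.0,
}

def chain_bottleneck(links: list[str]) -> str:
    """Return the name of the slowest link in the chain."""
    return min(links, key=lambda name: LINK_GBPS[name])

direct_riser = ["PCIe 3.0 x4"]              # M.2 slot -> riser -> GPU
via_usb = ["USB 3.2 Gen 2", "PCIe 3.0 x4"]  # USB enclosure -> M.2 adapter -> riser -> GPU

for label, chain in [("M.2 riser only", direct_riser), ("USB-first chain", via_usb)]:
    bn = chain_bottleneck(chain)
    print(f"{label}: capped at {LINK_GBPS[bn]} Gbps by {bn}")
```

The point being: the riser-only chain is capped by the M.2 slot's own PCIe 3.0 x4 link, while the USB-first chain is capped at 10 Gbps before the GPU even sees a PCIe lane.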
USB-C is, again, just a form factor. Theoretically you could have USB 2.0 run off a Type-C connector. What you mean to say is Thunderbolt 3, which uses the Type-C connector and for which there are plenty of eGPU solutions already. Or, if you really want to do it the difficult way, you could use a USB 3.2 Gen 2 (ugh, damn you, USB-IF!) to M.2 adapter and then possibly use one of these risers to connect a GPU. However, the latency penalty of starting at USB would make the GPU useless in practice (not a problem for storage devices, though, which is why NVMe drives still work semi-decently even over USB), and as you mention, streaming would be much, much, much preferred.
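To put some (admittedly assumed, order-of-magnitude) numbers on that latency argument: a GPU driver makes lots of small register/doorbell round trips per frame, while an SSD mostly streams large blocks that amortize per-transaction latency away. A toy model, with every figure an assumption for illustration rather than a measurement:

```python
# Toy latency model for why starting at USB hurts a GPU far more than an
# SSD. All numbers below are assumptions for illustration, not measurements.

PCIE_RTT_US = 1.0    # assumed round trip for a small native PCIe transaction
USB_RTT_US = 100.0   # assumed round trip once tunneled through USB scheduling

ROUND_TRIPS_PER_FRAME = 200  # assumed small MMIO/doorbell exchanges per frame

def overhead_ms(round_trips: int, rtt_us: float) -> float:
    """Total per-frame latency overhead in milliseconds."""
    return round_trips * rtt_us / 1000.0

for name, rtt in [("native PCIe", PCIE_RTT_US), ("tunneled over USB", USB_RTT_US)]:
    print(f"{name}: {overhead_ms(ROUND_TRIPS_PER_FRAME, rtt):.1f} ms "
          f"of round-trip overhead per frame")

# A 60 fps frame budget is ~16.7 ms, so the USB path is already blown
# before any actual rendering happens. Bulk storage transfers don't care:
# one big sequential read pays the round-trip cost only a handful of times.
```

With these assumed figures, the USB-tunneled chain burns more than an entire 60 fps frame budget on round trips alone, which is exactly why streaming wins.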