Monday, January 25th 2021

Kosin Demonstrates RTX 3090 Running Via Ryzen Laptop's M.2 Slot

Kosin, a Chinese subsidiary of Lenovo, has recently published a video showing how they modded a Ryzen notebook to run an RTX 3090 from its NVMe M.2 slot. Kosin used a Xiaoxin Air 14 laptop with a Ryzen 5 4600U processor for the demonstration. The system's internal M.2 NVMe SSD was removed and an M.2-to-PCIe expansion cable was attached, allowing them to connect the RTX 3090. Finally, the laptop housing was modified to let the PCIe cable exit the chassis, and a desktop power supply was attached to the RTX 3090 for power.

The system booted and correctly detected and utilized the attached RTX 3090. The system performed admirably, scoring 14,008 points in 3DMark Time Spy; for comparison, the RTX 3090 paired with a desktop Ryzen 5 3600 scores 15,552, and when paired with a Ryzen 7 5800X it scores 17,935. While pairing an RTX 3090 with a mid-range mobile processor is an extreme example, it goes to show how much performance is achievable over the M.2 connector. The x4 PCIe 3.0 link of the laptop's M.2 slot can handle a maximum of roughly 4 GB/s, while the x16 PCIe 3.0 slot on previous-generation processors offered 16 GB/s, and the new x16 PCIe 4.0 connector doubles that, providing 32 GB/s of available bandwidth.
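As a back-of-the-envelope check on those figures (an illustrative aside, not from the source article), per-direction PCIe bandwidth follows from the per-lane transfer rate and the 128b/130b encoding that both Gen3 and Gen4 use; a minimal Python sketch:

    # Approximate one-direction PCIe link bandwidth (illustrative sketch).
    # Gen3 runs at 8 GT/s per lane, Gen4 at 16 GT/s, both with 128b/130b encoding.
    RATES_GTS = {"PCIe 3.0": 8.0, "PCIe 4.0": 16.0}
    ENCODING = 128 / 130  # usable fraction after line coding

    def link_bandwidth_gbs(gen, lanes):
        """Approximate usable bandwidth in GB/s for one direction of a link."""
        return RATES_GTS[gen] * ENCODING / 8 * lanes  # GT/s -> GB/s per lane

    for gen, lanes in [("PCIe 3.0", 4), ("PCIe 3.0", 16), ("PCIe 4.0", 16)]:
        print(f"{gen} x{lanes}: ~{link_bandwidth_gbs(gen, lanes):.1f} GB/s")
    # PCIe 3.0 x4:  ~3.9 GB/s  (the laptop's M.2 slot)
    # PCIe 3.0 x16: ~15.8 GB/s
    # PCIe 4.0 x16: ~31.5 GB/s

The exact values land slightly under the round 4/16/32 GB/s numbers quoted above because of the encoding overhead.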
Source: PC Watch

55 Comments on Kosin Demonstrates RTX 3090 Running Via Ryzen Laptop's M.2 Slot

#26
Mussels
Freshwater Moderator
newtekie1: NVME has nothing to do with this. NVME is a storage protocol that runs over the PCI-E bus. If you don't have a storage device plugged in, then the connector is nothing more than another form factor for PCI-E x4. It doesn't matter if it is miniPCI-E or M.2, it is still just a PCI-E x4 connection to the graphics card.



That is what Thunderbolt does, in a much nicer form, and faster too. And Thunderbolt achieves that because it is also just a PCI-E x4 connection.
Shhh i'm having fun. i like the shiny.
#27
1d10t
Yukikaze: No, USB3.0 cannot support an eGPU and never could.


Technically, yes, they couldn't, for any normal purpose other than mining.
#28
kayjay010101
1d10t: Technically, yes, they couldn't, for any normal purpose other than mining.
That's just running PCIe over USB wires. It is not USB 3.0 in any way apart from the cables. You could NOT plug one of those riser boards into a normal USB 3.0 port.
#29
Baum
Mussels: This is via NVME, which is a lot faster than mini PCI-E - that's the key difference. Almost every modern laptop has an NVME slot that runs 3.0 x4 or 4.0 x4 bandwidth, making the potential for this a lot greater than needing custom solutions
No one cares about bandwidth!
From the early days of MXM upgrades to the later mini-PCIe slot upgrades, bandwidth was the least important thing.

If the BIOS cannot address all the memory, or the driver just refuses to pick up the eGPU, the "e" in external GPU turns into "exit" for the experiment!

With UEFI, and NVIDIA switchable graphics being available beforehand, this is now even easier than before.
And no, AMD graphics did not help, as their drivers were stubborn and would block things or refuse to work stably.

The article should also look at the Intel side, where Thunderbolt helps eGPUs and you won't need to cut open the laptop shell at all.

This is news because of the AMD laptop CPU being paired with the NVIDIA GPU, not because of the bandwidth.

Unencrypted UEFI updates, and all the options unlocked in laptop BIOS/UEFI, would be more of a selling point for news than this :-P
#30
Yukikaze
1d10t: Technically, yes, they couldn't, for any normal purpose other than mining.
That isn't USB, just like this isn't HDMI.

That's just sending a native PCIe signal over a USB cable, since it has enough wires and supports the signaling rate.

Connecting this to USB will in the best case not work, and in the worst case will fry something.

The card itself is a PCIe switch with 1 upstream lane and 4 downstream lanes. I have one of those.
#31
lexluthermiester
Yukikaze: No, USB3.0 cannot support an eGPU and never could.
That is absolutely incorrect. Video over USB was done back in the USB2.0 days. It wasn't ideal, but it's doable. USB3.0 is also very much doable.
EDIT:
www.amazon.com/Manhattan-Adapter-Easily-Converts-151061/dp/B006VYWIVI

Then there's video capture devices that work over USB2.0.

Making an eGPU adapter for USB would be almost trivial and would work just fine.
#32
Yukikaze
Yeah, no.
lexluthermiester: That is absolutely incorrect. Video over USB was done back in the USB2.0 days. It wasn't ideal, but it's doable. USB3.0 is also very much doable.

Then there's video capture devices that work over USB2.0.
There is a very big difference between making a USB adapter which can output video and making an eGPU adapter that can take a desktop video card and run it with its native drivers. Capture cards are something entirely different and completely irrelevant for the comparison. This isn't just about bandwidth, either. For example, a mPCIe slot is a single PCIe lane, giving you at best 8 Gbps (with a Gen3 slot), and you can run an eGPU over such a slot, and it has been done hundreds of times (even with a Gen1 or Gen2 slot). USB 3.x can be more than this at 10 Gbps or 20 Gbps, but that won't help it any.
lexluthermiester: Making an eGPU adapter would be almost trivial.
No, it wouldn't be trivial. USB and PCIe are two extremely different technologies, and are not at all compatible with each other. Again, there is a lot of difference between "I can build a chip that can output video when connected to USB" or even "I can get a video card to be connected over USB and play a video" and "I can connect a desktop video card to a USB port and make it work with its native drivers for 3D workloads such as games."

If it were trivial, it would've been done by now, and well publicized.
#33
lexluthermiester
Yukikaze: Yeah, no.

There is a very big difference between making a USB adapter which can output video and making an eGPU adapter that can take a desktop video card and run it with its native drivers. Capture cards are something entirely different and completely irrelevant for the comparison.



No, it wouldn't be trivial. USB and PCIe are two extremely different technologies, and are not at all compatible with each other. Again, there is a lot of difference between "I can build a chip that can output video when connected to USB" or even "I can get a video card to be connected over USB and play a video" and "I can connect a desktop video card to a USB port and make it work with its native drivers for 3D workloads such as games."

If it were trivial, it would've been done by now, and well publicized.
Ok, you keep thinking that. I had a PCI to USB 1.2 adapter BITD and it worked fine. Drivers worked fine, card worked fine. Output was 1024x768@30Hz. Can't find a link for it, some no-name brand that doesn't exist anymore. But it worked great and without any issues. Making such an adapter for PCIe would take some work, sure, but it would still be trivial.
#34
InVasMani
kayjay010101: That's just running PCIe over USB wires. It is not USB 3.0 in any way apart from the cables. You could NOT plug one of those riser boards into a normal USB 3.0 port.
The M.2 slot is physically wired as PCIe, so I could see a device designed the same way being created to achieve the same thing in practice, if someone bothered to do so. This seems pretty close to what is being demonstrated: an M.2 to PCIe 3.0 x4 riser. Judging from what I can see of it, a USB-C to M.2 adapter enclosure which in turn uses one of these adapters might even work, but bandwidth would be constrained to the USB-C link.

www.amazon.com/ADT-Link-External-Graphics-GTX1080ti-R43SG-TU/dp/B07XZ22HQ3
#35
Yukikaze
lexluthermiester: Ok, you keep thinking that. I had a PCI to USB 1.2 adapter BITD and it worked fine. Drivers worked fine, card worked fine. Output was 1024x768@30Hz. Can't find a link for it, some no-name brand that doesn't exist anymore. But it worked great and without any issues. Making such an adapter for PCIe would take some work, sure, but it would still be trivial.
This is trivial, yet nothing like it exists. Why is that? Obviously the market exists, since eGPUs have been a thing for a decade or so now, in various forms, yet no company has built one, and no one has shown it to work with a game, while mPCIe, ExpressCard, M.2 and various generations of Thunderbolt all have adapters or enclosures which will connect PCIe to them, often at considerable expense.

You also offer no explanation as to how that would work, seeing as USB and PCIe are very, very different in how they work at a basic level (one is a DMA-capable, both-sides-can-send-data interconnect, while the other is a polling-based, non-DMA capable, master/slave connection).

Anything can be software emulated on the host side, but even considering that, that was not done. The reasons lie in latency and bandwidth issues, and that's the end of the story. This is not a viable solution for an eGPU, and no, it isn't trivial.
#36
lexluthermiester
Yukikaze: This is trivial, yet nothing like it exists. Why is that? Obviously the market exists, since eGPUs have been a thing for a decade or so now, in various forms, yet no company has built one, and no one has shown it to work with a game, while mPCIe, ExpressCard, M.2 and various generations of Thunderbolt all have adapters or enclosures which will connect PCIe to them, often at considerable expense.

You also offer no explanation as to how that would work, seeing as USB and PCIe are very, very different in how they work at a basic level (one is a DMA-capable, both-sides-can-send-data interconnect, while the other is a polling-based, non-DMA capable, master/slave connection).

Anything can be software emulated on the host side, but even considering that, that was not done. The reasons lie in latency and bandwidth issues, and that's the end of the story. This is not a viable solution for an eGPU, and no, it isn't trivial.
I'm not going to argue with you. You say it's not possible, I say you're wrong as I have owned something similar in the past. Just because it hasn't been done recently doesn't mean it's not possible. Your opinions are not the end-all-be-all of technological possibilities.
#37
Yukikaze
These aren't opinions: These are technological facts. You can face them, or you can ignore them. They won't go away or change either way. Have a nice day.
#38
InVasMani
The same performance overhead drop-off from an M.2 over USB is what you'd expect of an external GPU in practice. It can be done, but performance won't be the same as with a slotted card. The polling-rate/interrupt aspect is a fair point about USB; I wouldn't expect peak performance out of it. Moonlight streaming is, I think, a better option in general where possible, if I'm not mistaken.

I believe a USB-C to M.2 portable device, then using the M.2 slot to PCIe riser adapter, might work in practice, but not at the peak performance you'd expect of the PCIe device. If it's just the M.2 to PCIe riser, though, it should work near identically in practice up to the M.2 slot's PCIe 3.0/4.0 x4 specification on the motherboard, since the slot itself is physically wired to PCIe in the first place. M.2 is simply a form-factor slot; NVMe, which the drives normally connected to it use, is a protocol.
#39
1d10t
kayjay010101: That's just running PCIe over USB wires. It is not USB 3.0 in any way apart from the cables. You could NOT plug one of those riser boards into a normal USB 3.0 port.
Yukikaze: That isn't USB, just like this isn't HDMI.

That's just sending a native PCIe signal over a USB cable, since it has enough wires and supports the signaling rate.

Connecting this to USB will in the best case not work, and in the worst case will fry something.

The card itself is a PCIe switch with 1 upstream lane and 4 downstream lanes. I have one of those.
May I remind you, my original post was...
1d10t: You could also be utilizing USB 3.0 back then, when the mining craze emerged :D
One of my favorites was ADT-Link; they had every mod you can imagine.
Did I write anything about plugging into the motherboard or any available USB port and functioning properly for "regular" data transfer? I can swap out that cable with any cheapo USB 3.0 Type-A cable and it still works normally, so there's nothing "proprietary" about the bespoke cable. Yeah, it's weird, but whoever bought that card knows what they're doing.
Oh and one more thing,



What do you guys call that cable? And also...



If it's not HDMI, then what type of port did they use? mPCIeMI?
#40
lexluthermiester
1d10t: May I remind you, my original post was...

Did I write anything about plugging into the motherboard or any available USB port and functioning properly for "regular" data transfer? I can swap out that cable with any cheapo USB 3.0 Type-A cable and it still works normally, so there's nothing "proprietary" about the bespoke cable. Yeah, it's weird, but whoever bought that card knows what they're doing.
Oh and one more thing,

What do you guys call that cable? And also...

If it's not HDMI, then what type of port did they use? mPCIeMI?
While we can see where you're going with this, it's not really what we were talking about. We were talking about an adapter that lets you plug a standard PCIe GPU into a USB-connected PCIe slot. He was saying it's not possible. I disagreed, as the topic of this very article has shown out-of-the-box thinking, using something successfully in a way it was not intended.
#42
kayjay010101
lexluthermiester: Ok, you keep thinking that. I had a PCI to USB 1.2 adapter BITD and it worked fine. Drivers worked fine, card worked fine. Output was 1024x768@30Hz. Can't find a link for it, some no-name brand that doesn't exist anymore. But it worked great and without any issues. Making such an adapter for PCIe would take some work, sure, but it would still be trivial.
1. USB 1.2 does not exist, never did. USB 1.0 had a 1.1 revision, then it went to 2.0. You might be thinking of full-speed 1.1?
2. 1024x768 @ 30Hz with 8-bit color is 188+ megabits of data per second. USB 1.1 full speed was limited to 12 Mbps, so it couldn't possibly drive such a resolution. Even limiting ourselves to black and white (1 bit) we're still nearly twice over the bandwidth limit. For such an adapter USB 2.0 would be required.
3. Such an adapter would have to actively convert from the USB protocol to the PCI protocol and thus be expensive. It would not be trivial.
4. PCI to USB? Yet you say you got a graphics output? So you mean USB to PCI, with a GPU then connected in the PCI slot?
5. Without proof of such an adapter ever existing, we're left with null and void claims. I have to side with @Yukikaze on this as I also don't think such a product would be trivial to produce, and the market for it would be extremely limited because of the lack of a point to it. If video is all you're looking for, why the need for an eGPU in the first place? One of those USB to HDMI adapters you linked does the job just fine. If you're looking to actually take advantage of a desktop-grade GPU, USB is not the way to go for so many reasons, latency being the primary one.
lexluthermiester: While we can see where you're going with this, it's not really what we were talking about. We were talking about an adapter that lets you plug a standard PCIe GPU into a USB-connected PCIe slot. He was saying it's not possible. I disagreed, as the topic of this very article has shown out-of-the-box thinking, using something successfully in a way it was not intended.
The topic of this very article is connecting a PCIe device to a PCIe slot. Whoopdedoo. That is exactly how the protocol was intended to be used. What's 'strange' is that the PCIe slot happens to be in another form factor, one that happens to be commonly used for storage/wifi/BT only. It's not out-of-the-box thinking; it's using the slot for a slightly abnormal, but totally supported, use case. Converting USB to PCIe and having it function like a regular ol' PCIe slot would be in another ballpark entirely, which is what @Yukikaze was trying to explain to you.
1d10t: May I remind you, my original post was...

Did I write anything about plugging into the motherboard or any available USB port and functioning properly for "regular" data transfer? I can swap out that cable with any cheapo USB 3.0 Type-A cable and it still works normally, so there's nothing "proprietary" about the bespoke cable. Yeah, it's weird, but whoever bought that card knows what they're doing.
Oh and one more thing,
Yukikaze said you couldn't run an eGPU from USB 3.0 (indicating THEY were talking about USB 3.0 being the source; as in "on the motherboard or any available USB port"), to which you replied "But akschually" and linked something that could NOT be used to connect PCIe to USB 3.0. I pointed out it is only using the cable because of the wires, not because it's USB.
1d10t: What do you guys call that cable? And also...

If it's not HDMI, then what type of port did they use? mPCIeMI?
Number 1 is a USB to serial port adapter.
Number 2 is a proprietary cable that uses the form factor and wires of an HDMI cable but cannot be used for HDMI purposes. If you followed the link provided in the post you'd see Tongban mentioned it is used only for their eGPU solution, it is not and cannot be used for HDMI. It simply uses the wires in the cable to transfer PCIe signals, and happens to terminate in an HDMI male end.
InVasMani: There are these as well.


www.amazon.com/EXPLOMOS-NGFF-Adapter-Power-Cable/dp/B074Z5YKXJ/ref=pd_sbs_10?pd_rd_w=mY227&pf_rd_p=c52600a3-624a-4791-b4c4-3b112e19fbbc&pf_rd_r=S91CSQJVV90GKVQS9K80&pd_rd_r=b4337651-b87f-483a-a10b-1fb81a5195b1&pd_rd_wg=lPmR8&pd_rd_i=B074Z5YKXJ&psc=1
That's literally just a PCIe 3.0 x4 to PCIe 3.0 x4 "riser". Very minimal loss in signal integrity. All it does is convert form factors.
InVasMani: The same performance overhead drop-off from an M.2 over USB is what you'd expect of an external GPU in practice. It can be done, but performance won't be the same as with a slotted card. The polling-rate/interrupt aspect is a fair point about USB; I wouldn't expect peak performance out of it. Moonlight streaming is, I think, a better option in general where possible, if I'm not mistaken.

I believe a USB-C to M.2 portable device, then using the M.2 slot to PCIe riser adapter, might work in practice, but not at the peak performance you'd expect of the PCIe device. If it's just the M.2 to PCIe riser, though, it should work near identically in practice up to the M.2 slot's PCIe 3.0/4.0 x4 specification on the motherboard, since the slot itself is physically wired to PCIe in the first place. M.2 is simply a form-factor slot; NVMe, which the drives normally connected to it use, is a protocol.
USB-C is, again, just a form factor. Theoretically you could have USB 2.0 run off a type C connector. What you mean to say is Thunderbolt 3, which uses the type C connector, of which there are plenty of eGPU solutions already. Or, if you really want to do it the difficult way, you could use a USB 3.2 gen 2 (ugh, damn you, USB-IF!) to an M.2 adapter and then possibly use one of these risers to connect a GPU. However, the latency penalty of starting at USB would make the GPU useless in practice (not a problem for storage devices though, which is why NVMe drives still work semi-decently even over USB), and as you mention, streaming would be much, much, much preferred.
#43
lexluthermiester
kayjay010101: 2. 1024x768 @ 30Hz with 8-bit color is 188+ megabits of data per second.
You need to recheck your math on that.
kayjay010101: 1. USB 1.2 does not exist, never did. USB 1.0 had a 1.1 revision, then it went to 2.0. You might be thinking of full-speed 1.1?
Yeah, I was thinking 1.1 with 12 Mbps. It was 20+ years ago. Still worked well.
kayjay010101: The topic of this very article is connecting a PCIe device to a PCIe slot. Whoopdedoo.
Right. Good point.
#44
1d10t
lexluthermiester: While we can see where you're going with this, it's not really what we were talking about. We were talking about an adapter that lets you plug a standard PCIe GPU into a USB-connected PCIe slot. He was saying it's not possible. I disagreed, as the topic of this very article has shown out-of-the-box thinking, using something successfully in a way it was not intended.
kayjay010101: Yukikaze said you couldn't run an eGPU from USB 3.0 (indicating THEY were talking about USB 3.0 being the source; as in "on the motherboard or any available USB port"), to which you replied "But akschually" and linked something that could NOT be used to connect PCIe to USB 3.0. I pointed out it is only using the cable because of the wires, not because it's USB.

Number 1 is a USB to serial port adapter.
Number 2 is a proprietary cable that uses the form factor and wires of an HDMI cable but cannot be used for HDMI purposes. If you followed the link provided in the post you'd see Tongban mentioned it is used only for their eGPU solution, it is not and cannot be used for HDMI. It simply uses the wires in the cable to transfer PCIe signals, and happens to terminate in an HDMI male end.
Fair enough, the misdirection steered the debate into nowhere. I should be more specific next time.
Still baffles me though, getting flak for using the word "utilizing" :p
#45
kayjay010101
lexluthermiester: You need to recheck your math on that.


Yeah, I was thinking 1.1 with 12 Mbps. It was 20+ years ago. Still worked well.


Right. Good point.
1024 horizontal pixels x 768 vertical pixels x 30 times per second x 8 bits of color = 188,743,680 bits per second ≈ 188 megabits per second, assuming no compression. Is that not right?
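A quick sanity check of that arithmetic (an illustrative aside, not part of the original post):

    # Uncompressed video bandwidth: width x height x refresh rate x bits per pixel.
    width, height, refresh_hz, bpp = 1024, 768, 30, 8
    bits_per_second = width * height * refresh_hz * bpp
    print(bits_per_second)        # 188743680
    print(bits_per_second / 1e6)  # ~188.7 Mbit/s

So the figure holds: an uncompressed 1024x768 @ 30 Hz, 8-bit stream needs roughly 16x more bandwidth than USB 1.1's 12 Mbit/s full-speed mode.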
1d10t: Fair enough, the misdirection steered the debate into nowhere. I should be more specific next time.
Still baffles me though, getting flak for using the word "utilizing" :p
Not flak, just friendly discussion, eh? :) Helps to be specific to avoid arguing over nothing. Just something to keep in mind!
#46
InVasMani
It might've been an interlaced CRT for all you know, with a higher refresh rate than you'd expect of a typical LCD from that time frame, or the same refresh rate at double the scan speed with half the data requirement. For an external GPU that could've worked to a point and still been better than the integrated graphics of that era, perhaps. Even today integrated graphics isn't very impressive, and it was very bad a decade ago. My first discrete GPU was VGA and had 2MB on it, for what it's worth; it might have been ISA, I don't recall. USB 2.0 hi-speed would've been capable of much more with an external discrete GPU. Also consider that the slowdowns would largely be micro-stutter hitching while initially transferring data into VRAM. Fluidly rendering 30-60 FPS output to a display isn't that difficult in itself, especially at lower resolutions, and no harder than with streaming.

You'd perhaps experience lots of latency issues depending on what was being played, but not in all cases. You've got to take into account that a 1024x768 game today is in many instances vastly more demanding than one from a decade ago, even at the same resolution, unless settings have been well optimized downward relative to the resolution. Games today have more baked-in image detail on average at the same resolutions as yesterday's. The big breakthrough for USB was 2.0 hi-speed, which enabled many things that weren't previously possible in many areas, such as external audio interfaces with multiple tracks of audio at reasonable latency.
#47
Baum
With all this DIY fun, everyone forgets something important:

ESD and Surge Protection

M.2 goes ZAP if you touch the cable after you've walked across the carpet... bye-bye, laptop motherboard
#48
bug
Baum: With all this DIY fun, everyone forgets something important:

ESD and Surge Protection

M.2 goes ZAP if you touch the cable after you've walked across the carpet... bye-bye, laptop motherboard
Germans and their engineering... :P

This is just a proof of concept, it doesn't have to be consumer friendly.
#49
InVasMani
Once you see it you can't unsee it...

#50
Caring1
InVasMani: Once you see it you can't unsee it...

I doubt those laptops are being used for mining as the only cable to them is a power cable.