
Sabrent Introduces its Quad NVMe SSD to PCIe 4.0 x16 Card

Joined
Feb 18, 2023
Messages
245 (0.36/day)
Another stupidly noisy bifurcated card.

Why not something simple like:


Great cards (I have a stack of them); just watch out for the little diodes next to the M.2 connector. If you have double-sided 4TB NVMe Gen4 drives with Phison controllers, they will be tough to plug in, but not impossible. Just be careful.

I love the OWC Accelsior 8M.2, but the pricing and importing to the EU are so meh.

That one doesn't have hardware RAID either, so it has the same problem.

Not in workstation or server systems, where there are plenty of PCIe x16 slots and lanes. This product, without a PCIe switch chip onboard, is aimed more at those systems than at desktops.

The problem isn't the PCIe lanes; the real problem is that it doesn't have hardware RAID. If the BIOS and/or the motherboard dies, your RAID goes with it.
 
Joined
Feb 17, 2010
Messages
1,678 (0.31/day)
Location
Azalea City
System Name Main
Processor Ryzen 5950x
Motherboard B550 PG Velocita
Cooling Water
Memory Ballistix
Video Card(s) RX 6900XT
Storage T-FORCE CARDEA A440 PRO
Display(s) MAG401QR
Case QUBE 500
Audio Device(s) Logitech Z623
Power Supply LEADEX V 1KW
Mouse Cooler Master MM710
Keyboard Huntsman Elite
Software 11 Pro
Benchmark Scores https://hwbot.org/user/damric/
Of course it does. Each NVMe drive has its own distinct features, such as controller, supported speeds, number of channels for NAND attachment, DRAM or DRAM-less operation, etc.
If you wish to fully utilize what a Gen4 NVMe drive can do, check the reviews and look out for drives that have one of these controllers: Phison PS5018-E18, Silicon Motion SM2264, Samsung Pascal S4LV008, etc. If you do not need top-notch drives, you can go for drives with controllers one tier down, such as the Phison PS5021T, SM2267 or Samsung Elpis S4LV003.

What worries me is that such AICs block airflow towards the GPU. If a motherboard has two x16 slots with x8/x8 bifurcation, I'd install the NVMe AIC in the first one, closer to the CPU, and the GPU in the second one. This way the airflow towards the GPU is free from obstacles.


Almost no one in the world needs an AIC with PCIe 5.0 support. What would you do with it?
So I wouldn't be able to take a random bunch of m.2 drives and just stick them in there and expect them to work?
 
Joined
Jul 30, 2019
Messages
3,338 (1.69/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
The other problem is that if your motherboard breaks or your BIOS crashes, so does your RAID.
If I remember correctly, if you use software RAID (like Windows Storage Spaces) you can move the array between different hardware, but yes, if you have an extreme hardware failure your storage can still be broken.

That one doesn't have hardware RAID either, so it has the same problem.
Why do you need hardware RAID for SSDs, especially in a time when we have CPUs with plenty of cores to spare?
 
Joined
Aug 25, 2021
Messages
1,182 (0.97/day)
Plus, the sooner it comes out, the sooner it'll be available at a price I can actually justify!
You will have GPUs with a Gen5 interface next year; however, you will not be "using" it to its full capability. This is because current high-end GPUs can barely saturate a Gen4 x8 link, as shown in TPU testing a few months ago.

One useful case for PCIe 5.0 will be several devices in x16, x8 and x4 slots no longer competing for bandwidth. You could have a Gen5 GPU in the x16 slot using an x8 connection, bifurcated with the second x8 slot where an AIC could be attached. But... which Gen5 AIC? NVMe? There are already 3-5 NVMe slots on a good motherboard, two of which are Gen5 on AM5.

So, the issue is that there is plenty of Gen5 connectivity from the Zen4 CPU (24 lanes), but few devices that can meaningfully use all the available bandwidth, due to the slow development of Gen5 peripherals.
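As a rough sanity check on that gap (my own back-of-envelope numbers, not from the TPU article), here is a minimal Python sketch of per-direction PCIe link bandwidth from transfer rate, lane count and the 128b/130b line encoding:

Code:
# Approximate one-direction PCIe bandwidth; real links lose a bit more
# to packet/protocol overhead on top of the 128b/130b line encoding.
GT_PER_SECOND = {3: 8, 4: 16, 5: 32}

def pcie_gb_per_s(gen: int, lanes: int) -> float:
    """Raw usable GB/s for one direction of a PCIe link."""
    return GT_PER_SECOND[gen] * lanes * (128 / 130) / 8

print(f"Gen4 x8:  ~{pcie_gb_per_s(4, 8):.1f} GB/s")   # ~15.8 GB/s
print(f"Gen4 x16: ~{pcie_gb_per_s(4, 16):.1f} GB/s")  # ~31.5 GB/s
print(f"Gen5 x16: ~{pcie_gb_per_s(5, 16):.1f} GB/s")  # ~63.0 GB/s

Gen4 x8 already gives ~15.8 GB/s each way, which is why halving a high-end GPU's lanes costs so little today.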

So I wouldn't be able to take a random bunch of m.2 drives and just stick them in there and expect them to work?
You certainly can, but their performance will vary depending on what you need those drives for. DRAM-less drives typically cannot use a cache, so some workloads would be affected. For a RAID 1 or 0 setup, it's good to have two similarly capable drives with the same capacity. If you set up a mirrored RAID on 1TB and 2TB drives, you will lose 1TB of that 2TB drive. So, there are things to consider. You should never just blindly buy any NVMe drives, but you can, of course.
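To make that capacity arithmetic concrete, here is a tiny Python sketch (assuming conventional striping/mirroring, which is capped by the smallest member drive):

Code:
def usable_tb(sizes_tb, mirrored: bool) -> float:
    """Usable capacity of a simple RAID set in TB.

    Conventional RAID 0 stripes an equal share across every member;
    RAID 1 keeps one mirrored copy sized by the smallest member.
    """
    smallest = min(sizes_tb)
    return smallest if mirrored else smallest * len(sizes_tb)

# Mirroring a 1TB and a 2TB drive: 1TB usable, 1TB of the larger
# drive is wasted, as described above.
print(usable_tb([1, 2], mirrored=True))   # 1
print(usable_tb([1, 2], mirrored=False))  # 2 (striping is capped too)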
 
Joined
Feb 18, 2023
Messages
245 (0.36/day)
If I remember correctly, if you use software RAID (like Windows Storage Spaces) you can move the array between different hardware, but yes, if you have an extreme hardware failure your storage can still be broken.


Why do you need hardware RAID for SSDs, especially in a time when we have CPUs with plenty of cores to spare?

No, software RAID is limited to that specific computer; if, for example, Windows crashes hard enough, it could destroy your software RAID. With hardware RAID, you just move the card with all your SSDs to another computer and that's it.
 
Joined
Jul 30, 2019
Messages
3,338 (1.69/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
No, software RAID is limited to that specific computer; if, for example, Windows crashes hard enough, it could destroy your software RAID.
No, that doesn't seem right. Crashing and moving are separate topics (I'm talking about moving): you should be able to move Windows-created RAID disks to another Windows machine, and Windows should be able to mount the array. If your computer is crashing, any number of things can go wrong if data in the process of being updated wasn't fully committed to disk, RAID or not.
With hardware RAID, you just move the card with all your SSDs to another computer and that's it.
Yeah, that would work.
 
Joined
Feb 17, 2010
Messages
1,678 (0.31/day)
Location
Azalea City
System Name Main
Processor Ryzen 5950x
Motherboard B550 PG Velocita
Cooling Water
Memory Ballistix
Video Card(s) RX 6900XT
Storage T-FORCE CARDEA A440 PRO
Display(s) MAG401QR
Case QUBE 500
Audio Device(s) Logitech Z623
Power Supply LEADEX V 1KW
Mouse Cooler Master MM710
Keyboard Huntsman Elite
Software 11 Pro
Benchmark Scores https://hwbot.org/user/damric/
You will have GPUs with a Gen5 interface next year; however, you will not be "using" it to its full capability. This is because current high-end GPUs can barely saturate a Gen4 x8 link, as shown in TPU testing a few months ago.

One useful case for PCIe 5.0 will be several devices in x16, x8 and x4 slots no longer competing for bandwidth. You could have a Gen5 GPU in the x16 slot using an x8 connection, bifurcated with the second x8 slot where an AIC could be attached. But... which Gen5 AIC? NVMe? There are already 3-5 NVMe slots on a good motherboard, two of which are Gen5 on AM5.

So, the issue is that there is plenty of Gen5 connectivity from the Zen4 CPU (24 lanes), but few devices that can meaningfully use all the available bandwidth, due to the slow development of Gen5 peripherals.


You certainly can, but their performance will vary depending on what you need those drives for. DRAM-less drives typically cannot use a cache, so some workloads would be affected. For a RAID 1 or 0 setup, it's good to have two similarly capable drives with the same capacity. If you set up a mirrored RAID on 1TB and 2TB drives, you will lose 1TB of that 2TB drive. So, there are things to consider. You should never just blindly buy any NVMe drives, but you can, of course.
I would only want to consolidate existing drives into one slot. Say I had some mixed 1TB, 2TB, PCIe 3.0 and 4.0 M.2 drives; it would read them all fine, yes? No need for RAID.
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
So I wouldn't be able to take a random bunch of m.2 drives and just stick them in there and expect them to work?
If they are split in the BIOS, you will see each individual drive. It is best practice, though, to try to have drives with the same controller. In fact, with RAID you also want to make sure they are the same capacity. There is nothing to say you cannot do it, though.

No, that doesn't seem right. Crashing and moving are separate topics (I'm talking about moving): you should be able to move Windows-created RAID disks to another Windows machine, and Windows should be able to mount the array. If your computer is crashing, any number of things can go wrong if data in the process of being updated wasn't fully committed to disk, RAID or not.

Yeah, that would work.
The only issue is Windows 11 TPM. I don't know how, but it seems that is why NVMe is such a pain in the butt to format. If you are using an existing Windows install you can update your entire OS without needing to worry about software RAID, but if you take your drive out of a Windows PC and just put it in another, it might not automatically give you the foreign disk option.
 
Joined
Jul 30, 2019
Messages
3,338 (1.69/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
If they are split in the BIOS, you will see each individual drive. It is best practice, though, to try to have drives with the same controller. In fact, with RAID you also want to make sure they are the same capacity. There is nothing to say you cannot do it, though.


The only issue is Windows 11 TPM. I don't know how, but it seems that is why NVMe is such a pain in the butt to format. If you are using an existing Windows install you can update your entire OS without needing to worry about software RAID, but if you take your drive out of a Windows PC and just put it in another, it might not automatically give you the foreign disk option.
Ah, TPM, I didn't consider that. Also, if you're using BitLocker, that might be a complication, and/or if you swap your CPU while using your CPU's TPM instead of an external one.
I should clarify: in this RAID conversation, I'm not considering the SSD RAID array as the boot drive, but as a secondary drive. SSD RAID as a boot drive on typical consumer hardware doesn't make much sense to me unless it's mirroring.
 
Joined
Oct 7, 2021
Messages
7 (0.01/day)
Location
Welling, London, England, UK
System Name BlukBox
Processor Ryzen 5 7600 OC + UV (-25-35 curve)
Motherboard Asus B650E-I
Cooling Noctua NH-L9a-AM5 + NA-FD1+ NA-FDBE + Artic F9-92 blowing in through .es
Memory 2x32GB Corsair DDR5-5600 Vengeance @ 5904 1.17V (IF 2000Mhz 1.25V)
Video Card(s) AMD Radeon Processor Graphics (2 CU) @ 2700Mhz 1.25V
Storage 2TB Solidigm P41 Plus M.2 (2280) PCIe 4.0x4 NVMe SSD
Display(s) LG 27" 27UP850N-W 3840x2160 4K IPS 60Hz FreeSync HDR400 LED
Case InWin Chopin Max (titanium)
Audio Device(s) LG Monitor + X-Rocker 4.1 chair
Power Supply InWin 200W Gold (custom form factor for Chopin/B1 cases)
Mouse MSI CLUTCH GM40 Red
Keyboard Redragon Green Camouflage + TK Sealth
Software Win 11 + Debian 12 Bookworm
You could have a Gen5 GPU in the x16 slot using an x8 connection, bifurcated with the second x8 slot where an AIC could be attached. But... which Gen5 AIC? NVMe? There are already 3-5 NVMe slots on a good motherboard, ...
In my case I have the ASUS mini-ITX B650E-I, with just one Gen5 and one Gen4 NVMe slot, plus two SATA3 ports. The case is a Chopin Max, which has no external bracket or room for more than a one-slot card in the Gen5 x16. So NVMe is a feasible use for it for me; not pressing, but there are not many other ways to use the slot unless I want to case-mod an external GPU or replace the PSU beside it with a GPU (it's been done).

Of course, even SATA SSDs can be very viable for latency-focused workloads; they're just not the best. And at some point Gen5 won't be the best either, which will probably be the time I actually want to try doing this, if at all.
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Ah TPM I didn't consider that. Also if your using Bitlocker that might also be a complication and or if you swap your CPU and using your CPU's TPM instead of an external one.
I should clarify in my mind (among this raid conversation) I'm not considering the SSD raid array as participating as the boot drive but as a secondary drive. SSD raid as boot drive on typical consumer hardware doesn't make much sense to me unless it's mirroring.
I would not recommend using a RAID array as boot, but even as secondary storage you can run into issues with Windows 11. F me, Windows 11 is so quirky that I had an M.2 drive I was using with an adapter, and since I used the adapter to format it, I have to use one specific USB-C port on my motherboard to have it register.

I for one love RAID. Once you have more than 50 Epic games and have to re-download or just move them, you will appreciate maxing out Windows' 2.5 GB/s write rate with an array that costs a quarter of what a single drive of the same capacity would. I have to read some more on the controller in my WD AN1500 before I buy another 4 TB NV2.
 
Joined
Oct 7, 2021
Messages
7 (0.01/day)
Location
Welling, London, England, UK
System Name BlukBox
Processor Ryzen 5 7600 OC + UV (-25-35 curve)
Motherboard Asus B650E-I
Cooling Noctua NH-L9a-AM5 + NA-FD1+ NA-FDBE + Artic F9-92 blowing in through .es
Memory 2x32GB Corsair DDR5-5600 Vengeance @ 5904 1.17V (IF 2000Mhz 1.25V)
Video Card(s) AMD Radeon Processor Graphics (2 CU) @ 2700Mhz 1.25V
Storage 2TB Solidigm P41 Plus M.2 (2280) PCIe 4.0x4 NVMe SSD
Display(s) LG 27" 27UP850N-W 3840x2160 4K IPS 60Hz FreeSync HDR400 LED
Case InWin Chopin Max (titanium)
Audio Device(s) LG Monitor + X-Rocker 4.1 chair
Power Supply InWin 200W Gold (custom form factor for Chopin/B1 cases)
Mouse MSI CLUTCH GM40 Red
Keyboard Redragon Green Camouflage + TK Sealth
Software Win 11 + Debian 12 Bookworm
I like mdadm's 1.0 metadata format for simple RAID1, since a member is effectively the same as a non-RAID disk for plain reading: the metadata is all at the end.

For more complex situations I'd tend towards btrfs.
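For illustration, a sketch of how such an array could be created (Python shelling out to mdadm; the device names are placeholders, and this needs root plus mdadm installed):

Code:
# Build a two-drive mdadm RAID1 with 1.0 metadata. With this format the
# superblock sits at the END of each member, so a lone member still
# reads like a plain, non-RAID disk.
import subprocess

def create_raid1_end_metadata(member_a: str, member_b: str,
                              md_dev: str = "/dev/md0") -> None:
    subprocess.run(
        ["mdadm", "--create", md_dev,
         "--level=1",          # RAID1 mirror
         "--metadata=1.0",     # superblock at end of each device
         "--raid-devices=2",
         member_a, member_b],
        check=True,
    )

# create_raid1_end_metadata("/dev/nvme0n1", "/dev/nvme1n1")  # placeholders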
 
Joined
Apr 18, 2019
Messages
2,397 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Seeing several comments about the dangers of CPU NVMe RAID (Intel VROC / AMD-RAID):
Windows Storage Spaces has improved greatly over the years; I used to be 'against' its use, but in my own incidental testing it's been reliable and tolerant of system changes/swapping.
Storage Spaces arrays absolutely will and do 'transfer' between (compatible Windows) systems. (Though I don't recall if a RAID5 from a Windows Server install will 'work' on an install of 10/11.)

Also, if you research some of the last reviews on RAIDing Optane drives, Storage Spaces 'can' outperform VROC.
In my own experimentation, AMD-RAID and Optane 'don't get along' well (severe performance regression beyond a 2-drive striped array on AMD-RAID).
A striped Storage Space was reliably 'faster' than 4x 118GB P1600Xs in AMD-RAID RAID0, no matter what cache tag size was used.
 
Joined
Mar 25, 2023
Messages
72 (0.11/day)
Processor Celeron G5905, I3 10100, I5 10400F, I7 10700F
Motherboard Asrock H410, B460, B560, Gigabyte B560
Cooling Zalman CNPS80G
Memory each System 16 or 32GB: Kingston 2666 CL12, 2933 CL14
Video Card(s) Arc A380/A770, IGP
Storage SSD and some HDD
Display(s) Philips 24inch 1080p 165Hz IPS and 32 inch 1440p 165Hz VA
Case Antec, Corsair, Nanoxia
Audio Device(s) Different AVR, Speakers: Klipsch, Polk .....
Power Supply FSP, Deepcool
Mouse Logitech G
Keyboard Logitech G
Software Win 10, Bodhi Linux, Deepin
PCIe bifurcation, like every other vendor selling these. No controller, just a garbage PCB for $10.

With a controller it would be interesting, at a good price.
 
Joined
Aug 25, 2021
Messages
1,182 (0.97/day)
I would only want to consolidate existing drives into one slot. Say I had some mixed 1TB, 2TB, PCIe 3.0 and 4.0 M.2 drives; it would read them all fine, yes? No need for RAID.
Sure, you can run any of those individually.
 
Joined
Apr 18, 2019
Messages
2,397 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Ok, seriously, why is everyone complaining about this?

A bifurcated card is limited in its use to platforms that support bifurcation (and expose the feature to the end user).
Prosumer/enthusiast/gamer mobo manufacturers have highly inconsistent support for a feature that's (underneath it all) actually extraordinarily common. (Also, these 'simple' bifurcated cards seem to be sold at some seriously high prices for how simply constructed they are.)
To an enthusiast, gamer, tinkerer, etc., the mere mention of 'bifurcation' can stir up sourness.

I'm aware of how common and useful bifurcated devices are in server/industrial use:
I have a couple 'sets' of MaxCloudON bifurcated risers for x8, x16, etc. Those MaxCloudON risers were made for and by a remote-rendering and GPGPU server company overseas.
I also own a Gen4 Asus Hyper M.2 quad-NVMe card that I've filled with both 16GB Optane M10s and 118GB Optane P1600Xs in testing.

To an enthusiast-tinkerer like me, switch-based cards are much more 'broad' in how they can be used. Switched PCIe expander-adapter cards can even act as a 'southbridge' to attach new features to older/unsupported equipment.
Ex. I have an Amfeltec Gen2 x16 -> 4x x4-lane M.2 M-key card; it's probably gonna get slotted into a PCIe Gen1.0 dual-S940 K8N-DL or a re-purposed H81-BTC.


All that said, I'd bet bifurcation is usually preferred in industry, as it means less power and heat, with less latency. In 'big data' applications, that teensy bit of latency could be a (stacking) issue.

I could see a professional having a generalized preference for bifurcation over switching.
In 'serious' use cases, bifurcation would be more efficient, no?
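To put a rough number on that (my own back-of-envelope, nothing from the thread): bifurcation gives each drive dedicated lanes, while a switch lets a card oversubscribe its uplink:

Code:
# Oversubscription of a multi-drive M.2 card's uplink. Bifurcated cards
# are 1:1 by construction; switched cards can exceed 1.0 under load.
def oversubscription(drive_lanes: int, drives: int, uplink_lanes: int) -> float:
    return (drive_lanes * drives) / uplink_lanes

# Four x4 drives with a full x16 of dedicated lanes: no contention.
print(oversubscription(4, 4, 16))  # 1.0
# The same four drives behind a switch in an x8 slot: 2x oversubscribed.
print(oversubscription(4, 4, 8))   # 2.0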
 
Joined
Jul 5, 2013
Messages
28,257 (6.75/day)
A bifurcated card is limited in its use to platforms that support bifurcation (and expose the feature to the end user).
Prosumer/enthusiast/gamer mobo manufacturers have highly inconsistent support for a feature that's (underneath it all) actually extraordinarily common. (Also, these 'simple' bifurcated cards seem to be sold at some seriously high prices for how simply constructed they are.)
To an enthusiast, gamer, tinkerer, etc., the mere mention of 'bifurcation' can stir up sourness.

I'm aware of how common and useful bifurcated devices are in server/industrial use:
I have a couple 'sets' of MaxCloudON bifurcated risers for x8, x16, etc. Those MaxCloudON risers were made for and by a remote-rendering and GPGPU server company overseas.
I also own a Gen4 Asus Hyper M.2 quad-NVMe card that I've filled with both 16GB Optane M10s and 118GB Optane P1600Xs in testing.

To an enthusiast-tinkerer like me, switch-based cards are much more 'broad' in how they can be used. Switched PCIe expander-adapter cards can even act as a 'southbridge' to attach new features to older/unsupported equipment.
Ex. I have an Amfeltec Gen2 x16 -> 4x x4-lane M.2 M-key card; it's probably gonna get slotted into a PCIe Gen1.0 dual-S940 K8N-DL or a re-purposed H81-BTC.
Ah! This makes sense. The frustration is easier to understand now. I was not aware of these particular problems. I was under the impression that 'most' chipsets supported that function natively across all platforms. Thank you for explaining, much appreciated!

All that said, I'd bet bifurcation is usually preferred in industry, as it means less power and heat, with less latency. In 'big data' applications, that teensy bit of latency could be a (stacking) issue.

I could see a professional having a generalized preference for bifurcation over switching.
In 'serious' use cases, bifurcation would be more efficient, no?
This is correct. It is more efficient as it is a "direct" connection. In rack deployments, bifurcation is preferable, even if it's less flexible.
 
Joined
Jul 30, 2019
Messages
3,338 (1.69/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
Joined
Apr 18, 2019
Messages
2,397 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
It would be nice if these cards could be smaller. Maybe double-sided (2 on each side).
That precise configuration is extraordinarily common with the 'cheap Chinese import' expanders (both switched and bifurcated varieties).
I believe QNAP and OWC(?) make a few like that as well.
Heck, even my old Gen2 Amfeltec (switched) card is built like that.

In fact, now that I've run through it, I might venture to say that (double-sided) configuration is 'most common' outside of the gamer-enthusiast market(s).

TBQH, I think Asus, MSI, Gigabyte, etc. have cards laid out 4-to-a-single-side partly because they look more like a fancy slim GPU, and partly due to the inconsistent airflow patterns amongst DIYers' and SIs' custom builds.
 