
HighPoint Releases Bootable Quad M.2 NVMe RAID Card: the SSD7102

Raevenlord

News Editor
HighPoint has announced the release of a bootable quad M.2 NVMe RAID card, the SSD7102. The card features a PCIe x16 connector for plugging into a user's system, and is essentially an expansion card that can house up to four M.2 NVMe SSDs; with each drive on a PCIe x4 link, four drives add up to 16 PCIe lanes in total, so the math does check out. HighPoint gets around CPU-side limitations on presenting multiple devices through a single slot by using a PLX PEX8747 PCIe bridge chip, and custom firmware allows the four drives in RAID to serve as a bootable device.
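The lane arithmetic and the bridge's role, in a minimal Python sketch (purely illustrative; the figures come from the article above):

# Minimal sketch of the SSD7102's PCIe lane arithmetic (illustrative only).
LANES_PER_DRIVE = 4   # each M.2 NVMe SSD runs on a PCIe x4 link
DRIVE_SLOTS = 4       # the card houses up to four drives

print(f"Downstream lanes: x{LANES_PER_DRIVE * DRIVE_SLOTS}")  # x16, matching the host slot

# Without a bridge, the host slot would have to be bifurcated x4/x4/x4/x4,
# which depends on CPU/motherboard support. The PLX PEX8747 does that
# fan-out on the card itself, so the host just sees one x16 device.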

Three configurations are supported: RAID 0 (all drives striped in tandem for the best speed), RAID 1 (the drives set in pairs for data mirroring), or single-drive boot in a JBOD configuration. In the JBOD configuration, any of the drives can be a boot volume, which means multiple OS installs are possible. The SSD7102 features a shroud and a single 50 mm fan to help cool your selection of drives (it accepts drives from any vendor, though for RAID configurations it is of course best to pair identical drives). The card is supported under Windows 10, Server 2012 R2 (or later), Linux kernel 3.3 (or later), and macOS 10.13 (or later). MTBF is rated at just under 1M hours, typical power consumption at 8 W, and the card will be available at an MSRP of $399.
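As a rough illustration of how the two RAID modes place data (a minimal Python sketch, not HighPoint's firmware logic):

# Minimal sketch of RAID 0 striping vs. RAID 1 mirroring (illustrative only,
# not HighPoint's firmware). Data is split into fixed-size chunks first.

def raid0_layout(chunks, drives=4):
    """RAID 0: chunk i lands on drive i % drives, so I/O fans out over all drives."""
    layout = {d: [] for d in range(drives)}
    for i, chunk in enumerate(chunks):
        layout[i % drives].append(chunk)
    return layout  # capacity = sum of all drives, but no redundancy

def raid1_layout(chunks):
    """RAID 1: a pair of drives holds identical copies of every chunk."""
    return {0: list(chunks), 1: list(chunks)}  # usable capacity = one drive

data = [f"c{i}" for i in range(8)]
print(raid0_layout(data))  # each of the 4 drives gets 2 of the 8 chunks
print(raid1_layout(data))  # both drives in the pair get all 8 chunks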



View at TechPowerUp Main Site
 
Mom, please.......
 
nice card, not so nice price....

time to call up the ole banker person with the gold cards :D
 
this costs as much as 4x sx8200 480gb drives
 
The price is the killer here. The PLX chip plus everything else you see here costs maybe $100. Few people will sacrifice their x16 PCIe slot just to boot the computer, because there are other options (Optane...) that are cheaper and leave the slot free for another GPU. This is a toy for special tests, for the rich.
 
Dear Santa, I have been a good boy this year (not really, but don't tell him)...

Nom nom, the speed you can get with this... now my old Samsung 950 Pro suddenly feels so slow :rolleyes:
 
Give me that 980X for free and I won't tell him, he lives next door to me you know :laugh::laugh:

Nooooo, not my i7 980X, anything but that, and the mobo of course. Wait, this is blackmail, expect a letter from my lawyer about a class action lawsuit :p
 
For that price they could've gone with better cooling than a small, whiny fan...
 
If you can afford 4 m.2 drives, you can afford this.

I didn't really mean to imply that I would go out and buy (or necessarily need) this card AND 4 drives too..... basically because I already have 2 Sammy 950 Pros now and would use them to try it out, but if the card actually works like it says it does, I would be a little more inclined to buy 2 more and go for the full quad set-up....

And I agree about the fan, it does seem kinda wimpy, but seeing as it's practically on top of the PLX chip, methinks it was intended to keep that somewhat cool first before blowing on over to the drive section...
 
For that price they could've gone with better cooling than a small, whiny fan...

It would be hard to do something else and stick to a 1-slot card. Plus, if they used a lower RPM fan, which is really all you need here, then it shouldn't be that loud. The point is just to move a little air over the drives and the PLX chip.
 
Could someone tell me where and how 4 drives in RAID 0 would do much, or improve anything, in a home PC configuration?
 
Could someone tell me where and how 4 drives in RAID 0 would do much, or improve anything, in a home PC configuration?
For a home user, the benefit would be minimal: screaming fast boot times and screaming fast OS operations, but not much else. To me, RAID 5 or 10 (1+0) would be more useful to the home user in this usage scenario.
 
Could someone tell me where and how 4 drives in RAID 0 would do much, or improve anything, in a home PC configuration?

Sorry if this sounds blunt, but if you need to ask this question, then this card is not meant for you.

It IS fine for people like me, who use their "home" computers for more than just web browsing, email, gaming, and pron watching.... like maybe some computational analysis, sound/video production, CAD/CAM work, scientific research.... just to name a few :)

Things that can and will benefit from faster boot and load times, read/write times, and reduced access bottlenecks....you know, like, all that stuff that makes using a computer efficient and productive AND pleasant too....
 
For a home user, the benefit would be minimal: screaming fast boot times and screaming fast OS operations, but not much else. To me, RAID 5 or 10 (1+0) would be more useful to the home user in this usage scenario.

The thing is, you might actually see worse boot times and worse OS operation performance with this device. RAID controllers often add latency (especially those doing firmware RAID that rely on the main CPU to do all the work), so IOPS might be worse than with a single drive, and random operations will perform worse in that case.

These will shine for large sequential file transfers, but probably won't be any better than a single drive for most other work.
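If you want to sanity-check that split on your own hardware, here is a rough, hedged Python sketch (the file name and sizes are placeholders; use a file much larger than RAM, or OS page caching will inflate both numbers):

# Rough sketch: compare sequential vs. random read throughput on a file.
# POSIX-only (os.pread). "testfile.bin" is a hypothetical placeholder.
import os, time, random

PATH = "testfile.bin"     # hypothetical test file, e.g. >= 1 GiB
SEQ_BLOCK = 1024 * 1024   # 1 MiB blocks for the sequential pass
RND_BLOCK = 4096          # 4 KiB blocks, typical of OS/boot random I/O

def bench(path, block, sequential, reads=256):
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for i in range(reads):
            offset = (i * block) % (size - block) if sequential \
                     else random.randrange(0, size - block)
            os.pread(fd, block, offset)
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return reads * block / 1e6 / elapsed  # MB/s

# On a RAID 0 set, the first number should scale with drive count far more
# than the second one does:
print("sequential 1M:", round(bench(PATH, SEQ_BLOCK, True)), "MB/s")
print("random     4K:", round(bench(PATH, RND_BLOCK, False, reads=4096)), "MB/s")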
 
The thing is, you might actually see worse boot times and worse OS operation performance with this device. RAID controllers often add latency (especially those doing firmware RAID that rely on the main CPU to do all the work), so IOPS might be worse than with a single drive, and random operations will perform worse in that case.

These will shine for large sequential file transfers, but probably won't be any better than a single drive for most other work.
Good points!
 
Well, the last hardware RAID cards I used (back in spinner days) improved performance partly because they offloaded the I/O work from the CPU onto the card...

Does anyone know if this is still the case for this type of card?

If not, then whatever gains are made from the RAID setup might be marginal, at best....
 
Well, the last hardware RAID cards I used (back in spinner days) improved performance partly because they offloaded the I/O work from the CPU onto the card...
Does anyone know if this is still the case for this type of card?
If not, then whatever gains are made from the RAID setup might be marginal, at best....
That's a good question. The PLX PEX8747 processor onboard might actually have duties other than being a PCIe bridge controller for the system CPU. If that's the case, then yes, there may be some tangible gains. If not...
The problem is that there isn't a ton of useful documentation on this chip; the best I can find so far is here:
https://www.broadcom.com/products/pcie-switches-bridges/pcie-switches/pex8747#overview
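For what it's worth, the OS does at least expose the switch as a plain set of PCI bridge ports. A hedged Python sketch (Linux sysfs; PLX's vendor ID 0x10b5 is well known, but checking for 0x8747 as the device ID is an assumption based on PLX's usual part-numbering):

# Hedged sketch: on Linux, a PLX switch shows up as ordinary PCI bridge
# devices under sysfs. This scans for PLX's vendor ID (0x10b5); matching
# the PEX8747 by device ID 0x8747 is an assumption, not confirmed here.
import glob

for dev in glob.glob("/sys/bus/pci/devices/*"):
    with open(dev + "/vendor") as f:
        vendor = f.read().strip()
    if vendor == "0x10b5":                 # PLX Technology / Broadcom
        with open(dev + "/device") as f:
            device = f.read().strip()
        print(dev.split("/")[-1], device)  # one line per switch port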
 
Well, the last hardware RAID cards I used (back in spinner days) improved performance partly because they offloaded the I/O work from the CPU onto the card...

Does anyone know if this is still the case for this type of card?

If not, then whatever gains are made from the RAID setup might be marginal, at best....

Judging from the lack of any substantial heatsink on the chip, I'm guessing it isn't doing any heavy processing, and that is being left up to the computer's CPU. However, RAID 0 and 1 don't really need nearly the amount of processing power that RAID 5 does, so maybe the chip on the card is doing the processing (and maybe this is why RAID 5 isn't an option).
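For a sense of why RAID 5 is the heavier mode, here is a minimal Python sketch of the XOR parity it must compute on every write (illustrative only, not anything the card is confirmed to run):

# Minimal sketch of RAID 5 parity (illustrative only). For every stripe,
# RAID 5 must XOR the data chunks into a parity chunk; RAID 0 and RAID 1
# just place or copy chunks, with no per-byte computation.
def raid5_parity(chunks):
    """XOR all data chunks together byte-by-byte to get the parity chunk."""
    parity = bytearray(len(chunks[0]))
    for chunk in chunks:
        for i, b in enumerate(chunk):
            parity[i] ^= b
    return bytes(parity)

stripe = [b"\x01\x02", b"\x04\x08", b"\x10\x20"]   # 3 data chunks
p = raid5_parity(stripe)
print(p.hex())  # "152a": XOR of the three chunks

# Losing any one chunk is recoverable: XOR the survivors with the parity.
recovered = raid5_parity([stripe[0], stripe[2], p])
assert recovered == stripe[1]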
 
HighPoint's previous model, the SSD7101A, had RAID 0, 1, 5, and 10, so I would imagine this new one would as well.
 