
Pre-built NAS, or build a server/NAS?

blueturtle

New Member
Joined
May 27, 2024
Messages
17 (0.13/day)
I've read a few NAS reviews on TPU, and the pre-built NASes just don't seem worth the $600+, plus the HDDs I'd need, looking at 50+ TB (RAID 6).

Any pre-built NAS worth $1K? Building the server myself would cost more, yes, but if something breaks it's an easier fix versus losing $1K to a pre-fab. What's your feedback?
 
Joined
Feb 1, 2019
Messages
3,387 (1.63/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
They seem very overpriced for what you get, and since I already had a second rig I ended up repurposing it. Not sure the case for that is as strong, though, if you're having to build something from scratch for it.

Other benefits of a DIY build are flexibility, easier fixes when things break (as you said), and upgradeability without replacing the entire thing.
 

blueturtle

New Member
Can anyone recommend a great SATA card (8 ports), a PCIe NIC and a switch? My networking knowledge is pretty noobish at the moment. I'm guessing somewhere under $1K for the three?
 
Joined
Mar 18, 2023
Messages
795 (1.40/day)
System Name Never trust a socket with less than 2000 pins
Building is the only way to get ECC RAM. That factors big for me.

I also don't like relying on single software vendors and their patch mechanisms.
 
Joined
Oct 30, 2022
Messages
230 (0.33/day)
Location
Australia
System Name Blytzen
Processor Ryzen 7 7800X3D
Motherboard ASRock B650E Taichi Lite
Cooling Deepcool LS520 (240mm)
Memory G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30
Video Card(s) Powercolor 6800XT Red Dragon (16 gig)
Storage 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS
Display(s) MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k
Case Coolermaster HAF 500
Audio Device(s) Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites)
Power Supply Corsair HX850
Mouse Logitech G502X lightspeed
Keyboard Logitech G915 TKL tactile
Benchmark Scores Squats and calf raises
I've done both (repurposing a PC during an upgrade) and have increased my RAID storage from 1.5 TB across two RAID 5 arrays all the way up to 80 TB in my current NAS.

Prebuilts have the advantages of low power consumption and a dedicated, curated OS and apps. Good for business cases, but also for a home user with a lot of storage needs.

The desktop route has mixed options. Do you want a dedicated RAID card? That's a point of failure with a card you could potentially not replace.
Do you need more than 8 bays? Prebuilts get real exxy beyond 8 bays generally, but a server could potentially house 12+ drives without too much difficulty.

I have a Synology DS1821+ (8 bay) and its DX517 (5-bay expansion), and I've had one device failure (a bug in the CPU design of the previous DS1815+ unit I had). All I did was pull out my drives and slap them into the new unit: a 15-minute initial boot-up migration and off I went again.

Building a server gives you options for more CPU power (for transcoding and the like), upgradability, and the ability to run substantially more apps, server software, Docker containers, etc.

I wouldn't go back to repurposing an existing computer anymore. I just need a mild setup (with adequate ports for 80 TB and room to grow) and I only run a couple of minor Docker containers, so you'll need to judge your use case beyond simple storage requirements.
 
Joined
Jan 2, 2024
Messages
484 (1.74/day)
Location
Seattle
System Name DevKit
Processor AMD Ryzen 5 3600 ↗4.0GHz
Motherboard Asus TUF Gaming X570-Plus WiFi
Cooling Koolance CPU-300-H06, Koolance GPU-180-L06, SC800 Pump
Memory 4x16GB Ballistix 3200MT/s ↗3800
Video Card(s) PowerColor RX 580 Red Devil 8GB ↗1380MHz ↘1105mV, PowerColor RX 7900 XT Hellhound 20GB
Storage 240GB Corsair MP510, 120GB KingDian S280
Display(s) Nixeus VUE-24 (1080p144)
Case Koolance PC2-601BLW + Koolance EHX1020CUV Radiator Kit
Audio Device(s) Oculus CV-1
Power Supply Antec Earthwatts EA-750 Semi-Modular
Mouse Easterntimes Tech X-08, Zelotes C-12
Keyboard Logitech 106-key, Romoral 15-Key Macro, Royal Kludge RK84
VR HMD Oculus CV-1
Software Windows 10 Pro Workstation, VMware Workstation 16 Pro, MS SQL Server 2016, Fan Control v120, Blender
Benchmark Scores Cinebench R15: 1590cb Cinebench R20: 3530cb (7.83x451cb) CPU-Z 17.01.64: 481.2/3896.8 VRMark: 8009
If you're building the machine yourself, it's way more cost effective to employ a retired system.
Not to the extreme that I do with a 1c/1t Athlon, 2GB and 2x SATA (you're not ready), but get something like any Phenom II or FX and any board that comes with lots of SATA 6Gb/s ports.
Otherwise you're looking at getting an expansion card just for the drives, and if you work with old stuff, that's gonna be a headache.
The cards are great if you're doing RAID, but I won't do that to HDDs, so good luck with that.

CPU should be an easy pick. If you're doing basic apps:
Samba disk shares, iSCSI disk shares, HTTP web/ftp/sql.net, torrent seeding, YouTube/Twitch scheduled stream/video archiving, PNG library archiving, Cloud disk shares, Lightweight VMs
You could easily get by with anything i3-2100 and up. Even something as miserable as my Athlon 64 works great for this.

If you're doing LARGE VMs, video transcoding, virtual container apps, document server, forensic web, Amazon AWS, secure developer environments
The floor to entry is a CPU with nested page tables, which means roughly AM3-socket era and newer. Yes, I can confirm it works and it's fine.
Clock speed and memory seem to be a big hindrance for containers, so you'll need good memory, something upper-range like a 65W+ quad core, and some patience.

Basically, try to pick something super FAST like an i7-4790K, and if it can be helped with an accelerator like a cheap Tesla, do that. A lot of these apps take advantage of Nvidia technologies.
I don't do ECC memory, but my stuff does auto-checksum, so it's fine. If you're mainly doing something like a data-scraper vault with shares, check it out. Anything precious, look into ECC memory.
 
Joined
Jul 30, 2019
Messages
3,101 (1.64/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR4-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
I've read a few NAS reviews on TPU, and the pre-built NASes just don't seem worth the $600+, plus the HDDs I'd need, looking at 50+ TB (RAID 6).

Any pre-built NAS worth $1K? Building the server myself would cost more, yes, but if something breaks it's an easier fix versus losing $1K to a pre-fab. What's your feedback?
The thing with a Synology NAS is that you're really paying for the NAS OS software (updates and hardware validation), for making everything plug-and-play, backup- and update-friendly, and administratively easy, and also for the warranty. I have the prior-gen DS920+, a simple 4-bay NAS I snagged on fire-sale from Newegg for about $400 (they are still quite expensive, so I always wait for the sale).

Not having a bay for a hot spare with SHR-2 (RAID 6) makes sense for me with only a 4-bay unit populated to 8TB: if I lose up to 2 disks, I still have enough time to refresh a backup and replace them. I have a hard time imagining doing that with 50+ TB of capacity, though I suppose it's doable; I imagine the scrubbing time will take forever at that size. Assuming you have a good backup plan, you might consider a different RAID strategy for that much capacity.

I have yet to build my own using ZFS, but it's on my bucket list. Building your own means you have to do all the work and put in the time, including picking the right hardware. If you're someone like me who sometimes balks at Synology pricing, the flip side is that I don't yet have the time to spend building my own NAS, so a diskless $400 to $600 doesn't seem that bad all things considered. Although when you see a +1-bay price difference of $100 for the next model up, it feels a bit like a rip-off: just a little more plastic and metal for a 5-bay vs. 4-bay system.
 
Joined
Jan 4, 2022
Messages
101 (0.10/day)
Over the years I've built four TrueNAS Core ZFS servers, two of them in HP ML350p Gen8 server chassis with ECC RAM, the others with standard PC components and non-ECC RAM. Error Corrected RAM is not 100% essential for TrueNAS, but it does reduce the chance of data corruption going unnoticed. You can fit ECC RAM in some normal Intel/AMD motherboards, but you're more likely to get ECC support with a dedicated server motherboard.

You don't need a fancy motherboard with multiple SATA ports for a NAS. Instead, consider an LSI SAS (Serial Attached SCSI) HBA (Host Bus Adapter) controller such as the LSI 9211-8i. These cards can be picked up second hand on eBay for under 25 dollars/euros/pounds. I avoid buying brand new so-called "LSI" cards from China because they're usually fake and not the genuine article. Get your cards from local suppliers specialising in breaking up old servers.

Suitable LSI SAS HBA cards typically come with one or two SFF-8087 ports, each of which can control four hard disks. To use an LSI 9211-8i with eight hard disks, you'll need a couple of SFF-8087 to SATA breakout cables. SAS controllers can be connected to ordinary SATA drives, or you can use more professional 6Gb/s SAS2 drives instead. This leaves the SATA ports on your motherboard free for other drives that are not part of the array, e.g. an SSD for the TrueNAS operating system.

The LSI 9211-8i comes with a PCI Express x8 interface, but will work quite happily when plugged into a PCIe x4 or x16 slot. If your PCIe slot has only 4 lanes, maximum bandwidth will be halved, but this isn't usually noticeable with "older" hard disks. If you create an array using modern SATA hard drives of at least 8TB capacity and 250MB/s transfer rate, I'd recommend sticking with PCIe x8. For 12Gb/s SAS3 drives I'd use a newer, more sophisticated HBA card than the 9211, e.g. the 9300 series.
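A rough back-of-the-envelope check shows why the x4 caveat matters for modern drives. The figures below are assumptions for illustration: roughly 500 MB/s of raw bandwidth per PCIe 2.0 lane (the 9211-8i's generation) and the 250 MB/s sequential rate mentioned above; real-world usable bandwidth is lower.

```python
# Sanity check: can the HBA's PCIe link keep up with the whole array?
# Assumes ~500 MB/s raw per PCIe 2.0 lane and sequential HDD throughput
# as stated above; protocol overhead shaves some off in practice.

def link_bandwidth_mb_s(lanes, per_lane_mb_s=500):
    """Approximate raw PCIe 2.0 link bandwidth in MB/s."""
    return lanes * per_lane_mb_s

def array_throughput_mb_s(drives, per_drive_mb_s):
    """Peak sequential demand if every drive streams at once."""
    return drives * per_drive_mb_s

demand = array_throughput_mb_s(8, 250)  # eight modern 250 MB/s HDDs
print(demand)                           # 2000 MB/s
print(link_bandwidth_mb_s(4))           # x4 link: 2000 MB/s -- already saturated
print(link_bandwidth_mb_s(8))           # x8 link: 4000 MB/s -- comfortable headroom
```

With eight older ~150 MB/s drives the demand drops to 1200 MB/s, which is why an x4 slot is usually fine for them.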

If you're planning to use TrueNAS Core or UNRAID as your operating system, it's vital you buy an HBA card that's been flashed with an IT (Initiator Target) BIOS, that passes commands through to the hard disks unaltered. TrueNAS does not work properly with a standard IR (RAID) BIOS, which "hides" the hard disks from the ZFS operating system. Although it's possible to re-flash an HBA card from IR mode over to IT mode, it's not easy for novice users. If you're going to use TrueNAS or UNRAID, buy an IT-flashed card for an easy life. Second hand IR HBA cards are often much cheaper than IT HBA cards, despite the only difference being the BIOS.

I built a basic ZFS system for a pittance using a second hand Fractal Design R4 case with eight 3.5" drive trays (£15 on eBay), an ancient dual core Athlon APU and mobo (£5), eight 4TB Enterprise level ex-server drives (£10 each), 16GB of non-ECC RAM, a 64GB SATA SSD for TrueNAS and a good quality PSU with 10 SATA power connectors.

With TrueNAS set up for RAID Z2 (equivalent to RAID 6), you lose two drives' worth of capacity from the array, so 8 x 4TB drives equates to 6 x 4TB usable capacity. I've only had one drive go down and that was a 4TB Toshiba N300 purchased brand new in 2018 with less than 6 days total use. My 4TB ex-server drives have over 1,400 days use, but I keep multiple backups of data, so it doesn't matter if one system fails catastrophically.

For at least 50TB of usable capacity and assuming RAID Z2 or RAID 6, you could run eight 10TB drives, which would give you 60TB usable space. Eight 8TB drives would give 48TB usable capacity.
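The capacity arithmetic above can be sketched as a one-liner: RAID Z2 (like RAID 6) reserves two drives' worth of space for parity, so usable capacity is (N − 2) × drive size, ignoring ZFS metadata overhead and TB-vs-TiB marketing differences.

```python
def raidz2_usable_tb(drive_count, drive_tb):
    """RAID Z2 / RAID 6 keeps two drives' worth of parity, so usable
    space is (N - 2) * drive size (ignoring filesystem overhead)."""
    if drive_count < 4:
        raise ValueError("RAID Z2 needs at least 4 drives")
    return (drive_count - 2) * drive_tb

print(raidz2_usable_tb(8, 10))  # 60 TB usable from eight 10 TB drives
print(raidz2_usable_tb(8, 8))   # 48 TB usable from eight 8 TB drives
print(raidz2_usable_tb(8, 4))   # 24 TB usable from the 4 TB build above
```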

One advantage of using TrueNAS Core and the ZFS file system is that if your computer hardware fails, you can transfer the entire hard disk array to a totally different computer, switch it on and, with a few tweaks (or a previously saved configuration file), continue as if nothing happened. Try this with a proprietary RAID system and you'll probably have to buy identical hardware to restore normal operation.

When I upgraded the drives in an HP ML350P Gen 8 TrueNAS server, I transferred the eight 2TB SAS drives over to a desktop PC and the system booted up normally after restoring the configuration setup. It didn't matter that I switched from an Intel Xeon with ECC RAM to an AMD APU with non-ECC RAM. TrueNAS is not fussed about hardware, provided you have at least 16GB of RAM and a suitable drive controller. For a 50TB system, I'd suggest fitting at least 32GB RAM with TrueNAS.

If you want to save money, consider a DIY TrueNAS or UNRAID system. If you want an easy life, buy an 8-bay Synology or QNAP chassis. It's relatively easy to build a DIY TrueNAS server, but you could spend ages learning to configure the system. It takes a fair degree of patience and online help guides before you can get TrueNAS going. That's where the Synology/QNAP UI wins hands down for ease of use. N.B. I don't have a Synology or QNAP system.

A good source of advice can be found on the TrueNAS and Serve The Home forums.

Caution:
If you end up building a TrueNAS server, do NOT use SMR (Shingled Magnetic Recording) drives. Instead use CMR (also known as PMR) drives. Back in 2020, the FreeNAS community discovered that Western Digital Red drives advertised for use in NAS systems were using "incompatible" SMR technology. There was no mention of the fact that Red NAS drives were using SMR in the data sheets on the WD website. TrueNAS/FreeNAS is particularly intolerant of SMR: when you replace a failed drive in an SMR array, it can take TrueNAS several days to resilver instead of a few hours. Check very carefully before buying drives for TrueNAS and make sure they're CMR/PMR, not SMR.
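To make the caution concrete, here is an illustrative lookup using two of the well-publicised model numbers from that 2020 episode (WD40EFAX was the SMR 4 TB Red, WD40EFRX the older CMR one). The list is deliberately tiny and will go stale; always confirm against the manufacturer's current data sheet.

```python
# Illustrative only: a couple of well-publicised WD Red 4 TB models from
# the 2020 SMR controversy. Model lists change over time; always verify
# the recording technology on the manufacturer's current data sheet.
KNOWN_RECORDING_TECH = {
    "WD40EFAX": "SMR",  # 4 TB WD Red, device-managed SMR -- avoid for ZFS
    "WD40EFRX": "CMR",  # 4 TB WD Red, conventional recording -- fine for ZFS
}

def recording_tech(model: str) -> str:
    """Look up a drive model; unknown models must be checked manually."""
    return KNOWN_RECORDING_TECH.get(model.upper(), "unknown -- check the data sheet")

print(recording_tech("WD40EFAX"))  # SMR
print(recording_tech("wd40efrx"))  # CMR
```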
 
Joined
Oct 30, 2022
Messages
230 (0.33/day)
I will chuck this out there: I think (based on 20c a kWh for power) I've saved about a grand since I employed Synology prebuilts.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,236 (2.37/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
I've read a few NAS reviews on TPU, and the pre-built NASes just don't seem worth the $600+, plus the HDDs I'd need, looking at 50+ TB (RAID 6).

Any pre-built NAS worth $1K? Building the server myself would cost more, yes, but if something breaks it's an easier fix versus losing $1K to a pre-fab. What's your feedback?
Well, for one, the prebuilt device comes with warranties and support, so if something goes wrong, or you don't know how to do something, you can get support. If the device dies within the warranty period, you get a replacement unit.

Obviously components have warranties too, but whatever OS you install doesn't tend to come with support, and it can be a nightmare to search for solutions online; even asking doesn't always mean someone will have an answer to your problem.

You're paying extra for that support and warranty.

That said, I built my own NAS some years ago, but it's been a PITA when I've had a couple of odd issues on the software side of things. I didn't go with one of the "fancy" operating systems; I picked OpenMediaVault, as my needs are largely storage and it does everything I need and then some. The system requirements are also quite modest compared to the more advanced NAS operating systems suggested above. I use MergerFS and SnapRAID, which means I end up with a parity drive that allows for good-enough data protection.
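For anyone curious what that setup looks like, here is a minimal snapraid.conf sketch following SnapRAID's standard format. The mount points are placeholders for this example; MergerFS would separately pool the data mounts into one share.

```text
# Minimal snapraid.conf sketch (mount points are placeholders)

# One dedicated parity drive protects the data drives below
parity /mnt/parity1/snapraid.parity

# Keep several copies of the content file (the array's metadata)
content /var/snapraid/snapraid.content
content /mnt/disk1/snapraid.content
content /mnt/disk2/snapraid.content

# Data drives, each an independent filesystem
data d1 /mnt/disk1/
data d2 /mnt/disk2/
```

After changing files you run `snapraid sync` to update parity, and `snapraid scrub` periodically to check the array, typically from cron.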

Can anyone recommend a great SATA card (8 ports), a PCIe NIC and a switch? My networking knowledge is pretty noobish at the moment. I'm guessing somewhere under $1K for the three?
What kind of speed for the network hardware? If you're going all in, I would suggest 10 Gbps, but 10 Gbps switches aren't cheap. Amazon has Marvell AQtion cards for under $60 now though, and the latest-gen AQC113C for under $65. You can get a 5-port unmanaged switch for $220.
The cheaper option is going 2.5 Gbps, which is sub-$20 for a NIC and $50 or less for a basic 8-port switch.

Building is the only way to get ECC RAM. That factors big for me.

I also don't like relying on single software vendors and their patch mechanisms.
That's simply not true any more. Admittedly you'd have to swap out the RAM yourself, but more and more pre-built devices can support ECC RAM.
Also, you're being too paranoid imho, along with a very vocal community that tells people they're morons if they don't use ECC RAM and ZFS.
There's zero proof that ECC RAM does anything for your average home user on their NAS.

As for the OS, who patches the open sauce stuff? You're still relying on a third party to implement support for whatever needs patching, unless you're an expert and build your own Linux kernel and OS from scratch.
 
Joined
Jan 4, 2022
Messages
101 (0.10/day)
Sounds like you're running your Synology devices 24/7. My TrueNAS systems use more power: e.g. an HP ML350p Gen8 with twin PSUs, a Xeon, 40GB of ECC RAM and six 4TB Toshiba N300 drives pulls on average 110W, at 30 cents per kWh. My desktop TrueNAS builds (8 hard disks per system) pull about the same power, but those systems are for archive backups and are only powered up occasionally. I tend to switch off stuff at night, including all the switches in my 10GbE home network, but leave the fibre internet router and pfSense firewall running 24/7 to maintain the landline VOIP phone.
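The running-cost difference is easy to put numbers on. Using the figures above (a ~110 W server, billed per kWh; the two rates are the 30c quoted here and the 20c quoted earlier in the thread):

```python
# Rough 24/7 running-cost estimate for a server with a steady draw.

def yearly_cost(watts, price_per_kwh, hours=24 * 365):
    """Energy cost over a year: kW * hours * price per kWh."""
    return watts / 1000 * hours * price_per_kwh

print(round(yearly_cost(110, 0.30), 2))  # 289.08 per year at 30c/kWh
print(round(yearly_cost(110, 0.20), 2))  # 192.72 per year at 20c/kWh
```

Powering the archive boxes down except when needed is what keeps the real bill well under these always-on figures.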

If anything goes wrong in any of my machines, I pull them apart and replace whatever died. I've just removed a failing 4TB Toshiba N300 NAS drive from a RAID Z2 server after TrueNAS reported bad blocks/pending sectors. I purchased a brand-new identical drive on eBay for $30, popped it into the server and resilvered the array in 6 hours. On another TrueNAS system, I replaced the ancient AMD FM2 motherboard when it started to fail. The price for a second-hand replacement mobo/CPU was approx. $6.
 

TheLostSwede

News Editor
I will chuck this out there: I think (based on 20c a kWh for power) I've saved about a grand since I employed Synology prebuilts.
It's possible to build an equally power-efficient system yourself; however, it's hard to get hold of good mini-ITX NAS boards with low-power SoCs as a consumer.
 
Joined
Jan 2, 2024
Messages
484 (1.74/day)
Seeing how a lot of these server grade boards are put together and priced, I'm inclined to agree.


The CWWK has a good amount of SATA, onboard USB, M.2 and Ethernet, but I don't have a whole lot of confidence in a network store + traffic controller with that CPU or single-channel DDR5 memory, and neither should you. I don't care how big or fast the memory is going to be; that's an N100 with a big copper heatsoak. It's not ideal for any situation and just generic enough to fit anywhere, like a mind virus. It might be okay for industrial use, but I don't see anyone going out to pick these up anytime soon when any consumer B550 board or Chinese snowflake X99 blowout will do the job at a significant price, performance, power and thermal advantage.

A NAS board should be a somewhat compact fit and priced right. Just like user workstations, it's not a permanent fit outright; it's going to change here and there over time, even if the box only gets moved once every few years. If you can't see the future and can't deal with the headache, it's best to just go with an existing product and put it to use. That's what I did.
 

TheLostSwede

News Editor
Seeing how a lot of these server grade boards are put together and priced, I'm inclined to agree.

The CWWK has a good amount of SATA, onboard USB, M.2 and Ethernet, but I don't have a whole lot of confidence in a network store + traffic controller with that CPU or single-channel DDR5 memory, and neither should you. I don't care how big or fast the memory is going to be; that's an N100 with a big copper heatsoak. It's not ideal for any situation and just generic enough to fit anywhere, like a mind virus. It might be okay for industrial use, but I don't see anyone going out to pick these up anytime soon when any consumer B550 board or Chinese snowflake X99 blowout will do the job at a significant price, performance, power and thermal advantage. A NAS board should be a somewhat compact fit and priced right. Just like user workstations, it's not a permanent fit outright; it's going to change here and there over time, even if the box only gets moved once every few years. If you can't see the future and can't deal with the headache, it's best to just go with an existing product and put it to use. That's what I did.
Sorry, but why would you want to touch these boards at all? Does that company issue BIOS/UEFI updates? Do you get any kind of support if the board fails or has bugs? Does that company even have a website you can download drivers from? I would never, ever touch these random Chinese boards; they're a recipe for disaster.
 
Joined
Jan 2, 2024
Messages
484 (1.74/day)
We have some VERY different priorities if your first concern is BIOS/UEFI updates.
A lot of my server equipment hasn't had updates in over 10 years, and my most valued box is a BIOS-only junker whose final version is split between Windows and Linux compatibility.
In short, the Chinese snowflake X99 board that likely has a few UEFI bugs is an infinitely more valuable proposition. It doesn't need much; it just has to work.
 

TheLostSwede

News Editor
We have some VERY different priorities if your first concern is BIOS/UEFI updates.
A lot of my server equipment hasn't had updates in over 10 years, and my most valued box is a BIOS-only junker whose final version is split between Windows and Linux compatibility.
In short, the Chinese snowflake X99 board that likely has a few UEFI bugs is an infinitely more valuable proposition. It doesn't need much; it just has to work.
You mean I buy hardware from reputable companies rather than some random crap? Yes, then we clearly have different priorities, as I have seen my fair share of crappy hardware over the years. Maybe you weren't building computers back then, but I worked in a computer shop that used PCChips and other junk like that for its budget machines, and it was hell: 8/10 boards were usually DOA. I would simply not waste my time on crap like that. But to each their own, I guess.
 
Joined
Oct 30, 2022
Messages
230 (0.33/day)
It's possible to build an equally power-efficient system yourself; however, it's hard to get hold of good mini-ITX NAS boards with low-power SoCs as a consumer.
Yeah, and the cost difference isn't worth it.

I got 8 hot-swap bays, RAID 6 and low power consumption in a small form factor. Absolutely achievable with selected PC components, but for how much less? More importantly, it's a storage system for me, not a server. Also, with an earlier hardware death (my DS1815+ carked it, so I swapped into a DS1821+), all I did was throw all the drives in, boot it and let it import the system into the new hardware config (which took 10 minutes without input from me), and off I went again flawlessly. I don't think a dead component in a PC build would be as easy a fix (RAM, sure; a CPU could be an issue; and if the boot drive dies, there's the expense of keeping a clone of it).

There's no reason for me to build a PC to save what would amount to a few hundred dollars at most (8 hot-swap bays alone were about 1/3 of the cost when I looked at it years ago), and it's far, far less fickle than any Windows Server variant I've used over the years, while my *nix distro knowledge isn't sufficient for it not to be a learning exercise. My time is worth more than the money saved (which is a choice we all have to make).
 