
Modern motherboards with 6+ usable PCIe slots?

Joined
Feb 22, 2022
Messages
581 (0.59/day)
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Custom Watercooling
Memory G.Skill Trident Z Royal 2x16GB
Video Card(s) MSi RTX 3080ti Suprim X
Storage 2TB Corsair MP600 PRO Hydro X
Display(s) Samsung G7 27" x2
Audio Device(s) Sound Blaster ZxR
Power Supply Be Quiet! Dark Power Pro 12 1500W
Mouse Logitech G903
Keyboard Steelseries Apex Pro
As has been said several times above, the only way to get a meaningful number of working PCIe slots is workstation-level hardware. Consumer hardware is not built for expandability anymore, and the CPUs/chipsets do not have enough available PCIe lanes for this to make much sense anyway. Things like USB4 controllers, 10Gb (or higher) NICs, and M.2 NVMe SSDs all require multiple PCIe lanes to work properly. So a mining motherboard with several x1 slots is not an option either.

You need to find a motherboard with built-in PCIe switches, which automatically increases the price by a good margin. See the MSi MEG X570S Unify-X MAX, which has one x16 and one x8 slot by default, but with numerous built-in switches it can provide up to six M.2 slots with x4 PCIe lanes each (one M.2 is x4 from the CPU, one is x4 from the chipset, and the other four are split from the physical x16 slots via switches). With some M.2-to-PCIe-slot adapters you can expand this a lot. The price for that motherboard alone is, umm, significant, but at least x4 M.2 slots have enough bandwidth for many potential upgrades.
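
To make the lane juggling concrete, here's a minimal sketch in Python (purely illustrative; the 16/4/4 split and the "each switched M.2 steals x4 from the slots" rule are my reading of the description above, not figures from MSi's manual):
[CODE=python]
# Illustrative lane accounting for a board like the MEG X570S Unify-X MAX.
# Assumed mapping: 16 CPU lanes feed the physical x16/x8 slots, one M.2 hangs off
# the CPU, one off the chipset, and up to four more M.2 slots can be switched over
# from the slots at x4 each.

CPU_SLOT_LANES = 16     # lanes normally wired to the mechanical x16/x8 slots
CPU_M2_LANES = 4        # dedicated CPU-attached M.2
CHIPSET_M2_LANES = 4    # dedicated chipset-attached M.2

def allocate(switched_m2_in_use: int) -> tuple[int, int]:
    """Return (lanes left for slots, lanes feeding M.2) for 0-4 switched M.2 slots."""
    switched_m2_in_use = max(0, min(4, switched_m2_in_use))
    stolen = 4 * switched_m2_in_use             # each switched M.2 takes x4 from the slots
    return CPU_SLOT_LANES - stolen, CPU_M2_LANES + CHIPSET_M2_LANES + stolen

for n in range(5):
    slots, m2 = allocate(n)
    print(f"{n} switched M.2 in use -> x{slots} left for the slots, {m2} lanes on M.2")
[/CODE]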

The only board that is somewhat modern there is the ASRock X570S PG Riptide and even that is gimped... three x16 slots but the lower two are electrically x4 and the bottommost one is limited to x2, WHY EVEN BOTHER FFS. Some of the other boards have an x16 and x8... but no USB-C internal connector.

That search is legitimately amazing though.
Not much to work with when the CPU/chipsets don't have more PCIE lanes to begin with. At least the slots can physically hold x16 cards :laugh:
 
Joined
Jan 3, 2021
Messages
3,432 (2.46/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
The only board that is somewhat modern there is the ASRock X570S PG Riptide and even that is gimped... three x16 slots but the lower two are electrically x4 and the bottommost one is limited to x2, WHY EVEN BOTHER FFS. Some of the other boards have an x16 and x8... but no USB-C internal connector.
On the upside, x16 + x4 will operate as x16 + x4. In other Core or Ryzen boards, unless there's a PCIe switch onboard, x16 + x8 will never work but will be reduced to x8 + x8.

That search is legitimately amazing though.
True. I often post links to it. Unfortunately the search is limited to what's currently on sale, so if someone is asking about a Skylake board with certain properties, it can't help much.

Do you see the page in English, by any chance? Up until a few years ago it was in German and English, I saw one on my home PC and the other at work, with no option to choose language. Now I only see it in German.
 
Joined
Feb 22, 2022
Messages
581 (0.59/day)
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Custom Watercooling
Memory G.Skill Trident Z Royal 2x16GB
Video Card(s) MSi RTX 3080ti Suprim X
Storage 2TB Corsair MP600 PRO Hydro X
Display(s) Samsung G7 27" x2
Audio Device(s) Sound Blaster ZxR
Power Supply Be Quiet! Dark Power Pro 12 1500W
Mouse Logitech G903
Keyboard Steelseries Apex Pro
Do you see the page in English, by any chance? Up until a few years ago it was in German and English, I saw one on my home PC and the other at work, with no option to choose language. Now I only see it in German.
I only see German as well. But they link to Skinflint at the bottom of that page, which at first glance looks like the same site in English.
 
Joined
Feb 18, 2005
Messages
5,745 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
That is still a ton of bandwidth.
No it's not; X670E can put out 48 lanes of PCIe (28 CPU + 8 chipset 1 + 12 chipset 2). The problem is that because AMD and motherboard manufacturers are shitheels, they don't allow for those lanes to be allocated sensibly.

What are you doing that can even make use of it all?
Graphics card = 16 lanes
Quad M.2 NVMe card = 16 lanes

Completely reasonable to expect the above to work in any desktop motherboard, and yet there are ZERO boards (including AMD's "high-end" X670E) that allow it. I'd happily settle for all the onboard M.2 slots to be disabled to allow the above configuration, but because motherboard and CPU manufacturers are useless, greedy shitheels that's not possible either. Even though, on many boards with a PCIe x4 slot, using said slot disables one of the M.2 slots and vice versa... so why the hell can't they extend that to ALL M.2 slots?
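
For anyone keeping score, the lane budget is easy to check. A quick sketch (Python; the 24-usable-lane figure is my assumption based on the commonly quoted AM5 split of 16 + 4 + 4 after four lanes go to the chipset):
[CODE=python]
# Rough budget check: GPU at x16 plus a passive quad-M.2 carrier (x4x4x4x4 = x16)
# against the general-purpose lanes an AM5 CPU exposes. The 16 + 4 + 4 split is an
# assumption based on the commonly quoted 28-lane total minus the 4-lane chipset link.
usable_cpu_lanes = 16 + 4 + 4

demands = {
    "GPU (x16)": 16,
    "quad-M.2 NVMe carrier (x16, bifurcated x4x4x4x4)": 16,
}

requested = sum(demands.values())
print(f"requested {requested} lanes, usable {usable_cpu_lanes}")
if requested > usable_cpu_lanes:
    print(f"short by {requested - usable_cpu_lanes} lanes -> something must be "
          "disabled or dropped to x8, which is exactly the reallocation argued for above")
[/CODE]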

Do you see the page in English, by any chance? Up until a few years ago it was in German and English, I saw one on my home PC and the other at work, with no option to choose language. Now I only see it in German.
I just use Chrome's built-in translate.
 
Joined
Jan 3, 2021
Messages
3,432 (2.46/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
You need to find a motherboard with built-in PCIe switches, which automatically increases the price by a good margin. See the MSi MEG X570S Unify-X MAX, which has one x16 and one x8 slot by default, but with numerous built-in switches it can provide up to six M.2 slots with x4 PCIe lanes each (one M.2 is x4 from the CPU, one is x4 from the chipset, and the other four are split from the physical x16 slots via switches). With some M.2-to-PCIe-slot adapters you can expand this a lot. The price for that motherboard alone is, umm, significant, but at least x4 M.2 slots have enough bandwidth for many potential upgrades.
That board really has a lot of flexibility built in but those aren't PCIe packet switches, which would enable communication with both ports. They are simply switched to one of the two positions on boot and stay there. The manual does talk about "bandwidth sharing" but in reality it's lane stealing.



I only see German as well. But they link to Skinflint at the bottom of that page, which at first glance look like the same site in English.
Skinflint only lists products currently available in the UK, and the default is only from sellers in the UK. You can choose to also include sellers from Germany, Austria and Poland (and maybe other countries) who ship over the channel. I compared both sites' results a few times, with foreign shipping options included, and found some PC parts that you can buy in Germany but not in the UK - but never the opposite.
 
Joined
Feb 18, 2005
Messages
5,745 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
That board really has a lot of flexibility built in but those aren't PCIe packet switches, which would enable communication with both ports. They are simply switched to one of the two positions on boot and stay there. The manual does talk about "bandwidth sharing" but in reality it's lane stealing.
It's not "stealing" so much as "reallocation". And this implementation is exactly what I'd expect from all motherboards! Why is the ability to allocate lanes away from unused slots something that has to be restricted to a halo model? It's so simple:
  • Choose a PCIe x16 slot to participate in lane reallocation
  • Add 4 M.2 slots on the board and link them to the above PCIe slot
  • Write the following UEFI code:
    • If none of the linked M.2 slots are populated, the linked PCIe slot runs at x16
    • If one of the linked M.2 slots is populated, the linked PCIe slot runs at x8 (AFAIK x12 is not a valid step for PCIe, so you lose 4 lanes here)
    • If two of the linked M.2 slots are populated, the linked PCIe slot runs at x8
    • If three of the linked M.2 slots are populated, the linked PCIe slot runs at x4
    • If all four of the linked M.2 slots are populated, the linked PCIe slot runs at x0 i.e. is disabled
That way, if you never use a particular M.2 slot, you don't lose that bandwidth from the corresponding PCIe slot.

You don't even have to link it to M.2 slots. You could, for example, say that (just like on Promontory 21 for AM5), four SATA ports take four PCIe lanes, and then your x16 slot is linked to 3 M.2 slots and 4 SATA ports. Or to 2 M.2 slots and 8 SATA ports. Or...
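
To show how little logic that actually is, here's a minimal sketch of the decision table above (Python standing in for what would really be a handful of lines of firmware; the slot widths are straight from the list, the function name is made up):
[CODE=python]
def linked_slot_width(m2_populated: int) -> int:
    """Link width of the x16 slot, given how many of its linked M.2 slots are populated.

    Follows the rules listed above: x12 is not a valid PCIe link width, so a single
    populated M.2 already drops the slot to x8, and four of them disable it entirely.
    """
    if m2_populated <= 0:
        return 16
    if m2_populated <= 2:
        return 8       # one M.2 wastes 4 lanes (no x12 step); two use them fully
    if m2_populated == 3:
        return 4
    return 0           # all four populated: slot disabled

for n in range(5):
    width = linked_slot_width(n)
    print(f"{n} linked M.2 populated -> slot runs at x{width}" + (" (disabled)" if width == 0 else ""))
[/CODE]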
 
Joined
Jul 16, 2022
Messages
613 (0.73/day)
Most AMD Threadripper only have 4 slots usable without a riser. A few have 5. I'm looking for 6. The price is a bit steep, but doable. Intel Sapphire Rapids cost more than my car.
Maybe consider the AMD WRX 80 Pro - AMD Threadripper Pro motherboard. That joker has seven PCIe slots (4x PCIe 4.0 x16 + 3x PCIe 4.0 x8). I am not sure how far you want to go with memory or CPU overclocking.
 
Joined
Feb 18, 2005
Messages
5,745 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Maybe consider the AMD WRX 80 Pro - AMD Threadripper Pro motherboard. That joker has seven PCIe slots (4x PCIe 4.0 x16 + 3x PCIe 4.0 x8). I am not sure how far you want to go with memory or CPU overclocking.
A WRX80 system will cost about as much as a car.
 
Joined
Jan 3, 2021
Messages
3,432 (2.46/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
It's not "stealing" so much as "reallocation". And this implementation is exactly what I'd expect from all motherboards! Why is the ability to allocate lanes away from unused slots something that has to be restricted to a halo model? It's so simple:
  • Choose a PCIe x16 slot to participate in lane reallocation
  • Add 4 M.2 slots on the board and link them to the above PCIe slot
  • Write the following UEFI code:
    • If none of the linked M.2 slots are populated, the linked PCIe slot runs at x16
    • If one of the linked M.2 slots is populated, the linked PCIe slot runs at x8 (AFAIK x12 is not a valid step for PCIe, so you lose 4 lanes here)
    • If two of the linked M.2 slots are populated, the linked PCIe slot runs at x8
    • If three of the linked M.2 slots are populated, the linked PCIe slot runs at x4
    • If all four of the linked M.2 slots are populated, the linked PCIe slot runs at x0 i.e. is disabled
That way, if you never use a particular M.2 slot, you don't lose that bandwidth from the corresponding PCIe slot.

You don't even have to link it to M.2 slots. You could, for example, say that (just like on Promontory 21 for AM5), four SATA ports take four PCIe lanes, and then your x16 slot is linked to 3 M.2 slots and 4 SATA ports. Or to 2 M.2 slots and 8 SATA ports. Or...
The joy of fast PCIe ... Up until 3.0, no switches were even necessary, two slots were simply wired in parallel. That's my assumption at least, I checked some info on Z490 boards (PCIe 3.0) and found the following for the MSI MEG Z490 Ace. The "Switch" here might be a real PLX one but other wires just split without a switch if the diagram is correct. 4.0 made everything costlier and less flexible.



PCIe x12 was part of the original specification but no products ever existed. It's probably been abandoned by now, and even if it hasn't been, it will continue to not matter.
 
Joined
Jul 30, 2019
Messages
3,190 (1.66/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR4-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
And that's what AMD wants you to spend on it.
NewEgg has a bundle with WRX80 + Threadripper 5955WX for only $2500. If you are actually making money with your PC and making good use of the slots this might not be a bad deal. In my case it's a bit more than double the TDP and overly expensive at such a low core count just to get more PCIe slots - although I am tempted.

Whenever I find myself looking at Threadripper and re-compare performance and wattage (at lower core counts), I'm reminded how awesome it is to even be able to get a good-performing 16-core CPU/motherboard combo for under $1000 on AM4/AM5.
 
Joined
Apr 12, 2013
Messages
7,473 (1.77/day)
It does, but really it's a niche, and an extremely small one at that; even if you want 40+ PCIe lanes bifurcated nicely, I doubt you'd utilize all of them to the fullest. This is why there aren't any board makers doing that; on servers you probably do have all these lanes being saturated from time to time.
 
Joined
Feb 22, 2022
Messages
581 (0.59/day)
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Custom Watercooling
Memory G.Skill Trident Z Royal 2x16GB
Video Card(s) MSi RTX 3080ti Suprim X
Storage 2TB Corsair MP600 PRO Hydro X
Display(s) Samsung G7 27" x2
Audio Device(s) Sound Blaster ZxR
Power Supply Be Quiet! Dark Power Pro 12 1500W
Mouse Logitech G903
Keyboard Steelseries Apex Pro
That board really has a lot of flexibility built in but those aren't PCIe packet switches, which would enable communication with both ports. They are simply switched to one of the two positions on boot and stay there. The manual does talk about "bandwidth sharing" but in reality it's lane stealing.
I did not call them packet switches either. They are literal switches. They switch lanes from one connector to another. Which I thought I explained in the rest of that post. So no, they are not multiplexers. They give you the option to redirect lanes according to a strict A/B configuration. Which in this case is the choice between M.2 slots or other slots/features onboard.
 
Joined
Feb 18, 2005
Messages
5,745 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
The biggest problem with Threadripper, apart from its price, is that it's always going to be at least a generation behind. So even if you buy WRX80 you only get PCIe 4.0, whereas AM5 will give you PCIe 5.0. And if I'm shelling out that much money I'd expect to be getting the latest and greatest for it...

Basically, HEDT is a scam.
 
Joined
Feb 1, 2019
Messages
3,516 (1.68/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
NewEgg has a bundle with WRX80 + Threadripper 5955WX for only $2500. If you are actually making money with your PC and making good use of the slots this might not be a bad deal. In my case it's a bit more than double the TDP and overly expensive at such a low core count just to get more PCIe slots - although I am tempted.

Whenever I find myself looking at Threadripper and re-compare performance and wattage (at lower core counts), I'm reminded how awesome it is to even be able to get a good-performing 16-core CPU/motherboard combo for under $1000 on AM4/AM5.
That's a hard sell; you're basically just buying it for the PCIe lanes, as a 13700K, for example, will beat it on single-threaded work comfortably and trade blows with it on all-core usage for a much lower cost.
 
Joined
Jul 30, 2019
Messages
3,190 (1.66/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR4-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
That's a hard sell; you're basically just buying it for the PCIe lanes, as a 13700K, for example, will beat it on single-threaded work comfortably and trade blows with it on all-core usage for a much lower cost.
Exactly. Unless of course you need the expansion and over 128GB of memory and over 16 cores, then you might really get your bang for the buck. If you only need one of these three, then you may not be getting as much as what you are paying for, while already entering server price territory. There was a similar combo with a 3955WX for about $1200, but that's still quite a premium for being two generations behind. This of course is just my opinion, and I admit in my heart I still want to get a Threadripper.
OP has left the chat lol

Darn. Actually I think I found the slot configuration he was looking for in the ASRock Master SLI/ac :roll:



However, the VRMs and OCP (or lack thereof) on that board really suck, as many here already know by now. I have noticed that on many boards with an open-ended x1 slot there are components in the way, but on this board it's like they actually thought about that and left some clearance for x4 cards to fit in the x1 slot's space. I still find myself browsing eBay from time to time to see if this board still turns up.
 

silentbogo

Moderator
Staff member
Joined
Nov 20, 2013
Messages
5,538 (1.39/day)
Location
Kyiv, Ukraine
System Name WS#1337
Processor Ryzen 7 3800X
Motherboard ASUS X570-PLUS TUF Gaming
Cooling Xigmatek Scylla 240mm AIO
Memory 4x8GB Samsung DDR4 ECC UDIMM
Video Card(s) MSI RTX 3070 Gaming X Trio
Storage ADATA Legend 2TB + ADATA SX8200 Pro 1TB
Display(s) Samsung U24E590D (4K/UHD)
Case ghetto CM Cosmos RC-1000
Audio Device(s) ALC1220
Power Supply SeaSonic SSR-550FX (80+ GOLD)
Mouse Logitech G603
Keyboard Modecom Volcano Blade (Kailh choc LP)
VR HMD Google dreamview headset(aka fancy cardboard)
Software Windows 11, Ubuntu 24.04 LTS
Darn. Actually I think I found the slot configuration he was looking for in the ASRock Master SLI/ac
I think the one he was looking for is more like the one he listed in the OP, or like the Gigabyte Z390 D or UD (the first x1 slot is above the first x16 slot, and you have empty space for a dual-slot GPU).
The only problem is that his ASRock B450 is a subpar overclocker, and the GB Z390 D/UD (at least from my experience) is utter garbage.
[Image: Gigabyte Z390 UD]
 
Joined
Jul 30, 2019
Messages
3,190 (1.66/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR4-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
I think the one he was looking for is more like the one he listed in the OP, or like the Gigabyte Z390 D or UD (the first x1 slot is above the first x16 slot, and you have empty space for a dual-slot GPU).
The only problem is that his ASRock B450 is a subpar overclocker, and the GB Z390 D/UD (at least from my experience) is utter garbage.
Oops, I missed that point (with the dual slot), but if you go water-cooled you get back down to a single-slot card.
 
Joined
Apr 24, 2020
Messages
2,701 (1.64/day)
Most AMD Threadripper only have 4 slots usable without a riser. A few have 5. I'm looking for 6. The price is a bit steep, but doable. Intel Sapphire Rapids cost more than my car.


- GPU (2 Slots)
- 10G NIC SFP+
- USB3 Controller (dedicated to mouse)
- Analog Capture card

That's just moving stuff from my current build. I know there's going to be yet another USB standard released at some point, that's another card. That's 6 slots used, since the GPU is 2 slots.

I feel like you want a riser cable or two.

USB3 Controller just for mouse? This one seems odd: most USB Mice update at 100 times per second, and there are configs available to easily change that to 1000x per second.

------------

Typical CPUs only have 20 or 24 lanes of PCIe physically. IMO, it makes very little sense to put as many PCIe connectors on a board as you're asking for unless you do some weird port bifurcation (turning an x4 slot into 4x x1 slots), which is what GPU miners do.
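
For reference, this is what bifurcation of a single x16 root port usually looks like; the mode names mirror the x8x8 / x4x4x4x4 style options a BIOS may or may not expose (a sketch, not a guarantee that any given board offers them):
[CODE=python]
# Common ways a x16 CPU root port can be bifurcated. Miner boards instead hang many
# x1 links off the chipset. Whether these modes are exposed is entirely up to the BIOS.
BIFURCATION_MODES = {
    "x16":      [16],
    "x8x8":     [8, 8],
    "x8x4x4":   [8, 4, 4],
    "x4x4x4x4": [4, 4, 4, 4],   # what a passive quad-M.2 carrier card needs
}

for name, widths in BIFURCATION_MODES.items():
    assert sum(widths) == 16    # every mode still spends the same 16 lanes
    print(f"{name:9s} -> {len(widths)} device(s): {widths}")
[/CODE]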

If you _REALLY_ need PCIe slots, then the Threadripper platform is for you.




But honestly? I think you're wasting your money on such a platform. You're not "really" using those PCIe lanes to their full potential with this proposed setup you've got. This is a $1000+ motherboard for a reason; some people really do need a ton of PCIe lanes for what they do.

In your case, it'd be cheaper to just buy two computers. One for now, and then upgrade 5 years from now when a new USB standard comes out.

The biggest problem with Threadripper, apart from its price, is that it's always going to be at least a generation behind. So even if you buy WRX80 you only get PCIe 4.0, whereas AM5 will give you PCIe 5.0. And if I'm shelling out that much money I'd expect to be getting the latest and greatest for it...

Basically, HEDT is a scam.

I wouldn't quite call it a "scam", but it's a niche product unsuitable for the vast majority of computer users.

If you're going to get 4x GPUs and 16x NVMe SSDs in RAID 0, you'll need something like the Threadripper platform. Nothing else will work. But this is a niche within a niche; very few users would ever need such a beast. I've also seen talks on Netflix servers with NVMe SSDs plus multiple SFP+ 10Gig NICs with port ganging that would need this level of PCIe lanes to function at max speed. But these are very atypical uses of computers.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,908 (3.78/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -075mV PL max @225w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Joined
Feb 18, 2005
Messages
5,745 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I wouldn't quite call it a "scam", but it's a niche product unsuitable for the vast majority of computer users.
I was more referring to the artificial market segmentation of "mainstream" and "HEDT", whereby getting a sane number of PCIe slots with a sane number of PCIe lanes available should be normal on the former, yet is only possible on the second with all its other, mostly irrelevant bells and whistles.

In the Athlon64 days, before the scam of HEDT was dreamed up by the marketing assholes at Intel, the high-end (but very much mainstream) boards had twin PCIe x16 slots (mechanical and electrical) with some PCIe lanes left over to boot - all for barely more than a hundred quid. Inflation may be a thing but manufacturers are massively and artificially contributing to it by giving us less and charging more for it.

Fuck Intel for their bullshit, fuck AMD for collaborating with it, and fuck the motherboard manufacturers for enabling both.
 
Joined
Apr 24, 2020
Messages
2,701 (1.64/day)
I was more referring to the artificial market segmentation of "mainstream" and "HEDT", whereby getting a sane number of PCIe slots with a sane number of PCIe lanes available should be normal on the former, yet is only possible on the second with all its other, mostly irrelevant bells and whistles.

In the Athlon64 days, before the scam of HEDT was dreamed up by the marketing assholes at Intel, the high-end (but very much mainstream) boards had twin PCIe x16 slots (mechanical and electrical) with some PCIe lanes left over to boot - all for barely more than a hundred quid. Inflation may be a thing but manufacturers are massively and artificially contributing to it by giving us less and charging more for it.

Fuck Intel for their bullshit, fuck AMD for collaborating with it, and fuck the motherboard manufacturers for enabling both.

Hmmm... I dunno. PCIe lanes, especially 4.0 and 5.0, are getting very expensive. The tolerances required to consistently pipe 32GT/s per lane (64GBps for an x16 PCIe 5.0 link) are above and beyond RAM throughput.

So if you look at a typical x28 lane chip like the AMD 7950x3d, that's 112 GBps of I/O throughput. In contrast, your DDR5-5600 RAM is only giving you 44.8 GBps. If you dual channel (like you should), that's 89.6GBps to RAM, but 112GBps to I/O.

If you literally have your CPU do nothing but pass I/O to and from RAM, your computer is RAM-bottlenecked (!!!!). You have more I/O bandwidth than RAM bandwidth on today's computers.
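
Spelled out (Python; the per-lane figure here includes the 128b/130b encoding overhead, so it lands a touch under the 112 GB/s quoted above):
[CODE=python]
# PCIe 5.0: 32 GT/s per lane with 128b/130b encoding -> roughly 3.9 GB/s per lane, each direction.
pcie5_per_lane = 32 * (128 / 130) / 8        # ~3.94 GB/s
io_bandwidth = 28 * pcie5_per_lane           # 28 CPU lanes -> ~110 GB/s

# DDR5-5600: 5600 MT/s x 8 bytes per channel.
ram_single = 5.6 * 8                         # 44.8 GB/s
ram_dual = 2 * ram_single                    # 89.6 GB/s

print(f"I/O : {io_bandwidth:6.1f} GB/s across 28 PCIe 5.0 lanes")
print(f"RAM : {ram_dual:6.1f} GB/s dual-channel DDR5-5600")
print("I/O bandwidth exceeds RAM bandwidth" if io_bandwidth > ram_dual else "RAM wins")
[/CODE]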

----------

Who actually needs all this bandwidth? IMO, nobody. Even standard consumer platforms are overly thick with I/O bandwidth to absurd levels. The only way to effectively use all this I/O bandwidth is through technologies like DirectX DirectStorage (GPU to NVMe, bypassing the RAM bottleneck).
 
Joined
Feb 18, 2005
Messages
5,745 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Hmmm... I dunno. PCIe lanes, especially 4.0 and 5.0, are getting very expensive. The tolerances required to consistently pipe 32GT/s per lane (64GBps for an x16 PCIe 5.0 link) are above and beyond RAM throughput.

So if you look at a typical x28 lane chip like the AMD 7950x3d, that's 112 GBps of I/O throughput. In contrast, your DDR5-5600 RAM is only giving you 44.8 GBps. If you dual channel (like you should), that's 89.6GBps to RAM, but 112GBps to I/O.

If you literally have your CPU do nothing but pass I/O to and from RAM, your computer is RAM-bottlenecked (!!!!). You have more I/O bandwidth than RAM bandwidth on today's computers.

----------

Who actually needs all this bandwidth? IMO, nobody. Even standard consumer platforms are overly thick with I/O bandwidth to absurd levels. The only way to effectively use all this I/O bandwidth is through technologies like DirectX DirectStorage (GPU to NVMe, bypassing the RAM bottleneck).
Which is why PCIe 5.0 makes no sense in the desktop space. Instead of giving us 28 lanes of PCIe 5.0, give us 40 lanes of PCIe 4.0.
 
Joined
May 8, 2016
Messages
1,887 (0.61/day)
System Name BOX
Processor Core i7 6950X @ 4,26GHz (1,28V)
Motherboard X99 SOC Champion (BIOS F23c + bifurcation mod)
Cooling Thermalright Venomous-X + 2x Delta 38mm PWM (Push-Pull)
Memory Patriot Viper Steel 4000MHz CL16 4x8GB (@3240MHz CL12.12.12.24 CR2T @ 1,48V)
Video Card(s) Titan V (~1650MHz @ 0.77V, HBM2 1GHz, Forced P2 state [OFF])
Storage WD SN850X 2TB + Samsung EVO 2TB (SATA) + Seagate Exos X20 20TB (4Kn mode)
Display(s) LG 27GP950-B
Case Fractal Design Meshify 2 XL
Audio Device(s) Motu M4 (audio interface) + ATH-A900Z + Behringer C-1
Power Supply Seasonic X-760 (760W)
Mouse Logitech RX-250
Keyboard HP KB-9970
Software Windows 10 Pro x64
In short: buy older HEDT or server stuff if you need A LOT of PCIe lanes, or get used to whatever the manufacturer integrates onto the PCB.

Longer version :
In the Athlon64 days, before the scam of HEDT was dreamed up by the marketing assholes at Intel, the high-end (but very much mainstream) boards had twin PCIe x16 slots (mechanical and electrical) with some PCIe lanes left over to boot - all for barely more than a hundred quid. Inflation may be a thing but manufacturers are massively and artificially contributing to it by giving us less and charging more for it.
Not sure why so salty at Intel here...
PCIe lanes, up until 2009/2013 (Intel/AMD), were basically purely a chipset thing.
You had a separate die dedicated to PCIe, which meant you could have made "HEDT" on the cheap with simply a slightly better chipset and the same socket/CPU/RAM.

Then LGA 1156 came along, with PCIe integrated into the CPU.
Intel kept LGA 1366 X58 as HEDT (but that platform still has its PCIe inside the X58 chipset, not in the CPU).
X79 was the first Intel HEDT platform with the actual 40 PCIe lanes coming from the CPU.
The result is LGA 2011, i.e. the 2011 pins that were needed to do that (and that's PCIe 3.0).
So, how much of a difference in cost is there between LGA 115x and 20xx?
I don't know, but it's big enough for Intel and the motherboard guys to ask for more.

Consumers are meant to pay more for less in a capitalist world (because otherwise there is a hard limit on how much money someone can earn).
If enough people don't care about the above rule, it becomes a norm that everyone else has to live by.

In the end, you may not like it, BUT that ship sailed long ago (and unless consumers stop buying boards with few PCIe slots, this isn't going to change).
 