Ruru
S.T.A.R.S.
- Joined
- Dec 16, 2012
- Messages
- 12,807 (2.93/day)
- Location
- Jyväskylä, Finland
System Name | 4K-gaming / media-PC |
---|---|
Processor | AMD Ryzen 7 5800X / Intel Core i7-6700K |
Motherboard | Asus ROG Crosshair VII Hero / Asus Z170-A |
Cooling | Arctic Freezer 50 / Thermaltake Contac 21 |
Memory | 32GB DDR4-3466 / 16GB DDR4-3000 |
Video Card(s) | RTX 3080 10GB / RX 6700 XT |
Storage | 3.3TB of SSDs / several small SSDs |
Display(s) | Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS |
Case | Corsair 4000D AF White / DeepCool CC560 WH |
Audio Device(s) | Creative Omni BT speaker |
Power Supply | EVGA G2 750W / Fractal ION Gold 550W |
Mouse | Logitech MX518 / Logitech G400s |
Keyboard | Roccat Vulcan 121 AIMO / NOS C450 Mini Pro |
VR HMD | Oculus Rift CV1 |
Software | Windows 11 Pro / Windows 11 Pro |
Benchmark Scores | They run Crysis |
Seriously, what were they thinking at Gigabyte with this design, when the x1 slot becomes unusable with any even somewhat modern GPU? They could've put it above the PCIe 4.0 x16 slot and moved the M.2 slot a bit, but nope.

Mine used to go into an M.2 adapter when I was on Intel, but my current B650 board has two M.2 sockets coming from the CPU, so I'm fine with using only those... luckily, because the main x16 slot is shifted down by one slot, so the bottom one is the only breathing room left for the graphics card.
edit: I could quote the AVGN on this one: "No, they were not thinking."