Ruru
S.T.A.R.S.
| System Name | 4K-gaming / console |
|---|---|
| Processor | 5800X @ PBO +200 / i5-8600K @ 4.6GHz |
| Motherboard | ROG Crosshair VII Hero / ROG Strix Z370-F |
| Cooling | Alphacool Eisbaer 360 / Alphacool Eisbaer 240 |
| Memory | 32GB DDR4-3466 / 16GB DDR4-3600 |
| Video Card(s) | Asus RTX 3080 TUF OC / Powercolor RX 6700 XT |
| Storage | 3.5TB of SSDs / several small SSDs |
| Display(s) | 4K120 IPS + 4K60 IPS / 1080p60 HDTV |
| Case | Corsair 4000D AF White / DeepCool CC560 WH |
| Audio Device(s) | Sony WH-CH720N / TV speakers |
| Power Supply | EVGA G2 750W / Fractal ION Gold 550W |
| Mouse | Razer Basilisk / Logitech G400s |
| Keyboard | Roccat Vulcan 121 AIMO / NOS C450 Mini Pro |
| VR HMD | Oculus Rift CV1 |
| Software | Windows 11 Pro / Windows 11 Pro |
| Benchmark Scores | They run Crysis |
Seriously, what were they thinking at Gigabyte with this design, when the x1 slot becomes unusable with any even somewhat modern GPU? They could've put it above the PCIe 4.0 x16 slot and moved the M.2 slot a bit, but nope.

Mine used to go to an M.2 adapter when I was on Intel, but my current B650 board has two M.2 sockets wired to the CPU, so I'm fine with using only those. Luckily so, because the main x16 slot is shifted down by one slot, so the bottom one is the only breathing room the graphics card gets.
edit: I could quote AVGN on this one: "No, they were not thinking."