qubit
Any computer system is always bottlenecked by its slowest component; if there were no bottleneck at all, it would run infinitely fast. So in a gaming system, where should that bottleneck sit, on the CPU or the GPU? Please vote in the poll.
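To make the idea concrete, here's a minimal sketch (with made-up numbers, and ignoring pipelining between the two) of why the slower of the CPU and GPU sets the framerate:

```python
# Hypothetical per-frame costs: whichever of the CPU or GPU takes longer
# caps the framerate, so that's where the bottleneck is.
cpu_ms = 8.0   # CPU takes 8 ms to prepare a frame  -> ~125 fps on its own
gpu_ms = 14.0  # GPU takes 14 ms to render a frame  -> ~71 fps on its own

frame_ms = max(cpu_ms, gpu_ms)   # the slower part sets the pace
fps = 1000.0 / frame_ms

print(f"CPU-only limit:   {1000.0 / cpu_ms:.0f} fps")
print(f"GPU-only limit:   {1000.0 / gpu_ms:.0f} fps")
print(f"Actual framerate: {fps:.0f} fps (GPU-bound in this example)")
```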
I say GPU, for the following reasons:
1. I believe having the bottleneck at the CPU makes frame pacing less even, resulting in more noticeable and annoying stutter (I've put a rough frame-time sketch after this list). If anyone knows more about this, let us know.
2. The CPU is generally upgraded much less often than the graphics card, since CPU performance gains are only a few percent each generation.
3. A GPU bottleneck maximises the financial investment in the graphics card, and graphics cards are now very expensive: think RTX 2080 Ti at a whopping $1,200. You don't want this baby held back from performing to its fullest by a slow CPU. On the other hand, pairing a fast graphics card with a slow CPU just for kicks means you can max everything out at the highest resolution and the framerate doesn't drop, because it's already quite low, lol.
There are probably more reasons, but that's what I can think of off the top of my head.
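On point 1, stutter is really about how consistent the frame times are rather than the average fps. A toy sketch of how you could quantify it from a logged frame-time trace (the numbers here are invented, not from any real capture):

```python
# Toy example: compare average fps with the worst frames and the
# frame-to-frame spread -- big spikes and a high spread show up as stutter
# even when the average framerate looks fine.
import statistics

frame_times_ms = [16.7, 16.9, 16.5, 33.1, 16.8, 17.0, 16.6, 41.2, 16.7, 16.9]

avg_fps = 1000.0 / statistics.mean(frame_times_ms)
worst_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]  # ~99th percentile frame
jitter = statistics.pstdev(frame_times_ms)                          # frame-to-frame spread

print(f"Average: {avg_fps:.0f} fps")
print(f"~1% worst frame time: {worst_ms:.1f} ms ({1000.0 / worst_ms:.0f} fps)")
print(f"Frame-time std dev: {jitter:.1f} ms (higher = more visible stutter)")
```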
Note that where the CPU and GPU are evenly matched, the bottleneck will tend to switch between them from moment to moment, and it will be more obvious in some games than others. I've never seen a utility that shows this happening, so there's no easy way to observe it directly; the rough sketch below is about the closest I can think of.
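A rough heuristic, not a proper profiler: poll GPU utilisation while the game runs (this assumes an NVIDIA card and the pynvml Python bindings installed, which isn't something from the post itself). If the GPU sits near 100% it's GPU-bound at that moment; if utilisation drops well below that with an uncapped framerate, something else, usually the CPU, is the limit.

```python
# Sample GPU utilisation once a second and guess the momentary bottleneck.
# Requires an NVIDIA GPU and the pynvml bindings (assumption, not from the post).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(30):  # sample for roughly 30 seconds while the game runs
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
        state = "GPU-bound" if util >= 95 else "likely CPU-bound (or fps-capped)"
        print(f"GPU utilisation: {util:3d}%  ->  {state}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```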