- Joined: Sep 15, 2011
- Messages: 6,762 (1.39/day)
| Component | Spec |
|---|---|
| Processor | Intel® Core™ i7-13700K |
| Motherboard | Gigabyte Z790 Aorus Elite AX |
| Cooling | Noctua NH-D15 |
| Memory | 32GB (2x16) DDR5-6600 G.Skill Trident Z5 |
| Video Card(s) | ZOTAC GAMING GeForce RTX 3080 AMP Holo |
| Storage | 2TB SK hynix Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD |
| Display(s) | Acer Predator X34 3440x1440@100Hz G-Sync |
| Case | NZXT PHANTOM410-BK |
| Audio Device(s) | Creative X-Fi Titanium PCIe |
| Power Supply | Corsair 850W |
| Mouse | Logitech Hero G502 SE |
| Software | Windows 11 Pro - 64bit |
| Benchmark Scores | 30FPS in NFS:Rivals |
> ... And most games today are GPU bound, ... but games dont need more CPU power.
People keep repeating those completely FALSE statements over and over again. Games don't need more CPU power?? Bloody hell they do. What about enemy and NPC A.I.? That's all done on the CPU. What about a world where NPCs actually do something instead of just standing still or running simple scripted tasks? What about complex physics, not just the simple stuff? Etc., etc...
Anyway, back on topic. The point is that since the 2600K, every CPU generation has delivered a performance increase of roughly 5% on average, which is ridiculously and callously low. I bought my 3770K exactly 4 years ago, and at this rate I'll probably have to wait another 6 years to see a 50% performance improvement over my current CPU. Which is ... LOL
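To put that in perspective, here's a quick back-of-the-envelope calculation (assuming the ~5% average uplift compounds per generation, which is my simplification, not an official figure) showing how many generations it takes to reach a cumulative +50%:

```python
# Back-of-the-envelope: how many ~5% generational uplifts
# are needed before cumulative performance reaches 1.5x?
per_gen_uplift = 1.05  # assumed ~5% average improvement per generation
target = 1.5           # +50% over the baseline CPU

perf = 1.0
gens = 0
while perf < target:
    perf *= per_gen_uplift
    gens += 1

print(gens, round(perf, 3))  # prints: 9 1.551
```

Nine generations at roughly one per year is about nine years for +50%, which is exactly why that pace feels so glacial.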