qubit
Overclocked quantum bit
- Joined: Dec 6, 2007
- Messages: 17,865 (2.87/day)
- Location: Quantum Well UK
| System Name | Quantumville™ |
|---|---|
| Processor | Intel Core i7-2700K @ 4GHz |
| Motherboard | Asus P8Z68-V PRO/GEN3 |
| Cooling | Noctua NH-D14 |
| Memory | 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz) |
| Video Card(s) | MSI RTX 2080 SUPER Gaming X Trio |
| Storage | Samsung 850 Pro 256GB \| WD Black 4TB \| WD Blue 6TB |
| Display(s) | ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) \| Asus MG28UQ (4K, 60Hz, FreeSync compatible) |
| Case | Cooler Master HAF 922 |
| Audio Device(s) | Creative Sound Blaster X-Fi Fatal1ty PCIe |
| Power Supply | Corsair AX1600i |
| Mouse | Microsoft Intellimouse Pro - Black Shadow |
| Keyboard | Yes |
| Software | Windows 10 Pro 64-bit |
According to this article, soldering the heatspreader to the CPU isn't all it's cracked up to be (literally), so using thermal paste can make sense. My question is: in that case, why doesn't Intel use the best thermal paste available? Surely the tiny extra cost would be negligible, and the improved performance and lower temperatures, especially when overclocked, would more than make up for it? See what you think.
https://overclocking.guide/the-truth-about-cpu-soldering
From the article:

> Skylake delidding seems to be very common by now. Every day I read posts from people complaining about Intel and the thermal paste between the IHS and the die. Even though Skylake performs great, people are not satisfied with its temperatures under load. Unlike Sandy Bridge and older CPUs, which were soldered, recent generations use conventional thermal paste between the IHS and the die. Why did Intel change the production process, and is the thermal paste really that bad?
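For a rough feel of why the choice of TIM matters, here's a back-of-envelope sketch of the temperature drop across the TIM layer, using ΔT = P·t/(k·A). All the numbers below (die area, power, bond-line thicknesses, conductivities) are illustrative assumptions picked for an order-of-magnitude comparison, not Intel's actual figures:

```python
# Back-of-envelope: temperature drop across the TIM layer, dT = P * t / (k * A).
# Every constant below is an illustrative assumption, not a measured value.

def tim_delta_t(power_w: float, thickness_m: float,
                conductivity_w_mk: float, area_m2: float) -> float:
    """Temperature drop (K) across a uniform TIM layer under steady heat flow."""
    resistance_k_per_w = thickness_m / (conductivity_w_mk * area_m2)
    return power_w * resistance_k_per_w

DIE_AREA = 122e-6   # m^2: ~122 mm^2, roughly a Skylake quad-core die (assumed)
POWER = 100.0       # W pushed through the die, an overclocked-load guess

# Polymer paste: k ~ 5 W/(m*K), ~50 um bond line (assumed typical values)
paste_dt = tim_delta_t(POWER, 50e-6, 5.0, DIE_AREA)
# Indium solder: k ~ 80 W/(m*K), ~100 um joint (solder layers tend to be thicker)
solder_dt = tim_delta_t(POWER, 100e-6, 80.0, DIE_AREA)

print(f"paste : {paste_dt:4.1f} K across the TIM")   # ~8.2 K
print(f"solder: {solder_dt:4.1f} K across the TIM")  # ~1.0 K
```

Even with generous assumptions for the paste, solder comes out several kelvin ahead at high load. The sketch also shows that bond-line thickness matters as much as the paste's conductivity, which ties into the article's point: a better paste alone doesn't help much if the gap under the IHS stays large.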