Temperatures
We're using a Noctua NH-U14S for temperature measurements. Application temperatures are measured using Blender, a highly demanding rendering workload that fully loads all cores, yet remains a realistic scenario rather than a synthetic stress test like Prime95. For gaming, we picked Cyberpunk 2077; its modern engine is multi-thread aware and will spread as many tasks as possible across a large number of CPU cores when available. Even when a game uses multiple threads, it doesn't load each core as heavily as rendering does, so there's some room for power savings here.
Note that unless indicated otherwise, all processors are tested at stock conditions with their power limit active, which is why some Intel temperatures are surprisingly low. As designed by Intel, these CPUs can exceed their TDP for a few seconds (PL2), but in the long term the power limit (PL1) is respected, which brings down temperatures considerably. Both tests report the steady-state temperature after an extended runtime of at least 10 minutes. Temperatures are based on delta T, normalized to 25°C, which is why some CPUs report temperatures above their throttle point: the actual room temperature during testing was below 25°C.
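To make the normalization concrete, here's a minimal sketch of the delta-T calculation, assuming the reported value is simply the measured heat rise above ambient re-referenced to a 25°C room (the function and variable names are illustrative, not part of our actual test suite):

```python
def normalize_to_25c(cpu_temp_c: float, room_temp_c: float) -> float:
    """Return the CPU temperature normalized to a 25°C ambient."""
    delta_t = cpu_temp_c - room_temp_c  # heat rise above room temperature
    return 25.0 + delta_t               # re-reference that rise to 25°C

# Example: 98°C measured in a 21°C room is reported as 102°C,
# which can sit above the CPU's throttle point in the charts.
print(normalize_to_25c(98.0, 21.0))  # 102.0
```

This is why a chart value above the throttle temperature doesn't mean the chip actually ran that hot; it only means the test room was cooler than the 25°C reference.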
For the new Intel CPUs we've increased the temperature limit in the BIOS from 100°C to 115°C, to get a better feel for temperatures without thermal throttling getting in the way. We also did a round of testing with the Noctua NH-D15 to get a feel for temps with a more powerful cooler (indicated by "D15" in the charts).
Interestingly, the 14900K runs considerably cooler than our 13900K. I confirmed that both sit at the 253 W power limit in the Blender test, which should result in very similar heat output. I'm not sure what's happening here; perhaps the IHS contact quality differs between samples, or the accuracy of the CPU's internal power sensor varies. That the 13900KS runs warmer is expected, because it operates with a 350 W power limit (vs. 253 W on the 14900K).