Why does PPT seem to matter more than anything else for temperature?
Changing EDC and TDC doesn't do much on my chip; an all-core overclock or a single-threaded load shifts things a little, as if not all cores hold the same clock consistently. Dropping PPT from 100W to 90W yields lower temps while scoring about the same in Cinebench R23.
Last night I tested the 5600X settings posted here, 90W PPT / 75A TDC / 100A EDC, then tried 100W / 75A TDC / 100A EDC. The score didn't change much, but temps were 10C lower with 90W PPT.
Then I tried 90W / 85A TDC / 110A EDC; it scores about the same, with maybe only 5C lower temps.
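For what it's worth, here's a rough back-of-the-envelope way to see which limit should bind first. This is my own simplification (it assumes package power ≈ core voltage × current and ignores the SoC rail, which isn't how the SMU actually accounts for things), and the 1.2V figure is just an assumed all-core voltage:

```python
# Rough sketch: which PBO limit binds first under a sustained all-core load?
# Simplification: package power ~= core voltage * core current (SoC rail ignored).

def binding_limit(ppt_w: float, tdc_a: float, edc_a: float, vcore: float) -> str:
    """Return whichever limit caps sustained power at a given core voltage."""
    caps = {
        "PPT": ppt_w,          # package power limit, watts
        "TDC": tdc_a * vcore,  # sustained current limit, expressed as watts
        "EDC": edc_a * vcore,  # peak current limit, expressed as watts
    }
    name, watts = min(caps.items(), key=lambda kv: kv[1])
    return f"{name} binds first at ~{watts:.0f}W"

# The three combos above, at an assumed 1.2V all-core voltage:
for ppt, tdc, edc in [(90, 75, 100), (100, 75, 100), (90, 85, 110)]:
    print(f"PPT={ppt}W TDC={tdc}A EDC={edc}A ->", binding_limit(ppt, tdc, edc, 1.2))
```

At ~1.2V this might help explain why raising PPT from 90W to 100W with TDC left at 75A barely moved the score: the current limit caps sustained power around 90W either way.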
I don't know if I should trust HWiNFO: with PPT set to 125W but EDC at 120A, it shows power stopping at 120W and never going higher, even though EDC is a current limit in amps, not watts.
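One way to sanity-check this would be to log sensors to CSV during a CB23 run and summarize the recorded values afterwards. A minimal sketch; the file path and column names here are placeholders and will likely differ in your log's header row:

```python
# Summarize package power and temperature from an HWiNFO sensor CSV log.
import csv
import statistics

LOG = "hwinfo_log.csv"                # placeholder path to your sensor log
PPT_COL = "CPU PPT [W]"               # column names vary by HWiNFO version;
TEMP_COL = "CPU (Tctl/Tdie) [°C]"     # adjust both to match your log's header

ppt, temp = [], []
# HWiNFO logs are often cp1252-encoded; change encoding if the header looks garbled
with open(LOG, newline="", encoding="utf-8", errors="replace") as f:
    for row in csv.DictReader(f):
        try:
            ppt.append(float(row[PPT_COL]))
            temp.append(float(row[TEMP_COL]))
        except (KeyError, ValueError):
            continue  # skip the footer/summary rows HWiNFO appends

if not ppt:
    raise SystemExit("no PPT samples found; check the column names against your log")

print(f"PPT  avg {statistics.mean(ppt):.1f}W  max {max(ppt):.1f}W")
print(f"Temp avg {statistics.mean(temp):.1f}C  max {max(temp):.1f}C")
```

If the max PPT across the whole run never exceeds ~120W, the cap is real and not just a display quirk.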
For the 5600X there is little point going above the 76W default limit, especially if you use CO. Running -30 CO all-core and +50 PBO boost override, I get a multi-core boost in CB of 4.6-4.65GHz as long as the temp stays below 71-72C. If I set a 90W limit, consumption reaches about 85W and the all-core clock sits at 4.675-4.7GHz. That gains me about 200 points in CB23, but noise and heat go up significantly.
CB23 results:
- +200 PBO, CO -27/-29/-30x4, 90W PPT: 76C, 88W, 4.75-4.6GHz, 12050 pts, max voltage 1.34V, fan 1200 RPM
- +50 PBO, CO -30x6, 90W PPT: 74C, 85W, 4.7-4.675GHz, 12050 pts, max voltage 1.22V, fan 1200 RPM
- +50 PBO, CO -30x6, 76W PPT: 70C, 76W, 4.625-4.6GHz, 11850 pts, max voltage 1.22V, fan 1200 RPM
I use the latter: over sustained loads the lower temperature keeps clocks from drooping. My CPU drops 50-100MHz in all-core frequency when the temp climbs from 71C to 76C.
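Putting numbers on that tradeoff, here's a quick points-per-watt calculation using the three runs exactly as listed above:

```python
# Points-per-watt for the three CB23 runs listed above.
runs = [
    ("+200 PBO, 90W", 12050, 88),
    ("+50 PBO, 90W",  12050, 85),
    ("+50 PBO, 76W",  11850, 76),
]
for name, score, watts in runs:
    print(f"{name:16s} {score} pts / {watts}W = {score / watts:.0f} pts/W")
```

The 76W run works out to roughly 156 pts/W versus ~137-142 for the 90W runs, so the 90W config pays about 12% more power for under 2% more score.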