I have a grandson who does the same, but my points still stand. The computer still spends far more time at or near idle consumption than anywhere near maximum draw.
Plus I think it's important to point out that your price per kWh value is unrealistic. Even with rate increases rampant these days, the average cost per kWh in the US, as of this month, is 14.47 cents.
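Just to put that rate in perspective, here's a quick back-of-the-envelope sketch. The idle and gaming wattages are purely illustrative assumptions on my part, not measurements:

```python
# Rough illustration (not from the thread): annual electricity cost for a PC
# used 8 hours/day, at the ~14.47 cents/kWh US average cited above.
RATE_PER_KWH = 0.1447   # USD
HOURS_PER_DAY = 8

def annual_cost(watts):
    """Annual cost in USD for a constant draw of `watts`, 8 h/day."""
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * RATE_PER_KWH

# Assumed wattages, purely for illustration: ~60 W near idle, ~350 W gaming.
for label, watts in [("near idle", 60), ("heavy gaming", 350)]:
    print(f"{label}: {watts} W -> ${annual_cost(watts):.2f}/yr")
# near idle: 60 W -> $25.35/yr
# heavy gaming: 350 W -> $147.88/yr
```

And even that gaming figure is a worst case, since in reality the machine spends most of those hours well below full draw.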
As for wear and tear and noise: any difference in wear and tear is going to be negligible at best. Whether the computer is used 8 hours per day for normal office tasks or for gaming, there is little difference in wear and tear - except maybe on a hard drive. And while I personally really hate fan noise, the increased power consumption from a fan spinning faster is going to be insignificant too.
Even if a fan would otherwise be off, note that a 140mm case fan spinning at 2,000 RPM typically only pulls ~5 watts.
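At the national average rate that works out to pocket change over a year, assuming the same 8 hours/day as above:

```python
# Same assumptions as above: 5 W fan, 8 h/day, 14.47 cents/kWh.
print(f"${5 / 1000 * 8 * 365 * 0.1447:.2f}/yr")  # ~$2.11 per year
```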
I am NOT saying your points are totally invalid. I am just saying your numbers are unrealistic because the power consumption of a computer constantly varies widely and, with few exceptions, rarely sits at or near capacity for extended periods of time.
This is exactly why computer PSUs are so different from the power supplies in other electronics - and exactly why 80 PLUS certifications matter. A certified supply must meet minimum efficiency levels at 20%, 50%, and 100% of its rated load, which gives it a relatively "flat" efficiency curve across that whole range. Other supplies typically have a "bell shaped" curve, with peak efficiency at just one load point.
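To show what that flatter curve means in actual watts, here's a quick sketch. The 650 W rating is a hypothetical example, and the efficiency figures are the published 80 PLUS Gold (115 V) minimums - verify against the spec sheet for any real unit:

```python
# Wall draw = DC load / efficiency. Efficiency points below are the
# 80 PLUS Gold (115 V) minimums at 20/50/100% of rated load.
RATED_WATTS = 650  # hypothetical PSU rating
GOLD_MINIMUMS = {0.20: 0.87, 0.50: 0.90, 1.00: 0.87}

for fraction, eff in GOLD_MINIMUMS.items():
    dc_load = RATED_WATTS * fraction
    wall = dc_load / eff
    print(f"{int(fraction * 100):3d}% load: {dc_load:4.0f} W out, "
          f"{wall:4.0f} W from the wall, {wall - dc_load:3.0f} W lost as heat")
```

Note how the waste heat stays proportionally small across the whole range, instead of ballooning whenever the load drifts away from a single sweet spot.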
A big-screen TV, for example, puts a relatively consistent load on its power supply. So designers can pick a much less expensive power supply, simply match the load to the supply's peak efficiency point, and be good to go.