In fact, as a guy who has promoted using older 500-watt PCs as heaters here: why burn 500-1000 watts just to heat when you can get computation out of the same electricity at the same time? Essentially all of a PC's power draw ends up as heat in the room, so it heats just as well as a resistive heater of the same wattage.
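Rough numbers, if you want them. The wattage, electricity rate, and hours below are assumed placeholders, not measurements; the only physics being leaned on is conservation of energy (electrical draw ends up as heat):

```python
# Back-of-the-envelope: a PC as a space heater.
# Assumptions (placeholders, not measured figures):
#   - the PC draws a steady 500 W under load
#   - essentially 100% of electrical draw is dissipated as room heat
#   - electricity costs a flat $0.15 per kWh

draw_watts = 500          # assumed steady power draw under load
price_per_kwh = 0.15      # assumed flat electricity rate, $/kWh
hours_per_day = 8         # assumed daily heating/compute window

kwh_per_day = draw_watts / 1000 * hours_per_day
cost_per_day = kwh_per_day * price_per_kwh

# A 500 W resistive heater run for the same hours emits the same
# heat at the same cost -- the PC's computation is the only
# difference between the two bills.
print(f"Heat delivered: {kwh_per_day:.1f} kWh/day")
print(f"Cost: ${cost_per_day:.2f}/day, same as a {draw_watts} W resistive heater")
```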
As someone who has actually run consumer hardware at its limits, day in and day out, from both teams, as an unbiased engineer: my experience says "this is fine".
You haven't used said equation to prove you're right; you just keep mentioning it, as if no one else is aware of it. We are.
So is AMD; their science-based testing accounted for it too. So again: put up some evidence, do the math.
And you pulling the fanboi argument out of your pocket reeks of desperation. I suppose you could just keep repeating the same shit while trying to undermine others with bias tags, but know this: it reflects badly on your own biases, IMHO.
Oh, and as someone who has degraded CPUs through their own workloads, I'm actually OK with a CPU designed to deliver max performance within a useful lifespan. Moore's law dictated the pace before but got vague in the quad-core era; given modern performance increases, though, we should be back to about three years of useful life, with conditional, restricted use beyond that relative to the latest tech. I.e. the part should be surpassed in every metric within three years anyway (rough numbers in the sketch at the end of this post).
Usually migration took about three years to even get going, though obviously with effort you can do wonderful things... not.
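To put a number on the "surpassed within three years" claim, here's a quick sketch. The 26%/year compound improvement rate is an assumption I'm plugging in (it works out to roughly a doubling every three years), not a measured figure:

```python
# How long until a part is surpassed "in every metric", assuming a
# compound annual performance improvement rate. The 26%/year figure
# is an assumed placeholder (~2x every 3 years), not measured data.

annual_gain = 0.26  # assumed compound yearly performance improvement

relative_perf = 1.0
for year in range(1, 6):
    relative_perf *= 1 + annual_gain
    print(f"Year {year}: new parts are {relative_perf:.2f}x the launch part")

# At that rate, year 3 lands at ~2.0x -- the point where a
# max-performance-within-lifespan design has done its job.
```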