Not exactly.
Power consumption = heat generated. For a CPU that's essentially true by definition: the chip does no mechanical work, so if power is being used, that electrical energy is changing form into heat.
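To put some made-up numbers on that, here's a quick back-of-the-envelope sketch (the 65 W figure is just an example, not any real chip's spec):

```python
# Rough sketch: the electrical power a CPU draws all ends up as heat.
# 65 W is an illustrative figure, not a real spec.
power_watts = 65.0                   # P = V * I, electrical power drawn
seconds = 3600.0                     # run it for one hour

heat_joules = power_watts * seconds  # E = P * t, all dissipated as heat
print(f"Heat dissipated: {heat_joules / 1000:.0f} kJ")  # -> 234 kJ
```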
The potential drop across a component will only be different from the potential difference across the terminals of the supply if you have other elements in series.
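Here's a quick sketch of that series case (the resistor values are arbitrary, the point is that the drops sum back to the supply voltage):

```python
# Hypothetical series circuit: a 12 V supply across two resistors.
# Values are arbitrary; the point is Kirchhoff's voltage law.
v_supply = 12.0
r1, r2 = 3.0, 1.0               # ohms, in series

current = v_supply / (r1 + r2)  # same current through everything in series
v1 = current * r1               # drop across r1 -> 9 V
v2 = current * r2               # drop across r2 -> 3 V
print(v1 + v2)                  # 12.0: the drops sum to the supply voltage
```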
Voltage is measured in joules per coulomb. It's the amount of energy per unit of charge: the potential energy the electrons have, stored in their electric fields, because of their proximity to each other.
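In other words, moving Q coulombs of charge through a potential difference of V volts transfers Q * V joules. A tiny illustration (numbers made up, the 1.2 V is just a CPU-core-ish figure):

```python
# Voltage as energy per charge: E = Q * V.
charge_coulombs = 2.0                      # Q, charge moved
voltage = 1.2                              # V, joules per coulomb

energy_joules = charge_coulombs * voltage  # E = Q * V
print(f"{energy_joules} J")                # -> 2.4 J
```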
Think of rubber balls (electrons) all squashed together on one side, and rubber balls spread far apart next to them, separated by a one-way barrier.
When you create a path for the rubber balls to flow (completing the circuit), the balls push away from each other at the squashed end, travel down the path (the CPU), get slowed down along the way, and push into the spread-out rubber balls at the other end.
As they push into those balls at the other end, with some extra help, they shove those rubber balls back through the barrier into the rest of the squashed rubber balls.
The rubber balls themselves are not used up, they don't disappear or go anywhere, they just lose energy as they travel down the path.
I believe the potential difference across a CPU is controlled very tightly by the NB. If you increase the voltage (potential difference) across the CPU, it starts drawing more power and can overheat.
The CPU will have a power rating at a certain voltage. Increasing the voltage pushes the power draw up because more current flows through the chip at the higher voltage.
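In fact, for CMOS logic the switching power scales roughly with the square of the voltage (P ≈ C·V²·f), which is why even a small voltage bump costs noticeably more power. A minimal sketch with made-up numbers:

```python
# Sketch of CMOS dynamic power scaling: P ~ C * V^2 * f.
# C (switched capacitance) and f (clock) are made-up example values.
def dynamic_power(c_farads, v_volts, f_hertz):
    """Approximate CMOS switching power."""
    return c_farads * v_volts**2 * f_hertz

p_stock = dynamic_power(1e-9, 1.20, 3.0e9)  # ~4.3 W at 1.20 V
p_boost = dynamic_power(1e-9, 1.35, 3.0e9)  # same clock, +0.15 V

print(f"{p_boost / p_stock:.2f}x power")    # -> 1.27x from a 12.5% voltage bump
```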
The rate at which electrons can carry energy out of the CPU is limited and would be very small. But again, the energy they do carry out is "re-used" elsewhere in the circuit.
Getting into the details would be too involved, and I have to admit I don't understand it 100%, as it's been a long time since I studied it at Uni.
Edit - I found a water analogy illustration: