Well... if you game 24/7, that is. So it's probably more like 25 € at 2.4 hours a day, which is still fairly high.
The calculations were for MY case, but I've stated several times that it is NOT for 24/7 gaming (I don't know why everyone assumes that, or why you don't read the previous posts). It's for a mix that includes 3 hours of gaming, 8-10 hours of work (3D modeling + testing in a real-time engine), a few hours of idling and a bit of "offline" rendering on some nights. The actual time the GPU spends under heavy load comes to about 6 hours, because 3D modeling and the like don't load it as much as games do, so that's what I based the calculation on.
Simplifying my calculations a lot, it was something like this:
For every 1 W shown in W1zzard's chart:
1 W --> ~1.5 W at the wall, after accounting for ~80% PSU efficiency, the power factor of the house, etc. Remember, we're talking about what it costs you in the end, so it's important to know how much you'll actually end up paying for every watt the GPU draws. The electricity company isn't going to charge you for the clean DC watts the PSU delivers, nor even for the apparent consumption; they'll charge you for the actual consumption measured on their end.
1.5 W × 1 h/day × 365 days × 0.00025 €/Wh ≈ 0.137 € per year for each hour of daily use
Now the only thing left is the hours of use. In my case the active time (heavy load on the GPU) is roughly 6 hours a day, so:
0.137 * 6 = 0.822 € per watt per year
That's for the hours when the GPU is fully loaded*; on top of that we need to add what it draws at idle. That's a smaller amount, but it probably brings the total close to 1 €, so call it roughly 1 € per watt per year.
For someone who stresses the GPU only 2 hours a day it would be less, and for the lucky people in areas where they pay 5 cents per kWh it would be a lot less too.
* Loaded as in playing a game (e.g. Crysis), not 100% load as in Furmark; in other words, the chart that W1zz labels as "average consumption".
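Just so the arithmetic is easier to follow, here's the same estimate as a quick Python sketch. The 1.5× wall factor, the 0.25 €/kWh rate and the 6 h/day of load are my numbers for my case (and idle consumption isn't included here), so swap in your own:

[code]
# Rough yearly electricity cost per GPU "chart watt" -- my numbers, adjust to your case.

def yearly_cost_per_chart_watt(load_hours_per_day: float,
                               price_eur_per_kwh: float = 0.25,  # my local rate (0.00025 €/Wh)
                               wall_factor: float = 1.5) -> float:
    """Cost in € per year for every 1 W shown in W1zzard's average-consumption chart."""
    watts_at_wall = 1.0 * wall_factor                 # ~80% PSU efficiency, power factor, etc.
    kwh_per_year = watts_at_wall * load_hours_per_day * 365 / 1000
    return kwh_per_year * price_eur_per_kwh

# My case: roughly 6 hours/day of heavy GPU load
per_watt = yearly_cost_per_chart_watt(load_hours_per_day=6)
print(f"{per_watt:.3f} € per chart watt per year")    # ~0.82 €, matches the 0.822 above

# Example: a card that draws 150 W more than another one in the chart
print(f"{150 * per_watt:.0f} € per year difference under load")   # ~123 €
[/code]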
I'd mention hot rooms in the summer before electricity.
Spending on air conditioning due to the higher room temps was included in the calculation I made 2 years ago. It's more than people would think.