That would be nice if it were true, but I don't think nvidia would spend money implementing and building in a performance-limiting feature (frame rate drops when it kicks in) just to put a certain idea in people's minds.
As W1zzard said to me earlier and as you did just now, the card can consume any amount of power and run just fine with it, as long as the power circuitry and the rest of it are designed for it.
And that's the rub.
Everything is built to a price. While those POWER computers are priced to run flat out 24/7 (and believe me, they really charge for that capability), a power-hungry consumer-grade item, even the expensive GTX 580, is not. So the card will only gobble huge amounts of power for any length of time when an enthusiast overclocks it and runs something like FurMark on it. Now, how many of us do you think actually do this? Only a tiny handful. Heck, even among enthusiasts, only some will ever bother. The rest of us (likely me included) are happy to read the articles about it and avoid unnecessarily stressing our expensive graphics cards. I've never overclocked my current GTX 285, for example. I did overclock my HD 2900 XT, though.
The rest of the time, the card will be either sitting at the desktop (hardly taxing) or running a regular game at something like 1920x1080, which won't stress it anywhere near that hard. So nvidia are gonna build it to withstand that average load reliably; push it much harder for long and reliability drops significantly.
The upshot is that they're gonna save money on the quality of the PCB used for the card, its power components, and all the other parts that take the strain when it's taxed at high power. This means that Mr Enthusiast over here at TPU is gonna kill his card rather more quickly than nvidia would like and generate lots of unprofitable RMAs. Hence, they just limit the performance and be done with it. Heck, it also helps guard against the clueless wannabe enthusiast who doesn't know what he's doing and maxes the card out in a hot, unventilated case.
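For anyone curious what "limit the performance" could look like under the hood, here's a rough sketch of a throttle loop. To be clear, this is pure guesswork for illustration: the 300 W limit, the sensor read, and the clock-scaling call are all made-up names and values, not nvidia's actual driver code.

```python
import time

# Hypothetical sketch of a driver-side power limiter's control loop.
# Every name, threshold, and sensor call here is an assumption for
# illustration -- NOT NVIDIA's real implementation.

POWER_LIMIT_W = 300   # assumed board power ceiling
HYSTERESIS = 0.9      # un-throttle only once draw falls below 90% of the limit

def power_limiter(read_board_power_w, set_clock_scale, poll_s=0.1):
    """Throttle clocks whenever measured board power exceeds the limit."""
    throttled = False
    while True:
        draw = read_board_power_w()     # e.g. current sensors on the 12V inputs
        if not throttled and draw > POWER_LIMIT_W:
            set_clock_scale(0.5)        # halve clocks -> frame rate visibly drops
            throttled = True
        elif throttled and draw < POWER_LIMIT_W * HYSTERESIS:
            set_clock_scale(1.0)        # restore full clocks once draw settles
            throttled = False
        time.sleep(poll_s)
```

The hysteresis band is there so the card wouldn't flicker back and forth between throttled and full speed when the draw hovers right around the limit, which matches the steady frame-rate drop people report when the limiter kicks in under FurMark.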
Of course, now that there's a workaround, some enthusiasts are gonna use it...
And dammit, all this talk of GTX 580s is really making me want one!!