I feel the need to clear some things up, so here we go.
The GTX 750 Ti is a Maxwell card built on 28 nm and consumes about 55 W while gaming.
If it were a 20 nm card, it would probably consume around 35 W, considering both the power savings of the shrink and its penalties.
If you take the GTX 750 Ti's performance and double it, you get something around a GTX 770 (according to TPU). So 35 W x 2 + penalty = an ~80 W card at GTX 770+ level.
I don't see how a card with about twice the GTX 770's performance (the GTX 750 Ti's 640 shaders times four, so 2560) at about 180 W would be impossible. Add another 750 Ti-sized block of shaders on top of that and what you would get is roughly a 210 W card with 3200 shaders. You could limit it with a 256-bit bus and pair it with 4 GB of memory.
This possibility is far from being a unicorn. At 180 W we will most likely get something that beats the GTX 780 Ti without much effort, and at 230 W quite possibly something that goes even further, a lot further.
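For what it's worth, here is that napkin math as a quick Python sketch. The 15% scaling penalty is a number I'm assuming just to roughly match the figures above; the shader counts and wattages are this post's speculation, not measured data.

# Back-of-envelope scaling from the post above.
# All figures are speculation, not measurements; PENALTY is assumed.

GTX_750TI_SHADERS = 640
POWER_20NM = 35.0   # W, speculated 20 nm draw of one 750 Ti-sized block
PENALTY = 1.15      # assumed overhead for gluing blocks together

def speculative_card(blocks):
    # Scale shaders and power linearly with the number of 750 Ti-sized blocks.
    return GTX_750TI_SHADERS * blocks, POWER_20NM * blocks * PENALTY

for blocks, label in [(2, "~GTX 770 level"), (4, "~2x GTX 770"), (5, "3200-shader card")]:
    shaders, watts = speculative_card(blocks)
    print(f"{blocks} blocks: {shaders} shaders, ~{watts:.0f} W ({label})")

That lands close to the 80 W and 210 W figures above; the 180 W estimate implies a somewhat bigger penalty for the 4-block card.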
I don't get people here.
Where do you get any of that information?
http://www.techpowerup.com/reviews/ASUS/GTX_750_Ti_OC/23.html
There is a huge increase in core voltage needed to attain the last bit of clock speed, and the cost becomes exponential as the core gets hotter and leakage increases. Look at any GPU review and you will notice this. The only exception to this rule is the minimal switching power required at low core speeds, and the 750 Ti does a good job there: at lower clocks the leakage is so minimal that the 0.95 V it runs at barely shows up in real power consumption, and that is only possible on this core because it is so small, with so few shaders. If we attempted this with a larger core, the power drop for the first increase in clock speed would cause the hardware to fail during the clock-state transition.
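To put a number on why the last bit of clock speed is so expensive: switching power scales roughly as C x V^2 x f, so power grows with the square of voltage on top of the linear growth with clock. A minimal sketch follows; the voltage/frequency points are invented for illustration, not 750 Ti data, and leakage (which rises with heat) would make the real curve even steeper.

# Illustrative only: dynamic power ~ C * V^2 * f.
# The V/F curve below is hypothetical, not from any real card.

VF_CURVE = [
    (1000, 0.95),  # MHz, volts
    (1200, 1.05),
    (1300, 1.15),
    (1400, 1.30),
]

base_mhz, base_v = VF_CURVE[0]
base_power = base_v**2 * base_mhz  # capacitance constant cancels out

for mhz, volts in VF_CURVE:
    rel_power = (volts**2 * mhz) / base_power
    print(f"{mhz} MHz @ {volts:.2f} V -> {mhz / base_mhz:.2f}x clock "
          f"for {rel_power:.2f}x power")

On this made-up curve a 40% clock bump costs roughly 2.6x the switching power, which is the exponential-looking wall every GPU review shows at the top of the overclocking range.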
Now we have two options here. We can say: hey, look at the 750 Ti, it costs $150, and the R7 265 also costs $150.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_750_Ti/28.html
Even overclocked it doesn't reach the R7, which is two-year-old silicon. So bad on them. And if you are concerned about a few watts' difference at idle and at full load, why are you even talking about a high-power GPU?
The other option is to say the 750 Ti is a great first foray for NVIDIA to test something new, and despite its lack of GPU power for any serious use since 2009, I hope they learn from it. But power-consumption speculation and raw GPU power have been centered around the 300 W ceiling for both companies for the last few generations, and I don't see any reprieve there. All I can see is that the shrink should bring power consumption per transistor down and allow for more transistors and more raw compute power, while core improvements and features will no doubt improve IPC. Anything more than this is buffalo chips.