Well, that's not "bizarre". Cards ship with preset limits for thermals and power draw.
Increase the voltage and you'll hit the thermal limit. -> Cool the card better to avoid thermal throttling. -> Now you hit the power limit. -> Raise the power limit. -> You're back at the thermal limit again, or at a noise level you can still put up with.
That's how modern graphics card overclocking goes. The thing is, today the card protects itself with all of these safeguards; in the past you had to figure out for yourself which part you were stressing too hard. That actually makes modern cards easier to work with, if we ignore the fact that you usually have to fiddle with the BIOS to really tweak them properly.
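If you want to see which safeguard you're actually bouncing off, the driver will tell you. Here's a minimal sketch, assuming an NVIDIA card and Python with the nvidia-ml-py bindings installed (`pip install nvidia-ml-py`); the exact throttle-reason constants can vary a bit between driver/library versions:

```python
# Read temperature, power draw, the enforced power cap, and the reasons the driver
# is currently holding clocks back -- i.e. which safeguard is biting right now.
# Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) bindings.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0          # NVML reports milliwatts
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # current power cap

print(f"GPU temp: {temp_c} C, power: {power_w:.1f} W / cap {limit_w:.1f} W")

# Bitmask of active throttle reasons -- the "which limit am I hitting" question.
reasons = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(handle)
if reasons & pynvml.nvmlClocksThrottleReasonSwPowerCap:
    print("Throttling: power limit")
if reasons & (pynvml.nvmlClocksThrottleReasonHwThermalSlowdown
              | pynvml.nvmlClocksThrottleReasonSwThermalSlowdown):
    print("Throttling: thermal limit")

pynvml.nvmlShutdown()
```

Run it while the card is under load, otherwise you'll mostly just see it sitting idle below both limits.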