We have to connect the cards properly because they've been designed for a given power draw. Even though you can undervolt them through software, you still need to account for the default power use when sizing your PSU. So for example, if you don't have 3x 8-pins on your PSU, or the power to feed 3x 8-pins, you shouldn't be thinking about a 5080.
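To put rough numbers on that, here's a minimal Python sketch. The 75 W slot and 150 W per 8-pin figures are the spec-level continuous limits; the 3-plug, ~360 W card is just an assumed example, not a confirmed 5080 figure.

```python
# Back-of-the-envelope check: does the PSU physically offer the plugs the card's
# connector (or adapter) demands, and does the slot + connector budget cover the
# card's *default* rating? Spec-level limits: PCIe slot 75 W, 8-pin PCIe plug 150 W.
# The card figures below (3 plugs, ~360 W) are assumptions for illustration.

SLOT_W = 75
EIGHT_PIN_W = 150

def psu_ok(card_rated_w: int, plugs_required: int, plugs_on_psu: int) -> bool:
    if plugs_on_psu < plugs_required:   # missing physical connectors: hard stop
        return False
    budget = SLOT_W + plugs_required * EIGHT_PIN_W
    return budget >= card_rated_w       # judged against default power, not an undervolt target

print(psu_ok(card_rated_w=360, plugs_required=3, plugs_on_psu=2))  # False: can't even plug it in
print(psu_ok(card_rated_w=360, plugs_required=3, plugs_on_psu=3))  # True: 75 + 450 = 525 W budget
```

The point of the first check is that software plans (undervolting) never remove the hardware requirement: if the plugs aren't there, the card isn't an option.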
100% agree, I feel like this was my point...
Edit: "But you can undervolt it" is an argument I usually get when I speak up against modern GPUs consuming enormous amounts of power. The example shows that it's not a good argument.
Disagree. The example just shows you need to connect it as it demands to be connected, basic hardware compatibility.
Then you can do what you like: run stock, undervolt, overclock, etc. I don't see any connection between a card being properly connected to the PC and how you then choose to run it.
GPUs consuming more power than before is true, and I agree "but you can just undervolt them" is not a good counter to that either, but I see zero correlation with connecting it properly. You need a PSU that can handle the card at its default TDP: typically in outright wattage, with perhaps a bit of wiggle room depending on how you intend to run it, but 100% required in terms of the physical connectors present. Planning otherwise, even if you fully intend to drastically undervolt, would be foolish at best.
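As a rough illustration of that sizing logic (a hedged sketch; the 20% headroom and the example wattages are assumptions, not recommendations):

```python
# Illustrative PSU sizing against the card's default TDP, never an undervolted target.
# The 20% headroom and the example wattages are assumptions, not recommendations.

def min_psu_watts(gpu_tdp_w: int, rest_of_system_w: int, headroom: float = 0.20) -> float:
    """Minimum PSU rating: full default draw plus a margin for transients."""
    return (gpu_tdp_w + rest_of_system_w) * (1 + headroom)

print(min_psu_watts(gpu_tdp_w=360, rest_of_system_w=250))  # 732.0 -> roughly a 750 W class unit
```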
Every single Nvidia card I have ever had or worked with disagrees with you.
Don't quote me, but I believe it's been tested, and perhaps even confirmed by Nvidia, that the 4090 draws essentially no power from the PCIe slot (rather than the circa 70-75 W it could). It shouldn't be hard to find articles on; I'll take a look. If that is the case, it'd stand to reason other 40-series, and perhaps now 50-series, cards operate the same.