Nvidia's power claims for the connector need to be taken with an enormous pinch of salt. Remember that this is not an official connector covered by any PC standard, so Nvidia can make up whatever numbers they like.
To put this in perspective: the 6-pin PCI-E connector is rated at 75W by PCI-SIG. If you took two of them and glued them together to make a 12-pin, it would be rated at 150W, and if you then shrunk it to 70% of its original size it would be rated at 105W. That is the exact process Nvidia used to "invent" this new 12-pin. The only reason they claim 600W+ is that it doesn't have to conform to any standards, which is probably one of the reasons none of the AIBs are using it.
... that's not how this works. The pins and housing used (Micro-Fit rather than the Mini-Fit Jr. used for PCIe plugs) are
entirely standardized. Just because they haven't been adopted for a PC power delivery standard before this doesn't mean that Nvidia pulled these plugs out of their collective rear ends. In this case, the Micro-Fit pins are rated for 9A (though versions exist for lower amperages as well as a 10.5A version). That's where the number comes from: 9A * 12V * 6 pins = 648W total power delivery for a connector with 6 +12V and 6 ground pins. This of course requires a suitable power source as well as wiring thick enough to handle that amount of current per pin, which means some
really thick cabling even over a relatively short ATX PSU cable run. Most PSU cabling is 18 AWG, which would heat up noticeably and suffer significant voltage drops if asked to carry 9A: according to this calculator, a .23V round-trip drop over a 60cm run or .38V over 1m, with the entirety of that drop being converted to heat in the wire. That works out to .23 * 9 ≈ 2W per +12V/ground pair (or .38 * 9 ≈ 3.4W at 1m), so across all six pairs you're dumping roughly 12-20W of heat into the cable alone - good luck cooling that. That would make for some very melty connectors, so you'd need thicker gauge wiring to rectify it. 14 AWG would mean just a .15V drop over 1m, which is a lot more tolerable, but that's also a wire with more than 2x the copper cross-section, i.e. a very thicc boi. Now imagine 12 of those... yeah, that's not a cable you want to manage inside a cramped case.
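If you want to sanity-check those figures yourself, here's a rough Python sketch. The resistance values (~21 mΩ/m for 18 AWG, ~8.3 mΩ/m for 14 AWG) are typical room-temperature numbers for copper that I'm assuming rather than pulling from any spec, and the drop is computed over the full out-and-back path of each +12V/ground pair:

```python
# Rough sanity check of the connector and cable numbers above.
# Resistance values are assumed typical room-temperature figures for copper wire.
PIN_CURRENT_A = 9.0            # Molex Micro-Fit pin rating used for the 12-pin
RAIL_VOLTAGE_V = 12.0
POWER_PAIRS = 6                # 6x +12V pins, each returning through a ground pin

# Connector power budget: 9 A * 12 V * 6 power pins
print(f"connector budget: {PIN_CURRENT_A * RAIL_VOLTAGE_V * POWER_PAIRS:.0f} W")  # 648 W

OHM_PER_M = {"18 AWG": 0.021, "14 AWG": 0.0083}   # assumed wire resistance per metre

def pair_loss(awg, run_length_m):
    """Round-trip voltage drop per +12V/ground pair and total heat across the bundle."""
    v_drop = PIN_CURRENT_A * OHM_PER_M[awg] * run_length_m * 2   # out and back
    heat_w = v_drop * PIN_CURRENT_A * POWER_PAIRS                # all six pairs
    return v_drop, heat_w

for awg in ("18 AWG", "14 AWG"):
    for run_m in (0.6, 1.0):
        v, w = pair_loss(awg, run_m)
        print(f"{awg} @ {run_m} m: {v:.2f} V drop, ~{w:.0f} W of heat in the cable")
```

Running that reproduces the ~0.23V/0.38V drops for 18 AWG and the ~0.15V drop for 14 AWG at 1m, along with the total heat dumped into the bundle.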
So yes, Nvidia absolutely has to conform to standards, and your reasoning is completely off. The reason AIB partners aren't using the connector is likely a combination of a) cost (a new connector = higher price, new supply chain, new tooling, new PCB design guidelines, etc.), b) experience/convenience (more than a decade of using PCIe connectors makes them simple and secure to implement), and c) necessity. When you can copy+paste a 3x8-pin power entry design from a few years back into your upcoming power hog flagship, why take on the expense of a new connector that will just eat into your margins while also forcing you to bundle an (again, expensive) cable adapter, since nobody has a PSU with this connector on it? It isn't necessary unless you're building a tiny PCB or need more than the cumulative 525W that 3x150W 8-pins + 75W from the slot can deliver. And the 3090 isn't
that power hungry.
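To put numbers on that last point, a quick sketch: the 150W per 8-pin and 75W per slot figures are the PCI-SIG ratings quoted above, and the ~350W board power is just a rough reference for where a stock 3090 sits.

```python
# Power budget of a conventional 3x 8-pin + slot layout vs. the new 12-pin.
EIGHT_PIN_W = 150                        # PCI-SIG rating per 8-pin PCIe connector
SLOT_W = 75                              # PCIe x16 slot contribution
TWELVE_PIN_W = 9 * 12 * 6                # 9 A per pin * 12 V * 6 power pins = 648 W

legacy_budget_w = 3 * EIGHT_PIN_W + SLOT_W   # 525 W
board_power_w = 350                          # roughly where a stock 3090 sits

print(f"3x 8-pin + slot: {legacy_budget_w} W")
print(f"12-pin alone:    {TWELVE_PIN_W} W")
print(f"headroom on the legacy layout: {legacy_budget_w - board_power_w} W")
```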