I hear your conspiracy theory...and raise you some common sense.
Let me be less obtuse. If AMD or Nvidia were to release a card, do you think they'd want 20% of its performance left on the table? No. Common sense says they'd release a card showing 100% of its capability and charge for it. The other bit of common sense is that manufacturing has variation. The reason they produce cards and specify them to a performance level is that they can be reasonably assured the silicon will run at that level. In practice you set a minimum level, bin anything below it as bad, and everything at or above it as good. I.e., if the cut-off is 2.8 GHz and the three cards in question can run at 2.7, 2.9, and 3.1 GHz, you pass 2 of 3 with them locked to 2.8 GHz, and fuse off the bad bits of the third to make the next product down in the stack.
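If it helps, here's a minimal Python sketch of that binning logic. The 2.8 GHz cut-off and the sample clocks are just the made-up numbers from the example above, not real bin limits:

```python
# Minimal sketch of the binning logic described above.
# The 2.8 GHz cut-off and the sample clocks are the made-up
# example numbers from this post, not real bin limits.

RATED_CLOCK_GHZ = 2.8  # minimum stable clock to ship as the full product

def bin_chip(max_stable_clock_ghz: float) -> str:
    """Return which product tier a die lands in."""
    if max_stable_clock_ghz >= RATED_CLOCK_GHZ:
        # Passes: shipped locked to the rated clock; any extra
        # headroom (e.g. 2.9 or 3.1 GHz) is left for overclockers.
        return "full product @ 2.8 GHz"
    # Fails: bad units get fused off and it becomes the next SKU down.
    return "cut-down product"

for clock in (2.7, 2.9, 3.1):
    print(f"{clock} GHz -> {bin_chip(clock)}")
```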
The thing is, this used to have a lot of variation. "Golden" chips overclocked like crazy, and the internal cut-off was set at whatever point let production meet demand. So depending on what you got, you could overclock to 3.1 GHz, or be stuck at the rated 2.8 GHz. The "conspiracy" is that with better processes and more consistent output, those same cards now come out at 2.7, 2.8, and 2.9 GHz...so a 2.8 rating leaves almost no manufacturing-variation swing as overclocking headroom. Better manufacturing means they come out of the box clocked higher, with less OC headroom.
So...yeah. It's not a conspiracy so much as improvements in production removing inefficiencies, leaving less performance on the proverbial table for you to claw back by overclocking. That sucks...but it's not something to whine about. Note that Nvidia states their cards run at 575 watts. They can pull 900 watts for 1 ms...or 1,000,000 ns. This is all fine and dandy...except they can do this very often during load spikes...and it happens over a connector where those 900/575 watts are not forcibly balanced across the conductors, so the current can (and apparently does) pile onto a few of them...exceeding the 9.2 amps each pin is theoretically rated for under presumably ideal conditions. That's less a failure of manufacturing than somebody designing a connector that wasn't meant for what it experiences in practice.
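To put rough numbers on the per-conductor issue, here's a back-of-the-envelope sketch. It assumes a 12 V rail, six current-carrying pins, and uses the 9.2 A per-pin figure from above; the "only two pins carry the load" case is just an illustration of an unbalanced spike, not measured data:

```python
# Back-of-the-envelope current math for the connector complaint above.
# Assumptions: 12 V rail, 6 current-carrying 12 V pins, and the 9.2 A
# per-pin rating mentioned in the post.

RAIL_VOLTS = 12.0
PINS = 6
PIN_RATING_AMPS = 9.2

def per_pin_current(watts: float, pins_sharing_load: int) -> float:
    """Current per pin if the load splits evenly across that many pins."""
    return watts / RAIL_VOLTS / pins_sharing_load

for watts in (575, 900):
    balanced = per_pin_current(watts, PINS)
    # Illustrative unbalanced case: only 2 pins actually carrying the load.
    unbalanced = per_pin_current(watts, 2)
    print(f"{watts} W: balanced {balanced:.1f} A/pin, "
          f"2-pin worst case {unbalanced:.1f} A/pin "
          f"(rating {PIN_RATING_AMPS} A)")
```

Even perfectly balanced, a 900 W spike works out to about 12.5 A per pin, already past that rating; unbalanced, it gets ugly fast.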
Ahh...but there are those who still believe the engineers specified everything perfectly. Again, engineers of vehicles didn't (historically) design them to be destroyed. This is how the Ford Pinto became a thing, and it's one of the easiest cases where some common sense would have fixed everything. Maybe a bolt sticking proud from a hole right next to a gas tank is a bad idea, just like taking a connector designed for a 600 watt balanced load and running an unbalanced 900 watts through it was silly. As was stated way back, if they'd installed overcurrent protection via a shunt resistor on each conductor, they'd have detected this and kept the connectors from burning...but magical engineering specs that assume situations other than what happens in reality don't usually fix issues unless they overspec things to an insane degree...and that would cost more money, when the goal for this connector was to save money. That's a no-no of common sense.
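For what the per-conductor shunt idea would look like in logic terms, here's a rough sketch. The per-pin readings, the 9.2 A threshold, and the "throttle/shut down" reaction are all hypothetical placeholders, not anything Nvidia actually ships:

```python
# Rough sketch of per-conductor overcurrent protection: one shunt
# resistor per 12 V pin feeding a current reading, checked against a limit.
# The readings, threshold, and response below are hypothetical.

PIN_LIMIT_AMPS = 9.2

def check_conductors(per_pin_amps: list[float]) -> bool:
    """Return True if every monitored pin is within its rating."""
    for pin, amps in enumerate(per_pin_amps):
        if amps > PIN_LIMIT_AMPS:
            print(f"pin {pin}: {amps:.1f} A exceeds {PIN_LIMIT_AMPS} A "
                  "-> throttle or shut down")
            return False
    return True

# Example: a badly unbalanced 900 W spike (75 A total) where two pins
# carry most of the current.
print(check_conductors([2.1, 2.0, 1.9, 2.2, 33.0, 33.8]))
```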