The temperature of the GPU itself is low because it has a good cooler sitting on top of it that can move the heat away from the card and into the surrounding air.
This means the temperature of your room will go up -> "High heat _output_"
Does that make sense?
Ok... So what this is saying is that the aftermarket cooler is more than capable of providing adequate heat dissipation for the card.
To define it further: the card itself has a high TDP (in the literal sense of thermally dissipated power, not the black-magic math Intel/AMD use to arrive at their quoted numbers).
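To make the distinction concrete, here's a minimal sketch of the steady-state thermal model being described: die temperature depends on both the power dissipated and the cooler's thermal resistance, while the heat dumped into the room depends only on the power. All the numbers (300 W, the C/W figures) are illustrative assumptions, not measurements of any real card.

```python
def gpu_die_temp(ambient_c, power_w, cooler_resistance_c_per_w):
    """Steady-state die temperature: T_die = T_ambient + P * R_thermal."""
    return ambient_c + power_w * cooler_resistance_c_per_w

# Same 300 W card, same 25 C room -- only the cooler differs.
beefy_cooler = gpu_die_temp(25.0, 300.0, 0.10)  # strong cooler: cool die
weak_cooler = gpu_die_temp(25.0, 300.0, 0.25)   # weak cooler: hot die

print(f"beefy cooler die temp: {beefy_cooler} C")
print(f"weak cooler die temp:  {weak_cooler} C")

# Either way, the full 300 W ends up heating the room. A low die
# temperature means good dissipation, not low heat output.
```

In other words, "low GPU temperature" and "high heat output" aren't in tension: the cooler only changes where the heat sits momentarily, not how much of it ends up in your room.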
That makes a lot more sense...and seems significantly less of a qualified statement.
To that end, you commented that basically all of the coolers demonstrated admirable results (admirable is how I put it; "adequate for the situation" may be less optimistic but more accurate). That said, is this a function of overbuilt designs, or a function of lessons learned? I'm asking because a generation ago the 5700 models were prone to thermal issues. I know this is more of an opinion than a factual ask, but I also haven't had the opportunity to see many of the new cards. If AMD's thermal performance figures are pushing board partners to over-design these coolers to compensate for old sins, that would be an amusing consideration...beyond the usual rah-rah fanboy love and hate. It would, at least to me, show some learning going on.