I think I'm starting to get more irritated at the disinfo spread by derisive commentary than at any absurd marketing.
If the audiophile marketing label were removed, and replaced with 'Minimal EMI Design' would y'all calm down?
At least some of the 'advertised improvements' are worthwhile in specific applications beyond the 'audiophile' ones. Home/DIY lab work and device development come to mind. Hams have had to go to extreme lengths to silence interference as well*.
*You remember Linksys getting in trouble with the FCC over the WRT54 series? It was the ham operator community that brought the complaints. You'd think the Wi-Fi radio was the issue, right? While it was exceeding power limits, it was largely the circuits for the wired leads that were broadcasting the interference.
You buy stuff like this either because you have more $ than mind, *or* because you have an identifiable need which might otherwise point you towards even less available and more expensive components.
I can hear differences in USB cables; that's why I use a Chord USB cable rather than a generic one. Yes, everyone will tell you a digital cable doesn't make a difference: it either works or it doesn't. I'm a firm believer that better cables do make a difference. Even isolation feet, things you would usually ignore, actually bring out a difference in sound as well. Every little thing brings out a difference in sound. That's why hi-fi is frustrating to those who can hear a difference, and it's why you see many changing out their equipment often, much more often than someone who upgrades their PC components.
It sounds like snake oil, but even I can see and hear differences between different HDMI cables on my brand-new Samsung QLED TV. I went with AOC (active optical) HDMI cables in the end, as they delivered a punchier, more vibrant, and smoother picture. You can say all of these are a waste of money; some even say it's money well spent even if the gains are small.
Everything you mentioned (except MAYBE the perceived HDMI change*) can be tested and reproduced, though you'd need test equipment and sensors across several 'disciplines'. Even the capacitance of a shielding layer or static buildup can have a definable effect in seemingly unrelated or disconnected systems.
"Digital" is almost always communicated using high frequency analog waveforms and differential signalling.
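That point can be sketched with a toy model (purely illustrative; the values and wire names are made up, not any real PHY's behavior): the receiver of a differential pair takes the difference between the two wires, so interference that couples equally into both cancels out.

```python
import random

# Toy model of differential signalling (the scheme used by USB, HDMI
# TMDS, and Ethernet pairs). Levels here are hypothetical.
random.seed(0)

bits = [1, 0, 1, 1, 0, 0, 1, 0]   # payload
swing = 0.4                        # assumed differential swing, volts

decoded = []
for bit in bits:
    level = swing if bit else -swing
    emi = random.uniform(-2.0, 2.0)      # noise hitting BOTH wires equally
    v_plus = +level / 2 + emi            # wire D+
    v_minus = -level / 2 + emi           # wire D-
    recovered = v_plus - v_minus         # receiver subtracts: EMI cancels
    decoded.append(1 if recovered > 0 else 0)

print(decoded == bits)   # True: common-mode noise never flips a bit
```

Noise that couples *unequally* into the two wires (or exceeds the receiver's common-mode range) does get through, which is where cable construction and shielding start to matter.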
That said, expensive audiophile and professional studio media-production cabling has less to do with signal integrity and more to do with rejecting EMI/RFI, reducing ground loops, and damping electromechanical 'noise'.
Every time I hear/read the phrase "Digital works, or it doesn't", I think about my experience with HDMI cables that would crash displays and occasionally a PC when you walked by them.
I've also experienced digital links 'malfunctioning' rather than 'work, or doesn't'.
*An HDMI link that induces EMI into the TV, or a lossy link with error correction, theoretically might cause such changes in perception. The amount of postprocessing done inside 'the black box' of the scaler/image processor could make errors appear as changes in the image. I used to have an HDTV that would store a frame and slowly start to 'overwrite' the live input, altering color, etc. By all accounts it shouldn't have been possible, but it happened every day I used it as a PC display.