Actually, you're wrong on a couple of things. Firstly, I DO get to decide whether I think a lack of VRAM on a card is detrimental, and I DO get to decide whether I want to tell people about it on a forum that exists for that very purpose.
If something is obviously true, there is no such thing as a strawman argument supporting it. An argument can only be a strawman if it supports a falsehood; that's why it's called a strawman in the first place, because there's nothing holding it up except false rhetoric. Saying that a card without enough VRAM is a bad purchase is not false rhetoric, it's an obvious truth. Therefore it's not an argument made of straw but an argument made of granite.
The reason it's especially relevant today is that we're in a pretty specific situation: a lot of people who bought last-gen cards paid double or more what the previous generation would have cost them. Having paid that much makes every shortcoming on these cards all the more egregious, especially shortcomings that limit the cards' longevity.
If you suddenly had to pay twice the historical norm for something, wouldn't you be all the more angry that the company that made it had knowingly done something to severely limit its useful life? I sure would, and I don't think that's a strawman argument. Sure, in the days of the RTX 20 and RX 5000 cards it wasn't nearly as bad, because cards were priced far more sanely than in the storm that followed, so I'm guessing that's the nuance you're talking about.
The nuance that I'm talking about is people paying twice as much (or more) for cards whose useful lives have been artificially shortened by small VRAM buffers. I've had the same thing happen to me twice (the insufficient VRAM, not the paying double): once with the HD 7970, because the GPU was strong enough to use more than its 3GB of VRAM, and again with the R9 Fury. I knowingly bought the Fury because the first mining craze of 2017 was on and I was stuck with that 3GB HD 7970, since CrossFire had essentially been dropped. Because the Fury had a 275W TDP, it wasn't suitable for mining, so it cost half as much as the RX 580 despite being faster. I decided the low price and high performance for the money were worth it despite the low VRAM, because everything else was literally double the price or more. Greg Salazar did a video about it:
However, the lesson I took from my long-term experience with the R9 Fury, as long-lived as it was, is to never again choose a card, especially a high-end card, with significantly less VRAM than its rival. The Fury has had decent longevity, but not at anything above 1080p, and if I had paid full price for it, I probably would've felt cheated.
My whole point is based on altruism and empathy, because I'm not suffering from a lack of VRAM myself; I have an RX 6800 XT, and 16GB is not lacking. I'm saying what I'm saying so that people who have been screwed like this know that it happened, and why, so they'll see it coming next time and avoid it. I remember pulling my hair out because my HD 7970 had enough GPU horsepower to play a specific game (I don't remember which game, it's been so long) but the 3GB of VRAM just ruined the whole experience. I just don't want others to have to live with that same frustration. I don't see how that makes me a bad person... oh yeah, because it doesn't. I don't know why anyone would push back against this unless they've made this mistake themselves and are either in denial or know that it's true and are just being belligerent because they don't want to admit that they messed up.
There's no shortage of people like that on the internet and you know it, especially among the younger crowd.