I don't have any interest in playing games at Medium; I play games at Ultra on two systems. I have two good cards (an RTX 3080 and an RTX 2070 SUPER). The latter drives a 165Hz 1440p monitor. Pretty good. The former drives a 4K LG OLED TV. Looks great.
I don't care about you flexing your poor financial decisions, but tell me, why exactly do you dislike Medium settings so much? It's been known in enthusiast circles for over a decade that Ultra settings often give next to no visual benefit over High settings while often cutting the framerate in half. And in the last 5 years, many games look quite good even on Low, which is even more reason not to care about Ultra settings. The only time they're good for anything is once a game is old enough that you can max it out after getting another graphics card; otherwise Ultra is just too hard to justify.
The RX 550 is used in a productivity system; it's not worth it for gaming. With only 2GB of VRAM, it is by definition not a card for Ultra gaming. It's the entry-level model of the RX 500 generation. The RX 580, on the other hand, was a capable gaming card when it was released 4+ years ago.
Not sure what else you expected. It was marketed as one and delivers as marketed. Nothing wrong with the card itself.
If I were really interested in gaming at Medium, hell, I'd pick up a PS4 Pro instead. Heck, I could get 4K/120 gaming from a $500 PS5 right now if I wanted to. One thing's for sure: right now $500 isn't going to buy a PC graphics card that will let you play at 4K.
Consoles imo are off topic on PC sites. A console is not a PC; even if it can run games, it's not the same. You can't use MS/KB with a console, many games never come to consoles, there's a strong preference to increase graphics at the cost of fps, backwards compatibility is often poor, you don't own your own games, DRM owns you, TCO is a lot higher because games are more expensive on consoles, online multiplayer still requires a paid service, consoles almost never have free titles, consoles don't have emulators, and consoles often have design flaws that you aren't supposed to fix yourself.

Simply put, a console isn't a PC and a PC isn't a console; they're two entirely different things that aren't directly comparable. And a very important thing that people often miss is that owning a console ends up being a lot more expensive than owning a computer, due to multiplayer subscriptions and games being more expensive than on PC. So the console itself might be cheap, but the rest isn't.
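As a rough back-of-envelope sketch of that last point (every number below is an illustrative assumption for the sake of the argument, not an actual price), the arithmetic looks something like this:

```python
# Purely illustrative 5-year cost-of-ownership comparison.
# All figures are assumptions, not real prices.

YEARS = 5
GAMES_PER_YEAR = 6

console = {
    "hardware": 500,                       # assumed console price
    "online_sub": 60 * YEARS,              # assumed paid multiplayer subscription per year
    "games": 60 * GAMES_PER_YEAR * YEARS,  # assumed average price per console game
}

pc = {
    "hardware": 900,                       # assumed gaming PC build
    "online_sub": 0,                       # multiplayer is typically free on PC
    "games": 35 * GAMES_PER_YEAR * YEARS,  # assumed average price with PC sales/bundles
}

for name, costs in (("console", console), ("pc", pc)):
    print(f"{name}: ${sum(costs.values())} total over {YEARS} years")
```

With assumptions like these, the pricier PC up front still comes out cheaper over the full ownership period; plug in your own local numbers and see where it lands.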
You know, I mention US prices only because I don't have the time nor interest in following the vagaries of local market price patterns on a global scale.
Same for me, I have no interest in following the US market. Anyway, when I mention a European hardware shop, I really don't expect out-of-context comments about how the same cards were a lot cheaper in la la land. Those prices were never real here, so I don't care how radically different they might have been at some point on the other side of the pond.
Regardless of where you live, the global GPU shortage is good for nobody except scalpers.
Good or not, my argument is that it doesn't affect anyone nearly as much as they make it out to (particularly gamers and TPUers).
If people resort to playing 5+ year old games because they can't get their hands on an affordable modern graphics card to play a demanding new release like Cyberpunk 2077 or Microsoft Flight Simulator, that isn't an ideal situation for game studios and their employees now, is it?
But I don't care about them, especially studios that can't write reasonably optimized code. It's entirely their own fault that they did such an awful job of making their shit playable. More importantly, I don't get what the big deal is about playing the latest games. It's entertainment, a type of media, so as long as it works and is enjoyable, its age is irrelevant. Imagine the insanity of people complaining that songs from the 1920s don't sound nearly as crisp as modern recordings because they're on wax cylinders, and throwing them into landfills over some barely important technical detail. Or doing the same to movies... Sure, in the PC gaming space tech moves fast and things get incompatible rather quickly, so many games are simply lost to that, but as long as they function and are enjoyable, there's no point in replacing them.

Same for hardware: as long as it lets you do what you want, there's no point in replacing it, even if something better exists. Just because something better exists doesn't mean you should upgrade to it. Particularly today, when you mostly don't even get any new features and the only differentiating factor between hardware generations is performance (although an argument could be made for power consumption and some specific compatibility quirks too).