We just disagree on that then, simple enough. You're gaming on a 42-inch TV, which isn't a desktop setting, and you might see some return on the increased pixel count in your setup, I don't know. But having experienced both a 1080p TV setup and a much higher-res desktop setup (34-inch ultrawide with practically 4K horizontal resolution, 3440 pixels), I can safely say I see absolutely no point in upping the resolution to 4K for 16:9. The extra width works in a desktop setting; extra height would just feel uncanny. A 34-inch ultrawide is already at the very limit of what's feasible for gaming: your peripheral vision gets filled horizontally, but you're no longer 'seeing everything' on screen, so you have to move your head all the time, or sit back further; and in both cases pixel density is high enough to skip AA. I find myself moving UI elements in games away from the edges toward the middle/bottom/top of the screen because the edges are really too far away. Productivity-wise, on the ultrawide, I find myself using WIN+arrow keys to snap a (browser) window to one half of the screen and put another one next to it. Effectively I only 'view' half the screen at a time.
On a 42-inch you're looking at a similar situation, except the larger diagonal forces you to sit back further, or accept bad ergonomics. But if I sit back further, at 2 m or more, which is the bare minimum for anything 40-inch 16:9 and up to feel comfortable, the bonus of those extra 4K pixels is quickly eliminated versus a similar image at 1080p; I just can't see the difference anymore. My Samsung TV does 4K, and I can safely nudge it back to 1080p: at normal viewing distance, with the same relative scaling of on-screen elements, I really can't see anything different. Suffice it to say, I run the PC at 1080p on that TV; 4K only adds latency and makes browser content slower.
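You can actually put rough numbers on this. A minimal Python sketch, using the common 20/20-acuity rule of thumb (about 1 arcminute per pixel) and the screen sizes/resolutions from this thread; the exact acuity threshold varies per person, so treat the figures as ballpark:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_distance_m(ppi_value, arcmin=1.0):
    """Viewing distance (metres) beyond which one pixel subtends less
    than `arcmin` arcminutes (~20/20 acuity), i.e. individual pixels
    stop being resolvable."""
    pitch_m = 0.0254 / ppi_value  # pixel pitch in metres
    return pitch_m / math.tan(math.radians(arcmin / 60))

# Setups discussed above:
uw_34  = ppi(3440, 1440, 34)   # ~110 PPI (34" ultrawide)
tv_4k  = ppi(3840, 2160, 42)   # ~105 PPI (42" 4K)
tv_fhd = ppi(1920, 1080, 42)   # ~52 PPI  (42" at 1080p)

print(f'42" 1080p pixels vanish beyond ~{retina_distance_m(tv_fhd):.1f} m')
```

That last number comes out around 1.7 m, i.e. at 2 m or more even a 42-inch 1080p image sits below the acuity threshold, which lines up with not being able to tell the modes apart at normal TV distance.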
I've experienced all these different situations over time, and this is where I've arrived: 4K and up is marketing/innovation for the 'must always have moar' crowd. It's not for me; all it does is massively increase the required processing power for, realistically, no advantage at all.
And let's take a look at the horizon ahead of us. GPU advancements are now slowly but surely being achieved not by making things better, but by making them bigger and more power-hungry. Graphical fidelity in games has pretty much plateaued, or is at least suffering from heavy diminishing returns, and RT adds yet another performance hog. Is 4K feasible? I think we're past asking, and you've already attested to that yourself by saying you can't run with all the bells and whistles even on top-end GPUs. Who are we kidding here? Efficiency in gaming is absolute king, so the preferable PPI/diagonal is one where you see everything, aren't counting pixels, and aren't wasting pixels you can't see properly. It feels totally counterproductive to me to sacrifice IQ to get higher resolution, while the opposite does show a difference. Let's not forget that games are not getting lighter, either, regardless of resolution.