Well, it should have been CLEAR from the get-go that a native 1440p display, not 4K, is the "optimal" pairing for an RTX 4090.
Because most did not grasp this, they will be left blaming game devs for "unoptimized ports" and/or forced to use gimmicks such as "frame gen" and upscalers just to get decent performance.
With a native 1440p display none of this is required, just a pristine native image w/ no visual artifacts and no hit to input lag, just as the PC Gods intended.
If you have an RTX 4090, the optimal display is something like 1440p / 240 Hz. This way you can play the latest AAA RPGs from UE5 and the like at 90+ FPS, and more demanding first-person shooters (not talking esports, but games such as Halo Infinite, Destiny 2, etc.) at up to 240 FPS. It's funny how many seem to think 1440p is suddenly beneath them when in actuality it's the ideal resolution.
Nah, it's simply that games are always a moving target, and GPU makers are always looking for ways to upsell you or sell you a new generation.
The 4090 isn't slow, but you can make it slow. No problem. That's what RT is for. Now with Nanite and UE5 we get to see what's really happening in graphics advances. Not some overmarketed POS feature that hardly anyone can use properly and that needs dev TLC, no, just in-engine improvements that utilize new tech. The 4090 is still fast... but it's just about as obsolete as the handful of cards below it when you look at actual FPS with everything maxed out. All current cards are fighting for playable frames when maxed.
We've finally arrived at a new graphics paradigm, one hinted at with DX12, then Crytek's Neon Noir, and after a rocky start in UE5 we're starting to see it bear fruit. This is the path forward the industry will carry, not this stupid RTX push that's still propped up by nothing but bags of Nvidia money and marketing.
And guess what, you don't need proprietary hardware for it. You don't need a 4090. Nobody ever did, and any recommendations regarding said card or "fitting" resolutions are just nonsense from people who don't understand gaming or its history. The GTX 1080, once the "1440p card", was relegated to 1080p duty within three years. The examples of this are endless. Resolution is of very little relevance to anything. Even when 4K became more feasible, the early 4K games didn't look any better, because the assets in them didn't progress with the resolution bump. So now you have large empty spaces on models. Yay. On the flip side, once 4K caught on, developers saw the advantage of higher-fidelity assets, and you can now enjoy those even on 1080p.
Resolution is not, will not, and cannot be a factor to measure things by when it comes to gaming. It fails. Hard. All you need is time. That's also why it's very unwise to move to 4K and treat it as the norm, though there is the fortunate side effect that a 4K panel can display 1080p at near-native quality, since it's an exact 2x integer scale.