4K is 4x the pixels of 1080p, so a single generation's uplift is much smaller than the gap between a card's 4K and 1080p numbers. An RTX 3090 averages 88 FPS across TechPowerUp's suite of games at 4K and 181 FPS at 1080p - 105% more. And a 2080 Ti goes from 59 FPS at 4K to 130 FPS at 1080p - 120% more.
With an RTX 4090 you get 144 FPS at 4K but only 74% more at 1080p - though that absurdly high 251 FPS is limited by system latency rather than raw CPU performance, so we can be confident the card's true 1080p advantage over 4K is again more than 100%.
A generational uplift from an RTX 3090 to a 4090 only gives you 63% at 4K and 40% at 1080p, and going from a 2080 Ti to a 3090 only gave you 49% at 4K and 39% at 1080p.
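To make the arithmetic behind these percentages explicit, here is a minimal sketch that recomputes them from the FPS figures quoted above (the "X% more" figure is simply the ratio of the two frame rates minus one):

```python
def pct_more(high_fps: float, low_fps: float) -> float:
    """Percentage by which high_fps exceeds low_fps."""
    return (high_fps / low_fps - 1) * 100

# TechPowerUp averages cited above: (4K FPS, 1080p FPS)
fps = {
    "RTX 2080 Ti": (59, 130),
    "RTX 3090": (88, 181),
    "RTX 4090": (144, 251),
}

# Resolution gap for each card
for name, (fps_4k, fps_1080p) in fps.items():
    print(f"{name}: 1080p is {pct_more(fps_1080p, fps_4k):.0f}% more than 4K")

# Generational uplift at each resolution
print(f"3090 -> 4090 at 4K:    {pct_more(144, 88):.0f}%")
print(f"3090 -> 4090 at 1080p: {pct_more(251, 181):.0f}%")
print(f"2080 Ti -> 3090 at 4K:    {pct_more(88, 59):.0f}%")
print(f"2080 Ti -> 3090 at 1080p: {pct_more(181, 130):.0f}%")
```

Note how every generational uplift (roughly 39-64%) is far below every card's own 1080p-vs-4K gap (74-120%), which is the core of the argument.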
So you would really have to skip an entire GPU generation while keeping the same CPU for 4 years, and even then it's not the same as comparing a card's 4K numbers to its 1080p numbers.
Testing at 720p is of course even more absurd - that's like planning to keep the same CPU for more than 8 years.
These "let's get the GPU bottleneck out of the equation" numbers are very good for selling CPUs, but the theoretical gains at low resolution have very little effect on what you'll actually get from the product.
But they're good for sales, good for creating FOMO in buyers.
Because showing the actual FPS gamers would get from a cheap $250 CPU versus a very expensive $700 one at 4K - and perhaps with the more down-to-earth GPUs that the majority of gamers actually use - paints a very, very different picture.