Until now... frankly, 4k did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in going 4k until it could match what 1440p already does: ULMB @ 120 Hz or better. I think we will see a shift in the relative market share between the xx60, xx70, and xx80 tiers, as more people drop a tier because they are not gaming at 4k... only 2.24% of Steam users are at 4k, while 2 out of every 3 are still at 1080p and only 6.59% are at 1440p:
1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%
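If you want to poke at those numbers yourself, here's a minimal Python sketch that just tallies the figures above; the shares are the ones copied into this post, not a live pull from the survey.

```python
# Steam survey resolution shares as quoted in this post (copied from
# the list above, not pulled live from the survey).
shares = {
    "1024 x 768": 0.35,   "1280 x 800": 0.55,  "1280 x 1024": 1.12,
    "1280 x 720": 0.35,   "1360 x 768": 1.51,  "1366 x 768": 9.53,
    "1440 x 900": 3.05,   "1600 x 900": 2.46,  "1680 x 1050": 1.83,
    "1920 x 1080": 65.55, "1920 x 1200": 0.77, "2560 x 1440": 6.59,
    "2560 x 1080": 1.14,  "3440 x 1440": 0.90, "3840 x 2160": 2.24,
    "Other": 2.07,
}

print(f"1080p: {shares['1920 x 1080']:.2f}%  (~2 out of every 3 users)")
print(f"1440p: {shares['2560 x 1440']:.2f}%")
print(f"4k:    {shares['3840 x 2160']:.2f}%")
print(f"total: {sum(shares.values()):.2f}%  (off 100 due to survey rounding)")
```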
The thing to consider, though, is the same one we have always had to consider... relative performance depends on what you are measuring. We've always had arguments where both sides proved they were right simply by choosing what to test and which games to test with.
RAM Speed Doesn't Matter:
Look at average fps and you could show that, in most cases, it didn't matter, and that was because the GPU was the bottleneck. But look at other things, such as:
a) Certain games, like STRIKER and F1, where RAM speed did matter.
b) SLI / CF setups, where RAM speed mattered.
c) Min. fps, where RAM speed mattered.
This was because the GPU was the bottleneck at average fps... in those other situations, it was not.
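To illustrate the average-vs-minimum point, here's a minimal Python sketch using made-up frame times (illustration only, not measured data): two runs can sit within spitting distance on average fps while the 99th percentile tells a very different story.

```python
# Made-up frame times (ms), for illustration only: both runs render
# the same 95 smooth frames, but one stutters harder on the other 5.
fast_ram = [8.3] * 95 + [12.0] * 5
slow_ram = [8.3] * 95 + [33.0] * 5

def avg_fps(frame_times_ms):
    # Average fps over the run: total frames / total seconds.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def p99_fps(frame_times_ms):
    # 99th percentile fps: derived from the slowest ~1% of frame
    # times, i.e. the frame rate you exceed 99% of the time.
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return 1000.0 / worst

for label, run in (("fast RAM", fast_ram), ("slow RAM", slow_ram)):
    print(f"{label}: {avg_fps(run):5.1f} avg fps, "
          f"{p99_fps(run):5.1f} fps @ 99th percentile")
```

Run it and the averages land about 12% apart, while the 99th percentile numbers are nearly 3x apart... that's the gap average fps hides.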
RAM Quantity Doesn't Matter as long as ya have xx GB:
Look at average fps and you could show that, in most cases, it didn't matter, and again that was because the GPU was the bottleneck. But look at other things, such as:
a) Certain games, where RAM quantity did matter.
b) SLI / CF setups, where RAM quantity mattered.
c) Min. fps, where RAM quantity mattered.
Again, this was because the GPU was the bottleneck at average fps... in other situations, it was not.
At 1440p / 4k, CPU Performance Doesn't Matter:
Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:
A 9900K paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator at Ultra settings.
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator at Ultra settings.
That's a performance difference of 12% (19% @ 99th percentile).
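Those deltas fall straight out of the quoted fps figures; a quick sketch of the arithmetic:

```python
# Relative performance from the fps numbers quoted above.
def pct_faster(a, b):
    # How much faster a is than b, in percent.
    return (a / b - 1) * 100

print(f"average fps:      {pct_faster(49.1, 43.9):.0f}%")  # ~12%
print(f"99th percentile:  {pct_faster(41.0, 34.5):.0f}%")  # ~19%
```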
Point here being that, while average fps deserves its place as the "go to" yardstick for relative performance between cards, since the average-to-minimum ratio is usually consistent from card to card, the relative performance of CPUs and RAM can be significantly impacted by other factors. The same is true of VRAM... at 1080p, you will rarely see a difference between a 3 GB and 4 GB card, or a 4 GB and 8 GB card... but if you're big on Hitman and Tomb Raider, it will be something to consider. In Witcher 3 and most other games, the difference was less than 5%, and that can be attributed solely to the 11% shader count difference.
Would love to see TPU include 99th percentile numbers in its reviews... would also like to see overclocked performance. But most sites that do provide this info only test a handful of games, so it's perhaps asking a bit much across TPU's 18 - 23 game test suite.