Would you consider breaking down the three resolutions into SDR and HDR gaming performance? I couldn't find anything on the testing page about whether games are tested with SDR or HDR output. I can see "highest quality settings", but does that include HDR being enabled in each game?
As Tim from HUB has shown, some game engines handle HDR differently and need more processing when it is enabled, which lowers frame rates on an HDR monitor. This would mean games need to be tested in both SDR and HDR to show the difference in GPU performance between the two. A game achieving, say, 70 fps on a 4K SDR monitor might only reach 63 fps on a 4K HDR monitor, roughly a 10% penalty. Many games perform similarly in both modes, but it would be useful to identify the outliers so that the published charts are even more accurate.
This is irrelevant. He was testing PCIe bus throughput, not which CPU performs better.
Someone would need to test several workflows at the same time to tell us where the saturation point is, and that somebody would need a lot of time on their hands.
Please make pictures smaller before posting, as they appear gigantic in the comment section! Try dragging a corner of the image diagonally towards its centre.
It depends on what games people play and on what kind of gear. Average performance is a meaningless measure here. If I only play Flight Simulator and Cyberpunk at ultra settings, which are very demanding on the GPU, I won't need more than a 4K/90 Hz HDR display. Even the mighty 4090 can't deliver more than 73 fps in Flight Simulator in that case. Do we need 73 fps in Flight Simulator? Not necessarily; 60 is plenty. It's a slow-paced game for the most part, unless someone enjoys fighter jet action. It's all about the use case.
The 4090 is a halo product that unlocks higher frame rates on a 4K display and could be bought specifically for that purpose by those who need it. Anyone playing at 1080p or 1440p should never consider this card unless they need insanely high frame rates; it's total overkill there. And anyone who needs future-proof DisplayPort 2.0 connectivity should never buy a 4000 series card in the first place.