So far, Nvidia's CPU overhead on DX12 is still higher than its competitors', so it's not far-fetched that they'll grab some % there at some point.
But I don't think it relates much to the power of the GPUs. It comes down to the simple fact that there is always a bottleneck. And it's a moving target: as faster CPUs get released, the top GPUs will surely increase their lead a little. This has been the case for longer than the last six years.
Also... despite efficiency gains on the API and hardware side, I think the biggest win is to be had in the games and their code themselves. Occasionally you see how games handle huge amounts of logic, and it's clear there is a lot to be gained there, and it's directly engine-dependent, more than anything. You can throw more hardware at it, but it's still gonna run like shit if it runs like shit otherwise, just a little less so. The real CPU load in games is often determined by continuously keeping a lot of stuff 'in the game' as fully dynamic components that influence the rest of the game. That creates a superlinear load as your simulation grows, since entities don't just exist, they interact, and it's why city builders often slow to a crawl in the late game. Just too many things that need to be kept track of. I think it's pretty inexcusable that any other type of game is dying for better CPUs; everything else should just be coded better, really. After all, devs manage to do exactly that when they code for the Jaguar cores in the PS4, because they need to sell product.
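To make the "coded better" point concrete, here's a toy sketch (not from any real engine, all names are made up for illustration) of the classic fix for that kind of load: a naive neighbour check compares every entity against every other entity, so cost grows quadratically with the simulation, while a spatial hash only compares entities in nearby grid cells and gives the same answer much cheaper when entities are spread out:

```python
import random
from collections import defaultdict

def naive_interactions(positions, radius):
    """All-pairs check: every entity looks at every other entity. O(n^2)."""
    r2 = radius * radius
    hits = 0
    n = len(positions)
    for i in range(n):
        xi, yi = positions[i]
        for j in range(i + 1, n):
            xj, yj = positions[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= r2:
                hits += 1
    return hits

def grid_interactions(positions, radius):
    """Spatial hash: bucket entities into cells of size `radius`, then only
    compare each entity against entities in its own and adjacent cells.
    Any pair within `radius` of each other is guaranteed to land in
    neighbouring cells, so no interactions are missed."""
    cell = radius
    grid = defaultdict(list)
    for idx, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(idx)

    r2 = radius * radius
    hits = 0
    for i, (x, y) in enumerate(positions):
        cx, cy = int(x // cell), int(y // cell)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid[(cx + dx, cy + dy)]:
                    if j <= i:  # count each pair only once
                        continue
                    xj, yj = positions[j]
                    if (x - xj) ** 2 + (y - yj) ** 2 <= r2:
                        hits += 1
    return hits
```

Both functions report the same interaction count; the difference is that doubling the entity count roughly quadruples the work for the naive version, while the grid version scales with how many entities actually sit near each other. That gap between "keep everything dynamic and check everything" and "only check what can matter" is exactly the engine-level win being argued for above.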