I'm actually one who wants it the other way: I want AMD to add iGPUs to their high-end CPUs.
For my 2600X I had to buy a discrete GPU, which annoyed me since the machine only runs ESXi and has no real need for one. Even after I put in the GT 1030 it still bugged me, because if there were an iGPU the 1030 could have been used for passthrough instead (with Proxmox; rough sketch of that setup below).
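For anyone curious, this is roughly what the Proxmox side of GPU passthrough looks like. A minimal sketch, assuming an AMD board, the card sitting at PCI address 01:00, and a q35 VM with ID 100 (those are placeholders, check your own hardware with lspci):

    # /etc/default/grub - enable the IOMMU, then run update-grub and reboot
    GRUB_CMDLINE_LINUX_DEFAULT="quiet amd_iommu=on iommu=pt"

    # /etc/modules - load the VFIO modules at boot
    vfio
    vfio_iommu_type1
    vfio_pci

    # find the GPU's PCI address
    lspci -nn | grep -i vga

    # hand the whole card to VM 100 as a PCIe device
    # (pcie=1 needs the q35 machine type; x-vga=1 if it's the guest's primary GPU)
    qm set 100 -hostpci0 01:00,pcie=1,x-vga=1

With an iGPU driving the host, the whole discrete card could be handed over like that instead of sitting idle on a console nobody looks at.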
When I build Intel machines I can test using the iGPU, and some of my customers specifically preferred configurations with an iGPU: they needed lots of CPU power but not much GPU.
To me, once you hit 4-6 cores the typical end user won't benefit much more, and once you hit 8 cores anything on top of that is niche.
Compiling
Encoding
Virtualisation
In the grand scheme of things, those three use cases are very uncommon.
If you want to stream and game on a single CPU, I think 8 cores/16 threads is good enough. If the game is particularly demanding, one could argue more cores would help there, but again, the number of professional streamers within the overall consumer market is tiny. Intel knows better than us what consumers typically do.
Core count gets overhyped because YouTube reviewers and Twitch streamers have a large influence, and both groups need to encode, so they push the "more cores" mantra because they personally benefit. They over-emphasise it, when something like 97% of consumers never encode or compile anything.
This is why Bulldozer failed: it was a CPU built for geeks who love core counts, but it wasn't good for the average consumer.
The fact that Cinebench is the go-to benchmark for most of the media misleads people as well, because it's a rendering benchmark that scales almost perfectly with core count. Something like Geekbench is more representative, as it tests many different CPU workloads, not just one.