Of course no one wants less optimization. We want more optimization, so that, for example, a large crowd of animated NPCs gets computed on 16 threads instead of 8. Doing everything on 4 cores/8 threads is bad optimization in a world where 6-core desktop CPUs have been available since 2010 and cheap ones since 2017 (5 years!). The point is for games to offer more, not less: more features, more dynamic stuff. Just like going from 1 core to 2 cores expanded what a CPU could do, there would be no games like Assassin's Creed if we had stayed on a single 3.2 GHz core. Frequency doesn't go up anymore, so developers need to find other ways forward: more cores, more threads, more cache. We can still get those, even if we can't get more frequency.

I hope the E cores that Intel is adding with its new architectures get utilized. Mainstream CPUs with 64 E cores are not far off; they could arrive in 2026. Do you really want games that use maybe 6 out of 8 P cores and 0 out of 64 E cores? And I don't mean bad coding that wastes resources, just next-gen games with next-gen features and changes (for the better, not for the worse). The 8th console generation brought one "awesome" change: microtransactions. Do we want things to continue like that?
For example, I've read that the AI in Far Cry 6 is very, very bad (and the game uses only about 4 cores), even though 8-core/16-thread Zen 3 CPUs were already on the market when it came out. Why isn't AI in games progressing like graphics? CPU performance keeps increasing, so why? Because devs choose so, that's why! Game AI should be improving exponentially, the way it is in areas outside video games (even at playing video games, or games like Go, poker, or bridge). Listen to Jensen Huang when he talks about AI. Why isn't in-game AI getting better? I don't need 300 fps, I need better AI.