Lots of people here want to bag on Intel for "muh quad cores" without realizing that 6+ cores were available in HEDT and, more importantly, they offered zero performance improvement. Oftentimes the consumer quad core was faster in games than the HEDT 6 core. So there was literally no reason to offer some super expensive 6 core in the early 2010s that simply wouldn't perform. AMD was lambasted for pushing "moar coars" in their CPUs, especially as their 6-core Phenom STILL couldn't consistently beat a dual-core i3 in average framerates, let alone the i5. There's a reason the Phenom X6 was $200 while the quad-core i7 was $300.
Games didn't even start requiring 4 cores to run right until very recently, as in 2019+. Even then, unlocked Core i3 parts like the dual-core 7350K can still hit 60 FPS in most titles. You don't need 16 cores to play games or run 99% of consumer software. GaMeRz constantly overbuy on the CPU front.
Today, AMD isn't pushing more than 16 cores. Oh boo hoo, you really need 20 cores to play Minecraft, right? Instead you only get a 29% IPC bump per generation! You don't need 17+ cores in a consumer product. If your workload needs 17+ cores, you fall into the HEDT category and likely need more than dual-channel RAM to satiate your needs anyway, so the whole point is moot.
More like Kentsfield, back with Core 2 Quad. 2007 through 2017 (8700K) with quad cores as the top-tier part, with slow increases in per-core performance creeping along year by year (until they gave up, and it became wattage increases with the same hardware instead).
The real changes came with the transitions from DDR2 to DDR3 to DDR4, and basically nowhere else.
Yeah, OK, I guess Core 2 Quads had the same per-core performance on DDR3 as Haswell then? Dude, just stop embarrassing yourself.