You forgot the differences.
250 W coming from 200 mm² is not as easy to cool as 250 W coming from 315 mm².
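To put rough numbers on that, here's a quick heat-flux sketch using the die areas above (a back-of-the-envelope comparison, not a thermal model; the areas are the approximate figures from this thread):

```python
# Compare heat flux for the same power dissipated over two die areas.
def power_density(watts: float, area_mm2: float) -> float:
    """Return heat flux in W/mm^2."""
    return watts / area_mm2

small_die = power_density(250, 200)  # ~1.25 W/mm^2
large_die = power_density(250, 315)  # ~0.79 W/mm^2

print(f"200 mm^2 die: {small_die:.2f} W/mm^2")
print(f"315 mm^2 die: {large_die:.2f} W/mm^2")
```

Roughly 60% more heat per square millimetre on the smaller die, which is why the same wattage is harder to move into the cooler.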
Also, AMD says the TDP is 220 W, while Intel still says 125 W.
I still consider the 10900K to be less of a freak than the FX, because it's more competitive. Don't quote me on that though, I don't remember those FX reviews...
You have a point about the smaller die area, but I'm not sure why you bring claimed TDP into it; that's a red herring. The 3950X is a 105 W chip, but hits a peak of 145 W total power draw with 10 cores loaded, and core frequency starts falling the more cores you load after that, with the average clock rate hitting 3.875 GHz with 16 cores. Upping power limits on DIY systems to allow higher clocks raises that power draw much higher.
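That ~145 W peak lines up with AMD's stock package power limit: desktop Ryzen parts rated at 105 W TDP get a PPT (package power tracking) cap of roughly 1.35x TDP. A minimal sketch of that relationship (the factor is the commonly cited stock AM4 ratio, not AMD's exact firmware logic):

```python
# Stock AM4 desktop Ryzen parts with a 105 W TDP are governed by a
# package power tracking (PPT) limit of roughly 1.35x the rated TDP.
TDP_W = 105
PPT_FACTOR = 1.35  # commonly cited stock ratio for AM4

ppt_limit = TDP_W * PPT_FACTOR  # ~142 W package limit
print(f"PPT limit: {ppt_limit:.0f} W")  # close to the ~145 W measured peak
```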
www.anandtech.com
According to the enthusiast community, pushing these chips to 4.3-4.4 GHz all-core pushes total power consumption above 280 W, and they manage to handle the heat generated. Notice as well that the ones pushing these clocks are using 360 mm rads with 3-6 fans, yet there is no complaining from the community about the higher clocks being useless without "exotic cooling".
https://www.reddit.com/r/Amd/comments/euk0th
The point is the power draw of the 10900K isn't unmanageable. As I said, you have a point about the small die becoming a limiting factor. Techspot was able to load all 10 cores at 4.9 GHz with a 200 W power draw, and hit 84 °C.
Today we're checking out Intel's new enthusiast and gaming flagship CPU, the Core i9-10900K. This is a 10-core, 20-thread processor sporting a base frequency of 3.7 GHz...
www.techspot.com
The whole "ZOMG IT SO HOT" thing seems to be a bit overblown. Yes the chip is hot, yes it is harder to cool than AMD chips, but it's hardly unsustainable. The FX was seen as a freak more because it was identical to the 8350, just clocked higher, with no additional cache or extra cores, and most 8350s could be pushed about as far. And the 9000 series had compatibility issues, and not all of them worked at specced voltages.
The 10900K seems to hold the same spot the 9700K did last year: faster in games, slower in everything else, and needing a bigger cooler to hit max speed.
But you are not getting faster gaming performance unless you build a very specific, high-cost rig and ONLY game at 1080p medium settings or less. Also, you are wrong about future-proofing. At resolutions higher than 1080p, TPU shows that even a Coffee Lake Core i3/i5 or Zen+ Ryzen 5/7 gets almost the same performance as a 9900KS.
The Core i9-9900KS is Intel's new consumer flagship processor. It runs at 5 GHz boost no matter how many cores are active, which translates into 10% application performance gained over the 9900K. Gaming performance is improved too, but pricing is high, especially compared to what AMD is offering.
www.techpowerup.com
Games are bottlenecked by the graphics card only when you go to higher resolutions and detail settings. Many year-old CPUs perform the same as recent CPUs when gaming under these conditions. So a Ryzen 3900X will perform about the same as the Core i9-10900K, now and a few years from now, at 2.5K and higher resolutions.
I have both a 9700K and a 2700X. The 2700X is OCed with PBO and AutoOC, with 2866 MHz RAM. The 9700K is running at stock speeds with 2400 MHz RAM.
Despite all the reviews saying there is only a small difference at higher resolutions in such a scenario, running both on my 1440p 144 Hz gaming monitor, there is a definite difference between the two. Even limiting the framerate to 90 or 60 still shows better overall performance from the Intel chip. Perhaps the Ryzen chip needs faster memory; my chip can't even get 2933 out of my 3200 MHz RAM. Perhaps there is some setting to tweak. But out of the box, for high-end gaming, the Intel chip still holds a noticeable advantage, and the 10900K slightly improves this advantage.