Monday, July 18th 2022
Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF
Intel's 13th Gen Core "Raptor Lake" is shaping up to be another leadership desktop processor lineup, with an engineering sample posting significant increases in gaming minimum framerates over the preceding 12th Gen Core i9-12900K "Alder Lake." Extreme Player, a tech-blogger on Chinese video streaming site Bilibili, posted a comprehensive gaming performance review of an i9-13900K engineering sample covering eight games across three resolutions, comparing it with a retail i9-12900KF. The games include CS:GO, Final Fantasy XIV: Endwalker, PUBG, Forza Horizon 5, Far Cry 6, Red Dead Redemption 2, Horizon Zero Dawn, and Monster Hunter Rise, alongside the synthetic benchmark 3DMark. Both processors were tested with a GeForce RTX 3090 Ti graphics card, 32 GB of DDR5-6400 memory, and a 1.5 kW power supply.
The i9-13900K ES posts performance leads of just 1% to 2% in the graphics tests of 3DMark, but an impressive 36% to 38% gain in the CPU-intensive tests of the suite. This is explained not just by the increased per-core performance of both the P-cores and E-cores, but also by the addition of 8 more E-cores. Although "Raptor Lake" uses the same "Gracemont" E-cores, the L2 cache per E-core cluster has been doubled. Horizon Zero Dawn sees anywhere from a -0.7% to a 10.98% change in frame rates. There are some anomalous 70% frame-rate increases in RDR2; discounting those, we still see a 2-9% increase. FC6 posts modest gains of around 2.4%. Forza Horizon 5, PUBG, Monster Hunter Rise, and FF XIV each report significant increases in minimum framerates, well above 20%. The second graph below shows the highlight of these tests: significant increases in minimum frame-rates. Averaged across tests, the i9-13900K ES posts an 11.65% min-FPS gain at 4K UHD, a 21.84% increase at 1440p, and a 27.99% increase at 1080p.
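As a sanity check on how such headline figures are produced, here is a minimal sketch of averaging per-game minimum-FPS gains; the per-game numbers are illustrative placeholders, not the reviewer's actual data:

```python
# Averaging per-game minimum-FPS gains into one summary figure.
# The gains below are made-up placeholders, not Extreme Player's data.
gains_1080p = {
    "CS:GO": 0.25,           # +25% min FPS
    "Forza Horizon 5": 0.22,
    "Horizon Zero Dawn": 0.05,
    "Far Cry 6": 0.024,
}

# Simple arithmetic mean of the relative gains, as in the review's summary.
avg_gain = sum(gains_1080p.values()) / len(gains_1080p)
print(f"average min-FPS gain: {avg_gain:.2%}")
```

Note that an arithmetic mean of percentage gains weights every game equally, so one outlier title (like the anomalous RDR2 runs) can noticeably skew the headline number.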
A big caveat with all this testing is the CPU clock speeds. Engineering samples tend not to come with the clock speeds or boosting behavior of retail processors, and hence don't correctly reflect the end product, although some ES chips do come with unlocked multipliers. In this testing, the i9-13900K ES was set to a maximum all-core P-core clock of 5.50 GHz, which was assumed to be the max boost frequency of the retail chip. It was compared with an i9-12900KF that boosts up to 5.20 GHz on the P-cores but was running at 4.90 GHz all-core.
The i9-13900K ES was also subjected to power-consumption testing, where it posted significantly higher peak gaming power draw than the retail i9-12900KF. A retail i9-13900K will likely come with lower power consumption than what is shown here, as it will follow the boosting behavior typical of retail chips at stock frequencies, unlike an ES that has been pinned to a set frequency.
Intel is preparing to launch its 13th Gen Core "Raptor Lake" processor family in the second half of 2022. This period could also see rival AMD introduce its Ryzen 7000 "Zen 4" processors. "Raptor Lake" combines 8 "Raptor Cove" P-cores with 16 "Gracemont" E-cores, and additional L2 cache for both core types. The I/O of these chips is expected to be similar to "Alder Lake," and hence they're built for the same LGA1700 platform.
Sources:
Extreme Play (Bilibili), harukaze5719 (Twitter), VideoCardz
76 Comments on Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF
Only comparing 13900K vs 12900K(F), the biggest power jump is from 92 W to 140 W in Red Dead at 4K, which is a ~50% increase.
In that test the performance increase was only about 5%.
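The perf/W arithmetic behind this objection can be made explicit, using the figures from the comment above (+5% performance for a 92 W to 140 W power jump):

```python
# Relative performance-per-watt change when power rises 92 W -> 140 W
# for a ~5% performance gain (figures quoted from the RDR2 4K comparison).
perf_gain = 1.05                     # +5% performance
power_old, power_new = 92.0, 140.0

power_increase = power_new / power_old - 1            # ~52% more power
perf_per_watt_ratio = perf_gain / (power_new / power_old)

print(f"power increase: {power_increase:.0%}")
print(f"perf/W vs 12900KF: {perf_per_watt_ratio:.2f}x")
```

On these numbers, perf/W lands around 0.69x of the 12900KF in that one game, which is why the "improved performance/consumption ratio" claim doesn't hold for this data point.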
I don't see how you could come up with 'At 13900K, the performance / consumption ratio will be improved. I repeat: in games. '
From the original post
It is very obvious that they bumped the frequency at the price of power consumption,
and the bumped frequency didn't end well in the game tests.
The increase in min FPS is mainly from the increased cache, as we have seen from the 5800X3D's behaviour.
The 13900K is just a frequency bump of the 12900K plus 8 more E-cores, and it reaches an insane PL4 of 420 W.
It will be an uncoolable 12900K, which by itself is already quite uncoolable.
ALL-core clock, not just 1 or 2 cores like the 12900K, so maybe YOU should actually try reading it... if you think ALL cores at 5.5 GHz won't need special cooling, then you're high or something.
What tipped me off is the 1.5 kW PSU used o_O. God I hope it's a coincidence and not a must to run it at full speed with a 3090 Ti.
I myself own a 9900KS and a 5900X.
The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also uses AVX.
The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game that uses one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.
In my example, even when I had a peak of 57.3 W, the total consumption of the processor was below 8 Wh.
Intel vs Intel
It is completely reasonable to assume the peak-consumption increase roughly tracks the average-consumption increase in the tested use cases, since the architecture and process node are largely the same.
Let me say again
The original post is doing INTEL VS INTEL.
I don't know why you keep missing the picture here and comparing against an "imaginary AMD."
There are no AMD products tested in the original post.
With a sample size of one, the only thing we can do is use its data point and compare Intel vs Intel.
Intel's 12th gen consumes a huge amount in very heavy tasks but is extremely efficient in others. To determine total consumption, you have to average consumption across the whole session. If I drink three beers today and one tomorrow, you can't say I drink three beers every day.
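The point about averaging over a session can be sketched in code; the one-second power samples below are made-up illustration data, not measurements:

```python
# Energy over a session depends on the time-average of power, not its peak.
# One-second power samples: 50 minutes of light load, 10 minutes of heavy load.
samples_w = [50.0] * 3000 + [140.0] * 600

avg_power = sum(samples_w) / len(samples_w)        # time-averaged watts
energy_wh = avg_power * len(samples_w) / 3600.0    # watt-hours over the session

print(f"peak {max(samples_w):.0f} W, average {avg_power:.1f} W, "
      f"energy {energy_wh:.1f} Wh")
```

In this illustration the peak is 140 W but the session average is only 65 W, which is the "three beers today, one tomorrow" argument in numbers.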
Intel's CPU architecture is solid; you just can't overcome the manufacturing-node deficit with a solid architecture alone.
Does anyone else find it peculiar that reviewers use 3090 (Ti) GPUs for testing CPUs at 1080p?
I guess it doesn't make too much difference, especially if you are a random person on a Chinese tech forum instead of a W1zzard...
Is it relevant? Depends on your perspective, doesn't it? But I don't think it is relevant in the context of Intel pushing a 5.5 GHz clock on the top-end model, which, as we know today, much like the 12900K(F), is a POS to keep cool unless you limit it somehow.
You keep talking about your super-efficient gaming 12700K as if it's a bench-topping beast, but it's not a 12900K. And it's certainly not a 12900K being pushed in any possible form at 1440p ultra. So what do you really know?! Especially because you pair it with an ancient 1080 Ti. You don't even have the hardware to push a 12700K to the limit in any game, lol. Mighty efficient indeed, at 20% load... so that puts your "low" 70 W in some real perspective right there. This topic is about a successor to the 12900K at peak clocks pushing the fattest GPU you can find. You're brutally off topic every time you post about how efficient your CPU is while practically idling, and then you complain about other people making silly comparisons ;)
Now, idle means 25 watts, and YouTube playback of the same material 45 W. Total consumption. If no one died then, surely no one will die now. In those days, AMD fans hid consumption and attacked with prices. You attack with what you have at hand and hide the messes; the important thing is to appear smart.
We have high-performance processors for the demands of a home user. It's not wrong to choose a company, Intel or AMD; it's only wrong if you choose badly, and badly means choosing a processor totally out of step with your needs.