Monday, July 18th 2022

Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF

Intel's 13th Gen Core "Raptor Lake" is shaping up to be another leadership desktop processor lineup, with an engineering sample posting significant increases in gaming minimum framerates over the preceding 12th Gen Core i9-12900KF "Alder Lake." Extreme Player, a tech blogger on the Chinese video streaming site Bilibili, posted a comprehensive gaming performance review of an i9-13900K engineering sample covering eight games across three resolutions, comparing it with a retail i9-12900KF. The games include CS:GO, Final Fantasy XIV: Endwalker, PUBG, Forza Horizon 5, Far Cry 6, Red Dead Redemption 2, Monster Hunter Rise, and Horizon Zero Dawn, alongside the synthetic benchmark 3DMark. Both processors were tested with a GeForce RTX 3090 Ti graphics card, 32 GB of DDR5-6400 memory, and a 1.5 kW power supply.

The i9-13900K ES is shown posting performance leads of just 1-2% in the graphics tests of 3DMark, but an incredible 36-38% gain in the CPU-intensive tests of the suite. This is explained not just by the increased per-core performance of both the P-cores and E-cores, but also by the addition of 8 more E-cores. Although "Raptor Lake" uses the same "Gracemont" E-cores, the L2 cache per E-core cluster has been doubled. Horizon Zero Dawn sees anywhere from a -0.7% to a 10.98% change in frame rates. There are some anomalous 70% frame-rate increases in RDR2; discounting those, we still see a 2-9% increase. FC6 posts modest increases of around 2.4%. Forza Horizon 5, PUBG, Monster Hunter Rise, and FF XIV each report significant increases in minimum framerates, well above 20%.
The second graph below shows the highlight of these tests: significant increases in minimum frame-rates. Averaged across the tests, the i9-13900K ES posts an 11.65% minimum-FPS gain at 4K UHD, a 21.84% gain at 1440p, and a 27.99% gain at 1080p.
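These per-resolution averages are simple to reproduce arithmetically. The sketch below shows one plausible way to derive them, as an arithmetic mean of per-game percentage gains; the FPS figures are hypothetical placeholders rather than the reviewer's data, and the article does not state the exact averaging method used.

```python
# Rough sketch of how per-resolution averages like the ones above could be
# derived. The FPS figures below are hypothetical placeholders, not the
# reviewer's data, and a plain arithmetic mean of per-game gains is assumed.

def pct_gain(new_fps: float, old_fps: float) -> float:
    """Percentage uplift of the ES sample over the baseline chip."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical minimum-FPS pairs: (i9-13900K ES, i9-12900KF)
min_fps_1080p = {
    "Game A": (210.0, 168.0),
    "Game B": (145.0, 112.0),
    "Game C": (98.0, 80.0),
}

gains = [pct_gain(es, kf) for es, kf in min_fps_1080p.values()]
print(f"Average min-FPS gain at 1080p: {sum(gains) / len(gains):.2f}%")
```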

A big caveat with all this testing is the CPU clock speeds. Engineering samples do not tend to come with the clock speeds or boosting behavior of retail processors, and hence don't correctly reflect the end product, although some ES chips do come with unlocked multipliers. In this testing, the i9-13900K ES was set to a maximum all-core P-core clock of 5.50 GHz, on the assumption that 5.50 GHz will be the maximum boost frequency of the retail chip. It was compared with an i9-12900KF that boosts up to 5.20 GHz on its P-cores but was running at 4.90 GHz all-core.
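As a rough sanity check of this caveat, the all-core clock gap alone can be quantified. The sketch below assumes, optimistically, that game performance scales linearly with the all-core P-core clock; real games rarely scale that cleanly, so the result is at best an upper bound on the clock-only contribution.

```python
# Back-of-the-envelope check of the clock-speed caveat described above,
# assuming (optimistically) linear scaling with the all-core P-core clock.

es_all_core_ghz = 5.50    # i9-13900K ES, fixed all-core P-core clock
adl_all_core_ghz = 4.90   # i9-12900KF, observed all-core P-core clock

clock_advantage = (es_all_core_ghz / adl_all_core_ghz - 1.0) * 100.0
print(f"All-core clock advantage: {clock_advantage:.1f}%")
# ~12.2%: less than the 21.84% (1440p) and 27.99% (1080p) min-FPS gains,
# so under this assumption clocks alone would not explain the full uplift.
```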

The i9-13900K ES was also subjected to power-consumption testing, where it posted significantly higher peak gaming power draw than the retail i9-12900KF. A retail i9-13900K will likely come with lower power consumption than what is shown here, as it will follow the boosting behavior typical of retail chips at stock settings, unlike an ES that has been set to run at a fixed frequency.

Intel is preparing to launch its 13th Gen Core "Raptor Lake" processor family in the second half of 2022. This period could also see rival AMD introduce its Ryzen 7000 "Zen 4" processors. "Raptor Lake" combines 8 "Raptor Cove" P-cores with 16 "Gracemont" E-cores, and additional L2 cache for both core types. The I/O of these chips is expected to be similar to "Alder Lake," and hence they're built for the same LGA1700 platform.
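For reference, the headline core configuration translates into the following core and thread counts, assuming (as on "Alder Lake") that only the P-cores are Hyper-Threaded.

```python
# Core/thread tally for the "Raptor Lake" configuration described above,
# assuming (as on "Alder Lake") only the P-cores support Hyper-Threading.

p_cores, e_cores = 8, 16
total_cores = p_cores + e_cores           # 8 + 16 = 24 cores
total_threads = p_cores * 2 + e_cores     # 16 + 16 = 32 threads
print(f"{total_cores} cores / {total_threads} threads")  # 24 cores / 32 threads
```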
Sources: Extreme Play (Bilibili), harukaze5719 (Twitter), VideoCardz

76 Comments on Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF

#51
Crackong
Gica: In games, the 12900K looks good. Much better than a 5950X. At 13900K, the performance / consumption ratio will be improved.
I repeat: in games.

The test in the video is the Puget Systems Premiere Pro benchmark. So what consumption are we talking about? At maximum consumption, AMD wins, but loses on overall consumption and rendering time. And Puget has such tests, and at least in the Adobe suite, Intel 12th Gen is the right choice.
I am referring to this pic from the original post.
Comparing only the 13900K vs the 12900K(F), the biggest power jump is from 92 W to 140 W, which is a 50% increase, in Red Dead Redemption 2 at 4K.
In that test the performance increase was like 5%.

I don't see how you could come up with 'At 13900K, the performance / consumption ratio will be improved. I repeat: in games.'
From the original post it is very obvious that they bumped the frequency at the price of power consumption,
and the bumped frequency didn't end well in the game tests.
The increase in min FPS is mainly from the increased cache, as we have seen from the 5800X3D's behaviour.

The 13900K is just a frequency bump of the 12900K plus 8 more E-cores, and it reaches an insane PL4 of 420 W.
It will be an uncoolable 12900K, which by itself is already quite uncoolable.


Posted on Reply
#52
Melvis
@Tigger
ALL core clock, not just 1 or 2 cores like the 12900K, so maybe YOU should actually try reading it... if you think ALL cores at 5.5 GHz won't need special cooling then you're high or something.
Posted on Reply
#53
Why_Me
Melvis: @Tigger
ALL core clock, not just 1 or 2 cores like the 12900K, so maybe YOU should actually try reading it... if you think ALL cores at 5.5 GHz won't need special cooling then you're high or something.
This cpu is going to be the Mad Max car of PC gaming. :cool:

Posted on Reply
#54
ratirt
There is not a lot of performance increase on average; mostly the minimums are higher, which is good. "Refresh of 12th Gen", it would seem, is the best way to describe it.
What tipped me off is the 1.5 kW PSU used o_O. God, I hope it's a coincidence and not a must to run it at full speed with a 3090 Ti.
Posted on Reply
#55
R0H1T
Gica: At maximum consumption, AMD wins, but loses on overall consumption and rendering time.
Like always ~ it depends on the task!
Posted on Reply
#56
SaLaDiN666
The AMD troll brigade squad is probably running a 2600X or something like that.

I myself own a 9900KS and a 5900X.

The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also utilizes AVX.

The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game utilizing one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.
Posted on Reply
#57
Unregistered
SaLaDiN666: The AMD troll brigade squad is probably running a 2600X or something like that.

I myself own a 9900KS and a 5900X.

The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also utilizes AVX.

The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game utilizing one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.
My ADL uses less than 70 W gaming, even in GTA V MP at 1440p ultra settings.
Posted on Reply
#58
Gica
Crackong: I am referring to this pic from the original post.
Comparing only the 13900K vs the 12900K(F), the biggest power jump is from 92 W to 140 W, which is a 50% increase, in Red Dead Redemption 2 at 4K.
In that test the performance increase was like 5%.

I don't see how you could come up with 'At 13900K, the performance / consumption ratio will be improved. I repeat: in games.'
From the original post it is very obvious that they bumped the frequency at the price of power consumption,
and the bumped frequency didn't end well in the game tests.
The increase in min FPS is mainly from the increased cache, as we have seen from the 5800X3D's behaviour.

The 13900K is just a frequency bump of the 12900K plus 8 more E-cores, and it reaches an insane PL4 of 420 W.
It will be an uncoolable 12900K, which by itself is already quite uncoolable.


Peak is not average consumption. As in the picture, the Intel processor's peak is higher, but its average consumption is below AMD's, and it completes the task much faster.
R0H1T: Like always ~ it depends on the task!
As I said before, peak consumption is irrelevant. What matters is the average consumption when performing a task. Even in rendering and video processing, most (creation) tasks do not use the processor to the maximum.
In my example, even though the peak was 57.3 W, the total consumption of the processor was below 8 Wh.
Posted on Reply
#59
Tomorrow
SaLaDiN666: The AMD troll brigade squad is probably running a 2600X or something like that.

I myself own a 9900KS and a 5900X.

The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also utilizes AVX.

The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game utilizing one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.
The 5900X can also be tuned to draw less with Curve Optimizer. To me it's also a pointless SKU. If a person needs multithreaded performance they should get the 5950X; for gaming, the 5600X, the 5800X for mixed workloads, or the 5800X3D for the best gaming experience.
Tigger: My ADL uses less than 70 W gaming, even in GTA V MP at 1440p ultra settings.
And my 5800X3D also uses less than 70 W when gaming.
Posted on Reply
#60
R0H1T
Gica: Peak is not average consumption. As in the picture, the Intel processor's peak is higher, but its average consumption is below AMD's, and it completes the task much faster.


As I said before, peak consumption is irrelevant. What matters is the average consumption when performing a task. Even in rendering and video processing, most (creation) tasks do not use the processor to the maximum.
In my example, even though the peak was 57.3 W, the total consumption of the processor was below 8 Wh.
That's not the peak power consumption of Zen 3, it can easily chew through a lot more! Also you can easily run CB23 for a longer period to average out your power/task ~ bottom line is that Zen 3 at stock is still more efficient than stock Intel 12th Gen though it can get beaten in some tasks wrt perf/W & your results can vary wildly depending on the task & how long it's run!
Posted on Reply
#61
Unregistered
People always quote peak, but how often is anyone's CPU at 100% during gaming or normal use? Peak use means nothing really; it's just numbers to throw at the opposite camp.
Posted on Reply
#62
Crackong
Gica: Peak is not average consumption. As in the picture, the Intel processor's peak is higher, but its average consumption is below AMD's, and it completes the task much faster.


As I said before, peak consumption is irrelevant. What matters is the average consumption when performing a task. Even in rendering and video processing, most (creation) tasks do not use the processor to the maximum.
In my example, even though the peak was 57.3 W, the total consumption of the processor was below 8 Wh.
The original post compares the 13900K vs the 12900K.
Intel vs Intel.
It is completely reasonable to assume the peak consumption increase ~ the average consumption increase in the tested use cases,
since the architecture and process node are largely the same.
Let me say it again:
The original post is doing INTEL VS INTEL.

I don't know why you keep missing the picture here and comparing against an 'imaginary AMD'.
There are no AMD products tested in the original post.

With a sample size of one, the only thing we can do is use its data points and compare Intel vs Intel.
Posted on Reply
#63
Gica
R0H1T: bottom line is that Zen 3 at stock is still more efficient than stock Intel 12th Gen though it can get beaten in some tasks wrt perf/W & your results can vary wildly depending on the task & how long it's run!
Not in gaming, not in the Photoshop suite, not in CAD, and not in many others.
Intel 12th Gen consumes a great deal in very heavy tasks but is extremely efficient in others. To determine total consumption, average this consumption over the whole session. If I drink three beers today and one tomorrow, you can't say I drink three beers every day.

Posted on Reply
#64
InquisitorLXXI
Has there been any buzz about an “S” version of the 13900K coming out?
Posted on Reply
#65
spnidel
Bjorn_Of_Iceland: 1080p gaming kek
what do you mean? I bought a 12900k and 3090 ti to play games at 720p!
Posted on Reply
#66
Unregistered
Gica: Not in gaming, not in the Photoshop suite, not in CAD, and not in many others.
Intel 12th Gen consumes a great deal in very heavy tasks but is extremely efficient in others. To determine total consumption, average this consumption over the whole session. If I drink three beers today and one tomorrow, you can't say I drink three beers every day.

Like I said, my 12700K uses less than 70 W gaming, usually below 60 W. The 12700K also seems to show lower CPU usage than both the others for more FPS (close to the X3D), which is pretty good.
#67
r9
So this will be a 10 nm CPU that should compete with 5 nm "Zen 4" Ryzen.
Intel's CPU architecture is solid, but you just can't overcome the manufacturing node deficit with solid architecture alone.
Posted on Reply
#68
Count von Schwalbe
Nocturnus Moderatus
Pity that AMD had to become the focus of this thread.

Does anyone else find it peculiar that reviewers use 3090 (Ti) GPUs for testing CPUs at 1080p?

I guess it doesn't make too much difference, especially if you are a random person on a Chinese tech forum instead of a W1zzard...
Posted on Reply
#69
Vayra86
Tigger: Like I said, my 12700K uses less than 70 W gaming, usually below 60 W. The 12700K also seems to show lower CPU usage than both the others for more FPS (close to the X3D), which is pretty good.
This has realistically been true since what, Sandy Bridge? I saw 60 W on my 3570K. I'm seeing about 65 W-ish on my 8700K.

Is it relevant? Depends on your perspective, doesn't it? But I don't think it is relevant in the context of Intel pushing a 5.5 GHz clock on the top-end model, which, as we know today, much like the 12900K(F), is a POS to keep cool unless you limit it somehow.

You keep talking about your super-efficient gaming 12700K as if it's a bench-topping beast, but it's not a 12900K. And it's certainly not a 12900K being pushed in any possible form at 1440p ultra. So what do you really know?! Especially because you pair it with an ancient 1080 Ti. You don't even have the hardware to push a 12700K to its limit in any game, lol. Mighty efficient indeed, at 20% load... so that puts your 'low' 70 W in some real perspective right there. This topic is about a successor to the 12900K at peak clocks pushing the fattest GPU you can find. You're brutally off topic every time you post about how efficient your CPU is while practically idling, and then you complain about other people making silly comparisons ;)
Posted on Reply
#70
Gica
History of consumption - the medieval era of computers (follow the notes in yellow)
Now, idle means 25 W and YouTube playback 45 W in that same material. Total consumption. If no one died then, surely no one will die now. In those days, AMD fans hid consumption and attacked with prices. You attack with what you have at hand and hide the messes; the important thing is to appear smart.
We have high-performance processors for the demands of a home user. It's not wrong to choose a company, Intel or AMD; it's only wrong if you choose badly, and badly means choosing a processor totally out of step with your needs.
Posted on Reply
#71
P4-630
Some Cinebench scores

Posted on Reply