Monday, July 18th 2022

Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF

Intel's 13th Gen Core "Raptor Lake" is shaping up to be another leadership desktop processor lineup, with an engineering sample posting significant increases in gaming minimum framerates over the preceding 12th Gen Core i9-12900K "Alder Lake." Extreme Player, a tech-blogger on Chinese video streaming site Bilibili, posted a comprehensive gaming performance review of an i9-13900K engineering sample covering eight games across three resolutions, comparing it with a retail i9-12900KF. The games include CS:GO, Final Fantasy XIV: Endwalker, PUBG, Forza Horizon 5, Far Cry 6, Red Dead Redemption 2, Horizon Zero Dawn, and Monster Hunter Rise, alongside the synthetic benchmark 3DMark. Both processors were tested with a GeForce RTX 3090 Ti graphics card, 32 GB of DDR5-6400 memory, and a 1.5 kW power supply.

The i9-13900K ES is shown posting narrow performance leads of 1% to 2% in the graphics tests of 3DMark, but an incredible 36% to 38% gain in the suite's CPU-intensive tests. This is explained not just by the increased per-core performance of both the P-cores and E-cores, but also by the addition of 8 more E-cores. Although "Raptor Lake" uses the same "Gracemont" E-cores, the L2 cache per E-core cluster has been doubled. Horizon Zero Dawn sees anywhere from a 0.7% decrease to a 10.98% increase in frame rates. There are some anomalous 70% frame-rate increases in RDR2; discounting those, we still see a 2-9% increase. FC6 posts a modest 2.4% increase. Forza Horizon 5, PUBG, Monster Hunter Rise, and FF XIV each report significant increases in minimum framerates, well above 20%.
The second graph below shows the highlight of these tests: significant increases in minimum frame-rates. Averaged across tests, the i9-13900K ES posts an 11.65% minimum FPS gain at 4K UHD, a 21.84% gain at 1440p, and a 27.99% gain at 1080p.
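Those per-resolution figures are straightforward arithmetic means of the per-game gains. Here's a minimal Python sketch of the calculation; the FPS numbers in it are illustrative placeholders, not the video's actual data:

```python
# Sketch: turn per-game minimum-FPS results into a single average gain figure.
# All numbers below are illustrative placeholders, not the video's data.

def pct_gain(new_fps: float, old_fps: float) -> float:
    """Percentage gain of new over old."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical 1080p minimum FPS pairs: (i9-13900K ES, i9-12900KF)
min_fps_1080p = {
    "CS:GO":             (450, 350),
    "Far Cry 6":         (128, 125),
    "Horizon Zero Dawn": (105, 95),
}

gains = [pct_gain(es, retail) for es, retail in min_fps_1080p.values()]
print(f"Average min-FPS gain: {sum(gains) / len(gains):.2f}%")
```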

A big caveat with all this testing is the CPU clock speeds. Engineering samples tend not to come with the clock speeds or boosting behavior of retail processors, and hence don't correctly reflect the end product, although some ES chips do come with unlocked multipliers. In this testing, the i9-13900K ES was set to a 5.50 GHz all-core maximum P-core clock, on the assumption that 5.50 GHz will be the max boost frequency of the retail chip. It was compared with an i9-12900KF that boosts up to 5.20 GHz on the P-cores but was running at 4.90 GHz all-core.
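For context, the all-core clock gap alone accounts for a double-digit percentage difference. A quick back-of-the-envelope check, assuming (generously) that gaming performance scales at most linearly with clock:

```python
# All-core P-core clocks used in the test, per the source.
es_clock_ghz = 5.5  # i9-13900K ES, fixed all-core
kf_clock_ghz = 4.9  # i9-12900KF, observed all-core

advantage_pct = (es_clock_ghz / kf_clock_ghz - 1) * 100
print(f"All-core clock advantage: {advantage_pct:.1f}%")  # ~12.2%
# So a sizeable slice of the minimum-FPS gains could come from clocks alone,
# before crediting the extra E-cores or the enlarged caches.
```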

The i9-13900K ES was also subjected to power-consumption testing, where it posted significantly higher peak gaming power draw than the retail i9-12900KF. A retail i9-13900K will likely come with lower power consumption than what is shown here, as it will follow the boosting behavior typical of retail chips at stock frequencies, unlike an ES that has been pinned to a set frequency.

Intel is preparing to launch its 13th Gen Core "Raptor Lake" processor family in the second half of 2022. This period could also see rival AMD introduce its Ryzen 7000 "Zen 4" processors. "Raptor Lake" combines 8 "Raptor Cove" P-cores with 16 "Gracemont" E-cores, and additional L2 cache for both core types. The I/O of these chips is expected to be similar to "Alder Lake," and hence they're built for the same LGA1700 platform.
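In thread terms, that core configuration works out as below (a quick sketch assuming Hyper-Threading on the P-cores only, as on "Alder Lake"):

```python
# Thread count for the rumored i9-13900K configuration.
# Assumes Hyper-Threading on P-cores only, as on "Alder Lake".
p_cores, e_cores = 8, 16
threads = p_cores * 2 + e_cores  # E-cores are single-threaded
print(f"{p_cores}P + {e_cores}E = {p_cores + e_cores} cores / {threads} threads")
# -> 8P + 16E = 24 cores / 32 threads
```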
Sources: Extreme Player (Bilibili), harukaze5719 (Twitter), VideoCardz

76 Comments on Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF

#26
ZoneDymo
Tigger: "Woo Hoo Gooooooo Intel. Nice, hopefully the 13700k will be just as good.

People run GPUs that are twice the power use of the CPU and don't mind. I don't give a crap about power use as long as I can cool it, and it is fast. If you don't want high power use, don't build a high-end gaming rig, simple."
Or perhaps demand better from the manufacturers? Just a thought, of course.
Posted on Reply
#27
Hossein Almet
Couldn't care less whether the framerate increases by 25% or 35%, if the power consumption also increases accordingly.
Posted on Reply
#28
ratirt
Hossein Almet: "Couldn't care less whether the framerate increases by 25% or 35%, if the power consumption also increases accordingly."
I'm not judging anything as of now but considering Intel, I would not be surprised if the power goes up as well.
Posted on Reply
#29
Melvis
Ok wait, so this 13900K is all-core 5.5 GHz? So it's OC'd? Most likely running on liquid nitrogen to keep all those mega-hot cores at 5.5 GHz, and what, using 600 W? Pointless results are pointless?
Posted on Reply
#30
BorisDG
Arkz: "Ya know people use 240Hz 1080p screens right?"
I'm on 1440p / 240Hz. 1080p is low for 2022 gaming.
Posted on Reply
#31
gffermari
looool
i9s are a joke. Intel keeps doing the same thing, increasing the power consumption until they get half an FPS over the AMD counterparts... Ridiculous.
A 4.8 GHz 5800X3D will destroy all these jokes of CPUs.

On the other hand, the i5 and i7 are very good cpus and are the real threat to AMD.
Posted on Reply
#32
Bomby569
gffermari: "looool
i9s are a joke. Intel keeps doing the same thing, increasing the power consumption until they get half an FPS over the AMD counterparts... Ridiculous.
A 4.8 GHz 5800X3D will destroy all these jokes of CPUs.

On the other hand, the i5 and i7 are very good CPUs and are the real threat to AMD."
Even the i3 is a very competitive offer for budget gaming. I have to agree it's at the very high end (i9) that they have a problem; with the rest, not so much, and pricing is even better on Intel.
Posted on Reply
#33
BorisDG
gffermari: "looool
i9s are a joke. Intel keeps doing the same thing, increasing the power consumption until they get half an FPS over the AMD counterparts... Ridiculous.
A 4.8 GHz 5800X3D will destroy all these jokes of CPUs.

On the other hand, the i5 and i7 are very good CPUs and are the real threat to AMD."
Oh... well... I see you are a 5800X3D owner. No wonder we got such a comment.
Posted on Reply
#34
ratirt
Melvis: "Ok wait, so this 13900K is all-core 5.5 GHz? So it's OC'd? Most likely running on liquid nitrogen to keep all those mega-hot cores at 5.5 GHz, and what, using 600 W? Pointless results are pointless?"
The new Intel CPU uses the same node, so I guess it's obvious where the increase in clock speed is coming from, or at least you get the idea.
Anyway, we will see all of it when the reviews come.
Posted on Reply
#35
PapaTaipei
Btw, I thought smaller nodes = lower power consumption. Does this mean lower consumption per transistor, and so higher consumption overall on the whole chip due to more transistors per square mm? I know this chip uses the same node as the 12xxx series, just asking in general how it works.
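For what it's worth, the usual first-order model is that dynamic power scales as P ≈ C·V²·f: a shrink cuts capacitance (and often voltage) per transistor, but more transistors and higher clocks can still push whole-chip power up. A toy sketch of that trade-off, with all figures made up purely to illustrate the scaling:

```python
# Toy model of dynamic switching power: P = C * V^2 * f.
# All figures are made up to illustrate the scaling, not real silicon data.

def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
    return cap * volts**2 * freq_ghz

old = dynamic_power(cap=1.0, volts=1.25, freq_ghz=4.9)           # baseline node
new_per_xtor = dynamic_power(cap=0.7, volts=1.20, freq_ghz=5.5)  # hypothetical shrink
new_chip = new_per_xtor * 1.5  # but 50% more transistors switching

print(f"Per-transistor power: {new_per_xtor / old - 1:+.1%}")  # down ~28%
print(f"Whole-chip power:     {new_chip / old - 1:+.1%}")      # up ~9%
```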
Posted on Reply
#36
Veseleil
"people who don't have one say that crap."

Yeah, we were talking about top end SKUs, not the step below which has lower boost.

"He commented without actually reading"

Actually reading what, a rumor?

I've had more Intel CPU-based systems than AMD ones for two decades already, and I still do.
My 3600 isn't as cool as I would expect, due to a static OC (1.3 V), as PBO and other boost stuff doesn't work for me.
Posted on Reply
#37
QuietBob
The video claims that the 13900K can be overclocked to 5.5 GHz on the P-cores using a 360mm AIO, which would be a feat in itself. Even if such frequencies could only be sustained in games, which hardly ever peg 16 threads at 100%, it's already an improvement over Alder Lake. Games should also benefit from Raptor Lake's increased cache.

However, I question the methodology of these tests. The 12900KF was gimped by setting it to an all-core 4.9 GHz, which is below the 5.1-5.2 GHz boost. Games in general do not benefit from manual overclocks, as they cannot fully utilize all the available threads. The 12900KF would have performed better with stock settings here, especially in ST limited titles.

The 13900K, on the other hand, was given an unfair advantage with a 5.5 GHz overclock across all cores, which is assumed to be its maximum boost frequency. And the perf/watt looks atrocious. I have no idea what quality settings were used, but here's a quick reference: a stock 5800X3D peaks at 71 W and averages 51 W in the Forza Horizon 5 1080p benchmark, at maximum detail, courtesy of a 6600 XT.
Posted on Reply
#38
Bomby569
QuietBob: "A stock 5800X3D peaks at 71 W and averages 51 W in the Forza Horizon 5 1080p benchmark, at maximum detail, courtesy of a 6600 XT."
That doesn't say much; it depends on whether the 6600 XT is bottlenecking the 5800X3D and it's just sitting there. Utilisation is what you need for those kinds of arguments.
Posted on Reply
#39
phanbuey
Tigger: "Woo Hoo Gooooooo Intel. Nice, hopefully the 13700k will be just as good.

People run GPUs that are twice the power use of the CPU and don't mind. I don't give a crap about power use as long as I can cool it, and it is fast. If you don't want high power use, don't build a high-end gaming rig, simple."
I love all the power comments lol -- same wattage, and these don't use much power at all during gaming. It's honestly just the cache bump - this arch is so memory starved it's insane - tiny tweaks to memory yield massive framerate increases.

Definitely looking forward to the 13700k.
Posted on Reply
#40
Tomorrow
Well, regarding power: you can say whatever you want about the 5800X3D's performance, but its power consumption figures are amazing. For the performance it puts out, it literally sips power. Even better is the fact that its performance and power efficiency actually improve when undervolted via Curve Optimizer. Even AMD's own other designs, not to mention Intel, can't match this performance per watt.
Posted on Reply
#41
aQi
AlwaysHope: "Yep & all that from one Chinese source.... still won't stop the comments!"
It's always Chinese sellers messing around and then selling ES/QS chips on eBay. Now they'll be stocking ES of Arc GPUs too.
Posted on Reply
#42
95Viper
Stay on topic.
Stop the insults.
Posted on Reply
#43
Hofnaerrchen
With my power company telling me lately that they have to increase energy prices quite significantly, I couldn't care less about RL. The best decision this year: buying a tablet instead of a new GPU or CPU. With new generations increasing performance at the cost of higher power consumption, I will most likely change my behavior in hardware usage instead of buying new stuff that will make my annual power bill even more terrifying.
Posted on Reply
#44
Bomby569
Hofnaerrchen: "With my power company telling me lately that they have to increase energy prices quite significantly, I couldn't care less about RL. The best decision this year: buying a tablet instead of a new GPU or CPU. With new generations increasing performance at the cost of higher power consumption, I will most likely change my behavior in hardware usage instead of buying new stuff that will make my annual power bill even more terrifying."
If a tablet is enough, an APU is too, and those are cheap and cool and don't draw much power.
Posted on Reply
#45
mechtech
If that's true, I feel bad for anyone who got a 12k chip for gaming lol.
Posted on Reply
#46
QuietBob
Bomby569: "That doesn't say much; it depends on whether the 6600 XT is bottlenecking the 5800X3D and it's just sitting there. Utilisation is what you need for those kinds of arguments."
A stock 5800X3D peaks at 120 W in Cinebench R23. We're talking a 100% all-core AVX load here. Would you expect to see the same power consumption in games?
Posted on Reply
#47
Gica
Crackong: "The biggest increase is power consumption"
In games, the 12900K looks good; much better than a 5950X. With the 13900K, the performance/consumption ratio will be improved.
I repeat: in games.

The test in the video is the Puget Systems Premiere Pro benchmark. So what consumption are we talking about? At maximum consumption AMD wins, but it loses on overall consumption and rendering time. Puget has such tests, and at least in the Adobe suite, Intel's 12th Gen is the right choice.
Posted on Reply
#48
Hofnaerrchen
Bomby569: "If a tablet is enough, an APU is too, and those are cheap and cool and don't draw much power."
Sorry, but NO! Based on your requirements an APU might do the job, sure, but notebooks still draw much more power than tablets, and the devices are bulkier and heavier. It's a question of what you want to do with your device. Not to mention that notebook prices also skyrocket if you want a better-than-average display or a slim design. In my case I can do everything, with the exception of gaming, that I normally used my PC for. I've already moved away from using the PC in many cases, simply because using the tablet is much more convenient.
Posted on Reply
#49
BArms
I have a hard time believing a >10% boost at 4K when 4K is generally GPU-bottlenecked in the extreme.
Posted on Reply
#50
Minus Infinity
ZoneDymo: "I said it once and I'll say it again: increased performance at the cost of more power consumption isn't progress"
Nvidia didn't get the memo either.

Still, it'll be interesting to see the actual power usage comparisons of AL and RL. Intel did show changes to the design of RL that should lower power usage, so performance per watt is probably going to be better. But more cores will mean more power, even if they are just pleb cores. The big test will be 13600K vs 12700K and 13700K vs 12900K for power usage, as they are the same core configs. If RL keeps power the same or better while delivering better performance (the latter is guaranteed, at least), then that's not too bad. Zen 4 will still probably beat it, but Zen 4 is getting large clock speed increases, so I'll bet it uses more power than Zen 3 for sure.

I'm of the opinion it will make little difference which CPU you get, except that the 13900K will be power hungry. The 13700K/13600K will be much more desirable IMO. I'm still leaning toward Zen 4, since socket AM5 will probably be around until Zen 7 is released. Meteor Lake means an all-new motherboard again.
Posted on Reply