
Intel i9-13900K "Raptor Lake" ES Improves Gaming Minimum Framerates by 11-27% Over i9-12900KF

Joined
Dec 26, 2006
Messages
3,862 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
If that's true, I feel bad for anyone who got a 12k chip for gaming lol.
 
Joined
Sep 21, 2020
Messages
1,677 (1.08/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
That doesn't say much; it depends on whether the 6600 is bottlenecking the 5800X3D and the CPU is just sitting there. Utilisation is what you need for that kind of argument.
A stock 5800X3D peaks at 120 W in Cinebench R23. We're talking a 100% all-core AVX load here. Would you expect to see the same power consumption in games?
 
Joined
Jun 6, 2022
Messages
622 (0.67/day)
The biggest increase is power consumption
In games, 12900K looks good. Much better than a 5950X. At 13900K, the performance / consumption ratio will be improved.
I repeat: in games.

The test in the video is the Puget Systems Premiere Pro benchmark. So which consumption are we talking about? On peak draw AMD wins, but it loses on total energy consumed and on rendering time. Puget has such tests, and at least in the Adobe suite, Intel 12th gen is the right choice.
 

Attachments

  • Clipboard01.jpg (300.5 KB)
Last edited:
Joined
Jun 20, 2022
Messages
302 (0.33/day)
Location
Germany
System Name Galaxy Tab S8+
Processor Snapdragon 8 gen 1 SOC
Cooling passive
Memory 8 GB
Storage 256 GB + 512 GB SD
Display(s) 2.800 x 1.752 Super AMOLED
Power Supply 10.090 mAh
Software Android 12
If a tablet is enough, an APU is too, and those are cheap, cool, and don't draw much power.

Sorry, but NO! Based on your requirements an APU might do the job, sure, but notebooks still draw much more power than tablets, and the devices are bulkier and heavier. It's a question of what you want to do with your device. Not to mention that notebook prices also skyrocket if you want a better-than-average display or a slim design. In my case I can do everything, with the exception of gaming, that I normally used my PC for. I've already moved away from using the PC in many cases, simply because the tablet is much more convenient.
 
Joined
Jan 5, 2017
Messages
308 (0.11/day)
System Name Main
Processor 8700K
Motherboard Maximus Hero X
Cooling EVGA 280 CLC w/ Noctua silent fans
Memory 2x8GB 3600/16
Video Card(s) EVGA 2080TI Hybrid
I have a hard time believing a >10% boost at 4K when 4K is generally GPU-bottlenecked in the extreme.
 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
I said it once and I'll say it again: increased performance at the cost of more power consumption isn't progress.
Nvidia didn't get the memo either.

Still it'll be interesting to see the actual power usage comparisons of AL and RL. Intel did show changes to the design of RL that would lower power usage, so performance per watt is probably going to be better. But more cores will mean more power even if they are just pleb cores. The big test will be 13600K vs 12700K and 13700K vs 12900K for power usage, as they are the same core configs. If RL keeps power to the same or better with better performance (this part is guaranteed at least) then that's not too bad. Zen 4 will still probably beat it, but Zen 4 is getting large clock speed increases, so I'll bet it uses more power than Zen 3 for sure.

I'm of the opinion it will make little difference which CPU you get, except that the 13900K will be power hungry. The 13700/13600 will be much more desirable IMO. I'm still leaning toward Zen 4, since socket AM5 will probably be around until Zen 7 is released; Meteor Lake means an all-new motherboard again.
 
Joined
Feb 15, 2019
Messages
1,666 (0.78/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
In games, 12900K looks good. Much better than a 5950X. At 13900K, the performance / consumption ratio will be improved.
I repeat: in games.

The test in the video is the Puget Systems Premiere Pro benchmark. So which consumption are we talking about? On peak draw AMD wins, but it loses on total energy consumed and on rendering time. Puget has such tests, and at least in the Adobe suite, Intel 12th gen is the right choice.
I am referring to this pic from the original post.
Comparing only the 13900K vs the 12900K(F), the biggest power jump is from 92 W to 140 W, a roughly 50% increase, in Red Dead at 4K.
In that test the performance increase was about 5%.

I don't see how you could come up with 'At 13900K, the performance / consumption ratio will be improved. I repeat: in games.'
From the original post
it is very obvious that they bumped the frequency at the price of power consumption,
and the bumped frequency didn't end well in the game tests.
The increase in min FPS is mainly from the increased cache, as we have seen from the 5800X3D's behaviour.

The 13900K is just a frequency bump of the 12900K plus 8 more E-cores, reaching an insane PL4 of 420 W.
It will be an uncoolable 12900K, and the 12900K by itself is already quite hard to cool.
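The arithmetic in this comparison can be sanity-checked with a quick sketch. The power figures are taken from the quoted screenshot; the FPS values are a normalized baseline with the ~5% uplift mentioned above, i.e. illustrative assumptions, not measurements:

```python
# Perf-per-watt sketch for the quoted Red Dead 4K numbers.
# Power figures come from the screenshot quoted above; the FPS values
# are a normalized baseline with an assumed ~5% uplift.
power_12900k = 92.0    # W while gaming
power_13900k = 140.0   # W while gaming
fps_12900k = 100.0     # normalized baseline
fps_13900k = 105.0     # ~5% faster

power_increase = power_13900k / power_12900k - 1
perf_increase = fps_13900k / fps_12900k - 1
efficiency_change = (fps_13900k / power_13900k) / (fps_12900k / power_12900k) - 1

print(f"power +{power_increase:.0%}, perf +{perf_increase:.0%}, "
      f"perf/W {efficiency_change:+.0%}")
# → power +52%, perf +5%, perf/W -31%
```

In other words, on these assumed numbers a ~5% FPS gain bought with ~52% more power works out to roughly 31% worse gaming efficiency, which is the point being argued.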


 
Joined
Mar 18, 2008
Messages
5,444 (0.89/day)
Location
Australia
System Name Night Rider | Mini LAN PC | Workhorse
Processor AMD R7 5800X3D | Ryzen 1600X | i7 970
Motherboard MSi AM4 Pro Carbon | GA- | Gigabyte EX58-UD5
Cooling Noctua U9S Twin Fan| Stock Cooler, Copper Core)| Big shairkan B
Memory 2x8GB DDR4 G.Skill Ripjaws 3600MHz| 2x8GB Corsair 3000 | 6x2GB DDR3 1300 Corsair
Video Card(s) MSI AMD 6750XT | 6500XT | MSI RX 580 8GB
Storage 1TB WD Black NVME / 250GB SSD /2TB WD Black | 500GB SSD WD, 2x1TB, 1x750 | WD 500 SSD/Seagate 320
Display(s) LG 27" 1440P| Samsung 20" S20C300L/DELL 15" | 22" DELL/19"DELL
Case LIAN LI PC-18 | Mini ATX Case (custom) | Atrix C4 9001
Audio Device(s) Onboard | Onbaord | Onboard
Power Supply Silverstone 850 | Silverstone Mini 450W | Corsair CX-750
Mouse Coolermaster Pro | Rapoo V900 | Gigabyte 6850X
Keyboard MAX Keyboard Nighthawk X8 | Creative Fatal1ty eluminx | Some POS Logitech
Software Windows 10 Pro 64 | Windows 10 Pro 64 | Windows 7 Pro 64/Windows 10 Home
@Tigger
ALL-core clock, not just 1 or 2 cores like the 12900K, so maybe YOU should actually try reading it... if you think ALL cores at 5.5 GHz won't need special cooling then you're high or something.
 
Joined
Jan 29, 2021
Messages
1,876 (1.32/day)
Location
Alaska USA
@Tigger
ALL-core clock, not just 1 or 2 cores like the 12900K, so maybe YOU should actually try reading it... if you think ALL cores at 5.5 GHz won't need special cooling then you're high or something.
This cpu is going to be the Mad Max car of PC gaming. :cool:

 
Joined
May 31, 2016
Messages
4,443 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
There is not a lot of performance increase on average; mostly the minimums are higher, which is good. A refresh of 12th gen, it would seem, is the best way to describe it.
What tipped me off is the 1.5 kW PSU used o_O. God I hope it's a coincidence and not a must to run it at full speed with a 3090 Ti.
 
Joined
Nov 6, 2019
Messages
38 (0.02/day)
The AMD troll brigade is probably running a 2600X or something like that.

I myself own a 9900KS and a 5900X.

The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also utilizes AVX.

The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game utilizing one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.
 

Deleted member 24505

Guest
The AMD troll brigade is probably running a 2600X or something like that.

I myself own a 9900KS and a 5900X.

The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also utilizes AVX.

The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game utilizing one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.

My ADL uses less than 70 W gaming, even GTA V MP at 1440p ultra settings.
 
Joined
Jun 6, 2022
Messages
622 (0.67/day)
I am referring to this pic from the original post.
Comparing only the 13900K vs the 12900K(F), the biggest power jump is from 92 W to 140 W, a roughly 50% increase, in Red Dead at 4K.
In that test the performance increase was about 5%.

I don't see how you could come up with 'At 13900K, the performance / consumption ratio will be improved. I repeat: in games.'
From the original post
it is very obvious that they bumped the frequency at the price of power consumption,
and the bumped frequency didn't end well in the game tests.
The increase in min FPS is mainly from the increased cache, as we have seen from the 5800X3D's behaviour.

The 13900K is just a frequency bump of the 12900K plus 8 more E-cores, reaching an insane PL4 of 420 W.
It will be an uncoolable 12900K, and the 12900K by itself is already quite hard to cool.


Peak is not average consumption. As the picture shows, the Intel processor's peak is higher, but its average consumption is below AMD's and it finishes the task much faster.

Like always ~ it depends on the task!
As I said before, the peak consumption is irrelevant. What matters is the average consumption while performing a task. Even in rendering and video processing, most tasks (creation) do not push the processor to the maximum.
In my example, even though consumption peaked at 57.3 W, the total energy used by the processor was below 8 Wh.
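The distinction between peak power and energy per task can be illustrated with a toy calculation. The per-second power samples below are invented for illustration only, not measurements:

```python
# Toy illustration: a brief power spike barely moves total energy used.
# These per-second package-power samples (W) are invented, not measured.
samples = [30.0, 35.0, 57.3, 40.0, 28.0, 25.0, 22.0, 20.0]

peak_w = max(samples)                    # the headline "peak" number
avg_w = sum(samples) / len(samples)      # what actually drives energy use
energy_wh = avg_w * len(samples) / 3600  # energy = average power x time

print(f"peak {peak_w:.1f} W, average {avg_w:.1f} W, energy {energy_wh:.4f} Wh")
# → peak 57.3 W, average 32.2 W, energy 0.0715 Wh
```

The one-second 57.3 W spike dominates the peak reading, yet the energy drawn over the whole job is set by the ~32 W average, which is the point about averaging across the session.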
 

Attachments

  • Clipboard02.jpg (282.8 KB)
Joined
Aug 21, 2013
Messages
1,939 (0.47/day)
The AMD troll brigade is probably running a 2600X or something like that.

I myself own a 9900KS and a 5900X.

The 9900KS's power consumption in games is about 50-70 W most of the time, sometimes 100 W when playing something heavy that also utilizes AVX.

The 5900X consumes 90-110 W CONSTANTLY. If I play a pathetic old-ass game utilizing one core, it doesn't matter: 90 W. If I play something newer like BFV, boom, 110 W in MP.
5900X can also be tuned to draw less with Curve Optimizer. To me it's also a pointless SKU. If a person needs multithreading perf they should get 5950X and if they want gaming they get 5600X or 5800X for mixed workloads or 5800X3D for best gaming.
My ADL uses less than 70 W gaming, even GTA V MP at 1440p ultra settings.
And my 5800X3D also uses less than 70W when gaming.
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
Peak is not average consumption. As the picture shows, the Intel processor's peak is higher, but its average consumption is below AMD's and it finishes the task much faster.


As I said before, the peak consumption is irrelevant. What matters is the average consumption while performing a task. Even in rendering and video processing, most tasks (creation) do not push the processor to the maximum.
In my example, even though consumption peaked at 57.3 W, the total energy used by the processor was below 8 Wh.
That's not the peak power consumption of Zen 3; it can easily chew through a lot more! Also, you can easily run CB23 for a longer period to average out your power per task. Bottom line: Zen 3 at stock is still more efficient than stock Intel 12th gen, though it can get beaten in some tasks wrt perf/W, and your results can vary wildly depending on the task and how long it's run!
 

Deleted member 24505

Guest
People always quote peak, but how often is anyone's CPU at 100% during gaming or normal use? Peak use means nothing really, just numbers to throw at the opposite camp.
 
Joined
Feb 15, 2019
Messages
1,666 (0.78/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
Peak is not average consumption. As the picture shows, the Intel processor's peak is higher, but its average consumption is below AMD's and it finishes the task much faster.


As I said before, the peak consumption is irrelevant. What matters is the average consumption while performing a task. Even in rendering and video processing, most tasks (creation) do not push the processor to the maximum.
In my example, even though consumption peaked at 57.3 W, the total energy used by the processor was below 8 Wh.
The original post compares the 13900K vs the 12900K.
Intel vs Intel.
It is completely reasonable to assume the peak consumption increase roughly tracks the average consumption increase in the tested use cases, since the architecture and process node are largely the same.
Let me say it again:
the original post is doing INTEL VS INTEL.

I don't know why you keep missing the picture here and comparing against an 'imaginary AMD'.
There are no AMD products tested in the original post.

With a sample size of one, the only thing we can do is use its data point to compare Intel vs Intel.
 
Last edited:
Joined
Jun 6, 2022
Messages
622 (0.67/day)
Bottom line: Zen 3 at stock is still more efficient than stock Intel 12th gen, though it can get beaten in some tasks wrt perf/W, and your results can vary wildly depending on the task and how long it's run!
Not in gaming, not in the Photoshop suite, not in CAD, and not in many others.
Intel 12th gen consumes a great deal in very heavy tasks but is extremely efficient in others. To determine total consumption, average it across the whole session. If I drink three beers today and one tomorrow, you can't say I drink three beers every day.

 
Last edited:
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W

Deleted member 24505

Guest
Not in gaming, not in the Photoshop suite, not in CAD, and not in many others.
Intel 12th gen consumes a great deal in very heavy tasks but is extremely efficient in others. To determine total consumption, average it across the whole session. If I drink three beers today and one tomorrow, you can't say I drink three beers every day.


Like I said, my 12700K uses less than 70 W gaming, usually below 60 W. The 12700K also seems to show lower CPU usage than both of the others while delivering more FPS (close to the X3D), which is pretty good.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
So this will be a 10 nm CPU that has to compete with 5 nm Zen 4.
Intel's CPU architecture is solid, but you can't overcome a manufacturing-node deficit with solid architecture alone.
 

Count von Schwalbe

Nocturnus Moderatus
Staff member
Joined
Nov 15, 2021
Messages
3,179 (2.80/day)
Location
Knoxville, TN, USA
System Name Work Computer | Unfinished Computer
Processor Core i7-6700 | Ryzen 5 5600X
Motherboard Dell Q170 | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | RTX 2080 Ti FE
Storage Crucial BX500 2TB | TBD
Display(s) 3x LG QHD 32" GSM5B96 | TBD
Case Dell | Heavily Modified Phanteks P400
Power Supply Dell TFX Non-standard | EVGA BQ 650W
Mouse Monster No-Name $7 Gaming Mouse| TBD
Pity that AMD had to become the focus of this thread.

Does anyone else find it peculiar that reviewers use 3090 (ti) GPU's for testing CPU's at 1080p?

I guess it doesn't make too much difference, especially if you are a random person on a Chinese tech forum instead of a W1zzard...
 
Joined
Sep 17, 2014
Messages
22,684 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Like I said, my 12700K uses less than 70 W gaming, usually below 60 W. The 12700K also seems to show lower CPU usage than both of the others while delivering more FPS (close to the X3D), which is pretty good.

This has realistically been true since, what, Sandy Bridge? I saw 60 W on my 3570K. I'm seeing about 65-ish on my 8700K.

Is it relevant? Depends on your perspective, doesn't it? But I don't think it is relevant in the context of Intel pushing a 5.5 GHz clock on the top-end model which, as we know today, much like the 12900K(F), is a POS to keep cool unless you limit it somehow.

You keep talking about your super-efficient gaming 12700K as if it's a bench-topping beast, but it's not a 12900K. And it's certainly not a 12900K being pushed in any possible form at 1440p ultra. So what do you really know?! Especially because you push an ancient 1080 Ti with it. You don't even have the hardware to push a 12700K to its limit in any game, lol. Mighty efficient indeed, at 20% load... so that puts your 'low' 70 W in some real perspective right there. This topic is about a successor to the 12900K at peak clocks pushing the fattest GPU you can find. You're brutally off topic every time you post about how efficient your CPU is while practically idling, and then you complain about other people making silly comparisons ;)
 
Last edited: