
Intel Core i5-14600K Benchmarked

Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Synthetic performance, yes. Theory vs. practice. How often can you actually extract that performance, I wonder? When it comes to gaming, we see plateaus of performance more than major jumps. And of course, I was partly joking about it...

But now consider the fact that these chips turbo to double the wattage, too. I doubt there are many real perf/W improvements in a vast number of workloads when both CPUs run stock.
Seriously, wtf?

You don't think there are many perf/W improvements? A 12900K at 35 W is much faster than an 8700K at stock running at, what, 140 W if I remember correctly? How much improvement do you expect? The 14600K should be faster than your 8700K while consuming a quarter of the wattage. That is INSANE, actually. If we had that kind of progress in any other device over that short a time, air conditioners would be consuming 50 W by now while blasting at full load.
 
Joined
Nov 13, 2007
Messages
10,642 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.4GHZ UV - 220W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Seriously, wtf?

You don't think there are many perf/W improvements? A 12900K at 35 W is much faster than an 8700K at stock running at, what, 140 W if I remember correctly? How much improvement do you expect? The 14600K should be faster than your 8700K while consuming a quarter of the wattage. That is INSANE, actually. If we had that kind of progress in any other device over that short a time, air conditioners would be consuming 50 W by now while blasting at full load.

This. My overclocked 8700K at 5.1 GHz hit 190 W in stress tests.
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
This. My overclocked 8700K at 5.1 GHz hit 190 W in stress tests.
Yeah, but at stock I remember mine hovering around 130-140 W. I'm sure a 14600K at the same 130-140 W would just smack it senseless.

This. My overclocked 8700K at 5.1 GHz hit 190 W in stress tests.
Ok, so I ran the numbers: a stock 8700K will score around 9-9.5k in CBR23, depending on whether or not you consider MCE stock behavior. For context, MCE on Coffee Lake didn't just disable power limits like it does nowadays; it also ran all cores at the single-core turbo clock, so basically 4.7 GHz. Anyway, a 13600K scores 23k in CBR23 at 125 W, so it is 2.6 times faster while consuming less power. A 12900K scores 24,500 and a 13900K scores 32k. So yeah, efficiency has improved drastically.

Now, if we want to make that comparison at ISO performance: a 13600K should match the 8700K's performance at around 30-35 W, so basically 1/4th to 1/5th of the power consumption. If that doesn't impress you, I don't know what to tell you.
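To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python, using the scores and wattages quoted above; the 30-35 W ISO-performance figure is the estimate from this post, not a measurement:

```python
# Back-of-the-envelope check of the two efficiency claims above,
# using the CBR23 scores and wattages quoted in this thread (approximate).

score_8700k, watts_8700k = 9_250, 135      # ~9-9.5k at ~130-140 W stock
score_13600k, watts_13600k = 23_000, 125   # as quoted, at a 125 W limit

# Roughly iso-power (both near 125-135 W): how much faster is the 13600K?
print(f"speedup: {score_13600k / score_8700k:.1f}x")   # ~2.5x

# Points per watt, normalized to the 8700K:
eff = (score_13600k / watts_13600k) / (score_8700k / watts_8700k)
print(f"perf-per-watt ratio: {eff:.1f}x")              # ~2.7x

# ISO performance: estimated 30-35 W for the 13600K to match the 8700K.
print(f"power at ISO perf: ~{32.5 / watts_8700k:.0%} of the 8700K's draw")
```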
 
Joined
Nov 13, 2007
Messages
10,642 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.4GHZ UV - 220W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.


Bone-stock 13700KF, small -25 mV undervolt, air-cooled by a Peerless Assassin and a KryoSheet: 201 W, 30K.

If they can add efficiency with the new voltage-regulation improvements, 160 W might actually be high for that 14600K.
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Don't worry guys, I get the point ;) At the same time, we know all Intel CPUs these days clock high and then operate way beyond their optimal V/F curve. The efficiency is there, I won't deny it, but there is also a lot of waste. And I'm also looking primarily at gaming; in other workloads, especially synthetics, you will see the full depth of the advantage between CPUs, but in most general use cases, you won't.

And then we get to gaming, where the CPU is brutally inefficient overall, as proven by the stellar wattage numbers produced by a 7800X3D in gaming, easily a half to a third of an equally performant Intel CPU. I underline this point because a lot of what we perceive to be advantages and improvements don't always pay off in practice, but at the same time, we do run CPUs now that are capable of burning twice the wattage we used to have.

There's a space there we aren't seeing in reviews/testing, for sure. And I'm not saying this just because I have an 8700K. I'm saying it because that 8700K still uses around 70-90 W in most games today, even on a 7900XT, even in titles where I am severely CPU-limited, and even then I'm still getting FPS remarkably close to the latest and greatest. The desire to upgrade my CPU isn't that big, even though I know I could get more than 20% higher frames in gaming here and there. CPUs got faster, CPUs got more efficient, but that efficiency is definitely not visible enough to say it applies everywhere.
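To illustrate the "way beyond the optimal V/F curve" point above: dynamic CPU power scales roughly with C·V²·f, and voltage has to rise steeply near the top of the frequency range, so the last few hundred MHz cost far more power than the performance they return. A toy model; the V/F points below are illustrative, not measurements of any specific chip:

```python
# Toy model of why the top of the V/F curve is so wasteful.
# Dynamic power ~ C * V^2 * f; voltage rises steeply near Fmax.
# All numbers are illustrative, not measured from any real CPU.

vf_curve = [  # (frequency in GHz, core voltage in V)
    (3.5, 0.85),
    (4.5, 1.00),
    (5.0, 1.15),
    (5.5, 1.35),
]

C = 10.0  # arbitrary switching-capacitance constant (relative units)
for f, v in vf_curve:
    print(f"{f:.1f} GHz @ {v:.2f} V -> relative power {C * v**2 * f:6.1f}")

# The 5.0 -> 5.5 GHz step is +10% frequency for roughly +50% power:
p_50 = C * 1.15**2 * 5.0
p_55 = C * 1.35**2 * 5.5
print(f"+{5.5 / 5.0 - 1:.0%} clocks costs +{p_55 / p_50 - 1:.0%} power")
```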
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Not quite. Double seems about the level of performance increase.
In multithreaded workloads it should be way higher than just double. The 8700K scores around 9k in CBR23; the 13600K already hits 25k.

View attachment 314309

Bone-stock 13700KF, small -25 mV undervolt, air-cooled by a Peerless Assassin and a KryoSheet: 201 W, 30K.

If they can add efficiency with the new voltage-regulation improvements, 160 W might actually be high for that 14600K.
Air coolers ftw

And then we get to gaming, where the CPU is brutally inefficient overall, as proven by the stellar wattage numbers produced by a 7800X3D in gaming, easily a third of an equally performant Intel CPU. I underline this point because a lot of what we perceive to be advantages and improvements don't always pay off in practice, but at the same time, we do run CPUs now that are capable of burning twice the wattage we used to have.

There's a space there we aren't seeing in reviews/testing, for sure. And I'm not saying this just because I have an 8700K. I'm saying it because that 8700K still uses around 70-90 W in most games today, even on a 7900XT, even in titles where I am severely CPU-limited, and even then I'm still getting FPS remarkably close to the latest and greatest. The desire to upgrade my CPU isn't that big, even though I know I could get more than 20% higher frames in gaming here and there. CPUs got faster, CPUs got more efficient, but that efficiency is definitely not visible enough to say it applies everywhere.
What if I told you a 12900K consumes between 50 and 70 W at 4K with a 4090?

Yes, the 13900K is a power hog if left unchecked running games at 720p with a 4090, but that's not a very realistic scenario. At 4K it usually hovers below 100 W.

Edit 1: the above numbers are based on the heaviest of games, like TLOU and Cyberpunk. In your average game the numbers are much lower.
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
What if I told you a 12900K consumes between 50 and 70 W at 4K with a 4090?

Yes, the 13900K is a power hog if left unchecked running games at 720p with a 4090, but that's not a very realistic scenario. At 4K it usually hovers below 100 W.

Edit 1: the above numbers are based on the heaviest of games, like TLOU and Cyberpunk. In your average game the numbers are much lower.
It's irrelevant; these are not high-CPU-stress scenarios at all. That 720p result is where it's at. Not realistic? No, games don't push the CPU, simple as that, but the CPU does need a substantial amount of wattage to keep that chip going. There is waste, plain and simple, and a higher-wattage CPU is more likely to produce more waste as it boosts higher, but there isn't 'more work done' from the perspective of the gamer. It still runs the game; it just produces more frames. That's a key difference we keep forgetting. There is the balance against the GPU and the requirements the game puts on both CPU and GPU, but there is also the fact that we just always like to have more, even if we don't really need it. A faster CPU enables more, and then also uses more. Gaming is not a workload that runs from beginning to end so that a faster CPU simply finishes it sooner, like a synthetic bench.

The fun fact is, I would also see around 70 W at 4K with a 4090 on my 8700K.
I'm seeing 90 W today in Starfield, a game that truly loves to load the CPU (let's not speak of performance relative to that...), and it delivers FPS results pretty close to what's being benched for the 7900XT on recent CPUs. And there are many more examples where I am left wondering whether there is some magic I don't know of, or whether CPU performance in gaming has just pretty much hit a wall, not unlike how different CPUs have felt in the past. It's either enough, too little, or overkill that barely pays off.
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It's irrelevant; these are not high-CPU-stress scenarios at all. That 720p result is where it's at. Not realistic? No, games don't push the CPU, simple as that, but the CPU does need a substantial amount of wattage to keep that chip going. There is waste, plain and simple, and a higher-wattage CPU is more likely to produce more waste as it boosts higher, but there isn't 'more work done' from the perspective of the gamer. It still runs the game; it just produces more frames. That's a key difference we keep forgetting. There is the balance against the GPU and the requirements the game puts on both CPU and GPU, but there is also the fact that we just always like to have more, even if we don't really need it. A faster CPU enables more, and then also uses more.

The fun fact is, I would also see around 70 W at 4K with a 4090 on my 8700K.
Ok, here are some real numbers from fully CPU-bound scenarios at 720p:

Warzone 2 = between 50 and 70 W
Kingdom Come = between 50 and 70 W
Remnant 2 = between 33 and 52 W
Hogwarts = between 40 and 60 W

And the list goes on and on. The maximum power draw I've ever seen in a game was in 720p TLOU, where the 12900K hit 115 W, but in most games it's half that or even less.

Now with the 13900K, yes, I've gone up to a whopping 170 W in Cyberpunk, but again, those are very academic numbers. After all, when you are running 720p with a 4090, you are basically benchmarking. I'm fairly confident you can power limit it to 90 W and lose like 3% performance or something.

I'm seeing 90 W today in Starfield, a game that truly loves to load the CPU (let's not speak of performance relative to that...), and it delivers FPS results pretty close to what's being benched for the 7900XT on recent CPUs. And there are many more examples where I am left wondering whether there is some magic I don't know of, or whether CPU performance in gaming has just pretty much hit a wall, not unlike how different CPUs have felt in the past. It's either enough, too little, or overkill that barely pays off.
In Starfield, again a fully CPU-bound scenario, I'm between 90 and 100 W in that big city. In other areas I'm around 50-60 W. Nothing gets close to TLOU in terms of power draw.
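For anyone wanting to reproduce package-power readings like these outside of an in-game overlay: on Linux you can sample the CPU's RAPL energy counter directly (HWiNFO exposes the same counter on Windows). A minimal sketch, assuming the usual intel-rapl sysfs layout; the package-domain path can differ per system:

```python
# Minimal sketch: sample CPU package power from the Linux RAPL counter.
# Assumes /sys/class/powercap/intel-rapl:0 is the package domain and is
# readable (may need root); counter wrap-around is ignored for brevity.
import time

ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # microjoules

def read_uj() -> int:
    with open(ENERGY) as f:
        return int(f.read())

prev_e, prev_t = read_uj(), time.monotonic()
for _ in range(10):                      # ten 1-second samples
    time.sleep(1.0)
    cur_e, cur_t = read_uj(), time.monotonic()
    watts = (cur_e - prev_e) / 1e6 / (cur_t - prev_t)  # uJ -> J -> W
    print(f"package power: {watts:6.1f} W")
    prev_e, prev_t = cur_e, cur_t
```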
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Ok, here are some real numbers from fully CPU-bound scenarios at 720p:

Warzone 2 = between 50 and 70 W
Kingdom Come = between 50 and 70 W
Remnant 2 = between 33 and 52 W
Hogwarts = between 40 and 60 W

And the list goes on and on. The maximum power draw I've ever seen in a game was in 720p TLOU, where the 12900K hit 115 W, but in most games it's half that or even less.

Now with the 13900K, yes, I've gone up to a whopping 170 W in Cyberpunk, but again, those are very academic numbers. After all, when you are running 720p with a 4090, you are basically benchmarking. I'm fairly confident you can power limit it to 90 W and lose like 3% performance or something.
You're just underlining the fact that higher-wattage CPUs at stock are going to use more watts for similar perf, you know. You are talking about tweaked CPUs here, and the moment you let a 13900K run stock on that same workload, the wattage explodes.

Can these CPUs use less? They certainly can! But it's not how they're delivered, and not every CPU can even be tweaked.

In Starfield, again a fully CPU-bound scenario, I'm between 90 and 100 W in that big city. In other areas I'm around 50-60 W. Nothing gets close to TLOU in terms of power draw.
Exactly the point... So have we really progressed that much, if an 8700K does 90 W and you can still use it today? I'm still seeing 50+ FPS in cities. You might see 90-100 with more variance. Both are perfectly playable.

Here's a screen - this is with crowd density maxed, and most other settings too, and not 4K (3440x1440).

[attached Starfield screenshot]
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Exactly the point... So have we really progressed that much, if an 8700K does 90 W and you can still use it today? I'm still seeing 50+ FPS in cities. You might see 90-100 with more variance. Both are perfectly playable.
I'm seeing an average of 120-130 in that big city. So, yes, more than twice the FPS for similar power draw?

You're just underlining the fact that higher-wattage CPUs at stock are going to use more watts for similar perf, you know. You are talking about tweaked CPUs here, and the moment you let a 13900K run stock on that same workload, the wattage explodes.

Can these CPUs use less? They certainly can! But it's not how they're delivered, and not every CPU can even be tweaked.
So your problem with these CPUs is how they are delivered? Well, lucky you, that's an insanely easy problem to solve. I'm fully confident you know how to set a power limit; it will take you what, 5-10 seconds?
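For reference, the "5-10 seconds" version is a PL1/PL2 setting in the BIOS or Intel XTU; on Linux the same cap can be applied at runtime through the powercap/RAPL interface. A minimal sketch, assuming the usual sysfs layout (needs root, and board firmware can override it):

```python
# Minimal sketch: cap the long-duration package power limit (PL1) to 90 W
# via Linux powercap/RAPL. Needs root; BIOS/board settings may override it,
# and the sysfs layout varies by platform.

DOMAIN = "/sys/class/powercap/intel-rapl:0"   # package domain (usually)
LIMIT_UW = 90 * 1_000_000                     # 90 W, in microwatts

# constraint_0 is conventionally the long-term limit (PL1)
with open(f"{DOMAIN}/constraint_0_power_limit_uw", "w") as f:
    f.write(str(LIMIT_UW))

with open(f"{DOMAIN}/constraint_0_power_limit_uw") as f:
    print("PL1 is now", int(f.read()) / 1e6, "W")
```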
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I'm seeing an average of 120-130 in that big city. So, yes, more than twice the FPS for similar power draw?


So your problem with these CPUs is how they are delivered? Well, lucky you, that's an insanely easy problem to solve. I'm fully confident you know how to set a power limit; it will take you what, 5-10 seconds?
You seem to keep thinking I have a problem with things; I don't. It's an observation.

In synthetics, we're supposed to see 2.6x the efficiency, as I read earlier, and you mentioned 1/4th the power at ISO performance. There's a substantial gap.
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
You seem to keep thinking I have a problem with things; I don't. It's an observation.

In synthetics, we're supposed to see 2.6x the efficiency, as I read earlier, and you mentioned 1/4th the power at ISO performance. There's a substantial gap.
I never mentioned synthetics. Cinebench isn't a synthetic workload; it's an actual application called Cinema 4D.

I tested the same area as your screenshot with the framerate locked to 62, and power draw was at 49 W. So yeah, not a huge leap, but games are a different kind of beast altogether, especially this game in particular. I don't think Starfield should be used to compare this kind of thing.
 
Joined
Jul 5, 2013
Messages
27,142 (6.58/day)
In multithreaded workloads it should be way higher than just double.
What it seems like it should do and what happens in real-world practice often don't jibe. Given that it's been just under 6 years and performance has still slightly more than doubled, things are still progressing.
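Putting a rough number on that rate of progress: slightly more than doubling in just under six years compounds to roughly 12% per year, as a quick sketch shows:

```python
# Compound annual growth implied by ~2x performance in ~6 years.
years, total_gain = 6, 2.0
annual = total_gain ** (1 / years) - 1
print(f"~{annual:.1%} per year")  # ~12.2%
```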
The 8700K scores around 9k in CBR23; the 13600K already hits 25k.
But that is one synthetic benchmark. One benchmark does not a standard of performance make.
Cinebench isn't a synthetic workload; it's an actual application called Cinema 4D.
Cinebench is most definitely a synthetic workload. It is a realistic-workload benchmark, which is why it is given more merit than most, but it is an artificial runtime.
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
What it seems like it should do and what happens in real-world practice often don't jibe. Given that it's been just under 6 years and performance has still slightly more than doubled, things are still progressing.

But that is one synthetic benchmark. One benchmark does not a standard of performance make.
Cinebench is not synthetic. Do you understand what a synthetic workload is?
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Ok, so you don't understand what synthetic is. Lol
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Οk, so you don't understand what synthetic is. Lol
No, you just like to argue about irrelevant points ;) Cinebench, the way it is run for this purpose, is most definitely a synthetic bench, because you're rendering the same picture every time. It's not a real workload with variance. The same thing applies, effectively, to a canned game benchmark. You could argue it's synthetic just the same, because players never actually play that canned run. They come into the game with similar game logic, but it gets manipulated (and more often than not, added to!) by whatever they do in the game. In the same way, Starfield's performance varies hugely depending on where you are and even what you do. In combat, FPS takes a nosedive. Indoors, FPS skyrockets. Etc.

Synthetic:
  • Relating to, involving, or of the nature of synthesis.
  • Produced by synthesis, especially not of natural origin.
  • Prepared or made artificially.
  • Not natural or genuine; artificial or contrived.
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
No, you just like to argue about irrelevant points ;) Cinebench, the way it is run for this purpose, is most definitely a synthetic bench, because you're rendering the same picture every time. It's not a real workload with variance. The same thing applies, effectively, to a canned game benchmark. You could argue it's synthetic just the same, because players never actually play that canned run. They come into the game with similar game logic, but it gets manipulated (and more often than not, added to!) by whatever they do in the game. In the same way, Starfield's performance varies hugely depending on where you are and even what you do. In combat, FPS takes a nosedive. Indoors, FPS skyrockets. Etc.

Synthetic:
  • Relating to, involving, or of the nature of synthesis.
  • Produced by synthesis, especially not of natural origin.
  • Prepared or made artificially.
  • Not natural or genuine; artificial or contrived.
Cinebench is made by Maxon, the company behind Cinema 4D. They created Cinebench so system integrators could easily benchmark their systems' performance in... you guessed it, Cinema 4D. Which is a real-world application. Calling Cinebench synthetic is just wild.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.98/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700 MHz @ 0.750 V (375 W down to 250 W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
vs the 8700K? Yeah, it's almost doubled performance (60-80% faster) in ST and more than doubled in MT...

View attachment 314282
View attachment 314283


View attachment 314286

None of the above - 100% CPU usage and no Throttle state

View attachment 314288

My undervolted 13700K pulls 210 W @ 5.3 GHz (stock) during Cinebench, so these chips are actually pretty efficient if you don't yeet them to 1.4 V at 5.9 GHz.
We can't see anything about throttling in that image.

If it were HWiNFO showing effective clocks and the rest of the power figures, it might be trustworthy, but FRAPS, or whatever that is, can easily show an incomplete picture.
I can go use HWMonitor and show CPUs at 8 GHz and 255°C; not all software is reliable.

Cinebench is made by Maxon, the company behind Cinema 4D. They created Cinebench so system integrators could easily benchmark their systems' performance in... you guessed it, Cinema 4D. Which is a real-world application. Calling Cinebench synthetic is just wild.
It's absolutely synthetic; it's 100% the same test every single time.

It's based on a realistic workload, so it's a USEFUL synthetic, but it's still synthetic.
 
Joined
Jun 14, 2020
Messages
3,275 (2.06/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It's absolutely synthetic; it's 100% the same test every single time.

It's based on a realistic workload, so it's a USEFUL synthetic, but it's still synthetic.
Ok, it's a synthetic benchmark whose performance directly correlates with real-world applications. Calling it just a "synthetic" to dismiss the performance numbers is wrong.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Synthetic performance, yes. Theory vs. practice. How often can you actually extract that performance, I wonder? When it comes to gaming, we see plateaus of performance more than major jumps. And of course, I was partly joking about it...

But now consider the fact that these chips turbo to double the wattage, too. I doubt there are many real perf/W improvements in a vast number of workloads when both CPUs run stock.
Performance in both real-world applications and gaming has gone up big time as well. IPC improved a lot, and clock speeds went up too. The 8700K is nowhere near the top CPUs today; neither is my 9900K, even at 5.2 GHz. It is losing big to even a stock i5-13600K in both applications and gaming.

But sure, keep thinking your 8700K at just 4.6 GHz is close to new CPUs... I bet your 7900XT is even bottlenecked by it in many demanding games, and you have no Resizable BAR support on top. Tons of games can and will use 8 cores today, especially when paired with a higher-end GPU.
 
Joined
Nov 13, 2007
Messages
10,642 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.4GHZ UV - 220W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
We can't see anything about throttling in that image.

If it were HWiNFO showing effective clocks and the rest of the power figures, it might be trustworthy, but FRAPS, or whatever that is, can easily show an incomplete picture.
I can go use HWMonitor and show CPUs at 8 GHz and 255°C; not all software is reliable.


It's absolutely synthetic; it's 100% the same test every single time.

It's based on a realistic workload, so it's a USEFUL synthetic, but it's still synthetic.
It's the AIDA64 throttle monitor.
 
Joined
Sep 17, 2014
Messages
22,193 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Performance in both real-world applications and gaming has gone up big time as well. IPC improved a lot, and clock speeds went up too. The 8700K is nowhere near the top CPUs today; neither is my 9900K, even at 5.2 GHz. It is losing big to even a stock i5-13600K in both applications and gaming.

But sure, keep thinking your 8700K at just 4.6 GHz is close to new CPUs... I bet your 7900XT is even bottlenecked by it in many demanding games, and you have no Resizable BAR support on top. Tons of games can and will use 8 cores today, especially when paired with a higher-end GPU.
I know it is.
 
Joined
Oct 31, 2013
Messages
187 (0.05/day)
That is not how you compare efficiency. At all. Unless you run everything at the same wattage, any efficiency comparison is just nonsensical.
280 W vs 253 W was close enough for me for a comparison. I could power limit the TR to 253 W, though, and see how well it does.
But I would also like to see the real power draw of the 14600K in Cinebench, assuming the board is configured to allow 253 W all the time.
 