# Intel Core i5-13600K



## W1zzard (Oct 20, 2022)

Core i5-13600K is a gamer's dream! The new processor achieves gaming numbers that match the best AMD Zen 4 offerings, at a much more competitive $320 price point. Our review also shows that 13600K can shine in applications, beating the much higher positioned Core i7-12700K.



----------



## Crackong (Oct 20, 2022)

Now everyone hits 90 degrees or more


----------



## metalslaw (Oct 20, 2022)

Summary page, "5.8 GHz boost on only two cores"?


----------



## Legacy-ZA (Oct 20, 2022)

Watch that power draw.  
Nope, nopity NOPE.

What a clown world we live in.

I am so glad I upgraded to my current system. I was worried I would be done in by both this generation from Intel and AMD's, but this platform will last a good while yet, and I have options to make my power draw even friendlier while keeping my performance and temps in check.


----------



## Dyatlov A (Oct 20, 2022)

Nice CPU, would be very awesome to know what it can do at 5.6GHz overclock


----------



## gridracedriver (Oct 20, 2022)

Good CPU for gaming price/performance, but the power draw and temperatures are too much.


----------



## U89NzzWoL (Oct 20, 2022)

I was wondering: I need a system that's good for both gaming and Premiere Pro/Blender, so could I go with an i5 13600K + Z690 Gaming X motherboard (it's cheaper, and I don't really care about the Z790) and pair it with an RTX 3080 to be good to go? I was really thinking about doing so.


----------



## sephiroth117 (Oct 20, 2022)

I think Intel will be strong with Meteor Lake once they jump to their 7nm-class node (Intel's 7nm is supposedly more efficient than TSMC's 7nm) and that tiled, disaggregated architecture.
Until then I don't think I'd purchase a CPU that needs to hit close to 200W to rival AMD.


----------



## Dristun (Oct 20, 2022)

Consumes ~10W more than 7700X in gaming while being faster and 25W more in app average while being on par - what's the power consumption fuss all about? Great choice especially if one knows how to update the bios and can grab a Z690 board on sale.


----------



## W1zzard (Oct 20, 2022)

metalslaw said:


> Summary page, "5.8 GHz boost on only two cores"?


Fixed, thanks!



Dyatlov A said:


> Nice CPU, would be very awesome to know what it can do at 5.6GHz overclock


Running last tests at 5.6



Dristun said:


> specially if one knows how to update the bios and can grab a Z690 board on sale.


Just to add to that, you can run Raptor Lake with an old BIOS just fine, I tried (before BIOS updates were available). So you don't need a loaner CPU


----------



## Space Lynx (Oct 20, 2022)

@W1zzard I am confused, your test setup page says 420mm AIO and then it also says Noctua for the air setup, so which setup is this particular 13600K review using? I find it hard to believe a 420mm AIO can still hit 91°C on an Intel system.

gaming temps are still good.

looks like I am buying raptor lake for my next rig. 13600kf will be my next chip. AMD just can't compete when it comes to gamers anymore. they made a huge mistake not focusing on the 7800X3D first and foremost. cause my money is going to raptor lake now.


----------



## W1zzard (Oct 20, 2022)

CallandorWoT said:


> confused


Read the text above the temp charts?


----------



## Space Lynx (Oct 20, 2022)

W1zzard said:


> Read the text above the temp charts?



RIP me. Sorry, I was so excited I skipped the very first line. New hardware days are fun days, apologies!

Ok then. Well, a Noctua U14 is fairly weak, so a proper NH-D15 or 360mm AIO will definitely tame this beast.

I have a 13600K in my cart, about to check out now. Damn, I really love AMD, but I am not going to pay more for higher temps and less gaming performance.


----------



## P4-630 (Oct 20, 2022)

Where's the i7 13700K review?


----------



## thewan (Oct 20, 2022)

> The Intel Core i9-13600K comes at an MSRP of $320.



As the review heads towards its conclusion, our resident wizzard worked his magic to evolve an i5 into an i9.


----------



## Space Lynx (Oct 20, 2022)

thewan said:


> As the review heads towards its conclusion, our resident wizzard worked his magic to evolve an i5 into an i9.



To be fair, the 13600K does trade blows in gaming with the 13900K; perhaps for gaming it does deserve the i9 royalty label.


----------



## HenrySomeone (Oct 20, 2022)

Mainstream champion for sure! Gaming performance beyond anything from the previous gen or what AyyyyMD can muster, and multi-threaded performance nearly matching the 5950X. You can pair it with an entry-level Z690 board (bar the 2 or 3 worst ones), or even a solid B660 and DDR4 if you want to save a couple hundred extra bucks, and you have yourself an excellent general-purpose rig!


----------



## P4-630 (Oct 20, 2022)

thewan said:


> As the review heads towards its conclusion, our resident wizzard worked his magic to evolve an i5 into an i9.


Good price for an i9 though!!


----------



## Ayhamb99 (Oct 20, 2022)

HenrySomeone said:


> Mainstream champion for sure! Gaming performance beyond anything from the previous gen or what AyyyyMD can muster, and multi-threaded performance nearly matching the 5950X. You can pair it with an entry-level Z690 board (bar the 2 or 3 worst ones), or even a solid B660 and DDR4 if you want to save a couple hundred extra bucks, and you have yourself an excellent general-purpose rig!


This and the 13700K will definitely be the champions of Raptor Lake. The multi-threaded performance of this CPU is very impressive, while staying at around the same efficiency level as the 12600K and 12700K.


----------



## Space Lynx (Oct 20, 2022)

Does anyone know if I disable ecores, will it improve temps in gaming?


----------



## P4-630 (Oct 20, 2022)

CallandorWoT said:


> Does anyone know if I disable ecores, will it improve temps in gaming?



I don't think it will matter much, as the E-cores aren't being used for gaming, just the P-cores; at least that's what I have been seeing with my i7 12700K...


----------



## champsilva (Oct 20, 2022)

Legacy-ZA said:


> Watch that power draw.
> Nope, nopity NOPE.
> 
> What a clown world we live in.
> ...



Why do people care so much about 100W in peak power consumption?

That's like pennies per month.
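For anyone who wants to put a number on that, here's a back-of-the-envelope sketch. The usage hours and electricity rate are assumptions for illustration, not figures from the review; plug in your own.

```python
# Back-of-the-envelope cost of an extra 100 W of peak CPU draw.
# Assumed (not from the review): 2 hours of full load per day,
# $0.15 per kWh -- adjust for your usage and local rates.

def monthly_cost(extra_watts: float, hours_per_day: float,
                 price_per_kwh: float, days: int = 30) -> float:
    """Added electricity cost per month, in dollars."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

print(f"~${monthly_cost(100, 2, 0.15):.2f}/month")  # 6 kWh -> $0.90
```

At heavier loads (say, 8 hours a day of rendering) the same 100 W is closer to $3.60/month, so it depends on how the machine is used.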


----------



## Hyderz (Oct 20, 2022)

Let’s see what the 13400 cpu can do if the 13600k can be this powerful


----------



## R0H1T (Oct 20, 2022)

They better get that massive IPC jump with the 14th gen, otherwise we'll have oven makers going out of business.

Yes, this also applies to AMD & Zen 5 ~


----------



## Valantar (Oct 20, 2022)

Wow, I was pretty surprised by the power draw and thermal results here. Given how well 12th gen held up against Ryzen 7000, I was expecting this to be along the same lines - instead we're seeing drastic power draw increases even in real-world applications and gaming. 10% faster than the 7600X for more than 50% higher power draw? Even if neither are drawing massive amounts of power while gaming, that's quite the difference. And of course the 12400f trounces either in gaming efficiency.

Overall ... pretty underwhelming? Yes, it's fast - but so is pretty much anything at this point. It's also quite inefficient. Could bode decently for lower-end SKUs with lower clocks I guess, but this is already significantly downclocked from the 13900K, so I guess we'll have to wait and see.

All in all, the 2022 generation of CPUs is looking ever less attractive overall.



CallandorWoT said:


> Does anyone know if I disable ecores, will it improve temps in gaming?


Given that these barely consume power even under full load I really don't see why it would.


----------



## Ayhamb99 (Oct 20, 2022)

R0H1T said:


> They better get that massive IPC jump with the 14th gen, otherwise we'll have oven makers going out of business.
> 
> 
> 
> ...


This generation of AMD, Intel and Nvidia will be the space heater generation lol


----------



## Arco (Oct 20, 2022)

Ayhamb99 said:


> This generation of AMD, Intel and Nvidia will be the space heater generation lol


Fermi all over again! Bulldozer too?


----------



## minas (Oct 20, 2022)

I don't get it. Performance is very similar to the 12700K, same thread count. The 12700K has two more P-cores, the 13600K has four more E-cores. Frequencies are only 100-200 MHz higher on the 13600K. So why does the 13600K draw so much power and run so hot?


----------



## R0H1T (Oct 20, 2022)

Too much voltage & "unlimited" power limits. Just like AMD you can tune them to run much more efficiently & tbf I'd do that with Intel as well.


----------



## TheinsanegamerN (Oct 20, 2022)

Valantar said:


> Wow, I was pretty surprised by the power draw and thermal results here. Given how well 12th gen held up against Ryzen 7000, I was expecting this to be along the same lines - instead we're seeing drastic power draw increases even in real-world applications and gaming. 10% faster than the 7600X for more than 50% higher power draw? Even if neither are drawing massive amounts of power while gaming, that's quite the difference. And of course the 12400f trounces either in gaming efficiency.
> 
> Overall ... pretty underwhelming? Yes, it's fast - but so is pretty much anything at this point. It's also quite inefficient. Could bode decently for lower-end SKUs with lower clocks I guess, but this is already significantly downclocked from the 13900K, so I guess we'll have to wait and see.
> 
> All in all, the 2022 generation of CPUs is looking ever less attractive overall.


None of these chips matter in the wake of the 5800x3d. All this money and power utterly wasted. 


Valantar said:


> Given that these barely consume power even under full load I really don't see why it would.


Chief, are you and I reading the same review? As you said: "instead we're seeing drastic power draw increases even in real-world applications and gaming". The big difference between 13th and 12th gen is having more E-cores and higher clocks. Those E-cores are clearly pulling somewhat noticeable amounts.


champsilva said:


> Why do people care so much about 100W in peak power consumption?
> 
> That's like pennies per month.


Because cooling these insane power draws is both expensive and annoying, and anyone without extreme cooling will not get this performance out of these chips. Not so much an issue with Zen 4, but Hardware Unboxed couldn't keep these things from throttling with a 420mm AIO. Raptor Lake shows some pretty diminished performance at sub-200W.



minas said:


> I don't get it. Performance is very similar to the 12700K, same thread count. The 12700K has two more P-cores, the 13600K has four more E-cores. Frequencies are only 100-200 MHz higher on the 13600K. So why does the 13600K draw so much power and run so hot?


Because those E-cores are not as efficient as Intel claims.


----------



## Valantar (Oct 20, 2022)

minas said:


> So why does the 13600K draw so much power and run so hot?


You pretty much answered that yourself, no?


minas said:


> 12700k has two perf cores more, 13600k has 4 eff cores more. Frequencies only 100-200 mhz higher on 13600k.


Fewer P cores (which are large, but draw a lot of power), more E cores (which are very dense, but draw less power per core), and higher clocks. Overall power draws are very similar, slightly increased from the clock speed bump, but thermal density is likely a bit up, meaning higher temperatures.


----------



## W1zzard (Oct 20, 2022)

Dyatlov A said:


> Nice CPU, would be very awesome to know what it can do at 5.6GHz overclock


Uploaded



P4-630 said:


> Where's the i7 13700K review?


Next week. Intel only sent the 13600K and 13900K to reviewers


----------



## ZetZet (Oct 20, 2022)

champsilva said:


> Why people care so much about 100W in peak power consumption?
> 
> That's like pennies por month.


It's bizarre af, it's like they have to find something to nitpick.


----------



## Valantar (Oct 20, 2022)

TheinsanegamerN said:


> None of these chips matter in the wake of the 5800x3d. All this money and power utterly wasted.


Mostly true. That chip definitely isn't cheap either (here in Norway it's 25% more than a 13600K), but of course platform costs range from free-'cause-you-already-own-it, to affordable, to expensive if you want to, vs. "it's all expensive" for both this and Ryzen 7000. I'm really, really looking forward to seeing what Ryzen 7000X3D can do, though I also do hope that AMD puts out some more 5000X3D chips for AM4. A 5600X3D would be the perfect chip for a low-power SFF gaming rig.


TheinsanegamerN said:


> Chief, are you and I reading the same review? As you said : "instead we're seeing drastic power draw increases even in real-world applications and gaming", the big difference between 13th and 12th gen is having more "e" cores and higher clocks. Those "e" cores are clearly pulling somewhat noticeable amounts.


Noticeable? Sure, in MT applications that load them heavily. 12th-gen E-cores topped out at ~10W/core, and these clock a bit higher so they might go a bit higher. I sincerely doubt there's a noticeable number of games putting any real load on the E-cores though, given the work Intel has been doing on scheduling in both hardware and software, and the potential performance harms of doing so. I'm putting the power bump in gaming here down to the clock speed bump, more aggressive boosting overall, and the fabric/cache tweaks Intel has done since the 12th gen. I might be wrong, but I don't see how they could be making much use of the E-cores in gaming without hurting frametimes.


----------



## phanbuey (Oct 20, 2022)

Pushed way past the efficiency curve again. Guess I will have to do the overclock/undervolt with TVB backoff on the 13700KF.

This is the dumbest race to the bottom. Who keeps making these "YES! LET'S ADD 75W AND 20C FOR 1.8% MORE PERFORMANCE!!" decisions? I want to have a conversation with this person.


----------



## P4-630 (Oct 20, 2022)

W1zzard said:


> Next week. Intel only sent the 13600K and 13900K to reviewers



Well, at Tweakers they got all three...









Intel 13th Gen Core Raptor Lake - Core i5 13600K, i7 13700K and i9 13900K tested

After AMD's Ryzens it is Intel's turn, with its new, thirteenth-generation Core processors. We tested the Core i5 13600K, i7 13700K and i9 13900K.

tweakers.net


----------



## W1zzard (Oct 20, 2022)

So they found a retailer that gave them a 13700K early.. I tried. Intel is crazy about early selling this time; guess they will be even more strict now.


----------



## Space Lynx (Oct 20, 2022)

W1zzard said:


> Uploaded
> 
> 
> Next week. Intel only sent the 13600K and 13900K to reviewers



Just got my 13600k ordered. thanks for the great review mate. never thought I'd be going back to Intel.

now I need to figure out if I want to go with a budget last gen board or go with Z790 board... 

is 13600k plug and play with older boards? like lets say i find a really good deal on a z690 board, can i just pop the 13600k in it with no worry, or would it need a bios update first with an older intel chip?


----------



## W1zzard (Oct 20, 2022)

CallandorWoT said:


> is 13600k plug and play with older boards? like lets say i find a really good deal on a z690 board, can i just pop the 13600k in it with no worry, or would it need a bios update first with an older intel chip?


it just works, then you update the bios, and it just works better


----------



## P4-630 (Oct 20, 2022)

CallandorWoT said:


> is 13600k plug and play with older boards? like lets say i find a really good deal on a z690 board, can i just pop the 13600k in it with no worry, or would it need a bios update first with an older intel chip?


You'll probably need to flash, but there are motherboards you can flash without CPU.


----------



## Verpal (Oct 20, 2022)

Currently I have a 12400F with some undervolting; efficiency is exceedingly good, especially in gaming.
That being said, the 13600K, or whatever non-K series comes later, seems like a reasonable upgrade: temperature and power draw are still under control, and the price is more competitive....
Still, not a particularly great generational uplift, and price-to-performance didn't really improve. Perhaps AMD will consider a further price cut?


----------



## phanbuey (Oct 20, 2022)

Verpal said:


> Currently I have a 12400F with some undervolting; efficiency is exceedingly good, especially in gaming.
> That being said, the 13600K, or whatever non-K series comes later, seems like a reasonable upgrade: temperature and power draw are still under control, and the price is more competitive....
> Still, not a particularly great generational uplift, and price-to-performance didn't really improve. Perhaps AMD will consider a further price cut?



All AMD needs to do at this point is drop the price of boards. Raptor Lake is not compelling and is on a dead socket. They need Meteor Lake.


----------



## Space Lynx (Oct 20, 2022)

@W1zzard my friend is wanting to know what amount of vram the 3080 you are using has for this 13600k review? 10gb or 12gb?



phanbuey said:


> All AMD needs to do at this point is drop the price of boards. Raptor Lake is not compelling and is on a dead socket. They need Meteor Lake.



Care to recommend me a budget board to pair with my 13600k from last gen? I don't intend to upgrade for 5-7 years after this, but also don't really think Z790 is worth the premium for my use case (I will only have 1 nvme drive, etc)


----------



## phanbuey (Oct 20, 2022)

CallandorWoT said:


> @W1zzard my friend is wanting to know what amount of vram the 3080 you are using has for this 13600k review? 10gb or 12gb?
> 
> 
> 
> Care to recommend me a budget board to pair with my 13600k from last gen? I don't intend to upgrade for 5-7 years after this, but also don't really think Z790 is worth the premium for my use case (I will only have 1 nvme drive, etc)


Amazon.com: MSI PRO Z690-A ProSeries Motherboard (ATX, 12th Gen Intel Core, LGA 1700 Socket, DDR5, USB 3.2 Gen 2, PCIe 5, 2.5G LAN, M.2 Slots) : Electronics

I've used this one and really like it.


----------



## FreezingPC (Oct 20, 2022)

CallandorWoT said:


> Does anyone know if I disable ecores, will it improve temps in gaming?


Not like it would really matter; you will not get 90°C on an i5-13600K in a game, unless you go out of your way to screw up...


----------



## Space Lynx (Oct 20, 2022)

phanbuey said:


> Amazon.com: MSI PRO Z690-A ProSeries Motherboard (ATX, 12th Gen Intel Core, LGA 1700 Socket, DDR5, USB 3.2 Gen 2, PCIe 5, 2.5G LAN, M.2 Slots) : Electronics
> 
> I've used this one and really like it.



newegg.com/p/N82E16813162093?

this one is the same price as that one but has the Z790 chipset. I don't intend to OC at all, I will be running stock everything. Worried about the VRMs even at stock with this one though... thoughts?


----------



## FreezingPC (Oct 20, 2022)

CallandorWoT said:


> newegg.com/p/N82E16813162093?
> 
> this one is the same price as that but has the z790 chipset. i don't intend to oc at all, i will be running stock everything. worried about the vrm's even on stock though with this one... thoughts?


You can probably just buy an MSI B660 Tomahawk for the price of that mobo (and have no problems with it).


----------



## W1zzard (Oct 20, 2022)

CallandorWoT said:


> @W1zzard my friend is wanting to know what amount of vram the 3080 you are using has for this 13600k review? 10gb or 12gb?


10 GB, makes no difference


----------



## TheUn4seen (Oct 20, 2022)

Please help me understand the M.2 SSD situation: from what I gather, installing an M.2 SSD will cripple the GPU, which seems ridiculous. Is it only true for Gen5 SSDs, or does any PCIe drive have this effect? Say I install my 660p (Gen3) in the M.2 slot alongside the 3080 (Gen4), would it still castrate the GPU, or is it only a problem when mixing Gen5 GPUs with Gen5 SSDs?
If there's a simple answer in the review, I must have missed it; if that's the case I'd be much obliged for pointing me to the right place.


----------



## W1zzard (Oct 20, 2022)

CallandorWoT said:


> Does anyone know if I disable ecores, will it improve temps in gaming?


Yes, and you will lose some gaming performance. Good idea for an article


----------



## P4-630 (Oct 20, 2022)

W1zzard said:


> Yes, and you will lose some gaming performance.


When I run a game I can see the 4 e-cores at nearly no load while gaming with my i7 12700K, has it been changed with raptor lake?


----------



## W1zzard (Oct 20, 2022)

P4-630 said:


> When I run a game I can see the 4 e-cores at nearly no load while gaming with my i7 12700K, has it been changed with raptor lake?


Turn them off, how's your gaming now?


----------



## P4-630 (Oct 20, 2022)

W1zzard said:


> Turn them off, how's your gaming now?



So all the background processes and everything else the E-cores did will be shifted to the P-cores, ok..


----------



## THU31 (Oct 20, 2022)

I wonder if Raptor Lake can do lower voltages at the same clocks compared to Alder Lake, especially below 5 GHz. Looking forward to people testing this.

I will definitely be targeting efficiency when I upgrade next year, so a fixed clock speed and undervolting is what I will be looking at.



P4-630 said:


> So all the background processes and everything else the E-cores did will be shifted to the P-cores, ok..



As long as you are not maxing out the P-cores, you will not see a difference after disabling the E-cores. I expect 6C/12T is not enough for some games at unlocked framerates, but I doubt there are many of those right now.
With a 12700K, I doubt you can ever utilize the E-cores in gaming, unless you are doing heavy background tasks like encoding.
For just gaming, I would still prefer 8 P-cores over any hybrid config.


----------



## chowow (Oct 20, 2022)

Dammit dammit dammit dammit, why didn't you use the 4090? Kill two birds with one stone; people want to know about CPU scaling at 4K. Again, thanks for all your hard work.


----------



## GerKNG (Oct 20, 2022)

the test rig really needs a 4090...


----------



## Space Lynx (Oct 20, 2022)

GerKNG said:


> the test rig really needs a 4090...



If you read the review, he says the 4090 review is incoming for the new chips.


----------



## Valantar (Oct 20, 2022)

chowow said:


> Dammit dammit dammit dammit, why didn't you use the 4090? Kill two birds with one stone; people want to know about CPU scaling at 4K. Again, thanks for all your hard work.


Because the workload involved in making changes to test procedures like that is _massive_. Most likely there'll be a separate article for that, possibly with a more limited scope of CPUs - there are 37 CPUs in the game test charts here after all. 12 games x 37 CPUs x even just 10 minutes per game test = 74 hours of testing. Most likely each game test takes more than that, and of course there's data collection, processing and analysis on top of this, plus all the time needed to build, tear down and re-build systems for testing, re-imaging OSes to avoid driver issues when swapping chips, and more. Even if you're able to run several tests in parallel some of the time, that's still a massive time expenditure to re-test everything for a new test suite.


----------



## codex5600x (Oct 20, 2022)

12600k is the best for gaming


----------



## chowow (Oct 20, 2022)

Valantar said:


> Because the workload involved in making changes to test procedures like that is _massive_. Most likely there'll be a separate article for that, possibly with a more limited scope of CPUs - there are 37 CPUs in the game test charts here after all. 12 games x 37 CPUs x even just 10 minutes per game test = 74 hours of testing. Most likely each game test takes more than that, and of course there's data collection, processing and analysis on top of this, plus all the time needed to build, tear down and re-build systems for testing, re-imaging OSes to avoid driver issues when swapping chips, and more. Even if you're able to run several tests in parallel some of the time, that's still a massive time expenditure to re-test everything for a new test suite.


Very funny. Put in the 4090, update drivers.


----------



## GerKNG (Oct 20, 2022)

CallandorWoT said:


> If you read the review, he says the 4090 review is incoming for the new chips.


I know that.
Doesn't change the fact that the gaming benchmarks are pretty much just GPU-bound, and not nearly as accurate as others'.


----------



## Space Lynx (Oct 20, 2022)

Can someone explain to me what

_Some workloads get scheduled onto wrong cores_

in the negatives of the review is talking about? I read 90% of the review but did not come across this being discussed directly. Anything I need to worry about as a casual gamer who just bought a 13600K?



GerKNG said:


> i know that.
> doesn't change the fact that the gaming benchmarks are pretty much just GPU bound and by far not as accurate as others.



I game at both 1440p and 1080p, and I disagree with you here: we are seeing 30-50 FPS gains in some games, and a 3080 10GB isn't exactly what I'd call super high end, I mean, not with next gen here. Personally, I am very happy to have the 13600K now. It will benefit me in raw frames gained.


----------



## W1zzard (Oct 20, 2022)

chowow said:


> Dammit dammit dammit dammit, why didn't you use the 4090? Kill two birds with one stone; people want to know about CPU scaling at 4K. Again, thanks for all your hard work.





GerKNG said:


> the test rig really needs a 4090...


Because I'd have to test 35 processors with RTX 4090 for proper comparison data



CallandorWoT said:


> Can someone explain to me what
> 
> _Some workloads get scheduled onto wrong cores_
> in the negative side of the review is talking about? I read 90% of the review, but did not come across this being talked about directly, anything I need to worry about as a casual gamer who just bought a 13600k?


Note how Virtualization has surprisingly low performance results? That's because it gets scheduled onto E-Cores. All other workloads seem fine



GerKNG said:


> doesn't change the fact that the gaming benchmarks are pretty much just GPU bound and by far not as accurate as others.


If your system is not GPU bound while gaming, you wasted a lot of money on the GPU. You should always be GPU bound, because the GPU costs much more than the CPU



Valantar said:


> 12 games x 37 CPUs x even just 10 minutes per game test = 74 hours of testing.


12 games x 4 resolutions x 37 CPUs x 10 minutes = 296 hours

I spent almost a whole month in summer testing these CPUs
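Spelling that arithmetic out (the 10-minutes-per-run figure is the thread's own rough estimate, and it excludes setup, teardown, and re-imaging time):

```python
# The full benchmarking-time estimate from this exchange:
# 12 games x 4 resolutions x 37 CPUs x ~10 minutes per run.
games, resolutions, cpus, minutes_per_run = 12, 4, 37, 10

total_minutes = games * resolutions * cpus * minutes_per_run
total_hours = total_minutes / 60
print(f"{total_minutes} minutes = {total_hours:.0f} hours")  # 17760 minutes = 296 hours
```

That's roughly 37 eight-hour days of pure test runtime, which is consistent with "almost a whole month" above.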


----------



## GerKNG (Oct 20, 2022)

W1zzard said:


> your system is not GPU bound while gaming, you wasted a lot of money on the GPU


but not in a CPU Benchmark...


----------



## W1zzard (Oct 20, 2022)

GerKNG said:


> but not in a CPU Benchmark...


Why not? Are we influencers who are benchmarking for the sake of showing how awesome the new tech is? So that people think they get more FPS, while they will gain almost nothing?


----------



## dj-electric (Oct 20, 2022)

I like how people downplay the work that goes into testing hardware. It's a f@%$ ton of work; it takes days and days of raw data collection to form a small database.


----------



## Space Lynx (Oct 20, 2022)

I don't use virtualization, so thank you for the clarification @W1zzard 
Just casual gamer, so this won't affect me. Really glad I am making this upgrade. Should be some fun times it gives me over next 5 years or so.


----------



## chowow (Oct 20, 2022)

W1zzard said:


> Because I'd have to test 35 processors with RTX 4090 for proper comparison data
> 
> 
> Note how Virtualization has surprisingly low performance results? That's because it gets scheduled onto E-Cores. All other workloads seem fine
> ...


Sorry, I didn't know you had another test incoming BEFORE I OPENED MY BIG MOUTH.


----------



## B-Real (Oct 20, 2022)

HenrySomeone said:


> Mainstream champion for sure! Gaming performance beyond anything from previous gen or what AyyyyMD can muster.



On Techspot, 13900K is 4% faster than the 7700X with a 4090 (not a 3080). $590 vs. $400.


----------



## BoredErica (Oct 20, 2022)

What is the cause of the 13600K performing terribly at 5.6 GHz all-core in Borderlands 3?


----------



## Delta6326 (Oct 20, 2022)

CallandorWoT said:


> Just got my 13600k ordered. thanks for the great review mate. never thought I'd be going back to Intel.
> 
> now I need to figure out if I want to go with a budget last gen board or go with Z790 board...
> 
> is 13600k plug and play with older boards? like lets say i find a really good deal on a z690 board, can i just pop the 13600k in it with no worry, or would it need a bios update first with an older intel chip?


Check out the MSI Z690 EDGE DDR4; I was able to snag one for $209. It has BIOS flash, where you don't need a CPU or RAM installed. It even comes with a 16GB flash drive (USB 2.0); you do need to format it to FAT32, as it comes NTFS. Just ordered a 13600K. Hope it works well for another 5+ years like my 6600K. Now time for RDNA3.
https://www.newegg.com/p/N82E16813144486


----------



## metalslaw (Oct 20, 2022)

P4-630 said:


> So all the background processes and everything else the E-cores did will be shifted to the P-cores, ok..


I've been wondering if you can use the Process Lasso program to cordon off all Windows processes to only the E-cores, then have all your main apps/games run on only the P-cores.

I don't own a 12th or 13th gen, but it would be an interesting experiment to try.

I wouldn't doubt it would result in slightly lower max FPS in games, but I'm more interested in whether it improves the 1% and 0.1% lows.
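For the curious, here's roughly what that experiment looks like in code. Process Lasso is a Windows tool; this is a Linux stdlib sketch of the same idea (restricting a process's CPU affinity). The P-core/E-core logical CPU numbering is an assumption for a 13600K and should be checked against your actual topology before use.

```python
import os

# Assumed topology for a 13600K (verify on your own system):
# 6 hyperthreaded P-cores -> logical CPUs 0-11, 8 E-cores -> 12-19.
P_CORES = set(range(0, 12))
E_CORES = set(range(12, 20))

ALL_CPUS = os.sched_getaffinity(0)  # CPUs we may use, captured up front

def pin_to(pid: int, cpus: set) -> set:
    """Restrict pid (0 = current process) to the given logical CPUs,
    falling back to the first available CPU if none of them exist here."""
    wanted = cpus & ALL_CPUS or {min(ALL_CPUS)}
    os.sched_setaffinity(pid, wanted)
    return os.sched_getaffinity(pid)

# Push the current (pretend background) process onto the E-cores,
# leaving the P-cores free for a game.
print(sorted(pin_to(0, E_CORES)))
```

Pinning every background process this way is exactly the cordoning described above; whether it helps the 1% lows is the open question.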


----------



## raptori (Oct 20, 2022)

TheUn4seen said:


> Please help me understand the M.2 SSD situation: from what I understand, installing an M.2 SSD will cripple the GPU, which seems ridiculous. Is it only true for Gen5 SSDs, or does any PCIe drive have this effect? Say I install my 660p (Gen3) in the M.2 slot alongside the 3080 (Gen4), would it still castrate the GPU, or is it only a problem when mixing Gen5 GPUs and Gen5 SSDs?
> If there's a simple answer in the review, I must have missed it; if that's the case I'd be much obliged for pointing me to the right place.


+1 here, can we get answer for this ?


----------



## ncrs (Oct 20, 2022)

raptori said:


> +1 here, can we get answer for this ?


Alder and Raptor Lake have 16 lanes of PCIe 5.0 from the CPU that usually go to the x16 slot for the GPU. Unlike Zen 4, the x4 dedicated to the M.2 SSD is 4.0, not 5.0. In order to get a 5.0 M.2 slot, the motherboard has to take it from the GPU lanes, reducing the GPU to x8 (and potentially wasting x4 in the process). I doubt many motherboards will actually do this; the MSI MAG Z790 Tomahawk WiFi DDR4, which was reviewed today, does not.
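To put rough numbers on that trade-off, a quick sketch. The per-lane figures are approximate usable bandwidth after encoding overhead, not exact values:

```python
# Approximate usable bandwidth per PCIe lane (GB/s, after encoding overhead).
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Usable bandwidth of a PCIe link in GB/s (approximate)."""
    return PER_LANE_GBPS[gen] * lanes

# An RTX 3080 is a Gen4 device, so bifurcating the Gen5 x16 slot
# down to x8 for a Gen5 SSD genuinely halves the GPU's link bandwidth:
print(f"GPU Gen4 x16: {link_bandwidth(4, 16):.1f} GB/s")
print(f"GPU Gen4 x8:  {link_bandwidth(4, 8):.1f} GB/s")
print(f"SSD Gen5 x4:  {link_bandwidth(5, 4):.1f} GB/s")
```

So for the Gen3 660p question above: a Gen4 x4 M.2 slot already exceeds a Gen3 drive's needs, and no GPU lanes are touched at all.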


----------



## catulitechup (Oct 20, 2022)

W1zzard said:


> Yes, and you will lose some gaming performance. Good idea for an article



Waiting for the article; maybe add some benchmarks on how it impacts temperatures/watts, if you can.


----------



## ShiningSapphire (Oct 20, 2022)

In short, the 13600K scrapped the 7700X and 7600X a month after their release. Uhh, this must hurt AMD...


----------



## r9 (Oct 20, 2022)

Same performance as the 7900X, if you average productivity and gaming, for a lot less, if you don't mind melting the whole system down. lol
I don't care about power consumption, so I would just pair it with some budget mobo and some cheapo DDR4 RAM.
Not sure what the actual difference is between DDR4 and DDR5 for these chips.


----------



## Valantar (Oct 20, 2022)

chowow said:


> very funny put in the 4090 update drivers


... And what about the other 36 CPUs? Or do you just want 13600K+4090 results with no comparisons?


----------



## InVasMani (Oct 20, 2022)

Performance is respectable, but the efficiency is unreasonable, especially power-unlimited; Intel chased benchmark scores too much and threw efficiency out the window. Mixed feelings on this CPU given the wattage. I'd like to see how it behaves with power limits and undervolting, and maybe tinkering with multipliers on each core type. I get the impression the P-cores were pushed too aggressively on boosts, while the E-cores could have had a little more manual balancing between base and boost frequencies. It's still impressive that it pretty much beats not just the 12700K but also the 12900K. The power draw is, however, very foolish and disheartening. If it weren't for the poor power consumption I'd say Intel really did great, but I can't overlook that big of a difference in wattage at idle and max power.


----------



## Space Lynx (Oct 20, 2022)

Thermalright LGA1700-BCF Contact Frame Review: Can it tame Raptor Lake’s heat?
Impressive temperature reduction for those impacted by CPU bending
www.tomshardware.com




I want to see if my Vetroo V5 cooler would benefit from this little gizmo. Not sure really. It only benefits some heatsinks, *so if anyone with Raptor Lake has a Vetroo V5, let me know.* I was originally going to use the V5 as a backup cooler, as it only cost me $21 or something. If temps are good enough, though, since I am only running stock and casual gaming, it may be just fine for my needs. I am still tempted to get something better, though.


----------



## Dyatlov A (Oct 20, 2022)

W1zzard said:


> Fixed, thanks!
> 
> 
> Running last tests at 5.6
> ...



How is it possible that in certain cases it's faster at stock than with a fixed 5.6 GHz? As far as I know, at stock it should boost only to 5.1 GHz, which means a massive 500 MHz difference.

update,
thx, I see that outside of games it's much faster at 5.6 GHz. Very awesome CPU and very good tests. Looking forward to seeing it once again with the 4090.


----------



## Space Lynx (Oct 20, 2022)

InVasMani said:


> Performance is respectable the efficiency is unreasonable especially power unlimited Intel chased benchmark scores too much and threw efficiency out the window. Mixed feelings on this CPU given the wattage. I'd like to see how it behaves with power limits and undervolting and maybe tinkering with multipliers on each core type. I get the impression the P cores were pushed too aggressively on boosts while E cores could have had a little more manual scaling balancing between base and boost frequencies. It's still impressive that it pretty much beats not just the 12700K, but also the 12900K. The power draw is however very foolish and disheartening. If it wasn't for the poor power consumption I'd say Intel really did great, but I can't overlook that big of a difference in wattage at idle and max power.



I don't understand why all of you keep hating on this CPU. Seriously, 90% of people buying this CPU are buying it for gaming. In gaming at stock it runs at, what, 71 °C on average based on the review with a weak heatsink (Noctua U14)? With a 280 mm AIO, which is common these days for your average gamer, this will be a 60 °C gaming CPU. And wattage is average at stock; it's really only when it's overclocked that it's quite bad, IMO.

Better than getting less FPS at 95 °C on a Zen 4 chip.


----------



## chowow (Oct 20, 2022)

Valantar said:


> ... And what about the other 36 CPUs? Or do you just want 13600K+4090 results with no comparisons?


OK! Go check how many processors there were in the original review. GO CHECK the work done.


----------



## Tek-Check (Oct 20, 2022)

CallandorWoT said:


> better than getting less fps with 95 celsius on a zen4 chip.


Incorrect. Zen 4 in gaming does not hit top temperatures. Just look at the very demanding Cyberpunk. 
The 13600K should be great for gaming in good-value systems. There is no doubt about it.
I expected it to be more performant in applications than the 7700X, considering it's a 14-core CPU.


----------



## Super Firm Tofu (Oct 20, 2022)

CallandorWoT said:


> I don't understand why all of you keep hating on this cpu. Seriously, 90% of people buying this cpu are buying it for gaming. In gaming at stock it run at what 71 celsius on average based on the review with a weak heatsink (noctua u14), with a 280mm AIO which is common these days for your average gamer, this will be a 60 celsius gaming cpu. and wattage is average at stock, it's really only when its overclocked its quite bad imo.
> 
> better than getting less fps with 95 celsius on a zen4 chip.



A coupla tings.  First, the NH-U14s isn't really a weak heat sink, second, the 7600x and 7700x run a smidge cooler than the 13600k while gaming, and lastly there's a 5% performance difference between all three.  They're all good choices.  It's all good my brudda.  Enjoy the '600k. 






PS - Don't let the naysayers bring you down.  No matter what you buy, somebody's going to call you an idiot for doing it.  (Alder Lake & Zen 4 owner here)   F 'em.


----------



## Vader (Oct 20, 2022)

Intel and AMD differences in performance get smaller every generation, yet people continue to argue with the same intensity, as if either choice were a clear cut above the other. Just chill out, because there are no bad choices here, and thanks to TPU, folks with edge cases have enough info to understand which one they should pick.


----------



## FreezingPC (Oct 20, 2022)

CallandorWoT said:


> better than getting less fps with 95 celsius on a zen4 chip.



There is a graph with the temps for several CPUs; you just need to check them.
Games aren't your Cinebenches or Blenders; you need to really push the thing to get to 90 °C...




CallandorWoT said:


> based on the review with a weak heatsink (noctua u14)




This cooler is nearly 10 years old and can not only give the 7950X a run for its money, but do it without sounding like a jet engine.
"Weak" isn't the right word for this cooler...









AMD Ryzen 9 7950X Cooling Requirements & Thermal Throttling
High temperatures seem to be an issue on the new Ryzen 7000 processors. We're pairing a Ryzen 9 7950X with a $10 stock cooler, a Noctua air-cooler and a 420 mm AIO to get a feel for what the differences are like in terms of °C, MHz and performance in both applications and games.
www.techpowerup.com


----------



## docnorth (Oct 20, 2022)

Legacy-ZA said:


> Watch that power draw.
> Nope, nopity NOPE.
> 
> What a clown world we live in.
> ...


What power draw? It is higher than 7700x, but still very reasonable. It seems OC even on Intel is almost only for fun, the boost algorithms took over...
Whole system cost now seems to favor Intel, but this is volatile.

Btw your current system is indeed something you should be glad about.


----------



## Valantar (Oct 20, 2022)

chowow said:


> OK! go check how many processors were there in the original review GO CHECK work done


... I did. In the gaming tests there are 37 CPUs. Or are you asking for a _4090 review_ with this CPU? 'Cause that's an entirely different thing - but you'd once again be left without comparisons, unless you want to re-test every relevant GPU on that new test platform, which is again hours and hours and hours of work. Swapping GPUs is easier and quicker than swapping CPUs, but it's still a big job.


----------



## Sithaer (Oct 20, 2022)

Hyderz said:


> Let’s see what the 13400 cpu can do if the 13600k can be this powerful



Same here, patiently waiting for the 13400 reviews/launch to compare it with the 12400. _'also depends on the pricing where I live'_
Pretty much the only 2 CPUs I would be willing to upgrade to from my 12100F, which does everything I do currently just fine.

_Unless some leaks are true and we will get a 6/12 i3 Raptor Lake too; that could be an interesting option._


----------



## docnorth (Oct 20, 2022)

CallandorWoT said:


> Can someone explain to me what
> 
> _Some workloads get scheduled onto wrong cores_
> in the negative side of the review is talking about? I read 90% of the review, but did not come across this being talked about directly, anything I need to worry about as a casual gamer who just bought a 13600k?


The one is virtualization, like @W1zzard said. Maybe cryptography too, both AES and SHA3, page 14.


----------



## Denver (Oct 20, 2022)

Well, I think the i5 is a much more attractive product than the i9 in every way.

A point to consider: @W1zzard  are you still using the old rpcs3 code without optimizations for Zen4 ?


----------



## InVasMani (Oct 20, 2022)

CallandorWoT said:


> I don't understand why all of you keep hating on this cpu. Seriously, 90% of people buying this cpu are buying it for gaming. In gaming at stock it run at what 71 celsius on average based on the review with a weak heatsink (noctua u14), with a 280mm AIO which is common these days for your average gamer, this will be a 60 celsius gaming cpu. and wattage is average at stock, it's really only when its overclocked its quite bad imo.
> 
> better than getting less fps with 95 celsius on a zen4 chip.



Oh, I'm sorry for not shilling for Intel harder by only painting them in a positive light, but that's for its laid-off marketing team to do.


----------



## RedBear (Oct 20, 2022)

TheUn4seen said:


> Please help me understand the m.2 SSD situation, from what I understand installing an m.2 SSD will cripple the GPU which seems ridiculous. Is it only true for Gen5 SSDs or any PCIe drive has this effect? Say I install my 660p (gen3) in the m.2 slot alongside the 3080 (gen4), would it still castrate the GPU or is it only a problem of mixing with gen5 GPUs ang gen5 SSDs?
> If there's a simple answer in the review, I must have missed it - if that's the case I'd be much obliged for pointing me to the right place.


I think W1zzard gave a good explanation at the end of the article about PCIe scaling for the RTX 4090. Answering your question in short: it's true for _any_ SSD, as long as you use _that_ M.2 slot with PCIe Gen 5 lanes coming from the CPU, but as ncrs explained above, this feature hasn't been implemented in every motherboard to begin with.

On the CPU itself, energy efficiency aside, these are impressive results. AMD might eventually reclaim gaming primacy with the X3D variants of Zen 4, but Intel's 13600K will probably still hold the crown for performance-to-price ratio. Let's see if it will be enough to recover market margins next year.


----------



## 80-watt Hamster (Oct 20, 2022)

CallandorWoT said:


> with a 280mm AIO which is common these days for your average gamer,



I'm not convinced this is true.


----------



## RandallFlagg (Oct 20, 2022)

Well I instantly regret buying that 12700K last week, even at $300.

The freaking 13600K is a monster for $330.  

Crushed everything at games except 13900K. 

Beat everything short of 7900X / 12900K in apps.  

At least I have a good upgrade path now.

AMD is going to have to lop $100+ off their 7600X and 7700X, and $200 off the 7950X/7900X for them to make any sense whatsoever.


----------



## Space Lynx (Oct 20, 2022)

Super Firm Tofu said:


> A coupla tings.  First, the NH-U14s isn't really a weak heat sink, second, the 7600x and 7700x run a smidge cooler than the 13600k while gaming, and lastly there's a 5% performance difference between all three.  They're all good choices.  It's all good my brudda.  Enjoy the '600k.
> 
> View attachment 266369
> 
> PS - Don't let the naysayers bring you down.  No matter what you buy, somebody's going to call you an idiot for doing it.  (Alder Lake & Zen 4 owner here)   F 'em.



I don't call getting 50 FPS more in Age of Empires 4 at 1080p and 30 FPS more in Far Cry 6 a 5% difference (yes, I still game at 1080p and sometimes 1440p; it varies between games). It's fine that AMD has a selection of games that bring the average down to 5%, but most games do favor Intel. In fact, I would bet money that if we looked at games from the last 10 years, the vast majority of them would see great gains with a 13600K vs a Zen 4 chip; but since only new games are ever tested, that's why we only see the 5% average.

True, the U14 isn't weak, I suppose; I think the U12A and NH-D15 are the better two from Noctua, though. I am actually considering a U12A for my rig at the moment. I bet I can bring that stock 71 °C down to 61 °C or lower with a mild bump in the fan curve with a U12A. We will see soon enough, I expect.


----------



## P4-630 (Oct 20, 2022)

CallandorWoT said:


> Does anyone know if I disable ecores, will it improve temps in gaming?



This is what I get when playing GTA V: cores 8, 9, 10 and 11 are the E-cores, and they only get up to 43 degrees.
Max temp on core 7 is just 55 degrees... This is with a 21.5 °C ambient temp. Of course, with these temps there is no need to disable the E-cores...




The E-cores are in Task Manager from bottom right to left.


----------



## Space Lynx (Oct 20, 2022)

P4-630 said:


> This is what I get when playing GTA V, core 8,9,10 and 11 are the E-Cores, they only get up to 43 degrees.
> Max temp on core 7, just 55 degrees... This is with 21.5C ambient temp. Ofcourse with these temps there is no need to disable the E-Cores...
> View attachment 266385
> 
> The E-cores in the taskmanager from bottom right to left.



I decided I won't disable them. Since I am just a casual gamer I don't expect I will run into any issues.


----------



## RandallFlagg (Oct 20, 2022)

80-watt Hamster said:


> I'm not convinced this is true.



Probably more along the lines of the average X or K series chip buyer.   AIOs are quite common.

These don't cost much more than a high end air cooler.  

The biggest expense you will have with a 13600K is getting a GPU that can keep it busy, at least for now.


----------



## Super Firm Tofu (Oct 20, 2022)

CallandorWoT said:


> I don't call getting 50 fps more in Age of Empires 4 at 1080p and 30 fps more in Far Cry 6 a 5% difference (yes I still game at 1080p and sometimes 1440p it varies between game). It's fine that AMD has a selection of games that bring the average down to 5%, but most games do favor Intel, in fact I would bet money if we looked at games in the last 10 years, vast majority of them will see great gains with a 13600k vs a zen4 chip. but since only new games are ever tested, that's why we only see the 5% average.
> 
> true UH14 isn't weak I suppose, I think the U12A and NH-D15 are the better two of Noctua though. I am considering a U12A for my rig actually at the moment. I bet I can bring that stock 71 celsius down to 61 celsius or lower with a mild bump in the fan curve with a U12A. we will see soon enough I expect.



You don't really have to prove or justify anything to me - I have both.   Being happy with what you spent your money on is what matters.  Sounds like you did well!  Can't wait to hear your experiences.



RandallFlagg said:


> Well I instantly regret buying that 12700K last week, even at $300.
> 
> The freaking 13600K is a monster for $330.
> 
> ...



Send it back?  That's what being a prime member is all about.


----------



## RandallFlagg (Oct 20, 2022)

Super Firm Tofu said:


> You don't really have to prove or justify anything to me - I have both.   Being happy with what you spent your money on is what matters.  Sounds like you did well!  Can't wait to hear your experiences.
> 
> 
> 
> Send it back?  That's what being a prime member is all about.



I'll just wait and upgrade again in the spring I imagine. 

Jeez though this thing is a monster. 

Der8aur testing power scaling...

At 90W power limit the 13900K wiped the floor with the 5800X3D, 12900KS, and 7950X. 

Look at that poor 5950X at the bottom at 93.6W and 112 FPS.






Oh man.. this is almost 50% faster than 5950X and like 31% faster than 5800X3D even at 90W limit.


----------



## wheresmycar (Oct 20, 2022)

TBH, I wasn't entirely excited about the 13th Gen K series, as higher temps/thermals were already speculated (same applies to Zen 4). Can't wait to see what the 13400 (perhaps the new value king) and 13700 bring to the table, paired up with affordable B-series boards. Same goes for Zen 4: something along the lines of a 7500/7600/7700 + eventually less pricey B-series mobos. Hope we get to see the rest of the SKUs from both camps (+X3D) before the close of 2022. 

@W1zzard  i was hoping DDR4 memory would also make the cut in these 13600K gaming charts/etc. Is this something expected soon?


----------



## 80-watt Hamster (Oct 20, 2022)

RandallFlagg said:


> Probably more along the lines of the average X or K series chip buyer.   AIOs are quite common.
> 
> These don't cost much more than a high end air cooler.
> 
> The biggest expense you will have with a 13600K is getting a GPU than can keep it busy, at least for now.



Within that population, perhaps.  My own bias toward air is probably clouding my view.  But does a 280mm (or larger) qualify as "common" across the entire set of PC gamers?  I suppose if one is already spending $300+ on processor, another ~$100 on an AIO doesn't seem like much.  Latest SHWS has 80% on 6 cores or fewer.  Let's say 10% of those have AIOs, and 50% of 8+ cores do.  That makes... 18%, I think.  240mm is the most popular rad size, right?  If those plus 120s make up half of the AIOs, 9% of PC gamers have 280mm or larger liquid coolers.  Analysis pulled directly from me bum.  Is 10-15% common?  I dunno.  Seems like medical conditions that affect 20% of the population get described as "rare".

None of that's on topic anyway, so Imma shut up about it now.

EDIT: Said 240 at one point when meant 280
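For what it's worth, the bum-sourced arithmetic above checks out; here's a quick script with the same guessed shares (none of these percentages are real survey splits, only the post's assumptions):

```python
# Back-of-envelope AIO ownership estimate from the post.
# All rates below are guesses from the post, not survey data.
six_core_or_less = 0.80   # Steam survey: share of gamers on <=6 cores
aio_rate_small = 0.10     # assumed AIO rate among <=6-core owners
aio_rate_big = 0.50       # assumed AIO rate among 8+-core owners

aio_share = six_core_or_less * aio_rate_small + (1 - six_core_or_less) * aio_rate_big
big_rad_share = aio_share * 0.5  # assume half the AIOs are 280 mm or larger

print(f"any AIO: {aio_share:.0%}, 280 mm or larger: {big_rad_share:.0%}")
# any AIO: 18%, 280 mm or larger: 9%
```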


----------



## LuxZg (Oct 20, 2022)

@RandallFlagg - love those useless 1080p 700fps benches and 2 graphs.. I'll wait for a more comprehensive tests.

Anyway, wanted to say that I'll be waiting some more as well. Maybe 13400, maybe 7600, maybe AMD price drops... Just give them all couple months.

Likewise, looking forward to some more power tweaking results.

And why? Because at these results you need to count on buying a larger PSU and a better cooler, and while 600-series mobos will save you money, that won't go to the GPU - you'll need to spend it on PSU & cooling. While your average game doesn't pull that much power, you can't plan a build based on the ~100 W & ~70 °C from the gaming tests.

Depending on prices, if I go either Ryzen 7000 or Intel 13000, I think power tweaking will be a must. For me (personally) even this power draw during gaming is a lot, and I'll certainly be tweaking the build for an FPS target limit, and no, not a 120/144/165 one.


----------



## RandallFlagg (Oct 20, 2022)

LuxZg said:


> @RandallFlagg - love those useless 1080p 700fps benches and 2 graphs.. I'll wait for a more comprehensive tests.
> 
> Anyway, wanted to say that I'll be waiting some more as well. Maybe 13400, maybe 7600, maybe AMD price drops... Just give them all couple months.
> 
> ...



Of course.  Some AMD acolyte tells us that Der8auer's analysis is useless.


----------



## Valantar (Oct 20, 2022)

RandallFlagg said:


> I'll just wait and upgrade again in the spring I imagine.
> 
> Jeez though this thing is a monster.
> 
> ...


That's impressive performance, but on the other hand it's not like those power reductions are all that huge - 25W in FC6 is okay, but 9W in PUBG is almost nothing. And of course the 5800X3D still beats it in efficiency, even when the 13900K is limited to 90W. FC6: 164/90 = 1.8 fps/W vs. 142/70.5 = 2.0 fps/W; PUBG: 610/89.9 = 6.8 fps/W vs. 465/62.8 = 7.4 fps/W. The 5800X3D is definitely behind in absolute performance, but you'd need to power limit the 13900K further to catch up in efficiency - if it ever does.
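The fps/W figures can be reproduced in a couple of lines (fps and watt values as quoted from Der8auer's charts in the posts above):

```python
# Efficiency (fps per watt) from the figures quoted above.
results = {
    "FC6":  {"13900K @ 90W": (164, 90.0), "5800X3D": (142, 70.5)},
    "PUBG": {"13900K @ 90W": (610, 89.9), "5800X3D": (465, 62.8)},
}

for game, chips in results.items():
    for chip, (fps, watts) in chips.items():
        print(f"{game:4s} {chip:13s} {fps / watts:.1f} fps/W")
# The 5800X3D comes out ahead in both titles (2.0 vs 1.8, 7.4 vs 6.8)
```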


----------



## RandallFlagg (Oct 20, 2022)

80-watt Hamster said:


> Within that population, perhaps.  My own bias toward air is probably clouding my view.  But does a 280mm (or larger) qualify as "common" across the entire set of PC gamers?  I suppose if one is already spending $300+ on processor, another ~$100 on an AIO doesn't seem like much.  Latest SHWS has 80% on 6 cores or fewer.  Let's say 10% of those have AIOs, and 50% of 8+ cores do.  That makes... 18%, I think.  240mm is the most popular rad size, right?  If those plus 120s make up half of the AIOs, 9% of PC gamers have 240mm or larger liquid coolers.  Analysis pulled directly from me bum.  Is 10-15% common?  I dunno.  Seems like medical conditions that affect 20% of the population get described as "rare".
> 
> None of that's on topic anyway, so Imma shut up about it now.



Analysis paralysis?

People buying K or X series chips are buying top of the line chips, they are labelled by both AMD and Intel as enthusiast chips.  Honestly, there is nothing low end about a 13600K.   Upper midrange really ends with the 13600 (or 12400).

I think you'd be right if looking across the entire spectrum, excluding these enthusiast chips - but let's keep in mind 70%-80% of consumer desktop PCs are OEM rigs.  

The vast, vast majority of those are running something like a 12400 / 13400 - and most are running lesser than even that.  Those chips have no problem whatsoever running on air.

The K and X chips are enthusiast chips.  That's why I disdain too much talk of power consumption.  It's the wrong context.

It's like watching a bunch of car enthusiasts talking about a Dodge Hellcat and being concerned about MPG.   90% of people who buy these chips don't give a crap.


----------



## Denver (Oct 20, 2022)

RandallFlagg said:


> I'll just wait and upgrade again in the spring I imagine.
> 
> Jeez though this thing is a monster.
> 
> ...









Why do the test results differ so much? Here Zen 4 is above Alder Lake.


----------



## RandallFlagg (Oct 20, 2022)

Denver said:


> Why do test results differ so much? Here Zen4 is above AlderLake.



Der8auer used lower graphics quality settings. That will tend to make the system more CPU-limited.  

I think he also used DDR5-6800. His results are, for most of us, a fast-forward to what will happen in about two years, IMO. I, for example, will not see that differentiation anytime soon, because first I need a GPU comparable to a 4090.  

Maybe I will have that in 2024/2025 when the 5080 launches.

If you aren't looking at the test system setup first for every review you read, you aren't going to be able to deduce much from any of them. All the sites tell you something a little different.

On TPU I like to see what a rig *I* would build is likely to perform at, today or sometime in the next 12 months.


----------



## Space Lynx (Oct 20, 2022)

wheresmycar said:


> TBH, i wasn't entirely excited about the 13th Gen K series as higher temps/thermals were already speculated (same applies to Zen 4). Can't wait to see what the 13400 (perhaps new value king) and 13700 bring to the table paired up with affordable B-series boards. Same goes for Zen 4, something along the lines 7500/7600/7700 + eventually less pricier B-mobs. Hope we get to see the rest of the SKUs from both camps (+X3D) before the close of 2022.
> 
> @W1zzard  i was hoping DDR4 memory would also make the cut in these 13600K gaming charts/etc. Is this something expected soon?



I just watched the Linus Tech Tips review; they have a DDR4 RAM gaming section. Long story short, RAM doesn't make a damn difference. Even high-end DDR5 vs really horrible low-end DDR5 kits didn't matter in the few tests they did.


----------



## beedoo (Oct 20, 2022)

RandallFlagg said:


> I'll just wait and upgrade again in the spring I imagine.
> 
> Jeez though this thing is a monster.
> 
> ...



I can only imagine the miserable gaming experience that poor 5950X owner must be having, playing at 416fps... sucker! /s


----------



## wheresmycar (Oct 20, 2022)

80-watt Hamster said:


> Analysis pulled directly from me bum.



80 watts of explosively explorative flatulence... anyone got an air freshener handy?


----------



## RandallFlagg (Oct 20, 2022)

CallandorWoT said:


> I just watched linustechtips review, they have a ddr4 ram gaming section. long story short, ram doesn't make a damn difference. even high end ddr5 ram vs low end really horrible ddr5 kits didn't matter in the few tests they did.



I know you meant this, but a qualifier: it doesn't make much of a difference in games. 1-3%, usually. 

In productivity, in certain things like compress/decompress or some encoding, it can be huge as in 15-30%.



beedoo said:


> I can only imagine the miserable gaming experience that poor 5950X owner must be having, playing at 416fps... sucker! /s



Well I was referring to the 164FPS vs 112 FPS.

Anyone with a >100Hz refresh rate monitor is likely going to be quite interested in that difference.   

And that's today.  

In two years, anyone who might maybe think about getting say a 5070 or 8700XT might maybe be interested too.


----------



## Space Lynx (Oct 20, 2022)

RandallFlagg said:


> Well I was referring to the 164FPS vs 112 FPS.
> 
> Anyone with a >100Hz refresh rate monitor is likely going to be quite interested in that difference.



Yep, exactly. Also, one thing W1zz's review doesn't mention is 1% lows and 5% lows in games, and I watched a few reviews on YouTube from various YouTubers. Holy shit, Intel decimates AMD on the minimum-FPS side, which means a smoother gaming experience overall.


----------



## ca_steve (Oct 20, 2022)

Thanks for the review. 

I'd love to see a comparison of the R5 7600X and i5-13600K both limited to 65W (and in the case of the i5 e cores disabled) to get a fun preview of the non-X and non-K parts' performance and efficiency.


----------



## RandallFlagg (Oct 20, 2022)

Valantar said:


> That's impressive performance, but on the other hand it's not like those power reductions are all that huge - 25W in FC6 is okay, but 9W in PUBG is almost nothing. And of course the 5800X3D still beats it in efficiency, even when the 13900K is limited to 90W. FC6: 164/90=1.8fps/W; 142/70,5=2fps/W; PUBG: 610/89.9=6.8fps/W; 465/62.8=7.4fps/W. The 5800X3D is definitely behind in absolute performance, but you'd need to power limit the 13900K further to catch up in efficiency - if it ever does.



The 5800X3D beat it in one of them and lost to it in the other, with the 13900K being the most efficient. 

I think it's fair to say that 99.99% of people actually shelling out money for a 13900K or 5800X3D are **NOT** buying the chips for their power efficiency. The more astute ones (like 2%) might look at dollars per FPS. The time it took me to type this is probably worth more, in terms of money earned if I were being paid, than an entire year of +25W efficiency while gaming.

Now people doing full time rendering for a living, that sub 1% fraction of a fraction of people, they might care and should probably be looking at those factors. 

If you make $100/hr rendering and you sacrifice 5% of your speed to save 20% on power, or spend 20% more power to get 3% more speed ($3/hr), that's a math problem only the specific business would be able to solve for. 

However, my guess is that speed would win there too.
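As a toy version of that math (the hourly rate and the speed/power percentages are the post's hypotheticals; the wattage and electricity price are made-up assumptions):

```python
# Render-farm trade-off sketch: does speed or power saving win?
# RATE and the +/- percentages come from the post; POWER_KW and ELEC
# are assumed values for illustration only.
RATE = 100.0      # $ earned per hour of rendering
POWER_KW = 0.25   # assumed full-load draw in kW
ELEC = 0.15       # assumed electricity price in $/kWh

def net_per_hour(speed_factor, power_factor):
    """Net earnings per hour after scaling throughput and power cost."""
    return RATE * speed_factor - POWER_KW * power_factor * ELEC

eco = net_per_hour(0.95, 0.80)    # give up 5% speed to save 20% power
fast = net_per_hour(1.03, 1.20)   # spend 20% more power for 3% more speed

print(f"eco: ${eco:.2f}/hr  fast: ${fast:.2f}/hr")
```

At anything like these numbers the electricity term is noise next to the throughput term, which is why speed tends to win.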


----------



## RedBear (Oct 21, 2022)

ca_steve said:


> Thanks for the review.
> 
> I'd love to see a comparison of the R5 7600X and i5-13600K both limited to 65W (and in the case of the i5 e cores disabled) to get a fun preview of the non-X and non-K parts' performance and efficiency.


I think it wouldn't be really accurate. Unlike Alder Lake, the mid-range non-K Raptor Lake parts will feature E-cores (8 of them for the 13600, according to a leaked chart and Geekbench records), but they don't use the same Raptor Cove cores as the 13600K (and above); instead they reuse the Golden Cove of Alder Lake (which features a smaller L2 cache).


----------



## L'Eliminateur (Oct 21, 2022)

Reading reviews across all the tech sites, I feel like there's a lot of interesting data missing:

1) Benchmarks under Windows 10; everyone is testing under Win 11 (which is a no-go for me)
2) Benchmarks under Win 10 with only P-cores


----------



## mechtech (Oct 21, 2022)

Looks like poor W1zz has been putting in the OT with all these reviews!!









Intel Core i5-13600K Review - Best Gaming CPU
www.techpowerup.com




3.1% relative difference at 4K over the 5600 I just bit the bullet on... hard to go wrong with a $170 CAD sale price ($125 USD).

Probably wait until Black Friday to pick up a mobo, then be upgraded for another 6 years (on the cheap)  

Media encoding is nice though. Like most modern CPUs, it looks like OC'ing scores lower for most things vs auto/stock.


----------



## Minus Infinity (Oct 21, 2022)

As I said two months ago, the 13600 and 13700 will be beasts, and AMD has no real answer at those price points, as the Intels have moved up a tier in performance but stayed at the same price or only slightly risen. I mean, for god's sake, the 13600K often matches the 12900K, beats the 12700K 95% of the time, and crushes the 7600X and 7700X 90%+ of the time as well. The 13600K also destroys my 5800X in everything. Gosh, I shudder to see how pathetic my 1700X would look now. The 13900K is pointless if you don't power limit it. I'd stick to 150W max and it'll still trash the 12900K by 30%+. However, I'm leaning towards the 13700K because I do lots of photo editing and run sims as well as game. The 13700K will easily beat the 12900K, and you could power limit it and equal the 12900K, all for $409.

All I can say is 7600X RIP, unless it cuts price by $70 and MB prices plummet. Hell, even the 7700X looks like poor value now. AMD may need to get the V-Cache models out faster and price them at current levels, and offer a 7600X3D too. I think the 13400 will give the 7600X a scare in gaming for, what, $199?


----------



## L'Eliminateur (Oct 21, 2022)

I'll still stick with AMD. I don't do "E-cores" at all; I don't want them, I never wanted them, and I won't overpay for unused silicon that I'll end up disabling on first power-up.

If Intel had released an 8-core P-core-only CPU, then my new PC would probably be that. As it is? I don't give a toss if it's more expensive; the software I use runs better on normal cores.


----------



## InVasMani (Oct 21, 2022)

Yeah, agreed, the 13600K does nicely in terms of performance anyway. It's got some raw power consumption issues at idle and at max consumption, at stock as well as power-unlimited, but it's easy enough to fix the worst of the power problems without heavily sacrificing performance. It really reflects what I've said since back when Alder Lake launched: the E-cores are a good design, but needed improving, and Raptor Lake has certainly improved them. Intel should've had more of an 85W/170W base and max power limit on them out of the box, but they're unlocked anyway. 

For the price they aren't bad, and a 13600K will be a good alternative to a 5800X3D for anyone not already on the AM4 socket. If you're on an AM4 board, the 5800X3D is the higher-value consideration; otherwise, I would say the 13600K tends to be the nicer option. It's still pretty early in the launch too, so manual optimizing might make the 13600K look better and better once people have more time to play with the hardware itself. Another perk is the integrated GPU. I'm pleasantly surprised by how good the 13600K is overall.

I thought this would be about the point where Intel and AMD really got into a dogfight, even before Alder Lake launched and when Raptor Lake was just on a marketing slide. The next generation is going to be a bigger slugfest; AMD will have A LOT to prove, because Intel will come out swinging. It's going to be spicier still next CPU generation between AMD and Intel, and I can't wait to see what happens. I feel like this is only a precursor to a bigger battle between them, which for consumers should be glorious.


----------



## wheresmycar (Oct 21, 2022)

CallandorWoT said:


> I just watched linustechtips review, they have a ddr4 ram gaming section. long story short, ram doesn't make a damn difference. even high end ddr5 ram vs low end really horrible ddr5 kits didn't matter in the few tests they did.



Was checking other reviews for the DDR4/DDR5 skirmish and found HUs 12 game average.....

At 1080p, the 13900K with DDR5 beats DDR4 with a 6% performance increase. At 1440p, it's 5%. I looked at a couple of individual games... Watch Dogs showing a WHOPPING almost-15% increase when using the DDR5 kit. What's going on there?

It seems, if you've got the dollars, it's a pretty compelling reason to shift up a gear to DDR5, no? I guess if you already have a spec-savvy DDR4 kit, that's a viable/cost-effective option too.


----------



## Super Firm Tofu (Oct 21, 2022)

wheresmycar said:


> Was checking other reviews for the DDR4/DDR5 skirmish and found HUs 12 game average.....
> 
> At 1080p, 13900K-DDR5 beats DDR4 with a 6% performance increase. At 1440p, it's 5%. I looked at a couple of individual games... Watch Dogs showing almost a 15% increase when using the DDR5 kit. What's going on there?
> 
> It seems... if you've got the dollar, it's a pretty compelling reason to shift up a gear to DDR5, no? I guess if you already have a spec-savvy DDR4 kit, that's a viable/cost-effective option too.



I think you're spot on.  If you have a good kit of DDR4 it's probably smart to just use that.  If you need to buy RAM, might as well spend on DDR5.  You can at least use that again when you replace the board.


----------



## AlwaysHope (Oct 21, 2022)

In two generations, an i5 has more threads than an i7 from 2021!


----------



## jsven008 (Oct 21, 2022)

Arco said:


> Fermi all over again! Bulldozer too?
> View attachment 266313


That picture. "Nvidia, the way it's meant to be grilled." 

Just when I thought CPUs couldn't get any hotter after the 7950X, now we have the 13900K! It's pushing 101 °C in application performance and 90 °C in gaming. I love performance, but this is performance at all costs. I'd wait for a lower-power CPU variant, like a 65-watt 13700 (without the 'K'). I like a cool and quiet PC.



AlwaysHope said:


> In two generations an i5 has more threads than an i7 from 2021 !



Hey, don't be talking about my i7 like that. My i7 will always rule...in my mind!


----------



## W1zzard (Oct 21, 2022)

BoredErica said:


> What is the cause of 13600k performing terribly on 5.6ghz all core in Borderlands 3?


I've noticed it too, no idea



docnorth said:


> The one is virtualization, like @W1zzard said. Maybe cryptography too, both AES and SHA3, page 14.


AES and SHA3 seem fine



wheresmycar said:


> @W1zzard  i was hoping DDR4 memory would also make the cut in these 13600K gaming charts/etc. Is this something expected soon?


No concrete plans, but I can do this in a week or two, once I've caught up with the backlog



Denver said:


> Why do test results differ so much? Here Zen4 is above AlderLake.


Any chance he's using the integrated benchmark? The benchmark uses some kind of fast flyby on a tiny map, which isn't how actual gameplay works


----------



## jsven008 (Oct 21, 2022)

ca_steve said:


> Thanks for the review.
> 
> I'd love to see a comparison of the R5 7600X and i5-13600K both limited to 65W (and in the case of the i5 e cores disabled) to get a fun preview of the non-X and non-K parts' performance and efficiency.


Yes, great review W1zzard! (as always). Hopefully we'll see some 65w comparisons in the future.


----------



## LuxZg (Oct 21, 2022)

RandallFlagg said:


> Of course.  Some AMD acolyte tells us that Der8auer's analysis is useless.


One shouldn't call another person names like that without knowing them, unless they're trying to pick a fight. That said, I'd think one wouldn't base their purchase on two 1080p low-detail graphs; I'm unsure why Wizz runs a test suite of 70-80 (?) apps and games when apparently all you need is two slides.


RandallFlagg said:


> Analysis paralysis?
> 
> People buying K or X series chips are buying top of the line chips, they are labelled by both AMD and Intel as enthusiast chips.  Honestly, there is nothing low end about a 13600K.   Upper midrange really ends with the 13600 (or 12400).
> 
> ...


I agree with this partially, but the harsh truth is that:
A) AMD won't have anything below the 7600X for a long while, so if one is building a new PC from scratch and wants modern components, there's nothing to buy from them but X enthusiast parts
B) while Intel will drop the 13400/13600 sooner, the 13600 won't be actual Raptor Lake, and the difference between the 13600 and 13600K is way more than just a lower TDP. It's naming basically as deceptive as NVIDIA's 4080: less of everything but the "same" name.

Thus we're put between a rock and a hard place: either buy 200 W CPUs that need tweaking with power limits or an undervolt (or both), or buy something from the previous gen, in which case we can just buy Zen 3 on the second-hand market and call it a day.



CallandorWoT said:


> I just watched linustechtips review, they have a ddr4 ram gaming section. long story short, ram doesn't make a damn difference. even high end ddr5 ram vs low end really horrible ddr5 kits didn't matter in the few tests they did.


There's more to DDR4 than a few games. Some people work AND play on the same PC.

I will gladly wait a week or two for Wizz and others to post more DDR4 reviews. (But I think they will show that buying a new PC with DDR4 is unwise.)



ca_steve said:


> Thanks for the review.
> 
> I'd love to see a comparison of the R5 7600X and i5-13600K both limited to 65W (and in the case of the i5 e cores disabled) to get a fun preview of the non-X and non-K parts' performance and efficiency.


I don't expect this to show non-K perf.
BUT! I would like to see the 7600X, 7700X, and 13600K tweaked, undervolted, and power limited to roughly the same power, and not just in games (which in realistic settings are GPU limited) but also in productivity and other tests.

My view/rant:
I've been planning a new PC for this winter, but I can wait a bit more for all the reviews and tests. And perhaps a bit of price drop.

Yes, right now RL has the price edge - if you go with DDR4 and a B660 board. But why would I buy a 13600K then? Lose some here, lose some there - a waste of money.

And if I compare a DDR5 B660, it's a <$20 difference to a B650 board with more PCIe lanes. An unlocked 13600K would need a bit larger cooler and a bit better PSU than a 7600X build, so total build price/perf would suffer. Or go with power limits, but then I want those power-limited benches and graphs. For AMD we've already seen huge power savings without losing noticeable performance (including non-gaming tasks). I want to see if RL can do the same and still beat the 7600X by a noticeable margin (1-3% would be a tie).

IDK, we obviously need more data. The fight is very close. Depending on your focus, you'd be picking very different builds for your money. I wish I were just playing games; I'd pick the cheapest option and move on to GPU reviews. But this way we can be eyeing compromises between 13600K+DDR4 and 7600X+DDR5. Or a ~70-75 W power-limited 7700X+DDR5 vs a 13600K+DDR5. Both would be similar total build expenses. And then we'll get the 13700K reviews... So many options.

IMHO, if you're not just gaming, wait a little before deciding what to buy for your needs, it's still very early days.


----------



## AlwaysHope (Oct 21, 2022)

jsven008 said:


> ....
> 
> Hey, don't be talking about my i7 like that. My i7 will always rule...in my mind!



Same here; my i7, purchased barely a year ago, will be in my gaming rig for quite a while yet. Eight good cores are still all you need for gaming, thanks to the current console gen.
Cypress Cove cores with an overclocked ring bus are just fine IMO.


----------



## gffermari (Oct 21, 2022)

The 13600K is the best overall CPU right now, but...
I would pass on it in favor of AM5, since the advantage of a platform that will be supported for years cannot be ignored.

For AM4 owners, it's pretty clear: 5800X3D or 5950X. And no one will miss this gen of CPUs.

The other thing is that no one with a 12th-gen Intel needs to upgrade. It's not worth it, unlike the gen-to-gen AMD upgrades of recent years.


----------



## Valantar (Oct 21, 2022)

RandallFlagg said:


> 5800X3d beat it in one of them and lost to it in another, with the 13900K being the most efficient.


Uh ... no. I literally did the math in the post you quoted.
FC6:
13900K: 164 fps / 90 W = 1.8 fps/W
5800X3D: 142 fps / 70.5 W = 2.0 fps/W
2.0 > 1.8

PUBG:
13900K: 610 fps / 89.9 W = 6.8 fps/W
5800X3D: 465 fps / 62.8 W = 7.4 fps/W
7.4 > 6.8
In other words, the 5800X3D is the more efficient in both.
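For anyone who wants to redo the arithmetic themselves, here's a quick sketch; the fps and wattage figures are the ones quoted in this post, nothing more.

```python
# fps-per-watt efficiency from the figures quoted above: (fps, package watts).
results = {
    "Far Cry 6": {"13900K": (164, 90.0), "5800X3D": (142, 70.5)},
    "PUBG":      {"13900K": (610, 89.9), "5800X3D": (465, 62.8)},
}

for game, cpus in results.items():
    for cpu, (fps, watts) in cpus.items():
        # Higher fps/W = more useful work per unit of power.
        print(f"{game}: {cpu} = {fps / watts:.1f} fps/W")
```

On these numbers the 5800X3D comes out ahead in both titles, which is the whole point of the comparison.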


RandallFlagg said:


> I think it's fair to say that 99.99% of people actually shelling out money 13900K or 5800X3D are **NOT** buying the chips for their power efficiency.  The more astute ones (like 2%) might look at dollars per FPS.  The time it took me to type this is probably more valuable in terms of money earned if I were being paid than an entire year of +25W efficiency while gaming.


I don't disagree - but I also think that's a damn shame, especially given how performant these chips are and how their power usage is increasing generation over generation. The majority of people either not caring or not knowing enough to care isn't an argument for the thing they don't care about not being important. And, crucially, efficiency is a good gauge of an architecture - its job is to perform useful work, and the cost of doing so is power, so being more efficient is better as long as absolute performance is also sufficient. And a newer architecture being overall less efficient than an older one is thus not all that good. Of course that also applies to the stock tuning of any Ryzen 7000, which are doing the exact same "let's hike up the power, fuck it" song and dance. That's why I'm more interested in the upcoming X3D chips, as they'll most likely be the first real improvement seen this generation - but at a price, obviously. For my part, I'm more than happy with my 5800X, and my interests are shifting more and more towards lower end hardware now that even nominally mid-range stuff like a 13600K is _really_ overkill for even mainstream gaming. The 12100 was the most interesting CPU of the 12th gen (with the 12400 close behind), and I don't think that trajectory is changing any time soon.


LuxZg said:


> A) AMD won't have anything below 7600X for a long while, so if one is building new PC from scratch and wants modern components you have nothing to buy but X enthusiast parts from them


I don't necessarily see this as being true this time around. Yes, they took _friggin' ages_ to get sub-5600X chips out the door, but that was in the middle of record sales, a wafer shortage, a fab capacity crunch, and them trouncing the competition. Literally every variable in that is now changed - wafer supply is decent, fab capacity is essentially wide open as every chipmaker is cutting orders, and competition is fierce. AMD would have to be _extremely _dumb to not get a 5600, 5500(X?) and 5400 out the door ASAP - as well as X3D variants, of course. The market isn't there currently for selling boatloads of high end SKUs, so they need to get the lower end stuff onto store shelves as soon as possible.


----------



## Bomby569 (Oct 21, 2022)

Like the 10400 before it, the 13400 will be the next CPU everyone should buy


----------



## W1zzard (Oct 21, 2022)

Bomby569 said:


> like since the 10400, the 13400 will be the next cpu everyone should buy


As mentioned in the conclusion, everything below the 13600K will be an Alder Lake rebrand using the 8+6 die


----------



## THU31 (Oct 21, 2022)

W1zzard said:


> As mentioned in the conclusion, everything below 13600K will be based on Alder Lake rebrand using the 8+6 die



If they do a 6P+4E CPU under $200, it will be amazing, even with the smaller caches.

And what if they did a 6C/12T i3? Seems hard to believe, especially with no low-end from AMD, but who knows. But the i3 will probably be 4P+4E.


----------



## wolf (Oct 21, 2022)

My only regret is not being first to comment, where almost anything you say gets the most free internet points


----------



## LuxZg (Oct 21, 2022)

Valantar said:


> I don't necessarily see this as being true this time around. Yes, they took _friggin' ages_ to get sub-5600X chips out the door, but that was in the middle of record sales, a wafer shortage, a fab capacity crunch, and them trouncing the competition. Literally every variable in that is now changed - wafer supply is decent, fab capacity is essentially wide open as every chipmaker is cutting orders, and competition is fierce. AMD would have to be _extremely _dumb to not get a 5600, 5500(X?) and 5400 out the door ASAP - as well as X3D variants, of course. The market isn't there currently for selling boatloads of high end SKUs, so they need to get the lower end stuff onto store shelves as soon as possible.



What you say would be the logical path, if it weren't for AM4 and the 5000 series still selling like crazy at the low(er) end...

Btw, did you actually mean 5600/5500/5400 or 7600/7500/7400? I assume 7xxx


----------



## Valantar (Oct 21, 2022)

LuxZg said:


> What you say would be logical path. If there wasn't for AM4 and 5000 series still selling like crazy on low(er) end...


True, but I don't think AMD sees AM5 as a threat to this - there's no way it will be as cheap, after all, with higher motherboard and RAM costs. And while selling more than one CPU per motherboard is obviously what they really want to do, they also really need buy-in for their new platform for this to happen in the future; they can't place all their bets on selling out an older platform - which will sell out eventually anyway, as it's good and cheap.


LuxZg said:


> Btw did you actually mean 5600/5500/5400 or 7600/7500/7400?  I assume 7xxx


Lol, yes, I meant 7, not 5.


----------



## benbird7 (Oct 21, 2022)

R0H1T said:


> They better get that massive IPC jump with the 14th gen, otherwise we'll have oven makers going out of business
> 
> 
> 
> ...


With those temps I think I'll just stick with the 12700k


----------



## Chrispy_ (Oct 21, 2022)

Boom!
There goes any reason for a gamer to buy a 7600X or 7700X.
I saw from other testing that DDR4 really isn't much of a hindrance to the 13600K, too - so you can avoid the AM5 and DDR5 premium for now.


----------



## RandallFlagg (Oct 21, 2022)

benbird7 said:


> With those temps I think I'll just stick with the 12700k



They need to stop using that air cooler for temp testing high-end SKUs. It can only dissipate around 180 W.

So for every CPU that draws more than 180 W in their test, what they are essentially doing is maxing out the temperature to see where the CPU thermal throttles. That's all those charts tell you.
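To illustrate the point, here's a toy steady-state model (my own simplification, not TPU's methodology; Tjmax, ambient, and the cooler's thermal resistance are assumed round numbers): once the requested power exceeds what the cooler can move at Tjmax, the measured temperature pins at Tjmax and only the sustained power changes.

```python
# Illustrative steady-state thermal model (assumed numbers, not TPU's setup):
# a cooler with fixed thermal resistance moving ~180 W at a 75 °C delta.
TJMAX_C = 100.0            # assumed throttle point
AMBIENT_C = 25.0           # assumed ambient
R_THERMAL = 75.0 / 180.0   # °C per watt

def steady_state(power_w):
    """Return (die temp in °C, sustained power in W) for a requested draw."""
    temp = AMBIENT_C + power_w * R_THERMAL
    if temp <= TJMAX_C:
        return temp, power_w
    # Thermal throttle: power falls until the die sits exactly at Tjmax.
    sustained = (TJMAX_C - AMBIENT_C) / R_THERMAL
    return TJMAX_C, sustained

print(steady_state(150))  # below the cooler's limit: temp tracks power
print(steady_state(250))  # pegged at Tjmax, power clipped to what the cooler can move
```

Any two CPUs that both exceed the cooler's limit end up at the same temperature on the chart, which is why the temperature bar alone tells you so little.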


----------



## 80-watt Hamster (Oct 21, 2022)

benbird7 said:


> With those temps I think I'll just stick with the 12700k



Temps aside, why _wouldn't_ you stick with the 12700K? The difference is no more than 5% in any test.


----------



## RandallFlagg (Oct 21, 2022)

Chrispy_ said:


> Boom!
> There goes any reason for a gamer to buy a 7600X or 7700X.
> I saw from other testing that DDR4 really isn't much of a hinderance to the 13600K, too - so you can avoid the AM5 and DDR5 premium for now.



When Zen 4 released, excepting the 7950X, all I saw was parity with Alder Lake but at a higher price.

I think a lot of folks intuitively knew this was coming.  AMD needed to do a 2-gen type leapfrog of Intel, since AL was already demonstrably superior to Zen 3.  They didn't do that, so naturally now they are clearly behind.

With AMD's 2 year release cycle, this is likely to just get worse.  This time next year we'll have Meteor Lake, and AMD will still be on Zen 4.   In 2024 Intel will release Arrow Lake, and that will be what Zen 5 goes up against.  

I find it highly unlikely that AMD would be competitive against an Intel part two generations in the future with Zen 5 using the same socket and so on, when they are effectively most of a generation behind right now.  It's a total repeat of the late 2000s and early 2010s.


----------



## Super Firm Tofu (Oct 21, 2022)

RandallFlagg said:


> When Zen 4 released, excepting the 7950X, all I saw was parity with Alder Lake but at a higher price.
> 
> I think a lot of folks intuitively knew this was coming.  AMD needed to do a 2-gen type leapfrog of Intel, since AL was already demonstrably superior to Zen 3.  They didn't do that, so naturally now they are clearly behind.
> 
> ...



As an Alder Lake and Zen 4 owner I can agree with this.  The one caveat is whether Intel makes their release dates - something they don't have the best track record with.


----------



## 80-watt Hamster (Oct 21, 2022)

RandallFlagg said:


> When Zen 4 released, excepting the 7950X, all I saw was parity with Alder Lake but at a higher price.
> 
> I think a lot of folks intuitively knew this was coming.  AMD needed to do a 2-gen type leapfrog of Intel, since AL was already demonstrably superior to Zen 3.  They didn't do that, so naturally now they are clearly behind.
> 
> ...



Not exactly, IMO.  Back then, AMD was slower, hotter, _and_ more power-hungry, sometimes by appreciable margins. As of now, only the first is true, and not by very much. The deltas could easily grow/shrink/flip over the next couple of generations, but the above seems like a pretty cynical take.


----------



## Valantar (Oct 21, 2022)

RandallFlagg said:


> When Zen 4 released, excepting the 7950X, all I saw was parity with Alder Lake but at a higher price.
> 
> I think a lot of folks intuitively knew this was coming.  AMD needed to do a 2-gen type leapfrog of Intel, since AL was already demonstrably superior to Zen 3.  They didn't do that, so naturally now they are clearly behind.
> 
> ...


You have some points here, but you're ignoring X3D and other future packaging technologies, which will likely change AMD's progression here quite a bit. Zen4X3D will most likely beat Raptor Lake in gaming, and new packaging will allow for increased core counts/hybrid core layouts from AMD as well. The socket isn't a limitation for any of this, so I don't see how it's relevant here. AMD needs a wider big core sooner rather than later, but they also need CoWoS/LSI and to ditch the current through-substrate IF links - which we know they're working closely with TSMC on, and which RDNA3 serves as a test vehicle for. Just the latter will already improve inter-CCD latencies a lot, which will help performance. They'll also drop IF power a lot, allotting more power to the cores. None of that alleviates the need for a wider core to compete with ALD/RPL, but 3D cache kind of does. AMD is definitely on the defensive right now, and Meteor Lake will no doubt be good, but I don't see any reason to discount AMD quite yet.


----------



## benbird7 (Oct 21, 2022)

80-watt Hamster said:


> Temps aside, why _wouldn't_ you stick with the 12700K? The difference is no more than 5% in any test.


Because I upgraded to the 4090 last week and I've got the upgrade bug.

I've hit a wall at 5 GHz P-cores and 4 GHz E-cores; if I raise vcore any more I start to redline on temps. I like to get my single-core performance as high as I can for MSFS. Got an MSI Z690-A Pro.


----------



## RandallFlagg (Oct 21, 2022)

benbird7 said:


> Because I upgraded to the 4090 last week and I've got the upgrade bug.
> 
> I've hit a wall with 5ghz p cores and 4ghz e cores if I raise vcore anymore I start to red line on temps. I like to get my single core performance as high as I can for msfs. Got a MSI z690-a pro



Most 12700Ks have a reputation for not overclocking well; most people can't get past 5.1 GHz.  I got lucky on mine, with 5.3/5.2/5.1/5.0 every two cores, else I'd return it and get a 13600K.  

I'm still tweaking, but this is a stable 12700KF.


----------



## Valantar (Oct 21, 2022)

benbird7 said:


> Because I upgraded to the 4090 last week and I've got the upgrade bug.
> 
> I've hit a wall with 5ghz p cores and 4ghz e cores if I raise vcore anymore I start to red line on temps. I like to get my single core performance as high as I can for msfs. Got a MSI z690-a pro


MSFS seems like literally the only scenario where that upgrade would make any sense (a 17% uplift at 1080p according to Eurogamer, but crucially starting from pretty low fps to begin with), though I hope you can get a good price when selling your 12700K, as it's not exactly a cost-effective move.


----------



## Chrispy_ (Oct 21, 2022)

RandallFlagg said:


> When Zen 4 released, excepting the 7950X, all I saw was parity with Alder Lake but at a higher price.
> 
> I think a lot of folks intuitively knew this was coming.  AMD needed to do a 2-gen type leapfrog of Intel, since AL was already demonstrably superior to Zen 3.  They didn't do that, so naturally now they are clearly behind.
> 
> ...


I can't guarantee any future predictions, but Intel have surged back to parity with AMD because their troubled 10nm process finally ironed out enough kinks to make a viable product, and that product is only viable because they've found a way to make it eat 300W+ without catching fire. Performance/Watt matters immensely in every single segment except the high-end, liquid-cooled, DIY-enthusiast segment, which is an absolutely miniscule segment in terms of units sold. What matters for efficiency is process node, since I believe Intel's chip design engineers to be of a high calibre. From a process node perspective, Intel's last 3 years have been a complete shit-show:






Their 10 nm first failed to launch with 8th Gen, with only some completely defective mobile i3s actually making it out of the foundry. It took until Alder Lake to be viable for desktop and server, and rebranding it "Intel 7" isn't progress, it's just a new marketing name. Looking at that roadmap, right now Intel are transitioning to 4 nm, EUV, and new Foveros packaging ALL AT ONCE. Just bear in mind they've spent FOUR GENERATIONS floundering around trying to get a single jump from 14 nm to 10 nm ("Intel 7") to work viably, and the result is only "good" if you pump insane amounts of power through it. I definitely do not have the confidence you do that Intel are just going to return to form as the leading global semiconductor foundry. Their foundry execution record over the last 5-6 years has been _abysmal_, and they are so far behind TSMC now that I doubt they will truly catch up before the end of this decade.

Additionally, Intel aren't ahead of AMD in IPC; it's just that they clock their CPUs to the moon at 2.5x the power consumption. Intel are playing catch-up with AMD in terms of cache, interconnect, IMC latency, chiplet technology, and MCM scaling from Ryzen 5 up to 8-die EPYC server CPUs. 14th Gen will have many of the things (for the first time) that AMD have had for nearly five years now. That's five years of field-tested experience AMD have that Intel don't, and a lot of that is down to manufacturing process, not design. Again, I don't think I need to reiterate how poor Intel's manufacturing and process node track record has been for the last half-decade, and I don't have any real reason to expect a massive reversal of competence out of the blue...

Going back to Ryzen 7000, sure - it may only be as good as Alder Lake, but it's coolable, efficient, and you know that a PCIe 5.0, DDR5 AM5 motherboard will be good for Ryzen 8000, 9000, 10000 and possibly beyond. Socket 1700 is _already_ a dead end, to be replaced next generation.


----------



## benbird7 (Oct 21, 2022)

Valantar said:


> MSFS seems like literally the only scenario where that upgrade would make any sense (a 17% uplift at 1080p according to Eurogamer, but crucially starting from pretty low fps to begin with), though I hope you can get a good price when selling your 12700K, as it's not exactly a cost-effective move.


Thanks, both. With my 240 mm AIO, unless I change it I'm going to be heat-limited in how far I could overclock a 13th-gen chip.

I'd be interested in how to overclock per core; I'll have to mess around in the BIOS settings. I've got the current setup running stable. I've tried 5.1 GHz, but over 1.32 V on the core it gets too hot in stress testing.


----------



## ShiningSapphire (Oct 21, 2022)

Legacy-ZA said:


> Watch that power draw.
> Nope, nopity NOPE.
> 
> What a clown world we live in.
> ...


This CPU has very high undervolting potential. Stock 1.35 V, undervolted 1.1 V: power draw drops by about 100 W, and temperatures fall from 80-100 °C to ~60-67 °C.
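A rough sanity check on figures like these: at a fixed frequency, dynamic power scales roughly with the square of voltage (a first-order approximation; real chips also have leakage and clock-dependent effects, so treat this as a sketch, not a measurement).

```python
# First-order estimate (assumption: dynamic power ~ V^2 at fixed frequency;
# leakage and clock changes are ignored, so this is only a ballpark check).
v_stock, v_uv = 1.35, 1.10
scale = (v_uv / v_stock) ** 2
print(f"dynamic power scales to ~{scale:.0%} of stock")  # ~66%
```

On a part drawing around 200 W at stock, that voltage drop alone accounts for a large share of the ~100 W savings reported, with slightly lower sustained clocks covering the rest.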








Intel Core i5-13600K vs AMD Ryzen 5 7600X CPU review: More cores and great performance! Comparison in games and applications | PurePC.pl

www.purepc.pl


----------



## cvearl (Oct 21, 2022)

One of the best review formats there is. Thanks for this. Instead of the constant flood of doom videos on YouTube about 90 °C CPUs, you at least point out gaming temps and power consumption under gaming loads, where the vast majority use these chips. Wonderful graphs too. I always recommend your reviews!


----------



## wheresmycar (Oct 21, 2022)

mechtech said:


> Looks like poor W1zz has been putting in the OT with all these reviews!!
> 
> 
> 
> ...



It wasn't long ago that 1440p saw smaller margins at the top of the table too, with 4K sitting at ~1% difference... now, with RPL/Zen 4 vs the 5600/my 9700K, we're seeing a 15% shift at 1440p. 15% at this level is more than acceptable... but in my experience the 9700K's single-core performance is limiting performance or diminishing consistent visual eye candy in select, pacier titles. I like my heavier multiplayer titles to run silky smooth, hence I'm compelled to upgrade.

Now we're hearing the 40 series/potentially RDNA3 is transferring the bottleneck to the CPU... I hope that's just a smelly-wallet-pinch 4090/4K thing, with mid-segment cards at 1440p delivering a finer balance.


----------



## LuxZg (Oct 21, 2022)

An article just popped up in my news feed:








Intel Core i9-13900K vs. AMD Ryzen 9 7950X at 125W and 65W | Club386

Prefer to keep temperature and power consumption down to lower levels this winter? Here's what happens when the best CPUs are scaled back.

www.club386.com
				




Too bad I still can't find anything similar for the 13600K.


----------



## kbk_75 (Oct 21, 2022)

Chrispy_ said:


> Going back to Ryzen 7000, sure - it may only be as good as Alder Lake, but it's coolable, efficient, and you know that a PCIe 5.0, DDR5 AM5 motherboard will be good for Ryzen 8000, 9000, 10000 and possibly beyond. Socket 1700 is _already_ a dead end, to be replaced next generation.



This is why I've just built my very first AMD system ever. Been building PCs since the 8088 and never felt the desire to assemble an AMD rig, not even in the early 2000s. My 7700X with a -30 undervolt boosts to 5.5GHz and runs in the high 30s / low 50s C whilst gaming at 26-28 ambient. It's also pulling barely 50W doing so. Sure, it's not as fast as the new 13th Gen Intel parts but it makes virtually no difference to me at 4K/120 with my 4090 (which I have also, coincidentally, undervolted since my monitor won't go over 120fps. This gives me a power draw of around 270W on the 4090 and it utterly demolishes my 3090's frame rates with that card pulling 350-400W).

Makes for a whisper silent system that runs cool and super efficient at a total system gaming power draw well under 350W and pretty much maxing out my 4K/120 screen! Looking forward to getting a PCIe Gen 5 SSD when those launch and then this rig will be next to perfect for my needs!

Edit: Just checked, the CPU runs at 39-53 °C at 26 °C ambient and draws 51 W in gaming loads! AMD could easily have made this the default behaviour; they knew they were going to lose outright performance to 13th gen!


----------



## Valantar (Oct 21, 2022)

kbk_75 said:


> This is why I've just built my very first AMD system ever. Been building PCs since the 8088 and never felt the desire to assemble an AMD rig, not even in the early 2000s. My 7700X with a -30 undervolt boosts to 5.5GHz and runs in the high 50s / low 60s C whilst gaming at 26-28 ambient. It's also pulling barely 56-60W doing so. Sure, it's not as fast as the new 13th Gen Intel parts but it makes virtually no difference to me at 4K/120 with my 4090 (which I have also, coincidentally, undervolted since my monitor won't go over 120fps. This gives me a power draw of around 270W on the 4090 and it utterly demolishes my 3090's frame rates with that card pulling 350-400W).
> 
> Makes for a whisper silent system that runs cool and super efficient at a total gaming power draw under 350W and pretty much maxing out my 4K/120 screen! Looking forward to getting a PCIe Gen 5 SSD when those launch and then this rig will be next to perfect for my needs!


Sounds like a sweet - and well tuned - setup! I personally wouldn't bother with that PCIe 5.0 SSD though - even when DirectStorage becomes a thing, you won't see any meaningful performance increase compared to a 4.0 drive (and likely even a 3.0 drive). Outside of massive sequential operations the bottleneck is the NAND, not the controller or bus, and NAND isn't getting much faster with time outside of pure sequential loads either. I would much rather have twice the capacity on a nominally slower drive, as real world performance (assuming you pick a good drive) won't be noticeably different.


----------



## mechtech (Oct 21, 2022)

wheresmycar said:


> wasn't long ago 1440p saw smaller margins at the top of the table too with 4k sitting at ~1% difference... now with RPL/Zen 4 vs 5600/my 9700K we're seeing a 15% shift at 1440p. 15% at this level is more than acceptable... but in my experience the single core 9700K is limiting performance or diminishing consistent visual eye candy in pacier select titles. I like my heavier multiplayer titles to run silky smooth hence compelled to upgrade.
> 
> Now we're hearing 40-series/potentially RDNA3 is transfering the bottleneck to the CPU... hope thats just a smelly-wallet-pinch-4090/4K-thing with mid-segment cards at 1440p delivering a finer balance


Ya, the end result comes down to your satisfaction/expectations and how much budget you have.  My current 1700 and old RX 480 were still meeting my wants and needs, so no point burning cash for no perceivable gains for my requirements.  For anyone else... it's your money, do whatever you want with it.


----------



## kbk_75 (Oct 21, 2022)

Valantar said:


> I would much rather have twice the capacity on a nominally slower drive, as real world performance (assuming you pick a good drive) won't be noticeably different.


I couldn't agree more, which is why my boot drive is a FireCuda 530 2TB Gen 4 drive (which, by the way, is noticeably quicker than my previous 2TB Samsung 970 Evo Plus in system start-up and game load times) and my data drive is an 8TB Corsair MP400 Gen 3 unit. I do find the 2TB drive slightly on the small side, so I guess I'll hang on to this setup until decent 4TB Gen 5 drives are out, and then I'll hock this one and be done!

I did consider waiting for the 13700K reviews to come in before I made this rig but I didn't like the notion that 13th gen was on a dead platform that would require a whole new system if I wanted to upgrade in 3-4 years' time! All I can say is I'm glad AMD pushed Intel as hard as they have in the past 2-3 years because the products from the 2600K to the 8086K were marginal improvements, year-on-year, at best! 

Good time to be a PC gamer, honestly, you can't really lose whichever way you go!


----------



## Minus Infinity (Oct 22, 2022)

L'Eliminateur said:


> i'll still stick with AMD, i don't do "e-cores" at all, i don't want them, i never wanted them and i won't overpay for unused silicon that i'll end up disabling on 1st power up.
> 
> IF intel had released a 8-core P-core only cpu, then my new PC would probably be that, as it is? i don't give a toss if it's more expensive, the software i use runs better on normal cores


Well, come Zen 5 you'll get E cores too, but they'll be Zen 5c cores: far more powerful than Gracemont+++, and with SMT enabled. Come Arrow Lake, the i9 will have 48 cores, 8P + 40E!


----------



## Space Lynx (Oct 22, 2022)

Should I bother with Intel Optane? I'm pretty sure my B660 board supports it, but it seems like I never hear anything about Optane. From what I understand it's relatively cheap for a 16GB stick; you plug it into an M.2 slot, install the drivers, reboot, and done... it just makes everything snappier and faster over time automatically after that?

(i just bought a 13600k, is why I am asking)


----------



## juraj (Oct 22, 2022)

*Why is the performance of the overclocked CPU so much worse in "Web Browsing" and "Microsoft Office"?*
Those should be single-core workloads, so 5.1 GHz vs 5.6 GHz should give it a nice boost in both.
Especially in "Speedometer 2" it's 30% slower when overclocked... why?


----------



## ModEl4 (Oct 22, 2022)

AMD Ryzen 7 7700 non-X CPU allegedly features 8 cores and 65W TDP

I wonder at what price it makes sense, given the kind of performance the 13600K/KF brings...



CallandorWoT said:


> Should I bother with Intel Optane? I am pretty sure my B660 board supports it, but it seems like I never hear anything about Optane. From what I understand it is relatively cheap for a 16gb stick of it, you plug it in a m.2 slot, install the drivers, reboot, and done... it just makes everything snappier and faster or something over time automatically after that?
> 
> (i just bought a 13600k, is why I am asking)


Don't bother.
It has good QD1-8 4K random read performance but no real advantage in writes, and it's just rebadged 5-year-old tech.
I don't know how much you can find it for, $30? Just put that towards a better SSD imo, or something else.


----------



## Lovec1990 (Oct 22, 2022)

The 13600K looks nice, but where is the 13700K review?

Some sites have already posted their 13700K reviews.


----------



## tomfuegue (Oct 22, 2022)

Incredible price/performance ratio. I think it will be the winner of this generation. Now waiting for the 13400F.


----------



## Mats (Oct 22, 2022)

This just shows how close these CPUs can be..

*TPU:*_ The *13600K* is a bit faster than the *7700X* at 1080p & 1440p with an RTX 3080._

*HUB: *_The *13700K* is just as fast as the *7700X* at 1080p & 1440p with an RTX 4090:_


----------



## gffermari (Oct 22, 2022)




----------



## InVasMani (Oct 22, 2022)

Mats said:


> This just shows how close these CPUs can be..
> 
> *TPU:*_ The *13600K* is a bit faster than the *7700X* at 1080p & 1440p with an RTX 3080._
> 
> ...



$120 more for 2 P-cores and a small frequency boost, on unlocked chips that already have heavy cooling requirements and over-indulgent power draw out of the box. Keep in mind an Alder Lake Pentium is $75 for the same 2 P-cores. You could probably buy that plus a motherboard for about the same as, or maybe less than, the 13700K's price premium.


----------



## N3M3515 (Oct 22, 2022)

Holy shit, it gives even the 7700X a run for its money!!
Time to lower prices, AMD.....
And the 3D versions won't solve the 7700X's applications deficit; only a price decrease would be acceptable.

AMD should think very hard about increasing the core count of the 8-core part to at least 10, and the 6-core to 8.


----------



## etayorius (Oct 22, 2022)

I just watched Hardware Unboxed's review of the 13600K against the 7600X after reading this review. Not sure what is going on, but their results showed the 7600X being overall faster in gaming; even in the same games tested by both reviews, the results are completely opposite.


----------



## N3M3515 (Oct 22, 2022)

etayorius said:


> I just watched Hardware Unboxed's review of the 13600K against the 7600X after reading this review. Not sure what is going on, but their results showed the 7600X being overall faster in gaming; even in the same games tested by both reviews, the results are completely opposite.


This is interesting, because whichever CPU wins on either site, you won't notice it in real life; these reviews are more of an academic exercise, especially when the results are so close to each other.
A "win", at least for me, has to be a difference noticeable by the user, like 20-30% minimum.


----------



## RandallFlagg (Oct 22, 2022)

etayorius said:


> I just watched Hardware Unboxed's review of the 13600K against the 7600X after reading this review. Not sure what is going on, but their results showed the 7600X being overall faster in gaming; even in the same games tested by both reviews, the results are completely opposite.



Has to do with game selection and platform setup. Look at the three motherboards he is using; he's probably not equalizing them at all. When you just pop a CPU into a motherboard and set XMP, you have no idea what it is going to do.

The difference between what a MEG ACE motherboard (Zen 4) defaults to and what a Tomahawk (DDR4 Intel) or Carbon (DDR5 Intel) defaults to is likely significant. These three boards are at entirely different tiers.


----------



## tfdsaf (Oct 22, 2022)

I like this CPU: competitively priced, fairly decent power consumption in games, amazing gaming performance, and it crushes the 7600X in multithreaded applications! AMD's 7600X needs to cost something like $200 to make sense at this point, and their 7700X needs to cost $350! I don't see AMD doing well with their lower-end parts if prices stay the same.

Intel definitely has an advantage at the mid-range in terms of performance and value! Sure, power consumption and temperatures are quite bad overall, and under heavy application loads this CPU turns into a heater, but for games and normal application work it's a solid CPU.


----------



## wheresmycar (Oct 23, 2022)

etayorius said:


> I just watched Hardware Unboxed's review of the 13600K against the 7600X after reading this review. Not sure what is going on, but their results showed the 7600X being overall faster in gaming; even in the same games tested by both reviews, the results are completely opposite.



Looking at multiple 13600K vs 7600X reviews, there seems to be an emerging pattern which "possibly" explains why some reviews show the 13600K coming out ahead whilst others back the 7600X.

It looks like it's down to the test bench's choice of graphics card. Based on several 10+ game averages, it appears:

- Where a high-end RTX 30-series card is used, the 13600K takes the win with a nice 8% lead (taken from the TPU 12-game bench shown below/others).
- Where the RTX 4090 is used, the 7600X takes the win with a 6% lead (taken from Jarrod's 25-game bench shown below). Other 4090 reviews with 8-12 game averages show a 1-4% lead.

If this is correct, I'd appreciate it if the know-howsers/experts could explain why the two generations of cards show variable results with RPL and Zen 4. I totally get that some of these results will vary based on memory configuration, game type, etc., but for some reason I'm strongly suspecting the primary offender is the GPU(s).

*TPU: 1080p / RTX 3090 / DDR5 / 13600K v 7600X*




Individual 12-game performance: https://www.techpowerup.com/review/intel-core-i5-13600k/18.html


*JARRODS: 1080p / RTX 4090 / DDR5 / 13600K v 7600X*


----------



## gffermari (Oct 23, 2022)

The same here.
With the 4090, the 7600X is faster.


----------



## trparky (Oct 23, 2022)

OK, there's a question that I've had ever since these new chips from both AMD and Intel came out.

How is the Ryzen 7 5800X3D doing so well against these new chips despite it being a two-year-old chip? What's going on here?


----------



## gffermari (Oct 23, 2022)

trparky said:


> How is the Ryzen 7 5800X3D doing so well against these new chips despite it being a two-year-old chip? What's going on here?


----------



## trparky (Oct 23, 2022)

Ok, so one-year-old processor.


----------



## Super Firm Tofu (Oct 23, 2022)

trparky said:


> Ok, so one-year-old processor.



Six months and three days.


----------



## wheresmycar (Oct 23, 2022)

gffermari said:


> The same here.
> With 4090, the 7600X is faster.
> 
> View attachment 266756



Yeah, I wonder if this is a common variable amongst different generations of GPUs... either way, it seems like both the 13600K and 7600X are at the top of their game (pun intended), with the 13600K taking the win thanks to its powerhouse multi-threaded performance. I dunno, I'm not entirely excited about either of these options, but I look forward to more efficient non-K/X models.



trparky said:


> OK, there's is a question that I've had ever since these new chips from both AMD and Intel have come out.
> 
> How is the Ryzen 7 5800X3D doing so well against these new chips despite it being a two-year-old chip? What's going on here?



The 2-year-old AMD 5000 series somewhat recently saw the addition of the "X3D" model... it does things a little differently, with some massive game-performance rewards. Essentially it's AMD's response to Intel's 12th-gen CPUs: taking the older 5800X and vertically stacking extra layers of L3 cache (a much bigger cache), which helps it serve requests on-chip instead of reverting to system RAM. In return, huge performance benefits in gaming, especially in cache-hungry titles! Surprisingly, the 5800X3D even takes on DDR5 platforms housing the 7600X/13600K/etc. and beats them in a couple of titles.

Personally I prefer the faster overall performance of the 13600K (or even the 7600X in a game-only scenario). Obviously you can't justify platform upgrade costs if you're already on AM4!


----------



## trparky (Oct 24, 2022)

wheresmycar said:


> Essentially it's AMD's response to Intel's 12th-gen CPUs: taking the older 5800X and vertically stacking extra layers of L3 cache (a much bigger cache), which helps it serve requests on-chip instead of reverting to system RAM.


I've always thought of system RAM as being pretty damn fast.


wheresmycar said:


> Surprisingly, the 5800X3D even takes on DDR5 platforms housing the 7600X/13600K/etc. and beats them in a couple of titles.


This is the part that surprises me too. How is this even possible? Newer processors are supposed to be faster than the previous generation. That's how it's supposed to be. Yet that's not necessarily the case.


----------



## InVasMani (Oct 24, 2022)

trparky said:


> I've always thought of system RAM as being pretty damn fast.
> 
> This is the part that surprises me too. How is this even possible? Newer processors are supposed to be faster than the previous version. That's how it's supposed to be. Yet, that's not necessarily the case.



It boils down to the 5800X3D avoiding a lot of cache misses and fetches from system memory, which is an order of magnitude slower in relative terms. DDR5, however, is faster than DDR4, so when a workload overruns the cache and its memory requirements are high, the newer platforms can start to pull ahead. A fast, small buffer vs. a slower, wider one.
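You can sketch the effect with a back-of-the-envelope average-memory-access-time calculation. The latency numbers below are illustrative assumptions, not measured values for any of these chips:

```python
# Back-of-the-envelope average memory access time (AMAT).
# Latency numbers are illustrative assumptions, not measured values.
L3_HIT_NS = 10.0    # assumed L3 hit latency
DRAM_NS = 80.0      # assumed DRAM access latency

def amat(l3_hit_rate: float) -> float:
    """Average latency per access for a given L3 hit rate."""
    return l3_hit_rate * L3_HIT_NS + (1.0 - l3_hit_rate) * DRAM_NS

# A much bigger L3 (e.g. the 5800X3D's 96 MB) raises the hit rate
# in cache-hungry games, cutting the average latency dramatically.
print(amat(0.80))   # smaller cache
print(amat(0.95))   # bigger cache
```

Because the miss penalty dominates, even a modest bump in hit rate cuts average latency far more than a small DRAM speed-up would.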


----------



## trparky (Oct 24, 2022)

I wonder, if one were to build a system with a 5800X3D today, how much life could one expect to get out of it, considering today's ever-growing software needs, where software seems to get more and more bloated by the year?


----------



## InVasMani (Oct 24, 2022)

Complex question to answer; it depends on intended usage and on what you prioritize most when comparing two chips.


----------



## RandallFlagg (Oct 24, 2022)

trparky said:


> I wonder, if one were to build a system with a 5800X3D in it today, how much life would one expect to get out of it considering today's ever growing software needs where it seems software is getting more and more bloated by the year?



I'm not sure why you are so enthralled by the 5800X3D. Yes, it's good at games, but it was never the fastest (the 12900K/KS with fast DDR5 was).

In productivity applications, the 5800X3D is demolished by double-digit percentages. The 12600K is 11.5% faster overall, and the 12700K, which is now in its price category, is 29.4% faster.

Meanwhile the 13600K, which is cheaper, is 36.2% faster in applications while clocking in about 5% higher 1440p FPS, and the 7600X is 16.9% faster in applications while matching the 5800X3D in 1440p gaming. This is based on TPU's recent benchmarks, with all updated drivers and BIOSes, and just using a 3080.

The 5800X3D is a good deal - _if you already have a decent AM4 platform and you just want to play games._

Conversely, it's a bad choice for people who are building an entirely new platform (CPU/motherboard/RAM).

I personally would opt for a 5900X if I were on AM4. It loses to the 5800X3D by 5% at 1440p but blows it away in productivity by ~17%.


----------



## Chrispy_ (Oct 24, 2022)

CallandorWoT said:


> Should I bother with Intel Optane? I am pretty sure my B660 board supports it, but it seems like I never hear anything about Optane. From what I understand it is relatively cheap for a 16gb stick of it, you plug it in a m.2 slot, install the drivers, reboot, and done... it just makes everything snappier and faster or something over time automatically after that?
> 
> (i just bought a 13600k, is why I am asking)


Optane cache drives only really helped mechanical OS drives. They're discontinued and the software you rely on is now woefully out of date. 16GB is enough to act as an OS cache, but it's way too small to act as a read cache for a library drive.

The best-case scenario is that you have no serious issues and notice no real improvement.
The worst-case scenario is that the outdated software screws up your Windows install and forces a wipe & reinstall.


----------



## Space Lynx (Oct 24, 2022)

Chrispy_ said:


> Optane cache drives only really helped mechanical OS drives.



I did not realize this. Thank you, I would not have asked the question if I understood that part of it. Thanks   

I still am really happy I got a 13600k. I needed a big upgrade and for the price I can't complain.


----------



## L'Eliminateur (Oct 24, 2022)

Minus Infinity said:


> Well, come Zen 5 you will get E-cores, but they will be Zen 5c cores, far more powerful than Gracemont+++ and with SMT enabled. Come Arrow Lake, the i9 will have 48 cores: 8P + 40E!


Nothing is known about Zen 5 (especially as Zen 4 just launched); all that talk about them bringing E-cores is unsubstantiated rumor.
AND if they do that, great for them; I'm not going to buy those types of CPUs, ever.


----------



## qubit (Oct 24, 2022)

@W1zzard will you be reviewing the 13700K? I'm thinking of buying this model.


----------



## trparky (Oct 25, 2022)

RandallFlagg said:


> I'm not sure why you are so enthralled by the 5800X3D.


Because just about every tech YouTuber seems to include this processor as a comparison as if it's some kind of god-like processor that puts both AMD and Intel's current crop of processors to shame.


L'Eliminateur said:


> AND if they do that, great for them, i'm not going to buy those types of CPU ever.


But if AMD does those kinds of cores the right way, as opposed to the wrong way like Intel is doing, what then? It seems that AMD is doing things far more intelligently than Intel, so the way I see it, at the risk of sounding like an AMD fanboy, when they do come around with their own efficiency-core design, I have no doubt that they'll do it the right way.


RandallFlagg said:


> The 12600K is 11.5% faster overall, the 12700K which is in its same price category now, is 29.4% faster.


But at what cost? Higher energy consumption, higher heat output. Yeah, sure... it won the battle, but it lost the war. One more victory like that and it's game over. A Pyrrhic victory.


----------



## Space Lynx (Oct 25, 2022)

trparky said:


> Because just about every tech YouTuber seems to include this processor as a comparison as if it's some kind of god-like processor that puts both AMD and Intel's current crop of processors to shame.
> 
> But if AMD does those kinds of cores the right way, as opposed to the wrong way like Intel is doing, what then? It seems that AMD is doing things far more intelligently than Intel, so the way I see it, at the risk of sounding like an AMD fanboy, when they do come around with their own efficiency-core design, I have no doubt that they'll do it the right way.
> 
> But at what cost? Higher energy consumption, higher heat output. Yeah, sure... it won the battle, but it lost the war. One more victory like that and it's game over. A Pyrrhic victory.



The 13600K does 74 watts in gaming and beats almost everything across the board in max/min FPS, including the X3D in several games (not all, but the majority), for $309 if you opt for the KF version.

It's not high energy consumption at all for a gaming-only build. /shrug


----------



## L'Eliminateur (Oct 25, 2022)

trparky said:


> Because just about every tech YouTuber seems to include this processor as a comparison as if it's some kind of god-like processor that puts both AMD and Intel's current crop of processors to shame.
> 
> But if AMD does those kinds of cores the right way, as opposed to the wrong way like Intel is doing, what then? It seems that AMD is doing things far more intelligently than Intel, so the way I see it, at the risk of sounding like an AMD fanboy, when they do come around with their own efficiency-core design, I have no doubt that they'll do it the right way.
> 
> But at what cost? Higher energy consumption, higher heat output. Yeah, sure... it won the battle, but it lost the war. One more victory like that and it's game over. A Pyrrhic victory.


There's not really a good way to make an "efficiency" core without it being dogshit on the software side. I mean, what can you do? The same core but with a lower max clock/power?


----------



## trparky (Oct 25, 2022)

L'Eliminateur said:


> There's not really a good way to make an "efficiency" core without it being dogshit on the software side. I mean, what can you do? The same core but with a lower max clock/power?


I don't know. However, I have no doubt that AMD will pull some kind of rabbit out of their hat.


----------



## FreezingPC (Oct 25, 2022)

L'Eliminateur said:


> The same core but with a lower max clock/power?


With less cache on the side to make it smaller, and it starts to look like Zen 4D(ense).
Well, I'm oversimplifying, of course, but yeah...


----------



## THU31 (Oct 25, 2022)

Has anyone seen any gaming tests using only 8 E-cores, with P-cores completely disabled?

With the increased clock speeds and cache, I am curious what the results would be. Maybe it would even be viable for 60 FPS gaming?


----------



## L'Eliminateur (Oct 25, 2022)

THU31 said:


> Has anyone seen any gaming tests using only 8 E-cores, with P-cores completely disabled?
> 
> With the increased clock speeds and cache, I am curious what the results would be. Maybe it would even be viable for 60 FPS gaming?


techteamgb posted a video about that just yesterday 







Results are wildly varied, with some games being "playable" and others unplayable. That said, the performance uplift from ADL to RPL E-cores is astounding.

At this point, Intel could probably do a 20-core E-core-only CPU in the space the P-cores take, but then again, an E-core is essentially the same performance as a Skylake core, so you'd essentially have a 20-core i7-6000-series part.


----------



## THU31 (Oct 25, 2022)

L'Eliminateur said:


> techteamgb posted a video about that just yesterday
> 
> 
> 
> ...



Wow, gaming performance is really impressive.

But I am actually surprised by the productivity performance. In the 13600K, the E-cores deliver basically 50% of the P-cores' performance at slightly less than 50% of the power. Does that not confirm that it would be better to have 2 extra P-cores instead of 8 E-cores?
6 P-cores consume 118 W, so 8 would consume about 157 W, while the stock hybrid 13600K consumes 149 W in their testing. What is the point?

This really is a gimmick on desktop, where they do not care about efficiency anyway. Applications that can utilize 20+ threads will get slightly more performance at slightly less power, but that really seems irrelevant.


----------



## W1zzard (Oct 25, 2022)

qubit said:


> @W1zzard will you be reviewing the 13700K? I'm thinking of buying this model.


Yeah I have the review almost finished


----------



## RandallFlagg (Oct 25, 2022)

THU31 said:


> Wow, gaming performance is really impressive.
> 
> But I am actually surprised by the productivity performance. In the 13600K, the E-cores deliver basically 50% of the P-cores' performance at slightly less than 50% of the power. Does that not confirm that it would be better to have 2 extra P-cores instead of 8 E-cores?
> 6 P-cores consume 118 W, so 8 would consume about 157 W, while the stock hybrid 13600K consumes 149 W in their testing. What is the point?
> ...



Is it?

I don't think I have ever seen a reviewer control for the many factors that influence power consumption results. It isn't just a simple matter of swapping out the CPU.

Unfortunately, most of these sites and tubers justify unequal platforms by saying something about the 'out of the box' experience. The problem with that logic is they have just switched from analyzing the power characteristics of a *CPU* to evaluating a *motherboard's default settings*.

Even if you use the same motherboard, do we know what its default power limits and VF curve look like with different CPUs? My Asus TUF, for example, came out of the box completely power-unlocked.

I mean, without knowing all those details, you really don't know anything about what you just saw or what the reviewer did. This is especially true when testing between different CPUs and different vendors (AMD/Intel).

To give an example of what I'm talking about, study these two charts - yes, that's a 116 W difference for negative performance; even the MSI MAG B660 Tomahawk is drawing 67 W more for about a 1.2% performance loss in CB MC:
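The board-to-board disparity really comes down to performance per watt. A quick sketch with purely illustrative numbers (not the actual chart data) shows how the same CPU can look wildly different depending on board defaults:

```python
# Same CPU, different motherboard defaults: perf-per-watt comparison.
# Scores and wattages are illustrative placeholders, not the reviewer's data.
configs = {
    "board_A_unlocked": {"cb_score": 24500, "watts": 300},
    "board_B_limited":  {"cb_score": 24200, "watts": 184},
}

for name, c in configs.items():
    eff = c["cb_score"] / c["watts"]   # Cinebench points per watt
    print(f"{name}: {eff:.1f} pts/W")
```

A ~1% score difference hiding a ~60% efficiency difference is exactly why "out of the box" power numbers say more about the board than the CPU.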


----------



## THU31 (Oct 25, 2022)

I am not really understanding how that relates to my post, talking about the difference between P-cores and E-cores within the same CPU in the same system.


----------



## 80-watt Hamster (Oct 25, 2022)

RandallFlagg said:


> Is it?
> 
> I don't think I have ever seen any reviewer who is talking about power control for the many factors that influence power consumption results.  It isn't just a simple matter of swapping out the CPU.
> 
> ...



If you're going to compare CB scores, you should also be comparing CB power draw.  Unless it's been otherwise proved that AIDA stability and CB scale consistently relative to each other.


----------



## RandallFlagg (Oct 25, 2022)

THU31 said:


> I am not really understanding how that relates to my post, talking about the difference between P-cores and E-cores within the same CPU in the same system.



The same motherboard can give different results with different CPUs, especially from different generations.  




80-watt Hamster said:


> If you're going to compare CB scores, you should also be comparing CB power draw.  Unless it's been otherwise proved that AIDA stability and CB scale consistently relative to each other.



It's not incumbent on me, the reader, to determine that. The reviewer showed that the motherboards had wildly different power draws with the same CPU under load, while the CB scores were largely the same.

I can pretty much conclude from that that the bulk of the disparity in power draw between two different motherboards comes from the motherboard itself and its configuration.

For example, what was the load-line calibration set to on the CPUs in that comparison? I can instantly overheat my rig and greatly increase power draw by setting it to max. Does anyone ever even mention these settings? Nope.


----------



## THU31 (Oct 25, 2022)

RandallFlagg said:


> The same motherboard can give different results with different CPUs, especially from different generations.



But it is the same CPU, the 13600K. Three tests: all cores, just P-cores, and just E-cores. Did you not see which video I was quoting? The 12600K comparison there is irrelevant; I was just talking about the differences between the three configurations of the 13600K.

This could actually be tested with the 13700K: compare 8P+0E to 6P+8E in gaming performance, productivity performance and power draw.


----------



## qubit (Oct 25, 2022)

W1zzard said:


> Yeah I have the review almost finished


Epic. Look forward to it.


----------



## L'Eliminateur (Oct 25, 2022)

THU31 said:


> Wow, gaming performance is really impressive.
> 
> But I am actually surprised by the productivity performance. In the 13600K, the E-cores deliver basically 50% of the P-cores' performance at slightly less than 50% of the power. Does that not confirm that it would be better to have 2 extra P-cores instead of 8 E-cores?
> 6 P-cores consume 118 W, so 8 would consume about 157 W, while the stock hybrid 13600K consumes 149 W in their testing. What is the point?
> ...


That's the thing, it IS a gimmick, a gimmick no one asked for that's useful ONLY for laptops.

Now, I understand why Intel is shoving it down everyone's throat: if they don't put it across their entire desktop product stack, neither OS makers nor developers will give a frick about a laptop-only feature that has so many problems, and it would end up "nice but flawed". This way everyone becomes a beta tester for something only useful on laptops.

Yes, the silicon space taken by all that e-trash would be much better served by huge P-cores. In fact, E-cores should be capped at 4 at most. I mean, aren't they for "background tasks" and "power efficiency"? Then why does a "high-end" CPU have 16 "background" cores and only 8 performance ones? And why, the further up the stack you go, does the P-core count stay stagnant while only the E-core count increases? The P-core count is the one that should increase, leaving the E-cores fixed at 4, maaaybe 6 for the i9.

It makes no sense at all, and it honestly annoys me that these issues are completely ignored by reviewers.


----------



## Nopa (Oct 26, 2022)

gffermari said:


> The same here.
> With 4090, the 7600X is faster.
> 
> View attachment 266756


The 7600X is still not fast enough imo. The 13600K is a little bit faster than the 7600X, but it's on a dead-end platform.
I think I'll hold on to my 8700K until the 7800X3D comes out. Investing in the Z790 platform at this point is pretty dumb. Besides, the X3D should beat any CPU in gaming, even the upcoming 13900KS.



W1zzard said:


> Yeah I have the review almost finished


Excellent work you've been doing so far! Please do 4090 coverage with the 13900K and 7950X (and the 13900KS & 7800X3D when they come out).


----------



## trparky (Oct 26, 2022)

L'Eliminateur said:


> That's the thing, it IS a gimmick, a gimmick no one asked for that's useful ONLY for laptops.


I don't think so. If you can have efficiency cores handle the operating system itself and all of the associated background processes and services, the performance cores can concentrate on handling your heavy-lifting tasks like running your games and such. That essentially means that the performance cores can keep on trucking along with your game or whatever heavy-lifting task you're running instead of having to handle stupid things like your operating system.



Nopa said:


> I think I'll hold on my 8700K until 7800X3D comes out.


What gets me is why AMD's new chips didn't come even close to the amount of cache that the 5800X3D had. If they had, we wouldn't have the kind of issue we have today.


----------



## Nopa (Oct 26, 2022)

trparky said:


> What gets me is why AMD's new chips didn't come even close to the amount of cache that the 5800X3D had. If they had, we wouldn't have the kind of issue we have today.


I believe they thought releasing only an 8C/16T part with a huge 3D V-Cache was the most sensible choice for gaming. They left the gap between the 7700X and 7900X on purpose: in case the 7700X & 7950X are beaten by the 13700K & 13900K, the 7800X3D can come to the rescue and let them overtake Intel's best flagship in gaming. That could well happen, based on how the 5800X3D kept AMD neck and neck with the 12900K & 12900KS.

The 7800X3D's gaming performance should be double that of the 13900KS, likely even better.


----------



## 80-watt Hamster (Oct 26, 2022)

L'Eliminateur said:


> That's the thing, it IS a gimmick, a gimmick no one asked for that's useful ONLY for laptops.
> 
> Now, I understand why Intel is shoving it down everyone's throat: if they don't put it across their entire desktop product stack, neither OS makers nor developers will give a frick about a laptop-only feature that has so many problems, and it would end up "nice but flawed". This way everyone becomes a beta tester for something only useful on laptops.
> 
> ...



They can already barely cool eight highly-clocked P-cores. How are they going to deal with more?


----------



## Wasteland (Oct 26, 2022)

trparky said:


> I don't think so. If you can have efficiency cores handle the operating system itself and all of the associated background processes and services, the performance cores can concentrate on handling your heavy-lifting tasks like running your games and such. That essentially means that the performance cores can keep on trucking along with your game or whatever heavy-lifting task you're running instead of having to handle stupid things like your operating system.



Yes.  The E-cores also perform extremely well in many multi-threaded tasks.  Alder Lake took back the MT performance crown from Zen 3 largely on the strength of E-cores.  Calling them a "gimmick," or "only useful for laptops," is a biiiig stretch.

If all you care about is gaming, then sure, the E-cores don't do a whole lot for you.  You're free to disable them or to buy a CPU that doesn't have them, but there's no downside to having them active either.  Yeah there were some scheduling hiccups at first, but as far as I know they've been ironed out by now.


----------



## L'Eliminateur (Oct 26, 2022)

trparky said:


> I don't think so. If you can have efficiency cores handle the operating system itself and all of the associated background processes and services, the performance cores can concentrate on handling your heavy-lifting tasks like running your games and such. That essentially means that the performance cores can keep on trucking along with your game or whatever heavy-lifting task you're running instead of having to handle stupid things like your operating system.
> 
> 
> What gets me is why AMD's new chips didn't come even close to the amount of cache that the 5800X3D had. If they had, we wouldn't have the kind of issue we have today.


And why would I want efficiency cores for that? Simply put in more performance cores that handle the same thing with greater performance. On a laptop it makes sense; on a desktop? Nah, just give me more big cores.



> If all you care about is gaming, then sure, the E-cores don't do a whole lot for you.  You're free to disable them or to buy a CPU that doesn't have them, but there's no downside to having them active either.  Yeah there were some scheduling hiccups at first, but as far as I know they've been ironed out by now.


Except they haven't, and there are lots of downsides: you need to run Windows 11, and even then they're problematic with older software (and even with newer software), while Zen 4 feels smoother and more responsive.
Also, I'm not going to pay Intel for something I'll have to disable on first boot (it was bad enough with integrated GPUs; at least they added the F CPUs for that). Give me a pure, homogeneous, ultra-high-performance classic CPU; that's what I pay for, not some laptop gimmick.
I don't only care about gaming, but it's my main concern. I do all sorts of stuff and I'm a very heavy multitasker (think games + browser(s) with 1400 tabs + WA desktop + PDFs + Illustrator/InDesign + Discord + assorted TSRs all running at the same time), and I won't downgrade to Win 11.


----------



## Wasteland (Oct 26, 2022)

Well I've never touched Windows 11 and my i7-12700 performs just fine.  Take it for what it's worth, but for my money you're obsessing over something that really isn't a big deal.


----------



## Nopa (Oct 26, 2022)

Wasteland said:


> Well I've never touched Windows 11


There are only two things I like about Win 11: its extra HDR options and its DirectStorage advancements compared to Win 10.


----------



## THU31 (Oct 26, 2022)

trparky said:


> I don't think so. If you can have efficiency cores handle the operating system itself and all of the associated background processes and services, the performance cores can concentrate on handling your heavy-lifting tasks like running your games and such. That essentially means that the performance cores can keep on trucking along with your game or whatever heavy-lifting task you're running instead of having to handle stupid things like your operating system.



You are literally just quoting Intel's marketing that so many people are buying into. This only applies to a situation where a game uses 100% of your P-cores. But guess what? If you add 2 more P-cores, suddenly they will be able to perform those background tasks too, which consist of single digit percentages when it comes to CPU usage.
This is not an issue with multi-core CPUs. Back in the day of slow single-core CPUs, an extra E-core would have made a big difference off-loading all the background stuff to it. But these days common CPUs have 12 or 16 extremely fast threads. Background tasks will not cause stutters or hitches with such CPUs.



80-watt Hamster said:


> They can already barely cool eight highly-clocked P-cores. How are they going to deal with more?



Have you seen the P-core vs. E-core comparison?
6 fully loaded P-cores in the 13600K consume 118 W. How much would 12 P-cores consume? Using simple math, I would guess about 236 W.
How much does the 13900K consume? 300-400 W, with just 8 P-cores. Where does that extra power come from? From those super-efficient E-cores, I guess.
16 P-cores would consume about 315 W (at 13600K clocks, so ~5.1 GHz) using the same simple math. Would that be harder to cool than the actual 13900K, which consumes far more?

I get it. E-cores are good for productivity. But my point is that including E-cores in lower SKUs is just marketing (and benchmark scores).
The 13600K would be an even better gaming CPU if it had 8 P-cores and 0 E-cores. It would still beat the 7600X, but I guess it would only match the 7700X instead of beating it in Cinebench.
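The "simple math" in the post above is just a linear extrapolation from the measured 6-core figure. A quick sketch (illustrative only, since real package power also depends on voltage, clocks, and uncore load):

```python
# Naive linear extrapolation of P-core power, as used in the post above.
# Assumes each P-core draws the same power at a fixed ~5.1 GHz all-core
# clock, and ignores uncore/ring power -- a rough illustration only.
MEASURED_CORES = 6
MEASURED_WATTS = 118  # 13600K, 6 P-cores fully loaded

def estimated_power(p_cores: int) -> float:
    """Scale the measured per-core draw to a hypothetical core count."""
    return MEASURED_WATTS / MEASURED_CORES * p_cores

print(f"12 P-cores: ~{estimated_power(12):.0f} W")  # ~236 W
print(f"16 P-cores: ~{estimated_power(16):.0f} W")  # ~315 W
```

Treat these as napkin numbers: a real 12 or 16 P-core part would likely run different clocks and voltages, so linear scaling is only a rough upper-level estimate.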


----------



## trparky (Oct 26, 2022)

THU31 said:


> You are literally just quoting Intel's marketing that so many people are buying into. This only applies to a situation where a game uses 100% of your P-cores. But guess what? If you add 2 more P-cores, suddenly they will be able to perform those background tasks too, which consist of single digit percentages when it comes to CPU usage.
> 
> This is not an issue with multi-core CPUs. Back in the day of slow single-core CPUs, an extra E-core would have made a big difference off-loading all the background stuff to it. But these days common CPUs have 12 or 16 extremely fast threads. Background tasks will not cause stutters or hitches with such CPUs.


I've not read any marketing material from Intel at all. I'm basing what I said on theories about how tasks are handled by the processor: each time it has to change tasks, it has to do what is known as a context switch, and every time a processor does one of those, you lose five to ten clock cycles.

Again, I've not read a single damn line of Intel marketing. I'm simply basing it on my own theories.


----------



## 80-watt Hamster (Oct 26, 2022)

THU31 said:


> You are literally just quoting Intel's marketing that so many people are buying into. This only applies to a situation where a game uses 100% of your P-cores. But guess what? If you add 2 more P-cores, suddenly they will be able to perform those background tasks too, which consist of single digit percentages when it comes to CPU usage.
> This is not an issue with multi-core CPUs. Back in the day of slow single-core CPUs, an extra E-core would have made a big difference off-loading all the background stuff to it. But these days common CPUs have 12 or 16 extremely fast threads. Background tasks will not cause stutters or hitches with such CPUs.
> 
> 
> ...



I've read a handful of P-vs-E articles, but can't remember/find one specifically about power.  If you've got a link, I'd definitely be interested to read (or possibly re-read) it.

Let's operate for the moment on the assumption that 12 P-cores (the number that would hypothetically fit on the die based on an eyeball analysis of the below diagram) would operate within the same thermal envelope.  We now have a 12c/24t chip instead of a 24c/32t and have lost thread-count parity with AMD's top desktop model.  Would it be functionally superior?  Maybe.  The number of MSDT use cases where that's true is fairly small, I'd wager.  But it could easily be mostly about the marketing.  In any case, we don't have a 12 P-core RPL processor to prove any of this.


----------



## RandallFlagg (Oct 26, 2022)

80-watt Hamster said:


> I've read a handful of P-vs-E articles, but can't remember/find one specifically about power.  If you've got a link, I'd definitely be interested to read (or possibly re-read) it.
> 
> Let's operate for the moment on the assumption that 12 P-cores (the number that would hypothetically fit on the die based on an eyeball analysis of the below diagram) would operate within the same thermal envelope.  We now have a 12c/24t chip instead of a 24c/32t and have lost thread-count parity with AMD's top desktop model.  Would it be functionally superior?  Maybe.  The number of MSDT use cases where that's true is fairly small, I'd wager.  But it could easily be mostly about the marketing.  In any case, we don't have a 12 P-core RPL processor to prove any of this.
> 
> View attachment 267341




Alder Lake would have started 5 years before release, meaning 2016.  It's unlikely that Intel was deciding to use E-Cores as a response to anything regarding process nodes or high core count competition.   

Intel and AMD are both on different tracks to reach the same destination.  They are both trying to get more efficiency while increasing compute capacity.  

AMD used chiplets in its first phase, Intel used hybrid.  

Intel's next step is disaggregation - chiplets just not separating the compute chiplet like AMD did (yet).  

AMDs next phase is likely to have different types of cores on a chiplet, and mixing those chiplets.  These would be like e-cores and p-cores.  

Linux patches to support this on AMD systems were released early this year.




*AMD could follow Intel and switch to hybrid CPUs*

AMD Explains why Intel Switched to a Hybrid Core Architecture | Hardware Times
AMD’s Lead Marketing Manager for the Ryzen family, Robert Hallock, appeared in an interview with KitGuru this Sunday. He spoke on a variety of topics at length, including the adoption of the hybrid core architecture by Intel. According to Hallock, this primarily has to do with power efficiency...
www.hardwaretimes.com

----------



## 80-watt Hamster (Oct 26, 2022)

RandallFlagg said:


> Alder Lake would have started 5 years before release, meaning 2016.  It's unlikely that Intel was deciding to use E-Cores as a response to anything regarding process nodes or high core count competition.
> 
> Intel and AMD are both on different tracks to reach the same destination.  They are both trying to get more efficiency while increasing compute capacity.
> 
> ...



Not disputing any of that.  The point was supposed to be that a hypothetical RPL CPU that uses the E-core die space for P-cores wouldn't be the ball of amazing that some like to think.  Or maybe it would.  I'm no CPU expert.


----------



## aktpu (Oct 27, 2022)

L'Eliminateur said:


> And also, i'm not going to pay intel for something i will have to disable on first boot(it was bad enough with integrated gpus, at least they added the F cpus for that)


Wait till you hear what AMD forces you to pay for on new 7000-series CPUs (or are they APUs now?)


----------



## L'Eliminateur (Oct 27, 2022)

aktpu said:


> Wait till you hear what AMD forces you to pay for on new 7000-series CPUs (or are they APUs now?)


The "barely 3D" integrated graphics are actually a good thing, especially for troubleshooting. If they offered a cheaper non-GPU version, like Intel's F series, that would be great.
Also, several gimped, unwanted cores are hardly the same as a block that does something no other part of the chip can do.


----------



## RandallFlagg (Oct 27, 2022)

80-watt Hamster said:


> Not disputing any of that.  The point was supposed to be that a hypothetical RPL CPU that uses the E-core die space for P-cores wouldn't be the ball of amazing that some like to think.  Or maybe it would.  I'm no CPU expert.



I was just providing context, because there is a tale spun in this same thread that Intel spammed e-cores in response to Zen.  

The first Zen came out (2017) a year after Alder Lake was started (2016), it wasn't on chiplets, topped out at 8/16, was on an inferior process node and didn't perform nearly as well as its Coffee Lake competitor (8700K) despite its 2-core advantage.  This persisted even with Zen 1+ on TSMC "12nm" vs 9th gen (9900K) which went to 8/16 cores. 

It really wasn't until Zen 2 with its chiplets that core count was significantly shifted (3900X / 3950X), and that was 2019 - 3 years after Alder Lake would have been started. 

So no, Alder Lake and E-cores is not a knee jerk reaction to AMDs more cores strategy.  It was in the pipeline long before that.  If there was such a reaction to AMDs Zen 2 surprise, it was probably 10th Gen.


----------



## W1zzard (Oct 27, 2022)

This review has been updated with new performance numbers for the 13900K. Due to an OS issue the 13900K ran at lower than normal performance in heavily multi-threaded workloads. All 13900K test runs have been rebenched (the 13900K review has been updated, too).


----------



## L'Eliminateur (Oct 28, 2022)

W1zzard,
is there a way for you to add P-core-only results (preferably on Win10) to some benchmarks? Tangent: would it also be possible to do a Win10-to-Win11 analysis piece with RPL and Zen 4, now that Win11 has been out for quite some time?


----------



## lightning70 (Oct 30, 2022)

Best for gaming, and it also has good app performance. I'm not sure if I should switch from my 12600K? It may be more logical to wait for the 14th generation.


----------



## THU31 (Oct 30, 2022)

lightning70 said:


> best for gaming it also has good app performance. I'm not sure if I can switch from 12600k? It is more logical to wait for the 14th generation.


If you only need gaming performance, there is no reason to switch. But it is just a simple CPU swap.

14th gen will require a new motherboard. We do not know much about Meteor Lake right now, except that it will use multiple dies in a single package. But rumors say that it will not be able to clock as high as Alder/Raptor Lake, which are on a very mature process. 14th gen will be the first one using the new Intel 4 process. I would not be surprised if gaming performance was actually lower because of that.
I do not think it is a major architecture change for the actual CPU cores. I think the main purpose is to switch to the multi-chip approach. It will have much better efficiency, but not peak performance. Mobile is the main focus here.


----------



## KaitouX (Nov 5, 2022)

ComputerBase did some testing on the 13600K at lower power limits, and it had some pretty good results: at 125 W it is only ~5% slower than at the stock 181 W.
Here's a simple chart using their results, also including the i9 and i7. The Y axis is their score based on 9 workloads (7-Zip, Agisoft PhotoScan Pro: Align Photos (84 JPEGs), Blender Benchmark: Quick Benchmark, Cinebench R15, Cinebench R20, Corona 1.3 Benchmark, DigiCortex Simulation: BenchLarge, HandBrake, POV-Ray):
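Taking the two numbers quoted above at face value (~5% performance loss when dropping from 181 W to 125 W), the perf-per-watt gain works out like this (illustrative figures, not ComputerBase's exact scores):

```python
# Rough perf/W comparison for the 13600K at stock vs. a 125 W limit,
# using the approximate figures quoted above (~5% slower at 125 W).
stock_watts, stock_perf = 181, 1.00      # stock limit, normalized score
limited_watts, limited_perf = 125, 0.95  # 125 W limit, ~95% of stock

stock_eff = stock_perf / stock_watts
limited_eff = limited_perf / limited_watts
gain = limited_eff / stock_eff - 1
print(f"perf/W improvement at 125 W: {gain:.0%}")  # ~38%
```

In other words, giving up ~5% of the score buys roughly a third more efficiency, which is why these chips respond so well to modest power limits.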


----------



## tunste (Nov 5, 2022)

L'Eliminateur said:


> And why would I want efficiency cores for that? Simply put in more performance cores that handle the same thing with greater performance. On a laptop it makes sense; on a desktop? Nah, just give me more big cores.
> 
> 
> Except they haven't, and there are real downsides: you need to run Windows 11, and even then they're problematic with older software (and even with some newer software), and Zen 4 feels smoother and more responsive.
> ...


I have a 13600K on an Asus Z690 Strix A D4 at 5.9 GHz peak and 5.6 GHz all-core, on a 2-year-old custom water loop with a Corsair 420 radiator (3 x 140 mm Arctic fans in push configuration), with Samsung B-die 4000 CL16 (2 x 16 GB) @ 3871 MHz; BCLK is 100.1 and auto ring is 4500. The processor came from Newegg's New Jersey warehouse. This is a golden CPU; it only hits 65°C after 3-4 hours of gaming, and the overclock is Linpack stable.
I have kept my E-cores enabled to handle background tasks. My E-cores run at 4.2 GHz under heavy and 4.4 GHz under medium-to-light workloads.
If you have the right cooling solution, you can really push a good 13600K. I used the Asus AI overclocking feature; it took my 13600K to a 68% overclock, with my cooling rated at 167 on Asus's cooler rating system. I did set the CPU voltage manually to 1.34 V, as auto was setting it to 1.41 V at idle but 1.26 V under full load benching at 5.6 GHz all-core.
This is my first i5; I normally get the processor just below the king of the hill, like the Intel 12700K or AMD 5900X, both of which I've had watercooled as my main system. The 12700K is now in my wife's system and the 5900X in my daughter's.
This 13600K is a better performance-per-price point than I have seen in over 20 years. I have been building PCs since the days of the 286, and I can finally say I've gotten my golden processor that overclocks outstandingly.
I view running Windows 11 as a positive, as it has finally matured into a good OS after the latest update.


----------



## Tek-Check (Nov 12, 2022)

Would it be possible to retest the 13600K vs. the 7600X with a 4090? The latest results from 54 games by HUB suggest that the 7600X has an edge over the 13600K at both 1080p and 1440p, by 5% and 4% respectively. This is significantly different from the original results on TPU with a 3080.


----------



## Gica (Nov 12, 2022)

DDR4 versus DDR5, and the 13600K versus the 5800X3D, in 15 tests


----------



## Vario (Nov 17, 2022)

tunste said:


> I have a 13600K on an Asus Z690 Strix A D4 at 5.9 GHz peak and 5.6 GHz all-core, on a 2-year-old custom water loop with a Corsair 420 radiator (3 x 140 mm Arctic fans in push configuration), with Samsung B-die 4000 CL16 (2 x 16 GB) @ 3871 MHz; BCLK is 100.1 and auto ring is 4500. The processor came from Newegg's New Jersey warehouse. This is a golden CPU; it only hits 65°C after 3-4 hours of gaming, and the overclock is Linpack stable.
> I have kept my E-cores enabled to handle background tasks. My E-cores run at 4.2 GHz under heavy and 4.4 GHz under medium-to-light workloads.
> If you have the right cooling solution, you can really push a good 13600K. I used the Asus AI overclocking feature; it took my 13600K to a 68% overclock, with my cooling rated at 167 on Asus's cooler rating system. I did set the CPU voltage manually to 1.34 V, as auto was setting it to 1.41 V at idle but 1.26 V under full load benching at 5.6 GHz all-core.
> This is my first i5 processor; I normally get the processor just below the king of the hill like  Intel 12700K or AMD 5900X which I had both as my main system watercooled. The 12700K now in my wife’s system & 5900X in my daughter’s system now.
> ...


Absolutely fantastic CPU.  Congratulations on the purchase.


----------



## ebonyhill (Nov 19, 2022)

*Why is the performance of the overclocked CPU so much worse in "Web Browsing" and "Microsoft Office"?*
Those should be single-core workloads, so 5.1 GHz vs. 5.6 GHz should give it a nice boost in both.
Especially in "Speedometer 2", it's 36% slower when overclocked... why?


----------



## ColdSKySuper (Dec 8, 2022)

In fact, the 13600KF has greater potential; it can even beat the 12900K completely.


----------



## persizi (Dec 14, 2022)

At about 1.100 V, the 13600K becomes more power efficient than the 7700X or 7900X.






25K from 144 W.
If you want more energy efficiency, it is better to limit the current (A) than the power (PL). That way you get better performance for the same power.


----------



## PenguinBelly (Jan 1, 2023)

You know you can limit the power supplied to Zen 4 CPUs, right? And who knows, maybe even the 13900K at a certain voltage can be more efficient than the 13600K at its stock V/f curve. As a matter of fact, I am sure of it.



> AMD's new Zen 4 platform requires very expensive motherboards. Even the "cheapest" B650 chipset board costs well over $200, while Intel motherboards can be found for around $100. Sure, these might not be the latest and greatest Z790, but *there won't be any big differences when opting for a cheaper B660 board*, for example.



I am curious if the author actually verified this on a $100 board.


----------



## Why_Me (Jan 3, 2023)

PenguinBelly said:


> You know you can limit the power supply to Zen 4 CPUs, right?   And who knows, maybe even the 13900K at certain voltage can be more efficient than the 13600K at its stock V/f curve.  As a matter of fact, I am sure of it.
> 
> *I am curious if the author actually verified this on a $100 board.*


Who in their right mind runs a 13900K on a $100 board?


----------



## 80-watt Hamster (Jan 3, 2023)

Why_Me said:


> Who in their right mind runs a 13900K on a $100 board?



There's some multi-quoting going on, and the question was (AFAICT) about 13900K efficiency at 13600K power limits, and if the hypothesis that the i9 would be more efficient in that scenario would be true on both higher-end and budget boards.


----------



## persizi (Jan 4, 2023)

PenguinBelly said:


> You know you can limit the power supply to Zen 4 CPUs, right?   And who knows, maybe even the 13900K at certain voltage can be more efficient than the 13600K at its stock V/f curve.  As a matter of fact, I am sure of it.
> 
> 
> 
> I am curious if the author actually verified this on a $100 board.


The 13600K here is not power limited, just undervolted and slightly overclocked on the E-cores. CPUs with more cores are more efficient than ones with fewer cores at the same power; no question about it.


----------

