# Intel Core i9-13900K



## W1zzard (Oct 20, 2022)

With the Core i9-13900K, Intel delivers impressive performance. Our in-depth review confirms: Raptor Lake is the world's fastest CPU for gaming. Even in applications the processor is able to match AMD's Zen 4 Ryzen 9 7950X flagship. If only power consumption wasn't so high...

*Show full review*


----------



## thunderingroar (Oct 20, 2022)

Looking forward to future CPU testing with a 4090


----------



## Crackong (Oct 20, 2022)

98 degrees vs 95 degrees
whoa


----------



## Chaitanya (Oct 20, 2022)

Best to skip both these flagships this gen then.


----------



## Asni (Oct 20, 2022)

The most useful segment of the 12900K review was the test with E-cores disabled.


----------



## Solid State Brain (Oct 20, 2022)

Other reviewers are seeing higher results in CB23 with power limits removed.


----------



## usiname (Oct 20, 2022)

I just can't


----------



## BorisDG (Oct 20, 2022)

This year/generation of CPUs is a FLOP. Go next!


----------



## birdie (Oct 20, 2022)

TL;DR: Meteor Lake can't come soon enough.

AMD's classic approach on average works better (a lot more power efficient, and in many tasks quite a bit faster) than Intel's big.LITTLE foray.

To be honest, both Zen 4 and RPL are meh. I hope it's not a sign of things to come, and that Zen 4+/Zen 5/Meteor Lake will dial temperatures and power consumption back down. Things are getting out of hand, really. First ADL with ~250W power consumption, then Zen 4 with ~240W, then Lovelace with 450W, now RPL with ~280W.

Igor's Lab shows different power consumption figures, though (chart attachment):

Now suddenly RPL is not so bad.


----------



## swirl09 (Oct 20, 2022)

Man, this seems underwhelming. I mean, unless you do mixed workloads and make use of the extra E-cores.

I wasn't expecting much, but it still somehow comes up short. I have a conservative all-core OC on the 12900K at low volts and it games like a champ.


----------



## Athlon2K15 (Oct 20, 2022)

Why are you testing on Windows 11 21H2 when Intel specifically said that 22H2 has the new core scheduler for 13th Gen, along with all the improvements to Thread Director, and then complaining in your pros/cons that workloads get put on the wrong cores? wtf.


----------



## Space Lynx (Oct 20, 2022)

117 celsius on load? holy shit. i didn't even know that was possible.


----------



## gridracedriver (Oct 20, 2022)

Terrifying, simply.


----------



## LifeOnMars (Oct 20, 2022)

I can hand on heart say I am so glad that I am solely a gamer who plays at 4K and, for once, made the right choices for me. I have an energy-efficient CPU (5700X) which provides enough cores and speed and runs very cool; slow RAM (just 3200 CL16); a budget motherboard (ASRock B450) which previously ran a 3600 and only provides PCIe 3.0; and finally, a good quality 3080 12GB GPU (my biggest extravagance).

This system works beautifully, runs very cool and extremely quiet, and is still running the latest game releases very well at good fidelity settings (Plague Tale/Uncharted), not to mention a huge backlog of older Steam games to feast on.

I'm hoping that by the time I have to upgrade again, we'll be back to more refined parts, but for now I'm more than happy with a near-silent, cool-running 4K gaming experience at good framerates.

I don't envy those competitive gamers who feel the need to upgrade because they need to max out 500 Hz monitors or else they can't aim, or those enthusiasts who simply have to have the latest and greatest. These companies are mocking us.


----------



## R0H1T (Oct 20, 2022)

fevgatos said:


> Yes, all or most MT workloads (cbr23, blender , corona etc.). TDP numbers are not meaningless. *If you put a PL2 on the bios, the mobo respects it.* If *mobo manafacturers do stupid stuff like power unlocking it on stock, that's their own problem.*


So, will you continue to be the Intel apologist again 






That's BS, because Intel can force any of these mobo makers to go bankrupt overnight! You think they're doing this without Intel's knowledge/approval?


----------



## kapone32 (Oct 20, 2022)

399 Watt power draw seems very excessive. No wonder the VRMs on the Z790 boards are so robust.


----------



## Solid State Brain (Oct 20, 2022)

CallandorWoT said:


> 117 celsius on load? holy shit. i didn't even know that was possible.



For some reason it appears that the temperature limit was increased for this review even though the default should be 100 °C.


----------



## spnidel (Oct 20, 2022)

280W CPU, 500W+ GPU, let's gooo
yeah, this generation of CPUs from both AMD and Intel is ass: small performance uplifts, ridiculous power draw, all at a high entry cost
looking forward to seeing what CPUs will have to offer in 2-3 generations, when DDR5 will actually be worth it


----------



## agatong55 (Oct 20, 2022)

spnidel said:


> 280w cpu, 500w+ gpu, let's gooo
> yeah this generation of cpus from both AMD and Intel is ass, small performance uplifts, ridiculous power draw, all at a high entry cost
> looking forward to see what cpus will have to offer in 2-3 generations when ddr5 will actually be worth it


but for someone like me who has a 1600x no matter what I choose it's a huge upgrade. It's not always about the people that want / can afford to upgrade every gen.


----------



## R0H1T (Oct 20, 2022)

agatong55 said:


> but for someone like me who has a 1600x no matter what I choose it's a huge upgrade. It's not always about the people that want / can afford to upgrade every gen.


Well if you go to 5700x or 5800x3d you'd likely double your overall performance at a third of the cost.


----------



## HenrySomeone (Oct 20, 2022)

Fastest cpu overall, much faster in games and can be had for barely more than 7900x on a (potentially, if you so choose) much cheaper platform to boot! What's not to like, except of course, if you're a hardcore red team fanboy, then there will always be a pea under the mattress...


----------



## fevgatos (Oct 20, 2022)

R0H1T said:


> So, will you continue to be the Intel apologist again
> 
> 
> 
> ...


I'm not an apologist; the results in MT are underwhelming. In gaming, ofc, it literally dumps on everything out there.

About the wattage: I don't care what mobos do, I care about Intel specs. You can have it pull 900 watts for all I care; it doesn't really matter.


----------



## ymdhis (Oct 20, 2022)

120C and 400W, holy shit.


----------



## Kissamies (Oct 20, 2022)

At least it's useful as winter is coming so it can be used as a heater.


----------



## TheinsanegamerN (Oct 20, 2022)

The power scaling of raptor lake is pure garbage. Those efficiency cores are utterly useless.


----------



## R0H1T (Oct 20, 2022)

fevgatos said:


> I'm not an apologist; the results in MT are underwhelming. *In gaming, ofc, it literally dumps on everything out there.*
> 
> About the wattage: *I don't care what mobos do, I care about Intel specs.* You can have it pull 900 watts for all I care; it doesn't really matter.


And I can show you dozens of benches where Intel chips don't enforce PL1/PL2 strictly, like AMD does with their PPT. And this from a company which patched non-Z OC loopholes *a year after mobos were released*. You think everyone's a fool here?

Roughly 10% (wrt 7950x) is what now  






Yes you do seem to have a love/hate relationship with facts, like you hate facts!


----------



## TheinsanegamerN (Oct 20, 2022)

agatong55 said:


> but for someone like me who has a 1600x no matter what I choose it's a huge upgrade. It's not always about the people that want / can afford to upgrade every gen.


in your case none of the new stuff makes sense when the 5800x3d exists.


----------



## AdmiralThrawn (Oct 20, 2022)

Looks like Intel takes the crown again. Now let's see AMD's card.


----------



## Ayhamb99 (Oct 20, 2022)

Yikes, that power consumption is too much. I do not understand why AMD and Intel think brute-forcing their way to a performance lead, while suffering high temperatures and power usage, is a good idea. I'm now curious to see how the 13700K will perform and hope it's not as inefficient as this monstrosity.


----------



## ModEl4 (Oct 20, 2022)

Around 2% slower in gaming and 10% slower in MT than what I was expecting at stock, but still good (the power-limits-removed mode is undesirable for probably most 13900K users, I would imagine). But regarding MT, it's nowhere near the +40% at 251W vs the 12900K that Intel claimed.


----------



## Chaitanya (Oct 20, 2022)

Just remembered Intel is rumoured to launch HEDT CPUs this time around, and these desktop-class chips don't inspire confidence in that platform, nor in their upcoming glued-together server CPUs.


----------



## fevgatos (Oct 20, 2022)

ModEl4 said:


> but regarding MT it's nowhere near the +40% at 251W vs 12900K that Intel claimed.


It is, if you check other reviews, e.g. Guru3D. At this point I'm not sure what's going on and who to trust, but Guru3D has it at 38k CB R23 stock.


----------



## Dirt Chip (Oct 20, 2022)

A glorious example of a "Pyrrhic victory".
I hope that after tuning, that ~300W all-core load will be closer to 200W with -5% perf and -20 °C. Then I'm fine.


----------



## Solid State Brain (Oct 20, 2022)

fevgatos said:


> Ιt is - if you check other reviews, eg. guru 3d. At this point im not sure what's going on and who to trust, but guru3d has it at 38k cbr23 stock.



In this regard, Computerbase.de mentions (Google translation):

Intel Raptor Lake im Test: Leistungsaufnahme und Effizienz / Leistungsaufnahme in Anwendungen ab Werk / Der Verbrauch steigt

www-computerbase-de.translate.goog


----------



## fevgatos (Oct 20, 2022)

Actually, lots of reviews have the 13900K at 38k CB R23 at stock 250W. Something is off again with TechPowerUp...



Solid State Brain said:


> In this regard, Computerbase.de mentions (Google Translation):
> 
> 
> 
> ...


So it's more than 50% faster at similar wattage. So I was right all along, right?
@usiname @R0H1T


----------



## oxrufiioxo (Oct 20, 2022)

CallandorWoT said:


> 117 celsius on load? holy shit. i didn't even know that was possible.



117 is the new 60 bruh.


----------



## Arco (Oct 20, 2022)

When will we get 7nm ++++++++++++++++

Also why intel? This is stupid.


----------



## ModEl4 (Oct 20, 2022)

fevgatos said:


> Ιt is - if you check other reviews, eg. guru 3d. At this point im not sure what's going on and who to trust, but guru3d has it at 38k cbr23 stock.


I will check and reply; it seems odd. Also Computerbase, according to @Solid State Brain.


----------



## Dirt Chip (Oct 20, 2022)

Arco said:


> When will we get 7nm ++++++++++++++++
> 
> Also why intel? This is stupid.


As long as it can keep up with, and outperform, AMD's 5 nm, why not?


----------



## P4-630 (Oct 20, 2022)

Where is the review of the i7 13700K?


----------



## Xex360 (Oct 20, 2022)

Seems both high-end Zen 4 and 13th Gen are a no-buy: one with a stupidly high entry price, the other a dead platform with no cooler capable of handling it.
For gaming, the 5800X3D is probably still the best, especially at high resolutions.
One correction: $590 is not the MSRP; that's Intel's price for customers buying 1000+ units.


----------



## spnidel (Oct 20, 2022)

agatong55 said:


> but for someone like me who has a 1600x no matter what I choose it's a huge upgrade. It's not always about the people that want / can afford to upgrade every gen.


like another user said, if you upgrade to a 5800X3D you'll get nearly the same performance, but you won't have to buy a new mobo (provided there's a BIOS that supports it) + RAM, so it'll be far cheaper


----------



## fevgatos (Oct 20, 2022)

ModEl4 said:


> I will check and reply, it seems odd, also computerbase according to @Solid State Brain


TechPowerUp has a history of poor results with Intel CPUs. It happened with the 12900K as well; their power-limited numbers were as much as 70% below the score it should actually have been getting.


----------



## Arco (Oct 20, 2022)




----------



## spnidel (Oct 20, 2022)

HenrySomeone said:


> Fastest cpu overall, much faster in games and can be had for barely more than 7900x on a (potentially, if you so choose) much cheaper platform to boot! What's not to like, except of course, if you're a hardcore red team fanboy, then there will always be a pea under the mattress...


you sure you're not dicktracy's alt account? I wouldn't call 10% higher perf in games "much faster", lol
I swear the wording you use is almost completely identical to that shill's, with goalposts being moved every time
speaking of shills... how much are you paid per post? or do you do it for free?


----------



## ModEl4 (Oct 20, 2022)

fevgatos said:


> Ιt is - if you check other reviews, eg. guru 3d. At this point im not sure what's going on and who to trust, but guru3d has it at 38k cbr23 stock.


I had a glance but must have missed it: does Guru3D say anywhere explicitly that CB R23 was run at the 251W PL2?
But I saw the below on the last page:

_It has to be stated that today you'll see very different reviews on performance throughout a lot of media. See, the motherboard manufacturers get to decide how the CPU is configured concerning PL1/PL2 states (the maximum allowed energy consumption during a set timeframe), and these values will differ everywhere. We also noticed that when you run a benchmark, often the score is never 100% the same.

We tested on a Z790 motherboard that allows the multi-core configuration to be set at reference specification, and we did that. So you're seeing as close to reference performance as possible_


----------



## Dirt Chip (Oct 20, 2022)

Arco said:


> View attachment 266316


It's funny 'cause it's an old AMD FX CPU


----------



## Fleurious (Oct 20, 2022)

Wow, those temps are something else. Performance is too good to call it a modern NetBurst, but literally… hot damn.


----------



## Arco (Oct 20, 2022)

Dirt Chip said:


> It's funny cus it`s an old AMD FX CPU


Man with Michigan's weather doing things I'm going to be happy to have a very strong space heater.


----------



## R0H1T (Oct 20, 2022)

fevgatos said:


> Actually, lots of reviews have the 13900K at 38k CB R23 at stock 250W. Something is off again with TechPowerUp...
> 
> 
> *So it's more than 50% faster at similar wattage.* So I was right all along, right?
> @usiname @R0H1T


No you weren't; those are power limits, not actual power consumption! Besides, your math is wrong:
(96/71)×100 or (111/81)×100 or (126/93)×100

Might wanna get a scientific calculator for the results.
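For what it's worth, those three ratios can be checked in a couple of lines (using only the figures quoted above; which benchmark they come from is not stated here):

```python
# Sanity-check the three ratios quoted above; each pair is simply
# the two figures cited in the post.
pairs = [(96, 71), (111, 81), (126, 93)]
for a, b in pairs:
    ratio = 100 * a / b
    print(f"({a}/{b})x100 = {ratio:.1f}, i.e. about +{ratio - 100:.0f}%")
```

All three land around +35-37%, well short of 50%.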


----------



## fevgatos (Oct 20, 2022)

R0H1T said:


> No you weren't; those are power limits, not actual power consumption! Besides, your math is wrong:
> (96/71)×100 or (111/81)×100 or (126/93)×100
> 
> Might wanna get a scientific calculator for the results.


Power limit IS the power consumption on Intel. You can actually see it; the Computerbase review has the power draw. With the stock 253W limit, the 13900K draws... 253W (surprise, I know). And it scores 43% better in CB R23 compared to the 12900K @ 240W. So I was perfectly right, dunno wth you are talking about.


----------



## TheoneandonlyMrK (Oct 20, 2022)

So 95 isn't the new 65, 115 is?!

Funny, now I do get to say I told you so; 95 isn't an issue, as I said.


----------



## Solid State Brain (Oct 20, 2022)

At this point I almost suspect that some reviewers are deliberately sabotaging their tests to get rage clicks. Look at the power scaling by HardwareUnboxed.


----------



## spnidel (Oct 20, 2022)

damn, this cpu is so good! only twice as much power for a slightly higher cinebench score!



wowza, and a whole 4% faster than a 7700x in games! sure is a bad year to be an "AMD fanboy"!
both generations are nothing special, but to see intel shills desperately damage control in here is hilarious
ask for an hourly raise for all the effort you're putting in


----------



## fevgatos (Oct 20, 2022)

Solid State Brain said:


> At this point I almost suspect that some reviewers are deliberately sabotaging their tests to get rage clicks. Look at the power scaling by HardwareUnboxed.
> 
> 
> View attachment 266318


Yeah, HWUnboxed and TechPowerUp are shitting the bed with the power draws. Every other site I checked has completely different results: Guru3D / Computerbase / Club386 / Igor's Lab. Wtf is going on?


----------



## Solid State Brain (Oct 20, 2022)

spnidel said:


> damn, this cpu is so good! only twice as much power for a slightly higher cinebench score!



Something clearly went wrong in the HardwareUnboxed test.

How can the 13900K at lower power levels produce the same scores as the 12900K, when most of the efficiency improvements should be in that operating region?


----------



## Rahmat Sofyan (Oct 20, 2022)

wintel is coming ...

go intel go ..


----------



## MarsM4N (Oct 20, 2022)

Nicely head to head with the AMD 7950X; it didn't crush it like many predicted. Pricing is very competitive, though. AMD needs to cut prices.

Hats off for the *gaming performance*: quite a lead over the AMD parts. Let's see if it can withstand AMD's upcoming mighty X3D force.
But then it all comes down to pricing. I guess Intel will be the top pick for budget gaming if AMD doesn't bring out a cheap 7600X3D or similar variant.

It would also be interesting to see some 13900K vs. 7950X *performance/W* charts with locked power targets.

And _*temperatures*_, holy moly! Folks were making fun of AMD's temps. Intel: hold my beer.








> Some workloads get scheduled onto wrong cores



What, it's still not fixed?  lol, Intel.


----------



## Steevo (Oct 20, 2022)

8% faster than the 7700 for almost 100W more power and 15 degrees hotter? Where dem boys at that was saying 95 was gonna cause degrading… what's the deal here?


----------



## v12dock (Oct 20, 2022)

I can see 3D cache for Ryzen 7000, no problem. I can't see a KS version of the 13900; that power and temp is simply unreal...


----------



## R0H1T (Oct 20, 2022)

fevgatos said:


> You can actually see it, the computerbase review has power draw.


Their max power draw listed is for P95 v29.8 which isn't what some of the other reviews are testing it with.


fevgatos said:


> So I was perfectly right, dunno wth y ou are talking about


Let's see what we have here ~



fevgatos said:


> Yeah hwunboxed and techpowerup are shitting the bed with the power draws.


No *you can't read reviews*, that's what the issue is!


----------



## spnidel (Oct 20, 2022)

Solid State Brain said:


> Something went clearly wrong in the HardwareUnboxed test.
> 
> How can it be that the 13900K at lower power levels is producing the same scores as the 12900K when most of the efficiency improvements will be in that operating region?


wut? guru3d is showing power consumption of 368w at a multithreaded load, and a ~38k CB23 score, that's pretty much in line with HUB's findings


----------



## mahanddeem (Oct 20, 2022)

Thank you for the review.
I have two comments, please:
1. In the 720p game tests, are there any engine caps on FPS that some CPUs would hit before becoming compute-limited?

2. I would (in my opinion) use the new 4090 to remove any GPU-side bottleneck in all game resolution tests, since it has tremendous power and exposes CPU differences better.

Thanks, Wizard


----------



## Steevo (Oct 20, 2022)

HenrySomeone said:


> Fastest cpu overall, much faster in games and can be had for barely more than 7900x on a (potentially, if you so choose) much cheaper platform to boot! What's not to like, except of course, if you're a hardcore red team fanboy, then there will always be a pea under the mattress...


Cough…




HenrySomeone said:


> I have to agree; with how much AM5 costs as a whole, it was absolutely stupid to try to retain cooler compatibility and sacrifice well over 10 degrees (as DerBauer's testing clearly shows) due to the ridiculously thick IHS. Just another typical AMD blunder I guess...



That pea is at least 20 °C cooler. What's Intel's blunder here, beyond more cores, cache and power?

Their prediction unit and cache latency are the two advantages they have.


----------



## fevgatos (Oct 20, 2022)

R0H1T said:


> Their max power draw listed is for P95 v29.8 which isn't what some of the other reviews are testing it with.
> 
> Let's see what we have here ~
> View attachment 266320View attachment 266321
> ...


Sure, I can't read reviews. With the exception of TechPowerUp, every other review has the 13900K scoring 38 to 39k in CB R23 at 253W, YES? The 12900K, on the other hand, scores 27.5k @ 240W, yes? What percentage is that increase?

Can't you just admit you were wrong, jesus christ
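Taking the scores quoted in this exchange at face value (38-39k vs 27.5k; treat them as thread-quoted numbers, not verified measurements), the implied uplift works out as:

```python
# Implied CB R23 uplift from the scores quoted above:
# 38-39k for the 13900K at 253W vs 27.5k for the 12900K at 240W.
base = 27500
for score in (38000, 39000):
    print(f"{score} vs {base}: +{100 * (score / base - 1):.1f}%")
```

That is roughly +38% to +42%, in the ballpark of Intel's claimed +40%.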


----------



## dirtyferret (Oct 20, 2022)

I picked up a i9-13900k off QVC this morning, George Foreman was selling them!


----------



## oxrufiioxo (Oct 20, 2022)

R0H1T said:


> Their max power draw listed is for P95 v29.8 which isn't what some of the other reviews are testing it with.
> 
> Let's see what we have here ~
> View attachment 266320View attachment 266321
> ...



GN got 300w in blender stock as well.


----------



## Dimitriman (Oct 20, 2022)

This is a great round, with both Intel and AMD pushing their architectures to the absolute limit. Amazing performance from both camps, which will make the CPU market all the better for us buyers. Hopefully a price war is incoming!


----------



## fevgatos (Oct 20, 2022)

spnidel said:


> wut? guru3d is showing power consumption of 368w at a multithreaded load, and a ~38k CB23 score, that's pretty much in line with HUB's findings


No, Guru3D measures system power. That's why the 7950X is at 340W. TPU is measuring CPU power only. See the problem?


----------



## usiname (Oct 20, 2022)

It's hard not to use insulting words with someone as clueless as you. I will just post the power consumption from Computerbase next to your "50% faster at similar wattage".


----------



## Xuper (Oct 20, 2022)

I think winner is 12600K.


----------



## defaultluser (Oct 20, 2022)

So, why did you continue to test using the ancient 3080 data set for the games? You could have started with this, and just re-run the 720p tests.

RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review

We test the NVIDIA GeForce RTX 4090 with 53 games at three resolutions, comparing the AMD Ryzen 7 5800X against the Intel Core i9-12900K. The idea here is to get a feel for how much graphics performance is lost by a weaker processor.

www.techpowerup.com


----------



## Solid State Brain (Oct 20, 2022)

spnidel said:


> wut? guru3d is showing power consumption of 368w at a multithreaded load, and a ~38k CB23 score, that's pretty much in line with HUB's findings



I'm referring to the results at reduced power limits. The i9-12900K at a power limit of 125W scores about 24,000 points.








AMD Ryzen 9 7950X vs. Intel Core i9-12900K at 125W and 65W | Club386

What happens when you dial down power limits? You might be surprised.

www.club386.com




The 13900K apparently does the same (or less) according to HardwareUnboxed's power scaling graph.
Besides this, the score-power curve they show does not look correct; it is not supposed to grow roughly _linearly_ with power.


----------



## nexus290 (Oct 20, 2022)

The latest version of RPCS3 has the Zen 4 7950X faster than the 12900K, but in your review it's slower than the 12900K: AMD Ryzen 9 7950X Tops PS3 Emulation Rankings, Thanks to AVX-512 | Tom's Hardware (tomshardware.com).


----------



## R0H1T (Oct 20, 2022)

fevgatos said:


> Sure, I can't read reviews. With the exception of TechPowerUp, every other review has the 13900K scoring 38 to 39k in CB R23 at *253W*, YES? The 12900K, on the other hand, scores 27.5k @ *240W*, yes? What percentage is that increase?
> 
> Can't you just admit you were wrong, jesus christ


Oh FFS, these are *TDP levels*. Do you think setting a 253W or 241W TDP will get you the exact same power draw across a thousand review sites?

Do you know this thing called entropy, plus the physics & chemistry, all at play here?


----------



## KarymidoN (Oct 20, 2022)

89 °C in gaming (stock), bruh... the NH-U14S got stomped.


----------



## Hossein Almet (Oct 20, 2022)

EDITOR'S CHOICE, really?


----------



## fevgatos (Oct 20, 2022)

R0H1T said:


> Oh FFS these are *TDP levels*, do you think setting 253W or 241W TDP will get you the exact same power draw


YES, it will. It's not TDP, it's a power limit. If you set it to 250W, it will draw 250W, unless you do something wrong with the loadlines. It literally cannot pull more than the power limit you set.


----------



## Crackong (Oct 20, 2022)

Solid State Brain said:


> I'm referring to the results at reduced TDP. The i9-12900K at a power limit of 125W scores about 24000 points at 125W.



'Reduced TDP' isn't equal to 'power limit'.
In HWU's review they specifically mentioned it is 'power limited',
so the 13900K can't boost beyond the limit.

We all know TDP means practically nothing for modern CPUs


----------



## watzupken (Oct 20, 2022)

Hard pass on both Zen 4 and Raptor Lake for me; I don't need more heat in my home. Intel once again takes the performance crown (mostly), with two additional awards for highest power and temperature.


----------



## watzupken (Oct 20, 2022)

Hossein Almet said:


> Editor's Choice, really?


No award, no review units ever again. Jokes aside, it is a fast chip if you are willing to ignore the ridiculous power draw and temperature. I don't believe you can get this to run cooler without a delid.


----------



## Tomgang (Oct 20, 2022)

Oh my God. Raptor Lake and Zen 4 are too hot, too power hungry, and not so much faster that I find them worth upgrading to from Zen 3.

Honestly, I'm glad I chose Zen 3. Yes, it's not as fast, but it's also less power hungry and runs much cooler. I'm keeping my 5600X and 5950X for a while longer; they are at least easy to air-cool.

Both CPUs and GPUs these days just seem to be pushed to the max, way out of their efficiency range. Not the direction I like. Sure, fast hardware is cool, but running at 90 °C+ and eating 200W+ is not. That's just nuts.


----------



## Solid State Brain (Oct 20, 2022)

Crackong said:


> 'Reduced TDP' isn't equal to 'Power Limit'
> In HWB's review They specifically mentioned it is 'Power Limited'
> So the 13900k can't boost beyond the limit.
> 
> We all know TDP means really nothing for modern CPUs



I changed that to "power limit" just before you replied because I expected something along these lines as an answer.

To clarify: I meant power limits. On Intel CPUs, you change the power limits and the CPU draws no more than what has been set unless, as fevgatos mentioned above, the so-called DC loadline has been incorrectly set.


----------



## Toss (Oct 20, 2022)

What is this, 117 °C?
What are the throttle temps on this garbage?


----------



## neatfeatguy (Oct 20, 2022)

Hossein Almet said:


> EDITOR'S CHOICE, really?



At least it didn't say "Highly Recommended". I would have scoffed at him if it said that. The CPU is faster than its predecessor, but I personally am not impressed with it, nor with AMD's offerings.

Perhaps I'm just biased, since I moved to a 5900X and a 3080, and the performance gains compared to what I have, and what I need them for, are kind of lackluster. I see no reason to upgrade to anything new that's come out in terms of CPUs or GPUs.


----------



## 64K (Oct 20, 2022)

Strictly from a gamer's perspective: very small FPS gains over an i5-13600K in games, and it costs $270 more. I don't pay any attention to synthetic scores because I don't play synthetics. I play PC games.


----------



## oxrufiioxo (Oct 20, 2022)

Both flagships are meh this generation. They can both be impressive at times, but even with my 360 CLC I'd cap either of them to 125-180W.


----------



## fevgatos (Oct 20, 2022)

It's true, I'm afraid: power limit = the maximum power allowed. That's just a fact.


----------



## HD64G (Oct 20, 2022)

Efficiency scaling 13900K vs 7950X


----------



## xorbe (Oct 20, 2022)

That 5800X3D though: 99.1% at 4K gaming and #2 in efficiency, while the 13900K takes #1 in power usage for its 100%.


----------



## fevgatos (Oct 20, 2022)

HD64G said:


> Efficiency scaling 13900K vs 7950X
> 
> View attachment 266328


The graph is a failure though


----------



## W1zzard (Oct 20, 2022)

Athlon2K15 said:


> Why are you testing on windows 11 21H2 when Intel specifically said that 22H2 has the new core scheduler for 13th Gen along with all the improvements to thread director and then complain in your pros cons that cores get put on wrong workloads. wtf.


22H2 was not available at the time of testing in the summer. Also, it's so new that I'm sure there are plenty of bugs, maybe some that hurt AMD, so I wanted to keep things fair.

Only one workload went to the wrong cores (virtualization)

I'll test 21H2 vs 22H2 soon in an in-depth article


----------



## rv8000 (Oct 20, 2022)

It's actually horrifying to see the power usage and temps on this; I wasn't expecting the efficiency to be that much worse.

I see a lot of apologists in here who were lambasting AMD for 95 °C and increased power usage; the 13900K is on another level.

I was thinking about a 13600K, but it looks like holding out for the 7800X3D, or whatever models end up getting the 3D V-Cache treatment, is the way to go.


----------



## W1zzard (Oct 20, 2022)

Solid State Brain said:


> For some reason it appears that the temperature limit was increased for this review even though the default should be 100 °C.


Correct, so we can figure out the real temperature these CPUs will run at without throttling.


----------



## Solid State Brain (Oct 20, 2022)

fevgatos said:


> It's true I'm afraid. Power limit = The maximum power allowed. That's just a fact.



To be fair, an additional point of confusion here (besides that almost nobody appears to be aware that on Intel the reported Package Power can be accurate, _if_ the DC loadline has been correctly configured) is that TechPowerUp's "CPU only" power draw figures are taken from the motherboard's 12V power connectors. So they are not just for the CPU chip only, but they include at the least also the VRMs, whose efficiency will vary with CPU load, VRM temperatures and of course motherboard design. Unless these variables have been factored out in some way.


----------



## W1zzard (Oct 20, 2022)

Solid State Brain said:


> To be fair, an additional point of confusion here (besides that almost nobody appears to be aware that on Intel the reported Package Power can be accurate, _if_ the DC loadline has been correctly configured)


Just less wrong if you tweak loadline, and there's no way to know by how much unless you actually calibrate your measurements, which cannot be done unless you solder around on the motherboard



Solid State Brain said:


> TechPowerUp's "CPU only" power draw figures are taken from the motherboard's 12V power connectors. So they are not just for the CPU chip only, but they include at the least also the VRMs, whose efficiency will vary with CPU load, VRM temperatures and of course motherboard design. Unless these variables have been factored out in some way.


Yup, we measure with high-precision, high-resolution test equipment on the 12V power connectors. Data collection happens on a separate machine, not the tested machine


----------



## fevgatos (Oct 20, 2022)

Solid State Brain said:


> To be fair, an additional point of confusion here (besides that almost nobody appears to be aware that on Intel the reported Package Power can be accurate, _if_ the DC loadline has been correctly configured) is that TechPowerUp's "CPU only" power draw figures are taken from the motherboard's 12V power connectors. So they are not just for the CPU chip only, but they include at the least also the VRMs, whose efficiency will vary with CPU load, VRM temperatures and of course motherboard design. Unless these variables have been factored out in some way.


But that's easy to check with the VID, is it not? Most high end motherboards should be pretty accurate at auto LLC settings with matched VID and vcores.


----------



## Dirt Chip (Oct 20, 2022)

spnidel said:


> wut? guru3d is showing power consumption of 368w at a multithreaded load, and a ~38k CB23 score, that's pretty much in line with HUB's findings


GURU3D use all system power, not just CPU: "Keep in mind that we measure the ENTIRE PC, not just the processor's power consumption."


----------



## clopezi (Oct 20, 2022)

I would love to see how the Noctua NH-D15 performs with these CPUs; for many users it's our reference heatsink... and often better than liquid cooling...

On the other hand, it looks like DDR5 has no big advantage here. Looks like my DDR4-3200, which I'd love to replace, will live to see another day...


----------



## Solid State Brain (Oct 20, 2022)

fevgatos said:


> But that's easy to check with the VID, is it not? Most high end motherboards should be pretty accurate at auto LLC settings with matched VID and vcores.



Yes, the simplest way is tweaking the DC Loadline until VID (a calculated value) and vcore (actually measured CPU voltage) match as close as possible in a multithreaded workload, which can take some time. If not for _more or less_ correct CPU chip-only power measurements, at least this would ensure that power limits are actually working as intended.

From what I've observed with recent middle-range motherboards, VID and vcore almost never match with stock settings, and some motherboards may set completely wrong DC Loadline values depending on other firmware settings.


----------



## birdie (Oct 20, 2022)

13900K and 7950X at 125W and 65W:









Intel Core i9-13900K vs. AMD Ryzen 9 7950X at 125W and 65W | Club386 (www.club386.com)

"Prefer to keep temperature and power consumption down to lower levels this winter? Here's what happens when the best CPUs are scaled back."




Looks like both CPUs are needlessly overclocked out of the box.


----------



## Dirt Chip (Oct 20, 2022)

The race for the highest power consumption is on.
For a second it seemed AMD had reached parity, but along came the 13900K and smoked everyone (and everything) around.


----------



## Denver (Oct 20, 2022)

RPL manages to be even hotter than Zen 4. I know winter is coming, but this is a very appealing sales tactic. Just kidding


----------



## phill (Oct 20, 2022)

Amazing review there @W1zzard, above and beyond as always mate, amazing work and effort.

I've gotta say, reading through quickly, it's not the earth-shattering, massively-ahead-of-AMD result I was expecting. It's a little faster in games at lower resolutions, but if you have one of these CPUs and a GPU to match, I doubt you'll be gaming at 720p...

Wow...   Power and temps, jesus......


----------



## W1zzard (Oct 20, 2022)

Solid State Brain said:


> Yes, the simplest way is tweaking the DC Loadline until VID (a calculated value) and vcore (actually measured CPU voltage) match as close as possible in a multithreaded workload, which can take some time. If not for _more or less_ correct CPU chip-only power measurements, at least this would ensure that power limits are actually working as intended.


That gives you a match for voltage, and only if you measure vcore with a DMM. What about current, so you can do P = U * I?


----------



## Pumper (Oct 20, 2022)

Something is not right with the Cinebench results. All other reviews show 38k~40k. Wonder how affected the other tests are by whatever is causing it in your setup.


----------



## Toss (Oct 20, 2022)

HD64G said:


> Efficiency scaling 13900K vs 7950X
> 
> View attachment 266328


125 W is the sweet spot


----------



## fevgatos (Oct 20, 2022)

Pumper said:


> Something is not right with the Cinebench results. All other reviews show 38k~40k. Wonder how affected the other tests are by whatever is causing it in your setup.


Yeah, was the same with the 12900k, their power limited numbers were waaay off.


----------



## HTC (Oct 20, 2022)

I wonder: will those who slammed AMD and their new generation *due to high temps* now purchase this CPU because of its performance, with EVEN HIGHER temps?

Dunno about the other CPUs of this family: will they too have such temps?


----------



## oxrufiioxo (Oct 20, 2022)

HTC said:


> I wonder: will those that slammed AMD and their new generation *due to high temps* now purchase this CPU because of it's performance, with EVEN HIGHER temps?
> 
> Dunno about the other CPUs of this family: will they too have such temps?



Pretty sure the 13600k will be just fine and maybe the 13700k. This cpu is pushed beyond reason to compete with the 7950X.


----------



## W1zzard (Oct 20, 2022)

Pumper said:


> Something is not right with the Cinebench results. All other reviews show 38k~40k. Wonder how affected the other tests are by whatever is causing it in your setup.


I noticed that too .. I get 40,000 if I disable power limit, i.e. enable ASUS MCE. That's not "stock" though


----------



## fevgatos (Oct 20, 2022)

W1zzard said:


> I noticed that too .. I get 40,000 if I disable power limit, i.e. enable ASUS MCE. That's not "stock" though


Everyone else scores 38 to 39k at 253 W stock though.


----------



## W1zzard (Oct 20, 2022)

fevgatos said:


> Everyone else scores 38 to 39k at 253 W stock though.


Is that with Adaptive Boost on or off ? On Windows 11 with VBS?


----------



## Solid State Brain (Oct 20, 2022)

W1zzard said:


> That gives you a match for voltage, and only if you measure vcore with a DMM. What about current? So you can do U*I = P



There's no straightforward way that I am aware of for making sure that the internal current value used by the CPU for calculating power ("Package Power") is truly accurate. Since among other things it is used for internal CPU limits (e.g. IccMax / peak current limit), it could be assumed to be factory-calibrated. When current readings are present from motherboard sensors (e.g. "VR IOUT"), they are usually from the digital VRM controller on the motherboard, not the CPU itself.

Usually just making sure that the DC Loadline has been correctly configured is enough for most practical purposes. In principle, the DC loadline is supposed to be configured to the impedance/slope of the VRM loadline so that the CPU knows how much input voltage will drop with input current. It will vary with the configured LLC setting and occasionally CPU type/class.
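As a rough illustration of that relationship, here is a minimal sketch of how a droop-corrected vcore and a P = U * I package power figure could be computed. The VID, current, and loadline values below are invented examples for illustration only, not measurements from any actual board:

```python
# Sketch of how a CPU estimates its own voltage/power via the DC loadline.
# All numbers here are hypothetical examples, not measured values.

def expected_vcore(vid: float, current_a: float, dc_loadline_mohm: float) -> float:
    """Voltage expected at the die: VID minus the loadline droop (I * R)."""
    return vid - current_a * (dc_loadline_mohm / 1000.0)

def package_power(vcore: float, current_a: float) -> float:
    """P = U * I, using the droop-corrected voltage."""
    return vcore * current_a

vid = 1.30       # volts, requested by the CPU
current = 200.0  # amps, heavy multithreaded load
dc_ll = 0.5      # milliohms, should match the VRM loadline / LLC setting

v = expected_vcore(vid, current, dc_ll)  # 1.30 - 200 * 0.0005 = 1.20 V
p = package_power(v, current)            # 1.20 * 200 = 240 W
print(f"vcore ~ {v:.3f} V, package power ~ {p:.1f} W")

# If the configured DC loadline doesn't match the real VRM impedance,
# the CPU computes power from the wrong voltage and the reading drifts:
wrong = package_power(expected_vcore(vid, current, 0.0), current)
print(f"with DC LL = 0: reported ~ {wrong:.1f} W ({wrong - p:.0f} W too high)")
```

With a mismatched loadline, the same 20 W class of error also skews where the PL1/PL2 limits actually kick in, which is why the calibration matters for power-limited benchmark runs.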


----------



## W1zzard (Oct 20, 2022)

Solid State Brain said:


> is enough for most practical purposes


For actual usage, definitely .. even for a review where a few percent difference in measurement can be make or break? Not sure


----------



## OfficerTux (Oct 20, 2022)

I'll wait for the 5800X3D + RTX 4090 re-bench. If that also shows the 5800X3D within striking distance, I will just upgrade my AM4 system for the third time, get a mid- to high-range graphics card from either team green or red as soon as those are out, and be fine for the next few years. Doesn't look like a new motherboard + DDR5 RAM is worth it at the moment.


----------



## HTC (Oct 20, 2022)

Question for @W1zzard 

For Zen 4 CPUs, AMD claims the highest temp these CPUs can reach before the PC shuts down is 115°C, which is why 95°C is a safe temp.

This CPU went as high as 117°C: how high does Intel claim it can go before shutting down due to temps?


----------



## W1zzard (Oct 20, 2022)

HTC said:


> This CPU went as high as 117°C: how high does Intel claim it can go before shutting down due to temps?


It throttles at 115°C; the extra 2°C comes from normalizing delta T to a 25°C room temperature


----------



## fevgatos (Oct 20, 2022)

W1zzard said:


> Is that with Adaptive Boost on or off ? On Windows 11 with VBS?


Guru3D uses Windows 11. Don't know about adaptive boost, but it shouldn't make a difference if they're scoring 38-39k @ 253 W.


----------



## HTC (Oct 20, 2022)

W1zzard said:


> it's throttled at 115° + 2°C from normalizing from delta T to 25°C room temp



In that case, the Intel CPU has "an unfair advantage" since it can boost to its absolute max temp, while the same can't be done with the 7950X AFAIK, due to the max temp being 95°C in the BIOS: *as I understand it*, one can lower the max temp in the BIOS *but NOT raise it*, though they might change this in future BIOS updates.


----------



## TheinsanegamerN (Oct 20, 2022)

HTC said:


> In that case, the Intel CPU has "an unfair advantage" since it can boost to its absolute max temp, while the same can't be done with the 7950X AFAIK, due to the max temp being 95°C in the BIOS: *as I understand it*, one can lower the max temp in the BIOS *but NOT raise it*, though they might change this in future BIOS updates.


Doubt it's sustainable, techspot had it throttle at 90c


----------



## Xuper (Oct 20, 2022)

so 95°C is normal on Intel? Just like Zen 4?


----------



## HTC (Oct 20, 2022)

TheinsanegamerN said:


> Doubt it's sustainable, techspot had it throttle at 90c



Strange: TPU did a CPU cooler experiment on the 7950X using an AIO, the cooler from this review at different fan speeds, and a stock Wraith cooler also at various fan speeds, and when they reached 95°C (mostly in rendering) they all throttled WHILE KEEPING 95°C.


----------



## TheinsanegamerN (Oct 20, 2022)

HTC said:


> Strange: TPU made a CPU cooler experiment on the 7950X using an AIO, the cooler of this review @ different fan speeds and a spiral wraith cooler also @ various fan speeds and, when they reached 95º (mostly in rendering) they all throttled WHILE KEEPING 95º.


The 13900k, not the 7950x


----------



## HTC (Oct 20, 2022)

TheinsanegamerN said:


> The 13900k, not the 7950x



Ahhhh: i misunderstood.


----------



## Solaris17 (Oct 20, 2022)

I almost honestly feel like, given the backlash, this trend isn't going to continue but will instead finally pop. We have yet to see AMD's new GPUs, but I almost feel like this is Netburst and Fermi all over again, and with the next platform jump we will start to see reductions in temps.

I personally think this will come in the form of IPC and architecture gains, not physical nodes. It's my belief anyway that the voltage reduction from node shrinking has passed the point of diminishing returns and we are simply too thermally dense now.

To correct this, imo, they will increase overall die size to spread this out, and/or they will (probably) work on E-core efficiency or increase their workload ability so as to keep the P-cores more inactive.

Just conjecture of course, I haven't finished my first coffee and I'm about to be late for a meeting, so I'm shooting in the dark tbh.


----------



## YahyaMegahed (Oct 20, 2022)

Amazing performance, no doubt, but at a huge power draw and very high temperatures.
I think all AMD needs to do is cut their CPU prices to competitive points, and for users to enable Eco mode in the motherboard UEFI.


----------



## dgianstefani (Oct 20, 2022)

Der8auer posting interesting numbers.


----------



## spnidel (Oct 20, 2022)

Xuper said:


> so 95°C is normal on Intel? Just like Zen 4?


no, for AMD 95c is FAILMD
on intel 117c is wintel


----------



## R0H1T (Oct 20, 2022)

Solaris17 said:


> I almost honestly feel like given the backlash, this trend isnt going to continue but instead finally pop. We have yet to see AMDs new GPUs but I almost feel like this is netburst and fermi all over again and the next platform jump we will start to see reductions in temps.
> 
> I personally think this will come in the form of IPC and arc gains, not physical nodes. Its my belief anyway that the voltage reduction from node shrinking has past the point of diminishing returns and we are simply too thermally dense now.
> 
> ...


Probably for Intel, but not AMD; that's because chiplets (or tiles) will necessitate lower clocks & probably result in lower temps for Intel, at least till they master the tech & get close to or above AMD in overall clock speeds. Once AMD takes the IPC crown, likely with Zen 5, they'll probably lower the insane power draw at the top end as well, but then there's physics ~ all of the major chipmakers will have to lower clocks going forward. Aside from IPC & more cores, there's no way they can sell these chips to the retail (non-extreme) DIY market.



dgianstefani said:


> Der8auer posting interesting numbers.
> 
> View attachment 266353


7950x is slightly more efficient at "65W" eco mode.


----------



## rv8000 (Oct 20, 2022)

dgianstefani said:


> Der8auer posting interesting numbers.
> 
> View attachment 266353



What's interesting about this? Every review site that's tested the game has 13th gen leading in Far Cry 6.


----------



## dgianstefani (Oct 20, 2022)

rv8000 said:


> What’s interesting about this? Every review site has 13 gen leading in far cry 6 that’s tested the game.


Doesn't matter the game, point is, in gaming, the 13900K even at stock uses similar amounts of power to the 7950X while giving much better performance.





The only issue I see with 13th gen is that mobo manufacturers remove power limits, and the resulting tune is way outside the efficiency curve, but then you could say the same about AMD. 

Per core OC, with power limits enforced seems to be the ideal way to tune these chips.


----------



## rv8000 (Oct 20, 2022)

dgianstefani said:


> Doesn't matter the game, point is, in gaming, the 13900K even at stock uses similar amounts of power to the 7950X while giving much better performance.
> 
> View attachment 266354
> 
> ...



Again, every review has pointed this out? 13900k leads, 5800x3d close by, 7000 series a few more percent off.

MT AMD leads in efficiency, ST AMD/Intel are relatively close in efficiency.

Pretty much the story in every review I’ve read.


----------



## Dave65 (Oct 20, 2022)

I'll keep my lil old 5900X and 6800 XT... I just upgraded my furnace and don't need another.


----------



## dgianstefani (Oct 20, 2022)

rv8000 said:


> Again, every review has pointed this out? 13900k leads, 5800x3d close by, 7000 series a few more percent off.
> 
> MT AMD leads in efficiency, ST AMD/Intel are relatively close in efficiency.
> 
> Pretty much the story in every review I’ve read.


And yet there's five pages of posters whining about power draw? And since when is gaming a purely ST load?


----------



## leha12345 (Oct 20, 2022)

Would be interesting to see if my MSI Z690 Tomahawk DDR motherboard can cope with a 13900K, even at stock.

Currently got a 12600KF that maxes out at 199 W with no issues; should it be OK with a mild OC on the new i9, say at 300 watts? 

Looking to buy one in 1-2 years' time when prices drop (potentially on the used market)


----------



## TheoneandonlyMrK (Oct 20, 2022)

dgianstefani said:


> And yet there's five pages of posters whining about power draw? And since when is gaming a purely ST load?


Since when was gaming the most important or only load? Yet it's all you're going on about.
My PC does 95% work and at best 5% gaming.
Commercial and enterprise PC use eclipses gaming use.


----------



## Vayra86 (Oct 20, 2022)

So basically, if you want peak performance, your man cave is a furnace and you still only gained what, 10% over any other sane CPU?

Utterly pointless product. Efficiency is nice, but its limits for sure aren't; then again, it's a K model, so whatever. But still, the writing's on the wall, as it has been: to get more perf, you need more power, so basically that's all the development on big.LITTLE, end of story.

It's clear though that 125 W is a complete joke.


----------



## dgianstefani (Oct 20, 2022)

TheoneandonlyMrK said:


> Since when was gaming the most important or only load, yet it is all your going on about.
> My PC do 95% work and at best 5% gaming.
> Commercial and enterprise pc use eclipses gaming use.


So use the Intel power limits that motherboard manufacturers disable by default???? Profit.



Vayra86 said:


> So basically, if you want peak performance, your man cave is a furnace and you still only gained what, 10% over any other sane CPU.
> 
> Utterly pointless product. Efficiency is nice, but its limits for sure aren't, then again, its a K model, so whatever. But still. Writing's on the wall, as it has been: to get more perf, you need more power, so basically that's all the development on big little end of story.


Are you unable to change a single setting in the BIOS? Didn't realise the enthusiast community was forced to run everything at stock?


----------



## clopezi (Oct 20, 2022)

Vayra86 said:


> So basically, if you want peak performance, your man cave is a furnace and you still only gained what, 10% over any other sane CPU.
> 
> Utterly pointless product. Efficiency is nice, but its limits for sure aren't.


It depends on the main use. I believe this CPU isn't for gaming only, but for productivity and gaming. For gaming only, there are better options.

For example, I have a 9900K and I'm switching to a 13900K; in gaming the improvement is minimal, but in productivity... big improvements.


----------



## rv8000 (Oct 20, 2022)

dgianstefani said:


> And yet there's five pages of posters whining about power draw? And since when is gaming a purely ST load?



Because it uses more power than the 7950X (7000 series), and thus is also hotter. They’re facts, and anyone is free to express their opinion or concerns about power draw and temps.

You seem to be intentionally ignoring those facts to claim the opposite, as are other people here. Both Zen 4 and RPL are hot and power hungry in stock configs, RPL is worse in those regards. They’re all great CPUs performance wise, doesn’t excuse the fact or change the objective truth.


----------



## dgianstefani (Oct 20, 2022)

clopezi said:


> It depends on the main use. I believe this CPU isn't for gaming only, but for productivity and gaming. For gaming only, there are better options.
> 
> For example, I have a 9900K and I'm switching to a 13900K; in gaming the improvement is minimal, but in productivity... big improvements.


I mean, it's not minimal. You're going from around 13-1400 ST to more than 2200 ST. That's a huge improvement for gaming workloads.



rv8000 said:


> Because it uses more power than the 7950X (7000 series), and thus is also hotter. They’re facts, and anyone is free to express their opinion or concerns about power draw and temps.
> 
> You seem to be intentionally ignoring those facts to claim the opposite, as are other people here. Both Zen 4 and RPL are hot and power hungry in stock configs, RPL is worse in those regards. They’re all great CPUs performance wise, doesn’t excuse the fact or change the objective truth.


It uses the same power for much more performance in lightly threaded loads (therefore running cool), and uses more power for heavily threaded loads (while still being faster). This is the "objective truth".

And the same way that AMD has "eco mode" where the CPU is limited to 65 or 95 W, you can do this with Intel if you're so concerned with power draw or can't cool the CPU, without losing much performance (especially in lighter threaded loads), and generally still being faster than the previous generation.


----------



## ThrashZone (Oct 20, 2022)

Hi,
Gutsy using EOL MX-5


----------



## Why_Me (Oct 20, 2022)

Ayhamb99 said:


> Yikes that power consumption is too much, I do not understand why AMD and Intel think using brute force to overtake in performance while suffering high temperatures and power usage is a good idea . I'm now curious to see how the 13700k will perform and hope it's not as inefficient as this monstrosity.


Intel's locked CPUs are going to be a hit.  Proof is in this vid.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> Are you unable to change a single setting in the BIOS? Didn't realise the enthusiast community was forced to run everything at stock?


What? That isn't the point. The point is how it gets its spot as top dog in Intel's stack: by adding ridiculous amounts of wattage.

Every half-wit understands you can tweak a K CPU. I even pointed it out in my post. Why is it the way it is, out of the box? To provide the illusion of having a purpose at the top, at a ridiculous price point. Let's just be real about it, man. This isn't personal, even if you bought one or want to buy a whole box. It's about the company, the product, and the marketing. It is actually possible (I know this might be surprising to some, evident from your post) to feel a certain way about a product and then apply different considerations when it's about your own purchase/usage decisions. It's a stance called realism. Try it.

I feel the exact same way wrt AMD's latest temp target. Utterly ridiculous nonsense. Oh, I can undervolt it to work better? Fantastic, but why isn't it like that to begin with? Or does the definition of better simply no longer match what the product presents to us? At least AMD couples it with real efficiency gains gen to gen, still, but I wonder if that lasts.

The reality is, this is how we reap the fruits of a new gen now. We add some silicon/die size to make bigger what got smaller, and we yank even more volts through it. Such progress. There is also a reality where efficiency is key, because realistically nobody needs this performance in the consumer segment, and we could do with lower global power usage. A very unlikely trend, I know. But still, that, to me, seems more like progress in CPUs now that we're reaching a pinnacle in performance and an end to shrinks.

GPU is much the same way. There is a good reason I'm still on this 1080. And it's not because there's no money to upgrade. I literally don't see why I should, even after a monitor upgrade.


----------



## TheinsanegamerN (Oct 20, 2022)

Solaris17 said:


> I almost honestly feel like given the backlash, this trend isnt going to continue but instead finally pop. We have yet to see AMDs new GPUs but I almost feel like this is netburst and fermi all over again and the next platform jump we will start to see reductions in temps.
> 
> I personally think this will come in the form of IPC and arc gains, not physical nodes. Its my belief anyway that the voltage reduction from node shrinking has past the point of diminishing returns and we are simply too thermally dense now.
> 
> ...


I don't think it will be that complicated. Both the Ryzen 7000 series and the Nvidia 4090 have shown that a tiny bit of underclocking makes a world of difference. 

The 7950X loses a whopping 1-2 percent performance, worst case like 5, in exchange for limiting power from 170 W to 65 W. 

All these companies have to do is not clock things to the friggin' moon. One bin back, that's it, and you save hugely on power consumption.
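The trade-off being described here is easy to quantify as points-per-watt. A minimal sketch; the power limits and scores below are made-up placeholders to illustrate the math, not results from this or any other review:

```python
# Efficiency (score per watt) at different power limits.
# Scores are hypothetical placeholders, not actual benchmark results.

runs = {
    # power limit (W): multithreaded score
    253: 38000,
    125: 32000,
    65:  22000,
}

for watts, score in sorted(runs.items()):
    print(f"{watts:>3} W -> {score} pts, {score / watts:.0f} pts/W")

# Typical pattern: the last ~10-20% of performance costs a
# disproportionate share of the power budget, so efficiency
# (pts/W) rises as the limit is lowered.
```

With numbers shaped like these, the lowest power limit is by far the most efficient even though its absolute score is the worst, which is the whole argument for backing off one bin.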


----------



## dalekdukesboy (Oct 20, 2022)

So... What kind of temps did you get with the aio cooler or did I miss it?


----------



## rv8000 (Oct 20, 2022)

dgianstefani said:


> I mean, it's not minimal. You're going from around 13-1400 ST to more than 2200 ST. That's a huge improvement for gaming workloads.
> 
> 
> It uses the same power for much more performance in lightly threaded loads (therefore running cool), and uses more power for heavily threaded loads (while still being faster). This is the "objective truth".
> ...



The objective truth is your reading comprehension is lacking. Go take another look at the power consumption summary page in this review.

The only explicit data point that shows RPL, specifically the 13900K, using less power in a ST load, is a graph on MP3 encoding.

Otherwise across 12 games, the 13900k on average uses 31w more than the 7950X. And across 45 recorded applications it uses 45w more than the 7950X.

You are objectively wrong. Cherry-picking one or two results from other reviews doesn't change the average across the board, or the summation of all reviews. The 13900K is hotter, uses more power, has a lead in gaming performance, and trades blows across MT apps, *while using more power on average* (I need to restate this as I feel it's needed).


----------



## dgianstefani (Oct 20, 2022)

Vayra86 said:


> What? That isn't the point. The point is how it gets its spot as a top dog in Intel's stack: by adding ridiculous amounts of wattage.
> 
> Every half wit understands you can tweak a K CPU. I even pointed it out in my post. Why is it the way it is, out of the box? To provide the illusion of having a purpose at the top at a retarded price point. Let's just be real about it man. This isn't personal even if you bought one or want to buy a whole box. Its about the company, the product, and the marketing. It is actually possible, - I know this might be surprising to some, evident from your post - to feel a certain way about a product and then apply different considerations when its about your own purchase/usage decisions. Its a stance called realism. Try it.
> 
> ...


If you don't like the K series, don't plan on tuning (why buy a K then), then get the non K versions, which will have very similar boost clocks, but more reasonable power targets.



rv8000 said:


> The objective truth is your reading comprehension is lacking. Go take another look at the power consumption summary page in this review.
> 
> The only explicit data point that shows RPL, specifically the 13900K, using less power in a ST load, is a graph on MP3 encoding.
> 
> ...


Do you understand the concept of performance per watt? (here come the MT synthetic load comparisons...)

Using more power is irrelevant if it also performs faster?

Have another look at the graphs, from the review, and from Der8auer, who I consider to be the foremost expert of tuning. Then maybe think about criticising the reading comprehension of the person who proofed the article you're quoting.



dgianstefani said:


> It uses the same power for much more performance in lightly threaded loads (therefore running cool), and uses more power for heavily threaded loads (while still being faster). This is the "objective truth".
> 
> And the same way that AMD has "eco mode" where the CPU is limited to 65 or 95 W, you can do this with Intel if you're so concerned with power draw or can't cool the CPU, without losing much performance (especially in lighter threaded loads), and generally still being faster than the previous generation.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> If you don't like the K series, don't plan on tuning (why buy a K then), then get the non K versions, which will have very similar boost clocks, but more reasonable power targets.
> 
> 
> Do you understand the concept of performance per watt? (here come the MT synthetic load comparisons...)
> ...


It's still strange that this K CPU is marketed as a 125 W CPU, which it really isn't.

Also, the perf/watt isn't better, it's worse; objectively worse at MT, and in ST-focused gaming the 5800X3D is still better, as is even a 7600X

And note: this is not architecture. It's pure and plain stock power/clock settings. If you go above an Intel xx600K, you're in silly land in that regard.


----------



## clopezi (Oct 20, 2022)

I feel so bad about myself, but I saw 32 GB of DDR5-6400 at 325€, so I cancelled my economical TUF Z790-Plus D4 and switched to a ROG Z790-F... I hope it's worth it


----------



## dgianstefani (Oct 20, 2022)

Vayra86 said:


> Its still strange this K CPU is marketed as a 125W CPU, which it really isn't.
> 
> Also, the perf/watt isn't better, it's worse; objectively worse at MT; and at ST-focused gaming, the 5800X3D is still better, as is even a 7600X
> 
> View attachment 266368


Now play an actual game, and divide the power draw by the FPS, then show me what you get.










Or better yet, let the best technical youtuber do it for you.
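That per-game calculation (power draw divided by FPS, or its inverse) can be sketched as follows; the FPS and wattage figures are hypothetical placeholders, not der8auer's or TPU's numbers:

```python
# Gaming efficiency as FPS per watt of CPU power.
# All figures are invented placeholders to show the calculation only.

cpus = {
    # name: (average FPS, average CPU power in W)
    "Hypothetical CPU A": (180.0, 90.0),
    "Hypothetical CPU B": (160.0, 100.0),
}

for name, (fps, watts) in cpus.items():
    fps_per_watt = fps / watts          # higher is more efficient
    mj_per_frame = watts / fps * 1000.0  # W / (frames/s) = J/frame, in mJ
    print(f"{name}: {fps_per_watt:.2f} FPS/W ({mj_per_frame:.1f} mJ per frame)")
```

The point of normalizing this way is that a chip drawing more watts can still be the more efficient gamer if its FPS lead is proportionally larger.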


----------



## Shatun_Bear (Oct 20, 2022)

One of the most inefficient gaming CPUs TPU has tested.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> Now play an actual game, and divide the power draw by the FPS, then show me what you get.
> 
> 
> 
> ...


See that's where our approaches of looking at this depart.

Peak FPS at the best possible GPU in a CPU limited situation. == Theoretical
vs
Useful FPS for the games you actually play in practice on your GPU. == Practical

The peak FPS is obtained by grossly exceeding what you would get in a typical system.
The energy efficiency tests are taken on a much more reasonable RTX3080. So when there isn't a fight for peak clocks/perf, Intel's xx900k falls apart in every way.


----------



## dgianstefani (Oct 20, 2022)

Shatun_Bear said:


> One of the most inefficient gaming CPU TPU has tested.


Wait for the 4090 tests.



Vayra86 said:


> See that's where our approaches of looking at this depart.
> 
> Peak FPS at the best possible GPU in a CPU limited situation. == Theoretical
> vs
> Useful FPS for the games you actually play in practice on your GPU. == Practical


Why would you buy a $600 13th gen, 24 core CPU to pair it with a budget GPU to game or work on?

It's almost like testing a new CPU generation in a CPU limited situation is the ideal situation?

There's nothing theoretical about der8auer's tests. They're tests, end of story.

The 4090 is simply the first GPU from the new generations to be tested, lower end ones that still perform better than a 3080 are certainly coming.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> Wait for the 4090 tests.
> 
> 
> Why would you buy a $600 13th gen, 24 core CPU to pair it with a budget GPU to game or work on?
> ...


A 3080 isn't a budget GPU.
Also, let's appreciate the fact that the test is already done at 1080p on that GPU, which it eats for breakfast; the practical translation being: you really won't need even more CPU-based FPS in your gaming.


----------



## TheinsanegamerN (Oct 20, 2022)

dgianstefani said:


> Wait for the 4090 tests.
> 
> 
> Why would you buy a $600 13th gen, 24 core CPU to pair it with a budget GPU to game or work on?
> ...


Bruh just take the L man, you've completely lost sight of the argument and the tests from your OWN WEBSITE say you're wrong. A 4090 wont change that.


----------



## dgianstefani (Oct 20, 2022)

Vayra86 said:


> A 3080 isn't a budget GPU.


It's a previous-generation GPU that isn't fast enough to put the CPU in a CPU-limited situation for testing. Hence why we're going to be testing with the 4090.



TheinsanegamerN said:


> Bruh just take the L man, you've completely lost sight of the argument and the tests from your OWN WEBSITE say you're wrong. A 4090 wont change that.


What L? The same people who complain about the 4090's 450 W draw, which can go up to 600 W, also pretend not to notice that it's putting out twice the performance of previous-generation cards.
Who cares if the 13900K can draw 300 W? OK? So?
The chip is still faster than any other CPU in gaming when limited to its PL1 of 125 W.


----------



## Lightofhonor (Oct 20, 2022)

dgianstefani said:


> Wait for the 4090 tests.
> 
> 
> Why would you buy a $600 13th gen, 24 core CPU to pair it with a budget GPU to game or work on?
> ...


If your ultimate goal is gaming efficiency, then you'd also need to compare the 7950X with only one CCD, (I would guess) disable Intel's E-cores, and then also undervolt/limit both to their most efficient mode. 

But just because you buy a $600 CPU doesn't mean you will buy a $1500 GPU. The opposite is true (why saddle a fast GPU with a slow CPU?), but there are lots of reasons to max out the CPU and not the GPU, specifically all the workstation tasks this very site tests.


----------



## Count Shagula (Oct 20, 2022)

Was going to upgrade to one of these from my 5800X3D because I'm a 360 Hz 1080p gamer and want the absolute highest fps, but fuck me. Those temps and power usage. Hard pass now


----------



## TheoneandonlyMrK (Oct 20, 2022)

dgianstefani said:


> So use the Intel power limits that motherboard manufacturers disable by default???? Profit.
> 
> 
> Are you unable to change a single setting in the BIOS? Didn't realise the enthusiast community was forced to run everything at stock?


Your evasive bs answer is disingenuous.

You know full well I already do what I can to get the most performance for the least power, as most enthusiasts here do.

That's not where my problem lies with power-guzzling CPUs; technically, I agree with your point regarding power not being important if it gets the job done.

My issue lies with the average Joe buying this because Intel cranked all the dials to be all "Intel number one" about it, but using an effing sledgehammer to do it; their algorithm is a bit shit IMHO. These could and should have been way more efficient in way more use cases; it's like they don't know how to power gate, or when, again IMHO.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> It's a previous generation GPU that isn't fast enough to put the CPU in a CPU limiting situation for testing. Hence why we're going to be testing with the 4090.


Hey I get that. This isn't a question of who's right or wrong, both ways of looking at it are valid. It depends on your use case. I think that detail is a fine thing to recognize. That is all.

The reality is, if you're not coupling this CPU with a 4090, you're doing it wrong in the gaming section. And, on top of that, you aren't concerned with perf/watt at all. Even a 4090 won't get you CPU-constrained gaming, realistically; I mean... it can't even hit 60 FPS in Cyberpunk.


----------



## dgianstefani (Oct 20, 2022)

TheoneandonlyMrK said:


> Your evasive bs answer is disingenuous.
> 
> You know full well I already do what I can to get the most performance for the least power, as most enthusiasts here do.
> 
> ...


The average Joe shouldn't be buying a K series CPU. The i9 13900 (non K), and other i5/7/9 chips will be an excellent choice for them, and will be what is used in most OEM systems that aren't marketed towards gamers. The average Joe isn't going to be building a PC either. The people who do, know how to tune or ask advice, for the most part.

I agree with your statement about pushing chips past their efficiency curves, but everyone, Intel, AMD, NVIDIA etc., is doing this. For one player to go the efficient route when the rest are going the performance route is bad marketing.

There's nothing BS about anything I've said.



Lightofhonor said:


> If your ultimate goal is gaming efficiency then you'd also need to compare 7950X with only 1 CCD, (I would guess) disabling Intel E-cores, and then also undervolting/limiting both to their most efficient mode.
> 
> But just because you buy a $600 CPU doesn't mean you will buy a $1500 GPU. The opposite is true (why saddle fast GPU and slow CPU) but there are lots of reasons to max out the CPU and not GPU, specifically all the workstation tasks this very site tests.


Personally I think if you're the use case that can fully utilize and profit from a 24 core CPU, you probably also have the budget for a Quadro equivalent, or see the 24 GB frame buffer of the xx90 series cards as being worth whatever NVIDIA chooses to sell them for.

I think the main issue is that everyone is evaluating these chips from their own budgets and needs. If you're not an HFR gamer or a person who actually makes money from the processing power of their computer, this chip isn't for you.


----------



## R0H1T (Oct 20, 2022)

Vayra86 said:


> So basically, if you want peak performance,* your man cave is a furnace* and you still only gained what, *10% over any other sane CPU*.


Hey, I hear Winter is coming (we'll ignore the last 2 seasons of GoT for now), so this is a nice investment in a relatively cheap heater? Though I guess your *electricity bills* will probably bankrupt you as well

No, I'm pretty sure someone said 40% more at similar wattage.


dgianstefani said:


> *Who cares if the 13900K can draw 300 W*? OK? So?


I'd like to go full Greenpeace on you, but I'll just add that *any sane person would*; those who don't care for the environment don't deserve respect, *IMO*!


----------



## Richards (Oct 20, 2022)

Solid State Brain said:


> Other reviewers are seeing higher results in CB23 with power limits removed.
> 
> View attachment 266304
> 
> ...


Something's fishy here.


----------



## InVasMani (Oct 20, 2022)

dgianstefani said:


> Now play an actual game, and divide the power draw by the FPS, then show me what you get.
> 
> 
> 
> ...



Very much what I thought would be the case. Intel chased benchmarks a bit too aggressively, and at stock the balance isn't the best, but with the right power limit and undervolt, plus plenty of manual optimization, it should be really good. At stock, and especially with unlimited power draw, it's crazy. Intel should have tried harder to balance out the sweet spot. I'd still say it's good in the right hands, for someone willing to optimize it. Will it be enough when X3D arrives? By the same comparison, probably not, but until then it's the best chip you can get, just not the best value for the dollar perhaps. The 13600K with this same approach makes for a compelling chip.


----------



## TheinsanegamerN (Oct 20, 2022)

dgianstefani said:


> It's a previous generation GPU that isn't fast enough to put the CPU in a CPU limiting situation for testing. Hence why we're going to be testing with the 4090.
> 
> 
> What L? The same people who complain about the 4090 450 W draw, that can go up to 600 W also pretend to not notice that it's also putting out twice the performance of previous generation cards.


You're claiming the 13900k is more efficient. Graphs from your own site prove you wrong, and your response is "well, wait for the next GPU" instead of simply admitting that you were wrong.


dgianstefani said:


> Who cares if the 13900K can draw 300 W? OK? So?


Who cares? How about anyone who has to cool these monsters? There's plenty of legitimate criticism of that level of power draw in this very comment section. Being "the fastest" doesn't negate these concerns.


dgianstefani said:


> The chip is still faster than any other CPU in gaming when limited to its PL1 of 125 W.


According to a single youtube source. I'd honestly expect better from TPU's proofreader. 


dgianstefani said:


> There's nothing BS about anything I've said.


You're being disingenuous, petty, dismissive, and combative with your readers. That is, quite frankly, the level of bullshit I'd expect from WCCFtech or Reddit, not techpowerup. 

Also, you claiming that the 13900k has better perf/watt when your own site says otherwise smells pretty strongly of bullcrap. Again, read your own site, Mr. proofreader.


----------



## dgianstefani (Oct 20, 2022)

R0H1T said:


> Hey I hear Winter is coming, we'll ignore the last 2 seasons of GoT for now, so this is a nice investment in a relatively cheap heater? Though I guess your electricity bills will probably bankrupt you as well.
> 
> No, I'm pretty sure someone said 40% more at similar wattage.
> 
> I'd like to go full Greenpeace on you but I'll just add *any sane person would*, those who don't care for the environment don't deserve respect *IMO*!


So stop driving and buying foodstuffs that have been shipped across the world using fuel oil burning container ships, forget about fast fashion, ask your governments why they've not been building nuclear power plants, or just flip a setting in the BIOS that enables PL1/2 limits to actually be enforced. Or better yet, buy a non-K CPU that will be almost as fast in almost every situation, while not having that last 5% of performance for a silly power budget. You guys need to evaluate this product in the context of its target demographic.


----------



## Vayra86 (Oct 20, 2022)

TheinsanegamerN said:


> You're claiming the 13900k is more efficient. Graphs from your own site prove you to be wrong, and your response is "well wait for the next GPU" instead of simply admitting that you were wrong.
> 
> Who cares? How about anyone who has to cool these monsters? There's plenty of legitimate criticism of that level power draw in this very comment section. Being "the fastest" doesnt negate these concerns.
> 
> ...


Exactly.

Nuance. That is what was missing. I mean, yes I can read charts, and omg its fastuhhrr but what's underneath?


----------



## dgianstefani (Oct 20, 2022)

TheinsanegamerN said:


> You're claiming the 13900k is more efficient. Graphs from your own site prove you to be wrong, and your response is "well wait for the next GPU" instead of simply admitting that you were wrong.
> 
> Who cares? How about anyone who has to cool these monsters? There's plenty of legitimate criticism of that level power draw in this very comment section. Being "the fastest" doesnt negate these concerns.
> 
> ...


So you're questioning the expertise of der8auer? He's a qualified engineer who advises motherboard, GPU and CPU manufacturers on products. What are your qualifications?

People who value "the fastest", which this is, have the ability to cool these monsters.

I'm challenging your assertions with evidence.

You are doing the same.

This is neither petty nor disingenuous.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> So stop driving and buying foodstuffs that have been shipped across the world using fuel oil burning container ships, forget about fast fashion, ask your governments why they've not been building nuclear power plants, or just flip a setting in the BIOS that enables PL1/2 limits to actually be enforced. Or better yet, buy a non-K CPU that will be almost as fast in almost every situation, while not having that last 5% of performance for a silly power budget. You guys need to evaluate this product in the context of its target demographic.


Nope, we can evaluate this product in the context of the time and the world it's being released in, too. In fact, companies would be wise to look at that context, and there are good examples of companies and new businesses that thrive because of doing so. The fact that chip progress hasn't quite done so yet is a worrying one.

Tech enthusiast does not equal 'I buy the top end part every time', sir.

I'm on a tech site, so why would I talk about oil or fashion here?


----------



## dgianstefani (Oct 20, 2022)

Vayra86 said:


> Hey I get that. This isn't a question of who's right or wrong, both ways of looking at it are valid. It depends on your use case. I think that detail is a fine thing to recognize. That is all.
> 
> The reality is, if you're not coupling this CPU with a 4090, you're doing it wrong in the gaming section. And, on top of that, you aren't concerned with perf/watt at all. Even a 4090 won't get you CPU constrained gaming, realistically; I mean... it can't even hit 60 FPS in Cyberpunk





Vayra86 said:


> Nope, we can evaluate this product in the context of the time and the world it's being released in, too. In fact, companies would do wise to look at that context, and there are good examples of companies and new businesses that thrive because of doing so. The fact chip progress hasn't quite yet, is a fact. A worrying one.
> 
> Tech enthusiast does not equal 'I buy the top end part every time', sir.
> 
> I'm on a tech site, so why would I talk about oil or fashion here?


That's your right. As is my right to evaluate it in the context I see fit.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> That's your right. As is my right to evaluate it in the context I see fit.


You're still part of a community, buddy. This was feedback.

For a staff member, a more open stance could be expected. We (or at least I) aren't jabbing at you; it's about the content we're discussing and what might make it more valuable. Insights, you know. I think that gaming energy-efficiency result is a very interesting thing to highlight, because it points out the limited necessity of going this high up the Intel stack for 'better gaming'. Seeing as we are speaking of a gaming champ, could less actually be more?


----------



## Shatun_Bear (Oct 20, 2022)

fevgatos said:


> I'm not an apologist; the results in MT are underwhelming. *In gaming ofc it literally dumps on everything out there.*



What, less than 5% at 1080p for significantly higher power draw and temperatures?! You're not playing games at this point, *you're playing yourself.*


----------



## R0H1T (Oct 20, 2022)

dgianstefani said:


> You guys need to evaluate this product in the context of its *target demographic*.


Which is who ~ brain-dead zombies?


dgianstefani said:


> So stop driving and buying foodstuffs that have been shipped across the world using fuel oil burning container ships


I don't do that, never bought anything to eat that was shipped from outside India. Yes the car may have some imported parts, dunno for sure.


dgianstefani said:


> forget about fast fashion


Don't do fast fashion, in fact I hate buying clothes. I still have 15 year old clothes in perfectly good condition that I can wear at home!


dgianstefani said:


> ask your governments why they've not been building nuclear power plants


That's a tad, or quite a lot, harder than you think.


dgianstefani said:


> just flip a setting in the BIOS that enables PL1/2 limits to actually be enforced.


Or better yet don't buy this freaking thing!


dgianstefani said:


> Or better yet, buy a non-K CPU that will be almost as fast in almost every situation, while not having that *last 5% of performance for a silly power budget*.


Like I said at the beginning ~ for brain-dead "morons" 

Just for clarification I will add that I do spend a disproportionate amount on phones ~ I buy a phone each year & sell the old phones in exchange. Though tbf that's accounting for the whole family, *I just keep the latest one* since I'm paying


----------



## Why_Me (Oct 20, 2022)

R0H1T said:


> Hey, I hear Winter is coming (we'll ignore the last 2 seasons of GoT for now), so this is a nice investment in a relatively cheap heater? Though I guess your *electricity bills* will probably bankrupt you as well
> 
> No, I'm pretty sure someone said 40% more at similar wattage.
> 
> I'd like to go full Greenpeace on you but I'll just add *any sane person would*, those who don't care for the environment don't deserve respect *IMO*!


I'm guessing you live in Europe, where electricity is scarce. Locked at 90 W, it still mopped the floor with AMD's new lineup in gaming. Intel's upcoming locked CPUs are going to be the dagger in AMD's heart.


----------



## Richards (Oct 20, 2022)

Need to stop testing with a 3080 when the industry tests with the best GPU in the world, the 4090. The gap will be bigger in favour of Intel.


----------



## dgianstefani (Oct 20, 2022)

Shatun_Bear said:


> What, less than 5% at 1080p for significantly higher power draw and temperatures?! You're not playing games at this point, *you're playing yourself.*


It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.

I expect that difference to increase when we retest with a 4090, which is exactly what the other evidence I linked (der8auer's 4090 tests) shows.
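Those two numbers translate into a quick gaming perf-per-watt comparison. A minimal sketch (the 10.5% and 31 W figures are from this post; the 7950X's absolute gaming draw is a hypothetical placeholder, so only the direction of the ratio is meaningful, not its magnitude):

```python
# Gaming efficiency ratio: 13900K vs 7950X.
fps_7950x = 100.0                   # normalize the 7950X's FPS to 100
fps_13900k = fps_7950x * 1.105      # "10.5% faster in games"
watts_7950x = 80.0                  # hypothetical baseline gaming draw (assumption)
watts_13900k = watts_7950x + 31.0   # "a whole 31 W more"

ratio = (fps_13900k / watts_13900k) / (fps_7950x / watts_7950x)
print(round(ratio, 3))  # < 1.0 means the 13900K delivers fewer FPS per watt
```

With that assumed baseline the ratio comes out below 1.0, i.e. more FPS but fewer FPS per watt; a higher assumed baseline draw would narrow the gap.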


----------



## TheinsanegamerN (Oct 20, 2022)

dgianstefani said:


> So you're questioning the expertise of der8auer?


You cannot trust a single data point, no matter who it is from. Science 101.


dgianstefani said:


> He's a qualified engineer who advises motherboard, GPU and CPU manufacturers on products. What are your qualifications?


Well, for starters I can read the graphs TPU puts out. If a single review is taken as fact, then why do you disagree with your own website?








dgianstefani said:


> I'm challenging your assertions with evidence.
> 
> You are doing the same.
> 
> This is neither petty nor disingenuous.


Handwaving proof that disagrees with your own, and insinuating that readers of your website are complaining simply to complain because they are "not the target market", is both petty and disingenuous. That's already been written up in comments in this thread. Look at the way W1zzard responds to criticism vs the way you do; it's rather eye-opening. The way you responded to me, claiming that I was questioning the expertise of a reviewer because I don't take one data point as holy text, and asking what my qualifications are, is incredibly inappropriate for a TPU staff member, and it reflects poorly on this site as a whole.


----------



## TheoneandonlyMrK (Oct 20, 2022)

dgianstefani said:


> The average Joe shouldn't be buying a K series CPU. The i9 13900 (non K), and other i5/7/9 chips will be an excellent choice for them, and will be what is used in most OEM systems that aren't marketed towards gamers. The average Joe isn't going to be building a PC either. The people who do, know how to tune or ask advice, for the most part.
> 
> I agree with your statement about pushing chips past their efficiency curves, but everyone, Intel, AMD, NVIDIA etc., is doing this. For one player to go the efficient route when the rest are going the performance route is bad marketing.
> 
> ...


I don't think it's for you or me to decide what people should buy, though I do agree with your last point.
I think they could have done better, though. It's a good chip, I'm clear on that; I just hoped they would have put a more intelligent workflow and resource-utilisation algorithm and system in play, in chip. AMD have machine-learning tech to optimise workflow, and it works quite well, if not perfectly. I hoped Intel would have done more than a trim and shrink, I suppose.


----------



## GreiverBlade (Oct 20, 2022)

Nice review. Good job Intel, nonetheless (bringing on the heat, pun intended /s also), but I am glad I have a worthy upgrade on AM4 in the form of the 5800X3D if it continues to drop in price and reaches sub-450 CHF (and it's quite close atm), since my res is 1620p.


----------



## dgianstefani (Oct 20, 2022)

TheinsanegamerN said:


> You cannot trust a single data point, no matter who it is from. Science 101.
> 
> Well, for starters I can read the graphs TPU puts out.
> 
> Handwaving proof that disagrees with your own, and insinuating that readers of your website are complaining simply to complain because they are "not the target market" is both petty and disingenuous. That's already been written up in comments in this thread. Look at the way W1zzard responds to criticism VS the way you do, it's rather eye opening. The way you responded to me, claiming that I was questioning the expertise of a reviewer because I dont take one data point as holy text and asking what my qualifications are is incredibly inappropriate for a TPU staff member, and it reflects poorly on this site as a whole.


You're the one questioning results from a very respected source; the burden of evidence is on you to provide proof as to why we should take that questioning seriously, hence my asking what your qualifications are. TPU results are great, and contextually provide really reliable datasets which are useful in comparing different hardware. The issue is that we're in the middle of several huge generational product releases; CPUs and GPUs are all being refreshed at practically the same time.

There's a good reason why the TPU graphs are somewhat skewed: we had to use a 3080. Why exactly do you think we're retesting with a 4090? For fun? It's incredibly difficult to put modern CPUs in a position where they are actually the limiting factor in a system, which makes testing hard and justifies the use of academic 720p testing. Just look at the 4K results; you can infer the same thing from the other end of the extreme: why bother getting anything above a 3700X when you'll only get a 4% performance increase for your money? Obviously the actual performance difference is significantly different, but in that context, you wouldn't think so. Just like in the context of not using a 4090, the 13900K seems silly. People upgrade their GPU a lot more often than they upgrade their CPU for this reason. And it's also why we try to have so many different tests, so you can get an idea of actual chip performance, instead of chip performance when it's being bottlenecked by a different component.



Vayra86 said:


> You're still part of a community buddy. This was feedback.
> 
> For a staff member, a more open stance could be expected. We (or at least, I) am not jabbing at you, its about the content we're discussing and what might make it more valuable. Insights, you know. I think that gaming energy efficiency result is a very interesting thing to highlight, because it points out the limited necessity for going this high up the Intel stack for 'better gaming'. Seeing as we are speaking of a gaming champ. Could less actually be more?


Exactly why the 13600K is titled "best gaming CPU", it's an overclocking beast, and I see no reason why it can't achieve the same clocks as the 13900K when tuned a bit. It goes from 628 to 712 FPS in CS:GO with a 500 MHz OC to 5.6 GHz, pretty sweet. The 13900K gets 736 FPS at the same clocks (5.6 GHz), so the difference of 24 FPS is probably due to the two extra P cores and more cache.
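The CS:GO scaling in that paragraph can be sanity-checked in a couple of lines. A minimal sketch (FPS figures are from this post; the 5.1 GHz stock clock is inferred from "a 500 MHz OC to 5.6 GHz", so treat it as an assumption):

```python
# 13600K CS:GO overclock scaling check.
stock_clock, oc_clock = 5.1, 5.6   # GHz; stock value inferred from the +500 MHz OC
stock_fps, oc_fps = 628, 712       # FPS before/after the overclock

clock_gain = (oc_clock / stock_clock - 1) * 100   # percent higher clocks
fps_gain = (oc_fps / stock_fps - 1) * 100         # percent more FPS
print(round(clock_gain, 1), round(fps_gain, 1))   # ~9.8 vs ~13.4
```

The FPS gain outpacing the core-clock bump suggests more than the P-core frequency changed in that tune, but that's a guess from the numbers alone.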

I believe I have a pretty open stance, I'm not planning on upgrading to this or any other next gen CPU released by Intel or AMD, for the next few years. I just dislike the criticisms of people who don't even have the use case a product is aimed at (the best HFR gaming experience, or making money with their processing power). If anyone has doubts about my own feelings on performance/energy efficiency - have a look at my specs, I use a 5800X3D. As you said, I'm part of a community, and these posts are my own, just because I'm a staff member doesn't mean that every opinion or post I make is the official representation of TPU, nor should it be IMO. The responsibility I have as a staff member is to ensure my TPU work is of good quality and unbiased, which I believe it is. It's not like I'm slandering people here. The responsibility I have as a member of this community is to present my statements as best I can in order to foster greater understanding, which I will continue to do.

To clarify and restate: if you're not chasing 240 Hz gaming, don't care about e-peen, and don't have the time-is-money attitude where a little extra on the power bill is irrelevant in the context of each minute saved in your workflow, this CPU isn't really for you.


----------



## oxrufiioxo (Oct 20, 2022)

TheoneandonlyMrK said:


> I don't think it's for you or me to decide what people should buy though I do agree with your last point.
> I think they could have done better though, it's a good chip I'm clear on that, I just hoped they would have gotten a more intelligent work flow and resource utilisation algorithm and system in play, in chip, AmD have machine learning tech to optimise workflow and it works quite well not perfect, I hoped Intel would have done more than a trim and shrink I suppose.



I agree all these new CPUs are just making the 5800X3D look better and better for gaming.


I can't believe Amazon is trying to charge $800 for this CPU, lmao. No wonder it's currently sitting at 38th on their best-selling CPU list, smh.


----------



## TheoneandonlyMrK (Oct 20, 2022)

oxrufiioxo said:


> I agree all these new CPUs are just making the 5800X3D look better and better for gaming.
> 
> 
> I can't believe Amazon is trying to charge 800 for this cpu lmao no wonder it is currently sitting at 38th on their best cpu sellers list smh.


If I only gamed I would have one tbh.


----------



## InVasMani (Oct 20, 2022)

It's pretty much in line with what I thought would be the case. It comes with caveats: Intel chased benchmark scores a bit too aggressively, but it still performs, and with appropriate manual tuning it can still be quite good. I don't really see the issue if you can't hold out for Zen 4 X3D, which might tilt things back the other direction.

Also let's see how well the iGPU's perform on these new chips at 720p/1080p.


----------



## GreiverBlade (Oct 20, 2022)

oxrufiioxo said:


> I agree all these new CPUs are just making the 5800X3D look better and better for gaming.
> 
> 
> I can't believe Amazon is trying to charge $800 for this cpu lmao no wonder it is currently sitting at 38th on their best cpu sellers list smh.


Heck yeah, the 5800X3D looks like AMD's freaking achievement of the year (or more) 

Intel going hybrid for three gens, and that "fiasco of a non-OC CPU" (the 5800X3D, obviously) still holds strong... and AMD's current gen is also putting up a good fight 

Hilariously, the 7700X is not even that expensive, even for me @ 449 CHF, given its place in the ranking (although it's a bloody shame since the 5700X is 249 CHF and the 5800X is 299 CHF)

And a tray 5800X3D is 449.90 right now (the boxed version is 499.90)... oh wait... it's sub-450 CHF ...


GreiverBlade said:


> if it continues to drop in price and reach the sub 450chf (and it's quite close atm )


Alright ... placing an order right now ...


As if ... my 3600 will last a bit longer, I reckon ... just a little more of a drop and I'll fire.

Well, if I were to do a full platform change, the 7X00X3D would be where my money would go, if I had money though ... I need an e-bike; mobility comes first.


----------



## rv8000 (Oct 20, 2022)

TheinsanegamerN said:


> You cannot trust a single data point, no matter who it is from. Science 101.
> 
> Well, for starters I can read the graphs TPU puts out. If a single review is taken as fact, then why do you disagree with your own website?
> 
> ...



I can’t understand what he’s even talking about at this point; I hadn’t even noticed that he’s the supposed proofreader of articles here. That’s wild, considering he’s going against what TPU’s own article plainly states.

Very unprofessional.


----------



## Psychoholic (Oct 20, 2022)

Pretty impressive chip, minus the power consumption...

I have been running my 7950X at 105 W (Eco mode) and loving it; it doesn't lose much performance at all.
I also run my 12900K machine at a 200 W limit.

These could also be limited in BIOS, of course. I feel like BOTH Intel and AMD should have released their latest at much lower power levels, considering how little performance you lose by capping it to something reasonable.


----------



## oxrufiioxo (Oct 20, 2022)

Psychoholic said:


> Pretty impressive chip minus the power consumption..
> 
> I have been running my 7950X at 105W (Eco mode) and loving it, it doesn't lose much performance at all.
> I also run my 12900K machine at a 200W Limit.
> ...



Agree. I was more impressed with the 7950X at 125 W/65 W than I am with its stock performance. I'm sure once more people get to grips with the 13900K we will see similarly impressive numbers.


----------



## PapaTaipei (Oct 20, 2022)

Good perf, but oh boy, those temps and watts...


----------



## rv8000 (Oct 20, 2022)

dgianstefani said:


> It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.
> 
> I expect that difference to increase when we retest with a 4090, which is exactly what the other evidence I linked (der8auer's 4090 tests) shows.
> 
> View attachment 266375 View attachment 266376



Not that a direct comparison to der8auer's work is the best idea (different test systems)…

Far Cry 6, 1080p, 3080 vs 4090:

With the 3080, the TPU results are 162 FPS with the 13900K and 131 with the 7950X for Far Cry 6.

With der8auer's 4090, his results are 169 with the 13900K and 138 with the 7950X.

Considering ONLY the change in FPS, test configs aside, the 13900K becomes less efficient compared to TPU's results…

The FPS gain for the 7950X when using a 4090 is 5.3%.

The FPS gain for the 13900K when using a 4090 is 4.3%.
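Those two percentages follow directly from the FPS figures quoted above; a small helper makes the comparison explicit:

```python
# Relative FPS uplift from swapping the 3080 for a 4090 (Far Cry 6, 1080p,
# figures as quoted in this thread: TPU 3080 results vs der8auer 4090 results).
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage gain of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

print(round(uplift_pct(138, 131), 1))  # 7950X:  131 -> 138 FPS, prints 5.3
print(round(uplift_pct(169, 162), 1))  # 13900K: 162 -> 169 FPS, prints 4.3
```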


----------



## dgianstefani (Oct 20, 2022)

rv8000 said:


> Not that a direct comparison is the best idea to der8auers work (different test systems)…
> 
> Far Cry 6 1080p 3080
> 
> ...


I'm looking forward to the TPU testing.


----------



## fevgatos (Oct 20, 2022)

Vayra86 said:


> Its still strange this K CPU is marketed as a 125W CPU, which it really isn't.
> 
> Also, the perf/watt isn't better, it's worse; objectively worse at MT; and at ST-focused gaming, the 5800X3D is still better, as is even a 7600X
> 
> ...


I hope you realize that the graph you just posted about efficiency basically goes against every other review out there. Not saying it's wrong; it might be that every other review is wrong. But most reviews have the 13900K at 38.5 to 39.5k points @ 253 W, bringing it to around 160 points/watt instead of the 113 that your graph has it at.
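The arithmetic behind that claim is easy to check. A quick sketch using the score range and 253 W limit cited in this post (the 113 points/W figure is from the graph being discussed):

```python
# Cinebench R23 multi-thread points-per-watt at a fixed 253 W power limit.
def points_per_watt(score: int, watts: float) -> float:
    return score / watts

print(round(points_per_watt(38_500, 253)))  # lower end of the cited score range
print(round(points_per_watt(39_500, 253)))  # upper end of the cited score range
```

So the cited scores land at roughly 152–156 points/W, in the ballpark of the "around 160" figure and well above 113.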


----------



## Shatun_Bear (Oct 20, 2022)

dgianstefani said:


> It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.
> 
> I expect that difference to increase when we retest with a 4090, which is exactly what the other evidence I linked (der8auer's 4090 tests) shows.
> 
> View attachment 266375 View attachment 266376



Wonder why you've hidden the resolution where we see this 10% difference...

At resolutions that are sensible, the difference goes from 7% to 6%, while using 30 W more, which is significant when the power draw is already so high.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> There's a good reason why the TPU graphs are somewhat slewed, it's because we had to use a 3080. Why exactly do you think we're retesting with a 4090? For fun? It's incredibly difficult to put modern CPUs in a position where they are actually the limiting factor in a system, making testing hard, and justifying the use of academic 720p testing. Just look at 4K results, you can infer the same thing from the other end of the extreme, why bother getting anything above a 3700X when you'll only get a 4% performance increase for your money? Obviously the actual performance difference is significantly different, but in that context, you wouldn't think so. Just like in the context of not using a 4090, the 13900K seems silly. People upgrade their GPU a lot more than they upgrade their CPU for this reason. And it's also why we try to have so many different tests, so you can get an idea of actual chip performance, instead of chip performance when it's being bottlenecked by a different component.


Oh, okay, I thought we were discussing your proofreading not too long ago, since you brought it up. But then I'm wrong. Np!

By the way, on the bit I quoted up there specifically, I'm on that exact same page with you. But again, you confirm it yourself: the gaming purpose of a CPU this high up the stack is still extremely limited, you can't even properly max it out in gaming, and this echoes throughout CPU history. Going this far out with your CPU for future GPU upgrades really never paid off, except in periods of complete stagnation; quad i7s proved their value over i5s in the late days of Skylake. And why? Only, and I do mean only, because of core/thread count. Even that isn't in the picture anymore with these CPUs; you can drop to near the bottom of the stack and still have enough. And... are there outliers where you do find the bonus FPS in actual gaming? Sure! But again, it's so limited.


----------



## DemonicRyzen666 (Oct 20, 2022)

dgianstefani said:


> It's 10.5% faster in games than the next fastest competitor (7950X) at stock, while using a whole 31 W more.


And what's the wattage total in % over the 7950X? Because that matters.


----------



## phill (Oct 20, 2022)

It's just a CPU, guys... Please don't get all antsy 

Intel seems to have it in the games section but I think for the most part of everything else, AMD still has the edge but again even with in games, it trades blows depending on the games from what reviews/results I've seen.  

I'm not in the market for either CPU at the moment, but I agree with the video Jayztwocents put out, saying "whichever platform you go for, you'll have an amazing gaming experience with", and I think it's true. I honestly feel that Intel hasn't really done anything special with this release. I think it's purely the higher clocks and the IPC increase pushing it forward, in effect turning it up from 10 to 11. The higher temp limit seems to help that, because without it, I think the performance might be similar to or less than the AMD counterpart...

I think the hours that have gone into this review and every review that @W1zzard does are amazing; the data is good and solid, and there's a massive variety of it.


----------



## dgianstefani (Oct 20, 2022)

Shatun_Bear said:


> Wonder why you have hid the resolution where we see this 10% difference...
> 
> At resolutions that are sensible, the difference goes from 7% to 6%. Whereas using 30W more, which is significant when the power draw is already so high.


"at resolutions where the GPU is the bottleneck, not the CPU..." is what I believe you mean.

I didn't hide anything - the review is there for everyone to see, I snipped up to the next relevant CPU after the 7950X - the 5800X3D.



Vayra86 said:


> Oh, okay, I thought we were discussing your proofreading not too long ago, since you brought it up. But then I'm wrong. Np!
> 
> By the way, on the bit I quoted up there specifically, I'm on that exact same page with you. But again, you confirm it yourself: the gaming purpose of a CPU this high up the stack is still extremely limited, you can't even properly max it out in gaming, and this echoes throughout CPU history. Going this far out with your CPU for future GPU upgrades really never paid off, except in periods of complete stagnation; quad i7s proved their value over i5s in the late days of Skylake. And why? Only - and I do mean only - because of core/thread count. Even that isn't in the picture anymore with these CPUs; you can drop to near the bottom of the stack and still have enough. And... are there outliers where you do find the bonus FPS in actual gaming? Sure! But again, it's so limited.


I didn't bring it up - people have been criticising my opinion, which supposedly should be different since I'm the proofreader? I disagree that the gaming purpose of this CPU is limited; maybe for now, when tested with a two-year-old 3080, it's overkill, but bear in mind people keep their CPU/mobo platforms a lot longer than they keep their GPUs. Lots of people are still on Skylake derivatives, for instance, but rocking RTX 3xxx chips. Having CPU power in excess is useful, since you'll probably keep the platform for several GPU generations, where the CPU will then be able to stretch its legs more. You can make the same argument with AM5 and its long-lasting platform, but that assumes you're willing to also spend money to upgrade the CPU.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> "at resolutions where the GPU is the bottleneck, not the CPU..." is what I believe you mean.
> 
> I didn't hide anything - the review is there for everyone to see, I snipped up to the next relevant CPU after the 7950X - the 5800X3D.
> 
> ...


Your opinion is yours alone. Be wary that it doesn't reflect in your job


----------



## dgianstefani (Oct 20, 2022)

Vayra86 said:


> Your opinion is yours alone. Be wary that it doesn't reflect in your job


If the opinion I (and other people) share ever biases my work, people can criticize as much as they want, and I'm sure it would come up a lot faster internally too.


----------



## Vayra86 (Oct 20, 2022)

dgianstefani said:


> If the opinion I (and other people) share ever biases my work, people can criticize as much as they want, and I'm sure it would come up a lot faster internally too.


Fine statement. I just want to add the detail that I don't think you're being accused of bias, but rather of a one-sided view on the thing. But you've explained you see more than that, so, yeah.

A more specific point I might want to make: perhaps it's mighty interesting to add a pair of energy efficiency test runs instead of just 'the one on the fastest GPU'. The comparison with a GPU that is much more commonplace or 'in reach' is pretty interesting, imho. Looking forward to that 4090 retest.


----------



## oxrufiioxo (Oct 20, 2022)

dgianstefani said:


> I didn't bring it up - people have been criticising my opinion, which supposedly should be different since I'm the proofreader? I disagree that the gaming purpose of this CPU is limited, maybe for now, when tested with a two year old 3080 it's overkill, but bear in mind people keep their CPU/Mobo platforms a lot longer than they keep their GPUs. Lots of people still on Skylake derivatives, but still rocking RTX 3xxx chips for instance. Having CPU power in excess is useful, since you'll probably keep the platform for several GPU generations, where the CPU will then be able to stretch it's legs more. You can make the same argument with AM5 and it's long lasting platform, but that assumes you're willing to also spend money to upgrade the CPU.



Eh, I used to think this way, but I don't really think that's the case anymore; it was more of a thing during the quad-core i7 era. As an example, if you bought a $300-ish CPU today for gaming and then grabbed another $300 CPU in 2 years, you'd likely be better off than spending $600 on your CPU now. Although maybe that only applies to AM5, because LGA 1700 is dead, I guess. 

If I was going to jump in on a platform when it was basically EOL with no future upgrade options, I would probably lean towards the best-in-socket CPU, although it would have to be generationally better than the competition for me to do that.  

I still think 13th gen and Ryzen 7000 will sell poorly till DDR5 pricing comes down. Even including Alder Lake, these new-gen CPUs have been bugs on a windshield trying to stop the Ryzen 5000 train, which still seems to be dominating DIY sales here in the States.


----------



## dgianstefani (Oct 20, 2022)

oxrufiioxo said:


> Eh, I used to think this way, but I don't really think that's the case anymore; it was more of a thing during the quad-core i7 era. As an example, if you bought a $300-ish CPU today for gaming and then grabbed another $300 CPU in 2 years, you'd likely be better off than spending $600 on your CPU now. Although maybe that only applies to AM5, because LGA 1700 is dead, I guess.
> 
> If I was going to jump in on a platform when it was basically EOL with no future upgrade options, I would probably lean towards the best-in-socket CPU, although it would have to be generationally better than the competition for me to do that.
> 
> I still think 13th gen and Ryzen 7000 will sell poorly till DDR5 pricing comes down. Even including Alder Lake, these new-gen CPUs have been bugs on a windshield trying to stop the Ryzen 5000 train, which still seems to be dominating DIY sales here in the States.


The KS is coming next year with some more fuel for the haters, so not quite a dead platform. The perk of buying that $600 CPU today instead of two $300 CPUs two years apart is that you get to enjoy the performance (not just in gaming) of that $600 CPU for the next two years - a CPU which also happens to be more than 10% faster in gaming than the more expensive competition.

Ryzen is nice for MT performance, but I'll say it again as I've said in the past. ST is always relevant, MT is sometimes relevant.


----------



## damric (Oct 20, 2022)

Ok, so I guess these newfangled CPUs from Intel and AMD are geared towards upper high-end users. These elites typically have some nice monitors, like 4K or some fancy ultrawide.

But according to the performance graph at 4K, even the old Ryzen 3600X still gets 95% of the gaming performance of these new ones.

Yeah I just don't see this generation as _must have. _

5.8GHz stock is neat though.


----------



## Anymal (Oct 20, 2022)

W1zz, why not a 4090?


----------



## fevgatos (Oct 20, 2022)

dgianstefani said:


> The KS is coming next year with some more fuel for the haters, so not quite a dead platform. The perk of buying that $600 CPU today instead of two $300 CPUs two years apart is that you get to enjoy the performance (not just in gaming) of that $600 CPU for the next two years - a CPU which also happens to be more than 10% faster in gaming than the more expensive competition.
> 
> Ryzen is nice for MT performance, but I'll say it again as I've said in the past. ST is always relevant, MT is sometimes relevant.


The problem is, all high-end CPUs nowadays are absolutely useless unless you have a PhD in BIOS. Which personally I don't mind, since I never run anything stock, but I can't recommend these CPUs to anyone who wants to do some serious heavy work on them - unless they want to tinker with the BIOS. That goes for both AMD and Intel; ADL/Zen 4/RPL are absolutely crap in that regard.

All of them are extremely efficient, but not out of the box. Out of the box they are just completely wonky. The last decent high-end CPU I can recommend is the 5950X - you don't ever need to go into the BIOS, it runs perfectly okay out of the box.


----------



## Zareek (Oct 20, 2022)

Ding, Ding, Ding, CPU Wars Round Two!!! It's nice to see the competition heating up. Intel is firing back, we have real competition again! Intel was content to release Quad Core after Quad Core on a 14+++++++++++++++++ process. Not anymore, the Blue Giant has been awakened! Two series in a row, they are punching back. Let the innovations flow! Prices will fall, and we can all enjoy the fireworks.


----------



## pavle (Oct 20, 2022)

Finally, we have a 400 W CPU (at maximum, but still).


----------



## Why_Me (Oct 20, 2022)

damric said:


> Ok, so I guess these newfangled CPUs from Intel and AMD are geared towards upper high-end users. These elites typically have some nice monitors, like 4K or some fancy ultrawide.
> 
> But according to the performance graph at 4K, even the old Ryzen 3600X still gets 95% of the gaming performance of these new ones.
> 
> ...


These are the three most expensive Intel 13th gen CPUs. In January, the rest of the Intel 13th gen lineup, along with the less expensive B760 boards, will be released. You will be able to use DDR5 or DDR4 with this generation.


----------



## InVasMani (Oct 20, 2022)

The L2N BBRRRRRR is strong with this generation of hardware.


----------



## sLowEnd (Oct 20, 2022)

Fast chip, but I wouldn't want to use it in the summer out of the box in my room. It looks much better after some tweaks, but I can see why some people would not like to bother with something like that. Hopefully the non-K SKUs will be much more reasonable with their power targets out of the box.


----------



## oxrufiioxo (Oct 20, 2022)

dgianstefani said:


> The KS is coming next year with some more fuel for the haters , so not quite a dead platform. The perks of buying that $600 CPU today instead of two $300 CPUs two years apart also being that you get to enjoy the performance (not just in gaming) of that $600 CPU for the next two years, which also happens to be more than 10% faster in gaming than the more expensive competition .
> 
> Ryzen is nice for MT performance, but I'll say it again as I've said in the past. ST is always relevant, MT is sometimes relevant.



The KS, lol... It will be a miracle if that thing works well on anything but open-loop cooling. I also doubt an extra 200 MHz on a couple of cores will actually matter unless you look at RTSS more than you game, and really, that's the issue these days: everything from a ~$140 5600 up games well at 1440p with high/ultra settings on 90% of GPUs. Thankfully, the days of Intel giving us hella gimped i5s are dead.



Why_Me said:


> These are the three most expensive Intel 13th gen CPUs. In January, the rest of the Intel 13th gen lineup, along with the less expensive B760 boards, will be released. You will be able to use DDR5 or DDR4 with this generation.



The only problem is that all the non-K i5 SKUs and below are just Alder Lake rebrands, so while still good, they miss all the benefits of Raptor Lake.


----------



## Why_Me (Oct 20, 2022)

oxrufiioxo said:


> The KS, lol... It will be a miracle if that thing works well on anything but open-loop cooling. I also doubt an extra 200 MHz on a couple of cores will actually matter unless you look at RTSS more than you game, and really, that's the issue these days: everything from a ~$140 5600 up games well at 1440p with high/ultra settings on 90% of GPUs. Thankfully, the days of Intel giving us hella gimped i5s are dead.
> 
> 
> 
> The only problem is that all the non-K i5 SKUs and below are just Alder Lake rebrands, so while still good, they miss all the benefits of Raptor Lake.


And what benefits would those be if you aren't using your PC for productivity five days a week?


----------



## oxrufiioxo (Oct 21, 2022)

Why_Me said:


> And what benefits would those be if you aren't using your PC for productivity five days a week?



The extra cache benefits games more than productivity. The 13400 is just a castrated 12600K that will always perform worse - not very exciting for a new-gen part that will likely be almost a year newer when launched.


----------



## Why_Me (Oct 21, 2022)

oxrufiioxo said:


> The extra cache benefits games more than productivity. The 13400 is just a castrated 12600K that will always perform worse - not very exciting for a new-gen part that will likely be almost a year newer when launched.


No argument there.  The 13400 is limited to 24 MB of cache, but I'm still interested in seeing the benchmarks.


----------



## oxrufiioxo (Oct 21, 2022)

Why_Me said:


> No argument there.  The 13400 is limited to 24 MB of cache, but I'm still interested in seeing the benchmarks.



I'm not saying they are going to be bad, or even not worth buying; I'm just not a huge fan of Intel rebranding Alder Lake in the price segment where most people keep their hardware the longest. I will likely be recommending the 12600K over the 13th gen locked i5 parts, especially if it comes down a bit in price. At least down the line you can tweak it for extra performance if necessary.

The other thing, and something that may matter more down the line, is that Raptor Lake seems to have a much better memory controller than Alder Lake, so anyone building a DDR5 system who opts for a cheap 5000 MHz-ish kit now probably won't be able to grab a much better 7000+ kit down the line whenever DDR5 gets cheaper, which kinda sucks.


----------



## Ravenas (Oct 21, 2022)

The 7950X is a bad upgrade for me having a 5950X. This is equally bad or worse due to the power consumption.

I really am hoping AMD impresses me with GPUs.

There is absolutely zero incentive for efficiency year over year. What incentive do these firms have to improve power consumption? The race is exclusively about who produces the most frames. At this rate we will have 2000W PSUs within a decade.


----------



## jamexman (Oct 21, 2022)

Meh, just wait for AMD's 3D cache 7000 series if you must upgrade soon. I'm staying with my 5900X until I see Zen 4+ or 5 and/or ML. I game at almost 4K (5120x1440), so GPU performance is the most decisive factor.

Let’s hope they can bring down power consumption.


----------



## Mussels (Oct 21, 2022)

Something's not adding up after comparing a bunch of reviews of the 13th gen and reading this


One GN example of frametimes






TPU's results (ignoring the messy graph)





Same game, same resolution - but those 0.1% lows massively change how to interpret those results, and the results seem so wildly different


Vs the 7700x seems the same, I'm trying to wrap my head around what hardware is different to alter these results so much




Vs: (cropped out some CPUs between them, didn't alter the order or anything) where the 13900K is 11.9% higher on average (vs 27.3%) and the 99th percentile/1% lows swap places completely


----------



## AusWolf (Oct 21, 2022)

Look at that gaming performance!  I nearly axed my AM5 upgrade plans, but then I saw the page on temperatures and power consumption... jeez!


----------



## Crackong (Oct 21, 2022)

Mussels said:


> Something's not adding up after comparing a bunch of reviews of the 13th gen and reading this
> 
> 
> One GN example of frametimes
> ...



I think this just shows there is no 'domination' in this generation.
Different games behave differently.
Even the same game, on the same CPU, but in a different environment (OS, build, thermals), behaves differently.
Some games just don't like Intel.
Some games just don't like Ryzen.
Some OS just 'magically' breaks certain CPU performance with an automatic update.

People should pick the one that performs better in their daily workloads.
Or maybe just pick the one causing less trouble to upgrade.


----------



## r9 (Oct 21, 2022)

Silly me, thinking the Ryzen 7xxx runs hot. 117°C, wtf is that?!


----------



## rv8000 (Oct 21, 2022)

Also, the 13900K can't be found for under $649. Shouldn't the performance-per-dollar chart reflect that?


----------



## Super Firm Tofu (Oct 21, 2022)

Mussels said:


> Something's not adding up after comparing a bunch of reviews of the 13th gen and reading this
> 
> Same game, same resolution - but those 0.1% lows massively change how to interpret those results, and the results seem so wildly different
> 
> Vs the 7700x seems the same, *I'm trying to wrap my head around what hardware is different to alter these results so much*



My guess would be 3090Ti vs 3080, and probably a different test scene.  I know the built in benchmark for FC6 can stutter like a mutter.


----------



## metalslaw (Oct 21, 2022)

Interestingly, given the gaming CPU temps (with the Noctua), the 5.8 GHz boost on 2 cores may hardly happen in games at all, even with a really good AIO, as the temps will probably still be over 70°C.

That makes the chip less worth getting, and is probably why we didn't see more improvement vs last gen.

Custom loops with big radiators may be the only way to cool it sufficiently to get it below 70°C when gaming.


----------



## R0H1T (Oct 21, 2022)

Ravenas said:


> The race is exclusively who produces the most frames *flames*.


Probably more accurate now than ever 


Ravenas said:


> At this rate we will have 2000W PSUs within a decade.


They're already here, you just need the dough or inheritance from a rich uncle to be able to afford them 








						Nvidia RTX 4090 Ti reportedly canned due to sky-high power consumption
					

Rumor: Nvidia RTX 4090 Ti was such a PSU killer it’s been ditched




					www.techradar.com


----------



## sLowEnd (Oct 21, 2022)

Ravenas said:


> At this rate we will have 2000W PSUs within a decade.


They already exist (probably mostly to meet miner demand), but aren't common in 120V areas because of the need for a beefier power outlet.









						Going BIG: EVGA’s SuperNova 2000W G+ Power Supply
					






					www.anandtech.com


----------



## Iain Saturn (Oct 21, 2022)

Can anyone point me towards undervolting data? He has interesting OC data also.

Only thing I could find.










Any hope for this processor? Was looking for something that would be efficient at long AV1 and MP4 (or 5) video rendering.

Thank you.


----------



## Mistral (Oct 21, 2022)

When you say that the Intel 13th gen series is more affordable than Ryzen 7000 because you can buy a cheaper motherboard with DDR4, wouldn't it be fair to also review it on one?


----------



## Why_Me (Oct 21, 2022)

AusWolf said:


> Look at that gaming performance!  I nearly axed my AM5 upgrade plans, but then I saw the page on temperatures and power consumption... jeez!


----------



## Thorsthimble (Oct 21, 2022)

Yeah, I think I'm just going to stick with what I have for a bit longer. Those power figures are kind of eye-watering. I understand running the upper-tier stuff is going to use more power as a general rule, and I accept that I will need to pay the bill that comes as a result of it. So I'm willing to accept a bit of pain to have the rig I do have. That said, I'm also not a complete masochist. That's a lot of juice. Then slapping a 4090 on top of it? You're going to have your neighbors complaining about their lights dimming for a few seconds whenever you turn on your computer. Holy smokes.


----------



## rvalencia (Oct 21, 2022)

Both camps need to be overclocked, not just one side.


----------



## jsven008 (Oct 21, 2022)

Just when I thought CPUs couldn't get any hotter after the 7950X, now we have the 13900K! It's pushing 101°C in application workloads and 90°C in gaming. I love performance, but this is performance at all costs. I'd wait for a lower-power CPU variant, like a 65-watt 13700 (without the 'K'). I also like a cool and quiet PC.


----------



## W1zzard (Oct 21, 2022)

Mussels said:


> Something's not adding up after comparing a bunch of reviews of the 13th gen and reading this
> 
> 
> One GN example of frametimes
> ...


Any chance he's using the integrated benchmark? The benchmark uses some kind of fast flyby on a tiny map, which isn't how actual gameplay works. I'm using an in-game scene that I picked, that I feel is representative of the typical gameplay experience


----------



## mb194dc (Oct 21, 2022)

I find it hard to understand anyone dropping the money for a latest-gen chip and then using it at 1080p. Maybe a tiny subset of gamers? 

I struggle to see the use case for either of these latest-gen, power-hungry, hot monstrosities. Last-gen chips will do the same for less.

If you're gaming at a normal modern resolution like 4K, any modern CPU will do.


----------



## jsven008 (Oct 21, 2022)

W1zzard said:


> Any chance he's using the integrated benchmark? The benchmark uses some kind of fast flyby on a tiny map, which isn't how actual gameplay works. I'm using an in-game scene that I picked, that I feel is representative of the typical gameplay experience


Not to disparage Gamer's Nexus, but in the past I have noticed some of the values in GN reviews didn't add up and had little to no statistical variability. I have noticed the same thing mentioned above, that some of the 0.1% lows didn't make sense on GN in the past. If a processor is supposed to be exactly 1.0% faster, most likely on Gamers Nexus it will be exactly 1.0% faster. Some of the results on Gamers Nexus are a little too statistically consistent to be accurate IMO. But, that's why I'm here and not at Gamer's Nexus. Because I have far more faith in W1zzard. Truthfully speaking.


----------



## N3M3515 (Oct 21, 2022)

dgianstefani said:


> Ryzen is nice for MT performance, but I'll say it again as I've said in the past. ST is always relevant, MT is sometimes relevant.


According to this review and every other, both CPUs are indistinguishable from each other in ST or MT. I don't know what all the fuss is about. Anyone doing a blind test wouldn't know which is which.


----------



## Mussels (Oct 21, 2022)

Super Firm Tofu said:


> My guess would be 3090Ti vs 3080, and probably a different test scene.  I know the built in benchmark for FC6 can stutter like a mutter.


Even so, something's really wrong for the results to be so different in the frametime data.

I'm not asking why one setup has more FPS than the other, but why one shows the X3D having the fantastic frametimes it's known for, and the other doesn't.



W1zzard said:


> Any chance he's using the integrated benchmark? The benchmark uses some kind of fast flyby on a tiny map, which isn't how actual gameplay works. I'm using an in-game scene that I picked, that I feel is representative of the typical gameplay experience


It's entirely possible; I couldn't even find the hardware for their AMD setup listed.
It's just that the discrepancy is odd to see - I've just downloaded CP2077 so I can test on my own system for comparison.


I've just seen differing results all over the place, mostly boiling (hah) down to:

1. These CPUs run at 350 W for 120 seconds, then drop to 250 W. Performance drops when this happens, so they really shine in short tests where they get cooldown time.
2. The insane temps and power draw have resulted in erratic performance for some reviewers, where the chips boost high, throttle down, and repeat, causing the stutters.
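That first point is just Intel's PL1/PL2/tau turbo budget at work: the chip may draw PL2 while an exponentially weighted moving average of package power stays under PL1. A rough sketch of that mechanism (the tau value here is back-solved to fit the numbers above, purely an assumption, not any board's actual setting):

```python
import math

# Hedged sketch of Intel's turbo power budget, not vendor code: the CPU may
# draw PL2 until the exponential moving average of package power reaches PL1.
def boost_seconds(pl1_w: float, pl2_w: float, tau_s: float) -> float:
    """Seconds of full-PL2 boost (starting from idle) before the average hits PL1."""
    avg, t, dt = 0.0, 0.0, 0.1
    alpha = 1.0 - math.exp(-dt / tau_s)  # per-step EWMA weight
    while avg < pl1_w:
        avg += (pl2_w - avg) * alpha     # running average of package power
        t += dt
    return t

# With the 350 W / 250 W figures above and an assumed tau of ~96 s,
# the boost window comes out to roughly two minutes.
print(round(boost_seconds(250, 350, 96)))
```

Short benchmarks finish inside that window, while long renders spend most of their time clamped to PL1 - one plausible way two reviews of the same chip end up disagreeing.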


----------



## AusWolf (Oct 21, 2022)

jsven008 said:


> Not to disparage Gamer's Nexus, but in the past I have noticed some of the values in GN reviews didn't add up and had little to no statistical variability. I have noticed the same thing mentioned above, that some of the 0.1% lows didn't make sense on GN in the past. If a processor is supposed to be exactly 1.0% faster, most likely on Gamers Nexus it will be exactly 1.0% faster. Some of the results on Gamers Nexus are a little too statistically consistent to be accurate IMO. But, that's why I'm here and not at Gamer's Nexus. Because I have far more faith in W1zzard. Truthfully speaking.


Same here. Also, GN tends to emphasize 0.1% lows a bit too much, when most of the time, it's just a small hitch while the game loads an asset and you barely even notice it - if at all.


----------



## Dirt Chip (Oct 21, 2022)

We need to stop talking about top-tier CPUs (13900, 7950) as gaming CPUs, as they have nothing extra to offer - they can even be slower than same-gen, lower-tier CPUs (13600, 7700) - while having worse thermals, higher cost and higher wattage.

Those highly threaded CPUs (32 threads) are for application use only; gamers should avoid them and invest the difference in the GPU.


----------



## Vayra86 (Oct 21, 2022)

dgianstefani said:


> The KS is coming next year with some more fuel for the haters, so not quite a dead platform. The perk of buying that $600 CPU today instead of two $300 CPUs two years apart is that you get to enjoy the performance (not just in gaming) of that $600 CPU for the next two years - a CPU which also happens to be more than 10% faster in gaming than the more expensive competition.
> 
> Ryzen is nice for MT performance, but I'll say it again as I've said in the past. ST is always relevant, MT is sometimes relevant.


LOL. It's not a dead platform because there is a... KS... coming out next year?

Now it's clear you suffer from some bias, lmao. And you repeat an argument about a $600 CPU giving you the performance today, but that doesn't match the results we see in reality, or the discussion in which you _yourself_ admitted you can't even max it out properly today. In gaming. Similarly... ST is always relevant, yes, so we're using that argument wrt a 32-thread CPU 

Sorry man, but this is just strange.


----------



## Mussels (Oct 21, 2022)

AusWolf said:


> Same here. Also, GN tends to emphasize 0.1% lows a bit too much, when most of the time, it's just a small hitch while the game loads an asset and you barely even notice it - if at all.


Personally, those hitches are a big deal.

If I'm in VR, that's nausea - and in any other type of game, it's exactly what I build gaming PCs to avoid.


----------



## fevgatos (Oct 21, 2022)

Vayra86 said:


> LOL. It's not a dead platform because there is a... KS... coming out next year?
> 
> Now it's clear you suffer from some bias, lmao. And you repeat an argument about a $600 CPU giving you the performance today, but that doesn't match the results we see in reality, or the discussion in which you _yourself_ admitted you can't even max it out properly today.
> 
> Sorry man, but this is just strange.


Honestly, that dead-platform argument is stupid no matter which way you look at it. Going by current and past prices, AMD will charge you the price of a motherboard included in their CPU prices. Think about it this way: I had an R5 1600 + a B350. I could upgrade to a 5800X3D for 450€ and keep my motherboard, or buy a 12700F + B660 Bazooka for 450€ combined. Personally, I think the second option is better; yes, I might lose up to 10% gaming performance in some extreme scenarios, but I get a brand-new motherboard with full warranty and all the latest features - and a much faster CPU in everything besides gaming. 

And you can see the same pattern this gen. The 13600K annihilates both the 7600X and the 7700X, and it's priced in between. Going by current numbers, it would take Zen 6 or Zen 7 (which is like what, 4 to 6 years?) for an R5 CPU to match the performance of the 13600K.


----------



## AusWolf (Oct 21, 2022)

Mussels said:


> Personally, those hitches are a big deal.
> 
> If I'm in VR, that's nausea - and in any other type of game, it's exactly what I build gaming PCs to avoid.


VR is a fair point.


----------



## Dirt Chip (Oct 21, 2022)

N3M3515 said:


> According to this review and every other, both CPUs are indistinguishable from each other in ST or MT. I don't know what all the fuss is about. Anyone doing a blind test wouldn't know which is which.


But you do see the cost of the platform.
If both are the same and one is cheaper - by some hundreds of dollars, if you go DDR4 and Z690 - then the choice is easier.


----------



## ModEl4 (Oct 21, 2022)

I definitely saw some reviews that explicitly mentioned a 253 W PL2 (*PL1=PL2*=253 W) and ~38,000 CB R23 scores, but with Windows 22H2; I even saw a 35,500 result with 22H1.
The same publications had, in the same charts, 40,000+ results for unlimited power, so no confusion there!
I also watched der8auer's video review, which paints an extremely positive picture of 13900K power efficiency (with his optimizations it matches the 7950X in eco mode).
I really don't remember recent years having such vastly different results from publication to publication. Is it the Windows edition, motherboard peculiarities, memory clocks/timings/gear, the cooler used? I really don't know what to think at this point; I will reserve my judgement for the time being.


----------



## Solid State Brain (Oct 21, 2022)

@ModEl4
A few possible reasons:

Not all CPUs are equal, and some may run at significantly higher or lower voltages than average.
As I mentioned earlier, when the CPU is running power-limited, it is important that the DC Loadline is correctly configured, or results may be unusually higher or lower than they should be for the same reported Package Power - but almost nobody checks this.
The AC Loadline used also influences the results. This is a motherboard setting for regulating load voltage (with Adaptive voltages) that helps keep the CPU on its built-in voltage-frequency curve. MSI notably uses a very high AC Loadline by default; other manufacturers (e.g. ASUS) tend to use a lower one.
Incompetence/sloppiness (wrong BIOS settings or testing methodologies).
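To illustrate the AC Loadline point with made-up numbers (the milliohm values, VID and current below are hypothetical, not any board's real defaults): with Adaptive voltage, the CPU requests extra voltage in proportion to current, so two boards at the "same" power limit can deliver very different voltage - and therefore very different real power - under load:

```python
# Illustrative-only sketch of AC Loadline behaviour; the 1.7 and 0.4 mOhm
# loadlines and the 1.10 V VID are assumed values, not real board defaults.
def loaded_vcore(vid_v: float, current_a: float, ac_ll_mohm: float) -> float:
    """Voltage the CPU requests under load: VID plus I*R from the AC Loadline."""
    return vid_v + current_a * ac_ll_mohm / 1000.0

current = 250.0                                    # amps drawn, all-core load
v_aggressive = loaded_vcore(1.10, current, 1.7)    # MSI-style high AC Loadline
v_conservative = loaded_vcore(1.10, current, 0.4)  # lower, ASUS-style setting

# Same VID, same current - yet very different core power (P = V * I):
print(v_aggressive * current, v_conservative * current)  # ~381 W vs ~300 W
```

That gap is one way the same CPU can look wildly more or less efficient from one review's motherboard to another's.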

*EDIT*: on a related note, HardwareUnboxed had testing issues:





						Hardware Unboxed (@HardwareUnboxed)
					

Please note I will trim (remove) the power scaling section from our review. The data is accurate for the MSI motherboard used. BUT It appears as though MSI is very aggressive on voltage and doesn’t correct this when power limiting the CPU.




					nitter.net


----------



## ModEl4 (Oct 21, 2022)

I wonder if there is any possibility, a year from now when Meteor Lake launches (or even earlier than that), of Intel upgrading all the CPUs below the 13600KF to Raptor Lake dies (the upcoming 65 W i5s are supposedly Alder Lake based).
I can't remember - has Intel ever done this in the last decade? (same socket)
This would allow people who bought 12th gen i3s to upgrade to 6P+8E Raptor Lake CPUs with a good gaming performance increase and also double the MT performance (while staying at low PL2 values relative to the 65 W i7/i9 Raptor Lake models).



Solid State Brain said:


> @ModEl4
> A few possible reasons:
> 
> Not all CPUs are equal and some may even run at significantly higher or lower voltages than the average.
> ...


Regarding HU: I watched 1-2 of their reviews two years ago, but they were obviously too one-sided (AMD-favouring), and it showed that they badly wanted to promote the competing brand, so I never watched again! (But I never posted anything about it here; it was just my impression, and most of their results were valid anyway - it's just that some were not, and they pushed the message too hard based on those.)


----------



## FeelinFroggy (Oct 21, 2022)

It's probably getting to be time to upgrade my 8700K.  The new AMD CPUs are very good, and Intel still holds a small advantage in gaming performance.  But the elephant in the room is the upcoming 7800X3D.  For strictly gaming (which is me), I think I'll wait till it comes out (thought it was supposed to be November) before I make a choice on an upgrade.



Thorsthimble said:


> Yeah, I think I'm just going to stick with what I have for a bit longer. Those power figures are kind of eye-watering. I understand running the upper tier stuff is going to use more power no as a general rule, and I accept that I will need to make the bill payment that comes as a result of it. So I'm willing to accept a bit of pain to have the rig I do have. That said, I'm also not a complete masochist. That's a lot of juice. Then slapping a 4090 on top of it? You're going to have your neighbors complaining about their lights dimming for a few seconds whenever you turn on your computer. Holy smokes.


Do you run your rig at 100% 24/7? Unless you are a power user who does a lot of content creation with the system running at full peak all the time, any difference on your electric bill will be negligible compared to whatever you currently run.

Gaming is not even close to a full load on the system, and how much do you game? 12-15 hours a day? Most gamers (especially adults who can afford this equipment) won't be spending that much time gaming. So your complaints about power bills and lights dimming are silly.

If you can afford the 4090 or the cost of a platform upgrade, you can afford a few extra kWh on your electric bill.
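The "few extra kWh" claim is easy to sanity-check with back-of-envelope arithmetic. All inputs below are assumptions (extra draw, daily gaming hours, and price vary hugely); substitute your own:

```python
# Back-of-envelope cost of a hungrier CPU while gaming.
# Every input here is an assumption; plug in your own numbers.

extra_watts   = 100      # extra CPU draw vs a more efficient chip, while gaming
hours_per_day = 3        # assumed gaming time
price_per_kwh = 0.15     # USD; varies a lot by region

extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = extra_kwh_per_month * price_per_kwh

print(f"{extra_kwh_per_month:.1f} kWh/month extra")   # 9.0 kWh
print(f"${cost_per_month:.2f}/month extra")           # $1.35
```

At triple the electricity price and double the hours it's still only a few dollars a month, which is the point being made above.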


----------



## Why_Me (Oct 21, 2022)

Meanwhile peeps with more money than brains are purchasing 7950's and 13900K's atm for gaming builds.


----------



## Vayra86 (Oct 21, 2022)

FeelinFroggy said:


> It's probably getting time to upgrade my 8700k.  The new AMD CPUs are very good and Intel still holds a small advantage in gaming performance.  But the elephant in the room is the potential 78003dx.  For strictly gaming (which is me) I think I'll wait till it comes out (thought it was supposed to be November) before I make a choice on an upgrade.


What games are choking your 8700K? I'm curious as I have yet to find one, though of course I'm on an 'ancient' GPU.


----------



## medi01 (Oct 21, 2022)

thunderingroar said:


> Looking forward to future CPU testings with a 4090


It is worse than the 12900K, let alone the 5800X, if 1% lows are included (super relevant for gaming), so better not.


----------



## W1zzard (Oct 21, 2022)

Vayra86 said:


> What games are choking your 8700K? I'm curious as I have yet to find one, though of course I'm on an 'ancient' GPU.


I'm on 8700K with 3080 and haven't noticed any "choking"


----------



## 1d10t (Oct 21, 2022)

And I think AMD outdid Intel in power and temperature; both teams managed to deliver a good "meh" this generation. I think my Zen 3 will be fine until 3D V-Cache comes.


----------



## ratirt (Oct 21, 2022)

Well, I guess those who claimed how great this new Intel is can bite their tongues now. Some improvement over the 12900K, but definitely not the claimed 40% (or whatever it was), and yet the power consumption... holy crap. I guess the main improvement was frequency, which means power through the roof, just to be slightly ahead of the competition. Sorry Intel, but the power consumption is way too high; whether in applications or gaming, it's really not good.
I literally just glanced at the charts and it would seem the 13900K is just a 12900K on steroids, power-wise, thus the results. As I see it? No real improvement, just higher power consumption for higher gains. I hoped for better results, to be honest.

Just noticed:
the real fun starts when the 13600K has a power draw of 255W, which is where the 3900XT sits. The 7600X has a 183W power draw. Not to mention during gaming.
I'm trying to like it for some reason, but I simply can't; whether it's the 13600K or the 13900K doesn't matter.


----------



## W1zzard (Oct 21, 2022)

Pumper said:


> Something is not right with the Cinebench results. All other reviews show 38k~40k. Wonder how affected the other tests are by whatever is causing it in your setup.


I'm researching this .. something strange is going on .. going away for the weekend in an hour though, more testing on monday


----------



## Arco (Oct 21, 2022)

Honestly, with those temps, I'm waiting till someone cooks on one of these. 115C! 300W! AMD and Intel are pretty close this gen. Maybe Intel keeps gaming, but I think Zen 4 X3D will retake the top.


----------



## BiggieShady (Oct 21, 2022)

RDR2, Watch Dogs 2, Elden Ring and Forza can't run at 144 Hz on any CPU... is this "the choking"?


----------



## fevgatos (Oct 21, 2022)

ratirt said:


> Well I guess those who so claimed how great this new Intel is can bite their tongues now. Some improvement over 12900k but definitely not claimed 40%(or whatever it was) though and yet the power consumption.... holy crap. I guess the main improvement was the frequency which means power through the roof. Just to be slightly ahead of competition. Sorry Intel but the power consumption is way too high either in application or gaming is really not good.
> I literally just glanced at the charts and it would seem the 13900k is just 12900k on steroids with power thus the results. As I see it though? No improvement whatsoever, higher power consumption for higher gains. Hoped for better results to be honest.
> 
> Just noticed.
> ...


Maybe you should check reviews from other sites. The 13900k according to every review except this one is more than 40% faster than the 12900k with both at similar wattage (240 vs 253).


----------



## ratirt (Oct 21, 2022)

fevgatos said:


> Maybe you should check reviews from other sites. The 13900k according to every review except this one is more than 40% faster than the 12900k with both at similar wattage (240 vs 253).


I did. The power figures are from Guru3D, for instance. It is not impressive, and you have been telling people differently, all but assuring them they'd be mind-blown after seeing the new Intel. To be fair, I am mind-blown, and I'm sure others are as well, but not in the way you think. I'm sorry, but it's not 40% faster. Not in gaming. Maybe there are instances (applications) where it may reach 40%, but in general it is not 40% faster, and I think that is pretty clear.


----------



## R0H1T (Oct 21, 2022)

Tbf it can be tuned to be a fair bit more efficient, but so can the 12900K & 7950X ~ *at stock it's definitely not 40% more efficient*


----------



## fevgatos (Oct 21, 2022)

ratirt said:


> I did. The power stuff is from GURU3D for instance. It is not impressive and you have been telling people different. Assuring almost to be mind blown after they've seen what the new Intel. To be fair, I am mind blown and I'm sure others as well but not in the way you think. I'm sorry but it's not 40% faster. Not in gaming. Maybe there are instances (applications) that it may reach 40% but in general it is not 40% faster and I think that is pretty clear.


Guru3D has the 13900K scoring 38k+ at 253W in CBR23. That's more than 40% faster at similar wattage. So wtf are you talking about?

Who said it's going to be 40% faster in gaming? Lol, ofc it won't be, how did you ever expect that to be possible?


----------



## ratirt (Oct 21, 2022)

fevgatos said:


> Guru3d has the 13900k scoring 38+k at 253w in cbr23. Thats more than 40% faster at similar wattage. So wtf are you talking about?
> 
> Who said its going to be 40% faster in gaming? Lol, ofc it wont be, how did you ever ecpect that to be possible.


Oh, so in Cinebench it is 40% faster. That is just one score from one benchmark. Then you have other apps where it is not 40% faster, and gaming, same thing; thus you can't say "it is 40% faster", since that is misleading. In general it is not 40% faster, and that was my point.


----------



## fevgatos (Oct 21, 2022)

ratirt said:


> Oh so in Cinebench score it is 40% faster. That is just one score for one benchmark. Then you have other apps and it is not 40% faster and gaming same thing thus you can't say it is 40% faster since it is misleading. In general it is not 40% faster and that was my point.


It's not just Cinebench; in every multithreaded workload the difference is 40% or more.

So are you saying the 7950X is 5% faster than the 12900K?


----------



## AusWolf (Oct 21, 2022)

ratirt said:


> Oh so in Cinebench score it is 40% faster. That is just one score for one benchmark. Then you have other apps and it is not 40% faster and gaming same thing thus you can't say it is 40% faster since it is misleading. In general it is not 40% faster and that was my point.





fevgatos said:


> It's not just cinebench, every workload that is multithreaded the difference is 40% or more
> 
> So are you saying the 7950x is 5% faster than the 12900k?


Guys, you are comparing ridiculously fast CPUs at ridiculously high power and temperature levels. May I suggest that maybe... it doesn't matter? No sane person should push a CPU beyond 200 W for gaming.


----------



## ratirt (Oct 21, 2022)

fevgatos said:


> It's not just cinebench, every workload that is multithreaded the difference is 40% or more
> 
> So are you saying the 7950x is 5% faster than the 12900k?


Dude, I'm not saying anything about AMD products. Wrong thread, though. I was pointing something out about your comments on my post.
I honestly don't care about your glaring problems with Intel's praise, and my comments are not for anyone's amusement.



AusWolf said:


> Guys, you are comparing ridiculously fast CPUs at ridiculously high power and temperature levels. May I suggest that maybe... it doesn't matter? No sane person should push a CPU beyond 200+ W for gaming.


I'm not comparing anything. I just expressed something about a product that just came out.


----------



## Chrispy_ (Oct 21, 2022)

Ugh, I was hoping for IPC improvements or process node improvements, but what we have is just an overgrown Alder Lake with yet more power consumption to push the clocks higher.

I could have given Raptor Lake some slack if it resulted in lower-tier models like an i5-13500 with 6P+8E at reasonable power consumption, but no; the 13600K will be the smallest, cheapest Raptor Lake, and that is already high-end enough that only the top 10% of buyers will spend that much. For the remaining 90% of the market, we just get rebranded Alder Lake models from 2021.

Given the global economic recession, cost of living increases, and exponentially rising energy costs, I feel like AMD may still have the upper hand here with far lower power consumption. When A620 boards or budget B650(non-E) boards are available, having a fast modern CPU that doesn't require replacing your PSU and cooling to use are both going to be points in AMD's favour.


----------



## Dirt Chip (Oct 21, 2022)

Chrispy_ said:


> Ugh, I was hoping for IPC improvements or process node improvements but what we have is just an overgrown Alder Lake with yet more power consumption to get the overclock higher.
> 
> I could have given Raptor lake some slack if it resulted in lower-tier models like an i5-13500 that had 6P+8E at reasonable power consuption, but no; The 13600K will be the smallest, cheapest Raptor Lake and that is already high-end enough that only the top 10% of buyers will spend that much. For the remaining 90% of the market, we just get rebranded Alder Lake models from 2021.
> 
> Given the global economic recession, cost of living increases, and exponentially rising energy costs, I feel like AMD may still have the upper hand here with far lower power consumption. When A620 boards or budget B650(non-E) boards are available, having a fast modern CPU that doesn't require replacing your PSU and cooling to use are both going to be points in AMD's favour.


From my crystal ball: 65W 13300F/13100F on DDR4 + H610 will be the ultimate budget options, with better gaming performance all around. AM5 on DDR5 just can't cut it unless prices are massively cut.


----------



## Shatun_Bear (Oct 21, 2022)

Steve from HardwareUnboxed: "The power consumption of the 13900K is downright hideous..."


----------



## AusWolf (Oct 21, 2022)

Chrispy_ said:


> Ugh, I was hoping for IPC improvements or process node improvements but what we have is just an overgrown Alder Lake with yet more power consumption to get the overclock higher.
> 
> I could have given Raptor lake some slack if it resulted in lower-tier models like an i5-13500 that had 6P+8E at reasonable power consuption, but no; The 13600K will be the smallest, cheapest Raptor Lake and that is already high-end enough that only the top 10% of buyers will spend that much. For the remaining 90% of the market, we just get rebranded Alder Lake models from 2021.
> 
> Given the global economic recession, cost of living increases, and exponentially rising energy costs, I feel like AMD may still have the upper hand here with far lower power consumption. When A620 boards or budget B650(non-E) boards are available, having a fast modern CPU that doesn't require replacing your PSU and cooling to use are both going to be points in AMD's favour.


The economy is slowly but surely going down the toilet, energy prices are soaring all around the world, and here we have Intel, Nvidia and AMD brute forcing their ways into the highest of the high tiers of computer hardware with never-before-seen power consumption and heat, and no innovation on the IPC and efficiency fronts other than what a node shrink naturally brings. What is going on?


----------



## ratirt (Oct 21, 2022)

Shatun_Bear said:


> Steve from HardwareUnboxed: "The power consumption of the 13900K is downright hideous..."
> 
> View attachment 266504


Yeah. I just finished watching what HWUB's Steve had to say. It was not pretty. Not to mention the constant thermal throttling on the 13900K and the ridiculous power consumption in most scenarios.


----------



## AusWolf (Oct 21, 2022)

ratirt said:


> Yeah. I just finished watching what HWUB Steve had to say. It was not pretty. Not to mention the constant thermal throttling for the 13900K and the ridiculous power consumption in most scenarios there are.


I just finished the 13700K video which paints a similar picture, unfortunately. And we (or at least some of us) thought Zen 4 was bad...


----------



## robal (Oct 21, 2022)

Many thanks for an excellent review @W1zzard
Call me crazy, but the term *'frames per watt'* rubs me the wrong way...
You've probably used that phrase to say 'frames per second per watt' in a shorter way, but it still feels wrong.

If anyone is not asleep yet and wants detail:
"frames per watt" is wrong because a 'frame' is an 'amount of work done' while a 'watt' is 'power' (energy per unit of time).
A correct measure of efficiency would be either:
"amount of work done per amount of energy",  so:  *"frame per joule" *or* "frames per kWh" (to give a familiar unit of energy)*
or
"rate of work being done per amount of power", so: *"FPS per watt" *
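Putting numbers on it (all measurements invented for illustration): the two correct forms turn out numerically identical, since (F/T)/P = F/(P·T), which is a decent argument for just writing "FPS per watt" in the charts:

```python
# Making the units concrete: efficiency as work per energy.
# Assumed measurements: a run renders N frames over T seconds at P watts.

frames  = 14400     # frames rendered in the run (assumed)
seconds = 120.0     # run duration
watts   = 150.0     # average package power during the run

fps    = frames / seconds            # rate of work: 120 FPS
joules = watts * seconds             # energy consumed: 18,000 J

fps_per_watt     = fps / watts       # "rate of work per power"
frames_per_joule = frames / joules   # "work per energy"

# The two are numerically identical: (F/T)/P == F/(P*T)
print(fps_per_watt, frames_per_joule)   # both 0.8
```

So "FPS per watt" and "frames per joule" are the same number with different (but both dimensionally sound) readings; only "frames per watt" mixes work with power.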


----------



## alganonim (Oct 21, 2022)

Hello, could anyone point me to some 13600K tests with a Z690 DDR4 motherboard?


----------



## N3M3515 (Oct 21, 2022)

Dirt Chip said:


> But you do see the cost of the platform.
> If both are the same and one is cheaper, by some hundreds of $ if you go ddr4 and z690, than the choice is more easy.


What you save today by not buying a motherboard (if you're coming from Alder Lake), you save again later when Zen 5 Ryzen comes. So either way you get to more or less the same result.


----------



## Gameslove (Oct 21, 2022)

Big, big disappointment: a lot of power consumption and very high temperatures.

Not even full gaming parity vs the Ryzen 7 5800X3D (according to all reviews).

Overall, the Ryzen 7 5800X3D stays in 1st place for efficiency / gaming.


----------



## ShiningSapphire (Oct 21, 2022)

Solid State Brain said:


> Other reviewers are seeing higher results in CB23 with power limits removed.



Something must be very wrong with their testing methodology; it was the same with the 4090 tests - much lower results compared to other reviewers because they were testing on a 5800X, which was an obvious bottleneck. The worst of this is probably the temperature tests: fanboys everywhere are screaming about 117 degrees, but they don't see, or don't want to see, that the measurements were taken on a tiny Noctua air cooler. LOL
By the way, RPL has great potential for undervolting.


----------



## R0H1T (Oct 21, 2022)

The U14S isn't tiny; in fact the test shows that you will need to invest a lot more in cooling the 13xxx chips if you want them to stay reasonably cool under extreme workloads. More than with AMD, I'd say, so there's the extra investment needed for that.


----------



## fevgatos (Oct 21, 2022)

ratirt said:


> Dude. I'm not saying anything about AMD products. Wrong thread though. I was point something out about your comments to my post.
> I honestly don't care about your glaring problems with Intel's praise and my comments are not for anyone's amusement.
> 
> 
> Im not comparing anything. I just expressed something about a product that just came out.


I'm just asking: in your opinion, how much faster is the 7950X than the 12900K? Gimme a straight answer if possible.



R0H1T said:


> The U14s isn't tiny, in fact the test shows that you will need to invest a lot more in cooling the 13xxx chips if you want them to stay reasonably cool even with extreme workloads. More than AMD I'd say, so there's the extra investment needed for that.


Well, I have a U12A, pretty similar to the U14S; my 12900K is sitting at 78C in CBR23 at stock... This site has it at 95 or something.


----------



## R0H1T (Oct 21, 2022)

You're not running it on manufacturer (board) defaults though are you?


----------



## HTC (Oct 21, 2022)

ShiningSapphire said:


> The worst of this is probably the temperature tests, fanboys everywhere are screaming about 117 degrees but they don't see or don't want to see that *the measurements were taken on a tiny Noctuy air cooler*.





R0H1T said:


> The U14s isn't tiny



It's actually a big cooler, though there are bigger air coolers.

@W1zzard 

Could you perhaps make a thermal throttling test for this CPU like you did with the 7950X?


----------



## ShiningSapphire (Oct 21, 2022)

R0H1T said:


> The U14s isn't tiny, in fact the test shows that you will need to invest a lot more in cooling the 13xxx chips if you want them to stay reasonably cool even with extreme workloads. More than AMD I'd say, so there's the extra investment needed for that.


There is nothing new about the fact that such cooling is not enough for such a powerful CPU. Whoever wants to buy this will of course go for a 360/420 mm AIO, so what's the point of this testing procedure? To show that the processor will boil under poor cooling? This is obvious.


----------



## R0H1T (Oct 21, 2022)

ShiningSapphire said:


> *Who want to buy this will of course go for a 360/420mm AiO *so what's the point of this testing procedure, to show that the processor will boil under poor cooling? *This is obvious.*


You don't speak for the thousands of others who will buy these chips; in fact many on this forum prefer air cooling. *Besides, not every case will have space for 360/420 mm cooling.* Oh wait, is Intel gonna subsidize that as well through their *contra revenues* "marketing development funds"?


----------



## HTC (Oct 21, 2022)

ShiningSapphire said:


> There is nothing new about the fact that such cooling is not enough for such a powerful cpu.



Are you sure about that? A LOT of people were claiming a "heavy-duty cooler" was required for the 7950X (its most direct competitor), but @W1zzard proved that it could run just fine even with a Wraith cooler with the fan @ 20%, though @ the cost of A LOT of performance.

Would the 13900K behave the same way? I don't know.


----------



## Chrispy_ (Oct 21, 2022)

Dirt Chip said:


> From my cristal ball: 65w 13300f\13100f on ddr4+h610 will be the ultimate budget options with better gaming preformance all around. AM5 on ddr5 just can't cut it unless massively priceed cut.


AM5 is probably a little premature as a DDR5-only platform. The 13100F on an mATX H610 board will almost certainly be great bang for the buck at the very lowest end.

I think the new 13th gen label slapped on the rebranded 12400F isn't going to change anything, it'll still compete with AM4, and a cheap B550 board and DDR4-3600 kit pairs exceptionally well with a 5700X. There are no winners or losers in that fight, both platforms are a dead end, both are good performance/$ and both are competent, all-round solutions that will likely outsell higher-tier products from either brand by one or more orders of magnitude.


----------



## fevgatos (Oct 21, 2022)

R0H1T said:


> You're not running it on manufacturer (board) defaults though are you?


Yes I am


----------



## Chrispy_ (Oct 21, 2022)

HTC said:


> Are you sure about that? A LOT of people were claiming a "heavy-duty cooler" was required for a 7950X (it's most direct competitor) but @W1zzard proved that it could be run just fine, even with a Wraith cooler using fan @ 20%, though @ the cost of A LOT of performance.
> 
> Would the 13900K behave the same way? I don't know.


No need to wonder. HUB did the testing and it's ugly for Raptor lake.




Yes, the 7950X also starts to suffer right down at 65W, but the difference at 105W is insane, with it outperforming the 13900K by over 50%.

Edit:
These results have since been retracted by HUB, though IMO they are still valid in a narrow sense, if misleading. The terrible showing above is a direct result of Intel's own XTU software, which is not something you'd get if you power-limited Raptor Lake in the BIOS. Blame Intel for shit software, because that's the true source of this erroneous data.


----------



## HTC (Oct 21, 2022)

Chrispy_ said:


> No need to wonder. HUB did the testing and it's ugly for Raptor Lake at 65W:
> 
> View attachment 266536



I meant using the same kind of approach as in that 7950X thermal throttling test: not by reducing the power, but rather by reducing the fan speed to simulate different (worse) coolers, in order to test how the CPU would fare with them as well as how much performance it would lose with the "worse coolers".


----------



## Solid State Brain (Oct 21, 2022)

Chrispy_ said:


> No need to wonder. HUB did the testing and it's ugly for Raptor lake.
> 
> View attachment 266536
> 
> Yes, the 7950X also starts to suffer right down at 65W, but the difference at 105W is insane, with it outperforming the 13900K by over 50%.



The HardwareUnboxed results have been retracted:


__ https://twitter.com/i/web/status/1583257536202842114


----------



## Dirt Chip (Oct 21, 2022)

N3M3515 said:


> What you save today by not buying motherboard if you come from alder then that same you save when ryzen 5 comes. So either way gets to the same result more less.


You mean use the same 6xx mobo with Zen 5?
If you are on a budget, it would be a very poor choice, budget-wise, to change CPU every year or two.
Also, most people don't change CPU often, and some of those who do tend to upgrade the mobo as well as the CPU to get all the new and fresh tech.
The whole "dead platform" thing seems irrelevant to me (I upgrade every 10 years or so on average), but I can see why on forums like this it's quite prevalent (although I don't think it's the common state of mind).


----------



## Chrispy_ (Oct 21, 2022)

Solid State Brain said:


> The HardwareUnboxed results have been retracted:


Oh, good spot.
I guess we'll have to wait for W1zzard to have time for a power-scaling test of his own.


----------



## R0H1T (Oct 21, 2022)

Dirt Chip said:


> The whole "dead platform" seems irrelevant to me (I upgrade every 10 years or so on average), but I can see why on forums like this is quite prevalence (although I don't think it's the common state of mind).


It is important in more ways than one: chances are, if your mobo dies for some reason, you'll be able to buy a cheap second-hand or even brand new X570 board 2-3 years down the line. Good luck finding a Z97 board (used or new) at reasonable prices, though. And with some chipset limitations you could upgrade from Zen all the way to Zen 3; the last time Intel allowed something like that, YouTube was barely a thing. Intel's artificial limitations wrt sockets are well documented, & there's really no excuse for changing them 5 times on the same uarch with Skylake!

This is even more relevant now because PCIe 5.0 will easily last you a decade or more; we have no dGPUs which can make use of it & barely any SSDs in the consumer space which can properly use it. This wasn't the case a decade back, because while the progress from PCIe 2.0 -> 3.0 -> 4.0 was painfully slow, *the jump from 4.0 -> 5.0 -> 6.0 will take less than half the time*. So in essence *these boards will last you a long while*!


----------



## Chrispy_ (Oct 21, 2022)

R0H1T said:


> It is important in more ways than one, chances are if your mobo dies for some reason you'll be able to buy a cheap second hand or even brand new x570 board 2-3 years down the line. Good luck finding a z97 board(used or new) though at reasonable prices. If you want to upgrade you could go from zen to zen3 as well with the chipset limitations, the last time Intel allowed this was when Youtube was barely a thing. Intel's artificial limitations wrt sockets are well documented & really there's no excuse for changing them 5 times on the same uarch, with Skylake!
> 
> This is even more relevant now because PCIe 5.0 will easily last you a decade or more, we have no dGPU's which can make use of it & barely any SSD's in the consumer space which can properly use it. This wasn't the case a decade back, because while the progress from PCIe 2.0-> 3.0-> 4.0 was painfully slow *the jump from 4.0->5.0-> 6.0 will take less than half the time*. So in essence *these boards will last you for a long while*!


Case in point: I sold an unopened X99 board a couple of months back for twice what I originally bought it for, because S2011-3 CPU users are stiffed if their board dies. Nothing else is compatible.

I was on the other end of the same situation when putting together a build to gift someone, intending to repurpose an i9-10900 I had. Could I find a decent S1200 board new for any reasonable price? No. I ended up buying a used Asus Prime Z490 board for about what they sold for brand new three years ago: dusty, unknown history, no accessories other than the IO shield, and absolutely no warranty.

The previous machine I built at home, as part of my regular parts cleanouts of abandoned crap that ends up here, used a Ryzen 5 1600 and a B450 board I had lying around; but I can still buy cheap mATX B550 and A520 boards for next to nothing, brand new, with a warranty, that will work just fine with a CPU that's _two years older_ than the Comet Lake i9 I struggled to find a board for last week.


----------



## R0H1T (Oct 21, 2022)

Yes, this planned obsolescence should not be condoned by anyone; it's really only *great for Intel & their board partners* & horrible for the end user! AMD has done some bad things, especially wrt pricing their 5xxx and 7xxx chips, but socket longevity is not one of them; it's at least a +10 *IMO* over any feature Intel can show in good light with their 13xxx launch.


----------



## AnotherReader (Oct 21, 2022)

Dirt Chip said:


> You mean use the same 6xx mobo with zen5?
> If you are on a budget it will be very poor choice, budget wise, to change CPU every year or two.
> Also, most people dont change cpu often, and some of one who do tend to upgrade the mobo as well as cpu to get all the new and fresh tech.
> The whole "dead platform" seems irrelevant to me (I upgrade every 10 years or so on average), but I can see why on forums like this is quite prevalence (although I don't think it's the common state of mind).


I've done that, and it's a valid strategy. Don't forget that you can sell your previous CPU. I bought a 1700X and X370 Taichi in 2017 only a couple of months after release. I swapped the CPU for a 3600X that ended up costing me only $75 and came with a free game as well. Now I have a 5700X and my 3600X should be in someone else's hands soon. The total outlay is slightly more than buying an 1800X in 2017, but a 5700X is much faster. Alternatively, you can upgrade at the end and you only spend on the CPU rather than CPU and motherboard.


----------



## Why_Me (Oct 21, 2022)




----------



## QUANTUMPHYSICS (Oct 22, 2022)

My flight simulator is running on an older Core i7 Extreme, and I'm ready to upgrade everything: 13900K, EVGA Classified motherboard, EVGA PSU, EVGA CLX cooler and DDR5.

I will definitely get the 13900k, but I wish I could just wait for the 15900K.


----------



## InVasMani (Oct 22, 2022)

QUANTUMPHYSICS said:


> My flight simulator is using an older Core i7 Extreme and I'm ready to upgrade everything:  13900k,  EVGA classified motherboard, EVGA PSU,  EVGA CLX cooler and DDR5.
> 
> I will definitely get the 13900k, but I wish I could just wait for the 15900K.



Be sure to check EVGA's website: if it's the Z690 Classified, they have (or did, like a week ago) a bundle for it at $299 with a Z20 keyboard and a mouse, I believe. The price alone is pretty good, but the other inclusions make it an even better deal. NM, it seems that ship has sailed, but for $499 there is a bundle on the Kingpin. Meanwhile, the Kingpin on EVGA's website without the bundle (not including the mouse and keyboard) is only $300 more. I guess Jensen was right: the more you buy, the more you save!


----------



## Why_Me (Oct 22, 2022)

QUANTUMPHYSICS said:


> My flight simulator is using an older Core i7 Extreme and I'm ready to upgrade everything:  13900k,  EVGA classified motherboard, EVGA PSU,  EVGA CLX cooler and DDR5.
> 
> I will definitely get the 13900k, but I wish I could just wait for the 15900K.


I'd consider the i9 13900 / 13900F due for release this January.


----------



## Mussels (Oct 22, 2022)

Solid State Brain said:


> @ModEl4
> A few possible reasons:
> 
> Not all CPUs are equal and some may even run at significantly higher or lower voltages than the average.
> ...


This definitely makes sense, as MSI have been caught cheating at this sort of thing several times now - hell, HWiNFO had a metric added just to catch them out for dishonest power reporting.

They may blame Intel XTU, but if that's the case it's even worse, as any reviews using that software might also be reporting dishonest values.



W1zzard said:


> I'm on 8700K with 3080 and haven't noticed any "choking"


From what I've seen, very few games max out the CPU cores, but I see a lot of reports from players on Facebook about 100% usage on their 4-core/8-thread CPUs with a few modern FPS titles (Call of Duty/Battlefield).

I don't play those, so I haven't experienced it firsthand.



W1zzard said:


> I'm researching this .. something strange is going on .. going away for the weekend in an hour though, more testing on monday


Might it be related to the power values being limited/not limited as expected?

We've had dishonest motherboards in the past, and now evidence that Intel's XTU software isn't working as expected.


----------



## Readlight (Oct 22, 2022)

Can it warm up a bedroom?


----------



## Solid State Brain (Oct 22, 2022)

Mussels said:


> This definitely makes sense as MSI have been caught cheating this sort of thing several times now - Hell, HWinfo had a metric added just to catch them out for dishonest power reporting
> 
> They may blame intel XTU, but if that's the case it's even worse as any reviews using that software might also be reporting dishonest values



I don't know if that's a case of the motherboard cheating with power reporting. To me, HardwareUnboxed's result seemed consistent with the CPU running in fixed voltage mode, called "Override mode" on MSI motherboards. When that happens, CPU voltage only decreases due to the VRM impedance (i.e. vdroop), and not also with frequency as in the default voltage mode ("Adaptive Voltage").

As a result, performance under power-limited scenarios will be significantly lower that expected, since the CPU will use higher voltages and consume more power at lower frequencies than it normally would, and performance will decrease more or less linearly with the power limit all the way down to low levels, which is what looked like from HardwareUnboxed's power scaling graph.

I'm not sure if Intel XTU can do this on its own; I haven't used it much but the power limits seemed to work correctly when I did. On the other hand, if for a reason or another the reviewer set a fixed CPU voltage in BIOS, that could explain the results.
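To illustrate the fixed-voltage effect with a toy model (every constant below is made up for illustration, and both function names are hypothetical; this is not anything from XTU or a BIOS): with dynamic power roughly proportional to C·V²·f, a fixed voltage makes the achievable frequency scale linearly with the power limit, while an adaptive V(f) curve loses less frequency per watt removed.

```python
# Toy model of CPU performance vs. power limit under two voltage modes.
# All constants are illustrative guesses, not measurements from any CPU/board.

def freq_at_power_fixed(p_limit, v_fixed=1.3, c=27.0):
    """Fixed ("Override") voltage: P ~ C * V^2 * f, so f = P / (C * V^2).
    Frequency (and thus performance) scales linearly with the power limit."""
    return p_limit / (c * v_fixed ** 2)

def freq_at_power_adaptive(p_limit, v0=0.8, k=0.09, c=27.0, f_max=6.0):
    """Adaptive voltage: V(f) = v0 + k*f, so solve C * V(f)^2 * f = P for f.
    Lower frequencies run at lower voltage, so less frequency is lost per
    watt removed. Solved by bisection on [0, f_max]."""
    lo, hi = 0.0, f_max
    for _ in range(60):
        f = (lo + hi) / 2
        if c * (v0 + k * f) ** 2 * f < p_limit:
            lo = f
        else:
            hi = f
    return (lo + hi) / 2

for watts in (250, 150, 90):
    print(watts, "W:",
          round(freq_at_power_fixed(watts), 2), "GHz fixed vs",
          round(freq_at_power_adaptive(watts), 2), "GHz adaptive")
```

At the full limit both modes land near the same frequency, but as the limit drops the fixed-voltage curve falls off linearly, which matches the roughly linear power-scaling graph described above.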


----------



## AusWolf (Oct 22, 2022)

Solid State Brain said:


> I don't know if that's a case of the motherboard cheating with power reporting. To me, HardwareUnboxed's result seemed consistent with the CPU running in fixed voltage mode, called "Override mode" on MSI motherboards. When that happens, CPU voltage only decreases due to the VRM impedance (i.e. vdroop), and not also with frequency as in the default voltage mode ("Adaptive Voltage").
> 
> As a result, performance under power-limited scenarios will be significantly lower than expected, since the CPU will use higher voltages and consume more power at lower frequencies than it normally would, and performance will decrease more or less linearly with the power limit all the way down to low levels, which is what it looked like from HardwareUnboxed's power scaling graph.
> 
> I'm not sure if Intel XTU can do this on its own; I haven't used it much but the power limits seemed to work correctly when I did. On the other hand, if for a reason or another the reviewer set a fixed CPU voltage in BIOS, that could explain the results.


I'm always puzzled by MSI's weird BIOS settings. I remember building a PC for someone with an MSI board where I could choose a "cooler setup". The options were basic air, tower and AIO, I think. I had no clue what it was about until I read the manual and saw that it's a simple power limit toggle. I don't know why MSI has to do this and why they can't keep things simple and easy to understand.


----------



## Solid State Brain (Oct 22, 2022)

@AusWolf
The "Cooler Setup" dialog on recent MSI motherboards shows up during the first boot or after the CMOS is cleared. It is just for selecting fixed presets for PL1, PL2 and IccMax (CPU current limit) in accordance with your cooler performance, which you can otherwise manually configure in Advanced CPU options.

The idea is interesting, but not very well executed. The lowest preset on my motherboard-CPU combination already used limits above Intel's recommendations/specifications (PL1=PL2=241W when it should have been PL1=125W, PL2=190W); the other presets seemed to use even higher limits, so they are almost useless and possibly even damaging in the long term (since they use a very high current limit of 512A).

I agree that MSI often uses strange, confusing wording. The AC/DC Loadline settings, for example, are under "CPU Lite Load".
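For reference, here's roughly how those PL1/PL2 limits interact (a simplified sketch of Intel's turbo budget: the 125 W / 190 W figures are the recommended limits mentioned above, while the 56 s Tau and the moving-average model are my assumptions, simplified from the real firmware algorithm):

```python
# Simplified sketch of Intel's PL1/PL2/Tau behaviour: bursts up to PL2 are
# allowed while a moving average of package power stays below PL1; after
# that, sustained power is clamped to PL1. Tau sets the averaging window.
# Values are illustrative; real firmware uses a more involved algorithm.

PL1, PL2, TAU = 125.0, 190.0, 56.0  # watts, watts, seconds (Tau assumed)

def allowed_power(avg_power):
    # Burst headroom up to PL2 only while the average is still under PL1.
    return PL2 if avg_power < PL1 else PL1

def simulate(load_watts, seconds, dt=1.0):
    avg, trace = 0.0, []
    alpha = dt / TAU  # exponentially weighted moving average step
    for _ in range(int(seconds / dt)):
        p = min(load_watts, allowed_power(avg))
        avg += alpha * (p - avg)
        trace.append(p)
    return trace

trace = simulate(load_watts=250, seconds=120)
# Early samples run at PL2; once the average reaches PL1 the CPU settles at PL1.
```

Setting PL1=PL2 (as the lowest preset above does) removes the burst phase entirely: the CPU just runs at one flat limit the whole time.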


----------



## AusWolf (Oct 22, 2022)

Solid State Brain said:


> @AusWolf
> The "Cooler Setup" dialog on recent MSI motherboards shows up during the first boot or after the CMOS is cleared. It is just for selecting fixed presets for PL1, PL2 and IccMax (CPU current limit) in accordance with your cooler performance, which you can otherwise manually configure in Advanced CPU options.
> 
> The idea is interesting, but not very well executed. The lowest preset on my motherboard-CPU combination already used limits above Intel's recommendations/specifications (PL1=PL2=241W when it should have been PL1=125W, PL2=190W); the other presets seemed to use even higher limits, so they are almost useless and possibly even damaging in the long term (since they use a very high current limit of 512A).
> ...


Someone correct me if I'm wrong, but I have the feeling that they name their BIOS settings with stupid and/or non-IT-oriented people in mind. I mean, even if you have no clue what PL1 and PL2 mean, you can still select what kind of cooler you have (which I still think is stupid, because no two AIOs, and no two tower coolers, are the same).

Edit: Imo, instead of stupid naming, they could use a "simple mode" / "advanced mode" toggle like Asus does.


----------



## Dirt Chip (Oct 22, 2022)

R0H1T said:


> It is important in more ways than one, chances are if your mobo dies for some reason you'll be able to buy a cheap second hand or even brand new x570 board 2-3 years down the line. Good luck finding a z97 board(used or new) though at reasonable prices. If you want to upgrade you could go from zen to zen3 as well with the chipset limitations, the last time Intel allowed this was when Youtube was barely a thing. Intel's artificial limitations wrt sockets are well documented & really there's no excuse for changing them 5 times on the same uarch, with Skylake!
> 
> This is even more relevant now because PCIe 5.0 will easily last you a decade or more, we have no dGPU's which can make use of it & barely any SSD's in the consumer space which can properly use it. This wasn't the case a decade back, because while the progress from PCIe 2.0-> 3.0-> 4.0 was painfully slow *the jump from 4.0->5.0-> 6.0 will take less than half the time*. So in essence *these boards will last you for a long while*!


I don't see how Z97 availability (launched in 2014, 8 years ago) compares to X570 availability, a platform that's only 3 years old (launched in 2019).
If anything, Intel's strategy of changing the socket every two generations makes for a larger second-hand market, because motherboards get replaced more often.

You are right, and I very much agree about Intel's one-sided, intensely profit-oriented way of terminating socket compatibility. But on its own, imo, that is not a reason not to choose them, unless you know you will need to upgrade the CPU every 2 years without changing the motherboard and would be left out of any new features that show up.

Also, I advise against counting too much on the ability to drop a new CPU into a 2-4 year old motherboard. AMD is not your friend and can, in a split second, drop support for some weird reason. You cannot know whether the CPU available in 2-4 years' time will be any good for you (thermally, financially, performance-wise, new features that you need, etc.). So my advice: choose the platform that suits you now, not the one you hope will suit you 2-4 years from now.

Other than that, the ability to keep using the same motherboard is a wonderful thing for the consumer, and I hope Intel will follow (yeah, right...). It is also important from an environmental point of view to lower e-waste, even by a small amount (since you still buy a new CPU).


----------



## AusWolf (Oct 22, 2022)

Dirt Chip said:


> I don't see how Z97 availability (launched in 2014, 8 years ago) compares to X570 availability, a platform that's only 3 years old (launched in 2019).
> If anything, Intel's strategy of changing the socket every two generations makes for a larger second-hand market, because motherboards get replaced more often.
> 
> You are right, and I very much agree about Intel's one-sided, intensely profit-oriented way of terminating socket compatibility. But on its own, imo, that is not a reason not to choose them, unless you know you will need to upgrade the CPU every 2 years without changing the motherboard and would be left out of any new features that show up.
> ...


I agree. Future platform compatibility is overrated, imo. Even if you buy the newest AMD platform, by the time you really need an upgrade, the next one will be out - if not socket, then chipset compatibility-wise.


----------



## R0H1T (Oct 22, 2022)

Dirt Chip said:


> I don't see how Z97 availability (launched in 2014, 8 years ago) compares to X570 availability, a platform that's only 3 years old (launched in 2019).


Well, that was just an example of what (extended) socket compatibility does, & that I had a Z97 prior to the X570; you can also see other examples in this thread after that. FYI you could put any chip from Zen 1 to Zen 3 in any mobo between x3xx & x5xx (with BIOS updates in some cases) & Intel simply doesn't allow that! With such *compatibility you also have a more robust second-hand market with much more reasonable prices overall* for such components. Again, this shouldn't be that hard to understand; try getting a z170 board, for instance, & check what it costs now.

And try getting a similar


AusWolf said:


> Future platform compatibility is overrated, imo.


That may have been true 10 years back, but it's virtually the opposite today! What will you really need a new chipset for 3 years down the line? PCIe 6.0?


----------



## AusWolf (Oct 22, 2022)

R0H1T said:


> That may have been true 10 years back, but it's virtually the opposite today! What will you really need a new chipset for 3 years down the line? PCIe 6.0?


Exactly my point. And what do you need a new CPU for in your existing mobo? A 4% IPC uplift?


----------



## Dirt Chip (Oct 22, 2022)

AnotherReader said:


> I've done that, and it's a valid strategy. Don't forget that you can sell your previous CPU. I bought a 1700X and X370 Taichi in 2017 only a couple of months after release. I swapped the CPU for a 3600X that ended up costing me only $75 and came with a free game as well. Now I have a 5700X and my 3600X should be in someone else's hands soon. The total outlay is slightly more than buying an 1800X in 2017, but a 5700X is much faster. Alternatively, you can upgrade at the end and you only spend on the CPU rather than CPU and motherboard.


Your strategy is very much valid, but not special to AMD in any way.
You can also sell any Intel mobo and CPU; nothing new here. Again, if you are on a budget, almost any change will be at a loss financially. Intel's second-hand market is booming just as much as AMD's, I guess.


----------



## fevgatos (Oct 22, 2022)

Solid State Brain said:


> I don't know if that's a case of the motherboard cheating with power reporting. To me, HardwareUnboxed's result seemed consistent with the CPU running in fixed voltage mode, called "Override mode" on MSI motherboards. When that happens, CPU voltage only decreases due to the VRM impedance (i.e. vdroop), and not also with frequency as in the default voltage mode ("Adaptive Voltage").
> 
> As a result, performance under power-limited scenarios will be significantly lower than expected, since the CPU will use higher voltages and consume more power at lower frequencies than it normally would, and performance will decrease more or less linearly with the power limit all the way down to low levels, which is what it looked like from HardwareUnboxed's power scaling graph.
> 
> I'm not sure if Intel XTU can do this on its own; I haven't used it much but the power limits seemed to work correctly when I did. On the other hand, if for a reason or another the reviewer set a fixed CPU voltage in BIOS, that could explain the results.


For the last two Intel releases we've had the same "tactics" from many of the big reviewers: post fake numbers with the Intel CPUs drawing 999 watts and some absurd fake efficiency graphs, and retract them later. In the meanwhile, Billy the AMD fanboy keeps advertising those original fake graphs around the web, and then claims that Intel is paying reviewers on top of everything else. Welcome to the internet in 2022; facts just don't matter.


----------



## R0H1T (Oct 22, 2022)

Dirt Chip said:


> Intel's second-hand market is booming just as much as AMD's, I guess.


Booming for whom, though? With AM4 you had 2^4 combinations of boards/chips; a typical Intel release guarantees at most 2^2, so you're 4x as likely to get a compatible AM4 board/Zen chip as with Intel. I'm counting Zen+ as well. You also didn't answer why they had to change socket compatibility so many times on a single uarch?


----------



## Dirt Chip (Oct 22, 2022)

R0H1T said:


> Well that was just an example of what (extended) socket compatibility does & that I had Z97 prior to the x570, you can also see other examples in this thread after that. FYI you could put any chip between zen1-zen3 with any mobo between x3xx & x5xx (with BIOS updates in some cases) & Intel simply doesn't allow that! With such *compatibility you also have a more robust second hand market with much reasonable prices overall*, for such components. Again this shouldn't be that hard to understand, try getting a z170 board for instance & check what it costs now?
> 
> And try getting a similar
> 
> That may have been true 10 years back but it's virtually the opposite today! What will you really need a new chipset for 3 years down the line PCIe 6.0?


"Intel simply doesn't allow that! " you are right, but than agin- so what unless you know you will surly upgrade to a new AMD CPU only?

PCIe is just one the changes a mobo can have and I agree, It is not a reson to change mobo. But you have all sorts of other fetures that can change, get upgraded, and new one you don't even know of. As a user who want`s to stay updated on the performance, It is nou unlikely to think that you also will want the new updated goodis.

About second hand market, I think both partys have one. Maybe AMD is better in that front but I dont think its a deciding factor when you choose a platform. maybe just a bonus.

All in all, what i`m saying is that it is better to choose what is right for you in the day of purchasing rather than choosing a lesser option in the hope years from now it will be worth it. But thats just my opinion and way of purchasing things, surly others will do different and be just as fine


----------



## fevgatos (Oct 22, 2022)

R0H1T said:


> Booming for who though? With AM4 you had 2^4 combination of boards/chips, a typical Intel release guarantees at most 2^2 so you're 4x as likely to get a compatible AM4 board/zen chip as Intel. I'm counting zen+ as well. You also didn't answer why they had to change socket compatibility so many times on a single uarch?


Compatibility is great if it saves you money. AMD's compatibility usually doesn't save you money, 'cause their CPUs are uber expensive compared to the competition. So what's the point? I gave you an example before: it was cheaper for me to buy a 12700F + a brand new B660 motherboard than to upgrade my R5 1600 to a 5800X3D and keep the same old, outdated, out-of-warranty motherboard. Plus, the 12700F is way faster in the vast majority of workloads. So...?


----------



## R0H1T (Oct 22, 2022)

Dirt Chip said:


> "Intel simply doesn't allow that! " you are right, but than agin- so what unless you know you will surly upgrade to a new AMD CPU only?


Did you miss the part where I said the board dies? That is an important consideration for anyone not upgrading their system every 2 years. And a really major one at that.


----------



## AusWolf (Oct 22, 2022)

fevgatos said:


> The last 2 releases of Intel - we had the same "tactics" from many of the big reviewers. Post fake numbers with the Intel CPUs drawing 999 watts and some absurd fake efficiency graphs, and later retract them. In the meanwhile Billy the AMD fanboy keeps advertising around the web those original fake graphs. And then claim that Intel is paying reviewers on top of everything else. Welcome to the internet in 2022, facts just don't matter.


That's just the sign of living in a world where everything has to be 1% faster than the competition, and every job (including reviews) has to be done yesterday. Reviewers don't have time to test things at settings that make sense before the review deadline - some don't have time at all. As a result, both AMD and Intel are specced way out of their efficiency curves by default, both run at their temperature limits out of the box, but both can be tamed with a little care if that's what you prefer. Lower power and temperature limits can be set, and I think that is the way to go with these CPUs nowadays.


----------



## fevgatos (Oct 22, 2022)

AusWolf said:


> That's just the sign of living in a world where everything has to be 1% faster than the competition, and every job (including reviews) has to be done yesterday. Reviewers don't have time to test things at settings that make sense before the review deadline - some don't have time at all. As a result, both AMD and Intel are specced way out of their efficiency curves by default, both run at their temperature limits out of the box, but both can be tamed with a little care if that's what you prefer. Lower power and temperature limits can be set, and I think that is the way to go with these CPUs nowadays.


I agree, but I think reviewers are the main problem, not Intel or AMD. They are the ones rushing numbers, they are the ones using stupid settings, and there are smaller channels out there that call them out for it. A very popular reviewer (won't name him, but he is one of the biggest) is still defending his decision of manually going INTO the BIOS and choosing unlimited power limits for his review. I mean, what the actual hell? When you manually go out of your way to make the CPU inefficient, why are you wondering why it is inefficient?

If every reviewer decided to test the CPUs at, let's say, a 150 W limit, then Intel and AMD wouldn't be pushing 600 watts out of the box, 'cause no one would test them like that; they would have nothing to gain.


----------



## R0H1T (Oct 22, 2022)

fevgatos said:


> Compatibility is great if it saves you money. *AMD's compatibility usually doesn't save you money,* cause their CPUs are uber expensive compared to the competition. So what's the point? I gave you an example before,* it was cheaper for me to buy a 12700f + a brand new b660 motherboard* than upgrading my R5 1600 into a 5800x 3d and keeping the same old outdated out of warranty motherboard. Plus, *the 12700f is way faster in the vast majority of workloads. *So...?


Oh that's BS, you have two examples in this very thread for it.

And it's cheaper for me to slot in a 5950x in my x570 board, much cheaper for a lot of AM4 users in fact.

As compared to what? A 5950x for instance?

I'm not asking anyone to buy the 7xxx chips right now, in fact if you need it you should still wait for BF deals *IMO* but socket longevity is such a simple yet "underrated" plus point that overlooking it is almost criminal these days.


----------



## Dirt Chip (Oct 22, 2022)

R0H1T said:


> You also didn't answer why they had to change socket compatibility so many times on a single uarch?


They don't have to. *They choose to, from a financial standpoint, to get more money out of the consumer and to satisfy the shareholders* (just as any other global company does, btw), and, to a much lesser extent, to force updated features that maybe benefit them in some way.
To make it very clear, because I already said it in this thread a few times: I'm against this socket-change practice from Intel, but I don't count it as a deal breaker if their platform is the right one for me at a given point in time.
If I know that a CPU upgrade is something that will be important for me in 2-3 years' time, then AMD will be the preferred choice.


----------



## fevgatos (Oct 22, 2022)

R0H1T said:


> Oh that's BS, you have two examples in this very thread for it.
> 
> And it's cheaper for me to slot in a 5950x in my x570 board, much cheaper for a lot of AM4 users in fact.
> 
> ...


What example do I have? Sorry, I might have missed something.

Your x570 isn't an old board, it's the latest AM4, what exactly are you talking about?

Socket longevity is great in theory, but with AMD charging exorbitant prices for their CPUs, you are already paying for a motherboard included in the price of an AMD CPU. That's just the sad reality.

It's funny to talk about longevity but completely forget that it took AMD two freaking years to support Zen 3 on older motherboards. By the time X370/B350 got Zen 3 support, those CPUs were freaking outdated. Is that the kind of support you want? I don't.


----------



## Dirt Chip (Oct 22, 2022)

R0H1T said:


> Did you miss the part where I said the board dies? That is an important consideration for anyone not upgrading their system every 2 years. And a really major one at that.


Choosing a platform according to second-hand market availability because the mobo might die is not a major point to me. Also, I don't know what the second-hand market will be like 2, 4, 6 or 10 years from now. If the mobo dies 4, 6 or 10 years from now and I can't find any replacement, then I'm OK with buying a new platform. Btw, that's why I will choose DDR5 for a new build and not DDR4: memory is probably the most expensive part that you can carry across many years and platforms (although I'm on DDR3 now...).
But if it's an important factor to you, then OK, I can understand it, though I wouldn't weigh it as more than a bonus.
I prefer to use a UPS to lower the risk from grid spikes, among the other bonuses that come with it.


----------



## AusWolf (Oct 22, 2022)

fevgatos said:


> I agree but I think reviewers are the main problem, not Intel or AMD. They are the ones rushing numbers, they are the ones using stupid settings, and there are smaller channels out there that call them out for it. A very popular reviewer (won't name him, but is one of the biggest ones), is still defending his decision of manually going INTO the bios and choosing unlimited power limits for his review. I mean what the actual hell? When you manually go out of your way to make the CPU inefficient, why are you wondering why it is inefficient?
> 
> If every reviewer decided to test the CPUs at - lets say - 150w limit, then Intel and AMD wouldn't be pushing 600 watts out of the box, cause noone would test them like that - they would have nothing to gain.


I would say, time is the big problem, as reviewers tend to get the parts only a couple of days before the deadline which makes reviews rushed... but if they're defending the choice with unlimited power limits, that's another story. As far as I'm concerned, changing factory (Intel/AMD) default and recommended power/thermal/any limits is overclocking, and should not be done for a review.

Motherboard makers are also a problem, if you look at the above example with MSI.

A third problem may be what I call "the reviewer's paradox": they specifically look at that extra 1% performance to shit on the product or its competition (shitting on stuff brings them views). They throw a lot of hardware around, so they have a keen eye for that minuscule difference that the home user will never, ever notice. I'm sure if you built two systems with any two modern but vastly different CPUs, and just played games on both without an FPS counter on, you wouldn't notice any difference.


----------



## R0H1T (Oct 22, 2022)

X570 was released over two years back; it is old by Intel standards. In fact, it'd have been obsolete last year.


fevgatos said:


> It's funny to talk about longevity but completely forget that it took AMD 2 freaking years to support Zen 3 on older motherboards.


They did deliver eventually, even if it was driven by community outrage. Intel, on the other hand, can't even guarantee two gens of chip compatibility on a single chipset ~ here's a trivia question for you: which chipset supported basically just one gen of chips?


----------



## AusWolf (Oct 22, 2022)

R0H1T said:


> They did deliver eventually, even if it was driven by community outrage. *Intel, on the other hand, can't even guarantee two gens of chip compatibility on a single chipset* ~ here's a trivia question for you: which chipset supported basically just one gen of chips?


That's true, but realistically speaking, is it a problem? I mean, my Rocket Lake i7 is still the same Rocket Lake i7 and it still runs my programs and games as it was/did two years ago.


----------



## fevgatos (Oct 22, 2022)

R0H1T said:


> X570 was released over two years back; it is old by Intel standards. In fact, it'd have been obsolete last year.


And so it is by AMD's standards; the last CPU you can put on it is Zen 3. What are you talking about? LOL


R0H1T said:


> They did deliver eventually, even if it was driven by community outrage. Intel, on the other hand, can't even guarantee two gens of chip compatibility on a single chipset ~ here's a trivia question for you: which chipset supported basically just one gen of chips?


But I don't care about mobo upgradability if it doesn't actually save me money; that's the point you keep on missing. AMD could support 50 gens of CPUs and I still wouldn't care if their CPU pricing includes the cost of buying a new motherboard compared to competitors' products. Again: when the 3D was released, my options were to buy a 3D for my old, outdated, out-of-warranty B350 for 450€, or buy a 12700F + a brand new B660 for 460€. So how the heck did mobo upgradability benefit me?

To answer your question, the B550 supported basically just one gen of chips. That's AMD's chipset, btw.


----------



## Dirt Chip (Oct 22, 2022)

R0H1T said:


> X570 was released over two years back; it is old by Intel standards. In fact, it'd have been obsolete last year.
> 
> They did deliver eventually, even if it was driven by community outrage. Intel, on the other hand, can't even guarantee two gens of chip compatibility on a single chipset ~ here's a trivia question for you: which chipset supported basically just one gen of chips?


You keep asking it over and over, although you've been given direct answers. Why?
Are you trying to support AMD and hurt Intel over their business choices regarding mobo longevity?


----------



## R0H1T (Oct 22, 2022)

AusWolf said:


> That's true, but realistically speaking, is it a problem?


It is a problem if you're looking to keep a system for 5-10 years, I've been burnt more than once over this & I simply will avoid Intel on desktops for this important reason. It is not the only reason but a major one, not to mention the second hand PC market is even worse in this part of the world where there are very few avenues to secure old parts even online. Now as I understand it the situation in the US is much better & probably much of Western Europe wrt used components. Not so much in 60-70 percent of the rest of the world.


Dirt Chip said:


> Are you trying to support AMD and hurt Intel over their business choices regarding mobo longevity?


Yes because of my experience with them in the past, although my last two laptops were Intel based ~ if that answers your question.


----------



## Xuper (Oct 22, 2022)

Any results for rendering after one hour between the 13900K and 7950X?


----------



## AusWolf (Oct 22, 2022)

R0H1T said:


> It is a problem if you're looking to keep a system for 5-10 years, I've been burnt more than once over this & I simply will avoid Intel on desktops for this important reason. It is not the only reason but a major one, not to mention the second hand PC market is even worse in this part of the world where there are very few avenues to secure old parts even online. Now as I understand it the situation in the US is much better & probably much of Western Europe wrt used components. Not so much in 60-70 percent of the rest of the world.


If you intend to keep your system for 5-10 years, then you'll have to buy a new motherboard when you upgrade anyway.


----------



## R0H1T (Oct 22, 2022)

No you don't, I have a 2700 right now and I will upgrade to either 5900x or 5950x in the next year or so ~ can you do that with Intel?

That's at least 2-4x more performance and minimal spend on the upgrade overall.


----------



## AusWolf (Oct 22, 2022)

R0H1T said:


> No you don't, I have a 2700 right now and I will upgrade to either 5900x or 5950x in the next year or so ~ can you do that with Intel?
> 
> That's at least 2-4x more performance and minimal spend on the upgrade overall.


Fair enough, I guess. It's just not something you can plan ahead. AMD said AM5 will be supported as long as AM4 was, but we'll have to see about chipset compatibility.

Edit: We'll also have to see about future CPU improvements. The 5900X is a huge upgrade over the 2700, but we don't know if there will be another big jump on AM5 or not.


----------



## fevgatos (Oct 22, 2022)

R0H1T said:


> No you don't, I have a 2700 right now and I will upgrade to either 5900x or 5950x in the next year or so ~ can you do that with Intel?
> 
> That's at least 2-4x more performance and minimal spend on the upgrade overall.


But you don't need to do that with Intel. Going by pricing at one of the biggest EU retailers, a 13600KF (just released btw, prices will drop soon) and a brand new B660 will cost you around 510€. The 5900X is 440€. Why the flying **** would you go for the 5900X??? For 70€ more you get a brand new, better motherboard with a 3-year warranty and all the bells and whistles, much, much faster ST performance, more MT performance and way better gaming performance. And you can sell your old motherboard, even for 30€, which would cover the 70€ extra you'd have to pay.

That's my point all along: upgradability means absolutely nothing when AMD prices their CPUs sky-high.


----------



## Dirt Chip (Oct 22, 2022)

R0H1T said:


> Yes because of my experience with them in the past, although my last two laptops were Intel based ~ if that answers your question.


OK then, now I get it.
I have a different experience with my current mobo, pushing 12 years now.
The one before it (an nForce chipset board by NV with an Athlon 3200+ CPU; hooo, what a weird thought, an NV mobo with an AMD CPU) also lasted about 10 years as my main rig and then 2 more as an HTPC.
In the rare cases when I needed a second-hand mobo (not for myself), I could find it on eBay and such. It might have cost more than a second-hand product should, but that's something I can accept once in a while (say, one instance every 15 years).

I myself try not to "help" or "support" any brand by buying its products (not saying you (R0H1T) are; it's just a general note). That's a bad practice imo that might cost you more, give you less, and result in heartbreak.
If the product is right for me (right from a selfish point of view), then OK, I pay. Nothing more.



R0H1T said:


> 2-4x more performance


Sounds like something I've heard before...


----------



## R0H1T (Oct 22, 2022)

fevgatos said:


> But you don't need to do that with Intel. Going by pricing on one of the biggest EU retailers, a 13600kf (just released btw, prices will drop soon) and a brand new b660 will cost you around 510€. The 5900x is 440€. Why the flying **** would you go for the 5900x???


I was strictly talking about the benefits of socket compatibility ~ there are just way too many to ignore. As for the current market, if you don't really care about a potential mobo replacement in the future or upgrading on the same socket, then 13xxx is a no-brainer. It's insane value compared to the 7xxx chips; if you check any of my responses in the Zen 4 threads, I've never once recommended them at that price.





fevgatos said:


> That's my point all along, upgradability means absolutely nothing when AMD prices their CPUs skyhigh.


Really depends on the person; for me, securing a replacement/spare component is a quintessential part of building the system ~ while others would probably throw it completely in the bin.


----------



## fevgatos (Oct 22, 2022)

R0H1T said:


> I was strictly talking about the benefits of socket compatibility ~ there are just way too many to ignore. As for the current market, if you don't really care about potential mobo replacement in the future/upgrading on the same socket then 13xxx is a no brainer. It's an insane value compared to 7xxx chips, if you check any of my responses in the zen4 threads I've never once recommended it that price. Really depends on the person, for me securing a replacement/spare component is a quintessential part of building the system ~ while others would probably throw it completely in the bin.


Well, you mixed two points together. Finding a spare motherboard: sure, AMD's way is better. Upgradability saving you money: it's just not the case. You spend as much or even more going with AMD even if you get to keep your motherboard. So what good does upgradability do in that regard? You overpay for an obsolete CPU while still keeping your old, outdated, out-of-warranty motherboard. And people, you included, are trying to convince me that's a positive thing, lol.


----------



## R0H1T (Oct 22, 2022)

Overpay for what? I bought the 2700 at roughly $150 &, as I've stressed in multiple threads, I could've bought the 5900x ($250) or 5950x ($400) at insane (sale) prices but was minutes too late! Find me a better deal than that; in fact, I'll pay you the difference if you can.



fevgatos said:


> You overpay for an obsolete CPU while still keeping your old outdated out of warranty motherboard.


It still has warranty


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> Well you mixed 2 points together. Finding a spare motherboard, sure amd's way is better. Upgradability saving you money, it's just not the case. You spend as much or even more going with AMD even if you get to keep your motherboard. So what good does  upgradability do in that regard? You overpay for an obsolete CPU while still keeping your old outdated out of warranty motherboard. And people, you included, are trying to convince me that's a positive thing, lol


I don't see the point in switching to anything above 8 cores unless you run specific productivity apps.
With that in mind, if you're upgrading from a 2700x on AM4 you should just get the 5700x and that's that.
The price is just 60% of the 13600K's, and you get almost 85% of that CPU's gaming performance.
When you factor in the motherboard you get a price-to-performance ratio that is TWICE as good as the 13600K's lol
How is that not a good thing?
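For what it's worth, the price-to-performance claim can be sanity-checked in a few lines. The euro figures below are illustrative assumptions (picked to roughly match the 60%-of-the-price ratio above), not actual street prices:

```python
# Relative gaming performance per euro spent, for each upgrade path.
# All prices are illustrative assumptions, not retailer quotes.

def perf_per_euro(relative_perf: float, total_cost: float) -> float:
    """Relative gaming performance divided by total cost of the upgrade."""
    return relative_perf / total_cost

# Platform switch: 13600K (100% gaming perf) + a new board, ~320 + ~200 EUR
intel = perf_per_euro(1.00, 320 + 200)

# Drop-in AM4 upgrade: 5700X at ~85% of the 13600K's gaming perf, ~190 EUR
amd = perf_per_euro(0.85, 190)

print(f"drop-in upgrade vs platform switch: {amd / intel:.2f}x the perf per euro")
```

With these assumed prices the drop-in route works out to roughly 2.3x the performance per euro, which is where a "twice as good" figure lands; a DDR5 memory kit on the Intel side would tilt it even further.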


----------



## fevgatos (Oct 22, 2022)

R0H1T said:


> Over pay for what? I bought the 2700 at roughly $150 & as I've stressed in multiple threads I could've bought the 5900x ($250) or 5950x ($400) at insane (sale) prices but was minutes late! Find me a better deal than that, in fact I'll pay you the difference if you can.
> 
> 
> It still has warranty


Overpay for getting to keep your motherboard. How many times do we have to go through this? A 13600kf + a brand new b660 motherboard is faster in every single workload, has more features, better upgradability, it's brand freaking new, and it only costs 70€ more than just buying a 5900x. So, I'll ask again, for the 5000th time: what do you actually gain from that incredible am4 upgradability?



siluro818 said:


> I don't see the point in switching to anything above 8 cores unless you run specific productivity apps.
> With that in mind, if you're upgrading from a 2700x on AM4 you should just get the 5700x and that's that.
> The price is just 60% of the 13600K's, and you get almost 85% of that CPU's gaming performance.
> When you factor in the motherboard you get a price-to-performance ratio that is TWICE as good as the 13600K's lol
> How is that not a good thing?


According to the benchmarks from this very site, the 13600k is 30% faster in games. In productivity...yeah, let's not even talk about that. Those 2 cpus aren't really comparable


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> Overpay for getting to keep your motherboard. How many times do we have to go through this? A 13600kf + a brand new b660 motherboard is faster in every single workload, has more features, better upgradability, it's brand freaking new, and it only costs 70€ more than just buying a 5900x. So, I'll ask again, for the 5000th time: what do you actually gain from that incredible am4 upgradability?
> 
> 
> According to the benchmarks from this very site, the 13600k is 30% faster in games. In productivity...yeah, let's not even talk about that. Those 2 cpus aren't really comparable


The benchmarks (1080p) from this very site:


----------



## Dirt Chip (Oct 22, 2022)

fevgatos said:


> According to the benchmarks from this very site, the 13600k is 30% faster in games. In productivity...yeah, let's not even talk about that. Those 2 cpus aren't really comparable


16.5% faster on average in FHD gaming, 30% on average across all apps.
Where did you get those numbers on TPU?

For anyone on a budget looking to upgrade from zen2 for gaming right now, zen3 is the better eco route.
But when the 13400F comes, that might change.


----------



## fevgatos (Oct 22, 2022)

siluro818 said:


> The benchmarks (1080p) from this very site:


The same benchmarks at 720p show different results though.



Dirt Chip said:


> 16.5% faster on average in FHD gaming, 30% on average across all apps.
> Where did you get those numbers on TPU?
> 
> For anyone on a budget looking to upgrade from zen2 for gaming right now, zen3 is the better eco route.
> But going Intel, when the 13400F comes, might change that.


720p gaming the difference is around 30%.


----------



## R0H1T (Oct 22, 2022)

fevgatos said:


> A 13600kf + a brand new b660 motherboard is faster in every single workload, has more features, *better upgradability,*


What features? What the heck are you talking about? I see we're still in the geocentric age 



fevgatos said:


> So, ill ask again, for the 5000th time, what do you actually gain by that* incredible am4 upgradability?*


You could buy a relatively cheap 6-8 core chip in 2017 & upgrade to a chip that's at least 2-8x faster without spending another dime on a motherboard! Oh, did I mention a replacement board won't cost as much as a brand new chip itself?


fevgatos said:


> Overpay for getting to keep your motherboard. How many times do we have to go through this?


So instead of spending 250~400 dollars on a zen3 chip, you're saying 13600kf+B660 is cheaper? Are you sure you're doing your math right? More importantly, why would someone upgrading every 5-10 years overspend on such a questionable build 

I'll ask again for the billionth time ~ what will a new board/chipset bring you in 2025, apart from possibly USB 5.0, if even that?


fevgatos said:


> According to the benchmarks from this very site, the 13600k is 30% faster in games. In productivity...yeah, let's not even talk about that. *Those 2 cpus aren't really comparable*


No they aren't because you keep shifting the goalposts!


----------



## Dirt Chip (Oct 22, 2022)

fevgatos said:


> 720p gaming the difference is around 30%.


1- No one has played at 720p for the past 10 years now; this test is only for "academic knowledge".
2- At 720p it's 22% sharp, not close to 30%.
Sorry.


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> The same benchmarks at 720p show different results though.
> 
> 
> 720p gaming the difference is around 30%.


Because you will obviously game at 720p lol
That stuff's completely irrelevant.
What is of note however is that these 83.5% are achieved with DDR4 vs the DDR5 memory config the Intel uses, meaning you will have to factor the memory upgrade into your costs as well.

Let's cut this conversation short, shall we?
If you are making a brand new configuration and you have no intention of upgrading anything in the next 5 years, the 13600K is an excellent choice.
But to recommend it to someone who is already on AM4 - for the purposes of gaming nonetheless - is beyond silly.


----------



## fevgatos (Oct 22, 2022)

Dirt Chip said:


> 1- No one has played at 720p for the past 10 years now; this test is only for "academic knowledge".
> 2- At 720p it's 22% sharp, not close to 30%.
> Sorry.


The difference isn't 22%, your math is just wrong. 78% means that for every 100 fps the 13600k gets, the 5700x gets 78. So for the 78 to become 100, you need a 30% increase (78 + 78*30/100). So the difference IS 30%.


It's not academic knowledge; it tells you how long you can keep your CPU. The 5700x WILL eventually bottleneck a graphics card at 1440p or even 4k that the 13600k won't, cause it's 30% faster. So you can keep it longer. If you see, for example, a 4k benchmark with a 4090 between a ryzen 1700 and an 8700k, you will realize that the 1700 severely bottlenecks the 4090 while the 8700k does not. So you can keep the 8700k, while the 1700 needs an upgrade



siluro818 said:


> Because you will obviously game at 720p lol


No you won't, but you'll keep your CPU until a future GPU can turn those 720p numbers into 4k numbers. So let's say the 5700x gets an average of 80 fps at 720p while the 13600k gets 110: you can pair the 5700x with a GPU that gets 80 fps at 4k (let's call it a 5080), while you can pair the 13600k with a GPU that gets 110 fps at 4k (let's call it a 6080). So the 13600k will last you longer without needing an upgrade. It's logic 101 man
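The reasoning here boils down to "delivered fps = min(CPU cap, GPU cap)". A tiny sketch using the post's hypothetical numbers (the 80 and 110 fps figures are the post's made-up examples, not benchmarks):

```python
# Delivered frame rate is capped by whichever of the CPU or GPU is slower.

def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Frame rate actually seen: the lower of the two component caps."""
    return min(cpu_cap, gpu_cap)

# CPU caps as measured at 720p, where the GPU bottleneck is removed
cpu_caps = {"5700X": 80, "13600K": 110}

# Successively faster hypothetical future GPUs, expressed as their 4K fps cap
for gpu_4k_fps in (60, 80, 110):
    row = {cpu: effective_fps(cap, gpu_4k_fps) for cpu, cap in cpu_caps.items()}
    print(f"GPU capable of {gpu_4k_fps} fps at 4K -> {row}")
```

With a 60 fps GPU both CPUs deliver the same 60 fps; the faster CPU's extra headroom only shows up once the GPU cap climbs past 80 fps, which is exactly the longevity argument being made.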


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> The difference isn't 22%, your math is just wrong. 78% means that for every 100 fps the 13600k gets, the 5700x gets 78. So for the 78 to become 100, you need a 30% increase (78 + 78*30/100). So the difference IS 30%.
> 
> 
> It's not academic knowledge, it tells you how long you can keep your CPU. The 5700x WILL eventually bottleneck a graphics card at 1440p or even 4k that the 13600k won't, cause it's 30% faster. So, you can keep it longer. If you see for example a 4k benchmark with a 4090 between a ryzen 1700 and an 8700k, you will realize that the 1700 severely bottlenecks the 4090 while the 8700k does not. So you can keep it, while the 1700 needs an upgrade
> ...


The thing is, though, you will never keep a CPU for that long, because for this to happen (the 720p-to-4K scenario) you will need a GPU roughly 9x faster than the 3090 we're currently using for these benchmarks.
So presuming a 60% increase generation over generation (which is on the generous side, unfortunately), you will have to wait FIVE generations, i.e. for the 8090 lmao - that's 10 years.
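The generation count here is just a logarithm. A quick check, taking the post's assumed 60% gain per GPU generation and the ~9x shortfall at face value (both are the post's assumptions, not measured figures):

```python
import math

def generations_needed(speedup_required: float, per_gen_gain: float = 1.6) -> int:
    """Smallest n such that per_gen_gain ** n >= speedup_required."""
    return math.ceil(math.log(speedup_required) / math.log(per_gen_gain))

# ~9x more GPU throughput needed to turn today's 720p numbers into 4K numbers
print(generations_needed(9.0))  # 5
```

1.6^4 ≈ 6.6x falls short of 9x, while 1.6^5 ≈ 10.5x clears it, hence five generations.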


----------



## fevgatos (Oct 22, 2022)

siluro818 said:


> The thing is though you will never keep a CPU for so long, because for this to happen (the 720p to 4K scenario) you will need roughly a 16x faster GPU than the 3090 we're currently using for these benchmarks.
> So presuming a 60% increase from generation on generation (which is on the generous side unfortunately) you will have to wait SIX generations, i.e. for the 9090 lmao


Sure, but why wouldn't you keep it for that long if it actually maxes out your GPU? Lots of people kept a 2700k until very recently for that exact reason. And almost all CPUs, if not all of them, already bottleneck the 4090 at 1440p


----------



## InVasMani (Oct 22, 2022)

Should probably be an instant ban for that 720p remark. I would not touch that resolution with a dual core. 3090, it's 720p GG time! Someone Drake me it please.


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> Sure, but why wouldn't you keep it for that long if it actually maxes out your GPU? Lots of people kept a 2700k until very recently for that exact reason. And almost all CPUs, if not all of them, already bottleneck the 4090 at 1440p


Because this sort of performance is far from the only factor to take into account when considering what will matter for your PC.
I don't know who kept a 2700K until recently or rather what you mean by recently but that CPU was a bottleneck already during the Pascal era...
Anyway we are getting OT here.
Even if getting a 13600K would allow you an extra generation of GPU without the need to upgrade it is NOT worth the cost right now as an upgrade for an AM4 user.


----------



## fevgatos (Oct 22, 2022)

siluro818 said:


> Because this sort of performance is far from the only factor to take into account when considering what will matter for your PC.
> I don't know who kept a 2700K until recently or rather what you mean by recently but that CPU was a bottleneck already during the Pascal era...
> Anyway we are getting OT here.
> Even if getting a 13600K would allow you an extra generation of GPU without the need to upgrade it is NOT worth the cost right now as an upgrade for an AM4 user.


The Pascal era (like the 1080 Ti) came 6-7 years after the 2700k. And even then, I doubt it was a bottleneck at 4k. Actually, there is a very popular video on youtube testing exactly that, a 1080 Ti on a 2700k vs an 8700k or something similar, I don't remember exactly. No difference at all in 4k



InVasMani said:


> Should probably be an instant ban for that 720p remark. I would not touch that resolution with a dual core. 3090, it's 720p GG time! Someone Drake me it please.


No one suggested you should play at that resolution. Literally no one. The only people that brought this up are people who can't argue the actual point, so they are strawmanning


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> The Pascal era (like the 1080 Ti) came 6-7 years after the 2700k. And even then, I doubt it was a bottleneck at 4k. Actually, there is a very popular video on youtube testing exactly that, a 1080 Ti on a 2700k vs an 8700k or something similar, I don't remember exactly. No difference at all in 4k
> 
> 
> No one suggested you should play at that resolution. Literally no one. The only people that brought this up are people who can't argue the actual point, so they are strawmanning


It is you who is trying to argue anything but the "actual point".
And that point is there is NO POINT for anyone sitting low on AM4 to switch platforms for this or any other current CPU when they can still go higher on AM4.
The full cost will NEVER be justified.


----------



## InVasMani (Oct 22, 2022)

Kind of sounds a bit more like you're strawmanning a little yourself, when your counterargument, by your own admission, concerns a resolution no one should play at. You must really like DSR downsampling on that 720p display with that 3090. Jensen, pimp my 720p display!


----------



## fevgatos (Oct 22, 2022)

InVasMani said:


> Kind of sounds a bit more like you're strawmanning a little yourself, when your counterargument, by your own admission, concerns a resolution no one should play at. You must really like DSR downsampling on that 720p display with that 3090. Jensen, pimp my 720p display!


No one suggested you should play at 720p but you. It must feel horrible not being able to argue the point, so instead you have to resort to these kinds of tactics. That's sad; I feel sorry for you, man.


----------



## Dirt Chip (Oct 22, 2022)

fevgatos said:


> The difference isn't 22%, your math are just wrong. 78% means that for every 100fps the 13600k gets, the 5700x gets 78. So for the 78 to become 100, you need a 30% increase (78 + 78*30/100). So the difference IS 30%.
> 
> 
> It's not academic knowledge, it tells you how long you can keep your CPU. The 5700x WILL eventually bottleneck a graphics card at 1440p or even 4k that the 13600k won't, cause it's 30% faster. So, you can keep it longer. If you see for example a 4k benchmark with a 4090 between a ryzen 1700 and an 8700k, you will realize that the 1700 severely bottlenecks the 4090 while the 8700k does not. So you can keep it, while the 1700 needs an upgrade
> ...


If your baseline (100%) is the 13600k, then it is 22% bigger (faster) than the 5700x: 100%*(1-0.78)=22%
If your baseline is the 5700x, then it is 28.2% smaller (slower) than the 13600k: (100/78)*100%=128.2%; 128.2%-100%=28.2%
If the 13600k were 30% faster, the 5700x would need to be at 70%: 100%-30%=70%
That's how we do percentages on my side of the globe, or I just need to redo my math again.

I don't see 32K res even 10 years from now, but maybe by then 4K will be the new FHD


----------



## R0H1T (Oct 22, 2022)

The computational costs will be exorbitantly high at that point, which is part of the reason why I said sitting on a PCIe 5.0 board today is wise, yes, even for Intel users. "Moore's law" is truly dead, & when Intel moves to tiles you will see a clock speed regression. Anyone under the impression that clock speeds will not come down going forward, whether with Intel or AMD, is smoking some good stuff. This is possibly the last gen of high/insane clocks on MSDT ~ it will only be downhill from here.

Unless there's some major computational breakthrough, or new materials are developed to replace Si, we'll just see PC hardware prices skyrocket, especially for high-end stuff. As they say, the last 1% is the hardest to achieve; the same goes for the *nm race* as well.


----------



## TheoneandonlyMrK (Oct 22, 2022)

Elnor has finally beaten the FX world record by 90 MHz with one of these at 8.8 GHz.

So they clearly do have a use case, benchmarks.


----------



## Dirt Chip (Oct 22, 2022)

TheoneandonlyMrK said:


> Elnor has finally beaten the FX world record by 90 MHz with one of these at 8.8Ghz.
> 
> So they clearly do have a use case, benchmarks.


I'm going 13900k.
My use case is the Adobe suite. It's also common, at the very least, as an uber-extreme OC chip.


----------



## Shatun_Bear (Oct 22, 2022)

@fevgatos is getting desperate and doing his best Intel marketing... give it a rest, fella. These CPUs are good if you don't care about their massive power draw, heat and thermal throttling (13900K and 13700K power draw is still horrible).


----------



## fevgatos (Oct 22, 2022)

Dirt Chip said:


> If your baseline (100%) is the 13600k, then it is 22% bigger (faster) than the 5700x: 100%*(1-0.78)=22%
> If your baseline is the 5700x, then it is 28.2% smaller (slower) than the 13600k: (100/78)*100%=128.2%; 128.2%-100%=28.2%
> If the 13600k were 30% faster, the 5700x would need to be at 70%: 100%-30%=70%
> That's how we do percentages on my side of the globe, or I just need to redo my math again.


Yes, you definitely need to redo math again.

One CPU gets 100 fps, and the other 78. How much faster is CPU 1 compared to CPU 2? It's 30%. That's how math has worked for the 30 years I've been alive.

CPU 2 is not 28.2% slower, it's actually 22% slower. 

You have percentages completely mixed up


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> Yes, you definitely need to redo math again.
> 
> One CPU gets 100 fps, and the other 78. How much faster is CPU 1 compared to CPU 2? It's 30%. That's how math has worked for the 30 years I've been alive.
> 
> ...


Lol you both gotta go to that class tho xD
According to TPU 13600K is 28.2% faster than 5700X in 720p, which also means 5700X is 28.2% slower. There is no 30% or 22% anywhere in this comparison.
But it still doesn't matter because you ain't gonna play anything using 720p, which DOES matter because you ain't gonna see any of that performance this year or the next.
And no these results don't translate directly to the 6 or 10 or however many years in the future, because by the time said potential might be realized:
A) you have no idea how games are gonna utilize the CPU at the time, because they are likely to be VERY different than the games these benchmarks were done with;
B) several CPU generations down the line you will be able to get another PC config at a lower price than the upgrade you're proposing even with the 5700X upgrade now included in the total cost;
C) you can simply pick the 5800X3D now, minimize the difference with 13600K and STILL save money off the whole deal.


----------



## fevgatos (Oct 22, 2022)

siluro818 said:


> Lol you both gotta go to that class tho xD
> According to TPU 13600K is 28.2% faster than 5700X in 720p, which also means 5700X is 28.2% slower. There is no 30% or 22% anywhere in this comparison.


Nope. You are factually wrong. If CPU A gets 80 points and CPU B gets 100 points, CPU B is 25% faster, while CPU A is 20% slower. That's like math 101.
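The two conversions being argued can be checked directly; the 80-vs-100 example is the one above, and 78% is the relative-performance figure quoted earlier in the thread:

```python
# Converting a relative-performance score into "X% faster" / "Y% slower".
# The two directions are not symmetric, which is the source of the argument.

def pct_faster(fast: float, slow: float) -> float:
    """How much faster `fast` is relative to `slow`, in percent."""
    return (fast / slow - 1) * 100

def pct_slower(slow: float, fast: float) -> float:
    """How much slower `slow` is relative to `fast`, in percent."""
    return (1 - slow / fast) * 100

print(pct_faster(100, 80))            # 25.0
print(round(pct_slower(80, 100), 1))  # 20.0 -- not 25

# With the 13600K at 100% and the 5700X at 78%:
print(round(pct_faster(100, 78), 1))  # 28.2
print(round(pct_slower(78, 100), 1))  # 22.0
```

So for the 78% figure, "28.2% faster" and "22% slower" are both correct, and the thread's "30%" is a rounding-up of the first number.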


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> Nope. You are factually wrong. If CPU A gets 80 points and CPU B gets 100 points, CPU B is 25% faster, while CPU A is 20% slower. That's like math 101.


True lol
But again - it doesn't matter one bit because of the 720p ^^


----------



## fevgatos (Oct 22, 2022)

siluro818 said:


> True lol
> But again - it doesn't matter one bit because of the 720p ^^


Past evidence shows that it is always the case: the CPU that gets more frames at a low resolution gets more frames at normal resolutions 5 years down the line.


----------



## Dirt Chip (Oct 22, 2022)

fevgatos said:


> Yes, you definitely need to redo math again.
> 
> One CPU gets 100 fps, and the other 78. How much faster is CPU 1 compared to CPU 2? It's 30%. That's how math has worked for the 30 years I've been alive.
> 
> ...


Yep, you are right about the math. Back to school then...


----------



## CrAsHnBuRnXp (Oct 22, 2022)

What is the performance difference when running a gen 5 SSD with the GPU at x8, compared to a gen 4 SSD with the GPU at x16? Is this mentioned and did I miss it? All I saw was something along the lines of it making little difference, but I'd like to know actual numbers


----------



## siluro818 (Oct 22, 2022)

fevgatos said:


> Past evidence shows that it is always the case: the CPU that gets more frames at a low resolution gets more frames at normal resolutions 5 years down the line.


My man, no one is arguing whether 13600K gets more frames or not - it obviously does.
But not nearly enough to justify switching platforms. You get the 5700X for 240EUR and the jump from 2700X is already huge. To get the presumed extra 28% 5 years down the track, you need to spend exactly 4 times as much to get the 13600K, the Z790 mobo, and the same 32GB DDR5 mem kit.
You can't make that look like a good deal even if you add free math classes on the side lol


----------



## fevgatos (Oct 22, 2022)

siluro818 said:


> My man, no one is arguing whether 13600K gets more frames or not - it obviously does.
> But not nearly enough to justify switching platforms. You get the 5700X for 240EUR and the jump from 2700X is already huge. To get the presumed extra 28% 5 years down the track, you need to spend exactly 4 times as much to get the 13600K, the Z790 mobo, and the same 32GB DDR5 mem kit.
> You can't make that look like a good deal even if you add free math classes on the side lol


And I'm saying that the 2 CPUs aren't comparable at all. Nothing beats keeping your 2700x in terms of value either, now does it?


----------



## Mussels (Oct 23, 2022)

AusWolf said:


> Exactly my point. And what do you need a new CPU for in your existing mobo? A 4% IPC uplift?


Zen 1 vs Zen3 is a lot more than 4%

That's the amount of increase Intel gave; AMD did things differently



You could get an x370 from 2017 (~5 years old) and slap in a 5800x3D, and apart from PCI-E 3.0 x16 you'd have no performance loss. My x370 happily runs 3200 on the RAM, and I've managed 3600 with SoC voltage increases.


The Ultimate Upgrade Bait - From 1800X to 5800X3D | Hardware Canucks





Including direct comparisons with x370 vs x570 showing the difference is genuinely small






4%? More like 50% to 200% (and for the 0.1% lows, 200%+)



siluro818 said:


> My man, no one is arguing whether 13600K gets more frames or not - it obviously does.
> But not nearly enough to justify switching platforms. You get the 5700X for 240EUR and the jump from 2700X is already huge. To get the presumed extra 28% 5 years down the track, you need to spend exactly 4 times as much to get the 13600K, the Z790 mobo, and the same 32GB DDR5 mem kit.
> You can't make that look like a good deal even if you add free math classes on the side lol


That's my view on this 
Very few people (4090 owners, really) can justify the extra cost right now.

Anyone else should wait a gen, because both AMD and intel seem to have rushed their products this time around


----------



## Dirt Chip (Oct 23, 2022)

Mussels said:


> Anyone else should wait a gen, because both AMD and intel seem to have rushed their products this time around


Yep, zen4 is in a tough spot: squeezed by zen3 on the budget side for AM4 owners, and on the performance side by RL.
For someone on a budget with a DDR4 kit and/or an old system to carry over, RL is also a good option.


----------



## siluro818 (Oct 23, 2022)

fevgatos said:


> And im saying that the 2 cpus arent comparable at all. Nothing beats keeping your 2700x in terms of value either, now does it?


OK. 5800X3D is comparable. Same math.  Still no point in switching platforms.


----------



## fevgatos (Oct 23, 2022)

siluro818 said:


> OK. 5800X3D is comparable. Same math.  Still no point in switching platforms.


Actually, they aren't. The 13600k is more than 50% faster in MT workloads and more than 20% in ST workloads. The 3d is more comparable to a 12600kf, which alongside a brand new b660 motherboard would cost you around the same as a 5800x3d costs on its own. Which is my point all along: you don't really gain anything from mobo upgradability.


----------



## FeelinFroggy (Oct 23, 2022)

Vayra86 said:


> What games are choking your 8700K? I'm curious as I have yet to find one, though of course I'm on an 'ancient' GPU.


When did I say my 8700k was choking games? I run an 8700k and a 3080 at 1440p and it's been great. The 8700k is very comparable to the 3600, and the new Intel CPUs get about 20% better fps at 1440p.

For the past several generations, CPUs by both AMD and Intel have not been worth upgrading. There would be no real-world difference that I would see in a game.

But a 20% bump in fps is noticeable. I'd also like to get a PCIe 4.0 NVMe drive; they have been out for a while now and my board does not support it.

While I love my 8700k and it has been a great CPU, I have had it for 5 or 6 years and I don't see it as being irresponsible if I wanted to upgrade my platform to get some new features and better performance.

But I'll probably wait till the 7800X3D comes out and check out W1zzard's review before I pull the trigger.


----------



## Vayra86 (Oct 23, 2022)

FeelinFroggy said:


> When did I say my 8700k was choking games? I run an 8700k and a 3080 at 1440p and it's been great. The 8700k is very comparable to the 3600, and the new Intel CPUs get about 20% better fps at 1440p.
> 
> For the past several generations, CPUs by both AMD and Intel have not been worth upgrading. There would be no real-world difference that I would see in a game.
> 
> ...


Oh no don't get me wrong, she certainly has matured by now, and I get it, we all like a young new lass; I just prefer the more experienced bed partner over those young hotties  I like 'em when they're no longer in a hurry these days.



Still though, I'm also looking at those X3D's and midrange Intels... the rack of cache on those


----------



## AusWolf (Oct 23, 2022)

Vayra86 said:


> Oh no don't get me wrong, she certainly has matured by now, and I get it, we all like a young new lass; I just prefer the more experienced bed partner over those young hotties  I like 'em when they're no longer in a hurry these days.
> 
> 
> 
> Still though, I'm also looking at those X3D's and midrange Intels... the rack of cache on those


I've just started playing Stray on my bedroom HTPC with the i7-4765T. The wonders that decade-old 4-core 35 W CPU can do with the last GeForce GT card in existence are amazing! Solid gameplay at 1080p low graphics... why am I thinking about upgrading my main rig again?


----------



## Steevo (Oct 23, 2022)

fevgatos said:


> Τhe same benchmarks at 720p show different results though.
> 
> 
> 720p gaming the difference is around 30%.



And at 480p I bet it’s so fast it would grow wings and take off so no one would ever catch it. I have a 480p monitor somewhere if you would like it!!!!


----------



## Mussels (Oct 24, 2022)

fevgatos said:


> Actually, they aren't. The 13600k is more than 50% faster in MT workloads and more than 20% in ST workloads.. The 3d is more comparable to a 12600kf, which alongside a brand new b660 motherboard would cost you around the same price a 5800x 3d costs on it's own. Which is my point all along, you don't really gain anything by mobo upgradability.


workloads are not games

Gaming CPU? x3D


Workload CPU? Anything with more cores.



Steevo said:


> And at 480 I bet it’s so fast it would grow wings and take off so no one would ever catch it. I have a 480 monitor somewhere if you would like it!!!!


low res testing is how you know how the CPU will perform with future, more powerful GPUs


----------



## AnotherReader (Oct 24, 2022)

Dirt Chip said:


> Your strategy is very much valid, but not special in any way to AMD only.
> You can also sell any Intel mobo and CPU - nothing new here. Again, if you are on a budget, almost any change will be at a loss, financially. Intel's second-hand market is booming just as much as AMD's, I guess.


Only with AMD can you get such a large jump within the same socket; Intel usually requires a socket change for appreciable CPU upgrades. To each their own, but for me, Intel's strategy of limited compatibility is repellent.


----------



## Steevo (Oct 24, 2022)

Mussels said:


> workloads are not games
> 
> Gaming CPU? x3D
> 
> ...


I know, but dead horse is dead. But maybe they need more horse meat lasagna wherever watts are free?


----------



## Dirt Chip (Oct 24, 2022)

AnotherReader said:


> Only with AMD can you get such a large jump within the same socket; Intel usually requires a socket change for appreciable CPU upgrades. To each their own, but for me, Intel's strategy of limited compatibility is repellent.


You are right, but I don't find it reason enough not to choose them if the product is right for me (faster and cheaper than the competition).


----------



## fevgatos (Oct 24, 2022)

Steevo said:


> I know, but dead horse is dead. But maybe they need more horse meat lasagna wherever watts are free?


It's not, you just don't understand the point. Going by 4k results, I should buy a 3600x. It performs almost identically to a 13900k and costs like 1/4th of the price. What happens next year when I replace my 3080 with a 5080, though? Exactly. So that's why I'm looking at 720p results: to know what's gonna happen next year / which CPU will last me



Mussels said:


> workloads are not games
> 
> Gaming CPU? x3D
> 
> ...


And I'm saying they are not comparable. It's like comparing the 3d to a 7950x. The 3d is closer to the 12600kf, both in games and in other workloads; at least that's what TPU shows. At which point I don't see the huge benefit of mobo upgradability, since the 12600kf with a brand new mobo costs as much as the 3d on its own


----------



## InVasMani (Oct 24, 2022)

Even if your CPU lasts at 720p the display won't with a 5080. If you're that concerned about which CPU will last you maybe you should show similar concern for how long the socket will last you as well.


----------



## Wasteland (Oct 24, 2022)

InVasMani said:


> Even if your CPU lasts at 720p the display won't with a 5080. If you're that concerned about which CPU will last you maybe you should show similar concern for how long the socket will last you as well.



Forget the resolution. The point isn't that anyone will actually use a 720p display; the point is that low resolution testing removes GPU bottlenecks. This is useful both for determining a CPU's useful lifespan, and as a proxy for heavily CPU-limited games that might be impractical to benchmark (e.g. certain online games). I'm usually among the first to say that CPU reviews can mislead gamers; usually GPU bottlenecks are a bigger concern in practice--but there _is_ a purpose in low resolution testing. This discussion is a perfect example, because it's all about CPU/socket longevity.

Good reviewers will include various resolutions, offering a composite of the CPU's performance profile, and allowing the audience to make its own judgment as to how relevant different aspects of that composite are for their particular use case.  "Hahaha 720p are you kidding," really isn't the rhetorical kill stroke that some people seem to think it is.  Why would W1zzard include 720p numbers at all, if they're so ridiculous?


----------



## FreezingPC (Oct 24, 2022)

Wasteland said:


> Why do you think W1zzard includes 720p numbers at all, if they're so ridiculous?


Because people wouldn't stop asking for them?
Could be one of the reasons... without being the main reason...

----------------------------------------------------------------

At this point, with this much fear for your performance, I would just buy a top-tier gaming-god PC and sell it 11 months down the line to buy the next big thing...
No need for a crystal ball or playing guessing games with 720p and the gaming industry; you just get guaranteed top performance...


----------



## InVasMani (Oct 24, 2022)

To be objective and fair to different scenarios, primarily. No one is saying 720p is bad for testing purposes to gauge limitations. Pretty much everyone agrees, however, that 720p hasn't represented typical desktop gaming on a discrete GPU for a very long time, and it's an outright misrepresentation of it on an RTX 4090, or even on a mid-range GPU today. That's the problem with nonsense CPU bottleneck arguments that don't apply, and won't apply, to the end user and their individual expectations.

The issue is when people make assertions as if they always apply to everyone and are fair, accurate, and representative. If they want to point something out, fine, but when the clear intent is simply to diminish a competing brand from consideration, it's a classic case of defamation of character, so to speak, applied to hardware.

When, no matter what, they spin every argument into something far-fetched and outlandish that tries to steer you toward writing off a competing product, that's a bit of a problem. If it's always "brand A is better than brand B because..." and never mentions any of the areas where brand B competes and is better than brand A, that's clearly bias. There's no question that tech brands don't compete perfectly against one another in every area at all times, yet some people try to construe it that way by painting one brand in a good light while simply disparaging the other, ignoring valid points, or spinning off to another counterpoint just to repeat the same thing again.

Worse still is when questionable data gets brought into the picture that can't be verified, disputed, or compared fairly, because a key detail about the test setup isn't listed, like the actual memory involved rather than just its MT/s, since speed alone doesn't determine memory performance. No one wants to argue in circles with people like that, only to arrive back at "yes, but at 720p, with a GPU neither of us would use, there's more of a bottleneck, so you shouldn't buy what you see as better value; you should buy what I've been shilling instead."

To the dismay of some people, I don't care about 720p results and never will in terms of gaming, because I don't game on integrated graphics. If it were an APU, sure, that would be a valid point; at 720p, a better APU beats an iGPU if I'm willing to run lower-quality graphics, as on many laptops these days.


----------



## 64K (Oct 24, 2022)

720p benches have nothing to do with gaming at 720p.

It's a CPU test.


----------



## fevgatos (Oct 24, 2022)

InVasMani said:


> Even if your CPU lasts at 720p the display won't with a 5080. If you're that concerned about which CPU will last you maybe you should show similar concern for how long the socket will last you as well.


Socket longevity is irrelevant; if a CPU lasts for 5 or 6 years, I won't need to upgrade it on the same socket.



InVasMani said:


> To be objective and fair to different scenarios primarily. No one is saying 720p is bad for testing purposes to gauge limitations. Pretty much everyone however agrees that 720p doesn't represent desktop gaming usage in general on a discrete GPU and hasn't for a very long time and is entirely misrepresentation of it on a RTX 4090 at the same time or even a mid range GPU today. There lies the problem with nonsense CPU bottleneck arguments that don't apply or won't apply to the end user with their individual expectations.
> 
> The issue is when people make assertions as if it always applies to everyone and rings true and is fair accurate or representative. If they want to point something out fine, but when their own intention is very clearly simply diminish the appeal of a competing brand from consideration it's a classic situation of defamation of character so to speak, but applied to hardware.
> 
> ...


And you are completely missing the point. Again. I don't see how this is even debatable. A CPU that is faster at 720p will last you longer; that's just self-evident.


----------



## InVasMani (Oct 24, 2022)

So now it's irrelevant for a CPU, but with memory it's not? Is that how it works now? Apparently a socket isn't a socket. I think you're missing the point: people want something that fits their needs and lasts within their expectations without spending more than they need to. A CPU that is faster at 720p won't necessarily last longer these days purely for gaming; you can get around that with upscaling. If only it made any sense to do so, much like spending significantly more to get little relevant upside in terms of what the individual might need and want.

Everyone's use case differs; that's just self-evident. Someone with a weaker CPU is probably not pairing it with a new top-end GPU unless they already plan to upgrade the CPU soon after. Likewise, in the same scenario they might settle on an upper-mid-range or high-end GPU, be perfectly fine with it, save good money in doing so, and upgrade the CPU later when they feel ready. CPU and GPU don't need to be perfectly synchronous in performance capabilities; sure, it's nicer when they're closer to parity, just like memory ratios, but it isn't mandatory in general.

I don't think I'm missing the point at all. I'm not going to be using 720p now or in the future with my CPU, and a better CPU won't change that favorably. I don't even have a GPU with a quarter of the performance of an RTX 4090, and if I did, the odds of my using 720p would be even worse. I represent about the worst-case scenario, and yet I'd still not use 720p; I'd upgrade the CPU long before that was ever the case, especially paired with an RTX 4090. It's a very nonsense argument if we're merely talking about games and expectations, which was the argument at the time between you and another individual.

And yeah, sure, I'm the one missing the point, on a weaker overall setup than theirs, which I wouldn't consider for myself, if you say so. I could probably postpone a CPU upgrade on what I have, paired with an RTX 4090, for 5 or 6 years strictly for games, short of developers demanding more than 4 threads, and be entirely fine with it. Outside of gaming that wouldn't be the case, but that isn't the argument; and if I really needed and demanded that much MT, I might not be considering a consumer-level CPU in the first place, if I could afford and justify better.


----------



## fevgatos (Oct 24, 2022)

InVasMani said:


> So now it is irrelevant for a CPU, but with memory it is not? Is that now it works now. Apparently a socket isn't socket. I think you're missing the point people want something that fits their needs and last within their expectations without spending more than they need to. A cpu that is faster in 720p won't necessarily last longer these days purely for gaming. You can get around that with upscale. If only it made any sense to do so much like spending significantly more to get little relevant upside in terms of what the individual might need and want.


No, you cannot get around the simple fact that a faster CPU is going to last you longer.


InVasMani said:


> Someone with a weaker CPU is probably not pairing it with new top end GPU unless already have plans to upgrade the CPU later soon after.


And that still doesn't change the fact that a faster CPU will last you longer. Eventually your CPU will bottleneck even a mid-range GPU at 4k. That point comes sooner for a CPU that is slower at 720p than for one that is faster at 720p. That's just... common sense, honestly. I don't see how that's even remotely contestable.


InVasMani said:


> I don't think I'm missing the point at all I'm not going to be using 720p now or in the future with my CPU and a better CPU won't change that favorably. I don't even have a GPU like 1/4 the performance RTX 4090 at the same time.


And then you would end up buying the wrong product. Case in point: CPU A costs 300€ and gets 100 fps at 4k and 150 fps at 720p. CPU B also costs 300€ and also gets 100 fps at 4k, but 200 fps at 720p. If you buy A based on the 4k results, you made a mistake. Simple as that.
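The hypothetical CPU A vs CPU B comparison above can be turned into a small longevity sketch. Assume, purely for illustration, that 4k GPU throughput grows about 40% per GPU generation, and that each CPU's 720p score acts as its frame-rate ceiling:

```python
# Sketch of the "720p predicts longevity" argument, using the invented
# CPUs from the post: both do 100 fps at 4k today, but CPU A tops out at
# 150 fps (its 720p score) and CPU B at 200 fps. The 40%-per-generation
# GPU growth rate is an assumption for illustration, not measured data.

def fps_over_generations(cpu_ceiling, gpu_fps_today, growth=1.4, gens=4):
    """Project observed FPS as GPUs improve while the CPU stays fixed."""
    fps, gpu = [], gpu_fps_today
    for _ in range(gens + 1):
        fps.append(round(min(cpu_ceiling, gpu)))  # slower part sets the pace
        gpu *= growth
    return fps

print("CPU A:", fps_over_generations(150, 100))  # hits its 150 fps wall early
print("CPU B:", fps_over_generations(200, 100))  # keeps scaling with new GPUs
```

Both CPUs look identical on day one; the one with the higher 720p ceiling keeps benefiting from GPU upgrades for longer, which is the whole point being argued.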


----------



## InVasMani (Oct 24, 2022)

You don't get to decide what the right product is for me, nor how long it lasts, nor how I intend to use it. Which one is technically faster at 720p is irrelevant for me, and no, you won't argue me out of that. So just to confirm: DDR4 is irrelevant, yes? Or are you going to be a hypocrite and argue otherwise selectively?


----------



## fevgatos (Oct 24, 2022)

InVasMani said:


> You don't get to decided what is the right product for me nor how it lasts or how I intend to use it. Which is technically faster at 720p is irrelevant for me and no you won't argue that differently for me.


I don't care about the right product for you; nobody is talking about you. You can buy whatever you want. I'm saying that if someone is interested in gaming, 720p results are very relevant because they show longevity. The faster 720p CPU IS going to last longer; that's just a fact, and no matter how hard you try, you can't change facts.


InVasMani said:


> So just to confirm DDR4 irrelevant yes? or are you going to be hypocrite and argue otherwise selectively.


What does DDR4 have to do with anything? I don't understand what you are asking


----------



## medi01 (Oct 24, 2022)

64K said:


> 720p benches have nothing to do with gaming at 720p.
> 
> It's a CPU test.



CPU test of doing what?


----------



## InVasMani (Oct 24, 2022)

I'm done with this circular BS; I'll use the ignore button. He knows damn well what he said above.



fevgatos said:


> Socket longevity is irrelevant, if a cpu lasts for 5 or 6 years, i wont need to upgrade it on the same socket



I guess sockets are magical, and DDR4 slots are much different and more special than CPU sockets: longevity is irrelevant for the latter but not for the former. Intel still supports DDR4, so apparently that's relevant (shill stance for Intel), while the CPU socket isn't (shill stance against the competition). Let's name-call here; I think you know who I put on ignore. Feel free to return the thread to the 13900K before it got derailed into "don't upgrade an AM4 CPU, because at 720p it's a quickie that won't last as long as Intel, because moar cores now matter."


----------



## fevgatos (Oct 24, 2022)

InVasMani said:


> I'm done with this circular bs I'll use the ignore button. He knows damn well he said above.
> 
> 
> 
> I guess sockets are magical and DDR4 sockets are much different and more special than CPU sockets which irrelevant for the latter, but not for the former. Intel supports DDR4 still so apparently it's relevant because shill stance for Intel while CPU socket isn't because shill stance against competition let's name call here I think you know who I put on ignore. Feel free to return the topic thread back to the 13900K before it got derailed somewhere down the line into don't upgrade AM4 CPU because of 720p being a quickie that won't last as long as Intel because moar cores now matters.


I don't think I ever mentioned DDR4, so I'm not sure what the heck you are talking about.

I never said "don't upgrade AM4 because 720p is quicker" either; I really have no clue what you're talking about.

Actually, I never brought up AMD or Intel; I'm just saying that your "720p is useless" stance is wrong. That's all. Apparently you want to turn it into an AMD vs Intel thing for fanboyism reasons; don't let me stop you, just don't involve me in it. I only care about performance and price; I'm brand agnostic. Apparently you aren't. Well... too bad for you, I guess.


----------



## ratirt (Oct 24, 2022)

fevgatos said:


> And im saying, they are not comparable. Its like comparing the 3d to a 7950x. The 3d is closer to the 12600kf, both in games and in other workloads. At least thats what TPU shows. At which point I don't see the huge benefit of mobo upgradability, since the 12600kf with a brand new mobo costs as much as the 3d on its own


Why aren't these comparable? You can compare price to gaming performance, for instance. Why not? The X3D is a very capable gaming CPU, and it costs less than either the 7950X or the 13900K. It is literally trading blows with the 12900K, which is pretty good if you ask me. You can't dictate which products can be compared with each other and which can't.
I can bet you there are plenty of people who appreciated the comparisons of the 12900K, 7950X, 13900K and others against a 5800X3D for gaming purposes.
If I were to upgrade my system or platform now for gaming, I'd consider the 5800X3D for sure, especially due to its low power usage compared to the 13900K and 7950X. Even the 13600K uses more power than a 5800X3D in games on average, and it's slower in general. Not to mention, the 5800X3D would be the cheapest of all the current options and noticeably faster than my current CPU.


----------



## fevgatos (Oct 24, 2022)

ratirt said:


> Why aren't these comparable?


Cause the performance is vastly different; the 13600K is what, over 50% faster in MT and over 20% in ST? If you are fine with the 5800X3D's performance, then you shouldn't be looking at a 13600K in the first place, since it's way, way faster. If you are purely interested in gaming CPUs, wait for the lower-end parts from Intel, the 7600X from AMD, or last gen's 12600K.

The 3D is laughably expensive; you have to pay for that mobo-upgradability feature that AMD fans keep talking about. Anyway, at EU pricing, a 13600KF plus a cheap B660 costs 60-70€ more than the 5800X3D on its own. Which one you consider better is up to you.


----------



## 64K (Oct 24, 2022)

medi01 said:


> CPU test of doing what?



I will try to explain. The 720p benches were in a CPU review, not a GPU review. The idea is to find out how many FPS a CPU could deliver if the GPU were not slowing things down rendering frames. Once you know the maximum FPS possible from the CPU, you can better judge its potential right now and its future potential as GPUs become more powerful.

You would not test a CPU at higher resolutions to find the maximum FPS possible, because then the GPU would be slowing things down rendering frames.


----------



## fevgatos (Oct 24, 2022)

64K said:


> I will try to explain. The 720p benches were in a CPU review and not a GPU review. The idea is to find out how many FPS a CPU could give if the GPU was not slowing things down rendering frames. Once you know the maximum FPS possible of the CPU then you can better know it's potential right now and future potential as GPUs become more powerful.
> 
> You would not test a CPU at higher resolutions to find the maximum number of FPS possible because then the GPU would be slowing things down rendering frames.


Case in point: someone looking only at 4k results would buy a 9100F, as it performs identically to a 12900K there. Then he upgrades to a new GPU and realizes his 9100F is severely bottlenecking his shiny $1-2k graphics card.


----------



## ratirt (Oct 24, 2022)

fevgatos said:


> Cause the performance is vastly different, the 13600 is what, over 50% faster in MT and over 20% in ST? If you are fine with the 5800x 3d's performance then you shouldn't be looking at a 13600 in the first place, since it's way way faster. If you are purely interested in gaming CPU's wait for the lower end parts from Intel, the 7600x from AMD or last gen's 12600k.


OK, MT and ST are fine, but I've been talking about gaming, which is basically what these processors are for, aren't they? I'm sure the X3D is for gaming. You need some measurable MT and ST performance, but at this point it's not a deal breaker that the 5800X3D is slower in MT if your aim is gaming.
Expensive, you say? Where? Show me.
Here is something from GN



Yeah, the 5800X3D is pricier than a 13600K, but only against a DDR4 RAM combo, and that combo is unfortunately a lot slower in games than a 5800X3D.
You would have to go DDR5, but that would be pricier than a 5800X3D combo. Not to mention, I already have the board, so I don't need anything except the CPU.
Here is something from HWUB about the 13600K's performance to illustrate what I'm talking about.




If you are doing some serious MT or ST workload, then sure, the 13600K would be better, but that CPU alone isn't meant for heavy MT and ST workloads, and it isn't a monster at those. If you really need something for MT crunching, you would go with something different.
Solely for gaming, the X3D is a better option. In normal day-to-day tasks you won't see the difference between the two when you surf the net, watch YT, or whatever else there is. The only added value the 13600K has is the possibility to upgrade later to something stronger, and then poof, MT and gaming performance go up (a 13700K, for instance), but you'd want DDR5 for that anyway, and that is expensive nonetheless.
Maybe the 13600 or 13500 turn out to be cheap gaming-only alternatives for new platform buyers; for a Ryzen system owner, they might not be so great.


----------



## fevgatos (Oct 24, 2022)

ratirt said:


> OK MT and ST is fine but I been talking about gaming which basically these processors are for arent they? I'm sure the  x3d is for gaming. You need to have some measurable MT and ST performance but at this point it is not a deal breaker that the 5800x3d is slower in MT if you aim is gaming.


If your aim is only gaming, there are much better value-for-money options than the 5800X3D.



ratirt said:


> Expensive you say? where show me.


The 3D on its own, yes, it's super expensive. You don't even have to compare it to Intel, which is decently priced. Even against the overpriced Zen 4 CPUs, the 3D is wildly overpriced. Think about it: a 7600X is faster in games, way faster in ST workloads and equal in MT workloads. And yet, even though it's already overpriced, it's cheaper than the 3D; in fact, it's 100€ cheaper. You might argue that AM4 motherboards are cheaper, but that's exactly my point: you are paying for the mobo upgradability, it's not free. Instead of paying for a new motherboard, you are overpaying for the CPU.



ratirt said:


> Yeah the 5800x3d is pricier than a 13600K but only with DDR4 Ram combo. That combo is slower in games by a lot than a 5800x3d unfortunately.
> You would have to go DDR5 but that would be pricier than a 5800x3d combo. Not to mention, I already have the board so I don't need anything except the CPU.
> Here is something from HWUB about the performance of the 13600k to illustrate what I'm talking about.



What do you mean by a lot? The difference in gaming performance is 5%, yet the 13600KF is 90€ cheaper (359€ vs 449€ at a big EU retailer). With that price difference you could get faster RAM, for example, and tie it in gaming, while still getting vastly better MT and ST performance and a better upgrade path.


ratirt said:


> If you are doing some serious MT or ST workload than sure 13600K would be better but that CPU alone is not for those MT and ST workloads which it isn't a monster at those. If you really need something for MT crunch etc you would go with something different.


I disagree. The 13600KF IS a monster at those. It's the 3rd-fastest CPU in ST performance and faster than a 5900X and a 7700X in MT performance. It's pretty freaking strong, actually.
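The price/performance arithmetic being argued in this exchange is easy to make explicit. Taking the figures quoted above (359€ vs 449€, roughly a 5% gaming gap) as illustrative inputs rather than verified benchmarks:

```python
# Relative gaming performance per euro, using the thread's quoted prices
# and the ~5% gaming gap as stand-in numbers (illustrative, not verified).

def perf_per_euro(relative_perf: float, price_eur: float) -> float:
    """Higher is better: relative performance points per euro spent."""
    return relative_perf / price_eur

x3d_value = perf_per_euro(100.0, 449.0)  # 5800X3D as the 100% baseline
i5_value = perf_per_euro(95.0, 359.0)    # 13600KF, ~5% behind in games

print(f"5800X3D: {x3d_value:.3f} rel-perf/EUR")
print(f"13600KF: {i5_value:.3f} rel-perf/EUR")
# On these inputs the cheaper chip wins on this metric; whether a 5%
# frame-rate gap matters to you is exactly the judgment call in dispute.
```

Of course the metric ignores platform cost (board, RAM), which is the other half of the argument in this thread, so it settles less than it appears to.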


----------



## AusWolf (Oct 24, 2022)

Wasteland said:


> Forget the resolution.  The point isn't that anyone will actually use a 720p display; the point is that low resolution testing removes GPU bottlenecks.  This is useful both for determining a CPU's useful lifespan, and as a proxy for heavily CPU-limited games that might be impractical to benchmark (e.g. certain online games).  I'm usually among the first to say that CPU reviews can mislead gamers; usually GPU bottlenecks are a bigger concern in practice--but there _is_ a purpose in low resolution testing. This discussion is a perfect example, because it's all about CPU/socket longevity.
> 
> Good reviewers will include various resolutions, offering a composite of the CPU's performance profile, and allowing the audience to make its own judgment as to how relevant different aspects of that composite are for their particular use case.  "Hahaha 720p are you kidding," really isn't the rhetorical kill stroke that some people seem to think it is.  Why would W1zzard include 720p numbers at all, if they're so ridiculous?


I guess this is on reviewers, in the sense that it usually isn't explained well. So naturally the majority of people have no idea why 720p tests exist at all, and of course they retort with "no one plays at 720p anymore, duh." Reviews should state that it's meant to gauge longevity, not actual present-day gaming performance. To be honest, even I didn't know until not too long ago, even though it's logical as heck.


----------



## fevgatos (Oct 24, 2022)

AusWolf said:


> I guess this is on reviewers in a sense that it isn't explained well, usually.


No, trust me, that's not the issue. It's fanboys (usually of one specific company, won't name which) who know why 720p tests exist but come up with all kinds of excuses when the specific company they support is losing in them. Guru3d (I'm not sure, but I think that was the one) was forced to REMOVE (I'm not even kidding you) 720p results from one of their reviews because supporters of that same company went freaking nuts in the comments section when their favorite company's CPU was losing badly. Holy freaking cow.


----------



## ratirt (Oct 24, 2022)

fevgatos said:


> If your aim is only gaming there are much better vfm options than the 5800x 3d.


Probably, but this one is better in price-to-performance than a 13600K.


fevgatos said:


> The 3d on it's own yes, it's super expensive. You don't even have to compare it to Intel that are decently priced. Even against the overpriced Zen 4 cpus, the 3d is wildly overpriced. Think about it, a 7600x is faster in games, way faster in ST workloads and equal in MT workloads. And yet, even though it's already overpriced, it's cheaper than the 3d. In facts, it's a 100€ cheaper. You might argue that Am3 motherboards are cheaper, but then that's exactly my point, you are paying for the mobo upgradability, it's not free. Instead of paying for a new motherboard, you are overpaying for the CPU.


Not really. The 12900K is super expensive. The X3D is fairly cheap if you buy a combo, even compared with a 13600K; I guess you have omitted the screenshots I posted. If you have an AM4 board, then it becomes even cheaper. Fact is, it's a great gaming CPU, and it has no equal in performance per watt or performance per $. The downside is the lack of an upgrade path, but for what it is and what it offers, it will last you a long time. Long enough that you can literally wait until you change your entire rig.


fevgatos said:


> What do you mean by a lot? The difference in gaming performance is 5%, yet the 13600kf is 90€ cheaper (359€ vs 449€ in big EU retailer). With that price difference you could get faster ram for example and tie it in gaming, while still offering vastly better MT and ST performance and a better upgrade path.


Either way it is faster, and by that logic there is no point in replacing anything, since any noticeable difference can be seen only with stupidly fast graphics and at the ridiculous resolution of 1080p.


fevgatos said:


> I disagree. The 13600kf IS a monster at those. It's the 3rd fastest CPU in ST performance and faster than a 5900x and a 7700x in MT performance. It's pretty freaking strong actually


Not saying its performance sucks, since it is approaching the 12900K's performance, and that is huge, but there is better, like the 13700K, which is much more capable at MT; if MT is what you care about, I think you should go with that one. It surpasses the 12900K, so that is a great upgrade if MT workloads are what you are looking for.


----------



## fevgatos (Oct 24, 2022)

ratirt said:


> not really. 12900K is super expensive. The x3d is fairly cheap if you buy a combo even if you compare with a 13600k I guess you have omitted screenshots I have posted, If you have and AM4 board than it becomes even cheaper.


The 12900K is twice as fast in multithreading performance; again, you are comparing completely different products. That's like saying the 12900K is good value for money because it's much cheaper than the Threadripper 7990wx. I mean... what?



ratirt said:


> Fact is, it is a great gaming CPU and it has no equal in performance per wat or performance per $.


No it does not. That's just absolutely false.

These are the results from the TPU review:

https://tpucdn.com/review/intel-core-i9-13900k/images/relative-performance-games-1280-720.png

But even if we go by the ones you posted from Hardware Unboxed, it definitely isn't good in performance per $; I don't even know why you would say something like that. The vast majority of CPUs have better performance per $, even if we are just talking about gaming. HECK, the insanely overpriced 7600X has way better performance per $ in games. LOL


ratirt said:


> Either way it is faster and by that logic there is no point for replacing anything since any noticeable difference can be seen only with stupidly fast graphics and at ridiculous resolution of 1080p.


No, it is not. The 13600kf is faster when you pair it with ddr5. Actually according to TPU it's faster with ddr4 as well.


----------



## Wasteland (Oct 24, 2022)

AusWolf said:


> I guess this is on reviewers in a sense that it isn't explained well, usually. So naturally, the majority of people have no idea why 720p tests exist at all and of course they retort with "no one plays at 720p anymore, duh". Reviews should state that it's to test lifespan longevity, and not actual present-day gaming performance. To be honest, even I didn't know until not too long ago, even though it's logical as heck.



Yeah I don't think it's made explicit often enough.  FWIW, what I like to do is look at the CPU bottlenecked scores out of what you might call academic or long-term interest, and then shift over to the 4k scores as a sanity check.  At least up until the launch of the 4090, 4k benchmarks were reliably GPU limited, and thus they were a good proxy for, "ok, if I pair one of these mid-range CPUs with a sensible GPU, this is what I'm most likely to see most of the time, i.e. no effective difference."

I think it's still generally true that CPU upgrades are overrated for gamers, though that was easier to argue back when everyone was targeting 60 Hz.  Certainly most gamers don't need a CPU upgrade with every new generation, nor even every two or three or four.  "Good enough" is the only target that matters; it can be difficult to keep that in mind if you immerse yourself in tech-enthusiast news/discussion.

Bottom line is buy the fastest CPU you can for your preferred price point and ride it until its performance becomes noticeably disappointing.



fevgatos said:


> No, trust me, that's not the issue. It's fanboys (usually of one specific company, won't name which), that know why 720p tests exist, but they come up with  all kinds of excuses when that specific company they support is losing in them. Guru3d (i'm not sure but i think that was the one) was forced to REMOVE (im not even kidding you) 720p results from one of their reviews cause supporters of that same company went freaking nuts in the comments section cause their favorite company's CPU was losing badly. Holy freaking cow



Come on now.  This isn't a partisan issue, and no sociopathic mega-corporation deserves a free pass.  While we're on the subject of CPU value, you could just as easily say that without competition from that "one specific company," the other big company would still be pumping out 4-core i7s with a 5% generational uplift every year or two.

The objections to low-resolution CPU testing go back way farther than you suggest, and they have nothing to do with pro-AMD or pro-Intel factions.  Bringing fanboy wars into any discussion is extreme cringe, bro.  Can't imagine hitching my wagon to some faceless globocorp.  Good god.


----------



## fevgatos (Oct 24, 2022)

Wasteland said:


> Come on now.  This isn't a partisan issue, and no sociopathic mega-corporation deserves a free pass.  While we're on the subject of CPU value, you could just as easily say that without competition from that "one specific company," the other big company would still be pumping out 4-core i7s with a 5% generational uplift every year or two.


I wasn't criticising the corporation though, just the fans. Funny though, how did you realize which one I was talking about?


----------



## Mussels (Oct 25, 2022)

fevgatos said:


> It's not, you just don't understand the point. Going by 4k results, I should buy a 3600x. It performs almost identical to a 13900k and it costs like 1/4th of the price. What happens next year when I replace my 3080 with a 5080 though? Exactly. So that's why im looking at 720p results, to know what's gonna happen next year / which  cpu will last me
> 
> 
> And im saying, they are not comparable. Its like comparing the 3d to a 7950x. The 3d is closer to the 12600kf, both in games and in other workloads. At least thats what TPU shows. At which point I don't see the huge benefit of mobo upgradability, since the 12600kf with a brand new mobo costs as much as the 3d on its own



First part: if you had a 3600X, you don't need to upgrade yet. If you're buying new, you'd want that headroom for sure.

I'm not confident in TPU's results here; other websites show really different results for the X3D. It may come down to what's being measured, since there are no 0.1% lows.

The X3D has also had major price drops, and it's the flagship of the series. Imagine if you could slap a 13400F into your 9th-gen mobo; that's what's on offer here.


----------



## Thorsthimble (Oct 25, 2022)

FeelinFroggy said:


> It's probably getting time to upgrade my 8700k.  The new AMD CPUs are very good and Intel still holds a small advantage in gaming performance.  But the elephant in the room is the potential 78003dx.  For strictly gaming (which is me) I think I'll wait till it comes out (thought it was supposed to be November) before I make a choice on an upgrade.
> 
> 
> Do you run your rig at 100% 24/7?  Unless you are a power user and do a lot of content creation where the system is running full peak all the time any difference on your electric bill will be negligible compared to whatever you currently run.
> ...


Yep, sure thing boss. If you say so.


----------



## ratirt (Oct 25, 2022)

fevgatos said:


> Τhe 12900k is twice as fast in multithreading performance, again - you are comparing completely different products. That's like saying the 12900k is good value for money cause it's much cheaper than the Threadripper 7990wx. I mean..what?


Yes, it is faster, but I was talking specifically about gaming. The 12900K is hardly a gaming CPU, but it's an MT CPU for sure.


fevgatos said:


> No it does not. That's just absolutely false.
> 
> These are the results from the TPU review
> 
> ...


Yes, the 13900K is faster, and it costs way more, with all the mobos and DDR5 and shit; even with DDR4 it is still $350 more (if I'm not mistaken). The 5800X3D is faster than a 13600K and can be cheaper; I posted pictures from HWUB and GN. Why are we talking about this again? The 5800X3D is a better option for gaming than the 13900K: it costs way less, and even though it is a tad slower, the difference isn't huge. Not to mention, the power consumption of the 13900K is literally twice that of the 5800X3D, and even the 13600K draws more gaming power than the X3D. So I'd pick the X3D for myself anytime instead of moving to Intel.


fevgatos said:


> No, it is not. The 13600kf is faster when you pair it with ddr5. Actually according to TPU it's faster with ddr4 as well.


It's faster in those workloads if equipped with DDR5, but in gaming, according to HWUB's 12-game test, the 5800X3D is faster.
Even paired with DDR5, it's not faster than the 5800X3D in gaming.


----------



## AusWolf (Oct 25, 2022)

ratirt said:


> Yes it is faster but I was talking specifically gaming. 12900K is hardly a gaming CPU but MT CPU for sure.
> 
> Yes 13900K is faster and it costs way more. With all the mobos and DDR5 and shit. even DDR4 it is still $350 more (if I'm not mistaken). 5800x3d is faster than 13600k and can be cheaper. I posted pictures from HWUB and GN. Why are we talking about this again? 5800x3d is a  better option for gaming than 13900K. It costs way less and even though it is a tad slower it does not make a huge difference. Not to mention, the power consumption of the 13900k is literally twice of 5800x3d. 13600k has a higher gaming power consumption than x3d. So, I'd pick x3d for myself anytime instead of moving to Intel.
> 
> ...


Not to mention the extra cost of buying a new motherboard and DDR5 RAM. That's what upgrade costs ultimately come down to these days and that's why AM5 isn't selling well. I bet the vast majority of 12th (and now 13th) gen Intel Core systems out there are using DDR4.


----------



## Richards (Oct 25, 2022)

fevgatos said:


> No, trust me, that's not the issue. It's fanboys (usually of one specific company, won't name which), that know why 720p tests exist, but they come up with  all kinds of excuses when that specific company they support is losing in them. Guru3d (i'm not sure but i think that was the one) was forced to REMOVE (im not even kidding you) 720p results from one of their reviews cause supporters of that same company went freaking nuts in the comments section cause their favorite company's CPU was losing badly. Holy freaking cow


The red cult lol


----------



## fevgatos (Oct 25, 2022)

Mussels said:


> First part: if you had a 3600x, you dont need to upgrade yet. If you're buying new, you'd want that headroom for sure.
> 
> I'm not confident with TPU's results here - other websites show really different results for the x3D, it may be down to whats being measured since theres no 0.1% lows
> 
> the x3D also had major price drops and is the flagship of the series, imagine if you could slap a 13400f into your 9th gen mobo - that's whats on offer here


Sure, but imagine if the 13400F cost 400€ in order to let you slap it in there. That's what I'm saying: the 3D is nice, but judged on its price as a standalone CPU, it's way overpriced. Think about it: it's more expensive (by 100€) than the already overpriced 7600X, and it loses in everything, even games. So upgradability costs; the cost is included in the price of the CPU. Otherwise the 3D should be cheaper than the 7600X, right?

It's funny that you're not confident in TPU's results. Remember what you told me back when I said the same about the 12900K power-limited numbers?


----------



## AusWolf (Oct 25, 2022)

fevgatos said:


> Sure but imagine if the 13400f cost 400€ in order to allow you to slap the 13400f in there. I mean, that's what Im saying, the 3d is nice, but judging the price as a standalone CPU, its way overpriced. Think about it, it's more expensive (by 100€) than the already overpriced 7600x, and it loses in everything, even games. So upgradability costs - the cost is included in the price of the CPU, otherwise the 3d should be cheaper than the 7600x, right?


I agree that the x3d is overpriced, but its main selling point is not having to buy a motherboard and RAM if you're already on AM4. The 7600X + motherboard + DDR5 RAM combo is way more expensive. If you're on an older DDR4 Intel platform, 13th gen Core i5 with a DDR4 motherboard is a good option.


----------



## W1zzard (Oct 25, 2022)

Good news, I think I found a fix for the low Cinebench score (and possibly others). Doing more testing and will update the review soon, I hope


----------



## Max(IT) (Oct 26, 2022)

Could you please clarify what you mean by "stock" settings? I see a stock power consumption of 283W, while Intel's stock setting should be 253W for the 13900K. Thank you.


----------



## W1zzard (Oct 26, 2022)

Max(IT) said:


> Could you please clarify what do you mean with “stock” setting ? because I see a stock power consumption of 283W while Intel stock settings should be 253W for the 13900K. Thank you.


PL1 = 253, PL2 = 253. Do you have these values too?


----------



## Taraquin (Oct 26, 2022)

I wonder a bit: in your gaming tests the 13900K performs about 10% better than the 7950X, but in all the other reviews I have read, they perform very similarly, or the 13900K is less than 5% faster. Why does TPU get better results for the 13900K than others?


----------



## fevgatos (Oct 26, 2022)

Taraquin said:


> I wonder a bit, in gaming 13900K performs about 10% better than 7950X, but in all other reviews I have read they perform very similar or 13900K is less than 5% faster. Why do TPU get better results for 13900K than others?


They don't. Club386 tested with a 4090, and wherever there is no GPU bottleneck the difference is 25%.


----------



## Taraquin (Oct 26, 2022)

fevgatos said:


> They dont. Club386 tested with a 4090 and wherever there is no gpu bottleneck the difference is 25 %


I know the difference will increase with a 4090, but TPU uses a 3080; those are odd results compared to other test sites. I know the 13900K is faster, not doubting that. I just find it a bit strange that TPU gets so much of a difference with a significantly slower GPU than the 4090 when others using similar GPUs don't.

I seriously doubt that there will generally be a 25% difference when there is no GPU bottleneck; in some games yes, but not all. Some games fare better on the 7950X vs the 13900K as well, but the majority prefer the 13900K.



fevgatos said:


> Sure but imagine if the 13400f cost 400€ in order to allow you to slap the 13400f in there. I mean, that's what Im saying, the 3d is nice, but judging the price as a standalone CPU, its way overpriced. Think about it, it's more expensive (by 100€) than the already overpriced 7600x, and it loses in everything, even games. So upgradability costs - the cost is included in the price of the CPU, otherwise the 3d should be cheaper than the 7600x, right?
> 
> It's funny how you are not confident with TPU's results, remember what you told me back when I said the same about the 12900k power limited numbers?


Remember that RAM and motherboards cost a lot more for the 7600X than for the 5800X3D. Where I live, the cheapest option for the 7600X costs 800 USD (400+250+150) vs 650 USD (500+70+80) for the 5800X3D. I would pick the 3D any day with that pricing, but I prefer to wait for the 7000-series 3D.


----------



## Max(IT) (Oct 26, 2022)

W1zzard said:


> PL1 = 253, PL2 = 253. Do you have these values too?


I was referring to your test



Why are you seeing 283W at stock if PL2 is 253W?


----------



## Solid State Brain (Oct 26, 2022)

TechPowerUp's CPU-only power measurements are taken from the motherboards' 12V inputs, so they include the power consumption of motherboard components as well, notably the VRMs (which generate heat under load = wasted power).


----------



## W1zzard (Oct 26, 2022)

Max(IT) said:


> I was referring to your test
> 
> View attachment 267282
> 
> Why are you seeing 283W at stock if PL2 is 253W ?


The CPU sensor isn't 100% accurate, and my measurement also includes losses through the VRM, because I measure on the ATX 12 V power cables of the mainboard. It's still much better than relying on the CPU's own sensors, which are inaccurate, not calibrated, vary between batches, and vary between AMD and Intel.


----------



## Nopa (Oct 26, 2022)

W1zzard said:


> CPU sensor isn't 100% accurate, my measurement also includes losses through the VRM, because I measure on the ATX 12 V power cables of the mainboard. It's still much better than relying on the CPU's own sensors which are inaccurate, not calibrate, vary between batches, and vary between AMD/Intel


Then every website/reviewer has a different way of reporting CPU power consumption, hence the wattage differences (different testing software and components). It's fair to say all of their reports are right in their respective ways, and readers should do heavy research before deciding what is what.


----------



## Max(IT) (Oct 26, 2022)

W1zzard said:


> CPU sensor isn't 100% accurate, my measurement also includes losses through the VRM, because I measure on the ATX 12 V power cables of the mainboard. It's still much better than relying on the CPU's own sensors which are inaccurate, not calibrate, vary between batches, and vary between AMD/Intel


Thank you. I wasn't expecting a 30W difference anyway. That's interesting.


----------



## Solid State Brain (Oct 26, 2022)

Max(IT) said:


> Thank you. I wasn't expecting a 30W difference anyway. That's interesting.



Some VRMs supposedly only have efficiencies in the 80-85% range under full load, so if anything I would have expected an even larger difference.
253W / 0.825 = 306.7W
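That back-of-the-envelope arithmetic can be sketched as a quick script. The efficiency figures here are illustrative assumptions, not measured values for any particular board:

```python
# Power measured at the ATX/EPS 12 V cables includes VRM conversion
# losses, so it reads higher than the CPU's package power limit.

def power_at_12v_cables(package_power_w: float, vrm_efficiency: float) -> float:
    """Power drawn from the 12 V rails for a given CPU package power."""
    return package_power_w / vrm_efficiency

pl2 = 253.0  # Intel's stock PL2 for the 13900K, in watts
for eff in (0.90, 0.85, 0.825):
    print(f"VRM efficiency {eff:.1%}: {power_at_12v_cables(pl2, eff):.1f} W at the cables")
```

At ~90% efficiency the 12 V side lands around 281 W, close to the ~283 W reading discussed above; at 82.5% it would be closer to 307 W.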


----------



## `Orum (Oct 26, 2022)

Phoronix finally got around to posting their review of the 13900K, and perhaps most interesting to me was the discrepancy between their SVT-AV1 numbers and yours. I realize there are a lot of variables between the two tests, e.g. a different OS, and Phoronix didn't test 4K at preset 10 (the default) as you did. However, the difference is so large it got me wondering: was your SVT-AV1 built with AVX-512 support?

Note that this has to be explicitly enabled when compiling it, even if you're doing a standard release build.

*Edit:* Just found this article which might also explain some or all of the discrepancy.
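For anyone building it themselves, a minimal sketch of what enabling that looks like, assuming the `ENABLE_AVX512` CMake option SVT-AV1 exposes (off by default, so a plain release build ships without the AVX-512 code paths):

```shell
# Sketch: build SVT-AV1 with AVX-512 kernels compiled in.
# ENABLE_AVX512 is assumed from the project's CMake options; paths may vary.
git clone https://gitlab.com/AOMediaCodec/SVT-AV1.git
cd SVT-AV1
cmake -B build -DCMAKE_BUILD_TYPE=Release -DENABLE_AVX512=ON
cmake --build build -j"$(nproc)"
```

Even then, the encoder only uses those kernels on CPUs that report AVX-512 support, which 13th-gen parts don't with E-cores enabled.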


----------



## W1zzard (Oct 27, 2022)

This review has been updated with new performance numbers for the 13900K. Due to an OS issue the 13900K ran at lower than normal performance in heavily multi-threaded workloads. All 13900K test runs have been rebenched


----------



## ratirt (Oct 28, 2022)

`Orum said:


> Phoronix finally got around to posting their review of the 13900k, and perhaps most interesting to me was the discrepancy between their SVT-AV1 numbers and yours.  I realize there are a lot of variables here between the two tests, e.g. different OS and Phoronix didn't test 4K at preset 10 (the default) as you did.  However, the difference is so large it got me to wondering, was your SVT-AV1 built with AVX-512 support?
> 
> Note that this has to be explicitly enabled when compiling it, even if you're doing a standard release build.
> 
> *Edit:* Just found this article which might also explain some or all of the discrepancy.


Yeah, I have seen those controversial articles stating, once again, scheduler problems with AMD CPUs. To be fair, you can see those in the reviews with the 7950X and 7900X.
I only hope these get fixed, because I'm kinda tired of reviews that are misleading due to some Windows limitation or behavior that drastically hurts performance for one product, making the other look better even when that's not the reality.


----------



## Max(IT) (Oct 28, 2022)

W1zzard said:


> This review has been updated with new performance numbers for the 13900K. Due to an OS issue the 13900K ran at lower than normal performance in heavily multi-threaded workloads. All 13900K test runs have been rebenched


Now the results make sense: 36K points in Cinebench was just weird.


----------



## W1zzard (Oct 28, 2022)

Max(IT) said:


> now results make sense: 36K points in Cinebench were just weird.


Yeah, and thanks to everyone who reported this


----------



## rbgc (Nov 2, 2022)

This 13900K power consumption is still nothing. The "power consumption fun" will start when they introduce the 6 GHz 13900K* (7800X3D "killer") in January.


----------

