# RTX 4090 & 53 Games: Core i9-13900K vs Ryzen 7 5800X3D



## W1zzard (Nov 4, 2022)

Intel's new 13900K offers amazing gaming performance, thanks to improvements to caches, IPC and higher operating frequencies. But is that enough to beat the 3D V-Cache-powered AMD Ryzen 7 5800X3D? And what about platform cost?



----------



## potsdaman (Nov 4, 2022)

Thanks W1zzard for such a comparison and all the work, very nice to see that AM4 still rocks!)))


----------



## Taraquin (Nov 4, 2022)

The 5800X3D is hands down the best bang-for-buck gaming CPU at the moment, considering the price of motherboards and RAM.


----------



## SeventhReign (Nov 4, 2022)

Yeah, let's compare a CPU that literally no one in their right mind would EVER buy for gaming. EVER. To a CPU that can literally be used for nothing BUT gaming. That's genius right there, fellas. While we're at it, I think we should drag race a bicycle vs a Top Fuel Dragster.


----------



## Vecix6 (Nov 4, 2022)

Nice article. It's curious to see that Halo Infinite, which gained 50% more performance from 3D V-Cache in the last article, still falls on Intel Raptor Lake's side here.

There's still a typo to fix in all the graphs: they say "Ryzen 7 5800X vs Ryzen 7 5800X3D". I suppose they were recycled from the latest TPU-50 article.


----------



## zlobby (Nov 4, 2022)

Hmm, am I the only one who misses the low end of the FPS? For actual game 'smoothness', the lower percentiles matter way more than the max FPS number. Great job nonetheless, @W1zzard


----------



## P4-630 (Nov 4, 2022)

I think you should compare the 5800X3D with a previous-gen Intel CPU, maybe an i7-12700K, or wait for the 7800X3D and compare that with 13th-gen Intel.


----------



## gffermari (Nov 4, 2022)

The results are as expected.

The big win for consumers is that whatever you prefer, there is a product to cover you.
You want Intel? 13600K.
You want AMD? 7600X.
You have an older AM4 motherboard? 5800X3D.

Whichever you choose, it's at the top of the charts. You can't go wrong no matter what.
If only we had the same competition in GPUs...


----------



## Tek-Check (Nov 4, 2022)

gffermari said:


> You want AMD. 7600X.


This one is currently terrible value for the money.


----------



## applejuice (Nov 4, 2022)

SeventhReign said:


> Yeah lets compare a CPU that literally no one in their right mind would EVER buy for gaming.  EVER. To a CPU that can literally be used for nothing BUT gaming.  Thats genius right there fellas.  While we're at it I think we should drag race a bicycle vs a Top Fuel Dragster.


Obvious bait is obvious. 0/13


----------



## Vecix6 (Nov 4, 2022)

gffermari said:


> The results are as expected.
> 
> The big win for the consumers is that whatever you prefer, there is a product to cover you.
> You want Intel. 13600K.
> ...



Agree 95%. Thinking of the future, I'd prefer the 7700X over the 7600X for its core count.


----------



## gffermari (Nov 4, 2022)

Tek-Check said:


> This one is currently a terrible value for buck.



The buyer sees value differently.
Some prefer the 13600K for its top scores in everything, while others could sacrifice MT performance now in order to get a platform that will last 4-5 years.
Still, both are correct and both win.

Personally I can't decide between the 13600K/13700K top all-rounders and a 7700X with an upgrade path to future 3D parts.
I might roll a die....


----------



## ratirt (Nov 4, 2022)

P4-630 said:


> I think you should compare the 5800x3d with a previous gen intel, so maybe with an i7 12700K or wait for 7800x3D and compare that with a 13th gen intel.


Why not this comparison? I think it says a lot. This is the best Intel CPU, so why not compare against the best? It only shows how expensive the 13900K is for gaming, considering its price. You'd be better off with a 5800X3D.


----------



## dir_d (Nov 4, 2022)

I have a 5600X at 4K 120 Hz, I must not buy a 5800X3D... So tempting at $329.


----------



## siluro818 (Nov 4, 2022)

Now where is that guy who was trying so damn hard to convince everyone that if you're on a 2700X on AM4, you should just drop everything and buy a 13600K + mobo & DDR5 kit, because in 10 (!) years it would be great at 4K lmfao


----------



## dirtyferret (Nov 4, 2022)

@W1zzard do you sleep?  Didn't you just finish a similar comparison a few days ago?


----------



## Fluffmeister (Nov 4, 2022)

dirtyferret said:


> @W1zzard do you sleep?  Didn't you just finish a similar comparison a few days ago?



German efficiency my friend.


----------



## rusTORK (Nov 4, 2022)

@W1zzard , typo in Test system:
Processor: AMD Ryzen 7 5800XD, Stock


----------



## W1zzard (Nov 4, 2022)

Vecix6 said:


> Nice article its curious to see that Halo Infinite who took 50% more performance thanks to 3DV cache from last article now still goes to the side of Intel Raptor Lake.
> 
> Still need to fix a typo in all graphs: "Says Ryzen 7 5800X vs Ryzen 7 5800X3D" I supose they are recicled from latest TPU-50.


As indicated, only the title is wrong; the benchmarks are correct. I noticed this just today and can't fix it right now... I'm on the plane, ready to take off.


----------



## Count Shagula (Nov 4, 2022)

Power usage should also be considered when both CPUs have similar performance. You'd be mad to move to Raptor Lake, especially for gaming, if you're already on AM4 and have a decent motherboard and RAM.


----------



## ZeppMan217 (Nov 4, 2022)

Count Shagula said:


> The power usage should also be considered when both cpu's have similar performance. You'd be mad to move to raptor lake for gaming if your already on AM4 and have a decent motherboard and ram


Wtf.


----------



## jallenlabs (Nov 4, 2022)

Thanks for this article, I was waiting for this one. Well done. I'm guessing that AMD's 3D upgrade this generation will crush Intel in gaming. Oh well, my gaming rig is still an 11700K.


----------



## Vario (Nov 4, 2022)

Nice. Good showing by the 5800X3D, and it's now around $330 at a few retailers; it keeps falling in price. I think the 7x00X3D will be a long-lasting performer.


----------



## Garrus (Nov 4, 2022)

The most important subset of these 53 games, to me, is the "low frame rate games". I would like to see the bottom 25 percent of games broken out of the group of 53 (the 13 lowest-FPS titles, for example), so we can see what difference the CPU makes in games that run slowly. I don't need a CPU upgrade to get from 400 to 600 FPS; I would like to bring 100 FPS titles up to 150 FPS, etc.

Very funny that out of all the games with large differences at 4K, DMC5 and Civilization 6 are the only ones I play, and they are both heavily AMD-favored. Upgrading to Intel would slow me down. Intel has to get fixes out for those two games! Is the scheduler still not working right?


----------



## Tek-Check (Nov 4, 2022)

gffermari said:


> Personally I can't decide between a 13600K, 13700K top all arounders and a 7700X with upgrade path to future 3Ds.
> I might use a dice....


Never roll the dice. It depends on what you need the CPU for. For long-term support, go AMD; for a one-off build for a few years, go i7.


----------



## JATownes (Nov 4, 2022)

This shows that for gaming the 5800X3D is a great deal and can still feed high-end video cards. Unfortunately, if you use your rig for any kind of compiling or other heavily threaded workloads, it's simply not going to cut it.

But damn I'd sure like to play with one.


----------



## zlobby (Nov 4, 2022)

W1zzard said:


> in the plane ready to take off


Riiiight!  

Why smoke jah to fly, when you can take some DMT and teleport? Airlines hate this trick!



JATownes said:


> This shows that for gaming the 5800X3D is a great deal and can still feed high end video cards. Unfortunately, if you use your rig for any kind of compiling or other high thread workloads, it's simply not going to cut it.
> 
> But damn I'd sure like to play with one.


I'm the only guy I knew back in the day who was crunching numbers AND playing games on the same CPU (often at the same time). We damn few...

So, not quite sure the number-crunching folk will miss the X3D?


----------



## ReallyBigMistake (Nov 4, 2022)

I think I have OCD or something, but some of the game engines are wrong:
Far Cry 5 and 6 use Dunia 2
Watch Dogs: Legion uses Disrupt


----------



## mastrdrver (Nov 5, 2022)

@W1zzard do you have any comment on the 4K flip in Civ 6? It goes from a decent double-digit lead for Intel and then flips to a double-digit lead for AMD. Seems odd.


----------



## Space Lynx (Nov 5, 2022)

I agree 100% with the conclusion W1zz wrote.

IMO, if AMD were smart, they would halt production of all their other CPUs since those aren't selling out, bring in a 7800X3D SKU, and just mass-produce that one SKU; I bet even with that level of concentrated production effort it would still sell out day one. Give the people what they want, AMD. I would have gone AM5 this round if that SKU were around.

Also, if AMD were smart they would use their 5 nm TSMC production time to max-produce their two next-generation GPUs with a temporary hold on CPU production, and flood the market with supply of those two GPUs, since they will provide the max profit and, let's face it, we all know they will sell out day one no matter what AMD does... make that money, AMD. After the market is saturated with those two high-end GPUs, switch over to the 7800X3D SKU and begin production on that, as well as the lower-tier next-gen GPUs.

Have to give the people what they want, AMD.


----------



## EatingDirt (Nov 5, 2022)

Garrus said:


> The most important subset of these 53 games to me, is the one of "low frame rate games". I would like to see the bottom 25 percent of games taken out of the group of 53 (the bottom 13 games in FPS for example), so we can see what difference the CPU is making in games that run slowly. I don't need a CPU upgrade to get from 400 to 600fps. I would like to bring 100 fps titles up to 150fps etc.
> 
> Very funny that out of all the games that have large differences at 4k, DMC5 and Civilization 6 are the only ones I play and they are both heavily AMD favored. Upgrading to Intel would slow me down. Intel has to get fixes for those 2 games out! The scheduler is still not working right?


I don't know what's wrong with DMC5; the 1080p result is an anomaly, but it disappears at 1440p and 4K. If you look at all the resolution results for Civ 6, the 13900K is faster at lower resolutions, but the 5800X3D beats it at higher resolutions. It may be that at 4K the Civ 6 frame rate runs into a memory latency bottleneck, which would help the 5800X3D pull ahead.

As for frame rates, we get those in the CPU reviews, which cover the majority of these games (24). That being said, the 4090's average FPS was ~160 at 4K with the 5800X, so you can probably be assured that both CPUs get well above that at 720p in all the games, and an average of over 150 in most games at 4K.


----------



## Nihilus (Nov 5, 2022)

So, extrapolating these results... the 5800X3D is the king of 8K gaming


----------



## Gemack (Nov 5, 2022)

The title says 5800X, but that's not the CPU used in the comparison. Even Intel recognized that the 5800X3D (not the 5800X, as this article's charts say) beat the 13900K in gaming. Very misleading and biased.


----------



## Space Lynx (Nov 5, 2022)

Does anyone know if Windows 11 vs Windows 10 makes a difference in benchmark scores these days? I haven't kept up with things on that front.


----------



## Pepamami (Nov 5, 2022)

SeventhReign said:


> Yeah lets compare a CPU that literally no one in their right mind would EVER buy for gaming.  EVER. To a CPU that can literally be used for nothing BUT gaming.  Thats genius right there fellas.  While we're at it I think we should drag race a bicycle vs a Top Fuel Dragster.


Where is Threadripper? I can't find it in the review. I think the 13600K could be next. But still, the 5800X3D is widely used as an "upgrade" option, and the 13900K is used in the review as the "fastest gaming CPU"; it's like asking "how far behind the best will you be if you just buy an upgrade for your old B450 or X370 instead of buying everything new?"


----------



## rv8000 (Nov 5, 2022)

Tek-Check said:


> This one is currently a terrible value for buck.



Both the 7600X and 7700X are terrible value. AMD threw them further under the bus when they discounted the 5800X3D. Now they get beaten in price, gaming, and MT apps by the 13600K and 13700K.

They need a massive price adjustment ASAP.


----------



## Jism (Nov 5, 2022)

The few games where the X3D is behind are pretty much down to the difference in IPC.

Wait until AMD brings out the 7x00X3D version, hopefully with a separate voltage rail for its stacked SRAM cache so the CPU can run at its full clocks (and not have a 1.35 V voltage limitation).


----------



## NeDix! (Nov 5, 2022)

Nice work @W1zzard

But could you add the FFXIV benchmark in the future? MMORPGs kind of love the 3D cache, so it could be really useful information for some people.


----------



## Garrus (Nov 5, 2022)

We are comparing two top CPUs. If you don't see a 20 percent difference in a game you are interested in, you probably won't notice it. They are not far apart. Unless you only play Ace Combat 7 a lot, buy either one.


----------



## Minus Infinity (Nov 5, 2022)

Conclusion: why waste money buying a 13900K if it's just for gaming? The 13700K makes far more sense. The 13900K is pointless unless you do a lot of productivity-based MT work, but even there the 13700K shines. Based on these results, a 7800X3D will wipe the floor with Raptor Lake, but knowing AMD it will be priced at an absurd $500.


----------



## nguyen (Nov 5, 2022)

I'm more interested in the 95th and 99th percentile FPS, as shown in the 13700K review


----------



## noel_fs (Nov 5, 2022)

Nice. To me the only thing that matters is 4K.

It would have been super nice to see an average of power consumption too, but I can imagine that adding tons of work if done properly.


----------



## AnotherReader (Nov 5, 2022)

Great review @W1zzard . You're a machine.

TL;DR: Raptor Lake will lose by a significant margin to the 3D variant of the 7700X.


----------



## Count von Schwalbe (Nov 5, 2022)

@W1zzard looks like you may have posted the wrong chart.




Great review BTW, a real eye-opener.

Two quick questions:

Were the outliers (e.g. Civ 6 @ 4K) rerun to exclude the possibility of error?

Also, was there any particular reason for using DDR5 on the Raptor Lake setup instead of identical RAM configs?


----------



## rbgc (Nov 5, 2022)

Pepamami said:


> 5800X3D is widely used as an "Upgrade" option, and 13900K is used in review as "fastest gaming CPU", its like "how far u will be behind from the best, if u just buy an upgrade for ur old AM4 b450 or x370, instead of buying everything new"


Exactly.

This review pits the current-gen top Intel CPU against the previous-gen top AMD "gaming" CPU. And the reviews in January (4090 with the 13900K and 7800X3D) will be about "how far behind in games you are with a 13900K instead of the current-gen AMD top gaming CPU".


----------



## Space Lynx (Nov 5, 2022)

nguyen said:


> I'm more interested in the 95% and 99% percentile FPS, as showed in 13700K review



Can you explain to me what this means? Is it the same thing as the 1% and 5% lows people talk about? I don't understand the terminology.


----------



## BoredErica (Nov 5, 2022)

It would be interesting to see whether loading times differ between Zen 3D and overclocked RPL.


----------



## birdie (Nov 5, 2022)

I slightly disagree with the tone of the review.

The 5800X3D is a great _gaming-only_ CPU, but the 13900K is a great all-rounder: it's more than twice as fast in heavy MT workloads, and its single-threaded performance is unparalleled.

And, oh boy, I don't quite understand everyone's excitement about the former. How many people... no, let's talk about _dedicated hardcore_ gamers, actually game at 1080p/1440p on an RTX 4090? 0.5%? 0.1%?


----------



## nguyen (Nov 5, 2022)

CallandorWoT said:


> Can you explain to me what this mean? Is this the same thing as 1% and 5% lows people talk about? I don't understand the terminology.



They are similar: the 95th and 99th percentile figures measure the FPS that you will have 95%/99% of the time, while the 1% and 5% lows measure the occasional dips in FPS. They are two sides of the same coin anyway.
Average FPS alone just doesn't tell the whole story about the gaming experience.
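To make the distinction concrete, here's a small sketch with made-up frame times (not the review's data) showing how an average, a 99th-percentile figure, and a "1% low" can all be pulled from the same capture:

```python
# Made-up capture: 100 frames, mostly 10 ms with one nasty 50 ms stutter.
frame_times_ms = [10.0] * 97 + [20.0, 25.0, 50.0]

fps = sorted(1000.0 / t for t in frame_times_ms)   # per-frame FPS, ascending

# Average FPS: total frames over total time.
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)

# "99th percentile FPS": the frame rate you stay above 99% of the time,
# i.e. the 1st percentile of the per-frame FPS distribution.
p99_fps = fps[int(0.01 * len(fps))]

# "1% low": the average of the slowest 1% of frames.
worst = fps[: max(1, len(fps) // 100)]
one_pct_low = sum(worst) / len(worst)

print(f"avg {avg_fps:.1f}, 99th pct {p99_fps:.1f}, 1% low {one_pct_low:.1f}")
# → avg 93.9, 99th pct 40.0, 1% low 20.0
```

The single 50 ms stutter frame only nudges the average but dominates the 1% low, which is why percentile figures track perceived smoothness better than averages do.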


----------



## jrbigz (Nov 5, 2022)

Count Shagula said:


> The power usage should also be considered when both cpu's have similar performance. You'd be mad to move to raptor lake especially for gaming if your already on AM4 and have a decent motherboard and ram


Similar to what? Gotta hand it to you AMD fanboys, constantly posting power usage for productivity, where the 5800X3D is crushed into the abyss, while ignoring power usage during gaming.


----------



## Space Lynx (Nov 5, 2022)

nguyen said:


> They are similar, 95% and 99% percentile measure the FPS that you will have 95%/99% of the time, while 1% and 5% measure the occasional dip in FPS. They are 2 side of the same coin anyways.
> Avg FPS alone just do not tell the whole story about the gaming experience.



Thanks for explaining it. That's an interesting way to look at it.

I'm just happy to have a build that can game again; it's going to be fun playing my backlog. I think the first games I'll play are Divinity: Original Sin 1 and 2, since my buddy wants to co-op them. Hopefully I can get 165 Hz / 165 FPS at 1440p on Ultra; I think I will, assuming I can get RDNA3. Heck, RDNA3 would probably give me that at 4K. Insane how good GPUs are getting. I really don't know if I'll need anything more powerful after this. Unless I go to 4K, but I don't think I will. Pretty happy with 27" 1440p.


----------



## VeqIR (Nov 5, 2022)

jrbigz said:


> Similar to what? Gotta give to you AMD fanboys constantly posting power usage for productivity where 5800X3D is crushed to the abyss, while ignoring the power usage during gaming.


You are still showing that the 5800X3D uses the least power while gaming. Up to half as much.


----------



## Bwaze (Nov 5, 2022)

AnotherReader said:


> TLDR: Raptor Lake will lose by a significant margin to the 3D variant of the 7700X.



For that conclusion we'll have to see the 7x00X3D benchmarks.

I think everybody was kind of surprised by the extent to which the extra cache helped Zen 3 in gaming; maybe even AMD, since they didn't plan to use 3D cache in Zen 4 from the start.

Is it because of the thermal penalty? The 5800X3D lost to the 5800X in almost all productivity tests. Having one or two dedicated gaming processors makes sense; lowering the productivity scores of a whole CPU generation for the sake of better gaming scores, not so much.

Zen 4 already has thermal issues, with temperatures almost constantly jumping to the 95-degree maximum. What will the extra blanket of cache over the chiplets cause? I guess we'll have to see, but I predict a similar result: lower boost clocks and lower productivity scores for the sake of an uplift in gaming.

And will the extra cache help as much as on Zen 3? I don't see why not: L3 cache sizes per core are the same, and it doesn't look like Zen 4 has any big architectural changes compared to Zen 3; most of the performance uplift came mainly from frequency.


----------



## HD64G (Nov 5, 2022)

So, the 5800X3D is the best value high-end gaming CPU ever, considering the platform cost. And Zen 4 with improved 3D cache is coming in a few months to regain the gaming crown for good.


----------



## clopezi (Nov 5, 2022)

jrbigz said:


> Similar to what? Gotta give to you AMD fanboys constantly posting power usage for productivity where 5800X3D is crushed to the abyss, while ignoring the power usage during gaming.


Thanks for this; the other chart was very worrying.

It's the same with temps: many people worry about temps at full load... but unless you have a video rendering station, your CPU is hardly ever at full load...


----------



## ARF (Nov 5, 2022)

This means that @W1zzard must not change the testing setup to Core i9-13900K. Either upgrade the CPU to Ryzen 7 5800X3D, or wait for the 3D variants of the upcoming Ryzen 7000 CPUs in Q1 2023 

The efficiency of Ryzen 7 5800X3D is unbeatable:


----------



## clopezi (Nov 5, 2022)

ARF said:


> This means that @W1zzard must not change the testing setup to Core i9-13900K. Either upgrade the CPU to Ryzen 7 5800X3D, or wait for the 3D variants of the upcoming Ryzen 7000 CPUs in Q1 2023
> 
> The efficiency of Ryzen 7 5800X3D is unbeatable:
> 
> ...



It's a very good CPU and its power draw is excellent, but in this chart the energy-per-frame maths actually favor the 13900K:

465 / 62.8 = 7.40
625 / 98.8 = 6.32
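The ratio being computed above is energy per rendered frame (whole-system watts divided by average FPS, so lower is better); a one-liner sketch using the figures as quoted:

```python
# Energy per rendered frame, from the numbers quoted above
# (whole-system power draw during the run divided by average FPS).
power_w = {"5800X3D": 465.0, "13900K": 625.0}
avg_fps = {"5800X3D": 62.8, "13900K": 98.8}

j_per_frame = {cpu: power_w[cpu] / avg_fps[cpu] for cpu in power_w}
for cpu, j in sorted(j_per_frame.items()):
    print(f"{cpu}: {j:.2f} J/frame")  # lower is better
```

By this metric the 13900K comes out ahead in this particular chart, even though its absolute draw is higher, because it delivers more frames for the energy spent.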


----------



## Richards (Nov 5, 2022)

Outdated games are being tested here.


----------



## ARF (Nov 5, 2022)

clopezi said:


> It's a very good CPU and his power draw it's excellent, but in this chart, power draw and fps are better on 13900K, simple maths:
> 
> 465 / 62,8 = 7,40
> 625 / 98,8 = 6,32



Sorry for posting a cherry-picked moment.


----------



## birdie (Nov 5, 2022)

ARF said:


> This means that @W1zzard must not change the testing setup to Core i9-13900K. Either upgrade the CPU to Ryzen 7 5800X3D, or wait for the 3D variants of the upcoming Ryzen 7000 CPUs in Q1 2023
> 
> The efficiency of Ryzen 7 5800X3D is unbeatable:


In gaming, the 13900K can be as power-efficient as the 5800X3D, if not more so. Funny how AMD-fanboyish it is to post an applications chart made up mostly of _heavy_ MT applications.

How about this?


----------



## londiste (Nov 5, 2022)

@W1zzard are you going to do 5800X3D vs 13600K next? Those are the two that should be in very direct competition now.


----------



## AlainCh2 (Nov 5, 2022)

What is this CRAP review about?

Even if it is a very nicely done bunch of research, putting the wrong label on it invalidates it all.


----------



## FreezingPC (Nov 5, 2022)

londiste said:


> are you going to do 5800X3D vs 13600K next? Those are the two that should be in very direct competition now.


Do we really need another bench just for a CPU that's 5% slower? I mean, no one is expecting the i5 to do better; I'm certainly not expecting it to be worse than the 5800X3D...


----------



## clopezi (Nov 5, 2022)

AlainCh2 said:


> What is CRAP review about ?
> 
> Even if it can be a very nicely done bunch of researches, putting the wrong label invalidate them all._


No sir, a wrong label is only a mistake; it doesn't invalidate anything... how much are you paid here to be so angry and impertinent?


----------



## Imouto (Nov 5, 2022)

birdie said:


> How about this?



Do you mean the misleading bench you posted, where the 5800X3D isn't also undervolted and underclocked?

Try to do better. Right now you are exactly what you are accusing others of being.


----------



## N/A (Nov 5, 2022)

londiste said:


> 5800X3D vs 13600K next? Those are the two that should be in very direct competition now.


I believe the 13600K is about the same performance as the 13900K in games, so there's no point in doing that. Minus some cores that don't really do anything.


----------



## Count Shagula (Nov 5, 2022)

jrbigz said:


> Similar to what? Gotta give to you AMD fanboys constantly posting power usage for productivity where 5800X3D is crushed to the abyss, while ignoring the power usage during gaming.





lol mate, I've never posted anything on power use before, I think. The 13900K is also shit where power usage is concerned for gaming; just look at the official TPU review. Fella, I'm not a fanboy. I'm here to buy the best-performing stuff money can buy IF IT'S WORTH IT, and the 5800X3D is literally killing it in gaming. I'm only here to buy the best CPU and GPU combo available at any given time FOR THE $$$. Sure, I have a shitty 3090, but I want a 4090 when they stop melting. That said, after they launch the 7900 XTX, I'LL BE BUYING THAT. I wanted to buy a 13900K system the moment they launched, but TPU says it's less than 1% faster than my X3D at 4K, and the other metrics make it even more rubbish. A 13900K plus mobo and RAM in my country is like three thousand fucking dollars, and it still uses double the power. The 13900K in its best scenario uses at least twice the power, and the temps are mental, like 100 degrees at stock... Are any of us going to buy that? No, we aren't. Anyone on AM4 with a somewhat outdated CPU is going to go for an X3D.


----------



## AlainCh2 (Nov 5, 2022)

clopezi said:


> No sir, a wrong label it's only a mistake, doesn't invalidate anything... how much money do you pay here for being so angry and impertinent?


Well, sir,
I see you have never published anything.

A wrong label means you don't care about the correctness of your results, and you don't respect your readers.
I'm not here to teach how to write a paper.

I've just noted that the reader has to guess what the authors meant.

And guessing is not for me. I have no skill in divining the future or the present, and even less what other people are thinking while writing disrespectful garbage.

... and even if somebody noted and wrote about it before me, the authors' worse sin is not having corrected it immediately.

Not honorable.


----------



## N/A (Nov 5, 2022)

Don't go second-guessing your second guesses. Get in line, because a bunch of peer reviewers who noticed have already complained about it in the thread. We all know.


----------



## Tek-Check (Nov 5, 2022)

birdie said:


> I slightly disagree with the tone of the review.
> 
> 5800X3D is a great _gaming-only_ CPU but 13900K is a great all-rounder CPU since it's more than twice as fast in heavy MT workloads and in its single threaded performance is unparallelled.
> 
> And, oh boy, I don't quite understand everyone's excitement with the first. How many people ... no, let's talk about _dedicated hardcore_ gamers, out there game at 1080/1440p on RTX 4090? 0.5%? 0.1%?


Most gamers do not need a CPU for heavy MT workloads. Waste of silicon and money. They are not professionals rendering, compiling or decompressing huge workloads on a daily basis.

From this point of view, the 13900K is just a halo product, like it always has been. Literally no reason to buy it for gaming.

The 5800X3D is the king of gaming in terms of value, and it's great for general daily computing. You do not need a 4090 to drive this CPU; any good GPU would do. This testing was an academic exercise out of curiosity.



birdie said:


> In gaming 13900K can be as power efficient if not more efficient as 5800X3D. Funny I mean how AMD fanboy'ish you post an application chart which uses mostly _heavy_ MT applications.
> 
> How about this?


Silly to say those words. Those charts are from der8auer, a computer expert whose job is to spend several days and a huge number of hours fine-tuning and experimenting with CPUs. His graphs are not representative of the daily consumer experience.

Very few people would ever dedicate so much time to undervolting and tuning a CPU. Most consumers just want to drop the CPU into a system and play. For that, the 5800X3D is unbeatable value for money.


----------



## Nater (Nov 5, 2022)

Richards said:


> Outdated  games are being tested  here


Duh?


----------



## Hofnaerrchen (Nov 5, 2022)

Would be interesting to see a detailed power consumption comparison of the RTX 4090 on the 5800X3D and 13900K.


----------



## Prima.Vera (Nov 5, 2022)

By the looks of it, the upcoming 7800X3D will be the best CPU AMD has ever released. Looking forward to the comparison reviews.


----------



## 95Viper (Nov 5, 2022)

Stay on topic.
Stop the trolling.
Stop the arguing/bickering.


----------



## mrpaco (Nov 5, 2022)

Any way to find the 0.1% and 1% lows for each test?


----------



## Nopa (Nov 5, 2022)

The 5800X3D and 13600K are hands down the best bang-for-buck CPU duo of 2022. There's no other way to put it.


----------



## mechtech (Nov 5, 2022)

Wow

That was a crap-ton of work.

You know what would be really interesting: what would be the smallest GPU and CPU that could play these games at 1080p at mid graphics settings and hit 60 FPS?


----------



## regs (Nov 5, 2022)

Taraquin said:


> 5800X3D is hands down the best bang-for-bucks gaming CPU


For $400 it's not. Once again: save $200 and put that money toward a better GPU.


----------



## Taraquin (Nov 5, 2022)

regs said:


> For $400 it's not. Once again - save $200 and get better GPU for those money.


It costs $330 at the moment.


----------



## Asni (Nov 5, 2022)

I don't get it: if the 13900K is 13% faster than the 5800X3D at 1080p with an RTX 3080, how can it be only 6.2% faster with an RTX 4090?








(Link: "Intel Core i9-13900K Review - Power-Hungry Beast", www.techpowerup.com)


----------



## Tomgang (Nov 5, 2022)

I can only say: AMD, give me a 5950X3D and I will be the first one to say "shut up and take my money".


----------



## Upgrayedd (Nov 5, 2022)

jallenlabs said:


> Thanks for this article, I was waiting for this one.  Well done.  Im guessing that AMDs 3D upgrade on this generation will crush Intel in gaming.  Oh well, my gaming rig is still a 11700k.


I paid $200 for an 11700K with a free mobo, both new, a few months ago. At that price I couldn't have cared less whether new stuff was coming or better stuff was out. The price couldn't be beaten for what it offered.


----------



## Taraquin (Nov 5, 2022)

Asni said:


> I don't get it: if the 13900k is 13% faster than the 5800x3d @1080p with a rtx 3080, how can 6.2% faster with a rtx 4090?
> 
> 
> 
> ...


It is a very small selection of games, and two of them heavily favor the 13900K. Using 50 games gives you a much more realistic picture.


----------



## Asni (Nov 5, 2022)

Taraquin said:


> It is a very small selection of games, 2 of them heavily favors 13900K. Using 50 games you get a much more realistic picture.


Let's talk about 1440p, where the RTX 3080 isn't bottlenecked (excluding some extreme scenarios). Even in this case the 13900K is 6.2% faster than the 5800X3D, while it's just 4.7% faster with an RTX 4090, which removes the GPU bottleneck and should increase the gap significantly.

To be honest, I don't understand these results.


----------



## Voodoo Rufus (Nov 5, 2022)

Thanks for all that hard work, W1zzard.  The X3D is a great value at $330 right now. 

And the 5600X at $160... is super tempting to build a budget upgrade around. eBay a decent motherboard, slap some reasonable DDR4-3600 in it, and enjoy.


----------



## b1k3rdude (Nov 5, 2022)

Am I missing something here? The images say 5800X vs 5800X3D in the title, but 5800X3D vs 13900K in the body.


----------



## W1zzard (Nov 5, 2022)

b1k3rdude said:


> Am I missing something here, but the images show 5800x versus 5800X3D in the title and 5800X3D versus 13900k in the body.


On top of the charts I had a notice that the titles were wrong, but I was traveling, so I could only fix them when back home. Looks like many people don't read the text. I'm back home now and the chart titles are fixed.


----------



## ShiningSapphire (Nov 5, 2022)

ARF said:


> This means that @W1zzard must not change the testing setup to Core i9-13900K. Either upgrade the CPU to Ryzen 7 5800X3D, or wait for the 3D variants of the upcoming Ryzen 7000 CPUs in Q1 2023
> 
> The efficiency of Ryzen 7 5800X3D is unbeatable:
> 
> View attachment 268635


LOL. You forgot that at the same time the 13900K is much, much faster in MC workloads, so that's a meaningless comparison. The 13600K is faster too; Intel basically wiped the floor with the 7600X/7700X, and AMD additionally makes them look bad with discounts on the 5800X3D.


----------



## W1zzard (Nov 5, 2022)

ReallyBigMistake said:


> Farcry 5 and 6 use Dunia 2
> Watchdogs Legion uses Disrupt


Fixed Watch Dogs. Do you have an official source for "Dunia 2"? I thought they don't publish a version number and just call it "Dunia".


----------



## Taraquin (Nov 5, 2022)

Asni said:


> Let's talk about 1440p, where the rtx 3080 isn't bottlenecked (excluding some extreme scenarios). Even in this case the 13900k is 6.2% faster than the 5800x3d while it's just 4.7% faster with a rtx 4090 which causes a bottleneck and should increase the gap significantly.
> 
> To be honest i don't understand these results.


Two games give the 13900K an advantage of over 30%, which has a large impact on the average result.


----------



## Tek-Check (Nov 5, 2022)

birdie said:


> Gamers normally do not run heavy MT workloads, besides 13600K is nearly as fast in games as 13900K, and as an all-rounder it's as good as 5800X3D if not better.


i5 is good, I agree. Never said anything against it.


birdie said:


> I've never talked about undervolting Intel CPUs. No idea where from you drew that conclusion.


In the der8auer video you posted charts from, he plays with power limits and undervolting. He said it took him three days to tweak the settings and test everything. Most buyers would never do such a thing.


birdie said:


> If you like the Ryzen 5800X3D great, just stop parroting that RPL is a power use hog. It's only 13900K/13700K and only in heavy MT tasks.


I don't have a need for the 5800X3D. For those on the AM4 platform who do, it's a no-brainer to upgrade a single CPU rather than buy a completely new system with an i5.
I am not "parroting" anything. That's a silly thing to say if you read carefully what I wrote.


----------



## Asni (Nov 5, 2022)

Taraquin said:


> 2 games have 13900K at over 30% advantage, this has a large impact on avg result.


Then the difference, without those 2 games, is even smaller! This makes even less sense.


----------



## Taraquin (Nov 5, 2022)

Asni said:


> Then the difference, without those 2 games, is even smaller! This makes even less sense.


Smaller with the 3080, that is, which brings those results closer to the 4090 results, which are in turn affected by outliers like DMC5.


----------



## Asni (Nov 5, 2022)

Taraquin said:


> Smaller with 3080 that is, making results closer to 4090 results which are impacted by DMC5 for instance.


They did not test Ace Combat 7 with the RTX 3080 in the 13900K review!


----------



## izy (Nov 5, 2022)

If you don't have a powerful GPU, a 5600X or 5800X is more than enough. At the moment you can find the 5700X for around $200; that's my sweet spot for a mid-tier GPU/system, the best performance for the money. (And on AMD you don't need a new motherboard or extreme cooling.)


----------



## DeadmanFatboy (Nov 5, 2022)

Asni said:


> Then the difference, without those 2 games, is even smaller! This makes even less sense.


They tested more games. Performance outliers make less of an impact on the average.


----------



## nomadka670 (Nov 5, 2022)

Then let's see the 7700X vs. the 13700K:

RTX 4090
Same memory
Everything else the same.

With major CPU-hungry, high-FPS games like CS:GO and Valorant, plus new games like A Plague Tale, God of War, Uncharted, etc.

Not at very high quality either: 1080p low, as always.

Because the 7700X looks like crap in games here, yet in Hardware Unboxed's YouTube testing the 7700X beats the 13700K in their 12-game average.


----------



## Asni (Nov 5, 2022)

DeadmanFatboy said:


> They tested more games. Performance outliers make less of an impact on the average.


I think you just didn't check the 13900K review: they did not test Ace Combat 7 (+30%), Detroit: Become Human (+18% @1080p), Dos (+20.8% @1080p, +19% @1440p), Guardians of the Galaxy (+12.3% @1080p), or Spider-Man Remastered (+17.6% @1080p, +14.6% @1440p).

That's five titles with roughly +20% at 1080p that weren't tested in the 13900K review, yet that review shows better scaling than this comparison does. Don't you think that's strange?


----------



## regs (Nov 5, 2022)

Taraquin said:


> It costs 330usd atm.


Somewhere in the USA, maybe. Worldwide it's still $430, excluding taxes.


----------



## Taraquin (Nov 5, 2022)

regs said:


> Somewhere in USA, may be. Worldwide it's still $430, excluding taxes.


It costs $390 including tax in Norway, and we have 25% VAT.


----------



## 80-watt Hamster (Nov 5, 2022)

birdie said:


> Gamers normally do not run heavy MT workloads, besides 13600K is nearly as fast in games as 13900K, and as an all-rounder it's as good as 5800X3D if not better. Both are just fine power consumption-wise in gaming. I've never talked about undervolting Intel CPUs. No idea where from you drew that conclusion. .
> 
> If you like the Ryzen 5800X3D great, just stop parroting that RPL is a power use hog. It's only 13900K/13700K and only in heavy MT tasks.



In order to show that the 13900K is superior to the 5800X3D in frames/W, you posted a chart where the 13900K alone was tuned with an undervolt and underclock. The X3D still came in second place, and about as far behind the tuned i9 as the 90W i9 is behind the X3D.



> I can post lots more reviews which show that 13900K is a fine CPU while gaming



Of course you can. Is that even in dispute?


----------



## Space Lynx (Nov 5, 2022)

Well, I kind of regret getting Raptor Lake now after watching this video... it looks like the 7600X beats it easily across the board as long as you slot in some DDR5-6000 CAS 30 RAM. I think even the lows are better by 20% with the 7600X over the 5800X3D and 13600K... turns out the 7600X was the sleeper winner all along, you just have to use really high-end RAM.

fuck. I don't know what to do now. I should never have betrayed my love for AMD after all... 

The 7600X beats the X3D and Raptor Lake according to this video, in highs, in lows, everything, as long as you pair it with DDR5-6000 CAS 30... how come I didn't realize this when I bought my 13600K on launch day... I'm so fucking confused right now. ugh, my head hurts.


----------



## EatingDirt (Nov 5, 2022)

Asni said:


> I think you just didn't check the 13900k review: they did not test Ace Combat 7 (+30%), Detroit become human (+18% @1080p), Dos (+20.8% @1080p, +19% @1440p), Guardian of the Galaxy (+12.3% @1080p), SpiderMan remastered (+17.6% @1080p,  +14.6% @1440p).
> 
> 5 titles with that guarantee +20% at 1080p not tested in the 13900k review which shows a better scaling compared to this comparison. Don't you think that's strange?


If you add 41 games to the original 12 in the test suite, and only 5 of those new games show ~25% better performance while the rest show ~0%, those outliers aren't going to make a big difference in the final percentage. The ~0% games make up 77% of the score and the ~25% games make up 23%.

Weighted and simplified, that works out to about 5.75%.
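The dilution argument above can be sketched in a few lines of Python. The numbers here are illustrative placeholders, not TPU's actual per-game results: assume 12 games where the 13900K leads by ~25% and 41 games where the two CPUs are effectively tied.

```python
# How a handful of outlier games dilute into a 53-game average.
# Illustrative numbers only: 12 games at ~25% advantage, 41 at ~0%.
deltas = [25.0] * 12 + [0.0] * 41

average_delta = sum(deltas) / len(deltas)
print(f"{average_delta:.2f}%")  # 5.66% -- close to the 5.75% quoted above
```

The exact weighting differs slightly from the post's 77/23 split, but the takeaway is the same: a simple mean over many near-tied games shrinks even large per-game outliers into single digits.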


----------



## nomadka670 (Nov 5, 2022)

CallandorWoT said:


> Well, I kind of regret getting raptor lake now after watching this video... looks like 7600x beats it easily across the board as long as you slot in some ddr5 6000 cas 30 ram. i think even lows are better by 20% with the 7600x over the 5800x3d and 13600k... turns out 7600x was the sleeper winner all along you just have to use really high end ram.
> 
> fuck. i don't know know what to do now. i should have never betrayed my love for AMD after all...
> 
> 7600x beats x3d and raptor lake according to this video, in highs, in lows, everything. as long as you pair with ddr5 6000 cas 30... how come I didn't realize this when I bought my 13600k on launch day... im so fucking confused right now. ugh my head hurts.


That's why I said to test the 7700X vs. the 13700K again, with the right RAM and an RTX 4090. People like to manipulate benchmarks.


----------



## AlainCh2 (Nov 5, 2022)

regs said:


> Somewhere in USA, may be. Worldwide it's still $430, excluding taxes.


Taxes and shipping included; Amazon prices in the EU.


----------



## gffermari (Nov 5, 2022)

CallandorWoT said:


> Well, I kind of regret getting raptor lake now after watching this video... looks like 7600x beats it easily across the board as long as you slot in some ddr5 6000 cas 30 ram. i think even lows are better by 20% with the 7600x over the 5800x3d and 13600k... turns out 7600x was the sleeper winner all along you just have to use really high end ram.
> 
> fuck. i don't know know what to do now. i should have never betrayed my love for AMD after all...
> 
> 7600x beats x3d and raptor lake according to this video, in highs, in lows, everything. as long as you pair with ddr5 6000 cas 30... how come I didn't realize this when I bought my 13600k on launch day... im so fucking confused right now. ugh my head hurts.



The 13600K/KF with DDR5 is the best overall combination.
The 7600X performs the same in gaming (the differences between all three CPUs are a joke) and worse in productivity, but offers a better upgrade path.

So... apart from the fact that you're using DDR4 with the 13600KF and may lose some frames, your choice was the best possible for an Intel build.

Just enjoy your build.


----------



## Space Lynx (Nov 5, 2022)

gffermari said:


> The 13600K/KF with DDR5 is the best overall combination.
> The 7600X performs the same in gaming (the differences between all three cpus are a joke), worse in productivity but offers better upgrade path.
> 
> So....apart from the fact that you use DDR4 with the 13600KF, and may lose some frames, your choise was the best possible for an Intel build.
> ...



I already bought my mobo though, and I didn't realize how much DDR4 RAM was holding me back. Sucks.

Also, if I'm going to spend a ton of money on DDR5 RAM, I might as well go with the 7600X; then I could slot in a 7800X3D someday down the road, or an 8800X3D a few years from now, etc.

I thought I was getting the best bang for my buck, but I guess not. I still saved about $300 overall by avoiding AM5 premium prices, but in the longer term I didn't, and I get worse performance out of the gate...


----------



## N/A (Nov 5, 2022)

CallandorWoT said:


> I already bought my mobo though, and I did not realize how much DDR4 ram was holding me back.


To avoid buyer's remorse, wait or don't buy. Or just sell it. I think they're all about the same, with the 5800X3D / 7600X 10-15% faster than the 13600K at 1440p, but that won't be the case with the 7900 series, as it tends to have less driver overhead.


----------



## vmarv (Nov 5, 2022)

Garrus said:


> The most important subset of these 53 games to me, is the one of "low frame rate games". I would like to see the bottom 25 percent of games taken out of the group of 53 (the bottom 13 games in FPS for example), so we can see what difference the CPU is making in games that run slowly. I don't need a CPU upgrade to get from 400 to 600fps. I would like to bring 100 fps titles up to 150fps etc.
> Very funny that out of all the games that have large differences at 4k, DMC5 and Civilization 6 are the only ones I play and they are both heavily AMD favored. Upgrading to Intel would slow me down. Intel has to get fixes for those 2 games out! The scheduler is still not working right?


If I'm not wrong, the hybrid Intel CPUs work better with Windows 11. I'm not speaking of gaming in particular, but overall. I've seen so many reviews these days that unfortunately I can't remember where I read it, but the scheduler seems to work better in the new Windows.
Is that correct, guys?


----------



## Asni (Nov 5, 2022)

EatingDirt said:


> If you add 41 games to the original 12 to the test suite, and only 5 of of those new games have 25% better performance, and the rest have ~0%, those outliers aren't going to make a big difference in the final percentage. The ~0% will make up 77% of the score, the ~25% will make up 23% of the score.
> 
> Weighted and simplified that turns into 5.75%.


Yep, exactly: 5.75% on top of the initial 13% difference (keeping 13% as an approximation; you'd have to factor in the number of games, the scaling of the initial 12 games on the RTX 4090, and the percentage in the other 36 games, which isn't 0%). That's the point.
Meanwhile, this comparison reports a 6.2% total!




----------



## Imouto (Nov 5, 2022)

regs said:


> Somewhere in USA, may be. Worldwide it's still $430, excluding taxes.



In Europe it's €360-380, taxes included.


----------



## Space Lynx (Nov 5, 2022)

N/A said:


> To avoid buyers remorse wait or don't buy. Just sell it. I think they're all the same, with 58X3 / 7600X 10-15% faster than 13600K in 1440p, but that will not be the case with 7900 as it tends to have less driver overhead.



I'm just going to say fuck it and keep my current setup. This is my 10-year build and I'm retiring from the hardware hobby. I used to love this stuff, but I don't these days, and haven't for quite some time; even this build was more out of necessity because my laptop is really outdated. 

So yeah, fuck it. I'm still going to get great performance, and I did save $300 or so by not going AM5, so it's all good. This rig will do me for the next ten years. Forget the nonsense, time to game, boys.


----------



## b1k3rdude (Nov 5, 2022)

W1zzard said:


> On top of the charts I had a notice that the titles were wrong, but I'm traveling, so I can only fix them when back home. Looks like many people don't read text. Back home and the chart title are fixed now


Nice one Wiz, I thought it might be something simple.

Well, I have my 5900X and a 3080 12 GB, so I'm happy to wait for the 7800X3D; by then the 4080 16 GB / RX 7900 XT and DDR5-6000 will have dropped to more reasonable prices. At that point, if finances allow, I'll do a complete upgrade and sell the old rig.


----------



## bastordd (Nov 5, 2022)

The 13th-gen Intel sweet spot for RAM is DDR5-6400...
DDR5-6000 at CL36 is too slow.


----------



## regs (Nov 5, 2022)

AlainCh2 said:


> Taxes & Shipping included - Amazon prices in the Union
> 
> View attachment 268691


China: $400. And so it is for most of the world.


			https://www.aliexpress.us/item/3256804012905041.html
		


Still doesn't change anything. The 12400F costs €150 excluding VAT. 350 minus 150 is still a €200 difference, which you could spend on a 6750 XT instead of a 6600 XT. That would give you a far better experience, unless you play strategy games, of course.


----------



## ShiningSapphire (Nov 5, 2022)

CallandorWoT said:


> Well, I kind of regret getting raptor lake now after watching this video... looks like 7600x beats it easily across the board as long as you slot in some ddr5 6000 cas 30 ram. i think even lows are better by 20% with the 7600x over the 5800x3d and 13600k... turns out 7600x was the sleeper winner all along you just have to use really high end ram.
> 
> fuck. i don't know know what to do now. i should have never betrayed my love for AMD after all...
> 
> 7600x beats x3d and raptor lake according to this video, in highs, in lows, everything. as long as you pair with ddr5 6000 cas 30... how come I didn't realize this when I bought my 13600k on launch day... im so fucking confused right now. ugh my head hurts.


You made no mistake with this choice. The 13600K is a much better all-rounder, beating the 7600X by a lot in multi-core workloads while providing great performance in games. Here you can read a review whose testing methodology is based on finding the most CPU-heavy scenarios in each game (not built-in benchmarks, as so many reviewers use) to show the differences between CPUs; as you'll see, the 13600K performs solidly better than the 7600X in games and isn't far behind the 13900K. It's in Polish, but you'll get by with a translator.  https://www.purepc.pl/test-procesor...tna-wydajnosc-porownanie-w-grach-i-programach


----------



## EatingDirt (Nov 5, 2022)

Asni said:


> Yep, exactly: 5.75% in addition to the initial  13% difference (keeping 13% as an approximation: you should evaluate the number of games, the scaling on the rtx 4090 in these initial 12 games and the percentage in the the other 36 games which isn't 0%). That's the point.
> While this comparison talks about a 6.2% total!
> 
> 
> ...


The initial difference was 11.4%. The CPU game suite was a lot more in favor of the 13900K when paired with the 3080, with CS:GO at +32%, AoE5 at +28%, Borderlands 3 at +6%, and FC6 at +27%, all in favor of the 13900K.

It may be that the 4090 is simply better matched with the 5800X3D than the 3080 was. Driver updates, AGESA updates, and game patches could also account for the increase in the 5800X3D's performance. There are so many factors at play when benchmarking that you're going to get anomalies and differences from month to month and year to year. If we benched this a month from now, Devil May Cry 5 might favor the 13900K by 10%, which would put us closer to the initial 11.4%.


----------



## gffermari (Nov 5, 2022)

CallandorWoT said:


> im still going to get great performance



You're getting 99% of the absolute performance. I think you can live with that, taking into account that you're not going to get a 4090 now and a 4090 Ti the day it's released, etc...

For the money you spent... you practically stole it.

See you in 4-5 years. We'll both need to upgrade around then.


----------



## Asni (Nov 5, 2022)

EatingDirt said:


> The initial difference was 11.4%. The CPU games suite was a lot more in favor of the 13900k when paired with the 3080. Which had CS:GO @32%, AoE5 @28%, Borderlands 3 @6% & FC6 @27% all in favor of the 13900k.
> 
> It may be that the 4090 is just better with the 5800X3D than the 3080 was. Driver updates, AGESA updates and game patches could also account for the increase in performance for the 5800X3D. There are so many factors at play when benchmarking that you're going to get anomaly's and differences from month-to-month, year-to-year, etc. If we bench this a month from now, Devil May Cry 5 might be in favor of the 13900k by 10%, which would put us closer to the initial 11.4%.


In that review, the 13900K performs 13% better than the 5800X3D (100/88.6). The 13900K review is 15 days old; nothing has changed.
I'd just like to know how we went from 13% with an RTX 3080 to 6.2% with an RTX 4090 in a completely CPU-bound scenario.
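For readers following the arithmetic: the "13%" comes from the review's normalized relative-performance chart, where the leader is pegged at 100. A quick sketch of that conversion (the 88.6 score is taken from the post above; treat it as the chart's value, not an independent measurement):

```python
# Convert normalized relative-performance scores (leader = 100)
# into an "X% faster" figure.
def pct_faster(score_a: float, score_b: float) -> float:
    """Percentage by which score_a outperforms score_b."""
    return (score_a / score_b - 1.0) * 100.0

# 13900K = 100, 5800X3D = 88.6 in the 13900K review's chart
print(round(pct_faster(100.0, 88.6), 1))  # 12.9 -- the "13%" in question
```

Note the asymmetry: a CPU that scores 88.6 is 11.4% *slower* than the leader, but the leader is 12.9% *faster* than it; both figures appear in this thread and describe the same chart.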


----------



## nomadka670 (Nov 5, 2022)

ShiningSapphire said:


> You made no mistake with this choice. 13600K is much better all-rounder beating 7600X by a lot in multi core workloads and providing great performance in games. Here you can read a review in which the testing methodology in games is based on finding the most cpu heavy scenarios (not built-in benchmarks like so many reviewers like to do) to show differences between different cpus and as you will see 13600K performs really solid better than 7600X in games not far away behind 13900K. It's in Polish but you'll get by with a translator.  https://www.purepc.pl/test-procesor...tna-wydajnosc-porownanie-w-grach-i-programach
> View attachment 268696


Then what about the Hardware Unboxed video where the Ryzen 7700X beats the 13700K? And they're not bullshitting.


----------



## Space Lynx (Nov 5, 2022)

gffermari said:


> You're getting 99% of the absolute performance. I think you can live with that, taking into account that you're not gonna get a 4090 now and a 4090Ti the day it's released etc...
> 
> For the money you spent....you actually stole them.
> 
> See you in 4-5 years. We'll both need to upgrade some time then.



It's true; it's more the confusion from the variance being so great between review methodologies this round that made me a little frustrated. RAM never used to matter much, especially not on Intel's side, and I just assumed nothing had changed because the initial reviews showed the 13600K dominating. 

It's all good, though. I just did the math, and with my current setup I saved exactly $320 by avoiding high-end DDR5 RAM and a $330 AM5 mobo (if I'd gone AM5 I would have picked the Riptide board), going instead with a high-end clearance-sale Z690 board and budget dual-rank DDR4. And you're right: even if I build again in 5 years, the AM6 socket will be out.

I might do a new build in 5-6 years, depending on where I am in life. I do intend to make this a 10-year build, though, with maybe one GPU upgrade in between, since I have a Gen5 GPU slot on my mobo.


----------



## Mistral (Nov 5, 2022)

What's the deal with DMC at 1080p and Civ at 4k?


----------



## Scrizz (Nov 5, 2022)

The Civ VI result at 4K seems odd. It also seems most games are GPU-, not CPU-limited.


----------



## DeadmanFatboy (Nov 5, 2022)

Asni said:


> In that review, the 13900k performs 13% better than the 5800x3d (100/88.6). The 13900k review is 15 days old, nothing changed.
> I'd just like to know how we went from 13% with a rtx 3080 to 6.2% with a rtx 4090 in a completely cpu bound scenario.


I'm not sure what is so confusing. The average got dragged down when more games were tested. Averages can be misleading; that's why most reviewers show individual game benchmarks.

The conclusion is clear, though: the 13900K and 5800X3D are evenly matched in most games, but a few favor one (usually Intel) with double-digit relative performance gains.


----------



## wheresmycar (Nov 5, 2022)

I'm really impressed with these results: the less power-hungry, DDR4-based X3D competing head-to-head with a higher-clocked, DDR5-equipped 13900K gaming overkill. In some of the benchmarked titles currently in my play library, the performance difference either way is only +/-2.5%. The only two games I'm interested in where Intel offers a nice performance uplift (13-16%) are Death Stranding and Far Cry 6. I haven't touched those titles yet, but I'm definitely looking to add a third title with similar gains, Spider-Man Remastered, for my nephews and nieces. 

Bottom line: I'm not interested in RPL/Zen 4 anymore (a growing sentiment). Unless Zen 4 gets an X3D at a reasonable asking price plus affordable boards, it's looking more likely I'll grab a 5800X3D.


----------



## Tek-Check (Nov 6, 2022)

CallandorWoT said:


> 7600x beats x3d and raptor lake according to this video, in highs, in lows, everything. as long as you pair with ddr5 6000 cas 30... how come I didn't realize this when I bought my 13600k on launch day... im so fucking confused right now. ugh my head hurts.


If you game only and do nothing more, the 7600X is marginally better: nothing to worry too much about. You will, however, not have an upgrade path from the 13600K.


ShiningSapphire said:


> You made no mistake with this choice. 13600K is much better all-rounder beating 7600X by a lot in multi core workloads and providing great performance in games.


True that.


ShiningSapphire said:


> Here you can read a review in which the testing methodology in games is based on finding the most cpu heavy scenarios (not built-in benchmarks like so many reviewers like to do) to show differences between different cpus and as you will see 13600K performs really solid better than 7600X in games


Doubtful. HUB's recent tests give the edge to the 7600X in gaming.


----------



## Space Lynx (Nov 6, 2022)

Tek-Check said:


> If you game only and do nothing more, 7600X is marginally better. Nothing to worry too much about. You will, however, not be able to upgrade from 13600K.



I don't want to upgrade after this. The industry is going crazy these days; this is it for me for several years, lol.


----------



## EatingDirt (Nov 6, 2022)

Asni said:


> In that review, the 13900k performs 13% better than the 5800x3d (100/88.6). The 13900k review is 15 days old, nothing changed.
> I'd just like to know how we went from 13% with a rtx 3080 to 6.2% with a rtx 4090 in a completely cpu bound scenario.


What do you mean nothing changed?

1. Different games were benchmarked. *Games are not a monolith, the 13900k being 13% faster in 12 games doesn't mean it'll be 13% faster in 53 games. It's pretty clear that the 13900k test suite favored the 13900k over the 5800X3D.*
2. The entire GPU & architecture changed. *This can change the performance even in similar games.*
3. A different Nvidia driver was used. *See #2.*


----------



## ShiningSapphire (Nov 6, 2022)

Tek-Check said:


> Doubtful. HUB recent tests giver edge to 7600X in gaming.


This is a very reliable source in my country with a long history. HU doesn't even say where in each game they test, and for some titles they run built-in benchmarks. That's why their results are flattened and the differences between processors are small.


----------



## nomadka670 (Nov 6, 2022)

The 7700X with DDR5-6000 CL30 memory beats the 13700K, likewise with a 4090.

In games like CS:GO and Valorant, Intel doesn't stand a chance.

So when you link a Cyberpunk result where the 13600K beats all the Ryzens, it's a bit misleading.

It's more a matter of which games you play; pick the best processor for those.


----------



## Nopa (Nov 6, 2022)

CallandorWoT said:


> Well, I kind of regret getting raptor lake now after watching this video... looks like 7600x beats it easily across the board as long as you slot in some ddr5 6000 cas 30 ram. i think even lows are better by 20% with the 7600x over the 5800x3d and 13600k... turns out 7600x was the sleeper winner all along you just have to use really high end ram.
> 
> fuck. i don't know know what to do now. i should have never betrayed my love for AMD after all...
> 
> 7600x beats x3d and raptor lake according to this video, in highs, in lows, everything. as long as you pair with ddr5 6000 cas 30... how come I didn't realize this when I bought my 13600k on launch day... im so fucking confused right now. ugh my head hurts.


I wonder: would a 7950X paired with very low-latency CL30 DDR5-6000 beat a 13900K with the best DDR5-7800 as well?


----------



## ausmisc (Nov 6, 2022)

Hardware Unboxed just seems to get higher Zen 4 results than the majority of reviews, even when comparing the same games at the same resolutions. Maybe that's in-game benchmark vs. in-game play, I'm not sure. The general launch-review consensus is that a DDR5 13600K competes with the 7700X.






Launch-Analyse Intel Raptor Lake (Seite 4) | 3DCenter.org — Wednesday, Oct 26, 2022, by Leonidas: "The average of 28 launch reviews, with ~4,230 evaluated application benchmarks and ~2,480 evaluated game benchmarks, ultimately leads to a comparatively clear result."

www.3dcenter.org


----------



## nomadka670 (Nov 6, 2022)

ausmisc said:


> Hardware Unboxed just seem to have higher Zen4 results compared to the majority of reviews. Even when comparing the same games at the same resolutions, but maybe that's in-game benchmark vs in-game play I'm not sure. The general launch review consensus is that a DDR5 13600K competes with the 7700x.
> 
> 
> 
> ...


That's why I'm asking for a 7700X vs. 13700K benchmark with the right RAM and a 4090.

Let's see the truth.


----------



## Nopa (Nov 6, 2022)

Intel and motherboard manufacturers have been pushing this whole DDR5 8000-9000+ speed race, i.e. the higher, the better.

I want to know whether low-latency CL30 DDR5-6000 really makes up for DDR5-9000+ speeds in real-world gaming.


----------



## Cryio (Nov 6, 2022)

Thank you for finally having all games use their proper APIs in these charts.


----------



## Asni (Nov 6, 2022)

EatingDirt said:


> What do you mean nothing changed?
> 
> 1. Different games were benchmarked. *Games are not a monolith, the 13900k being 13% faster in 12 games doesn't mean it'll be 13% faster in 53 games. It's pretty clear that the 13900k test suite favored the 13900k over the 5800X3D.*
> 2. The entire GPU & architecture changed. *This can change the performance even in similar games.*
> 3. A different Nvidia driver was used. *See #2.*


Nothing changed about the platform. As I've been saying since yesterday, they went from an RTX 3080 to an RTX 4090, a more powerful GPU. That's what changed, and that's what should have caused a bigger gap.
Again, some of the additional games show an even bigger gap (I listed 5 of them), even though the initial test suite favored the 13900K.

The difference shouldn't be 13%, that's obvious, but it can't be half of that with the most powerful GPU on the market. I don't know which result is incorrect, but they can't both be right.


----------



## Tek-Check (Nov 6, 2022)

ShiningSapphire said:


> This is a very reliable source in my country with a long history. HU doesn't even give the places in the games where they test and for some titles they're running built-in benchmarks. This is why their results are flattened and the differences between the processors are small.





nomadka670 said:


> Its more like which game you play and Pick the best Processor





ausmisc said:


> Hardware Unboxed just seem to have higher Zen4 results compared to the majority of reviews.


At the end of the day, folks, Raptor Lake CPUs have a slight edge in gaming with the current imperfect measurements (different RAM and all...), but this is nothing revolutionary to lose your head over or argue about dogmatically. The difference between the top three CPUs is 1.5-9%, which is mostly negligible and depends on RAM and the selection of games. 

The average difference seen below is not necessarily relevant for each individual user, depending on their system and game content. Plus, AMD is now working with Microsoft and game developers on reports that the 7900X and 7950X are a bit slower when both chiplets are in use. Expect some improvements in the gaming performance of those top CPUs in the months ahead.


----------



## EatingDirt (Nov 6, 2022)

Asni said:


> Nothing changed about the platform, as i'm saying since yesterday they went from a rtx 3080 to a rtx 4090 which is a more powerful gpu. That's what changed and that's what should have caused a bigger gap.
> Again, some of the additional 43 games have even a bigger gap (i listed 5 of them) even if the initial test suite was favoring the 13900k.
> 
> The difference shouldn't be 13%, that's obvious, but it can't be half of that with the most powerful gpu on the market. I don't know which one is incorrect but they can't coexist.


But something did change about the platform. I listed both: 3080 to 4090, and different drivers. This can change _previously_ tested game results, since the new card may introduce different limitations than the old one did. Raw power isn't _everything_ when you're limited by the CPU: architecture and driver overhead/optimization can play large roles once you start hitting CPU limits in a game.

The 5800X3D is also an anomaly among CPUs. It's not simply a brute-force GHz-and-IPC CPU; it relies on its cache to do most of the heavy lifting. Many games still favor brute force, and those show up where the 13900K is clearly ahead in the 1080p graph.


----------



## Nater (Nov 6, 2022)

Asni said:


> In that review, the 13900k performs 13% better than the 5800x3d (100/88.6). The 13900k review is 15 days old, nothing changed.
> I'd just like to know how we went from 13% with a rtx 3080 to 6.2% with a rtx 4090 in a completely cpu bound scenario.


It has nothing to do with the CPU; it's the 4090. It's a trend I've noticed in all the reviews I've seen.


----------



## Nopa (Nov 6, 2022)

Asni said:


> Nothing changed about the platform, as i'm saying since yesterday they went from a rtx 3080 to a rtx 4090 which is a more powerful gpu. That's what changed and that's what should have caused a bigger gap.
> Again, some of the additional 43 games have even a bigger gap (i listed 5 of them) even if the initial test suite was favoring the 13900k.
> 
> The difference shouldn't be 13%, that's obvious, but it can't be half of that with the most powerful gpu on the market. I don't know which one is incorrect but they can't coexist.


NVIDIA Game Ready driver issues with the hardware scheduler and driver overhead, perhaps?


----------



## Gica (Nov 6, 2022)

EatingDirt said:


> What do you mean nothing changed?
> 
> 1. Different games were benchmarked. *Games are not a monolith, the 13900k being 13% faster in 12 games doesn't mean it'll be 13% faster in 53 games. It's pretty clear that the 13900k test suite favored the 13900k over the 5800X3D.*
> 2. The entire GPU & architecture changed. *This can change the performance even in similar games.*
> 3. A different Nvidia driver was used. *See #2.*


And Windows 10, not 11. Since Alder Lake, it has been known that Windows 11 should be used.
The 13900K review used Windows 11; now Windows 10 was used, and I don't understand why.


----------



## W1zzard (Nov 6, 2022)

Cryio said:


> Thank you for finally having all games using they proper APIs in these charts.


I thought of you before publishing the article and made sure Days Gone is DX11 



Gica said:


> The 13900K review used Windows 11; now Windows 10 was used, and I don't understand why.


Because the vast majority of gamers are on Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next... or maybe something with the 7700X or 7600X or 13700K. I guess I could start a poll


----------



## Voodoo Rufus (Nov 6, 2022)

It's such a great time right now, so many good options for hardware that you can't go wrong. All depends on preferences, needs and your pocket book.

Although from my perspective, anyone still gaming at 1080p is probably on a lower-end GPU. I wonder how scaling would work for something like a 3060-class GPU instead of the 4090 at those resolutions. Things run so fast with an xx80-class GPU or better that 1080p runs are basically for ranking, not practicality. And as we know, at 4K or close it's all GPU-bound, so you can get a lot of runway out of a CPU a couple of generations old.


----------



## Gica (Nov 6, 2022)

W1zzard said:


> Because the vast majority of gamers are on Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next... or maybe something with the 7700X or 7600X or 13700K. I guess I could start a poll


The migration from W10 to W11 is free.
The recommendation for Alder Lake and Raptor Lake owners is to migrate to W11, and I assume they did. A fair comparison would be a review on W11, because it disadvantages neither platform, whereas W10 can create problems for the Intel platform since it cannot efficiently manage the P and E cores.


----------



## Space Lynx (Nov 6, 2022)

W1zzard said:


> Because the vast majority of gamers are on Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next... or maybe something with the 7700X or 7600X or 13700K. I guess I could start a poll



Do a poll!  

I'm curious what people want.


----------



## W1zzard (Nov 6, 2022)

Gica said:


> The migration from W10 to W11 is free.
> The recommendation for Alder and Raptor owners is to migrate to W11. I guess they did.


A ton of people have not used the upgrade offer, or adoption rates wouldn't be as low as they are



CallandorWoT said:


> Do a poll!
> 
> I'm curious what people want.











Ideas for next Megabench? — What should I test next? Also accepting suggestions if you can think of something other than what's listed in the poll options. Previous articles: https://www.techpowerup.com/review/rtx-4090-53-games-core-i9-13900k-vs-ryzen-7-5800x3d/...
(www.techpowerup.com)


----------



## Gica (Nov 6, 2022)

W1zzard said:


> A ton of people have not used the upgrade offer, or adoption rates wouldn't be as low as they are


In a review, it is extremely important not to disadvantage any platform. W11 does not disadvantage the 5800X3D, but the tests are inconclusive for the 13900K if W10 is used, because W10 cannot effectively divide the load between the P and E cores. There is no P/E awareness in W10, and it is very likely that some "P" loads get assigned to the E cores. Honestly, it's like testing modern processors with Cinebench R15 or older.
P.S. I'm using W10 on the i5-12500 system because this processor doesn't have E cores. If it did, I'd migrate to W11


----------



## thestryker6 (Nov 6, 2022)

I've been greatly appreciating these comparison pieces, as they help show where things stand. If I had an AM4 setup here, I'd buy a 5800X3D when on sale, just because of how far it punches above its weight in gaming. I'm hoping we see a Zen 4 X3D next year, and that Intel's MTL competes, because this has been a great time for customer choice in CPUs.


----------



## Mussels (Nov 7, 2022)

> Jumping straight into the action, and starting with the highest 4K Ultra HD resolution, which is the native stomping ground for the GeForce RTX 4090, we're seeing that the Ryzen 7 5800X3D is matching the Core i9-13900K "Raptor Lake" very well. Averaged across all 53 games, the i9-13900K is a negligible 1.3% faster than the 5800X3D.



This is what I was expecting to see in the previous articles (although they didn't focus on the same things).
With the usual CPU-review performance-per-watt and power-efficiency graphs thrown in, it does make the new Intels seem like a "why the f*ck would anyone want this, for gaming?"


0.1% lows come out ahead according to reviews that cover them, which isn't a surprise, since the Intel systems have time-based and thermal throttles to worry about: if they hit the Tau time limit and drop to the 125 W PL1, they *may* start to show small FPS drops and microstutter that the X3D, at its lower wattages, doesn't.


----------



## rom64k (Nov 7, 2022)

Thanks for the amazing review.

About the bottleneck generated between the 5800X3D and the RTX 4090, here's my experience

I'll leave you two captures from the Modern Warfare II performance test at 4K.

This one is taken with my new 5800X3D and RTX 4090.

https://bit.ly/3NCqBkd

This other one was taken before the CPU change, when I had a 5600X paired with the same RTX 4090.

https://bit.ly/3T4wbNm

Both were taken playing at 4K. You can draw your own conclusions.



dir_d said:


> I have a 5600x at 4k 120hz must not buy 5800x3d... So tempting at $329.


Please, read my post #150

Best regards


----------



## daish0 (Nov 7, 2022)

From what I can see in the screenshots, the actual rendering resolution is set to Full HD, I guess because of the DLSS Performance setting.
This is basically a best-case scenario for the CPU in gaming; no wonder the FPS is skyrocketing.

Anyway, I think the 5800X3D is extremely good for gaming right now.

I wanted to upgrade to an Intel system again after my AM4 system because I really hated all the problems I had; it felt like a beta-test product to me, and some things are still not fixed on my board.
But the efficiency of the X3D cache in games makes it really hard not to wait for the next AMD variants with X3D...


----------



## rom64k (Nov 7, 2022)

daish0 said:


> From what I can see in the screenshots, the actual rendering resolution is set to Full HD, I guess because of the DLSS Performance setting.
> This is basically a best-case scenario for the CPU in gaming; no wonder the FPS is skyrocketing.
> 
> Anyway, I think the 5800X3D is extremely good for gaming right now.
> ...


Sorry to read that. My motherboard, an ASUS TUF GAMING B550-Plus, has been running like a charm since day one (though I've had to update the BIOS several times, without any major problems).


----------



## SOGOKU (Nov 7, 2022)

W1zzard said:


> I thought of you before publishing the article and made sure Days Gone is DX11
> 
> 
> Because the vast majority of gamers of Windows 10. I guess I could do a "50 Games Windows 10 vs Windows 11 RTX 4090 + 13900K" article next .. or maybe something with 7700X or 7600X or 13700K. I guess I could start a poll


To this day I don't understand the aversion people have to Windows 11. I've been using it since day 1 and never had an issue! It's even prettier than Windows 10. It's always the same thing when a new Microsoft OS comes out. OK, this time it's a little different with TPM 2.0, but only for older hardware.


----------



## izy (Nov 7, 2022)

SOGOKU said:


> To this day I don't understand the aversion people have to Windows 11. I've been using it since day 1 and never had an issue! It's even prettier than Windows 10. It's always the same thing when a new Microsoft OS comes out. OK, this time it's a little different with TPM 2.0, but only for older hardware.


You can't ungroup the taskbar, for example.


----------



## wolf (Nov 7, 2022)

Awesome review W1z, and great to know I made a good choice to have a top tier CPU in my AM4 system to give it a final send off.

5800X3D gonna be the CPU version of a 1080Ti.


----------



## SOGOKU (Nov 7, 2022)

izy said:


> You can't ungroup the taskbar, for example.


Okay, not in vanilla Win11. Install "Explorer Patcher". Done. There are always simple apps or registry tweaks that shape Windows to our liking. For example, I use Nilesoft's "Shell" app because I don't like the context menu in Win11, and O&O ShutUp since Win10 because I don't like the loss of privacy, and so on.


----------



## docnorth (Nov 7, 2022)

Nice work as usual @W1zzard. 
1) It would be interesting to test the 13900K with E-cores disabled (or at least in 1-2 games that still seem to have problems with the hybrid architecture). 
2) Is there a way for us to see the FPS numbers?
Thanks.


----------



## Tek-Check (Nov 7, 2022)

nomadka670 said:


> Then lets see 7700x vs 13700k
> 
> RTX 4090
> SAME MEMORY
> ...


7700X "crap" in games?
On which planet do you live, dude?
The differences in gaming between the 7700X and 13700K are negligible, a few percentage points on average.

You might find one or the other CPU preferable if you predominantly play certain games.

See the 3D Center chart with all SKUs I posted on the previous page.


----------



## nomadka670 (Nov 7, 2022)

Tek-Check said:


> 7700X "crap" in games?
> On which planet do you live, dude?
> The differences in gaming between the 7700X and 13700K are negligible, a few percentage points on average.
> 
> ...






Let me explain. There are a lot of benchmarks where the 7700X outperforms the 13700K. But they forget to mention that if we retest with CS:GO, Valorant, Hitman, and Horizon, the 7700X is way better than the 13900K. So on TechPowerUp, in these games, yes, Intel is king, no doubt.


----------



## Nopa (Nov 7, 2022)

Tek-Check said:


> 7700X "crap" in games?
> On which planet do you live, dude?
> The differences in gaming between the 7700X and 13700K are negligible, a few percentage points on average.
> 
> ...


The differences are close, but the 13700K does beat the 7700X in 9/10 tasks.


----------



## SOGOKU (Nov 7, 2022)

I really dislike fanboyism! Whenever people are changing their 5950Xs for 13900Ks and not the 7950X, there is always someone saying that AMD is so much better than Intel!


----------



## nomadka670 (Nov 7, 2022)

SOGOKU said:


> I really dislike fanboyism! Whenever people are changing their 5950Xs for 13900Ks and not the 7950X, there is always someone saying that AMD is so much better than Intel!


I'm not a fanboy, I have both of them. All I'm saying is that it's not 100% the real picture.


----------



## AnotherReader (Nov 7, 2022)

Bwaze said:


> For that conclusion we'll have to see the 7x00X3D benchmarks.
> 
> I think everybody was kind of surprised to what extent extra cache helped the Zen 3 in gaming - maybe even AMD, since they didn't plan to use 3D cache in Zen 4 from the start.
> 
> ...


My forecast is based on the difference between the 5800X and 5800X3D; this accounts for the lower clock speeds. The actual difference will, of course, be seen when reviewers get their hands on these chips.


----------



## Voodoo Rufus (Nov 7, 2022)

The X3D chips remind me of the K6-2 vs. K6-3 days. The extra cache then helped the K6-3 maintain some competitive value against Intel in those days before the original Athlon. Old trick that still pays dividends.


----------



## Tek-Check (Nov 7, 2022)

nomadka670 said:


> Let me explain. There are a lot of benchmarks where the 7700X outperforms the 13700K. But they forget to mention that if we retest with CS:GO, Valorant, Hitman, and Horizon, the 7700X is way better than the 13900K. So on TechPowerUp, in these games, yes, Intel is king, no doubt.


There is no "king", dude, and there is no need to explain it to me; that's what I have been trying to tell you. You should always be sceptical of individual website reviews. The value of the 3D Center meta-analysis is the *meta-analysis itself*. It averages 28 launch reviews, including TPU's, and it gets better over time as more reviews are taken into account. 

Individual website reviews can get significantly different results when testing the same products in different workloads and with different selections of games. I noticed this with the 6800XT and 3090Ti GPUs, tested on several occasions, on their own and in the context of other GPUs. Differences could be 10-15 fps at 4K, which is not trivial in demanding games.

As I said, for users who have specific games they enjoy, one CPU or the other may fit better. It's about fitting your own needs, not "king" branding. Any CPU could be called "king" only if it really blew the others out of the water, which is not the case with any Intel or AMD CPU from the same product tier. See below.




Branding any new CPU a "king" is a deep misunderstanding of how things work. Even if the 13900K brings on average 9% higher gaming performance, its productivity rating is the same as the 7950X's, with the caveat that the i9 uses *way* more power to stay competitive in those heavy workloads, as tested by several reviewers. This means saving on a new CPU+motherboard, but also spending more on a beefier cooler. Someone buying an i9 purely for gaming is literally wasting their money. In comparison, one could cool a 7950X even with a good air cooler and lose only a few percent of heavy-workload performance, and nothing in games. So everything warrants a more complex perspective and analysis. Remember this.






Nopa said:


> Differences are close, but 13700K does beat 7700X 9/10 tasks.


You are forgetting that 13700K competes with 7900X, and 13600K with 7700X. 
When you put it this way, differences look different.






SOGOKU said:


> I really dislike fanboyism ! When are people changing their 5950x's for 13900K's and not the 7950x, there is always someone saying that amd is so much better then intel !


Who does that?


----------



## mb194dc (Nov 7, 2022)

So at 4K or higher, which is basically the use case for a 4090, there's essentially no difference between them.


----------



## 3x0 (Nov 7, 2022)

rom64k said:


> Thanks for the amazing review.
> 
> About the bottleneck generated between the 5800X3D and the RTX 4090, here's my experience
> 
> ...


Wow, incredible difference. Are you sure the settings were the same for both benchmarks?


----------



## jallenlabs (Nov 8, 2022)

Upgrayedd said:


> I paid $200 for an 11700K and a free mobo both new a few months ago. At that low of a price I could've cared less if new stuff was coming or if there was better stuff out. The price couldn't be beaten for what it offered.


When I bought mine, it was $320 in a combo deal at Newegg and they threw in another motherboard with the deal. Sold the board for retail, so my chip was like $120. Got it under a TEC/waterblock; at idle, it's below room temp. Way more fun than a few frames from a CPU upgrade. That said, I do have a 12700K in my other rig...


----------



## Gica (Nov 8, 2022)

Tek-Check said:


> There is no "king" dude and no need to explain it to me. That's what I have been trying to tell you. You should always be sceptical towards individual website reviews. The value of 3D Center meta-analysis is the *meta-analysis itself*. It averages 28 release reviews, including TPU, and it gets better over time, with more reviews taken into account.
> 
> Individual website reviews can even get significantly different results when testing the same products, but in different workloads and different selection of games. I noticed this with GPU 6800XT and 3090Ti tested on several ocassions, on its own and in context of other GPUs. Differences could be 10-15 fps in 4K, which is not banal in demanding games.
> 
> ...


You make some elementary mistakes.
1. The cooler that drives the 13900K to 100 degrees (its max) drives the 7950X to 95 degrees (its max).
2. The fps/W ratio is roughly equal, but the 13900K offers more fps.
3. Those willing to buy these processors do not skimp on the cooler.

So?


----------



## laszlo (Nov 8, 2022)

Considering that only a minority have (and will have) an RTX 4090, this comparison clearly shows that with previous- and older-gen GPUs, neither of the tested CPUs will be the bottleneck.


----------



## Tek-Check (Nov 8, 2022)

Gica said:


> You make some elementary mistakes.
> 1. The cooler that drives the 13900K to 100 degrees (100 max) drives the 7950X to 95 degrees (95 degrees max).


Please, before you point out anyone's 'elementary' mistakes, read their text properly.

This is what pros need to know before buying a system for constant MT workloads, and not casual home gaming.

The 7950X will reach 95 degrees on any cooler, and that's all there is to it. After that, the CPU stays there and self-regulates. On a 360 mm AIO, you can run a full MT onslaught at full performance potential. On an air cooler, gaming and light MT workloads are fine, with ~3% performance loss in encoding, for example.

With the i9, you cannot do that. Watch reviews with thermal and throttling testing. Have you seen any? The 13900K thermally throttles with both 360 and 420 mm AIOs, and it can lose up to 8% performance beyond 100 degrees while using a whopping 340 W of power. It's an oven for daily MT workloads in someone's company.


Gica said:


> 2. The fps/W ratio is reasonably equal, but the 13900K offers more fps.


What's the point of saying this if I had already mentioned that 13900K offers up to 9% more performance in gaming? Did you actually read properly what I wrote?


Gica said:


> 3. Those willing to buy these processors do not skimp on the cooler


Sure, spend more money, by all means. There is nothing wrong with it.

Just make sure you don't claim later that it's the cheaper platform if you have to add another $100 for a beefier cooler to tame the power-hungry beast.

Again, if you predominantly play games, you don't need an i9; that's money wasted on a halo product. It's like buying a Ferrari in a country without motorways. The i7 and i5 do the gaming job equally well, with a negligible difference.


----------



## Gica (Nov 8, 2022)

I also want to see the miracle of how a 7950X that has reached 95 degrees (the maximum allowed) "self-regulates" without losing performance.
It seems you also missed the lesson on heat dissipation, the W/mm² chapter.


----------



## AnotherReader (Nov 8, 2022)

Gica said:


> I also want to see the miracle of how a 7950X, reached 95 degrees (maximum allowed), "self-regulates" without losing performance.
> It seems that you also missed the lesson on heat dissipation, the W/mm2 chapter.


This very site tested this, and found


> Our testing in this article shows that the performance losses are minimal, even when pairing the Ryzen 9 7950X flagship with an entry-level cooler that's running at slow fan speed settings.


----------



## Viperl0 (Nov 8, 2022)

Really would've appreciated tests for Final Fantasy XIV, Valorant and Apex. I know you mentioned in a previous CPU review that it's hard to bench games like that for comparison because they are on different patch versions at different CPU release dates. But in a straight X vs Y comparison, I think it's a great opportunity to do it.


----------



## Wasteland (Nov 8, 2022)

Tek-Check said:


> You are forgetting that 13700K competes with 7900X, and 13600K with 7700X.


I agree with all of your other points, but this one seems iffy.  Based on pricing, the 13700k squares off against the 7700x, and the 13600K squares off against the 7600x. These aren't favorable match ups for AMD; the 7600x and 7700x compete well in gaming, but lose hugely in productivity workloads to Raptor Lake at analogous price points. 

And if the low-to-mid range Zen 4 CPUs* are going to hang their hats on gaming perf, well, they _also_ lose pretty convincingly to AMD's own 5800x3d, based on platform cost. AMD needs to release some non-X SKUs, or something.  I'm sure AM5 will come into its own eventually, but it ain't there yet.

(* - "low"-end is a term of art, in the context of Zen 4 and Raptor Lake, thus far.  $300+ CPUs just don't qualify for that category, IMO, which sums up the problem here neatly.  Gaming enthusiasts who don't plan to buy an extremely expensive graphics card are almost always best off buying a CPU in the $150-200 range.  Frankly, an R5 5600 or an i5-12400 still offers vastly better value for most gamers than any of the new shiny stuff from _either_ AMD or Intel.)



AnotherReader said:


> This very site tested this, and found


Yeah, HUB also did a pretty good piece on this:










Zen 4's thermal behavior is actually among its most interesting and attractive characteristics.  And AMD kept cooler compatibility with AM4 in the bargain.


----------



## Tek-Check (Nov 8, 2022)

Gica said:


> I also want to see the miracle of how a 7950X, reached 95 degrees (maximum allowed), "self-regulates" without losing performance.
> It seems that you also missed the lesson on heat dissipation, the W/mm2 chapter.


Read, watch and learn, dude, from reviewers and other members here. Don't embarrass yourself out of ignorance and/or arrogance; it's not worth it. There's a lot of heat from the new CPUs and GPUs, and each company deals with it in a slightly different way. Nvidia has even more heat to deal with on the optimized 4 nm node, which is why the new GPUs are gigantic. As for CPUs, AMD has the more elegant solution for thermal and power management this time around, while Intel uses brute force on the i9 to show us that unrestricted power use beyond 253 W brings a few petty percentage points of performance, along with an enormous amount of power and heat to deal with.

The 7950X hits 95 degrees immediately when an MT workload starts, and the all-core frequency is very stable throughout tests. See GN below.




Watch GN review of 7950X
Both the temperature and core behaviour are by design on Zen 4, and the choice of cooler makes a much smaller difference for the 7950X/7900X than for the i9 and i7 CPUs that suck a whopping ~300 W in similar workloads. Accordingly, GN found the i9 to have the worst power efficiency on the market for continuous MT workloads.
Watch GN review of 13900K
It is the i9 that is hardest to cool, precisely because it uses an insane amount of power in those workloads. In the HUB review, the i9 was not able to finish a test without thermal throttling: the cores hit 100 degrees within a few seconds of the start of the test, even with a 420 mm AIO cooler! It loses ~8% of performance in those workloads, which professionals who need strong MT systems for daily work will certainly think about very, very carefully indeed.
This means that if the 7950X can save them 20-30 minutes per 8 hours, day by day, doing the same job as the 13900K, just imagine how many hours of work could be saved over several months of use. Marginal gains are important in this field, as jobs finished faster mean more jobs done over a longer period, and time is money for them. Plus, it's easier to deal with 355 W of system power consumption than with almost 500 W for the 13900K. See HUB below.
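The time-savings argument above is simple accumulation; a quick sketch, with the minutes-per-day and workday figures purely hypothetical, just to show the scale:

```python
# Hypothetical: a system that finishes the same daily MT workload
# 20-30 minutes sooner per 8-hour day, accumulated over working months.
def hours_saved(minutes_per_day: float, workdays: int) -> float:
    """Total hours of extra throughput from a daily time saving."""
    return minutes_per_day * workdays / 60.0

# 25 min/day over roughly 3 months (~65 workdays)
print(round(hours_saved(25, 65), 1))  # 27.1 hours
```

Whether the daily saving is really 20-30 minutes depends entirely on the workload mix; the snippet only illustrates how small per-day margins compound.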



Watch HUB review of 13900K
How much performance does the 7950X lose, you ask? Here is another watch for you, showing that a 360 mm AIO makes little difference versus a 120 mm or air cooler in most cases. If you are on the edge of saving time because you run multi-hour workloads, a 360 mm will save precious seconds and minutes; if not, a good air cooler will do the job too. Other members here have posted more explanations and links for you regarding the cooling of Zen 4. Pretty simple stuff.
HC review of 7950X cooling requirements
Enjoy, and let us know what you think after digesting the relevant information.



Wasteland said:


> I agree with all of your other points, but this one seems iffy. Based on pricing, the 13700k squares off against the 7700x, and the 13600K squares off against the 7600x. These aren't favorable match ups for AMD; the 7600x and 7700x compete well in gaming, but lose hugely in productivity workloads to Raptor Lake at analogous price points.


Price-wise, agreed. Intel wants us to think this way and pick a lower-in-stack Zen 4 CPU to compare with a higher-in-stack RPL part. Of course the 7700X looks worse than the 13700K, no doubt. However, once prices change, the 7700X will compare with the 13600K, as it should.

In terms of power efficiency, core count and workload purpose, I disagree. It's amazing what Zen 4 CPUs can do with fewer cores than the Intel parts: a 16-core CPU matches a 24-core CPU in productivity workloads, and the same goes for the 12-core 7900X against the 16-core i7. Those are high-end consumer CPUs, so they have broader application, far beyond gaming and casual MT. The 7700X loses only ~5% in productivity to the 13600K, which is negligible. That's why I suggested comparing the top three Zen 4 parts with the top three Raptor Lake parts, rather than falling into the price-trap comparison. The 7700X needs to be $320-330 to be very competitive with the i5 on price.

If I were a professional using an MT system daily, 8-10 hours a day all year round, I'd buy a 7900X over a 13700K system and a 7950X over a 13900K system, for simple reasons: easier to cool, definitely more power efficient, and a more future-proof platform. More expensive initially, true, but the gains come over time and the initial investment pays off. I'd only need to slot in a single CPU at the Zen 5 and/or Zen 6 point and call it a day, rather than buying an entire system again.


Wasteland said:


> And if the low-to-mid range Zen 4 CPUs* are going to hang their hats on gaming perf, well, they _also_ lose pretty convincingly to AMD's own 5800x3d, based on platform cost. AMD needs to release some non-X SKUs, or something. I'm sure AM5 will come into its own eventually, but it ain't there yet.


Agreed.


Wasteland said:


> Gaming enthusiasts who don't plan to buy an extremely expensive graphics card are almost always best off buying a CPU in the $150-200 range. Frankly, an R5 5600 or an i5-12400 still offers vastly better value for most gamers than any of the new shiny stuff from _either_ AMD or Intel.)


Agreed.


----------



## Gica (Nov 9, 2022)

Tek-Check said:


> Read, watch and learn dude
> 
> Agreed.


The same as you.
How you can have the same performance in renders when the frequency decreases, only you know.
Agreed?


----------



## HTC (Nov 9, 2022)

Gica said:


> The same as you.
> How to have the same performance in renders when the frequency decreases, only you can know.
> Agreed?



I've seen that graph before: it simulates "lower-quality coolers" by using different fan speeds.

If only a similar test were done with the 13900K, just to see whether the impact of "a lower-quality cooler" is smaller, the same, or bigger: I'd bet it would be bigger, because that CPU already runs hotter to start with.


----------



## Gica (Nov 9, 2022)

The gentleman claims that the 7950X maintains its performance when it reaches 95 degrees. As far as I knew, it reduces its frequency (its default performance) when it reaches this critical threshold.

Very fresh. My "hot Intel" 12500 beaten by 10 W, average CPU+GPU. (screenshot)
Hero! 

Seriously, it smells of frustration among the red fans because the 13900K took the crown. If you don't like it, don't buy it. Simple! In any competition, apart from the position achieved, nobody cares how much the winner consumes or what he eats.


----------



## HTC (Nov 9, 2022)

Gica said:


> The gentleman claims that the 7950X maintains its performance when it reaches 95 degrees.



In ST scenarios he's mostly right, but NOT in MT scenarios, as the graph you posted earlier clearly shows.

The same should hold for the 13900K. In fact, it SHOULD be even worse, because the "starting full-load temp" is higher; but until a similar test is conducted on the 13900K, we won't know for sure.

@W1zzard proved that 95° wasn't a problem for the Ryzen 7000 "Zen 4" CPU series and that, even with "weaker coolers", the CPU wouldn't overshoot its temperature limit, though it WOULD lose performance. I wish he'd do a similar test for the 13900K so we could compare the impact the "various coolers" he used with the 7950X have on the 13900K's performance.


----------



## Tek-Check (Nov 9, 2022)

Gica said:


> A fair comparison would be a review on W11, because it disadvantages neither platform, whereas W10 can create problems for the Intel platform since it cannot efficiently manage the P and E cores.


Hardware Unboxed has already compared Intel's performance in Win10 and Win11. Watch it.
Plus, it was the Thread Director that had the bigger problems with 12th gen, rather than the OS. Thread Director 2.0 has now improved things in 13th gen.



Gica said:


> The same as you.
> How to have the same performance in renders when the frequency decreases, only you can know.
> Agreed?


One, I have already posted the performance differences with different coolers in #165.
You replied to it in #169, but you completely ignored the cooler-and-performance chart.

Two, I also gave you a link in #176 to the very video the chart was taken from.
Have you seen it? I am asking you to engage with the referenced content and pay more attention to what is posted.

Both @W1zzard and Hardware Canucks found negligible differences in MT rendering performance.
In Blender, over a 15-minute workload, you will save 14 seconds on a 360 mm AIO versus a good air cooler.
Would you like the same chart posted again, or will you read #165 and watch the video from #176?

The TPU chart you posted is for academic purposes. No one sane will, in everyday life, run 32 threads in Blender with a Noctua at 40% fan speed; it does not work like that. What is meaningful is to look at just how *small* the frequency drop is from a 420 mm AIO to a Noctua single-tower air cooler at 80-100%: only 70-80 MHz on all threads. This is where those additional seconds come from. The difference is still negligible and supports W1zzard's conclusion, which you should read again. The CPU will attempt to run at 95 degrees with any cooler, and if the cooler is not as performant, the CPU drops the frequency a bit to stay within the temperature envelope.

The same does not happen with the 13900K. In Blender, it hits thermal throttling at 100 degrees just a few seconds after the start of the workload, on both 360 and 420 mm AIOs. No matter how much additional power you throw at it, that's what you get: a power-hungry beast that is extremely difficult to cool in those extended professional workloads, with a performance drop of up to 8% as it approaches 300 W. You want more power and performance? Use LN2. Just don't burn your hands, please.
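Putting the Blender figure quoted above (14 seconds saved on a 15-minute render with a 360 mm AIO versus a good air cooler) into percentage terms shows why it's called negligible:

```python
# 14 s saved on a 15-minute (900 s) Blender render, per the post above
saved_s = 14
total_s = 15 * 60
loss_percent = saved_s / total_s * 100

print(round(loss_percent, 2))  # 1.56 -- i.e. a ~1.6% difference between coolers
```

About a 1.6% delta, which is in line with the "~3% performance loss in encoding" on an air cooler mentioned earlier in the thread.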


----------



## Gica (Nov 9, 2022)

OK. We pay $1000+ for the CPU+mobo+RAM, we pay another $1000+ for the video card... at least $3000 for the system, but then we buy a cooler under $100 and complain about consumption.
Which part of that is supposed to make sense? Because it sounds like nonsense to me. Those who want the best do not look at costs or consumption.
What would the difference between the 7950X and 13900K be in normal home use? A kilowatt-hour a week? Two a month? You talk as if the owner of an AMD system has no expenses, as if his processor produces energy and sells it on the free market.
Do you understand that reviews test the extremes, and that you will not see such consumption in real life? It seems not. The 13900K consumes significantly less than the 7950X in single-threaded mode (and is automatically more economical at idle), so average consumption depends mostly on the user; and with the performance-per-watt ratio roughly equal, as even you say, there is an open question, because it's close to even: the 13900K can win in some applications and lose in others.

Below you have a 12500, used for an hour only for internet. 3.9 W average, CPU+iGPU. Show me a Ryzen that consumes that much or less, please.
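To put the "kilowatt a week, two a month" framing into actual money, a quick sketch; the wattage gap, daily load hours and electricity price are all assumed figures, purely for illustration:

```python
# Extra electricity cost per year for a CPU that draws `delta_watts` more
# under load. All inputs are illustrative assumptions, not measurements.
def annual_cost(delta_watts: float, hours_per_day: float,
                price_per_kwh: float = 0.30) -> float:
    kwh_per_year = delta_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a 100 W gap under load, 2 hours of heavy load per day, 0.30/kWh:
print(f"{annual_cost(100, 2):.2f} per year")
```

Under those assumptions the difference is on the order of 20 currency units a year, which is the point being made: for a home user it is noise next to the system price.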


----------



## Tek-Check (Nov 9, 2022)

Gica said:


> OK. We pay $1000+ for CPU+motherboard+RAM, we pay another $1000+ for the video card... at least $3000 for the system, but we buy a cooler under $100 and complain about the consumption. Which part is going to add a comma to the bill? It sounds like hell to me. Those who want the best do not look at costs or consumption.


I have tried to explain this in post #176 above. Have you missed it? Working people who want the best DO look at costs, consumption, efficiency and time saved. It is a prudent thing to look into for your business. They are not impressed by the brute force of a CPU. From #176:

"It is i9 that is hardest to cool precisely because it uses insane amount of power in those MT workloads. In HUB review, i9 was not able to finish a test without thermal throttling because cores hit 100 degrees soon, so soon that it throttles just after a few seconds into the test, even with 420 mm AIO cooler! It loses ~8% of performance in those workloads, which *professionals who need strong MT systems for daily workloads will certainly think about very, very carefully,* indeed. This means that *if 7950X can save them 20-30 minutes per 8 hours, day-by-day, to do the same job instead of using 13900K, just imagine how many hours of work could be saved over several months of usage. Marginal gains are important in this field, as faster jobs done mean more jobs done over longer period of time. And time is money for them.* Plus, it's easier to deal with 355W of system power consumption than with almost 500W with 13900K."


Gica said:


> What would be the difference between the 7950X and 13900K in normal home use? A kilowatt a week? Two a month? You talk as if an AMD owner has no expenses, as if his processor produces energy and sells it on the open market.


You are putting words into my mouth. I did not say that. I say that a "normal" home user does not need either of these two CPUs. It's a huge overkill in both cases.


Gica said:


> Do you understand that reviews test the extremes, and you will not see such consumption in real life? It seems not. The 13900K consumes significantly less than the 7950X in single-threaded work (and is automatically more economical at idle), so average consumption depends only on the user; and as for the performance-per-watt ratio, it is roughly equal: the 13900K can win in some applications and lose in others.


Extreme test cases are an indicator of how those CPUs would behave in daily usage across different fields, businesses and industries.
And I did say that users should buy the CPU that best fits their use case and workloads. Read my posts again.
When the two top CPUs perform roughly equally in extended MT workloads, the 7950X is the system to go for, for the reasons I posted above. It is that simple.

If I were a professional using an MT system 8-10 hours a day over the entire year, I'd buy a 7900X over a 13700K, or a 7950X over a 13900K system, for simple reasons: easier to cool, definitely more power efficient, no throttling, and AM5 is a future-proof platform. More expensive initially, true, but the gains come over time and the initial investment pays off. I only need to slot in a single CPU when Zen 5 and/or Zen 6 arrive and call it a day, rather than buying an entire system again for my business or home use.


Gica said:


> Below you have a 12500, used for an hour only for internet. 3.9 W average, CPU+iGPU. Show me a Ryzen that consumes that much or less, please.


There are many efficient Intel CPUs; the 12400 is even better. But the 13900K is not one of them, and it was never designed to be in those extended MT workloads. Man, they threw 24 cores (!) at it, with power usage never seen before, to stay competitive with the 16-core 7950X in those workloads.


----------



## wheresmycar (Nov 9, 2022)

Tek-Check said:


> I have tried to explain this in the post #176 above. Have you missed it? Working people who want the best DO look at costs, consumption, efficiency and time-saving. It is a prudent thing to look into for your business. They are not impressed by the brute force of CPU. From #176:
> 
> "It is i9 that is hardest to cool precisely because it uses insane amount of power in those MT workloads. In HUB review, i9 was not able to finish a test without thermal throttling because cores hit 100 degrees soon, so soon that it throttles just after a few seconds into the test, even with 420 mm AIO cooler! It loses ~8% of performance in those workloads, which *professionals who need strong MT systems for daily workloads will certainly think about very, very carefully,* indeed. This means that *if 7950X can save them 20-30 minutes per 8 hours, day-by-day, to do the same job instead of using 13900K, just imagine how many hours of work could be saved over several months of usage. Marginal gains are important in this field, as faster jobs done mean more jobs done over longer period of time. And time is money for them.* Plus, it's easier to deal with 355W of system power consumption than with almost 500W with 13900K."
> 
> ...



I haven't read this but good luck knocking some common sense into GICA... he's one of the strangest characters on here, pays no attention to member input, derails the topic at hand to conceal his lack of understanding of any given situation, and will repeat everything again no matter how well you present FACTs/etc.

We couldn't convince Gica that it's not the "actual market" which determines the MSRP of GPUs but the manufacturers themselves. If that doesn't sink in, NOTHING WILL!!


----------



## Tek-Check (Nov 9, 2022)

Well, back to the topic. 5800X 3D is an amazing CPU for gamers and far, far better value for buck than 13900K will ever be. Nothing more to add.


----------



## wheresmycar (Nov 9, 2022)

Tek-Check said:


> Well, back to the topic. 5800X 3D is an amazing CPU for gamers and far, far better value for buck than 13900K will ever be. Nothing more to add.



I hope he's not suggesting otherwise lol. Wouldn't touch a 13900K for gaming, not even with a 10-foot telescopic pole that stretches to 20 feet, hehe.

Can't blame the ordinary folk though... the number of times I've read "X900/XX900 BEST GAMING CPU" certainly doesn't help people who don't know any better. The 13600K or 13700K is where gaming options should draw the line.


----------



## Tek-Check (Nov 10, 2022)

wheresmycar said:


> The 13600K or 13700K is where gaming options should draw the line.


Even 12400 is great in gaming and great value.


----------



## wheresmycar (Nov 10, 2022)

Tek-Check said:


> Even 12400 is great in gaming and great value.



can't argue with that! TBH I'm all for previous-gen CPUs at this point in time. AM5 went AWOL and RPL is juicing it too hard. Neither tickles my fancy! A 12400/12600K/5600/5600X/5700X is plenty for any gamer's dream PC... although I've got my eyes set on the slightly pricier 5800X3D.


----------



## Nopa (Nov 10, 2022)

@W1zzard Does this 13900K power consumption figure from HWUB include the MB, RAM, fans, and even the GPU too?
I mean, 493 W seems a little off somehow...


----------



## Hervon (Nov 10, 2022)

Read the title: SYSTEM power consumption.


----------



## Tek-Check (Nov 10, 2022)

Nopa said:


> @W1zzard Does this 13900K power consumption figure from HWUB include the MB, RAM, fans, and even the GPU too?
> I mean, 493 W seems a little off somehow...


It's without the GPU.
It's bonkers with the 13900K! So bad for the i9 that der8auer spent three days tweaking the CPU, e.g. undervolting, to try to tame it into meaningful usage.


----------



## Gica (Nov 10, 2022)

wheresmycar said:


> I haven't read this but good luck in knocking some common sense into GICA.... he's one of the strangest characters on here, pays no attention to member input, derails the topic in hand to conceal his lack of understanding on any given situation and will repeat everything again no matter how well you present FACTs/etc.
> 
> We couldn't convince Gica that its not the "actual market" which determines the MSRP of GPUs but the manufacturers themselves. If that doesn't sink home, NOTHING WILL!!


Or I ignore some pro-AMD characters and look for information from several channels.

Intel Core i9 13900K: Impact of MultiCore Enhancement (MCE) and Long Power Duration Limits on Thermals and Content Creation Performance

What is the difference in performance between an AIO and a $100 air cooler, even with MCE ON (overclocking)? Zero, or... zero?

*P.S. The source of the review is high quality and covers an important part of the reason for buying a processor.*
Note: let's accept a +/- 2% tolerance margin, however.


----------



## Tek-Check (Nov 10, 2022)

Gica said:


> Or I ignore some pro-AMD characters and look for information from several channels.
> Intel Core i9 13900K: Impact of MultiCore Enhancement (MCE) and Long Power Duration Limits on Thermals and Content Creation Performance
> What is the difference in performance between an AIO and a $100 air cooler, even with MCE ON (overclocking)? Zero, or... zero?
> *P.S. The source of the review is high quality and covers an important part of the reason for buying a processor.*
> Note: let's accept a +/- 2% tolerance margin, however.


Dude, did you read their closing words?
"We are going to continue to disable MCE and enforce the P1/P2 power limits in our testing and workstations, but we want to make it clear that there is no right or wrong answer here. It is just a matter of tradeoffs, and *the 30-40C temperature increase simply does not constitute an acceptable tradeoff for us as a workstation system integrator*." Therefore, Puget Systems is NOT happy to run the 13900K in a hotter mode, regardless of the cooler used. They are happy to take a 10-20% performance drop in specific workloads in order to run the CPU at an acceptable temperature and power draw. Simple.

End of story. Focus on the thread topic. 5800X 3D is far better value CPU for gaming than 13900K will ever be. That is the topic of this thread.


----------



## Gica (Nov 11, 2022)

"In most of the workloads we tested, it is very clear that *using Precision Boost Overdrive (PBO) and Core Performance Boost (CPB) is not worth it*. Photoshop, Lightroom Classic, Premiere Pro, After Effects, and DaVinci Resolve all showed no difference in performance when we disabled these settings from their "Auto" default, yet the CPU temperature dropped as much as 30C, or even 40C!"
Review for the 7950X, red boy.

The 5800X3D costs as much as a 13600KF (7600X). If you opt for it, you are investing in DDR4 memory in 2022, red boy. About the same performance in gaming, but effectively destroyed by the new processors in other applications.
The 13900K is clearly on topic. I do both, red boy.

If the 13900K reaches its maximum temperature in only two heavy loads, the 7950X reaches it in all of them. With the same cooler!
I think watts/mm² was also discussed here, and that Ryzen needs a more efficient cooler than the 13th gen because the dissipation surface is smaller. Elementary physics.


----------



## Tek-Check (Nov 11, 2022)

Gica said:


> "In most of the workloads we tested, it is very clear that *using Precision Boost Overdrive (PBO) and Core Performance Boost (CPB) is not worth it*. Photoshop, Lightroom Classic, Premiere Pro, After Effects, and DaVinci Resolve all showed no difference in performance when we disabled these settings from their "Auto" default, yet the CPU temperature dropped as much as 30C, or even 40C!"
> Review for 7950X, red boy.


I have never suggested that the 7950X should be overclocked with PBO or another tool to improve performance. Those tools are OFF by default on Zen 4 systems. I'd never do it, as the CPU is already maxed out enough out of the box, and I agree with what Puget Systems said about those tools. Totally not worth it.

*However*, Intel allows motherboard partners to enable MCE, or whatever name they use for it, so on several MSI and Gigabyte boards you get unrestricted power use as an "auto", out-of-the-box setting, without even knowing. And then reviewers benchmark with such motherboards, often without checking this.

The 13900K should ship with the motherboard set by default to Intel's limit of 253 W, aka PL1=PL2. Anything beyond this in default settings is pure cheating by motherboard vendors for benchmark wins and sales purposes. If any vendor does the same on an AMD system, they should be condemned just as quickly. I believe AMD is pretty strict about those tools being OFF by default.

It is not surprising that the 13900K attracts bad press for power usage and thermal throttling when motherboard vendors cheat the logic of PL2, in reality offering MCE by default and allowing the CPU to run at 300 W out of the box. Hence, on many motherboards, users are by default forced to buy a high-end cooler to avoid thermal throttling in heavy, continuous MT tasks, which is not the case with the 7950X. Intel has only themselves to blame for this lax and careless power policy.


Gica said:


> If the 13900K reaches the maximum temperature in only two heavy loads, the 7950X reaches it in all. With the same cooler! I think it was also discussed here about watt/mm2 and that ryzen needs a more efficient cooler than 13th because the dissipation surface is smaller. Elementary physics.


 I have Intel, AMD and ARM systems at home and at work and happily use all of them. I don't get angry. 


Gica said:


> 5800X3D costs as much as 13600KF (7600X). If you opt for it, you invest in DDR4 memories in 2022


That's exactly why this CPU is the most popular to buy right now. I don't buy it, as I don't need it, but thousands upon thousands of AM4 owners or builders can make a simple upgrade and extend the longevity of their systems for another couple of years, or build an affordable system they will enjoy. Pretty simple.

Yet, again, stick to the topic. 5800X 3D is both very competitive and more affordable than 13900K for gaming. No brainer.


----------



## HTC (Nov 11, 2022)

Tek-Check said:


> That's exactly why this CPU is the most popular to buy right now. I don't buy it, as I don't need it, but thousands upon thousands of AM4 owners or builders can make a simple upgrade and extend the longevity of their systems for another couple of years, or build an affordable system they will enjoy. Pretty simple.



I'm using a 2600X and a 5800X3D is going to be my next upgrade.

Unfortunately, it will have to wait, because the cheapest I can find it here in Portugal is 394€... since I'm not in any rush, I don't mind the wait...


----------



## Solaris17 (Nov 11, 2022)

Viperl0 said:


> Really would've appreciated tests for Final Fantasy XIV, Valorant and Apex. I know you mentioned in a previous CPU specific review it's hard to bench games like that for comparison because the games or on different patch versions at different CPU release dates. But in a straight X vs Y I think it's a great opportunity to do it.



I pull between 100-190 FPS on a 13900K and 4090 at 4K, settings cranked, depending on zone in XIV. That is with my machine driving 3x 4K displays though, so you might get more.


----------



## Gica (Nov 12, 2022)

Tek-Check said:


> Bla bla bla


Look at the fine print in the reviews and you will see that those who want a processor for specific tasks can correctly opt for only one of them. For example, if you want a processor for video editing and encoding, you are definitely wrong if you choose AMD over Intel 13th gen (hint: read the conclusions for Premiere Pro). AMD also has advantages in other applications, and then you are wrong if you choose Intel.

In short, and on topic:
1. Whoever buys Intel is not crazy, nor stupid. The one who criticizes his choice is the stupid one. The same rule applies to AMD.
2. The 5800X3D is an excellent AM4 *upgrade* for gaming only, not a reason to invest in a "new" platform. It is OK only if you are changing just the processor and have gaming as your only target.
It costs as much as the 5900X; I don't think you will notice differences in gaming, but the 5900X destroys the 5800X3D in everything else.


----------



## Tek-Check (Nov 12, 2022)

Once again, temper your language and stop going personal with members. If not, your posts will be reported and removed.
You will never successfully argue your case if you post personal comments. You have to learn to refrain from such comments and disagree with respect.

Secondly, I have never said that the 13900K is not suitable for video encoding and editing. You are again putting words into my mouth. I was specific about which workloads are less suitable, as shown in the testing by others posted before. GN and HUB tested default configurations as sold to the end user, as many motherboards come unlocked to allow 300 W usage. As I said, you have only Intel to blame for this reckless power policy, not us members who observe the results of that policy. Do not blame the messengers.

If you use a 13900K, enjoy it, but have the decency to allow voices of criticism towards Intel and their partners for shoving a 300 W hot cake of a CPU down the end user's throat from the get-go. If you are OK with that policy, that's fine, but you are not in a position to get personal with members who point out this reckless power policy. On a tech forum you are only allowed to argue your case with respect, without labelling anyone with insults or similar. If you are unable to argue your case respectfully without calling people names, this is not a good place for you. Move on.

Again, the 5800X 3D is ~15% faster in gaming than the 5900X, so it's a great option both as a single-component upgrade and as the CPU for someone who wants to build an AM4 gaming system. No matter what you say about it, it's not going to fly, especially when you use emotive language such as "5900X destroys 5800X 3D in the rest" to make an irrelevant MT comparison in a gaming topic. Ridiculous. Someone who mainly games does not care about 15% less performance on average in productivity applications. Finally, the 5800X 3D is, as said many times, undeniably better value for buck in gaming than the slightly faster 13900K, as the results in this thread show. There is nothing more to add.


----------



## 95Viper (Nov 12, 2022)

Stay on the topic.
Stop the insults.
Report problems... do not become a problem.
Read the Guidelines and follow them.

Any more of the <arguing/name calling/guideline violations> will incur warnings and thread bans.


----------



## Ayhamb99 (Nov 12, 2022)

The comparison proves that going with a 13900K specifically for gaming is a really dumb idea... The 13600K would get you 2-3% less performance, but at a much more reasonable price and with far saner power/temperature requirements when it comes to gaming. The 5800X3D is still a beast of a gaming CPU even on the AM4/DDR4 platform. It makes me wonder if Intel will try to come out with a similar SKU that has more cache (maybe a 13650K or something) instead of just binning the i9 to reach higher clocks, because that strategy isn't really appealing with the horrendous power consumption and temperatures that come with it.
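One way to see the value argument is dollars per point of relative gaming performance. A hedged sketch, using launch MSRPs and treating the 13900K as the 100 baseline with the 13600K at roughly 97.5 (my reading of the 2-3% figure above):

```python
# Dollars paid per point of relative gaming performance (13900K = 100).
# Launch MSRPs; the 97.5 figure reflects the thread's rough 2-3% gap.
def dollars_per_point(price: float, rel_perf: float) -> float:
    return price / rel_perf

print(round(dollars_per_point(589, 100.0), 2))  # 13900K
print(round(dollars_per_point(319, 97.5), 2))   # 13600K
```

Under those assumptions the i9 costs nearly twice as much per unit of gaming performance, which is the whole case against it as a gaming-only buy.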


----------



## Tek-Check (Nov 12, 2022)

Ayhamb99 said:


> The comparison proves that going with a 13900k specifically for gaming is a really dumb idea.... The 13600k would get you 2-3% less performance but at a much reasonable price and power/temperature requirements when it comes to gaming... The 5800X3D is still a beast of a CPU for gaming even with the AM4 and DDR4 Platform, makes me wonder if Intel is going to try to come out with a similar sku that has more cache (Maybe a 13650k or something) instead of just binning the I9 to reach higher clocks because that strategy isn't really appealing with the horrendous power consumption and temperatures that come with it.


In gaming, the power and temperatures of the 13900K are fine. There is nothing wrong there. There is everything wrong with its value for buck if someone predominantly games. The 13600K or 5800X 3D are much better options, with similar gaming performance across resolutions.


----------



## Nopa (Nov 14, 2022)

Tek-Check said:


> In gaming, power and temperatures of 13900K are fine. There is nothing wrong with it. There is everything wrong with a value for buck if someone predominantly games. 13600K or 5800X 3D are much better options, with similar performance in gaming across resolutions.


The 7600X at $299 is tempting too when you pair it with a cheap B650. With CL30 RAM, it even pulls ahead of the 13600K.


----------



## Ayhamb99 (Nov 15, 2022)

Nopa said:


> The 7600X at $299 is tempting too when you pair it with a cheap B650. With CL30 RAM, it even pulls ahead of the 13600K.


But in that case you can also pair the 13600K with CL30 RAM on a cheap DDR5 Z690 board, or even a B660 board if you do not care about OCing, and the total platform cost will be cheaper than the 7600X's...

AM5 boards are still more expensive than LGA1700 boards, unfortunately, so I hope they improve within the next couple of months.
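The totals being argued about can be tallied like this; the component prices below are illustrative assumptions in the spirit of the lists above, not live quotes:

```python
# Rough platform cost totals; all prices are assumptions for illustration.
intel = {"13600K": 320, "DDR5 B660 board": 140, "32GB DDR5-6000 CL30": 180}
amd = {"7600X": 299, "B650 board": 170, "32GB DDR5-6000 CL30": 180}

for name, parts in (("Intel", intel), ("AMD", amd)):
    print(f"{name}: ${sum(parts.values())}")
```

With numbers like these the whole argument comes down to a few tens of dollars either way, mostly in the board price.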


----------



## Nater (Nov 16, 2022)

Ayhamb99 said:


> But in that case you can also pair the 13600K with CL30 with a cheap DDR5 Z690 Mobo or even B660 Board if you do not care about OCing and the total platform cost will be cheaper than the 7600X....
> ...



AM5 socket potential vs Intel's dead socket is worth $29 all day, every day. In fact, it's probably worth $100s, if not $1000s, saved on motherboard upgrades over the next 4-5 years if AM4 is any indicator (assuming you upgrade your CPU every release).


----------



## Tek-Check (Nov 16, 2022)

Ayhamb99 said:


> But in that case you can also pair the 13600K with CL30 with a cheap DDR5 Z690 Mobo or even B660 Board if you do not care about OCing and the total platform cost will be cheaper than the 7600X....
> 
> AM5 boards are still more expensive than the LGA1700 Boards unfortunately so I hope that they improve within the next couple of months.


Gigabyte's B650 board costs only $30 more, but it is clearly superior, with better features across the board, and it's future-proof for upgrades:
- a 20 Gbps USB port (MSI has 5 Gbps)
- 15 ports on the rear I/O (MSI has 10)
- Gen 5 and Gen 4 NVMe slots
- a more powerful 12+2+1 VRM at 60 A per stage, a robust heatsink and an 8-layer PCB
- the other heatsinks are also more robust

MSI's B660 has a dismal VRM heatsink that does not even cover the rear I/O. The other features are bare bones.

B660 should cost no more than $120 for what it offers in comparison with the B650.


----------



## wheresmycar (Nov 16, 2022)

Ayhamb99 said:


> But in that case you can also pair the 13600K with CL30 with a cheap DDR5 Z690 Mobo or even B660 Board if you do not care about OCing and the total platform cost will be cheaper than the 7600X....
> ...



according to this comparison it doesn't look all that bad! A ~$30 investment for forward-gen support sounds like a treat. For me (a 1440p gamer), neither of these hot baked cakes and power-chugging offerings short of $700 makes for an appealing buy. Maybe a little later... with more affordable and efficient non-K/X parts, possibly cheaper boards, and slashed DDR5 kits, for around $550 (or ~$600 at the very top).


----------



## Ayhamb99 (Nov 16, 2022)

Nater said:


> AM5 socket potential vs Intel's dead socket is worth $29 all day every day.  In fact, it's worth probably $100's, if not into the $1000's saved on mainboard upgrades over the next 4-5 years if AM4 is any indicator. (assuming you upgrade your CPU every release)


While the longevity will definitely be an advantage, after what AMD tried to pull with B450/X470 originally not supporting Zen 3 (enabled only after backlash), Zen 3 support for X370/B350 only arriving after Alder Lake launched, plus the poor folks who bought sTRX4, I'm not just going to take AMD's word for it and buy into their promise.


Tek-Check said:


> Gigabyte's B650 board costs only $30 more, but it is definitely more superior, with better features across the board and it's future-proof for upgrades.
> - it has 20 Gbps USB port (MSI has 5 Gbps)
> - 15 ports at rear I/O (MSI has 10)
> - Gen 5 and Gen4 NVMe drives
> ...


I was originally planning to put the MSI PRO Z690-A on the Intel list; with that, the total would have been equal to AMD's total, but I decided to pick the most reliable board that was cheaper on both the AMD and Intel side, and the B660-A was there. The B660 Mortar would also be a choice.


----------



## Tek-Check (Nov 16, 2022)

Ayhamb99 said:


> I was originally planning to put the MSI PRO Z690-A into the Intel list, with that the total price it would have been equal to AMD's total but i decided to choose what was the most reliable board that was cheaper on both AMD and Intel and the B660-A was there, The B660 Mortar would also be a choice as well


Cheaper often means terrible VRM cooling, skimping on ports and other features, etc. Plus, even if you do need the CPU for MT workloads, the 7600X edges it out by 3-4% in gaming, going by HUB's 53-game review, and it's more future-proof for simple drop-in upgrades.


----------



## Wasteland (Nov 16, 2022)

Tek-Check said:


> Cheaper often means terrible VRM cooling, skimping on ports and other features, etc. Plus, if you do need CPU for MT workloads, 7600X edges in gaming 3-4%, following 53 game review by HUB, and it's more future-proof for simple drop-in upgrades.


The aforementioned MSI Pro B660-A and B660 MAG Mortar both have excellent VRMs, according to Hardware Unboxed. I've owned both. On the AMD side I also own what is by today's standards an extremely cheap (~$95) B450 Tomahawk Max, which was likewise well reviewed by HUB.

Best Budget Intel B660 Motherboards, VRM Thermal Test (www.techspot.com): "Our first look at VRM thermal performance of entry-level Intel B660 motherboards. On hand we have boards from Asus, Gigabyte, MSI, Asrock, Soyo and Maxsun..."

Intel B660 Motherboard VRM, Mid-Range Roundup (www.techspot.com): "In this roundup we have 8 new Intel B660 motherboards that go from $150 to $250. Spoiler alert: We're looking at a hefty 70% difference between..."
I can't vouch for B650 or other AM5 boards, and as a matter of principle I agree that one should be wary of cheaper boards, but if you can find a trustworthy and favorable review of the product in question, I don't see much in the way of incentive not to go for the "B" chipsets rather than their enthusiast equivalents.  In fact, it seems in many cases a cheap Z or X motherboard can be worse (in terms of VRM or even features) than a mid-to-high-range B motherboard.

Of course, the funny part is that "cheap" boards today would have been considered expensive even 3-4 years ago.  I've been building PCs for about 25 years now, and until the Zen 2 build I don't think I ever bought a mobo for more than about $70.  Nowadays it's considered "budget" if you're not spending $200.


----------



## Tek-Check (Nov 16, 2022)

Wasteland said:


> The aforementioned MSI Pro B660-A and B660 MAG Mortar both have excellent VRM, according to Hardware Unboxed.  I've owned both.  On the AMD side I also own what is by today's standards an extremely cheap (~$95) B450 Tomahawk Max, which was likewise well reviewed by HUB.
> 
> 
> 
> ...


True that, but boards have also come a long way. They are much more advanced compared to previous-gen boards.

That said, I agree most new boards are overpriced, some ridiculously so, just like the 4080 GPU. A terrible value for money.


----------



## Kei (Dec 3, 2022)

Late to the party, but this was a fantastic test. I wish I could've seen the 1% lows more than anything... but nonetheless this is fantastic work. I say this especially as an owner of a 4090 paired with a vanilla 5800X that I use heavily for VR, so I'm always looking at where the best gain vs. cost might be.


----------



## vekspec (Dec 4, 2022)

Just swapped to a 5800X3D from a 5900X as I game more than do productivity on it (moved to an M1 Mac for that), and I'm seeing 20+ fps gains in games at 1440p with a 3080. MMOs are seeing the largest increase, along with FH5 and other racing sims. Really impressive chip if you're gaming focused, and it's up there with the current gens!


----------

