# RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K



## W1zzard (Oct 18, 2022)

We test the NVIDIA GeForce RTX 4090 with 53 games at three resolutions, comparing the AMD Ryzen 7 5800X against the Intel Core i9-12900K. The idea here is to get a feel for how much graphics performance is lost by a weaker processor.

*Show full review*


----------



## TheEndIsNear (Oct 18, 2022)

Pretty soon we'll have to buy video cards depending on which engines or games you like if you don't already.  I don't.


----------



## clopezi (Oct 18, 2022)

Thanks a lot for your work @W1zzard . It's very interesting.

Now I'm waiting for the 7950X vs 13900K, hehe


----------



## P4-630 (Oct 18, 2022)

TheEndIsNear said:


> Pretty soon we'll have to buy video cards depending on which engines or games you like if you don't already.  I don't.


The End Is Near my man.....


----------



## defaultluser (Oct 18, 2022)

TheEndIsNear said:


> Pretty soon we'll have to buy video cards depending on which engines or games you like if you don't already.  I don't.




we already do that - MOST HIGH-END GAMERS HAVE A LIMITED LIST OF GAME TYPES THEY WILL EVER PLAY


----------



## Selaya (Oct 18, 2022)

no 5800x3d?
is it on the menu (for later) at the very least?


----------



## GreiverBlade (Oct 18, 2022)

Selaya said:


> no 5800x3d?
> is it on the menu (for later) at the very least?


i do hope so too ...

that comparison was a bit unfair

Intel needed 2 gens of hybrid to "overtake" meanwhile AMD added 3DV cache and got back on top 

(i jest i jest ... well, half jest  )

One good thing with the newer gens: the 5800X is now as cheap as the 5700X was at launch (299 CHF), the 5700X is nearing sub-250 CHF, and the 5800X3D is, soon (tm), going to be below 400 CHF if we're lucky.


----------



## RandallFlagg (Oct 18, 2022)

I expected a difference, but not this much.

Looks to me like when the next gen of GPUs comes out around 2024, everything will be CPU-limited at sub-4K resolutions.

It's taken a long time, and mainstream GPUs may still need a little longer, but it seems the era of mainstream 4K gaming has finally arrived.


----------



## seventy (Oct 18, 2022)

Extensive testing, but the data is not presented well. It would have been nice to at least see the FPS, or even frametime graphs, instead.
Questionable RAM choice for the 5800X: the Infinity Fabric is clocked high, but those timings (4000 MT/s 20-23-23-42 1T) most likely sandbagged the 5800X.
I wouldn't be surprised if good B-die at even 3600 MT/s (CL14, etc.) scored much better (10-20%+), because in my personal testing Zen 2 and Zen 3 mostly scale with timings rather than raw frequency.
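The timings-versus-frequency point can be made concrete with first-word-latency arithmetic. This is an illustrative sketch only: the 3600 CL14 kit below is a hypothetical comparison point, and real gaming performance depends on far more than CAS latency.

```python
# First-word (CAS) latency of a DDR kit in nanoseconds:
#   cycle time (ns) = 2000 / data_rate (MT/s), so latency_ns = 2000 * CL / data_rate
def cas_latency_ns(data_rate_mts: int, cl: int) -> float:
    """Approximate first-word latency from the data rate (MT/s) and CAS latency (clocks)."""
    return 2000 * cl / data_rate_mts

# The DDR4-4000 CL20 kit used in the review vs. a tuned 3600 CL14 B-die kit:
print(round(cas_latency_ns(4000, 20), 2))  # 10.0
print(round(cas_latency_ns(3600, 14), 2))  # 7.78
```

By this crude measure the review's 4000 CL20 kit has noticeably worse first-word latency than a common tuned B-die kit, which is consistent with the scaling claim above.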


----------



## TheEndIsNear (Oct 18, 2022)

defaultluser said:


> we already do that - MOST HIGH-END GAMERS HAVE A LIMITED LIST OF GAME TYPES THEY WILL EVER PLAY


I didn't know, I'm sorry. It just makes me a little sad. I could afford two machines, but that's dumb... or maybe not??? Ugh, you just made me want to spend money lol


----------



## rv8000 (Oct 18, 2022)

So now you’ll test a 7950x/5800x3D vs. 13400f right?

Not that this article is in bad faith, but low- and high-end CPUs from both manufacturers should have been tested.


----------



## Space Lynx (Oct 18, 2022)

Hope we can get a comparison just like this one with the 13900K and Ryzen 7700X. I think that would be interesting to see.


----------



## RH92 (Oct 18, 2022)

Thanks for the update W1zzard, I believe it's pretty clear that TPU needs to update its testing platform; a 5800X3D or 12900K should be the bare minimum. Keen to see if the margins increase with Raptor Lake.


----------



## maverik-sg1 (Oct 18, 2022)

Looks like you tested 16 GB of DDR4-4000 with loose timings against 32 GB of DDR5. The test would still show an advantage to Intel; however, I believe the gap would close significantly.


----------



## Space Lynx (Oct 18, 2022)

RH92 said:


> Thanks for the update W1zzard, I believe it's pretty clear that TPU needs to update its testing platform; a 5800X3D or 12900K should be the bare minimum. Keen to see if the margins increase with Raptor Lake.



I actually like that W1zz uses this, because so many other sites are already doing 5800X3D comparisons; we have those in droves. It's nice to see this instead.

Which is why I'd like to see a 7700X and 13700K or 13900K comparison next.



maverik-sg1 said:


> Looks like you tested 16 GB of DDR4-4000 with loose timings against 32 GB of DDR5. The test would still show an advantage to Intel; however, I believe the gap would close significantly.



AMD does seem to benefit some from dual-rank RAM, so good spotting on this!


----------



## seventy (Oct 18, 2022)

maverik-sg1 said:


> Looks like you tested 16 GB of DDR4-4000 with loose timings against 32 GB of DDR5. The test would still show an advantage to Intel; however, I believe the gap would close significantly.


Yeah, dual-rank b-die would be much closer.


----------



## mb194dc (Oct 18, 2022)

People will spend $4k+ on a machine and then game at anything below 4K?


----------



## noel_fs (Oct 18, 2022)

If you are gonna use the 12900K for Intel, shouldn't you use the 5800X3D for AMD?


----------



## W1zzard (Oct 18, 2022)

Selaya said:


> no 5800x3d?
> is it on the menu (for later) at the very least?


Soon


----------



## HD64G (Oct 18, 2022)

Nice test @W1zzard! How about posting some 4090 power-draw results from the testing with the 12900K? More performance means higher utilisation, which should increase the power consumption too, methinks.


----------



## W1zzard (Oct 18, 2022)

seventy said:


> much


Should I test 5800X3D or faster RAM speed?


----------



## HD64G (Oct 18, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


5800X3D for sure me thinks


----------



## Xuper (Oct 18, 2022)

If you want to test the 5800X3D, please use DDR4-3733 or 3200 CL14. Thanks.


----------



## RandallFlagg (Oct 18, 2022)

seventy said:


> Yeah, dual-rank b-die would be much closer.



It's an odd setup.

The plus is running the IF at 2000 in 1:1 mode.

I think TPU's bench is normally run at 1800 1:1, so the loose timings were likely necessary to reach 4000 with the IF at 2000 1:1 (this is rare on Zen 3).

I would bet that in some cases the 4000 C20 wins thanks to the higher IF speed (2000 vs 1800 1:1). Eye-candy games like Cyberpunk and Assassin's Creed will probably prefer 4000, while what I call 'twitch' e-sports titles will prefer the lower latency.

I personally would rather have seen the standard bench setup, though, with more CPUs tested, even if the set of games were smaller.

The standard test bench at TPU IMO represents what most DIY enthusiasts will build (DDR5-6000 C30/C36, DDR4-3600 C14 or C16). Every other site seems to have some implausible config that makes their data unrepresentative.

This setup, unfortunately, kinda fits in that second category.
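For readers unfamiliar with the 1:1 ("coupled") mode being discussed: on Zen 2/3, the Infinity Fabric clock (FCLK) must match the memory controller clock, which is half the DDR data rate. A minimal sketch of that rule follows; the ~1900 MHz FCLK ceiling used below is a commonly cited figure for typical Zen 3 samples, not an official spec.

```python
# In 1:1 mode, FCLK == MEMCLK, and MEMCLK is half the DDR data rate.
# A CPU sample's attainable FCLK ceiling therefore caps the data rate
# it can run coupled.
def fclk_needed(ddr_data_rate: int) -> int:
    """FCLK (MHz) required to run a DDR kit in 1:1 mode."""
    return ddr_data_rate // 2

def runs_coupled(ddr_data_rate: int, fclk_cap_mhz: int) -> bool:
    """True if the kit can run 1:1 given this sample's FCLK ceiling."""
    return fclk_needed(ddr_data_rate) <= fclk_cap_mhz

print(fclk_needed(4000))         # 2000
print(runs_coupled(3600, 1900))  # True: DDR4-3600 only needs FCLK 1800
print(runs_coupled(4000, 1900))  # False: needs a rare FCLK-2000-capable sample
```

This is why DDR4-4000 1:1 on Zen 3 is considered a silicon-lottery result, while DDR4-3600 1:1 is routine.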



W1zzard said:


> Should I test 5800X3D or faster RAM speed?



I'd prefer you use the standard bench setup, regardless of the CPUs tested. I don't have a basis for comparison with DDR4-4000 C20.


----------



## Rowsol (Oct 18, 2022)

I hope you'll start including 1% lows, as I feel they're more important than average FPS.


----------



## JorgeRod (Oct 18, 2022)

Comparing an i9-12900K priced at €699.90 with a €339.90 Ryzen 7 5800X?? It should maybe be a Ryzen 9 5950X (€614.90).


----------



## kapone32 (Oct 18, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


Faster RAM speeds with tighter timings, but nothing the average power user could not replicate.


----------



## RandallFlagg (Oct 18, 2022)

JorgeRod said:


> Comparing a i9-12900k priced at 699,90€ with a 339,90€ ryzen7 5800x?? Should be a Ryzen 9 5950X (614,90€) maybe.



I think their point, alluded to in the verbiage, was to compare their test-bench system (5800X) to the alternative test bench.

It's not really a CPU comparison but "what would it look like if we used this instead to benchmark GPUs?"

So now we know Zen 3 basically isn't fast enough for the fastest GPUs.

I'm going to hazard a guess that no current CPU can keep up with a 4090 at 1080p, though.


----------



## sector-z (Oct 18, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


Both. RAM makes such a difference on the AMD platform that the 5800X was really handicapped by that very poor kit. Take a fair B-die 4000 MT/s kit at CAS 14 or 15, not 20, in a 1:1 ratio.

13900K (not BIOS-restricted)
7950X (with PBO)
5800X3D

You can add the 5800X with the new RAM as a bonus.

That's the battle people want to see, with at least a good 420 mm AIO or a custom loop; people don't want temperature-throttled CPUs but what they can offer at stock, unrestricted.


----------



## seventy (Oct 18, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


The reason the 5800X3D is generally considered a much faster processor is that it completely offsets the suboptimal RAM the processor is usually tested with.
It's common knowledge among overclockers that even with tightly overclocked B-die, 5800X3D performance doesn't improve much.
So I would vote for faster RAM, as it would be more useful for the average person to know how much impact it actually has on Zen.



sector-z said:


> Both. RAM makes such a difference on the AMD platform that the 5800X was really handicapped by that very poor kit. Take a fair B-die 4000 MT/s kit at CAS 14 or 15, not 20, in a 1:1 ratio.
> 
> 13900K(Not bios restricted)
> 7950X (With PBO)
> ...


B-die at 4000 MT/s with 14-14-... timings is basically the best DDR4 you could get for Zen.
The "cheap" dual-rank 3200 CL14 kits people usually buy would not clock that high without extensive babysitting, and off-the-shelf kits are too expensive. Something like tight 3600 CL14 or 3800 CL14 would be more reasonable, IMO.


----------



## RandallFlagg (Oct 18, 2022)

sector-z said:


> Both. RAM makes such a difference on the AMD platform that the 5800X was really handicapped by that very poor kit. Take a fair B-die 4000 MT/s kit at CAS 14 or 15, not 20, in a 1:1 ratio.
> 
> 13900K(Not bios restricted)
> 7950X (With PBO)
> ...



Then why not use DDR5-7200?

I mean, you can't even buy 32 GB of DDR4-4000 C14. You then have to OC past XMP settings, and you're playing the silicon lottery at that point.

DDR4-4000 C16 is the best you can do on XMP, and only about 5% of Zen 3 chips will run IF 2000 1:1.

Why not just use the standard test bench, which represents what probably 80%+ of people can actually achieve (DDR5-6000 and DDR4-3600 C14)?


----------



## qubit (Oct 18, 2022)

@W1zzard 53 games: yer defo a masochist!   Look forward to reading it.

It's my contention that buying a powerful card can sometimes make sense even when pairing it with a weak processor, with the classic "devil's in the details" making or breaking that case.


----------



## W1zzard (Oct 18, 2022)

RandallFlagg said:


> DDR5-7200


Actually DDR5-7400 results coming from me this week


----------



## RandallFlagg (Oct 18, 2022)

W1zzard said:


> Actually DDR5-7400 results coming from me this week



Zen 4 and Raptor Lake?  

A lot of these X670 and Z790 motherboards are touting the ability to run that kind of speed now.


----------



## TheinsanegamerN (Oct 18, 2022)

So.....WTF happened in DMC 5? Most forward looking game engine?


----------



## bobmeix (Oct 18, 2022)

On the conclusion page, did you mean performance gains instead of 6.4-6.6% performance games?


----------



## Haile Selassie (Oct 18, 2022)

@W1zzard, can you check your Cinebench score with the IF/RAM at 1:1 @ 4000 (FCLK 2000)? My system starts acting up over 1900 MHz IF. It's Prime stable and Memtest86 stable, yet the performance degrades severely; like 60% of the Cinebench score you get at 1900 IF.


----------



## jinxjx (Oct 18, 2022)

LOL, 16 GB vs 32 GB of RAM, and not even their best CPU! Why does this place hate AMD so much?!


----------



## W1zzard (Oct 18, 2022)

Haile Selassie said:


> @W1zzard, can you check your Cinebench score with IFRAM 1:1 @ 4000 (2000)? My system starts acting up over 1900MHz IF. It's Prime stable, Memtest86 stable, however the performance degrades severely. Like, 60% Cinebench score of what you get at 1900IF.


I have reviewed over 100 graphics cards on this config; it is stable and trouble-free, or I wouldn't use it. Can't risk having wrong scores in my reviews. Retail CPU, btw.


----------



## Haile Selassie (Oct 18, 2022)

@W1zzard - Totally understood. I had this setup for 3 months not knowing what was wrong.
If you have the time, just do a sanity check via Cinebench. Like I said, mine exhibited total system stability.
I used DDR4-4000 C16-16-16 (high-bin Samsung B-die) and later DDR4-4600 18-22-22-44-76 tight-as-it-goes Micron (69-70 GB/s on a 5900X, 58 ns).


----------



## Neizel (Oct 18, 2022)

sector-z said:


> Both. RAM makes such a difference on the AMD platform that the 5800X was really handicapped by that very poor kit. Take a fair B-die 4000 MT/s kit at CAS 14 or 15, not 20, in a 1:1 ratio.



4000 MT/s B-die at CL15 is on the tight side of the spectrum and means playing with 1.5 V+; CL14 is 1.6 V+ for sure, so I wouldn't advise testing that as a reliable source of performance for every reader.

The kit used in these tests is definitely holding the 5800X back. A 3600-3800 kit is very easy to run, even for people new to PCs.



W1zzard said:


> Should I test 5800X3D or faster RAM speed?


I would go 5800X3D if the RAM kit stays the same.

I would go with at least a 3600 CL14 XMP kit if you're going to add or keep testing the 5800X. The sweet spot would be 3800 MT/s, but 3600 is what the vast majority of people use when they aren't into tweaking the BIOS.


----------



## Silvinjo (Oct 18, 2022)

Haile Selassie said:


> @W1zzard, can you check your Cinebench score with IFRAM 1:1 @ 4000 (2000)? My system starts acting up over 1900MHz IF. It's Prime stable, Memtest86 stable, however the performance degrades severely. Like, 60% Cinebench score of what you get at 1900IF.


For Zen 3 you don't want to go over 3800 MT/s. If you have an APU, yes, but on a "standard" CPU, no, since you lose performance (as stated by AMD themselves). I've tested this myself with my E-die kit and a 5600X: I can run 4000 CL19-20-20 (2000 IF) 24/7 stable, but the FPS in games was dogshit compared to the 3733 CL14-8-18-13 I have right now. Since I play games at 1080p and low settings, I'm considering going lower in frequency (3600 MT/s) and tightening the timings even more, since that would probably gain ~5 more FPS, but currently I'm a bit lazy.


----------



## Haile Selassie (Oct 18, 2022)

Silvinjo said:


> For Zen 3 you don't want to go over 3800 MT/s. If you have an APU, yes, but on a "standard" CPU, no, since you lose performance (as stated by AMD themselves). I've tested this myself with my E-die kit and a 5600X: I can run 4000 CL19-20-20 (2000 IF) 24/7 stable, but the FPS in games was dogshit compared to the 3733 CL14-8-18-13 I have right now. Since I play games at 1080p and low settings, I'm considering going lower in frequency (3600 MT/s) and tightening the timings even more, since that would probably gain ~5 more FPS, but currently I'm a bit lazy.


Yes, IF 2000 1:1 tanked performance compared to IF 1900 1:1 with the same DRAM settings, and the system was stable. This is what I have observed on my 5900X.


----------



## erocker (Oct 18, 2022)

AMD should have already incorporated X3D into the 7000 series.


----------



## RandallFlagg (Oct 18, 2022)

People need to stop with this nonsense of running DDR4-4000 C14 or some such. That's all binned memory and CPUs on $500 motherboards; for each part you have a <5% chance of winning the silicon lottery. That means something like 5% of 5% of 5% to score on all three counts.

If you really want to see that, here's a video for you.

The short version is, it's a 4-way tie on averages, with the 12900K having better 1% lows. All three of these chips can max out a 3090 Ti in most titles.

Note: the '4th' is the 12900K with max-OC DDR4 vs DDR5.


----------



## sephiroth117 (Oct 18, 2022)

Tbh, one of those two CPUs peaks near 230 W.
I wasn't expecting that big of a difference. I wonder how a 7600X or a 5800X3D would fare, though.


----------



## wheresmycar (Oct 18, 2022)

53 games!!..... W1zzard, TPU members owe you a holiday and then some!


----------



## Silvinjo (Oct 18, 2022)

Haile Selassie said:


> Yes, IF 2000 1:1 tanked performance compared to IF 1900 1:1 with the same DRAM settings, and the system was stable. This is what I have observed on my 5900X.


I'm more shocked that this was posted live to the world (and that 53 games were tested with that bad RAM), and that someone with so many systems didn't realize that 4000 MT/s is a no-go for Zen 3, especially with those really high timings. Just to be clear, I don't have a $500 board (I'm on a B550 Pro4, a $100 board), and I have a 2x8 GB Ballistix 3600 CL16 (E-die) kit that I picked up for a bit more than €50 on amazon.de. Actually, this is ONE of the reasons I don't like TechPowerUp, or at least their "testers".


----------



## kapone32 (Oct 18, 2022)

Silvinjo said:


> I'm more shocked that this was posted live to the world (and that 53 games were tested with that bad RAM), and that someone with so many systems didn't realize that 4000 MT/s is a no-go for Zen 3, especially with those really high timings. Just to be clear, I don't have a $500 board (I'm on a B550 Pro4, a $100 board), and I have a 2x8 GB Ballistix 3600 CL16 (E-die) kit that I picked up for a bit more than €50 on amazon.de. Actually, this is ONE of the reasons I don't like TechPowerUp, or at least their "testers".


Your RAM kit is killer for AM4, congrats


----------



## Silvinjo (Oct 18, 2022)

kapone32 said:


> You RAM kit is killer for AM4 congrats


Kind of. I went from 3000 CL16 and saw this cheap, so I decided to buy it. At first I didn't really notice a difference between 3000 CL16 and 3600 CL16. Not long ago, though, I decided to spend a week clocking and testing the RAM, and saw decent gains at 3733 in both averages and especially the 0.1% lows. With 4000 MT/s the only thing I achieved was better latency in AIDA64 (it was about 2 ns lower than the 3733 setup where I'm getting 55.5 right now), but games didn't like it. I'm only bummed about RD being bad on E-die; I could set it to 17 (from 18) but would probably need another 0.5 V on the DIMMs, so I decided to settle with this for now. In the near future I'm definitely aiming for something like 3600 CL13-7-17-11 to get the maximum possible FPS and the best possible 0.1% lows.


----------



## nicamarvin (Oct 18, 2022)

rv8000 said:


> So now you’ll test a 7950x/5800x3D vs. 13400f right?
> 
> Not that this article is in bad faith, but low/high end CPUs from either manf. should have been tested.


It's completely in bad faith. Wizzard is anything but an unbiased reviewer.


----------



## jallenlabs (Oct 18, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


No 5800X3D? Why even bother with the 5800X? Seems like a huge time suck for no reason...


----------



## chowow (Oct 18, 2022)

Thank you, amazing work; that's a lot of testing. It helps a lot. I'm on a 5600, and I think a 3080 will be good enough for me.


----------



## tajoh111 (Oct 18, 2022)

nicamarvin said:


> It's completely in Bad Faith.. Wizzard is anything but a unbiased reviewer.



You can tell the biggest reason for this comparison is the criticism pointed at the RTX 4090 review, where it was argued that the original test platform with the 5800X was holding back the RTX 4090.

This addresses that criticism by actually testing it, rather than simply asserting that the 5800X is a good enough platform for such a high-end card, particularly at 4K.

And this article validates that criticism. It is humble of W1zzard to publish this article.

Not all articles are ads for AMD, where AMD needs to be shown in the best light possible.


----------



## BNSoul (Oct 18, 2022)

Superb article; the author deserves all the praise in the world for the data collected and provided. That said, I still believe a 5800X3D instead of a regular 5800X would have been much more interesting to benchmark, in order to assess the extent to which the "3D stacked L3 cache" technology can improve performance with state-of-the-art GPUs.


----------



## lord_emperor (Oct 18, 2022)

Can you justify benching a $500 Intel CPU vs a $270 AMD CPU (current Amazon.com prices)?

Why not the 5800X3D which is still only $400?


----------



## Silvinjo (Oct 18, 2022)

jallenlabs said:


> no 5800x3d?  why even bother with the 5800x.  Seems like a huge time suck for no reason...


Huge time was sucked just because bad RAM was used; he'd be better off with 3200 CL16, which would give better results than 4000 CL20. But I do agree that the 5800X3D should be used, with at least E-die RAM (so at least 3600 CL16, and nothing above 3800 CL16). The 5800X3D came out to answer Intel's 12th gen; the 5800X came out about two years ago and should be compared to the 10900K.


----------



## Privater (Oct 18, 2022)

Am I the only one astonished that the 5800X runs a 1:1 IF with DDR4-4000?

Any Intel 12th-gen chip can easily run XMP at DDR5-6000 or higher (6800). But pushing a Zen 3 to a 2000 MHz Infinity Fabric for 24/7 daily use is not easy.


----------



## RandallFlagg (Oct 18, 2022)

BNSoul said:


> Superb article, the author deserves all the praise in the world for the collected and provided data. That said, I still believe a 5800X3D instead of a regular 5800X would have been much more interesting to benchmark in order to assess the extent as to which the "3D stacked L3 cache" technology can improve performance with regard to state-of-the-art GPUs.



Actually, the only way you'd know that is if they did a 3-way comparison. Your baseline would be the 5800X, not the 12900K.

This is the first review I've seen that shows conclusively that the CPU is becoming the prime limiting factor for new GPUs at anything below 4K, and the effect is not small.

The funny thing is, this review, such as it is, pretty much means that every 4090 review in existence is probably invalid as far as representing that GPU's performance.


----------



## ModEl4 (Oct 18, 2022)

@W1zzard hi, are you going to change the GPU in the CPU-review test bed as well (RTX 3080 -> RTX 4090, for example), or are you waiting for more mainstream next-gen models to become available (the RTX 4080 or upcoming RDNA3 ones) so that the CPU results are more representative of the vast majority of users (not those buying $1,600+ GPUs)?


----------



## Megas (Oct 18, 2022)

jallenlabs said:


> no 5800x3d?  why even bother with the 5800x.  Seems like a huge time suck for no reason...


Nothing better to do? Intel's last-gen high end vs. AMD's mid-high end, made obsolete by the 5800X3D. Next month: Intel 13900K vs. AMD 3800X!


----------



## nicamarvin (Oct 18, 2022)

Megas said:


> Nothing better to do? Next month Intel 13900K vs. AMD 3800X!


Wizzard will never paint AMD CPUs in good light..!


----------



## odellus (Oct 18, 2022)

Why are you still testing Borderlands 3 with DX11 when DX12 has been the more performant renderer almost since the game came out?


----------



## mtb scotland (Oct 18, 2022)

What is with that RAM for the AMD system?


----------



## Silvinjo (Oct 18, 2022)

RandallFlagg said:


> It's an odd setup.
> 
> The plus is running IF at 2000 1:1
> 
> ...


If 2000 1:1 wins over 1800 1:1 in gaming (no matter which game), then something is wrong, especially when it's 4000 CL20-23-23. The only AMD CPUs that profit from higher frequency are APUs. Also, latency on AMD is a bit weird. I've got E-die 3600 CL16; I can do 4000 (1:1) with much better timings than on this system, and my latency in AIDA64 was around 53 ns, while with a much tighter 3733 CL14-8-17- setup I'm getting 55.5, but the difference in games was night and day. Someone aiming for maximum FPS should consider running 3600/3733 MT/s with timings as tight as possible.


----------



## W1zzard (Oct 18, 2022)

nicamarvin said:


> Wizzard will never paint AMD CPUs in good light..!


Why do you come to my house and shit on me?

Edit: maybe post something nice next time, goodbye


----------



## vMax65 (Oct 18, 2022)

Well done on taking the time to do this massive test; it is more than appreciated. I think it needs to be stated that this is not an Intel vs. AMD, "mine is better than yours" test, but rather a bottleneck test showcasing what a GPU like the RTX 4090 is going to need if you are looking to extract the best out of it...


----------



## R0H1T (Oct 18, 2022)

tajoh111 said:


> You can tell the biggest reason for this comparison is a result of criticism pointed towards the RTX 4090 review where it was criticized that the original test platform with the 5800x was holding back the RTX 4090.
> 
> This is addressing that criticism and testing it rather than simply saying the 5800x is a good enough platform for such a high end card, particularly at 4k.
> 
> ...


Not everyone who buys the fastest single GPU out there also buys the fastest ST/gaming CPU! In fact, less than 0.000001% of people will do that. So while you could argue that the 5800X may be holding the 4090 back at lower resolutions, when you're spending 1.6 grand on a freaking brick, are you also going to spend another grand, or $400 on a 5800X3D, on the rest of the system? I guess everyone here drives a *Koenigsegg*, or they're looking forward to inheriting Warren Buffett's millions.


----------



## jinxjx (Oct 18, 2022)

nicamarvin said:


> Wizzard will never paint AMD CPUs in good light..!


That's a true statement, from what I've been seeing and reading.


----------



## SentinelAeon (Oct 18, 2022)

From NVIDIA's point of view, wouldn't it have made more sense to present the RTX 4090 after both Zen 4 and Raptor Lake were out, and to say outright that any test on older gear will badly bottleneck the GPU? There was no hurry, since Raptor Lake will be out long before Navi 3, and I read that they have to clear Ampere stock anyway. I was just wondering if I'm missing something.

edit: Trying to be completely objective here. I don't prefer any brand; I always buy the best second-hand bang for the buck, so they make no profit from me whatsoever. But it got me thinking whether they should have waited another 2-3 weeks.


----------



## DemonicRyzen666 (Oct 18, 2022)

Is the 12900K here running with E-cores enabled in these games? I just want clarification: 24T vs 16T?


----------



## Icon Charlie (Oct 18, 2022)

Xuper said:


> If you want to test the 5800X3D, please use DDR4-3733 or 3200 CL14. Thanks.


Since I've been playing around with my RAM speeds, DDR4-3600 is the better choice over DDR4-3733, as most people took advantage of the price-versus-performance gap between the two.

CL14, of course, does make a difference. 3200 CL14 is almost equal (or equal) to 3600 CL14, is what I found in my testing.

I actually downclocked/downvolted my PC from 4000 CL15-15-15-36 to 3600 CL14-14-14-34. 7.8 ns vs 7.5 ns is virtually a tie, and I'm running less voltage on my RAM; 1.5 V vs 1.4 V is a no-brainer.

Everything counts in my case. I've posted this before, but it's worthwhile information.

So I suggest 3600 and its flavors. BTW, I'm using the AM4 platform for my results.



jinxjx said:


> Thats a true statement from what i been seeing and reading


Not really. A lot of people, including myself, have supported AMD for years. When a company changes its culture and forgets who got it there, as well as (IMHO) putting out components priced out of reach of the average person, you VOTE with your wallet.

Also, a few of us old-timers who have been in the industry, myself for 34 years, are sick and tired of the excuses given by influencers and apologists for why product "X" costs 60% to 100%+ more while at times delivering less than a 30% performance increase.

I have stated this before and stand by this comment, as it is directed not only at AMD but at Ngreedia and the rest of the tech industry.

*When performance comes at the cost of excessive wattage and/or heat, THERE IS NO PERFORMANCE AT ALL.

Oh, and thank you W1zzard for the results.*


----------



## zero989z (Oct 18, 2022)

Even my tiny laptop SO-DIMMs (2x16 GB CJR) can do 4000 MT/s with better timings than that.

It's going to affect those results significantly, by more than 10%.


----------



## mechtech (Oct 18, 2022)

53 games!!!!!!!

And not one DX9!!!   

"After hundreds of individual benchmarks, we present you with a fair idea of just how much the GeForce RTX 4090 "Ada" graphics card is bottlenecked by the CPU."

A question for the software/hardware experts out there.....

When a game is CPU-bound, how much of that 'boundness' would be due to:
-IPC
-Frequency
-Cache
-Architecture/IMC/ram speed/BIOS/OS (branch prediction, HPET, other OS oddities, etc.)
-Other (API/optimization, instruction sets, etc., misc., other?)

or basically impossible to say?


----------



## catulitechup (Oct 18, 2022)

W1zzard said:


> Actually DDR5-7400 results coming from me this week





RandallFlagg said:


> Zen 4 and Raptor Lake?
> 
> A lot of these X670 and Z790 motherboards are touting the ability to run that kind of speed now.



Hi @W1zzard, can you confirm whether AMD works well with memory up to 6000 MT/s? Asking because the Infinity Fabric on Zen 4 stays at 3000 MHz, right?

Other information suggests that DDR5 memory above 6400 MT/s has trouble working well on many motherboards, apparently due to the number of layers in the motherboard PCB (various people say these high-frequency kits need boards with 8 or more layers, and boards with 2 oz of copper compared with the regular copper on most motherboards).

Any confirmation on the above would be appreciated.

Thanks


----------



## Jism (Oct 18, 2022)

The X3D lacks good clocks; that's why the Intel chip is faster overall. But that same Intel chip consumes roughly double the power to accomplish it.


----------



## catulitechup (Oct 18, 2022)

mechtech said:


> 53 games!!!!!!!
> 
> _*And not one DX9!!! *_
> 
> ...



Yeah this will be good to add


----------



## Lightofhonor (Oct 18, 2022)

Jism said:


> X3D lacks good clocks. Thats why the intel overall is faster. But the same Intel consumes roughly double the power to accomplish it.


This test is normal X, not X3D.


----------



## zero989z (Oct 18, 2022)

mechtech said:


> 53 games!!!!!!!
> 
> And not one DX9!!!
> 
> ...


Poor threading, ultimately. Remember when Microsoft was teasing DX12? "Closer to the metal"? How 2.3 GHz with multiple cores would be just fine? Yet years later it's still a GHz battle. This is closely tied to the OS as well, of course.


----------



## Fluffmeister (Oct 18, 2022)

Ouch, pretty brutal for the red team; help me, Obi-Wan 5800X3D-nobi, you're my only hope. Still, for the price the 5800X does great... averaged out over all titles.


----------



## zlobby (Oct 18, 2022)

Selaya said:


> no 5800x3d?
> is it on the menu (for later) at the very least?


Yeah, we all know that the X3D version is the 'real' gaming CPU from AMD.



HD64G said:


> 5800X3D for sure me thinks


Why not both?


----------



## Cippo95 (Oct 19, 2022)

In the Founders Edition review it was said that the RTX 4090 was 45% faster than the 3090 Ti at 4K; does this new data increase that gap?


----------



## konga (Oct 19, 2022)

@W1zzard I know it's not very common for CPU comparisons, but I wonder if you could test a few games with ray-tracing on as well. I find RT to be a fairly CPU-intensive task, even at high-resolution, and the little CPU comparison data with RT enabled we do have from outlets like Eurogamer and Hardware Unboxed back this up. It's a real-world test case that's relevant to many people and might be useful to include in these comparisons as well. Cyberpunk 2077, Hitman 3, and Spider-Man Remastered with high object draw distance are three games that come to mind for really hammering the CPU in certain scenarios with RT on.


----------



## evernessince (Oct 19, 2022)

RandallFlagg said:


> Then why not use DDR5-7200.
> 
> I mean you can't even buy 32GB of DDR4-4000 C14.  You then have to OC it past XMP settings, you're playing silicon lotto at this point.
> 
> ...



To be fair, hardly anyone is running a 4090 in general right now.  The price precludes it from being in the majority of people's hands.

I do agree regarding memory.  OC'd memory should not be used unless the piece is specifically about memory scaling, or it's a section in a product review covering memory OC.

That said, this comparison should never have been done with such a discrepancy in memory or CPU tier.  If the intent is to show how much performance reasonable high-end systems can squeeze out of the 4090, then the comparison should have been the 5800X vs the 12700K, both with DDR4 memory.  If the intent is to show how much the fastest from Intel and AMD can squeeze out of the 4090, the comparison should be the 5800X3D vs 7950X vs 12900K, all with the fastest RAM that'll run reliably.  For DDR5 that's 6000 CL30, and for DDR4, 3600 CL14.



tajoh111 said:


> You can tell the biggest reason for this comparison is a result of criticism pointed towards the RTX 4090 review where it was criticized that the original test platform with the 5800x was holding back the RTX 4090.
> 
> This is addressing that criticism and testing it rather than simply saying the 5800x is a good enough platform for such a high end card, particularly at 4k.
> 
> ...



Yes, different reviewers have different approaches that may impact the results, for example OS and RAM choice, as well as how data is gathered.  It's important to look at many reviews and examine the results.  If one review obtained abnormal results, the objective thing would be to review the methodology and look for ways to improve, assuming of course those differences are not the result of a difference in approach.  Some reviewers, for example, prefer to appeal to average customers by using slower RAM, and others like including 720p results for certain tests.


----------



## Vipeax (Oct 19, 2022)

Would be great to see the actual fps. I don't really care about going from 200 to 220, but I would care about, say, 70 to 77.


----------



## RandallFlagg (Oct 19, 2022)

evernessince said:


> To be fair hardly anyone is running a 4090 in general right now.  The price precludes it from being in the majority of people's hands.
> 
> I do agree in regards to memory.  OC memory should not be used unless the piece is specifically about memory scaling / is a section in a product review regarding memory OC.
> 
> That said, this comparison should have never been done with such a discrepancy in memory or CPU tier.  If the intent is to show how much performance reasonable high-end systems can squeeze out of the 4090 then the comparison should have been the 5800X vs the 12700K, both with DDR4 memory.  If the intent is to show how much the fastest from Intel and AMD can squeeze out of the 4090 the comparison should be the 5800X3D vs 7950X vs 12900K all with the fastest RAM that'll run reliably.  For DDR5 that's DDR5 6000 CL30 and for DDR4 3600 CL14.



It wasn't any of those intents from what I read; it was to see what changing the GPU bench test setup TPU uses from a 5800X (the current standard) to a 12900K (one of the contenders at the time they went with the 5800X) would reveal.  It's in the verbiage of the article.

I just think they should have stayed with their normal DDR4-3600 setup.


----------



## HM_Actua1 (Oct 19, 2022)

Conclusion, AMD fan boy hearts be broken.


----------



## Pepamami (Oct 19, 2022)

Hitman_Actual said:


> Conclusion, AMD fan boy hearts be broken.


How so? The 5800X costs less than the 12900K and eats way less energy; the 5700X has almost the same performance and runs even cooler.
The 5800X was tested with single-rank memory instead of dual-rank, while the 12900K used DDR5 memory.
That's not even mentioning that the 5800X has a 5800X3D version, and that these are different gens.

This test shows people that right now you don't need a super-duper CPU (unless you own the $2000 GPU), and you will be fine with a basic 5600X/5700X/5900X or 12400F/12600K/12700K. It has nothing to do with "Intel vs AMD".


----------



## Tek-Check (Oct 19, 2022)

Right. Thanks for taking the time to do all those tests. Gordon from PC World said earlier today that his hair got grey from years of testing and he had to shave it. I hope this is not the case with you after so many tests ;-)

Back to business: this test with the 4090 shows that nothing has fundamentally changed, following so many tests with Ampere cards. In 3DCenter's meta-analysis from December 2021, the 12900K is ~16% faster than the 5800X in 1080p gaming, so the result here is within the margin of error of that figure. The graph has added the 7000-series CPUs too.








JorgeRod said:


> Comparing a i9-12900k priced at 699,90€ with a 339,90€ ryzen7 5800x?? Should be a Ryzen 9 5950X (614,90€) maybe.


It does not really matter; it was an academic probe. Nothing has changed that we did not already know.
By the way, the 5800X, 5900X and 5950X have very similar 1080p gaming performance, within less than 2% of each other.

Recent testing of the 7950X showed that gaming is actually faster with one CCD switched off, i.e. with 8 cores only, due to the latency penalty between CCDs. So testing with the 5800X is fine.



Hitman_Actual said:


> Conclusion, AMD fan boy hearts be broken.


Please don't post nonsense on a platform with serious discussions. It's embarrassing. Step up your game.


----------



## wheresmycar (Oct 19, 2022)

Tek-Check said:


> Right. Thanks for taking time to do all those tests. Gordon from PC World said earlier today that his hair got grey from years of testing and he had to shave it. I hope this is not the case with you after so many tests ;-)
> 
> Back to business, this test with 4090 shows that nothing fundamentally has changed, following so many tests with Ampere cards. In 3D Center's meta-analysis from December 2021, 12900K is ~16% faster in 1080p gaming than 5800X, so within margin of error. This graph has added 7000 CPUs too.
> 
> ...



Got a link to the above chart? Will we get to see individual game results too, alongside test setup notes and display resolutions?

One of the things I admire about TPU charts is the wealth of info that comes with them... a second source with a similar breakdown would be great.


----------



## Tek-Check (Oct 19, 2022)

Fluffmeister said:


> Ouch, pretty brutal for the red team, help me Obi-wan 5800X3D-nobi, you're my only hope. Still for the price the 5800X does great... averaged out over all titles.


15% slower on average is far from "brutal". Nonsense. Plus, if someone plays at 4K, the difference in experience is pretty much negligible. Your pocket, however, will feel a real and deep difference if you decide to game on a halo CPU like the 12900K.


----------



## InVasMani (Oct 19, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?



Definitely a weird comparison with the 12900K. The locked Raptor Lake chips below the 13600K, once they arrive, and the older 12600K are closer matches for the 5800X.

On the other hand, comparing against the 5800X3D, the 12700K, and later the 13600K, all with DDR4, would make the fairest comparisons. If you're comparing with DDR5, you'd probably have to use a locked Raptor Lake chip below the 13600K eventually, or the previous-generation 12600K. The difference is the expected DDR5 and platform cost relative to DDR4 and its platform cost, and that's before factoring in already owning DDR4 memory, which makes a value comparison with DDR5 even trickier between newer and older chips.

As for 5800X3D vs. a 5800X with faster RAM: the former, since even a 5800X with faster memory is a really hard match-up against a 12900K. You'd have to use a very premium, obscure DDR4 kit like 4800 MT/s CL17 or 4000 MT/s CL14 that most wouldn't be tempted to buy. That's exceptional binning for DDR4, though, and single-rank with limited capacity. The cost difference is too steep for it to be a practical comparison that isn't heavily slanted.

A comparison of a 5800X3D with 4000 MT/s CL14 against a 13600K on a more affordable board, with the best DDR5 kit that matches the AMD configuration's CPU/motherboard/RAM cost, would be a good one to see. The 13600K should win at multi-threading and could challenge the 5800X3D at gaming more than expected, but then again a crazy, heavily binned DDR4 kit might make the 5800X3D even more exciting where it already excels.

Seeing the same cost investment between the two would be a nice showdown: which comes out on top, in which scenarios, and what's the power draw on each? Keep it as money-neutral as possible, within about ±$20.


----------



## Tek-Check (Oct 19, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


We know now that changing the GPU from Ampere to Lovelace does not make a significant difference to average Zen 3 and Alder Lake CPU performance in gaming. See #89 

With the 3D SKU, I doubt we will find something novel with Nvidia GPUs, but I might be wrong. Testing the 5800X3D will be more interesting with RDNA3 GPUs, because we know from HUB tests that SAM works slightly better on AMD systems and brings more gains.

RAM could be more exciting with a 7700X or 7900X vs. the 12900K.



Silvinjo said:


> I'm more shocked that this is posted live to the world (and the fact that 53 games were tested with that bad ram) and that someone who has so much systems didnt realize yet that 4000mhz is not go for Zen 3, especially those really high timings. Just to be clear, I dont have 500$ board (I'm on B550 Pro4, 100$ board) and 2x8 Ballistix 3600 CL16 (E-die) that I picked for a bit more than 50€ on amazon.de. Actually, this is ONE of the reason I don't like techpowerup, or atleast, their "testers".


DDR4-3600 is the sweet spot for Zen 3, but even testing with different RAM did not uncover anything that we did not know before. It's all roughly the same, within the margin of error.

You are free not to like anyone and anything, but you could put some basic effort into polite communication and not post personal, sarcastic comments against staff who put a lot of time into this. You can always suggest changes to the testing regime without being rude.


----------



## Fluffmeister (Oct 19, 2022)

Tek-Check said:


> 15% slower on average is far away from "brutal". Nonsense. Plus, if someone plays in 4K, the difference in experience is pretty much negligible. However, your pocket will feel a real and deep difference by deciding to game with halo CPU 12900K.



Averages do tend to soothe the pain, and I'm sure a cheaper 12600 would also leave the 5800X behind, but hey, I get your point... percentages shrink to nothing at 4K with my 3700X too.

Devils indeed do cry.


----------



## Tek-Check (Oct 19, 2022)

Fluffmeister said:


> Average does tend to sooth the pain, I'm sure a cheaper 12600 would also leave the 5800X behind, but hey I get your point... percentages shrink to nothing at 4K with my 3700X at 4K too.


Here. 12600 did not have any 'oomph' to offer either, at any resolution.


----------



## Fluffmeister (Oct 19, 2022)

Tek-Check said:


> Here. 12600 did not have any 'oomph' to offer either, at any resolution.
> 
> View attachment 266103


So basically you're upset with these results, I get it.


----------



## Tek-Check (Oct 19, 2022)

Fluffmeister said:


> So basically your upset with these results, I get it.


Why would I be upset? And, results have not changed.


----------



## Fluffmeister (Oct 19, 2022)

Tek-Check said:


> Why would I be upset? And, results have not changed.


Just a hunch; I looked at all the green bars to the right and wondered if anyone was bothered by the results. Hey, AMD will be fine.


----------



## birdie (Oct 19, 2022)

The amount of work put into this review is staggering.

Thank you @W1zzard.

The only issue is that the RTX 4090 is a luxury card for maybe 0.5% of gamers, and the GTX 1060 still rocks the world. The past 3-4 years of GPUs have been insane. There's nothing even close to the 1060 in terms of price/performance; it's probably the best and most popular GPU ever released.


----------



## Minus Infinity (Oct 19, 2022)

Things will get really interesting when we have Zen 4 V-Cache CPUs vs the 13700K and 13900K. The 13700K should handily beat the 12900K across the board; it's basically an improved version in every way. The 13900K is going to win quite a few productivity tests against the 7950X just for having 24 cores, but I expect it will lose quite a few too. Clocks will win gaming for Intel in a lot of cases against standard Zen 4, but I expect the V-Cache models to rule the roost.


----------



## Godrilla (Oct 19, 2022)

That's why I'm waiting for Zen 4 3D on AM5: future-proofing, and potentially an upgrade to a better CPU down the line. Isn't it sad that all that performance is wasted on DP 1.4 image quality? Even with a better CPU you still have to compromise image quality with compression, on a standard that has now been carried for four generations, since the GTX 10 series. While my 9900KS is obviously bottlenecking my 4090 at 4K, I am still getting more than double the performance of my neutered 3090 XC3 Ultra Hybrid, which is more than satisfying for now.
Great article and thank you for the hard work!


----------



## konga (Oct 19, 2022)

Godrilla said:


> That's why I'm waiting for Zen 4 3d on am5 future proofing and potentially upgrade to better CPU down the line.  Isn't it sad all that performance wasted with dp 1.4 image quality! Even with a better CPU you still have to compromise image quality from compression on a 4 generations now supported standard since gtx 10 series. While my 9900 ks is obviously bottlenecking my 4090 at 4k I am still getting more than double the performance than my Neutered 3090 xc3 ultra hybrid which is more than satisfying for now.
> Great article and thank you for the hard work!


DSC does not reduce image quality. The whole point of it is that it's "visually lossless." I've used a DSC monitor side-by-side with a non-DSC monitor for two years now and have not noticed a single instance of degraded image quality on the DSC monitor.
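As a rough sanity check on why DSC comes up at all on DP 1.4, here is a back-of-the-envelope sketch (assuming the commonly cited ~25.92 Gbit/s usable HBR3 payload and ignoring blanking overhead, which only makes things worse): the raw pixel data of a 4K 144 Hz 10-bit stream already exceeds the link.

```python
# Back-of-the-envelope check: does 4K@144Hz 10-bit fit in DP 1.4 without DSC?
# Assumptions: ~25.92 Gbit/s usable HBR3 payload, 30 bpp (10 bits per channel),
# and no blanking overhead (real video timings need even more than this).

DP14_PAYLOAD_GBPS = 25.92

def stream_gbps(width: int, height: int, hz: int, bpp: int) -> float:
    """Uncompressed video bandwidth in Gbit/s."""
    return width * height * hz * bpp / 1e9

needed = stream_gbps(3840, 2160, 144, 30)
print(f"{needed:.1f} Gbit/s needed vs {DP14_PAYLOAD_GBPS} Gbit/s available")
# Roughly 35.8 Gbit/s needed, so DSC (or chroma subsampling) is required.
```

Numbers this simple are why DSC's roughly 3:1 compression comfortably closes the gap.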


----------



## ACE76 (Oct 19, 2022)

This review should have been done with the 5800x3d.


----------



## InVasMani (Oct 19, 2022)

The 13600K will likely be more appealing than the 12900K in performance terms, outside of a heavy overclock of the latter, and as a whole it will look far more appealing to anyone considering power draw, noise, heat, and cost. Capped at or below the 13600K's max power limit, it will probably win out in most cases.


----------



## konga (Oct 19, 2022)

InVasMani said:


> 13600K will likely generally be more appealing than the 12900K outside a heavy overclock of the latter in terms of performance, but as a whole 13600K will look way more appealing to anyone considering power draw, noise, heat, and dollar cost. At or below the same max power limit of the 13600K it'll probably be more appealing typically.


Indeed. I have a Dark Rock Pro 4 and do not wish to upgrade the cooler. I will either buy a 13600K or try to do power limit stuff on the 13700K to make it reasonable on the DRP4, if the reviews indicate there's a reason to do so. Ideally I'd wait until Zen 4 3D to make a purchasing decision, but my 4090 is arriving tomorrow and I have a feeling my 5600X isn't gonna cut it.


----------



## Mistral (Oct 19, 2022)

Isn't the 12900K practically twice the price of the 5800X right now? In my area it's 330 Canuckian bucks vs 650.


----------



## InVasMani (Oct 19, 2022)

The 13700K is honestly just a bad pitch from Intel to me: it's either the 13600K with DDR4 or the 13900K with DDR5, no in-between. The 13700K is between a rock and a hard place, much like Zen 4 is right now prior to the X3D variants' arrival. It's not a well-positioned chip to me, and generally I'd say the current Zen 4 parts aren't either. I'm either dipping my toes in the raptor's mouth or going all the way into its belly.

That's if I go with Intel at all; the 5800X3D is still a relevant consideration, and for Zen 4 X3D I want to wait and see whether it's even worth it. I'd even consider a few AM4 SKUs other than the 5800X3D, though that depends heavily on the relative value of certain options. I'll know I'm ready when the deal is good enough that I don't want to pass on it. If I have to debate the purchase decision, I probably don't need it and shouldn't buy it.

I can understand not wanting to dole out more money on a new cooler if it can be avoided.


----------



## Garrus (Oct 19, 2022)

InVasMani said:


> 13700K is honestly just a bad pitch for Intel to me it's either 13600K with DDR4 or 13900K DDR5 no in between. The 13700K is in similar rock and hard spot to me as Zen 4 is currently prior to the X3D variants arrival. It's not a very good positioned chip to me generally I'd say the current Zen 4 parts aren't either. I'm either dipping my toes in the raptors mouth or all it's belly.
> 
> That's if I go with Intel at all the 5800X3D is still a relevant consideration and Zen 4 X3D I want to try to wait to see if it's even worth it. I'd even consider a few AM4 SKU's outside of the 5800X3D though depends on the relative value angle of certain options heavily. I'll know when I'm ready when the deal is good enough I don't want to pass on it. If I have to debate the purchase decision I probably don't need it and shouldn't.
> 
> I can understand not wanting to doll out more money on new cooler if it can be avoided.


The i7 is the best chip. The i5 is nerfed in cache and gaming performance and default clock speeds. The i7 gets you everything plus 2 more P cores, the important ones. The i9 just gets you a bunch of nearly useless E cores and doesn't actually perform better at the same clock speed as the i7 for gaming. Also in most countries like my own there is a very tiny price difference i7 versus i5 now as the i5 is WAY overpriced. Meanwhile they want an extra $200 for the i9 over the i7.


----------



## Gundem (Oct 19, 2022)

Very interesting write up. Thank you. Getting a 4090 means getting a few other things too


----------



## Dirt Chip (Oct 19, 2022)

Garrus said:


> The i7 is the best chip. The i5 is nerfed in cache and gaming performance and default clock speeds. The i7 gets you everything plus 2 more P cores, the important ones. The i9 just gets you a bunch of nearly useless E cores and doesn't actually perform better at the same clock speed as the i7 for gaming. Also in most countries like my own there is a very tiny price difference i7 versus i5 now as the i5 is WAY overpriced. Meanwhile they want an extra $200 for the i9 over the i7.


Yep, the i9/x9xx tier is for workloads that make full use of multi-threading, or for people who must feel "future-proof"/"have the best", or who can't stand the idea of losing 1% FPS once in a while.

Anyway, very informative article. I have a feeling Raptor Lake is also in there but not shown because of the NDA.


----------



## W1zzard (Oct 19, 2022)

vMax65 said:


> Well done on taking the time to do this massive test, it is more than appreciated. I think it needs to be stated that this is not an Intel vs AMD 'or mine is better than yours'! test but rather a bottle neck test show casing what a GPU like the RTX 4090 is going to need if you are looking to extract the best out of it...


Yeah it seems people think this is "AMD vs Intel at similar config" whereas the test is "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring (added this to test setup page, too, so people can stop freaking out)



DemonicRyzen666 said:


> Is the 12900K here running with e-cores enabled in these games? I just want clarification of 24T vs 16T?


12900K was running at default settings, so yes



mechtech said:


> And not one DX9


If you had to pick one game, that was a commercial success and that everybody knows, what would you choose?



catulitechup said:


> because infinity fabric on zen4 stay at 3000mhz right?


IF on Zen 3 can do 1800 on all, 1900 on some, 2000 on very few
AMD says for Zen 4 2000 MHz is the sweet spot.

I'm not aware of anyone who has ever gotten 3000 MHz IF on Zen 3 or Zen 4
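The numbers above follow from the usual Zen 3 coupling of memory clock and fabric clock. A minimal sketch, assuming the standard rule that best latency needs MCLK:FCLK at 1:1 and that MCLK is half the DDR transfer rate (the 1900 MHz ceiling below is just a typical-sample assumption, not a spec):

```python
# Sketch of the DDR speed <-> Infinity Fabric (FCLK) coupling on Zen 3.
# Assumptions: lowest latency requires MCLK:FCLK at 1:1, MCLK = (DDR MT/s) / 2,
# and a typical Zen 3 sample tops out around FCLK 1800-1900 MHz.

def fclk_for_1to1(ddr_mts: int) -> int:
    """FCLK in MHz needed to run the given DDR speed in 1:1 sync."""
    return ddr_mts // 2

def runs_synced(ddr_mts: int, max_fclk: int = 1900) -> bool:
    """True if a chip capped at max_fclk can keep 1:1 with this DDR speed."""
    return fclk_for_1to1(ddr_mts) <= max_fclk

for kit in (3600, 3800, 4000):
    print(f"DDR4-{kit}: needs FCLK {fclk_for_1to1(kit)}, 1:1 ok: {runs_synced(kit)}")
# DDR4-4000 needs FCLK 2000, which only very few Zen 3 samples reach,
# so it typically falls back to a 2:1 divider and pays a latency penalty.
```

This is why DDR4-3600 (FCLK 1800) is usually called the Zen 3 sweet spot: it is the fastest common kit that stays 1:1 on nearly every sample.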


----------



## Dirt Chip (Oct 19, 2022)

ACE76 said:


> This review should have been done with the 5800x3d.


But then you'd also need the 12900KS "to be fair". Most people don't have this CPU even among top-tier owners, and anyway it is not AMD vs. Intel, although the way the data is presented might make you think that way.

To get less bottlenecked we need better software much more than faster CPUs.
We have largely saturated core/thread counts by now. More GHz and better IPC will surely help, but they come in single-digit percentages, progress slowly, and cost more and more.
Game designers need to make full use of Resizable BAR, SAM, and DirectStorage tech to alleviate the CPU-GPU crosstalk.

Software, and utilizing AI, is where we'll see the biggest improvement in this department, without extra cost to the consumer.


----------



## Tek-Check (Oct 19, 2022)

wheresmycar said:


> Got a link to the above chart? Will we get to see individual game results too alongside test setup notes and display resolutions?
> One of the things I admire about TPU charts is the wealth of info that comes with it... a second source with a similar break-up would be great.


No individual games, as 3DCenter carries out a *meta-analysis* of individual reviews. You would need to visit each individual review from tech websites around the world to discover whether they published per-game breakdowns. For a start, there is a list over there of all the individual websites.





Launch-Analyse Intel Alder Lake | 3DCenter.org (www.3dcenter.org)

AMD Ryzen 7000: Die Launch-Reviews gehen online | 3DCenter.org (www.3dcenter.org)


----------



## Dyatlov A (Oct 19, 2022)

W1zzard said:


> Yeah it seems people think this is "AMD vs Intel at similar config" whereas the test is "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring (added this to test setup page, too, so people can stop freaking out)
> 
> 
> 12900K was running at default settings, so yes
> ...



If the 12900K was at stock, then with tuning and overclocking it could probably open an even bigger gap, while the 5800X would not have much more room.


----------



## HenrySomeone (Oct 19, 2022)

seventy said:


> Extensive testing, but the data is not presented well. Would have been nice to at least see the fps, or even frametime graphs instead.
> Questionable ram choice for 5800x, the infinity fabric is clocked high, but those timings (@ 4000 MHz 20-23-23-42 1T) most likely sandbagged the 5800x.
> I wouldn't be surprised if *good b-die at even 3600Mhz (with cl14 etc) would score much better (10-20%+*), because zen 2 and 3 mostly scale with timings instead of raw frequency from my personal testing.


 Oh, you sweet summer child...


----------



## so11ex (Oct 19, 2022)

Dunno why, but it's always a top-of-the-line Intel CPU crushing some mid-range AMD part (especially past gen).
It would be fair to have an update with the 7950X vs the 13900K, and to add some previous-gen Intel chips like the 10600K/11600K.
Right now it looks like a good Intel advertisement. Don't forget that many of your site visitors are not so familiar with hardware. What will they remember some time later? Intel > AMD, approx 1.2x. That's it.


----------



## Cryio (Oct 19, 2022)

At this point I think TPU is listing Days Gone as DX12 intentionally, as some sort of meta joke.

They've been doing it for 3 years.


----------



## joseLopez (Oct 19, 2022)

The AMD 5800X is half the price of the Intel chip, and the 5800X3D is 26% less. Despite all that, the 5800X3D would be the right CPU to test, along with better DDR4.


----------



## W1zzard (Oct 19, 2022)

Cryio said:


> At this point I think TPU is doing it intentionally on listing Days Gone as DX12 as some sort of Meta Joke.
> 
> They've been doing it for 3 years


Bah .. and I could swear it was DX12 .. this will be fun to fix



http://imgur.com/utgMca3




joseLopez said:


> The amd 5800x is half the price of the intel, and the 5800x3d is 26% less than the intel. Despite all that, the 5800x3d would be the right cpu to test, and using a better ddr4.


Just from a few posts above yours: "Yeah it seems people think this is "AMD vs Intel at similar config" whereas the test is "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring (added this to test setup page, too, so people can stop freaking out)"

I will be testing 5800X3D soon


----------



## Silvinjo (Oct 19, 2022)

Any info on when the test is coming out with the 13900K vs the 2700X with 2666 CL16 RAM?


----------



## adilazimdegilx (Oct 19, 2022)

Thank you for this huge test, W1z. I guess it clearly shows that it's time to change your bench system. It seems the 13900K will be the replacement, as it's expected to be faster than the 12900K. Even then, you'd want the CPU in its best condition (overclocked, on the best memory configuration, etc.) to future-proof the testing setup as much as possible. You might want to consider using Process Lasso with games like DMC5, as they taint the results; I don't know if disabling them altogether is better, but you might even just do that.
Even then I'm not sure that new system would serve for long. Since the 4090 released, I'm expecting a huge jump on the CPU side as well in the near future, as we've clearly seen that current CPUs just don't cut it, and we have still yet to see the 7000s and a 4090 Ti. Both Intel and AMD will surely force their hands in their next generation/launch.
This gets me really excited for the near future; looking forward to your new reviews.


----------



## siluro818 (Oct 19, 2022)

W1zzard said:


> Bah .. and I could swear it was DX12 .. this will be fun to fix
> 
> 
> 
> ...


You would have saved yourself all the explaining had you not used "Ryzen 7 5800X vs Core i9-12900K" in the title of this piece...


----------



## Godrilla (Oct 19, 2022)

I noticed there's no RT, which is probably the best fit for this GPU and would make it even more GPU-bound; the 6% overall delta might be even smaller, with some outliers naturally.


----------



## Easy Rhino (Oct 19, 2022)

W1zzard said:


> Yeah it seems people think this is "AMD vs Intel at similar config" whereas the test is "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring (added this to test setup page, too, so people can stop freaking out)



That's because people no longer read these days. It's pretty obvious from your conclusion that these tests are meant to show how even a slightly old CPU is going to massively bottleneck this GPU.


----------



## Colddecked (Oct 19, 2022)

konga said:


> Indeed. I have a Dark Rock Pro 4 and do not wish to upgrade the cooler. I will either buy a 13600K or try to do power limit stuff on the 13700K to make it reasonable on the DRP4, if the reviews indicate there's a reason to do so. Ideally I'd wait until Zen 4 3D to make a purchasing decision, but my 4090 is arriving tomorrow and I have a feeling my 5600X isn't gonna cut it.



If you want to hold out until zen4 3d, why not try out a 5800x3d?


----------



## TheinsanegamerN (Oct 19, 2022)

Pepamami said:


> how so?


Because this one review shows Intel ahead of the 5800X in performance, and out of the woodwork come all these accounts with low post history having a meltdown in the comments about memory speed, and "W1zzard did it wrong", and "he should have used this config", and "he'll never show AMD in a good light", and blah blah blah. One managed to get banned for being a dick over it.

That's fanboy behavior. Just because your favorite CPU didn't do as well as the competition doesn't mean the world is conspiring against AMD or any such nonsense.


so11ex said:


> Dunno why but its always top of the line Intel CPU crushing some mid AMD (especially past gen)
> Would be fair to have an update 7950x vs 13900k and to add some previous gen intel chip like 10600k/11600k
> Right now it looks like a good intel advertisement. Dont forget that many of your site visitors are not so familiar with hardware. What will they remember some time later? Intel>AMD approx 1.2x. Thats it.


See this guy? This is what happens when you dont read. 


Easy Rhino said:


> That's because people no longer read these days. It's pretty obvious from your conclusion that these tests are meant to show how even a slightly old CPU is going to massively bottleneck this GPU.


It's also odd that many of these accounts have few-to-no posts in their history and little to no activity until an AMD article comes out, then all of a sudden they are in full force.....



Colddecked said:


> If you want to hold out until zen4 3d, why not try out a 5800x3d?


It's really the best choice. None of these newer CPUs will matter unless you're pushing over 144 Hz at 1080p or some such nonsense. If you're going for 1080p at 240 Hz, then yeah, the newer ones will make a difference, but for 99% of people the 5800X3D is the perfect gaming chip.


----------



## dirtyferret (Oct 19, 2022)

W1zzard said:


> Soon


----------



## HD64G (Oct 19, 2022)

zlobby said:


> Why not both?


Since @W1zzard asked us to choose (due to time constraints, I suppose), I chose.


----------



## mechtech (Oct 19, 2022)

W1zzard said:


> If you had to pick one game, that was a commercial success and that everybody knows, what would you choose?


Well, the DX9 thing was kind of a joke, but since you asked, how about the classic Half-Life 2? Maybe the 4090 can break 1000 fps.

Edit: Were Black Mesa, Anno, or Borderlands 1/2 done in DX9?


----------



## Pepamami (Oct 19, 2022)

TheinsanegamerN said:


> Because this 1 review shows intel ahead in performance of the 5800x, and out of the woodwork come all these accounts with low post history having a meltdown in the comments about memory speed and wizzard did it wrong and he should have used this config and "he'll never show AMD in a good light" and blah blah blah. One managed to get banned for being a dick over it.


But he said my heart is broken (I prefer AMD CPUs over Intel), and it's not; I answered why. I did not read all of the comments >.>

And what is wrong with pointing out that Zen 1/2/3 memory should run in dual-rank mode, or as 4 sticks of single-rank, to achieve better results in some memory-demanding games?
It's just a good reminder.


----------



## zlobby (Oct 19, 2022)

TheinsanegamerN said:


> Because this 1 review shows intel ahead in performance of the 5800x, and out of the woodwork come all these accounts with low post history having a meltdown in the comments about memory speed and wizzard did it wrong  and he should have used this config and "he'll never show AMD in a good light" and blah blah blah. One managed to get banned for being a dick over it.
> 
> That's fanboy behavior, just because your favorite CPU didnt do as well as the competition doesnt mean the world is conspiring against AMD or such nonsense.
> 
> ...


Seriously, what is wrong with 'lower-ranked' users' opinions? I really hate it when people count the number of stars below one's name and judge the validity of one's response by it.




Pepamami said:


> But he said, that my heart is broken (I like amd cpu over intel cpus), but its not, I answered why.
> And wot is wrong in telling in comments, that Zen 1-2-3 memory should be in DualRank mode or 4 sticks of SingleRank mode, to achieve better results in some memory demanding games.


Every CPU has its own sweet spot when it comes down to memory configuration, i.e. rank, slots, speed, timings, etc.
I for one believe it's one's own duty to research the best config for a given CPU, but only because Intel and AMD won't publish such information. Naturally, vendors will only cite compatibility information, so thanks to guys like @W1zzard for trying to help us get something that will serve us well.

I'm far from even thinking @W1zzard would 'lease' his integrity to show bias toward any vendor there is, even if he has his own personal beliefs. The moment I catch wind of something like that happening here would be the moment I put the entire TPU domain on my blacklist.


----------



## Pepamami (Oct 19, 2022)

zlobby said:


> Every CPU has its own sweet spot when it comes down to memory configuration i.e. rank, slots, speed, timings, etc.


The review itself is not about how Intel is better than AMD. But it includes the 5800X, so I think it's valid to mention in the comments what you can do to run the 5800X even better by using these sweet spots.


----------



## DemonicRyzen666 (Oct 19, 2022)

W1zzard said:


> Yeah it seems people think this is "AMD vs Intel at similar config" whereas the test is "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring (added this to test setup page, too, so people can stop freaking out)


I feel like most people are complaining about the price difference between the two CPUs. It's nice to have clarification.


W1zzard said:


> 12900K was running at default settings, so yes


Thanks


W1zzard said:


> If you had to pick one game, that was a commercial success and that everybody knows, what would you choose?


Does it have to be a commercial success? How about one that's well optimized, like one of the old Batman Arkham games? One of those ran really well on a 20-thread i7-6950X.


W1zzard said:


> IF on Zen 3 can do 1800 on all, 1900 on some, 2000 on very few
> AMD says for Zen 4 2000 MHz is the sweet spot.
> I'm not aware of anyone who has ever gotten 3000 MHz IF on Zen 3 or Zen 4


I run my Infinity Fabric at 1600 MHz and never get errors, and most of my Cinebench scores are higher than everyone else's running 1800 MHz ¯\_(ツ)_/¯


Oh, I wanted to ask if you're going to be adding performance-per-watt, and maybe performance-per-square-millimeter too, and/or a combination of both, now that Zen 4 uses the same nodes for all its CCDs and IODs and Raptor Lake is monolithic, to get an aggregate measure of performance.


----------



## GotNoRice (Oct 19, 2022)

TheEndIsNear said:


> Pretty soon we'll have to buy video cards depending on which engines or games you like if you don't already.



Agreed.  If you are a World of Warcraft player like I am, then almost nothing beats the 5800X3D right now.  Just look at the numbers that Intel put in their own marketing slide.  Intel would be the LAST company to exaggerate the performance of an AMD CPU, yet their own marketing slide clearly shows the 5800X3D dominating both the top-end 12-series and 13-series Intel CPUs when it comes to World of Warcraft.



Looking at the 53 games that were reviewed in the TPU article, I'm seeing ~50 games that I don't play or care about. I enjoy GTA 5, and I occasionally play Far Cry 5 and Battlefield 5. Ironically, GTA 5 also seems to be faster on AMD CPUs. So, zero reason for me to go Intel.


----------



## THU31 (Oct 19, 2022)

Reading the comments has been quite painful. So many people did not understand the point of this test.

The original 4090 review was done with the 5800X. This re-test is supposed to minimize the CPU bottleneck. It is not supposed to be a comparison between various CPUs, it is supposed to utilize one of the fastest processors currently available.

People complained about the platform used in the original review because the 4090 was heavily bottlenecked. It still is even with a 12900K and it still will be with Raptor Lake and Zen 4 X3D, but we can see the extra performance that can be gained using the fastest CPUs on the market.


----------



## Xebec (Oct 19, 2022)

I appreciate the effort that W1zzard puts into these reviews and providing this data. My takeaway: if you replace a 'good gaming CPU' with a 'great gaming CPU', even at 4K average FPS you'll see strong benefits in some games with an RTX 4090. That's really good information.

I will definitely read TPU's Raptor Lake reviews tomorrow. The only data point I'll need to go elsewhere for is Flight Simulator, as I play a lot of sim-type games.


----------



## jallenlabs (Oct 20, 2022)

jallenlabs said:


> no 5800x3d?  why even bother with the 5800x.  Seems like a huge time suck for no reason...


Sorry to bash all your work, W1zzard. I know it was a handful and I shouldn't be so quick to complain. So, thanks for all your work; go with the 5800X3D in the next one, and use appropriate RAM speeds for each (the sweet spot for each architecture).


----------



## desiprofessor (Oct 20, 2022)

W1zzard said:


> Should I test 5800X3D or faster RAM speed?


Would prefer the 5800X3D first. It lets people see whether they need to upgrade to next gen or not.


----------



## x4it3n (Oct 20, 2022)

It would have been interesting to compare with a 5900X or 5950X since they have more cores _(which can definitely be beneficial in some games)_, even though the 5800X only has 1 CCX!
Also, the best RAM optimization for Zen 2 & Zen 3 is 3733 MHz CL14, and you can also get a small boost when using 4 sticks instead of 2.

But I'm surprised that the 4090 is still being bottlenecked at 4K... I wonder if it's because most games are still optimized for Intel CPUs (since they have the biggest market share and the longest history in desktop gaming CPUs).

Can't wait to see the reviews and benchmarks for the 4090 Ti and RDNA 3 GPUs!

PS: Good job, I'm sure it was a lot of work and hours...!!!


----------



## Pepamami (Oct 20, 2022)

THU31 said:


> The original 4090 review was done with the 5800X. This re-test is supposed to minimize the CPU bottleneck. It is not supposed to be a comparison between various CPUs, it is supposed to utilize one of the fastest processors currently available.


It's not the "fastest", it's just "a random one I have on my desk that's faster than the 5800X".
I mean, don't say "fastest", or else people are going to complain about why the 12900K was picked as the "fastest" CPU and not the 5800X3D or a 7000-series part. Wait, they already do... (and this thread has veered wildly into "which CPU is the fastest, and how you should tune it")


----------



## nguyen (Oct 20, 2022)

Would love to see more online games being tested, since FPS matters much more in that case.


----------



## Mussels (Oct 20, 2022)

Seeing the comments of people arguing about this on Facebook has been hilarious.
People really didn't understand in the slightest that this was done because TPU has done a lot of testing on the 5800X system.


6 pages of comments to read up on, but I'm all for the X3D results. From my own experience with all of this, having four ranks of memory is critical. Either 4x8GB or 2x16GB is the bare minimum, with 3600 MHz CL16 being the lowest you'd want to go on such a performance-oriented CPU (understandable, though, since it's also a common speed and easy to purchase, and going above 3600 does vary a lot between RAM kits and IMCs).

The below isn't shitting on W1zz, as his test setup was made before a lot of this was known, let alone commonly known.
*Once he committed to his test setup, he couldn't change it.*
That said, anything involving modern comparisons that doesn't need to be compatible with all the previous benchmarks should include an updated memory setup: Zen 3/Zen 3D *really* prefers four ranks and low latency over anything else. The 3D chips largely alleviate the latency, but the extra ranks still help.


My thoughts:
1. Was this only two 8GB sticks? That's 100% going to lower performance on the Zen 3 system.
We've known this for a long, long time; even Zen 2 had lower results with 2x SR sticks, and TPU has covered it in the past.



Even the 5600x gets anywhere from 5-15% performance hits from two ranks, with otherwise equal memory

GN's YouTube video (since I opened it yesterday and commented on another thread already) shows this super clearly:
Their "stock" setup is 3200 CL14 4x8; the key to reading this is that every 2x8 result is at the bottom of the chart.
When dual-rank CL18 can beat single-rank CL14, you know 2x SR has gotta be avoided.




*How much would that 12900K vs 5800x 6.5% difference shrink, when GN found four ranks of memory gave a 9.5% performance gain?*

2. 4000 CL20 is great for high MHz, but the poor timings are definitely an issue.
Look at the GN graph: 3200 CL14 was superior to 3800 CL18 on average. Benchmarks love the MHz, games love the latency (hence the X3D's big advantage).

3. I like these charts less than the old style. Needs more colours.

4. FPS values. Knowing it's 20% faster is nice, but knowing whether that's 200 FPS vs 240 FPS or 50 FPS vs 60 FPS matters.
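On the bolded question above, here's a hedged back-of-the-envelope sketch. The 6.5% lead and GN's 9.5% memory uplift come from different games and different test suites, so treating them as uniform averages is an illustration only, not a prediction:

```python
# Hypothetical arithmetic only: these two percentages were measured on
# different test suites, so this shows the shape of the answer, not a result.
lead_12900k = 1.065    # 12900K relative to the 2x8GB 5800X config (TPU's 6.5%)
uplift_4rank = 1.095   # 5800X with four memory ranks vs. itself (GN's 9.5%)

# Remaining relative gap after the hypothetical memory upgrade:
new_gap = lead_12900k / uplift_4rank - 1
print(f"{new_gap:+.1%}")  # slightly negative: the tuned 5800X would nose ahead
```

If both figures held simultaneously, the 6.5% deficit wouldn't just shrink, it would flip to roughly a 2.7% lead for the tuned 5800X, which is why the memory config complaints keep coming up.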






Xuper said:


> If you want to test 5800X3D , Please have DDR4 3733mhz or 3200 CL14, Thanks.


4x8 or 2x16GB for sure; 3200 CL14 or 3600 CL16 seem to be the best choices for a 'common' setup that any end user could achieve without manual tweaking.



HenrySomeone said:


> Oh, you sweet summer child...


No, he was partly correct: lower timings and quad ranks (4x SR or 2x DR) will definitely help results.

Damn, I wish I had more RAM modules available. I could do my own 3200 MHz testing and compare SR vs DR, but I can't match the scale of W1zz's testing or cover many combinations of timings and MHz values.
I'm not even sure what game benchmark I could use to demonstrate it; just doing 4x SR vs 2x DR at the same speeds should be enough to show it's worth the time.

Final edit: Since GN had Shadow of the Tomb Raider show larger differences, I'll test 2x SR, 4x SR, 2x DR and 2x DR + 2x SR at 3200 CL16 with that. It's not going to be massively definitive, but it's at least repeatable for consistency and verification by others.

I can't do much with my 5800X system as it's only got two memory slots, but I can do SR vs DR at the same speed.


----------



## nguyen (Oct 20, 2022)

Mussels said:


> Seeing the comments of people arguing about this on facebook has been hilarious
> People really didn't understand in the slightest this was done since TPU's done a lot of testing on the 5800x system
> 
> 
> ...



Pretty sure I urged W1zzard to use a quad-rank config back when the new test PC with the 5800X came out, LOL


----------



## HTC (Oct 20, 2022)

I understand W1zzard's objective with this review: to take the original review's results and compare them using a more powerful processor.

IMO, the problem isn't this review but rather the original one, where he should have used the most powerful processor he had available, from either Intel or AMD, in order to maximize the performance of the GPU being tested.


----------



## Mussels (Oct 20, 2022)

nguyen said:


> Pretty sure I urged W1zzard to use quad ranks config back when the new testing PC with 5800X came out LOL


I think I did too, but there were genuine arguments to be made for getting the IF as high as possible.
I just want the X3D to be set up with a new config, since it's a new test setup anyway.


And it's totally fair to have both setups on 32GB of RAM, too.


----------



## W1zzard (Oct 20, 2022)

Mussels said:


> I just want the x3D to be setup with a new config, since it's a new test setup anyway.


5800X3D will use our typical CPU review config









AMD Ryzen 9 7950X Review - Impressive 16-core Powerhouse (www.techpowerup.com):

The Ryzen 9 7950X is a monster CPU. When paired with the right workload it will eat even the 12900K for breakfast. As our review shows, the performance uplifts can be massive: +30-50% gen-over-gen is totally possible. What makes things complicated though, is that keeping the beast cool is almost...




2x 16 GB 3600 CL14 IF 1:1. Many of my Zen 3 CPUs can't do 1900 or 1883, so out of fairness I'm using 1800 on all


----------



## Mussels (Oct 20, 2022)

Well. This certainly makes it harder.





*flips desk*

Even the free demo locks you out if you change hardware, such as your RAM amount...



W1zzard said:


> 5800X3D will use our typical CPU review config
> 
> 
> 
> ...


That's a pretty good RAM config, and should give near the best performance these chips can do.
TR had a good benchmark that's now useless since I've been locked out of running it; I'm going to need a moment to scream at useless DRM.


----------



## W1zzard (Oct 20, 2022)

Mussels said:


> TR had a good benchmark


Any slightly busy in-game location will be a better and more realistic benchmark, even if you stand completely still


----------



## Mussels (Oct 20, 2022)

W1zzard said:


> Any slightly busy in-game location will be a better and more realistic benchmark, even if you stand completely still


Yeah, I wanted something easily repeatable with a built-in benchmark. I don't have your benchmark suite available (or the time to download it on a 50Mb connection lol)


I uh, am running 80GB of RAM with 6 ranks right now, tho?

I can just install an entire OS and my games directly into that, I guess...




Okay! Finished fighting with dumb in-game benchmarks and just used CapFrameX on a game I already play: Deep Rock Galactic.
It's Unreal Engine 4, DX12, and has all the fancy features under the sun except ray tracing, because it's got awesome lighting already.

This was run at 4K, all settings on ultra, DLSS on auto.

This was done with shitty RAM timings so I could benchmark all these sets of RAM at 3800 with 1:1:1 ratios:
2x8GB
2x32GB
and 2x32GB + 2x8GB




The results will not shock you, although I couldn't rename the results and had to MS Paint the RAM values over the top.





TL;DR: average FPS is the same (GPU limited by a 3090).
0.2% low FPS shows an 8% improvement with four memory ranks.
1% low FPS shows a 3.6% improvement.


Can't control how it sorts these, annoyingly (Or rename them)




And another stat from the same results:



Will it make it beat a 12900K? No.
Will it narrow that gap? Definitely. And this is at CL18, for W1zzard's sake; CL16 or CL14 would absolutely be faster than my results if the gains scale with faster RAM.




Viewed as percentages, you can see the dips drop 29% off the average FPS, vs 12%.


----------



## inf64 (Oct 20, 2022)

Pretty solid article (new member here, long time poster at AT forums).

One thing caught my eye: W1zzard wrote that the 12900K has *15-20% higher IPC than Zen 3*. Could you tell me what he is basing this on? It's common knowledge (from the AT review, ComputerBase aggregate charts, and SPEC2017 single-thread results) that Golden Cove has between *10 and 11%* higher average IPC in desktop apps than Zen 3. The cumulative performance jump is greater, as Golden Cove can clock much higher (stock and even OCed), but 15-20% higher IPC is simply not factual.

5800X3D vs 12900K would be great to see, along with Ryzen 7000 and Raptor Lake. Zen 4 and Golden/Raptor Cove have basically neck-and-neck integer IPC (according to SPEC2017 single-thread results at the same clock). The only differentiator in games is the lower latency of the Intel parts due to their monolithic design, hence they are slightly faster. I expect Raptor Lake will be at best ~5% faster than the 7950X when paired with a 4090 in 4K gaming, and more or less even with the 12900K(S). Also, I expect Zen 4 X3D parts to make a bigger leap and reign supreme in games with next-gen GPUs, whenever they launch.

I'm looking forward to techpowerup's 13th gen review.


----------



## Why_Me (Oct 20, 2022)

InVasMani said:


> The 13700K is honestly just a bad pitch for Intel, to me; it's either the 13600K with DDR4 or the 13900K with DDR5, no in-between. The 13700K is in a similar rock-and-hard-place spot as Zen 4 is currently, prior to the X3D variants' arrival. Neither it nor the current Zen 4 parts are well positioned, in my view. I'm either dipping my toes in the raptor's mouth or going all in to its belly.
> 
> That's if I go with Intel at all; the 5800X3D is still a relevant consideration, and for Zen 4 X3D I want to wait and see if it's even worth it. I'd even consider a few AM4 SKUs outside the 5800X3D, though that depends heavily on the relative value of certain options. I'll know I'm ready when the deal is good enough that I don't want to pass on it. If I have to debate the purchase decision, I probably don't need it and shouldn't.
> 
> I can understand not wanting to dole out more money on a new cooler if it can be avoided.


The 12700K goes toe to toe with the 12900K in gaming while costing less. An even better deal are the locked Intel 12th-gen CPUs such as the i7-12700 / 12700F. As for an i7 with DDR5... yeah, it works.

*Example A:*

https://www.bhphotovideo.com/c/product/1687328-REG/msi_mag_b660m_mortar_wifi.html
MSI MAG B660M MORTAR WIFI DDR5 $179.99






MAG B660M MORTAR WIFI (www.msi.com):

Powered by Intel 12th Gen Core processors, the MSI MAG B660M MORTAR WIFI is hardened with performance essential specifications to outlast enemies. Tuned for better performance by Core boost, Memory Boost, Premium Thermal Solution, M.2 Shield Frozr,




https://www.amazon.com/dp/B09NPJDPVG
Intel Core i7-12700F $312.99

*Example B:*

https://www.amazon.com/GIGABYTE-Z690-AORUS-AX-Motherboard/dp/B083NPKL1C/
GIGABYTE Z690 AORUS ELITE AX DDR5 $199.99

https://www.gigabyte.com/Motherboard/Z690-AORUS-ELITE-AX-rev-1x#kf

https://www.amazon.com/dp/B09NPJDPVG
Intel Core i7-12700F $312.99

Cheap DDR5 ftw

https://www.amazon.com/dp/B09MTS5YH1
CORSAIR Vengeance DDR5 4800 32GB (2x16GB) CL40 $139.99

No need to break the bank keeping that locked cpu cool.

https://www.amazon.com/DeepCool-AK620-High-Performance-Dual-Tower-Dissipation/dp/B09CSXS3X4
DeepCool AK620 CPU Cooler $64.99


----------



## Mussels (Oct 21, 2022)

Why_Me said:


> The 12700K goes toe to toe with the 12900K in regards to gaming while costing less.  Even a better deal are the locked Intel 12 gen cpu's such as the i7 12700 / 12700F.  As far as an i7 w/DDR5 ... ya it works.
> 
> *Example A:*
> 
> ...


Outside of really high-FPS gaming (240 Hz), almost any modern CPU is good enough.
Very few CPUs give you boosts to the 1% low FPS for smoother gaming, which seems to be the X3D's strength.

If you look at the stuff posted below this, you could simply pick a CPU that has 0.1% lows above your refresh rate and you're in for butter time. 4K 120 Hz smoothly requires a 5600X or above, or... a 10900K and above?
(Although the 12100F is close enough. Any Zen 3 or 12th-gen chip is enough for a smooth 120.)


I'd never used CapFrameX before the memory testing I did here, but results like this please me







Compared to 13900k results where average FPS may be higher, but it seems to replace butter with stutter

0.1% low FPS:
13900k: 119.4
5800x: 133
12900k: 136.6
5800x3d: 149.4

And then the Ryzen 7000 series dominates (in backwards order, thanks to that Windows scheduling bug):
7950x: 158.7
7900x: 171.3
7600x: 184.2
7700x: 189.1



The 5800X3D is ahead of every single Intel CPU in 0.1% lows, and the 7000 series dominates even it, despite those scheduling bugs.






*I get the feeling average FPS isn't enough these days; we're well and truly capable of blowing past the refresh rates of even the highest-refresh displays. We need the 1% and 0.1% values to judge gaming merit.*
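For anyone wondering how those 1% / 0.1% low figures are typically derived: a common approach (used in spirit by capture tools like CapFrameX, though exact methods vary tool to tool) is to average the FPS over the slowest slice of frames. A minimal sketch, with the function name and the sample frame times being illustrative only:

```python
def fps_metrics(frametimes_ms):
    """Average FPS plus '1% low' and '0.1% low' from per-frame times in ms.

    Here the n% low is the average FPS over the slowest n% of frames --
    one common definition; real tools differ in the exact method."""
    def avg_fps(ft):
        return 1000.0 * len(ft) / sum(ft)

    slowest_first = sorted(frametimes_ms, reverse=True)

    def low(pct):
        n = max(1, int(len(slowest_first) * pct / 100))
        return avg_fps(slowest_first[:n])

    return {"avg": avg_fps(frametimes_ms), "1% low": low(1), "0.1% low": low(0.1)}

# Illustrative capture: 990 smooth ~8.3 ms frames (~120 FPS) plus 10 stutters at 25 ms.
m = fps_metrics([8.3] * 990 + [25.0] * 10)
# The average lands around 118 FPS, but both lows collapse to 40 FPS --
# exactly the "butter vs stutter" distinction that average FPS hides.
```

Ten bad frames out of a thousand barely dent the average, yet they define the lows, which is why two CPUs with near-identical average FPS can feel completely different.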

This might be an outlier result, but wow.
Anyone who says Intel wins this one because it got 15 FPS higher... just no. Non. Nein. Nope. Yeah nah. Crikey.




Oh, it's in more than one title; the 12900K gets a lower average and worse 0.1% lows here.


----------



## medi01 (Oct 21, 2022)

clopezi said:


> Now I'm waiting to the 7950X vs 13900K hehe


Also with AMD's top RDNA3 GPU please. 

Something tells me that AMD drivers would be more optimized for AMD CPUs, unlike NVs.


----------



## nguyen (Oct 21, 2022)

Mussels said:


> Outside of really high FPS gaming (240Hz) almost any modern CPU is good enough.
> Very few CPU's give you boosts to the 99% FPS for the smoother gaming, which seems to be the x3D's strength
> 
> If you look at the stuff posted below this, you could simply pick a CPU that has 0.1% lows above your refresh rate and you're in for butter time. 4K120Hz smoothly requires a 5600x or above or... a 10900k and above?
> ...



Well, except when the 4090 comes into play; then Ryzen 7000 and RPL pull ahead of the 5800X3D in all metrics (avg FPS, 1% and 0.1% lows).
In GN's 13600K review, he also explained (at 15:00) why the 13900K has those low 0.1%-low numbers with the 3090 Ti but not with the 4090.







From the LTT review, the 5800X3D gains nothing going from the 3090 Ti to the 4090, meanwhile the Ryzen 7000 parts all see meaningful gains.






With next-gen GPUs coming out, the 5800X3D probably doesn't cut it anymore; it needs more single-thread performance.

It would be interesting to see W1zzard benchmark CPUs with the 4090.


----------



## medi01 (Oct 21, 2022)

nguyen said:


> With next gen GPUs coming out, 5800X3D probably doesn't cut it anymore, it need more single-thread performance.


That likely holds (if at all) only with green GPUs. 

NV even openly promised to address "drivers are slow on AMD CPUs", but I doubt that has happened so far.



nguyen said:


> From LTT review, 5800X3D gain nothing going from 3090Ti to 4090, meanwhile Ryzen7000 all have meaningful gains


I fail to see how that sort of benchmark (800+ frames per second, are you kidding me?) is even remotely relevant.


----------



## killerbyte (Oct 22, 2022)

I think this is a messed-up review, because Ryzen 9 vs Intel i9 means 5900X vs 12900K, yet a lower R7 5800X was chosen to bash on AMD?


----------



## Mussels (Oct 22, 2022)

nguyen said:


> Well except when 4090 comes into play, then Ryzen7000 and RPL pull ahead of 5800X3D in all metrics (avg FPS, 1% and 0.1%low)
> GN 13600K review, he also explained (at 15:00) why 13900K has those low lowFPS with 3090Ti but not with 4090
> View attachment 266470
> View attachment 266471
> ...


Yeah, the 5800X3D definitely has a limit these new CPUs can pass for max/average FPS.

I'm just focused on the 0.1% lows, as that's what my systems are designed to avoid dips in, and when I see an obvious difference in reviews for the same metric I want to know why, because sometimes that's how we learn which situations to avoid (like 2x8GB being so harmful to Zen 2 and Zen 3, yet still SO DAMN COMMON in reviews).


----------



## medi01 (Oct 25, 2022)

Rowsol said:


> I hope you'll start including 1% lows as I feel it's more important than average fps.



I second that.

Twice.


----------



## W1zzard (Oct 25, 2022)

killerbyte said:


> I think this is a messed-up review, because Ryzen 9 vs Intel i9 means 5900X vs 12900K, yet a lower R7 5800X was chosen to bash on AMD?


The goal of this review is NOT to test "AMD vs Intel at similar config", but "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring.


----------



## wheresmycar (Oct 25, 2022)

medi01 said:


> I second that.
> 
> Twice.


 
I thrice that!

I tend to turn to other benchmarks online to view 1% lows, and it would be nice if we could get everything on TPU. Also a 25-game average, pls hehe (at this rate we're gonna have to start paying W1z a fat salary and overtime for the sleepless nights)


----------



## Mussels (Oct 26, 2022)

W1zzard said:


> The goal of this review is NOT to test "AMD vs Intel at similar config", but "The current GPU Test System that I have right now, a decent but slightly aged config, vs 12900K" to find out how much of a difference an upgrade can bring.


I swear, you either need a title edit or to add that to the first page and conclusion in bold lettering


----------



## seventy (Oct 31, 2022)

HenrySomeone said:


> Oh, you sweet summer child...


Try testing it yourself and see if you keep laughing, instead of making a fool of yourself.


----------



## Hofnaerrchen (Nov 3, 2022)

What I take from this article: if you are on AM4, don't waste money upgrading your GPU to RTX 4000 or RX 7000, especially if you are already on a decent RDNA2 or RTX 3000 GPU.

Personally, I don't like Alder/Raptor Lake because the platform is already end-of-life; there won't be any upgrade path in the future, and AM5 is still just too expensive. I would personally favor B650E, but currently decent boards start at about 400 bucks. B650 is not really a future-proof platform (PCIe 4.0 x16) in my opinion. Apart from that, AMD as well as NVIDIA may have moved themselves into a position where upgrading to their newest generation(s) is not worth considering, depending on where you come from. AMD especially might see some hard times ahead when it comes to completely new system builds; an Intel CPU plus a GPU from either AMD or NVIDIA might, based on usage requirements, just be the better solution if you do not care about upgradability but do care about pricing.


----------



## Radxge (Nov 26, 2022)

This is great info! I have a 9700K + RTX 4090 combination and am curious about the impact of upgrading my CPU.

It would be useful to see the FPS under the slower processor in addition to the % impact... I game at 4K 120 Hz and do not care much if a fancier CPU increases the speed beyond 120 FPS. Also, I'm not sure if the tested games are at their highest settings?


----------



## THU31 (Nov 27, 2022)

The 9700K is definitely a big bottleneck for the 4090 at 120 Hz. I still use that CPU because I play at 4K60.

Most modern games will not hit a consistent 120 FPS on this CPU. And Spider-Man barely stays above 60 with ray tracing (which is also heavy on the CPU). Just Google some recent CPU comparisons.

This is basically Skylake with 8 cores; the IPC is rather ancient.


----------



## Radxge (Nov 29, 2022)

Sorry, but *big* bottleneck seems like an exaggeration in most circumstances (Spider-Man may be an exception)... I have seen a huge impact from replacing a 3080 Ti with a 4090 in all games I've tried so far (Dying Light 2, Metro Exodus, Bright Memory Infinite, AC Valhalla and Control). I can now run most of these at 4K 120 FPS at max settings (or close to it), which was not the case with my older 3080 Ti.

I agree that my rig would certainly benefit from an upgrade, and I was tempted by the 13600K or 13700K, but I think I am going to hold off until Zen 4 3D is released next year. I don't like the thermals of the new Intel chips, and I primarily use my PC for gaming, so I do not care much about the productivity angle.


----------



## THU31 (Nov 29, 2022)

Have you seen this? 








Look at the 10600K, which is the closest equivalent (only 6 cores, but with HT, and the same amount of L3 cache).

The numbers are for 1080p, but that does not matter. They show that it is very difficult to hit 120 FPS on this CPU. Of course, I am only looking at 1% lows, which is the most relevant metric. The newest CPUs offer a minimum of 50% more FPS, sometimes even double.

I am not saying your setup is unplayable, because it obviously is playable, at least with a VRR display. With Vsync, I would personally need a locked 120 FPS to consider it playable, or I'd use a lower refresh rate.
But the fact is, the 9700K is a bottleneck for the 4090. Still, like you mentioned, I would definitely wait for Zen 4 X3D. That might be the ultimate gaming platform for several years (Intel's future CPUs like Meteor Lake are not looking great for gamers).


----------

