
AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

Joined
Jun 26, 2022
Messages
237 (0.26/day)
Processor 7950X, PBO CO -15
Motherboard Gigabyte X670 AORUS Elite AX (rev. 1.0)
Cooling EVGA CLC 360 w/Arctic P12 PWM PST A-RGB fans
Memory 64GB G.Skill Trident Z5 RGB F5-6000J3040G32GA2-TZ5RK
Video Card(s) ASUS TUF Gaming GeForce RTX 3070
Storage 970 EVO Plus 2TB x2, 970 EVO 1TB; SATA: 850 EVO 500GB (HDD cache), HDDs: 6TB Seagate, 1TB Samsung
Display(s) ASUS 32" 165Hz IPS (VG32AQL1A), ASUS 27" 144Hz TN (MG278Q)
Case Corsair 4000D Airflow
Audio Device(s) Razer BlackShark V2 Pro
Power Supply Corsair RM1000x
Mouse Logitech M720
Keyboard G.Skill KM780R MX
Software Win10 Pro, PrimoCache, VMware Workstation Pro 16
Joined
Jan 20, 2019
Messages
1,589 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
4 upgrades? It has been a while for me! :laugh:

Yes, the 2700K is my main PC; I don't have anything higher spec than this - see specs for full info. Note that I've upgraded just about everything other than the CPU, supporting components and case since I built it in 2011.

Well, on the desktop, it feels as snappy as ever. Seriously, no slowdown at all since I first built it, so evidently Microsoft hasn't made Windows any slower. Fantastic. I don't run any intensive apps that would really show up the lack of performance compared to a modern system.

Now, while I do have hundreds of games, I haven't played that many of them (Steam is very efficient at separating me from my money with special offers lol) or that often.

I ran Cyberpunk 2077 and got something like 15-25 fps even when dropping the screen res and details right down, so it's no good for that. In hindsight, I should have gotten my money back, nvm.

CoD: Modern Warfare (the newer one) runs quite well at 60-110 fps or so. It jumps around a lot, but with my newish G-SYNC monitor that hardly matters and it plays fine. Even before I had the G-SYNC monitor, the experience was still good, but not great, especially if I set the screen refresh rate to 144Hz with vsync off. It felt very responsive like that. Note that my 2080 Super easily plays this game at 4K. I don't have all the details maxed out, though, regardless of resolution. I don't like motion blur, and ambient occlusion doesn't make that much visual difference, so I turn them both off, for example; both really reduce performance.

CoD: Modern Warfare II / Warzone 2.0 runs with rather less performance and can drop into stuttery 40 fps territory, which is below what the G-SYNC compatibility will handle, but it's otherwise not too bad. It also tends to hitch a bit, but my console friends reported that too, so it's a game engine problem, not my CPU.

I've got CoD games going back generations and they all work fine. Only the latest one struggles to any degree.

I've run various other games which worked alright too; I can't remember the details now. It's always possible to pick that one game that has a really big system load, or is badly optimised and runs poorly, but that can happen even with a modern CPU.

I have a feeling that this old rig, with its current spec, can actually game better than a modern gaming rig with a low-end CPU. I haven't done tests, of course, but it wouldn't surprise me.

Agreed, I don't like the greater expense of AMD either, so the devil will be in the details. I want to see what the real-world performance uplift will be compared to the 13700K I have my eye on before I consider my upgrade. Thing is, every time I think I'm finally gonna pull the trigger, the goal posts move! The real deadline here, of course, is that Windows 10 patch support is gonna end in 2025, so it's gonna happen for sure by then.

And finally, out of interest, here's the thread I started when I upgraded to my trusty 2700K all those years ago. It's proven to be a superb investment to last this long and still be going strong.


Oops, I meant 3 upgrades after the 2700K, with the 4th pending.

If it's getting the job done, I think it's fantastic you've got the 2700K sweet sailing for this long. Looks like you've had a blast at 4K, and it makes sense with most of the weight probably thrown over at the GPU end.

With my 2700K, I kept hitting a brick wall with Battlefield. Although it was 100% playable, I fell short on visual smoothness. It's a difficult one to explain, with FPS and frametimes being decent, but I could still sense some lumpy roughness, some jiggery-jaggery-boo in fast-paced scenes or dense environments. The first assumption was the GPU, which I upgraded, and I could still feel some irregularity. Eventually I reinstalled Windows for one last attempt, then gave up... and grabbed a 4790K. With each Battlefield release the lack-of-smooth offender returned, and each time a jump up a couple of gens resolved the problem, which eventually landed me on a 9700K. Not gonna lie, it wasn't just observable performance punching the upgrade ignition button; I sadly suffer from the upgrade itch too. Now the current BF is starved for threads and the 9700K (which does a decent job) will sadly be put to rest. I'm a buff for screen-time silkiness, and something like a 7800X3D/5800X3D sounds like a sound plan for a 3-year excursion (or 2, you know, the upgrade itch hehe).

Thing is, every time, I think I'm finally gonna pull the trigger, the goal posts move!

The goal posts stayed put for me when considering CPU upgrades... but moved a couple of miles far and beyond when considering GPU upgrades. The 40-series (or RDNA3) was the last stop, the unyielding affirmative buy... and then nV dropped those ridiculous MSRPs and crushed the hope and glory and left me traumatised lol (OK, a bit dramatic; simple as: no thanks, I ain't gonna withdraw from me wallet to fill the corps' already fattened-up pockets).
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
How do 720p results dispute his claims about 4K gaming focused on 1% and 0.1% lows?
If he's getting worse 1% and 0.1% lows at 720p, is he going to get better ones at 4K?
 
Joined
Jun 26, 2022
Messages
237 (0.26/day)
Processor 7950X, PBO CO -15
Motherboard Gigabyte X670 AORUS Elite AX (rev. 1.0)
Cooling EVGA CLC 360 w/Arctic P12 PWM PST A-RGB fans
Memory 64GB G.Skill Trident Z5 RGB F5-6000J3040G32GA2-TZ5RK
Video Card(s) ASUS TUF Gaming GeForce RTX 3070
Storage 970 EVO Plus 2TB x2, 970 EVO 1TB; SATA: 850 EVO 500GB (HDD cache), HDDs: 6TB Seagate, 1TB Samsung
Display(s) ASUS 32" 165Hz IPS (VG32AQL1A), ASUS 27" 144Hz TN (MG278Q)
Case Corsair 4000D Airflow
Audio Device(s) Razer BlackShark V2 Pro
Power Supply Corsair RM1000x
Mouse Logitech M720
Keyboard G.Skill KM780R MX
Software Win10 Pro, PrimoCache, VMware Workstation Pro 16
If he's getting worse 1% and 0.1% lows at 720p, is he going to get better ones at 4K?
I'm not seeing where either one of you posted 0.1% or 1% low gaming results for 12900KS & 5800X3D, let alone @ 4K. And yes, you will get different margins @ 4K than lower resolutions.

Below are some average FPS differences between the 13900K and 5800X3D showing that @ 4K the 13900K averages 1.3% faster, but 6.2% faster @ 1080p. These results don't resolve whose claim is right regarding lows, nor give an actual comparison to the 12900KS. They do, however, show how posting 720p results isn't useful for arguing which CPU is going to be faster @ 4K, as the lower-clocked, higher-cache CPU falls further behind as the resolution is reduced below 4K.
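The shrinking margin at 4K is the classic GPU bottleneck at work, which can be sketched as a toy min() model. To be clear, this is just an illustration; the fps figures below are made up, not taken from any review:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    # A frame is only done when both the CPU and GPU work for it is done,
    # so throughput is roughly capped by the slower side.
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: CPU A can prepare 200 fps, CPU B 188 fps,
# i.e. ~6% apart when nothing else limits them.
for gpu_limit, label in ((90, "4K-ish GPU limit"), (250, "1080p-ish GPU limit")):
    a = effective_fps(200, gpu_limit)
    b = effective_fps(188, gpu_limit)
    gap = (a - b) / b * 100
    print(f"{label}: {a} vs {b} fps, gap {gap:.1f}%")
```

At the low GPU limit both CPUs land on the same number and the gap collapses to zero; raise the GPU limit and the full CPU-side gap reappears, which is why low-resolution tests exaggerate margins you'll never see at 4K.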

Maybe you or @Crylune would like to actually provide 1% and 0.1% low @ 4K results for the 5800X3D and 12900KS (and I suppose the 12900K, since you claimed the 12900K was also faster) so the thread is more informative?

2160P:
1673497007738.png


1080P:
1673498830806.png
 
Last edited:

Hxx

Joined
Dec 5, 2013
Messages
303 (0.08/day)
why does everything have to be for gaming
View attachment 277623

MEH.
Because this is nothing more than a dick measuring contest between AMD and Intel. Each wants to be able to market that they make the fastest, undisputed, most powerful gaming CPU powerhouse etc. etc., no matter the cost, efficiency or anything else, when the reality is any midrange chip from the previous gen is more than enough to drive high refresh rate gaming. In a way this is good for consumers, but I'd rather they focus on things that actually matter for gaming, like, you know... the damn GPU. That market is not exactly good right now. AMD should focus on better driver support, better pricing, fixing their current lineup, and more powerful GPUs so they claw back market share from Nvidia. Instead they focus on single-digit percentage improvements that can mostly be seen in 720p gaming... because, you know, that makes so much more sense lmao.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,417 (4.69/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
If the 7800X3D is as good as this article suggests, then Intel have a big problem.

I'm gonna seriously consider this for my 2700K upgrade once the reviews are out. Will be really nice to dodge the e-core bullet, if nothing else.

2700k to a 7800x3d would be a baller as fuck upgrade lol, do it!!! :rockout::rockout::rockout::rockout:
 
Joined
Sep 17, 2014
Messages
22,645 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Very reliable result, the 12700k has better lows than the 13900k, lol.
Why not? This is the core of the issue: maximum or average FPS says absolutely nothing about frametimes, especially at the 0.1% and 1% lows.

It's very common to see a combination of lower maximums and better control over outliers. It relates to bursty frequency behaviour as well: if the CPU can boost high, it creates a larger gap between boost and base clock, so your peak FPS might be higher, but your worst numbers are also worse. Why do you think Intel is progressively lowering base clocks gen to gen to attain higher boost? It's not to help minimums, but to shine in maximums. In GPUs, with pre-rendered frames and frame smoothing, you create some of the same effects: maximum FPS is sacrificed to use the available time to start on the next frame earlier.

X3D isn't about peak frequency, it's about peak consistency, and it shows everywhere. These CPUs are most useful for gaming because they elevate performance in precisely those gaming situations where you dip the hardest, because you're missing the required information at the correct time. That's where the cache shows its value best and that's where it differs from every other CPU.

Intel can keep up in a large number of games because they're well managed in CPU load; this applies to most triple-A content and most console content, but it absolutely does NOT apply to simulations that expand as you go into the end-game (almost every generated frame is one where lots of info must be collected to present the correct next step in the simulation, the amount increasing the further your army/village/galactic empire expands).

Who cares if you can run a shooter at 250 or 300 FPS, basically, is the gist of this. What matters is whether you can keep your minimums in check. Only the X3Ds offer a technology that does that regardless of the frequency the CPU runs at.

And this, in a nutshell, is why most CPU reviews don't manage to properly emphasize or cover the impact of CPU performance in gaming. Measuring lows is the way, and in fact should be the defining factor in your CPU choice, NOT max/avg FPS. The things that damage the experience most are the dips, not the peaks.
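For anyone wondering what a "1% low" actually measures, here's a rough sketch. Assumption: the common "average FPS over the slowest N% of frames" definition; exact definitions vary between reviewers, and the frametime log below is synthetic:

```python
def low_fps(frametimes_ms, fraction):
    # Average FPS over the slowest `fraction` of frames in a frametime log.
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))        # e.g. the worst 1% of samples
    return 1000.0 / (sum(worst[:n]) / n)          # ms/frame -> FPS

# Synthetic log: ~143 fps most of the time, with 10 spikes to 25 ms (40 fps).
frametimes = [7.0] * 990 + [25.0] * 10
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(round(avg_fps), round(low_fps(frametimes, 0.01)))  # -> 139 40
```

The point of the metric in one line: the average barely notices ten nasty spikes, while the 1% low drops straight to the stutter, which is exactly what you feel in-game.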

Because this is nothing more than a dick measuring contest between AMD and Intel. Each wants to be able to market that they make the fastest, undisputed, most powerful gaming CPU powerhouse etc. etc., no matter the cost, efficiency or anything else, when the reality is any midrange chip from the previous gen is more than enough to drive high refresh rate gaming. In a way this is good for consumers, but I'd rather they focus on things that actually matter for gaming, like, you know... the damn GPU. That market is not exactly good right now. AMD should focus on better driver support, better pricing, fixing their current lineup, and more powerful GPUs so they claw back market share from Nvidia. Instead they focus on single-digit percentage improvements that can mostly be seen in 720p gaming... because, you know, that makes so much more sense lmao.
Minor difference: the X3D is real innovation; Intel's next KS is not.

And as pointed out above, there are tons of in-game situations where you play not a canned benchmark run but a real game, where the real CPU load is many times higher than you see in reviews. Stuff like Stellaris or Cities: Skylines wants every % of CPU performance it can get.
 
Last edited:
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Why not? This is the core of the issue: maximum or average FPS says absolutely nothing about frametimes, especially at the 0.1% and 1% lows.

It's very common to see a combination of lower maximums and better control over outliers. It relates to bursty frequency behaviour as well: if the CPU can boost high, it creates a larger gap between boost and base clock, so your peak FPS might be higher, but your worst numbers are also worse. Why do you think Intel is progressively lowering base clocks gen to gen to attain higher boost? It's not to help minimums, but to shine in maximums. In GPUs, with pre-rendered frames and frame smoothing, you create some of the same effects: maximum FPS is sacrificed to use the available time to start on the next frame earlier.

X3D isn't about peak frequency, it's about peak consistency, and it shows everywhere. These CPUs are most useful for gaming because they elevate performance in precisely those gaming situations where you dip the hardest, because you're missing the required information at the correct time. That's where the cache shows its value best and that's where it differs from every other CPU.

Intel can keep up in a large number of games because they're well managed in CPU load; this applies to most triple-A content and most console content, but it absolutely does NOT apply to simulations that expand as you go into the end-game (almost every generated frame is one where lots of info must be collected to present the correct next step in the simulation, the amount increasing the further your army/village/galactic empire expands).

Who cares if you can run a shooter at 250 or 300 FPS, basically, is the gist of this. What matters is whether you can keep your minimums in check. Only the X3Ds offer a technology that does that regardless of the frequency the CPU runs at.

And this, in a nutshell, is why most CPU reviews don't manage to properly emphasize or cover the impact of CPU performance in gaming. Measuring lows is the way, and in fact should be the defining factor in your CPU choice, NOT max/avg FPS. The things that damage the experience most are the dips, not the peaks.


Minor difference, the X3D is real innovation, Intel's next KS is not.

And as pointed out above, there are tons of in-game situations where you play not a canned benchmark run, but a real game where the real CPU load is many times higher than you see in reviews. Stuff like Stellaris or Cities Skylines wants every % of performance on the CPU it can get.
There is no way in hell the 12700K gets better minimums than the 13900K at anything. Ever. GNexus always has pretty weird numbers when it comes to lows and minimums; you can check his older reviews as well and you'll see a trend.

Also, Intel CPUs never drop from their boost clocks during gaming. The 12900K runs at 4.9 GHz 100% of the time, the 13900K runs at 5.5 GHz 100% of the time, etc.
 
Last edited:
Joined
Sep 26, 2022
Messages
2,140 (2.63/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
There is no way in hell the 12700K gets better minimums than the 13900K at anything. Ever. GNexus always has pretty weird numbers when it comes to lows and minimums; you can check his older reviews as well and you'll see a trend.
Would you mind checking the base clock for both processors' P and E-cores?
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Would you mind checking the base clock for both processors' P and E-cores?
The base clocks refer to the 125 W long-duration power limit, and you'll only drop to those clocks during something like y-cruncher or Prime. Even with the 125 W power limit in place (which the GNexus review didn't have), they'll still run at max clocks during gaming because they don't exceed those limits. Only the 13900K does, in Cyberpunk, where it can hit 140 to 150 W sometimes, but that's about it; in most other games it sits below 100 W.
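The mechanism being described can be sketched as a toy model. To be clear, the scaling below is a made-up linear stand-in and the clock/wattage numbers are illustrative, not real 12900K/13900K behaviour; the only point is that clocks stay at boost while the load sits under the long-duration limit:

```python
def sustained_clock(load_watts, pl1_watts=125.0, boost_ghz=5.5, base_ghz=2.2):
    # Toy model of a long-duration power limit (PL1): under the limit the
    # CPU holds boost clocks; over it, clocks scale back toward base.
    if load_watts <= pl1_watts:
        return boost_ghz                               # e.g. typical gaming load
    return max(base_ghz, boost_ghz * pl1_watts / load_watts)

print(sustained_clock(95))    # gaming-style load under PL1: stays at boost
print(sustained_clock(300))   # all-core stress (y-cruncher/Prime): throttles
```

So in this model a 95 W gaming load never touches the limiter, and only a sustained all-core load pulls clocks down, which is the poster's argument for why base clock is irrelevant to gaming results.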
 
Joined
Jun 26, 2022
Messages
237 (0.26/day)
Processor 7950X, PBO CO -15
Motherboard Gigabyte X670 AORUS Elite AX (rev. 1.0)
Cooling EVGA CLC 360 w/Arctic P12 PWM PST A-RGB fans
Memory 64GB G.Skill Trident Z5 RGB F5-6000J3040G32GA2-TZ5RK
Video Card(s) ASUS TUF Gaming GeForce RTX 3070
Storage 970 EVO Plus 2TB x2, 970 EVO 1TB; SATA: 850 EVO 500GB (HDD cache), HDDs: 6TB Seagate, 1TB Samsung
Display(s) ASUS 32" 165Hz IPS (VG32AQL1A), ASUS 27" 144Hz TN (MG278Q)
Case Corsair 4000D Airflow
Audio Device(s) Razer BlackShark V2 Pro
Power Supply Corsair RM1000x
Mouse Logitech M720
Keyboard G.Skill KM780R MX
Software Win10 Pro, PrimoCache, VMware Workstation Pro 16
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
All I'm gonna say is, I find it hilarious that you are on the TPU forums and, instead of quoting their benchmarks, you are using someone else's just because they happen to agree with the opinion you have. Hilarious stuff.

To the point now: I have a 12900K and a 13900K with 7600C34 RAM. If anyone wants to test their 3D and see how much better it is compared to Intel's offerings, just come forward.
 
Joined
Jun 26, 2022
Messages
237 (0.26/day)
Processor 7950X, PBO CO -15
Motherboard Gigabyte X670 AORUS Elite AX (rev. 1.0)
Cooling EVGA CLC 360 w/Arctic P12 PWM PST A-RGB fans
Memory 64GB G.Skill Trident Z5 RGB F5-6000J3040G32GA2-TZ5RK
Video Card(s) ASUS TUF Gaming GeForce RTX 3070
Storage 970 EVO Plus 2TB x2, 970 EVO 1TB; SATA: 850 EVO 500GB (HDD cache), HDDs: 6TB Seagate, 1TB Samsung
Display(s) ASUS 32" 165Hz IPS (VG32AQL1A), ASUS 27" 144Hz TN (MG278Q)
Case Corsair 4000D Airflow
Audio Device(s) Razer BlackShark V2 Pro
Power Supply Corsair RM1000x
Mouse Logitech M720
Keyboard G.Skill KM780R MX
Software Win10 Pro, PrimoCache, VMware Workstation Pro 16
All I'm gonna say is, I find it hilarious that you are on the TPU forums and, instead of quoting their benchmarks,
Are you claiming @Crylune posted a benchmark?

you are using someone else's just because they happen to agree with the opinion you have. Hilarious stuff.
I hope you find that benchmark @Crylune posted and aren't making another false claim!

I didn't have any opinion on this. I tried to fact-check both claims that were made; your claim was the easiest to prove or disprove, while his claim is harder, given that 1% and 0.1% low data @ 4K is generally compiled for GPU reviews, not CPU reviews, and the KS is not reviewed as much.

Given the difficulty in finding 0.1/1% lows @ 4K for a 12900KS vs 5800X3D, I won't be spending more time on this. It was interesting at first to see if one would be head and shoulders better than the other, but it appears the X3D only has a slight lead, and judging by the K, I fully expect the KS to be so close in performance to the X3D that it's not worth looking into further to find the answer.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Are you claiming @Crylune posted a benchmark?


I hope you find that benchmark @Crylune posted and aren't making another false claim!

I didn't have any opinion on this. I tried to fact-check both claims that were made; your claim was the easiest to prove or disprove, while his claim is harder, given that 1% and 0.1% low data @ 4K is generally compiled for GPU reviews, not CPU reviews, and the KS is not reviewed as much.

Given the difficulty in finding 0.1/1% lows @ 4K for a 12900KS vs 5800X3D, I won't be spending more time on this. It was interesting at first to see if one would be head and shoulders better than the other, but it appears the X3D only has a slight lead, and judging by the K, I fully expect the KS to be so close in performance to the X3D that it's not worth looking into further to find the answer.
No, I'm saying TPU has a benchmark. Sure, they don't record 0.1 and 1% lows, but looking at the 13900K review, the X3D is so far behind the 12900K and the KS in averages that I doubt the lows are better.
 
Joined
Jun 26, 2022
Messages
237 (0.26/day)
Processor 7950X, PBO CO -15
Motherboard Gigabyte X670 AORUS Elite AX (rev. 1.0)
Cooling EVGA CLC 360 w/Arctic P12 PWM PST A-RGB fans
Memory 64GB G.Skill Trident Z5 RGB F5-6000J3040G32GA2-TZ5RK
Video Card(s) ASUS TUF Gaming GeForce RTX 3070
Storage 970 EVO Plus 2TB x2, 970 EVO 1TB; SATA: 850 EVO 500GB (HDD cache), HDDs: 6TB Seagate, 1TB Samsung
Display(s) ASUS 32" 165Hz IPS (VG32AQL1A), ASUS 27" 144Hz TN (MG278Q)
Case Corsair 4000D Airflow
Audio Device(s) Razer BlackShark V2 Pro
Power Supply Corsair RM1000x
Mouse Logitech M720
Keyboard G.Skill KM780R MX
Software Win10 Pro, PrimoCache, VMware Workstation Pro 16
No, I'm saying TPU has a benchmark. Sure, they don't record 0.1 and 1% lows, but looking at the 13900K review, the X3D is so far behind the 12900K and the KS in averages that I doubt the lows are better.
It was only 1.3% behind the 13900K @ 4K in averages on a TPU benchmark; I even talked about it in post #104. Based on that, it should have been hard to believe it was far behind a 12900K @ 4K on a TPU benchmark.

Here's the 12900K/S & 5800X3D @ 4K on a TPU benchmark; they are VERY close in averages. It doesn't surprise me at all that the X3D, with its extra cache, could beat the KS in 0.1/1% lows, as it only trails the KS by 0.9% in average FPS:
1673567231281.png
 
Joined
Jan 20, 2019
Messages
1,589 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Why are you guys comparing the 5800X3D to a 12900K? Wouldn't the correct comparison be the 12700K vs the 5800X3D? For gaming I wouldn't touch anything above this range.

You guys got me interested in looking into 1% lows between the two discussed models... a 20-game average:

Screenshot (98).png

[source: eTeknix]

In short, these 1% low averages put both the 5800X3D and 12900K on an equal war path... practically the same. It's a given both trade blows depending on the titles played/resolutions applied, leaving each inquirer to come to their own conclusion based on their setup and targeted games.

Again, if I were going 12th Gen Intel (for gaming) I wouldn't touch the 12900K... just silly beans, unless non-gaming core-hungry workloads suggest otherwise. The 12700/12700K is what makes sense, or the 5800X/5800X3D. Oddly enough, I've seen Zen 3 X3D even trading blows with 13th Gen in a small number of titles (probably compared to a 13600K at a given resolution; might need to revisit the stats), but overall 13th Gen easily came out ahead.

Anyone on either 12th Gen or an AM4 5000-series chip should be over the moon with this sort of cutting-edge processing power, and yet the WWW is full of people beating the third leg against the non-conformist militant wall of futility.
 
Joined
Sep 17, 2014
Messages
22,645 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
There is no way in hell the 12700K gets better minimums than the 13900K at anything. Ever. GNexus always has pretty weird numbers when it comes to lows and minimums; you can check his older reviews as well and you'll see a trend.

Also, Intel CPUs never drop from their boost clocks during gaming. The 12900K runs at 4.9 GHz 100% of the time, the 13900K runs at 5.5 GHz 100% of the time, etc.
Very cool story but it relates in NO WAY to what you're replying to.

I specifically talked about the relative impact of frequency. Frequency is what the CPU core runs at; it's not indicative of how fast the CPU can fetch data. You can believe whatever you want to believe, but there are countless examples where the X3D shines and no Intel CPU can reach it, and they're specifically the highest CPU load cases in gaming. In other words: where it matters most.

And even in your own weird take on how the Intel CPUs work, you can't deny there are already games (like Cyberpunk... as if that's an outlier and not the writing on the wall) that pull these CPUs to base clock because they're exceeding turbo limits. In fact, your ideas don't match reality in any way, shape or form, except perhaps from your own N=1 perspective, but then all I can say is you ain't gaming a lot, or you're playing the games where the impact just isn't there. I've already pointed out as well that specific types of games excel on X3Ds. Comparing the bog-standard bench suite, even if it's big, isn't really doing that justice. Not a single reviewer plays a Stellaris endgame or a TW: Warhammer 3 campaign at turn 200.
 
Last edited:
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Very cool story but it relates in NO WAY to what you're replying to.

I specifically talked about the relative impact of frequency. Frequency is what the CPU core runs at; it's not indicative of how fast the CPU can fetch data. You can believe whatever you want to believe, but there are countless examples where the X3D shines and no Intel CPU can reach it, and they're specifically the highest CPU load cases in gaming. In other words: where it matters most.

And even in your own weird take on how the Intel CPUs work, you can't deny there are already games (like Cyberpunk... as if that's an outlier and not the writing on the wall) that pull these CPUs to base clock because they're exceeding turbo limits. In fact, your ideas don't match reality in any way, shape or form, except perhaps from your own N=1 perspective, but then all I can say is you ain't gaming a lot, or you're playing the games where the impact just isn't there. I've already pointed out as well that specific types of games excel on X3Ds. Comparing the bog-standard bench suite, even if it's big, isn't really doing that justice. Not a single reviewer plays a Stellaris endgame or a TW: Warhammer 3 campaign at turn 200.
But Cyberpunk does not pull the 13900K to base clocks. Every review runs them power-unlimited, therefore GNexus's numbers were with the CPU running at 5.5 GHz all-core all day long, so that's not the explanation for his 0.1% numbers. But regardless, the same applies to every CPU: the 7950X draws 140 watts in that game, and if you limit that CPU to the same 125 W it's going to throttle as well.

I agree with you that there are games where the 3D is king. But the same applies to the 12900K (and I'm not even mentioning the 13900K, which is much faster). Spider-Man, Spider-Man: Miles Morales, Cyberpunk, etc. - the 12900K just poops on the 3D by a big, juicy margin. Especially if you run in-game and not the built-in benchmark, the differences are staggering. I'm talking about close to 50% differences.

I'm absolutely ready to back up my statements with videos. I have the 12900K and the 13900K running with 7600C34; if anyone has the 3D and wants to test the above games, let's do it.
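As a rough sanity check on the throttling point: package power scales roughly with f·V², and voltage rises roughly with frequency near the top of the V/f curve, so power grows close to cubically with clock. A back-of-envelope sketch (all numbers hypothetical, not measured on any real chip):

```python
# Back-of-envelope only: P ~ f * V^2 and V ~ f near the top of the
# V/f curve, so P ~ f^3. All numbers below are hypothetical.
def sustained_clock_ghz(unlimited_ghz, unlimited_watts, power_cap_watts):
    """Estimate the all-core clock a CPU settles at under a power cap."""
    return unlimited_ghz * (power_cap_watts / unlimited_watts) ** (1.0 / 3.0)

# A chip that needs ~250 W to hold 5.5 GHz all-core, capped at 125 W:
estimate = sustained_clock_ghz(5.5, 250.0, 125.0)  # roughly 4.4 GHz
```

Halving the power budget only costs about 20% of the clock under this model, which is why power caps hurt multi-threaded throughput far less than the wattage numbers suggest.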
 
Joined
Sep 17, 2014
Messages
22,645 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
But Cyberpunk does not pull the 13900K to base clocks. Every review runs them power-unlimited, so the Gamers Nexus numbers were taken with the CPU running at 5.5 GHz all-core all day long; that's not the explanation for his 0.1% numbers. But regardless, the same applies to every CPU: the 7950X draws 140 W in that game, and if you limit it to the same 125 W, it's going to throttle as well.

I agree with you that there are games where the 3D is king. But the same applies to the 12900K (and I'm not even mentioning the 13900K, which is much faster). Spider-Man, Spider-Man: Miles Morales, Cyberpunk, etc. - the 12900K just poops on the 3D by a big, juicy margin. Especially if you run in-game and not the built-in benchmark, the differences are staggering. I'm talking about close to 50% differences.

I'm absolutely ready to back up my statements with videos. I have the 12900K and the 13900K running with 7600C34; if anyone has the 3D and wants to test the above games, let's do it.
Spider-Man & Cyberpunk at absurd settings... Who cares about FPS in the shittiest-optimized first/third-person games of the moment? Some examples there... I'm sure TW3 also gets a fantastic number somewhere in its newest RT-on rendition, which is generally considered grossly inefficient on pretty much everything, with not much to show for it. This isn't new; every gen has a few of those poster children. I fondly remember how Nvidia pushed HairWorks in vanilla TW3.

This is just parroting the cherry-picked marketing examples that were given exactly that treatment to create a buy incentive for high-end hardware. I consider them about as relevant as Minesweeper performance, honestly.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Spider-Man & Cyberpunk at absurd settings... Who cares about FPS in the shittiest-optimized first/third-person games of the moment? Some examples there... I'm sure TW3 also gets a fantastic number somewhere in its newest RT-on rendition, which is generally considered grossly inefficient on pretty much everything, with not much to show for it. This isn't new; every gen has a few of those poster children. I fondly remember how Nvidia pushed HairWorks in vanilla TW3.

This is just parroting the cherry-picked marketing examples that were given exactly that treatment to create a buy incentive for high-end hardware. I consider them about as relevant as Minesweeper performance, honestly.
I disagree with 'shittiest-optimized', but sure, let's say they are. So? Why does that make them run way faster on Intel? There must be something the Intel CPUs do better to make those games run that much faster, right?
 
Joined
Sep 17, 2014
Messages
22,645 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I disagree with 'shittiest-optimized', but sure, let's say they are. So? Why does that make them run way faster on Intel? There must be something the Intel CPUs do better to make those games run that much faster, right?
Sure, and I think that something isn't quite so relevant in the games where your FPS really tanks to unplayable levels because of CPU load. That's where the X3D starts to shine - of course not everywhere, but then nothing does.

From what I've gathered, in those games a big part of the additional CPU load is actually caused by graphics options: DLSS 3, frame generation, etc. I suppose that's where Intel can put its core count to work. It's an interesting development nonetheless - both the big.LITTLE approach and the cache-heavy CPU - in how they accelerate gaming. There is definitely untapped potential in CPUs left to put to use.

Im absolutely ready to backup my statements with videos, i have the 12900k and the 13900k running with 7600c34, if anyone has the 3d and wants to test the above games, lets do it.
I would definitely be interested in this!
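The cache-versus-clock trade-off being argued here can be sketched with a toy latency model: a core that stalls less on memory can get more done per second than a higher-clocked core that misses cache more often. Every number below is made up purely for illustration, not measured on any real CPU:

```python
# Toy per-operation latency model; all numbers are hypothetical and
# only illustrate why cache hit rate can outweigh raw clock speed.
def ns_per_op(clock_ghz, cache_miss_rate, mem_latency_ns=80.0):
    # one cycle of useful work plus the expected cost of a memory stall
    return 1.0 / clock_ghz + cache_miss_rate * mem_latency_ns

high_clock = ns_per_op(5.8, 0.02)    # faster core, misses cache more often
big_cache  = ns_per_op(5.0, 0.005)   # slower core, big cache keeps data close
# big_cache < high_clock: the extra cache wins despite the lower clock
```

Under these assumed numbers the lower-clocked, cache-heavy core is roughly 3x faster per operation, which is the kind of gap that shows up in pointer-chasing simulation games rather than in a generic benchmark suite.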
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Sure, and I think that something isn't quite so relevant in the games where your FPS really tanks to unplayable levels because of CPU load. That's where the X3D starts to shine - of course not everywhere, but then nothing does.

From what I've gathered, in those games a big part of the additional CPU load is actually caused by graphics options: DLSS 3, frame generation, etc. I suppose that's where Intel can put its core count to work. It's an interesting development nonetheless - both the big.LITTLE approach and the cache-heavy CPU - in how they accelerate gaming. There is definitely untapped potential in CPUs left to put to use.


I would definitely be interested in this!
I have some videos on my channel with the 12900K running those games, if you're interested.
 
Joined
Oct 15, 2019
Messages
588 (0.31/day)
Especially if you run in-game and not the built-in benchmark, the differences are staggering. I'm talking about close to 50% differences.
A 50% difference comparing the 5800X3D to the 12900K? I need to see this. How were the 1% and 0.1% lows? Which resolution?
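For reference, "1% lows" are commonly computed as the average FPS over the slowest 1% of frames in a capture (0.1% lows use the slowest 0.1%). A minimal sketch of that calculation:

```python
def low_fps(frametimes_ms, pct=1.0):
    """Average FPS over the slowest `pct` percent of frames
    (one common definition of '1% lows')."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, round(len(worst) * pct / 100.0))
    return 1000.0 * n / sum(worst[:n])

# 99 smooth 10 ms frames plus one 50 ms stutter:
times = [10.0] * 99 + [50.0]
average = 1000.0 * len(times) / sum(times)  # ~96 FPS average
one_pct = low_fps(times, 1.0)               # 20 FPS 1% low
```

This is why averages can look nearly identical between two CPUs while the lows diverge: a single 50 ms stutter barely moves the average but dominates the 1% figure.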
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
A 50% difference comparing the 5800X3D to the 12900K? I need to see this. How were the 1% and 0.1% lows? Which resolution?
The resolution doesn't matter if the GPU isn't the bottleneck.

This is a 12900K with just DDR5-6000 RAM, at max settings + RT
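The resolution point can be modeled crudely: delivered FPS is capped by whichever of the CPU or GPU ceilings is lower, and lowering resolution only raises the GPU ceiling. A toy sketch with hypothetical numbers:

```python
# Crude bottleneck model, hypothetical numbers: the frame rate you see
# is capped by whichever side (CPU or GPU) is slower.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

# Dropping resolution raises the GPU ceiling but leaves the CPU ceiling
# alone, so once the GPU is fast enough the result stops changing:
at_4k    = delivered_fps(cpu_fps=140, gpu_fps=90)    # GPU-bound
at_1080p = delivered_fps(cpu_fps=140, gpu_fps=260)   # CPU-bound
```

This is also why CPU reviews test at low resolution with a top-end GPU: it pushes the GPU ceiling out of the way so the CPU ceiling is the one being measured.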

 