
Atlas Fallen Optimization Fail: Gain 50% Additional Performance by Turning off the E-cores

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,863 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Except for a few programs, like Cinebench, for which Intel themselves optimized their thread scheduler very heavily.
Source? There's nothing to optimize for: Cinebench simply spawns one worker thread per core and runs work on it non-stop until done.

Edit: Thread Director helps the OS decide which core to schedule a particular thread on.
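The model described here can be sketched in a few lines. This is an illustrative toy, not Cinebench's actual code: the tile-squaring "work" and the queue-based splitting are stand-ins. One worker thread per core drains a shared tile queue, and the only synchronization is the final join.

```python
import threading
import queue
import os

def render_tiles(num_tiles, num_workers=None):
    """Toy model: one worker per core drains a shared tile queue."""
    num_workers = num_workers or os.cpu_count()
    tiles = queue.Queue()
    for t in range(num_tiles):
        tiles.put(t)
    done = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                tile = tiles.get_nowait()
            except queue.Empty:
                return  # no work left; thread exits
            result = tile * tile  # stand-in for actual rendering work
            with lock:
                done.append(result)

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # the single end-of-run synchronization point
    return done

print(len(render_tiles(64)))  # 64
```

Because every worker pulls from the same queue, faster cores simply grab more tiles; there is no per-core scheduling decision for anyone to "optimize".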
 
Last edited:
Joined
Feb 15, 2019
Messages
1,661 (0.78/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
Source? There's nothing to optimize for: Cinebench simply spawns one worker thread per core and runs work on it non-stop until done.
I mean the Intel-side Thread Director actively recognizes Cinebench as a high-priority program and spreads the load evenly in an optimized way.
Not anything on the Cinebench side.
 

bug

Joined
May 22, 2015
Messages
13,794 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
They split it into blocks of pixels, but same thing... this is trivial; it's just a few lines of code.


There is just one synchronization per run, so one after a few minutes; this isn't even worth calling "synchronization". I doubt that it submits the last chunk to a faster core if it's waiting for a slower core to finish that last piece.
Ah, the benchmark is just one static scene?
 
Joined
Apr 12, 2013
Messages
7,545 (1.77/day)
Except for a few programs, like Cinebench, for which Intel themselves optimized their Thread Director very heavily.
It's a load of BS. Tell me, what do you gain from this when ultimately it's the OS scheduler/governor that's making the decisions? :slap:
The Intel® Thread Director supplies the behind-the-scenes magic that maximizes hybrid performance.

Built directly into the hardware, the Thread Director uses machine learning to schedule tasks on the right core at the right time (as opposed to relying on static rules). This helps ensure that Performance-cores and Efficient-cores work in concert; background tasks don’t slow you down, and you can have more apps open simultaneously.

Here’s how the Intel® Thread Director works:

  • It monitors the runtime instruction mix of each thread and the state of each core with nanosecond precision.
  • It provides runtime feedback to the OS to make the optimal decision for any workload.
  • It dynamically adapts its guidance according to the Thermal Design Point (TDP) of the system, operating conditions, and power settings.
By identifying the class of each workload and using its energy and performance core scoring mechanism, the Intel® Thread Director helps the OS schedule threads on the best core for performance or efficiency.

The end result is performance gains in many demanding gaming scenarios, such as streaming your game and recording gameplay footage at the same time. You get a smoother gaming experience with a higher FPS, your followers get a better viewing experience with higher-quality streams, and your gameplay captures look better, too.
Magic really :laugh:
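The "scoring mechanism" in the copy above can be sketched as a toy model. This is my own illustration: the core table, the scores, and the two-way foreground/background classification are invented and far simpler than whatever the hardware actually does.

```python
# Hypothetical core table: the scores are illustrative, not real measurements.
CORES = [
    {"id": 0, "type": "P", "perf": 100, "eff": 40},
    {"id": 1, "type": "P", "perf": 100, "eff": 40},
    {"id": 2, "type": "E", "perf": 55,  "eff": 90},
    {"id": 3, "type": "E", "perf": 55,  "eff": 90},
]

def pick_core(thread_class, free_cores):
    """Pick the best free core: maximize the performance score for
    foreground threads, the efficiency score for background threads."""
    key = "perf" if thread_class == "foreground" else "eff"
    return max(free_cores, key=lambda c: c[key])

print(pick_core("foreground", CORES)["type"])  # P
print(pick_core("background", CORES)["type"])  # E
```

The point of the toy: the scoring only matters when there is a choice to make, i.e. when not all cores are needed at once.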
 
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Thread Director doesn't do anything for apps that utilize 100% of the CPU. Cinebench is one of those apps. There's nothing to direct when all cores can and need to be used for max performance.

But if you're running an app that's using 100% of the CPU and then you perform a different task, Thread Director will decide where to allocate that task based on whether it needs to be completed quickly or not. Same with performing other tasks (mainly background) while gaming.

All this is just a theory, though. I don't think it's even possible to measure the potential benefits of this technology. Personally I don't see myself ever enabling E-cores in a gaming setup and have no interest in hybrid architectures on desktop.
 

bug

Joined
May 22, 2015
Messages
13,794 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Thread Director doesn't do anything for apps that utilize 100% of the CPU. Cinebench is one of those apps. There's nothing to direct when all cores can and need to be used for max performance.
If the OS decides to start some background task while you crunch, the TD may be able to hint that it needs to go to an E-core. Just guessing, since I don't know exactly what it does, where it stops and where the OS takes over.
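On Linux you can hand the OS a similar hint yourself via CPU affinity. A sketch, assuming the core IDs of the E-cores are already known (the IDs below are hypothetical and machine-specific; `os.sched_setaffinity` is Linux-only, so the helper degrades to a no-op elsewhere):

```python
import os

def pin_to_cores(cores):
    """Restrict the current process to the given core IDs, where the
    platform supports it (Linux exposes os.sched_setaffinity;
    Windows/macOS do not)."""
    if not hasattr(os, "sched_setaffinity"):
        return None  # unsupported platform: leave scheduling to the OS
    available = os.sched_getaffinity(0)
    wanted = set(cores) & available
    if wanted:
        os.sched_setaffinity(0, wanted)
    return os.sched_getaffinity(0)

# e.g. pin background work to cores 2 and 3 (hypothetical E-core IDs)
print(pin_to_cores([2, 3]))
```

Thread Director's hinting is automatic and per-thread; this manual approach is the blunt equivalent a user or game could apply themselves.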
 
Joined
Dec 26, 2006
Messages
3,847 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Does this happen in Win10 as well, or only Win11?
 

bug

Joined
May 22, 2015
Messages
13,794 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Does this happen in Win10 as well, or only Win11?
@W1zzard has spoon-fed you the answer: the game explicitly disables the detection of E-cores, treating everything the same. That's exactly what Win10 does, being unaware of E-cores.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
Hybrid processors have been around for a while now, and it's unacceptable for a software house to be unable to take this architecture into consideration. This is laziness and incompetence…
 
Joined
May 10, 2020
The main failure was to introduce the P/E idea to desktop CPUs. Who needs efficiency cores on the desktop anyway?
That’s only if you don’t understand how it works

Actually, Alder Lake was a disaster for months after launch. I had lots of problems. It was all fixed up, but E-cores are still causing some issues.

I switched to the Ryzen 7800X3D and couldn't be happier.
I had a 12700K two months after release: never had a single issue.
 
Joined
Feb 15, 2019
That’s only if you don’t understand how it works
I had a 12700K two months after release: never had a single issue.
May I ask you a question?

Why isn't the P+E hybrid architecture introduced into Xeon Scalable, where the BIG money is?
If the P+E hybrid architecture is so good, so efficient, lives up to its name as advertised, and "never had a single issue",
why can't enterprise users enjoy a 20P + 100E = 120-core CPU right now?

There is so much money and talent in the enterprise space, so there shouldn't be laziness and incompetence, right?
Why did Intel themselves have to limit Xeon Scalable to P-cores only or E-cores only?
Why not both?
Intel doesn't even mention a P+E hybrid Xeon on their roadmap until 2025.

Could you answer that?
 

bug

May I ask you a question?

Why isn't the P+E hybrid architecture introduced into Xeon Scalable, where the BIG money is?
If the P+E hybrid architecture is so good, so efficient, lives up to its name as advertised, and "never had a single issue",
why can't enterprise users enjoy a 20P + 100E = 120-core CPU right now?

There is so much money and talent in the enterprise space, so there shouldn't be laziness and incompetence, right?
Why did Intel themselves have to limit Xeon Scalable to P-cores only or E-cores only?
Why not both?
Intel doesn't even mention a P+E hybrid Xeon on their roadmap until 2025.

Could you answer that?
That's an easy one. Desktops handle heterogeneous loads: sometimes a game that needs a handful of fast cores, sometimes 3D modelling software or something that processes video and images and will take as many cores as you can throw at it. By contrast, unless you are a cloud provider, your servers will see a much more homogeneous load. There's rarely a "light workload" on servers. They tend to crunch or idle, with very little in between.
 
Joined
Feb 15, 2019
unless you are a cloud provider
Wait, so all the cloud providers purchased so many CPUs and still don't deserve some special care?

AWS, Google, Azure....
Sad Cry GIF by SpongeBob SquarePants
 
Joined
May 10, 2020
May I ask you a question?

Why isn't the P+E hybrid architecture introduced into Xeon Scalable, where the BIG money is?
If the P+E hybrid architecture is so good, so efficient, lives up to its name as advertised, and "never had a single issue",
why can't enterprise users enjoy a 20P + 100E = 120-core CPU right now?

There is so much money and talent in the enterprise space, so there shouldn't be laziness and incompetence, right?
Why did Intel themselves have to limit Xeon Scalable to P-cores only or E-cores only?
Why not both?
Intel doesn't even mention a P+E hybrid Xeon on their roadmap until 2025.

Could you answer that?
And again, if you can't even understand the difference between a Xeon target workload and a consumer CPU target workload, how can you criticize Intel's choices?
 

bug

Wait, so all the cloud providers purchased so many CPUs and still don't deserve some special care?

AWS, Google, Azure....
Sad Cry GIF by SpongeBob SquarePants
If you've ever configured something in the cloud, you know each provider offers an assortment of CPUs to choose from (not limited to x86_64, either). Amazon even built their own to add to the mix.
 
Joined
Feb 15, 2019
If you've ever configured something in the cloud, you know each provider offers an assortment of CPUs to choose from (not limited to x86_64, either). Amazon even built their own to add to the mix.
But I don't see P+E cores in the mix.

Spongebob Squarepants Ngapa GIF by The SpongeBob Movie: Sponge On The Run


Jokes aside.
We all know why P+E cores don't appear in the Xeon Scalable market.
Even a company as big as VMware had a lot of trouble making P+E cores work properly in virtualization.

And again, if you can't even understand the difference between a Xeon target workload and a consumer CPU target workload, how can you criticize Intel's choices?

So the competitor AMD could make their CPU architecture suitable for both workloads, while Intel had to differentiate themselves by making three architectures coexist, with no apparent positive effect other than driving up the cost?

What a good choice!
 
Last edited:

bug

But I don't see P+E cores in the mix.

Spongebob Squarepants Ngapa GIF by The SpongeBob Movie: Sponge On The Run
Because they're not needed. The cloud is not a simple desktop or a server. It works differently. You create your heterogeneous environment by mixing up the homogeneous CPUs available. It's cost-ineffective to do the mix in hardware when you can control it via software. And then there's serverless.
 
Joined
May 10, 2020
So the competitor AMD could make their CPU architecture suitable for both workloads, while Intel had to differentiate themselves by making three architectures coexist, with no apparent positive effect other than driving up the cost?

What a good choice!

AMD and Intel chose different paths, but AMD has been stuck with the 16-core/32-thread configuration in the consumer market for a while, while Intel moved on. The hybrid solution is a compromise, but a good one. You cannot just keep adding cores to a consumer CPU.
 

bug

So the competitor AMD could make their CPU architecture suitable for both workloads, while Intel had to differentiate themselves by making three architectures coexist, with no apparent positive effect other than driving up the cost?

What a good choice!

The 13900K matches the mighty 7950X in multithreading while having half of its threads running on weaker cores. Clearly not everything about Raptor Lake is as bad as you would have us believe. The costs were driven so high that the Intel CPU can be had for ~$150 less than AMD's.
 
Joined
Feb 15, 2019
Because they're not needed.
So was the never-released Xeon W-1400 series, when Intel themselves quickly realized it was an unfit vessel for virtualization.

The 13900K matches the mighty 7950X in multithreading while having half of its threads running on weaker cores. Clearly not everything about Raptor Lake is as bad as you would have us believe. The costs were driven so high that the Intel CPU can be had for ~$150 less than AMD's.

While consuming double the power (140 W vs. 276 W)?

Also, I don't want you to believe Raptor Lake is bad.
Raptor Lake is GOOD when I'm using it as a pure 8-core CPU with E-cores disabled, or having a pure P-core SR Xeon running VMs.
 

bug

@Crackong I cannot follow your logic; you're just throwing out random stuff. I'm out.
 
Joined
May 10, 2020
@Crackong I cannot follow your logic; you're just throwing out random stuff. I'm out.
There's no reasoning with AMD cheerleaders. They really are worse than Apple fans.

So was the never-released Xeon W-1400 series, when Intel themselves quickly realized it was an unfit vessel for virtualization.

While consuming double the power (140 W vs. 276 W)?

Also, I don't want you to believe Raptor Lake is bad.
Raptor Lake is GOOD when I'm using it as a pure 8-core CPU with E-cores disabled, or having a pure P-core SR Xeon running VMs.
Little reality check: a 7950X under full load can consume up to 240W
 
Joined
Nov 26, 2021
Messages
1,653 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04

The 13900K matches the mighty 7950X in multithreading while having half of its threads running on weaker cores. Clearly not everything about Raptor Lake is as bad as you would have us believe. The costs were driven so high that the Intel CPU can be had for ~$150 less than AMD's.
I think the difference between the two is less than $50. The 7950X is available for $599 at multiple retailers such as Best Buy and Newegg, while the 13900K is being sold for $568.20 by Amazon and B&H Photo Video.
 

bug

Little reality check: a 7950X under full load can consume up to 240W
You can even shave 100 W off ADL/RPL with something like a 5% penalty. They're configured pretty badly out of the box, but that doesn't mean there isn't a power-efficient chip in there.

I think the difference between the two is less than $50. The 7950X is available for $599 at multiple retailers such as Best Buy and Newegg, while the 13900K is being sold for $568.20 by Amazon and B&H Photo Video.
Now, maybe, but the 7950X launched at a $750 MSRP.
 