
Intel Core 12th Gen Alder Lake Preview

Joined
Sep 28, 2005
Messages
3,329 (0.48/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
How exciting this is. I don't feel the need to upgrade my 10700KF, but just to get one of these, I want to!

Wizzard, if you feel too tired or worn out from doing the review, send the hardware my way, I'll .....review it..... for you. ;)
 
Joined
Apr 18, 2019
Messages
935 (0.46/day)
Location
The New England region of the United States
System Name Gaming Rig
Processor Ryzen 7 3800X
Motherboard Gigabyte X570 Aurus Pro Wifi
Cooling Noctua NH-D15 chromax.black
Memory 32GB(2x16GB) Patriot Viper DDR4-3200C16
Video Card(s) EVGA RTX 3060 Ti
Storage Samsung 970 EVO Plus 1TB (Boot/OS)|Hynix Platinum P41 2TB (Games)
Display(s) Gigabyte G27F
Case Corsair Graphite 600T w/mesh side
Audio Device(s) Logitech Z625 2.1 | cheapo gaming headset when mic is needed
Power Supply Corsair HX850i
Mouse Redragon M808-KS Storm Pro (Great Value)
Keyboard Redragon K512 Shiva replaced a Corsair K70 Lux - Blue on Black
VR HMD Nope
Software Windows 11 Pro x64
Benchmark Scores Nope
Are you seriously suggesting Intel did this on purpose? That's an issue to be fixed by AMD and Microsoft, and those tests were made Oct 1st. Even Intel could have improved some things with BIOS updates and the like between then and now.
Yes absolutely, chipzilla knows exactly what it is doing. They could have benched with Windows 10 but chose to put out false results to make their new chips look even better. AMD probably would have done the same thing if the shoe was on the other foot.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
It's 4 E cores vs 1 P core at the same area, same clocks (5.2 GHz), if you take 1.27x IPC and 1.25x Hyper-Threading: 4x vs ~1.6x throughput at the same power.
I don't think it's quite that simple - the E core cluster is visibly wider than a P core in the die shot Intel used for their review packaging, and while it is a bit smaller in the other dimension, I would think the overall area is a bit larger. Still far more perf/area/clock, of course. But I sincerely doubt these cores can reliably clock to 5.2 GHz. Given that they are designed for efficiency, it would make sense for less effort to be put into making them clock far beyond 4 GHz, and it's quite common for architectures to hit hard clock limits anywhere from 3 GHz to 5 GHz - so they might not go much higher than the 3.9 GHz they are rated at on the 12900K.
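
For a rough sense of the throughput-per-area trade-off, here is a back-of-the-envelope sketch. It assumes Intel's claimed ~1.27x IPC and ~1.25x Hyper-Threading uplifts, treats a four-core Gracemont cluster as roughly one Golden Cove core in area (an approximation, per the die-shot caveat above), and uses the 12900K's rated 5.1/3.9 GHz clocks rather than a hypothetical 5.2 GHz:

Code:
# Back-of-the-envelope throughput-per-area comparison (vendor claims, not measurements):
# one Golden Cove P core (2 threads via HT) vs a four-core Gracemont E cluster (4 threads),
# assumed to occupy roughly similar die area.

p_ipc_ratio = 1.27        # claimed Golden Cove IPC vs Gracemont (Gracemont = 1.0)
ht_uplift = 1.25          # claimed MT gain from Hyper-Threading on the P core
p_clk, e_clk = 5.1, 3.9   # GHz, 12900K rated boost clocks

p_core_mt = p_ipc_ratio * ht_uplift * p_clk   # ~8.1 arbitrary units
e_cluster_mt = 4 * 1.0 * e_clk                # ~15.6 arbitrary units

print(f"P core (2T):        {p_core_mt:.1f}")
print(f"E cluster (4T):     {e_cluster_mt:.1f}")
print(f"E cluster / P core: {e_cluster_mt / p_core_mt:.2f}x")   # ~1.9x throughput per area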
Yes absolutely, chipzilla knows exactly what it is doing. They could have benched with Windows 10 but chose to put out false results to make their new chips look even better. AMD probably would have done the same thing if the shoe was on the other foot.
Given that W10 lacks the scheduler to handle ADL properly that would have been quite problematic though - you try to make an even playing field, and only test across OSes if you really have to. Still, they had time to re-test, especially with the Insider patch being out for a while now. I rather think this is "opportunistic laziness" than malevolence though. "Oops, we didn't have time to fix that before our huge event, sorry" is far more likely than some grinning PR Disney villain being behind it.
 
Joined
Apr 30, 2011
Messages
2,704 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Win10 will most likely use only the P cores and ignore the E ones, until MS updates the CPU scheduler there as well. For now the big push for their Win10.5=11 OS is only the ADL platform (maybe that's why they collaborated heavily with Intel and messed up the Ryzen CPU performance), so they will leave Win10 on the back seat for a while, methinks.
 
Joined
Apr 18, 2019
Messages
935 (0.46/day)
Location
The New England region of the United States
System Name Gaming Rig
Processor Ryzen 7 3800X
Motherboard Gigabyte X570 Aurus Pro Wifi
Cooling Noctua NH-D15 chromax.black
Memory 32GB(2x16GB) Patriot Viper DDR4-3200C16
Video Card(s) EVGA RTX 3060 Ti
Storage Samsung 970 EVO Plus 1TB (Boot/OS)|Hynix Platinum P41 2TB (Games)
Display(s) Gigabyte G27F
Case Corsair Graphite 600T w/mesh side
Audio Device(s) Logitech Z625 2.1 | cheapo gaming headset when mic is needed
Power Supply Corsair HX850i
Mouse Redragon M808-KS Storm Pro (Great Value)
Keyboard Redragon K512 Shiva replaced a Corsair K70 Lux - Blue on Black
VR HMD Nope
Software Windows 11 Pro x64
Benchmark Scores Nope
I'm not sure I understand the need for E cores on a desktop. Laptops I understand, it's all about efficiency. But desktops just want raw power.

Looking forward to reviews
I think it is about marketing. Intel thinks most people are too dumb to know the difference and just buy CPUs based solely on core counts now. They can say "look, we have 16 cores too" and at the same time profess their advantage in gaming, while not making a GPU-sized processor with a 500 W peak TDP.

Given that W10 lacks the scheduler to handle ADL properly that would have been quite problematic though - you try to make an even playing field, and only test across OSes if you really have to. Still, they had time to re-test, especially with the Insider patch being out for a while now. I rather think this is "opportunistic laziness" than malevolence though. "Oops, we didn't have time to fix that before our huge event, sorry" is far more likely than some grinning PR Disney villain being behind it.
You are right, I totally forgot about that. So it's not 100% malicious and devious. I'm not planning on embracing MS spyware OS 2.0 so I guess no 12xxx CPU for me. Maybe in Linux instead...

Am I the only one wondering what an all-E-core chip would be like? At that size, with those power requirements and only 1% slower than Skylake, they could have made a 40-core chip in the same die area (this assumes 1P = 4E: 8P * 4 = 32, plus the 8 existing E = 40 cores). Maybe that's a good server strategy for them.
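
To spell out that die-area arithmetic (a quick sketch; the "1 P core ≈ 4 E cores" area equivalence is an assumption, and in reality the E cluster may be slightly larger):

Code:
# Hypothetical all-E-core die using roughly the same core area as Alder Lake's 8P+8E layout,
# assuming one P core's area fits a four-core E cluster (an approximation).

p_cores, e_cores = 8, 8
e_cores_per_p_area = 4

all_e_core_count = p_cores * e_cores_per_p_area + e_cores
print(all_e_core_count)   # 40 E cores in roughly the same core area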
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Win10 will most likely use only the P cores and ignore the E ones, until MS updates the CPU scheduler there as well. For now the big push for their Win10.5=11 OS is only the ADL platform (maybe that's why they collaborated heavily with Intel and messed up the Ryzen CPU performance), so they will leave Win10 on the back seat for a while, methinks.
You think W10 will get the new scheduler? Call me a pessimist, but I doubt it.
Am I the only one wondering what an all-E-core chip would be like? At that size, with those power requirements and only 1% slower than Skylake, they could have made a 40-core chip in the same die area (this assumes 1P = 4E: 8P * 4 = 32, plus the 8 existing E = 40 cores). Maybe that's a good server strategy for them.
Nope, others have asked more or less the same question. 40 cores with that little L3, 2MB L2 per 4 cores and only a single link to the L3 and ring bus per four cores would likely be a pretty mixed bag in terms of performance though, and many server workloads want tons of cache. The 4c cluster would be kind of like an L3-less CCX, but on a monolithic die with a single L3, but a small one... I could definitely see this being done in the server space, but they might also change the clusters (2-core clusters? Single cores? Four, but more shared L2?) and stick them in a mesh for really high core counts.
 
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I think it is about marketing. Intel thinks most people are too dumb to know the difference and just buy CPUs based solely on core counts now. They can say "look, we have 16 cores too" and at the same time profess their advantage in gaming, while not making a GPU-sized processor with a 500 W peak TDP.

It's funny to watch AMD fans talk about marketing a CPU based only on core count.

So tell me, serious question, was there a worse x86 processor than Zen 2 released in the past 4 years for games?

And how many AMD fans here and elsewhere pushed Zen 2 onto gamers for its future proofing due to high core counts?
 
Joined
Apr 18, 2019
Messages
935 (0.46/day)
Location
The New England region of the United States
System Name Gaming Rig
Processor Ryzen 7 3800X
Motherboard Gigabyte X570 Aurus Pro Wifi
Cooling Noctua NH-D15 chromax.black
Memory 32GB(2x16GB) Patriot Viper DDR4-3200C16
Video Card(s) EVGA RTX 3060 Ti
Storage Samsung 970 EVO Plus 1TB (Boot/OS)|Hynix Platinum P41 2TB (Games)
Display(s) Gigabyte G27F
Case Corsair Graphite 600T w/mesh side
Audio Device(s) Logitech Z625 2.1 | cheapo gaming headset when mic is needed
Power Supply Corsair HX850i
Mouse Redragon M808-KS Storm Pro (Great Value)
Keyboard Redragon K512 Shiva replaced a Corsair K70 Lux - Blue on Black
VR HMD Nope
Software Windows 11 Pro x64
Benchmark Scores Nope
It's funny to watch AMD fans talk about marketing a CPU based only on core count.

So tell me, serious question, was there a worse x86 processor than Zen 2 released in the past 4 years for games?

And how many AMD fans here and elsewhere pushed Zen 2 onto gamers for its future proofing due to high core counts?
Not everyone who dislikes the way Intel has historically operated their business is an AMD "fan". I buy strictly on value.

This may shock you but gaming isn't the only purpose for a high-performance processor.

I still believe core counts will matter more for gaming in the coming years. The current generation of consoles embracing 8 core processors will have an impact.
 
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Not everyone who dislikes the way Intel has historically operated their business is an AMD "fan". I buy strictly on value.
If the company has 'Inc' at the end it is fundamentally evil. See Google.

This may shock you but gaming isn't the only purpose for a high-performance processor.

I still believe core counts will matter more for gaming in the coming years. The current generation of consoles embracing 8 core processors will have an impact.

That's called a red herring and goalpost shifting. I was responding to a post about Intel core counts and #1 gaming performance as marketing hijinks. We weren't talking about running CineBench. I don't run that but I do run a lot of MS Office apps and browser based apps and...

Oh wait!

[attached screenshots]
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Now that would be telling, wouldn't it.
8 P cores running an application vs 8 E cores running an application, with information on the power consumption difference between the two.

Theoretically, 1.27x IPC * (5 GHz/3.9 GHz) * 1.25x Hyper Threading = 2.03 times the multithreaded performance of 8 P cores running 16 threads vs 8 E cores running 8 threads.
Yes it would. Great minds think alike. ;)
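
For reference, here is a quick sketch of the calculation quoted above; the 1.27x IPC and 1.25x Hyper-Threading figures are Intel's claims and 5.0/3.9 GHz are approximate rated clocks, so this is an estimate rather than a measurement:

Code:
# Multithreaded throughput estimate: 8 P cores (16 threads) vs 8 E cores (8 threads),
# per the calculation quoted above (vendor-claimed figures).

ipc_ratio = 1.27        # claimed Golden Cove vs Gracemont IPC
clk_ratio = 5.0 / 3.9   # P-core vs E-core clock
ht_uplift = 1.25        # claimed MT gain from Hyper-Threading

p_over_e = ipc_ratio * clk_ratio * ht_uplift
print(f"8 P cores (16T) vs 8 E cores (8T): {p_over_e:.3f}x")   # ~2.035x, i.e. the ~2.03x above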
 
Joined
Aug 6, 2020
Messages
729 (0.46/day)


If Intel is telling the truth (which is very doubtful) - Gracemont is only 20% behind Golden Cove?!! Damn, 8x Gracemont is looking like a very impressive processor for budget laptops. Hell, even budget gaming laptops with RTX 3050/3060s will be well suited with 8x Gracemont cores.


They're not - the frequency for Golden Cove is still almost 2x Gracemont (hence why they have clock-for-clock comparisons)

So you still need nearly twice as many cores to match Golden Cove! The real-world performance is about 60% of Golden Cove's.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
How exciting this is. I don't feel the need to upgrade my 10700KF, but just to get one of these, I want to!

Wizzard, if you feel too tired or worn out from doing the review, send the hardware my way, I'll .....review it..... for you. ;)
me first.

you do need the upgrade....riiiiiiight? :laugh:
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
They're not - the frequency for Golden Cove is still almost 2x Gracemont (hence why they have clock-for-clock comparisons)

So you still need nearly twice as many cores to match Golden Cove! The real-world performance is about 60% of Golden Cove's.
2x? 5.1 vs 3.9 GHz is a 30.8% advantage (or a 23.5% disadvantage). You're right other than that, but it's nowhere near 2x.
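
A quick sketch of those ratios, plus what they imply for the single-thread gap if Intel's claimed ~1.27x IPC advantage is taken at face value (an assumption, not a measurement):

Code:
# Clock ratios for the 12900K's rated boost clocks, plus a rough single-thread estimate
# using Intel's claimed ~1.27x IPC advantage for Golden Cove over Gracemont.

p_clk, e_clk = 5.1, 3.9
ipc_ratio = 1.27   # claimed, not measured

print(f"P-core clock advantage:    {p_clk / e_clk - 1:.1%}")   # ~30.8%
print(f"E-core clock disadvantage: {1 - e_clk / p_clk:.1%}")   # ~23.5%
print(f"Rough 1T estimate: Gracemont ~{e_clk / p_clk / ipc_ratio:.0%} of Golden Cove")   # ~60%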
 
Joined
Sep 17, 2014
Messages
22,475 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
If the company has 'Inc' at the end it is fundamentally evil. See Google.



That's called a red herring and goalpost shifting. I was responding to a post about Intel core counts and #1 gaming performance as marketing hijinks. We weren't talking about running CineBench. I don't run that but I do run a lot of MS Office apps and browser based apps and...

Oh wait!

[attached screenshots]

It's not shifting goalposts in the argument about core count. Chronology matters, and the market has shifted.

- Prior to Zen, quad core was what Intel offered and stayed at for years. Gaming performance relied on single-thread efficiency, and the market ran on DX9-11. The hardware was clearly fighting a losing battle against increasing gaming demands. Threading was fundamentally problematic for the industry. Nvidia won GPU comparisons in part by virtue of higher single-thread efficiency with an API like DX11 (CPU driver overhead).

- Leading into Zen we had the Mantle and DX12 initiatives, finally making the API more flexible for threading.

- Consoles adopted x86

- Zen and post Skylake generations started pushing core counts to 8+ for MSDT.

- Gaming is now once more as it was during the Sandy Bridge days - except now both camps offer highly capable stacks of CPUs for it. You can game perfectly fine on anything midrange and up, even from yesteryear. This is the norm through which Sandy Bridge and its successors turned Intel into the gaming king, and it readily applies to AMD's latest 2-3 generations of Zen.

So right now, for desktop gaming purposes, anything works and new releases barely matter. The new ground covered here is higher peak performance for HEDT-like, more parallel workloads. Again - both ADL and Zen are perfectly tuned for it, albeit different under the hood, and DDR5 is the big enabler for both to take it further.
 
Joined
Dec 26, 2006
Messages
3,837 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
It's not shifting goalposts in the argument about core count. Chronology matters, and the market has shifted.

- Prior to Zen, quad core was what Intel offered and stayed at for years. Gaming performance relied on single-thread efficiency, and the market ran on DX9-11. The hardware was clearly fighting a losing battle against increasing gaming demands. Threading was fundamentally problematic for the industry. Nvidia won GPU comparisons in part by virtue of higher single-thread efficiency with an API like DX11 (CPU driver overhead).

- Leading into Zen we had the Mantle and DX12 initiatives, finally making the API more flexible for threading.

- Consoles adopted x86

- Zen and post Skylake generations started pushing core counts to 8+ for MSDT.

- Gaming is now once more as it was during the Sandy Bridge days - except now both camps offer highly capable stacks of CPUs for it. You can game perfectly fine on anything midrange and up, even from yesteryear. This is the norm through which Sandy Bridge and its successors turned Intel into the gaming king.

So right now, for desktop gaming purposes, anything works and new releases barely matter. The new ground covered here is higher peak performance for HEDT-like, more parallel workloads. Again - both ADL and Zen are perfectly tuned for it, albeit different under the hood, and DDR5 is the big enabler for both to take it further.

It's goalpost shifting when someone is talking about one topic / set of metrics - gaming performance and core counts - and someone chimes in :

  • "This may shock you but gaming isn't the only purpose for a high-performance processor."
    • The above is a goalpost shift; we aren't talking about productivity apps - and they were never a slam dunk with Zen 2 either
  • "I still believe core counts will matter more for gaming in the coming years. The current generation of consoles embracing 8 core processors will have an impact."
    • This is a red herring, talking about what *may* be, it's exactly the same type of statement that had a bunch of people buying Zen 1 and Zen 2 *for gaming*.
    • Reality check - they are the two worst series of CPUs to have bought in the last 4 years *for a gamer*.
 
Joined
Sep 17, 2014
Messages
22,475 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It's goalpost shifting when someone is talking about one topic / set of metrics - gaming performance and core counts - and someone chimes in :

  • "This may shock you but gaming isn't the only purpose for a high-performance processor."
    • The above is a goalpost shift; we aren't talking about productivity apps - and they were never a slam dunk with Zen 2 either
  • "I still believe core counts will matter more for gaming in the coming years. The current generation of consoles embracing 8 core processors will have an impact."
    • This is a red herring, talking about what *may* be, it's exactly the same type of statement that had a bunch of people buying Zen 1 and Zen 2 *for gaming*.
    • Reality check - they are the two worst series of CPUs to have bought in the last 4 years *for a gamer*.

The real reality check I am pointing at is to stop looking at what niches of gaming still offer tangible benefits, instead of diving into what is nearly margin-of-error territory... and instead zoom out a little bit and see what is new to the MSDT market with the current advances. The notable advances of ADL don't really point to better gaming in any way. Not the increased core count, not DDR5, nor the better MT efficiency.

Good gaming was never about the highest benchmark result; the GPU is pivotal, and once the CPU is good enough it is irrelevant beyond that point. Zen from the 2xxx series onwards offered that - and here is the kicker - alongside a set of other USPs like, initially, price and core count.

It's good to recognize that, because the mainstream market is going to move in kind. You only upgrade a CPU when it starts to show its age.
 
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
The real reality check I am pointing at is to stop looking at what niches of gaming still offer tangible benefits, instead of diving into what is nearly margin-of-error territory... and instead zoom out a little bit and see what is new to the MSDT market with the current advances.

Good gaming was never about the highest benchmark result; the GPU is pivotal, and once the CPU is good enough it is irrelevant beyond that point. Zen from the 2xxx series onwards offered that - and here is the kicker - alongside a set of other USPs like, initially, price and core count.

It's good to recognize that, because the mainstream market is going to move in kind. You only upgrade a CPU when it starts to show its age.

The pre-Zen 3 CCX architecture was the worst choice for gaming, that's all I said. This isn't philosophy: if you choose a CPU for gaming and that choice becomes hopelessly obsolete in under a year, you made a bad choice. The main thing that kept the Zen 2 gaming issue out of everyone's face was COVID and the GPU price jump / scarcity.

Recent benchmarks don't even list Zen 2 on the charts anymore - gen 9 is usually bottom of the list. And here's why -

The difference is not margin of error, that's something people who can't read a chart repeated until they all believed it (feedback loop).

This is with a 3080; the contemporaries of the Zen 2 CPUs were Intel's 9th and 10th gen, and here you have 15% higher FPS with a 9900K vs a 3900X, and almost 20% with a 10900K.

Zen 2 was mostly fine with 2XXX series Nvidia cards but that fell apart in the space of 12 months. Again, Zen 1 and Zen 2 were demonstrably two of the worst CPUs one could have bought for gaming in the past 3-4 years.

[attached benchmark chart]
 
Joined
Feb 21, 2006
Messages
2,225 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
The pre-Zen 3 CCX architecture was the worst choice for gaming, that's all I said. This isn't philosophy: if you choose a CPU for gaming and that choice becomes hopelessly obsolete in under a year, you made a bad choice. The main thing that kept the Zen 2 gaming issue out of everyone's face was COVID and the GPU price jump / scarcity.

Recent benchmarks don't even list Zen 2 on the charts anymore - gen 9 is usually bottom of the list. And here's why -

The difference is not margin of error, that's something people who can't read a chart repeated until they all believed it (feedback loop).

This is with a 3080; the contemporaries of the Zen 2 CPUs were Intel's 9th and 10th gen, and here you have 15% higher FPS with a 9900K vs a 3900X, and almost 20% with a 10900K.

Zen 2 was mostly fine with 2XXX series Nvidia cards but that fell apart in the space of 12 months. Again, Zen 1 and Zen 2 were demonstrably two of the worst CPUs one could have bought for gaming in the past 3-4 years.

[attached benchmark chart]

While I see the point you are trying to make, calling it the worst for gaming in the past 3-4 years while showing it doing 200 fps is a bit of a stretch.

I would believe that more if it were going from non-playable fps to playable. Yes, prior to Zen 3 Intel provided more fps, but some people were fine with that. A side-by-side test between the 10900K in the chart and the 3900X in the chart would not be noticeable to anyone playing without an fps counter up.

How deep does this argument need to go when you are talking about 200+ fps from both chips?
 
Joined
Apr 30, 2011
Messages
2,704 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
The pre-Zen 3 CCX architecture was the worst choice for gaming, that's all I said. This isn't philosophy: if you choose a CPU for gaming and that choice becomes hopelessly obsolete in under a year, you made a bad choice. The main thing that kept the Zen 2 gaming issue out of everyone's face was COVID and the GPU price jump / scarcity.

Recent benchmarks don't even list Zen 2 on the charts anymore - gen 9 is usually bottom of the list. And here's why -

The difference is not margin of error, that's something people who can't read a chart repeated until they all believed it (feedback loop).

This is with a 3080; the contemporaries of the Zen 2 CPUs were Intel's 9th and 10th gen, and here you have 15% higher FPS with a 9900K vs a 3900X, and almost 20% with a 10900K.

Zen 2 was mostly fine with 2XXX series Nvidia cards but that fell apart in the space of 12 months. Again, Zen 1 and Zen 2 were demonstrably two of the worst CPUs one could have bought for gaming in the past 3-4 years.

[attached benchmark chart]
I suppose serious gamers like you, who need the best CPU to play at high settings and high res, will feel inferior when games run on average 10% slower, eh? Kudos then! Because that is the difference at stock between the best CPUs back in 2019 (3950X vs 11900K).
[attached benchmark chart]
 
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
While I see the point you are trying to make, calling it the worst for gaming in the past 3-4 years while showing it doing 200 fps is a bit of a stretch.

That's true insofar as it reflects my choice to show the games and settings Tom's used. It's not true of newer / higher-end games though.

To illustrate - Cyberpunk 2077 is a heavily threaded game, tested at a setting I would think is common for fairly dedicated gamers. This should be ideal for all Zen, but it isn't quite. You've got the 10600K, 10700K and 10900K all beating the top Zen 2 3950X by wide margins. Interestingly, the 9900K ties it. Again, this should be an ideal game for high-thread-count Zen, and the test is at a very reasonable 1440p medium that many gamers will expect:

[attached benchmark chart]
 
Joined
Oct 23, 2020
Messages
671 (0.45/day)
Location
Austria
System Name nope
Processor I3 10100F
Motherboard ATM Gigabyte h410
Cooling Arctic 12 passive
Memory ATM Gskill 1x 8GB NT Series (No Heatspreader bling bling garbage, just Black DIMMS)
Video Card(s) Sapphire HD7770 and EVGA GTX 470 and Zotac GTX 960
Storage 120GB OS SSD, 240GB M2 Sata, 240GB M2 NVME, 300GB HDD, 500GB HDD
Display(s) Nec EA 241 WM
Case Coolermaster whatever
Audio Device(s) Onkyo on TV and Mi Bluetooth on Screen
Power Supply Super Flower Leadx 550W
Mouse Steelseries Rival Fnatic
Keyboard Logitech K270 Wireless
Software Deepin, BSD and 10 LTSC
Piece of shit, self-builders again only get a 32 EU iGPU :laugh: but now at 1450 MHz instead of 1300 like Rocket Lake.

In SFF and notebooks there are 80-96 EUs :kookoo:

I'm also not buying an R5 5600G for 260€ if I can get a GTX 970 for 130€ :p


What would be faster, a config of:
i5 12600K / 5600G, B450 board, iGPU, 16GB of RAM for 420€, or
A8 5500, A68H board, 16GB and a GTX 970 for about 300€ (in my case a GTX 970 for 130€)?
 
Joined
Apr 18, 2019
Messages
935 (0.46/day)
Location
The New England region of the United States
System Name Gaming Rig
Processor Ryzen 7 3800X
Motherboard Gigabyte X570 Aurus Pro Wifi
Cooling Noctua NH-D15 chromax.black
Memory 32GB(2x16GB) Patriot Viper DDR4-3200C16
Video Card(s) EVGA RTX 3060 Ti
Storage Samsung 970 EVO Plus 1TB (Boot/OS)|Hynix Platinum P41 2TB (Games)
Display(s) Gigabyte G27F
Case Corsair Graphite 600T w/mesh side
Audio Device(s) Logitech Z625 2.1 | cheapo gaming headset when mic is needed
Power Supply Corsair HX850i
Mouse Redragon M808-KS Storm Pro (Great Value)
Keyboard Redragon K512 Shiva replaced a Corsair K70 Lux - Blue on Black
VR HMD Nope
Software Windows 11 Pro x64
Benchmark Scores Nope
If the company has 'Inc' at the end it is fundamentally evil. See Google.



That's called a red herring and goalpost shifting. I was responding to a post about Intel core counts and #1 gaming performance as marketing hijinks. We weren't talking about running CineBench. I don't run that but I do run a lot of MS Office apps and browser based apps and...

Oh wait!

My last comment on this because I'm pretty sure you just like to argue.

There are a lot more things to do outside the examples you've provided. Personally, I used to do a lot of software video encoding. Note the software part, it typically yields superior quality and compression to the hardware encoders in some CPUs and graphics cards. That and the former price advantage got me to buy into Ryzen processors. I could encode faster and still play some games. I don't play at 4K or 500FPS so losing a bit of gaming performance was fine by me.
 
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
My last comment on this because I'm pretty sure you just like to argue.

There are a lot more things to do outside the examples you've provided. Personally, I used to do a lot of software video encoding. Note the software part, it typically yields superior quality and compression to the hardware encoders in some CPUs and graphics cards. That and the former price advantage got me to buy into Ryzen processors. I could encode faster and still play some games. I don't play at 4K or 500FPS so losing a bit of gaming performance was fine by me.

You are the one that posted something about nebulous 'productivity' when the subject was gaming. Now we get the stereotypical 'I do rendering and encoding'.
 
Joined
Oct 23, 2020
Messages
671 (0.45/day)
Location
Austria
System Name nope
Processor I3 10100F
Motherboard ATM Gigabyte h410
Cooling Arctic 12 passive
Memory ATM Gskill 1x 8GB NT Series (No Heatspreader bling bling garbage, just Black DIMMS)
Video Card(s) Sapphire HD7770 and EVGA GTX 470 and Zotac GTX 960
Storage 120GB OS SSD, 240GB M2 Sata, 240GB M2 NVME, 300GB HDD, 500GB HDD
Display(s) Nec EA 241 WM
Case Coolermaster whatever
Audio Device(s) Onkyo on TV and Mi Bluetooth on Screen
Power Supply Super Flower Leadx 550W
Mouse Steelseries Rival Fnatic
Keyboard Logitech K270 Wireless
Software Deepin, BSD and 10 LTSC
My last comment on this because I'm pretty sure you just like to argue.

There are a lot more things to do outside the examples you've provided. Personally, I used to do a lot of software video encoding. Note the software part, it typically yields superior quality and compression to the hardware encoders in some CPUs and graphics cards. That and the former price advantage got me to buy into Ryzen processors. I could encode faster and still play some games. I don't play at 4K or 500FPS so losing a bit of gaming performance was fine by me.
That way is totally ill-conceived; my iGPU, in the form of the HD 6550D (A8 3800), can even render faster than any AM4 CPU :kookoo:
 