
Intel Core 12th Gen Alder Lake Preview

how crazy... all of this information already leaked.


Except for this little detail about the bullshit performance chart:

Intel shows the 12900K being anywhere from slightly slower than the Ryzen 9 5950X to 30 percent faster. However, Intel does admit that these benchmarks were captured on Windows 11 before the performance patch for AMD CPUs was available, so the results aren't as meaningful as they would have been had they tested the 5950X in its best-performing configuration, such as on Windows 10 or after waiting for the patch to become available.



The BS Parade already started before the launch even left the gate! :rolleyes:

- Zen 3 with Windows 11 performance corrections is only 5% slower than Alder Lake, and Zen 3+ will exceed this trash heap by about 10%!
 
True, though I have not seen DP inputs on ASUS Z690 TB4 boards. I know past ASRock and MSI boards had DP inputs for Thunderbolt.
There are a few boards with input, but yeah, not all of them seem to have it. Haven't looked at Asus, but I saw one of Gigabyte's Aero boards had an input.
 
- Zen 3 with Windows 11 performance corrections is only 5% slower than Alder Lake, and Zen 3+ will exceed this trash heap by about 10%!

Where is that info from? Zen 3+ may get further delays, and the extra cache will only benefit gaming. The real deal will be Zen 4, but when is that coming?
 
Actually, 190 W PL1 for the 12700K is not bad, I guess. I was expecting it to be more power hungry, TBH.

If you disable the 4 E-cores, we may well see a PL2 similar to the 5800X's, around 140 W.
Why would you disable the main efficiency feature of a design to save power? It makes no sense, and is the opposite of what most people do.

Let's just await benches. Coming up with comparisons where performance-hurting conditions have to be met to achieve some version of parity is useless; let's see what stock vs. stock looks like first.

And I mean legit, patched Windows 11 tests. Any reviewer coming out with shite like "we didn't have time" is up for abuse, with a week to go until the performance NDA lifts (a balls arrangement: PR without a useful, accurate comparison).
 
The thin silicon, thin STIM and thicker IHS is of particular interest to me for cooling. Should make it easier to get the heat out of these chips.

I won't be an early adopter, those motherboards can be pricey. But these should be fun once the DDR5 gets optimized for clocks and timings. Think I'll wait for PCIE5 SSDs before making the jump. Gen 5 everything, GPU, SSD and ram.
 
Why would you disable the main efficiency feature of a design to save power? It makes no sense, and is the opposite of what most people do.
Personally I don't buy this hybrid BS from Intel. A normal desktop strategy should be to reserve headroom in peak scenarios for single-core performance rather than to reduce idle power or crunch MT workloads. And I agree to wait for proper reviews, especially regarding Win11 scheduler overhead: before, a thread didn't need to be constantly moved around; now it does.

AnandTech already reports a case of a Win11 hybrid scheduler defect. IMO in that case you may as well just disable all the E-cores instead of going "high performance mode". Anyway, it's just 4 E-cores on the 12700K. Dark silicon helps cooling. XD
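Instead of hard-disabling the E-cores in the BIOS, you can also restrict just the affected program to the P-cores with a CPU affinity mask. A minimal sketch, which ASSUMES (not guaranteed) that Windows enumerates the 12700K's 16 P-core logical CPUs as 0-15 and the 4 E-cores as 16-19 — check your own topology in Task Manager first:

```python
# Sketch: build an affinity bitmask covering only the P-cores,
# under the assumed CPU enumeration described above.

def p_core_affinity_mask(p_logical_cpus: int = 16) -> int:
    """Bitmask selecting logical CPUs 0..p_logical_cpus-1 (assumed
    to be the P-core threads on a 12700K)."""
    return (1 << p_logical_cpus) - 1

# Usage from Windows cmd (game.exe is a hypothetical example):
#   start /affinity FFFF game.exe
# where FFFF is the hex form of the mask computed here.
print(f"{p_core_affinity_mask():X}")  # FFFF
```

This keeps the E-cores available to background tasks rather than fusing them off system-wide, at the cost of having to set it per program.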
 
Hybrid chips have been working well for Apple and phones.

If Microsoft can make them play nice with Windows, it could work well as opposed to continuing to brute force it with high standard core counts, especially for mobile applications.
 
The thin silicon, thin STIM and thicker IHS is of particular interest to me for cooling. Should make it easier to get the heat out of these chips.
Also the lavishly wasted dark silicon. Consider the hardwire-disabled AVX-512 on the P-cores, the DDR4+DDR5 logic of which you'll only ever use one, the factory-disabled cores on every product except the 12900K (which Intel did not do in the past), and the soon-to-be-disabled E-cores on some consumers' systems.

*Probably* we are finally going to see a ~7nm x86 product with proper thermal density.
 
There are a few boards with input, but yeah, not all of them seem to have it. Haven't looked at Asus, but I saw one of Gigabyte's Aero boards had an input.
Is there any technical reason why TB controllers can't get DP signals from any GPU in the system over PCIe? Given that eGPUs can transfer signals back in this manner, and you can even connect a display to the iGPU and manually set your dGPU as the rendering GPU for 3D tasks, I don't see why we still need DP inputs for TB on desktop motherboards. Can't this just be a BIOS toggle?


Other than this, it's fantastic to see Intel ditch the TDP term for marketing and instead specify both base and boost power for each chip. About time!
 
What uses AVX512 anyway? Nothing I use at present at least.

Yeah, probably skip this generation and wait for them to drop the DDR4 controller.
 
Is there any technical reason why TB controllers can't get DP signals from any GPU in the system over PCIe? Given that eGPUs can transfer signals back in this manner, and you can even connect a display to the iGPU and manually set your dGPU as the rendering GPU for 3D tasks, I don't see why we still need DP inputs for TB on desktop motherboards. Can't this just be a BIOS toggle?
Ask Intel? I honestly don't know, but TB has always had a lot of weird limitations and even if you look at the board makers disclaimers now, it seems like if you're using TB4, you can't use some PCIe devices that are part of the motherboard or you're limited to how many TB4 devices you can use?!
Because of the limited I/O resources of the PC architecture, the number of Thunderbolt™ devices that can be used is dependent on the number of the PCI Express devices being installed. (Refer to Chapter 2-7, "Back Panel Connectors," for more information.)
Thunderbolt™ 4 Connector (USB Type-C® Port)
The connector supports standard DisplayPort and Thunderbolt™ video outputs. You can connect a standard DisplayPort/Thunderbolt™ monitor to this connector with an adapter. The Thunderbolt™ connector can daisy chain up to five Thunderbolt™ devices. Because of the limited I/O resources of the PC architecture, the number of Thunderbolt™ devices that can be used is dependent on the number of the PCI Express devices being installed. You can adjust the Thunderbolt™ settings under Settings\Thunderbolt Configuration in BIOS Setup.
The maximum supported resolution is 5120 x 2880@60 Hz with 24 bpp via single display output, but the actual resolutions supported are dependent on the monitor being used. Also, the connector is reversible and supports the USB 3.2 Gen 2 specification and is compatible with the USB 3.2 Gen 1 and USB 2.0 specifications. You can use this port for USB devices, too.
It also seems that no boards support USB4 as yet, due to a component shortage from TI.
Chipset+Intel® Thunderbolt™ 4 Controller:
  1. 2 x USB Type-C® ports on the back panel, with USB 3.2 Gen 2 support
 
Ask Intel? I honestly don't know, but TB has always had a lot of weird limitations and even if you look at the board makers disclaimers now, it seems like if you're using TB4, you can't use some PCIe devices that are part of the motherboard or you're limited to how many TB4 devices you can use?!

It also seems that no boards support USB4 as yet, due to a component shortage from TI.
I guess that might kind of make sense - the TB4 controllers probably share bandwidth with something? Or are we talking something weird and backwards like IRQ conflicts? Also: component shortage? What? In 2021? Who'd'a thunk it? :rolleyes:
 
I guess that might kind of make sense - the TB4 controllers probably share bandwidth with something? Or are we talking something weird and backwards like IRQ conflicts? Also: component shortage? What? In 2021? Who'd'a thunk it? :rolleyes:
I really don't know, I added the bit from the manual above, but it still doesn't clear things up.
The component shortage from TI was reported back in June and apparently couldn't be resolved so... :rolleyes:
In other words, first gen Z690 motherboards don't support USB4.
 
There is a typo on page 6 (Overclocking): at the end you say to wait until "September 4th". It's not that important, but it can cause some confusion about the release date.

We're not exactly sure how this works together with a "125 W" claim, but will know more on September 4th when our reviews go live.

The rest of this preview is accurate and easy to understand even for non-English people. "May the 4th" will confirm the results of this rivalry and bring interesting processors from both AMD and Intel to stores for a long time.
 
So, the Ryzen bug on Win11 produced great benchmarking results for Intel marketing to show off their new CPU line-up... Who 'd have thought it would happen eh? Let's hope for fair reviews then and may the best CPU win.
 
Best Buy:

[screenshot of a Best Buy listing]
 
Honestly, completely meh. The fact they used the bugged build speaks volumes about how well Alder Lake ACTUALLY performs in real-life scenarios.

The most interesting chip will likely once again be the lowest-end 6-core 12400, especially if it doesn't have any E-cores and is available well under $200. Everything else is overpriced tat.
 
Wait, that looks ... lower than what you can expect with a decent DDR4 kit currently?

AnandTech only tests with JEDEC-rated memory kits; they don't even enable XMP. That means when they test Zen 3 or Rocket Lake they use DDR4-3200 CL20, and he's going to stick with that for ADL. That's the number he's putting up.

I think it's interesting mostly in that it tells you what a typical OEM like HP or Dell will put in the box. Many Zen 3 / RKL / CML actually ship with crap like DDR4-2666, and those OEMs are 85% of the market, so it isn't wrong really. However IMO it makes his tests somewhat irrelevant for DIY, maybe ok as a baseline but more info needed.
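For a sense of how much those timings matter, first-word CAS latency is easy to work out by hand: the memory clock is half the MT/s rate, and latency is CAS cycles divided by that clock. A quick sketch (the kit rates and timings below are illustrative examples, not review data):

```python
# Back-of-envelope: first-word CAS latency in nanoseconds,
# to compare a JEDEC kit against a typical XMP kit.

def cas_latency_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    """CAS latency in ns: cycles / memory clock (MHz), where the
    clock is half the DDR transfer rate in MT/s."""
    clock_mhz = transfer_rate_mts / 2.0
    return cas_cycles / clock_mhz * 1000.0

print(cas_latency_ns(3200, 20))               # JEDEC DDR4-3200 CL20 -> 12.5 ns
print(round(cas_latency_ns(3600, 16), 2))     # XMP DDR4-3600 CL16  -> 8.89 ns
print(round(cas_latency_ns(4800, 40), 2))     # DDR5-4800 CL40      -> 16.67 ns
```

So a common DIY DDR4 kit is meaningfully quicker on first-word latency than both the JEDEC baseline and early DDR5, which is why JEDEC-only numbers can look pessimistic for enthusiast builds.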
 
I don't like its high power draw, but it will be the new performance king, and AMD has no answer until next year's fall?
Well, it looks like Intel's leaks were run on Windows 11 before the AMD patch, which reduces AMD performance by 15%, and AMD is releasing the V-cache chips in the first quarter of 2022, which they claim adds 15% gaming performance. Add those two together with the Windows 11 AMD patch, and the V-cache chips may be deleting Intel's gaming gains.
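Taking both quoted 15% figures at face value, the two effects stack multiplicatively rather than additively, so the combined swing is a bit more than 30%. A quick sanity check:

```python
# Combine successive fractional speedups multiplicatively
# (treating both claimed effects as ~15% speedups, per the post).

def combined_gain(*gains: float) -> float:
    """e.g. 0.15 and 0.15 -> ~0.3225, i.e. ~32% total."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

print(round(combined_gain(0.15, 0.15) * 100, 2))  # 32.25 (% total gain)
```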
 