Wednesday, September 20th 2023

Intel Demoes Core "Lunar Lake" Processor from Two Generations Ahead

Intel at the 2023 Innovation event surprised audiences with a live demo of a reference notebook powered by a Core "Lunar Lake" processor. What's surprising about this is that "Lunar Lake" won't come out until 2025 (at least), and succeeds not just the upcoming "Meteor Lake" architecture, but also its successor, "Arrow Lake," which debuts in 2024. Intel is expected to debut "Meteor Lake" some time later this year. What's also surprising is that Intel has proven that the Intel 18A foundry node works. The compute tile of "Lunar Lake" is expected to be based on Intel 18A, a node four generations ahead of the current Intel 7, with Intel 4, Intel 3, and Intel 20A in between.

The demo focused on the generative AI capabilities of Intel's third-generation NPU, the hardware backend of AI Boost. Using a local session of a tool similar to Stable Diffusion, the processor was made to generate an image of a giraffe wearing a hat, and a GPT-style program was made to pen the lyrics of a song in the style of Taylor Swift from scratch. Both tasks were completed on stage using the chip's NPU, and in timeframes you'd normally expect from discrete AI accelerators or cloud-based services.
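For a sense of what such a local text-to-image session looks like in practice, here is a minimal sketch using the open-source diffusers library. It is purely illustrative: Intel has not named the tool used on stage, the demo targeted the Lunar Lake NPU through Intel's own software stack rather than PyTorch, and the checkpoint, device selection, and step count below are assumptions rather than details of the demo.

```python
# Minimal local text-to-image sketch (illustrative only; not the demo software).
import torch
from diffusers import StableDiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # assumed checkpoint, not what Intel used
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

# The prompt matches the on-stage example from the article.
image = pipe("a giraffe wearing a hat", num_inference_steps=25).images[0]
image.save("giraffe_with_hat.png")
```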
Source: HotHardware

62 Comments on Intel Demoes Core "Lunar Lake" Processor from Two Generations Ahead

#26
HD64G
Just a marketing move to buy time with investors until things get better, since for the next year or two the mirror on Intel's vehicle will need to be flipped forward to find AMD.
Posted on Reply
#27
BlaezaLite
I'm not worried about the guy with the hat, who looks like he might be going down the farm. I'm scared of Gollum on the left, rubbing his hands together... My 3DVCache...

Intel was always going to use a technology that has proven to be so successful. They'd be stupid not to. Who knows, it might be so good that I go back to Intel in 2 or 3 years.
Posted on Reply
#28
AusWolf
theoutoDon't you worry my friend, they are incredibly efficient at doing that, so much so that you aren't even notified whether that is happening or not! Man, I love having the insurance that the GDPR provides; I feel bad for countries that don't have it.
I'm not sure GDPR offers 100% protection. What if these companies steal our data without us ever knowing about it? Or what if they put a clause in the T&Cs that allows them to collect "some" data for "diagnostic purposes"?
Posted on Reply
#29
theouto
AusWolfI'm not sure GDPR offers 100% protection. What if these companies steal our data without us ever knowing about it? Or what if they put a clause in the T&Cs that allows them to collect "some" data for "diagnostic purposes"?
If we find proof of them housing our private data without consent, we hold the right to have it removed; if they don't, then they are legally liable.

commission.europa.eu/law/law-topic/data-protection/reform/rights-citizens/my-rights/can-i-ask-company-delete-my-personal-data_en

Maybe they'd have to just nuke your whole account, but they are legally liable if they don't comply
Posted on Reply
#30
AusWolf
theoutoIf we find proof of them housing our private data without consent, we hold the right to have it removed; if they don't, then they are legally liable.
That's my point - there's that huge "if". Personally, I have no device or knowledge to get proof of these companies housing any data about me. GDPR gives you the rights, but doesn't give you the means.
Posted on Reply
#31
Vayra86
Ferrum MasterThere is an etiquette for doing public appearances, talks, training. Especially for public showcases. Your goal is to be respectful to everyone and not offend anyone. This clearly does.

This is a message that Intel isn't capable of picking the right people for the right tasks again. Not to mention, again, that this whole show is about nothing.

Welcome to the world; it ain't a small sandbox of your values, you have to consider others, not only yourself. If something is amiss, maybe pass the torch to someone else for public stunts.
This goes both ways you know
You are at liberty to feel however you want to feel about this guy's hat. He's at liberty to make a statement, whatever that is here, btw, I don't know... by wearing a hat. Maybe next time someone wears a My Little Pony jumpsuit, it would make that presentation radically different too. Compare how you would respond to that, relative to this hat, and maybe things will clear up a little.

I don't feel wholly different from you when someone wears a hat indoors, but yeah. Reasons, complicated, social interactions... it's good to tread carefully, because if we reflect on it, all that is, is what we've been taught; whether it's really relevant is another question. In other cultures perhaps hat wearing isn't frowned upon so much, and since corporate is mighty inclusive these days, well.
Posted on Reply
#32
Squared
lasI'd love to see Arrow Lake on 20A next year. If not I am going Zen 5. I might do anyway, depending on performance and price.

What worries me about AMD is they don't really have an overall good product. 3D chips are good for gaming, non-3D chips are good for applications. Intel has overall better performance across the board. It might lose slightly in gaming (compared to the 7800X3D), but even the 13600K destroys the 7800X3D outside of gaming, is my point.

I really really hope that AMD will do 3D cache on all cores for 8000 3D chips. Not only one CCD like 7900X3D and 7950X3D and hopefully they can bump up clockspeeds as well.

7800X3D is great for gaming but really not that great outside of gaming, meaning 7700X performance or even less. 7800X3D can beat 7950X3D in some games, showing why you don't want a half solution.

Give me 8950X3D with 3D cache on all 16 cores and I will probably buy that, unless Arrow Lake delivers a big step forward (which it should).

If Intel truly has 20A/18A ready in 1-2 years, then AMD might be in trouble. Ryzen has mostly been a success because AMD had a node advantage (the Ryzen 1000 and 2000 series on 14nm/12nm GloFo kinda sucked), and AMD is 100% reliant on TSMC to deliver improvements (sadly for AMD, TSMC prioritizes Apple), but without Apple's money, TSMC would not be where they are today.
AMD's Zen 5 might change things a bit. I think Zen 4 enhanced Zen less than Zen 3, and 3 less than 2. But Zen 5 is the biggest redesign since Zen 1, so its capabilities should be pretty different from Zen 4.
Posted on Reply
#33
R0H1T
We'll see about that; generally AMD's delivered on their promises with Zen, but they also have a history of over a decade of going nowhere!
Posted on Reply
#34
Squared
usinameActually, Intel 7 (10nm) was released over 5 years ago (May 2018)
Also, running a low-power NPU tile is not proof of a working node. Cannon Lake, released in 2018, was working, yes, with the performance level of a mobile Celeron, but it worked
I don't know, Cannon Lake was a really limited release and the iGPU was broken. I get the impression that Intel didn't consider it ready but sold it to a very small number of customers just so Intel could claim to have delivered on a release schedule that Intel had promised to investors. Ice Lake/Sunny Cove (10nm+) was a much better release, although even it was only released to the low-power laptop market. This table shows the release order and node:
platform | µarchitecture | node | market
Cannon Lake | Cannon Lake? | 10nm | ?
Ice Lake | Sunny Cove | 10nm+ (renamed 10nm) | 15W laptops
Tiger Lake | Willow Cove | 10nm++ (renamed 10SF) | laptops
Ice Lake SP | Sunny Cove | 10? | servers
Alder Lake and Sapphire Rapids | Golden Cove (+Gracemont in Alder Lake) | 10ESF (renamed Intel 7) | all Intel CPU markets


Demoing Lunar Lake doesn't make sense to me. TPU says this means Intel 18A is ahead of schedule, but AnandTech's take is that Lunar Lake was moved to Intel 20A. So it's not clear if Intel is demoing good news or bad news. Moreover, the products Intel claims we should be looking forward to are Meteor Lake and Arrow Lake. Why give attention to something even further away from release, unless Meteor Lake and Arrow Lake will be disappointing? AMD did demo Zen a long time ahead of release, but the Excavator µarch was already available and widely known to be inadequate.
Posted on Reply
#35
evernessince
lasI'd love to see Arrow Lake on 20A next year. If not I am going Zen 5. I might do anyway, depending on performance and price.

What worries me about AMD is they don't really have an overall good product. 3D chips are good for gaming, non-3D chips are good for applications. Intel has overall better performance across the board. It might lose slightly in gaming (compared to the 7800X3D), but even the 13600K destroys the 7800X3D outside of gaming, is my point.
The 13600K is 6% faster in applications and 15% slower in games. How in the world is 6% faster "dominating" while 15% slower in games is "losing slightly"? That's a hell of a double standard. That's before you consider the 7800X3D consumes half the power.

Certain Intel SKUs provide more MT performance for your dollar but to say that Intel performs better all across the board is factually inaccurate. The 7950X matches the 13900K in application performance. The 7950X3D does as well, while consuming a fraction of the power. The 7600 beats the 13400f (both same price) in application performance by 1% and by 14% in gaming.

Then consider that AMD is dominating in server and winning laptop marketshare. I don't think your analysis of AMD's current position is accurate at all given the facts. Intel is moving towards a chiplet architecture precisely because of the advantages AMD has demonstrated. Intel makes sense when they provide a value proposition, as in more MT for your dollar, but otherwise if you are buying them because you think they provide the best performance (bar niche situations where you purchase for specific applications) you are only fooling yourself. In terms of application performance either brand is competitive and you should get whichever provides the most bang for your buck.
Posted on Reply
#36
DavidC1
SquaredDemoing Lunar Lake doesn't make sense to me. TPU says this means Intel 18A is ahead of schedule, but AnandTech's take is that Lunar Lake was moved to Intel 20A. So it's not clear if Intel is demoing good news or bad news. Moreover, the products Intel claims we should be looking forward to are Meteor Lake and Arrow Lake. Why give attention to something even further away from release, unless Meteor Lake and Arrow Lake will be disappointing? AMD did demo Zen a long time ahead of release, but the Excavator µarch was already available and widely known to be inadequate.
If Lunar Lake is on an Intel process, Intel would have said it. It's on TSMC N3. Intel has already said the generation after that (Panther Lake) is on 18A. Why would they keep the older chip's secrets?

I get that both TPU and AT editors want to keep speculation to a minimum, but this is taking it way too conservatively.

18A was only demoed as a test chip and nothing else. And wafers.
Posted on Reply
#37
R-T-B
Ferrum MasterThere is an etiquette for doing public appearances, talks, training. Especially for public showcases. Your goal is to be respectful to everyone and not offend anyone. This clearly does.
So far I only see one person offended honestly. I personally was not even aware this was a thing.

I view hats more as a style choice than a statement or anything.
Posted on Reply
#38
DavidC1
pjl321As for the manufacturing nodes, Intel 7 has been around for ages, Intel 4 and Intel 3 are basically the same node (Intel 4 is what Intel is exclusively using to iron out the bugs for Intel 3, which has very small tweaks to its libraries, and Intel is opening this up to all its partners), and it's the exact same thing for Intel 20A & 18A. So in reality, Intel only really has 2 new nodes since Intel 7, which came out over a year and a half ago.
Intel 3 gets an 18% transistor performance improvement and 18A gets 10%. They didn't say for 18A, but Intel 3 also gets density improvements for high-performance libraries, so it isn't exactly a small tweak.

18% gain is a full node worth of improvement. Intel 4 is 20% over Intel 7.

I'd say Intel 3 is a 0.75 node gain. Again, don't know enough about 18A.
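For a rough feel of how those per-node figures stack up, here is a quick back-of-the-envelope multiplication. It simply takes the transistor-performance percentages quoted above at face value; this is plain arithmetic for illustration, not official Intel data.

```python
# Compound the quoted per-node transistor performance gains (illustrative only).
intel7_to_intel4 = 1.20   # "Intel 4 is 20% over Intel 7"
intel4_to_intel3 = 1.18   # "Intel 3 gets an 18% transistor performance improvement"
node20a_to_18a   = 1.10   # "18A gets 10%" over 20A

print(f"Intel 7 -> Intel 3: {intel7_to_intel4 * intel4_to_intel3:.2f}x")  # ~1.42x
print(f"Two 'full' 20% steps for comparison: {1.20 * 1.20:.2f}x")         # 1.44x
```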
Posted on Reply
#39
Squared
DavidC1If Lunar Lake is on an Intel process, Intel would have said it. It's on TSMC N3. Intel has already said the generation after that (Panther Lake) is on 18A. Why would they keep the older chip's secrets?

I get that both TPU and AT editors want to keep speculation to a minimum, but this is taking it way too conservatively.

18A was only demoed as a test chip and nothing else. And wafers.
That would be an interesting turn of events. Arrow Lake is scheduled a year ahead of Lunar Lake, and is made with Intel 20A. The names imply that Intel 3 will be equivalent to TSMC N3. This does seem a little unlikely, because Intel 3 is only a half node beyond Intel 4, which is probably roughly on par with TSMC N4, but TSMC N3 is a full node ahead of N4. However, Intel claims 20A will be a full node ahead of Intel 3, so it should match or beat TSMC N3, especially for a clock-optimized CPU. So Arrow Lake's successor, Lunar Lake, ought to be built with Intel 20A or 18A, and putting Lunar Lake on TSMC N3 would be very bad news.
Posted on Reply
#40
Wye
ZoneDymoHonestly, your sense of "basic manners" comes awfully close to oppression tbh.
If he was in a wheelchair, would he have to prove to you that he can't walk?
Are you one of those who says Jamie Foxx is just faking it?

Or can we leave people with a bit of dignity and assume its for a reason.

Heck, what do basic manners say about some sort of disgusting wound or malady on the head? Expose it to all so they know there is a reason to cover it up... and then cover it up?
Or hell, maybe it's because of a general lack of basic manners from others that made him self-conscious enough to keep the hat on, like something as simple as going bald.

Anywho, yeah, idk man, you have an odd point of view, but that is just my opinion, equally valuable/meaningless.
lol two idiots arguing on the internet about wearing hats. :roll:
Posted on Reply
#41
RandallFlagg
DavidC1Intel 3 gets an 18% transistor performance improvement and 18A gets 10%. They didn't say for 18A, but Intel 3 also gets density improvements for high-performance libraries, so it isn't exactly a small tweak.

18% gain is a full node worth of improvement. Intel 4 is 20% over Intel 7.

I'd say Intel 3 is a 0.75 node gain. Again, don't know enough about 18A.
It's high power vs low power. Intel 4 is a high power node, Intel 3 is a low power node (with other enhancements).

TSMC's naming convention has been a nice marketing coup; some of their names merely denote an updated node for different power ranges. N5/N4 comes to mind.

AFAIK all of Intel's prior nodes would be considered high power, and in fact one of the IDM (foundry) nodes is "Intel 16" which is a reworked 22nm for low power. Low power usually means more dense, and from a desktop user standpoint would be 'very low' power - like sub 10W. Intel 4 / 3 are comparable to TSMC N3 / N3E.

I think 20A and 18A are going to be the same thing, 20A is high power for internal use and 18A will be low power for foundry partners. If they actually deliver product in 2024 on these nodes, they'll be ahead of TSMC by at least 3 quarters.

Even Apple is not using N2 in 2024, and may not use it until 2026.
Posted on Reply
#42
DavidC1
RandallFlaggIt's high power vs low power. Intel 4 is a high power node, Intel 3 is a low power node (with other enhancements).
I don't know if you didn't know this already or just failed to mention it.

The Intel 4 process doesn't have enough libraries and features to build I/O chipsets at all. Intel 3 does. Also, Intel 4 is only HP, while Intel 3 is available in both HP and LP. While they did not say it for 20A, the assumption is that they'll follow the same convention as with 4/3.

The HP and LP distinction is also more subtle than most people think. Yes, you can have a 5W chip on HP, but it'll be more optimal on LP. LP transistors sacrifice clock speeds for lower leakage. Counterintuitively, LP products actually use more power at the same frequency than the same design on HP; you don't get anything for free. Leakage becomes dominant for battery life. But for the super high frequencies of desktops, it's beneficial to have better MHz/W, which is what HP gives you, at the cost of higher leakage, which doesn't matter since you aren't running on battery, and who cares about a few W when you have a 1000W system?
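To make that point concrete, here is a toy dynamic-plus-leakage power model. Every number in it (capacitance, voltages, leakage currents) is an invented, illustrative assumption, not real library data.

```python
# Toy model: total power = dynamic switching power (C * V^2 * f) + leakage (V * I_leak).
# All figures are made up for illustration.

def power_w(c_eff_f, v, f_hz, i_leak_a):
    dynamic = c_eff_f * v**2 * f_hz   # switching power
    leakage = v * i_leak_a            # static power
    return dynamic, leakage, dynamic + leakage

C_EFF = 1.0e-9   # effective switched capacitance (F), assumed equal for both libraries
F     = 3.0e9    # 3 GHz target clock

# HP cells: hit 3 GHz at lower voltage but leak more.
hp = power_w(C_EFF, v=0.80, f_hz=F, i_leak_a=1.0)
# LP cells: need more voltage for the same 3 GHz but leak far less.
lp = power_w(C_EFF, v=0.95, f_hz=F, i_leak_a=0.3)

for name, (dyn, leak, total) in (("HP", hp), ("LP", lp)):
    print(f"{name}: dynamic {dyn:.2f} W + leakage {leak:.2f} W = {total:.2f} W")
# With these made-up numbers HP wins at 3 GHz (better MHz/W), while at idle or low
# clocks the LP library's tiny leakage dominates the comparison - the battery-life case.
```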
Even Apple is not using N2 in 2024, and may not use it until 2026.
N2's mass production is 2025 so you literally can't have a product until 2026.
Posted on Reply
#43
JustBenching
evernessinceThe 13600K is 6% faster in applications and 15% slower in games. How in the world is 6% faster "dominating" while 15% slower in games is "losing slightly"? That's a hell of a double standard. That's before you consider the 7800X3D consumes half the power.

Certain Intel SKUs provide more MT performance for your dollar but to say that Intel performs better all across the board is factually inaccurate. The 7950X matches the 13900K in application performance. The 7950X3D does as well, while consuming a fraction of the power. The 7600 beats the 13400f (both same price) in application performance by 1% and by 14% in gaming.

Then consider that AMD is dominating in server and winning laptop marketshare. I don't think your analysis of AMD's current position is accurate at all given the facts. Intel is moving towards a chiplet architecture precisely because of the advantages AMD has demonstrated. Intel makes sense when they provide a value proposition, as in more MT for your dollar, but otherwise if you are buying them because you think they provide the best performance (bar niche situations where you purchase for specific applications) you are only fooling yourself. In terms of application performance either brand is competitive and you should get whichever provides the most bang for your buck.
What a load of....not accurate facts. Wow.

The bias is evident. AMD is dominating servers (Intel has what, 80% market share?) and gaining traction in mobile (again, Intel has what, more than 80% market share?). ROFL

And yeah, a 13600K slaughters the 7700X or the 7800X3D in MT tasks. It's 2 generations ahead.
Posted on Reply
#44
las
evernessinceThe 13600K is 6% faster in applications and 15% slower in games. How in the world is 6% faster "dominating" while 15% slower in games is "losing slightly"? That's a hell of a double standard. That's before you consider the 7800X3D consumes half the power.

Certain Intel SKUs provide more MT performance for your dollar but to say that Intel performs better all across the board is factually inaccurate. The 7950X matches the 13900K in application performance. The 7950X3D does as well, while consuming a fraction of the power. The 7600 beats the 13400f (both same price) in application performance by 1% and by 14% in gaming.

Then consider that AMD is dominating in server and winning laptop marketshare. I don't think your analysis of AMD's current position is accurate at all given the facts. Intel is moving towards a chiplet architecture precisely because of the advantages AMD has demonstrated. Intel makes sense when they provide a value proposition, as in more MT for your dollar, but otherwise if you are buying them because you think they provide the best performance (bar niche situations where you purchase for specific applications) you are only fooling yourself. In terms of application performance either brand is competitive and you should get whichever provides the most bang for your buck.
Depends on the game. The 13600K beats the 7800X3D in Starfield (funny, since it's AMD sponsored).
Besides, the 13600K is way cheaper than the 7800X3D, and the true comparison should be the 13700K anyway (soon the 14700K).

Also, you can boost Intel gaming performance a lot with memory faster than 6000 MHz. You can't on Ryzen 7000, since 6000 MHz is the sweet spot there, and going above won't change much and might even lower performance. Even AMD said 6000 MHz is the sweet spot.

SquaredAMD's Zen 5 might change things a bit. I think Zen 4 enhanced Zen less than Zen 3, and 3 less than 2. But Zen 5 is the biggest redesign since Zen 1, so its capabilities should be pretty different from Zen 4.
It can be both good and bad, because Zen 5 might need months of optimization and fixes before it actually works "great".

I want Zen 5 (pref. 3D models) or Arrow Lake next.

It is going to be a good fight in 2024.
Posted on Reply
#45
theouto
Honestly, using Starfield to compare anything is something I wouldn't recommend doing; that game is not the most technically sound, by any metric
Posted on Reply
#46
Assimilator
Ferrum MasterPretty weird to look at a well-paid, educated person who wears a hat indoors. If he doesn't care about etiquette (and many do), he is publicly on stage with many people who might find it offensive, as those really are basic manners and showing respect to others.
If you are the only person so offended by whether someone wears a hat indoors or not that you have to post about it... the problem just may be you, not that person.
Ferrum MasterProve or bust, it ain't Reddit here. Basic manners are basic manners.
Basic manners? You mean, like not posting petty criticism of someone for their choice of apparel? Never ceases to amaze me how the people quickest to take offense at completely nonsensical things, are also the biggest hypocrites.
theoutoHonestly, using Starfield to compare anything is something I wouldn't recommend doing; that game is not the most technically sound, by any metric
Some would argue it's not technically even a game :p
Posted on Reply
#47
theouto
I genuinely have no idea what's so bad about anyone wearing a hat indoors. How is it seen as rude or lacking manners? It's just a hat, not a loaded gun.
Posted on Reply
#48
Vayra86
theoutoI genuinely have no idea what's so bad about anyone wearing a hat indoors. How is it seen as rude or lacking manners? It's just a hat, not a loaded gun.
You could hide a weapon under a hat. Be careful what you wish for!

Hats off for this topic tho
Posted on Reply
#49
PapaTaipei
Comparisons are always a matter of perspectives
Posted on Reply
#50
Upgrayedd
Ferrum MasterProve or bust, it ain't Reddit here. Basic manners are basic manners.
Pat is wearing jeans, sneakers, and a graphic t-shirt. Yet you want to complain about the guy who actually put a little effort into what he's wearing.

What a joke...
Posted on Reply