Monday, July 27th 2020

Intel Rocket Lake CPUs Will Bring up to 10% IPC Improvement and 5 GHz Clocks

Intel is struggling with its node development, and it looks like next-generation consumer systems will be stuck on 14 nm for a while longer. In the meantime, Intel will finally break free from Skylake-based architectures and launch something new. The replacement for the current Comet Lake generation is set to be called Rocket Lake, and today we have obtained some more information about it. Thanks to popular hardware leaker rogame (_rogame), we know a few things about Rocket Lake. To start, Rocket Lake is known to feature Cypress Cove, a backport of the 10 nm Willow Cove core to 14 nm. According to the latest rumors, Cypress Cove is supposed to bring only a 10% IPC improvement.

With a 10% IPC improvement the company will at least offer a more competitive product than it currently does; however, it should still be much slower than 10 nm Tiger Lake processors, which feature the original Willow Cove design. This suggests that backporting a design doesn't just forfeit node benefits like a smaller die and lower heat output, but also means that only a fraction of the performance can be extracted. Another point rogame made is that Rocket Lake will boost up to 5 GHz, and that it will run hot, which is expected.
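As a rough back-of-the-envelope illustration (not part of the leak itself): single-thread performance scales approximately with IPC times clock speed, so the rumored figures can be sketched against a current-generation baseline. The 5.3 GHz baseline below is our assumption (an i9-10900K-class Comet Lake boost clock), and the numbers are rumored, not measured.

```python
# Rough single-thread scaling sketch: perf ~ IPC x clock.
# All figures are rumored/assumed, not benchmark results.

def relative_perf(ipc_gain, clock_ghz, base_clock_ghz=5.3):
    """Performance relative to a baseline chip with IPC 1.0 running at
    base_clock_ghz (assumed here: a 5.3 GHz Comet Lake i9-10900K boost)."""
    return (1.0 + ipc_gain) * clock_ghz / base_clock_ghz

# Rumored Rocket Lake: +10% IPC at a 5.0 GHz boost clock
print(f"{relative_perf(0.10, 5.0):.3f}")  # ~1.038, i.e. roughly 4% faster
```

In other words, a 10% IPC gain at a slightly lower boost clock would translate to only a small net single-thread uplift over Comet Lake under these assumptions.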
Source: _rogame

45 Comments on Intel Rocket Lake CPUs Will Bring up to 10% IPC Improvement and 5 GHz Clocks

#26
bug
R0H1TIn case you forgot the whole ultra-books was a concept literally copied from Mac-books & yes the chips were Intel's very own design but the drive towards portability & emphasis on efficiency & greater battery life was driven at first by Apple, say whatever you will about them but the smartphone revolution & yes portability (arguably efficiency as well) in the laptop space is their doing!
All the things you (rightfully) credit Apple for would have stayed on the drawing board if Intel hadn't built the chips to enable those designs. So I think you've got things a little backwards.
Posted on Reply
#27
TheUn4seen
R0H1TIn case you forgot the whole ultra-books was a concept literally copied from Mac-books & yes the chips were Intel's very own design but the drive towards portability & emphasis on efficiency & greater battery life was driven at first by Apple, say whatever you will about them but the smartphone revolution & yes portability (arguably efficiency as well) in the laptop space is their doing!
Apple is kind of brute-forcing the issue by putting as much battery in a laptop as they can and completely ignoring thermal design, pegging the CPU at 100°C. I mean, their new laptops are an orgy of throttling and CPU-burning, reducing the lifetime of both the electronics and said battery. I get it, they are just Facebook scrolling machines with a "pro" moniker to make scrolling social media feel like a valuable activity, but their design is not something to be called revolutionary.
Posted on Reply
#28
bug
TheUn4seen...they are just Facebook scrolling machines with a "pro" moniker to make scrolling social media feel like a valuable activity...
:roll::rockout:

This holds true for many ultrabooks, too, but spot-on, dude.
Posted on Reply
#29
InVasMani
When your fab node went to H-E-Double-L Intel drop it like it's hot..., when your CPU's costs cheddar, but secure like cheese of Swiss...drop it like it's hot..., When AMD's come back StoreMi precision corner pocket killing it Intel hot pocket like it's not...when Intel 14nm rollie like a Ford spectre meltdown to the core park it like it's hot...
Posted on Reply
#31
EarthDog
theoneandonlymrk10% , good luck facing down Ryzen 4th generation with that.
Good thing that is further down the pike. Zen 3 isn't released yet... but yeah... if this is all they have when Zen4 is released 2022 (rumors)...Oof.
Posted on Reply
#32
InVasMani
Unless AMD can finally crush Intel's slight edge at 720p/1080p for high refresh rate gamers, Intel will still sell plenty of these space heaters.
Posted on Reply
#33
TheoneandonlyMrK
EarthDogGood thing that is further down the pike. Zen 3 isn't released yet... but yeah... if this is all they have when Zen4 is released 2022 (rumors)...Oof.
I did say Ryzen and not zen?.
Posted on Reply
#34
EarthDog
theoneandonlymrkI did say Ryzen and not zen?.
You said.....
theoneandonlymrk10% , good luck facing down Ryzen 4th generation with that.
Zen (1st gen), Zen+, (2nd gen), Zen 2 (3rd gen), Zen 3 (4th generation).

Did I miss something (outside of my typo when I said 'Zen4')?

I missed something... don't mind me!

That said, I don't think it will have any issues with Zen 3 in clocks or IPC. It will, yet again, be within reach/faster due to clock speeds in non heavily multi-threaded benchmarks. It will still cost more and use more power. Rinse...repeat. :)
Posted on Reply
#35
TheoneandonlyMrK
EarthDogYou said.....
Zen (1st gen), Zen+, (2nd gen), Zen 2 (3rd gen), Zen 3 (4th generation).

Did I miss something (outside of my typo when I said 'Zen4')?

I missed something... don't mind me!

That said, I don't think it will have any issues with Zen 3 in clocks or IPC. It will, yet again, be within reach/faster due to clock speeds in non heavily multi-threaded benchmarks. It will still cost more and use more power. Rinse...repeat. :)
Yep I see how we got here ,and agree.
Posted on Reply
#36
techguymaxc
EarthDogWhat does enough mean? This takes the ipc crown back from amd (until zen3) and they are clocked a lot higher.

We all want more, and expect more... but this isn't a die shrink which brings with it even mkre native improvements. Understand what they are working with (regardless of how they got here).
"Enough" means good enough to earn my purchase. 10% over Skylake+++++++ isn't good enough for me to hand over my money, I would rather go with Zen 3 and have more cores, cache (and if the rumors are true) potentially more IPC to play with. Clockspeed, price, and availability would be the final determining factors for me. Intel hasn't been doing great on the last 2 fronts as of late.

Also, Zen 3 will be on the market before Rocket Lake so Intel would truly be in a position of attempting to retake the single-thread performance crown at that point. We haven't seen that scenario in 15 years.
londisteSunny Cove did bring that ~18% increase. Which makes the tweet this snippet is based on strange - Willow Cove only +10% over Skylake? How? Willow Cove is worse than Sunny Cove which was previous new core architecture?
I find the leaks suspicious in light of this as well.
Posted on Reply
#37
efikkan
londisteSunny Cove did bring that ~18% increase. Which makes the tweet this snippet is based on strange - Willow Cove only +10% over Skylake? How? Willow Cove is worse than Sunny Cove which was previous new core architecture?
We don't know much about Rocket Lake, but the cache seems to indicate it's related to Sunny Cove. If Rocket Lake is indeed a backported Sunny Cove, and not just Skylake with bits and pieces of Sunny Cove, the cache system should probably be the hardest to backport (since it's so timing sensitive). The cache and the front-end are the largest changes in Sunny Cove, so obtaining most of the IPC gain should be feasible. It's even conceivable that some newer additions from later designs could be incorporated too.

I find it disappointing that most other people in here are so gullible and believe pretty much any piece of "news" posted on Twitter etc. Nobody outside Intel has access to these chips yet to do a proper test. So regardless of what the actual IPC gains are, this piece of news is bogus.
Posted on Reply
#38
techguymaxc
efikkanWe don't know much about Rocket Lake, but the cache seems to indicate it's related to Sunny Cove. If Rocket Lake is indeed a backported Sunny Cove, and not just Skylake with bits and pieces of Sunny Cove, the cache system should probably be the hardest to backport (since it's so timing sensitive). The cache and the front-end are the largest changes in Sunny Cove, so obtaining most of the IPC gain should be feasible. It's even conceivable that some newer additions from later designs could be incorporated too.

I find it disappointing that most other people in here are so gullible and believes pretty much any piece of "news" posted on Twitter etc. Nobody outside Intel have access to these chips yet to do a proper test. So regardless of what the actual IPC gains are, this piece of news is bogus.
At this point one can only be said to be gullible re: Intel news if one believes in a positive outcome. Intel has failed to execute at every turn for the last several years.
Posted on Reply
#39
R0H1T
bugAll the things you (rightfully) credit Apple for would have stayed on the drawing board if Intel didn't build the chips to enable those designs. So I think you've got things a little backwards.
Can't agree with that; it's like saying that if Crysis hadn't been made, you wouldn't have had the race to the top in high-end GPUs to drive it, not to mention the (in)famous "Can it run Crysis?" :ohwell:

Which is to say that GPUs would've reached that level of performance eventually, but Crysis led & drove a race to the top, much like what Vista & Apple forced. So if you're saying that without Intel we'd never have had that kind of performance or efficiency leap, then you're obviously forgetting the entry of ARM as well. Around the same time Intel decided to enter the mobile & tablet arena, with Bay Trail IIRC, so again Intel was led there, & we'd have had great battery life & ultra portability, maybe with ARM-driven notebooks instead of Intel's o_O

The point being Intel has not been innovating over the last decade or so, they have mostly been followers rather than leaders in the tech arena. Their biggest leap or achievement of the last decade IMO would be Sandy Bridge.
Posted on Reply
#40
wickerman
I wonder if the issue all along hasn't just been bad management of talent at Intel over the last decade or so. Intel used to be the giant in silicon design and lithography... and if you wanted to be a rock star engineer, it was the only company you'd consider working for. But look at the giants that have emerged around ARM. Apple loves hiring rock stars, and suddenly they are designing the fastest silicon in mobile and want to make the ARM-powered laptop/desktop a reality, which everyone wants but nobody has really taken THIS seriously. And TSMC/Samsung fabs are doing crazy volume and pay top dollar to make that profitable.

There's no doubt Intel has world-class engineers at every level of their business... but the fab guys clearly dropped the ball, and whether the design guys can make up for that is one question... but someone has to point out the crater left by that earlier ball dropping. I think Intel has to take a close look at whether they can survive being both a top-tier designer and fabricator. Maybe one division has outgrown the other. Top-tier design needs the latest and greatest fabs; maybe that's not in-house anymore. Hell, maybe this leads to high-end Intel CPUs being made at TSMC, and Apple winds up partnering with Intel's fabs to make their ARM-on-laptop/desktop plans work. There's no other competitor in ARM in that product stack, so maybe Apple wouldn't need to push for the latest node there. In this political climate, Apple is big enough to want to make something stateside, and Intel has 5 fabs here, and Apple has the money to invest in them. So maybe it's time for these two divisions inside Intel to start relying less on each other.

It's a fun thought at least..
Posted on Reply
#41
R0H1T
wickermanTheres no other competitor in ARM in that product stack so maybe Apple wouldn't need to push for the latest node there.
That's an interesting point & definitely something which might happen should Intel be willing to open their fabs to serious businesses outside their x86 walled garden. I'm sure in a decade or so, again assuming Intel's willing to change or mend their ways, it can become a reality!
Posted on Reply
#42
londiste
R0H1TThat's an interesting point & definitely something which might happen should Intel be willing to open their fabs to serious businesses outside their x86 walled garden. I'm sure in a decade or so, again assuming Intel's willing to change or mend their ways, it can become a reality!
Intel's foundries being in-house have almost nothing to do with the x86 walled garden. They produce a more diverse set of stuff beyond CPUs. In-house foundries are their business model, and Intel has been supply-constrained pretty constantly; not to the extent of the last few years, but there have always been bits and pieces (like chipsets) that they outsource to other foundries.
wickermanTheres no other competitor in ARM in that product stack so maybe Apple wouldn't need to push for the latest node there. In this political climate, Apple is big enough to want to make something stateside and intel has 5 fabs here and Apple has the money to invest in them. So maybe its time for these two divisions inside intel to start relying less on each other.
The 8cx is the first thing that comes to mind? If that market takes off, others will follow. These (and Apple's) SoCs are AFAIK still produced on mobile-oriented low-power nodes for maximum efficiency. They can always sacrifice some density and power efficiency on the altar of performance if they so wish :)
Posted on Reply
#43
efikkan
wickermanMaybe one division has outgrown the other. Top tier design needs the latest and greatest fabs, maybe thats not in house anymore.
I think the solution to that is hedging their bets by adapting designs to multiple nodes. Intel have started to do that going forward, but to my knowledge still only targeting their own nodes.

As of right now, TSMC would be the only other foundry capable of producing high-performance CPUs, but their high-power production lines are all booked up. Intel could probably only get a few thousand wafers per month; not enough to make server chips, but it could have been "enough" to cover a couple of K-models in the upper mainstream.

But going forward, dropping their own foundries is not an option for Intel. Even TSMC don't have nearly enough production lines optimized for high-power chips. So if Intel were to primarily rely on other foundries, they would have to reserve significant capacity like 4-5 years in advance.

While it is clear to most that Intel didn't put enough resources into R&D of 10nm, Intel also failed to build enough production capacity. Now that the yield issues are resolved, Intel still can't push nearly enough wafers to meet demand. This is partly due to higher product demand, but also because lithography takes more time than anticipated. Intel is in no short supply of money, so I think they should double down on foundries and design their chips to target multiple (even external) nodes. The potential lost revenue is way more than the cost of doing so, and they can always sell spare capacity.
wickermanHell maybe this leads to high end intel cpus being made at TSMC and Apple winds up partnering with intel's fabs to make their ARM on laptop/desktop work. Theres no other competitor in ARM in that product stack so maybe Apple wouldn't need to push for the latest node there. In this political climate, Apple is big enough to want to make something stateside and intel has 5 fabs here and Apple has the money to invest in them. So maybe its time for these two divisions inside intel to start relying less on each other.

It's a fun thought at least..
Intel have not been afraid to work with ARM in the past. In fact, a while back 14nm supplies were limited because a portion of the capacity was reserved for modems for Apple, and Intel had apparently assumed those would have moved to 10nm by then.
Posted on Reply
#44
londiste
efikkanWhile it is clear to most that Intel didn't put enough resources into R&D of 10nm, Intel also failed to build enough production capacity. Now that the yield issues are resolved, Intel still can't push nearly enough wafers to meet demand. This is partly due to higher product demands, but also because lithography takes more time than anticipated. Intel is in no short supply of money, so I think they should double-down on foundries and design their chips to target multiple (even external) nodes. The potential lost revenue is way more than the cost of doing so, and they can always sell spare capacity.
Based on Intel's financials, they did put a lot of resources into foundry R&D. Money might not solve whatever issues they ran into.

But production capacity is likely not about a failure to build but a planning question. 10nm - even with a delayed 7nm - is going to have a short lifespan. Intel kept 14nm running as much as possible (bringing money in, causing shortages) and, from rumors and some of their own statements, actually wants to skip 10nm in as many fabs as it can. Refitting from 10nm to 7nm might be faster, but refitting from 14nm to 10nm also takes considerable time. I cannot say whether this is a good strategy; if they actually succeed in getting a competitive 7nm process done, it probably is.
Posted on Reply
#45
efikkan
londisteBased on Intel financials they did put a lot of resources into foundry R&D. Money might not solve whatever the issues they ran into was.
They didn't put enough resources (money and manpower) on it early enough.
londisteBut production capacity is likely not about failure to build but a planning question. 10nm - even with delayed 7nm - is going to have a short lifespan. Intel kept 14nm running as much as possible (bringing money in, shortages) and from rumors and some of their statements actually wants to skip 10nm in as many fabs as it can. Refitting from 10nm to 7nm might be faster but refitting from 14nm to 10nm also takes considerable time. I cannot say whether this is a good strategy. If they actually succeed in getting competitive 7nm process done it probably is.
Most production lines are kept active for 10+ years, long after they are used in top CPUs. 10nm will serve as Intel's main node from 2021 to 2023/2024; after that it will be used for chipsets, NICs, and all kinds of chips for third parties.

I believe it was about two years ago that Intel launched another low-power optimized version of their 22nm family. 22nm and 14nm will be kept online for many years to come, as will 10nm and 7nm. Most production lines like this are kept until they are no longer functional, as most of their cost is in the equipment tied to that node. And the demand for cost-effective nodes from third parties is only growing, so Intel can probably recoup some of their 10nm expenses there.
Posted on Reply