# Intel 7nm CPUs Delayed by a Year, Alder Lake in 2H-2021, Other Commentary from Intel Management



## btarunr (Jul 24, 2020)

Intel's silicon fabrication woes continue to torment the company's product roadmaps, with the company disclosing in its Q2-2020 financial results release that its first CPUs built on the 7 nanometer silicon fabrication node are delayed by a year, following a further 6-month slip from prior expectations. The company will focus on getting its 10 nm node up to scale in the meantime.

The company mentioned that the 10 nm "Tiger Lake" mobile processor and "Ice Lake-SP" enterprise processor remain on track for 2020. The 12th Generation Core "Alder Lake-S" desktop processors won't arrive before the second half of 2021. In the meantime, Intel will launch its 11th Gen Core "Rocket Lake" processors on the 14 nm node, but with increased IPC from the new "Cypress Cove" CPU cores. Also in 2H-2021, the company will launch its "Sapphire Rapids" enterprise processors, which come with next-gen connectivity and updated CPU cores.



 




It's interesting to note that Intel was specific about "CPU" when talking about 7 nm, suggesting that Intel's foundry woes only affect its CPU product stack; not a word was said in the release about the company's discrete GPU and scalar compute processors that are being prototyped and validated. This is probably the biggest hint we'll ever get from Intel that the company's dGPUs are being designed for third-party foundries (such as Samsung or TSMC), and that the Xe dGPU product roadmap is disconnected from that of Intel's fabs.



> Intel is accelerating its transition to 10 nm products this year with increasing volumes and strong demand for an expanding lineup. This includes a growing portfolio of 10 nm-based Intel Core processors with "Tiger Lake" launching soon, and the first 10 nm-based server CPU "Ice Lake," which remains planned for the end of this year. In the second half of 2021, Intel expects to deliver a new line of client CPUs (code-named "Alder Lake"), which will include its first 10 nm-based desktop CPU, and a new 10 nm-based server CPU (code-named "Sapphire Rapids"). The company's 7 nm-based CPU product timing is shifting approximately six months relative to prior expectations. The primary driver is the yield of Intel's 7 nm process, which, based on recent data, is now trending approximately twelve months behind the company's internal target.



Intel's post-results call also revealed a handful of interesting tentative dates. For starters, "Tiger Lake" is shipping in "a matter of weeks," indicating an imminent launch ahead of the "Back to School" shopping season. Next up, the company's high-performance scalar compute processor, codenamed "Ponte Vecchio," remains slated for 2021-22, and given that it's reportedly being designed for 7 nm, we have our next big hint that these dGPUs will be built at third-party 7 nm fabs. Intel did mention that its Foveros packaging technology could be further developed over the years, and that its upcoming discrete GPUs could combine dies (tiles) from multiple sources, which could include its own fabs.

Given the delays to Intel's 7 nm foundry node, the first Intel client-segment processors based on the node won't arrive before late 2022 or 2023, which means refinements of the current 10 nm silicon fabrication node should support Intel's client-segment product stack for the foreseeable future. The first enterprise 7 nm processors will arrive by the first half of 2023. Intel also mentioned that it expects to see "one full node improvement" from a refined 10 nanometer process, which isn't surprising given how much experience it has improving its 14 nanometer process.

*View at TechPowerUp Main Site*


----------



## mtcn77 (Jul 24, 2020)

Intel still owns the BEST chip-to-chip communication, a.k.a. EMIB. They could design the worst chips and still win due to communication. It is just like Nvidia's mobile-IP-ported GPUs. They have more communication, so all is dandy.

It is just the same with Samsung's entry into 3D NAND. They took a 40 nm process and made history. Everybody knows about the 860; nobody knows what came before it, even though it came on a node two generations newer.


----------



## Crackong (Jul 24, 2020)

For the 7 nm GPU, TomsHardware says in their version of the article:



> The company will also use external third-party foundries for its forthcoming 7nm Ponte Vecchio GPUs, the company's first graphics chips. Swan noted the GPUs will come in late 2021 or early 2022, portending a delay beyond the original schedule for a 2021 launch in the exascale Aurora supercomputer.


----------



## mtcn77 (Jul 24, 2020)

Could Intel compete in a packaging scenario, TSMC CoWoS vs. Intel EMIB? That is what I'm guessing at.


----------



## Verpal (Jul 24, 2020)

Let me guess......

10nm+++ that boosts up to 5.5 GHz, with a big.LITTLE design that chews tons of power but is miraculously competitive against 5nm EUV.

Maybe Intel should just consider shipping bare-die locked CPUs with the cooler soldered on top; at least users won't complain about temperature.

Let's hope 10nm+++ won't happen. Intel can't be this stubborn; they seem to realize AMD's threat already.


----------



## X71200 (Jul 24, 2020)

mtcn77 said:


> Intel still owns the BEST chip-to-chip communication, a.k.a. EMIB. They could design the worst chips and still win due to communication. It is just like Nvidia's mobile-IP-ported GPUs. They have more communication, so all is dandy.
> 
> It is just the same with Samsung's entry into 3D NAND. They took a 40 nm process and made history. Everybody knows about the 860; nobody knows what came before it, even though it came on a node two generations newer.



Even the chips with the beefiest of that connection, the LGA 3647 Xeon Golds with 3 UPI links and six-channel RAM, can be had for cheap on eBay, because Intel no longer really rules like they did back in the days of Nehalem. Higher densities inherently carry more leakage, as usual; look at their thermal ratings, almost nobody wants Intel anymore. As for Samsung making it big with the 860 SSD and 3D NAND, you're wrong there as well. The 840 Pro and others were big because of how good they were back when SandForce was still selling around, and almost anybody in a tech forum was well aware of it.


----------



## chodaboy19 (Jul 24, 2020)

If Intel can at least get 10 nm shipping across all market segments, they could have a viable chance to survive. Not sure why they are warning everyone about 7 nm when 10 nm is not even out in volume. The best guess is that 7 nm is in deep trouble, and Intel is managing expectations, from their clients to their investors.


----------



## watzupken (Jul 24, 2020)

I am not convinced that Alder Lake will be able to save Intel in the consumer space, unless proven otherwise by actual results. In particular, when we look at the desktop segment, the big/little core config does not make much sense. Intel further muddied the waters by creating Alder Lake SKUs with and without the big/little config. Looking at the rumored TDP of up to 125 W for Alder Lake-S, I feel it's going to be the 14nm++++ strategy once again, i.e. sacrificing efficiency for performance. The small cores will probably hide the poor power efficiency under light loads.

As for Intel outsourcing their GPU to an external foundry, it is not unexpected. Considering the maturity of the 14 nm fab, they are still not able to keep up with demand, and it is likely worse for 10 nm. In addition, I also feel that Intel may not have the expertise to fab big/complex GPUs, unlike the likes of TSMC, Samsung and GF.



chodaboy19 said:


> If Intel can at least get 10 nm shipping across all market segments, they could have a viable chance to survive. Not sure why they are warning everyone about 7 nm when 10 nm is not even out in volume. The best guess is that 7 nm is in deep trouble, and Intel is managing expectations, from their clients to their investors.


They are obligated to, because this is a significant event. Intel surely shared the 7 nm roadmap with their investors previously. With the delivery of 7 nm products off track, they need to correct their roadmap and make it transparent to investors. And as you can tell, the moment the news of another 6-month delay in 7 nm delivery was announced, the stock got hammered.


----------



## mtcn77 (Jul 24, 2020)

X71200 said:


> As for Samsung making it big with the 860 SSD and 3D NAND, you're wrong there as well. The 840 Pro and others were big because of how good they were back when SandForce was still selling around, and almost anybody in a tech forum was well aware of it.


3D NAND changed the basics. No enterprise drive has a better mixed 4K benchmark. To make matters clear, the 850 Evo 250 is 1st among 2.5" drives, while the 840 Pro 256 is 29th on ssd.userbenchmark. Funny you would mention that.


----------



## X71200 (Jul 24, 2020)

mtcn77 said:


> 3D-nand changed the basics. No enterprise drive has a better mixed 4K benchmark. To make matters clear, 850 Evo is 1st in 2.5" drives while 840 Pro is 29th on ssd.userbenchmark. Funny you would mention that.



Rofl, are you seriously looking at UserBenchmark ratings for anything? The people who think DX11 and 12 are irrelevant in 2020. I'm talking about the relevancy of the 840 Pro when it came out. This has nothing to do with performance; you talked about a drive making its name. The people in tech forums like TPU, [H], etc., were all big on it because it was also better than SandForce. 3D NAND is not everything; you could put it on a crap drive today and be under the ground with 4K.


----------



## mtcn77 (Jul 24, 2020)

X71200 said:


> I'm talking about the relevancy of the 840 Pro when it came out. This has nothing to do with performance, you talked about a drive making its name. The people in tech forums like TPU, [H], etc, were all big on it because it was also better than SandForce. 3D-Nand is not everything, you could put it on a crap drive today and be under the ground with 4k.


Okay. That was how enterprise drives were judged back then, instead of by burst r/w. Funny you would mention that.


----------



## windwhirl (Jul 24, 2020)

btarunr said:


> Intel's silicon fabrication woes refuse to torment the company's product roadmaps,



More like they refuse to stop tormenting the company


----------



## X71200 (Jul 24, 2020)

mtcn77 said:


> Okay. That was how enterprise drives were counted on back then, instead of burst r/w. Funny you would mention that.



The Evo drives never made it to the "enterprise" back then; heck, even those people in the forums bashed them because they weren't relying on long-term TLC sustainability yet. Today you have TLC enterprise drives. Enterprise was still using stuff like SLC SandForce 2285s; I still have some of those drives in my old laptops and whatnot. Seriously, you're making fun of yourself while trying to make fun of others.


----------



## mtcn77 (Jul 24, 2020)

X71200 said:


> The Evo drives never made it to the "enterprise" back then,


Why should it? I just stated they changed the hierarchy. You had pro series doing mixed r/w stuff, and amateurs comparing naked read performance on a pristine drive at 0% wear. When the 850 came, suddenly you had a drive doing both simultaneously.


----------



## AnarchoPrimitiv (Jul 24, 2020)

AMD should be on 5nm by the time Intel is on 7nm, right?


----------



## chodaboy19 (Jul 24, 2020)

AnarchoPrimitiv said:


> AMD should be on 5nm by the time Intel is on 7nm, right?



AMD's and Intel's fabrication processes are not a 1:1 match.
Intel's 10 nm is closer to TSMC's 7 nm, and Intel's 7 nm is closer to TSMC's 5 nm.
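There is public data behind this rule of thumb. A small sketch comparing reported peak transistor densities (the figures below are approximate numbers from public reporting, not official vendor specifications, and Intel's 7 nm density is unpublished, so it's omitted):

```python
# Approximate peak transistor densities in MTr/mm^2 (millions of
# transistors per square millimeter), as reported publicly.
# These are ballpark figures, not official vendor specifications.
densities = {
    "Intel 14nm":  37.5,
    "TSMC N10":    52.5,
    "TSMC N7":     91.2,
    "Intel 10nm": 100.8,
    "TSMC N5":    171.3,
}

# Sort from least to most dense to see where each node actually sits.
for node, mtr in sorted(densities.items(), key=lambda kv: kv[1]):
    print(f"{node:>11}: {mtr:6.1f} MTr/mm^2")

# Intel's 10 nm (~100.8) lands next to TSMC's N7 (~91.2), not N10,
# which is why "Intel 10nm ~ TSMC 7nm" is a reasonable rule of thumb.
```

The same reasoning puts Intel's 7 nm, whenever it lands, in the neighborhood of TSMC's N5 rather than N7.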


----------



## seronx (Jul 24, 2020)

AnarchoPrimitiv said:


> AMD should be on 5nm by the time Intel is on 7nm, right?


AMD should be on something with the density of Intel's 7 nm sooner than Intel. Genoa, Raphael and Rembrandt should be on TSMC's N5 process, which is comparable to Intel's P1276/7 nm.


----------



## Jism (Jul 24, 2020)

Intel Bulldozer FX.


----------



## DemonicRyzen666 (Jul 24, 2020)

chodaboy19 said:


> AMD's and Intel's fabrication processes are not a 1:1 match.
> Intel's 10 nm is closer to TSMC's 7 nm, and Intel's 7 nm is closer to TSMC's 5 nm.



Where does everyone get this?
As far as I know, this is just what Intel claims, and there is no actual proof.

Their density comes from having several different designs for caches: the L1 cache transistors are different from the L2's, and so is the L3's; it's all for space saving.

AMD and other manufacturers use a more uniform transistor design on the node.

That is why AMD and others always end up with 20% larger dies.
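The 20% figure above is the poster's estimate, but the underlying arithmetic is straightforward: for a fixed transistor budget, die area scales inversely with the achieved library density. A toy sketch with purely hypothetical numbers:

```python
# Die area needed for a fixed transistor budget at two achieved
# densities. All numbers are purely illustrative, not measured data.
TRANSISTOR_BUDGET_MTR = 4000.0  # 4 billion transistors, in MTr

def die_area_mm2(density_mtr_per_mm2: float) -> float:
    """Area required to fit the transistor budget at a given density."""
    return TRANSISTOR_BUDGET_MTR / density_mtr_per_mm2

mixed_libs  = die_area_mm2(100.0)  # aggressive mixed cell libraries
uniform_lib = die_area_mm2(83.0)   # hypothetical more uniform library

print(f"mixed libraries: {mixed_libs:.1f} mm^2")
print(f"uniform library: {uniform_lib:.1f} mm^2")
print(f"area penalty:    {uniform_lib / mixed_libs - 1:.0%}")
```

With these made-up densities, the uniform-library die comes out about 20% larger, matching the shape of the claim above.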


----------



## Caqde (Jul 24, 2020)

So, based on what we are seeing here, Alder Lake-S will go against the 5 nm Zen 4 desktop CPUs (late 2021/early 2022)... and the first 7 nm Intel CPUs will be up against a future 5nm+ "Zen 5" chip in late 2022/early 2023... Yup, doing good, Intel. A bit later and you can play against a potential 3 nm "Zen 6" chip. Keep the delays coming...


----------



## londiste (Jul 24, 2020)

DemonicRyzen666 said:


> Where does everyone get this?
> As far as I know, this is just what Intel claims, and there is no actual proof.


At least some design specifications for manufacturing processes are known.





7 nm process - Wikipedia
en.wikipedia.org
				



It is not just Intel claiming this; industry analysts have a pretty good idea of what is going on. Once products are out on a manufacturing process, the density and specs can be verified to some degree.


----------



## jeremyshaw (Jul 24, 2020)

The problem with the 7 nm delay is Aurora, IMO. Intel already got a reprieve once; if they screw it up again, do they finally lose the contract? Will AMD snatch all three exaflop contracts in the US?


----------



## thesmokingman (Jul 24, 2020)

Well that was a shocker!

Like really, who believed they were actually gonna release 7nm this year?




chodaboy19 said:


> AMD's and Intel's fabrication processes are not a 1:1 match.
> Intel's 10 nm is closer to TSMC's 7 nm, and Intel's 7 nm is closer to TSMC's 5 nm.



Do you ever get tired of spouting this rubbish?

That's like Nikola saying they make better electric semis than Tesla. lmao


----------



## cucker tarlson (Jul 24, 2020)

mtcn77 said:


> It is just the same with Samsung's entry into 3D NAND. They took a 40 nm process and made history. Everybody knows about the 860; nobody knows what came before it, even though it came on a node two generations newer.


Sadly, I do.


----------



## Xex360 (Jul 24, 2020)

How the hell did they mess up so badly? Like I said on AnandTech, since Sandy Bridge in 2011 they haven't done anything major; they just made a lot of money doing the same thing over and over.


----------



## chodaboy19 (Jul 24, 2020)

DemonicRyzen666 said:


> Where does everyone get this?
> As far as I know, this is just what Intel claims, and there is no actual proof.
> 
> Their density comes from having several different designs for caches: the L1 cache transistors are different from the L2's, and so is the L3's; it's all for space saving.
> ...





thesmokingman said:


> Well that was a shocker!
> 
> Like really, who believed they were actually gonna release 7nm this year?
> 
> ...



They have actual specs here:








7 nm lithography process - WikiChip
en.wikichip.org

> The 7 nanometer (7 nm) lithography process is a technology node semiconductor manufacturing process following the 10 nm process node. Mass production of integrated circuits fabricated using a 7 nm process began in 2018. The process technology will be phased out by leading-edge foundries by...


----------



## Caring1 (Jul 24, 2020)

10nm all over again.


----------



## thesmokingman (Jul 24, 2020)

chodaboy19 said:


> They have actual specs here:
> 
> 
> 
> ...



You didn't get the reference to Nikola, did you? Have you ever heard the saying "a bird in the hand is worth two in the bush"?


----------



## londiste (Jul 24, 2020)

Xex360 said:


> How the hell did they mess up so badly? Like I said on AnandTech, since Sandy Bridge in 2011 they haven't done anything major; they just made a lot of money doing the same thing over and over.


I do not understand the popular opinion that Intel does nothing. They absolutely do a lot of things.
We are all disappointed that they are failing to bring out proper competition to AMD's Ryzens, but come on.

After Sandy Bridge they did the 22 nm and 14 nm manufacturing processes, arguably the 10 nm process, and 7 nm is somewhere in the pipeline.
In terms of CPUs: AVX2, extended execution resources and caches, new tech added as it comes along, etc. And Ice Lake has even more substantial changes than that. The current Comet Lake (which is a rehashed Skylake from 2015) is ~25% faster than Sandy Bridge in single-core load at the same clock. Ice Lake is a good step faster than that; Intel's claimed 18% has been verified to be true enough.
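As a rough sketch of how those same-clock gains compound, here is the arithmetic with illustrative per-generation uplift figures (only the 18% Sunny Cove number is Intel's claim; the others are ballpark approximations, not measured data):

```python
# Compound same-clock (IPC) uplift over Sandy Bridge (= 1.0).
# Only the 18% Sunny Cove figure is Intel's claim; the per-generation
# numbers before it are rough illustrative approximations.
gains = [
    ("Ivy Bridge", 0.05),
    ("Haswell",    0.10),
    ("Skylake",    0.10),  # Comet Lake is a Skylake rehash, same IPC
    ("Sunny Cove", 0.18),  # Intel's claimed Ice Lake uplift
]

ipc = 1.0
for name, gain in gains:
    ipc *= 1.0 + gain
    print(f"{name:>10}: {ipc:.2f}x Sandy Bridge IPC")
```

Multiplied out, Skylake/Comet Lake lands around 1.27x Sandy Bridge at the same clock, consistent with the ~25% figure above, and Ice Lake's 18% on top of that gives roughly 1.5x.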

This is just the mainline CPUs. There is the Atom line that Intel seems to be getting back to with Tremont. There is XPoint, with hopefully a new generation coming out at some point.
Plus there are a bunch of other things Intel does with varying degrees of success: NAND flash and controllers, and FPGAs come to mind. Packaging technologies like EMIB and Foveros. Mobile modems and 5G are something they failed at.


----------



## Flanker (Jul 24, 2020)

Duke Nukem for Intel


----------



## TheLostSwede (Jul 24, 2020)

londiste said:


> I do not understand the popular opinion that Intel does nothing. They absolutely do a lot of things.
> We are all disappointed that they are failing to bring out proper competition to AMD Ryzens but come on.
> 
> After Sandy Bridge they did 22nm and 14nm manufacturing processes, arguably the 10nm process and 7nm is somewhere in the pipeline.
> ...


It's not that Intel hasn't done anything; it's more that they've over-promised and under-delivered time and time again when it comes to a lot of their products.
Unfortunately, this is what happens when you have a de facto monopoly in any industry, so this is by no means unique to Intel.

They also tried to jump further than was technically possible with their process node, as they believed they could go down a different and supposedly better route, which turned out to be a disaster in the end. However, due to management decisions, this wasn't knocked on the head early enough; instead they tried to salvage it several times over, which led to further delays. This is how many a company has gone bust, but luckily for Intel there wasn't much competition, so they could continue using their current node.

This doesn't even take into account all the security holes Intel has been struggling to patch, which makes the company look like they haven't even bothered to check their products properly, as they affect so many generations of processors. There will obviously always be things like this cropping up, as someone will always find a way to bypass security, but it feels like Intel has been lazy here.

Now the competition has caught up and is about to potentially supersede Intel on many levels, which is what makes Intel look bad.

We obviously have limited insight into what they're working on, but the information that is available makes it look like it's at least another 2-3 years before Intel might be able to sort all this out and be back with something truly competitive, both in terms of process node and desktop CPU performance.

However, you are right that Intel is also working on a lot of other things, although XPoint wasn't done by Intel alone, as it was co-designed with Micron, and the FPGAs were added through the acquisition of Altera, so they weren't developed in-house.

Maybe the loss of focus is partially what caused some of the problems in the company; there were too many BUs fighting for resources, and that's why things are where they are today. I guess you also forgot about Intel's attempt at competing with ARM in the mobile phone space, which ended in disaster, as there's no other word for it.

Is Intel a terrible company because of all this? Of course not, but it shows that even the giants can fall from grace.


----------



## londiste (Jul 24, 2020)

TheLostSwede said:


> We obviously have limited insight into what they're working on, but the information that is available makes it look like it's at last another 2-3 years before Intel might be able to sort all this out and be back with something truly competitive, both in terms of process node and desktop CPU performance.


I only half agree with you here. From the limited insight we have, the lack of a workable new process node is their root and main problem. Architecturally speaking, Ice Lake is OK, and Tiger Lake seems to be shaping up just fine as well. Intel simply cannot produce this stuff. Even Skylake/Comet Lake is not half bad considering what it is, with one notable exception: power consumption, which is very much down to the process node.

Edit: Ice Lake and Tiger Lake are obviously not without issues. 10 nm is still rough, the core count is too low for whatever reason, marketing is doing them no favors, they do not exist on desktop, etc. But in terms of CPU architecture and performance, they are not in a bad place.


TheLostSwede said:


> Maybe the loss of focus is partially what caused some of the problems in the company, there were too many BU's fighting for resources and that's why things are where they are today. I guess you also forgot about Intel's attempt at competing with ARM in the mobile phone space, which ended in a disaster, as there's no other word for it.


BUs fighting for resources is normal for all corporations.
Intel's mobile attempt is definitely not the only thing I forgot; they are dealing in too many areas.


----------



## john_ (Jul 24, 2020)

Funny that OEMs will have to start pushing AMD models in the market as the premium options, because Intel CPUs are going to become non-competitive within a year.

Now we also know why Apple chose this time to switch to ARM.

On the other hand, Zen 3 will be ultra expensive to avoid pushing Intel to drop prices.


----------



## TheLostSwede (Jul 24, 2020)

londiste said:


> I only half-agree with you here. From the limited insight we have, lack of a workable new process node is their root and main problem. Architecturally speaking Ice Lake is OK, Tiger Lake seems to shape up just fine as well. Intel simply cannot produce this stuff. Even Skylake/Comet Lake is not half bad considering what it is with one notable exception - power consumption, which is very much down to process node.
> 
> Edit: Ice Lake and Tiger Lake are obviously not without issues. 10nm is still rough, core count is too low for whatever reason, marketing is doing them no favors, do not exist on desktop etc. But in terms of CPU architecture and performance, they are not in a bad place.
> BUs fighting for resources is normal for all corporations
> Intel's mobile attempt is definitely not the only thing I forgot. They are dealing in too many areas.



But how is a good CPU design going to help them if they can't produce it? I mean, it could be the best thing since sliced bread, but if they only get 10 chips per wafer, what does it matter? It would cost a fortune and no one would buy it.
Without a reliable process node, their CPU designs aren't going to help them get out of the hole they dug themselves into.



john_ said:


> On the other hand, Zen 3 will be ultra expensive to avoid pushing Intel to drop prices.


And you know this how? Obviously AMD is going to cash in on being top dog for a while, I mean, they need the money, but ultra expensive...


----------



## Assimilator (Jul 24, 2020)

Surprise, surprise...

What's hurting Intel most right now is the culture among upper management of "we're the best, we'll figure it out, we don't need outside help". Pride comes before a fall, as they say, and Intel is still prideful and still falling. The only question remaining is how long before they get fresh blood that isn't bound by the sunk cost fallacy, and is willing to throw the whole 10nm mess (and potentially 7nm too now, it seems) on the garbage heap.


----------



## john_ (Jul 24, 2020)

londiste said:


> We are all disappointed that they are failing to bring out proper competition to AMD Ryzens but come on.


*Not all*.

Personally, I would love Intel becoming non-competitive for a couple of years, or more if needed. That way it will lose its influence on OEMs and the tech press, and give AMD the room needed to become a viable competitor to Intel. Intel has used its money and influence for the last 20 or more years to secure 80% or more of the market, even while having worse products. We need two companies with a 50-50, or at least a 60-40, share of the market, not one company owning 80% of it and pushing toward 90% in some periods thanks to its big pockets.

We all know what a "competitive Intel" means: full control of the OEM and server markets, full influence on the tech press and benchmark suites/sites. No thanks. Let Intel shrink a little, let AMD grow a lot, just so we can hope for a duopoly that works, not a duopoly that is a monopoly in disguise.




TheLostSwede said:


> And you know this how? Obviously AMD is going to cash in on being the top dog for the time mean, they need the money, but ultra expensive...



They have a very competitive Zen 2 line of models in the market that they can keep selling. The XT models were made to keep Zen 2 prices up, and the 5600 XT fiasco showed us that AMD's mentality is changing. They will create a mess instead of dropping the MSRP by 7-10%.

We now also know that they don't need to fear a response from Intel, only a price war, which they can avoid by not pushing prices down. They can still sell cheap to OEMs, but in retail they will set higher prices to drive margins up and try to create the image of being the premium brand. If they don't try to convince consumers now that the AMD brand is in fact the premium brand, when are they going to do it? When Intel starts throwing out more 10 nm models, or when they finally fix their 7 nm problems? (They might never manage to fix those, but I bet AMD builds strategies around the possibility of a fast Intel recovery in manufacturing, just to be ready for it.)



Assimilator said:


> Surprise, surprise...
> 
> What's hurting Intel most right now is the culture among upper management of "we're the best, we'll figure it out, we don't need outside help". Pride comes before a fall, as they say, and Intel is still prideful and still falling. The only question remaining is how long before they get fresh blood that isn't bound by the sunk cost fallacy, and is willing to throw the whole 10nm mess (and potentially 7nm too now, it seems) on the garbage heap.


It's not only that. It's the security they feel as long as OEMs stick with them, as long as most IT departments keep choosing Xeons. They keep going from record revenue to record revenue. The pile of money keeps growing in front of them, hiding the train that is coming to run them over.


----------



## londiste (Jul 24, 2020)

TheLostSwede said:


> But how is a good CPU design going to help them if they can't produce it? I mean, it could be the best thing since sliced bread, but if they only get 10 chips per wafer, what it does it matter, as it's going to cost a fortune and no-one will buy it? Without a reliable process node, their CPU designs aren't going to help them get out of the hole they dug themselves into.


That was not quite what I meant. They have one problem to solve to get back in the game. Once the manufacturing process problem is solved (in whatever way; make a deal with TSMC or Samsung for all we care), they will hit the ground running.


----------



## X71200 (Jul 24, 2020)

thesmokingman said:


> That's like Nikola saying they make better electric semis than Tesla. lmao



Nikola does make better semis than Tesla, though. Given Tesla's track record on the reliability of their cars (especially now in China), I'd much rather try Nikola. IIRC, they are also better specced and better looking, inside and out. While having a huge hydrogen tank is not viable for cars, it actually makes a whole lot of sense for a semi, due to the range anxiety of pure electric while carrying a lot of weight. The problem is not having enough hydrogen filling stations. While this is still an issue (which Nikola says they'll try to fix in 2023), it doesn't make the truck itself bad.

I was looking at those Xeons I spoke about earlier, and the lower-end versions of them, like Bronze, are hilariously awful. They are based on the old iteration of Skylake-X, but with nothing that even made those good. They're locked, have no HT and no turbo. Base is around 1.5-2 GHz: an awful arrangement, just like the CPU found in the Mac Pro. Intel is seriously still selling these.

AMD has a better offering for most end-user needs; about their only lack is native AVX-512 support, but that's not where they were headed first. They tried to take Intel down in the desktop market, and they did most of it. Servers can still use their old Intel gear, depending on the size and needs of the cluster, and I think it is partly a case of that. Renoir is a far better CPU than the old Intel chips many laptops still use. AMD is making a phone CPU too; they're putting their goods where the end consumer wants to see them. I guess OEMs keep using massive trays of leftover Intel CPUs for their NUCs and whatnot (say, the 7200U), and this is a part of the market they control...


----------



## Assimilator (Jul 24, 2020)

As usual, a more comprehensive writeup from AT that reveals some very interesting details: https://www.anandtech.com/show/1592...ke-pragmatic-approach-in-using-3rd-party-fabs

Supposedly, this 7 nm slip is due to a defect in the process that Intel is confident it has isolated and can fix (take that claim with a scoop of salt, but the fact that they are being specific about issues with 7 nm, versus the dead silence around the continuing 10 nm issues, is positive).

But the most interesting snippet is this:



			
AnandTech said:
			
		

> ... the message from Intel is clear: they will do what they need to in order to deliver new products according to their release roadmap... including manufacturing a product entirely at a third-party fab if that is truly the best option



This tells me that the OEMs have made it very clear to Intel that unless the latter starts hitting its projected dates again, they are going to AMD. Evidently the tipping point between familiarity with Intel (but having to deal with missed releases) and AMD's predictability (but having to retool for it) has been reached.

Now, whether Intel actually takes that message seriously enough is anyone's guess - see my previous comments about pride and corporate culture - but it seems that Intel is finally feeling the pressure it should have felt back in 2017 when 10nm first slipped.


----------



## TheLostSwede (Jul 24, 2020)

londiste said:


> That was not quite what I meant. They have one problem to solve to get back in the game. When manufacturing process problem is solved (in whatever way, make a deal with TSMC or Samsung for all we care), they will hit the ground running.


If only it were that easy.

You can't simply take something designed for one process node and make it on someone else's process node. This is why companies have to stick with the foundry whose process their part was designed for.
There was some cross-compatibility at one point between GloFo, Samsung and someone else, but that was on something like 32 or 2x nm, and it still required a fair bit of extra work to move between the two foundries.

So say Intel were to take any of their current products and move it to TSMC: they'd have to spend something like six months just transitioning the design to TSMC's process node, then most likely another couple of months making sure the tape-out is successful, and then tune it over the better part of six months to a year before running at a decent production rate. So no, Intel wouldn't hit the ground running, unless they designed a new chip more or less from scratch, specifically for the TSMC process node, which might be quicker than trying to transplant a current design based on Intel's own node.

In all fairness, Intel has already made some products on various TSMC process nodes, so they might be able to do things a little quicker because of that, but it's still going to be more or less a year from start to finish.



Assimilator said:


> As usual, a more comprehensive writeup from AT that reveals some very interesting details: https://www.anandtech.com/show/1592...ke-pragmatic-approach-in-using-3rd-party-fabs
> 
> Supposedly, this 7nm slip is due to a defect in the process that Intel is confident they have isolated and can fix (take that claim with a scoop of salt, but the fact that they are being specific about issues with 7nm vs the dead silence from continuing 10nm issues is positive).
> 
> ...


The question that remains unanswered, though, is this: does TSMC have enough capacity to manufacture for Intel? Assuming they would go with TSMC, that is.
Samsung might have the capacity, but I'm not aware of anything Intel has ever made in one of Samsung's foundries.


----------



## bonehead123 (Jul 24, 2020)

mooooo.....mooooo.

Can ya smell what kinda milk da rock be cookin .........


----------



## londiste (Jul 24, 2020)

TheLostSwede said:


> If only it was that easy.
> 
> You can't simply take something designed for one process node and make it on someone else's process node. This is why companies have to stick with the foundry whose process they've designed their part to be made on.
> There was some cross-compatibility at one point between GloFo, Samsung and someone else, but that was on something like 32 or 2x nm, and it still required a fair bit of extra work to move a design between foundries.
> ...


I know it is not that easy. Intel has been talking about external manufacturing for Xe GPUs from the beginning so it shouldn't be a problem for those. I would be very surprised if Intel actually decided to outsource manufacturing CPUs. From what has been said, Intel has very different tools from what the rest of the industry uses so moving something core like CPUs to an external foundry would be a huge undertaking.

That was not meant as a realistic option, just an out-of-the-blue example. Intel will likely figure out the manufacturing process sooner or later. By all indications, there is enough money to bleed until they do.


----------



## mtcn77 (Jul 24, 2020)

londiste said:


> When manufacturing process problem is solved


We have to see a precedent before we come to believe it.
I think it is down to company culture. These discrete product segmentations caused Intel to skip mixed loads altogether. For instance, there was a time when Intel could have halted the AMD surge into the streamer PC market. No one knew AMD back then; it was all singular workloads until encoding on the CPU while gaming at the same time showed up. AMD's multithreading prowess wasn't taken seriously.
It was their eSRAM-featured chips that didn't get the green light from the higher-ups. They were much cheaper (market buffered) and possibly a better candidate than Intel's mainstream CPUs for extending feature support. If Intel had somehow made up the slight performance gap, they could have had a line-up based on eSRAM feature level (they said time and again they had 4 times the amount projected for a measly 4-core CPU; they could have extended it into its own distinction).


----------



## AusWolf (Jul 24, 2020)

More Skylake, yay!


----------



## Th3pwn3r (Jul 24, 2020)

Verpal said:


> Let me guess......
> 
> 10nm +++ that boost up to 5.5Ghz with big.LITTLE design that chew tons of power but miraculously competitive against 5nm EUV.
> 
> ...



I just bought an i5-10600k and this sounds like Intel is already planning on stalling everything out or milking things dry. I get the pandemic slowing things down but with their history as of late...


----------



## X71200 (Jul 24, 2020)

Th3pwn3r said:


> I just bought an i5-10600k and this sounds like Intel is already planning on stalling everything out or milking things dry. I get the pandemic slowing things down but with their history as of late...



Might wait a bit and get that new Gigabyte board with the monoblock 360 AIO if you don't have a board, seems like the only thing making that platform worth it, somehow... (board is probably overpriced lol).


----------



## medi01 (Jul 24, 2020)

Uh, *isn't Intel's 10nm denser than TSMC 7nm?*

I suspect Intel's 7nm needs to be lined up against TSMC's "5".

It's not even remotely as bad as presented.


----------



## john_ (Jul 24, 2020)

medi01 said:


> It's not even remotely as bad as presented.






Well, people are jumping ship.

TSMC's 7nm is in much better condition than Intel's 10nm. People can expect TSMC's 5nm to also be in good shape, but after that latest announcement from Intel, 7nm is a huge question mark. It's not even a "not so good" node; it is a "huge question mark" node.


----------



## mtcn77 (Jul 24, 2020)

cucker tarlson said:


> sadly,I do.


That is quite a low-vibe retort. It got lost on me.
This might devolve into GPU trolling stereotypes, with Steam charts playing the joke in question. It won't go so lightly, however. The 860 is both cheap and dominant, in a way that fake virtual accounts in Chinese cyber cafes cannot tip the balance against.


----------



## mechtech (Jul 24, 2020)

Meh, record profits though, just stay on 14nm until no more record profits


----------



## AnarchoPrimitiv (Jul 24, 2020)

I honestly think this will end up being better for consumers in the long run. I think it's in everybody's best interest that AMD has at least five years of "advantage" over Intel, so that AMD can build up a "war chest" and be more entrenched when Intel finally reemerges.

While AMD has made great advances in the DIY space, they still need to gain more ground in mobile, OEM desktop, and enterprise. Ideally, AMD needs to get as close as possible to controlling 50% of the x86 T.A.M. in order to ensure that their current success isn't just a temporary salient that can be rolled back just as quickly.

I'm sure everyone here has enjoyed the spoils of the new competition; personally, five or six years ago, I didn't expect to have 8-, 12- and 16-core mainstream CPUs available at the prices for which they're currently available. If we want this trend to continue, and this competition to be a permanent fixture of the PC market, then I think we should be in favor of AMD having a few more years of success at Intel's expense.


----------



## mtcn77 (Jul 24, 2020)

It boggles me that Intel could afford 32/64/128 MB of eSRAM, but went along with 128 MB on its mainstream CPU whereas the Xbox One console got 32 MB. Where is the logic in that? How expendable can your hardware be? Intel, thus, seems to have quit scaling its product segments altogether.


----------



## ppn (Jul 24, 2020)

Make it 5 years and it sounds about right; just like 10nm, 2016 became 2021.

1-atom-wide graphene is postponed indefinitely, I suppose.

Why do they even bother with this 10/7nm thing when they can "easily" do 1-atom-wide, sub-1nm structures? Such a waste of time.

I guess they have a thing for opening and closing factories. Just doing stuff for the sake of doing it; it doesn't matter that it's pointless and obsolete.


----------



## X71200 (Jul 24, 2020)

AnarchoPrimitiv said:


> I'm sure everyone here has enjoyed the spoils of the new competition; personally, five or six years ago, I didn't expect to have 8-, 12- and 16-core mainstream CPUs available at the prices for which they're currently available. If we want this trend to continue, and this competition to be a permanent fixture of the PC market, then I think we should be in favor of AMD having a few more years of success at Intel's expense.



I would have expected it, having had X99 with a couple of 6-core CPUs; maybe just not from AMD, but that became obvious after the introduction of Ryzen too.



mtcn77 said:


> It boggles me that Intel could afford 32/64/128 MB of eSRAM, but went along with 128 MB on its mainstream CPU whereas the Xbox One console got 32 MB. Where is the logic in that? How expendable can your hardware be? Intel, thus, seems to have quit scaling its product segments altogether.



My One S has always been a media device and will stay that way for me; the GPU is ancient-level weak, and it was never good because of that. The new-gen consoles are finally a step towards more actual hardware inside a console.


----------



## mtcn77 (Jul 24, 2020)

X71200 said:


> My One S has always been a media device and will stay that way for me, the GPU is ancient-level weak and it was never good because of that. The new gen consoles are finally a step towards more actual hardware inside console.


Which is why RAM costs more than half the total this time around. In case you missed it, that is like a 100% price rise just from the memory.
eSRAM had a valid place when the main memory interface was DRAM. Obviously, for reasons...


----------



## X71200 (Jul 24, 2020)

mtcn77 said:


> Which is why RAM costs more than half the total this time around. In case you missed it, that is like a 100% price rise just from the memory.
> eSRAM had a valid place when the main memory interface was DRAM. Obviously, for reasons...



Irrelevant; the primary intent of a console is gaming, and for that what matters most is the GPU. The GPU in the One was already not in the same timeline segment as what was available when it got released, so it wasn't any good even back then. After a couple of years, with price slashes and the introduction of the X, which has a Polaris GPU, the S could be found in throwaway bundle boxes lying on the ground in tech stores.


----------



## mtcn77 (Jul 24, 2020)

X71200 said:


> Irrelevant; the primary intent of a console is gaming, and for that what matters most is the GPU. The GPU in the One was already not in the same timeline segment as what was available when it got released, so it wasn't any good even back then. After a couple of years, with price slashes and the introduction of the X, which has a Polaris GPU, the S could be found in throwaway bundle boxes lying on the ground in tech stores.


Save that for the CPU is what I'm saying, anyway. It would take a weak GPU, but conversely a high-performance CPU, to require the same amount of bandwidth. Intel just offered latency when, instead, some bandwidth could have extended the market life of some CPU series that got relegated when Ryzen showed up. It was a shower of rapid launches.


----------



## cucker tarlson (Jul 24, 2020)

mtcn77 said:


> That is quite a low vibe retort. It got lost on me.
> This might devolve into gpu trolling stereotypes which steam charts are playing the joke in question. It won't go so lightly however. 860 is both cheap and dominant in a way fake virtual accounts in chinese cyber cafes cannot tip the balance.


Seriously, I've got no f***** idea what you're talking about 90% of the time.
What GPU? What Steam charts?
I just said I owned an 840 Evo that was plagued with issues before Samsung went with 40nm V-NAND.

Seriously, you talk like no other person.


----------



## mtcn77 (Jul 24, 2020)

cucker tarlson said:


> seriously,I got no f***** idea what you're talking about 90% of the time
> what gpu ?
> I just said I owned an 840 evo


The 850 and 860 combined make up 11% of the total SSD market on UserBenchmark. They weren't as popular until 3D NAND showed up. You are so forgetful...


----------



## cucker tarlson (Jul 24, 2020)

mtcn77 said:


> The 850 and 860 combined make up 11% of the total SSD market on UserBenchmark. They weren't as popular until 3D NAND showed up. You are so forgetful...


what wasn't popular ?
samsung drives ?
are you serious ?

and what time periods are you even comparing ? how long was 840 there before 850 showed up and how long has 850 been on the market ?


----------



## mtcn77 (Jul 24, 2020)

cucker tarlson said:


> what wasn't popular ?
> samsung drives ?
> are you serious ?


Do you have any counterarguments to make, or are you trying to dissolve the discussion into a mindless rant?
Samsung wasn't a big player until 3D NAND. After that, they just cornered the semiconductor market and kept flash prices under control.


----------



## X71200 (Jul 24, 2020)

That crappy UserBenchmark didn't even exist pre-850. Samsung has actually been making SSDs for well over 10 years; the early Corsair P128, for example, was a Samsung drive. The better stuff came specifically with the 830 and above, though. They were already becoming big before 3D NAND, but the market was so filled with tons of different cheap SandForce models with questionable firmware that people in the know bought Samsung.


----------



## Th3pwn3r (Jul 24, 2020)

X71200 said:


> Might wait a bit and get that new Gigabyte board with the monoblock 360 AIO if you don't have a board, seems like the only thing making that platform worth it, somehow... (board is probably overpriced lol).


That sounds really cool, I'll check it out. I do already have an Asus board that was $299, so if its features are as good and it's priced similarly then maybe I'll make the switch. It's already installed and everything is completed, but I'm not lazy.


----------



## RandallFlagg (Jul 24, 2020)

I don't think this means quite what people seem to think it means.

It's been posted before and there are plenty of articles, but:

In terms of density:

Intel 14nm = TSMC 10/12nm
Intel 10nm = TSMC 7nm+
Intel 7nm = TSMC 5nm

Zen 2 = TSMC 7nm (1st gen). Intel 10nm is superior to this.

Zen 3 = TSMC 7nm+ (3rd gen). The latest TSMC node is slightly denser than Intel 10nm (~13%).
  - I should point out that no one would call this a node advantage; Samsung 7nm is actually farther away from TSMC 7nm than this - it's like 1/8th of a "node".

Zen 4 = assumed to be TSMC 5nm

What this means:

Zen 2 vs Comet Lake: Intel is behind 1 node (until mid-2020)
Zen 3 vs Rocket Lake: Intel is behind 1 node (until mid-2021)
Zen 3 vs Alder Lake: parity on process node (Q3 2021 - Q3 2022)
Zen 4 vs Alder Lake: Intel behind 1 node (until 2022); Zen 4 is likely a Q4 2021 part, but not much is known

The reason for the discrepancy in naming conventions is that when Intel went to 14nm, they went to FinFET too. TSMC / Samsung / GloFo called going to FinFET an entirely new node.
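For what it's worth, the rough "name shift" above can be sanity-checked against the peak transistor-density figures commonly cited in public reporting. A minimal sketch; the MTr/mm² numbers below are approximate third-party estimates, not official Intel or TSMC specifications:

```python
# Commonly cited peak transistor-density estimates, in MTr/mm^2
# (approximate figures from public reporting, not official specs).
density = {
    "Intel 14nm": 37.5,
    "Intel 10nm": 100.8,
    "TSMC 7nm (N7)": 91.2,
    "TSMC 5nm (N5)": 171.3,
}

def ratio(a: str, b: str) -> float:
    """Density of node `a` relative to node `b`."""
    return density[a] / density[b]

# Intel 10nm lands in the same ballpark as TSMC N7, not TSMC 10nm:
print(f"Intel 10nm vs TSMC N7: {ratio('Intel 10nm', 'TSMC 7nm (N7)'):.2f}x")

# TSMC N5 is a full node (~1.7x density) ahead of Intel 10nm,
# which is roughly where Intel 7nm was expected to land:
print(f"TSMC N5 vs Intel 10nm: {ratio('TSMC 5nm (N5)', 'Intel 10nm'):.2f}x")
```

On these estimates, Intel 10nm is within ~10% of TSMC N7 density, which is why the naming-convention shift above pairs them together.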


----------



## TheLostSwede (Jul 24, 2020)




----------



## efikkan (Jul 24, 2020)

For quite some time, several people in this forum have advocated for Intel to skip 10nm and "jump straight to" 7nm. But as we can see now, that wouldn't have helped at all. Even in the best-case scenario, 7nm would provide low-volume production in late 2021, so we've known for a while that it could never have solved their 10nm problem.

The yield issues of 10nm are resolved, and while production volume is several times higher than last year, it's still too low to meet demand, and too low to cover mainstream desktop for now.


----------



## kings (Jul 24, 2020)

AnarchoPrimitiv said:


> I honestly think this will end up being better for consumers in the long run. I think it's in everybody's best interest that AMD has at least five years of "advantage" over Intel, so that AMD can build up a "war chest" and be more entrenched when Intel finally reemerges.
> 
> While AMD has made great advances in the DIY space, they still need to gain more ground in mobile, OEM desktop, and enterprise.  Ideally, AMD needs to get as close as possible to controlling 50% of the x86 T.A.M. in order to ensure that their current success isn't just a temporary salient that can be rolled back just as quickly.
> 
> I'm sure everyone here has enjoyed the spoils of the new competition; personally, five or six years ago, I didn't expect to have 8-, 12- and 16-core mainstream CPUs available at the prices for which they're currently available. If we want this trend to continue, and this competition to be a permanent fixture of the PC market, then I think we should be in favor of AMD having a few more years of success at Intel's expense.



The thing is, lack of competition leads to stagnation. For example, if Intel disappears "from the map" in the CPU space for 5 years, do you think AMD will continue at this pace? They will milk customers as much as they can, like any other company.

What we must wish for is strong competition, not the failure of others; otherwise we return to the same situation, only with the names reversed. Weak competition always leads to stagnation.


----------



## Aldain (Jul 24, 2020)

AnarchoPrimitiv said:


> AMD should be on 5nm by the time Intel is on 7nm, right?



More like TSMC 3nm


----------



## trparky (Jul 24, 2020)

londiste said:


> By all indications there is enough money to bleed until they do.


That's not exactly a happy thought you know. Sure, they've got money to burn alright but what about public relations? Their image that they've so artfully crafted over the last decade is starting to look not so clean anymore.

You can see that in online forums where it used to be "Intel or nothing at all!" but now people are recommending AMD products to more people than I've ever seen in years past. If you had told me five years ago that this very scenario would be playing out today, I (and many others) would've laughed you out of the room and rightfully so. Yet, that scenario is exactly what's playing out today.


mechtech said:


> Meh, record profits though, just stay on 14nm until no more record profits


Yeah, and all the while their 14nm++++++++ chips will continue to run hotter and hotter while also having more and more yield issues. They can't keep going on like this.


kings said:


> For example, If Intel disappears "from the map" in the CPU space for 5 years, do you think AMD will continue at this pace?


I for one hope that Lisa Su wouldn't allow for that. I think that she's far smarter than that to do such a stupid move.

You have to keep striving, you have to keep innovating, you have to keep moving forward because the moment you stop for whatever reason, your competition will be right behind you and pass you by. This is what is happening to Intel right now. Intel stopped and now AMD ran past them.


----------



## Assimilator (Jul 24, 2020)

TheLostSwede said:


> View attachment 163343



f



efikkan said:


> The yield issues of 10nm are resolved



I don't buy this. If they're resolved, why is 7nm still in trouble? The only explanation is that 7nm is different from 10nm yet again... but why would 7nm be different from 10nm, unless 10nm was irrevocably broken?



kings said:


> The thing is, lack of competition leads to stagnation. For example, if Intel disappears "from the map" in the CPU space for 5 years, do you think AMD will continue at this pace? They will milk customers as much as they can, like any other company.
> 
> What we must wish for is strong competition, not the failure of others; otherwise we return to the same situation, only with the names reversed. Weak competition always leads to stagnation.



Nobody is saying Intel should fail. Competition will be strengthened if AMD is able to reach a point where they are able to compete with Intel in terms of marketshare even if their products aren't necessarily better, and AMD probably isn't quite there yet. A couple more years will give them that time.


----------



## X71200 (Jul 24, 2020)

The "Intel or nothing at all" attitude was primarily because the Excavator architecture came out only to dig its own grave. There were talks of a new architecture coming from AMD before Zen's launch, and that is what you have today, being perfected.


----------



## Th3pwn3r (Jul 24, 2020)

X71200 said:


> Might wait a bit and get that new Gigabyte board with the monoblock 360 AIO if you don't have a board, seems like the only thing making that platform worth it, somehow... (board is probably overpriced lol).


Just looked it up and saw the price of that board. It's not for me: the looks, for one reason, and the price. I'd buy 4 other boards before one of those. I'm not cheap, but there's no justifying buying that board for me. I'd rather spend the money on amplifiers, speakers and subwoofers (home theater).


----------



## kapone32 (Jul 24, 2020)

john_ said:


> Funny that OEMs will have to start pushing AMD models in the market as the premium options, because Intel CPUs are going to become non-competitive in a year.
> 
> Now we also know why Apple chose this time to switch to ARM.
> 
> On the other hand, Zen 3 will be ultra expensive to avoid pushing Intel to drop prices.


I really believe AMD truly does not care about Intel's prices. They are basically selling CPUs as fast as they can make them right now. It is not too exotic a hope that we will see performance improvements in IPC and clock speed without adding more cores. I also believe the single-CCX CPU line will be a reality in both APUs and CPUs. In terms of the thread: by the time Intel actually releases 7nm, AMD should have the DIY market undisputedly sewn up.


----------



## X71200 (Jul 24, 2020)

Th3pwn3r said:


> Just looked at and saw the price of that board. It's not for me. The looks for one reason and the price. I'd buy 4 other boards before one of those. I'm not cheap but there's no justifying buying that board for me. I'd rather spend money on amplifiers, speakers and subwoofers( home theater).



You sure you're looking at the AIO version and not the Xtreme with just the block? I don't think the Aorus Master Waterforce is on sale yet; I haven't seen a price for it.


----------



## efikkan (Jul 24, 2020)

Assimilator said:


> I don't buy this. If they're resolved, why is 7nm still in trouble? The only explanation is that 7nm is different from 10nm yet again... but why would 7nm be different from 10nm, unless 10nm was irrevocably broken?


They _are_ different nodes. I don't know where people got the idea that 7nm would be automatically better, even with EUV. 7nm will have its own issues to be resolved.


----------



## Dave65 (Jul 24, 2020)

OK, got to ask a question because I forgot...
Was AMD's 7nm developed by AMD or TSMC, and if TSMC does do 7nm for Intel, will it be an Intel design or TSMC's? I should know this, but my mind draws a blank!


----------



## windwhirl (Jul 24, 2020)

Dave65 said:


> Ok got to ask a question because I forgot..
> Was AMD's 7nm developed by AMD or TSMC, and if TSMC does do 7nm for Intel will it be an Intel design or TSMC? I should know this but my mind rings up blank!



As I understand it, TSMC's 7nm is TSMC's own development. AMD and, if it ever happens, Intel, will adjust their designs to fit the specs and quirks of TSMC's nodes. However, I imagine that they agree to share some information so that TSMC can keep polishing their nodes and so that whoever uses that node also gets to understand how to best utilize it to maximize yields and the silicon performance.


----------



## ThrashZone (Jul 24, 2020)

Hi,
Intel's excuses are well past lame; I think they just enjoy milking the monkey, just adding more +++++++++++.


----------



## Xuper (Jul 24, 2020)

Wow, Intel's stock on the Nasdaq dropped from 60 to 50. So much for profit.


----------



## Dave65 (Jul 24, 2020)

windwhirl said:


> As I understand it, TSMC's 7nm is TSMC's own development. AMD and, if it ever happens, Intel, will adjust their designs to fit the specs and quirks of TSMC's nodes. However, I imagine that they agree to share some information so that TSMC can keep polishing their nodes and so that whoever uses that node also gets to understand how to best utilize it to maximize yields and the silicon performance.


That makes sense, I guess. So if it were just TSMC's process, then both Intel and AMD would have identical chips?
Confusing to say the least..


----------



## trparky (Jul 24, 2020)

I like what one person said in another forum and I think this explains what's happening at Intel...


> "_Lifecycle of a corporation_" which is both a book and theory explains that this almost always happens eventually: creative founder(s) leave either voluntarily or not, leadership slowly becomes accountant-minded people that don’t have the natural talent and intuition to create the future, and the company slowly dies...
> 
> Unless they start the cycle over again by bringing in an innovation-minded leader.


We can apply this to many companies including Oracle, Boeing, GE, Xerox, IBM, etc.

In the case of AMD, that "innovation-minded leader" is Lisa Su. Intel needs their own "Lisa Su".


----------



## TheoneandonlyMrK (Jul 24, 2020)

trparky said:


> I like what one person said in another forum and I think this explains what's happening at Intel...
> 
> We can apply this to many companies including Oracle, Boeing, GE, Xerox, IBM, etc.
> 
> In the case of AMD, that "innovation-minded leader" is Lisa Su. Intel needs their own "Lisa Su".


I'm quite surprised they haven't brought her in; it's a twofer.

As for this, given the trends Intel has set, news would be Intel actually hitting a target. Any target.


----------



## ddarko (Jul 24, 2020)

Assimilator said:


> I don't buy this. If they're resolved, why is 7nm still in trouble? The only explanation is that 7nm is different from 10nm yet again... but why would 7nm be different from 10nm, unless 10nm was irrevocably broken?




Intel CFO George Davis said publicly in May 2020 that 10nm was not their best process:



> As we said back at our analyst day in May of '19: Look, this just isn’t going to be the best node that Intel has ever had. It’s going to be less productive than 14nm, less productive than 22nm… the fact is that I wanted to be clear about what was happening during the 10nm generation. The fact is, it isn’t going to be as strong a node as people would expect from 14nm or what they’ll see in 7nm.


----------



## efikkan (Jul 24, 2020)

trparky said:


> I like what one person said in another forum and I think this explains what's happening at Intel...
> 
> We can apply this to many companies including Oracle, Boeing, GE, Xerox, IBM, etc.
> 
> In the case of AMD, that "innovation-minded leader" is Lisa Su. Intel needs their own "Lisa Su".


There isn't a lack of innovation from Intel; nearly all of their problems have been related to their production issues.
Ice Lake/Sunny Cove has been ready for over 2 years, and their next-gen Sapphire Rapids/Golden Cove is in the final testing stages. We have nothing to indicate these are inferior to AMD's upcoming counterparts; just imagine if the 10nm node were not holding them back, then AMD would have gotten some real tough competition.

People attribute far too much to single leaders in general, both in business and in politics. The reality is that higher management is mostly important for funding and for "staying out of the way". Middle management and team management are far more important, and of course good engineering. I don't care if it's Lisa Su, Jim Keller, Raja Koduri, Jensen Huang or whomever; it's the real engineering that matters.


----------



## Vayra86 (Jul 24, 2020)

Cooper Lake now? That's a new puddle... In Holland we have a saying:

'Hij loopt in zeven sloten tegelijk'
Translated: "He walks into seven ditches at the same time"

Intel won't even make 7...


----------



## Vya Domus (Jul 24, 2020)

chodaboy19 said:


> Intel's 10nm is closer to TSMC's 7nm



So far that's unverifiable, and probably untrue, because of the lack of high-end, high-volume 10nm parts. The only aspect we can compare objectively today is stuff like density, but without power/voltages/clocks/yields that means nothing, so as far as I am concerned they're way behind TSMC.



RandallFlagg said:


> Zen 2 = TSMC 7nm (1st gen).  Intel 10nm is superior to this.



Just as I said, unverifiable and likely untrue.


----------



## RandallFlagg (Jul 24, 2020)

efikkan said:


> There isn't a lack of innovation from Intel; nearly all of their problems have been related to their production issues.
> Ice Lake/Sunny Cove has been ready for over 2 years, and their next-gen Sapphire Rapids/Golden Cove is in the final testing stages. We have nothing to indicate these are inferior to AMD's upcoming counterparts; just imagine if the 10nm node were not holding them back, then AMD would have gotten some real tough competition.



I think you're missing his point.  I've seen this lifecycle in action, and it does come down to the bean counters being in charge.  Bean counter = someone who manages based entirely on the bottom line, quarter to quarter and one year to the next.

I can imagine the conversations at Intel 5 years ago.  I have _*seen*_ these kinds of conversations before.

Engineer: We need to get going on the next node, and we need significant capital investment to do that.
Bean Counter: Are we behind on process technology?
Engineer: No, but we have to keep moving forward or we will be.
Bean Counter: How much do you need?
Engineer: Billions
Bean Counter: Will this new process node make us more profitable than the last, will it give us higher productivity?
Engineer: No
Bean Counter: So you want me to spend billions on something that won't benefit the business?  
Engineer: You won't have a business if you don't push this.
Bean Counter: Get this jerk out of here, he doesn't understand how business works, I never want to talk to him again!

5 years later
Bean Counter:  What is going on with all this negative press about our process node?
Engineer: TSMC passed us up, and AMD is using their node.
Bean Counter: So?  We're still making a lot of money.  
Engineer: We won't be if this continues.
Bean Counter: Well where are we on this node, we've been talking about it for years.  Why can't you guys do your job?
.....


----------



## TheLostSwede (Jul 24, 2020)

trparky said:


> I like what one person said in another forum and I think this explains what's happening at Intel...
> 
> We can apply this to many companies including Oracle, Boeing, GE, Xerox, IBM, etc.
> 
> In the case of AMD, that "innovation-minded leader" is Lisa Su. Intel needs their own "Lisa Su".


Interesting, and it clearly highlights why companies need to spend a lot of money on their R&D divisions and keep them at the forefront of the company to keep being innovative, rather than slowly dying over time.


----------



## trparky (Jul 24, 2020)

Exactly @RandallFlagg, people who can't see past the next quarter are put in charge of an engineering company. This is also what happened at Boeing: bean counters were put in charge, and now we have the 737 MAX that crashed not once but twice, killing hundreds of people.

When will these companies get it through their heads? If you need things done... don't turn to the bean counters. Turn to the engineers; they'll get it done.



RandallFlagg said:


> Why can't you guys do your job?


Bean counters, always putting the blame on someone else.


----------



## ToxicTaZ (Jul 24, 2020)

How is this delayed? 

Intel's Meteor Lake (13th Gen) was always scheduled for late 2022,

and everyone already knew Intel's Alder Lake was scheduled for late 2021.

This is fake news. 

I have been preaching about Meteor Lake for 2022 for over a year.


----------



## mtcn77 (Jul 24, 2020)

ToxicTaZ said:


> How is this delayed?
> 
> Intel Meteor Lake 13th was always scheduled for late 2022
> 
> ...


Sometimes I wish Intel was as zealous as its fanbase. Even Samsung has done more in the previous decade.


----------



## trparky (Jul 24, 2020)

ToxicTaZ said:


> This is fake news.


How is it fake news when Intel literally talks about it in their quarterly financial report?


----------



## Zotz (Jul 24, 2020)

ToxicTaZ said:


> And everyone already knew about Intel Alder Lake scheduled for late 2021



Well, not quite everyone.  The market didn't know, apparently.



trparky said:


> Turn to the engineers, they'll get it done.


But you might run out of money while they're doing it.  Some 19th-century capitalist said (excuse the dated genderization):

"There are three ways to go bankrupt:  Women, liquor, and engineering.  Of those, women are the most enjoyable but engineering is the most certain".


----------



## Ashtr1x (Jul 24, 2020)

Apart from all those comments: Intel first needs to fire all these MBA bean-counter fools, and Bob Swan needs to be booted out ASAP. Intel's greatest strength is its fab plants, and outsourcing them would be suicidal for the company's ROI on those plants, especially given how much higher Intel's volume is. And why did they peddle BS when BK was ousted, spinning it into some MeToo garbage, instead of telling the goddamned truth that he ruined Intel by sitting complacent on 10nm while appeasing investors with PR talk? Intel needs to get its act together. Jim Keller has also left, and there's no telling whether he faced an immovable mountain or left after his work was done. The USA's top semiconductor corporation, the company which built billions of PCs and made its fortune on first-class performance, is facing this kind of struggle with its 7nm technology while TSMC is already on 5nm and Samsung has managed 8nm EUV. It's awful.

They should stop entertaining those shitty side projects and M&A (Mobileye) and focus on their bread and butter: CPUs and lithography. With the way California looks every day, with all that political nonsense in the atmosphere, Intel has to get proper talent capable of handling the corporation and its principles. It's pathetic: the 7nm delay again, and high-performance 10nm is nowhere to be found. Goldmont trash in x86 when Sunny Cove exists is insane, and those awful 8C12T RKL-S parts, still PCIe 3.0 on Z5xx. Damn it, Intel; get your things straight and cut the fat off those useless side projects. Oh, and that fool Bob Swan sold off the 5G R&D and patents to Apple for immediate cash instead of benefiting from them; with higher R&D spending and talent there would have been ROI, but they put in less money and spend more on useless PR trash = failure.


----------



## svan71 (Jul 24, 2020)

It's like in Spider-Man: "back to formula". How f'd up is it to be delayed at least another year?


----------



## john_ (Jul 24, 2020)

kapone32 said:


> I really believe AMD truly does not care about Intel's prices. They are basically selling them as fast as they can make them right now. It is a hope that is not too exotic that we will see performance improvements in IPC and clock speed without building more cores. I also believe the single CCX CPU line will be a reality in both APUs and CPUs. In terms of the thread by the time Intel actually releases 7nm AMD should have the DIY market undisputed sewn up.


Let me try another theory. Maybe AMD cares more about its own prices. If I am not mistaken, AMD needs to buy a certain number of wafers from GlobalFoundries for one more year. If I am not saying something completely stupid here, they probably have reasons to keep selling the Ryzen 1000 and 2000 series for at least one more year. So pushing too many 7nm CPUs into the under-$200 market would make that task too difficult, except if OEMs buy huge quantities of cheap octa-core Zen+ CPUs to use in their cheaper models.


----------



## ARF (Jul 24, 2020)

john_ said:


> Let me try another theory. Maybe AMD cares more about its own prices. If I am not mistaken, AMD needs to buy a certain number of wafers from GlobalFoundries for one more year. If I am not saying something completely stupid here, they probably have reasons to keep selling the Ryzen 1000 and 2000 series for at least one more year. So pushing too many 7nm CPUs into the under-$200 market would make that task too difficult, except if OEMs buy huge quantities of cheap octa-core Zen+ CPUs to use in their cheaper models.




AMD could still use the GlobalFoundries wafers for anything else - the IO die, chipsets, console chips, mobile, whatever...



Ashtr1x said:


> Apart from all those comments: Intel first needs to fire all these MBA bean-counter fools, and Bob Swan needs to be booted out ASAP. Intel's greatest strength is its fab plants; outsourcing them would be suicidal for the company's ROI on those plants, especially given how much volume Intel puts out. And why did they peddle BS when BK was ousted, instead of telling the goddamned truth that he ruined Intel with 10nm, sitting complacent the whole time, appeasing investors with PR talk, spun into some MeToo garbage? Intel needs to get its act together. Jim Keller has also left, with no idea whether he faced an immovable mountain or left after his work was done. Intel, the USA's top semiconductor corporation, the company that built billions of PCs and made its fortune on first-class performance, is facing this kind of struggle with a pathetic 7nm technology while TSMC is already at 5nm and Samsung has managed 8nm EUV. It's awful.
> 
> They should stop entertaining those shitty side projects and M&A (Mobileye) and focus on their bread and butter: CPUs and lithography. With the way California looks every day, with all the political nonsense in the atmosphere, Intel has to get proper talent capable of handling the corporation and its principles. It's pathetic: another 7nm delay, and high-performance 10nm is nowhere to be found. Goldmont trash next to x86 Sunny Cove is insane, and those awful 8C/16T RKL-S parts are still PCIe 3.0 on Z5xx. Damn it, Intel. Get your act together and cut the fat off those useless side projects. Oh, and that fool Bob Swan sold off the 5G R&D and patents to Apple for immediate cash instead of benefiting from them; instead of putting more money into R&D and talent for ROI, they put in less and spend more on useless PR trash = failure.



I don't think the talent is guilty or should be blamed; it's Intel's desire to keep things exactly as they have been for years - you know, when something works (or looks like it's working), don't make changes. Sitting on the very old PCIe 3.0 for no reason is just one example of this.
Weird ideas (big.LITTLE, half-done HT) and a lack of direction - that is, mission, strategy and vision at the highest level, politics and policies included.


----------



## mtcn77 (Jul 25, 2020)

For the first time, people are having a field day about stock listings. Weird times...


----------



## ToxicTaZ (Jul 25, 2020)

ARF said:


> AMD could still use the GF's wafers for anything else - like the IO die, chipsets, consoles chips, mobile whatever...
> 
> 
> 
> ...



PCIe 3.0?
11th Gen Rocket Lake is PCIe 4.0, out in Q1 2021 (14nm++).

New architecture: 16 cores (big.LITTLE).

12th Gen Alder Lake is PCIe 4.0, due Q4 2021 (10nm++).

13th Gen Meteor Lake is PCIe 5.0, due Q4 2022 (7nm+).

People like to talk about PCIe 4.0 technology, yet it's very short-lived, as both AMD and Intel will have PCIe 5.0 with DDR5, USB4, Wi-Fi 6E and 5G technology in two years.

PCIe 4.0 spans only two generations on both sides, AMD and Intel.

Not sure why anyone would buy into Intel's H5 LGA 1200 socket when we all know the H6 LGA 1700 socket is coming, at the same time as AMD's AM5!

Intel's 13th generation (Meteor Lake-S) is 7nm+ on the second-generation H6 LGA 1700 socket: PCIe 5.0 with DDR5, USB4, Wi-Fi 6E and 5G technology.

Intel Meteor Lake (7nm+) is their most important upcoming project, featuring the new core design (Ocean Cove) that Jim Keller worked on directly.

Meteor Lake will be Intel's first 7nm+ CPU, out of their brand-new Fab 42 factory.

Intel's 7nm+ EUV technology is using graphene.

We will have to change the name from Silicon Valley to Graphene Valley very soon.


----------



## Xex360 (Jul 25, 2020)

londiste said:


> I do not understand the popular opinion that Intel does nothing. They absolutely do a lot of things.
> We are all disappointed that they are failing to bring out proper competition to AMD Ryzens but come on.
> 
> After Sandy Bridge they did 22nm and 14nm manufacturing processes, arguably the 10nm process and 7nm is somewhere in the pipeline.
> ...


Of course it's an exaggeration to say so, but given the size of the company and how much money they have, they could have done much more; AMD should never have been able to catch up, since they were basically broke. Intel locked us at a stupid four cores for ages; the Ryzen 1700 destroys Intel's four-core parts in everything besides games that were made around Intel's architecture (hopefully this will change now, and we'll have games optimised for both platforms).
That's why at least some say so.


----------



## TheGuruStud (Jul 25, 2020)

Caqde said:


> So based on what we are seeing here. Alder-Lake S will go against the 5nm Zen 4 Desktop CPU's (Late 2021/Early 2022)..... And the first 7nm Intel CPU's will be up against a future 5nm+ "Zen 5" chip in Late 2022/Early 2023... Yup doing good Intel. A bit later and you can play against a potential 3nm "Zen 6" chip. Keep the delays coming....



Intel won't have volume production. TSMC will probably have amazing yields, as usual. Intel is gonna melt down, aside from the power consumption lol

And you know this is only the first delay announcement. Another will come.


----------



## efikkan (Jul 25, 2020)

ToxicTaZ said:


> People like to talk about PCIe 4.0 technology yet its very short lived as Both AMD and Intel have PCIe 5.0 with DDR5 and USB-4 WiFi-6E 5G technology in two years.
> PCIe 4.0 is only on two generations from both sides AMD and Intel.


Luckily, PCIe is backwards compatible, so it's not like it becomes obsolete.
Also, PCIe 5.0 will be very expensive and might be a premium feature for a while.
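The generational leaps being debated here are easier to weigh with the raw per-lane numbers. A quick sketch (the transfer rates and 128b/130b encoding come from the PCIe 3.0/4.0/5.0 specifications; the helper function itself is just illustrative):

```python
# Rough usable bandwidth per PCIe lane, per generation.
# PCIe 3.0/4.0/5.0 all use 128b/130b encoding; raw rates are in GT/s.
RAW_RATE_GTPS = {3: 8.0, 4: 16.0, 5: 32.0}

def lane_bandwidth_gbps(gen: int) -> float:
    """Usable GB/s per lane after 128b/130b encoding overhead."""
    return RAW_RATE_GTPS[gen] * (128 / 130) / 8  # bits to bytes

for gen in sorted(RAW_RATE_GTPS):
    per_lane = lane_bandwidth_gbps(gen)
    print(f"PCIe {gen}.0: {per_lane:.2f} GB/s per lane, x16 slot ~{per_lane * 16:.1f} GB/s")
```

Each generation doubles the signaling rate, so an x16 slot goes from roughly 16 GB/s (3.0) to 32 GB/s (4.0) to 63 GB/s (5.0), which is why 4.0 looks like a short-lived stepping stone.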



ToxicTaZ said:


> Not sure why would anyone buy Intel H5 LGA 1200 socket when we all know that H6 LGA 1700 socket is coming, at the same time as AMD has AM5 coming!


CPU upgrades are really only relevant if a platform offers compatibility for 3-4 years, and AM4 has shown us that this only sort of works, with some major compromises.

It's much more important that a platform properly supports its CPUs and works from day one. The Zen (1) launch was horrible in terms of BIOS support (incl. memory, PCIe stability, etc.); Zen 2 was a lot better, yet still had BIOS and firmware issues for 2-3 months. I will be watching Zen 3 closely to see if it's more mature at launch; I can't recommend any platform until it's fairly reliable.



Xex360 said:


> Of course it's an exaggeration to say so, but given the size of the company and how much money they have they could've done much more, AMD should've never been able to catch up, they were basically broke. Intel locked us with stupid 4 cores for ages, the Ryzen 1700 destroys Intel's 4 cores besides games that were made around Intel's architecture (hopefully this will change now, and we'll have games optimised for both platforms).
> That's why at least some say so.


No game is "made around Intel's architecture". It's not possible to target the microarchitecture in x86 code.
And no, the Skylake family does very well in tasks including Photoshop, Premiere, web browsing, etc.

And for your information, back when Skylake launched there were indications that it would move to 6 cores for the mainstream, but the yields on 14nm were still not good enough. Engineering samples of Cannon Lake-S, which was targeted for late 2016/early 2017, featured 8 cores. So it's the struggles (incompetence?) with Intel's nodes that have kept them at 4 cores, not a lack of ambition or "evil" plans to keep you at 4 cores.


----------



## Chrispy_ (Jul 25, 2020)

Does anyone think Intel made Alder Lake for Apple, who basically turned around and said "piss off, we're doing the chips ourselves now", leaving Intel with a product that no desktop maker wants, on a process that no laptop maker wants either?

I'm just spitballing, but Intel certainly isn't after the consumer performance crown or the value crown with Alder Lake parts, and they're not exactly server-grade either...


----------



## EarthDog (Jul 25, 2020)

Chrispy_ said:


> Does anyone think that Intel made Alder Lake for Apple, who basically turned around and said "piss off, we're doing the chips ourselves now" to Intel, leaving them with a product that no desktop maker wants, on a process that no laptop maker wants either?
> 
> I'm just spitballing but Intel certainly isn't after the consumer performance crown or value crown with Alderlake parts, and they're not exactly server-grade either....


Nope. Can't say I believe that.


----------



## Vayra86 (Jul 25, 2020)

Chrispy_ said:


> Does anyone think that Intel made Alder Lake for Apple, who basically turned around and said "piss off, we're doing the chips ourselves now" to Intel, leaving them with a product that no desktop maker wants, on a process that no laptop maker wants either?
> 
> I'm just spitballing but Intel certainly isn't after the consumer performance crown or value crown with Alderlake parts, and they're not exactly server-grade either....



People overestimate Apple. The company is indeed big, but in volume it's not unique at all.


----------



## psyclist (Jul 25, 2020)

jeremyshaw said:


> The problem with the 7nm delay is Aurora, IMO. Intel already got a reprieve once, if they screw it up again, do they finally lose the contract? Will AMD snatch all three Exaflop contracts in the US?



This was my initial thought as well; there is a lot riding on Aurora. The performance they were touting seemed very aggressive, as did the timeline. Huge kick in the teeth if Intel loses that contract.

It will be interesting to see where we are in 5 years. Lisa has been at AMD for 5 years and made very strategic moves to outmaneuver the giant. With Nvidia pushing performance aggressively on the GPU front, AMD took that same formula to the CPU side, and look where we are!


----------



## ARF (Jul 25, 2020)

londiste said:


> I do not understand the popular opinion that Intel does nothing. They absolutely do a lot of things.
> We are all disappointed that they are failing to bring out proper competition to AMD Ryzens but come on.
> 
> After Sandy Bridge they did 22nm and 14nm manufacturing processes, arguably the 10nm process and 7nm is somewhere in the pipeline.
> ...




What's disappointing is that many OEM customers still sit on old Intel technology - 4c/8t or 6c/6t or 8c/8t in brand-new PCs - instead of having the much superior, faster, more energy-efficient, more secure and cheaper competitive AMD products. This is not only disturbing but also scary.


----------



## TheGuruStud (Jul 25, 2020)

ARF said:


> What's disappointing is that many OEM users still sit on old technology by Intel - 4c/8t or 6c/6t or 8c/8t in brand new PCs, instead of having the much superior/faster/more energy efficient/more secure and cheaper AMD competitive products. This is not only disturbing but also scary.



I thought brand new laptops were supposed to get shit battery life? They're too dumb to know their ass from a hole in the ground.

That explains most stuff in the world.


----------



## Lucas_ (Jul 25, 2020)

john_ said:


> Funny that OEMs will have to start pushing AMD models in the market as the premium options, because Intel CPUs are going to become non-competitive in a year.
> 
> Now we also know why Apple chose this time to switch to ARM.
> 
> On the other hand, Zen 3 will be ultra expensive to avoid pushing Intel to drop prices.



The tuxedocomputers guys have good setups, and they also promote Linux. I want to get the 4800, but I wanted a dedicated graphics card, which is still not an option; on the other hand it's pretty cool.
I saw another store as well; only the big OEMs like Dell are still resisting. I would also like to see a Dell XPS 15 with a 4000-series AMD CPU.


----------



## ARF (Jul 25, 2020)

TheGuruStud said:


> I thought brand new laptops were supposed to get shit battery life? They're too dumb to know their ass from a hole in the ground.
> 
> That explains most stuff in the world.




Do you honestly think I understood anything from your post?

The only part I got was the bit about battery life: AMD's U-series with 8c/16t should be great for both ultra-high performance and very long battery life.

What explains most stuff in the world? The presence of corruption among humans?


----------



## bikemanI7 (Jul 25, 2020)

I used AMD for many, many years. After experiencing performance issues in 2016 with my new-to-me AMD FX 8310 system, which a friend gave me to replace my acting-up AMD Athlon 64 system, I happily used it, dealt with the driver issues I had at times, and tried to get the best performance I could from the hardware I had then.

I suddenly came across some money in August 2017, so I decided to go with a higher-end Intel system, as friends in games were telling me to buy an Intel-based build. It had been a long time since I was on an Intel/Nvidia setup. I used that until it started having overheating problems in June 2020, and the local PC shop gave me a sweet deal on a newer Intel 10th Gen 10700, board, and case (reusing some of the hardware from the old system).

Will I stay on Intel years from now? Not sure; I may try AMD again in 3-5 years, depending on what's out by then.


----------



## TheGuruStud (Jul 25, 2020)

ARF said:


> Do you honestly think that I understood anything from your post?
> 
> The only thing that I did is about the battery life - AMD's U-series with 8c/16t should be great both with ultra high performance and ultra durable battery life.
> 
> What explains most of the world stuff? Presence of corruption among the humans?



It's a common phrase in the US meaning they're stupid AF. So, yes, most people being stupid is the source of most problems.


----------



## efikkan (Jul 25, 2020)

Let's not get _too_ philosophical here.

Alder Lake with its hybrid technology doesn't excite me; I don't believe it belongs on the desktop. While Rocket Lake might be a decent "stop gap" for the mainstream desktop, I think most are missing the most interesting piece of the puzzle: Ice Lake-X will be a very interesting contender against Zen 3-based Ryzen 9s and Threadrippers. And while it probably can't get close to the highest Threadripper core counts, most power users are looking for a balance between core count and core speed, along with good IO options. Many such users do photo or video editing or development on the same machine as gaming, and I think there are many of them in our audience here. I believe it would be a mistake for AMD if their next Threadrippers start at 24 cores; I think 12-16 core HEDT models would be compelling to many buyers. This is a segment where I want more competition: high core speed, "medium" core count, plenty of IO.


----------



## ToxicTaZ (Jul 26, 2020)

efikkan said:


> Let's not get _too_ philosophical here
> 
> Alder Lake with its hybrid technology doesn't excite me, I believe it doesn't belong on the desktop. While Rocket Lake might be a decent "stop gap" for the mainstream desktop, I think most is missing the most interesting piece of the puzzle. Ice Lake-X will be a very interesting contender against Zen 3 based Ryzen 9 and Threadrippers. And while it probably can't get close to the highest core count of Threadripper, most power users are looking for a balance between core count and core speed, while having good IO options. Many such users are doing either photo or video editing or development on the same machine as gaming, and I think there are many such users in our audience here. I believe it would be a mistake by AMD if their next Threadrippers start at 24 cores, I think 12-16 core HEDT models would be compelling to many buyers. This is a segment I want more competition; high core speed and "medium" core count, plenty of IO.



Intel 13th generation: 16 cores (7nm+), Meteor Lake with the Ocean Cove core design.

I'd rather have energy efficiency with extreme IPC (80%+ over 10th Gen), 16 cores (7nm+ and big.LITTLE).

Intel's new Fab 42 is 7nm from the start. Everyone is waiting for Meteor Lake, Intel's first 7nm+ and PCIe 5.0 part; it's basically the rebirth of the great Ivy Bridge (3770K), Intel's first 22nm and PCIe 3.0 part.

All eyes are watching Intel's 7nm very closely.

Intel Alder Lake on 10nm++ will be just as good as Sandy Bridge was in all-around quality.

Intel 10nm++ will be used to make the Intel 700 series chipsets as well.


----------



## rgrooms (Jul 26, 2020)

They're still remaining competitive thanks to the 14nm+++++ refinements... if it weren't for those, they would be looking at big trouble. With Ryzen 3rd gen coming at the end of the year, and rumors that it will be much improved over 2nd gen, AMD could really push for the gaming crown... they are already killing it on the productivity side.


----------



## TheGuruStud (Jul 26, 2020)

rgrooms said:


> They're still remaining competitive with the 14nm+++++ refinements...if it wasn't for that they would be looking at big trouble. With Ryzen 3rd gen coming up at the end of year and rumors that it will be much improved over 2nd gen could really push for the gaming crown...they are already killing it with the productivity side of it.



Double and triple the power consumption on mobile (and still lower performance) is not competitive. That's where the majority of consumer sales are. People are just braindead.


----------



## rgrooms (Jul 26, 2020)

TheGuruStud said:


> Double and triple power consumption on mobile (and still lower perf) is not competitive. That's where the majority of consumer sales are. People are just braindead.


I'm not sure about that lower performance with the Renoir 4000 chips coming up... as for the majority of consumer sales, well, I don't know; the desktop market is pretty strong too. Not a big fan of the mobile market anyway... never have been: overpriced and hardly upgradeable.


----------



## efikkan (Jul 26, 2020)

rgrooms said:


> They're still remaining competitive with the 14nm+++++ refinements...if it wasn't for that they would be looking at big trouble.


There are *no* 14nm nodes beyond 14nm++. The "+" refers to node iterations, not chip designs, and means they have changed node parameters such as gate pitch, metal composition, etc.
Intel has also improved designs beyond the node, such as changing the TIM, modifying the heat spreader and, of course, optimizing the chip design.



rgrooms said:


> With Ryzen 3rd gen coming up at the end of year and rumors that it will be much improved over 2nd gen could really push for the gaming crown...they are already killing it with the productivity side of it.


Zen 3 will probably be closer to Intel in gaming performance, but to beat it they need to make a better CPU front-end than Intel and get similar or better memory latency.


----------



## Xex360 (Jul 26, 2020)

efikkan said:


> Luckily, PCIe is backwards compatible, so it's not like it becomes obsolete.
> Also PCIe 5.0 will be very expensive and might be a premium feature for a while.
> 
> 
> ...


That proves my point: different applications favour different architectures. How else would you explain that Zen 2 is faster in Cinebench in both single- and multi-core, while it loses in Photoshop? The same goes for decompression in 7-Zip.
I don't buy the argument that they wanted to but couldn't. Partially, yes, but not completely; weirdly, in recent "generations" they were suddenly able to add more cores.


----------



## Chrispy_ (Jul 26, 2020)

efikkan said:


> Zen 3 will probably be closer to Intel in gaming performance, but to beat it they need to make a better CPU front-end than Intel and get similar or better memory latency.


Thing is, do they need to beat Intel in gaming performance? Outside of contrived 2080Ti 720p testing specifically designed to move the gaming bottleneck away from the GPU or playing CS:GO at low details on a 300Hz monitor, AMD's gaming performance is rarely, if ever, low enough to be a significant factor.

Realistically, the more cores your CPU has, the more chance there is of a stable framerate since background OS tasks, and even background game-engine threads are likely to be finished sooner and without interrupting or causing any kind of resource conflict with the ultra-crucial game-engine thread that is the current bottleneck to lower frame times. That's felt in the minimum or 99th percentile numbers.

Can a 4GHz Zen2 core provide the very fastest gaming performance on the market? No. It's genuinely worse at the job than Intel's current lineup.
Can a 4GHz Zen2 core run a gaming thread fast enough that in 99.9% of all situations it doesn't matter? Absolutely.

I'm not going to say _no_ to more gaming performance, but we do have to remember how unrealistic and unrepresentative of actual gaming the CPU game-testing methodologies are. Nobody, and I mean _nobody_, dropping $3000+ on a water-cooled, overclocked i9 with a 240Hz+ monitor and a 2080 Ti is playing games at 720p.


----------



## efikkan (Jul 26, 2020)

Xex360 said:


> That proves my point, different applications favour different architecture, how else would you explain that Zen2 is faster in Cinebench both in single and multi core, while it loses in Photoshop, the same goes for decompression in 7zip.


What specifically proves your point?

If you think that a piece of software performing better on one CPU than another proves it's optimized for that CPU, that makes absolutely no sense whatsoever.
In order for software to be optimized for a piece of hardware, it needs to be intentionally designed to utilize either a specific feature or a specific characteristic of that hardware.

PC software relies on the same ISA whether it's running on AMD or Intel. And with the exception of AVX-512 and a few other features, Zen and Skylake have pretty much feature parity. All modern x86 designs rely on microoperations, which we can't target, so we can't write truly low-level code for either of these architectures. These CPUs are also highly superscalar, but not explicitly, so software can't control or optimize for it directly.

Software scales differently on different CPUs because their architectures have different strengths in terms of resource balancing. Skylake has a stronger front-end with better branch prediction and a larger instruction window, it has lower latency in the memory controller, and there are some differences in the caches. Zen/Zen 2 have a different configuration of execution ports, which can have a slightly higher peak combined int/vec performance under the right conditions. Zen 2 is also on a more energy-efficient node, which helps a lot in those heavily threaded benchmarks where Skylake throttles much more, but this has nothing to do with software optimization. So the only thing software developers can do to "optimize" for one microarchitecture or the other is to shuffle around the assembly code and see if they get a minor performance difference. Since these CPUs are not explicitly superscalar, execute out of order, and give us no way to control or debug the microoperations, this is pretty much a pointless effort that probably yields <5% gains, and the gains will not be consistent. Pretty much no software, and especially no games, uses low-level assembly code anyway. Software today is mostly high-level bloated code, and such code generally performs a tiny bit better on Intel hardware, not due to optimizations (rather the lack thereof), but due to a stronger front-end.

There is no software out there "optimized for Intel" (unless you count custom software relying on features AMD has not implemented yet).
But I've seen a case where a library intentionally runs slower code on AMD hardware at runtime; that is not optimization, that is sabotage, and it is not playing fair.
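The distinction drawn here, dispatching on the feature flags a CPU actually reports versus gating the fast paths on the vendor string, can be sketched in miniature. The kernel names and dispatch logic below are made up for illustration; the feature strings mirror x86 CPUID flags:

```python
# Two ways a library might pick a SIMD code path at runtime.
# Kernel names and dispatch logic are purely illustrative.

def pick_kernel(features: set[str]) -> str:
    """Feature-based dispatch: use the fastest path the CPU actually
    supports, regardless of who made it."""
    if "avx2" in features:
        return "avx2_kernel"
    if "sse4_2" in features:
        return "sse42_kernel"
    return "scalar_kernel"

def pick_kernel_vendor_gated(vendor: str, features: set[str]) -> str:
    """Vendor-gated dispatch: fast paths only on 'GenuineIntel',
    the behaviour described above as sabotage rather than optimization."""
    if vendor == "GenuineIntel" and "avx2" in features:
        return "avx2_kernel"
    return "scalar_kernel"

zen2_features = {"sse4_2", "avx2"}  # Zen 2 reports these CPUID flags
print(pick_kernel(zen2_features))                               # avx2_kernel
print(pick_kernel_vendor_gated("AuthenticAMD", zen2_features))  # scalar_kernel
```

The second function falls back to the scalar path on a Zen 2 chip even though the chip advertises AVX2, which is exactly why dispatch on feature flags, not vendor strings, is considered the fair approach.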



Chrispy_ said:


> Thing is, do they need to beat Intel in gaming performance?


No, they need to be _close enough_, and with Zen 3 they might be within the margin of error in many cases.



Chrispy_ said:


> I'm not going to say no to more gaming performance, but we do have to remember how unrealistic and unrepresentative of actual gaming the CPU game testing methodologies are. Nobody, and I mean nobody dropping $3000+ on a water-cooled, overclocked i9 with a 240Hz+ monitor and 2080Ti is playing games at 720p.


I know 720p or 1080p at low or medium is pointless with a high-end card; that's only interesting for "academic discussions", not buying recommendations.

But then consider: if you're buying a gaming machine and there are two mostly "equal" options in your budget, while one has ~3% more gaming performance, would you say no to it?
Another argument which most ignore is that Zen 2 (for now) needs overclocked memory to become "competitive" in gaming, while Intel can run stock memory speeds and still perform better. I'll take the long-term stability, please.



Chrispy_ said:


> Realistically, the more cores your CPU has, the more chance there is of a stable framerate since background OS tasks, and even background game-engine threads are likely to be finished sooner and without interrupting or causing any kind of resource conflict with the ultra-crucial game-engine thread that is the current bottleneck to lower frame times. That's felt in the minimum or 99th percentile numbers.


Sure, any time the OS scheduler kicks out one of the game's threads, it can cause stutter, on the scale of ~1-20 ms for Windows. But then again, a faster core will finish sooner, so other threads waiting for it will get to work earlier and finish with a larger margin before the "deadline". So it's a complicated balancing act.
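To put that ~1-20 ms figure against typical frame "deadlines", a throwaway calculation (the refresh rates are ordinary examples, not numbers from the thread):

```python
# Frame-time budget at common refresh rates, to weigh against
# a 1-20 ms scheduler preemption.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz} Hz: {frame_budget_ms(hz):.2f} ms per frame")
```

Even a few milliseconds of preemption eats most of a 144 Hz budget (~6.9 ms) and overruns a 240 Hz budget (~4.2 ms) outright, which is why it shows up in minimum and 99th-percentile numbers.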


----------



## londiste (Jul 26, 2020)

ARF said:


> What's disappointing is that many OEM users still sit on old technology by Intel - 4c/8t or 6c/6t or 8c/8t in brand new PCs, instead of having the much superior/faster/more energy efficient/more secure and cheaper AMD competitive products. This is not only disturbing but also scary.


4, 6 and 8 cores are still all valid choices in mainstream. 12 and 16 cores are now viable but still niche. These OEM products you mention are not marketed or meant for enthusiasts or workstation-like uses where many cores help.


Chrispy_ said:


> I'm not going to say _no _to more gaming performance, but we do have to remember how unrealistic and unrepresentative of actual gaming the CPU game testing methodologies are. Nobody, and I mean _nobody_ dropping $3000+ on a water-cooled, overclocked i9 with a 240Hz+ monitor and 2080Ti is playing games at 720p.


You are looking at it from the wrong side. 10900K vs 3900X/3950X is the wrong comparison. Unless a game is very thread-limited - and today, 6c/12t is plenty - something like the 10400 is going to be as fast as or faster than any Ryzen, and the 10600K is going to blow past them, more so when overclocked.


----------



## ARF (Jul 26, 2020)

londiste said:


> 4, 6 and 8 cores are still all valid choices in mainstream. 12 and 16 cores are now viable but still niche. These OEM products you mention are not marketed or meant for enthusiasts or workstation-like uses where many cores help.



It's not. The Ryzen 7 4800U is 85% faster than the Core i5-8500 while drawing 23% of its power.
The Core i5-8500 is not a valid choice. It's a turd.
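Taking the post's figures at face value (85% faster at 23% of the power; these are the claim under discussion, not independently verified numbers), the implied performance-per-watt gap works out to roughly 8x:

```python
def perf_per_watt_ratio(speedup: float, power_fraction: float) -> float:
    """Relative perf/W of chip A vs chip B, given A is `speedup` times
    as fast while drawing `power_fraction` of B's power."""
    return speedup / power_fraction

# "85% faster" -> 1.85x performance, "23% of its power" -> 0.23
print(f"~{perf_per_watt_ratio(1.85, 0.23):.1f}x perf per watt")  # ~8.0x
```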


----------



## londiste (Jul 26, 2020)

ARF said:


> It's not. The Ryzen 7 4800U is 85% faster than Core i5-8500, while drawing 23% of its power.
> Core i5-8500 is not a valid choice. It's a turd.


That is a very strange comparison: a state-of-the-art, high-end, low-power CPU vs a two-year-old midrange desktop CPU.
2 more cores and 10 more threads is a big deal. Both chips are also about the same size, while 7nm is ~70% denser, so about 70% more transistors in the 4800U.
Desktop CPUs are not run at their most power-efficient point and, to be honest, neither is the 4800U; it is pretty heavily constrained by its power limit (which pretty much never seems to be 15W).

At least choose an apt comparison - 4800U should be able to convincingly beat 1065G7 or 10810U.
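The back-of-the-envelope above (same die area, ~70% higher density, hence ~70% more transistors) is just a product of two ratios; a throwaway sketch:

```python
def relative_transistor_count(density_ratio: float, area_ratio: float = 1.0) -> float:
    """Transistor count relative to the old chip: density ratio times
    die-area ratio (1.7 = 70% denser, 1.0 = same die size)."""
    return density_ratio * area_ratio

print(relative_transistor_count(1.7))       # same die on a ~70% denser node -> 1.7x
print(relative_transistor_count(1.7, 0.5))  # half-size die on that node -> 0.85x
```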


----------



## ARF (Jul 26, 2020)

londiste said:


> That is a very strange comparison. State of the art high-end low-power CPU vs a two-year old midrange desktop CPU.
> 2 more cores and 10 more threads is a big deal. Both chips are also the same size while 7nm is 70% denser, so about 70% more transistors in 4800U.
> Desktop CPUs are not at their point of power efficiency and to be honest, neither is 4800U - it is pretty heavily constrained by its power limit (which pretty much never seems to be 15W).
> 
> At least choose an apt comparison - 4800U should be able to convincingly beat 1065G7 or 10810U as well.




My question is: why do the likes of HP still ship this 2-year-old midrange desktop CPU in brand-new office PCs? Why don't they use lower-power, state-of-the-art CPUs in their machines?


----------



## londiste (Jul 26, 2020)

Office PCs?
The 4800U is a mobile CPU that has been out for a little under 2 months.
Ryzen 4000 series APUs were announced this Tuesday.

The 4800U's price is unknown, and AMD is probably discounting them heavily to get as wide adoption as possible.
The i5-8500 (or the newer, HT-enabled i5-10500) officially costs ~190 moneys, and the i5-10400 is ~20 moneys less. The old i5-8500 is probably heavily discounted for OEMs, or they are clearing stock.
The 8-core desktop Renoir - the 4750G PRO that we can see on sale now - costs 400.

In Cinebench R20 MT, the R7 4750G PRO is about 50% faster than the i5-10500. Power consumption is probably comparable.


----------



## ARF (Jul 26, 2020)

londiste said:


> Office PCs?


OEMs, brother - I think OEMs ship office PCs most of the time. Maybe I am wrong, I don't know...


----------



## londiste (Jul 26, 2020)

Laptops (especially the thin-and-light laptops that 15W CPUs go into) and office PCs are two very, very different market segments.


----------



## HenrySomeone (Jul 26, 2020)

I honestly feel they are fine as long as 5 nm EUV from AMD doesn't come out considerably before their (proper) 10nm; only then would I start to worry a little.


----------



## ARF (Jul 26, 2020)

londiste said:


> Laptops (especially light-and-thin laptops that 15W CPUs are put into ) and office PCs are two very very different market segments.




Big organisations and companies pursue the lowest possible electricity costs. How would they achieve those targets without moving all of their configurations to U-series CPUs?
Office PCs need neither the extreme performance of 125-watt HEDT CPUs nor such a high power draw, especially when nobody even measures performance per watt.


----------



## ToxicTaZ (Jul 27, 2020)

ARF said:


> My question is. Why the likes of HP still ship this 2-year-old midrange desktop CPU in brand new office PCs? Why don't they use lower power state-of-the-art CPUs in their machines ?



That's easy!

Because HP is garbage lol


----------



## watzupken (Jul 27, 2020)

Vayra86 said:


> People overestimate Apple. Company is indeed big but in volume its not unique at all.



Apple doesn't compete on volume, in case you have not noticed. They thrive on big profit margins despite selling lower volumes than their competitors. And despite the high price of Apple products, they are still able to maintain a very healthy fan base.


----------



## Vayra86 (Jul 27, 2020)

watzupken said:


> Apple don't compete on volume if you have not noticed. They thrive on big profit margins despite lower volume sold as compared to their competitors. And despite the high price of Apple products, they are still able to maintain a very healthy fan base.



Yes, but in the context of being supplied with chips and components, who cares about fan base?


----------



## Chrispy_ (Jul 27, 2020)

londiste said:


> 4, 6 and 8 cores are still all valid choices in mainstream. 12 and 16 cores are now viable but still niche. These OEM products you mention are not marketed or meant for enthusiasts or workstation-like uses where many cores help.
> You are looking at it from the wrong side. 10900K vs 3900X/3950X is the wrong comparison to look at. Unless game is very thread-limited - and today, 6c/12t is plenty - something like 10400 is going to be as fast or faster than any Ryzen and 10600K is going to glow past them, more so when overclocked.


I'm looking at it from high-end side because that's the only side where the CPU is relevant to gaming. At the budget end the CPU is almost completely irrelevant. The money goes into the GPU at the low end, *every time*. If you're not doing that you're doing it wrong.

Take an R5 3600 vs the 10400 at the exact same price (ignoring the fact that the 3600 is usually $5-10 cheaper and comes with a usable, rather than an unusable, cooler). The 3600 is better at everything, including many games where the 3600 with DDR4-2666 will only be beaten by the 10400 using DDR4-3200. Yes, the _10600_ is significantly faster at games than the R5 3600, but then you have problems even getting hold of the damn thing because it's out of stock everywhere, and price-gouging is rampant (I've only seen it in stock at $300+). If you were adamant that your gaming CPU had to be a 10600, then you've just thrown away $130 of budget, forcing you to downgrade the planned 2060 KO to a rubbish little 1650S. Ouch. What's the point of a faster gaming CPU when your GPU sucks?


----------



## londiste (Jul 27, 2020)

ARF said:


> Why do big organisations and companies pursue lowest possible costs for electricity? How would they achieve their targets if they don't move all of their configurations to U-series CPUs?
> Office PCs need neither top extreme performance as seen in 125-watt HEDT CPUs, nor they need so high power draw, especially when they don't even measure performance/watt.


You mentioned i5 8500 - that is a 65W CPU. 35W CPUs are available from both AMD and Intel but are not too popular, even in office PCs. Moving to U-series will not happen simply because it would be too costly.
Office PCs are not heavily loaded anyway, so their real consumption is pretty low.

Electricity consumption as a cost factor for office PCs is not a real problem for companies. Most of the energy-efficiency drive for office PCs is regulatory, if even that.


----------



## skates (Jul 28, 2020)

Intel lost focus, initiative & energy.  Their leadership is to blame.  Time will tell if Intel goes into irreversible decline or they right the ship & do what is necessary to refocus, re-energize and gain the initiative.


----------



## ARF (Jul 28, 2020)

londiste said:


> You mentioned i5 8500 - that is a 65W CPU. 35W CPUs are available from both AMD and Intel but are not too popular, even in office PCs. Moving to U-series will not happen simply because it would be too costly.
> *Office PCs are not heavily loaded anyway*, so their real consumption is pretty low.
> 
> Electricity consumption as a cost factor for office PCs is not a real problem for companies. Most of the energy efficiency drive for office PCs is regulatory if even that.



This is quite disputable. There are always running applications, services and updates which quite heavily load the office computer.


----------



## EarthDog (Jul 28, 2020)

ARF said:


> This is quite disputable. There are always running applications, services and updates which quite heavily load the office computer.


Yeah, MS Office is a bitch to run, eh? Same with default services. THANK GOD AMD is here with all their cores and threads.  

Jokes aside, in any kind of enterprise or office environment updates are rolled out after hours 99% of the time.... not while users are sitting in front of their PC.

That said, it really depends on use. CAD designers, content creators etc, likely run a higher load than most general office workers (accounting, HR, CSRs, etc). But typical office functionality isn't much on a PC. There will always be exceptions.


----------



## kapone32 (Jul 28, 2020)

skates said:


> Intel lost focus, initiative & energy.  Their leadership is to blame.  Time will tell if Intel goes into irreversible decline or they right the ship & do what is necessary to refocus, re-energize and gain the initiative.


In some ways the market made Intel what it is - a publicly traded company sitting in the portfolios of the super rich.


EarthDog said:


> Yeah, MS Office is a bitch to run, eh? Same with default services. THANK GOD AMD is here with all their cores and threads.
> 
> Jokes aside, in any kind of enterprise or office environment updates are rolled out after hours 99% of the time.... not while users are sitting in front of their PC.
> 
> That said, it really depends on use. CAD designers, content creators etc, likely run a higher load than most general office workers (accounting, HR, CSRs, etc). But typical office functionality isn't much on a PC. There will always be exceptions.


There is no sane reason to do any work on a network during business hours. Could you imagine the telephone company putting the lines on standby (battery power) to change a transmission cable?


----------



## ARF (Jul 28, 2020)

EarthDog said:


> Yeah, MS Office is a bitch to run, eh? Same with default services. THANK GOD AMD is here with all their cores and threads.
> 
> Jokes aside, in any kind of enterprise or office environment updates are rolled out after hours 99% of the time.... not while users are sitting in front of their PC.
> 
> That said, it really depends on use. CAD designers, content creators etc, likely run a higher load than most general office workers (accounting, HR, CSRs, etc). But typical office functionality isn't much on a PC. There will always be exceptions.




Don't browsers use a logical/physical core per open tab?


----------



## EarthDog (Jul 28, 2020)

ARF said:


> Don't browsers use a logical/physical core per open tab?


Not that I know of... doesn't look like it on my end. Open your browser and see.

I'd be in deep shit if so, as I have, right this second, 38 Chrome tabs open and 'only' 16c/16t (it's a 32T CPU but I have HT disabled)... yet my CPU is, for all intents and purposes, at idle. Browsers do use a lot of RAM, but CPU use seems negligible (for my tabs currently). With Outlook and Excel open, Paint.NET (with a dozen images), and two Chrome instances with a total of 30+ tabs, watching Twitch shows ~10% give or take. If I remove Twitch, I'm down to 2-4% with everything else up.
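The "idle tabs cost almost nothing" observation is easy to sanity-check with a stdlib-only Python sketch: compare wall-clock time to actual CPU time for a process that mostly waits (the 0.5 s sleep is an arbitrary stand-in for an idle tab or background service):

```python
import time

# A process that mostly waits (like an idle browser tab) burns almost
# no CPU time even though wall-clock time keeps passing.
wall_start = time.perf_counter()
cpu_start = time.process_time()

time.sleep(0.5)  # stand-in for an idle tab / background service

wall = time.perf_counter() - wall_start  # elapsed real time (>= 0.5 s)
cpu = time.process_time() - cpu_start    # CPU time actually consumed (~0 s)
print(f"wall: {wall:.2f}s, cpu: {cpu:.4f}s")
```

`process_time` only advances while the process is executing on a core, which is why an "open but idle" workload barely registers in Task Manager.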


----------



## mtcn77 (Jul 28, 2020)

I wonder what is next for Intel - a leveraged buyout?


----------



## EarthDog (Jul 28, 2020)

mtcn77 said:


> I wonder what is next for intel a leveraged buyout?


Of the company? Are you serious or did I miss some sarcasm tags?


----------



## trparky (Jul 28, 2020)

EarthDog said:


> Of the company? Are you serious or did I miss some sarcasm tags?


I'm with you on that one. Intel is too big to be bought out. Not even the likes of Amazon and/or Apple have the kind of cash on hand to buy the likes of Intel.


----------



## Chrispy_ (Jul 28, 2020)

ARF said:


> Don't browsers use a logical/physical core per open tab?


They use a separate _thread_ per open tab (Chrome actually gives each tab its own _process_, each with multiple threads).

Windows 10 will happily spawn ~2000 threads on a dual-core Atom netbook with 2GB RAM.
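Thread count on its own really is that cheap - parked threads consume memory for their stacks but essentially no CPU. A stdlib-only Python sketch (scaled down to 500 threads to keep it quick):

```python
import threading

stop = threading.Event()

def idle_worker():
    # A parked thread burns no CPU; it just sleeps on the event.
    stop.wait()

# Spawn 500 idle threads (scaled down from the ~2000 a full Windows
# session carries) to show that sheer thread count is cheap.
workers = [threading.Thread(target=idle_worker, daemon=True) for _ in range(500)]
for t in workers:
    t.start()

count = threading.active_count()
print(count)  # the main thread plus the workers

# Wake everything up and let the threads exit cleanly.
stop.set()
for t in workers:
    t.join()
```

While all 500 threads are alive, overall CPU usage stays at idle; it's *runnable* threads, not existing ones, that cost cycles.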


----------



## EarthDog (Jul 28, 2020)

Chrispy_ said:


> They use a separate _thread_ per open tab.
> 
> Windows10 will happily spawn ~2000 threads on a dual-core atom netbook with 2GB RAM.


I've got 2908 currently on my 16c/16t config...


----------



## ARF (Jul 28, 2020)

EarthDog said:


> Not that I know of...doesn't look like it on  my end...open your browser and see.
> 
> I'd be in deep shit as I have, right this second, 38 Chrome tabs open and 'only' 16c/16t (its 32t CPU but I have HT disabled)............my CPU is, for all intents and purposes at idle. Browsers do use a lot of RAM... but CPU use seems negligible (for my tabs currently). With Outlook and Excel open, Paint.Net (with a dozen images), two chrome instances with a total of 30+ tabs and while watching twitch shows ~10% give or take. If I remove twitch, I'm down to 2-4% with everything else up.



Wrong. Those percentages are when the tab is idle. When you load the tab, you get your CPU up to 100% load.


----------



## EarthDog (Jul 28, 2020)

ARF said:


> Wrong. Those percentages are when the tab is idle. When you load the tab, you get your CPU up to 100% load.


Wrong? LOLwut?

When working any of these tabs, I've already mentioned my peak CPU use....which is with twitch.tv streaming as the active tab. Unless I don't understand what you are saying by "load the tab"? Please be more clear if I misunderstood?
EDIT: I just tried to bring up a new tab (blank) and then one of my bookmarks. There was a spike to a bit over 20%... then back down to ~10% while watching Twitch. My NV GPU is working harder than the CPU (by %) watching Twitch.

EDIT2: Currently I only have 6 tabs and none of them spike the CPU at all when making it the active tab. Again, my CPU use peaks at ~10% with twitch doing its thing as the active tab.


----------



## mtcn77 (Jul 28, 2020)

trparky said:


> I'm with you on that one. Intel is too big to be bought out. Not even the likes of Amazon and/or Apple have the kind of cash on hand to buy the likes of Intel.





EarthDog said:


> Of the company? Are you serious or did I miss some sarcasm tags?


Yes, I just watched Barbarians at the Gate. I took a liking to it.


----------



## EarthDog (Jul 28, 2020)

mtcn77 said:


> Yes, just watched barbarians at the gate. I took a liking to it.


lol, nice!! Fair enough!


----------



## ARF (Jul 28, 2020)

EarthDog said:


> Wrong? LOLwut?
> 
> When working any of these tabs, I've already mentioned my peak CPU use....which is with twitch.tv streaming as the active tab. Unless I don't understand what you are saying by "load the tab"? Please be more clear if I misunderstood?
> EDIT: I just tried to bring up a new tab (blank) and then one of my book marks. There was a spike to a bit over 20%... then back down to ~10% when watching twitch. My NV GPU is working harder than the CPU (by %) watching twitch.
> ...




This is what my CPU does when I open YouTube in 9 new tabs.


----------



## londiste (Jul 28, 2020)

I just tried that out of curiosity. After ~15 seconds the CPU usage goes down to 2-3% which is negligible. Playing 9 different videos on Youtube peaks higher but again after ~15 seconds CPU usage goes down to ~17% for a quite unrealistic use case.


----------



## Vayra86 (Jul 28, 2020)

EarthDog said:


> Yeah, MS Office is a bitch to run, eh? Same with default services. THANK GOD AMD is here with all their cores and threads.
> 
> Jokes aside, in any kind of enterprise or office environment updates are rolled out after hours 99% of the time.... not while users are sitting in front of their PC.
> 
> That said, it really depends on use. CAD designers, content creators etc, likely run a higher load than most general office workers (accounting, HR, CSRs, etc). But typical office functionality isn't much on a PC. There will always be exceptions.



The bottleneck is really the intranet services, not so much the workstations themselves. Although if you do anything more serious than basic administrative jobs, you'll definitely get load from Office alongside all of its cloud-based stuff. I can choke my lappy with Word and Excel, no problem whatsoever. Add a few tables and databases to Excel and it gets real fun, too. There are Word docs we do edits on that are so vast Word has trouble processing the thing. It's not so much absolute performance all the time as it is simultaneous processes running.

I've managed to destroy one laptop fan so far from all the heavy duty, took less than a year, but maybe I was lucky.


----------



## EarthDog (Jul 28, 2020)

ARF said:


> This is what my CPU does when I open YouTube in 9 new tabs.


Neat! Not one core even spiked to 100% when you did that, and in that (unrealistic) use scenario. So, what's your point?


londiste said:


> I just tried that out of curiosity. After ~15 seconds the CPU usage goes down to 2-3% which is negligible. Playing 9 different videos on Youtube peaks higher but again after ~15 seconds CPU usage goes down to ~17% for a quite unrealistic use case.


Similar story... I really don't know what he's on about. 



Vayra86 said:


> The bottleneck is really the intranet services, not so much the workstations themselves. Although if you do anything more serious than basic administrative jobs, you'll definitely get load from Office alongside all of its cloud-based stuff. I can choke my lappy with Word and Excel, no problem whatsoever. Add a few tables and databases to Excel and it gets real fun, too. There are Word docs we do edits on that are so vast Word has trouble processing the thing. It's not so much absolute performance all the time as it is simultaneous processes running.
> 
> I've managed to destroy one laptop fan so far from all the heavy duty, took less than a year, but maybe I was lucky.


Yeah... I run some pivot tables/databases, etc., but nothing too heavy. All I am saying is that, for the most part, office PCs are not high-utilization machines, in particular if the company keeps up on the product life cycle. Running a dual-core with HT in 2020, even for an office box, is a travesty.


----------



## ARF (Jul 29, 2020)

EarthDog said:


> Neat! Not one core even spiked to 100% when you did that and that (unrealistic) use scenario. So, what's your point?



People sometimes open dozens of tabs and keep them running. Imagine if some of them keep running Flash videos and other stuff in the background.
Your CPU goes to 100% with ease.
Not to mention that your GPU also participates and is heavily loaded.

Don't underestimate browsing. I was quite mild with only 9 open tabs. Do you know what happens with 29 open tabs?

You need a powerful CPU to cope with them.


----------



## EarthDog (Jul 29, 2020)

ARF said:


> People sometimes open dozens of tabs and keep them running. Imagine if some of them keep running flash videos and other stuff in the background.
> Your CPU goes to 100% with ease.
> Not to mention that your GPU also participates and is heavily loaded.
> 
> ...


lol, ARF... come on man... let's not let one offs and poor PC management dictate the majority of office use cases. Don't be a dolt (not you) and forget to close your shiza on an office PC. lol

I had over 30+ tabs open at once (said that earlier...), I know exactly what happens (described that earlier). I'm also not a brick and don't leave 29 flash/yt vids running at the same time... 

We'll agree to disagree. I'm moving on. 

I'll leave this here.





EarthDog said:


> That said, *it really depends on use*. CAD designers, content creators etc, likely run a higher load than most general office workers (accounting, HR, CSRs, etc). But typical office functionality isn't much on a PC. *There will always be exceptions*.


----------



## Chrispy_ (Jul 29, 2020)

EarthDog said:


> That said, it really depends on use. CAD designers, content creators etc, likely run a higher load than most general office workers (accounting, HR, CSRs, etc). But typical office functionality isn't much on a PC. There will always be exceptions.


That's my industry.

An alarming amount of that software is single-threaded, poorly-optimised legacy junk. Outside of Premiere and software rendering (which is rapidly being replaced by GPU rendering) you can work just fine on an old Core2 Duo, as long as you have enough RAM.

There are some legacy 10+ year-old Core2 machines hooked up to CNC mills and laser-cutting beds; we've replaced some of them with low-end Bay Trail or Apollo Lake NUCs, and with 16GB of RAM to hold the models in memory they are perfectly snappy, 100% capable CAD modelling platforms that can multitask Office suites and web browsers in the background.

CAD software is old and simple. Parametric real-time simulations are where the CPU demands in my CAD/AEC industry lie, and once again it's cores, cores, and more cores to feed those - which is why anyone without a 3900X at their desk just offloads that to the Threadripper/3950X farm (or the old Broadwell-E farm if they've angered the compute department).


----------



## Vayra86 (Jul 30, 2020)

ARF said:


> People sometimes open dozens of tabs and keep them running. Imagine if some of them keep running flash videos and other stuff in the background.
> Your CPU goes to 100% with ease.
> Not to mention that your GPU also participates and is heavily loaded.
> 
> ...



29 Tabs is a sign of madness, that is what that is. I know them, I've seen it. But even so, the PC will not choke on CPU load, it will choke on RAM.


----------



## cat1092 (Aug 2, 2020)

Intel has been 'delaying' 10nm for years, since back when the AMD FX-series CPUs & a wide variety of 970-chipset motherboards were still being sold on Newegg.

Then AMD comes in & goes to work, & now almost all of their mainstream chips are 7nm. Be it AMD or NVIDIA, they've both come in & claimed Moore's Law as their own. It would seem that, with all of Intel's massive resources, plus the cash saved by using cheap thermal paste in place of quality solder (which most consumers would pay for), they'd have 7nm chips to compete.

Intel became too laid back for their own good when they thought AMD was dead. Now they've lost Apple, haven't met their goals for years (they've had 14nm since Broadwell) & are nowhere close to putting a chip on the market which runs at 5.0 GHz out of the box. By now Intel should be there; they were the first to break the 4.0 GHz barrier with the i7-4790K, and other than adding more cores/threads have done little more since. That's why I've not seen the need to upgrade my Z97 system with that same i7-4790K, although I have done part of their work for them by delidding & replacing the paste with liquid metal. AMD has steadily soldered their mainstream chips for as far back as I can remember; this decreases heat & in most instances eliminates the need for an expensive cooler.

Don't count on Intel having 7nm chips across the board for another 5 years at least!


----------



## RandallFlagg (Aug 2, 2020)

cat1092 said:


> Then AMD comes in & goes to work & now most all of their mainstream chips are 7nm.



You mean TSMC, right?  AMD does not fab chips.



cat1092 said:


> Be it AMD or NVIDIA, they've both came in & claimed Moore's Law as their own. It would seem, with all of Intel's massive resources, plus the cash saved by swapping cheap thermal paste in place of quality solder (which most consumers would pay for), they'd have 7nm chips to compete.



Intel's 10nm is equivalent to TSMC's 2nd/3rd-gen 7nm and better than TSMC's 1st-gen 7nm. It is also better than Samsung's 7nm.




cat1092 said:


> Don't count on Intel having 7nm chips across the board for another 5 years at least!



As noted, Intel 10nm is already equivalent to TSMC 7nm.  

We will not have AMD 5nm chips until Zen 4, which means maybe late 2021.  Zen 3 will be 7nm+, which as noted is about the same as Intel 10nm.   

Don't be a sucker for the marketing gimmickry of process node names which are not comparable with each other.


----------



## Vayra86 (Aug 2, 2020)

RandallFlagg said:


> You mean TSMC, right?  AMD does not fab chips.
> 
> 
> 
> ...



What are you even talking about? All we have seen of Intel's 10nm is low-performance chips, so even if the node has efficiency and density perks, it still won't scale to a high-performance product, and 7nm clearly does. Why do you think Intel's entire high-performance segment (in fact, anything faster than a quad-core laptop CPU, basically) is still stuck on 14nm?

TSMC doesn't need gimmicks; Intel does, and they're going full steam on it. This 10nm fairy tale is one of the examples. It was true once upon a time, when nobody had anything on the market, but these things have changed faster than Intel could blink. Intel's proposed higher density on 10nm is exactly the thing that killed their roadmaps, and they could have seen it coming as they jumped from one heatspreader solution to the next to keep things in check. TSMC has a much better node in their hands.


----------



## RandallFlagg (Aug 2, 2020)

Vayra86 said:


> What are you even talking about, all we saw of Intel's 10nm is low performance chip material, so even if they might have efficiency and density perks, they still won't scale to a high performance setup and 7nm clearly does. Why do you think that Intel's entire high performance segment (in fact anything faster than a quadcore laptop CPU, basically) is still stuck on 14nm?
> ...



Read the post I was responding to and enlighten yourself.  

As far as performance, don't be too cocky.  

Here's a picture of a 4C/8T 2.8GHz Tiger Lake 10nm CPU, and circled is an 8C/8T 4.1GHz AMD Zen 2 scoring _*7% lower*_ in the multi-threaded Time Spy CPU benchmark.

i.e. a 4C/8T Tiger Lake beats an 8C/8T AMD.

Even the desktop 3300X 4C/8T 3.8-4.3GHz part is only 9% faster, despite a 27% higher base clock and 35% higher turbo, not to mention being a 65W part compared to Intel's 15W part.

The moment this type of increased IPC hits the laptop space in anything 6C or higher, AMD is done in that space. The laptop market is much bigger than desktops. I expect this to happen in late Q3 and Q4 this year.

It's entirely possible that Intel's high-end laptops will be faster than all but the highest-end desktop parts - AMD or Intel.


----------



## mtcn77 (Aug 2, 2020)

RandallFlagg said:


> Intel's 10nm is equivalent to TSMCs 2nd/3rd gen 7nm and better than TSMC 1st gen 7nm. It is also better than *Samsung* 7nm.


This inflection point is critical because it highlights a fundamental difference between Intel and Samsung.
Samsung builds processes on previous foundational IP. Intel aimed for both a new process and a new microarchitecture at the same time, without validating one on the other first.


----------



## RandallFlagg (Aug 2, 2020)

mtcn77 said:


> This inflection point is critical because it totally distinguishes the fundamental differences of Intel and Samsung.
> Samsung builds processes on previous foundational IPs. Intel aimed a new process both at foundry and microarchitecture leaps without validating one on the other, first.



Don't know about that, but that isn't how the naming conventions between Intel and the foundries got out of sync.

They got out of sync back when the foundries went from 20nm to 12/14/16nm. Intel's move from 22nm to 14nm was a genuine shrink (its FinFETs had already arrived at 22nm), and they called it one node: 22nm -> 14nm.

When the foundries (TSMC, Samsung, GloFo) took their 20nm nodes and added FinFETs, they started calling them 16nm/14nm/12nm even though they did not really shrink the node.

This is why Intel's 14nm process node is actually roughly equivalent to the foundries' 10/12/14nm nodes.

And their 10nm is roughly equivalent to the foundries' 7nm nodes.
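The rough equivalence can be checked against published peak-density figures (million transistors per mm²). The numbers below are the commonly cited WikiChip/vendor figures - treat them as approximate, since achieved density varies with the cell library and real products land lower:

```python
# Quoted peak transistor densities in MTr/mm^2 (approximate, from
# public WikiChip / vendor disclosures).
density = {
    "Intel 14nm":   37.5,
    "Intel 10nm":  100.8,
    "TSMC N7":      91.2,
    "Samsung 7LPP": 95.3,
}

baseline = density["TSMC N7"]
for node, mtr in density.items():
    # Ratio > 1.0 means denser than TSMC's first-gen 7nm.
    print(f"{node}: {mtr / baseline:.2f}x TSMC N7")
```

By these figures Intel's 10nm is on paper ~10% denser than first-gen TSMC N7, which is the basis for the "10nm ≈ 7nm" shorthand - density alone, of course, says nothing about yield or achievable clocks.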


----------



## mtcn77 (Aug 2, 2020)

That's all nice and dandy, but it doesn't play Crysis when push comes to shove.


----------



## Vayra86 (Aug 3, 2020)

RandallFlagg said:


> Read the post I was responding to and enlighten yourself.
> 
> As far as performance, don't be too cocky.
> 
> ...



Yeah, if you assume laptops are somehow not built with limited TDPs, Tiger Lake looks amazing. When you let that chip fully stretch its legs, it can do at 2.8GHz what Skylake could do at 4.0. But it still caps out at that eternal 4c/8t Skylake performance level. That is the point. Now look at the TDP of the 202% faster (!!) 10900K. Change the desktop? The proof is right there in the chart that it won't.

Now, let's get back to reality. I get your point, but this won't change a thing in anything that is not power-limited. It's easy to get lost in the woods of Intel benchmarking logic.



RandallFlagg said:


> And their 10nm is roughly equivalent to foundries 7nm nodes.



It's not wise to keep repeating that, because it underlines that you missed the point entirely about power budgets and what high performance entails. It's like it went straight past you - not that it's a lie or anything...

Even investors and Intel itself have lost faith in that node, and yet here you are.

One last point: IPC is an architecture trait, not a node trait. They could obtain equal IPC with the same core on a different node, but the power usage would change. In the same vein, we saw planned 10nm architectural updates rolled back to 14nm - most notably anything that had to have high clocks and ditto performance.


----------



## RandallFlagg (Aug 3, 2020)

Vayra86 said:


> Yeah, if you think laptops are somehow not built with limited TDPs, Tiger Lake looks amazing. When you let that chip fully stretch its legs... it can do with 2.8 Ghz what Skylake could do with 4.0. But it still caps at that eternal 4c8t Skylake perf level. That is the point. Now look at the TDP of the 202% faster (!!) 10900K. Change the desktop? The proof is right there in the chart that it won't.
> 
> Now, let's get back to reality. I get your point, but this won't change a thing in anything that is not power limited. Its easy to get lost in the woods of Intel benchmarking logic.



Ridiculous. That is a 4c/8t Tiger Lake beating an AMD 8c/8t part, not Skylake. I didn't mention Skylake, you did.

And I said it may challenge all "but the highest" performing AMD/Intel chips. You went and ran with that to the 10900K, their top chip. You got a reading comprehension problem or a social issue? Just wondering.

And yes, it is entirely possible that a 35W version of this with 6C/12T will challenge all but the highest-end AMD/Intel desktops. What do you suppose Tiger Lake looks like at 4GHz with 6C and 12T? A ~43% faster clock and 50% more cores would put it on par with a 3800X or a 10700K.

Maybe you should try thinking before posting?

And no, I didn't miss anything about the limitations on current Ice Lake low-power chips. You missed the basic math part (see above). You also missed the part where I noted laptops are a larger market than desktops. And on and on.
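For what it's worth, the back-of-the-envelope scaling argument above can be written out explicitly. It assumes perfectly linear clock and core scaling and ignores power limits, so it is an optimistic upper bound, not a prediction:

```python
# Hypothetical numbers from the argument above: a 4C/8T Tiger Lake
# at 2.8 GHz scaled up to 6C/12T at 4.0 GHz, assuming perfectly
# linear scaling (real multi-core scaling is sub-linear and any
# real part would be power-limited well before this).
base_clock_ghz = 2.8
target_clock_ghz = 4.0
base_cores = 4
target_cores = 6

clock_gain = target_clock_ghz / base_clock_ghz  # ~1.43x
core_gain = target_cores / base_cores           # 1.5x

upper_bound = clock_gain * core_gain
print(f"optimistic speedup: {upper_bound:.2f}x")  # ~2.14x
```

Even under these generous assumptions the result is a bit over 2x the 4C/8T baseline, which is the entire basis for the "on par with a 3800X or 10700K" claim.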


----------



## Vayra86 (Aug 3, 2020)

RandallFlagg said:


> Ridiculous.  That is a 4c8t Tiger Lake beating an AMD 8c8t part, not Skylake.  I didn't mention Skylake, you did that.
> 
> And I said it may challenge all "but the highest" performing AMD/Intel chips.  You went and ran with that to the 10900k, their top chip.  You got a reading comprehension problem or a social issue?  Just wondering.
> 
> ...



Okay buddy, keep believing whatever you think is true. You're just grabbing specs out of thin air, thinking you can simply multiply some numbers and boom, you've predicted what a node will do under higher power delivery. If you want to base your case on that, hey, who am I to stop you. At least you got your 'basic math' down, eh?

Try adding some knowledge of semiconductors to the mix; it might help a little bit. If semiconductor fabbing were basic math, surely we'd see lots of basic mathematicians over there, wouldn't you think?

About Skylake: I pulled that out because it is the performance level Intel has been capped at for a decade, and with Tiger Lake they have once again created and optimized a product around that very same performance level. Look at where the 6700 lands compared to Tiger Lake at full steam. I'm not here to pester you or anything; I'm trying to provide insight you have obviously missed when you say 10nm equals TSMC's new nodes.

Either learn something or don't, but you don't need to convince me. This is not an AMD vs Intel pissing contest for me either, contrary to what you might think. AMD's laptop CPU segment is severely lacking even today; I'll concede that right away.


----------



## RandallFlagg (Aug 3, 2020)

Vayra86 said:


> Okay buddy, keep believing whatever you think is true. You're just grasping specs out of thin air thinking you can simply multiply some numbers and boom you've predicted what a node will do under higher power delivery.



I never said high power delivery, you said that. And that's the _*third*_ time you've attributed your own statements to me.

I said their laptop segment (READ: LOW POWER) with Tiger Lake may be as powerful as all but the highest-end AMD/Intel desktop variants.

Again, that reading comprehension / social issue of yours.


----------



## Vayra86 (Aug 3, 2020)

RandallFlagg said:


> I never said high power delivery, you said that.   And that's the _*third *_time you've attributed your statements to me.
> 
> I said their laptop segment (READ: LOW POWER) with Tiger Lake may be as powerful as all but the the highest end AMD/Intel desktop variants.
> 
> Again, that reading comprehension / social issue of yours.



I've already said that I agree with you on Intel's laptop-segment dominance... can you point me to the post where I said otherwise? Reading comprehension goes both ways, perhaps. On top of that, your bench showed us a Tiger Lake chip running without power constraints, and those do matter for laptops - in fact, throttling is a major issue on almost all of them.

What you said, and what I responded to because that is where you missed the mark, is that Intel's 10nm is better than TSMC's 7nm. I specifically pointed to the density issue in play that makes the difference here. With higher density, higher power in Intel's node will quickly lead to disaster. So no, it won't rival faster desktop parts, because it won't compete with them. TSMC has a node that can scale; Intel has one that is relegated to bottom-end performance. Their 10nm is a dead end, basically.


----------



## RandallFlagg (Aug 3, 2020)

And where did you get the impression the thermal limits were removed?  The source for that data was Notebookcheck, and they stated the Tiger Lake part was clock-locked at 2.8 GHz.  That's not thermally unlocked.  That's crippled.

But I do believe I know where *you* got that from (not me).  That would be a different comparison at wccftech, between the 8C 4700U vs the i7-1165G7, where they were both set to 25-28W.  They also used a different bench, not Time Spy.

That comparison was even worse for AMD, as the 4C Tiger Lake beat the 8C AMD by 34% in 3DMark overall and 35% in graphics, and the little Tiger Lake matched the AMD chip in multi-threaded CPU performance almost exactly despite having half the cores.


Intel Tiger Lake Core i7-1165G7 4 Core CPU On Par With AMD Renoir Ryzen 7 4700U 8 Core CPU in 3DMark Time Spy, Up To 35% Lead In Graphics Test With Xe GPU

The latest Intel Tiger Lake benchmarks show the 4 core Core i7-1165G7 CPU on par with AMD's 8 core Ryzen 7 4700U in 3DMark Time Spy. (wccftech.com)


----------



## Vayra86 (Aug 3, 2020)

RandallFlagg said:


> And where did you get the impression the thermal limits were removed?  The source for that data was Notebookcheck, and they stated the Tiger Lake part was clock-locked at 2.8 GHz.  That's not thermally unlocked.  That's crippled.
> 
> But I do believe I know where *you* got that from (not me).  That would be a different comparison at wccftech, between the 8C 4700U vs the i7-1165G7, where they were both set to 25-28W.  They also used a different bench, not Time Spy.
> 
> ...



From the wccftech article:





Note the CPU test. Tiger Lake has a good IGP. Great. But that is still circling the laptop point, which I already agreed on - three times now... The CPU is, like you said, a match at best - but we already have 7 nm AMD desktop parts in hand while there are no 10 nm high-performance Intel parts. That underlines why TSMC has a better node at this point: they can serve the *entire* market.


----------



## RandallFlagg (Aug 3, 2020)

I was looking specifically at the 8C/8T part.  The 4800U has 8C/16T.  I usually count the T part of that as about half a real core, so 8C/8T vs 4C/8T is like comparing 8C vs 6C without hyperthreading.  Basically the numbers come out the same as stated in the article - Tiger Lake is ~30 to 35% faster per core vs Renoir.
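Back-of-envelope, that weighting can be written out; the half-core value for an SMT thread is a rule of thumb here, not a measured figure:

```python
# Hypothetical "effective core" count, weighting each extra SMT
# thread as half a physical core (an assumption, not measured data).
def effective_cores(cores, threads, smt_weight=0.5):
    return cores + (threads - cores) * smt_weight

renoir = effective_cores(8, 8)  # 8C/8T -> 8.0 effective cores
tiger = effective_cores(4, 8)   # 4C/8T -> 6.0 effective cores

# If both chips post roughly the same multi-threaded score,
# the implied per-core gap is 8/6, i.e. about 33%.
ratio = renoir / tiger
```

By that model a score tie implies Tiger Lake doing ~33% more work per "effective core", which is where the ~30-35% figure lands.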

This means even Zen 3 laptop chips from AMD would have a hard time competing with Willow Cove (Tiger Lake) cores. 

4C/8T Tiger Lake (right) vs 8C/8T Renoir (left).  It is besting the Zen 2 APU on every single metric.

Said this before: if Intel can get this out in a 6C/12T part at 35 W+, AMD won't have anything in the laptop space to compete.

Doubly so when you consider what the Xe graphics are doing here - I haven't mentioned nor cared about that, but for 90% of laptop owners the iGPU is very important. 

Renoir will wind up being one of those things that came out and was really great for 3 months, then got obsolete.


----------



## Vayra86 (Aug 4, 2020)

RandallFlagg said:


> I was looking specifically at the 8C/8T part.  The 4800U has 8C/16T.  I usually consider the T part of that to be like half a real CPU, so an 8C/8T vs a 4C/8T would be like comparing 8C vs 6C w/o hyperthreading.  Basically the numbers come out the same as stated in the article - Tiger Lake is ~ 30 to 35% faster per core vs Renoir.
> 
> This means even Zen 3 laptop chips from AMD would have a hard time competing with Willow Cove (Tiger Lake) cores.
> 
> ...



Note the clock speeds: Ryzen at 2000 MHz and Tiger Lake at 2800 MHz. It's a perfect CPU score match - a 0.01 FPS gap and 2 points. Tiger Lake needs higher clocks to get the same performance.

Higher IPC? That is not what this bench shows you - it shows the opposite, but you've mixed core count into that equation as well, haven't you? That is maybe where you're missing the mark in your comparisons - Zen's strength is exactly its chiplet design, which means AMD can cram in more cores with no issue whatsoever, and this translates to a 'wider' CPU that can stay in its efficiency curve more easily - steady rather than bursty in clock behaviour. In addition, Intel has a higher peak clock as well, which is just another echo of the bursty laptop CPUs we already know. I think you're overestimating what you see in a big, big way.

It's no coincidence we're looking at 4C/8T Intel chips here. IF Intel can get this out as a 6C/12T? That G7 was announced, released and then not seen in the wild for over half a year. And even now, they're rare unicorns. I don't believe in whatever Zen 3 might offer or what Intel might do with bigger chips that don't exist. I only care about actual releases, and this is what we have.

And the 'Xe' graphics are just another rebrand of Iris. Please... you said something about happily believing marketing about nodes... you're falling for the Intel branding madhouse, big time. There isn't much here that is going to radically change anything - if there even is anything. You think Raja came to Intel, farted three times, and poof, new IGP? They work from the top down, not the bottom up. Whatever Xe is supposed to be at the lower end, it cannot be more than a few tweaks on what Intel already had.

All you've got is 'Tiger Lake in a different situation with higher core count and much higher TDP limits will destroy them all'... yeah. Cool story. Intel dominates laptops because they have supplied that market for over a decade and have proven reliable. Tiger Lake merely continues that spree, nothing more. It's not a game changer - it just manages to get by against nodes that are slowly but surely going to trump it anyway. If Intel does not move to 7 nm and smaller within the next two years, they've got their 14 nm woes all over again. Keep in mind that recent history was not all about performance either, but also about security issues, heat/throttling issues (a new heatspreader design every gen does not instill faith in any long-term strategy) and lots of confusion and delays in supply chains. Intel is losing investors and rapidly losing share price - and it has already held on to the old glory for quite a while if you ask me, despite countless reports and bad omens.

For AMD, Renoir is just another stepping stone. For Intel, Tiger Lake is a desperate attempt to keep up.


----------



## londiste (Aug 4, 2020)

IPC isn't higher, but it isn't lower either. It seems to match up pretty well to what is expected from 4C/8T at 2800 MHz vs 8C/8T at 2000 MHz. This is actually a good sign.
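The arithmetic behind that expectation works out with a simple linear model - score proportional to cores times clock, with a ~35% SMT uplift taken as a rule-of-thumb assumption rather than a measured number:

```python
# Rough multi-thread throughput model: cores * clock * SMT factor.
# The 35% SMT uplift below is an assumed rule of thumb, not benchmark data.
def throughput(cores, ghz, smt_uplift=0.0):
    return cores * ghz * (1 + smt_uplift)

tiger = throughput(4, 2.8, smt_uplift=0.35)  # 4C/8T @ 2.8 GHz -> ~15.1
renoir = throughput(8, 2.0)                  # 8C/8T @ 2.0 GHz -> 16.0
gap = renoir / tiger                         # the two land within ~6%
```

A near-equal score at those clocks is therefore consistent with near-equal per-clock IPC.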

G7 was announced, released, and has been available pretty reliably from what I can see. There are about two real SKUs of Ice Lake that seem to be steadily available - the 1065G7 and 1035G1 - the rest might as well not exist.

Chiplets have nothing to do with the Ryzen 4000 series - it is a monolithic CPU. At least the current chiplet design is not well suited for mobile use, primarily due to high idle power.

Xe is as much a rebrand of Iris as RDNA2 is a rebrand of GCN. In some parts it is, and in others it isn't.


----------



## RandallFlagg (Aug 4, 2020)

You are both comparing base clock to base clock.  During the relatively short periods these benchmarks run, turbo counts, and in turbo it's 4.2 GHz (Ryzen) vs 4.7 GHz (Intel).  In the end these semantics don't matter; it is just the raw performance that counts.  Ignore that and you get lost in the weeds of minutiae.


----------

