
Intel Research Fuels Moore's Law and Paves the Way to a Trillion Transistors by 2030

What exactly is your argument? :D

Look at Intel's manufacturing node cadence and you will see that Intel is screwed.

Intel 90 nm - 2004
Intel 65 nm - 2006
Intel 45 nm - 2008
Intel 32 nm - 2010
Intel 22 nm - 2012
Intel 14 nm - 2014
Intel 10 nm rebadged to Intel 7 - 2019
Intel 7 nm rebadged to Intel 4 - don't know when...
Intel was stuck on 14 nm for an extended period of time, and will likely be stuck on 10 nm for the next few years as well. They are just being cryptic and no longer spelling out the "actual" process node. True, the rest of the foundries do the same anyway, but suddenly changing their node naming only reaffirms the intention to muddy the waters. One can write a research paper and lay out very ambitious plans, but realistically it will be challenging to put them into practice. I think we have reached a point where it is genuinely difficult to shrink transistors further, and we can see this even with TSMC, which has been stuck at 5 nm as its mass-produced cutting-edge node for some time, even if it has refined it and called it 4 nm. Apple has been on 5 nm since the A14, and even the A16 still uses 5 nm despite being advertised as 4 nm. As a result, there is no significant improvement in performance other than higher clock speeds (which bring higher power and heat) and maybe one additional GPU core to bump up the benchmark numbers.
 
For now, the reality is that TSMC sits alone on the throne. I feel the chances of keeping up with TSMC's progress are better on Samsung's side than on Intel's.
Definitely, Samsung is positioned a lot better, not just fab-wise but also in products.
 
Intel should spin the fabs off into a separate company and get decent management for it. The pile of money they burned only to show almost nothing and get overtaken by two smaller companies is insane.
 
Did Intel ever contribute any R&D to node shrinks in silicon manufacturing?
They did R&D in CPU architecture, yes.

For silicon manufacturing R&D, I think of Bell Labs, IBM, and many other pioneering electronics companies of the past, and of companies like ASML nowadays.
Intel uses the processes invented by others.
You're wrong about this. Intel was the leader for decades until TSMC caught up with 10 nm in 2017 and surpassed them with 7 nm in 2018. During that long, uncontested period of Intel's dominance, there were some key advancements:
  • 90 nm process: the first to use strained silicon
  • 45 nm process: the first commercial manufacturing process to use high-k gate dielectrics and metal gate electrodes
  • 22 nm process: the first to use FinFETs
Remember that things can change relatively rapidly in this industry. Many might remember the stagnation of TSMC's 28 nm node. At that time, it looked like Intel would stay one to two nodes ahead of the industry for a long time. Nobody could have foreseen the current situation where Intel is lagging both TSMC and Samsung. Although, with the recent stumbles of Samsung, they may not hold on to the number 2 spot for much longer.

Definitely, Samsung is positioned a lot better, not just fab-wise but also in products.
Samsung's leading nodes have had poor yields and they have been falling behind TSMC for a while now. Intel has a good chance of surpassing them in the near future.
 
If they really do have chips with a trillion transistors by 2030 then I expect they will be expensive.
 
Huang stated his opinion about a named observation. To call it "truth" is a bit silly. Everyone knows Moore's Law is not a "law" in the strict sense. Huang has no ground to stand on though because his company doesn't even manufacture the products they design. It's easy to sit back and criticize when you don't actually do any of the work and that's exactly what he is doing.
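For scale, the "trillion transistors by 2030" target is roughly what Moore's original two-year doubling cadence would predict anyway. A back-of-the-envelope sketch (the ~100-billion-transistor 2022 baseline is my own round-number assumption, e.g. a large flagship package, not a figure from the article):

```python
# Rough Moore's Law arithmetic: how many doublings are needed to reach
# 1 trillion transistors from an assumed ~100-billion-transistor baseline.
import math

baseline_transistors = 100e9   # assumed 2022 flagship package (hypothetical round number)
target_transistors = 1e12      # Intel's stated 2030 goal
doubling_period_years = 2      # classic Moore's Law cadence

doublings = math.log2(target_transistors / baseline_transistors)
years = doublings * doubling_period_years
print(f"{doublings:.2f} doublings -> ~{years:.1f} years, i.e. around {2022 + years:.0f}")
```

So on paper a two-year cadence lands the target right around 2030; the real argument in this thread is whether any fab can still keep that cadence.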

You would not live fine without Intel. Maybe you haven't used one of their CPUs in over a decade, but it's 100% guaranteed that you use multiple technologies developed by them. Intel is much, much more than just a consumer-focused design firm that can't manufacture its own products, like AMD or NVIDIA. They have created tech so ubiquitous that you don't even know you use it. I don't see why there is so much distaste for one of the few chip manufacturing companies left in the free world. At least Intel is trying and showing some degree of success. AMD, NVIDIA, and Apple would cease to function if TSMC disappeared. People give these fabless chip companies WAY too much credit.
You're not wrong on any one point. However, to be fair, Jensen founded Nvidia, and there was a time when they made their own silicon and owned their fabs. He does have an understanding of those things. But I agree with you, his opinion about Moore's "Law" isn't spot-on. You nailed the Intel point 100%. Anyone who thinks differently is either ignorant of reality or just plain silly.

Intel should spin the fabs off into a separate company and get decent management for it.
No, they shouldn't. Their structure is fine just the way it is.
 
You're not wrong on any one point. However, to be fair, Jensen founded Nvidia, and there was a time when they made their own silicon and owned their fabs. He does have an understanding of those things.
You might be thinking of AMD. As far as I know, Nvidia has always been fabless.
 
Intel should spin the fabs off into a separate company and get decent management for it. The pile of money they burned only to show almost nothing and get overtaken by two smaller companies is insane.

Dude! They invented big.LITTLE and reintroduced the performant quad core to MSDT; what more do you want?
 
The problem is that IBM wanted competition, and Intel did nothing but sabotage IBM and enforce a monopoly.
Intel 4004 (1971): the first microprocessor
Intel 8086 (1978): the first x86 microprocessor

The IBM PC used Intel processors (the first-generation Personal Computer used the Intel 8088), with Apple as its rival. The predecessor of the first IBM PC used the Intel 8085.
IBM PC clones are not tied to Intel processors but to the x86 architecture and hardware compatibility; in a clone you could just as well find AMD or Cyrix processors. Even Apple used Intel processors, and you couldn't run Windows on them.
Beyond history, x86, MMX, SSE, AVX, FMA3, and many others come from Intel. You can't deny their contribution, and I don't see anything to laugh at in the idea that they will continue to introduce innovations in the future.
 
Intel 4004 (1971): the first microprocessor
Intel 8086 (1978): the first x86 microprocessor

The IBM PC used Intel processors (the first-generation Personal Computer used the Intel 8088), with Apple as its rival. The predecessor of the first IBM PC used the Intel 8085.
IBM PC clones are not tied to Intel processors but to the x86 architecture and hardware compatibility; in a clone you could just as well find AMD or Cyrix processors. Even Apple used Intel processors, and you couldn't run Windows on them.
Beyond history, x86, MMX, SSE, AVX, FMA3, and many others come from Intel. You can't deny their contribution, and I don't see anything to laugh at in the idea that they will continue to introduce innovations in the future.
While Intel is innovative, their instruction set extensions, especially the vector ones, have been lackluster; AltiVec did vectors better in 1999 than AVX does now. Intel's innovations are broad-ranging, but the most relevant are:
  • process node improvements, e.g. FinFET
  • CPU microarchitecture techniques, e.g. load/store disambiguation and the first out-of-order CISC microprocessor
  • pioneering commercial DRAM
  • non-volatile memory such as Optane and NOR-based flash
  • networking: as early as 1980, Intel, DEC, and Xerox published the first Ethernet specification
While I prefer AMD's products, there is no denying that Intel is one of the crown jewels of the Western technology industry.
 