Tuesday, December 4th 2018

Intel Looks Beyond CMOS to the Future of Logic Devices

Today, "Nature" published a research paper on the next generation of logic devices authored by researchers from Intel, the University of California, Berkeley, and the Lawrence Berkeley National Laboratory. The paper describes a magneto-electric spin-orbit (MESO) logic device, invented by Intel. MESO devices have the potential to lower voltage by 5 times and energy by 10-30 times when combined with ultralow sleep state power, as compared to today's complementary metal-oxide-semiconductors (CMOS). While Intel is pursuing CMOS scaling, the company has been working on computing logic options that will emerge in the next decade for the beyond-CMOS era, driving computing energy-efficiency and allowing performance to grow across diverse computing architectures.

"We are looking for revolutionary, not evolutionary, approaches for computing in the beyond-CMOS era. MESO is built around low-voltage interconnects and low-voltage magneto-electrics. It brings together quantum materials innovation with computing. We are excited about the progress we have made and are looking forward to future demonstrations of reducing the switching voltage even further toward its potential," said Ian Young, Intel Senior Fellow and director of the Exploratory Integrated Circuits group in the Technology and Manufacturing Group.
Intel researchers invented the MESO device with the memory, interconnect and logic requirements of future computing in mind. The MESO device was prototyped at Intel using quantum materials that show emergent quantum behaviors at room temperature, with magneto-electric materials developed by Ramamoorthy Ramesh at UC Berkeley and the Lawrence Berkeley National Laboratory. MESO also utilizes spin-orbit transduction effects described by Albert Fert at Unité Mixte de Physique CNRS/Thales.

"MESO is a device built with room temperature quantum materials," said Sasikanth Manipatruni, senior staff scientist and director of Intel Science and Technology Center on Functional Electronics Integration and Manufacturing. "It is an example of what is possible, and hopefully triggers innovation across industry, academia and the national labs. A number of critical materials and techniques are yet to be developed to allow the new type of computing devices and architectures."

The Nature publication can be accessed here.

15 Comments on Intel Looks Beyond CMOS to the Future of Logic Devices

#1
Space Lynx
Astronaut
I'll never understand hardware or software. I still don't see why we can't just have a fiber optic CPU, light goes on and off, 0 and 1 at speed of light, and a sensor reads it, and software says oh this was a 0 or 1, obviously I know it is not that easy... just I don't understand why CMOS brute force computing has always been needed, actually I am just going to shut up now cause I don't even understand how any of it works. gg my life, sticking to video games, lol
#2
sepheronx
lynx29: I'll never understand hardware or software. I still don't see why we can't just have a fiber optic CPU, light goes on and off, 0 and 1 at speed of light, and a sensor reads it, and software says oh this was a 0 or 1, obviously I know it is not that easy... just I don't understand why CMOS brute force computing has always been needed, actually I am just going to shut up now cause I don't even understand how any of it works. gg my life, sticking to video games, lol
price

Quantum (photonics) is very expensive and limited in its current use, mostly for radar or wireless devices like 5G. But it is an option for the future.
#3
Flanker
lynx29: I'll never understand hardware or software. I still don't see why we can't just have a fiber optic CPU, light goes on and off, 0 and 1 at speed of light, and a sensor reads it, and software says oh this was a 0 or 1, obviously I know it is not that easy... just I don't understand why CMOS brute force computing has always been needed, actually I am just going to shut up now cause I don't even understand how any of it works. gg my life, sticking to video games, lol
Light signals are a pain to control, and the components to do so are bulkier than their electronic counterparts, which is not a quality you want in integrated circuits.
#4
DeathtoGnomes
lynx29: I'll never understand hardware or software. I still don't see why we can't just have a fiber optic CPU, light goes on and off, 0 and 1 at speed of light, and a sensor reads it, and software says oh this was a 0 or 1, obviously I know it is not that easy... just I don't understand why CMOS brute force computing has always been needed, actually I am just going to shut up now cause I don't even understand how any of it works. gg my life, sticking to video games, lol
My poor attempt at science here: It's about the latency of switching between 0 and 1. Cheaper solutions could offer the same or better performance than what you suggest. :kookoo:
#5
First Strike
lynx29: I'll never understand hardware or software. I still don't see why we can't just have a fiber optic CPU, light goes on and off, 0 and 1 at speed of light, and a sensor reads it, and software says oh this was a 0 or 1, obviously I know it is not that easy... just I don't understand why CMOS brute force computing has always been needed, actually I am just going to shut up now cause I don't even understand how any of it works. gg my life, sticking to video games, lol
Current CMOS technology: characteristic width 7-14 nm, further shrink-down possible.
Silicon photonics: waveguide width ~500 nm, clearance around the waveguide ~1 µm, typical modulator length a few mm, impossible to shrink due to Maxwell's equations.
Optical fiber: diameter a few µm.

It is because of size.
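A rough back-of-the-envelope sketch in Python of how badly those numbers compare; the 50 nm device pitch is an assumed round number for a 7-14 nm node, and the photonic figures are the ones listed above:

# Rough area comparison (orders of magnitude only, not exact layout figures).
cmos_pitch_nm = 50                    # assumed device pitch at a 7-14 nm node
waveguide_pitch_nm = 500 + 2 * 1000   # ~500 nm waveguide + ~1 um clearance per side

area_ratio = (waveguide_pitch_nm / cmos_pitch_nm) ** 2
print(f"One waveguide pitch covers roughly {area_ratio:.0f}x the area of a transistor")
# -> ~2500x, before even counting the millimetre-long modulators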
#6
R-T-B
First Strike: a few mm, impossible to shrink due to Maxwell
Damnit nvidia!
/joke
#7
Space Lynx
Astronaut
First Strike: Current CMOS technology: characteristic width 7-14 nm, further shrink-down possible.
Silicon photonics: waveguide width ~500 nm, clearance around the waveguide ~1 µm, typical modulator length a few mm, impossible to shrink due to Maxwell's equations.
Optical fiber: diameter a few µm.

It is because of size.
if size is the true issue, then perhaps cloud streaming of games is the future after all. Imagine Nvidia doing a subscription-based model built on an optical-fiber PC that is a million times faster than any other computer? We will just buy monitors and fiber optic internet in the future, lol
#8
Wavetrex
Everything related to transmission of information is strongly connected to how the very small, atomic world behaves (or, the "quantum" world).
In that world, things are not "exact", as in, it's not certain that something is ON and something else is OFF, or 1 and 0.

Instead, it works based on probabilities. What is the probability that something is "on"? If that something is 99% sure to be ON, then it can be considered ON, or 1.
But what if the probability is 50%? Should you consider it 1 or 0? You don't know.

Let's say that an electron or photon has a probability of 10% to go from place A to place B.
With one single electron, at such a low probability you'll never be sure it actually reached place B. So you use more. With two, the probability that at least one of them reaches B increases, based on the formula:

P(at least one) = P1 + P2 - P1*P2

In my 10%-per-electron example, this means 10% + 10% - 1% (19%), almost double but not fully double.
For 4 electrons, it's 19% + 19% - 3.6%, so around 34.4% ...

Eventually, as you add more and more, you get to somewhere "high enough" (even if not 100%) that you can safely treat the chance of enough of them making it from A to B as a "ONE" (1).
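A quick Python sketch of the same arithmetic (the 10% per-electron figure is just the illustrative number from above):

# Probability that at least one of n electrons arrives,
# with independent per-electron arrival probability p: 1 - (1 - p)**n
def p_at_least_one(p, n):
    return 1 - (1 - p) ** n

p = 0.10  # illustrative 10% per-electron probability
for n in (1, 2, 4, 16, 64):
    print(n, round(p_at_least_one(p, n), 4))
# 2  -> 0.19   (the 10% + 10% - 1% above)
# 4  -> 0.3439 (~34.4%)
# 64 -> 0.9988 (only with many electrons does the "1" become reliable)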

~~~
So how does this apply to electronics?
Well, if you have enough electrons to do the job and make sure that something can be considered a "1", you have working binary logic (or a transistor).

But if the transistor is REALLY SMALL, not enough electrons go through it and the chance of the result being "1" is not close to 100%; it might actually be quite a bit worse, at 90% or less.
That means the computer could add 1 and 1 and get 0, 1 or 2... with a good chance of being 2 but NOT A CERTAINTY.

Obviously, a computer that cannot calculate properly is not a good computer.

~~~
And here lies the problem with making transistors smaller and smaller... it starts to give "quantum errors", as those probabilities are not in the 99.999%+ range but much lower, simply because there aren't enough electrons to do the job properly!

And the sad part is... there is no solution. We've almost reached the limits of physics. The smaller we make stuff, the more errors it will give, and more error checks need to be done to ensure that the calculation is correct, which in turn makes things slower and waste more energy.
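A hypothetical illustration of why the per-switch probability has to be so extreme (the 99.999% figure echoes the one above; the device counts are just round numbers):

# If each of N switching events succeeds independently with probability p,
# the whole calculation succeeds with probability p**N.
p_per_switch = 0.99999
for n_switches in (1_000, 1_000_000, 1_000_000_000):
    print(n_switches, p_per_switch ** n_switches)
# 1e3 -> ~0.99  (fine)
# 1e6 -> ~5e-5  (already hopeless without error checking)
# 1e9 -> ~0.0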

It's sad, but here it is: we're near the end of the miniaturization age.
From now on it's all about making things more efficient (waste less energy) using new materials, but they won't get any smaller or faster.
#9
Space Lynx
Astronaut
Wavetrex: Everything related to transmission of information is strongly connected to how the very small, atomic world behaves (or, the "quantum" world).
In that world, things are not "exact", as in, it's not certain that something is ON and something else is OFF, or 1 and 0.

.
I just don't understand why? We have sensors that can detect light and darkness. So I still do not understand why that technology can not be scaled down to the chip level. Or when I pull up to a traffic light, the laser beam of light detects there is a car there and then tells the software what to do.

Why can't we develop new software to work with a light and dark 1 and 0 based CPU?
#10
fwix
lynx29: I just don't understand why? We have sensors that can detect light and darkness. So I still do not understand why that technology can not be scaled down to the chip level. Or when I pull up to a traffic light, the laser beam of light detects there is a car there and then tells the software what to do.

Why can't we develop new software to work with a light and dark 1 and 0 based CPU?
the problem is the " wavelengths of visible light " is so big vs electrons "400 to 700 nm " and also so complicated to implant with u consider putting more than billions of transistors in one chip plus the difficulty of connecting all things in the motherboards using fiber ,so yeah thats not gonna happen for computing ,
#11
Space Lynx
Astronaut
fwix: The problem is that the wavelengths of visible light (400 to 700 nm) are huge compared to electrons, and it gets very complicated once you consider putting billions of transistors in one chip, plus the difficulty of connecting everything on the motherboard using fiber. So yeah, that's not going to happen for computing.
huh, I didn't realize light was that big, i know we are still talking nanometers, but yeah that makes sense. I get what you are saying now. huh, interesting. the future is this then?

www.techspot.com/news/77327-chiplets-answer-extending-moore-law.html @W1zzard what are your thoughts on Chiplet CPU design and the future of CPU? i'm sure the community would love to read an article from you exploring the future of CPU design, the options currently being researched, your own take and opinion, etc. UNITE US!!!! BE PROACTIVE WIZZY I KNOW YE WANT TO LAD! hhuhuhuhu ten bucks says he wants to tackle me right now, but he knows its a good idea!



this entire video clip is 100% appropriate! unite my brothers! let us build space ships! let us begin today!
#12
TheGuruStud
"We're so screwed from 10nm it's time to talk about fantasy land stuff" LOL
#13
Space Lynx
Astronaut
TheGuruStud"We're so screwed from 10nm it's time to talk about fantasy land stuff" LOL
we must unite the clans!!!! ::howls into the night sky::
#14
First Strike
lynx29: huh, I didn't realize light was that big, i know we are still talking nanometers, but yeah that makes sense. I get what you are saying now. huh, interesting. the future is this then?

www.techspot.com/news/77327-chiplets-answer-extending-moore-law.html @W1zzard what are your thoughts on Chiplet CPU design and the future of CPU? i'm sure the community would love to read an article from you exploring the future of CPU design, the options currently being researched, your own take and opinion, etc.
As for cluster computing, the trend is quite clear: use electronics to do the computing and silicon photonics to do the communication. Reality: silicon photonics is used as a discrete I/O device communicating between nodes. Almost reality: on-chip photonics for chip-to-chip communication. Future: photonic interconnect in a many-core system. Even further out: front-end integrated photonics as a universal interconnect solution.
In this picture, CMOS won't die, until quantum computers flip the table.
#15
stimpy88
Intel, you have been "dreaming" (counting coin / not investing in R&D) for 10 years now. It's not getting you anywhere, is it?