Wednesday, November 16th 2011

Intel Celebrates 40 Years of the Microprocessor

On this day 40 years ago, Intel Corporation introduced the world's first commercially available microprocessor - the Intel 4004 - triggering the start of the digital revolution. While most people have never seen a microprocessor, devices that contain them have become so integrated into daily life that they have become virtually indispensable.

Microprocessors are the "brains" inside computers, servers, phones, cars, cameras, refrigerators, radios, TVs and many other everyday devices. The proliferation of microprocessors is due in large part to Intel's relentless pursuit of Moore's Law, a forecast for the pace of silicon technology development which states that the transistor density of semiconductors will double roughly every 2 years, while functionality and performance increase and costs decrease. It has served as the basic business model for the semiconductor industry for more than 40 years.
For example, compared to the Intel 4004, today's second-generation Intel Core processors deliver more than 350,000 times the performance, and each transistor uses about 5,000 times less energy. In the same time period, the price of a transistor has dropped by a factor of about 50,000.
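As a rough sanity check on that forecast, one doubling every two years over the 40 years since the 4004 works out to 2^20, or roughly a million-fold growth. Below is a minimal sketch in Python; the 2,300-transistor count is the 4004's historical figure, while the projected_transistors helper and the projection itself are purely illustrative (an idealized Moore's Law curve, not measured data).

# Idealized Moore's Law projection: transistor count doubling every ~2 years.
# Only the 4004's 2,300-transistor figure is historical; the rest is illustrative.

def projected_transistors(start_count, years, doubling_period_years=2.0):
    """Transistor count after `years` of doubling every `doubling_period_years` years."""
    return start_count * 2 ** (years / doubling_period_years)

transistors_4004 = 2300   # Intel 4004, introduced 1971
elapsed_years = 40        # 1971 -> 2011

projected = projected_transistors(transistors_4004, elapsed_years)
print(f"Idealized projection after {elapsed_years} years: {projected:,.0f} transistors")
print(f"Growth factor: {projected / transistors_4004:,.0f}x (2^{elapsed_years // 2})")
# Prints about 2.4 billion transistors, a ~1,048,576x increase, the same order of
# magnitude as the 2.3 billion transistors cited in the comments below.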

Future microprocessors developed on Intel's next-generation 22nm manufacturing process are due in systems starting next year and will deliver even more energy-efficient performance as a result of the company's breakthrough 3-D Tri-Gate transistors that make use of a new transistor structure. These novel transistors usher in the next era of Moore's Law and make possible a new generation of innovations across a broad spectrum of devices.

Looking back at how much has changed since the microprocessor's introduction, it is astounding to think about the future and how the digital revolution will continue at a rapid pace as microprocessor technology continues to evolve.

"The sheer number of advances in the next 40 years will equal or surpass all of the innovative activity that has taken place over the last 10,000 years of human history," said Justin Rattner, Intel chief technology officer.

Such advances in chip technology are paving the way for an age when computing systems will be aware of what is happening around them and anticipate people's needs. This capability is poised to fundamentally change how people interact with and relate to information devices and the services they provide. Future context-aware devices, ranging from PCs and smartphones to automobiles and televisions, will be able to advise people and guide them through their day in a manner more like a personal assistant than a traditional computer.

Reflections and Predictions: The Impact of the Microprocessor
To celebrate the past 40 years of microprocessor innovation and look ahead at the next 40 years, Intel compiled photos, video interviews, opinion pieces and a number of infographics and other materials with insight from Intel and industry executives, analysts, futurists and engineers.

34 Comments on Intel Celebrates 40 Years of the Microprocessor

#1
v12dock
Block Caption of Rainey Street
Happy birthday to the modern processor
#2
laszlo
just wonder what happened if no Moore's law...
#3
MilkyWay
Quite amazing how Intel have managed to stay ahead of the competition and even outlast most of the competition.
#4
qubit
Overclocked quantum bit
I love nostalgia like this. :)
#5
v12dock
Block Caption of Rainey Street
MilkyWay: Quite amazing how Intel have managed to stay ahead of the competition and even outlast most of the competition.
Hmm, sounds like a RISC vs. CISC argument. Although there is no doubt about Intel's success, I feel like people fail to give credit to other companies (AMD) for their contributions to modern computing.

AND
Intel still thinks you can build a video card off x86
#6
NC37
laszlo: just wonder what happened if no Moore's law...
*Imperial March plays*

Oh I think we all know what would have happened...;)
#7
btarunr
Editor & Senior Moderator
16 pins to 2,011 pins; 2,300 transistors to 2.3 billion transistors. That's a heck of a journey.
#8
RejZoR
MilkyWay: Quite amazing how Intel have managed to stay ahead of the competition and even outlast most of the competition.
If you artificially block competition with "under the desk" deals, it's not that hard. Microsoft was doing it in its day and Intel as well. Anyone remember the exclusive Intel-only laptops from pretty much every vendor? In the last few years things have softened up a bit and AMD is more present in this segment (especially thanks to the brilliant AMD Fusion).
#9
Unregistered
v12dock: AND Intel still thinks you can build a video card off x86
You well can, by using mathematical cores instead of shaders. They're interchangeable depending on instructions. But writing drivers for it is another story, especially when the standards (DirectX) are already set up around CUDA/VLIW-based GPUs. Which is why they spun Larrabee off as a parallel computing GPU. All it needs is software to work, and a display output... I was hoping they'd do it back in '09, but yeah. It sadly doesn't have an API to work with.

#10
RejZoR
Yeah, but an x86 gfx card would be constantly working in emulated mode (basically). A matter of drivers might work now, but when DirectX 12 or 13 arrives, it will simply be too slow for new instructions regardless of drivers.
#11
Unregistered
RejZoR: Yeah, but an x86 gfx card would be constantly working in emulated mode (basically). A matter of drivers might work now, but when DirectX 12 or 13 arrives, it will simply be too slow for new instructions regardless of drivers.
You misunderstood. What I'm saying is, it needs its "own" instruction sets to work. Only then will it have a way out. They didn't bother to emulate it because it was too much work and they knew it'd blow. So the hardware is there, it just doesn't have the software to function (due to proprietary standards). Same reason why XGI/Monster or any other old GPU maker isn't producing DirectX GPUs. It's just between AMD and nVidia, which is junk considering the number of different GPU producers there were (for OpenGL/software) back in the early 2000s.
#12
Drone
Federico Faggin is a really cool man. His way of thinking is simply great. I totally agree with him that only quantum computers can fundamentally change everything, because I'm really tired of this architecture that never seems to change. I think I'm gonna watch more of his interviews. I like this quote of his:
"If I look at the next 40 years, what I can see in the mainstream is more of the same: faster processors, more cores, blah blah blah, the same things we've been doing."
Anyway, 'appy birthday dear 4004.
#13
qubit
Overclocked quantum bit
btarunr: To celebrate the past 40 years of microprocessor innovation and look ahead at the next 40 years, Intel compiled photos, video interviews, opinion pieces...
Oh hang on, I've got an opinion so where the hell am I?! :eek: qubit reaches for his lawyer.
#14
pr0n Inspector
I think I still have a 120MHz Pentium somewhere in the dungeon...
#15
RejZoR
Drone: Federico Faggin is a really cool man. His way of thinking is simply great. I totally agree with him that only quantum computers can fundamentally change everything... Anyway, 'appy birthday dear 4004.
Well, you also have to understand that if we do change the whole architecture someday, you'd have to rewrite what, 90% of the world's software? Currently you only have to make minor fixes and changes, but who will rewrite entire games? Entire massive programs?
#16
Mindweaver
Moderato®™
Hey guys, I figured I would share a little nostalgia this morning! This is just some of the processors in my little museum... lol Happy Birthday Intel!

#18
mtosev
badtaylorx: happy birthday intel
It isn't Intel's birthday but the bday of the 4004 CPU; Intel was founded in Mountain View, California, U.S. on July 18, 1968.
#19
caleb
40 years improving one technology instead of jumping onto something new. Humans are so slow.
#20
KyleReese
btarunr: computing systems will be aware of what is happening around them, and anticipate people's needs
Oh my God!!! Skynet will be upon us sooner than I thought....:p
#21
KyleReese
btarunr: The sheer number of advances in the next 40 years will equal or surpass all of the innovative activity that has taken place over the last 10,000 years of human history
Wow, that's a really bold statement I think... it makes you feel both curious and frightened about the future...
#22
Joe Public
A lesser-known fact is that if the IBM engineers had gotten their way, the first IBM PC would have been powered by Motorola's 68000 CPU (the one Apple would later use in the first Macintosh, also used in Amigas and 16-bit Ataris).

But IBM already had a technology exchange deal with Intel, so they had to use the Intel CPU instead.
#23
Sasqui
pr0n Inspector: I think I still have a 120MHz Pentium somewhere in the dungeon...
Got me beat by 95 MHz, I've got an i386/25, and I'm ashamed to say it's never been overclocked :laugh: AFAIK there was no way to do it back then; the clock was determined by a "Koyo" hardware clock chip that says 50.000 MHz... I just looked at the old MB!
#24
PVTCaboose1337
Graphical Hacker
Mindweaver: Hey guys, I figured I would share a little nostalgia this morning! This is just some of the processors in my little museum... lol Happy Birthday Intel!
img.techpowerup.org/111116/Intel.jpg
I've got a few, though there might be some AMD thrown in there; it's an old picture.

#25
Disparia
John Doe: You misunderstood. What I'm saying is, it needs its "own" instruction sets to work. Only then will it have a way out. They didn't bother to emulate it because it was too much work and they knew it'd blow. So the hardware is there, it just doesn't have the software to function (due to proprietary standards)...
Yup. With an engine designed for ray-tracing and games built around it, Intel could have a MIC ready with enough power to run those games by the time they were completed.

IDF 2011: 8-way 1.2 GHz 32-core Knights Ferry running a visually improved ray-traced version of Wolfenstein at 1080p.

[YT]XVZDH15TRro#![/YT]

And the new version is just around the corner, running between 1.2 GHz and 1.6 GHz with up to 64 cores enabled.

Of course, all of that is the easy part. I can only imagine the fight against those heavily invested in OpenGL/DX.