Happy 40th birthday, Intel 4004!
The first of the bricks that built the IT world
On November 15, 1971, 40 years ago this Tuesday, an advertisement appeared in Electronic News for a new kind of chip – one that could perform different operations by obeying instructions given to it.
That first microprocessor was the Intel 4004, a 4-bit chip developed in 1970 by Intel engineers Federico Faggin, Ted Hoff, and Stanley Mazor in cooperation with the Japanese company Busicom (née the Nippon Calculating Machine Corporation) for that company's adding machines.
Busicom held the rights to the 4004 in 1970, but released them to Intel in 1971. Intel then offered the world's first processor for sale, and 40 years later that world is a very, very different place.
At the time, only the most far-thinking futurists could have imagined the 4004's impact. For starters, the chip itself wasn't all that impressive. It ran at 740KHz, had around 2,300 transistors that communicated with their surroundings through a grand total of 16 pins, and was built using a 10-micron process.
Exactly how far have we come in process technology since the 4004? Well, as your Reg reporter once calculated, if the width of an Intel 2nd Generation Core CPU's 32-nanometer process were expanded so that it could be spanned by an unsharpened No. 2 Ticonderoga pencil, the 4004's 10-micron (10,000nm) process, equally expanded, would be wide enough to fit an 18-wheeler followed by a half-dozen 1962 Cadillac Eldorados and a Smart Car.
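The back-of-the-envelope arithmetic behind that comparison can be checked with a quick sketch. All the real-world lengths below are rough assumptions of ours (pencil and vehicle sizes), not figures from the article:

```python
# Sanity-check the pencil analogy: blow 32nm up to one pencil length,
# then see how big 10,000nm (the 4004's process) becomes at that scale.
# All physical lengths are rough assumptions.

PENCIL_LENGTH_M = 0.19        # unsharpened No. 2 pencil, ~19cm
MODERN_PROCESS_M = 32e-9      # 32nm (2nd Generation Core)
OLD_PROCESS_M = 10_000e-9     # 10 microns (Intel 4004)

# Magnification that makes 32nm span one pencil
scale = PENCIL_LENGTH_M / MODERN_PROCESS_M

# The 4004's feature size at the same magnification
scaled_4004_m = OLD_PROCESS_M * scale   # roughly 59 metres

# Assumed vehicle lengths in metres: a US semi-trailer rig (~22m),
# six 1962 Cadillac Eldorados (~5.6m each), and a Smart Fortwo (~2.7m)
convoy_m = 22.0 + 6 * 5.6 + 2.7

print(f"4004 process, scaled: {scaled_4004_m:.1f} m")
print(f"Convoy length:        {convoy_m:.1f} m")
print("Convoy fits:", convoy_m <= scaled_4004_m)
```

With those assumed lengths the convoy comes in at about 58m against roughly 59m of scaled-up 4004 process, so the analogy holds – just.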
To say that microprocessors have changed radically over the past 40 years is to utter an empty truism. What's far more interesting is to take a look at the way in which those changes have evolved: the problems encountered, the decisions made, the discoveries ... well ... discovered.
And so to review Intel's 40-year journey from the 4004 to today, The Reg contacted two Intel Senior Fellows who have been responsible for a good chunk of how their company's offerings have grown from the 2,300-transistor 4004 to the over-two-billion-transistor 2nd Generation Intel Core i7-3960X released Monday morning.
We spoke with Steve Pawlowski, who has been intimately involved with a good portion of Intel's microarchitectural development since the early days, and Mark Bohr, who heads up Intel's process architecture and integration efforts.
We learned a lot, such as the fact that for the first 30 years or so, there really weren't all that many challenges in process development. "Most people would say that the period from 1971 until the early 1990s – actually, even to the end of the 1990s – that 30-year period was really the golden era of traditional, classic transistor scaling," Bohr told us.
In those days, the materials used in processors hardly changed – it was based on silicon dioxide for the gate insulator, or dielectric, and doped polysilicon for the gate electrode. "We were simply scaling," Bohr said, and with that scaling came reductions in power needs, and continual improvements in transistor densities and performance.
What's that in brontosauri?
Succeeded despite technology, not because of it
The story of the success of Intel microprocessors is that commercial, not technical, factors dominate.
The 8086 was very much inferior to the 68K and the 16032; it was probably on a par with the Z8000. I remember Intel trying to sell to me at that time, and they always emphasised price, the agreement with AMD that gave a guarantee of supply and assurance on pricing, and support. They never tried to sell on performance or technical aspects because they were well behind Motorola.
The PC then came out and things changed very rapidly. Intel broke the AMD arrangement, and the price of the first non-agreement part, the 80287, skyrocketed. Technically, Intel parts were still very much second best, but they sold fantastic numbers of parts. The 80286 retained the awkward segmented architecture, extended with protected mode, and performance was still very poor. The 386 finally had a sensible memory architecture, but still had the nasty special-purpose registers and complicated instruction set, and performance was still very poor compared to other micros. It was probably not until the Pentium that Intel gained parity with other microprocessors.
None of these technical things mattered: one design decision by IBM made Intel the dominant microprocessor company with massive resources, despite – not because of – their technical design.
Ahh, those were the days...
...when bytes were real bytes, motherboards could be fixed with a soldering iron, "intellectual property" meant you'd paid off your Encyclopædia Britannica, and 'programming' meant hand coding raw machine code. Maybe assembler if hung over.
And yes, counting every damn clock cycle.
God, I feel old.... <sniff.>
"After the 8086/8 came the 80286..."
No, it didn't. After the 8086/8 came the 80186/8, which was then followed by the 80286.
I remember coding in 80186 assembly on my dad's Tandy 2000...
A few corrections more...
1. IT was not built on the Intel 4004 or its successors. The information technology industry started in the 1950s with pioneering data processing applications leveraging emerging computing technology. Remember LEO, and the IBM 1401? They were certainly information technology systems. You'd have to use a pretty narrow and tortured definition of IT to claim the 4004 was its first building brick.
2. You use the phrase 'first processor' to describe the 4004. Here comes more pedantry... This is not true either. It was the first commodity, commercially available microprocessor – which is to say, an IC with all the traditional components of a CPU. Computer processors in the modern sense date back to at least 1949 and EDSAC. The Digital PDP-11, a direct contemporary of the 4004, certainly had a processor, as did all its ancestors. What it didn't have was a single-chip 'microprocessor.'