'We just plain ran out of atoms'
But that relatively straightforward sequence of improvements didn't last. In the early 2000s, Bohr told us, "Traditional scaling ran out of steam." The problem was that as process sizes shrank, circuits leaked proportionally more power relative to the electrons doing useful work.
Or, as Bohr put it, "We just plain ran out of atoms." When you're talking about gate oxides, he explained, a 1.2nm deposit is only about six atomic layers thick.
At that point, Bohr said, it became clear that it was necessary to investigate, develop, and introduce what he called "revolutionary features", such as strained silicon, then high-k metal gate, and most recently what Intel calls tri-gate transistors and much of the rest of the world calls FinFET structures.
But we're getting ahead of ourselves. On the architectural side, there were plenty of developments underway while the process engineers were busily scaling down the chips' transistors.
Pawlowski wasn't in on the earliest days of the 4004's morphing into the 8-bit 8008 of 1972, and then the development of the much more capable 8-bit 8080 in 1974, which was the microprocessor that really got the ball rolling.
The 8080 wasn't alone, though – there was plenty of competition in the earlier days, such as the Zilog Z80, Motorola 6800, and MOS Technology 6502, which Pawlowski told us were all essentially equal competitors at the time.
"The 8080 was essentially just a simple processor," he told us, "but it had a program counter, it had these nice, wonderful eight registers that we have today, the eight-bit registers. Then the 8085 was an extension of that – it was essentially a 5-volt part."
Pawlowski's first baby at Intel was the 8086, which had a 16-bit external bus, unlike its compatriot, the 8088, which had an 8-bit external bus.
The 8088's claim to fame was that IBM chose it for its groundbreaking IBM Personal Computer – aka the Model 5150 – which it introduced in 1981. According to Pawlowski, IBM chose the 8088 because its 8-bit external bus was compatible with peripherals that had been developed for the smaller-market 8080 and the 8085.
After the 8086/8 came the 80286, which Pawlowski described as not a ground-breaking departure, but rather "just a better architecture than the 8086." The 80286 still required a math coprocessor, the 80287. Unfortunately, Pawlowski remembers, "The 286 added some interesting things, like with the math coprocessor they added an interrupt field which clobbered some of the old interrupt fields in the 8086."
All progress is not linear.
What's that in brontosauri?
Succeeded despite technology, not because of it
The story of the success of Intel microprocessors is that commercial, and not technical, factors dominate.
The 8086 was very much inferior to the 68K and the 16032; it was probably on a par with the Z8000. I remember Intel trying to sell to me at that time, and they always emphasised price, the agreement with AMD that gave a guarantee of supply and assurance on pricing, and support. They never tried to sell on performance or technical aspects because it was well behind Motorola.
The PC then came out and things changed very rapidly. Intel broke the AMD arrangement and the price of the first non-agreement part, the 80287, skyrocketed. Technically, Intel parts were still very much second best, but they sold fantastic numbers of parts. The 80286 retained the awkward segmented architecture extended with protected mode; performance was still very poor. The 386 finally had a sensible memory architecture, but still had the nasty special-purpose registers and complicated instruction set, and performance was still very poor compared to other micros. It was probably not until the Pentium that Intel gained parity with other microprocessors.
None of these technical things mattered; one design decision by IBM made Intel the dominant microprocessor company with massive resources, despite – not because of – their technical design.
Ahh, those were the days...
...when bytes were real bytes, motherboards could be fixed with a soldering iron, "intellectual property" meant you'd paid off your Encyclopedia Britannica, and 'programming' meant hand coding raw MC. Maybe assembler if hung over.
And yes, counting every damn clock cycle.
God, I feel old.... <sniff.>
"After the 8086/8 came the 80286..."
No, it didn't. After the 8086/8 came the 80186/8, which was then followed by the 80286.
I remember coding in 80186 assembly on my dad's Tandy 2000...
A few corrections more...
1. IT was not built on the Intel 4004 or its successors. The information technology industry started in the 1950s with pioneering data processing applications leveraging emerging computing technology. Remember LEO, and the IBM 1401? They were certainly information technology systems. You'd have to use a pretty discrete and tortured definition of IT to claim the 4004 was its first building brick.
2. You use the phrase 'first processor' to describe the 4004. Here comes more pedantry... This is not true either. It was the first commodity, commercially available microprocessor -- which is to say an IC with all the traditional components of a CPU. Computer processors in the modern sense date back to at least 1949 and EDSAC. The Digital PDP-11, a direct contemporary of the 4004, certainly had a processor, as did all its ancestors. What it didn't have was a single-chip 'microprocessor.'