Call to ARMs
The first ARM used 25,000 transistors. “It was a really small, simple piece of design,” says Furber.
Today one of the most significant features of the ARM family is its low power consumption. But that hadn't been an initial goal, according to Furber. “We designed the ARM for an Acorn desktop product, where power isn't of primary importance. But it had to be cheap. Cheap meant it had to go in a plastic package, plastic packages have a fairly high thermal resistance, so we had to bring it in under 1W.”
The original 3µm ARM chip
The power-test tools they were using were unreliable and approximate, but good enough to verify that rule-of-thumb power requirement. When the first test chips came back from the lab on 26 April 1985, Furber plugged one into a development board and was happy to see it working perfectly first time.
Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.
As Wilson tells it: “The development board we plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident.”
Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a watt.
From the horse's mouth...
“Running on leakage” also happened to be an appropriate description of Acorn outside the lab. To staunch the haemorrhaging finances, Italian computer manufacturer Olivetti stepped in with an initial cash injection in February 1985. By the end of the year, Olivetti had taken a controlling interest, effectively buying the company.
“Olivetti wasn't told about the ARM when they bought Acorn,” Steve Furber remembers. When they found out, he says, “they didn't know what to do with it”. Wilson and Furber did, and were able to push through the ARM's initial commercial appearance in the ARM Development System, a specialist add-on for the BBC Micro costing £4,500.
Almost brought a tear to my eye. A beautiful article about a beautiful piece of technology. My jaw dropped when I read the ARM worked without applying Vcc :)
IBM ROMP vs. ARM
The IBM ROMP chip (a descendant of IBM's 801 research processor) was never intended to be a general-purpose RISC processor. It was intended to power an office automation product (think of a hardware word-processor like the ones WANG used to sell).
As a result, although it could function as a general-purpose CPU, it was not really suited for it. It was never a success because, at the time, IBM could not see justification for entering the pre-Open Systems UNIX world. The RT 6150 and 6151 were intended as niche systems, mainly for education, although they did surface as channel-attached display front ends for CADAM and CATIA running on mainframes (and could actually run at least CATIA themselves). This changed completely with the RIOS RISC System/6000 architecture, where IBM was determined to have a credible product, and invested heavily.
In comparison, the ARM was designed from the ground up as a general-purpose CPU. Roger Wilson (as he was then) greatly admired the simplicity and orthogonality of the 6502 instruction set (it is rather elegant, IMHO), and designed the ARM's instruction set in a similar manner. Because the instruction set was orthogonal (like the 6502, the PDP-11 and the NS32000 family), instruction decoding is almost trivial. That also made modelling the ARM on an Econet of BBC Micros (in BBC BASIC, no less) much easier, which allowed them to debug the instruction set before committing anything to silicon.
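That "almost trivial" decode is easy to see from the ARM's data-processing format, where every instruction carries the same fixed fields in the same places (condition, opcode, source and destination registers). A minimal illustrative decoder in Python — the function and table names are my own, not from any ARM tool, but the field positions match the original data-processing layout:

```python
# Sketch of decoding the ARM data-processing instruction format.
# Field layout: cond[31:28], 00[27:26], I[25], opcode[24:21],
# S[20], Rn[19:16], Rd[15:12], operand2[11:0].
COND = ["EQ", "NE", "CS", "CC", "MI", "PL", "VS", "VC",
        "HI", "LS", "GE", "LT", "GT", "LE", "AL", "NV"]
OPCODES = ["AND", "EOR", "SUB", "RSB", "ADD", "ADC", "SBC", "RSC",
           "TST", "TEQ", "CMP", "CMN", "ORR", "MOV", "BIC", "MVN"]

def decode_data_processing(word: int) -> str:
    """Extract the fixed fields of a 32-bit data-processing instruction."""
    cond = (word >> 28) & 0xF       # condition code, on every instruction
    opcode = (word >> 21) & 0xF     # ALU operation
    rn = (word >> 16) & 0xF         # first operand register
    rd = (word >> 12) & 0xF         # destination register
    suffix = "" if COND[cond] == "AL" else COND[cond]
    return f"{OPCODES[opcode]}{suffix} R{rd}, R{rn}"

# 0xE0810002 encodes ADD R0, R1, R2 (condition AL)
print(decode_data_processing(0xE0810002))
```

Because the fields never move, the whole decode is a handful of shifts and masks — exactly the kind of thing that could be modelled in interpreted BBC BASIC and still run fast enough to be useful.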
They had to make some concessions on what they wanted. There was no multiply-add instruction, which appeared to be a hot item in RISC design at the time; to keep the design simple and within the transistor budget, all they could offer was a shift-add via the barrel shifter, which, although useful, was a barrier to ultimate performance but great for multi-byte graphics operations.
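The barrel shifter sits in line with the ALU, so one operand of any data-processing instruction can be shifted for free: `ADD R0, R1, R1, LSL #2` computes R1 × 5 in a single instruction. Multiplication by a constant then decomposes into a short sequence of such shift-adds. A sketch of that decomposition in Python (the helper name is mine, for illustration only):

```python
def multiply_by_constant(x: int, k: int) -> int:
    """Multiply x by a non-negative constant k using only shifts
    and adds, the way a multiply would be expanded into ARM
    shift-add instructions in the absence of hardware multiply."""
    result = 0
    bit = 0
    while k:
        if k & 1:
            result += x << bit  # one ADD with a barrel-shifted operand
        k >>= 1
        bit += 1
    return result

# 10 = 0b1010, so x * 10 = (x << 1) + (x << 3): just two shift-adds.
print(multiply_by_constant(7, 10))  # → 70
```

Each set bit in the constant costs one instruction, which is why constants with few set bits (screen strides, pixel widths) suited the scheme so well.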
It was also simple enough so that they could design the interface and the support chips (MEMC, VIDC and IOC) themselves, achieving early machines with low chip counts.
This is all from memory of articles in Acorn User, PC World, Byte and other publications. Feel free to correct me if my recollections are wrong.
This is why I come to the Reg: it's so much nicer to read a well-informed, structured article than the usual "my dad's bigger than your dad" fanboy rantings on other sites.
I would just like to congratulate the author of this article. Beautifully written. A real credit to yourself, and The Register, sir. Thank you very much.
It struck me that Hauser, despite not being the technical guy, is as bright as they come. He clearly recognised the talent he had with Wilson and Furber.
A completely riveting read. Actually, the story of Acorn and ARM would make a very good book. Ditto Inmos, IMHO.
Have a good day all.
Re: Great article
I agree, it wasn't Acorn and the ARM which failed.
But there is something about the British industrial and financial environment which seems to let the winnings from these works of genius drift away out of reach.
It's not just globalisation, or some factory so expensive to build that there can be only one on the entire planet. And we can't expect to spot the right investment choice every time. But what is it about this country that turns a successful entrepreneur into somebody fronting a TV show that tests how well people can run a market stall in Essex?