A history of personal computing in 20 objects, part 1
From the 17th Century to the 1970s
Feature Personal computing. Personal. Computing. We take both aspects so completely for granted these days, it's almost impossible to think of a time when computing wasn't personal - or when there was no electronic or mechanical computing.
To get from there to here, we've gone from a time when 'computers' were people able to perform complex calculations themselves, through mechanical systems intended to do the work for them and then to powered machines able to automate the process. These led to systems that could be programmed to perform not only mathematical tasks but to store and retrieve other forms of data, taking us right up to desktop devices for a one-on-one interaction with computing power.
I, Apple I
Since then, that power has been compressed into smaller, more convenient packages: laptops, tablets and smartphones.
What a trip. In memory of the many people who have helped us along, here then are some of the key stages of that journey, represented by the 20 objects that, to us, most embody the steps that brought us to where we are today.
It's not a comprehensive list - and feel free to comment with the devices you think we should have included - but here are the first ten of our 20 items, from the early days up to the end of the 1970s. Part two will bring us from the 1980s to the present day.
Napier’s Bones

As if introducing logarithms and evangelising the use of the decimal point weren’t enough, 16th Century Scottish mathematician John Napier (1550-1617) also devised a calculating machine able to multiply and divide. It could even work out square roots. There was initially no mechanism as such: just a series of rods - the ‘bones’ - marked with two columns of numbers. To calculate a multiplication, line the rods up side by side so the figures at the top of each show the digits in the multiplicand. A column of numbers on the left side of the apparatus provides the multipliers: read across the appropriate row, right to left, adding numbers grouped by diagonal lines, to generate the digits of the answer. Division is performed in a similar way. The rods were soon replaced by cylinders, each of which replicated all nine rods and could be rotated to ‘set’ a given digit in the multiplicand.
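For the curious, the right-to-left, add-along-the-diagonals step the bones mechanise can be sketched in a few lines of Python. This is our own illustrative reconstruction, not period notation: each 'rod' is modelled as the (tens, units) pairs of one digit's multiples, and reading a row of the rods amounts to adding each units figure to its right-hand neighbour's tens figure, carrying as you go. A single pass handles one row, i.e. a single-digit multiplier, just as the physical rods do.

```python
def bones_multiply(multiplicand: int, multiplier: int) -> int:
    """Multiply via a Napier's-bones-style digit-by-digit lattice.

    Each rod entry is divmod(digit * multiplier, 10): the (tens, units)
    pair printed in one cell of a bone. Summing along the diagonals is
    equivalent to adding each cell's units figure to the tens figure of
    the cell on its right, propagating carries right to left.
    """
    assert 1 <= multiplier <= 9, "one row of the rods handles a single digit"
    rods = [divmod(int(d) * multiplier, 10) for d in str(multiplicand)]
    digits, carry = [], 0
    # Work right to left across the row, as the article describes.
    for i in reversed(range(len(rods))):
        tens_from_right = rods[i + 1][0] if i + 1 < len(rods) else 0
        carry, digit = divmod(rods[i][1] + tens_from_right + carry, 10)
        digits.append(digit)
    leftmost = rods[0][0] + carry  # the final tens figure, plus any carry
    if leftmost:
        digits.append(leftmost)
    return int("".join(str(d) for d in reversed(digits)))
```

Multiplying by a multi-digit number works the same way on the real apparatus: read one row per digit of the multiplier and add the shifted partial products, exactly as in long multiplication.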
Babbage’s Difference Engine
Charles Babbage (1791-1871) first came up with the notion of a calculating machine capable of printing out perfect logarithm tables in 1812 at the tender age of 21. Seven years later, he set about producing such a device and, in 1822, he and his tame mechanic had put together the first Difference Engine, a clock-like assembly of interlocking cogs able to calculate squares and solve quadratic equations. It worked sufficiently well for the government and other institutions to invest in the production of a larger, more capable version. It proved too complex and was never completed - the project was effectively abandoned in 1834. By this point, despite many personal and professional setbacks, Babbage had his eye on the Difference Engine’s successor: a multi-function, programmable ‘Analytical Engine’. It too was never built, but nonetheless prompted the writing of the first computer program.
0. The Antikythera mechanism?
Any reference to Leo always reminds me of the story my father-in-law (95 and still going strong) tells. Lyons offered the use of Leo to the Inland Revenue to calculate the tax tables after a budget change. One year he was selected to go to Lyons, taking with him the highly confidential envelope that contained details of the tax changes to be announced in the budget. The rule was that the changes were secret until the budget announcements were made and the chancellor had sat down, which meant waiting for the phone call to say that the chancellor had sat down. Only then could the envelope be opened and the details given to the Lyons' techies. He opened the envelope, took out the paper that was inside and read it - "No Change".
> The Apple machines really weren't that important or successful
Maybe, but I used one in the office back in the early 80s. Great little machine, 48KB of memory, twin floppies and the killer app, VisiCalc.
When the company acquired its first IBM PC it seemed inferior in every sense. "It'll never catch on", thought I. Thus starting my ability to be 100% wrong about new technology.
DOS or CCPM? - no question, Concurrent CPM was superior in every way!
Future PC with CCPM or IBM PC with PC-DOS? - no brainer, I can multitask on my Future PC and store data on my 800kb floppies rather than 1 thing at a time with 360kb storage.
WordPerfect or MS Word? - easy. MS Word is so clunky it's all but unusable.
Apple Lisa and a mouse? No-one needs a mouse
If only I had bet against myself
Oh, no it wasn't.
The IMSAI 8080 (http://en.wikipedia.org/wiki/IMS_Associates,_Inc.) and MITS Altair 8800 (http://en.wikipedia.org/wiki/Altair_8800) both allowed third-party plug-in cards, and were launched in 1975.
That predates the Apple ][ by two years.
The Apple machines really weren't that important or successful until the Mac came out.
Try looking at the other home computers of the era, especially those from Commodore. Don't believe the Apple revisionists, even Apple lied about who sold a million machines first (it was Commodore, not Apple).