Dennis Ritchie: The C man who booted Unix
strcat(obit, "Quiet revolutionary");
Obituary It was 1968 and students and workers were on the march, protesting against the Vietnam War, with the western world seemingly teetering on the brink of revolution.
In the sleepy, leafy New Jersey suburb of Murray Hill, a young maths and physics graduate was laying the groundwork for an entirely different revolution.
For Dennis Ritchie, 1968 was the first year of a 44-year career spent among the boffins of Bell Labs where he helped usher in the modern computing age.
Ritchie, who passed away this weekend – still an employee of Bell Labs, now Alcatel-Lucent Bell Labs – created the C programming language and, along with Ken Thompson, created Unix.
Thompson and Ritchie built the bridge that carried computers across the chasm from room-sized hulks owned by governments and corporations to systems for the everyman.
Computers were put into the hands of private citizens and small businesses, helping to create an entire hardware and software industry.
Nearly 40 years after he created it, C is the world's second most popular programming language, after Java and before C++, which both owe it a huge debt.
C was built on B, also from the Bell team.
It packaged a concise syntax with features and functionality that made it a simple yet powerful tool for building a complex system such as Unix.

Ritchie had joined Bell Labs in 1968, on a team working to build a general-purpose computer operating system.
Thompson was impressed and re-wrote most of Unix's components in C, with a C-based kernel published in 1973.
C was important to the success of Unix. Portable general-purpose programming languages had not existed before C; hardware proliferated, and manufacturers tried to lock in virgin customers through the power of language. Herb Sutter, convener of the ISO/ANSI C++ standards committee that owes its existence to Ritchie's work, summed up the computing landscape of the era:
Computers proudly sported not just deliciously different and offbeat instruction sets, but varied wildly in almost everything, right down to even things as fundamental as character bit widths (8 bits per byte doesn't suit you? how about 9? or 7? or how about sometimes 6 and sometimes 12?) and memory addressing (don't like 16-bit pointers? how about 18-bit pointers, and oh by the way those aren't pointers to bytes, they're pointers to words?).
C escaped the Labs, and by the mid-1970s developers had begun dropping languages including Programming Language One (PL/I) and the high-level ALGOrithmic Language (ALGOL) in its favour.
C paved the way for object-oriented programming with C++ and went visual with Visual C and Visual C++ from Microsoft. Today, C and its descendant C++ are popular choices for building operating systems, used in Windows and son-of-Unix Linux, and even bits of OS X. C also influenced Java from Sun Microsystems.
With success came danger: systems vendors being systems vendors, they reverted to type and started adding extensions to C to make it "run better" on their own particular hardware. This was an era when companies were running hard to win customers and establish market share.
It was decided, however, that C's continued success should be safeguarded by enshrining Ritchie's language as a standard. In 1989, the American National Standards Institute (ANSI) approved the ANSI C standard, with the International Organization for Standardization (ISO) following a year later.
This man made a bigger contribution to the world of computing than Steve Jobs
Fact: Apple's OS and toolset all came from Ritchie's work. Tablets, mobiles and mainframes all use Unix-like operating systems and toolkits.
I am no expert, but surely to say "Linus Torvalds announced his project of writing an open-source clone of Unix from scratch in 1991" is a bit misleading? A kernel, yes. But a great deal of the rest was GNU -- notably the C-compiler. I can see that Stallman has not made any friends in the past week, but that doesn't justify airbrushing GNU out of Unix history like Trotsky.
A true innovator
... and a real "genius". One that Michael Bloomberg probably won't have heard of.