DBMS pioneer Bachman: 'Engineers have more fun than academics'

Very large databases are so '70s

Something borrowed, something new

Bachman describes IDS as "an assemblage of a number of different ideas". Some of these ideas came from abstract concepts such as data hierarchies, data descriptions and randomised addressing. Others grew from earlier attempts to analyse data, such as the 702 Report Generator developed for the IBM 702, and 9PAC, which was built for the 702's successor, the IBM 709. Bachman became involved in developing the 9PAC reporting system through the IBM SHARE user group while still at Dow.

He recalls the way that SHARE worked then was comparable in some ways to today's open source development. Participants gave their time and ideas voluntarily and swapped code – as long as you had several million dollars to buy the machine to run it on.

But it was not only novel software ideas that made IDS possible. Developments in storage technology also played their part and the big enabler was the emergence of "affordable" and reliable disk storage devices. Disk stores had been around since the mid-1950s, but it was not until the early 1960s that it became practical to use them in commercial applications.

Building the programs needed to operate IDS turned out to be a real challenge for Bachman. He had only written a single program previously.

"When I came to develop IDS, I had only one program behind me. We started out using a language called GECOM, which was GE's version of what later became Cobol. For two years we were dealing with GECOM although there was a feature that allowed you to enter the assembly language: GEPLAN."

Even before the MIACS project was completed, Bachman realised that IDS would be important: "MIACS did not go into production until 1965 – but we already had two IDS sites running in 1964 and saw we could put it into production."

Database software derived from IDS's network model continued to grow until the 1980s, when faster hardware enabled the relational model developed by IBM's Ted Codd to take the lead.

Bachman, though, remains sceptical about the use of relational databases in large-scale transaction processing systems: "There is still a large discussion here in my mind. Ideas have a great deal of momentum. Big systems are very difficult to replace with 'new' systems. Otherwise why do a thousand IDMS (IDS) systems still run some very large IBM mainframes around the world?"

It's a hot topic. Others, particularly supporters of NoSQL, would argue RDBMS is not suited to large systems. Today those large systems are not mainframes but data centres running companies' clouds. When it comes to so-called very large databases today, Bachman is fairly dismissive and brief. "Oh that was all happening in the 1970s," he tells us.

Passion in the blood

Bachman's later work has taken him into open systems and computer-aided software engineering (CASE) tools, but his passion for database systems has continued. Most recently he has worked with the Cord Blood Registry (CBR) in California to develop special database software for stem cell research.

Bachman's papers and correspondence are archived at the Charles Babbage Institute.

And with his birthday fast approaching next month, Bachman is still thinking outside the box, coming up with fresh insights on databases.

"While working on my data-modelling book I had an insight into what relational DBMS is really all about. Thinking back to the old card-based 702 report writer it was all about flattened files – where all codes are in every record. If you do a relational JOIN you do just that – you flatten the file.

"So," he concludes, "it struck me that relational DBMS is a leap back to a 100-year-old technology – it tracks back to what we used to do with punched cards – aside from SQL as a language to control what is going on." ®
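Bachman's "flattening" observation is easy to demonstrate. The sketch below (using an invented order/line-item schema – the tables and data are purely illustrative, not anything Bachman worked on) shows how a relational JOIN denormalises two tables back into one flat record stream, with the parent's columns repeated in every child row, much as every punched card in a flattened file carried all of its codes:

```python
import sqlite3

# Two normalised tables: a parent (orders) and a child (items).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT);
    CREATE TABLE items  (order_id INTEGER, product TEXT, qty INTEGER);
    INSERT INTO orders VALUES (1, 'Dow'), (2, 'GE');
    INSERT INTO items  VALUES (1, 'widget', 10), (1, 'gadget', 5),
                              (2, 'widget', 3);
""")

# The JOIN "flattens the file": the customer column from the parent
# table is copied onto every matching item row in the result.
rows = conn.execute("""
    SELECT o.order_id, o.customer, i.product, i.qty
    FROM orders o JOIN items i ON i.order_id = o.order_id
    ORDER BY o.order_id, i.product
""").fetchall()

for row in rows:
    print(row)
# Each result row is one wide, self-contained record – the shape of a
# punched-card flattened file rather than of the two source tables.
```

The result set repeats 'Dow' on every one of order 1's item rows: the hierarchy has been collapsed into one record per leaf, which is exactly the 100-year-old card layout Bachman is pointing at.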
