
Learning from the past

David Norfolk has a Proustian moment over IMS


Comment I'm sitting thinking about our new Reg Developer site and its target audience – professional IT developers who already read The Register but who might like something more targeted on their specific world – when an IMS Newsletter drops on the mat.

For those who don't know it, IMS (Information Management System) was IBM's hierarchical enterprise database management system from the seventies/eighties of the last century and it's where I learned about IT, as a DBA (database administrator) in Australia. (I once found a bug in IMS, a log tape commit on the wrong side of a memory FREEMAIN, which could ABEND and back out a change already committed to the log – oh, happy days).

IMS was, and is, a powerful DBMS (Database Management System) for very large applications exploiting hierarchical data structures, and these aren't that uncommon (XML data structures are basically hierarchical). It is also about the only technology product on my CV that is still reasonably saleable. IMS version 9 is still a current product (GA – General Availability – released October 2004) and IMS celebrated its 30th anniversary some seven years ago.

Actually, mainframe expertise generally is still in demand, according to Bill Miller (Head of Mainframe Solutions, BMC Software), whose company is actively developing the mainframe aspects of Business Service Management.

An update to BMC's visualisation software, due out next year, will take information from MainView for IMS or DB2 (or from BMC's competitors) and make it available in the service model; automated topology discovery, which will keep an ITIL CMDB dynamically populated with mainframe assets without overloading it, is coming; and transaction management across the IMS and DB2 world will let operational support identify points of failure across the enterprise.

In addition, BMC is extending its "smart DBA" to the mainframe – Miller sees mainframe DB2 V8 as a major innovation, although he thinks customers haven't yet converted in the volumes IBM expected. But there is still a (small) world of mainframe developers and, like developers everywhere, they are having to come to terms with service-oriented delivery – the business enterprise wants to be given operationally complete, manageable, automated business services, not just programs.

Incidentally, Miller is most proud of being able to support mainframe DB2 V8 – a mainframe database that is still being actively developed, and which IBM thought was moving too fast for the ISVs (independent software vendors) to keep up with. DB2 is still a database of choice for the very largest and most resilient business systems, while IMS is still in use but sees little new development – although I'm old enough to remember getting into trouble in a City bank for suggesting too publicly that DB2 was enterprise-ready and could start offloading processing from IMS.

As an aside, while I'm finishing off this piece, Mark Whitehorn's assessment of SQL Server as “enterprise ready” is stirring up comment in our email – “only mainframe DB2 and Teradata come close to cutting it” is a paraphrase of some of those comments. Well, if it were my money and the application was big enough, I'd mostly be looking at DB2 today too, because I like playing safe with my career; but things change, and tomorrow SQL Server could well be in the frame, just as DB2 snuck up on IMS.

But back to IMS around 1979. What I learned then was the importance of processing small messages in near real time (using IMS MPPs – message processing programs) rather than relying on batch processing. I learned about distributed processing – our central IMS database was fed from minicomputers in the state capitals, and if the communications failed those minis carried on providing a service (some 90 per cent of processing was local to the state) and updated the central mainframe when the comms came back up.
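That pattern – serve the work locally, queue the updates, replay them to the centre when the link comes back – is easy to sketch in modern terms. Here is a minimal Python illustration; the class, the queue and the flaky-link callback are all invented for the sketch, not a description of anything that ran on those minis.

# A minimal store-and-forward sketch of the pattern described above: the
# local site keeps serving requests while the central link is down, and
# replays queued updates when connectivity returns. Names are hypothetical.
from collections import deque

class LocalSite:
    def __init__(self, send_to_central):
        self.send_to_central = send_to_central  # callable; may raise ConnectionError
        self.pending = deque()                  # updates not yet applied centrally
        self.local_store = {}                   # the work that is served locally

    def process(self, key, value):
        # Serve the transaction locally first, so the site keeps working
        # even when the central mainframe is unreachable.
        self.local_store[key] = value
        self.pending.append((key, value))
        self.flush()

    def flush(self):
        # Drain the queue while the link is up; stop (keeping the rest)
        # as soon as the central site stops answering.
        while self.pending:
            update = self.pending[0]
            try:
                self.send_to_central(update)
            except ConnectionError:
                return  # comms are down; retry on the next call
            self.pending.popleft()

if __name__ == "__main__":
    central = []
    link_up = False

    def send(update):
        if not link_up:
            raise ConnectionError("link to central site is down")
        central.append(update)

    site = LocalSite(send)
    site.process("account-42", "balance=100")  # served locally, queued for central
    link_up = True
    site.flush()                               # comms restored: central catches up
    print(central)                             # [('account-42', 'balance=100')]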

I learned about abstractions and metadata – our databases were generated automatically from the data dictionary, and any production problems were fed back into the dictionary as an audit trail of implementation issues tied to the business data entities being processed, rather than just to the physical database.
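In modern terms, that is the physical schema being driven from a logical data dictionary, with incidents recorded against the business entity rather than the table. The toy Python sketch below illustrates the idea; the dictionary layout and the function names are invented for the example, not the shape of the real dictionary.

# A toy illustration of generating the physical schema from a logical data
# dictionary, and recording production issues against the business entity
# rather than the physical table. The dictionary layout is invented here.
DATA_DICTIONARY = {
    "Customer": {
        "table": "CUST01",
        "fields": {"customer_id": "INTEGER", "name": "VARCHAR(60)"},
        "issues": [],  # audit trail of implementation problems
    },
}

def generate_ddl(dictionary):
    """Emit CREATE TABLE statements from the logical entity definitions."""
    statements = []
    for entity, spec in dictionary.items():
        cols = ", ".join(f"{name} {dtype}" for name, dtype in spec["fields"].items())
        statements.append(f"CREATE TABLE {spec['table']} ({cols});  -- entity: {entity}")
    return statements

def log_issue(dictionary, entity, description):
    """Feed a production problem back to the logical entity, not the table."""
    dictionary[entity]["issues"].append(description)

if __name__ == "__main__":
    for ddl in generate_ddl(DATA_DICTIONARY):
        print(ddl)
    log_issue(DATA_DICTIONARY, "Customer", "duplicate keys seen in nightly load")
    print(DATA_DICTIONARY["Customer"]["issues"])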

Testing program specs against the logical data structures behind the databases they accessed let us identify and remove defects before coding even started. And the business logic associated with resolving operational database issues (such as “always restart the database after a transaction failure, after logging the failure, since most problems are due to the characteristics of a single rogue transaction; unless we're having repeated failures, in which case leave the database down and call emergency support”) was stored in, and executed from, an active repository.
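That restart rule is really a small decision procedure held as data rather than as operator folklore, which is what made keeping it in an active repository worthwhile. Here is a Python sketch of the rule as quoted; the failure-count threshold is an invented stand-in for whatever the repository actually held.

# The operational rule quoted above, expressed as a small decision procedure.
# The threshold for "repeated failures" is a hypothetical value.
import logging

logging.basicConfig(level=logging.INFO)
REPEATED_FAILURE_THRESHOLD = 3  # invented for the sketch

def on_transaction_failure(db_name, recent_failures):
    """Decide what to do after a transaction failure takes a database down."""
    # Always log the failure first, as the rule requires.
    logging.info("transaction failure on %s (recent failures: %d)",
                 db_name, recent_failures)
    if recent_failures < REPEATED_FAILURE_THRESHOLD:
        # Most problems are down to a single rogue transaction: restart.
        return "restart"
    # Repeated failures suggest something systemic: leave it down and escalate.
    return "leave_down_and_call_emergency_support"

if __name__ == "__main__":
    print(on_transaction_failure("ORDERS", recent_failures=1))  # restart
    print(on_transaction_failure("ORDERS", recent_failures=4))  # escalate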

This was some 25 years ago – and my impression is that driving a database environment from a logical metadata repository linked to a model of business data structures is pretty advanced even today.
