Adopt and adapt better than rip and replace...
Developing with Legacy Systems – Part 1
If the internet has done nothing else, it has made the parochial world of proprietary systems appear outdated. Software architecture "A" now had better interoperate with software architecture "B", or risk rejection as unfit for purpose. And all of them need to interoperate with language "X" and application "Z". In practice, this holds even when the languages and applications are ancient and venerable, because they are still running tasks critical to the survival of the business.
The arrival of SOA has made this an imperative, for "sweating the assets" is now a strong mantra among business and organisational users. But developers and business managers need to work together to decide what actually constitutes an "asset". It is easy to assume that the asset is just the data, which can then simply be ported to a new environment. In practice, however, the asset is more commonly both the data and the application that created it, with the pair often being inseparable: the application may represent the only detailed documentation there is of the underlying business requirements, as well as containing hardcoded "master data".
It is also the case that any change away from an existing asset carries risk as well as advantage: change can mean failure as well as success. So integrating legacy assets that still perform a valuable task for the business remains a serious and important option for developers to consider in business terms, regardless of how much they might relish the technical challenge of making a change.
This does not mean legacy integration is always the right answer. Redeveloping an application as a service component designed for an SOA environment has to remain one of the developers' options, and it will become a more pressing one over time, if only because the staff with the knowledge and experience needed to maintain many legacy applications belong to the "baby boomer" generation and are now nearing retirement age.
Vendors are therefore now offering businesses with established legacy systems a growing range of options that can integrate and absorb those important applications into the newer world of SOA and the Internet, without the risks associated with "rip and replace" approaches.
One option has seen HP, because of its commitment to Intel's Itanium processor as the heart of its high-end server hardware range, take an important step with two of its acquired legacy environments – DEC's old OpenVMS and Tandem's old NonStop (see the whitepapers here and HP NonStop evolution here). The company has invested in porting these environments to run natively on Itanium, which at least gives existing applications a current, more maintainable server platform to run on. That should give both systems an extended lease of life, which matters because both still run business-critical applications. And since both also offer potential advantages for managing complex business transactions across the Web, it will be interesting to see whether HP takes the further step of promoting them to new users with support and developer education programmes.
Mainframes are still here, still in use
Irony indeed. When I first worked as a journalist, my News Editor refused to believe that the bank I'd just walked out of still depended on mainframes - and was likely to do so for a long time (despite our IT director promising to migrate it all to NeXT in a year or so).
And, iirc, the first non-Microsoft compiler for NT was FORTRAN - but that's another story (and I really do prefer COBOL - or Rexx).
And, as an aside, Web HTML programming is a lot like mainframe 3270 programming. But then, the web browser model is really a 3270 terminal in software... :-)
Mainframes were always there
There is irony in relegating mainframes to the "dusty past". Berners-Lee wrote in his history of the web that the first web server outside of CERN came up on a S/360 (written in Rexx).
HP's 'legacy' systems
HP did not make the decision to give VMS and NonStop 'a new lease on life', nor has it invested significantly in any such effort. Compaq made the decision to port them to Itanic rather than scrap them outright when it elected to kill Alpha (a decision a couple of years in the making, and not one associated with the imminent HP merger announcement), reportedly with significant help from Intel in funding the migration. Intel apparently believed in mid-2001 that visible support for Itanic was worth a large fraction of $1 billion in porting support, though whether it will ever see any more return on that expenditure than on the many other $billions poured down that sink-hole remains unclear.
Since that time Compaq, and later HP, have done little more than (barely) keep them alive, so one should probably not hold one's breath for any greater commitment – nor hold them up as examples of any kind of 'legacy integration' into the 'service-based world'.