CASE becomes ALM and consolidates
Developing on a legacy
Comment Ever since Rational got three development automation gurus in the same room to agree on UML (Unified Modelling Language) and put an end to those pointless arguments about what shape the boxes in an analysis/design model should be, what used to be called CASE (Computer Aided Software Engineering) has been on a bit of a roll.
Well, with the sort of people who believe that you can automate the process of developing computer software, anyway.
Now, perhaps driven by the need to be able to demonstrate corporate governance (and, since everything is built on automated systems these days, this has to include IT governance) to third parties such as regulators and shareholders, it is becoming mainstream, although it is usually called something different these days. The concept is also being extended to cover computer assistance for the delivery, management, and operation of automated business systems, not just software engineering.
We are thus seeing a new focus on Application Lifecycle Management (ALM) as vendors realise that only end-to-end governance with transparency to stakeholders outside of the IT group really makes any sense. And there is plenty of product/vendor consolidation as the established leaders flesh out the bits of the lifecycle they’ve neglected in the past with best-of-breed products.
The missing bit is often Requirements Management. After all, the saddest thing in software development is seeing someone craft a high-quality piece of software which solves the wrong problem.
So, for example, we see Compuware acquiring Steeltrace, one of the more innovative Requirements Management product vendors, which has been a Compuware partner for some time.
The important part of this is the complementary nature of the products now in the Compuware portfolio. It already has a strong Model Driven Architecture (MDA) story, transforming “Platform Independent Models” (PIM) into code, via OptimalJ, but MDA has been weak, in the past, on the “Computationally Independent Model” (CIM) part of the story.
The CIM looks at the complete business system regardless of which bits of it you’ll automate. However, MDA Explained, for example (co-authored by Wim Bast of Compuware), devotes only part of one page (out of 150) to the CIM. There is, in fact, a real problem with automatically transforming a CIM into a PIM because a human decision has to be made as to where the man-machine boundary is placed, which (as MDA is all about automatic transformation of the different models) may explain the limited use of the CIM so far. Now, however, Andrew Watson (technical director of the OMG) seems to see business process modelling as implementing the CIM.
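For readers unfamiliar with the MDA idea, here is a deliberately toy sketch of what “transforming a platform-independent model into code” means: the model is held as plain data and a transformation emits platform-specific source from it. The model shape and names here are invented for illustration; real tools such as OptimalJ work from UML models and do a great deal more.

```python
# Toy PIM: one business entity and its attributes (hypothetical shape,
# invented for illustration - not OptimalJ's actual model format).
pim = {
    "entity": "Customer",
    "attributes": [("name", "String"), ("creditLimit", "int")],
}

def transform_to_java(model):
    """Generate a skeletal Java class from the toy PIM above."""
    lines = [f"public class {model['entity']} {{"]
    for attr, java_type in model["attributes"]:
        lines.append(f"    private {java_type} {attr};")
    lines.append("}")
    return "\n".join(lines)

print(transform_to_java(pim))
```

The point of the sketch is that the PIM says nothing about Java; a second transformation could just as easily target another platform, which is exactly the separation MDA trades on. The CIM problem discussed above is that no such mechanical transformation can decide what belongs in the model in the first place.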
Steeltrace requirements management, however, could also address this potential hole in the MDA story, by providing for a structured analysis and refinement of business requirements, some of which may never be automated.
Compuware also acquired the Changepoint product in 2004, which addresses IT Governance in general. This could also complement its latest Steeltrace acquisition. The ultimate failure in IT Governance is to waste money on a technology which doesn’t address a business requirement, resulting in a cancelled project or, worse, a dysfunctional project that is forced into production.
QACenter is yet another Compuware technology which supports automated risk-based testing – and it is well-known that early testing is cost-effective testing. With Steeltrace, testing could effectively start at the requirements stage and trace through into automated test cases run against the coded system. It could even continue into the operational use of the system, as you check that the working system continues to satisfy the business’ requirements.
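The traceability idea behind this is simple enough to sketch: if every test case records which requirement it verifies, then requirements with no tests (the gaps that risk-based testing worries about) can be reported mechanically. The data shapes below are invented for illustration and assume nothing about how Steeltrace or QACenter actually store this.

```python
# Hypothetical requirements register and test-case records.
requirements = {
    "REQ-1": "Record a customer order",
    "REQ-2": "Reject orders over the credit limit",
}

test_cases = [
    {"id": "TC-1", "verifies": "REQ-1", "passed": True},
    {"id": "TC-2", "verifies": "REQ-1", "passed": True},
]

def untested(reqs, tests):
    """Return requirement IDs that no test case traces back to."""
    covered = {t["verifies"] for t in tests}
    return sorted(r for r in reqs if r not in covered)

print(untested(requirements, test_cases))  # REQ-2 has no test yet
```

Run the trace the other way and you catch the opposite failure: test effort (or, worse, code) with no business requirement behind it, which is the IT Governance failure described above.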
To my mind, the cheapest and most effective place to remove defects is during requirements analysis. This is when unnecessary requirements can be eliminated, the business stakeholders can be questioned as to their understanding of their business processes (sometimes, what the user asks for isn’t what the user really meant to ask for) and the most cost-effective man-machine boundary can be drawn. Sometimes, a manual decision made by an experienced human being is more cost-effective than developing an automated artificial intelligence solution.
Compuware’s main task now is to meld all these different products into a coherent whole, addressing the Application Management Lifecycle in its entirety, but the signs are that it will succeed in doing this. However, Compuware certainly isn’t the only game – or the only tool consolidator – in town and the others also have their strengths.
Telelogic’s ALM story, for example, is complemented by its Enterprise Architecture approach using the impressive System Architect tool (originally acquired by buying Popkin) for enterprise modelling and integration. This tool helps you visualise business relationships involving technology, processes, and data and trace them back to their original sources – and, more importantly perhaps, publish this information easily. System Architect is one of the original CASE tools (its survival itself shows that it was doing something right) and it seems to be integrating well with DOORS, Telelogic’s requirements management tool, and the rest of Telelogic’s Lifecycle Solution (I expect to learn about further developments in this area later this month).
Telelogic has a particular strength in embedded systems development (enhanced by its recent I-Logix acquisition). Embedded systems development, developing the software which gives general-purpose hardware (including TVs, DVD recorders, cameras etc) its personality as a specific technology product, is the source of much of the discipline behind modern ALM and software engineering. Configuration Management, for example, can sometimes be considered a luxury in conventional development (if you’re lucky, and like risk, that is) but there is really no alternative when you are developing software for a hardware platform which isn’t finalised yet. This rather takes coping with changing requirements to the limit.
Borland, however, was perhaps the leading promoter of the ALM message (which isn’t to say that others don’t have similar messages, just that Borland really put the holistic message on the table first) and has now gone beyond ALM with Software Delivery Optimisation (SDO).
Borland’s early Caliber RM acquisition gave it one of the few Requirements Management tools in a similar class to Steeltrace, rather ahead of Compuware.
Nevertheless, perhaps the jewel in Borland’s crown comes from its acquisition of Teraquest, which brought Dr Bill Curtis (one of the authors of the Capability Maturity Model, the most widely-accepted process improvement initiative) into the team. This intellectual expertise is what is enabling a process-led approach to implementing SDO, which Borland calls Accelerate: define goals; architect the approach; develop and deploy the solution; validate results.
Borland has recently announced a new, and timely, focus on IT Governance, which, it says, means: improving IT-Business alignment through Demand Management; achieving consensus on which investments to make through Portfolio Management; using Project and Program Management to provide transparency of control over IT projects; using Resource Management for more effective resource utilisation; using Financial Management to support regulatory compliance; and using Asset Management to manage the transition of project deliverables into production. Its arguments make sense to me, although I am perhaps less happy with some of the terminology, which may also have other meanings.
Once again, instead of simply coding programs effectively, we are seeing a consideration of the big picture, the transparent delivery of value to the business. And, as a part of this, you should really be designing the operational side of an automated application and its continuing support at the same time as you design the application itself.
Here, ITIL is one of the emerging sets of best practices for the operational support of IT service delivery and central to ITIL is Configuration Management. All the ALM players have or have acquired Configuration Management solutions, but a company that is sometimes overlooked (perhaps because it plays particularly well in the large enterprise space) is Serena, which started by managing change and now has expanded the scope of its offerings to deliver a strong ALM message.
And what of IBM Rational, which re-started all this? Well, it is catalysing Eclipse’s growth into a full ALM environment, it seems to me, although we still hear from people who prefer Rational’s original products to the new Eclipse-based offerings. But IBM, with alphaWorks for example, is making a lot of original intellectual property available to the open source movement generally, and Eclipse in particular. It is also proposing the Eclipse Process Framework (see an introduction by Per Kroll, Manager of Methods at IBM Rational, here).
Eclipse is, however, by no means owned by IBM Rational. Telelogic has announced it will be an early supporter of the Eclipse Process Framework (EPF), for example. It will contribute its library of best practices from some 20 years of experience in areas such as Requirements-Driven Development, Model Driven Architecture, Enterprise Change Management, and Systems and Software Development. Similarly, Compuware is supporting the Eclipse Application Lifecycle Framework (ALF), along with Serena, its original sponsor.
ALM is one area, at least, where Microsoft is still playing catch-up – most people I meet from Microsoft’s Visual Studio Team System (VSTS) project still seem to think it's cutting code, not managing requirements, that’s really cool. So, when Ian Knox (lead product manager, Visual Studio Team System) let slip to me recently that he considered that everyone else was playing catch-up to Microsoft, I think he meant that its tools were particularly well-made and easy-to-use – we’ll have to see what Tim Anderson thinks when he delivers a review here later this month. However, it seems to me that Microsoft’s VSTS tools are a considerable improvement on what it used to have – and cover more of the lifecycle too. There is even innovation with Domain Specific Languages, although the jury is out on how successful these will be – Jack Greenfield blogs a reprise of some of the discussion here (Grady Booch, of Rational, has some issues here, for example).
So, ALM seems healthy. The big question remaining with ALM for the general programming community, perhaps, is: can you really automate business requirements management, software development and operational support? Surely, it’s all a craft and you just need clever people to make it all work? Well, no doubt a lot of banking back-office staff, for example, thought this way before the IT people automated many of them out of existence. We would be lucky (and probably well hated) if what we did to others couldn’t apply to ourselves.
However, a mantra from the early days of Systems Analysis still applies: “automate the routine and use people to deal with the exceptional”. Perhaps not everyone agrees that there is a routine in software development – here are a couple of quotes from Register readers commenting on IT professionalism: “Programmers have no such book. Every problem is new and different”; “Each job a mechanic does is a series of fixed steps. If you’ve done it once you can do it again. Developers have difficulties because they’re generally creating something new each time - often in a new environment” – but I simply can’t see that a system to, say, manage a doctor’s surgery in Birmingham is very much different in essence to a system used to manage a surgery in Bristol, although each may be coded from scratch by different programmers using different databases and IDEs. Paying to have similar functionality coded from scratch will soon make no sense to the people paying for software to automate their business (perhaps it hasn’t for some time); and software developers will have to keep up.
I see space for some developers writing third generation code for advanced pieces of utility software, to deal with parallel processing and performance bottlenecks, and probably working for the suppliers of databases and application servers. The rest of us will, eventually, move higher up in the ALM stack, analysing requirements, designing architectures, and QAing automated systems against “what the user really wanted”. And we will probably be writing something like UML 2.0 instead of code; writing good UML needs similar skills and disciplines to writing code but works at a higher level…
And I fully expect to hear similar comments to those I heard when I abandoned my roots as an Assembler programmer and moved on to third generation languages, 4GLs and prototype domain specific languages such as SQL; but how many people still write Assembler (or even C) today? Nevertheless, these changes won’t take place tomorrow – one thing experience tells me is that process change moves at the rate people change culture, not at the rate computer chip designers work.
That said, will coding ever really die? Well, not if you’re really, really good at it (I know that we’re all above-average coders, just as we’re all above-average car drivers, but I mean blindingly good) or find a backwater which just hasn’t moved with the times. As late as the 1980s, I knew a very clever Australian programmer who made a very good living maintaining Assembler which hooked into a mainframe OS “feature” (bug) inside a major bank’s back office applications; he is probably still doing it. And in the 1990s I met a programmer who knew nothing about Windows and made a living writing DOS systems that interfaced with Teletext for automotive parts dealers.
But you might be unwise to rely on such undoubted niche opportunities for a career these days. My career advice is to look around the ALM vendors and decide which of them has an offering that matches your way of looking at things – whatever you think of these developments generally, there’s a lot of choice on offer.®
David Norfolk is the author of IT Governance, published by Thorogood. More details here.