Firing up the Erudine engine

Building with behaviour

Stage 5: The model is run in order to test the behaviours it has captured (this is a combination of integration and user acceptance testing), and the system "plumbing" is completed.

Stage 6: It now just remains to choose from various deployment options:

  1. For a greenfield site or new development, you can deploy the Erudine model as a conventional new system – cut over to the tested system from existing, possibly manual, processes (if any) once you're sure it's ready. However, Erudine seems to have little actual experience of this.
  2. You can run the Erudine model in parallel with the original system and then cut over once everyone is comfortable with it. This is the safe, normative option, but it can be difficult to manage duplicate information or system state during the parallel run. It is also an expensive option in the short term and assumes a mature, well-organised company.
  3. You can use the Erudine model as a "requirements doc" for a cheap rewrite using conventional coding techniques (probably using cheap, outsourced programmers). This is the comfortable option and appears to minimise risk, but you are unnecessarily duplicating effort and forgoing the maintenance benefits Erudine promises (although resurrecting the Erudine model for maintenance will be cheaper than the initial build). If you rewrite the Erudine model, you are probably writing new legacy. It's not really risk-free even in the short term: it takes longer, and you need to manage the rebuild quality and ensure that the Erudine behaviour isn't compromised in the rebuild.

A typical Erudine sell is based on identifying a pain point, such as a legacy system which must be replaced for good business reasons and for which a conventional rebuild is infeasible (or has actually failed). Generally, the sell goes:

  1. First, build a new system related to the legacy target (important enough to matter; not so important as to be a company-killer) as a proof of concept; then
  2. Second, under a follow-on contract, recreate the whole legacy system's behaviour with Erudine; and
  3. Potentially, maintain the system by changing its behaviour in the self-testing Erudine models.

Issues

As with any new approach, there are issues to consider. The Erudine approach to legacy reclamation is impressive, more so than the superficially attractive "put an object wrapper around your legacy and deploy it as a service" approach (which reads well but has serious problems in the detail – the risk of building standards-based chaos, for one; and the question of whether the legacy really breaks down neatly into cohesive services, for another). However, it may not be the only feasible option.

Micro Focus and others have mature automated tools for refactoring and understanding legacy systems by analysing the source code (if you still have it). Micro Focus, of course, has positive case studies (eg, one dating back to 2000). And Compuware, for example, provides 4GL tools such as Uniface for rapidly recreating legacy systems in a more agile, business-oriented environment. Compuware's approach may be an unfashionable one, but I think it's also a workable one.

Erudine offers a different approach and probably a more integrated one, with an attractive maintenance story. What makes it different is the underlying mathematical model of behaviour and consequent automatic consistency checking (as I said, this is very hard to assess objectively, as it is a secret "black box"); and the fact that it explicitly manages "knowledge" with conceptual graphs.
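Erudine's underlying model is a secret, so any illustration can only be generic. As a minimal sketch of what automatic consistency checking over captured behaviours might look like in general (the rule format and all names here are hypothetical, not Erudine's actual mechanism): each behaviour maps a set of observed conditions to an expected outcome, and a checker flags pairs of rules that can apply to the same situation but prescribe contradictory outcomes.

```python
from itertools import combinations

# Hypothetical rule format: (frozenset of condition atoms, expected outcome).
# The third rule refines the second but reverses its outcome, so a
# consistency checker should surface the pair for a human to resolve.
behaviours = [
    (frozenset({"account_open", "balance_positive"}), "approve_withdrawal"),
    (frozenset({"account_open", "flagged_fraud"}), "block_withdrawal"),
    (frozenset({"account_open", "balance_positive", "flagged_fraud"}),
     "approve_withdrawal"),
]

def find_conflicts(rules):
    """Flag pairs where one rule's conditions subsume the other's but the
    outcomes differ - candidate contradictions in the captured behaviour."""
    conflicts = []
    for (c1, o1), (c2, o2) in combinations(rules, 2):
        if (c1 <= c2 or c2 <= c1) and o1 != o2:
            conflicts.append(((c1, o1), (c2, o2)))
    return conflicts

print(len(find_conflicts(behaviours)))  # prints 1: one contradictory pair
```

A real system would need richer semantics (a more specific rule may legitimately override a general one as an intended exception), which is exactly why such conflicts are reported for human resolution rather than rejected outright.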

If it delivers on its promises, it moves rule-based systems to a higher level of knowledge management (the rules community was originally AI-focused and now plays down AI; perhaps Erudine puts it back, to an extent). Nevertheless, Erudine is difficult to evaluate without case studies – and case studies can succeed or fail for reasons unconnected with the use of Erudine. In any case, most of Erudine's best customers aren't talking publicly. Still, the secondary evidence of Erudine's workability, from Gartner and others and from unattributable sources, appears good. ®
