Removing the irrational from application rationalisation

When porting to the cloud, only take those applications that make sense

Are you hoping to forklift some of your applications over to the cloud, to take advantage of cost savings? By all means evaluate it, but understand all of the implications for the applications and the business processes that they support.

Rationalisation is one of the most important steps in that understanding. Before deciding which applications to move to a cloud-based infrastructure, organisations must document what they have, and how important different applications are to the business.

This is easier said than done, according to IBM, which argues that application rationalisation decisions are too often made on the basis of incomplete and subjective criteria. Worse, these criteria may be subject to the "influence of politics, pet projects, gut feel and the loudest voice in the room". An objective, structured change management process is required. But how should organisations implement one?

IBM's answer is Rational Focal Point, a web-based decision-making tool for application portfolio management. To illustrate its pitch, the company offers a case study of Rational Focal Point in use at Bank DnB NORD, a Danish bank with overseas branches across Poland and the Baltic region.

The bank wanted to rationalise and standardise its core banking systems, but "quickly realised that managing all the individual IT projects effectively would be extremely difficult with its existing approach, which involved a variety of documents, spreadsheets and applications such as Microsoft Project", says Torben Bøllingtoft Cruse, head of operations and portfolio management at Bank DnB NORD.

“The scale of the core banking replacement programme was simply too large. When we wanted to give senior management an overview of status across all projects, it took up to 70 hours just to collect the relevant data from all these different sources. As a result, status reports generally arrived too late to provide adequate support for decision-making, and there were also concerns about the completeness and consistency of data. The project office was increasingly becoming a place where, although a lot of information went in, it was almost impossible to get anything out – and certainly not in a timely manner.”

Cruse selected Rational Focal Point to speed up the process. For starters, the time taken to create a full status report on projects decreased from 70 hours to 1.5 hours. Other business benefits include better visibility of real-time project data, stronger management control, more accurate forecasting and scheduling, and, most importantly, a structured change management process that ensures the appropriate actions are taken, according to Cruse.

"Rational Focal Point saves us so much time on project control and management reporting, it’s hard to imagine how we could do without it,” says Cruse. “But time-savings in the project office are not the main point: what’s really important is that we can now provide accurate, comprehensive project data to the business in a timely manner – helping to support decision-making and keep all DnB NORD’s projects on track. We are now seen as an asset to the business – so much so that the bank has started using Rational Focal Point to manage all its projects – not only the ones related to the core."

Moving to the cloud

In many cases, the application base can be rationalised before the moving starts, argues Kevin Gemmel, head of professional services at Camwood, an application migration and portfolio management firm.

“We’ve rationalised people from 16,000 applications down to 3,000,” he says. “Then you’re left with a leaner core of applications. It really does massively reduce the cost and the time involved.”

However, Gemmel says that companies have rarely mapped out their applications in this way. “They may have done it once, perhaps with the move to Windows XP,” he says. “But they haven’t done it since, and they don’t have that cradle to grave governance.”

Part of the application inventory process involves categorising software in your portfolio to understand what it is being used for. This requires an understanding of how applications map to business processes, explains Bruce Otte, director of workload and performance services at IBM. "If we want to migrate a business process or workload to the cloud, we need to have an analysis to say which of these workloads can leverage the cloud," he says.
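To make the inventory step concrete, here is a minimal sketch of what a per-application record and first-pass filter might look like. The field names, categories and sample applications are invented for illustration; they are not taken from IBM's tooling or methodology.

```python
from dataclasses import dataclass, field

@dataclass
class AppRecord:
    """One entry in a hypothetical application inventory."""
    name: str
    business_process: str           # which business process the app supports
    criticality: str                # e.g. "core", "supporting", "retire"
    virtualisable: bool             # can it run on a hypervisor today?
    dependencies: list = field(default_factory=list)

inventory = [
    AppRecord("crm", "sales", "core", True, ["auth-service"]),
    AppRecord("batch-settlement", "payments", "core", False),
    AppRecord("old-reporting", "finance", "retire", True),
]

# First-pass filter: only virtualisable applications that are not already
# marked for retirement are candidates for further cloud analysis.
candidates = [a.name for a in inventory
              if a.virtualisable and a.criticality != "retire"]
print(candidates)  # -> ['crm']
```

Even a simple structure like this forces the "cradle to grave governance" Gemmel describes: each application carries its process mapping and dependencies, so later analysis doesn't start from scratch.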

Some processes may be able to take place entirely in the cloud, while others may have to stay as part of a non-cloud infrastructure. Several factors will influence this, but one of the most important is whether or not the application is easily virtualised, says Erik Sebesta, chief architect and technology officer at cloud computing consulting company Cloud Technology Partners.

“There are some applications that you can’t virtualise, and if you can’t virtualise it, then it’s not a good candidate today,” he says. After all, virtualisation is one of the fundamental tenets of cloud computing.

Examples of applications that are difficult to virtualise effectively are those with very low-latency requirements, running in microseconds, Sebesta says. “If you have something with specialised hardware then the cloud isn’t good for that. Applications that are monolithic and tightly coupled represent a full rewrite.” Applications with lots of interdependencies can also be problematic.

When deciding how challenging a move may be, don’t forget other considerations such as security (where are the data and the applications hosted?). The potential for efficiency gains is also relevant. Some applications will have been specially crafted over the years to take full advantage of the resources available, using a high proportion of available computing power, rather than leaving 90% of it idle, as is the case with many apps.

These efficient, ‘dense’ applications tend to fall into the transaction processing category, but you’ll also find many of them in the high-performance computing area, used by the scientific community.

The evaluation will throw up a key fact: that a business process’s viability for the cloud lies along a spectrum, rather than being binary. A lot of things can be moved to the cloud given enough motivation and money, but it’ll take varying levels of cost and effort, depending on the characteristics of the underlying application.

Otte uses two factors when analysing the viability of a process for the cloud, and maps them on two axes to create a quadrant. The horizontal axis denotes the effort to cloud-enable a process, while the vertical represents the business value to be gained by doing so.

“Business value is measured in three areas: one is the ability to change with rapidly shifting business needs,” Otte says. “The other two are cost reduction, and the ability to integrate.”

After mapping all of your processes, you will hopefully end up with a cluster in the upper right-hand quadrant that you’ll go after first. These will be applications that are relatively easy to cloud-enable, and offer clear benefits.
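The quadrant analysis above can be sketched in a few lines of code. This is an illustrative sketch only: the scores, thresholds and labels are assumptions, not part of Otte's actual methodology, and it assumes low effort is plotted towards the right so that the easy, high-value cluster lands in the upper-right quadrant as the article describes.

```python
def quadrant(effort: float, value: float, threshold: float = 0.5) -> str:
    """Place an application on a 2x2 grid.

    effort: estimated effort to cloud-enable (0 = trivial, 1 = very hard)
    value:  business value of doing so (0 = none, 1 = high)
    """
    easy = effort <= threshold
    worthwhile = value > threshold
    if easy and worthwhile:
        return "move first"             # upper right: easy win
    if not easy and worthwhile:
        return "plan carefully"         # valuable but hard to move
    if easy and not worthwhile:
        return "move opportunistically" # cheap, but little to gain
    return "legacy / replace later"     # lower left

# Invented example scores for a hypothetical portfolio.
portfolio = {
    "payroll":        (0.2, 0.9),
    "core banking":   (0.9, 0.8),
    "print server":   (0.1, 0.2),
    "mainframe batch": (0.8, 0.1),
}
for app, (effort, value) in portfolio.items():
    print(f"{app}: {quadrant(effort, value)}")
```

In practice the scores would come from the inventory and business-value assessment described earlier, and the thresholds would be tuned per organisation; the point is simply that the mapping produces a prioritised shortlist rather than a yes/no answer.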

An example may be applications that can be readily virtualised, and which need a lot of computing power for short periods of time. Single-tiered applications (as opposed to multi-tiered ones that may need to be redesigned) can also be relatively easy to move to the cloud in some cases. However, Gemmel argues that in most cases some rewriting will be necessary.

Applications in the centre of the quadrant will be harder to move (although hopefully not impossible). Whether or not the IT department moves them will depend on organisation-specific parameters, which could be as political as they are technological or economic.

“Those in the lower left quadrant could be legacy applications,” says Otte. “They may be older applications that might not be able to take advantage of cloud, and so we may replace them at some point in the future.”

Otte says that regulatory requirements may also keep some applications glued to the lower-left quadrant. The application or the data associated with it may be bound by privacy regulations.

Even these applications toward the troublesome edges of the quadrant may not be lost to the cloud, however. In some places, there may be cloud-based alternatives that map closely enough to the functions of the existing software, making it possible to abandon an application altogether and change to a software-as-a-service (SaaS)-based solution instead.

In some cases, some of these problematic applications may be moveable to a private cloud, but not a public one (software supporting heavily regulated processes, for example).

All of this has an impact on the application lifecycle, which is disrupted when applications are migrated anywhere or transformed in any way. But Anthony Dickinson, cloud computing service director at Glasshouse Technologies, an IT consulting firm, argues that the disruption can be positive.

Where possible, cloud-enablement should move applications in the direction of standardisation, he says. “In most organisations they may try and set standards but you’ll find that people are using different languages and middleware layers.”

Cloud models such as platform-as-a-service (PaaS), in which large parts of the software framework are pre-baked for the user, should help to level the environment, which will theoretically make the whole application lifecycle easier to manage.

The caveat is that the platform has to make sense for the customer, who will need to find one that is flexible enough to suit their needs. For a large organisation with wide-ranging processes and applications - especially those with many interdependencies - cloud enablement will be a delicate balance.

On the one hand, they’ll need to find cloud-based environments that support enough diversity in the application base. On the other, they’ll want to move enough of their application portfolio over to the cloud to recognise real savings. It is considerations such as these that make the evolution of broad organisational cloud strategies relatively slow, and cautious. ®
