
Breaking with your back-up supplier is a sticky business

How to dissolve the glue


Backup is like software superglue with three sticking points: the backup software installation; its operation; and the longevity of the stored backup data vault, which can be needed for years after the last piece of data has been written to it.

Let's take an imaginary product, BackNetVaultExec Pro. It is a typical enterprise backup product which utilises a media server and has agents in all the servers with data to be backed up.

The servers run different operating systems and exist in both physical and virtual forms. The data is stored on a target disk array and restored from it too. The idea is to protect files and virtual machines.
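To pin down the moving parts, here is a minimal sketch in Python of the set-up just described. Every name in it – the product, the hosts, the version numbers – is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    host: str            # server being protected
    os: str              # OS the agent build must match
    version: str         # agent build, tied to the backup product release

@dataclass
class MediaServer:
    product: str         # the backup product that owns the vault format
    vault_path: str      # target disk array holding the backup vault
    agents: list[Agent]  # one agent per server with data to protect

# A typical BackNetVaultExec Pro estate: mixed OSes, physical and virtual.
estate = MediaServer(
    product="BackNetVaultExec Pro",
    vault_path="/vault/bnve",
    agents=[
        Agent("fileserver01", "Windows Server", "12.1"),
        Agent("dbserver01", "Linux", "12.1"),
        Agent("esx-vm-042", "Linux (virtual)", "12.1"),
    ],
)
```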

Let's think of the backup software as a lens through which we can see the data. It knows the format the data is stored in and understands the metadata – the data about the stored data.

In the dark

Without this lens we cannot make sense of the backed-up data. We can't see into it and pull out the bits we need to restore, such as files, folders, emails and so on, because the originals have been lost, deleted or corrupted.
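To see why the lens matters, here is a minimal sketch, assuming an entirely made-up vault layout: a single-file restore needs the product's catalogue to map a name to an offset and length inside the vault container. Without that mapping, a rival product sees only opaque bytes.

```python
# Hypothetical illustration: the vault is one big container file and the
# catalogue maps each backed-up file to (offset, length) within it. Only
# the original backup product knows this layout; these structures and
# entries are invented.

catalog = {
    "finance/ledger.xlsx": (0, 52_430),
    "mail/inbox.pst":      (52_430, 1_048_576),
}

def restore(vault_path: str, name: str) -> bytes:
    """Pull one file out of the vault -- only possible via the catalogue."""
    offset, length = catalog[name]      # the 'lens': format plus metadata
    with open(vault_path, "rb") as vault:
        vault.seek(offset)
        return vault.read(length)

# Without the catalogue, all another product sees is an undifferentiated
# byte stream: open(vault_path, "rb").read() tells you nothing useful.
```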

Let's run BackNetVaultExec Pro for a year and then decide we want to change it because, say, it has been acquired by another vendor we love to hate, or our only backup admin left us, or we are enamoured of the latest startup, or, best one, we had to do a restore and could not.

By that time it is too late: you realise that what you thought you had turns out to be rubbish, and now you have no choice.

Our hypothetical product also handles snapshots and replication badly. Let's further imagine we switch to WonderBackupPro, another hypothetical product, which remedies these ills. What happens?

We face three challenges, one big, one medium and one comparatively minor. First, our new product cannot use the backed-up files because it does not understand their formats and metadata and so cannot access their content.

Ancient vaults

A backup storage vault is stuck fast to the software that writes content to it and reads content from it. So long as that vault exists you must use the original software to access it. This has to be recognised and managed. There is, in general, no silver bullet for backup vault migration.

The mid-level difficulty is that you probably have to install the new backup software agents on all the servers needing backup.

That means you still have the burden of maintaining and updating backup agent software as the server operating system environment gets patched and updated.

Since you must maintain the original backup software for as long as access to the content in its storage vault is needed, you now face even more work in this area, not less.
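To give a feel for the extra work, here is a hedged sketch of the kind of agent audit the overlap implies. Both version matrices are invented; the point is that every host must now be checked against two of them after every OS patch cycle.

```python
# Invented version matrices: the agent build each product requires per OS.
required = {
    "BackNetVaultExec Pro": {"Windows Server": "12.1", "Linux": "12.1"},
    "WonderBackupPro":      {"Windows Server": "3.4",  "Linux": "3.4"},
}

# What is actually installed on each server (again, illustrative data).
installed = {
    "fileserver01": {"os": "Windows Server",
                     "BackNetVaultExec Pro": "12.0",  # lagging after an OS patch
                     "WonderBackupPro": "3.4"},
    "dbserver01":   {"os": "Linux",
                     "BackNetVaultExec Pro": "12.1",
                     "WonderBackupPro": "3.3"},       # lagging
}

# Every patch cycle, both columns must be re-checked on every host.
for host, info in installed.items():
    for product, versions in required.items():
        want = versions[info["os"]]
        have = info.get(product)
        if have != want:
            print(f"{host}: {product} agent {have} != required {want}")
```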

The comparatively minor issue is that you have to learn how to use the new software.

Where does this leave us? We have an even bigger backup software problem than before and it is long-lived.

Even if we switch all backups to the new product from a start date and use the old backup software only to provide read and restore access to the backup storage vault, we still have to keep the original backup software current. It is probably a mandatory condition for getting support.

Double trouble

So now we are running two different backup environments in parallel. Does this mean it is impossible or impractical to change?

Not at all, but we should appreciate that a change will take a while and that we will be running two environments for as long as the first product’s stored content is needed. That could be five years or more.

The implication is that the return on investment (ROI) of a backup software migration, and the timescale over which it occurs, have to be properly understood.
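A back-of-the-envelope example, with entirely invented figures, shows why: any saving from the cheaper new product is deferred until the old vault finally retires.

```python
# All figures invented for illustration.
old_annual_cost = 100_000   # licences, support and admin time, old product
new_annual_cost = 70_000    # the cheaper replacement
overlap_years   = 5         # retention period during which both must run

# Running both in parallel for the overlap, versus staying put.
parallel_cost = overlap_years * (old_annual_cost + new_annual_cost)
stay_put_cost = overlap_years * old_annual_cost

extra_spend = parallel_cost - stay_put_cost
print(f"Extra spend over the overlap: £{extra_spend:,}")   # £350,000

# The £30k/year saving only starts to accrue once the old vault retires,
# so the ROI horizon is the retention period plus the payback time.
```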

Let's summarise the basic software migration issues identified so far:

1. Concurrent old and new backup software agent versioning for the host operating systems involved;

2. Need to keep old backup software current during the period we need access to its stored data;

3. Need to understand how to operate the new software.

One way, in theory, to resolve these issues is to migrate the old backup data vault to the new one: you restore all of the original backed-up data and then back it up a second time with the new product. A moment's thought, however, shows this is wholly impractical.

Vast amounts of data spanning several years of operation can be involved and such a migration would require lots of minutely detailed planning.

Also, the servers you need for the task are likely to be running production applications, which makes it hard to justify effectively taking them out of commission for the exercise.
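A rough calculation, again with invented figures, shows the scale of the problem:

```python
# Illustrative figures only: a modest enterprise vault.
vault_tb       = 200                      # years of retained backups
throughput_mbs = 400                      # sustained restore rate, MB/s

vault_mb   = vault_tb * 1_000_000
restore_s  = vault_mb / throughput_mbs    # full restore of the old vault
rebackup_s = restore_s                    # then write it all again

total_days = (restore_s + rebackup_s) / 86_400
print(f"~{total_days:.0f} days of sustained I/O")   # roughly 12 days
```

And that assumes flat-out, uninterrupted throughput from servers that are meanwhile trying to run production workloads.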
