Plane or train? Tape or disk? Reg readers speak

Disk speed versus tape economy and removability


Chris Evans - Independent storage consultant

Many people feel tape no longer has a place in the enterprise data centre. This is not a belief I subscribe to; tape still has a place as part of an overall data management strategy.

A comprehensive data protection strategy will address the following scenarios:

• Data corruption
• Loss of data
• Loss of access to data

Depending on RTO (recovery time objective) and RPO (recovery point objective), it may be more appropriate to use disk solutions for short-term backups. Disk will generally be quicker than tape but more expensive. The decision on whether disk or tape is best comes down to an analysis of RTO/RPO requirements.

RTO is a measure of the acceptable elapsed time taken to recover data. It will vary based on the importance of the data being recovered – mission critical data will be recovered well ahead of development data in a disaster, for example. As data ages, the RTO typically also increases and so one strategy is to use disk for short-term backup/restore requirements, moving older backups to tape over time (otherwise known as Disk to Disk to Tape - D2D2T).
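The D2D2T policy described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation; the 30-day disk retention window is a hypothetical threshold that would in practice be derived from the RTOs agreed for each class of data.

```python
from datetime import date, timedelta

# Hypothetical policy: backups younger than the disk retention window
# stay on disk for fast restores; older backups migrate to tape (D2D2T).
DISK_RETENTION_DAYS = 30  # illustrative value - tune to your RTO analysis

def backup_tier(backup_date: date, today: date) -> str:
    """Return which tier a backup of the given date should live on."""
    age = today - backup_date
    if age <= timedelta(days=DISK_RETENTION_DAYS):
        return "disk"  # recent: short RTO, restore straight from disk
    return "tape"      # aged: longer RTO acceptable, cheaper medium
```

A real scheduler would also weight the tier choice by data criticality, since mission-critical backups may warrant a longer stay on disk regardless of age.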

As a storage medium, tape still has many advantages:

• It is relatively cheap compared to keeping disk arrays spinning and available.
• It is compact.
• It is portable.

Of course, people point to some of these strengths as weaknesses too; tapes can be lost, and many companies have received fines for breaches of regulations after tapes have gone missing. However, tape content can be encrypted, mitigating this risk. For long-term backup, tape has clear advantages in cost.

Compared to tape, disk solutions have the advantage of additional functionality, such as de-duplication. In a VTL solution, for example, backups are compressed by removing duplicate copies of data, retaining a reference to a single physical copy on disk. Although de-duplication can provide reductions in storage capacity (and therefore cost), it introduces a greater risk of data loss if a hardware failure occurs in the backup system. It also introduces potential performance issues when multiple restores are performed from the VTL at the same time.
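The de-duplication idea described above (store one physical copy of each unique block, keep references elsewhere) can be sketched with content hashing. This is a simplified illustration of the principle, not a VTL product's actual algorithm; function names and the fixed-block approach are assumptions for the example.

```python
import hashlib

def dedupe_store(blocks, store):
    """Add data blocks to a de-duplicated store.

    `store` maps a content hash to the single physical copy of a block.
    Returns the list of hash references standing in for the stream, so
    duplicate blocks consume no extra capacity - only a reference.
    """
    refs = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first physical copy
        refs.append(digest)
    return refs

def restore(store, refs):
    """Rebuild the original stream from the stored references."""
    return b"".join(store[d] for d in refs)
```

The sketch also makes the risk visible: every reference depends on the single physical copy in `store`, so losing that one block corrupts every backup that referenced it, which is why the article notes de-duplication raises the stakes of a hardware failure.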

One area where tape is typically misused is in the retention of old backups to create a data archive. Unfortunately, in many cases this isn't the best approach because:

• Tapes can become lost or damaged or lose their contents over time. An archive can’t afford to lose data. Many tape users simply retain multiple backups in the hope that this covers all data ever created. This is unlikely to be the case.

• Tape data is usually stored in the format of the backup product and so not easily searchable; at most the backup software will retain a list of files on tape but not metadata relating to the content.

• Tape's content isn’t easy to refresh. Data has to be physically copied out of the backup software to a format that can be backed up again by other backup software. A data archive therefore needs proper content management processes in place.

Generally, the more complex the data, the more unsuitable it is for long term tape archive.

Another scenario gaining in popularity is the use of a cloud storage service to provide backup facilities. At present the use cases for cloud-based backup are limited, because the time taken to restore data from the cloud will often be longer than recovery time objectives allow. I see cloud backup as being more useful for home users and SMBs than for the enterprise.

In summary, tape still has a role to play in the data centre. It is one of many tools that can be used in deploying a comprehensive data management strategy. Tape retains advantages in cost, but has issues around effective management. With companies such as Google having to rely on tape for data recovery, we can be sure that tape has a future in the data centre for many years to come.

Chris M Evans is a founding director of Langton Blue Ltd. He has over 22 years' experience in IT, mostly as an independent consultant to large organisations. Chris's blogged musings on storage and virtualisation can be found at www.thestoragearchitect.com.
