
Plane or train? Tape or disk? Reg readers speak

Disk speed versus tape economy and removability


Chris Evans - Independent storage consultant


Many people feel tape no longer has a place in the enterprise data centre. I don't subscribe to that view; tape remains a useful part of an overall data management strategy.

A comprehensive data protection strategy will address the following scenarios:

• Data corruption
• Loss of data
• Loss of access to data

Depending on recovery time objective (RTO) and recovery point objective (RPO) requirements, it may be more appropriate to use disk solutions for short-term backups. Disk will generally be quicker than tape, but more expensive. The decision on whether disk or tape is best comes down to an analysis of those RTO/RPO requirements.

RTO is a measure of the acceptable elapsed time taken to recover data. It varies with the importance of the data being recovered – in a disaster, mission-critical data will be recovered well ahead of development data, for example. As data ages, the RTO typically increases too, so one strategy is to use disk for short-term backup/restore requirements and move older backups to tape over time (known as disk-to-disk-to-tape, or D2D2T).
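The D2D2T idea above boils down to an age-based tiering policy. As a minimal sketch in Python (the function name and the 30-day disk window are illustrative assumptions, not figures from the article):

```python
from datetime import date

def backup_tier(backup_date: date, today: date, disk_window_days: int = 30) -> str:
    """Pick a storage tier under a simple D2D2T-style policy.

    Recent backups stay on disk, where restores are fast enough to meet
    a short RTO; older backups, whose RTO has relaxed, migrate to tape.
    The 30-day window is an illustrative default, not a recommendation.
    """
    age_in_days = (today - backup_date).days
    return "disk" if age_in_days <= disk_window_days else "tape"
```

In practice the window would be driven by the RTO agreed for each data class, with mission-critical data kept on disk longer than development data.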

As a storage medium, tape still has many advantages:

• It is relatively cheap compared to keeping disk arrays spinning and available.
• It is compact.
• It is portable.

Of course, people point to some of these strengths as weaknesses too: tapes can be lost, and many companies have been fined for regulatory breaches after tapes went missing. However, tape content can be encrypted, mitigating this risk. For long-term backup, tape has clear advantages in cost.

Compared to tape, disk solutions have the advantage of additional functionality, such as de-duplication. In a VTL solution, for example, backups are compressed by removing duplicate copies of data, retaining a reference to a single physical copy on disk. Although de-duplication can provide reductions in storage capacity (and therefore cost), it introduces a greater risk of data loss if a hardware failure occurs in the backup system. It also introduces potential performance issues when multiple restores are performed from the VTL at the same time.
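The "single physical copy plus references" mechanism behind de-duplication can be illustrated with a toy content-addressed store. This is only a sketch (the class and method names are my own, and real VTLs use variable-length chunking and persistent on-disk indexes rather than an in-memory dict):

```python
import hashlib

class DedupStore:
    """Toy fixed-block de-duplicating store.

    Each block of backup data is keyed by its SHA-256 digest; identical
    blocks are stored once, and each backup keeps only a list of digests.
    """

    def __init__(self, block_size: int = 4096):
        self.block_size = block_size
        self.blocks = {}    # digest -> the single physical copy of the block
        self.backups = {}   # backup name -> ordered list of block digests

    def backup(self, name: str, data: bytes) -> None:
        refs = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            digest = hashlib.sha256(block).hexdigest()
            # Store the block only if this content has never been seen.
            self.blocks.setdefault(digest, block)
            refs.append(digest)
        self.backups[name] = refs

    def restore(self, name: str) -> bytes:
        return b"".join(self.blocks[d] for d in self.backups[name])

    def physical_bytes(self) -> int:
        return sum(len(b) for b in self.blocks.values())
```

The sketch also shows the risk the article raises: every backup referencing a given digest depends on that one physical block, so a single corrupted block can break many restores at once.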

One area where tape is typically misused is in the retention of old backups to create a data archive. In many cases this isn't the best approach because:

• Tapes can be lost, damaged or degrade over time, and an archive can't afford to lose data. Many tape users simply retain multiple backups in the hope that this covers all data ever created; that is unlikely to be the case.

• Tape data is usually stored in the format of the backup product and so not easily searchable; at most the backup software will retain a list of files on tape but not metadata relating to the content.

• Tape's content isn’t easy to refresh. Data has to be physically copied out of the backup software to a format that can be backed up again by other backup software. A data archive therefore needs proper content management processes in place.

Generally, the more complex the data, the more unsuitable it is for long term tape archive.

Another scenario gaining in popularity is the use of a cloud storage service to provide backup facilities. At present the use cases for cloud-based backup are limited, because the time taken to initiate and complete a restore from the cloud will often be longer than recovery time objectives allow. I see cloud backup as more useful for home users and SMBs than for the enterprise.

In summary, tape still has a role to play in the data centre. It is one of many tools that can be used in deploying a comprehensive data management strategy. Tape retains advantages in cost, but has issues around effective management. With companies such as Google having to rely on tape for data recovery, we can be sure that tape has a future in the data centre for many years to come.

Chris M Evans is a founding director of Langton Blue Ltd. He has over 22 years' experience in IT, mostly as an independent consultant to large organisations. Chris's blogged musings on storage and virtualisation can be found at www.thestoragearchitect.com.
