
Digital storage and archiving = digital decay?

The quantity challenge


Commercial enterprises invariably store information and records, which may be required to meet a range of external demands, primarily legal and regulatory. The rationale is that huge quantities of data and records can be stored and retained on a cost-effective, efficient basis and accessed in a timely fashion. There is a similar trend towards the same approach for academic, historical, heritage and, indeed, personal records. The act of capturing data for storage and retention goes under the designation of "archiving."

What do digital storage and archiving each involve, and how do they differ?

Digital storage

  • Digital storage focuses on addressing "the quantity challenge": making the process cost-efficient; storing as large a quantity of data as possible in as small a space as possible; and making all data easily accessible. In summary, data is piled high in relatively simplistic form, for access in as simple and cost-effective a manner as is technically feasible.
  • The methods currently deployed for secure storage of comprehensible data are highly labour intensive.
  • Digital storage lacks an automated basis for determining the data's value, both currently and in the future. The approach therefore lacks any sense of human evaluation and judgement.
  • There is no long-term business model for digital storage. Like all facets of technology, storage has built-in redundancy and obsolescence factors. Digital storage technology carries maintenance costs, which increase as the technology ages. New technology makes contemporary storage techniques obsolete, and ultimately there is a compulsion to migrate to the next generation of storage technology.
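The obsolescence point above can be made concrete with a back-of-envelope model. Every figure here (prices, the maintenance growth rate, the five-year migration cycle) is an illustrative assumption, not data from the article:

```python
# Illustrative model: cumulative cost of keeping data on a storage
# technology whose maintenance cost rises as the hardware ages, with a
# forced migration to new hardware every few years. All numbers are
# assumptions for the sketch.

def lifetime_cost(tb_stored, years, cost_per_tb_year=20.0,
                  maintenance_growth=0.15, migration_interval=5,
                  migration_cost_per_tb=10.0):
    """Estimate cumulative storage cost in arbitrary currency units.

    cost_per_tb_year      -- baseline yearly cost per terabyte (assumed)
    maintenance_growth    -- yearly rise in maintenance as the kit ages
    migration_interval    -- years before migration to new technology
    migration_cost_per_tb -- one-off cost of each migration (assumed)
    """
    total = 0.0
    age = 0  # age of the current storage generation
    for _ in range(years):
        # maintenance rises the longer this generation stays in service
        total += tb_stored * cost_per_tb_year * (1 + maintenance_growth) ** age
        age += 1
        if age == migration_interval:
            total += tb_stored * migration_cost_per_tb  # forced migration
            age = 0  # new generation resets the maintenance clock
    return total

# Maintenance growth within each generation, plus migration charges,
# push the total well above the flat baseline of tb * rate * years.
print(round(lifetime_cost(100, 10)), round(lifetime_cost(100, 20)))
```

Varying `migration_interval` shows the trade-off the article alludes to: migrate often and pay the migration charge repeatedly, or migrate rarely and pay ever-rising maintenance on ageing kit.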

Archiving

The creation of archives involves a number of conscious acts: the initial choice to archive in the first instance, supported by some rationale; the selection of material; and the organisation of the form and presentation of the material.

  • It follows that archiving demands trained and experienced individuals to manage and control the archiving process, which extends beyond retention and maintenance to pruning, reduction and, in some instances, the ultimate destruction of archives.
  • The skills and resources, which have historically been channelled into archiving, have endowed archives with integrity, usability and even immortality.
  • Archiving has historically been conducted using one permanent medium to maintain material for as long as required, normally paper or some other form of visual material.
  • The digital age is producing data at a rate and volume that traditional archiving techniques, with their reliance on human intervention and judgement, are unable to address. Moreover, deploying those conventional human techniques at this scale would incur huge costs.

Until an automated mechanism exists for determining what data means and what its long-term value is, huge quantities of data will continue to be collected with less than cursory assessment of their value, present or future. Traditional archiving techniques are falling into disuse, but are not being replaced; they are merely being substituted by storage.
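By way of illustration only, the missing "automated mechanism" might, in its crudest form, be a scoring heuristic that triages records. Every signal, weight and threshold below is invented for this sketch; real archival appraisal is far richer than any such formula:

```python
# Hypothetical sketch of automated appraisal: score each record on a few
# crude signals and triage it into archive / store / discard. The signals
# and weights are invented for illustration, not an archival standard.

from datetime import date

def appraise(record, today=date(2015, 1, 1)):
    """Return a triage decision for a record described as a dict with
    'last_accessed' (date), 'access_count' (int), 'duplicates' (int)
    and 'regulatory_hold' (bool)."""
    if record["regulatory_hold"]:
        return "archive"          # legal/regulatory demand overrides scoring
    age_years = (today - record["last_accessed"]).days / 365.25
    score = record["access_count"] - 2 * age_years - 5 * record["duplicates"]
    if score > 10:
        return "archive"          # actively used and unique: worth curating
    if score > 0:
        return "store"            # keep cheaply, with no curation effort
    return "discard"              # stale, duplicated, never consulted

print(appraise({"last_accessed": date(2014, 6, 1), "access_count": 40,
                "duplicates": 0, "regulatory_hold": False}))  # prints "archive"
```

Even a toy like this makes the article's point: the hard part is not the code but deciding, and defending, what the signals and thresholds should be, which is precisely the human judgement archiving has always required.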

© IT-Analysis.com
