IBM's monster tape will take three days to fill

35TB cartridge poses whole new set of problems

IBM Research, working with FujiFilm, has devised technology for a 35TB-capacity tape cartridge, but writing that much data at LTO5 speeds would take almost three days.

The new hyper-capacity half-inch tape technology has been successfully read and written at an areal density of 29.5bn bits/sq in, which the researchers say translates to a tape capacity of 35TB. That is some 44 times the 800GB raw capacity of LTO4 tape. From a technological point of view the gee-whiz factor is impressive.

The medium is FujiFilm's Nanocubic tape, with an ultra-fine, perpendicularly-oriented barium-ferrite magnetic layer that apparently does not require expensive metal sputtering or evaporation coating methods. IBM has developed new servo control technologies enabling a 25X increase in the number of parallel tracks on half-inch tape, with a track width of less than 0.45 micrometers.

There is an ultra-narrow 0.2um data reader head and a data read channel based on a data-dependent noise-predictive, maximum-likelihood (DD-NPML) detection scheme developed at IBM Research in Zurich. IBM Research at Almaden developed a reduced-friction head assembly allowing the use of smoother magnetic tapes and an advanced GMR (Giant Magneto-Resistive) head module incorporating optimised servo readers.

The areal density can be increased further, to the 100bn bits/sq in level, according to the IBM researchers. However, one issue that IBM and FujiFilm do not discuss is the time needed to read or write 35TB of tape data. At LTO5's transfer speed of 140MB/sec it would take 69.44 hours (2.89 days) to write the full 35TB. To write 35TB in the same time that LTO5 takes to write its 1.5TB of raw data, which is 2.98 hours, the tape speed would have to increase 23.33 times, and that assumes the read/write heads could process the signals passing to and from the tape that quickly.
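The transfer-time arithmetic above is easy to reproduce. This sketch assumes LTO5's 140MB/sec native rate and decimal units (1TB = 1,000,000MB), matching the article's figures; the function name is ours, not IBM's:

```python
# Back-of-envelope transfer-time check for the figures quoted above.
# Assumes LTO5's 140 MB/s native transfer rate and decimal units
# (1 TB = 1,000,000 MB), as the article's arithmetic does.

def write_time_hours(capacity_tb, rate_mb_per_s=140):
    """Hours to stream capacity_tb terabytes to tape at rate_mb_per_s."""
    seconds = capacity_tb * 1_000_000 / rate_mb_per_s
    return seconds / 3600

full_cartridge = write_time_hours(35)    # ~69.44 hours, i.e. ~2.89 days
lto5_cartridge = write_time_hours(1.5)   # ~2.98 hours for an LTO5 fill
speedup_needed = 35 / 1.5                # ~23.33x faster to match LTO5's fill time
```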

Accelerating the tape 23.33X would also increase the risk of tape deformation or breakage and require more electricity for the drive. It seems likely that either multiple-head tape drives, or a large increase in the number of tracks readable by a single head, would need to be developed to cut tape read/write times down to more practicable levels. A back-of-the-envelope calculation suggests a 4-head drive, or a drive reading four times as many tracks, would cut the 35TB read/write time to 17.36 hours. Another possibility would be to stripe the data across two or more tape drives. A 4-drive setup using such heads would deal with 35TB in 4.34 hours, and that starts looking reasonable.
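The scaling behind those back-of-envelope figures can be sketched the same way; the head and drive counts here are illustrative assumptions, not announced products:

```python
# How parallelism divides the 69.44-hour single-head baseline, per the
# article's back-of-envelope calculation. Head/drive counts are
# hypothetical, assuming work divides evenly across them.

BASELINE_HOURS = 35 * 1_000_000 / 140 / 3600  # 35TB at 140 MB/s, ~69.44 h

def striped_time(heads_per_drive, drives=1):
    """Hours to read/write 35TB with data striped across parallel heads/drives."""
    return BASELINE_HOURS / (heads_per_drive * drives)

four_head_drive = striped_time(4)              # ~17.36 hours
four_drive_stripe = striped_time(4, drives=4)  # ~4.34 hours
```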

Such striping across multi-headed drives implies a tape library using 35TB cartridges would need more drives and more robotic capability to move cartridges between slots and drives, such that, for example, four cartridges could be delivered to four drives simultaneously. If tape libraries are forecast to sustain their usability because tape storage economics are going to outstrip those of disk for many more years, then changes to allow tape cartridge striping, multi-headed drives, and multiple simultaneous cartridge loading into drives look necessary. ®
