Shrinking primary databases
Clear out redundant data to free up space
Stripping old records out of primary databases can pay off handsomely in reclaimed disk capacity, faster database operations and quicker backups. Clearpace has technology to do this and to archive the extracted records in de-duplicated form on cheap SATA arrays.
UK-based HP reseller 2e2 will resell and use Clearpace's NParchive product to take rarely accessed records out of Oracle and similar databases and store them in de-duplicated form in a separate archive. If you have a 500GB Oracle database it can strip out, say, 200GB of data that hasn't been accessed for weeks, de-duplicate it and store it in compressed form in a SATA disk archive. There it occupies 10-50GB of space and is still accessible at disk speed - obviously not as fast as the original raw data but you get it pretty quickly.
The primary database is now 300GB; its processes run faster and its DR (disaster recovery) and development copies are smaller. Count those copies too and you're saving up to 600GB of primary capacity.
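The arithmetic behind those figures can be laid out in a few lines. This is just the article's worked example, using its numbers (a 500GB database, 200GB of cold data, one DR copy and one dev copy), not anything measured with the product:

```python
# Hypothetical figures from the example above.
primary_gb = 500   # original Oracle database
cold_gb = 200      # rarely accessed data stripped out
copies = 3         # primary + DR copy + development copy

shrunk_primary_gb = primary_gb - cold_gb  # database left after archiving
primary_saved_gb = cold_gb * copies       # capacity reclaimed across all copies

# The archive lands at 10-50GB, i.e. a 4x-20x reduction of the cold data.
archive_low_gb = cold_gb / 20
archive_high_gb = cold_gb / 4

print(shrunk_primary_gb, primary_saved_gb)      # 300 600
print(archive_low_gb, archive_high_gb)          # 10.0 50.0
```

The "multiplier" is simply that every gigabyte removed from the primary database is also removed from each downstream copy of it.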
Clearpace's NParchive stores data as either unique original data or pointers to it, in a tree-like pattern to reflect the original records. It can be searched with SQL from common tools offered by Business Objects, Crystal Reports, COGNOS and others. It can ingest data from different primary databases - SQL Server, Oracle, DB2 or whatever - and store them in a tamper-proof way.
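NParchive's on-disk format isn't public, but the "unique original data or pointers to it" idea is standard content-addressed de-duplication. A minimal sketch, with invented helper names, might look like this: each distinct value is stored once under its hash, and archived records become collections of pointers into that store.

```python
import hashlib

store = {}  # hash -> unique value, each stored exactly once

def put(value):
    """Store a value once; return a pointer (its content hash)."""
    key = hashlib.sha256(repr(value).encode()).hexdigest()
    store.setdefault(key, value)
    return key

def archive_record(record):
    """Replace each field value with a pointer into the store."""
    return {field: put(value) for field, value in record.items()}

def restore_record(archived):
    """Follow the pointers to rebuild the original record."""
    return {field: store[key] for field, key in archived.items()}

r1 = archive_record({"city": "London", "status": "closed"})
r2 = archive_record({"city": "London", "status": "open"})

# "London" is held once; both records point at the same stored value.
assert r1["city"] == r2["city"]
assert len(store) == 3  # "London", "closed", "open"
```

A real archive would hang these pointers off a tree that mirrors the original record structure, and layer a SQL interface over it so reporting tools can query it directly.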
ESG boss Steve Duplessie blogged on deduplication recently. He talked about data in primary storage that's rarely accessed and doesn't change, saying: "This is the stage where we next want to apply a massive reduction in the copies of data we have. It's still 'primary' storage, but by applying de-dupe here we can probably chop 50 per cent or more of our overall capacity off at the knees.
"The next step is to figure out how to slide the de-dupe lever closer to the point of creation, and the biggest value point is going to be [with this data]."
Clearpace has technology to vacuum up this data from primary databases, shrinking and speeding them, and stuffing it in de-duplicated and compressed form, perhaps twenty times smaller - possibly more - into a single searchable disk-based archive. It's not de-duplicating the structured database information in situ but it is reducing primary databases in size and preserving old data in a much-reduced form. The de-dupe lever has been slid closer to the creation point. That sounds pretty good. ®
Clearpace has teamed up with HP to produce a free whitepaper on this subject, called The Six R's of Application Archiving. You can download it from Reg Whitepapers.