Genetic researchers fill 1TB a week
A nice little earner for their disk array supplier...
Fresh from its recent sinister triumph, the Wellcome Trust says that its research into genetic diseases is generating such huge data volumes that it has had to buy an extra 42TB of SATA disk arrays, 30TB of which are already full.
Scientific analysis of genetic material generates output files of around 50MB, the trust said, and these are kept on-line so they can be accessed by its partners in other research groups around the world.
It added that the data yield from its Wellcome Trust Centre for Human Genetics (WTCHG) has risen from 20GB a day a couple of years ago to between 200GB and 300GB a day now. At that rate, the remaining 12TB will fill within two months, and more arrays will be needed.
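The fill-time arithmetic works out as below — a quick sketch using the article's figures of 12TB remaining and 200-300GB a day; the decimal TB-to-GB conversion (1TB = 1000GB) is an assumption:

```python
# Days until the remaining array capacity fills at the stated daily rates.
# 12 TB remaining and 200-300 GB/day are the article's figures; decimal
# units (1 TB = 1000 GB) are assumed.

def days_to_fill(remaining_tb, daily_gb):
    """Days to exhaust remaining_tb of space at daily_gb per day."""
    return remaining_tb * 1000 / daily_gb

fast = days_to_fill(12, 300)   # 40 days at the high rate
slow = days_to_fill(12, 200)   # 60 days at the low rate
print(f"fills in {fast:.0f}-{slow:.0f} days")
```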
Based at Oxford University, WTCHG is part of a world-wide collaborative programme which is researching the genetic causes of diabetes, obesity and other common ailments. Its own compute resources include a 120-node Linux cluster and 25 core servers, plus a Fibre Channel SAN.
It also has a server that now hosts four 21TB Nexsan SATABeast arrays, mirrored for a total of 42TB, said Dr Tim Bardsley, WTCHG's IT manager. It manages its storage using DataCore's SANmelody software, which allows users to access data via iSCSI and Fibre Channel.
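The capacity figures are consistent: four 21TB arrays give 84TB raw, and mirroring halves that to the 42TB quoted. A trivial check:

```python
# Usable capacity after mirroring: four 21TB SATABeast arrays (the
# article's figures) give 84TB raw; mirroring halves that to 42TB usable.
ARRAYS, TB_EACH = 4, 21
raw_tb = ARRAYS * TB_EACH      # 84 TB raw
usable_tb = raw_tb // 2        # 42 TB after mirroring
print(usable_tb)               # 42
```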
Along with low cost and power consumption, a prime factor in choosing the Nexsan storage was reliability, Bardsley said. "We have research programmes that have been running now for three years or more," he added, "and you cannot put a price on how valuable that data is."®
RE: data storage
If the 200GB of data a day was generated at a constant rate, they'd need around a 20Mbit/s connection to back it up to a remotely hosted SAN as it arrives. If it was just within working hours, you'd be looking at more like 55Mbit/s
It doesn't seem that unviable... bearing in mind that "Big Science"(tm) has pretty deep pockets.
I'd guess that they'd have an alternative site for something like this - how would you explain to the big boss how you'd lost *everything* in the event of a major incident?
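The bandwidth estimate in the comment above can be sketched as follows — assuming 200GB a day (the article's lower figure) and an eight-hour working window, which is an assumption, not from the article:

```python
# Sustained link speed needed to ship the daily output off-site.
# 200 GB/day is the article's lower figure; the 8-hour "working hours"
# window is an assumption.

def mbit_per_sec(bytes_total, seconds):
    """Sustained link speed (Mbit/s) to move bytes_total in seconds."""
    return bytes_total * 8 / seconds / 1e6

daily_bytes = 200e9

round_the_clock = mbit_per_sec(daily_bytes, 24 * 3600)  # ~18.5 Mbit/s
working_hours = mbit_per_sec(daily_bytes, 8 * 3600)     # ~55.6 Mbit/s

print(f"24h/day: {round_the_clock:.1f} Mbit/s")
print(f"8h/day:  {working_hours:.1f} Mbit/s")
```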
RE: data storage
Just thinking, it would be interesting to know if off-site backups were being made and how quickly they could be brought online. (Another whole server farm, perhaps?)
Some of these comments are excellent examples
of "consumer" coding fanboys. Big science doesn't use the simple tools code-droids are used to. There is another world of computing outside "Windows and *nix" that most people will never be exposed to.
Big science uses IT as a tool, it is not the end-all-be-all "world" for that industry. Big science is one of the few industries that hasn't fallen completely for the smokescreen that is IT.