Mapping the universe at 30 Terabytes a night

Jeff Kantor on building and managing a 150-petabyte database

Kantor added: "We are also prototyping with other open-source and proprietary databases, as well as with a MapReduce-based approach similar to that in use at Google. We are also participating in a startup venture to create a new database engine specifically oriented at large-scale databases, especially those that contain scientific and image data."
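
For the uninitiated, the MapReduce pattern Kantor mentions splits a job into a map phase that emits key/value pairs, a shuffle that groups those pairs by key, and a reduce phase that aggregates each group. The toy, in-memory Python sketch below shows the shape of the idea; the detection records and class names are this writer's inventions, not anything from the LSST schema.

    from collections import defaultdict

    # Mock detection records: (object_class, magnitude) pairs - invented data.
    records = [("galaxy", 21.3), ("star", 14.2), ("galaxy", 19.8),
               ("asteroid", 17.5), ("star", 15.1)]

    # Map phase: emit a (key, value) pair per record.
    def mapper(record):
        obj_class, mag = record
        yield obj_class, mag

    # Shuffle phase: group emitted values by key.
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)

    # Reduce phase: aggregate each group (here, a mean magnitude per class).
    def reducer(key, values):
        return key, sum(values) / len(values)

    results = dict(reducer(k, v) for k, v in groups.items())
    print(results)  # {'galaxy': 20.55, 'star': 14.65, 'asteroid': 17.5}

The point of the pattern is that the map and reduce phases are trivially parallel, which is what makes it attractive for survey-scale data volumes.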

The data will be available in formats compliant with the Virtual Observatory standards, as FITS images, and as RGB images (or something equivalent).
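
FITS (the Flexible Image Transport System) is astronomy's long-standing container format for image data plus metadata. By way of illustration only, here's a minimal Python sketch of reading such a file with the widely used astropy library; the filename is hypothetical and nothing here is drawn from LSST's own tooling.

    from astropy.io import fits

    # Open a FITS file and inspect its header/data units (HDUs).
    with fits.open("example_exposure.fits") as hdul:
        hdul.info()                    # summary of the HDUs in the file
        header = hdul[0].header        # primary header: observation metadata
        image = hdul[0].data           # primary image, if this HDU carries data
        print(header.get("DATE-OBS"))  # standard observation-date keyword, if set
        if image is not None:
            print(image.shape, image.dtype)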

Providing 30TB of data a day to each and every potential user sounds about as easy and practical as juggling elephants one-handed.

Kantor explained: "At 1Gbps, 30TB would take 67 hours to download (without overhead). That is why the Data Access Centers exist, so users can access the data and analyze it without downloading large subsets. Rather than move the data to the processing code, we permit you to process the data nearby."
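
The arithmetic checks out: 30TB is 2.4 × 10^14 bits, and at 10^9 bits per second that comes to roughly 240,000 seconds, or a shade under 67 hours. For the sceptical, a quick Python check:

    # Back-of-the-envelope check of Kantor's figure: 30 TB over a 1 Gbps link.
    terabytes = 30
    bits = terabytes * 1e12 * 8   # decimal terabytes -> bits
    link_bps = 1e9                # 1 Gbps, ignoring protocol overhead
    hours = bits / link_bps / 3600
    print(f"{hours:.1f} hours")   # ~66.7 hours, i.e. roughly 67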

One wonders how an automated system could be written to discover previously unknown classes of rare objects, a goal that forms part of the telescope's mission statement.

How do you program clairvoyance into a data analysis system? Kantor: "There are quite a few researchers pursuing the line that one can analyze large datasets statistically, and uncover outliers and anomalies of interest. This is very much a research topic and one that several LSST partners are pursuing.

"In addition, we are designing the software with the ability to extend it to new algorithms and data types easily. There is a tradeoff between flexibility and performance and we walk that line every day in the design."

Talking of design, agile process aficionados out there will be interested to hear that Kantor and his team are using the minimalist, UML-based ICONIX Process (a subject close to this writer's own heart) for their system and software requirements and design. The teams are geographically dispersed, so the LSST models are shared using Sparx Systems' Enterprise Architect (EA) version control integration capabilities. Individual packages are added to a central version control repository and these packages are then shared by several local EA project files.

Kantor adds: "For code, our development environment is based on the open-source Trac tool integrated with Subversion for version control. This provides a source repository and browser, a ticket system, and a documentation wiki."

Measuring the success or failure of a project as massive and wide-ranging as the LSST, which will run over such a long period of time, could prove difficult. How would you know whether the project is providing value for money and useful information?

Kantor agreed: "That is always a tricky question for 'big science' projects. Typically it is measured in terms of professional papers created from the survey. Additional metrics have to do with educational impact and public impact.

"The Hubble Space Telescope gave us pictures of distant objects in the Universe that changed most people's perception of and interest in astronomy. LSST has the same potential. How do you measure that?"®
