Mapping the universe at 30 Terabytes a night

Jeff Kantor, on building and managing a 150 Petabyte database

Kantor added: "We are also prototyping with other open-source and proprietary databases, as well as with a MapReduce-based approach similar to that in use at Google. We are also participating in a startup venture to create a new database engine specifically oriented at large-scale databases, especially those that contain scientific and image data."
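
For readers unfamiliar with the paradigm, here is a minimal MapReduce sketch in Python (an illustration of the idea only, not LSST's pipeline): it counts detections per object class, the kind of aggregation a survey catalog job might distribute across a cluster.

```python
# Generic MapReduce sketch -- illustrative only, not LSST code.
# Counts detections per object class from a toy catalog.
from collections import defaultdict
from functools import reduce

detections = [
    {"object_class": "supernova", "mag": 21.3},
    {"object_class": "asteroid",  "mag": 19.8},
    {"object_class": "supernova", "mag": 22.1},
]

# Map: emit a (key, 1) pair for every record.
mapped = [(d["object_class"], 1) for d in detections]

# Shuffle: group values by key (the framework does this at scale).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: fold each key's values down to a single count.
counts = {k: reduce(lambda a, b: a + b, v) for k, v in groups.items()}
print(counts)  # {'supernova': 2, 'asteroid': 1}
```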

The data will be available in formats compliant with the Virtual Observatory standards, as FITS images, and as RGB images (or something equivalent).
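
For the curious, handling a FITS image takes only a few lines. Here is a minimal sketch using the open source astropy library (our choice of tool; the article names none, and the filename is hypothetical), stretching the pixel data to an 8-bit greyscale preview of the sort an RGB rendering would start from.

```python
# Minimal sketch: open a FITS image and stretch it to 8-bit greyscale.
# astropy is our assumption; 'observation.fits' is a hypothetical file.
import numpy as np
from astropy.io import fits

with fits.open("observation.fits") as hdul:
    hdul.info()                        # list the HDUs in the file
    data = hdul[0].data.astype(float)  # primary HDU pixel data

# Linear stretch between the 1st and 99th percentiles for display.
lo, hi = np.percentile(data, [1, 99])
preview = np.clip((data - lo) / (hi - lo), 0, 1) * 255
preview = preview.astype(np.uint8)
```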

Providing 30TB of data a day, to each and every potential user, sounds about as easy and practical as juggling elephants one-handed.

Kantor explained: "At 1Gbps, 30TB would take 67 hours to download (without overhead). That is why the Data Access Centers exist, so users can access the data and analyze it without downloading large subsets. Rather than move the data to the processing code, we permit you to process the data nearby."
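
Kantor's figure checks out; a quick back-of-the-envelope calculation:

```python
# Sanity check of Kantor's figure: 30 TB over a 1 Gbps link,
# ignoring protocol overhead.
terabytes = 30
bits = terabytes * 1e12 * 8   # 30 TB in bits (decimal terabytes)
link_bps = 1e9                # 1 Gbps
hours = bits / link_bps / 3600
print(f"{hours:.0f} hours")   # ~67 hours
```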

One wonders how an automated system could be written to discover previously unknown classes of rare objects, a goal written into the telescope's mission statement.

How do you program clairvoyance into a data analysis system? Kantor: "There are quite a few researchers pursuing the line that one can analyze large datasets statistically, and uncover outliers and anomalies of interest. This is very much a research topic and one that several LSST partners are pursuing.

"In addition, we are designing the software with the ability to extend it to new algorithms and data types easily. There is a tradeoff between flexibility and performance and we walk that line every day in the design."

Talking of design, agile process aficionados out there will be interested to hear that Kantor and his team are using the minimalist, UML-based ICONIX Process (a subject close to this writer's own heart) for their system and software requirements and design. The teams are geographically dispersed, so the LSST models are shared using Sparx Systems' Enterprise Architect (EA) version control integration capabilities. Individual packages are added to a central version control repository and these packages are then shared by several local EA project files.

Kantor adds: "For code, our development environment is based on the open source Trac tool integrated with Subversion for version control. This provides a source repository and browser, a ticket system, and a documentation wiki."

Measuring the success or failure of a project as massive and wide-ranging as the LSST, which will run over such a long period of time, could prove difficult. How would you know that the project is providing value for money and useful information?

Kantor agreed: "That is always a tricky question for 'big science' projects. Typically it is measured in terms of professional papers created from the survey. Additional metrics have to do with educational impact and public impact.

"The Hubble Space Telescope gave us pictures of distant objects in the Universe that changed most people's perception of and interest in astronomy. LSST has the same potential. How do you measure that?"®
