
Mapping the universe at 30 Terabytes a night

Jeff Kantor, on building and managing a 150 Petabyte database

Interview It makes for one heck of a project mission statement. Explore the nature of dark matter, chart the Solar System in exhaustive detail, discover and analyze rare objects such as neutron stars and black hole binaries, and map out the structure of the Galaxy.

The Large Synoptic Survey Telescope (LSST) is, in the words of Jeff Kantor, LSST data management project manager, "a proposed ground-based 6.7 meter effective diameter (8.4 meter primary mirror), 10 square-degree-field telescope that will provide digital imaging of faint astronomical objects across the entire sky, night after night." Phew.

When it's fully operational in 2016, the LSST will: "Open a movie-like window on objects that change or move on rapid timescales: exploding supernovae, potentially hazardous near-Earth asteroids, and distant Kuiper Belt Objects.

"The superb images from the LSST will also be used to trace billions of remote galaxies and measure the distortions in their shapes produced by lumps of Dark Matter, providing multiple tests of the mysterious Dark Energy."

In its planned 10-year run, the LSST will capture, process and store more than 30 Terabytes (TB) of image data each night, yielding a 150 Petabyte (PB) database. Talking to The Reg, Kantor called this the largest non-proprietary dataset in the world.
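Those figures hang together on a quick back-of-the-envelope check (our arithmetic, not Kantor's): 30 TB a night over a ten-year survey comes to roughly 110 PB of raw pixels, with processed data products plausibly making up the rest of the 150 PB total.

```python
# Back-of-the-envelope check of the LSST data volumes quoted above.
# All figures are decimal (1 TB = 1e12 bytes); the 3,650-night count
# assumes observing every night, which is an idealisation.

TB = 10**12
PB = 10**15

nightly_bytes = 30 * TB            # "more than 30 Terabytes ... each night"
nights = 365 * 10                  # ten-year survey, idealised

raw_total_pb = nightly_bytes * nights / PB
print(f"Raw image data over 10 years: ~{raw_total_pb:.0f} PB")
# The quoted 150 PB database would then also cover processed products
# (catalogues, difference images, annual re-releases) on top of raw pixels.
```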

Data management is one of the most challenging aspects of the LSST. Every pair of 6.4GB images must be processed within 60 seconds to provide astronomical transient alerts to the community. To meet that deadline, the Data Management System is built from a number of key elements. These are:

  • the Mountain/Base facility, which does initial data reduction and alert generation on a 25 TFLOPS Linux cluster with 60 PB of storage (in year 10 of the survey)
  • a 2.5 Gbps network that transfers the data from Chile (where the telescope itself will be based) to the US, and within the US
  • the Archive Center, which re-reduces the data and produces annual data releases on a 250 TFLOPS Linux cluster with 60 PB of storage (in year 10 of the survey)
  • the Data Access Centers, which provide access to all of the data products, as well as 45 TFLOPS and 12 PB of end-user computing and storage.
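It is worth pausing on what those numbers imply. Taking a "pair" to mean two 6.4GB exposures, the 60-second deadline sets a sustained processing rate north of 200 MB/s, and a night's 30 TB haul keeps the 2.5 Gbps Chile-to-US link busy around the clock. The arithmetic below is our own budgeting, not figures from the LSST team:

```python
# Rough throughput budget implied by the figures above (our arithmetic,
# not the LSST team's). Two 6.4 GB exposures every 60 seconds set the
# processing rate; 30 TB/night sets the load on the 2.5 Gbps link.

GB = 10**9
TB = 10**12

# Alert pipeline: a pair of 6.4 GB images must clear the pipeline in 60 s.
pair_bytes = 2 * 6.4 * GB
processing_rate_mbs = pair_bytes / 60 / 10**6
print(f"Sustained processing rate: ~{processing_rate_mbs:.0f} MB/s")

# Chile-to-US link: how long does a night's data take at 2.5 Gbps?
night_bits = 30 * TB * 8
link_bps = 2.5 * 10**9
transfer_hours = night_bits / link_bps / 3600
print(f"Nightly transfer at 2.5 Gbps: ~{transfer_hours:.1f} hours")
# Nearly 27 hours of transfer per night of observing implies the link
# runs essentially flat out, with compression and prioritisation
# (alerts first, bulk pixels later) presumably taking up the slack.
```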

So what's a time-critical system of this magnitude written in?

The data reduction pipelines are developed in C++ and Python. They rely on approximately 30 off-the-shelf middleware packages/libraries for parallel processing, data persistence and retrieval, data transfer, visualization, operations management and control, and security. The current design is based on MySQL layered on a parallel, fault-tolerant file system.
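The pipelines themselves are not public, but the basic shape of a difference-imaging transient detector is easy to sketch. The function below is a hypothetical, stdlib-only illustration of the idea (the real pipelines use C++ and the middleware stack above, with proper astrometric alignment and photometric calibration): subtract a template image from a new exposure and flag any pixel whose brightness change exceeds a threshold.

```python
# Hypothetical sketch of a difference-imaging alert stage, the kind of
# step an LSST-style pipeline performs under its 60-second deadline.
# Images are modelled as flat lists of pixel values; real code operates
# on calibrated, astrometrically aligned CCD frames.

def detect_transients(template, new_exposure, threshold):
    """Return (pixel_index, delta) for every pixel whose brightness
    change between template and new exposure exceeds the threshold."""
    alerts = []
    for i, (old, new) in enumerate(zip(template, new_exposure)):
        delta = new - old
        if abs(delta) > threshold:
            alerts.append((i, delta))
    return alerts

# Toy example: one pixel brightens sharply between exposures.
template = [10, 12, 11, 10, 13]
new_exposure = [10, 12, 95, 10, 13]   # pixel 2 has flared
print(detect_transients(template, new_exposure, threshold=50))
# -> [(2, 84)]
```

In production, each flagged pixel cluster would be measured, cross-matched against the catalogue, and pushed out as a transient alert within the 60-second window.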
