Mapping the universe at 30 Terabytes a night

Jeff Kantor, on building and managing a 150 Petabyte database

Interview It makes for one heck of a project mission statement. Explore the nature of dark matter, chart the Solar System in exhaustive detail, discover and analyze rare objects such as neutron stars and black hole binaries, and map out the structure of the Galaxy.

The Large Synoptic Survey Telescope (LSST) is, in the words of Jeff Kantor, LSST data management project manager, "a proposed ground-based 6.7 meter effective diameter (8.4 meter primary mirror), 10 square-degree-field telescope that will provide digital imaging of faint astronomical objects across the entire sky, night after night." Phew.

When it's fully operational in 2016, the LSST will: "Open a movie-like window on objects that change or move on rapid timescales: exploding supernovae, potentially hazardous near-Earth asteroids, and distant Kuiper Belt Objects.

"The superb images from the LSST will also be used to trace billions of remote galaxies and measure the distortions in their shapes produced by lumps of Dark Matter, providing multiple tests of the mysterious Dark Energy."

In its planned 10-year run, the LSST will capture, process and store more than 30 Terabytes (TB) of image data each night, yielding a 150 Petabyte (PB) database. Talking to The Reg, Kantor called this the largest non-proprietary dataset in the world.
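
A quick back-of-the-envelope check (our sums, not LSST's): 30TB a night of raw images doesn't get you to 150PB on its own, so the headline figure presumably also covers the processed catalogues and annual data releases. In Python, with an assumed observing cadence:

    # Rough arithmetic on the quoted figures -- a reader's sketch, not an LSST budget
    tb_per_night = 30          # quoted raw image volume per night
    nights_per_year = 300      # assumed usable observing nights; the real cadence will vary
    years = 10                 # planned survey length

    raw_pb = tb_per_night * nights_per_year * years / 1000.0
    print(f"Raw images alone: ~{raw_pb:.0f} PB")                  # ~90 PB
    print(f"Implied processed products: ~{150 - raw_pb:.0f} PB")  # remainder of the 150 PB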

Data management is one of the most challenging aspects of the LSST. Every pair of 6.4GB images must be processed within 60 seconds to provide astronomical transient alerts to the community (see the sketch after this list for a feel of that deadline). To meet this, the Data Management System is composed of a number of key elements. These are:

  • the Mountain/Base facility, which does initial data reduction and alert generation on a 25 TFLOPS Linux cluster with 60PB of storage (in year 10 of the survey)
  • a 2.5 Gbps network that transfers the data from Chile (where the telescope itself will be based) to the US and within the US
  • the Archive Center, which re-reduces the data and produces annual data releases on a 250 TFLOPS Linux cluster with 60PB of storage (in year 10 of the survey)
  • the Data Access Centers, which provide access to all of the data products, as well as 45 TFLOPS and 12PB of end-user computing and storage
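
To get a feel for that 60-second constraint, here's a minimal sketch (our illustration, not LSST code) of a per-exposure processing loop run against a hard wall-clock deadline. The stage functions are hypothetical stand-ins for the real calibration, image differencing, transient detection and alert distribution steps:

    import time

    ALERT_BUDGET_S = 60.0   # the DMS requirement: one 6.4GB image pair in 60 seconds

    def process_image_pair(pair, stages):
        """Run each pipeline stage in turn, failing loudly if the wall-clock
        budget is exceeded. 'stages' is a list of callables -- hypothetical
        stand-ins for the real C++/Python pipeline components."""
        deadline = time.monotonic() + ALERT_BUDGET_S
        data = pair
        for stage in stages:
            data = stage(data)
            if time.monotonic() > deadline:
                raise TimeoutError(f"{stage.__name__} blew the 60s alert budget")
        return data   # alert packets, ready to go out to the community

In the real system the deadline is presumably met by spreading the work across that 25 TFLOPS cluster rather than by racing a timer, but the hard per-pair budget is the point.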

So what's a time-critical system of this magnitude written in?

The data reduction pipelines are developed in C++ and Python. They rely on approximately 30 off-the-shelf middleware packages/libraries for parallel processing, data persistence and retrieval, data transfer, visualization, operations management and control, and security. The current design is based on MySQL layered on a parallel, fault-tolerant file system.
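
By way of illustration, a pipeline stage persisting a detection through that MySQL layer might look something like this (a minimal sketch using the stock mysql-connector-python package; the connection details, table and column names are ours, not the LSST schema):

    import mysql.connector   # assumes the stock MySQL connector, not LSST middleware

    # Hypothetical table and columns -- illustrative only, not the LSST schema.
    conn = mysql.connector.connect(host="localhost", user="lsst",
                                   password="secret", database="survey")
    cur = conn.cursor()
    cur.execute("""CREATE TABLE IF NOT EXISTS detections (
                     id BIGINT AUTO_INCREMENT PRIMARY KEY,
                     ra DOUBLE, decl DOUBLE, flux DOUBLE,
                     observed_at DATETIME)""")
    cur.execute("INSERT INTO detections (ra, decl, flux, observed_at) "
                "VALUES (%s, %s, %s, NOW())", (149.9, 2.2, 3.5e-29))
    conn.commit()
    conn.close()

Layering the relational store on a parallel, fault-tolerant file system keeps the bulk image data out of the database itself; MySQL holds the catalogues and metadata that users actually query.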