SKA precursor starts firehosing astrodata to the world

Cramming the universe into a fibre at 1.5 terabytes an hour

Hard on the heels of yesterday's discussion of high-performance computing with the International Centre for Radio Astronomy Research and the National Computational Infrastructure comes the announcement that real data has started to stream out of Western Australia's Murchison Widefield Array.

In fact, to anybody but the biggest big data enthusiast, “stream” seems an inadequate word for what's leaving this one source. That's because the MWA's 2,048 dual-polarisation dipole antennas, arranged as 128 “tiles” (mostly in a relatively small 1.5 km core region, with some beyond that to give a 3 km baseline), produce a veritable firehose of data.

The iVEC-managed Pawsey Centre, 800 km distant in Perth, is receiving 400 megabytes per second from the telescope – and as discussed here and here, that's after on-site correlators, the first step in processing the telescope data, reduce the amount of data leaving the site to a manageable level.

A dedicated 10 Gbps fibre runs from Murchison to Geraldton, after which dark fibre on the Nextgen Networks-operated Regional Backhaul Blackspots Project link to Perth provides connectivity to the Pawsey Centre.

The Murchison Widefield Array Data Archive team from ICRAR: Dave Pallot (left), Professor Andreas Wicenec (centre), and Associate Professor Chen Wu (right) at the Pawsey Centre. Image: ICRAR

Just how much work is needed to create a manageable data set is demonstrated by the fact that the MWA's correlators perform half of the entire computation needed by the facility, iVEC says.

That gets the data down to about 1.5 TB per hour on the fibre – consistent with the 400 megabytes per second quoted above – and a much more manageable requirement that the Pawsey Centre store a mere 3 PB annually.
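Those figures hang together on the back of an envelope. A quick check in Python (the duty-cycle estimate at the end is our inference from the quoted numbers, not a figure from iVEC):

```python
# Sanity-checking the MWA data rates quoted in the article.
stream_mb_per_s = 400  # post-correlator rate into the fibre

tb_per_hour = stream_mb_per_s * 3600 / 1e6   # megabytes -> terabytes
print(f"{tb_per_hour:.2f} TB/hour")          # ~1.44, i.e. the "1.5 TB an hour"

pb_if_continuous = tb_per_hour * 24 * 365 / 1000
print(f"{pb_if_continuous:.1f} PB/year if streaming non-stop")  # ~12.6

# The archive is quoted at 3 PB/year, implying an effective duty
# cycle (observing time plus further reduction) of roughly:
print(f"~{3 / pb_if_continuous:.0%}")        # about a quarter
```

In other words, a mere 3 PB a year stored implies the firehose is on, at full bore, only around a quarter of the time.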

And, since in science data becomes more useful the more available it is, there are other institutions in on the act. MIT in the US and Victoria University in New Zealand already have links to the Pawsey Centre, with another planned to connect India, another partner in the MWA project.

MIT's research focus is the early Universe, which means further filtering is carried out before data crosses the Pacific. A mere 150 TB of astronomy data – already collected during the MWA's pilot operations and shipped to Perth over an earlier 1 Gbps link – has been sent to the USA so far. The new data streams will add 4 TB per day to that.
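At those link speeds, moving the backlog is no small job. A rough sketch, assuming ideal overhead-free links (the article quotes 1 Gbps for the earlier Perth link and 10 Gbps for the new fibre; real-world throughput would be lower):

```python
def transfer_days(data_tb, link_gbps):
    """Days to move data_tb terabytes over a link_gbps link, ignoring protocol overhead."""
    seconds = data_tb * 8e12 / (link_gbps * 1e9)  # TB -> bits, then / bits-per-second
    return seconds / 86400

# The 150 TB pilot backlog over the earlier 1 Gbps link:
print(f"{transfer_days(150, 1):.1f} days")       # ~13.9

# Each day's 4 TB of new data over a 10 Gbps link:
print(f"{transfer_days(4, 10) * 24:.2f} hours")  # ~0.89
```

Two weeks of solid transfer for the backlog at the old speed; under an hour a day for the new stream at the new one.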

As previously discussed by The Register, the Pawsey Centre will also host an advanced hierarchical storage management facility, including a 20 PB Spectra Logic tape library, a 6 PB SGI disk-based storage system, and a significant visualisation and post-processing capability provided by SGI in the form of a 6 TB UV2000 and 34 Data Analysis Engines. All of these systems will be interconnected via a high-speed (FDR) InfiniBand network.

Managing the distribution of this data is the open-source Next Generation Archive System (NGAS), developed by Professor Wicenec while at the European Southern Observatory and modified by ICRAR for operation at the Pawsey Centre. ®
