Big Blue prototypes software for big, big data

Coping with the astronomy info-glut

IBM has prototyped a software architecture for the huge data demands of astronomy projects such as the SKA (Square Kilometre Array).

One of the many problems created by a project as large as the SKA is that wherever it’s built – we’ll know next year whether the South African bid or the Australia/New Zealand bid wins – it’s going to generate too much data to store.

With as much as an exabyte of data as its daily dump, the SKA will demand new techniques just so astronomers can use the facility’s output (cue: Mission Impossible theme).
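For a sense of scale, a quick back-of-envelope sketch of what an exabyte a day means as a sustained rate (taking the figure literally, in SI units):

```python
# What does "an exabyte a day" mean as a sustained data rate?
EXABYTE = 10**18        # bytes, SI definition
SECONDS_PER_DAY = 86_400

rate_bytes_per_s = EXABYTE / SECONDS_PER_DAY
print(f"{rate_bytes_per_s / 10**12:.1f} TB/s sustained")  # ≈ 11.6 TB/s
```

That's roughly 11.6 terabytes every second, around the clock – which is why the raw stream can't simply be written to disk.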

Working with New Zealand-based radio astronomer Dr Melanie Johnston-Hollitt of Wellington’s Victoria University, IBM has created the Information Intensive Framework prototype.

Under the framework, data will be classified into astronomical concepts, and overlaid with a guided search facility for faster data access and fewer errors. IBM says the prototype has also suggested further improvements to achieve the SKA’s performance demands.
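IBM hasn’t published the framework’s internals, but the idea of classifying records into astronomical concepts and layering a guided (faceted) search on top can be sketched roughly like this – the concept names, record fields, and function are purely illustrative:

```python
# Purely illustrative sketch: tag records with astronomical concepts,
# then index by concept so a guided search can narrow results quickly.
from collections import defaultdict

# Hypothetical concept vocabulary
CONCEPTS = {"pulsar", "quasar", "radio galaxy"}

def index_by_concept(records):
    """Group record IDs under their concept tags for faceted lookup."""
    index = defaultdict(list)
    for rec in records:
        for concept in rec["concepts"] & CONCEPTS:
            index[concept].append(rec["id"])
    return index

records = [
    {"id": "src-001", "concepts": {"pulsar"}},
    {"id": "src-002", "concepts": {"quasar", "radio galaxy"}},
]
index = index_by_concept(records)
print(sorted(index["quasar"]))  # ['src-002']
```

The point of the guided layer is that a query touches only the records filed under the chosen concept, rather than scanning the full catalogue – the source of the promised speed-up and error reduction.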

“Undertaking research on exa-scale datasets will force radio astronomers into a new, as yet unexplored regime of automated processing, imaging and analysis,” Dr Johnston-Hollitt said in IBM’s announcement.

“Surveys on even SKA precursor telescopes such as ASKAP and MWA are expected to produce catalogues of tens of millions of radio sources. How we organise and classify these data, which we will have in the next three years, is a significant challenge. We will need new solutions to fully realize the vast scientific potential of these datasets and it's fantastic that organisations like IBM are prepared to take up that challenge.”

If the A/NZ team wins the SKA contract, data will have to be pre-processed close to the telescopes (most of which would be in the remote north-west of Western Australia), then sent back to the Pawsey Supercomputing Centre in Perth for storage and analysis.

Since contracts are now open for the next phase of the Pawsey Centre’s implementation, The Register wouldn’t be surprised if IBM isn’t the only company trying to position itself with relevant software architectures. ®
