Swinburne starts design of pulsar-hunting supercomputer

Australian Uni plans LGM hunt with FPGAs

Back when they first discovered pulsars – in the “Little Green Men” era of the 1960s – astronomers were seeing big, loud and slow pulses. Today's pulsar-hunters are hunting subtler beasts and therefore need a lot more computing power, which is why Australia's Swinburne University has decided to spend more than AU$600,000 to design a computer to join in the search.

The grant, announced at the end of May, is to design a machine that will be the pulsar signal processor for the Square Kilometre Array.

As project leader and Swinburne senior lecturer Dr Willem van Straten explained to The Register, compared to LGM-1 the pulsars of interest to physics in 2013 are “faster, weaker, and have travelled a greater distance”. All of this means that instead of spotting the pulsar with the naked eye and looking at a trace from an antenna, a lot of computing power is needed to distinguish the pulsar's signal – having suffered a lot of interstellar dispersion along the way – from the background noise.

Whereas LGM-1 had a pulse that repeated every ~1.337 seconds, the SKA will be looking for weak pulsars spinning hundreds of times per second.

To get ready for the pulsar search, Swinburne has begun 2½ years of design work to meet two requirements: very high I/O (both on the network connection delivering signals from the SKA's hundreds of antennas, and within the data centre) and very high-performance processing.

Much of the computing will be straightforward multiplication and addition of vast quantities of complex numbers, Dr van Straten said, but there will also be a requirement to carry out large numbers of fast Fourier transforms (FFTs).
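
For a sense of what that workload looks like, here is a minimal NumPy sketch of per-channel complex multiply-adds followed by FFTs. The channel count, block size and gains are illustrative assumptions, not figures from the SKA design.

# Illustrative sketch only: per-channel complex multiply-add plus FFTs.
# Channel count, block size and gains are assumed, not SKA design values.
import numpy as np

rng = np.random.default_rng(0)
n_channels = 512          # assumed number of coarse frequency channels
n_samples = 4096          # assumed complex samples per channel per block

# Simulated complex voltage samples for one block
voltages = rng.standard_normal((n_channels, n_samples)) \
         + 1j * rng.standard_normal((n_channels, n_samples))

# Complex multiply-add: apply a per-channel complex gain, accumulate power
gains = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=(n_channels, 1)))
corrected = voltages * gains
power = np.sum(np.abs(corrected) ** 2, axis=1)

# FFT each channel's time series into finer frequency channels
spectra = np.fft.fft(corrected, axis=1)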

First, pulse profiles will be built by averaging the incoming signal as a function of the pulsar's phase, with the phase divided into “bins”. “We might need five hundred bins to resolve the structure of a pulsar”, Dr van Straten said.
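
As a rough illustration of that step, the sketch below folds a time series into pulse-phase bins and averages within each bin. The 500-bin figure comes from the article; the pulsar period, sampling rate and stand-in signal are assumed values.

# Illustrative sketch: fold a time series into pulse-phase bins.
import numpy as np

n_bins = 500              # phase bins, per the figure quoted in the article
period = 0.005            # assumed pulsar period: 5 ms (200 rotations/second)
sample_rate = 1.0e5       # assumed sampling rate, samples per second

n_samples = 1_000_000
t = np.arange(n_samples) / sample_rate
intensity = np.random.default_rng(1).standard_normal(n_samples)  # stand-in signal

# Rotational phase of each sample in [0, 1), and the bin it falls into
phase = (t / period) % 1.0
bins = (phase * n_bins).astype(int)

# Average the intensity in each phase bin to build the pulse profile
profile = np.bincount(bins, weights=intensity, minlength=n_bins) \
        / np.bincount(bins, minlength=n_bins)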

Then, the frequency channels will be narrowed to correct for interstellar distortion, and the “cleaned” signal will be put back into the time domain for study.
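
That step is what pulsar astronomers call coherent dedispersion. As a hedged sketch of the general idea: transform a channel of complex voltages to the frequency domain, multiply by the conjugate of a dispersion “chirp”, and transform back to the time domain. The dispersion measure, centre frequency and bandwidth below are made-up values, and sign and unit conventions differ between real implementations.

# Hedged sketch of coherent dedispersion over a single frequency channel.
# DM, centre frequency and bandwidth are illustrative, not SKA values.
import numpy as np

dm = 30.0            # assumed dispersion measure, pc cm^-3
f0 = 1400.0          # assumed channel centre frequency, MHz
bw = 1.0             # assumed channel bandwidth, MHz
k_dm = 4.148808e3    # dispersion constant, s MHz^2 pc^-1 cm^3

n = 2 ** 16
rng = np.random.default_rng(2)
voltages = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Frequency offsets across the channel, in MHz
f = np.fft.fftfreq(n, d=1.0 / bw)

# Phase imparted by interstellar dispersion (Hankins & Rickett form);
# the 1e6 factor reconciles delays in seconds with frequencies in MHz.
phase = 2.0 * np.pi * k_dm * 1.0e6 * dm * f ** 2 / (f0 ** 2 * (f0 + f))
chirp = np.exp(1j * phase)

# Dedisperse by applying the conjugate of the dispersion response
dedispersed = np.fft.ifft(np.fft.fft(voltages) * np.conj(chirp))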

One interesting question to be resolved during the design process is whether GPUs are suitable for the task, or whether it would be better to design an FPGA-based or ASIC-based processing system.

Dr van Straten told The Register that throughput isn't the only consideration the designers will be working with, since “power consumption is a limiting factor in the design”.

Although GPU vendors like Nvidia are working hard on power consumption (to keep up with the demands of the mobile age), there may be other reasons to attempt an FPGA-based design: the tools now available to end users make it possible for someone who isn't an electrical engineer to design an algorithm and implement it on the device for fast processing.

Such attractions have already been noticed elsewhere. For example, advanced Bitcoin miners now routinely use FPGAs in their hunt for the crypto-gold.

That question will be resolved by 2016, when the design is complete and construction of the supercomputer begins.

Partners in the design consortium include the National Research Council of Canada, the Science and Technology Facilities Council (UK), Oxford University, the University of Manchester, the Max Planck Institute for Radio Astronomy (MPIfR), SKA South Africa and the International Centre for Radio Astronomy Research. ®
