Swinburne starts design of pulsar-hunting supercomputer

Australian Uni plans LGM hunt with FPGAs

Back when they first discovered pulsars – in the “Little Green Men” era of the 1960s – astronomers were seeing big, loud and slow pulses. Today's pulsar-hunters are chasing subtler beasts and therefore need a lot more computing power, which is why Australia's Swinburne University has decided to spend more than AU$600,000 to design a computer to join in the search.

The grant, announced at the end of May, is to design a machine that will be the pulsar signal processor for the Square Kilometre Array.

As project leader and Swinburne senior lecturer Dr Willem van Straten explained to The Register, compared to LGM-1 the pulsars of interest to physics in 2013 are “faster, weaker, and have travelled a greater distance”. All of this means that instead of spotting the pulsar by eye in a trace from an antenna, a lot of computing power is needed to distinguish the pulsar's signal – which has suffered a lot of interstellar dispersion along the way – from the background noise.

Whereas LGM-1 had a pulse that repeated every ~1.337 seconds, the SKA will be looking for weak pulsars spinning hundreds of times per second.

To get ready for the pulsar search, Swinburne has begun 2½ years of design work to meet two requirements: very high I/O (both on the network connection delivering signals from the SKA's hundreds of antennas and within the data centre) and very high-performance processing.

Much of the computing will be straightforward multiplication and addition of vast numbers of complex values, Dr van Straten said, but there will also be a requirement to carry out large numbers of fast Fourier transforms (FFTs).
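For a rough feel for that arithmetic, here is a minimal NumPy sketch – the data, array sizes and channel count are all invented for illustration, and this is not the SKA pipeline:

import numpy as np

# Toy stand-in for the core arithmetic: complex multiply-accumulate on two
# fake voltage streams, plus many small FFTs to split one stream into
# frequency channels. Sizes are made up; the real processor works on
# streaming data from hundreds of antennas.
rng = np.random.default_rng(42)
x = rng.normal(size=2**20) + 1j * rng.normal(size=2**20)
y = rng.normal(size=2**20) + 1j * rng.normal(size=2**20)

accumulated = np.sum(x * np.conj(y))        # complex multiply-and-add
channels = np.fft.fft(x.reshape(-1, 1024))  # 1024 FFTs of 1024 points each

print(accumulated, channels.shape)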

First, the incoming signal will be averaged as a function of the pulsar's rotational phase to build up a pulse profile, with the phase divided into “bins”. “We might need five hundred bins to resolve the structure of a pulsar”, Dr van Straten said.
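That kind of pulse folding can be sketched in a few lines of NumPy. Everything here is invented for illustration (the sampling rate, the pulsar period, the fold_profile helper) except the 500-bin figure, which comes from Dr van Straten's comment above:

import numpy as np

def fold_profile(samples, sample_rate, period, n_bins=500):
    # Toy version of pulse folding: each sample gets a rotational phase
    # (0..1) from its timestamp, and samples sharing a phase bin are
    # averaged so the repeating pulse builds up above the noise.
    t = np.arange(samples.size) / sample_rate
    bins = ((t / period) % 1.0 * n_bins).astype(int)
    totals = np.bincount(bins, weights=samples, minlength=n_bins)
    counts = np.bincount(bins, minlength=n_bins)
    return totals / np.maximum(counts, 1)

# Fake data: a millisecond pulsar (2.137 ms period) buried in noise,
# sampled at 100 kHz.
rng = np.random.default_rng(1)
rate, period = 100_000.0, 0.002137
t = np.arange(1_000_000) / rate
pulse = 0.2 * (np.sin(2 * np.pi * t / period) > 0.99)  # narrow pulse, far below the per-sample noise
profile = fold_profile(pulse + rng.normal(size=t.size), rate, period)
print("pulse appears in phase bin", profile.argmax())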

Then, the frequency channels will be narrowed to correct for interstellar distortion, and the “cleaned” signal will be put back into the time domain for study.
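One standard way of doing that correction is coherent dedispersion: transform the recorded voltages to the frequency domain, unwind the quadratic phase the interstellar medium imposed, and transform back. A heavily simplified sketch, with invented observing parameters and the sign convention glossed over – illustrative only, not the SKA design:

import numpy as np

# Standard dispersion constant, expressed in Hz so the phase comes out in radians:
# ~4.148808e3 MHz^2 s per (pc cm^-3)  =  4.148808e15 Hz^2 s per (pc cm^-3)
D_CONST = 4.148808e15

def coherent_dedisperse(voltages, bandwidth_hz, centre_freq_hz, dm):
    # Textbook coherent dedispersion: go to the frequency domain, remove the
    # quadratic phase the interstellar medium added, then return to the time
    # domain for folding and study.
    n = voltages.size
    f = np.fft.fftfreq(n, d=1.0 / bandwidth_hz)  # offset from band centre, Hz
    phase = 2 * np.pi * D_CONST * dm * f**2 / (centre_freq_hz**2 * (centre_freq_hz + f))
    # Sign of the chirp depends on the baseband/Fourier conventions in use.
    return np.fft.ifft(np.fft.fft(voltages) * np.exp(-1j * phase))

# Invented parameters: 16 MHz of band centred on 1.4 GHz, DM of 50 pc/cm^3.
rng = np.random.default_rng(0)
baseband = rng.normal(size=2**16) + 1j * rng.normal(size=2**16)
cleaned = coherent_dedisperse(baseband, 16e6, 1.4e9, 50.0)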

One interesting debate to be resolved during the design process is whether GPUs are suitable for the task, or whether an FPGA-based or ASIC-based processing system would be a better fit.

Dr van Straten told The Register that throughput isn't the only consideration the designers will be working with, since “power consumption is a limiting factor in the design”.

Although GPU vendors like Nvidia are working hard on power consumption (to keep up with the requirements of the mobile age), there may be other reasons to attempt an FPGA-based design: the tools now available to end users make it possible for someone who isn't an electrical engineer to design an algorithm that can be implemented on the device for fast processing.

Such attractions have already been noticed elsewhere. For example, advanced Bitcoin miners now routinely use FPGAs in their hunt for the crypto-gold.

That question will be resolved by 2016, when the design is complete and construction of the supercomputer begins.

Partners in the design consortium include the National Research Council of Canada, the Science and Technology Facilities Council (UK), Oxford University, the University of Manchester, the Max Planck Institute for Radio Astronomy (MPIfR), SKA South Africa and the International Centre for Radio Astronomy Research. ®
