Swinburne starts design of pulsar-hunting supercomputer

Australian Uni plans LGM hunt with FPGAs

Back when pulsars were first discovered – in the “Little Green Men” era of the 1960s – astronomers were seeing big, loud, slow pulses. Today's pulsar-hunters are chasing subtler beasts and therefore need a lot more computing power, which is why Australia's Swinburne University has decided to spend more than AU$600,000 designing a computer to join the search.

The grant, announced at the end of May, is to design a machine that will be the pulsar signal processor for the Square Kilometre Array.

As project leader and Swinburne senior lecturer Dr Willem van Straten explained to The Register, compared to LGM-1 the pulsars of interest to physics in 2013 are “faster, weaker, and have travelled a greater distance”. All of this means that instead of picking the pulsar out by eye from a chart trace of an antenna's output, a lot of computing power is needed to distinguish the pulsar's signal – which has suffered a lot of interstellar dispersion along the way – from the background noise.

Whereas LGM-1 had a pulse that repeated every ~1.337 seconds, the SKA will be looking for weak pulsars spinning hundreds of times per second.

To get ready for the pulsar search, Swinburne has begun 2½ years of design work to meet two requirements: very high I/O (both on the network connection delivering signals from the SKA's hundreds of antennas, and within the data centre) and very high-performance processing.

Much of the computing will be straightforward multiplication and addition of vast quantities of complex numbers, Dr van Straten said, but there will also be a requirement to carry out large numbers of fast Fourier transforms (FFTs).
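Illustratively, both kernels fit in a few lines of NumPy – a complex multiply-accumulate of the kind used in beamforming and correlation, and an FFT over the same data. The arrays and sizes here are invented for the example, not taken from the SKA design:

```python
import numpy as np

rng = np.random.default_rng(42)
# Fake complex baseband samples and complex weights
x = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
w = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)

acc = np.sum(w * x)            # complex multiply-accumulate
spectrum = np.fft.fft(x)       # O(n log n) transform of the same data

# Parseval's theorem ties the two domains together:
# total power is preserved (NumPy's fft is unnormalised, hence the /n)
power_time = np.sum(np.abs(x) ** 2)
power_freq = np.sum(np.abs(spectrum) ** 2) / len(x)
```

The real machine would of course do this at enormous scale and in fixed- or reduced-precision arithmetic, but the arithmetic primitives are the same.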

First, the incoming signal will be averaged as a function of the pulsar's rotational phase, with that phase divided into “bins”. “We might need five hundred bins to resolve the structure of a pulsar”, Dr van Straten said.
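The folding step described above can be sketched as follows. This is a hypothetical illustration – the function name, sample rate and default bin count are invented, not taken from the SKA design:

```python
import numpy as np

def fold_profile(samples, sample_rate, period, n_bins=500):
    """Average a sampled time series as a function of pulse phase,
    with the phase divided into n_bins bins."""
    t = np.arange(len(samples)) / sample_rate       # sample times (s)
    phase = (t / period) % 1.0                      # pulse phase in [0, 1)
    bins = (phase * n_bins).astype(int)             # bin index per sample
    total = np.bincount(bins, weights=samples, minlength=n_bins)
    counts = np.bincount(bins, minlength=n_bins)
    return total / np.maximum(counts, 1)            # mean per bin
```

Because each rotation of the pulsar lands on the same set of phase bins, the pulse adds coherently while the noise averages down – which is how a signal far too weak to see in a single rotation becomes visible over many.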

Then, the signal will be divided into narrow frequency channels so the interstellar dispersion can be corrected, and the “cleaned” signal will be transformed back into the time domain for study.
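A minimal sketch of that frequency-domain correction (coherent dedispersion), assuming the standard cold-plasma dispersion relation: FFT the voltages, multiply by the inverse of the dispersion “chirp”, and inverse-FFT back to the time domain. The function, its unit conventions (MHz, pc cm⁻³) and the Hankins–Rickett-style chirp below are illustrative, not the SKA implementation:

```python
import numpy as np

# Standard dispersion constant, in MHz^2 s per (pc cm^-3)
D_CONST = 4.148808e3

def coherent_dedisperse(voltages, bandwidth_mhz, f_centre_mhz, dm):
    """Undo interstellar dispersion on complex baseband voltages
    by applying the inverse dispersion phase in the Fourier domain."""
    n = len(voltages)
    spec = np.fft.fft(voltages)
    # frequency offsets from the band centre, in MHz
    f = np.fft.fftfreq(n, d=1.0 / bandwidth_mhz)
    # phase of the cold-plasma dispersion chirp (1e6 converts s*MHz to cycles)
    phase = (2.0 * np.pi * 1e6 * D_CONST * dm * f**2
             / (f_centre_mhz**2 * (f_centre_mhz + f)))
    return np.fft.ifft(spec * np.exp(1j * phase))
```

With the dispersion measure set to zero the chirp is unity and the round trip returns the input unchanged; with a real dispersion measure it re-aligns the frequency-dependent arrival times smeared out by the interstellar medium.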

One interesting debate to be settled during the design process is whether GPUs are suitable for the task, or whether an FPGA-based or ASIC-based processing system would be the better choice.

Dr van Straten told The Register that throughput isn't the only consideration the designers will be working with, since “power consumption is a limiting factor in the design”.

Although GPU vendors like Nvidia are working hard on power consumption (to keep up with the demands of the mobile age), there may be other reasons to attempt an FPGA-based design. The tools now available to end users make it possible for someone who is not an electrical engineer to describe an algorithm and have it implemented on the device for fast processing.

Such attractions have already been noticed elsewhere. For example, advanced Bitcoin miners now routinely use FPGAs in their hunt for the crypto-gold.

The question will be resolved by 2016, when the design is complete and construction of the supercomputer begins.

Partners in the design consortium include the National Research Council of Canada, the Science and Technology Facilities Council (UK), Oxford University, the University of Manchester, the Max Planck Institute for Radio Astronomy (MPIfR), SKA South Africa and the International Centre for Radio Astronomy Research. ®
