
Putting the Square Kilometre Array on a Cloud

Should be enough to get free Amazon shipping, anyway


By now, you may have heard of the Square Kilometre Array: it is to be the world’s biggest radio telescope, assembled from 3,000 15-meter dishes into a collecting area of, yes, one square kilometer.

The Southern Hemisphere has less light pollution and radio interference than the North. And so South Africa and Western Australia are on the short list for the central core location, from which an array of receptors will snake out across the Indian Ocean islands up to 3,000km away.

A consortium of 67 organisations in 20 countries is working with industry vendors on the design. Construction is budgeted at $1.5bn and kicks off in 2016.

The SKA should be fully operational in 2024 and will be 10,000 times more sensitive than the best radio telescope today. The Big Questions it will help answer include the origins of the universe; the nature of Dark Matter and Dark Energy (which kind of creeps me out); and whether Einstein was right about general relativity – we’ll find out if space is truly bendy.

Astronomers and scientists will also look around to see what locations might support life and try to figure out where magnetism comes from. (And yes, the answer is more complicated than “magnets”.)

Data crunch

The SKA will generate huge volumes of data. The consortium is working on a test site that’s one per cent the size of the full SKA and will spit out raw data at 60 terabits/sec. After some level of correlation and other processing, the rate settles down to 1GB/sec of data to be stored and analyzed.
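Taken at face value, those two figures imply a hefty reduction between the raw stream and what actually gets stored. A quick back-of-envelope sketch (decimal units assumed; variable names are mine, not the memo's):

```python
# How much correlation and processing squeeze the precursor's data stream,
# assuming decimal units: 60 Tbit/s raw in, 1 GB/s out.
raw_bits_per_sec = 60 * 10**12      # 60 terabits per second, raw
out_bytes_per_sec = 10**9           # 1 GB per second after processing

raw_bytes_per_sec = raw_bits_per_sec / 8          # 7.5 TB/s
reduction = raw_bytes_per_sec / out_bytes_per_sec
print(f"{reduction:,.0f}x reduction")  # 7,500x
```

So even the one-per-cent test site has to throw away roughly 7,500 bytes for every byte it keeps.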

In operation, SKA will generate 1TB/sec of pre-processed data, which would equal an exabyte of data every 13 days. Even with much more aggressive aggregation, we’re talking about exabytes of data.
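A quick sanity check on that claim: at a sustained 1TB/sec, an exabyte actually arrives in a shade under 12 days (using decimal units; binary units stretch it to about 12.1), so the quoted 13 days is in the right ballpark. The figures below are the article's; the arithmetic is a sketch of mine:

```python
# Back-of-envelope check of the "exabyte every 13 days" claim,
# assuming decimal units (1 TB = 10**12 bytes, 1 EB = 10**18 bytes).
RATE_BYTES_PER_SEC = 10**12      # 1 TB/sec of pre-processed data
EXABYTE = 10**18                 # bytes
SECONDS_PER_DAY = 86_400

days_per_exabyte = EXABYTE / RATE_BYTES_PER_SEC / SECONDS_PER_DAY
print(f"{days_per_exabyte:.1f} days per exabyte")  # 11.6 days
```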

According to a source on the web (so I know that it’s true), five exabytes is big enough to log every word ever spoken by human beings. I think this also would include short words like ‘a’ and ‘an’, but I’m not sure about grunts or exclamations. Either way, it’s a lot.

SKA antennas close-up - artist's impression

Hot dishes

So how do you process, transport, and store this much data? According to the authors of SKA Memo 134, Cloud Computing and the Square Kilometre Array, cloud storage/computing might handle the load.

They put forward some scenarios using Amazon EC2: the largest was storage of 1PB of data and continuous use of 1,000 compute nodes. The price tag is $225,000 per month plus an annual payment of $455,000 - which totals a little over $3.1m per year.
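The yearly total is easy to verify from the two quoted figures; everything below other than those figures is just arithmetic:

```python
# The largest EC2 scenario from SKA Memo 134: a monthly fee for
# 1 PB of storage plus 1,000 compute nodes, and an annual payment on top.
monthly_fee = 225_000   # USD per month (figure quoted in the memo)
annual_fee = 455_000    # USD per year (figure quoted in the memo)

yearly_total = monthly_fee * 12 + annual_fee
print(f"${yearly_total:,} per year")  # $3,155,000 - a little over $3.1m
```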

They note that they might be able to negotiate a volume discount, which could reduce costs significantly. I’d also make them throw in free Amazon Prime shipping, free media streaming, and early access to super-saver items before the general public sees them.

Plugging into the Grid

On the compute side, the authors talk about potentially using a SETI@Home or Folding@Home model to carry some of the load. According to their calculations, the current capacity available from folks volunteering their spare cycles is around 5 petaflops. If it were a single system, that would put it in second place on the Top500, behind the 8-petaflop Japanese Super K.

Something that captured my imagination was their speculation that the unused or underutilized capacity on multi-core, broadband-attached PCs is something like 100 times the combined processing power of the entire Top500 list.

What would be a fair price for that capacity? Perhaps the number is somewhere north of the cost of data transport plus the incremental cost of electricity, which would still be about a tenth the cost of any other processing available today.
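To make that floor price concrete, here is an illustrative sketch. Every number in it is a hypothetical assumption of mine (the article quotes none), chosen only to show how transport plus incremental electricity sets the lower bound:

```python
# Illustrative floor price for scavenged cycles: data transport plus the
# incremental electricity a volunteer PC burns under load.
# ALL figures below are hypothetical assumptions, not from the memo.
watts_incremental = 150       # assumed extra draw of a busy multi-core PC
price_per_kwh = 0.12          # assumed residential electricity tariff, USD
transport_per_hour = 0.005    # assumed data-movement cost per node-hour, USD

electricity_per_hour = watts_incremental / 1000 * price_per_kwh
floor_price = electricity_per_hour + transport_per_hour
print(f"~${floor_price:.3f}/node-hour")  # ~$0.023/node-hour
```

Under these assumptions, anything offered above a few cents per node-hour leaves both sides ahead.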

This is an interesting concept - maybe a forerunner of future high performance computing (HPC). Would you sign up for free high-speed internet access in exchange for keeping your computer on all night and letting them scavenge your idle cycles? There would be no advertising on your screen, and they wouldn’t be tracking your movements and selling them to advertisers.

If they can negotiate low enough rates from the providers, the numbers might just work. It’s a win-win: the user gets free bandwidth, and the sponsoring organization gets their computing tasks done at much lower cost. ®
