Floridians to design first space-based supercomputer

Engineers and researchers at the University of Florida are hard at work building a supercomputer capable of surviving in orbit. The computer is scheduled for a 2009 launch, on board a NASA test mission, and will be 100 times more powerful than anything already orbiting Earth.

So far, orbiting computers have missed out on decades of technological advances because anything electronic must be "hardened" to protect it from cosmic radiation. Hardening makes hardware slower and bulkier, a serious drawback when launch weight is at such a premium.

However, as the amount of science being done by orbiting satellites increases, so does the need for more powerful number crunchers above the atmosphere, and faster downlinks to Earth-based researchers.

"There are only so many bits per second you can send down from a satellite," said John Samson, the principal investigator for the project at Honeywell's Clearwater facility. "That means scientists are very limited in how much science they can do."

The technology developed will also be needed to build more autonomous spacecraft, capable of making their own course corrections, for example.

"To explore space and to support Earth and space science, there is a great need for much more processing power in space," said Alan George, a professor of electrical and computer engineering at the University of Florida (UF). "To be autonomous is to require a lot of computation and, until now, conventional space processing technologies have been incapable of high-performance computing."

The UF researchers are not trying to find lighter and smaller ways of hardening components; instead, they are looking for ways to build a machine that can keep functioning in a radiation-intensive environment. Techniques include making the machine especially fault tolerant, so that it can switch from a failing circuit board to a functioning one, and using algorithms to check for errors. ®
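The article does not spell out which fault-tolerance algorithms the UF/Honeywell team uses. A classic software technique in this vein is triple modular redundancy (TMR): the same computation runs on redundant units and the results are majority-voted, so a single radiation-corrupted result is out-voted by the two healthy ones. The sketch below is purely illustrative; the `tmr` function and the simulated "boards" are hypothetical, not part of the project described.

```python
# Illustrative sketch of triple modular redundancy (TMR) with majority
# voting -- NOT the actual UF/Honeywell design. A computation runs on
# three redundant "boards"; a single radiation-induced fault is out-voted.

from collections import Counter

def tmr(compute, value, boards):
    """Run compute(value) on three redundant boards and majority-vote."""
    results = [board(compute, value) for board in boards]
    winner, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: all boards disagree")
    return winner

# Two healthy boards and one that flips a bit in its result,
# simulating a single-event upset from cosmic radiation.
healthy = lambda f, x: f(x)
flipped = lambda f, x: f(x) ^ 0x01

double = lambda x: x * 2
print(tmr(double, 21, [healthy, healthy, flipped]))  # -> 42
```

Real systems pair voting like this with checkpointing and error-detecting codes, so a failed board can be reset and rejoin the redundant set rather than being lost for the mission.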

