Life, the universe and supercomputers

British boffins hope Sun cluster will answer cosmic conundrums

Updated Trade and Industry Secretary Patricia Hewitt yesterday switched on a supercomputer at Durham University which academics hope will give fresh insights into the origin and creation of the universe.

The Cosmology Machine, which can perform 228 billion floating point operations per second and costs a cool £1.4 million, will take different theories of cosmic evolution and work out how the heavens would have evolved under each. By comparing the resulting virtual universes with astronomical observations, scientists hope to test which of the competing theories best explains how the Universe was created.

Professor Carlos Frenk, director of the project at Durham, said: "The new machine will allow us to recreate the entire evolution of the universe, from its hot Big Bang beginning to the present. We are able to instruct the supercomputer as to how to make artificial universes which can be compared to astronomical observations."
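The "artificial universes" Frenk describes are gravitational N-body simulations: start from near-uniform initial conditions, follow millions of mass particles under their mutual gravity, and watch galaxies and clusters condense out. Purely as an illustration (this is not Virgo's code, which uses tree and particle-mesh methods to handle vastly more particles), a toy direct-summation version in Python looks something like this:

    # Toy direct-summation N-body sketch, for illustration only.
    # Production cosmology codes use tree/particle-mesh algorithms instead.
    import numpy as np

    def accelerations(pos, mass, softening=0.05):
        """Newtonian gravity on each particle, softened to avoid blow-ups (G = 1)."""
        diff = pos[None, :, :] - pos[:, None, :]      # r_j - r_i for every pair
        dist2 = (diff ** 2).sum(-1) + softening ** 2  # softened squared separations
        inv_d3 = dist2 ** -1.5
        np.fill_diagonal(inv_d3, 0.0)                 # a particle doesn't pull itself
        return (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

    def evolve(pos, vel, mass, dt, steps):
        """Kick-drift-kick leapfrog, the standard N-body integrator."""
        acc = accelerations(pos, mass)
        for _ in range(steps):
            vel += 0.5 * dt * acc
            pos += dt * vel
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc
        return pos, vel

    rng = np.random.default_rng(1)
    n = 500                               # toy count; real runs use millions
    pos = rng.standard_normal((n, 3))     # a rough initial blob of matter
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)
    pos, vel = evolve(pos, vel, mass, dt=0.01, steps=200)

Rerunning such a simulation with different assumptions about, say, the amount and nature of dark matter yields different final structures, and it is these that get compared against the real sky.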

Durham's project is the UK base of the "Virgo Consortium for cosmological simulations", a collaboration of about 30 researchers in the UK, Germany, Canada and the USA.

Long-term goals of the Virgo Consortium include understanding the formation of structures in the universe, establishing the identity and properties of the dark matter that dominates the dynamics of the universe and relating the Big Bang theory to astronomical observations.

In many ways the Cosmology Machine, which Durham claims is the biggest in use at a British academic institution, has a similar purpose to the Deep Thought machine that features in Douglas Adams' Hitchhiker's Guide to the Galaxy. For those of you unfamiliar with the book, Deep Thought was programmed by a race of hyper-intelligent aliens to find an answer to the ultimate question of Life, the Universe and Everything.

After thinking about the problem for seven and a half million years Deep Thought came up with the answer "42". Boffins at Durham will be hoping for far more useful results from their monster machine.

Durham's supercomputer uses a cluster of 64 two-processor SunBlade 1000 machines, along with a 24-processor SunFire 6800 server, supplied by Sun reseller Esteem Systems. The system features 112 Gigabytes of RAM and 7 Terabytes of data storage, more than enough to hold a copy of every book held by the British Library 10 times over.
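The quoted speed squares with a quick back-of-envelope check, if one assumes (the article doesn't say) 750 MHz UltraSPARC III processors peaking at two floating point results per clock:

    # Sanity check of the quoted figures. CPU counts are from the article;
    # the 750 MHz clock and 2 flops/cycle peak are assumptions about the
    # UltraSPARC III parts, not stated in the article.
    cpus = 64 * 2 + 24          # SunBlade 1000 pairs plus the SunFire 6800
    peak_gflops = cpus * 750e6 * 2 / 1e9
    print(f"{cpus} CPUs -> ~{peak_gflops:.0f} Gflop/s peak")   # ~228, as quoted

    # The storage claim implies each copy of the British Library's holdings
    # would occupy roughly 7 TB / 10 = 0.7 TB.
    print(f"one library copy ~= {7e12 / 10 / 1e9:.0f} GB")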

The kit that made up Deep Thought isn't recorded. ®

Update

The original press release Durham issued said the supercomputer works at 10 billion arithmetic calculations per second, while elsewhere on its site Durham put the figure at 200 billion. We have now confirmed that the figure in the press release was a typo.

Claims that Durham's machine is the largest in use by a British university have been questioned by Reg readers.

External links
Durham's Department of Computational Cosmology

Related stories

RIP: Douglas Adams
NASA's new supercomp sits on a desktop
AMD cluster sneaks in Supercomputer top 500 list
Sun's Oz super computer goes horribly pear shaped
$10m super'puter to crunch genetic code
