Life, the universe and supercomputers

British boffins hope Sun cluster will answer cosmic conundrums

Updated Trade and Industry Secretary Patricia Hewitt yesterday switched on a supercomputer at Durham University which academics hope will give fresh insights into the origin and creation of the universe.

The Cosmology Machine, which can perform 228 billion floating point operations per second and costs a cool £1.4 million, will take different theories of cosmic evolution and work out how the heavens would have evolved based on these ideas. By comparing these virtual universes with astronomical observations, scientists hope to test which of the competing theories best explains how the Universe was created.

Professor Carlos Frenk, director of the project at Durham, said: "The new machine will allow us to recreate the entire evolution of the universe, from its hot Big Bang beginning to the present. We are able to instruct the supercomputer as to how to make artificial universes which can be compared to astronomical observations."
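
In essence, the "artificial universes" Frenk describes are gravitational N-body simulations: seed a box of matter particles according to a chosen theory, then repeatedly work out the pull of every particle on every other and nudge the whole lot forward in time. The toy sketch below (a couple of dozen particles and a naive direct-sum force loop in Python, bearing no resemblance to the Virgo Consortium's actual codes) illustrates the basic loop:

    # Toy 2D direct-sum N-body sketch -- purely illustrative; the real
    # Virgo codes are vastly more sophisticated and use far more particles.
    import random

    G = 1.0           # gravitational constant in toy units
    SOFTENING = 0.05  # avoids infinite forces when particles get close
    DT = 0.01         # time step

    # A few particles with random positions and zero initial velocity
    particles = [{"pos": [random.uniform(-1, 1), random.uniform(-1, 1)],
                  "vel": [0.0, 0.0],
                  "mass": 1.0} for _ in range(20)]

    def step(parts):
        """Advance every particle one time step under mutual gravity."""
        for p in parts:
            ax = ay = 0.0
            for q in parts:
                if q is p:
                    continue
                dx = q["pos"][0] - p["pos"][0]
                dy = q["pos"][1] - p["pos"][1]
                r2 = dx * dx + dy * dy + SOFTENING ** 2
                inv_r3 = r2 ** -1.5
                ax += G * q["mass"] * dx * inv_r3
                ay += G * q["mass"] * dy * inv_r3
            p["vel"][0] += ax * DT
            p["vel"][1] += ay * DT
        for p in parts:
            p["pos"][0] += p["vel"][0] * DT
            p["pos"][1] += p["vel"][1] * DT

    for _ in range(1000):
        step(particles)

Scale that up to many millions of particles and thousands of time steps and you begin to see why 228 billion operations per second comes in handy.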

Durham's project is the UK base of the "Virgo Consortium for cosmological simulations", a collaboration of about 30 researchers in the UK, Germany, Canada and the USA.

Long-term goals of the Virgo Consortium include understanding the formation of structures in the universe, establishing the identity and properties of the dark matter that dominates the dynamics of the universe and relating the Big Bang theory to astronomical observations.

In many ways the Cosmology Machine, which Durham claims is the biggest in use in a British academic institution, has a similar purpose to the Deep Thought machine that features in Douglas Adams' Hitchhiker's Guide to the Galaxy. For those of you unfamiliar with the book, Deep Thought was programmed by a race of hyper-intelligent aliens to find the answer to the ultimate question of Life, the Universe and Everything.

After thinking about the problem for seven and a half million years Deep Thought came up with the answer "42". Boffins at Durham will be hoping for far more useful results from their monster machine.

Durham's supercomputer uses a cluster of 64 two-processor SunBlade 1000 machines, along with a 24-processor SunFire 6800 server, supplied by Sun reseller Esteem Systems. The system features 112GB of RAM and 7TB of data storage, more than enough to hold a copy of every book held by the British Library ten times over.
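
As it happens, the 228 billion figure quoted above is consistent with a simple peak-performance sum, assuming (our assumption, not a figure from Durham or Sun) that all 152 processors are 750MHz UltraSPARC IIIs, each able to retire two floating point operations per cycle:

    # Back-of-the-envelope peak performance for the Cosmology Machine.
    # Assumed, not quoted by Durham: 750MHz UltraSPARC III processors,
    # each retiring two floating point operations per clock cycle.
    blade_cpus = 64 * 2      # 64 two-processor SunBlade 1000s
    server_cpus = 24         # one 24-processor SunFire 6800
    clock_hz = 750e6         # assumed clock speed
    flops_per_cycle = 2      # assumed: one FP add plus one FP multiply

    peak = (blade_cpus + server_cpus) * clock_hz * flops_per_cycle
    print(f"{peak / 1e9:.0f} billion floating point operations per second")
    # prints: 228 billion floating point operations per second

That is a theoretical peak, naturally; a real cosmology code will get rather less out of the hardware.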

The kit that made up Deep Thought isn't recorded. ®

Update

The original press release Durham issued said the supercomputer works at 10 billion arithmetic calculations per second, while elsewhere on its site Durham put the figure at 200 billion. We have since established that the press release contained a typo.

Claims that Durham's machine is the largest in use by a British university have been questioned by Reg readers.

External links
Durham's Department of Computational Cosmology

Related stories

RIP: Douglas Adams
NASA's new supercomp sits on a desktop
AMD cluster sneaks in Supercomputer top 500 list
Sun's Oz super computer goes horribly pear shaped
$10m super'puter to crunch genetic code
