Life, the universe and supercomputers

British boffins hope Sun cluster will answer cosmic conundrums

Updated Trade and Industry Secretary Patricia Hewitt yesterday switched on a supercomputer at Durham University which academics hope will give fresh insights into the origin and creation of the universe.

The Cosmology Machine, which can perform 228 billion floating point operations per second and costs a cool £1.4 million, will take different theories of cosmic evolution and work out how the heavens would have evolved based on these ideas. By comparing these models of virtual universes, scientists hope to test which of the competing theories best explains how the Universe was created.

Professor Carlos Frenk, director of the project at Durham, said: "The new machine will allow us to recreate the entire evolution of the universe, from its hot Big Bang beginning to the present. We are able to instruct the supercomputer as to how to make artificial universes which can be compared to astronomical observations."

Durham's project is the UK base of the "Virgo Consortium for cosmological simulations", a collaboration of about 30 researchers in the UK, Germany, Canada and the USA.

Long-term goals of the Virgo Consortium include understanding the formation of structures in the universe, establishing the identity and properties of the dark matter that dominates the dynamics of the universe and relating the Big Bang theory to astronomical observations.

In many ways the Cosmology Machine, which Durham claims is the biggest in use at a British academic institution, has a similar purpose to the Deep Thought machine that features in Douglas Adams' Hitchhiker's Guide to the Galaxy. For those of you unfamiliar with the book, Deep Thought was programmed by a race of hyper-intelligent aliens to find an answer to the ultimate question about Life, the Universe and Everything.

After thinking about the problem for seven and a half million years Deep Thought came up with the answer "42". Boffins at Durham will be hoping for far more useful results from their monster machine.

Durham's supercomputer uses a cluster of 64 two-processor SunBlade 1000 machines, along with a 24-processor SunFire 6800 server, supplied by Sun reseller Esteem Systems. The system features 112 Gigabytes of RAM and 7 Terabytes of data storage, more than enough to hold a copy of every book held by the British Library 10 times over.

The kit that made up Deep Thought isn't recorded. ®

Update

The original press release Durham issued on this said the supercomputer works at 10 billion arithmetic calculations per second, while elsewhere on its site Durham put the figure at 200 billion. We have now clarified that the press release figure was a typo.

Claims that Durham's machine is the largest in use by a British University have been questioned by Re

External links
Durham's Department of Computational Cosmology

Related stories

RIP: Douglas Adams
NASA's new supercomp sits on a desktop
AMD cluster sneaks in Supercomputer top 500 list
Sun's Oz super computer goes horribly pear shaped
$10m super'puter to crunch genetic code

