Life, the universe and supercomputers

British boffins hope Sun cluster will answer cosmic conundrums

Updated Trade and Industry Secretary Patricia Hewitt yesterday switched on a supercomputer at Durham University which academics hope will give fresh insights into the origin and creation of the universe.

The Cosmology Machine, which can perform 228 billion floating point operations per second and costs a cool £1.4 million, will take different theories of cosmic evolution and work out how the heavens would have evolved based on these ideas. By comparing these models of virtual universes, scientists hope to test which of the competing theories best explains how the universe was created.

Professor Carlos Frenk, director of the project at Durham, said: "The new machine will allow us to recreate the entire evolution of the universe, from its hot Big Bang beginning to the present. We are able to instruct the supercomputer as to how to make artificial universes which can be compared to astronomical observations."
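
For readers wondering what "instructing the supercomputer to make artificial universes" involves in practice, the heart of such simulations is gravitational N-body integration: particles are laid down according to the initial conditions a given theory predicts, stepped forward in time under their mutual gravity, and the resulting structure is compared with what telescopes actually see. The Python sketch below is a toy illustration only, with particle count, time step and softening length chosen arbitrarily by us, and bears no relation to the Virgo Consortium's production codes.

    # Toy N-body sketch: a handful of particles evolving under mutual gravity
    # with a leapfrog integrator. Purely illustrative; all numbers are arbitrary.
    import numpy as np

    def accelerations(pos, mass, softening=0.05):
        """Pairwise gravitational accelerations in G = 1 units."""
        diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # vector from i to j
        dist2 = (diff ** 2).sum(axis=-1) + softening ** 2
        inv_d3 = dist2 ** -1.5
        np.fill_diagonal(inv_d3, 0.0)                           # no self-force
        weighted = diff * inv_d3[:, :, np.newaxis] * mass[np.newaxis, :, np.newaxis]
        return weighted.sum(axis=1)

    def evolve(pos, vel, mass, dt=0.01, steps=1000):
        """Kick-drift-kick leapfrog integration of the particle set."""
        acc = accelerations(pos, mass)
        for _ in range(steps):
            vel += 0.5 * dt * acc          # half kick
            pos += dt * vel                # drift
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc          # half kick
        return pos, vel

    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        n = 200                            # real runs use many millions of particles
        pos = rng.normal(size=(n, 3))      # stand-in for theory-dependent initial conditions
        vel = np.zeros((n, 3))
        mass = np.full(n, 1.0 / n)
        pos, vel = evolve(pos, vel, mass)
        print("final particle spread:", pos.std(axis=0))

The production codes differ from this sketch in almost every respect (tree or mesh methods instead of direct pairwise sums, expanding-universe coordinates, gas physics), but the basic loop of initial conditions, time integration and comparison with observation is the same.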

Durham's project is the UK base of the "Virgo Consortium for cosmological simulations", a collaboration of about 30 researchers in the UK, Germany, Canada and the USA.

Long-term goals of the Virgo Consortium include understanding the formation of structures in the universe, establishing the identity and properties of the dark matter that dominates the dynamics of the universe and relating the Big Bang theory to astronomical observations.

In many ways the Cosmology Machine, which Durham claims is the biggest in use at a British academic institution, has a similar purpose to the Deep Thought machine that features in Douglas Adams' Hitchhiker's Guide to the Galaxy. For those of you unfamiliar with the book, Deep Thought was programmed by a race of hyper-intelligent aliens to find an answer to the ultimate question of Life, the Universe and Everything.

After thinking about the problem for seven and a half million years Deep Thought came up with the answer "42". Boffins at Durham will be hoping for far more useful results from their monster machine.

Durham's supercomputer uses a cluster of 64 two-processor SunBlade 1000 machines, along with a 24-processor SunFire 6800 server, supplied by Sun reseller Esteem Systems. The system features 112 gigabytes of RAM and 7 terabytes of data storage, more than enough to hold a copy of every book held by the British Library 10 times over.
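
A quick sanity check suggests where the headline 228 billion figure comes from: the 64 dual-processor blades plus the 24-way server give 152 CPUs, and if we assume each processor runs at 750MHz and retires two floating-point operations per clock cycle (our assumptions, not figures supplied by Durham or Sun), the theoretical peak works out to exactly 228 billion operations per second.

    # Back-of-the-envelope check on the quoted 228 billion flop/s peak.
    # The 750MHz clock and two floating-point ops per cycle are assumptions
    # on our part, not figures supplied by Durham or Sun.
    blade_cpus = 64 * 2                       # 64 two-processor SunBlade 1000s
    sunfire_cpus = 24                         # one 24-processor SunFire 6800
    total_cpus = blade_cpus + sunfire_cpus    # 152 processors in all
    clock_hz = 750e6                          # assumed clock speed per processor
    flops_per_cycle = 2                       # assumed FP ops per clock
    peak = total_cpus * clock_hz * flops_per_cycle
    print(f"theoretical peak: {peak / 1e9:.0f} billion flop/s")   # prints 228

Sustained performance on real simulations would, of course, come in well below any such theoretical peak.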

The kit that made up Deep Thought isn't recorded. ®

Update

The original press release issued by Durham said the supercomputer works at 10 billion arithmetic calculations per second, while elsewhere on its site Durham put the figure at 200 billion. We have now confirmed that the figure in the press release was a typo.

Claims that Durham's machine is the largest in use by a British University have been questioned by Re

External links
Durham's Department of Computational Cosmology

Related stories

RIP: Douglas Adams
NASA's new supercomp sits on a desktop
AMD cluster sneaks in Supercomputer top 500 list
Sun's Oz super computer goes horribly pear shaped
$10m super'puter to crunch genetic code
