Original URL: http://www.theregister.co.uk/2007/10/01/cern_netevents/
CERN BOFH needs a bigger storage array
Networking the secrets of the universe
Preparations are well underway at CERN to commission the world's largest particle accelerator. Advances in networking technology have allowed the particle physics lab to bring in scientists from around the world to analyse the data it will generate.
When it is switched on in May 2008, it's hoped the Large Hadron Collider (LHC) will uncover evidence of the elusive Higgs boson, a particle theorised as a fundamental building block of matter.
Observing signs of the particle would mark a significant milestone in formulating a Grand Unified Theory that explains the four fundamental forces in nature: electromagnetism, the strong nuclear force, the weak force, and gravity. The previous collider wasn't powerful enough to peer into the energy space where Higgs boson particles are theorised to exist.
The LHC is being built 100 metres underground in a circular tunnel that runs from Geneva airport out under the Jura Mountains and back, giving the collider a circumference of 27km. During commissioning, parts of the accelerator were cooled to -271°C, less than two degrees above absolute zero and colder than outer space.
Supercooled superconducting magnets in the LHC steer beams of protons travelling at close to the speed of light; two beams travelling in opposite directions cross and collide around 40 million times a second.
The detectors have millions of output channels, generating data at around one million gigabytes a second. That's far beyond the capability of existing technology to capture and store, so filtering systems have been put in place to cut this torrent down into more manageable chunks. In practice, data acquisition runs at around 100MB per second.
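A quick back-of-envelope check shows just how aggressive that filtering is. The figures (one million gigabytes a second in, 100MB per second out) are those quoted above; the arithmetic and variable names are ours:

```python
# Back-of-envelope check of the LHC filtering reduction factor,
# using the rates quoted in the article.

raw_rate_mb_per_s = 1_000_000 * 1_000  # ~1 million GB/s, expressed in MB/s
acquired_mb_per_s = 100                # ~100 MB/s actually written out

reduction_factor = raw_rate_mb_per_s / acquired_mb_per_s
print(f"Data reduction factor: {reduction_factor:,.0f}x")
# → Data reduction factor: 10,000,000x
```

In other words, only about one part in ten million of the raw detector output survives to be stored.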
Enter the Matrix
Even after the information is filtered to concentrate on interesting events, the collider generates a phenomenal volume of data: around 15 million gigabytes a year. This data is distributed from CERN to partner laboratories, from Academia Sinica in Taipei to Fermilab in Chicago, via fibre optic links.
This network (with CERN at its hub) is the backbone of the Worldwide LHC Computing Grid, the largest scientific grid of its kind. At its core is a 10Gbps network built on kit supplied by HP and Force10. The network feeds data to racks of blade servers at CERN and out through edge switches to data centres at its partners.
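Those figures hang together: a rough sketch (our arithmetic, using the article's numbers) shows both the acquisition stream and the average annual distribution rate fit within a 10Gbps core:

```python
# Sanity check: does a 10Gbps backbone cope with these volumes?
# The 100MB/s acquisition rate and 15 million GB/year figure are
# from the article; the calculation itself is ours.

SECONDS_PER_YEAR = 365 * 24 * 3600

acquisition_mb_s = 100                          # MB/s written out by the experiments
acquisition_gbps = acquisition_mb_s * 8 / 1000  # convert MB/s -> Gbit/s

annual_gb = 15_000_000                          # ~15 million GB distributed per year
avg_gbps = annual_gb * 8 / SECONDS_PER_YEAR     # average outbound rate in Gbit/s

print(f"Acquisition stream: {acquisition_gbps:.1f} Gbit/s")
print(f"Average distribution rate: {avg_gbps:.1f} Gbit/s")
# → Acquisition stream: 0.8 Gbit/s
# → Average distribution rate: 3.8 Gbit/s
```

So the raw acquisition stream is under a tenth of a single 10Gbps link, and even the full annual output, averaged over the year, consumes well under half of one — leaving headroom for the bursts and reprocessing traffic that real grids generate.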
The data from the LHC experiments will be distributed around the world. A primary backup will be recorded on tape at CERN. After initial processing, the data will be distributed to 11 Tier-1 centres: large computer centres with sufficient storage capacity. Smaller Tier-2 centres will handle specific analysis tasks.
Processing this data requires the power of around 100,000 desktop CPUs, a figure that has stayed roughly constant at CERN for around seven years: processor speed increases have kept pace with the growth in data that colliders generate. Advances in communication technology, meanwhile, have allowed CERN to bring partner organisations into its research.
David Foster, head of communications systems at CERN, runs the team behind this behemoth network. He's also responsible for a campus network that supports 2,500 staff at CERN and thousands of visiting scientists, as well as mobile and Wi-Fi infrastructures for the lab, and hundreds of kilometres of cabling in the underground installations at CERN.
"Changes in networking have enabled a rethink of our whole business model. We can have a global research network - rather than one that is centrally located thanks to advances in communications technology," he said.
Moving data onto storage and then off to be processed is a major headache, Foster explained. Tape is still the most effective way to store data, but it brings its own problems. "Tape is slow and not progressing particularly quickly. Solid state memory and solid state storage are important things to look at," Foster told delegates at the NetEvents conference in Malta last week. "Fast networks alone are not enough."
He said the LHC would seek to answer questions such as "where do particles get their mass?", and other fundamental questions of physics. The project is costing a cool $5bn, but it's money well spent, Foster argues.
"Quite apart from the possible spin-offs, if we fail to look into the fundamentals of the universe we lose so much. It's in human nature to be curious," Foster told El Reg. ®