CERN BOFH needs a bigger storage array

Networking the secrets of the universe

Preparations are well underway at CERN to commission the world's largest particle accelerator. Advances in networking technology have allowed the particle physics lab to bring in scientists from around the world to analyse the data it will generate.

When it is activated in May 2008, it is hoped the Large Hadron Collider (LHC) will uncover evidence of the elusive Higgs boson, a particle theorised to be a fundamental building block of matter.

Observing signs of the particle would mark a significant milestone in formulating a Grand Unified Theory that explains the four fundamental forces in nature: electromagnetism, the strong nuclear force, the weak force, and gravity. The previous collider wasn't powerful enough to peer into the energy space where Higgs boson particles are theorised to exist.

The LHC is being built 100 metres underground in a circular tunnel that runs from Geneva airport out under the Jura Mountains and back, giving the collider a circumference of 27km. During commissioning, parts of the accelerator were cooled to -271°C, less than two degrees above absolute zero and colder than outer space.

Supercooled superconducting magnets guide beams of protons travelling at close to the speed of light around the ring, where two beams moving in opposite directions are brought into collision around 40 million times a second.

The detectors have millions of output channels, generating data at around one million gigabytes a second. That's far beyond the capability of existing technology to capture and store, so filtering systems have been put in place to whittle the data down into more manageable chunks. In practice, data is recorded at around 100MB per second.
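
Those figures imply a reduction of roughly ten million to one between what the detectors produce and what ends up on disk. A rough back-of-envelope sketch in Python, using the approximate rates quoted above rather than official CERN specifications:

# Rough arithmetic on the LHC data-reduction problem, based on the
# approximate figures quoted in this article (not official specs).

RAW_RATE_GB_PER_S = 1_000_000      # ~one million gigabytes per second off the detectors
RECORDED_RATE_MB_PER_S = 100       # ~100MB per second actually written to storage

raw_mb_per_s = RAW_RATE_GB_PER_S * 1_000          # convert GB/s to MB/s
reduction_factor = raw_mb_per_s / RECORDED_RATE_MB_PER_S

print(f"Raw detector output : {raw_mb_per_s:,.0f} MB/s")
print(f"Recorded output     : {RECORDED_RATE_MB_PER_S} MB/s")
print(f"Filtering must cut the stream by a factor of about {reduction_factor:,.0f}")
# => a reduction of around ten million to one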

Enter the Matrix

Even after the information is filtered to concentrate on interesting events, the collider generates a phenomenal volume of data: around 15 million gigabytes a year. This data is distributed from CERN to partner laboratories, from Academia Sinica in Taipei to Fermilab in Chicago, via fibre optic links.

This network (with CERN at its hub) is the backbone of the Worldwide LHC Computing Grid, the largest scientific grid of its kind. At its core is a 10Gbps network that uses kit supplied by HP and Force10. The network feeds data to racks of blade servers at CERN and out through edge switches to data centres at its partners.
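
As a sanity check on those numbers, a year's worth of filtered data averages out to well under the capacity of a single 10Gbps core link, though real transfers are far burstier than a simple average suggests. Another rough Python sketch using the figures quoted in this article:

# Rough sketch: does ~15 million GB/year fit down a 10Gbps link?
# Figures are the approximate ones quoted in the article, not official specs.

ANNUAL_VOLUME_GB = 15_000_000          # ~15 million gigabytes per year
SECONDS_PER_YEAR = 365 * 24 * 3600     # 31,536,000 seconds

avg_rate_mb_per_s = ANNUAL_VOLUME_GB * 1_000 / SECONDS_PER_YEAR
avg_rate_gbps = avg_rate_mb_per_s * 8 / 1_000   # MB/s -> Mbit/s -> Gbit/s

print(f"Average data rate : {avg_rate_mb_per_s:.0f} MB/s ({avg_rate_gbps:.1f} Gbps)")
print(f"Share of one 10Gbps core link : {avg_rate_gbps / 10:.0%}")
# => roughly 475 MB/s, or about 3.8 Gbps -- under half of one 10Gbps link on average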
