The Large Hadron Collider's mega-pic churn

If you can't destroy the world, drown it in data

The Large Hadron Collider has been operating for a few months now, and it hasn’t ripped apart the space/time continuum – not where I live, anyway, and that’s mostly all I care about. Of course, it could be that it’s still early, and that the cumulative effects of accelerating particles really fast could still spell the end of everything. Until that happens, the LHC is generating enough data to keep scientists busy from now until doomsday (unless doomsday is in the next couple of years).

A recent story from Forbes tech correspondent Lee Gomes brought home the scale of the LHC and the storage challenge. The 150 million sensors each take 40 million pictures per second – which results in what anyone will admit is a fairly large amount of data. In terms of pictures per second, we’re talking six thousand trillion: a six with 15 zeros, or 6x10^15. That’s only slightly more pictures than my mom takes at a family reunion using her 15-year-old 35mm camera – and far less annoying.
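
If you fancy checking the arithmetic yourself, here’s a back-of-the-envelope sketch in Python. The sensor count and per-sensor rate are the figures quoted above; nothing else is assumed:

sensors = 150_000_000          # sensor elements across the detectors
frames_per_sec = 40_000_000    # readings per sensor per second

pictures_per_sec = sensors * frames_per_sec
print(f"{pictures_per_sec:.1e} pictures per second")  # 6.0e+15 – six thousand trillion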

According to the story, most of this data is just uninteresting noise. But it all has to be sifted through to figure out which bits are worth a closer look. The total haul of ‘good stuff’ should come to around 15 petabytes per year.
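
To give a flavour of the sifting problem – and this is a toy sketch, not CERN’s actual trigger logic, with a made-up 1-in-100,000 keep rate – the job boils down to running a cheap selection test over a torrent of events and keeping the rare survivors:

import random

def looks_interesting(event):
    # Hypothetical stand-in for the real selection criteria:
    # keep roughly one event in 100,000 and bin the rest as noise.
    return random.random() < 1e-5

events = (random.random() for _ in range(1_000_000))  # fake 'events'
kept = [e for e in events if looks_interesting(e)]
print(f"kept {len(kept)} of 1,000,000 events")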

To crunch through this treasure trove of data, the WLCG (Worldwide LHC Computing Grid) uses systems at more than 130 sites across the world, totaling more than 100,000 processors. Data from collider runs is sent to Tier 1 sites at a rate of 4GB/sec, where it is archived to tape for future analysis.
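
For a sense of scale, here’s what a sustained 4GB/sec feed works out to – a rough sketch only, since the collider doesn’t run flat out all year, so the archived total is far smaller than the naive annual figure:

rate_gb_per_sec = 4
seconds_per_day = 86_400

tb_per_day = rate_gb_per_sec * seconds_per_day / 1_000  # ~345.6 TB per day
pb_per_year = tb_per_day * 365 / 1_000                  # ~126 PB per year, if sustained
print(f"{tb_per_day:.1f} TB/day, {pb_per_year:.0f} PB/year if sustained")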

These same sites also distribute data to secondary sites as needed, feeding their research appetites. There are some interesting videos on the CERN site discussing the grid and the challenge of handling LHC data.

The LHC has yet to find the Higgs boson or the other particles and forces predicted by theoretical physics, but that’s the cool thing about having a Large Hadron Collider – it means you can finally test how closely reality conforms to theory.

If they do manage to figure out the true nature of the universe by recreating conditions that existed at the moment after the Big Bang, you can be sure it will be covered in The Reg… unless there’s some late-breaking scandal involving salacious text messages and nude starlets. In that case, we’ll cover the universe thing a few days later. ®
