
The Large Hadron Collider's mega-pic churn

If you can't destroy the world, drown it in data


The Large Hadron Collider has been operating for a few months now, and it hasn’t ripped apart the space-time continuum – not where I live, anyway, and that’s mostly all I care about. Of course, it could be that it’s still early, and that the cumulative effects of accelerating particles really fast could still spell the end of everything. Until that happens, the LHC is generating enough data to keep scientists busy from now until doomsday (unless doomsday is in the next couple of years).

A recent story from Forbes tech correspondent Lee Gomes brought home the scale of the LHC and the storage challenge. The 150 million sensors each take 40 million pictures per second – which results in what anyone will admit is a fairly large amount of data. In terms of pictures per second, we’re talking six thousand trillion: a six with 15 zeros, or 6x10^15. That’s only slightly more pictures than my mom takes at a family reunion using her 15-year-old 35mm camera – and far less annoying.
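For the sceptical, the arithmetic checks out. Here's a back-of-envelope sketch using the figures quoted above (the article's numbers, not official CERN specs):

```python
# Figures as quoted in the article
sensors = 150_000_000                  # 150 million sensors
pics_per_sensor_per_sec = 40_000_000   # 40 million pictures per second, each

# Total pictures per second across all sensors
total_pics_per_sec = sensors * pics_per_sensor_per_sec

print(f"{total_pics_per_sec:.0e} pictures/sec")  # 6e+15 – six thousand trillion
```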

According to the story, most of this data is just uninteresting noise. But it all has to be sifted through to figure out which bits are worth a closer look. The total amount of ‘good stuff’ should amount to around 15 petabytes per year.

To crunch through this treasure trove of data, the WLCG (Worldwide LHC Computing Grid) uses systems at more than 130 sites across the world, totaling more than 100,000 processors. Data from collider runs is sent to Tier 1 sites at a rate of 4GB/sec, where it is archived to tape for future analysis.
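To put those two numbers together, here's a rough illustration (my own back-of-envelope, not a CERN figure) of how long it would take to move a year's worth of the filtered 15PB at the quoted 4GB/sec:

```python
# Rough illustration using the article's figures (decimal units assumed)
good_stuff_per_year = 15e15   # 15 petabytes, in bytes
transfer_rate = 4e9           # 4 GB/sec, in bytes/sec

seconds = good_stuff_per_year / transfer_rate
days = seconds / 86400        # 86,400 seconds per day

print(f"~{days:.1f} days of continuous transfer")  # roughly 43 days
```

Of course, the 4GB/sec feed carries the raw, unfiltered stream, so the real pipes are doing rather more work than that.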

These same sites also distribute data out to secondary sites as needed to feed their research appetites. There are some interesting videos on the CERN site discussing the grid and the challenge of handling LHC data.

So far the LHC has yet to find the Higgs boson or any of the other particles and forces predicted by theoretical physics, but that’s the cool thing about having a Large Hadron Collider – it means you can finally test how closely reality conforms to theory.

If they do manage to figure out the true nature of the universe by recreating conditions that existed at the moment after the Big Bang, you can be sure it will be covered in The Reg… unless there’s some late-breaking scandal involving salacious text messages and nude starlets. In that case, we’ll cover the universe thing a few days later. ®


