The Large Hadron Collider's mega-pic churn

If you can't destroy the world, drown it in data

The Large Hadron Collider has been operating for a few months now, and it hasn’t ripped apart the space-time continuum – not where I live, anyway, and that’s mostly all I care about. Of course, it may just be early days, and the cumulative effects of accelerating particles really fast could yet spell the end of everything. Until that happens, the LHC is generating enough data to keep scientists busy from now until doomsday (unless doomsday turns out to be in the next couple of years).

A recent story from Forbes tech correspondent Lee Gomes brought home the scale of the LHC and the storage challenge. The 150 million sensors each take 40 million pictures per second – which results in what anyone will admit is a fairly large amount of data. In terms of pictures per second, we’re talking six thousand trillion: a six with 15 zeros, or 6x10^15. That’s only slightly more pictures than my mom takes at a family reunion using her 15-year-old 35mm camera – and far less annoying.
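
If you want to check that arithmetic, here’s a quick back-of-envelope in Python – just a sketch, and the only inputs are the two figures quoted above:

# Back-of-envelope check on the picture rate quoted above.
# The two inputs are the figures from the Forbes piece; the rest is arithmetic.
sensors = 150_000_000            # detector sensors
frames_per_second = 40_000_000   # "pictures" each sensor takes every second

pictures_per_second = sensors * frames_per_second
print(f"{pictures_per_second:.0e} pictures per second")   # 6e+15 – six thousand trillion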

According to the story, most of this data is just uninteresting noise. But it all has to be sifted through to figure out which bits are worth a closer look. The total amount of ‘good stuff’ should amount to around 15 petabytes per year.

To crunch through this treasure trove of data, the WLCG (Worldwide LHC Computing Grid) uses systems at more than 130 sites across the world, totaling more than 100,000 processors. Data from collider runs is sent to Tier 1 sites at a rate of 4GB/sec, where it is archived to tape for future analysis.
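
To get a feel for what 4GB/sec means over time, here’s a rough Python sketch – with the caveats that it assumes decimal gigabytes and a link running flat out around the clock, which real collider runs don’t:

# Rough scale of a sustained 4GB/sec feed.
# Assumes GB = 10^9 bytes and round-the-clock transfer; real figures will be lower.
rate_bytes_per_sec = 4 * 10**9
bytes_per_petabyte = 10**15

pb_per_day = rate_bytes_per_sec * 86_400 / bytes_per_petabyte
pb_per_year = pb_per_day * 365

print(f"~{pb_per_day:.2f} PB per day, ~{pb_per_year:.0f} PB per year at full tilt")
# ~0.35 PB per day, ~126 PB per year

Treat those as ceilings rather than actuals – the grid isn’t saturated year-round.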

These same sites also push data out to secondary sites as needed to satisfy their research appetites. There are some interesting videos on the CERN site discussing the grid and the challenge of handling LHC data.

The LHC has yet to find the Higgs boson or any of the other particles and forces predicted by theoretical physics, but that’s the cool thing about having a Large Hadron Collider – you can finally test how closely reality conforms to theory.

If they do manage to figure out the true nature of the universe by recreating conditions that existed at the moment after the Big Bang, you can be sure it will be covered in The Reg… unless there’s some late-breaking scandal involving salacious text messages and nude starlets. In that case, we’ll cover the universe thing a few days later. ®
