Video: How CERN Handles Big Data from the LHC


This video looks at how Big Data is handled from the Large Hadron Collider (LHC).

The LHC produces millions of collisions every second in each detector, generating approximately one petabyte of data per second. None of today’s computing systems is capable of recording such rates, so sophisticated selection systems perform a first fast electronic pre-selection, passing only one out of 10,000 events. Tens of thousands of processor cores then select 1% of the remaining events for analysis. Even after this drastic data reduction, the four big experiments, ALICE, ATLAS, CMS and LHCb, together need to store over 25 petabytes per year. The LHC data are aggregated in the CERN Data Centre, where initial data reconstruction is performed and a copy is archived to long-term tape storage. Another copy is sent to several large data centres around the world.
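The two-stage reduction described above amounts to simple rate arithmetic. The sketch below uses the figures from the text (1 PB/s raw, 1-in-10,000 hardware pre-selection, 1% software selection); the constant and function names are illustrative, not CERN's:

```python
# Illustrative sketch of the LHC's two-stage trigger reduction.
# Figures come from the text above; names are ours, not CERN's.

RAW_RATE_PB_PER_S = 1.0        # ~1 petabyte/second produced by the detectors
HARDWARE_PASS = 1 / 10_000     # fast electronic pre-selection: 1 in 10,000 events
SOFTWARE_PASS = 0.01           # processor farm keeps 1% of the remainder

def retained_rate(raw_pb_per_s: float) -> float:
    """Data rate (in PB/s) surviving both trigger stages."""
    return raw_pb_per_s * HARDWARE_PASS * SOFTWARE_PASS

overall_factor = 1 / (HARDWARE_PASS * SOFTWARE_PASS)
rate = retained_rate(RAW_RATE_PB_PER_S)
print(f"Overall reduction factor: {overall_factor:,.0f}x")   # 1,000,000x
print(f"Surviving rate: {rate * 1e6:.1f} GB/s")              # 1 PB = 1e6 GB
```

At roughly 1 GB/s per second of running, a year of data-taking lands in the same tens-of-petabytes range the text quotes for the four experiments combined.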
