Energy Frontier

Fermilab's Contribution to LHC Computing

The discovery of new physics at CMS depends on the successful processing and storage of an unprecedented amount of data. Each year, the detector will produce more than one petabyte – 1 million gigabytes – of raw data. The experiment will produce even larger amounts of simulated data, which is needed to understand the detector's performance and how signals of new physics will appear in the collected data. All of this data must be processed, stored, transmitted to physicists around the world and analyzed.

To cope with these demands, CMS and the other LHC experiments are pursuing a distributed computing solution. In this tiered system, the CERN computing center is responsible for first-pass processing of the collision events, writing the data to tape for permanent storage and then sending copies to seven Tier-1 sites, one of which is Fermilab. As a Tier-1 site, Fermilab will further reduce the new data, reprocess older data with improved software tools and archive the results.
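
The tier structure can be pictured as a simple data flow: one first pass at CERN (the Tier-0), followed by distribution to the Tier-1 centers for further reduction, reprocessing and archiving. The short Python sketch below is purely illustrative; all class names, dataset names and sizes are assumptions made for this example and do not correspond to actual CMS software.

# Illustrative sketch of the tiered data flow described above.
# Class names, site names and sizes are hypothetical; this is not CMS software.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Dataset:
    name: str
    size_tb: float            # approximate size in terabytes
    processed: bool = False   # has first-pass (Tier-0) processing been done?

@dataclass
class Tier1Site:
    name: str
    archive: List[Dataset] = field(default_factory=list)

    def receive(self, dataset: Dataset) -> None:
        # A Tier-1 center further reduces new data, reprocesses older data
        # with improved software, and archives the results.
        self.archive.append(dataset)
        print(f"{self.name}: archived {dataset.name} ({dataset.size_tb} TB)")

class Tier0:
    """CERN computing center: first-pass processing, tape archive, distribution."""
    def __init__(self, tier1_sites: List[Tier1Site]):
        self.tier1_sites = tier1_sites

    def process_and_distribute(self, dataset: Dataset) -> None:
        dataset.processed = True          # first-pass processing of collision events
        for site in self.tier1_sites:     # copies are sent to the Tier-1 centers
            site.receive(dataset)

# Fermilab is one of seven Tier-1 sites; the other six are omitted here.
fermilab = Tier1Site("Fermilab (US CMS Tier-1)")
Tier0([fermilab]).process_and_distribute(Dataset("raw-collision-run-001", size_tb=50.0))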

Fermilab is heavily involved in CMS computing, including core software, event simulation and data reconstruction. The laboratory provides software, services and networks, and will manage CMS data and analysis jobs.
