LHC - Computing Resources

Data produced by the LHC, together with LHC-related simulation, was estimated at approximately 15 petabytes per year; the peak throughput while running was not stated.
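For scale, the quoted annual volume implies an average sustained rate of roughly half a gigabyte per second. A quick back-of-the-envelope calculation (assuming a uniform rate over a full calendar year, which real running does not have):

```python
# Back-of-the-envelope: average data rate implied by ~15 PB/year.
# Assumes a uniform rate over a calendar year; actual LHC running is bursty.
PETABYTE = 10**15                      # bytes (SI petabyte)
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

annual_bytes = 15 * PETABYTE
avg_rate_mb_s = annual_bytes / SECONDS_PER_YEAR / 10**6  # megabytes per second

print(f"Average rate: {avg_rate_mb_s:.0f} MB/s")  # ≈ 475 MB/s
```

The peak rate during data taking would of course be higher, since the accelerator does not run continuously year-round.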

The LHC Computing Grid was constructed to handle the massive amounts of data produced. It incorporated both private fiber optic cable links and existing high-speed portions of the public Internet, enabling data transfer from CERN to academic institutions around the world.

In the United States, the Open Science Grid serves as the primary infrastructure and forms an interoperable federation with the LHC Computing Grid.

The distributed computing project LHC@home was started to support the construction and calibration of the LHC. The project uses the BOINC platform, enabling anyone with an Internet connection and a computer running Mac OS X, Windows, or Linux to donate their computer's idle time to simulating how particles travel in the tunnel. With this information, scientists can determine how the magnets should be calibrated to obtain the most stable "orbit" of the beams in the ring. In August 2011, a second application (Test4Theory) went live, performing simulations against which actual test data can be compared in order to determine confidence levels for the results.
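The orbit-stability question LHC@home explores can be illustrated with the standard transfer-matrix formalism from accelerator physics: a particle's transverse position and angle are propagated turn by turn through a one-turn matrix, and the motion stays bounded only when the matrix trace satisfies |trace| < 2. A minimal sketch (the matrix values below are illustrative, not actual LHC optics, and this is not the LHC@home code itself):

```python
import math

# One-turn transfer matrix for an idealized periodic lattice (illustrative
# values, not real LHC optics). For a symplectic 2x2 one-turn map, transverse
# motion is stable (particles stay bounded turn after turn) iff |trace| < 2.
def one_turn_matrix(phase_advance):
    c, s = math.cos(phase_advance), math.sin(phase_advance)
    # A pure rotation: the idealized one-turn map at unit beta function.
    return [[c, s], [-s, c]]

def track(matrix, x, xp, turns):
    """Propagate (position, angle) through `turns` revolutions."""
    for _ in range(turns):
        x, xp = (matrix[0][0] * x + matrix[0][1] * xp,
                 matrix[1][0] * x + matrix[1][1] * xp)
    return x, xp

def is_stable(matrix):
    return abs(matrix[0][0] + matrix[1][1]) < 2

M = one_turn_matrix(phase_advance=0.3)   # |trace| = 2|cos 0.3| < 2, so stable
x, xp = track(M, x=1e-3, xp=0.0, turns=10_000)
print(is_stable(M), abs(x) < 2e-3)       # True True: bounded oscillation
```

The volunteer simulations do the same thing at vastly higher fidelity, tracking particles through the field of every individual magnet so that small calibration errors accumulating over millions of turns can be detected.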
