LHC - Computing Resources

Data produced by the LHC, together with LHC-related simulations, was estimated at approximately 15 petabytes per year; the maximum throughput while the machine is running was not stated.
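
For scale, the quoted annual volume can be turned into an average data rate. The back-of-the-envelope calculation below (Python) is only illustrative: it assumes 1 PB = 10^15 bytes and says nothing about the peak throughput during data taking, which would be considerably higher than the year-round average.

```python
# Illustrative conversion of ~15 PB/year into an average sustained rate.
# Assumes 1 PB = 10**15 bytes; actual traffic is bursty, so peak rates
# while the machine is running exceed this average.
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # ~3.16e7 seconds
annual_volume_bytes = 15e15                  # ~15 petabytes per year

average_rate_bytes_per_s = annual_volume_bytes / SECONDS_PER_YEAR
print(f"Average rate: {average_rate_bytes_per_s / 1e6:.0f} MB/s")  # ~475 MB/s
```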

The LHC Computing Grid was constructed to handle the massive volume of data produced. It incorporated both private fiber-optic links and existing high-speed portions of the public Internet, enabling data to be transferred from CERN to academic institutions around the world.

In the United States, the Open Science Grid serves as the primary infrastructure; it also operates as part of an interoperable federation with the LHC Computing Grid.

The distributed computing project LHC@home was started to support the construction and calibration of the LHC. The project uses the BOINC platform, enabling anybody with an Internet connection and a computer running Mac OS X, Windows, or Linux to donate their computer's idle time to simulate how particles travel in the tunnel. With this information, scientists can determine how the magnets should be calibrated to obtain the most stable "orbit" of the beams in the ring. In August 2011, a second application, Test4Theory, went live; it runs simulations against which actual test data are compared in order to determine confidence levels for the results.
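
The kind of tracking involved can be illustrated with a toy model. The sketch below is not the project's actual simulation code; it uses a hypothetical, greatly simplified magnet lattice with made-up strength values, and only shows the basic idea: push a particle's transverse offset through repeated magnet cells and check whether the oscillation stays bounded (a stable orbit) or grows until the particle would hit the beam pipe.

```python
import numpy as np

def fodo_cell(k, length=1.0):
    """One simplified focusing-drift-defocusing-drift magnet cell (thin lenses)."""
    focus = np.array([[1.0, 0.0], [-k, 1.0]])      # thin focusing quadrupole
    defocus = np.array([[1.0, 0.0], [k, 1.0]])     # thin defocusing quadrupole
    drift = np.array([[1.0, length], [0.0, 1.0]])  # field-free drift section
    return drift @ defocus @ drift @ focus

def max_excursion(k, turns=1000, aperture=0.05):
    """Track a 1 mm initial offset; stop early if it would hit the beam pipe."""
    cell = fodo_cell(k)
    state = np.array([1e-3, 0.0])   # (transverse position in m, angle in rad)
    peak = 0.0
    for _ in range(turns):
        state = cell @ state
        peak = max(peak, abs(state[0]))
        if peak > aperture:         # excursion larger than the assumed pipe radius
            break
    return peak

# Scanning the (hypothetical) quadrupole strength shows which settings keep the
# orbit bounded and which let the oscillation grow, which is the essence of
# finding magnet settings that give the most stable orbit.
for k in (0.5, 1.0, 1.5, 4.0):
    print(f"k = {k:.1f}: max excursion {max_excursion(k):.3e} m")
```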
