The world’s largest computing grid has been formally inaugurated at Cern, the Geneva particle physics research centre that is home to the Large Hadron Collider [LHC] experiment.
The LHC computing grid has been developed over the past five years as the third key element of the project to study the sub-atomic particles created in the instant following the Big Bang.
Accelerators that smash particles together at near light-speed, detectors that search through the debris from those collisions, and the computing grid itself will together be used to understand the nature of the universe.
“The LHC computing grid forms an unprecedented computational and storage device,” said Cern director general Robert Aymar.
The grid connects 140 data centres in 33 countries into a single processing entity for use by more than 7,000 scientists worldwide, who will be analysing the estimated 15 petabytes [15 million gigabytes] of data produced by the LHC every year.
The grid was due to start capturing and processing data from the LHC, but live use has been delayed by a helium leak in the giant underground ring beneath Geneva, which forced the experiment to shut down temporarily on 10 September. The LHC is expected to restart in spring 2009.
In the meantime, Cern is running tests and simulations on the grid, processing some 50,000 jobs at any one time.
“The worldwide LHC computing grid is a vital pillar of the LHC project,” said Jos Engelen, chief scientific officer for the LHC project.
“It is an absolute necessity for analysis of the LHC data. It is the result of a silent revolution in large-scale computing over the last five years.”
Much of the research and development that has gone into creating the grid, involving HP, Intel and Oracle, is likely to lead to new developments in business technology in the coming years.
“The significance of the LHC computing grid goes well beyond the LHC,” said Ian Bird, leader of the grid project. “Many other researchers are already benefiting from the lessons learned here.”