Washington, Oct 4: The world's largest computing grid has been launched to tackle mankind's biggest data challenge from the Earth's most powerful accelerator, the Large Hadron Collider (LHC). Known as the Worldwide LHC Computing Grid (WLCG), it combines the power of more than 140 computer centers in 33 countries to analyze and manage more than 15 million gigabytes of LHC data every year.
According to Glen Crawford of the High Energy Physics program in DOE's (Department of Energy's) Office of Science, the vast distributed computing system will allow 7,000 scientists around the world to analyze LHC data.
U.S. contributions to the Worldwide LHC Computing Grid are coordinated through the Open Science Grid, a national computing infrastructure for science.
The Open Science Grid not only contributes computing power for LHC data needs, but also for projects in many other scientific fields including biology, nanotechnology, medicine and climate science. "Particle physics projects such as the LHC have been a driving force for the development of worldwide computing grids," said Ed Seidel, director of the National Science Foundation's Office of Cyberinfrastructure.
"The benefits from these grids are now being reaped in areas as diverse as mathematical modeling and drug discovery," he added.
"Open Science Grid members have put an incredible amount of time and effort into developing a nationwide computing system that is already at work supporting America's 1,200 LHC physicists and their colleagues from other sciences," said Open Science Grid Executive Director Ruth Pordes of DOE's Fermi National Accelerator Laboratory.
Dedicated optical fiber networks distribute LHC data from CERN in Geneva, Switzerland, to eleven major "Tier-1" computer centers in Europe, North America and Asia, including those at DOE's Brookhaven National Laboratory in New York and Fermi National Accelerator Laboratory in Illinois.
From these, data is dispatched to more than 140 "Tier-2" centers around the world, including twelve in the United States.
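The tiered fan-out described above can be sketched as a toy routing table. This is an illustrative model only, assuming a simple one-to-many mapping at each tier; the Tier-2 site names are hypothetical placeholders, not the real WLCG topology, and the actual grid software works nothing like this.

```python
# Toy sketch of the WLCG tiered distribution model described in the article:
# data flows from the Tier-0 source (CERN) to Tier-1 centers, and from
# each Tier-1 to its attached Tier-2 centers.

TIER0 = "CERN"

# Two of the eleven Tier-1 centers named in the article; the Tier-2
# names below are hypothetical placeholders.
TIER1 = {
    "Brookhaven": ["US-T2-a", "US-T2-b"],
    "Fermilab":   ["US-T2-c", "US-T2-d"],
}

def route(dataset: str) -> list[str]:
    """Return the delivery hops that fan a dataset out from Tier-0
    to every Tier-1, then from each Tier-1 to its Tier-2 centers."""
    hops = [f"{TIER0} -> {t1}" for t1 in TIER1]
    hops += [f"{t1} -> {t2}" for t1, t2s in TIER1.items() for t2 in t2s]
    return hops

print(route("lhc-run-sample"))
```

In the real grid the Tier-1 centers also archive the data, and Tier-2 centers pull what their local analyses need rather than receiving a full broadcast.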
"Our ability to manage data at this scale is the product of several years of intense testing," said Ian Bird, leader of the Worldwide LHC Computing Grid project.
"When the LHC starts running at full speed, it will produce enough data to fill about six CDs per second," said Michael Ernst, director of Brookhaven National Laboratory's Tier-1 Computing Center.
"We've spent years ramping up to this point, and now, we're excited to help uncover some of the numerous secrets nature is still hiding from us," he added. Physicists in the U.S. and around the world will sift through the LHC data torrent in search of tiny signals that will lead to discoveries about the nature of the physical universe.
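The two figures quoted in the article can be sanity-checked with a quick back-of-the-envelope calculation. Assuming a standard 700 MB CD (an assumption; the article does not state a capacity), "six CDs per second" is a peak rate of about 4.2 GB/s, while 15 million gigabytes spread over a full year averages out to far less, since the accelerator does not run continuously and events are heavily filtered before storage.

```python
# Back-of-the-envelope check of the data figures quoted in the article.
# CD_MB is an assumed capacity for a standard data CD.

CD_MB = 700
peak_rate_mb_s = 6 * CD_MB            # "six CDs per second" at full speed
                                      # -> 4200 MB/s, i.e. ~4.2 GB/s

SECONDS_PER_YEAR = 365 * 24 * 3600
yearly_gb = 15_000_000                # "more than 15 million gigabytes" a year
avg_rate_mb_s = yearly_gb * 1000 / SECONDS_PER_YEAR

print(f"peak: {peak_rate_mb_s} MB/s, yearly average: {avg_rate_mb_s:.0f} MB/s")
# peak: 4200 MB/s, yearly average: 476 MB/s
```

The order-of-magnitude gap between the peak and the yearly average is expected: the raw detector output is reduced by trigger systems before it ever reaches the grid.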