Collider probes universe's mysteries at the speed of light

Worldwide computer grid helps scientists make sense of data coming from collider experiments

The Large Hadron Collider. (Photo credit: CERN.)

With the world's biggest physics experiment ready to fire up today, scientists from around the world are hoping to find answers to a question that has haunted mankind for centuries: how was the universe created?

The Large Hadron Collider (LHC), which has been under construction for 20 years, will shoot its first beam of protons around a 17-mile, vacuum-sealed loop at a facility that sits astride the Franco-Swiss border. The test run of the largest, most powerful particle accelerator in the world is a precursor to the day when scientists will accelerate two particle beams toward each other at 99.9 percent of the speed of light.
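For a sense of what 99.9 percent of the speed of light means for the protons involved, the Lorentz factor from special relativity is the standard yardstick. The short Python sketch below is illustrative only, not code from the LHC project, and computes the factor for the beam speed quoted above:

```python
import math

# Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2): how much a particle's
# energy and effective mass grow, relative to rest, at a given speed.
beta = 0.999  # beam speed as a fraction of c, as quoted in the article
gamma = 1 / math.sqrt(1 - beta ** 2)

print(f"Lorentz factor at {beta}c: {gamma:.1f}")
# Prints ~22.4: each proton carries roughly 22 times its rest energy.
```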

Smashing the beams together will create showers of new particles that should recreate the conditions of the universe just moments after its birth.

Wednesday's test run is a critical milestone in getting to that ultimate test. And a worldwide grid of servers and desktops will help the scientific team make sense of the information that they expect will come pouring in.

"This will move the limit of our understanding of the universe," said Ruth Pordes, executive director of the Open Science Grid, which was created in 2005 to support the LHC project. "I'm very excited about the turning on of the accelerator. Over the next two years, our grids will be used by thousands of physicists at LHC to make new scientific discoveries. That's what it's all for."

Pordes noted that the US portion of the global grid is a computational and data storage infrastructure made up of more than 25,000 computers and 43,000 CPUs. The mostly Linux-based machines are linked into the grid from universities, the US Department of Energy, the National Science Foundation and software development groups. Pordes also said the US grid offers up about 300,000 compute hours a day, with 70% of it going to the particle collider project.
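Taken together, those figures imply a large allocation in absolute terms. A quick back-of-the-envelope check follows; this is purely illustrative arithmetic, not Open Science Grid code:

```python
# Figures quoted above: ~300,000 compute hours offered per day,
# with about 70% going to the particle collider project.
daily_compute_hours = 300_000
lhc_share = 0.70

lhc_hours = daily_compute_hours * lhc_share
print(f"LHC allocation: {lhc_hours:,.0f} compute hours per day")  # 210,000
```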

Harvey Newman, a physics professor at the California Institute of Technology, told Computerworld that there are about 30,000 servers and more than 100,000 cores around the world hooked into grids that support the LHC project.

"The distributed computing model is essential to doing the computing, storage and hosting of the many Petabytes of data from the experiments," said Newman. "Coordinating data distribution, processing and analysis of the data collaboratively by a worldwide community of scientists working on the LHC are key to the physics discoveries. Only a worldwide effort could provide the resources needed."

