
Transatlantic data-transfer gets a boost

New links will improve the flow of data from the Large Hadron Collider to US institutions.

Photo: the CERN Computer Centre. Credit: Maximilien Brice, Samuel Morier-Genoud, CERN

Scientists across the US will soon have access to new, ultra high-speed network links spanning the Atlantic Ocean.

A new project is currently underway to extend the US Department of Energy’s Energy Sciences Network, or ESnet, to London, Amsterdam and Geneva.

Although the project is designed to benefit data-intensive science throughout the US national laboratory complex, the heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider, the world’s largest and most powerful particle collider. The high capacity of the new connection will give US-based scientists enhanced access to data from the LHC and other experiments based in Europe by accelerating the exchange of data sets between institutions in the US and computing facilities in Europe.

“After the Higgs discovery, the next big LHC milestones will come in 2015,” says Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “And this network will be indispensable for the success of the [next LHC physics program].”

DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for US collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure, once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the US, tens of thousands of researchers from other disciplines will benefit as well. 

The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably to approximately 40 petabytes of RAW data per year, compared with 20 petabytes for all of the previous lower-energy collisions produced over the three years of the LHC’s first run between 2010 and 2012.
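To put that growth in perspective, here is a back-of-the-envelope sketch (an illustration, not a figure from the article) of the average sustained bandwidth implied by roughly 40 petabytes per year, assuming decimal petabytes and round-the-clock transfers:

```python
# Rough estimate of the average network rate implied by a yearly data volume.
# Assumes decimal petabytes (1 PB = 10^15 bytes) and continuous transfer.

PB_BYTES = 1e15
SECONDS_PER_YEAR = 365 * 24 * 3600

def average_rate_gbps(petabytes_per_year: float) -> float:
    """Average sustained rate, in gigabits per second, to move the volume in one year."""
    bits_per_year = petabytes_per_year * PB_BYTES * 8
    return bits_per_year / SECONDS_PER_YEAR / 1e9

# ~40 PB/year of raw data alone works out to roughly 10 Gb/s of sustained
# transfer, before counting derived data sets, replicas and reprocessing traffic.
print(f"{average_rate_gbps(40):.1f} Gb/s average for 40 PB/year")
```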

The cross-Atlantic connectivity during the first successful run for the LHC experiments was provided by the US LHCNet network, managed by the California Institute of Technology. In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean. 

Map: the new transatlantic network links. Credit: Brookhaven/Fermilab

An evolving data model

This upgrade coincides with a shift in the data model for LHC science. Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together. 

“Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” says physicist Michael Ernst of Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers. 

Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for US collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s US collaborators on the CMS experiment. These computing resources, dubbed “Tier 1” centers, have direct links to the LHC at Europe’s CERN laboratory (Tier 0).

The experts who run them will continue to serve scientists under the new structure. But instead of serving only as hubs for data storage and distribution among US-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will also be able to serve the data needs of the entire ATLAS and CMS collaborations throughout the world. Likewise, US Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.
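As a purely illustrative sketch of the difference, the snippet below contrasts the older, hierarchical access pattern with the more dynamic one described above; all site names, data sets and functions are hypothetical and are not part of any actual WLCG software.

```python
# Illustrative only: hypothetical site names and functions, not WLCG software.

# Older model: each site pulls data from its designated parent in the tier hierarchy.
TIER_HIERARCHY = {
    "Tier0-CERN": ["Tier1-BNL", "Tier1-FNAL"],
    "Tier1-BNL": ["Tier2-US-ATLAS-A", "Tier2-US-ATLAS-B"],
    "Tier1-FNAL": ["Tier2-US-CMS-A", "Tier2-US-CMS-B"],
}

def hierarchical_sources(site: str) -> list[str]:
    """Hierarchical model: only the designated parent center serves data to this site."""
    return [parent for parent, children in TIER_HIERARCHY.items() if site in children]

def federated_sources(site: str, replica_catalog: dict[str, list[str]], dataset: str) -> list[str]:
    """Dynamic model: any center holding a replica of the data set can serve it."""
    return [s for s in replica_catalog.get(dataset, []) if s != site]

replica_catalog = {"example-dataset-2015": ["Tier0-CERN", "Tier1-BNL", "Tier2-US-CMS-A"]}

print(hierarchical_sources("Tier2-US-CMS-A"))
# ['Tier1-FNAL']
print(federated_sources("Tier2-US-CMS-B", replica_catalog, "example-dataset-2015"))
# ['Tier0-CERN', 'Tier1-BNL', 'Tier2-US-CMS-A']
```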

“This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” says Fermilab’s Lothar Bauerdick, head of software and computing for the US CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.”

Ernst adds, “As centralized hubs for handling LHC data, our reliability, performance, and expertise have been in demand by the whole collaboration and now we will be better able to serve the scientists’ needs.”


Fermilab published a version of this article as a press release.

 
