NYU will be part of a National Science Foundation-backed coalition that will create next-generation cyberinfrastructure to support high-energy physics research.
New York University will be part of the Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP), a National Science Foundation-backed coalition that will create next-generation cyberinfrastructure to support high-energy physics research.
The institute will develop computing software and expertise to enable a new era of discovery at the Large Hadron Collider (LHC), the world’s most powerful particle accelerator, at CERN in Geneva, Switzerland.
“Our field is undergoing a transformation driven by new scientific challenges and the emergence of machine learning and data science methods,” says NYU’s Kyle Cranmer, a professor of physics and a faculty member at the university’s Center for Data Science, who will lead the analysis systems component of IRIS-HEP.
NYU is among 17 research universities that will be part of the institute, which will be backed by $25 million in NSF funding over the next five years and led by Princeton University. NYU will receive approximately $1.5 million to develop innovative machine learning-based algorithms, along with systems that enhance reproducibility and help the community interpret LHC data.
The LHC’s discovery of the Higgs boson particle in 2012 provided the last piece of what is known as the “Standard Model” of particle physics, a theory that describes the fundamental building blocks of nature and their interactions. The following year, Peter Higgs and François Englert received the Nobel Prize in Physics in recognition of their work in developing the theory of what is now known as the Higgs field, which gives elementary particles mass.
The launch of the institute coincides with a major upgrade to the LHC—the High-Luminosity Large Hadron Collider (HL-LHC) project—that will take place over the next eight years. The HL-LHC experiments will search for new particles and interactions, including candidates for dark matter, which is thought to make up most of the matter in the universe.
Overall, IRIS-HEP aims to drive innovations in data analysis and algorithms essential to handling the massive amounts of data generated by the HL-LHC.
“This huge increase in data is needed to find the extremely rare ‘needle in a haystack’ signals that could indicate the presence of new physics phenomena,” says Princeton University computational physicist Peter Elmer, the principal investigator for the institute and a CERN researcher. “But to fully explore this data, we need much more powerful software tools and algorithms. We also need to maximally exploit the evolving high-performance computing landscape and new tools like machine learning, in which computers study existing data sets to learn rules that they can apply to new data and new situations.”
“In the high-luminosity era, the LHC detectors will simultaneously record a very large number of overlapping particle collisions,” adds Elmer. “From among these events, we will want to pick out the most interesting ones for further study. Our ability to choose effectively depends entirely on the strength and sophistication of our computational resources; IRIS-HEP will ensure that we have the right tools for the job.”