Challenges and Evolution of the LHC computing model for HL-LHC
This event is part of the HEE Seminar Series.
Over the past decade, experiments at the Large Hadron Collider have relied on a globally distributed computing system to enable data processing at a scale far beyond what was initially planned. The high-luminosity upgrade of the LHC will bring a large increase in both event rate and event complexity. To carry out an ambitious physics program through the 2030s, data sets at the exabyte scale will be collected and analyzed each year. Asking researchers to continue using today's data reduction and analysis methods would mean devoting considerable time and effort simply to shepherding data through a complex system. New tools and techniques aim to let HL-LHC researchers concentrate on the creative work of data analysis and interpretation itself. This talk describes challenges for the software and computing infrastructure of the HL-LHC experiments, stemming both from the HL-LHC physics program and from continued technology evolution. I will discuss ideas for addressing these challenges through evolution in both algorithms and facilities, some of which are ready for use by LHC researchers today.