Simulating outputs from the High-Luminosity Large Hadron Collider for improved ATLAS trigger algorithm testing
POSTER
Abstract
The Large Hadron Collider (LHC) will undergo upgrades beginning in 2026 to increase its luminosity tenfold, becoming the High-Luminosity LHC. This translates to an average pileup (⟨μ⟩) of 200 events within the ATLAS detector for every proton bunch crossing. Through careful event selection using offline-inspired algorithms implemented on a Global Trigger, the upgraded Trigger and Data Acquisition system will reduce the 40 MHz stream of data arriving from the detectors, storing only potentially interesting events to memory at a rate of 1 MHz. One such interesting event is the di-Higgs decay, which has yet to be measured due to its small cross section. Running an online topological clustering algorithm on full-granularity calorimeter information will help reconstruct, in real time, jets that model the trajectories of Higgs decay products. Such a clustering algorithm is currently being developed by firmware engineers, but it requires detailed validation on ⟨μ⟩ = 200 simulated data before implementation in the upgraded Trigger and Data Acquisition system. This poster describes how simulated data are made to resemble detector readout for firmware validation, as well as performance studies of online clustering algorithms using simulated events such as a di-Higgs decay.
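The seeded-growth idea behind topological clustering can be sketched in a few lines. The toy below works on a 2D grid of cell signal-to-noise significances with the familiar 4-2-0 thresholds; the grid, function name, and thresholds are illustrative assumptions, and the real ATLAS algorithm operates on 3D calorimeter cells with cluster merging and splitting that this sketch omits.

```python
from collections import deque
import numpy as np

def topo_cluster(signif, t_seed=4.0, t_grow=2.0, t_cell=0.0):
    """Toy seeded topological clustering on a 2D grid of cell
    signal-to-noise significances (4-2-0 scheme). Returns an
    integer label map with -1 marking unclustered cells."""
    labels = np.full(signif.shape, -1, dtype=int)
    # Seed cells: significance above the seed threshold, strongest first,
    # so a stronger seed claims contested cells (a simplification).
    seeds = sorted(map(tuple, np.argwhere(np.abs(signif) > t_seed)),
                   key=lambda c: -abs(signif[c]))
    for cid, seed in enumerate(seeds):
        if labels[seed] != -1:          # already absorbed by another cluster
            continue
        labels[seed] = cid
        frontier = deque([seed])
        while frontier:
            i, j = frontier.popleft()
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                n = (i + di, j + dj)
                if not (0 <= n[0] < signif.shape[0]
                        and 0 <= n[1] < signif.shape[1]):
                    continue
                if labels[n] != -1 or abs(signif[n]) <= t_cell:
                    continue
                # Cells above the growth threshold keep expanding the
                # cluster; perimeter cells are absorbed but not grown.
                labels[n] = cid
                if abs(signif[n]) > t_grow:
                    frontier.append(n)
    return labels

# Example: a 5x5 grid with one seed cell and a tail of lower-significance
# neighbors; all connected cells above threshold join cluster 0.
sig = np.zeros((5, 5))
sig[2, 2], sig[2, 3], sig[2, 4] = 5.0, 3.0, 1.0
labels = topo_cluster(sig)
print(labels[2, 2], labels[2, 3], labels[2, 4])  # → 0 0 0
```

The breadth-first growth is what makes the clusters "topological": membership is decided by connectivity through significant cells, not by a fixed cone, which is why it copes well with overlapping pileup deposits.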
Presenters
- Ryan Stuve, University of Oregon
Authors
- Ryan Stuve, University of Oregon