FM4NPP: Foundation Model for Nuclear and Particle Physics
ORAL · Invited
Abstract
The advent of large language models (LLMs) and foundation models has transformed the landscape of artificial intelligence. Unlike traditional AI methods that depend heavily on hand-crafted rules and heuristics, these models harness massive datasets and self-supervised learning to develop general-purpose representations. This paradigm enables them to adapt effectively to a wide range of downstream tasks with minimal labeled data. In this talk, we present our ongoing work on developing a foundation model for nuclear and particle physics (FM4NPP). Our approach trains on sparse particle detector data using self-supervised techniques, without the need for manual annotations or labels. The model is designed to exhibit neural scaling behavior, where increases in model size and data volume translate into improved performance. We will demonstrate the model's versatility by applying it to downstream tasks such as particle tracking. Early results suggest that FM4NPP has the potential to outperform existing methods on multiple downstream tasks.
Presenters
Yihui Ren
Brookhaven National Laboratory
Authors
Yihui Ren
Brookhaven National Laboratory