Dynamical loss functions for Machine Learning

ORAL

Abstract

Current deep learning approaches rely on a diverse range of architectures, largely arrived at through trial-and-error design. This has spurred great interest in improving the theoretical understanding of machine learning. In particular, the structure of the loss-function landscape and the way it affects algorithm performance have recently attracted attention. Loss functions penalize incorrect predictions, and work has largely focused on optimization algorithms (e.g. stochastic gradient descent) within the landscapes that these loss functions define. We take a different approach and explore new loss functions. In particular, we study the effect of dynamical loss functions, in which the weight on each training example changes during training. Preliminary results show that this new approach can outperform static loss functions in particular cases.
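The core idea above — a loss whose per-example weights evolve during training, so the landscape itself shifts under the optimizer — can be sketched in a few lines. The following is an illustrative NumPy toy, not the authors' method: the sinusoidal schedule, the class-wise (rather than fully per-example) weighting, and all hyperparameters are assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

w, b, lr = np.zeros(2), 0.0, 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(200):
    # Dynamical loss: the weight on each example oscillates in time,
    # alternately emphasizing one class and then the other, so the loss
    # landscape changes during training.  The sinusoidal schedule is a
    # hypothetical choice for illustration only.
    phase = 2.0 * np.pi * step / 50.0
    example_weight = np.where(y == 0,
                              1.0 + 0.5 * np.sin(phase),
                              1.0 - 0.5 * np.sin(phase))
    p = sigmoid(X @ w + b)
    # Gradient of the weighted cross-entropy loss.
    g = example_weight * (p - y)
    w -= lr * (X.T @ g) / len(y)
    b -= lr * g.mean()

acc = ((sigmoid(X @ w + b) > 0.5) == y).mean()
```

A static loss corresponds to holding `example_weight` fixed at 1; the dynamical variant simply modulates those weights along a schedule while the same gradient-descent update is applied.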

Presenters

  • Miguel Ruiz Garcia

    University of Pennsylvania

Authors

  • Miguel Ruiz Garcia

    University of Pennsylvania

  • Ge Zhang

    University of Pennsylvania

  • Samuel Schoenholz

    Google Brain

  • Andrea Jo-Wei Liu

    Department of Physics and Astronomy, University of Pennsylvania