
Statistical Physics Meets Machine Learning

FOCUS · B42 · ID: 22631

Presentations

  • Effective Theory of Deep Neural Networks

    ORAL · Invited

    Publication: https://www.cambridge.org/core/books/principles-of-deep-learning-theory/3E566F65026D6896DC814A8C31EF3B4C

    Presenters

    • Sho Yaida

      Facebook AI Research

    Authors

    • Sho Yaida

      Facebook AI Research

  • Towards General and Robust Deep Learning at Scale

    ORAL · Invited

    Publications:

    • Adversarial Feature Desensitization. Pouya Bashivan, Mojtaba Faramarzi, Touraj Laleh, Blake Aaron Richards and Irina Rish. In Proc. of NeurIPS 2021.
    • Scaling Laws for the Few-Shot Adaptation of Pre-trained Image Classifiers. Gabriele Prato, Simon Guiroy, Ethan Caballero, Irina Rish and Sarath Chandar. arXiv preprint arXiv:2110.06990, 2021.
    • Sequoia: A Software Framework to Unify Continual Learning Research. Fabrice Normandin, Florian Golemo, Oleksiy Ostapenko, Pau Rodríguez, Matthew D. Riemer, Julio Hurtado, Khimya Khetarpal, Dominic Zhao, Ryan Lindeborg, Timothée Lesort, Laurent Charlin, Irina Rish and Massimo Caccia. arXiv preprint arXiv:2108.01005, 2021.
    • Invariance Principle Meets Information Bottleneck for Out-of-Distribution Generalization. Kartik Ahuja, Ethan Caballero, Dinghuai Zhang, Jean-Christophe Gagnon-Audet, Yoshua Bengio, Ioannis Mitliagkas and Irina Rish. In Proc. of NeurIPS 2021.
    • Online Fast Adaptation and Knowledge Accumulation: a New Approach to Continual Learning. Massimo Caccia, Pau Rodríguez, Oleksiy Ostapenko, Fabrice Normandin, Min Lin, Lucas Caccia, Issam H. Laradji, Irina Rish, Alexandre Lacoste, David Vázquez, Laurent Charlin. In Proc. of NeurIPS 2020.

    Presenters

    • Irina Rish

      University of Montreal, Mila - Quebec AI Institute

    Authors

    • Irina Rish

      University of Montreal, Mila - Quebec AI Institute

  • Dynamics of Deep Learning: Landscape-dependent Noise, Inverse Einstein Relation, and Flat Minima

    ORAL · Invited

    Publications:

    • "The inverse variance-flatness relation in Stochastic-Gradient-Descent is critical for finding flat minima", Y. Feng and Y. Tu, PNAS, 118 (9), 2021.
    • "Phases of learning dynamics in artificial neural networks in the absence or presence of mislabeled data", Y. Feng and Y. Tu, Machine Learning: Science and Technology (MLST), 2, 043001, 2021.

    Presenters

    • Yuhai Tu

      IBM T. J. Watson Research Center

    Authors

    • Yuhai Tu

      IBM T. J. Watson Research Center
