Active Learning of Model Discrepancy with Sequential Optimal Experimental Design

ORAL

Abstract

Digital twins have been actively explored in many engineering applications, such as manufacturing and autonomous systems. However, model discrepancy is ubiquitous in most digital twin models. In recent years, data-driven modeling techniques have shown promise in characterizing the model discrepancy of existing models. The training data for learning the model discrepancy, however, is often gathered empirically, and an active approach to collecting informative data could benefit the learning process. Optimal experimental design (OED), on the other hand, provides a systematic approach to gathering the most informative data, but its performance is often degraded by model discrepancy. In this talk, we build on sequential Bayesian OED and propose an efficient approach that iteratively learns the model discrepancy from the data acquired via Bayesian OED. The results show that the proposed method is robust in escaping local minima and efficient enough to handle high-dimensional model discrepancy, making use of data identified by the sequential Bayesian OED. We also demonstrate that the proposed method is compatible with both classical solvers and modern auto-differentiable solvers.
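The iterative interplay between design selection and discrepancy learning described in the abstract can be illustrated with a minimal sketch. This is not the authors' method: it assumes a toy setting with a known model g(x), a scalar linear discrepancy delta(x) = theta * x with a Gaussian prior on theta, and a closed-form expected-information-gain criterion for the linear-Gaussian case; all names and parameters here are hypothetical.

```python
import numpy as np

def g(x):
    # Known but imperfect physics model; the true process adds a
    # discrepancy term delta(x) = theta_true * x (toy assumption).
    return x ** 2

def run_sequential_oed(theta_true=1.5, sigma_noise=0.1, n_rounds=5, seed=0):
    rng = np.random.default_rng(seed)
    candidates = np.linspace(-2.0, 2.0, 41)  # candidate experimental designs
    mu, var = 0.0, 4.0                       # Gaussian prior on theta
    for _ in range(n_rounds):
        # Expected information gain for the linear-Gaussian model:
        # EIG(x) = 0.5 * log(1 + var * x^2 / sigma_noise^2)
        eig = 0.5 * np.log1p(var * candidates ** 2 / sigma_noise ** 2)
        x = candidates[np.argmax(eig)]       # most informative design
        # Run the "experiment": observe model output plus discrepancy plus noise
        y = g(x) + theta_true * x + rng.normal(0.0, sigma_noise)
        # Conjugate Bayesian update of theta from the residual y - g(x)
        prec = 1.0 / var + x ** 2 / sigma_noise ** 2
        mu = (mu / var + x * (y - g(x)) / sigma_noise ** 2) / prec
        var = 1.0 / prec
    return mu, var
```

Each round selects the design that is most informative under the current posterior, observes the resulting data, and updates the belief about the discrepancy parameter, so later designs account for what has already been learned.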

Presenters

  • Huchen Yang

    University of Wisconsin - Madison

Authors

  • Huchen Yang

    University of Wisconsin - Madison

  • Chuanqi Chen

    University of Wisconsin - Madison

  • Jinlong Wu

    University of Wisconsin - Madison