
Made for each other: adjoint solvers and high-dimensional gradient-augmented Bayesian optimization

ORAL

Abstract

Bayesian optimization (BO) is a global optimization algorithm well suited to multimodal functions that are costly to evaluate, e.g. quantities derived from computationally expensive simulations. Recent studies have shown that Bayesian optimization can be scaled to high-dimensional functions and that its convergence can be accelerated by incorporating derivative information. These developments lay the groundwork for a productive interplay between Bayesian optimization and adjoint solvers, which cheaply obtain gradients of an objective function with respect to the tunable parameters of a simulated physical system. Gradient-enhanced high-dimensional BO can explore a design space efficiently without getting stuck in local minima. We demonstrate the application of this algorithm to two test cases. The first is the classic problem of 2D airfoil shape optimization to maximize the lift-to-drag ratio. The second uses solutions from an adjoint Helmholtz solver to stabilize a thermoacoustically unstable combustor through geometry changes. We show that, compared with L-BFGS, a standard quasi-Newton method, gradient-enhanced high-dimensional BO arrives at multiple, better-performing geometries using considerably fewer evaluations of the solver.
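The key ingredient of gradient-enhanced BO is a Gaussian-process surrogate fitted jointly to objective values and their adjoint-computed gradients, via a joint kernel over function and derivative observations. Below is a minimal 1-D NumPy sketch under our own assumptions (an RBF kernel, unit length scale, and illustrative function names — none of these come from the abstract or paper):

```python
import numpy as np

def rbf_blocks(X1, X2, l=1.0):
    """Joint RBF kernel blocks between function values and first
    derivatives in 1-D (an illustrative choice, not the paper's kernel)."""
    d = X1[:, None] - X2[None, :]          # pairwise differences x_i - x_j
    K = np.exp(-0.5 * d**2 / l**2)         # Cov(f(x_i), f(x_j))
    K_fg = K * d / l**2                    # Cov(f(x_i), f'(x_j)) = dk/dx_j
    K_gg = K * (1.0 / l**2 - d**2 / l**4)  # Cov(f'(x_i), f'(x_j))
    return K, K_fg, K_gg

def fit_gradient_gp(X, y, dy, l=1.0, jitter=1e-8):
    """Precompute alpha = K^{-1} [y; dy] for the gradient-augmented
    GP posterior mean, where dy holds adjoint gradients at X."""
    K, K_fg, K_gg = rbf_blocks(X, X, l)
    K_joint = np.block([[K, K_fg], [K_fg.T, K_gg]])
    rhs = np.concatenate([y, dy])
    return np.linalg.solve(K_joint + jitter * np.eye(2 * len(X)), rhs)

def predict_mean(Xs, X, alpha, l=1.0):
    """Posterior mean at test points Xs, conditioned on both the
    observed values and the observed gradients."""
    K, K_fg, _ = rbf_blocks(Xs, X, l)
    k_star = np.hstack([K, K_fg])  # cross-covariances to [f(X); f'(X)]
    return k_star @ alpha
```

In a full BO loop, an acquisition function (e.g. expected improvement) would be maximized over this posterior to pick the next design to simulate; each gradient observation adds d constraints per evaluation in d dimensions, which is what makes adjoint solvers, whose cost is roughly one extra solve regardless of d, such a natural pairing.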

Publication: Sengupta, U., Juniper, M.P. Thermoacoustic stabilization of combustors with adjoint models and gradient-augmented Bayesian optimization (submitted to Symposium on Thermoacoustics in Combustion 2021)

Presenters

  • Ushnish Sengupta

    University of Cambridge

Authors

  • Ushnish Sengupta

    University of Cambridge

  • Yubiao Sun

    University of Cambridge

  • Matthew P Juniper

    University of Cambridge