Optimization of hyperparameters and padding for a lightweight velocimetry network

ORAL

Abstract

Over the last decade, deep convolutional neural networks (CNNs) have gained popularity for optical flow estimation and fluid velocimetry. Recently, we proposed a lightweight image matching architecture (LIMA), a CNN specifically designed for velocimetry that proved more accurate than standard methods and computationally cheaper to train than other CNNs. To further optimize its performance, however, two well-known issues of CNNs need to be addressed: (i) neural networks are highly sensitive to hyperparameters, which makes training prone to becoming trapped in sub-optimal local minima; (ii) padding of the convolutional kernels at the image boundary may generate spurious artifacts that propagate into the interior of the reconstructed image. Here, we first investigate various hyperparameters (e.g., the learning-rate decay, the network depth, and the gradient-descent optimizer) and shed light on their effects on training speed and reconstruction accuracy. We then compare optimization techniques such as random grid search, Bayesian optimization, reinforcement learning, and genetic algorithms, and provide insights into their robustness. Finally, we devise a method tailored to particle image velocimetry that mitigates the boundary effects introduced by kernel padding.
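To make the hyperparameter discussion concrete, the sketch below shows a minimal random search over the quantities named in the abstract (learning-rate decay, network depth, and the choice of gradient-descent optimizer) for a small convolutional network in PyTorch. It is an illustrative assumption, not the LIMA code or the method presented in the talk: the toy network, the synthetic data, and the helper names build_cnn and train_once are hypothetical.

    # Hypothetical sketch (not the LIMA implementation): random search over
    # learning-rate decay, network depth, and the optimizer for a small CNN
    # trained on synthetic data.
    import random
    import torch
    import torch.nn as nn

    def build_cnn(depth, channels=16):
        # Stack `depth` conv blocks; zero padding keeps the spatial size,
        # which is where boundary artifacts can originate.
        layers, in_ch = [], 2  # two input channels: an image pair
        for _ in range(depth):
            layers += [nn.Conv2d(in_ch, channels, 3, padding=1), nn.ReLU()]
            in_ch = channels
        layers += [nn.Conv2d(in_ch, 2, 3, padding=1)]  # 2-component displacement field
        return nn.Sequential(*layers)

    def train_once(depth, lr, gamma, opt_name, steps=50):
        torch.manual_seed(0)  # same data and initialization for every configuration
        model = build_cnn(depth)
        opt_cls = {"adam": torch.optim.Adam, "sgd": torch.optim.SGD}[opt_name]
        opt = opt_cls(model.parameters(), lr=lr)
        sched = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=gamma)
        x = torch.randn(8, 2, 32, 32)  # stand-in for image pairs
        y = torch.randn(8, 2, 32, 32)  # stand-in for reference displacement fields
        loss_fn = nn.MSELoss()
        for _ in range(steps):
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
            sched.step()  # exponential learning-rate decay
        return loss.item()

    # Random search: sample configurations and keep the one with the lowest final loss.
    space = {"depth": [3, 5, 7],
             "lr": [1e-4, 3e-4, 1e-3, 3e-3],
             "gamma": [0.90, 0.95, 0.99],
             "opt_name": ["adam", "sgd"]}
    best = None
    for _ in range(10):
        cfg = {k: random.choice(v) for k, v in space.items()}
        loss = train_once(**cfg)
        if best is None or loss < best[0]:
            best = (loss, cfg)
    print("best configuration:", best)

The same inner training loop could, in principle, be driven by Bayesian optimization, a genetic algorithm, or a reinforcement-learning controller instead of the random sampler; the actual comparisons reported in the talk are not reproduced here.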

Publication: None.

Presenters

  • Kamila Zdybal

Empa, Swiss Federal Laboratories for Materials Science and Technology

Authors

  • Kamila Zdybal

Empa, Swiss Federal Laboratories for Materials Science and Technology

  • Claudio Mucignat

Empa, Swiss Federal Laboratories for Materials Science and Technology

  • Ivan Lunati

Empa, Swiss Federal Laboratories for Materials Science and Technology