Unsupervised Machine Learning for Spatio-Temporal Characterization of Nanoscale Phenomena Imaged via Ultrafast Electron Microscopy

ORAL

Abstract

Advancements in microscopy techniques have made it possible to investigate dynamic structural phenomena at the nanoscale. This work details the use of a machine learning-based approach to extract quantitative information about the motion of features captured by an ultrafast electron microscope (UEM). UEM is an emerging technique that uses pulsed electron beams to image structural dynamics at nanometer-picosecond resolution. This spatio-temporal character of UEM datasets is one of the main challenges encountered in their analysis. Classical computer vision techniques for characterizing motion between image frames are parametric and hence require manual supervision. In this work, a U-Net-type convolutional neural network is designed to take a pair of UEM images as input and generate the optical flow at each pixel as output. A custom loss function is defined, consisting of a photometric loss term and a gradient loss term. Additionally, the uncertainty associated with the estimate at each pixel is quantified using the Monte Carlo Dropout method. The pixel-level motion computation provides a framework for correlating the phonon wavefront motion with the nanoscale interface structure characteristics, and this is demonstrated using FePS3 as an example.
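
The abstract does not include implementation details, so the sketch below is only an illustrative reconstruction of the two ingredients it names: an unsupervised optical-flow loss combining a photometric term with a gradient term (interpreted here as a smoothness penalty on the flow gradients), and Monte Carlo Dropout for per-pixel uncertainty. It assumes a PyTorch implementation; the function names (warp, flow_loss, mc_dropout_flow), the loss weight lam, and the number of stochastic passes n_samples are hypothetical and not taken from the authors' code.

    import torch
    import torch.nn.functional as F

    def warp(img, flow):
        """Warp img (N,1,H,W) by a predicted flow (N,2,H,W); channel 0 is assumed
        to be the horizontal and channel 1 the vertical displacement in pixels."""
        n, _, h, w = img.shape
        ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
        grid = torch.stack((xs, ys), dim=0).float().to(img.device)   # (2,H,W)
        coords = grid.unsqueeze(0) + flow                             # (N,2,H,W)
        # Normalize sampling coordinates to [-1, 1] as required by grid_sample.
        coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0
        coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
        grid_norm = torch.stack((coords_x, coords_y), dim=-1)         # (N,H,W,2)
        return F.grid_sample(img, grid_norm, align_corners=True)

    def flow_loss(frame1, frame2, flow, lam=0.1):
        """Photometric loss (brightness constancy after warping frame2 back to
        frame1) plus a gradient penalty on the flow field, weighted by lam."""
        photometric = F.l1_loss(warp(frame2, flow), frame1)
        dx = torch.abs(flow[:, :, :, 1:] - flow[:, :, :, :-1]).mean()
        dy = torch.abs(flow[:, :, 1:, :] - flow[:, :, :-1, :]).mean()
        return photometric + lam * (dx + dy)

    def mc_dropout_flow(model, frame1, frame2, n_samples=20):
        """Monte Carlo Dropout: keep dropout layers stochastic at inference and
        repeat forward passes; the per-pixel std serves as an uncertainty map."""
        model.train()  # keeps nn.Dropout active
        with torch.no_grad():
            samples = torch.stack([model(frame1, frame2) for _ in range(n_samples)])
        return samples.mean(dim=0), samples.std(dim=0)

In this sketch a training step would compute flow = model(frame1, frame2), backpropagate flow_loss(frame1, frame2, flow), and, at evaluation time, call mc_dropout_flow to obtain both the mean flow field and its per-pixel spread; the network itself is assumed to be a U-Net-style encoder-decoder with dropout layers, whose exact architecture is not specified in the abstract.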

Presenters

  • Thomas E Gage

    Argonne National Laboratory

Authors

  • Faran Zhou

    Argonne National Laboratory

  • Thomas E Gage

    Argonne National Laboratory

  • Haihua Liu

    Argonne National Laboratory

  • Ilke Arslan

    Argonne National Laboratory

  • Haidan Wen

    Argonne National Laboratory

  • Maria K Chan

    Argonne National Laboratory