
Applying Transfer Learning to Graph Neural Networks for Predicting Defect Formation Energies

ORAL

Abstract

Defect formation energy is a critical metric describing a defect's thermodynamic stability, fundamental to applications ranging from semiconductors to quantum technologies. However, calculating defect formation energies using density functional theory (DFT) is computationally expensive at scale: resulting defect formation energy datasets are typically on the order of 100–1000 entries, too small to fully exploit the predictive power of data-hungry deep learning models. To address this data scarcity challenge for defects, we propose using transfer learning to leverage the larger datasets (>10,000 entries) that are available for pristine crystal formation energies, which are less costly to compute. We applied transfer learning by pretraining a graph neural network on pristine crystal formation energies and then fine-tuning it to target defect formation energies, with a focus on metal and oxygen vacancies in oxide materials. This method not only improves predictive performance but also accelerates convergence and stabilizes training, significantly reducing the number of epochs required to reach optimal results. Compared with models trained from scratch, the model trained with transfer learning demonstrated superior accuracy approaching that of DFT calculations, making it a promising predictive tool for defect properties. By utilizing the larger pristine crystal dataset, our transfer learning approach reduces computational costs and enhances the model's ability to predict defect formation energies, even with a limited defect dataset. These findings have broad implications for materials discovery, where machine learning models based on small datasets of hard-to-calculate materials properties can be made more efficient and accurate by pretraining on large datasets of easy-to-calculate materials properties.
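To make the pretrain-then-fine-tune workflow concrete, the sketch below outlines one possible realization in PyTorch Geometric. It is not the authors' implementation: the network architecture, layer sizes, epoch counts, learning rates, and the `pristine_loader` and `defect_loader` data pipelines are all illustrative assumptions; only the overall strategy (pretrain the message-passing layers on pristine formation energies, then reinitialize the readout head and fine-tune on the small defect dataset) follows the abstract.

```python
# Minimal sketch, assuming PyTorch Geometric crystal graphs with node
# features `x`, edge attributes `edge_attr`, and a scalar target `y`.
# `pristine_loader` and `defect_loader` are hypothetical DataLoaders.
import torch
import torch.nn as nn
from torch_geometric.nn import CGConv, global_mean_pool


class FormationEnergyGNN(nn.Module):
    """Simple crystal-graph network: message passing plus pooled readout."""

    def __init__(self, node_dim: int = 64, edge_dim: int = 32, n_layers: int = 3):
        super().__init__()
        self.convs = nn.ModuleList(
            [CGConv(node_dim, dim=edge_dim) for _ in range(n_layers)]
        )
        self.readout = nn.Linear(node_dim, 1)  # one scalar energy per graph

    def forward(self, data):
        h = data.x
        for conv in self.convs:
            h = conv(h, data.edge_index, data.edge_attr)
        return self.readout(global_mean_pool(h, data.batch)).squeeze(-1)


def train(model, loader, epochs, lr):
    """Plain MAE regression loop (hyperparameters are placeholders)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    for _ in range(epochs):
        for batch in loader:
            opt.zero_grad()
            loss = loss_fn(model(batch), batch.y)
            loss.backward()
            opt.step()
    return model


# 1) Pretrain on the large pristine-crystal formation-energy dataset.
model = FormationEnergyGNN()
train(model, pristine_loader, epochs=200, lr=1e-3)

# 2) Transfer: keep the pretrained message-passing weights, reinitialize
#    the readout head, and fine-tune on the small defect formation-energy
#    dataset with a lower learning rate.
model.readout = nn.Linear(64, 1)
train(model, defect_loader, epochs=50, lr=1e-4)
```

In a sketch like this, the choice of what to reinitialize versus fine-tune (the readout head only, or some of the convolutional layers as well) is a tunable design decision rather than something fixed by the abstract.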

Publication: Manuscript in preparation.

Presenters

  • Thomas A Bouchard

    Austin Peay State University

Authors

  • Thomas A Bouchard

    Austin Peay State University

  • Angela Zhang

    University of Texas at Austin

  • Justin Garrigus

    University of North Texas

  • Fatimah Habis

    University of North Texas

  • Yuanxi Wang

    University of North Texas