Large Language Models in Computational Physics

INVITED · MAR-F11 · ID: 2763243

Presentations

  • Leveraging large language models for predictive chemistry

    ORAL · Invited

    Publications:
    [1] K. M. Jablonka, P. Schwaller, A. Ortega-Guerrero, and B. Smit, Leveraging large language models for predictive chemistry, Nat. Mach. Intell. 6, 161 (2024). http://dx.doi.org/10.1038/s42256-023-00788-1
    [2] J. Van Herck et al., Assessment of fine-tuned large language models for real-world chemistry and material science applications, Chem. Sci. 16(2), 670 (2025). http://dx.doi.org/10.1039/D4SC04401K

    Presenters

    • Berend Smit

      EPFL, Lausanne, Switzerland

    Authors

    • Berend Smit

      EPFL, Lausanne, Switzerland

  • Keeping up with LLMs

    ORAL · Invited

    Presenters

    • Salman Habib

      Argonne National Laboratory

    Authors

    • Salman Habib

      Argonne National Laboratory

  • Large Language Models in Physics Education

    ORAL · Invited

    Publications:
    [1] G. Polverini and B. Gregorcic, How understanding large language models can inform the use of ChatGPT in physics education, Eur. J. Phys. 45, 025701 (2024).
    [2] G. Polverini and B. Gregorcic, Performance of ChatGPT on the test of understanding graphs in kinematics, Phys. Rev. Phys. Educ. Res. 20, 010109 (2024).
    [3] G. Polverini and B. Gregorcic, Evaluating vision-capable chatbots in interpreting kinematics graphs: a comparative study of free and subscription-based models, Frontiers in Education (2024).
    [4] B. Gregorcic, G. Polverini, and A. Sarlah, ChatGPT as a tool for honing teachers' Socratic dialogue skills, Phys. Educ. 59, 045005 (2024).
    [5] B. Gregorcic and A.-M. Pendrill, ChatGPT and the frustrated Socrates, Phys. Educ. 58, 035021 (2023).

    Presenters

    • Bor Gregorcic

      Uppsala University, Sweden

    Authors

    • Bor Gregorcic

      Uppsala University, Sweden

    • Giulia Polverini

      Uppsala University, Sweden
