How much can prompt engineering discover Differential Equations with Large Language Models?
ORAL
Abstract
Partial Differential Equations (PDEs) represent a plethora of physical and social phenomena; therefore, developing new ways to approximate or even solve PDEs has a considerable impact across scientific disciplines. Recently, Physics-Informed Neural Networks (PINNs) have dominated this field. However, with the rise of Large Language Models (LLMs) and the success of Google's FunSearch, LLMs guided by careful prompt engineering and genetic evolution of prompts are becoming an innovative tool for PDE solving. Specifically, we experimented with the evolution of two types of prompts: one that asks the LLM to generate a mathematical function, and one that asks it to generate a Python function. Both are expected to approximate the solution of a given differential equation or initial value problem (IVP) and are evaluated over a discretized domain specific to the problem, with the mathematical function additionally able to benefit from the physics-based evaluation introduced by PINNs. Small LLMs, such as Google's Gemma 2 with 9 billion parameters, could generate a mathematical function very close to the solution of a solvable viscous Burgers' equation without any evolution of the prompt. Further efforts to improve the initial prompts, or to use LLMs specialized for mathematical reasoning, would likely yield better results. This suggests that LLMs have potential for complex mathematical reasoning and will likely show promise when allowed to evolve.
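To make the evaluation step concrete, below is a minimal sketch of how an LLM-generated candidate might be scored against the viscous Burgers' equation u_t + u u_x = nu u_xx on a discretized domain. This is only an illustration assuming NumPy and finite-difference residuals; the function name `burgers_residual`, the grid sizes, and the sample candidate are hypothetical and not the authors' actual evaluation harness.

```python
import numpy as np

def burgers_residual(u, nu=0.1, x_lim=(-1.0, 1.0), t_lim=(0.0, 1.0), n=201, m=101):
    """Score a candidate u(x, t) by its mean squared residual of the
    viscous Burgers equation u_t + u * u_x - nu * u_xx = 0 on a grid.
    (Illustrative sketch; not the authors' actual evaluation code.)"""
    xs = np.linspace(x_lim[0], x_lim[1], n)
    ts = np.linspace(t_lim[0], t_lim[1], m)
    X, T = np.meshgrid(xs, ts, indexing="ij")  # axis 0 = x, axis 1 = t
    U = u(X, T)
    dx, dt = xs[1] - xs[0], ts[1] - ts[0]
    # Central differences on the interior of the grid.
    u_t = (U[1:-1, 2:] - U[1:-1, :-2]) / (2.0 * dt)
    u_x = (U[2:, 1:-1] - U[:-2, 1:-1]) / (2.0 * dx)
    u_xx = (U[2:, 1:-1] - 2.0 * U[1:-1, 1:-1] + U[:-2, 1:-1]) / dx**2
    residual = u_t + U[1:-1, 1:-1] * u_x - nu * u_xx
    return float(np.mean(residual**2))

# Hypothetical LLM-generated candidate: -tanh(x / (2*nu)) is an exact
# steady travelling-front solution, so its residual is near zero (up to
# finite-difference discretization error).
candidate = lambda x, t: -np.tanh(x / (2.0 * 0.1))
print(burgers_residual(candidate, nu=0.1))
```

In the genetic prompt-evolution loop the abstract describes, a score like this would presumably be used as the fitness signal: candidates with lower residuals are kept and fed back into the next round of prompts.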
Presenters
- Alejandro Pinto, Rutgers University-New Brunswick
Authors
- Alejandro Pinto, Rutgers University-New Brunswick
- Ruo-Qian Wang, Rutgers University-New Brunswick