Expressivity of Structure-Preserving Deep Neural Networks for the Space-Time Approximation of High-Dimensional Nonlinear Partial Differential Equations with Boundaries

Information

  • NSF Award
  • 2318032
Owner
  • Award Id
    2318032
  • Award Effective Date
    1/1/2023
  • Award Expiration Date
    7/31/2025
  • Award Amount
    $119,987.00
  • Award Instrument
    Continuing Grant

Partial differential equations (PDEs) arise naturally in the modeling and study of many natural, industrial, and financial phenomena. While numerous techniques exist for approximating solutions to many types of PDEs, certain problems of practical interest exhibit pathologies that render many standard techniques either highly inefficient or unusable. Such situations often arise for nonlinear problems that are posed in high dimensions or exhibit singularities, as in solid-fuel combustion optimization, oil-pipeline corrosion prediction, and high-frequency financial trading. This project intends to circumvent these issues through the rigorous study and development of so-called structure-preserving deep neural networks (DNNs). While it has been observed experimentally that DNNs provide highly capable methods of approximating solutions to a large class of problems, the theoretical justification of these observations is still in its early stages. To that end, this project will provide a much-needed theory for certain classes of high-dimensional nonlinear PDEs via a two-pronged approach: explicit randomized methods will be constructed that demonstrate the desired properties, while theoretical tools are simultaneously developed for representing and studying a large class of objects via DNNs. This approach requires numerous tools from applied mathematics, functional analysis, stochastic analysis, and novel DNN computations. This unique intersection of techniques will serve as the basis for the project's educational and training components, which aim to increase the presence of women, minorities, and other underrepresented groups in mathematical research.
This goal will be accomplished through the training and mentoring of first-generation and underrepresented students at both the graduate and undergraduate levels.

This project addresses the question of whether it can be rigorously proven that there exist DNNs that approximate solutions to a large class of high-dimensional PDEs without suffering from the curse of dimensionality (CoD). Demonstrating that DNNs can represent solutions to certain classes of high-dimensional nonlinear PDEs to a prescribed accuracy without suffering from the CoD will fill a gap in the existing theory of machine learning algorithms. While the current focus is on the expressivity of DNNs with regard to solutions of PDEs, this work will also serve as a foundation for extending such studies to other types of problems. The project will extend the existing theory of multilevel Picard (MLP) approximation methods to more general high-dimensional nonlinear PDEs, with a focus on preserving inherent qualitative structures. The study of MLP approximations will also yield novel theoretical results on various stochastic fixed-point equations. Finally, the proposed work will provide explicit details on how to construct CoD-free DNN representations of various mathematical objects while also exploring theoretical issues related to popular activation functions. These ideas will be used to prove DNN-representation results and will provide a deeper understanding of how activation functions affect optimality.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
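To give a concrete sense of the MLP approximation methods mentioned above, the sketch below implements a generic, textbook-style multilevel Picard recursion for a semilinear heat equation in moderate dimension. This is an illustration of the general technique only, not the project's actual construction; the terminal condition `g`, the nonlinearity `f`, and the parameters `T`, `d` are illustrative choices, and the scheme shown is the standard recursion built from the stochastic fixed-point (Feynman-Kac) formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative semilinear heat equation:
#   u_t + (1/2) Δu + f(u) = 0 on [0, T] x R^d,  u(T, x) = g(x).
T = 1.0
d = 10

def g(x):
    # Terminal condition (illustrative choice)
    return np.log(0.5 * (1.0 + np.sum(x ** 2)))

def f(u):
    # Bounded, Lipschitz nonlinearity (illustrative choice)
    return u / (1.0 + u ** 2)

def mlp(n, M, t, x):
    """Multilevel Picard estimate U_{n,M}(t, x) via level recursion.

    Based on the fixed-point form
      u(t,x) = E[g(x + W_{T-t})] + (T-t) E[f(u(S, x + W_{S-t}))],
    with S uniform on [t, T], approximated by telescoping Monte Carlo sums.
    """
    if n == 0:
        return 0.0
    # Monte Carlo estimate of the terminal term E[g(x + W_{T-t})]
    mc = M ** n
    W = rng.standard_normal((mc, d)) * np.sqrt(T - t)
    est = np.mean([g(x + W[i]) for i in range(mc)])
    # Telescoping correction terms over levels l = 0, ..., n-1
    for l in range(n):
        mc = M ** (n - l)
        acc = 0.0
        for _ in range(mc):
            s = t + (T - t) * rng.uniform()          # uniform time in [t, T]
            y = x + rng.standard_normal(d) * np.sqrt(s - t)
            diff = f(mlp(l, M, s, y))
            if l > 0:
                diff -= f(mlp(l - 1, M, s, y))
            acc += diff
        est += (T - t) * acc / mc
    return est

# Example: estimate u(0, 0) with n = M = 2 levels/samples
u0 = mlp(2, 2, 0.0, np.zeros(d))
```

The key point, and the reason such schemes avoid the CoD, is that the total work grows polynomially in the dimension `d` and in the reciprocal accuracy, since the number of samples per level depends only on `M` and the recursion depth `n`, not on a spatial grid.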

  • Program Officer
    Stacey Levine, slevine@nsf.gov, (703) 292-2948
  • Min Amd Letter Date
    2/15/2023
  • Max Amd Letter Date
    3/27/2023
  • ARRA Amount

Institutions

  • Name
    Baylor University
  • City
    WACO
  • State
    TX
  • Country
    United States
  • Address
    700 S UNIVERSITY PARKS DR
  • Postal Code
    76706-1003
  • Phone Number
    (254) 710-3817

Investigators

  • First Name
    Joshua
  • Last Name
    Padgett
  • Email Address
    padgett@uark.edu
  • Start Date
    2/15/2023
  • End Date
    3/27/2023
  • First Name
    Qin
  • Last Name
    Sheng
  • Email Address
    Qin_Sheng@baylor.edu
  • Start Date
    3/27/2023

Program Element

  • Text
    APPLIED MATHEMATICS
  • Code
    1266

Program Reference

  • Text
    Artificial Intelligence (AI)
  • Text
    Machine Learning Theory
  • Text
    EXP PROG TO STIM COMP RES
  • Code
    9150