
Department of Mathematics

College of Arts and Sciences

Mathematics Colloquium


Shu Liu
UCLA

Title: An Adversarial Deep Learning Approach Using Natural Gradients for Solving Partial Differential Equations
Date: Wednesday, January 22, 2025
Place and Time: Love 101, 3:05-3:55 pm

Abstract. We propose a scalable, preconditioned primal-dual algorithm for solving partial differential equations (PDEs). By multiplying the equation by a test function, we reformulate it as an inf-sup problem, yielding a loss function that involves only lower-order differential operators. To address this saddle-point problem, we employ the Primal-Dual Hybrid Gradient (PDHG) algorithm. By introducing suitable preconditioning operators into the metric terms of the PDHG proximal steps, we obtain an alternative natural gradient ascent-descent scheme for updating the primal and adversarial neural network parameters. These natural gradients are computed efficiently by Krylov subspace iteration. An a posteriori convergence analysis is established for the time-continuous version of the proposed method. The algorithm is tested on a variety of linear and nonlinear PDEs and scales seamlessly to 50 dimensions. Numerical experiments highlight the method's improved accuracy, efficiency, and convergence stability compared with conventional deep PDE solvers.
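
As a point of reference for the reformulation mentioned in the abstract, here is a minimal sketch (illustrative only, using the Poisson equation as a model problem; the notation is not taken from the talk). Multiplying $-\Delta u = f$ on a domain $\Omega$ by a test function $\varphi$ vanishing on $\partial\Omega$ and integrating by parts gives, formally, the inf-sup problem

$$\inf_{u}\;\sup_{\varphi}\;\int_{\Omega}\nabla u\cdot\nabla\varphi\,dx \;-\; \int_{\Omega} f\,\varphi\,dx,$$

in which only first-order derivatives of $u$ and $\varphi$ appear. Parameterizing $u$ (primal) and $\varphi$ (adversarial) by neural networks turns this into a saddle-point problem; in practice the adversarial variable must be regularized or normalized for the supremum to be well posed, which is where proximal or preconditioned updates such as those in PDHG-type schemes enter.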