Physics-Informed Neural Networks with a Differentiable Adversarial Self-Adaptive Pointwise Loss Weighting Scheme for Solving Forward and Inverse Partial Differential Equations
64 Pages. Posted: 9 May 2024
Abstract
Physics-informed neural networks (PINNs) have received significant attention for their ability to integrate physical laws and measurement data into the loss function. The loss function is a weighted sum of multiple terms, including the boundary conditions, the initial conditions, and the residuals of partial differential equations (PDEs). However, the success of training PINNs relies heavily on an effective and simple loss-term weighting strategy that can balance the interplay among the different loss terms. In this paper, we propose a differentiable adversarial self-adaptive weighting scheme (DASA) for PINN training that automatically optimizes the pointwise loss weights in each training epoch. The idea is to reformulate the original minimization problem of PINNs as a bi-level optimization problem. More specifically, we train a sub-network to maximize the weight of each point in the loss function while a backbone network minimizes the loss of the classical PINN. We solve various PDEs with the DASA scheme, e.g., the Poisson, Helmholtz, Burgers, Allen-Cahn, diffusion-reaction, advection-diffusion, and Navier-Stokes equations on regular and irregular computational domains, as well as PDEs with discontinuous initial conditions. Both forward and inverse problems are considered. The numerical experiments demonstrate that DASA-PINNs achieve better accuracy than classical PINNs, self-adaptive PINNs (SA-PINNs), hp-variational PINNs with domain decomposition (hp-VPINNs), gradient-enhanced PINNs (gPINNs), and the Deep Ritz method, while keeping the computational cost on par with these methods. We also investigate the gradient distribution in each layer of the DASA network, the eigenvalue distribution of the Neural Tangent Kernel (NTK) matrix, and the dynamics of the weights during DASA training to explore possible reasons for the better performance of DASA-PINNs over other PINN variants.
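To make the adversarial min-max idea concrete, here is a minimal toy sketch of pointwise self-adaptive weighting, not the authors' DASA implementation: scalar residuals r_i(theta) = theta - t_i stand in for the pointwise PDE residuals of a PINN, the single parameter theta stands in for the backbone-network parameters, and the logits s stand in for the output of a weight sub-network. All names, targets, and learning rates are illustrative assumptions.

```python
import numpy as np

# Toy bi-level scheme: descend on theta (backbone), ascend on the
# pointwise weight logits s (weight sub-network). Illustrative only.

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

t = np.array([0.0, 0.1, 0.2, 2.0])  # collocation "targets"; t = 2.0 is the hard point
theta = 0.0                         # backbone parameter (gradient descent)
s = np.zeros_like(t)                # pointwise weight logits (gradient ascent)
lr_theta, lr_s = 0.02, 0.5

history = []
for _ in range(5000):
    p = softmax(s)                  # normalized pointwise weights
    r = theta - t                   # pointwise residuals
    loss = np.sum(p * r**2)         # weighted PINN-style loss
    # Minimize over theta:  dL/dtheta = sum_i 2 p_i r_i
    theta -= lr_theta * np.sum(2.0 * p * r)
    # Maximize over the logits:  dL/ds_k = p_k (r_k^2 - L)
    s += lr_s * p * (r**2 - loss)
    history.append(theta)

theta_avg = float(np.mean(history[-2000:]))
print("time-averaged theta:", round(theta_avg, 3))
```

Because the inner maximization keeps shifting weight onto the worst-fit points, theta tends toward the minimax solution (here, roughly midway between the extreme targets) rather than the unweighted least-squares fit, which is the intended effect of an adversarial pointwise weighting scheme.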
Keywords: Physics-informed neural networks, Loss weighting scheme, Self-adaptive, Partial differential equations, Sub-network, Bi-level optimization problem