Physics-Informed Neural Networks with a Differentiable Adversarial Self-Adaptive Pointwise Loss Weighting Scheme for Solving Forward and Inverse Partial Differential Equations

64 Pages Posted: 9 May 2024


Guangtao Zhang

University of Macau

Huiyu Yang

South China Agricultural University

Fang Zhu

Macau University of Science and Technology

Yang Chen

University of Macau

Xiaoning Zheng

Jinan University

Abstract

Physics-informed neural networks (PINNs) have received significant attention for their ability to integrate physical laws and measurement data into the loss function. This loss function is a weighted sum of multiple terms, including the boundary conditions, the initial conditions, and the residuals of partial differential equations (PDEs). However, the success of PINN training relies heavily on a simple yet effective loss-term weighting strategy that balances the interplay among the different loss terms. In this paper, we propose a differentiable adversarial self-adaptive weighting scheme (DASA) for PINN training that optimizes the pointwise loss weights automatically in each training epoch. The idea is to reformulate the original minimization problem of PINNs as a bi-level optimization problem: a sub-network is trained to maximize the weight of each point in the loss function, while a backbone network minimizes the loss of the classical PINNs. We solve various PDEs with this DASA scheme, namely the Poisson, Helmholtz, Burgers, Allen-Cahn, diffusion-reaction, advection-diffusion, and Navier-Stokes equations, on regular and irregular computational domains, as well as PDEs with discontinuous initial conditions. Both forward and inverse problems are considered. The numerical experiments demonstrate that DASA-PINNs achieve better accuracy than classical PINNs, self-adaptive PINNs (SA-PINNs), hp-variational PINNs with domain decomposition (hp-VPINNs), gradient-enhanced PINNs (gPINNs), and the deep Ritz method, while keeping the computational cost on par with these methods. We also investigate the gradient distribution in each layer of the DASA network, the eigenvalue distribution of the Neural Tangent Kernel (NTK) matrix, and the dynamics of the weights during DASA training to explore possible reasons why DASA-PINNs outperform other PINN variants.
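The min-max structure described above can be illustrated on a toy problem. The sketch below is not the authors' implementation: the one-parameter "residual", the softmax parameterization of the pointwise weights, and the simultaneous gradient updates are all illustrative assumptions standing in for the sub-network (weight maximizer) and backbone network (loss minimizer).

```python
import numpy as np

# Toy bi-level min-max: the "backbone" parameter theta minimizes a
# weighted sum of pointwise squared residuals r_i = theta - y_i, while
# the pointwise weights (here a softmax over raw logits w) are updated
# by gradient ascent on the same loss, mimicking the adversarial
# sub-network in the DASA scheme.
rng = np.random.default_rng(0)
y = rng.normal(size=8)    # stand-ins for pointwise targets/collocation data
theta = 0.0               # backbone parameter (minimizer)
w = np.zeros(8)           # raw logits of the pointwise weights (maximizer)

lr_min, lr_max = 0.05, 0.05
for _ in range(500):
    m = np.exp(w) / np.exp(w).sum()   # positive weights summing to one
    r = theta - y                      # pointwise residuals
    L = np.sum(m * r**2)               # weighted loss
    grad_theta = np.sum(m * 2.0 * r)   # dL/dtheta
    grad_w = m * (r**2 - L)            # dL/dw_j for softmax weights
    theta -= lr_min * grad_theta       # descend: minimize over theta
    w += lr_max * grad_w               # ascend: maximize over weights
```

As training proceeds, the ascent step shifts weight mass toward the points with the largest residuals, so the minimizer is forced to reduce its worst-case pointwise error rather than only the average, which is the intuition behind pointwise adversarial weighting.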

Keywords: Physics-informed neural networks, Loss weighting scheme, Self-adaptive, Partial differential equations, Sub-network, Bi-level optimization problem

Suggested Citation

Zhang, Guangtao and Yang, Huiyu and Zhu, Fang and Chen, Yang and Zheng, Xiaoning, Physics-Informed Neural Networks with a Differentiable Adversarial Self-Adaptive Pointwise Loss Weighting Scheme for Solving Forward and Inverse Partial Differential Equations. Available at SSRN: https://ssrn.com/abstract=4822227 or http://dx.doi.org/10.2139/ssrn.4822227

Guangtao Zhang

University of Macau ( email )

P.O. Box 3001
Macau

Huiyu Yang

South China Agricultural University ( email )

Guangdong, Guangzhou
China

Fang Zhu

Macau University of Science and Technology ( email )

China

Yang Chen

University of Macau ( email )

P.O. Box 3001
Macau

Xiaoning Zheng (Contact Author)

Jinan University ( email )

Huang Pu Da Dao Xi 601, Tian He District
Guangzhou, 510632
China
