Keywords: Curse of dimensionality; Physics-informed neural networks

MeSH: Neural Networks, Computer; Nonlinear Dynamics; Algorithms; Stochastic Processes; Physics; Computer Simulation

Source: DOI: 10.1016/j.neunet.2024.106369

Abstract:
The curse-of-dimensionality taxes computational resources heavily with exponentially increasing computational cost as the dimension increases. This poses great challenges in solving high-dimensional partial differential equations (PDEs), as Richard E. Bellman first pointed out over 60 years ago. While there has been some recent success in solving numerical PDEs in high dimensions, such computations are prohibitively expensive, and true scaling of general nonlinear PDEs to high dimensions has never been achieved. We develop a new method of scaling up physics-informed neural networks (PINNs) to solve arbitrary high-dimensional PDEs. The new method, called Stochastic Dimension Gradient Descent (SDGD), decomposes a gradient of PDEs' and PINNs' residual into pieces corresponding to different dimensions and randomly samples a subset of these dimensional pieces in each iteration of training PINNs. We prove theoretically the convergence and other desired properties of the proposed method. We demonstrate in various diverse tests that the proposed method can solve many notoriously hard high-dimensional PDEs, including the Hamilton-Jacobi-Bellman (HJB) and the Schrödinger equations in tens of thousands of dimensions very fast on a single GPU using the PINNs mesh-free approach. Notably, we solve nonlinear PDEs with nontrivial, anisotropic, and inseparable solutions in less than one hour for 1,000 dimensions and in 12 hours for 100,000 dimensions on a single GPU using SDGD with PINNs. Since SDGD is a general training methodology of PINNs, it can be applied to any current and future variants of PINNs to scale them up for arbitrary high-dimensional PDEs.
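To make the dimension-sampling idea in the abstract concrete, below is a minimal sketch (not the authors' implementation) for a d-dimensional Poisson test problem Δu = f: each training iteration draws a random subset of dimensions and computes only those second derivatives, rescaled by d/|S| so the truncated sum matches the full Laplacian in expectation. The network size, the test PDE, the right-hand side f, and all hyperparameters are illustrative assumptions, and applying the sampled estimator inside a mean-squared loss (as done here for simplicity) is a simplification of the paper's gradient-level decomposition.

```python
# Minimal SDGD-style sketch for a d-dimensional Poisson problem Δu = f.
# Everything below (network, PDE, f, hyperparameters) is an illustrative
# assumption, not the authors' code.
import torch

d = 1000                        # problem dimension
batch_size = 64                 # collocation points per iteration
n_dims_sampled = 16             # |S|: dimensions sampled per iteration

net = torch.nn.Sequential(      # small MLP surrogate u_theta(x)
    torch.nn.Linear(d, 128), torch.nn.Tanh(), torch.nn.Linear(128, 1)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def f(x):                       # assumed right-hand side, for illustration
    return torch.ones(x.shape[0], 1)

for step in range(100):
    x = torch.rand(batch_size, d, requires_grad=True)  # collocation points
    u = net(x)
    # First derivatives du/dx for all dimensions (one backward pass).
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    # SDGD idea: sample a random subset S of dimensions and compute only
    # those second derivatives; rescale by d/|S| so the estimator equals
    # the full Laplacian in expectation.
    dims = torch.randperm(d)[:n_dims_sampled].tolist()
    lap_est = 0.0
    for i in dims:
        d2u_i = torch.autograd.grad(du[:, i].sum(), x,
                                    create_graph=True)[0][:, i:i + 1]
        lap_est = lap_est + d2u_i
    lap_est = lap_est * (d / n_dims_sampled)
    # PDE residual loss only (boundary terms omitted for brevity).
    loss = ((lap_est - f(x)) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because only |S| of the d second-derivative passes are evaluated per iteration, memory and compute per step scale with |S| rather than d, which is what lets the method reach very high dimensions on a single GPU.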