Abstract
Physically informed neural networks (PINNs) are a promising emerging method for solving differential equations. As in many other deep learning approaches, the choice of PINN design and training protocol requires careful craftsmanship. Here, we propose a comprehensive theoretical framework that sheds light on this important problem. Leveraging an equivalence between infinitely over-parameterized neural networks and Gaussian process regression, we derive an integro-differential equation that governs PINN prediction in the large-dataset limit, which we term the neurally-informed equation. This equation augments the original one with a kernel term that reflects architecture choices. It allows the implicit bias induced by the network to be quantified via a spectral decomposition of the source term in the original differential equation.
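As a rough illustration of the spectral picture sketched in the abstract, the snippet below is a minimal Python sketch, not the paper's method: it uses an RBF kernel as a stand-in for the network's infinite-width (NNGP) kernel and an arbitrary example source term, and projects that source term onto the kernel's eigenmodes. Source-term weight sitting on small-eigenvalue modes gives a crude indication of implicit bias that is poorly aligned with the task.

```python
import numpy as np

# Illustrative stand-in for the infinite-width network kernel (not the paper's kernel).
def rbf_kernel(x, y, ell=0.2):
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / ell ** 2)

# Collocation grid on [0, 1] and an assumed example source term with
# a low-frequency and a high-frequency component.
x = np.linspace(0.0, 1.0, 200)
g = np.sin(2 * np.pi * x) + 0.3 * np.sin(20 * np.pi * x)

# Numerical (Nystrom-style) spectral decomposition of the kernel Gram matrix.
K = rbf_kernel(x, x)
eigvals, eigvecs = np.linalg.eigh(K)              # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending

# Coefficients of g in the kernel eigenbasis. Comparing their decay with the
# eigenvalue decay is one crude proxy for kernel-task alignment.
coeffs = eigvecs.T @ g
for k in range(5):
    print(f"mode {k}: eigenvalue {eigvals[k]:.3e}, |<g, phi_k>| = {abs(coeffs[k]):.3e}")
```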
| Original language | English |
|---|---|
| Article number | 035048 |
| Journal | Machine Learning: Science and Technology |
| Volume | 5 |
| Issue number | 3 |
| DOIs | |
| State | Published - 1 Sep 2024 |
Bibliographical note
Publisher Copyright: © 2024 The Author(s). Published by IOP Publishing Ltd.
Keywords
- Gaussian process regression
- deep neural networks
- over-parameterized neural networks
- physically informed neural networks