A New Set of Test Functions for Variational Physics-Informed Neural Networks in Solid Mechanics
Physics-Informed Neural Networks (PINNs) have demonstrated their capability to solve both forward and inverse problems across various fields. In this approach, a neural network approximates the unknown variables such that the strong form, together with the given initial and boundary conditions, is satisfied as closely as possible. Prior works have extended this framework by incorporating the variational form into the loss function, leading to Variational Physics-Informed Neural Networks (VPINNs). Unlike PINNs, VPINNs in principle allow both h- and p-refinement, providing greater flexibility for applications such as building surrogates for high-fidelity finite element models. However, previous studies have relied primarily on Legendre test functions, which have not yielded the expected accuracy improvements, limiting the effectiveness of VPINNs. Here, we introduce a set of test functions designed to avoid orthogonality. These test functions are evaluated against Legendre-based test functions on a solid mechanics benchmark constructed so that its solution exhibits a steep gradient. Our study demonstrates that, unlike Legendre-based test functions, the proposed test functions reduce the prediction error without increasing computational cost. Moreover, the proposed test functions enhance the reliability of VPINNs, yielding consistent accuracy over repeated network training runs.
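To make the role of the test functions concrete, the following is a minimal sketch (not the authors' implementation) of the variational residual that a VPINN loss is built from, illustrated for the 1D Poisson problem -u'' = f on (0, 1) with homogeneous Dirichlet boundary conditions. A candidate solution is tested against Legendre-based test functions that vanish at the boundary; the function names, problem choice, and parameters here are illustrative assumptions.

```python
# Sketch of a VPINN-style variational loss in 1D (illustrative, not the
# paper's code). Weak form of -u'' = f on (0, 1) with u(0) = u(1) = 0:
#   R_k = int u'(x) v_k'(x) dx - int f(x) v_k(x) dx  for test functions v_k,
# and the variational loss is sum_k R_k^2.
import numpy as np
from numpy.polynomial import Polynomial, legendre

def variational_loss(u_x, f, n_test=6, n_quad=64):
    """Sum of squared weak-form residuals for a candidate derivative u_x."""
    t, w = legendre.leggauss(n_quad)   # Gauss-Legendre nodes/weights on [-1, 1]
    x = (t + 1) / 2                    # map to the physical domain [0, 1]
    loss = 0.0
    for k in range(n_test):
        # Test function v_k(t) = (1 - t^2) * P_k(t): a Legendre polynomial
        # damped so that v_k(-1) = v_k(1) = 0 (admissible for Dirichlet BCs).
        Pk = legendre.Legendre.basis(k).convert(kind=Polynomial)
        vk = Polynomial([1, 0, -1]) * Pk
        dvk_dt = vk.deriv()
        # Chain rule: dv/dx = 2 dv/dt and dx = dt/2, so
        #   int_0^1 u'(x) v'(x) dx = int_{-1}^{1} u'(x(t)) (dv/dt)(t) dt.
        stiffness = np.sum(w * u_x(x) * dvk_dt(t))
        load = 0.5 * np.sum(w * f(x) * vk(t))
        loss += (stiffness - load) ** 2
    return loss

# Sanity check: for the exact solution u = sin(pi x) of -u'' = pi^2 sin(pi x),
# every residual, and hence the loss, should vanish (up to quadrature error).
u_exact_x = lambda x: np.pi * np.cos(np.pi * x)   # u'(x)
f = lambda x: np.pi**2 * np.sin(np.pi * x)
print(variational_loss(u_exact_x, f))
```

In an actual VPINN, `u_x` would be the derivative of a neural network obtained by automatic differentiation, and this loss would be minimized by gradient descent; swapping the construction of `vk` for a different family is exactly where the proposed non-orthogonal test functions would enter.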