YIC2025

An Order-Preserving Multi-Stage Tensor Reduction Strategy with Efficient Interpolation/Regression for Reduced-Order Modeling

  • Kapadia, Harshit (Max Planck Institute Magdeburg)
  • Feng, Lihong (Max Planck Institute Magdeburg)
  • Benner, Peter (Max Planck Institute Magdeburg)


We present a novel multi-stage tensor reduction (MSTR) framework for tensorial data arising from experimental measurements or high-fidelity simulations of physical systems. The order p of the tensor under consideration can be arbitrarily large. At the heart of the framework is a series of strategic tensor factorizations and compressions, ultimately leading to a final order-preserving reduced representation of the original tensor. We augment the MSTR framework with efficient kernel-based interpolation/regression [Kapadia et al. 2024] over certain reduced tensor representations, resulting in a new non-intrusive model reduction approach capable of handling dynamical, parametric steady, and parametric dynamical systems. We formalize our ideas using the tensor t-product algebra [Kilmer and Martin 2011, Martin et al. 2013] and provide a rigorous upper bound on the error of the tensor approximation obtained from the MSTR strategy. The MSTR framework efficiently reduces high-dimensional tensorial data; compared to the typical reduction offered by the singular value decomposition of matricized tensors, we observe a significant improvement in reconstruction accuracy with considerably fewer entries in the final reduced tensor representation. Numerical results for numerous large-scale complex systems highlight the robustness of our MSTR framework and the reduced-order model based on it.
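The abstract does not disclose the internals of the MSTR factorizations. As an illustrative sketch only, the following Python/NumPy code implements the two basic building blocks of the cited t-product algebra [Kilmer and Martin 2011] for third-order tensors: the t-product (FFT along the tube mode, frontal-slice matrix products in the Fourier domain) and a truncated t-SVD. Function names (`t_product`, `t_transpose`, `truncated_tsvd`) are hypothetical and are not taken from the paper; this is not the authors' MSTR algorithm, merely the algebra it builds on.

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors (Kilmer & Martin 2011):
    FFT along the third (tube) mode, frontal-slice matrix products
    in the Fourier domain, inverse FFT back."""
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)  # slice-wise matmul
    return np.real(np.fft.ifft(Ch, axis=2))

def t_transpose(A):
    """t-product transpose: transpose each frontal slice and
    reverse the order of slices 1..n3-1."""
    At = A.transpose(1, 0, 2)
    return np.concatenate([At[:, :, :1], At[:, :, -1:0:-1]], axis=2)

def truncated_tsvd(A, r):
    """Rank-r truncated t-SVD of a real third-order tensor:
    per-slice SVDs in the Fourier domain; conjugate symmetry is
    enforced so the factors come back real. A is approximated by
    the t-product U * S * V^T."""
    n1, n2, n3 = A.shape
    Ah = np.fft.fft(A, axis=2)
    Uh = np.zeros((n1, r, n3), dtype=complex)
    Sh = np.zeros((r, r, n3), dtype=complex)
    Vh = np.zeros((n2, r, n3), dtype=complex)
    for k in range(n3 // 2 + 1):
        U, s, Vt = np.linalg.svd(Ah[:, :, k], full_matrices=False)
        Uh[:, :, k] = U[:, :r]
        Sh[:, :, k] = np.diag(s[:r])
        Vh[:, :, k] = Vt[:r, :].conj().T
    for k in range(n3 // 2 + 1, n3):  # remaining slices by conjugate symmetry
        Uh[:, :, k] = Uh[:, :, n3 - k].conj()
        Sh[:, :, k] = Sh[:, :, n3 - k].conj()
        Vh[:, :, k] = Vh[:, :, n3 - k].conj()
    ifft = lambda T: np.real(np.fft.ifft(T, axis=2))
    return ifft(Uh), ifft(Sh), ifft(Vh)
```

With full truncation rank r = min(n1, n2), the t-SVD reconstructs the tensor exactly; truncating r is what yields the kind of compressed representation the abstract compares against the SVD of matricized tensors.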