Local feature filtering for scalable and well-conditioned domain-decomposed random feature methods

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Random Feature Methods (RFMs) [1] and their variants, such as extreme learning machine finite-basis physics-informed neural networks (ELM-FBPINNs) [2], offer a scalable approach for solving partial differential equations (PDEs): localized, overlapping, randomly initialized neural network basis functions approximate the PDE solution and are trained to minimize PDE residuals by solving structured least-squares problems. This combination leverages the approximation power of randomized neural networks, the parallelism of domain decomposition, and the accuracy and efficiency of least-squares solvers. However, the resulting structured least-squares systems are often severely ill-conditioned, due to local redundancy among random basis functions and correlation introduced by subdomain overlaps, which significantly slows the convergence of standard solvers. In this work, we introduce a block rank-revealing QR (RRQR) filtering and preconditioning strategy that operates directly on the structured least-squares problem. First, local RRQR factorizations identify and remove redundant basis functions while preserving numerically informative ones, reducing problem size and improving conditioning. Second, we use these factorizations to construct a right preconditioner for the global problem that preserves block sparsity and numerical stability. Third, we derive deterministic bounds on the condition number of the preconditioned system, with probabilistic refinements for small overlaps. We validate our approach on challenging, multi-scale PDE problems in 1D, 2D, and (2+1)D, demonstrating reductions in condition numbers by up to eleven orders of magnitude, LSQR convergence speedups by factors of 10–1000, and higher accuracy than both unpreconditioned and additive Schwarz-preconditioned baselines, all at significantly lower memory and computational cost.
These results establish RRQR-based preconditioning as a scalable, accurate, and efficient enhancement for RFM-based PDE solvers.
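To illustrate the filtering idea from the abstract, the following is a minimal sketch of rank-revealing QR used to drop numerically redundant basis-function columns from a local feature matrix, via SciPy's column-pivoted QR. The tolerance `tol` and the toy matrix are illustrative choices, not values taken from the paper.

```python
import numpy as np
from scipy.linalg import qr

def rrqr_filter(A, tol=1e-10):
    """Keep only columns of A that are numerically informative,
    judged by a column-pivoted (rank-revealing) QR factorization."""
    # Pivoted QR: A[:, piv] = Q @ R, with |R[0,0]| >= |R[1,1]| >= ...
    Q, R, piv = qr(A, mode="economic", pivoting=True)
    diag = np.abs(np.diag(R))
    # Columns whose pivot magnitude falls below tol * largest pivot
    # are treated as redundant and removed.
    keep = piv[diag > tol * diag[0]]
    return A[:, np.sort(keep)]

# Toy local feature matrix: the third column duplicates the first,
# mimicking redundancy among random basis functions.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
A[:, 2] = A[:, 0]

A_f = rrqr_filter(A)
print(A_f.shape)  # (50, 2): the duplicate column is filtered out
```

The filtered matrix has full column rank, so its condition number is finite, whereas the original is numerically rank-deficient; in the paper's setting this filtering is applied blockwise, per subdomain, before assembling the global preconditioned system.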

Original language: English
Article number: 118583
Number of pages: 27
Journal: Computer Methods in Applied Mechanics and Engineering
Volume: 449
DOIs
Publication status: Published - 1 Feb 2026

Bibliographical note

Publisher Copyright:
© 2025 The Author(s)

Keywords

  • Numerical solution of PDEs
  • Overlapping domain decomposition
  • Physics-informed machine learning
  • Preconditioning of least-squares problems
  • Random feature methods
  • Rank-revealing QR factorization

