Variationally mimetic operator networks

Dhruv Patel, Deep Ray, Michael R.A. Abdelmalik, Thomas J.R. Hughes, Assad A. Oberai (Corresponding author)

Research output: Contribution to journal › Article › Academic › peer-review

2 Citations (Scopus)


In recent years, operator networks have emerged as promising deep learning tools for approximating the solutions of partial differential equations (PDEs). These networks map input functions that describe material properties, forcing functions, and boundary data to the solution of a PDE. This work describes a new architecture for operator networks, called the variationally mimetic operator network (VarMiON), that mimics the form of the numerical solution obtained from an approximate variational or weak formulation of the problem. Like the conventional Deep Operator Network (DeepONet), the VarMiON is composed of one sub-network that constructs the basis functions for the output and another that constructs the coefficients of these basis functions. In contrast to the DeepONet, however, the architecture of these sub-networks in the VarMiON is precisely determined. An analysis of the error in the VarMiON solution reveals that it contains contributions from the error in the training data, the training error, the quadrature error in sampling input and output functions, and a “covering error” that measures the distance between the test input functions and the nearest functions in the training dataset. It also depends on the stability constants of the exact solution operator and its VarMiON approximation. The application of the VarMiON to a canonical elliptic PDE and a nonlinear PDE reveals that, for approximately the same number of network parameters, the VarMiON on average incurs smaller errors than a standard DeepONet and a recently proposed multiple-input operator network (MIONet). Furthermore, its performance is more robust to variations in the input functions, the techniques used to sample the input and output functions, the techniques used to construct the basis functions, and the number of input functions. Moreover, it consistently outperforms the baseline methods across a range of dataset sizes. The data and code accompanying this manuscript are publicly available at
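The branch/trunk decomposition described in the abstract, where one sub-network produces coefficients from samples of the input function and another produces basis functions evaluated at output points, can be sketched as follows. This is a minimal illustration of the DeepONet-style output form u(x) ≈ Σᵢ cᵢ(f)·τᵢ(x), not the paper's exact VarMiON architecture; all layer sizes, names, and the use of untrained random weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight tanh MLP standing in for a trained sub-network."""
    Ws = [rng.standard_normal((m, n)) / np.sqrt(m)
          for m, n in zip(sizes[:-1], sizes[1:])]
    def forward(x):
        for W in Ws[:-1]:
            x = np.tanh(x @ W)
        return x @ Ws[-1]
    return forward

# Branch net: maps m samples of an input function (e.g. a forcing term f
# at sensor points) to p coefficients.
# Trunk net: maps a spatial point x in R^2 to p basis-function values.
m, p = 32, 8
branch = mlp([m, 64, p])   # coefficients c(f)
trunk = mlp([2, 64, p])    # basis functions tau(x)

def operator_net(f_samples, x):
    """u(x) ~ sum_i c_i(f) * tau_i(x): the shared output form of
    DeepONet-style operator networks."""
    c = branch(f_samples)   # shape (p,)
    tau = trunk(x)          # shape (p,)
    return c @ tau          # scalar approximation of u(x)

f = rng.standard_normal(m)              # sampled input function
u = operator_net(f, np.array([0.5, 0.5]))
```

The paper's contribution is that, in the VarMiON, the structure of these two sub-networks is dictated by the discrete weak form of the PDE rather than chosen freely.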

Original language: English
Article number: 116536
Number of pages: 30
Journal: Computer Methods in Applied Mechanics and Engineering
Publication status: Published - 1 Feb 2024


AAO and DR acknowledge support from ARO, United States, grant W911NF2010050. DP acknowledges support from the Stephen Timoshenko Distinguished Postdoctoral Fellowship at Stanford University, United States.

Funder: Army Research Office (ARO)
Funder number: W911NF2010050


    • Deep neural operator
    • Deep operator network
    • Error analysis
    • Variational formulation


