Reference
A comprehensive and fair comparison of two neural operators (with practical extensions) based on FAIR data,
Computer Methods in Applied Mechanics and Engineering (2022)
Abstract
Neural operators can learn nonlinear mappings between function spaces and offer a new simulation paradigm for real-time prediction of complex dynamics in diverse realistic applications, as well as for system identification in science and engineering. Herein, we investigate the performance of two neural operators that have shown promising results so far, and we develop new practical extensions that make them more accurate, more robust, and, importantly, more suitable for industrial-complexity applications. The first neural operator, DeepONet, was published in 2019 (Lu et al., 2019), and its original architecture is based on the universal approximation theorem of Chen & Chen (1995). The second one, the Fourier Neural Operator (FNO), was published in 2020 (Li et al., 2020) and is based on parameterizing the integral kernel in Fourier space. DeepONet is constructed as a summation of products of two neural networks (NNs): a branch NN that encodes the input function and a trunk NN that encodes the locations at which the output function is evaluated; both NNs admit general architectures, e.g., the branch NN can be replaced with a CNN or a ResNet. According to Kovachki et al. (2021), FNO in its continuous form can be viewed conceptually as a DeepONet with a specific branch-NN architecture and a trunk NN represented by a trigonometric basis. To compare FNO with DeepONet computationally for realistic setups, we develop several extensions of FNO that can handle complex geometric domains as well as mappings where the input and output function spaces have different dimensions. We also develop an extended DeepONet with special features that provide inductive bias and accelerate training, and we present a faster implementation of DeepONet whose computational cost is comparable to that of FNO, which relies on the Fast Fourier Transform. We consider 16 different benchmarks to demonstrate the relative performance of the two neural operators, including instability wave analysis in hypersonic boundary layers, prediction of the vorticity field of a flapping airfoil, and porous media simulations in complex-geometry domains, among others. We follow the guiding principles of FAIR (Findability, Accessibility, Interoperability, and Reusability) for scientific data management and stewardship. The performance of DeepONet and FNO is comparable for relatively simple settings, but for complex geometries the performance of FNO deteriorates greatly. We also compare the two neural operators theoretically and obtain similar error estimates for DeepONet and FNO under the same regularity assumptions.
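The abstract describes DeepONet as a summation of products of a branch NN (acting on the input function sampled at fixed sensors) and a trunk NN (acting on the output coordinates). Below is a minimal PyTorch sketch of that forward pass; the layer sizes, variable names, and the choice of a 1D output coordinate are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal unstacked-DeepONet sketch (assumed setup, not the paper's exact code).
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    def __init__(self, m=100, p=40, width=64):
        super().__init__()
        # Branch NN: encodes the input function u sampled at m fixed sensors.
        self.branch = nn.Sequential(nn.Linear(m, width), nn.Tanh(), nn.Linear(width, p))
        # Trunk NN: encodes the coordinates y where the output is evaluated.
        self.trunk = nn.Sequential(nn.Linear(1, width), nn.Tanh(), nn.Linear(width, p))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u, y):
        # u: (batch, m) sensor values; y: (n_points, 1) query locations.
        b = self.branch(u)          # (batch, p)
        t = self.trunk(y)           # (n_points, p)
        # Summation of products of the branch and trunk outputs.
        return b @ t.T + self.bias  # (batch, n_points)
```

As the abstract notes, the branch NN here could be swapped for a CNN or ResNet without changing the product-and-sum structure.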
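FNO's core operation, parameterizing the integral kernel in Fourier space, can likewise be sketched as a single 1D spectral convolution layer. This is a minimal illustration under assumed settings (channel count, number of retained modes, and all names are made up here), not the authors' reference implementation; a full Fourier layer would also add a pointwise linear skip path and a nonlinearity.

```python
# Minimal 1D spectral convolution sketch in the spirit of FNO (assumed setup).
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels=64, modes=16):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (channels * channels)
        # Learnable complex weights for the lowest `modes` Fourier modes:
        # this is the kernel parameterized directly in Fourier space.
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x):
        # x: (batch, channels, n_grid) on a uniform grid.
        x_ft = torch.fft.rfft(x)                      # FFT along the grid axis
        out_ft = torch.zeros_like(x_ft)
        # Multiply the retained low-frequency modes by the learned kernel.
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weights
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to physical space
```

The reliance on the FFT over a uniform grid in this layer is what the abstract's extensions for complex geometric domains have to work around.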