Operators are ubiquitous in mathematics and physics: they describe the dynamics of physical systems, such as the Navier-Stokes equations in fluid dynamics. As the solutions of such systems are functions, it is natural to transfer the concept of mappings between functions to machine learning.
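To make the idea concrete, here is a minimal sketch (using plain NumPy, with an illustrative operator chosen for this example) of an operator that maps one function to another: the antiderivative operator `G(u)(x) = ∫₀ˣ u(t) dt`, applied to discretized function samples.

```python
import numpy as np

def G(u, x):
    """Antiderivative operator: maps samples u of a function on grid x
    to samples of its antiderivative, via the trapezoidal rule."""
    dx = np.diff(x)
    increments = 0.5 * (u[:-1] + u[1:]) * dx
    return np.concatenate([[0.0], np.cumsum(increments)])

x = np.linspace(0.0, np.pi, 101)
v = G(np.cos(x), x)  # approximates sin(x), since d/dx sin = cos
print(np.max(np.abs(v - np.sin(x))))  # small discretization error
```

Operator learning replaces the hand-written rule inside `G` with a neural network trained on pairs of input and output functions.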

Check out the Documentation page
to learn more about operator learning in **continuiti**.
If you prefer to jump into code directly, take a look at our
collection of Examples.

**continuiti** implements several neural operator architectures, including DeepONet [Lu21L], Neural Operators [Kov23N], and BelNet [Zha23B].
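To illustrate one such architecture, the following is a minimal NumPy sketch of a DeepONet forward pass as described in [Lu21L]: a branch net encodes the input function sampled at fixed sensor points, a trunk net encodes the query coordinate, and their dot product yields the output function value. All weights and layer sizes here are illustrative assumptions, not continuiti's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    """Two-layer perceptron with tanh activation."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

num_sensors, coord_dim, width, basis = 100, 1, 64, 32

# Randomly initialized (untrained) weights for branch and trunk nets.
Wb1, bb1 = rng.normal(size=(num_sensors, width)), np.zeros(width)
Wb2, bb2 = rng.normal(size=(width, basis)), np.zeros(basis)
Wt1, bt1 = rng.normal(size=(coord_dim, width)), np.zeros(width)
Wt2, bt2 = rng.normal(size=(width, basis)), np.zeros(basis)

def deeponet(u, y):
    # u: (batch, num_sensors) function samples; y: (batch, coord_dim) query points
    b = mlp(u, Wb1, bb1, Wb2, bb2)  # branch encodes the input function
    t = mlp(y, Wt1, bt1, Wt2, bt2)  # trunk encodes the evaluation point
    return np.sum(b * t, axis=-1)   # dot product gives G(u)(y)

u = rng.random((8, num_sensors))  # 8 input functions, each sampled at 100 sensors
y = rng.random((8, coord_dim))    # one query coordinate per function
print(deeponet(u, y).shape)
```

Training such a model amounts to fitting the branch and trunk weights on pairs of input functions and output function evaluations.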

To see which new methods, features, and improvements are planned, check out the issues on GitHub.

# References

[Lu21L] Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators.

[Kov23N] Neural Operator: Learning Maps Between Function Spaces With Applications to PDEs.

[Zha23B] BelNet: basis enhanced learning, a mesh-free neural operator.