Abstract
Neural networks can learn function operators: non-linear mappings between spaces of continuous functions. This is particularly interesting in scientific machine learning (SciML), where physical quantities are often represented as functions of space or time. Using neural networks for function operator learning can drastically accelerate numerical simulations, offering speedups of several orders of magnitude over traditional numerical solvers. This overview talk addresses the latest advances in operator learning, focusing on architectures such as DeepONets, Fourier Neural Operators (FNOs), and other recent developments in the field. Along with hands-on examples, we outline how these methods enable discretization-invariant representations, super-resolution capabilities, and physics-informed training.