Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs

Seungjun Lee will talk about the Inducing Point Operator Transformer (IPOT), an attention-based neural operator architecture that addresses two challenges in solving partial differential equations (PDEs): flexibility in handling irregular, arbitrary input and output formats, and scalability to large discretizations.

Abstract

Solving partial differential equations (PDEs) by learning the solution operators has emerged as an attractive alternative to traditional numerical methods. However, implementing such architectures presents two main challenges: flexibility in handling irregular and arbitrary input and output formats, and scalability to large discretizations. To address these issues, we introduce an attention-based model called an inducing point operator transformer (IPOT). Inspired by inducing point methods, IPOT is designed to handle any input function and output query while capturing global interactions in a computationally efficient way. Our experimental results demonstrate that IPOT achieves performance competitive with state-of-the-art methods, at manageable computational complexity, on an extensive range of PDE benchmarks and real-world weather forecasting scenarios.
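To make the inducing-point idea concrete: observations of the input function are compressed by cross-attention into a small set of learned latent "inducing points", processed there with self-attention, and read out by cross-attending arbitrary output query coordinates to the latents, so attention cost scales with the number of latents rather than quadratically with the discretization. Below is a minimal PyTorch sketch of that encode-process-decode pattern; all module names, sizes, and the coordinate embedding are illustrative assumptions, not the paper's exact architecture.

# Minimal sketch of an inducing-point attention operator (encode-process-decode).
# Hypothetical names and dimensions; not the authors' implementation.
import torch
import torch.nn as nn

class InducingPointOperator(nn.Module):
    def __init__(self, in_dim, out_dim, dim=128, num_latents=64, heads=4, depth=2):
        super().__init__()
        # Small set of learned "inducing point" latents: attention cost scales
        # with num_latents, not with the input discretization size.
        self.latents = nn.Parameter(torch.randn(num_latents, dim) * 0.02)
        self.embed_in = nn.Linear(in_dim, dim)   # (coords, values) -> input tokens
        self.embed_query = nn.Linear(2, dim)     # output query coordinates -> tokens
        self.enc = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.proc = nn.ModuleList(
            [nn.MultiheadAttention(dim, heads, batch_first=True) for _ in range(depth)]
        )
        self.dec = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.head = nn.Linear(dim, out_dim)

    def forward(self, inputs, queries):
        # inputs:  (B, N, in_dim) irregular point set (coordinates + field values)
        # queries: (B, M, 2)      arbitrary output coordinates
        B = inputs.shape[0]
        z = self.latents.unsqueeze(0).expand(B, -1, -1)
        x = self.embed_in(inputs)
        z, _ = self.enc(z, x, x)        # encode: latents attend to inputs, O(N * L)
        for attn in self.proc:
            h, _ = attn(z, z, z)        # process: self-attention over latents, O(L^2)
            z = z + h
        q = self.embed_query(queries)
        out, _ = self.dec(q, z, z)      # decode: queries attend to latents, O(M * L)
        return self.head(out)

# Example: 4096 scattered input points -> predictions at 500 arbitrary locations.
model = InducingPointOperator(in_dim=3, out_dim=1)
u_in = torch.randn(2, 4096, 3)   # e.g. (x, y, u(x, y)) per observation point
xy_out = torch.rand(2, 500, 2)
print(model(u_in, xy_out).shape)  # torch.Size([2, 500, 1])

With L latents, encoding costs O(N*L), latent processing O(L^2), and decoding O(M*L), compared to O((N+M)^2) for full self-attention over all input points and queries; this is the source of the scalability claim.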
