Learning foundation operators and diffusion models over function spaces
Speaker:
Lu Lu, Yale University
Abstract:
As an emerging paradigm in scientific machine learning (SciML), deep neural operators, which we pioneered, can learn nonlinear operators of complex dynamical systems via neural networks. In this talk, I will present the vanilla deep operator network (DeepONet) and several extensions of DeepONet, such as DeepONet with Fourier decoder layers and geometry-dependent/manifold operator learning. I will demonstrate their effectiveness on diverse multiphysics and multiscale 3D problems, such as geological carbon sequestration, full waveform inversion, and topology optimization. I will present the first operator learning method that requires only one PDE solution, i.e., one-shot learning, by introducing a new concept, the local solution operator, based on the principle of locality of PDEs. I will also present the first systematic study of federated SciML for approximating functions and solving PDEs with data heterogeneity. Moreover, I will present our recent work on diffusion models, including FunDiff, a novel framework for diffusion models over function spaces for physics-informed generative modeling and for solving forward and inverse PDE problems, and RED-DiffEq, which uses regularization by denoising diffusion models to solve inverse PDE problems.
Bio:
Lu Lu is an Assistant Professor in the Department of Statistics and Data Science at Yale University. Prior to joining Yale, he was an Assistant Professor in the Department of Chemical and Biomolecular Engineering at the University of Pennsylvania from 2021 to 2023, and an Applied Mathematics Instructor in the Department of Mathematics at the Massachusetts Institute of Technology from 2020 to 2021. He obtained his Ph.D. in Applied Mathematics from Brown University in 2020, along with master's degrees in Engineering, Applied Mathematics, and Computer Science from Brown University, and bachelor's degrees in Mechanical Engineering, Economics, and Computer Science from Tsinghua University in 2013. His current research lies in scientific machine learning and artificial intelligence for science, including theory, algorithms, software, and their applications to engineering, physical, and biological problems. His broader research interests include multiscale modeling and high-performance computing for physical and biological systems. He has received the Department of Energy Early Career Award, the MIT Technology Review Innovators Under 35 Asia Pacific award, the Mathematics Young Investigator Award from MDPI, and the Joukowsky Family Foundation Outstanding Dissertation Award from Brown University.
Host:
Dongkuan (DK) Xu, CSC