Scalable and Efficient Bayesian Optimal Experimental Design with Derivative-Informed Neural Operators
This article develops an accurate, scalable, and efficient computational framework for Bayesian optimal experimental design (OED) by leveraging derivative-informed neural operators (DINOs). The proposed method addresses the key challenges in Bayesian OED: the high computational cost of evaluating the parameter-to-observable (PtO) map and its derivative, the curse of dimensionality in the parameter and experimental design spaces, and the combinatorial optimization problem of sensor selection.
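To make the sensor-selection subproblem concrete, here is a minimal sketch (not the paper's algorithm) of a standard greedy heuristic for D-optimal sensor placement. It assumes access to a Jacobian `J` of the PtO map, such as one supplied cheaply by a DINO surrogate, and greedily adds the sensor whose row gives the largest gain in `log det(I + J_S^T J_S)`; the function name and setup are illustrative.

```python
import numpy as np

def greedy_d_optimal(J, k):
    """Greedily select k sensor rows of J maximizing log det(I + J_S^T J_S).

    J : (n_sensors, n_params) Jacobian of the PtO map (here a stand-in for
        a surrogate-provided derivative); k : number of sensors to select.
    """
    n_sensors, _ = J.shape
    selected = []
    M = np.eye(J.shape[1])  # information matrix, initialized to the identity
    for _ in range(k):
        best_gain, best_i = -np.inf, None
        for i in range(n_sensors):
            if i in selected:
                continue
            j = J[i]
            # rank-one update: log det(M + j j^T) = log det M + log(1 + j^T M^{-1} j)
            gain = np.log(1.0 + j @ np.linalg.solve(M, j))
            if gain > best_gain:
                best_gain, best_i = gain, i
        selected.append(best_i)
        M = M + np.outer(J[best_i], J[best_i])
    return selected

# toy example with a random Jacobian
rng = np.random.default_rng(0)
J = rng.standard_normal((10, 4))
chosen = greedy_d_optimal(J, 3)
print(chosen)
```

This exhaustive greedy loop costs O(k · n_sensors) linear solves, which is tractable only because the surrogate makes Jacobian rows cheap; the paper's actual optimization strategy may differ.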