The paper explores combining two popular neural operators, DeepONet and Fourier Neural Operator (FNO), with three types of recurrent neural networks (simple RNN, GRU, and LSTM) to address the challenge of long-time integration and extrapolation in dynamical systems modeling.
The key findings are:
1. The integrated neural operator–recurrent network architectures show lower error and slower error growth than vanilla neural operators in both interpolation and extrapolation tasks.
2. Simultaneous training of the integrated framework, where the neural operator and recurrent network are trained together, yields better stability and accuracy than the two-step approach in which each component is trained separately.
3. The gated recurrent networks, GRU and LSTM, offer advantages over the simple RNN in preserving the shape of the solution and limiting error propagation, especially in extrapolation.
4. The FNO-based integrated architectures perform more robustly than the DeepONet-based ones, particularly in extrapolation scenarios.
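The integration idea above can be sketched in a toy form: a one-step operator proposes the next state of the system, while a recurrent cell carries a hidden memory across time steps and corrects the operator's prediction during an autoregressive rollout. This is a minimal conceptual sketch, not the paper's implementation: the "operator" here is a fixed random linear map standing in for a trained DeepONet/FNO, and the hand-rolled GRU cell is untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Hand-rolled GRU cell (biases omitted for brevity)."""
    def __init__(self, d, rng, scale=0.1):
        # Each weight maps the concatenated [input, hidden] vector to d units.
        self.Wz = scale * rng.standard_normal((d, 2 * d))  # update gate
        self.Wr = scale * rng.standard_normal((d, 2 * d))  # reset gate
        self.Wh = scale * rng.standard_normal((d, 2 * d))  # candidate state

    def __call__(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)
        r = sigmoid(self.Wr @ xh)
        h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1 - z) * h + z * h_cand

d = 8                                           # state dimension (toy)
operator = 0.1 * rng.standard_normal((d, d))    # stand-in for a trained neural operator
gru = GRUCell(d, rng)

def rollout(u0, steps):
    """Autoregressive rollout: operator proposes the next state, GRU corrects it."""
    u, h = u0, np.zeros(d)
    traj = [u]
    for _ in range(steps):
        v = operator @ u        # one-step operator prediction
        h = gru(v, h)           # recurrent memory of the trajectory so far
        u = v + h               # corrected next state fed back in
        traj.append(u)
    return np.stack(traj)

traj = rollout(rng.standard_normal(d), steps=20)
print(traj.shape)  # (21, 8)
```

In the paper's simultaneous-training setting, the operator and the recurrent cell would be optimized jointly on the rollout loss, rather than the operator being frozen as in this sketch.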
The proposed recurrent neural operator framework shows promise in improving the long-time prediction capabilities of dynamical systems modeling, which is crucial for real-world applications. Further research is needed to gain a deeper theoretical understanding of the error propagation and stability properties of these integrated architectures.