The paper proposes SineNet, a multi-stage neural network architecture for solving time-dependent partial differential equations (PDEs). The key insight is that the U-Net architecture, commonly used for this task, suffers from a misalignment between the features carried by the skip connections and those in the upsampling path, caused by the temporal evolution of the latent features.
To address this, SineNet consists of multiple sequentially connected U-shaped network blocks, referred to as waves. By distributing the latent evolution across multiple stages, the degree of spatial misalignment between the input and target encountered by each wave is reduced, thereby alleviating the challenges associated with modeling advection.
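To make the multi-wave design concrete, below is a minimal PyTorch sketch of sequentially composed U-shaped blocks. It is an illustration under assumptions, not the authors' implementation: the class names (`Wave`, `SineNetSketch`), layer choices, and hyperparameters such as `num_waves` are hypothetical.

```python
# Minimal sketch (assumed structure, not the authors' code): a stack of
# small U-shaped blocks ("waves") applied sequentially, each performing
# only part of the latent temporal evolution.
import torch
import torch.nn as nn


class Wave(nn.Module):
    """One U-shaped block: a single downsampling level, a bottleneck,
    and an upsampling level joined by a skip connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.down = nn.Sequential(
            nn.Conv2d(channels, 2 * channels, kernel_size=3, stride=2, padding=1),
            nn.GELU(),
        )
        self.bottleneck = nn.Sequential(
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=3, padding=1),
            nn.GELU(),
        )
        self.up = nn.ConvTranspose2d(2 * channels, channels, kernel_size=2, stride=2)
        # Fuse the skip connection with the upsampled features.
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        skip = x
        h = self.bottleneck(self.down(x))
        h = self.up(h)
        return self.fuse(torch.cat([h, skip], dim=1))


class SineNetSketch(nn.Module):
    """Sequential composition of several waves; each wave only needs to
    account for a fraction of the total spatial misalignment."""

    def __init__(self, in_channels: int, hidden: int = 64, num_waves: int = 8):
        super().__init__()
        self.embed = nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1)
        self.waves = nn.ModuleList(Wave(hidden) for _ in range(num_waves))
        self.project = nn.Conv2d(hidden, in_channels, kernel_size=3, padding=1)

    def forward(self, u_t: torch.Tensor) -> torch.Tensor:
        h = self.embed(u_t)
        for wave in self.waves:
            h = wave(h)          # partial latent evolution per wave
        return self.project(h)   # predicted solution at the next time step


if __name__ == "__main__":
    model = SineNetSketch(in_channels=2, hidden=32, num_waves=4)
    u = torch.randn(1, 2, 64, 64)   # a 2-channel 64x64 field
    print(model(u).shape)           # torch.Size([1, 2, 64, 64])
```

The ablation on the number of waves (discussed below) corresponds to varying `num_waves` while shrinking the per-wave width so the total parameter count stays fixed.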
The paper also analyzes the role of skip connections in enabling both parallel and sequential processing of multi-scale information, and discusses the importance of encoding boundary conditions, particularly for periodic boundaries.
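As an illustration of encoding periodic boundary conditions, the sketch below swaps zero padding for circular padding in a convolution, so features at one edge of the domain see their periodic neighbors on the opposite edge. This is a standard technique offered as an assumed example, not a quote of the paper's module.

```python
# Hedged illustration of periodic-boundary encoding via circular padding;
# a generic example, not the authors' exact layer.
import torch
import torch.nn as nn


class PeriodicConv2d(nn.Module):
    """3x3 convolution whose padding wraps around the domain boundaries."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(
            in_channels, out_channels,
            kernel_size=3, padding=1, padding_mode="circular",
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(x)


if __name__ == "__main__":
    layer = PeriodicConv2d(2, 16)
    u = torch.randn(1, 2, 64, 64)
    print(layer(u).shape)  # torch.Size([1, 16, 64, 64])
```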
Empirical evaluations are conducted on multiple challenging PDE datasets, including the Navier-Stokes equations and shallow water equations. The results demonstrate consistent performance improvements of SineNet over existing baseline methods. An ablation study further shows that increasing the number of waves in SineNet while maintaining the same number of parameters leads to monotonically improved performance.