Key Concepts
A robust Bayesian sparse learning algorithm is proposed for discovering partial differential equations (PDEs) with variable coefficients, improving noise robustness and model selection.
Abstract
The paper addresses the challenges of data-driven discovery of PDEs and proposes a Bayesian sparse learning algorithm for PDE discovery with variable coefficients. It introduces the threshold Bayesian group Lasso with spike-and-slab prior (tBGL-SS) and demonstrates its robustness and efficiency in model selection. Compared against baseline methods, tBGL-SS performs better in noisy environments and offers stronger model selection criteria. Numerical experiments on classical benchmark PDEs illustrate the method's effectiveness and efficiency, and the Bayesian statistical framework additionally provides uncertainty quantification for the recovered coefficients.
- Introduction to the challenges of data-driven discovery of PDEs
- Proposal of a Bayesian sparse learning algorithm for discovering PDEs with variable coefficients
- Comparison with baseline methods, demonstrating the method's superiority in noisy environments
- Numerical experiments showing the method's effectiveness and efficiency
- Discussion of uncertainty quantification and model selection criteria, highlighting the algorithm's robustness under noisy conditions
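The library-based sparse-regression workflow the bullets above summarize can be sketched in a few lines. This is a hedged, minimal illustration, not the paper's tBGL-SS sampler: it manufactures a smooth field, forms the time derivative from Burgers'-type dynamics u_t = -u·u_x + ν·u_xx using finite differences, and recovers the two active terms from a larger candidate library by hard-thresholding a least-squares fit (a deterministic stand-in for the Bayesian posterior). All names here are illustrative.

```python
import numpy as np

# Hedged sketch of library-based sparse PDE discovery (not the paper's
# tBGL-SS sampler). We manufacture a field u(t, x), form u_t from
# Burgers'-type dynamics u_t = -u*u_x + nu*u_xx using finite-difference
# derivatives, then recover the two active terms from a larger candidate
# library by hard-thresholding a least-squares fit.

def d_dx(u, dx, order=1):
    """Repeated central finite differences along the spatial axis."""
    for _ in range(order):
        u = np.gradient(u, dx, axis=1)
    return u

nx, nt, nu = 128, 64, 0.1
x = np.linspace(-np.pi, np.pi, nx)
t = np.linspace(0.0, 1.0, nt)
dx = x[1] - x[0]

# Manufactured smooth field (two Fourier modes, so the library columns
# below stay linearly independent).
u = np.sin(x)[None, :] * np.exp(-t)[:, None] \
    + 0.5 * np.cos(2 * x)[None, :] * np.exp(-2 * t)[:, None]
u_x = d_dx(u, dx, 1)
u_xx = d_dx(u, dx, 2)
u_t = -u * u_x + nu * u_xx          # ground-truth dynamics

# Candidate library Theta: each column is one candidate RHS term.
names = ["u", "u_x", "u_xx", "u*u_x", "u^2"]
Theta = np.stack([u, u_x, u_xx, u * u_x, u**2], axis=-1).reshape(-1, len(names))
rhs = u_t.reshape(-1)

coef, *_ = np.linalg.lstsq(Theta, rhs, rcond=None)
coef[np.abs(coef) < 0.02] = 0.0     # hard threshold (loosely mirrors t_RMS)
print({n: float(round(c, 3)) for n, c in zip(names, coef)})
# -> {'u': 0.0, 'u_x': 0.0, 'u_xx': 0.1, 'u*u_x': -1.0, 'u^2': 0.0}
```

Because the right-hand side lies exactly in the span of the library columns, the fit recovers the true coefficients, and the threshold removes the numerically negligible ones; in the paper, the least-squares point estimate is replaced by posterior sampling under the spike-and-slab group prior.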
Statistics
In the Burgers' equation experiment, t_RMS = 0.02 is used for both clean data and data with 1% σu noise, and t_GE = 0.1 in all cases.
In the Advection-Diffusion equation experiment, t_RMS = 0.02 is used for clean data and 1% σu noise data, t_RMS = 0.01 for 2% σu noise data, and t_GE = 0.08 in all cases.
In the Kuramoto-Sivashinsky equation experiment, t_RMS = 0.1 is used for clean data and 1% σu noise data, t_RMS = 0.01 for 2% σu noise data, and t_GE = 0.05 in all cases.
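Assuming t_RMS acts as a cutoff on the root-mean-square magnitude of each candidate term's posterior samples (one plausible reading of the thresholds above; the paper's exact rule may differ), the pruning step can be sketched as follows, with synthetic draws standing in for real MCMC output:

```python
import numpy as np

# Hedged sketch: prune candidate PDE terms by thresholding the RMS of
# their posterior coefficient samples. Names and the exact rule are
# illustrative, not the paper's tBGL-SS definition.

def rms_threshold(samples, t_rms):
    """samples: (n_draws, n_terms) posterior draws per candidate term.
    Keep a term when the RMS of its draws exceeds t_rms; zero the rest."""
    rms = np.sqrt(np.mean(samples**2, axis=0))
    keep = rms > t_rms
    means = np.where(keep, samples.mean(axis=0), 0.0)
    return keep, means

rng = np.random.default_rng(1)
# Three candidate terms: one strong, one weak but real, one pure noise.
samples = np.column_stack([
    rng.normal(-1.0, 0.05, 500),   # strong active term
    rng.normal(0.1, 0.02, 500),    # weak but real term
    rng.normal(0.0, 0.005, 500),   # spurious term near zero
])
keep, means = rms_threshold(samples, t_rms=0.02)
print(keep.tolist())   # -> [True, True, False]
```

The RMS criterion keeps the weak-but-consistent term (RMS ≈ 0.1) while discarding the noise term (RMS ≈ 0.005), which is the behavior the equation-specific t_RMS values above are tuned for.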
Quotes
"The proposed tBGL-SS method outperforms SGTR and group Lasso in noisy environments and provides better model selection criteria."
"The uncertainty quantification of the reconstructed coefficients benefits from the Bayesian statistical framework adopted in this work."
"The method's robustness and efficiency in model selection criteria are highlighted through numerical experiments on classical benchmark PDEs."