Core Concepts
A novel orthogonal greedy algorithm (OGA) combined with shallow neural networks is proposed to efficiently solve fractional Laplace equations.
Abstract
The paper explores the finite difference approximation of the fractional Laplace operator and combines it with a shallow neural network method to solve fractional Laplace equations.
Key highlights:
- The fractional Laplace operator is discretized with a finite difference scheme based on the Riemann-Liouville formulation of the fractional derivative (a reference form is sketched after this list).
- A shallow neural network is constructed to approximate the solution of the discretized fractional problem, with the OGA serving as the core optimizer.
- The OGA determines the optimization direction explicitly at each step and obtains better numerical results than traditional methods (a sketch of its select-then-project structure follows this list).
- Numerical experiments are conducted for both integer-order and fractional-order Laplace operators, demonstrating favorable convergence results.
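For reference, one commonly used Riemann-Liouville-based form of the one-dimensional fractional operator is the Riesz fractional derivative of order α ∈ (1, 2), built from the left and right Riemann-Liouville derivatives. This standard form is stated here as an assumption about the setup; the summary does not quote the paper's exact convention:

```latex
% Riesz fractional derivative on (0,1), assembled from left/right Riemann-Liouville derivatives
% (a standard convention; the paper's exact formula is not quoted in this summary).
\frac{\partial^{\alpha} u}{\partial |x|^{\alpha}}(x)
  = -\frac{1}{2\cos(\alpha\pi/2)}
    \Bigl( {}_{0}D_{x}^{\alpha} u(x) + {}_{x}D_{1}^{\alpha} u(x) \Bigr),
\qquad
{}_{0}D_{x}^{\alpha} u(x)
  = \frac{1}{\Gamma(2-\alpha)} \, \frac{d^{2}}{dx^{2}}
    \int_{0}^{x} \frac{u(t)}{(x-t)^{\alpha-1}} \, dt .
```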
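As a concrete illustration of the select-then-project structure of the OGA, here is a minimal NumPy sketch that greedily fits a shallow ReLU network to a target function on a 1D grid. This is an assumption-laden sketch, not the paper's solver: the random ReLU dictionary, the sampling ranges, and the names `oga_fit`, `n_neurons`, `n_candidates` are all hypothetical, and the paper applies the same greedy/orthogonal-projection idea to the discretized fractional Laplace problem rather than to plain function fitting.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def oga_fit(x, target, n_neurons=20, n_candidates=2000, seed=0):
    """Orthogonal greedy algorithm sketch: at each step, pick the dictionary
    neuron most correlated with the residual, then re-solve the least-squares
    problem over all selected neurons (the orthogonal projection step)."""
    rng = np.random.default_rng(seed)
    selected = []                       # chosen (w, b) pairs
    basis = np.empty((x.size, 0))       # evaluated selected neurons, one column each
    residual = target.copy()
    coeffs = np.zeros(0)
    for _ in range(n_neurons):
        # sample candidate neurons relu(w*x + b) and score them against the residual
        ws = rng.uniform(-5.0, 5.0, n_candidates)
        bs = rng.uniform(-5.0, 5.0, n_candidates)
        cand = relu(np.outer(x, ws) + bs)               # shape (len(x), n_candidates)
        norms = np.linalg.norm(cand, axis=0) + 1e-12
        k = int(np.argmax(np.abs(cand.T @ residual) / norms))
        selected.append((ws[k], bs[k]))
        basis = np.column_stack([basis, cand[:, k]])
        # orthogonal step: project the target onto the span of all selected neurons
        coeffs, *_ = np.linalg.lstsq(basis, target, rcond=None)
        residual = target - basis @ coeffs
    return selected, coeffs, residual

if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 201)
    u_true = x**3 * (1.0 - x)**3        # the paper's test solution
    _, _, res = oga_fit(x, u_true)
    print("final residual norm:", np.linalg.norm(res))
```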
Stats
The true solution is taken as u(x) = x^3(1-x)^3.
The forcing term f(x) is derived analytically from the true solution (a symbolic check for the integer-order case is sketched below).
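For the integer-order case the forcing term follows from f(x) = -u''(x); below is a minimal symbolic check under that assumption (the fractional-order forcing term would instead require applying the fractional operator to u):

```python
import sympy as sp

# Symbolic derivation of the forcing term for the integer-order case
# -u''(x) = f(x), using the paper's test solution u(x) = x^3 (1 - x)^3.
x = sp.symbols('x')
u = x**3 * (1 - x)**3
f = sp.expand(-sp.diff(u, x, 2))
print(f)   # prints the resulting polynomial forcing term
```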
Quotes
"The advantage of using neural networks to solve equations is that it can better approximate complex, high-dimensional function spaces."
"Based on the finite element method, Xu et al. proposed a relaxed greedy algorithm (RGA) and an orthogonal greedy algorithm (OGA) suitable for shallow neural networks, which may be the future development direction of theoretical analysis of neural networks."