Vahedi, A.M., Ilies, H.T. SPGD: Steepest Perturbed Gradient Descent Optimization. arXiv preprint arXiv:2411.04946v1 (2024).
This paper introduces a novel optimization algorithm, Steepest Perturbed Gradient Descent (SPGD), designed to overcome the limitations of traditional gradient descent methods in navigating complex, non-convex optimization landscapes. The research aims to demonstrate SPGD's efficacy in finding global or near-global optima in challenging scenarios where conventional methods often get trapped in local minima.
The authors develop SPGD by combining gradient descent with periodic, randomized perturbations (a minimal sketch of this loop appears below). They evaluate SPGD against established optimization algorithms, including Gradient Descent (GD), Perturbed Gradient Descent (PGD), MATLAB's fmincon function, and Simulated Annealing (SA), on a set of 2D benchmark functions and a 3D component packing problem. The benchmark functions are chosen for their known difficulty and their relevance in assessing how well optimizers navigate non-convex, potentially deceptive landscapes. The 3D component packing problem serves as a practical application, testing each algorithm's ability to handle collision constraints and maximize packing efficiency.
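To make the procedure concrete, the following Python sketch shows one way the described loop could look: ordinary gradient descent steps interleaved with periodic batches of random perturbations, from which the lowest-valued ("steepest") candidate is kept if it improves on the current point. The step size, perturbation schedule, candidate count, and the Rastrigin test function are illustrative assumptions, not the paper's exact pseudocode or benchmark settings.

```python
import numpy as np

def spgd(f, grad_f, x0, lr=0.002, n_iters=2000,
         perturb_every=50, n_candidates=10, perturb_scale=0.5, seed=0):
    """Minimal sketch of an SPGD-style loop.

    Assumption (not the paper's exact pseudocode): every `perturb_every`
    iterations, `n_candidates` random perturbations of the current point
    are sampled, and the candidate with the lowest objective value replaces
    the current point if it improves on it; plain gradient descent runs
    in between.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)

    for k in range(1, n_iters + 1):
        # Standard gradient descent step.
        x = x - lr * grad_f(x)

        # Periodic randomized perturbation phase.
        if k % perturb_every == 0:
            candidates = x + perturb_scale * rng.standard_normal((n_candidates, x.size))
            values = np.array([f(c) for c in candidates])
            i = values.argmin()
            # Jump to the lowest-valued perturbed candidate if it improves
            # on the current point; otherwise keep descending from x.
            if values[i] < f(x):
                x = candidates[i]

        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx

    return best_x, best_f


# Usage example on the 2D Rastrigin function, a standard multimodal benchmark
# (an assumed example; the paper's benchmark set is not reproduced here).
if __name__ == "__main__":
    def rastrigin(x):
        return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    def rastrigin_grad(x):
        return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

    x_opt, f_opt = spgd(rastrigin, rastrigin_grad, x0=[3.0, -2.5])
    print(f"best point {x_opt}, best value {f_opt:.4f}")
```

The periodic "sample, compare, possibly jump" phase is what lets the iterate leave a local basin, while the in-between gradient steps do the fine-grained descent; plain GD on the same start point would simply settle into the nearest local minimum.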
SPGD consistently outperforms the other algorithms in the 2D benchmark tests, demonstrating superior accuracy in finding the global optimum and often exhibiting faster convergence. In the 3D component packing problem, SPGD successfully navigates complex scenarios with collision constraints, achieving significantly more compact and efficient packing configurations compared to GD, particularly in cases involving objects of different sizes and irregular shapes.
SPGD offers a robust and efficient approach to optimization, effectively addressing the limitations of traditional gradient descent methods in complex landscapes. The algorithm's ability to escape local minima through strategic perturbations and its adaptability to different problem scenarios, including constrained ones, make it a promising tool for a wide range of optimization tasks.
The development of SPGD contributes significantly to the field of optimization by providing a more effective method for solving complex, non-convex problems. Its potential applications extend to various domains, including engineering design, machine learning, and bioinformatics, where finding global optima is crucial.
While SPGD demonstrates promising results, the authors acknowledge the need for further research in optimizing the algorithm's computational efficiency, particularly in high-dimensional problems. Future work could also explore adaptive perturbation strategies tailored to specific problem characteristics and extend SPGD's application to more complex systems, such as those involving interconnected components with physical interactions.