Core Concepts
This paper introduces LCoDeepNEAT, a novel Neural Architecture Search (NAS) method based on Lamarckian genetic algorithms that co-evolves CNN architectures and their last-layer weights, achieving faster convergence and higher accuracy on image classification tasks.
Summary
Bibliographic Information:
Sharifi, Z., Soltanian, K., & Amiri, A. (2023). Developing Convolutional Neural Networks using a Novel Lamarckian Co-Evolutionary Algorithm. 13th International Conference on Computer and Knowledge Engineering (ICCKE 2023), November 1-2, 2023, Ferdowsi University of Mashhad, Iran.
Research Objective:
This paper aims to address the computational challenges of Neural Architecture Search (NAS) by introducing LCoDeepNEAT, a novel approach that co-evolves CNN architectures and their last layer weights using Lamarckian genetic algorithms.
Methodology:
LCoDeepNEAT uses a graph-based genetic algorithm with two co-evolving populations: a 'module' population and an 'individual' population. Individuals encode CNN architectures as directed acyclic graphs (DAGs) whose nodes reference modules, each module being itself a small CNN subgraph. Under Lamarckian evolution, the last-layer weights tuned while evaluating an architecture are inherited by its offspring, accelerating convergence. The algorithm further restricts the search space to architectures whose classification head contains exactly two fully connected layers, improving search efficiency.
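To make the Lamarckian step concrete, here is a minimal toy sketch (not the authors' implementation; the genotype, the regression task, and all names are hypothetical stand-ins): fitness evaluation fine-tunes a genotype's last-layer weights by gradient descent, writes the tuned weights back into the genotype, and offspring inherit them, so each generation resumes from an already-trained classifier rather than from scratch.

```python
import numpy as np

rng = np.random.default_rng(0)

class Genotype:
    """Toy stand-in for a CNN genotype: an architecture descriptor
    plus the weights of its final (classification) layer."""
    def __init__(self, arch, last_layer_w):
        self.arch = arch                    # e.g. module ids (hypothetical)
        self.last_layer_w = last_layer_w    # Lamarckian-inherited state

def evaluate(geno, x, y, lr=0.1, steps=50):
    """Fitness evaluation that also fine-tunes the last layer (here a
    linear map) by gradient descent on squared error; the tuned weights
    are written back into the genotype -- the Lamarckian step."""
    w = geno.last_layer_w.copy()
    for _ in range(steps):
        grad = x.T @ (x @ w - y) / len(x)
        w -= lr * grad
    geno.last_layer_w = w                   # acquired trait written back
    return -float(np.mean((x @ w - y) ** 2))  # higher fitness = lower loss

def mutate(geno):
    """Offspring inherits the parent's tuned last-layer weights."""
    child_arch = geno.arch[:]               # architecture mutation elided
    return Genotype(child_arch, geno.last_layer_w.copy())

# Tiny linear-regression task standing in for image classification.
x = rng.normal(size=(64, 4))
y = x @ np.array([[1.0], [-2.0], [0.5], [3.0]])

parent = Genotype(arch=[0, 1], last_layer_w=np.zeros((4, 1)))
f_parent = evaluate(parent, x, y)
child = mutate(parent)
f_child = evaluate(child, x, y)  # resumes from the inherited tuned weights
assert f_child >= f_parent       # inheritance never restarts from scratch
```

The key design point mirrored here is that evaluation has a side effect on the genotype: the weights "learned during the individual's lifetime" flow back into the gene pool, which is exactly what distinguishes Lamarckian from Darwinian evolution.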
Key Findings:
- LCoDeepNEAT demonstrates superior performance compared to hand-crafted CNNs and several state-of-the-art NAS methods on six benchmark image classification datasets.
- The co-evolution of architecture and last layer weights, coupled with the Lamarckian inheritance of tuned weights, leads to faster convergence and improved classification accuracy.
- Constraining the architecture search space to architectures with two fully connected layers for classification proves to be an effective strategy for finding optimal solutions efficiently.
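The fixed-head constraint in the last point can be pictured as follows (a sketch under assumed shapes and names, not the paper's code): whatever convolutional body evolution produces, its flattened features always feed a classifier of exactly two fully connected layers, the second of which is the evolved last layer.

```python
import numpy as np

def two_fc_head(features, n_hidden, n_classes, rng):
    """Fixed classification head: exactly two fully connected layers,
    matching LCoDeepNEAT's constrained search space (shapes hypothetical)."""
    d = features.shape[1]
    w1 = rng.normal(scale=0.1, size=(d, n_hidden))
    w2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))
    hidden = np.maximum(features @ w1, 0.0)  # FC layer 1 + ReLU
    return hidden @ w2                       # FC layer 2 (the evolved last layer)

rng = np.random.default_rng(1)
feats = rng.normal(size=(8, 32))  # stand-in for an evolved conv body's output
logits = two_fc_head(feats, n_hidden=64, n_classes=10, rng=rng)
assert logits.shape == (8, 10)
```

Fixing the head shrinks the search space and gives every candidate an identically-shaped last layer, which is what makes the weight inheritance between differing architectures straightforward.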
Main Conclusions:
LCoDeepNEAT presents a novel and efficient approach for NAS, effectively addressing the computational challenges associated with traditional methods. The integration of Lamarckian evolution and a constrained search space significantly contributes to the algorithm's ability to discover competitive CNN architectures with faster convergence and higher accuracy.
Significance:
This research contributes to the field of NAS by introducing a novel algorithm that effectively balances exploration and exploitation in the architecture search space. The proposed approach has the potential to facilitate the development of more efficient and accurate CNNs for various image classification tasks.
Limitations and Future Research:
While LCoDeepNEAT demonstrates promising results, further investigation into evolving weights beyond the last layer and exploring different search space constraints could lead to even more efficient and accurate architectures. Additionally, applying LCoDeepNEAT to more complex image classification tasks and comparing its performance with a wider range of NAS methods would provide a more comprehensive evaluation of its capabilities.
Statistics
LCoDeepNEAT achieves a classification error rate of 0.33% on the MNIST dataset, comparable to the best-performing NAS method, psoCNN, at 0.32%.
On the MNIST-BI dataset, LCoDeepNEAT achieves the lowest best error rate of 1.02% and the lowest mean error rate of 1.30%, outperforming sosCNN by 0.66% and 0.38% respectively.
For the MNIST-Fashion dataset, LCoDeepNEAT surpasses all handcrafted methods in terms of error rates, including GoogLeNet, AlexNet, and VGG-16.
LCoDeepNEAT achieves a 6.21% error rate on MNIST-Fashion with 1.2 million parameters, compared to SEECNN's 5.38% error rate with 15.9 million parameters, highlighting its ability to balance accuracy and complexity.
The combined strategies of evolving the last layer and using Lamarckian inheritance in LCoDeepNEAT result in a 2% to 5.6% improvement in classification accuracy across all datasets.
Evolving only the last layer weights without Lamarckian inheritance in LCoDeepNEAT still yields a 0.4% to 0.8% increase in classification accuracy per generation.
Quotes
"The last layer is an excellent candidate for evolution due to its unique characteristics."
"This paper introduces LCoDeepNEAT, an instantiation of Lamarckian genetic algorithms, which extends the foundational principles of the CoDeepNEAT framework."
"Our method yields a notable improvement in the classification accuracy of candidate solutions throughout the evolutionary process, ranging from 2% to 5.6%."