FaDE enables efficient neural architecture search by using relative performance predictions over finite regions of a hierarchical NAS space.
The authors present a technique to efficiently construct a realistic neural architecture search (NAS) benchmark for the large-scale ImageNet2012 dataset, combined with performance metrics for various hardware accelerators including GPUs, TPUs, and FPGAs.
Neural Architecture Search (NAS) methods should be able to find optimal neural network architectures for diverse datasets, not just common benchmarks. This work introduces eight new datasets to challenge NAS approaches and evaluate their generalization capabilities.
TG-NAS proposes a universally applicable, data-independent performance predictor model that can handle unseen operators in new search spaces without retraining, acting as a zero-cost proxy to guide efficient neural architecture search.
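To make the unseen-operator idea concrete, here is a minimal Python sketch that embeds operator names via hashed character trigrams. This is a crude, hypothetical stand-in for the pretrained text encoder TG-NAS relies on, shown only to illustrate how a predictor can produce a representation for operators it never saw during training:

```python
import numpy as np

def op_embedding(name, dim=64):
    """Map an operator name to a fixed-size vector via hashed character
    trigrams -- a toy stand-in for a pretrained text encoder, so that a
    downstream predictor can score operators absent from its training set."""
    vec = np.zeros(dim)
    padded = f"#{name}#"
    for i in range(len(padded) - 2):
        vec[hash(padded[i:i + 3]) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# A previously unseen operator still gets a usable embedding.
for op in ["conv_3x3", "conv_5x5", "max_pool_3x3"]:
    print(op, np.round(op_embedding(op)[:6], 2))
```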
The authors assemble multiple zero-cost proxies, each capturing distinct network characteristics, to efficiently predict the performance of candidate architectures without training them.
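A minimal sketch of one plausible way to assemble such proxies: average each proxy's rank over the candidate pool. The proxy names and score values below are invented for illustration, and the actual paper may combine proxies differently:

```python
import numpy as np

def rank_aggregate(proxy_scores):
    """Combine several zero-cost proxy scores by averaging per-proxy ranks.

    proxy_scores maps proxy name -> one score per candidate architecture
    (higher = better). Returns an aggregated rank score per candidate.
    """
    n_proxies = len(proxy_scores)
    agg = None
    for scores in proxy_scores.values():
        scores = np.asarray(scores, dtype=float)
        # argsort of argsort yields 0-based ranks (ties broken arbitrarily)
        ranks = scores.argsort().argsort()
        agg = ranks if agg is None else agg + ranks
    return agg / n_proxies

# Hypothetical scores for five candidate architectures from three proxies.
scores = {
    "synflow":   [0.2, 1.5, 0.9, 2.1, 0.4],
    "snip":      [0.8, 1.1, 0.3, 1.9, 0.5],
    "grad_norm": [1.0, 0.7, 0.6, 2.5, 0.2],
}
print("highest-ranked candidate:", int(rank_aggregate(scores).argmax()))
```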
SWAP-Score, which leverages sample-wise activation patterns, is a novel metric for efficient NAS.
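A rough PyTorch sketch of an activation-pattern score in this spirit: hook the ReLU layers, binarize which units fire for each sample, and count distinct patterns across a mini-batch. The exact SWAP-Score formulation (e.g., any regularization or layer weighting) may differ from this:

```python
import torch
import torch.nn as nn

def activation_pattern_score(model, batch):
    """Count unique sample-wise ReLU activation patterns over a mini-batch;
    more distinct patterns suggests higher expressivity."""
    patterns = []

    def hook(_module, _inp, out):
        # Binary pattern per sample: which units fire, flattened per sample.
        patterns.append((out > 0).flatten(1).to(torch.int8))

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(batch)
    for h in handles:
        h.remove()

    # One binary code per sample across all layers; count distinct codes.
    codes = torch.cat(patterns, dim=1)
    return torch.unique(codes, dim=0).shape[0]

# Toy usage on a random MLP and batch.
net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU())
print(activation_pattern_score(net, torch.randn(64, 16)))
```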
This work steers Neural Architecture Search (NAS) toward flat regions of the loss landscape to optimize neural network architectures for out-of-distribution (OOD) robustness.
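One common way to quantify such flatness (not necessarily the paper's exact measure) is the average loss increase under small random weight perturbations; a minimal PyTorch sketch:

```python
import copy
import torch
import torch.nn as nn

def flatness(model, loss_fn, x, y, sigma=0.01, n_samples=8):
    """Estimate loss-landscape flatness as the mean loss increase when the
    weights are perturbed by Gaussian noise; smaller = flatter minimum."""
    with torch.no_grad():
        base = loss_fn(model(x), y).item()
        increases = []
        for _ in range(n_samples):
            noisy = copy.deepcopy(model)
            for p in noisy.parameters():
                p.add_(sigma * torch.randn_like(p))
            increases.append(loss_fn(noisy(x), y).item() - base)
    return sum(increases) / n_samples

# Toy usage on a random classifier and batch.
net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
x, y = torch.randn(32, 8), torch.randint(0, 2, (32,))
print(flatness(net, nn.CrossEntropyLoss(), x, y))
```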
AutoBuild proposes a method to construct high-quality neural architectures by assigning importance scores to architecture modules, reducing the need for exhaustive search.
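To make the idea concrete, a toy sketch: given importance scores for candidate modules, an architecture is assembled by greedily taking the top-scored module for each slot instead of searching the full combinatorial space. AutoBuild itself learns such scores with a trained ranker; the slot names and score values below are invented:

```python
def assemble(slots, scores, k=1):
    """Pick the k highest-scoring candidate modules for each slot.
    `scores` maps module name -> learned importance (hypothetical here)."""
    return {slot: sorted(mods, key=scores.__getitem__, reverse=True)[:k]
            for slot, mods in slots.items()}

slots = {
    "stem":  ["conv_3x3", "conv_5x5"],
    "stage": ["mbconv_e4", "mbconv_e6", "fused_mbconv"],
    "head":  ["avgpool_fc", "attn_pool"],
}
scores = {"conv_3x3": 0.7, "conv_5x5": 0.4, "mbconv_e4": 0.9,
          "mbconv_e6": 0.8, "fused_mbconv": 0.6,
          "avgpool_fc": 0.5, "attn_pool": 0.9}
print(assemble(slots, scores))
```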
The importance of Pareto-ranking classifiers in multi-objective evolutionary neural architecture search.
The authors propose a Pareto-wise end-to-end ranking classifier that simplifies the architecture search process in multi-objective NAS, addressing the rank-disorder issue and outperforming competing methods.
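For reference, the Pareto relation such a classifier learns to predict end-to-end: whether one candidate dominates another, and which candidates form the non-dominated front. A small sketch (the objective values below are made up):

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_front(points):
    """Indices of non-dominated points -- the rank-1 set a Pareto-wise
    classifier would be trained to recognize."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Hypothetical (error, latency-ms) objectives for five candidate architectures.
objs = [(0.08, 12.0), (0.06, 20.0), (0.09, 10.0), (0.07, 15.0), (0.10, 25.0)]
print(pareto_front(objs))  # indices of the non-dominated candidates
```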