The paper introduces MAE-NAS, an unsupervised Neural Architecture Search (NAS) method built on Masked Autoencoders (MAE). By replacing the supervised learning objective with a masked-image reconstruction task, MAE-NAS eliminates the need for labeled data during search. A hierarchical decoder addresses the performance collapse observed in DARTS. Experimental results demonstrate the effectiveness of MAE-NAS across various search spaces and datasets.
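As a rough illustration of the idea, the search objective swaps a label-based loss for a reconstruction loss computed only on masked patches, so no annotations are required. The sketch below is a minimal, hypothetical NumPy version of that masked-reconstruction loss (the patch shapes, the 75% mask ratio, and the placeholder decoder output are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def masked_reconstruction_loss(patches, reconstruction, mask):
    """MSE over masked patches only, as in MAE-style pretraining.

    patches:        (N, D) array of ground-truth patch pixels
    reconstruction: (N, D) array of decoder outputs for all patches
    mask:           (N,) boolean array, True where a patch was hidden
    """
    squared_error = (reconstruction - patches) ** 2
    # The loss ignores visible patches; only hidden ones are scored.
    return squared_error[mask].mean()

# Toy example: 16 patches of 4 pixels each (shapes are illustrative).
rng = np.random.default_rng(0)
patches = rng.standard_normal((16, 4))
mask = rng.random(16) < 0.75          # MAE typically masks a high ratio
reconstruction = np.zeros_like(patches)  # stand-in for a decoder's output
loss = masked_reconstruction_loss(patches, reconstruction, mask)
```

Minimizing this quantity with respect to both network weights and architecture parameters is what lets the search proceed without labels.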
Key insights from source content at arxiv.org
by Yiming Hu, Xi... at arxiv.org, 03-27-2024
https://arxiv.org/pdf/2311.12086.pdf