TG-NAS: A Universal Zero-Cost Proxy for Efficient Neural Architecture Search
TG-NAS proposes a universally applicable, data-independent performance predictor that can handle unseen operators in new search spaces without retraining, acting as a zero-cost proxy to guide efficient neural architecture search.
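The idea of a zero-cost proxy guiding search can be sketched as follows. This is a minimal, hypothetical illustration, not the TG-NAS implementation: `proxy_score` here is a toy stand-in for the paper's trained predictor, and the operator names and search loop are invented for the example.

```python
import random

def proxy_score(arch):
    # Toy stand-in for a TG-NAS-style predictor: assigns a score to an
    # architecture without training it or touching any dataset.
    # (A real predictor would embed each operator and run a learned model.)
    return len(arch) + len(set(arch))

def search(candidates, top_k=1):
    # Rank every candidate architecture by its proxy score and keep the
    # best; because the proxy needs no training data, this is cheap.
    ranked = sorted(candidates, key=proxy_score, reverse=True)
    return ranked[:top_k]

# Hypothetical search space: sequences of common cell operators.
ops = ["conv3x3", "conv1x1", "skip", "pool"]
random.seed(0)
candidates = [tuple(random.choice(ops) for _ in range(random.randint(2, 6)))
              for _ in range(20)]

best = search(candidates, top_k=3)
print(best)
```

The point of the sketch is the control flow: architectures are ranked purely by the proxy, so the expensive train-and-evaluate loop of conventional NAS is avoided entirely.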