FLOWERFORMER: Empowering Neural Architecture Encoding with Flow-aware Graph Transformer

Core Concepts
FLOWERFORMER introduces a powerful graph transformer that incorporates information flows within neural architectures, outperforming existing methods in various domains.
The performance of a neural architecture depends on the specific task and dataset, and considerable effort has gone into predicting an architecture's performance without fully training it. Graph-based methods have proven effective for learning architecture representations. FLOWERFORMER combines bidirectional asynchronous message passing with flow-aware global attention for enhanced representation learning, and extensive experiments demonstrate its superiority over existing methods on computer vision, graph neural network, and automatic speech recognition models.
Neural architecture encoding has gained considerable attention due to its importance for downstream tasks such as performance prediction. (Abstract) FLOWERFORMER outperforms six baseline architectures by a substantial margin across three benchmark datasets in the computer vision domain. (Introduction) FLOWERFORMER achieves performance gains of up to 4.41% in Kendall's Tau over baseline models for graph neural network and automatic speech recognition architectures. (Contributions)
"FLOWERFORMER consists of two key components: bidirectional asynchronous message passing and global attention built on flow-based masking." "Our extensive experiments demonstrate the superiority of FLOWERFORMER over existing neural encoding methods."
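The two components quoted above can be illustrated with a small sketch. A neural architecture is treated as a DAG whose edges carry information flow; asynchronous message passing processes nodes in topological order, and flow-based masking restricts global attention so a node only attends to nodes on its incoming information flows. The code below is a minimal, hypothetical illustration of these two ideas (the masking rule and function names are assumptions, not the paper's exact implementation):

```python
import numpy as np

def topological_order(adj):
    """Kahn's algorithm: the node order an asynchronous forward pass
    would follow on a DAG (adj[i][j] = 1 means an edge i -> j)."""
    n = len(adj)
    indeg = [sum(adj[i][j] for i in range(n)) for j in range(n)]
    order = []
    frontier = [v for v in range(n) if indeg[v] == 0]
    while frontier:
        v = frontier.pop()
        order.append(v)
        for w in range(n):
            if adj[v][w]:
                indeg[w] -= 1
                if indeg[w] == 0:
                    frontier.append(w)
    return order

def flow_attention_mask(adj):
    """Illustrative flow-based mask: node i may attend to node j only
    if j lies on an information flow into i (j reaches i) or j == i."""
    n = len(adj)
    reach = np.array(adj, dtype=bool)
    # Transitive closure of the DAG (Floyd-Warshall style).
    for k in range(n):
        reach |= np.outer(reach[:, k], reach[k, :])
    # Attend to ancestors (transpose of reachability) and to self.
    return reach.T | np.eye(n, dtype=bool)

# A 3-node chain 0 -> 1 -> 2: node 2 may attend to everything upstream,
# while node 0 may attend only to itself.
adj = [[0, 1, 0],
       [0, 0, 1],
       [0, 0, 0]]
print(topological_order(adj))
print(flow_attention_mask(adj))
```

In an actual transformer layer, a mask like this would be applied to the attention logits (disallowed pairs set to negative infinity before the softmax), so global attention still operates over all nodes but respects the architecture's flow structure.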

Key Insights Distilled From

by Dongyeong Hw... on 03-20-2024

Deeper Inquiries

How can the concept of information flows be applied to other domains beyond neural architecture encoding?


What potential limitations or drawbacks could arise from relying heavily on graph-based methods for representation learning?


How might advancements in graph transformers impact the future development of machine learning models?