Core Concepts
The author introduces BG-HGNN to tackle parameter explosion and relation collapse in heterogeneous graph neural networks, enhancing efficiency and effectiveness.
Abstract
The paper discusses the challenges existing HGNNs face when handling complex heterogeneous graphs with many relation types. It introduces BG-HGNN as a solution that integrates different relations into a unified feature space, significantly improving efficiency and performance. Empirical studies show that BG-HGNN outperforms existing models in parameter efficiency, training throughput, and accuracy.
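To make the idea of a unified feature space concrete, here is a minimal sketch (not the paper's actual architecture) of the contrast it addresses: instead of keeping one weight matrix per relation type (which causes parameter explosion as relation types grow), a learned relation encoding is fused into each message so a single shared weight matrix serves all relations. The variable names, the relation-embedding scheme, and the mean aggregation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous graph: 5 nodes, 8-dim features, 3 relation types.
num_nodes, feat_dim, num_rels = 5, 8, 3
x = rng.normal(size=(num_nodes, feat_dim))
# Edges as (source, destination, relation_type) triples.
edges = [(0, 1, 0), (1, 2, 1), (2, 3, 2), (3, 4, 0), (4, 0, 1)]

# Hypothetical relation encoding: one learned vector per relation type.
rel_emb = rng.normal(size=(num_rels, feat_dim))
# A single shared weight matrix for ALL relation types, applied to the
# concatenation of node features and the relation encoding.
W = rng.normal(size=(2 * feat_dim, feat_dim))

out = np.zeros((num_nodes, feat_dim))
deg = np.zeros(num_nodes)
for src, dst, r in edges:
    msg = np.concatenate([x[src], rel_emb[r]])  # fuse node + relation info
    out[dst] += msg @ W                          # one projection for every relation
    deg[dst] += 1
out /= np.maximum(deg, 1)[:, None]               # mean aggregation per node

# Shared matrix: 2*8*8 = 128 parameters regardless of num_rels, versus
# num_rels * 8*8 parameters (and growing) for per-relation matrices.
print(out.shape)
```

In this toy setup the shared matrix stays at 128 parameters no matter how many relation types are added, whereas the per-relation design grows linearly with `num_rels`, which is the parameter-explosion pattern the paper targets.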
Key Statistics
Existing HGNNs are often limited by parameter explosion and relation collapse.
BG-HGNN surpasses existing HGNNs with up to 28.96x parameter efficiency, 8.12x training throughput, and 1.07x accuracy improvement.
Quotes
"How can we develop a unified mechanism to mitigate parameter explosion and relation collapse in HGNNs?"