ELU-GCN is a framework that improves the semi-supervised learning performance of Graph Convolutional Networks (GCNs) by making fuller use of label information through adaptive graph structure learning and contrastive learning.
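As a point of reference only (ELU-GCN's actual objective is not reproduced here), the sketch below shows one generic way label information can drive a contrastive objective over node embeddings: nodes sharing a label are treated as positive pairs. The function name, the temperature value, and the loss form are illustrative assumptions, not the paper's method.

```python
import torch
import torch.nn.functional as F

def label_contrastive_loss(z, labels, temperature=0.5):
    """Illustrative label-aware contrastive loss: same-label nodes are positives."""
    z = F.normalize(z, dim=-1)                          # [N, d] node embeddings
    sim = z @ z.T / temperature                         # pairwise similarities
    self_mask = torch.eye(len(z), dtype=torch.bool)
    sim = sim.masked_fill(self_mask, float("-inf"))     # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos = ((labels[:, None] == labels[None, :]) & ~self_mask).float()
    # Average log-probability of same-label pairs per node (zero out the -inf diagonal first).
    per_node = (log_prob.masked_fill(self_mask, 0.0) * pos).sum(1) / pos.sum(1).clamp(min=1)
    return -per_node.mean()
```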
In regression tasks, the effectiveness of Graph Convolutional Networks (GCNs) is governed by a bias-variance trade-off tied to network depth (i.e., neighborhood size) and graph topology: cycles in particular can slow variance decay and lead to over-smoothing.
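The over-smoothing behaviour referenced above can be seen in a minimal numpy sketch (independent of the paper's own analysis): repeatedly applying the symmetrically normalized propagation operator on a small cycle graph shrinks the feature variance across nodes as depth grows.

```python
import numpy as np

# Build a 5-node cycle graph and its normalized propagation operator
# P = D^-1/2 (A + I) D^-1/2, the standard GCN aggregation step.
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

A_hat = A + np.eye(n)                       # add self-loops
d = A_hat.sum(axis=1)
D_inv_sqrt = np.diag(d ** -0.5)
P = D_inv_sqrt @ A_hat @ D_inv_sqrt

X = np.random.randn(n, 3)                   # random node features
for depth in [1, 2, 4, 8, 16, 32]:
    H = np.linalg.matrix_power(P, depth) @ X
    # Variance across nodes decays with depth: features become indistinguishable.
    print(depth, H.var(axis=0).mean())
```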
This paper introduces a Smoothness Control Term (SCT) for Graph Convolutional Networks (GCNs) that regulates the smoothness of node features and thereby improves node classification accuracy.
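The paper's exact SCT formulation is not reproduced here; as a hedged illustration, one common way to control feature smoothness is to add a Dirichlet-energy penalty on the node embeddings to the training loss, as sketched below. The coefficient `lam` and the function names are assumptions.

```python
import torch

def dirichlet_energy(H, L):
    """tr(H^T L H): small when connected nodes have similar embeddings (L = graph Laplacian)."""
    return torch.trace(H.T @ L @ H)

def total_loss(logits, labels, H, L, lam=0.1):
    # Cross-entropy on labeled nodes plus a smoothness-control penalty.
    ce = torch.nn.functional.cross_entropy(logits, labels)
    return ce + lam * dirichlet_energy(H, L)
```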
PromptGCN is a method that uses learnable prompt embeddings to bridge the information gaps created by subgraph sampling, improving the accuracy of lightweight Graph Convolutional Networks (GCNs) on large-scale graphs.
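The precise PromptGCN design may differ, but the sketch below conveys the general idea of prompt embeddings shared across sampled subgraphs: a small set of learnable vectors that every subgraph node can read, injecting global context that the sampler cut away. The class name, `num_prompts`, and the attention-style readout are assumptions for illustration.

```python
import torch
import torch.nn as nn

class PromptAugmentedLayer(nn.Module):
    """Shared, learnable prompt vectors read by every sampled subgraph."""

    def __init__(self, dim, num_prompts=8):
        super().__init__()
        self.prompts = nn.Parameter(torch.randn(num_prompts, dim) * 0.02)

    def forward(self, h):
        # h: [num_subgraph_nodes, dim] features from a lightweight GCN layer.
        # Each node attends over the shared prompts and adds the result back,
        # recovering context lost when the subgraph was sampled.
        attn = torch.softmax(h @ self.prompts.T, dim=-1)   # [N, num_prompts]
        return h + attn @ self.prompts
```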
This work proposes strategies to address known shortcomings of GCNs in skeleton-based action recognition.
Deep graph representation learning combines the strengths of graph kernels and neural networks to capture complex structural information in graphs while learning abstract representations. This survey explores various graph convolution techniques, challenges, and future research directions.
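For reference, the graph convolution that most such surveys take as a starting point is the propagation rule of Kipf and Welling (2017), H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), sketched below in numpy; the survey's other variants build on or depart from this form.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: symmetric-normalized neighborhood aggregation + linear map + ReLU."""
    A_hat = A + np.eye(A.shape[0])           # adjacency with self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)
```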