# Insights: Graph Learning

### Analyzing the Impact of Graph Topology on the Performance of Graph Learning Models

The effectiveness of graph neural networks (GNNs) depends on the compatibility between the graph topology and the downstream learning tasks. The proposed metric TopoInf characterizes the influence of graph topology on the performance of GNN models.

### Spectral Graph Neural Networks with Two-dimensional (2-D) Graph Convolution for Improved Graph Learning

The authors propose a novel two-dimensional (2-D) graph convolution paradigm that unifies and generalizes existing spectral graph convolution approaches, enabling error-free construction of arbitrary target outputs.
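For background, the standard (1-D) spectral graph convolution that such approaches build on filters a signal in the Laplacian eigenbasis, y = U g(Λ) Uᵀ x. The sketch below illustrates that classic operation only, not the authors' 2-D paradigm; the function name and low-pass filter choice are illustrative assumptions.

```python
import numpy as np

def spectral_filter(adj: np.ndarray, x: np.ndarray, g) -> np.ndarray:
    """Classic 1-D spectral graph convolution: y = U g(Lambda) U^T x.

    The signal x is projected into the eigenbasis of the combinatorial
    Laplacian L = D - A, scaled frequency-wise by the filter g, and
    transformed back to the vertex domain.
    """
    lap = np.diag(adj.sum(axis=1)) - adj   # combinatorial Laplacian
    lam, U = np.linalg.eigh(lap)           # eigenvalues and eigenvectors
    return U @ (g(lam) * (U.T @ x))

# Path graph on 3 nodes; a heat-kernel filter attenuates high frequencies.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
x = np.array([1.0, -2.0, 1.0])
y = spectral_filter(A, x, lambda lam: np.exp(-lam))
```

With the identity filter g(λ) ≡ 1 the operation reduces to Uᵀ followed by U and returns the input signal unchanged, which is a quick sanity check on any implementation.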

### Theoretical Expressive Power and Design Space of Higher-Order Graph Transformers

This paper provides a systematic study of the theoretical expressive power of order-k graph transformers and their sparse variants. It shows that a plain order-k graph transformer without additional structural information is strictly less expressive than the k-Weisfeiler-Leman (k-WL) test, but that adding explicit tuple indices makes it as expressive as k-WL. The paper then explores strategies to sparsify and enhance higher-order graph transformers, aiming to improve both their efficiency and expressiveness.
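The k-WL hierarchy referenced above generalizes the classic 1-WL color-refinement test. As background, here is a minimal sketch of 1-WL (not the paper's method; the function name and uniform initial coloring are my own choices): each round rehashes a node's color together with the multiset of its neighbors' colors, and the final color histogram is a necessary (but not sufficient) isomorphism invariant.

```python
from collections import Counter

def wl_colors(edges, n, rounds=3):
    """1-Weisfeiler-Leman color refinement on an undirected graph.

    Returns the histogram of stable node colors; graphs with different
    histograms are certainly non-isomorphic.
    """
    nbrs = [[] for _ in range(n)]
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    colors = [0] * n  # uniform initial coloring
    for _ in range(rounds):
        # Signature = own color plus sorted multiset of neighbor colors.
        sigs = [(colors[v], tuple(sorted(colors[u] for u in nbrs[v])))
                for v in range(n)]
        relabel = {s: i for i, s in enumerate(sorted(set(sigs)))}
        colors = [relabel[s] for s in sigs]
    return Counter(colors)

# A path and a star on 4 nodes are distinguished by their histograms.
path = wl_colors([(0, 1), (1, 2), (2, 3)], 4)
star = wl_colors([(0, 1), (0, 2), (0, 3)], 4)
print(path == star)  # False
```

Order-k variants refine colors over k-tuples of nodes rather than single nodes, which is what the transformer architectures in the paper are compared against.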

### Subgraph Network-Based Contrastive Learning for Efficient Graph Representation

Subgraph network-based contrastive learning (SGNCL) leverages the power of high-order interactions among substructures to effectively capture graph representations for downstream tasks.

### Detecting Nodes from Novel Categories in Attributed Graphs Under Subpopulation Shift

This article introduces RECO-SLIP, an approach that effectively detects nodes belonging to novel categories in attributed graphs under subpopulation shifts between the source and target domains.

### Beyond the Known: Discovering Novel Classes in Open-world Graph Learning

This work automatically discovers novel classes among unlabeled nodes in open-world graph learning scenarios, where classes beyond those known from the training data can emerge.

### Rayleigh Quotient Graph Neural Networks for Effective Graph-level Anomaly Detection

The Rayleigh Quotient reveals inherent spectral properties of anomalous graphs, motivating the design of a novel Rayleigh Quotient Graph Neural Network (RQGNN) that outperforms state-of-the-art methods for graph-level anomaly detection.
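The Rayleigh quotient of a graph signal, xᵀLx / xᵀx for the Laplacian L = D − A, measures how smoothly x varies over the graph: 0 for a constant signal, large for high-frequency ones. The sketch below computes only this quantity, not the RQGNN model itself; the function name is an illustrative assumption.

```python
import numpy as np

def rayleigh_quotient(adj: np.ndarray, x: np.ndarray) -> float:
    """Rayleigh quotient x^T L x / x^T x of the graph Laplacian L = D - A.

    Small values indicate a smooth signal; large values indicate
    high-frequency components, which can signal anomalous structure.
    """
    lap = np.diag(adj.sum(axis=1)) - adj
    return float(x @ lap @ x) / float(x @ x)

# Triangle graph: a constant signal is perfectly smooth (quotient 0),
# while a sign-alternating signal has a large quotient.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
print(rayleigh_quotient(A, np.ones(3)))               # 0.0
print(rayleigh_quotient(A, np.array([1.0, -1.0, 0.0])))  # 3.0
```

The quotient is bounded by the extreme Laplacian eigenvalues, which is why spectral statistics of this kind can separate anomalous graphs from normal ones.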

### Leveraging Instruction-based Prompts to Enhance Hypergraph Pretraining for Graph Learning

Instruction-based prompts are leveraged to enhance hypergraph pretraining, enabling the model to capture high-order relations with task-specific guidance and improve generalization across various graph-based tasks.

### Graph Learning under Distribution Shifts: A Comprehensive Survey

This survey comprehensively reviews graph learning methods that address distribution shifts in graph-structured data.

### STG-Mamba: Spatial-Temporal Graph Learning via Selective State Space Model

STG-Mamba leverages a selective state space model for spatial-temporal graph learning, delivering superior forecasting performance and computational efficiency.