Core Concepts
Proposes a novel Differentiable Information Bottleneck (DIB) method for deterministic multi-view clustering (MVC) without variational approximation.
Abstract
This article introduces the Differentiable Information Bottleneck (DIB) method for deterministic multi-view clustering. It addresses the limitations of existing methods by directly fitting mutual information without variational approximation. The DIB approach includes deterministic compression and triplet consistency discovery mechanisms, leading to superior performance compared to state-of-the-art baselines on various datasets.
Introduction
Discusses traditional MVC techniques and the shift towards deep learning models.
Introduces the concept of the information bottleneck principle in multi-view clustering.
Related Work and Preliminaries
Explains the information bottleneck concept and its application in deep multi-view clustering.
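For reference, the classical information bottleneck objective (the standard formulation, not quoted verbatim from this article) seeks a compressed representation Z of the input X that preserves information about a relevance variable Y:

```latex
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y)
```

Here the trade-off parameter β balances compression of X against retention of information about Y; variational IB methods optimize a lower bound on this objective, which is precisely what the DIB approach aims to avoid.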
Differentiable Information Bottleneck
Proposes a new method for deterministic compression and triplet consistency discovery.
Defines problem statements and objectives for multi-view clustering.
Experiments
Evaluates DIB on six datasets, comparing it with traditional, deep, and IB-based MVC baselines.
Ablation Study
Analyzes the impact of different components on clustering performance.
Parameter Sensitivity Analysis
Investigates the effect of trade-off parameters γ and β on clustering performance.
Convergence Analysis
Demonstrates convergence behavior of the DIB algorithm over iterations.
MI Measurement Evaluation
Compares MI estimates obtained with and without variational approximation.
Stats
Variational approximation offers a natural way to estimate a lower bound on mutual information in high-dimensional spaces.
The proposed MI measurement directly fits mutual information between high-dimensional spaces using a normalized kernel Gram matrix.
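The summary does not spell out how MI is fit from a normalized kernel Gram matrix. As a hedged illustration, the sketch below uses the matrix-based Rényi α-entropy formulation, in which entropy is computed from the eigenvalues of a trace-normalized Gram matrix and joint entropy from the Hadamard product of two such matrices, so no variational bound is needed. The function names, the RBF kernel, and the choice α = 2 are illustrative assumptions, not necessarily the article's exact estimator.

```python
import numpy as np

def gram_matrix(X, sigma=1.0):
    # RBF kernel Gram matrix, normalized to unit trace (illustrative kernel choice)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-d2 / (2.0 * sigma**2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=2.0):
    # Matrix-based Renyi alpha-entropy from eigenvalues of a normalized Gram matrix
    eig = np.clip(np.linalg.eigvalsh(A), 0.0, None)  # guard tiny negative eigenvalues
    return (1.0 / (1.0 - alpha)) * np.log2(np.sum(eig**alpha))

def mutual_information(X, Y, alpha=2.0, sigma=1.0):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint taken as the
    # trace-normalized Hadamard product of the two Gram matrices
    A, B = gram_matrix(X, sigma), gram_matrix(Y, sigma)
    J = A * B
    J = J / np.trace(J)
    return renyi_entropy(A, alpha) + renyi_entropy(B, alpha) - renyi_entropy(J, alpha)
```

Because the estimate is a differentiable function of the Gram-matrix eigenvalues, it can serve directly as a training loss, which is the property that motivates fitting MI without a variational lower bound.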
Quotes
"The proposed DIB method provides a deterministic and analytical MVC solution."
"DIB outperforms traditional MVC, deep MVC, and IB-based deep MVC baselines."