Core Concepts

Deriving Sliced-Wasserstein distances on Cartan-Hadamard manifolds for efficient data analysis.

Abstract

The content discusses the derivation of Sliced-Wasserstein distances on Cartan-Hadamard manifolds, which include hyperbolic spaces and the space of Symmetric Positive Definite (SPD) matrices. It explores applications as well as non-parametric schemes for minimizing these distances via Wasserstein gradient flows.
Introduction to Riemannian manifolds and Optimal Transport.
Development of Sliced-Wasserstein distances on non-positive curvature manifolds.
Applications in various fields like neural networks and generative modeling.
Comparison with other discrepancy measures, such as the Kullback-Leibler divergence.
Detailed explanation of geodesics, exponential maps, and sectional curvature on Riemannian manifolds.

Stats

The Wasserstein distance is defined as an infimum over couplings between the two distributions.
The Sliced-Wasserstein distance leverages one-dimensional orthogonal projections, for which the Wasserstein distance has a closed form, making computation tractable.
Approximations such as entropic regularization are used to reduce the computational burden of the original problem.
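The projection idea above can be sketched in a few lines: draw random directions uniformly on the sphere, project both samples onto each direction, and average the closed-form one-dimensional Wasserstein distances between the sorted projections. This is a minimal Monte Carlo illustration (function and parameter names are ours, and equal sample sizes are assumed), not the paper's exact algorithm.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, rng=None):
    """Monte Carlo estimate of the Sliced-Wasserstein distance in R^d.

    Projects both samples onto random directions drawn uniformly on the
    unit sphere, then averages the closed-form 1D Wasserstein-p distances
    between the sorted projections (equal sample sizes assumed).
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Uniform directions on the sphere S^{d-1}: normalize Gaussian draws.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # 1D projections, sorted: shape (n_projections, n_samples).
    X_proj = np.sort(theta @ X.T, axis=1)
    Y_proj = np.sort(theta @ Y.T, axis=1)
    # Average the 1D Wasserstein-p distances over directions.
    return np.mean(np.abs(X_proj - Y_proj) ** p) ** (1 / p)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # standard-normal cloud
Y = rng.normal(size=(200, 3)) + 1.0  # same cloud, shifted
print(sliced_wasserstein(X, Y, rng=1))
```

The sorting step is what makes this tractable: in one dimension the optimal coupling is monotone, so no linear program is needed.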

Quotes

"Exploiting the structure of the data by leveraging the manifold metric is beneficial." - Fletcher et al. (2004)
"New tools are proposed for handling data on Riemannian manifolds." - Huckemann and Ziezold (2006)

Deeper Inquiries

Sliced-Wasserstein distances can be extended to other types of manifolds by leveraging the intrinsic geometry and structure of each specific manifold. On Riemannian manifolds of non-positive curvature, such as Cartan-Hadamard manifolds, the construction generalizes by defining appropriate projections along geodesics or horospheres. These projections measure distances between probability distributions supported on the manifold in a way that respects its underlying geometry.
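As one concrete instance of the horosphere-based projections mentioned above, the sketch below works in the Poincaré ball model of hyperbolic space and projects points onto horospheres via the Busemann function B_v(x) = log(‖v − x‖² / (1 − ‖x‖²)) associated with an ideal point v on the unit sphere. The resulting one-dimensional coordinates are then compared as in the Euclidean case. This is a minimal illustration under our own naming, not the paper's exact construction.

```python
import numpy as np

def busemann(x, v):
    """Busemann function on the Poincare ball for ideal point v on S^{d-1}.

    x: array of shape (n, d) with ||x|| < 1; v: unit vector of shape (d,).
    """
    return np.log(np.sum((v - x) ** 2, axis=-1) / (1.0 - np.sum(x ** 2, axis=-1)))

def horo_sliced_wasserstein(X, Y, n_projections=100, p=2, rng=None):
    """SW-style distance on the Poincare ball using horospherical projections."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Ideal points: uniform directions on the boundary sphere.
    v = rng.normal(size=(n_projections, d))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    # Busemann coordinates, sorted: shape (n_projections, n_samples).
    Bx = np.sort(np.array([busemann(X, vi) for vi in v]), axis=1)
    By = np.sort(np.array([busemann(Y, vi) for vi in v]), axis=1)
    return np.mean(np.abs(Bx - By) ** p) ** (1 / p)
```

The Busemann coordinate plays the role that the inner product ⟨θ, x⟩ plays in Euclidean space: a real-valued projection along which the 1D Wasserstein distance is in closed form.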

The choice of slicing distribution has significant implications for the resulting distance. Different slicing distributions yield different sensitivities to the features present in the data: a uniform distribution over directions (the common choice) weighs all orientations equally, while a concentrated directional distribution emphasizes some aspects over others. The slicing distribution should therefore be chosen to match the structure one wants the distance to capture.

The computational complexity of computing Wasserstein distances depends on the manifold involved. Riemannian manifolds introduce challenges absent in Euclidean spaces, owing to intrinsic curvature and the geometry specific to each manifold. Negatively curved manifolds such as hyperbolic spaces may require specialized algorithms or numerical techniques to handle computations efficiently. On pullback Euclidean spaces, where the metric is pulled back from a Euclidean space through a diffeomorphism, computation is more manageable, since many Euclidean properties carry over and facilitate the calculations.
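The pullback case can be made concrete on SPD matrices with the log-Euclidean metric: the matrix logarithm is a diffeomorphism onto the flat space of symmetric matrices, so one maps every SPD sample through the log, vectorises, and runs the ordinary Euclidean Sliced-Wasserstein computation there. The sketch below makes its own choices of names and vectorisation convention and is an illustration, not the paper's implementation.

```python
import numpy as np

def spd_log(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T  # V diag(log w) V^T

def vec_sym(S):
    """Flatten a symmetric matrix: upper triangle, off-diagonals scaled by sqrt(2)
    so the Euclidean norm of the vector matches the Frobenius norm of S."""
    iu = np.triu_indices(S.shape[0])
    scale = np.where(iu[0] == iu[1], 1.0, np.sqrt(2.0))
    return S[iu] * scale

def spd_sliced_wasserstein(As, Bs, n_projections=100, p=2, rng=None):
    """Sliced-Wasserstein between samples of SPD matrices under the
    log-Euclidean (pullback Euclidean) metric."""
    X = np.array([vec_sym(spd_log(A)) for A in As])
    Y = np.array([vec_sym(spd_log(B)) for B in Bs])
    rng = np.random.default_rng(rng)
    theta = rng.normal(size=(n_projections, X.shape[1]))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    Xp = np.sort(theta @ X.T, axis=1)
    Yp = np.sort(theta @ Y.T, axis=1)
    return np.mean(np.abs(Xp - Yp) ** p) ** (1 / p)
```

After the log map everything reduces to the Euclidean routine, which is why such pullback metrics keep the computational cost close to the flat case.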
