Core Concepts
The Geometric Algebra Transformer (GATr) is generalized to Euclidean, projective, and conformal geometric algebras, all of which are suited to representing 3D data (see the sketch below).
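As a concrete illustration of what changes between these algebras, here is a minimal Python sketch (not from the paper) that counts the basis blades of each, assuming the standard signatures G(3,0,0) for Euclidean, G(3,0,1) for projective, and G(4,1,0) for conformal:

```python
from math import comb

# Candidate algebras for 3D data, by metric signature (p, q, r):
# p basis vectors square to +1, q to -1, r to 0 (degenerate).
ALGEBRAS = {
    "Euclidean  G(3,0,0)": (3, 0, 0),
    "Projective G(3,0,1)": (3, 0, 1),
    "Conformal  G(4,1,0)": (4, 1, 0),
}

for name, (p, q, r) in ALGEBRAS.items():
    d = p + q + r                                # total number of basis vectors
    grades = [comb(d, k) for k in range(d + 1)]  # basis blades per grade
    print(f"{name}: dim 2^{d} = {2 ** d:2d}, blades per grade = {grades}")
```

Multivectors thus have 8 components in the Euclidean algebra, 16 in the projective, and 32 in the conformal, so each channel of a GATr variant pays a correspondingly larger memory and compute cost.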
Abstract
Introduction to Geometric Deep Learning and equivariance to symmetry groups.
Development of the transformer architecture as a standard in various domains.
Integration of geometric deep learning principles with transformers.
Generalization of the GATr architecture into a blueprint for building a scalable transformer from any geometric algebra.
Comparison of GATr variants based on Euclidean, projective, and conformal algebras, in theory and in practice.
Theoretical analysis of expressivity, position representation, and distance-based attention in each algebra (see the sketch after this list).
Empirical comparison of the GATr variants on n-body dynamics modeling and arterial wall-shear-stress estimation.
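To make the distance-based-attention point concrete, here is a hedged NumPy sketch (identifiers are illustrative, not the paper's code) of the standard conformal point embedding P = p + ½|p|² e∞ + e₀. The conformal inner product of two embedded points equals minus half the squared Euclidean distance between them, which is exactly the quantity distance-based attention needs:

```python
import numpy as np

# Conformal basis (e1, e2, e3, e+, e-) with metric diag(1, 1, 1, 1, -1),
# where e_inf = e- + e+ and e_o = 0.5 * (e- - e+).
G = np.diag([1.0, 1.0, 1.0, 1.0, -1.0])

def embed(p):
    """Conformal embedding P = p + 0.5*|p|^2 * e_inf + e_o, in (e1,e2,e3,e+,e-) coordinates."""
    n2 = float(np.dot(p, p))
    return np.array([p[0], p[1], p[2], 0.5 * n2 - 0.5, 0.5 * n2 + 0.5])

def inner(x, y):
    """Inner product under the conformal metric."""
    return x @ G @ y

p1 = np.array([1.0, 2.0, 3.0])
p2 = np.array([4.0, 0.0, -1.0])

# Key identity: <P1, P2> = -0.5 * |p1 - p2|^2
lhs = inner(embed(p1), embed(p2))
rhs = -0.5 * float(np.sum((p1 - p2) ** 2))
assert np.isclose(lhs, rhs)
print(lhs, rhs)  # both -14.5
```

The projective algebra's inner product is degenerate in the homogeneous direction, so it offers no analogous identity for point distances.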
Stats
The Euclidean algebra is computationally cheap, but it has a smaller symmetry group and is less sample-efficient.
The projective model is not sufficiently expressive; in particular, its degenerate inner product cannot compute distances between points.
The conformal algebra offers an elegant formulation of 3D geometry, and together with an improved projective variant it yields a powerful, performant architecture (see the attention sketch below).
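As a final illustration (a toy sketch under the assumptions above, not the paper's architecture), plugging the conformal embedding into plain dot-product attention turns the logits into -½ squared distances, i.e. a Gaussian-like kernel over 3D positions:

```python
import numpy as np

G = np.diag([1.0, 1.0, 1.0, 1.0, -1.0])  # conformal metric over (e1, e2, e3, e+, e-)

def embed(points):
    """Batch conformal embedding of 3D points, (n, 3) -> (n, 5)."""
    n2 = np.sum(points ** 2, axis=1, keepdims=True)
    return np.hstack([points, 0.5 * n2 - 0.5, 0.5 * n2 + 0.5])

def conformal_attention(q_pts, k_pts, values):
    """Toy attention whose logits are conformal inner products,
    i.e. -0.5 * squared Euclidean query/key distance."""
    logits = embed(q_pts) @ G @ embed(k_pts).T     # (nq, nk)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ values

rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 3))
vals = rng.normal(size=(5, 2))
out = conformal_attention(pts, pts, vals)  # each output row is dominated by nearby points
print(out.shape)  # (5, 2)
```

Because the logits are exactly -½|q - k|², attention weights decay smoothly with Euclidean distance, which is the behavior the conformal inner product provides directly.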