Core Concepts
The k-nearest neighbor (k-NN) rule is universally consistent in metric spaces with finite de Groot dimension.
Abstract
This article examines the universal consistency of the k-NN rule in metric spaces, focusing on the Nagata and de Groot dimensions. It discusses tie-breaking strategies, strong consistency, and examples such as the Heisenberg group. Theorems of Cérou and Guyader are highlighted, along with results of Assouad and Quentin de Gromard. The discussion covers learning rules, metrics, and properties related to Lebesgue–Besicovitch differentiation, with notable examples showing how these concepts apply.
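The following Python sketch illustrates the k-NN rule with uniform random tie-breaking among sample points equidistant from the query, one tie-breaking strategy of the kind discussed in the article. The function name knn_predict, the metric dist, and the toy data are illustrative assumptions, not taken from the source.

```python
# Illustrative sketch only: the k-NN rule in an abstract metric space with
# uniform random tie-breaking among equidistant sample points.
import random
from collections import Counter
from typing import Callable, Optional, Sequence, Tuple, TypeVar

X = TypeVar("X")  # points of the metric space
Y = TypeVar("Y")  # labels


def knn_predict(
    query: X,
    sample: Sequence[Tuple[X, Y]],     # labelled sample (x_1, y_1), ..., (x_n, y_n)
    k: int,
    dist: Callable[[X, X], float],     # the metric of the underlying space
    rng: Optional[random.Random] = None,
) -> Y:
    """Majority vote among k nearest neighbours, with random distance tie-breaking."""
    if not 1 <= k <= len(sample):
        raise ValueError("k must satisfy 1 <= k <= len(sample)")
    rng = rng or random.Random()
    # Rank the sample by distance to the query point.
    ranked = sorted(sample, key=lambda xy: dist(query, xy[0]))
    kth_dist = dist(query, ranked[k - 1][0])
    # Points strictly inside the k-th distance are always neighbours; the
    # remaining slots are drawn uniformly from points exactly at that distance.
    inside = [xy for xy in ranked if dist(query, xy[0]) < kth_dist]
    boundary = [xy for xy in ranked if dist(query, xy[0]) == kth_dist]
    neighbours = inside + rng.sample(boundary, k - len(inside))
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]


if __name__ == "__main__":
    # Toy usage on the real line with the usual metric (illustration only).
    data = [(0.0, "a"), (0.1, "a"), (0.5, "a"), (0.9, "b"), (1.0, "b")]
    print(knn_predict(0.2, data, k=3, dist=lambda u, v: abs(u - v)))
```

With this scheme, points lying exactly on the boundary sphere of the k-th distance do not bias the vote, which is the role that tie-breaking plays in the consistency results summarized above.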
Key Statistics
The k-nearest neighbor classifier is universally consistent in every separable metric space that is sigma-finite dimensional in the sense of Nagata.
The Heisenberg group equipped with a Cygan–Korányi metric satisfies the weak Lebesgue–Besicovitch property for every Borel probability measure (one standard formulation of this property is recalled below).
Every doubling metric space has finite de Groot dimension.
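For reference, the weak Lebesgue–Besicovitch property mentioned above is commonly stated as follows; this is a standard formulation in the spirit of Cérou and Guyader, and the article's exact wording may differ. A metric space carrying a Borel probability measure \mu has the property if, for every f \in L^1(\mu),

```latex
\[
\frac{1}{\mu\left(\bar B(x,\varepsilon)\right)}
\int_{\bar B(x,\varepsilon)} f \, d\mu
\;\longrightarrow\; f(x)
\quad \text{in } \mu\text{-measure as } \varepsilon \to 0,
\]
```

where \bar B(x,\varepsilon) denotes the closed ball of radius \varepsilon around x; the strong version of the property asks instead for \mu-almost everywhere convergence.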
Quotes
"The k-nearest neighbour classifier is universally consistent in every complete separable metric space sigma-finite dimensional in the sense of Nagata." - Corollary 2.9