Core Concepts
Asymptotic results for a modified cross-entropy estimator beyond Markovian settings.
Abstract
The paper extends the Ziv-Merhav theorem to decoupled measures over finite alphabets, proving strong asymptotic consistency of a modified cross-entropy estimator. It introduces the notion of cross-entropic pressure and relates it to large deviations in thermodynamic formalism. The modification of the original estimator yields more general results without auxiliary parsing. The structure, key insights, and implications are detailed below.
Structure:
Introduction to Entropy and Cross Entropy Estimation
Ziv-Merhav Parsing Procedure Overview
Modification of the Ziv-Merhav Estimator (Q_N)
Central Theorem on Almost Sure Convergence
Open Problem Discussion on Rigidity in Cross-Entropy Analogue of Shannon–McMillan–Breiman Theorem
Examples and Applications Discussion
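The Ziv-Merhav parsing procedure outlined above can be sketched as follows. This is a minimal illustrative implementation using naive substring search over Python strings; the function name and data representation are my assumptions, not the paper's notation:

```python
def ziv_merhav_parse(y, x):
    """Sequentially parse y with respect to the reference sequence x.

    Each phrase is the longest prefix of the remaining part of y that
    occurs somewhere in x as a substring, extended by one extra symbol
    (when one is available).  The number of phrases c_N is what drives
    the cross-entropy estimator.
    """
    phrases = []
    i, n = 0, len(y)
    while i < n:
        j = i
        # grow the match while it still occurs in x (naive O(N^2) search)
        while j < n and y[i:j + 1] in x:
            j += 1
        j = min(j + 1, n)  # append one extra symbol, if any remains
        phrases.append(y[i:j])
        i = j
    return phrases
```

For example, parsing y = "0011" against x = "0101" yields the two phrases "00" and "11", so c_4 = 2.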
Key Insights:
Introduction to entropy, cross entropy, and relative entropy in information theory.
Ziv-Merhav procedure for estimating cross entropy between ergodic measures.
Extension of the estimator Q_N to decoupled measures, with strong asymptotic consistency.
Utilization of cross-entropic pressure inspired by thermodynamic formalism.
Implications for hidden Markov models and ψ-mixing measures.
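The classical Ziv-Merhav estimator built on this parsing normalizes the phrase count c_N as c_N log N / N; the paper's modified estimator Q_N refines this construction, but the self-contained sketch below illustrates the basic shape (it is an illustrative assumption, not the paper's exact Q_N):

```python
import math

def zm_phrase_count(y, x):
    """Number of phrases c_N in the sequential parsing of y w.r.t. x."""
    c, i, n = 0, 0, len(y)
    while i < n:
        j = i
        while j < n and y[i:j + 1] in x:  # longest substring match in x
            j += 1
        i = min(j + 1, n)  # phrase = longest match plus one extra symbol
        c += 1
    return c

def cross_entropy_estimate(y, x):
    """Classical Ziv-Merhav normalization (in nats): c_N * log(N) / N."""
    n = len(y)
    return zm_phrase_count(y, x) * math.log(n) / n
```

Under the consistency results summarized here, this kind of quantity converges almost surely to the cross entropy between the laws of y and x as N grows.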
Stats
"cN = o(N) under our assumptions."