The paper presents a new information-theoretic approach to establishing finite de Finetti theorems. The main result (Theorem 2.1) shows that for an exchangeable vector of random variables X_1^n, the distribution of the first k random variables X_1^k is close to a mixture of product distributions, as measured by the relative entropy. This bound is tighter than those obtained via earlier information-theoretic proofs.
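Schematically, the statement described above can be written as follows. This is a hedged sketch of the general form only: the notation (the mixing measure μ and the error term ε(n,k)) is illustrative and the paper's exact constants are not reproduced here.

```latex
% Schematic form of a finite de Finetti bound: for an exchangeable
% X_1^n there is a mixing measure \mu over distributions Q such that
\[
  D\!\left( P_{X_1^k} \,\middle\|\, \int Q^{\otimes k}\, \mu(\mathrm{d}Q) \right)
  \;\le\; \varepsilon(n,k),
  \qquad \varepsilon(n,k) \to 0 \ \text{as } n \to \infty \ \text{for fixed } k,
\]
% i.e., the first k coordinates are nearly a mixture of i.i.d. laws,
% with closeness measured in relative entropy.
```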
The paper also provides corollaries for the case of discrete random variables (Corollary 2.4) and recovers the classical infinite de Finetti theorem for compact spaces (Corollary 2.5). Examples are given to illustrate the applicability and limitations of the results.
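As a numerical sketch of the finite de Finetti phenomenon (not an example from the paper): the first k draws without replacement from an urn are exchangeable but not i.i.d., and their relative entropy to the natural i.i.d. Bernoulli product law shrinks as the urn size n grows with k fixed. The urn setup and the choice of Bernoulli(m/n)^k as the comparison product law are illustrative assumptions.

```python
import math

def seq_prob_without_replacement(n, m, k, j):
    # Probability of one specific binary sequence of length k containing j ones,
    # when drawing without replacement from an urn of n balls, m of them "1".
    # By exchangeability this depends only on j, via falling factorials.
    num = 1.0
    for i in range(j):
        num *= (m - i)
    for i in range(k - j):
        num *= (n - m - i)
    den = 1.0
    for i in range(k):
        den *= (n - i)
    return num / den

def relative_entropy_to_iid(n, m, k):
    # D( law of first k draws  ||  Bernoulli(m/n)^k ), in nats,
    # summed over the k+1 exchangeability classes (j ones out of k).
    p = m / n
    d = 0.0
    for j in range(k + 1):
        pj = seq_prob_without_replacement(n, m, k, j)
        qj = p ** j * (1 - p) ** (k - j)
        if pj > 0:
            d += math.comb(k, j) * pj * math.log(pj / qj)
    return d

k = 5
gaps = [relative_entropy_to_iid(n, n // 2, k) for n in (20, 40, 80, 160)]
# The divergence shrinks as the urn size n grows, consistent with a
# finite de Finetti bound that vanishes for fixed k as n -> infinity.
```

Doubling n repeatedly shows the divergence decreasing toward zero, matching the qualitative content of the finite de Finetti bound for fixed k.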
Key insights distilled from the paper by Mario Berta, ... (arxiv.org, 04-29-2024): https://arxiv.org/pdf/2304.05360.pdf