This paper introduces a new information gain measure called the f̃-mean information gain, where f̃(t) = exp(((α−1)/α)t). It is shown that the maximum f̃-mean information gain is attained at Rényi divergence, which is then proposed as the Y-elementary α-leakage. The f̃-mean of this Y-elementary leakage is measured by Sibson mutual information, which is interpreted as the maximum f̃-mean information gain over all estimation decisions applied to the channel output.
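For orientation, here is a minimal sketch of the standard quantities these statements build on, under the usual conventions (the notation P_X, P_{Y|X}, P_{X|Y=y} for the prior, the channel, and the posterior is assumed here; the paper's own f̃-mean information gain is defined for general estimation decisions and is not reproduced):

\[
\tilde f(t) = e^{\frac{\alpha-1}{\alpha} t}, \qquad
\tilde f\text{-mean of } g \;=\; \tilde f^{-1}\!\Big(\sum_{y} P_Y(y)\, \tilde f\big(g(y)\big)\Big),
\]
\[
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha-1} \log \sum_{x} P(x)^{\alpha} Q(x)^{1-\alpha}, \qquad
I_\alpha^{S}(X;Y) = \frac{\alpha}{\alpha-1} \log \sum_{y} \Big( \sum_{x} P_X(x)\, P_{Y|X}(y\,|\,x)^{\alpha} \Big)^{1/\alpha}.
\]

Taking g(y) = D_α(P_{X|Y=y} ‖ P_X) and weights P_Y, the f̃-mean above evaluates exactly to I_α^S(X;Y), which is consistent with the statement that Sibson mutual information measures the f̃-mean of the Y-elementary leakage.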
The existing α-leakage measures, such as Arimoto mutual information, can be expressed as f̃-mean measures by using a scaled probability distribution. This provides a straightforward way to derive the known leakage upper bound results.
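One standard reading of "scaled probability distribution" is the α-scaled (escort) distribution of the input; with it, Arimoto mutual information coincides with Sibson mutual information evaluated at the scaled input. The identity below is well known, though treating it as the exact route taken in the paper is an assumption:

\[
P_X^{(\alpha)}(x) = \frac{P_X(x)^{\alpha}}{\sum_{x'} P_X(x')^{\alpha}}, \qquad
I_\alpha^{A}\big(P_X, P_{Y|X}\big) = I_\alpha^{S}\big(P_X^{(\alpha)}, P_{Y|X}\big).
\]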
The paper also derives a decomposition of the f̃-mean information gain, analogous to the Sibson identity for Rényi divergence. This reveals that the generalized Blahut-Arimoto method for computing Rényi capacity (or Gallager's error exponent) is an alternating maximization of the f̃-mean information gain over the estimation decision and the channel input. Additionally, it is shown that the f̃-mean information gain equals the difference between cross entropy and Rényi entropy, generalizing the excess entropy interpretation of Kullback-Leibler divergence.
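As a point of comparison for these last statements, the classical Sibson identity for Rényi divergence and the excess-entropy form of Kullback-Leibler divergence read as follows (Q_Y^* denotes the α-tilted output distribution; the corresponding f̃-mean decomposition is the paper's contribution and is not reproduced here):

\[
D_\alpha\big(P_{XY} \,\big\|\, P_X \times Q_Y\big)
= I_\alpha^{S}(X;Y) + D_\alpha\big(Q_Y^{*} \,\big\|\, Q_Y\big),
\qquad
Q_Y^{*}(y) \propto \Big(\sum_{x} P_X(x)\, P_{Y|X}(y\,|\,x)^{\alpha}\Big)^{1/\alpha},
\]
\[
D(P \,\|\, Q) = H(P, Q) - H(P), \qquad H(P,Q) = -\sum_{x} P(x)\log Q(x).
\]

The first identity shows that the optimization over the output (or estimation) side admits the closed-form solution Q_Y^*, which is what makes the alternating-maximization view of the generalized Blahut-Arimoto iteration natural; the second is the α → 1 baseline that the cross-entropy-minus-Rényi-entropy decomposition generalizes.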