This article proposes a two-parameter generalization of the Tsallis entropy, called the generalized Tsallis entropy, and derives its fundamental information-theoretic properties, including pseudo-additivity, sub-additivity, joint convexity, and information monotonicity.
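The pseudo-additivity property can be illustrated numerically with the standard one-parameter Tsallis entropy that the paper generalizes (a minimal sketch of the textbook definition, not the paper's two-parameter form):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Standard one-parameter Tsallis entropy: S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Pseudo-additivity for independent X, Y:
#   S_q(X, Y) = S_q(X) + S_q(Y) + (1 - q) * S_q(X) * S_q(Y)
q = 1.5
px = np.array([0.2, 0.8])
py = np.array([0.5, 0.3, 0.2])
pxy = np.outer(px, py).ravel()   # joint distribution of independent X, Y

lhs = tsallis_entropy(pxy, q)
rhs = (tsallis_entropy(px, q) + tsallis_entropy(py, q)
       + (1 - q) * tsallis_entropy(px, q) * tsallis_entropy(py, q))
assert np.isclose(lhs, rhs)
```

As q → 1 this entropy recovers the Shannon entropy and pseudo-additivity reduces to ordinary additivity.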
Rényi divergence and Sibson mutual information are proposed as exact α-leakage measures, quantifying the maximum and average information gain an adversary can obtain about sensitive data through a privacy-preserving channel.
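The two quantities behind these leakage measures have standard closed forms; the sketch below implements the textbook definitions (the operational α-leakage results are the paper's contribution, not reproduced here):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha != 1, in nats:
    D_alpha(P || Q) = log( sum_i p_i^alpha * q_i^(1-alpha) ) / (alpha - 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1 - alpha))) / (alpha - 1)

def sibson_mi(px, pygx, alpha):
    """Sibson mutual information of order alpha != 1, in nats, for input pmf
    px[x] and channel pygx[x, y]:
    I_alpha(X;Y) = (alpha/(alpha-1)) * log sum_y ( sum_x px[x] * pygx[x,y]^alpha )^(1/alpha)."""
    inner = np.sum(px[:, None] * pygx ** alpha, axis=0) ** (1.0 / alpha)
    return (alpha / (alpha - 1)) * np.log(np.sum(inner))

# Sanity checks: D_alpha(P||P) = 0, and a noiseless channel leaks log|X| nats.
p = np.array([0.5, 0.5])
px = np.full(4, 0.25)
print(renyi_divergence(p, p, 2.0))      # 0.0
print(sibson_mi(px, np.eye(4), 2.0))    # log(4)
```

As α → ∞, Sibson mutual information tends to maximal leakage, matching the "maximum information gain" reading above.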
The paper derives the generalized mutual information (GMI) of ordered reliability bits guessing random additive noise decoding (ORBGRAND) for memoryless binary-input channels with general output conditional probability distributions. The analysis provides insight into the gap between the ORBGRAND achievable rate and the channel mutual information. As an application, the paper studies the ORBGRAND achievable rate for bit-interleaved coded modulation (BICM), showing that the gap is typically small, suggesting the feasibility of ORBGRAND for high-order coded modulation schemes.
The second-order randomized identification capacity of the Additive White Gaussian Noise Channel (AWGNC) has the same form as the second-order transmission capacity, with the only difference being that the maximum number of messages in randomized identification scales double exponentially in the blocklength.
For a wide range of channels and pairwise-independent code ensembles, expurgating an arbitrarily small fraction of codewords from a randomly selected code results in a code that attains the expurgated exponent with high probability.
The authors develop an analytical method to estimate the average differential entropy of a Gaussian mixture distribution whose component means are i.i.d. Gaussian vectors. They obtain a series expansion in the ratio of the component-mean variance to that of the shared covariance, yielding an approximation with a quantifiable error bound.
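As a numerical baseline for this setting, a plain Monte Carlo estimate of the mixture's differential entropy can be sketched (this is not the paper's analytical expansion; the scalar case and the parameter values `s2`, `sigma` below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def mixture_entropy_mc(mus, sigma, n=200_000, rng=rng):
    """Monte Carlo estimate, in nats, of h(p) for the scalar mixture
    p(x) = (1/K) * sum_k N(x; mu_k, sigma^2)."""
    mus = np.asarray(mus, float)
    K = len(mus)
    # Sample from the mixture: pick a component, then add Gaussian noise.
    comp = rng.integers(K, size=n)
    x = mus[comp] + sigma * rng.standard_normal(n)
    # Per-component log-densities at the sampled points, shape (n, K).
    log_comp = (-0.5 * ((x[:, None] - mus[None, :]) / sigma) ** 2
                - 0.5 * np.log(2 * np.pi * sigma ** 2))
    # log p(x) = logsumexp over components - log K (stable against underflow).
    m = log_comp.max(axis=1, keepdims=True)
    logp = m[:, 0] + np.log(np.exp(log_comp - m).sum(axis=1)) - np.log(K)
    return -logp.mean()

# Component means drawn i.i.d. N(0, s2); shared component variance sigma^2.
s2, sigma = 4.0, 1.0
mus = rng.normal(0.0, np.sqrt(s2), size=8)
h = mixture_entropy_mc(mus, sigma)
```

The estimate must lie between the single-component entropy 0.5·log(2πe·σ²) and that value plus log K, which gives a quick correctness check.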
A prefix code is optimal if and only if it is complete and strongly monotone.
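Completeness here is the usual Kraft-equality condition; a minimal check of completeness and the basic monotonicity property follows (the paper's "strongly monotone" condition is a refinement of the latter, whose exact form is not restated here):

```python
from fractions import Fraction

def is_complete(lengths, r=2):
    """A prefix code over an r-ary alphabet is complete iff its Kraft sum
    equals 1 exactly: sum_i r^(-l_i) == 1."""
    return sum(Fraction(1, r ** l) for l in lengths) == 1

def is_monotone(probs, lengths):
    """Basic monotonicity: more probable symbols never get strictly longer
    codewords (a necessary condition for optimality)."""
    pairs = sorted(zip(probs, lengths), key=lambda t: -t[0])
    return all(pairs[i][1] <= pairs[i + 1][1] for i in range(len(pairs) - 1))

# Huffman code lengths for probabilities (0.4, 0.3, 0.2, 0.1): (1, 2, 3, 3).
print(is_complete([1, 2, 3, 3]))   # True: 1/2 + 1/4 + 1/8 + 1/8 = 1
print(is_complete([1, 2, 3]))      # False: Kraft sum is 7/8, code is incomplete
print(is_monotone([0.4, 0.3, 0.2, 0.1], [1, 2, 3, 3]))  # True
```

Using `Fraction` keeps the Kraft-sum test exact, avoiding floating-point comparisons against 1.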
The paper investigates the relationship between two practical metrics of high-order interdependencies: the redundancy-synergy index (RSI), which captures directed interdependencies, and the O-information, which captures undirected interdependencies. The results reveal tight links between these two quantities and provide interpretations in terms of likelihood ratios and information geometry.
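The O-information has a standard closed form in joint and marginal entropies, Omega = (n-2) H(X) + sum_i [H(X_i) - H(X_{-i})]; the sketch below evaluates it on two canonical three-variable examples (this illustrates the standard definition only, not the paper's link to the RSI):

```python
import numpy as np
from itertools import product

def H(p):
    """Shannon entropy in bits of a pmf given as an array (any shape)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def o_information(p):
    """O-information of a joint pmf with one array axis per variable:
    Omega = (n - 2) * H(X) + sum_i [ H(X_i) - H(X_{-i}) ]."""
    n = p.ndim
    omega = (n - 2) * H(p)
    for i in range(n):
        axes_not_i = tuple(j for j in range(n) if j != i)
        omega += H(p.sum(axis=axes_not_i))   # marginal entropy H(X_i)
        omega -= H(p.sum(axis=i))            # entropy of all variables but X_i
    return omega

# XOR triple: X3 = X1 ^ X2 with X1, X2 uniform -- purely synergistic.
p_xor = np.zeros((2, 2, 2))
for a, b in product(range(2), repeat=2):
    p_xor[a, b, a ^ b] = 0.25
print(o_information(p_xor))   # -1.0 bits: negative => synergy-dominated

# Three copies of one uniform bit -- purely redundant.
p_copy = np.zeros((2, 2, 2))
p_copy[0, 0, 0] = p_copy[1, 1, 1] = 0.5
print(o_information(p_copy))  # 1.0 bits: positive => redundancy-dominated
```

The sign of the O-information is exactly the "undirected" redundancy-vs-synergy summary that the paper relates to the directed RSI.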
This work proposes an accurate, efficient, and convergence-guaranteed algorithm for computing the relevance-compression function of the Information Bottleneck (IB) problem by introducing a semi-relaxed IB model.
The authors study the information rate-distortion-perception (RDP) function, which characterizes the three-way trade-off between description rate, average distortion, and perceptual quality. They reformulate the RDP problem as a Wasserstein Barycenter optimization problem, enabling the identification of critical transitions where constraints become inactive and the analysis of the interplay between distortion and perception measures. An entropy-regularized model and an improved Alternating Sinkhorn algorithm are proposed to efficiently solve the RDP problem.