Core Concepts
Minimizing entropy for log-concave random variables with fixed variance.
Abstract
The article studies the problem of minimizing Shannon differential entropy over log-concave random variables with fixed variance, surveying reverse bounds and inequalities for log-concave densities. The authors prove optimal inequalities and generalize them to Rényi entropy. As an application, the bounds yield upper bounds on the capacity of additive noise channels. The problem is reduced to two-piece affine functions via rearrangement and degrees-of-freedom arguments, and the proof of the main theorem rests on a three-point inequality together with a positivity analysis of a specific function by series expansions.
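For orientation, a standard worked example (not quoted from the article's text): the exponential distribution makes the entropy-variance gap in the reverse bound under Stats explicit, since its entropy exceeds ½ log Var(X) by exactly 1.

```latex
% Worked example (standard fact, not taken from the article):
% the exponential density f(x) = \lambda e^{-\lambda x}, x >= 0, has
\[
  \operatorname{Var}(X) = \frac{1}{\lambda^{2}},
  \qquad
  h(X) = 1 - \log\lambda = 1 + \tfrac{1}{2}\log\operatorname{Var}(X),
\]
% so the entropy-variance gap is constant in \lambda and exceeds log 2:
\[
  h(X) - \tfrac{1}{2}\log\operatorname{Var}(X) = 1 \;\ge\; \log 2 .
\]
```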
Stats
h(X) = h(f) = −∫ f log f.
h(X) ≤ 1/2 log Var(X) + 1/2 log(2πe).
h(X) ≥ 1/2 log Var(X) + log 2.
C_P(Z) ≤ C_P(N) ≤ C_P(Z) + D(N).
N(X) ≥ (2/(πe)) Var(X).
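A minimal numerical sketch of the entropy and entropy-power inequalities above, assuming the standard definitions h(f) = −∫ f log f and N(X) = e^{2h(X)}/(2πe); the test densities, function names, and integration ranges are our own choices for illustration, not the article's:

```python
# Sanity-check the bounds on two log-concave densities via quadrature.
# Assumptions (ours, not the article's): h(f) = -integral of f log f,
# entropy power N(X) = e^{2 h(X)} / (2 pi e).
import numpy as np
from scipy import integrate

def diff_entropy(f, a, b):
    """Differential entropy h(f) = -∫_a^b f log f, by numerical quadrature."""
    val, _ = integrate.quad(lambda x: -f(x) * np.log(f(x)), a, b)
    return val

# name -> (density, support lower end, support upper end, exact variance)
densities = {
    "exponential(rate=1)": (lambda x: np.exp(-x), 0.0, 40.0, 1.0),
    "Laplace(scale=1)":    (lambda x: 0.5 * np.exp(-abs(x)), -40.0, 40.0, 2.0),
}

for name, (f, a, b, var) in densities.items():
    h = diff_entropy(f, a, b)
    upper = 0.5 * np.log(var) + 0.5 * np.log(2 * np.pi * np.e)  # Gaussian maximum
    lower = 0.5 * np.log(var) + np.log(2)                        # log-concave lower bound
    n_x = np.exp(2 * h) / (2 * np.pi * np.e)                     # entropy power N(X)
    assert lower <= h <= upper
    assert n_x >= 2 * var / (np.pi * np.e)
    print(f"{name}: lower={lower:.4f} <= h={h:.4f} <= upper={upper:.4f}, "
          f"N(X)={n_x:.4f} >= (2/(pi e))*Var = {2 * var / (np.pi * np.e):.4f}")
```

For the exponential density the script reports h ≈ 1.0000, sitting between the lower bound log 2 ≈ 0.6931 and the Gaussian ceiling ½ log(2πe) ≈ 1.4189, as the listed inequalities require.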