Core Concepts
This article proposes a two-parameter generalization of the Tsallis entropy, called the generalized Tsallis entropy, and derives its fundamental information-theoretic properties, including pseudo-additivity, sub-additivity, joint convexity, and information monotonicity.
Summary
The article introduces a modified version of the Sharma-Mittal entropy, which the authors call the generalized Tsallis entropy. This new entropy measure is a two-parameter generalization of the Tsallis entropy and can be reduced to the Tsallis entropy for specific parameter values.
The key highlights and insights from the article are:
- The authors derive a product rule for the two-parameter deformed logarithm, which enables them to establish a chain rule for the generalized Tsallis entropy.
- They prove that the generalized Tsallis entropy satisfies the pseudo-additivity property: the joint entropy of two independent random variables equals the sum of their individual entropies minus a cross term proportional to the product of those entropies.
- The generalized Tsallis entropy is shown to be sub-additive, meaning the entropy of a joint distribution is less than or equal to the sum of the individual entropies.
- The authors define a generalized Tsallis relative entropy (or divergence) and demonstrate its pseudo-additivity, joint convexity, and information monotonicity properties.
- The information-geometric aspects of the generalized Tsallis relative entropy are discussed, including its connection to the Sharma-Mittal entropy and the Tsallis relative entropy.
The article provides a comprehensive analysis of the information-theoretic and information-geometric characteristics of the proposed generalized Tsallis entropy and relative entropy measures.
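One step is worth making explicit, since the summary leans on it: pseudo-additivity follows directly from the product rule quoted at the end of this digest. Writing F(t) = t^(r+k) ln_{k,r}(t), the definition in the next section gives S_{k,r}(X) = -Σ_x p(x) F(p(x)), and for independent X and Y (so that p(x,y) = p(x)p(y)) a short computation, sketched here in LaTeX, recovers the stated identity:

```latex
\begin{align*}
S_{k,r}(X,Y) &= -\sum_{x,y} p(x)\,p(y)\, F\!\bigl(p(x)\,p(y)\bigr) \\
 &= -\sum_{x,y} p(x)\,p(y)\,\bigl[ F(p(x)) + F(p(y)) + 2k\, F(p(x))\, F(p(y)) \bigr] \\
 &= S_{k,r}(X) + S_{k,r}(Y) - 2k\, S_{k,r}(X)\, S_{k,r}(Y),
\end{align*}
```

where the last step uses Σ_x p(x) = 1 in the first two sums.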
Statistics
The generalized Tsallis entropy is defined as S_{k,r}(X) = -Σ_x (p(x))^(r+k+1) ln_{k,r}(p(x)), where the two-parameter deformed logarithm is ln_{k,r}(x) = (x^k - x^(-k)) / (2k x^r).
The generalized Tsallis relative entropy is defined as D_{k,r}(P||Q) = Σ_x p(x) (p(x)/q(x))^(r-k) ln_{k,r}(p(x)/q(x)).
The pseudo-additivity property of the generalized Tsallis entropy, for independent random variables X and Y, is: S_{k,r}(X,Y) = S_{k,r}(X) + S_{k,r}(Y) - 2k S_{k,r}(X) S_{k,r}(Y).
The sub-additivity property of the generalized Tsallis entropy is: S_{k,r}(X_1, X_2, ..., X_n) ≤ Σ_i S_{k,r}(X_i).
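These two properties are easy to sanity-check numerically. Below is a minimal Python sketch assuming only the definitions of S_{k,r} and ln_{k,r} given above; the function names (ln_kr, S_kr), the parameter values k = 0.2, r = 0.3, and the example distributions are illustrative choices, not taken from the article.

```python
import numpy as np

def ln_kr(x, k, r):
    """Two-parameter deformed logarithm: (x^k - x^(-k)) / (2k x^r)."""
    return (x**k - x**(-k)) / (2 * k * x**r)

def S_kr(p, k, r):
    """Generalized Tsallis entropy: -sum_x p(x)^(r+k+1) * ln_kr(p(x))."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # terms with p(x) = 0 are taken to contribute nothing
    return -np.sum(p**(r + k + 1) * ln_kr(p, k, r))

k, r = 0.2, 0.3  # illustrative parameter values
pX = np.array([0.5, 0.3, 0.2])
pY = np.array([0.6, 0.4])

# Pseudo-additivity: for independent X, Y the joint is the outer product.
joint = np.outer(pX, pY).ravel()
lhs = S_kr(joint, k, r)
rhs = S_kr(pX, k, r) + S_kr(pY, k, r) - 2 * k * S_kr(pX, k, r) * S_kr(pY, k, r)
assert np.isclose(lhs, rhs)

# Sub-additivity: for a correlated joint distribution,
# S(X,Y) <= S(X) + S(Y) computed from the marginals.
pXY = np.array([[0.4, 0.1],
                [0.1, 0.4]])
bound = S_kr(pXY.sum(axis=1), k, r) + S_kr(pXY.sum(axis=0), k, r)
assert S_kr(pXY.ravel(), k, r) <= bound
```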
The pseudo-additivity property of the generalized Tsallis relative entropy is: D_{k,r}(P^(1)⊗P^(2) || Q^(1)⊗Q^(2)) = D_{k,r}(P^(1)||Q^(1)) + D_{k,r}(P^(2)||Q^(2)) - 2k D_{k,r}(P^(1)||Q^(1)) D_{k,r}(P^(2)||Q^(2)).
The joint convexity property of the generalized Tsallis relative entropy is: D_{k,r}(λP^(1) + (1-λ)P^(2) || λQ^(1) + (1-λ)Q^(2)) ≤ λ D_{k,r}(P^(1)||Q^(1)) + (1-λ) D_{k,r}(P^(2)||Q^(2)), for λ ∈ [0,1].
The information monotonicity property of the generalized Tsallis relative entropy, for a stochastic matrix W, is: D_{k,r}(WP||WQ) ≤ D_{k,r}(P||Q).
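The three divergence properties can be checked in the same style. The sketch below implements D_{k,r} exactly as defined above; the distributions, the mixing weight λ = 0.35, and the column-stochastic matrix W are hypothetical examples, and k = 0.2, r = 0.3 is one parameter choice for which all three inequalities hold numerically.

```python
import numpy as np

def ln_kr(x, k, r):
    return (x**k - x**(-k)) / (2 * k * x**r)

def D_kr(p, q, k, r):
    """Generalized Tsallis relative entropy: sum_x p (p/q)^(r-k) ln_kr(p/q)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    t = p / q
    return np.sum(p * t**(r - k) * ln_kr(t, k, r))

k, r = 0.2, 0.3
P1, Q1 = np.array([0.5, 0.5]), np.array([0.7, 0.3])
P2, Q2 = np.array([0.2, 0.8]), np.array([0.4, 0.6])

# Pseudo-additivity over the product distributions P1 (x) P2 and Q1 (x) Q2.
lhs = D_kr(np.outer(P1, P2).ravel(), np.outer(Q1, Q2).ravel(), k, r)
rhs = (D_kr(P1, Q1, k, r) + D_kr(P2, Q2, k, r)
       - 2 * k * D_kr(P1, Q1, k, r) * D_kr(P2, Q2, k, r))
assert np.isclose(lhs, rhs)

# Joint convexity with mixing weight lam in [0, 1].
lam = 0.35
mixed = D_kr(lam * P1 + (1 - lam) * P2, lam * Q1 + (1 - lam) * Q2, k, r)
assert mixed <= lam * D_kr(P1, Q1, k, r) + (1 - lam) * D_kr(P2, Q2, k, r)

# Information monotonicity under a column-stochastic map W (columns sum to 1).
W = np.array([[0.9, 0.2],
              [0.1, 0.8]])
assert D_kr(W @ P1, W @ Q1, k, r) <= D_kr(P1, Q1, k, r)
```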
Quotes
"This article proposes a modification in the Sharma-Mittal entropy and distinguishes it as generalised Tsallis entropy."
"We derive a product rule (xy)^(r+k) ln{k,r}(xy) = x^(r+k) ln{k,r}(x) + y^(r+k) ln{k,r}(y) + 2kx^(r+k)y^(r+k) ln{k,r}(x) ln{k,r}(y), for the two-parameter deformed logarithm ln{k,r}(x) = (x^r - x^(-k)) / (2k)."
"This article is an exposit investigation on the information-theoretic, and information-geometric characteristics of generalized Tsallis entropy."