Abstract
It is well known that the Shannon entropies of some parameterized probability distributions are concave functions of the parameter. In this paper we consider a family of such distributions (including the binomial, Poisson, and negative binomial distributions) and investigate their Shannon, Rényi, and Tsallis entropies with respect to complete monotonicity.
1 Introduction
Let \(c\in {\mathbb {R}}\), \(I_c := \left[ 0, -\frac{1}{c}\right] \) if \(c<0\), and \(I_c:= [0,+\infty )\) if \(c \ge 0\).
As usual, for \(\alpha \in {\mathbb {R}}\) and \(k \in {\mathbb {N}}_0\) the binomial coefficients are defined by
$$\begin{aligned} \left( {\begin{array}{c}\alpha \\ k\end{array}}\right) := \frac{\alpha (\alpha -1)\cdots (\alpha -k+1)}{k!}\ \ (k\ge 1),\qquad \left( {\begin{array}{c}\alpha \\ 0\end{array}}\right) := 1. \end{aligned}$$
Let \(n> 0\) be a real number such that \(n>c\) if \(c\ge 0\), or \(n=-cl\) with some \(l\in {\mathbb {N}}\) if \(c<0\).
For \(k\in {\mathbb {N}}_0\) and \(x\in I_c\) define
$$\begin{aligned} p_{n,k}^{[c]}(x) := {\left\{ \begin{array}{ll} \left( {\begin{array}{c}-\frac{n}{c}\\ k\end{array}}\right) (-cx)^k (1+cx)^{-\frac{n}{c}-k}, &{} c\ne 0,\\ e^{-nx}\,\dfrac{(nx)^k}{k!}, &{} c=0. \end{array}\right. } \end{aligned}$$
Details and historical notes concerning these functions can be found in [3, 7, 21] and the references therein. In particular, \(p_{n,k}^{[c]}(x)\ge 0\) for all \(x\in I_c\) and \(k\in {\mathbb {N}}_0\). Moreover,
$$\begin{aligned} \sum _{k=0}^{\infty } p_{n,k}^{[c]}(x) = 1,\qquad x\in I_c, \end{aligned}$$
so that \(\left( p_{n,k}^{[c]}(x)\right) _{k\ge 0}\) is a parameterized probability distribution. Its associated Shannon entropy is
$$\begin{aligned} H_{n,c}(x) := -\sum _{k=0}^{\infty } p_{n,k}^{[c]}(x)\log p_{n,k}^{[c]}(x),\qquad x\in I_c, \end{aligned}$$
while the Rényi entropy of order 2 and the Tsallis entropy of order 2 are given, respectively, by (see [18, 20])
$$\begin{aligned} R_{n,c}(x) := -\log S_{n,c}(x),\qquad T_{n,c}(x) := 1-S_{n,c}(x), \end{aligned}$$
where
$$\begin{aligned} S_{n,c}(x) := \sum _{k=0}^{\infty }\left( p_{n,k}^{[c]}(x)\right) ^2,\qquad x\in I_c. \end{aligned}$$
The cases \(c=-1\), \(c=0\), \(c=1\) correspond, respectively, to the binomial, Poisson, and negative binomial distributions. For other details see also [15, 16].
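As a concrete illustration of these definitions, the short Python sketch below (ours, purely illustrative; the helper names `p` and `entropies` are not from the paper) evaluates the weights in the three classical cases, verifies numerically that they sum to 1, and computes \(H_{n,c}\), \(S_{n,c}\), \(R_{n,c}\) and \(T_{n,c}\) at a sample point.

```python
# Illustrative numerical sketch (not from the paper): the basis functions
# p_{n,k}^{[c]} and the entropies H, S, R = -log S, T = 1 - S.
import math

def p(n, c, x, k):
    """p_{n,k}^{[c]}(x) as defined in Section 1."""
    if c == 0:
        # Poisson weights, built factor by factor to avoid huge factorials
        return math.exp(-n * x) * math.prod(n * x / j for j in range(1, k + 1))
    a = -n / c
    binom = math.prod((a - j) / (j + 1) for j in range(k))  # binom(a, k)
    return binom * (-c * x) ** k * (1 + c * x) ** (a - k)

def entropies(n, c, x, K=120):
    probs = [p(n, c, x, k) for k in range(K)]
    H = -sum(q * math.log(q) for q in probs if q > 0)  # Shannon entropy
    S = sum(q * q for q in probs)                      # sum of squares
    return H, S, -math.log(S), 1 - S                   # H, S, Renyi-2, Tsallis-2

for n, c in [(5, -1), (1, 0), (2, 1)]:  # binomial, Poisson, negative binomial
    probs = [p(n, c, 0.3, k) for k in range(120)]
    assert abs(sum(probs) - 1) < 1e-10  # the weights form a distribution
    print((n, c), entropies(n, c, 0.3))
```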
In this paper we investigate the above entropies with respect to complete monotonicity.
2 Shannon entropy
2.1 We start with the case \(c<0\).
\(H_{n,-1}\) is a concave function; this is a special case of the results of [19]; see also [6, 8, 9] and the references therein.
Here we shall determine the signs of all the derivatives of \(H_{n,c}\).
Theorem 2.1
Let \(c<0\). Then, for all \(k\ge 0\),
$$\begin{aligned} H_{n,c}^{(2k+2)}(x)\le 0,\qquad x\in \left( 0,-\tfrac{1}{c}\right) , \end{aligned}$$
(2.1)
and
$$\begin{aligned} H_{n,c}^{(2k+1)}(x)\ge 0 \ \text{ on } \left( 0,-\tfrac{1}{2c}\right] ,\qquad H_{n,c}^{(2k+1)}(x)\le 0 \ \text{ on } \left[ -\tfrac{1}{2c},-\tfrac{1}{c}\right) . \end{aligned}$$
(2.2)
Proof
We have \(n=-cl\) with \(l \in {\mathbb {N}}\). As in [10], let us represent \(\log {(l!)}\) by integrals:
Now using (1.2), (1.3) and (2.3) we get
It is a matter of calculus to prove that
and for \(k\ge 0\)
For \(0<t<1\) we have
so that
Repeated integration by parts yields
and so
Replacing x by \(-\frac{1}{c}-x\) we obtain
From (2.5), (2.6) and (2.7) it follows that
and this proves (2.1).
It is easy to verify that \(H_{n,c}^{(2k+1)}\left( -\frac{1}{2c} \right) = 0\). Since \(H_{n,c}^{(2k+2)}\le 0\), it follows that \(H_{n,c}^{(2k+1)}\) is decreasing, and this implies (2.2). \(\square \)
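The sign pattern of Theorem 2.1 can be probed numerically. The following sketch (ours, with hypothetical helper names; finite differences provide evidence, not a proof) treats the binomial case \(c=-1\), \(n=5\), for which \(-\frac{1}{2c}=\frac{1}{2}\).

```python
# Numerical probe of Theorem 2.1 for c = -1, n = 5 (binomial case, I_c = [0,1]):
# even derivatives of H should be <= 0; odd derivatives should be >= 0 to the
# left of x = 1/2 and <= 0 to the right. Central finite differences only.
import math

def H(x, n=5):
    probs = [math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]
    return -sum(q * math.log(q) for q in probs if q > 0)

def deriv(f, x, m, h=1e-2):
    """m-th order central finite difference approximation of f^(m)(x)."""
    return sum((-1)**j * math.comb(m, j) * f(x + (m / 2 - j) * h)
               for j in range(m + 1)) / h**m

for x in (0.2, 0.5, 0.8):
    print(x, [round(deriv(H, x, m), 4) for m in (1, 2, 3, 4)])
# Expected: columns m = 2 and m = 4 negative everywhere; columns m = 1 and
# m = 3 positive at x = 0.2, approximately 0 at x = 0.5, negative at x = 0.8.
```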
2.2 Consider the case \(c=0\)
\(H_{n,0}\) is the Shannon entropy of the Poisson distribution. The derivative of this function is completely monotonic: see, e.g., [2, p. 2305]. For the sake of completeness we include a short proof.
Theorem 2.2
\(H'_{n,0}\) is completely monotonic, i.e.,
$$\begin{aligned} (-1)^k H_{n,0}^{(k+1)}(x)\ge 0,\qquad x>0,\ k\in {\mathbb {N}}_0. \end{aligned}$$
(2.8)
Proof
Note that \(H_{n,0}(y) = H_{1,0}(ny)\); so it suffices to investigate the derivatives of \(H_{1,0}(x)\).
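Indeed, assuming the definition of \(p_{n,k}^{[0]}\) from Section 1, this is a one-line verification:
$$\begin{aligned} p_{n,k}^{[0]}(y) = e^{-ny}\frac{(ny)^k}{k!} = p_{1,k}^{[0]}(ny), \end{aligned}$$
and summing over \(k\) gives \(H_{n,0}(y) = H_{1,0}(ny)\).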
According to [10, (2.5)],
It follows that
and for \(k\ge 1\),
By using (2.4) we get
Combined with (2.9), this proves (2.8) for \(k\ge 1\). In particular, we see that \(H_{n,0}\) is concave and non-negative on \([0,+\infty )\); it follows that \(H'_{n,0}\ge 0\) and so (2.8) is completely proved. \(\square \)
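Here is a numerical illustration of (2.8) (ours; again evidence rather than proof): the quantities \((-1)^{m-1}H_{1,0}^{(m)}(x)\) should all be non-negative.

```python
# Probe of Theorem 2.2: (-1)^(m-1) H^(m)(x) >= 0 for the Poisson entropy
# H = H_{1,0}. Probabilities are built recursively to avoid large factorials.
import math

def H(x, K=150):
    probs, q = [math.exp(-x)], math.exp(-x)
    for k in range(1, K):
        q *= x / k
        probs.append(q)
    return -sum(p * math.log(p) for p in probs if p > 0)

def deriv(f, x, m, h=1e-2):  # central finite differences, as before
    return sum((-1)**j * math.comb(m, j) * f(x + (m / 2 - j) * h)
               for j in range(m + 1)) / h**m

for x in (0.5, 1.0, 2.0):
    print(x, [round((-1)**(m - 1) * deriv(H, x, m), 5) for m in (1, 2, 3, 4)])
# All printed values should be >= 0 (up to discretization error).
```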
2.3 Now let \(c>0\)
Theorem 2.3
For \(c>0\), \(H'_{n,c}\) is completely monotonic.
Proof
Since \(H_{m,c}(y) = H_{\frac{m}{c},1}(cy)\), it suffices to study the derivatives of \(H_{n,1}(x)\).
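This identity also follows directly from the definition in Section 1 (a one-line check, assuming the basis functions as given there): for \(c>0\),
$$\begin{aligned} p_{m,k}^{[c]}(y) = \left( {\begin{array}{c}-\frac{m}{c}\\ k\end{array}}\right) (-cy)^k (1+cy)^{-\frac{m}{c}-k} = p_{\frac{m}{c},k}^{[1]}(cy), \end{aligned}$$
and summing over \(k\) yields \(H_{m,c}(y)=H_{\frac{m}{c},1}(cy)\).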
Proceeding as in the proof of Theorem 2.2, we get
It follows that, for \(j\ge 1\),
Using again (2.4), we get
where
We shall prove that \(u(x)\le 0\) and \(v(x)\le 0\), \(x>0\). Let us remark that
and integration by parts yields
Applying this formula repeatedly, we obtain
Now (2.10) and (2.11) imply \(u(x) \le 0\).
Integrating by parts again, we get
which shows that \(v(x) \le 0\).
We conclude that
In particular, (2.12) shows that \(H_{n,1}\) is concave on \([0,+\infty )\); it is also non-negative, which means that \(H'_{n,1}\ge 0\). Combined with (2.12), this shows that \(H'_{n,1}\) is completely monotonic, and the proof is finished. \(\square \)
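An analogous numerical probe (ours, illustrative) for \(c=1\) works with the negative binomial weights directly.

```python
# Probe of Theorem 2.3 for c = 1, n = 2: the negative binomial weights are
# p_k(x) = C(n+k-1, k) x^k (1+x)^(-n-k); check (-1)^(m-1) H^(m)(x) >= 0.
import math

def H(x, n=2, K=300):
    probs = [math.comb(n + k - 1, k) * x**k * (1 + x) ** (-n - k)
             for k in range(K)]
    return -sum(q * math.log(q) for q in probs if q > 0)

def deriv(f, x, m, h=1e-2):  # central finite differences, as before
    return sum((-1)**j * math.comb(m, j) * f(x + (m / 2 - j) * h)
               for j in range(m + 1)) / h**m

for x in (0.5, 1.0, 1.5):
    print(x, [round((-1)**(m - 1) * deriv(H, x, m), 5) for m in (1, 2, 3, 4)])
# All printed values should be >= 0, consistent with complete monotonicity of H'.
```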
Remark 2.4
(2.11) can be obtained alternatively by using the change of variables \(y=(1-t)/(1+xt)\) and the properties of the Beta function. An alternative proof of the inequality \(v(x)\le 0\) follows from
Corollary 2.5
The following inequalities are valid for \(x>0\) and \(c\ge 0\):
In particular, for \(c=0\) and \(n=1\),
Proof
We have seen that \(H'_{n,c}(x)\ge 0\). An application of (1.1) yields
This proves the first inequality in (2.13); the second is a consequence of Jensen’s inequality applied to the concave function \(\log {t}\). \(\square \)
3 Rényi entropy and Tsallis entropy
The following conjecture was formulated in [13]:
Conjecture 3.1
\(S_{n,-1}\) is convex on [0, 1].
Th. Neuschel [11] proved that \(S_{n,-1}\) is decreasing on \(\left[ 0, \frac{1}{2}\right] \) and increasing on \(\left[ \frac{1}{2}, 1\right] \). The conjecture and Neuschel’s result can also be found in [5].
A proof of the conjecture was given by G. Nikolov [12], who related it to some new inequalities involving Legendre polynomials. Another proof can be found in [4].
Using the important results of Elena Berdysheva [3], the following extension was obtained in [17]:
Theorem 3.2
[17, Theorem 9] For \(c<0\), \(S_{n,c}\) is convex on \(\left[ 0, -\frac{1}{c}\right] \).
A stronger conjecture was formulated in [14] and [17]:
Conjecture 3.3
For \(c\in {\mathbb {R}}\), \(S_{n,c}\) is logarithmically convex, i.e., \(\log S_{n,c}\) is convex.
This was validated for \(c\ge 0\) by U. Abel, W. Gawronski and Th. Neuschel [1], who proved a stronger result:
Theorem 3.4
[1] For \(c\ge 0\), the function \(S_{n,c}\) is completely monotonic, i.e.,
$$\begin{aligned} (-1)^k S_{n,c}^{(k)}(x)\ge 0,\qquad x\ge 0,\ k\in {\mathbb {N}}_0. \end{aligned}$$
Consequently, for \(c\ge 0\), \(S_{n,c}\) is logarithmically convex, and hence convex.
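In the Poisson case \(c=0\), \(n=1\) one even has the classical closed form \(S_{1,0}(x)=e^{-2x}I_0(2x)\), where \(I_0\) is the modified Bessel function of order zero. The sketch below (ours, illustrative) checks the alternating sign pattern of the derivatives numerically.

```python
# Probe of Theorem 3.4 for c = 0, n = 1: S(x) = sum_k (e^{-x} x^k / k!)^2,
# which equals e^{-2x} I_0(2x). Complete monotonicity means
# (-1)^m S^(m)(x) >= 0 for all m >= 0.
import math

def S(x, K=150):
    q, s = math.exp(-x), 0.0
    for k in range(K):
        s += q * q          # (e^{-x} x^k / k!)^2
        q *= x / (k + 1)    # next Poisson weight
    return s

def deriv(f, x, m, h=1e-2):  # central finite differences; m = 0 returns f(x)
    return sum((-1)**j * math.comb(m, j) * f(x + (m / 2 - j) * h)
               for j in range(m + 1)) / h**m

for x in (0.5, 1.0, 2.0):
    print(x, [round((-1)**m * deriv(S, x, m), 6) for m in range(5)])
# Every printed value should be >= 0.
```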
Summing up, for the Rényi entropy \(R_{n,c} = -\log S_{n,c}\) and Tsallis entropy \(T_{n,c}=1-S_{n,c}\) we have the following
Corollary 3.5
(i) Let \(c\ge 0\). Then \(R_{n,c}\) is increasing and concave, while \(T'_{n,c}\) is completely monotonic on \([0,+\infty )\).
(ii) \(T_{n,c}\) is concave for all \(c\in {\mathbb {R}}\).
Proof
(i) Apply Theorem 3.4.
(ii) For \(c<0\), apply Theorem 3.2. For \(c\ge 0\), Theorem 3.4 shows that \(S_{n,c}\) is convex, so that \(T_{n,c}\) is concave. \(\square \)
Remark 3.6
As far as we know, Conjecture 3.3 is still open for \(c<0\), so that the concavity of \(R_{n,c}\), \(c<0\), remains to be investigated.
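For readers who wish to experiment, the following crude probe (ours; numerical evidence only, no substitute for a proof) tests the log-convexity of \(S_{n,-1}\) on \([0,1]\) for a sample value of \(n\).

```python
# Numerical probe of Conjecture 3.3 in the open case c = -1: is log S_{n,-1}
# convex on [0,1]? Reports the most negative second difference of log S.
import math

def S(x, n=8):  # sum of squared binomial weights
    return sum((math.comb(n, k) * x**k * (1 - x)**(n - k))**2
               for k in range(n + 1))

h = 1e-3
second_diffs = [
    (math.log(S(x - h)) - 2 * math.log(S(x)) + math.log(S(x + h))) / h**2
    for x in [i / 200 for i in range(1, 200)]
]
print(min(second_diffs))  # a non-negative minimum is consistent with convexity
```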
References
U. Abel, W. Gawronski, Th. Neuschel, Complete monotonicity and zeros of sums of squared Baskakov functions. Appl. Math. Comput. 258, 130–137 (2015)
J.A. Adell, A. Lekuona, Y. Yu, Sharp bounds on the entropy of the Poisson Law and related quantities. IEEE Trans. Inf. Theory 56, 2299–2306 (2010)
E. Berdysheva, Studying Baskakov–Durrmeyer operators and quasi-interpolants via special functions. J. Approx. Theory 149, 131–150 (2007)
I. Gavrea, M. Ivan, On a conjecture concerning the sum of the squared Bernstein polynomials. Appl. Math. Comput. 241, 70–74 (2014)
H. Gonska, I. Raşa, M.-D. Rusu, Chebyshev–Grüss-type inequalities via discrete oscillations. Bull. Acad. Ştiinţe Repub. Mold. Mat. 1(74), 63–89 (2014); arXiv:1401.7908
P. Harremoës, Binomial and Poisson distributions as maximum entropy distributions. IEEE Trans. Inf. Theory 47, 2039–2041 (2001)
M. Heilmann, Erhöhung der Konvergenzgeschwindigkeit bei der Approximation von Funktionen mit Hilfe von Linearkombinationen spezieller positiver linearer Operatoren (Universität Dortmund, Habilitationsschrift, 1992)
E. Hillion, Concavity of entropy along binomial convolutions. Electron. Commun. Prob. 17, 1–9 (2012)
E. Hillion, O. Johnson, A proof of the Shepp–Olkin entropy concavity conjecture, arXiv:1503.01570v1 (2015)
C. Knessl, Integral representations and asymptotic expansions for Shannon and Rényi entropies. Appl. Math. Lett. 11, 69–74 (1998)
Th. Neuschel, Unpublished manuscript (2012)
G. Nikolov, Inequalities for ultraspherical polynomials. Proof of a conjecture of I. Raşa. J. Math. Anal. Appl. 418, 852–860 (2014)
I. Raşa, Unpublished manuscripts (2012)
I. Raşa, Special functions associated with positive linear operators, arXiv:1409.1015v2 (2014)
I. Raşa, Rényi entropy and Tsallis entropy associated with positive linear operators, arXiv:1412.4971v1 (2014)
I. Raşa, Entropies and the derivatives of some Heun functions, arXiv:1502.05570v1 (2015)
I. Raşa, Entropies and Heun functions associated with positive linear operators. Appl. Math. Comput. 268, 422–431 (2015)
A. Rényi, On measures of entropy and information, Fourth Berkeley Symposium on Mathematical Statistics and Probability (University of California Press, Berkeley, 1961), pp. 547–561
L.A. Shepp, I. Olkin, Entropy of the sum of independent Bernoulli random variables and of the multinomial distribution, in Contributions to Probability (Academic Press, New York, 1981)
C. Tsallis, Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 52, 479–487 (1988)
M. Wagner, Quasi-Interpolanten zu genuinen Baskakov–Durrmeyer-Typ Operatoren (Shaker Verlag, Aachen, 2013)
Acknowledgements
The author is grateful to the referee for valuable comments and very constructive suggestions. In particular, the elegant alternative proofs presented in Remark 2.4 were kindly suggested by the referee.