1 Introduction

Let \(c\in {\mathbb {R}}\), \(I_c := \left[ 0, -\frac{1}{c}\right] \) if \(c<0\), and \(I_c:= [0,+\infty )\) if \(c \ge 0\).

As usual, for \(\alpha \in {\mathbb {R}}\) and \(k \in {\mathbb {N}}_0\) the binomial coefficients are defined by

$$\begin{aligned} \binom{\alpha }{k} :=\frac{\alpha (\alpha -1)\dots (\alpha -k+1)}{k!} \quad \text {if } k \in {\mathbb {N}}, \text { and } \binom{\alpha }{0} :=1. \end{aligned}$$

Let \(n> 0\) be a real number such that \(n>c\) if \(c\ge 0\), or \(n=-cl\) with some \(l\in {\mathbb {N}}\) if \(c<0\).

For \(k\in {\mathbb {N}}_0\) and \(x\in I_c\) define

$$\begin{aligned} p_{n,k}^{[c]}(x)&:= (-1)^k \binom{-\frac{n}{c}}{k} (cx)^k (1+cx)^{-\frac{n}{c}-k}, \quad \text {if } c\ne 0,\\ p_{n,k}^{[0]}(x)&:= \lim _{c\rightarrow 0} p_{n,k}^{[c]}(x)= \frac{(nx)^k}{k!}e^{-nx}. \end{aligned}$$

Details and historical notes concerning these functions can be found in [3, 7, 21] and the references therein. In particular,

$$\begin{aligned} \frac{d}{dx}p_{n,k}^{[c]}(x) = n \left( p_{n+c,k-1}^{[c]}(x) - p_{n+c,k}^{[c]}(x)\right) . \end{aligned}$$
(1.1)

Moreover,

$$\begin{aligned} \sum _{k=0}^\infty p_{n,k}^{[c]}(x) = 1; \end{aligned}$$
(1.2)
$$\begin{aligned} \sum _{k=0}^\infty k\, p_{n,k}^{[c]}(x)=nx, \end{aligned}$$
(1.3)

so that \(\left( p_{n,k}^{[c]}(x)\right) _{k\ge 0}\) is a parameterized probability distribution. Its associated Shannon entropy is

$$\begin{aligned} H_{n,c}(x):=-\sum _{k=0}^\infty p_{n,k}^{[c]}(x) \log p_{n,k}^{[c]}(x), \end{aligned}$$

while the Rényi entropy of order 2 and the Tsallis entropy of order 2 are given, respectively, by (see [18, 20])

$$\begin{aligned} R_{n,c}(x):= -\log S_{n,c}(x); \quad T_{n,c}(x):=1-S_{n,c}(x), \end{aligned}$$

where

$$\begin{aligned} S_{n,c}(x) := \sum _{k=0}^\infty \left( p_{n,k}^{[c]}(x)\right) ^2, \quad x\in I_c. \end{aligned}$$

The cases \(c=-1\), \(c=0\), \(c=1\) correspond, respectively, to the binomial, Poisson, and negative binomial distributions. For other details see also [15, 16].
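The definition and the identities (1.2), (1.3) are easy to check numerically. The following sketch (ours, not part of the paper; the truncation bounds \(K\) are chosen ad hoc) evaluates \(p_{n,k}^{[c]}\) directly from the definition and illustrates the three classical cases as well as the \(c\rightarrow 0\) limit.

```python
import math

def p(n, c, k, x):
    """p_{n,k}^{[c]}(x), computed directly from the definition above."""
    if c == 0:
        return (n * x) ** k / math.factorial(k) * math.exp(-n * x)
    a = -n / c
    b = 1.0
    for i in range(k):            # binom(a, k) = a(a-1)...(a-k+1) / k!
        b *= (a - i) / (i + 1)
    return (-1) ** k * b * (c * x) ** k * (1 + c * x) ** (a - k)

# c = -1, 0, 1: binomial, Poisson, negative binomial weights
for n, c, x, K in [(4, -1, 0.3, 4), (2, 0, 0.7, 80), (2, 1, 0.7, 300)]:
    total = sum(p(n, c, k, x) for k in range(K + 1))      # (1.2): should be 1
    mean = sum(k * p(n, c, k, x) for k in range(K + 1))   # (1.3): should be nx
    print(f"c={c:2d}: sum={total:.8f}, mean={mean:.8f} (nx={n * x})")

# the Poisson case is the c -> 0 limit of the other two
print(abs(p(2, 1e-7, 3, 0.5) - p(2, 0, 3, 0.5)) < 1e-5)
```

For \(c<0\) the sum over \(k\) is finite (it stops at \(k=l\)); for \(c\ge 0\) it is an infinite series, truncated here at \(K\).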

In this paper we investigate the complete monotonicity properties of the entropies introduced above.

2 Shannon entropy

2.1 Let’s start with the case \(c<0\).

\(H_{n,-1}\) is a concave function; this is a special case of the results of [19]; see also [6, 8, 9] and the references therein.

Here we shall determine the signs of all the derivatives of \(H_{n,c}\).

Theorem 2.1

Let \(c<0\). Then, for all \(k\ge 0\),

$$\begin{aligned} H_{n,c}^{(2k+2)}(x) \le 0, \quad x \in \left( 0,-\frac{1}{c} \right) , \end{aligned}$$
(2.1)
$$\begin{aligned} H_{n,c}^{(2k+1)}(x) \left\{ \begin{array}{ll} \ge 0, & \quad x \in \left( 0,-\frac{1}{2c} \right],\\ \le 0, & \quad x \in \left[ -\frac{1}{2c}, - \frac{1}{c} \right). \end{array}\right. \end{aligned}$$
(2.2)

Proof

We have \(n=-cl\) with \(l \in {\mathbb {N}}\). As in [10], let us represent \(\log {(l!)}\) by integrals:

$$\begin{aligned} \log {(l!)} = \int _0 ^\infty \left( l - \frac{1-e^{-ls}}{1-e^{-s}} \right) \frac{e^{-s}}{s} ds = \int _0 ^1 \left( \frac{1-(1-t)^l}{t} -l \right) \frac{dt}{\log {(1-t)}}. \end{aligned}$$
(2.3)
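As a sanity check (ours, not the paper's), the second integral in (2.3) can be evaluated with a plain midpoint rule: the integrand extends continuously to both endpoints (it tends to \(l(l-1)/2\) as \(t\rightarrow 0\) and to \(0\) as \(t\rightarrow 1\)), so no special treatment is needed.

```python
import math

def log_factorial(l, N=200_000):
    """Midpoint-rule evaluation of the second integral in (2.3)."""
    s = 0.0
    for i in range(N):
        t = (i + 0.5) / N
        s += ((1 - (1 - t) ** l) / t - l) / math.log(1 - t)
    return s / N

print(log_factorial(5), math.log(math.factorial(5)))  # both ~ 4.7875
```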

Now using (1.2), (1.3) and (2.3) we get

$$\begin{aligned} H_{n,c}(x) &= H_{l,-1}(-cx) = - l \left[ (-cx)\log {(-cx)}+(1+cx)\log {(1+cx)}\right] \\ &\quad +\int _0 ^1 \frac{-t}{\log {(1-t)}}\, \frac{(1+cxt)^l+(1-t-cxt)^l-1-(1-t)^l}{t^2}\,dt. \end{aligned}$$

It is a matter of calculus to prove that

$$\begin{aligned} H''_{n,c}(x) &= cl \left( \frac{1}{x} - \frac{c}{1+cx}\right) \\ &\quad + c^2l(l-1)\int _0 ^1 \frac{-t}{\log {(1-t)}} \left[ (1+cxt)^{l-2} + (1-t-cxt)^{l-2}\right] dt, \end{aligned}$$

and for \(k\ge 0\)

$$\begin{aligned} H_{n,c}^{(2k+2)}(x) &= cl(2k)! \left( \frac{1}{x^{2k+1}} - \left( \frac{c}{1+cx}\right) ^{2k+1} \right) \\ &\quad + l(l-1)\dots (l-2k-1)c^{2k+2} \int _0 ^1 \frac{-t}{\log {(1-t)}} \left[ (1+cxt)^{l-2k-2} + (1-t-cxt)^{l-2k-2}\right] t^{2k}\, dt. \end{aligned}$$

For \(0<t<1\) we have

$$\begin{aligned} 0<\frac{-t}{\log {(1-t)}}<1, \end{aligned}$$
(2.4)

so that

$$\begin{aligned} H_{n,c}^{(2k+2)}(x)&\le cl(2k)! \left( \frac{1}{x^{2k+1}} - \left( \frac{c}{1+cx}\right) ^{2k+1} \right) \nonumber \\&\quad + l(l-1)\dots (l-2k-1)c^{2k+2}\nonumber \\&\qquad \times \int _0 ^1 \left[ (1+cxt)^{l-2k-2} + (1-t-cxt)^{l-2k-2}\right] t^{2k} dt. \end{aligned}$$
(2.5)

Repeated integration by parts yields

$$\begin{aligned} \int _0 ^1 (1+cxt)^{l-2k-2}t^{2k}dt \le \frac{(2k)!}{(l-2)(l-3)\dots (l-2k-1)(cx)^{2k}}\int _0 ^1 (1+cxt)^{l-2}dt, \end{aligned}$$

and so

$$\begin{aligned} \int _0 ^1 (1+cxt)^{l-2k-2}t^{2k}dt \le \frac{(2k)!\left[ (1+cx)^{l-1}-1 \right] }{(l-1)(l-2)\dots (l-2k-1)(cx)^{2k+1}}. \end{aligned}$$
(2.6)

Replacing x by \(-\frac{1}{c}-x\) we obtain

$$\begin{aligned} \int _0 ^1 (1-t-cxt)^{l-2k-2}t^{2k}dt \le \frac{(2k)! \left[ 1-(-cx)^{l-1} \right] }{(l-1)(l-2)\dots (l-2k-1)(1+cx)^{2k+1}}. \end{aligned}$$
(2.7)

From (2.5), (2.6) and (2.7) it follows that

$$\begin{aligned} H_{n,c}^{(2k+2)}(x)\le cl(2k)! \left[ \frac{(1+cx)^{l-1}}{x^{2k+1}} - \frac{c^{2k+1}(-cx)^{l-1}}{(1+cx)^{2k+1}}\right] \le 0, \end{aligned}$$

and this proves (2.1).

By the symmetry \(H_{n,c}(x)=H_{n,c}\left( -\frac{1}{c}-x\right) \), every odd-order derivative vanishes at the midpoint, i.e., \(H_{n,c}^{(2k+1)}\left( -\frac{1}{2c} \right) = 0\). Since \(H_{n,c}^{(2k+2)}\le 0\), it follows that \(H_{n,c}^{(2k+1)}\) is decreasing, and this implies (2.2). \(\square \)
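For \(c=-1\) (so \(x\in [0,1]\) and \(-\frac{1}{2c}=\frac{1}{2}\)), the sign pattern of Theorem 2.1 can be illustrated numerically. The finite-difference sketch below is ours, with an ad hoc step size; it checks (2.1) and (2.2) for \(k=0\).

```python
import math

def H(l, x):
    """Shannon entropy H_{l,-1}(x) of the binomial(l, x) distribution."""
    s = 0.0
    for k in range(l + 1):
        q = math.comb(l, k) * x**k * (1 - x) ** (l - k)
        if q > 0:
            s -= q * math.log(q)
    return s

l, h = 6, 1e-4
d1 = lambda x: (H(l, x + h) - H(l, x - h)) / (2 * h)
d2 = lambda x: (H(l, x + h) - 2 * H(l, x) + H(l, x - h)) / h**2
for x in [0.1, 0.3, 0.5, 0.7, 0.9]:
    print(f"x={x}: H' ~ {d1(x):+.4f}, H'' ~ {d2(x):+.4f}")
# (2.1) with k=0: H'' <= 0 throughout; (2.2) with k=0: H' changes sign at 1/2
```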

2.2 Consider the case \(c=0\)

\(H_{n,0}\) is the Shannon entropy of the Poisson distribution. The derivative of this function is completely monotonic: see, e.g., [2, p. 2305]. For the sake of completeness we insert here a short proof.

Theorem 2.2

\(H'_{n,0}\) is completely monotonic, i.e.,

$$\begin{aligned} (-1)^k H_{n,0}^{(k+1)}(x)\ge 0, \quad k \ge 0,\quad x>0. \end{aligned}$$
(2.8)

Proof

Note that \(H_{n,0}(y) = H_{1,0}(ny)\); so it suffices to investigate the derivatives of \(H_{1,0}(x)\).

According to [10, (2.5)],

$$\begin{aligned} H_{1,0}(x) &= x-x\log {x} + \int _0 ^\infty \frac{e^{-t}}{t} \left( x - \frac{1-\exp {(x(e^{-t}-1))}}{1-e^{-t}} \right) dt\\ &= x-x\log {x} - \int _0 ^1 \left( x - \frac{1-e^{-sx}}{s} \right) \frac{ds}{\log {(1-s)}}. \end{aligned}$$

It follows that

$$\begin{aligned} H'_{1,0}(x) = -\log {x} - \int _0^1 \left( 1-e^{-sx}\right) \frac{ds}{\log {(1-s)}} \end{aligned}$$

and for \(k\ge 1\),

$$\begin{aligned} H_{1,0}^{(k+1)}(x) = (-1)^k \left( \frac{(k-1)!}{x^k} + \int _0^1 s^k e^{-sx} \frac{ds}{\log {(1-s)}}\right) . \end{aligned}$$
(2.9)

By using (2.4) we get

$$\begin{aligned} \int _0 ^1 \frac{s^k e^{-sx}}{\log {(1-s)}}ds&\ge - \int _0 ^1 s^{k-1}e^{-sx} ds \\&=-\int _0 ^x \frac{t^{k-1}}{x^k} e^{-t}dt \ge - \int _0 ^\infty \frac{1}{x^k}t^{k-1}e^{-t}dt\\&= -\frac{(k-1)!}{x^k}. \end{aligned}$$

Combined with (2.9), this proves (2.8) for \(k\ge 1\). In particular, \(H_{n,0}\) is concave on \([0,+\infty )\); since it is also non-negative, \(H'_{n,0}\) cannot become negative (otherwise concavity would force \(H_{n,0}\) to decrease without bound), so \(H'_{n,0}\ge 0\) and (2.8) is completely proved. \(\square \)
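The alternating signs (2.8) for small \(k\) can be observed numerically. In this sketch (ours; the truncation \(K\) and step \(h\) are ad hoc) we approximate the first three derivatives of the Poisson entropy \(H_{1,0}\) by central finite differences.

```python
import math

def H(x, K=150):
    """Shannon entropy of the Poisson(x) distribution, truncated at K terms."""
    s = 0.0
    for k in range(K + 1):
        q = math.exp(-x + k * math.log(x) - math.lgamma(k + 1))
        if q > 0:
            s -= q * math.log(q)
    return s

h = 1e-3
for x in [0.5, 1.0, 3.0]:
    d1 = (H(x + h) - H(x - h)) / (2 * h)
    d2 = (H(x + h) - 2 * H(x) + H(x - h)) / h**2
    d3 = (H(x + 2 * h) - 2 * H(x + h) + 2 * H(x - h) - H(x - 2 * h)) / (2 * h**3)
    print(f"x={x}: H'={d1:+.4f}, H''={d2:+.4f}, H'''={d3:+.4f}")
# (2.8) predicts H' >= 0, H'' <= 0, H''' >= 0, ...
```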

2.3 Let now \(c>0\)

Theorem 2.3

For \(c>0\), \(H'_{n,c}\) is completely monotonic.

Proof

Since \(H_{m,c}(y) = H_{\frac{m}{c},1}(cy)\), it suffices to study the derivatives of \(H_{n,1}(x)\).

By using (1.2), (1.3) and

$$\begin{aligned} \log {A} = \int _0 ^\infty \frac{e^{-x}-e^{-Ax}}{x}dx, \quad A>0, \end{aligned}$$

we get

$$\begin{aligned} H_{n,1}(x) &= n \left( (1+x)\log {(1+x)} - x\log {x} \right) +\int _0 ^\infty \frac{e^{-ns}-e^{-s}}{s(1-e^{-s})}\left( 1-(1+x-xe^{-s})^{-n} \right) ds\\ &= n \left( (1+x)\log {(1+x)} - x\log {x} \right) + \int _0 ^1 \frac{1-(1-t)^{n-1}}{t\log {(1-t)}} \left( 1-(1+tx)^{-n} \right) dt. \end{aligned}$$

It follows that, for \(j\ge 1\),

$$\begin{aligned} \frac{1}{n}H_{n,1}^{(j+1)}(x) &= (-1)^{j-1}(j-1)! \left( (x+1)^{-j} - x^{-j} \right) \\ &\quad + (-1)^{j-1}(n+1)(n+2)\dots (n+j) \int _0 ^1 \frac{-t}{\log {(1-t)}} \left[ 1-(1-t)^{n-1} \right] (1+xt)^{-n-j-1}t^{j-1}\,dt. \end{aligned}$$

Using again (2.4), we get

$$\begin{aligned} (-1)^{j-1}\frac{1}{n}H_{n,1}^{(j+1)}(x)&\le (j-1)! \left( (x+1)^{-j}-x^{-j}\right) +(n+1)(n+2)\dots (n+j) \\&\quad \times \int _0 ^1 \left[ 1-(1-t)^{n-1}\right] (1+xt)^{-n-j-1}t^{j-1}dt\\&=u(x) + v(x), \end{aligned}$$

where

$$\begin{aligned} u(x) &:= \frac{(j-1)!}{(x+1)^j} - (n+1)(n+2)\dots (n+j) \int _0 ^1 t^{j-1}(1-t)^{n-1} (1+xt)^{-n-j-1}dt,\\ v(x) &:= (n+1)(n+2)\dots (n+j) \int _0 ^1 t^{j-1} (1+xt)^{-n-j-1}dt - \frac{(j-1)!}{x^j}. \end{aligned}$$

We shall prove that \(u(x)\le 0\) and \(v(x)\le 0\), \(x>0\). Let us remark that

$$\begin{aligned} \int _0 ^1 t^{j-1}(1-t)^{n-1}(1+xt)^{-n-j-1}dt \ge \int _0 ^1 t^{j-1}(1-t)^n(1+xt)^{-n-j-1}dt, \end{aligned}$$
(2.10)

and integration by parts yields

$$\begin{aligned} \int _0 ^1 \frac{t^{j-1} (1-t)^n}{(1+xt)^{n+j+1}}dt = \frac{j-1}{(n+1)(x+1)} \int _0 ^1 \frac{t^{j-2}(1-t)^{n+1}}{(1+xt)^{n+j+1}}dt. \end{aligned}$$

Applying repeatedly this formula we obtain

$$\begin{aligned} \int _0 ^1 \frac{t^{j-1}(1-t)^n}{(1+xt)^{n+j+1}}dt = \frac{(j-1)!}{(n+1)(n+2)\dots (n+j)}\frac{1}{(x+1)^j}. \end{aligned}$$
(2.11)

Now (2.10) and (2.11) imply \(u(x) \le 0\).

Using again integration by parts we get

$$\begin{aligned} \int _0 ^1 t^{j-1}(1+xt)^{-n-j-1}dt &\le \frac{j-1}{(n+j)x} \int _0 ^1 t^{j-2}(1+xt)^{-n-j}dt \\ &\le \dots \le \frac{(j-1)!}{(n+1)(n+2)\dots (n+j)}\frac{1}{x^j}, \end{aligned}$$

which shows that \(v(x) \le 0\).

We conclude that

$$\begin{aligned} (-1)^{j-1}H_{n,1}^{(j+1)}(x)\le 0, \quad j \ge 1, x>0. \end{aligned}$$
(2.12)

In particular, (2.12) shows that \(H_{n,1}\) is concave on \([0,+\infty )\); it is also non-negative, which means that \(H'_{n,1}\ge 0\). Combined with (2.12), this shows that \(H'_{n,1}\) is completely monotonic, and the proof is finished. \(\square \)

Remark 2.4

(2.11) can be obtained alternatively by using the change of variables \(y=(1-t)/(1+xt)\) and the properties of the Beta function. An alternative proof of the inequality \(v(x)\le 0\) follows from

$$\begin{aligned} \int _0^1 t^{j-1} (1+xt)^{-n-j-1} dt &\le \frac{1}{x^{j-1}}\int _0 ^\infty \frac{(xt)^{j-1}}{(1+xt)^{n+j+1}}dt \\ &= \frac{1}{x^j} \int _0 ^\infty \frac{s^{j-1}}{(1+s)^{j+n+1}}ds=\frac{1}{x^j}B(j,n+1)=\frac{1}{x^j}\frac{(j-1)!\,n!}{(n+j)!}. \end{aligned}$$

Corollary 2.5

The following inequalities are valid for \(x>0\) and \(c\ge 0\):

$$\begin{aligned} \log {\frac{x}{cx+1}} \le \sum _{k=0}^\infty p_{n+c,k}^{[c]}(x)\log {\frac{k+1}{ck+n}}\le \log {\frac{nx+1}{ncx+n}}. \end{aligned}$$
(2.13)

In particular, for \(c=0\) and \(n=1\),

$$\begin{aligned} \log {x} \le \sum _{k=0}^\infty e^{-x}\frac{x^k}{k!}\log {(k+1)}\le \log {(x+1)}. \end{aligned}$$

Proof

We have seen that \(H'_{n,c}(x)\ge 0\). An application of (1.1) yields

$$\begin{aligned} H'_{n,c}(x) = n \left( \log {\frac{1+cx}{x}} + \sum _{k=0}^\infty p_{n+c,k}^{[c]}(x)\log {\frac{k+1}{n+ck}} \right) . \end{aligned}$$

This proves the first inequality in (2.13); the second is a consequence of Jensen’s inequality applied to the concave function \(\log {t}\). \(\square \)
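The special case \(c=0\), \(n=1\) is easy to verify directly. The sketch below (ours; the series is truncated at an ad hoc \(K=200\) terms) evaluates the Poisson average of \(\log(k+1)\) and compares it with the two bounds.

```python
import math

def mid(x, K=200):
    """sum_{k>=0} e^{-x} x^k / k! * log(k+1), truncated at K."""
    return sum(math.exp(-x + k * math.log(x) - math.lgamma(k + 1)) * math.log(k + 1)
               for k in range(K + 1))

for x in [0.5, 1.0, 4.0, 10.0]:
    print(f"x={x}: {math.log(x):+.4f} <= {mid(x):+.4f} <= {math.log(x + 1):+.4f}")
```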

3 Rényi entropy and Tsallis entropy

The following conjecture was formulated in [13]:

Conjecture 3.1

\(S_{n,-1}\) is convex on [0, 1].

Th. Neuschel [11] proved that \(S_{n,-1}\) is decreasing on \(\left[ 0, \frac{1}{2}\right] \) and increasing on \(\left[ \frac{1}{2}, 1\right] \). The conjecture and Neuschel’s result can also be found in [5].

A proof of the conjecture was given by G. Nikolov [12], who related it to some new inequalities involving Legendre polynomials. Another proof can be found in [4].

Using the important results of Elena Berdysheva [3], the following extension was obtained in [17]:

Theorem 3.2

[17, Theorem 9] For \(c<0\), \(S_{n,c}\) is convex on \(\left[ 0, -\frac{1}{c}\right] \).

A stronger conjecture was formulated in [14] and [17]:

Conjecture 3.3

For \(c\in {\mathbb {R}}\), \(S_{n,c}\) is logarithmically convex, i.e., \(\log S_{n,c}\) is convex.

This was validated for \(c\ge 0\) by U. Abel, W. Gawronski and Th. Neuschel [1], who proved a stronger result:

Theorem 3.4

[1] For \(c\ge 0\), the function \(S_{n,c}\) is completely monotonic, i.e.,

$$\begin{aligned} (-1)^m \left( \frac{d}{dx} \right) ^m S_{n,c}(x)>0, \quad x\ge 0, m\ge 0. \end{aligned}$$

Consequently, for \(c\ge 0\), \(S_{n,c}\) is logarithmically convex, and hence convex.
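For \(c=1\), both the monotonicity implied by Theorem 3.4 and the logarithmic convexity of Conjecture 3.3 can be observed numerically. The following sketch is ours (truncation \(K\) and step \(h\) chosen ad hoc); it evaluates \(S_{n,1}\) via logarithms of the negative-binomial-type weights to avoid overflow.

```python
import math

def S(n, x, K=400):
    """S_{n,1}(x): sum of squares of the weights p_{n,k}^{[1]}(x)."""
    s = 0.0
    for k in range(K + 1):
        logp = (math.lgamma(n + k) - math.lgamma(n) - math.lgamma(k + 1)
                + k * math.log(x) - (n + k) * math.log(1 + x))
        s += math.exp(2 * logp)
    return s

n, h = 2, 1e-3
logS = lambda x: math.log(S(n, x))
for x in [0.5, 1.0, 2.0]:
    d1 = (S(n, x + h) - S(n, x - h)) / (2 * h)             # decreasing: <= 0
    d2 = (logS(x + h) - 2 * logS(x) + logS(x - h)) / h**2  # log-convex: >= 0
    print(f"x={x}: S'={d1:+.5f}, (log S)''={d2:+.5f}")
```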

Summing up, for the Rényi entropy \(R_{n,c} = -\log S_{n,c}\) and Tsallis entropy \(T_{n,c}=1-S_{n,c}\) we have the following

Corollary 3.5

(i) Let \(c\ge 0\). Then \(R_{n,c}\) is increasing and concave, while \(T'_{n,c}\) is completely monotonic on \([0,+\infty )\).

(ii) \(T_{n,c}\) is concave for all \(c\in {\mathbb {R}}\).

Proof

(i) Apply Theorem 3.4.

(ii) For \(c<0\), apply Theorem 3.2. For \(c\ge 0\), Theorem 3.4 shows that \(S_{n,c}\) is convex, so that \(T_{n,c}\) is concave.

\(\square \)

Remark 3.6

As far as we know, Conjecture 3.3 is still open for \(c<0\), so that the concavity of \(R_{n,c}\), \(c<0\), remains to be investigated.