1 Introduction

Throughout the paper, we assume that \(\xi \) is a nonnegative random variable defined on a given probability space \((\Omega , \mathcal{F}, \mathbf{P})\) with finite moments \(\mathbf{E}[\xi ^k], \ k=1,2,\ldots .\) Let further \(\xi _1,\xi _2, \ldots , \xi _n\) be independent copies of \(\xi .\) Here \(n\ge 1\) is a fixed integer. Of interest to us are the following two random variables, the power and the product:

$$\begin{aligned} X_n=\xi ^n \ \text{ and } \ Y_n=\xi _1\xi _2 \cdots \xi _n. \end{aligned}$$

Each of the variables \(X_n\) and \(Y_n\) also has all moments finite. Thus, the natural problem arising here is to study, characterize and compare the moment (in)determinacy of these two variables. Since \(X_n\) and \(Y_n\) take values in \({\mathbb R}^+= [0,\infty )\), this means that we deal with the Stieltjes moment problem.

We find conditions on \(\xi \) and \(n\) guaranteeing that \(X_n\) and \(Y_n\) are M-determinate (uniquely determined by the moments), and other conditions when they are M-indeterminate (nonunique in terms of the moments). In these two cases, we use the abbreviations M-det and M-indet for both random variables and their distributions.

In our arguments, we use classical and more recent conditions such as Cramér’s condition, Carleman’s condition, Hardy’s condition and Krein’s condition. The reader may find it useful to consult available sources, among them [10, 12, 17, 19, 22, 23] and [24]. For the reader’s convenience, we have included these conditions when formulating our results.

Studying powers, products and other nonlinear transformations of random data (sometimes called Box–Cox transformations) is a challenging probabilistic problem of independent interest. Note, however, that the products and powers of random variables considered in this paper, and the results established here, are closely related to contemporary stochastic models of real and complex phenomena; see, e.g., [3, 5, 7] and [20].

In this paper, we deal with new problems and present new results with their proofs. We establish new and general criteria which are then applied to describe the moment (in)determinacy of the above transformations. We also provide new proofs of some known results with reference to the original papers. Our results complement previous studies or represent different aspects of existing studies on this topic; see, e.g., [2, 4, 11, 16, 20] and [22].

The approach and the results in this paper can be further extended to distributions on the whole real line (Hamburger moment problem, see [25]). Also, they can be used to characterize the moment determinacy properties of nonlinear transformations of some important sub-classes of distributions such as, e.g., the subexponential distributions; see [6].

The material is divided into relatively short sections, each dealing with a specific question concerning either a general class or a particular distribution. General results are included in Sects. 2, 4, 6, 7 and 9. Sections 3, 5, 8 and 10 deal with powers and products based on the generalized gamma distribution, while Sect. 11 deals with the half-logistic distribution. All statements are followed by detailed proofs.

2 Comparing the Moment Determinacy of Powers and Products

The power \(X_n=\xi ^n\) and the product \(Y_n=\xi _1\xi _2 \cdots \xi _n\) have some ‘similarity’. Both are defined in terms of \(n\) and \(\xi \), or of \(n\) independent copies of \(\xi \), and both have all moments finite. Thus, we arrive naturally at the question:

Is it true that the random variables \(X_n\) and \(Y_n\) share the same moment determinacy property?

If the generic random variable \(\xi \) has bounded support, then so does each of \(X_n\) and \(Y_n\); hence, both \(X_n\) and \(Y_n\) have all moments finite and both are M-det. This simple observation shows that it is interesting to study powers and products based on a random variable \(\xi \) with unbounded support contained in \({\mathbb R}^+\) and with all moments finite.

Let us mention first a special case. Suppose \(\xi \sim \) Exp(1), the standard exponential distribution. Then the power \(X_n=\xi ^n\) is M-det iff the product \(Y_n=\xi _1\xi _2\cdots \xi _n\) is M-det, and this is true iff \(n\le 2\) (see, e.g., [2] and [16]). This means that for any \(n=1,2,\ldots ,\) the power \(X_n\) and the product \(Y_n\) share the same moment determinacy property. Since a Weibull random variable is just a power of an exponential one, it follows that if \(\xi \) obeys a Weibull distribution, then for any \(n=1,2,\ldots ,\) the power \(X_n\) and the product \(Y_n\) also have the same moment determinacy property. Therefore, the answer to the above question is positive for at least some special distributions, including the Weibull distributions. In this paper we will explore more distributions (see Theorem 6 and Sect. 11 below).

Note that in general, we have, by Lyapunov’s inequality,

$$\begin{aligned} \mathbf{E}[X_n^s]=\mathbf{E}[\xi ^{ns}] \ge (\mathbf{E}[\xi ^s])^n=\mathbf{E}[Y_n^s]~~\hbox {for all real}~ s>0. \end{aligned}$$
(1)

We use this moment inequality to establish a result which involves three of the most famous conditions for moment determinacy (Carleman’s, Cramér’s and Hardy’s). For more details about Hardy’s condition, see [24].
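As an illustrative sketch (not part of the proofs), inequality (1) can be verified in closed form for \(\xi \sim Exp(1)\), where \(\mathbf{E}[\xi ^s]=\Gamma (s+1)\); the function names below are our own:

```python
import math

def power_moment(n, s):
    """E[X_n^s] = E[xi^(n*s)] = Gamma(n*s + 1) for xi ~ Exp(1)."""
    return math.gamma(n * s + 1)

def product_moment(n, s):
    """E[Y_n^s] = (E[xi^s])^n = Gamma(s + 1)^n for xi ~ Exp(1)."""
    return math.gamma(s + 1) ** n

# Inequality (1): E[X_n^s] >= E[Y_n^s] for all real s > 0.
for n in (2, 3, 5):
    for s in (0.5, 1.0, 2.5):
        assert power_moment(n, s) >= product_moment(n, s)
```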

Proposition 1

  1. (i)

    If \(X_n\) satisfies Carleman’s condition (and hence is M-det), i.e., \(\sum _{k=1}^{\infty }(\mathbf{E}[X_n^k])^{-1/(2k)}=\infty ,\) then so does \(Y_n\).

  2. (ii)

    If \(X_n\) satisfies Cramér’s condition (and hence is M-det), i.e., \(\mathbf{E}[\exp ({cX_n})]<\infty \) for some constant \(c>0,\) then so does \(Y_n\).

  3. (iii)

    If \(X_n\) satisfies Hardy’s condition (and hence is M-det), i.e., \(\mathbf{E}[\exp ({c\sqrt{X_n}})]<\infty \) for some constant \(c>0,\) then so does \(Y_n\).

Proof

Part (i) follows immediately from (1). Parts (ii) and (iii) follow from the fact that for each real \(s>0\),

$$\begin{aligned} \mathbf{E}[\exp (cX_n^s)]=\sum _{k=0}^{\infty }\frac{c^k}{k!}\mathbf{E} [(X_n^s)^{k}]\ge \sum _{k=0}^{\infty } \frac{c^k}{k!}\mathbf{E}[(Y_n^s)^{k}]=\mathbf{E}[\exp (cY_n^s)]. \end{aligned}$$

\(\square \)

Corollary 1

If \(\xi \) satisfies Cramér’s condition and if \(n=2\), then both \(X_2=\xi ^2\) and \(Y_2=\xi _1\xi _2\) are M-det, and hence, \(X_2\) and \(Y_2\) share the same moment determinacy property.

Proof

Note that \(\xi \) satisfies Cramér’s condition iff \(X_2\) satisfies Hardy’s condition. Then by Proposition 1(iii), both \(X_2\) and \(Y_2\) are M-det as claimed above. \(\square \)

3 Generalized Gamma Distributions. Part (a)

Some of our results can be well illustrated if we assume that the generic random variable \(\xi \) has a generalized gamma distribution. We write \(\xi \sim GG(\alpha , \beta , \gamma )\) if \(\xi \) has the following density function \(f\):

$$\begin{aligned} f(x)=cx^{\gamma -1}e^{-\alpha x^{\beta }}, \quad x\ge 0, \end{aligned}$$

where \(\alpha , \beta , \gamma >0\), \(f(0)=0\) if \(\gamma \ne 1\), and \(c=\beta \alpha ^{\gamma /\beta }/\Gamma (\gamma /\beta )\) is the norming constant. Note that \(GG(\alpha , \beta , \gamma )\) is a rich class containing several commonly used distributions such as exponential, Weibull, half-normal and Chi-square.

It is known that the power \(X_n=\xi ^n\) is M-det iff \(n\le 2\beta \) (see, e.g., [18] and [23]). We claim now that for \(n\le 2\beta \), the product \(Y_n=\xi _1\xi _2\cdots \xi _n\) is also M-det. To see this, we note first that the density function \(h_n\) of the random variable \(\sqrt{X_n}\) is

$$\begin{aligned} h_n(z)=\frac{2c}{n}z^{2\gamma /n-1}e^{-\alpha z^{2\beta /n}}, \quad z\ge 0. \end{aligned}$$

This in turn implies that \(X_n\) satisfies Hardy’s condition if \(2\beta /n\ge 1\); hence, so does \(Y_n\) for \(n\le 2\beta \) by Proposition 1(iii).
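The change-of-variables computation behind \(h_n\) can be sanity-checked numerically. The following sketch (with illustrative parameter values of our own choosing) confirms that \(h_n\) integrates to 1:

```python
import math

def h_n(z, n, alpha, beta, gamma_):
    """Density of sqrt(X_n) = xi^(n/2) for xi ~ GG(alpha, beta, gamma)."""
    c = beta * alpha ** (gamma_ / beta) / math.gamma(gamma_ / beta)
    return (2 * c / n) * z ** (2 * gamma_ / n - 1) * math.exp(-alpha * z ** (2 * beta / n))

def integrate(f, a, b, m=200_000):
    """Plain trapezoidal rule on [a, b]."""
    h = (b - a) / m
    s = 0.5 * (f(a) + f(b))
    for i in range(1, m):
        s += f(a + i * h)
    return s * h

# Illustrative parameters: n = 3, alpha = 1, beta = 2, gamma = 2.
total = integrate(lambda z: h_n(z, 3, 1.0, 2.0, 2.0), 1e-9, 30.0)
assert abs(total - 1.0) < 1e-4
```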

To obtain further results, it is quite useful to write the explicit form of the density of the product \(Y_2=\xi _1\xi _2\) when \(\xi \) has the generalized gamma distribution. This involves the function \(K_0(x),\, x>0\), the modified Bessel function of the second kind. Its definition and approximation are given as follows:

$$\begin{aligned} K_0(x)&= \frac{1}{2}\int \limits _0^{\infty }t^{-1}e^{-t-x^2/(4t)}\mathrm{d}t, \quad x>0, \\&= \left( \frac{\pi }{2x}\right) ^{1/2}e^{-x}\left[ 1-\frac{1}{8x} \left( 1-\frac{9}{16x}\left( 1-\frac{25}{24x}\right) \right) +o(x^{-3})\right] ~~\hbox {as}~x\rightarrow \infty \end{aligned}$$

(see, e.g., [8] and [13], pp. 37–38).

Lemma 1

(See also [14]) Let \(Y_2=\xi _1\xi _2\), where \(\xi _1\) and \(\xi _2\) are independent random variables having the same distribution \(GG(\alpha , \beta , \gamma ).\) Then the density function \(g_2\) of \(Y_2\) is

$$\begin{aligned} g_2(x)&= \frac{2c^2}{\beta }x^{\gamma -1}K_0\left( 2\alpha x^{\beta /2}\right) , \quad x>0,\\&\approx Cx^{\gamma -\beta /4-1}e^{-2\alpha x^{\beta /2}}, \quad \mathrm{as} ~x\rightarrow \infty . \end{aligned}$$

Proof

(Method I) Let \(G_2\) be the distribution function of \(Y_2\). Then

$$\begin{aligned} \overline{G}_2(x):=1-G_2(x)=\mathbf{P}[Y_2>x]=\int \limits _0^{\infty }\mathbf{P}[\xi _1>x/y] cy^{\gamma -1}e^{-\alpha y^{\beta }}\mathrm{d}y, \quad x>0, \end{aligned}$$

and hence, the density of \(Y_2\) is

$$\begin{aligned} g_2(x)&= c^2x^{\gamma -1}\int \limits _0^{\infty }y^{-1}e^{-\alpha x^{\beta }/y^{\beta } -\alpha y^{\beta }}\mathrm{d}y =\frac{c^2}{\beta }x^{\gamma -1}\int \limits _0^{\infty }t^{-1}e^{-t-(\alpha ^2 x^{\beta })/t}\mathrm{d}t\\&= \frac{2c^2}{\beta }x^{\gamma -1}K_0\left( 2\alpha x^{\beta /2}\right) , \quad x>0. \end{aligned}$$

(Method II) We can use the moment function (or Mellin transform), usually denoted by \(\mathcal M\), since it uniquely determines the corresponding distribution. To do this, we note that

$$\begin{aligned} \mathcal{M}(s)&:= \mathbf{E}[Y_2^s]=(\mathbf{E}[\xi _1^s])^2, ~~\mathbf{E}[\xi _1^s]=c\Gamma ((\gamma +s)/\beta )\left( \beta \alpha ^{(\gamma +s)/ \beta }\right) ^{-1},\, \hbox {and}\\&\quad \int \limits _0^{\infty }x^sK_0(x)\hbox {d}x=2^{s-1}(\Gamma ((s+1)/2))^2 \quad \hbox {for all} \,\,s>0 \end{aligned}$$

(see, e.g., [9], p. 676, Formula 6.561(16)). We omit the detailed calculation. \(\square \)

It may look surprising, but it is well known that several commonly used distributions are related to the Bessel function in such a natural way as in Lemma 1. For example, if \(\xi \) is a half-normal random variable, i.e., \(\xi \sim GG(\frac{1}{2}, 2, 1)\) with the density \(f(x)=\sqrt{2/\pi }e^{-x^2/2},~x\ge 0\), then \(Y_2\) has the density function \(g_2(x)=(2/{\pi })K_0(x)\approx C_2x^{-1/2}e^{-x}~~\hbox {as}~x\rightarrow \infty \), with the moment function \(\mathcal{M}(s)=\mathbf{E}[Y_2^s]=(2^s/\pi )\Gamma ^2((s+1)/2),~s>-1\). The distribution of \(Y_2=\xi _1\xi _2\) may be called the half-Bessel distribution, and its symmetric counterpart with density \(h_2(x)=(1/{\pi })K_0(x),~x\in {\mathbb R}= (-\infty ,\infty )\), is called the standard Bessel distribution. Note that \(K_0\) is an even function and that \(h_2\) happens to be the density of the product of two independent standard normal random variables; see also [4]. It can be checked that for real \(s>0\) we have \((\mathbf{E}[(Y_2^{s})^n])^{-1/(2n)}\approx C_sn^{-s/2}\) as \(n\rightarrow \infty \), and hence \(Y_2^s\) satisfies Carleman’s condition if \(s\le 2\). Actually, it follows from the density \(g_2\) and its asymptotic behavior that \(Y_2\) satisfies Cramér’s condition. Therefore, by Hardy’s criterion, the square of \(Y_2\), i.e., \(Y_2^2=\xi _1^2\xi _2^2\), is M-det.

Let us express the latter in words: the square of the product of two independent half-normal random variables is M-det. Since \(\xi ^2 \sim \chi ^2_1\), we conclude also that the product of two independent \(\chi ^2\)-distributed random variables is M-det. In addition, these properties can be compared with the known fact that the fourth power of a normal random variable is M-det (see, e.g., [1] or [22]).
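The moment function of the half-normal product stated above can be checked by a small simulation. The following is an illustrative sketch, with sample sizes, seed and tolerances of our own choosing:

```python
import math
import random

def mc_moment(s, n_samples=200_000, seed=12345):
    """Monte Carlo estimate of E[|Z1*Z2|^s] for independent standard normals,
    i.e., the s-th moment of the product of two half-normal variables."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        y = abs(rng.gauss(0.0, 1.0)) * abs(rng.gauss(0.0, 1.0))
        total += y ** s
    return total / n_samples

# Compare against the stated closed form E[Y_2^s] = (2^s/pi) * Gamma((s+1)/2)^2.
for s in (1.0, 2.0):
    exact = (2 ** s / math.pi) * math.gamma((s + 1) / 2) ** 2
    assert abs(mc_moment(s) - exact) < 0.03 * exact
```

For \(s=2\) the closed form gives exactly 1, consistent with \(\mathbf{E}[Z^2]=1\) for a standard normal \(Z\).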

4 Slow Growth Rate of the Moments Implies Moment Determinacy

It is known and well understood that the moment determinacy of a distribution depends on the rate of growth of the moments. Let us establish first results which are of a general and independent interest. Later, we will apply them and make conclusions about powers and products of random variables.

Suppose \(X\) is a nonnegative random variable with finite moments \(m_k=\mathbf{E}[X^k]\), \(k=1,2,\ldots .\) To avoid trivial cases, we assume that \(m_1>0\), meaning that \(X\) is not a degenerate random variable at 0.

Lemma 2

For each \(k\ge 1\), we have the following properties:

  1. (i)

    \(m_k\le m_{k+1}\) if \(m_1\ge 1\), and

  2. (ii)

    \(m_1m_k\le m_{k+1}.\)

Proof

By Lyapunov’s inequality, we have \((m_k)^{1/k}\le (m_{k+1})^{1/(k+1)}.\) Therefore,

$$\begin{aligned} \frac{1}{k}\log m_k\le \frac{1}{k+1}\log m_{k+1}\le \frac{1}{k}\log m_{k+1}, \quad \hbox {if} ~m_1\ge 1, \end{aligned}$$

and hence, \(m_k\le m_{k+1}\) if \(m_1\ge 1\). This proves claim (i). To prove claim (ii), we use the relations \(m_1\le (m_k)^{1/k}\le (m_{k+1})^{1/(k+1)},\) implying that

$$\begin{aligned} m_1m_k\le (m_k)^{1/k}m_k=m_k^{(k+1)/k}\le m_{k+1}. \end{aligned}$$

\(\square \)

In Lemma 2, claim (i) tells us that the moment sequence \(\{m_k, k=1,2,\ldots \}\) is nondecreasing if \(m_1\ge 1\), while claim (ii) shows that the ratio \(m_{k+1}/m_k\) has the lower bound \(m_1\), whatever the nonnegative random variable \(X\) is. The next theorem provides an upper bound on the ratio \(m_{k+1}/m_k\), or, equivalently, on the growth rate of the moments \(m_k\), under which \(X\) is M-det.
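Both claims of Lemma 2, and the Lyapunov inequality behind them, are easy to illustrate numerically. The sketch below uses \(\xi \sim Exp(1)\), for which \(m_k=k!\):

```python
import math

# m[k-1] = m_k = k! for xi ~ Exp(1).
m = [math.factorial(k) for k in range(1, 21)]

for k in range(len(m) - 1):
    assert m[k] <= m[k + 1]            # claim (i): nondecreasing, since m_1 = 1 >= 1
    assert m[0] * m[k] <= m[k + 1]     # claim (ii): m_1 * m_k <= m_{k+1}

# Lyapunov's inequality used in the proof: m_k^(1/k) is nondecreasing in k.
lyap = [m[k] ** (1 / (k + 1)) for k in range(len(m))]
for k in range(len(lyap) - 1):
    assert lyap[k] <= lyap[k + 1]
```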

Theorem 1

Let \(m_{k+1}/m_k=\mathcal{O}((k+1)^2)\) as \(k\rightarrow \infty \). Then \(X\) satisfies Carleman’s condition and is M-det. (We refer to the constant 2, the exponent in the term \(\mathcal{O}((k+1)^2)\), as the rate of growth of the moments of \(X\).)

Proof

By the assumption, there exists a constant \(C>0\) such that

$$\begin{aligned} m_k^{(k+1)/k}\le m_{k+1}\le C(k+1)^2m_k \quad \hbox {for all large}~k, \end{aligned}$$

which implies

$$\begin{aligned} m_k^{1/k}\le C(k+1)^2 \quad \hbox {for all large}~k, \end{aligned}$$

and hence,

$$\begin{aligned} m_k^{-1/(2k)}\ge C^{-1/2}(k+1)^{-1} \quad \hbox {for large}~k. \end{aligned}$$

Therefore, \(X\) satisfies Carleman’s condition \(\sum _{k=1}^{\infty }m_k^{-1/(2k)}=\infty \), and is M-det. \(\square \)
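The mechanism of this proof can be illustrated for \(X=\xi ^2\) with \(\xi \sim Exp(1)\), so that \(m_k=(2k)!\). The explicit constants in the comments below are elementary estimates of ours, not taken from the paper:

```python
import math

for k in range(1, 200):
    # m_{k+1}/m_k = (2k+1)(2k+2) for m_k = (2k)!, which is O((k+1)^2).
    ratio = (2 * k + 1) * (2 * k + 2)
    assert ratio <= 4 * (k + 1) ** 2
    # Carleman term ((2k)!)^(-1/(2k)) >= 1/(2k), because (2k)! <= (2k)^(2k);
    # these terms dominate a multiple of the divergent harmonic series.
    term = math.exp(-math.lgamma(2 * k + 1) / (2 * k))
    assert term >= 1 / (2 * k)
```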

We can slightly extend Theorem 1 as follows. For a real number \(a\) we denote by \(\lfloor a \rfloor \) the largest integer which is less than or equal to \(a\).

Theorem \(\mathbf 1 ^{\prime }\) Suppose there is a real number \(a \ge 1\) such that the moments of the random variable \(X\) satisfy the condition \(m_{k+1}/m_k=\mathcal{O}((k+1)^{2/a})\) as \(k\rightarrow \infty \). Then the power \(X^{\lfloor a \rfloor }\) satisfies Carleman’s condition and is M-det.

Proof

Note that

$$\begin{aligned} \frac{\mathbf{E}[(X^{\lfloor a \rfloor })^{k+1}]}{\mathbf{E}[(X^{\lfloor a \rfloor })^k]}&= \frac{\mathbf{E}[X^{{\lfloor a \rfloor }k+{\lfloor a \rfloor }}]}{\mathbf{E}[X^{{\lfloor a \rfloor }k+{\lfloor a \rfloor }-1}]}\frac{\mathbf{E}[X^{{\lfloor a \rfloor }k+{\lfloor a \rfloor }-1}]}{\mathbf{E}[X^{{\lfloor a \rfloor }k+{\lfloor a \rfloor }-2}]}\cdots \frac{\mathbf{E}[X^{{\lfloor a \rfloor }k+1}]}{\mathbf{E}[X^{{\lfloor a \rfloor }k}]}\\&= \mathcal{O}((k+1)^{(2/a)\lfloor a \rfloor })=\mathcal{O}((k+1)^2) \quad \mathrm{as} \,\, k\rightarrow \infty . \end{aligned}$$

Hence, by Theorem 1, \(X^{\lfloor a \rfloor }\) satisfies Carleman’s condition and is M-det. \(\square \)

Theorem 2

Let \(\xi ,~\xi _i,~i=1,2,\ldots ,n\), be defined as before and \(Y_n=\xi _1\cdots \xi _n.\) If \(\xi \) and the index \(n\) are such that

$$\begin{aligned} \mathbf{E}[\xi ^{k+1}]/\mathbf{E}[\xi ^k]=\mathcal{O}((k+1)^{2/n}) \quad \mathrm{as} \,\, k\rightarrow \infty , \end{aligned}$$

then \(Y_n\) satisfies Carleman’s condition and is M-det.

Proof

By the assumption, we have

$$\begin{aligned} \mathbf{E}[Y_n^{k+1}]/\mathbf{E}[Y_n^k]=(\mathbf{E}[\xi ^{k+1}]/ \mathbf{E}[\xi ^k])^n=\mathcal{O}((k+1)^2) \ \text{ as } ~ k\rightarrow \infty . \end{aligned}$$

This, according to Theorem 1, implies the validity of Carleman’s condition for \(Y_n\); hence, \(Y_n\) is M-det as stated above. \(\square \)

Theorem 2 \(^{\prime }\) Let \(a\ge 1\). If

$$\begin{aligned} \mathbf{E}[\xi ^{k+1}]/\mathbf{E}[\xi ^k]=\mathcal{O}((k+1)^{2/a}) \ \text{ as } k\rightarrow \infty , \end{aligned}$$

then \(Y_{\lfloor a \rfloor }\) satisfies Carleman’s condition and is M-det.

Proof

Note that

$$\begin{aligned}&\mathbf{E}[Y_{\lfloor a \rfloor }^{k+1}]/\mathbf{E}[Y_{\lfloor a \rfloor }^k]=(\mathbf{E}[\xi ^{k+1}]/\mathbf{E}[\xi ^k])^{\lfloor a \rfloor }\\&\quad =\mathcal{O}((k+1)^{(2/a){\lfloor a \rfloor }})=\mathcal{O}((k+1)^2) \quad \text{ as } k\rightarrow \infty . \end{aligned}$$

The conclusions follow from Theorem 1. \(\square \)

5 Generalized Gamma Distributions. Part (b)

We now apply the general results, Theorems 1 and 2 in Sect. 4, to give an alternative proof of the moment determinacy established in Sect. 3.

Let, as before, \(\xi \sim GG(\alpha , \beta , \gamma ).\) We claim that for \(n\le 2\beta \), both \(X_n=\xi ^n\) and \(Y_n=\xi _1\xi _2\cdots \xi _n\) are M-det. To see this, we first calculate that

$$\begin{aligned} \frac{\mathbf{E}[X_n^{k+1}]}{\mathbf{E}[X_n^{k}]}\!=\!\frac{\mathbf{E} [\xi ^{n(k+1)}]}{\mathbf{E}[\xi ^{nk}]}\!=\! \frac{\Gamma ((\gamma +n(k+1))/\beta )}{\alpha ^{n/\beta } \Gamma ((\gamma +nk)/\beta )}\approx (n/\alpha \beta )^{n/\beta } (k\!+\!1)^{n/\beta } \quad \hbox {as}~ k\rightarrow \infty . \end{aligned}$$

For this relation, we have used the approximation of the gamma function:

$$\begin{aligned} \Gamma (x)\approx \sqrt{2\pi }x^{x-1/2}e^{-x} \quad \hbox { as}~x\rightarrow \infty \end{aligned}$$

(see, e.g., [26], p. 253). Then by Theorem 1, \(X_n\) is M-det if \(n\le 2\beta \), and by Theorem 2, \(Y_n\) is M-det if \(1/\beta \le 2/n\), i.e., if \(n\le 2\beta \), because \(\mathbf{E}[\xi ^{k+1}]/\mathbf{E}[\xi ^k]=\mathcal{O}((k+1)^{1/\beta })\) as \(k\rightarrow \infty \).
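The gamma-ratio asymptotics above can be checked numerically via \(\log \Gamma \); the parameter values below are illustrative choices of ours:

```python
import math

def moment_ratio(k, n, alpha, beta, gamma_):
    """Exact ratio E[X_n^{k+1}] / E[X_n^k] for xi ~ GG(alpha, beta, gamma)."""
    log_r = (math.lgamma((gamma_ + n * (k + 1)) / beta)
             - math.lgamma((gamma_ + n * k) / beta)
             - (n / beta) * math.log(alpha))
    return math.exp(log_r)

# Compare with the asymptotic form (n/(alpha*beta))^(n/beta) * (k+1)^(n/beta).
n, alpha, beta, gamma_ = 3, 2.0, 1.5, 1.0
for k in (10_000, 100_000):
    approx = (n / (alpha * beta)) ** (n / beta) * (k + 1) ** (n / beta)
    assert abs(moment_ratio(k, n, alpha, beta, gamma_) / approx - 1) < 1e-2
```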

For example, if \(\xi \sim Exp(1) = GG(1, 1, 1)\), then the product \(Y_2=\xi _1\xi _2\) is M-det. In fact, by Lemma 1, the density \(g_2\) of \(Y_2\) is \(g_2(x)=2K_0(2\sqrt{x})\approx Cx^{-1/4}e^{-2\sqrt{x}}\) as \(x\rightarrow \infty \), where \(K_0\) is the modified Bessel function of the second kind (see also [15], p. 417, and [9], p. 917, Formula 8.432(8)). If \(\xi \sim GG(1/2, 2, 1)\), the half-normal distribution, then \(Y_n=\xi _1\xi _2\cdots \xi _n\) is M-det for \(n\le 4\). As mentioned before, the density function of the product of two half-normals is \(g_2(x)=(2/{\pi })K_0(x)\approx C_2x^{-1/2}e^{-x}\hbox { as }~x\rightarrow \infty \).

6 More Results Related to Theorems 1 and 2

Under the same assumption as that in Theorem 1, we even have a stronger statement; see Theorem 3 below. Note that its proof does not use Lyapunov’s inequality, and that Hardy’s condition implies Carleman’s condition. For convenience, we recall in the next lemma a characterization of Hardy’s condition in terms of the moments (see [24], Theorem 3).

Lemma 3

Let \(a\in (0,1]\) and let \(X\) be a nonnegative random variable. Then \(\mathbf{E}[\exp ({c{X}^a})]<\infty \) for some constant \(c>0\) iff \(\mathbf{E}[X^k]\le c_0^k\,\Gamma (k/a+1), ~k=1,2,\ldots ,\) for some constant \(c_0>0\) (independent of \(k\)). In particular, \(X\) satisfies Hardy’s condition, i.e., \(\mathbf{E}[\exp ({c\sqrt{X}})]<\infty \) for some constant \(c>0\), iff \(\mathbf{E}[X^k]\le c_0^k\,(2k)!, ~k=1,2,\ldots ,\) for some constant \(c_0>0\) (independent of \(k\)).
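As a sketch of Lemma 3 for \(\xi \sim Exp(1)\): the moment bound holds with \(c_0=1\) since \(k!\le (2k)!\), and \(\mathbf{E}[e^{\sqrt{\xi }}]\) is indeed finite. The closed form used for comparison below is an elementary computation of ours, not taken from the paper:

```python
import math

# Moment bound of Lemma 3 with c_0 = 1: m_k = k! <= (2k)!.
for k in range(1, 40):
    assert math.factorial(k) <= math.factorial(2 * k)

# Hardy's condition holds: E[exp(sqrt(xi))] = int_0^inf exp(sqrt(x) - x) dx
# equals 1 + (sqrt(pi)/2) * e^(1/4) * (1 + erf(1/2))  (complete the square).
def integrand(x):
    return math.exp(math.sqrt(x) - x)

m_steps, upper = 200_000, 60.0
h = upper / m_steps
quad = 0.5 * (integrand(0.0) + integrand(upper)) * h
for i in range(1, m_steps):
    quad += integrand(i * h) * h

closed = 1 + (math.sqrt(math.pi) / 2) * math.exp(0.25) * (1 + math.erf(0.5))
assert abs(quad - closed) < 1e-3
```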

Theorem 3

Suppose \(X\) is a nonnegative random variable with finite moments \(m_k=\mathbf{E}[X^k], \ k=1,2,\ldots \), such that the condition in Theorem 1 holds: \(m_{k+1}/m_k = \mathcal{O}((k+1)^2)\) as \(k \rightarrow \infty .\) Then \(X\) satisfies Hardy’s condition, and is M-det.

Proof

By the assumption, there exists a constant \(c_*\ge m_1>0\) such that

$$\begin{aligned} m_{k+1}\le c_*(k+1)^2m_k \quad \hbox {for} \,\,k=0,1,2,\ldots , \end{aligned}$$

where \(m_0\equiv 1\). This implies that

$$\begin{aligned} m_{k+1}\le (c_*/2)(2k+2)(2k+1)m_k \quad \hbox {for}\,\,k=0,1,2,\ldots , \end{aligned}$$

and hence, \(m_{k+1}\le (c_*/2)^{k+1}\Gamma (2k+3)m_0 \quad \hbox {for} \,\,k=0,1,2,\ldots .\) Taking \(c_0=c_*/2\),

$$\begin{aligned} m_{k+1}\le c_0^{k+1}\Gamma (2k+3) \quad \hbox {for} \,\,k=0,1,2,\ldots , \end{aligned}$$

or, equivalently,

$$\begin{aligned} m_{k}\le c_0^{k}\Gamma (2k+1) \quad \hbox {for} \,\,k=1,2,\ldots . \end{aligned}$$

Hence, \(X\) satisfies Hardy’s condition by Lemma 3. \(\square \)

Remark 1

The constant 2 (the growth rate of the moments) in the condition of Theorem 1 is the best possible in the following sense. For each \(\varepsilon >0\), there exists a random variable \(X\) such that \(m_{k+1}/m_k=\mathcal{O}( (k+1)^{2+\varepsilon })\) as \(k\rightarrow \infty \), and \(X\) is M-indet. To see this, let us consider \(X=\xi \sim GG(1, \beta , 1),\) which has density \(f(x)=c\exp (-x^{\beta }), ~x>0.\) We have

$$\begin{aligned} \frac{\mathbf{E}[\xi ^{k+1}]}{\mathbf{E}[\xi ^{k}]}= \frac{\Gamma ((k+2)/\beta )}{{\Gamma ((k+1)/\beta )}}\approx \beta ^{-1/\beta }(k+1)^{1/\beta } \quad \hbox {as}~ k\rightarrow \infty . \end{aligned}$$

If for \(\varepsilon >0\) we take \(\beta =\frac{1}{2+\varepsilon }<\frac{1}{2}\), then \({\mathbf{E}[\xi ^{k+1}]}/{\mathbf{E}[\xi ^{k}]}=\) \(\mathcal{O}( (k+1)^{2+\varepsilon }) ~~ \hbox {as}~ k\rightarrow \infty \). However, as mentioned before, \(X\) is M-indet.
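The growth rate claimed in this remark can be checked numerically; \(\varepsilon =0.5\) below is an illustrative choice of ours:

```python
import math

eps = 0.5
beta = 1 / (2 + eps)   # so that 1/beta = 2 + eps

def ratio(k):
    """E[xi^{k+1}] / E[xi^k] for xi ~ GG(1, beta, 1)."""
    return math.exp(math.lgamma((k + 2) / beta) - math.lgamma((k + 1) / beta))

# The ratio matches beta^(-1/beta) * (k+1)^(1/beta) = beta^(-1/beta) * (k+1)^(2+eps).
for k in (10_000, 100_000):
    approx = beta ** (-1 / beta) * (k + 1) ** (1 / beta)
    assert abs(ratio(k) / approx - 1) < 1e-2
```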

Remark 2

The constant \(2/n\) in the condition of Theorem 2 is the best possible. Indeed, we can show that for each \(\varepsilon >0\), there exists a random variable \(\xi \) such that \(\mathbf{E}[\xi ^{k+1}]/\mathbf{E}[\xi ^k]=\mathcal{O}((k+1)^{2/n+\varepsilon })\) as \(k\rightarrow \infty \), but \(Y_n\) is M-indet. To see this, let us consider \(X=\xi \sim GG(1, \beta , 1)\). For each \(\varepsilon >0\), take \(\beta =1/(2/n+\varepsilon )\), then

$$\begin{aligned} \frac{\mathbf{E}[\xi ^{k+1}]}{\mathbf{E}[\xi ^{k}]}= \frac{\Gamma ((k+2)/\beta )}{{\Gamma ((k+1)/\beta )}}= \mathcal{O}\left( (k+1)^{2/n+\varepsilon } \right) \quad \hbox {as}~ k\rightarrow \infty . \end{aligned}$$

However, since \(n>2\beta \), \(Y_n\) is M-indet (compare this with the statement in Sect. 10).

7 Faster Growth Rate of the Moments Implies Moment Indeterminacy

We now establish a result which is converse to Theorem 1.

Theorem 4

Suppose \(X\) is a nonnegative random variable whose moments \(m_k, \ k=1,2,\ldots \), are such that \(m_{k+1}/m_k\ge C(k+1)^{2+\varepsilon }\) for all large \(k\), where \(C\) and \(\varepsilon \) are positive constants. Assume further that \(X\) has a density \(f\) satisfying the condition: for some \(x_0>0\), \(f\) is positive and differentiable on \([x_0,\infty )\) and

$$\begin{aligned} L_{f}(x):=-\frac{xf^{\prime }(x)}{f(x)}\nearrow \infty ~~\mathrm{as}~~x_0<x\rightarrow \infty . \end{aligned}$$
(2)

Then \(X\) is M-indet.

Proof

Without loss of generality we can assume that \(m_{k+1}/m_k\ge C(k+1)^{2+\varepsilon }\) for each \(k\ge 1\). Therefore,

$$\begin{aligned} m_{k+1}\ge C^k ((k+1)!)^{2+\varepsilon }m_1~~\hbox {for}~k=1,2,\ldots . \end{aligned}$$

Taking \(C_0=\min \{C,m_1\}\), we have

$$\begin{aligned} m_{k+1}\ge C_0^{k+1} ((k+1)!)^{2+\varepsilon } \quad \hbox {for}\,\,k=1,2,\ldots , \end{aligned}$$

or, equivalently,

$$\begin{aligned} m_{k}\ge C_0^{k} (k!)^{2+\varepsilon }=C_0^{k} (\Gamma (k+1))^{2+\varepsilon }\quad \hbox {for} \,\,k=2,3,\ldots . \end{aligned}$$

Since \(\Gamma (x+1)=x\Gamma (x)\approx \sqrt{2\pi }\,x^{x+1/2}\,e^{-x}\) as \(x\rightarrow \infty \), we have that for some constant \(c>0\),

$$\begin{aligned} m_{k}^{-1/(2k)}\le C_0^{-1/2} (\Gamma (k+1))^{-(2+\varepsilon )/(2k)}\approx ck^{-1-\varepsilon /2} \quad \hbox {for all large} \,k. \end{aligned}$$

This implies that the Carleman quantity for the moments of \(f\) is finite:

$$\begin{aligned} \mathbf{C}[f]:= \sum _{k=1}^{\infty }m_k^{-1/(2k)}<\infty . \end{aligned}$$

We sketch the rest of the proof. Following the proof of Theorem 3 in [10], we first construct a symmetric distribution \(G\) on \({\mathbb R}\), obeyed by a random variable \(Y\), such that \(\mathbf{E}[Y^{2k}]=\mathbf{E}[X^k],\) \(~\mathbf{E}[Y^{2k-1}]\) \(=0\) for \(k=1,2,\ldots \). Let \(g\) be the density of \(G\). Then for the Carleman quantity of the moments of \(g\) we have:

$$\begin{aligned} \mathbf{C}[g]:=\sum _{k=1}^{\infty }\left( \mathbf{E}[Y^{2k}]\right) ^{-1/(2k)} =\sum _{k=1}^{\infty }\left( \mathbf{E}[X^{k}]\right) ^{-1/(2k)}=\mathbf{C}[f]<\infty . \end{aligned}$$

This implies that for some \(x_0^*>x_0\), the logarithmic normalized integral (also called the Krein quantity of \(g\)) over the domain \(\{x: |x|\ge x_0^*\}\) is finite:

$$\begin{aligned} \mathbf{K}[g]:=\int \limits _{|x|\ge x_0^*}\frac{-\log g(x)}{1+x^2}\hbox {d}x<\infty , \end{aligned}$$

as shown in the proof of Theorem 2 in [10]. Finally, according to Theorem 2.2 in [19], this is a sufficient condition for \(Y\) to be M-indet on \({\mathbb R}\) and we conclude that \(X\) is M-indet on \({\mathbb R}^+\) by mimicking the proof of Corollary 1 in [21] (see also [17], Proposition 1 and Theorem 3). \(\square \)

8 Generalized Gamma Distributions. Part (c)

Let us see how Theorem 4 in Sect. 7 works for a random variable \(\xi \sim GG(\alpha , \beta , \gamma ).\) We claim that for \(n>2\beta \), the power \(X_n=\xi ^n\) is M-indet. To see this, recall that

$$\begin{aligned} \frac{\mathbf{E}[X_n^{k+1}]}{\mathbf{E}[X_n^{k}]}\approx (n/\alpha \beta )^{n/\beta } (k+1)^{n/\beta } \quad \hbox {as}~ k\rightarrow \infty , \end{aligned}$$

where \(n/\beta >2\). Thus, the moments of \(X_n\) grow at a rate greater than 2. Let us check that the density \(h\) of \(X_n\) satisfies condition (2). Indeed, we have

$$\begin{aligned} L_{h}(x):=-\frac{xh^{\prime }(x)}{h(x)}=1-\frac{\gamma }{n}+\frac{\alpha \beta }{n} x^{\beta /n} \nearrow \infty \quad \hbox {ultimately as}~ x\rightarrow \infty . \end{aligned}$$

Therefore, for \(n>2\beta \), \(X_n\) is M-indet by Theorem 4.

Remark 3

Theorem 4 gives another way to prove some known facts, for example, that the log-normal distribution and the cube of the exponential distribution are M-indet. Indeed, for \(X\sim LogN(0,1),\) we have the moment recurrence

$$\begin{aligned} m_{k+1}=e^{k+1/2}m_k, \quad k=1,2,\ldots , \end{aligned}$$

and for \(X=\xi ^3\), where \(\xi \sim Exp (1)\), we have

$$\begin{aligned} m_{k+1}=(3k+1)(3k+2)(3k+3)m_k, \quad k=1,2,\ldots . \end{aligned}$$

It is easily seen that in both cases the growth rate of the moments is quite fast. For the cube of \(Exp(1)\), we have \(m_{k+1}/m_k\ge C(k+1)^3,~k=1,2,\ldots \), for some constant \(C>0\), so the rate is greater than 2. For \(LogN\) the rate is exponential, hence much larger than 2. It remains to check that condition (2) is satisfied for the density of \(\xi ^3\) and the density of \(LogN\). Details are omitted.
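A numerical sketch for the cube of \(Exp(1)\), where \(m_k=(3k)!\); the constant 8 and the Stirling-type bound in the comments are elementary estimates of ours:

```python
import math

for k in range(1, 300):
    # m_{k+1}/m_k = (3k+1)(3k+2)(3k+3) >= 8*(k+1)^3 for k >= 1: rate above 2.
    ratio = (3 * k + 1) * (3 * k + 2) * (3 * k + 3)
    assert ratio >= 8 * (k + 1) ** 3
    # Carleman terms ((3k)!)^(-1/(2k)) <= (e/(3k))^(3/2), since (3k)! >= (3k/e)^(3k);
    # these terms are summable, consistent with M-indeterminacy.
    term = math.exp(-math.lgamma(3 * k + 1) / (2 * k))
    assert term <= (math.e / (3 * k)) ** 1.5
```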

We can make one step more by considering the logarithmic skew-normal distributions with density \(f_{\lambda }(x)=(2/x)\varphi (\ln x)\Phi (\lambda \ln x),\) \(~x>0\), where \(\lambda \) is a real number. (When \(\lambda =0\), \(f_{\lambda }\) reduces to the standard log-normal density.) Then we have the moment relationship

$$\begin{aligned} m_{k+1}\approx e^{(k+1/2)\rho }m_k, \quad \hbox {as} \,\,k\rightarrow \infty , \end{aligned}$$

where \(\rho \in (0,1]\) is a constant (see, e.g., [12], Proposition 3). Thus, the moments grow very fast, exponentially, and it remains to check that the density function \(f_{\lambda }\) satisfies the condition (2):

$$\begin{aligned} L_{f_{\lambda }}(x):=-\frac{xf_{\lambda }^{\prime } (x)}{f_{\lambda }(x)}\nearrow \infty \quad \hbox {ultimately as}\quad x\rightarrow \infty . \end{aligned}$$

Therefore, by the above Theorem 4, we conclude that all logarithmic skew-normal distributions are M-indet. This is one of the results in [12] where a different proof is given.

9 General Result on the M-indet Property of the Product \(Y_n=\xi _1\xi _2 \cdots \xi _n\)

In the next theorem, we describe conditions on the distribution of \(\xi \) under which the product \(Y_n=\xi _1\xi _2 \cdots \xi _n\) is M-indet.

Theorem 5

Let \(\xi \sim F,\) where \(F\) is absolutely continuous with density \(f>0\) on \({\mathbb R}^+\). Assume further that:

  1. (i)

    \(f(x)\) is decreasing in \(x\ge 0,\) and

  2. (ii)

    there exist two constants \(x_0\ge 1\) and \(A>0\) such that

    $$\begin{aligned} f(x)/\overline{F}(x)\ge A/x \quad \mathrm{for} \,\,x\ge x_0, \end{aligned}$$
    (3)

    and some constants \(B>0,~ \alpha >0,\) \(\beta >0\) and a real \(\gamma \) such that

    $$\begin{aligned} \overline{F}(x)\ge Bx^{\gamma }e^{-\alpha x^{\beta }} \quad \mathrm{for} \,\, x\ge x_0. \end{aligned}$$
    (4)

Then, for \(n>2\beta \), the product \(Y_n\) has a finite Krein quantity and is M-indet.

Corollary 2

Let \(\xi \sim F\) satisfy the conditions in Theorem 5 with \(\beta <\frac{1}{2}\). Then \(F\) itself is M-indet.

Lemma 4

Under the condition (3), we have

$$\begin{aligned} \int \limits _x^{\infty }\frac{f(u)}{u}\hbox {du}\ge \frac{A}{1+A}\frac{\overline{F}(x)}{x} \,\, \mathrm{and } \,\, \overline{F}(x)\le \frac{C}{x^A}, \quad x>x_0,\,\mathrm{for\,\,some\,\, constant}~ C>0. \end{aligned}$$

Proof

Note that for \(x>x_0\),

$$\begin{aligned} \int \limits _x^{\infty }\frac{f(u)}{u}\mathrm{d}u =-\int \limits _x^{\infty }\frac{1}{u}d\overline{F}(u)=\frac{\overline{F}(x)}{x} -\int \limits _x^{\infty }\frac{\overline{F}(u)}{u^2}\mathrm{d}u\ge \frac{\overline{F}(x)}{x} -\frac{1}{A}\int \limits _x^{\infty }\frac{f(u)}{u}\mathrm{d}u. \end{aligned}$$

The last inequality is due to (3). Hence,

$$\begin{aligned} \left( 1+\frac{1}{A}\right) \int \limits _x^{\infty }\frac{f(u)}{u}\mathrm{d}u\ge \frac{\overline{F}(x)}{x}. \end{aligned}$$

On the other hand, for \(x>x_0\),

$$\begin{aligned} \log \overline{F}(x)&= {-\int \limits _0^xf(t)/\overline{F}(t)\mathrm{d}t} ={-\int \limits _0^{x_0}f(t)/\overline{F}(t)\mathrm{d}t-\int \limits _{x_0}^xf(t)/ \overline{F}(t)\mathrm{d}t}\\&\equiv C_0-\int \limits _{x_0}^xf(t)/\overline{F}(t)\mathrm{d}t\le C_0-\int \limits _{x_0}^xA/t\mathrm{d}t= C_0+A\log x_0 -A\log x. \end{aligned}$$

Therefore, \(\overline{F}(x)\le C/x^A,~x>x_0,\) where \(C=x_0^Ae^{C_0}.\) \(\square \)
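Lemma 4 can be illustrated for \(\xi \sim Exp(1)\), where \(f/\overline{F}\equiv 1\ge A/x\) with \(A=1\) and \(x_0=1\), so that \(C_0=-x_0=-1\) and \(C=x_0^Ae^{C_0}=e^{-1}\). The quadrature below is a rough sketch of ours:

```python
import math

def e1(x, m=100_000, width=50.0):
    """Trapezoidal approximation of int_x^inf exp(-u)/u du."""
    h = width / m
    f = lambda u: math.exp(-u) / u
    s = 0.5 * (f(x) + f(x + width))
    for i in range(1, m):
        s += f(x + i * h)
    return s * h

A = 1.0
for x in (1.5, 3.0, 10.0):
    fbar = math.exp(-x)                          # Fbar(x) = exp(-x) for Exp(1)
    assert e1(x) >= (A / (1 + A)) * fbar / x     # first bound in Lemma 4
    assert fbar <= math.exp(-1.0) / x ** A       # second bound, with C = e^{-1}
```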

Remark 4

After deriving in Lemma 4 a lower bound for \(\int _x^{\infty }(f(u)/u)\mathrm{d}u\), we note the following upper bound, valid for an arbitrary density \(f\) on \({\mathbb R}^+\):

$$\begin{aligned} \int \limits _x^{\infty }\frac{f(u)}{u}\mathrm{d}u\le \frac{1}{x}\int \limits _x^{\infty }{f(u)}\mathrm{d}u= \frac{\overline{F}(x)}{x}, \quad x>0. \end{aligned}$$

Proof of Theorem 5

The density \(g_n\) of \(Y_n\) is expressed as follows:

$$\begin{aligned} g_n(x)=\int \limits _0^{\infty }\!\!\int \limits _0^{\infty }\!\!\cdots \!\! \int \limits _0^{\infty }\frac{f(u_1)}{u_1}\frac{f(u_2)}{u_2} \cdots \frac{f(u_{n-1})}{u_{n-1}}f\left( \frac{x}{u_1u_2\cdots u_{n-1}}\right) \mathrm{d}u_1\mathrm{d}u_2\cdots \mathrm{d}u_{n-1} \end{aligned}$$

for \(x>0\). Hence, \(g_n(x)>0\) and decreases in \(x \in (0,\infty )\). For any \(a>0\), we have

$$\begin{aligned} g_n(x)\!\!&\ge \!\!\int \limits _a^{\infty }\!\! \int \limits _a^{\infty }\!\!\cdots \!\!\int \limits _a^{\infty } \frac{f(u_1)}{u_1}\frac{f(u_2)}{u_2} \cdots \frac{f(u_{n-1})}{u_{n-1}}f\left( \frac{x}{u_1u_2\cdots u_{n-1}}\right) \mathrm{d}u_1\mathrm{d}u_2\cdots \mathrm{d}u_{n-1}\nonumber \\&\ge \int \limits _a^{\infty }\!\!\int \limits _a^{\infty }\!\!\cdots \!\! \int \limits _a^{\infty }\frac{f(u_1)}{u_1}\frac{f(u_2)}{u_2} \cdots \frac{f(u_{n-1})}{u_{n-1}}f\left( \frac{x}{a^{n-1}}\right) \mathrm{d}u_1\mathrm{d}u_2\cdots \mathrm{d}u_{n-1}\nonumber \\&= f\left( \frac{x}{a^{n-1}}\right) \left( \int \limits _a^{\infty } \frac{f(u)}{u}\mathrm{d}u\right) ^{n-1}, \quad x>0. \end{aligned}$$
(5)

The second inequality above follows from the monotonicity of \(f\). Taking \(a=x^{1/n}>x_0\) (that is, for \(x>x_0^n\)), we have, by (3)–(5) and Lemma 4, that

$$\begin{aligned} g_n(x)&\ge f\left( x^{1/n}\right) \left( \int \limits _{x^{1/n}}^{\infty } \frac{f(u)}{u}\hbox {d}u\right) ^{n-1}\ge f\left( x^{1/n}\right) \left( \frac{A}{1+A}\frac{\overline{F} (x^{1/n})}{x^{1/n}}\right) ^{n-1}\\&\ge \left( \frac{A}{1+A}\right) ^{n-1}x^{-(1-1/n)} \frac{f\left( x^{1/n}\right) }{\overline{F}(x^{1/n})} \left( \overline{F}(x^{1/n})\right) ^n\\&\ge C_nx^{\gamma -1}e^{-n\alpha x^{\beta /n}}, \end{aligned}$$

where \(C_n=\left( \frac{A}{1+A}\right) ^{n-1}AB^n\). Therefore, the Krein quantity for \(g_n\) is as follows:

$$\begin{aligned} \mathbf{K}[g_n]&= \int \limits _0^{\infty }\frac{-\log g_n(x^2)}{1+x^2}\hbox {d}x=\int \limits _0^{x_0^n}\frac{-\log g_n(x^2)}{1+x^2}\hbox {d}x+\int \limits _{x_0^n}^{\infty }\frac{-\log g_n(x^2)}{1+x^2}\hbox {d}x\\&\le \left( -\log g_n(x_0^{2n})\right) \int \limits _0^{x_0^n}\frac{1}{1+x^2}\hbox {d}x+\int \limits _{x_0^n}^{\infty }\frac{-\log g_n(x^2)}{1+x^2}\hbox {d}x<\infty \quad \hbox {if}\,\, n>2\beta . \end{aligned}$$

This in turn implies that \(Y_n\) is M-indet for \(n>2\beta \) (see, e.g., [10], Theorem 3). \(\square \)

10 Generalized Gamma Distributions. Part (d)

Let us see how the general result from Sect. 9 can be used to establish the moment indeterminacy of products of independent copies of a random variable \(\xi \sim GG(\alpha , \beta , 1).\) Here \(\gamma =1\) and the density is \(f(x)=ce^{-\alpha x^{\beta }}, \, x\ge 0.\)

We claim that for \(n>2\beta \), the product \(Y_n=\xi _1\xi _2\cdots \xi _n\) is M-indet. To see this, note that \(f(x)/\overline{F}(x)\approx \alpha \beta x^{\beta -1}\) and \(\overline{F}(x)\approx [c/(\alpha \beta )]x^{1-\beta }e^{-\alpha x^{\beta }}\) as \(x\rightarrow \infty \). Then the density \(f\) satisfies the conditions (i) and (ii) in Theorem 5, and hence, \(Y_n\) is M-indet if \(n>2\beta \).

For example, if \(\xi \sim Exp(1),\) then, as mentioned before, the product \(Y_n=\xi _1\xi _2 \cdots \xi _n\) is M-indet for \(n\ge 3.\)
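For Exp(1) we have \(\mathbf{E}[Y_n^k]=(k!)^n\), so the Carleman-type series \(\sum _k (\mathbf{E}[Y_n^k])^{-1/(2k)}\) behaves like a \(p\)-series with \(p=n/2\): by Stirling's formula the \(k\)-th term is asymptotically \((e/k)^{n/2}\), so the series diverges exactly when \(n\le 2\), consistent with the determinacy threshold above (Carleman's condition is only sufficient for M-det, so this is an illustration, not a proof of indeterminacy). A short numeric check of the term asymptotics (the cutoff \(k=1000\) is an arbitrary choice):

```python
import math

def carleman_term(k, n):
    """k-th Carleman term (E[Y_n^k])^{-1/(2k)} for Exp(1),
    where E[Y_n^k] = (k!)^n; computed on the log scale."""
    return math.exp(-n * math.lgamma(k + 1) / (2 * k))

# Stirling gives term_k ~ (e/k)^{n/2}, i.e. term_k * k^{n/2} -> e^{n/2}:
# the series sum_k term_k diverges iff n/2 <= 1, i.e. n <= 2.
for n in (1, 2, 3):
    ratio = carleman_term(1000, n) * 1000 ** (n / 2)
    assert abs(ratio - math.e ** (n / 2)) < 0.1
```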

If \(\xi \) has the half-normal distribution, with density \(f(x)=\sqrt{{2}/{\pi }}\,e^{-x^2/2},\) \(~x\ge 0\), then \(Y_n=\xi _1\xi _2 \cdots \xi _n\) is M-indet for \(n\ge 5\) (recall from Sect. 5 that \(Y_n\) is M-det for \(n\le 4\)). In words: the product of two, three or four half-normal random variables is M-det, while the product of five or more such variables is M-indet.
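The asymptotics invoked above, \(f(x)/\overline{F}(x)\approx \alpha \beta x^{\beta -1}\) and \(\overline{F}(x)\approx [c/(\alpha \beta )]x^{1-\beta }e^{-\alpha x^{\beta }}\), can be checked numerically in the half-normal case \(\alpha =1/2\), \(\beta =2\), \(c=\sqrt{2/\pi }\). A rough Python sketch (the evaluation point \(x=10\) and the quadrature grid are illustrative choices):

```python
import math

# Half-normal as GG(1/2, 2, 1): density f(x) = sqrt(2/pi) e^{-x^2/2}.
alpha, beta, c = 0.5, 2.0, math.sqrt(2 / math.pi)
f = lambda x: c * math.exp(-alpha * x ** beta)

def F_bar(x, span=10.0, n=100000):
    """Midpoint-rule approximation of the tail int_x^{x+span} f(u) du
    (the integrand is negligible beyond x + span here)."""
    h = span / n
    return sum(f(x + (i + 0.5) * h) for i in range(n)) * h

x = 10.0
# f/F_bar ~ alpha*beta*x^{beta-1} (= x for the half-normal) ...
assert abs(f(x) / F_bar(x) / (alpha * beta * x ** (beta - 1)) - 1) < 0.02
# ... and F_bar(x) ~ [c/(alpha*beta)] x^{1-beta} e^{-alpha x^beta}
approx = c / (alpha * beta) * x ** (1 - beta) * math.exp(-alpha * x ** beta)
assert abs(F_bar(x) / approx - 1) < 0.02
```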

In summary, we have the following result about \(GG(\alpha , \beta , \gamma )\) with \(\gamma =1\).

Lemma 5

Let \(n\ge 2,~X_n=\xi ^n\) and \(Y_n=\xi _1 \cdots \xi _n\), where \(\xi _1, \ldots ,\xi _n\) are independent copies of \(\xi \sim GG(\alpha , \beta , 1)\). Then the power \(X_n\) is M-det iff the product \(Y_n\) is M-det and this is true iff \(n\le 2\beta \).

We now consider the general case \(\gamma >0\).

Theorem 6

Let \(n\ge 2,~X_n=\xi ^n\) and \(Y_n=\xi _1\cdots \xi _n\), where \(\xi _1,\ldots ,\xi _n\) are independent copies of \(\xi \sim GG(\alpha , \beta , \gamma )\). Then \(X_n\) is M-det iff \(Y_n\) is M-det and this is true iff \(n\le 2\beta \). In other words, both \(X_n\) and \(Y_n\) have the same moment determinacy property.

Proof

Define \(\eta =\xi ^{\gamma }\), \(\eta _i=\xi _i^{\gamma }\), \(i=1,2,\ldots ,n\), \(X_n^*=\eta ^n=(\xi ^n)^{\gamma }=X_n^{\gamma }\) and \(Y_n^*=\eta _1\eta _2\cdots \eta _n=(\xi _1\xi _2\cdots \xi _n)^{\gamma }=Y_n^{\gamma }\). Since \(\eta \sim GG(\alpha , \beta /\gamma , 1)\), Lemma 5 shows that \(X_n^*\) is M-det iff \(Y_n^*\) is M-det, and this is true iff \(n\le 2\beta /\gamma \). Next, note that for each \(x>0\), we have \(\mathbf{P}[X_n^*>x]=\mathbf{P}[X_n>x^{1/\gamma }]\) and \(\mathbf{P}[Y_n^*>x]=\mathbf{P}[Y_n>x^{1/\gamma }]\). This implies that any distributional property shared by \(X_n^*\) and \(Y_n^*\) transfers to the corresponding property shared by \(X_n\) and \(Y_n\), and vice versa. Therefore, \(X_n\) is M-det iff \(Y_n\) is M-det, and since \(X_n\) is M-det iff \(n\le 2\beta \) (see, e.g., [18]), both statements hold iff \(n\le 2\beta \). \(\square \)
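The independence of the threshold from \(\gamma \) can be illustrated numerically: using the closed-form moments \(\mathbf{E}[\xi ^k]=\alpha ^{-k/\beta }\,\Gamma ((k+\gamma )/\beta )/\Gamma (\gamma /\beta )\) of \(GG(\alpha ,\beta ,\gamma )\), the moment ratio of \(X_n=\xi ^n\) grows like \(k^{n/\beta }\) for every \(\gamma \), matching the condition \(n\le 2\beta \). A Python sketch on the log scale to avoid overflow (the grid points \(k_1, k_2\) are arbitrary choices):

```python
import math

def log_moment(k, alpha, beta, gamma):
    """log E[xi^k] for xi ~ GG(alpha, beta, gamma) with density
    f(x) = c x^{gamma-1} e^{-alpha x^beta}:
    E[xi^k] = alpha^{-k/beta} Gamma((k+gamma)/beta) / Gamma(gamma/beta)."""
    return ((-k / beta) * math.log(alpha)
            + math.lgamma((k + gamma) / beta) - math.lgamma(gamma / beta))

def ratio_exponent(n, alpha, beta, gamma, k1=100, k2=1000):
    """Slope of log(E[X_n^{k+1}]/E[X_n^k]) against log k, where
    X_n = xi^n; the ratio grows like k^{n/beta}."""
    r = lambda k: (log_moment(n * (k + 1), alpha, beta, gamma)
                   - log_moment(n * k, alpha, beta, gamma))
    return (r(k2) - r(k1)) / (math.log(k2) - math.log(k1))

# The growth exponent is n/beta for every gamma, so the determinacy
# threshold n <= 2*beta in Theorem 6 does not depend on gamma.
for gamma in (0.5, 1.0, 3.0):
    p = ratio_exponent(n=4, alpha=1.0, beta=2.0, gamma=gamma)
    assert abs(p - 2.0) < 0.05
```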

11 Half-Logistic Distribution

Some of the above results and illustrations involve the generalized gamma distribution \(GG(\alpha , \beta , \gamma )\). It is useful to have a moment determinacy characterization for powers and products based on non-GG distributions. Here is an example based on the half-logistic distribution, which clearly is not in the class \(GG\).

Statement

We say that the random variable \(\xi \) has the half-logistic distribution if its density is

$$\begin{aligned} f(x)=\frac{2e^{-x}}{(1+e^{-x})^2}, \quad x\ge 0. \end{aligned}$$

The power \(X_n=\xi ^n\) and the product \(Y_n=\xi _1\xi _2 \cdots \xi _n\) are defined as above. Then \(X_n\) is M-det iff \(Y_n\) is M-det and this is true iff \(n\le 2\). This means that for each \(n\), the two random variables \(X_n\) and \(Y_n\) share the same moment determinacy property.

Proof

(i)

    The claim that \(X_n\) is M-det iff \(n\le 2\) follows from results in [11]. Actually, in [11] it is proved that for any real \(s>0\), the power \(\xi ^s\) is M-det iff \(s\le 2\). Let us give here an alternative proof. The density \(h_s\) of \(\xi ^s\) is

    $$\begin{aligned} h_s(z)=\frac{2}{s}z^{1/s-1}\frac{e^{-z^{1/s}}}{(1+e^{-z^{1/s}})^{2}}, \quad z\ge 0. \end{aligned}$$

    Using the inequality \(\frac{1}{4}\le (1+e^{-x})^{-2}\le 1\) for \(x\ge 0\), we find two-sided bounds for the moments of \(\xi ^s\):

    $$\begin{aligned} \frac{1}{2}\Gamma (ks+1)\le \mathbf{E}[(\xi ^s)^k]\le \int \limits _0^{\infty }\frac{2}{s}z^{k+1/s-1}{e^{-z^{1/s}}}\hbox {d}z= 2\Gamma (ks+1). \end{aligned}$$

    Therefore, the growth rate of the moments of \(\xi ^s\) is

    $$\begin{aligned} \frac{\mathbf{E}[(\xi ^s)^{k+1}]}{\mathbf{E}[(\xi ^s)^k]}\le 4\cdot \frac{\Gamma ((k+1)s+1)}{\Gamma (ks+1)}\approx 4s^s (k+1)^s \quad \hbox {as} \,\, k\rightarrow \infty . \end{aligned}$$

    By Theorem 1, this implies that \(\xi ^s\) is M-det for \(s\le 2\). On the other hand, we have

    $$\begin{aligned} \frac{\mathbf{E}[(\xi ^s)^{k+1}]}{\mathbf{E}[(\xi ^s)^k]}\ge \frac{1}{4}\cdot \frac{\Gamma ((k+1)s+1)}{\Gamma (ks+1)}\approx \frac{1}{4}s^s (k+1)^s \quad \hbox {as} \,\,k\rightarrow \infty . \end{aligned}$$

    The moment condition in Theorem 4 is satisfied if \(s>2\). It remains now to check the validity of condition (2) for the density \(h_s\). We have

    $$\begin{aligned} L_{h_s}(z):&= -\frac{zh_s^{\prime }(z)}{h_s(z)}\\&= 1-\frac{1}{s}+\frac{1}{s}z^{1/s}- \frac{2}{s}\,z^{1/s}\frac{e^{-z^{1/s}}}{1+e^{-z^{1/s}}}~~\nearrow \infty ~\hbox {ultimately~~as}~~z\rightarrow \infty . \end{aligned}$$

    Hence, if \(s>2\),  \(\xi ^s\) is M-indet.

(ii)

    It remains to prove that \(Y_n\) is M-det iff \(n\le 2\).

(Sufficiency) As in part (i), we have

$$\begin{aligned} \frac{1}{2}\Gamma (k+1)\le \mathbf{E}[\xi ^k]\le 2\Gamma (k+1). \end{aligned}$$

Therefore, \(\mathbf{E}[\xi ^{k+1}]/\mathbf{E}[\xi ^{k}]=\mathcal{O}(k+1)\) as \(k\rightarrow \infty \). By Theorem 2, we conclude that \(Y_n\) is M-det if \(n\le 2\).

(Necessity) Note that \(\overline{F}(x)=\mathbf{P}[\xi >x]= 2e^{-x}/(1+e^{-x})\ge e^{-x},~~x\ge 0,\) and \(f(x)/\overline{F}(x)=1/(1+e^{-x})\ge 1/2,~x\ge 0\). Therefore, taking \(\beta =1\) in Theorem 5, we conclude that \(Y_n\) is M-indet if \(n> 2\). In words: the product of three or more half-logistic random variables is M-indet. \(\square \)
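The concrete bounds used in this proof are easy to sanity-check numerically. Here is a minimal Python sketch covering the two-sided moment bounds from part (i) (with \(s=3\) as an illustrative choice) and the tail bounds from part (ii), using a midpoint-rule quadrature with illustrative grid sizes:

```python
import math

def f(x):
    """Half-logistic density f(x) = 2 e^{-x} / (1 + e^{-x})^2."""
    return 2 * math.exp(-x) / (1 + math.exp(-x)) ** 2

def moment(m, upper=60.0, n=100000):
    """Midpoint-rule approximation of E[xi^m] = int_0^inf x^m f(x) dx."""
    h = upper / n
    return sum(((i + 0.5) * h) ** m * f((i + 0.5) * h) for i in range(n)) * h

# Part (i): (1/2) Gamma(ks+1) <= E[(xi^s)^k] <= 2 Gamma(ks+1); here s = 3.
s = 3
for k in (1, 2, 3):
    m = k * s
    assert 0.5 * math.gamma(m + 1) <= moment(m) <= 2 * math.gamma(m + 1)

# Part (ii): F_bar(x) = 2 e^{-x} / (1 + e^{-x}) >= e^{-x} and
# f(x) / F_bar(x) = 1 / (1 + e^{-x}) >= 1/2 for x >= 0.
F_bar = lambda x: 2 * math.exp(-x) / (1 + math.exp(-x))
for x in (0.0, 0.5, 1.0, 3.0, 10.0):
    assert F_bar(x) >= math.exp(-x)
    assert f(x) / F_bar(x) >= 0.5
```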