Abstract
In this chapter, we introduce a multivariate gamma distribution whose marginals are finite mixtures of gamma distributions and in which the correlation between any pair of variables is negative. Several of its properties, such as joint moments, correlation coefficients, the moment generating function, and the Rényi and Shannon entropies, have been derived. A simulation study has been conducted to evaluate the performance of the maximum likelihood method.
1 Introduction
The gamma distribution is an important continuous distribution in probability and statistics. Several distributions, such as the exponential, Erlang, and chi-square, are special cases of this distribution. Several univariate generalizations of the gamma distribution have also been studied. The gamma distribution and its variants have been applied in different disciplines to model continuous variables that are positive and have skewed distributions. The gamma distribution has been used to model amounts of daily rainfall (Aksoy [1]), and in neuroscience it is often used to describe the distribution of inter-spike intervals (Robson and Troy [26]). The gamma distribution is widely used as a conjugate prior in Bayesian statistics. It also plays an important role in actuarial science (Furman [9]).
Several multivariate generalizations of univariate gamma distributions are also available in the literature. Mathai and Moschopoulos [20, 21] introduced two multivariate gamma models as the joint distribution of certain linear combinations/partial sums of independent three-parameter gamma variables. All the components of their multivariate gamma vectors are positively correlated and have three-parameter gamma distributions. They also indicated that their models have potential applications in stochastic processes and reliability. Furman [9] used the multivariate reduction technique to derive a multivariate probability model possessing a dependence structure and gamma marginals. Kowalczyk and Tyrcha [17] used a re-parameterized form of the gamma distribution to define a multivariate gamma vector and studied a number of properties of their distribution. Recently, Semenikhine, Furman and Su [28] introduced a multiplicative multivariate gamma distribution with gamma marginals and applied their results in actuarial science. They proved that the correlation coefficient between any pair of variables is positive and belongs to (0, 1/2). Multivariate gamma distributions have been used in diverse fields such as hydrology, space science (wind modeling), reliability, traffic modeling, and finance. For further results on multivariate gamma distributions, the reader may consult articles by Balakrishnan and Ristić [4], Carpenter and Diawara [5], Dussauchoy and Berland [6], Gaver [10], Krishnaiah and Rao [18], Marcus [19], Peppas et al. [23], Royen [27], Vaidyanathan and Lakshmi [33], and an excellent text by Kotz, Balakrishnan and Johnson [16]. For a good review of bivariate gamma distributions, see Balakrishnan and Lai [3], Arnold, Castillo and Sarabia [2], Hutchinson and Lai [14], and Kotz, Balakrishnan and Johnson [16]. For a review of some recent work and applications, the reader is referred to Rafiei, Iranmanesh, and Nagar [24] and the references therein.
In this chapter, we introduce a multivariate gamma distribution whose marginals are finite mixtures of gamma distributions and in which the correlation between any pair of variables is negative. We organize our work as follows: In Sect. 2, we introduce the new multivariate gamma distribution. In Sects. 3 and 4, results on marginal distributions and factorizations of the multivariate gamma distribution are derived. Sections 5–8 deal with properties such as joint moments, correlation, the moment generating function, entropies, and estimation. In Sect. 9, simulations of the new distribution are performed in different ways, and the results are provided to evaluate the performance of the maximum likelihood method. Section 10 contains the conclusion. Finally, the Appendix lists a number of results used in this chapter.
2 The Multivariate Gamma Distribution
Recently, Rafiei, Iranmanesh, and Nagar [24] have defined a bivariate gamma distribution with parameters \(\alpha , \beta \) and k and the pdf
where \( x_1>0\), \(x_2>0\), \(\alpha >0\), \(\beta >0\), and \(k\in \mathbb {N}_0\). A natural multivariate generalization of this distribution can be given as follows.
Definition 1
The random variables \(X_{1},\ldots , X_{n}\) are said to have a generalized multivariate gamma distribution, denoted as \((X_1,\ldots ,X_n)\) \(\sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta , k)\), if their joint pdf is given by
where \(\alpha _1>0, \ldots , \alpha _n>0\), \(\beta > 0\), \(k\in \mathbb {N}_0\) and \(C(\alpha _1, \ldots ,\alpha _n; \beta , k)\) is the normalizing constant.
By integrating the joint density of \(X_1, \ldots ,X_n\) over its support set, the normalizing constant is derived as
where the last line has been obtained by using Lemma 2. Finally, from the above expression
For \(k=0\), the multivariate gamma density simplifies to the product of n independent univariate gamma densities with common scale parameter \(\beta \). For \(k=1\), the multivariate gamma density can be written as
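The normalizing constant derived above can be checked numerically. The closed form used below is our own evaluation via the Liouville-Dirichlet integral (16), stated here as an assumption since the displayed expression is not reproduced in this text: \(C(\alpha _1,\ldots ,\alpha _n;\beta ,k)^{-1} = \big [\prod _{i=1}^n\Gamma (\alpha _i)/\Gamma (\alpha )\big ]\,\Gamma (\alpha +k)\,\beta ^{\alpha +k}\), where \(\alpha =\sum _{i=1}^n\alpha _i\). A minimal sketch for \(n=2\) using a midpoint rule:

```python
import math
import numpy as np

# Illustrative parameter values for a bivariate check
a1, a2, beta, k = 1.5, 2.0, 1.0, 2
alpha = a1 + a2

# Unnormalized GMG kernel: (x1+x2)^k x1^(a1-1) x2^(a2-1) exp(-(x1+x2)/beta),
# integrated over (0, 40)^2 with a midpoint rule (the tail beyond 40 is negligible)
h = 0.02
x = (np.arange(2000) + 0.5) * h
x1, x2 = np.meshgrid(x, x, indexing="ij")
kern = (x1 + x2) ** k * x1 ** (a1 - 1) * x2 ** (a2 - 1) * np.exp(-(x1 + x2) / beta)
numeric = kern.sum() * h * h

# Assumed closed form of C^(-1), obtained from the Liouville-Dirichlet integral
closed = (math.gamma(a1) * math.gamma(a2) / math.gamma(alpha)
          * math.gamma(alpha + k) * beta ** (alpha + k))

assert abs(numeric - closed) / closed < 1e-3
```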
For \(n =2 \) in (1), the bivariate gamma density is obtained as
where
In a recent article, Rafiei, Iranmanesh, and Nagar [24] have studied the above distribution for \(\alpha _1=\alpha _2\). Substituting \(n=2\) in (3) or \(k=1\) in (4), the generalized bivariate gamma density takes the form
which yields the marginal density of \(X_1\) as
Clearly, the marginal density of \(X_1\) is a mixture of two gamma densities, indicating that, in general, the marginal density of a subset of \(X_1, \ldots , X_n\) is not a generalized multivariate gamma.
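For \(n=2\), \(k=1\), integrating out \(x_2\) suggests the explicit mixture \(f_{X_1}(x)=\frac{\alpha _1}{\alpha }\,g(x;\alpha _1+1,\beta )+\frac{\alpha _2}{\alpha }\,g(x;\alpha _1,\beta )\), where \(\alpha =\alpha _1+\alpha _2\) and \(g(\cdot ;a,\beta )\) is a gamma pdf. The weights and the normalizing constant below are our own derivation (assumptions), so we verify the claim numerically:

```python
import math
import numpy as np

a1, a2, beta = 1.5, 2.0, 1.0   # illustrative values; k = 1 throughout
alpha = a1 + a2

def gamma_pdf(x, a, b):
    return x ** (a - 1) * np.exp(-x / b) / (math.gamma(a) * b ** a)

# Assumed normalizing constant of the joint density for k = 1
C = math.gamma(alpha) / (math.gamma(a1) * math.gamma(a2)
                         * math.gamma(alpha + 1) * beta ** (alpha + 1))

h = 0.002
x2 = (np.arange(20000) + 0.5) * h          # midpoint grid on (0, 40)
points = [0.5, 1.0, 2.5]
# marginal of X1: integrate the joint density over x2
marginal = [float((C * (x1 + x2) * x1 ** (a1 - 1) * x2 ** (a2 - 1)
                   * np.exp(-(x1 + x2) / beta)).sum() * h) for x1 in points]
# claimed two-component gamma mixture
mixture = [(a1 / alpha) * gamma_pdf(x1, a1 + 1, beta)
           + (a2 / alpha) * gamma_pdf(x1, a1, beta) for x1 in points]

assert max(abs(m - g) for m, g in zip(marginal, mixture)) < 1e-4
```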
It may be noted here that the multivariate gamma distribution defined above belongs to the Liouville family of distributions (Sivazlian [30], Gupta and Song [12], Gupta, and Richards [13], Song and Gupta [31]). Because of mathematical tractability, this distribution further enriches the class of multivariate Liouville distributions and may serve as an alternative to many existing distributions belonging to this class.
3 Marginal Distributions
In this section, we derive results on marginal distributions of the generalized multivariate gamma distribution defined in this chapter. By using multinomial expansion of \( \left( \sum _{i=1}^{n}x_{i}\right) ^k\), namely,
in (1), the joint density of \(X_{1},\ldots , X_{n}\) can be restated as
where \(x_{i}>0\), \(i=1,2, \ldots , n\). Thus, the generalized multivariate gamma distribution is a finite mixture of products of independent gamma densities.
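Under the normalizing constant \(C=\Gamma (\alpha )/\big [\prod _i\Gamma (\alpha _i)\,\Gamma (\alpha +k)\,\beta ^{\alpha +k}\big ]\) (our assumption, since the display is not reproduced in this text), the mixture weight attached to the component with shapes \(\alpha _i+k_i\) works out to \(w_{k_1,\ldots ,k_n} = \frac{k!}{\prod _i k_i!}\prod _i(\alpha _i)_{k_i}\big /(\alpha )_k\); these weights sum to one by Lemma 1. A sketch that enumerates the components and rebuilds the density at one point:

```python
import math
from itertools import product as iproduct

def poch(a, n):
    """Pochhammer symbol (a)_n = a (a+1) ... (a+n-1)."""
    out = 1.0
    for j in range(n):
        out *= a + j
    return out

alpha = [1.0, 2.0, 3.0]       # illustrative shape parameters
beta, k = 2.0, 4
a = sum(alpha)
n = len(alpha)

# all component indices (k_1, ..., k_n) with k_1 + ... + k_n = k
comps = [kv for kv in iproduct(range(k + 1), repeat=n) if sum(kv) == k]

def weight(kv):
    # w = (k! / prod k_i!) * prod (alpha_i)_{k_i} / (alpha)_k   (our derivation)
    w = math.factorial(k) / poch(a, k)
    for ai, ki in zip(alpha, kv):
        w *= poch(ai, ki) / math.factorial(ki)
    return w

assert abs(sum(weight(kv) for kv in comps) - 1.0) < 1e-12

def gamma_pdf(x, shape, scale):
    return x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

# rebuild the joint density at one point from the mixture ...
x = [0.7, 1.3, 2.1]
mix = sum(weight(kv) * math.prod(gamma_pdf(xi, ai + ki, beta)
                                 for xi, ai, ki in zip(x, alpha, kv))
          for kv in comps)

# ... and compare with the direct formula (assumed normalizing constant C)
C = math.gamma(a) / (math.prod(math.gamma(ai) for ai in alpha)
                     * math.gamma(a + k) * beta ** (a + k))
direct = (C * sum(x) ** k
          * math.prod(xi ** (ai - 1) for xi, ai in zip(x, alpha))
          * math.exp(-sum(x) / beta))
assert abs(mix - direct) / direct < 1e-10
```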
In the remaining part of this section and the next section, we derive marginal distributions, distribution of partial sums and several factorizations of the generalized multivariate gamma distribution.
Theorem 1
Let \((X_1,\ldots ,X_n)\) \(\sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta , k)\). Then, for \(1\le s \le n-1\), the marginal density of \(X_{1},\ldots , X_{s}\) is given by
Proof
Integrating out \(x_{s+1},\ldots , x_{n}\) in (1), the marginal density of \(X_{1},\ldots , X_{s}\) is derived as
where the last line has been obtained by using (16). Substituting \(x/\sum _{i=1}^{s}x_{i} = z\) in (5), the marginal density of \(X_{1},\ldots , X_{s}\) is rewritten as
Now, writing \((1 + z)^k\) using binomial theorem and integrating z in (6), the marginal density of \(X_{1},\ldots , X_{s}\) is derived. \(\square \)
Alternately, the density of \(X_{1},\ldots , X_{s}\) given in (7) can be written as
Corollary 1
The marginal density of \(X_1\) is given by
Corollary 2
The marginal density of \(X_1\) and \(X_2\) is given by
Substituting \(u = z/(1+z)\) with \(dz = (1-u)^{-2} du\) in (6), one gets
Now, writing
in (7) and integrating u, the marginal density of \(X_{1},\ldots , X_{s}\), in series involving generalized Laguerre polynomials, is derived as
Theorem 2
Let \((X_1,\ldots ,X_n)\) \(\sim \textrm{GMG}(\alpha _1, \ldots , \alpha _n; \beta , k)\). Then, for \(2\le r \le n\), the marginal density of \(X_{r},\ldots , X_{n}\) is given by
Proof
Similar to the proof of Theorem 1. \(\square \)
Corollary 3
The marginal density of \(X_{n}\) is given by
Theorem 3
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta , k)\). Then, for \(r=1,\ldots ,n\), the marginal density of \(X_r\) is given by
4 Factorizations
This section deals with several factorizations of the multivariate gamma distribution defined in Sect. 2.
In the next theorem, we give the joint distribution of partial sums of random variables distributed jointly as generalized multivariate gamma.
Let \(n_1, \ldots , n_\ell \) be non-negative integers such that \(\sum _{i=1}^{\ell } n_i = n\) and define
Theorem 4
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta , k)\). Define \(Z_{j}= X_{j}/X_{(i)}, j = n_{i-1}^{*} + 1,\ldots , n_{i}^{*}- 1\) and \(X_{(i)} = \sum _{j=n_{i-1}^{*}+1}^{n_{i}^{*}}X_{j}\), \(i = 1,\ldots ,\ell \). Then,
(i) \((X_{(1)},\ldots ,X_{(\ell )})\) and \((Z_{n_{i-1}^{*} +1},\ldots , Z_{n_{i}^{*}-1})\), \(i = 1,\ldots ,\ell \), are independently distributed,
(ii) \((Z_{n_{i-1}^{*} +1},\ldots , Z_{n_{i}^{*}-1})\sim \textrm{D1}(\alpha _{n_{i-1}^{*}+1},\) \(\ldots , \alpha _{n_{i}^{*} -1};\) \( \alpha _{n_{i}^{*}}),\) \(i = 1,\ldots ,\ell \), and
(iii) \((X_{(1)},\ldots ,X_{(\ell )}) \sim \text {GMG}(\alpha _{(1)},\ldots ,\alpha _{(\ell )}; \beta ,k)\).
Proof
Substituting \( x_{(i)} = \sum _{j=n_{i-1}^{*}+1}^{n_{i}^{*}}x_{j}\) and \(z_{j}= x_{j}/x_{(i)},\) \(j = n_{i-1}^{*} + 1,\ldots , n_{i}^{*}- 1\), \(i = 1,\ldots ,\ell \) with the Jacobian
in the density of \((X_{1},\ldots ,X_{n})\) given by (1), we get the joint density of \(Z_{{n_{i-1}^{*}}+1},\ldots ,\) \( Z_{n_{i}^{*}-1},X_{(i)}\), \(i = 1,\ldots ,\ell \) as
where \( x_{(i)}>0\), \(i=1,\ldots ,\ell \), \(z_{j}>0\), \(j = n_{i-1}^{*}+1,\ldots , n_{i}^{*} -1\), \(\sum _{j=n_{i-1}^{*}+1}^{n_{i}^{*}-1} z_{j} < 1\), \(i = 1,\ldots ,\ell \). From the factorization in (8), it is easy to see that \((X_{(1)},\ldots ,X_{(\ell )})\) and \((Z_{n_{i-1}^{*} +1},\ldots , Z_{n_{i}^{*}-1})\), \(i = 1,\ldots ,\ell \), are independently distributed. Further \((X_{(1)},\ldots ,X_{(\ell )}) \!\sim \! \text {GMG}(\alpha _{(1)},\ldots ,\alpha _{(\ell )}; \beta ,k)\) and \((Z_{n_{i-1}^{*} {+}1},\ldots , Z_{n_{i}^{*}-1})\!\sim \! \textrm{D1}(\alpha _{n_{i-1}^{*}{+}1},\) \(\ldots , \alpha _{n_{i}^{*} -1}\); \( \alpha _{n_{i}^{*}})\), \(i = 1,\ldots ,\ell \). \(\square \)
Corollary 4
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(Z_{i} = X_{i}/ Z\), \(i = 1,\ldots ,n-1\), and \(Z =\sum _{j=1}^{n} X_{j}\). Then, \((Z_{1},\ldots ,Z_{n-1})\) and Z are independent, \((Z_{1},\ldots ,Z_{n-1})\sim \textrm{D1}(\alpha _{1},\ldots ,\alpha _{n-1};\alpha _{n})\) and \(Z\sim \text {G}\left( \sum _{i=1}^{n}\alpha _i+k, \beta \right) \).
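Corollary 4 yields an exact sampling scheme: draw \(Z\sim \text {G}(\alpha +k,\beta )\) and an independent Dirichlet vector \(D\), and set \(X_i = Z D_i\). The mean formula checked below, \(E[X_i]=\beta (\alpha +k)\alpha _i/\alpha \), is our own computation from this representation; the negative pairwise correlation asserted in the abstract then shows up empirically:

```python
import numpy as np

# Exact sampler implied by Corollary 4:
#   Z ~ G(alpha_1 + ... + alpha_n + k, beta)  independent of  D ~ Dirichlet(alpha)
rng = np.random.default_rng(0)
alpha = np.array([1.0, 2.0, 3.0])     # illustrative parameter values
beta, k = 2.0, 4
a = alpha.sum()

N = 200_000
Z = rng.gamma(shape=a + k, scale=beta, size=N)
D = rng.dirichlet(alpha, size=N)
X = Z[:, None] * D

# E[X_i] = beta * (a + k) * alpha_i / a   (our computation from the factorization)
mean_theory = beta * (a + k) * alpha / a
assert np.allclose(X.mean(axis=0), mean_theory, rtol=0.02)

# pairwise correlations are negative for k > 0
corr = np.corrcoef(X, rowvar=False)
assert corr[0, 1] < 0 and corr[0, 2] < 0 and corr[1, 2] < 0
```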
Corollary 5
If \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\), then \(\sum _{j=1}^{n} X_{j}\) and \( \frac{\sum _{i=1}^{s} X_{i}}{\sum _{i=1}^{n} X_{i}}\) are independent. Further
Theorem 5
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \( W_{j}= {X_{j}}/{X_{n_i^{*}}}, j = n_{i-1}^{*} + 1,\ldots , n_{i}^{*}- 1\) and \(X_{(i)} = \sum _{j=n_{i-1}^{*}+1}^{n_{i}^{*}}X_{j}, i = 1,\ldots ,\ell . \) Then,
(i) \((X_{(1)},\ldots ,X_{(\ell )})\) and \( (W_{n_{i-1}^{*} +1},\ldots , W_{n_{i}^{*}-1} )\), \(i = 1,\ldots ,\ell \), are independently distributed,
(ii) \( (W_{n_{i-1}^{*} +1},\ldots , W_{n_{i}^{*}-1} )\sim \textrm{D2}(\alpha _{n_{i-1}^{*}+1},\) \(\ldots , \alpha _{n_{i}^{*} -1};\) \( \alpha _{n_{i}^{*}}),\) \(i = 1,\ldots ,\ell \), and
(iii) \((X_{(1)},\ldots ,X_{(\ell )}) \sim \text {GMG}(\alpha _{(1)},\ldots ,\alpha _{(\ell )}; \beta ,k)\).
Corollary 6
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \( W_{i} = {X_{i}}/{ X_n}, i = 1,\ldots ,n-1\) and \(Z =\sum _{j=1}^{n} X_{j}\). Then, \((W_{1},\ldots ,W_{n-1})\) and Z are independent, \( (W_{1},\ldots ,W_{n-1})\sim \textrm{D2}(\alpha _{1},\ldots ,\alpha _{n-1};\alpha _{n}) \) and \(Z\sim \text {G}\left( \sum _{i=1}^{n}\alpha _i+k, \beta \right) \).
Corollary 7
If \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\), then \(\sum _{j=1}^{n} X_{j}\) and \( \frac{\sum _{i=1}^{s} X_{i}}{\sum _{i=s+1}^{n} X_{i}}\) are independent. Further
In the next six theorems, we give several factorizations of the generalized multivariate gamma density.
Theorem 6
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(Y_n= \sum _{j=1}^{n}X_j\) and \(Y_i= \sum _{j=1}^{i}X_j/ \sum _{j=1}^{i+1}X_j\), \(i=1, \ldots , n-1\). Then, \(Y_{1},\ldots ,Y_{n}\) are independent, \(Y_{i}\sim \textrm{B1} (\sum _{j=1}^{i} \alpha _{j},\alpha _{i+1} ),\, i = 1,\ldots ,n-1\), and \(Y_{n}\sim \text {G}(\sum _{i=1}^{n} \alpha _{i}+k, \beta )\).
Proof
Substituting \(x_1= y_n \prod _{i=1}^{n-1}y_i\), \(x_2= y_n (1-y_1)\prod _{i=2}^{n-1}y_i, \) \(\ldots ,\) \( x_{n-1}= y_n\) \( (1-y_{n-2}) y_{n-1}\) and \(x_n= y_n (1-y_{n-1})\) with the Jacobian \(J(x_1, \ldots , x_n\rightarrow y_1, \ldots , y_n) = \prod _{i=2}^{n}y_i^{i-1} \) in (1) we get the desired result. \(\square \)
Theorem 7
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(Z_n= \sum _{j=1}^{n}X_j\) and \(Z_i = X_{i+1}/ \sum _{j=1}^{i} X_j\), \(i=1, \ldots , n-1\). Then, \(Z_{1},\ldots ,Z_{n}\) are independent, \(Z_{i}\sim \textrm{B2}(\alpha _{i+1}, \sum _{j=1}^{i} \alpha _{j})\), \(i = 1,\ldots ,n-1\), and \(Z_{n}\sim \text {G}(\sum _{j=1}^{n} \alpha _{j}+k, \beta )\).
Proof
The desired result follows from Theorem 6 by noting that \((1-Y_i)/Y_i\sim \textrm{B2} (\alpha _{i+1},\) \( \sum _{j=1}^{i} \alpha _{j})\). \(\square \)
Theorem 8
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(W_n= \sum _{j=1}^{n}X_j\) and \(W_i = \sum _{j=1}^{i} X_j/X_{i+1}\), \(i=1, \ldots , n-1\). Then, \(W_{1},\ldots ,W_{n}\) are independent, \(W_{i}\sim \textrm{B2}( \sum _{j=1}^{i} \alpha _{j},\alpha _{i+1})\), \(i = 1,\ldots ,n-1\), and \(W_{n}\sim \text {G}(\sum _{i=1}^{n} \alpha _{i}+k, \beta )\).
Proof
The result follows from Theorem 7 by noting that \(1/Z_i\sim \textrm{B2}(\sum _{j=1}^{i} \alpha _{j}, \alpha _{i+1})\). \(\square \)
Theorem 9
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(Y_n{=}\sum _{j =1}^{n}X_j \) and \(Y_i= X_i/\sum _{j=i}^{n}X_j\), \(i=1,\ldots , n-1\). Then, \(Y_{1},\ldots ,Y_{n}\) are independent, \(Y_{i}\sim \textrm{B1} (\alpha _{i},\sum _{j=i+1}^{n} \alpha _{j}),\, i = 1,\ldots , n-1\), and \(Y_{n}\sim \text {G}( \sum _{i=1}^{n} \alpha _{i}+k, \beta )\).
Proof
Substituting \(x_1=y_n y_1 , x_2= y_ny_2(1-y_1) , \ldots , x_{n-1}= y_n y_{n-1}(1-y_1) \cdots \) \( (1-y_{n-2})\), and \(x_n= y_n (1-y_1) \cdots (1-y_{n-1})\) with the Jacobian \(J(x_1, \ldots , x_n\rightarrow y_1, \ldots , y_n) = y_n^{n-1} \prod _{i=1}^{n-2} (1-y_i)^{n-i-1}\) in (1), we get the desired result. \(\square \)
Theorem 10
Let \((X_1,\ldots ,X_n) \sim \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(Z_n=\sum _{j =1}^{n}X_j \) and \(Z_i= X_i/\sum _{j=i+1}^{n}X_j\), \(i=1,\ldots , n-1\). Then, \(Z_{1},\ldots ,Z_{n}\) are independent, \(Z_{i}\sim \textrm{B2} (\alpha _{i},\sum _{j=i+1}^{n} \alpha _{j}),\, i = 1,\ldots , n-1\), and \(Z_{n}\sim \text {G}( \sum _{i=1}^{n} \alpha _{i}+k, \beta )\).
Proof
The result follows from Theorem 9 by observing that \(Y_i/(1-Y_i)\sim \textrm{B2} (\alpha _{i},\) \(\sum _{j=i+1}^{n} \alpha _{j})\). \(\square \)
Theorem 11
Let \((X_1,\ldots ,X_n) \!\sim \! \text {GMG}(\alpha _1, \ldots , \alpha _n; \beta ,k)\). Define \(W_n=\sum _{j =1}^{n}X_j \) and \(W_i= \sum _{j=i+1}^{n}X_j/X_i\), \(i=1,\ldots , n-1\). Then, \(W_{1},\ldots ,W_{n}\) are independent, \(W_{i}\sim \text {B2} (\sum _{j=i+1}^{n} \alpha _{j},\alpha _{i}),\, i = 1,\ldots , n-1\), and \(W_{n}\sim \textrm{G}( \sum _{i=1}^{n}\alpha _{i}+k,\beta )\).
Proof
The result follows from Theorem 10 by noting that \(1/ W_i\!\sim \! \textrm{B2} (\sum _{j=i+1}^{n} \alpha _{j},\alpha _{i})\). \(\square \)
5 Joint Moments
By definition
Now, simplifying the above expression by using (2), one gets
where \(\alpha = \sum _{i=1}^{n}\alpha _i\) and \(r = \sum _{i=1}^{n}r_i\).
Further, substituting appropriately in the above expression, one gets
and
Finally, by using appropriate definitions, we get
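The joint moments can be cross-checked via the factorization of Corollary 4: since \(X_i = Z D_i\) with \(Z\) and \(D\) independent, \(E\big [\prod _i X_i^{r_i}\big ] = E[Z^{r}]\,E\big [\prod _i D_i^{r_i}\big ]\). The closed form coded below is our restatement under that factorization and should agree with the expression obtained above; a Monte Carlo check:

```python
import math
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])   # illustrative parameter values
beta, k = 2.0, 4
a = alpha.sum()
r = np.array([1, 1, 0])             # moment orders (illustrative)
rs = r.sum()

# E[prod X_i^{r_i}] = beta^r * Gamma(a+k+r)/Gamma(a+k)
#                   * Gamma(a)/Gamma(a+r) * prod Gamma(a_i+r_i)/Gamma(a_i)
# (our restatement via the factorization X = Z * D)
moment = (beta ** rs
          * math.gamma(a + k + rs) / math.gamma(a + k)
          * math.gamma(a) / math.gamma(a + rs)
          * math.prod(math.gamma(ai + ri) / math.gamma(ai)
                      for ai, ri in zip(alpha, r)))

# Monte Carlo estimate using the exact sampler from Corollary 4
rng = np.random.default_rng(1)
N = 400_000
X = rng.gamma(a + k, beta, size=N)[:, None] * rng.dirichlet(alpha, size=N)
mc = np.prod(X ** r, axis=1).mean()

assert abs(mc - moment) / moment < 0.02
```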
6 Moment Generating Function
By definition, the joint mgf of \(X_1, \ldots ,X_n\) is given by
Substituting \(x_1 = r_1 s, \ldots , x_{n-1} = r_{n-1}s\) and \(x_n = s (1-\sum _{i=1}^{n-1}r_i)\) in (9) with the Jacobian \(J (x_1,\ldots , x_{n-1}, x_n \rightarrow r_1, \ldots , r_{n-1}, s) = s^{n-1}\) and integrating s, we get
where \(1-t_i \beta >0\), \(i=1, \ldots , n\). Now, writing
in (10) and integrating r, we get
where the last line has been obtained by using the integral representation of the fourth hypergeometric function of Lauricella given in (15). Finally, substituting for \(C(\alpha _1, \ldots ,\alpha _n; \beta , k)\) and simplifying, we get
For \(t_1=\cdots =t_n =t\), we have
which is the mgf of a gamma random variable with shape parameter \(\sum _{i=1}^{n} \alpha _i + k\) and scale parameter \(\beta \).
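This limiting case can be verified by simulation, using the stochastic representation from Corollary 4 (so the draws below are exact, not MCMC): \(\sum _i X_i \sim \text {G}(\alpha +k,\beta )\), whose mgf is \((1-t\beta )^{-(\alpha +k)}\) for \(t\beta <1\).

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = np.array([1.0, 2.0, 3.0])   # illustrative parameter values
beta, k = 2.0, 4
a = alpha.sum()

# exact draws via Corollary 4, then the mgf of the component sum
N = 500_000
X = rng.gamma(a + k, beta, size=N)[:, None] * rng.dirichlet(alpha, size=N)
S = X.sum(axis=1)

t = 0.1                              # t * beta = 0.2 < 1
mgf_mc = np.exp(t * S).mean()
mgf_theory = (1 - t * beta) ** (-(a + k))
assert abs(mgf_mc - mgf_theory) / mgf_theory < 0.02
```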
7 Entropies
In this section, exact forms of Rényi and Shannon entropies are derived for the multivariate gamma distribution defined in this article.
Let \((\mathcal{X},\mathcal {B},\mathcal {P})\) be a probability space. Consider a pdf f associated with \(\mathcal {P}\), dominated by a \(\sigma \)-finite measure \(\mu \) on \(\mathcal {X}\). Denote by \(H_{SH}(f)\) the well-known Shannon entropy introduced in Shannon [29]. It is defined by
One of the main extensions of the Shannon entropy was defined by Rényi [25]. This generalized entropy measure is given by
where
The additional parameter \(\eta \) is used to describe complex behavior in probability models and the associated process under study. Rényi entropy is monotonically decreasing in \(\eta \), while Shannon entropy (11) is obtained from (12) for \(\eta \uparrow 1\). For details see Nadarajah and Zografos [22], Zografos and Nadarajah [36] and Zografos [35].
Theorem 12
For the generalized multivariate gamma distribution defined by the pdf (1), the Rényi and the Shannon entropies are given by
and
respectively, where \(\psi (z)=\frac{d}{dz} \ln \Gamma (z) =\frac{1}{\Gamma (z)} \frac{d}{dz} \Gamma (z) \) is the digamma function.
Proof
For \(\eta >0\) and \(\eta \ne 1\), using the joint density of \(X_1, \ldots , X_n\) given by (1), we have
where the last line has been obtained by using (16). Finally, evaluating the above integral by using gamma integral and simplifying the resulting expression, we get
Now, taking the logarithm of \(G(\eta )\) and using (12), we get \(H_R(\eta ,f)\). The Shannon entropy is obtained from \(H_R(\eta ,f)\) by taking \(\eta \uparrow 1\) and using L’Hôpital’s rule. \(\square \)
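Since the displayed entropy expressions are not reproduced in this text, a limited numerical sanity check is still possible: for \(k=0\) the GMG density factorizes into independent \(\text {G}(\alpha _i,\beta )\) densities, so the Shannon entropy must equal the sum of the standard gamma entropies \(\alpha _i + \ln \beta + \ln \Gamma (\alpha _i) + (1-\alpha _i)\psi (\alpha _i)\). A Monte Carlo sketch of \(H=-E[\ln f]\):

```python
import math
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])   # illustrative values; k = 0 here
beta = 2.0

def digamma(z, h=1e-5):
    # small finite-difference approximation via log-gamma (adequate here)
    return (math.lgamma(z + h) - math.lgamma(z - h)) / (2 * h)

# sum of the usual gamma entropies (closed form for independent components)
H_closed = sum(ai + math.log(beta) + math.lgamma(ai) + (1 - ai) * digamma(ai)
               for ai in alpha)

rng = np.random.default_rng(3)
N = 400_000
X = rng.gamma(alpha, beta, size=(N, alpha.size))   # k = 0: independent gammas

# log-density of the k = 0 model evaluated at the sample
logf = (((alpha - 1) * np.log(X)).sum(axis=1) - X.sum(axis=1) / beta
        - sum(math.lgamma(ai) for ai in alpha) - alpha.sum() * math.log(beta))
H_mc = -logf.mean()
assert abs(H_mc - H_closed) < 0.02
```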
8 Estimation
Let \((X_{11},\ldots , X_{1n}), \ldots , (X_{N1},\ldots , X_{Nn})\) be a random sample from \( \text {GMG}(\alpha _1, \ldots , \alpha _n;\) \( \beta , k)\). The log-likelihood function, denoted by \(l(\alpha _1, \ldots , \alpha _n; \beta )\), is given by
where \(\alpha = \sum _{i=1}^{n}\alpha _i\). Now, differentiating \(l(\alpha _1, \ldots , \alpha _n; \beta )\) w.r.t. \(\alpha _i\), we get
Further,
where \(\psi _1(z)\) is the trigamma function defined as the derivative of the digamma function, \(\psi _1(z)=\frac{d}{dz}\psi (z)\),
Now, noting that \(\sum _{i=1}^{n} X_i\sim \text {G} (\alpha +k,\beta )\) and the expected value of a constant is the constant itself, we obtain
The Fisher information matrix for the multivariate gamma distribution given by the density (1) is defined as
Further
gives
and
gives
where \(\tilde{x}_{i} =\prod _{h=1}^{N} x_{hi}^{1/N}\), \(i=1,2,\ldots , n\). Further, using
we have
Thus, by solving numerically (13) and (14), the MLEs of \(\alpha _i\) and \(\beta \) can be obtained.
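Equivalently, one can maximize the log-likelihood directly with a general-purpose optimizer. The sketch below does this in Python with scipy.optimize (an illustration, not the DEoptim-based procedure of Sect. 9), for fixed integer k and using the closed-form normalizing constant \(C=\Gamma (\alpha )/\big [\prod _i\Gamma (\alpha _i)\,\Gamma (\alpha +k)\,\beta ^{\alpha +k}\big ]\) as an assumption:

```python
import math
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
alpha_true = np.array([1.0, 2.0, 3.0])
beta_true, k = 2.0, 4

# simulate a sample via the factorization of Corollary 4 (exact draws)
N = 4000
X = (rng.gamma(alpha_true.sum() + k, beta_true, size=N)[:, None]
     * rng.dirichlet(alpha_true, size=N))

sum_log_x = np.log(X).sum(axis=0)          # per-component sufficient statistics
log_row_sums = np.log(X.sum(axis=1)).sum()
total = X.sum()

def nll(theta):
    # negative log-likelihood of Sect. 8: l = N log C + k sum_h log(sum_i x_hi)
    #   + sum_i (alpha_i - 1) sum_h log x_hi - sum_{h,i} x_hi / beta
    a_i, beta = theta[:-1], theta[-1]
    a = a_i.sum()
    logC = (math.lgamma(a) - sum(math.lgamma(v) for v in a_i)
            - math.lgamma(a + k) - (a + k) * math.log(beta))
    ll = N * logC + k * log_row_sums + ((a_i - 1) * sum_log_x).sum() - total / beta
    return -ll

res = minimize(nll, np.ones(4), method="L-BFGS-B", bounds=[(1e-3, None)] * 4)
theta_hat = res.x
assert np.allclose(theta_hat, np.r_[alpha_true, beta_true], rtol=0.25)
```

With a sample of this size the estimates typically land within a few percent of the true values; the loose 25% tolerance only guards against Monte Carlo variability.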
9 Simulation
In this section, a simulation study for \(p=3\) is conducted to evaluate the performance of the maximum likelihood method. For \(p=2\), see Rafiei, Iranmanesh, and Nagar [24]. Samples of size \(n =50, 200, 500\) from Equation (1) for selected values of the parameters are generated by MCMC methods (Gibbs sampling, Metropolis, Gaussian Metropolis, random-walk Metropolis, and Metropolis–Hastings). We have performed the simulation for particular values of the parameters, namely, \(\alpha _1=1\), \(\alpha _2=2\), \(\alpha _3=3\), \(\beta =2\), \(k=4, 8,\) and \(\alpha _1=2\), \(\alpha _2=2\), \(\alpha _3=1\), \(\beta =2\), \(k=4, 8.\) The results were similar for other choices. MLEs of the parameters were computed by numerical procedures. This procedure was repeated five hundred times, and \((\widehat{\alpha _1}, \widehat{\alpha _2}, \widehat{\alpha _3}, \widehat{\beta })\), the average biases (Ab), and the mean squared errors (MSE) were obtained by Monte Carlo methods (since the parameter k is an integer, the derivative method is not used to calculate its MLE).
Different packages such as MCMC, MCMCpack, gibbs.met, LearnBayes, MHadaptive, MetroHastings and walkMetropolis in R were used for the simulation. After performing the simulation using the above methods and comparing the results, it was observed that the Gibbs sampling method provides better results. Therefore, the output of the Gibbs method is presented in Tables 1, 2, 3, 4 and Figs. 1, 2, 3, 4 and 5. The MLEs of the parameters and the correlation coefficients are reported in Tables 1 and 2. The DEoptim package in R was used to calculate the MLEs. The average biases and the mean squared errors of all the estimators are reported in Tables 3 and 4. In particular, the biases of the maximum likelihood estimators of \(\alpha _1\), \(\alpha _2\), \(\alpha _3\) and \(\beta \) are close to 0, and the mean squared errors of all estimators decrease with increasing n.
Figure 1 shows 3D scatter plots of the simulated data for \(\alpha _1=1\), \(\alpha _2=2\), \(\alpha _3=3\), \(\beta =2\), \(k=4\), \(n=50, 500\). Figure 2 shows a 3D plot of the simulated data for \(\alpha _1=1\), \(\alpha _2=2\), \(\alpha _3=3\), \(\beta =2\), \(k=4.\) Figures 3 and 4 show pairs plots of the simulated data for \(\alpha _1=2\), \(\alpha _2=2\), \(\alpha _3=2\), \(\beta =2\), \(k=8\), \(n=50\) and \(\alpha _1=2\), \(\alpha _2=2\), \(\alpha _3=1\), \(\beta =2\), \(k=8\), \(n=500\), respectively. Figure 5 shows a trace plot for \(\alpha _1=1\), \(\alpha _2=2\), \(\alpha _3=3\), \(\beta =2\), \(k=8\), \(n=500\). Finally, simulation points and 3D contour plots for different selected values of the parameters are shown in Figs. 6, 7, 8 and 9.
10 Conclusion
In this chapter, a new multivariate gamma distribution whose marginals are finite mixtures of gamma distributions is defined. It is shown that the correlation between any pair of variables is negative. Therefore, the newly introduced distribution could be suitable for fitting multivariate data with negative correlations. Several of its properties, such as joint moments, correlation coefficients, the moment generating function, and the Rényi and Shannon entropies, have been derived. In Sect. 8, the method of maximum likelihood has been applied to estimate the parameters. Because the resulting likelihood equations are nonlinear, numerical methods have been used to solve them. Simulation studies have been conducted to evaluate the performance of the maximum likelihood method. Moreover, various tables and figures have been provided to illustrate the simulation results and the performance of the MLE method in estimating the parameters.
References
Aksoy, H. (2000). Use of gamma distribution in hydrological analysis. Turkish Journal of Engineering and Environmental Sciences, 24, 419–428.
Arnold, B. C., Castillo, E., & Sarabia, J. M. (1999). Conditional specification of statistical models. New York: Springer.
Balakrishnan, N., & Lai, C.-D. (2009). Continuous bivariate distributions (2nd ed.). Dordrecht: Springer.
Balakrishnan, N., & Ristić, M. M. (2016). Multivariate families of gamma-generated distributions with finite or infinite support above or below the diagonal. Journal of Multivariate Analysis, 143, 194–207.
Carpenter, M., & Diawara, N. (2007). A multivariate gamma distribution and its characterizations. American Journal of Mathematical and Management Sciences, 27(3–4), 499–507.
Dussauchoy, A., & Berland, R. (1975). A multivariate gamma type distribution whose marginal laws are gamma, and which has a property similar to a characteristic property of the normal case. In G. P. Patil, S. Kotz & J. K. Ord (Eds.), Statistical Distributions in Scientific Work, Vol. 1: Models and Structures, (pp. 319–328).
Exton, H. (1976). Multiple hypergeometric functions and applications. Chichester: Ellis Horwood.
Fang, K. T., Kotz, S., & Ng, K. W. (1990). Symmetric multivariate and related distributions. London, New York: Chapman and Hall.
Furman, E. (2008). On a multivariate gamma distribution. Statistics and Probability Letters, 78(15), 2353–2360.
Gaver, D. P., Jr. (1970). Multivariate gamma distributions generated by mixture. Sankhya: The Indian Journal of Statistics Series A, 32(1), 123–126.
Gupta, A. K., & Nagar, D. K. (2000). Matrix variate distributions. Boca Raton: Chapman & Hall/CRC.
Gupta, A. K., & Song, D. (1996). Generalized Liouville distribution. Computers & Mathematics with Applications, 32(2), 103–109.
Gupta, R. D., & Richards, D. S. P. (2001). The history of Dirichlet and Liouville distributions. International Statistical Review, 69(3), 433–446.
Hutchinson, T. P., & Lai, C. D. (1991). The engineering statistician’s guide to continuous bivariate distributions. Adelaide: Rumsby Scientific Publishing.
Johnson, N. L., Kotz, S., & Balakrishnan, N. (1995). Continuous univariate distributions (Vol. 2, 2nd Edn.). New York: Wiley.
Kotz, S., Balakrishnan, N., & Johnson, N. L. (2000). Continuous multivariate distributions-1 (2nd ed.). New York: Wiley.
Kowalczyk, T., & Tyrcha, J. (1989). Multivariate gamma distributions-properties and shape estimation. Statistics: A Journal of Theoretical and Applied Statistics, 20(3) , 465–474.
Krishnaiah, P. R., & Rao, M. M. (1961). Remarks on a multivariate gamma distribution. The American Mathematical Monthly, 68(4), 342–346.
Marcus, M. (2014). Multivariate gamma distributions. Electronic Communications in Probability, 19, 1–10.
Mathai, A. M., & Moschopoulos, P. G. (1991). On a multivariate gamma. Journal of Multivariate Analysis, 39(1), 135–153.
Mathai, A. M., & Moschopoulos, P. G. (1992). A form of multivariate gamma distribution. Annals of the Institute of Statistical Mathematics, 44(1), 97–106.
Nadarajah, S., & Zografos, K. (2005). Expressions for Rényi and Shannon entropies for bivariate distributions. Information Sciences, 170(2–4), 173–189.
Peppas, K. P., Alexandropoulos, G. C., Datsikas, C. K., & Lazarakis, F. I. (2011). Multivariate gamma-gamma distribution with exponential correlation and its applications in radio frequency and optical wireless communications. IET Microwaves Antennas & Propagation, 5(3), 364–371.
Rafiei, M., Iranmanesh, A., & Nagar, D. K. (2020). A bivariate gamma distribution whose marginals are finite mixtures of gamma distributions. Statistics, Optimization & Information Computing, 8(4), 950–971.
Rényi, A. (1961). On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability (Vol. I, pp. 547–561). Berkeley, CA: University of California Press.
Robson, J. G., & Troy, J. B. (1987). Nature of the maintained discharge of \(Q\), \(X\), and \(Y\) retinal ganglion cells of the cat. Journal of the Optical Society of America, A, 4, 2301–2307.
Royen, T. (2007). Integral representations and approximations for multivariate gamma distributions. Annals of the Institute of Statistical Mathematics, 59(3), 499–513.
Semenikhine, V., Furman, E., & Su, J. (2018). On a multiplicative multivariate gamma distribution with applications in insurance. Risks, 6(3), 79.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27(379–423), 623–656.
Sivazlian, B. D. (1981). On a multivariate extension of the gamma and beta distributions. Siam Journal on Applied Mathematics, 41(2), 205–209.
Song, D., & Gupta, A. K. (1997). Properties of generalized Liouville distribution. Random Operators and Stochastic Equations, 5(4), 337–348.
Srivastava, H. M., & Karlsson, P. W. (1985). Multiple Gaussian hypergeometric series. New York: Wiley.
Vaidyanathan, V. S., & Lakshmi, R. V. (2015). Parameter estimation in multivariate gamma distribution. Statistics, Optimization and Information Computing, 3, 147–159.
Wilks, S. S. (1962). Mathematical Statistics. New York: Wiley.
Zografos, K. (1999). On maximum entropy characterization of Pearson’s type II and VII multivariate distributions. Journal of Multivariate Analysis, 71(1), 67–75.
Zografos, K., & Nadarajah, S. (2005). Expressions for Rényi and Shannon entropies for multivariate distributions. Statistics & Probability Letters, 71(1), 71–84.
Acknowledgements
Authors are grateful to the worthy reviewers for their constructive and helpful comments and suggestions.
Appendix
In this section, we give definitions and results that will be used in subsequent sections. Throughout this work we will use the Pochhammer symbol \((a)_{n}\) defined by \((a)_{n}=a(a+1)\cdots (a+n-1)=(a)_{n-1}(a+n-1)\) for \(n=1,2,\ldots ,\) and \((a)_{0}=1\).
The fourth hypergeometric function of Lauricella, denoted by \(F_{D}^{(n)}\), in n variables \(z_{1},\ldots , z_{n}\) is defined by
where \(|z_i|<1\), \(i=1, \ldots , n\). An integral representation of \(F_{D}^{(n)}\) in Exton [7, p. 49, Eq. (2.3.5)] is given as
For further results and properties of this function the reader is referred to Exton [7] and Srivastava and Karlsson [32].
Let \(f(\cdot )\) be a continuous function and \(\alpha _{i} >0\), \(i=1, \ldots ,n\). The integral
is known as the Liouville-Dirichlet integral. Substituting \(y_i=x_i/x,\ i=1,\ldots ,n-1\) and \(x=\sum _{i=1}^n x_i\) with the Jacobian \(J(x_{1},\ldots ,x_{n-1}, x_{n}\rightarrow y_{1},\ldots ,y_{n-1}, x ) =x^{n-1}\) it is easy to see that
Finally, we define the beta type 1, beta type 2 and Dirichlet type 1 distributions. These definitions can be found in Wilks [34], Fang, Kotz and Ng [8], Johnson, Kotz and Balakrishnan [15], and Kotz, Balakrishnan and Johnson [16].
Definition 2
A random variable X is said to have the beta type I distribution with parameters (a, b), \(a>0\), \(b>0\), denoted as \(X\sim \text {B1}(a,b)\), if its pdf is given by
Definition 3
A random variable X is said to have the beta type II distribution with parameters (a, b), denoted as \(X\sim \text {B2}(a,b)\), \(a>0\), \(b>0\), if its pdf is given by
Definition 4
The random variables \(U_1, \ldots , U_n\) are said to have a Dirichlet type 1 distribution with parameters \(\alpha _1,\ldots ,\alpha _n\) and \(\alpha _{n+1}\), denoted by \((U_1,\ldots ,U_n)\sim \text {D1}(\alpha _1,\ldots ,\alpha _n;\alpha _{n+1})\), if their joint pdf is given by
where \(\alpha _i>0\), \(i=1,\ldots ,n+1\).
The Dirichlet type 1 distribution, which is a multivariate generalization of the beta type 1 distribution, has been considered by several authors and is well known in the scientific literature. By making the transformation \(V_j= U_j/(1-\sum _{i=1}^{n}U_i)\), \(j=1, \ldots , n\), in (17), the Dirichlet type 2 density, which is a multivariate generalization of the beta type 2 density, is obtained as
We will write \((V_1,\ldots ,V_n)\sim \textrm{D2}(\alpha _1,\ldots ,\alpha _n;\alpha _{n+1})\) if the joint density of \(V_1,\ldots ,V_n\) is given by (18).
The matrix variate generalizations of beta type 1, beta type 2 and Dirichlet type 1 distributions have been defined and studied extensively. For example, see Gupta and Nagar [11].
Definition 5
Multinomial Theorem: For a non-negative integer k and a positive integer m,
where
The numbers appearing in the theorem are the multinomial coefficients. They can be expressed in numerous ways, including as a product of binomial coefficients or in terms of factorials:
Lemma 1
For \(a_1>0, \ldots , a_m>0\) and \(k\in \mathbb {N}\), we have
Proof
Writing \((1-\theta )^{- (a_1+\cdots + a_m)}\) as \((1-\theta )^{- a_1} \cdots (1-\theta )^{- a_m}\) and using power series expansion, for \(0<\theta <1\), we get
and
Now, comparing coefficients of \(\theta ^k\), we get the desired result. \(\square \)
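The identity in Lemma 1, \(\sum _{k_1+\cdots +k_m=k}\prod _{i=1}^m (a_i)_{k_i}/k_i! = (a_1+\cdots +a_m)_k/k!\), is easy to confirm by brute-force enumeration; a small sketch with arbitrary positive \(a_i\):

```python
import math
from itertools import product

def poch(a, n):
    """Pochhammer symbol (a)_n = a (a+1) ... (a+n-1)."""
    out = 1.0
    for j in range(n):
        out *= a + j
    return out

a = (1.5, 2.0, 0.7)   # illustrative values, m = 3
k = 5
# left side: sum over all (k_1, k_2, k_3) with k_1 + k_2 + k_3 = k
lhs = sum(math.prod(poch(ai, ki) / math.factorial(ki) for ai, ki in zip(a, kv))
          for kv in product(range(k + 1), repeat=len(a)) if sum(kv) == k)
rhs = poch(sum(a), k) / math.factorial(k)
assert abs(lhs - rhs) < 1e-8
```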
Lemma 2
Let
where \(a_1>0, \ldots , a_m>0\) and \(k\in \mathbb {N}\). Then
Proof
Expanding \(\left( \sum _{i=1}^{m}z_{i}\right) ^k\) in (19) by using multinomial theorem and integrating \(z_1,\ldots , z_m\), we obtain
Now, using Lemma 1, we get the desired result. \(\square \)
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Iranmanesh, A., Rafiei, M., Nagar, D.K. (2022). A Generalized Multivariate Gamma Distribution. In: Bekker, A., Ferreira, J.T., Arashi, M., Chen, DG. (eds) Innovations in Multivariate Statistical Modeling. Emerging Topics in Statistics and Biostatistics. Springer, Cham. https://doi.org/10.1007/978-3-031-13971-0_12
Print ISBN: 978-3-031-13970-3
Online ISBN: 978-3-031-13971-0