Abstract
In this paper, we obtain the central limit theorem and the law of the iterated logarithm for Galton–Watson branching processes with time-dependent immigration in varying environments.
1 Introduction
Let \(\{T_{n};n\ge 0\}\) be a classical supercritical Galton–Watson branching process with offspring distribution \(\{b_{n};n\ge 0\}\) and mean \(m:=\sum _{n=0}^{\infty }nb_{n}\), \(1<m<\infty \). Define \(S_{n}=T_{n}/m^{n}\); it is well known that there exists a nonnegative random variable \(S\) such that \(S=\lim _{n\rightarrow \infty }S_{n} ~ a.s.\) C. C. Heyde [10–12] derived the central limit theorem (CLT) and the law of the iterated logarithm (LIL) for \(\{T_{n};n\ge 0\}\):
(I) Suppose that \(\tau _{1}^{2}:=Var(T_{1})<\infty \) and \(P(S>0)=1\). Set \(\tau _{r}^{2}:=Var(T_{r}), r\ge 1\), then
(II) Suppose that \(E(T_{1}^{3})<\infty \), then
Similar limit theorems for a classical Galton–Watson branching process with immigration were studied in [13].
A branching process in a random environment is a natural and important extension of the Galton–Watson process: it is a class of non-homogeneous Galton–Watson processes indexed by a time environment, and it has been studied by many authors; see [1–3, 5, 6, 14, 15, 19–22].
In this article, we consider the Galton–Watson branching process with time-dependent immigration in varying environments (IGWVE), defined as follows:
Definition 1.1
Let \(Z_{0}\equiv 1\) and for any \(n\ge 0\),
$$Z_{n+1}=\sum _{j=1}^{Z_{n}}\xi _{n,j}+Y_{n+1},$$
where \(\{\xi _{n,j}; n\ge 0,j\ge 1\}\) are independent and identically distributed within each row, and \(\{Y_{n},n\ge 1\}\) is a sequence of independent random variables taking values in \({\mathbb {N}}\) and independent of \(\{\xi _{n,j}; n\ge 0,j\ge 1\}\); then \(\{Z_{n}, n\ge 0\}\) is said to be a Galton–Watson branching process with time-dependent immigration in varying environments. In particular, if \(Y_n\equiv 0\) for all \(n\ge 1\), then \(\{Z_{n}, n\ge 0\}\) is said to be a Galton–Watson branching process in varying environments (GWVE).
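Definition 1.1 is straightforward to simulate. The sketch below generates one trajectory of an IGWVE, assuming purely for illustration that the row-\(n\) offspring law is Poisson with mean \(\mu _{n}\) and that the immigration is Poisson as well; the definition itself allows any distributions with the moments assumed in this paper, and all function names here are ours.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's product method for one Poisson(lam) draw (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_igwve(offspring_means, immigration_means, rng):
    """One trajectory Z_0, ..., Z_n of an IGWVE with Z_0 = 1: each of the Z_n
    particles of generation n reproduces independently, then Y_{n+1} immigrants
    join generation n + 1.  Poisson laws are an illustrative choice only;
    Definition 1.1 allows any offspring/immigration laws with finite variance."""
    z, path = 1, [1]
    for mu_n, nu_next in zip(offspring_means, immigration_means):
        children = sum(poisson(mu_n, rng) for _ in range(z))
        z = children + poisson(nu_next, rng)   # immigrants Y_{n+1}
        path.append(z)
    return path

rng = random.Random(2014)
path = simulate_igwve([1.5] * 10, [1.0] * 10, rng)
```

Taking the immigration means to be \(0\) recovers a plain GWVE, and passing non-constant lists gives a genuinely varying environment.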
Throughout this paper we assume that the variances of \(\{\xi _{n,j}; n\ge 0,j\ge 1\}\) and \(\{Y_{n},n\ge 1\}\) exist. Define \(\mu _{n}=E(\xi _{n,j})\), \(\delta _{n}^{2}=Var(\xi _{n,j})\) and \(\nu _{n}=E(Y_{n})\); we assume that
Our first main result is the following growth rate of IGWVE.
Theorem 1.1
Let \(\{Z_{n}, n\ge 0\}\) be an IGWVE. Define \(m_{n}=\prod _{i=0}^{n-1}\mu _{i}\) and \(W_{n}=Z_{n}/m_{n}\), \(n\ge 1\); then \(\{W_{n}, n\ge 1\}\) is a nonnegative submartingale. If
then there exists a nonnegative random variable \(W\) with \(E(W)<\infty \) such that \(W_{n}\xrightarrow {a.s.} W.\) Furthermore, if
where \(m_{n,k}=\prod _{i=n}^{n+k-1}\mu _{i}\), then \(W_{n}\xrightarrow {L^{2}}W\).
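As a numerical aside (helper names are ours), the first moment of \(W_{n}\) can be computed exactly from the decomposition used in the proof of Theorem 1.1: conditioning on the immigration gives \(E(Z_{n})=m_{n}+\sum _{k=1}^{n}\nu _{k}m_{k,n-k}\), and since \(m_{k}m_{k,n-k}=m_{n}\), this yields \(E(W_{n})=1+\sum _{k=1}^{n}\nu _{k}/m_{k}\). A minimal sketch:

```python
def mean_W(offspring_means, immigration_means):
    """Exact E(W_n) for the IGWVE of Theorem 1.1, where W_n = Z_n/m_n,
    offspring_means[i] = mu_i (i = 0..n-1) and immigration_means[k-1] = nu_k
    (k = 1..n).  Uses E(Z_n) = m_n + sum_{k=1}^n nu_k m_{k,n-k} together with
    m_k * m_{k,n-k} = m_n, so that E(W_n) = 1 + sum_{k=1}^n nu_k / m_k."""
    m, total = 1.0, 1.0
    for mu_prev, nu_k in zip(offspring_means, immigration_means):
        m *= mu_prev      # after k factors, m equals m_k = mu_0 * ... * mu_{k-1}
        total += nu_k / m
    return total
```

For instance, `mean_W([2.0] * 5, [1.0] * 5)` returns \(1+\sum _{k=1}^{5}2^{-k}=1.96875\); by the formula, the means \(E(W_{n})\) stay bounded exactly when \(\sum _{k}\nu _{k}/m_{k}<\infty \).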
To prove the CLT and the LIL for IGWVE, we need the following two decomposition results.
(A) For any fixed \(n\ge 0, r\ge 1\), one has
$$Z_{n+r}=\sum _{j=1}^{Z_{n}}X_{n,r}^{(j)}+Y_{n,r},$$
where \(\{X_{n,r}^{(j)},j\ge 1\}\) are i.i.d. and independent of \(Y_{n,r}\). The variances of \(\{X_{n,r}^{(j)},j\ge 1\}\) and \(Y_{n,r}\) exist. We define \(m_{n,r}=E(X_{n,r}^{(1)}), \sigma _{n,r}^{2}=Var(X_{n,r}^{(1)}), \pi _{n,r}=E(Y_{n,r}), \theta _{n,r}^{2}=Var(Y_{n,r}).\)
(B) For any fixed \(n\ge 0\) one has
$$m_{n}W=\sum _{j=1}^{Z_{n}}V_{n}^{(j)}+I_{n},$$
where \(\{V_{n}^{(j)};j\ge 1\}\) are i.i.d. and independent of \(I_{n}\). If (1.2) is satisfied, then the variances of \(\{V_{n}^{(j)},j\ge 1\}\) exist. Furthermore, if
then the variance of \(I_{n}\) exists. We define \(\sigma _{n}^{2}=Var(V_{n}^{(1)}), \pi _{n}=E(I_{n}), \theta _{n}^{2}=Var(I_{n}).\)
Using the above two decomposition results, we obtain the CLT and the LIL for IGWVE.
Theorem 1.2
Let \(\{Z_{n}, n\ge 0\}\) be an IGWVE. If \(Z_{n}\xrightarrow {P}\infty \), (1.2) and (1.3) are satisfied, and
then for any fixed \(r\ge 1\), when \(n\rightarrow \infty \) one has
Remark 1.3
Note that \(Z_n\ge 1\); a necessary and sufficient condition for \(Z_{n}\xrightarrow {a.s.}\infty \) is
which implies \(Z_{n}\xrightarrow {P}\infty \) [17].
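To see Theorem 1.2 at work numerically: as the proof in Section 3 indicates, for fixed \(r\) the quantity \((Z_{n+r}-m_{n,r}Z_{n})/(\sigma _{n,r}\sqrt{Z_{n}})\) is asymptotically standard normal. The Monte Carlo sketch below checks this for \(r=1\) in a constant environment with Poisson offspring and Poisson immigration; these distributional choices, and all names, are ours and serve only this simulation.

```python
import math
import random
import statistics

def poisson(lam, rng):
    """Knuth's product method for one Poisson(lam) draw (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def clt_samples(n_gen, reps, rng, mu=1.5, nu=1.0):
    """Monte Carlo samples of (Z_{n+1} - mu_n Z_n)/(delta_n sqrt(Z_n)), the
    r = 1 statistic suggested by the proof of Theorem 1.2, in a constant
    environment with Poisson(mu) offspring (so delta_n^2 = mu) and Poisson(nu)
    immigration.  Rare trajectories hitting Z_n = 0 are skipped to avoid
    dividing by zero."""
    delta = math.sqrt(mu)
    out = []
    for _ in range(reps):
        z = 1
        for _ in range(n_gen):
            z = sum(poisson(mu, rng) for _ in range(z)) + poisson(nu, rng)
        if z == 0:
            continue
        z_next = sum(poisson(mu, rng) for _ in range(z)) + poisson(nu, rng)
        out.append((z_next - mu * z) / (delta * math.sqrt(z)))
    return out

rng = random.Random(7)
samples = clt_samples(n_gen=10, reps=2000, rng=rng)
m = statistics.fmean(samples)
v = statistics.pvariance(samples)
```

With 2000 trajectories the sample mean of the statistic is close to 0 and its sample variance close to 1, consistent with asymptotic standard normality.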
Assume that there exists a constant \(0<\delta <1\) such that for any \(r\ge 1\),
For any \(n\ge 1,~x\in R \) and \(r\ge 1\), define
Theorem 1.4
Let \(\{Z_{n}, n\ge 0\}\) be an IGWVE. If \(Z_n\xrightarrow {a.e.}+\infty \) and (1.2)–(1.5) are satisfied, then there exist constants \(\{C_{r},r\ge 1\}\), \(C\) and \(D\) such that
Theorem 1.5
Let \(\{Z_{n};n\ge 0\}\) be an IGWVE. Suppose that there exist five constants \(\alpha ,\beta ,\tau ,\gamma ,\delta \) with \(\beta >\alpha >1,\tau >\gamma >0\) and \( 0<\delta <1\) such that for any \(n\ge 0\),
If (1.3) and (1.5) are satisfied, then
Remark 1.6
If (1.9) is satisfied, then \(Z_n\xrightarrow {a.e.}+\infty \), and (1.2) and (1.4) hold.
2 A Growth Rate for IGWVE
In order to prove Theorem 1.1, we need the following lemma:
Lemma 2.1
Let \(\{X_{n},n\ge 0\}\) be a GWVE. For any fixed \(n\ge 0, r\ge 1\) one has
where \(\{X_{n,r}^{(j)}; j\ge 1\}\) are i.i.d. and independent of \(X_{n}\). Furthermore,
Note that \(m_{n, 0}=1,~~ m_{n, 1}=\mu _{n},~~ X_{n,1}^{(j)}=\xi _{n,j}\) and \(\sigma _{n,1}^{2}=\delta _{n}^{2}.\)
Proof
Let \(X_{n,r}^{(j)}\) be the size of the \(r\)th generation of the GWVE starting with the \(j\)th particle at time \(n\). The first result in Lemma 2.1 follows from Definition 1.1. Similar to Proposition 4 and Proposition 6 in [8], the rest of Lemma 2.1 holds. \(\square \)
Proof of Theorem 1.1
By our basic assumption, it is obvious that \(W_n\) is integrable for all \(n\ge 1\). By the independence of \(\{Y_n,n\ge 1\}\) and \(\{\xi _{n,j},n\ge 0,j\ge 1\}\), one has
which means that \(\{W_n, n\ge 1\}\) is a nonnegative submartingale.
Let \(X_n\) be the size of the original GWVE at time \(n\) and \(U_{k,n}(k<n)\) be the number of descendants at time \(n\) of the particles that immigrated in generation \(k\), then
where \(U_{n,n}=Y_n\). Let \({\mathcal {G}}\) be the \(\sigma -\)field generated by \(\{Y_n,n\ge 1\}\), by the independence of \(\{Y_n,n\ge 1\}\) and \(\{\xi _{n,j},n\ge 0,j\ge 1\}\) one has
Now for \(k\le n\), the random variable \(U_{k,n}\) is the size of the \((n-k)\)th generation of an ordinary GWVE starting with \(Y_k\) particles at time \(k\). Therefore, by the independence of \(\{Y_n,n\ge 1\}\) and Lemma 2.1, its conditional expectation is just \(Y_{k}m_{k,n-k}\). According to (2.2) and Lemma 2.1, one has
Since (1.1) is satisfied, we know that \(\{W_n,n\ge 1\}\) is \(L^1\)-bounded. By the submartingale convergence theorem (see [9], Theorem 2.5), there exists a nonnegative random variable \(W\) with \(E(W)<\infty \) such that \(W_n\xrightarrow {a.s.}W.\)
In order to prove the last result in Theorem 1.1, we only need to prove that \(\{W_n,n\ge 1\}\) is \(L^2\)-bounded (see Theorem 7.6.10 of [4]). In fact, by (2.1) and the independence of \(\{Y_n,n\ge 1\}\) and \(\{\xi _{n,j},n\ge 0,j\ge 1\}\) one has
where \(X_{k,n-k}^{(j)}\), defined in Lemma 2.1, is the size of the \((n-k)\)th generation of the GWVE starting with the \(j\)th particle that immigrated at time \(k\). By Lemma 2.1 one has
which means that \(\{W_n,n\ge 1\}\) is \(L^2\)-bounded. We complete the proof. \(\square \)
3 The CLT for IGWVE
In order to prove Theorem 1.2, we need the following lemmas.
Lemma 3.1
Let \(\{Z_n,n\ge 0\}\) be an IGWVE. For any fixed \(n\ge 0, r\ge 1\) one has
where \(\{X_{n,r}^{(j)},j\ge 1\}\) are i.i.d. and independent of \(Y_{n,r}\). Furthermore,
Proof
Similar to Lemma 2.1, let \(X_{n,r}^{(j)}\) be the size of the \(r\)th generation of the ordinary GWVE starting with the \(j\)th particle at time \(n\). Let \(U_{k,n}\) be the number of descendants at time \(n\) of the particles that immigrated in generation \(k\). Define \(Y_{n,r}=\sum _{k=n+1}^{n+r}U_{ k,n+r}.\) Then
By the independence of \(\{Y_n,n\ge 1\}\) and \(\{\xi _{n,j},n\ge 0,j\ge 1\}\) we know that \(\{X_{n,r}^{(j)},j\ge 1\}\) are i.i.d. and independent of \(Y_{n,r}\).
Let \({\mathcal {G}}\) be the \(\sigma -\)field generated by \(\{Y_n,n\ge 1\}\). Similar to the calculation of (2.2) we have
So by Lemma 2.1, we have \(E(Y_{n,r})=\sum _{k=n+1}^{n+r}\nu _{k}m_{k,n+r-k}.\) On the other hand, by the independence of \(\{Y_n,n\ge 1\}\) and \(\{\xi _{n,j},n\ge 0,j\ge 1\}\) one has
So by Lemma 2.1 we have \(Var(Y_{n,r})=\sum _{k=n+1}^{n+r}\nu _{k}\sigma _{k,n+r-k}^{2}.\) We complete the proof. \(\square \)
Lemma 3.2
[8] Let \(\{X_{n},n\ge 0\}\) be a GWVE. Define \(V_{n}=X_{n}/m_{n}\) for all \(n\ge 1\), then \(\{V_{n},n\ge 1\}\) is a nonnegative martingale and there exists a nonnegative random variable \(V\) such that \(V_{n}\xrightarrow {a.s.}V\) when \(n\rightarrow \infty \).
Lemma 3.3
Let \(\{X_{n},n\ge 0\}\) be a GWVE. Define \(V_{n,r}^{(j)}=X_{n,r}^{(j)}/m_{n,r}\), where \(X_{n,r}^{(j)}\) and \(m_{n,r}\) are defined in Lemma 2.1; then \(\{V_{n,r}^{(j)},r\ge 1\}\) is a nonnegative martingale and \(V_{n,r}^{(j)}\xrightarrow {a.s.}V_{n}^{(j)}\) as \(r\rightarrow \infty \) with \(n,j\) fixed. Then for any fixed \(n\ge 1\) one has
where \(V\) is defined in Lemma 3.2, \(\{V_{n}^{(j)};j\ge 1\}\) are i.i.d. and independent of \(X_{n}\).
Furthermore, if (1.2) is satisfied, then \(V_{n,r}^{(j)}\xrightarrow {a.s.}V_{n}^{(j)}\) as \(r\rightarrow \infty \) with \(n,j\) fixed, \(E(V_{n}^{(j)})\equiv 1\),
Proof
Similar to Proposition 5 and Corollary 2 of [8], we know the first result in Lemma 3.3 is true. The second result then follows from Lemma 2.1. Using the same idea as Theorem 1 in [8], we can prove the last result. \(\square \)
Lemma 3.4
Let \(\{Z_{n},n\ge 0\}\) be an IGWVE. For any fixed \(n\ge 0\), one has
where \(\{V_{n}^{(j)},j\ge 1\}\) are i.i.d. and independent of \(I_{n}\). Furthermore, if (1.2) and (1.3) are satisfied,
Proof
Applying Theorem 1.1 one has that for each \(n\ge 1\),
So by Lemma 3.1 one deduces that
By Lemma 3.3 one has
On the other hand, define \(T_{n,r}=\frac{Y_{n,r}}{m_{n,r}}=\sum _{k=n+1}^{n+r}U_{ k,n+r}/m_{n,r}\) for each \(r\ge 1\). Let \({\mathcal {F}}_{n,r}\) be the \(\sigma \)-field generated by \(\{Y_{n},\cdots ,Y_{n+r}, \xi _{i,j}, n-1\le i\le n+r-1,j\ge 1 \}\). Note that by Lemma 3.1 one has
which means that \(\{T_{n,r},r\ge 1\}\) is \(L^{1}-\)bounded. Now
which means that \(\{T_{n,r},r\ge 1\}\) is a nonnegative submartingale with respect to \(\{{\mathcal {F}}_{n,r},r\ge 1\}\); then there exists a nonnegative random variable \(I_{n}\) with \(E(I_{n})<\infty \) such that
By (3.7), (3.8) and (3.9) we obtain (3.4).
Now by Lemma 3.1 one has
which means that \(\{T_{n,r},r\ge 1\}\) is \(L^{2}-\)bounded, so \(T_{n,r}\xrightarrow {L^{2}}I_{n}\) when \(r\rightarrow \infty \). Thus,
We complete the proof of Lemma 3.4. \(\square \)
Now we consider a double sequence of random variables \(\{\zeta _{n,j}, j\ge 1,n\ge 1\}\), where for any \(n\ge 1\), \(\{\zeta _{n,j}, j\ge 1\}\) are i.i.d. Let \(\{N_{n},n\ge 1\}\) be a sequence of random variables taking values in \({\mathbb {Z}}_{+}:=\{1,2,\cdots \}\), where \(N_{n}\xrightarrow {P}\infty \) and, for any \(n\ge 1\), \(\{\zeta _{n,j}, j\ge 1\}\) and \(N_{n}\) are independent. Define
then we have the following result:
Lemma 3.5
If \(E(\zeta _{n,j})\equiv 0\) and \(Var(\zeta _{n,j})\equiv 1\), then one has
Proof
Since \(\{\zeta _{n,j},j\ge 1\}\) have the same distribution, we can set
Note that \(E(\zeta _{n,j})\equiv 0\) and \(Var(\zeta _{n,j})\equiv 1\); according to (3.8) on p. 101 of [18], one obtains
For any fixed \(t\) and \(k\) large enough
Since for any \(n\ge 1\), \(\{N_{n},\zeta _{n,j},j\ge 1\}\) are independent, we have
Fix \(t\in {\mathbb {R}}:=(-\infty ,+\infty )\). Note that \((1-\frac{x}{n}+o(\frac{1}{n}))^{n}\rightarrow e^{-x}~(n\rightarrow \infty )\), so there exists a constant \(M=M(\varepsilon )>0\) such that for any \(k\ge M\) one has
Since \(N_{n}\xrightarrow {P}\infty \), for any \(\varepsilon >0\), there exists a constant \(N=N(\varepsilon )>1\) such that for any \(n\ge N\) one has \(P(N_{n}\le M)<\frac{\varepsilon }{4}.\) Thus, when \(n\ge N\), by (3.10),(3.11) and (3.12) one has
This means that the characteristic function of \(S_{N_{n}}/\sqrt{N_{n}}\) converges to that of a standard normal random variable, so by Lévy's continuity theorem we complete the proof of this lemma. \(\square \)
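Lemma 3.5 is a random-index CLT and is easy to illustrate numerically. In the sketch below (our choices: Rademacher summands, so mean 0 and variance 1 hold exactly, and \(N_{n}\) uniform on \(\{n,\dots ,2n\}\), independent of the summands), the normalized sums \(S_{N_{n}}/\sqrt{N_{n}}\) should be approximately standard normal for large \(n\).

```python
import random
import statistics

def random_index_sum(n, rng):
    """One draw of S_{N_n}/sqrt(N_n): N_n ~ Uniform{n, ..., 2n} is drawn
    independently of the i.i.d. Rademacher summands zeta_{n,j}."""
    big_n = rng.randint(n, 2 * n)
    s = sum(1 if rng.random() < 0.5 else -1 for _ in range(big_n))
    return s / big_n ** 0.5

rng = random.Random(1)
draws = [random_index_sum(200, rng) for _ in range(2000)]
m = statistics.fmean(draws)   # should be near 0
v = statistics.pvariance(draws)  # should be near 1
```

The sample mean and variance of the 2000 draws match the standard normal values 0 and 1 up to Monte Carlo error, as Lemma 3.5 predicts.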
Proof of Theorem 1.2
By Lemma 3.1, for any \(n\ge 0,r\ge 1\) we have
where \(\{X_{n,r}^{(j)}, j\ge 1\}\) are i.i.d. and independent of \(Z_{n}\), \(E(X_{n,r}^{(j)})\equiv m_{n,r}\), \(Var(X_{n,r}^{(j)})\equiv \sigma _{n,r}^{2}\). According to Lemma 3.5, if we can prove that \(Y_{n,r}/\sigma _{n,r}\sqrt{Z_{n}}\xrightarrow {P}0\) with \(r\) fixed and \(n\rightarrow \infty \), then we complete the proof of the first part of Theorem 1.2. In fact, for any \(\epsilon >0\),
Note that \(Z_{n}\xrightarrow {P}\infty \) and \(Z_n\ge 1\); for any \(\epsilon >0\), there exist two constants \(N,M>0\) such that \(1/M<\epsilon /2\) and \(P(Z_n<M)<\epsilon /2\) for all \(n\ge N.\) When \(n\ge N,\)
Since (1.2), (1.3) and (1.4) are satisfied,
By Lemma 3.4, for any \(n\ge 1\) one has
where \(\{V_{n}^{(j)};j\ge 1\}\) are i.i.d. and independent of \(Z_{n}\), \(E(V_{n}^{(j)})\equiv 1\), \(Var(V_{n}^{(j)})\equiv \sigma _{n}^{2}\). Similarly, according to Lemma 3.5, if we can prove that \(I_{n}/\sigma _{n}\sqrt{Z_{n}}\xrightarrow {P}0\) as \(n\rightarrow \infty \), then we complete the proof of the second part of Theorem 1.2. In fact, for any \(\epsilon >0\),
Note that \(Z_{n}\xrightarrow {P}\infty \) and that (1.2)–(1.4) are satisfied; then
We complete the proof of Theorem 1.2. \(\square \)
4 Convergence Rate in the CLT for IGWVE
In order to prove Theorem 1.4, we need the following lemmas.
Lemma 4.1
([7] P322) Let \(\{X_n,n\ge 1\}\) be a sequence of independent random variables with \(E(X_{n})=0,n=1,2,\cdots \). Define
If there exists an arbitrarily small constant \(0<\delta <1\) such that
then there exists a constant \(A\) such that
where \(\Phi (x)\) is the standard normal distribution.
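Lemma 4.1 is a Lyapunov-type Berry–Esseen bound. For an exactly computable illustration (the Rademacher choice is ours, not the paper's), note that a sum \(S_{n}\) of \(n\) i.i.d. Rademacher variables is a shifted binomial, so \(\sup _{x}|P(S_{n}/\sqrt{n}\le x)-\Phi (x)|\) can be evaluated without simulation and compared with the classical \(O(n^{-1/2})\) rate.

```python
import math

def normal_cdf(x):
    """Phi(x), via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kolmogorov_gap(n):
    """sup_x |F_n(x) - Phi(x)| for the normalized sum S_n/sqrt(n) of n i.i.d.
    Rademacher variables.  S_n takes the values -n + 2j with exact binomial
    probabilities C(n, j)/2^n, so the supremum is attained at (or just to the
    left of) one of these atoms and can be computed exactly."""
    gap, cdf = 0.0, 0.0
    for j in range(n + 1):
        v = (-n + 2 * j) / math.sqrt(n)
        phi_v = normal_cdf(v)
        gap = max(gap, abs(cdf - phi_v))   # just below the atom
        cdf += math.comb(n, j) / 2.0 ** n
        gap = max(gap, abs(cdf - phi_v))   # at the atom
    return gap

# A Berry-Esseen bound predicts gap <= C * E|X|^3 / (sigma^3 sqrt(n)) = C/sqrt(n)
# here, since the Rademacher law has sigma = 1 and E|X|^3 = 1.
gap_100 = kolmogorov_gap(100)
```

At \(n=100\) the exact gap is about \(0.04\), below \(0.4748/\sqrt{100}\) (the bound with the best known universal constant), and it shrinks as \(n\) grows.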
Lemma 4.2
Let \(\{\zeta _{n,j}, j\ge 1,n\ge 1\}\) be a double sequence of random variables with mean zero and variance 1, where for any \(n\ge 1\), \(\{\zeta _{n,j}, j\ge 1\}\) are i.i.d. \(\{N_{n},n\ge 1\}\) is a sequence of integer-valued random variables with \(P(N_{n}\rightarrow \infty )=1\) as \(n\rightarrow \infty \). For any \(n\ge 1\), \(N_{n}\) and \(\{\zeta _{n,j}, j\ge 1\}\) are independent. \(\{k_n,n\ge 1\}\) is a sequence of strictly increasing positive integers. Define
If there exist two constants \(\delta \) and \(M>0\) such that for any \(n\ge 1\),
then there exists a constant \(C\) (not depending on \(\{k_{n}\}\)) such that
Proof
Let \(C=AM\), where \(A\) is chosen as in Lemma 4.1; then using Lemma 4.1 we have the first inequality of (4.1). As for the second inequality, since for any \(n\ge 1\), \(N_{n}\) and \(\{\zeta _{n,j}, j\ge 1\}\) are independent, one has
By the first inequality of (4.1), one has
We complete the proof. \(\square \)
Lemma 4.3
Assume that \(\{\zeta _{n,j}, j\ge 1,n\ge 1\}, \{N_{n},n\ge 1\}\) satisfy the conditions of Lemma 4.2. Let \(\{\eta _{n},n\ge 1\}\) be a sequence of independent random variables with \(E(|\eta _{n}|)<\infty \) and for any \(n\ge 1\), \(\eta _{n}\) is independent of \(\{\zeta _{n,j}, j\ge 1\}\) and \(N_{n}\). Define
then for any sequence \(\epsilon _{n}\) of positive constants one has
Proof
Let \(\Phi (x)\) be the distribution function of the standard normal distribution. By Lemma 4.2 and the independence of \(\eta _{n}\) and \(\{\zeta _{n,j}, j\ge 1\}\) one has
But,
where \(\zeta \) has the standard normal distribution and is independent of \(\eta _{n}\). By (4.3) and (4.4), one has
Now for any \(\epsilon _{n}>0\)
Similarly, we have
or equivalently,
Also, by the mean value theorem, we know that
Similarly,
By (4.6)–(4.9) and Markov's inequality one has
By (4.5) and (4.10) one obtains (4.2). We complete the proof of Lemma 4.3. \(\square \)
Lemma 4.4
Let \(\{Z_{n}, n\ge 0\}\) be an IGWVE. If (1.2)–(1.5) are satisfied, then there exist constants \(\{C_{r},r\ge 1\}\) and \(D\) such that for any sequence \(\{\epsilon _{n},n\ge 1\}\) of positive constants,
Proof
By Lemma 3.1 one has
where \(\{X_{n,r}^{(j)}; j\ge 1\}\) are i.i.d. and are independent of \(Z_{n}\) and \(Y_{n,r}\). Furthermore,
Since \(Y_{n,r}\) is independent of \(Z_{n}\), by Lemma 4.3 and (1.5) we obtain (4.11).
Similarly, by Lemma 3.4 one has
where \(\{V_{n}^{(j)}; j\ge 1\}\) are i.i.d. and are independent of \(Z_{n}\) and \(I_{n}\). Furthermore,
Since \(I_{n}\) is independent of \(Z_{n}\), by Lemma 4.3 and (1.5) we obtain (4.12). \(\square \)
Proof of Theorem 1.4
Note that by Lemma 3.1, for any \(n\ge 1,r\ge 1\), one has
and by Lemma 3.4, for any \(n\ge 1\) we have
Taking \(\epsilon _{n}=\left[ E\left( Z_{n}^{-1/2}\right) \right] ^{\frac{1}{2}}\) in (4.11) and (4.12) and \(C=c/d+1/2\), one obtains Theorem 1.4. \(\square \)
5 The LIL for IGWVE
In order to prove Theorem 1.5, we need the following lemmas.
Lemma 5.1
For any \(\varepsilon >0\) one has
Proof
For any \(\varepsilon >0\), one has
where \(m_{0}=\left[ (1+\varepsilon )(2\log n)^{\frac{1}{2}}\right] -1\) and \(N=\frac{1}{\sqrt{2\pi }}e^{-\frac{m_{0}^{2}}{2}}\). When \(n\) is large enough one has \(\frac{1}{\sqrt{e^{m_{0}}}}<\frac{1}{2}\), so
so we obtain (5.1).
where \(c\) is some positive constant. We obtain (5.2) from (5.5). \(\square \)
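The estimates behind Lemma 5.1 are the standard Gaussian tail (Mills' ratio) bounds: for \(x>0\), \(\varphi (x)(x^{-1}-x^{-3})\le 1-\Phi (x)\le \varphi (x)/x\); taking \(x=(1+\varepsilon )(2\log n)^{1/2}\) in the upper bound gives a tail of order \(n^{-(1+\varepsilon )^{2}}/\sqrt{\log n}\), which is summable in \(n\). A quick numeric check of the bounds (function names are ours):

```python
import math

def phi_density(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def normal_tail(x):
    """1 - Phi(x), computed stably via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Mills-ratio bounds: phi(x)(1/x - 1/x^3) <= 1 - Phi(x) <= phi(x)/x for x > 0.
# With x = (1 + eps) * sqrt(2 log n), the upper bound is of order
# n^{-(1+eps)^2} / sqrt(log n), hence summable, which is what Lemma 5.1 needs.
checks = [
    phi_density(x) * (1.0 / x - 1.0 / x ** 3)
    <= normal_tail(x)
    <= phi_density(x) / x
    for x in (0.5, 1.0, 2.0, 4.0, 6.0)
]
```

Both bounds hold at every tested point, and the upper bound becomes sharp as \(x\) grows, which is what drives the Borel–Cantelli arguments below.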
Lemma 5.2
Let \(\{Z_{n}; n\ge 0\}\) be an IGWVE; then for any fixed \(r\ge 1\), \(\{Z_{rn}; n\ge 0\}\) is an IGWVE. If (1.2)–(1.5) are satisfied, then
where \(C_{r}\) and \(C\) are defined in Theorem 1.4.
Proof
For any fixed \(r\ge 1\), let \(Z'_{n}=Z_{rn}\), by Lemma 3.1 one has
and for any \(n\ge 0 \), \(\{X_{rn,r}^{(j)},j\ge 1\}\) are i.i.d. By the proof of Lemma 3.1 one may see that \(\{X_{rn,r}^{(j)},j\ge 1\}\) are independent of \(\{Z'_{1},Z'_{2},\cdots ,Z'_{n}\}\) and \(Y_{rn,r}\), that \(\{Y_{rn,r},n\ge 0\}\) are independent, and that for any \(n\ge 1\), \(Y_{rn,r}\) is independent of \(Z'_{n}\); so \(\{Z_{rn}; n\ge 0\}\) is indeed an IGWVE.
(5.6) is derived from Theorem 1.4. \(\square \)
Proof of Theorem 1.5
We only need to prove the \(\limsup \) part because of symmetry. Note that (1.2) and (1.4) are satisfied by (1.9). According to Theorem 1.4 and the assumption that \(\sum _{n=0}^{\infty }\left[ E \left( Z_{n}^{-\frac{\delta }{2}}\right) \right] ^{\frac{1}{2}}<\infty \) one has
By Lemma 5.1, one has that for any \(\varepsilon >0\),
so applying (5.7) one obtains
Using (5.8) and the Borel–Cantelli lemma one has
and
Fix \(r=1\). For any \(0<\varepsilon <1, n\ge 2\), define
we know that \(A_{n}\in \sigma (Z_{1},Z_{2},\cdots ,Z_{n})\). Observe that
where the last inequality is derived from Lemma 4.3 when \(N_{n}\) is a constant. By Lemma 5.1,
According to the assumption that \(\sum _{n=0}^{\infty }\left[ E \left( Z_{n}^{-\frac{\delta }{2}}\right) \right] ^{\frac{1}{2}}<\infty \), (5.11) and (5.12) one has
then by the extension of the Borel–Cantelli lemma ([16], Corollary 7.20) one has \(P(A_{n}~i.o.)=1\), which implies
Given \(r>1\), we use \(Z'_{n}\) to denote \(Z_{rn}\). Then by Lemma 5.2 one has that \(\{Z'_{n},n\ge 0\}\) is an IGWVE. Applying the method for the proof of the case \(r=1\) to the process \(\{Z'_{n},n\ge 0\}\),
which means
But for any fixed \(r>1,\)
Finally, we derive (1.10) from (5.9) and (5.14).
According to Theorem 1.4 one has
but Lemma 5.1 tells us that for any \(\varepsilon >0\)
so by (5.15) one has
Thus by the Borel–Cantelli lemma,
As for the lower bound, first note that
but by (1.10) we have
recalling the assumptions that for any \(k\ge 1\), \(\alpha \le \mu _{k}\le \beta \) and \(\gamma ^{2}\le \delta _{k}^{2}\le \tau ^{2}\), and the definition of \(\sigma _{n,r}\) and \(\sigma _{n}\), one derives
where \(K_1\) and \(K_2\) are two positive constants. Now, we consider \(I_{2}\). One sees that
so according to (5.21) we have
but \(\sup _{n}\frac{1}{m_{n,r}}\le \alpha ^{-r}\) and
Since \(r\) is arbitrary, by (5.18)–(5.23) one has
which implies
\(\square \)
References
Afanasyev, V.I.: On the maximum of a subcritical branching process in a random environment. Stoch. Process. Appl. 93, 87–107 (2001)
Afanasyev, V.I., Geiger, J., Kersting, G., Vatutin, V.A.: Functional limit theorems for strongly subcritical branching process in random environment. Stoch. Process. Appl. 115, 1658–1676 (2005)
Afanasyev, V.I., Geiger, J., Kersting, G., Vatutin, V.A.: Criticality for branching process in random environment. Ann. Probab. 33(2), 645–673 (2005)
Ash, R.B.: Real Analysis and Probability. Academic Press, New York (1972)
Athreya, K.B., Karlin, S.: Branching processes with random environments, I: extinction probabilities. Ann. Math. Stat. 42(5), 1499–1520 (1971)
Athreya, K.B., Karlin, S.: Branching processes with random environments, II: limit theorems. Ann. Math. Stat. 42(6), 1843–1858 (1971)
Chow, Y.S., Teicher, H.: Probability Theory, 3rd edn. Springer-Verlag, New York (1997)
Fearn, D.H.: Galton-Watson processes with generation dependence. Proc. Sixth Berkeley Symp. Math. Stat. Probab. 2, 159–172 (1971)
Hall, P., Heyde, C.C.: Martingale Limit Theory and its Application. Academic Press, New York (1980)
Heyde, C.C.: A rate of convergence result for the supercritical Galton-Watson process. J. Appl. Probab. 7(2), 451–454 (1970)
Heyde, C.C.: Some central limit analogues for the supercritical Galton-Watson process. J. Appl. Probab. 8(1), 52–59 (1971)
Heyde, C.C.: Some almost sure convergence theorems for branching processes. Z. W. Verw. Geb. 20(1), 189–192 (1971)
Heyde, C.C., Seneta, E.: Analogues of classical limit theorems for the supercritical Galton-Watson Process with immigration. Math. Biosci. 11, 249–259 (1971)
Hu, D.H.: The existence and moments of canonical branching chain in random environment. Acta Math. Sci. 24B(3), 499–506 (2004)
Hu, D.H.: Infinitely dimensional control Markov branching chains in random environments. Sci. China Ser. A Math. 49(1), 27–53 (2006)
Kallenberg, O.: Foundations of Modern Probability, 2nd edn. Science Press, Beijing (2006)
Lindvall, T.: Almost sure convergence of branching processes in varying and random environments. Ann. Probab. 2(2), 344–346 (1974)
Durrett, R.: Probability: Theory and Examples, 3rd edn. Thomson Brooks/Cole, Belmont (2005)
Wang, W.G.: Bounds of deviation for branching chains in random environments. Acta Math. Sin. 27(5), 897–904 (2011)
Wang, W.G., Li, Y., Hu, D.H.: Extinction of population-size-dependent branching chains in random environments. Acta Math. Sci. 30B(4), 1065–1072 (2010)
Wang, W.G., Lv, P., Hu, D.H.: Harmonic moments of branching processes in random environments. Acta Math. Sin. 25(7), 1087–1096 (2009)
Yu, J.H., Pei, J.L.: Extinction of branching process in varying environments. Stat. Probab. Lett. 79, 1872–1877 (2009)
Acknowledgments
The authors would like to thank the referee for his (her) valuable suggestions, which improved the manuscript. We also thank Professor Xiaoyu Hu for her help and careful reading of several early manuscripts. The first author is financially supported by the Youth Foundation and Doctor's Initial Foundation of Qufu Normal University (XJ201113, BSQD20110127).
Communicated by Anton Abdulbasah Kamil.
Gao, Z., Zhang, Y. Limit Theorems for a Galton–Watson Process with Immigration in Varying Environments. Bull. Malays. Math. Sci. Soc. 38, 1551–1573 (2015). https://doi.org/10.1007/s40840-014-0085-x
Keywords
- Central limit theorem
- The law of the iterated logarithm
- Galton–Watson branching process with immigration
- Varying environment