Abstract
The so-called supOU processes, namely the superpositions of Ornstein–Uhlenbeck type processes, are stationary processes for which one can specify the marginal distribution and the temporal dependence structure separately. They can have finite or infinite variance. We study the limit behavior of suitably normalized integrated infinite variance supOU processes. Depending on the circumstances, the limit can be fractional Brownian motion, but it can also be a process with infinite variance: a stable Lévy process with independent increments or a stable process with dependent increments. We show that it is even possible for infinite variance integrated supOU processes to converge to processes whose moments are all finite. A number of examples are provided.
1 Introduction
SupOU processes, defined below, are superpositions of stationary Ornstein–Uhlenbeck processes driven by a Lévy process. They were studied extensively by Barndorff-Nielsen and his collaborators [2, 5, 6, 7]. An attractive feature of supOU processes is that they allow the marginal distribution and the temporal dependence structure to be modeled independently.
The supOU process is defined as follows: It is a strictly stationary process \(X=\{X(t), \, t\in {{\mathbb {R}}}\}\) represented by the stochastic integral [2]
where \(\varLambda \) is a homogeneous infinitely divisible random measure (Lévy basis) on \({{\mathbb {R}}}_+ \times {{\mathbb {R}}}\), with cumulant function for \(A \in {\mathcal {B}} \left( {{\mathbb {R}}}_+ \times {{\mathbb {R}}}\right) \)
The control measure \(m=\pi \times \mathrm {Leb}\) is the product of a probability measure \(\pi \) on \({{\mathbb {R}}}_+\) and the Lebesgue measure on \({{\mathbb {R}}}\). The probability measure \(\pi \) “randomizes” the rate parameter \(\xi \), and the Lebesgue measure is associated with the moving average variable s. Finally, \(\kappa _{L}\) in (2) is the cumulant function \(\kappa _{L} (\zeta )= \log {\mathbb {E}}e^{i \zeta L(1) }\) of some infinitely divisible random variable L(1) with Lévy–Khintchine triplet \((a,b,\mu )\), i.e.,
The Lévy process \(L=\{L(t), \, t\ge 0\}\) associated with the triplet \((a,b,\mu )\) is called the background driving Lévy process, and the quadruple
is referred to as the characteristic quadruple.
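For the reader's convenience, the omitted displays (1)–(3) presumably take the standard forms used in the supOU literature (cf. [2] and Grahovac et al. [13]); in particular, the truncation convention in the Lévy–Khintchine formula may differ from the paper's:

$$\begin{aligned} X(t) = \int _{{{\mathbb {R}}}_+} \int _{{{\mathbb {R}}}} e^{-\xi (t-s)} {\mathbf {1}}_{[0,\infty )}(t-s) \, \varLambda (\mathrm {d}\xi , \mathrm {d}s), \end{aligned}$$

$$\begin{aligned} C\left\{ \zeta \ddagger \varLambda (A)\right\} = \log {\mathbb {E}} e^{i \zeta \varLambda (A)} = m(A) \, \kappa _{L}(\zeta ), \end{aligned}$$

$$\begin{aligned} \kappa _{L}(\zeta ) = i\zeta a - \frac{\zeta ^2}{2} b + \int _{{{\mathbb {R}}}} \left( e^{i\zeta x} - 1 - i\zeta x {\mathbf {1}}_{[-1,1]}(x) \right) \mu (\mathrm {d}x). \end{aligned}$$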
The marginal distribution of X is determined by L, while the dependence structure is controlled by the probability measure \(\pi \). Indeed, if \({\mathbb {E}}X(t)^2<\infty \), then the correlation function of X is the Laplace transform of \(\pi \):
More details about supOU processes can be found in [2, 3, 4, 5, 12].
The integrated supOU process \(X^*=\{X^*(t), \, t \ge 0\}\) defined by
has a complex asymptotic behavior. We have shown in Grahovac et al. [13] that when the supOU process has finite variance, different types of limits of the integrated process can occur depending on the specific structure of the process. In this paper, we study what happens when the supOU process has infinite variance. We show that again different limits can occur, depending in particular on how heavy the tails of the supOU process are. We show that it is possible for an infinite variance process to converge to a process with all moments finite.
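The omitted definition of the integrated process is the standard one:

$$\begin{aligned} X^*(t) = \int _0^t X(s) \, \mathrm {d}s, \qquad t \ge 0. \end{aligned}$$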
Our results may be of particular interest in financial econometrics where supOU processes are used as stochastic volatility models, and hence the integrated process \(X^*\) represents the integrated volatility (see, e.g., Barndorff-Nielsen and Stelzer [6]). The limiting behavior is also important for statistical estimation (see Stelzer et al. [27] and Curato and Stelzer [9]). In Grahovac et al. [13], it has been shown that integrated supOU processes may exhibit an interesting phenomenon of intermittency which may be relevant for applications in turbulence (see, e.g., Zel’dovich et al. [28]).
When the supOU process \(\{X(t), \, t\in {{\mathbb {R}}}\}\) has finite variance, four different limiting processes may be obtained depending on the elements of the characteristic quadruple, namely
-
Brownian motion,
-
fractional Brownian motion,
-
a stable Lévy process, and
-
a stable process with dependent increments defined in (18) below.
The type of limit depends on whether a Gaussian component is present in (4), on a parameter \(\alpha \) quantifying dependence and on a parameter \(\beta \) quantifying the growth of the Lévy measure \(\mu \) in (4) near the origin.
We show in this paper that when the supOU process \(\{X(t), \, t\in {{\mathbb {R}}}\}\) has infinite variance, the limiting behavior additionally depends on the regular variation index \(\gamma \) of the marginal distribution. As the limit, one can obtain
-
a stable Lévy process,
-
a stable process with dependent increments defined in (18) below, and
-
fractional Brownian motion.
We provide examples to illustrate the results.
The paper is organized as follows. In Sect. 2, we list the assumptions used for our results. Section 3 contains the main results, and in Sect. 4 examples are provided. All the proofs are contained in Sect. 5.
2 Basic Assumptions
Before stating the main results, we introduce some notation and basic assumptions.
2.1 Preliminaries
A random variable Z with an infinite variance stable distribution \({\mathcal {S}}_\gamma (\sigma , \rho , c)\) and parameters \(0<\gamma <2\), \(\sigma >0\), \(-1\le \rho \le 1\) and \(c\in {{\mathbb {R}}}\) has a cumulant function of the form
where
For simplicity of the exposition, wherever it applies we will assume Z is symmetric (\(\rho =0\)) when \(\gamma =1\), and hence we can write
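The omitted cumulant function is presumably the standard parameterization: for \(\gamma \ne 1\),

$$\begin{aligned} \kappa _{{\mathcal {S}}_\gamma (\sigma , \rho , c)}(\zeta ) = i \zeta c - \sigma ^{\gamma } |\zeta |^{\gamma } \left( 1 - i \rho \, {{\,\mathrm{sign}\,}}(\zeta ) \tan \frac{\pi \gamma }{2} \right) , \end{aligned}$$

while for \(\gamma =1\) the symmetry assumption \(\rho =0\) removes the usual logarithmic term, leaving \(\kappa (\zeta ) = i\zeta c - \sigma |\zeta |\).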
We shall make a number of basic assumptions.
2.2 Domain of Attraction
We suppose that the marginal distribution of the supOU process \(\{X(t), \, t\in {{\mathbb {R}}}\}\) in (1) belongs to the domain of attraction of a stable law, that is, X(1) has balanced regularly varying tails:
for some \(p,q \ge 0\), \(p+q>0\), \(0<\gamma <2\) and some slowly varying function k. If \(\gamma =1\), we assume \(p=q\). In particular, the variance is infinite. Moreover, when the mean is finite, that is, when \(\gamma >1\), we assume \({\mathbb {E}}X(1)=0\). These assumptions imply that X(1) is in the domain of attraction of an \({\mathcal {S}}_\gamma (\sigma , \rho , 0)\) law with (Ibragimov and Linnik [15, Theorem 2.6.1])
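The omitted balanced regular variation condition (8) presumably reads, in its standard form with k and \(\gamma \) as stated:

$$\begin{aligned} P(X(1)>x) \sim p \, k(x) x^{-\gamma } \quad \text {and} \quad P(X(1)\le -x) \sim q \, k(x) x^{-\gamma }, \qquad x \rightarrow \infty . \end{aligned}$$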
Now consider the Lévy process \(\{L(t),\, t \ge 0\}\) introduced in Sect. 1. By Fasen and Klüppelberg [11, Proposition 3.1], the tail of the distribution function of X(1) is asymptotically equivalent to the tail of the background driving Lévy process at \(t=1\), that is, of L(1). More precisely, as \( x\rightarrow \infty \)
Hence, (8) implies
and L(1) is in the domain of attraction of the stable distribution \({\mathcal {S}}_\gamma (\gamma ^{1/\gamma } \sigma , \rho , 0)\). Note that the scale parameter \(\sigma \) of X(1) yields a scale parameter \(\gamma ^{1/\gamma } \sigma \) for L(1).
The normalizing sequence in some of the limit theorems below involves the de Bruijn conjugate of a slowly varying function (Bingham et al. [8, Subsection 1.5.7]). Recall that the de Bruijn conjugate of some slowly varying function h is a slowly varying function \(h^{\#}\) such that
as \(x\rightarrow \infty \). By Bingham et al. [8, Theorem 1.5.13], such a function always exists and is unique up to asymptotic equivalence.
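The defining relation of the de Bruijn conjugate, omitted above, is the standard one:

$$\begin{aligned} h(x) h^{\#}\left( x h(x)\right) \rightarrow 1 \quad \text {and} \quad h^{\#}(x) h\left( x h^{\#}(x)\right) \rightarrow 1, \qquad x \rightarrow \infty . \end{aligned}$$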
2.3 Dependence Structure
The second set of assumptions deals with the temporal dependence structure dictated by the behavior near the origin of the probability measure \(\pi \) in the characteristic quadruple (4). We will assume that the probability measure \(\pi \) is regularly varying at zero, that is, for some \(\alpha >0\) and some slowly varying function \(\ell \)
To simplify the proofs of some of the results below, we will assume that \(\pi \) has a density p which is monotone on \((0,x')\) for some \(x'>0\) so that (12) implies
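Consistent with the form of the density used later in the proofs (see the proof of Lemma 2), the omitted conditions (12) and (13) presumably read:

$$\begin{aligned} \pi ([0,x]) \sim \ell (x^{-1}) x^{\alpha } \quad \text {and} \quad p(x) \sim \alpha \, \ell (x^{-1}) x^{\alpha -1}, \qquad x \rightarrow 0. \end{aligned}$$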
To see how this affects dependence, note that if the variance is finite \({\mathbb {E}}X(t)^2<\infty \), then (5) and (12) imply that the correlation function satisfies (Fasen and Klüppelberg [11, Proposition 2.6])
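By Karamata's Tauberian theorem applied to the Laplace transform of \(\pi \), the omitted asymptotics presumably take the form

$$\begin{aligned} r(\tau ) \sim \varGamma (1+\alpha ) \, \ell (\tau ) \tau ^{-\alpha }, \qquad \tau \rightarrow \infty . \end{aligned}$$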
Hence, if \(\alpha \in (0,1)\), the correlation function is not integrable, and the finite variance supOU process may be said to exhibit long-range dependence. On the other hand, the behavior of \(\pi \) at infinity does not affect the decay of correlations, which depends only on the asymptotics of \(\pi \) near zero. To simplify the presentation of the results, we shall assume that
2.4 Behavior of the Lévy Measure at the Origin
Unlike in classical limit theorems, the limiting distribution of the integrated supOU process does not depend only on the tails of the marginal distribution and on the dependence structure. The third component affecting the limit is the growth of the Lévy measure \(\mu \) near the origin. We will quantify this growth by assuming a power law behavior of the Lévy measure near the origin. Let
denote the tails of \(\mu \). We will assume that there exist \(\beta \ge 0\), \(c^+, c^- \ge 0\), \(c^++c^->0\) such that
Since \(\mu \) is a Lévy measure, we must have \(\beta <2\). If (15) holds, then \(\beta \) is the Blumenthal–Getoor index of the Lévy measure \(\mu \) defined by (see Grahovac et al. [13])
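The Blumenthal–Getoor index in (16) is defined in the usual way:

$$\begin{aligned} \beta _{BG} = \inf \left\{ \eta \ge 0 : \int _{|x| \le 1} |x|^{\eta } \, \mu (\mathrm {d}x) < \infty \right\} . \end{aligned}$$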
Note that by [20, Lemma 7.15], \(M^+(x) \sim P(L(1)>x)\) and \(M^-(x) \sim P(L(1)\le -x)\) as \(x \rightarrow \infty \), and hence we can express (11) equivalently as
Making assumptions only on the value of the Blumenthal–Getoor index \(\beta _{BG}\) is more general than assuming (15). For example, in the geometric stable example in Sect. 4.4 below, the mass of the Lévy measure near the origin increases at a logarithmic rate, and hence (15) does not hold but \(\beta _{BG}=0\). Certain parts of our main results below require only assumptions on the value of the Blumenthal–Getoor index and not (15) (see Remark 1).
The condition (15) may be equivalently stated in terms of the Lévy measure of X(1). Indeed, if \(\nu \) is the Lévy measure of X(1), then (15) is equivalent to
See Grahovac et al. [13] for details.
3 Main Results
Before stating the main theorems, let us review the parameters introduced in the previous section:
-
\(\gamma \in (0,2)\) defined in (8) is the regular variation index of the marginal distribution,
-
\(\alpha \in (0,\infty )\) defined in (13) quantifies the strength of dependence, and
-
\(\beta \in [0,2)\) defined in (15) is the power law exponent of the Lévy measure \(\mu \) near the origin.
The resulting limiting process depends on the interplay between the parameters \(\alpha \), \(\beta \), and \(\gamma \). In the next theorem, the process \(\{X(t), \, t\in {{\mathbb {R}}}\}\) has no Gaussian component. Here and in what follows, \(\{\cdot \} \overset{fdd}{\rightarrow } \{\cdot \}\) denotes the convergence of finite dimensional distributions.
Theorem 1
Suppose that the supOU process \(\{X(t), \, t\in {{\mathbb {R}}}\}\) is such that
-
\(b=0\) and thus has no Gaussian component,
-
the marginal distribution satisfies (8) with \(0<\gamma <2\),
-
the behavior at the origin of the Lévy measure \(\mu \) is given by (15) with \(0\le \beta <2\), and
-
\(\pi \) has a density p satisfying (13) with \(\alpha >0\) and some slowly varying function \(\ell \) and (14) holds.
Then the following holds:
-
(I)
If \(\gamma <1+\alpha \), then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1/\gamma } k^{\#}(T)^{1/\gamma }} X^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ L_{\gamma } (t) \right\} , \end{aligned}$$where k is the slowly varying function in (8), \(k^{\#}\) is the de Bruijn conjugate of \(1/k\left( x^{1/\gamma }\right) \) and the limit \(\{L_{\gamma }\}\) is a \(\gamma \)-stable Lévy process such that \(L_{\gamma }(1)\overset{d}{=} {\mathcal {S}}_\gamma ({\widetilde{\sigma }}_{1,\gamma }, \rho , 0)\) with
$$\begin{aligned} {\widetilde{\sigma }}_{1,\gamma } = \sigma \left( \gamma \int _0^\infty \xi ^{1-\gamma } \pi (\mathrm{d}\xi ) \right) ^{1/\gamma }, \end{aligned}$$and \(\sigma \) and \(\rho \) are given by (9).
-
(II)
If \(\gamma >1+\alpha \), then the limit depends on the value of \(\beta \), as follows.
-
(II.a)
If \(\beta <1+\alpha \), then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1/(1+\alpha )} \ell ^{\#}\left( T \right) ^{1/(1+\alpha )}} X^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ L_{1 + \alpha } (t) \right\} , \end{aligned}$$where the limit \(\{L_{1+\alpha }\}\) is a \((1+\alpha )\)-stable Lévy process such that \(L_{1+\alpha }(1)\overset{d}{=} {\mathcal {S}}_{1+\alpha } ({\widetilde{\sigma }}, {\widetilde{\rho }}, 0)\) with
$$\begin{aligned} {\widetilde{\sigma }} = \left( {\widetilde{\sigma }}_{1,\beta }^{1+\alpha } + {\widetilde{\sigma }}_{2,\alpha }^{1+\alpha } \right) ^{1/(1+\alpha )}, \qquad {\widetilde{\rho }} = \frac{{\widetilde{\rho }}_{1,\beta } {\widetilde{\sigma }}_{1,\beta }^{1+\alpha } + {\widetilde{\rho }}_{2,\alpha } {\widetilde{\sigma }}_{2,\alpha }^{1+\alpha }}{{\widetilde{\sigma }}_{1,\beta }^{1+\alpha } + {\widetilde{\sigma }}_{2,\alpha }^{1+\alpha }}, \end{aligned}$$with \({\widetilde{\sigma }}_{1,\beta }\) and \({\widetilde{\rho }}_{1,\beta }\) defined in Lemma 2 and \({\widetilde{\sigma }}_{2,\alpha }\) and \({\widetilde{\rho }}_{2,\alpha }\) defined in Lemma 4 below.
-
(II.b)
If \(1+\alpha <\beta \), then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1-\alpha /\beta } \ell (T)^{1/\beta }} X^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ Z_{\alpha , \beta } (t) \right\} , \end{aligned}$$where \(\{Z_{\alpha , \beta }\}\) is a process with the stochastic integral representation
$$\begin{aligned} Z_{\alpha , \beta }(t) = \int _{{{\mathbb {R}}}_+} \int _{{{\mathbb {R}}}} \left( {\mathfrak {f}}(\xi , t-s) - {\mathfrak {f}}(\xi , -s) \right) K(\mathrm{d}\xi , \mathrm{d}s), \end{aligned}$$(18)\({\mathfrak {f}}\) is given by
$$\begin{aligned} {\mathfrak {f}}(x,u) = {\left\{ \begin{array}{ll} 1-e^{-xu}, &{} \mathrm{if}\,\, x>0\,\, \mathrm{and}\,\, u>0,\\ 0, &{} \mathrm{otherwise}, \end{array}\right. } \end{aligned}$$(19)and K is a \(\beta \)-stable Lévy basis on \({{\mathbb {R}}}_+ \times {{\mathbb {R}}}\) with control measure \(\alpha \xi ^{\alpha } \mathrm{d}\xi \mathrm{d}s\) such that \(C\left\{ \zeta \ddagger K(A)\right\} = \kappa _{{\mathcal {S}}_{\beta } ({\widetilde{\sigma }}_{2,\beta }, {\widetilde{\rho }}_{2,\beta }, 0)}(\zeta )\) with
$$\begin{aligned} {\widetilde{\sigma }}_{2,\beta } = \left( \frac{\varGamma (2-\beta )}{1-\beta } (c^-+c^+) \cos \left( \frac{\pi \beta }{2}\right) \right) ^{1/\beta }, \qquad {\widetilde{\rho }}_{2,\beta } = \frac{c^- - c^+}{c^-+c^+}, \end{aligned}$$and \(c^-, c^+\) as in (15). The limit process \(\{Z_{\alpha , \beta }\}\) has stationary increments and is self-similar with index \(H=1-\alpha /\beta \in (1/\beta , 1)\).
Remark 1
We note that for the proof of Theorem 1(I) when \(\gamma <1\), one could replace (14) with the assumption that there exists \(\varepsilon >0\) such that
Also, for the proof of Theorem 1(II.a) instead of assuming (15) with \(\beta <1+\alpha \), it is enough to assume that the Blumenthal–Getoor index (16) satisfies \(\beta _{BG}<1+\alpha \).
The first boundary between different limit types in Theorem 1 is given by \(\gamma =1+\alpha \). Formally setting \(\gamma =2\) gives \(\alpha =1\), which corresponds to the boundary between short-range and long-range dependence in the finite variance case (see Grahovac et al. [13]).
In the infinite variance case, the regular variation index \(\gamma \) of the marginal tails seems to play an important role in the limit only when \(\gamma <1+\alpha \). One could say that in this scenario the tails dominate the dependence structure. In the opposite case \(\gamma >1+\alpha \), two classes of stable processes may arise as a limit, either with dependent or independent increments. This depends on the value of parameter \(\beta \).
Note also that if \(\beta<1+\alpha <\gamma \), the limiting process \(L_{1+\alpha }\) has heavier tails than the supOU process whose tails are characterized by \(\gamma \). On the other hand, when \(1+\alpha <\gamma \) and \(1+\alpha <\beta \), the limiting process has \(\beta \)-stable marginals, and hence, depending on whether \(\beta >\gamma \) or \(\beta <\gamma \), the tails of the limit can be lighter or heavier than the tails of the underlying supOU process.
We now consider the case when the Gaussian component is present in the characteristic quadruple, that is, \(b\ne 0\). This is the main difference between Theorems 1 and 2.
Theorem 2
Suppose that the supOU process \(\{X(t), \, t\in {{\mathbb {R}}}\}\) is such that
-
\(b\ne 0\) and thus has a Gaussian component,
-
the marginal distribution satisfies (8) with \(0<\gamma <2\),
-
the behavior at the origin of the Lévy measure \(\mu \) is given by (15) with \(0\le \beta <2\),
-
\(\pi \) has a density p satisfying (13) with \(\alpha >0\) and some slowly varying function \(\ell \) and (14) holds.
Then the following holds:
-
(I)
If \(\alpha > 1\) or if \(\alpha <1\) and \(\gamma < \frac{2}{2-\alpha }\), then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1/\gamma } k^{\#}(T)^{1/\gamma }} X^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ L_{\gamma } (t) \right\} , \end{aligned}$$where the limit \(\{L_{\gamma }\}\) is a \(\gamma \)-stable Lévy process defined as in Theorem 1(I).
-
(II)
If \(\alpha <1\) and \(\gamma > \frac{2}{2-\alpha }\), then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1-\alpha /2 } \ell (T)^{1/2}} X^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ {\widetilde{\sigma }}_{3,\alpha } B_H(t) \right\} , \end{aligned}$$where \(\{ B_H(t)\}\) is standard fractional Brownian motion with \(H=1-\alpha /2\) and \({\widetilde{\sigma }}_{3,\alpha } = b^2/2 \frac{\varGamma (1+\alpha )}{(2-\alpha )(1-\alpha )}\).
When the Gaussian component is present in the characteristic quadruple, the parameter \(\beta \) is irrelevant for the type of the limit process, and there are only two possible limits. One is the Lévy stable motion \(\{L_\gamma (t), \, t\ge 0\}\) that would have been a limit if \(\{X^*(t), \, t\ge 0\}\) had independent increments. The second is the Gaussian fractional Brownian motion. In the first case, the limit has independent but infinite variance increments, and in the second case the limit has dependent increments but their distribution is Gaussian.
Theorem 2 also provides an example of a limit theorem where the aggregated process has infinite variance, but the limiting process is fractional Brownian motion, which has all moments finite.
Figures 1 and 2 illustrate the limiting behavior graphically.
4 Examples
In this section, we list several examples of supOU processes and show how Theorems 1 and 2 apply. In each example, we fix the distribution of the background driving Lévy process, while \(\pi \) may be any absolutely continuous probability measure satisfying (13). For example, \(\pi \) can be the Gamma distribution with density
where \(\alpha >0\). Then
Other examples can be found in Grahovac et al. [12].
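Since the correlation function is the Laplace transform of \(\pi \) (see (5)), the Gamma example can be checked numerically. The following is a sketch under the assumption \(\pi =\mathrm {Gamma}(\alpha ,1)\) as above, in which case the Laplace transform is \((1+\tau )^{-\alpha }\) in closed form:

```python
import math

def gamma_density(x, alpha):
    # density of the Gamma(alpha, 1) mixing measure pi
    return x ** (alpha - 1) * math.exp(-x) / math.gamma(alpha)

def laplace_transform(tau, alpha, n=200_000, upper=50.0):
    # midpoint-rule approximation of  int_0^inf e^{-tau x} p(x) dx;
    # midpoints avoid evaluating the integrable singularity at x = 0
    h = upper / n
    return h * sum(
        math.exp(-tau * (i + 0.5) * h) * gamma_density((i + 0.5) * h, alpha)
        for i in range(n)
    )

alpha, tau = 0.7, 2.0
approx = laplace_transform(tau, alpha)
exact = (1 + tau) ** (-alpha)  # closed-form Laplace transform of Gamma(alpha, 1)
assert abs(approx - exact) < 5e-3
```

The polynomial decay \((1+\tau )^{-\alpha } \sim \tau ^{-\alpha }\) of the resulting correlation is exactly the Tauberian rate dictated by the behavior of \(\pi \) near zero in (13).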
In each of the examples below, we choose a background driving Lévy process such that L(1) has a heavy-tailed distribution satisfying (11) with \(0<\gamma <2\), and either (15) holds or the Blumenthal–Getoor index (16) is known.
Note that by appropriately choosing the background driving Lévy process L, one can obtain any self-decomposable distribution as a marginal distribution of X. Recall that an infinitely divisible random variable X is self-decomposable if its characteristic function \(\phi (\theta )={\mathbb {E}}e^{i\theta X}\), \(\theta \in {{\mathbb {R}}}\), has the property that for every \(c\in (0,1)\) there exists a characteristic function \(\phi _{c}\) such that \(\phi (\theta )=\phi (c\theta )\phi _{c}(\theta )\) for all \(\theta \in {{\mathbb {R}}}\) (see, e.g., Sato [26]). Equivalently, for every \(c \in (0,1)\), there is a random variable \(Y_c\) such that the random variable X has the same distribution as \(cX+Y_c\).
Each of the distributions given in the examples below may be imposed as the distribution of X(t). Indeed, every distribution considered in the following examples is self-decomposable (see the references cited below), and hence there exists a background driving Lévy process generating a supOU process with such a marginal distribution. Furthermore, if (8) holds, then L(1) satisfies (11) by (10). If (17) holds for the Lévy measure of X(1), then this implies (15) for the Lévy measure of L(1). Hence, Theorems 1 and 2 may still be applied, as the conditions on the background driving Lévy process are easily translated into the corresponding conditions on the marginals of the supOU process.
4.1 Compound Poisson Background Driving Lévy Process
Let L be a compound Poisson process with rate \(\lambda >0\) and an infinite variance jump distribution F regularly varying at infinity. More precisely, F satisfies
for some \(0<\gamma <2\) and k slowly varying at infinity. If F has a finite mean, we assume it is zero. Suppose X is a supOU process with background driving Lévy process L and \(\pi \) an absolutely continuous probability measure satisfying (13). The characteristic quadruple (4) is then \((a,0,\mu ,\pi )\) where
Since the Lévy measure is finite, this case corresponds to \(\beta =0\) in (15). Hence, Theorem 1 applies and shows that the limit is a stable Lévy process with index \(\gamma \) if \(\gamma <1+\alpha \) or with index \(1+\alpha \) if \(\gamma >1+\alpha \).
4.2 Stable Background Driving Lévy Process
Let L be a \(\gamma \)-stable Lévy process generating a supOU process X with characteristic quadruple (4) given by \((a,0,\mu ,\pi )\) where
with \(c_1,c_2\ge 0\), \(c_1+c_2>0\) if \(\gamma \ne 1\) and \(c_1=c_2\) if \(\gamma =1\). If \(\alpha >1\), we additionally assume \({\mathbb {E}}X(1)=0\). The Lévy measure satisfies (15) with \(\beta =\gamma \), and from Theorem 1 we conclude that if \(\gamma <1+\alpha \), the limit is a \(\gamma \)-stable Lévy process, while if \(\gamma >1+\alpha \), the limit is the stable process \(Z_{\alpha ,\gamma }\) defined in Theorem 1 (II.b). This type of limiting behavior was obtained by Puplinskaitė and Surgailis [23] for aggregated AR(1) processes with stable marginals.
4.3 Student’s Background Driving Lévy Process
Let L be a Lévy process such that L(1) has Student’s t-distribution given by the density
where \(c\in {{\mathbb {R}}}\) is a location parameter, \(\delta >0\) is a scale parameter and the degrees of freedom \(0<\gamma <2\) correspond to the tail index of the distribution of L(1) as in (11). If \(\gamma >1\), we assume \(c=0\), and hence \({\mathbb {E}}L(1)=0\). The Lévy–Khintchine triplet in (3) is \((c,0,\mu )\) with Lévy measure \(\mu \) absolutely continuous with density
where \(J_{\gamma /2}\) and \(Y_{\gamma /2}\) denote the Bessel functions of the first and the second kind, respectively (see, e.g., Heyde and Leonenko [14]). By Eberlein and Hammerstein [10, Eq. (7.14)] we have
and by using Karamata’s theorem [8, Theorem 1.5.11] it follows that
Hence, \(\beta =1\) in (15). Let \(\pi \) be an absolutely continuous probability measure satisfying (13). Then the characteristic quadruple (4) is \((c,0,\mu ,\pi )\). By Theorem 1, the limits are as in the compound Poisson case, namely a stable Lévy process with index \(\gamma \) if \(\gamma <1+\alpha \) or with index \(1+\alpha \) if \(\gamma >1+\alpha \).
4.4 Geometric Stable Background Driving Lévy Process
A random variable Y has a geometric stable distribution if its characteristic function has the form
where \(\kappa _{{\mathcal {S}}_\gamma (\sigma , \rho , c)}\) is the cumulant function (7) of some stable distribution \({\mathcal {S}}_\gamma (\sigma , \rho , c)\). The case \(\rho =c=0\) yields the so-called Linnik distribution with characteristic function [1, 16]
On the other hand, the geometric stable distribution with \(0<\gamma < 1\), \(\sigma =\cos (\pi \gamma /2)^{1/\gamma }\), \(\rho =1\), and \(c=0\) is known as the Mittag–Leffler distribution (see Kozubowski [17]).
Let L be a Lévy process such that L(1) has a geometric stable distribution. For \(0<\gamma <2\), geometric stable distributions have regularly varying tails with index \(\gamma \) (see, e.g., Kozubowski and Panorska [18]), and hence (11) holds. On the other hand, the mass of the Lévy measure near the origin increases at a logarithmic rate, and hence the Blumenthal–Getoor index (16) is 0 (see Kozubowski et al. [19] for details). Since the characteristic quadruple has no Gaussian component, we conclude from Theorem 1 and Remark 1 that the limit is a stable Lévy process with index \(\gamma \) if \(\gamma <1+\alpha \) or with index \(1+\alpha \) if \(\gamma >1+\alpha \).
5 Proofs
The proofs of Theorems 1 and 2 are based on the Lévy–Itô decomposition of the background driving Lévy process L and the corresponding decomposition of the integrated process \(X^*\). Let \(\mu _1(\mathrm{d}x)={\mathbf {1}}_{|x|>1}(x) \, \mu (\mathrm{d}x)\) and \(\mu _{2}(\mathrm{d}x)={\mathbf {1}}_{|x|\le 1}(x) \, \mu (\mathrm{d}x)\), where \(\mu \) is the Lévy measure of the Lévy process L. Then there exists a modification of the Lévy basis \(\varLambda \) which can be decomposed into \(\varLambda _1\) with characteristic quadruple \((a,0,\mu _1,\pi )\), \(\varLambda _2\) with characteristic quadruple \((0,0,\mu _2,\pi )\), and \(\varLambda _3\) with characteristic quadruple \((0,b,0,\pi )\) (see Pedersen [22], Barndorff-Nielsen and Stelzer [5, Theorem 2.2] and Moser and Stelzer [21]). We assume in what follows that \(\varLambda \) is already a modification with this Lévy–Itô decomposition. Let \(L_1(t)\), \(L_2(t)\), and \(L_3(t)\), \(t\in {{\mathbb {R}}}\), denote the corresponding background driving Lévy processes, which have the following cumulant functions:
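Based on the characteristic quadruples above, the omitted cumulant functions (20) presumably read (with truncation at 1, matching the split of \(\mu \), and the Gaussian convention of (3)):

$$\begin{aligned} \kappa _{L_1}(\zeta )&= i\zeta a + \int _{|x|>1} \left( e^{i\zeta x} - 1 \right) \mu (\mathrm {d}x), \\ \kappa _{L_2}(\zeta )&= \int _{|x| \le 1} \left( e^{i\zeta x} - 1 - i\zeta x \right) \mu (\mathrm {d}x), \\ \kappa _{L_3}(\zeta )&= -\frac{\zeta ^2}{2} b. \end{aligned}$$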
Note that \(L_1\) is a compound Poisson process and \(L_3\) is Brownian motion. Consequently, we can represent X(t) as
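The omitted representation is simply the sum of the independent components corresponding to \(\varLambda _1\), \(\varLambda _2\), and \(\varLambda _3\):

$$\begin{aligned} X(t) = X_1(t) + X_2(t) + X_3(t), \end{aligned}$$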
with \(X_1\), \(X_2\), and \(X_3\) independent. Let \(X^*_1\), \(X^*_2\), and \(X_3^*\) denote the corresponding integrated processes which are independent. To obtain the limiting behavior of the integrated process \(X^*\), we first establish limit theorems for each process \(X^*_1\), \(X^*_2\) and \(X_3^*\), respectively.
5.1 The Process \(X_1^*\)
When the supOU process has finite variance, then
if and only if the correlation function is integrable (see Grahovac et al. [13]). If this is the case, then the integrated process after suitable normalization converges to Brownian motion. When the variance is infinite, then, assuming (8), one may expect a \(\gamma \)-stable Lévy process in the limit.
We first prove this for the compound Poisson component \(X_1^*\). In this setting, the critical condition turns out to be
Note that choosing formally \(\gamma =2\) corresponds to the critical condition (22) in the finite variance case.
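Matching the scale constant \({\widetilde{\sigma }}_{1,\gamma }\) in Theorem 1(I), the critical condition (23) presumably reads

$$\begin{aligned} \int _0^{\infty } \xi ^{1-\gamma } \, \pi (\mathrm {d}\xi ) < \infty , \end{aligned}$$

which, for \(\pi \) regularly varying at zero with index \(\alpha \), holds precisely when \(\gamma < 1+\alpha \); formally setting \(\gamma =2\) recovers the integrability of the correlation function.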
Lemma 1
Suppose that there exists an \(\varepsilon >0\) such that
or
Then as \(T\rightarrow \infty \)
where the limit \(\{L_{\gamma }\}\) is a \(\gamma \)-stable Lévy process with the notation as in Theorem 1(I).
Proof
Let \(0=t_0<t_1<\cdots <t_m\), \(\zeta _1,\dots ,\zeta _m \in {{\mathbb {R}}}\) and \(A_T=T^{1/\gamma } k^{\#}(T)^{1/\gamma }\). By the Cramér–Wold device, it will be enough to prove that
We can rewrite the left-hand side as
and the same can be done for the right-hand side. Hence, it is enough to prove that for arbitrary \(\zeta _1,\dots ,\zeta _m \in {{\mathbb {R}}}\)
By using (1) we have that
with \(\varDelta X^*_{1,1}(Tt_{i})\) and \(\varDelta X^*_{1,2}(Tt_{i})\) independent. Moreover, \(\varDelta X^*_{1,2}(Tt_{i})\), \(i=1,\dots ,m\) are independent, and hence, to prove (26), it will be enough to prove that
Due to stationary increments, it is enough to consider \(t_i=t_1=t\) so that \(t_{i-1}=0\).
We start with the proof of (28). For any \(\varLambda \)-integrable function f on \({{\mathbb {R}}}_+ \times {{\mathbb {R}}}\), one has (see Rajput and Rosinski [24])
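The omitted identity is presumably the standard cumulant formula for integrals with respect to a Lévy basis (cf. Rajput and Rosinski [24]): for \(\varLambda \) with control measure \(m = \pi \times \mathrm {Leb}\),

$$\begin{aligned} C\left\{ \zeta \ddagger \int f \, \mathrm {d}\varLambda \right\} = \int _{{{\mathbb {R}}}_+} \int _{{{\mathbb {R}}}} \kappa _{L}\left( \zeta f(\xi ,s) \right) \mathrm {d}s \, \pi (\mathrm {d}\xi ). \end{aligned}$$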
Using this and the change of variables, we get that
By Ibragimov and Linnik [15, Theorem 2.6.4], the assumption (11) implies that
Hence, for arbitrary \(\delta >0\), in some neighborhood of the origin one has
On the other hand, since \(\left| e^{i\zeta x}-1 \right| \le 2\), we have from (20) that
We can take \(C_3\) large enough so that \(|\kappa _{L_1}(\zeta ) | \le C_3 |\zeta |\) for \(|\zeta | > \varepsilon \), and then
Now we have by using (31)
Now if \(\gamma \in (0,1)\), then by using the inequality \(x^{-1}(1-e^{-x})\le 1\), \(x>0\), and the fact that \(\pi \) is a probability measure we have
as \(T\rightarrow \infty \), since \(\gamma -\delta - 1 + \delta /\gamma < 0\) and \(1-1/\gamma <0\).
If \(\gamma \in (1,2)\), then from the inequality \(x^{-1}(1-e^{-x})\le x^{-a}\) valid for \(x>0\) and \(a \in [0,1]\), we get by taking \(a=a_1:=-(1-\gamma )/(\gamma - \delta )\in (0,1)\) for the first term and \(a=a_2:=\gamma /2-1/(2\gamma )\in (0,1)\) for the second term that
This tends to zero as \(T\rightarrow \infty \) since \(\delta /\gamma -\gamma <0\), \(1-1/\gamma -a_2<0\), and \(\int _0^\infty \xi ^{-a_2} \pi (\mathrm{d}\xi )<\infty \) due to \(-a_2>1-\gamma \).
If \(\gamma =1\), then we can similarly take \(a=a_1=\varepsilon /(\gamma -\delta ) \in (0,1)\) for the first term and \(a=a_2:=\varepsilon \in (0,1)\) for the second term to obtain
This completes the proof of (28).
To prove (29), note that because of (32) we can write
with \({\overline{k}}\) slowly varying at zero such that \({\overline{k}}(\zeta )\sim k(1/\zeta )\) as \(\zeta \rightarrow 0\). From (30) we have
By the definition of \(k^{\#}\), one has (Bingham et al. [8, Theorem 1.5.13])
and due to slow variation of \({\overline{k}}\), for any \(\zeta \in {{\mathbb {R}}}\), \(\xi >0\) and \(s\in (0,t)\), as \(T\rightarrow \infty \)
Hence, if the limit could be passed under the integral in (34), we would get that
which proves (29). To justify taking the limit under the integral, note that by Potter’s bounds [8, Theorem 1.5.6] we have from (35) that for any \(\delta >0\)
for T large enough. By taking \(\delta <\min \{\gamma , \varepsilon \}\), we get
and by the assumptions (24) and (25)
Hence, the dominated convergence theorem can be applied in (34). \(\square \)
We next consider a scenario where (13) holds. If \(\gamma \in (1,2)\), then this implies that (23) does not hold.
Lemma 2
Suppose that \(\pi \) has a density p satisfying (13) with \(\alpha \in (0,1)\) and some slowly varying function \(\ell \). If
then as \(T\rightarrow \infty \)
where \(\ell ^{\#}\) is the de Bruijn conjugate of \(1/\ell \left( x^{1/(1+\alpha )}\right) \) and the limit \(\{L_{1+\alpha }\}\) is the \((1+\alpha )\)-stable Lévy process such that \(L_{1+\alpha }(1)\overset{d}{=} {\mathcal {S}}_{1+\alpha } ({\widetilde{\sigma }}_{1,\alpha }, {\widetilde{\rho }}_1, 0)\) with
and \(c^-_1\) and \(c^+_1\) are given by
Proof
The proof is similar to the proof of Grahovac et al. [13, Theorem 2.2]. As in the proof of Lemma 1, it will be enough to prove that as \(T\rightarrow \infty \)
with \(A_T=T^{1/(1+\alpha )} \ell ^{\#}\left( T \right) ^{1/(1+\alpha )}\). Note that the de Bruijn conjugate \(\ell ^{\#}\) exists by Bingham et al. [8, Theorem 1.5.13] and satisfies
To prove (39), note that we can write \(p(x)=\alpha {\widetilde{\ell }}(x^{-1}) x^{\alpha -1}\) where \({\widetilde{\ell }}(t) \sim \ell (t)\) as \(t \rightarrow \infty \). Now from (31) we have
We have assumed \(1+\alpha <\gamma \), hence \(\gamma >1\), and from (33) we get the bound
By using Potter’s bounds [8, Theorem 1.5.6], we have for \(0<\delta <\alpha ^2/(1+\alpha )\)
Now we get that
as \(T\rightarrow \infty \).
We now turn to (40). As in the proof of Lemma 1, we have
Suppose that \(\zeta >0\); the proof is analogous if \(\zeta <0\). Making the change of variables \(x=\zeta A_T^{-1} \xi ^{-1}\) in (43), we get
and \(T/A_T\rightarrow \infty \) as \(T\rightarrow \infty \) implies that
Since \(\ell \) is slowly varying, \(\ell \sim {\widetilde{\ell }}\) and (41) holds, we have
as \(T \rightarrow \infty \). Hence, if the limit could be passed under the integral, we would get that
Let us assume momentarily that (45) holds. Since \(\gamma >1\), we have assumed that the mean is 0, namely \({\mathbb {E}}X_1(1)={\mathbb {E}}L_1(1)= a + \int _{|x|>1} x \mu (\mathrm{d}x)=0\), and hence from (20) we can write \(\kappa _{L_1}\) in the form
By using the relation
valid for \(1<\lambda <2\) (see, e.g., Ibragimov and Linnik [15, Theorem 2.2.2]), we obtain by taking \(\lambda =1+\alpha \) that
where \({\widetilde{\sigma }}_{1,\alpha }\) and \({\widetilde{\rho }}_1\) are given by (37) and \(c_1^-\), \(c_1^+\) by (38). In the last equality \({{\,\mathrm{sign}\,}}(\zeta ) = 1\) since we suppose \(\zeta >0\).
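The relation from Ibragimov and Linnik [15, Theorem 2.2.2] used in this step is not displayed above; for the reader's convenience, a hedged reconstruction of the classical identity (our transcription, to be checked against [15]) reads

```latex
\begin{aligned}
\int _0^\infty \left( e^{i\zeta y}-1-i\zeta y\right) y^{-1-\lambda }\,\mathrm{d}y
= \varGamma (-\lambda )\,|\zeta |^{\lambda }
\left( \cos \frac{\pi \lambda }{2}
- i\, {{\,\mathrm{sign}\,}}(\zeta )\, \sin \frac{\pi \lambda }{2} \right) ,
\qquad 1<\lambda <2,
\end{aligned}
```

so the choice \(\lambda =1+\alpha \) produces the \(|\zeta |^{1+\alpha }\) scaling behind the stable cumulant.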
To complete the proof, we need to justify taking the limit under the integral in (44). We denote \(g_T(\zeta , x,s)=e^{-x^{-1} \frac{\zeta T}{A_T} (t-s)}\) and split \(C \left\{ \zeta \ddagger A_T^{-1} \varDelta X^*_{1,2}(Tt) \right\} \) into two parts:
where
From Potter’s bounds [8, Theorem 1.5.6], for \(0<\delta <\min \left\{ \gamma - 1 - \alpha , \alpha , 1-\alpha \right\} \) there is \(C_1\) such that
Now from (41) we have that for T large enough
and hence
We first show that the dominated convergence theorem may be applied to \(I^{(1)}_{T}\), so that \(I^{(1)}_{T}\) converges to the limit in (45). From (46), by using the inequality
we get that for any \(x \in {{\mathbb {R}}}\),
Moreover, we have
and hence
Now
and it remains to show this integral is finite. Indeed, we have
and
since \(1+\alpha +\delta <\gamma \) and \({\mathbb {E}}|L_1(1)|^{1+\alpha +\delta }<\infty \Leftrightarrow \int _{|y|>1} |y|^{1+\alpha +\delta } \mu _1(\mathrm{d}y) < \infty \).
We next show that \(I^{(2)}_{T} \rightarrow 0\) in (49) as \(T\rightarrow \infty \). Since \({\mathbf {1}}_{[1/2,1]}(g_T(\zeta , x,s)) = {\mathbf {1}}_{\left[ \frac{\zeta (t-s) T}{A_T \log 2 }, \infty \right) }(x)\), we have by using (42)
as \(T\rightarrow \infty \), which completes the proof of (40). \(\square \)
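The de Bruijn conjugate appearing in the normalization \(A_T\) can be illustrated numerically. Below is a sketch for the textbook example \(\ell (x)=\log x\) (our illustrative choice), whose conjugate is asymptotically \(1/\log x\); the defining property \(\ell (x)\,\ell ^{\#}(x\,\ell (x))\rightarrow 1\) is already visible at moderate \(x\).

```python
import math

def ell(x):
    """Example slowly varying function (our illustrative choice)."""
    return math.log(x)

def ell_sharp(x):
    """de Bruijn conjugate of ell; for ell = log it is asymptotically 1/log."""
    return 1.0 / math.log(x)

# Defining property (Bingham et al., Theorem 1.5.13):
# ell(x) * ell_sharp(x * ell(x)) -> 1 as x -> infinity.
vals = [ell(x) * ell_sharp(x * ell(x)) for x in (1e3, 1e6, 1e12, 1e24)]
print(vals)  # increases monotonically toward 1
```

For \(\ell =\log \) the product equals \(\log x/(\log x+\log \log x)\), which increases to 1.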
To summarize the results of this subsection, let us assume that (14) holds (and hence (24) holds) and that \(\pi \) has a density p satisfying (13) with \(\alpha >0\) and some slowly varying function \(\ell \). The limiting behavior is then illustrated in Fig. 3.
5.2 The Process \(X_2^*\)
The background driving Lévy process of \(X_2\) consists only of jumps of magnitude less than or equal to one. The limiting behavior of \(X_2^*\) may depend on the growth of the Lévy measure near the origin.
Note that \({\mathbb {E}}|X_2(t)|^q<\infty \) for any \(q>0\). In particular, the variance is finite and \({\mathbb {E}}X_2(t)=0\). Hence, we obtain the following results as corollaries of Grahovac et al. [13, Theorems 2.4, 2.2 and 2.3], respectively.
Lemma 3
If
then as \(T\rightarrow \infty \)
where \(\{ B(t)\}\) is standard Brownian motion and
Lemma 4
Suppose that \(\pi \) has a density p satisfying (13) with \(\alpha \in (0,1)\) and some slowly varying function \(\ell \), and suppose (15) holds with \(0\le \beta <2\).
-
(i)
If
$$\begin{aligned} \beta <1+\alpha , \end{aligned}$$then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1/(1+\alpha )} \ell ^{\#}\left( T \right) ^{1/(1+\alpha )}} X_2^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ L_{1 + \alpha } (t) \right\} , \end{aligned}$$where \(\ell ^{\#}\) is the de Bruijn conjugate of \(1/\ell \left( x^{1/(1+\alpha )}\right) \) and \(\{L_{1+\alpha }\}\) is the \((1+\alpha )\)-stable Lévy process such that \(L_{1+\alpha }(1)\overset{d}{=} {\mathcal {S}}_{1+\alpha } ({\widetilde{\sigma }}_{2,\alpha }, {\widetilde{\rho }}_{2,\alpha }, 0)\) with
$$\begin{aligned} {\widetilde{\sigma }}_{2,\alpha } = \left( \frac{\varGamma (1-\alpha )}{\alpha } (c^-_2+c^+_2) \cos \left( \frac{\pi (1+\alpha )}{2}\right) \right) ^{1/(1+\alpha )}, \quad {\widetilde{\rho }}_{2,\alpha } = \frac{c^-_2 - c^+_2}{c^-_2+c^+_2}, \end{aligned}$$and \(c^-_2\) and \(c^+_2\) are given by
$$\begin{aligned} c^-_2 = \frac{\alpha }{1+\alpha } \int _{-1}^0 |y|^{1+\alpha } \mu (\mathrm{d}y), \qquad c^+_2 = \frac{\alpha }{1+\alpha } \int _0^1 y^{1+\alpha } \mu (\mathrm{d}y). \end{aligned}$$ -
(ii)
If
$$\begin{aligned} 1+\alpha<\beta <2, \end{aligned}$$then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1-\alpha /\beta } \ell (T)^{1/\beta }} X_2^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ Z_{\alpha , \beta } (t) \right\} , \end{aligned}$$where the limit \(\{Z_{\alpha , \beta }\}\) is a process defined as in Theorem 1(I).
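To make the constants in Lemma 4(i) concrete, the following sketch evaluates \(c^-_2\), \(c^+_2\) and the skewness \({\widetilde{\rho }}_{2,\alpha }\) for a hypothetical Lévy measure \(\mu \) with density \(1/2\) on \([-1,1]\) (our illustrative choice, not taken from the text); its jumps are bounded by one, as required for the component \(X_2\).

```python
alpha = 0.5   # tail index of pi near zero, alpha in (0, 1)

# Hypothetical Lévy measure mu with density 1/2 on [-1, 1] (our choice).
def c2_minus(a):
    # c2^- = a/(1+a) * int_{-1}^{0} |y|^{1+a} * (1/2) dy = a/(1+a) * 1/(2*(2+a))
    return a / (1 + a) * 0.5 / (2 + a)

def c2_plus(a):
    # the measure is symmetric, so c2^+ = c2^-
    return a / (1 + a) * 0.5 / (2 + a)

cm, cp = c2_minus(alpha), c2_plus(alpha)
rho = (cm - cp) / (cm + cp)   # skewness of the stable limit: 0 by symmetry

# crude Riemann-sum check of the closed form for c2^+
n = 100_000
riemann = alpha / (1 + alpha) * sum((k / n) ** (1 + alpha) * 0.5 / n
                                    for k in range(1, n + 1))
assert abs(riemann - cp) < 1e-4
assert rho == 0.0
```

A symmetric measure always yields \({\widetilde{\rho }}_{2,\alpha }=0\); an asymmetric density would skew the stable limit.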
Assuming that (13) and (15) hold, we can summarize the limiting behavior of \(X_2^*\) in Fig. 4. The value \(\alpha =1\) is the boundary between the Gaussian and the infinite variance stable limits.
5.3 The Process \(X_3^*\)
Since \(X_3^*\) is a Gaussian process, the limiting behavior is simple (see Grahovac et al. [13, Theorems 2.1 and 2.4]).
Lemma 5
-
(i)
If
$$\begin{aligned} \int _0^\infty \xi ^{-1} \pi (\mathrm{d} \xi )< \infty , \end{aligned}$$then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1/2}} X_3^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ {\widetilde{\sigma }}_3 B(t) \right\} , \end{aligned}$$where \(\{ B(t)\}\) is standard Brownian motion and \({\widetilde{\sigma }}_3^2= 2 \sigma _3^2 \int _0^\infty \xi ^{-1} \pi (\mathrm{d} \xi )\) with \(\sigma _3^2={\text {Var}}X_3(1)=b/2\).
-
(ii)
Suppose that \(\pi \) has a density p satisfying (13) with \(\alpha \in (0,1)\) and some slowly varying function \(\ell \). Then as \(T\rightarrow \infty \)
$$\begin{aligned} \left\{ \frac{1}{T^{1-\alpha /2 } \ell (T)^{1/2}} X_3^*(Tt) \right\} \overset{fdd}{\rightarrow } \left\{ {\widetilde{\sigma }}_{3,\alpha } B_H(t) \right\} , \end{aligned}$$where \(\{ B_H(t)\}\) is standard fractional Brownian motion with \(H=1-\alpha /2\) and \({\widetilde{\sigma }}_{3,\alpha }^2 = 2 \sigma _3^2 \frac{\varGamma (1+\alpha )}{(2-\alpha )(1-\alpha )}\) with \(\sigma _3^2={\text {Var}}X_3(1)=b/2\).
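The dichotomy in Lemma 5 can be summarized in a few lines of code (a sketch; the function name and interface are ours, and slowly varying factors in the normalization are omitted).

```python
def x3_limit(alpha=None, integrable=False):
    """Limit regime of X_3^* per Lemma 5 (helper name and interface are ours).
    integrable=True means int_0^inf xi^{-1} pi(d xi) < infinity: standard
    Brownian motion, normalization exponent 1/2.  Otherwise, under (13) with
    alpha in (0, 1): fractional Brownian motion with H = 1 - alpha/2, which
    also equals the normalization exponent."""
    if integrable:
        return ("BM", 0.5)
    assert alpha is not None and 0 < alpha < 1
    H = 1 - alpha / 2
    return ("fBm", H)

print(x3_limit(integrable=True))   # ('BM', 0.5)
print(x3_limit(alpha=0.4))         # ('fBm', 0.8)
```

Note that \(H=1-\alpha /2\in (1/2,1)\) for \(\alpha \in (0,1)\), so the fBm limit exhibits long-range dependence.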
5.4 Proofs of Theorems 1 and 2
The limiting behavior of the integrated process \(X^*\) follows by combining the limit theorems for the three components in the decomposition (21). If \(X^*\) consists of at least two nonzero components, then each of these may be suitably normalized to obtain a nontrivial limiting process. However, to obtain the limit of the sum of the three components, namely of the joint process \(X^*\), one has to take the fastest growing of the three normalizations. Hence, the limiting process depends on the orders of the normalizing sequences of the component processes; the interplay between the parameters \(\alpha \), \(\beta \) and \(\gamma \) determines the limit.
Proof (Proof of Theorem 1)
The proof is based on comparing the orders of normalizing sequences. Let \(E_1\) and \(E_2\) denote the exponents of the normalizing sequences for the processes \(X_1^*(Tt)\) and \(X_2^*(Tt)\), respectively.
-
(I)
If \(\gamma <1+\alpha \), then \(E_1=1/\gamma \) by Lemma 1. It is enough to show that \(T^{-1/\gamma }X_2^*(Tt)\overset{P}{\rightarrow }0\) by showing that \(1/\gamma >E_2\).
-
If \(\alpha >1\), then \(E_2=1/2\) by Lemma 3. Since \(\gamma <2\), \(1/\gamma >1/2\).
-
If \(\alpha <1\) and \(\beta <1+\alpha \), then \(E_2=1/(1+\alpha )\) by Lemma 4(i). Since \(\gamma <1+\alpha \), we have \(1/\gamma >1/(1+\alpha )\).
-
If \(\alpha <1\) and \(1+\alpha <\beta \), then \(E_2=1-\alpha /\beta \) by Lemma 4(ii). We have \(1-\alpha /\beta<1+(1-\gamma )/\beta <1+(1-\gamma )/\gamma =1/\gamma \).
-
-
(II)
If \(1+\alpha <\gamma \), then \(E_1=1/(1+\alpha )\) by Lemma 2. Note that implicitly we must have \(\alpha <1\).
-
(II.a)
If \(\beta <1+\alpha \), then \(E_2=1/(1+\alpha )\) by Lemma 4(i). We have \(E_1=E_2\) and the same normalization, and hence the limit is the sum of the independent limits obtained in Lemma 2 and Lemma 4(i). We additionally use Samorodnitsky and Taqqu [25, Property 1.2.1].
-
(II.b)
If \(1+\alpha <\beta \), then \(E_2=1-\alpha /\beta \) by Lemma 4(ii). We have \(1-\alpha /\beta >1-\alpha /(1+\alpha )=1/(1+\alpha )\) since \(1+\alpha <\beta \).
\(\square \)
Proof (Proof of Theorem 2)
The proof follows the same arguments as the proof of Theorem 1.
(I) follows easily from Theorem 1 and Lemma 5. For \(\alpha >1\), the statement follows from the fact that \(1/\gamma >1/2\). If \(\alpha <1\) and \(\gamma <2/(2-\alpha )\), then \(\gamma <1+\alpha \), and hence we need to compare \(1/\gamma \) and \(1-\alpha /2\). But \(1/\gamma >1-\alpha /2 \Leftrightarrow \gamma < 2/(2-\alpha )\). (II) follows similarly. Indeed, if \(2/(2-\alpha )< \gamma < 1+\alpha \), then \(1/\gamma <1-\alpha /2\). If \(\gamma >1+\alpha \), the rate of growth of the normalizing sequence depends on \(\beta \). If \(\beta <1+\alpha \), the order of the normalizing sequence for \(X_1^*(Tt)+X_2^*(Tt)\) is \(1/(1+\alpha )\), and \(1/(1+\alpha )=1-\alpha /(1+\alpha )<1-\alpha /2\). If \(1+\alpha <\beta \), the order of the normalizing sequence for \(X_1^*(Tt)+X_2^*(Tt)\) is \(1-\alpha /\beta <1-\alpha /2\). \(\square \)
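The exponent bookkeeping in the proofs of Theorems 1 and 2 can be sketched as follows (a sketch with our helper names; slowly varying factors are omitted, and boundary cases such as \(\alpha =1\) or \(\beta =1+\alpha \) are excluded).

```python
def normalization_exponents(alpha, beta, gamma, b=0.0):
    """Growth exponents of the normalizing sequences for X_1^*, X_2^*, X_3^*
    as given by Lemmas 1-5; E3 is None when b = 0 (no Gaussian component)."""
    E1 = 1 / gamma if gamma < 1 + alpha else 1 / (1 + alpha)        # Lemmas 1, 2
    if alpha > 1:
        E2 = 0.5                                                    # Lemma 3
    elif beta < 1 + alpha:
        E2 = 1 / (1 + alpha)                                        # Lemma 4(i)
    else:
        E2 = 1 - alpha / beta                                       # Lemma 4(ii)
    E3 = (0.5 if alpha > 1 else 1 - alpha / 2) if b > 0 else None   # Lemma 5
    return E1, E2, E3

def dominant(alpha, beta, gamma, b=0.0):
    """The largest exponent wins: that component drives the limit of X^*."""
    exps = [e for e in normalization_exponents(alpha, beta, gamma, b)
            if e is not None]
    return max(exps)

# e.g. gamma < 1 + alpha: the gamma-stable component X_1^* dominates
assert dominant(alpha=0.5, beta=1.2, gamma=1.3) == 1 / 1.3
```

This reflects the principle stated above: the fastest growing of the three normalizations dictates the limit of the sum.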
References
Bakeerathan, G., Leonenko, N.N.: Linnik processes. Random Oper. Stoch. Equ. 16(2), 109–130 (2008)
Barndorff-Nielsen, O.E.: Superposition of Ornstein–Uhlenbeck type processes. Theory Probab. Appl. 45(2), 175–194 (2001)
Barndorff-Nielsen, O.E., Leonenko, N.N.: Spectral properties of superpositions of Ornstein–Uhlenbeck type processes. Methodol. Comput. Appl. Probab. 7(3), 335–352 (2005)
Barndorff-Nielsen, O.E., Pérez-Abreu, V., Thorbjørnsen, S.: Lévy mixing. ALEA 10(2), 1013–1062 (2013)
Barndorff-Nielsen, O.E., Stelzer, R.: Multivariate supOU processes. Ann. Appl. Probab. 21(1), 140–182 (2011)
Barndorff-Nielsen, O.E., Stelzer, R.: The multivariate supOU stochastic volatility model. Math. Finance 23(2), 275–296 (2013)
Barndorff-Nielsen, O.E., Veraart, A.E.: Stochastic volatility of volatility and variance risk premia. J. Financ. Econ. 11(1), 1–46 (2013)
Bingham, N.H., Goldie, C.M., Teugels, J.L.: Regular Variation, vol. 27. Cambridge University Press, Cambridge (1989)
Curato, I.V., Stelzer, R.: Weak dependence and GMM estimation of supOU and mixed moving average processes. Electron. J. Stat. 13(1), 310–360 (2019)
Eberlein, E., Hammerstein, E.A.V.: Generalized hyperbolic and inverse Gaussian distributions: limiting cases and approximation of processes. In: Seminar on Stochastic Analysis, Random Fields and Applications IV’, pp. 221–264. Springer (2004)
Fasen, V., Klüppelberg, C.: Extremes of supOU processes, In: Stochastic Analysis and Applications: The Abel Symposium 2005’, vol. 2, pp. 339–359. Springer (2007)
Grahovac, D., Leonenko, N.N., Sikorskii, A., Taqqu, M.S.: The unusual properties of aggregated superpositions of Ornstein–Uhlenbeck type processes. Bernoulli 25(3), 2029–2050 (2019)
Grahovac, D., Leonenko, N.N., Taqqu, M.S.: Limit theorems, scaling of moments and intermittency for integrated finite variance supOU processes. Stoch. Process. Appl. (in press) (2019). https://doi.org/10.1016/j.spa.2019.01.010
Heyde, C., Leonenko, N.N.: Student processes. Adv. Appl. Probab. 37(2), 342–365 (2005)
Ibragimov, I., Linnik, Y.V.: Independent and Stationary Sequences of Random Variables. Wolters-Noordhoff, Amsterdam (1971)
Kotz, S., Kozubowski, T.J., Podgorski, K.: The Laplace Distributions and Generalizations. Birkhäuser, Boston (2001)
Kozubowski, T.J.: Fractional moment estimation of Linnik and Mittag–Leffler parameters. Math. Comput. Model. 34(9–11), 1023–1035 (2001)
Kozubowski, T.J., Panorska, A.K.: On moments and tail behavior of \(\nu \)-stable random variables. Stat. Probab. Lett. 29(4), 307–315 (1996)
Kozubowski, T.J., Podgorski, K., Samorodnitsky, G.: Tails of Lévy measure of geometric stable random variables. Extremes 1(3), 367–378 (1998)
Kyprianou, A.E.: Fluctuations of Lévy Processes with Applications: Introductory Lectures. Springer, New York (2014)
Moser, M., Stelzer, R.: Functional regular variation of Lévy-driven multivariate mixed moving average processes. Extremes 16(3), 351–382 (2013)
Pedersen, J.: The Lévy-Itô decomposition of an independently scattered random measure. MaPhySto Research Report. University of Aarhus. http://www.maphysto.dk (2003)
Puplinskaitė, D., Surgailis, D.: Aggregation of a random-coefficient AR(1) process with infinite variance and idiosyncratic innovations. Adv. Appl. Probab. 42(02), 509–527 (2010)
Rajput, B.S., Rosinski, J.: Spectral representations of infinitely divisible processes. Probab. Theory Relat. Fields 82(3), 451–487 (1989)
Samorodnitsky, G., Taqqu, M.S.: Stable Non-Gaussian Random Processes: Stochastic Models with Infinite Variance. CRC Press, Boca Raton (1994)
Sato, K.: Lévy Processes and Infinitely Divisible Distributions. Cambridge University Press, Cambridge (1999)
Stelzer, R., Tosstorff, T., Wittlinger, M.: Moment based estimation of supOU processes and a related stochastic volatility model. Stat. Risk Model. 32(1), 1–24 (2015)
Zel’dovich, Y.B., Molchanov, S., Ruzmaĭkin, A., Sokolov, D.D.: Intermittency in random media. Soviet Phys. Uspekhi 30(5), 353 (1987)
Acknowledgements
This work was supported by a grant from the Simons Foundation/569118MT at Boston University.
Grahovac, D., Leonenko, N.N. & Taqqu, M.S. The Multifaceted Behavior of Integrated supOU Processes: The Infinite Variance Case. J Theor Probab 33, 1801–1831 (2020). https://doi.org/10.1007/s10959-019-00935-8