Correction to: Probab. Theory Relat. Fields (2007) 138:325–361 https://doi.org/10.1007/s00440-006-0025-2

A problem appears at the end of Lemma 8, page 355. The first part of the Lemma remains true, but the second weak convergence result, for the random predictor, is not exact in general. In fact, \(Z_{i,n}\) is not a martingale difference sequence with respect to the filtration generated by \(\left( X_{1},\varepsilon _{1},\ldots ,X_{n},\varepsilon _{n}\right) \), and the CLT mentioned at the end of page 355 cannot be invoked.

A new version of the second part of Lemma 8 is given below. An additional assumption, denoted (1) below, is also required for Theorem 2 to hold; it is satisfied in a wide range of examples and applications.

Lemma 1

Let \(X_{i}=\sum _{l=1}^{+\infty }\sqrt{\lambda _{l}}\xi _{l,i}e_{l}\) be the Karhunen–Loève expansion of \(X_{i}\) given at page 334. Assume that the sequence of squared principal components satisfies the weak law of large numbers: as \(L\) tends to infinity,

$$\begin{aligned} s_{L}=\frac{1}{L}\sum _{l=1}^{L}\xi _{l}^{2}\overset{{\mathbb {P}}}{\rightarrow }1 \end{aligned}$$
(1)

Then the second part of Lemma 8 holds, namely:

$$\begin{aligned} \sqrt{\frac{n}{s_{n}}}\left\langle R_{n},X_{n+1}\right\rangle \overset{w}{\rightarrow }N\left( 0,\sigma _{\varepsilon }^{2}\right) \end{aligned}$$
Remark 1

Assumption (1) holds for Gaussian X and, more generally, when the principal components \(\xi _{l}\) are independent: in that case \({\mathbb {E}}\xi _{l}^{2}=1\) and, by assumption \(\left( A.3\right) \) page 334, \({\mathbb {E}}\xi _{l}^{4}\le M\), so that \(\mathrm {Var}\left( s_{L}\right) \le \left( M-1\right) /L\rightarrow 0\) and Chebyshev's inequality yields (1).
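The Gaussian case of Remark 1 can be illustrated numerically. The sketch below (the function name is ours, for illustration only) draws \(L\) independent standard Gaussian principal components and checks that \(s_{L}\) concentrates around 1, as assumption (1) requires:

```python
import random

def squared_component_mean(L, seed=0):
    """Return s_L = (1/L) * sum_{l=1}^{L} xi_l^2 for L independent
    standard Gaussian principal components xi_l, for which
    E[xi_l^2] = 1 (Gaussian case of Remark 1)."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(L)) / L

# Weak law of large numbers in assumption (1): s_L -> 1 in probability,
# so the printed values approach 1 as L grows.
for L in (100, 10_000, 1_000_000):
    print(L, squared_component_mean(L))
```

For independent components the variance of \(s_{L}\) decays like \(1/L\), so the fluctuations around 1 shrink at rate \(L^{-1/2}\).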

Proof

To clarify notation, we set below \(X_{0}=X_{n+1}\), \(s_{L,i}=\frac{1}{L}\sum _{l=1}^{L}\xi _{l,i}^{2}\), and write \({\mathbb {E}}_{i}\) for the expectation with respect to the couple \(\left( X_{i},\varepsilon _{i}\right) \). The derivation relies on proving the classical pointwise convergence of the characteristic function of \(S_{n}=\frac{1}{\sqrt{ns_{n}}}\sum _{i=1}^{n}Z_{i,n}\), with \(Z_{i,n}=\left\langle \Gamma ^{\dag }X_{i},X_{0}\right\rangle \varepsilon _{i}\).

We prove the result above in the specific case of the PCA spectral cut, for which \(f_{n}\left( x\right) =1/x\) for \(x\ge \lambda _{k_{n}}\) (so that \(s_{n}=k_{n}\)). The reader will check that this does not alter the generality of the statement.
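To make this reduction explicit, note that under the spectral cut \(\Gamma ^{\dag }=\sum _{l=1}^{k_{n}}\lambda _{l}^{-1}\,e_{l}\otimes e_{l}\) and \(\left\langle X_{i},e_{l}\right\rangle =\sqrt{\lambda _{l}}\xi _{l,i}\), so that

$$\begin{aligned} \left\langle \Gamma ^{\dag }X_{i},X_{0}\right\rangle =\sum _{l=1}^{k_{n}}\frac{1}{\lambda _{l}}\left\langle X_{i},e_{l}\right\rangle \left\langle X_{0},e_{l}\right\rangle =\sum _{l=1}^{k_{n}}\xi _{l,i}\,\xi _{l,0}. \end{aligned}$$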

Then, with \(\varphi _{S_{n}}\left( t\right) ={\mathbb {E}}\left( \exp \left( itS_{n}\right) \right) \),

$$\begin{aligned} \varphi _{S_{n}}\left( t\right) = {\mathbb {E}}_{0}\left\{ \prod _{j=1}^{n}{\mathbb {E}}_{j}\exp \left( \frac{it}{\sqrt{nk_{n}}} \left\langle \Gamma ^{\dag }X_{j},X_{0}\right\rangle \varepsilon _{j}\right) \right\} ={\mathbb {E}}_{0}\left\{ {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}}Z_{1,n}\right) \right\} ^{n}. \end{aligned}$$

With the above notation for the Karhunen–Loève expansion of X,

$$\begin{aligned} Z_{1,n}=\varepsilon _{1}\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle =\varepsilon _{1}\sum _{l=1}^{k_{n}}\xi _{l,1}\xi _{l,0}, \end{aligned}$$

where \(\left( \xi _{l,1}\right) _{1\le l\le k_{n}}\) is independent of \(\left( \xi _{l,0}\right) _{1\le l\le k_{n}}\). Simple computations give \({\mathbb {E}}_{1}\left[ Z_{1,n}\right] =0\) and \({\mathbb {E}}_{1}\left[ Z_{1,n}^{2}\right] =\sigma _{\varepsilon }^{2}\sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\), and the Cauchy–Schwarz inequality yields

$$\begin{aligned} {\mathbb {E}}_{1}\left| \left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \right| ^{3}\le \left( \sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\right) ^{3/2} {\mathbb {E}}_{1}\left( \sum _{l=1}^{k_{n}}\xi _{l,1}^{2}\right) ^{3/2}. \end{aligned}$$

Jensen's inequality gives the bound

$$\begin{aligned} \left( \frac{1}{k_{n}}\sum _{l=1}^{k_{n}}\xi _{l,1}^{2}\right) ^{3/2}\le \frac{1}{k_{n}}\sum _{l=1}^{k_{n}}\left| \xi _{l,1}\right| ^{3} \end{aligned}$$

leads to \({\mathbb {E}}_{1}\left( \sum _{l=1}^{k_{n}}\xi _{l,1}^{2}\right) ^{3/2}\le k_{n}^{1/2} \sum _{l=1}^{k_{n}}{\mathbb {E}}\left| \xi _{l,1}\right| ^{3}\le M^{3/4}k_{n}^{3/2}\), where \(M\) appears in assumption \(\left( A.3\right) \) page 334 and \({\mathbb {E}}\left| \xi _{l,1}\right| ^{3}\le \left( {\mathbb {E}}\xi _{l,1}^{4}\right) ^{3/4}\le M^{3/4}\) by Hölder's inequality, and finally to

$$\begin{aligned} {\mathbb {E}}_{1}\left| \left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \right| ^{3}\le M^{3/4}k_{n}^{3/2}\left( \sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\right) ^{3/2}. \end{aligned}$$
(2)

Then a Taylor expansion for the characteristic function of \(\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \varepsilon _{1}\) is

$$\begin{aligned} {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}}Z_{1,n}\right) =1-\frac{t^{2}\sigma _{\varepsilon }^{2}}{2nk_{n}} \sum _{l=1}^{k_{n}}\xi _{l,0}^{2}-i\frac{t^{3}}{6\left( nk_{n}\right) ^{3/2}}H_{n}\left( t\right) , \end{aligned}$$

where \(H_{n}\left( t\right) ={\mathbb {E}}_{1}\left[ {\varepsilon }_{1}^{3}\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle ^{3}\exp \left( i\tau _{t}\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \varepsilon _{1}\right) \right] \), for some \(\tau _{t}\in \left( 0,t/\sqrt{nk_{n}}\right) \), is the remainder term in the Taylor expansion of \(\varphi _{Z_{1,n}}\). Hence, we get from (2),

$$\begin{aligned} \left| H_n \left( t\right) \right| \le {\mathbb {E}}\left| {\varepsilon }_{1}\right| ^{3} {\mathbb {E}}_{1}\left| \left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \right| ^{3}\le {\mathbb {E}}\left| {\varepsilon }_{1}\right| ^{3}M^{3/4}k_{n}^{3/2} \left( \sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\right) ^{3/2}. \end{aligned}$$

Recall that \(s_{k_{n},0}=\frac{1}{k_{n}}\sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\). From the equations above we can write

$$\begin{aligned} {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}}Z_{1,n}\right) =1-\frac{t^{2}\sigma _{\varepsilon }^{2}}{2n}s_{k_{n},0}\left( 1+tk_{n}^{3/2} \sqrt{\frac{s_{k_{n},0}}{n}}{\widetilde{H}}_{n}\left( t\right) \right) , \end{aligned}$$
(3)

where this time \(\sup _{\left( n,t\right) }\left| {\widetilde{H}}_{n}\left( t\right) \right| \le M^{3/4}{\mathbb {E}}\left| {\varepsilon }_{1}\right| ^{3}\). Then, by assumption (7) page 334 in Theorem 2, \(k_{n}^{3}/n\) tends to zero as \(n\) tends to infinity, so that

$$\begin{aligned} {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}}\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \varepsilon _{1}\right) =1-\frac{t^{2} \sigma _{\varepsilon }^{2}}{2n}\left( \frac{1}{k_{n}}\sum _{l=1}^{k_{n}} \xi _{l,0}^{2}\right) \left( 1+o_{{\mathbb {P}}}\left( 1\right) \right) , \end{aligned}$$

since \(\frac{1}{k_{n}}\sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\) is \(O_{{\mathbb {P}}}\left( 1\right) \) by assumption (1). To conclude, we have to take the expectation with respect to \(X_{0}\) and pass to the limit under it.

First of all, it is plain from (3), assumption (1) on \(\frac{1}{k_{n}}\sum _{l=1}^{k_{n}}\xi _{l,0}^{2}\), and the continuous mapping theorem that

$$\begin{aligned} \left[ {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}}\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \varepsilon _{1}\right) \right] ^{n}\overset{{\mathbb {P}}}{\underset{n\rightarrow +\infty }{\rightarrow }} \exp \left( -\frac{t^{2}}{2}\sigma _{\varepsilon }^{2}\right) . \end{aligned}$$

Besides, \(\left| \left[ {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}} Z_{1,n}\right) \right] ^{n}\right| \le 1\) almost surely, so this sequence is uniformly integrable with respect to \({\mathbb {E}}_{0}\) and the bounded convergence theorem applies. We can conclude that

$$\begin{aligned} {\mathbb {E}}_{0}\left\{ {\mathbb {E}}_{1}\exp \left( \frac{it}{\sqrt{nk_{n}}}\left\langle \Gamma ^{\dag }X_{1},X_{0}\right\rangle \varepsilon _{1}\right) \right\} ^{n}\underset{n\rightarrow +\infty }{\rightarrow }\exp \left( -\frac{t^{2}}{2}\sigma _{\varepsilon }^{2}\right) , \end{aligned}$$

which concludes the proof of the Lemma. \(\square \)