Abstract
The purpose of this note is to recall one remarkable theorem of Khinchin about the special role of the Gaussian distribution. This theorem allows us to give a new interpretation of the Lindeberg condition: it guarantees the uniform integrability of the squares of normed sums of random variables and, thus, the passage to the limit under the expectation sign. The latter provides a simple proof of the central limit theorem for independent random variables.
1. Let \(\{\xi_{n,j}\}=\{\xi_{n,j},1\leq j\leq k_{n},n\geq 1\}\), \(k_{n}\to\infty\) as \(n\to\infty\), be a triangular array (double sequence) of independent in each row random variables on a probability space \((X,{B},P)\). For the sake of simplicity, we always assume that \(E\xi_{n,j}=0\) for all \(j\) and \(n\). For any \(n\geq 1\), denote \(S_{n}=\sum_{j=1}^{k_{n}}\xi_{n,j}\), and let \(DS_{n}\) be its variance. The Gaussian (normal) distribution function with parameters \(a\) and \(\sigma^{2}\), \(a,\sigma\in\mathbb{R}\), \(\sigma>0\), is defined by
\[\Phi_{a,\sigma^{2}}(x)=\frac{1}{\sigma\sqrt{2\pi}}\int_{-\infty}^{x}e^{-(t-a)^{2}/(2\sigma^{2})}\,dt,\qquad x\in\mathbb{R}.\]
Khinchin [4] (translation into English can be found in [6]) noted that the Gauss law, as a limiting law for sums of independent random variables, has a very special role that distinguishes it from all infinitely divisible laws. Namely, we arrive at the Gauss law in all cases when the limiting negligibility of the components of the sum of terms under study reaches a sufficiently strong degree; and this happens completely independently of the special properties of the laws of distribution of these terms.
The condition of asymptotic infinitesimality (or, equivalently, limiting negligibility) of the summands \(\xi_{n,j}\) is, in the general case, formulated as the requirement that for any \(\varepsilon>0\), the probability of the inequality \(|\xi_{n,j}|\geq\varepsilon\) tends to zero uniformly in \(j\) as \(n\to\infty\):
\[\max_{1\leq j\leq k_{n}}P\left(|\xi_{n,j}|\geq\varepsilon\right)\to 0,\quad n\to\infty. \tag{0.1}\]
Khinchin showed (Theorem 42 in [4]) that if we assume that not only each of these probabilities but also the probability that at least one of the \(|\xi_{n,j}|\), \(1\leq j\leq k_{n}\), is greater than \(\varepsilon\) tends to zero as \(n\to\infty\), that is,
\[P\left(\max_{1\leq j\leq k_{n}}|\xi_{n,j}|\geq\varepsilon\right)\to 0,\quad n\to\infty, \tag{0.2}\]
then the only possible limiting law for normed row sums is the Gauss law.
Theorem 1 (Khinchin). Let \(\{\xi_{n,j}\}\) be a double sequence of independent in each row random variables. If a nondegenerate limiting distribution for the sums \(S_{n}\) exists, then for it to be Gaussian, it is necessary and sufficient that the random variables \(\{\xi_{n,j}\}\) satisfy (0.2) for any \(\varepsilon>0\).
Since condition (0.2) represents only a somewhat strengthened requirement (0.1) for the limiting negligibility of summands and does not contain any special assumptions about the nature of the laws of distribution of summands, the above result characterizes the Gauss law as, in a certain sense, a universal limiting law for sums of independent random variables and justifies the exclusive place given to this law in classical studies.
In the bibliographical notes [4], Khinchin mentioned that a more general result was obtained by Lévy in [5]; however, Khinchin was not able to find the proof of the latter based on the sketch suggested by Lévy. Khinchin’s proof is based on the direct investigation of the characteristic functions of summands. Another (shorter) proof, based on the Lévy–Khinchin formula for the decomposition of characteristic functions, was suggested by Gnedenko [2]. The latter can be found in book [3] by Gnedenko and Kolmogorov (see Theorem 1 on p. 126).
Under the conditions of the Khinchin theorem, the limiting distribution (in the case of centered random summands) is \(\Phi_{0,\sigma^{2}}\) with some parameter \(\sigma^{2}\). Since the limiting law is nondegenerate, \(\sigma^{2}>0\). Note that Khinchin does not impose any restrictions on the second moments of the summands, which is the minimal possible moment assumption in the central limit theorem (CLT). We say that the CLT holds for a sequence \(\{\xi_{n,j}\}\) of (centered) random variables if
\[P\left(\frac{S_{n}}{\sqrt{DS_{n}}}\leq x\right)\to\Phi_{0,1}(x),\quad n\to\infty,\ x\in\mathbb{R}.\]
The Khinchin theorem itself cannot be considered a CLT, since the limiting Gaussian distribution is not necessarily the standard one with \(\sigma^{2}=1\). However, if we impose the uniform integrability condition on the squares of the normed row sums, we will be able to prove the CLT based on the Khinchin result.
We will need the following statements (see, for example, Lemma 1 on p. 322 and Theorem 5 on p. 189 in [7]). By convergence of distribution functions we understand weak convergence, i.e., convergence at each point of continuity of the limiting distribution function.
Proposition 1. Let \(\{F_{n}\}=\{F_{n},n\geq 1\}\) be a sequence of distribution functions. Suppose that any convergent subsequence \(\{F_{n^{\prime}}\}\) of \(\{F_{n}\}\), \(\{n^{\prime}\}\subset\{n\}\), converges to the same distribution function \(F\). Then the sequence \(\{F_{n}\}\) converges to \(F\) as well.
Proof. Let \({X}_{F}\) be the set of continuity points of the distribution function \(F\). Fix some \(x\in{X}_{F}\) and assume that \(F_{n}(x)\) does not converge to \(F(x)\). Then there exist \(\varepsilon>0\) and an infinite sequence \(\{n^{\prime}\}\) of natural numbers such that
\[|F_{n^{\prime}}(x)-F(x)|\geq\varepsilon\quad\text{for all }n^{\prime}. \tag{0.3}\]
By the Helly theorem, from the sequence \(\{F_{n^{\prime}}\}\), one can select a convergent subsequence \(\{F_{n^{\prime\prime}}\}\); let the generalized distribution function \(G\) be its limit. By the hypothesis of the proposition, \(G=F\), and thus, \(F_{n^{\prime\prime}}(x)\to F(x)\) as \(n^{\prime\prime}\to\infty\), which contradicts (0.3). This completes the proof. \(\Box\)
Recall that a family of random variables \(\{\eta_{n},n\geq 1\}\) is uniformly integrable if
\[\lim_{C\to\infty}\sup_{n\geq 1}E\left[|\eta_{n}|\,I\left(|\eta_{n}|>C\right)\right]=0,\]
where \(I(\cdot)\) denotes the indicator function.
Theorem 2. Let \(\eta_{n}\), \(n\geq 1\), be a sequence of nonnegative random variables with \(E\eta_{n}<\infty\) such that \(\eta_{n}\to\eta\) as \(n\to\infty\). Then \(E\eta_{n}\to E\eta<\infty\) as \(n\to\infty\) if and only if the family \(\{\eta_{n},n\geq 1\}\) is uniformly integrable.
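The necessity of uniform integrability in Theorem 2 is easy to see on a standard example. The following minimal sketch (our own illustration, not from the text) takes \(\eta_{n}=n\,I(U<1/n)\) for \(U\) uniform on \([0,1]\): then \(\eta_{n}\to 0\) almost surely, yet \(E\eta_{n}=1\) for every \(n\), and the family is not uniformly integrable.

```python
# Illustration of Theorem 2 (our own example, not from the paper): on the
# probability space ([0,1], Lebesgue) put eta_n = n * 1{U < 1/n}.  Then
# eta_n -> 0 a.s., but E[eta_n] = 1 for every n, so E[eta_n] does not tend
# to E[0] = 0.  Theorem 2 explains the failure: the family {eta_n} is not
# uniformly integrable, since the tail expectation stays at 1.

def tail_expectation(n: int, cutoff: float) -> float:
    """E[eta_n * 1{eta_n > cutoff}], computed exactly: eta_n equals n with
    probability 1/n and 0 otherwise, so the tail expectation is
    n * (1/n) = 1 whenever n > cutoff, and 0 otherwise."""
    return 1.0 if n > cutoff else 0.0

# E[eta_n] = tail_expectation(n, 0) = 1 for every n >= 1 ...
assert all(tail_expectation(n, 0.0) == 1.0 for n in range(1, 1000))
# ... yet for every cutoff C the supremum over n of the tail expectation
# is still 1, so the limit in the definition of uniform integrability
# equals 1 rather than 0: the family is not uniformly integrable.
for cutoff in (10.0, 100.0, 1000.0):
    assert max(tail_expectation(n, cutoff) for n in range(1, 10_000)) == 1.0
```

The same family shows that almost-sure convergence alone does not permit passage to the limit under the expectation sign.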
Now we present the following version of the CLT for independent random variables.
Theorem 3. Let \(\{\xi_{n,j}\}\) be a double sequence of independent in each row random variables such that \(E\xi_{n,j}^{2}<\infty\), \(1\leq j\leq k_{n}\), \(n\geq 1\). If random variables \(\{\xi_{n,j}/\sqrt{DS_{n}}\), \(1\leq j\leq k_{n}\), \(n\geq 1\}\) satisfy condition (0.2) and the squares of normed row sums \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable, then for the sequence \(\{\xi_{n,j}\}\), the CLT holds.
Proof. Let \(F_{n}\) be the distribution function of \(S_{n}/\sqrt{DS_{n}}\), \(n\geq 1\). Then, since the summands are centered and independent in each row,
\[E\,\frac{S_{n}^{2}}{DS_{n}}=1,\quad n\geq 1. \tag{0.4}\]
Further, let \(\{F_{n^{\prime}}\}\), \(\{n^{\prime}\}\subset\{n\}\), be some convergent subsequence of the sequence \(\{F_{n}\}\). Due to the Khinchin theorem, \(F_{n^{\prime}}(x)\to\Phi_{0,\sigma^{2}}(x)\) as \(n^{\prime}\to\infty\) for any \(x\in\mathbb{R}\) and some \(\sigma>0\), provided the random variables \(\{\xi_{n,j}/\sqrt{DS_{n}}\}\) satisfy condition (0.2). Since \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable, due to Theorem 2, we can pass to the limit under the expectation sign, and thus,
\[E\,\frac{S_{n^{\prime}}^{2}}{DS_{n^{\prime}}}\to\int_{\mathbb{R}}x^{2}\,d\Phi_{0,\sigma^{2}}(x)=\sigma^{2},\quad n^{\prime}\to\infty.\]
Taking into account (0.4), we conclude
\[\sigma^{2}=\lim_{n^{\prime}\to\infty}E\,\frac{S_{n^{\prime}}^{2}}{DS_{n^{\prime}}}=1,\]
that is, the parameter \(\sigma^{2}\) in the limiting Gaussian distribution is equal to one. Thus, from the uniform integrability of \(\{S_{n}^{2}/DS_{n},n\geq 1\}\), it follows that \(F_{n^{\prime}}(x)\to\Phi_{0,1}(x)\) as \(n^{\prime}\to\infty\) for any \(x\in\mathbb{R}\).
Thereby, any convergent subsequence \(\{F_{n^{\prime}}\}\) of the distribution functions of the normed row sums \(S_{n}/\sqrt{DS_{n}}\) converges to the same limiting distribution \(\Phi_{0,1}\). Hence, by Proposition 1, the sequence \(\{F_{n}\}\) converges to \(\Phi_{0,1}\) as well. Therefore, for the random variables \(\{\xi_{n,j}\}\), the CLT holds. \(\Box\)
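As a quick numerical illustration of Theorem 3, the following Monte Carlo sketch (our own example; the array of centered uniforms, the sample sizes, and the tolerances are our choices, not from the text) compares the empirical distribution function of the normed row sums with \(\Phi_{0,1}\).

```python
# Monte Carlo sanity check of Theorem 3 (illustration only).  Rows are i.i.d.
# centered uniforms xi_{n,j} = U_j - 1/2, 1 <= j <= n, so DS_n = n/12 and the
# normed sums S_n / sqrt(DS_n) should be approximately standard normal.
import random
from math import sqrt
from statistics import NormalDist

rng = random.Random(0)          # fixed seed for reproducibility
n, reps = 400, 2000
var_row = n / 12.0              # DS_n: variance of one row sum

samples = []
for _ in range(reps):
    s = sum(rng.random() - 0.5 for _ in range(n))   # one row sum S_n
    samples.append(s / sqrt(var_row))               # normed row sum

phi = NormalDist()  # standard normal distribution Phi_{0,1}
for x in (-1.0, 0.0, 1.0):
    empirical = sum(1 for v in samples if v <= x) / reps
    # the empirical CDF at x should be close to Phi_{0,1}(x)
    assert abs(empirical - phi.cdf(x)) < 0.06
```

The uniform integrability hypothesis is automatic here because the summands are bounded; the interesting content of Theorem 3 is that this hypothesis, together with (0.2), pins the limiting variance to one.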
2. Theorem 3 allows us to give a new probabilistic interpretation of the Lindeberg condition. We will show that the Lindeberg condition implies the uniform integrability of the squares of normed sums of random variables, which, in its turn, allows passage to the limit under the expectation sign and thus guarantees \(\sigma^{2}=1\) in the limiting Gaussian distribution in the Khinchin theorem.
The double sequence \(\{\xi_{n,j}\}\) of random variables satisfies the Lindeberg condition if for any \(\varepsilon>0\),
\[\frac{1}{DS_{n}}\sum_{j=1}^{k_{n}}E\left[\xi_{n,j}^{2}\,I\left(|\xi_{n,j}|\geq\varepsilon\sqrt{DS_{n}}\right)\right]\to 0,\quad n\to\infty. \tag{0.5}\]
The classical interpretation of the Lindeberg condition is that if a sequence \(\{\xi_{n,j}\}\) of random variables satisfies (0.5), then its elements are asymptotically infinitesimal uniformly in each row, that is, relation (0.1) holds. Billingsley (see p. 90 in [1]) noted that from the Lindeberg condition, the uniform integrability of the squares of normed sums follows as well.
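For a concrete feel of the Lindeberg condition, the following sketch (our own example, not from the paper) evaluates the Lindeberg sum exactly for an i.i.d. Rademacher array, where \(\xi_{n,j}=\eta_{j}\) with \(P(\eta_{j}=\pm 1)=1/2\), so that \(DS_{n}=n\).

```python
# Exact evaluation of the Lindeberg sum (0.5) for the Rademacher array
# xi_{n,j} = eta_j, P(eta_j = 1) = P(eta_j = -1) = 1/2 (our own example).
# Here E xi_{n,j} = 0, E xi_{n,j}^2 = 1, DS_n = n, and |xi_{n,j}| = 1 a.s.,
# so each term E[xi^2 1{|xi| >= eps*sqrt(n)}] is 1 if eps*sqrt(n) <= 1
# and 0 otherwise.
from math import sqrt

def lindeberg_sum(n: int, eps: float) -> float:
    """Left-hand side of (0.5) for the Rademacher array, computed exactly."""
    term = 1.0 if eps * sqrt(n) <= 1.0 else 0.0
    return sum(term for _ in range(n)) / n   # (1/DS_n) * sum over k_n = n terms

assert lindeberg_sum(2, 0.5) == 1.0    # 0.5*sqrt(2) <= 1: truncation not yet active
assert lindeberg_sum(10, 0.5) == 0.0   # 0.5*sqrt(10) > 1: the sum vanishes
# For every fixed eps > 0, the sum is 0 for all n > 1/eps^2, as (0.5) requires:
assert all(lindeberg_sum(n, 0.1) == 0.0 for n in range(101, 300))
```

Bounded summands make the truncation trivial for large \(n\); the condition becomes delicate only for arrays with heavy-tailed or highly inhomogeneous rows.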
Proposition 2. Let \(\{\xi_{n,j}\}\) be a double sequence of (centered) independent random variables with finite second moments. If \(\{\xi_{n,j}\}\) satisfies the Lindeberg condition (0.5), then the squares of the normed row sums \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable.
Proof. The statement follows from inequality (12.20) in [1], according to which for any \(n\geq 1\) and \(C>0\), one has
where \(K\) is some universal constant. By (0.5), for any \(C>0\), there exists \(n_{0}=n_{0}(C)>1\) such that
Thus,
and hence,
\(\Box\)
The statement above reveals the true essence of the Lindeberg condition. Since uniform integrability is the necessary and sufficient condition for taking the limit under the expectation sign, we conclude that the Lindeberg condition is one of the conditions under which the limiting Gaussian distribution in the Khinchin theorem is the standard one. Taking this fact into account, we provide a new proof of the well-known Lévy–Lindeberg theorem.
Theorem 4 (Lévy–Lindeberg). Let \(\{\xi_{n,j}\}\) be a double sequence of independent in each row random variables such that \(E\xi_{n,j}=0\), \(0<E\xi_{n,j}^{2}<\infty\), \(1\leq j\leq k_{n}\), \(n\geq 1\). If the random variables \(\{\xi_{n,j}\}\) satisfy the Lindeberg condition (0.5), then the CLT holds.
Proof. First note that the random variables \(\{\xi_{n,j}/\sqrt{DS_{n}},1\leq j\leq k_{n},n\geq 1\}\) satisfy condition (0.2), since for any \(\varepsilon>0\), we have
\[P\left(\max_{1\leq j\leq k_{n}}\frac{|\xi_{n,j}|}{\sqrt{DS_{n}}}\geq\varepsilon\right)\leq\sum_{j=1}^{k_{n}}P\left(|\xi_{n,j}|\geq\varepsilon\sqrt{DS_{n}}\right)\leq\frac{1}{\varepsilon^{2}DS_{n}}\sum_{j=1}^{k_{n}}E\left[\xi_{n,j}^{2}\,I\left(|\xi_{n,j}|\geq\varepsilon\sqrt{DS_{n}}\right)\right]\to 0\]
as \(n\to\infty\) by (0.5). Further, by Proposition 2, the random variables \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable. Thus, by Theorem 3, for \(\{\xi_{n,j}\}\) the CLT holds. \(\Box\)
3. Let us illustrate the application of Theorem 3 in the case of independent identically distributed (i.i.d.) random variables. Namely, we will use this theorem to prove the following classical result.
Theorem 5 (Lévy–Khinchin). Let \(\{\eta_{n},n\geq 1\}\) be a sequence of i.i.d. random variables such that \(E\eta_{1}=0\) and \(0<D\eta_{1}=\sigma_{0}^{2}<\infty\). Then for \(\{\eta_{n}\}\) the CLT holds.
Proof. Consider the double array \(\{\xi_{n,j},1\leq j\leq n,n\geq 1\}\) of random variables \(\xi_{n,j}=\dfrac{\eta_{j}}{\sigma_{0}\sqrt{n}}\). It is not difficult to check that random variables \(\{\xi_{n,j}\}\) satisfy condition (0.2). Further, put \(\hat{S}_{n}=\sum_{j=1}^{n}\xi_{n,j}=\dfrac{1}{\sigma_{0}\sqrt{n}}\sum_{j=1}^{n}\eta_{j}\), then \(D\hat{S}_{n}=1\). With application of inequality (12.19) in [1], for any \(C>0\), we can write
where \(K\) is some positive constant. Further, applying equality (3) on p. 223 in [1], we can write
From here it follows that the random variables \(\{\hat{S}_{n}^{2},n\geq 1\}\) are uniformly integrable. Hence, by Theorem 3, for the random variables \(\{\xi_{n,j}\}\) the CLT holds. It remains to note that \(P\left(\hat{S}_{n}\leq x\right)=P\left(\dfrac{S_{n}}{\sqrt{DS_{n}}}\leq x\right)\), \(x\in\mathbb{R}\), where \(S_{n}=\sum_{j=1}^{n}\eta_{j}\). \(\Box\)
We see that checking the uniform integrability of the squares of normed sums directly requires some effort. At the same time, the Lindeberg condition for i.i.d. random variables can be checked quite simply, which is why it is preferable in applications.
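The simplicity of the i.i.d. check alluded to above can be made explicit. The following standard computation (a sketch supplied here, not spelled out in the original) verifies (0.5) for i.i.d. centered summands \(\eta_{j}\) with variance \(\sigma_{0}^{2}\):

```latex
% Lindeberg condition for i.i.d. summands: with \xi_{n,j} = \eta_j,
% 1 \le j \le n, and DS_n = n\sigma_0^2, the Lindeberg sum reduces to a
% single expectation,
\[
\frac{1}{DS_n}\sum_{j=1}^{n} E\!\left[\eta_j^2\, I\!\left(|\eta_j|\ge \varepsilon\sqrt{DS_n}\right)\right]
  = \frac{1}{\sigma_0^2}\, E\!\left[\eta_1^2\, I\!\left(|\eta_1|\ge \varepsilon\sigma_0\sqrt{n}\right)\right]
  \;\longrightarrow\; 0, \qquad n\to\infty,
\]
% which tends to zero by dominated convergence, since \eta_1^2 is
% integrable and the indicator tends to zero almost surely.
```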
REFERENCES
P. Billingsley, Convergence of Probability Measures (John Wiley & Sons, 1968).
B. V. Gnedenko, ‘‘Limit theorems for sums of independent random variables,’’ Usp. Mat. Nauk 10, 115–165 (1944).
B. V. Gnedenko and A. N. Kolmogorov, Limit Distributions of Sums of Independent Random Variables (Wiley, New York, 1968).
A. Khinchin, Limit Laws for Sums of Independent Random Variables (GONTI, Moscow, 1938).
P. Lévy, ‘‘Propriétés asymptotiques des sommes de variables aléatoires indépendantes ou enchaînées,’’ J. Math. 14, 347–402 (1935).
S. Rogosin and F. Mainardi, The Legacy of A.Y. Khintchine’s Work in Probability Theory (Cambridge Scientific Publishers, Cambridge, 2009).
A. N. Shiryaev, Probability, Graduate Texts in Mathematics, Vol. 95 (Springer, New York, 1996). https://doi.org/10.1007/978-1-4757-2539-1
ACKNOWLEDGMENTS
The author is grateful to Boris S. Nahapetian for his attention to the work and useful discussions. Also, the author wants to express her gratitude to Lutz Mattner for his comments on the preprint of this work, including the reference to the book [6], and to an anonymous reviewer for valuable remarks.
Funding
This work was supported by ongoing institutional funding. No additional grants to carry out or direct this particular research were obtained.
Ethics declarations
The author declares that she has no conflicts of interest.
Khachatryan, L.A. On Khinchin’s Theorem about the Special Role of the Gaussian Distribution. J. Contemp. Mathemat. Anal. 58, 400–404 (2023). https://doi.org/10.3103/S1068362323060031