1. Let \(\{\xi_{n,j}\}=\{\xi_{n,j},1\leq j\leq k_{n},n\geq 1\}\), where \(k_{n}\to\infty\) as \(n\to\infty\), be a triangular array (double sequence) of row-wise independent random variables on a probability space \((X,\mathcal{B},P)\). For the sake of simplicity, we always assume that \(E\xi_{n,j}=0\) for all \(j\) and \(n\). For any \(n\geq 1\), denote \(S_{n}=\sum_{j=1}^{k_{n}}\xi_{n,j}\), and let \(DS_{n}\) be its variance. The Gaussian (normal) distribution function with parameters \(a\in\mathbb{R}\) and \(\sigma^{2}\), \(\sigma>0\), is defined by

$$\Phi_{a,\sigma^{2}}(x)=\frac{1}{\sigma\sqrt{2\pi}}\int\limits_{-\infty}^{x}\exp\left\{-\frac{(t-a)^{2}}{2\sigma^{2}}\right\}dt,\quad x\in\mathbb{R}.$$
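Recall that the distribution \(\Phi_{a,\sigma^{2}}\) has mean \(a\) and variance \(\sigma^{2}\); in particular,

$$\int\limits_{\mathbb{R}}x\,d\Phi_{0,\sigma^{2}}(x)=0,\qquad\int\limits_{\mathbb{R}}x^{2}\,d\Phi_{0,\sigma^{2}}(x)=\sigma^{2}.$$

These identities will be used below to identify the parameter of the limiting Gaussian law.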

Khinchin [4] (an English translation can be found in [6]) noted that the Gauss law, as a limiting law for sums of independent random variables, has a very special role that distinguishes it from all other infinitely divisible laws. Namely, we arrive at the Gauss law in all cases when the limiting negligibility of the summands reaches a sufficiently strong degree, and this happens entirely independently of the particular distribution laws of these summands.

The condition of asymptotic infinitesimality (or, equivalently, limiting negligibility) of the summands \(\xi_{n,j}\) is formulated, in the general case, as the requirement that for any \(\varepsilon>0\), the probability of the inequality \(|\xi_{n,j}|\geq\varepsilon\) tends to zero uniformly in \(j\) as \(n\to\infty\):

$$\max\limits_{1\leq j\leq k_{n}}P\left(|\xi_{n,j}|\geq\varepsilon\right)\to 0\quad\text{as }n\to\infty.$$
(0.1)
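For example, if \(\xi_{n,j}=\eta_{j}/\sqrt{n}\), \(1\leq j\leq n\), where \(\{\eta_{j}\}\) are i.i.d. random variables with \(E\eta_{1}=0\) and \(E\eta_{1}^{2}<\infty\), then (0.1) holds by the Chebyshev inequality:

$$\max\limits_{1\leq j\leq n}P\left(|\xi_{n,j}|\geq\varepsilon\right)\leq\frac{E\eta_{1}^{2}}{\varepsilon^{2}n}\to 0\quad\text{as }n\to\infty.$$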

Khinchin showed (Theorem 42 in [4]) that if we require that not only this probability but also the probability that at least one of \(|\xi_{n,j}|\), \(1\leq j\leq k_{n}\), exceeds \(\varepsilon\) tends to zero as \(n\to\infty\), that is,

$$P\left(\max\limits_{1\leq j\leq k_{n}}|\xi_{n,j}|\geq\varepsilon\right)\to 0\quad\text{as }n\to\infty,$$
(0.2)

then the only possible limiting law for normed row sums is the Gauss law.
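The following standard example shows that condition (0.2) is strictly stronger than (0.1). Let \(\xi_{n,j}=\beta_{n,j}-\lambda/n\), \(1\leq j\leq n\), where \(\beta_{n,j}\) are independent Bernoulli random variables with \(P(\beta_{n,j}=1)=\lambda/n\), \(\lambda>0\) (so that \(E\xi_{n,j}=0\)). Then for any \(0<\varepsilon<1/2\) and all sufficiently large \(n\),

$$\max\limits_{1\leq j\leq n}P\left(|\xi_{n,j}|\geq\varepsilon\right)=\frac{\lambda}{n}\to 0,\quad\text{while}\quad P\left(\max\limits_{1\leq j\leq n}|\xi_{n,j}|\geq\varepsilon\right)=1-\left(1-\frac{\lambda}{n}\right)^{n}\to 1-e^{-\lambda}>0,$$

so that (0.1) holds but (0.2) fails; accordingly, the sums \(S_{n}\) converge in distribution to a centered Poisson law with parameter \(\lambda\) rather than to a Gaussian one.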

Theorem 1 (Khinchin). Let \(\{\xi_{n,j}\}\) be a double sequence of row-wise independent random variables. If a limiting nondegenerate distribution for the sums \(S_{n}\) exists, then for it to be Gaussian, it is necessary and sufficient that the random variables \(\{\xi_{n,j}\}\) satisfy (0.2) for any \(\varepsilon>0\).

Since condition (0.2) is only a somewhat strengthened form of the limiting negligibility requirement (0.1) and contains no special assumptions about the distribution laws of the summands, the above result characterizes the Gauss law as, in a certain sense, a universal limiting law for sums of independent random variables and justifies the exceptional place given to this law in classical studies.

In the bibliographical notes to [4], Khinchin mentioned that a more general result was obtained by Lévy in [5]; however, Khinchin was not able to reconstruct a proof of the latter from the sketch suggested by Lévy. Khinchin’s proof is based on a direct investigation of the characteristic functions of the summands. Another (shorter) proof, based on the Lévy–Khinchin formula for the decomposition of characteristic functions, was suggested by Gnedenko [2]. The latter can be found in the book [3] by Gnedenko and Kolmogorov (see Theorem 1 on p. 126).

Under the conditions of the Khinchin theorem, the limiting distribution (in the case of centered random summands) is \(\Phi_{0,\sigma^{2}}\) with some parameter \(\sigma^{2}\). Since the limiting law is nondegenerate, \(\sigma^{2}>0\). Note that Khinchin does not impose any restrictions on the second moments of the summands, while finiteness of the second moments is the minimal moment condition in the central limit theorem (CLT). We say that for a sequence \(\{\xi_{n,j}\}\) of (centered) random variables, the CLT holds if

$$\lim\limits_{n\to\infty}P\left(\frac{S_{n}}{\sqrt{DS_{n}}}\leq x\right)=\Phi_{0,1}(x),\quad x\in\mathbb{R}.$$

The Khinchin theorem cannot be considered a CLT, since the limiting Gaussian distribution is not necessarily the standard one with \(\sigma^{2}=1\). However, if we impose the uniform integrability condition on the squares of the normed row sums, we will be able to prove the CLT on the basis of the Khinchin result.
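The following example shows that without uniform integrability the limiting Gaussian law may indeed fail to be standard. Let \(\xi_{n,j}=\eta_{j}/\sqrt{n}\), \(1\leq j\leq n\), with i.i.d. \(\eta_{j}\), \(E\eta_{1}=0\), \(D\eta_{1}=1\), and let the \((n+1)\)-th summand in the \(n\)-th row be an independent random variable \(\zeta_{n}\) with \(P(\zeta_{n}=\pm n)=1/(2n^{2})\) and \(P(\zeta_{n}=0)=1-1/n^{2}\). Then \(DS_{n}=2\), the normed summands satisfy (0.2), and, since \(\zeta_{n}\to 0\) in probability, by the classical CLT for i.i.d. summands (Theorem 5 below),

$$P\left(\frac{S_{n}}{\sqrt{DS_{n}}}\leq x\right)\to\Phi_{0,1/2}(x),\quad x\in\mathbb{R}.$$

Here \(E(S_{n}^{2}/DS_{n})=1\), while the second moment of the limiting law equals \(1/2\): half of the variance escapes to infinity, and the squares \(S_{n}^{2}/DS_{n}\) are not uniformly integrable.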

We will need the following statements (see, for example, Lemma 1 on p. 322 and Theorem 5 on p. 189 in [7]). By convergence of distribution functions we understand convergence in general, i.e., convergence at each point of continuity of the limiting function, which is allowed to be a generalized (possibly defective) distribution function.

Proposition 1. Let \(\{F_{n}\}=\{F_{n},n\geq 1\}\) be a sequence of distribution functions. Suppose that any convergent subsequence \(\{F_{n^{\prime}}\}\) of \(\{F_{n}\}\), \(\{n^{\prime}\}\subset\{n\}\), converges to the same distribution function \(F\). Then the sequence \(\{F_{n}\}\) converges to \(F\) as well.

Proof. Let \({X}_{F}\) be the set of continuity points of the distribution function \(F\). Fix some \(x\in{X}_{F}\) and assume that \(F_{n}(x)\) does not converge to \(F(x)\). Then there exist \(\varepsilon>0\) and an infinite sequence \(\{n^{\prime}\}\) of natural numbers such that

$$\left|F_{n^{\prime}}(x)-F(x)\right|>\varepsilon.$$
(0.3)

By the Helly theorem, from the sequence \(\{F_{n^{\prime}}\}\), one can select a subsequence \(\{F_{n^{\prime\prime}}\}\) converging in general to some generalized distribution function \(G\). By the hypothesis of the proposition, \(G=F\), and since \(x\in{X}_{F}\), we get \(F_{n^{\prime\prime}}(x)\to F(x)\) as \(n^{\prime\prime}\to\infty\), which contradicts (0.3). This completes the proof. \(\Box\)
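The following simple example explains why convergence in Proposition 1 must be understood in the general sense. If \(F_{n}\) is the distribution function of the point mass at \(n\), then every subsequence of \(\{F_{n}\}\) converges in general to \(G\equiv 0\), which is not a distribution function; thus, the hypothesis of the proposition is violated, and indeed \(\{F_{n}\}\) does not converge to any distribution function.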

Recall that a family of random variables \(\{\eta_{n},n\geq 1\}\) is called uniformly integrable if

$$\sup\limits_{n}\int\limits_{|\eta_{n}|>C}|\eta_{n}|dP\to 0\quad\text{as }C\to\infty.$$

Theorem 2. Let \(\{\eta_{n},n\geq 1\}\) be a sequence of nonnegative random variables with \(E\eta_{n}<\infty\), \(n\geq 1\), such that \(\eta_{n}\to\eta\) as \(n\to\infty\). Then \(E\eta_{n}\to E\eta<\infty\) as \(n\to\infty\) if and only if the family \(\{\eta_{n},n\geq 1\}\) is uniformly integrable.
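A simple example illustrating both the definition and Theorem 2: let \(\eta_{n}=n\mathbf{1}_{A_{n}}\), where \(P(A_{n})=1/n\). Then \(\eta_{n}\to 0\) in probability, but \(E\eta_{n}=1\not\to 0\); accordingly, the family \(\{\eta_{n},n\geq 1\}\) is not uniformly integrable, since

$$\int\limits_{\{|\eta_{n}|>C\}}|\eta_{n}|\,dP=1\quad\text{for any }C>0\text{ and }n>C,$$

so the supremum in the definition above equals one for every \(C\).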

Now we present the following version of the CLT for independent random variables.

Theorem 3. Let \(\{\xi_{n,j}\}\) be a double sequence of row-wise independent random variables such that \(E\xi_{n,j}^{2}<\infty\), \(1\leq j\leq k_{n}\), \(n\geq 1\). If the random variables \(\{\xi_{n,j}/\sqrt{DS_{n}},1\leq j\leq k_{n},n\geq 1\}\) satisfy condition (0.2) and the squares of the normed row sums \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable, then for the sequence \(\{\xi_{n,j}\}\), the CLT holds.

Proof. Let \(F_{n}\) be the distribution function of \(S_{n}/\sqrt{DS_{n}}\), \(n\geq 1\). Then

$$\int\limits_{\mathbb{R}}x^{2}dF_{n}(x)=E\left(\frac{S_{n}^{2}}{DS_{n}}\right)=1,\quad n\geq 1.$$
(0.4)

Further, let \(\{F_{n^{\prime}}\}\), \(\{n^{\prime}\}\subset\{n\}\), be some convergent subsequence of the sequence \(\{F_{n}\}\), and let \(F\) be its limit. By (0.4) and the Chebyshev inequality, the sequence \(\{F_{n}\}\) is tight, and therefore \(F\) is a proper distribution function. Since \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable, due to Theorem 2 (combined with the Skorokhod representation theorem), we can pass to the limit under the expectation sign, and thus,

$$\lim\limits_{n^{\prime}\to\infty}\int\limits_{\mathbb{R}}x^{2}\,dF_{n^{\prime}}(x)=\int\limits_{\mathbb{R}}x^{2}\,dF(x).$$

Taking into account (0.4), we conclude that

$$\int\limits_{\mathbb{R}}x^{2}\,dF(x)=1.$$

Moreover, since \(E(S_{n}^{2}/DS_{n})=1\), the normed sums themselves are uniformly integrable, and the same argument gives \(\int_{\mathbb{R}}x\,dF(x)=0\); hence \(F\) is nondegenerate. Since the random variables \(\{\xi_{n,j}/\sqrt{DS_{n}}\}\) satisfy condition (0.2), the Khinchin theorem yields \(F=\Phi_{0,\sigma^{2}}\) for some \(\sigma>0\), and since \(\int_{\mathbb{R}}x^{2}\,d\Phi_{0,\sigma^{2}}(x)=\sigma^{2}\), the parameter \(\sigma^{2}\) in the limiting Gaussian distribution is equal to one. Thus, from the uniform integrability of \(\{S_{n}^{2}/DS_{n},n\geq 1\}\), it follows that \(F_{n^{\prime}}(x)\to\Phi_{0,1}(x)\) as \(n^{\prime}\to\infty\) for any \(x\in\mathbb{R}\).

Thus, any convergent subsequence \(\{F_{n^{\prime}}\}\) of the sequence of distribution functions of the normed row sums \(S_{n}/\sqrt{DS_{n}}\) converges to the same limiting distribution \(\Phi_{0,1}\). Hence, by Proposition 1, the sequence \(\{F_{n}\}\) converges to \(\Phi_{0,1}\) as well. Therefore, for the random variables \(\{\xi_{n,j}\}\), the CLT holds. \(\Box\)

2. Theorem 3 allows us to give a new probabilistic interpretation of the Lindeberg condition. We will show that the Lindeberg condition implies the uniform integrability of the squares of normed row sums, which, in its turn, allows passage to the limit under the expectation sign and thus guarantees that \(\sigma^{2}=1\) in the limiting Gaussian distribution in the Khinchin theorem.

The double sequence \(\{\xi_{n,j}\}\) of random variables satisfies the Lindeberg condition if for any \(\varepsilon>0\),

$$\frac{1}{DS_{n}}\sum\limits_{j=1}^{k_{n}}\int\limits_{\{|\xi_{n,j}|>\varepsilon\sqrt{DS_{n}}\}}\xi_{n,j}^{2}dP\to 0\quad\text{as }n\to\infty.$$
(0.5)

The classical interpretation of the Lindeberg condition is that if a sequence \(\{\xi_{n,j}\}\) of random variables satisfies (0.5), then its elements are asymptotically infinitesimal uniformly in each row, that is, relation (0.1) holds for the normed summands \(\xi_{n,j}/\sqrt{DS_{n}}\). Billingsley (see p. 90 in [1]) noted that the Lindeberg condition also implies the uniform integrability of the squares of the normed sums.
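For example, if the summands are uniformly small in the almost sure sense, namely, \(|\xi_{n,j}|\leq c_{n}\) a.s. with \(c_{n}/\sqrt{DS_{n}}\to 0\) as \(n\to\infty\), then the Lindeberg condition holds trivially: for any fixed \(\varepsilon>0\), the sets \(\{|\xi_{n,j}|>\varepsilon\sqrt{DS_{n}}\}\) are empty for all sufficiently large \(n\), and the sum in (0.5) vanishes.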

Proposition 2. Let \(\{\xi_{n,j}\}\) be a double sequence of (centered) row-wise independent random variables with finite second moments. If \(\{\xi_{n,j}\}\) satisfies the Lindeberg condition (0.5), then the squares of the normed row sums, \(\{S_{n}^{2}/DS_{n},n\geq 1\}\), are uniformly integrable.

Proof. The statement follows from inequality (12.20) in [1], according to which for any \(n\geq 1\) and \(C>0\), one has

$$\int\limits_{\{S_{n}^{2}\geq CDS_{n}\}}\frac{S_{n}^{2}}{DS_{n}}\,dP\leq K\left(\frac{1}{C}+\frac{1}{DS_{n}}\sum\limits_{j=1}^{k_{n}}\int\limits_{\{|\xi_{n,j}|\geq\frac{1}{4}\sqrt{CDS_{n}}\}}\xi_{n,j}^{2}\,dP\right),$$

where \(K\) is some universal constant. Fix some \(C_{0}>0\). By (0.5), there exists \(n_{0}=n_{0}(C_{0})\geq 1\) such that

$$\frac{1}{DS_{n}}\sum\limits_{j=1}^{k_{n}}\int\limits_{\{|\xi_{n,j}|\geq\frac{1}{4}\sqrt{C_{0}DS_{n}}\}}\xi_{n,j}^{2}\,dP\leq\frac{1}{C_{0}}\quad\text{for any }n>n_{0}.$$

Since the integration sets shrink as the threshold grows, for any \(C\geq C_{0}\), we obtain

$$\sup\limits_{n}\int\limits_{\{S_{n}^{2}\geq CDS_{n}\}}\frac{S_{n}^{2}}{DS_{n}}\,dP\leq K\left(\frac{1}{C}+\frac{1}{C_{0}}+\max\limits_{1\leq m\leq n_{0}}\frac{1}{DS_{m}}\sum\limits_{j=1}^{k_{m}}\int\limits_{\{|\xi_{m,j}|\geq\frac{1}{4}\sqrt{CDS_{m}}\}}\xi_{m,j}^{2}\,dP\right).$$

For each fixed \(m\), the integrals under the maximum tend to zero as \(C\to\infty\) by the dominated convergence theorem (recall that \(E\xi_{m,j}^{2}<\infty\)), and the maximum is taken over finitely many indices. Hence

$$\limsup\limits_{C\to\infty}\sup\limits_{n}\int\limits_{\{S_{n}^{2}\geq CDS_{n}\}}\frac{S_{n}^{2}}{DS_{n}}\,dP\leq\frac{K}{C_{0}},$$

and since \(C_{0}>0\) is arbitrary,

$$\sup\limits_{n}\int\limits_{\{S_{n}^{2}\geq CDS_{n}\}}\frac{S_{n}^{2}}{DS_{n}}\,dP\to 0\quad\text{as }C\to\infty.$$

\(\Box\)

The statement above reveals the true essence of the Lindeberg condition. Since uniform integrability is a necessary and sufficient condition for passing to the limit under the expectation sign, we conclude that the Lindeberg condition is one of the conditions under which the limiting Gaussian distribution in the Khinchin theorem is the standard one. Taking this fact into account, we provide a new proof of the well-known Lévy–Lindeberg theorem.

Theorem 4 (Lévy–Lindeberg). Let \(\{\xi_{n,j}\}\) be a double sequence of row-wise independent random variables such that \(E\xi_{n,j}=0\), \(0<E\xi_{n,j}^{2}<\infty\), \(1\leq j\leq k_{n}\), \(n\geq 1\). If the random variables \(\{\xi_{n,j}\}\) satisfy the Lindeberg condition (0.5), then for \(\{\xi_{n,j}\}\) the CLT holds.

Proof. First note that random variables \(\{\xi_{n,j}/\sqrt{DS_{n}},1\leq j\leq k_{n},n\geq 1\}\) satisfy condition (0.2), since for any \(\varepsilon>0\), we have

$$P\left(\max\limits_{1\leq j\leq k_{n}}|\xi_{n,j}|\geq\varepsilon\sqrt{DS_{n}}\right)\leq\sum\limits_{j=1}^{k_{n}}P(|\xi_{n,j}|\geq\varepsilon\sqrt{DS_{n}})$$
$${}\leq\frac{1}{\varepsilon^{2}DS_{n}}\sum\limits_{j=1}^{k_{n}}\int\limits_{\{|\xi_{n,j}|\geq\varepsilon\sqrt{DS_{n}}\}}\xi_{n,j}^{2}\,dP\to 0$$

as \(n\to\infty\) by (0.5). Further, by Proposition 2, the random variables \(\{S_{n}^{2}/DS_{n},n\geq 1\}\) are uniformly integrable. Thus, by Theorem 3, for \(\{\xi_{n,j}\}\) the CLT holds. \(\Box\)
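As a standard corollary, Theorem 4 contains the Lyapunov theorem: if for some \(\delta>0\),

$$\frac{1}{(DS_{n})^{1+\delta/2}}\sum\limits_{j=1}^{k_{n}}E|\xi_{n,j}|^{2+\delta}\to 0\quad\text{as }n\to\infty,$$

then (0.5) holds, since on the set \(\{|\xi_{n,j}|>\varepsilon\sqrt{DS_{n}}\}\) one has \(\xi_{n,j}^{2}\leq|\xi_{n,j}|^{2+\delta}/(\varepsilon\sqrt{DS_{n}})^{\delta}\), and hence the left-hand side of (0.5) does not exceed \(\varepsilon^{-\delta}(DS_{n})^{-1-\delta/2}\sum_{j=1}^{k_{n}}E|\xi_{n,j}|^{2+\delta}\).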

3. Let us illustrate the application of Theorem 3 in the case of independent identically distributed (i.i.d.) random variables. Namely, we will use this theorem to prove the following classical result.

Theorem 5 (Lévy–Khinchin). Let \(\{\eta_{n},n\geq 1\}\) be a sequence of i.i.d. random variables such that \(E\eta_{1}=0\) and \(D\eta_{1}=\sigma_{0}^{2}\), \(0<\sigma_{0}^{2}<\infty\). Then for \(\{\eta_{n}\}\) the CLT holds.

Proof. Consider the double array \(\{\xi_{n,j},1\leq j\leq n,n\geq 1\}\) of random variables \(\xi_{n,j}=\dfrac{\eta_{j}}{\sigma_{0}\sqrt{n}}\). The random variables \(\{\xi_{n,j}\}\) satisfy condition (0.2), since for any \(\varepsilon>0\),

$$P\left(\max\limits_{1\leq j\leq n}|\xi_{n,j}|\geq\varepsilon\right)\leq nP\left(|\eta_{1}|\geq\varepsilon\sigma_{0}\sqrt{n}\right)\leq\frac{1}{\varepsilon^{2}\sigma_{0}^{2}}\int\limits_{\{|\eta_{1}|\geq\varepsilon\sigma_{0}\sqrt{n}\}}\eta_{1}^{2}\,dP\to 0$$

as \(n\to\infty\) by the dominated convergence theorem. Further, put \(\hat{S}_{n}=\sum_{j=1}^{n}\xi_{n,j}=\dfrac{1}{\sigma_{0}\sqrt{n}}\sum_{j=1}^{n}\eta_{j}\); then \(D\hat{S}_{n}=1\). Applying inequality (12.19) in [1], for any \(C>0\), we can write

$$P(\hat{S}_{n}^{2}\geq C^{2})\leq\max\limits_{1\leq j\leq n}P\left(\hat{S}_{j}^{2}\geq C^{2}\right)\leq P\left(\max\limits_{1\leq j\leq n}|\hat{S}_{j}|\geq C\right)$$
$${}\leq K\left(\dfrac{1}{C^{4}}+\displaystyle\dfrac{1}{C^{2}}\sum\limits_{j=1}^{n}\int\limits_{\{|\xi_{n,j}|>\frac{1}{4}C\}}\xi_{n,j}^{2}\,dP\right),$$

where \(K\) is some positive constant. Further, applying equality (3) on p. 223 in [1], we can write

$$\int\limits_{\{\hat{S}_{n}^{2}\geq C\}}\hat{S}_{n}^{2}\,dP=CP\left(\hat{S}_{n}^{2}\geq C\right)+\int\limits_{C}^{\infty}P\left(\hat{S}_{n}^{2}\geq t\right)dt$$
$${}\leq C\left(\dfrac{K}{C^{4}}+\dfrac{K}{C^{2}\sigma_{0}^{2}}\displaystyle\int\limits_{\{|\eta_{1}|>C\sigma_{0}/4\}}\eta_{1}^{2}\,dP\right)+\displaystyle\int\limits_{C}^{\infty}\left(\dfrac{K}{t^{4}}+\dfrac{K}{t^{2}\sigma_{0}^{2}}\displaystyle\int\limits_{\{|\eta_{1}|>t\sigma_{0}/4\}}\eta_{1}^{2}\,dP\right)dt$$
$${}\leq K\left(\dfrac{1}{C^{3}}+\displaystyle\int\limits_{C}^{\infty}\dfrac{dt}{t^{4}}+\dfrac{1}{\sigma_{0}^{2}}\left(\dfrac{1}{C}+\int\limits_{C}^{\infty}\dfrac{dt}{t^{2}}\right)\int\limits_{\{|\eta_{1}|>C\sigma_{0}/4\}}\eta_{1}^{2}\,dP\right)=K\left(\dfrac{4}{3C^{3}}+\dfrac{2}{C\sigma_{0}^{2}}\int\limits_{\{|\eta_{1}|>C\sigma_{0}/4\}}\eta_{1}^{2}\,dP\right).$$

The right-hand side does not depend on \(n\) and tends to zero as \(C\to\infty\), since \(E\eta_{1}^{2}<\infty\). From here it follows that the random variables \(\{\hat{S}_{n}^{2},n\geq 1\}\) are uniformly integrable. Hence, by Theorem 3, for the random variables \(\{\xi_{n,j}\}\) the CLT holds. It remains to note that \(P\left(\hat{S}_{n}\leq x\right)=P\left(\dfrac{S_{n}}{\sqrt{DS_{n}}}\leq x\right)\), \(x\in\mathbb{R}\), where \(S_{n}=\sum_{j=1}^{n}\eta_{j}\). \(\Box\)

We see that checking the uniform integrability of the squares of normed sums directly requires some effort. At the same time, the Lindeberg condition for i.i.d. random variables can be checked quite simply. That is why the Lindeberg condition is preferable in applications.
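Indeed, for i.i.d. summands \(\xi_{n,j}=\eta_{j}/(\sigma_{0}\sqrt{n})\), \(1\leq j\leq n\), the left-hand side of (0.5) takes the form

$$\frac{1}{\sigma_{0}^{2}}\int\limits_{\{|\eta_{1}|>\varepsilon\sigma_{0}\sqrt{n}\}}\eta_{1}^{2}\,dP,$$

which tends to zero as \(n\to\infty\) by the dominated convergence theorem, since \(E\eta_{1}^{2}=\sigma_{0}^{2}<\infty\).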