1 Introduction

In this paper, all random elements are defined on a probability space \((\Omega ,\mathcal {F},\mathbb {P})\) and take values in a real separable Banach space \(\mathcal {X}\) with norm \(\Vert \cdot \Vert \).

Let I be a nonempty index set. A family of \(\mathcal {X}\)-valued random elements \(\{X_{i}, i \in I \}\) is said to be stochastically dominated by a real-valued random variable X if

$$\begin{aligned} \sup _{i\in I} \mathbb {P}(\Vert X_{i}\Vert>t)\le \mathbb {P}(|X|>t) \text{ for } \text{ all } t\ge 0. \end{aligned}$$
(1.1)

If the random elements \(X_i\), \(i\in I\), are identically distributed, then (1.1) is of course satisfied with \(X=\Vert X_{i_0}\Vert \) for any \(i_0\in I\). If a family of random elements \(\{X_{i}, i \in I\}\) is stochastically dominated by a real-valued random variable X, then for all \(p> 0\) and \(t> 0\), we can prove that

$$\begin{aligned} \sup _{i \in I}E(\Vert X_{i}\Vert ^p \textbf{1}(\Vert X_{i}\Vert>t))\le E(|X|^p\textbf{1}(|X|>t)) \end{aligned}$$
(1.2)

and

$$\begin{aligned} \sup _{i \in I}E(\Vert X_{i}\Vert ^p \textbf{1}(\Vert X_{i}\Vert \le t))\le E(|X|^p\textbf{1}(|X|\le t))+ t^p\mathbb {P}(|X|>t)\le E(|X|^p). \end{aligned}$$
(1.3)

We will use (1.2) and (1.3) in our proofs without further mention.
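For completeness, the standard verification of (1.2) via the tail-integral formula \(E(Z)=\int_0^\infty \mathbb{P}(Z>s)\,\mathrm{d}s\) for nonnegative Z runs as follows:

```latex
\begin{aligned}
E\bigl(\Vert X_{i}\Vert^{p}\,\textbf{1}(\Vert X_{i}\Vert>t)\bigr)
  &= t^{p}\,\mathbb{P}(\Vert X_{i}\Vert>t)
     + \int_{t^{p}}^{\infty}\mathbb{P}\bigl(\Vert X_{i}\Vert^{p}>s\bigr)\,\mathrm{d}s \\
  &\le t^{p}\,\mathbb{P}(|X|>t)
     + \int_{t^{p}}^{\infty}\mathbb{P}\bigl(|X|^{p}>s\bigr)\,\mathrm{d}s
   = E\bigl(|X|^{p}\,\textbf{1}(|X|>t)\bigr),
\end{aligned}
```

where the inequality applies (1.1) twice; taking the supremum over \(i\in I\) gives (1.2), and a similar truncated computation gives (1.3).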

It was shown by Rosalsky and Thành [16] that (1.1) is equivalent to an apparently weaker definition of \(\{X_{i}, i \in I\}\) being stochastically dominated by a real-valued random variable Y, namely that

$$\begin{aligned} \sup _{i\in I} \mathbb {P}(\Vert X_{i}\Vert>t)\le C_1\mathbb {P}(C_2 |Y|>t) \text{ for } \text{ all } t\ge 0 \end{aligned}$$
(1.4)

for some constants \(C_1, C_2\in (0,\infty )\).

Let \(\mathbb {Z}_{+}^{d}\), where d is a positive integer, denote the set of d-dimensional lattice points with positive integer coordinates. The notation \(\textbf{m} \prec \textbf{n}\), where \(\textbf{m} = \left( m_{1}, m_{2}, \ldots , m_{d}\right) \) and \(\textbf{n} = \left( n_{1}, n_{2}, \ldots , n_{d}\right) \in \mathbb {Z}_{+}^{d}\), means that \(m_{i} \le n_{i}\), \(1 \le i \le d\). For \(\textbf{n}=\left( n_{1}, n_{2}, \ldots , n_{d}\right) \in \mathbb {Z}_{+}^{d}\), we write \(|\textbf{n}|=\prod _{i=1}^{d} n_{i}\). Consider a d-dimensional array \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) of random elements defined on a probability space \((\Omega , \mathcal {F}, \mathbb {P})\) and taking values in a real separable Banach space \(\mathcal {X}\) with norm \(\Vert \cdot \Vert \).
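As a concrete illustration of this notation (a minimal sketch with hypothetical values, not part of the paper), the following computes \(|\textbf{n}|\), the partial sums \(S_{\textbf{k}}=\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}}\), and their maximum over \(\textbf{k} \prec \textbf{n}\) for a small two-dimensional array of real numbers:

```python
import itertools
import math

def partial_sum_max(X, n):
    """Given a dict X mapping 1-based multi-indices j to reals and an upper
    index n, return (|n|, max_{k ≺ n} |S_k|), where S_k = sum_{j ≺ k} X_j
    and j ≺ k means j_i <= k_i for every coordinate i."""
    size = math.prod(n)  # |n| = n_1 * ... * n_d
    best = 0.0
    for k in itertools.product(*(range(1, ni + 1) for ni in n)):
        S_k = sum(X[j] for j in itertools.product(*(range(1, ki + 1) for ki in k)))
        best = max(best, abs(S_k))
    return size, best

# a hypothetical 2x3 array with entries X_{j1,j2} = (-1)^{j1+j2}
X = {(j1, j2): (-1) ** (j1 + j2) for j1 in range(1, 3) for j2 in range(1, 4)}
size, m = partial_sum_max(X, (2, 3))
```

Here the alternating signs make every partial sum lie in \(\{-1,0,1\}\), so the maximum is small relative to \(|\textbf{n}|\).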

Gut [6] proved that if \(\left\{ X, X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is a d-dimensional array of i.i.d. real-valued random variables with \(E|X|^{p}<\infty \ (0<p<1)\), then

$$\begin{aligned} \dfrac{\sum _{\textbf{j} \prec \textbf{n}} X_{\textbf{j}}}{|\textbf{n}|^{1 / p}} \longrightarrow 0 \text{ in } L^{p} \text{ as } \min _{1 \le i \le d} n_{i} \longrightarrow \infty . \end{aligned}$$
(1.5)

Thành [20] improved Gut's result by proving (1.5) under the condition that \(\left\{ \left\| X_{\textbf{n}}\right\| ^{p}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is uniformly integrable. Limit theorems for arrays of dependent random variables under various types of uniform integrability have also been studied by many authors. We refer to [2, 11,12,13, 18] and the references therein.

We note that the aforementioned results did not consider mean convergence for maximal normed partial sums, which is of special interest. Motivated by the results of Gut [6] and Thành [20], in the present study we establish mean convergence theorems and weak laws of large numbers for maximal normed partial sums from a d-dimensional array of random elements taking values in a real separable Banach space, irrespective of the joint distributions. The first main result, Theorem 3.1, extends Theorem 2.1 of Thành [20] to Banach space-valued random elements and, at the same time, improves this result by obtaining mean convergence for maximal normed partial sums. Our result on the weak laws of large numbers extends a recent result of Boukhari [3, Theorem 1.2] to random fields by a different method.

The rest of the paper is arranged as follows: Three preliminary lemmas are presented in Sect. 2. In Sect. 3, we establish a mean convergence theorem and weak laws of large numbers for maximum normed sums from a d-dimensional array of arbitrary random elements. Three examples are also provided to illustrate the sharpness of the main results.

2 Preliminaries

In this section, we present the preliminary lemmas needed for the main results. The first lemma is a well-known result; its proof follows immediately from the \(c_p\) inequality (see, e.g., Gut [8, Theorem 3.2.2]) and induction.

Lemma 2.1

Let \(a_1,\ldots ,a_n\) be elements of a Banach space, and let \(0<p\le 1\). Then,

$$\begin{aligned} \Vert a_1+\cdots +a_n\Vert ^p\le \Vert a_1\Vert ^p+\cdots +\Vert a_n\Vert ^p. \end{aligned}$$
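Indeed, since \(t \mapsto t^{p}\) is subadditive on \([0,\infty )\) for \(0<p\le 1\), the triangle inequality gives

```latex
\Vert a_{1}+a_{2}\Vert^{p}
\le \bigl(\Vert a_{1}\Vert+\Vert a_{2}\Vert\bigr)^{p}
\le \Vert a_{1}\Vert^{p}+\Vert a_{2}\Vert^{p},
```

and the case of general n follows by induction.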

The following lemma is a maximal inequality for d-sums of random elements.

Lemma 2.2

Let \(0<p\le 1\), and let \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) be a d-dimensional array of random elements in a real separable Banach space \(\mathcal {X}\) satisfying \(E\left\| X_{\textbf{k}}\right\| ^p<\infty \), \(\textbf{k} \in \mathbb {Z}_{+}^{d}\). Then for all \(\textbf{n} \in \mathbb {Z}_{+}^{d}\), we have

$$\begin{aligned} E\left( \max _{\textbf{k} \prec \textbf{n}}\left\| S_{\textbf{k}}\right\| ^p\right) \le \sum _{\textbf{k} \prec \textbf{n}}E\left\| X_{\textbf{k}}\right\| ^p \end{aligned}$$
(2.1)

where \(S_{\textbf{k}}=\sum _{\textbf{i} \prec \textbf{k}} X_{\textbf{i}}\).

Proof

We have

$$\begin{aligned} E\left( \max _{\textbf{k} \prec \textbf{n}}\left\| S_{\textbf{k}}\right\| ^p\right)&\le E\left( \max _{\textbf{k} \prec \textbf{n}} \sum _{\textbf{i} \prec \textbf{k}} \Vert X_{\textbf{i}}\Vert ^p \right) \\&= E\left( \sum _{\textbf{i} \prec \textbf{n}}\Vert X_{\textbf{i}}\Vert ^p\right) \\&= \sum _{\textbf{i} \prec \textbf{n}}E\Vert X_{\textbf{i}}\Vert ^p, \end{aligned}$$

where we have applied Lemma 2.1 in the first inequality. \(\square \)
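A quick numerical sanity check of (2.1) (a sketch with hypothetical data: d = 2, p = 1/2, real-valued entries). Since the inequality \(\max _{\textbf{k} \prec \textbf{n}}\Vert S_{\textbf{k}}\Vert ^p\le \sum _{\textbf{k} \prec \textbf{n}}\Vert X_{\textbf{k}}\Vert ^p\) holds pathwise by Lemma 2.1, taking expectations gives (2.1):

```python
import itertools
import random

random.seed(0)
p = 0.5

# one sample of a 3x3 array of bounded real-valued random variables (illustrative)
X = {j: random.uniform(-1, 1) for j in itertools.product(range(1, 4), repeat=2)}

def S(k):
    # S_k = sum_{j ≺ k} X_j
    return sum(X[j] for j in itertools.product(*(range(1, ki + 1) for ki in k)))

lhs = max(abs(S(k)) ** p for k in itertools.product(range(1, 4), repeat=2))
rhs = sum(abs(x) ** p for x in X.values())
# pathwise form of (2.1); it holds for every sample, not just on average
assert lhs <= rhs
```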

Applying Markov’s inequality and Lemma 2.2 (the case \(p=1\)), we have the following result.

Lemma 2.3

Let \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) be a collection of random elements in a real separable Banach space and set \(S_{\textbf{k}}=\sum _{\textbf{i} \prec \textbf{k}} X_{\textbf{i}}\). Then,

$$\begin{aligned} \mathbb {P}\left\{ \max _{\textbf{k} \prec \textbf{n}}\left\| S_{\textbf{k}}\right\|>t\right\} \le \frac{\sum _{\textbf{i} \prec \textbf{n}} E\left\| X_{\textbf{i}}\right\| }{t}, \quad t>0. \end{aligned}$$

3 Main Results

We can now present the main results. The first theorem extends Theorem 2.1 of Thành [20] to Banach space-valued random elements. Moreover, in Thành [20], the author only obtained mean convergence results for normed partial sums, which are weaker than the mean convergence results for maximal normed partial sums in Theorem 3.1. We note that norming constants of the form \(|\textbf{n}|^{1 / p}\), \(p \in (0,1)\), grow much faster than in the linear case, and we do not require the underlying random elements to satisfy any specific dependence structure. The strong law of large numbers for this case was studied by some authors (see, e.g., Theorem 5.1 in [5] and Theorem 3.2 in [15]).

Theorem 3.1

Let \(0<p<1\), and let \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) be a d-dimensional array of random elements in a real separable Banach space \(\mathcal {X}\) such that \(\left\{ \left\| X_{\textbf{n}}\right\| ^{p}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is uniformly integrable. Then,

$$\begin{aligned} \dfrac{\max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \right\| }{|\textbf{n}|^{1/p}} {\mathop {\longrightarrow }\limits ^{\mathcal {L}^p}} 0 \quad \text{ as } |\textbf{n}| \rightarrow \infty . \end{aligned}$$
(3.1)

Proof

Let \(\varepsilon >0\) be arbitrary. Since \(\left\{ \left\| X_{\textbf{n}}\right\| ^{p}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is uniformly integrable, there exists \(M>0\) such that

$$\begin{aligned} \sup _{\textbf{n} \in \mathbb {Z}_{+}^{d}}E\left( \Vert X_{\textbf{n}}\Vert ^{p} \textbf{1}\left( \Vert X_{\textbf{n}}\Vert >M\right) \right) <\varepsilon . \end{aligned}$$
(3.2)

Applying Lemma 2.1, we have

$$\begin{aligned} \begin{aligned}&E \left( \dfrac{1}{|\textbf{n}|^{1/p}}\max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \right\| \right) ^p\\&\le \dfrac{1}{|\textbf{n}|} E \left( \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \textbf{1}\left( \Vert X_{\textbf{j}}\Vert \le M\right) \right\| \right) ^p + \dfrac{1}{|\textbf{n}|} E \left( \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \textbf{1}\left( \Vert X_{\textbf{j}}\Vert >M\right) \right\| \right) ^p\\&:=I_1+I_2. \end{aligned} \end{aligned}$$
(3.3)

By Lemma 2.2 and (3.2), we have

$$\begin{aligned} \begin{aligned} I_2&=\dfrac{1}{|\textbf{n}|} E \left( \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \textbf{1}\left( \Vert X_{\textbf{j}}\Vert>M\right) \right\| \right) ^p\\&\le \dfrac{1}{|\textbf{n}|}\sum _{\textbf{j} \prec \textbf{n}} E\left\| X_{\textbf{j}} \textbf{1}\left( \Vert X_{\textbf{j}}\Vert >M\right) \right\| ^p\\&\le \dfrac{|\textbf{n}|\varepsilon }{|\textbf{n}|}=\varepsilon . \end{aligned} \end{aligned}$$
(3.4)

Noting that \(\sup _{\textbf{j} \in \mathbb {Z}_{+}^{d}}\Vert X_{\textbf{j}}\textbf{1}\left( \Vert X_{\textbf{j}}\Vert \le M\right) \Vert \le M\), we have

$$\begin{aligned} \begin{aligned} I_1&= \dfrac{1}{|\textbf{n}|} E \left( \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \textbf{1}\left( \Vert X_{\textbf{j}}\Vert \le M\right) \right\| \right) ^p \\&\le \dfrac{1}{|\textbf{n}|} \left( E\left( \max _{\textbf{k} \prec \textbf{n} } \sum _{\textbf{j} \prec \textbf{k}} \left\| X_{\textbf{j}} \textbf{1}\left( \Vert X_{\textbf{j}}\Vert \le M\right) \right\| \right) \right) ^{p}\\&\le \dfrac{1}{|\textbf{n}|} \left( \sum _{\textbf{j} \prec \textbf{n}} E\left\| X_{\textbf{j}} \textbf{1}\left( \Vert X_{\textbf{j}}\Vert \le M\right) \right\| \right) ^p\\&\le \dfrac{ \left( |\textbf{n}|M\right) ^p}{|\textbf{n}|}\\&\le \dfrac{M^p}{|\textbf{n}|^{1-p}}\rightarrow 0 \text { as } |\textbf{n}|\rightarrow \infty , \end{aligned} \end{aligned}$$
(3.5)

where we have applied Jensen’s inequality in the first inequality, and Lemma 2.2 in the second inequality.

Since \(\varepsilon >0\) is arbitrary, the conclusion (3.1) follows from (3.3)–(3.5). The proof of the theorem is completed. \(\square \)
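To see (3.1) numerically in the simplest real-valued setting (an illustrative sketch, not part of the proof: i.i.d. ±1 signs, d = 2, p = 1/2, so that \(\{|X_{\textbf{n}}|^p\}\) is trivially uniformly integrable), one can compute the normed maximum on a sample path via two-dimensional prefix sums:

```python
import random

random.seed(1)

def normed_max(n1, n2, p=0.5):
    """One sample of max_{k ≺ n} |S_k| / |n|^{1/p} for i.i.d. ±1 entries (d = 2)."""
    X = [[random.choice((-1, 1)) for _ in range(n2)] for _ in range(n1)]
    best = 0.0
    prev = [0.0] * (n2 + 1)  # prev[j] = S_{(i, j)} for the previous row index i
    for i in range(n1):
        cur = [0.0] * (n2 + 1)
        row = 0.0
        for j in range(n2):
            row += X[i][j]                  # prefix sum within the current row
            cur[j + 1] = prev[j + 1] + row  # S_{(i+1, j+1)}
            best = max(best, abs(cur[j + 1]))
        prev = cur
    return best / (n1 * n2) ** (1 / p)

small = normed_max(5, 5)    # |n| = 25
large = normed_max(40, 40)  # |n| = 1600
```

Since the maximum is at most \(|\textbf{n}|\) while the denominator is \(|\textbf{n}|^{2}\), the ratio is forced to shrink as \(|\textbf{n}|\) grows, in line with (3.1).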

The following example shows that Theorem 3.1 may fail if \(p=1\).

Example 3.2

Let \(p=1\), and let v be a nonzero vector in a real separable Banach space \(\mathcal {X}\). Let X be a random element taking values in \(\mathcal {X}\) such that

$$\begin{aligned} \mathbb {P}(X=v)=\mathbb {P}(X=-v)=\dfrac{1}{2}. \end{aligned}$$

Consider a d-dimensional array \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) of \(\mathcal {X}\)-valued random elements with \(X_{\textbf{n}}=X\) almost surely for all \(\textbf{n} \in \mathbb {Z}_{+}^{d}\). Then, clearly, \(\left\{ \left\| X_{\textbf{n}}\right\| , \ \textbf{n} \in \mathbb {Z}_{+}^d\right\} \) is uniformly integrable; however,

$$\begin{aligned} E \left\{ \dfrac{1}{|\textbf{n}|^{1/p}}\max _{\textbf{k} \prec \textbf{n} } \left\| \sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} \right\| \right\} ^p\ge \dfrac{1}{|\textbf{n}|}E\left\| \sum _{\textbf{j} \prec \textbf{n}} X_{\textbf{j}} \right\| =\dfrac{|\textbf{n}|\Vert v\Vert }{|\textbf{n}|}=\Vert v\Vert \not =0. \end{aligned}$$

Therefore, (3.1) fails.

The next example shows that Theorem 3.1 may fail if the assumption that \(\left\{ \Vert X_{\textbf{n}}\Vert ^p, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is uniformly integrable is weakened to the condition that

$$\begin{aligned} \sup _{\textbf{n} \in \mathbb {Z}_{+}^{d}}E\left( \Vert X_{\textbf{n}}\Vert ^{p}\right) <\infty . \end{aligned}$$
(3.6)

Example 3.3

Let \(\mathcal {X}\) be the real line, let \(0< p < 1\), and let \(\{Y_n,n\ge 1\}\) be a sequence of independent real-valued random variables with

$$\begin{aligned} \mathbb {P}(Y_n=0)=1-\frac{1}{n},\ \mathbb {P}(Y_n=n^{1/p})=\mathbb {P}(Y_n=-n^{1/p})=\frac{1}{2n},\ n\ge 1. \end{aligned}$$

Then, it is clear that

$$\begin{aligned} \sup _{n\ge 1}E(|Y_{n}|^p)=1<\infty \end{aligned}$$

and

$$\begin{aligned} \sup _{n\ge 1} E\left( \left| Y_{n}\right| ^{p} \textbf{1}\left( \left| Y_{n}\right| ^p>a\right) \right) =1 \end{aligned}$$

for all \(a>0\). Therefore, (3.6) is satisfied but \(\left\{ \left| Y_{n}\right| ^{p}, n \ge 1\right\} \) is not uniformly integrable. For a real number x, let \(\lfloor x\rfloor \) denote the greatest integer that is smaller than or equal to x. Then for \(0<\varepsilon <1 / 4\) and for \(n \ge 2\), we have

$$\begin{aligned} \begin{aligned} \sum _{i=1}^{n} \mathbb {P}\left( \left| Y_{i}\right|>\varepsilon n^{1 / p}\right)&\ge \sum _{i=\lfloor n / 2\rfloor }^{n} \mathbb {P}\left( \left| Y_{i}\right| >\varepsilon n^{1 / p}\right) \\&\ge \sum _{i=\lfloor n / 2\rfloor }^{n} \frac{1}{n} \ge \frac{1}{2}. \end{aligned} \end{aligned}$$
(3.7)
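The moment computations in this example are easy to verify directly; the following sketch (with the illustrative choices p = 1/2 and \(\varepsilon = 0.2\), hypothetical values not fixed by the paper) checks that \(E|Y_n|^p = 1\) for every n, that the truncated moments do not vanish, and that the lower bound in (3.7) holds:

```python
from fractions import Fraction

p = 0.5

def moment(n, a=0):
    """E(|Y_n|^p 1(|Y_n|^p > a)), computed exactly: |Y_n|^p equals n with
    probability 1/n and 0 otherwise, so the expectation is n * (1/n) = 1
    whenever n > a."""
    return Fraction(n) * Fraction(1, n) if n > a else Fraction(0)

assert all(moment(n) == 1 for n in range(1, 200))  # sup_n E|Y_n|^p = 1, so (3.6) holds
assert moment(1000, a=100) == 1                    # tails do not vanish uniformly

def tail_sum(n, eps=0.2):
    """sum_{i=1}^n P(|Y_i| > eps * n^{1/p}); each term equals 1/i when i^{1/p}
    exceeds the threshold and 0 otherwise."""
    return sum(1.0 / i for i in range(1, n + 1) if i ** (1 / p) > eps * n ** (1 / p))

assert tail_sum(1000) >= 0.5  # the lower bound in (3.7)
```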

By using Theorem 3.3 of Boukhari [3], we have

$$\begin{aligned} \frac{\max _{1 \le i \le n}\left| Y_{i}\right| }{b_{n}} {\mathop {\rightarrow }\limits ^{\mathbb {P}}} 0 \quad \text{ if } \text{ and } \text{ only } \text{ if } \quad \sum _{i=1}^{n} \mathbb {P}\left( \left| Y_{i}\right|>b_{n} \varepsilon \right) \rightarrow 0 \text{ for } \text{ all } \varepsilon >0. \end{aligned}$$
(3.8)

Combining (3.8) and (3.7) yields

$$\begin{aligned} \frac{\max _{1 \le i \le n}\left| Y_{i}\right| }{n^{1 / p}} \overset{\mathbb {P}}{\nrightarrow } 0 . \end{aligned}$$
(3.9)

Let \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) be a d-dimensional array of real-valued random variables such that for \(\textbf{n}=(n_1,\cdots ,n_d)\in \mathbb {Z}_{+}^{d}\),

$$\begin{aligned} X_{\textbf{n}}=Y_{n_1} \text { if } n_2=\cdots =n_d=1, \end{aligned}$$

and

$$\begin{aligned} X_{\textbf{n}}=0 \text { if } \max \{n_2,\cdots ,n_d\}\ge 2. \end{aligned}$$

Then, \(\left\{ |X_{\textbf{n}}|^p, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is not uniformly integrable but (3.6) holds. For \(n_1\ge 1\) and \(1\le j\le n_1\), we have

$$\begin{aligned} \begin{aligned} \frac{\sum _{i_1=1}^{j}X_{i_1,1,\ldots ,1}}{n_{1}^{1/p}}=\frac{\sum _{k=1}^{j}Y_k}{n_{1}^{1/p}}. \end{aligned} \end{aligned}$$
(3.10)

From (3.9), we see that

$$\begin{aligned} \dfrac{\max _{1\le j\le n_1}\left| \sum _{i=1}^{j}Y_i\right| }{n_{1}^{1/p}} \end{aligned}$$

does not converge to 0 in probability as \(n_1\rightarrow \infty \), so it does not converge to 0 in mean of order p as \(n_1\rightarrow \infty \). Combining this with (3.10), we see that (3.1) fails.

The next theorem establishes the weak law of large numbers for maximal normed partial sums from a d-dimensional array of Banach space-valued random elements. It extends Theorem 2.2 of Boukhari [3] to the d-dimensional setting. We note that our proof is substantially different from that of Boukhari [3]: Boukhari [3] used the Doob inequality from martingale theory, whereas our proof is based on Lemma 2.2. We note that (3.11) is satisfied for \(b_n=n^{1/p}L(n)\), where \(0<p<1\) and \(L(\cdot )\) is a slowly varying function, but fails if \(b_n = n\). An example also shows that in Theorem 3.4, condition (3.11) cannot be dispensed with (see Example 3.6).
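Condition (3.11) is easy to probe numerically. The sketch below (hypothetical parameter choices: \(L\equiv 1\) and p = 1/2, so \(b_k=k^2\)) checks that the ratio \(\frac{n}{b_n}\sum_{k\le n} b_k/k^2\) stays bounded in that case, while for \(b_k = k\) it is the harmonic number \(H_n\) and diverges:

```python
def ratio(n, b):
    """(n / b(n)) * sum_{k <= n} b(k) / k^2; condition (3.11) asks that this
    quantity stay bounded as n grows."""
    return (n / b(n)) * sum(b(k) / k ** 2 for k in range(1, n + 1))

# b_k = k^{1/p} with p = 1/2 (L ≡ 1): the ratio is identically 1, so (3.11) holds
assert abs(ratio(50, lambda k: k ** 2) - 1.0) < 1e-9
assert abs(ratio(1000, lambda k: k ** 2) - 1.0) < 1e-9

# b_k = k (the linear case): the ratio equals H_n, which grows like log n,
# so (3.11) fails
assert ratio(1000, lambda k: float(k)) > ratio(100, lambda k: float(k)) > 5.0
```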

Theorem 3.4

Let \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) be a d-dimensional array of random elements in a real separable Banach space \(\mathcal {X}\). Suppose that the array \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is stochastically dominated by a real-valued random variable X. Let \(\left\{ b_{k}, k \ge 1\right\} \) be a non-decreasing sequence of positive constants satisfying

$$\begin{aligned} \sum _{k=1}^{n} \frac{b_{k}}{k^{2}}=\mathcal {O}\left( \frac{b_{n}}{n}\right) . \end{aligned}$$
(3.11)

If

$$\begin{aligned} \lim _{k \rightarrow \infty } k \mathbb {P}\left\{ |X|>b_{k}\right\} =0, \end{aligned}$$
(3.12)

then

$$\begin{aligned} \dfrac{\max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \right\| }{b_{|\textbf{n}|}} {\mathop {\longrightarrow }\limits ^{\mathbb {P}}} 0 \quad \text{ as } |\textbf{n}| \rightarrow \infty , \end{aligned}$$
(3.13)

where \(\textbf{k}, \textbf{j} \in \mathbb {Z}_{+}^{d}\).

Proof of Theorem 3.4

For \(\textbf{n}, \textbf{j} \in \mathbb {Z}_{+}^{d}\), set

$$\begin{aligned} Y_{\textbf{n}\textbf{j}} = X_{\textbf{j}}\textbf{1}(\Vert X_{\textbf{j}}\Vert \le b_{|\textbf{n}|}). \end{aligned}$$

To demonstrate (3.13), we first verify that

$$\begin{aligned} \dfrac{ \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}}\left( X_{\textbf{j}} - Y_{\textbf{n}\textbf{j}}\right) } \right\| }{b_{|\textbf{n}|}} {\mathop {\longrightarrow }\limits ^{\mathbb {P}}} 0 \quad \text{ as } |\textbf{n}| \rightarrow \infty . \end{aligned}$$
(3.14)

To accomplish this, we recall that (3.11) implies (see [3, Remark 2.4 (i)])

$$\begin{aligned} \dfrac{|\textbf{n}|}{b_{|\textbf{n}|}} \rightarrow 0 \quad \text {as } |\textbf{n}| \rightarrow \infty . \end{aligned}$$
(3.15)

Let \(\varepsilon >0\) be arbitrary and observe that

$$\begin{aligned} \begin{aligned} \mathbb {P}\left\{ \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}}\left( X_{\textbf{j}} - Y_{\textbf{n}\textbf{j}} \right) } \right\|> \varepsilon b_{|\textbf{n}|} \right\}&\le \mathbb {P}\left\{ \bigcup _{\textbf{j} \prec \textbf{n}} (X_{\textbf{j}} \ne Y_{\textbf{n}\textbf{j}}) \right\} \\&\le \sum _{\textbf{j} \prec \textbf{n}} \mathbb {P}\left\{ \Vert X_{\textbf{j}}\Vert> b_{|\textbf{n}|} \right\} \\&\le |\textbf{n}| \mathbb {P}\left\{ |X|>b_{|\textbf{n}|}\right\} \rightarrow 0 \quad \text {as} \quad |\textbf{n}| \rightarrow \infty \quad (\text {by (3.12)}). \end{aligned} \end{aligned}$$

Next, it will be shown that

$$\begin{aligned} \dfrac{ \max _{\textbf{k} \prec \textbf{n} } \left\| \sum _{\textbf{j} \prec \textbf{k}} Y_{\textbf{n}\textbf{j}} \right\| }{ b_{|\textbf{n}|}} {\mathop {\longrightarrow }\limits ^{\mathbb {P}}} 0 \quad \text{ as } |\textbf{n}| \rightarrow \infty . \end{aligned}$$
(3.16)

Set \(b_{0}=0\). Again, let \(\varepsilon >0\) be arbitrary; we have

$$\begin{aligned} \begin{aligned}&\mathbb {P}\left\{ \max _{\textbf{k} \prec \textbf{n} } \left\| \sum _{\textbf{j} \prec \textbf{k}} Y_{\textbf{n}\textbf{j}} \right\|> \varepsilon b_{|\textbf{n}|} \right\} \\&\le \dfrac{1}{\varepsilon b_{|\textbf{n}|}} \sum _{\textbf{j} \prec \textbf{n}} E\left( \Vert Y_{\textbf{n} \textbf{j}}\Vert \right) \quad (\text {by Lemma 2.3}) \\&= \frac{1}{\varepsilon b_{|\textbf{n}|}} \sum _{\textbf{j} \prec \textbf{n}} \int _{0}^{b_{|\textbf{n}|}} \mathbb {P}\left\{ \left\| X_{\textbf{j}}\right\|>t\right\} \textrm{d} t \\&\le \frac{1}{\varepsilon b_{|\textbf{n}|}} |\textbf{n}| \int _{0}^{b_{|\textbf{n}|}} \mathbb {P}\left\{ |X|>t\right\} \textrm{d} t \\&= \frac{1}{\varepsilon b_{|\textbf{n}|}} |\textbf{n}| \sum _{k=1}^{|\textbf{n}|} \int _{b_{k-1}}^{b_k} \mathbb {P}\left\{ |X|>t\right\} \textrm{d} t \\&\le \frac{ |\textbf{n}|}{\varepsilon b_{|\textbf{n}|}} \sum _{k=1}^{|\textbf{n}|} \frac{b_{k}-b_{k-1}}{k} k \mathbb {P}\left\{ |X|>b_{k-1}\right\} . \end{aligned} \end{aligned}$$
(3.17)

Now when \(|\textbf{n}| \ge 2,\)

$$\begin{aligned} \begin{aligned} \frac{ |\textbf{n}|}{ b_{|\textbf{n}|}} \sum _{k=1}^{|\textbf{n}|} \frac{b_{k}-b_{k-1}}{k}&=\frac{|\textbf{n}|}{ b_{|\textbf{n}|}} \left( \dfrac{ b_{|\textbf{n}|}}{|\textbf{n}|} + \sum _{k=1}^{ |\textbf{n}|-1} \dfrac{b_k}{k(k+1)}\right) \\&\le 1+ \frac{ |\textbf{n}|}{ b_{|\textbf{n}|}} \sum _{k=1}^{ |\textbf{n}|} \dfrac{b_k}{k^2}\\&\le C\quad (\text {by } (3.11)), \end{aligned} \end{aligned}$$

for all fixed \(k \ge 1\),

$$\begin{aligned} \frac{ |\textbf{n}|}{ b_{|\textbf{n}|}} \left( \dfrac{b_k - b_{k-1}}{k}\right) \rightarrow 0 \quad \text{ as } |\textbf{n}| \rightarrow \infty \quad (\text{ by } (3.15)), \end{aligned}$$

and for all \(k \ge 2\)

$$\begin{aligned} k \mathbb {P}\left\{ |X|>b_{k-1}\right\} \le 2(k-1) \mathbb {P}\left\{ |X|>b_{k-1}\right\} \rightarrow 0 \quad (\text{ by } (3.12)). \end{aligned}$$

Thus, by the Toeplitz lemma,

$$\begin{aligned} \frac{C |\textbf{n}|}{ b_{|\textbf{n}|}} \sum _{k=1}^{|\textbf{n}|} \frac{b_{k}-b_{k-1}}{k} k \mathbb {P}\left\{ |X|>b_{k-1}\right\} \rightarrow 0 \quad \text{ as } |\textbf{n}| \rightarrow \infty . \end{aligned}$$

Combining this with (3.17), we obtain (3.16). By using (3.14) and (3.16), we obtain (3.13). \(\square \)
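For reference, the version of the Toeplitz lemma used in the proof above can be stated as follows: if \(a_{nk}\ge 0\) with \(\sup _{n}\sum _{k\le n} a_{nk}<\infty \), \(a_{nk}\rightarrow 0\) as \(n\rightarrow \infty \) for each fixed k, and \(x_{k}\rightarrow 0\) as \(k\rightarrow \infty \), then

```latex
\sum_{k\le n} a_{nk}\,x_{k}\longrightarrow 0
\quad \text{as } n\rightarrow \infty .
```

Here it is applied with \(a_{|\textbf{n}|,k}=\dfrac{|\textbf{n}|}{b_{|\textbf{n}|}}\cdot \dfrac{b_{k}-b_{k-1}}{k}\) and \(x_{k}=k\,\mathbb {P}\left\{ |X|>b_{k-1}\right\} \).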

Gut [7] and Kruglov [10] established the weak law of large numbers for sequences of independent identically distributed random variables with regularly varying normalizing constants. The strong law of large numbers and complete convergence with regularly varying normalizing constants have also recently been studied by many authors; we refer to [1, 4, 19] and the references therein. For definitions, basic properties of regularly varying functions, and their important role in probability theory, we refer the reader to Seneta [17].

If we let \(b_k=k^{1 / p} L(k)\), \(p \in (0,1)\), where \(L(\cdot )\) is a slowly varying function, then (3.11) is satisfied. Therefore, applying Theorem 3.4, we obtain the following corollary. This result extends the “\(0<p<1\)” part of Theorem 1.3 of Gut [7] to d-dimensional arrays.

Corollary 3.5

Let \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) be a d-dimensional array of random elements in a real separable Banach space \(\mathcal {X}\). Suppose that \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) is stochastically dominated by a real-valued random variable X satisfying condition (3.12). Let \(0<p<1\), and let \(\left\{ b_{k}, k \ge 1\right\} \) be a non-decreasing sequence of positive constants with \(b_k = k^{1/p}L(k)\), where \(L(\cdot )\) is a slowly varying function. Then,

$$\begin{aligned} \dfrac{1}{b_{|\textbf{n}|}} \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \right\| {\mathop {\longrightarrow }\limits ^{\mathbb {P}}} 0 \text{ as } |\textbf{n}| \rightarrow \infty . \end{aligned}$$
(3.18)

The following example is inspired by Example 5.1 of Rosalsky and Thành [14]. It shows that in Theorem 3.4, the condition (3.11) cannot be dispensed with.

Example 3.6

Consider the real separable Banach space \(\ell _{1}\) consisting of absolutely summable real sequences \(v=\left\{ v_{k}, k \ge 1\right\} \) with norm \(\Vert v\Vert =\sum _{k=1}^{\infty }\left| v_{k}\right| \). The element having 1 in its kth position and 0 elsewhere will be denoted by \(v^{(k)}\), \(k \ge 1\). Let \(\psi : \mathbb {N}^d \rightarrow \mathbb {N}\) be a one-to-one and onto mapping. Define a d-dimensional array \(\left\{ X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\right\} \) of random elements in \(\ell _{1}\) by requiring the \(X_{\textbf{n}}\), \(\textbf{n} \in \mathbb {Z}_{+}^{d}\), to be independent with

$$\begin{aligned} \mathbb {P}\left\{ X_{\textbf{n}}=v^{(\psi (\textbf{n}))}\right\} = \mathbb {P}\left\{ X_{\textbf{n}}=-v^{(\psi (\textbf{n}))}\right\} =\frac{1}{2}, \quad \textbf{n} \in \mathbb {Z}_{+}^{d}. \end{aligned}$$

Then, the array \(\{X_{\textbf{n}}, \textbf{n} \in \mathbb {Z}_{+}^{d}\}\) is stochastically dominated by the real-valued random variable \(X\equiv \Vert X_{\textbf{1}}\Vert \), and (3.12) is satisfied. Let \(b_k=k\), \(k\ge 1\). It is easy to see that (3.11) fails. Moreover,

$$\begin{aligned} \dfrac{1}{b_{|\textbf{n}|}} \max _{\textbf{k} \prec \textbf{n} } \left\| {\sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}} } \right\| = \dfrac{1}{|\textbf{n}|} \max _{\textbf{k} \prec \textbf{n} } |\textbf{k}|= \dfrac{|\textbf{n}|}{|\textbf{n}|}=1 \end{aligned}$$

for all \(\textbf{n} \in \mathbb {Z}_{+}^{d}\) and so (3.18) fails.
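The norm identity used in this example is easy to check concretely. The sketch below (d = 2 with a small index box, an arbitrary enumeration playing the role of \(\psi \), and one random sample of the signs; all hypothetical illustrative choices) verifies that \(\Vert \sum _{\textbf{j} \prec \textbf{k}} X_{\textbf{j}}\Vert _{\ell _1} = |\textbf{k}|\), so the ratio in (3.18) is identically 1:

```python
import itertools
import random
from collections import Counter

random.seed(2)
n = (4, 4)

# an enumeration of the index box standing in for the bijection psi
index_list = list(itertools.product(range(1, n[0] + 1), range(1, n[1] + 1)))
psi = {j: pos for pos, j in enumerate(index_list)}
sign = {j: random.choice((-1, 1)) for j in index_list}  # one sample of the signs

def l1_norm_of_sum(k):
    """ell_1 norm of sum_{j ≺ k} (±1) * v^{(psi(j))}; since psi is one-to-one,
    the coordinates never collide and each contributes exactly 1 to the norm."""
    coords = Counter()
    for j in itertools.product(range(1, k[0] + 1), range(1, k[1] + 1)):
        coords[psi[j]] += sign[j]
    return sum(abs(c) for c in coords.values())

for k in index_list:
    assert l1_norm_of_sum(k) == k[0] * k[1]  # = |k|, whatever the signs

ratio = max(l1_norm_of_sum(k) for k in index_list) / (n[0] * n[1])
assert ratio == 1.0
```

This makes explicit why no cancellation can occur in \(\ell _1\), in contrast to the real-valued case discussed in Remark 3.7.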

Remark 3.7

Example 3.6 demonstrates the difference between real-valued random variables and Banach space-valued random elements. If the underlying Banach space is the real line and the underlying random variables are independent, then, in contrast to Example 3.6, Corollary 3.5 holds (see the case \(p=1\) of Theorem 1.3 of Gut [7]). Boukhari [3, Theorem 1.3] extended Gut’s result [7] to the case of pairwise negatively dependent random variables. By the same method presented in this paper, we can prove that Theorem 1.3 of Gut [7] still holds for Hilbert space-valued random variables. The concept of pairwise negative dependence was recently introduced in [9]. Therefore, it would be interesting to extend Theorem 1.3 of Boukhari [3] to the case where the underlying random variables take values in Hilbert spaces.