1. We consider a sequence of independent random variables (r.v.) X1, X2, … that take a value of 1 with some probability 0 < p < 1 and a value of 0 with the probability q = 1 – p.

The event {Xn = 1} is often referred to as “success” in the nth trial, while the event {Xn = 0} is called the “failure.”

Let μn denote the number of successes in the first n trials. The distribution of this number is the binomial distribution B(n, p) with the probabilities

$${{P}_{r}} = P\{ {{\mu }_{n}} = r\} = C_{n}^{r}{{p}^{r}}{{q}^{{n - r}}},\quad r = 0,1, \ldots ,n.$$

Its mathematical expectation is given by the equalities

$$E{{\mu }_{n}} = np,\quad n = 1,2, \ldots .$$

Closely related to the Bernoulli scheme is the geometric distribution Geom(q) with the parameter 0 < q < 1. If trials are carried out until the first “successful” event occurs, the number of failures preceding it is described by the geometric distribution, since

$$P\{ {{X}_{1}} = 0,{{X}_{2}} = 0, \ldots ,{{X}_{n}} = 0,{{X}_{{n + 1}}} = 1\} = p{{q}^{n}},\quad n = 0,1,2, \ldots ,$$

and the mathematical expectation of the number of failed trials is q/p.

If we are interested in the moment ν when both outcomes are found in the X1, X2, … sequence, we obtain

$$P\{ \nu = n\} = p{{q}^{{n - 1}}} + q{{p}^{{n - 1}}},\quad n = 2,3, \ldots .$$
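This identity is easy to check numerically. The following is a minimal Monte Carlo sketch (assuming only the standard random module; the function names are illustrative, not part of the original text):

```python
# A Monte Carlo check of P{nu = n} = p*q**(n - 1) + q*p**(n - 1);
# all names here are illustrative.
import random

def sample_nu(p: float) -> int:
    """Index of the first trial at which both outcomes have been observed."""
    first = random.random() < p
    n = 1
    while True:
        n += 1
        if (random.random() < p) != first:
            return n

random.seed(1)
p, q, trials = 0.3, 0.7, 200_000
counts = {}
for _ in range(trials):
    n = sample_nu(p)
    counts[n] = counts.get(n, 0) + 1
for n in range(2, 6):
    print(n, counts.get(n, 0) / trials, p * q ** (n - 1) + q * p ** (n - 1))
```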

We now consider a random variable W(k), which is the number of failures that occurred before the emergence of the kth success. In this case, we deal with a sum that consists of k independent random variables distributed geometrically according to Geom(q). This sum is described by a negative binomial distribution with the probabilities

$$P\{ W(k) = n\} = C_{{n + k - 1}}^{n}{{p}^{k}}{{q}^{n}},\quad n = 0,1,2, \ldots $$

and the mathematical expectation

$$EW(k) = kq{\text{/}}p,\quad k = 1,2, \ldots .$$

The generating function for the r.v. W(k) has the form

$$E{{s}^{{W(k)}}} = {{p}^{k}}{\text{/}}{{(1 - qs)}^{k}},\quad k = 1,2, \ldots .$$
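Both the expectation kq/p and the negative binomial probabilities above can be verified by a short simulation; the sketch below is illustrative only (math.comb supplies the binomial coefficients):

```python
# A simulation check of E W(k) = k*q/p and of the negative binomial
# probabilities C(n + k - 1, n) * p**k * q**n; names are illustrative.
import math
import random

def failures_before_kth_success(p: float, k: int) -> int:
    successes = failures = 0
    while successes < k:
        if random.random() < p:
            successes += 1
        else:
            failures += 1
    return failures

random.seed(2)
p, q, k, trials = 0.4, 0.6, 3, 200_000
samples = [failures_before_kth_success(p, k) for _ in range(trials)]
print("mean:", sum(samples) / trials, "exact:", k * q / p)
for n in range(4):
    exact = math.comb(n + k - 1, n) * p ** k * q ** n
    print(n, samples.count(n) / trials, exact)
```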

2. We now consider the following situation. Trials are carried out not until the kth successful event but until the moment when the first series that includes no less than k successive successes occurs in the X1, X2, … sequence.

The first volume of a two-volume book by Feller ([1], ch. 13, Eq. (7.6)) contains the generating function F(s) for the probabilities

$$f(k,n) = P\{ {{L}_{n}}\} $$

that the first series consisting of k successful trials is completed exactly at the nth trial, i.e., for the probabilities of the events

$${{L}_{n}} = \{ {{X}_{{n - k}}} = 0,{{X}_{{n - k + 1}}} = 1, \ldots ,{{X}_{{n - 1}}} = 1,{{X}_{n}} = 1\} .$$

It is shown there that

$$F(s) = {{p}^{k}}{{s}^{k}}(1 - ps){\text{/}}(1 - s + q{{p}^{k}}{{s}^{{k + 1}}}),\quad k = 1,2, \ldots .$$
(1)

If we somewhat change the condition and consider the number of trials prior to the moment when such a series emerges, we obtain that the corresponding generating function has the following form

$${{F}_{1}}(s) = {{p}^{k}}(1 - ps){\text{/}}(1 - s + q{{p}^{k}}{{s}^{{k + 1}}}),\quad k = 1,2, \ldots .$$
(2)
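The coefficients of the power-series expansion of F(s) in Eq. (1) can be computed by the standard recurrence for a rational function and compared with simulated frequencies of the trial at which the first series of k successes is completed; a minimal sketch (names are illustrative):

```python
# Expanding the generating function (1) as a power series and comparing
# the coefficients with simulated probabilities that the first series of
# k successes is completed exactly at trial n; names are illustrative.
import random

def series_coeffs(num: dict, den: dict, n_max: int) -> list:
    """Power-series coefficients of num(s)/den(s); requires den[0] == 1."""
    c = []
    for n in range(n_max + 1):
        cn = num.get(n, 0.0)
        for i, di in den.items():
            if 1 <= i <= n:
                cn -= di * c[n - i]
        c.append(cn)
    return c

def first_k_run_time(p: float, k: int) -> int:
    streak = n = 0
    while streak < k:
        n += 1
        streak = streak + 1 if random.random() < p else 0
    return n

random.seed(3)
p, q, k = 0.5, 0.5, 3
trials, n_max = 200_000, 8
num = {k: p ** k, k + 1: -p ** (k + 1)}
den = {0: 1.0, 1: -1.0, k + 1: q * p ** k}
coeffs = series_coeffs(num, den, n_max)
samples = [first_k_run_time(p, k) for _ in range(trials)]
for n in range(k, n_max + 1):
    print(n, samples.count(n) / trials, coeffs[n])
```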

Next, we deal with the events

$${{A}_{0}} = \{ {{X}_{1}} = 0\} ,$$
$${{A}_{r}} = \{ {{X}_{1}} = 1,{{X}_{2}} = 1, \ldots ,{{X}_{{r - 1}}} = 1,{{X}_{r}} = 1,{{X}_{{r + 1}}} = 0\} ,\quad r = 1,2, \ldots ,k - 1,$$
$${{A}_{k}} = \{ {{X}_{1}} = 1,{{X}_{2}} = 1, \ldots ,{{X}_{{k - 1}}} = 1,{{X}_{k}} = 1\} .$$

It should be noted that

$$P\{ {{A}_{r}}\} = q{{p}^{r}},\quad r = 0,1, \ldots ,k - 1\quad {\text{and}}\quad P\{ {{A}_{k}}\} = {{p}^{k}}.$$

What can be said about the number of failures observed before the emergence of the first such series? We denote this number again by W(k). It is clear that the conditional distribution of this quantity, provided that the event Ar, r = 0, 1, 2, …, k – 1, occurred, coincides with the distribution of the random variable W1(k) = 1 + W(k). If the event Ak occurred, no failures have been observed.

Let B1(k) = EW(k). We obtain that

$${{B}_{1}}(k) = (1 - {{p}^{k}})(1 + EW(k)) = (1 - {{p}^{k}}) + (1 - {{p}^{k}}){{B}_{1}}(k).$$

Consequently

$${{B}_{1}}(k) = (1 - {{p}^{k}}){\text{/}}{{p}^{k}},\quad k = 1,2, \ldots .$$

Let now

$$P(k,s) = E{{s}^{{W(k)}}}$$

denote the generating function of the random variable W(k). Taking into account the possible versions described above, we obtain that the following equalities hold true

$$P(k,s) = {{p}^{k}} + (1 - {{p}^{k}})E{{s}^{{(1 + W(k))}}} = {{p}^{k}} + (1 - {{p}^{k}})sP(k,s)$$

and

$$P(k,s) = {{p}^{k}}{\text{/}}(1 - (1 - {{p}^{k}})s),$$

i.e., the variable W(k) is described by the geometric distribution Geom\((1 - p^k)\) with the probabilities

$$P\{ W(k) = n\} = {{p}^{k}}{{(1 - {{p}^{k}})}^{n}},\quad n = 0,1,2, \ldots .$$

In particular,

$$P\{ W(k) = 0\} = {{p}^{k}}$$

and

$$P\{ W(k) = 1\} = {{p}^{k}}(1 - {{p}^{k}}).$$
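A short simulation sketch (names are illustrative) confirming that W(k) follows the geometric distribution with the parameter \(1 - p^k\):

```python
# A simulation check of the Geom(1 - p**k) law for the number of failures
# before the first series of k successes; names are illustrative.
import random

def failures_before_first_k_run(p: float, k: int) -> int:
    streak = failures = 0
    while streak < k:
        if random.random() < p:
            streak += 1
        else:
            streak = 0
            failures += 1
    return failures

random.seed(4)
p, k, trials = 0.6, 2, 200_000
pk = p ** k
samples = [failures_before_first_k_run(p, k) for _ in range(trials)]
print("mean:", sum(samples) / trials, "exact:", (1 - pk) / pk)
for n in range(3):
    print(n, samples.count(n) / trials, pk * (1 - pk) ** n)
```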

We now proceed to the random variable R(k), the number of successful events before the emergence of the first series that contains no less than k successive successes. The equality R(k) = 0 holds true if the event Ak occurs. If we deal with the event Ar, r = 0, 1, 2, …, k – 1, the conditional distribution of R(k) coincides with the distribution of the sum R(k) + r. For the mathematical expectation B2(k) = ER(k), we obtain the following relation

$${{B}_{2}}(k) = \sum\limits_{r = 0}^{k - 1} {(ER(k) + r){{p}^{r}}q} = (1 - {{p}^{k}}){{B}_{2}}(k) + p(1 - k{{p}^{{k - 1}}} + (k - 1){{p}^{k}}){\text{/}}q.$$

We obtain from this formula that

$${{B}_{2}}(k) = (1 - k{{p}^{{k - 1}}} + (k - 1){{p}^{k}}){\text{/}}({{p}^{{k - 1}}}q).$$

The generating function R(k, s) for the random variable R(k) fulfils the following equality

$$R(k,s) = qR(k,s) + \sum\limits_{r = 1}^{k - 1} {{{s}^{r}}R(k,s){{p}^{r}}q} + {{p}^{k}},$$

from which we obtain that

$$R(k,s) = {{p}^{{k - 1}}}(1 - ps){\text{/}}(1 - s + q{{p}^{{k - 1}}}{{s}^{k}}).$$
(3)

If k = 1, we arrive at the natural result

$$R(1,s) = 1,$$

which corresponds to the distribution degenerate at zero. If k = 2, then

$$R(2,s) = p(1 - ps){\text{/}}(1 - s + qp{{s}^{2}}) = p{\text{/}}(1 - qs),$$

i.e., we obtain a geometric distribution with the probabilities

$$P\{ R(2) = n\} = p{{q}^{n}},\quad n = 0,1,2, \ldots .$$
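Both the expectation B2(k) and, for k = 2, the geometric probabilities \(pq^n\) can be checked by simulation; a minimal sketch (names are illustrative):

```python
# A simulation check of the expectation B2(k) and, for k = 2, of the
# geometric probabilities p*q**n; names are illustrative.
import random

def successes_before_first_k_run(p: float, k: int) -> int:
    streak = ones = 0
    while streak < k:
        if random.random() < p:
            streak += 1
            ones += 1
        else:
            streak = 0
    return ones - k  # the k successes of the final series are not counted

random.seed(5)
p, q, k, trials = 0.5, 0.5, 2, 200_000
samples = [successes_before_first_k_run(p, k) for _ in range(trials)]
exact_mean = (1 - k * p ** (k - 1) + (k - 1) * p ** k) / (p ** (k - 1) * q)
print("mean:", sum(samples) / trials, "exact:", exact_mean)
for n in range(3):  # the pmf check below is specific to k = 2
    print(n, samples.count(n) / trials, p * q ** n)
```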

We now consider a random variable M(k), the number of series of successes before the moment when the first series consisting of k or a larger number of successive successful trials emerges.

We now return to the events A0, A1, …, Ak. If the event A0 occurs, the conditional distribution of the random variable M(k) coincides with its unconditional distribution. If the event Ak occurs, we obtain that M(k) = 0. In all other cases, the conditional distribution of the random variable M(k), provided that the event Ar, r = 1, 2, …, k – 1 occurs, coincides with the unconditional distribution for the quantity M(k) + 1. For the mathematical expectations a(k) = EM(k), we obtain that

$$\begin{gathered} a(k) = a(k)P({{A}_{0}}) + (1 + a(k))(P({{A}_{1}}) + P({{A}_{2}}) + \ldots + P({{A}_{{k - 1}}})) \\ = qa(k) + (1 + a(k))(qp + q{{p}^{2}} + \ldots + q{{p}^{{k - 1}}}) = qa(k) + p(1 - {{p}^{{k - 1}}}) + a(k)p(1 - {{p}^{{k - 1}}}) \\ = a(k) - {{p}^{k}}a(k) + p(1 - {{p}^{{k - 1}}}), \\ \end{gathered} $$

which yields that

$$a(k) = (1 - {{p}^{{k - 1}}}){\text{/}}{{p}^{{k - 1}}}.$$
(4)

If k = 1, we naturally obtain a(1) = 0. If k = 2, then a(2) = q/p. If k = 3, then \(a(3) = (1 - p^2)/p^2\).

It should be noted that the distribution of the number of series of successes M(k) does not depend on whether the first trial was successful or not.
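This remark, together with Eq. (4), can be illustrated by simulating M(k) separately for the two values of X1; a sketch (names are illustrative):

```python
# Simulating M(k) separately for X1 = 1 and X1 = 0: both conditional means
# agree with (1 - p**(k - 1))/p**(k - 1) from Eq. (4); names are illustrative.
import random

def success_runs_before_first_k_run(p: float, k: int):
    """Return (X1, number of series of successes before the first k-series)."""
    streak = runs = 0
    first = None
    while streak < k:
        x = random.random() < p
        if first is None:
            first = x
        if x:
            streak += 1
        else:
            if streak > 0:
                runs += 1  # a series of successes has just ended
            streak = 0
    return first, runs

random.seed(6)
p, k, trials = 0.5, 3, 200_000
by_first = {True: [], False: []}
for _ in range(trials):
    first, runs = success_runs_before_first_k_run(p, k)
    by_first[first].append(runs)
exact = (1 - p ** (k - 1)) / p ** (k - 1)
for first in (True, False):
    vals = by_first[first]
    print("X1 =", int(first), "mean:", sum(vals) / len(vals), "exact:", exact)
```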

If we now consider the number N(k) of the series of failures before the emergence of a group of k successive successes, two possible situations can be distinguished. If the first trial was successful, N(k) = M(k). Otherwise, we obtain that N(k) = M(k) + 1. For the mathematical expectation b(k) = EN(k), we arrive at the equality

$$b(k) = pa(k) + q(1 + a(k)) = q + a(k) = (q{{p}^{{k - 1}}} + 1 - {{p}^{{k - 1}}}){\text{/}}{{p}^{{k - 1}}}.$$
(5)
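A simulation sketch (names are illustrative) checking the value of b(k) in Eq. (5):

```python
# A simulation check of Eq. (5): the mean number of series of failures
# before the first series of k successes; names are illustrative.
import random

def failure_runs_before_first_k_run(p: float, k: int) -> int:
    streak = runs = 0
    in_failure_run = False
    while streak < k:
        if random.random() < p:
            streak += 1
            in_failure_run = False
        else:
            if not in_failure_run:
                runs += 1  # a new series of failures begins
                in_failure_run = True
            streak = 0
    return runs

random.seed(7)
p, q, k, trials = 0.5, 0.5, 3, 200_000
samples = [failure_runs_before_first_k_run(p, k) for _ in range(trials)]
print("mean:", sum(samples) / trials,
      "exact:", q + (1 - p ** (k - 1)) / p ** (k - 1))
```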

If we also consider the total number S(k) of the series of successes and the series of failures before the emergence of k successive successful trials, the distribution of this variable coincides with a mixture with weights p and q of the random variables 2M(k) and 2M(k) + 1.

We now determine the form of the generating functions M(k, s), N(k, s), and S(k, s) for the random variables M(k), N(k), and S(k). It should be noted that the following formulas hold true:

$$N(k,s) = pM(k,s) + qsM(k,s) = (p + qs)M(k,s)$$
(6)

and

$$S(k,s) = pM(k,{{s}^{2}}) + qsM(k,{{s}^{2}}) = (p + qs)M(k,{{s}^{2}}).$$
(7)

Returning to the events A0, A1, …, Ak, we can derive the following equality for the function M(k, s):

$$M(k,s) = qM(k,s) + (pq + {{p}^{2}}q + \ldots + {{p}^{{k - 1}}}q)sM(k,s) + {{p}^{k}} = qM(k,s) + (p - {{p}^{k}})sM(k,s) + {{p}^{k}},$$

from which it follows that

$$M(k,s) = {{p}^{{k - 1}}}{\text{/}}(1 - s + s{{p}^{{k - 1}}}).$$
(8)

Equation (8) yields in particular that

$$P\{ M(k) = n\} = {{p}^{{k - 1}}}{{(1 - {{p}^{{k - 1}}})}^{n}},\quad n = 0,1,2, \ldots .$$
(9)

Taking into account equalities (6) and (7), we also obtain that

$$N(k,s) = (p + qs){{p}^{{k - 1}}}{\text{/}}(1 - s + s{{p}^{{k - 1}}})$$
(10)

and

$$S(k,s) = (p + qs){{p}^{{k - 1}}}{\text{/}}(1 - {{s}^{2}} + {{p}^{{k - 1}}}{{s}^{2}}).$$
(11)
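Differentiating (11) at s = 1 gives \(ES(k) = q + 2(1 - p^{k - 1})/p^{k - 1}\), which can be checked by counting all completed series preceding the first series of k successes; a sketch (names are illustrative):

```python
# A check of the mean implied by Eq. (11):
# E S(k) = q + 2*(1 - p**(k - 1))/p**(k - 1); names are illustrative.
import random

def runs_before_first_k_run(p: float, k: int) -> int:
    """Total number of completed series of both kinds before the first k-series."""
    streak = runs = 0
    prev = None
    while streak < k:
        x = random.random() < p
        if prev is not None and x != prev:
            runs += 1  # the previous series has just ended
        prev = x
        streak = streak + 1 if x else 0
    return runs

random.seed(8)
p, q, k, trials = 0.5, 0.5, 3, 200_000
samples = [runs_before_first_k_run(p, k) for _ in range(trials)]
print("mean:", sum(samples) / trials,
      "exact:", q + 2 * (1 - p ** (k - 1)) / p ** (k - 1))
```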

The following result (problem 24) can be found in ([1], ch. 13).

Let \({{q}_{{n,l}}} = P\{ Q(n) = l\} \) be the probability that exactly l series of successes of length r occur in n Bernoulli trials. Then the generating function Qn(s) = \(E{{s}^{{Q(n)}}}\) is the coefficient of \({{t}^{n}}\) in the expansion of the function

$$(1 - {{p}^{r}}{{t}^{r}}){\text{/}}[1 - t + q{{p}^{r}}{{t}^{{r + 1}}} - s(1 - pt){{p}^{r}}{{t}^{r}}].$$
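For a fixed numerical value s0, the coefficient of \(t^n\) in this fraction equals \(Es_0^{Q(n)}\) and can be compared with a simulated average; in the sketch below (names are illustrative), series of successes of length r are counted as in [1], i.e., the count restarts after each completed series:

```python
# For a fixed numeric s0, the coefficient of t**n in the fraction above
# equals E s0**Q(n); names are illustrative.
import random

def series_coeffs(num: dict, den: dict, n_max: int) -> list:
    """Power-series coefficients of num(t)/den(t); requires den[0] == 1."""
    c = []
    for n in range(n_max + 1):
        cn = num.get(n, 0.0)
        for i, di in den.items():
            if 1 <= i <= n:
                cn -= di * c[n - i]
        c.append(cn)
    return c

def count_runs(bits, r):
    count = streak = 0
    for x in bits:
        streak = streak + 1 if x else 0
        if streak == r:
            count += 1
            streak = 0  # counting restarts after a completed series
    return count

random.seed(9)
p, q, r, s0, n, trials = 0.5, 0.5, 2, 0.3, 8, 200_000
num = {0: 1.0, r: -(p ** r)}
den = {0: 1.0, 1: -1.0}
den[r] = den.get(r, 0.0) - s0 * p ** r
den[r + 1] = den.get(r + 1, 0.0) + q * p ** r + s0 * p ** (r + 1)
exact = series_coeffs(num, den, n)[n]
mc = sum(s0 ** count_runs([random.random() < p for _ in range(n)], r)
         for _ in range(trials)) / trials
print("coefficient of t^n:", exact, "simulated:", mc)
```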

We now present some statements that enable us to find the corresponding distributions for the number V(k, r) of the series consisting of exactly r < k successes in the scheme presented above. If the event Ar occurs, the conditional distribution of V(k, r) coincides with the distribution of the random variable V(k, r) + 1. If the event Ak occurs, there are no such series. In all other cases, the conditional distribution coincides with the unconditional distribution of the random variable V(k, r). For the generating function

$$V(k,r,s) = E{{s}^{{V(k,r)}}},$$

the equality holds true

$$V(k,r,s) = {{p}^{k}} + {{p}^{r}}qsV(k,r,s) + (1 - {{p}^{r}}q - {{p}^{k}})V(k,r,s),$$

which yields that

$$V(k,r,s) = {{p}^{{k - r}}}{\text{/}}({{p}^{{k - r}}} + q - sq).$$
(12)

For r = 1, we obtain, in particular, that

$$V(k,1,s) = {{p}^{{k - 1}}}{\text{/}}({{p}^{{k - 1}}} + q - sq).$$
(13)

Using Eq. (13), we obtain, in particular, that

$$P\{ V(k,1) = n\} = {{q}^{n}}{{p}^{{k - 1}}}{\text{/}}{{(q + {{p}^{{k - 1}}})}^{{n + 1}}},\quad n = 0,1,2, \ldots .$$
(14)

We also note that the formulas for the generating functions

$$V(k,k - 1,s) = p{\text{/}}(1 - sq),\quad k = 2,3, \ldots $$

do not depend on k. In these cases, we obtain that

$$P\{ V(k,k - 1) = n\} = p{{q}^{n}},\quad n = 0,1,2, \ldots ,$$

i.e., we deal with a geometric distribution.
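Formula (14) can also be checked by simulation; a minimal sketch (names are illustrative):

```python
# A simulation check of Eq. (14) for the number of series consisting of
# exactly one success before the first k-series; names are illustrative.
import random

def exact_r_runs_before_first_k_run(p: float, k: int, r: int) -> int:
    streak = count = 0
    while streak < k:
        if random.random() < p:
            streak += 1
        else:
            if streak == r:  # a series of exactly r successes has ended
                count += 1
            streak = 0
    return count

random.seed(10)
p, q, k, r, trials = 0.5, 0.5, 3, 1, 200_000
samples = [exact_r_runs_before_first_k_run(p, k, r) for _ in range(trials)]
d = q + p ** (k - 1)
for n in range(3):
    print(n, samples.count(n) / trials, q ** n * p ** (k - 1) / d ** (n + 1))
```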

3. We now continue considering a sequence of independent random variables X1, X2, …, which take the value 1 with probability 0 < p < 1 and the value 0 with probability q = 1 – p, until the emergence of the first series of k successes. We have presented above the distributions for the number of successes and failures, the number of the series of successes, and the number of the series of failures in the sequence before the emergence of k successive successes. We have also considered the distribution for the number of the series of successes of a certain length r = 1, 2, …, k – 1 that precede the emergence of the first series of k successes.

We now return to the consideration of the series of failures observed before the emergence of k successively recorded successes. It should be noted that such series may consist of any number of successive failures. Equation (10) provides a formula for the generating function of the random variable N(k), the number of failure series. We can now consider the failure series of a fixed length r = 1, 2, …. Let W(r, k) denote their number.

We consider two possible cases.

In the situations when X1 = 0 and X1 = 1, the number of series of failures of length r is denoted by W1(r) and W2(r), respectively. The corresponding generating functions W(r, k, s), W1(r, s), and W2(r, s) for the random variables W(r, k), W1(r), and W2(r) are also needed. For them, we note that

$$W(r,k,s) = q{{W}_{1}}(r,s) + p{{W}_{2}}(r,s).$$
(15)

Let X1 = 0. Then

$$P\{ W(r,k) = m\,|\,{{X}_{1}} = 0\} = P\{ {{W}_{1}}(r) = m\} .$$
(16)

If X1 = 1, then

$$P\{ W(r,k) = m\,|\,{{X}_{1}} = 1\} = P\{ {{W}_{2}}(r) = m\} .$$
(17)

We obtain that

$$P\{ W(r,k) = m\} = qP\{ {{W}_{1}}(r) = m\} + pP\{ {{W}_{2}}(r) = m\} .$$
(18)

We consider the following set of initial events:

$${{A}_{j}} = \{ {{X}_{1}} = 0,{{X}_{2}} = 0, \ldots ,{{X}_{{j - 1}}} = 0,{{X}_{j}} = 1\} ,\quad j = 2,3, \ldots .$$

If the event \({{A}_{{r + 1}}}\) occurs, which is associated with the conditional probability

$$P\{ {{A}_{{r + 1}}}\,|\,{{X}_{1}} = 0\} = {{q}^{{r - 1}}}p,$$

the distribution of the random variable W1(r) coincides with the distribution of the random variable 1 + W2(r). In other cases, the distribution of W1(r) coincides with the distribution of W2(r).

Now let X1 = 1. We consider the events

$${{B}_{j}} = \{ {{X}_{1}} = 1,{{X}_{2}} = 1, \ldots ,{{X}_{{j - 1}}} = 1,{{X}_{j}} = 0\} ,\quad j = 2,3, \ldots ,k,$$
$${{B}_{{k + 1}}} = \{ {{X}_{1}} = 1,{{X}_{2}} = 1, \ldots ,{{X}_{k}} = 1\} .$$

If any of the events B2, B3, …, Bk occurs, the conditional distribution of W2(r) coincides with the distribution of the random variable W1(r). In the case of \({{B}_{{k + 1}}}\), we obtain that W2(r) = 0.

For the first of these two situations, we obtain an equality for the generating functions of the form

$${{W}_{1}}(r,s) = p{{q}^{{r - 1}}}s{{W}_{2}}(r,s) + (1 - p{{q}^{{r - 1}}}){{W}_{2}}(r,s).$$
(19)

In the second case, we arrive at the equality

$${{W}_{2}}(r,s) = (1 - {{p}^{{k - 1}}}){{W}_{1}}(r,s) + {{p}^{{k - 1}}}.$$
(20)

The last two formulas, which relate the two generating functions, yield the equalities

$${{W}_{1}}(r,s) = \frac{{{{p}^{{k - 1}}}(1 + p{{q}^{{r - 1}}}s - p{{q}^{{r - 1}}})}}{{{{p}^{{k - 1}}} + (p{{q}^{{r - 1}}} - {{p}^{k}}{{q}^{{r - 1}}})(1 - s)}},$$
(21)
$${{W}_{2}}(r,s) = \frac{{{{p}^{{k - 1}}}}}{{{{p}^{{k - 1}}} + p{{q}^{{r - 1}}}(1 - s) - {{p}^{k}}{{q}^{{r - 1}}}(1 - s)}}$$
(22)

and

$$W(r,k,s) = q{{W}_{1}}(r,s) + p{{W}_{2}}(r,s) = \frac{{q{{p}^{{k - 1}}} + {{p}^{k}}{{q}^{r}}s - {{p}^{k}}{{q}^{r}} + {{p}^{k}}}}{{{{p}^{{k - 1}}} + (p{{q}^{{r - 1}}} - {{p}^{k}}{{q}^{{r - 1}}})(1 - s)}}.$$
(23)

In the particular case k = 2 and r = 1, we obtain the generating function in the form

$$W(1,2,s) = \frac{{1 - pq + pqs}}{{1 + q - qs}}.$$
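This fraction can be expanded in powers of s and compared with simulation; differentiation at s = 1 also gives \(EW(1,2) = q(1 + p)\). A minimal sketch (names are illustrative):

```python
# Expanding W(1, 2, s) = (a + b*s)/(c - q*s), a = 1 - p*q, b = p*q, c = 1 + q,
# and comparing with the simulated law of the number of failure series of
# length exactly 1 before the first pair of successes; names are illustrative.
import random

def single_failure_runs_before_first_2_run(p: float) -> int:
    streak = zeros = count = 0
    while streak < 2:
        if random.random() < p:
            if zeros == 1:  # a failure series of length exactly 1 has ended
                count += 1
            zeros = 0
            streak += 1
        else:
            zeros += 1
            streak = 0
    return count

random.seed(11)
p, q, trials = 0.5, 0.5, 200_000
samples = [single_failure_runs_before_first_2_run(p) for _ in range(trials)]
a, b, c = 1 - p * q, p * q, 1 + q
probs = [a / c] + [(a * q / c + b) * (q / c) ** (n - 1) / c for n in range(1, 4)]
for n in range(4):
    print(n, samples.count(n) / trials, probs[n])
```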

Other problems related to the scheme of independent Bernoulli trials can be found in publications [2, 3].