1 Introduction

The classical Stirling numbers of the second kind, denoted by S(n, k), count the number of ways a set of n objects can be partitioned into k non-empty disjoint subsets. These numbers have wide applications in various fields of mathematics, physics, and probability, where they are used to establish identities and to compute diverse quantities, and they are elementary tools in the treatment of various combinatorial problems. Several equivalent definitions of the Stirling numbers of the second kind appear in the literature (see, for example, Quaintance and Gould [26, Chap. 9] and Comtet [12, Chap. 5]). Euler’s formula for the Stirling numbers of the second kind is given by (see [26, p. 118])

$$\begin{aligned} S(n,k) = \frac{1}{k!} \sum _{j=0}^{k} (-1)^{k-j} \left( {\begin{array}{c}k\\ j\end{array}}\right) j^{n}. \end{aligned}$$
(1.1)

In terms of the exponential generating function, it is expressed as (see [2])

$$\begin{aligned} \frac{(e^{z}-1)^{k}}{k!} = \sum _{n=k}^{\infty } S(n,k) \frac{z^{n}}{n!}, ~~~ z \in {\mathbb {C}}, \end{aligned}$$

where \({\mathbb {C}}\) denotes the set of all complex numbers.
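As a quick numerical illustration, the following Python sketch evaluates Euler's formula (1.1) directly and checks it against the row \(n = 4\) of the Stirling triangle; the helper name stirling2 is ours and not part of any library.

```python
from math import comb, factorial

def stirling2(n: int, k: int) -> int:
    """Stirling numbers of the second kind via Euler's formula (1.1)."""
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

# Row n = 4 of the Stirling triangle: S(4,1), ..., S(4,4) = 1, 7, 6, 1
assert [stirling2(4, k) for k in range(1, 5)] == [1, 7, 6, 1]
```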

In the past few decades, various generalizations of the Stirling numbers have been defined. Recently, the multi-parameter non-central Stirling numbers and the r-Stirling numbers of the second kind have been studied in the literature. For more details on these generalizations, one can refer to [8, 9, 20] and the references therein. Let \((Y_{j})_{j \ge 1}\) be a sequence of mutually independent copies of a random variable Y with finite moment generating function. Consider \(S_{j} = Y_{1} + Y_{2} + \cdots + Y_{j}\) for \(j = 1,2,\dots\) with \(S_{0} = 0\). Then, the probabilistic Stirling numbers of the second kind for a random variable Y are defined by (see [4, p. 3])

$$\begin{aligned} S_{Y} (n,k) = \frac{1}{k!} \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) (-1)^{k-j}{\mathbb {E}}S_{j}^{n},~~~ n =0,1,\dots ,~~ k = 0,1,\dots ,n. \end{aligned}$$
(1.2)

Clearly, from (1.2) one recovers Euler’s formula (1.1) for the Stirling numbers of the second kind when the \(Y_j\)’s are degenerate at 1. The probabilistic approach to defining the Stirling numbers of the second kind motivates extensions of various classical formulas involving sums of powers over arithmetic progressions. Higher order convolutions of the Appell polynomials can also be obtained in explicit form using the proposed definition. Adell [2] also extended definition (1.2) to complex random variables, demonstrated its applications to various topics in probability theory, and derived some well-known combinatorial identities.

In terms of the exponential generating function, the probabilistic Stirling numbers of the second kind may be expressed as (see [4, p. 8])

$$\begin{aligned} \frac{\left( {\mathbb {E}}e^{zY}-1\right) ^{k}}{k!} = \sum _{n=k}^{\infty } S_{Y}(n,k) \frac{z^{n}}{n!}, ~~~ z \in {\mathbb {C}}. \end{aligned}$$
(1.3)

The Bell polynomials are closely related to the Bell and Stirling numbers, with applications to Faà di Bruno’s formula and to several convolution identities derived from it. For non-negative integers n and k with \(n\ge k\), the partial exponential Bell polynomials, denoted by \(B_{n,k}\), are defined by (see [13, p. 96] and [14, p. 1])

$$\begin{aligned} B_{n,k}(x_1 , x_2,\dots , x_{n-k+1} ) = n!\sum _{\Omega _n ^{k}}^{} \prod _{j=1}^{n-k+1} \frac{1}{k_j !} \biggl (\frac{x_j}{j!}\biggr )^{k_j}, \end{aligned}$$
(1.4)

where

$$\begin{aligned} \Omega _n ^{k} = \biggl \{(k_1, k_2, \dots , k_{n-k+1}): \sum _{j=1}^{n-k+1}k_j = k, \sum _{j=1}^{n-k+1}jk_j = n, k_j \in {\mathbb {N}}_0 \biggr \}, \end{aligned}$$

and \({\mathbb {N}}_0 = {\mathbb {N}} \cup \{0\},\) where \({\mathbb {N}}\) stands for the set of natural numbers.

Using this definition, one can easily get the following identity (see [25] and [12, p. 135])

$$\begin{aligned} B_{n,k}(pqx_{1}, pq^2x_{2},\dots ,pq^{n-k+1}x_{n-k+1}) = p^k q^n B_{n,k}(x_{1}, x_{2},\dots ,x_{n-k+1}), \end{aligned}$$
(1.5)

for \(n\ge k \ge 0\) and \(p,q \in {\mathbb {C}}\).
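The definition (1.4) and the homogeneity property (1.5) can be checked by brute force for small n. The sketch below enumerates the index set \(\Omega_n^k\) directly; the helper bell_partial and the test values are illustrative only, and the check \(B_{n,k}(1,\dots,1) = S(n,k)\) uses the known value \(S(5,3) = 25\).

```python
from fractions import Fraction
from itertools import product
from math import factorial

def bell_partial(n, k, x):
    """Partial exponential Bell polynomial B_{n,k}(x_1,...,x_{n-k+1}), evaluated by
    brute-forcing the index set Omega_n^k from (1.4) (fine for small n)."""
    if n == k == 0:
        return Fraction(1)
    m = n - k + 1
    total = Fraction(0)
    for ks in product(range(k + 1), repeat=m):
        if sum(ks) == k and sum(j * kj for j, kj in enumerate(ks, start=1)) == n:
            term = Fraction(factorial(n))
            for j, kj in enumerate(ks, start=1):
                term *= (Fraction(x[j - 1]) / factorial(j)) ** kj / factorial(kj)
            total += term
    return total

# B_{n,k}(1,...,1) = S(n,k); here S(5,3) = 25
assert bell_partial(5, 3, [1, 1, 1]) == 25

# Homogeneity (1.5): B_{n,k}(p q x_1, p q^2 x_2, ...) = p^k q^n B_{n,k}(x_1, x_2, ...)
p, q, xs = Fraction(2), Fraction(1, 3), [1, 2, 3, 4]
scaled = [p * q ** j * xs[j - 1] for j in range(1, len(xs) + 1)]
assert bell_partial(5, 2, scaled) == p ** 2 * q ** 5 * bell_partial(5, 2, xs)
```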

The partial exponential Bell polynomials can also be expressed in the form of formal power series as (see [14, p. 1])

$$\begin{aligned} \frac{1}{k!} \biggl (\sum _{j=1}^{\infty } x_j \frac{z^j}{j!} \biggr )^k = \sum _{n = k}^{\infty } B_{n,k}(x_1, x_2,\dots ,x_{n-k+1}) \frac{z^n}{n!}, ~~~ k = 0,1,2,\dots . \end{aligned}$$
(1.6)

The nth complete exponential Bell polynomial is defined in terms of the partial exponential Bell polynomials by (see [14, pp. 1–2])

$$\begin{aligned} B_{n}(x_1, x_2,\dots , x_n) = \sum _{k=0}^{n}B_{n,k}(x_1, x_2,\dots , x_{n-k+1} ), \end{aligned}$$
(1.7)

with \(B_{0,0}=1, B_{n,0}=0\) for all \(n\ge 1\) and \(B_{0,k}=0\) for all \(k \ge 1\). Also, \(B_{n,1}(x_{1}, x_{2},\dots ,x_{n}) = x_{n}\) and \(B_{n,n}(x_{1}) = x_{1}^{n} \text { for all }n \ge 1.\)

A formal power series expansion for the complete exponential Bell polynomial is given by

$$\begin{aligned} \text {exp} \biggl (\sum _{j=1}^{\infty } x_j \frac{z^j}{j!} \biggr ) = \sum _{n = 0}^{\infty } B_{n}(x_1, x_2,\dots ,x_{n}) \frac{z^n}{n!}. \end{aligned}$$
(1.8)

The Bell polynomial \(B_n(x)\) may also be expressed as the nth order moment of a Poisson random variable with mean x. This representation is well known as Dobinski’s formula and is given by (see [4, p. 15] and [14])

$$\begin{aligned} B_{n}(x) = e^{-x}\sum _{k=0}^{\infty } k^{n} \frac{x^{k}}{k!}. \end{aligned}$$
(1.9)

In terms of the exponential generating function, it may also be expressed as (see [24, Eq. 2])

$$\begin{aligned} e^{(e^{z}-1)x} = \sum _{k=0}^{\infty } B_{k}(x)\frac{z^{k}}{k!}. \end{aligned}$$

Also, the Stirling numbers of the second kind and the Bell polynomial are interconnected via the relation (see [4, p. 15])

$$\begin{aligned} B_{n}(x) = \sum _{k=0}^{n} x^{k} S(n,k). \end{aligned}$$
(1.10)
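Relations (1.9) and (1.10) can be compared numerically: the sketch below evaluates the Bell polynomial from the Stirling numbers and checks it against a truncated Dobinski series (the truncation point 60 and the test values n = 5, x = 2 are arbitrary choices, sufficient here because the tail decays factorially).

```python
from math import comb, exp, factorial

def stirling2(n, k):
    """Stirling numbers of the second kind via Euler's formula (1.1)."""
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

def bell_poly(n, x):
    """Bell (Touchard) polynomial B_n(x) via relation (1.10)."""
    return sum(stirling2(n, k) * x ** k for k in range(n + 1))

n, x = 5, 2.0
dobinski = exp(-x) * sum(k ** n * x ** k / factorial(k) for k in range(60))  # truncated (1.9)
assert abs(dobinski - bell_poly(n, x)) < 1e-9
```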

For any random variable Y with finite moments, using (1.6) with \(x_i = {\mathbb {E}}Y^i\), we have

$$\begin{aligned} \frac{1}{k!} \biggl (\sum _{j=1}^{\infty } {\mathbb {E}}Y^j \frac{z^j}{j!} \biggr )^k = \sum _{n = k}^{\infty } B_{n,k}\left( {\mathbb {E}}Y,{\mathbb {E}}Y^2,\dots ,{\mathbb {E}}Y^{(n-k+1)}\right) \frac{z^n}{n!}. \end{aligned}$$

That is,

$$\begin{aligned} \frac{\left( {\mathbb {E}}e^{zY}-1\right) ^{k}}{k!} = \sum _{n = k}^{\infty } B_{n,k}\left( {\mathbb {E}}Y,{\mathbb {E}}Y^2,\dots ,{\mathbb {E}}Y^{(n-k+1)}\right) \frac{z^n}{n!}. \end{aligned}$$
(1.11)

On comparing (1.11) with (1.3), we deduce a new connection between the probabilistic Stirling numbers of the second kind and the partial exponential Bell polynomials as follows

$$\begin{aligned} S_{Y}(n,k) = B_{n,k}\left( {\mathbb {E}}Y,{\mathbb {E}}Y^2,\dots ,{\mathbb {E}}Y^{(n-k+1)}\right) , \end{aligned}$$
(1.12)

which may be viewed as an alternate representation of \(S_Y(n,k)\) defined in (1.2).
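Relation (1.12) can be checked directly for a concrete random variable. In the sketch below, Y is taken uniform on {1, 2} purely for illustration: the moments of \(S_j\) are obtained by brute-force enumeration, (1.2) is evaluated as written, and the result is compared with the partial Bell polynomial of the moments of Y, computed through a standard recurrence for \(B_{n,k}\).

```python
from fractions import Fraction
from itertools import product
from math import comb, factorial

support, prob = [1, 2], Fraction(1, 2)   # illustrative choice: Y uniform on {1, 2}

def moment_S(j, n):
    """E[S_j^n] for S_j = Y_1 + ... + Y_j, by enumerating all |support|^j outcomes."""
    return sum(prob ** j * sum(ys) ** n for ys in product(support, repeat=j))

def S_Y(n, k):
    """Probabilistic Stirling numbers of the second kind via definition (1.2)."""
    return sum((-1) ** (k - j) * comb(k, j) * moment_S(j, n) for j in range(k + 1)) / factorial(k)

def bell_partial(n, k, x):
    """B_{n,k} via the recurrence B_{n,k} = sum_i C(n-1, i-1) x_i B_{n-i, k-1}."""
    if k == 0:
        return Fraction(int(n == 0))
    return sum(comb(n - 1, i - 1) * x[i - 1] * bell_partial(n - i, k - 1, x)
               for i in range(1, n - k + 2))

moments = [sum(prob * Fraction(y) ** i for y in support) for i in range(1, 7)]   # E[Y^i], i = 1..6
assert all(S_Y(n, k) == bell_partial(n, k, moments) for n in range(6) for k in range(n + 1))
```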

Laskin [19] proposed a fractional generalization of the Bell polynomials which includes the probability mass function (pmf) of the time fractional Poisson process. Several generalizations of the Bell polynomials and the partial exponential Bell polynomials have been studied in the literature, with wide applications in various areas of the applied sciences. For more details on these polynomials, one may refer to [5, 6, 12]. Recently, Kataria et al. [14] established a probabilistic interconnection between the partial exponential Bell polynomials and weighted sums of independent Poisson random variables, and derived some new identities. Kataria and Vellaisamy [15] showed that the Adomian polynomials can be written as finite sums of the partial exponential Bell polynomials.

Under appropriate moment conditions on a random variable Y, Adell and Lekuona [4] studied a probabilistic version of the Stirling numbers of the second kind, which opens avenues for various generalizations of polynomials and numbers associated with them. The primary aim of this article is to present a probabilistic generalization of the Bell polynomials, which we call the probabilistic Bell polynomials, and to address their main properties and applications. The organization of the article is as follows:

In Sect. 2, we define the probabilistic Bell polynomials and derive their exponential generating function. Some alternative representations of the probabilistic Bell polynomials and several combinatorial identities are discussed. A generalization of the power-sum formula linked with the probabilistic Stirling numbers of the second kind is also presented. In Sect. 3, we consider interconnections of Bernoulli, Poisson, geometric, and discrete exponential variates with the probabilistic Bell polynomials and the Stirling numbers of the second kind. The formula for the sum of powers of natural numbers is also obtained in the case of a Bernoulli random variable. Finally, connections of the probabilistic Bell polynomials with cumulants and Appell polynomials are established in Sect. 4.

2 Probabilistic Bell polynomials

Adell and Lekuona [2, 4] introduced the probabilistic Stirling numbers of the second kind and discussed their relation with the Bell polynomials (see [4, Theorem 4.9]). Here, we explore a full connection between the probabilistic Stirling numbers of the second kind and the Bell polynomials by defining a probabilistic version of the Bell polynomials. Let \({\mathcal {H}}\) be the set of random variables Y satisfying the following moment conditions

$$\begin{aligned} {\mathbb {E}}|Y|^n< \infty ,\;\;\;\; n \in {\mathbb {N}}_0 \;\; \text {and}\;\;\lim _{n \rightarrow \infty } \frac{|t|^n {\mathbb {E}}|Y|^n}{n!} = 0,\;\;\;\; |t| < r, \end{aligned}$$
(2.1)

for some \(r > 0\). These conditions ensure the existence of the moment generating function of the random variable Y (see [7, p. 344]).

Let \(Y \in {\mathcal {H}}\) be a random variable. We define the probabilistic Bell polynomials as

$$\begin{aligned} B_{n}^{Y}(x) = \sum _{k=0}^{n} S_{Y} (n,k)x^{k}, \end{aligned}$$
(2.2)

where the probabilistic Stirling numbers of the second kind \(S_{Y} (n,k)\) are defined in terms of the moments of sums of independent and identically distributed (i.i.d.) copies of the random variable Y via the relation (1.2). This definition may be viewed as a probabilistic generalization of the Bell (or Touchard) polynomials with respect to a random variable Y satisfying appropriate moment conditions. It also allows us to build interconnections with some well-known polynomials for different choices of the random variable Y.

One can easily verify that when the random variable Y is degenerate at 1, we recover the classical Bell polynomials. When \(x=1\), (2.2) reduces to \(B_{n}^{Y}(1) = B_{n}^{Y}\), which we call the probabilistic Bell numbers.

Next, we have the following proposition.

Proposition 2.1

The exponential generating function for the probabilistic Bell polynomial is given by

$$\begin{aligned} e^{x({\mathbb {E}}e^{zY}-1)} = \sum _{n=0}^{\infty } B_{n}^Y(x) \frac{z^{n}}{n!}. \end{aligned}$$
(2.3)

Proof

Multiplying both sides of (1.3) by \(x^{k}\) and summing over k from 0 to \(\infty\), we get

$$\begin{aligned} \sum _{k=0}^{\infty }\dfrac{\left[ x({\mathbb {E}}e^{zY}-1)\right] ^{k}}{k!}= & {} \sum _{k=0}^{\infty } \sum _{n=k}^{\infty } S_{Y} (n,k) x^k \frac{z^{n}}{n!} \\ e^{x({\mathbb {E}}e^{zY}-1)}= & {} \sum _{n=0}^{\infty } \left( \sum _{k=0}^{n} S_{Y} (n,k)x^{k} \right) \frac{z^{n}}{n!}. \end{aligned}$$

In view of (2.2), we get the desired proposition. \(\square\)

Let \((Y_{j})_{j\ge 1}\) be a sequence of independent copies of a random variable \(Y \in {\mathcal {H}}\). Then, by Jensen’s inequality, we have

$$\begin{aligned} {\mathbb {E}}|S_k|^n \le k^{n}\,{\mathbb {E}}|Y|^n, ~~~ k \in {\mathbb {N}}, \end{aligned}$$

where \(S_{k} = Y_{1}+Y_{2}+\cdots +Y_{k}\), \(k=1,2,\dots\), with \(S_{0}=0\) (see [4, p. 3]).

The following proposition gives an alternative representation of the probabilistic Bell polynomials in terms of the moments of sums of i.i.d. copies of a random variable in \({\mathcal {H}}\).

Proposition 2.2

Let \(Y \in {\mathcal {H}}\). Then

$$\begin{aligned} B_{n}^{Y}(x) = e^{-x}\sum _{k=0}^{\infty } \frac{x^{k}}{k!}{\mathbb {E}}S_{k}^{n}. \end{aligned}$$
(2.4)

Proof

Simplifying (2.3) for i.i.d. random variables, we get

$$\begin{aligned} \sum _{n=0}^{\infty } B_{n}^{Y}(x) \frac{z^{n}}{n!}&=\sum _{k=0}^{\infty } e^{-x} x^{k} \frac{({\mathbb {E}}e^{zY})^{k}}{k!} =\sum _{k=0}^{\infty } \frac{e^{-x}x^{k}}{k!} ({\mathbb {E}}e^{zY})({\mathbb {E}}e^{zY}) \cdots ({\mathbb {E}}e^{zY})\\&=\sum _{k=0}^{\infty } \frac{e^{-x}x^{k}}{k!} ({\mathbb {E}}e^{zY_{1}})({\mathbb {E}}e^{zY_{2}}) \cdots ({\mathbb {E}}e^{zY_{k}})\\&=\sum _{k=0}^{\infty } \frac{e^{-x}x^{k}}{k!} {\mathbb {E}}e^{zY_{1}+zY_{2}+ \cdots +zY_{k}}\\&=\sum _{k=0}^{\infty } \frac{e^{-x}x^{k}}{k!} {\mathbb {E}}e^{zS_{k}}\\&=\sum _{n=0}^{\infty } \sum _{k=0}^{\infty } \frac{e^{-x}x^{k}}{k!} \frac{z^{n}}{n!}{\mathbb {E}}S_{k}^{n} =\sum _{n=0}^{\infty } \left( \sum _{k=0}^{\infty } \frac{e^{-x}x^{k}}{k!} {\mathbb {E}}S_{k}^{n} \right) \frac{z^{n}}{n!}. \end{aligned}$$

Comparing the coefficients of \(z^n\) on both sides, we get an alternate representation for the probabilistic Bell polynomials. \(\square\)

The representation in (2.4) may be seen as a Dobinski-type formula for the probabilistic Bell polynomials. Also, we have the following observations:

  1. (i)

    When Y is degenerate at 1, (2.4) coincides with Dobinski’s formula (1.9).

  2. (ii)

    When Y follows the standard exponential distribution, so that \({\mathbb {E}}S_k^n=\langle k\rangle _n\), (2.4) yields the following

    $$\begin{aligned} B_{n}^Y(x) = B_{n}^{L}(x)=e^{-x}\sum _{k=0}^{\infty } \langle k\rangle _n\frac{x^{k}}{k!}, \end{aligned}$$

    which are the Lah-Bell polynomials, where \(\langle k \rangle _n = k(k+1) \cdots (k+n-1)\) is the rising factorial (see [16, p. 4]). A numerical check of this case is sketched below.
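As a sanity check of case (ii), for a standard exponential Y one has \(S_Y(n,k) = L(n,k)\), the Lah numbers (see Example 3.1 below), so the truncated Dobinski-type series (2.4) should agree with \(B_n^L(x) = \sum_k L(n,k)x^k\). The truncation point and the test values in the sketch are arbitrary.

```python
from math import comb, exp, factorial

def lah(n, k):
    """Lah numbers L(n,k) = C(n-1, k-1) n!/k!, which equal S_Y(n,k) for Y ~ Exp(1)."""
    return comb(n - 1, k - 1) * factorial(n) // factorial(k) if k >= 1 else int(n == 0)

def rising(k, n):
    """Rising factorial <k>_n = k (k+1) ... (k+n-1)."""
    out = 1
    for i in range(n):
        out *= k + i
    return out

n, x = 4, 1.5
lhs = exp(-x) * sum(rising(k, n) * x ** k / factorial(k) for k in range(80))  # truncated (2.4)
rhs = sum(lah(n, k) * x ** k for k in range(n + 1))                           # B_n^L(x) via (2.2)
assert abs(lhs - rhs) < 1e-8
```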

Next, we derive a result for the probabilistic Bell polynomials in terms of the complete exponential Bell polynomials.

Lemma 2.1

For a random variable Y in \({\mathcal {H}}\), we have

$$\begin{aligned} B_{n}^{Y}(x) = B_{n}(x{\mathbb {E}}Y,x{\mathbb {E}}Y^2,...,x{\mathbb {E}}Y^{n}). \end{aligned}$$

Proof

With the help of (1.12) and (2.2), we get

$$\begin{aligned} B_{n}^{Y}(x) =&\sum _{k=0}^{n} B_{n,k}({\mathbb {E}}Y,{\mathbb {E}}Y^2,...,{\mathbb {E}}Y^{(n-k+1)})x^{k}\\ =&\sum _{k=0}^{n} B_{n,k}(x{\mathbb {E}}Y,x{\mathbb {E}}Y^2,...,x{\mathbb {E}}Y^{(n-k+1)}). \end{aligned}$$

The last step follows from property (1.5). Hence, by virtue of (1.7), we get the desired connection. \(\square\)

Alternatively, Lemma 2.1 can also be obtained from (1.8) by taking \(x_i=x{\mathbb {E}}Y^i\) for \(i=1,2,\ldots , n\) and comparing with (2.3).

It may be noticed that for \(x=1\), Lemma 2.1 reduces to a new representation of the probabilistic Bell numbers of the form

$$\begin{aligned} B_{n}^{Y} = B_{n}({\mathbb {E}}Y,{\mathbb {E}}Y^2,...,{\mathbb {E}}Y^{n}). \end{aligned}$$
(2.5)

When the random variable Y is degenerate at 1, it reduces to the classical Bell numbers.

In the next proposition, we have a recurrence relation for the probabilistic Bell polynomials.

Proposition 2.3

For \(n \ge 0,\) we have

$$\begin{aligned} B_{n+1}^{Y}(x) = x\sum _{k=0}^{n} \left( {\begin{array}{c}n\\ k\end{array}}\right) {\mathbb {E}}Y^{k+1}B_{n-k}^Y(x). \end{aligned}$$

Proof

A recurrence relation for the complete exponential Bell polynomials is of the following form (see [12])

$$\begin{aligned} B_{n+1}(x_1, x_2,\dots ,x_{n+1}) = \sum _{k=0}^{n} \left( {\begin{array}{c}n\\ k\end{array}}\right) x_{k+1}B_{n-k}(x_1, x_2,\dots , x_{n-k}). \end{aligned}$$
(2.6)

Consider \(Y \in {\mathcal {H}}\). Let \(x_i = x {\mathbb {E}}Y^i\), \(i =1,2,\dots\) in (2.6). With the help of Lemma 2.1, we get the desired recurrence relation. \(\square\)
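When Y is degenerate at 1, Proposition 2.3 reduces to the classical recurrence \(B_{n+1}(x) = x\sum_{k=0}^{n}\binom{n}{k}B_{n-k}(x)\) for the Bell polynomials, which the following sketch verifies exactly for small n (the rational test point x = 3/2 is arbitrary).

```python
from fractions import Fraction
from math import comb, factorial

def stirling2(n, k):
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

def bell_poly(n, x):
    """Bell polynomial B_n(x) = sum_k S(n,k) x^k, i.e. B_n^Y(x) for Y degenerate at 1."""
    return sum(stirling2(n, k) * x ** k for k in range(n + 1))

x = Fraction(3, 2)
for n in range(6):
    # Proposition 2.3 with E[Y^{k+1}] = 1 for all k
    assert bell_poly(n + 1, x) == x * sum(comb(n, k) * bell_poly(n - k, x) for k in range(n + 1))
```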

Motivated by binomial-type results for special polynomials such as the Lah-Bell and Laguerre polynomials (see [31, Eg. 4.2]), a generalized binomial identity for the probabilistic Bell polynomials may also be obtained.

Proposition 2.4

Let \(B_{n}^{Y}(x)\) be the probabilistic Bell polynomials. Then, we have

$$\begin{aligned} B_{n}^{Y}(x+y) = \sum _{k=0}^{n} \left( {\begin{array}{c}n\\ k\end{array}}\right) B_{k}^{Y}(x) B_{n-k}^{Y}(y). \end{aligned}$$
(2.7)

Proof

We may observe that (2.7) is an immediate consequence of Proposition 2.1. Alternatively, it can also be obtained using Lemma 2.1 with the help of the following relation (see [11])

$$\begin{aligned} B_{n}(x_1+y_1,x_2+y_2,\dots ,x_n+y_n) = \sum _{k=0}^{n} \left( {\begin{array}{c}n\\ k\end{array}}\right) B_k(x_1, x_2, \dots , x_k)B_{n-k}(y_1, y_2, \dots , y_{n-k}). \end{aligned}$$

\(\square\)
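For Y degenerate at 1, (2.7) becomes the classical binomial-type identity \(B_n(x+y) = \sum_k\binom{n}{k}B_k(x)B_{n-k}(y)\); the sketch below checks this exactly for small n with arbitrary rational test points.

```python
from fractions import Fraction
from math import comb, factorial

def stirling2(n, k):
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

def bell_poly(n, x):
    return sum(stirling2(n, k) * x ** k for k in range(n + 1))

x, y = Fraction(1, 2), Fraction(5, 3)
for n in range(7):
    # (2.7) specialized to Y degenerate at 1, where B_n^Y = B_n
    assert bell_poly(n, x + y) == sum(comb(n, k) * bell_poly(k, x) * bell_poly(n - k, y)
                                      for k in range(n + 1))
```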

It is evident that the sequence \(\{B_{n}^{Y}(x)\}_{n \ge 1}\) is of binomial type with initial condition \(B_{0}^{Y}(x)=1\). Now, we present some identities for the probabilistic Bell polynomials, which generalize several well-known identities reported in the recent literature. These identities can be derived in a similar fashion to the classical ones reported in [30, 32].

Theorem 2.1

Let \((B_{n}^{Y}(x))_{n}\) be the sequence of probabilistic Bell polynomials defined on \(I \subseteq {\mathbb {R}}\), with \(B_{0}^{Y}(x) \ne 0\). Then, we have

$$\begin{aligned} (i)~~ B_{n,k}(B_{0}^{Y}(1),B_{1}^{Y}(2),B_{2}^{Y}(3),\dots ) = \left( {\begin{array}{c}n-1\\ k-1\end{array}}\right) \sum _{j=0}^{n-k} S_{Y}(n-k,j) n^{j}. \end{aligned}$$
(2.8)
$$\begin{aligned} (ii)~~ B_{n,k}(B_{0}^{Y}(x),2 B_{1}^{Y}(x),3 B_{2}^{Y}(x),\dots ) = \left( {\begin{array}{c}n\\ k\end{array}}\right) \sum _{j=0}^{n-k} x^{j} S_{Y}(n-k,j) k^{j}. \end{aligned}$$
(2.9)
$$\begin{aligned} (iii)~~ B_{n,k}(B_{1}^{Y}(x),B_{2}^{Y}(x),B_{3}^{Y}(x),\dots ) = \sum _{j=0}^{n} S_Y(n,j)x^j S(j,k),~~n\ge k. \end{aligned}$$
(2.10)
$$\begin{aligned} (iv)~~ B_{n}^{Y}(x) = \sum _{k=0}^{n} \left( {\begin{array}{c}x\\ k\end{array}}\right) k! B_{n,k}(B_{1}^{Y},B_{2}^{Y},\dots ,B_{n-k+1}^{Y} ). \end{aligned}$$
(2.11)

In particular, when Y is degenerate at 1 and \(x=1\), part (ii) of Theorem 2.1 gives

$$\begin{aligned} B_{n,k}(B_{0},2 B_{1},3 B_{2},\ldots ) = \left( {\begin{array}{c}n\\ k\end{array}}\right) \sum _{j=0}^{n-k} S(n-k,j) k^{j}, \end{aligned}$$

which has been studied in [1]. Also, when Y follows the exponential distribution with unit mean, part (iii) of Theorem 2.1 provides a new connection between the Lah-Bell polynomials and the partial exponential Bell polynomials of the form

$$\begin{aligned} B_{n,k}(B_{1}^{L}(x),B_{2}^{L}(x),B_{3}^{L}(x),...) = \sum _{j=0}^{n} L(n,j)x^j S(j,k),\;\text {for}~ n\ge k, \end{aligned}$$

where L(n, j) denotes the Lah numbers.

The next theorem is a convolution result in terms of derivatives of the probabilistic Bell polynomials and the probabilistic Stirling numbers of the second kind.

Theorem 2.2

For \(k \le n,\) we have

$$\begin{aligned} \displaystyle \frac{d^k}{dx^k} B_{n}^{Y}(x) = k!\sum _{j=0}^{n-k} \left( {\begin{array}{c}n\\ j\end{array}}\right) B_{j}^{Y}(x) S_{Y}(n-j,k), \end{aligned}$$
(2.12)

where \(\displaystyle \frac{d^k}{dx^k}B_{n}^{Y}(x)\) denotes the kth order derivative of \(B_{n}^{Y}(x)\).

Proof

Differentiating (2.3) k times successively with respect to x yields

$$\begin{aligned} \sum _{n=k}^{\infty } \displaystyle \frac{d^k}{dx^k}B_{n}^{Y}(x) \frac{z^{n}}{n!}&= k!\left[ e^{x({\mathbb {E}}e^{zY}-1)} \right] \left[ \frac{({\mathbb {E}}e^{zY}-1)^{k}}{k!}\right] \end{aligned}$$
(2.13)
$$\begin{aligned}&= k! \sum _{j=0}^{\infty } B_{j}^Y(x) \frac{z^{j}}{j!} \sum _{i=k}^{\infty } S_{Y}(i,k) \frac{z^{i}}{i!}\nonumber \\&= k!\sum _{n=k}^{\infty } \sum _{j=0}^{n-k} \left( {\begin{array}{c}n\\ j\end{array}}\right) B_{j}^Y(x) S_{Y}(n-j,k) \frac{z^{n}}{n!}\nonumber \\&= \sum _{n=k}^{\infty } k!\left[ \sum _{j=0}^{n-k} \left( {\begin{array}{c}n\\ j\end{array}}\right) B_{j}^Y(x) S_{Y}(n-j,k) \right] \frac{z^{n}}{n!}. \end{aligned}$$
(2.14)

Comparing the coefficients of \(z^n\) on both sides, we get the result. \(\square\)

For \(k=1\), the first order derivative for the probabilistic Bell polynomials is

$$\begin{aligned} \frac{d}{dx} B_{n}^Y(x) = \sum _{j=0}^{n-1} \left( {\begin{array}{c}n\\ j\end{array}}\right) {\mathbb {E}}Y^{n-j}B_{j}^Y(x). \end{aligned}$$

For special choices of the random variable Y and different values of k, we may obtain several new convolution results. As a particular case, we have the following example.

Example 2.1

Let Y follow the standard exponential distribution. Then

$$\begin{aligned} \frac{d}{dx} B_{n}^{L}(x) = \sum _{j=0}^{n-1} \left( {\begin{array}{c}n\\ j\end{array}}\right) (n-j)! B_{j}^{L}(x), \end{aligned}$$

where \(B_{n}^{L}(x)\) denotes the Lah-Bell polynomials. This result also appears in Theorem 10 of [18].

The probabilistic Bell polynomials are closely associated with the probabilistic version of the Stirling numbers of the second kind. To establish a few connections with well-known families of distributions, we first generalize the power-sum formula and express it in terms of the probabilistic Stirling numbers of the second kind. These formulas extend a well-known identity reported in Spivey [28]. Also, for different choices of the random variable Y, we may obtain several new summation formulas which may be useful in combinatorics.

For an arbitrary real-valued function g(x), the first order forward difference is given by (see [22])

$$\begin{aligned} \bigtriangleup _y^1 g(x) = g(x+y) - g(x). \end{aligned}$$

In a similar manner, its iterates are given by

$$\begin{aligned} \bigtriangleup _{y_1,y_2,\dots , y_j}^j g(x) = \left( \bigtriangleup _{y_1}^1 \circ \bigtriangleup _{y_2}^1 \circ \cdots \circ \bigtriangleup _{y_j}^1\right) g(x), \;\;\; (y_1,y_2,\dots , y_j) \in {\mathbb {R}}^j, \;\;\; j \in {\mathbb {N}}. \end{aligned}$$

In particular, for \(y_1 = y_2 = \cdots = y_j = 1\), we get

$$\begin{aligned} \bigtriangleup _{1,1,\dots ,1}^j g(x) = \bigtriangleup ^j g(x) = \sum _{i=0}^{j} \left( {\begin{array}{c}j\\ i\end{array}}\right) (-1)^{j-i} g(x+i), \end{aligned}$$

where \(\bigtriangleup ^j g(x)\) denotes the jth order forward difference of the function g(x).

Let \(q_n(x)\) be a polynomial of exact degree n. Let \(Y\in {\mathcal {H}}\) and \(S_k=Y_1+Y_2+\cdots +Y_k\). In terms of the forward difference operator, we have the following formulas (see [4, pp. 9–10])

$$\begin{aligned} {\mathbb {E}}q_n (x+S_k) = \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) {\mathbb {E}}\bigtriangleup _{Y_1,...,Y_j}^j q_n(x). \end{aligned}$$
(2.15)

Also,

$$\begin{aligned} \sum _{k=0}^{m} {\mathbb {E}}q_n (x+S_k) = \sum _{j=0}^{n \wedge m} \left( {\begin{array}{c}m+1\\ j+1\end{array}}\right) {\mathbb {E}}\bigtriangleup _{Y_1,...,Y_j}^j q_n(x), \end{aligned}$$
(2.16)

where \(n,m \in {\mathbb {N}}_0\) and \(n \wedge m= \min (n,m)\).

When \(q_n(x)\) is a monomial of degree n, that is, \(q_n(x) = I_n(x) = x^n\), (2.16) reduces to

$$\begin{aligned} \sum _{k=0}^{m} {\mathbb {E}}I_n (x+S_k) = \sum _{j=0}^{n \wedge m} \left( {\begin{array}{c}m+1\\ j+1\end{array}}\right) j!S_Y (n,j;x), \end{aligned}$$
(2.17)

where

$$\begin{aligned} S_Y(n,j;x) = \frac{1}{j!} {\mathbb {E}}\bigtriangleup _{Y_1,...,Y_j}^j I_n(x), \end{aligned}$$

with \(S_Y(n,j;0) = S_Y(n,j)\). For a detailed discussion about these results one can consult [4, Theorem 3.3].

Hence, motivated by the applications of \(S_Y(n,j)\) reported in [4], we present a probabilistic generalization of a combinatorial identity in the next theorem.

Theorem 2.3

For any \(n,m \in {\mathbb {N}}_0\), we have

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) {\mathbb {E}}I_n(x+S_k) = \sum _{j=0}^{n \wedge m} \left( {\begin{array}{c} m\\ j\end{array}}\right) j!S_Y(n,j;x)2^{m-j}. \end{aligned}$$
(2.18)

Proof

Choosing \(q_n(x) = I_n(x)\) in (2.15), we get

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) {\mathbb {E}}I_n(x+S_k)&= \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) j! S_Y(n,j;x)\\&= \sum _{j=0}^{m}j! S_Y(n,j;x) \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) \left( {\begin{array}{c}k\\ j\end{array}}\right) . \end{aligned}$$

With the help of the well-known combinatorial identity \(\sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) \left( {\begin{array}{c}k\\ j\end{array}}\right) = \left( {\begin{array}{c}m\\ j\end{array}}\right) 2^{m-j}\), we get the result. \(\square\)

Remark 2.1

When Y is degenerate at 1, (2.18) leads to the well-known combinatorial identity

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) I_n(x+k) = \sum _{j=0}^{n \wedge m} \left( {\begin{array}{c}m\\ j\end{array}}\right) 2^{m-j} \bigtriangleup ^j I_n(x), \end{aligned}$$

which has been studied in [28].
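The identity in Remark 2.1 is easy to test numerically; the sketch below evaluates both sides for one arbitrary choice of m, n and x, with the forward difference computed from its definition.

```python
from math import comb

def fwd_diff(j, f, x):
    """j-th forward difference (unit step) of f at x."""
    return sum((-1) ** (j - i) * comb(j, i) * f(x + i) for i in range(j + 1))

m, n, x = 7, 4, 3
lhs = sum(comb(m, k) * (x + k) ** n for k in range(m + 1))
rhs = sum(comb(m, j) * 2 ** (m - j) * fwd_diff(j, lambda t: t ** n, x)
          for j in range(min(n, m) + 1))
assert lhs == rhs
```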

It is important to note that Theorem 2.3 can be extended to any polynomial \(q_n(x)\) of exact degree n as

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) {\mathbb {E}}q_n(x+S_k) = \sum _{j=0}^{n \wedge m} \left( {\begin{array}{c} m\\ j\end{array}}\right) {\mathbb {E}}\bigtriangleup _{Y_1,...,Y_j}^j q_n(x)\, 2^{m-j}. \end{aligned}$$

Corollary 2.1

For \(j, m, k \in {\mathbb {N}}_0\) with \(j \le k \le m\), we have

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) {\mathbb {E}}I_{m-k}(x+S_k) = \sum _{j=0}^{m} j! \left( {\begin{array}{c}m\\ j\end{array}}\right) \sum _{k=j}^{m-j} \left( {\begin{array}{c}m-j\\ m-k\end{array}}\right) S_Y(m-k,j;x). \end{aligned}$$
(2.19)

Proof

Consider the monomial \(I_n(x)\) on the left-hand side of (2.19). An application of (2.15) yields

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) {\mathbb {E}}I_{m-k}(x+S_k)&= \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) j!S_Y(m-k,j;x)\\&= \sum _{j=0}^{m} j! \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) \left( {\begin{array}{c}k\\ j\end{array}}\right) S_Y(m-k,j;x)\\&= \sum _{j=0}^{m} j! \left( {\begin{array}{c}m\\ j\end{array}}\right) \sum _{k=j}^{m-j} \left( {\begin{array}{c}m-j\\ m-k\end{array}}\right) S_Y(m-k,j;x), \end{aligned}$$

which completes the proof. \(\square\)

The Stirling numbers of the second kind may be expressed in terms of the moments of sums of uniform random variables on [0, 1] (i.e., U[0, 1]) via the relation (see [3])

$$\begin{aligned} S(n,k) = \left( {\begin{array}{c}n\\ k\end{array}}\right) {\mathbb {E}}(Y_1 + Y_2 + \cdots + Y_k)^{n-k}=\left( {\begin{array}{c}n\\ k\end{array}}\right) {\mathbb {E}}S_k^{n-k}, \end{aligned}$$
(2.20)

where \((Y_{j})_{j \ge 1}\) are i.i.d. U[0, 1] random variables. For \(Y\sim U[0,1]\), we may provide an alternative representation of the classical Bell numbers using (2.19) with \(x=0\), in terms of the probabilistic Stirling numbers of the second kind, through the relation

$$\begin{aligned} B_n = \sum _{j=0}^{n} j! \left( {\begin{array}{c}n\\ j\end{array}}\right) \sum _{k=j}^{n-j} \left( {\begin{array}{c}n-j\\ n-k\end{array}}\right) S_Y(n-k,j). \end{aligned}$$

3 Probabilistic Bell polynomials and some probability distributions

Let \(Y\in {\mathcal {H}}\). With the help of (1.4) and (1.12), the probabilistic Stirling numbers of the second kind \(S_Y(n,k)\) have the following representation

$$\begin{aligned} S_Y(n,k) = \sum _{\Omega _n ^{k}}^{} \frac{n!}{k_1! k_2! \cdots k_{n-k+1} !} \prod _{j=1}^{n-k+1} \biggl (\frac{{\mathbb {E}}Y^j}{j!}\biggr )^{k_j}, \end{aligned}$$

where \(\Omega _n ^{k} = \biggl \{(k_1, k_2, \dots , k_{n-k+1}): \sum _{j=1}^{n-k+1}k_j = k, \sum _{j=1}^{n-k+1}jk_j = n, k_j \in {\mathbb {N}}_0 \biggr \}.\)

Example 3.1

Let Y follow the standard exponential distribution. Then, we have

$$\begin{aligned} B_{n,k}\left( 1!, 2!, \dots ,(n-k+1)!\right) =\sum _{\Omega _n ^{k}}^{} \frac{n!}{k_1! k_2! \cdots k_{n-k+1} !}, \end{aligned}$$

which are the Lah numbers, as reported in [17, p. 5].

Next, we present some interconnections of the probabilistic Bell polynomials and the probabilistic Stirling numbers of the second kind for some special probability distributions. We first start with the Poisson random variable.

3.1 Probabilistic Bell polynomials and Poisson distribution

Let \(Y\sim \text {Poisson}(\lambda )\) with pmf

$$\begin{aligned} \mathbb {P}(Y=j) = \frac{\lambda ^{j}}{j!}e^{-\lambda }, ~~j \in {\mathbb {N}}_{0}. \end{aligned}$$
(3.1)

One can easily see that

$$\begin{aligned} B_{n}(\lambda ) = {\mathbb {E}}Y^{n}, ~~\lambda > 0. \end{aligned}$$

Theorem 3.1

Let \(Y \sim \text {Poisson}(\lambda )\). Then

$$\begin{aligned} B_{n}^{Y}(x) = \sum _{k=0}^{n} \lambda ^{k} B_{k}(x) S(n,k), \end{aligned}$$
(3.2)

where S(n, k) denotes the Stirling numbers of the second kind.

Also, for any \(m \ge n\), we have

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) B_n(k\lambda ) = \sum _{j=0}^{n } \left( {\begin{array}{c} m\\ j\end{array}}\right) j!S_Y(n,j)2^{m-j}. \end{aligned}$$
(3.3)

Proof

The moment generating function for a Poisson variate is

$$\begin{aligned} {\mathbb {E}}e^{zY} = e^{\lambda (e^{z}-1)}. \end{aligned}$$

With the help of Proposition 2.1, we get

$$\begin{aligned} \sum _{n=0}^{\infty } B_{n}^{Y}(x) \frac{z^{n}}{n!} = e^{x{(e^{\lambda (e^{z}-1)}-1)}}&= \sum _{k=0}^{\infty } B_{k}(x) \lambda ^{k} \frac{[(e^{z}-1)]^{k}}{k!}\\&= \sum _{k=0}^{\infty } B_{k}(x) \lambda ^{k} \sum _{n=k}^{\infty } S(n,k) \frac{z^{n}}{n!}\\&= \sum _{n=0}^{\infty } \left( \sum _{k=0}^{n} \lambda ^{k} B_{k}(x) S(n,k) \right) \frac{z^{n}}{n!}. \end{aligned}$$

Comparing the coefficients of \(z^n\) on both sides, we get the desired result.

Since the sum of independent Poisson variates is also Poisson, we have \(S_k\sim \text {Poisson}(k\lambda )\) with \({\mathbb {E}}S_k^n = B_n(k\lambda )\). Finally, applying Theorem 2.3 with \(x=0\) proves (3.3). \(\square\)
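Identity (3.2) can be verified exactly in rational arithmetic for small n, computing \(S_Y(n,k)\) from (1.12) with the Poisson moments \(\mathbb{E}Y^m = B_m(\lambda)\); the values of \(\lambda\) and x below are arbitrary, and bell_partial is an illustrative helper based on a standard recurrence for the partial Bell polynomials.

```python
from fractions import Fraction
from math import comb, factorial

def stirling2(n, k):
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

def bell_poly(n, x):
    return sum(stirling2(n, k) * x ** k for k in range(n + 1))

def bell_partial(n, k, x):
    if k == 0:
        return Fraction(int(n == 0))
    return sum(comb(n - 1, i - 1) * x[i - 1] * bell_partial(n - i, k - 1, x)
               for i in range(1, n - k + 2))

lam, x, N = Fraction(2, 3), Fraction(1, 2), 5
poisson_moments = [bell_poly(m, lam) for m in range(1, N + 1)]                     # E[Y^m] = B_m(lambda)
for n in range(N + 1):
    lhs = sum(bell_partial(n, k, poisson_moments) * x ** k for k in range(n + 1))  # B_n^Y(x) by (1.12), (2.2)
    rhs = sum(lam ** k * bell_poly(k, x) * stirling2(n, k) for k in range(n + 1))  # right side of (3.2)
    assert lhs == rhs
```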

3.2 Probabilistic Bell polynomials and Bernoulli’s distribution

Let Y be a Bernoulli random variable with parameter \(p\in (0,1]\). The nth order moment of Y is

$$\begin{aligned} {\mathbb {E}}Y^{n} = p, ~~~ n \ge 1. \end{aligned}$$
(3.4)

Using this fact in (1.12), we get

$$\begin{aligned} S_{Y}(n,k) = p^{k} S(n,k), ~~~ k \le n. \end{aligned}$$
(3.5)

Also, we have

$$\begin{aligned} B_n^Y(x) = B_{n}(px). \end{aligned}$$

The next result is an application of the probabilistic Stirling numbers of the second kind to obtaining the sum of powers of natural numbers via a Bernoulli random variable.

Theorem 3.2

Let Y be a Bernoulli random variable with parameter \(p \in (0,1]\). Then

$$\begin{aligned} 1^n +2^n+ \cdots +k^n = (k+1)\int _{0}^{1} \sum _{m=0}^{k} \left( {\begin{array}{c}k\\ m\end{array}}\right) m! S_Y(n,m) dp, \end{aligned}$$
(3.6)

where \(S_Y(n,m)\) depends on parameter p as in (3.5).

Proof

Let \((Y_{j})_{j \ge 1}\) be a sequence of i.i.d. copies of the random variable Y. The sum \(S_{k} = Y_{1} + Y_{2} + \cdots + Y_{k}\), \(k = 1,2,\dots\), follows the binomial distribution with parameters k and p. The nth order moment of the random variable \(S_k\) is given by (see [23])

$$\begin{aligned} {\mathbb {E}}S_k^n = \sum _{i=1}^{n} \left( {\begin{array}{c}k\\ i\end{array}}\right) p^i i! S(n,i). \end{aligned}$$

For \(q_n(x) = I_n(x)\), (2.15) gives

$$\begin{aligned} \sum _{i=1}^{n} \left( {\begin{array}{c}k\\ i\end{array}}\right) p^i i! S(n,i)&= \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) {\mathbb {E}}\bigtriangleup _{Y_1,...,Y_j}^j I_n(0) = \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) j! S_Y(n,j). \end{aligned}$$

Integrating both sides with respect to p, we get

$$\begin{aligned} \int _{0}^{1} \sum _{i=1}^{n} \left( {\begin{array}{c}k\\ i\end{array}}\right) p^i i! S(n,i) dp&= \int _{0}^{1} \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) j! S_Y(n,j) dp \nonumber \\ \frac{1}{k+1} \sum _{i=1}^{n} \left( {\begin{array}{c}k+1\\ i+1\end{array}}\right) i! S(n,i)&= \int _{0}^{1} \sum _{j=0}^{k} \left( {\begin{array}{c}k\\ j\end{array}}\right) j! S_Y(n,j) dp. \end{aligned}$$
(3.7)

It is well known that \(1^n +2^n+ \cdots +k^n = \sum _{i=1}^{n} \left( {\begin{array}{c}k+1\\ i+1\end{array}}\right) i! S(n,i)\) (see [23]). Therefore from (3.7), the theorem is proved. \(\square\)
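Since \(S_Y(n,m) = p^m S(n,m)\) here, the integral in (3.6) evaluates termwise to \(\binom{k}{m}m!\,S(n,m)/(m+1)\). The sketch below checks the resulting closed form against the power sum for one arbitrary choice of n and k.

```python
from fractions import Fraction
from math import comb, factorial

def stirling2(n, k):
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

n, k = 4, 10
# Right side of (3.6) after integrating p^m over [0, 1]
rhs = (k + 1) * sum(comb(k, m) * factorial(m) * stirling2(n, m) * Fraction(1, m + 1)
                    for m in range(k + 1))
assert rhs == sum(i ** n for i in range(1, k + 1))   # 1^n + 2^n + ... + k^n
```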

3.3 Grunert’s formula and geometric distribution

For the formal power series \({\mathcal {S}}_{n}(x) =\sum _{j=0}^{\infty }j^n x^j\), Grunert’s formula in terms of the Stirling numbers of the second kind is given by

$$\begin{aligned} \left( x \frac{d}{dx}\right) ^l {\mathcal {S}}_n(x) = \sum _{j=0}^{l}x^j S(l,j) \frac{d^j}{dx^j} {\mathcal {S}}_n(x) = {\mathcal {S}}_{n+l}(x). \end{aligned}$$
(3.8)

The second equality is obtained in [26, pp. 126–127], where \({\mathcal {S}}_{n+l}(x) =\sum _{j=0}^{\infty }j^{n+l} x^j\).

Let Y be a random variable following the geometric distribution with parameter p such that \(q=1-p\), \(0<p<1\). Then, its pmf is given by

$$\begin{aligned} \mathbb {P}(Y = j) = pq^j, ~~~~ j \in {\mathbb {N}}_0. \end{aligned}$$
(3.9)

Consider,

$$\begin{aligned} {\mathcal {S}}_{n+l}(q) = \frac{1}{p} \sum _{j=0}^{\infty }j^{n+l}pq^j = \frac{1}{p} {\mathbb {E}}Y^{n+l}. \end{aligned}$$
(3.10)

We define the multinomial convolutions of the form

$$\begin{aligned} {\mathcal {S}}_{n+l}^{*k}(q) = \sum _{\begin{array}{c} n_1 +n_2+ \cdots + n_k = n\\ l_1 +l_2+ \cdots + l_k = l \end{array}}^{} \frac{(n+l)!}{(n_1 +l_1)!(n_2+l_2)! \cdots (n_k+l_k)!}{\mathcal {S}}_{n_1+l_1}(q) {\mathcal {S}}_{n_2+l_2}(q) \cdots {\mathcal {S}}_{n_k+l_k}(q). \end{aligned}$$
(3.11)

Using (3.10), a probabilistic version of (3.11) is given by

$$\begin{aligned} {\mathcal {S}}_{n+l}^{*k}(q)&= \frac{1}{p^k}\sum _{\begin{array}{c} n_1 +n_2+ \cdots + n_k = n\\ l_1 +l_2+ \cdots + l_k = l \end{array}}^{} \frac{(n+l)!}{(n_1 +l_1)!(n_2+l_2)! \cdots (n_k+l_k)!} {\mathbb {E}}Y_1^{n_1+l_1} {\mathbb {E}}Y_2^{n_2+l_2} \cdots {\mathbb {E}}Y_k^{n_k+l_k}\\&= \frac{1}{p^k} {\mathbb {E}}(Y_1 + Y_2 + \cdots + Y_k)^{n+l} = \frac{1}{p^k} {\mathbb {E}}S_k^{n+l}. \end{aligned}$$

In light of these constructions, we have the following results.

Theorem 3.3

Let Y follow the geometric distribution with parameter p. Then, we have

$$\begin{aligned} S_Y(n,k) = \sum _{j=k}^{n} \left( {\begin{array}{c}j\\ k\end{array}}\right) \left( \frac{q}{p}\right) ^j \langle k \rangle _{j-k} S(n,j). \end{aligned}$$
(3.12)

Also,

$$\begin{aligned} \sum _{k=0}^{m} \left( {\begin{array}{c}m\\ k\end{array}}\right) p^k {\mathcal {S}}_{n+l}^{*k}(q) = \sum _{j=0}^{n \wedge m} \left( {\begin{array}{c} m\\ j\end{array}}\right) j!S_Y(n,j)2^{m-j}. \end{aligned}$$
(3.13)

Proof

Let \(|e^z -1| < p/q.\) Then, by (3.9), we have

$$\begin{aligned} {\mathbb {E}}e^{zY} - 1 = \frac{q}{p} (e^z-1) \left[ \frac{1}{1- \frac{q}{p} (e^{z}-1)}\right] . \end{aligned}$$

Using (1.3) and the binomial expansion, we get

$$\begin{aligned} \frac{\left( {\mathbb {E}}e^{zY}-1\right) ^{k}}{k!}&= \frac{q^k(e^z -1)^k}{k! p^k} \left[ \frac{1}{1- \frac{q}{p} (e^{z}-1)}\right] ^k\\&= \frac{1}{k!}\left( \frac{q(e^z-1)}{p}\right) ^k \sum _{i=0}^{\infty } \left( {\begin{array}{c}-k\\ i\end{array}}\right) \left( -\frac{q}{p}\right) ^i (e^z -1)^i\\&= \left( \frac{q}{p}\right) ^k \sum _{i=0}^{\infty } \left( {\begin{array}{c}k+i\\ k\end{array}}\right) \langle k \rangle _i \left( \frac{q}{p}\right) ^i \frac{(e^z -1)^{k+i}}{(k+i)!}\\&= \left( \frac{q}{p}\right) ^k \sum _{j=k}^{\infty } \left( {\begin{array}{c}j\\ k\end{array}}\right) \langle k \rangle _{j-k} \left( \frac{q}{p}\right) ^{j-k} \sum _{n=j}^{\infty } S(n,j)\frac{z^n}{n!}\\&= \sum _{n=k}^{\infty } \sum _{j=k}^{n} \left( {\begin{array}{c}j\\ k\end{array}}\right) \langle k \rangle _{j-k} \left( \frac{q}{p}\right) ^{j} S(n,j)\frac{z^n}{n!}. \end{aligned}$$

Taking into account the form of (1.3), we obtain (3.12).

The proof of (3.13) follows from Theorem 2.3 with \(x=0\). \(\square\)
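Identity (3.12) can also be verified exactly in rational arithmetic: \(S_Y(n,k)\) is computed from (1.12) using the standard moment formula \(\mathbb{E}Y^m = \sum_i S(m,i)\,i!\,(q/p)^i\) for the geometric law on \(\{0,1,2,\dots\}\), and compared with the right-hand side of (3.12). The choice \(p = 1/3\) below is arbitrary.

```python
from fractions import Fraction
from math import comb, factorial

def stirling2(n, k):
    return sum((-1) ** (k - j) * comb(k, j) * j ** n for j in range(k + 1)) // factorial(k)

def bell_partial(n, k, x):
    if k == 0:
        return Fraction(int(n == 0))
    return sum(comb(n - 1, i - 1) * x[i - 1] * bell_partial(n - i, k - 1, x)
               for i in range(1, n - k + 2))

def rising(k, m):
    out = 1
    for i in range(m):
        out *= k + i
    return out

p = Fraction(1, 3); q = 1 - p; r = q / p
N = 5
geo_moments = [sum(stirling2(m, i) * factorial(i) * r ** i for i in range(m + 1))
               for m in range(1, N + 1)]                                  # E[Y^m] for the geometric law
for n in range(N + 1):
    for k in range(n + 1):
        lhs = bell_partial(n, k, geo_moments)                             # S_Y(n,k) via (1.12)
        rhs = sum(comb(j, k) * r ** j * rising(k, j - k) * stirling2(n, j)
                  for j in range(k, n + 1))                               # right side of (3.12)
        assert lhs == rhs
```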

It may be observed that, taking \(q = e^{-1}\) in (3.12), we may deduce the interconnection between the probabilistic Stirling numbers of the second kind and the discrete exponential distribution. Suppose Y follows the discrete exponential distribution whose distribution function is given by

$$\begin{aligned} F(k) = 1-e^{-k}, ~~~~~~~ k =0,1,.... \end{aligned}$$
(3.14)

For any \(z < 1\), the moment generating function of the random variable Y is (see [27])

$$\begin{aligned} {\mathbb {E}}e^{zY} = 1 + \frac{e^z -1}{1- e^{-(1-z)}}. \end{aligned}$$
(3.15)

Then, we have the following corollary in the case of the discrete exponential distribution.

Corollary 3.1

For \(|e^z -1| < e-1,\) we have

$$\begin{aligned} S_Y(n,k) = e^k \sum _{j=k}^{n}\left( {\begin{array}{c}j\\ k\end{array}}\right) \langle k \rangle _{j-k} \frac{S(n,j)}{(e-1)^j}. \end{aligned}$$
(3.16)

4 Applications

4.1 Inverse relation

For \(n \ge 1\), if we have

$$\begin{aligned} B_{n} = \sum _{k=1}^{n}B_{n,k}(x_1, x_2,...,x_{n-k+1}), \end{aligned}$$
(4.1)

then the inverse relation for the partial exponential Bell polynomials is given by (see [21])

$$\begin{aligned} x_{n} = \sum _{k=1}^{n} (-1)^{k-1}(k-1)! B_{n,k}(B_{1},B_{2},...,B_{n-k+1}), ~~~ n \ge 1. \end{aligned}$$
(4.2)

For a random variable Y, the cumulant generating function is defined as

$$\begin{aligned} K_Y(z) = \text {log}{\mathbb {E}}e^{zY} = \sum _{i=1}^{\infty } {\mathcal {K}}_{i} \frac{z^i}{i!}, \end{aligned}$$

where \({\mathcal {K}}_{i}\) denotes the ith cumulant of the random variable Y. Also, we have the following relation in terms of the moments

$$\begin{aligned} B_n({\mathcal {K}}_{1},{\mathcal {K}}_{2},\dots , {\mathcal {K}}_{n}) = {\mathbb {E}}Y^n. \end{aligned}$$

Using the inverse relation with a suitable choice of the \(x_i\) and applying (2.5), a connection between the cumulants and the probabilistic Bell numbers is established, given by

$$\begin{aligned} B_{n} ({\mathcal {K}}_{1},{\mathcal {K}}_{2},...,{\mathcal {K}}_{n}) = \sum _{k=1}^{n}(-1)^{k-1}(k-1)! B_{n,k}(B_{1}^{Y},B_{2}^{Y},...,B_{n-k+1}^{Y}). \end{aligned}$$

Example 4.1

Let Y follow the standard exponential distribution, whose ith cumulant is \({\mathcal {K}}_{i} = (i-1)!\). Then, we have

$$\begin{aligned} B_{n} (0!,1!,2!,\dots , (n-1)!) = \sum _{k=1}^{n}(-1)^{k-1}(k-1)! B_{n,k}(B_{1}^{L},B_{2}^{L},...,B_{n-k+1}^{L}). \end{aligned}$$
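Both sides of the display above equal \(\mathbb{E}Y^n = n!\) for the standard exponential law. The sketch below confirms this for small n, with the Lah-Bell numbers computed as \(B_m^L = \sum_k L(m,k)\); bell_partial and lah are illustrative helpers.

```python
from math import comb, factorial

def bell_partial(n, k, x):
    """B_{n,k} via the recurrence B_{n,k} = sum_i C(n-1, i-1) x_i B_{n-i, k-1}."""
    if k == 0:
        return int(n == 0)
    return sum(comb(n - 1, i - 1) * x[i - 1] * bell_partial(n - i, k - 1, x)
               for i in range(1, n - k + 2))

def lah(n, k):
    return comb(n - 1, k - 1) * factorial(n) // factorial(k) if k >= 1 else int(n == 0)

N = 6
lah_bell = [sum(lah(m, k) for k in range(m + 1)) for m in range(1, N + 1)]   # B_m^L, m = 1..N
cumulants = [factorial(i - 1) for i in range(1, N + 1)]                      # K_i = (i-1)! for Exp(1)
for n in range(1, N + 1):
    lhs = sum(bell_partial(n, k, cumulants) for k in range(n + 1))           # B_n(K_1,...,K_n)
    rhs = sum((-1) ** (k - 1) * factorial(k - 1) * bell_partial(n, k, lah_bell)
              for k in range(1, n + 1))
    assert lhs == rhs == factorial(n)
```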

The well-known interconnection between the unsigned Stirling numbers of the first kind |s(n, k)| and the partial exponential Bell polynomials is given by (see [12, p. 135])

$$\begin{aligned} |s(n,k)| = B_{n,k}(0!,1!,\dots ,(n-k)!). \end{aligned}$$

Hence, using (1.7), another representation of the unsigned Stirling numbers of the first kind in terms of the Lah-Bell numbers is obtained of the following form

$$\begin{aligned} |s(n,k)| = (-1)^{k-1}(k-1)! B_{n,k}(B_{1}^{L},B_{2}^{L},...,B_{n-k+1}^{L}). \end{aligned}$$

4.2 Probabilistic Stirling numbers of the second kind and Appell polynomials

Let \(A(x) = (A_{n}(x))_{n\ge 0}\) be a sequence of polynomials such that \(A(0) = (A_{n}(0))_{n\ge 0} \in \mathbb {H},\) where \(\mathbb {H}\) contains real sequences. Then, A(x) is an Appell sequence and its generating function has the following form

$$\begin{aligned} H(A(x),z) = \sum _{n=0}^{\infty } A_{n}(x) \frac{z^{n}}{n!} = H(A(0),z) e^{xz}. \end{aligned}$$

For more details, one may refer to [4]. The k-fold binomial convolution A(k, x) of an Appell sequence is defined as (see [4, pp. 19–20])

$$\begin{aligned} A_n(k,x) = (A\overbrace{\times {\cdots } \times }^{k\text {-times}}A)(x), \;\;\; k \in {\mathbb {N}}, \;\;\; A_n(0,x) = I_n(x), \end{aligned}$$

which has the probabilistic representation (see [29])

$$\begin{aligned} A_{n}(k,x) = {\mathbb {E}}(x+S_k)^{n}. \end{aligned}$$
(4.3)

In terms of the exponential generating function, (4.3) can be expressed as

$$\begin{aligned} H(A(k,x),z) = \sum _{n=0}^{\infty } A_{n}(k,x) \frac{z^{n}}{n!} = {\mathbb {E}}e^{z(x+S_k)}. \end{aligned}$$
(4.4)

Theorem 4.1

For \(Y \in {\mathcal {H}},\) we have

$$\begin{aligned} {\mathbb {E}}\left[ B_n^Y(x+S_k)\right] = \sum _{m=1}^{n} A_m(k,x) S_Y(n,m), ~~~~ n \ge 1, \end{aligned}$$

where \(A_{m}(k,x)\) is as defined in (4.3).

Proof

Suppose \(h_1(z)\) and \(h_2(z)\) are the generating functions of the sequences \((x_n)_{n \ge 1}\) and \((y_n)_{n \ge 1}\), respectively, such that

$$\begin{aligned} h_1(z) = \sum _{n=1}^{\infty } x_{n} \frac{z^{n}}{n!} ~~ \text { and }~~ h_2(z) = \sum _{n=1}^{\infty } y_{n} \frac{z^{n}}{n!}. \end{aligned}$$
(4.5)

The composition of \(h_1\) and \(h_2\) is given by (see [10, Chapter 5])

$$\begin{aligned} h_2(h_1(z)) = \sum _{n=1}^{\infty } \sum _{m=1}^{n} y_{m} B_{n,m}(x_{1},x_{2},...,x_{n-m+1}) \frac{z^{n}}{n!}. \end{aligned}$$
(4.6)

Consider \(x_{n} = {\mathbb {E}}Y^n\) and \(y_{m} = {\mathbb {E}}(x+S_k)^m\), as defined in (4.3). Then, from (4.5), the functions \(h_1\) and \(h_2\) take the form

$$\begin{aligned} h_1(z) = {\mathbb {E}}e^{zY}-1 ~~ \text { and } ~~ h_2(z) = {\mathbb {E}}e^{z(x+S_k)}-1. \end{aligned}$$

Therefore, with the help of (4.6), we obtain

$$\begin{aligned} {\mathbb {E}}e^{(x+S_k)({\mathbb {E}}e^{zY}-1)} - 1= \sum _{n=1}^{\infty } \sum _{m=1}^{n} A_{m}(k,x) S_{Y}(n,m) \frac{z^n}{n!}. \end{aligned}$$
(4.7)

Using Proposition 2.1, we get the desired result. \(\square\)

Remark 4.1

From (4.7), one may construct several new identities for special choices of the random variable Y. If we choose \(Y = -1/2 + iX\) for a random variable X following the logistic distribution, then (4.3) reduces to the Bernoulli polynomials (see [29] and [4, p. 19]). For a random variable \(Y = iU\) with U following the standard normal distribution, an interconnection with the Hermite polynomials can also be established.