1. Introduction

In recent years, degenerate versions and \(\lambda\)-analogues of many special polynomials and numbers have been investigated by employing various methods, such as generating functions, combinatorial methods, umbral calculus, \(p\)-adic analysis, differential equations, probability, special functions, analytic number theory and operator theory (see [6, 11–18, 21] and the references therein). Here, by means of generating functions, we study probabilistic extensions of several special polynomials, including the Bernoulli and Euler polynomials.

Let \(Y\) be a random variable satisfying the moment condition (see (8)). The aim of this paper is to study, as probabilistic extensions, the probabilistic Bernoulli polynomials associated with \(Y\) and the probabilistic Euler polynomials associated with \(Y\), along with the probabilistic \(r\)-Stirling numbers of the second kind associated with \(Y\), the probabilistic two variable Fubini polynomials associated with \(Y\), and the probabilistic poly-Bernoulli polynomials associated with \(Y\). We derive some properties, explicit expressions, certain identities and recurrence relations for those polynomials and numbers. In addition, as special cases of \(Y\), we consider the gamma random variable with parameters \(\alpha,\beta > 0\), the Poisson random variable with parameter \(\alpha >0\), and the Bernoulli random variable with probability of success \(p\).

The outline of this paper is as follows. In Section 1, we recall the Bernoulli polynomials, the Euler polynomials, the Stirling numbers of the second kind, the \(r\)-Stirling numbers of the second kind, the Fubini polynomials and two variable Fubini polynomials. Assume that \(Y\) is a random variable such that the moment generating function of \(Y\),

$$E[e^{tY}]=\sum_{n=0}^{\infty}\frac{t^{n}}{n!}E[Y^{n}] \quad (|t| <r),$$

exists for some \(r >0\). Let \((Y_{j})_{j\ge 1}\) be a sequence of mutually independent copies of the random variable \(Y\), and let

$$S_{k}=Y_{1}+Y_{2}+\cdots+Y_{k}\quad (k \ge 1),\,\, \text{with} \, S_{0}=0.$$

Then we remind the reader of the gamma random variable with parameters \(\alpha,\,\beta >0\) and the probabilistic Stirling numbers of the second kind. Section 2 contains the main results of this paper. Let \((Y_{j})_{j \ge1},\,\, S_{k}\,\, (k=0,1,\dots)\) be as above. We define the probabilistic Bernoulli polynomials associated with \(Y\), \(B_{n}^{Y}(x)\) (see (12)). We derive explicit expressions for \(B_{n}^{Y}(x)\) in Theorems 2.1, 2.2, and 2.6, and \(B_{n}^{Y}(x)=n!\binom{x+n-2}{n}\) in Theorem 2.8, when \(Y \sim \Gamma(1,1)\) (see (9)). In Theorems 2.3, 2.4, and 2.5, we get probabilistic analogues of the well-known identities

$$\sum_{k=0}^{n}k^{m}=\frac{1}{m+1}\left\{B_{m+1}(n+1)-B_{m+1}\right\}\quad(n,m \ge 0),$$
$$\sum_{l=0}^{n}\binom{n}{l}B_{l}-B_{n}=\delta_{n,1}\quad(n \ge 0),$$

and

$$B_{n}=d^{n-1}\sum_{k=0}^{d-1}B_{n}\left(\frac{k}{d}\right)\quad (n \ge 0,\,d \ge 1),$$

respectively. In Theorem 2.7, we deduce an identity involving \(B_{n}^{Y}=B_{n}^{Y}(0)\), which gives an explicit expression for the Bernoulli numbers

$$B_{n}=\sum_{k=0}^{n}{n \brace k}\frac{k!}{k+1}(-1)^{k},$$

for \(Y=1\). We show that \(B_{n}^{Y}=\frac{1}{p}B_{n}\) if \(Y\) is the Bernoulli random variable with probability of success \(p\) in Theorem 2.9. We define the probabilistic \(r\)-Stirling numbers of the second kind associated with \(Y\) (see (31)), \({n+r \brace k+r}_{r,Y}\), for which an expression in terms of \(E[S_{j+r}^{n}]\) is found in Theorem 2.10. We introduce the probabilistic two variable Fubini polynomials associated with \(Y\), \(F_{n}^{Y}(x|y)\) (see (33)). We obtain an expression in terms of \(F_{k}^{Y}(x)\) for \(F_{n}^{Y}(x|y)\) in Theorem 2.11. In case \(y=r\) is a nonnegative integer, we show that

$$F_{n}^{Y}(x|r)=\sum_{k=0}^{n}k!{n+r \brace k+r}_{r,Y}x^{k}$$

in Theorem 2.12. In Theorem 2.13, we get an identity involving \(B_{n}^{Y}(r)\), which reduces to the identity

$$B_{n}(r)=\sum_{k=0}^{n}\frac{k!}{k+1}(-1)^{k}{n+r \brace k+r}_{r},$$

for \(Y=1\). We define the probabilistic poly-Bernoulli polynomials \(B_{n}^{(k,Y)}(x)\) (see (40)) by making use of the polylogarithmic function. In Theorem 2.14, we obtain an expression for \(B_{n}^{(k,Y)}(x)\) in terms of \(B_{n}^{Y}(x)\). We define the probabilistic Euler polynomials associated with \(Y\), \(\mathcal{E}_{n}^{Y}(x)\) (see (43)). We get an expression for \(\mathcal{E}_{n}^{Y}(x)\) in Theorem 2.15 and one for \(\mathcal{E}_{n}^{Y}=\mathcal{E}_{n}^{Y}(0)\) in Theorem 2.16. In Theorem 2.17, we get an identity which corresponds to the well-known identity

$$\sum_{k=0}^{n}(-1)^k k^{m}=\frac{1}{2}\left\{\mathcal{E}_{m}(n+1)+\mathcal{E}_{m}\right\},$$

for any integer \(m \ge 0\) and any even positive integer \(n\). We show in Theorem 2.18 that \(\mathcal{E}_{n}^{Y}=-\frac{n!}{2^n}\) for \(n \ge 1\), when \(Y \sim \Gamma(1,1)\), and in Theorem 2.19 that

$$\mathcal{E}_{n}^{Y}=\sum_{k=0}^{n}\alpha^{k}{n \brace k} \mathcal{E}_{k},$$

when \(Y\) is the Poisson random variable with parameter \(\alpha >0\). For the rest of this section, we recall the facts that are needed throughout this paper.

It is well known that the Bernoulli polynomials are defined by

$$\frac{t}{e^{t}-1}e^{xt}=\sum_{n=0}^{\infty}B_{n}(x) \frac{t^{n}}{n!}\quad (\mathrm{see}\ [1-28]). $$
(1)

When \(x=0\), \(B_{n}=B_{n}(0)\) are called the Bernoulli numbers. The Euler polynomials are given by

$$\frac{2}{e^{t}+1}e^{xt} =\sum_{n=0}^{\infty}\mathcal{E}_{n}(x)\frac{t^{n}}{n!} \quad (\mathrm{see}\ [5,7,11]). $$
(2)

For \(x=0\), \(\mathcal{E}_{n}=\mathcal{E}_{n}(0)\) are called the Euler numbers.

For \(n\ge 0\), the Stirling numbers of the second kind are defined by

$$x^{n}=\sum_{k=0}^{n}{n \brace k}(x)_{k}\quad (\mathrm{see}\ [7,17,23]), $$
(3)

where \((x)_{0}=1,\ (x)_{n}=x(x-1)\cdots(x-n+1),\ (n\ge 1)\).

For \(r\in\mathbb{Z}\) with \(r\ge 0\), the \(r\)-Stirling numbers of the second kind are given by

$$\frac{1}{k!}\big(e^{t}-1\big)^{k}e^{rt} =\sum_{n=k}^{\infty}{n+r \brace k+r}_{r}\frac{t^{n}}{n!}, \quad (k\ge 0)\quad (\mathrm{see}\ [7,14,16,23]). $$
(4)

If \(r=0,\ {n+r \brace k+r}_{r}={n\brace k},\ (n\ge k\ge 0)\).

The Fubini polynomials are defined by

$$F_{n}(x)=\sum_{k=0}^{n}{n \brace k}k!x^{k},\quad (n\ge 0)\quad (\mathrm{see}\ [15,18]). $$
(5)

Two variable Fubini polynomials are given by

$$\frac{1}{1-x(e^{t}-1)}e^{yt}=\sum_{n=0}^{\infty}F_{n}(x|y)\frac{t^{n}}{n!}\quad (\mathrm{see}\ [15,18,26]). $$
(6)

For \(y=r\in\mathbb{Z}\) with \(r\ge 0\), we have

$$F_{n}(x|r)=\sum_{k=0}^{n}{n+r \brace k+r}_{r}k!x^{k}\quad (n\ge 0). $$
(7)

Assume that \(Y\) is a random variable such that the moment generating function of \(Y\),

$$E\Big[e^{tY}\Big]=\sum_{n=0}^{\infty}\frac{t^{n}}{n!}E\big[Y^{n}\big]\quad (|t|<r),\quad \text{exists for some $r>0$}. $$
(8)

Let \((Y_{j})_{j\ge 1}\) be a sequence of mutually independent copies of the random variable \(Y\), and let \(S_{k}=Y_{1}+Y_{2}+\cdots+Y_{k},\ (k\ge 1)\) and \(S_{0}=0\).

A continuous random variable \(Y\) whose density function is defined by

$$f(y)=\left\{\begin{array}{ccc} \beta e^{-\beta y}\frac{(\beta y)^{\alpha-1}}{\Gamma(\alpha)}, & \text{for $y\ge 0$,}\\ 0, & \text{for $y<0$}, \end{array}\right. $$
(9)

for some \(\alpha,\,\beta>0\) is said to be the gamma random variable with parameters \(\alpha,\beta\), which is denoted by \(Y\sim\Gamma(\alpha,\beta)\), (see [20, 24, 26–28]). The probabilistic Stirling numbers of the second kind associated with \(Y\) are given by

$$\frac{1}{k!}\Big(E[e^{tY}]-1\Big)^{k}=\sum_{n=k}^{\infty}{n\brace k}_{Y}\frac{t^{n}}{n!},\quad (k\ge 0)\quad (\mathrm{see}\ [3,13]). $$
(10)

When \(Y=1\), we have \({n\brace k}_{Y}={n\brace k}\).
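As a sanity check on definition (10), outside the formal development, the numbers \({n\brace k}_{Y}\) can be read off a truncated series expansion of the left-hand side. The following sympy sketch (the helper name `prob_stirling2` is ours) recovers the classical Stirling numbers of the second kind for \(Y=1\), where \(E[e^{tY}]=e^{t}\):

```python
from sympy import symbols, exp, factorial
from sympy.functions.combinatorial.numbers import stirling

t = symbols('t')

def prob_stirling2(mgf, n, k):
    """Read {n brace k}_Y off the series of (E[e^{tY}] - 1)^k / k!, as in (10)."""
    f = (mgf - 1) ** k / factorial(k)
    return f.series(t, 0, n + 1).removeO().coeff(t, n) * factorial(n)

# For Y = 1 the moment generating function is e^t, and (10) reduces to (3)/(4).
mgf1 = exp(t)
vals = [(n, k, prob_stirling2(mgf1, n, k)) for n in range(6) for k in range(n + 1)]
```

The same helper accepts any moment generating function, e.g. \(e^{\alpha(e^{t}-1)}\) for the Poisson case.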

2. Probabilistic Bernoulli and Euler polynomials associated with random variables

Let \((Y_{j})_{j\ge 1}\) be a sequence of mutually independent copies of the random variable \(Y\), and let

$$S_{0}=0,\quad S_{k}=Y_{1}+Y_{2}+\cdots+Y_{k}\quad (k\ge 1). $$
(11)

We consider the probabilistic Bernoulli polynomials associated with \(Y\) which are given by

$$\frac{t}{E[e^{Yt}]-1}\Big(E[e^{Yt}]\Big)^{x} =\sum_{n=0}^{\infty}B_{n}^{Y}(x)\frac{t^{n}}{n!}. $$
(12)

When \(Y=1\), we have \(B_{n}^{Y}(x)=B_{n}(x),\ (n\ge 0)\). For \(x=0,\ B_{n}^{Y}=B_{n}^{Y}(0)\) are called the probabilistic Bernoulli numbers associated with \(Y\).

From (12), we note that

$$\begin{aligned} \, \sum_{n=0}^{\infty}B_{n}^{Y}(x)\frac{t^{n}}{n!} &=\frac{t}{E[e^{Yt}]-1}\Big(E[e^{Yt}]-1+1\Big)^{x} =\frac{t}{E[e^{Yt}]-1}+t\sum_{k=1}^{\infty}\binom{x}{k}\Big(E[e^{Yt}]-1\Big)^{k-1}\\ &=\frac{t}{E[e^{Yt}]-1}+t\sum_{k=0}^{\infty} \frac{(x)_{k+1}}{k+1}\frac{1}{k!}\Big(E[e^{Yt}]-1\Big)^{k} =\sum_{n=0}^{\infty}B_{n}^{Y}\frac{t^{n}}{n!} +t\sum_{k=0}^{\infty}\frac{(x)_{k+1}}{k+1} \sum_{n=k}^{\infty}{n \brace k}_{Y}\frac{t^{n}}{n!} \nonumber \\ &=\sum_{n=0}^{\infty}B_{n}^{Y}\frac{t^{n}}{n!} +t\sum_{n=0}^{\infty}\sum_{k=0}^{n}\frac{(x)_{k+1}}{k+1}{n \brace k}_{Y}\frac{t^{n}}{n!} =\sum_{n=0}^{\infty}B_{n}^{Y}\frac{t^{n}}{n!} +\sum_{n=1}^{\infty}n\sum_{k=0}^{n-1} \frac{(x)_{k+1}}{k+1}{n-1 \brace k}_{Y}\frac{t^{n}}{n!}. \nonumber \end{aligned}$$
(13)

Thus, by (13), we get

$$\sum_{n=1}^{\infty}\Big(B_{n}^{Y}(x)-B_{n}^{Y}\Big)\frac{t^{n}}{n!} =\sum_{n=1}^{\infty}n\sum_{k=0}^{n-1}\frac{(x)_{k+1}}{k+1}{n-1 \brace k}_{Y}\frac{t^{n}}{n!}. $$
(14)

Therefore, by comparing the coefficients on both sides of (14), we obtain the following theorem.

Theorem 2.1.

Let \(n\) be a positive integer. Then we have

$$B_{n}^{Y}(x)-B_{n}^{Y}=n\sum_{k=0}^{n-1}\frac{(x)_{k+1}}{k+1}{n-1 \brace k}_{Y}.$$
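For \(Y=1\), Theorem 2.1 reduces to a classical identity with \(B_{n}^{Y}(x)=B_{n}(x)\) and \({n\brace k}_{Y}={n\brace k}\); the following sympy sketch (with a hand-rolled falling factorial, written by us) confirms this reduction symbolically for small \(n\):

```python
from sympy import symbols, bernoulli, expand, S
from sympy.functions.combinatorial.numbers import stirling

x = symbols('x')

def falling(z, k):
    # falling factorial (z)_k = z(z-1)...(z-k+1)
    out = S.One
    for i in range(k):
        out *= (z - i)
    return out

def rhs(n):
    # n * sum_{k=0}^{n-1} (x)_{k+1}/(k+1) * {n-1 brace k}
    return n * sum(falling(x, k + 1) / (k + 1) * stirling(n - 1, k)
                   for k in range(n))

# B_n(x) - B_n(0) minus the right-hand side must vanish identically in x.
diffs = [expand(bernoulli(n, x) - bernoulli(n, 0) - rhs(n)) for n in range(1, 7)]
```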

By binomial expansion, we get

$$\begin{aligned} \, \Big(E[e^{Yt}]\Big)^{x}&=\Big(E[e^{Yt}]-1+1\Big)^{x} =\sum_{k=0}^{\infty}(x)_{k}\frac{1}{k!}\Big(E[e^{Yt}]-1\Big)^{k} \\ &=\sum_{k=0}^{\infty}(x)_{k}\sum_{m=k}^{\infty}{m \brace k}_{Y}\frac{t^{m}}{m!} =\sum_{m=0}^{\infty}\sum_{k=0}^{m}{m \brace k}_{Y}(x)_{k}\frac{t^{m}}{m!}.\nonumber \end{aligned}$$
(15)

Thus, by (12) and (15), we get

$$\begin{aligned} \, \sum_{n=0}^{\infty}B_{n}^{Y}(x)\frac{t^{n}}{n!} &=\frac{t}{E[e^{Yt}]-1}\Big(E[e^{Yt}]\Big)^{x} \\ &=\sum_{l=0}^{\infty}B_{l}^{Y}\frac{t^{l}}{l!} \sum_{m=0}^{\infty}\bigg(\sum_{k=0}^{m}{m\brace k}_{Y}(x)_{k}\bigg) \frac{t^{m}}{m!}\nonumber \\ &=\sum_{n=0}^{\infty}\sum_{m=0}^{n} \binom{n}{m}B_{n-m}^{Y}\sum_{k=0}^{m}{m \brace k}_{Y}(x)_{k}\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(16)

Therefore, by comparing the coefficients on both sides of (16), we obtain the following theorem.

Theorem 2.2.

For \(n\ge 0,\) we have

$$B_{n}^{Y}(x)= \sum_{m=0}^{n}\sum_{k=0}^{m}\binom{n}{m}B_{n-m}^{Y}{m \brace k}_{Y}(x)_{k}.$$

From (12), we note that

$$\begin{aligned} \, \sum_{k=0}^{n}\Big(E[e^{Yt}]\Big)^{k} &=\frac{1}{t}\frac{t}{E[e^{Yt}]-1}\Big(\Big(E[e^{Yt}]\Big)^{n+1}-1\Big) \\ &=\frac{1}{t}\bigg(\sum_{m=0}^{\infty}B_{m}^{Y}(n+1)\frac{t^{m}}{m!} -\sum_{m=0}^{\infty}B_{m}^{Y}\frac{t^{m}}{m!}\bigg)\nonumber \\ &=\sum_{m=0}^{\infty}\bigg(\frac{B_{m+1}^{Y}(n+1) -B_{m+1}^{Y}}{m+1}\bigg)\frac{t^{m}}{m!}. \nonumber \end{aligned}$$
(17)

On the other hand, by (11), we get

$$\sum_{k=0}^{n}\Big(E[e^{Yt}]\Big)^{k} =\sum_{k=0}^{n}E\Big[e^{(Y_{1}+Y_{2}+\cdots+Y_{k})t}\Big] =\sum_{k=0}^{n}E\Big[e^{S_{k}t}\Big]=\sum_{m=0}^{\infty} \bigg(\sum_{k=0}^{n}E[S_{k}^{m}]\bigg)\frac{t^{m}}{m!}.$$
(18)

Therefore, by (17) and (18), we obtain the following theorem.

Theorem 2.3.

For \(n,m\ge 0\), we have

$$\sum_{k=0}^{n}E[S_{k}^{m}]=\frac{B_{m+1}^{Y}(n+1)-B_{m+1}^{Y}}{m+1}.$$
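For \(Y=1\) we have \(S_{k}=k\), so Theorem 2.3 reduces to the classical power-sum formula quoted in the introduction. A quick sympy check of that classical case, using the Bernoulli polynomials \(B_{j}(x)\) with \(B_{j}=B_{j}(0)\):

```python
from sympy import bernoulli

def power_sum(n, m):
    # left-hand side: sum_{k=0}^{n} k^m  (with the convention 0^0 = 1)
    return sum(k**m for k in range(n + 1))

def via_bernoulli(n, m):
    # right-hand side: (B_{m+1}(n+1) - B_{m+1}) / (m+1), with B_j = B_j(0)
    return (bernoulli(m + 1, n + 1) - bernoulli(m + 1, 0)) / (m + 1)

checks = [(n, m, power_sum(n, m) == via_bernoulli(n, m))
          for n in range(6) for m in range(6)]
```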

By (12), we see

$$\begin{aligned} \, t&=\sum_{l=0}^{\infty}B_{l}^{Y}\frac{t^{l}}{l!}\Big(E[e^{Yt}]-1\Big) =\sum_{l=0}^{\infty}B_{l}^{Y}\frac{t^{l}}{l!}\bigg(\sum_{m=0}^{\infty} \frac{t^{m}}{m!}E[Y^{m}]-1\bigg) \\ &=\sum_{n=0}^{\infty}\bigg(\sum_{l=0}^{n}\binom{n}{l}B_{l}^{Y}E[Y^{n-l}] -B_{n}^{Y}\bigg)\frac{t^{n}}{n!}. \nonumber \end{aligned}$$
(19)

By comparing the coefficients on both sides of (19), we have

$$\sum_{l=0}^{n}\binom{n}{l}B_{l}^{Y}E[Y^{n-l}]-B_{n}^{Y}=\left\{\begin{array}{ccc} 1, & \text{for $n=1$}\\ 0, & \text{otherwise} \end{array}\right.,\quad E[Y]B_{0}^{Y}=1. $$
(20)

Therefore, by (20), we obtain the following theorem.

Theorem 2.4.

Let \(n\) be a nonnegative integer. Then we have

$$E[Y]B_{0}^{Y}=1,\quad \sum_{l=0}^{n} \binom{n}{l}B_{l}^{Y}E\big[Y^{n-l}\big]-B_{n}^{Y}=\delta_{n,1},$$

where \(\delta_{n,k}\) is the Kronecker symbol.

Now, we observe that

$$\begin{aligned} \, \frac{t}{E[e^{Yt}]-1}&=\frac{t}{\big(E[e^{Yt}]\big)^{d}-1} \sum_{k=0}^{d-1}\Big(E[e^{Yt}]\Big)^{k} =\frac{t}{E[e^{(Y_{1}+Y_{2}+\cdots+Y_{d})t}]-1} \sum_{k=0}^{d-1}\Big((E[e^{Yt}])^{d}\Big)^{\frac{k}{d}}\\ &=\frac{t}{E[e^{S_{d}t}]-1}\sum_{k=0}^{d-1}\Big(E[e^{S_{d}t}]\Big)^{\frac{k}{d}} =\sum_{n=0}^{\infty}\sum_{k=0}^{d-1}B_{n}^{S_{d}} \bigg(\frac{k}{d}\bigg)\frac{t^{n}}{n!},\nonumber \end{aligned}$$
(21)

where \(d\) is a positive integer.

By comparing the coefficients on both sides of (21), we obtain the following theorem.

Theorem 2.5.

Let \(d\) be a positive integer. For \(n\ge 0,\) we have

$$B_{n}^{Y}=\sum_{k=0}^{d-1}B_{n}^{S_{d}}\bigg(\frac{k}{d}\bigg).$$

In particular, for \(Y=1\), we have

$$B_{n}=d^{n-1}\sum_{k=0}^{d-1}B_{n}\bigg(\frac{k}{d}\bigg).$$
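The \(Y=1\) case of Theorem 2.5 is the classical multiplication theorem for Bernoulli numbers, and it can be verified directly with sympy's Bernoulli polynomials (a numerical sanity check, not part of the proof):

```python
from sympy import bernoulli, Rational, S

def multiplication_side(n, d):
    # d^{n-1} * sum_{k=0}^{d-1} B_n(k/d), kept exact via S(d) and Rational
    return S(d)**(n - 1) * sum(bernoulli(n, Rational(k, d)) for k in range(d))

# Compare against B_n = B_n(0) for several n and d.
checks = [(n, d, multiplication_side(n, d) == bernoulli(n, 0))
          for n in range(7) for d in range(1, 5)]
```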

Let \(Y\) be the Poisson random variable with parameter \(\alpha>0\). Then we have

$$\begin{aligned} \, \sum_{n=0}^{\infty}B_{n}^{Y}(x)\frac{t^{n}}{n!} &=\frac{t}{E[e^{Yt}]-1}(E[e^{Yt}])^{x} =\frac{t}{e^{\alpha(e^{t}-1)}-1}e^{\alpha x(e^{t}-1)} \\ &=\frac{t}{\alpha(e^{t}-1)} \frac{\alpha(e^{t}-1)}{e^{\alpha(e^{t}-1)}-1}e^{\alpha x(e^{t}-1)} \nonumber \\ &=\frac{1}{\alpha}\sum_{l=0}^{\infty}B_{l} \frac{t^{l}}{l!}\sum_{m=0}^{\infty}\alpha^{m}B_{m}(x) \frac{1}{m!}\big(e^{t}-1\big)^{m}\nonumber \\ &=\frac{1}{\alpha}\sum_{l=0}^{\infty}B_{l}\frac{t^{l}}{l!} \sum_{m=0}^{\infty}\alpha^{m}B_{m}(x) \sum_{k=m}^{\infty}{k \brace m}\frac{t^{k}}{k!}\nonumber\\ &=\frac{1}{\alpha}\sum_{l=0}^{\infty}B_{l}\frac{t^{l}}{l!} \sum_{k=0}^{\infty}\sum_{m=0}^{k}\alpha^{m}B_{m}(x){k \brace m} \frac{t^{k}}{k!} \nonumber\\ &=\sum_{n=0}^{\infty}\sum_{k=0}^{n} \sum_{m=0}^{k}\alpha^{m-1}\binom{n}{k}{k \brace m}B_{n-k}B_{m}(x) \frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(22)

Therefore, by (22), we obtain the following theorem.

Theorem 2.6.

Let \(Y\) be the Poisson random variable with parameter \(\alpha>0\). For \(n\ge 0,\) we have

$$B_{n}^{Y}(x)=\sum_{k=0}^{n} \sum_{m=0}^{k}\alpha^{m-1}\binom{n}{k}{k \brace m}B_{n-k}B_{m}(x).$$

We need the following lemma in showing Theorems 2.7 and 2.13. We obtain it from the following observation:

$$\begin{aligned} \, t=\log(1+e^{t}-1)&=\sum_{k=1}^{\infty}\frac{(-1)^{k-1}}{k}(e^t-1)^{k} =\sum_{k=1}^{\infty}(-1)^{k-1}(k-1)!\frac{1}{k!}(e^t-1)^{k} \\ &=\sum_{k=1}^{\infty}(-1)^{k-1}(k-1)!\sum_{n=k}^{\infty}{n \brace k}\frac{t^n}{n!} =\sum_{n=1}^{\infty}\sum_{k=1}^{n}(-1)^{k-1}(k-1)!{n \brace k} \frac{t^n}{n!}. \end{aligned}$$

Lemma 1.

For \(n\in\mathbb{N},\) we have

$$\sum_{k=1}^{n}(-1)^{k-1}(k-1)!{n\brace k}=\delta_{n,1}.$$

Equivalently, for \(n\in\mathbb{N}\cup\{0\},\) we have

$$\frac{1}{n+1}\sum_{k=1}^{n+1}(-1)^{k-1}(k-1)!{n+1 \brace k}=\delta_{n,0}.$$
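Lemma 1 is elementary to verify in sympy; the following sketch evaluates the alternating sum for the first few \(n\) and observes that only \(n=1\) is nonzero:

```python
from sympy import factorial
from sympy.functions.combinatorial.numbers import stirling

def alt_sum(n):
    # sum_{k=1}^{n} (-1)^{k-1} (k-1)! * {n brace k}
    return sum((-1)**(k - 1) * factorial(k - 1) * stirling(n, k)
               for k in range(1, n + 1))

values = [alt_sum(n) for n in range(1, 9)]
```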

The probabilistic Fubini polynomials associated with \(Y\) are given by

$$\sum_{n=0}^{\infty}F_{n}^{Y}(y)\frac{t^{n}}{n!}=\frac{1}{1-y(E[e^{Yt}]-1)}. $$
(23)

Thus, by (10), we get

$$F_{n}^{Y}(y)=\sum_{k=0}^{n}{n\brace k}_{Y}k!y^{k}\quad (n\ge 0). $$
(24)

From (24), we get

$$\int_{0}^{1}F_{n}^{Y}(-y)dy=\sum_{k=0}^{n}{n\brace k}_{Y}\frac{k!}{k+1}(-1)^{k}. $$
(25)

From (23), we have

$$\begin{aligned} \, \sum_{n=0}^{\infty}\int_{0}^{1}F_{n}^{Y}(-y)dy\frac{t^{n}}{n!} &=\int_{0}^{1}\frac{1}{1+y(E[e^{Yt}]-1)}dy =\frac{1}{E[e^{Yt}]-1}\log\Big(1+\big(E[e^{Yt}]-1)\Big) \\ &=\frac{1}{E[e^{Yt}]-1}\sum_{k=1}^{\infty}\frac{(-1)^{k-1}}{k}k! \frac{1}{k!}\Big(E[e^{Yt}]-1\Big)^{k}\nonumber\\ &=\frac{t}{E[e^{Yt}]-1}\frac{1}{t}\sum_{k=1}^{\infty}\frac{(-1)^{k-1}}{k}k! \sum_{m=k}^{\infty}{m\brace k}_{Y}\frac{t^{m}}{m!}\nonumber\\ &=\frac{t}{E[e^{Yt}]-1}\frac{1}{t}\sum_{m=1}^{\infty} \sum_{k=1}^{m}(-1)^{k-1}(k-1)!{m \brace k}_{Y}\frac{t^{m}}{m!}\nonumber \\ &=\frac{t}{E[e^{Yt}]-1}\sum_{m=0}^{\infty}\frac{1}{m+1} \sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y}\frac{t^{m}}{m!}\nonumber \\ &=\sum_{l=0}^{\infty}B_{l}^{Y}\frac{t^{l}}{l!} \sum_{m=0}^{\infty}\frac{1}{m+1} \sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y}\frac{t^{m}}{m!}\nonumber \\ &=\sum_{n=0}^{\infty}\sum_{m=0}^{n}B_{n-m}^{Y}\binom{n}{m}\frac{1}{m+1} \sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y}\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(26)

Therefore, by (25) and (26) and using Lemma 1, we obtain the following theorem.

Theorem 2.7.

For \(n\ge 0,\) we have

$$\sum_{k=0}^{n}{n\brace k}_{Y}\frac{k!}{k+1}(-1)^{k} =\sum_{m=0}^{n}\binom{n}{m}B_{n-m}^{Y}\frac{1}{m+1} \sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y}.$$

In particular, for \(Y=1,\) we get

$$\sum_{k=0}^{n}{n \brace k}\frac{k!}{k+1}(-1)^{k}=B_{n}.$$
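The \(Y=1\) special case of Theorem 2.7 is the classical explicit formula for the Bernoulli numbers, which is easy to confirm in sympy (we use \(B_{n}(0)\) so that \(B_{1}=-\frac{1}{2}\), matching the generating function (1)):

```python
from sympy import bernoulli, factorial, Rational
from sympy.functions.combinatorial.numbers import stirling

def b_explicit(n):
    # sum_{k=0}^{n} {n brace k} * k!/(k+1) * (-1)^k
    return sum(stirling(n, k) * Rational(factorial(k), k + 1) * (-1)**k
               for k in range(n + 1))

checks = [b_explicit(n) == bernoulli(n, 0) for n in range(9)]
```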

Let \(Y\sim\Gamma(1,1)\). Then we have

$$E[e^{Yt}]=\int_{0}^{\infty}e^{-y}e^{yt}dy=\frac{1}{1-t} \quad (t <1). $$
(27)

Thus, by (12) and (27), we get

$$ \sum_{n=0}^{\infty}B_{n}^{Y}(x)\frac{t^{n}}{n!} =\frac{t}{E[e^{Yt}]-1}\Big(E[e^{Yt}]\Big)^{x}=\bigg(\frac{1}{1-t}\bigg)^{x-1} =\sum_{n=0}^{\infty}\binom{x+n-2}{n}t^{n}.$$
(28)

Therefore, by (28), we obtain the following theorem.

Theorem 2.8.

Let \(Y\sim\Gamma(1,1)\). For \(n\ge 0,\) we have

$$B_{n}^{Y}(x)=n!\binom{x+n-2}{n}.$$
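As a numerical sanity check on (28) and Theorem 2.8, one can expand \((1-t)^{1-x}\) for a concrete non-integer value of \(x\) and compare the coefficients with \(\binom{x+n-2}{n}=B_{n}^{Y}(x)/n!\); the test value \(x=5/2\) below is an arbitrary choice of ours:

```python
from sympy import symbols, binomial, Rational

t = symbols('t')
N = 8
x_val = Rational(5, 2)   # arbitrary non-integer test value for x

# For Y ~ Gamma(1,1): t/(E[e^{tY}]-1) * (E[e^{tY}])^x = (1-t)^(1-x), as in (28)
s = ((1 - t) ** (1 - x_val)).series(t, 0, N).removeO()
lhs = [s.coeff(t, n) for n in range(N)]             # coefficients B_n^Y(x)/n!
rhs = [binomial(x_val + n - 2, n) for n in range(N)]
```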

Let \(Y\) be the Bernoulli random variable with the probability of success \(p\). Then we have

$$E[e^{Yt}]=p(e^{t}-1)+1. $$
(29)

From (12) and (29), we have

$$\sum_{n=0}^{\infty}B_{n}^{Y}\frac{t^{n}}{n!} =\frac{t}{E[e^{Yt}]-1}=\frac{t}{p(e^{t}-1)} =\sum_{n=0}^{\infty}\frac{1}{p}B_{n}\frac{t^{n}}{n!}. $$
(30)

Therefore, by (30), we obtain the following theorem.

Theorem 2.9.

Let \(Y\) be the Bernoulli random variable with probability of success \(p\). For \(n\ge 0\), we have

$$B_{n}^{Y}=\frac{1}{p}B_{n}.$$
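Theorem 2.9 can likewise be checked by a series expansion of (30) for a concrete success probability; \(p=1/3\) below is an arbitrary choice of ours, and any \(0<p\le 1\) works:

```python
from sympy import symbols, exp, factorial, bernoulli, Rational

t = symbols('t')
p = Rational(1, 3)           # arbitrary success probability 0 < p <= 1
N = 8

# MGF of a Bernoulli(p) random variable: E[e^{tY}] = p(e^t - 1) + 1, as in (29)
f = t / (p * (exp(t) - 1))   # generating function t / (E[e^{tY}] - 1)
s = f.series(t, 0, N).removeO()
probabilistic = [s.coeff(t, n) * factorial(n) for n in range(N)]   # B_n^Y
classical = [bernoulli(n, 0) / p for n in range(N)]                # B_n / p
```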

Now, we define the probabilistic \(r\)-Stirling numbers of the second kind associated with \(Y\) by

$$\frac{1}{k!}\Big(E[e^{Yt}]-1\Big)^{k}\Big(E[e^{Yt}]\Big)^{r} =\sum_{n=k}^{\infty}{n+r\brace k+r}_{r,Y}\frac{t^{n}}{n!}, $$
(31)

where \(r\) is a nonnegative integer.

When \(Y=1\), we have \({n+r \brace k+r}_{r,Y}={n+r \brace k+r}_{r}\). From (31), we note that

$$\begin{aligned} \, \sum_{n=k}^{\infty}{n+r \brace k+r}_{r,Y}\frac{t^{n}}{n!} &=\frac{1}{k!}\sum_{j=0}^{k}\binom{k}{j}(-1)^{k-j} \Big(E[e^{Yt}]\Big)^{j+r} \\ &=\frac{1}{k!}\sum_{j=0}^{k}\binom{k}{j}(-1)^{k-j}E\Big[e^{(Y_{1}+Y_{2} +\cdots+Y_{j+r})t}\Big]\nonumber\\ &=\sum_{n=0}^{\infty}\frac{1}{k!} \sum_{j=0}^{k}\binom{k}{j}(-1)^{k-j}E\Big[S_{j+r}^{n}\Big]\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(32)

Therefore, by (32), we obtain the following theorem.

Theorem 2.10.

For \(n\ge k\ge 0,\) we have

$${n+r \brace k+r}_{r,Y}=\frac{1}{k!} \sum_{j=0}^{k}\binom{k}{j}(-1)^{k-j}E\big[S_{j+r}^{n}\big].$$

Now, we consider the probabilistic two variable Fubini polynomials associated with \(Y\) defined by

$$\frac{1}{1-x\big(E[e^{Yt}]-1\big)}\Big(E[e^{Yt}]\Big)^{y} =\sum_{n=0}^{\infty}F_{n}^{Y}(x|y)\frac{t^{n}}{n!}. $$
(33)

When \(Y=1\), we have \(F_{n}^{Y}(x|y)=F_{n}(x|y)\).

From (33), we note that

$$\begin{aligned} \, \frac{1}{1-x\big(E[e^{Yt}]-1\big)}\Big(E[e^{Yt}]\Big)^{y} &=\frac{1}{1-x\big(E[e^{Yt}]-1\big)}\Big(E[e^{Yt}]-1+1\Big)^{y} \\ &=\sum_{j=0}^{\infty}F_{j}^{Y}(x)\frac{t^{j}}{j!} \sum_{k=0}^{\infty}\binom{y}{k}k!\frac{1}{k!}\Big(E[e^{Yt}]-1\Big)^{k}\nonumber \\ &=\sum_{j=0}^{\infty}F_{j}^{Y}(x)\frac{t^{j}}{j!} \sum_{m=0}^{\infty}\sum_{k=0}^{m} \binom{y}{k}k!{m \brace k}_{Y}\frac{t^{m}}{m!}\nonumber\\ &=\sum_{n=0}^{\infty}\sum_{m=0}^{n}\sum_{k=0}^{m}k! \binom{y}{k}{m \brace k}_{Y}\binom{n}{m}F_{n-m}^{Y}(x)\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(34)

Therefore, by comparing the coefficients on both sides of (34), we obtain the following theorem.

Theorem 2.11.

For \(n\ge 0,\) we have

$$F_{n}^{Y}(x|y)=\sum_{m=0}^{n}\sum_{k=0}^{m}k! \binom{y}{k}{m \brace k}_{Y}\binom{n}{m}F_{n-m}^{Y}(x).$$

For \(r\in\mathbb{Z}\) with \(r\ge 0\), from (31) we have

$$\begin{aligned} \, \sum_{n=0}^{\infty}F_{n}^{Y}(y|r)\frac{t^{n}}{n!} &=\frac{1}{1-y(E[e^{Yt}]-1)}\Big(E[e^{Yt}]\Big)^{r} =\sum_{k=0}^{\infty}y^{k}k!\frac{1}{k!}\Big(E[e^{Yt}] -1\Big)^{k}\Big(E[e^{Yt}]\Big)^{r} \\ &=\sum_{k=0}^{\infty}y^{k}k! \sum_{n=k}^{\infty}{n+r \brace k+r}_{r,Y}\frac{t^{n}}{n!} =\sum_{n=0}^{\infty} \sum_{k=0}^{n}k!y^{k}{n+r \brace k+r}_{r,Y}\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(35)

Therefore, by (35), we obtain the following theorem.

Theorem 2.12.

Let \(r\) be a nonnegative integer. For \(n\ge 0,\) we have

$$F_{n}^{Y}(x|r)=\sum_{k=0}^{n}k!{n+r \brace k+r}_{r,Y}x^{k}. $$
(36)

From (36), we have

$$\int_{0}^{1}F_{n}^{Y}(-y|r)dy =\sum_{k=0}^{n}\frac{k!}{k+1}(-1)^{k}{n+r \brace k+r}_{r,Y}. $$
(37)

By (37), we get

$$\begin{aligned} \, \sum_{n=0}^{\infty} \int_{0}^{1}F_{n}^{Y}(-y|r)dy\frac{t^{n}}{n!} &=\int_{0}^{1}\frac{1}{1+y(E[e^{Yt}]-1)}\Big(E[e^{Yt}]\Big)^{r}dy \\ &=\frac{t}{E[e^{Yt}]-1}\Big(E[e^{Yt}]\Big)^r \frac{1}{t}\log\Big(1+\big(E[e^{Yt}]-1\big)\Big)\nonumber \\ &=\sum_{l=0}^{\infty}B_{l}^{Y}(r)\frac{t^{l}}{l!} \frac{1}{t}\sum_{k=1}^{\infty}\frac{(-1)^{k-1}}{k}k! \frac{1}{k!}\Big(E[e^{Yt}]-1\Big)^{k} \nonumber \\ &=\sum_{l=0}^{\infty}B_{l}^{Y}(r)\frac{t^{l}}{l!} \frac{1}{t}\sum_{k=1}^{\infty}(-1)^{k-1}(k-1)! \sum_{m=k}^{\infty}{m \brace k}_{Y}\frac{t^{m}}{m!}\nonumber \\ &=\sum_{l=0}^{\infty}B_{l}^{Y}(r)\frac{t^{l}}{l!}\frac{1}{t} \sum_{m=1}^{\infty} \sum_{k=1}^{m}(-1)^{k-1}(k-1)!{m\brace k}_{Y}\frac{t^{m}}{m!}\nonumber \\ &=\sum_{l=0}^{\infty}B_{l}^{Y}(r)\frac{t^{l}}{l!}\sum_{m=0}^{\infty} \frac{1}{m+1} \sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y}\frac{t^{m}}{m!}\nonumber \\ &=\sum_{n=0}^{\infty} \sum_{m=0}^{n}\binom{n}{m}B_{n-m}^{Y}(r) \frac{1}{m+1}\sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y} \frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(38)

Therefore, by (37) and (38) and using Lemma 1, we obtain the following theorem.

Theorem 2.13.

For \(n,r\in\mathbb{Z}\) with \(n,r\ge 0,\) we have

$$\sum_{k=0}^{n}\frac{k!}{k+1}(-1)^{k}{n+r \brace k+r}_{r,Y} = \sum_{m=0}^{n}\binom{n}{m}B_{n-m}^{Y}(r)\frac{1}{m+1} \sum_{k=1}^{m+1}(-1)^{k-1}(k-1)!{m+1 \brace k}_{Y}.$$

In particular, for \(Y=1,\) we get

$$\sum_{k=0}^{n}\frac{k!}{k+1}(-1)^{k}{n+r \brace k+r}_{r}=B_{n}(r).$$

For \(k\in\mathbb{Z}\), the polylogarithmic function is given by

$$\mathrm{Li}_{k}(x)=\sum_{n=1}^{\infty}\frac{x^{n}}{n^{k}}\quad (|x|<1). $$
(39)

Note that \(\mathrm{Li}_{1}(x)=-\log(1-x)\).

Now, we define the probabilistic poly-Bernoulli polynomials associated with \(Y\) by

$$\frac{\mathrm{Li}_{k}(1-e^{-t})}{1-E[e^{-Yt}]}\Big(E[e^{-Yt}]\Big)^{x} =\sum_{n=0}^{\infty}B_{n}^{(k,Y)}(x)\frac{t^{n}}{n!}. $$
(40)

Note that

$$B_{n}^{(1,Y)}(x)=(-1)^{n}B_{n}^{Y}(x)\quad (n\ge 0). $$
(41)

From (40), we note that

$$\begin{aligned} \, \sum_{n=0}^{\infty}B_{n}^{(k,Y)}(x)\frac{t^{n}}{n!} &=\frac{\mathrm{Li}_{k}(1-e^{-t})}{1-E[e^{-Yt}]} \Big(E[e^{-Yt}]\Big)^{x} =\frac{t}{1-E[e^{-Yt}]}\Big(E[e^{-Yt}]\Big)^{x} \frac{1}{t}\mathrm{Li}_{k}\big(1-e^{-t}\big) \\ &=\sum_{m=0}^{\infty}B_{m}^{Y}(x)\frac{(-1)^{m}}{m!}t^{m} \frac{1}{t}\sum_{l=1}^{\infty}\frac{(-1)^{l}}{l^{k}}\Big(e^{-t}-1\Big)^{l} \nonumber\\ &=\sum_{m=0}^{\infty}B_{m}^{Y}(x)\frac{(-1)^{m}}{m!}t^{m} \frac{1}{t}\sum_{l=1}^{\infty}\frac{(-1)^{l}}{l^{k}}l! \sum_{j=l}^{\infty}{j\brace l}(-1)^{j}\frac{t^{j}}{j!}\nonumber\\ &=\sum_{m=0}^{\infty}B_{m}^{Y}(x)\frac{(-1)^{m}}{m!}t^{m} \sum_{j=0}^{\infty}\frac{1}{j+1} \sum_{l=1}^{j+1}\frac{l!}{l^{k}}{j+1 \brace l}(-1)^{j+1-l}\frac{t^{j}}{j!}\nonumber \\ &=\sum_{n=0}^{\infty}(-1)^{n}\sum_{j=0}^{n}\binom{n}{j}B_{n-j}^{Y}(x) \frac{1}{j+1}\sum_{l=1}^{j+1} \frac{l!}{l^{k}}{j+1 \brace l}(-1)^{l-1}\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(42)

Therefore, by (42), we obtain the following theorem.

Theorem 2.14.

For \(n\ge 0,\) we have

$$B_{n}^{(k,Y)}(x)= (-1)^{n} \sum_{j=0}^{n}\binom{n}{j}B_{n-j}^{Y}(x)\frac{1}{j+1} \sum_{l=1}^{j+1}\frac{l!}{l^{k}}{j+1 \brace l}(-1)^{l-1}.$$

Now, we define the probabilistic Euler polynomials associated with \(Y\) by

$$\frac{2}{E[e^{tY}]+1}\Big(E[e^{tY}]\Big)^{x} =\sum_{n=0}^{\infty}\mathcal{E}_{n}^{Y}(x)\frac{t^{n}}{n!}. $$
(43)

When \(Y=1\), \(\mathcal{E}_{n}^{Y}(x)=\mathcal{E}_{n}(x),\ (n\ge 0)\). In particular, for \(x=0\), \(\mathcal{E}_{n}^{Y}=\mathcal{E}_{n}^{Y}(0)\) are called the probabilistic Euler numbers associated with \(Y\).

From (43), we have

$$ \sum_{n=0}^{\infty}\mathcal{E}_{n}^{Y}(x)\frac{t^{n}}{n!} =\frac{2}{E[e^{Yt}]+1}\Big(E[e^{Yt}]\Big)^{x} =\frac{2}{E[e^{tY}]+1}\Big(E[e^{tY}]-1+1\Big)^{x}$$
(44)
$$=\frac{2}{E[e^{tY}]+1}\sum_{k=0}^{\infty}\binom{x}{k}\Big(E[e^{tY}]-1\Big)^{k} =\sum_{l=0}^{\infty}\mathcal{E}_{l}^{Y}\frac{t^{l}}{l!} \sum_{k=0}^{\infty}(x)_{k}\sum_{m=k}^{\infty}{m \brace k}_{Y} \frac{t^{m}}{m!} $$
$$=\sum_{l=0}^{\infty}\mathcal{E}_{l}^{Y}\frac{t^{l}}{l!} \sum_{m=0}^{\infty}\sum_{k=0}^{m}(x)_{k}{m\brace k}_{Y}\frac{t^{m}}{m!} =\sum_{n=0}^{\infty}\sum_{m=0}^{n}\binom{n}{m}\mathcal{E}_{n-m}^{Y} \sum_{k=0}^{m}{m\brace k}_{Y}(x)_{k}\frac{t^{n}}{n!}. $$
(45)

Therefore, by comparing the coefficients on both sides of (44) and (45), we obtain the following theorem.

Theorem 2.15.

For \(n\ge 0,\) we have

$$\mathcal{E}_{n}^{Y}(x)= \sum_{m=0}^{n}\binom{n}{m}\mathcal{E}_{n-m}^{Y} \sum_{k=0}^{m}{m\brace k}_{Y}(x)_{k}.$$

From (43), we note that

$$ \sum_{n=0}^{\infty}\mathcal{E}_{n}^{Y}\frac{t^{n}}{n!} =2\sum_{k=0}^{\infty}(-1)^{k}\Big(E[e^{Yt}]\Big)^{k} =2\sum_{k=0}^{\infty}(-1)^{k}E\Big[e^{(Y_{1}+Y_{2}+\cdots+Y_{k})t}\Big] =\sum_{n=0}^{\infty}2\sum_{k=0}^{\infty}(-1)^{k}E[S_{k}^{n}]\frac{t^{n}}{n!}.$$
(46)

Therefore, by (46), we obtain the following theorem.

Theorem 2.16.

For \(n\ge 0,\) we have

$$\mathcal{E}_{n}^{Y}=2\sum_{k=0}^{\infty}(-1)^{k}E\big[S_{k}^{n}\big].$$

For \(n\in\mathbb{N}\) with \(n\equiv 0\ (\mathrm{mod}\ 2)\), we have

$$ 2\sum_{k=0}^{n}(-1)^{k}\Big(E[e^{Yt}]\Big)^{k} =\frac{2}{E[e^{Yt}]+1}\Big(1+\big(E[e^{Yt}]\big)^{n+1}\Big) =\sum_{m=0}^{\infty}\Big(\mathcal{E}_{m}^{Y} +\mathcal{E}_{m}^{Y}(n+1)\Big)\frac{t^{m}}{m!}.$$
(47)

On the other hand, by (11), we get

$$\begin{aligned} \, 2\sum_{k=0}^{n}(-1)^{k}\big(E[e^{Yt}]\big)^{k} &=2\sum_{k=0}^{n}(-1)^{k}E\big[e^{(Y_{1}+Y_{2}+\cdots+Y_{k})t}\big] \\ &=2\sum_{k=0}^{n}(-1)^{k}\sum_{m=0}^{\infty}E[S_{k}^{m}]\frac{t^{m}}{m!} =\sum_{m=0}^{\infty}2\sum_{k=0}^{n}(-1)^{k}E[S_{k}^{m}]\frac{t^{m}}{m!}.\nonumber \end{aligned}$$
(48)

Therefore, by (47) and (48), we obtain the following theorem.

Theorem 2.17.

For \(n\in\mathbb{N}\) with \(n\equiv 0\ (\mathrm{mod}\ 2)\) and \(m\ge 0,\) we have

$$\sum_{k=0}^{n}(-1)^{k}E[S_{k}^{m}] =\frac{\mathcal{E}_{m}^{Y}+\mathcal{E}_{m}^{Y}(n+1)}{2}.$$
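For \(Y=1\), Theorem 2.17 becomes the classical alternating power-sum identity from the introduction, since \(S_{k}=k\) and \(\mathcal{E}_{m}^{Y}(x)=\mathcal{E}_{m}(x)\). A sympy sanity check over even \(n\) (sympy's `euler(m, x)` are the Euler polynomials, with \(\mathcal{E}_{m}=\mathcal{E}_{m}(0)\)):

```python
from sympy import euler, Rational

def alt_power_sum(n, m):
    # sum_{k=0}^{n} (-1)^k k^m  (with the convention 0^0 = 1)
    return sum((-1)**k * k**m for k in range(n + 1))

def via_euler(n, m):
    # (E_m + E_m(n+1)) / 2 with E_m = E_m(0); valid for even n
    return Rational(1, 2) * (euler(m, 0) + euler(m, n + 1))

checks = [(n, m, alt_power_sum(n, m) == via_euler(n, m))
          for n in range(0, 9, 2) for m in range(6)]
```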

Let \(Y\sim\Gamma(1,1)\). Then, by (27), \(E[e^{Yt}] =\frac{1}{1-t},\quad (t <1)\). By (43), we get

$$ \sum_{n=0}^{\infty}\mathcal{E}_{n}^{Y}\frac{t^{n}}{n!} =\frac{1-t}{1-\frac{t}{2}}=(1-t)\sum_{n=0}^{\infty}\bigg(\frac{1}{2}\bigg)^{n}t^{n} =\sum_{n=0}^{\infty}\bigg(\frac{1}{2}\bigg)^{n}t^{n} -\sum_{n=1}^{\infty}\bigg(\frac{1}{2}\bigg)^{n-1}t^{n} =1-\sum_{n=1}^{\infty}\bigg(\frac{1}{2}\bigg)^{n}t^{n}.$$
(49)

Hence, by (49), we get

$$ \mathcal{E}_{n}^{Y}=-\frac{n!}{2^{n}}\quad (n\ge 1).$$
(50)

Therefore, by (50), we obtain the following theorem.

Theorem 2.18.

Let \(Y\sim\Gamma(1,1)\). For \(n\ge 1,\) we have

$$\mathcal{E}_{n}^{Y}=-\frac{n!}{2^{n}}.$$
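The computation (49) leading to Theorem 2.18 can be replayed symbolically, expanding \((1-t)/(1-t/2)\) and comparing coefficients with \(-n!/2^{n}\):

```python
from sympy import symbols, factorial, Rational

t = symbols('t')
N = 8

# For Y ~ Gamma(1,1): E[e^{tY}] = 1/(1-t), so 2/(E[e^{tY}] + 1) = (1-t)/(1 - t/2)
f = (1 - t) / (1 - t / 2)
s = f.series(t, 0, N).removeO()
euler_prob = [s.coeff(t, n) * factorial(n) for n in range(N)]   # E_n^Y
expected = [1] + [-Rational(factorial(n), 2**n) for n in range(1, N)]
```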

Let \(Y\) be the Poisson random variable with parameter \(\alpha>0\). Then we have

$$\begin{aligned} \, \sum_{n=0}^{\infty}\mathcal{E}_{n}^{Y}\frac{t^{n}}{n!} &=\frac{2}{E[e^{Yt}]+1}=\frac{2}{e^{\alpha(e^{t}-1)}+1} \\ &=\sum_{k=0}^{\infty}\mathcal{E}_{k}\alpha^{k} \frac{1}{k!}\big(e^{t}-1\big)^{k}=\sum_{k=0}^{\infty}\mathcal{E}_k\alpha^{k} \sum_{n=k}^{\infty}{n\brace k}\frac{t^{n}}{n!} =\sum_{n=0}^{\infty} \sum_{k=0}^{n}\alpha^{k}\mathcal{E}_{k}{n\brace k}\frac{t^{n}}{n!}.\nonumber \end{aligned}$$
(51)

Therefore, by (51), we obtain the following theorem.

Theorem 2.19.

Let \(Y\) be the Poisson random variable with parameter \(\alpha>0\). For \(n\ge 0,\) we have

$$\mathcal{E}_{n}^{Y}=\sum_{k=0}^{n}\alpha^{k}{n\brace k}\mathcal{E}_{k}.$$
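Theorem 2.19 can also be verified by expanding (51) for a concrete parameter; \(\alpha=2/3\) below is an arbitrary choice of ours, and any \(\alpha>0\) works:

```python
from sympy import symbols, exp, factorial, euler, Rational
from sympy.functions.combinatorial.numbers import stirling

t = symbols('t')
alpha = Rational(2, 3)       # arbitrary Poisson parameter alpha > 0
N = 7

# MGF of Poisson(alpha): E[e^{tY}] = exp(alpha(e^t - 1))
f = 2 / (exp(alpha * (exp(t) - 1)) + 1)
s = f.series(t, 0, N).removeO()
lhs = [s.coeff(t, n) * factorial(n) for n in range(N)]           # E_n^Y
rhs = [sum(alpha**k * stirling(n, k) * euler(k, 0) for k in range(n + 1))
       for n in range(N)]
```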

3. Conclusion

We used generating functions to study probabilistic extensions of several special polynomials, namely the probabilistic Bernoulli polynomials associated with \(Y\) and the probabilistic Euler polynomials associated with \(Y\), together with the probabilistic \(r\)-Stirling numbers of the second kind associated with \(Y\), the probabilistic two variable Fubini polynomials associated with \(Y\), and the probabilistic poly-Bernoulli polynomials associated with \(Y\). Here \(Y\) is a random variable such that the moment generating function of \(Y\) exists in a neighborhood of the origin. In more detail, we obtained several explicit expressions for \(B_{n}^{Y}(x)\) (see Theorems 2.1, 2.2, 2.6) and an explicit expression for each of \(F_{n}^{Y}(x|y),\,F_{n}^{Y}(y|r),\,B_{n}^{(k,Y)}(x)\), and \(\mathcal{E}_{n}^{Y}(x)\) (see Theorems 2.11, 2.12, 2.14, 2.15). We derived probabilistic extensions of three well-known identities on Bernoulli numbers (see Theorems 2.3–2.5) and a probabilistic extension of the identity that reduces to an explicit expression for those numbers (see Theorem 2.7). We obtained two identities on \(\mathcal{E}_{n}^{Y}\) (see Theorems 2.16, 2.17). In the special case of \(Y \sim \Gamma(1,1)\), we found an expression for \(B_{n}^{Y}(x)\) (see Theorem 2.8) and one for \(\mathcal{E}_{n}^{Y}\) (see Theorem 2.18). Also, we determined \(B_{n}^{Y}\) when \(Y\) is the Bernoulli random variable with probability of success \(p\) (see Theorem 2.9) and \(\mathcal{E}_{n}^{Y}\) when \(Y\) is the Poisson random variable with parameter \(\alpha\) (see Theorem 2.19). Finally, we deduced two identities involving \({n+r \brace k+r}_{r,Y}\) (see Theorems 2.10, 2.13).

As one of our future projects, we would like to continue to study probabilistic versions of many special polynomials and numbers and to find their applications to physics, science and engineering as well as to mathematics.