1 Introduction and motivation

Divided differences (cf. [7, Chapter 1]) are useful tools in mathematics and physics. For example, their applications to symmetric functions and approximation theory can be found, respectively, in [6, § 1.2] and [8, § 2.4]. For a complex function f(y) and unevenly spaced grid points \(\{x_k\}_{k=0}^{n}\), the divided differences with respect to y are defined successively as follows:

$$\begin{aligned} \Delta [x_0,x_1]f(y)= & {} \frac{f(x_0)-f(x_1)}{x_0-x_1},\\ \Delta [x_0,x_1,x_2]f(y)= & {} \frac{\Delta [x_0,x_1]f(y)-\Delta [x_1,x_2]f(y)}{x_0-x_2},\\ \cdots&\vdots&\cdots \\ \Delta [x_0,x_1,\ldots ,x_n]f(y)= & {} \frac{\Delta [x_0,x_1,\ldots ,x_{n-1}]f(y) -\Delta [x_1,x_2,\ldots ,x_n]f(y)}{x_0-x_n}. \end{aligned}$$

In general, they can be expressed through the Newton formula:

$$\begin{aligned} \Delta [x_0,x_1,\ldots ,x_n]f(y) =\sum _{k=0}^{n}\frac{f(x_k)}{\prod _{i\ne k}(x_k-x_i)}. \end{aligned}$$
(1)
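As a quick independent check (not part of the text), the recursive definition and the Newton formula (1) can be compared in exact rational arithmetic; the grid points and the test function below are arbitrary choices:

```python
# Sketch comparing the recursive definition of divided differences
# with the closed Newton formula (1); exact arithmetic via Fraction.
from fractions import Fraction as F
from math import prod

def dd_recursive(xs, f):
    if len(xs) == 1:
        return f(xs[0])
    return (dd_recursive(xs[:-1], f) - dd_recursive(xs[1:], f)) / (xs[0] - xs[-1])

def dd_newton(xs, f):  # formula (1)
    return sum(f(xs[k]) / prod(xs[k] - xs[i] for i in range(len(xs)) if i != k)
               for k in range(len(xs)))

xs = [F(1), F(2), F(5), F(7)]   # arbitrary distinct grid points
f = lambda x: x ** 6
assert dd_recursive(xs, f) == dd_newton(xs, f)
assert dd_newton(xs, lambda x: x ** 3) == 1   # leading-coefficient property
```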

The divided differences of monomials are related to the complete symmetric functions by the following well-known formula due to Sylvester (1839), whose proofs can be found in [1,2,3]:

$$\begin{aligned} \Delta [x_0,x_1,\ldots ,x_n]y^m= & {} \sum _{k=0}^{n}\frac{x_k^m}{\prod _{i\ne k}(x_k-x_i)} \nonumber \\= & {} {\left\{ \begin{array}{ll} 0, &{} \quad m=0,1,\ldots ,n-1; \\ h_{m-n}(x_0,x_1,\ldots ,x_n), &{} \quad m=n,~n+1,~\ldots ; \\ \frac{(-1)^n}{x_0x_1\cdots x_n} h_{-m-1}\big (\frac{1}{x_0},\frac{1}{x_1},\ldots ,\frac{1}{x_n}\big ), &{} \quad m=-1,-2,-3,\ldots . \end{array}\right. } \end{aligned}$$
(2)
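All three cases of Sylvester's formula (2) can likewise be checked numerically; the point set below (with \(n=3\)) is an arbitrary choice:

```python
# Sanity check of the three cases of formula (2) in exact rational arithmetic.
from fractions import Fraction as F
from math import prod
from itertools import combinations_with_replacement

def dd(xs, f):  # Newton formula (1)
    return sum(f(xs[k]) / prod(xs[k] - xs[i] for i in range(len(xs)) if i != k)
               for k in range(len(xs)))

def h(m, xs):   # complete symmetric function h_m
    return sum(prod(c) for c in combinations_with_replacement(xs, m))

xs = [F(1), F(2), F(3), F(5)]
n = len(xs) - 1

for m in range(n):                                   # case 0 <= m < n
    assert dd(xs, lambda x: x ** m) == 0
assert dd(xs, lambda x: x ** 5) == h(5 - n, xs)      # case m >= n
m = -2                                               # case m < 0
assert dd(xs, lambda x: x ** m) == (-1) ** n / prod(xs) * h(-m - 1, [1 / x for x in xs])
```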

Here and below, we shall denote the elementary and complete symmetric functions (cf. Macdonald [6, § 1.2]) of variables \(X=\{x_0,x_1,\ldots ,x_n\}\), respectively, by

$$\begin{aligned} e_0(X)= & {} 1 \quad \text { and } \quad e_m(X)=\sum _{0\le k_1<k_2<\cdots <k_m\le n} x_{k_1}x_{k_2}\cdots x_{k_m} \quad \text { for } \; m\in \mathbb {N};\\ h_0(X)= & {} 1 \quad \text { and } \quad h_m(X)=\sum _{0\le k_1\le k_2\le \cdots \le k_m\le n} x_{k_1}x_{k_2}\cdots x_{k_m} \quad \text { for } \; m\in \mathbb {N}. \end{aligned}$$

They admit the following generating functions:

$$\begin{aligned} \sum _{k=0}^{n+1}e_k(X)\,y^k= & {} \prod _{k=0}^n(1+x_ky),\end{aligned}$$
(3)
$$\begin{aligned} \sum _{k=0}^{\infty }h_k(X)\,y^k= & {} \prod _{k=0}^n\frac{1}{1-x_ky}. \end{aligned}$$
(4)
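A minimal sketch confirming (3) and (4) by direct polynomial multiplication; the set X and the truncation order N below are arbitrary choices:

```python
# Check the generating functions (3) and (4) against the defining sums
# of e_m and h_m; the h-series is truncated at order N.
from fractions import Fraction as F
from math import prod
from itertools import combinations, combinations_with_replacement

def e(m, xs):
    return sum(prod(c) for c in combinations(xs, m))

def h(m, xs):
    return sum(prod(c) for c in combinations_with_replacement(xs, m))

def polymul(p, q):  # coefficient lists in y, ascending degree
    r = [F(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

X = [F(1), F(2), F(3)]

poly = [F(1)]                       # expand prod_k (1 + x_k y)
for x in X:
    poly = polymul(poly, [F(1), x])
assert poly == [e(k, X) for k in range(len(X) + 1)]

N = 5                               # expand prod_k 1/(1 - x_k y) up to y^{N-1}
ser = [F(1)] + [F(0)] * (N - 1)
for x in X:
    ser = polymul(ser, [x ** k for k in range(N)])[:N]
assert ser == [h(k, X) for k in range(N)]
```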

Recently, divided differences have been employed by the first author [4, 5] and Tang–Xu [9] to investigate determinant evaluations. In this paper, we shall utilize the Laplace expansion formula to establish an extension of the Vandermonde determinant. By computing divided differences, this extension will further be specialized to several interesting Vandermonde-like determinant identities.

2 Extensions of Vandermonde determinant

For the indeterminates \(X:=\{x_k\}_{k=0}^{n}\), denote the divided differences by

$$\begin{aligned} \Delta [X]f(y):=\Delta [x_0,x_1,\ldots ,x_n]f(y) \end{aligned}$$

and the Vandermonde determinant by

$$\begin{aligned} V(X):=\det _{0\le i,j\le n}\big [x_i^{j}\big ] =\prod _{0\le i<j\le n}(x_j-x_i). \end{aligned}$$

Theorem 1

(Extension of Vandermonde determinant)

$$\begin{aligned} \det _{0\le i,j\le n}\big [x_i^j+u_iv_j\big ] =V(X) \bigg \{1+\sum _{\imath ,\jmath =0}^{n}(-1)^{n+\jmath } \frac{u_{\imath }v_{\jmath }e_{n-\jmath }\big (X\backslash \{x_{\imath }\}\big )}{\prod _{k\ne \imath }(x_{\imath }-x_k)} \bigg \}. \end{aligned}$$

Proof

Consider the extended matrix of order \((n+2)\times (n+2)\) given explicitly by

$$\begin{aligned} \left[ \begin{array}{ccc} 1&{}\vdots &{}0\\ \cdots &{}\cdots &{}\cdots \cdots \cdots \cdots \cdots \cdots \\ u_i&{}\vdots &{}x_i^j+u_iv_j\;\scriptstyle (0\le i,j\le n) \end{array} \right] \end{aligned}$$

whose determinant is obviously equal to the determinant stated in the theorem.

Now subtracting, for each \(j\), \(v_j\) times the first column from the column indexed by \(j\), we transform the matrix into the following one:

$$\begin{aligned} \left[ \begin{array}{ccc}1&{}\vdots &{}-v_j\\ \cdots \cdots &{}\cdots &{}\cdots \cdots \cdots \cdots \cdots \\ u_i&{}\vdots &{}x_i^j\;\scriptstyle (0\le i,j\le n)\end{array}\right] . \end{aligned}$$

Applying the Laplace expansion formula to the last matrix with respect to the first column and then to the first row, we have

$$\begin{aligned} \begin{aligned} \det _{0\le i,j\le n}\big [x_i^j+u_iv_j\big ] =&\det _{0\le i,j\le n}\big [x_i^j\big ] +\sum _{\imath =0}^{n}(-1)^{\imath }u_{\imath } \det _{0\le i,j\le n} \left[ \begin{array}{c}v_j\\ \cdots \\ x_i^j\end{array} \right] _{i\ne \imath }\\ =&V(X) +\sum _{\imath ,\jmath =0}^{n}(-1)^{\imath +\jmath }u_{\imath }v_{\jmath } \det _{0\le i,j\le n}\big [ x_i^j\big ]_{\begin{array}{c} \scriptstyle i\ne \imath \\ \scriptstyle j\ne \jmath \end{array}}. \end{aligned} \end{aligned}$$
(5)

Let \([x^k]f(x)\) stand for the coefficient of \(x^k\) in the formal power series f(x). Then the determinant displayed in (5) can be evaluated as

$$\begin{aligned} \det _{0\le i,j\le n} \big [x_i^j\big ]_{\begin{array}{c} \scriptstyle i\ne \imath \\ \scriptstyle j\ne \jmath \end{array}}= & {} (-1)^{\imath +\jmath }\,[x_\imath ^\jmath ]\,V(X) =(-1)^{\jmath }[x_\imath ^{\jmath }] \prod _{\begin{array}{c} 0\le i<j\le n\\ i,j\ne \imath \end{array}}(x_j-x_i) \prod _{\begin{array}{c} 0\le k\le n\\ k\ne \imath \end{array}}(x_k-x_\imath )\\= & {} e_{n-\jmath }\big (X\backslash \{x_{\imath }\}\big ) \prod _{\begin{array}{c} 0\le i<j\le n\\ i,j\ne \imath \end{array}}(x_j-x_i) =(-1)^{n-\imath } \frac{e_{n-\jmath }\big (X\backslash \{x_{\imath }\}\big )}{\prod _{k\ne \imath }(x_{\imath }-x_k)}V(X). \end{aligned}$$

Substituting this expression into (5) and then simplifying the result, we obtain the determinant identity stated in the theorem. \(\square \)
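As an independent sanity check of Theorem 1 (not part of the argument), the two sides can be compared in exact rational arithmetic; the values of X, u and v below are arbitrary:

```python
# Numerical verification of Theorem 1 for n = 3 with arbitrary rational data.
from fractions import Fraction as F
from itertools import permutations, combinations
from math import prod

def det(M):
    # Leibniz expansion; adequate for the small matrices used here
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def e(m, xs):  # elementary symmetric function e_m
    return sum(prod(c) for c in combinations(xs, m))

X = [F(1), F(2), F(4), F(7)]
u = [F(3), F(-1), F(2), F(5)]
v = [F(2), F(1), F(-3), F(4)]
n = len(X) - 1

lhs = det([[X[i] ** j + u[i] * v[j] for j in range(n + 1)] for i in range(n + 1)])
V = prod(X[j] - X[i] for i in range(n + 1) for j in range(i + 1, n + 1))
rhs = V * (1 + sum((-1) ** (n + jj) * u[ii] * v[jj]
                   * e(n - jj, [x for k, x in enumerate(X) if k != ii])
                   / prod(X[ii] - X[k] for k in range(n + 1) if k != ii)
                   for ii in range(n + 1) for jj in range(n + 1)))
assert lhs == rhs
```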

When \(u_k=1\), Theorem 1 leads immediately to the following determinant identity.

Corollary 2

$$\begin{aligned} \det _{0\le i,j\le n}\big [x_i^j+v_j\big ]=V(X)(1+v_0). \end{aligned}$$

Proof

According to (3), there holds the expression

$$\begin{aligned} e_{n-\jmath }(X\backslash \{x_\imath \}) =[y^{n-\jmath }] \frac{\prod _{k=0}^{n}(1+yx_k)}{1+yx_\imath } =\sum _{k=0}^{n-\jmath }(-1)^kx^k_\imath e_{n-k-\jmath }(X). \end{aligned}$$

Then we have, from (2), the relation

$$\begin{aligned} \Delta [X]y^k=\Delta [x_0,x_1,\ldots ,x_n]y^k=\chi (k=n) \quad \text {for}\; k=0,1,\ldots ,n \end{aligned}$$

where \(\chi \) is the logical function defined by \(\chi (\text {true})=1\) and \(\chi (\text {false})=0\). Therefore, the double sum displayed in Theorem 1 with \(u_k=1\) can be rewritten as follows

$$\begin{aligned} \sum _{\imath =0}^{n}\sum _{\jmath =0}^{n}(-1)^{n-\jmath } \frac{v_{\jmath }e_{n-\jmath }(X\backslash \{x_\imath \})}{\prod _{k\ne \imath }(x_{\imath }-x_k)} =\sum _{\jmath =0}^{n}(-1)^{n-\jmath }v_{\jmath } \sum _{k=0}^{n-\jmath }(-1)^k e_{n-k-\jmath }(X)\Delta [X]y^k \end{aligned}$$

which reduces to the single term \(v_0\), corresponding to \(k=n\) and \(\jmath =0\). \(\square \)

For \(v_k=v^k\) with \(0\le k\le n\), observing that

$$\begin{aligned} \sum _{\jmath =0}^{n}(-1)^{n-\jmath }v^{\jmath } e_{n-\jmath }(X\backslash \{x_\imath \}) =\prod _{\begin{array}{c} k=0\\ k\not =\imath \end{array}}^{n}(v-x_k) \quad \text {and}\quad \Delta [X]\frac{1}{v-y} =\frac{1}{\prod _{k=0}^{n}(v-x_k)}, \end{aligned}$$

we derive from Theorem 1 another determinant identity.

Proposition 3

$$\begin{aligned} \det _{0\le i,j\le n}\big [x_i^j+u_iv^j\big ] =V(X)\sum _{i=0}^{n}(1+u_i) \prod _{\begin{array}{c} k=0\\ k\not =i \end{array}}^{n} \frac{v-x_k}{x_i-x_k}. \end{aligned}$$
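Proposition 3 admits the same kind of numerical confirmation; the rational data below (with \(n=3\)) are arbitrary:

```python
# Sanity check of Proposition 3 in exact rational arithmetic.
from fractions import Fraction as F
from itertools import permutations
from math import prod

def det(M):
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

X = [F(1), F(2), F(4), F(7)]
u = [F(3), F(-1), F(2), F(5)]
v = F(3)
n = len(X) - 1

lhs = det([[X[i] ** j + u[i] * v ** j for j in range(n + 1)] for i in range(n + 1)])
V = prod(X[j] - X[i] for i in range(n + 1) for j in range(i + 1, n + 1))
rhs = V * sum((1 + u[i]) * prod((v - X[k]) / (X[i] - X[k])
                                for k in range(n + 1) if k != i)
              for i in range(n + 1))
assert lhs == rhs
```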

Suppose further that \(u_k:=u(x_k)\) is a function of \(x_k\) for \(0\le k\le n\). Then we may state Proposition 3 equivalently in terms of divided differences.

Corollary 4

$$\begin{aligned} \det _{0\le i,j\le n}\big [x_i^j+u(x_i)v^j\big ] =V(X)\Delta [X]\bigg \{\frac{1+u(y)}{v-y}\bigg \} \prod _{k=0}^{n}(v-x_k). \end{aligned}$$

For \(m\le n\), letting \(u(x_i)=\lambda \prod _{k=1}^{m}(x_i-\gamma _k)\) and appealing to the formula

$$\begin{aligned} \Delta [X]\frac{\prod _{k=1}^{m}(y-\gamma _k)}{v-y} =\frac{\prod _{k=1}^{m}(v-\gamma _k)}{\prod _{k=0}^{n}(v-x_k)}, \end{aligned}$$

we derive from Corollary 4 the following determinant identity.

Example 5

$$\begin{aligned} \det _{0\le i,j\le n}\bigg [ x_i^j+\lambda v^j\prod _{k=1}^{m}(x_i-\gamma _k)\bigg ] =V(X)\bigg \{1+\lambda \prod _{k=1}^{m}(v-\gamma _k)\bigg \}. \end{aligned}$$
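Example 5 can be confirmed in exact arithmetic as well; the data below (with \(n=3\) and \(m=2\le n\)) are arbitrary choices:

```python
# Sanity check of Example 5 with arbitrary rational data.
from fractions import Fraction as F
from itertools import permutations
from math import prod

def det(M):
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

X = [F(1), F(2), F(4), F(7)]
g = [F(3), F(-1)]                  # gamma_1, ..., gamma_m with m = 2
lam, v = F(1, 2), F(5)
n = len(X) - 1

lhs = det([[X[i] ** j + lam * v ** j * prod(X[i] - gk for gk in g)
            for j in range(n + 1)] for i in range(n + 1)])
V = prod(X[j] - X[i] for i in range(n + 1) for j in range(i + 1, n + 1))
rhs = V * (1 + lam * prod(v - gk for gk in g))
assert lhs == rhs
```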

Instead, letting \(u(x_i)=\lambda \prod _{k=0}^{n}(x_i-\gamma _k)\) and then applying the formula

$$\begin{aligned} \Delta [X]\frac{\prod _{k=0}^{n}(y-\gamma _k)}{v-y} =\prod _{k=0}^{n}\frac{v-\gamma _k}{v-x_k}-1, \end{aligned}$$

we establish from Corollary 4 another determinant identity.

Example 6

$$\begin{aligned} \det _{0\le i,j\le n}\bigg [ x_i^j+\lambda v^j\prod _{k=0}^{n}(x_i-\gamma _k)\bigg ] =V(X)\bigg \{1-\lambda \prod _{k=0}^{n}(v-x_k) +\lambda \prod _{k=0}^{n}(v-\gamma _k)\bigg \}. \end{aligned}$$

Specifying further \(\lambda =-1/\prod _{k=0}^{n}(v-\gamma _k)\) in the last example, we obtain a more symmetric determinant formula:

$$\begin{aligned} \det _{0\le i,j\le n} \bigg [x_i^j-v^j\prod _{k=0}^{n} \frac{x_i-\gamma _k}{v-\gamma _k}\bigg ] =V(X)\prod _{k=0}^{n} \frac{v-x_k}{v-\gamma _k}. \end{aligned}$$
(6)
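Identity (6) may be confirmed numerically too; the data below (with \(n=2\) and \(v\) distinct from every \(\gamma_k\)) are arbitrary:

```python
# Sanity check of identity (6) in exact rational arithmetic.
from fractions import Fraction as F
from itertools import permutations
from math import prod

def det(M):
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

X = [F(1), F(2), F(4)]
g = [F(3), F(-1), F(5)]            # gamma_0, ..., gamma_n
v = F(9)
n = len(X) - 1

lhs = det([[X[i] ** j - v ** j * prod((X[i] - gk) / (v - gk) for gk in g)
            for j in range(n + 1)] for i in range(n + 1)])
V = prod(X[j] - X[i] for i in range(n + 1) for j in range(i + 1, n + 1))
rhs = V * prod((v - X[k]) / (v - g[k]) for k in range(n + 1))
assert lhs == rhs
```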

Finally, if we let in Corollary 4

$$\begin{aligned} u(y)=-\prod _{k=0}^{n+1}\frac{y-\gamma _k}{v-\gamma _k} \end{aligned}$$

and apply the formula

$$\begin{aligned} \Delta [X]\frac{\prod _{k=0}^{n+1}(y-\gamma _k)}{v-y} =\gamma _{n+1}-v+\sum _{k=0}^{n}(\gamma _k-x_k) +\frac{\prod _{k=0}^{n+1}(v-\gamma _k)}{\prod _{k=0}^{n}(v-x_k)}, \end{aligned}$$

we derive the following determinant identity.

Example 7

$$\begin{aligned}\det _{0\le i,j\le n} \bigg [x_i^j-v^j\prod _{k=0}^{n+1} \frac{x_i-\gamma _k}{v-\gamma _k}\bigg ] =V(X)\left\{ 1+\sum _{i=0}^n\frac{x_i-\gamma _i}{v-\gamma _{n+1}}\right\} \prod _{k=0}^{n}\frac{v-x_k}{v-\gamma _k}.\end{aligned}$$
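Example 7 can be confirmed in the same way; the data below (with \(n=2\), hence \(n+2=4\) parameters \(\gamma_0,\ldots,\gamma_{n+1}\)) are arbitrary:

```python
# Sanity check of Example 7 with arbitrary rational data.
from fractions import Fraction as F
from itertools import permutations
from math import prod

def det(M):
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

X = [F(1), F(2), F(4)]
g = [F(3), F(-1), F(5), F(6)]      # gamma_0, ..., gamma_{n+1}
v = F(9)
n = len(X) - 1

lhs = det([[X[i] ** j - v ** j * prod((X[i] - gk) / (v - gk) for gk in g)
            for j in range(n + 1)] for i in range(n + 1)])
V = prod(X[j] - X[i] for i in range(n + 1) for j in range(i + 1, n + 1))
rhs = (V * (1 + sum((X[i] - g[i]) / (v - g[n + 1]) for i in range(n + 1)))
       * prod((v - X[k]) / (v - g[k]) for k in range(n + 1)))
assert lhs == rhs
```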

3 Further variants of Vandermonde determinant

For the sake of brevity, this section will use the indeterminates \(\mathbb {X}:=\{x_k\}_{k=1}^n\) and the following notation for the corresponding Vandermonde determinant

$$\begin{aligned} V(\mathbb {X}):=\det _{1\le i,j\le n}\big [x_i^{j-1}\big ] =\prod _{1\le i<j\le n}(x_j-x_i). \end{aligned}$$

Then Theorem 1 can be reformulated equivalently as follows:

$$\begin{aligned} \det _{1\le i,j\le n}\big [x_i^j+u_iv_j\big ] =V(\mathbb {X})e_n(\mathbb {X}) \bigg \{1+\sum _{\imath ,\jmath =1}^{n}(-1)^{n+\jmath } \frac{u_{\imath }v_{\jmath }e_{n-\jmath }(\mathbb {X}\backslash \{x_{\imath }\})}{x_{\imath }\prod _{k\ne \imath }(x_{\imath }-x_k)}\bigg \}. \end{aligned}$$
(7)

For \(u_k=1\), this equality becomes the following determinant identity.

Proposition 8

$$\begin{aligned} \det _{1\le i,j\le n}\big [x_i^j+v_j\big ] =V(\mathbb {X})\bigg \{ e_n(\mathbb {X})-\sum _{k=1}^{n} (-1)^kv_ke_{n-k}(\mathbb {X})\bigg \}. \end{aligned}$$

Proof

Recalling (3), we have the expression

$$\begin{aligned} e_{n-\jmath }(\mathbb {X}\backslash \{x_\imath \}) =[y^{n-\jmath }] \frac{\prod _{k=1}^{n}(1+yx_k)}{1+yx_\imath } =\sum _{k=0}^{n-\jmath }(-1)^kx^k_\imath e_{n-k-\jmath }(\mathbb {X}). \end{aligned}$$

According to (2), it is trivial to check the relations

$$\begin{aligned} \Delta [\mathbb {X}]\frac{1}{y}=\frac{(-1)^{n-1}}{e_n(\mathbb {X})} \quad \text {and}\quad \Delta [\mathbb {X}]y^k=\chi (k=n-1) \quad \text {for}\quad k=0,1,\ldots ,n-1. \end{aligned}$$

Then for \(u_k=1\), we may reformulate the double sum displayed in (7) as

$$\begin{aligned}&\sum _{\imath =1}^{n}\sum _{\jmath =1}^{n}(-1)^{n-\jmath } \frac{v_{\jmath }e_{n-\jmath }(\mathbb {X}\backslash \{x_\imath \})}{x_{\imath }\prod _{k\ne \imath }(x_{\imath }-x_k)} =\sum _{\jmath =1}^{n}(-1)^{n-\jmath }v_{\jmath } e_{n-\jmath }(\mathbb {X})\Delta [\mathbb {X}]\frac{1}{y}&\\&\qquad +\sum _{\jmath =1}^{n}(-1)^{n-\jmath }v_{\jmath } \sum _{k=1}^{n-\jmath }(-1)^k e_{n-k-\jmath }(\mathbb {X})\Delta [\mathbb {X}]y^{k-1}.&\end{aligned}$$

Observing that the last double sum vanishes and then evaluating the divided differences in the penultimate line, we confirm the identity stated in the Proposition. \(\square \)

Letting \(v_j=v^j\) in Proposition 8, we obtain the following simplified determinant identity.

Example 9

$$\begin{aligned} \det _{1\le i,j\le n}\big [x_i^j+v^j\big ] =V(\mathbb {X})\bigg \{ 2e_n(\mathbb {X})-\prod _{k=1}^{n}(x_k-v)\bigg \}. \end{aligned}$$
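Example 9 admits a direct numerical check; note that here the row and column indices run from 1 to n. The data below (with \(n=3\)) are arbitrary:

```python
# Sanity check of Example 9; indices i, j run over 1..n, so the
# entry powers are x_i^j with j = 1, ..., n.
from fractions import Fraction as F
from itertools import permutations
from math import prod

def det(M):
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

X = [F(1), F(2), F(4)]             # x_1, ..., x_n
v = F(3)
n = len(X)

lhs = det([[X[i] ** j + v ** j for j in range(1, n + 1)] for i in range(n)])
V = prod(X[j] - X[i] for i in range(n) for j in range(i + 1, n))
rhs = V * (2 * prod(X) - prod(x - v for x in X))   # e_n = x_1 ... x_n
assert lhs == rhs
```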

Notice that the determinant in Corollary 4 can be expressed equivalently as

$$\begin{aligned} \det _{1\le i,j\le n}\big [x_i^j+u(x_i)v^j\big ] =V(\mathbb {X})e_n(\mathbb {X}) \bigg \{1+\Delta [\mathbb {X}]\frac{vu(y)}{y(v-y)} \prod _{k=1}^{n}(v-x_k)\bigg \}. \end{aligned}$$
(8)

For \(m\le n\), letting \(u(x_i)=\lambda \prod _{k=1}^{m}(x_i-\gamma _k)\) and appealing to the formula

$$\begin{aligned} \Delta [\mathbb {X}]\frac{v\prod _{k=1}^{m}(y-\gamma _k)}{y(v-y)} =(-1)^{m+n+1}\frac{\prod _{k=1}^{m}\gamma _k}{\prod _{k=1}^{n}x_k} +\frac{\prod _{k=1}^{m}(v-\gamma _k)}{\prod _{k=1}^{n}(v-x_k)}, \end{aligned}$$

we obtain from (8) another determinant identity.

Corollary 10

$$\begin{aligned} \det _{1\le i,j\le n}\bigg [ x_i^j+\lambda v^j\prod _{k=1}^{m}(x_i-\gamma _k)\bigg ]&=V(\mathbb {X})\bigg \{e_n(\mathbb {X}) +\lambda \prod _{k=1}^{n}x_k\prod _{k=1}^{m}(v-\gamma _k)\\&\quad -(-1)^{m+n}\lambda \prod _{k=1}^{m}\gamma _k \prod _{k=1}^{n}(v-x_k)\bigg \}. \end{aligned}$$

When \(m=n\) and \(\lambda =-1/\prod _{k=1}^{n}(v-\gamma _k)\), the last corollary further yields a more symmetric determinant identity:

$$\begin{aligned} \det _{1\le i,j\le n} \bigg [x_i^j-v^j\prod _{k=1}^{n} \frac{x_i-\gamma _k}{v-\gamma _k}\bigg ] =V(\mathbb {X})\prod _{k=1}^{n} \frac{\gamma _k(v-x_k)}{v-\gamma _k}. \end{aligned}$$
(9)

Finally, for \(v_k=h_k(\mathbb {X})\), Proposition 8 reduces to the following strange-looking identity:

$$\begin{aligned} \det _{1\le i,j\le n}\big [x_i^j+h_j(\mathbb {X})\big ] =2V(\mathbb {X})e_n(\mathbb {X}). \end{aligned}$$
(10)
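Identity (10) can also be confirmed in exact arithmetic; the point set below (with \(n=3\)) is an arbitrary choice:

```python
# Sanity check of identity (10): det[x_i^j + h_j(X)] = 2 V(X) e_n(X),
# with i, j running over 1..n.
from fractions import Fraction as F
from itertools import permutations, combinations_with_replacement
from math import prod

def det(M):
    n = len(M)
    return sum((-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))
               * prod(M[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def h(m, xs):   # complete symmetric function h_m
    return sum(prod(c) for c in combinations_with_replacement(xs, m))

X = [F(1), F(2), F(4)]             # x_1, ..., x_n
n = len(X)

lhs = det([[X[i] ** j + h(j, X) for j in range(1, n + 1)] for i in range(n)])
V = prod(X[j] - X[i] for i in range(n) for j in range(i + 1, n))
rhs = 2 * V * prod(X)              # e_n = x_1 ... x_n
assert lhs == rhs
```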