1 Introduction

We consider the following optimization problem. A set of n jobs \(J=\{J_1, J_2, \ldots , J_n\}\) has to be processed on a single machine, and all the jobs are available for processing at time zero. The machine can handle at most one job at a time and job preemption is not allowed. The actual processing time of job \(J_j\) when executed in the rth position in a sequence is

$$\begin{aligned} p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\quad u_j>0, \end{aligned}$$
(1)

where k is a positive constant, \(p_{j}\) is the normal processing time of job \(J_{j}\), \(a_j \le 0\) is a position-dependent learning index of job \(J_j\), and \(u_j\) is the amount of resource that can be allocated to job \(J_j\). Each job \(J_{j}\) has a unique due window \([d_j^1,d_j^2]\) with \(d_j^1\le d_j^2\). In this paper we consider a common due window, that is \(d_j^1=d^1, d_j^2=d^2\). Note that the window size, denoted by \(D=d^2-d^1\), is identical for all jobs.
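As a concrete illustration of the processing-time model (1), the following sketch (function name illustrative, numeric values hypothetical) computes the actual processing time from the normal time, learning index, resource amount and position:

```python
# Actual processing time of Eq. (1): p_j^A = (p_j * r**a_j / u_j) ** k.
# All numeric values below are hypothetical, for illustration only.
def actual_processing_time(p, a, u, k, r):
    """p: normal time; a <= 0: learning index; u > 0: resource; r: position."""
    return (p * r ** a / u) ** k

# A later position (learning effect) shortens the job, as does more resource.
t_early = actual_processing_time(p=10.0, a=-0.3, u=2.0, k=1.5, r=1)
t_late = actual_processing_time(p=10.0, a=-0.3, u=2.0, k=1.5, r=4)
assert t_late < t_early
```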

For a given schedule S, let \(C_j=C_j(S)\) denote the completion time of job \(J_j,C_{\max }=\max \{C_j|j=1,2,\ldots ,n\}\) be the makespan, \(E_j=\mathrm{max}\{0,d^1-C_j\}\) be the earliness value of job \(J_j,T_j=\mathrm{max}\{0,C_j-d^2\}\) be the tardiness value of job \(J_j,j=1,2,\ldots ,n\). The objective is to determine (i) a job schedule S, (ii) a resource allocation \(\mathbf{u}=(u_1,u_2,\ldots ,u_n)\), (iii) a due window starting time \(d^1\), and (iv) a due window size D such that the following objective function is minimized

$$\begin{aligned} \sum _{j=1}^nG_ju_j, \end{aligned}$$
(2)

subject to \(\sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\), where \(G_j\) is the per time unit cost associated with the resource allocation and \(C>0\) is a given constant. Using the three-field notation of Graham et al. [2], Biskup [1] and Shabtay and Steiner [11], the problem can be denoted as \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\).

To the best of our knowledge, several resource allocation scheduling problems with learning effects have been considered in the literature. Wang et al. [16] considered the single machine scheduling problems \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a}}{u_j}\right) ^k\right| \delta _1C_{\max }+\delta _2TC+\delta _3 TADC+\sum _{j=1}^nG_ju_j\) and \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a}}{u_j}\right) ^k\right| \delta _1C_{\max }+\delta _2TW+\delta _3 TADW+\sum _{j=1}^nG_ju_j\), where \(TC=\sum _{j=1}^nC_j\) (\(TW=\sum _{j=1}^nW_j\)) is the total completion time (total waiting time), TADC (TADW) is the total absolute differences in completion times (waiting times), and \(W_j = C_j-p_j^A\) is the waiting time of job \(J_j\). They proved that both problems can be solved in polynomial time. Lu et al. [9] considered the problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k\right| \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j)+\sum _{j=1}^nG_ju_j\), where \(E_j=\mathrm{max}\{0,d_j-C_j\}\) is the earliness value and \(T_j=\mathrm{max}\{0,C_j-d_j\}\) is the tardiness value of job \(J_j\), \(j=1,2,\ldots ,n\). For two due date assignment methods (the common (CON) due date, i.e., \(d_j=d\) for all jobs, and the slack (SLK) due date, i.e., \(d_j=p_j^A+q\)), they proved that the problem can be solved in polynomial time. Wang and Wang [14] considered the problems \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\sum _{j=1}^nu_j\le U\right| \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j)\) and \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j)\le R\right| \sum _{j=1}^nu_j\), where \(U>0\) and \(R>0\) are given constants. For three due date assignment methods (the CON due date, the SLK due date, and the unrestricted (DIF) due date), they proved that these problems can be solved in polynomial time.
Wang and Wang [14] also proved that some scheduling problems without due dates (i.e., \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\sum _{j=1}^nu_j\le U\right| \delta _1C_{\max }+\delta _2TC+\delta _3 TADC\), \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\sum _{j=1}^nu_j\le U\right| \delta _1C_{\max }+\delta _2TW+\delta _3 TADW\),

\(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\delta _1C_{\max }+\delta _2TC+\delta _3 TADC\le R\right| \sum _{j=1}^nu_j\) and

\(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\delta _1C_{\max }+\delta _2TW+\delta _3 TADW\le R\right| \sum _{j=1}^nu_j\)) can be solved in polynomial time. Wang and Wang [13] considered a single machine common due-window scheduling problem. They proved that the problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{{a_j}}}{u_j}\right) ^k\right| \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\theta \sum _{j=1}^nG_ju_j\) can be solved in polynomial time, where \(\alpha ,\beta ,\delta \) and \(\gamma \) are the per time unit penalties for earliness, tardiness, due window starting time and due window size, respectively. Yang et al. [19] considered single machine resource allocation scheduling problems with multiple due windows. For a non-regular objective cost, they proved that the problem can be solved in polynomial time. Li et al. [5] proved that the slack due window scheduling problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{{a_j}}}{u_j}\right) ^k\right| \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j^1 +\gamma D_j)+\eta C_{\max }+\theta \sum _{j=1}^nG_ju_j\) can be solved in polynomial time, where \([d_j^1,d_j^2]=[p_j+q^1, p_j+q^2]\) is the due-window of job \(J_j\), \(D_j\) is the due-window size, and both \(q^1\) and \(q^2\) are decision variables. Li et al. [5] also proved that the problems \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k\right| \delta _1C_{\max }+\delta _2TC+\delta _3 TADC+\sum _{j=1}^nG_ju_j\) and \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k\right| \delta _1C_{\max }+\delta _2TW+\delta _3 TADW+\sum _{j=1}^nG_ju_j\) can be solved in polynomial time.

In the recent paper “Study on due-window scheduling with controllable processing times and learning effect”, Wang et al. [12] considered single machine common due window scheduling with a limited resource cost availability constraint, i.e., the problem \(1\left| p_{j}^A=\left( \frac{ {p}_jr^{{a_j}}}{u_j}\right) ^k, \sum _{j=1}^nG_ju_j\le V\right| \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d +\gamma D)\). They proved that this problem can be solved in polynomial time. In this paper, we study the “inverse version” of the problem studied by Wang et al. [12], that is, the case in which the processing time of a job is described by a convex decreasing resource consumption function and a decreasing position-dependent function, and the objective is to minimize the total resource consumption cost subject to the constraint \(\sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\), i.e., the problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\right| \sum _{j=1}^nG_ju_j\). For more details on scheduling with learning effects, controllable processing times and due windows, the reader may refer to the recent surveys by Biskup [1], Shabtay and Steiner [11] and Janiak et al. [4].

The remainder of this paper is organized as follows. Section 2 derives the properties of the optimal schedule and provides a solution algorithm for the general case of the problem. Section 3 considers a special case of the problem, i.e., \(a_j=a\) for all jobs. In Sect. 4 we extend the problem to incorporate job-dependent positional penalties, a truncated job-dependent learning effect and the slack due window assignment method. The last section contains some conclusions and suggests some future research topics.

2 The single machine problem

2.1 Optimal resource allocation

For a given feasible resource allocation \(\mathbf{u}\), which fixes the job processing times and the resource consumption cost, our problem reduces to finding (i) a job schedule S, (ii) a due window starting time \(d^1\), and (iii) a due window size D such that the objective function \(\sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max }\) is minimized. Similar to Liman et al. [6, 7], Yin et al. [20] and Liu et al. [8], we have

Theorem 1

For problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k\right| \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max }\), an optimal schedule S satisfies the following properties:

  1. (1)

    All the jobs are processed consecutively without any machine idle from time zero.

  2. (2)

    The optimal due window satisfies \(d^1=C_{[h]}\) and \(d^2=C_{[l]}\) (\(l\ge h\)), where \(h=\lceil n(\gamma -\delta )/\alpha \rceil \), \(l=\lceil n(\beta -\gamma )/\beta \rceil \), and [j] denotes the job scheduled in the jth position of the sequence.

Now, we consider the following cost components:

  1. (1)

    The earliness cost for job \(J_{[j]}\) (\(j = 1, 2, \ldots , h\)) is:

    $$\begin{aligned} \alpha \sum _{j=1}^n E_j=\alpha \sum _{j=1}^h (d^1-C_{[j]})=\alpha \sum _{j=1}^h (C_{[h]}-C_{[j]})=\alpha \sum _{j=1}^h (j-1)p_{[j]}^A \end{aligned}$$
  2. (2)

    The tardiness cost for job \(J_{[j]}\) (\(j = l+1, l+2, \ldots , n\)) is:

    $$\begin{aligned} \beta \sum _{j=1}^n T_j=\beta \sum _{j=l+1}^n (C_{[j]}-d^2)=\beta \sum _{j=l+1}^n (C_{[j]}-C_{[l]})=\beta \sum _{j=l+1}^n(n-j+1)p_{[j]}^A \end{aligned}$$
  3. (3)

    \( \delta \sum _{j=1}^n d^1=\delta n d^1=\sum _{j=1}^h\delta n p_{[j]}^A \)

  4. (4)

    \( \gamma \sum _{j=1}^n D=n\gamma D=n\gamma (C_{[l]}-C_{[h]})=n\gamma \sum _{j=h+1}^lp_{[j]}^A \)

  5. (5)

    \( \eta C_{\max }=\eta \sum _{j=1}^np_{[j]}^A \)

Hence, from Eq. (1), Theorem 1 and the cost components (1)–(5) above, we have

$$\begin{aligned}&\sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } =\sum _{j=1}^n W_j p_{[j]}^A\nonumber \\&\quad =\sum _{j=1}^n W_j \left( \frac{{p}_{[j]}j^{a_{[j]}}}{u_{[j]}}\right) ^k, \end{aligned}$$
(3)

where

$$\begin{aligned} W_j=\left\{ \begin{array}{ll} \delta n+\alpha (j-1)+\eta &{}\quad \mathrm{for}\,\, j=1, 2,\ldots , h; \\ n\gamma +\eta &{}\quad \mathrm{for}\,\, j=h+1, h+2,\ldots , l; \\ \beta (n-j+1)+\eta &{}\quad \mathrm{for}\,\, j=l+1,l+2,\ldots , n. \end{array}\right. \end{aligned}$$
(4)
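The positional weights of Eq. (4), together with the indices h and l of Theorem 1, can be computed directly; a short sketch (function name illustrative) follows. With the parameters of Example 2.1 below it reproduces the values \(h=2\), \(l=5\) and \(W=(29, 39, 43, 43, 43, 35, 18)\) reported there.

```python
import math

# Positional weights W_j of Eq. (4), with h and l as in Theorem 1.
# Function name is illustrative; parameters follow the paper's notation.
def positional_weights(n, alpha, beta, gamma, delta, eta):
    h = math.ceil(n * (gamma - delta) / alpha)
    l = math.ceil(n * (beta - gamma) / beta)
    W = []
    for j in range(1, n + 1):
        if j <= h:
            W.append(delta * n + alpha * (j - 1) + eta)   # early positions
        elif j <= l:
            W.append(n * gamma + eta)                     # inside the due window
        else:
            W.append(beta * (n - j + 1) + eta)            # tardy positions
    return h, l, W

# Data of Example 2.1: n=7, alpha=10, beta=17, gamma=6, delta=4, eta=1.
h, l, W = positional_weights(n=7, alpha=10, beta=17, gamma=6, delta=4, eta=1)
```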

Theorem 2

For a given schedule \(S=(J_{[1]}, J_{[2]}, \ldots ,J_{[n]})\), the optimal resource allocation for the problem \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\) is given, as a function of the job sequence, by

$$\begin{aligned} u_{[j]}^*(S)&{=}&{\frac{(W_j)^{1/(k+1)}\left( {p}_{[j]}j^{a_{[j]}}\right) ^{k/(k+1)}\left( \sum _{j=1}^n(W_j)^{1/(k+1)}\left( G_{[j]}\right) ^{k/(k+1)} \left( {p}_{[j]}j^{a_{[j]}}\right) ^{k/(k+1)}\right) ^{1/k}}{C^{1/k}(G_{[j]})^{1/(k+1)}}},\nonumber \\&\quad j=1,2,\ldots ,n, \end{aligned}$$
(5)

where \(W_j\) is calculated by (4).

Proof

For any given sequence \(S=(J_{[1]}, J_{[2]}, \ldots ,J_{[n]})\), the Lagrange function is

$$\begin{aligned} L({ d^1}, D,\mathbf{u},\lambda )= & {} \sum _{j=1}^nG_ju_j +\lambda \left( \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } - C\right) \nonumber \\= & {} \sum _{j=1}^nG_{[j]}u_{[j]} +\lambda \left( \sum _{j=1}^n W_j \left( \frac{{p}_{[j]}j^{a_{[j]}}}{u_{[j]}}\right) ^k-C\right) \end{aligned}$$
(6)

where \(\lambda \) is the Lagrangian multiplier. Differentiating (6) with respect to \(u_{[j]}\) and \(\lambda \), we have

$$\begin{aligned} \frac{\partial {L({ d^1}, D,\mathbf{u},\lambda )}}{\partial {\lambda }}= & {} \sum _{j=1}^n W_j \left( \frac{{p}_{[j]}j^{a_{[j]}}}{u_{[j]}}\right) ^k-C=0, \end{aligned}$$
(7)
$$\begin{aligned} \frac{\partial {L({ d^1}, D,\mathbf{u},\lambda )}}{\partial {u_{[j]}}}= & {} G_{[j]}-k\lambda W_j\times \frac{\left( {p}_{[j]}j^{a_{[j]}}\right) ^k}{(u_{[j]})^{k+1}}=0. \end{aligned}$$
(8)

Using (7) and (8) we have

$$\begin{aligned} u_{[j]}= & {} \frac{\left( k\lambda W_j \left( {p}_{[j]}{j}^{a_{[j]}}\right) ^k\right) ^{1/(k+1)}}{( G_{[j]})^{1/(k+1)}}, \end{aligned}$$
(9)
$$\begin{aligned} (k\lambda )^{k/(k+1)}= & {} \frac{\sum _{j=1}^n(W_j)^{1/(k+1)}\left( {p}_{[j]}G_{[j]} {j}^{a_{[j]} }\right) ^{k/(k+1)}}{C} \end{aligned}$$
(10)

and

$$\begin{aligned} u_{[j]}^*(S)={\frac{(W_j)^{1/(k+1)}\left( {p}_{[j]}j^{a_{[j]}}\right) ^{k/(k+1)} \left( \sum _{j=1}^n(W_j)^{1/(k+1)}\left( G_{[j]}\right) ^{k/(k+1)} \left( {p}_{[j]}j^{a_{[j]}}\right) ^{k/(k+1)}\right) ^{1/k}}{C^{1/k}(G_{[j]})^{1/(k+1)}}}. \end{aligned}$$

\(\square \)
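The closed form of Eq. (5) is straightforward to evaluate; the sketch below (function name illustrative) computes the optimal allocation for a fixed sequence. A useful correctness check, which follows from the derivation above, is that substituting these values back into the constraint reproduces C exactly.

```python
# Optimal resource allocation u*_{[j]} of Eq. (5) for a fixed job sequence.
# Inputs are listed in sequence order: p[j], a[j], G[j] belong to the job in
# position j+1; W are the positional weights of Eq. (4); C is the cost bound.
def optimal_resources(p, a, G, W, k, C):
    n = len(p)
    # s = sum_i W_i^{1/(k+1)} (G_i p_i i^{a_i})^{k/(k+1)}, as in Eq. (10)
    s = sum(W[i] ** (1 / (k + 1)) * (G[i] * p[i] * (i + 1) ** a[i]) ** (k / (k + 1))
            for i in range(n))
    return [W[j] ** (1 / (k + 1)) * (p[j] * (j + 1) ** a[j]) ** (k / (k + 1))
            * s ** (1 / k) / (C ** (1 / k) * G[j] ** (1 / (k + 1)))
            for j in range(n)]

# Hypothetical data: the resulting schedule consumes the cost bound exactly.
p = [3.0, 5.0, 2.0]; a = [-0.2, -0.1, -0.3]; G = [2.0, 1.0, 3.0]
W = [13.0, 19.0, 18.0]; k = 1.5; C = 10.0
u = optimal_resources(p, a, G, W, k, C)
total = sum(W[j] * (p[j] * (j + 1) ** a[j] / u[j]) ** k for j in range(3))
assert abs(total - C) < 1e-6
```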

2.2 Optimal sequences

Theorem 3

For the problem \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\), the optimal schedule can be determined by solving a linear assignment problem.

Proof

By substituting (5) into \(\sum _{j=1}^nG_ju_j\), we have

$$\begin{aligned} \sum _{j=1}^nG_ju_j({ d^1}, D,\mathbf{u},S)=C^{-1/k}\left( \sum _{j=1}^n(W_j)^{1/(k+1)} \left( G_{[j]}{p}_{[j]}j^{a_{[j]}}\right) ^{k/(k+1)}\right) ^{1+1/k},\nonumber \\ \end{aligned}$$
(11)

where \(W_j\) is given by (4). Let \(X_{jr}\) \((j = 1, 2,\ldots ,n; r = 1, 2, \ldots ,n)\) be a 0–1 variable such that

$$\begin{aligned} X_{jr}=\left\{ \begin{array}{ll} 1&{}\quad \mathrm{if\,job}\,J_j\,\mathrm{is\,processed\,in\,the}\, { r}\mathrm{th}\,\mathrm{position},\\ 0 &{}\quad \mathrm{otherwise,} \end{array}\right. \end{aligned}$$
(12)

and

$$\begin{aligned} \theta _{jr}=(W_r)^{1/(k+1)}\left( {p}_{j}G_{j}r^{a_{j}}\right) ^{k/(k+1)}, \end{aligned}$$
(13)

where

$$\begin{aligned} W_r=\left\{ \begin{array} {ll} \delta n+\alpha (r-1)+\eta &{}\quad \mathrm{for}\,\, r=1, 2,\ldots , h; \\ n\gamma +\eta &{}\quad \mathrm{for}\,\, r=h+1, h+2,\ldots , l; \\ \beta (n-r+1)+\eta &{}\quad \mathrm{for}\,\, r=l+1,l+2,\ldots , n. \end{array}\right. \end{aligned}$$
(14)

Since k and C are given constants, the optimal schedule of the problem

\(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k,\sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\right| \sum _{j=1}^nG_ju_j\) can be formulated as the following linear assignment problem:

$$\begin{aligned}&\mathbf{P}: \mathrm{Min}Z=\sum _{j=1}^n\sum _{r=1}^n\theta _{jr}X_{jr} \end{aligned}$$
(15)
$$\begin{aligned}&\mathrm{s.t.}\nonumber \\&\sum _{r=1}^nX_{jr}=1,\quad j=1,2,\ldots ,n, \end{aligned}$$
(16)
$$\begin{aligned}&\sum _{j=1}^nX_{jr}=1,\quad r=1,2,\ldots ,n, \end{aligned}$$
(17)
$$\begin{aligned}&X_{jr}=0\,\,\mathrm{or}\,\,1, \quad j,r=1,2,\ldots ,n. \end{aligned}$$
(18)

The first set of constraints (Eq. (16)) ensures that each job is assigned to exactly one position; the second set of constraints (Eq. (17)) ensures that each position in the sequence is occupied by exactly one job. \(\square \)

2.3 Optimal solution

For the problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\right| \sum _{j=1}^nG_ju_j\), the following optimization algorithm can be proposed.

Algorithm 2.1

  • Step 1. Calculate the indices \(h^*\) and \(l^*\): \(h^*=\lceil n(\gamma -\delta )/\alpha \rceil \) and \(l^*=\lceil n(\beta -\gamma )/\beta \rceil \).

  • Step 2. Compute \(\theta _{jr}=(W_r)^{1/(k+1)}\left( {p}_{j}G_{j}r^{a_{j}}\right) ^{k/(k+1)}\), where

    $$\begin{aligned} W_r=\left\{ \begin{array} {ll} \delta n+\alpha (r-1)+\eta &{}\quad \mathrm{for}\,\, r=1, 2,\ldots , h; \\ n\gamma +\eta &{}\quad \mathrm{for}\,\, r=h+1, h+2,\ldots , l; \\ \beta (n-r+1)+\eta &{}\quad \mathrm{for}\,\, r=l+1,l+2,\ldots , n. \end{array} \right. \end{aligned}$$
  • Step 3. Solve the linear assignment problem P (i.e., Eqs. (15)–(18)) to determine the optimal schedule \(S^*\).

  • Step 4. Compute the optimal resources by Eq. (5).

  • Step 5. Compute the optimal processing times by Eq. (1).

  • Step 6. Set \(d^{1*}=C_{[h^*]}\) and \(D^*=C_{[l^*]}-C_{[h^*]}\).
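The steps of Algorithm 2.1 can be sketched as follows (illustrative Python, stdlib only). For simplicity the assignment problem (15)–(18) in Step 3 is solved here by brute-force enumeration of permutations, which is adequate for small instances; an \(O(n^3)\) Hungarian-method solver yields the stated complexity. The sketch assumes \(1\le h^*\le l^*\le n\).

```python
import itertools
import math

# Illustrative sketch of Algorithm 2.1; function and variable names are ours.
def algorithm_2_1(p, a, G, alpha, beta, gamma, delta, eta, k, C):
    n = len(p)
    # Step 1: indices h* and l* (Theorem 1)
    h = math.ceil(n * (gamma - delta) / alpha)
    l = math.ceil(n * (beta - gamma) / beta)
    # Positional weights W_r of Eq. (4) (list index r-1 holds position r)
    W = [delta * n + alpha * (r - 1) + eta if r <= h
         else n * gamma + eta if r <= l
         else beta * (n - r + 1) + eta
         for r in range(1, n + 1)]
    # Step 2: costs theta_{jr} of Eq. (13)
    theta = [[W[r] ** (1 / (k + 1)) * (p[j] * G[j] * (r + 1) ** a[j]) ** (k / (k + 1))
              for r in range(n)] for j in range(n)]
    # Step 3: optimal sequence by brute force; seq[r] is the job in position r+1
    seq = min(itertools.permutations(range(n)),
              key=lambda perm: sum(theta[perm[r]][r] for r in range(n)))
    # Step 4: optimal resources by Eq. (5)
    s = sum(W[r] ** (1 / (k + 1))
            * (G[seq[r]] * p[seq[r]] * (r + 1) ** a[seq[r]]) ** (k / (k + 1))
            for r in range(n))
    u = [W[r] ** (1 / (k + 1)) * (p[seq[r]] * (r + 1) ** a[seq[r]]) ** (k / (k + 1))
         * s ** (1 / k) / (C ** (1 / k) * G[seq[r]] ** (1 / (k + 1)))
         for r in range(n)]
    # Step 5: actual processing times by Eq. (1)
    pA = [(p[seq[r]] * (r + 1) ** a[seq[r]] / u[r]) ** k for r in range(n)]
    # Step 6: due window starting time d^{1*} and size D*
    comp = list(itertools.accumulate(pA))
    d1, D = comp[h - 1], comp[l - 1] - comp[h - 1]
    return seq, u, pA, d1, D
```

By construction (Theorem 2), the returned schedule satisfies the cost constraint with equality: \(\sum _{r=1}^n W_r p_{[r]}^A = C\).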

Theorem 4

The problem \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\) can be solved in \(O(n^3)\) time by Algorithm 2.1.

Proof

The correctness of the algorithm follows from Theorems 1, 2 and 3. It is well known that the linear assignment problem can be solved in \(O(n^3)\) time (i.e., Step 3 requires \(O(n^3)\) time); Step 2 requires \(O(n^2)\) time; Steps 1, 4, 5, and 6 can be performed in linear time. Thus the overall computational complexity of Algorithm 2.1 is \(O(n^3)\). \(\square \)

In order to illustrate Algorithm 2.1 for the problem

\(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\right| \sum _{j=1}^nG_ju_j\), we will solve the following instance:

Example 2.1

Data: \(n=7\), \(k=1.5\), \(\alpha =10, \beta =17, \delta =4, \gamma =6,\eta =1\), \(C=300\), and the other data are given in Table 1.

Table 1 The data of Example 2.1

Solution:

  • Step 1. By Theorem 1, we have \(h^*=\lceil n(\gamma -\delta )/\alpha \rceil =\lceil 7(6-4)/10\rceil =2\) and \(l^*=\lceil n(\beta -\gamma )/\beta \rceil =\lceil 7(17-6)/17\rceil =5\).

  • Step 2. \(W_1=29,W_2=39,W_3=W_4=W_5=43,W_6=35,W_7=18\). The values \(\theta _{jr}=(W_r)^{1/(k+1)}\left( {p}_{j}G_{j}r^{a_{j}}\right) ^{k/(k+1)}\) are given in Table 2.

  • Step 3. Solving the linear assignment problem P (i.e., Eqs. (15)–(18)), we obtain \(S^*=(J_1\rightarrow J_7\rightarrow J_6\rightarrow J_4\rightarrow J_2\rightarrow J_5\rightarrow J_3)\) (shown in bold in Table 2).

  • Step 4. From Eq. (5), we have

    $$\begin{aligned} u_1^*= & {} {\frac{(W_1)^{1/(k+1)}\left( {p}_{[1]}1^{a_{[1]}}\right) ^{k/(k+1)}\left( \sum _{j=1}^n(W_j)^{1/(k+1)}\left( G_{[j]}\right) ^{k/(k+1)} \left( {p}_{[j]}j^{a_{[j]}}\right) ^{k/(k+1)}\right) ^{1/k}}{C^{1/k}(G_{[1]})^{1/(k+1)}}}=13.4661,\\ u_7^*= & {} 5.4778, u_6^*= 5.7616, u_4^*=10.7360, u_2^*=14.2249, u_5^*=4.8422, u_3^*=7.1504. \end{aligned}$$
  • Step 5. From Eq. (1), we have \(p_1^A=(35\times 1^{(-0.05)}/13.4661)^{1.5}=4.1902\), \(p_7^A= 2.7742\), \(p_6^A=3.5976\), \(p_4^A= 1.5533\), \(p_2^A=1.2742\), \(p_5^A=3.3253\), \(p_3^A=4.8576\).

  • Step 6. Set \(d^{1*}=C_{[h^*]}=C_{[2]}=4.1902+2.7742=6.9644\) and \(D^*=C_{[l^*]}-C_{[h^*]}=C_{[5]}-C_{[2]}=3.5976+1.5533+1.2742=6.4251\).

Table 2 The \(\theta _{jr}\) values of Example 2.1

3 A special case

If \(a_j=a\) for all jobs, then from (11) we have

$$\begin{aligned} \sum _{j=1}^nG_ju_j= & {} C^{-1/k}\left( \sum _{j=1}^n(W_j)^{1/(k+1)} \left( G_{[j]}{p}_{[j]}j^{a}\right) ^{k/(k+1)}\right) ^{1+1/k}\nonumber \\= & {} C^{-1/k}\left( \sum _{j=1}^n\mu _j\nu _{[j]}\right) ^{1+1/k}, \end{aligned}$$
(19)

where

$$\begin{aligned} \mu _j= & {} (W_j)^{1/(k+1)}\left( j\right) ^{ak/(k+1)}, \end{aligned}$$
(20)
$$\begin{aligned} \nu _{[j]}= & {} \left( {p}_{[j]}G_{[j]}\right) ^{k/(k+1)}, \end{aligned}$$
(21)

where \(W_j\) is calculated by Eq. (4).

Theorem 5

Problem \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\) can be solved in \(O(n\log n)\) time.

Proof

An optimal solution to the problem

\(1\left| p_{j}^A=\left( \frac{{p}_jr^{a}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d^1 +\gamma D)+\eta C_{\max } \le C\right| \sum _{j=1}^nG_ju_j\) can be constructed as follows: calculate the indices \(h^*\) and \(l^*\) (according to Theorem 1), and then calculate \(\mu _j=(W_j)^{1/(k+1)}\left( j\right) ^{ak/(k+1)}\). Assign the smallest \(\mu _j\) to the job with the largest \(\nu _{j}=\left( {p}_{j}G_{j}\right) ^{k/(k+1)}\), the second smallest \(\mu _j\) to the job with the second largest \(\nu _{j}\), and so on. This matching procedure requires \(O(n\log n)\) time (Hardy et al. [3]). Denote the optimal sequence determined in this way by \(S^*=(J_{[1]}, J_{[2]}, \ldots ,J_{[n]})\); then calculate the optimal resources by Eq. (5), the optimal processing times by Eq. (1), and set \(d^{1*}=C_{[h^*]}\) and \(D^*=C_{[l^*]}-C_{[h^*]}\). \(\square \)
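The matching procedure above is an instance of the rearrangement inequality: pairing the ascending \(\mu \)-values with the descending \(\nu \)-values minimizes \(\sum \mu _j\nu _{[j]}\). A short sketch (function name illustrative):

```python
# Illustrative O(n log n) matching for the special case a_j = a: pair the
# smallest positional multiplier mu_j with the largest job term nu_j
# (Eqs. (20)-(21)); W are the positional weights of Eq. (4).
def match_special_case(p, G, W, a, k):
    n = len(p)
    mu = sorted(((W[j] ** (1 / (k + 1)) * (j + 1) ** (a * k / (k + 1)), j)
                 for j in range(n)))                     # ascending mu
    nu = sorted((((p[i] * G[i]) ** (k / (k + 1)), i)
                 for i in range(n)), reverse=True)       # descending nu
    seq = [None] * n
    for (_, pos), (_, job) in zip(mu, nu):
        seq[pos] = job   # job with the largest nu takes the smallest-mu position
    return seq           # seq[r] is the job in position r+1
```

For small instances the result can be checked against brute-force enumeration of all permutations.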

4 Extensions

4.1 Extension 1

Similar to Sects. 2 and 3, the proposed model can be extended to the slack due window scheduling problem (Li et al. [5]) \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j^1 +\gamma D_j)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\), where \([d_j^1,d_j^2]=[p_j+q^1, p_j+q^2]\) is the due-window of job \(J_j\), \(D_j=d_j^2-d_j^1=q^2-q^1\) is the due-window size, and both \(q^1\) and \(q^2\) are decision variables.

Theorem 6

The problem \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j^1 +\gamma D_j)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\) can be solved in \(O(n^3)\) time.

Theorem 7

Problem \(1\Big |p_{j}^A=\left( \frac{{p}_jr^{a}}{u_j}\right) ^k, \sum _{j=1}^n(\alpha E_j+\beta T_j+\delta d_j^1 +\gamma D_j)+\eta C_{\max } \le C\Big |\sum _{j=1}^nG_ju_j\) can be solved in \(O(n\log n)\) time.

4.2 Extension 2

Similar to Sects. 2 and 3, the proposed model can be extended to a large set of scheduling problems, i.e., \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^nW_jp_{[j]}^A \le C\right| \sum _{j=1}^nG_ju_j\), where \(W_j\) is a general job- and position-dependent penalty for the job scheduled in the jth position.

Theorem 8

The problem \(1\left| p_{j}^A=\left( \frac{{p}_jr^{a_j}}{u_j}\right) ^k, \sum _{j=1}^nW_jp_{[j]}^A \le C\right| \sum _{j=1}^nG_ju_j\) can be solved in \(O(n^3)\) time.

Theorem 9

Problem \(1\left| p_{j}^A{=}\left( \frac{{p}_jr^{a}}{u_j}\right) ^k, \sum _{j=1}^nW_jp_{[j]}^A {\le } C\right| \sum _{j=1}^nG_ju_j\) can be solved in \(O(n\log n)\) time.

4.3 Extension 3

In real production processes, the learning effect may not reduce the jobs’ processing times without limit (Wu et al. [17, 18]); hence Wang et al. [15] considered the truncated job-dependent learning effect model, i.e., \(p_{j}^A={p}_j\max \{r^{a_j},B\}\), where \(0<B\le 1\) is a truncation parameter common to all jobs. Similar to Sect. 3, the proposed model can be extended to the following model: \(p_{j}^A=\left( \frac{{p}_j\max \{r^{a_j},B\}}{u_j}\right) ^k\).
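The truncation changes only the position factor in the processing-time model; a minimal sketch (hypothetical numbers) shows that once \(r^{a_j}\) falls below B, later positions no longer shorten the job:

```python
# Truncated learning-effect time of Extension 3 (illustrative):
# p_j^A = (p_j * max(r**a_j, B) / u_j) ** k, with truncation 0 < B <= 1.
def truncated_time(p, a, u, k, r, B):
    return (p * max(r ** a, B) / u) ** k

# Here 5**-0.5 and 9**-0.5 are both below B = 0.6, so both are truncated
# to B and the two positions yield identical processing times.
t5 = truncated_time(p=10.0, a=-0.5, u=2.0, k=1.5, r=5, B=0.6)
t9 = truncated_time(p=10.0, a=-0.5, u=2.0, k=1.5, r=9, B=0.6)
assert t5 == t9
```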

Theorem 10

The problem \(1\left| p_{j}^A=\left( \frac{{p}_j\max \{r^{a_j},B\}}{u_j}\right) ^k, \sum _{j=1}^nW_jp_{[j]}^A \le C\right| \sum _{j=1}^nG_ju_j\) can be solved in \(O(n^3)\) time.

Theorem 11

Problem \(1\left| p_{j}^A=\left( \frac{{p}_j\max \{r^{a},B\}}{u_j}\right) ^k, \sum _{j=1}^nW_jp_{[j]}^A \le C\right| \sum _{j=1}^nG_ju_j\) can be solved in \(O(n\log n)\) time.

5 Conclusions

In this paper, we have considered a scheduling problem with learning effects and resource-dependent processing times on a single machine. We showed that the due window assignment minimization problem can be solved in polynomial time. For a special case of the problem, we also gave a lower-order algorithm. The algorithms can also be easily applied to problems with a deterioration (aging) effect (e.g., \(a_j>0\)). For future research, it is worthwhile to study other scheduling problems with due window assignment, effects of deterioration and truncated job-dependent learning (Niu et al. [10]) and/or resource allocation, for example, flow shop scheduling problems and other scheduling performance measures.