
1 Introduction and Literature Review

Increasing customer demand in the market calls into question the efficiency of mass production. A flexible manufacturing system (FMS) must succeed by meeting this rapidly growing market competition. An efficient FMS must fulfil the demands of a market where customized products are gaining importance, and it should therefore offer highly reliable flexibility. Scheduling optimization is one of the most important problems in an FMS, as it determines the efficiency and capability of the system. It is highly complex and differs from conventional scheduling because of the routing flexibility available in the FMS. Scheduling is critical because its primary goal is to minimize the time taken to finish a product within a deadline. The level of difficulty of sequencing and scheduling an FMS depends on the type of FMS, its process constraints and the performance parameters. Scheduling is an NP-hard (nondeterministic polynomial time) problem and is complex both time-wise and computationally.

Scheduling is the problem of allocating jobs to particular machines with given processing times and a specific sequence of operations. It has to be ensured that machines are available without breakdown and that each operation is processed without interruption [1]. The scheduling problem in an FMS is more intricate than conventional job shop scheduling, as jobs must also be allocated among the set of available machines. It is a well-known NP-hard problem.

1.1 Earlier Research

FMS problems can be classified as (a) design problems, (b) planning problems, (c) scheduling problems and (d) control problems. Stecke [2] proposed a framework to study planning and scheduling problems of FMSs. FMS problems are solved by traditional techniques and by non-traditional techniques such as meta-heuristics. Since our proposed approach uses meta-heuristic techniques, the review focuses mainly on scheduling problems solved with meta-heuristics. Zhao et al. [3] presented a genetic algorithm based on the concepts of virtual and real operations; chromosome coding and the genetic operators of the GA are defined during problem solving, and a minimum weighted tardiness objective function defines code fitness, which is used for selecting species and producing a new generation of codes. Kumar et al. [4] used ACO on a graph-based representation of the FMS scheduling problem, studied the performance of various parameters and validated it. Noorul Haq et al. [5] considered production and MHS scheduling in combination and studied different FMS scheduling decisions using GA and SA to check their efficiency. Jerald et al. [6] proposed four approaches, PSO, GA, SA and a memetic algorithm (MA), for a large variety of problems with 10–43 jobs and 8–16 machines; PSO showed superior performance over the other techniques. Gnavel Babu et al. [7] made the AGV part of the scheduling problem and demonstrated the effectiveness of differential evolution in obtaining optimal solutions for the simultaneous scheduling problem. Udhayakumar and Kumanan [8] adopted the extended Giffler and Thompson procedure within an ACO approach to generate an optimal schedule; their algorithm was found to be superior, as its results improved on other techniques reported in the literature. Sreedhar Kumar et al. [9] solved a combined objective function with a bacterial foraging optimization algorithm (BFOA); they moderated the penalty by including a reward and evaluated the effectiveness of the objective function. Nidish Mathew Nidhry et al. [10] attempted a multi-objective problem with 16 machines and 80 job varieties and showed that their algorithm performs better than other algorithms. Reddy and Rao [11] considered the simultaneous scheduling of machines and automated guided vehicles for multiple objectives; they tested the proposed hybrid GA on benchmark problems from the literature and a few case studies and found it to perform better. Burnwal and Deb [12] developed a modified cuckoo search to solve a 43-job, 16-machine problem and showed that CS requires less time to converge and has the potential for further exploration. Chawla et al. [13] investigated the simultaneous scheduling of AGV workload and travel time, considered multiple objectives and obtained the optimal schedule with the grey wolf optimization algorithm (GWO).

The literature review shows that various meta-heuristics have been applied to optimize benchmark problems. Although many meta-heuristic techniques have succeeded in finding optimal solutions, there is still scope for better results when meta-heuristic techniques are hybridized. One such attempt is made in this article: an amalgamation of the genetic algorithm (GA), particle swarm optimization (PSO) and tabu search (TS), christened GAPSOTS. The main advantage of this hybrid technique is that it combines one global search method with two local search methods, thereby preventing entrapment in local optima. GA has a good probability of finding better global solutions, PSO is well suited to combinatorial optimization, and TS gives a better neighbourhood search. The GAPSOTS algorithm is applied to generate optimal schedules for an FMS environment taken from the literature for combined objectives that conflict with each other. The FMS environment considered in this paper covers two set-ups: a 43-job, 16-machine problem and an 80-job, 16-machine problem. The outcomes are compared with existing optimal values reported in the literature for the same problems, and the results and future scope are discussed in the results section.

2 Problem Definition

The FMS set-up considered in this work, the assumptions made and objective of the present work are adopted from [6] and are presented in the following points.

  1.

    The FMS set-up taken in this work is shown in Fig. 1. It has five flexible machining cells (FMCs), with two to six computer numerical control (CNC) machines per FMC, an independent tool magazine, an automatic tool changer (ATC) and an automatic pallet changer (APC). One to three dedicated robots handle the intra-cell movement of resources among the jobs. A loading station and an unloading station are dedicated exclusively to releasing parts in batches when required and to moving finished jobs to storage. An automatic storage and retrieval system (AS/RS) is used to stock incomplete jobs. The five FMCs are connected by two identical AGVs, which move unfinished products between the AS/RS and the FMCs and move finished products to the unloading dock.

  2.

    The following assumptions are made: (i) the number of products varies from 40 to 80 jobs for a specific mix of tools in the tool magazines; (ii) each job must be processed in a specific order, with a given lot size and due date, and a penalty cost is imposed when the due date is not met; (iii) every job follows a particular sequence on particular machines for specified processing times. The product mix with processing times is shown in Table 2.

  3.

    This study aims to solve a combined optimization problem that minimizes the total machine idle time and the total penalty cost.

Combined Objective Function (COF)

$$ {\text{COF}} = \omega_{1} \times \frac{\text{TPC}}{\text{Max PP}} + \omega_{2} \times \frac{\text{TMI}}{\text{Max TE}} $$
(1)

where

ω1 and ω2: weights assigned to the two objectives; a value of 0.5 is used for both.

TPC: total penalty cost incurred.

TMI: total machine idle time.

Max PP: maximum permissible penalty.

Max TE: maximum elapsed time of the machine.
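As a minimal illustration of Eq. (1), the following Python sketch evaluates the combined objective for a candidate schedule. The function and variable names, as well as the numbers in the example, are illustrative assumptions and not taken from the authors' MATLAB implementation.

```python
# Minimal sketch of the combined objective function of Eq. (1).
# All names and the example numbers are illustrative, not the authors' code.

def combined_objective(tpc, tmi, max_pp, max_te, w1=0.5, w2=0.5):
    """COF = w1 * TPC / Max PP + w2 * TMI / Max TE."""
    return w1 * tpc / max_pp + w2 * tmi / max_te

# Hypothetical values: total penalty cost 1200 of a permissible 5000,
# total machine idle time 340 out of a maximum elapsed time of 900.
print(combined_objective(tpc=1200, tmi=340, max_pp=5000, max_te=900))  # ~0.309
```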

Fig. 1 FMS structure

3 The Proposed Hybrid Algorithm (GAPSOTS)

3.1 GAPSOTS Algorithm Design

An effective multi-objective scheduling approach, GAPSOTS, is applied; it combines the genetic algorithm (GA), particle swarm optimization (PSO) and tabu search (TS). The genetic algorithm with multiple objectives is used to resolve multiphase process scheduling in the FMS setting; PSO is then applied to optimize the scheduling process, and TS helps in solving the combinatorial optimization issues. This hybrid, multi-objective method is designed to handle flexible job scheduling problems with several conflicting goals. Experimental studies are used to validate the method, and a comparative analysis against results reported for the same problems demonstrates the flexibility and strength of the present model. The proposed algorithms were implemented and tested in MATLAB R2016b (or later) on an Intel Core™ i7 machine running Windows 10.

The genetic algorithm (GA) is an adaptive heuristic search technique that finds suitable answers to search and optimization problems on the basis of a fitness function. It is a subclass of evolutionary algorithms, which solve optimization problems with nature-inspired operators such as reproduction, crossover and mutation. In a GA, a string of finite length called a chromosome represents an individual solution. The positions in a chromosome are known as genes, and discrete values are assigned to the genes according to the solution. Crossover and mutation are the genetic operators used to modify chromosomes in order to create a new generation of children that carry characteristics of both parents. Every iteration moves the population towards the multi-objective optimum.

Particle swarm optimization mimics the flocking behaviour of birds. In PSO, the candidate solutions are termed particles, which move through the problem search space. Each particle is assigned a fitness value, evaluated with the fitness function to be optimized, and a velocity that directs its path. The particles fly through the problem space following the current best solutions. PSO is initialized randomly and moves towards the optimum by updating the particles at each iteration. Each updated particle tracks two best values: "pbest", the best solution achieved by that particular particle so far, and "gbest", the best solution achieved by any particle in the whole swarm (the global best).

Tabu search (Glover and Laguna, 1997) is a meta-heuristic that has been applied effectively to several multi-objective scheduling optimization problems. It is a popular local search algorithm that prevents entrapment in local optima. The TS algorithm is fundamentally a neighbourhood approach and offers a way to tackle hard combinatorial optimization problems, helping the search escape local optima. The transfer of the present solution to an adjoining solution is termed a move, and the neighbouring solution leading towards the optimum is reached through such a move.

3.2 The Framework of GAPSOTS

The framework of our hybrid technique GAPSOTS is given as follows:

  • Step 1: Set the parameters.

  • Step 2: Generate the initial population using GA.

  • Step 3: Evaluate the fitness function.

  • Step 4: Has the termination criterion been met?

  • If yes, end the program and display the results.

  • Else go to Step 5.

  • Step 5: Generate a new population.

  • Step 5.1: Apply the genetic operators.

  • Step 5.2: Apply PSO to obtain quality particles and update the global best solution.

  • Step 5.3: Check the termination criterion.

  • If yes, end the program.

  • Else go to Step 5.4.

  • Step 5.4: Generate the neighbourhood.

  • Step 5.5: If the aspiration criterion is met, replace the current solution and update the tabu list.

  • Step 5.6: Repeat Steps 5–5.5 until the termination condition is satisfied or the maximum number of iterations is reached.

The workflow process of GAPSOTS is given in Fig. 2, and a minimal code sketch of this loop is given after the figure. A detailed explanation of the proposed GAPSOTS is given in the following paragraphs.

Fig. 2 Workflow process for proposed GAPSOTS
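The following Python-style sketch condenses Steps 1–5.6 into a single loop, using a toy permutation problem with trivial stand-in operators so that the flow can be run end to end. The operator stubs and names are placeholders for illustration only; they are not the GA, PSO and TS operators detailed in the next subsections.

```python
import random

# Skeleton of the GAPSOTS loop (Steps 1-5.6). The operator functions are passed
# in as arguments; the stubs in the demo below are trivial placeholders, not the
# GA/PSO/TS operators described in Sects. 3.3 and 3.4.

def gapsots(init_population, evaluate, ga_step, pso_step, ts_step, max_iter=100):
    population = init_population()                  # Step 2: initial population (GA)
    best = min(population, key=evaluate)            # Step 3: evaluate fitness (COF)
    for _ in range(max_iter):                       # Steps 4 / 5.6: terminate by iteration count
        population = ga_step(population)            # Step 5.1: selection, crossover, mutation
        population, gbest = pso_step(population)    # Step 5.2: PSO update of the particles
        gbest = ts_step(gbest)                      # Steps 5.4-5.5: neighbourhood search + tabu list
        if evaluate(gbest) < evaluate(best):        # keep the incumbent best schedule
            best = gbest
    return best

# Toy demo on a 5-job permutation with a dummy cost, just to show the flow.
jobs = list(range(5))
cost = lambda seq: sum(i * j for i, j in enumerate(seq))
result = gapsots(
    init_population=lambda: [random.sample(jobs, len(jobs)) for _ in range(10)],
    evaluate=cost,
    ga_step=lambda pop: [random.sample(jobs, len(jobs)) for _ in pop],
    pso_step=lambda pop: (pop, min(pop, key=cost)),
    ts_step=lambda s: s,
    max_iter=50,
)
print(result, cost(result))
```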

3.3 Genetic Operators for GA

To address the optimization problem, the genetic operators used must be rich enough to produce excellent individuals in the population. Selection, crossover and mutation are the three important genetic operators.

3.3.1 Reproduction

The rank selection method is employed for reproduction. In rank selection, the entire population is first ranked and the chromosomes are assigned fitness values: the chromosome with the lowest fitness receives rank 1, the next receives rank 2, and the best receives rank N. The main advantage of rank selection is that it prevents the algorithm from converging prematurely. Another advantage is that importance is given to ranks rather than to raw fitness values, so the selection pressure is maintained. The method also helps maintain diversity, thereby enhancing the search. The fitness values assigned in this paper range from 0.4 to 1.6.

Once every chromosome has been ranked, reproduction is carried out in the MATLAB simulation using random numbers.
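A small Python sketch of rank selection is given below. The linear spread of rank values over the 0.4–1.6 range and the roulette-wheel sampling on those values are assumptions made for illustration; they are one plausible reading of the procedure described above, not the authors' MATLAB routine.

```python
import random

# Sketch of rank selection: chromosomes are ranked by raw fitness (worst rank 1,
# best rank N) and selection weights are rank-based values spread linearly over
# the 0.4-1.6 range mentioned in the text. The linear spread and roulette-wheel
# sampling are illustrative assumptions.

def rank_select(population, raw_fitness, n_parents, lo=0.4, hi=1.6):
    n = len(population)
    order = sorted(range(n), key=lambda i: raw_fitness[i])   # worst individual first
    rank_value = {idx: lo + (hi - lo) * r / (n - 1)          # rank r -> value in [lo, hi]
                  for r, idx in enumerate(order)}
    weights = [rank_value[i] for i in range(n)]
    # Selection probabilities depend on rank, not raw fitness, which keeps the
    # selection pressure moderate and preserves diversity.
    return random.choices(population, weights=weights, k=n_parents)

pop = ["A", "B", "C", "D", "E"]
raw = [0.82, 0.31, 0.55, 0.91, 0.47]          # raw fitness values (larger = fitter)
print(rank_select(pop, raw, n_parents=4))
```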

3.3.2 Crossover

Crossover is applied after reproduction is over, using the strings placed in the mating pool. It is essentially a convergence operation that is expected to pull the population towards a local minimum or maximum. The crossover proceeds as follows:

  (1)

    Randomly select several positions in parents R1 and R2 and exchange the genes at those positions; genes at all other locations remain unchanged.

  (2)

    Check for conflicts and delete any gene in the original parent that duplicates a gene at an exchanged position.

  (3)

    Fill the resulting vacancies sequentially with the unused genes.
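A Python sketch of steps (1)–(3) for permutation-encoded chromosomes is shown below. The interpretation of the conflict-repair step, and the choice of two exchange positions in the demo, are assumptions for illustration.

```python
import random

# Sketch of the crossover in steps (1)-(3) for permutation-encoded chromosomes:
# genes at randomly chosen positions are copied from R2 into a copy of R1,
# duplicates created outside those positions are removed, and the vacancies are
# refilled sequentially with the genes that went missing. The repair details
# are an illustrative interpretation of the description above.

def crossover(r1, r2, n_points=2):
    size = len(r1)
    points = set(random.sample(range(size), n_points))   # (1) exchange positions
    child = list(r1)
    for p in points:
        child[p] = r2[p]                                  # take R2's gene at position p
    swapped_in = {r2[p] for p in points}
    for i in range(size):                                 # (2) delete conflicting duplicates
        if i not in points and child[i] in swapped_in:
            child[i] = None
    missing = [g for g in r1 if g not in child]           # genes no longer present
    it = iter(missing)
    return [g if g is not None else next(it) for g in child]   # (3) fill vacancies in order

print(crossover([1, 2, 3, 4, 5, 6], [6, 5, 4, 3, 2, 1]))
```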

3.3.3 Mutation

Mutation is a divergence operation: it tweaks a chromosome slightly in order to maintain diversity. Mutation represents the "exploration" of the search space; its purpose is to periodically move at least one individual of the population out of a local minimum/maximum region so that a better region may be found. To implement mutation, a random variable is generated for each bit in a sequence; this variable determines whether that particular bit is modified. The main objective of mutation is to prevent the chromosome population from becoming too similar, thereby avoiding entrapment in local optima.

The mutation operator is also used to boost the local search capability while maintaining the population diversity ratio. The mutation probability is given by

$$ {\text{Mutate-}}p = k_{3} \frac{f_{\max } - f_{i} }{f_{\max } - f_{\text{av}} }\quad {\text{if}}\;f_{i} \ge f_{\text{av}} $$
(2)
$$ {\text{Mutate-}}p = k_{4} \quad {\text{if}}\;f_{i} < f_{\text{av}} $$
(3)

where

Mutate-p: mutation probability.

f_max: maximum fitness value in the population.

f_av: average fitness value of the entire population.

k3 and k4: constants, with 0 < k3 < 1 and 0 < k4 < 1.

f_i: fitness value of the individual that undergoes the mutation operator.
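The adaptive mutation probability of Eqs. (2) and (3) can be sketched in Python as below; the values of k3 and k4, the swap move used as the mutation itself, and the guard for a uniform population are illustrative assumptions.

```python
import random

# Sketch of the adaptive mutation probability of Eqs. (2)-(3), applied as a
# simple swap mutation on a permutation chromosome. k3, k4, the swap move and
# the guard against f_max == f_av are illustrative choices, not from the paper.

def mutation_probability(f_i, f_max, f_av, k3=0.5, k4=0.5):
    if f_i < f_av:
        return k4                                  # Eq. (3): weaker individuals mutate at a fixed rate
    if f_max == f_av:
        return 0.0                                 # guard: population already uniform (assumption)
    return k3 * (f_max - f_i) / (f_max - f_av)     # Eq. (2): fitter individuals mutate less

def maybe_mutate(chromosome, f_i, f_max, f_av):
    if random.random() < mutation_probability(f_i, f_max, f_av):
        i, j = random.sample(range(len(chromosome)), 2)
        chromosome = list(chromosome)
        chromosome[i], chromosome[j] = chromosome[j], chromosome[i]   # swap two genes
    return chromosome

print(mutation_probability(f_i=0.9, f_max=1.0, f_av=0.8))   # 0.25
print(maybe_mutate([1, 2, 3, 4, 5], f_i=0.6, f_max=1.0, f_av=0.8))
```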

3.4 Particle Swarm Optimization (PSO) and Tabu Search (TS)

Particle swarm optimization (PSO) is an evolution-based meta-heuristic developed by Eberhart and Kennedy (1995). The mutation performed in the GA can yield good results and is vital for maintaining population diversity, but to further improve the performance of the optimal strategies we apply PSO and the tabu search algorithm.

In PSO, each individual solution is a "bird" in the search space, referred to as a particle. Every particle has a fitness value, estimated with the fitness function to be optimized, and a velocity that guides its movement. The particles fly through the problem space following the current particles with the best fitness. The algorithm starts with a set of random particles (solutions) and then searches for better values by updating the generations. At every iteration, each particle is updated using two best values: the particle best, pbest, the best fitness attained by that particle so far, and the global best, gbest, the best value attained by any particle of the swarm. At every step, the velocity of every particle is adjusted to pull it towards pbest and gbest, and the fitness function is evaluated at the new position obtained from the fresh velocity and location.

Scheduling optimization in flexibility-oriented manufacturing requires the best solution with respect to both ordering and selection. The PSO and TS algorithms are therefore used to handle these combinatorial optimization issues. The methodology is applied to randomly created instances and to practical applications to exhibit the superiority of the suggested method.

The movement of a particle towards the optimal value is given by Eqs. 4 and 5.

$$ v_{id} = \omega \times v_{id} + G_{1} \times r_{1} \times \left( p_{idk}^{\text{best}} - p_{idk} \right) + G_{2} \times r_{1} \times \left( p_{gd}^{\text{best}} - p_{idk} \right) $$
(4)
$$ p_{idk} = p_{idk} + v_{id} $$
(5)

In Eq. (4), \(v_{id}\) is the distance moved by the idk-th particle in one iteration, bounded to [−Mmax; Mmax], where Mmax is the maximum distance a particle can move in one step. "\(\omega\)" is the inertia weight, a variable that scales the distance moved by a particle in each step. G1 is the self-learning factor, meaning a particle learns from its own knowledge \(p_{idk}^{\text{best}}\). G2 is the social learning factor, meaning a particle learns from the knowledge of the entire swarm, \(p_{gd}^{\text{best}}\). r1 is a random number drawn from a uniform distribution over [0, 1], and \(p_{idk}\) denotes the position of the idk-th particle.

\(p_{idk}^{\text{best}}\) denotes the personal best value of the idk-th particle, and \(p_{gd}^{\text{best}}\) denotes the global best value over all particles of the swarm. The new velocity of the idk-th particle is obtained from Eq. 4, and the position update from Eq. 5.

A particle might leave the search area; to prevent this, a maximum velocity limit is applied in the PSO so that the velocity remains within a range. This allows the particle to move freely within a constrained search area, as given by Eq. 6.

$$ v_{lmn}^{t} = \min \left\{ {M_{{{\text{max}}}} ;\max \left\{ { - M_{{{\text{max}}}} ;v_{lmn}^{t} } \right\}} \right\},\quad m = 1, \ldots ,M;\;l = 1, \ldots ,L $$
(6)

A larger inertia weight ω favours global exploration, while a smaller inertia weight favours local exploration and thus faster convergence. To strike a balance between exploration and exploitation, an inertia weight that decreases as the algorithm evolves is adopted (Eq. 7); it provides good global search ability in the early iterations and good local search ability in the later ones.

$$ \omega \left[ t \right] = \left( {\omega_{{{\text{initial}}}} - \omega_{{{\text{final}}}} } \right).\left( {\frac{g - t}{g}} \right)^{m} + \omega_{{{\text{final}}}} $$
(7)

where

\(\omega_{{{\text{initial}}}}\): initial inertia weight.

\(\omega_{{{\text{final}}}}\): final inertia weight, with \(\omega_{{{\text{initial}}}} > \omega_{{{\text{final}}}}\).

g: maximum number of iterations.

\(m \in \left[ {0.6;1.4} \right]\): nonlinear index [14, 15].
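The velocity and position updates of Eqs. (4)–(7) can be sketched in Python as follows, for a continuous position vector. Drawing two independent random numbers per dimension, the parameter values and the omission of the mapping from continuous positions back to a job sequence are assumptions made for illustration.

```python
import random

# Sketch of one PSO update: Eq. (4) velocity, Eq. (6) clamping to [-M_max, M_max],
# Eq. (5) position update and Eq. (7) decreasing inertia weight. Two independent
# random numbers r1, r2 are drawn per dimension here (the paper writes r1 for
# both terms); parameter values and the continuous encoding are assumptions.

def inertia(t, g, w_initial=0.9, w_final=0.4, m=1.0):
    """Eq. (7): inertia weight decreasing from w_initial to w_final over g iterations."""
    return (w_initial - w_final) * ((g - t) / g) ** m + w_final

def pso_step(x, v, p_best, g_best, t, g, G1=2.0, G2=2.0, M_max=4.0):
    w = inertia(t, g)
    new_x, new_v = [], []
    for xi, vi, pb, gb in zip(x, v, p_best, g_best):
        r1, r2 = random.random(), random.random()
        vi = w * vi + G1 * r1 * (pb - xi) + G2 * r2 * (gb - xi)   # Eq. (4)
        vi = min(M_max, max(-M_max, vi))                          # Eq. (6): velocity clamping
        new_v.append(vi)
        new_x.append(xi + vi)                                     # Eq. (5): position update
    return new_x, new_v

x = [0.5, 2.0, -1.0]
v = [0.0, 0.0, 0.0]
print(pso_step(x, v, p_best=[1.0, 1.5, -0.5], g_best=[0.8, 1.2, -0.2], t=10, g=100))
```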

Tabu search (TS), introduced by Glover in 1986, then overcomes local optimality by selecting, at each iteration, the solution with the highest evaluation according to the chosen evaluation function. The initial population of the tabu search is formed from the result of PSO together with some randomly introduced solutions. The next step is the neighbourhood structure, in which the existing solutions are modified to obtain new solutions; moves are accepted on the basis of a probability function. A tabu list stores the attributes of accepted moves so that they are avoided ("taboo") in later iterations, and in each new neighbourhood the best neighbour that is not tabu is chosen. This avoids cyclic movements. A strategy called forbidding is employed to control and update the tabu list. The parameters used for the hybrid technique GAPSOTS are given in Table 1.
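A minimal Python sketch of the tabu search step described above is given below. The swap-move neighbourhood, the tabu tenure, the number of neighbours per iteration and the toy cost function are illustrative assumptions; in GAPSOTS the evaluation would be the combined objective of Eq. (1).

```python
import random
from collections import deque

# Sketch of the tabu search refinement: swap-move neighbourhood, acceptance of
# the best non-tabu neighbour, a fixed-length (FIFO) tabu list of move
# attributes ("forbidding"), and an aspiration criterion that admits a tabu
# move if it improves on the best solution found so far. Neighbourhood size,
# tenure and the toy cost are illustrative assumptions.

def tabu_search(start, evaluate, n_iter=200, n_neighbours=20, tenure=7):
    current = best = list(start)
    tabu = deque(maxlen=tenure)                               # tabu list of recent moves
    for _ in range(n_iter):
        candidates = []
        for _ in range(n_neighbours):
            i, j = sorted(random.sample(range(len(current)), 2))
            neighbour = list(current)
            neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
            candidates.append(((i, j), neighbour))
        candidates.sort(key=lambda c: evaluate(c[1]))          # best neighbour first
        for move, neighbour in candidates:
            aspiration = evaluate(neighbour) < evaluate(best)  # aspiration criterion
            if move not in tabu or aspiration:
                current = neighbour
                tabu.append(move)                              # update tabu list
                if evaluate(current) < evaluate(best):
                    best = current
                break
    return best

# Toy demo: order a 5-element permutation by minimising its number of inversions.
inversions = lambda s: sum(a > b for k, a in enumerate(s) for b in s[k + 1:])
print(tabu_search([4, 2, 5, 1, 3], inversions))
```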

Table 1 Parameters of proposed hybrid technique—GAPSOTS
Table 2 Part mix with due date and penalty cost

4 Results and Discussions

The FMS problem considered in this work is taken from the existing literature [6]. The hybrid procedure GAPSOTS is applied to the 43-job and 80-job, 16-machine problems using the combined objective optimization method. The problem matrix is given in Table 2. MATLAB (version R2016b and above) is used for programming, which helps in solving complex scheduling problems. TOMLAB/CPLEX is also used as a subproblem solver, since it handles mixed-integer linear and quadratic programming (MILP, MIQP) and linear and quadratic programming (LP, QP) quickly. A feasible schedule has been achieved for the combinatorial scheduling problem using the above hybrid approach. A comparison between the proposed GAPSOTS and other algorithms from the literature, namely GA, SA, the memetic algorithm and cuckoo search, is shown in Table 3 and in Fig. 3 for the 43-job, 16-machine problem and Fig. 4 for the 80-job, 16-machine problem. The proposed algorithm was run for different numbers of iterations: with few iterations the results were poor, but they improved as the number of iterations increased. The run was stopped at 800 iterations, as there was no significant change in the optimal value thereafter.

Table 3 Performance comparison of GAPSOTS with various techniques
Fig. 3 Comparison of COF for 43 jobs by different approaches

Fig. 4 Comparison of COF for 80 jobs by different approaches

5 Conclusion

The following points summarize the authors' contributions in this paper.

  • A new hybrid technique named GAPSOTS has been developed by hybridizing GA, PSO and TS. A global search technique has been combined with two local search methods, thereby offsetting the drawbacks of the individual techniques.

  • The developed technique was tested on benchmark problems taken from the literature, consisting of a 43-job × 16-machine problem and an 80-job × 16-machine problem.

  • A rarely addressed combined optimization problem, minimizing the total penalty cost and the total machine idle time, has been evaluated.

  • Optimal results were obtained with less computational time, and the results were found to be superior to other approaches in the literature. The improvement is modest, yet the results are promising enough to explore other adaptations of GA, PSO and TS.

  • The main observation is that a better optimal value was obtained with a smaller number of generations. The adaptations made to the genetic operators prevented premature convergence of the algorithm.

  • In future work, the proposed GAPSOTS can be tested for integrated FMS scheduling and implemented for FMSs with different set-ups.