Abstract
The whale optimization algorithm is a recent member of the family of nature inspired optimization algorithms, inspired by the foraging behaviour of humpback whales. Similar to other heuristic algorithms, the whale optimization algorithm suffers from premature convergence and stagnation while solving optimization problems. In this paper, the whale optimization algorithm is hybridized with the Laplace Crossover operator and a new algorithm, the Laplacian whale optimization algorithm (LXWOA), is proposed. It is used to solve a set of 23 classical benchmark functions, consisting of scalable unimodal functions, scalable multimodal functions and low dimensional multimodal functions, and the results are compared with the original whale optimization algorithm, particle swarm optimization, differential evolution, the gravitational search algorithm and the Laplacian gravitational search algorithm. LXWOA and WOA are also used to solve the problem of extraction of compounds from gardenia.
1 Introduction
It is a great challenge for the research community to solve real life nonlinear optimization problems arising from various branches of science and engineering. Researchers have developed various techniques to provide the best result for such problems, but many optimization problems remain unsolved. Researchers therefore continue to work in this area, and the literature on these methods is growing rich. For simplicity, the available methods for solving such problems can be classified into deterministic methods and probabilistic methods. Deterministic methods follow a fixed set of rules and, for a particular input, always produce the same output for a given problem. Deterministic methods are applicable only to a restricted class of problems. Probabilistic methods, however, are more general and applicable to a wide range of problems. Most probabilistic methods are inspired by natural laws; therefore, they are also known as Nature Inspired Algorithms.
Over the last couple of decades, Nature Inspired Algorithms (NIA) have become more and more popular for engineering application problems, since they are based on simple ideas and are easy to implement. The beauty of these algorithms is that they can bypass local optima. Classical optimization methods require some information about the function, such as its gradient, but NIAs do not require any such information. Therefore, they can be used on an extensive variety of problems covering diverse disciplines.
Nature inspired optimization algorithms mimic biological or physical phenomena. The available literature on NIAs is vast. Genetic Algorithms (GA) (Holland 1992), Evolution Strategy (Rechenberg 1978), Genetic Programming (Koza 1992), Probability-based Incremental Learning (Dasgupta and Zbigniew 2013), Biogeography-Based Optimization (Simon 2008), Simulated Annealing (Kirkpatrick et al. 1983), Gravitational Search Algorithms (Rashedi et al. 2009; Singh and Deep 2015, 2017a, b), Central Force Optimization (Formato 2007), Curved Space Optimization (Moghaddam et al. 2012), Big-Bang Big-Crunch (Erol and Eksin 2006), Small-World Optimization Algorithm (Du et al. 2006), Particle Swarm Optimization (Kennedy 2011), Ant Colony Optimization (Dorigo et al. 2006), Marriage in Honey Bees Optimization (Abbass 2001), Cuckoo Search (Yang and Deb 2009), Artificial Bee Colony (Basturk and Karaboga 2006), Bat-inspired Algorithm (Yang 2010), Monkey Search (Mucherino and Seref 2007), Firefly Algorithm (Yang 2010), Teaching Learning Based Optimization (Rao et al. 2011), Tabu Search (Glover 1989, 1990), Harmony Search Algorithm (Geem et al. 2001) and Firework Algorithm (Tan and Zhu 2010) are among the popular nature inspired algorithms.
All NIAs start with a random population and share a common feature: their search process has an exploration phase and an exploitation phase. The exploration phase corresponds to global search, i.e. the algorithm is capable of exploring the search space globally, whereas exploitation corresponds to local search, i.e. the algorithm is capable of a detailed investigation of the promising area(s) of the search space. It is extremely difficult to maintain a proper balance between exploration and exploitation due to the stochastic nature of the optimization procedure.
The literature on solving optimization problems is very rich, but there is no single technique which can solve all problems (Wolpert and Macready 1997). Therefore, new algorithms are being introduced and existing algorithms are being developed in the hope that they have some advantage over the existing ones. Consequently, the search for new heuristic algorithms is an open issue (Wolpert and Macready 1997). Algorithms are being hybridized with operators that have the capability to search locally (to improve solution quality), to search globally (to avoid premature convergence), or both.
The Whale Optimization Algorithm (WOA) (Mirjalili and Lewis 2016) is a newly developed algorithm. It has been tested on various optimization problems with different difficulty levels but, similar to other heuristic algorithms, WOA suffers from premature convergence, i.e. it may rapidly converge towards a local optimum instead of the global optimum, and from stagnation in local solutions while solving optimization problems. Hence, the quality of the final solution may decrease dramatically. These issues arise when the optimizer cannot maintain a fine balance between its exploration and exploitation.
WOA has been hybridized with simulated annealing (SA) to tackle feature selection problems (Mafarja and Mirjalili 2017). Kaveh and Ghazaan (2017) proposed a modified whale optimization algorithm for sizing optimization of skeletal structures. Oliva et al. (2017) utilized a chaos-embedded WOA for parameter estimation of photovoltaic cells. WOA has been used to solve multilevel thresholding image segmentation (El Aziz et al. 2017), the optimal renewable resources placement problem in distribution networks (Reddy et al. 2017) and to find the optimal weights in neural networks (Aljarah et al. 2018). It has also been used to attain the best parameters of an SVM classifier for predicting drug toxicity (Tharwat et al. 2017). Yan et al. (2018) proposed an ameliorative whale optimization algorithm to solve multi-objective water resource allocation optimization models. Nasiri and Khiyabani (2018) proposed a whale clustering optimization algorithm for clustering. Hasanien (2018) used the whale optimization algorithm to improve the performance of photovoltaic power systems. Kaur and Arora (2018) proposed a chaotic whale optimization algorithm. Jadhav and Gomathi (2018) proposed a technique for data clustering using the whale optimization algorithm. Elaziz and Oliva (2018) used opposition-based learning to enhance the exploration phase of the whale optimization algorithm and used it to estimate the parameters of solar cells using three different diode models. Algabalawy et al. (2018) employed the whale algorithm to find the optimal design of a system minimizing the total annual cost and system emissions. Mehne and Mirjalili (2018) used the whale optimization algorithm to solve optimal control problems. Xiong et al. (2018) proposed an improved whale optimization algorithm to accurately extract the parameters of different solar photovoltaic models. Ala'm et al. (2018) proposed a hybrid machine learning model based on support vector machines and the whale optimization algorithm for identifying spammers in online social networks. Saidala and Devarakonda (2018) proposed an improved whale optimization algorithm and applied it to a clinical dataset of anaemic pregnant women, obtaining optimized clusters and cluster heads to secure a clear comprehension and meaningful insights in the clinical decision-making process. Horng et al. (2017) proposed a novel multi-objective method for optimal vehicle traveling based on the whale optimization algorithm. Hassan and Hassanien (2018) presented a novel automated approach for extracting the vasculature of retinal fundus images using the whale optimization algorithm. Mostafa et al. (2017) proposed a technique for liver segmentation in MRI images based on the whale optimization algorithm. Abdel-Basset et al. (2019) used a modified whale optimization algorithm for solving the 0–1 knapsack problem. El Aziz et al. (2018) proposed a new method for determining multilevel thresholding values for image segmentation using the whale optimization algorithm, and also proposed a non-dominated sorting technique based on a multi-objective whale optimization algorithm for content-based image retrieval. Nazari-Heris et al. (2017) used whale optimization to solve the combined heat and power economic dispatch problem. Sun et al. (2018) proposed a modified whale optimization algorithm for large-scale global optimization problems. Abdel-Basset et al. (2018) proposed a modified whale optimization algorithm for cryptanalysis of the Merkle-Hellman knapsack cryptosystem. Luo and Shi (2018) proposed a hybrid whale optimization algorithm based on modified differential evolution for global optimization problems. The whale optimization algorithm has also been used in application problems such as feature selection and land pattern classification (Bui et al. 2019).
Sreenu and Sreelatha (2017) proposed a task scheduling algorithm based on a multi-objective model and the whale optimization algorithm for task scheduling in cloud computing. Eid (2018) proposed a binary whale optimisation algorithm for feature selection. Ghahremani-Nahr et al. (2019) used the whale optimization algorithm to minimize the total costs of a closed-loop supply chain network. Hussien et al. (2019) proposed a binary whale optimization algorithm, using a sigmoid transfer function, to select the optimal feature subset for dimensionality reduction and classification problems. Laskar et al. (2019) proposed a hybrid whale-particle swarm optimization algorithm for solving complex optimization problems. Yousri et al. (2019) proposed four chaotic whale optimization variants. Elhosseini et al. (2019) proposed A-C WOA and applied it to a biped robot to find the optimal settings of the hip parameters. Further literature can be found in Mirjalili et al. (2020).
In this paper, the performance of WOA is improved by hybridizing it with Laplace Crossover, a well-known real coded genetic algorithm operator. The rest of the paper is organized as follows: In Sect. 2, the Whale Optimization Algorithm is described. In Sect. 3, Laplace Crossover is reviewed. In Sect. 4, LXWOA is described. In Sect. 5, the numerical results are discussed. In Sect. 6, the problem of extraction of compounds from gardenia is discussed and solved using LXWOA and WOA. Finally, in Sect. 7, conclusions are drawn.
2 Whale optimization algorithm
The Whale Optimization Algorithm (WOA) was proposed by Mirjalili and Lewis (2016). It is inspired by the foraging behaviour of humpback whales, a mechanism known as the bubble-net feeding method, in which distinctive bubbles are created along a circle or a '9'-shaped path, as shown in Fig. 1. Goldbogen et al. (2013) utilized tag sensors to investigate the foraging behaviour of humpback whales and found two maneuvers: (1) upward-spirals, in which humpback whales dive around 12 m down and then begin to rise in a spiral shape around the prey, swimming towards the surface; and (2) double-loops, which incorporate three distinct stages: a coral loop, a lobtail, and a capture loop. More details can be found in Goldbogen et al. (2013).
Humpback whales use three strategies during foraging, namely searching for prey, encircling prey, and the bubble-net feeding method. In searching for prey, humpback whales search according to the positions of each other and explore new solutions randomly. In encircling prey, humpback whales perceive the area of the prey, recognize its location and then encircle it. Since there is no a priori information about the position of the optimal solution in the search space, the WOA algorithm assumes that the current best solution is the target prey, or is close to the optimum. After defining it, other search agents update their positions towards the best search agent. In the bubble-net attack, humpback whales swim around the prey within a shrinking circle and along a spiral-shaped path simultaneously. The working procedure of the whale optimization algorithm is as follows:
Let Np be the size of the population and Mitr the maximum number of iterations. The algorithm has two parameters a and b. The parameter a decreases linearly from 2 to 0 and the parameter b decreases from − 1 to − 2 as the iteration increases from 1 to Mitr. Then the coefficients \( A_{i} \), \( C_{i} \) and a random number \( l_{i} \) for agent i are evaluated as follows:

$$ A_{i} = 2a\,r_{1} - a,\quad C_{i} = 2r_{2} ,\quad l_{i} = (b - 1)\,r_{3} + 1 $$
where \( r_{1} ,r_{2} \) and \( r_{3} \) are three uniformly distributed random numbers in \( [0,\,1] \). Then a random number \( p_{r} \) is generated. If \( p_{r} < 0.5 \) and \( \left| {A_{i} } \right| \ge 1 \), then the ith agent updates the dth component of its next position, i.e. \( x_{i}^{d} (t + 1) \), using a random agent by the following equations:

$$ D_{i}^{d} = \left| {C_{i} \,x_{rand}^{d} (t) - x_{i}^{d} (t)} \right|,\quad x_{i}^{d} (t + 1) = x_{rand}^{d} (t) - A_{i} \,D_{i}^{d} $$
where \( x_{rand}^{d} (t) \) is the position in the dth dimension of a random agent chosen from the current population. If \( p_{r} < 0.5 \) and \( \left| {A_{i} } \right| < 1 \), then the ith agent updates the dth component of its next position, i.e. \( x_{i}^{d} (t + 1) \), using the \( x_{best}^{d} \) agent by the following equations:

$$ D_{i}^{d} = \left| {C_{i} \,x_{best}^{d} (t) - x_{i}^{d} (t)} \right|,\quad x_{i}^{d} (t + 1) = x_{best}^{d} (t) - A_{i} \,D_{i}^{d} $$
If \( p_{r} \ge 0.5 \), then the distance \( D_{i}^{'d} = \left| {x_{best}^{d} (t)\, - x_{i}^{d} (t)} \right| \) between the whale located at \( x_{i} (t) \) and the prey located at \( x_{best} (t) \) in the dth dimension is evaluated and used to mimic the helix-shaped movement of humpback whales in a spiral equation:

$$ x_{i}^{d} (t + 1) = D_{i}^{'d} \cdot e^{b\,l_{i} } \cdot \cos (2\pi l_{i} ) + x_{best}^{d} (t) $$

The constant b controls the shape of the spiral.
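The update rules of this section can be sketched in Python (the experiments in this paper used MATLAB; this translation is purely illustrative, and all function and variable names are chosen here, not taken from the original source):

```python
import math
import random

def woa_update(x, x_best, population, a, b):
    """One WOA position update for a single agent x (a list of floats).

    a decreases linearly from 2 to 0 and b from -1 to -2 over the run,
    as described in the text; the random coefficients are drawn per agent.
    """
    r1, r2, r3 = random.random(), random.random(), random.random()
    A = 2.0 * a * r1 - a          # coefficient A_i
    C = 2.0 * r2                  # coefficient C_i
    l = (b - 1.0) * r3 + 1.0      # random number l_i
    p_r = random.random()
    x_rand = random.choice(population)  # random agent for the exploration branch
    new_x = []
    for d in range(len(x)):
        if p_r < 0.5 and abs(A) >= 1:   # exploration: follow a random agent
            D = abs(C * x_rand[d] - x[d])
            new_x.append(x_rand[d] - A * D)
        elif p_r < 0.5:                 # exploitation: shrink towards the best agent
            D = abs(C * x_best[d] - x[d])
            new_x.append(x_best[d] - A * D)
        else:                           # bubble-net: spiral around the best agent
            D_prime = abs(x_best[d] - x[d])
            new_x.append(D_prime * math.exp(b * l) * math.cos(2 * math.pi * l)
                         + x_best[d])
    return new_x
```

One call returns the next position of a single whale; a full iteration applies it to every agent and then re-evaluates the best.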
3 Laplace crossover
Deep and Thakur (2007) proposed Laplace Crossover (LX) to generate a pair of offspring \( y_{1} = \left( {y_{1}^{1} ,y_{1}^{2} , \ldots ,y_{1}^{m} } \right) \) and \( y_{2} = \left( {y_{2}^{1} ,y_{2}^{2} , \ldots ,y_{2}^{m} } \right) \) from a pair of parents \( x_{1} = \left( {x_{1}^{1} ,x_{1}^{2} , \ldots ,x_{1}^{m} } \right) \) and \( x_{2} = \left( {x_{2}^{1} ,x_{2}^{2} , \ldots ,x_{2}^{m} } \right) \). LX produces a pair of offspring in such a manner that both offspring are symmetric with respect to the position of the parents. A Laplacian distributed random number \( l_{i} \) is created by the following rule:

$$ l_{i} = \left\{ {\begin{array}{*{20}l} {p - q\log_{e} (u_{i} ),} & {v_{i} \le \tfrac{1}{2}} \\ {p + q\log_{e} (u_{i} ),} & {v_{i} > \tfrac{1}{2}} \\ \end{array} } \right. $$
where \( u_{i} ,v_{i} \) are two uniformly distributed random numbers in \( [0,1] \), \( p \in \,R \) is the location parameter and \( q > 0 \) is the scale parameter. The offspring are created by the following rules:

$$ y_{1}^{i} = x_{1}^{i} + l_{i} \left| {x_{1}^{i} - x_{2}^{i} } \right|,\quad y_{2}^{i} = x_{2}^{i} + l_{i} \left| {x_{1}^{i} - x_{2}^{i} } \right| $$
If a generated offspring does not belong to the search space, i.e. \( y^{i} < y_{low}^{i} \) or \( y^{i} > y_{up}^{i} \) for some \( i \), then \( y^{i} \) is set to a random number from the interval \( [y_{low}^{i} ,y_{up}^{i} ] \).
When q has a small value, LX is likely to generate the offspring near the parents, and when q has a large value, it is likely to generate the offspring far from the parents. When p and q are fixed, LX deploys offspring proportional to the spread of the parents (Deep and Thakur 2007). In the literature, LX has been used to improve the performance of GA (Deep and Thakur 2007), GSA (Singh and Deep 2015), BBO (Garg and Deep 2016), PSO (Deep and Bansal 2009), etc.
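A minimal Python sketch of the operator under the rules above (the function name and defaults are chosen here for illustration; p = 0 and q = 0.10 match the settings used later in Sect. 5):

```python
import math
import random

def laplace_crossover(x1, x2, p=0.0, q=0.10):
    """Laplace crossover (Deep and Thakur 2007): returns two offspring
    placed symmetrically about the parents x1, x2 (lists of floats)."""
    y1, y2 = [], []
    for i in range(len(x1)):
        u, v = random.random(), random.random()
        # Laplacian-distributed random number with location p and scale q
        if v <= 0.5:
            l = p - q * math.log(u)
        else:
            l = p + q * math.log(u)
        spread = abs(x1[i] - x2[i])
        y1.append(x1[i] + l * spread)
        y2.append(x2[i] + l * spread)
    return y1, y2
```

Because both offspring use the same \( l_{i} \), they are displaced by the same amount from their respective parents, which is the symmetry property noted above. (Boundary repair, as described in the text, would be applied after this call.)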
4 Proposed algorithm
In the present study, WOA is hybridized with the above defined Laplace crossover, a real coded crossover operator for real coded genetic algorithms, and LXWOA is proposed.
In the LXWOA algorithm, first a randomly distributed population of size \( N_{p} \) is initialized. At each iteration, LXWOA follows the WOA procedure first; then two agents are selected, the first being the best agent and the second chosen randomly from the current population. Laplace crossover is applied to the best and the randomly selected agents, generating two offspring. The fitness of each offspring is compared with that of the worst agent in the current population, one by one. If an offspring has better fitness, it replaces the worst agent of the current population. The best is then updated and the iteration counter is incremented. The algorithm follows this procedure till the termination criterion is satisfied. The pseudo code of the LXWOA algorithm is shown in Fig. 2.
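The refinement step described above can be sketched as follows (illustrative names and signature; it assumes a minimization objective and takes the crossover operator, e.g. the LX of Sect. 3, as a parameter):

```python
import random

def lx_refinement_step(population, fitness, f, crossover):
    """One LX refinement step as appended to each WOA iteration.

    population: list of agents (lists of floats); fitness: their objective
    values (minimization); f: objective function; crossover: a function
    returning two offspring from two parents (e.g. Laplace crossover).
    """
    best_idx = min(range(len(population)), key=lambda i: fitness[i])
    rand_idx = random.randrange(len(population))
    y1, y2 = crossover(population[best_idx], population[rand_idx])
    for child in (y1, y2):
        child_fit = f(child)
        worst_idx = max(range(len(population)), key=lambda i: fitness[i])
        if child_fit < fitness[worst_idx]:  # offspring replaces the current worst
            population[worst_idx] = child
            fitness[worst_idx] = child_fit
    return population, fitness
```

Testing each offspring against the worst agent one by one, as the text describes, means the second offspring competes against the updated population, so at most two agents are replaced per iteration.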
5 Results and discussion
The performance of LXWOA is tested on a set of 23 classical benchmark functions (Mirjalili and Lewis 2016) which consists of three types of problems: (1) scalable unimodal functions (F1 to F7), described in Table 1; (2) scalable multimodal functions (F8 to F13), in Table 2; and (3) low dimensional multimodal functions with fixed dimension (F14 to F23), in Table 3. The experiments are performed on an Intel(R) Core(TM) i3-2350M CPU @ 2.30 GHz with 4.00 GB RAM, running Windows 10, with MATLAB 2013 as the integrated development environment. The parameter \( p \) of Laplace crossover is set to zero and the value of \( q \) is finely tuned by conducting several experiments. In these experiments, \( q \) varies from 0.05 to 0.30 and the LXWOA algorithm is run 30 times with population size Np = 30 and the termination criterion set at 500 iterations. For \( q = 0.05,\,0.10,\,0.15,\,0.20,\,0.25 \) and 0.30, the iteration-wise average best-so-far is plotted for the functions F1, F8, F14 and F16 in Fig. 3. From these figures, it is observed that the performance of LXWOA on F1 is best, and approximately the same, when \( q = \,0.05 \) and \( q = \,0.10 \), and worst when \( q = \,0.30 \). The performance on F8 is best when \( q = \,0.10 \) and worst when \( q = \,0.15 \). The performance on F14 is best when \( q = \,0.10 \) and worst when \( q = \,0.05 \). The performance on F16 is worst when \( q = \,0.25 \) and best, and approximately the same, when \( q = \,0.05 \) and \( q = \,0.10 \). From these figures, it is concluded that the performance of LXWOA is best when \( q = \,0.10 \).
LXWOA, WOA, GSA and LXGSA are each run 30 times with population size Np = 30, \( q = \,0.10 \) and the termination criterion set at 500 iterations for all the functions. For a fair comparison between WOA and LXWOA, the first randomly generated population is used for the first run of both WOA and LXWOA, the second randomly generated population for the second run, and so on. The Best, Average, Median and Standard deviation (SD) of the objective function values obtained from LXWOA, WOA, GSA and LXGSA are calculated over 30 runs and shown in Table 4 for scalable unimodal functions with dimension 30, Table 5 for scalable multimodal functions with dimension 30 and Table 6 for low dimensional multimodal functions. The results of Particle Swarm Optimization (PSO) and Differential Evolution (DE) are taken from Mirjalili and Lewis (2016) for comparison with LXWOA and WOA; there, results are given in terms of Average and SD.
From Table 4, it is observed that out of 7 problems, there are 3, namely F1, F2 and F7, on which the performance of LXWOA is better than WOA, PSO, DE, GSA and LXGSA, and 4, namely F3, F4, F5 and F6, on which DE has better Average and SD. If LXWOA and WOA are compared directly, LXWOA performs better on all 7 problems. Hence, from Table 4, it is concluded that the performance of LXWOA is better than WOA on scalable unimodal functions when the dimension is 30.
From Table 5, it is observed that out of 6 problems, there are 3, namely F9, F10 and F11, on which the performance of LXWOA is better than PSO, DE, GSA and LXGSA. On F8 and F12, LXWOA has better Best and Median in comparison to WOA, GSA and LXGSA. On F13, LXGSA has the better Best and LXWOA the better Median. DE has better Average and SD on F8, F12 and F13. When only LXWOA and WOA are considered, the performance of LXWOA is better on 3 problems, namely F8, F11 and F12. On F9 and F10, LXWOA and WOA reach the same solution. On F13, WOA performs better than LXWOA. Hence, from Table 5, it is concluded that the performance of LXWOA is better than WOA on scalable multimodal functions when the dimension is 30.
From Table 6, it is observed that out of 10 problems, there are 2, namely F19 and F20, on which LXGSA performs better than the others. On F16 and F17 (up to 4 decimals), all algorithms reach approximately the same solution. On F18, LXGSA, PSO and DE find the optimal solution and the solutions found by LXWOA and WOA are very close to it. On F14, F15, F21, F22 and F23, DE has the least Average and SD among the algorithms considered in this study, which means that DE performs better on these problems. Hence, it is concluded that the performance of DE is better than the others on low dimensional multimodal problems. When only LXWOA and WOA are considered, the performance of LXWOA is found to be better than WOA.
In order to observe the behaviour of the objective function value with the passage of iterations, the convergence plots of WOA and LXWOA are shown in Figs. 4 and 5. The horizontal axis shows the iterations and the vertical axis the average best-so-far, i.e. the average value of the objective function in each iteration over 30 runs. From the plots it is concluded that LXWOA converges faster towards the optimum than the WOA algorithm.
The performance of LXWOA has been compared with WOA, GSA and LXGSA using a pairwise one-tailed t-test with 29 degrees of freedom at the 0.05 level of significance over the objective function values of all the problems considered. The null hypothesis is that "there is no difference between the algorithms" and the alternative hypothesis is that "there is a difference". The p-values and conclusions are shown in Table 7. The following criterion is used to report the results: A+ indicates that p-value < 0.01 and LXWOA is highly significantly better than algorithm 1; A indicates that p-value < 0.05 and LXWOA is significantly better than algorithm 1; B indicates that p-value = 0.05 and LXWOA is comparable to algorithm 1; C indicates that p-value < 0.1 and LXWOA is marginally significantly better than algorithm 1; D indicates that p-value > 0.1 and LXWOA is not significantly better.
On observing the results shown in Table 7, it can be concluded that in the WOA versus LXWOA comparison, 8 out of the 20 problems show that LXWOA is highly significantly better than WOA. In the GSA versus LXWOA comparison, 15 out of the 23 problems show that LXWOA is highly significantly better than GSA. In the LXGSA versus LXWOA comparison, 12 out of the 22 problems show that LXWOA is highly significantly better than LXGSA. There are two problems, namely F1 and F16, on which the p-value cannot be computed because the standard error of the difference is 0.
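The statistic behind these pairwise tests can be sketched as follows (a minimal illustration with a function name chosen here; 30 paired runs give the 29 degrees of freedom used for Table 7). When the standard deviation of the differences is zero, as happens for F1 and F16, the statistic is undefined and the division below fails:

```python
import math
import statistics

def paired_t_statistic(results_a, results_b):
    """t statistic of a paired t-test over matched runs of two algorithms.

    results_a, results_b: objective values from n paired runs.
    Returns (t, degrees of freedom), where df = n - 1.
    """
    diffs = [a - b for a, b in zip(results_a, results_b)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)   # sample standard deviation of the differences
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1
```

The p-value is then obtained from the t-distribution with n − 1 degrees of freedom (e.g. via `scipy.stats.t.sf` when SciPy is available).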
In experiment I above, the maximum number of iterations (i.e. 500) is fixed. In one iteration, WOA requires 30 function evaluations and LXWOA requires 32 function evaluations for a population of size 30. Therefore, WOA and LXWOA perform 15,000 and 16,000 function evaluations respectively in a run of 500 iterations. This shows that the cost of LXWOA is higher than that of WOA. Keeping this in mind, experiment II is conducted with the termination criterion "maximum number of function evaluations less than 5000". The available results of PSO and DE are for 500 iterations, hence experiment I fixed the termination criterion as "maximum iterations = 500". In experiment II, however, the aim is to utilize the maximum exploration and exploitation capability of the algorithms to find the best quality of solution, so the maximum number of function evaluations is set to 5000 for both algorithms. The Best, Worst, Average, Median and Standard deviation (SD) of the objective function values obtained from LXWOA and WOA are reported in Table 8 for all problems listed in Tables 1, 2 and 3.
The results in Table 8 show that the solutions obtained from LXWOA are better than WOA on the scalable unimodal functions (F1 to F7). Out of the 6 scalable multimodal functions (F8 to F13), there are 2 problems, namely F9 and F10, on which LXWOA and WOA have the same set of solutions; one problem, namely F13, on which WOA turns out to have the better solution; and one problem, namely F11, on which LXWOA has the better solution. On F8 and F12, LXWOA has the better Best but WOA has the better Worst, Average and Median. Out of the 10 low dimensional multimodal functions (F14 to F23), there are 5 problems, namely F19, F20, F21, F22 and F23, on which the performance of LXWOA is better than WOA. There are 4 problems, namely F14, F16, F17 and F18, on which LXWOA and WOA show approximately the same solutions. On F15, LXWOA has the better Worst and Median but the better Best and Average are found by WOA.
From the above analysis, it can be concluded that the performance of LXWOA is improved in comparison to WOA on scalable unimodal functions, scalable multimodal functions and low dimensional multimodal functions for fixed dimensions.
6 The problem of extraction of compounds from Gardenia
Three bioactive compounds, namely crocin (\( Y_{1} \)), geniposide (\( Y_{2} \)) and total phenolic (\( Y_{3} \)) compounds, are obtained from gardenia fruits and are affected by three independent variables: the concentration of ethanol (\( X_{1} \)), the extraction temperature (\( X_{2} \)) and the extraction time (\( X_{3} \)). Yang et al. (2009) presented a nonlinear multi-objective optimization problem using the method of least squares fitting and solved it using the Response Surface Method. Shashi et al. (2010) used the DDX-LLM algorithm and Garg and Deep (2016) used the Laplacian biogeography based optimization (LX-BBO) algorithm to solve it. In this paper, the above problem is solved with WOA and LXWOA. The mathematical model of the problem, as given in Yang et al. (2009) and Shashi and Katiyar (2010), is as follows.
In the formulation, each yield is a function of the three independent variables. These functions are second order polynomial equations of the following form:

$$ Y_{k} = b_{0} + \sum\limits_{i = 1}^{3} {b_{i} X_{i} } + \sum\limits_{i = 1}^{3} {b_{ii} X_{i}^{2} } + \sum\limits_{i = 1}^{2} {\sum\limits_{j = i + 1}^{3} {b_{ij} X_{i} X_{j} } } $$
where \( Y_{k} \) represents the yield, \( b_{0} \) a constant, and \( b_{i} \), \( b_{ii} \) and \( b_{ij} \) the linear, quadratic and interactive coefficients of the model, respectively. \( X_{i} \) and \( X_{j} \) are the independent variables. The resulting equations of the yields \( Y_{1} \), \( Y_{2} \) and \( Y_{3} \) are as follows:
This is a multi-objective optimization problem in which all three yields need to be maximized.
A multi-objective optimization problem is a problem in which more than one objective function is optimized together. The weighted-sum approach is a simple and effective technique for handling multi-objective optimization problems. In this technique, the user converts the problem into a single-objective problem by assigning a different or equal weight to each objective function. In this paper, the problem has been solved with WOA and LXWOA by converting it into a single-objective problem with equal weight given to all the yields.
Mathematically, for the given yield functions \( Y_{1} \), \( Y_{2} \) and \( Y_{3} \), the objective is to maximize the following function:

$$ F = w_{1} Y_{1} + w_{2} Y_{2} + w_{3} Y_{3} $$
where \( w_{1} ,\,w_{2} \) and \( w_{3} \) are the weights given by the user. In this paper, \( w_{1} = 0.33,\,w_{2} = 0.33 \) and \( w_{3} = 0.34 \) are kept.
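This scalarization is straightforward to express in code. A minimal sketch in Python (the helper name and signature are illustrative, not from the paper; the default weights match those used here):

```python
def weighted_objective(Y1, Y2, Y3, w=(0.33, 0.33, 0.34)):
    """Weighted-sum scalarization: combines three yield functions
    (callables of X1, X2, X3) into one objective F to be maximized."""
    def F(X1, X2, X3):
        return (w[0] * Y1(X1, X2, X3)
                + w[1] * Y2(X1, X2, X3)
                + w[2] * Y3(X1, X2, X3))
    return F
```

The resulting single-objective function F can then be handed directly to WOA or LXWOA as the fitness function (negated if the optimizer minimizes).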
Results In this section, the results obtained by WOA and LXWOA for the problem of extraction of compounds from gardenia are presented and compared with the results available in the literature. The algorithms have been run for a maximum of 100 iterations with 30 candidates in the population. The results obtained on the basis of 30 independent experiments are shown in boldface in Table 9. The range of \( X_{1} \), \( X_{2} \) and \( X_{3} \) is taken as 0–100.
When WOA and LXWOA are used to solve the above problem, it is found that when the concentration of ethanol is \( X_{1} = 63.482211 \)%, the extraction temperature is \( X_{2} = 100 \) °C and the extraction time is \( X_{3} = 27.290530 \) min, then crocin \( (Y_{1} = 9.6418) \), geniposide \( (Y_{2} = 117.7906) \) and phenolic \( (Y_{3} = 25.2781) \) compounds are obtained in maximum amount.
7 Conclusions
In this paper, a novel Laplacian whale optimization algorithm (LXWOA) has been proposed and its results compared with the Whale Optimization Algorithm, Particle Swarm Optimization, Differential Evolution, the Gravitational Search Algorithm and the Laplacian Gravitational Search Algorithm. From the above study, it is concluded that the Laplacian whale optimization algorithm gives more promising results than the whale optimization algorithm, particle swarm optimization, differential evolution, the gravitational search algorithm and the Laplacian gravitational search algorithm on scalable unimodal and scalable multimodal functions when the dimension is set to 30. The performance of LXWOA is improved over WOA on scalable unimodal functions, scalable multimodal functions and low dimensional multimodal functions with fixed dimensions. The results obtained from LXWOA and WOA for the problem of extraction of compounds from gardenia are better than the results available in the literature.
References
Abbass HA (2001) MBO: marriage in honey bees optimization-a haplometrosis polygynous swarming approach. In: Proceedings of the 2001 congress on evolutionary computation, pp 207–214
Singh, A. Laplacian whale optimization algorithm. Int J Syst Assur Eng Manag 10, 713–730 (2019). https://doi.org/10.1007/s13198-019-00801-0