Abstract
Many population-based optimization methods have been proposed in recent years, yet their application to real-world problems is still being explored. Researchers therefore continue to modify and refine these methods, often around their core evolutionary operators, to obtain faster convergence and a more consistent balance between exploration and exploitation. This paper proposes a new hybrid method combining Hunger Games Search (HGS) and the Arithmetic Optimization Algorithm (AOA). HGS is a recently proposed population-based optimizer that balances its operators well and performs efficiently on both unconstrained and constrained problems, while AOA is a modern meta-heuristic optimization method. Such methods can be applied to many problems, including image processing, machine learning, wireless networks, power systems, and engineering design. The proposed method is analyzed against HGS and AOA, with each method tested under the same parameters, such as population size and number of iterations. The proposed method (AOA-HGS) is assessed on 23 benchmark functions (F1-F23) under varying dimensions, a standard test used in previous studies that shows how dimensionality affects efficiency. The results show that AOA-HGS works efficiently for both high- and low-dimensional problems; in high-dimensional problems, population-based methods give efficient search results. On the test functions, AOA-HGS is very competitive and superior to the compared methods: several optimization methods obtained near-optimal results, but AOA-HGS performed best overall, so AOA-HGS is capable of obtaining optimal results.
1 Introduction
Present-day technologies in information, knowledge-based, and expert systems operate under limited feature spaces, priorities, and financial constraints. Researchers must find practicable solutions and adequate information using various algorithms across problem areas such as image segmentation [34], optimization [31], target tracking systems [50], QoS-aware and social recommendation [28,29,30], scheduling problems [40], and gold price prediction [48].
The process of determining appropriate values for the variables of a given problem so as to minimize or maximize an objective function is termed optimization. Optimization problems arise in many fields of analysis, and several steps must be taken to solve one. First, the parameters of the problem need to be defined; depending on their type, problems are categorized as continuous or discrete. Second, the constraints applied to the parameters must be identified [42].
Constraints split optimization problems into constrained and unconstrained ones. The objectives of the problem should then be examined and addressed [10, 36]. Mathematical optimization relies primarily on gradient-based information about the involved functions to obtain an optimum solution. Although such methods are widely used by researchers, they have drawbacks: mathematical optimization methods suffer from local optima entrapment, where a method takes a local solution to be the global one and fails to reach the true optimum. They are also often unsuccessful on problems whose derivatives are unknown or computationally costly [39]. Stochastic optimization is a form of optimization that removes these major disadvantages [45].
Stochastic approaches rely on random operators to avoid local optima. They begin the optimization by generating one or a set of random solutions to the problem. Unlike mathematical optimization methods, they do not need the solution's gradient; solutions are tested using an objective function (or functions), and decisions on how to improve them are taken based on the measured objective values. The problem is thus treated as a black box, which makes these methods very effective tools for real problems with unknown search spaces. Because of these advantages, stochastic methods are widely used [37]. Nature-inspired, population-based methods are the most common among stochastic optimization methods [51].
These approaches emulate natural problem-solving strategies, often those employed by species. Survival is the primary objective of all species, and they have evolved and adapted in various ways to accomplish it. It is therefore prudent to seek guidance from nature, the greatest and oldest optimizer on earth. These algorithms fall into two major groups: single-solution and multi-solution based. In the former, a single random solution is created for a given problem and improved; in the latter, multiple solutions are created and improved. Multi-solution methods are more common than single-solution approaches [38]. Multi-solution models have inherently better local optima avoidance, owing to the improvement of multiple solutions during the optimization process: a solution stuck in a local optimum can be helped by other solutions to leap out of it.
Multiple solutions also cover a greater portion of the search space than single-solution methods, so the likelihood of reaching the global optimum is higher. Commonly used single-solution methods are simulated annealing (SA) and hill climbing [14, 27]. Both are effective, but SA's local optima avoidance is higher thanks to its stochastic cooling factor. Newer single-solution methods include iterated local search and tabu search [18, 20, 32]. Popular multi-solution methods include particle swarm optimization (PSO) [15], genetic algorithms (GA) [23], differential evolution (DE) [46], and ant colony optimization (ACO) [11]. The GA method was influenced by Darwin's theory of evolution: solutions are viewed as individuals, and solution parameters take the place of genes. Survival of the fittest is the key idea of this method, with the fittest individuals driving the improvement of poor solutions.
Previous studies describe various swarm intelligence optimization methods, such as the firefly algorithm (FA) [52, 57], dolphin echolocation (DEL) [25, 26], the grey wolf optimizer (GWO) [39], and the bat algorithm (BA) [53]. DEL and BA mimic the echolocation that dolphins use to find prey and bats use to navigate, while FA mimics the mating behavior of fireflies. The Cuckoo Search (CS) method [54, 56] builds the cuckoo's reproductive behavior into the optimization process, and GWO is a swarm technique based on the hunting behavior of grey wolves. Other methods, such as the states of matter search (SMS) [12, 13], utilize the different states of matter for problem optimization.
In contrast, the flower pollination algorithm (FPA) [55] utilizes the pollination and survival behavior of flowers. A natural question is why new methods are needed when so many already exist. The answer lies in the No Free Lunch (NFL) theorem [49], which proves that no single optimization strategy can solve all optimization problems: averaged over all problems, techniques in this area perform alike. This theorem has inspired the fast-growing stream of algorithms proposed in recent decades, and it is also one of the motivations for this paper.
HGS is influenced by the social behavior of animals, whose search for food depends on their degree of hunger; the method is designed and implemented around these common food-searching characteristics. In this paper, we propose a hybrid of two meta-heuristic methods, AOA and HGS. Because they are gradient-free, simple in structure, good at avoiding local optima, and treat problems as black boxes, nature-inspired methods are used at large in engineering and other problems [8, 21, 34, 59]. We are, therefore, still exploring the use of such methods to solve real problems.
The main contributions of the paper are:
1. Global optimization using AOA-HGS gives better results when experimental results are compared with HGSO, AOA, GOA, GWO, and SCA.
2. AOA-HGS reduces the computational complexity and works efficiently for both high- and low-dimensional problems.
The paper is structured as follows: Section 2 expounds HGS; Section 3 discusses AOA; Section 4 presents the proposed work; Section 5 discusses the results; and Section 6 gives the conclusion and future scope.
2 Hunger games search (HGS) optimization
2.1 Approach food
To express this approaching behavior in mathematical form, the following formula is proposed to imitate the contraction mode [58]:

$$ \overrightarrow{X\left(t+1\right)}=\left\{\begin{array}{ll}\overrightarrow{X(t)}\cdot \left(1+\mathit{randn}(1)\right), & r_1<l\\ \overrightarrow{W_1}\cdot \overrightarrow{X_b}+\overrightarrow{R}\cdot \overrightarrow{W_2}\cdot \left|\overrightarrow{X_b}-\overrightarrow{X(t)}\right|, & r_1>l,\ r_2>E\\ \overrightarrow{W_1}\cdot \overrightarrow{X_b}-\overrightarrow{R}\cdot \overrightarrow{W_2}\cdot \left|\overrightarrow{X_b}-\overrightarrow{X(t)}\right|, & r_1>l,\ r_2<E\end{array}\right. $$

where \( \overrightarrow{R} \) is in the range [−a, a]; r1 and r2 are random numbers in the range [0, 1]; randn(1) is a random number satisfying a normal distribution; t indicates the current iteration; \( \overrightarrow{W_1} \) and \( \overrightarrow{W_2} \) represent the weights of hunger; \( \overrightarrow{X_b} \) represents the location of a random individual among all the optimal individuals; \( \overrightarrow{X(t)} \) represents each individual's location; and the value of l is discussed in the parameter-setting experiment.
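Under the assumption that the contraction-mode update takes the three-branch form described above, a minimal Python sketch might look as follows; all function and parameter names here are our own, and the value l = 0.03 is an illustrative assumption.

```python
import random


def approach_food(x, x_best, w1, w2, t, max_iter, l=0.03):
    """Hedged sketch of the HGS approach-food update.

    With a small probability l, the individual perturbs itself randomly;
    otherwise it moves toward or away from a best individual, scaled by
    the hunger weights w1/w2 and the shrinking range R.
    """
    a = 2.0 * (1.0 - t / max_iter)          # shrinks linearly to 0
    r_val = 2.0 * a * random.random() - a   # R drawn from [-a, a]
    if random.random() < l:
        # random self-perturbation, scaled by a normal draw
        return [xi * (1.0 + random.gauss(0.0, 1.0)) for xi in x]
    # the sign stands in for the r2-versus-E comparison in the text
    sign = 1.0 if random.random() > 0.5 else -1.0
    return [w1 * xb + sign * r_val * w2 * abs(xb - xi)
            for xb, xi in zip(x_best, x)]
```

As t approaches max_iter, a (and hence R) approaches zero, so individuals settle near the best positions.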
The formula of E is as follows:

$$ E=\operatorname{sech}\left(\left|F(i)- BF\right|\right) $$

where i ∈ 1, 2, …, n; F(i) represents the fitness value of each individual; BF is the best fitness obtained in the current iteration; and sech is the hyperbolic secant function \( \left(\operatorname{sech}(x)=\frac{2}{e^x+{e}^{-x}}\right) \).
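The hyperbolic-secant control variable can be computed directly from this definition; a small sketch with assumed helper names:

```python
import math


def variation_control(fitness_i, best_fitness):
    """E = sech(|F(i) - BF|), using sech(x) = 2 / (e^x + e^-x).

    E equals 1 when an individual's fitness matches the best fitness
    and decays toward 0 as the gap grows.
    """
    gap = abs(fitness_i - best_fitness)
    return 2.0 / (math.exp(gap) + math.exp(-gap))
```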
The formula of \( \overrightarrow{R} \) is as follows:

$$ \overrightarrow{R}=2\times a\times \mathit{rand}-a,\qquad a=2\times \left(1-\frac{t}{Max\_ iter}\right) $$

where rand is a random number in the range [0, 1], and Max_iter stands for the largest number of iterations.
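A sketch of drawing R from the shrinking range, assuming the linear decay of a given above (names are ours):

```python
import random


def shrinking_range(t, max_iter):
    """Draw R uniformly from [-a, a], where a = 2 * (1 - t / Max_iter)
    shrinks linearly from 2 to 0 over the run."""
    a = 2.0 * (1.0 - t / max_iter)
    return 2.0 * a * random.random() - a
```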
2.2 Hunger role
The starvation characteristics of individuals in search are simulated mathematically.
The formula of \( \overrightarrow{W_1} \) in Eq. (5) is as follows:

$$ \overrightarrow{W_1(i)}=\left\{\begin{array}{ll}\mathit{hungry}(i)\cdot \frac{N}{SHungry}\times r_4, & r_3<l\\ 1, & r_3>l\end{array}\right. $$

The formula of \( \overrightarrow{W_2} \) in Eq. (6) is shown as follows:

$$ \overrightarrow{W_2(i)}=\left(1-\exp \left(-\left|\mathit{hungry}(i)- SHungry\right|\right)\right)\times r_5\times 2 $$
where hungry represents the hunger of each individual; N represents the number of individuals, and SHungry is the sum of hungry feelings of all individuals, that is sum(hungry). r3, r4 and r5 are random numbers in the range of [0, 1].
The formula for hungry(i) is provided below:

$$ \mathit{hungry}(i)=\left\{\begin{array}{ll}0, & AllFitness(i)= BF\\ \mathit{hungry}(i)+H, & AllFitness(i)\ne BF\end{array}\right. $$

where AllFitness(i) preserves the fitness of each individual in the current iteration.
The formula for H can be seen as follows:

$$ TH=\frac{F(i)- BF}{WF- BF}\times r_6\times 2\times \left( UB- LB\right),\qquad H=\left\{\begin{array}{ll} LH\times \left(1+r\right), & TH< LH\\ TH, & TH\ge LH\end{array}\right. $$

where r and r6 are random numbers in the range [0, 1]; F(i) represents the fitness value of each individual; BF is the best fitness obtained in the current iteration process; WF stands for the worst fitness obtained in the current iteration process; and UB and LB indicate the upper and lower bounds of the search space, respectively. The hunger sensation H is limited to a lower bound, LH.
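Putting the hunger bookkeeping together, a hedged sketch follows; the LH default and all helper names are assumptions for illustration:

```python
import random


def update_hunger(hungry, all_fitness, best_fitness, worst_fitness,
                  ub, lb, lh=100.0):
    """Hedged sketch of the hunger update.

    Individuals whose fitness equals the best reset their hunger to 0;
    all others accumulate an increment H that grows with their distance
    from the best fitness and is bounded below by LH.
    """
    for i, f in enumerate(all_fitness):
        if f == best_fitness:
            hungry[i] = 0.0
            continue
        r6 = random.random()
        th = (f - best_fitness) / (worst_fitness - best_fitness) \
            * r6 * 2.0 * (ub - lb)
        h = lh * (1.0 + random.random()) if th < lh else th
        hungry[i] += h
    return hungry
```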
3 Arithmetic optimization algorithm
AOA [2] is a new meta-heuristic method that uses the common arithmetic operations Division (D), Addition (A), Multiplication (M), and Subtraction (S), as shown in Fig. 1, and is modeled to perform optimization across a wide variety of search spaces [8]. Population-based algorithms (PBA) commonly launch their improvement processes with several randomly generated candidate solutions. These solutions are enhanced incrementally by a set of optimization rules and evaluated iteratively by a given objective function; this is the basis of optimization techniques. Although a PBA stochastically seeks an efficient solution to an optimization problem, a single run offers no guarantee of finding one; however, the chance of finding the global optimum grows with a large set of candidate solutions and optimization iterations [21]. Despite the variations among meta-heuristic methods [43, 44], the PBA optimization process comprises two phases: exploration and exploitation. The former refers to the broad coverage of the search space by the search agents so as to bypass local solutions; the latter refers to increasing the accuracy of the solutions obtained during exploration.
3.1 Motivation
Arithmetic is a key component of mathematics and one of the most important parts of modern math, together with analysis, geometry, and algebra. Arithmetic operators (AO) [2] are traditionally used for the study of numbers [59]. In AOA, these basic math operations are used in optimization to find ideal elements among a set of candidate solutions. Optimization [1, 3, 4, 16, 17, 22, 41] challenges appear in all mathematical fields, from engineering [5,6,7, 9, 19, 24, 47], economics, and computer science to organizational analysis and technology, and the advancement of optimization methods has repeatedly drawn attention in mathematics [33]. The key motivation of the new AOA is the use of AO to solve problems: the behavior of the AO, their effect on existing algorithms, their arrangement, and their dominance are shown in Fig. 2. AOA is then proposed based on this arithmetic model [35].
3.2 Initial stage
The optimization process starts with a set of candidate solutions, denoted by A, generated randomly as in Eq. 10. The best candidate in each iteration is taken as the best-obtained solution so far [2].
Whether AOA performs exploration or exploitation at each step is decided by the math optimizer accelerated (MOA) coefficient, defined in Eq. 11:

$$ MOA\left({C}_{iter}\right)= Min+{C}_{iter}\times \left(\frac{Max- Min}{M_{iter}}\right) $$

where
MOA(Citer) = value of the MOA function at the current (ith) iteration;
Miter = maximum number of iterations;
Max and Min = maximum and minimum values of the accelerated function;
Citer = current iteration (between 1 and Miter).
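Eq. 11 is a straight line in the iteration counter; a one-line sketch follows, where the Min/Max defaults are illustrative assumptions, not the paper's tuned values:

```python
def moa(c_iter, m_iter, mn=0.2, mx=0.9):
    """MOA(Citer) = Min + Citer * ((Max - Min) / Miter): grows linearly
    from Min to Max, shifting the search from exploration to exploitation."""
    return mn + c_iter * ((mx - mn) / m_iter)
```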
3.3 Exploration stage
The exploratory nature of AOA is now discussed. As per the arithmetic operators, mathematical calculations using either the Division (D) or the Multiplication (M) operator produce highly dispersed values, which contribute to an exploratory search mechanism. However, unlike the S and A operators, D and M cannot easily approach the target because of this high dispersion. The AOA exploration operators therefore search the space randomly across many regions and seek a better solution through two main strategies, the M and D searches, as shown in Eq. 12 [2]:

$$ {a}_{i,j}\left({C}_{iter}+1\right)=\left\{\begin{array}{ll} best\left({a}_j\right)\div \left( MOP+\epsilon \right)\times \left(\left({UB}_j-{LB}_j\right)\times \mu +{LB}_j\right), & r_2<0.5\\ best\left({a}_j\right)\times MOP\times \left(\left({UB}_j-{LB}_j\right)\times \mu +{LB}_j\right), & \mathrm{otherwise}\end{array}\right. $$
where,
ai(Citer + 1) = the ith solution in the next iteration;
ai,j(Citer + 1) = the jth position of the ith solution in the next iteration;
μ = control parameter used to adjust the search process (fixed at 0.5);
LBj and UBj = lower and upper bound of the jth position;
ε = a small number that prevents division by zero;
best(aj) = the jth position of the best solution obtained so far.
The math optimizer probability (MOP) coefficient is computed as shown in Eq. 13:

$$ MOP\left({C}_{iter}\right)=1-\frac{{C_{iter}}^{1/\alpha }}{{M_{iter}}^{1/\alpha }} $$

where,
MOP(Citer) = value of the MOP function at the current (ith) iteration;
Citer = current iteration;
Miter = maximum number of iterations;
α = sensitive parameter that defines the exploitation accuracy over the iterations (set to 5).
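A sketch of the exploration step driven by MOP; the choice between the Division and Multiplication operators follows a 0.5 threshold, and the mu/eps defaults are assumptions:

```python
import random


def mop(c_iter, m_iter, alpha=5.0):
    """MOP(Citer) = 1 - Citer^(1/alpha) / Miter^(1/alpha)."""
    return 1.0 - (c_iter ** (1.0 / alpha)) / (m_iter ** (1.0 / alpha))


def explore_position(best_j, lb_j, ub_j, c_iter, m_iter,
                     mu=0.5, eps=1e-12):
    """Exploration update: Division operator when the random draw falls
    below 0.5, Multiplication operator otherwise."""
    m = mop(c_iter, m_iter)
    scale = (ub_j - lb_j) * mu + lb_j
    if random.random() < 0.5:
        return best_j / (m + eps) * scale   # D operator: high dispersion
    return best_j * m * scale               # M operator
```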
3.4 Exploitation stage
The exploitation nature of AOA relies on the addition (A) and subtraction (S) operators, which, as per the arithmetic operators, produce low-dispersion (dense) results and can therefore approach the target easily. The AOA exploitation operators search the space deeply across many regions and seek a better solution through two main strategies, the A and S searches, as shown in Eq. 14 [2]:

$$ {a}_{i,j}\left({C}_{iter}+1\right)=\left\{\begin{array}{ll} best\left({a}_j\right)- MOP\times \left(\left({UB}_j-{LB}_j\right)\times \mu +{LB}_j\right), & r_3<0.5\\ best\left({a}_j\right)+ MOP\times \left(\left({UB}_j-{LB}_j\right)\times \mu +{LB}_j\right), & \mathrm{otherwise}\end{array}\right. $$
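The exploitation step mirrors the exploration step with subtraction and addition in place of division and multiplication; a sketch under the same assumed parameter values:

```python
import random


def exploit_position(best_j, lb_j, ub_j, c_iter, m_iter,
                     mu=0.5, alpha=5.0):
    """Exploitation update: Subtraction operator when the random draw
    falls below 0.5, Addition operator otherwise. As MOP decays to 0,
    the candidate converges onto the best position."""
    m = 1.0 - (c_iter ** (1.0 / alpha)) / (m_iter ** (1.0 / alpha))
    scale = (ub_j - lb_j) * mu + lb_j
    if random.random() < 0.5:
        return best_j - m * scale   # S operator
    return best_j + m * scale       # A operator
```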
4 Proposed work
Several population-based approaches have recently been proposed. Despite their adoption across a wide range of engineering applications, their use on real problems is still being examined. As a result, researchers need to substantially modify and improve these methods, often around their major evolutionary operators, to achieve faster convergence and a more consistent balance between exploration and exploitation. Therefore, a new hybrid method using Hunger Games Search (HGS) and the Arithmetic Optimization Algorithm (AOA) is proposed in this paper. HGS is a recently proposed population-based optimization method that balances its operators well and performs efficiently on unconstrained and constrained problems.
In contrast, AOA is a modern meta-heuristic optimization method. Both can be applied to different problems, including image processing, machine learning, wireless networks, power systems, and engineering design. The proposed method is analyzed against HGS and AOA. To evaluate performance, each method is tested with the same parameters, such as population size and number of iterations. The proposed method (AOA-HGS) is evaluated under varying dimensions, a standard test used in previous studies on benchmark functions that shows how dimensionality affects the efficiency of AOA-HGS. The results show that it works efficiently for both high- and low-dimensional problems; in high-dimensional problems, population-based methods give efficient search results. Figure 3 below demonstrates the step-by-step flow and implementation of the proposed model.
Figure 3 shows the steps of the proposed method in order of implementation. The first step defines the parameters to be used. The second step generates solutions by means of the defined parameters. The third step estimates the fitness function, and the fourth step chooses the best solution. The fifth step applies a condition: if the value of a random number (rand) is smaller than 0.5, the AOA operators are used; otherwise, the HGS operators are used. In the sixth step, if the stopping criteria are met, the best solution is returned in the seventh step; otherwise, a feedback loop returns to the third step to re-estimate the fitness function.
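The flow in Fig. 3 can be sketched end to end. The following is a simplified illustration of the control flow only: the per-branch updates are placeholder moves around the best solution rather than the full AOA and HGS equations, and all names are our own assumptions.

```python
import random


def aoa_hgs(objective, lb, ub, dim, n_pop=30, max_iter=200):
    """Hedged sketch of the hybrid flow: initialize a random population,
    evaluate fitness, keep the best solution, and on each iteration apply
    either an AOA-style update (rand < 0.5) or an HGS-style update."""
    pop = [[lb + random.random() * (ub - lb) for _ in range(dim)]
           for _ in range(n_pop)]
    best = min(pop, key=objective)
    for t in range(max_iter):
        shrink = 1.0 - t / max_iter          # narrows the search over time
        for i, x in enumerate(pop):
            if random.random() < 0.5:
                # AOA-style move: arithmetic perturbation of the best
                cand = [min(max(b + shrink * (random.random() - 0.5)
                                * (ub - lb), lb), ub) for b in best]
            else:
                # HGS-style move: weighted approach toward the best
                w = random.random()
                cand = [min(max(w * b + shrink * (b - xi), lb), ub)
                        for b, xi in zip(best, x)]
            if objective(cand) < objective(x):   # greedy acceptance
                pop[i] = cand
        best = min(pop + [best], key=objective)
    return best, objective(best)
```

On a simple sphere function this sketch steadily refines the best solution as the perturbation radius shrinks.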
The complexity of the proposed AOA-HGS is based on the complexity of the original AOA and HGS and is given as follows:
Therefore, the total complexity of the proposed AOA-HGS is given as follows:
where N denotes the number of solutions, Dim denotes the solution size, and t is the number of iterations.
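Assuming the complexity analysis mirrors those of the original AOA and HGS papers (initialization, plus per-iteration fitness evaluation and position updates), a plausible closed form under that assumption is:

```latex
O(\mathrm{AOA\text{-}HGS}) \approx O(N) + O(t \times N)
    + O(t \times N \times Dim) = O\big(N \times (t \times Dim + 1)\big)
```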
5 Results & discussion
The proposed approach is examined, and its efficiency is compared with that of existing methods. Implementation and testing are done on an Intel i5 1.70 GHz processor using MATLAB. The performance of the proposed method (AOA-HGS) is examined on 23 test functions, described in Fig. 4, and the results are further compared with HGSO, AOA, GOA, GWO, and SCA.
To evaluate performance, each method is tested with the same parameters, such as population size and number of iterations. The proposed AOA-HGS is evaluated under varying dimensions, a standard test used in previous studies on benchmark functions that shows how dimensionality affects the efficiency of AOA-HGS. The results show that it works efficiently for both high- and low-dimensional problems; in high-dimensional problems, population-based methods give efficient search results. In this work, AOA-HGS is run on the scalable unimodal and multimodal test functions (F1-F13) with two different dimensions (10 and 1000), and the comparative methods are tested using ten further benchmark functions (F14-F23). On the 13 functions (F1-F13) with two different dimensions, AOA-HGS is compared using the standard deviation (SD) and average fitness value (Avg).
AOA-HGS gives the best results at dimension 10, where most optimization methods also perform well in low dimensions. Next, AOA-HGS is tested at a high dimension (1000), where it again attains the best results. For verification, the proposed AOA-HGS is compared with previous state-of-the-art algorithms using the same dimensions (10 and 1000) and test functions. The analysis of the results shows that AOA-HGS is the best performer in most cases.
To evaluate the convergence behavior of AOA-HGS, it is compared with the other methods, as shown in Fig. 5, which plots the convergence curves on test functions F1-F13. The figure makes clear that AOA-HGS yields lower and more stable convergence curves than the other methods: it has better global search capability, converges faster, and attains better results than the other methods on the same test functions.
The average runtime of the proposed AOA-HGS algorithm is compared with the other methods in Table 1 (10 dimensions) and Table 2 (1000 dimensions). The running time needed by AOA-HGS, in seconds, is lower than that of the other methods, so the computational efficiency of the proposed AOA-HGS is better. The results show AOA-HGS on top, followed by the other methods.
The values in Table 3 show that AOA-HGS is very competitive and superior to the others on test functions F14-F23. Several optimization methods obtained near-optimal results, but AOA-HGS performed best of all, so AOA-HGS is capable of obtaining optimal results. This is because the proposed method gains the advantages of two powerful methods together, and the hybrid also tackles the weaknesses of a single search method, as the tables and figures clearly show.
6 Conclusion & future scope
This work introduces a hybrid population-based method (AOA-HGS) to address optimization problems, combining a method modeled on the food-searching behavior of social animals (HGS) with a meta-heuristic optimization method (AOA). The proposed AOA-HGS is validated on an exhaustive collection of 23 functions (F1-F23), and the results are compared with state-of-the-art methods such as HGSO, AOA, GOA, GWO, and SCA. The experimental results show that AOA-HGS converges faster and has better global search capability. It is very competitive and superior to the compared methods on the test functions: several optimization methods obtained near-optimal results, but AOA-HGS performed best overall, so AOA-HGS is capable of obtaining optimal results. The main advantage of the proposed method is that it finds better results by combining the search processes of two powerful methods; as a limitation, further applications still need to be tested to validate it more broadly.
Further, AOA and HGS can be integrated with other existing state-of-the-art algorithms to improve the algorithm and give more precise results with less computational time, which is needed in real-time applications and problems.
References
Abualigah L, Dulaimi AJ (2021) A novel feature selection method for data mining tasks using hybrid Sine Cosine Algorithm and Genetic Algorithm. Clust Comput:1–16
Abualigah L, Diabat A, Mirjalili S, Abd Elaziz M, Gandomi AH (2020) The arithmetic optimization algorithm. Comput Methods Appl Mech Eng 376:113609
Alshaer HN, Otair MA, Abualigah L, Alshinwan M, Khasawneh AM (2021) Feature selection method using improved CHI Square on Arabic text classifiers: analysis and application. Multimed Tools Appl 80(7):10373–10390
Altabeeb AM et al (2021) Solving capacitated vehicle routing problem using cooperative firefly algorithm. Appl Soft Comput 108:107403
Bansal M, Kumar M, Kumar M, Kumar K (2021) An efficient technique for object recognition using Shi-Tomasi corner detection algorithm. Soft Comput 25:4423–4432. https://doi.org/10.1007/s00500-020-05453-y
Bansal M, Kumar M, Kumar M (2021) 2D object recognition: a comparative analysis of SIFT, SURF and ORB feature descriptors. Multimed Tools Appl 80:18839–18857. https://doi.org/10.1007/s11042-021-10646-0
Bansal M, Kumar M, Sachdeva M, Mittal A (2021) Transfer learning for image classification using VGG19: Caltech-101 image data set. J Ambient Intell Human Comput. https://doi.org/10.1007/s12652-021-03488-z
Boussaïd I, Lepagnot J, Siarry P (2013 Jul 10) A survey on optimization metaheuristics. Inf Sci 237:82–117
Chhabra P, Garg NK, Kumar M (2020) Content-based image retrieval system using ORB and SIFT features. Neural Comput & Applic 32:2725–2733. https://doi.org/10.1007/s00521-018-3677-9
Coello CA (2002 Jan 4) Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Comput Methods Appl Mech Eng 191(11–12):1245–1287
Colorni A, Dorigo M, Maniezzo V (1991) Distributed optimization by ant colonies. InProceedings of the first European conference on artificial life 142, 134–142
Cuevas E, Echavarría A, Zaldívar D, Pérez-Cisneros M (2013 Nov 15) A novel evolutionary algorithm inspired by the states of matter for template matching. Expert Syst Appl 40(16):6359–6373
Cuevas E, Echavarría A, Ramírez-Ortegón MA (2014 Mar 1) An optimization algorithm inspired by the states of matter that improves the balance between exploration and exploitation. Appl Intell 40(2):256–272
Davis L (1991) Bit-climbing, representational bias, and test suit design. InProc. Intl. Conf. Genetic algorithm, (pp. 18–23)
Eberhat R, Kennedy J (1995) A new optimizer using particle swarm theory. InSixth international symposium on micro machine and human science, Piscataway (pp. 39–43)
Eid A, Kamel S, Abualigah L (2021) Marine predators algorithm for optimal allocation of active and reactive power resources in distribution networks. Neural Comput & Applic 33:1–29
Elaziz A, Mohamed LA, Attiya A (2021) Advanced optimization technique for scheduling IoT tasks in cloud-fog computing environments. Future Generation Computer Systems
Fogel, David B (1998) Artificial intelligence through simulated evolution. Wiley-IEEE Press
Ghosh KK et al (2021) Theoretical and empirical analysis of filter ranking methods: Experimental study on benchmark DNA microarray data. Expert Systems with Applications 169:114485
Glover F (1989 Aug) Tabu search—part I. ORSA J Comput 1(3):190–206
Gogna A, Tayal A (2013 Dec 1) Metaheuristics: review and application. J Exp Theoretic Artific Intell 25(4):503–526
Hassan MH et al (2021) Development and application of slime mould algorithm for optimal economic emission dispatch. Expert Syst Appl:115205
Holland JH (1992) Genetic algorithms. Sci Am 267(1):66–73
Kaur H, Kumar M (2021) Performance evaluation of various feature selection techniques for offline handwritten Gurumukhi place name recognition. In: Singh T.P., Tomar R., Choudhury T., Perumal T., Mahdi H.F. (eds) data driven approach towards disruptive technologies. Studies in autonomic, data-driven and industrial computing. Springer, Singapore. https://doi.org/10.1007/978-981-15-9873-9_44
Kaveh A, Farhoudi N (2013 May 1) A new optimization method: dolphin echolocation. Adv Eng Softw 59:53–70
Kaveh A, Farhoudi N (2016 Mar 1) Dolphin monitoring for enhancing metaheuristic algorithms: layout optimization of braced frames. Comput Struct 165:1–9
Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing science 220(4598):671–680
Li J, Lin J (2020 May 1) A probability distribution detection based hybrid ensemble QoS prediction approach. Inf Sci 519:289–305
Li J, Zheng XL, Chen ST, Song WW, Chen DR (2014 Jun 10) An efficient and reliable approach for quality-of-service-aware service composition. Inf Sci 269:238–254
Li J, Chen C, Chen H, Tong C (2017 Jul 1) Towards context-aware social recommendation via individual trust. Knowl-Based Syst 127:58–66
Liu E, Lv L, Yi Y, Xie P (2019 Jun 24) Research on the steady operation optimization model of natural gas pipeline considering the combined operation of air coolers and compressors. IEEE Access 7:83251–83265
Lourenço HR, Martin OC, Stützle T (2003) Iterated local search. InHandbook of metaheuristics 2003 (pp. 320-353). Springer, Boston, MA
Mahajan S, Pandit AK (2021) Hybrid method to supervise feature selection using signal processing and complex algebra techniques. Multimed Tools Appl. https://doi.org/10.1007/s11042-021-11474-y
Mahajan S, Mittal N, Pandit AK (2021) Image segmentation using multilevel thresholding based on type II fuzzy entropy and marine predators algorithm. Multimed Tools Appl 26:1–25
Mahajan S, Abualigah L, Pandit AK, Altalhi M (2022) Hybrid Aquila optimizer with arithmetic optimization algorithm for global optimization tasks. Soft Comput. https://doi.org/10.1007/s00500-022-06873-8
Marler RT, Arora JS (2004 Apr 1) Survey of multi-objective optimization methods for engineering. Struct Multidiscip Optim 26(6):369–395
Michalewicz Z (2013) Genetic algorithms + data structures = evolution programs. Springer Science & Business Media
Mirjalili S, Lewis A (2013 Apr 1) S-shaped versus V-shaped transfer functions for binary particle swarm optimization. Swarm and Evolutionary Computation 9:1–4
Mirjalili S, Mirjalili SM, Lewis A (2014 Mar 1) Grey wolf optimizer. Adv Eng Softw 69:46–61
Pang J, Zhou H, Tsai YC, Chou FD (2018 Sep 1) A scatter simulated annealing algorithm for the bi-objective scheduling problem for the wet station of semiconductor manufacturing. Comput Ind Eng 123:54–66
Şahin CB, Abualigah L (2021) A novel deep learning-based feature selection model for improving the static analysis of vulnerability detection. Neural Comput Appl:1–19
Saremi S, Mirjalili S, Lewis A (2017 Mar 1) Grasshopper optimisation algorithm: theory and application. Adv Eng Softw 105:30–47
Sharaff A et al (2020) Personalized recommendation system with user interaction based on LMF and popularity model. In: 2020 International Conference on System, Computation, Automation and Networking (ICSCAN). IEEE
Sharma D, Sharaff A (2019) Identifying spam patterns in SMS using genetic programming approach. In: 2019 International Conference on Intelligent Computing and Control Systems (ICCS). IEEE
Spall JC (2005) Introduction to stochastic search and optimization: estimation, simulation, and control. John Wiley & Sons
Storn R, Price K (1997) Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359
Walia S, Kumar K, Kumar M, Gao X-Z (2021) Fusion of handcrafted and deep features for forgery detection in digital images. IEEE Access 9:99742–99755. https://doi.org/10.1109/ACCESS.2021.3096240
Wen F, Yang X, Gong X, Lai KK (2017 Jan 5) Multi-scale volatility feature analysis and prediction of gold price. International Journal of Information Technology & Decision Making 16(01):205–223
Wolpert DH, Macready WG (1997 Apr) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
Yan J, Pu W, Zhou S, Liu H, Bao Z (2020 Mar 1) Collaborative detection and power allocation framework for target tracking in multiple radar system. Information Fusion 55:173–183
Yang XS (2010) Nature-inspired metaheuristic algorithms. Luniver press
Yang XS (2010 Jan 1) Firefly algorithm, stochastic test functions and design optimisation. Int J Bio-Inspired Comput 2(2):78–84
Yang XS (2010) A new metaheuristic bat-inspired algorithm. InNature inspired cooperative strategies for optimization (NICSO 2010) 2010 (pp. 65-74). Springer, Berlin. Heidelberg
Yang XS (2010) Nature-inspired metaheuristic algorithms. Luniver press
Yang XS (2012) Flower pollination algorithm for global optimization. InInternational conference on unconventional computing and natural computation 2012 Sep 3 (pp. 240-249). Springer, Berlin, Heidelberg
Yang XS, Deb S (2010 Jan 1) Engineering optimisation by cuckoo search. International Journal of Mathematical Modelling and Numerical Optimisation 1(4):330–343
Yang XS, Bramer M, Ellis R, Petridis M (2010) Research and development in intelligent systems XXVI. Development, Springer
Yang Y, Chen H, Heidari AA, Gandomi AH (2021 Mar) Hunger games search: visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Syst Appl 10:114864
Zhou A, Qu BY, Li H, Zhao SZ, Suganthan PN, Zhang Q (2011 Mar 1) Multiobjective evolutionary algorithms: a survey of the state of the art. Swarm and Evolutionary Computation. 1(1):32–49
Data and code availability
Not Applicable
Funding
The authors received no specific funding for this study.
Ethics declarations
Conflict of interest
Authors declare that they have no conflicts of interest to report regarding the present study.
Cite this article
Mahajan, S., Abualigah, L. & Pandit, A.K. Hybrid arithmetic optimization algorithm with hunger games search for global optimization. Multimed Tools Appl 81, 28755–28778 (2022). https://doi.org/10.1007/s11042-022-12922-z