
1 Introduction

Many artificial intelligence mechanisms draw inspiration from nature, following natural evolutionary processes in their internal transformations. For example, the Genetic Algorithm (GA), based on the concepts of natural genes and the evolutionary inheritance process, has been applied to solve many problems [1]. However, due to its inherent inefficiencies such as premature convergence [2], the Memetic Algorithm (MA) was developed to address them.

Although MA builds on genetic evolution, it introduces another form of evolution called memetic evolution [3,4,5], in which memes are transmitted from brain to brain through interaction and experience. In MA, a meme represents a unit of information that can be passed from one individual to another, such as an idea, belief, attitude or principle. These memes can then multiply across generations.

Moscato [1] describes some of the benefits of memetic computing and the efficacy of hybrid genetic algorithms in generating high-quality solutions for complex and dynamic problems. One major benefit is the ability to refine solutions through neighborhood search, averting the premature convergence common to global search methods. MA leverages both global and local search to improve the overall search, offsetting the weaknesses of each method alone: global methods do not perform intensification to exploit the search space, while local methods do not perform diversification to explore it. MA alleviates both limitations.

MA has been extensively applied to a variety of optimization problems [6,7,8,9,10,11,12]. However, MAs have mainly been applied to stationary environments, while current problems have become more complex, continuous and dynamic [13,14,15], creating an increasing demand for more intelligent mechanisms.

Predictive approaches in evolutionary algorithms help to track moving optima in stochastic problem environments. This paper therefore introduces prediction into MA for dynamic problem environments, based on a nonlinear regression method [18, 19], which lends itself to rugged landscapes and shifting optima in capricious search spaces.

The remainder of this article is organized as follows. Sect. 2 introduces the problem and Sect. 3 reviews related work. Sect. 4 discusses dynamic landscapes, and Sect. 5 introduces the proposed architecture. Results are discussed in Sect. 6, and Sect. 7 covers the conclusion, limitations and potential future research.

2 Problem Definition

In a stochastic environment it is difficult to predict the optimal outcome, especially when the optimum is constantly moving within the search space; this increases the complexity and intractability of the problem. In combinatorial optimization, multiple solutions are generated for an optimization problem, which can be defined by an objective function over a feasible region in which alternative solutions are found. The problem at hand is nonlinear, reflecting its stochasticity, and the optimization objective is to find a solution with the smallest error, denoted by \(\varepsilon \). Here \(R^{p}\) is the parameter space, \(R^{m}\) is the solution space and \(f(x,\theta )\) is a nonlinear function of \(\theta \). The problem is therefore to estimate \(\theta \) in the standard nonlinear regression model of Eqs. 1 and 2, where \(\sigma \) is the standard deviation of the noise and N is the sample size.

$$\begin{aligned} y=f(x,\theta )+ \varepsilon \end{aligned}$$
(1)
$$\begin{aligned} \varepsilon \sim N(0,\sigma ^{2}) \end{aligned}$$
(2)

For \(x\in R^{m}\), \(f(x,\theta )\) is nonlinear in \(\theta \), where \(\theta \in R^{p}\) [29]. Consider a solution set \((y_{i},x_{i})\) for \(i=1,2,3,\ldots , N\). We then seek the estimate \(\hat{\theta }\) that minimizes \(T(\theta )\):

$$\begin{aligned} T(\hat{\theta })\le T(\theta ) \end{aligned}$$
(3)
$$\begin{aligned} T(\theta )=\sum \limits _{i=1}^{N}[y_{i}-f(x_{i},\theta )]^{2} \end{aligned}$$
(4)
$$\begin{aligned} \min _{\theta } T(\theta ) \end{aligned}$$
(5)

The nonlinear estimation problem is, for all intents and purposes, an optimization problem; it therefore requires the minimization of the objective function \(T(\theta )\) in Eqs. 3, 4 and 5 over the solutions in the sample.
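To make the estimation concrete, the sketch below fits a hypothetical nonlinear model \(f(x,\theta )=\theta _{1}e^{\theta _{2}x}\) by minimizing \(T(\theta )\) with a damped Gauss-Newton iteration. The paper does not fix a specific \(f\), so the model choice, function names and parameter values here are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Hypothetical model f(x, theta) = theta[0] * exp(theta[1] * x); the paper
# does not fix a specific f, so this choice is purely illustrative.
def f(x, theta):
    return theta[0] * np.exp(theta[1] * x)

def T(theta, x, y):
    # Objective of Eqs. 4-5: sum of squared residuals.
    return np.sum((y - f(x, theta)) ** 2)

def gauss_newton(x, y, theta0, iters=100):
    # Minimize T(theta): linearize f around the current theta and solve
    # the resulting linear least-squares step, damping it if it overshoots.
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        r = y - f(x, theta)                                  # residuals
        J = np.column_stack([np.exp(theta[1] * x),           # df/dtheta0
                             theta[0] * x * np.exp(theta[1] * x)])  # df/dtheta1
        step = np.linalg.solve(J.T @ J + 1e-8 * np.eye(2), J.T @ r)
        t = 1.0
        # halve the step until the objective actually decreases
        while t > 1e-6 and T(theta + t * step, x, y) >= T(theta, x, y):
            t *= 0.5
        theta = theta + t * step
    return theta

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)                    # N = 50 observations
theta_true = np.array([2.0, 0.8])
y = f(x, theta_true) + rng.normal(0.0, 0.05, 50) # data drawn per Eqs. 1-2
theta_hat = gauss_newton(x, y, theta0=[1.0, 0.0])  # estimate of Eq. 3
```

Gauss-Newton exploits the least-squares structure of \(T(\theta )\); an evolutionary method such as the MA discussed later can be substituted when the landscape is too rugged for gradient-based steps.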

3 Related Work

Zhou et al. [20] presented an adaptive hill-climbing strategy implemented with MA for dynamic optimization problems. Crossover-based and mutation-based hill climbing are combined to increase search diversity, although the other integrated methods, such as adaptive dual mapping and triggered random immigrants, introduce high computational complexity. Other prediction techniques were introduced in [12, 14,15,16] to foretell the location of the Pareto front in multiobjective problems. The Pareto-based approach draws on past information to read patterns and make predictions, in addition to using an apriori technique. This is similar to the multiobjective method presented in [17,18,19,20], where the population is reused and past details are drawn on to reach good solutions. The past record of an individual is characterized by the changes that have taken place, and the perturbation record helps to determine the behavior of subsequent solutions. However, different variations are factored in, over and above the population parameters and time.

The algorithm presented in [21, 22] estimates how the changes will unfold at a certain period of time; the Osmera function and the knapsack problem were used to test the approach. The optimum in the last generation is used to shape the optimum in the next generation. The method uses two predictors, a noisy predictor and a perfect predictor, which draw on observations and data about individuals. In a similar vein, Weicker [23] shows how the decision process at every stage provides leverage when solving recurrent problems, drawing patterns through deeper study of the relationships between the problem and possible solutions in the sample. The Kalman-based algorithm presented by Stroud [24] highlights the importance of evaluating the strength of individuals through fitness values and using this analysis for better decision making; all information is useful, from the specific elements in the search to the search space itself. However, in nonlinear situations, the current information in linear samples may not be sufficient to accurately track moving optima.

Karaman et al. [25] presented an online dynamic optimization method in which time is one of the factors used to track changes and draw patterns; the method was applied to problems such as vehicle routing and scheduling. In a similar vein, Bosman [26, 27] uses machine learning, evolutionary methods and statistics to forecast environment behaviors. To prevent premature convergence, intensification and diversification are used to enhance the accuracy of the prediction outcome. The majority of related work in the literature shows that predictive research has been directed mostly towards evolutionary algorithms such as genetic algorithms, with little attention to MAs.

4 Dynamic Landscapes

Dynamic environments are explained extensively in [22,23,24, 28], especially regarding how and why they change. Although most research has concentrated on stationary problems, the pendulum is increasingly shifting to dynamic problems due to their proliferation. Hongfeng et al. [41] presented an adaptive hill-climbing method in which greedy crossover and steepest mutation are combined to produce good solutions. Dynamic problems can be either exogenous or endogenous: Weicker [42] defines exogenous dynamics as those directly inherent to the problem, while endogenous dynamics come from the evolutionary process itself. The population keeps evolving as individuals shift positions in the dynamic environment.

Figure 1 shows that the shifting optimum is always on the move, and intelligent methods are needed to track these dynamic alleles. The drifting landscapes are compounded by morphological changes of individuals in the population.

Fig. 1. The nature of candidate solutions in a dynamic setting [42].
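The shifting optimum of Fig. 1 can be illustrated with a minimal dynamic landscape: a single peak whose center drifts over time, so the best solution found at time t is stale at t+1. The peak function, its drift schedule and all parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative dynamic landscape: a single peak whose optimum c(t)
# drifts sinusoidally over time (amplitude and period are assumptions).
def peak_center(t, amplitude=3.0, period=20.0):
    return amplitude * np.sin(2.0 * np.pi * t / period)

def fitness(x, t):
    # Maximized exactly at x = peak_center(t).
    return -(x - peak_center(t)) ** 2

# Grid search at three consecutive time steps shows the maximizer moving.
xs = np.linspace(-4.0, 4.0, 401)
best_at = [xs[np.argmax(fitness(xs, t))] for t in range(3)]
```

A static optimizer converged on `best_at[0]` would already be off-peak at t = 1, which is precisely the tracking problem the predictive MA targets.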

5 Proposed Predictive MA

Algorithm 1. The proposed Predictive MA (PMA).

Algorithm 1 shows the proposed prediction model implemented in the Predictive MA (PMA). The objective of PMA is to find \(\theta \). Upon any sign of environmental change, PMA finds the \(\theta \) that suits the available dataset; using apriori information, \(\theta \) is approximated. After these changes have occurred, as seen in Fig. 1, \(\varepsilon \) is defined as the difference between the estimated and observed values. In error prediction, however, the difference between the generational change and the estimation is registered to provide parameters for dispensing with weak solutions. The smallest error remains the aim of the function, and when the smallest error is recorded, it is benchmarked for the next generation. High-quality approximation of the parameters is important for accuracy, as presented by Simoes and Costa [43].

The proposed model is intrinsically nonlinear. The parameters of the vector are estimated, and minimizing the error is important for increasing accuracy and performance. MA follows the evolutionary process using the parameters shown in Algorithm 1. Nonlinear regression applies to both linear and nonlinear problems; many problems are inherently nonlinear, which makes it hard for linear functions to model nonlinear behavioral patterns well. The choice of nonlinear regression is also based on its robust capacity for data exploitation, manipulating data and behaviors to make accurate inferences. The proposed model approximates the parameters \(\theta \) and makes predictions.
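Since the listing of Algorithm 1 is not reproduced here, the sketch below is one possible reading of the PMA loop: a population of candidate \(\theta \) vectors evolves under selection and mutation, the elite is refined by hill climbing (the memetic step), and a \(\theta \) predicted from the previous environment can seed the population after a detected change. All function names, operators and parameter values are assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x, theta):                       # illustrative nonlinear model
    return theta[0] * np.exp(theta[1] * x)

def T(theta, x, y):                    # objective of Eqs. 4-5
    return np.sum((y - f(x, theta)) ** 2)

def local_search(theta, x, y, step=0.05, iters=20):
    # Memetic refinement: stochastic hill climbing around a candidate.
    best, best_err = theta, T(theta, x, y)
    for _ in range(iters):
        cand = best + rng.normal(0.0, step, size=best.shape)
        err = T(cand, x, y)
        if err < best_err:
            best, best_err = cand, err
    return best

def pma(x, y, pop_size=30, generations=100, seed=None):
    # 'seed' stands in for the theta predicted from the previous
    # environment, injected after a detected change.
    pop = rng.normal(0.0, 1.0, size=(pop_size, 2))
    if seed is not None:
        pop[0] = seed
    for _ in range(generations):
        errs = np.array([T(th, x, y) for th in pop])
        order = np.argsort(errs)
        parents = pop[order[:pop_size // 2]]       # truncation selection
        children = parents + rng.normal(0.0, 0.1, size=parents.shape)
        pop = np.vstack([parents, children])
        pop[0] = local_search(pop[0], x, y)        # refine the elite
    errs = np.array([T(th, x, y) for th in pop])
    return pop[np.argmin(errs)]

# Synthetic data per Eqs. 1-2, then one run of the sketched loop.
x = np.linspace(0.0, 2.0, 50)
theta_true = np.array([2.0, 0.8])
y = f(x, theta_true) + rng.normal(0.0, 0.05, 50)
theta_hat = pma(x, y)
```

The design point is the division of labor the paper describes: the evolutionary loop diversifies across the parameter space, while the hill-climbing step intensifies around the current best \(\theta \).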

Table 1. \(\theta \) estimated by the methods GA, SA and PMA
Table 2. \(\theta \) estimated by the methods GA, SA and PMA
Table 3. \(\theta \) estimated by the methods GA, SA and PMA

6 Experimental Results

The standard problems of nonlinear regression [44,45,46] were used in this study to conduct the experiments. Results were obtained over 100 iterations; in Tables 1, 2 and 3, PMA, the proposed method, performs best by yielding the lowest error values, hence achieving the minimization of the objective function.

The sample size is \(Y=50\) and the number of generations is set at 100; crossover, mutation and recombination are parametrized at 0.2, 0.1 and 0.5 respectively. Owing to the dynamic changes, there is continuous parameter change, over and above the different changes created by the different functions discussed in this paper. The predictor is then subjected to these varied conditions to determine its performance in dynamic environments. The error magnitude is weighed in absolute terms, whereby the gap between the approximation and the exact value is gradually reduced.
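The settings above can be collected into a small configuration sketch together with the absolute-error measure. This is an illustrative encoding of the stated values, not the authors' experimental code.

```python
# Experimental settings as stated in the text (illustrative encoding).
params = {
    "sample_size": 50,        # Y = 50
    "generations": 100,
    "crossover_rate": 0.2,
    "mutation_rate": 0.1,
    "recombination_rate": 0.5,
}

def absolute_error(estimate, exact):
    # Error weighed in absolute terms: the gap between the
    # approximation and the exact value.
    return abs(estimate - exact)
```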

PMA provides better results in terms of error reduction, with \(\varepsilon \) reduced to the lowest level. SA is also subjected to the same functions in the experiment to allow comparison with the other methods, although it represents the local search paradigm. The evolutionary operators include mutation, crossover and recombination, which define candidate features and evolve them into better solutions. The lowest error is obtained by PMA, as seen in Tables 1, 2 and 3. This can be attributed to the nearest-neighbor aspect, which helps to prevent errors in optimization.

7 Conclusion and Future Work

Cyclic change behaviors are important pointers for prediction: the presence of repeated patterns facilitates prediction and significantly increases accuracy. This work proposed a PMA that estimates parameters in a dynamic environment to obtain good solutions. The parameter changes resulting from landscape changes are inherently nonlinear. The algorithm was applied to different nonlinear problem data sets and compared with a genetic algorithm and simulated annealing; the proposed method produces superior results and generates the lowest error. As part of future work, multiple populations will be considered and further memory-based approaches will be explored. More intelligent pattern recognition techniques for environment changes will also be incorporated into the PMA.