
1 Introduction

Everyone loves music. It is the one art form entirely defined by time. Music can be broadly divided into three groups, namely classical, jazz, and rock (Jarrett and Day 2008). In addition, a fascinating property of music is that it can be played improvisationally. For example, at a concert you may find two or three guitar players improvising freely based on their own trained habits (French 2012). It may look like the act of grabbing a few seemingly random notes, yet in the end you will admire the resulting melodic ideas. Inspired by this, several music based algorithms have been proposed recently.

1.1 Harmony

Generally speaking, harmony is one of the major building blocks for constructing a musical bridge between different melodic themes; the other two are rhythm and melody. Harmony can be defined as any combination of notes played simultaneously (Jarrett and Day 2008). The elementary study of harmony concerns chord progression, in which a series of chords is played in order (Yi and Goldsmith 2010). One good source of harmony is the melody itself; in fact, the two interact with each other. In addition, different harmonies can evoke entirely different feelings.

2 Harmony Search Algorithm

2.1 Fundamentals of Harmony Search Algorithm

The harmony search (HS) algorithm was originally proposed by Geem et al. (2001). It is based on natural musical performance processes in which musicians improvise the pitches of their instruments while searching for a pleasing harmony (a perfect state). Analogously, HS finds solutions by evaluating an objective function (i.e., the audience's aesthetics) over a set of values (i.e., the musicians) assigned to each decision variable (i.e., the musical instrument's pitch). In general, the HS algorithm has three main operations: harmony memory (HM) consideration, pitch adjustment, and randomization (Geem et al. 2001). The HS algorithm is performed in several steps, outlined below (Geem et al. 2001):

  • Preparation of harmony memory: The main building block of HS is the HM, in which multiple randomized solution vectors are stored via Eq. 14.1 (Geem 2009):

    $$ {\text{HM}} = \left[ {\left. {\begin{array}{*{20}l} {D_{1}^{1} } \hfill & {D_{2}^{1} } \hfill & \cdots \hfill & {D_{n}^{1} } \hfill \\ {D_{1}^{2} } \hfill & {D_{2}^{2} } \hfill & \cdots \hfill & {D_{n}^{2} } \hfill \\ \vdots \hfill & \vdots \hfill & \cdots \hfill & \vdots \hfill \\ {D_{1}^{HMS} } \hfill & {D_{2}^{HMS} } \hfill & \cdots \hfill & {D_{n}^{HMS} } \hfill \\ \end{array} } \right|\begin{array}{*{20}l} {f\left( {{\mathbf{D}}^{1} } \right)} \hfill \\ {f\left( {{\mathbf{D}}^{2} } \right)} \hfill \\ \vdots \hfill \\ {f\left( {{\mathbf{D}}^{HMS} } \right)} \hfill \\ \end{array} } \right], $$
    (14.1)

    where \( D_{i}^{j} \) is the ith decision variable in the jth solution vector, which takes one discrete value out of a candidate set \( \left\{ {D_{i} \left( 1 \right),D_{i} \left( 2 \right), \ldots ,D_{i} \left( k \right), \ldots ,D_{i} \left( {K_{i} } \right)} \right\} \), \( f\left( {{\mathbf{D}}^{j} } \right) \) is the objective function value for the jth solution vector, and HMS is the harmony memory size (i.e., the number of solution vectors stored in the HM).

  • Improvisation of new harmony: A new harmony vector \( {\mathbf{D}}^{new} = \left( {D_{1}^{new} ,D_{2}^{new} , \ldots ,D_{n}^{new} } \right) \) is improvised by the following three rules (Geem 2009):

  (1) Random selection: Based on this rule, one value is chosen out of the candidate set via Eq. 14.2 (Geem 2009):

$$ D_{i}^{new} \leftarrow D_{i} \left( k \right), \, D_{i} \left( k \right) \in \left\{ {D_{i} \left( 1 \right),D_{i} \left( 2 \right), \ldots ,D_{i} \left( {K_{i} } \right)} \right\}. $$
(14.2)
  (2) HM consideration: In memory consideration, one value is chosen out of the HM with a probability of harmony memory consideration rate (HMCR) via Eq. 14.3 (Geem 2009):

$$ D_{i}^{new} \leftarrow D_{i} \left( l \right), \, D_{i} \left( l \right) \in \left\{ {D_{i}^{1} ,D_{i}^{2} , \ldots ,D_{i}^{HMS} } \right\}. $$
(14.3)
  (3) Pitch adjustment: According to this rule, the value obtained via Eq. 14.3 is further changed into a neighbouring value, with a probability of pitch adjusting rate (PAR), via Eq. 14.4 (Geem 2009):

$$ D_{i}^{new} \leftarrow D_{i} \left( {l \pm 1} \right), \, D_{i} \left( l \right) \in \left\{ {D_{i}^{1} ,D_{i}^{2} , \ldots ,D_{i}^{HMS} } \right\}. $$
(14.4)

Overall, these three rules form the core of the stochastic derivative of HS and can be summarized via Eq. 14.5 (Geem 2009):

$$ \begin{aligned} \left. {\frac{\partial f}{{\partial D_{i} }}} \right|_{{D_{i} = D_{i} \left( l \right)}} = & \frac{1}{{K_{i} }} \cdot \left( {1 - HMCR} \right) + \frac{{n\left( {D_{i} \left( l \right)} \right)}}{HMS} \cdot HMCR \cdot \left( {1 - PAR} \right) \\ + & \frac{{n\left( {D_{i} \left( {l \pm 1} \right)} \right)}}{HMS} \cdot HMCR \cdot PAR, \\ \end{aligned} $$
(14.5)

where \( \frac{1}{{K_{i} }} \cdot \left( {1 - HMCR} \right) \) is the rate of choosing a value \( D_{i} \left( l \right) \) for the decision variable \( D_{i} \) by random selection, \( \frac{{n\left( {D_{i} \left( l \right)} \right)}}{HMS} \cdot HMCR \cdot \left( {1 - PAR} \right) \) is the rate of choosing it by HM consideration, and \( \frac{{n\left( {D_{i} \left( {l \pm 1} \right)} \right)}}{HMS} \cdot HMCR \cdot PAR \) is the rate of choosing it by pitch adjustment, where \( n\left( \cdot \right) \) counts the occurrences of the given value in the HM.

  • Update of HM: Once the new vector \( {\mathbf{D}}^{new} = \left( {D_{1}^{new} ,D_{2}^{new} , \ldots ,D_{n}^{new} } \right) \) is completely generated, it is compared with the vectors stored in the HM. If it is better than the worst vector in the HM with respect to the objective function, the HM is updated (i.e., the new harmony is included in the HM and the existing worst harmony is excluded from it).
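As a quick numeric check of Eq. 14.5, the sketch below evaluates the three selection rates; all parameter values and the occurrence counts \( n\left( \cdot \right) \) are illustrative assumptions, not values from the original paper.

```python
# Illustrative parameter values (assumptions chosen for demonstration)
K_i, HMS = 10, 30          # candidate-set size and harmony memory size
HMCR, PAR = 0.9, 0.3       # memory consideration and pitch adjusting rates
n_l, n_neighbours = 6, 4   # assumed occurrences of D_i(l) and D_i(l +/- 1) in the HM

rate_random = (1.0 / K_i) * (1 - HMCR)          # random selection term
rate_memory = (n_l / HMS) * HMCR * (1 - PAR)    # HM consideration term
rate_pitch = (n_neighbours / HMS) * HMCR * PAR  # pitch adjustment term

# The three terms sum to the stochastic derivative of Eq. 14.5
stochastic_derivative = rate_random + rate_memory + rate_pitch
print(stochastic_derivative)  # approximately 0.172
```

Note how a high HMCR concentrates most of the selection probability on values already stored in the HM, which is what drives convergence.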

The optimization procedures of the HS algorithm are given as follows (Lee and Geem 2009; Geem et al. 2001):

  • Step 1: Initialize the optimization problem and algorithm parameters.

  • Step 2: Initialization of HM.

  • Step 3: Improvise a new harmony from the HM.

  • Step 4: Update the HM.

  • Step 5: Repeat Steps 3 and 4 until the termination criterion is satisfied.
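Steps 1-5 can be sketched as a minimal discrete HS loop. This is an illustrative implementation under assumed defaults, not the authors' code; the function name `harmony_search` and all parameter values are assumptions.

```python
import random

def harmony_search(candidates, objective, hms=10, hmcr=0.9, par=0.3, iterations=1000):
    """A minimal discrete HS sketch (Steps 1-5).

    candidates: one ordered list of candidate values per decision variable.
    objective:  function to minimize over a solution vector.
    """
    # Step 2: initialize the HM with HMS random solution vectors (Eq. 14.1)
    hm = [[random.choice(c) for c in candidates] for _ in range(hms)]
    for _ in range(iterations):
        # Step 3: improvise a new harmony using the three rules
        new = []
        for i, c in enumerate(candidates):
            if random.random() < hmcr:
                # HM consideration (Eq. 14.3): copy a value stored in the HM
                value = hm[random.randrange(hms)][i]
                if random.random() < par:
                    # Pitch adjustment (Eq. 14.4): move to a neighbouring value
                    k = c.index(value) + random.choice((-1, 1))
                    value = c[min(max(k, 0), len(c) - 1)]
            else:
                # Random selection (Eq. 14.2): pick any candidate value
                value = random.choice(c)
            new.append(value)
        # Step 4: replace the worst HM vector if the new harmony is better
        worst = max(range(hms), key=lambda j: objective(hm[j]))
        if objective(new) < objective(hm[worst]):
            hm[worst] = new
        # Step 5: repeat until the iteration budget is exhausted
    return min(hm, key=objective)
```

For example, minimizing the sum of squares over integer candidates in [-5, 5] for three variables typically converges to a vector at or near all zeros.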

2.2 Performance of HS

To show how the HS algorithm performs, three problems were presented in Geem et al. (2001) to demonstrate its search ability: the travelling salesman problem, a relatively simple constrained minimization problem, and a water network design problem. The computational results showed that HS outperforms other existing heuristic methods [such as the genetic algorithm (GA)] on two of these applications (i.e., the constrained minimization problem and the water network design problem).

3 Emerging Music Inspired Algorithms

Although music inspired algorithms are new members of the computational intelligence (CI) family, a number of them have already been proposed in the literature. This section gives an overview of some of these algorithms, which have been demonstrated to be very efficient and robust.

3.1 Melody Search Algorithm

3.1.1 Fundamentals of Melody Search Algorithm

The melody search (MeS) algorithm was originally proposed by Ashrafi and Dariane (2011). It is inspired by the basic concepts of HS, but unlike HS, which uses a single HM, MeS employs a group-improvisation procedure [i.e., several memories called player memories (PMs)] simultaneously for finding the best succession of pitches in a melody. The main steps of MeS are outlined as follows (Ashrafi and Dariane 2011):

  • Step 1: Initialize the optimization problem and adopt the algorithm parameters. In general, six major parameters are defined in MeS, namely the number of player memories (PMN), the player memory size (PMS), the maximum number of iterations (NI), the maximum number of iterations for the initial phase (NII), the bandwidth (bw), and the player memory considering rate (PMCR).

  • Step 2: Initial phase, in which two procedures (improvising a new melody from each PM and updating each PM) are repeated until the stopping criterion for this step (i.e., NII) is satisfied:

  (1) The PMs are initialized via Eq. 14.6 (Ashrafi and Dariane 2011):

$$ MM = \left[ {PM_{1} ,PM_{2} , \ldots ,PM_{PMN} } \right], $$
(14.6)

where MM denotes the melody memory, in which a set of player memories is involved. The PM matrices are generated via Eqs. 14.7 and 14.8, respectively (Ashrafi and Dariane 2011):

$$ PM_{i} = \left[ {\left. {\begin{array}{*{20}l} {x_{i,1}^{1} } \hfill & {x_{i,1}^{2} } \hfill & \cdots \hfill & {x_{i,1}^{D} } \hfill \\ {x_{i,2}^{1} } \hfill & {x_{i,2}^{2} } \hfill & \cdots \hfill & {x_{i,2}^{D} } \hfill \\ \vdots \hfill & \vdots \hfill & \cdots \hfill & \vdots \hfill \\ {x_{i,PMS}^{1} } \hfill & {x_{i,PMS}^{2} } \hfill & \cdots \hfill & {x_{i,PMS}^{D} } \hfill \\ \end{array} } \right|\begin{array}{*{20}l} {Fit_{i}^{1} } \hfill \\ {Fit_{i}^{2} } \hfill \\ \vdots \hfill \\ {Fit_{i}^{PMS} } \hfill \\ \end{array} } \right], $$
(14.7)
$$ \begin{aligned} x_{i,j}^{k} &= LB_{k} + r \cdot \left( {UB_{k} - LB_{k} } \right), \hfill \\ &{\text{for }}\left\{ {\begin{array}{*{20}l} {i = 1,2, \ldots ,PMN} \hfill \\ {j = 1,2, \ldots ,PMS} \hfill \\ {k = 1,2, \ldots ,D} \hfill \\ \end{array} } \right., \hfill \\ \end{aligned} $$
(14.8)

where D is the number of pitches of the melodic line (i.e., the number of decision variables), \( \left[ {LB_{k} ,UB_{k} } \right] \) is the possible range of the kth search dimension, and r is a real number uniformly distributed in \( \left[ {0,1} \right] \).

  (2) Improvise a new melody \( X_{i,new} = \left( {x_{i,new}^{1} ,\,x_{i,new}^{2} , \ldots ,x_{i,new}^{n} } \right) \) from each PM according to three rules (Ashrafi and Dariane 2011):

Memory consideration: The value of each variable can be chosen from any value in the specified PM.

Pitch adjustment: Based on this rule, the value can be adjusted using a constant pitch bandwidth (bw) and a pitch adjusting rate (PAR) given by Eq. 14.9 (Ashrafi and Dariane 2011):

$$ PAR_{t} = PAR_{\hbox{min} } + \frac{{PAR_{\hbox{max} } - PAR_{\hbox{min} } }}{NI} \times t, $$
(14.9)

where \( PAR_{t} \) is the pitch adjusting rate at the tth iteration, \( PAR_{\hbox{min} } \) and \( PAR_{\hbox{max} } \) are the minimum and maximum adjusting rates, respectively, and NI is the maximum number of iterations.

Randomization: This rule is used to increase the diversity of the solutions.

  (3) Update each PM.

  • Step 3: Second phase, in which the following procedures are repeated until NI is satisfied:

  (1) Improvise a new melody from each PM according to the possible range of pitches.

  (2) Update each PM.

  (3) Finally, determine the possible ranges of pitches for the next improvisation (used only for randomization).
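The initialization of Eqs. 14.6-14.8 and the linearly increasing PAR of Eq. 14.9 can be sketched as follows; all parameter values are illustrative assumptions rather than settings from the original paper.

```python
import random

PMN, PMS, D = 3, 5, 4             # number of player memories, memory size, pitches
LB, UB = [0.0] * D, [1.0] * D     # search bounds per dimension (Eq. 14.8)
NI = 100                          # maximum number of iterations
PAR_MIN, PAR_MAX = 0.1, 0.9       # bounds of the pitch adjusting rate

# Eqs. 14.6-14.8: the melody memory MM groups PMN player memories,
# each holding PMS uniformly random solution vectors of dimension D.
melody_memory = [
    [[LB[k] + random.random() * (UB[k] - LB[k]) for k in range(D)]
     for _ in range(PMS)]
    for _ in range(PMN)
]

def par(t):
    # Eq. 14.9: PAR grows linearly from PAR_MIN towards PAR_MAX with iteration t
    return PAR_MIN + (PAR_MAX - PAR_MIN) / NI * t
```

The growing PAR shifts the search from exploration in early iterations towards finer pitch adjustments later on.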

3.1.2 Performance of MeS

To evaluate the performance of MeS, five classical benchmark functions were tested in Ashrafi and Dariane (2011). Compared with other CI methods [such as artificial bee colony (ABC), GA, HS, particle swarm optimization (PSO), and particle swarm and evolutionary algorithm (PS-EA)], MeS is capable of finding better solutions.

3.2 Method of Musical Composition Algorithm

3.2.1 Fundamentals of Method of Musical Composition Algorithm

The method of musical composition (MMC) algorithm was originally proposed by Mora-Gutiérrez et al. (2012). MMC uses a dynamic creative system in which the composers exchange information among themselves and with their environment to compose music. Normally, MMC involves the following four steps (Mora-Gutiérrez et al. 2012):

  • Initialization: In this step, the scores (\( P_{*,*,i} \)), which serve as the memory, are randomly generated via Eqs. 14.10 and 14.11, respectively (Mora-Gutiérrez et al. 2012):

    $$ P_{*,*,i} = \left( {\begin{array}{*{20}c} {x_{1,1} } \hfill & {x_{1,2} } \hfill & \cdots \hfill & {x_{1,n} } \hfill \\ {x_{2,1} } \hfill & {x_{2,2} } \hfill & \cdots \hfill & {x_{2,n} } \hfill \\ \vdots \hfill & \vdots \hfill & \cdots \hfill & \vdots \hfill \\ {x_{Ns,1} } \hfill & {x_{Ns,2} } \hfill & \cdots \hfill & {x_{Ns,n} } \hfill \\ \end{array} } \right), $$
    (14.10)
    $$ x_{j,l} = x_{l}^{L} + rand \cdot \left( {x_{l}^{U} - x_{l}^{L} } \right), $$
    (14.11)

    where \( P_{*,*,i} \) is the score of the ith composer, \( x_{j,l} \) is the lth decision variable of the jth tune, rand is a real number uniformly distributed in \( \left[ {0,1} \right] \), and \( \left[ {x_{l}^{L} ,x_{l}^{U} } \right] \) is the possible range of the lth search dimension.

  • Exchanging information among agents: The interaction policy states that "composer i exchanges a tune with composer k if and only if there is a link between them and the worst tune of composer k is better than the worst tune of composer i". Two sub-phases (i.e., updating the links among composers and exchanging the information) are employed.

  • Generating a new tune for each agent: A new tune is created based on the composer's background and his innovative ideas. This phase includes two sub-phases, i.e., building the background of each composer (\( KM_{*,*,i} \)), which includes the knowledge of composer i and the environmental information he perceives, and creating a new tune.

  • Updating \( P_{*,*,i} \): The score is updated based on the value of the objective function.
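The initialization step above (Eqs. 14.10 and 14.11) can be sketched as follows; the composer count and the variable bounds are illustrative assumptions.

```python
import random

COMPOSERS, NS, N = 4, 6, 3             # agents, tunes per score, variables per tune
LOWER, UPPER = [-5.0] * N, [5.0] * N   # [x_l^L, x_l^U] per decision variable

def random_tune():
    # Eq. 14.11: x_{j,l} = x_l^L + rand * (x_l^U - x_l^L)
    return [LOWER[l] + random.random() * (UPPER[l] - LOWER[l]) for l in range(N)]

# Eq. 14.10: one Ns-by-n score matrix P_{*,*,i} per composer
scores = [[random_tune() for _ in range(NS)] for _ in range(COMPOSERS)]
```

Each composer thus starts from its own randomly filled score, which is later refined through information exchange and tune creation.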

3.2.2 Performance of MMC

To show the performance of MMC, 13 continuous benchmark optimization problems were solved in Mora-Gutiérrez et al. (2012). Compared with HS, improved HS, global-best HS, and self-adaptive HS, the experimental results showed that MMC improves on the results obtained by the other methods, especially in the domain of multimodal functions.

4 Conclusions

In this chapter, we introduced a set of music inspired algorithms, namely HS, MeS, and MMC. The former two are based on the idea of the improvisation process of a skilled musician, while the last is inspired by the creative process of musical composition. Although the novelty of these music algorithms (e.g., HS) is still under debate [see Weyland (2010) for details], at least one of them, HS, has spread rapidly, as witnessed below:

First, numerous enhanced versions of HS can be found in the literature as outlined below:

  • Box-Muller HS (Fetanat et al. 2011).

  • Chaotic differential HS (Coelho et al. 2010).

  • Chaotic HS (Pan et al. 2011b; Alatas 2010).

  • Coevolutionary differential evolution with HS (Wang and Li 2012).

  • Differential HS (Wang and Li 2013; Qin and Forbes 2011b).

  • Discrete HS (Gandhi et al. 2012; Pan et al. 2010b; Tasgetiren et al. 2012).

  • Effective global best HS (Zou et al. 2011a).

  • Efficient HS (Degertekin 2012).

  • Global-best HS (Omran and Mahdavi 2008).

  • Grouping HS (Landa-Torres et al. 2012; Askarzadeh and Rezazadeh 2011).

  • Guided variable neighborhood embedded HS (Huang et al. 2009).

  • Harmony fuzzy search algorithm (Alia et al. 2009a).

  • Highly reliable HS (Taherinejad 2009).

  • HS with dual-memory (Gao et al. 2012b).

  • Hybrid clonal selection algorithm and HS (Wang et al. 2009).

  • Hybrid differential evolution and HS (Mirkhani et al. 2013; Li and Wang 2009; Liao 2010; Duan et al. 2013; Gao et al. 2009).

  • Hybrid global best HS and K-means algorithm (Cobos et al. 2010).

  • Hybrid global-best HS (Wang et al. 2010, 2011).

  • Hybrid HS (Gao et al. 2012a; Gil-López et al. 2012; Wang et al. 2010).

  • Hybrid HS and hill climbing (Al-Betar and Khader 2009).

  • Hybrid HS and linear discriminate analysis (Moeinzadeh et al. 2009).

  • Hybrid K-means and HS (Mahdavi and Abolhassani 2009; Forsati et al. 2008b).

  • Hybrid modified subgradient and HS (Yaşar and Özyön 2011).

  • Hybrid probabilistic neural networks and HS (Ameli et al. 2012).

  • Hybrid swarm intelligence and HS (Pandi and Panigrahi 2011; Pandi et al. 2011).

  • Improved discrete HS (Shi et al. 2011).

  • Improved HS based on exponential distribution (Coelho and Mariani 2009).

  • Intelligent tuned HS (Yadav et al. 2012).

  • Learning automata-based HS (Enayatifar et al. 2013).

  • Local-best HS with dynamic sub-harmony memories (Pan et al. 2011a).

  • Mixed-discrete HS (Jaberipour and Khorram 2011).

  • Modified HS (Kaveh and Nasr 2011; Zinati and Razfar 2012; Al-Betar and Khader 2012; Gao et al. 2008; Das et al. 2011; Mun and Cho 2012).

  • Multiobjective HS (Sivasubramani and Swarup 2011a, b; Li et al. 2012).

  • Novel global HS (Zou et al. 2010a, b, c, 2011b).

  • Opposition-based HS (Chatterjee et al. 2012).

  • Other hybrid HS (Jang et al. 2008; Yıldız 2008; Fesanghary et al. 2008; Zhao and Suganthan 2010).

  • Other improved HS (Afshari et al. 2011; Fourie et al. 2010; Sirjani et al. 2011; Yadav et al. 2011; Kaveh and Abadi 2010; Geem and Williams 2008; Geem 2010, 2012; Mahdavi et al. 2007; Coelho and Bernert 2009; Chakraborty et al. 2009; Jaberipour and Khorram 2010b; Qin and Forbes 2011a; Al-Betar et al. 2012).

  • Parallel HS (Lee and Zomaya 2009).

  • Parameter-setting-free HS (Geem and Sim 2010).

  • Particle-swarm enhanced HS (Geem 2009; Li et al. 2008; Zhao et al. 2011; Cheng et al. 2012).

  • Quantum inspired HS (Layeb 2013).

  • Self-adaptive global best HS (Kulluk et al. 2011; Pan et al. 2010a).

  • Self-adaptive HS (Degertekin 2012; Wang and Huang 2010; Chang and Gu 2012).

  • Social HS (Kaveh and Ahangaran 2012).

Second, the HS algorithm has been successfully applied to a variety of optimization problems as listed below:

  • Adaptive parameter controlling (Nadi et al. 2010).

  • Analog filter design optimization (Vural et al. 2013).

  • Antenna design optimization (Guney and Onay 2011).

  • Artificial neural network training (Kattan et al. 2010; Kattan and Abdullah 2011a, b; Kulluk et al. 2011, 2012).

  • Communication networks optimization (Forsati et al. 2008a; Shi et al. 2011; Landa-Torres et al. 2012; Ser et al. 2012).

  • Data mining (Mahdavi and Abolhassani 2009; Mahdavi et al. 2008; Forsati et al. 2008b; Moeinzadeh et al. 2009; Wang et al. 2009; Venkatesh et al. 2010; Cobos et al. 2010; Ramos et al. 2011).

  • Engineering design optimization (Mohammadi et al. 2011; Gil-López et al. 2012; Lee and Geem 2005).

  • Facility location optimization (Afshari et al. 2011; Kaveh and Nasr 2011).

  • Fuel cell research (Askarzadeh and Rezazadeh 2011).

  • Fuzzy-rough rule induction (Diao and Shen 2012).

  • Ground motion records analysis (Kayhan et al. 2011).

  • Image processing (Alia et al. 2009a, b, 2008, 2010; Fourie et al. 2010).

  • Interaction parameter estimation problem (Merzougui et al. 2012).

  • Knapsack problem (Zou et al. 2011b; Layeb 2013).

  • Lot sizing problem (Piperagkas et al. 2012).

  • Materials research (Mun and Geem 2009).

  • Milling process optimization (Razfar et al. 2011; Zarei et al. 2009; Zinati and Razfar 2012).

  • Music composition (Geem and Choi 2007).

  • Orienteering problem (Geem et al. 2005c).

  • Parameter-setting-free technique enhanced HS (Geem and Sim 2010).

  • Power system optimization (Vasebi et al. 2007; Mukhopadhyay et al. 2008; Fesanghary and Ardehali 2009; Coelho and Mariani 2009; Coelho et al. 2010; Yaşar and Özyön 2011; Fetanat et al. 2011; Geem 2011; Pandi and Panigrahi 2011; Pandi et al. 2011; Sivasubramani and Swarup 2011a, b; Khorram and Jaberipour 2011; Boroujeni et al. 2011a, b, c, d; Sirjani et al. 2011; Khazali and Kalantar 2011; Shariatkhah et al. 2012; Ezhilarasi and Swarup 2012; Javadi et al. 2012; Chatterjee et al. 2012; Ameli et al. 2012; Wang and Li 2013; Zhang et al. 2013).

  • Robot control optimization (Mirkhani et al. 2013).

  • Scheduling optimization (Huang et al. 2009; Zou et al. 2010a; Wang et al. 2010, 2011; Pan et al. 2010b, 2011a, b; Yadav et al. 2011; Gao et al. 2012a; Ahmad et al. 2012; Geem 2007; Tasgetiren et al. 2012).

  • Signal processing (Gandhi et al. 2012; Guo et al. 2012).

  • Software design optimization (Alsewari and Zamli 2012a, b).

  • Structure design optimization (Geem et al. 2005b; Geem and Hwangbo 2006; Degertekin 2008, 2012; Fesanghary et al. 2009, 2012; Kaveh and Talataha 2009; Kaveh and Abadi 2010; Hasançebi et al. 2010; Khajehzadeh et al. 2011; Bekdaş and Nigdeli 2011; Erdal et al. 2011; Lagaros and Papadrakakis 2012; Kaveh and Ahangaran 2012; Shahrouzi and Sazjini 2012; Miguel and Miguel 2012; Lee and Geem 2004; Ryu et al. 2007; Lee et al. 2011).

  • Sudoku puzzle problem (Geem 2008a).

  • Sum-of-ratios problem solving (Jaberipour and Khorram 2010a).

  • Supply chain optimization (Wong and Guo 2010; Taleizadeh et al. 2011, 2012; Purnomo et al. 2012).

  • System reliability optimization (Zou et al. 2010c, 2011a; Wang and Li 2012).

  • Timetabling (Al-Betar and Khader 2009, 2012; Al-Betar et al. 2008, 2010).

  • Transportation system optimization (Ceylan et al. 2008).

  • Vehicle routing problem (Geem et al. 2005b).

  • Water network optimization (Geem 2006a, b, 2008b, 2009; Ayvaz 2007, 2009; Mora-Meliá et al. 2009; Geem et al. 2011; Geem and Park 2006).

Interested readers may refer to these works, together with several excellent reviews [e.g., Alia and Mandava (2011), Manjarres et al. (2013), and Geem et al. (2008)], as a starting point for further exploration and exploitation of these music inspired algorithms.