Abstract
This chapter follows the development of a class of neural networks (NN) called evolving connectionist systems (ECOS). The term evolving is used here in its meaning of unfolding, developing, changing, revealing (according to the Oxford dictionary), rather than evolutionary; the latter refers to processes over populations and generations of them. An ECOS is a neural network-based model that evolves its structure and functionality through incremental, adaptive learning and self-organization during its lifetime. In principle, it can be a simple NN or a hybrid connectionist system. The latter is a system based on neural networks that also integrates other computational principles, such as linguistically meaningful explanation features of fuzzy rules, optimization techniques for structure and parameter optimization, quantum-inspired methods, and gene regulatory networks. The chapter includes definitions and examples of ECOS, such as evolving neuro-fuzzy and hybrid systems, evolving spiking neural networks, neurogenetic systems, and quantum-inspired systems, all discussed from the point of view of the structural and functional development of a connectionist-based model and the knowledge that it represents. Applications for knowledge engineering across domain areas, such as bioinformatics, brain study, and intelligent machines, are presented.
1 Principles of Evolving Connectionist Systems (ECOS)
Everything in Nature evolves, develops, unfolds, reveals, and changes in time. The brain is probably the ultimate evolving system: it develops during a lifetime, based on genetic information (nature) and learning from the environment (nurture). Inspired by information principles of the developing brain, ECOS are adaptive, incremental learning and knowledge representation systems that evolve their structure and functionality from incoming data through interaction with the environment. At the core of such a system is a connectionist architecture consisting of neurons (information processing units) and connections between them [1]. An ECOS is a system based on neural networks that also uses other techniques of computational intelligence (CI); it operates continuously in time and adapts its structure and functionality through continuous interaction with the environment and with other systems. The adaptation is defined through:
1. A set of evolving rules.
2. A set of parameters (genes) that are subject to change during the system operation.
3. An incoming continuous flow of information, possibly with unknown distribution.
4. Goal (rationale) criteria (also subject to modification) that are applied to optimize the performance of the system over time.
ECOS learning algorithms are inspired by brain-like information processing principles, e.g.:
1. They evolve in an open space, where the dimensions of the space can change.
2. They learn via incremental learning, possibly in an on-line mode.
3. They learn continuously in a lifelong learning mode.
4. They learn both as individual systems and as an evolutionary population of such systems.
5. They use constructive learning and have evolving structures.
6. They learn and partition the problem space locally, thus allowing for fast adaptation and for tracing the evolving processes over time.
7. They evolve different types of knowledge representation from data, mostly a combination of memory-based and symbolic knowledge.
Many methods, algorithms, and computational intelligence systems have been developed since the conception of ECOS, along with many applications across disciplines. This chapter reviews only the fundamental aspects of some of these methods and highlights some principal applications.
2 Hybrid Systems and Evolving Neuro-Fuzzy Systems
2.1 Hybrid Systems
A hybrid computational intelligence system integrates several principles of computational intelligence to enhance different aspects of its performance. Here we discuss only hybrid connectionist systems, which integrate artificial neural networks (NN) with other techniques and utilize the adaptive learning features of the NN.
Early hybrid connectionist systems combined NN with rule-based systems, such as production rules [3] or predicate logic [4]. They utilized NN modules for a lower level of information processing and rule-based systems for reasoning and explanation at a higher level. The same principle applies when fuzzy rules are used for higher-level information processing and approximate reasoning [5, 6, 7]. These are expert systems that combine the learning ability of NN with the explanation power of linguistically plausible fuzzy rules [8, 9, 10, 11]. A block diagram of an exemplar system is shown in Fig. 40.1: at a lower level, a neural network (NN) module predicts the level of a stock index, and at a higher level, a fuzzy reasoning module combines the predicted values with macro-economic variables representing the political and economic situation, using fuzzy trading rules of the type described in [2].
Along with the integration of NN and fuzzy rules for better decision support, the system in Fig. 40.1 includes an NN module for extracting recent rules from data; these rules can be used by experts to analyze the dynamics of the stock and possibly to update the trading fuzzy rules in the fuzzy rule-based module. This NN module uses a fuzzy neural network (FNN) for the rule extraction.
Fuzzy neural networks (FNN) integrate NN and fuzzy rules into a single neuronal model, tightly coupling learning and fuzzy reasoning rules in a connectionist structure. One of the first FNN models was initiated by Yamakawa and other Japanese scientists and promoted at a series of IIZUKA conferences in Japan [12, 13]. Many FNN models were developed based on these principles [2, 14, 15].
2.2 Evolving Neuro-Fuzzy Systems
The evolving neuro-fuzzy systems further extended the principles of hybrid neuro-fuzzy systems and the FNN: instead of training a fixed connectionist structure, the structure and its functionality evolve from incoming data, often in an on-line, one-pass learning mode. This is the case with evolving connectionist systems (ECOS) [1, 16, 17, 18, 19].
ECOS are modular connectionist-based systems that evolve their structure and functionality in a continuous, self-organized, on-line, adaptive, and interactive way from incoming information [17]. They can process both data and knowledge in a supervised and/or unsupervised way. ECOS learn local models from data by clustering the data and associating a local output function with each cluster, represented in a connectionist structure. They can learn incrementally from single data items or chunks of data, and can also incrementally change their input features [18]. Elements of ECOS were already present in early, classical NN models, such as Kohonen's self-organizing maps (SOM) [20], radial basis function (RBF) networks [21], Carpenter et al.'s Fuzzy ARTMAP [22], Fritzke's growing neural gas [23], and Platt's resource-allocating networks (RAN) [24].
Some principles of ECOS are:
- Neurons are created (evolved) and allocated as centers of (fuzzy) data clusters. Fuzzy clustering, as a means to create local knowledge-based systems, was stimulated by the pioneering work of Bezdek, Yager, and Filev [27, 28, 29, 30].
- Local models are evolved and updated in these clusters.
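These two principles can be illustrated with a minimal, hypothetical Python sketch; the distance threshold and the learning rate are assumed values, not parameters from the chapter:

```python
import numpy as np

class EvolvingClusterer:
    """Minimal sketch of ECOS-style evolving clustering: a new neuron
    (cluster center) is created when an input is too far from every
    existing center; otherwise the nearest center drifts towards the
    input. The structure starts empty and grows with the data."""

    def __init__(self, threshold=0.5, lr=0.1):
        self.threshold = threshold  # max distance to join a cluster (assumed value)
        self.lr = lr                # center update rate (assumed value)
        self.centres = []           # evolving structure: no fixed size

    def partial_fit(self, x):
        x = np.asarray(x, dtype=float)
        if not self.centres:
            self.centres.append(x.copy())
            return 0
        d = [np.linalg.norm(x - c) for c in self.centres]
        j = int(np.argmin(d))
        if d[j] <= self.threshold:
            # update: move the winning center towards the sample
            self.centres[j] += self.lr * (x - self.centres[j])
            return j
        # evolve: create a new cluster center for the novel sample
        self.centres.append(x.copy())
        return len(self.centres) - 1

ec = EvolvingClusterer(threshold=0.5)
for x in [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.95, 1.05]]:
    ec.partial_fit(x)
print(len(ec.centres))  # the stream above yields 2 clusters
```

New clusters are created only when an input is far from all existing centers, so the number of neurons is driven by the data stream rather than fixed a priori, which is the essence of the evolving structure.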
Here we briefly illustrate the concepts of ECOS with two implementations: evolving fuzzy neural networks (EFuNN) [16] and dynamic evolving neuro-fuzzy inference systems (DENFIS) [25]. Examples of EFuNN are shown in Figs. 40.2 and 40.3, and of DENFIS in Figs. 40.4 and 40.5. In ECOS, clusters of data are created (evolved) based on the similarity between data samples (input vectors), either in the input space (as in some ECOS models, e.g., DENFIS) or in both the input and output space (as, e.g., in the EFuNN models). Samples whose distance to an existing node (cluster center, rule node, neuron) is less than a certain threshold are allocated to the same cluster. Samples that do not fit into existing clusters form (generate, evolve) new clusters. Cluster centers are continuously adjusted according to new data samples, and new centers are created incrementally. ECOS learn from data and automatically create or update a local (fuzzy) model/function in each cluster, e.g.,
IF x is in cluster Cj THEN yj = Fj(x),
where Fj can be a fuzzy value, a linear or logistic regression function (Fig. 40.5), or an NN model [25].
ECOS utilize evolving clustering methods: there is no fixed number of clusters specified a priori; rather, clusters are created and updated incrementally. Other ECOS that use this principle are the evolving self-organizing maps (ESOM) [17], the evolving classification function [18, 26], and the evolving spiking neural networks (Sect. 40.3).
As an example, the following are the major steps for the training and recall of a DENFIS model.
Training:
(a) Create or update a cluster from incoming data.
(b) Create or update a first-order Takagi–Sugeno fuzzy rule for each cluster:
IF x is in cluster Cj THEN yj = fj(x),
where fj is a linear function of the input variables, yj = b0 + b1 x1 + b2 x2 + ... + bq xq.
(c) The function coefficients are incrementally updated with every new input vector or after a chunk of data.
Recall (fuzzy inference) for a new input vector:
1. For a new input vector x = [x1, x2, ..., xq], DENFIS chooses the m fuzzy rules closest to x from the whole fuzzy rule set to form a current inference system.
2. The inference result is
y = ( Σi=1..m ωi fi(x) ) / ( Σi=1..m ωi ) ,   (40.3)
where i indexes the m clusters closest to the new input vector x, ωi is a weight based on the distance between x and the center of cluster i, and fi(x) is the output calculated for x by the local model fi of cluster i.
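These recall steps can be sketched minimally in Python; the normalized inverse-distance weighting and all numeric values below are illustrative assumptions, not the exact DENFIS formulation:

```python
import numpy as np

def denfis_recall(x, centres, coefs, m=2):
    """Sketch of DENFIS-style recall: choose the m rule nodes (cluster
    centers) closest to x and combine their local first-order
    Takagi-Sugeno models, weighted by closeness (assumed weighting:
    normalized inverse distance)."""
    x = np.asarray(x, dtype=float)
    d = np.array([np.linalg.norm(x - c) for c in centres])
    nearest = np.argsort(d)[:m]            # the m closest rule nodes
    w = 1.0 / (d[nearest] + 1e-9)          # closer clusters weigh more
    w = w / w.sum()
    # local linear model for rule j: y_j = b0 + b1*x1 + ... + bq*xq
    y_local = np.array([coefs[j][0] + coefs[j][1:] @ x for j in nearest])
    return float(w @ y_local)

# two hypothetical rule nodes with local linear functions y = b0 + b1*x1
centres = [np.array([0.0]), np.array([1.0])]
coefs = [np.array([0.0, 1.0]), np.array([1.0, 2.0])]
print(denfis_recall([0.5], centres, coefs))
```

For an input halfway between the two centers, both local models contribute equally; near a center, its local model dominates, which is what makes the inference local.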
2.3 From Local to Transductive (Individualized) Learning and Modeling
A special direction of ECOS is transductive reasoning and personalized modeling. Instead of building a set of local models fi (e.g., prototypes) to cover the whole problem space and then using these models to classify/predict any new input vector, in transductive modeling a new model fx is created for every new input vector x, based on selected nearest-neighbor vectors from the available data. Such ECOS models are the neuro-fuzzy inference system (NFI) [31] and the transductive weighted neuro-fuzzy inference system (TWNFI) [32]. In TWNFI, for every new input vector the neighborhood of the closest data vectors is optimized using both the distance between the new vector and the neighboring ones and the weighted importance of the input variables, so that the error of the model is minimized in the neighborhood area [33]. TWNFI is a further development of the weighted-weighted nearest neighbor method (WWKNN) proposed in [34], in which the output for a new input vector is calculated from the outputs of its k nearest neighbors, weighted by both their distance to the new vector and an a priori calculated importance of each variable, determined with a ranking method such as the signal-to-noise ratio or the t-test.
Other ECOS have been developed as improvements of EFuNN, DENFIS, and other early ECOS models by Ozawa et al. and Watts [35, 36, 37], including ensembles of ECOS [38]. A similar approach to ECOS was used by Angelov in the development of the evolving Takagi–Sugeno (ETS) models [39].
2.4 Applications
ECOS have been applied to problems across domain areas. Local incremental learning and transductive learning have been shown to be superior to global learning models, both in accuracy and in the new knowledge obtained. A review of ECOS applications can be found in [26]. The applications include:
- Medical decision support systems (Fig. 40.5)
- Bioinformatics, e.g., [40]
- Neuroinformatics and brain study, e.g., [41]
- Evolvable robots, e.g., [42]
- Financial and economic decision support systems, e.g., [43]
- Environmental and ecological modeling, e.g., [44]
- Signal processing, speech, image, and multimodal systems, e.g., [45]
- Cybersecurity, e.g., [46]
- Multiple time series prediction, e.g., [47].
While classical ECOS use a simple McCulloch and Pitts model of a neuron and the Hebbian learning rule [48], evolving spiking neural network (eSNN) architectures use a spiking neuron model, applying the same or similar ECOS principles.
3 Evolving Spiking Neural Networks (eSNN)
3.1 Spiking Neuron Models
A single biological neuron with its associated synapses is a complex information processing machine, involving short-term information processing, long-term information storage, and evolutionary information stored as genes in the nucleus of the neuron. A spiking neuron model assumes input information represented as trains of spikes over time. When sufficient input spikes are accumulated in the membrane of the neuron, its post-synaptic potential exceeds a threshold and the neuron emits a spike at its axon (Fig. 40.6a,b). State-of-the-art models of spiking neurons include early models by Hodgkin and Huxley [49] and Hopfield [50], and more recent models by Maass, Gerstner, Kistler, Izhikevich, Thorpe, and Van Rullen [51, 52, 53, 54]. Such models are the spike response models (SRM), the leaky integrate-and-fire model (LIFM) (Fig. 40.6), the Izhikevich models, the adaptive LIFM, and the probabilistic IFM [55].
3.2 Evolving Spiking Neural Networks (eSNN)
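The leaky integrate-and-fire dynamics just described (integration of input, leak, threshold crossing, reset) can be sketched in a few lines; tau, the threshold, and the input current below are illustrative values, not taken from the chapter:

```python
def lif_simulate(input_current, tau=10.0, threshold=1.0, dt=1.0):
    """Sketch of a leaky integrate-and-fire (LIF) neuron: the membrane
    potential u leaks towards rest and integrates input; when u crosses
    the threshold, the neuron emits a spike and u is reset."""
    u = 0.0
    spikes = []
    for t, i_t in enumerate(input_current):
        u += dt * (-u / tau + i_t)   # leak + integration of input
        if u >= threshold:
            spikes.append(t)          # emit a spike on the axon
            u = 0.0                   # reset after firing
    return spikes

# a constant input current drives regular, periodic firing
print(lif_simulate([0.3] * 20))
```

With a constant input, the neuron fires at regular intervals; with no input, the potential simply decays and no spikes are emitted.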
Based on the ECOS principles, an evolving spiking neural network architecture (eSNN) was proposed in [26], initially designed as a visual pattern recognition system. The first eSNNs were based on Thorpe's neural model [54], in which the importance of early spikes (after the onset of a certain stimulus) is boosted; this is called rank-order coding and learning. Synaptic plasticity is employed by a fast, supervised, one-pass learning algorithm. An exemplar eSNN for multimodal auditory-visual information processing, applied to the case study problem of speaker authentication, is shown in Fig. 40.7.
Different eSNN models use different architectures. Figure 40.8 shows a reservoir-based eSNN for spatio-temporal pattern recognition, where the reservoir [57] uses the spike-time-dependent plasticity (STDP) learning rule [58], and the output classifier, which classifies the spatio-temporal activity of the reservoir, uses the rank-order learning rule [54].
3.3 Extracting Fuzzy Rules from eSNN
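The rank-order (one-pass) learning of Sect. 40.3.2 assigns each input connection a weight determined by the order of its first spike; weights of this kind are what the rule extraction discussed below operates on. A minimal sketch, with an assumed modulation factor mod in (0, 1) and hypothetical spike times:

```python
def rank_order_weights(spike_times, mod=0.9):
    """Sketch of rank-order learning in the Thorpe model used by early
    eSNN: earlier input spikes get larger weights, w_i = mod**rank(i),
    so the first spikes after stimulus onset dominate the post-synaptic
    potential. One pass over the data is enough to set the weights."""
    # rank inputs by their first spike time: earliest spike -> rank 0
    order = sorted(range(len(spike_times)), key=lambda i: spike_times[i])
    w = [0.0] * len(spike_times)
    for rank, i in enumerate(order):
        w[i] = mod ** rank
    return w

# input 2 spikes first, then input 0, then input 1
print(rank_order_weights([5.0, 9.0, 1.0], mod=0.9))
```

The earliest-spiking input receives weight 1, and each later input is attenuated by a further factor of mod, which is what boosts the importance of early spikes.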
Extracting fuzzy rules from an eSNN would make eSNNs not only efficient learning models but also knowledge-based models. A method for this was proposed in [59] and is illustrated in Fig. 40.9a,b: fuzzy rules are extracted from the connection weights w between the receptive field layer L1 and the class output neuron layer L2.
3.4 eSNN Applications
Different eSNN models and systems have been developed for different applications, such as:
- eSNN for spatio- and spectro-temporal pattern recognition – http://ncs.ethz.ch/projects/evospike
- Dynamic eSNN (deSNN) for moving object recognition [60]
- Spike pattern association neuron (SPAN) for the generation of precise-time spike sequences as a response to recognized input spiking patterns [61]
- Environmental and ecological modeling [44]
- EEG data modeling [62]
- Neurogenetic models (Sect. 40.4).
A review of eSNN methods, systems, and their applications can be found in [65].
4 Computational Neuro-Genetic Modeling (CNGM)
4.1 Principles
A neuro-genetic model of a neuron was proposed in [41, 66]. It utilizes information about how some proteins and genes affect the spiking activities of a neuron, such as fast excitation, fast inhibition, slow excitation, and slow inhibition. An important part of the model is a dynamic gene/protein regulatory network (GRN) model of the dynamic interactions between genes/proteins over time that affect the spiking activity of the neuron (Fig. 40.10). A CNGM is a dynamical model with two dynamical sub-models:
- a GRN, which models the dynamical interactions between genes/proteins at a time scale T1;
- an eSNN, which models the dynamical interactions between spiking neurons at a time scale T2.
The two sub-models interact over time.
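This two-time-scale interaction can be illustrated with a toy sketch in which a single gene expression level, updated slowly (time scale T1), modulates the firing threshold of a fast LIF-like neuron (time scale T2). All numeric values and the gene-to-parameter mapping are illustrative assumptions, not the chapter's model:

```python
def simulate_cngm(steps=40, t1=10):
    """Toy sketch of a CNGM: a one-gene regulatory 'network' updated
    every t1 steps modulates a spiking-neuron parameter (the firing
    threshold), while the neuron itself is updated at every step
    (time scale T2 = 1). Rising gene expression slows the firing."""
    gene = 1.0          # gene/protein expression level
    threshold = 1.0     # neuron parameter controlled by the gene
    u, spikes = 0.0, []
    for t in range(steps):
        if t % t1 == 0 and t > 0:
            gene *= 1.1                  # slow GRN dynamics (time scale T1)
            threshold = 1.0 * gene       # gene expression sets the threshold
        u = 0.9 * u + 0.3                # fast LIF-like dynamics (time scale T2)
        if u >= threshold:
            spikes.append(t)
            u = 0.0
    return spikes

print(simulate_cngm())
```

As the gene expression level rises every T1 steps, the inter-spike interval of the neuron lengthens, showing how the slow sub-model shapes the fast one.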
4.2 The NeuroCube Framework
A further development of the eSNN and the CNGM was achieved with the introduction of the NeuroCube framework [67]. The main idea is to support the creation of multi-modular integrated systems, in which different modules, consisting of different neuronal types and genetic parameters, correspond to different parts of the brain and different functions (e.g., vision, sensory information processing, sound recognition, motor control), and the whole system works in an integrated mode for brain-signal pattern recognition. A concrete model built with the NeuroCube would have a specific structure and a set of algorithms depending on the problem and the application conditions, e.g., classification of EEG data, recognition of functional magnetic resonance imaging (fMRI) data, brain-computer interfaces, emotional cognitive robotics, and modeling Alzheimer's disease.
A block diagram of the NeuroCube framework is shown in Fig. 40.11. It consists of the following modules:
- an input information encoding module
- a NeuroCube module
- an output module
- a gene regulatory network (GRN) module.
The main principles of the NeuroCube framework are:
1. NeuroCube is a framework to model brain data (not a brain model or a brain map).
2. NeuroCube is a selective, approximate mapping of the brain regions relevant to the data, along with relevant genetic information, into a 3-D spiking neuronal structure.
3. An initial NeuroCube structure can include known connections between different areas of the brain.
4. Two types of data are used, both for training a particular NeuroCube and for recalling it on new data: (a) data measuring the activity of the brain when certain stimuli are presented (e.g., EEG, fMRI); (b) direct stimuli data, e.g., sound, spoken language, video data, tactile data, odor data, etc.
5. A NeuroCube architecture consists of a NeuroCube module and GRNs at the lowest level, and a higher-level evaluation (classification) module.
6. Different types of neurons and learning rules can be used in different areas of the architecture.
7. The memory of the system is represented as a combination of: (a) short-term memory, represented as changes of the neuronal membranes and temporary changes of synaptic efficacy; (b) long-term memory, represented as a stable establishment of synaptic efficacy; (c) genetic memory, represented as a change in the genetic code and the gene/protein expression level as a result of the above short-term and long-term memory changes and of evolutionary processes.
8. Parameters in the NeuroCube are defined by genes/proteins that form dynamic GRN models.
9. NeuroCube can potentially capture in its internal representation both spatial and temporal characteristics from multimodal brain data.
10. The structure and the functionality of a NeuroCube architecture evolve in time from incoming data.
4.3 Quantum-Inspired Optimization of eSNN and CNGM
A CNGM has a large number of parameters that need to be optimized for efficient performance. Quantum-inspired optimization methods are suitable for this purpose, as they can deal with a large number of variables and can converge faster than many other optimization algorithms [68]. Quantum-inspired eSNN (QeSNN) use the principle of superposition of states to represent and optimize the features (input variables) and the parameters of the eSNN, including the genes in a GRN [44]. They are optimized through a quantum-inspired genetic algorithm [44] or a quantum-inspired particle swarm optimization algorithm [69]. Features are represented as qubits, in a superposition of the state 1 (feature selected), with probability amplitude α, and the state 0 (feature not selected), with probability amplitude β. When the model has to be calculated, the quantum bits collapse into 1 or 0.
4.4 Applications of CNGM
Various applications of CNGM have been developed.
5 Conclusions and Further Directions
This chapter presented a brief overview of the main principles of a class of neural networks called evolving connectionist systems (ECOS), along with their applications for computational intelligence. ECOS facilitate fast and accurate learning from data and new knowledge discovery across application areas. They integrate principles from neural networks, fuzzy systems, evolutionary computation, and quantum computing. The future directions and applications of ECOS are foreseen as a further integration of principles from information science, bioinformatics, and neuroinformatics [71].
Abbreviations
- CI: computational intelligence
- CNGM: computational neuro-genetic modeling
- DENFIS: dynamic evolving neuro-fuzzy inference system
- deSNN: dynamic eSNN
- ECOS: evolving connectionist system
- EEG: electroencephalogram
- EFuNN: evolving fuzzy neural network
- eSNN: evolving spiking neural network
- ESOM: evolving self-organizing map
- ETS: evolving Takagi–Sugeno system
- fMRI: functional magnetic resonance imaging
- FNN: fuzzy neural network
- GRN: gene/protein regulatory network
- LIFM: leaky integrate-and-fire model
- NFI: neuro-fuzzy inference system
- NN: neural network
- QeSNN: quantum-inspired eSNN
- RBF: radial basis function
- SOM: self-organizing map
- SPAN: spike pattern association neuron
- SRM: spike response model
- STDP: spike-timing-dependent plasticity
- T1: time scale of the GRN sub-model
- T2: time scale of the eSNN sub-model
- TWNFI: transductive weighted neuro-fuzzy inference system
- WWKNN: weighted-weighted nearest neighbor
References
N. Kasabov: Evolving fuzzy neural networks – Algorithms, applications and biological motivation. In: Methodologies for the Conception, Design Application of Soft Computing, ed. by T. Yamakawa, G. Matsumoto (World Scientific, Singapore 1998) pp. 271–274
N. Kasabov: Foundations of Neural Networks, Fuzzy Systems and Knowledge Engineering (MIT, Cambridge 1996) p. 550
N. Kasabov, S. Shishkov: A connectionist production system with partial match and its use for approximate reasoning, Connect. Sci. 5(3/4), 275–305 (1993)
N. Kasabov: Hybrid connectionist production system, J. Syst. Eng. 3(1), 15–21 (1993)
L.A. Zadeh: Fuzzy sets, Inf. Control 8, 338–353 (1965)
L.A. Zadeh: Fuzzy logic, IEEE Computer 21, 83–93 (1988)
L.A. Zadeh: A theory of approximate reasoning. In: Machine Intelligence, Vol. 9, ed. by J.E. Hayes, D. Michie, L.J. Mikulich (Ellis Horwood, Chichester 1979) pp. 149–194
N. Kasabov: Incorporating neural networks into production systems and a practical approach towards realisation of fuzzy expert systems, Comput. Sci. Inf. 21(2), 26–34 (1991)
N. Kasabov: Hybrid connectionist fuzzy production systems – Towards building comprehensive AI, Intell. Autom. Soft Comput. 1(4), 351–360 (1995)
N. Kasabov: Connectionist fuzzy production systems, Lect. Notes Artif. Intell. 847, 114–128 (1994)
N. Kasabov: Hybrid connectionist fuzzy systems for speech recognition and the use of connectionist production systems, Lect. Notes Artif. Intell. 1011, 19–33 (1995)
T. Yamakawa, E. Uchino, T. Miki, H. Kusanagi: A neo fuzzy neuron and its application to system identification and prediction of the system behaviour, Proc. 2nd Int. Conf. Fuzzy Log. Neural Netw. (Iizuka, Japan 1992) pp. 477–483
T. Yamakawa, S. Tomoda: A fuzzy neuron and its application to pattern recognition, Proc. 3rd IFSA Congr., ed. by J. Bezdek (Seattle, Washington 1989) pp. 1–9
T. Furuhashi, T. Hasegawa, S. Horikawa, Y. Uchikawa: An adaptive fuzzy controller using fuzzy neural networks, Proc. 5th IFSA World Congr. Seoul (1993) pp. 769–772
N. Kasabov, J.S. Kim, M. Watts, A. Gray: FuNN/2 – A fuzzy neural network architecture for adaptive learning and knowledge acquisition, Inf. Sci. 101(3/4), 155–175 (1997)
N. Kasabov: Evolving fuzzy neural networks for supervised/unsupervised online knowledge–based learning, IEEE Trans. Syst. Man Cybern. B 31(6), 902–918 (2001)
D. Deng, N. Kasabov: On-line pattern analysis by evolving self-organising maps, Neurocomputing 51, 87–103 (2003)
N. Kasabov: Evolving Connectionist Systems: Methods and Applications in Bioinformatics, Brain Study and Intelligent Machines, Perspectives in Neural Computing (Springer, Berlin, Heidelberg 2003)
M. Watts: A decade of Kasabov's evolving connectionist systems: A review, IEEE Trans. Syst. Man Cybern. C 39(3), 253–269 (2009)
T. Kohonen: Self-Organizing Maps, 2nd edn. (Springer, Berlin, Heidelberg 1997)
F. Girosi: Regularization theory, radial basis functions and networks. In: From Statistics to Neural Networks, ed. by V. Cherkassky, J.H. Friedman, H. Wechsler (Springer, Heidelberg 1994) pp. 166–187
G.A. Carpenter, S. Grossberg, N. Markuzon, J.H. Reynolds, D.B. Rosen: Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analogue multidimensional maps, IEEE Trans. Neural Netw. 3(5), 698–713 (1991)
B. Fritzke: A growing neural gas network learns topologies, Adv. Neural Inf. Process. Syst. 7, 625–632 (1995)
J. Platt: A resource allocating network for function interpolation, Neural Comput. 3, 213–225 (1991)
N. Kasabov, Q. Song: DENFIS: Dynamic evolving neural-fuzzy inference system and its application for time-series prediction, IEEE Trans. Fuzzy Syst. 10, 144–154 (2002)
N. Kasabov: Evolving Connectionist Systems: The Knowledge Engineering Approach (Springer, Berlin, Heidelberg 2007)
J. Bezdek: A review of probabilistic, fuzzy, and neural models for pattern recognition, J. Intell. Fuzzy Syst. 1, 1–25 (1993)
J. Bezdek (Ed.): Analysis of Fuzzy Information (CRC, Boca Raton 1987)
J. Bezdek: Pattern Recognition with Fuzzy Objective Function Algorithms (Plenum, New York 1981)
R.R. Yager, D. Filev: Generation of fuzzy rules by mountain clustering, J. Intell. Fuzzy Syst. 2, 209–219 (1994)
Q. Song, N. Kasabov: NFI: A neuro-fuzzy inference method for transductive reasoning, IEEE Trans. Fuzzy Syst. 13(6), 799–808 (2005)
Q. Song, N. Kasabov: TWNFI – A transductive neuro-fuzzy inference system with weighted data normalisation for personalised modelling, Neural Netw. 19(10), 1591–1596 (2006)
N. Kasabov, Y. Hu: Integrated optimisation method for personalised modelling and case studies for medical decision support, Int. J. Funct. Inf. Pers. Med. 3(3), 236–256 (2010)
N. Kasabov: Global, local and personalised modelling and profile discovery in bioinformatics: An integrated approach, Pattern Recognit. Lett. 28(6), 673–685 (2007)
S. Ozawa, S. Pang, N. Kasabov: On-line feature selection for adaptive evolving connectionist systems, Int. J. Innov. Comput. Inf. Control 2(1), 181–192 (2006)
S. Ozawa, S. Pang, N. Kasabov: Incremental learning of feature space and classifier for online pattern recognition, Int. J. Knowl. Intell. Eng. Syst. 10, 57–65 (2006)
M. Watts: Evolving Connectionist Systems: Characterisation, Simplification, Formalisation, Explanation and Optimisation, Ph.D. Thesis (University of Otago, Dunedin 2004)
N.L. Mineu, A.J. da Silva, T.B. Ludermir: Evolving neural networks using differential evolution with neighborhood-based mutation and simple subpopulation scheme, Proc. Braz. Symp. Neural Netw. SBRN (2012) pp. 190–195
P. Angelov: Evolving Rule-Based Models: A Tool for Design of Flexible Adaptive Systems (Springer, Berlin, Heidelberg 2002)
N. Kasabov: Adaptive modelling and discovery in bioinformatics: The evolving connectionist approach, Int. J. Intell. Syst. 23, 545–555 (2008)
L. Benuskova, N. Kasabov: Computational Neuro-Genetic Modelling (Springer, Berlin, Heidelberg 2007)
L. Huang, Q. Song, N. Kasabov: Evolving connectionist system based role allocation for robotic soccer, Int. J. Adv. Robot. Syst. 5(1), 59–62 (2008)
N. Kasabov: Adaptation and interaction in dynamical systems: Modelling and rule discovery through evolving connectionist systems, Appl. Soft Comput. 6(3), 307–322 (2006)
S. Schliebs, M. Defoin-Platel, S.P. Worner, N. Kasabov: Integrated feature and parameter optimization for evolving spiking neural networks: Exploring heterogeneous probabilistic models, Neural Netw. 22, 623–632 (2009)
N. Kasabov, E. Postma, J. van den Herik: AVIS: A connectionist-based framework for integrated auditory and visual information processing, Inf. Sci. 123, 127–148 (2000)
S. Pang, T. Ban, Y. Kadobayashi, N. Kasabov: LDA merging and splitting with applications to multiagent cooperative learning and system alteration, IEEE Trans. Syst. Man Cybern. B 42(2), 552–564 (2012)
H. Widiputra, R. Pears, N. Kasabov: Multiple time-series prediction through multiple time-series relationships profiling and clustered recurring trends, Lect. Notes Artif. Intell. 6635, 161–172 (2011)
D. Hebb: The Organization of Behavior (Wiley, New York 1949)
A.L. Hodgkin, A.F. Huxley: A quantitative description of membrane current and its application to conduction and excitation in nerve, J. Physiol. 117, 500–544 (1952)
J. Hopfield: Pattern recognition computation using action potential timing for stimulus representation, Nature 376, 33–36 (1995)
W. Maass: Computing with spiking neurons. In: Pulsed Neural Networks, ed. by W. Maass, C.M. Bishop (MIT, Cambridge 1998) pp. 55–81
W. Gerstner: Time structure of the activity of neural network models, Phys. Rev. E 51, 738–758 (1995)
E.M. Izhikevich: Which model to use for cortical spiking neurons?, IEEE Trans. Neural Netw. 15(5), 1063–1070 (2004)
S. Thorpe, A. Delorme, R. Van Rullen: Spike-based strategies for rapid processing, Neural Netw. 14(6/7), 715–725 (2001)
N. Kasabov: To spike or not to spike: A probabilistic spiking neuron model, Neural Netw. 23(1), 16–19 (2010)
S. Wysoski, L. Benuskova, N. Kasabov: Evolving spiking neural networks for audiovisual information processing, Neural Netw. 23(7), 819–836 (2010)
D. Verstraeten, B. Schrauwen, M. d'Haene, D. Stroobandt: An experimental unification of reservoir computing methods, Neural Netw. 20(3), 391–403 (2007)
S. Song, K. Miller, L. Abbott: Competitive Hebbian learning through spike-timing-dependent synaptic plasticity, Nat. Neurosci. 3, 919–926 (2000)
S. Soltic, N. Kasabov: Knowledge extraction from evolving spiking neural networks with rank order population coding, Int. J. Neural Syst. 20(6), 437–445 (2010)
N. Kasabov, K. Dhoble, N. Nuntalid, G. Indiveri: Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition, Neural Netw. 41, 188–201 (2013)
A. Mohemmed, S. Schliebs, S. Matsuda, N. Kasabov: SPAN: Spike pattern association neuron for learning spatio-temporal spike patterns, Int. J. Neural Syst. 22(4), 1250012 (2012)
N. Nuntalid, K. Dhoble, N. Kasabov: EEG classification with BSA spike encoding algorithm and evolving probabilistic spiking neural network, Lect. Notes Comput. Sci. 7062, 451–460 (2011)
G. Indiveri, B. Linares-Barranco, T.J. Hamilton, A. van Schaik, R. Etienne-Cummings, T. Delbruck, S.-C. Liu, P. Dudek, P. Häfliger, S. Renaud, J. Schemmel, G. Cauwenberghs, J. Arthur, K. Hynna, F. Folowosele, S. Saighi, T. Serrano-Gotarredona, J. Wijekoon, Y. Wang, K. Boahen: Neuromorphic silicon neuron circuits, Front. Neurosci. 5, 5 (2011)
G. Indiveri, E. Chicca, R.J. Douglas: Artificial cognitive systems: From VLSI networks of spiking neurons to neuromorphic cognition, Cogn. Comput. 1(2), 119–127 (2009)
S. Schliebs, N. Kasabov: Evolving spiking neural networks – a survey, Evol. Syst. 4(2), 87–98 (2013)
N. Kasabov, L. Benuskova, S. Wysoski: A computational neurogenetic model of a spiking neuron, Neural Netw. IJCNN'05. Proc. (2005) pp. 446–451
N. Kasabov: NeuCube EvoSpike architecture for spatio-temporal modelling and pattern recognition of brain signals, Lect. Notes Comput. Sci. 7477, 225–243 (2012)
M. Defoin-Platel, S. Schliebs, N. Kasabov: Quantum-inspired evolutionary algorithm: A multi-model EDA, IEEE Trans. Evol. Comput. 13(6), 1218–1232 (2009)
H. Nuzly, A. Hamed, S.M. Shamsuddin: Probabilistic evolving spiking neural network optimization using dynamic quantum inspired particle swarm optimization, Aust. J. Intell. Inf. Process. Syst. 11(1), 5–15 (2010)
N. Kasabov, R. Schliebs, H. Kojima: Probabilistic computational neurogenetic framework: From modelling cognitive systems to Alzheimer's disease, IEEE Trans. Auton. Ment. Dev. 3(4), 300–311 (2011)
N. Kasabov (Ed.): Springer Handbook of Bio/Neuroinformatics (Springer, Berlin, Heidelberg 2014)
© 2015 Springer-Verlag Berlin Heidelberg
Kasabov, N. (2015). Evolving Connectionist Systems: From Neuro-Fuzzy-, to Spiking- and Neuro-Genetic. In: Kacprzyk, J., Pedrycz, W. (eds) Springer Handbook of Computational Intelligence. Springer Handbooks. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-43505-2_40
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-43504-5
Online ISBN: 978-3-662-43505-2