Abstract
In this paper, the almost periodic dynamical behaviors of delayed complex-valued neural networks with discontinuous activation functions are considered. We decompose the complex-valued system into its real and imaginary parts and obtain an equivalent system with a discontinuous right-hand side. Based on differential inclusion theory, the diagonal dominance principle, non-smooth analysis, and a generalized Lyapunov function, sufficient criteria are obtained for the existence, uniqueness, and global stability of the almost periodic solution of the equivalent delayed differential system. In particular, we derive a series of results for the equivalent neural networks with discontinuous activations and periodic or constant coefficients, respectively. Finally, we give a numerical example to demonstrate the effectiveness of the derived theoretical results.
1 Introduction
Recurrently connected neural networks have been extensively investigated in various science and engineering fields such as pattern classification, parallel computation, signal processing, associative memories, and the solution of complicated optimization problems [1,2,3,4]. These applications depend on the dynamical behaviors of the underlying networks. Hence, it is extremely important to study the dynamical behaviors of neural networks in practical design. For instance, solving a periodic oscillation problem with a neural network translates directly into studying the dynamical behavior of that network. As is well known, research on neural networks involves not only stability properties but also other dynamical behaviors such as bifurcation, periodic oscillation, and chaos. In some applications, the properties of almost periodic solutions are of particular interest and importance. Thus, it is of great significance to study the almost periodic solutions of neural networks.
Complex-valued neural networks are an extension of real-valued neural networks and have been widely presented and investigated; see [5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22]. Compared with real-valued neural networks, complex-valued neural networks exhibit distinctly different properties and are, in general, more complicated in several respects. Complex-valued neural networks are strongly desired in practical physical systems involving, for example, light, quantum waves, ultrasonic waves, and electromagnetic waves [23, 24]. Therefore, it is necessary and important to discuss the dynamical behaviors of complex-valued neural networks, such as the stability of almost periodic solutions. However, to the best of the authors' knowledge, almost periodic solutions of delayed complex-valued neural networks have seldom been investigated.
It is well known that activation functions play a vital role in the dynamical behaviors of neural networks. Activation functions are often assumed to be continuous, bounded, globally Lipschitz, or even smooth, and the dynamical behaviors of neural networks depend on their structure. In the past few years, two kinds of activation functions have been considered for neural networks: continuous activation functions and discontinuous activation functions. As pointed out by Forti and Nistri [25], connected neural networks with discontinuous activation functions are important and arise frequently in practical applications when dealing with dynamical systems with discontinuous right-hand sides. For this reason, much effort has been devoted to analyzing the dynamical behavior of neural networks with discontinuous activation functions [26,27,28,29]. However, the almost periodic dynamical behaviors of delayed complex-valued neural networks with discontinuous activation functions have seldom been investigated.
Moreover, time delays often arise in practical applications of neural networks, such as control, signal processing, associative memory, and pattern recognition, because of the finite switching speed of amplifiers and finite propagation time. In the past few years, many articles have considered the dynamical behaviors of delayed complex-valued neural networks; see [30,31,32]. Time delays are a source of instability and oscillation in neural networks, and stability analysis is a prime research topic in neural network theory. Therefore, it is important and necessary to study the dynamical behaviors of complex-valued neural networks with time delays, such as the existence, uniqueness, and stability of almost periodic solutions.
From many applications, we know that almost periodic oscillation is a universal phenomenon in the real world and is more realistic than exact periodicity. From the dynamical point of view, the periodic parameters of dynamical networks often experience uncertain perturbations; that is, a parameter can be viewed as periodic up to a small error. Almost periodic neural networks are thus a natural extension of periodic ones and accord better with reality. Considering the importance of almost periodicity, much effort has been devoted to studying the almost periodic dynamical behaviors of connected neural networks; see [30, 31, 33,34,35,36,37,38]. Based on a fixed-point theorem and a stability definition, Huang and Hu considered the multistability problem of delayed complex-valued neural networks with discontinuous real-imaginary-type activation functions [39]. In [40], Liang et al. studied the multistability of complex-valued neural networks with discontinuous activation functions. Wang et al. researched the dynamical behavior of complex-valued Hopfield neural networks with discontinuous activation functions [41]. In [42], Yan researched the almost periodic dynamics of delayed complex-valued recurrent neural networks with discontinuous activation functions. Compared with the almost periodic dynamics of real-valued neural networks, the complex-valued case is more complicated. However, to the best of our knowledge, the almost periodic dynamics of complex-valued recurrent neural networks have seldom been considered.
This paper investigates the almost periodic dynamical behaviors of complex-valued neural networks with discontinuous activation functions. The highlights of this paper are as follows. Firstly, the almost periodic solution is proposed in the complex domain, which is more feasible in practice than the periodic scheme; furthermore, we decompose the complex-valued neural network into a real-valued one in which the activation function has a continuous real part and a discontinuous imaginary part. Secondly, the decomposed activation function is not assumed to be monotonic; under these circumstances, we study the almost periodic dynamical behaviors by the generalized Lyapunov function method. Lastly, the almost periodic dynamics of complex-valued neural networks with discontinuous activations are investigated and some criteria are obtained; time-varying delays are also considered, which makes our results more general.
The structure of the remaining paper is as follows. In Sect. 2, the complex-valued neural network is formulated and some preliminaries are presented. In Sect. 3, the existence of the almost periodic solution of the dynamical system is obtained under some assumptions on the activation functions. In Sect. 4, the uniqueness and global exponential stability of the almost periodic solution are established. In Sect. 5, an example is presented to illustrate the validity of our theoretical results. Lastly, some conclusions are given in Sect. 6.
Notations The notations in this paper are quite standard. \(\Vert z\Vert \) denotes the weighted 1-norm of a vector \(z(t)=\left( z_{1}(t),z_{2}(t),\ldots ,z_{n}(t)\right) ^{T}\in \mathbb {C}^{n}\): \(\Vert z(t)\Vert =\sum \nolimits _{j=1}^{n}\xi _{j}|z_{j}^{R}(t)|+\sum \nolimits _{j=1}^{n}\phi _{j}|z_{j}^{I}(t)|\), where \(\xi _{j}, \phi _{j}>0, j=1,2,\ldots ,n\). \(\overline{co}(E)\) is the closure of the convex hull of a set E. \(B(x,\delta )\) denotes the open \(\delta \)-neighborhood of a set \(x\subset R^{n}\): \(B(x,\delta )=\{y\in R^{n}:\inf \nolimits _{z\in x}\Vert y-z\Vert <\delta \}\) for some norm \(\Vert \cdot \Vert \). \(C([0,T],R^{n}),\ L^{1}([0,T],R^{n})\), and \(L^{\infty }([0,T],R^{n})\) are the spaces of continuous, integrable, and essentially bounded vector functions on [0, T], respectively. Z denotes the set of integers.
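As a concrete illustration of this weighted 1-norm (the weights and the test vector below are hypothetical choices, not values from the paper), it can be computed as follows:

```python
# Weighted 1-norm ||z|| = sum_j xi_j |Re z_j| + sum_j phi_j |Im z_j|,
# as defined in the Notations; weights and sample vector are illustrative.
def weighted_norm(z, xi, phi):
    real_part = sum(x * abs(c.real) for x, c in zip(xi, z))
    imag_part = sum(p * abs(c.imag) for p, c in zip(phi, z))
    return real_part + imag_part

z = [1 + 2j, -3 - 4j]
print(weighted_norm(z, [1.0, 1.0], [1.0, 1.0]))  # |1| + |-3| + |2| + |-4| = 10.0
```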
2 Model Formulation and Preliminaries
Consider the complex-valued dynamical networks with almost periodic coefficients described by the following delayed integro-differential equations:
where \(j=1, 2, \ldots , n\); \(z_{j}(t)\in \mathbb {C}\) is the state of the j-th neuron at time t; \(d_{j}(t)>0\) represents the rate with which the j-th unit resets its potential to the resting state in isolation when disconnected from the network; \(f_{j}(\cdot ):\mathbb {C}\rightarrow \mathbb {C}\) are complex-valued activation functions; \(a_{jk}(t)\in \mathbb {C}\) is the connection strength of the k-th neuron on the j-th neuron; \(d_{s}K_{jk}(t,s)\) are Lebesgue–Stieltjes measures with respect to s.
Assumption 1
Let \(z=z^{R}+iz^{I}\). Then \(f_{j}(z)\) can be decomposed into its real and imaginary parts as \(f_{j}(z)=f_{j}^{R}(z^{R})+if_{j}^{I}(z^{I})\), where \(f_{j}^{R}(\cdot )\) is continuous on any compact interval of R, and \(f_{j}^{I}(\cdot )\) is continuous except on a set of isolated points \(\{\rho _{k}^{j}: \rho _{k}^{j}<\rho _{k+1}^{j}, k\in \mathbb {Z}\}\), finite on any compact interval. Moreover, \(f_{j}^{I}(\cdot )=g_{j}(\cdot )+h_{j}(\cdot )\), where \(g_{j}\) is monotone and continuous on R and \(h_{j}\) is continuous except on the countable set of isolated points \(\{\rho _{k}^{j}\}\). The functions \(f_{j}^{R}(\cdot ), g_{j}(\cdot )\) are locally Lipschitz, i.e., for any \(\zeta ,\varsigma \in (\rho _{k}^{j},\rho _{k+1}^{j})\) there exist positive constants \(L_{j}^{f}\) and \(L_{j}^{g}\), \(j=1,2,\ldots ,n\), such that \(|f_{j}^{R}(\zeta )-f_{j}^{R}(\varsigma )|\le L_{j}^{f}|\zeta -\varsigma |, \ |g_{j}(\zeta )-g_{j}(\varsigma )|\le L_{j}^{g}|\zeta -\varsigma |\).
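For intuition, here is a hypothetical activation satisfying Assumption 1 (it is not the activation used in the paper's example): the real part is continuous and Lipschitz, and the imaginary part splits as \(f^{I}=g+h\) with \(g\) monotone continuous and \(h\) continuous except at the single isolated point 0.

```python
import math

# A hypothetical activation satisfying Assumption 1 (not the one used in
# the paper's example): f^R is continuous and Lipschitz, and the imaginary
# part splits as f^I = g + h, with g monotone continuous and h continuous
# except at the isolated point 0.

def f_real(x):          # continuous, Lipschitz with L = 1
    return math.tanh(x)

def g(y):               # monotone continuous part, Lipschitz with L = 1
    return y

def h(y):               # discontinuous part: jump at y = 0
    return 1.0 if y > 0 else (-1.0 if y < 0 else 0.0)

def f_imag(y):
    return g(y) + h(y)

# f_imag jumps from about -1 to about 1 across y = 0:
print(f_imag(-1e-9), f_imag(1e-9))
```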
Denote \(z_{j}(t)=x_{j}(t)+iy_{j}(t)\) with \(x_{j}(t)\) and \(y_{j}(t)\in R\), then the network (1) can be rewritten in the equivalent forms as follows:
The following assumptions are also needed for the systems (2a)–(2b).
Assumption 2
\(d_{j}(t), a_{jk}^{R}(t), a_{jk}^{I}(t), u_{j}^{R}(t), u_{j}^{I}(t), d_{s}K_{jk}(t,s)\) are all continuous almost periodic functions on R, i.e., for any \(\varepsilon >0\) there exists \(l=l(\varepsilon )>0\) such that every interval \([\alpha ,\alpha +l]\) contains an \(\omega \) for which
hold for all \(j,k=1,2,\ldots ,n\) and \(t\in R\). Moreover, \(d_{s}K_{jk}(t,s)\) is dominated by some Lebesgue–Stieltjes measure \(d\overline{K}_{jk}(s)\) independent of t, i.e., \(\int _{0}^{+\infty }f(s)|d_{s}K_{jk}(t,s)|\le \int _{0}^{+\infty }f(s)|d\overline{K}_{jk}(s)|\) for any nonnegative measurable function f(s).
Assumption 3
There exist positive constants \(\xi _{j}, \phi _{j}>0\) and \(\delta \) with \(d_{j}(t)>\delta >0\) such that \(\Gamma _{j}^{1}(t)<0, \Gamma _{j}^{2}(t)<0\), and \(\Upsilon _{j}^{1}(t)<0\),
where \(j=1,2,\ldots ,n\)
Definition 1
For given continuous functions \(\theta _{k}(s)\) and \(\varphi _{k}(s)\) defined on \((-\infty , 0]\), together with measurable functions \(\upsilon _{k}(s)\in \overline{co}[f_{k}^{I}(\varphi _{k}(s))]\) for almost all \(s\in (-\infty , 0]\), an absolutely continuous function \(z(t)=x(t)+iy(t)\) with \(x_{k}(s)=\theta _{k}(s)\), \(y_{k}(s)=\varphi _{k}(s)\) for all \(s\in (-\infty , 0]\) is said to be a solution of the systems (2a)–(2b) on [0, T] if there exist measurable functions \(\gamma _{k}^{I}(t)\in \overline{co}[f_{k}^{I}(y_{k}(t))]\) for almost all \(t\in [0,T]\) such that
and \(\gamma _{k}(s)=\upsilon _{k}(s)\) for almost all \(s\in (-\,\infty , 0]\), where \(k=1,2,\ldots ,n\).
Definition 2
Let \(z^{*}(t)=\left( z_{1}^{*}(t),z_{2}^{*}(t),\ldots ,z_{n}^{*}(t)\right) ^{T}\) be a solution of the given initial value problem of system (1). Then \(z^{*}(t)\) is said to be globally exponentially stable if, for any solution \(z(t)=\left( z_{1}(t),z_{2}(t),\ldots ,z_{n}(t)\right) ^{T}\) of the dynamical system (1), there exist constants \(M>0\) and \(\delta >0\) such that \(\Vert z(t)-z^{*}(t)\Vert \le Me^{-\delta t}\) for all \(t\ge 0\).
Following Fink [43] and He [44], the concept of an almost periodic function is presented as follows.
Definition 3
[37] A continuous function \(z(t):R \rightarrow \mathbb {C}^{n}\) is said to be almost periodic on R if, for any \(\varepsilon >0\), it is possible to find a real number \(l=l(\varepsilon )>0\) such that every interval \([\alpha ,\alpha +l]\) of length \(l(\varepsilon )\) contains a number \(\omega =\omega (\varepsilon )\) with \(\Vert z(t+\omega )-z(t)\Vert <\varepsilon \) for all \(t\in R\).
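A classical illustration (not taken from the paper): \(t\mapsto \sin t+\sin \sqrt{2}t\) is almost periodic but not periodic, since the frequencies 1 and \(\sqrt{2}\) are incommensurable. The sketch below checks numerically that \(2\pi \), a period of the first summand, is not a period of the sum.

```python
import math

# f(t) = sin(t) + sin(sqrt(2) t) is a classical almost periodic function
# that is not periodic: the frequencies 1 and sqrt(2) are incommensurable.
def f(t):
    return math.sin(t) + math.sin(math.sqrt(2) * t)

# 2*pi is a period of the first summand but not of f itself:
print(abs(f(2 * math.pi) - f(0.0)))   # noticeably larger than 0
```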
The following lemma gives a chain rule for computing the time derivative of the composite function \(V(q(t)):[0, +\infty )\rightarrow R\), where \(q(t):[0, +\infty )\rightarrow R^{n}\) is absolutely continuous on any compact interval of \([0, +\infty )\).
Lemma 1
(Chain Rule) [37] Assume that \(V:R^{n}\rightarrow R\) is C-regular and that q(t) is absolutely continuous on any compact interval of \([0, +\infty )\). Then q(t) and \(V(q(t)):[0, +\infty )\rightarrow R\) are differentiable for \(a.e.\ t\in [0, +\infty )\), and we have
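The displayed formula is absent from this excerpt; in the non-smooth analysis literature the chain rule for a C-regular function is usually stated in the following standard form (reproduced here as that standard form, not verbatim from the paper):

```latex
\frac{d}{dt}V(q(t)) = \langle \zeta (t), \dot{q}(t)\rangle ,
\qquad \forall \, \zeta (t)\in \partial V(q(t)), \quad a.e.\ t\in [0,+\infty ),
```

where \(\partial V\) denotes the Clarke generalized gradient of V.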
3 The Existence of Almost Periodic Solution for the Dynamic System
In this section, the existence of an almost periodic solution of system (1) is considered. By means of a suitable Lyapunov function, sufficient criteria are obtained to guarantee the existence of the almost periodic solution.
Lemma 2
Suppose that Assumptions 1–3 are satisfied. Then for any initial value, the dynamical system (1) has a solution \(z(t)=x(t)+iy(t)\) associated with a measurable function \(\gamma (t)\) for \(a.e.\ t\in R\). Moreover, there exists a constant \(M>0\) such that \(\Vert z(t)\Vert <M\) for \(t\in R\) and \(\Vert \gamma (t)\Vert <M\) for \(a.e.\ t\in R\).
Proof
Define the set-valued map as follows:
It is easy to see that this set-valued map is upper semi-continuous with nonempty compact convex values, which implies that a local solution \(\left( x(t),y(t)\right) \) of (2a)–(2b) exists. That is, the initial value problem for the systems (2a)–(2b) has at least one solution \(\left( x(t),y(t)\right) \) on [0, T) for some \(T\in (0, +\infty ]\).
Next, we show that \(\lim \nolimits _{t\rightarrow T^{-}}\Vert z(t)\Vert <+\infty \) if \(T<+\infty \), which means that the maximal existence interval of z(t) can be extended to \(+\infty \). Note that \(f^{I}_{k}(y_{k}(t))=g_{k}(y_{k}(t))+h_{k}(y_{k}(t))\). There exists a vector function \(\eta (t)=\left( \eta _{1}(t),\eta _{2}(t),\ldots ,\eta _{n}(t)\right) ^{T}:(-\infty ,T)\rightarrow R^{n}\) such that \(\eta _{k}(t)+g_{k}(y_{k}(t))=\gamma _{k}(t)\ (k=1,2,\ldots ,n)\), where \(\eta _{k}(t)\in \overline{co}[h_{k}(y_{k}(t))]\) for \(a.e.\ t\in (-\infty , T)\).
Construct a function as follows:
where
From (3), we obtain
Then, we have
where \(v_{j}(t)=sign(x_{j}(t))\) if \(x_{j}(t)\ne 0\), while \(v_{j}(t)\) can be selected arbitrarily in \([-1,1]\) if \(x_{j}(t)=0\). In particular, we can select \(v_{j}(t)\) as follows
Thus, we have
Similarly, we can choose \(w_{j}(t)\) as follows
We have
Calculating the derivative of V(t) with respect to t along the solution trajectories of the systems (2a)–(2b) in the sense of (3) by using Lemma 1, one gets that
Let us continue to calculate the derivative of \(V_{2}(t)\).
Therefore,
It follows from Assumption 3 and (6) that
where \(\widehat{u}=\sup \limits _{t\ge 0}\Vert u(t)\Vert <+\infty ,\ u(t)=\left( u_{1}(t),u_{2}(t),\cdots ,u_{n}(t)\right) ^{T}\), which implies that
Combining the definition of V(t) and (7), one has
Thus, there exists a constant \(M'=V(0)+\frac{1}{\delta }\widehat{u}\) such that \(\Vert z(t)\Vert <M'\) for \(t\in R\). Furthermore, \(\lim \limits _{t\rightarrow T^{-}}\Vert z(t)\Vert <+\infty \), which means \(T=+\infty \). That is, the dynamical system (1) has a global solution for any initial value problem.
Moreover, we have
where \(M_{0}=V(0)+\frac{1}{\delta }\widehat{u}+\Vert \theta \Vert \) and \(\Vert \theta \Vert =\sup \nolimits _{-\infty \le s\le 0}\left\{ \sum \limits _{k=1}^{n}\xi _{k}|x_{k}(s)|+\sum \limits _{k=1}^{n}\phi _{k}|y_{k}(s)|\right\} \).
Since \(f_{j}^{I}(\cdot )\) has only finitely many discontinuity points on any compact interval of R, it has, in particular, finitely many discontinuity points on the compact interval \([-M_{0},M_{0}]\). Without loss of generality, denote the discontinuity points of \(f_{j}^{I}(\cdot )\) on \([-M_{0},M_{0}]\) by \(\{\rho _{k}^{j}:k=1,2,\ldots ,l_{j}\}\) with \(-M_{0}<\rho _{1}^{j}<\rho _{2}^{j}<\cdots<\rho _{l_{j}}^{j}<M_{0}\). First, let us consider the following continuous restrictions of \(f_{j}^{I}(\cdot )\):
Denote
and
It is easy to see that
Note that \(\gamma _{j}(t)\in \overline{co}[f_{j}^{I}(y_{j}(t))]\), for \(a.e.\ t\in R\) and \(j=1,2,\ldots ,n\). Thus \(|\gamma _{j}(t)|\le \max \{|M_{j}^{1}|,|m_{j}^{1}|\},\ for\ a.e.\ t\in R\), and \(j=1,2,\ldots ,n\),
which implies that
Let \(M=\max \left\{ M_{0},\sum \limits _{j=1}^{n}\left| M_{j}^{1}\right| ,\sum \limits _{j=1}^{n}\left| m_{j}^{1}\right| \right\} \). Hence, from (8) and (9), we have
and
The proof of Lemma 2 is complete. \(\square \)
Lemma 3
Suppose that Assumptions 1–3 are satisfied. Then any solution of system (1) in the sense of (3) is asymptotically almost periodic, i.e., for any \(\varepsilon >0\), it is possible to find a real number \(l=l(\varepsilon )>0\) such that every interval \([\alpha ,\alpha +l]\) of length \(l(\varepsilon )\) contains a number \(\omega =\omega (\varepsilon )\) such that
hold for all \(t\ge T\).
Proof
Construct the following auxiliary functions
From Assumption 2 and the boundedness of \(z(t), f^{R}(x)\), and \(\gamma (t)\), it is easy to see that for any \(\varepsilon >0\) there exists \(l=l(\varepsilon )>0\) such that every interval \([\alpha ,\alpha +l]\) contains at least one point \(\omega \) satisfying the following inequalities:
where \(\Delta \triangleq \max \limits _{1\le j\le n}\{\xi _{j},\phi _{j}\}\). Hence, we have
Denote \(\widehat{x}(t)=x(t+\omega )-x(t), \widehat{y}(t)=y(t+\omega )-y(t)\). It follows from (2a) and (2b) that
Consider the following candidate function:
where
Let
where \(v_{j}(t)=sign(x_{j}(t+\omega )-x_{j}(t))\) if \(x_{j}(t+\omega )\ne x_{j}(t)\), while \(v_{j}(t)\) can be selected arbitrarily in \([-1,1]\) if \(x_{j}(t+\omega )=x_{j}(t)\). In particular, we can select \(v_{j}(t)\) as follows:
Thus, we have
Similarly, we can choose \(w_{j}(t)\) as follows:
We have
By the similar way utilized in Lemma 2, and combining the inequalities (14), (15), one has
Then
Note that L(0) is constant, and we can pick a sufficiently large \(T>0\) such that
Furthermore, we have
The proof of Lemma 3 is complete. \(\square \)
Theorem 1
Suppose that the Assumptions 1–3 are satisfied, then system (1) has at least one almost periodic solution in the sense of (3).
Proof
Let z(t) be any solution of the neural network system (3). We can select a sequence \(\{t_{k}\}_{k\in N}\) satisfying \(\lim \nolimits _{k\rightarrow +\infty }t_{k}=+\infty \) and such that
and
where \(j=1,2,\ldots ,n, \ \varepsilon _{j}^{1}(t,t_{k}), \varepsilon _{j}^{2}(t,t_{k})\) are the auxiliary functions (12) and (13) defined in the proof of Lemma 3.
It follows from (10) and (11) that there exists \(M^{*}>0\) such that \(|z'_{j}(t)|\le M^{*}\) for \(a.e.\ t\in R\). Thus, the sequence \(\{z(t+t_{k})\}_{k\in N}\) is equicontinuous and uniformly bounded. By the Arzelà–Ascoli theorem and the diagonal selection principle, we can choose a subsequence of \(\{t_{k}\}\) (still denoted by \(\{t_{k}\}\)) such that \(z(t+t_{k})\) converges uniformly to some absolutely continuous function \(z^{*}(t)\) on any compact interval [0, T].
Next, we prove that \(z^{*}(t)\) is an almost periodic solution of system (1) in the sense of (3). Firstly, we prove that \(z^{*}(t)\) is a solution of system (1) in the sense of (3).
According to Lebesgue’s dominated convergence theorem, for any \(t\in R\), we have
From (18) and (19), it is easy to conclude that
Therefore, the following equations hold:
Hence, \(z^{*}(t)=x^{*}(t)+iy^{*}(t)\) is a solution of system (1).
In the following, we claim that \(\gamma _{j}^{*}(t)\in \overline{co}[f_{j}^{I}(y_{j}^{*}(t))]\) for \(a.e.\ t\in R\). Note that \(y(t+t_{k})\) converges to \(y^{*}(t)\) uniformly with respect to \(t\in R\) and that \(\overline{co}[f_{j}^{I}(\cdot )]\) is an upper semi-continuous set-valued map. Hence, for any \(\varepsilon >0\), there exists \(N>0\) such that \(f^{I}(y(t+t_{k}))\in B(\overline{co}[f^{I}(y^{*}(t))],\varepsilon )\) for \(k>N\) and \(t\in R\). Since \(\overline{co}[f^{I}(y^{*}(t))]\) is convex and compact, \(\gamma (t+t_{k})\in B(\overline{co}[f^{I}(y^{*}(t))],\varepsilon )\), which implies that \(\gamma ^{*}_{j}(t)\in B(\overline{co}[f^{I}_{j}(y_{j}^{*}(t))],\varepsilon )\) holds for any \(t\in R\). By the arbitrariness of \(\varepsilon \), we conclude that \(\gamma ^{*}_{j}(t)\in \overline{co}[f_{j}^{I}(y_{j}^{*}(t))]\) for \(a.e.\ t\in R\).
Secondly, we prove that \(z^{*}(t)=x^{*}(t)+iy^{*}(t)\) is an almost periodic solution of system (1). By Lemma 3, for any \(\varepsilon >0\), there exist \(T>0\) and \(l=l(\varepsilon )\) such that any interval \([\alpha ,\alpha +l]\) contains an \(\omega \) such that
hold for all \(t\ge T\). Therefore, there exists sufficiently large constant \(K>0\) such that
holds for all \(k>K\) and \(t\in R\). Letting \(k\rightarrow +\infty \), we conclude that \(\Vert z^{*}(t+\omega )-z^{*}(t)\Vert <\varepsilon \) for all \(t\in R\). This implies that \(z^{*}(t)\) is an almost periodic solution of the neural network system (1). The proof is complete. \(\square \)
4 The Uniqueness and Global Exponential Stability Analysis for Dynamical Networks
In this section, we study the uniqueness and global exponential stability of the almost periodic solution obtained in Sect. 3 for the dynamical networks (1). By utilizing a suitable Lyapunov function, sufficient criteria are obtained to guarantee that the network has a unique, globally exponentially stable almost periodic solution.
Theorem 2
Suppose that the Assumptions 1–3 are satisfied, then system (1) has a unique almost periodic solution, which is globally exponentially stable in the sense of (3).
Proof
Let \(z(t)=x(t)+iy(t)\) and \(\widetilde{z}(t)=\widetilde{x}(t)+i\widetilde{y}(t)\) be any two solutions of the neural network system (1) associated with \(\gamma (t), \widetilde{\gamma }(t)\) and initial value pairs \((\psi ,\mu ), (\widetilde{\psi }, \widetilde{\mu })\), respectively.
Note that \(f^{I}_{j}=g_{j}+h_{j}\). There exist two vector functions \(\eta (t)=\left( \eta _{1}(t),\ldots ,\eta _{n}(t)\right) ^{T}\) and \(\widetilde{\eta }(t)=\left( \widetilde{\eta }_{1}(t),\ldots ,\widetilde{\eta }_{n}(t)\right) ^{T}\) such that \(\eta _{j}(t)+g_{j}(y_{j}(t))=\gamma _{j}(t), \widetilde{\eta }_{j}(t)+g_{j}(\widetilde{y}_{j}(t))=\widetilde{\gamma }_{j}(t)\ (j=1,2,\ldots ,n)\), where \(\eta _{j}(t)\in \overline{co}[h_{j}(y_{j}(t))], \widetilde{\eta }_{j}(t)\in \overline{co}[h_{j}(\widetilde{y}_{j}(t))]\), for \(a.e.\ t\in (-\infty ,T)\).
Construct the following candidate function:
where
Now, let us calculate the derivative of W(t) with respect to t along the solution trajectories of the systems (2a)–(2b) in the sense of (3) by using Lemma 1, we get that
Furthermore, we have
where \(v_{j}(t)=sign(x_{j}(t)-\widetilde{x}_{j}(t))\) if \(x_{j}(t)\ne \widetilde{x}_{j}(t)\), while \(v_{j}(t)\) can be selected arbitrarily in \([-1,1]\) if \(x_{j}(t)=\widetilde{x}_{j}(t)\). In particular, we can select \(v_{j}(t)\) as follows:
Thus, we have
Similarly, we can choose \(w_{j}(t)\) as follows
We have
Therefore,
It follows from Assumption 3 and (21) that
Note that
It follows from (22) and (23) that
Let \(M=M(\psi ,\mu ,\widetilde{\psi },\widetilde{\mu })=W(0)\); then \(\Vert z(t)-\widetilde{z}(t)\Vert \le Me^{-\delta t}\). We know that there exists an almost periodic solution of system (1) in the sense of (3). Hence, one has
which implies that the almost periodic solution \(z^{*}(t)\) is globally exponentially stable. Finally, we point out that the almost periodic solution of system (1) is unique. Indeed, suppose that \(z^{*}(t)\) and \(u^{*}(t)\) are two almost periodic solutions of system (1). Applying (25) again gives
From Levitan and Zhikov [45], we conclude that if \(z^{*}(t)\) and \(u^{*}(t)\) are two almost periodic functions satisfying (25), then \(z^{*}(t)=u^{*}(t)\). Therefore, the almost periodic solution of system (1) is unique. \(\square \)
5 Applications of the Main Results
In this section, we consider complex-valued neural networks with discontinuous activations and delays as special cases of the main theorems.
Since any periodic function can be regarded as an almost periodic function, all of the above results also apply to the periodic case. Now, in place of Assumption 3, we make the following assumption.
Assumption 4
For each \(j=1,2,\ldots ,n\), \(a_{jk}(t), b_{jk}(t), u_{j}(t)\) are all continuous \(\mathbb {C}\)-valued functions on R, and \(d_{j}(t)>0\) is a continuous function on R. There exists \(\omega >0\) such that
hold for all \(j,k=1,2,\ldots ,n\) and \(t\in R\).
According to the Theorems 1 and 2, the following corollary can be obtained immediately.
Corollary 1
Suppose that the Assumptions 1 and 4 are satisfied, then system (1) has a unique periodic solution, which is globally exponentially stable.
Furthermore, a constant can be regarded as a periodic function with an arbitrary period. For example, consider the following delayed complex-valued neural networks:
Assumption 5
Assume that the delays \(\tau _{jk}(t)\) are continuous functions satisfying \(\tau '_{jk}(t)<1\) for \(j,k=1,2,\ldots ,n\). Moreover, there exist positive constants \(\xi _{j}, \phi _{j}\) and \(\delta \) with \(d_{j}>\delta >0\) such that \(\overline{\Gamma }_{j}<0\) and \(\overline{\Upsilon }_{j}<0\), \(j=1,2,\ldots ,n\),
where
Corollary 2
Suppose that Assumptions 1, 2, and 5 are satisfied. Then system (21) has a unique periodic solution, which is globally exponentially stable.
6 Numerical Example
In this section, a numerical example is provided to illustrate the validity of the theoretical results obtained in Theorem 2.
Let \(d_{s}K_{jk}(t,s)=b_{jk}(t)\sin s\, e^{-2s}ds\). Then the delayed system (1) becomes a system with distributed delays:
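As a quick sanity check on this kernel (an illustrative computation, not part of the paper), the total signed weight is \(\int _{0}^{+\infty }e^{-2s}\sin s\,ds=1/(2^{2}+1^{2})=0.2\), so each distributed-delay term carries finite weight. The trapezoidal approximation below confirms the value:

```python
import math

# Numerically verify int_0^inf e^{-2s} sin(s) ds = 1/(2^2 + 1^2) = 0.2,
# the total signed weight of the kernel sin(s) e^{-2s} ds.
def kernel(s):
    return math.exp(-2 * s) * math.sin(s)

h, T = 1e-4, 20.0            # step size and truncation of the tail
n = int(T / h)
integral = h * (sum(kernel(i * h) for i in range(1, n)) +
                0.5 * (kernel(0.0) + kernel(T)))
print(integral)              # approximately 0.2
```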
In this case, \(b_{jk}(t)\) are almost periodic functions for all \(j,k=1,2,\ldots ,n\).
Example 1
Consider the complex-valued neural networks consisting of two subnetworks as follows:
where discontinuous activation functions \(f=f^{R}+f^{I}i\),
It is easy to see that \(f_{k}^{R}(\cdot ), f_{k}^{I}(\cdot )\) are locally Lipschitz with \(L_{k}^{f}=L_{k}^{g}=2\). We take \(d_{1}(t)=4,\ d_{2}(t)=6,\ a_{11}^{R}=-0.5+0.01\sin \sqrt{2}t, \ a_{11}^{I}(t)=0.01\sin \sqrt{2}t,\ a_{12}^{R}(t)=-0.01,\ a_{12}^{I}(t)=-0.01,\ a_{21}^{R}(t)=0.01\cos \sqrt{2}t,\ a_{21}^{I}(t)=0.01\sin \sqrt{2}t,\ a_{22}^{R}(t)=-0.4,\) \(\ a_{22}^{I}(t)=-0.01\cos \sqrt{3}t,\ b_{11}(t)=0.01\sin \sqrt{2}t,\ b_{12}(t)=-0.01\sin \sqrt{2}t,\) \(\ b_{21}(t)=0.01\cos \sqrt{5}t,\ b_{22}(t)=0.01\sin t,\ u^{R}_{1}(t)=0.02\sin \sqrt{2}t+0.01\cos \sqrt{5}t,\ u^{I}_{1}(t)=0.02\sin \sqrt{2}t+0.01\cos \sqrt{5}t,\ u^{R}_{2}(t)=0.03\cos \sqrt{3}t-0.01\sin t,\ u^{I}_{2}(t)=0.03\cos \sqrt{3}t-0.01\sin t,\ |d\overline{K}_{jk}(s)|=0.01e^{-2s}ds\), which satisfy Assumption 2. Let \(\delta =0.01,\ \xi _{1}=\xi _{2}=\phi _{1}=\phi _{2}=1\); then \(\Gamma _{1}^{1}<0,\ \Gamma _{2}^{1}<0, \Gamma _{1}^{2}<0, \Gamma _{2}^{2}<0, \Upsilon _{1}^{1}<0,\ \Upsilon _{2}^{1}<0\). According to Theorems 1 and 2, system (29) has a unique, globally exponentially stable almost periodic solution. The dynamical behavior of system (29) is illustrated in Figs. 1, 2, 3 and 4, where five initial values of system (29) are used.
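To complement the figures, a system of this kind can be explored numerically. The sketch below is a forward-Euler simulation under stated assumptions: the paper's exact activation formulas are not reproduced in this excerpt, so the activation f below is a placeholder consistent with Assumption 1, and the distributed-delay integral is truncated and discretized coarsely; the coefficients are those listed above.

```python
import math

# Qualitative Euler simulation in the spirit of Example 1. The activation f
# is a PLACEHOLDER consistent with Assumption 1 (continuous real part,
# imaginary part with a jump at 0, Lipschitz constants bounded by 2);
# the time-varying coefficients are those listed in the example.

def f(z):
    fr = 2.0 * math.tanh(z.real)
    fi = z.imag + (1.0 if z.imag > 0 else -1.0 if z.imag < 0 else 0.0)
    return complex(fr, fi)

d = [4.0, 6.0]
s2, s3, s5 = math.sqrt(2), math.sqrt(3), math.sqrt(5)

def A(t):  # a_jk(t) = a_jk^R(t) + i a_jk^I(t)
    return [[complex(-0.5 + 0.01*math.sin(s2*t), 0.01*math.sin(s2*t)),
             complex(-0.01, -0.01)],
            [complex(0.01*math.cos(s2*t), 0.01*math.sin(s2*t)),
             complex(-0.4, -0.01*math.cos(s3*t))]]

def B(t):  # delay weights b_jk(t)
    return [[0.01*math.sin(s2*t), -0.01*math.sin(s2*t)],
            [0.01*math.cos(s5*t), 0.01*math.sin(t)]]

def u(t):  # external inputs u_j(t) = u_j^R(t) + i u_j^I(t)
    v1 = 0.02*math.sin(s2*t) + 0.01*math.cos(s5*t)
    v2 = 0.03*math.cos(s3*t) - 0.01*math.sin(t)
    return [complex(v1, v1), complex(v2, v2)]

h, steps = 0.01, 2000          # Euler step size and number of steps
m = 500                        # history buffer covers the kernel on [0, 5]
ds = 0.1                       # quadrature step for the delay integral
nodes = [(10 * q, ds * math.sin(q*ds) * math.exp(-2*q*ds)) for q in range(50)]
hist = [[complex(0.5, -0.5), complex(-0.3, 0.4)]] * (m + 1)  # constant history
z = list(hist[-1])
for n in range(steps):
    t = n * h
    At, Bt, ut = A(t), B(t), u(t)
    znew = []
    for j in range(2):
        inst = sum(At[j][k] * f(z[k]) for k in range(2))
        # distributed delay: int_0^5 f(z(t-s)) sin(s) e^{-2s} ds (rectangle rule)
        dly = sum(Bt[j][k] * sum(w * f(hist[-1 - i][k]) for i, w in nodes)
                  for k in range(2))
        znew.append(z[j] + h * (-d[j]*z[j] + inst + dly + ut[j]))
    hist.append(znew); hist.pop(0)
    z = znew
print(abs(z[0]), abs(z[1]))    # trajectories remain bounded
```

With the strong self-decay rates \(d_{1}=4, d_{2}=6\) and the small coupling and input magnitudes above, the trajectories settle quickly into a small-amplitude bounded oscillation, consistent with the stability conclusion of Theorem 2.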
Remark 1
When \(a_{jk}^{I}(t)=b_{jk}^{I}(t)=0\) and \(f_{k}(\cdot )\) are real functions, system (1) reduces to a real-valued system as in [33]. In this paper, we investigate for the first time the uniqueness and stability of the almost periodic solution of delayed complex-valued neural networks with almost periodic coefficients and discontinuous activations. We treat a special kind of discontinuous complex-valued activation function in which the real part is continuous and the imaginary part is discontinuous. In this sense, complex-valued neural networks are more general than real-valued neural networks.
Remark 2
This paper differs from [42] in that the activation functions are no longer assumed to be monotonic. The almost periodic solution is studied in the complex domain, which is more feasible in practice than the periodic scheme, and the complex-valued network is decomposed into a real-valued one whose activation function has a continuous real part and a discontinuous imaginary part. Under these circumstances, the almost periodic dynamical behaviors are studied by the generalized Lyapunov function method, and time-varying delays are also considered, which makes our results more general.
7 Conclusion
In the past decades, the theoretical framework of discontinuous neural networks and their applications has been established in practice. In this article, we studied the almost periodic solution of complex-valued neural networks with discontinuous activations based on the concept of Filippov solution. Under the assumption that the complex-valued activation functions decompose into a continuous real part and a discontinuous imaginary part, we proved the exponential convergence of the almost periodic solution by using the diagonal dominance principle and non-smooth analysis theory together with a generalized Lyapunov approach. Furthermore, we established the existence, uniqueness, and global stability of the almost periodic solution of the complex-valued neural networks. Finally, a numerical example demonstrated the effectiveness of the obtained theoretical results.
References
Liu D, Xiong X, DasGupta B, Zhang H (2006) Motif discoveries in unaligned molecular sequences using self-organizing neural networks. IEEE Trans Neural Netw 17(4):919–928
Rutkowski L (2004) Adaptive probabilistic neural networks for pattern classification in time-varying environment. IEEE Trans Neural Netw 15(4):811–827
Cao J, Liu Y (2004) A general projection neural network for solving monotone variational inequalities and related optimization problems. IEEE Trans Neural Netw 15(2):318–328
Xu X, Cao J, Xiao M, Ho DWC, Wen G (2015) A new framework for analysis on stability and bifurcation in a class of neural networks with discrete and distributed delays. IEEE Trans Cybernet 45(10):2224–2236
Hu J, Wang J (2012) Global stability of complex-valued recurrent neural networks with time-delays. IEEE Trans Neural Netw Learn Syst 23(6):853–865
Hirose A, Yoshida S (2012) Generalization characteristics of complex-valued feedforward neural networks in relation to signal coherence. IEEE Trans Neural Netw Learn Syst 23(4):541–551
Zhou B, Song Q (2013) Boundedness and complete stability of complex-valued neural networks with time delay. IEEE Trans Neural Netw Learn Syst 24(8):1227–1238
Chen X, Song Q (2013) Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales. Neurocomputing 121:254–264
Huang Y, Zhang H, Wang Z (2014) Multistability of complex-valued recurrent neural networks with real-imaginary-type activation functions. Appl Math Comput 229:187–200
Hirose A (2006) Complex-valued neural networks. Springer, New York
Jankowski S, Lozowski A, Zyrada J (1996) Complex-valued multistate neural associative memory. IEEE Trans Neural Netw 7(6):1491–1496
Lee DL (2001) Relaxation of the stability condition of the complex-valued neural networks. IEEE Trans Neural Netw 12(5):1260–1262
Mathes JH, Howell RW (1997) Complex analysis for mathematics and engineering, 3rd edn. Jones and Bartlett Pub. Inc., Burlington
Hu J, Wang J (2012) Global stability of complex-valued recurrnnt neural networks with time-delays. IEEE Trans Neural Netw Learn Syst 23(6):853–865
Zhang ZY, Lin C, Chen B (2014) Global stability criterion for delayed complex-valued recurrnnt neural networks. IEEE Trans Neural Netw Learn Syst 25(9):1704–1708
Fang T, Sun JT (2014) Further investigate the stability of complex-valued neural networks with time-delays. IEEE Trans Neural Netw Learn Syst 25(9):1709–1713
Song Q, Zhao Z (2016) Stability criterion of complex-valued neural networks with both leakage delay and time-varying delays on time scales. Neurocomputing 171:179–184
Li X, Rakkiyappan R, Velmurugan G (2015) Dissipativity analysis of memristor-based complex-valued neural networks with time-varying delays. Inf Sci 294:645–665
Chen X, Song Q (2013) Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales. Neurocomputing 121:254–264
Liu X, Chen T (2016) Global exponential stability for complex-valued recurrent neural networks with asynchronous time delays. IEEE Trans Neural Netw Learn Syst 27(3):593–606
Gong W, Liang J, Cao J (2015) Matrix measure method for global exponential stability of complex-valued recurrent neural networks with time-varying delays. Neural Netw 70:81–89
Bai C (2009) Existence and stability of almost periodic solutions of Hopfield neural networks with continuously disturbuted delays. Nonlinear Anal 71(11):5850–5859
Nitta T (2004) Orthogonality of decision boundaries in complex-valued neural networks. Neural Comput 16(1):73–97
Faijul Amin M, Murase K (2009) Single-layered complex-valued neural network for real-valued classification problem. Neurocomputing 72:945–955
Forti M, Nistri P (2003) Global convergence of neural networks with discontinuous neuron activations. IEEE Trans Circuits Syst I Fundam Theory Appl 50:1421–1435
Forti M (2007) M-matrices and global convergence of discontinuous neural networks. Int J Circuit Theory Appl 35(2):105–130
Bao G, Zeng Z (2012) Analysis and design of associative memories based on recurrent neural network with discontinuous activation functions. Neurocomputing 77(1):101–107
Huang Y, Zhang H, Wang Z (2012) Dynamical stability analysis of multiple equilibrium points in time-varying delayed recurrent neural networks with discontinuous activation functions. Neurocomputing 91:21–28
Duan L, Huang L (2014) Global dynamics of equilibrium point for delayed competitive neural networks with different time scales and discontinuous activations. Neurocomputing 123:318–327
Bai C (2009) Existence and stability of almost periodic solutions of Hopfield neural networks with continuously distributed delays. Nonlinear Anal 71:5850–5859
Wang D, Huang L, Cai Z (2013) On the periodic dynamics of a general Cohen–Grossberg BAM neural networks via differential inclusions. Neurocomputing 118:203–214
Liu J, Liu X, Xie W (2012) Global convergence of neural networks with mixed time-varying delays and discontinuous neuron activations. Inf Sci 183:92–105
Lu W, Chen T (2008) Almost periodic dynamics of a class of delayed neural networks with discontinuous activations. Neural Comput 20(4):1065–1090
Allegretto W, Papini D, Forti M (2010) Common asymptotic behavior of solutions and almost periodicity for discontinuous, delayed, and impulsive neural networks. IEEE Trans Neural Netw 21:1110–1125
Huang Z, Mohamad S, Feng C (2011) New results on exponential attractivity of multiple almost periodic solutions of cellular neural networks with time-delaying delays. Math Comput Model 52:1521–1531
Duan L, Huang L, Guo Z (2014) Stability and almost periodicity for delayed high-order Hopfield neural networks with discontinuous activations. Nonlinear Dyn 77(4):1469–1484
Wang D, Huang L (2014) Alomst periodic dynamical behavior for generlized Cohen–Grossbberg neural networks with discontinuous activations via differential inclusion. Commun Nonlinear Sci Numer Simul 19:3857–3879
Li Y, Wu H (2009) Global stability analysis for periodic solution in discontinuous neural networks with nonlinear growth activations. Adv Differ Equ 2009:798685
Huang YJ, Hu HG (2015) Multistability of delayed complex-valued recurrent neural networks with discontinuous real-imaginary-type activation functions. Chin Phys B 24(12):120701
Liang J, Gong W, Huang T (2016) multistability of complex-valued neural networks with discontinuous activation functions. Neural Netw 84:125–142
Wang Z, Guo Z, Huang L, Liu X (2017) Dynamical behavior of complex-valued hopfield neural networks with discontinuous activation functions. Neural Process Lett 45(3):1039–1061
Yan M, Qiu J, Chen X, Chen X, Yang C, Zhang A (2017) Almost periodic dynamics of the delayed complex-valued recurrent neural networks with discontinuous activation functions. Neural Comput Appl. https://doi.org/10.1007/s00521-017-2911-1
Fink AM (1974) Almost periodic differential equations. Lecture notes in mathematics. Springer, Berlin
He C (1992) Almost periodic differential equation. Higher Education Publishing House, Beijing
Levitan BM, Zhikov VV (1982) Almost periodic functions and differential equations. Cambridge University Press, London
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China under Grant Nos. 61403179, 61304023, 61703193 and 61773193, in part by the Natural Science Foundation of Shandong Province of China under Grant Nos. ZR2017MF022, ZR2014CP008 and ZR2016JL021, and in part by the AMEP of Linyi University.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interests regarding the publication of this article.
Rights and permissions
About this article
Cite this article
Yan, M., Qiu, J., Chen, X. et al. The Global Exponential Stability of the Delayed Complex-Valued Neural Networks with Almost Periodic Coefficients and Discontinuous Activations. Neural Process Lett 48, 577–601 (2018). https://doi.org/10.1007/s11063-017-9736-x
Published:
Issue Date:
DOI: https://doi.org/10.1007/s11063-017-9736-x