Abstract
This work proposes an improved global robust stability condition for neural networks with intervalized network parameters and a single time delay. To obtain this new robust stability condition, a new upper bound for the norm of the intervalized interconnection matrices is established. The homeomorphism mapping and Lyapunov stability theorems are then employed, together with this upper bound, to derive the proposed stability condition. The obtained result applies to all nondecreasing, slope-bounded activation functions and imposes constraints on the network parameters that do not involve the time delay.
1 Introduction
Analysis of the various dynamics of neural networks has recently become an interesting research topic, owing to qualitative properties that can be exploited to solve practical real-world problems in combinatorial optimization, image processing, and control systems. When solving such problems with neural networks, one needs to establish a neural system possessing a unique and globally asymptotically stable equilibrium point; that is, one must address the stability of the network. The fact that neurons implemented by amplifiers have finite switching speeds introduces time delays, which may have undesired effects on the dynamics of neural networks. Another problem is that the parameters of neural systems may involve uncertainties, which can also affect the equilibria of neural networks. For these reasons, a proper stability analysis must include both the time delay in the states and the uncertainties in the network parameters in the mathematical model of the neural network. That is to say, the key requirement is the establishment of robust stability of neural systems that also involve time delay. A review of the literature shows that many researchers have published useful robust stability criteria for delayed neural systems (see references [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19]). This paper uses the Lyapunov and homeomorphism mapping theorems to derive a novel condition for global robust asymptotic stability.
Notations: Let \(z = (z_{1},z_{2}, ..., z_{n})^{T}\). We will use \(|z|=(|{z_1}|,|{z_2}|,...,|{z_n}|)^{T}\). For a given matrix \(E=({e_{ij}})_{n\times n}\), we will use \(|E|=({|e_{ij}|})_{n\times n}\), and \({\lambda _m}(E)\) will represent the minimum eigenvalue of E. If \(E=E^T\), \(E>0\) will denote that E is positive definite. \(E=({e_{ij}})_{n\times n}\) is a nonnegative matrix if \({e_{ij}}\ge 0, \forall i,j\). Assume that \(E=({e_{ij}})_{n\times n}\) and \(F=({f_{ij}})_{n\times n}\) are nonnegative matrices. In this case, \(E\preceq F\) will denote that \({e_{ij}}\le {f_{ij}}, \forall i,j\). For the vector z, we will use the norm \(||z||_2^2 = {\sum _{i=1}^{n}z_i^2}\), and for E, we use \(||E||_{2} = [\lambda _{\max }(E^{T}E)]^{1/2}\).
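As a quick illustration of these norm conventions (our own sketch, not part of the paper), the following Python fragment verifies numerically that \(||E||_{2}\) defined above coincides with the largest singular value of E:

```python
# Minimal numerical illustration of the norm conventions above (assumed example).
import numpy as np

E = np.array([[1.0, -2.0],
              [3.0,  0.5]])
z = np.array([1.0, -4.0])

norm_z = np.sqrt(np.sum(z ** 2))                       # ||z||_2^2 = sum_i z_i^2
norm_E = np.sqrt(np.max(np.linalg.eigvalsh(E.T @ E)))  # ||E||_2 = sqrt(lambda_max(E^T E))

assert np.isclose(norm_z, np.linalg.norm(z, 2))
assert np.isclose(norm_E, np.linalg.norm(E, 2))        # spectral norm = largest singular value
```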
2 Preliminaries
Consider the neural network described by the following mathematical model:

\(\dot{x}_i(t) = -c_i x_i(t) + \sum _{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum _{j=1}^{n} b_{ij} f_j(x_j(t-\tau )) + u_i, \quad i=1,2,...,n \qquad (1)\)

In the above equation, \(a_{ij}\) and \(b_{ij}\) are the interconnection parameters, \({c}_{i}\) are the neuron charging rates, \({x}_{i}(t)\) represents the state of neuron i, the functions \({f_i}({\cdot })\) are the nonlinear activation functions, \({\tau }\) represents the time delay, and \(u_i\) are the inputs.
Neural system (1) can be put into the equivalent vector-matrix form governed by the differential equation:

\(\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t-\tau )) + u \qquad (2)\)
where \(C=diag({c_i})\), \(A=({a_{ij}})_{n\times n}\), \(B=({b_{ij}})_{n\times n}\), \(x(t)=({x}_{1}(t),{x}_{2}(t),...,{x}_{n}(t))^{T}\), \(u=({u}_{1},{u}_{2},...,{u}_{n})^{T}\), \(f(x(\cdot ))=({f_1}({x}_{1}(\cdot )),{f_2}({x}_{2}(\cdot )),...,{f_n}({x}_{n}(\cdot )))^{T}.\)
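To make the model concrete, the following Python sketch integrates system (2) with a forward Euler scheme and a fixed-delay history buffer. All parameter values, and the choice of \(\tanh \) as activation, are illustrative assumptions for demonstration only, not values from the paper:

```python
# Illustrative forward-Euler simulation of system (2) with a single delay tau.
import numpy as np

n, tau, dt, T = 2, 1.0, 0.01, 20.0
C = np.diag([1.0, 1.5])                     # charging rates c_i > 0 (assumed)
A = np.array([[0.1, -0.2], [0.3, 0.1]])     # non-delayed interconnections (assumed)
B = np.array([[-0.1, 0.05], [0.2, -0.15]])  # delayed interconnections (assumed)
u = np.array([0.5, -0.3])                   # constant external inputs (assumed)
f = np.tanh                                 # a nondecreasing, slope-bounded activation

d = int(round(tau / dt))                    # delay expressed in time steps
steps = int(round(T / dt))
x = np.zeros((steps + 1, n))
x[: d + 1] = 0.1                            # constant initial history on [-tau, 0]

for t in range(d, steps):
    # x'(t) = -C x(t) + A f(x(t)) + B f(x(t - tau)) + u
    dx = -C @ x[t] + A @ f(x[t]) + B @ f(x[t - d]) + u
    x[t + 1] = x[t] + dt * dx

print("state at time T:", x[-1])            # settles near an equilibrium if stable
```

The buffer stores the trajectory on \([t-\tau , t]\), which is the standard way to handle a discrete delay numerically; when the network is globally asymptotically stable, the printed state approximates the unique equilibrium.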
The functions \(f_i\) possess the following properties:

\(0 \le \frac{f_i(x)-f_i(y)}{x-y} \le k_i, \quad \forall x,y \in R,\ x\ne y,\ i=1,2,...,n\)
with \({k_i}\) being positive constants. The functions satisfying the above conditions are denoted by \(f\in \mathcal{K}\).
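As a simple illustration (ours, not the paper's), \(f(x)=\tanh (x)\) belongs to \(\mathcal{K}\) with slope bound \(k=1\): by the mean value theorem, for any \(x\ne y\) there exists a point c between x and y such that

\(\frac{\tanh (x)-\tanh (y)}{x-y}=\mathrm{sech}^{2}(c)\in (0,1]\)

so the slope condition above holds with \(k=1\).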
The matrices \(A=({a_{ij}})\), \(B=({b_{ij}})\) and \(C=diag({c_i}>0)\) in (1) are assumed to belong to the following intervalized families:

\(A_I = \{A=(a_{ij}): {\underline{A}} \preceq A \preceq {\overline{A}}\}, \quad B_I = \{B=(b_{ij}): {\underline{B}} \preceq B \preceq {\overline{B}}\}, \quad C_I = \{C=diag(c_i): 0 < {\underline{c}_i} \le c_i \le {\overline{c}_i}\} \qquad (3)\)
We now introduce the following lemma, which is central to obtaining our main result:
Lemma 1:
Let D be a positive diagonal matrix with n diagonal entries, x be any real vector with n elements, and let \(A=(a_{ij})\) be any real \(n\times n\) matrix intervalized as \({\underline{A}} \preceq {A} \preceq {\overline{A}}\). Then, the following inequality is satisfied:

\(x^{T}A^{T}DAx \le |x|^{T}({{A_*}^T}D|{{A^*}}|+{{A_*}^T}D{{A_*}}+|{{A^*}^T}D{{A^*}}|+|{{A^*}^T}|D{A_*})|x|\)

in which \({A^*}={\frac{1}{2}}~({\overline{A}}+{\underline{A}})\) and \({A_*}=\frac{1}{2}~({\overline{A}}-{\underline{A}})\).
Proof:
If \(A\in {A_I}\), then each entry \({a_{ij}}\) can be written as

\(a_{ij} = a^{*}_{ij} + {1\over 2}{\sigma _{ij}}({\overline{a}_{ij}}-{\underline{a}_{ij}}), \quad -1 \le \sigma _{ij} \le 1\)

where \(a^{*}_{ij} = {1\over 2}({\overline{a}_{ij}}+{\underline{a}_{ij}})\). Assume that \(\tilde{A}=({\tilde{a}_{ij}})_{n\times n}\) is the real matrix whose elements are defined as \( {\tilde{a}}_{ij}={1\over 2}{\sigma _{ij}}({\overline{a}_{ij}}- {{\underline{a}_{ij}}})\). Then, A can be written as

\(A = {A^*} + \tilde{A}\)

We can now express the following:

\(x^{T}A^{T}DAx = x^{T}({A^*}+\tilde{A})^{T}D({A^*}+\tilde{A})x \le |x|^{T}(|{{A^*}^T}D{{A^*}}|+|{{A^*}^T}|D|\tilde{A}|+|\tilde{A}|^{T}D|{{A^*}}|+|\tilde{A}|^{T}D|\tilde{A}|)|x|\)

Since \( |{\tilde{a}}_{ij}| \le {1\over 2}({\overline{a}_{ij}}- {{\underline{a}_{ij}}}), {\forall } i,j \), it follows that \(|\tilde{A}| \preceq {A_*}\). Then, we obtain

\(x^{T}A^{T}DAx \le |x|^{T}({{A_*}^T}D|{{A^*}}|+{{A_*}^T}D{{A_*}}+|{{A^*}^T}D{{A^*}}|+|{{A^*}^T}|D{A_*})|x|\)

which completes the proof.
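Under our reading of the inequality displayed in Lemma 1, the bound can be checked numerically; the following Python sketch (an illustration under that assumption, not code from the paper) verifies it for random interval matrices:

```python
# Randomized sanity check of the Lemma 1 bound (as reconstructed above):
#   x^T A^T D A x <= |x|^T (A_*^T D |A^*| + A_*^T D A_* + |A^*^T D A^*| + |A^*^T| D A_*) |x|
import numpy as np

rng = np.random.default_rng(0)
n = 4
for _ in range(1000):
    A_lo = rng.uniform(-1.0, 1.0, (n, n))
    A_hi = A_lo + rng.uniform(0.0, 1.0, (n, n))   # guarantees A_lo <= A_hi elementwise
    A = rng.uniform(A_lo, A_hi)                   # arbitrary member of the interval family
    D = np.diag(rng.uniform(0.1, 2.0, n))         # positive diagonal matrix
    x = rng.normal(size=n)

    A_star = 0.5 * (A_hi + A_lo)                  # A^* (interval center)
    A_sub = 0.5 * (A_hi - A_lo)                   # A_* (interval radius), nonnegative

    lhs = x @ A.T @ D @ A @ x
    M = (A_sub.T @ D @ np.abs(A_star) + A_sub.T @ D @ A_sub
         + np.abs(A_star.T @ D @ A_star) + np.abs(A_star.T) @ D @ A_sub)
    rhs = np.abs(x) @ M @ np.abs(x)
    assert lhs <= rhs + 1e-9                      # the bound holds for every sample
```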
Below are two lemmas and a fact that will be needed in the proofs:
Lemma 2
[1]: Let D be a positive diagonal matrix with n diagonal entries, x be any real vector with n elements, and let \(A=(a_{ij})\) be any real \(n\times n\) matrix intervalized as \({\underline{A}} \preceq {A} \preceq {\overline{A}}\). Then, the following inequality is satisfied:

\(x^{T}(DA+A^{T}D)x \le |x|^{T}S|x|\)
where \(S=(s_{ij})\) such that \(s_{ii}=2{d_i}{\overline{a}_{ii}}\) and \(s_{ij}=max(|{d_i}\overline{a}_{ij}+{d_j}\overline{a}_{ji}|, |{d_i}\underline{a}_{ij}+{d_j}\underline{a}_{ji}|)\) for \(i\ne j\).
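This bound, too, admits a quick randomized check under our reading of the displayed inequality as \(x^{T}(DA+A^{T}D)x \le |x|^{T}S|x|\) (a minimal sketch under that assumption, not the paper's code):

```python
# Randomized sanity check of the Lemma 2 bound (as reconstructed above).
import numpy as np

rng = np.random.default_rng(1)
n = 4
for _ in range(1000):
    A_lo = rng.uniform(-1.0, 1.0, (n, n))
    A_hi = A_lo + rng.uniform(0.0, 1.0, (n, n))
    A = rng.uniform(A_lo, A_hi)
    d = rng.uniform(0.1, 2.0, n)                  # diagonal entries of D > 0
    x = rng.normal(size=n)

    # s_ii = 2 d_i a_ii_bar; s_ij = max of the two interval-endpoint combinations
    S = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                S[i, i] = 2.0 * d[i] * A_hi[i, i]
            else:
                S[i, j] = max(abs(d[i] * A_hi[i, j] + d[j] * A_hi[j, i]),
                              abs(d[i] * A_lo[i, j] + d[j] * A_lo[j, i]))

    lhs = x @ (np.diag(d) @ A + A.T @ np.diag(d)) @ x
    assert lhs <= np.abs(x) @ S @ np.abs(x) + 1e-9
```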
Lemma 3
[2]: Let the map \(H (y)\in C^0\) possess the following two properties: \(H (y)\ne H (z)\) for all \(y\ne z\), and \(||H(y)||{\rightarrow }{\infty }\) as \(||y||{\rightarrow }{\infty }\), where \(y\in R^n\) and \(z\in R^n\). Then, H(y) is a homeomorphism of \(R^n\).
Fact 1:
If \(A=({a_{ij}})\) and \(B=({b_{ij}})\) satisfy (3), then A and B have bounded norms; i.e., there exist positive real constants \(\epsilon \) and \(\varepsilon \) such that

\(||A||_{2} \le \epsilon , \quad ||B||_{2} \le \varepsilon , \quad \forall A\in A_I,\ \forall B\in B_I\)
3 Existence and Uniqueness Analysis
The following theorem presents the criterion which ensures that system (1) possesses a unique equilibrium point for each constant input:
Theorem 1:
Let the neuron activation functions belong to \(\mathcal{K}\), and assume that the uncertain network elements A, B and C are defined by (3). Then, the delayed neural network described by (1) possesses a unique equilibrium point if one can find a matrix \(D=diag({d_i}>0)\) satisfying the following condition
where \(K=diag({k_i}>0)\), \(Q={{B_*}^T}D|{{B^*}}|+{{B_*}^T}D{{B_*}}+|{{B^*}^T}D{{B^*}}|+|{{B^*}^T}|D{B_*}\), \(S=(s_{ij})\) is the matrix whose diagonal elements are defined by \(s_{ii}=2{d_i}{\overline{a}_{ii}}\) and off-diagonal elements are defined by \(s_{ij}=max{(|{d_i}\overline{a}_{ij}+{d_j}\overline{a}_{ji}|, |{d_i}\underline{a}_{ij}+{d_j}\underline{a}_{ji}|)}\), the matrix \({B^*}\) included in Q is defined as \({B^*}={1\over 2}({\overline{B}}+{\underline{B}})\) and the nonnegative matrix \({B_*}\) included in Q is defined as \({B_*}={1\over 2}({\overline{B}}-{\underline{B}})\).
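The matrices S, \({B^*}\), \({B_*}\) and Q appearing in this condition can be assembled directly from the interval bounds and D. The sketch below shows one way to construct them with NumPy; the interval data and D are illustrative assumptions, and the resulting matrices are then to be tested against the positive-definiteness condition of the theorem:

```python
# Assembling the matrices S, B^*, B_*, and Q of Theorem 1 from interval bounds.
# The interval data and the diagonal matrix D below are illustrative assumptions.
import numpy as np

A_lo = np.array([[-0.2, -0.1], [-0.1, -0.3]]); A_hi = np.array([[0.2, 0.1], [0.1, 0.3]])
B_lo = np.array([[-0.1, -0.2], [-0.2, -0.1]]); B_hi = np.array([[0.1, 0.2], [0.2, 0.1]])
d = np.array([1.0, 2.0])                          # diagonal entries of D > 0 (assumed)
D = np.diag(d)
n = len(d)

# s_ii = 2 d_i a_ii_bar; s_ij = max(|d_i a_ij_bar + d_j a_ji_bar|, |d_i a_ij_und + d_j a_ji_und|)
S = np.empty((n, n))
for i in range(n):
    for j in range(n):
        if i == j:
            S[i, i] = 2.0 * d[i] * A_hi[i, i]
        else:
            S[i, j] = max(abs(d[i] * A_hi[i, j] + d[j] * A_hi[j, i]),
                          abs(d[i] * A_lo[i, j] + d[j] * A_lo[j, i]))

B_star = 0.5 * (B_hi + B_lo)                      # B^* (interval center)
B_sub = 0.5 * (B_hi - B_lo)                       # B_* (interval radius), nonnegative
Q = (B_sub.T @ D @ np.abs(B_star) + B_sub.T @ D @ B_sub
     + np.abs(B_star.T @ D @ B_star) + np.abs(B_star.T) @ D @ B_sub)

print("S =\n", S)
print("Q =\n", Q)
```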
Proof:
Consider the map H(x) associated with neural network (2):

\(H(x) = -Cx + Af(x) + Bf(x) + u \qquad (4)\)

By the definition of an equilibrium, every equilibrium point \(x^*\) of (2) must satisfy

\(-Cx^{*} + Af(x^{*}) + Bf(x^{*}) + u = 0\)

Hence, every vector x satisfying \(H(x)=0\) is an equilibrium point of (2). Thus, by virtue of Lemma 3, one can conclude that neural model (2) possesses a unique equilibrium point for each constant u if H(x) fulfills the conditions of Lemma 3. For any two vectors \(x\ne y\), using (4), we can write

\(H(x)-H(y) = -C(x-y) + A(f(x)-f(y)) + B(f(x)-f(y))\)

Let \(H(x,y)={H}({x})-{H}({y})\) and \(f(x,y)={f}({x})-{f}({y})\). Then, the previous equation can be put in the form:

\(H(x,y) = -C(x-y) + (A+B)f(x,y) \qquad (5)\)
Since \(f\in \mathcal{K}\), if \(x\ne y\) and \({f}({x})={f}({y})\), (5) yields

\(H(x,y) = -C(x-y)\)

in which \(C=diag({c_i}>0)\). Therefore, \(x-y\ne 0\) ensures that \(H(x,y)\ne 0\). For \(x-y\ne 0\), \(f(x,y)\ne 0 \), and \(D=diag({d_i}>0)\), multiplying (5) by the nonzero vector \(2{f^T}(x,y)D\) leads to

\(2f^{T}(x,y)DH(x,y) = -2f^{T}(x,y)DC(x-y) + 2f^{T}(x,y)DAf(x,y) + 2f^{T}(x,y)DBf(x,y) \qquad (6)\)
The following can be written
Thus, one would obtain
For activation functions in \( \mathcal{K}\), the following can be derived

\(-2f^{T}(x,y)DC(x-y) \le -2f^{T}(x,y)DCK^{-1}f(x,y) \qquad (7)\)
Lemma 2 leads to

\(2f^{T}(x,y)DAf(x,y) = f^{T}(x,y)(DA+A^{T}D)f(x,y) \le |f(x,y)|^{T}S|f(x,y)| \qquad (8)\)
It is worth noting that

\(2f^{T}(x,y)DBf(x,y) \le f^{T}(x,y)Df(x,y) + f^{T}(x,y)B^{T}DBf(x,y)\)

By using Lemma 1, one would get

\(f^{T}(x,y)B^{T}DBf(x,y) \le |f(x,y)|^{T}Q|f(x,y)|\)

Thus

\(2f^{T}(x,y)DBf(x,y) \le f^{T}(x,y)Df(x,y) + |f(x,y)|^{T}Q|f(x,y)| \qquad (9)\)
Using (7)–(9) in (6) gives the following

\(2f^{T}(x,y)DH(x,y) \le -|f(x,y)|^{T}\varTheta |f(x,y)| \qquad (10)\)

Since \(\varTheta >0\), (10) implies that \(2f^{T}(x,y)DH(x,y)<0\), and hence \(H(x,y)\ne 0\), whenever \({f}(x,y)\ne 0 \). Together with the case \(f(x,y)=0\) considered above, this guarantees that \({H}({x})\ne {H}({y})\) for all \(x\ne y\).
Choosing \(y=0\), (10) will directly result in
It follows from the above inequality that
yielding
Using some basic properties of the vector norms, we can state
Since \(||{H}({0})||_{1}\), \(||D||_{\infty }\), and \(||{f}({0})||_{2}\) are finite, we can conclude that \(||{H}({x})||\rightarrow \infty \) as \(||{x}||\rightarrow \infty \). Q.E.D.
4 Stability Analysis
The stability of neural network (1) is studied in this section. Let \(x^*\) denote an equilibrium point of (1). By means of the transformation \({z}_{i}(\cdot )={x}_{i}(\cdot )-{x_i^*}\), which shifts the equilibrium point to the origin, neural system (1) is replaced by a new model whose dynamics are governed by:

\(\dot{z}_i(t) = -c_i z_i(t) + \sum _{j=1}^{n} a_{ij} g_j(z_j(t)) + \sum _{j=1}^{n} b_{ij} g_j(z_j(t-\tau )), \quad i=1,2,...,n \qquad (11)\)
Note that \({g_i}({z}_{i}(\cdot ))={f_i}({z}_{i}(\cdot )+{x_i^*})-{f_i}({x_i^*})\). We can easily observe that the functions \(g_i\) belong to the class \(\mathcal{K}\) and satisfy \({g_i}(0)=0\), as verified below.
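The membership \(g\in \mathcal{K}\) follows directly from the definition: for any \(z_1\ne z_2\),

\(\frac{g_i(z_1)-g_i(z_2)}{z_1-z_2}=\frac{f_i(z_1+x_i^*)-f_i(z_2+x_i^*)}{(z_1+x_i^*)-(z_2+x_i^*)}\in [0,k_i]\)

and \(g_i(0)=f_i(x_i^*)-f_i(x_i^*)=0\).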
The vector-matrix form of neural system (11) is

\(\dot{z}(t) = -Cz(t) + Ag(z(t)) + Bg(z(t-\tau )) \qquad (12)\)
In this new system, \(z(t)=({z}_{1}(t),{z}_{2}(t),...,{z}_{n}(t))^{T}\), and the vector of transformed activation functions is \(g(z(\cdot ))=({g_1}({z}_{1}(\cdot )),{g_2}({z}_{2}(\cdot )),...,{g_n}({z}_{n}(\cdot )))^{T}\).
The stability result is given as follows:
Theorem 2:
Let the neuron activation functions belong to \(\mathcal{K}\), and assume that the uncertain network elements A, B and C are given by (3). Then, the origin of the delayed neural system described by (11) is globally asymptotically stable if one can find a matrix \(D=diag({d_i}>0)\) satisfying the following condition
where \(K=diag({k_i}>0)\), \(Q={{B_*}^T}D|{{B^*}}|+{{B_*}^T}D{{B_*}}+|{{B^*}^T}D{{B^*}}|+|{{B^*}^T}|D{B_*}\), \(S=(s_{ij})\) is the matrix whose diagonal elements are defined by \(s_{ii}=2{d_i}{\overline{a}_{ii}}\) and off-diagonal elements are defined by \(s_{ij}=max{(|{d_i}\overline{a}_{ij}+{d_j}\overline{a}_{ji}|, |{d_i}\underline{a}_{ij}+{d_j}\underline{a}_{ji}|)}\), the matrix \({B^*}\) included in Q is defined as \({B^*}={1\over 2}({\overline{B}}+{\underline{B}})\) and the nonnegative matrix \({B_*}\) included in Q is defined as \({B_*}={1\over 2}({\overline{B}}-{\underline{B}})\).
Proof:
The Lyapunov functional to be exploited for the proof of this theorem is chosen as:
where \(d_i\), \(\gamma \) and \(\xi \) are positive constants. The time derivative \(\dot{V}({z}(t))\) along the trajectories of system (12) is obtained as
Let \(\alpha ={\Vert A\Vert _2^2}{\Vert C^{-1}\Vert _2}\) and \(\beta ={\Vert B\Vert _2^2}{\Vert C^{-1}\Vert _2}\). We now observe the following inequalities:
Inserting (14)–(18) into (13) yields
Taking \({\xi }={{\beta }}\) leads to
Since \(\varTheta >0\), (19) gives
Thus
guarantees that \({\dot{V}}({z}(t))\) takes negative values for all \({g}({z}(t))\ne 0\).
Let \({g}({z}(t))= 0\). Taking \({z}(t)\ne 0\) leads to
Then
leads to
in which \({\dot{V}}({z}(t))<0\) \(\forall {z}(t)\ne 0\). Finally, \({g}({z}(t))= {z}(t)=0\) leads to
Apparently, \({\dot{V}}({z}(t))<0\) for all \({g}({z}(t-\tau ))\ne 0\). Hence, \({\dot{V}}({z}(t))=0\) if and only if \({z}(t)={g}({z}(t))={g}({z}(t-\tau ))=0\); equivalently, \({\dot{V}}({z}(t))<0\) whenever the states and delayed states are not all zero. The radial unboundedness of the Lyapunov functional is easily checked by proving that \(V({z}(t))\rightarrow \infty \) as \(\Vert {z}(t)\Vert \rightarrow \infty \). Q.E.D.
5 Conclusions
This work has proposed an improved condition for the global robust stability of neural networks with intervalized network parameters and a single time delay. To obtain this new robust stability condition, a new upper bound for the norm of the intervalized interconnection matrices has been established. The homeomorphism mapping and Lyapunov stability theorems were employed, together with this upper bound, to derive the proposed stability condition. The obtained result applies to all nondecreasing, slope-bounded activation functions and imposes constraints on the network parameters that do not involve the time delay.
References
Ensari, T., Arik, S.: New results for robust stability of dynamical neural networks with discrete time delays. Expert Syst. Appl. 37, 5925–5930 (2010)
Faydasicok, O., Arik, S.: Robust stability analysis of a class of neural networks with discrete time delays. Neural Netw. 29–30, 52–59 (2012)
Faydasicok, O., Arik, S.: A new upper bound for the norm of interval matrices with application to robust stability analysis of delayed neural networks. Neural Netw. 44, 64–71 (2013)
Shao, J.L., Huang, T.Z., Zhou, S.: Some improved criteria for global robust exponential stability of neural networks with time-varying delays. Commun. Nonlinear Sci. Numer. Simul. 15, 3782–3794 (2010)
Zhu, Q., Cao, J.: Robust exponential stability of Markovian jump impulsive stochastic Cohen-Grossberg neural networks with mixed time delays. IEEE Trans. Neural Netw. 21, 1314–1325 (2010)
Huang, T., Li, C., Duan, S., Starzyk, J.A.: Robust exponential stability of uncertain delayed neural networks with stochastic perturbation and impulse effects. IEEE Trans. Neural Netw. Learn. Syst. 23, 866–875 (2012)
Zhou, J., Xu, S., Zhang, B., Zou, Y., Shen, H.: Robust exponential stability of uncertain stochastic neural networks with distributed delays and reaction-diffusions. IEEE Trans. Neural Netw. Learn. Syst. 23, 1407–1416 (2012)
Shen, Y., Wang, J.: Robustness analysis of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances. IEEE Trans. Neural Netw. Learn. Syst. 23, 87–96 (2012)
Song, Q., Yu, Q., Zhao, Z., Liu, Y., Alsaadi, F.E.: Boundedness and global robust stability analysis of delayed complex-valued neural networks with interval parameter uncertainties. Neural Netw. 103, 55–62 (2018)
Li, X., et al.: Extended robust global exponential stability for uncertain switched memristor-based neural networks with time-varying delays. Appl. Math. Comput. 325, 271–290 (2018)
Ali, M.S., Gunasekaran, N., Rani, M.E.: Robust stability of Hopfield delayed neural networks via an augmented L-K functional. Neurocomputing 234, 198–204 (2017)
Wang, Z., Wang, J., Wu, Y.: State estimation for recurrent neural networks with unknown delays: a robust analysis approach. Neurocomputing 227, 20–36 (2017)
Wan, Y., Cao, J., Wen, G., Yu, W.: Robust fixed-time synchronization of delayed Cohen-Grossberg neural networks. Neural Netw. 73, 86–94 (2016)
Muthukumar, P., Subramanian, K., Lakshmanan, S.: Robust finite time stabilization analysis for uncertain neural networks with leakage delay and probabilistic time-varying delays. J. Franklin Inst. 353, 4091–4113 (2016)
Gong, W., Liang, J., Kan, X., Nie, X.: Robust state estimation for delayed complex-valued neural networks. Neural Process. Lett. 46, 1009–1029 (2017)
Dong, Y., Liang, S., Guo, L.: Robustly exponential stability analysis for discrete-time stochastic neural networks with interval time-varying delays. Neural Process. Lett. 46, 135–158 (2017)
Arik, S.: New criteria for global robust stability of delayed neural networks with norm-bounded uncertainties. IEEE Trans. Neural Netw. Learn. Syst. 25, 1045–1052 (2014)
Wang, J., Zhang, H., Wang, Z., Shan, Q.: Local synchronization criteria of Markovian nonlinearly coupled neural networks with uncertain and partially unknown transition rates. IEEE Trans. Syst. Man Cybern. Syst. 47, 1953–1964 (2017)
Ding, Z., Zeng, Z., Wang, L.: Robust finite-time stabilization of fractional-order neural networks with discontinuous and continuous activation functions under uncertainty. IEEE Trans. Neural Netw. Learn. Syst. 29, 1477–1490 (2018)