Abstract
In this paper, we investigate the asymptotical stability and synchronization of fractional neural networks. Multiple time-varying delays and distributed delays are taken into consideration simultaneously. First, by applying the Banach fixed point theorem, the existence and uniqueness of the equilibrium point of fractional delayed neural networks are established. Then, two sufficient conditions guaranteeing the asymptotical stability of the considered system are derived by the integer-order Lyapunov direct method. Furthermore, two synchronization criteria are presented based on an adaptive controller. These results significantly generalize existing conclusions from previous works. Finally, numerical simulations are carried out to check the validity and feasibility of the proposed methods.
1 Introduction
In recent decades, various artificial neural network models, such as Hopfield neural networks (de Castro and Valle 2020), recurrent neural networks (Achouri and Aouiti 2022), and cellular neural networks (Xu et al. 2021), have been widely investigated due to their extensive applications in combinatorial optimization, filtering, signal processing, and so on. These applications are strongly affected by the dynamical behaviors of neural networks, especially stability and synchronization. Therefore, the stability and synchronization of neural networks have become an attractive subject, and a great deal of excellent results have been proposed; see, for example, Chen et al. (2020), Wang et al. (2021), Xiao et al. (2021), and Liu et al. (2021).
Time delays, induced by the limited transmission speed between neurons, are ubiquitous in neural networks. Moreover, time delays may produce complicated behaviors such as oscillation, bifurcation, or chaos (Song and Peng 2006; Aouiti et al. 2020). Thus, it is both significant and unavoidable to study the stability and synchronization of delayed neural networks. Zhang and Zeng (2021) provided several criteria to check the stability and synchronization of reaction–diffusion neural networks with general time-varying delays. By Lyapunov–Krasovskii functions and linear matrix inequalities, the asymptotic stability of recurrent neural networks with multiple discrete delays and distributed delays was addressed in Cao et al. (2006). The exponential stability of Clifford-valued inertial neural networks with mixed delays was studied by means of pseudo almost periodic function theory and some inequality theories in Aouiti and Ben Gharbia (2020). In Sheng et al. (2021), the finite-time stability of competitive neural networks with discrete time-varying delays was discussed by adopting comparison strategies and inequality techniques. By now, a variety of results on the stability and synchronization of neural networks have been derived.
It is worth noting that the above analysis of neural networks focuses on integer-order calculus. Fractional calculus (Podlubny 1999; Kilbas et al. 2006), as an extension of integer calculus, has received considerable attention due to its broad applications in many fields, such as viscoelasticity (Koeller 1984), bioengineering (Debnath 2003), fluid mechanics (Tripathi 2011), and so on. Compared with traditional integer-order systems, fractional-order models can depict multifarious processes and phenomena more exactly because of their memory and hereditary properties. Owing to these superiorities, many researchers have attempted to combine fractional calculus with neural networks, leading to fractional-order neural networks. Moreover, the dynamical behavior of fractional neural networks has become a noticeable subject, and numerous results have been reported (Fan et al. 2018; Huang et al. 2020; Zhang et al. 2018; Xiao and Zhong 2019; Pratap et al. 2018; Chen et al. 2018).
Obviously, it is of considerable importance to investigate the stability of fractional neural networks. In Liu et al. (2017), the properties of activation functions and M-matrix were utilized to consider the Mittag–Leffler stability of fractional recurrent neural networks. In Zhang and Zhang (2020), Chen et al. (2021), the method of the Lyapunov–Krasovskii function was used to study the asymptotic stability of fractional neural networks with time delays. Yao et al. (2021) considered the exponential stability of fractional-order fuzzy cellular neural networks with multiple delays by Laplace transform method and complex function. Some criteria about the finite-time stability of fractional inertial neural networks with time-varying delays were proposed by Lyapunov–Krasovskii function and analytical techniques in Aouiti et al. (2022). In Li et al. (2021), based on the sign function and activation functions, the uniform stability of complex-valued fractional neural networks with linear impulses and fixed time delays was discussed.
In addition, the synchronization of fractional neural networks has been another active topic in recent years. In Li et al. (2022), Bai et al. (2022), via the method of Lyapunov–Krasovskii functions, the exponential synchronization and secure synchronization of fractional complex neural networks were investigated. You et al. (2020) studied the Mittag–Leffler synchronization of discrete-time complex fractional neural networks with time delay by applying the Lyapunov direct method and designing a suitable controller. Du and Lu (2021) utilized a new fractional-order Gronwall inequality to explore the finite-time synchronization of fractional memristor-based neural networks with time delay. The quasi-uniform synchronization of fractional neural networks with leakage and discrete delays was discussed by Laplace transformation and several analytical techniques in Cheng et al. (2021).
Apparently, there are several effective methods, such as the Laplace transformation (Yao et al. 2021; Cheng et al. 2021), the linear programming approach (Yang et al. 2020), and the Lyapunov direct method (Zhang and Zhang 2020), for demonstrating the stability and synchronization of neural networks. Among them, the Lyapunov direct method is the most frequently used in the existing literature (Zhang and Wu 2022). For fractional systems, the main difficulty is how to construct an appropriate Lyapunov–Krasovskii function, because similar tools in integer calculus cannot easily be generalized to fractional calculus. On the other hand, most previous works regarded the time delay as a single time-varying delay or constant time delays (Yao et al. 2021). However, a delay may not remain unchanged during transmission within a neuron, and the delays between different neurons may differ. In view of this, it is necessary and meaningful to explore effective methods for investigating the stability and synchronization of fractional neural networks with various types of time delays. However, as far as we know, such investigations are rare (Syed Ali et al. 2021; Wu et al. 2022).
Motivated by the above analysis, we investigate the asymptotical stability and synchronization of Riemann–Liouville fractional neural networks with multiple time-varying delays and distributed delays. The main contributions of this paper can be summarized as follows:
-
We introduce multiple time-varying delays and distributed delays into fractional neural networks simultaneously. Compared with previous works, the model is closer to actual applications.
-
The uniqueness, asymptotical stability, and synchronization of the considered system are established. In addition, the obtained results are expressed as matrix inequalities, which are concise and feasible to use.
-
To avoid the difficulty of calculating the fractional-order derivative of a function, two Lyapunov–Krasovskii functions associated with fractional integral terms are constructed, whose first-order derivatives can be computed directly.
-
Based on the relationship between the stability and synchronization of fractional systems, two sufficient conditions about the synchronization of the considered system are proposed.
This paper is organized as follows. In Sect. 2, some preliminaries and the fractional neural network model are described. The asymptotical stability and synchronization criteria of the considered system are studied in Sect. 3. In Sect. 4, four numerical examples are taken to check the validity and feasibility of the proposed results. Some conclusions and further works are summarized in Sect. 5.
2 Preliminaries and model description
At present, there are several definitions of fractional-order derivatives, such as the Riemann–Liouville and Caputo definitions, each with its own advantages. The Caputo derivative is more applicable in practical engineering, because its initial conditions take the same form as those of integer-order systems. However, the Caputo derivative requires the function to be n-times differentiable, whereas the Riemann–Liouville derivative only requires the function to be continuous. On the other hand, compared with the Caputo derivative, the Riemann–Liouville derivative can be regarded as a natural generalization of the integer-order derivative, because it is a continuous operator from integer order to arbitrary order. Therefore, the Riemann–Liouville derivative is adopted in this paper. Some preliminaries and the fractional neural network model are introduced in this section.
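A concrete illustration of the difference: acting on a constant c, the Caputo derivative vanishes, while the Riemann–Liouville derivative does not:

```latex
{}^{C}_{t_0}\!D^{\alpha}_{t}\,c = 0, \qquad
{}^{RL}_{t_0}\!D^{\alpha}_{t}\,c = \frac{c\,(t-t_0)^{-\alpha}}{\Gamma(1-\alpha)}, \qquad 0<\alpha<1.
```

This is also why initial conditions for Riemann–Liouville systems are usually stated in terms of fractional integrals of the state rather than the state itself.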
Definition 1
The Riemann–Liouville fractional integral of order \(p>0\) for the function u(t) is defined as
$$\begin{aligned} {}_{t_{0}}D_{t}^{-p}u(t)=\frac{1}{\Gamma (p)}\int _{t_{0}}^{t}(t-s)^{p-1}u(s)\textrm{d}s, \end{aligned}$$
where \(\Gamma (\cdot )\) is the Gamma function and \(\Gamma (p)=\int _{0}^{\infty }t^{p-1}e^{-t}\textrm{d}t\).
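Definition 1 can be checked numerically. The sketch below (an illustration of ours, not code from the paper) evaluates the Riemann–Liouville integral by direct quadrature and compares it with the known closed form \({}_{0}D_{t}^{-p}\,t = t^{1+p}/\Gamma (2+p)\):

```python
import numpy as np
from math import gamma
from scipy.integrate import quad

def rl_integral(u, t, p, t0=0.0):
    """Riemann-Liouville fractional integral of order p at time t, by quadrature."""
    integrand = lambda s: (t - s) ** (p - 1) * u(s)
    val, _ = quad(integrand, t0, t)
    return val / gamma(p)

# sanity check against the closed form I^p[s -> s](t) = t^(1+p) / Gamma(2+p)
p, t = 0.5, 2.0
approx = rl_integral(lambda s: s, t, p)
exact = t ** (1 + p) / gamma(2 + p)
```

The kernel \((t-s)^{p-1}\) has an integrable singularity at \(s=t\) when \(p<1\), which adaptive quadrature handles accurately.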
Definition 2
The Riemann–Liouville fractional derivative of order q (\(n-1\le q<n\), \(n\in {\mathbb {Z}}^{+}\)) for the function u(t) is defined as
$$\begin{aligned} {}_{t_{0}}D_{t}^{q}u(t)=\frac{1}{\Gamma (n-q)}\frac{\textrm{d}^{n}}{\textrm{d}t^{n}}\int _{t_{0}}^{t}(t-s)^{n-q-1}u(s)\textrm{d}s. \end{aligned}$$
In this paper, we investigate the Riemann–Liouville fractional neural networks with multiple time-varying delays and distributed delays described by
where \(0<\alpha <1, i=1,2,\cdots ,n\); \(u_i(t)\) denotes the state of the ith neuron; \(b_{ij}, c^k_{ij}, h_{ij}\) represent the neuron interconnection parameters; \(a_{i}>0\) is a constant; \(f_j, g_j, r_j\) are neuron activation functions with \(f_j(0)=g_j(0)=r_j(0)=0\); \(\varrho _k(t)\) is the time-varying delay satisfying \(\varrho _k(t)\le \varrho _k\) and \(\dot{\varrho }_{k}(t)\le \gamma _k<1\); and \(I_i\) is the external input.
The initial conditions of system (1) are given by
Assumption 1
(A1) The delay kernel \(\psi _j \in C([0,+\infty ),\textrm{R})\) is a nonnegative function. Then, the following equalities hold
In this paper, the delay kernel \(\psi _j(t)\) is given by \(\psi _j(t)=e^{-t}.\)
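Although the displayed equalities of A1 are omitted above, delay kernels in such assumptions are typically required to have unit mass and a finite first moment; for \(\psi _j(t)=e^{-t}\) both integrals equal one, which a quick numerical check confirms (an illustrative sketch of ours, not from the paper):

```python
import numpy as np
from scipy.integrate import quad

psi = lambda t: np.exp(-t)                               # the delay kernel of Assumption 1
mass, _ = quad(psi, 0, np.inf)                           # ∫_0^∞ e^{-t} dt = Γ(1) = 1
first_moment, _ = quad(lambda t: t * psi(t), 0, np.inf)  # ∫_0^∞ t e^{-t} dt = Γ(2) = 1
```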
Assumption 2
(A2) The neuron activation functions \(f_{j}(\cdot ), g_{j}(\cdot ), r_{j}(\cdot )\) are continuous and satisfy the Lipschitz condition, which means that the following inequalities hold
where \(u_1, u_2 \in \textrm{R}\) and \(l^1_{j}, l^2_{j}, l^3_{j}\) are Lipschitz constants.
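For instance, the commonly used activation \(f(u)=\tanh (u)\) satisfies A2 with Lipschitz constant 1, since \(\sup _u |f'(u)|=1\). A quick empirical check (illustrative only):

```python
import numpy as np

f = np.tanh
u = np.linspace(-5.0, 5.0, 2001)
# difference quotients |f(u1) - f(u2)| / |u1 - u2| over adjacent grid points
ratios = np.abs(np.diff(f(u))) / np.diff(u)
L = ratios.max()   # empirical Lipschitz constant, attained near the origin
```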
Assumption 2* (A2*) The neuron activation functions \(g_{j}(\cdot ), r_{j}(\cdot )\) are continuous and satisfy the Lipschitz condition, which means that the following inequalities hold
where \(u_1, u_2 \in \textrm{R}\) and \(l^2_{j}, l^3_{j}\) are Lipschitz constants. In particular, the neuron activation function \(f_{j}(\cdot )\) is monotonically increasing, bounded, and Lipschitz continuous, that is
where \(l^1_{j}\) is a positive constant.
Lemma 1
If \(\alpha > \beta >0\), then the following property holds for an integrable function u(t)
Lemma 2
Let \(u(t)\in \textrm{R}^n\) be a vector of differentiable function, then the following inequality holds
where \(P\in \textrm{R}^{n \times n}\) is a constant, square, symmetric, and positive definite matrix.
Lemma 3
For any \(x, y \in \textrm{R}^{n},\epsilon >0\), the following inequality holds
Lemma 4
(Schur complement) For constant matrices \(\Omega _{11}, \Omega _{12}, \Omega _{22}\), where \(\Omega _{11}=\Omega ^{\textrm{T}}_{11},\Omega _{22}=\Omega ^{\textrm{T}}_{22}\), the following two inequalities are equivalent
Definition 3
A constant vector \(u^*=(u_1^*, u_2^*, \cdots , u_n^*)^\textrm{T}\in \textrm{R}^n\) is an equilibrium point of system (1) if and only if \(u^*\) satisfies the following equations:
3 Main results
In this section, several sufficient conditions for asymptotical stability and synchronization of Riemann–Liouville fractional delayed neural networks are derived.
3.1 Asymptotic stability criteria
Theorem 1
Suppose that A1 and A2 hold. Let \(u=(\tilde{u}_1, \tilde{u}_2, \cdots , \tilde{u}_n)^\textrm{T} \in {\mathbb {B}}\), where \({\mathbb {B}}\) is a Banach space endowed with the norm \(\Vert u\Vert _1=\sum _{i=1}^{n}|\tilde{u}_i|\). Then, there exists a unique equilibrium point \(u^*\) for system (1) if the following inequalities hold.
Proof
Let \(u=(\tilde{u}_1, \tilde{u}_2, \cdots , \tilde{u}_n)^\textrm{T}=(a_1u_1, a_2u_2, \cdots , a_nu_n)^\textrm{T}\in \textrm{R}^n\). Construct a mapping \(\varphi : {\mathbb {B}} \rightarrow {\mathbb {B}}\), \(\varphi (u)=(\varphi _1(u), \varphi _2(u), \cdots , \varphi _n(u))^\textrm{T}\), with
For any two different points \(\ell =(\ell _1, \ell _2, \cdots , \ell _n)^\textrm{T}, \jmath =(\jmath _1, \jmath _2, \cdots , \jmath _n)^\textrm{T}\), we have
Then, we can get
So \(\Vert \varphi (\ell )-\varphi (\jmath )\Vert \le \rho \Vert \ell -\jmath \Vert \). Based on (2), we can find that the mapping \(\varphi \) is a contraction mapping on \(\textrm{R}^n\). Hence, there must exist a unique fixed point \(\tilde{u}^*\in \textrm{R}^n\) such that \(\varphi (\tilde{u}^*)=\tilde{u}^*\). That is,
Letting \(u_i^*=\frac{\tilde{u}_i^*}{a_i}\), we have
According to Definition 3, the theorem is proved. \(\square \)
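Under the contraction condition (2), the equilibrium can be computed by simply iterating the mapping \(\varphi \). The sketch below does this for a hypothetical two-neuron network (all parameter values and the tanh activations are our illustrative choices, not the paper's examples); since the kernel integrates to one, the distributed-delay term reduces to \(h_{ij}r_j(u_j^*)\) at equilibrium:

```python
import numpy as np

# hypothetical 2-neuron parameters chosen so that the mapping is a contraction
A = np.diag([2.0, 2.0])
B = np.array([[0.3, -0.2], [0.1, 0.4]])
C = np.array([[0.2, 0.1], [-0.1, 0.2]])    # single delayed connection matrix (K = 1)
H = np.array([[0.1, 0.0], [0.0, 0.1]])
I = np.array([1.0, -0.5])
act = np.tanh                               # f = g = r = tanh, Lipschitz constant 1

def phi(u):
    """One Picard step for the equilibrium equation A u = B f(u) + C g(u) + H r(u) + I."""
    return np.linalg.solve(A, B @ act(u) + C @ act(u) + H @ act(u) + I)

u = np.zeros(2)
for _ in range(200):                        # contraction => geometric convergence
    u = phi(u)

residual = np.linalg.norm(A @ u - (B @ act(u) + C @ act(u) + H @ act(u) + I), 1)
```

The row-sum condition here gives a contraction ratio below one, so the iteration converges to the unique equilibrium guaranteed by Theorem 1.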
By the transformation \(\mu _i(t)=u_i(t)-u_i^{*}\), we can rewrite system (1) in the following vector form
where \(\mu (t)=[\mu _1(t),\mu _2(t),\cdots ,\mu _n(t)]^\textrm{T};\) \(A=\textrm{diag}[a_1,a_2,\cdots ,a_n],\) \(B=[b_{ij}], C_k=[c^k_{ij}],\) \(H=[h_{ij}],\) \(\psi (t-s)=\textrm{diag}[\psi _1(t-s), \psi _2(t-s), \cdots , \psi _n(t-s)]\) and \(f(\mu (t))=\big [f_1(\mu _1(t)), f_2(\mu _2(t)), \cdots , f_n(\mu _n(t))\big ]^\textrm{T},\) \(g(\mu (t-\varrho _k(t)))=\big [g_1(\mu _1(t-\varrho _k(t))), g_2(\mu _2(t-\varrho _k(t))), \cdots , g_n(\mu _n(t-\varrho _k(t)))\big ]^\textrm{T},\) \(r(\mu (t))=\big [r_1(\mu _1(t)), r_2(\mu _2(t)), \cdots , r_n(\mu _n(t))\big ]^\textrm{T}.\)
Theorem 2
Suppose that A1, A2, and the conditions of Theorem 1 hold. Then, system (3) is asymptotically stable if there exist a positive definite matrix P, positive definite diagonal matrices \(M, G_k\), and positive scalars \(\delta _k, q_j\) such that the following inequality holds.
where \(S=2PA-L_1ML_1-L_3QL_3-\sum _{k=1}^{K}\delta _k L_2G_kL_2, L_1=\textrm{diag}[l^1_1, l^1_2, \cdots , l^1_n], L_2=\textrm{diag}[l^2_1, l^2_2, \cdots , l^2_n], L_3=\textrm{diag}[l^3_1, l^3_2, \cdots , l^3_n], Q=\textrm{diag}[q_1, q_2, \cdots , q_n].\)
Proof
Consider the following Lyapunov–Krasovskii function
where
Next, computing the derivative of V(t) with the help of Lemmas 1 and 2 yields
Based on the integral form of the Cauchy–Schwarz inequality
we have
According to Lemma 3, there must exist a positive definite diagonal matrix M such that
Hence, from (5)–(10), we can get
where \(\Lambda =2PA-L_1ML_1- L_3QL_3-\sum _{k=1}^{K} \delta _k L_2G_kL_2-PBM^{-1}B^{\textrm{T}}P-PHQ^{-1}H^{\textrm{T}}P- \sum _{k=1}^{K}\frac{1}{\delta _k(1-\gamma _k)}PC_kG_k^{-1}C_k^{\textrm{T}}P\). So \(\dot{V}(t)\) is negative if \(\Lambda >0\). Converting \(\Lambda \) to the matrix form \(\Xi _1\) by Lemma 4, the theorem is proved. \(\square \)
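For a concrete feasibility check, \(\Lambda \) can be evaluated numerically once candidate matrices are fixed. The sketch below uses hypothetical values (ours, not the paper's examples) with \(K=1\) and verifies \(\Lambda >0\) via its smallest eigenvalue:

```python
import numpy as np

# hypothetical parameters (K = 1) to illustrate checking Lambda > 0 numerically
n = 2
A = np.diag([5.0, 5.0])
B = C = H = 0.1 * np.ones((n, n))
L1 = L2 = L3 = np.eye(n)          # Lipschitz constants all equal to 1
P = M = G = Q = np.eye(n)
delta, gamma_k = 1.0, 0.5

Lam = (2 * P @ A - L1 @ M @ L1 - L3 @ Q @ L3 - delta * L2 @ G @ L2
       - P @ B @ np.linalg.inv(M) @ B.T @ P
       - P @ H @ np.linalg.inv(Q) @ H.T @ P
       - (1.0 / (delta * (1 - gamma_k))) * P @ C @ np.linalg.inv(G) @ C.T @ P)

stable = np.linalg.eigvalsh(Lam).min() > 0   # True: the condition of Theorem 2 holds
```

In practice one searches for P, M, G, Q with an LMI solver rather than fixing them by hand; this sketch only shows what the inequality asks of a given candidate.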
Remark 1
It is worth noting that the matrices M and \(G_k\) in Theorem 2 must be positive definite and diagonal. In the following theorem, this constraint is removed with the help of another Lyapunov–Krasovskii function.
Theorem 3
Suppose that A1, A2*, and the conditions of Theorem 1 hold. Then, system (3) is asymptotically stable if there exist positive definite matrices P, \(G_k\) and positive scalars \(\delta _k, w_j, q_j\) such that the following inequality holds.
where \(\Phi _1=2PA- L_3QL_3-\sum _{k=1}^{K}\delta _k L_2G_kL_2, \Phi _2=2WAL_1^{-1}, \Phi _3=-(L_1^TWB+PB)\) and \(W=\textrm{diag}[w_1, w_2, \cdots , w_n]\).
Proof
Consider the following Lyapunov–Krasovskii function
where
Similar to the process in Theorem 2, computing the derivative of \(V_4(t)\) yields
Combining (5), (6), (7), (12), we have
Therefore, system (3) is asymptotically stable under condition (11). This completes the proof. \(\square \)
Remark 2
Obviously, the constraint that the matrices \(G_k\) be positive definite and diagonal has been relaxed to general positive definite matrices in Theorem 3. However, the restriction A2* is stricter than A2. Therefore, the two sufficient conditions in Theorems 2 and 3 can be chosen according to the practical setting. In the following part, we discuss the synchronization problem of system (1) based on the relationship between stability and synchronization.
3.2 Asymptotic synchronization criteria
Taking system (1) as the drive system, the response system can be defined as
where \(z_i(t)\) is the controller to be designed. In this subsection, we investigate the synchronization between (1) and (13).
Define the error \(e_i(t)=v_i(t)-u_i(t)\). Then, we can get the following error system between (1) and (13)
where \(\bar{f}_j(e_j(t))=f_j(v_j(t))-f_j(u_j(t)), \bar{g}_j(e_j(t-\varrho _k(t)))=g_j(v_j(t-\varrho _k(t)))-g_j(u_j(t-\varrho _k(t))), \bar{r}_j(e_j(t))=r_j(v_j(t))-r_j(u_j(t))\). The control law \(z_i(t)\) is adopted as \(z_i(t)=-\sigma _{i}e_i(t), \sigma _{i}\in \textrm{R}.\) For convenience, rewriting the above system in vector form yields
where \(\bar{\sigma }=\textrm{diag}[\sigma _{1}, \sigma _{2}, \cdots , \sigma _{n}]\). It is easy to find that the synchronization between the system (1) and (13) is equivalent to the asymptotical stability of the system (15) (Hu et al. 2018).
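The role of the feedback gain is transparent in the matrix conditions: increasing \(\sigma _i\) enlarges the term \(2P(A+\bar{\sigma })\) and so makes the synchronization inequality easier to satisfy. A toy numerical sketch (the matrix `coupling_load` is our stand-in for all the Lipschitz and delay terms; all values are illustrative):

```python
import numpy as np

# hypothetical setup: how the feedback gain sigma enlarges the stability margin
n = 2
A = np.diag([1.0, 1.0])           # weak self-decay: the uncontrolled condition fails
P = np.eye(n)
coupling_load = 4.0 * np.eye(n)   # stands in for the Lipschitz/coupling terms

def margin(sigma):
    """Smallest eigenvalue of the leading block 2P(A + sigma*I) - coupling_load."""
    S_bar = 2 * P @ (A + sigma * np.eye(n)) - coupling_load
    return np.linalg.eigvalsh(S_bar).min()

m0, m2 = margin(0.0), margin(2.0)  # negative without control, positive with gain 2
```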
Theorem 4
Suppose that A1, A2, and the conditions of Theorem 1 hold. Then, systems (1) and (13) achieve synchronization if the following inequality holds.
where \(\bar{S}=2P(A+\bar{\sigma })-L_1ML_1- L_3QL_3-\sum _{k=1}^{K}\delta _k L_2G_kL_2.\)
Theorem 5
Suppose that A1, A2*, and the conditions of Theorem 1 hold. Then, systems (1) and (13) achieve synchronization if the following inequality holds.
where \(\bar{\Phi }_1=2P(A+\bar{\sigma })- L_3QL_3-\sum _{k=1}^{K}\delta _k L_2G_kL_2, \bar{\Phi }_2=2W(A+\bar{\sigma })L_1^{-1}\).
Remark 3
The theorems obtained in this paper reveal the relationship between the asymptotical stability and synchronization of fractional neural networks.
Remark 4
Compared with Hu et al. (2018), the neural networks in this paper are more applicable due to their multiple time-varying delays and distributed delays. Some inequality theories are used in dealing with the time delays.
4 Numerical simulations
In this section, four examples are presented to show the correctness of the proposed results using the LMI toolbox and the predictor–corrector algorithm (Bhalekar and Daftardar-Gejji 2011).
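The predictor–corrector (Adams–Bashforth–Moulton) scheme referenced above can be sketched as follows for the delay-free fractional ODE \(D^{\alpha }u=f(t,u)\), \(0<\alpha <1\). This is written from the standard scheme, not transcribed from the paper; the delayed version of Bhalekar and Daftardar-Gejji (2011) additionally interpolates past states:

```python
import numpy as np
from math import gamma

def abm_fractional(f, u0, alpha, T, N):
    """Adams-Bashforth-Moulton predictor-corrector for the fractional ODE
    D^alpha u = f(t, u), 0 < alpha < 1, on [0, T] with N uniform steps."""
    h = T / N
    t = np.linspace(0.0, T, N + 1)
    u = np.empty(N + 1); u[0] = u0
    F = np.empty(N + 1); F[0] = f(t[0], u0)
    for n in range(1, N + 1):
        j = np.arange(n)
        # predictor weights and explicit (Adams-Bashforth) step
        b = ((n - j) ** alpha - (n - j - 1) ** alpha) * h ** alpha / alpha
        u_pred = u0 + (b * F[:n]).sum() / gamma(alpha)
        # corrector weights and implicit (Adams-Moulton) step
        a = np.empty(n)
        a[0] = (n - 1) ** (alpha + 1) - (n - 1 - alpha) * n ** alpha
        jj = np.arange(1, n)
        a[1:] = ((n - jj + 1) ** (alpha + 1) + (n - jj - 1) ** (alpha + 1)
                 - 2 * (n - jj) ** (alpha + 1))
        u[n] = u0 + h ** alpha / gamma(alpha + 2) * (
            f(t[n], u_pred) + (a * F[:n]).sum())
        F[n] = f(t[n], u[n])
    return t, u

# D^0.9 u = -u, u(0) = 1: the solution E_{0.9}(-t^0.9) decays monotonically
t, u = abm_fractional(lambda t, u: -u, 1.0, 0.9, 2.0, 200)
```

For \(\alpha =1\) the corrector weights reduce to the classical trapezoidal rule, which is a quick way to sanity-check an implementation.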
Example 1
Consider the following two-dimensional fractional-order delayed neural networks
The parameters of (18) are set as
Obviously, the neuron activation functions satisfy the condition A2 and \(l^1_j=l^2_j=l^3_j=1\). It can be verified that
So, there exists a unique equilibrium point for system (18).
According to Theorem 2, we can obtain positive scalars \(\delta =q_1=q_2=1\), a positive definite matrix P, and positive definite diagonal matrices M, G by the Matlab LMI toolbox, which illustrates the asymptotical stability of system (18)
To further verify the correctness of the above analysis, the state trajectories of system (18) under different initial conditions and different fractional orders are simulated by the predictor–corrector algorithm in Figs. 1 and 2. It is easy to find that the trajectories of the states \(u_1(t), u_2(t)\) are asymptotically stable.
Example 2
Consider the following two-dimensional fractional-order delayed neural networks
The parameters of (19) are set as
Similarly, the condition of Theorem 1 can be verified as follows
So, there exists a unique equilibrium point for system (19).
Then, we can find positive scalars \(\delta =q_j=\omega _j=1, j=1,2\) and positive definite matrices P, G that satisfy the inequality (11)
Therefore, we can conclude that system (19) is asymptotically stable based on Theorem 3. The trajectories of the states \(u_1(t), u_2(t)\) under different initial conditions and different fractional orders are given in Figs. 3 and 4, which verify the accuracy of the previous analysis.
Example 3
Consider the following two-dimensional fractional-order delayed neural networks
The parameters of (20) are set as
Then, the error system between the drive system (20) and its response system can be devised as
where the control law can be chosen as
Based on the Theorem 4, the positive definite matrix P and positive definite diagonal matrices M, G can be calculated by Matlab yields
So, the error system (21) is asymptotically stable, which means that the drive and response systems achieve synchronization. Figure 5 presents the trajectories of \(e_1(t), e_2(t)\) under different fractional orders. We can find that the time responses of the states tend to zero, which illustrates the correctness of the above analysis.
Example 4
Consider the following two-dimensional fractional-order delayed neural networks
The parameters of (22) are set as
Similarly, we can get the error system between the drive system (22) and its response system, that is
where the control law can be devised as
So, there exist positive definite matrices P, G that satisfy the inequality (17)
Based on Theorem 5, we can conclude that this error system is asymptotically stable, which means that the drive and response systems achieve synchronization. The synchronization trajectories of \(e_1(t), e_2(t)\) in Fig. 6 further verify the accuracy of the above analysis.
5 Conclusion
As is well known, various types of time delays are inevitable in the implementation of fractional neural networks. Recently, the dynamical analysis of fractional delayed neural networks has received considerable attention. In view of this, we consider fractional neural networks with both multiple time-varying delays and distributed delays, and investigate their asymptotical stability and synchronization. First, by the Banach fixed point theorem, the existence and uniqueness of the equilibrium point of the considered system are studied. Then, two sufficient conditions are derived to ensure the asymptotical stability of the addressed model by the integer-order Lyapunov direct method, which avoids calculating the fractional-order derivative of Lyapunov–Krasovskii functions. Furthermore, the synchronization criteria are presented as our main results. Finally, numerical simulations are carried out by the LMI toolbox and predictor–corrector algorithm to check the effectiveness of the obtained results.
In the future, we plan to study the dynamical behaviors, including stability, synchronization, and bifurcation, of fractional memristive complex-valued neural networks with time delays and impulsive effects.
References
Achouri H, Aouiti C (2022) Dynamical behavior of recurrent neural networks with different external inputs. Int J Biomath 15(04):2250010
Aouiti C, Ben Gharbia I (2020) Dynamics behavior for second-order neutral Clifford differential equations: inertial neural networks with mixed delays. Comput Appl Math 39(2):120
Aouiti C, Assali EA, Ben Gharbia I (2020) Global exponential convergence of neutral type competitive neural networks with D operator and mixed delay. J Syst Sci Complex 33(6):1785–1803
Aouiti C, Cao J, Jallouli H et al (2022) Finite-time stabilization for fractional-order inertial neural networks with time varying delays. Nonlinear Anal Model Control 27(1):1–18
Bai J, Wu H, Cao J (2022) Secure synchronization and identification for fractional complex networks with multiple weight couplings under DoS attacks. Comput Appl Math 41(4):187
Bhalekar S, Daftardar-Gejji V (2011) A predictor-corrector scheme for solving nonlinear delay differential equations of fractional order. J Fract Calculus Appl 1(5):1–9
Cao J, Yuan K, Li HX (2006) Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays. IEEE Trans Neural Networks 17(6):1646–1651
Chen C, Zhu S, Wei Y (2018) Finite-time stability of delayed memristor-based fractional-order neural networks. IEEE Trans Cybern 50(4):1607–1616
Chen C, Li L, Peng H et al (2020) A new fixed-time stability theorem and its application to the fixed-time synchronization of neural networks. Neural Netw 123:412–419
Chen S, Song Q, Zhao Z et al (2021) Global asymptotic stability of fractional-order complex-valued neural networks with probabilistic time-varying delays. Neurocomputing 450:311–318
Cheng J, Zhang H, Zhang H et al (2021) Quasi-uniform synchronization of Caputo type fractional neural networks with leakage and discrete delays. Chaos Soliton Fract 152(111):432. https://doi.org/10.1016/j.chaos.2021.111432
de Castro FZ, Valle ME (2020) A broad class of discrete-time hypercomplex-valued Hopfield neural networks. Neural Netw 122:54–67
Debnath L (2003) Recent applications of fractional calculus to science and engineering. Int J Math Math Sci 2003(54):3413–3442
Du F, Lu JG (2021) New criterion for finite-time synchronization of fractional order memristor-based neural networks with time delay. Appl Math Comput 389(125):616
Fan Y, Huang X, Wang Z et al (2018) Nonlinear dynamics and chaos in a simplified memristor-based fractional-order neural network with discontinuous memductance function. Nonlinear Dyn 93(2):611–627
Hu T, Zhang X, Zhong S (2018) Global asymptotic synchronization of nonidentical fractional-order neural networks. Neurocomputing 313:39–46
Huang C, Liu H, Shi X et al (2020) Bifurcations in a fractional-order neural network with multiple leakage delays. Neural Netw 131:115–126
Kilbas AA, Srivastava HM, Trujillo JJ (2006) Theory and applications of fractional differential equations. Elsevier, North-Holland
Koeller R (1984) Applications of fractional calculus to the theory of viscoelasticity. J Appl Mech 51(2):299–307
Li H, Kao Y, Bao H et al (2021) Uniform stability of complex-valued neural networks of fractional order with linear impulses and fixed time delays. IEEE Trans Neural Netw Learn Syst 32(10):5321–5331
Li R, Wu H, Cao J (2022) Impulsive exponential synchronization of fractional-order complex dynamical networks with derivative couplings via feedback control based on discrete time state observations. Acta Math Sci 42(2):737–754
Liu F, Liu H, Liu K (2021) New asymptotic stability analysis for generalized neural networks with additive time-varying delays and general activation function. Neurocomputing 463:437–443
Liu P, Zeng Z, Wang J (2017) Multiple Mittag–Leffler stability of fractional-order recurrent neural networks. IEEE Trans Syst Man Cybern Syst 47(8):2279–2288
Podlubny I (1999) Fractional differential equations. Academic Press, San Diego
Pratap A, Raja R, Sowmiya C et al (2018) Robust generalized Mittag–Leffler synchronization of fractional order neural networks with discontinuous activation and impulses. Neural Netw 103:128–141
Sheng Y, Zeng Z, Huang T (2021) Finite-time stabilization of competitive neural networks with time-varying delays. IEEE Trans Cybern. https://doi.org/10.1109/TCYB.2021.3082153
Song Y, Peng Y (2006) Stability and bifurcation analysis on a logistic model with discrete and distributed delays. Appl Math Comput 181(2):1745–1757
Syed Ali M, Hymavathi M, Saroha S et al (2021) Global asymptotic stability of neutral type fractional-order memristor-based neural networks with leakage term, discrete and distributed delays. Math Methods Appl Sci 44(7):5953–5973
Tripathi D (2011) Peristaltic transport of fractional Maxwell fluids in uniform tubes: applications in endoscopy. Comput Math Appl 62(3):1116–1126
Wang X, Cao J, Wang J et al (2021) A novel fixed-time stability strategy and its application to fixed-time synchronization control of semi-Markov jump delayed neural networks. Neurocomputing 452:284–293
Wu X, Liu S, Wang H (2022) Asymptotic stability and synchronization of fractional delayed memristive neural networks with algebraic constraints. Commun Nonlinear Sci Numer Simul 114(106):694
Xiao J, Zhong S (2019) Synchronization and stability of delayed fractional-order memristive quaternion-valued neural networks with parameter uncertainties. Neurocomputing 363:321–338
Xiao Q, Liu H, Wang Y (2021) An improved finite-time and fixed-time stable synchronization of coupled discontinuous neural networks. IEEE Trans Neural Netw Learn Syst. https://doi.org/10.1109/TNNLS.2021.3116320
Xu C, Liao M, Li P et al (2021) New results on pseudo almost periodic solutions of quaternion-valued fuzzy cellular neural networks with delays. Fuzzy Sets Syst 411:25–47
Yang X, Liu Y, Cao J et al (2020) Synchronization of coupled time-delay neural networks with mode-dependent average dwell time switching. IEEE Trans Neural Netw Learn Syst 31(12):5483–5496
Yao X, Liu X, Zhong S (2021) Exponential stability and synchronization of Memristor-based fractional-order fuzzy cellular neural networks with multiple delays. Neurocomputing 419:239–250
You X, Song Q, Zhao Z (2020) Global Mittag–Leffler stability and synchronization of discrete-time fractional-order complex-valued neural networks with time delay. Neural Netw 122:382–394
Zhang H, Zeng Z (2021) Stability and synchronization of nonautonomous reaction–diffusion neural networks with general time-varying delays. IEEE Tran Neural Netw Learn Syst 33(10):5804–5817
Zhang H, Ye M, Cao J et al (2018) Synchronization control of Riemann–Liouville fractional competitive network systems with time-varying delay and different time scales. Int J Control Autom Syst 16(3):1404–1414
Zhang Z, Wu H (2022) Cluster synchronization in finite/fixed time for semi-Markovian switching T-S fuzzy complex dynamical networks with discontinuous dynamic nodes. AIMS Math 7(7):11942–11971
Zhang Z, Zhang J (2020) Asymptotic stabilization of general nonlinear fractional-order systems with multiple time delays. Nonlinear Dyn 102(1):605–619
Acknowledgements
This research was supported by the National Natural Science Foundation of China (Grant No. 12272011) and the National Key R&D Program of China (Grant No. 2022YFB3806000).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Communicated by Leonardo Tomazeli Duarte.
Zhang, Y., Li, J., Zhu, S. et al. Asymptotical stability and synchronization of Riemann–Liouville fractional delayed neural networks. Comp. Appl. Math. 42, 20 (2023). https://doi.org/10.1007/s40314-022-02122-8
Keywords
- Fractional neural networks
- Asymptotical stability and synchronization
- Multiple time-varying delays
- Distributed delays
- Lyapunov–Krasovskii function