Exponential stability of a class of competitive neural networks with multi-proportional delays

About

Authors
Liqun Zhou, Zhongying Zhao
Year
2015
DOI
10.1007/s11063-015-9486-6
Subject
Computer Networks and Communications / Software / Neuroscience (all) / Artificial Intelligence

Text

Neural Process Lett

DOI 10.1007/s11063-015-9486-6

Exponential stability of a class of competitive neural networks with multi-proportional delays

Liqun Zhou1 · Zhongying Zhao1 © Springer Science+Business Media New York 2015

Abstract In this paper, the exponential stability of a class of competitive neural networks with multi-proportional delays is studied. First, through suitable transformations, the competitive neural networks with multi-proportional delays are equivalently turned into competitive neural networks with multi-constant delays and variable coefficients. By using the fixed point theorem, the existence and uniqueness of the equilibrium point of the system are proved. Furthermore, by constructing an appropriate delay differential inequality, two delay-independent sufficient conditions for the exponential stability of the equilibrium point are obtained. Finally, several examples and their simulations are given to illustrate the effectiveness of the obtained results.
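The transformation itself is not reproduced in this excerpt. As an illustration only, the substitution commonly used for proportional delays can be sketched on a scalar prototype; the coefficients a, b, the activation f and the single delay factor q ∈ (0, 1) below are assumptions made for the sketch, not the system studied in the paper:

\[
\dot{x}(t) = -a\,x(t) + b\,f\bigl(x(qt)\bigr), \qquad t \ge 1.
\]

Setting \(y(t) = x(e^{t})\) yields, for \(t \ge 0\),

\[
\dot{y}(t) = e^{t}\bigl[-a\,y(t) + b\,f\bigl(y(t-\tau)\bigr)\bigr], \qquad \tau = -\ln q > 0,
\]

i.e. a system with a constant delay \(\tau\) and a variable (unbounded) coefficient \(e^{t}\), which is the kind of reformulation the abstract refers to.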

Keywords Competitive neural networks · Proportional delays · Exponential stability · Fixed point theorem · Delay differential inequality

1 Introduction

In 1996, Meyer-Baese proposed and studied the competitive neural network model in [1]. As one of the popular artificial neural networks, competitive neural networks have received significant attention. From the viewpoint of biology, human memory is of two kinds: short-term memory (STM) and long-term memory (LTM). STM represents fast neural activity, while LTM represents unsupervised and slow synaptic modifications. Competitive neural networks accordingly contain two time scales, one describing the fast change of the neuron states and the other the slow change of the synapses driven by external stimulation. They are a kind of unsupervised learning neural network, featuring full interconnection between the input and output of a single-layer network, and are widely used in optimization design, pattern recognition, signal processing, control theory and so on [1–3].
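To make the two-time-scale structure concrete, competitive networks of this type are often written with a fast STM equation for the neuron states and a slow LTM equation for the synaptic variables. The display below is an illustrative sketch under commonly used notation, not the specific system of [1] or of this paper: \(\varepsilon > 0\) is the fast time-scale parameter, \(x_i\) the neuron states, \(m_{ik}\) the synaptic efficiencies, \(y_k\) constant external stimuli, and \(f\) the activation function.

\[
\text{STM:}\quad \varepsilon\,\dot{x}_i(t) = -a_i x_i(t) + \sum_{j=1}^{n} b_{ij} f\bigl(x_j(t)\bigr) + B_i \sum_{k=1}^{p} m_{ik}(t)\,y_k,
\]
\[
\text{LTM:}\quad \dot{m}_{ik}(t) = -m_{ik}(t) + y_k f\bigl(x_i(t)\bigr), \qquad i = 1,\dots,n,\ k = 1,\dots,p.
\]

Adding delays to such a system, in particular unbounded proportional delays, is what the present paper is concerned with.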


Dynamics of competitive neural networks with different time scales can be found in [4–10]. In both biological and artificial neural networks, time delays inevitably appear in the synaptic transmission between neurons, and the connection weights between neurons are time-varying, which may lead to oscillation, divergence and even instability. At present, a variety of dynamic behaviors of competitive neural networks with delays have been studied, such as singular perturbation [1], periodicity [11], stability [4,12–17], synchronization [8,10,18–22] and so on. These studies mainly focus on constant delays [4,11,12,18], bounded time-varying delays [8,13,14,16,17,19,20], mixed delays (i.e. bounded time-varying delays and distributed delays) [15,21,22], etc.

It is well known that stability plays a very important role in the applications of competitive neural networks. Thus, various stability properties of competitive neural networks with delays have been widely studied and many results have been obtained (see [4,12–17]).

Global exponential stability of competitive neural networks with constant and time-varying delays was studied by constructing Lyapunov functionals in [4,14], respectively. In [12], the existence and global exponential stability of the equilibrium of competitive neural networks with different time scales and multiple delays were discussed by the nonlinear Lipschitz measure method and by constructing a suitable Lyapunov functional. In [13], exponential stability of competitive neural networks with time-varying and distributed delays was studied by inequality techniques and properties of an M-matrix. In [15,16], multi-stability of competitive neural networks with time-varying and distributed delays was studied by using inequality techniques. Global stability and convergence of the equilibrium point for delayed competitive neural networks with different time scales and discontinuous activations were investigated in [17] by employing the Leray-Schauder alternative theorem in multi-valued analysis, the linear matrix inequality technique and a generalized Lyapunov-like method.

Different from the above-mentioned delays, a proportional delay is an unbounded time-varying delay. The proportional delay function τ(t) = (1 − q)t, 0 < q < 1, is a kind of unbounded delay, which often arises in many fields such as physics, biological systems and control theory.
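Written out for a state variable x(t), this is just an expansion of the definition above: the proportional delay enters the state argument as

\[
x\bigl(t-\tau(t)\bigr) = x\bigl(t-(1-q)t\bigr) = x(qt), \qquad \tau(t) = (1-q)t \to \infty \ \text{as } t \to \infty,
\]

so the delayed argument qt stays proportional to the current time t while the delay itself grows without bound.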

At the same time, because of the differences between proportional delays and other delays, past results on the stability of neural networks with delays cannot be directly applied to neural networks with proportional delays. Neural networks with proportional delays belong to the class of proportional delay differential equations, an important kind of unbounded delay differential equations that is widely used in many fields, such as light absorption by interstellar matter and nonlinear dynamical systems. Hence, research on the dynamic behaviors of neural networks with proportional delays has important theoretical and practical value. The dynamical behaviors of neural networks with proportional delays have been studied in [23–31]. In [23], the dissipativity of a class of cellular neural networks (CNNs) with proportional delays was investigated by using inner product properties.

In [24–26,28], Zhou discussed the global exponential stability and asymptotic stability of CNNs with multi-proportional delays by employing matrix theory and by constructing Lyapunov functionals, respectively. Delay-dependent exponential synchronization of recurrent neural networks (RNNs) with multiple proportional delays was studied in [28] by constructing an appropriate Lyapunov functional. The results on the dynamical behaviors of neural networks with proportional delays in [24–26,28] were mainly obtained by establishing appropriate Lyapunov functionals. It is well known that constructing new Lyapunov functionals is very difficult, and no general method is available. At present, there are other research approaches; for example, by constructing nonlinear delay differential inequalities, Zhou studied the global exponential stability of bidirectional associative memory neural networks with proportional delays in [27] and [29]. In [30], stability criteria for high-order networks with