Sun Changyin, Fei Shumin, Feng Chunbo. Absolute Exponential Stability of Generalized Dynamical Neural Networks [J]. Journal of Southeast University (English Edition), 2002, 18(2): 159-163. [doi:10.3969/j.issn.1003-7985.2002.02.012]

Absolute Exponential Stability of Generalized Dynamical Neural Networks*

Journal of Southeast University (English Edition) [ISSN: 1003-7985 / CN: 32-1325/N]

Volume: 18
Issue: 2
Page: 159-163
Research Field: Automation
Publishing date: 2002-06-30

Info

Title: Absolute Exponential Stability of Generalized Dynamical Neural Networks*
Author(s): Sun Changyin**, Fei Shumin, Feng Chunbo
Affiliation: Research Institute of Automation, Southeast University, Nanjing 210096, China
Keywords: absolute exponential stability; partial Lipschitz continuity; neural networks
PACS: TP183
DOI: 10.3969/j.issn.1003-7985.2002.02.012
Abstract:
This paper investigates the absolute exponential stability of generalized neural networks with a general class of partially Lipschitz continuous and monotone increasing activation functions. The main result is that if the interconnection matrix T of the neural system is such that -T is an H-matrix with nonnegative diagonal elements, then the neural system is absolutely exponentially stable (AEST). The Hopfield network, the cellular neural network and the bidirectional associative memory network are special cases of the network model considered here, so this work improves on previous results.
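To make the stated condition concrete: a matrix A is an H-matrix when its comparison matrix (|a_ii| on the diagonal, -|a_ij| off it) is a nonsingular M-matrix. The following Python sketch is not from the paper; the function names and the leading-principal-minor test for M-matrices are illustrative choices. It checks whether a given interconnection matrix T satisfies the sufficient condition above, i.e. whether -T is an H-matrix with nonnegative diagonal elements.

import numpy as np

def comparison_matrix(A):
    # Comparison matrix: |a_ii| on the diagonal, -|a_ij| off the diagonal.
    M = -np.abs(A)
    np.fill_diagonal(M, np.abs(np.diag(A)))
    return M

def is_nonsingular_m_matrix(M, tol=1e-12):
    # A Z-matrix is a nonsingular M-matrix iff all of its leading
    # principal minors are positive (one of several equivalent tests).
    n = M.shape[0]
    return all(np.linalg.det(M[:k, :k]) > tol for k in range(1, n + 1))

def satisfies_aest_condition(T):
    # Sufficient condition from the abstract: -T is an H-matrix
    # with nonnegative diagonal elements.
    A = -np.asarray(T, dtype=float)
    if np.any(np.diag(A) < 0):
        return False
    return is_nonsingular_m_matrix(comparison_matrix(A))

# Illustrative 2x2 interconnection matrix (values are not from the paper).
T = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
print(satisfies_aest_condition(T))  # True: -T has a positive, dominant diagonal

The leading-principal-minor test applies here because a comparison matrix is a Z-matrix by construction; for large matrices, an eigenvalue- or factorization-based M-matrix test would be numerically more robust.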

Memo

Memo:
* The project was supported by the National Natural Science Foundation of China (69934010).
** Born in 1975, male, graduate student.
Last Update: 2002-06-20