Title

An improved upper bound on step-size parameters of discrete-time recurrent neural networks for linear inequality and equation system

Document Type

Article

Publication Date

5-1-2002

Abstract

In this brief, an improved upper bound on the step-size parameters of a globally convergent discrete-time recurrent neural network (RNN) model, recently proposed in the literature for solving linear inequality and equation systems, is obtained. The new bound removes the original requirement that the solution set of the linear system be bounded, and it allows the step-size parameters to differ from one another. Consequently, the rate of convergence of the discrete-time RNN model can be improved by setting the step-size parameters as large as possible, regardless of whether the solution set is bounded. An example shows that the obtained upper bound is tight, in the sense that the RNN in that example is globally convergent if and only if the step-size parameters are less than the given bound. A numerical simulation of a globally convergent discrete-time RNN solving a specific linear inequality and equation system with an unbounded solution set is also provided.
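The brief itself states the RNN model and its improved step-size bound precisely; as a rough, generic illustration of the kind of discrete-time iteration and step-size condition involved (not the authors' exact model or bound), one can sketch gradient descent on a least-squares energy for a system Ax = b, Cx <= d, where the iteration converges only if the step size is small enough. All matrices, the example system, and the sufficient condition alpha < 2 / lambda_max(A^T A + C^T C) below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rnn_iterate(A, b, C, d, alpha, iters=3000, x0=None):
    """Illustrative discrete-time update for Ax = b, Cx <= d:
    gradient descent on E(x) = 0.5*||Ax - b||^2 + 0.5*||max(Cx - d, 0)||^2,
    i.e. x_{k+1} = x_k - alpha * (A^T (A x_k - b) + C^T max(C x_k - d, 0)).
    This is a generic sketch, not the RNN model analyzed in the brief."""
    x = np.zeros(A.shape[1]) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + C.T @ np.maximum(C @ x - d, 0.0)
        x = x - alpha * grad
    return x

# Hypothetical example system: x1 + x2 = 2 together with x >= 0
# (the inequality written as -x <= 0).
A = np.array([[1.0, 1.0]]); b = np.array([2.0])
C = -np.eye(2);             d = np.zeros(2)

# A standard sufficient step-size condition for this quadratic-type energy:
# alpha < 2 / lambda_max(A^T A + C^T C); here lambda_max = 3.
lam_max = np.linalg.eigvalsh(A.T @ A + C.T @ C).max()
alpha = 0.9 * (2.0 / lam_max)

x = rnn_iterate(A, b, C, d, alpha, x0=np.array([3.0, -1.0]))
# x approaches the solution set: x1 + x2 is close to 2 and x >= 0
```

The point of the sketch is only the role of the step size: with alpha below the threshold the iterates settle into the solution set, while too large an alpha makes the update diverge, which mirrors the kind of upper-bound condition the brief tightens.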

Publication Source (Journal or Book title)

IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications

First Page

695

Last Page

698
