Title

A complete proof of global exponential convergence of a neural network for quadratic optimization with bound constraints

Document Type

Letter to the Editor

Publication Date

5-1-2001

Abstract

Sudharsanan and Sundareshan developed a neural-network model for bound-constrained quadratic minimization and proved the global exponential convergence of their proposed neural network. Global exponential convergence is a critical property if the synthesized neural network is to solve the optimization problem successfully. However, Davis and Pattison presented a counterexample showing that the proof given by Sudharsanan and Sundareshan for the global exponential convergence of the neural network is incorrect. Bouzerdoum and Pattison then generalized the neural-network model of Sudharsanan and Sundareshan and derived the global exponential convergence of the neural network under an appropriate condition. In this letter, we demonstrate through an example that the global exponential convergence condition given by Bouzerdoum and Pattison is not always satisfied by the quadratic minimization problem, and we show that the neural-network model under this condition is essentially restricted to contractive networks. Subsequently, a complete proof of the global exponential convergence of the neural-network models proposed by Sudharsanan and Sundareshan and by Bouzerdoum and Pattison is given for the general case, without resorting to the global exponential convergence condition of Bouzerdoum and Pattison. An illustrative simulation example is also presented.
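
Since the full text is not reproduced in this record, the sketch below is only a rough illustration of the kind of model the abstract refers to: a generic projection-type neural dynamics for box-constrained quadratic minimization, integrated by forward Euler. The function name solve_box_qp, the step sizes alpha and dt, and the example data are illustrative assumptions and are not taken from the letter or from the cited papers.

    import numpy as np

    def solve_box_qp(A, b, lower, upper, alpha=0.1, dt=0.01, steps=20000, tol=1e-10):
        """Integrate a projection-type neural dynamics for
        minimize 0.5*x'Ax + b'x  subject to  lower <= x <= upper.

        Dynamics (forward Euler):  dx/dt = -x + clip(x - alpha*(Ax + b), lower, upper).
        A is assumed symmetric positive definite; an equilibrium satisfies the
        fixed-point (KKT) condition x = clip(x - alpha*(Ax + b), lower, upper).
        """
        x = np.clip(np.zeros_like(b), lower, upper)      # start inside the box
        for _ in range(steps):
            target = np.clip(x - alpha * (A @ x + b), lower, upper)  # projected gradient step
            dx = -x + target                                          # network dynamics
            x = x + dt * dx
            if np.linalg.norm(dx) < tol:                              # near an equilibrium
                break
        return x

    if __name__ == "__main__":
        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
        b = np.array([-1.0, -2.0])
        lower = np.array([0.0, 0.0])
        upper = np.array([0.5, 0.5])
        print("approximate minimizer:", solve_box_qp(A, b, lower, upper))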

Publication Source (Journal or Book title)

IEEE Transactions on Neural Networks

First Page

636

Last Page

639

