## LSU Historical Dissertations and Theses

#### Title

Neural Networks With Asynchronous Control.

#### Date of Award

1988

#### Document Type

Dissertation

#### Degree Name

Doctor of Philosophy (PhD)

#### Department

Electrical and Computer Engineering

#### First Advisor

Subhash C. Kak

#### Abstract

Neural network studies have previously focused on monolithic structures. The brain has a bicameral nature, however, so it is natural to expect bicameral structures to perform better. This dissertation offers an approach to the development of such bicameral structures. The companion neural structure takes advantage of the global and subset characteristics of the stored memories. Specifically, we propose the use of an asynchronous controller C that implies the following update of a probe vector x by the connection matrix T: $x' = \operatorname{sgn}(C(x, Tx))$. For a VLSI-implemented neural network the controller block can easily be placed in the feedback loop. In a network running asynchronously, the updating of the probe generally offers a choice among several components; if the wrong components are updated, the network may converge to an incorrect stable point. The proposed asynchronous controller together with the basic neural net forms a bicameral network that can be programmed in various ways to exploit the global and local characteristics of the stored memories. Several methods to do this are proposed: in one, the update choices are based on bit frequencies; in another, handles are appended to the memories to improve retrieval. The new methods have been analyzed, and performance studies show a marked improvement over the basic network; this is illustrated by means of simulations. The use of an asynchronous controller also allows the implementation of conditional rules, which occur frequently in AI applications. It is shown that a neural network using conditional rules can solve problems in natural language understanding. The introduction of the asynchronous controller may be viewed as a first step in the development of truly bicameral structures, which may be seen as the next generation of neural computers.
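The update rule $x' = \operatorname{sgn}(C(x, Tx))$ can be illustrated with a small sketch. The fragment below is not the dissertation's actual algorithm; it is a minimal Hopfield-style network with Hebbian storage, where a controller function (here a hypothetical `greedy_controller` that picks the unstable bit with the largest field magnitude, standing in for the dissertation's bit-frequency or handle-based methods) decides which single component to update at each asynchronous step.

```python
import numpy as np

def store(memories):
    """Hebbian outer-product connection matrix T with zero diagonal."""
    T = memories.T @ memories
    np.fill_diagonal(T, 0)
    return T

def recall(T, x, controller, max_steps=100):
    """Asynchronous recall: repeatedly apply x' = sgn(C(x, Tx)).

    At each step the controller chooses ONE unstable component to flip;
    which component it picks determines which stable point is reached.
    """
    x = x.copy()
    for _ in range(max_steps):
        field = T @ x
        # components whose sign disagrees with the local field are unstable
        unstable = np.nonzero(np.sign(field) * x < 0)[0]
        if unstable.size == 0:
            return x  # reached a stable point
        i = controller(x, field, unstable)
        x[i] = 1 if field[i] >= 0 else -1
    return x

def greedy_controller(x, field, unstable):
    # hypothetical choice rule: flip the unstable bit with the
    # strongest field, one of many possible controller policies
    return unstable[np.argmax(np.abs(field[unstable]))]
```

With two orthogonal 8-bit memories and a one-bit-corrupted probe, `recall` restores the stored pattern; a different controller policy could steer the same probe toward a different stable point, which is exactly the degree of freedom the asynchronous controller exploits.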

