A temporal neural network with finite states.

Chengke Sheng, The University of Texas at El Paso

Abstract

A new class of temporal associative neural network, called a finite state network (FSN), is presented. Unlike other temporal networks, the proposed FSN has the desirable feature that it can associate any input temporal pattern with any output temporal pattern. A temporal pattern is represented by a symbol string, with each symbol being a bipolar vector. The FSN is trained on input-output exemplar string pairs. The FSN is always capable of learning new exemplar pairs while retaining all previously trained pairs unchanged. Suppose that the FSN has been trained on exemplar pairs $(\alpha_1,\theta_1), (\alpha_2,\theta_2), \ldots, (\alpha_p,\theta_p)$. Each time the FSN receives an input string $\alpha$, it compares the string with each of $\alpha_1, \alpha_2, \ldots, \alpha_p$ and finds the closest one, denoted $\alpha_k$. The FSN then responds with the corresponding $\theta_k$ as its output. Training the FSN on an exemplar string pair $(\alpha, \theta)$ is a one-pass process that adds $(\alpha, \theta)$ to the FSN's memory.
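The recall behavior described above can be sketched at a high level: store exemplar pairs in one pass, then answer a query by finding the stored input string nearest to it and emitting the paired output. This is a behavioral sketch only, not the FSN's network implementation; the names `AssociativeMemory` and `string_distance` are hypothetical, and the distance here (summed Hamming distance over aligned bipolar vectors, assuming equal-length strings) is an illustrative choice the dissertation itself realizes via network dynamics.

```python
import numpy as np

def string_distance(a, b):
    """Summed Hamming distance between two equal-length strings of bipolar vectors.

    Each string is a list of numpy arrays with entries in {-1, +1}.
    (Illustrative metric; the FSN realizes matching via its subnets.)
    """
    return sum(int(np.sum(x != y)) for x, y in zip(a, b))

class AssociativeMemory:
    """Behavioral sketch of FSN recall: nearest exemplar input wins."""

    def __init__(self):
        self.pairs = []  # list of (alpha, theta) exemplar string pairs

    def train(self, alpha, theta):
        # One-pass training: storing the pair leaves earlier pairs unchanged.
        self.pairs.append((alpha, theta))

    def recall(self, alpha):
        # Compare the query with every stored alpha_i, pick the closest alpha_k,
        # and respond with the corresponding theta_k.
        k = min(range(len(self.pairs)),
                key=lambda i: string_distance(alpha, self.pairs[i][0]))
        return self.pairs[k][1]
```

For example, after training on two pairs, a query string corrupted in one vector component still recalls the output paired with its nearest stored input.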

This dissertation describes the structure of the FSN, which consists of four subnets: one for input, one for inner output, one for state representation, and one for output. A process is given for training the network by adjusting all adaptive weights associated with the four subnets. Implementation and training of the FSN are accomplished using a simulation, and test results are given.

Subject Area

Engineering, Electronics and Electrical; Computer Science

Recommended Citation

Sheng, Chengke, "A temporal neural network with finite states." (1994). ETD Collection for University of Texas, El Paso. AAI9503983.
https://scholarworks.utep.edu/dissertations/AAI9503983
