
2.4 Backpropagation Neural Networks

Backpropagation neural networks employ one of the most popular neural network learning algorithms, the Backpropagation (BP) algorithm. It has been used successfully for a wide variety of applications, such as speech and voice recognition, image pattern recognition, medical diagnosis, and automatic control. One of the most striking early applications was NETtalk, developed by T. J. Sejnowski and C. R. Rosenberg in 1986. NETtalk learned the rules of phonetics and produced speech by reading a sequence of given letters, its behavior resembling that of a child learning to read aloud [Day90].

Backpropagation was a tremendous step forward from the single-layer perceptron network. With a more sophisticated learning rule, backpropagation networks overcome the limitations of single-layer networks. Backpropagation is also the most suitable learning method for multilayer networks. Perhaps the reason backpropagation marked such a major turning point is that its learning rule rests on a solid mathematical foundation and is practical to apply [Ler91].
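To make the idea concrete, the following is a minimal sketch of the backpropagation learning rule applied to a two-layer network. All details here (the XOR task, sigmoid activations, four hidden units, the learning rate, and the epoch count) are illustrative assumptions chosen for the example, not taken from the text; XOR is used because it is the classic problem a single-layer perceptron cannot solve.

```python
import numpy as np

# Illustrative assumptions: layer sizes, sigmoid activation, learning
# rate, and epoch count are all chosen for this sketch.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: not linearly separable, so a single-layer perceptron fails here.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases: input -> hidden (2x4) and hidden -> output (4x1).
W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros(1)

lr = 1.0
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)   # hidden activations
    y = sigmoid(h @ W2 + b2)   # network output

    # Backward pass: propagate the output error back layer by layer.
    delta2 = (y - T) * y * (1 - y)           # output-layer error term
    delta1 = (delta2 @ W2.T) * h * (1 - h)   # hidden-layer error term

    # Gradient-descent weight updates.
    W2 -= lr * (h.T @ delta2)
    b2 -= lr * delta2.sum(axis=0)
    W1 -= lr * (X.T @ delta1)
    b1 -= lr * delta1.sum(axis=0)

print("mean squared error:", float(np.mean((y - T) ** 2)))
```

The key step is the backward pass: each layer's error term is obtained from the next layer's error propagated through the weights, scaled by the derivative of the activation function, which is what gives the algorithm its name.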


Kiyoshi Kawaguchi