** Next:** 2.4.4 Backpropagation Learning Algorithm
** Up:** 2.4 Backpropagation Neural Networks
** Previous:** 2.4.2 Architecture of Backpropagation

The backpropagation processing unit is a modified linear perceptron whose activation
function is nonlinear and smooth at the threshold point. The suggested activation
function is the sigmoid function, as mentioned previously. With the sigmoid
function we obtain not only the output of the neuron but also, through the
slope of the sigmoid, information about how close the input is to the
threshold point. Mathematically, we can derive the slope from the sigmoid of
equation (2.5), f(x) = 1/(1 + e^{-x}), as follows:

    f'(x) = f(x) (1 - f(x))

This will be the key information for the weight adjustments in the
forthcoming discussions.
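As a minimal sketch of the point above, the following Python snippet computes both the neuron output and the slope used for weight adjustment, assuming the sigmoid of equation (2.5) is f(x) = 1/(1 + e^{-x}); the function names are illustrative, not from the source.

```python
import math

def sigmoid(x):
    """Sigmoid activation: f(x) = 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_slope(x):
    """Slope f'(x) = f(x) * (1 - f(x)), obtained directly
    from the neuron's own output value."""
    fx = sigmoid(x)
    return fx * (1.0 - fx)
```

Note that the slope is largest at the threshold point (f'(0) = 0.25) and falls toward zero as the input moves far from it, which is exactly the "closeness to threshold" information exploited by the weight-adjustment rule discussed next.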

*Kiyoshi Kawaguchi*

*2000-06-17*