next up previous
Next: 2.4.3 Backpropagation Processing Unit Up: 2.4 Backpropagation Neural Networks Previous: 2.4.1 Linear Separability and

2.4.2 Architecture of Backpropagation Networks

Our initial approach to solving the linearly inseparable patterns of the XOR function is to use multiple stages of perceptron networks. Each stage sets up one decision surface, a line that separates patterns; based on the classification determined by the previous stage, the current stage can form sub-classifications. Figure 2.11 shows a network with two layers of perceptron units that solves the XOR problem [BJ91]. Node 1 detects the pattern $ (1,0)$, while node 2 detects the pattern $ (0,1)$. Combined with these first-layer classifications, node 3 can classify the XOR input patterns correctly [BJ91].

Figure 2.11: Suggested Network for Solving XOR Problem
\begin{figure}
\centerline {\epsfysize=2.0in \epsfbox{./figures/figXORNet.epsi}}\end{figure}
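The two-layer construction described above can be sketched in code. This is a minimal illustration, not from the source: the specific weights and biases below are hand-chosen assumptions that realize the decision surfaces of Figure 2.11 (node 1 fires only for $(1,0)$, node 2 only for $(0,1)$, and node 3 ORs the two).

```python
def step(x):
    # Hard-limiting threshold used by the classic perceptron.
    return 1 if x >= 0 else 0

def perceptron(weights, bias, inputs):
    # Weighted sum of inputs plus bias, passed through the threshold.
    return step(sum(w * x for w, x in zip(weights, inputs)) + bias)

def xor_net(x1, x2):
    # First layer: node 1 detects (1,0); node 2 detects (0,1).
    h1 = perceptron([1.0, -1.0], -0.5, [x1, x2])
    h2 = perceptron([-1.0, 1.0], -0.5, [x1, x2])
    # Second layer: node 3 combines the two sub-classifications (an OR).
    return perceptron([1.0, 1.0], -0.5, [h1, h2])

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))  # reproduces XOR: 0, 1, 1, 0
```

Each first-layer node carves out one half-plane, which a single perceptron cannot do for XOR; the second layer then separates the resulting intermediate patterns, which are linearly separable.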

Generalizing from the XOR case discussed above, the multilayer feedforward network appears to be a feasible architecture for backpropagation. However, we still have to consider how learning proceeds. Unfortunately, in a multilayer perceptron the nodes in the output layer have no access to the input information needed to adjust the connection weights: because the actual input signals are masked off by the intermediate layers of threshold perceptrons, there is no indication of how close each unit's net input is to its threshold. For this reason, we need to replace the perceptron's hard-limiting threshold function with a nonlinear, differentiable function for backpropagation learning.
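The problem with the hard threshold, and the reason a smooth nonlinearity helps, can be seen numerically. As a small sketch (the sigmoid is one common choice of nonlinear function; the source does not specify it at this point), compare the step function with the logistic sigmoid and its derivative:

```python
import math

def step(x):
    # Hard-limiting threshold: output jumps from 0 to 1 at x = 0.
    return 1 if x >= 0 else 0

def sigmoid(x):
    # Smooth, differentiable alternative to the hard threshold.
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_deriv(x):
    # Derivative of the sigmoid, expressible in terms of its output.
    s = sigmoid(x)
    return s * (1.0 - s)

# step() gives the same output for a net input of 0.1 and 5.0, hiding how
# close the unit is to its threshold; sigmoid() preserves that information,
# and its nonzero derivative gives a weight-adjustment signal.
for x in (-5.0, -0.1, 0.1, 5.0):
    print(f"x={x:5.1f}  step={step(x)}  "
          f"sigmoid={sigmoid(x):.3f}  deriv={sigmoid_deriv(x):.3f}")
```

Because the sigmoid output varies continuously with the net input, errors at the output layer can be propagated back through the intermediate layers as gradients, which is exactly what the hard threshold's zero (or undefined) derivative prevents.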


Kiyoshi Kawaguchi
2000-06-17