3 Weighted Networks - The Perceptron

3.1 Perceptrons and parallel processing

In the previous chapter we arrived at the conclusion that McCulloch-Pitts units can be used to build networks capable of computing any logical function and of simulating any finite automaton. From the biological point of view, however, the types of network that can be built are not very relevant. The computing units are too similar to conventional logic gates, and the network must be completely specified before it can be used. There are no free parameters which could be adjusted to suit different problems. Learning can only be implemented by modifying the connection pattern of the network and the thresholds of the units, but this is necessarily more complex than just adjusting numerical parameters. For that reason, we turn our attention to weighted networks and consider their most relevant properties. In the last section of this chapter we show that simple weighted networks can provide a computational model for regular neuronal structures in the nervous system.

3.1.1 Perceptrons as weighted threshold elements

In 1958 Frank Rosenblatt, an American psychologist, proposed the perceptron, a more general computational model than McCulloch-Pitts units. The essential innovation was the introduction of numerical weights and a special interconnection pattern. In the original Rosenblatt model the computing units are threshold elements and the connectivity is determined stochastically. Learning takes place by adapting the weights of the network with a numerical algorithm. Rosenblatt's model was refined and perfected in the 1960s and its computational properties were carefully analyzed by Minsky and Papert [312]. In the following, Rosenblatt's model will be called the classical perceptron and the model analyzed by Minsky and Papert the perceptron.

The classical perceptron is in fact a whole network for the solution of certain pattern recognition problems. In Figure 3.1 a projection surface called the
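As a concrete illustration of a weighted threshold element, the following sketch (not from the text; function and variable names are my own) shows a unit that fires when the weighted sum of its inputs reaches its threshold. The numerical weights are the adjustable parameters that distinguish such a unit from a fixed McCulloch-Pitts gate.

```python
def threshold_unit(inputs, weights, threshold):
    """A weighted threshold element (illustrative sketch):
    output 1 if the weighted sum of the inputs reaches the
    threshold, and 0 otherwise."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# Example: with weights (1, 1) and threshold 2 the unit computes AND.
print(threshold_unit((1, 1), (1.0, 1.0), 2.0))  # prints 1
print(threshold_unit((1, 0), (1.0, 1.0), 2.0))  # prints 0
```

Adjusting the weights and threshold changes the function the unit computes, which is exactly what a numerical learning algorithm exploits.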
Rojas, R. (1996). Weighted Networks—The Perceptron. In Neural Networks (pp. 55–76). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-61068-4_3