Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- 1 Introduction
- 2 The biology of neural networks: a few features for the sake of non-biologists
- 3 The dynamics of neural networks: a stochastic approach
- 4 Hebbian models of associative memory
- 5 Temporal sequences of patterns
- 6 The problem of learning in neural networks
- 7 Learning dynamics in ‘visible’ neural networks
- 8 Solving the problem of credit assignment
- 9 Self-organization
- 10 Neurocomputation
- 11 Neurocomputers
- 12 A critical view of the modeling of neural networks
- References
- Index
Summary
Clearly, any neuronal dynamics can always be implemented on a classical computer, and so one may wonder why it is worth building dedicated neuronal machines. The answer is twofold:
- Owing to the inherent parallelism of neuronal dynamics, the time gained by using dedicated machines rather than conventional ones can be considerable, making it possible to solve problems that are beyond the reach of the most powerful serial computers.
- It is perhaps even more important to realize that dedicated machines compel one to think differently about the problems to be solved. Programming a neurocomputer does not mean writing a linear series of instructions, step by step. Instead, one is forced to think more globally, in terms of phase space: to picture an energy landscape and to determine an expression for this energy (see the sketch after the quotation below). Z. Pylyshyn made this point clearly in the following statement (quoted by D. Waltz):
‘What is typically overlooked (when we use a computational system as a cognitive model) is the extent to which the class of algorithms that can even be considered is conditioned by the assumptions we make regarding what basic operations are possible, how they may interact, how operations are sequenced, what data structures are possible and so on. Such assumptions are an intrinsic part of our choice of descriptive formalism.’
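As a purely illustrative sketch, not taken from the book, the following Python fragment shows what "programming by energy landscape" can look like: a Hopfield-style associative memory in which the stored patterns, the Hebbian coupling matrix and the energy function constitute the whole "program", while the asynchronous dynamics simply descends the landscape. All names and parameter values here are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                                        # number of binary (+/-1) neurons
patterns = rng.choice([-1, 1], size=(3, N))   # three memorized patterns

# Hebbian synaptic matrix with zero self-coupling: this matrix IS the "program".
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def energy(s):
    """Energy E(s) = -1/2 s^T W s; stored patterns sit near local minima."""
    return -0.5 * s @ W @ s

def recall(s, sweeps=10):
    """Asynchronous single-neuron updates; each flip can only lower the energy."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Start from a corrupted version of pattern 0 and let the dynamics relax.
noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
recovered = recall(noisy)
print("initial energy:", energy(noisy))
print("final energy  :", energy(recovered))
print("overlap with stored pattern:", (recovered @ patterns[0]) / N)
```

The point of the sketch is that nothing resembling a step-by-step instruction list appears: the problem is specified by choosing the couplings (and hence the energy surface), and the retrieval "computation" is the relaxation of the network into a nearby minimum.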
- Type: Chapter
- In: An Introduction to the Modeling of Neural Networks, pp. 379-402
- Publisher: Cambridge University Press
- Print publication year: 1992