
Efficiency, information theory, and neural representations

Published online by Cambridge University Press: 30 August 2019

Joseph T. Devlin
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, England
Matt H. Davis
Affiliation:
Medical Research Council, Cognition and Brain Sciences Unit, Cambridge, CB2 2EF, England
Stuart A. McLelland
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, England
Richard P. Russell
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, England

Abstract

We contend that if efficiency and reliability are important factors in neural information processing, then distributed, not localist, representations are "evolution's best bet." We note that distributed codes are the most efficient method for representing information, and that this efficiency minimizes metabolic costs, providing an adaptive advantage to an organism.
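The efficiency claim rests on a standard information-theoretic comparison that is worth making concrete: a localist code dedicates one unit per represented item, whereas a distributed code exploits combinations of units, so the same number of units can distinguish exponentially more items. The Python sketch below is an illustration of that textbook point only, not part of the original commentary; the function names and example unit counts are our own.

```python
import math

def localist_capacity(n_units: int) -> float:
    """A localist code activates exactly one unit per item, so n units
    can distinguish at most n items, i.e. log2(n) bits per pattern."""
    return math.log2(n_units)

def distributed_capacity(n_units: int) -> float:
    """A distributed binary code uses arbitrary on/off patterns, so n
    units can distinguish up to 2**n items, i.e. n bits per pattern."""
    return float(n_units)

if __name__ == "__main__":
    # Illustrative unit counts (our choice), showing how the gap widens.
    for n in (8, 64, 1000):
        print(f"{n} units: localist ~{localist_capacity(n):.1f} bits, "
              f"distributed ~{distributed_capacity(n):.1f} bits")
```

On this accounting, achieving a given representational capacity with a localist scheme requires far more units (and hence more metabolic expenditure) than with a distributed scheme, which is the sense in which distributed coding is the more efficient option.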

Type
Brief Report
Copyright
2000 Cambridge University Press
