TY - JOUR
T1 - The minimum information principle and its application to neural code analysis
AU - Globerson, Amir
AU - Stark, Eran
AU - Vaadia, Eilon
AU - Tishby, Naftali
PY - 2009/3/3
Y1 - 2009/3/3
N2 - The study of complex information processing systems requires appropriate theoretical tools to help unravel their underlying design principles. Information theory is one such tool, and has been utilized extensively in the study of the neural code. Although much progress has been made in information theoretic methodology, there is still no satisfying answer to the question: "What is the information that a given property of the neural population activity (e.g., the responses of single cells within the population) carries about a set of stimuli?" Here, we answer such questions via the minimum mutual information (MinMI) principle. We quantify the information in any statistical property of the neural response by considering all hypothetical neuronal populations that have the given property and finding the one that contains the minimum information about the stimuli. All systems with higher information values necessarily contain additional information processing mechanisms and, thus, the minimum captures the information related to the given property alone. MinMI may be used to measure information in properties of the neural response, such as that conveyed by responses of small subsets of cells (e.g., singles or pairs) in a large population and cooperative effects between subunits in networks. We show how the framework can be used to study neural coding in large populations and to reveal properties that are not discovered by other information theoretic methods.
KW - Information theory
KW - Maximum entropy
KW - Neural coding
KW - Population coding
UR - http://www.scopus.com/inward/record.url?scp=62549144942&partnerID=8YFLogxK
U2 - 10.1073/pnas.0806782106
DO - 10.1073/pnas.0806782106
M3  - Article
C2 - 19218435
AN - SCOPUS:62549144942
SN - 0027-8424
VL - 106
SP - 3490
EP - 3495
JO - Proceedings of the National Academy of Sciences of the United States of America
JF - Proceedings of the National Academy of Sciences of the United States of America
IS - 9
ER -