Abstract
Mutual information (MI) is increasingly used to quantify neural responses. However, many researchers still regard it with some doubt, because it is not always clear what MI really measures, and because MI is hard to calculate in practice. This paper aims to clarify these issues. First, it provides an interpretation of mutual information as a variability decomposition, analogous to the standard variance decomposition routinely used in statistical evaluations of neural data, except that the measure of variability is entropy rather than variance. Second, it discusses the aspects of MI that make its calculation difficult. The goal of this paper is to clarify when and how information theory can be used informatively and reliably in auditory neuroscience.
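As a concrete illustration of the decomposition view described above (a minimal sketch, not code from the paper), the snippet below computes MI for a discrete stimulus-response table as the total response entropy H(R) minus the stimulus-averaged conditional entropy H(R|S). The count table is hypothetical.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution; zero entries are skipped."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(counts):
    """MI as a variability decomposition: total response entropy H(R)
    minus the stimulus-averaged conditional entropy H(R|S)."""
    joint = counts / counts.sum()        # normalize counts to a joint distribution p(s, r)
    p_s = joint.sum(axis=1)              # marginal distribution over stimuli
    p_r = joint.sum(axis=0)              # marginal distribution over responses
    h_r = entropy(p_r)                   # total variability of the response
    # H(R|S): entropy of each conditional p(r|s), weighted by p(s)
    h_r_given_s = sum(
        p_s[i] * entropy(joint[i] / p_s[i])
        for i in range(len(p_s)) if p_s[i] > 0
    )
    return h_r - h_r_given_s             # "explained" variability, in bits

# Hypothetical 2-stimulus x 3-response-bin count table
counts = np.array([[30., 10., 10.],
                   [5., 15., 30.]])
print(mutual_information(counts))
```

Note that plugging empirical frequencies into this formula, as done here, is known to bias the MI estimate upward when trial counts are small relative to the number of response bins; this limited-sampling bias is likely among the calculation difficulties the abstract refers to.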
| Original language | English |
| --- | --- |
| Pages (from-to) | 94-105 |
| Number of pages | 12 |
| Journal | Hearing Research |
| Volume | 229 |
| Issue number | 1-2 |
| DOIs | |
| State | Published - Jul 2007 |
Bibliographical note
Funding Information: This work was supported by a grant from the Israeli Science Foundation. We thank Yael Bitterman for useful comments on the manuscript.
Keywords
- Auditory system
- Entropy
- Information theory
- Mutual information
- Neural code
- Variance decomposition