Abstract
Information, entropy, probability: these three terms are closely interconnected in the prevalent understanding of statistical mechanics, both when the field is taught at an introductory level and in advanced research into its foundations. This paper examines the interconnection between these three notions in light of recent research in the foundations of statistical mechanics. It disentangles the concepts and highlights their differences, while also explaining why they came to be so closely linked in the literature. There, the term ‘information’ is most often tied to entropy and probability in discussions of Maxwell’s Demon and its attempted exorcism by the Landauer-Bennett thesis, and in analyses of the spin echo experiments. The present paper takes a different direction: we discuss the statistical mechanical underpinning of the notions of probability and entropy, and this constructive approach shows that information plays no fundamental role in these concepts, although it can be conveniently used in a sense that we specify.
| Field | Value |
|---|---|
| Original language | English |
| Article number | 5 |
| Journal | European Journal for Philosophy of Science |
| Volume | 10 |
| Issue number | 1 |
| DOIs | |
| State | Published - 1 Jan 2020 |
Bibliographical note
Publisher Copyright: © 2020, Springer Nature B.V.
Keywords
- Cybernetics
- Entropy
- Information
- Probability
- Reduction
- Shannon
- Statistical mechanics
- Thermodynamics