Information vs. entropy vs. probability

Orly Shenker*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Information, entropy, probability: these three terms are closely interconnected in the prevalent understanding of statistical mechanics, both when this field is taught to students at an introductory level and in advanced research into the field's foundations. This paper examines the interconnection between these three notions in light of recent research in the foundations of statistical mechanics. It disentangles these concepts and highlights their differences, while explaining why they came to be so closely linked in the literature. The term 'information' is often linked to entropy and probability in discussions of Maxwell's Demon and its attempted exorcism by the Landauer-Bennett thesis, and in analyses of the spin echo experiments. The present paper takes a different direction: it discusses the statistical mechanical underpinning of the notions of probability and entropy, and this constructive approach shows that information plays no fundamental role in these concepts, although it can be conveniently used in a sense that we shall specify.

Original language: American English
Article number: 5
Journal: European Journal for Philosophy of Science
Issue number: 1
State: Published - 1 Jan 2020

Bibliographical note

Publisher Copyright:
© 2020, Springer Nature B.V.


Keywords

  • Cybernetics
  • Entropy
  • Information
  • Probability
  • Reduction
  • Shannon
  • Statistical mechanics
  • Thermodynamics

