Entropy, Shannon's measure of information and Boltzmann's H-theorem

Arieh Ben-Naim*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

47 Scopus citations

Abstract

We start with a clear distinction between Shannon's Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally we show that the H-function as defined by Boltzmann is an SMI but not entropy. Therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
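The abstract's central distinction is that the SMI applies to any discrete probability distribution, not only to the equilibrium distributions on which entropy is defined. As a minimal illustration (not part of the paper), the SMI in bits can be sketched in Python; the function name `smi` is our own:

```python
import math

def smi(p):
    """Shannon's Measure of Information (in bits) of a discrete
    probability distribution p -- defined for ANY such distribution."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with zero probability contribute nothing to the sum.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries 1 bit of uncertainty; a uniform distribution
# over 4 outcomes carries 2 bits.
print(smi([0.5, 0.5]))    # 1.0
print(smi([0.25] * 4))    # 2.0
```

Only for the special distributions of particle locations and momenta at equilibrium (with the quantum corrections the abstract mentions) does this quantity correspond to the thermodynamic entropy.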

Original language: English
Article number: 48
Journal: Entropy
Volume: 19
Issue number: 2
State: Published - 2017

Bibliographical note

Publisher Copyright:
© 2017 by the authors.

Keywords

  • Entropy
  • H-theorem
  • Second Law of Thermodynamics
  • Shannon's measure of information
