Abstract
We begin with a clear distinction between Shannon's Measure of Information (SMI) and the thermodynamic entropy. The first is defined for any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next, we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy: the entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function defined by Boltzmann is an SMI but not an entropy; therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
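As a minimal illustration of the generality claimed above, the SMI of any discrete probability distribution can be computed from the standard Shannon definition, H(p) = −Σ pᵢ log₂ pᵢ. The function below is an illustrative sketch, not code from the paper:

```python
import math

def smi(p, base=2.0):
    """Shannon's Measure of Information of a discrete distribution p.

    Defined for ANY probability distribution, illustrating why SMI is a
    more general concept than the thermodynamic entropy.
    """
    if abs(sum(p) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i = 0 contribute nothing (0 * log 0 -> 0 by convention).
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin carries 1 bit of uncertainty; a certain outcome carries none.
print(smi([0.5, 0.5]))  # → 1.0
print(smi([1.0]))       # → 0.0
```

The thermodynamic entropy, by contrast, corresponds to the SMI evaluated on one particular equilibrium distribution over locations and momenta, with the two corrections named in the abstract.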
| Original language | English |
|---|---|
| Article number | 48 |
| Journal | Entropy |
| Volume | 19 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2017 |
Bibliographical note
Publisher Copyright: © 2017 by the authors.
Keywords
- Entropy
- H-theorem
- Second Law of Thermodynamics
- Shannon's measure of information