A new Definition of Entropy based on Shannon's Measure of Information

Arieh Ben-Naim*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

In this chapter, we define the concept of entropy based on Shannon's Measure of Information (SMI). This definition is equivalent to Boltzmann's definition, but it has some advantages: it leads naturally to a simple, intuitive, and irrefutable interpretation of entropy, and it provides a solid probabilistic interpretation of the Second Law of Thermodynamics. In a subsequent chapter we shall also show that this definition makes it clear why entropy is a timeless quantity.
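The SMI referred to in the abstract is Shannon's functional H = -Σ p_i log p_i over a probability distribution. As a minimal illustrative sketch (not taken from the chapter itself), the following computes the SMI in bits and shows that it is largest for a uniform distribution, where the "missing information" about the outcome is maximal:

```python
import math

def smi(probs):
    """Shannon's Measure of Information (SMI) in bits:
    H = -sum(p * log2(p)) over the nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes maximizes the SMI: log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]

# A sharply peaked distribution carries less missing information.
peaked = [0.97, 0.01, 0.01, 0.01]

print(smi(uniform))  # 2.0
print(smi(peaked))   # smaller than 2.0
```

The function name `smi` and the example distributions are hypothetical choices for illustration; the chapter's definition of thermodynamic entropy involves applying this measure to the distribution of locations and momenta of particles, not to these toy distributions.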

Original language: English
Title of host publication: Frontiers In Entropy Across The Disciplines - Panorama Of Entropy
Subtitle of host publication: Theory, Computation, And Applications
Publisher: World Scientific Publishing Co. Pte Ltd
Pages: 91-131
Number of pages: 41
ISBN (Electronic): 9789811259401
ISBN (Print): 9789811259395
DOIs
State: Published - 30 Aug 2022

Bibliographical note

Publisher Copyright:
© 2023 by World Scientific Publishing Co. Pte. Ltd. All rights reserved.
