Abstract
This book is about the applications of Shannon’s measure of information (SMI). The SMI was originally developed within the theory of communication, and soon after its publication it became a useful concept in many branches of science. Unfortunately, however, the SMI has been confused with two other concepts: entropy and information. We begin this chapter with a brief definition of the SMI and discuss some of its meanings and various interpretations. We demonstrate that although the SMI and entropy share some mathematical similarities, they are entirely different and distinct concepts; likewise, while the SMI is a measure of a specific, well-defined kind of information, it is not the same as the general concept of information. In subsequent sections, we further discuss the meaning of the bit both as a binary digit and as a unit of information, the concept of “self-information,” and the relationship between entropy and probability. We also discuss the Monty Hall problem and its solution.
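As an illustrative sketch (not taken from the chapter itself), the two quantitative topics the abstract names can be made concrete in a few lines: the SMI of a discrete distribution, H = −Σᵢ pᵢ log₂ pᵢ, measured in bits, and a Monte Carlo check of the Monty Hall problem. The function names `smi` and `monty_hall` are illustrative choices, not the author's notation.

```python
import math
import random

def smi(probs):
    """Shannon's measure of information (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def monty_hall(switch, trials=100_000, seed=0):
    """Estimate the win probability of the stay vs. switch strategies."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)   # door hiding the car
        pick = rng.randrange(3)  # contestant's initial choice
        # Host opens a door that holds a goat and is not the contestant's pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials
```

A fair coin toss carries `smi([0.5, 0.5]) = 1.0` bit, while a certain outcome carries zero; the simulation shows switching winning roughly 2/3 of the time against 1/3 for staying, which is the chapter's advertised "solution" in numerical form.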
Original language | English
---|---
Title of host publication | Information Theory
Subtitle of host publication | An Exploration across Disciplines
Publisher | Springer Nature
Pages | 1-26
Number of pages | 26
ISBN (Electronic) | 9783031677472
ISBN (Print) | 9783031677465
DOIs |
State | Published - 1 Jan 2024
Bibliographical note
Publisher Copyright: © The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.