Introduction, Definitions, and Some General Comments on Shannon’s Measure of Information

Arieh Ben-Naim*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

This book is about the applications of Shannon’s measure of information (SMI). The SMI was originally developed within the theory of communication. Soon after its publication, it became quite a useful concept in many branches of science. However, the SMI has unfortunately been confused with two other concepts: entropy and information. We begin this chapter with a brief definition of the SMI and also discuss some of its meanings and its various interpretations. We will demonstrate that although the SMI and the entropy share some mathematical similarities, they are entirely distinct from each other, and that while the SMI is a measure of a specific kind of information, it is not the same as the general concept of information. In subsequent sections, we further discuss the meaning of the bit as a binary digit and as a unit of information, the concept of “self-information,” and the relationship between entropy and probability. We also discuss the Monty Hall problem and its solution.

Original language: English
Title of host publication: Information Theory
Subtitle of host publication: An Exploration across Disciplines
Publisher: Springer Nature
Pages: 1-26
Number of pages: 26
ISBN (Electronic): 9783031677472
ISBN (Print): 9783031677465
DOIs
State: Published - 1 Jan 2024

Bibliographical note

Publisher Copyright:
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
