Novel lower bounds on the entropy rate of binary hidden Markov processes

Or Ordentlich*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, in which the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected onto a random subset of coordinates. Here, this result is applied to derive novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the (1, ∞)-RLL constraint.
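
As a rough sketch of the setup the abstract refers to (the notation below, X, Z, Y, δ and the binary entropy function h, is assumed here rather than taken from the paper), the binary hidden Markov process arises by passing a binary Markov chain through a binary symmetric channel, and the classical Wyner-Ziv form of Mrs. Gerber's Lemma bounds its block entropy:

% Minimal LaTeX sketch of the standard setup; the notation is an assumption,
% not necessarily the paper's own.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $X = (X_1, X_2, \dots)$ be a stationary binary Markov chain, let
$Z_1, Z_2, \dots$ be i.i.d.\ $\mathrm{Bernoulli}(\delta)$ noise independent of $X$,
and observe $Y_i = X_i \oplus Z_i$ (a binary symmetric channel with crossover
probability $\delta$). The quantity being lower bounded is the entropy rate
\[
  \bar{H}(Y) \;=\; \lim_{n \to \infty} \frac{1}{n} H(Y_1, \dots, Y_n).
\]
The classical (Wyner--Ziv) Mrs.\ Gerber's Lemma states that, for any binary input,
\[
  \frac{1}{n} H(Y^n) \;\ge\;
  h\!\left( h^{-1}\!\left( \frac{1}{n} H(X^n) \right) \star \delta \right),
  \qquad a \star \delta \triangleq a(1 - \delta) + \delta(1 - a),
\]
where $h$ denotes the binary entropy function.
% Per the abstract, Samorodnitsky's strengthening instead bounds the output
% entropy in terms of the average entropy of the input projected on a random
% subset of coordinates, which is the form applied in the paper.
\end{document}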

Original language: English
Title of host publication: Proceedings - ISIT 2016; 2016 IEEE International Symposium on Information Theory
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 690-694
Number of pages: 5
ISBN (Electronic): 9781509018062
DOIs
State: Published - 10 Aug 2016
Externally published: Yes
Event: 2016 IEEE International Symposium on Information Theory, ISIT 2016 - Barcelona, Spain
Duration: 10 Jul 2016 - 15 Jul 2016

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2016-August
ISSN (Print): 2157-8095

Conference

Conference: 2016 IEEE International Symposium on Information Theory, ISIT 2016
Country/Territory: Spain
City: Barcelona
Period: 10/07/16 - 15/07/16

Bibliographical note

Publisher Copyright:
© 2016 IEEE.
