The relative entropy rate for two hidden Markov processes

Or Zuk*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-reviewed

Abstract

The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two hidden Markov processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, a representation in terms of Lyapunov exponents, and a Taylor series expansion for the relative entropy rate of two discrete-time, finite-alphabet HMPs.
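For context, the quantity studied here is the relative entropy rate between the observation processes of the two HMPs. The definition below is the standard one and is not quoted from the paper itself:

$$
D(P \,\|\, Q) \;=\; \lim_{n \to \infty} \frac{1}{n}\, \mathbb{E}_{P}\!\left[\log \frac{P(X_1,\ldots,X_n)}{Q(X_1,\ldots,X_n)}\right],
$$

where $P$ and $Q$ denote the distributions that the two HMPs induce on observation sequences. Under $P$, the normalised log-likelihood ratio converges to this limit, which both enables a simple Monte Carlo estimate and hints at the Lyapunov-exponent representation mentioned in the abstract (the forward recursion is a product of random matrices). The sketch below is illustrative only: the HMP parameters are hypothetical, and the paper's actual constructions and expansions are not reproduced here.

```python
import numpy as np

def sample_hmp(A, B, pi, n, rng):
    """Draw an observation sequence of length n from an HMP with
    state-transition matrix A, emission matrix B and initial law pi."""
    m, k = B.shape
    obs = np.empty(n, dtype=int)
    s = rng.choice(m, p=pi)
    for t in range(n):
        obs[t] = rng.choice(k, p=B[s])   # emit a symbol from the current state
        s = rng.choice(m, p=A[s])        # move to the next hidden state
    return obs

def log_likelihood(A, B, pi, obs):
    """Scaled forward algorithm; returns log of the observation-sequence probability."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    ll = np.log(c)
    alpha = alpha / c
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]    # forward recursion
        c = alpha.sum()
        ll += np.log(c)                  # accumulate log of the scaling constants
        alpha = alpha / c
    return ll

# Hypothetical 2-state, binary-output HMPs P and Q (parameters chosen for illustration only).
A_p = np.array([[0.9, 0.1], [0.2, 0.8]]); B_p = np.array([[0.95, 0.05], [0.10, 0.90]])
A_q = np.array([[0.7, 0.3], [0.3, 0.7]]); B_q = np.array([[0.80, 0.20], [0.20, 0.80]])
pi_p = pi_q = np.array([0.5, 0.5])

rng = np.random.default_rng(0)
n = 100_000
x = sample_hmp(A_p, B_p, pi_p, n, rng)   # sample a long trajectory under P
d_hat = (log_likelihood(A_p, B_p, pi_p, x)
         - log_likelihood(A_q, B_q, pi_q, x)) / n
print(f"Monte Carlo estimate of the relative entropy rate: {d_hat:.4f} nats/symbol")
```

With a long enough sample the estimate stabilises near the true rate; the paper's contribution is the analytic treatment of this quantity rather than such a numerical estimate.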

Original language: English
Title of host publication: TURBO - CODING 2006 - 4th International Symposium on Turbo Codes and Related Topics and 6th International ITG-Conference on Source and Channel Coding
Publisher: VDE Verlag GmbH
ISBN (Electronic): 3800729474, 9783800729470
State: Published - 2006
Externally published: Yes
Event: 4th International Symposium on Turbo Codes and Related Topics and 6th International ITG-Conference on Source and Channel Coding, TURBOCODING 2006 - Munich, Germany
Duration: 3 Apr 2006 - 7 Apr 2006

Publication series

Name: TURBO - CODING 2006 - 4th International Symposium on Turbo Codes and Related Topics and 6th International ITG-Conference on Source and Channel Coding

Conference

Conference: 4th International Symposium on Turbo Codes and Related Topics and 6th International ITG-Conference on Source and Channel Coding, TURBOCODING 2006
Country/Territory: Germany
City: Munich
Period: 3/04/06 - 7/04/06

Bibliographical note

Publisher Copyright:
© 2020 TURBO - CODING 2006 - 4th International Symposium on Turbo Codes and Related Topics and 6th International ITG-Conference on Source and Channel Coding. All rights reserved.
