The relative entropy rate for two Hidden Markov Processes

Or Zuk*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, a representation using Lyapunov exponents, and a Taylor expansion for the relative entropy rate of two discrete-time finite-alphabet HMPs.
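For intuition, the relative entropy rate D(P‖Q) between two HMPs can be approximated numerically: sample a long observation sequence from P, evaluate its log-likelihood under both models with the forward algorithm, and take the normalized log-likelihood ratio, which converges to the rate by the Shannon-McMillan-Breiman theorem. The sketch below is a standard Monte Carlo illustration of this quantity, not the analytic machinery of the paper; all model parameters are made-up two-state, binary-output examples.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence under an HMM
    (initial dist pi, row-stochastic transition A, emission B),
    via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        logp += np.log(alpha.sum())
        alpha /= alpha.sum()
    return logp

def sample_hmm(n, pi, A, B, rng):
    """Draw a length-n observation sequence from the HMM."""
    n_states, n_symbols = B.shape
    s = rng.choice(n_states, p=pi)
    obs = np.empty(n, dtype=int)
    for t in range(n):
        obs[t] = rng.choice(n_symbols, p=B[s])
        s = rng.choice(n_states, p=A[s])
    return obs

# Two illustrative (hypothetical) HMPs P and Q.
pi_p = np.array([0.5, 0.5])
A_p = np.array([[0.9, 0.1], [0.2, 0.8]])
B_p = np.array([[0.95, 0.05], [0.1, 0.9]])
pi_q = np.array([0.5, 0.5])
A_q = np.array([[0.7, 0.3], [0.4, 0.6]])
B_q = np.array([[0.8, 0.2], [0.3, 0.7]])

rng = np.random.default_rng(0)
n = 50_000
x = sample_hmm(n, pi_p, A_p, B_p, rng)
# (1/n) * [log P(x) - log Q(x)] -> relative entropy rate as n grows.
d_hat = (forward_loglik(x, pi_p, A_p, B_p)
         - forward_loglik(x, pi_q, A_q, B_q)) / n
print(f"estimated relative entropy rate: {d_hat:.4f} nats/symbol")
```

The estimate is random but, for distinct models like these, settles on a strictly positive value as the sequence length grows; the paper's results concern the analytic behavior of this limit in the model parameters.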

Original language: American English
Title of host publication: Turbo Codes and Related Topics; 6th International ITG-Conference on Source and Channel Coding (TURBOCODING), 2006 4th International Symposium on
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9783800729470
State: Published - 2006
Externally published: Yes
Event: 6th International ITG-Conference on Source and Channel Coding and 2006 4th International Symposium on Turbo Codes and Related Topics, TURBOCODING 2006 - Munich, Germany
Duration: 3 Apr 2006 - 7 Apr 2006

Publication series

Name: Turbo Codes and Related Topics; 6th International ITG-Conference on Source and Channel Coding (TURBOCODING), 2006 4th International Symposium on

Conference

Conference: 6th International ITG-Conference on Source and Channel Coding and 2006 4th International Symposium on Turbo Codes and Related Topics, TURBOCODING 2006
Country/Territory: Germany
City: Munich
Period: 3/04/06 - 7/04/06

