Memory Complexity of Estimating Entropy and Mutual Information

Tomer Berg*, Or Ordentlich, Ofer Shayevitz

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We observe an infinite sequence of independent and identically distributed random variables X1,X2,... drawn from an unknown distribution p over [n], and our goal is to estimate the entropy H(p) = -E[log p(X)] within an ϵ-additive error. To that end, at each time point we are allowed to update a finite-state machine with S states, using a possibly randomized but time-invariant rule, where each state of the machine is assigned an entropy estimate. Our goal is to characterize the minimax memory complexity S∗ of this problem, which is the minimal number of states for which the estimation task is feasible with probability at least 1 - δ asymptotically, uniformly in p. Specifically, we show that there exist universal constants C1 and C2 such that S∗ ≤ C1 · n(log n)⁴/(ϵ²δ) for ϵ not too small, and S∗ ≥ C2 · max{n, (log n)/ϵ} for ϵ not too large. The upper bound is proved using approximate counting to estimate the logarithm of p, and a finite-memory bias estimation machine to estimate the expectation operation. The lower bound is proved via a reduction of entropy estimation to uniformity testing. We also apply these results to derive bounds on the memory complexity of mutual information estimation.
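The upper bound combines two finite-memory primitives named in the abstract: approximate counting (to track a logarithm with few states) and a finite-memory bias estimation machine (to emulate the expectation). The sketch below illustrates these two ideas with a classic Morris counter and a Samaniego-style birth-death chain; it is only an illustration of the primitives under that assumption, not the authors' actual construction, and the class names and parameters are hypothetical.

```python
import random


class MorrisCounter:
    """Approximate counter: stores only c ~= log2(count), so counting up to T
    needs roughly log2(log2(T)) bits of state instead of log2(T).
    NOTE: illustrative primitive only, not the paper's exact machine."""

    def __init__(self):
        self.c = 0

    def increment(self):
        # Increase the exponent with probability 2**(-c); this keeps
        # 2**c - 1 an unbiased estimate of the number of increments.
        if random.random() < 2.0 ** (-self.c):
            self.c += 1

    def estimate(self):
        return 2.0 ** self.c - 1.0


class BiasEstimationMachine:
    """S-state birth-death chain (Samaniego-style) for estimating the mean p
    of i.i.d. Bernoulli(p) observations. In stationarity the state is
    Binomial(S - 1, p), so state / (S - 1) estimates p.
    NOTE: one classical finite-memory bias estimator, shown for illustration."""

    def __init__(self, num_states):
        self.m = num_states - 1        # states are {0, 1, ..., m}
        self.state = num_states // 2

    def update(self, y):
        # On a 1, move up with probability (m - state) / m;
        # on a 0, move down with probability state / m.
        if y == 1:
            if random.random() < (self.m - self.state) / self.m:
                self.state += 1
        else:
            if random.random() < self.state / self.m:
                self.state -= 1

    def estimate(self):
        return self.state / self.m


if __name__ == "__main__":
    random.seed(0)

    # Approximate counting: track ~10^5 increments with a handful of states.
    counter = MorrisCounter()
    for _ in range(100_000):
        counter.increment()
    print("Morris estimate of 100000:", counter.estimate())

    # Finite-memory bias estimation: estimate p = 0.3 with 64 states.
    machine = BiasEstimationMachine(num_states=64)
    for _ in range(100_000):
        machine.update(1 if random.random() < 0.3 else 0)
    print("Bias estimate of 0.3:", machine.estimate())
```

Both estimates are noisy for a single run (the Morris estimate in particular has high variance), which is why finite-memory constructions of this kind trade the number of states S against the accuracy ϵ and confidence 1 - δ.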

Original language: English
Journal: IEEE Transactions on Information Theory
State: Accepted/In press - 2025

Bibliographical note

Publisher Copyright:
© 2025 IEEE.

Keywords

  • entropy estimation
  • finite memory algorithms
  • memory complexity
  • mutual information estimation
  • sample complexity
