Individual sequence prediction using memory-efficient context trees

Ofer Dekel*, Shai Shalev-Shwartz, Yoram Singer

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Scopus citations


Context trees are a popular and effective tool for tasks such as compression, sequential prediction, and language modeling. We present an algebraic perspective on context trees for the task of individual sequence prediction. Our approach stems from a generalization of the notion of margin used for linear predictors. By exporting the concept of margin to context trees, we are able to cast the individual sequence prediction problem as the task of finding a linear separator in a Hilbert space, and to apply techniques from machine learning and online optimization to this problem. Our main contribution is a memory-efficient adaptation of the perceptron algorithm for individual sequence prediction. We name our algorithm the shallow perceptron and prove a shifting mistake bound, which relates its performance to the performance of any sequence of context trees. We also prove that the shallow perceptron grows a context tree at a rate that is upper bounded by its mistake rate, which imposes an upper bound on the size of the trees grown by our algorithm.
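The core idea in the abstract — treating the contexts (suffixes) of the observed history as features and running a perceptron over them, updating only on mistakes so that stored contexts grow no faster than the mistake count — can be sketched as follows. This is an illustrative toy for binary sequences, not the paper's shallow perceptron: the class name, the depth cap `max_depth`, and the unit-step update are all assumptions made for the sketch.

```python
class ContextTreePerceptron:
    """Toy perceptron-style predictor over binary-sequence contexts.

    Illustrative sketch only (not the paper's shallow perceptron): each
    suffix of the history up to max_depth symbols carries a weight, the
    prediction is the sign of the summed weights along the current
    context path, and weights are created/updated only on mistakes.
    """

    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        self.weights = {}   # context tuple -> weight; grows only on mistakes
        self.mistakes = 0

    def _contexts(self, history):
        # Suffixes of the history, from the empty context up to max_depth.
        for d in range(min(self.max_depth, len(history)) + 1):
            yield tuple(history[len(history) - d:])

    def predict(self, history):
        # Sign of the summed weights along the current context path.
        score = sum(self.weights.get(c, 0.0) for c in self._contexts(history))
        return 1 if score >= 0 else -1

    def update(self, history, label):
        # Perceptron rule: touch weights only when the prediction is wrong,
        # so the stored tree grows at most as fast as the mistake count.
        if self.predict(history) != label:
            self.mistakes += 1
            for c in self._contexts(history):
                self.weights[c] = self.weights.get(c, 0.0) + label
```

On an alternating sequence, the sketch converges after a couple of mistakes, since the depth-1 context alone suffices to separate the two cases:

```python
model = ContextTreePerceptron(max_depth=2)
history = []
for t in range(100):
    label = 1 if t % 2 == 0 else -1  # alternating +1, -1, +1, ...
    model.update(history, label)
    history.append(label)
```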

Original language: American English
Pages (from-to): 5251-5262
Number of pages: 12
Journal: IEEE Transactions on Information Theory
Issue number: 11
State: Published - 2009

Bibliographical note

Funding Information:
Manuscript received January 24, 2008; revised October 07, 2008. Current version published October 21, 2009. This work was supported in part by the Israeli Science Foundation under Grant 522-04. The material in this paper was presented in part at the Advances in Neural Information Processing Systems 17, Vancouver, BC, Canada, December 2004.


Keywords

  • Context trees
  • Online learning
  • Perceptron
  • Shifting bounds

