Data Compression Conference

Abstract

We explore the problem of assigning a probability to the next outcome of an individual binary sequence under the constraint that the universal predictor has a finite number of states. The two main loss functions considered are the square error loss and the self-information loss. Universal prediction w.r.t. the self-information loss can be combined with arithmetic coding to construct a universal encoder, so in essence we explore the universal coding problem. We analyze the performance of randomized, time-invariant K-state universal predictors and provide performance bounds in terms of the number of states K for sufficiently long sequences. When the comparison class consists of constant predictors, we provide, for the square error loss, tight bounds showing that the optimal asymptotic expected redundancy is O(1/K). For the self-information loss we show an upper bound of O((log K)/K) on the coding redundancy and a lower bound of O(1/K).
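To make the setting concrete, the following is a minimal sketch (not the paper's construction) of one well-known randomized K-state predictor for a binary sequence: the state s encodes the current probability estimate s/(K-1), and on each bit the state moves up or down with a probability chosen so that the stationary state distribution concentrates around the sequence's empirical bit frequency. The simulation below compares its average square error loss to that of the best constant predictor in hindsight; the gap (redundancy) shrinks roughly like 1/K, in line with the O(1/K) behavior the abstract describes. All names and parameters here are illustrative assumptions.

```python
import random

def k_state_square_loss(bits, K, rng):
    """Randomized K-state predictor (illustrative sketch, not the paper's scheme).

    State s in {0, ..., K-1}; the prediction is p_hat = s / (K-1).
    On bit 1 the state increments with probability (K-1-s)/(K-1);
    on bit 0 it decrements with probability s/(K-1). The expected drift
    is proportional to (p - s/(K-1)), so the chain's stationary
    distribution concentrates around the empirical frequency p.
    Returns the average square error loss over the sequence.
    """
    s = K // 2  # start in the middle state
    total = 0.0
    for b in bits:
        p_hat = s / (K - 1)
        total += (b - p_hat) ** 2  # square error loss for this outcome
        if b == 1 and rng.random() < (K - 1 - s) / (K - 1):
            s += 1
        elif b == 0 and rng.random() < s / (K - 1):
            s -= 1
    return total / len(bits)

def best_constant_loss(bits):
    """Average loss of the best constant predictor in hindsight
    (which, for square error loss, predicts the empirical mean)."""
    p = sum(bits) / len(bits)
    return sum((b - p) ** 2 for b in bits) / len(bits)

rng = random.Random(0)
# An i.i.d. Bernoulli(0.7) sequence stands in for the individual sequence.
bits = [1 if rng.random() < 0.7 else 0 for _ in range(20000)]
baseline = best_constant_loss(bits)
redundancy = {K: k_state_square_loss(bits, K, rng) - baseline
              for K in (4, 16, 64)}
for K, red in redundancy.items():
    print(f"K={K:3d}  redundancy={red:.5f}")
```

Because the state can only take K values, the estimate is quantized and noisy, and that residual estimation error is exactly what the redundancy bounds in terms of K quantify.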
