Abstract
Suppose we have a memory storing 0s and 1s and we want to estimate the frequency of 1s by sampling. We want to do this I/O-efficiently, exploiting that each read gives a block of B bits at unit cost, not just one bit. If the input consists of uniform blocks, either all 1s or all 0s, then sampling a whole block at a time does not reduce the number of samples needed for estimation. On the other hand, if the bits are randomly permuted, then reading a block of B bits is as good as taking B independent bit samples. However, we do not want to make any such assumptions on the input. Instead, our goal is an algorithm with instance-dependent performance guarantees that stops sampling blocks as soon as we know we have a probabilistically reliable estimate. We prove our algorithms to be instance-optimal among algorithms oblivious to the order of the blocks, which we argue is the strongest form of instance optimality we can hope for. We also present similar results for I/O-efficiently estimating the mean with both additive and multiplicative error, estimating histograms and quantiles, as well as the empirical cumulative distribution function.

We obtain the above results on I/O-efficient sampling by reducing to corresponding problems in so-called sequential estimation. In this setting, one samples from an unknown distribution until one can provide an estimate with some desired error probability. Sequential estimation has been studied extensively in statistics over the past century. However, the focus has mostly been on parametric estimation, which makes stringent assumptions on the distribution of the input and is thus not useful for our reduction. In this paper, we make no assumptions on the input distribution (apart from its support being a bounded set).
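To illustrate the two extremes described above, the following small Python simulation (not from the paper; all names and parameters are our own choices for illustration) compares block sampling on an input of uniform blocks against the same bits randomly permuted. In the uniform case each block read yields one effective sample, while after permutation a block read is worth roughly B independent bit samples, so the estimator's standard deviation shrinks by about a factor of sqrt(B):

```python
import random

def sample_blocks(bits, B, k, rng):
    """Read k random blocks of B bits each and return the fraction of 1s seen."""
    n_blocks = len(bits) // B
    seen_ones, seen_total = 0, 0
    for _ in range(k):
        j = rng.randrange(n_blocks)
        seen_ones += sum(bits[j * B:(j + 1) * B])
        seen_total += B
    return seen_ones / seen_total

rng = random.Random(0)
B, n_blocks, k, trials = 64, 1000, 20, 500

# Case 1: uniform blocks -- each block is all 1s or all 0s.
uniform = []
for _ in range(n_blocks):
    uniform.extend([rng.randrange(2)] * B)

# Case 2: the very same bits, randomly permuted.
permuted = uniform[:]
rng.shuffle(permuted)

def empirical_std(bits):
    """Standard deviation of the block-sampling estimator over many trials."""
    ests = [sample_blocks(bits, B, k, rng) for _ in range(trials)]
    m = sum(ests) / trials
    return (sum((e - m) ** 2 for e in ests) / trials) ** 0.5

# With uniform blocks, k block reads behave like k bit samples; after
# permutation, the same k reads behave like roughly k*B bit samples.
print(empirical_std(uniform), empirical_std(permuted))
```

In this toy setup the permuted input yields a markedly smaller estimation error for the same number of block reads, which is exactly why an instance-dependent stopping rule can save I/Os on favorable inputs while remaining safe on adversarial ones.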
Namely, we provide non-parametric instance-optimal results for several fundamental problems: mean and quantile estimation, as well as learning mixture distributions with respect to \ell_{\infty} and the so-called Kolmogorov-Smirnov distance. All our algorithms are simple, natural, and practical, and some are even known from other contexts, e.g., from statistics in the parametric setting. The main technical difficulty lies in analyzing them and proving that they are instance optimal.