Advice lower bounds for the dense model theorem

Abstract

We prove a lower bound on the amount of nonuniform advice needed by black-box reductions for the Dense Model Theorem of Green, Tao, and Ziegler, and of Reingold, Trevisan, Tulsiani, and Vadhan. The latter theorem roughly says that for every distribution D that is δ-dense in a distribution that is ε′-indistinguishable from uniform, there exists a dense model for D, that is, a distribution that is δ-dense in the uniform distribution and is ε-indistinguishable from D. This ε-indistinguishability is with respect to an arbitrary small class of functions F. For the natural case where ε′ ≥ Ω(εδ) and ε ≥ δ^O(1), our lower bound implies that Ω(√((1/ε)·log(1/δ)) · log|F|) advice bits are necessary. There is only a polynomial gap between our lower bound and the best upper bound for this case (due to Zhang), which is O((1/ε²)·log(1/δ) · log|F|). Our lower bound can be viewed as an analog of list size lower bounds for list-decoding of error-correcting codes, but for dense model decoding instead. Our proof introduces some new techniques which may be of independent interest, including an analysis of a majority of majorities of p-biased bits. The latter analysis uses an extremely tight lower bound on the tail of the binomial distribution, which we could not find in the literature.
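
For reference, the two notions the abstract relies on can be written out as follows. This is a minimal sketch in generic notation (distributions D and M, class F, and parameters δ and ε as above); it does not reproduce the paper's exact formalization of black-box reductions and advice.

% D is δ-dense in M (every outcome is at most 1/δ times likelier under D than under M):
\[ \forall x:\ \Pr_{D}[x] \;\le\; \Pr_{M}[x] / \delta . \]
% D and M are ε-indistinguishable with respect to F (no f in F distinguishes them with advantage more than ε):
\[ \forall f \in F:\ \bigl|\, \mathrm{E}_{x \sim D}[f(x)] - \mathrm{E}_{x \sim M}[f(x)] \,\bigr| \;\le\; \varepsilon . \]

In these terms, the theorem says that if D is δ-dense in some distribution that is ε′-indistinguishable from uniform, then there exists a model M that is δ-dense in the uniform distribution and ε-indistinguishable from D, for ε′ degrading roughly as Ω(εδ).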

Publication Title

Leibniz International Proceedings in Informatics, LIPIcs
