Advice lower bounds for the dense model theorem
Abstract
We prove a lower bound on the amount of nonuniform advice needed by black-box reductions for the Dense Model Theorem of Green, Tao, and Ziegler, and of Reingold, Trevisan, Tulsiani, and Vadhan. The latter theorem roughly says that for every distribution D that is δ-dense in a distribution that is ε′-indistinguishable from uniform, there exists a "dense model" for D, that is, a distribution that is δ-dense in the uniform distribution and is ε-indistinguishable from D. This indistinguishability is with respect to an arbitrary small class of functions F. For the natural case where ε′ ≥ Ω(εδ) and ε ≥ δ^{O(1)}, our lower bound implies that Ω(√((1/ε) log(1/δ)) · log|F|) advice bits are necessary for a certain type of reduction that establishes a stronger form of the Dense Model Theorem (and which encompasses all known proofs of the Dense Model Theorem in the literature). There is only a polynomial gap between our lower bound and the best upper bound for this case (due to Zhang), which is O((1/ε²) log(1/δ) · log|F|). Our lower bound can be viewed as an analogue of list size lower bounds for list-decoding of error-correcting codes, but for "dense model decoding" instead.
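For reference, the quantitative content of the abstract can be summarized in a short LaTeX display. This is a paraphrase of the bounds quoted above, not the paper's formal theorem statement; the parameters ε, ε′, δ and the class F are as described in the abstract.

% Schematic summary of the parameter regime and advice bounds quoted in the abstract
% (a paraphrase, not the paper's formal theorem statement).
% D is δ-dense in a distribution that is ε′-indistinguishable from uniform,
% and the dense model is δ-dense in uniform and ε-indistinguishable from D,
% with indistinguishability measured against the class F.
\[
  \text{Regime:}\quad \epsilon' \ge \Omega(\epsilon\delta), \qquad \epsilon \ge \delta^{O(1)}.
\]
\[
  \underbrace{\Omega\!\left(\sqrt{\tfrac{1}{\epsilon}\log\tfrac{1}{\delta}}\cdot\log|\mathcal{F}|\right)}_{\text{lower bound (this work)}}
  \;\le\; \#\text{ advice bits} \;\le\;
  \underbrace{O\!\left(\tfrac{1}{\epsilon^{2}}\log\tfrac{1}{\delta}\cdot\log|\mathcal{F}|\right)}_{\text{upper bound (Zhang)}}
\]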
Publication Title
ACM Transactions on Computation Theory
Recommended Citation
Watson, T. (2014). Advice lower bounds for the dense model theorem. ACM Transactions on Computation Theory, 7(1). https://doi.org/10.1145/2676659