Abstract.
We prove that the concept class of disjunctions cannot be pointwise approximated by linear combinations of any small set of arbitrary real-valued functions. That is, suppose there exist functions \(\phi_{1}, \ldots, \phi_{r} \colon \{-1,1\}^{n} \to \mathbb{R}\) with the property that every disjunction \(f\) on \(n\) variables has \(\|f - \sum_{i=1}^{r} \alpha_{i}\phi_{i}\|_{\infty} \leq 1/3\) for some reals \(\alpha_{1}, \ldots, \alpha_{r}\). We prove that then \(r \geq \exp\{\Omega(\sqrt{n})\}\), which is tight. We prove an incomparable lower bound for the concept class of decision lists. For the concept class of majority functions, we obtain a lower bound of \(\Omega(2^{n}/n)\), which almost meets the trivial upper bound of \(2^{n}\) for any concept class. These lower bounds substantially strengthen and generalize the polynomial approximation lower bounds of Paturi (1992) and show that the regression-based agnostic learning algorithm of Kalai et al. (2005) is optimal.
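To make the quantity being bounded concrete, the following minimal Python sketch measures how well disjunctions on a small cube are pointwise approximated by a span of basis functions. The choices here are illustrative assumptions, not the paper's construction: we take \(n = 3\), encode TRUE as \(-1\) on the domain \(\{-1,1\}^{n}\), and use the Fourier characters of bounded degree as the \(\phi_i\). A least-squares fit only upper-bounds the optimal \(\ell_\infty\) error (the true optimum would need a linear program), but it suffices to see the error vanish once the basis spans all functions on the cube.

```python
import itertools
import numpy as np

n = 3
# Domain {-1, 1}^n; convention (an assumption of this sketch): -1 encodes TRUE.
points = np.array(list(itertools.product([-1, 1], repeat=n)))  # shape (2^n, n)

def disjunction(subset, x):
    # OR over the variables in `subset`: output -1 (TRUE) if any x_i is -1.
    return -1 if np.any(x[list(subset)] == -1) else 1

# All nonempty disjunctions on n variables, as value vectors over the cube.
disjunctions = []
for k in range(1, n + 1):
    for subset in itertools.combinations(range(n), k):
        disjunctions.append(np.array([disjunction(subset, x) for x in points]))

def char_basis(degree):
    # Fourier characters chi_S(x) = prod_{i in S} x_i for all |S| <= degree.
    cols = []
    for k in range(degree + 1):
        for S in itertools.combinations(range(n), k):
            cols.append(np.prod(points[:, list(S)], axis=1)
                        if S else np.ones(len(points)))
    return np.column_stack(cols)

def max_linf_error(degree):
    # Least-squares coefficients give an *upper bound* on the best pointwise
    # (l_inf) error achievable over the span of the chosen basis.
    Phi = char_basis(degree)
    worst = 0.0
    for f in disjunctions:
        alpha, *_ = np.linalg.lstsq(Phi, f.astype(float), rcond=None)
        worst = max(worst, float(np.max(np.abs(Phi @ alpha - f))))
    return worst

errors = [max_linf_error(d) for d in range(n + 1)]
print(errors)  # the degree-n basis spans all functions, so the last error is ~0
```

With the full degree-\(n\) basis (\(r = 2^{n}\) characters) every disjunction is represented exactly; the theorem says that driving the error below \(1/3\) with *any* basis forces \(r \geq \exp\{\Omega(\sqrt{n})\}\).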
Manuscript received 26 January 2008
Klivans, A.R., Sherstov, A.A. Lower Bounds for Agnostic Learning via Approximate Rank. comput. complex. 19, 581–604 (2010). https://doi.org/10.1007/s00037-010-0296-y