Abstract
In response to a 1997 problem of M. Vidyasagar, we state a necessary and sufficient condition for distribution-free PAC learnability of a concept class 𝒞 under the family of all non-atomic (diffuse) measures on the domain Ω. Clearly, finiteness of the classical Vapnik–Chervonenkis dimension of 𝒞 is a sufficient, but no longer necessary, condition. Moreover, learnability of 𝒞 under non-atomic measures does not imply the uniform Glivenko–Cantelli property with regard to non-atomic measures. Our learnability criterion is stated in terms of a combinatorial parameter which we call the VC dimension of 𝒞 modulo countable sets. The new parameter is obtained by “thickening up” single points in the definition of VC dimension to uncountable “clusters”. Equivalently, the new parameter is ≤ d if and only if every countable subclass of 𝒞 has VC dimension ≤ d outside a countable subset of Ω. The new parameter can also be expressed as the classical VC dimension of 𝒞 calculated on a suitable subset of a compactification of Ω. We do not make any measurability assumptions on 𝒞, assuming instead the validity of Martin’s Axiom (MA).
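The classical notion underlying the new parameter — a finite set is shattered when every one of its subsets is cut out by some concept, and the VC dimension is the largest size of a shattered set — can be made concrete with a small brute-force sketch. This is an illustration of the classical definition only, not of the paper’s construction; the helper names `shatters` and `vc_dimension` are ours, not the paper’s.

```python
from itertools import combinations

def shatters(concepts, subset):
    """True iff `concepts` shatters `subset`: every one of the
    2**|subset| possible traces on `subset` is realized by some concept."""
    traces = {frozenset(c) & subset for c in concepts}
    return len(traces) == 2 ** len(subset)

def vc_dimension(concepts, domain):
    """Largest k such that some k-point subset of `domain` is shattered.
    Brute force and exponential in |domain| -- for illustration only."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(concepts, frozenset(s)) for s in combinations(domain, k)):
            d = k
        else:
            break
    return d

# Initial segments {0, ..., i-1} on a 4-point domain have VC dimension 1:
# each singleton {a} is shattered, but for a < b no concept traces {b} alone.
initial_segments = [frozenset(range(i)) for i in range(5)]
print(vc_dimension(initial_segments, list(range(4))))  # 1
```

The paper’s parameter replaces the single points of the shattered set by uncountable “clusters”, so no such finite computation applies; the sketch only fixes the classical baseline being generalized.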
References
Arveson, W.: An Invitation to C*-Algebras. Graduate Texts in Mathematics, vol. 39. Springer, Heidelberg (1976)
Blumer, A., Ehrenfeucht, A., Haussler, D., Warmuth, M.K.: Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM 36(4), 865–929 (1989)
Dudley, R.M.: Uniform Central Limit Theorems. In: Cambridge Studies in Advanced Mathematics, Cambridge University Press, Cambridge (1999)
Fremlin, D.H.: Consequences of Martin’s Axiom. Cambridge Tracts in Mathematics, vol. 84. Cambridge University Press, Cambridge (1984)
Jech, T.: Set theory. Academic Press, New York (1978)
Johnstone, P.T.: Stone Spaces. Cambridge Studies in Advanced Mathematics, vol. 3. Cambridge University Press, Cambridge (1986), Reprint of the 1982 edition
Kunen, K.: Set Theory. North-Holland, Amsterdam (1980)
Mendelson, S.: A few notes on statistical learning theory. In: Mendelson, S., Smola, A.J. (eds.) Advanced Lectures on Machine Learning. LNCS (LNAI), vol. 2600, pp. 1–40. Springer, Heidelberg (2003)
Pollard, D.: Convergence of Stochastic Processes. Springer, New York (1984)
Talagrand, M.: The Glivenko-Cantelli problem, ten years later. J. Theoret. Probab. 9, 371–384 (1996)
Vapnik, V.N., Chervonenkis, A.Y.: On the uniform convergence of relative frequencies of events to their probabilities. Theory Probab. Appl. 16(2), 264–280 (1971)
Vidyasagar, M.: A theory of learning and generalization. With applications to neural networks and control systems. Communications and Control Engineering Series. Springer, London (1997)
Vidyasagar, M.: Learning and Generalization, with Applications to Neural Networks, 2nd edn. Springer, Heidelberg (2003)
© 2010 Springer-Verlag Berlin Heidelberg
Pestov, V. (2010). PAC Learnability of a Concept Class under Non-atomic Measures: A Problem by Vidyasagar. In: Hutter, M., Stephan, F., Vovk, V., Zeugmann, T. (eds.) Algorithmic Learning Theory. ALT 2010. Lecture Notes in Computer Science, vol. 6331. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16108-7_14
Print ISBN: 978-3-642-16107-0
Online ISBN: 978-3-642-16108-7