Abstract
Developed only recently, support vector learning machines achieve high generalization ability by minimizing a bound on the expected test error; however, until now there has been no way of adding knowledge about invariances of the classification problem at hand. We present a method for incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.
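As a rough illustration of the virtual support vector procedure described in the abstract, the sketch below trains an SVM, applies small transformations to its support vectors, and retrains on the enlarged set. This is a sketch under stated assumptions, not the authors' original implementation (which predates scikit-learn): the use of `sklearn.svm.SVC`, the one-pixel image shifts as the invariance, the 16×16 image shape, and the polynomial kernel are all illustrative choices.

```python
# Sketch of the "virtual support vector" idea from the abstract:
# 1. train an SVM, 2. collect its support vectors,
# 3. apply invariance transformations (here: one-pixel shifts),
# 4. retrain on the original data plus the transformed support vectors.
# scikit-learn, the shift transformation, and the 16x16 image shape are
# illustrative assumptions, not the authors' original implementation.
import numpy as np
from sklearn.svm import SVC

def shift_image(x, dx, dy, shape=(16, 16)):
    """Translate a flattened image by (dx, dy) pixels (toy invariance)."""
    img = x.reshape(shape)
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1).ravel()

def train_with_virtual_svs(X, y, shifts=((1, 0), (-1, 0), (0, 1), (0, -1))):
    base = SVC(kernel="poly", degree=3).fit(X, y)   # initial machine
    sv_idx = base.support_                          # indices of the support vectors
    # Generate virtual examples from the support vectors only.
    X_virtual = np.array([shift_image(X[i], dx, dy)
                          for i in sv_idx for dx, dy in shifts])
    y_virtual = np.repeat(y[sv_idx], len(shifts))   # labels are invariant
    X_aug = np.vstack([X, X_virtual])               # enlarged training set
    y_aug = np.concatenate([y, y_virtual])
    return SVC(kernel="poly", degree=3).fit(X_aug, y_aug)
```

Transforming only the support vectors, rather than every training example, keeps the enlarged training set small while targeting exactly the examples that determine the classification boundary.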
Copyright information
© 1996 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Schölkopf, B., Burges, C., Vapnik, V. (1996). Incorporating invariances in support vector learning machines. In: von der Malsburg, C., von Seelen, W., Vorbrüggen, J.C., Sendhoff, B. (eds) Artificial Neural Networks — ICANN 96. ICANN 1996. Lecture Notes in Computer Science, vol 1112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-61510-5_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-61510-1
Online ISBN: 978-3-540-68684-2