Abstract
The support vector domain description is a one-class classification method that estimates the shape and extent of the distribution of a data set. It separates the data into outliers, which lie outside the decision boundary, and inliers, which lie inside it. The method bears close resemblance to the two-class support vector machine classifier. Recently, it was shown that the regularization path of the support vector machine is piecewise linear, and that the entire path can be computed efficiently. This paper shows that this property carries over to the support vector domain description. Using our results, the solution to the one-class classification problem can be obtained for any amount of regularization with roughly the same computational complexity required to solve for a single particular value of the regularization parameter. The possibility of evaluating the results for any amount of regularization not only offers more accurate and reliable models, but also paves the way for new applications. We illustrate the potential of the method by determining the order of inclusion in the model for a set of corpora callosa outlines.
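The paper's contribution is computing the exact solution for all regularization values at once; a common approximation in practice is to refit a one-class classifier over a grid of regularization values and compare the resulting boundaries. The sketch below illustrates this grid approach (not the authors' path algorithm) using scikit-learn's `OneClassSVM`, where the `nu` parameter plays the role of the regularization parameter; the toy data and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))  # toy 2-D data set (illustrative)

# Approximate a regularization "path" by refitting on a coarse grid of nu.
# nu upper-bounds the fraction of training points allowed outside the boundary.
inlier_counts = {}
for nu in (0.05, 0.25, 0.5):
    clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=nu).fit(X)
    inlier_counts[nu] = int((clf.predict(X) == 1).sum())  # +1 = inlier

# Stronger regularization (larger nu) shrinks the boundary, so fewer
# points remain inliers; the exact path method would give this for all nu.
print(inlier_counts)
```

Each grid point here costs a full solve; the piecewise-linear path result in the paper means the whole continuum of solutions can be traced for roughly the cost of one such fit.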
Keywords
- Support Vector Machine
- Regularization Parameter
- Mahalanobis Distance
- Decision Boundary
- Regularization Path
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Sjöstrand, K., Larsen, R. (2006). The Entire Regularization Path for the Support Vector Domain Description. In: Larsen, R., Nielsen, M., Sporring, J. (eds) Medical Image Computing and Computer-Assisted Intervention – MICCAI 2006. MICCAI 2006. Lecture Notes in Computer Science, vol 4190. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11866565_30
Print ISBN: 978-3-540-44707-8
Online ISBN: 978-3-540-44708-5