Abstract
This paper proposes a method for extracting features from the empirical kernel vector. We derive a necessary condition on the feature extraction mapping under which a classifier trained by a linear SVM on the extracted feature vectors is equivalent to one obtained by the standard kernel SVM. The proposed mapping is defined by the eigenvalues and eigenvectors of the Gram matrix. Since the eigenvector problem of the Gram matrix is closely related to kernel Principal Component Analysis, the method can extract a dimension-reduced feature vector, and it becomes equivalent to the kernel SVM when the full dimension is used. The proposed method was evaluated in experiments on standard data sets: the cross-validation scores of the proposed method improved, the recognition rates were comparable with those of the original kernel SVM, and the number of extracted features was much smaller than the number of features used by the kernel SVM.
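The construction described in the abstract can be sketched in a few lines of NumPy. This is a hedged illustration, not the authors' implementation: it assumes an RBF kernel and a small eigenvalue cutoff, and the names (`rbf_gram`, `gamma`, the `1e-10` threshold) are choices made here for the example. It builds the feature map z = Λ^(-1/2) Vᵀ k(x) from the eigendecomposition K = V Λ Vᵀ of the Gram matrix, so that at full dimension the linear kernel on the extracted features reproduces K exactly, which is the equivalence to the kernel SVM claimed above.

```python
import numpy as np

def rbf_gram(X, Y, gamma=0.5):
    # Pairwise squared distances -> RBF kernel values k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))           # toy training data

K = rbf_gram(X, X)                     # Gram matrix of the training set
lam, V = np.linalg.eigh(K)             # eigendecomposition K = V diag(lam) V^T
keep = lam > 1e-10                     # dropping small eigenvalues here is the dimension reduction
lam, V = lam[keep], V[:, keep]

# Extracted training features: row i of Z is diag(lam)^{1/2} V^T e_i, so Z Z^T = K.
Z = V * np.sqrt(lam)

# A linear kernel on the extracted features reproduces the original Gram matrix,
# so a linear SVM on Z matches a kernel SVM on X when the full dimension is kept.
print(np.allclose(Z @ Z.T, K))         # True

# A new point enters through its empirical kernel vector k(x) = [k(x, x_1), ..., k(x, x_n)],
# mapped by the same transform: z = diag(lam)^{-1/2} V^T k(x).
x_new = rng.normal(size=(1, 3))
k_vec = rbf_gram(x_new, X)             # shape (1, n)
z_new = k_vec @ V / np.sqrt(lam)       # extracted feature for the new point
```

Truncating more eigenvalues shrinks the feature dimension at the cost of approximating K, which mirrors the trade-off the abstract reports: far fewer features with comparable recognition rates.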
© 2014 Springer International Publishing Switzerland
Kurita, T., Harashima, Y. (2014). Extraction of Dimension Reduced Features from Empirical Kernel Vector. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8835. Springer, Cham. https://doi.org/10.1007/978-3-319-12640-1_2
DOI: https://doi.org/10.1007/978-3-319-12640-1_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-12639-5
Online ISBN: 978-3-319-12640-1