Abstract
Subspace clustering (SC) achieves excellent clustering results by learning an affinity matrix and then applying spectral clustering to it, partitioning the data points into their underlying subspaces. A key challenge shared by SC methods is constructing a suitable affinity matrix from the given data. Sparse subspace clustering (SSC) learns the affinity matrix by minimizing the l1 norm of the representation coefficients, while the log-determinant approximation to rank (CLAR) learns it by replacing the rank function with a surrogate, the log-determinant (LogDet) function. In this paper, we propose an improved SSC algorithm that combines the LogDet function with the Frobenius norm. The improved algorithm strengthens the grouping effect and yields a block-diagonal solution, especially when the database is large. Experiments on a synthetic database, the Hopkins155 database, and the Extended Yale B database demonstrate that the proposed approach achieves better clustering results.
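The pipeline the abstract describes — learn a self-expressive coefficient matrix, symmetrize it into an affinity, then spectrally embed — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: it uses only the Frobenius-norm (least-squares regression) term, which admits a closed-form solution, as a stand-in for the full LogDet-plus-Frobenius objective, and all function names are ours.

```python
import numpy as np

def self_expressive_affinity(X, lam=0.1):
    """Frobenius-norm-regularized self-representation (LSR-style):
    min_C ||X - X C||_F^2 + lam ||C||_F^2, with the closed form
    C = (X^T X + lam I)^{-1} X^T X. A simplified stand-in for the
    paper's LogDet + Frobenius objective, which has no closed form."""
    n = X.shape[1]
    G = X.T @ X
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)           # discard trivial self-representation
    return np.abs(C) + np.abs(C).T     # symmetric affinity matrix

def spectral_embed(W, k):
    """Rows = bottom-k eigenvectors of the normalized Laplacian."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    return vecs[:, :k]

# Toy data: 20 points on each of two orthogonal lines (1-D subspaces) in R^3.
rng = np.random.default_rng(0)
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)
X = np.hstack([np.outer(u, rng.uniform(1, 2, 20)),
               np.outer(v, rng.uniform(1, 2, 20))])

W = self_expressive_affinity(X)
Y = spectral_embed(W, 2)
# A clustering step such as k-means on the rows of Y would then
# assign each point to its subspace.
```

Because the two toy subspaces are orthogonal, the learned coefficient matrix is block diagonal, which is exactly the structure the spectral step exploits; the paper's grouping-effect argument is about preserving this block structure on real, noisy data.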
References
Vidal, R.: Subspace clustering. IEEE Signal Process. Mag. 28(2), 52–68 (2011)
Ng, A., Weiss, Y., Jordan, M.: On spectral clustering: analysis and an algorithm. In: Advances in Neural Information Processing Systems, vol. 2, pp. 849–856 (2002)
Elhamifar, E., Vidal, R.: Sparse subspace clustering: algorithm, theory, and applications. IEEE Trans. Pattern Anal. Mach. Intell. 35(11), 2765–2781 (2013)
Liu, G., Lin, Z., Yu, Y.: Robust subspace segmentation by low-rank representation. In: International Conference on Machine Learning (ICML), pp. 663–670 (2010)
Kang, Z., Peng, C., Cheng, Q.: Robust subspace clustering via smoothed rank approximation. IEEE Signal Process. Lett. 22(11), 2088–2092 (2015)
Lu, C.-Y., Min, H., Zhao, Z.-Q., Zhu, L., Huang, D.-S., Yan, S.: Robust and efficient subspace segmentation via least squares regression. In: Fitzgibbon, A., Lazebnik, S., Perona, P., Sato, Y., Schmid, C. (eds.) Computer Vision – ECCV 2012. Lecture Notes in Computer Science, vol. 7578, pp. 347–360. Springer, Berlin (2012)
Wu, Z., Yin, M., Zhou, Y.: Robust spectral subspace clustering based on least square regression. Neural Process. Lett. 48, 1359–1372 (2018)
Hu, H., Lin, Z., Feng, J., Zhou, J.: Smooth representation clustering. In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3834–3841 (2014)
Zhang, S., Li, Y., Cheng, D.: Efficient subspace clustering based on self-representation and grouping effect. Neural Comput. Appl. 29, 51–59 (2018)
Kang, Z., Peng, C., Cheng, J., Cheng, Q.: Logdet rank minimization with application to subspace clustering. Comput. Intell. Neurosci. 2015, 68 (2015)
© 2020 Springer Nature Switzerland AG
Cite this paper
Yu, Q., Zhang, Y., Sun, C. (2020). Efficient Subspace Clustering Based on Enhancing Local Structure and Global Structure. In: Xhafa, F., Patnaik, S., Tavana, M. (eds) Advances in Intelligent Systems and Interactive Applications. IISA 2019. Advances in Intelligent Systems and Computing, vol 1084. Springer, Cham. https://doi.org/10.1007/978-3-030-34387-3_45
DOI: https://doi.org/10.1007/978-3-030-34387-3_45
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-34386-6
Online ISBN: 978-3-030-34387-3
eBook Packages: Intelligent Technologies and Robotics (R0)