Abstract
A new manifold learning method, called the incremental alignment method (IAM), is proposed for nonlinear dimensionality reduction of high-dimensional data with intrinsic low dimensionality. The main idea is to incrementally align the low-dimensional coordinates of the input data patch by patch, iteratively generating a representation of the entire dataset. The method consists of two major steps: the incremental step and the alignment step. The incremental step searches for the next neighborhood patch to be aligned, and the alignment step iteratively aligns the low-dimensional coordinates of that patch with the embedding generated so far. Compared with existing manifold learning methods, the proposed method has several advantages: high efficiency, easy out-of-sample extension, good metric preservation, and avoidance of the local-minima issue. All these properties are supported by a series of experiments performed on synthetic and real-life datasets. In addition, the computational complexity of the proposed method is analyzed, and its efficiency is theoretically argued and experimentally demonstrated.
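The abstract describes IAM only at a high level, so the following is merely an illustrative sketch of the general patch-by-patch alignment idea, not the authors' algorithm: it assumes local PCA for per-patch coordinates and a Procrustes fit on the overlap with already-embedded points, with patches taken as overlapping index windows.

```python
import numpy as np

def local_pca(patch, d=2):
    """Project one patch of high-dimensional points onto its top-d principal axes."""
    centered = patch - patch.mean(axis=0)
    # Right singular vectors span the local tangent space of the patch.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:d].T

def procrustes_fit(src, dst):
    """Orthogonal transform R and translation t mapping src onto dst (least squares)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    u, _, vt = np.linalg.svd((src - mu_s).T @ (dst - mu_d))
    r = u @ vt
    return r, mu_d - mu_s @ r

def incremental_alignment(x, d=2, patch_size=10):
    """Embed points patch by patch: compute local PCA coordinates for each
    patch, then rigidly align them to the coordinates fixed so far using
    the points the patch shares with the existing embedding."""
    n = x.shape[0]
    y = np.zeros((n, d))
    embedded = np.zeros(n, dtype=bool)
    # Seed the embedding with the first patch's local coordinates.
    idx = np.arange(patch_size)
    y[idx] = local_pca(x[idx], d)
    embedded[idx] = True
    start = patch_size // 2  # each new window overlaps half the previous one
    while not embedded.all():
        idx = np.arange(start, min(start + patch_size, n))
        coords = local_pca(x[idx], d)
        overlap = embedded[idx]
        # Align the patch to the already-fixed coordinates on the overlap.
        r, t = procrustes_fit(coords[overlap], y[idx[overlap]])
        new = ~embedded[idx]
        y[idx[new]] = coords[new] @ r + t
        embedded[idx] = True
        start += patch_size // 2
    return y
```

For data that actually lies on a linear patchwork (e.g., a plane isometrically embedded in a higher-dimensional space), this stitching recovers the planar coordinates up to a global rigid motion; on a curved manifold it only approximates the metric, which is where the iterative refinement described in the paper comes in.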
Additional information
This work was supported by the National Basic Research 973 Program of China under Grant No. 2007CB311002 and the National Natural Science Foundation of China under Grant No. 60905003.
Cite this article
Han, Z., Meng, DY., Xu, ZB. et al. Incremental Alignment Manifold Learning. J. Comput. Sci. Technol. 26, 153–165 (2011). https://doi.org/10.1007/s11390-011-9422-9