Abstract
Marginal Fisher Analysis (MFA) was introduced to remedy some of the shortcomings of Fisher Discriminant Analysis (FDA): it performs local discrimination between classes. However, when the training set is small, MFA cannot be applied directly to the original high-dimensional samples. This is the small sample size (SSS) problem, which arises whenever the feature dimension exceeds the number of training examples and renders the scatter matrices singular. The classic remedy is to first project the raw data onto a lower-dimensional subspace, e.g., via Principal Component Analysis (PCA). This paper introduces two regularization schemes that overcome the singularity and near-singularity of the locality-preserving scatter matrices. The first scheme uses ridge regression regularization; the second uses the matrix exponential and introduces an implicit distance diffusion mapping. Experiments on four face data sets demonstrate that the proposed schemes improve the performance of the MFA framework well beyond the widely used PCA-based regularization.
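The two regularization ideas named in the abstract can be sketched on a toy generalized eigenproblem. This is a minimal illustration, not the authors' exact formulation: the matrices `S_w` and `S_b` below are random stand-ins for the locality-preserving scatters, and the regularization parameter `lam` is an arbitrary choice. The ridge scheme shifts the singular scatter by a scaled identity; the matrix-exponential scheme replaces each scatter by its exponential, which is always symmetric positive definite because its eigenvalues are exp(mu_i) > 0 even when mu_i = 0.

```python
import numpy as np
from scipy.linalg import eigh, expm

rng = np.random.default_rng(0)

# Toy SSS setting: feature dimension d exceeds sample count n,
# so the d x d scatter matrices are rank-deficient (singular).
n, d = 20, 50
X = rng.standard_normal((n, d))
S_w = X.T @ X / n  # stand-in for the (within-locality) scatter; rank <= n < d
S_b = np.cov(rng.standard_normal((n, d)), rowvar=False)  # stand-in penalty scatter

# Scheme 1 (ridge-style regularization): shift the singular scatter
# by lam * I so the generalized eigenproblem becomes well-posed.
lam = 1e-2
evals, evecs = eigh(S_b, S_w + lam * np.eye(d))

# Scheme 2 (matrix exponential): exp(S) is symmetric positive definite
# for any symmetric S, so no explicit shift parameter is needed.
evals_e, evecs_e = eigh(expm(S_b), expm(S_w))
```

Without either fix, `eigh(S_b, S_w)` would fail (or be numerically meaningless) here, since `S_w` has rank at most `n = 20` in a 50-dimensional space.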
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Dornaika, F., Bosagzadeh, A. (2013). On Solving the Small Sample Size Problem for Marginal Fisher Analysis. In: Kamel, M., Campilho, A. (eds) Image Analysis and Recognition. ICIAR 2013. Lecture Notes in Computer Science, vol 7950. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39094-4_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-39093-7
Online ISBN: 978-3-642-39094-4