Improved Fuzzy Local Mean Discriminant Analysis via Iterative Optimization for Feature Transformation and Classification
Subject Areas: Information Technology in Engineering Design (ITED) Journal
Saeed Madani 1, Mohammad Hossein Moattar 2, Yahya Forghani 3
1 - Department of Computer Engineering, Mashhad Branch, Islamic Azad University, Mashhad, Iran
2 -
3 - Islamic Azad University, Mashhad Branch, Mashhad, Iran
Keywords: Fuzzy Local Mean Discriminant Analysis (FLMDA), Linear Discriminant Analysis (LDA), Dimensionality Reduction, Feature Extraction, Classification
Abstract:
Fuzzy Local Mean Discriminant Analysis (FLMDA) is a supervised dimensionality reduction method that uses local neighborhood information to construct the between-class and within-class scatter matrices. However, after the data are transformed by FLMDA, the nearest-neighbor lists computed in the original space may no longer hold in the projected space, which can degrade classification performance. In the proposed method, the feature extraction step is therefore repeated using the neighbor lists recomputed after each transformation, and this process continues until convergence. As a result, local structure is expected to be preserved as much as possible while the local discrimination between instances of different classes is increased. Experiments on several University of California, Irvine (UCI) benchmark datasets show that the proposed method outperforms similar methods.
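
The following Python sketch illustrates how the iterative procedure described in the abstract could be organized: an FLMDA-style projection is fitted, the neighbor lists are recomputed in the projected space, the local-mean scatters are rebuilt, and the loop repeats until the projection stops changing. It is a minimal illustration under stated assumptions, not the authors' implementation: the inverse-distance fuzzy weights, the pinv(Sw)Sb eigen-solution, and the subspace-change stopping test are simplifications, and the names iterative_flmda and _local_scatters are hypothetical.

import numpy as np

def _local_scatters(X, y, Z, k):
    # Build FLMDA-style local-mean scatters. Neighbors are found in the
    # current projected space Z, but the scatters are accumulated in the
    # original space X, so the next projection is learned from the
    # updated neighborhood lists.
    n, d = X.shape
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for i in range(n):
        dist = np.linalg.norm(Z - Z[i], axis=1)
        same = np.where(y == y[i])[0]
        same = same[same != i]
        nn_w = same[np.argsort(dist[same])[:k]]   # k nearest same-class neighbors
        if nn_w.size == 0:
            continue
        w = 1.0 / (1.0 + dist[nn_w])              # simple fuzzy weights (assumption)
        m_w = (w[:, None] * X[nn_w]).sum(axis=0) / w.sum()
        diff = (X[i] - m_w)[:, None]
        Sw += diff @ diff.T                       # within-class local-mean scatter
        for c in np.unique(y):
            if c == y[i]:
                continue
            other = np.where(y == c)[0]
            nn_b = other[np.argsort(dist[other])[:k]]
            w = 1.0 / (1.0 + dist[nn_b])
            m_b = (w[:, None] * X[nn_b]).sum(axis=0) / w.sum()
            diff = (X[i] - m_b)[:, None]
            Sb += diff @ diff.T                   # between-class local-mean scatter
    return Sb, Sw

def iterative_flmda(X, y, n_components=2, k=5, max_iter=20, tol=1e-4):
    # Refit the FLMDA-style projection until the neighbor lists, and
    # hence the projection subspace, stop changing.
    W = np.eye(X.shape[1])[:, :n_components]      # initial projection
    Z = X.copy()                                  # first pass: neighbors in input space
    for _ in range(max_iter):
        Sb, Sw = _local_scatters(X, y, Z, k)
        evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
        order = np.argsort(-evals.real)[:n_components]
        W_new = evecs[:, order].real
        # heuristic convergence test on the spanned subspace
        if np.linalg.norm(W_new @ W_new.T - W @ W.T) < tol:
            W = W_new
            break
        W = W_new
        Z = X @ W                                 # recompute neighbor lists after transform
    return W

if __name__ == "__main__":
    # Toy usage on a standard dataset (scikit-learn assumed available).
    from sklearn.datasets import load_iris
    X, y = load_iris(return_X_y=True)
    W = iterative_flmda(X, y, n_components=2, k=7)
    X_low = X @ W                                 # transformed features for a downstream classifier
    print(X_low.shape)

In a faithful implementation, the fuzzy membership scheme of the original FLMDA formulation would replace the inverse-distance weights used here; the sketch only shows the iterative reuse of post-transformation neighbor lists.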