EEG-based Emotion Recognition Using a Hybrid Fusion Approach
International Journal of Smart Electrical Engineering
Mahdi Khezri 1,*, Shakiba Afsar 2
1 - Department of Electrical Engineering, Na.C., Islamic Azad University, Najafabad, Iran.
2 - Department of Electrical Engineering, Na.C., Islamic Azad University, Najafabad, Iran.
Keywords: Emotion detection, EEG signal, Hybrid fusion, K-means clustering, SVM
Abstract:
In the design of affective computing systems, multiple modalities (physiological and non-physiological) are typically combined with fusion strategies to enhance emotion recognition. However, such multimodal systems inevitably face increased complexity in both data acquisition and information processing. In this work, we propose an emotion recognition system that relies solely on electroencephalogram (EEG) signal features, employing hybrid fusion at both the feature and classification levels. In the first stage, feature fusion is applied to dynamic features extracted from EEG sub-bands across all channels. The resulting feature set is then partitioned into subsets using k-means clustering, with each subset serving as input to an individual classification unit. Finally, the outputs of these classifiers are combined by majority voting to determine the final emotional state. Evaluated on the DEAP dataset with a Support Vector Machine (SVM) classifier, the proposed method achieved 83.8% accuracy in classifying four emotional states. Compared to conventional affective systems evaluated on the same dataset with classical classification approaches, these results represent a significant improvement in recognition performance.
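To make the hybrid fusion pipeline concrete, the following minimal Python sketch (using NumPy and scikit-learn) illustrates one plausible reading of the method described above: k-means partitions the feature dimensions into subsets, one SVM is trained per subset, and the per-subset predictions are fused by majority voting. The synthetic data, the number of clusters (N_CLUSTERS), and the RBF kernel are illustrative assumptions, not the authors' actual configuration; the DEAP preprocessing and dynamic sub-band feature extraction are abstracted into a placeholder feature matrix.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((800, 160))   # trials x fused EEG features (placeholder data)
y = rng.integers(0, 4, size=800)      # four emotional states (placeholder labels)

N_CLUSTERS = 5                        # assumed number of feature subsets

# Feature-level step: k-means groups the feature dimensions (columns of X),
# so clustering runs on the transposed feature matrix.
km = KMeans(n_clusters=N_CLUSTERS, n_init=10, random_state=0).fit(X.T)
subsets = [np.flatnonzero(km.labels_ == c) for c in range(N_CLUSTERS)]

# One SVM per feature subset (RBF kernel is an assumption).
classifiers = [SVC(kernel="rbf").fit(X[:, idx], y) for idx in subsets]

# Decision-level step: majority voting over the per-subset predictions.
def predict(X_new):
    votes = np.stack([clf.predict(X_new[:, idx])
                      for clf, idx in zip(classifiers, subsets)])
    # votes has shape (N_CLUSTERS, n_trials); take the most frequent label per trial
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

print(predict(X[:5]))

An odd number of base classifiers helps avoid ties in the vote; in practice, the number of clusters and the SVM hyperparameters would be tuned on held-out data, for example via grid search as in [23].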
[1] Picard RW, Affective Computing. Cambridge, MA, USA: MIT Press, 1997. doi: 10.7551/mitpress/1140.001.0001.
[2] Afzal S, Khan HA, Piran MJ, Lee JW, "A comprehensive survey on affective computing: Challenges, trends, applications, and future directions" IEEE Access, 2024, vol. 12, pp. 96150–96168, doi: 10.1109/ACCESS.2024.3422480.
[3] Koelstra S, Mühl C, Soleymani M, Lee JS, Yazdani A, Ebrahimi T, "DEAP: A database for emotion analysis using physiological signals," IEEE Transactions on Affective Computing, 2012, vol. 3, no. 1, pp. 18–31, doi: 10.1109/T-AFFC.2011.15.
[4] Pan B, Hirota K, Jia Z, Dai Y, "A review of multimodal emotion recognition from datasets, preprocessing, features, and fusion methods," Neurocomputing, 2023, vol. 561, p. 126866, doi: 10.1016/j.neucom.2023.126866.
[5] Chen K, Jing H, Liu Q, Ai Q, Ma L, "A novel Caps-EEGNet combined with channel selection for EEG-based emotion recognition," Biomedical Signal Processing and Control, 2023, vol. 86, Part C, p. 105312, doi: 10.1016/j.bspc.2023.105312.
[6] Huang H, Deng Y, Hao B, Liu W, Tu X, Zeng G, "Emotion recognition method using U-Net neural network with multichannel EEG features and differential entropy characteristics," IEEE Access, 2025, vol. 13, pp. 59377–59389, doi: 10.1109/ACCESS.2024.3497160.
[7] Bagherzadeh S, Shalbaf A, Shoeibi A, Jafari M, Tan RS, Acharya UR, "Developing an EEG-based emotion recognition using ensemble deep learning methods and fusion of brain effective connectivity maps," IEEE Access, 2024, vol. 12, pp. 50949–50965, doi: 10.1109/ACCESS.2024.3384303.
[8] Li C, Wang Y, Zhang X, Chen L, Li H, Zhang J, "Emotion recognition by learning the manifold of fused multiscale information of EEG signals," IEEE Transactions on Affective Computing, 2025, early access. doi: 10.1109/TAFFC.2025.3555226.
[9] Awan AW, Usman SM, Khalid S, Anwar A, Alroobaea R, Hussain S, Almotiri J, Ullah SS, Akram MU, "An ensemble learning method for emotion charting using multimodal physiological signals," Sensors, 2022, vol. 22, no. 23, p. 9480, doi: 10.3390/s22239480.
[10] Zhang Q, Zhang H, Zhou K, Zhang L, "Developing a physiological signal based mean threshold and decision level fusion algorithm (PMD) for emotion recognition," Tsinghua Science and Technology, 2023, vol. 28, no. 4, pp. 673–685, doi: 10.26599/TST.2022.9010038.
[11] Zhao Y, Cao X, Lin J, Yu D, Cao X, "Multimodal affective states recognition based on multiscale CNNs and biologically inspired decision fusion model," IEEE Transactions on Affective Computing, 2023, vol. 14, no. 2, pp. 1391–1403, doi: 10.1109/TAFFC.2021.3093923.
[12] Sharma A, Kumar A, "DREAM: Deep learning-based recognition of emotions from multiple affective modalities using consumer-grade body sensors and video cameras," IEEE Transactions on Consumer Electronics, 2024, vol. 70, no. 1, pp. 1434–1442, doi: 10.1109/TCE.2023.3325317.
[13] Li A, Wu M, Ouyang R, Wang Y, Li F, Lv Z, "A multimodal-driven fusion data augmentation framework for emotion recognition," IEEE Transactions on Artificial Intelligence, 2025, vol. 6, no. 8, pp. 2083–2097, doi: 10.1109/TAI.2025.3537965.
[14] Kim SH, "Mifu-ER: Modality quality index-based incremental fusion for emotion recognition," IEEE Access, 2025, vol. 13, pp. 112703–112719, doi: 10.1109/ACCESS.2025.3584642.
[15] Gong W, Wang Y, Wu Y, Gao S, Vasilakos AV, Zhang P, "A hybrid fusion model for group-level emotion recognition in complex scenarios," Information Sciences, 2025, vol. 704, p. 121968, doi: 10.1016/j.ins.2025.121968.
[16] Razzaq MA, et al., "A hybrid multimodal emotion recognition framework for UX evaluation using generalized mixture functions," Sensors, 2023, vol. 23, no. 9, p. 4373, doi: 10.3390/s23094373.
[17] Akay M, Nonlinear Biomedical Signal Processing: Dynamic Analysis and Modeling. New York, NY, USA: Wiley-IEEE Press, 2000.
[18] Rodriguez-Bermudez G, Garcia-Laencina PJ, "Analysis of EEG signals using nonlinear dynamics and chaos: A review," Applied Mathematics & Information Sciences, 2015, vol. 9, no. 5, pp. 2309–2321.
[19] Chen L, Wang K, Li M, Wu M, Pedrycz W, Hirota K, "K-means clustering-based kernel canonical correlation analysis for multimodal emotion recognition in human–robot interaction," IEEE Transactions on Industrial Electronics, 2023, vol. 70, no. 1, pp. 1016–1024, doi: 10.1109/TIE.2022.3150097.
[20] Maćkiewicz A, Ratajczak W, "Principal components analysis (PCA)," Computers & Geosciences, 1993, vol. 19, no. 3, pp. 303–342, doi: 10.1016/0098-3004(93)90090-R.
[21] Tran MA, Nguyen LH, "EEG-based emotion recognition using principal component analysis and support vector machine," in Proc. IEEE Int. Conf. Control Autom., Electron., Robot., Internet Things, Artif. Intell. (CERIA), Bandung, Indonesia, 2024, pp. 1–5, doi: 10.1109/CERIA64726.2024.10914782.
[22] Ikotun AM, Ezugwu AE, Abualigah L, Abuhaija B, Heming J, "K-means clustering algorithms: A comprehensive review, variants analysis, and advances in the era of big data," Information Sciences, 2023, vol. 622, pp. 178–210, doi: 10.1016/j.ins.2022.11.139.
[23] Syarif I, Prugel-Bennett A, Wills G, "SVM parameter optimization using grid search and genetic algorithm to improve classification performance," TELKOMNIKA, 2016, vol. 14, no. 4, pp. 1502–1509, doi: 10.12928/telkomnika.v14i4.3956.
[24] Wang Y, et al., "Hierarchical transformer with auxiliary learning for subject-independent respiration emotion recognition," IEEE Sensors Journal, 2025, early access. doi: 10.1109/JSEN.2025.3587271.
[25] Zhou X, Huang D, Peng X, Yin L, "miMamba: EEG-based emotion recognition with multi-scale inverted Mamba models," IEEE Transactions on Affective Computing, 2025, early access. doi: 10.1109/TAFFC.2025.3587443.
[26] He X, Huang J, Fu Z, Li Y, Wu D, "A multi-kernel embedding fusion framework for physiological signal-based emotion recognition," IEEE Transactions on Affective Computing, 2025, early access. doi: 10.1109/TAFFC.2025.3562905.
[27] Xu M, Shi T, Zhang H, Liu Z, He X, "A hierarchical cross-modal spatial fusion network for multimodal emotion recognition," IEEE Transactions on Artificial Intelligence, 2025, vol. 6, no. 5, pp. 1429–1438, doi: 10.1109/TAI.2024.3523250.
[28] Yao Q, Gu H, Wang S, Li X, "A feature-fused convolutional neural network for emotion recognition from multichannel EEG signals," IEEE Sensors Journal, 2022, vol. 22, no. 12, pp. 11954–11964, doi: 10.1109/JSEN.2022.3172133.
[29] Khateeb M, Anwar SM, Alnowami M, "Multi-domain feature fusion for emotion classification using DEAP dataset," IEEE Access, 2021, vol. 9, pp. 12134–12142, doi: 10.1109/ACCESS.2021.3051281.