Evaluation of Deep Neural Networks in Emotion Recognition Using Electroencephalogram Signal Patterns
Azin Kermanshahian 1 (Faculty of Electrical Engineering, Najafabad Branch, Islamic Azad University, Najafabad, Iran)
Mehdi Khezri 2 (Digital Processing and Machine Vision Research Center, Najafabad Branch, Islamic Azad University, Najafabad, Iran)
Keywords: convolutional neural network, emotion recognition, electroencephalogram signal, support vector machine classifier, dynamic features
Abstract:
In this study, the design of a reliable system capable of identifying different emotions with acceptable accuracy is considered. To reach this goal, two structures for the emotion recognition system were examined: 1) linear and nonlinear features of the electroencephalogram (EEG) signal together with common classifiers, and 2) the EEG signal fed into a deep learning structure. To design the system, EEG signals from the DEAP database, recorded from 32 subjects while they watched emotional videos, were used. After preparation and noise removal, signal features including skewness, kurtosis, Hjorth parameters, Lyapunov exponent, Shannon entropy, correlation dimension, fractal dimension, and time reversibility were extracted from the alpha, beta, and gamma subbands. Then, according to structure 1, the extracted features were applied as input to common classifiers such as decision tree (DT), k-nearest neighbors (kNN), and support vector machine (SVM). According to structure 2, the EEG signal was used as the input of a convolutional neural network (CNN). The goal is to compare the results of deep learning networks with those of the other methods for emotion recognition. Based on the obtained results, the SVM achieved the best performance, identifying four emotional states with 94.1% accuracy. The proposed CNN identified the target states with 86% accuracy. Deep learning methods are superior to simple classifiers because they do not require feature extraction from the signals and are robust to various types of noise.
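To make structure 1 concrete, the following Python sketch (assuming NumPy, SciPy, and scikit-learn) shows how band-limited features could be computed per trial and passed to an SVM. The band edges, the DEAP-style trial layout (trials × channels × samples at 128 Hz), and the helper names (bandpass, hjorth, trial_features, evaluate_svm) are illustrative assumptions; only a subset of the listed features (skewness, kurtosis, Hjorth parameters) is implemented, so this is a minimal sketch rather than the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate of the DEAP preprocessed trials (assumption)
BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # typical band edges

def bandpass(x, lo, hi, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter applied along the time axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def hjorth(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx, ddx = np.diff(x), np.diff(np.diff(x))
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return var_x, mobility, complexity

def trial_features(trial):
    """Per-band, per-channel skewness, kurtosis, and Hjorth parameters for one trial
    shaped (n_channels, n_samples); the remaining nonlinear features would be appended here."""
    feats = []
    for lo, hi in BANDS.values():
        sub = bandpass(trial, lo, hi)
        for ch in sub:
            feats.extend([skew(ch), kurtosis(ch), *hjorth(ch)])
    return np.asarray(feats)

def evaluate_svm(X_raw, y):
    """X_raw: (n_trials, n_channels, n_samples); y: emotion labels (e.g. four classes)."""
    X = np.stack([trial_features(t) for t in X_raw])
    clf = SVC(kernel="rbf")  # the SVM was the best-performing common classifier in this study
    return cross_val_score(clf, X, y, cv=5).mean()
```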
English Abstract:
In this study, the design of a reliable detection system that is able to identify different emotions with the desired accuracy has been considered. To reach this goal, two different structures for the emotion recognition system are considered: 1) using linear and nonlinear features of the electroencephalography (EEG) signal along with common classifiers, and 2) using the EEG signal in a deep learning structure to identify emotional states. To design the system, EEG signals from the DEAP database, recorded from 32 subjects while they watched emotional videos, were used. After preparation and noise removal, linear and nonlinear features such as skewness, kurtosis, Hjorth parameters, Lyapunov exponent, Shannon entropy, correlation dimension, fractal dimension, and time reversibility were extracted from the alpha, beta, and gamma subbands of the EEG signals. Then, according to structure 1, the features were applied as input to common classifiers such as decision tree (DT), k-nearest neighbors (kNN), and support vector machine (SVM). In structure 2, the EEG signal was used as the input of a convolutional neural network (CNN). The goal is to evaluate the results of deep learning networks against the other methods for emotion recognition. According to the obtained results, the SVM achieved the best performance, identifying four emotional states with 94.1% accuracy. The proposed CNN identified the desired emotional states with 86% accuracy. Deep learning methods are superior to simple classifiers because they do not require feature extraction from the signals and are resistant to different types of noise. Using a shorter time period of the signals and performing near-optimal preprocessing and conditioning can further improve the results of deep neural networks.
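As a rough illustration of structure 2, the sketch below defines a small 1-D convolutional network (using the Keras API) that maps raw multichannel EEG segments to one of four emotional classes. The abstract does not report the authors' architecture, so the layer counts, filter sizes, default trial length, and training settings shown here are assumptions for illustration only.

```python
from tensorflow.keras import layers, models

def build_cnn(n_channels=32, n_samples=8064, n_classes=4):
    """Compact 1-D CNN over raw EEG segments shaped (time, channels).
    8064 samples corresponds to 63 s DEAP preprocessed trials at 128 Hz (assumption)."""
    model = models.Sequential([
        layers.Input(shape=(n_samples, n_channels)),
        layers.Conv1D(32, kernel_size=7, activation="relu"),
        layers.MaxPooling1D(pool_size=4),
        layers.Conv1D(64, kernel_size=7, activation="relu"),
        layers.MaxPooling1D(pool_size=4),
        layers.Conv1D(128, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dropout(0.5),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Example usage (X_train has shape (n_trials, n_samples, n_channels), y_train holds integer labels):
# model = build_cnn()
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=50, batch_size=32)
```

Because such a network consumes the (preprocessed) signal directly, no hand-crafted feature stage is needed, which is the main advantage the abstract attributes to the deep learning approach.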